Sample records for performance modeling challenges

  1. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience before being able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  2. Performance improvement CME for quality: challenges inherent to the process.

    PubMed

    Vakani, Farhan Saeed; O'Beirne, Ronan

    2015-01-01

    The purpose of this paper is to discuss the real-time challenges of a three-staged Performance Improvement Continuing Medical Education (PI-CME) model, an innovative and potentially valuable approach for future CME, to inform providers to think, prepare and act proactively. In this discussion, the challenges associated with adopting the American Medical Association's three-staged PI-CME model are reported. Few institutions in the USA use the three-staged performance improvement model, customizing it to their own healthcare context for a specific target audience. They integrate traditional CME methods with performance and quality initiatives, and link them with CME credits. Overall, the US health system is interested in a structured PI-CME model with the potential to improve physicians' practice behaviors. There is a dearth of evidence for applying this structured performance improvement methodology to the design of CME activities, and a lack of clarity on the challenges inherent to the process that learners and providers encounter. This paper establishes an all-important first step by setting out the challenges for a three-staged PI-CME model.

  3. Editorial: Cognitive Architectures, Model Comparison and AGI

    NASA Astrophysics Data System (ADS)

    Lebiere, Christian; Gonzalez, Cleotilde; Warwick, Walter

    2010-12-01

    Cognitive Science and Artificial Intelligence share compatible goals of understanding and possibly generating broadly intelligent behavior. In order to determine if progress is made, it is essential to be able to evaluate the behavior of complex computational models, especially those built on general cognitive architectures, and compare it to benchmarks of intelligent behavior such as human performance. Significant methodological challenges arise, however, when trying to extend approaches used to compare model and human performance from tightly controlled laboratory tasks to complex tasks involving more open-ended behavior. This paper describes a model comparison challenge built around a dynamic control task, the Dynamic Stocks and Flows. We present and discuss distinct approaches to evaluating performance and comparing models. Lessons drawn from this challenge are discussed in light of the challenge of using cognitive architectures to achieve Artificial General Intelligence.

  4. The RAPIDD ebola forecasting challenge: Synthesis and lessons learnt.

    PubMed

    Viboud, Cécile; Sun, Kaiyuan; Gaffey, Robert; Ajelli, Marco; Fumanelli, Laura; Merler, Stefano; Zhang, Qian; Chowell, Gerardo; Simonsen, Lone; Vespignani, Alessandro

    2018-03-01

    Infectious disease forecasting is gaining traction in the public health community; however, limited systematic comparisons of model performance exist. Here we present the results of a synthetic forecasting challenge inspired by the West African Ebola crisis in 2014-2015 and involving 16 international academic teams and US government agencies, and compare the predictive performance of 8 independent modeling approaches. Challenge participants were invited to predict 140 epidemiological targets across 5 different time points of 4 synthetic Ebola outbreaks, each involving different levels of interventions and "fog of war" in outbreak data made available for predictions. Prediction targets included 1-4 week-ahead case incidences, outbreak size, peak timing, and several natural history parameters. With respect to weekly case incidence targets, ensemble predictions based on a Bayesian average of the 8 participating models outperformed any individual model and did substantially better than a null auto-regressive model. There was no relationship between model complexity and prediction accuracy; however, the top performing models for short-term weekly incidence were reactive models with few parameters, fitted to a short and recent part of the outbreak. Individual model outputs and ensemble predictions improved with data accuracy and availability; by the second time point, just before the peak of the epidemic, estimates of final size were within 20% of the target. The 4th challenge scenario - mirroring an uncontrolled Ebola outbreak with substantial data reporting noise - was poorly predicted by all modeling teams. Overall, this synthetic forecasting challenge provided a deep understanding of model performance under controlled data and epidemiological conditions. We recommend such "peace time" forecasting challenges as key elements to improve coordination and inspire collaboration between modeling groups ahead of the next pandemic threat, and to assess model forecasting accuracy for a variety of known and hypothetical pathogens. Published by Elsevier B.V.
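
    To make the ensemble result concrete, here is a minimal, purely illustrative sketch (all numbers synthetic, not challenge data) of scoring an equally weighted ensemble of forecasts against an auto-regressive null model; the challenge itself used a Bayesian weighting rather than the plain average shown here.

    ```python
    # Illustrative sketch: equally weighted forecast ensemble vs. an AR(1)-style
    # null model on synthetic weekly incidence data (not the challenge's code).
    import numpy as np

    rng = np.random.default_rng(0)
    observed = np.array([12, 20, 35, 60, 95, 130, 150, 140, 110, 80], dtype=float)

    # Hypothetical 1-week-ahead forecasts from three independent models.
    forecasts = {
        "model_a": observed[1:] * rng.normal(1.0, 0.15, size=9),
        "model_b": observed[1:] * rng.normal(1.1, 0.20, size=9),
        "model_c": observed[1:] * rng.normal(0.9, 0.10, size=9),
    }
    ensemble = np.mean(list(forecasts.values()), axis=0)

    # Null model: linear auto-regression predicting next week from this week
    # (fit on the same series purely for illustration).
    phi = np.polyfit(observed[:-1], observed[1:], deg=1)
    null_pred = np.polyval(phi, observed[:-1])

    def mae(pred, obs):
        return np.mean(np.abs(pred - obs))

    print("ensemble MAE:", mae(ensemble, observed[1:]))
    print("AR null MAE :", mae(null_pred, observed[1:]))
    ```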

  5. Geospace environment modeling 2008--2009 challenge: Dst index

    USGS Publications Warehouse

    Rastätter, L.; Kuznetsova, M.M.; Glocer, A.; Welling, D.; Meng, X.; Raeder, J.; Wiltberger, M.; Jordanova, V.K.; Yu, Y.; Zaharia, S.; Weigel, R.S.; Sazykin, S.; Boynton, R.; Wei, H.; Eccles, V.; Horton, W.; Mays, M.L.; Gannon, J.

    2013-01-01

    This paper reports the metrics-based results of the Dst index part of the 2008–2009 GEM Metrics Challenge. The 2008–2009 GEM Metrics Challenge asked modelers to submit results for four geomagnetic storm events and five different types of observations that can be modeled by statistical, climatological or physics-based models of the magnetosphere-ionosphere system. We present the results of 30 model settings that were run at the Community Coordinated Modeling Center and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations, we use comparisons of 1 hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of 1 minute model data with the 1 minute Dst index calculated by the United States Geological Survey. The latter index can be used to calculate spectral variability of model outputs in comparison to the index. We find that model rankings vary widely by skill score used. None of the models consistently perform best for all events. We find that empirical models perform well in general. Magnetohydrodynamics-based models of the global magnetosphere with inner magnetosphere physics (ring current model) included and stand-alone ring current models with properly defined boundary conditions perform well and are able to match or surpass results from empirical models. Unlike in similar studies, the statistical models used in this study found their challenge in the weakest events rather than the strongest events.
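
    As a concrete illustration of this kind of metrics-based comparison, the sketch below computes two widely used measures, RMSE and a prediction-efficiency skill score, for a synthetic hourly Dst trace; it is not the challenge's scoring code and the values are invented.

    ```python
    # Illustrative sketch: two common skill measures for a modeled Dst time
    # series against observations (synthetic hourly values, in nT).
    import numpy as np

    dst_obs = np.array([-5, -20, -60, -110, -90, -70, -50, -35, -25, -15], dtype=float)
    dst_model = dst_obs + np.random.default_rng(1).normal(0, 12, size=dst_obs.size)

    rmse = np.sqrt(np.mean((dst_model - dst_obs) ** 2))
    # Prediction efficiency: 1 - MSE / variance of observations
    # (1 is perfect, 0 means no better than predicting the observed mean).
    pe = 1.0 - np.mean((dst_model - dst_obs) ** 2) / np.var(dst_obs)
    print(f"RMSE = {rmse:.1f} nT, prediction efficiency = {pe:.2f}")
    ```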

  6. A systematic review and checklist presenting the main challenges for health economic modeling in personalized medicine: towards implementing patient-level models.

    PubMed

    Degeling, Koen; Koffijberg, Hendrik; IJzerman, Maarten J

    2017-02-01

    The ongoing development of genomic medicine and the use of molecular and imaging markers in personalized medicine (PM) has arguably challenged the field of health economic modeling (HEM). This study aims to provide detailed insights into the current status of HEM in PM, in order to identify if and how modeling methods are used to address the challenges described in the literature. Areas covered: A review was performed on studies that simulate health economic outcomes for personalized clinical pathways. Decision tree modeling and Markov modeling were the most frequently observed methods. Not all identified challenges appeared frequently; challenges regarding companion diagnostics, diagnostic performance, and evidence gaps were found most often. However, the extent to which challenges were addressed varied considerably between studies. Expert commentary: Challenges for HEM in PM are not yet routinely addressed, which may indicate that (1) their impact is less severe than expected, (2) they are hard to address and therefore not managed appropriately, or (3) HEM in PM is still in an early stage. As evidence on the impact of these challenges is still lacking, we believe that more concrete examples are needed to illustrate the identified challenges and to demonstrate methods to handle them.
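
    Since Markov cohort models are one of the workhorses mentioned here, a minimal sketch may help; the states, transition probabilities, costs, and utilities below are entirely hypothetical.

    ```python
    # Illustrative sketch: a minimal three-state Markov cohort model (healthy,
    # sick, dead) of the kind commonly used in health economic evaluations.
    import numpy as np

    P = np.array([[0.90, 0.08, 0.02],   # healthy -> healthy / sick / dead
                  [0.00, 0.85, 0.15],   # sick    -> healthy / sick / dead
                  [0.00, 0.00, 1.00]])  # dead is absorbing
    cost = np.array([100.0, 2500.0, 0.0])   # cost per cycle, per state
    utility = np.array([0.95, 0.60, 0.0])   # QALY weight per cycle, per state

    state = np.array([1.0, 0.0, 0.0])  # cohort starts healthy
    total_cost = total_qaly = 0.0
    for cycle in range(20):            # 20 yearly cycles, no discounting
        total_cost += state @ cost
        total_qaly += state @ utility
        state = state @ P

    print(f"expected cost = {total_cost:.0f}, expected QALYs = {total_qaly:.2f}")
    ```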

  7. CPMIP: measurements of real computational performance of Earth system models in CMIP6

    NASA Astrophysics Data System (ADS)

    Balaji, Venkatramani; Maisonnave, Eric; Zadeh, Niki; Lawrence, Bryan N.; Biercamp, Joachim; Fladrich, Uwe; Aloisio, Giovanni; Benson, Rusty; Caubel, Arnaud; Durachta, Jeffrey; Foujols, Marie-Alice; Lister, Grenville; Mocavero, Silvia; Underwood, Seth; Wright, Garrett

    2017-01-01

    A climate model represents a multitude of processes on a variety of timescales and space scales: a canonical example of multi-physics multi-scale modeling. The underlying climate system is physically characterized by sensitive dependence on initial conditions, and natural stochastic variability, so very long integrations are needed to extract signals of climate change. Algorithms generally possess weak scaling and can be I/O and/or memory-bound. Such weak-scaling, I/O, and memory-bound multi-physics codes present particular challenges to computational performance. Traditional metrics of computational efficiency such as performance counters and scaling curves do not tell us enough about real sustained performance from climate models on different machines. They also do not provide a satisfactory basis for comparative information across models. We introduce a set of metrics that can be used for the study of computational performance of climate (and Earth system) models. These measures do not require specialized software or specific hardware counters, and should be accessible to anyone. They are independent of platform and underlying parallel programming models. We show how these metrics can be used to measure actually attained performance of Earth system models on different machines, and identify the most fruitful areas of research and development for performance engineering. We present results for these measures for a diverse suite of models from several modeling centers, and propose to use these measures as a basis for a CPMIP, a computational performance model intercomparison project (MIP).
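
    Two of the platform-independent measures CPMIP proposes are simulated years per day (SYPD) and core-hours per simulated year (CHSY). A minimal sketch of how they fall out of run bookkeeping (all numbers hypothetical):

    ```python
    # Illustrative sketch of two CPMIP-style metrics: simulated years per day
    # (SYPD) and core-hours per simulated year (CHSY).
    simulated_years = 10.0    # length of the run in simulated time
    wallclock_hours = 48.0    # elapsed wall-clock time for the run
    cores = 4096              # cores allocated to the run

    sypd = simulated_years / (wallclock_hours / 24.0)
    chsy = cores * wallclock_hours / simulated_years
    print(f"SYPD = {sypd:.2f}, CHSY = {chsy:.0f} core-hours per simulated year")
    ```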

  8. Scaling Equations for Ballistic Modeling of Solid Rocket Motor Case Breach

    NASA Technical Reports Server (NTRS)

    McMillin, Joshua E.

    2006-01-01

    This paper explores the development of a series of scaling equations that can take a known nominal motor performance and scale it for small and growing case failures. This model was developed for the Malfunction-Turn Study as part of Return to Flight activities for the Space Shuttle program. To verify the model, data from the Challenger accident (STS-51L) were used. The model is able to predict the motor performance beyond the last recorded Challenger data and show how the failed right-hand booster would have performed if the vehicle had remained intact.

  9. Challenges and opportunities in analysing students' modelling

    NASA Astrophysics Data System (ADS)

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-02-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them - the model of modelling diagram (MMD) - as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the modelling process experienced by students working in small groups aiming at creating and testing a model of a sedimentary basin from the information provided. The study was conducted in a regular Biology and Geology classroom (16-17 year-old students). Data were collected through video recording of the classes, along with written reports and the material models made by each group. The results show the complexity of adapting the MMD at two levels: the group modelling and the actual requirements of the activity. Our main challenges were to capture the modelling process of each individual as well as of the group, and to identify, from students' speech, which stage of modelling they were performing at a given time. In light of these challenges, we propose some changes to the MMD so that it can be properly used to analyse students performing modelling activities in groups.

  10. Systematic Analysis of Challenge-Driven Improvements in Molecular Prognostic Models for Breast Cancer

    PubMed Central

    Margolin, Adam A.; Bilal, Erhan; Huang, Erich; Norman, Thea C.; Ottestad, Lars; Mecham, Brigham H.; Sauerwine, Ben; Kellen, Michael R.; Mangravite, Lara M.; Furia, Matthew D.; Vollan, Hans Kristian Moen; Rueda, Oscar M.; Guinney, Justin; Deflaux, Nicole A.; Hoff, Bruce; Schildwachter, Xavier; Russnes, Hege G.; Park, Daehoon; Vang, Veronica O.; Pirtle, Tyler; Youseff, Lamia; Citro, Craig; Curtis, Christina; Kristensen, Vessela N.; Hellerstein, Joseph; Friend, Stephen H.; Stolovitzky, Gustavo; Aparicio, Samuel; Caldas, Carlos; Børresen-Dale, Anne-Lise

    2013-01-01

    Although molecular prognostics in breast cancer are among the most successful examples of translating genomic analysis to clinical applications, optimal approaches to breast cancer clinical risk prediction remain controversial. The Sage Bionetworks–DREAM Breast Cancer Prognosis Challenge (BCC) is a crowdsourced research study for breast cancer prognostic modeling using genome-scale data. The BCC provided a community of data analysts with a common platform for data access and blinded evaluation of model accuracy in predicting breast cancer survival on the basis of gene expression data, copy number data, and clinical covariates. This approach offered the opportunity to assess whether a crowdsourced community Challenge would generate models of breast cancer prognosis commensurate with or exceeding current best-in-class approaches. The BCC comprised multiple rounds of blinded evaluations on held-out portions of data on 1981 patients, resulting in more than 1400 models submitted as open source code. Participants then retrained their models on the full data set of 1981 samples and submitted up to five models for validation in a newly generated data set of 184 breast cancer patients. Analysis of the BCC results suggests that the best-performing modeling strategy outperformed previously reported methods in blinded evaluations; model performance was consistent across several independent evaluations; and aggregating community-developed models achieved performance on par with the best-performing individual models. PMID:23596205
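
    For readers unfamiliar with how survival-risk models are scored in such blinded evaluations, here is a bare-bones sketch of the concordance index, the kind of metric used for these rankings; the patients and risk scores below are fabricated for illustration.

    ```python
    # Illustrative sketch: concordance index for a survival-risk model.
    # Counts the fraction of usable patient pairs whose predicted risks are
    # ordered consistently with their observed survival times.
    import itertools

    surv_time = [24, 60, 31, 90, 12]        # hypothetical months
    event     = [1, 0, 1, 0, 1]             # 1 = death observed, 0 = censored
    risk      = [0.9, 0.2, 0.7, 0.1, 0.95]  # hypothetical model risk scores

    concordant, usable = 0, 0
    for i, j in itertools.combinations(range(len(surv_time)), 2):
        if surv_time[i] == surv_time[j]:
            continue
        short, long_ = (i, j) if surv_time[i] < surv_time[j] else (j, i)
        # A pair is usable only if the shorter-lived patient had an event.
        if not event[short]:
            continue
        usable += 1
        if risk[short] > risk[long_]:
            concordant += 1

    print("concordance index:", concordant / usable)
    ```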

  11. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Mary

    2014-09-19

    Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.

  12. Fast Magnetotail Reconnection: Challenge to Global MHD Modeling

    NASA Astrophysics Data System (ADS)

    Kuznetsova, M. M.; Hesse, M.; Rastaetter, L.; Toth, G.; de Zeeuw, D.; Gombosi, T.

    2005-05-01

    Representation of fast magnetotail reconnection rates during substorm onset is one of the major challenges to global MHD modeling. Our previous comparative study of collisionless magnetic reconnection in GEM Challenge geometry demonstrated that the reconnection rate is controlled by ion nongyrotropic behavior near the reconnection site and that it can be described in terms of nongyrotropic corrections to the magnetic induction equation. To further test the approach we performed MHD simulations with nongyrotropic corrections of forced reconnection for the Newton Challenge setup. As a next step we employ the global MHD code BATSRUS and test different methods to model fast magnetotail reconnection rates by introducing non-ideal corrections to the induction equation in terms of nongyrotropic corrections, spatially localized resistivity, or current-dependent resistivity. The BATSRUS adaptive grid structure makes it possible to perform global simulations with spatial resolution near the reconnection site comparable to that of local MHD simulations for the Newton Challenge. We select solar wind conditions which drive the accumulation of magnetic field in the tail lobes and subsequent magnetic reconnection and energy release. Testing the ability of global MHD models to describe magnetotail evolution during substorms is one of the elements of science-based validation efforts at the Community Coordinated Modeling Center.

  13. Evaluation of average daily gain predictions by the integrated farm system model for forage-finished beef steers

    USDA-ARS?s Scientific Manuscript database

    Representing the performance of cattle finished on an all-forage diet in process-based whole farm system models has presented a challenge. To address this challenge, a study was done to evaluate average daily gain (ADG) predictions of the Integrated Farm System Model (IFSM) for steers consuming all-...

  14. Tracking progress: Applying the Forest Service 10 Year Wilderness Stewardship Challenge as a model of performance management

    Treesearch

    Liese C. Dean

    2007-01-01

    The USDA Forest Service applied a performance management/ accountability system to the 407 wildernesses it oversees by defining and tracking critical work. Work elements were consolidated and packaged into the “10 Year Wilderness Stewardship Challenge.” The goal of the Challenge is to have 100 percent of wildernesses administered by the Forest Service managed to a...

  15. Parametric modeling studies of turbulent non-premixed jet flames with thin reaction zones

    NASA Astrophysics Data System (ADS)

    Wang, Haifeng

    2013-11-01

    The Sydney piloted jet flame series (Flames L, B, and M) features thinner reaction zones and hence imposes greater challenges to modeling than the Sandia piloted jet flames (Flames D, E, and F). Recently, the Sydney flames have received renewed interest due to these challenges. Several new modeling efforts have emerged. However, no systematic parametric modeling studies have been reported for the Sydney flames. A large set of modeling computations of the Sydney flames is presented here by using the coupled large eddy simulation (LES)/probability density function (PDF) method. Parametric studies are performed to gain insight into the model performance, its sensitivity, and the effect of numerics.

  16. Automated Design Space Exploration with Aspen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spafford, Kyle L.; Vetter, Jeffrey S.

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
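
    To illustrate the idea of casting a design-space question as a nonlinear program (a loose sketch in plain Python, not Aspen's actual notation or solver): choose a node count that minimizes a toy runtime model for a 3-D FFT subject to a per-node memory budget; the cost model and constants are invented.

    ```python
    # Loose sketch: design-space exploration posed as a small nonlinear program.
    import numpy as np
    from scipy.optimize import minimize

    N = 1024                    # hypothetical grid points per dimension
    bytes_total = N**3 * 16     # complex double-precision grid
    mem_per_node = 32e9         # hypothetical memory budget per node (bytes)

    def runtime(x):
        nodes = x[0]
        compute = 5 * N**3 * np.log2(N**3) / (nodes * 1e12)  # toy flop model
        comm = 1e-4 * np.sqrt(nodes)                         # toy all-to-all cost
        return compute + comm

    res = minimize(runtime, x0=[64.0], bounds=[(1, 4096)],
                   constraints=[{"type": "ineq",  # data must fit on each node
                                 "fun": lambda x: mem_per_node - bytes_total / x[0]}])
    print(f"suggested node count: {res.x[0]:.0f}")
    ```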

  17. Automated Design Space Exploration with Aspen

    DOE PAGES

    Spafford, Kyle L.; Vetter, Jeffrey S.

    2015-01-01

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.

  18. Cognitive appraisals of stereotype threat.

    PubMed

    Berjot, S; Roland-Levy, C; Girault-Lidvan, N

    2011-04-01

    Using the cognitive appraisal conceptualisation of the transactional model of stress, the goal was to assess how victims of stereotype threat respond to this situation in terms of primary appraisals (threat/challenge) and to investigate whether those appraisals may mediate the relation between stereotype threat and performance. Results show that, while participants from North Africa living in France did appraise the situation more as a threat and less as a challenge, only challenge appraisal mediated between stereotype threat and performance.

  19. Precollege Factors and Leading Indicators: Increasing Transfer and Degree Completion in a Community and Technical College System

    ERIC Educational Resources Information Center

    Davidson, J. Cody

    2015-01-01

    Today, community colleges are challenged to maintain their historical identity of open access while increasing student success. This challenge is particularly salient in the context of performance-based funding models. These models define student achievement measures, which determine institutional levels of state funding. Therefore, these new student…

  20. Report of a consultation on the optimization of clinical challenge trials for evaluation of candidate blood stage malaria vaccines, 18-19 March 2009, Bethesda, MD, USA.

    PubMed

    Moorthy, V S; Diggs, C; Ferro, S; Good, M F; Herrera, S; Hill, A V; Imoukhuede, E B; Kumar, S; Loucq, C; Marsh, K; Ockenhouse, C F; Richie, T L; Sauerwein, R W

    2009-09-25

    Development and optimization of first generation malaria vaccine candidates has been facilitated by the existence of a well-established Plasmodium falciparum clinical challenge model in which infectious sporozoites are administered to human subjects via mosquito bite. While ideal for testing pre-erythrocytic stage vaccines, some researchers believe that the sporozoite challenge model is less appropriate for testing blood stage vaccines. Here we report a consultation, co-sponsored by PATH MVI, USAID, EMVI and WHO, where scientists from all institutions globally that have conducted such clinical challenges in recent years and representatives from regulatory agencies and funding agencies met to discuss clinical malaria challenge models. Participants discussed strengthening and harmonizing the sporozoite challenge model and considered the pros and cons of further developing a blood stage challenge possibly better suited for evaluating the efficacy of blood stage vaccines. This report summarizes major findings and recommendations, including an update on the Plasmodium vivax clinical challenge model, the prospects for performing experimental challenge trials in malaria endemic countries and an update on clinical safety data. While the focus of the meeting was on the optimization of clinical challenge models for evaluation of blood stage candidate malaria vaccines, many of the considerations are relevant for the application of challenge trials to other purposes.

  1. Specifying and Refining a Measurement Model for a Computer-Based Interactive Assessment

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…

  2. The SAMPL4 host-guest blind prediction challenge: an overview.

    PubMed

    Muddana, Hari S; Fenley, Andrew T; Mobley, David L; Gilson, Michael K

    2014-04-01

    Prospective validation of methods for computing binding affinities can help assess their predictive power and thus set reasonable expectations for their performance in drug design applications. Supramolecular host-guest systems are excellent model systems for testing such affinity prediction methods, because their small size and limited conformational flexibility, relative to proteins, allows higher throughput and better numerical convergence. The SAMPL4 prediction challenge therefore included a series of host-guest systems, based on two hosts, cucurbit[7]uril and octa-acid. Binding affinities in aqueous solution were measured experimentally for a total of 23 guest molecules. Participants submitted 35 sets of computational predictions for these host-guest systems, based on methods ranging from simple docking, to extensive free energy simulations, to quantum mechanical calculations. Over half of the predictions provided better correlations with experiment than two simple null models, but most methods underperformed the null models in terms of root mean squared error and linear regression slope. Interestingly, the overall performance across all SAMPL4 submissions was similar to that for the prior SAMPL3 host-guest challenge, although the experimentalists took steps to simplify the current challenge. While some methods performed fairly consistently across both hosts, no single approach emerged as a consistent top performer, and the nonsystematic nature of the various submissions made it impossible to draw definitive conclusions regarding the best choices of energy models or sampling algorithms. Salt effects emerged as an issue in the calculation of absolute binding affinities of cucurbit[7]uril-guest systems, but were not expected to affect the relative affinities significantly. Useful directions for future rounds of the challenge might involve encouraging participants to carry out some calculations that replicate each other's studies, and to systematically explore parameter options.
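
    The error statistics named above are simple to compute; the sketch below scores a set of invented affinity predictions with RMSE, Pearson correlation, and regression slope, plus a trivial null model, purely for illustration.

    ```python
    # Illustrative sketch: scoring predicted binding affinities against
    # experiment with RMSE, correlation, and regression slope (values invented).
    import numpy as np

    dG_exp  = np.array([-5.2, -7.1, -9.8, -6.4, -11.3])  # kcal/mol
    dG_pred = np.array([-4.0, -8.0, -9.0, -5.5, -13.0])

    rmse = np.sqrt(np.mean((dG_pred - dG_exp) ** 2))
    r = np.corrcoef(dG_pred, dG_exp)[0, 1]
    slope = np.polyfit(dG_exp, dG_pred, 1)[0]

    # A simple null model: predict the mean experimental affinity every time.
    null_rmse = np.sqrt(np.mean((dG_exp.mean() - dG_exp) ** 2))
    print(f"RMSE={rmse:.2f}, r={r:.2f}, slope={slope:.2f}, null RMSE={null_rmse:.2f}")
    ```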

  3. Gut Health of Pigs: Challenge Models and Response Criteria with a Critical Analysis of the Effectiveness of Selected Feed Additives - A Review.

    PubMed

    Adewole, D I; Kim, I H; Nyachoti, C M

    2016-07-01

    The gut is the largest organ that helps with immune function. Gut health, especially in young pigs, has a significant effect on health and performance. In an attempt to maintain and enhance intestinal health in pigs and improve productivity in the absence of in-feed antibiotics, researchers have evaluated a wide range of feed additives. Some of these additives, such as zinc oxide, copper sulphate, egg yolk antibodies, mannan-oligosaccharides and spray-dried porcine plasma, and their effectiveness are discussed in this review. One approach to evaluate the effectiveness of these additives in vivo is to use an appropriate disease challenge model. Over the years, researchers have used a number of challenge models, which include the use of specific strains of enterotoxigenic Escherichia coli, bacterial lipopolysaccharide challenge, oral challenge with Salmonella enterica serotype Typhimurium, sanitation challenge, and Lawsonia intracellularis challenge. These challenge models, together with the criteria used to evaluate the responses of the animals to them, are also discussed in this review.

  4. Gut Health of Pigs: Challenge Models and Response Criteria with a Critical Analysis of the Effectiveness of Selected Feed Additives — A Review

    PubMed Central

    Adewole, D. I.; Kim, I. H.; Nyachoti, C. M.

    2016-01-01

    The gut is the largest organ that helps with immune function. Gut health, especially in young pigs, has a significant effect on health and performance. In an attempt to maintain and enhance intestinal health in pigs and improve productivity in the absence of in-feed antibiotics, researchers have evaluated a wide range of feed additives. Some of these additives, such as zinc oxide, copper sulphate, egg yolk antibodies, mannan-oligosaccharides and spray-dried porcine plasma, and their effectiveness are discussed in this review. One approach to evaluate the effectiveness of these additives in vivo is to use an appropriate disease challenge model. Over the years, researchers have used a number of challenge models, which include the use of specific strains of enterotoxigenic Escherichia coli, bacterial lipopolysaccharide challenge, oral challenge with Salmonella enterica serotype Typhimurium, sanitation challenge, and Lawsonia intracellularis challenge. These challenge models, together with the criteria used to evaluate the responses of the animals to them, are also discussed in this review. PMID:26954144

  5. Human Performance Models of Pilot Behavior

    NASA Technical Reports Server (NTRS)

    Foyle, David C.; Hooey, Becky L.; Byrne, Michael D.; Deutsch, Stephen; Lebiere, Christian; Leiden, Ken; Wickens, Christopher D.; Corker, Kevin M.

    2005-01-01

    Five modeling teams from industry and academia were chosen by the NASA Aviation Safety and Security Program to develop human performance models (HPM) of pilots performing taxi operations and runway instrument approaches with and without advanced displays. One representative from each team will serve as a panelist to discuss their team's model architecture, augmentations and advancements to HPMs, and aviation-safety related lessons learned. Panelists will discuss how modeling results are influenced by a model's architecture and structure, the role of the external environment, specific modeling advances, and future directions and challenges for human performance modeling in aviation.

  6. Terrestrial Planet Finder Coronagraph Observatory summary

    NASA Technical Reports Server (NTRS)

    Ford, Virginia; Levine-West, Marie; Kissil, Andy; Kwack, Eug; Ho, Tim; Dumont, Phil; Lisman, Doug; Feher, Peter; Cafferty, Terry

    2005-01-01

    Creating an optical space telescope observatory capable of detecting and characterizing light from extra-solar terrestrial planets poses technical challenges related to extreme wavefront stability. The Terrestrial Planet Finder Coronagraph design team has been developing an observatory based on trade studies, modeling and analysis that have guided us towards design choices to enable this challenging mission. This paper will describe the current flight baseline design of the observatory and the trade studies that have been performed. The modeling and analysis of this design will be described, including predicted performance and the tasks yet to be done.

  7. How do challenges increase customer loyalty to online games?

    PubMed

    Teng, Ching-I

    2013-12-01

    Despite the design of various challenge levels in online games, exactly how these challenges increase customer loyalty to online games has seldom been examined. This study investigates how such challenges increase customer loyalty to online games. The study sample comprises 2,861 online gamers. Structural equation modeling is performed. Analytical results indicate that the relationship between challenge and loyalty intensifies when customers perceive that overcoming challenges takes a long time. Results of this study contribute to efforts to determine how challenges and challenge-related perceptions impact customer loyalty to online games.

  8. Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance in a complex assessment. This paper describes a Bayesian approach to modeling and estimating…

  9. Summary of the DREAM8 Parameter Estimation Challenge: Toward Parameter Identification for Whole-Cell Models.

    PubMed

    Karr, Jonathan R; Williams, Alex H; Zucker, Jeremy D; Raue, Andreas; Steiert, Bernhard; Timmer, Jens; Kreutz, Clemens; Wilkinson, Simon; Allgood, Brandon A; Bot, Brian M; Hoff, Bruce R; Kellen, Michael R; Covert, Markus W; Stolovitzky, Gustavo A; Meyer, Pablo

    2015-05-01

    Whole-cell models that explicitly represent all cellular components at the molecular level have the potential to predict phenotype from genotype. However, even for simple bacteria, whole-cell models will contain thousands of parameters, many of which are poorly characterized or unknown. New algorithms are needed to estimate these parameters and enable researchers to build increasingly comprehensive models. We organized the Dialogue for Reverse Engineering Assessments and Methods (DREAM) 8 Whole-Cell Parameter Estimation Challenge to develop new parameter estimation algorithms for whole-cell models. We asked participants to identify a subset of parameters of a whole-cell model given the model's structure and in silico "experimental" data. Here we describe the challenge, the best performing methods, and new insights into the identifiability of whole-cell models. We also describe several valuable lessons we learned toward improving future challenges. Going forward, we believe that collaborative efforts supported by inexpensive cloud computing have the potential to solve whole-cell model parameter estimation.
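
    A toy version of the estimation task may make the setting concrete: recover unknown rate constants of a small kinetic model from noisy simulated "experimental" data. This is vastly simpler than a whole-cell model and is not the challenge's method, just the general shape of the problem.

    ```python
    # Illustrative sketch: least-squares recovery of two hypothetical rate
    # parameters from noisy in silico data.
    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, k_syn, k_deg):
        # Protein level with constant synthesis and first-order degradation.
        return (k_syn / k_deg) * (1.0 - np.exp(-k_deg * t))

    t = np.linspace(0, 10, 50)
    true = (2.0, 0.5)
    rng = np.random.default_rng(2)
    data = model(t, *true) + rng.normal(0, 0.05, t.size)  # synthetic "experiment"

    est, _ = curve_fit(model, t, data, p0=(1.0, 1.0))
    print("true:", true, "estimated:", np.round(est, 2))
    ```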

  10. An aircraft model for the AIAA controls design challenge

    NASA Technical Reports Server (NTRS)

    Brumbaugh, Randal W.

    1991-01-01

    A generic, state-of-the-art, high-performance aircraft model, including detailed, full-envelope, nonlinear aerodynamics, and full-envelope thrust and first-order engine response data, is described. While this model was primarily developed for the AIAA Controls Design Challenge, the availability of such a model provides a common focus for research in aeronautical control theory and methodology. An implementation of this model using the FORTRAN computer language, associated routines furnished with the aircraft model, and techniques for interfacing these routines to external procedures are also described. Figures showing vehicle geometry, surfaces, and sign conventions are included.

  11. Leveraging organismal biology to forecast the effects of climate change.

    PubMed

    Buckley, Lauren B; Cannistra, Anthony F; John, Aji

    2018-04-26

    Despite the pressing need for accurate forecasts of ecological and evolutionary responses to environmental change, commonly used modelling approaches exhibit mixed performance because they omit many important aspects of how organisms respond to spatially and temporally variable environments. Integrating models based on organismal phenotypes at the physiological, performance and fitness levels can improve model performance. We summarize current limitations of environmental data and models and discuss potential remedies. The paper reviews emerging techniques for sensing environments at fine spatial and temporal scales, accounting for environmental extremes, and capturing how organisms experience the environment. Intertidal mussel data illustrate biologically important aspects of environmental variability. We then discuss key challenges in translating environmental conditions into organismal performance, including accounting for the varied timescales of physiological processes, for responses to environmental fluctuations including the onset of stress and other thresholds, and for how environmental sensitivities vary across lifecycles. We call for the creation of phenotypic databases to parameterize forecasting models and advocate for improved sharing of model code and data for model testing. We conclude with challenges in organismal biology that must be solved to improve forecasts over the next decade. Keywords: acclimation, biophysical models, ecological forecasting, extremes, microclimate, spatial and temporal variability.
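
    One standard device for translating environmental conditions into organismal performance is a thermal performance curve. The sketch below is a generic, hypothetical curve (the functional form and all parameters are inventions for illustration, not from this paper) applied to an invented body-temperature series.

    ```python
    # Illustrative sketch: a hypothetical thermal performance curve mapping
    # body temperature onto relative performance (parameters invented).
    import numpy as np

    def performance(T, T_opt=28.0, CT_min=5.0, CT_max=38.0):
        # Asymmetric curve: Gaussian rise below T_opt, quadratic drop above.
        T = np.asarray(T, dtype=float)
        below = np.exp(-((T - T_opt) / ((T_opt - CT_min) / 2.0)) ** 2)
        above = 1.0 - ((T - T_opt) / (CT_max - T_opt)) ** 2
        return np.where(T <= T_opt, below, np.clip(above, 0.0, None))

    hourly_T = np.array([12, 18, 25, 31, 36, 39, 33, 27])  # hypothetical temps
    print("mean relative performance:", performance(hourly_T).mean().round(2))
    ```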

  12. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and specifically focus on how to describe and evaluate models of human performance. My presentation will focus on generating distributions of performance, and on the evaluation of different strategies for humans performing tasks with mixed-initiative (Human-Automation) systems. I will also discuss issues with how to provide human performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  13. Making things happen through challenging goals: leader proactivity, trust, and business-unit performance.

    PubMed

    Crossley, Craig D; Cooper, Cecily D; Wernsing, Tara S

    2013-05-01

    Building on decades of research on the proactivity of individual performers, this study integrates research on goal setting and trust in leadership to examine manager proactivity and business unit sales performance in one of the largest sales organizations in the United States. Results of a moderated-mediation model suggest that proactive senior managers establish more challenging goals for their business units (N = 50), which in turn are associated with higher sales performance. We further found that employees' trust in the manager is a critical contingency variable that facilitates the relationship between challenging sales goals and subsequent sales performance. This research contributes to growing literatures on trust in leadership and proactivity by studying their joint effects at a district-unit level of analysis while identifying district managers' tendency to set challenging goals as a process variable that helps translate their proactivity into the collective performance of their units. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  14. Summary of the DREAM8 Parameter Estimation Challenge: Toward Parameter Identification for Whole-Cell Models

    PubMed Central

    Karr, Jonathan R.; Williams, Alex H.; Zucker, Jeremy D.; Raue, Andreas; Steiert, Bernhard; Timmer, Jens; Kreutz, Clemens; Wilkinson, Simon; Allgood, Brandon A.; Bot, Brian M.; Hoff, Bruce R.; Kellen, Michael R.; Covert, Markus W.; Stolovitzky, Gustavo A.; Meyer, Pablo

    2015-01-01

    Whole-cell models that explicitly represent all cellular components at the molecular level have the potential to predict phenotype from genotype. However, even for simple bacteria, whole-cell models will contain thousands of parameters, many of which are poorly characterized or unknown. New algorithms are needed to estimate these parameters and enable researchers to build increasingly comprehensive models. We organized the Dialogue for Reverse Engineering Assessments and Methods (DREAM) 8 Whole-Cell Parameter Estimation Challenge to develop new parameter estimation algorithms for whole-cell models. We asked participants to identify a subset of parameters of a whole-cell model given the model’s structure and in silico “experimental” data. Here we describe the challenge, the best performing methods, and new insights into the identifiability of whole-cell models. We also describe several valuable lessons we learned toward improving future challenges. Going forward, we believe that collaborative efforts supported by inexpensive cloud computing have the potential to solve whole-cell model parameter estimation. PMID:26020786

  15. A University Engagement Model for Achieving Technology Adoption and Performance Improvement Impacts in Healthcare, Manufacturing, and Government

    ERIC Educational Resources Information Center

    McKinnis, David R.; Sloan, Mary Anne; Snow, L. David; Garimella, Suresh V.

    2014-01-01

    The Purdue Technical Assistance Program (TAP) offers a model of university engagement and service that is achieving technology adoption and performance improvement impacts in healthcare, manufacturing, government, and other sectors. The TAP model focuses on understanding and meeting the changing and challenging needs of those served, always…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Brian W.; Hemmert, K. Scott; Underwood, Keith Douglas

    Achieving the next three orders of magnitude performance increase to move from petascale to exascale computing will require significant advancements in several fundamental areas. Recent studies have outlined many of the challenges in hardware and software that will need to be addressed. In this paper, we examine these challenges with respect to high-performance networking. We describe the repercussions of anticipated changes to computing and networking hardware and discuss the impact that alternative parallel programming models will have on the network software stack. We also present some ideas on possible approaches that address some of these challenges.

  17. Overuse Injury Assessment Model

    DTIC Science & Technology

    2007-08-01

    …and reduce injuries is a continual challenge. Military researchers face challenges to develop better equipment, improve training regimens, and… performed and calculate biomechanical metrics for each activity. Hardware Challenges: • Sampling the sensors sufficiently fast • Having adequate bandwidth… second is that the injury rate increases rapidly initially and, third, that the injury rate continues to climb as the cumulative distance increases

  18. BrainFrame: a node-level heterogeneous accelerator platform for neuron simulations

    NASA Astrophysics Data System (ADS)

    Smaragdos, Georgios; Chatzikonstantis, Georgios; Kukreja, Rahul; Sidiropoulos, Harry; Rodopoulos, Dimitrios; Sourdis, Ioannis; Al-Ars, Zaid; Kachris, Christoforos; Soudris, Dimitrios; De Zeeuw, Chris I.; Strydis, Christos

    2017-12-01

    Objective. The advent of high-performance computing (HPC) in recent years has led to its increasing use in brain studies through computational models. The scale and complexity of such models are constantly increasing, leading to challenging computational requirements. Even though modern HPC platforms can often deal with such challenges, the vast diversity of the modeling field does not permit a homogeneous acceleration platform to effectively address the complete array of modeling requirements. Approach. In this paper we propose and build BrainFrame, a heterogeneous acceleration platform that incorporates three distinct acceleration technologies, an Intel Xeon-Phi CPU, an NVidia GP-GPU and a Maxeler Dataflow Engine. The PyNN software framework is also integrated into the platform. As a challenging proof of concept, we analyze the performance of BrainFrame on different experiment instances of a state-of-the-art neuron model, representing the inferior-olivary nucleus using a biophysically-meaningful, extended Hodgkin-Huxley representation. The model instances take into account not only the neuronal-network dimensions but also different network-connectivity densities, which can drastically affect the workload’s performance characteristics. Main results. The combined use of different HPC technologies demonstrates that BrainFrame is better able to cope with the modeling diversity encountered in realistic experiments while at the same time running on significantly lower energy budgets. Our performance analysis clearly shows that the model directly affects performance and all three technologies are required to cope with all the model use cases. Significance. The BrainFrame framework is designed to transparently configure and select the appropriate back-end accelerator technology for use per simulation run. The PyNN integration provides a familiar bridge to the vast number of models already available. Additionally, it gives a clear roadmap for extending the platform support beyond the proof of concept, with improved usability and directly useful features to the computational-neuroscience community, paving the way for wider adoption.
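
    Since the platform exposes models through PyNN, a minimal PyNN script of the usual shape is sketched below: a generic toy network on a standard simulator backend. BrainFrame's accelerator back-end selection is handled by the platform and is not shown, and this network is not the inferior-olive model.

    ```python
    # Minimal PyNN sketch (assumes the NEST backend is installed; BrainFrame
    # would substitute its own accelerator back-ends behind the same API).
    import pyNN.nest as sim

    sim.setup(timestep=0.1)  # ms
    cells = sim.Population(100, sim.IF_cond_exp(), label="toy population")
    noise = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))
    sim.Projection(noise, cells, sim.OneToOneConnector(),
                   synapse_type=sim.StaticSynapse(weight=0.02, delay=1.0))
    cells.record("v")
    sim.run(200.0)  # ms
    data = cells.get_data().segments[0]  # Neo segment with recorded voltages
    sim.end()
    ```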

  19. BrainFrame: a node-level heterogeneous accelerator platform for neuron simulations.

    PubMed

    Smaragdos, Georgios; Chatzikonstantis, Georgios; Kukreja, Rahul; Sidiropoulos, Harry; Rodopoulos, Dimitrios; Sourdis, Ioannis; Al-Ars, Zaid; Kachris, Christoforos; Soudris, Dimitrios; De Zeeuw, Chris I; Strydis, Christos

    2017-12-01

    The advent of high-performance computing (HPC) in recent years has led to its increasing use in brain studies through computational models. The scale and complexity of such models are constantly increasing, leading to challenging computational requirements. Even though modern HPC platforms can often deal with such challenges, the vast diversity of the modeling field does not permit a homogeneous acceleration platform to effectively address the complete array of modeling requirements. In this paper we propose and build BrainFrame, a heterogeneous acceleration platform that incorporates three distinct acceleration technologies, an Intel Xeon-Phi CPU, an NVidia GP-GPU and a Maxeler Dataflow Engine. The PyNN software framework is also integrated into the platform. As a challenging proof of concept, we analyze the performance of BrainFrame on different experiment instances of a state-of-the-art neuron model, representing the inferior-olivary nucleus using a biophysically-meaningful, extended Hodgkin-Huxley representation. The model instances take into account not only the neuronal-network dimensions but also different network-connectivity densities, which can drastically affect the workload's performance characteristics. The combined use of different HPC technologies demonstrates that BrainFrame is better able to cope with the modeling diversity encountered in realistic experiments while at the same time running on significantly lower energy budgets. Our performance analysis clearly shows that the model directly affects performance and all three technologies are required to cope with all the model use cases. The BrainFrame framework is designed to transparently configure and select the appropriate back-end accelerator technology for use per simulation run. The PyNN integration provides a familiar bridge to the vast number of models already available. Additionally, it gives a clear roadmap for extending the platform support beyond the proof of concept, with improved usability and directly useful features to the computational-neuroscience community, paving the way for wider adoption.

  20. Software Systems for High-performance Quantum Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S; Britt, Keith A

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  1. Gaia challenging performances verification: combination of spacecraft models and test results

    NASA Astrophysics Data System (ADS)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on the end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the tests feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to the performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  2. Geospace Environment Modeling 2008-2009 Challenge: Ground Magnetic Field Perturbations

    NASA Technical Reports Server (NTRS)

    Pulkkinen, A.; Kuznetsova, M.; Ridley, A.; Raeder, J.; Vapirev, A.; Weimer, D.; Weigel, R. S.; Wiltberger, M.; Millward, G.; Rastatter, L.; et al.

    2011-01-01

    Acquiring quantitative metrics-based knowledge about the performance of various space physics modeling approaches is central for the space weather community. Quantification of the performance helps the users of the modeling products to better understand the capabilities of the models and to choose the approach that best suits their specific needs. Further, metrics-based analyses are important for addressing the differences between various modeling approaches and for measuring and guiding the progress in the field. In this paper, the metrics-based results of the ground magnetic field perturbation part of the Geospace Environment Modeling 2008-2009 Challenge are reported. Predictions made by 14 different models, including an ensemble model, are compared to geomagnetic observatory recordings from 12 different northern hemispheric locations. Five different metrics are used to quantify the model performances for four storm events. It is shown that the ranking of the models is strongly dependent on the type of metric used to evaluate the model performance. None of the models rank near or at the top systematically for all used metrics. Consequently, one cannot pick the absolute "winner": the choice of the best model depends on the characteristics of the signal one is interested in. Model performances vary also from event to event. This is particularly clear for root-mean-square difference and utility metric-based analyses. Further, analyses indicate that for some of the models, increasing the global magnetohydrodynamic model spatial resolution and the inclusion of the ring current dynamics improve the models' capability to generate more realistic ground magnetic field fluctuations.

  3. Review of GEM Radiation Belt Dropout and Buildup Challenges

    NASA Astrophysics Data System (ADS)

    Tu, Weichao; Li, Wen; Morley, Steve; Albert, Jay

    2017-04-01

    In Summer 2015 the US NSF GEM (Geospace Environment Modeling) focus group named "Quantitative Assessment of Radiation Belt Modeling" started the "RB dropout" and "RB buildup" challenges, focused on quantitative modeling of the radiation belt buildups and dropouts. This is a community effort which includes selecting challenge events, gathering model inputs that are required to model the radiation belt dynamics during these events (e.g., various magnetospheric waves, plasmapause and density models, electron phase space density data), simulating the challenge events using different types of radiation belt models, and validating the model results by comparison to in situ observations of radiation belt electrons (from Van Allen Probes, THEMIS, GOES, LANL/GEO, etc). The goal is to quantitatively assess the relative importance of various acceleration, transport, and loss processes in the observed radiation belt dropouts and buildups. Since 2015, the community has selected four "challenge" events under four different categories: "storm-time enhancements", "non-storm enhancements", "storm-time dropouts", and "non-storm dropouts". Model inputs and data for each selected event have been coordinated and shared within the community to establish a common basis for simulations and testing. Modelers within and outside the US with different types of radiation belt models (diffusion-type, diffusion-convection-type, test particle codes, etc.) have participated in our challenge and shared their simulation results and comparisons with spacecraft measurements. Significant progress has been made in quantitative modeling of the radiation belt buildups and dropouts, as well as in assessing the models with new measures of model performance. In this presentation, I will review the activities from our "RB dropout" and "RB buildup" challenges and the progress achieved in understanding radiation belt physics and improving model validation and verification.

  4. Research notes: solar powered markers not up to challenge.

    DOT National Transportation Integrated Search

    2008-06-01

    ODOT performed preliminary tests on eight different models of solar powered raised pavement markers. These included environmental tests (extreme temperatures, immersion), optical performance tests, and observation tests. Federal Highway Administratio...

  5. Review of issues and challenges for public private partnership (PPP) project performance in Malaysia

    NASA Astrophysics Data System (ADS)

    Hashim, H.; Che-Ani, A. I.; Ismail, K.

    2017-10-01

    Public Private Partnership (PPP) in Malaysia aims to stimulate economic growth and overcome the weaknesses of the conventional system. Over the years, many criticisms have been reported alongside the massive growth of PPP project development. Within that context, this study provides a review of issues and challenges for PPP pertaining to project performance in Malaysia. The study also investigates four performance measurement models from around the globe as a basis for improvement of PPP in Malaysia. A qualitative method was used to analyse previously published literature, while a comparative analysis was carried out across the models to identify their advantages and disadvantages. The findings show that the issues and challenges that occurred were related to human, technical, and financial factors that could hinder the implementation of PPP projects in Malaysia. From the analysis, KPIs, guidelines/frameworks, risk allocation, and efficiency and flexibility are perceived as the dominant issues. Finally, the findings provide an informed basis on the opportunity areas to be considered for improvement in order to achieve project effectiveness.

  6. International challenge to model the long-range transport of radioxenon released from medical isotope production to six Comprehensive Nuclear-Test-Ban Treaty monitoring stations

    DOE PAGES

    Maurer, Christian; Baré, Jonathan; Kusmierczyk-Michulec, Jolanta; ...

    2018-03-08

    After performing a first multi-model exercise in 2015, a comprehensive and technically more demanding atmospheric transport modelling challenge was organized in 2016. Release data were provided by the Australian Nuclear Science and Technology Organization radiopharmaceutical facility in Sydney (Australia) for a one-month period. Measured samples for the same time frame were gathered from six International Monitoring System stations in the Southern Hemisphere, with distances to the source ranging between 680 km (Melbourne) and about 17,000 km (Tristan da Cunha). Participants were prompted to work with unit emissions in pre-defined emission intervals (daily, half-daily, 3-hourly and hourly emission segment lengths) and, in order to perform a blind test, actual emission values were not provided to them. Despite the quite different settings of the two atmospheric transport modelling challenges, there is common evidence that for long-range atmospheric transport, using temporally highly resolved emissions and highly space-resolved meteorological input fields has no significant advantage compared to using lower-resolved ones. Likewise, an uncertainty of up to 20% in the daily stack emission data turns out to be acceptable for the purpose of a study like this. Model performance at individual stations is quite diverse, depending largely on successfully capturing boundary layer processes. No single model-meteorology combination performs best for all stations. Moreover, the station statistics do not depend on the distance between the source and the individual stations. Finally, it became more evident how future exercises need to be designed: set-up parameters like the meteorological driver or the output grid resolution should be prescribed in order to enhance diversity as well as comparability among model runs.
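
    Because participants worked with unit emissions, predicted concentrations scale linearly with the actual source term. The sketch below illustrates that rescaling step and a ±20% emission-uncertainty band; the array layout and function names are assumptions for illustration, not the exercise's actual tooling.

      import numpy as np

      # Hypothetical layout: srs[i, j] is the modelled concentration in sample i
      # per unit emission (1 Bq/s) in emission segment j, from a single
      # unit-emission transport run. Segment emissions q[j] were withheld
      # from participants in the blind test.
      def predicted_concentrations(srs, q):
          return np.asarray(srs, float) @ np.asarray(q, float)  # linear rescaling

      def emission_uncertainty_band(srs, q, rel_err=0.20):
          # Propagate a +/-20% daily stack-emission uncertainty to predictions.
          q = np.asarray(q, float)
          return (predicted_concentrations(srs, q * (1 - rel_err)),
                  predicted_concentrations(srs, q * (1 + rel_err)))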

  7. International challenge to model the long-range transport of radioxenon released from medical isotope production to six Comprehensive Nuclear-Test-Ban Treaty monitoring stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maurer, Christian; Baré, Jonathan; Kusmierczyk-Michulec, Jolanta

    (Abstract identical to record 6 above.)

  8. Next Step Toward Widespread Residential Deep Energy Retrofits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McIlvaine, J.; Saunders, S.; Bordelon, E.

    The complexity of deep energy retrofits warrants additional training to successfully manage multiple improvements that will change whole-house air, heat, and moisture flow dynamics. The home performance contracting industry has responded to these challenges by aggregating skilled labor for assessment and implementation under one umbrella. Two emerging business models are profiled that seek to resolve many of the challenges, weaknesses, opportunities, and threats described for the conventional business models.

  9. Rising to the challenge: acute stress appraisals and selection centre performance in applicants to postgraduate specialty training in anaesthesia.

    PubMed

    Roberts, Martin J; Gale, Thomas C E; McGrath, John S; Wilson, Mark R

    2016-05-01

    The ability to work under pressure is a vital non-technical skill for doctors working in acute medical specialties. Individuals who evaluate potentially stressful situations as challenging rather than threatening may perform better under pressure and be more resilient to stress and burnout. Training programme recruitment processes provide an important opportunity to examine applicants' reactions to acute stress. In the context of multi-station selection centres for recruitment to anaesthesia training programmes, we investigated the factors influencing candidates' pre-station challenge/threat evaluations and the extent to which their evaluations predicted subsequent station performance. Candidates evaluated the perceived stress of upcoming stations using a measure of challenge/threat evaluation, the cognitive appraisal ratio (CAR), and consented to release their demographic details and station scores. Using regression analyses, we determined which candidate and station factors predicted variation in the CAR and whether, after accounting for these factors, the CAR predicted candidate performance in the station. The CAR was affected by the nature of the station and candidate gender, but not by age, ethnicity, country of training or clinical experience. Candidates perceived stations involving work-related tasks as more threatening. After controlling for candidates' demographic and professional profiles, the CAR significantly predicted station performance: 'challenge' evaluations were associated with better performance, though the effect was weak. Our selection centre model can help recruit prospective anaesthetists who are able to rise to the challenge of performing in stressful situations, but the results do not support the direct use of challenge/threat data for recruitment decisions.

  10. CEDAR Electrodynamics Thermosphere Ionosphere (ETI) Challenge for Systematic Assessment of Ionosphere/Thermosphere Models: NmF2, hmF2, and Vertical Drift Using Ground-Based Observations

    NASA Technical Reports Server (NTRS)

    Shim, J. S.; Kuznetsova, M.; Rastatter, L.; Hesse, M.; Bilitza, D.; Butala, M.; Codrescu, M.; Emery, B.; Foster, B.; Fuller-Rowell, T.; ...

    2011-01-01

    Objective quantification of model performance based on metrics helps us evaluate the current state of space physics modeling capability, address differences among various modeling approaches, and track model improvements over time. The Coupling, Energetics, and Dynamics of Atmospheric Regions (CEDAR) Electrodynamics Thermosphere Ionosphere (ETI) Challenge was initiated in 2009 to assess the accuracy of various ionosphere/thermosphere models in reproducing ionosphere and thermosphere parameters. A total of nine events and five physical parameters were selected for comparison between model outputs and observations. The nine events included two strong and one moderate geomagnetic storm event from the GEM Challenge events, and three moderate storms and three quiet periods from the first half of the International Polar Year (IPY) campaign, which lasted for 2 years, from March 2007 to March 2009. The five physical parameters selected were NmF2 and hmF2 from ISRs and LEO satellites such as CHAMP and COSMIC, vertical drifts at Jicamarca, and electron and neutral densities along the track of the CHAMP satellite. For this study, four different metrics and up to 10 models were used. In this paper, we focus on preliminary results of the study using ground-based measurements, which include NmF2 and hmF2 from Incoherent Scatter Radars (ISRs) and vertical drifts at Jicamarca. The results show that model performance strongly depends on the type of metric used, and thus no model ranks at the top for all metrics used. The analysis further indicates that model performance also varies with latitude and geomagnetic activity level.

  11. HPT: The Culture Factor.

    ERIC Educational Resources Information Center

    Addison, Roger M.; Wittkuhn, Klaus D.

    2001-01-01

    Discusses the challenges in managing performance across national cultures and within changing corporate cultures. Describes two human performance technology tools that can help performance consultants understand different cultures and provide the basis for successful management action: the culture audit and the systems model that can be adapted…

  12. Evaluation of novel oral vaccine candidates and validation of a caprine model of Johne's disease

    PubMed Central

    Hines, Murray E.; Turnquist, Sue E.; Ilha, Marcia R. S.; Rajeev, Sreekumari; Jones, Arthur L.; Whittington, Lisa; Bannantine, John P.; Barletta, Raúl G.; Gröhn, Yrjö T.; Katani, Robab; Talaat, Adel M.; Li, Lingling; Kapur, Vivek

    2014-01-01

    Johne's disease (JD), caused by Mycobacterium avium subspecies paratuberculosis (MAP), is a major threat to the dairy industry and possibly a factor in some cases of Crohn's disease in humans. A MAP vaccine that reduced clinical disease and/or fecal shedding would aid in the control of JD. The objectives of this study were (1) to evaluate the efficacy of 5 attenuated strains of MAP as vaccine candidates compared to a commercial control vaccine, using the protocol proposed by the Johne's Disease Integrated Program (JDIP) Animal Model Standardization Committee (AMSC), and (2) to validate the AMSC Johne's disease goat challenge model. Eighty goat kids were vaccinated orally twice, at 8 and 10 weeks of age, with an experimental vaccine, or once subcutaneously at 8 weeks with Silirum® (Zoetis), or with a sham control oral vaccine at 8 and 10 weeks. Kids were challenged orally with a total of approximately 1.44 × 10⁹ CFU divided in two consecutive daily doses using MAP ATCC-700535 (K10-like bovine isolate). All kids were necropsied at 13 months post-challenge. Results indicated that the AMSC goat challenge model is a highly efficient and valid model for JD challenge studies. None of the experimental or control vaccines evaluated prevented MAP infection or eliminated fecal shedding, although the 329 vaccine lowered the incidence of infection, fecal shedding, and tissue colonization and reduced lesion scores, though less so than the control vaccine. Based on our results, in the relative performance ranking of the experimental live-attenuated vaccines evaluated, the 329 vaccine was the best performer, followed by the 318, 316, and 315 vaccines, with the 319 vaccine the worst performer. The subcutaneously injected control vaccine outperformed the orally delivered mutant vaccine candidates. Two vaccines (329 and 318) reduced the presence of JD gross and microscopic lesions and slowed disease progression, and one vaccine (329) reduced fecal shedding and tissue colonization. PMID:24624365

  13. Modified Vaccinia Ankara Virus Vaccination Provides Long-Term Protection against Nasal Rabbitpox Virus Challenge.

    PubMed

    Jones, Dorothy I; McGee, Charles E; Sample, Christopher J; Sempowski, Gregory D; Pickup, David J; Staats, Herman F

    2016-07-01

    Modified vaccinia Ankara virus (MVA) is a smallpox vaccine candidate. This study was performed to determine if MVA vaccination provides long-term protection against rabbitpox virus (RPXV) challenge, an animal model of smallpox. Two doses of MVA provided 100% protection against a lethal intranasal RPXV challenge administered 9 months after vaccination. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  14. Challenge and Error: Critical Events and Attention-Related Errors

    ERIC Educational Resources Information Center

    Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel

    2011-01-01

    Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ↔ attention-lapse: Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…

  15. 2008 GEM Modeling Challenge: Metrics Study of the Dst Index in Physics-Based Magnetosphere and Ring Current Models and in Statistical and Analytic Specifications

    NASA Technical Reports Server (NTRS)

    Rastaetter, L.; Kuznetsova, M.; Hesse, M.; Pulkkinen, A.; Glocer, A.; Yu, Y.; Meng, X.; Raeder, J.; Wiltberger, M.; Welling, D.; ...

    2011-01-01

    In this paper, the metrics-based results of the Dst part of the 2008-2009 GEM Metrics Challenge are reported. The Metrics Challenge asked modelers to submit results for 4 geomagnetic storm events and 5 different types of observations that can be modeled by statistical, climatological, or physics-based (e.g., MHD) models of the magnetosphere-ionosphere system. We present the results of over 25 model settings that were run at the Community Coordinated Modeling Center (CCMC) and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations, we use comparisons of one-hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of one-minute model data with the one-minute Dst index calculated by the United States Geological Survey (USGS).
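
    A minimal sketch of the two comparisons described, assuming pandas Series indexed by UTC timestamps; the variable names and resampling choice are illustrative, not the CCMC processing chain.

      import pandas as pd

      def compare_to_dst(model_1min, kyoto_hourly, usgs_1min):
          # One-hour averages of model output against the hourly Kyoto Dst...
          model_hourly = model_1min.resample("1H").mean()
          rmse_hourly = ((model_hourly - kyoto_hourly) ** 2).mean() ** 0.5
          # ...and direct one-minute comparison against the USGS 1-min Dst.
          rmse_minute = ((model_1min - usgs_1min) ** 2).mean() ** 0.5
          return rmse_hourly, rmse_minute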

  16. Need Satisfaction at Work, Job Strain, and Performance: A Diary Study.

    PubMed

    De Gieter, Sara; Hofmans, Joeri; Bakker, Arnold B

    2017-08-24

    We performed a daily diary study to examine the mediating role of autonomy need satisfaction and competence need satisfaction in the relationships between job characteristics (i.e., job resources, challenge and hindrance demands) and strain and performance. For 10 consecutive working days, 194 employees reported on their daily job resources, challenge and hindrance demands, task performance, strain level, and satisfaction of the needs for competence and autonomy. Multilevel path modeling demonstrated that the within-person relationships between job resources, challenge and hindrance demands, and strain are mediated by autonomy need satisfaction, but not by competence need satisfaction. However, the relationships between job resources and hindrance demands, and performance are mediated by both competence and autonomy need satisfaction. Our findings show that organizations may benefit from designing jobs that provide employees with the opportunity to satisfy their basic needs for competence and autonomy. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Using phenomenological models for forecasting the 2015 Ebola challenge.

    PubMed

    Pell, Bruce; Kuang, Yang; Viboud, Cecile; Chowell, Gerardo

    2018-03-01

    The rising number of novel pathogens threatening the human population has motivated the application of mathematical modeling to forecasting the trajectory and size of epidemics. We summarize the real-time forecasting results of the logistic equation during the 2015 Ebola challenge, which focused on predicting synthetic data derived from a detailed individual-based model of Ebola transmission dynamics and control. We also carry out a post-challenge comparison of two simple phenomenological models. In particular, we systematically compare the logistic growth model and a recently introduced generalized Richards model (GRM) that captures a range of early epidemic growth profiles, from sub-exponential to exponential growth. Specifically, we assess the performance of each model in estimating the reproduction number, generating short-term forecasts of the epidemic trajectory, and predicting the final epidemic size. During the challenge, the logistic equation consistently underestimated the final epidemic size, peak timing, and the number of cases at peak timing, with average mean absolute percentage errors (MAPE) of 0.49, 0.36 and 0.40, respectively. Post-challenge, the GRM, which has the flexibility to reproduce a range of epidemic growth profiles from early sub-exponential to exponential growth dynamics, outperformed the logistic growth model in ascertaining the final epidemic size as more incidence data were made available, while the logistic model underestimated the final epidemic size even with an increasing amount of data on the evolving epidemic. Incidence forecasts provided by the generalized Richards model performed better across all scenarios and time points than the logistic growth model, with mean RMS decreasing from 78.00 (logistic) to 60.80 (GRM). Both models provided reasonable predictions of the effective reproduction number, but the GRM slightly outperformed the logistic growth model, with a MAPE of 0.08 compared to 0.10, averaged across all scenarios and time points. Our findings further support the consideration of transmission models that incorporate flexible early epidemic growth profiles in the forecasting toolkit. Such models are particularly useful for quickly evaluating a developing infectious disease outbreak using only case incidence time series from the early phase of the outbreak. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
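
    The logistic fit at the heart of this comparison can be sketched in a few lines; the data below are synthetic stand-ins, and the parameterization (final size K, growth rate r, inflection time t0) is one common convention rather than the paper's exact formulation.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic_cumulative(t, K, r, t0):
          # Cumulative case count under logistic growth; K is the final size.
          return K / (1.0 + np.exp(-r * (t - t0)))

      t = np.arange(60.0)                               # days since onset
      c = 10000 / (1 + np.exp(-0.15 * (t - 70)))        # synthetic pre-peak data

      (K, r, t0), _ = curve_fit(logistic_cumulative, t, c,
                                p0=(2 * c[-1], 0.1, t[-1]))
      print(f"Projected final epidemic size: {K:.0f}")
      # Fitting only early, pre-peak data is exactly the regime in which the
      # logistic equation tended to underestimate the final size.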

  18. Using the cloud to speed-up calibration of watershed-scale hydrologic models (Invited)

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Ercan, M. B.; Castronova, A. M.; Humphrey, M.; Beekwilder, N.; Steele, J.; Kim, I.

    2013-12-01

    This research focuses on using the cloud to address computational challenges associated with hydrologic modeling. One example is calibration of a watershed-scale hydrologic model, which can take days of execution time on typical computers. While parallel algorithms for model calibration exist and some researchers have used multi-core computers or clusters to run them, these solutions do not fully address the challenge because (i) calibration can still be too time consuming even on multi-core personal computers and (ii) few in the community have the time and expertise needed to manage a compute cluster. Given this, another option that we explore in this work is the use of the cloud to speed up calibration of watershed-scale hydrologic models. The cloud used in this capacity provides a means for renting a specific number and type of machines for only the time needed to perform a calibration run. The cloud allows one to precisely balance the duration of the calibration with the financial costs so that, if the budget allows, the calibration can be performed more quickly by renting more machines. Focusing specifically on the SWAT hydrologic model and a parallel version of the DDS calibration algorithm, we show significant speed-up across a range of watershed sizes using up to 256 cores to perform a model calibration. The tool provides a simple web-based user interface and the ability to monitor job status during the calibration. Finally, the talk concludes with initial work to leverage the cloud for other tasks associated with hydrologic modeling, including preparing inputs for constructing place-based hydrologic models.
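
    A cloud-style parallel calibration loop can be sketched as below; run_swat and propose are hypothetical stand-ins for a SWAT model evaluation and a DDS-style candidate generator, and the local process pool stands in for rented cloud workers.

      from concurrent.futures import ProcessPoolExecutor

      def parallel_calibration(propose, run_swat, n_iter=100, n_workers=16):
          best_params, best_score = None, float("inf")
          with ProcessPoolExecutor(max_workers=n_workers) as pool:
              for _ in range(n_iter):
                  # Evaluate one batch of perturbed candidates concurrently.
                  candidates = [propose(best_params) for _ in range(n_workers)]
                  scores = list(pool.map(run_swat, candidates))
                  i = min(range(len(scores)), key=scores.__getitem__)
                  if scores[i] < best_score:
                      best_params, best_score = candidates[i], scores[i]
          return best_params, best_score

    Renting more workers shrinks wall-clock time roughly in proportion, which is the cost/duration trade-off the abstract describes.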

  19. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper evaluates the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia, but to the best of our knowledge this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical, and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance, and teaching performance are assessed separately using sensitivity analysis.
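
    For readers unfamiliar with DEA, the input-oriented CCR efficiency of one department solves a small linear program. The sketch below uses SciPy and is a textbook formulation, not the paper's exact model.

      import numpy as np
      from scipy.optimize import linprog

      def ccr_efficiency(X, Y, o):
          # X: (m inputs x n units), Y: (s outputs x n units); score of unit o.
          m, n = X.shape
          s = Y.shape[0]
          c = np.zeros(1 + n)
          c[0] = 1.0                                 # minimize theta
          A_in = np.hstack([-X[:, [o]], X])          # X @ lam <= theta * x_o
          A_out = np.hstack([np.zeros((s, 1)), -Y])  # Y @ lam >= y_o
          A_ub = np.vstack([A_in, A_out])
          b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
          return res.fun                             # 1.0 means efficient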

  20. The Next Step Toward Widespread Residential Deep Energy Retrofits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McIlvaine, J.; Martin, E.; Saunders, S.

    (Abstract identical to record 8 above.)

  1. A hierarchical model of daily stream temperature using air-water temperature synchronization, autocorrelation, and time lags

    USGS Publications Warehouse

    Letcher, Benjamin; Hocking, Daniel; O'Neil, Kyle; Whiteley, Andrew R.; Nislow, Keith H.; O'Donnell, Matthew

    2016-01-01

    Water temperature is a primary driver of stream ecosystems and commonly forms the basis of stream classifications. Robust models of stream temperature are critical as the climate changes, but estimating daily stream temperature poses several important challenges. We developed a statistical model that accounts for many challenges that can make stream temperature estimation difficult. Our model identifies the yearly period when air and water temperature are synchronized, accommodates hysteresis, incorporates time lags, deals with missing data and autocorrelation and can include external drivers. In a small stream network, the model performed well (RMSE = 0.59°C), identified a clear warming trend (0.63 °C decade−1) and a widening of the synchronized period (29 d decade−1). We also carefully evaluated how missing data influenced predictions. Missing data within a year had a small effect on performance (∼0.05% average drop in RMSE with 10% fewer days with data). Missing all data for a year decreased performance (∼0.6 °C jump in RMSE), but this decrease was moderated when data were available from other streams in the network.
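
    The core regression idea (current plus lagged air temperature, with autocorrelated residuals) can be sketched simply; the full hierarchical model in the paper also handles synchronization windows, hysteresis, and missing data, which this illustration omits. Inputs are assumed to be daily NumPy arrays.

      import numpy as np

      def fit_lagged(air, water, max_lag=3):
          # Design matrix: air(t), air(t-1), ..., air(t-max_lag).
          cols = [air if k == 0 else np.r_[np.full(k, np.nan), air[:-k]]
                  for k in range(max_lag + 1)]
          X = np.column_stack(cols)
          keep = ~np.isnan(X).any(axis=1)
          X = np.column_stack([np.ones(keep.sum()), X[keep]])  # add intercept
          beta, *_ = np.linalg.lstsq(X, water[keep], rcond=None)
          resid = water[keep] - X @ beta
          rho = np.corrcoef(resid[:-1], resid[1:])[0, 1]  # AR(1) residual term
          return beta, rho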

  2. Core Competencies for Doctoral Education in Public Health

    PubMed Central

    Calhoun, Judith G.; Weist, Elizabeth M.; Raczynski, James M.

    2012-01-01

    The Association of Schools of Public Health (ASPH) released the Doctor of Public Health (DrPH) Core Competency Model in 2009. Between 2007 and 2009, a national expert panel with members of the academic and practice communities, guided by the ASPH Education Committee, developed its 7 performance domains, comprising 54 competencies. We provide an overview and analysis of the challenges and issues associated with the variability in DrPH degree offerings, reflect on the model development process and related outcomes, and discuss the significance of the model, future applications, and challenges for integration across educational settings. With the model, ASPH aims to stimulate national discussion on the competencies needed by DrPH graduates facing the new challenges of 21st-century public health practice and to better define the DrPH degree. PMID:22095342

  3. A Bayesian Network Approach to Modeling Learning Progressions and Task Performance. CRESST Report 776

    ERIC Educational Resources Information Center

    West, Patti; Rutstein, Daisy Wise; Mislevy, Robert J.; Liu, Junhui; Choi, Younyoung; Levy, Roy; Crawford, Aaron; DiCerbo, Kristen E.; Chappel, Kristina; Behrens, John T.

    2010-01-01

    A major issue in the study of learning progressions (LPs) is linking student performance on assessment tasks to the progressions. This report describes the challenges faced in making this linkage using Bayesian networks to model LPs in the field of computer networking. The ideas are illustrated with exemplar Bayesian networks built on Cisco…
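
    A toy version of the linkage conveys the idea: a latent progression level with a prior, an observation model for task success, and a Bayes-rule update. The probabilities are purely illustrative, not taken from the report.

      # Prior over latent progression levels and P(task correct | level).
      p_level = {"novice": 0.5, "intermediate": 0.3, "advanced": 0.2}
      p_correct = {"novice": 0.2, "intermediate": 0.6, "advanced": 0.9}

      def posterior_level(task_correct):
          like = {lv: (p if task_correct else 1 - p)
                  for lv, p in p_correct.items()}
          joint = {lv: like[lv] * p_level[lv] for lv in p_level}
          z = sum(joint.values())
          return {lv: round(v / z, 3) for lv, v in joint.items()}

      print(posterior_level(True))   # belief shifts toward higher levels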

  4. The Relationship between Student Transfers and District Academic Performance: Accounting for Feedback Effects

    ERIC Educational Resources Information Center

    Welsch, David M.; Zimmer, David M.

    2015-01-01

    This paper draws attention to a subtle, but concerning, empirical challenge common in panel data models that seek to estimate the relationship between student transfers and district academic performance. Specifically, if such models have a dynamic element, and if the estimator controls for unobserved traits by including district-level effects,…

  5. A Perkins Challenge: Assessing Technical Skills in CTE

    ERIC Educational Resources Information Center

    Stone, James R., III

    2009-01-01

    Federal law requires states to develop performance measures and data-collection systems for secondary and postsecondary technical-skill attainment. This poses many challenges, such as defining technical skills, measurement, and when to assess students. In this article, the author outlines various assessment models and looks at the challenges…

  6. Employee Engagement and Turnover Intent: An Analysis of the Thai Public Sector

    ERIC Educational Resources Information Center

    Tanthasith, Sitthichai

    2016-01-01

    Organizations these days are facing a number of challenges that affect their performance and productivity. As workplaces become more challenging to employees, employee engagement and turnover become critical concerns for management. Drawing on insights from the Job Demands-Resources model, this study explores the relationships between key…

  7. Integrated Modeling for the James Webb Space Telescope (JWST) Project: Structural Analysis Activities

    NASA Technical Reports Server (NTRS)

    Johnston, John; Mosier, Mark; Howard, Joe; Hyde, Tupper; Parrish, Keith; Ha, Kong; Liu, Frank; McGinnis, Mark

    2004-01-01

    This paper presents viewgraphs about structural analysis activities and integrated modeling for the James Webb Space Telescope (JWST). The topics include: 1) JWST Overview; 2) Observatory Structural Models; 3) Integrated Performance Analysis; and 4) Future Work and Challenges.

  8. Pulse Detonation Rocket Engine Research at NASA Marshall

    NASA Technical Reports Server (NTRS)

    Morris, Christopher I.

    2003-01-01

    Pulse detonation rocket engines (PDREs) offer potential performance improvements over conventional designs, but represent a challenging modeling task. A quasi 1-D, finite-rate chemistry CFD model for a PDRE is described and implemented. A parametric study of the effect of blowdown pressure ratio on the performance of an optimized, fixed PDRE nozzle configuration is reported. The results are compared to a steady-state rocket system using similar modeling assumptions.

  9. Using formal methods to scope performance challenges for Smart Manufacturing Systems: focus on agility.

    PubMed

    Jung, Kiwook; Morris, K C; Lyons, Kevin W; Leong, Swee; Cho, Hyunbo

    2015-12-01

    Smart Manufacturing Systems (SMS) need to be agile to adapt to new situations by using detailed, precise, and appropriate data for intelligent decision-making. The intricacy of the relationship of strategic goals to operational performance across the many levels of a manufacturing system inhibits the realization of SMS. This paper proposes a method for identifying what aspects of a manufacturing system should be addressed to respond to changing strategic goals. The method uses standard modeling techniques in specifying a manufacturing system and the relationship between strategic goals and operational performance metrics. Two existing reference models related to manufacturing operations are represented formally and harmonized to support the proposed method. The method is illustrated for a single scenario using agility as a strategic goal. By replicating the proposed method for other strategic goals and with multiple scenarios, a comprehensive set of performance challenges can be identified.

  10. Using formal methods to scope performance challenges for Smart Manufacturing Systems: focus on agility

    PubMed Central

    Jung, Kiwook; Morris, KC; Lyons, Kevin W.; Leong, Swee; Cho, Hyunbo

    2016-01-01

    (Abstract identical to record 9 above.) PMID:27141209

  11. A Holistic Management Architecture for Large-Scale Adaptive Networks

    DTIC Science & Technology

    2007-09-01

    …transmission and processing overhead required for management. The challenges of building models to describe dynamic systems are well known to the field of… increases the challenge of finding a simple approach to assessing the state of the network. Moreover, the performance state of one network link may be… challenging. These obstacles indicate the need for a less comprehensive-analytical, more systemic-holistic approach to managing networks. This approach might…

  12. Accomplishments and challenges of surgical simulation.

    PubMed

    Satava, R M

    2001-03-01

    For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.

  13. Participation and occupation in occupational therapy models of practice: A discussion of possibilities and challenges.

    PubMed

    Larsson-Lund, Maria; Nyman, Anneli

    2017-11-01

    Occupation has been the focus in occupational therapy practice to greater or lesser degrees from a historical viewpoint. This evokes a need to discuss whether concepts that are added to our field will enhance or blur our focus on occupation. To explore how the concept of participation in the International Classification of Functioning, Disability and Health (ICF) is related to the concept of occupation by reviewing and comparing its use in three models of practice within occupational therapy. The aim was also to generate discussion on possibilities and challenges concerning the relationship of participation and occupation. The models reviewed were The Model of Human Occupation (MOHO), the Canadian Model of Occupational Performance and Engagement (CMOP-E) and the Occupational Therapy Intervention Process Model (OTIPM). The concept of participation was related to occupation in different ways in these models. Based on the review some challenges and considerations for occupational therapy were generated. Relating the concept of participation from the ICF to the concept of occupation in models of practice can be challenging. At the same time, relating the concepts can be a resource to develop occupational therapy and the understanding of occupational issues in society.

  14. Understanding the dynamics of sustainable social-ecological systems: human behavior, institutions, and regulatory feedback networks.

    PubMed

    Anderies, John M

    2015-02-01

    I present a general mathematical modeling framework that can provide a foundation for the study of sustainability in social-ecological systems (SESs). Using basic principles from feedback control and a sequence of specific models from bioeconomics and economic growth, I outline several mathematical and empirical challenges associated with the study of sustainability of SESs. These challenges are categorized into three classes: (1) the social choice of performance measures, (2) uncertainty, and (3) collective action. Finally, I present some opportunities for combining stylized dynamical systems models with empirical data on human behavior and biophysical systems to address practical challenges for the design of effective governance regimes (policy feedbacks) for highly uncertain natural resource systems.

  15. The IDEA model: A single equation approach to the Ebola forecasting challenge.

    PubMed

    Tuite, Ashleigh R; Fisman, David N

    2018-03-01

    Mathematical modeling is increasingly accepted as a tool that can inform disease control policy in the face of emerging infectious diseases, such as the 2014-2015 West African Ebola epidemic, but little is known about the relative performance of alternative forecasting approaches. The RAPIDD Ebola Forecasting Challenge (REFC) tested the ability of eight mathematical models to generate useful forecasts in the face of simulated Ebola outbreaks. We used a simple, phenomenological single-equation model (the "IDEA" model), which relies only on case counts, in the REFC. Model fits were performed using a maximum likelihood approach. We found that the model performed reasonably well relative to other, more complex approaches, with performance metrics ranked on average 4th or 5th among participating models. IDEA appeared better suited to long-term than short-term forecasts and could be fit using nothing but reported case counts. Several limitations were identified, including difficulty in identifying the epidemic peak (even retrospectively), unrealistically precise confidence intervals, and difficulty interpolating daily case counts when using a model scaled to epidemic generation time. More realistic confidence intervals were generated when case counts were assumed to follow a negative binomial, rather than Poisson, distribution. Nonetheless, IDEA represents a simple phenomenological model, easily implemented in widely available software packages, that could be used by frontline public health personnel to generate forecasts with accuracy approximating that achieved using more complex methodologies. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
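
    The IDEA model itself is a single equation, I(t) = (R0 / (1 + d)^t)^t, with t counted in disease generations and d a discounting parameter. The sketch below fits it by maximizing a Poisson log-likelihood on illustrative counts; the paper's likelihood machinery and data differ.

      import numpy as np
      from scipy.optimize import minimize

      def idea_incidence(t, R0, d):
          # IDEA: incidence at generation t with control/decay parameter d.
          return (R0 / (1.0 + d) ** t) ** t

      t = np.arange(1, 11)
      cases = np.array([3, 8, 18, 35, 55, 70, 75, 66, 50, 33])  # illustrative

      def neg_loglik(params):
          lam = idea_incidence(t, *params)
          return -np.sum(cases * np.log(lam) - lam)   # Poisson, up to a constant

      fit = minimize(neg_loglik, x0=(2.0, 0.05),
                     bounds=[(1.0, 10.0), (0.0, 1.0)])
      print("R0, d =", fit.x)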

  16. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate, as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems lack the functionality needed to enable agile and flexible predictive modelling. Here we present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system, which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
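
    The dependency-driven style that SciLuigi builds on can be seen in a plain Luigi sketch like the one below; SciLuigi's own flow-based wiring API is not reproduced here, and the task bodies are placeholders.

      import luigi

      class Preprocess(luigi.Task):
          dataset = luigi.Parameter()

          def output(self):
              return luigi.LocalTarget(f"work/{self.dataset}.clean.csv")

          def run(self):
              with self.output().open("w") as f:
                  f.write("...cleaned data...")        # placeholder step

      class TrainModel(luigi.Task):
          dataset = luigi.Parameter()

          def requires(self):
              return Preprocess(dataset=self.dataset)  # explicit dependency

          def output(self):
              return luigi.LocalTarget(f"work/{self.dataset}.model")

          def run(self):
              with self.input().open() as fin, self.output().open("w") as fout:
                  fout.write(f"model trained on {len(fin.read())} bytes")

      if __name__ == "__main__":
          luigi.build([TrainModel(dataset="interactions")], local_scheduler=True)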

  17. Assessment and Challenges of Ligand Docking into Comparative Models of G-Protein Coupled Receptors

    PubMed Central

    Frimurer, Thomas M.; Meiler, Jens

    2013-01-01

    The rapidly increasing number of high-resolution X-ray structures of G-protein coupled receptors (GPCRs) creates a unique opportunity to employ comparative modeling and docking to provide valuable insight into the function and ligand binding determinants of novel receptors, to assist in virtual screening, and to design and optimize drug candidates. However, low sequence identity between receptors, conformational flexibility, and the chemical diversity of ligands present an enormous challenge to molecular modeling approaches. It is our hypothesis that rapid Monte-Carlo sampling of protein backbone and side-chain conformational space with Rosetta can be leveraged to meet this challenge. This study applies unbiased comparative modeling and docking methodologies to 14 distinct high-resolution GPCRs and proposes knowledge-based filtering methods for improving sampling performance and identifying correct ligand-receptor interactions. On average, top-ranked receptor models built on template structures with over 50% sequence identity are within 2.9 Å of the experimental structure, with an average root mean square deviation (RMSD) of 2.2 Å for the transmembrane region and 5 Å for the second extracellular loop. Furthermore, these models consistently correlate with low Rosetta energy scores. To predict their binding modes, ligand conformers of the 14 ligands co-crystallized with the GPCRs were docked against the top-ranked comparative models. In contrast to the comparative models themselves, however, it remains difficult to unambiguously identify correct binding modes by score alone. On average, sampling performance was improved 10³-fold over random using knowledge-based and energy-based filters. In assessing the applicability of experimental constraints, we found that sampling performance is increased by one order of magnitude for every 10 residues known to contact the ligand. Additionally, in the case of DOR, knowledge of a single specific ligand-protein contact improved sampling efficiency 7-fold. These findings offer specific guidelines which may lead to increased success in determining receptor-ligand complexes. PMID:23844000

  18. Sustainability of cross-functional teams for marketing strategy development and implementation.

    PubMed

    Kono, Ken; Antonucci, Don

    2006-01-01

    This article presents a case study of a cross-functional team used for marketing strategy development and execution at a health insurance company. The study found a set of success factors that contributed to the initial success of the team, but the factors were not enough to maintain the team's high level of productivity over time. The study later identified a set of 8 factors that helped sustain the team's high productivity level. The 2 sets (i.e., success and subsequent sustainability factors) are analyzed against a normative model of team effectiveness. All the factors are explained by the normative model except for 1 sustainability factor, the "challenge motivator." In fact, the study found the "challenge motivator" to be the most critical factor in keeping up the team's productivity over time. Apart from a performance crisis, the authors developed 3 "challenge motivators": first, more granular market information that could unearth hidden performance issues; second, constant value creation for shareholders, as the firm is publicly traded; and third, the firm's strategic mandate to meet and exceed customer expectations, which puts ultimate performance pressure on the marketing strategy team.

  19. Addressing the Big-Earth-Data Variety Challenge with the Hierarchical Triangular Mesh

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Kuo, Kwo-Sen; Clune, Thomas; Oloso, Amidu; Brown, Paul G.; Yu, Honfeng

    2016-01-01

    We have implemented an updated Hierarchical Triangular Mesh (HTM) as the basis for a unified data model and an indexing scheme for geoscience data to address the variety challenge of Big Earth Data. We observe that, in the absence of variety, the volume challenge of Big Data is relatively easily addressable with parallel processing. The more important challenge in achieving optimal value with a Big Data solution for Earth Science (ES) data analysis, however, is being able to achieve good scalability with variety. With HTM unifying at least the three popular data models used by current ES data products, i.e., Grid, Swath, and Point, data preparation time for integrative analysis of diverse datasets can be drastically reduced and better variety scaling can be achieved. In addition, since HTM is also an indexing scheme, when it is used to index all ES datasets, data placement alignment (or co-location) on the shared-nothing architecture, which most Big Data systems are based on, is guaranteed and better performance is ensured. Moreover, our updated HTM encoding turns most geospatial set operations into integer interval operations, gaining further performance advantages.
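
    The integer-interval point is easy to make concrete: if every region maps to a sorted list of index intervals (as a trixel-style HTM index provides), spatial intersection reduces to merging interval lists. The sketch below is a generic interval intersection, not the authors' implementation.

      def intersect(a, b):
          # Intersect two sorted, non-overlapping interval lists [(lo, hi), ...].
          out, i, j = [], 0, 0
          while i < len(a) and j < len(b):
              lo, hi = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
              if lo <= hi:
                  out.append((lo, hi))
              if a[i][1] < b[j][1]:
                  i += 1
              else:
                  j += 1
          return out

      print(intersect([(0, 9), (20, 35)], [(5, 24)]))  # [(5, 9), (20, 24)]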

  20. Numerical simulation of turbulent combustion: Scientific challenges

    NASA Astrophysics Data System (ADS)

    Ren, ZhuYin; Lu, Zhen; Hou, LingYun; Lu, LiuYan

    2014-08-01

    Predictive simulation of engine combustion is key to understanding the underlying complicated physicochemical processes, improving engine performance, and reducing pollutant emissions. Critical issues such as turbulence modeling, turbulence-chemistry interaction, and the accommodation of detailed chemical kinetics in complex flows remain challenging and essential for high-fidelity combustion simulation. This paper reviews the current status of the state-of-the-art large eddy simulation (LES)/probability density function (PDF)/detailed chemistry approach, which can address these three challenging modelling issues. PDF as a subgrid model for LES is formulated, and the hybrid mesh-particle method for LES/PDF simulations is described. Then the development needs in micro-mixing models for PDF simulations of turbulent premixed combustion are identified. Finally, the different acceleration methods for detailed chemistry are reviewed and a combined strategy is proposed for further development.

  1. Alternative Method to Simulate a Sub-idle Engine Operation in Order to Synthesize Its Control System

    NASA Astrophysics Data System (ADS)

    Sukhovii, Sergii I.; Sirenko, Feliks F.; Yepifanov, Sergiy V.; Loboda, Igor

    2016-09-01

    The steady-state and transient engine performance used in control system design is usually evaluated with thermodynamic engine models. Most models operate between the idle and maximum power points; only recently have they sometimes addressed the sub-idle operating range. The lack of information about the component maps at sub-idle modes presents a challenging problem. A common method to cope with the problem is to extrapolate the component performances to the sub-idle range. Precise extrapolation is also a challenge. As a rule, researchers address only particular aspects of the problem, such as lighting the combustion chamber or turbine operation when the combustion chamber is turned off. However, there are no reports of a model that considers all of these aspects and simulates the engine start. The proposed paper addresses a new method to simulate the start. The method substitutes the non-linear thermodynamic model with a linear dynamic model, supplemented by a simplified static model. The latter is a set of direct relations between parameters that are used in the control algorithms instead of the commonly used component performances. Specifically, this model consists of simplified relations between the gas path parameters and the corrected rotational speed.
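
    The substitution described (a first-order linear rotor dynamic driven by a simplified static relation instead of full component maps) can be sketched as follows; the time constant and static map coefficients are purely illustrative.

      import numpy as np

      def simulate_start(u_of_t, n0=0.05, tau=1.8, dt=0.01, t_end=20.0):
          # n: corrected rotational speed (fraction of design speed).
          t = np.arange(0.0, t_end, dt)
          n = np.empty_like(t)
          n[0] = n0
          n_ss = lambda u: 0.05 + 0.9 * u           # stand-in static relation
          for k in range(1, len(t)):
              dn = (n_ss(u_of_t(t[k])) - n[k - 1]) / tau
              n[k] = n[k - 1] + dt * dn             # explicit Euler step
          return t, n

      t, n = simulate_start(lambda t: min(1.0, t / 10.0))  # ramped fuel command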

  2. A Comparison Study for DNA Motif Modeling on Protein Binding Microarray.

    PubMed

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Wong, Hau-San

    2016-01-01

    Transcription factor binding sites (TFBSs) are relatively short (5-15 bp) and degenerate. Identifying them is a computationally challenging task. In particular, protein binding microarray (PBM) is a high-throughput platform that can measure the DNA binding preference of a protein in a comprehensive and unbiased manner; for instance, a typical PBM experiment can measure the binding signal intensities of a protein to all possible DNA k-mers (k = 8∼10). Since proteins can often bind to DNA with different binding intensities, one of the major challenges is to build TFBS (also known as DNA motif) models which can fully capture the quantitative binding affinity data. To learn DNA motif models from the non-convex objective function landscape, several optimization methods are compared and applied to the PBM motif model building problem. In particular, representative methods from different optimization paradigms were chosen for a modeling performance comparison on hundreds of PBM datasets. The results suggest that multimodal optimization methods are very effective at capturing the binding preference information from PBM data. In particular, we observe a general performance improvement when choosing di-nucleotide modeling over mono-nucleotide modeling. In addition, the models learned by the best-performing method are applied to two independent applications: PBM probe rotation testing and ChIP-Seq peak sequence prediction, demonstrating its biological applicability.
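
    A mono-nucleotide motif model is essentially a position weight matrix whose k-mer score is a sum of per-position terms; a di-nucleotide model indexes adjacent base pairs instead, capturing dependencies the PWM misses. The matrix values below are random placeholders.

      import numpy as np

      BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

      def score_kmer(pwm, kmer):
          # pwm: (k, 4) array of per-position log-odds; kmer: string of length k.
          return sum(pwm[i, BASES[b]] for i, b in enumerate(kmer))

      rng = np.random.default_rng(0)
      pwm = rng.normal(size=(8, 4))                 # placeholder 8-mer model
      print(score_kmer(pwm, "ACGTACGT"))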

  3. Blind prediction of cyclohexane-water distribution coefficients from the SAMPL5 challenge.

    PubMed

    Bannan, Caitlin C; Burley, Kalistyn H; Chiu, Michael; Shirts, Michael R; Gilson, Michael K; Mobley, David L

    2016-11-01

    In the recent SAMPL5 challenge, participants submitted predictions for cyclohexane/water distribution coefficients for a set of 53 small molecules. Distribution coefficients (log D) replace the hydration free energies that were a central part of the past five SAMPL challenges. A wide variety of computational methods were represented by the 76 submissions from 18 participating groups. Here, we analyze submissions by a variety of error metrics and provide details for a number of reference calculations we performed. As in the SAMPL4 challenge, we assessed the ability of participants to evaluate not just their statistical uncertainty, but their model uncertainty: how well they can predict the magnitude of their model or force field error for specific predictions. Unfortunately, this remains an area where prediction and analysis need improvement. In SAMPL4, the top-performing submissions achieved a root-mean-squared error (RMSE) around 1.5 kcal/mol. If we anticipate accuracy in log D predictions to be similar to the hydration free energy predictions in SAMPL4, the expected error here would be around 1.54 log units. Only a few submissions had an RMSE below 2.5 log units in their predicted log D values. However, distribution coefficients introduced complexities not present in past SAMPL challenges, including tautomer enumeration, that are likely to be important in predicting biomolecular properties of interest to drug discovery, so some decrease in accuracy would be expected. Overall, the SAMPL5 distribution coefficient challenge provided great insight into the importance of modeling a variety of physical effects. We believe these types of measurements will be a promising source of data for future blind challenges, especially in view of the relatively straightforward nature of the experiments and the level of insight provided.
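
    The quantity being predicted relates directly to a transfer free energy, log10 D = -ΔG(water→cyclohexane) / (RT ln 10), and submissions were scored with metrics such as RMSE. The sketch below shows both, with illustrative numbers only.

      import numpy as np

      R, T = 1.987204e-3, 298.15        # kcal/(mol K), K

      def log_d(dg_w_to_chx):
          # A negative water-to-cyclohexane transfer free energy favors
          # cyclohexane, giving log D > 0.
          return -dg_w_to_chx / (R * T * np.log(10.0))

      def rmse(pred, expt):
          pred, expt = np.asarray(pred), np.asarray(expt)
          return np.sqrt(np.mean((pred - expt) ** 2))

      print(log_d(-2.0))                               # about 1.47 log units
      print(rmse([1.2, -0.5, 3.1], [0.8, -1.0, 2.0]))  # error in log units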

  4. Blind prediction of cyclohexane-water distribution coefficients from the SAMPL5 challenge

    PubMed Central

    Bannan, Caitlin C.; Burley, Kalistyn H.; Chiu, Michael; Shirts, Michael R.; Gilson, Michael K.; Mobley, David L.

    2016-01-01

    (Abstract identical to record 3 above.) PMID:27677750

  5. System performance and modeling of a bioaerosol detection lidar sensor utilizing polarization diversity

    NASA Astrophysics Data System (ADS)

    Glennon, John J.; Nichols, Terry; Gatt, Phillip; Baynard, Tahllee; Marquardt, John H.; Vanderbeek, Richard G.

    2009-05-01

    The weaponization and dissemination of biological warfare agents (BWAs) constitute a high threat to civilians and military personnel. An aerosol release, disseminated from a single point, can directly affect large areas and many people in a short time. Because of this threat, real-time standoff detection of BWAs is a key requirement for national and military security. BWAs are a general class of material that can refer to spores, bacteria, toxins, or viruses. These bioaerosols have a tremendous size, shape, and chemical diversity that, at present, is not well characterized [1]. Lockheed Martin Coherent Technologies (LMCT) has developed a standoff lidar sensor with high sensitivity and robust discrimination capabilities, with a size and ruggedness appropriate for military use. This technology utilizes multiwavelength backscatter polarization diversity to discriminate between biological threats and naturally occurring interferents such as dust, smoke, and pollen. The optical design and hardware selection of the system have been driven by performance modeling, leading to an understanding of measured system sensitivity. Here we briefly discuss the challenges of standoff bioaerosol discrimination and the approach used by LMCT to overcome them. We review the radiometric calculations involved in modeling direct detection of a distributed aerosol target and methods for accurately estimating wavelength-dependent plume backscatter coefficients. Key model parameters and their validation are discussed and outlined. Metrics for sensor sensitivity are defined, modeled, and compared directly to data taken at Dugway Proving Ground, UT in 2008. Sensor sensitivity is modeled to predict performance changes between day and night operation and in various challenging environmental conditions.
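
    Radiometric performance estimates of this kind start from the single-scatter lidar equation; a bare-bones version is below, with every parameter value illustrative rather than taken from the LMCT sensor.

      import numpy as np

      def received_power(P0, beta, R, A=0.02, eta=0.3, alpha=1e-4, gate=15.0):
          # P0 [W] transmit power, beta [1/(m sr)] plume backscatter,
          # R [m] range, A [m^2] receiver aperture, eta optics efficiency,
          # alpha [1/m] extinction, gate [m] range-gate length (c*tau/2).
          T2 = np.exp(-2.0 * alpha * R)   # two-way atmospheric transmission
          return P0 * gate * beta * (A / R ** 2) * T2 * eta

      print(received_power(P0=1e5, beta=2e-6, R=1000.0))  # watts at detector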

  6. Thermal-to-visible face recognition using partial least squares.

    PubMed

    Hu, Shuowen; Choi, Jonghyun; Chan, Alex L; Schwartz, William Robson

    2015-03-01

    Although visible face recognition has been an active area of research for several decades, cross-modal face recognition has only been explored by the biometrics community relatively recently. Thermal-to-visible face recognition is one of the most difficult cross-modal face recognition challenges, because of the difference in phenomenology between the thermal and visible imaging modalities. We address the cross-modal recognition problem using a partial least squares (PLS) regression-based approach consisting of preprocessing, feature extraction, and PLS model building. The preprocessing and feature extraction stages are designed to reduce the modality gap between the thermal and visible facial signatures, and facilitate the subsequent one-vs-all PLS-based model building. We incorporate multi-modal information into the PLS model building stage to enhance cross-modal recognition. The performance of the proposed recognition algorithm is evaluated on three challenging datasets containing visible and thermal imagery acquired under different experimental scenarios: time-lapse, physical tasks, mental tasks, and subject-to-camera range. These scenarios represent difficult challenges relevant to real-world applications. We demonstrate that the proposed method performs robustly for the examined scenarios.
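
    The one-vs-all PLS stage can be sketched with scikit-learn: one regression per gallery subject, trained to separate that subject from the rest, with a probe assigned to the highest-responding model. The features here are random placeholders standing in for the preprocessed thermal/visible signatures.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 128))            # gallery features (placeholder)
      y_id = np.repeat(np.arange(6), 10)        # 6 subjects, 10 samples each

      models = {}
      for sid in np.unique(y_id):
          target = np.where(y_id == sid, 1.0, -1.0)   # one-vs-all labels
          models[sid] = PLSRegression(n_components=8).fit(X, target)

      probe = rng.normal(size=(1, 128))         # probe features (placeholder)
      scores = {sid: m.predict(probe).ravel()[0] for sid, m in models.items()}
      print("matched subject:", max(scores, key=scores.get))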

  7. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    PubMed Central

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-01-01

Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes infeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradictory results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible. PMID:25745272
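    For readers unfamiliar with the reference method, a minimal sketch of brute-force Monte Carlo BME estimation follows: the evidence is the likelihood averaged over samples drawn from the prior, computed here with a log-sum-exp for numerical stability. The Gaussian prior and likelihood are illustrative assumptions, not the paper's hydrological models.

    ```python
    # Brute-force Monte Carlo estimate of log Bayesian model evidence:
    # BME = integral of p(D|theta) p(theta) dtheta, approximated as the
    # mean likelihood over prior samples. Prior/likelihood are toy Gaussians.
    import numpy as np

    rng = np.random.default_rng(0)

    def log_likelihood(theta, data):
        resid = data - theta                      # data modeled as noisy theta
        return -0.5 * np.sum(resid**2) - 0.5 * data.size * np.log(2 * np.pi)

    def log_bme_monte_carlo(data, n_samples=100_000):
        thetas = rng.standard_normal(n_samples)   # samples from the prior
        logL = np.array([log_likelihood(t, data) for t in thetas])
        m = logL.max()                            # log-sum-exp stabilization
        return m + np.log(np.mean(np.exp(logL - m)))

    print("log-BME estimate:", log_bme_monte_carlo(np.array([0.3, -0.1, 0.2])))
    ```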

  8. Developing Staffing Models to Support Population Health Management and Quality Outcomes in Ambulatory Care Settings.

    PubMed

    Haas, Sheila A; Vlasses, Frances; Havey, Julia

    2016-01-01

There are multiple demands and challenges inherent in establishing staffing models in ambulatory health care settings today. If health care administrators establish a supportive physical and interpersonal health care environment, develop high-performing interprofessional teams and staffing models, and deploy electronic documentation systems that track performance, patients will have more opportunities to receive safe, high-quality, evidence-based care that encourages their participation in decision making and in the provision of their care. The health care organization must be aligned and responsive to the community within which it resides, fully invested in population health management, and continuously scanning the environment for competitive, regulatory, and external environmental risks. All of these challenges require highly competent providers willing to change attitudes and culture, such as moving toward collaborative practice within the interprofessional team, including the patient.

  9. Additive Manufacturing (AM) in Expeditionary Operations: Current Needs, Technical Challenges, and Opportunities

    DTIC Science & Technology

    2016-06-01

…site customization of existing models. The author performed an empirical study centered around a survey of United States Marine Corps (USMC) and United States Navy (USN) … and recommends that more studies be performed to determine the best way forward for AM within the USMC and USN. Subject terms: 3D printing, additive …

  10. Real-Time Robust Adaptive Modeling and Scheduling for an Electronic Commerce Server

    NASA Astrophysics Data System (ADS)

    Du, Bing; Ruan, Chun

With the increasing importance and pervasiveness of Internet services, the proliferation of electronic commerce services makes it challenging to provide performance guarantees under extreme overload. This paper describes a real-time optimization modeling and scheduling approach for performance guarantees in electronic commerce servers. We show that an electronic commerce server may be simulated as a multi-tank system. A robust adaptive server model is subject to unknown additive load disturbances and uncertain model matching. Overload control techniques are based on adaptive admission control to achieve timing guarantees. We evaluate the performance of the model using a complex simulation that is subjected to varying model parameters and massive overload.
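    A rough sketch of adaptive admission control in the spirit described above (not the paper's controller): the admit probability is nudged up or down according to how measured response times compare with a deadline target. The controller gain, deadline, and probability floor are assumptions.

    ```python
    # Adaptive admission control: tighten admission when latency exceeds the
    # deadline, relax it otherwise. Gain, deadline, and floor are assumptions.
    import random

    class AdmissionController:
        def __init__(self, deadline_s=1.0, gain=0.1, floor=0.05):
            self.deadline = deadline_s
            self.gain = gain
            self.floor = floor
            self.admit_prob = 1.0

        def update(self, measured_latency_s):
            # Positive error (latency under deadline) raises the admit rate
            error = (self.deadline - measured_latency_s) / self.deadline
            self.admit_prob = min(1.0, max(self.floor,
                                           self.admit_prob + self.gain * error))

        def admit(self):
            return random.random() < self.admit_prob

    ctl = AdmissionController()
    ctl.update(measured_latency_s=1.8)   # overload observed
    print(ctl.admit_prob)                # admit probability drops below 1.0
    ```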

  11. A Multitasking General Executive for Compound Continuous Tasks

    ERIC Educational Resources Information Center

    Salvucci, Dario D.

    2005-01-01

    As cognitive architectures move to account for increasingly complex real-world tasks, one of the most pressing challenges involves understanding and modeling human multitasking. Although a number of existing models now perform multitasking in real-world scenarios, these models typically employ customized executives that schedule tasks for the…

  12. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.

  13. Towards a Rigorous Assessment of Systems Biology Models: The DREAM3 Challenges

    PubMed Central

    Prill, Robert J.; Marbach, Daniel; Saez-Rodriguez, Julio; Sorger, Peter K.; Alexopoulos, Leonidas G.; Xue, Xiaowei; Clarke, Neil D.; Altan-Bonnet, Gregoire; Stolovitzky, Gustavo

    2010-01-01

    Background Systems biology has embraced computational modeling in response to the quantitative nature and increasing scale of contemporary data sets. The onslaught of data is accelerating as molecular profiling technology evolves. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) is a community effort to catalyze discussion about the design, application, and assessment of systems biology models through annual reverse-engineering challenges. Methodology and Principal Findings We describe our assessments of the four challenges associated with the third DREAM conference which came to be known as the DREAM3 challenges: signaling cascade identification, signaling response prediction, gene expression prediction, and the DREAM3 in silico network challenge. The challenges, based on anonymized data sets, tested participants in network inference and prediction of measurements. Forty teams submitted 413 predicted networks and measurement test sets. Overall, a handful of best-performer teams were identified, while a majority of teams made predictions that were equivalent to random. Counterintuitively, combining the predictions of multiple teams (including the weaker teams) can in some cases improve predictive power beyond that of any single method. Conclusions DREAM provides valuable feedback to practitioners of systems biology modeling. Lessons learned from the predictions of the community provide much-needed context for interpreting claims of efficacy of algorithms described in the scientific literature. PMID:20186320
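    The counterintuitive aggregation result can be illustrated with a minimal sketch: rank-averaging several teams' confidence scores yields a "community" prediction. The score matrix below is invented, and DREAM's actual scoring pipeline is more involved.

    ```python
    # Community prediction by rank aggregation: each row holds one team's
    # confidence scores for the same candidate network edges.
    import numpy as np
    from scipy.stats import rankdata

    def community_prediction(team_scores):
        ranks = np.vstack([rankdata(s) for s in team_scores])
        return ranks.mean(axis=0)            # higher consensus rank = stronger edge

    scores = np.array([[0.9, 0.1, 0.5],      # team 1
                       [0.6, 0.2, 0.8],      # team 2
                       [0.4, 0.3, 0.7]])     # team 3
    print(community_prediction(scores))      # consensus ordering of three edges
    ```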

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Carmack; L. Braase; F. Goldner

The mission of the Advanced Fuels Campaign (AFC) is to perform Research, Development, and Demonstration (RD&D) activities for advanced fuel forms (including cladding) to enhance the performance and safety of the nation’s current and future reactors, enhance proliferation resistance of nuclear fuel, effectively utilize nuclear energy resources, and address the longer-term waste management challenges. This includes development of a state-of-the-art Research and Development (R&D) infrastructure to support the use of a “goal oriented science based approach.” AFC uses a “goal oriented, science based approach” aimed at a fundamental understanding of fuel and cladding fabrication methods and performance under irradiation, enabling the pursuit of multiple fuel forms for future fuel cycle options. This approach includes fundamental experiments, theory, and advanced modeling and simulation. One of the most challenging aspects of AFC is the management, integration, and coordination of major R&D activities across multiple organizations. AFC interfaces and collaborates with Fuel Cycle Technologies (FCT) campaigns, universities, industry, various DOE programs and laboratories, federal agencies (e.g., Nuclear Regulatory Commission [NRC]), and international organizations. Key challenges are the development of fuel technologies to enable major increases in fuel performance (safety, reliability, power and burnup) beyond current technologies, and development of characterization methods and predictive fuel performance models to enable more efficient development and licensing of advanced fuels. Challenged with the research and development of fuels for two different reactor technology platforms, AFC targeted transmutation fuel development and focused ceramic fuel development for Advanced LWR Fuels.

  15. Evaluation of prostate segmentation algorithms for MRI: the PROMISE12 challenge

    PubMed Central

    Litjens, Geert; Toth, Robert; van de Ven, Wendy; Hoeks, Caroline; Kerkstra, Sjoerd; van Ginneken, Bram; Vincent, Graham; Guillard, Gwenael; Birbeck, Neil; Zhang, Jindang; Strand, Robin; Malmberg, Filip; Ou, Yangming; Davatzikos, Christos; Kirschner, Matthias; Jung, Florian; Yuan, Jing; Qiu, Wu; Gao, Qinquan; Edwards, Philip “Eddie”; Maan, Bianca; van der Heijden, Ferdinand; Ghose, Soumya; Mitra, Jhimli; Dowling, Jason; Barratt, Dean; Huisman, Henkjan; Madabhushi, Anant

    2014-01-01

Prostate MRI image segmentation has been an area of intense research due to the increased use of MRI as a modality for the clinical workup of prostate cancer. Segmentation is useful for various tasks, e.g. to accurately localize prostate boundaries for radiotherapy or to initialize multi-modal registration algorithms. In the past, it has been difficult for research groups to evaluate prostate segmentation algorithms on multi-center, multi-vendor and multi-protocol data. Especially because we are dealing with MR images, image appearance, resolution and the presence of artifacts are affected by differences in scanners and/or protocols, which in turn can have a large influence on algorithm accuracy. The Prostate MR Image Segmentation (PROMISE12) challenge was set up to allow a fair and meaningful comparison of segmentation methods on the basis of performance and robustness. In this work we discuss the initial results of the online PROMISE12 challenge, and the results obtained in the live challenge workshop hosted by the MICCAI2012 conference. In the challenge, 100 prostate MR cases from 4 different centers were included, with differences in scanner manufacturer, field strength and protocol. A total of 11 teams from academic research groups and industry participated. Algorithms showed a wide variety in methods and implementation, including active appearance models, atlas registration and level sets. Evaluation was performed using boundary and volume based metrics which were combined into a single score relating the metrics to human expert performance. The winners of the challenge were the algorithms by teams Imorphics and ScrAutoProstate, with scores of 85.72 and 84.29 overall. Both algorithms were significantly better than all other algorithms in the challenge (p < 0.05) and had an efficient implementation with run times of 8 minutes and 3 seconds per case, respectively. Overall, active appearance model-based approaches seemed to outperform other approaches like multi-atlas registration, both on accuracy and computation time. Although average algorithm performance was good to excellent and the Imorphics algorithm outperformed the second observer on average, we showed that algorithm combination might lead to further improvement, indicating that optimal performance for prostate segmentation is not yet obtained. All results are available online at http://promise12.grand-challenge.org/. PMID:24418598
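    As an illustrative component of such scoring, the sketch below computes the Dice similarity coefficient, one common volume-based overlap metric; the actual PROMISE12 score also incorporated boundary metrics and a mapping to second-observer performance.

    ```python
    # Dice similarity coefficient between an algorithm segmentation and a
    # reference segmentation, both as boolean 3-D voxel arrays.
    import numpy as np

    def dice(seg, ref):
        seg, ref = seg.astype(bool), ref.astype(bool)
        intersection = np.logical_and(seg, ref).sum()
        return 2.0 * intersection / (seg.sum() + ref.sum())

    a = np.zeros((4, 4, 4), bool); a[1:3, 1:3, 1:3] = True   # toy "algorithm" mask
    b = np.zeros((4, 4, 4), bool); b[1:3, 1:3, :] = True     # toy "reference" mask
    print(f"Dice = {dice(a, b):.3f}")
    ```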

  16. Animal models of asthma: utility and limitations.

    PubMed

    Aun, Marcelo Vivolo; Bonamichi-Santos, Rafael; Arantes-Costa, Fernanda Magalhães; Kalil, Jorge; Giavina-Bianchi, Pedro

    2017-01-01

Clinical studies in asthma are not able to clear up all aspects of disease pathophysiology. Animal models have been developed to better understand these mechanisms and to evaluate both safety and efficacy of therapies before starting clinical trials. Several species of animals have been used in experimental models of asthma, such as Drosophila, rats, guinea pigs, cats, dogs, pigs, primates and equines. However, the most commonly studied species in the last two decades has been the mouse, particularly the BALB/c strain. Animal models of asthma try to mimic the pathophysiology of human disease. They classically include two phases: sensitization and challenge. Sensitization is traditionally performed by intraperitoneal and subcutaneous routes, but intranasal instillation of allergens has been increasingly used because human asthma is induced by inhalation of allergens. Challenges with allergens are performed through aerosol, intranasal or intratracheal instillation. However, few studies have compared different routes of sensitization and challenge. The causative allergen is another important issue in developing a good animal model. Despite being more traditional and leading to intense inflammation, ovalbumin has been replaced by aeroallergens, such as house dust mites, to use the allergens that cause human disease. Finally, researchers should define outcomes to be evaluated, such as serum-specific antibodies, airway hyperresponsiveness, inflammation and remodeling. The present review analyzes the animal models of asthma, assessing differences between species, allergens and routes of allergen administration.

  17. MaRIE: A facility for time-dependent materials science at the mesoscale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Cris William; Kippen, Karen Elizabeth

To meet new and emerging national security issues, the Laboratory is stepping up to meet another grand challenge: transitioning from observing to controlling a material's performance. This challenge requires the best of experiment, modeling, simulation, and computational tools. MaRIE is the Laboratory's proposed flagship experimental facility intended to meet the challenge.

  18. Preparing the Principal to Drive the Goals of Education for All: A Conceptual Case Developmental Model

    ERIC Educational Resources Information Center

    Hutton, Disraeli M.

    2014-01-01

    The Education for All (EFA) goals, established for countries to improve educational performance, are most challenging for many developing countries. Notwithstanding the challenges, each country must implement suitable programme intervention in order to accomplish these goals. Goal 6 calls for the overall improvement of the education product, which…

  19. Computational fluid dynamics-habitat suitability index (CFD-HSI) modelling as an exploratory tool for assessing passability of riverine migratory challenge zones for fish

    USGS Publications Warehouse

    Haro, Alexander J.; Chelminski, Michael; Dudley, Robert W.

    2015-01-01

We developed two-dimensional computational fluid dynamics-habitat suitability index (CFD-HSI) models to identify and qualitatively assess potential zones of shallow water depth and high water velocity that may present passage challenges for five major anadromous fish species in a 2.63-km reach of the main-stem Penobscot River, Maine, as a result of a dam removal downstream of the reach. Suitability parameters were based on distributions of fish lengths and body depths and transformed to cruising, maximum sustained, and sprint swimming speeds. Zones of potential depth and velocity challenges were calculated based on the hydraulic models; the ability of fish to pass a challenge zone was based on the percent of the river channel that the contiguous zone spanned and its maximum along-current length. Three river flows (low: 99.1 m³ s⁻¹; normal: 344.9 m³ s⁻¹; and high: 792.9 m³ s⁻¹) were modelled to simulate existing hydraulic conditions and hydraulic conditions following removal of a dam at the downstream boundary of the reach. Potential depth challenge zones were nonexistent for all low-flow simulations of existing conditions for deeper-bodied fishes. Increasing flows for existing conditions and removal of the dam under all flow conditions increased the number and size of potential velocity challenge zones, with the effects being more pronounced for smaller species. The two-dimensional CFD-HSI model has utility in demonstrating gross effects of flow and hydraulic alteration, but it may not be as precise a predictive tool as a three-dimensional model. Passability of the potential challenge zones cannot be precisely quantified for two-dimensional or three-dimensional models due to untested assumptions and incomplete data on fish swimming performance and behaviours.
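    A minimal sketch of the screening idea described above: cells of a 2-D hydraulic model whose velocity exceeds a species' sprint swimming speed are flagged, and each along-current station is summarized by the fraction of channel width blocked. The grid and threshold are illustrative assumptions, not Penobscot model output.

    ```python
    # Flag potential velocity challenge zones on a 2-D hydraulic grid:
    # rows span the channel width, columns run along the current.
    import numpy as np

    def width_fraction_blocked(velocity, sprint_speed):
        challenged = velocity > sprint_speed     # cells too fast to sprint through
        return challenged.mean(axis=0)           # per-station fraction of width

    velocity = np.array([[0.8, 2.4, 1.1],        # depth-averaged velocity [m/s]
                         [0.9, 2.6, 1.0],
                         [0.7, 1.2, 0.9]])
    print(width_fraction_blocked(velocity, sprint_speed=2.0))   # ~[0, 0.67, 0]
    ```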

  20. Simplified Analysis of Pulse Detonation Rocket Engine Blowdown Gasdynamics and Performance

    NASA Technical Reports Server (NTRS)

    Morris, C. I.; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

Pulse detonation rocket engines (PDREs) offer potential performance improvements over conventional designs, but represent a challenging modeling task. A simplified model for an idealized, straight-tube, single-shot PDRE blowdown process and thrust determination is described and implemented. In order to assess the accuracy of the model, the flowfield time history is compared to experimental data from Stanford University. Parametric studies of the effect of mixture stoichiometry, initial fill temperature, and blowdown pressure ratio on the performance of a PDRE are performed using the model. PDRE performance is also compared with a conventional steady-state rocket engine over a range of pressure ratios using similar gasdynamic assumptions.

  1. Modeling nanomaterial environmental fate in aquatic systems.

    PubMed

    Dale, Amy L; Casman, Elizabeth A; Lowry, Gregory V; Lead, Jamie R; Viparelli, Enrica; Baalousha, Mohammed

    2015-03-03

    Mathematical models improve our fundamental understanding of the environmental behavior, fate, and transport of engineered nanomaterials (NMs, chemical substances or materials roughly 1-100 nm in size) and facilitate risk assessment and management activities. Although today's large-scale environmental fate models for NMs are a considerable improvement over early efforts, a gap still remains between the experimental research performed to date on the environmental fate of NMs and its incorporation into models. This article provides an introduction to the current state of the science in modeling the fate and behavior of NMs in aquatic environments. We address the strengths and weaknesses of existing fate models, identify the challenges facing researchers in developing and validating these models, and offer a perspective on how these challenges can be addressed through the combined efforts of modelers and experimentalists.

  2. Quantum Accelerators for High-performance Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Britt, Keith A.; Mohiyaddin, Fahd A.

We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed of compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  3. Workflows and performances in the ranking prediction of 2016 D3R Grand Challenge 2: lessons learned from a collaborative effort.

    PubMed

    Gao, Ying-Duo; Hu, Yuan; Crespo, Alejandro; Wang, Deping; Armacost, Kira A; Fells, James I; Fradera, Xavier; Wang, Hongwu; Wang, Huijun; Sherborne, Brad; Verras, Andreas; Peng, Zhengwei

    2018-01-01

The 2016 D3R Grand Challenge 2 includes both pose and affinity or ranking predictions. This article focuses exclusively on the affinity predictions submitted to the D3R challenge from a collaborative effort of the modeling and informatics group. Our submissions include the ranking of 102 ligands covering 4 different chemotypes against the FXR ligand binding domain structure, and the relative binding affinity predictions of the two designated free energy subsets of 15 and 18 compounds. Preparing all the complex structures in the same way allowed us to cover many types of workflows and compare their performances effectively. We evaluated typical workflows used in our daily structure-based design modeling support, which include docking scores, force field-based scores, QM/MM, MMGBSA, MD-MMGBSA, and MacroModel interaction energy estimations. The best-performing methods for the two free energy subsets are discussed. Our results suggest that affinity ranking remains very challenging; that knowledge of more structural information does not necessarily yield more accurate predictions; and that visual inspection and human intervention remain important for ranking. Knowledge of the mode of action and protein flexibility, along with visualization tools that depict polar and hydrophobic maps, is very useful for visual inspection. QM/MM-based workflows were found to be powerful in affinity ranking and are encouraged to be applied more often. The standardized input and output enable systematic analysis and support methodology development and improvement for high-level blinded predictions.
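    A minimal sketch of how such an affinity-ranking submission can be scored, using Spearman and Kendall rank correlations between predicted and experimental affinities; the values below are invented, and the official D3R evaluation includes additional statistics.

    ```python
    # Rank-correlation scoring of a hypothetical affinity-ranking submission.
    from scipy.stats import spearmanr, kendalltau

    experimental = [7.2, 6.8, 8.1, 5.9, 7.7]   # e.g., experimental pIC50 values
    predicted    = [6.9, 6.5, 8.4, 6.2, 7.1]   # workflow-predicted affinities

    rho, _ = spearmanr(experimental, predicted)
    tau, _ = kendalltau(experimental, predicted)
    print(f"Spearman rho = {rho:.2f}, Kendall tau = {tau:.2f}")
    ```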

  4. Workflows and performances in the ranking prediction of 2016 D3R Grand Challenge 2: lessons learned from a collaborative effort

    NASA Astrophysics Data System (ADS)

    Gao, Ying-Duo; Hu, Yuan; Crespo, Alejandro; Wang, Deping; Armacost, Kira A.; Fells, James I.; Fradera, Xavier; Wang, Hongwu; Wang, Huijun; Sherborne, Brad; Verras, Andreas; Peng, Zhengwei

    2018-01-01

The 2016 D3R Grand Challenge 2 includes both pose and affinity or ranking predictions. This article focuses exclusively on the affinity predictions submitted to the D3R challenge from a collaborative effort of the modeling and informatics group. Our submissions include the ranking of 102 ligands covering 4 different chemotypes against the FXR ligand binding domain structure, and the relative binding affinity predictions of the two designated free energy subsets of 15 and 18 compounds. Preparing all the complex structures in the same way allowed us to cover many types of workflows and compare their performances effectively. We evaluated typical workflows used in our daily structure-based design modeling support, which include docking scores, force field-based scores, QM/MM, MMGBSA, MD-MMGBSA, and MacroModel interaction energy estimations. The best-performing methods for the two free energy subsets are discussed. Our results suggest that affinity ranking remains very challenging; that knowledge of more structural information does not necessarily yield more accurate predictions; and that visual inspection and human intervention remain important for ranking. Knowledge of the mode of action and protein flexibility, along with visualization tools that depict polar and hydrophobic maps, is very useful for visual inspection. QM/MM-based workflows were found to be powerful in affinity ranking and are encouraged to be applied more often. The standardized input and output enable systematic analysis and support methodology development and improvement for high-level blinded predictions.

  5. Hierarchical Bayesian spatial models for multispecies conservation planning and monitoring

    Treesearch

    Carlos Carroll; Devin S. Johnson; Jeffrey R. Dunk; William J. Zielinski

    2010-01-01

    Biologists who develop and apply habitat models are often familiar with the statistical challenges posed by their data’s spatial structure but are unsure of whether the use of complex spatial models will increase the utility of model results in planning. We compared the relative performance of nonspatial and hierarchical Bayesian spatial models for three vertebrate and...

  6. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov

The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. Highlights: • The complexity of physics-based modeling of light water reactor cores is being addressed. • Capabilities have been developed to help address problems that have challenged the nuclear power industry. • Simulation capabilities that take advantage of high performance computing have been developed.

  7. Tangible Landscape: Cognitively Grasping the Flow of Water

    NASA Astrophysics Data System (ADS)

    Harmon, B. A.; Petrasova, A.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2016-06-01

Complex spatial forms like topography can be challenging to understand, much less intentionally shape, given the heavy cognitive load of visualizing and manipulating 3D form. Spatiotemporal processes like the flow of water over a landscape are even more challenging to understand and intentionally direct, as they are dependent upon their context and require the simulation of forces like gravity and momentum. This cognitive work can be offloaded onto computers through 3D geospatial modeling, analysis, and simulation. Interacting with computers, however, can also be challenging, often requiring training and highly abstract thinking. Tangible computing, an emerging paradigm of human-computer interaction in which data is physically manifested so that users can feel it and directly manipulate it, aims to offload this added cognitive work onto the body. We have designed Tangible Landscape, a tangible interface powered by an open source geographic information system (GRASS GIS), so that users can naturally shape topography and interact with simulated processes with their hands in order to make observations, generate and test hypotheses, and make inferences about scientific phenomena in a rapid, iterative process. Conceptually, Tangible Landscape couples a malleable physical model with a digital model of a landscape through a continuous cycle of 3D scanning, geospatial modeling, and projection. We ran a flow modeling experiment to test whether tangible interfaces like this can effectively enhance spatial performance by offloading cognitive processes onto computers and our bodies. We used hydrological simulations and statistics to quantitatively assess spatial performance. We found that Tangible Landscape enhanced 3D spatial performance and helped users understand water flow.

  8. Meta-control of combustion performance with a data mining approach

    NASA Astrophysics Data System (ADS)

    Song, Zhe

Large-scale combustion processes are complex and pose challenges for performance optimization. Traditional approaches based on thermodynamics have limitations in finding optimal operational regions due to the time-shifting nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science which finds patterns or models in large data sets. It has found many successful applications in business marketing, medical, and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing combustion performance. However, the philosophy, methods and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process has two major challenges. One is that the underlying process model changes over time, and obtaining an accurate process model is nontrivial. The other is that a process model with high fidelity is usually highly nonlinear, so solving the optimization problem requires efficient heuristics. This dissertation sets out to solve these two major challenges. The major contribution of this four-year research effort is a data-driven solution for optimizing the combustion process, in which a process model or knowledge is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.

  9. Virtual reality in surgical training.

    PubMed

    Lange, T; Indelicato, D J; Rosen, J M

    2000-01-01

    Virtual reality in surgery and, more specifically, in surgical training, faces a number of challenges in the future. These challenges are building realistic models of the human body, creating interface tools to view, hear, touch, feel, and manipulate these human body models, and integrating virtual reality systems into medical education and treatment. A final system would encompass simulators specifically for surgery, performance machines, telemedicine, and telesurgery. Each of these areas will need significant improvement for virtual reality to impact medicine successfully in the next century. This article gives an overview of, and the challenges faced by, current systems in the fast-changing field of virtual reality technology, and provides a set of specific milestones for a truly realistic virtual human body.

  10. Modeling Multi-Bunch X-band Photoinjector Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsh, R A; Anderson, S G; Gibson, D J

An X-band test station is being developed at LLNL to investigate accelerator optimization for future upgrades to mono-energetic gamma-ray technology at LLNL. The test station will consist of a 5.5-cell X-band rf photoinjector, a single accelerator section, and beam diagnostics. Of critical import to the functioning of the LLNL X-band system with multiple electron bunches is the performance of the photoinjector. In-depth modeling of the Mark 1 LLNL/SLAC X-band rf photoinjector performance will be presented, addressing important challenges that must be met in order to fabricate a multi-bunch Mark 2 photoinjector. Emittance performance is evaluated under different nominal electron bunch parameters using electrostatic codes such as PARMELA. Wake potential is analyzed using electromagnetic time-domain simulations with the ACE3P code T3P. Plans for multi-bunch experiments and implementation of photoinjector advances for the Mark 2 design will also be discussed.

  11. A new model for low-dose food challenge in children with allergy to milk or egg.

    PubMed

    Devenney, Irene; Norrman, Gunilla; Oldaeus, Göran; Strömberg, Leif; Fälth-Magnusson, Karin

    2006-09-01

Atopic eczema and food allergy are common in early childhood. Children seem to gradually develop tolerance to milk and egg, and it is a relief for families when their child can tolerate small amounts of these basic foods, even if larger doses may still cause symptoms. The objective was to develop a model for low-dose oral food challenge, facilitating re-/introduction of milk or egg. In 39 children sensitized to milk and/or egg, we performed 52 challenges using a new standardized model for low-dose oral food challenge. The recipes were validated for blinding with sensory tests. Four children challenged with milk had a positive challenge outcome. There were no significant differences with respect to family history, associated atopic manifestations, nutritional supply, eczema severity, or skin-prick test compared with the non-reacting children, but total and specific IgE values were significantly higher. All but two of the non-reacting children were able to introduce milk and egg into their diet without problems. We report recipes and a protocol to be used for standardized open and double-blind placebo-controlled low-dose food challenges in young children, enabling the introduction of small amounts of egg and milk into the diet during tolerance development.

  12. Modeling the Effects of Turbulence in Rotating Detonation Engines

    NASA Astrophysics Data System (ADS)

    Towery, Colin; Smith, Katherine; Hamlington, Peter; van Schoor, Marthinus; TESLa Team; Midé Team

    2014-03-01

    Propulsion systems based on detonation waves, such as rotating and pulsed detonation engines, have the potential to substantially improve the efficiency and power density of gas turbine engines. Numerous technical challenges remain to be solved in such systems, however, including obtaining more efficient injection and mixing of air and fuels, more reliable detonation initiation, and better understanding of the flow in the ejection nozzle. These challenges can be addressed using numerical simulations. Such simulations are enormously challenging, however, since accurate descriptions of highly unsteady turbulent flow fields are required in the presence of combustion, shock waves, fluid-structure interactions, and other complex physical processes. In this study, we performed high-fidelity three dimensional simulations of a rotating detonation engine and examined turbulent flow effects on the operation, performance, and efficiency of the engine. Along with experimental data, these simulations were used to test the accuracy of commonly-used Reynolds averaged and subgrid-scale turbulence models when applied to detonation engines. The authors gratefully acknowledge the support of the Defense Advanced Research Projects Agency (DARPA).

  13. A dynamic model of stress and sustained attention

    NASA Technical Reports Server (NTRS)

    Hancock, P. A.; Warm, Joel S.

    1989-01-01

Arguments are presented that an integrated view of stress and performance must consider tasks demanding sustained attention as a primary source of cognitive stress. A dynamic model, based on the concept of adaptability in both physiological and psychological terms, is developed that addresses the effects of stress on vigilance and, potentially, a wide variety of attention-demanding performance tasks. The model provides insight into the failure of an operator under the driving influences of stress and opens a number of potential avenues through which solutions to the complex challenge of stress and performance might be posed.

  14. Model-based and Model-free Machine Learning Techniques for Diagnostic Prediction and Classification of Clinical Outcomes in Parkinson's Disease.

    PubMed

    Gao, Chao; Sun, Hanbo; Wang, Tuo; Tang, Ming; Bohnen, Nicolaas I; Müller, Martijn L T M; Herman, Talia; Giladi, Nir; Kalinin, Alexandr; Spino, Cathie; Dauer, William; Hausdorff, Jeffrey M; Dinov, Ivo D

    2018-05-08

In this study, we apply a multidisciplinary approach to investigate falls in Parkinson's disease (PD) patients using clinical, demographic and neuroimaging data from two independent initiatives (University of Michigan and Tel Aviv Sourasky Medical Center). Using machine learning techniques, we construct predictive models to discriminate fallers and non-fallers. Through controlled feature selection, we identified the most salient predictors of patient falls, including gait speed, Hoehn and Yahr stage, and postural instability and gait difficulty-related measurements. The model-based and model-free analytical methods we employed included logistic regression, random forests, support vector machines, and XGBoost. The reliability of the forecasts was assessed by internal statistical (5-fold) cross-validation as well as by external out-of-bag validation. Four specific challenges were addressed in the study: Challenge 1, develop a protocol for harmonizing and aggregating complex, multisource, and multi-site Parkinson's disease data; Challenge 2, identify salient predictive features associated with specific clinical traits, e.g., patient falls; Challenge 3, forecast patient falls and evaluate the classification performance; and Challenge 4, predict tremor dominance (TD) vs. postural instability and gait difficulty (PIGD). Our findings suggest that, compared to other approaches, model-free machine learning based techniques provide more reliable clinical outcome forecasting of falls in Parkinson's patients, for example, with a classification accuracy of about 70-80%.
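    A minimal sketch of the model-free forecasting workflow described above: a random forest faller/non-faller classifier assessed by 5-fold cross-validation with scikit-learn. The synthetic features are hypothetical stand-ins for gait speed, Hoehn and Yahr stage, and related predictors.

    ```python
    # Random forest fall prediction with 5-fold cross-validation on
    # synthetic stand-in features (not the study's clinical data).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 5))                    # 200 patients, 5 features
    y = (X[:, 0] + 0.5 * X[:, 1]                     # faller label driven by two
         + rng.normal(scale=0.5, size=200)) > 0      # features plus noise

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"5-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```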

  15. High-End Computing Challenges in Aerospace Design and Engineering

    NASA Technical Reports Server (NTRS)

    Bailey, F. Ronald

    2004-01-01

High-End Computing (HEC) has had a significant impact on aerospace design and engineering and is poised to make even more in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System, and Digital Astronaut. The paper discusses the modeling capabilities needed for each challenge and presents projections of near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and programming strategies necessary to achieve high performance in practice are presented.

  16. Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model

    DTIC Science & Technology

    2012-03-01

…challenging as the complexity of actual implementation specifics is considered. Two components common to most quantum key distribution … (AFIT/GCS/ENG/12-01)

  17. Data Sparsity Considerations in Climate Impact Analysis for the Water Sector (Invited)

    NASA Astrophysics Data System (ADS)

    Asante, K. O.; Khimsara, P.; Chan, A.

    2013-12-01

Scientists and planners are helping governments and communities around the world to prepare for climate change by performing local impact studies and developing adaptation plans. Most studies begin by analyzing global climate model outputs to estimate the magnitude of projected change, assessing vulnerabilities and proposing adaptation measures. In these studies, climate projections from the Intergovernmental Panel on Climate Change (IPCC) Data Distribution Centre (DDC) are either used directly or downscaled using regional models. Since climate projections cover the entire globe, climate change analysis can be performed for any location. However, selection of climate projections for use in historically data-sparse regions presents special challenges. Key questions arise about the impact of historical data sparsity on the quality of climate projections, the spatial consistency of results, and the suitability for applications such as water resource planning. In this paper, a water-sector climate study conducted in a data-rich setting in California is compared to a similar study conducted in a data-sparse setting in Mozambique. The challenges of selecting projections, performing analysis and interpreting the results for climate adaptation planning are compared to illustrate the decision process and challenges encountered in these two very different settings.

  18. Integrated Basin-Scale Modelling and Assessment: Lessons and Challenges in Linking Biophysical and Socioeconomic Sciences for Enhancing Sustainability Outcomes

    NASA Astrophysics Data System (ADS)

    Jakeman, A. J.; Croke, B. F.; Letcher, R. A.; Newham, L. T.; Norton, J. P.

    2004-12-01

    Integrated Assessment (IA) and Integrated Scenario Modelling (ISM) are being increasingly used to assess sustainability options and, in particular, the effects of policy changes, land use management, climate forcing and other uncontrollable drivers on a wide range of river basin outcomes. IA and ISM are processes that invoke the necessary range of biophysical and socioeconomic disciplines and embrace stakeholder involvement as an essential ingredient. The authors report on their IA studies in Australian and Asian river basins. They illustrate a range of modelling frameworks and tools that were used to perform the assessments, engage the relevant interest groups and promote systems understanding and social learning. The studies cover a range of issues and policies including poverty alleviation, industrial investments, infrastructure provision, erosion and sedimentation, water supply allocation, and ecological protection. The positive impacts of these studies are presented, as well as the lessons learnt and the challenges for modellers and disciplinary experts in advancing the reputation and performance of integrated assessment exercises.

  19. Research advances and challenges in one-dimensional modeling of secondary settling tanks--a critical review.

    PubMed

    Li, Ben; Stenstrom, M K

    2014-11-15

Sedimentation is one of the most important processes determining the performance of the activated sludge process (ASP), and secondary settling tanks (SSTs) have frequently been investigated with mathematical models for design and operation optimization. Nevertheless, their performance is often far from satisfactory. The starting point of this paper is a review of the development of settling theory, focusing on batch settling and the development of flux theory, since they played an important role in the early stage of SST investigation. The second part is an explicit review of the established 1-D SST models, including the relevant physical law, various settling behaviors (hindered, transient, and compression settling), the constitutive functions, and their advantages and disadvantages. The third part is a discussion of the numerical techniques required to solve the governing equation, which is usually a partial differential equation. Finally, the most important modeling challenges, such as describing settleability and understanding settling behavior, are presented.
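    A minimal sketch of the flux-theory building block reviewed above: a Vesilind-form hindered-settling velocity and the resulting batch gravity flux J = v(C)·C. Parameter values are illustrative, not calibrated to any plant.

    ```python
    # Vesilind hindered-settling velocity and batch solids flux.
    import numpy as np

    def vesilind_velocity(C, v0=6.0, k=0.4):
        """Settling velocity [m/h] at solids concentration C [kg/m^3]."""
        return v0 * np.exp(-k * C)

    def batch_flux(C):
        """Gravity solids flux J = v(C) * C [kg/(m^2 h)]."""
        return vesilind_velocity(C) * C

    for c in np.linspace(0.5, 12.0, 5):
        print(f"C = {c:5.2f} kg/m^3 -> J = {batch_flux(c):5.2f} kg/(m^2 h)")
    ```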

  20. Modeling sustainability in renewable energy supply chain systems

    NASA Astrophysics Data System (ADS)

    Xie, Fei

This dissertation aims at modeling the sustainability of renewable fuel supply chain systems against emerging challenges. In particular, it focuses on biofuel supply chain system design and develops advanced modeling frameworks and corresponding solution methods for tackling challenges in sustaining biofuel supply chain systems. These challenges include: (1) integrating "environmental thinking" into long-term biofuel supply chain planning; (2) adopting multimodal transportation to mitigate seasonality in biofuel supply chain operations; (3) providing strategies for hedging against uncertainty in conversion technology; and (4) developing methodologies for long-term sequential planning of the biofuel supply chain under uncertainties. All models are mixed-integer programs; the work also involves multi-objective programming and two-stage/multistage stochastic programming methods. In particular, for long-term sequential planning under uncertainties, to reduce the computational challenges due to the exponential expansion of the scenario tree, I also developed an efficient ND-Max method that is more efficient than CPLEX and the nested decomposition method. Through result analysis of four independent studies, it is found that the proposed modeling frameworks can effectively improve economic performance, enhance environmental benefits, and reduce risks due to system uncertainties for biofuel supply chain systems.
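    A minimal sketch of a two-stage stochastic mixed-integer program of the general kind described above, written with the PuLP library: binary first-stage siting decisions and second-stage production recourse over feedstock scenarios. All names and data are invented; the dissertation's models are far richer.

    ```python
    # Two-stage stochastic MIP: build refineries now (binary), choose
    # production after the feedstock scenario is revealed (recourse).
    import pulp

    sites = ["A", "B"]
    scenarios = {"wet": 0.5, "dry": 0.5}                 # scenario probabilities
    yield_cap = {("A", "wet"): 100, ("A", "dry"): 60,
                 ("B", "wet"): 80,  ("B", "dry"): 70}
    build_cost = {"A": 500, "B": 400}
    demand, unit_profit = 120, 10

    m = pulp.LpProblem("biofuel_two_stage", pulp.LpMaximize)
    build = pulp.LpVariable.dicts("build", sites, cat="Binary")
    prod = pulp.LpVariable.dicts(
        "prod", [(s, w) for s in sites for w in scenarios], lowBound=0)

    # Maximize expected second-stage profit minus first-stage investment
    m += (pulp.lpSum(p * unit_profit * prod[(s, w)]
                     for s in sites for w, p in scenarios.items())
          - pulp.lpSum(build_cost[s] * build[s] for s in sites))
    for s in sites:
        for w in scenarios:
            m += prod[(s, w)] <= yield_cap[(s, w)] * build[s]   # capacity if built
    for w in scenarios:
        m += pulp.lpSum(prod[(s, w)] for s in sites) <= demand  # market limit

    m.solve(pulp.PULP_CBC_CMD(msg=False))
    print({s: build[s].value() for s in sites}, pulp.value(m.objective))
    ```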

  1. Large-watershed flood simulation and forecasting based on different-resolution distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Li, J.

    2017-12-01

Large-watershed flood simulation and forecasting is very important for the application of a distributed hydrological model. There are several challenges, including the model's spatial resolution effect and model performance and accuracy. To cope with the spatial resolution effect, different model resolutions (1000 m × 1000 m, 600 m × 600 m, 500 m × 500 m, 400 m × 400 m, and 200 m × 200 m) were used to build the distributed hydrological model, the Liuxihe model. The purpose is to find the best resolution for the Liuxihe model in large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. Terrain data, including the digital elevation model (DEM), soil type, and land use type, are freely downloaded from the web. The model parameters are optimized by using an improved Particle Swarm Optimization (PSO) algorithm; parameter optimization reduces the uncertainty that exists when model parameters are derived physically. Model resolutions from 200 m × 200 m to 1000 m × 1000 m are used for modeling the Liujiang River basin flood with the Liuxihe model in this study. The best spatial resolution for flood simulation and forecasting is 200 m × 200 m, and as the spatial resolution coarsens, model performance and accuracy worsen. When the model resolution is 1000 m × 1000 m, the flood simulation and forecasting result is the worst, and the river channel delineated at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum model spatial resolution is needed. The suggested threshold spatial resolution for modeling the Liujiang River basin flood is a 500 m × 500 m grid cell, but a 200 m × 200 m grid cell is recommended in this study to keep the model at its best performance.
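    A minimal sketch of particle swarm optimization of the kind used for parameter calibration above: particles move under inertia plus attraction to their personal bests and the global best. The quadratic objective is a stand-in for the model's flood-simulation error, and all settings are illustrative.

    ```python
    # Basic PSO over a unit-box parameter space; the objective is a toy
    # stand-in for, e.g., RMSE between simulated and observed discharge.
    import numpy as np

    rng = np.random.default_rng(1)

    def objective(theta):
        return np.sum((theta - 0.3) ** 2)

    def pso(dim=4, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        x = rng.uniform(0, 1, (n_particles, dim))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, 0, 1)                 # keep parameters in bounds
            f = np.apply_along_axis(objective, 1, x)
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[pbest_f.argmin()].copy()
        return gbest, pbest_f.min()

    best_theta, best_f = pso()
    print(best_theta, best_f)   # converges toward theta = [0.3, 0.3, 0.3, 0.3]
    ```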

  2. A Hybrid Secure Scheme for Wireless Sensor Networks against Timing Attacks Using Continuous-Time Markov Chain and Queueing Model.

    PubMed

    Meng, Tianhui; Li, Xiaofan; Zhang, Sha; Zhao, Yubin

    2016-09-28

Wireless sensor networks (WSNs) have recently gained popularity for a wide spectrum of applications. Monitoring tasks can be performed in various environments. This may be beneficial in many scenarios, but it introduces new challenges in terms of security due to increased data transmission over the wireless channel with potentially unknown threats. Among possible security issues are timing attacks, which are not prevented by traditional cryptographic security. Moreover, the limited energy and memory resources prohibit the use of complex security mechanisms in such systems. Therefore, balancing security against the associated energy consumption becomes a crucial challenge. This paper proposes a secure scheme for WSNs while maintaining the requirement of the security-performance tradeoff. In order to proceed to a quantitative treatment of this problem, a hybrid continuous-time Markov chain (CTMC) and queueing model is put forward, and a tradeoff analysis of the security and performance attributes is carried out. By extending and transforming this model, the mean time to security attributes failure is evaluated. Through the tradeoff analysis, we show that our scheme can enhance the security of WSNs, and that the optimal rekeying rate for the performance-security tradeoff can be obtained.
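    A minimal sketch of one quantitative step such a CTMC analysis involves: treating the compromised state as absorbing, the mean times to security failure from the transient states solve the linear system Q_t · t = -1. The two-state generator below is an invented example, not the paper's model.

    ```python
    # Mean time to absorption (security failure) in a small CTMC.
    # Transient states: 0 = healthy, 1 = under timing attack.
    import numpy as np

    Q_t = np.array([[-0.5, 0.5],    # healthy -> attacked at 0.5/h
                    [ 0.2, -1.0]])  # attacked -> healthy at 0.2/h;
                                    # -> compromised at 0.8/h (row-sum deficit)

    mean_times = np.linalg.solve(Q_t, -np.ones(2))
    print(f"Mean time to failure from healthy: {mean_times[0]:.2f} h")   # 3.75 h
    ```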

  3. A Hybrid Secure Scheme for Wireless Sensor Networks against Timing Attacks Using Continuous-Time Markov Chain and Queueing Model

    PubMed Central

    Meng, Tianhui; Li, Xiaofan; Zhang, Sha; Zhao, Yubin

    2016-01-01

Wireless sensor networks (WSNs) have recently gained popularity for a wide spectrum of applications. Monitoring tasks can be performed in various environments. This may be beneficial in many scenarios, but it introduces new challenges in terms of security due to increased data transmission over the wireless channel with potentially unknown threats. Among possible security issues are timing attacks, which are not prevented by traditional cryptographic security. Moreover, the limited energy and memory resources prohibit the use of complex security mechanisms in such systems. Therefore, balancing security against the associated energy consumption becomes a crucial challenge. This paper proposes a secure scheme for WSNs while maintaining the requirement of the security-performance tradeoff. In order to proceed to a quantitative treatment of this problem, a hybrid continuous-time Markov chain (CTMC) and queueing model is put forward, and a tradeoff analysis of the security and performance attributes is carried out. By extending and transforming this model, the mean time to security attributes failure is evaluated. Through the tradeoff analysis, we show that our scheme can enhance the security of WSNs, and that the optimal rekeying rate for the performance-security tradeoff can be obtained. PMID:27690042

  4. Weather Research and Forecasting Model Wind Sensitivity Study at Edwards Air Force Base, CA

    NASA Technical Reports Server (NTRS)

    Watson, Leela R.; Bauman, William H., III

    2008-01-01

NASA prefers to land the space shuttle at Kennedy Space Center (KSC). When weather conditions violate Flight Rules at KSC, NASA will usually divert the shuttle landing to Edwards Air Force Base (EAFB) in Southern California. But forecasting surface winds at EAFB is a challenge for the Spaceflight Meteorology Group (SMG) forecasters due to the complex terrain that surrounds EAFB. One particular phenomenon identified by SMG that makes it difficult to forecast the EAFB surface winds is called "wind cycling". This occurs when wind speeds and directions oscillate among towers near the EAFB runway, leading to a challenging deorbit burn forecast for shuttle landings. The large-scale numerical weather prediction models cannot properly resolve the wind field due to their coarse horizontal resolutions, so a properly tuned high-resolution mesoscale model is needed. The Weather Research and Forecasting (WRF) model meets this requirement. The AMU assessed the different WRF model options to determine which configuration best predicted surface wind speed and direction at EAFB. To do so, the AMU compared the WRF model performance using two hot-start initializations with the Advanced Research WRF and Non-hydrostatic Mesoscale Model dynamical cores and compared model performance while varying the physics options.

  5. Modeling and experimental performance of an intermediate temperature reversible solid oxide cell for high-efficiency, distributed-scale electrical energy storage

    NASA Astrophysics Data System (ADS)

    Wendel, Christopher H.; Gao, Zhan; Barnett, Scott A.; Braun, Robert J.

    2015-06-01

    Electrical energy storage is expected to be a critical component of the future world energy system, performing load-leveling operations to enable increased penetration of renewable and distributed generation. Reversible solid oxide cells, operating sequentially between power-producing fuel cell mode and fuel-producing electrolysis mode, have the capability to provide highly efficient, scalable electricity storage. However, challenges ranging from cell performance and durability to system integration must be addressed before widespread adoption. One central challenge of the system design is establishing effective thermal management in the two distinct operating modes. This work leverages an operating strategy to use carbonaceous reactant species and operate at intermediate stack temperature (650 °C) to promote exothermic fuel-synthesis reactions that thermally self-sustain the electrolysis process. We present performance of a doped lanthanum-gallate (LSGM) electrolyte solid oxide cell that shows high efficiency in both operating modes at 650 °C. A physically based electrochemical model is calibrated to represent the cell performance and used to simulate roundtrip operation for conditions unique to these reversible systems. Design decisions related to system operation are evaluated using the cell model including current density, fuel and oxidant reactant compositions, and flow configuration. The analysis reveals tradeoffs between electrical efficiency, thermal management, energy density, and durability.

  6. The computational challenges of Earth-system science.

    PubMed

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  7. New perspectives to the enterotoxigenic E. coli F4 porcine infection model: Susceptibility genotypes in relation to performance, diarrhoea and bacterial shedding.

    PubMed

    Roubos-van den Hil, Petra J; Litjens, Ralph; Oudshoorn, Anna-Katharina; Resink, Jan Willem; Smits, Coen H M

    2017-04-01

    Enterotoxigenic E. coli (ETEC), causing post-weaning diarrhoea, is a major problem in weaned piglets. Individual animal responses to ETEC infection show high variability in animal experiments. Two studies were designed to optimize the ETEC F4ac infection model in piglets by combining the genotype susceptibility with performance, diarrhoea incidence and bacterial shedding. The studies were performed with respectively 120 and 80 male piglets that were tested for susceptibility or resistance towards ETEC O149:F4ac by a DNA marker based test. Three different genotypes were observed: resistant (RR), susceptible heterozygote (RS) and susceptible homozygote (SS). Piglets were orally infected with an inoculum suspension (containing 1.5E8 CFU/ml ETEC F4ac) at day 0, 1 and 2 of the study. Performance, diarrhoea incidence and bacterial shedding were followed for 21 days. In the first week after challenge a difference in average daily gain was observed between resistant and susceptible piglets in both studies. For the complete study period no significant differences were observed. Diarrhoea incidence was significantly higher in susceptible pigs compared to the resistant pigs in the first week after challenge. Bacterial shedding was much higher in the susceptible pigs and ETEC excretion lasted longer. ETEC was hardly detected in the faecal material of the resistant pigs. In conclusion, susceptible pigs showed higher diarrhoea incidence and higher numbers of faecal ETEC shedding in the first week after challenge compared to resistant pigs. The DNA marker based test can be used to select pigs that are susceptible to ETEC for inclusion in the ETEC infection model, resulting in fewer animals needed to perform infection studies. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Evaluation of the Service Review Model with Performance Scorecards

    ERIC Educational Resources Information Center

    Szabo, Thomas G.; Williams, W. Larry; Rafacz, Sharlet D.; Newsome, William; Lydon, Christina A.

    2012-01-01

    The current study combined a management technique termed "Service Review" with performance scorecards to enhance staff and consumer behavior in a human service setting consisting of 11 supervisors and 56 front-line staff working with 9 adult consumers with challenging behaviors. Results of our intervention showed that service review and…

  9. The Efficacy of Multidimensional Constraint Keys in Database Query Performance

    ERIC Educational Resources Information Center

    Cardwell, Leslie K.

    2012-01-01

    This work is intended to introduce a database design method to resolve the two-dimensional complexities inherent in the relational data model and its resulting performance challenges through abstract multidimensional constructs. A multidimensional constraint is derived and utilized to implement an indexed Multidimensional Key (MK) to abstract a…

  10. Ensemble method for dengue prediction.

    PubMed

    Buczak, Anna L; Baugher, Benjamin; Moniz, Linda J; Bagley, Thomas; Babin, Steven M; Guven, Erhan

    2018-01-01

    In the 2015 NOAA Dengue Challenge, participants made three dengue target predictions for two locations (Iquitos, Peru, and San Juan, Puerto Rico) during four dengue seasons: 1) peak height (i.e., maximum weekly number of cases during a transmission season); 2) peak week (i.e., week in which the maximum weekly number of cases occurred); and 3) total number of cases reported during a transmission season. A dengue transmission season is the 12-month period commencing with the location-specific, historical week with the lowest number of cases. At the beginning of the Dengue Challenge, participants were provided with the same input data for developing the models, with the prediction testing data provided at a later date. Our approach used ensemble models created by combining three disparate types of component models: 1) two-dimensional Method of Analogues models incorporating both dengue and climate data; 2) additive seasonal Holt-Winters models with and without wavelet smoothing; and 3) simple historical models. Of the individual component models created, those with the best performance on the prior four years of data were incorporated into the ensemble models. There were separate ensembles for predicting each of the three targets at each of the two locations. Our ensemble models scored higher for peak height and total dengue case counts reported in a transmission season for Iquitos than all other models submitted to the Dengue Challenge. However, the ensemble models did not do nearly as well when predicting the peak week. The Dengue Challenge organizers scored the dengue predictions of the Challenge participant groups. Our ensemble approach was the best in predicting the total number of dengue cases reported for a transmission season and peak height for Iquitos, Peru.
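
    A minimal sketch of one of the component-model families named above, an additive seasonal Holt-Winters fit; the synthetic weekly case series, the 52-week seasonality, and the use of statsmodels are assumptions for illustration, not the authors' implementation.

        # Fit an additive seasonal Holt-Winters model to a weekly dengue-like
        # series and derive the three Challenge targets from its forecast.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        rng = np.random.default_rng(0)
        weeks = pd.date_range("2008-01-06", periods=52 * 6, freq="W")
        cases = (50 + 40 * np.sin(2 * np.pi * np.arange(len(weeks)) / 52)
                 + rng.poisson(5, len(weeks))).astype(float)
        series = pd.Series(cases, index=weeks)

        fit = ExponentialSmoothing(series, trend="add", seasonal="add",
                                   seasonal_periods=52).fit()
        season = fit.forecast(52)              # one transmission season ahead
        print("peak height:", round(season.max(), 1),
              "| peak week:", season.idxmax().date(),
              "| season total:", round(season.sum(), 1))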

  11. Ensemble method for dengue prediction

    PubMed Central

    Baugher, Benjamin; Moniz, Linda J.; Bagley, Thomas; Babin, Steven M.; Guven, Erhan

    2018-01-01

    Background In the 2015 NOAA Dengue Challenge, participants made three dengue target predictions for two locations (Iquitos, Peru, and San Juan, Puerto Rico) during four dengue seasons: 1) peak height (i.e., maximum weekly number of cases during a transmission season); 2) peak week (i.e., week in which the maximum weekly number of cases occurred); and 3) total number of cases reported during a transmission season. A dengue transmission season is the 12-month period commencing with the location-specific, historical week with the lowest number of cases. At the beginning of the Dengue Challenge, participants were provided with the same input data for developing the models, with the prediction testing data provided at a later date. Methods Our approach used ensemble models created by combining three disparate types of component models: 1) two-dimensional Method of Analogues models incorporating both dengue and climate data; 2) additive seasonal Holt-Winters models with and without wavelet smoothing; and 3) simple historical models. Of the individual component models created, those with the best performance on the prior four years of data were incorporated into the ensemble models. There were separate ensembles for predicting each of the three targets at each of the two locations. Principal findings Our ensemble models scored higher for peak height and total dengue case counts reported in a transmission season for Iquitos than all other models submitted to the Dengue Challenge. However, the ensemble models did not do nearly as well when predicting the peak week. Conclusions The Dengue Challenge organizers scored the dengue predictions of the Challenge participant groups. Our ensemble approach was the best in predicting the total number of dengue cases reported for a transmission season and peak height for Iquitos, Peru. PMID:29298320

  12. Light-weight Parallel Python Tools for Earth System Modeling Workflows

    NASA Astrophysics Data System (ADS)

    Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.

    2015-12-01

    With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x, to an expected 25 PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) transform the data from time-slice to time-series format and (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight and easy to install, have very few dependencies, and can be inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
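
    A minimal sketch of the first task, converting time-slice history files into per-variable time-series files; the file pattern, variable names, and use of multiprocessing (rather than the authors' actual parallel tooling) are assumptions, and the netCDF4 package is required.

        # Convert time-slice model output (one file per time step, all
        # variables) to time-series format (one file per variable).
        import glob
        from multiprocessing import Pool
        from netCDF4 import Dataset, MFDataset

        SLICE_FILES = sorted(glob.glob("history.*.nc"))   # hypothetical inputs
        VARIABLES = ["TS", "PRECT"]                       # hypothetical variables

        def extract_series(var):
            """Gather one variable across all slice files into a single file."""
            src = MFDataset(SLICE_FILES)                  # concatenate along time
            with Dataset(f"{var}.timeseries.nc", "w") as dst:
                for name, dim in src.dimensions.items():
                    dst.createDimension(name, None if dim.isunlimited() else len(dim))
                v = src.variables[var]
                out = dst.createVariable(var, v.dtype, v.dimensions)
                out[:] = v[:]
            src.close()
            return var

        if __name__ == "__main__":
            with Pool(len(VARIABLES)) as pool:            # one worker per variable
                print("wrote:", pool.map(extract_series, VARIABLES))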

  13. Towards General Models of Effective Science Inquiry in Virtual Performance Assessments

    ERIC Educational Resources Information Center

    Baker, R. S.; Clarke-Midura, J.; Ocumpaugh, J.

    2016-01-01

    Recent interest in online assessment of scientific inquiry has led to several new online systems that attempt to assess these skills, but producing models that detect when students are successfully practising these skills can be challenging. In this paper, we study models that assess student inquiry in an immersive virtual environment, where a…

  14. Learning Strategies Model to Enhance Thai Undergraduate Students' Self-Efficacy Beliefs in EIL Textual Reading Performance

    ERIC Educational Resources Information Center

    Kakew, Jiraporn; Damnet, Anamai

    2017-01-01

    This classroom based research of a learning strategies model was designed to investigate its application in a mixed-ability classroom. The study built on Oxford's language learning strategies model (1990, 2001) and fulfilled it with rhetorical strategies to accommodate challenges encountered in the paradigm of English as an international language…

  15. Modeling Pilot State in Next Generation Aircraft Alert Systems

    NASA Technical Reports Server (NTRS)

    Carlin, Alan S.; Alexander, Amy L.; Schurr, Nathan

    2011-01-01

    The Next Generation Air Transportation System will introduce new, advanced sensor technologies into the cockpit that must convey a large number of potentially complex alerts. Our work focuses on the challenges associated with prioritizing aircraft sensor alerts in a quick and efficient manner, essentially determining when and how to alert the pilot. This "alert decision" becomes very difficult in NextGen due to the following challenges: 1) the increasing number of potential hazards, 2) the uncertainty associated with the state of potential hazards as well as pilot state, and 3) the limited time to make safety-critical decisions. In this paper, we focus on pilot state and present a model for anticipating duration and quality of pilot behavior, for use in a larger system which issues aircraft alerts. We estimate pilot workload, which we model as being dependent on factors including mental effort, task demands, and task performance. We perform a mathematically rigorous analysis of the model and resulting alerting plans. We simulate the model in software and present simulated results with respect to manipulation of the pilot measures.

  16. Applying sport psychology to improve clinical performance.

    PubMed

    Church, Helen R; Rumbold, James L; Sandars, John

    2017-12-01

    Preparedness for practice has become an international theme within Medical Education: for healthcare systems to maintain their highest clinical standards, junior doctors must "hit the ground running" on beginning work. Despite demonstrating logical, structured assessment and management plans during their undergraduate examinations, many newly qualified doctors report difficulty in translating this theoretical knowledge into the real clinical environment. "Preparedness" must constitute more than the knowledge and skills acquired during medical school. Complexities of the clinical environment overwhelm some junior doctors, who acknowledge that they lack strategies to manage their anxieties, under-confidence and low self-efficacy. If uncontrolled, such negative emotions and behaviors may impede the delivery of time-critical treatment for acutely unwell patients and compound junior doctors' self-doubt, thus impacting future patient encounters. Medical Education often seeks inspiration from other industries for potential solutions to challenges. To address "preparedness for practice," this AMEE Guide highlights sport psychology: elite sportspeople train both physically and psychologically for their discipline. The latter promotes management of negative emotions, distractions and under-confidence, thus optimizing performance despite immense pressures of career-defining moments. Similar techniques might allow junior doctors to optimize patient care, especially within stressful situations. This AMEE Guide introduces the novel conceptual model, PERFORM, which targets the challenges faced by junior doctors on graduation. The model applies pre-performance routines from sport psychology with the self-regulatory processes of metacognition to the clinical context. This model could potentially equip junior doctors, and other healthcare professionals facing similar challenges, with strategies to optimize clinical care under the most difficult circumstances.

  17. An anti-nicotinic cognitive challenge model using mecamylamine in comparison with the anti-muscarinic cognitive challenge using scopolamine.

    PubMed

    Baakman, Anne Catrien; Alvarez-Jimenez, Ricardo; Rissmann, Robert; Klaassen, Erica S; Stevens, Jasper; Goulooze, Sebastiaan C; den Burger, Jeroen C G; Swart, Eleonora L; van Gerven, Joop M A; Groeneveld, Geert Jan

    2017-08-01

    The muscarinic acetylcholine receptor antagonist scopolamine is often used for proof-of-pharmacology studies with pro-cognitive compounds. From a pharmacological point of view, it would seem more rational to use a nicotinic rather than a muscarinic anticholinergic challenge to prove pharmacology of a nicotinic acetylcholine receptor agonist. This study aims to characterize a nicotinic anticholinergic challenge model using mecamylamine and to compare it to the scopolamine model. In this double-blind, placebo-controlled, four-way cross-over trial, 12 healthy male subjects received oral mecamylamine 10 and 20 mg, intravenous scopolamine 0.5 mg and placebo. Pharmacokinetics were analysed using non-compartmental analysis. Pharmacodynamic effects were measured with a multidimensional test battery that includes neurophysiological, subjective, (visuo)motor and cognitive measurements. All treatments were safe and well tolerated. Mecamylamine had a t_max of 2.5 h and a C_max of 64.5 ng/ml for the 20 mg dose. Mecamylamine had a dose-dependent effect, decreasing adaptive tracking performance and VAS alertness, and increasing finger tapping and visual verbal learning task performance time and errors. Scopolamine significantly affected almost all pharmacodynamic tests. This study demonstrated that mecamylamine causes a nicotinic receptor-specific temporary decline in cognitive functioning. Compared with the scopolamine model, pharmacodynamic effects were less pronounced at the dose levels tested; however, mecamylamine caused less sedation. The cognitive effects of scopolamine might at least partly be caused by sedation. Whether the mecamylamine model can be used for proof-of-pharmacology of nicotinic acetylcholine receptor agonists remains to be established. © 2017 The British Pharmacological Society.
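
    The pharmacokinetic summary statistics above come from non-compartmental analysis; the sketch below computes Cmax, tmax, and a trapezoidal AUC from an invented concentration-time profile, not data from the study.

        # Non-compartmental PK summary: Cmax, tmax, and AUC(0-t) by the
        # trapezoidal rule, on made-up concentration-time points.
        import numpy as np

        t = np.array([0.0, 0.5, 1.0, 2.0, 2.5, 4.0, 8.0, 12.0])       # h
        c = np.array([0.0, 18.0, 35.0, 58.0, 64.5, 51.0, 22.0, 9.0])  # ng/ml

        cmax, tmax = c.max(), t[c.argmax()]
        auc = ((c[1:] + c[:-1]) / 2 * np.diff(t)).sum()   # ng*h/ml over 0-12 h
        print(f"Cmax = {cmax} ng/ml at tmax = {tmax} h; AUC(0-12) = {auc:.1f} ng*h/ml")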

  18. Advances and perspectives in in vitro human gut fermentation modeling.

    PubMed

    Payne, Amanda N; Zihler, Annina; Chassard, Christophe; Lacroix, Christophe

    2012-01-01

    The gut microbiota is a highly specialized organ containing host-specific assemblages of microbes whereby metabolic activity directly impacts human health and disease. In vitro gut fermentation models present an unmatched opportunity of performing studies frequently challenged in humans and animals owing to ethical concerns. Multidisciplinary systems biology analyses supported by '-omics' platforms remain widely neglected in the field of in vitro gut fermentation modeling but are key to advancing the significance of these models. Model-driven experimentation using a combination of in vitro gut fermentation and in vitro human cell models represent an advanced approach in identifying complex host-microbe interactions and niches central to gut fermentation processes. The aim of this review is to highlight the advances and challenges exhibited by in vitro human gut fermentation modeling. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Development of Systems Engineering Maturity Models and Management Tools

    DTIC Science & Technology

    2011-01-21

    …methods, processes and tools (MPT) for effectively and efficiently addressing these challenges are likewise being challenged. The goal of this research was to develop a…

  20. Resolving model parameter values from carbon and nitrogen stock measurements in a wide range of tropical mature forests using nonlinear inversion and regression trees

    Treesearch

    Shuguang Liu; Pamela Anderson; Guoyi Zhou; Boone Kauffman; Flint Hughes; David Schimel; Vicente Watson; Joseph Tosi

    2008-01-01

    Objectively assessing the performance of a model and deriving model parameter values from observations are critical and challenging in landscape to regional modeling. In this paper, we applied a nonlinear inversion technique to calibrate the ecosystem model CENTURY against carbon (C) and nitrogen (N) stock measurements collected from 39 mature tropical forest sites in...

  1. Modelling Human Emotions for Tactical Decision-Making Games

    ERIC Educational Resources Information Center

    Visschedijk, Gillian C.; Lazonder, Ard W.; van der Hulst, Anja; Vink, Nathalie; Leemkuil, Henny

    2013-01-01

    The training of tactical decision making increasingly occurs through serious computer games. A challenging aspect of designing such games is the modelling of human emotions. Two studies were performed to investigate the relation between fidelity and human emotion recognition in virtual human characters. Study 1 compared five versions of a virtual…

  2. Obtaining Predictions from Models Fit to Multiply Imputed Data

    ERIC Educational Resources Information Center

    Miles, Andrew

    2016-01-01

    Obtaining predictions from regression models fit to multiply imputed data can be challenging because treatments of multiple imputation seldom give clear guidance on how predictions can be calculated, and because available software often does not have built-in routines for performing the necessary calculations. This research note reviews how…
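
    One commonly described recipe for this problem is to fit the model on each imputed dataset, predict from each fit, average the point predictions, and combine variances with Rubin's rules; the sketch below illustrates this with synthetic data and an OLS model in statsmodels, and is not drawn from the (truncated) note itself.

        # Pool predictions across m imputed datasets: average the point
        # predictions; combine within- and between-imputation variance.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        imputations = []                       # stand-ins for m imputed datasets
        for _ in range(5):
            x = rng.normal(size=200)
            y = 2.0 * x + rng.normal(size=200)
            imputations.append((sm.add_constant(x), y))

        x_new = sm.add_constant(np.array([-1.0, 0.0, 1.0]))
        preds, variances = [], []
        for X, y in imputations:
            pf = sm.OLS(y, X).fit().get_prediction(x_new)
            preds.append(pf.predicted_mean)
            variances.append(pf.var_pred_mean)

        preds, variances = np.array(preds), np.array(variances)
        pooled = preds.mean(axis=0)                        # point prediction
        within = variances.mean(axis=0)                    # mean within-imputation var
        between = preds.var(axis=0, ddof=1)                # between-imputation var
        se = np.sqrt(within + (1 + 1 / len(preds)) * between)   # Rubin's rules
        print("pooled predictions:", pooled.round(2), "SE:", se.round(3))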

  3. Reactivity Insertion Accident (RIA) Capability Status in the BISON Fuel Performance Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Richard L.; Folsom, Charles Pearson; Pastore, Giovanni

    2016-05-01

    One of the Challenge Problems being considered within CASL relates to modelling and simulation of Light Water Reactor (LWR) fuel under Reactivity Insertion Accident (RIA) conditions. BISON is the fuel performance code used within CASL for LWR fuel under both normal operating and accident conditions, and thus must be capable of addressing the RIA challenge problem. This report outlines required BISON capabilities for RIAs and describes the current status of the code. Information on recent accident capability enhancements, application of BISON to a RIA benchmark exercise, and plans for validation against RIA behavior are included.

  4. Inadequate performance measures affecting practices, organizations and outcomes of Ontario's family health teams.

    PubMed

    Ashcroft, Rachelle

    2014-01-01

    Emphasis on quantity as the main performance measure may be posing challenges for Family Health Team (FHT) practices and organizational structures. This study asked: What healthcare practices and organizational structures are encouraged by the FHT model? An exploratory qualitative design guided by discourse analysis was used. This paper presents findings from in-depth semi-structured interviews conducted with seven policy informants and 29 FHT leaders. Participants report that performance measures value quantity and are not inclusive of the broad scope of attributes that comprise primary healthcare. Performance measures do not appear to be accurately capturing the demand for healthcare services, or the actual amount of services being provided by FHTs. Results suggest that unintended consequences of performance measures may be posing challenges to access and health outcomes. It is recommended that performance measures be developed and used to measure, support and encourage FHTs to achieve the goals of PHC. Copyright © 2014 Longwoods Publishing.

  5. Vehicle Lightweighting: Challenges and Opportunities with Aluminum

    NASA Astrophysics Data System (ADS)

    Sachdev, Anil K.; Mishra, Raja K.; Mahato, Anirban; Alpas, Ahmet

    Rising energy costs, consumer preferences and regulations drive requirements for fuel economy, performance, comfort, safety and cost of future automobiles. These conflicting situations offer challenges for vehicle lightweighting, for which aluminum applications are key. This paper describes product design needs and materials and process development opportunities driven by theoretical, experimental and modeling tools in the area of sheet and castings. Computational tools and novel experimental techniques used in their development are described. The paper concludes with challenges that lie ahead for pervasive use of aluminum and the necessary fundamental R&D that is still needed.

  6. Enhancing Team Performance for Long-Duration Space Missions

    NASA Technical Reports Server (NTRS)

    Orasanu, Judith M.

    2009-01-01

    Success of exploration missions will depend on skilled performance by a distributed team that includes both the astronauts in space and Mission Control personnel. Coordinated and collaborative teamwork will be required to cope with challenging complex problems in a hostile environment. While thorough preflight training and procedures will equip crews to address technical problems that can be anticipated, preparing them to solve novel problems is much more challenging. This presentation will review components of effective team performance, challenges to effective teamwork, and strategies for ensuring effective team performance. Teamwork skills essential for successful team performance include the behaviors involved in developing shared mental models, team situation awareness, collaborative decision making, adaptive coordination behaviors, effective team communication, and team cohesion. Challenges to teamwork include both chronic and acute stressors. Chronic stressors are associated with the isolated and confined environment and include monotony, noise, temperatures, weightlessness, poor sleep and circadian disruptions. Acute stressors include high workload, time pressure, imminent danger, and specific task-related stressors. Of particular concern are social and organizational stressors that can disrupt individual resilience and effective mission performance. Effective team performance can be developed by training teamwork skills, techniques for coping with team conflict, intracrew and intercrew communication, and working in a multicultural team; leadership and teamwork skills can be fostered through outdoor survival training exercises. The presentation will conclude with an evaluation of the special requirements associated with preparing crews to function autonomously in long-duration missions.

  7. The challenges of implementing advanced access for residents in family medicine in Quebec. Do promising strategies exist?

    PubMed

    Abou Malham, Sabina; Touati, Nassera; Maillet, Lara; Breton, Mylaine

    2018-12-01

    The advanced access (AA) model is a highly recommended innovation to improve timely access to primary healthcare. Although many studies have shown positive impacts for healthcare professionals and for patients, implementing this model in clinics with a teaching mission for family medicine residents poses specific challenges. To identify these challenges within these clinics, as well as potential strategies to address them. The authors adopted a qualitative multiple case study design, collected data in 2016 using semi-structured interviews (N = 40) with healthcare professionals and clerical staff in four family medicine units in Quebec, and performed a thematic analysis. They validated results through a discussion workshop involving many family physicians and residents practicing in different regions. Five challenges emerged from the data: 1) choosing and organizing residents' patient panels; 2) managing and balancing residents' appointment schedules; 3) balancing timely access with relational continuity; 4) understanding the AA model; and 5) establishing collaborative practices with other health professionals. Several promising strategies were suggested to address these challenges, including clearly defining residents' patient panels; adopting a team-based care approach; incorporating the model into academic curriculum and clinical training; proactive and ongoing education of health professionals, residents, and patients; and involving residents in the change process and in adjustment strategies. To meet the challenges of implementing AA, decision-makers should consider exposing residents to AA during academic training and clinical internships, involving them in team work on arrival, and engaging them as key actors in the implementation and in intra- and inter-professional collaborative models.

  8. The challenges of implementing advanced access for residents in family medicine in Quebec. Do promising strategies exist?

    PubMed Central

    Abou Malham, Sabina; Touati, Nassera; Maillet, Lara; Breton, Mylaine

    2018-01-01

    ABSTRACT Background: The advanced access (AA) model is a highly recommended innovation to improve timely access to primary healthcare. Although many studies have shown positive impacts for healthcare professionals and for patients, implementing this model in clinics with a teaching mission for family medicine residents poses specific challenges. Objective: To identify these challenges within these clinics, as well as potential strategies to address them. Design: The authors adopted a qualitative multiple case study design, collected data in 2016 using semi-structured interviews (N = 40) with healthcare professionals and clerical staff in four family medicine units in Quebec, and performed a thematic analysis. They validated results through a discussion workshop involving many family physicians and residents practicing in different regions. Results: Five challenges emerged from the data: 1) choosing and organizing residents’ patient panels; 2) managing and balancing residents’ appointment schedules; 3) balancing timely access with relational continuity; 4) understanding the AA model; and 5) establishing collaborative practices with other health professionals. Several promising strategies were suggested to address these challenges, including clearly defining residents’ patient panels; adopting a team-based care approach; incorporating the model into academic curriculum and clinical training; proactive and ongoing education of health professionals, residents, and patients; and involving residents in the change process and in adjustment strategies. Conclusions: To meet the challenges of implementing AA, decision-makers should consider exposing residents to AA during academic training and clinical internships, involving them in team work on arrival, and engaging them as key actors in the implementation and in intra- and inter-professional collaborative models. PMID:29464984

  9. "...Something Shining, Like Gold--but Better." The National Indian Youth Leadership Model: A Manual for Program Leaders.

    ERIC Educational Resources Information Center

    Hall, McClellan

    The National Indian Youth Leadership (NIYL) model was created to develop leadership skills for Indian youth to perform their future roles in the family, school, tribe, and nation. The model not only instills leadership skills and values through hands-on learning opportunities, but also challenges youth to apply those skills through projects they…

  10. A systematic review of the challenges to implementation of the patient-centred medical home: lessons for Australia.

    PubMed

    Janamian, Tina; Jackson, Claire L; Glasson, Nicola; Nicholson, Caroline

    2014-08-04

    To review the available literature to identify the major challenges and barriers to implementation and adoption of the patient-centred medical home (PCMH) model, topical in current Australian primary care reforms. Systematic review of peer-reviewed literature. PubMed and Embase databases were searched in December 2012 for studies published in English between January 2007 and December 2012. Studies of any type were included if they defined PCMH using the Patient-Centered Primary Care Collaborative Joint Principles, and reported data on challenges and barriers to implementation and adoption of the PCMH model. One researcher with content knowledge in the area abstracted data relating to the review objective and study design from eligible articles. A second researcher reviewed the abstracted data alongside the original article to check for accuracy and completeness. Thematic synthesis was used in three stages: free line-by-line coding of data; organisation of "free codes" into related areas to construct "descriptive" themes; and development of "analytical" themes. The main barriers identified related to: challenges with the transformation process; difficulties associated with change management; challenges in implementing and using an electronic health record that administers principles of PCMH; challenges with funding and appropriate payment models; insufficient resources and infrastructure within practices; and inadequate measures of performance. This systematic review documents the key challenges and barriers to implementing the PCMH model in United States family practice. It provides valuable evidence for Australian clinicians, policymakers, and organisations approaching adoption of PCMH elements within reform initiatives in this country.

  11. Riparian Wetlands: Mapping

    EPA Science Inventory

    Riparian wetlands are critical systems that perform functions and provide services disproportionate to their extent in the landscape. Mapping wetlands allows for better planning, management, and modeling, but riparian wetlands present several challenges to effective mapping due t...

  12. Perspective: A new model of leadership performance in health care.

    PubMed

    Souba, Wiley

    2011-10-01

    Current leadership models are based largely on concepts and explanations, which provide limited access to the being and actions of an effective leader in health care. Rather than teaching leadership from a theoretical vantage point, the ontological perspective teaches leadership as it is lived and experienced. When one exercises leadership "as lived," concurrently informed by theories, one performs at one's best. A distinctive feature of the ontological approach resides in its capacity to disclose human ways of being and acting that limit our freedom to lead effectively as our natural self-expression. Ontological leadership maintains that our worldviews and mental maps affect the way we lead and are shaped by and accessible through language--hence, to lead more effectively, mastery of a new conversational domain of leadership is required. This emerging model of leadership performance reveals that (1) our actions as leaders are correlated with the way in which the leadership situation we are dealing with occurs for us, and (2) this "occurring" is shaped by the context we bring to that situation. Master leaders use language to recontextualize their leadership challenges so that their naturally correlated ways of being and acting can emerge, resulting in effective leadership. When leaders linguistically unveil limiting contexts, they are freed up to create new contexts that shift the way leadership challenges occur for them. This provides leaders--physicians, scientists, educators, executives--with new opportunity sets (previously unavailable) for exercising exemplary leadership. The ontological approach to leadership offers a powerful framework for tackling health care's toughest challenges.

  13. A high-resolution global flood hazard model

    NASA Astrophysics Data System (ADS)

    Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul D.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ˜90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ˜1 km, mean absolute error in flooded fraction falls to ˜5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.
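
    The benchmark comparison described above reduces to contingency counts between binary wet/dry maps; a minimal sketch, with random arrays standing in for the model and benchmark flood extents:

        # Hit rate, false alarm ratio, and critical success index for a
        # modeled flood map against a benchmark map (synthetic data).
        import numpy as np

        rng = np.random.default_rng(2)
        bench = rng.random((500, 500)) < 0.2                 # benchmark wet cells
        model = bench ^ (rng.random((500, 500)) < 0.05)      # model map with error

        hits = np.sum(model & bench)
        misses = np.sum(~model & bench)
        false_alarms = np.sum(model & ~bench)

        hit_rate = hits / (hits + misses)
        far = false_alarms / (hits + false_alarms)
        csi = hits / (hits + misses + false_alarms)
        print(f"hit rate={hit_rate:.2f}  FAR={far:.2f}  CSI={csi:.2f}")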

  14. Inter-species prediction of protein phosphorylation in the sbv IMPROVER species translation challenge

    PubMed Central

    Biehl, Michael; Sadowski, Peter; Bhanot, Gyan; Bilal, Erhan; Dayarian, Adel; Meyer, Pablo; Norel, Raquel; Rhrissorrakrai, Kahn; Zeller, Michael D.; Hormoz, Sahand

    2015-01-01

    Motivation: Animal models are widely used in biomedical research for reasons ranging from practical to ethical. An important issue is whether rodent models are predictive of human biology. This has been addressed recently in the framework of a series of challenges designed by the systems biology verification for Industrial Methodology for Process Verification in Research (sbv IMPROVER) initiative. In particular, one of the sub-challenges was devoted to the prediction of protein phosphorylation responses in human bronchial epithelial cells, exposed to a number of different chemical stimuli, given the responses in rat bronchial epithelial cells. Participating teams were asked to make inter-species predictions on the basis of available training examples, comprising transcriptomics and phosphoproteomics data. Results: Here, the two best performing teams present their data-driven approaches and computational methods. In addition, post hoc analyses of the datasets and challenge results were performed by the participants and challenge organizers. The challenge outcome indicates that successful prediction of protein phosphorylation status in human based on rat phosphorylation levels is feasible. However, within the limitations of the computational tools used, the inclusion of gene expression data does not improve the prediction quality. The post hoc analysis of time-specific measurements sheds light on the signaling pathways in both species. Availability and implementation: A detailed description of the dataset, challenge design and outcome is available at www.sbvimprover.com. The code used by team IGB is provided under http://github.com/uci-igb/improver2013. Implementations of the algorithms applied by team AMG are available at http://bhanot.biomaps.rutgers.edu/wiki/AMG-sc2-code.zip. Contact: meikelbiehl@gmail.com PMID:24994890

  15. Performance Improvement of Receivers Based on Ultra-Tight Integration in GNSS-Challenged Environments

    PubMed Central

    Qin, Feng; Zhan, Xingqun; Du, Gang

    2013-01-01

    Ultra-tight integration was first proposed by Abbott in 2003 with the purpose of integrating a global navigation satellite system (GNSS) and an inertial navigation system (INS). This technology can improve the tracking performances of a receiver by reconfiguring the tracking loops in GNSS-challenged environments. In this paper, the models of all error sources known to date in the phase lock loops (PLLs) of a standard receiver and an ultra-tightly integrated GNSS/INS receiver are built, respectively. Based on these models, the tracking performances of the two receivers are compared to verify the improvement due to the ultra-tight integration. Meanwhile, the PLL error distributions of the two receivers are also depicted to analyze the error changes of the tracking loops. These results show that the tracking error is significantly reduced in the ultra-tightly integrated GNSS/INS receiver since the receiver's dynamics are estimated and compensated by an INS. Moreover, the mathematical relationship between the tracking performances of the ultra-tightly integrated GNSS/INS receiver and the quality of the selected inertial measurement unit (IMU) is derived from the error models and proved by the error comparisons of four ultra-tightly integrated GNSS/INS receivers aided by different grade IMUs.
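
    The benefit of INS aiding can be seen in the textbook thermal-noise jitter relation for a phase lock loop (e.g., Kaplan and Hegarty's GPS text): narrowing the loop bandwidth, which aiding permits, cuts tracking error. The parameter values below are illustrative, not from the paper.

        # 1-sigma PLL thermal-noise jitter (degrees) versus C/N0 and loop
        # bandwidth Bn; a narrower, INS-aided loop shows smaller jitter.
        import math

        def pll_thermal_jitter_deg(cn0_dbhz, bn_hz, t_coh=0.02):
            cn0 = 10 ** (cn0_dbhz / 10)        # C/N0 as a ratio (Hz)
            return (360 / (2 * math.pi)) * math.sqrt(
                (bn_hz / cn0) * (1 + 1 / (2 * t_coh * cn0)))

        for bn in (15.0, 3.0):                 # unaided vs INS-aided bandwidth
            jitters = [round(pll_thermal_jitter_deg(c, bn), 2) for c in (45, 35, 25)]
            print(f"Bn={bn:4.1f} Hz: {jitters} deg at C/N0 = 45/35/25 dB-Hz")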

  16. Ares-I-X Stability and Control Flight Test: Analysis and Plans

    NASA Technical Reports Server (NTRS)

    Brandon, Jay M.; Derry, Stephen D.; Heim, Eugene H.; Hueschen, Richard M.; Bacon, Barton J.

    2008-01-01

    The flight test of the Ares I-X vehicle provides a unique opportunity to reduce risk of the design of the Ares I vehicle and test out design, math modeling, and analysis methods. One of the key features of the Ares I design is the significant static aerodynamic instability coupled with the relatively flexible vehicle - potentially resulting in a challenging controls problem to provide adequate flight path performance while also providing adequate structural mode damping and preventing adverse control coupling to the flexible structural modes. Another challenge is to obtain enough data from the single flight to be able to conduct analysis showing the effectiveness of the controls solutions and have data to inform design decisions for Ares I. This paper will outline the modeling approaches and control system design to conduct this flight test, and also the system identification techniques developed to extract key information such as control system performance (gain/phase margins, for example), structural dynamics responses, and aerodynamic model estimations.

  17. Cognitive Load and Classroom Teaching: The Double-Edged Sword of Automaticity

    ERIC Educational Resources Information Center

    Feldon, David F.

    2007-01-01

    Research in the development of teacher cognition and teaching performance in K-12 classrooms has identified consistent challenges and patterns of behavior that are congruent with the predictions of dual-process models of cognition. However, cognitive models of information processing are not often used to synthesize these results. This article…

  18. Construct-a-Boat. Science by Design Series.

    ERIC Educational Resources Information Center

    Baroway, William

    This book is one of four books in the Science-by-Design Series created by TERC and funded by the National Science Foundation (NSF). It challenges high school students to investigate the physics of boat performance and work with systems and modeling. Through research, design, testing, and evaluation of a model boat, students experience the…

  19. A Model for Considering the Financial Sustainability of Learning and Teaching Programs: Concepts and Challenges

    ERIC Educational Resources Information Center

    De Bellis, David

    2012-01-01

    The expansion of tertiary education, an intensity of focus on accountability and performance, and the emergence of new governance and management structures drives an economic fiscal perspective of the value of learning and teaching. Accurate and meaningful models defining financial sustainability are therefore proposed as an imperative for…

  20. An Investigation of Unified Memory Access Performance in CUDA

    PubMed Central

    Landaverde, Raphael; Zhang, Tiansheng; Coskun, Ayse K.; Herbordt, Martin

    2015-01-01

    Managing memory between the CPU and GPU is a major challenge in GPU computing. A programming model, Unified Memory Access (UMA), has been recently introduced by Nvidia to simplify the complexities of memory management while claiming good overall performance. In this paper, we investigate this programming model and evaluate its performance and programming model simplifications based on our experimental results. We find that beyond on-demand data transfers to the CPU, the GPU is also able to request subsets of data it requires on demand. This feature allows UMA to outperform full data transfer methods for certain parallel applications and small data sizes. We also find, however, that for the majority of applications and memory access patterns, the performance overheads associated with UMA are significant, while the simplifications to the programming model restrict flexibility for adding future optimizations. PMID:26594668

  1. Opportunities and challenges in developing deep learning models using electronic health records data: a systematic review.

    PubMed

    Xiao, Cao; Choi, Edward; Sun, Jimeng

    2018-06-08

    To conduct a systematic review of deep learning models for electronic health record (EHR) data, and illustrate various deep learning architectures for analyzing different data sources and their target applications. We also highlight ongoing research and identify open challenges in building deep learning models of EHRs. We searched PubMed and Google Scholar for papers on deep learning studies using EHR data published between January 1, 2010, and January 31, 2018. We summarize them according to these axes: types of analytics tasks, types of deep learning model architectures, special challenges arising from health data and tasks and their potential solutions, as well as evaluation strategies. We surveyed and analyzed multiple aspects of the 98 articles we found and identified the following analytics tasks: disease detection/classification, sequential prediction of clinical events, concept embedding, data augmentation, and EHR data privacy. We then studied how deep architectures were applied to these tasks. We also discussed some special challenges arising from modeling EHR data and reviewed a few popular approaches. Finally, we summarized how performance evaluations were conducted for each task. Despite the early success in using deep learning for health analytics applications, there still exist a number of issues to be addressed. We discuss them in detail including data and label availability, the interpretability and transparency of the model, and ease of deployment.

  2. A high‐resolution global flood hazard model†

    PubMed Central

    Smith, Andrew M.; Bates, Paul D.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-01-01

    Abstract Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data‐scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross‐disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ∼90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high‐resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ∼1 km, mean absolute error in flooded fraction falls to ∼5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2‐D only variant and an independently developed pan‐European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next‐generation global terrain data sets will offer the best prospect for a step‐change improvement in model performance. PMID:27594719

  3. Open Innovation at NASA: A New Business Model for Advancing Human Health and Performance Innovations

    NASA Technical Reports Server (NTRS)

    Davis, Jeffrey R.; Richard, Elizabeth E.; Keeton, Kathryn E.

    2014-01-01

    This paper describes a new business model for advancing NASA human health and performance innovations and demonstrates how open innovation shaped its development. A 45 percent research and technology development budget reduction drove formulation of a strategic plan grounded in collaboration. We describe the strategy execution, including adoption and results of open innovation initiatives, the challenges of cultural change, and the development of virtual centers and a knowledge management tool to educate and engage the workforce and promote cultural change.

  4. An object tracking method based on guided filter for night fusion image

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoyan; Wang, Yuedong; Han, Lei

    2016-01-01

    Online object tracking is a challenging problem, as it entails learning an effective model to account for appearance change caused by intrinsic and extrinsic factors. In this paper, we propose a novel online object tracker using a guided image filter for accurate and robust night fusion image tracking. Firstly, frame difference is applied to produce the coarse target, which helps to generate the observation models. Under the restriction of these models and the local source image, the guided filter generates a sufficient and accurate foreground target. Then accurate boundaries of the target can be extracted from the detection results. Finally, timely updating of the observation models helps to avoid tracking drift. Both qualitative and quantitative evaluations on challenging image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
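
    A minimal sketch of that pipeline with OpenCV: frame differencing produces the coarse target, and a guided filter, guided by the current frame, refines the foreground mask. The input file name and thresholds are assumptions, and cv2.ximgproc requires the opencv-contrib-python package.

        # Frame difference -> coarse mask -> guided-filter refinement ->
        # bounding box of the tracked target.
        import cv2
        import numpy as np

        cap = cv2.VideoCapture("night_fusion.avi")        # hypothetical input
        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(gray, prev_gray)           # coarse moving target
            _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            refined = cv2.ximgproc.guidedFilter(guide=gray, src=mask,
                                                radius=8, eps=100.0)
            _, fg = cv2.threshold(refined, 127, 255, cv2.THRESH_BINARY)
            rows, cols = np.nonzero(fg)
            if rows.size:                                 # target bounding box
                cv2.rectangle(frame, (int(cols.min()), int(rows.min())),
                              (int(cols.max()), int(rows.max())), (0, 255, 0), 2)
            prev_gray = gray
        cap.release()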

  5. Building simulation: Ten challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Langevin, Jared; Sun, Kaiyu

    Buildings consume more than one-third of the world’s primary energy. Reducing energy use and greenhouse-gas emissions in the buildings sector through energy conservation and efficiency improvements constitutes a key strategy for achieving global energy and environmental goals. Building performance simulation has been increasingly used as a tool for designing, operating and retrofitting buildings to save energy and utility costs. However, opportunities remain for researchers, software developers, practitioners and policymakers to maximize the value of building performance simulation in the design and operation of low energy buildings and communities that leverage interdisciplinary approaches to integrate humans, buildings, and the power grid at a large scale. This paper presents ten challenges that highlight some of the most important issues in building performance simulation, covering the full building life cycle and a wide range of modeling scales. In conclusion, the formulation and discussion of each challenge aims to provide insights into the state-of-the-art and future research opportunities for each topic, and to inspire new questions from young researchers in this field.

  6. Building simulation: Ten challenges

    DOE PAGES

    Hong, Tianzhen; Langevin, Jared; Sun, Kaiyu

    2018-04-12

    Buildings consume more than one-third of the world’s primary energy. Reducing energy use and greenhouse-gas emissions in the buildings sector through energy conservation and efficiency improvements constitutes a key strategy for achieving global energy and environmental goals. Building performance simulation has been increasingly used as a tool for designing, operating and retrofitting buildings to save energy and utility costs. However, opportunities remain for researchers, software developers, practitioners and policymakers to maximize the value of building performance simulation in the design and operation of low energy buildings and communities that leverage interdisciplinary approaches to integrate humans, buildings, and the power grid at a large scale. This paper presents ten challenges that highlight some of the most important issues in building performance simulation, covering the full building life cycle and a wide range of modeling scales. In conclusion, the formulation and discussion of each challenge aims to provide insights into the state-of-the-art and future research opportunities for each topic, and to inspire new questions from young researchers in this field.

  7. LL13-MatModelRadDetect-PD2Jf Final Report: Materials Modeling for High-Performance Radiation Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lordi, Vincenzo

    The aims of this project are to enable rational materials design for select high-payoff challenges in radiation detection materials by using state-of-the-art predictive atomistic modeling techniques. Three specific high-impact challenges are addressed: (i) design and optimization of electrical contact stacks for TlBr detectors to stabilize temporal response at room temperature; (ii) identification of chemical design principles of host glass materials for large-volume, low-cost, high-performance glass scintillators; and (iii) determination of the electrical impacts of dislocation networks in Cd1-xZnxTe (CZT) that limit its performance and usable single-crystal volume. The specific goals are to establish design and process strategies to achieve improved materials for high-performance detectors. Each of the major tasks is discussed below in three sections, which include the goals for the task and a summary of the major results, followed by a listing of publications that contain the full details, including details of the methodologies used. The appendix lists 12 conference presentations given for this project, including 1 invited talk and 1 invited poster.

  8. Addressing the Real-World Challenges in the Development of Propulsion IVHM Technology Experiment (PITEX)

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Chicatelli, Amy; Fulton, Christopher E.; Balaban, Edward; Sweet, Adam; Hayden, Sandra Claire; Bajwa, Anupa

    2005-01-01

    The Propulsion IVHM Technology Experiment (PITEX) has been an on-going research effort conducted over several years. PITEX has developed and applied a model-based diagnostic system for the main propulsion system of the X-34 reusable launch vehicle, a space-launch technology demonstrator. The application was simulation-based using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, which is the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 Main Propulsion System (MPS) was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real-time on flight-like hardware. In an attempt to expose potential performance problems, these PITEX algorithms were subject to numerous real-world effects in the simulated data including noise, sensor resolution, command/valve talkback information, and nominal build variations. The current research has demonstrated the potential benefits of model-based diagnostics, defined the performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.

  9. Challenges in the development of very high resolution Earth System Models for climate science

    NASA Astrophysics Data System (ADS)

    Rasch, Philip J.; Xie, Shaocheng; Ma, Po-Lun; Lin, Wuyin; Wan, Hui; Qian, Yun

    2017-04-01

    The authors represent the 20+ members of the ACME atmosphere development team. The US Department of Energy (DOE) has, like many other organizations around the world, identified the need for an Earth System Model capable of rapid completion of decade- to century-length simulations at very high (vertical and horizontal) resolution with good climate fidelity. Two years ago DOE initiated a multi-institution effort called ACME (Accelerated Climate Modeling for Energy) to meet this extraordinary challenge, targeting a model eventually capable of running at 10-25 km horizontal and 20-400 m vertical resolution through the troposphere on exascale computational platforms at speeds sufficient to complete 5+ simulated years per day. I will outline the challenges our team has encountered in developing the atmosphere component of this model, and the strategies we have been using for tuning and debugging a model that we can barely afford to run on today's computational platforms. These strategies include: 1) evaluation at lower resolutions; 2) ensembles of short simulations to explore parameter space and perform rough tuning and evaluation; 3) use of regionally refined versions of the model for probing high-resolution model behavior at less expense; 4) use of "auto-tuning" methodologies for model tuning; and 5) brute-force long climate simulations.

  10. Challenges in leveraging existing human performance data for quantifying the IDHEAS HRA method

    DOE PAGES

    Liao, Huafei N.; Groth, Katrina; Stevens-Adams, Susan

    2015-07-29

    Our article documents an exploratory study for collecting and using human performance data to inform human error probability (HEP) estimates for a new human reliability analysis (HRA) method, the IntegrateD Human Event Analysis System (IDHEAS). The method was based on cognitive models and mechanisms underlying human behaviour and employs a framework of 14 crew failure modes (CFMs) to represent human failures typical for human performance in nuclear power plant (NPP) internal, at-power events [1]. A decision tree (DT) was constructed for each CFM to assess the probability of the CFM occurring in different contexts. Data needs for IDHEAS quantification are discussed. Then, the data collection framework and process is described and how the collected data were used to inform HEP estimation is illustrated with two examples. Next, five major technical challenges are identified for leveraging human performance data for IDHEAS quantification. Furthermore, these challenges reflect the data needs specific to IDHEAS. More importantly, they also represent the general issues with current human performance data and can provide insight for a path forward to support HRA data collection, use, and exchange for HRA method development, implementation, and validation.
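
    As a toy illustration of the decision-tree quantification described above, the sketch below walks two invented context factors to a human error probability; the branch factors and HEP values are fabricated for illustration and are not IDHEAS data.

        # A decision tree for one hypothetical crew failure mode: each path
        # through the context factors ends in an (invented) HEP estimate.
        ASSUMED_TREE = {
            ("adequate time", "good indications"): 1e-4,
            ("adequate time", "poor indications"): 1e-3,
            ("limited time",  "good indications"): 5e-3,
            ("limited time",  "poor indications"): 5e-2,
        }

        def hep(time_available, indication_quality):
            return ASSUMED_TREE[(time_available, indication_quality)]

        print("HEP:", hep("limited time", "poor indications"))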

  11. Automated Synthetic Scene Generation

    DTIC Science & Technology

    2014-07-01

    Using the Beard-Maxwell BRDF model, the BRDF from Equations (3.3) and (3.4) is composed of specular, diffuse, and volumetric terms… Such models help organizations developing new remote sensing instruments anticipate sensor performance by enabling the ability to create synthetic imagery… for a proposed sensor before the sensor is built. One of the largest challenges in modeling realistic synthetic imagery, however, is generating the…

  12. Cognitive Model Exploration and Optimization: A New Challenge for Computational Science

    DTIC Science & Technology

    2010-03-01

    …the generation and analysis of computational cognitive models to explain various aspects of cognition. Typically the behavior of these models… computational scale of a workstation, so we have turned to high performance computing (HPC) clusters and volunteer computing for large-scale… computational resources. The majority of applications on the Department of Defense HPC clusters focus on solving partial differential equations (Post…

  13. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and another hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.

  14. Toward Realism in Human Performance Simulation

    DTIC Science & Technology

    2004-01-01

    …toward the development of improved human-like performance of synthetic agents. However, several serious problems continue to challenge researchers and developers. Developers have insufficient behavioral knowledge. To date, models of emotivity and behavior that have been commercialized still tend… (Bindiganavale, 1999). There has even been significant development of architectures to produce animated characters that react appropriately to a small…

  15. A Study of a High Performing, High Poverty Elementary School on the Texas-Mexico Border

    ERIC Educational Resources Information Center

    Lopez, Cynthia Iris

    2012-01-01

    Transforming low performing schools to ensure the academic success of Hispanic children situated in poverty remains an educational challenge. External factors impacting student learning are often targeted as the main reasons for poor academic achievement, thereby advancing the culturally deficit model. This study is about an elementary school that…

  16. An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test

    ERIC Educational Resources Information Center

    Kahraman, Nilüfer

    2014-01-01

    Problem: Practitioners working with multiple-choice tests have long utilized Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. The use of similar applications for performance tests, however, is often encumbered due to the challenges encountered in working with complicated data sets in which local…

  17. Robust Fixed-Structure Controller Synthesis

    NASA Technical Reports Server (NTRS)

    Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)

    2000-01-01

    The ability to develop an integrated control system design methodology for robust high performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.

  18. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  19. An ambient agent model for analyzing managers' performance during stress

    NASA Astrophysics Data System (ADS)

    ChePa, Noraziah; Aziz, Azizi Ab; Gratim, Haned

    2016-08-01

    Stress at work has been reported everywhere. Work-related performance during stress is a pattern of reactions that occurs when managers are presented with work demands that are not matched with their knowledge, skills, or abilities, and which challenge their ability to cope. Although there are many prior findings that explain the development of manager performance during stress, less attention has been given to explaining the same concept through computational models. As such, the descriptive nature of psychological theories about managers' performance during stress can be transformed into a causal-mechanistic stage that explains the relationship between a series of observed phenomena. This paper proposes an ambient agent model for analyzing managers' performance during stress. A set of properties and variables is identified through past literature to construct the model. Differential equations have been used in formalizing the model, and the set of equations reflecting the relations involved in the proposed model is presented. The proposed model is essential and can be encapsulated within an intelligent agent or robot that can be used to support managers during stress.
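
    Formalizing such a model with differential equations can be illustrated with a one-variable toy. The sketch below Euler-integrates an assumed logistic-style relation between work demand, coping capacity, and performance; the equation and all parameter values are illustrative assumptions, not the authors' actual model.

      # Toy stress-performance dynamics: performance P declines when work
      # demand D exceeds coping capacity C, and recovers otherwise.
      # Equation form and parameters are invented for illustration.
      def simulate(P0=0.9, D=0.8, C=0.5, beta=0.4, dt=0.1, steps=100):
          P, trace = P0, []
          for _ in range(steps):
              dP = beta * (C - D) * P * (1 - P)     # logistic-style adjustment
              P = min(max(P + dt * dP, 0.0), 1.0)   # Euler step, clipped to [0, 1]
              trace.append(P)
          return trace

      print(simulate()[-1])  # long-run performance under sustained excess demand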

  20. Human performance modeling for system of systems analytics: soldier fatigue.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). To this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations), and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network-centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  1. GraphMeta: Managing HPC Rich Metadata in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Dong; Chen, Yong; Carns, Philip

    High-performance computing (HPC) systems face increasingly critical metadata management challenges, especially in the approaching exascale era. These challenges arise not only from exploding metadata volumes, but also from increasingly diverse metadata, which contains data provenance and arbitrary user-defined attributes in addition to traditional POSIX metadata. This 'rich' metadata is becoming critical to supporting advanced data management functionality such as data auditing and validation. In our prior work, we identified a graph-based model as a promising solution to uniformly manage HPC rich metadata due to its flexibility and generality. At the same time, however, graph-based HPC rich metadata management also introduces significant challenges to the underlying infrastructure. In this study, we first identify the challenges the underlying infrastructure must meet to support scalable, high-performance rich metadata management. Based on that, we introduce GraphMeta, a graph-based engine designed for this use case. It achieves performance scalability by introducing a new graph partitioning algorithm and a write-optimal storage engine. We evaluate GraphMeta under both synthetic and real HPC metadata workloads, compare it with other approaches, and demonstrate its advantages in terms of efficiency and usability for rich metadata management in HPC systems.
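
    To make the graph-based rich-metadata model concrete, the sketch below represents users, jobs, and files as vertices and provenance relations as attributed edges, then answers a simple provenance query. It uses a plain in-memory dict structure; GraphMeta's partitioned, write-optimized storage engine is far more involved, and all identifiers here are invented.

      # Rich metadata as a property graph: vertices for users/jobs/files,
      # attributed edges for provenance relations ("ran", "wrote").
      graph = {"vertices": {}, "edges": []}

      def add_vertex(vid, kind, **attrs):
          graph["vertices"][vid] = {"kind": kind, **attrs}

      def add_edge(src, dst, relation, **attrs):
          graph["edges"].append({"src": src, "dst": dst, "rel": relation, **attrs})

      add_vertex("user:alice", "user")
      add_vertex("job:42", "job", nodes=128)
      add_vertex("file:/out.h5", "file", size_bytes=3_000_000_000)
      add_edge("user:alice", "job:42", "ran")
      add_edge("job:42", "file:/out.h5", "wrote", timestamp="2017-01-01T00:00:00Z")

      # Provenance query: which files were written by jobs that alice ran?
      jobs = {e["dst"] for e in graph["edges"]
              if e["src"] == "user:alice" and e["rel"] == "ran"}
      print([e["dst"] for e in graph["edges"]
             if e["src"] in jobs and e["rel"] == "wrote"])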

  2. Job crafting in changing organizations: Antecedents and implications for exhaustion and performance.

    PubMed

    Petrou, Paraskevas; Demerouti, Evangelia; Schaufeli, Wilmar B

    2015-10-01

    The present study addressed employee job crafting behaviors (i.e., seeking resources, seeking challenges, and reducing demands) in the context of organizational change. We examined predictors of job crafting both at the organizational level (i.e., perceived impact of the implemented changes on the working life of employees) and the individual level (i.e., employee willingness to follow the changes). Job crafting behaviors were expected to predict task performance and exhaustion. Two-wave longitudinal data from 580 police officers undergoing organizational changes were analyzed with structural equation modeling. Findings showed that the degree to which changes influence employees' daily work was linked to reducing demands and exhaustion, whereas employee willingness to change was linked to seeking resources and seeking challenges. Furthermore, while seeking resources and seeking challenges were associated with high task performance and low exhaustion respectively, reducing demands seemed to predict exhaustion positively. Our findings suggest that job crafting can act as a strategy of employees to respond to organizational change. While seeking resources and seeking challenges enhance employee adjustment and should be encouraged by managers, reducing demands seems to have unfavorable implications for employees. (c) 2015 APA, all rights reserved.

  3. Muscular contribution to low-back loading and stiffness during standard and suspended push-ups.

    PubMed

    Beach, Tyson A C; Howarth, Samuel J; Callaghan, Jack P

    2008-06-01

    Push-up exercises are normally performed to challenge muscles that span upper extremity joints. However, it is also recognized that push-ups provide an effective abdominal muscle challenge, especially when the hands are in contact with a labile support surface. The purpose of this study was to compare trunk muscle activation levels and resultant intervertebral joint (IVJ) loading when standard and suspended push-ups were performed, and to quantify and compare the contribution of trunk muscles to IVJ rotational stiffness in both exercises. Eleven recreationally trained male volunteers performed sets of standard and suspended push-ups. Upper body kinematic, kinetic, and EMG data were collected and input into a 3D biomechanical model of the lumbar torso to quantify lumbar IVJ loading and the contributions of trunk muscles to IVJ rotational stiffness. When performing suspended push-ups, muscles of the abdominal wall and the latissimus dorsi were activated to levels that were significantly greater than those elicited when performing standard push-ups (p<.05). As a direct result of these increased activation levels, model-predicted muscle forces increased and consequently led to significantly greater mean (p=.0008) and peak (p=.0012) lumbar IVJ compressive forces when performing suspended push-ups. Also directly resulting from the increased activation levels of the abdominal muscles and the latissimus dorsi during suspended push-ups was increased muscular contribution to lumbar IVJ rotational stiffness (p<.05). In comparison to the standard version of the exercise, suspended push-ups appear to provide a superior abdominal muscle challenge. However, for individuals unable to tolerate high lumbar IVJ compressive loads, potential benefits gained by incorporating suspended push-ups into their resistance training regimen may be outweighed by the risk of overloading low-back tissues.

  4. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    NASA Astrophysics Data System (ADS)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.

  5. Developing and validating risk prediction models in an individual participant data meta-analysis

    PubMed Central

    2014-01-01

    Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
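
    The rotation scheme described in the conclusions ('internal-external cross-validation') is easy to express in code. Below is a minimal sketch, assuming scikit-learn is available; the risk model (logistic regression), the data, and all variable names are illustrative stand-ins rather than anything from the reviewed articles.

      # Internal-external cross-validation: develop the model on all but one
      # study, validate on the held-out study, and rotate over studies.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def internal_external_cv(X, y, study_ids):
          scores = []
          for held_out in np.unique(study_ids):
              train, test = study_ids != held_out, study_ids == held_out
              model = LogisticRegression().fit(X[train], y[train])
              scores.append((held_out, model.score(X[test], y[test])))
          return scores  # one external-validation estimate per study

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 3))                      # three predictors
      y = (X[:, 0] + rng.normal(size=300) > 0).astype(int)  # binary outcome
      studies = rng.integers(0, 5, size=300)             # five IPD studies
      print(internal_external_cv(X, y, studies))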

  6. International Space Station Passive Thermal Control System Analysis, Top Ten Lessons-Learned

    NASA Technical Reports Server (NTRS)

    Iovine, John

    2011-01-01

    The International Space Station (ISS) has been on-orbit for over 10 years, and there have been numerous technical challenges along the way from design to assembly to on-orbit anomalies and repairs. The Passive Thermal Control System (PTCS) management team has been a key player in successfully dealing with these challenges. The PTCS team performs thermal analysis in support of design and verification, launch and assembly constraints, integration, sustaining engineering, failure response, and model validation. This analysis is a significant body of work and provides a unique opportunity to compile a wealth of real world engineering and analysis knowledge and the corresponding lessons-learned. The analysis lessons encompass the full life cycle of flight hardware from design to on-orbit performance and sustaining engineering. These lessons can provide significant insight for new projects and programs. Key areas to be presented include thermal model fidelity, verification methods, analysis uncertainty, and operations support.

  7. The Role of Flipped Learning in Managing the Cognitive Load of a Threshold Concept in Physiology

    ERIC Educational Resources Information Center

    Akkaraju, Shylaja

    2016-01-01

    To help students master challenging, threshold concepts in physiology, I used the flipped learning model in a human anatomy and physiology course with very encouraging results in terms of student motivation, preparedness, engagement, and performance. The flipped learning model was enhanced by pre-training and formative assessments that provided…

  8. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  9. Status of Foreground and Instrument Challenges for 21cm EoR experiments - Design Strategies for SKA and HERA

    NASA Astrophysics Data System (ADS)

    Thyagarajan, Nithyanandan

    2018-05-01

    Direct detection of the Epoch of Reionization (EoR) via the redshifted 21 cm line of H I will reveal the nature of the first stars and galaxies as well as revolutionize our understanding of a poorly explored evolutionary phase of the Universe. Projects such as the MWA, LOFAR, and PAPER commenced in the last decade with the promise of high-significance statistical detection of the EoR, but have so far only weakly constrained models owing to unforeseen challenges from bright foreground sources and instrument systematics. It is essential for next-generation instruments like HERA and the SKA to have these challenges addressed. I present an analysis of these challenges - wide-field measurements, antenna beam chromaticity, reflections in the instrument, and antenna position errors - along with performance specifications and design solutions that will be critical to designing successful next-generation instruments, both in enabling a first detection and in placing meaningful constraints on reionization models.

  10. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  11. ExaSAT: An exascale co-design tool for performance modeling

    DOE PAGES

    Unat, Didem; Chan, Cy; Zhang, Weiqun; ...

    2015-02-09

    One of the emerging challenges in designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework, which automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework's ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.
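
    A parameterized analytic model of the kind ExaSAT extracts can be reduced to a few lines. The sketch below uses a roofline-style bound with data movement as the limiting metric; the parameter names and hardware numbers are illustrative assumptions, not ExaSAT's actual model.

      # Roofline-style analytic model: a kernel's predicted time is bounded by
      # whichever is larger, compute time or data-movement time.
      def predicted_time(flops, bytes_moved, peak_flops=1e12, mem_bw=1e11):
          """Return the compute- or bandwidth-limited time in seconds."""
          return max(flops / peak_flops, bytes_moved / mem_bw)

      # Hardware trade-off study: sweep memory bandwidth for a fixed kernel
      for bw in (5e10, 1e11, 2e11):
          t = predicted_time(flops=1e9, bytes_moved=4e8, mem_bw=bw)
          print(f"bw={bw:.0e} B/s -> t={t*1e3:.2f} ms")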

  12. Statistical modelling of networked human-automation performance using working memory capacity.

    PubMed

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
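
    As a concrete illustration of two of the three approaches named above, the sketch below fits a linear regression and a Gaussian process to simulated performance data indexed by working-memory capacity, assuming scikit-learn; the data and numbers are invented, not the study's.

      # Linear regression (reliable near observed conditions) vs Gaussian
      # process (prediction with uncertainty beyond them) on simulated data.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(1)
      wm = rng.uniform(0, 1, size=(40, 1))                    # WM capacity
      perf = 0.5 + 0.4 * wm[:, 0] + rng.normal(0, 0.05, 40)   # performance

      lin = LinearRegression().fit(wm, perf)
      gp = GaussianProcessRegressor().fit(wm, perf)

      x_new = np.array([[0.9]])
      mean, std = gp.predict(x_new, return_std=True)
      print(lin.predict(x_new), mean, std)  # point estimate vs estimate + uncertainty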

  13. Model-based nonlinear control of hydraulic servo systems: Challenges, developments and perspectives

    NASA Astrophysics Data System (ADS)

    Yao, Jianyong

    2018-06-01

    Hydraulic servo systems play a significant role in industry, usually acting as a core element in control and power transmission. Although linear theory-based control methods are well established, advanced controller design for hydraulic servo systems to achieve high performance remains an unending pursuit along with the development of modern industry. Essential nonlinearity is a unique feature of these systems and makes model-based nonlinear control more attractive, since it benefits from prior knowledge of the servo-valve-controlled hydraulic system. In this paper, a discussion of the challenges in model-based nonlinear control, the latest developments, and brief perspectives for hydraulic servo systems is presented. Modelling uncertainty in hydraulic systems is a major challenge, comprising parametric uncertainty and time-varying disturbance; specific requirements also give rise to ad hoc difficulties such as nonlinear friction during low-velocity tracking, severe disturbance, periodic disturbance, etc. To handle these various challenges, nonlinear solutions including parameter adaptation, nonlinear robust control, state and disturbance observation, backstepping design and so on are proposed and integrated; theoretical analysis and numerous applications reveal their powerful capability to solve the pertinent problems. Finally, some perspectives and associated research topics (measurement noise, constraints, inner valve dynamics, input nonlinearity, etc.) in nonlinear hydraulic servo control are briefly explored and discussed.

  14. Optimization of Biomathematical Model Predictions for Cognitive Performance Impairment in Individuals: Accounting for Unknown Traits and Uncertain States in Homeostatic and Circadian Processes

    PubMed Central

    Van Dongen, Hans P. A.; Mott, Christopher G.; Huang, Jen-Kuang; Mollicone, Daniel J.; McKenzie, Frederic D.; Dinges, David F.

    2007-01-01

    Current biomathematical models of fatigue and performance do not accurately predict cognitive performance for individuals with a priori unknown degrees of trait vulnerability to sleep loss, do not predict performance reliably when initial conditions are uncertain, and do not yield statistically valid estimates of prediction accuracy. These limitations diminish their usefulness for predicting the performance of individuals in operational environments. To overcome these 3 limitations, a novel modeling approach was developed, based on the expansion of a statistical technique called Bayesian forecasting. The expanded Bayesian forecasting procedure was implemented in the two-process model of sleep regulation, which has been used to predict performance on the basis of the combination of a sleep homeostatic process and a circadian process. Employing the two-process model with the Bayesian forecasting procedure to predict performance for individual subjects in the face of unknown traits and uncertain states entailed subject-specific optimization of 3 trait parameters (homeostatic build-up rate, circadian amplitude, and basal performance level) and 2 initial state parameters (initial homeostatic state and circadian phase angle). Prior information about the distribution of the trait parameters in the population at large was extracted from psychomotor vigilance test (PVT) performance measurements in 10 subjects who had participated in a laboratory experiment with 88 h of total sleep deprivation. The PVT performance data of 3 additional subjects in this experiment were set aside beforehand for use in prospective computer simulations. The simulations involved updating the subject-specific model parameters every time the next performance measurement became available, and then predicting performance 24 h ahead. Comparison of the predictions to the subjects' actual data revealed that as more data became available for the individuals at hand, the performance predictions became increasingly more accurate and had progressively smaller 95% confidence intervals, as the model parameters converged efficiently to those that best characterized each individual. Even when more challenging simulations were run (mimicking a change in the initial homeostatic state; simulating the data to be sparse), the predictions were still considerably more accurate than would have been achieved by the two-process model alone. Although the work described here is still limited to periods of consolidated wakefulness with stable circadian rhythms, the results obtained thus far indicate that the Bayesian forecasting procedure can successfully overcome some of the major outstanding challenges for biomathematical prediction of cognitive performance in operational settings. Citation: Van Dongen HPA; Mott CG; Huang JK; Mollicone DJ; McKenzie FD; Dinges DF. Optimization of biomathematical model predictions for cognitive performance impairment in individuals: accounting for unknown traits and uncertain states in homeostatic and circadian processes. SLEEP 2007;30(9):1129-1143. PMID:17910385
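
    The core of the Bayesian forecasting procedure, updating a posterior over subject-specific trait parameters as each new measurement arrives and then predicting ahead, can be sketched compactly. The toy below reduces the two-process model to a single saturating homeostatic process and tracks one trait parameter on a grid; the functional form and all numbers are illustrative assumptions, not the published model.

      # Grid-based Bayesian updating of one trait parameter (homeostatic
      # build-up rate r) from incoming performance measurements.
      import numpy as np

      r_grid = np.linspace(0.02, 0.2, 100)            # candidate build-up rates
      posterior = np.ones_like(r_grid) / len(r_grid)  # population prior (flat here)

      def predict(r, t_awake):
          return 10.0 * (1 - np.exp(-r * t_awake))    # lapses vs hours awake

      for t, observed in [(16, 3.1), (24, 5.2), (40, 7.9)]:  # incoming PVT data
          lik = np.exp(-0.5 * ((observed - predict(r_grid, t)) / 1.0) ** 2)
          posterior = posterior * lik                  # Bayes update
          posterior /= posterior.sum()

      r_hat = r_grid[np.argmax(posterior)]
      print(r_hat, predict(r_hat, 64))  # individualized ahead-of-time prediction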

  15. Challenges in microbial ecology: building predictive understanding of community function and dynamics

    PubMed Central

    Widder, Stefanie; Allen, Rosalind J; Pfeiffer, Thomas; Curtis, Thomas P; Wiuf, Carsten; Sloan, William T; Cordero, Otto X; Brown, Sam P; Momeni, Babak; Shou, Wenying; Kettle, Helen; Flint, Harry J; Haas, Andreas F; Laroche, Béatrice; Kreft, Jan-Ulrich; Rainey, Paul B; Freilich, Shiri; Schuster, Stefan; Milferstedt, Kim; van der Meer, Jan R; Groβkopf, Tobias; Huisman, Jef; Free, Andrew; Picioreanu, Cristian; Quince, Christopher; Klapper, Isaac; Labarthe, Simon; Smets, Barth F; Wang, Harris; Soyer, Orkun S

    2016-01-01

    The importance of microbial communities (MCs) cannot be overstated. MCs underpin the biogeochemical cycles of the earth's soil, oceans and the atmosphere, and perform ecosystem functions that impact plants, animals and humans. Yet our ability to predict and manage the function of these highly complex, dynamically changing communities is limited. Building predictive models that link MC composition to function is a key emerging challenge in microbial ecology. Here, we argue that addressing this challenge requires close coordination of experimental data collection and method development with mathematical model building. We discuss specific examples where model–experiment integration has already resulted in important insights into MC function and structure. We also highlight key research questions that still demand better integration of experiments and models. We argue that such integration is needed to achieve significant progress in our understanding of MC dynamics and function, and we make specific practical suggestions as to how this could be achieved. PMID:27022995

  16. Development of a sheep challenge model for Rift Valley fever.

    PubMed

    Faburay, Bonto; Gaudreault, Natasha N; Liu, Qinfang; Davis, A Sally; Shivanna, Vinay; Sunwoo, Sun Young; Lang, Yuekun; Morozov, Igor; Ruder, Mark; Drolet, Barbara; Scott McVey, D; Ma, Wenjun; Wilson, William; Richt, Juergen A

    2016-02-01

    Rift Valley fever (RVF) is a zoonotic disease that causes severe epizootics in ruminants, characterized by mass abortion and high mortality rates in younger animals. The development of a reliable challenge model is an important prerequisite for evaluation of existing and novel vaccines. A study aimed at comparing the pathogenesis of RVF virus infection in US sheep using two genetically different wild type strains of the virus (SA01-1322 and Kenya-128B-15) was performed. A group of sheep was inoculated with both strains and all infected sheep manifested early-onset viremia accompanied by a transient increase in temperatures. The Kenya-128B-15 strain manifested higher virulence compared to SA01-1322 by inducing more severe liver damage, and longer and higher viremia. Genome sequence analysis revealed sequence variations between the two isolates, which potentially could account for the observed phenotypic differences. We conclude that Kenya-128B-15 sheep infection represents a good and virulent challenge model for RVF. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Gradient and shim technologies for ultra high field MRI

    PubMed Central

    Winkler, Simone A.; Schmitt, Franz; Landes, Hermann; DeBever, Josh; Wade, Trevor; Alejski, Andrew

    2017-01-01

    Ultra High Field (UHF) MRI requires improved gradient and shim performance to fully realize the promised gains (SNR as well as spatial, spectral, and diffusion resolution) that higher main magnetic fields offer. Both the more challenging UHF environment itself and the higher currents used in high-performance coils require a deeper understanding, combined with sophisticated engineering modeling and construction, to optimize gradient and shim hardware for safe operation and for the highest image quality. This review summarizes the basics of gradient and shim technologies and outlines a number of UHF-related challenges and solutions. In particular, Lorentz forces, vibroacoustics, eddy currents, and peripheral nerve stimulation are discussed. Several promising UHF-relevant gradient concepts are described, including insertable gradient coils aimed at higher-performance neuroimaging. PMID:27915120

  18. Thermal Protection for Mars Sample Return Earth Entry Vehicle: A Grand Challenge for Design Methodology and Reliability Verification

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.

    2017-01-01

    Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron diameter escaping into the Earth environment be lower than 1 in a million for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1 in 1,000, and the demonstrated reliability for previous human Earth return systems was closer to 1 in 100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit by MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design which allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to take ground impact and not break apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.
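
    The orders-of-magnitude comparison can be made explicit. A short worked ratio, assuming the allocations quoted above are read as failure probabilities of 10⁻³ (Orion TPS) and 10⁻⁶ (MSR entry system):

      % Worked ratio behind the "3 orders of magnitude" claim; the exponents
      % are assumptions based on the allocations quoted in the abstract.
      \[
        \frac{P_{\mathrm{fail}}^{\mathrm{Orion}}}{P_{\mathrm{fail}}^{\mathrm{MSR}}}
        = \frac{10^{-3}}{10^{-6}} = 10^{3}
      \]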

  19. Hierarchical Context Modeling for Video Event Recognition.

    PubMed

    Wang, Xiaoyang; Ji, Qiang

    2016-10-11

    Current video event recognition research remains largely target-centered. For real-world surveillance videos, target-centered event recognition faces great challenges due to large intra-class target variation, limited image resolution, and poor detection and tracking results. To mitigate these challenges, we introduce a context-augmented video event recognition approach. Specifically, we explicitly capture different types of contexts from three levels including image level, semantic level, and prior level. At the image level, we introduce two types of contextual features including the appearance context features and interaction context features to capture the appearance of context objects and their interactions with the target objects. At the semantic level, we propose a deep model based on deep Boltzmann machine to learn event object representations and their interactions. At the prior level, we utilize two types of prior-level contexts including scene priming and dynamic cueing. Finally, we introduce a hierarchical context model that systematically integrates the contextual information at different levels. Through the hierarchical context model, contexts at different levels jointly contribute to the event recognition. We evaluate the hierarchical context model for event recognition on benchmark surveillance video datasets. Results show that incorporating contexts in each level can improve event recognition performance, and jointly integrating three levels of contexts through our hierarchical model achieves the best performance.

  20. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
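
    The chunked-interpolation idea, splitting a high-resolution grid into tiles so that peak memory stays bounded regardless of grid size, can be sketched without any ESMF-specific calls. In the toy below, plain numpy decimation stands in for ESMF's conservative regridding; the shapes and tile size are arbitrary.

      # Chunked regridding sketch: process one tile of rows at a time so that
      # only a bounded slice of the source grid is ever resident in memory.
      import numpy as np

      src = np.random.rand(4000, 4000)   # source field on a fine grid
      scale, tile = 2, 500               # coarsen 2x, process 500-row chunks

      out_rows = []
      for r0 in range(0, src.shape[0], tile):
          chunk = src[r0:r0 + tile]                 # only one tile in memory
          out_rows.append(chunk[::scale, ::scale])  # stand-in "interpolation"
      dst = np.vstack(out_rows)
      print(dst.shape)  # (2000, 2000)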

  1. Identification of a leadership competency model for use in the development, recruitment & retention of intermodal transportation workers.

    DOT National Transportation Integrated Search

    2010-12-01

    Transportation, like most industries, faces critical leadership challenges. Attracting, retaining, and training high-potential candidates are essential to safe and productive organizational performance. Indeed, as the reliance on efficient public…

  2. A Classification Scheme for Smart Manufacturing Systems’ Performance Metrics

    PubMed Central

    Lee, Y. Tina; Kumaraguru, Senthilkumaran; Jain, Sanjay; Robinson, Stefanie; Helu, Moneer; Hatim, Qais Y.; Rachuri, Sudarsan; Dornfeld, David; Saldana, Christopher J.; Kumara, Soundar

    2017-01-01

    This paper proposes a classification scheme for performance metrics for smart manufacturing systems. The discussion focuses on three such metrics: agility, asset utilization, and sustainability. For each of these metrics, we discuss classification themes, which we then use to develop a generalized classification scheme. In addition to the themes, we discuss a conceptual model that may form the basis for the information necessary for performance evaluations. Finally, we present future challenges in developing robust, performance-measurement systems for real-time, data-intensive enterprises. PMID:28785744

  3. Scale effect challenges in urban hydrology highlighted with a Fully Distributed Model and High-resolution rainfall data

    NASA Astrophysics Data System (ADS)

    Ichiba, Abdellah; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel; Bompard, Philippe; Ten Veldhuis, Marie-Claire

    2017-04-01

    Nowadays, there is a growing interest in small-scale rainfall information, provided by weather radars, to be used in urban water management and decision-making. In parallel, increasing attention is devoted to the development of fully distributed and grid-based models, following the increase in computation capabilities and the availability of the high-resolution GIS information needed to implement such models. However, the choice of an appropriate implementation scale, able to integrate the catchment heterogeneity and the full rainfall variability measured by high-resolution radar technologies, remains an open issue. This work proposes a two-step investigation of scale effects in urban hydrology and their impact on modeling. In the first step, fractal tools are used to highlight the scale dependency observed within the distributed data used to describe catchment heterogeneity; both the structure of the sewer network and the distribution of impervious areas are analyzed. Then an intensive multi-scale modeling exercise is carried out to understand scaling effects on hydrological model performance. Investigations were conducted using a fully distributed and physically based model, Multi-Hydro, developed at Ecole des Ponts ParisTech. The model was implemented at 17 spatial resolutions ranging from 100 m to 5 m, and modeling investigations were performed using both rain gauge rainfall information and high-resolution X-band radar data in order to assess the sensitivity of the model to small-scale rainfall variability. The results demonstrate the scale-effect challenges in urban hydrological modeling. The fractal analysis highlights the scale dependency observed within the distributed data used to implement hydrological models: patterns of geophysical data change with the observation pixel size. The multi-scale modeling investigation performed with the Multi-Hydro model at 17 spatial resolutions confirms the effect of scale on hydrological model performance. Results were analyzed at three ranges of scales identified in the fractal analysis and confirmed in the modeling work, and the sensitivity of the model to small-scale rainfall variability is discussed as well.
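
    A minimal sketch of the fractal (box-counting) analysis mentioned above: estimate a fractal dimension from a binary raster such as an impervious-area mask. The synthetic random mask here merely stands in for real GIS layers.

      # Box-counting dimension: count occupied boxes at several box sizes and
      # take the slope of log(count) vs log(size).
      import numpy as np

      def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
          counts = []
          for s in sizes:
              h, w = mask.shape[0] // s * s, mask.shape[1] // s * s
              blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
              counts.append((blocks.sum(axis=(1, 3)) > 0).sum())  # occupied boxes
          # counts ~ size^(-D), so the fitted slope is -D
          return -np.polyfit(np.log(sizes), np.log(counts), 1)[0]

      mask = np.random.rand(256, 256) > 0.7   # stand-in impervious-area mask
      print(box_count_dimension(mask))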

  4. Contemporary group estimates adjusted for climatic effects provide a finer definition of the unknown environmental challenges experienced by growing pigs.

    PubMed

    Guy, S Z Y; Li, L; Thomson, P C; Hermesch, S

    2017-12-01

    Environmental descriptors derived from mean performances of contemporary groups (CGs) are assumed to capture any known and unknown environmental challenges. The objective of this paper was to obtain a finer definition of the unknown challenges, by adjusting CG estimates for the known climatic effects of monthly maximum air temperature (MaxT), minimum air temperature (MinT) and monthly rainfall (Rain). As the unknown component could include infection challenges, these refined descriptors may help to better model varying responses of sire progeny to environmental infection challenges for the definition of disease resilience. Data were recorded from 1999 to 2013 at a piggery in south-east Queensland, Australia (n = 31,230). Firstly, CG estimates of average daily gain (ADG) and backfat (BF) were adjusted for MaxT, MinT and Rain, which were fitted as splines. In the models used to derive CG estimates for ADG, MaxT and MinT were significant variables. The models that contained these significant climatic variables had CG estimates with a lower variance compared to models without significant climatic variables. Variance component estimates were similar across all models, suggesting that these significant climatic variables accounted for some known environmental variation captured in CG estimates. No climatic variables were significant in the models used to derive the CG estimates for BF. These CG estimates were used to categorize environments. There was no observable sire by environment interaction (Sire×E) for ADG when using the environmental descriptors based on CG estimates on BF. For the environmental descriptors based on CG estimates of ADG, there was significant Sire×E only when MinT was included in the model (p = .01). Therefore, this new definition of the environment, preadjusted by MinT, increased the ability to detect Sire×E. While the unknown challenges captured in refined CG estimates need verification for infection challenges, this may provide a practical approach for the genetic improvement of disease resilience. © 2017 Blackwell Verlag GmbH.
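
    The adjustment described above, fitting CG effects jointly with smooth climatic terms so that the refined CG estimates capture mainly the unknown (e.g., infection) challenges, might look like the following sketch using statsmodels' formula interface with a patsy B-spline for MaxT. The data frame, column names and numbers are invented for illustration.

      # Fit CG effects alongside a spline term for a climatic covariate; the
      # resulting CG coefficients are the climate-adjusted descriptors.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      df = pd.DataFrame({
          "adg": rng.normal(800, 50, 400),            # average daily gain, g/day
          "maxt": rng.uniform(15, 35, 400),           # monthly max temperature
          "cg": rng.integers(0, 20, 400).astype(str)  # contemporary group
      })

      fit = smf.ols("adg ~ bs(maxt, df=4) + C(cg)", data=df).fit()
      print(fit.params.filter(like="C(cg)").head())   # adjusted CG estimates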

  5. The effect of challenge and threat states on performance: An examination of potential mechanisms

    PubMed Central

    Moore, Lee J; Vine, Samuel J; Wilson, Mark R; Freeman, Paul

    2012-01-01

    Challenge and threat states predict future performance; however, no research has examined their immediate effect on motor task performance. The present study examined the effect of challenge and threat states on golf putting performance and several possible mechanisms. One hundred twenty-seven participants were assigned to a challenge or threat group and performed six putts during which emotions, gaze, putting kinematics, muscle activity, and performance were recorded. Challenge and threat states were successively manipulated via task instructions. The challenge group performed more accurately, reported more favorable emotions, and displayed more effective gaze, putting kinematics, and muscle activity than the threat group. Multiple putting kinematic variables mediated the relationship between group and performance, suggesting that challenge and threat states impact performance at a predominately kinematic level. PMID:22913339

  6. Learning and remembering strategies of novice and advanced jazz dancers for skill level appropriate dance routines.

    PubMed

    Poon, P P; Rodgers, W M

    2000-06-01

    This study examined the influence of the challenge level of the to-be-learned stimulus on learning strategies in novice and advanced dancers. In Study 1, skill-level-appropriate dance routines were developed for novice and advanced jazz dancers. In Study 2, 8 novice and 9 advanced female jazz dancers attempted to learn and remember the two routines in a mixed-model factorial design, with one between-participants factor, skill level (novice or advanced), and two within-participants factors, routine (easy or difficult) and performance (immediate or delayed). Participants were interviewed regarding the strategies used to learn and remember the routines. Results indicated that advanced performers used atypical learning strategies for insufficiently challenging stimuli, which may reflect characteristics of the stimuli rather than the performer. The qualitative data indicate a clear preference of novice and advanced performers for spatial compatibility of stimuli and response.

  7. Brain-Computer Interfaces: A Neuroscience Paradigm of Social Interaction? A Matter of Perspective

    PubMed Central

    Mattout, Jérémie

    2012-01-01

    A number of recent studies have put human subjects in true social interactions, with the aim of better identifying the psychophysiological processes underlying social cognition. Interestingly, this emerging Neuroscience of Social Interactions (NSI) field brings up challenges which resemble important ones in the field of Brain-Computer Interfaces (BCI). Importantly, these challenges go beyond common objectives such as the eventual use of BCI and NSI protocols in the clinical domain or common interests pertaining to the use of online neurophysiological techniques and algorithms. Common fundamental challenges are now apparent and one can argue that a crucial one is to develop computational models of brain processes relevant to human interactions with an adaptive agent, whether human or artificial. Coupled with neuroimaging data, such models have proved promising in revealing the neural basis and mental processes behind social interactions. Similar models could help BCI to move from well-performing but offline static machines to reliable online adaptive agents. This emphasizes a social perspective to BCI, which is not limited to a computational challenge but extends to all questions that arise when studying the brain in interaction with its environment. PMID:22675291

  8. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    NASA Astrophysics Data System (ADS)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem with viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed, and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and performance comparison of hybrid models with conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  9. Optimizing learning in healthcare: how Island Health is evolving to learn at the speed of change.

    PubMed

    Gottfredson, Conrad; Stroud, Carol; Jackson, Mary; Stevenson, R Lynn; Archer, Jana

    2014-01-01

    Healthcare organizations are challenged with constrained resources and increasing service demands by an aging population with complex care needs. Exponential growth in competency requirements also challenges staff's ability to provide quality patient care. How can a healthcare organization support its staff to learn "at or above the speed of change" while continuing to provide quality patient care? Island Health is addressing this challenge by transforming its traditional education model into an innovative, evidence-based learning and performance support approach. Implementation of the methodology is yielding several lessons learned, both for the internal Learning and Performance Support team and for what it takes to bring a new way of doing business into an organization. A key result is that this approach is enabling the organization to be more responsive in helping staff gain and maintain competencies.

  10. Challenging situations when teaching children with autism spectrum disorders in general physical education.

    PubMed

    Obrusnikova, Iva; Dillon, Suzanna R

    2011-04-01

    As the first step of an instrument development, teaching challenges that occur when students with autism spectrum disorders are educated in general physical education were elicited using Goldfried and D'Zurilla's (1969) behavioral-analytic model. Data were collected from a convenience sample of 43 certified physical educators (29 women and 14 men) using a demographic questionnaire and an elicitation questionnaire. Participants listed 225 teaching challenges: 46% related to cooperative, 31% to competitive, and 24% to individualistic learning situations. Teaching challenges were categorized into nine themes, including: inattentive and hyperactive behaviors, social impairment, emotional regulation difficulties, difficulties understanding and performing tasks, narrow focus and inflexible adherence to routines and structure, isolation by classmates, negative effects on classmates' learning, and need for support.

  11. An Outpatient, Ambulant-Design, Controlled Human Infection Model Using Escalating Doses of Salmonella Typhi Challenge Delivered in Sodium Bicarbonate Solution

    PubMed Central

    Waddington, Claire S.; Darton, Thomas C.; Jones, Claire; Haworth, Kathryn; Peters, Anna; John, Tessa; Thompson, Ben A. V.; Kerridge, Simon A.; Kingsley, Robert A.; Zhou, Liqing; Holt, Kathryn E.; Yu, Ly-Mee; Lockhart, Stephen; Farrar, Jeremy J.; Sztein, Marcelo B.; Dougan, Gordon; Angus, Brian; Levine, Myron M.; Pollard, Andrew J.

    2014-01-01

    Background. Typhoid fever is a major global health problem, the control of which is hindered by lack of a suitable animal model in which to study Salmonella Typhi infection. Until 1974, a human challenge model advanced understanding of typhoid and was used in vaccine development. We set out to establish a new human challenge model and ascertain the S. Typhi (Quailes strain) inoculum required for an attack rate of 60%–75% in typhoid-naive volunteers when ingested with sodium bicarbonate solution. Methods. Groups of healthy consenting adults ingested escalating dose levels of S. Typhi and were closely monitored in an outpatient setting for 2 weeks. Antibiotic treatment was initiated if typhoid diagnosis occurred (temperature ≥38°C sustained ≥12 hours or bacteremia) or at day 14 in those remaining untreated. Results. Two dose levels (10³ or 10⁴ colony-forming units) were required to achieve the primary objective, resulting in attack rates of 55% (11/20) or 65% (13/20), respectively. Challenge was well tolerated; 4 of 40 participants fulfilled prespecified criteria for severe infection. Most diagnoses (87.5%) were confirmed by blood culture, and asymptomatic bacteremia and stool shedding of S. Typhi were also observed. Participants who developed typhoid infection demonstrated serological responses to flagellin and lipopolysaccharide antigens by day 14; however, no anti-Vi antibody responses were detected. Conclusions. Human challenge with a small inoculum of virulent S. Typhi administered in bicarbonate solution can be performed safely using an ambulant-model design to advance understanding of host–pathogen interactions and immunity. This model should expedite development of diagnostics, vaccines, and therapeutics for typhoid control. PMID:24519873

  12. New Challenges for Intervertebral Disc Treatment Using Regenerative Medicine

    PubMed Central

    Masuda, Koichi

    2010-01-01

    The development of tissue engineering therapies for the intervertebral disc is challenging due to ambiguities of disease and pain mechanisms in patients, and lack of consensus on preclinical models for safety and efficacy testing. Although the issues associated with model selection for studying orthopedic diseases or treatments have been discussed often, the multifaceted challenges associated with developing intervertebral disc tissue engineering therapies require special discussion. This review covers topics relevant to the clinical translation of tissue-engineered technologies: (1) the unmet clinical need, (2) appropriate models for safety and efficacy testing, (3) the need for standardized model systems, and (4) the translational pathways leading to a clinical trial. For preclinical evaluation of new therapies, we recommend establishing biologic plausibility of efficacy and safety using models of increasing complexity, starting with cell culture, small animals (rats and rabbits), and then large animals (goat and minipig) that more closely mimic nutritional, biomechanical, and surgical realities of human application. The use of standardized and reproducible experimental procedures and outcome measures is critical for judging relative efficacy. Finally, success will hinge on carefully designed clinical trials with well-defined patient selection criteria, gold-standard controls, and objective outcome metrics to assess performance in the early postoperative period. PMID:19903086

  13. Model-based analyses: Promises, pitfalls, and example applications to the study of cognitive control

    PubMed Central

    Mars, Rogier B.; Shea, Nicholas J.; Kolling, Nils; Rushworth, Matthew F. S.

    2011-01-01

    We discuss a recent approach to investigating cognitive control, which has the potential to deal with some of the challenges inherent in this endeavour. In a model-based approach, the researcher defines a formal, computational model that performs the task at hand and whose performance matches that of a research participant. The internal variables in such a model might then be taken as proxies for latent variables computed in the brain. We discuss the potential advantages of such an approach for the study of the neural underpinnings of cognitive control and its pitfalls, and we make explicit the assumptions underlying the interpretation of data obtained using this approach. PMID:20437297

  14. Leaders' experiences and perceptions implementing activity-based funding and pay-for-performance hospital funding models: A systematic review.

    PubMed

    Baxter, Pamela E; Hewko, Sarah J; Pfaff, Kathryn A; Cleghorn, Laura; Cunningham, Barbara J; Elston, Dawn; Cummings, Greta G

    2015-08-01

    Providing cost-effective, accessible, high quality patient care is a challenge to governments and health care delivery systems across the globe. In response to this challenge, two types of hospital funding models have been widely implemented: (1) activity-based funding (ABF) and (2) pay-for-performance (P4P). Although health care leaders play a critical role in the implementation of these funding models, to date their perspectives have not been systematically examined. The purpose of this systematic review was to gain a better understanding of the experiences of health care leaders implementing hospital funding reforms within Organisation for Economic Cooperation and Development countries. We searched literature from 1982 to 2013 using: Medline, EMBASE, CINAHL, Academic Search Complete, Academic Search Elite, and Business Source Complete. Two independent reviewers screened titles, abstracts and full texts using predefined criteria. We included 2 mixed methods and 12 qualitative studies. Thematic analysis was used in synthesizing results. Five common themes and multiple subthemes emerged. Themes include: pre-requisites for success, perceived benefits, barriers/challenges, unintended consequences, and leader recommendations. Irrespective of which type of hospital funding reform was implemented, health care leaders described a complex process requiring the following: organizational commitment; adequate infrastructure; human, financial and information technology resources; change champions and a personal commitment to quality care. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  15. Neurocognitive Correlates of Young Drivers' Performance in a Driving Simulator.

    PubMed

    Guinosso, Stephanie A; Johnson, Sara B; Schultheis, Maria T; Graefe, Anna C; Bishai, David M

    2016-04-01

    Differences in neurocognitive functioning may contribute to driving performance among young drivers. However, few studies have examined this relation. This pilot study investigated whether common neurocognitive measures were associated with driving performance among young drivers in a driving simulator. Young drivers (mean age 19.8 years, SD = 1.9; N = 74) participated in a battery of neurocognitive assessments measuring general intellectual capacity (Full-Scale Intelligence Quotient, FSIQ) and executive functioning, including the Stroop Color-Word Test (cognitive inhibition), Wisconsin Card Sort Test-64 (cognitive flexibility), and Attention Network Task (alerting, orienting, and executive attention). Participants then drove in a simulated vehicle under two conditions: a baseline and a driving challenge. During the driving challenge, participants completed a verbal working memory task to increase demand on executive attention. Multiple regression models were used to evaluate the relations between the neurocognitive measures and driving performance under the two conditions. FSIQ, cognitive inhibition, and alerting were associated with better driving performance at baseline. FSIQ and cognitive inhibition were also associated with better driving performance during the verbal challenge. Measures of cognitive flexibility, orienting, and conflict executive control were not associated with driving performance under either condition. FSIQ and, to some extent, measures of executive function are associated with driving performance in a driving simulator. Further research is needed to determine if executive function is associated with more advanced driving performance under conditions that demand greater cognitive load. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  16. The virtues of the virtual world. Enhancing the technology/knowledge professional interface for life-long learning.

    PubMed

    Koerner, JoEllen Goertz

    2003-01-01

    Nurses are quintessential learners. Nested between the fields of science and technology, they face a professional mandate for life-long learning that has never been greater. The expanding demands for performance and quality coupled with the reality of diminishing time and resources increasingly frustrate and challenge providers in the field. By blending the best of current training and education with the emerging potential of virtual learning, new models for enhancing clinical reasoning and performance will simplify the challenges of complexity, moving it to higher order. In this transition lies the key to restoring the joy and commitment of professional practice while enhancing the capacity to care with competence.

  17. Mars Solar Power

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Kerslake, Thomas W.; Jenkins, Phillip P.; Scheiman, David A.

    2004-01-01

    NASA missions to Mars, both robotic and human, rely on solar arrays for the primary power system. Mars presents a number of challenges for solar power system operation, including a dusty atmosphere which modifies the spectrum and intensity of the incident solar illumination as a function of time of day, degradation of the array performance by dust deposition, and low temperature operation. The environmental challenges to Mars solar array operation will be discussed and test results of solar cell technology operating under Mars conditions will be presented, along with modeling of solar cell performance under Mars conditions. The design implications of advanced solar arrays for future Mars missions are discussed, and an example case, a Martian polar rover, is analyzed.
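
    A back-of-envelope sketch in Python of the kind of array performance modeling the abstract describes; the attenuation law, rated power, and dust-settling coefficient below are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def array_power(p_rated_w, tau, dust_loss_per_sol, sol):
        """Crude Mars array model: optical depth dims the beam (a Beer-Lambert
        stand-in) and settled dust compounds a small daily output loss."""
        atmosphere = np.exp(-tau)                  # dusty-sky attenuation
        dust = (1.0 - dust_loss_per_sol) ** sol    # cumulative deposition loss
        return p_rated_w * atmosphere * dust

    # Hypothetical 600 W array after 90 sols under moderate dust loading
    print(array_power(600.0, tau=0.5, dust_loss_per_sol=0.002, sol=90))
    ```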

  18. Closed Loop System Identification with Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Whorton, Mark S.

    2004-01-01

    High performance control design for a flexible space structure is challenging since high fidelity plant models are difficult to obtain a priori. Uncertainty in the control design models typically requires a very robust, low performance control design which must be tuned on-orbit to achieve the required performance. Closed loop system identification is often required to obtain a multivariable open loop plant model based on closed-loop response data. In order to provide an accurate initial plant model to guarantee convergence for standard local optimization methods, this paper presents a global parameter optimization method using genetic algorithms. A minimal representation of the state space dynamics is employed to mitigate the non-uniqueness and over-parameterization of general state space realizations. This control-relevant system identification procedure stresses the joint nature of the system identification and control design problem by seeking to obtain a model that minimizes the difference between the predicted and actual closed-loop performance.
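
    A minimal sketch of the idea, assuming nothing about the paper's actual code: a genetic-style search (selection plus mutation only, no crossover) fits two parameters of a toy discrete-time plant to noisy response data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(theta, u):
        """First-order discrete plant y[k] = a*y[k-1] + b*u[k-1] (illustrative)."""
        a, b = theta
        y = np.zeros_like(u)
        for k in range(1, len(u)):
            y[k] = a * y[k - 1] + b * u[k - 1]
        return y

    u = rng.normal(size=200)
    y_meas = simulate((0.8, 0.5), u) + 0.05 * rng.normal(size=200)

    def fitness(theta):
        return -np.sum((simulate(theta, u) - y_meas) ** 2)   # negative SSE

    pop = rng.uniform(-1.0, 1.0, size=(40, 2))
    for _ in range(100):
        scores = np.array([fitness(t) for t in pop])
        parents = pop[np.argsort(scores)][-20:]              # keep the fittest half
        children = parents[rng.integers(0, 20, size=20)] + 0.05 * rng.normal(size=(20, 2))
        pop = np.vstack([parents, children])                 # mutated copies replace the rest

    print(pop[np.argmax([fitness(t) for t in pop])])         # converges near (0.8, 0.5)
    ```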

  19. Model for Preparing Marketing and Business Teachers to Meet the Challenge of CTSO Leadership and Advisement

    ERIC Educational Resources Information Center

    Stanislawski, Debbie; Haltinner, Urs

    2009-01-01

    Objective: This article presents teacher education program graduates' perceptions of their preparedness to take on roles as FBLA, DECA, and BPA advisers. Purpose: The purpose of the study was to better understand impacts that the model of teacher preparation had on participants' confidence to perform their adviser roles. Method: A descriptive…

  20. Correlation between a Student's Performance on the Mental Cutting Test and Their 3D Parametric Modeling Ability

    ERIC Educational Resources Information Center

    Steinhauer, H. M.

    2012-01-01

    Engineering graphics has historically been viewed as a challenging course to teach as students struggle to grasp and understand the fundamental concepts and then to master their proper application. The emergence of stable, fast, affordable 3D parametric modeling platforms such as CATIA, Pro-E, and AutoCAD while providing several pedagogical…

  1. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

    Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, the Synthetic Minority Over-sampling Technique (SMOTE) was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors in affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
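
    As a rough illustration of the configuration the abstract reports (PCA for redundancy, SMOTE for imbalance, Random Forest for prediction), here is a hedged sketch using scikit-learn and imbalanced-learn; the feature matrix, labels, and dimensions are synthetic stand-ins, not the study's data.

    ```python
    import numpy as np
    from imblearn.over_sampling import SMOTE
    from imblearn.pipeline import Pipeline          # pipeline variant that accepts samplers
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(112, 400))                 # 112 patients x 400 radiomic features (synthetic)
    y = (rng.random(112) < 0.25).astype(int)        # unbalanced endpoint, e.g. recurrence

    model = Pipeline([
        ("pca", PCA(n_components=20)),              # tackle feature redundancy
        ("smote", SMOTE(random_state=0)),           # oversample minority class (training folds only)
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ])
    print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
    ```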

  2. Statistical Model of Dynamic Markers of the Alzheimer's Pathological Cascade.

    PubMed

    Balsis, Steve; Geraci, Lisa; Benge, Jared; Lowe, Deborah A; Choudhury, Tabina K; Tirso, Robert; Doody, Rachelle S

    2018-05-05

    Alzheimer's disease (AD) is a progressive disease reflected in markers across assessment modalities, including neuroimaging, cognitive testing, and evaluation of adaptive function. Identifying a single continuum of decline across assessment modalities in a single sample is statistically challenging because of the multivariate nature of the data. To address this challenge, we implemented advanced statistical analyses designed specifically to model complex data across a single continuum. We analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 1,056), focusing on indicators from the assessments of magnetic resonance imaging (MRI) volume, fluorodeoxyglucose positron emission tomography (FDG-PET) metabolic activity, cognitive performance, and adaptive function. Item response theory was used to identify the continuum of decline. Then, through a process of statistical scaling, indicators across all modalities were linked to that continuum and analyzed. Findings revealed that measures of MRI volume, FDG-PET metabolic activity, and adaptive function added measurement precision beyond that provided by cognitive measures, particularly in the relatively mild range of disease severity. More specifically, MRI volume, and FDG-PET metabolic activity become compromised in the very mild range of severity, followed by cognitive performance and finally adaptive function. Our statistically derived models of the AD pathological cascade are consistent with existing theoretical models.
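
    For readers unfamiliar with item response theory, the two-parameter logistic item curve is the basic building block for placing indicators from different modalities on one severity continuum; the discrimination and threshold values below are purely illustrative.

    ```python
    import numpy as np

    def p_abnormal(theta, a, b):
        """2PL item response: P(indicator abnormal | severity theta),
        with discrimination a and severity threshold b."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    severity = np.linspace(-3, 3, 7)
    print(p_abnormal(severity, a=1.5, b=-1.0))   # early marker, e.g. an MRI-volume indicator
    print(p_abnormal(severity, a=1.2, b=1.0))    # late marker, e.g. an adaptive-function item
    ```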

  3. Building confidence and credibility amid growing model and computing complexity

    NASA Astrophysics Data System (ADS)

    Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.

    2017-12-01

    As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known yet still difficult to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a Python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
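
    In the spirit of the ensemble-based verification described above (and not EVE's actual implementation), a minimal equivalence check might compare a climate statistic across two ensembles with a two-sample test:

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    # Hypothetical ensemble means of some climate statistic, 30 members each
    baseline = rng.normal(15.0, 2.0, size=30)    # reference build
    modified = rng.normal(15.0, 2.0, size=30)    # e.g. new compiler or machine settings

    stat, p = ks_2samp(baseline, modified)
    print(f"KS p-value = {p:.3f} -> {'statistically equivalent' if p > 0.05 else 'investigate'}")
    ```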

  4. Software Design Challenges in Time Series Prediction Systems Using Parallel Implementation of Artificial Neural Networks.

    PubMed

    Manikandan, Narayanan; Subha, Srinivasan

    2016-01-01

    The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming, and software built around prediction-based results is a particular challenge for designers. Time series forecasting of data such as currency exchange rates, stock prices, and weather has been an area of extensive research for the last three decades. In the early days, problems of financial analysis and prediction were solved by statistical models and methods. Over the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve these problems and obtain accurate predictions of future trends and prices. This paper addresses some architectural design issues for performance improvement through vectorising the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive approach for predicting exchange rates, which can be called a hybrid methodology for exchange rate prediction. The framework is tested to establish the accuracy and performance of the parallel algorithms used.
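
    One common reading of such hybrids is a linear econometric component plus a neural network fit to its residuals. The sketch below, with invented data and lag length, shows that pattern under those assumptions; it is not the paper's architecture.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    rate = np.cumsum(rng.normal(size=500)) + 0.5 * np.sin(np.arange(500) / 10.0)

    def lagged(x, p=5):
        """Stack p lagged copies of x as features for one-step-ahead prediction."""
        X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
        return X, x[p:]

    X, y = lagged(rate)
    ar = LinearRegression().fit(X, y)                          # linear AR component
    resid = y - ar.predict(X)
    nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                      random_state=0).fit(X, resid)            # nonlinear residual model
    hybrid = ar.predict(X) + nn.predict(X)
    print(np.mean((hybrid - y) ** 2))                          # in-sample MSE of the hybrid
    ```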

  6. Trajectory Optimization of Electric Aircraft Subject to Subsystem Thermal Constraints

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Chin, Jeffrey C.; Schnulo, Sydney L.; Burt, Jonathan M.; Gray, Justin S.

    2017-01-01

    Electric aircraft pose a unique design challenge in that they lack a simple way to reject waste heat from the power train. While conventional aircraft reject most of their excess heat in the exhaust stream, for electric aircraft this is not an option. To examine the implications of this challenge on electric aircraft design and performance, we developed a model of the electric subsystems for the NASA X-57 electric testbed aircraft. We then coupled this model with a model of simple 2D aircraft dynamics and used a Legendre-Gauss-Lobatto collocation optimal control approach to find optimal trajectories for the aircraft with and without thermal constraints. The results show that the X-57 heat rejection systems are well designed for maximum-range and maximum-efficiency flight, without the need to deviate from an optimal trajectory. Stressing the thermal constraints by reducing the cooling capacity or requiring faster flight has a minimal impact on performance, as the trajectory optimization technique is able to find flight paths which honor the thermal constraints with relatively minor deviations from the nominal optimal trajectory.
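
    A toy version of direct collocation conveys the structure: states and controls are decision variables, and the dynamics become equality (defect) constraints. This sketch uses trapezoidal defects and a one-state plant rather than the paper's Legendre-Gauss-Lobatto scheme and aircraft model; every number is illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    N, h = 20, 0.1

    def dynamics(x, u):
        return u - 0.1 * x            # illustrative single-integrator-with-drag plant

    def objective(z):
        x, u = z[:N], z[N:]
        return h * np.sum(u ** 2)     # minimize control effort

    def defects(z):
        x, u = z[:N], z[N:]
        f = dynamics(x, u)
        # trapezoidal collocation: x[k+1] - x[k] = h/2 * (f[k] + f[k+1])
        return x[1:] - x[:-1] - 0.5 * h * (f[1:] + f[:-1])

    cons = [{"type": "eq", "fun": defects},
            {"type": "eq", "fun": lambda z: [z[0], z[N - 1] - 1.0]}]  # x(0)=0, x(T)=1
    res = minimize(objective, np.ones(2 * N), constraints=cons)
    print(res.fun)
    ```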

  7. Dynamical simulation of E-ELT segmented primary mirror

    NASA Astrophysics Data System (ADS)

    Sedghi, B.; Muller, M.; Bauvir, B.

    2011-09-01

    The dynamical behavior of the primary mirror (M1) has an important impact on the control of the segments and the performance of the telescope. Control of large segmented mirrors with a large number of actuators and sensors and multiple control loops is a challenging problem in real life. In virtual life, modeling, simulation and analysis of the M1 bear similar difficulties and challenges. In order to capture the dynamics of the segment subunits (high frequency modes) and the telescope back structure (low frequency modes), high order dynamical models with a very large number of inputs and outputs need to be simulated. In this paper, different approaches for dynamical modeling and simulation of the M1 segmented mirror subject to various perturbations (e.g., sensor noise, wind load, vibrations, and earthquakes) are presented.

  8. Facing the challenges of multiscale modelling of bacterial and fungal pathogen–host interactions

    PubMed Central

    Schleicher, Jana; Conrad, Theresia; Gustafsson, Mika; Cedersund, Gunnar; Guthke, Reinhard

    2017-01-01

    Recent and rapidly evolving progress on high-throughput measurement techniques and computational performance has led to the emergence of new disciplines, such as systems medicine and translational systems biology. At the core of these disciplines lies the desire to produce multiscale models: mathematical models that integrate multiple scales of biological organization, ranging from molecular, cellular and tissue models to organ, whole-organism and population scale models. Using such models, hypotheses can systematically be tested. In this review, we present state-of-the-art multiscale modelling of bacterial and fungal infections, considering both the pathogen and host as well as their interaction. Multiscale modelling of the interactions of bacteria, especially Mycobacterium tuberculosis, with the human host is quite advanced. In contrast, models for fungal infections are still in their infancy, in particular regarding infections with the most important human pathogenic fungi, Candida albicans and Aspergillus fumigatus. We reflect on the current availability of computational approaches for multiscale modelling of host–pathogen interactions and point out current challenges. Finally, we provide an outlook for future requirements of multiscale modelling. PMID:26857943

  9. Turbulence modeling of free shear layers for high-performance aircraft

    NASA Technical Reports Server (NTRS)

    Sondak, Douglas L.

    1993-01-01

    The High Performance Aircraft (HPA) Grand Challenge of the High Performance Computing and Communications (HPCC) program involves the computation of the flow over a high performance aircraft. A variety of free shear layers, including mixing layers over cavities, impinging jets, blown flaps, and exhaust plumes, may be encountered in such flowfields. Since these free shear layers are usually turbulent, appropriate turbulence models must be utilized in computations in order to accurately simulate these flow features. The HPCC program is relying heavily on parallel computers. A Navier-Stokes solver (POVERFLOW) utilizing the Baldwin-Lomax algebraic turbulence model was developed and tested on a 128-node Intel iPSC/860. Algebraic turbulence models run very fast and give good results for many flowfields. For complex flowfields such as those mentioned above, however, they are often inadequate. It was therefore deemed that a two-equation turbulence model would be required for the HPA computations. The k-epsilon two-equation turbulence model was implemented on the Intel iPSC/860. Both the Chien low-Reynolds-number model and a generalized wall-function formulation were included.
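
    For context, the k-epsilon closure computes a turbulent (eddy) viscosity from the two transported quantities k and epsilon; the standard relation, with the usual model constant, is easy to state (the sample inputs below are illustrative):

    ```python
    def eddy_viscosity(rho, k, eps, c_mu=0.09):
        """Standard k-epsilon relation: mu_t = rho * C_mu * k**2 / eps."""
        return rho * c_mu * k ** 2 / eps

    print(eddy_viscosity(rho=1.2, k=0.5, eps=2.0))  # illustrative freestream values
    ```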

  10. Applying Risk Prediction Models to Optimize Lung Cancer Screening: Current Knowledge, Challenges, and Future Directions.

    PubMed

    Sakoda, Lori C; Henderson, Louise M; Caverly, Tanner J; Wernli, Karen J; Katki, Hormuzd A

    2017-12-01

    Risk prediction models may be useful for facilitating effective and high-quality decision-making at critical steps in the lung cancer screening process. This review provides a current overview of published lung cancer risk prediction models and their applications to lung cancer screening and highlights both challenges and strategies for improving their predictive performance and use in clinical practice. Since the 2011 publication of the National Lung Screening Trial results, numerous prediction models have been proposed to estimate the probability of developing or dying from lung cancer or the probability that a pulmonary nodule is malignant. These models appear to exhibit high discriminatory accuracy in identifying individuals at highest risk of lung cancer or differentiating malignant from benign pulmonary nodules. However, validation and critical comparison of the performance of these models in independent populations are limited. Little is also known about the extent to which risk prediction models are being applied in clinical practice and influencing decision-making processes and outcomes related to lung cancer screening. Current evidence is insufficient to determine which lung cancer risk prediction models are most clinically useful and how to best implement their use to optimize screening effectiveness and quality. To address these knowledge gaps, future research should be directed toward validating and enhancing existing risk prediction models for lung cancer and evaluating the application of model-based risk calculators and its corresponding impact on screening processes and outcomes.

  11. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald; El-Azab, Anter; Pernice, Michael

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (1) propose a novel approach for coupling mesoscale and macroscale models, (2) devise efficient numerical methods for simulating the coupled system, and (3) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  12. A model-based test for treatment effects with probabilistic classifications.

    PubMed

    Cavagnaro, Daniel R; Davis-Stober, Clintin P

    2018-05-21

    Within modern psychology, computational and statistical models play an important role in describing a wide variety of human behavior. Model selection analyses are typically used to classify individuals according to the model(s) that best describe their behavior. These classifications are inherently probabilistic, which presents challenges for performing group-level analyses, such as quantifying the effect of an experimental manipulation. We answer this challenge by presenting a method for quantifying treatment effects in terms of distributional changes in model-based (i.e., probabilistic) classifications across treatment conditions. The method uses hierarchical Bayesian mixture modeling to incorporate classification uncertainty at the individual level into the test for a treatment effect at the group level. We illustrate the method with several worked examples, including a reanalysis of the data from Kellen, Mata, and Davis-Stober (2017), and analyze its performance more generally through simulation studies. Our simulations show that the method is both more powerful and less prone to type-1 errors than Fisher's exact test when classifications are uncertain. In the special case where classifications are deterministic, we find a near-perfect power-law relationship between the Bayes factor, derived from our method, and the p value obtained from Fisher's exact test. We provide code in an online supplement that allows researchers to apply the method to their own data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
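
    The following is emphatically not the authors' hierarchical Bayesian mixture model; it is a small Monte Carlo illustration of the underlying problem the abstract describes: when classifications are probabilistic, a test computed on any single hard assignment (here Fisher's exact test) gives an unstable answer.

    ```python
    import numpy as np
    from scipy.stats import fisher_exact

    rng = np.random.default_rng(0)
    # Synthetic P(best model = A) for each participant, by condition
    p_ctrl = rng.beta(2, 5, size=30)
    p_treat = rng.beta(5, 2, size=30)

    def sampled_table(p1, p2):
        """Draw one hard classification per participant and tabulate by condition."""
        a = rng.random(p1.size) < p1
        b = rng.random(p2.size) < p2
        return [[a.sum(), (~a).sum()], [b.sum(), (~b).sum()]]

    pvals = [fisher_exact(sampled_table(p_ctrl, p_treat))[1] for _ in range(2000)]
    print(np.mean(pvals), np.quantile(pvals, [0.05, 0.95]))  # spread reflects classification uncertainty
    ```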

  13. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the resources tested). Moreover, the error rates of the models range between 2.1% and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149
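
    The three-stage decomposition lends itself to a trivially additive cost model. The sketch below is a toy with made-up rates, not the calibrated models from the paper:

    ```python
    def workflow_time_s(data_gb, bandwidth_gbps, queue_wait_s, n_tasks, task_s, workers):
        """Execution time = data transfer + queue wait + parallel reconstruction."""
        transfer = data_gb * 8.0 / bandwidth_gbps   # stage (i): storage -> compute
        compute = n_tasks * task_s / workers        # stage (iii): ideal parallel speedup
        return transfer + queue_wait_s + compute    # stage (ii) enters as queue_wait_s

    print(workflow_time_s(data_gb=500, bandwidth_gbps=10, queue_wait_s=300,
                          n_tasks=2048, task_s=12.0, workers=256))
    ```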

  14. Multidisciplinary Analysis and Control of High Performance Air Vehicles

    DTIC Science & Technology

    2005-05-06

    y(s) = G_p(s)u(s) + G_f(s)f(s)   (4), where G_p(s) = C(sI - A)^{-1}B + D and G_f(s) = C(sI - A)^{-1}R_1 + R_2. From a practical point of view, it is reasonable to make no... between various dynamics that makes control of this class of vehicles very challenging. 5. Human Resource Development: The grant was used to attract... characteristics of AHF make modeling and control of AHFVs especially challenging. Due to the strong coupling between the aerodynamics, the airframe, and...

  15. Implement balanced scorecard to translate strategic plan into actionable objectives.

    PubMed

    2004-09-01

    Faced with challenges ranging from declining reimbursement to staff shortages, health care organizations--integrated delivery systems, physician group practices, disease management providers, and others--increasingly are turning to general business models to map out step-by-step action plans for performance measurement and process improvement. Creating a "balanced scorecard" is an obvious starting point for assessing and improving clinical and financial performance.

  16. Automotive applications of chromogenic materials

    NASA Astrophysics Data System (ADS)

    Lynam, Niall R.

    1990-03-01

    Automobiles present both opportunities and challenges for large-area chromogenics. Opportunities include optical and thermal control of vehicle glazing along with optical control of rearview mirrors and privacy glass. Challenges include cost-effectively meeting automotive safety, performance, and reliability standards. Worldwide automobile production for 1987 is listed in Table 1. Of the roughly 33 million cars produced annually, approximately 8% are luxury models which are candidates for features such as automatically dimming rearview mirrors or variable opacity sunroofs. Thus copious commercial opportunities await whatever chromogenic technologies qualify for use in automobiles. This review will describe the performance, safety, and reliability/durability required for automotive use. Commercial opportunities and challenges will be discussed including cost factors and specifications. Chromogenic technologies such as electrochromism, liquid crystals and thermochromism will be reviewed in terms of how publicly announced technical developments match automotive needs and expectations. Construction and performance of existing or imminent chromogenic devices will be described. Finally, how opportunities and challenges of the automotive environment translate to other applications for chromogenic materials such as architectural or information display devices will be discussed. The objective is to generally review the applications, the technologies appropriate to these applications, and the automotive chromogenic devices available at the time of writing to match these applications.

  17. Computational Aspects of Data Assimilation and the ESMF

    NASA Technical Reports Server (NTRS)

    daSilva, A.

    2003-01-01

    Developing advanced data assimilation applications is a daunting scientific challenge. Independently developed components may have incompatible interfaces or may be written in different computer languages. The high-performance computer (HPC) platforms required by numerically intensive Earth system applications are complex, varied, rapidly evolving and multi-part systems themselves. Since the market for high-end platforms is relatively small, there is little robust middleware available to buffer the modeler from the difficulties of HPC programming. To complicate matters further, the collaborations required to develop large Earth system applications often span initiatives, institutions and agencies, involve geoscience, software engineering, and computer science communities, and cross national borders. The Earth System Modeling Framework (ESMF) project is a concerted response to these challenges. Its goal is to increase software reuse, interoperability, ease of use and performance in Earth system models through the use of a common software framework, developed in an open manner by leaders in the modeling community. The ESMF addresses the technical and, to some extent, the cultural aspects of Earth system modeling, laying the groundwork for addressing the more difficult scientific aspects, such as the physical compatibility of components, in the future. In this talk we will discuss the general philosophy and architecture of the ESMF, focussing on those capabilities useful for developing advanced data assimilation applications.

  18. 3D Laser Imprint Using a Smoother Ray-Traced Power Deposition Method

    NASA Astrophysics Data System (ADS)

    Schmitt, Andrew J.

    2017-10-01

    Imprinting of laser nonuniformities in directly driven ICF targets is a challenging problem to simulate accurately with large radiation-hydro codes. One of the most challenging aspects is the proper construction of the complex and rapidly changing laser interference structure driving the imprint using the reduced laser propagation models (usually ray-tracing) found in these codes. We have upgraded the modelling capability in our massively-parallel fastrad3d code by adding a more realistic EM-wave interference structure. This interference model adds an axial laser speckle to the previous transverse-only laser structure, and can be impressed on our improved smoothed 3D raytrace package. This latter package, which connects rays to form bundles and performs power deposition calculations on the bundles, is intended to decrease ray-trace noise (which can mask or add to imprint) while using fewer rays. We apply this improved model to 3D simulations of recent imprint experiments performed on the Omega-EP laser and the Nike laser that examined the reduction of imprinting due to very thin high-Z target coatings. We report on the conditions in which this new model makes a significant impact on the development of laser imprint. Supported by US DoE/NNSA.

  19. Predicting the performance uncertainty of a 1-MW pilot-scale carbon capture system after hierarchical laboratory-scale calibration and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Lai, Canhai; Marcy, Peter William

    2017-05-01

    A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system, then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design’s predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
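
    A heavily simplified sketch of the final prediction step: push samples from a calibrated parameter distribution through a surrogate of the upscaled model and read off a one-sided confidence bound on capture efficiency. Every number and the surrogate form are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    k_ads = rng.lognormal(mean=0.0, sigma=0.2, size=5000)   # calibrated kinetics samples (synthetic)

    def capture_efficiency(k, gas_flow):
        """Toy surrogate: first-order uptake over a flow-dependent residence time."""
        residence = 12.0 / gas_flow
        return 1.0 - np.exp(-k * residence)

    for flow in (3.0, 4.0, 5.0):
        lower = np.percentile(capture_efficiency(k_ads, flow), 5)
        print(flow, lower >= 0.90)   # does the 95%-confidence bound meet 90% capture?
    ```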

  20. Marital Conflict, Allostatic Load, and the Development of Children's Fluid Cognitive Performance

    PubMed Central

    Hinnant, J. Benjamin; El-Sheikh, Mona; Keiley, Margaret; Buckhalt, Joseph A.

    2013-01-01

    Relations between marital conflict, children’s respiratory sinus arrhythmia (RSA), and fluid cognitive performance were examined over three years to assess allostatic processes. Participants were 251 children who reported on marital conflict; baseline RSA and RSA reactivity (RSA-R) to a lab challenge were recorded, and fluid cognitive performance was measured using the Woodcock-Johnson III. A cross-lagged model showed that higher levels of marital conflict at age 8 predicted weaker RSA-R at age 9 for children with lower baseline RSA. A growth model showed that lower baseline RSA in conjunction with weaker RSA-R predicted the slowest development of fluid cognitive performance. Findings suggest that stress may affect development of physiological systems regulating attention, which are tied to the development of fluid cognitive performance. PMID:23534537

  1. Challenges of Digital Preservation for Cultural Heritage Institutions

    ERIC Educational Resources Information Center

    Evens, Tom; Hauttekeete, Laurence

    2011-01-01

    This article elaborates four major issues hampering the sustainability of digital preservation within cultural heritage institutions: digitization, metadata indexes, intellectual property rights management and business models. Using a case-study approach, the digitization of audiovisual collections within the performing arts institutions in…

  2. Improving parallel I/O autotuning with performance modeling

    DOE PAGES

    Behzad, Babak; Byna, Surendra; Wild, Stefan M.; ...

    2014-01-01

    Various layers of the parallel I/O subsystem offer tunable parameters for improving I/O performance on large-scale computers. However, searching through a large parameter space is challenging. We are working towards an autotuning framework for determining the parallel I/O parameters that can achieve good I/O performance for different data write patterns. In this paper, we characterize parallel I/O and discuss the development of predictive models for use in effectively reducing the parameter space. Furthermore, applying our technique on tuning an I/O kernel derived from a large-scale simulation code shows that the search time can be reduced from 12 hours to 2 hours, while achieving 54X I/O performance speedup.
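
    A hedged sketch of the general idea: fit a cheap predictive model on a few real benchmark runs, then use it to prune the tuning space before further benchmarking. The parameter names (stripe count, stripe size, collective-buffering nodes) and the benchmark function are illustrative, not the paper's setup.

    ```python
    from itertools import product
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    # Hypothetical tuning space: (stripe_count, stripe_size_mb, cb_nodes)
    space = np.array(list(product([1, 2, 4, 8], [1, 4, 16], [64, 128, 256])))

    def run_benchmark(cfg):
        """Stand-in for a real I/O benchmark run (returns seconds, synthetic)."""
        c, s, n = cfg
        return 100.0 / (c * s) + 0.01 * n + rng.normal(0.0, 0.1)

    idx = rng.choice(len(space), size=10, replace=False)     # a few real runs
    surrogate = RandomForestRegressor(random_state=0).fit(
        space[idx], [run_benchmark(space[i]) for i in idx])

    pred = surrogate.predict(space)                          # score the whole space cheaply
    print(space[np.argmin(pred)])                            # benchmark only promising configs
    ```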

  3. Towards a Generalizable Time Expression Model for Temporal Reasoning in Clinical Notes

    PubMed Central

    Velupillai, Sumithra; Mowery, Danielle L.; Abdelrahman, Samir; Christensen, Lee; Chapman, Wendy W

    2015-01-01

    Accurate temporal identification and normalization is imperative for many biomedical and clinical tasks such as generating timelines and identifying phenotypes. A major natural language processing challenge is developing and evaluating a generalizable temporal modeling approach that performs well across corpora and institutions. Our long-term goal is to create such a model. We initiate our work on reaching this goal by focusing on temporal expression (TIMEX3) identification. We present a systematic approach to 1) generalize existing solutions for automated TIMEX3 span detection, and 2) assess similarities and differences by various instantiations of TIMEX3 models applied on separate clinical corpora. When evaluated on the 2012 i2b2 and the 2015 Clinical TempEval challenge corpora, our conclusion is that our approach is successful – we achieve competitive results for automated classification, and we identify similarities and differences in TIMEX3 modeling that will be informative in the development of a simplified, general temporal model. PMID:26958265
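
    To make TIMEX3 span detection concrete, here is a minimal regex-based detector; the patterns are invented examples of the category, not the rules or models evaluated in the paper.

    ```python
    import re

    # Illustrative temporal-expression patterns (dates, deictic words, durations)
    TIMEX = re.compile(
        r"\b(\d{1,2}/\d{1,2}/\d{2,4}|\d{4}-\d{2}-\d{2}|yesterday|today|tomorrow|"
        r"last (week|month|year)|\d+ (days?|weeks?|months?) ago)\b", re.I)

    note = "Patient seen 03/04/2015; symptoms began 2 weeks ago and worsened yesterday."
    print([(m.group(), m.span()) for m in TIMEX.finditer(note)])
    ```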

  4. Automatic building information model query generation

    DOE PAGES

    Jiang, Yufei; Yu, Nan; Ming, Jiang; ...

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges on data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, which can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevents designers and engineers from taking advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts to use BIM to drive building design with less labour and lower overhead cost.

  5. Achievement goals, competition appraisals, and the psychological and emotional welfare of sport participants.

    PubMed

    Adie, James W; Duda, Joan L; Ntoumanis, Nikos

    2008-06-01

    Grounded in the 2x2 achievement goal framework (Elliot & McGregor, 2001), a model was tested examining the hypothesized relationships between approach and avoidance (mastery and performance) goals, challenge and threat appraisals of sport competition, and positive and negative indices of well-being (i.e., self-esteem, positive, and negative affect). A further aim was to determine the degree to which the cognitive appraisals mediated the relationship between the four achievement goals and the indicators of athletes' welfare. Finally, measurement and structural invariance was tested with respect to gender in the hypothesized model. An alternative model was also estimated specifying self-esteem as an antecedent of the four goals and cognitive appraisals. Four hundred and twenty-four team sport participants (mean age = 24.25) responded to a multisection questionnaire. Structural equation modeling analyses provided support for the hypothesized model only. Challenge and threat appraisals partially mediated the relationships observed between mastery-based goals and the well-being indicators. Lastly, the hypothesized model was found to be invariant across gender.

  6. Modeling challenges and approaches in simulating the Jovian synchrotron radiation belts from an in-situ perspective

    NASA Astrophysics Data System (ADS)

    Adumitroaie, V.; Oyafuso, F. A.; Levin, S.; Gulkis, S.; Janssen, M. A.; Santos-Costa, D.; Bolton, S. J.

    2017-12-01

    In order to obtain credible atmospheric composition retrieval values from Jupiter's observed radiative signature via Juno's MWR instrument, it is necessary to separate as robustly as possible the contributions from three emission sources: CMB, planet and synchrotron radiation belts. The numerical separation requires a refinement, based on the in-situ data, of a higher fidelity model for the synchrotron emission, namely the multi-parameter, multi-zonal model of Levin et al. (2001). This model employs an empirical electron energy distribution, which prior to the Juno mission, has been adjusted exclusively from VLA observations. At minimum 8 sets of perijove observations (i.e. by PJ9) have to be delivered to an inverse model for retrieval of the electron distribution parameters with the goal of matching the synchrotron emission observed along MWR's lines of sight. The challenges and approaches taken to perform this task are discussed here. The model will be continuously improved with the availability of additional information, both from the MWR and magnetometer instruments.

  8. Controlled comparison of species- and community-level models across novel climates and communities

    PubMed Central

    Maguire, Kaitlin C.; Blois, Jessica L.; Fitzpatrick, Matthew C.; Williams, John W.; Ferrier, Simon; Lorenz, David J.

    2016-01-01

    Species distribution models (SDMs) assume species exist in isolation and do not influence one another's distributions, thus potentially limiting their ability to predict biodiversity patterns. Community-level models (CLMs) capitalize on species co-occurrences to fit shared environmental responses of species and communities, and therefore may result in more robust and transferable models. Here, we conduct a controlled comparison of five paired SDMs and CLMs across changing climates, using palaeoclimatic simulations and fossil-pollen records of eastern North America for the past 21 000 years. Both SDMs and CLMs performed poorly when projected to time periods that are temporally distant and climatically dissimilar from those in which they were fit; however, CLMs generally outperformed SDMs in these instances, especially when models were fit with sparse calibration datasets. Additionally, CLMs did not over-fit training data, unlike SDMs. The expected emergence of novel climates presents a major forecasting challenge for all models, but CLMs may better rise to this challenge by borrowing information from co-occurring taxa. PMID:26962143
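
    A compact way to see the contrast on synthetic data: independent per-species logistic models (SDM-style) versus a classifier chain in which each species' model can condition on co-occurring species (a crude CLM stand-in). All data and names are invented.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.multioutput import ClassifierChain

    rng = np.random.default_rng(0)
    climate = rng.normal(size=(300, 3))                         # e.g. temp, precip, seasonality
    presence = ((climate @ rng.normal(size=(3, 5))
                 + rng.normal(size=(300, 5))) > 0).astype(int)  # 5 co-occurring taxa

    sdms = [LogisticRegression().fit(climate, presence[:, s]) for s in range(5)]
    clm = ClassifierChain(LogisticRegression(), random_state=0).fit(climate, presence)

    print(sdms[0].predict(climate[:3]))          # species modeled in isolation
    print(clm.predict(climate[:3])[:, 0])        # same species, co-occurrence-aware
    ```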

  9. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Richard Yorg

    2011-03-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in 2 phases with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less focus on scoping fire modeling. This was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges which will be discussed in the full paper.

  10. Biocellion: accelerating computer simulation of multicellular biological system models.

    PubMed

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
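
    To make "discrete agent-based modeling" concrete (this is a generic toy, not Biocellion's API or model routines), a cell-sorting rule can be written as local swaps kept only when they increase like-type contact:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    L = 32
    grid = rng.integers(0, 2, size=(L, L))          # two cell types on a periodic grid
    MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    def like_neighbors(g, i, j):
        return sum(g[(i + di) % L, (j + dj) % L] == g[i, j] for di, dj in MOVES)

    for _ in range(20000):
        i, j = rng.integers(L, size=2)
        di, dj = MOVES[rng.integers(4)]
        k, m = (i + di) % L, (j + dj) % L
        before = like_neighbors(grid, i, j) + like_neighbors(grid, k, m)
        grid[i, j], grid[k, m] = grid[k, m], grid[i, j]          # trial swap
        if like_neighbors(grid, i, j) + like_neighbors(grid, k, m) < before:
            grid[i, j], grid[k, m] = grid[k, m], grid[i, j]      # revert bad swaps
    print(grid[:4, :8])                              # domains coarsen over iterations
    ```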

  11. Advancing Cyberinfrastructure to support high resolution water resources modeling

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Ogden, F. L.; Jones, N.; Horsburgh, J. S.

    2012-12-01

    Addressing the problem of how the availability and quality of water resources at large scales are sensitive to climate variability, watershed alterations and management activities requires computational resources that combine data from multiple sources and support integrated modeling. Related cyberinfrastructure challenges include: 1) how can we best structure data and computer models to address this scientific problem through the use of high-performance and data-intensive computing, and 2) how can we do this in a way that discipline scientists without extensive computational and algorithmic knowledge and experience can take advantage of advances in cyberinfrastructure? This presentation will describe a new system called CI-WATER that is being developed to address these challenges and advance high-resolution water resources modeling in the Western U.S. We are building on existing tools that enable collaboration to develop model and data interfaces that link integrated system models running within an HPC environment to multiple data sources. Our goal is to enhance the use of computational simulation and data-intensive modeling to better understand water resources. Addressing water resource problems in the Western U.S. requires simulation of natural and engineered systems, as well as representation of legal (water rights) and institutional constraints alongside the representation of physical processes. We are establishing data services to represent the engineered infrastructure and the legal and institutional systems in a way that they can be used with multi-physics watershed modeling at high spatial resolution. These services will enable incorporation of location-specific information on water management infrastructure and systems into the assessment of regional water availability in the face of growing demands, uncertain future meteorological forcings, and existing prior-appropriations water rights. This presentation will discuss the informatics challenges being tackled in this project around data management and easy-to-use access to high-performance computing.

  12. Exemplar for simulation challenges: Large-deformation micromechanics of Sylgard 184/glass microballoon syntactic foams.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Judith Alice; Long, Kevin Nicholas

    2018-05-01

    Sylgard® 184/Glass Microballoon (GMB) potting material is currently used in many NW systems. Analysts need a macroscale constitutive model that can predict material behavior under complex loading and damage evolution. To address this need, ongoing modeling and experimental efforts have focused on the study of damage evolution in these materials. Micromechanical finite element simulations that resolve individual GMB and matrix components promote discovery and better understanding of the material behavior. With these simulations, we can study the role of the GMB volume fraction, time-dependent damage, behavior under confined vs. unconfined compression, and the effects of partial damage. These simulations are challenging and push the boundaries of capability even with the high performance computing tools available at Sandia. We summarize the major challenges and the current state of this modeling effort, as an exemplar of micromechanical modeling needs that can motivate advances in future computing efforts.

  13. Critical evaluation of the EU-technical guidance on shelf-life studies for L. monocytogenes on RTE-foods: a case study for smoked salmon.

    PubMed

    Vermeulen, A; Devlieghere, F; De Loy-Hendrickx, A; Uyttendaele, M

    2011-01-31

    In November 2008, a technical guidance document on the challenge test protocol was published by the EU CRL (Community Reference Laboratory) for L. monocytogenes. This document describes the practical aspects of executing a challenge test in order to comply with EU Commission Regulation N° 2073/2005 on microbiological criteria for foodstuffs. In this guideline two approaches are specified. On the one hand, challenge tests based on actual measurements at the beginning and end of the shelf life of products stored under a reasonably foreseeable temperature profile are described. On the other hand, the growth potential is calculated by predictive models using a validated maximum specific growth rate. The present study evaluates the two above-mentioned approaches on cold smoked salmon, a typical risk product for L. monocytogenes. The focus is on: (i) the relative importance of intrabatch versus interbatch variability, (ii) the concept of a simple challenge test based on actual data at the start and end of shelf life versus a modelling approach and (iii) the interpretation of challenge tests. In addition, available tertiary models were used to estimate the growth potential of these products based on their initial physicochemical characteristics. From the results it could be concluded that in some batches considerable intrabatch variability was obtained. In general, however, the interbatch variability was significantly higher than the intrabatch variability. Concerning the two above-mentioned methods for challenge tests, it can be stated that the first approach (the simple challenge test) can be set up rather rapidly and is cost-effective for SMEs (small and medium enterprises) but provides only a single isolated outcome. This implies that challenge tests should be redone if changes occur in composition or production process. The second (modelling) approach, using extended challenge tests to establish growth parameters, requires larger set-ups and more complicated data analysis, which makes it more expensive. Using available tertiary models has the major advantage that the most important intrinsic and extrinsic factors can be included in the prediction of the growth parameter. It was clear that product-specific models, taking into account the interaction effects with background flora, performed best. Regarding the challenge tests, it can be concluded that the best approach will depend on the particular context, as in the end both approaches lead to the same conclusion. Copyright © 2010 Elsevier B.V. All rights reserved.
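
    The two approaches reduce to simple arithmetic that is easy to illustrate. The sketch below, with invented numbers (not data from the study), computes the growth potential delta from start and end counts as in the simple challenge test, and a crude model-based prediction from a maximum specific growth rate, assuming exponential growth with no lag phase.

```python
import math

# Approach 1: simple challenge test -> growth potential (log10 units).
N_start, N_end = 2.0e2, 7.9e3            # CFU/g at start and end of shelf life
delta = math.log10(N_end) - math.log10(N_start)
print(f"growth potential = {delta:.2f} log10")   # > 0.5 log10: supports growth

# Approach 2: model-based -> predicted increase over shelf life from a
# validated maximum specific growth rate (illustrative value, natural-log basis).
mu_max = 0.005                            # per hour, assumed for this product
shelf_life_h = 21 * 24                    # 21-day shelf life
increase = mu_max * shelf_life_h / math.log(10)   # convert ln growth to log10
print(f"predicted increase = {increase:.2f} log10")
```

    The 0.5 log10 criterion for "supports growth" comes from the regulation's guidance; everything else above is illustrative.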

  14. Opportunities and challenges in developing risk prediction models with electronic health records data: a systematic review.

    PubMed

    Goldstein, Benjamin A; Navar, Ann Marie; Pencina, Michael J; Ioannidis, John P A

    2017-01-01

    Electronic health records (EHRs) are an increasingly common data source for clinical risk prediction, presenting both unique analytic opportunities and challenges. We sought to evaluate the current state of EHR-based risk prediction modeling through a systematic review of clinical prediction studies using EHR data. We searched PubMed for articles published from 2009 to 2014 that reported on the use of an EHR to develop a risk prediction model. Articles were extracted by two reviewers, and we abstracted information on study design, use of EHR data, model building, and performance from each publication and its supplementary documentation. We identified 107 articles from 15 different countries. Studies were generally very large (median sample size = 26,100) and utilized a diverse array of predictors. Most used validation techniques (n = 94 of 107) and reported model coefficients for reproducibility (n = 83). However, studies did not fully leverage the breadth of EHR data, as they uncommonly used longitudinal information (n = 37) and employed relatively few predictor variables (median = 27 variables). Fewer than half of the studies were multicenter (n = 50), and only 26 performed validation across sites. Many studies did not fully address biases of EHR data such as missing data or loss to follow-up. Average c-statistics for different outcomes were: mortality (0.84), clinical prediction (0.83), hospitalization (0.71), and service utilization (0.71). EHR data present both opportunities and challenges for clinical risk prediction. There is room for improvement in designing such studies. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Modelling High-temperature EBPR by Incorporating Glycogen and GAOs: Challenges from a Preliminary Study.

    PubMed

    Liau, Kee Fui; Yeoh, Hak Koon; Shoji, Tadashi; Chua, Adeline Seak May; Ho, Pei Yee

    2017-01-01

    Recently reported kinetic and stoichiometric parameters of Activated Sludge Model No. 2d (ASM2d) for high-temperature EBPR processes suggested that the absence of glycogen in the model contributed to underestimation of PHA accumulation at 32 °C. Here, two modified ASM2d models were used to further explore the contribution of glycogen to the process. The ASM2d-1G model incorporated glycogen metabolism by PAOs (polyphosphate-accumulating organisms), while the ASM2d-2G model further included processes by GAOs (glycogen-accumulating organisms). These models were calibrated and validated using experimental data at 32 °C. The ASM2d-1G model supported the hypothesis that the excess PHA was attributable to glycogen, but remained inadequate to capture the dynamics of glycogen without considering GAO activities. The ASM2d-2G model performed better, but it was challenging to calibrate as it often led to wash-out of either PAOs or GAOs. The associated hurdles are highlighted and additional efforts for calibrating ASM2d-2G more effectively are proposed.

  16. Probabilistic image modeling with an extended chain graph for human activity recognition and image segmentation.

    PubMed

    Zhang, Lei; Zeng, Zhi; Ji, Qiang

    2011-09-01

    Chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis has been very limited due to the lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to a CG of more general topology, together with the associated methods for learning and inference in such a model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.

  17. Prediction of overall survival for patients with metastatic castration-resistant prostate cancer: development of a prognostic model through a crowdsourced challenge with open clinical trial data

    PubMed Central

    Guinney, Justin; Wang, Tao; Laajala, Teemu D; Winner, Kimberly Kanigel; Bare, J Christopher; Neto, Elias Chaibub; Khan, Suleiman A; Peddinti, Gopal; Airola, Antti; Pahikkala, Tapio; Mirtti, Tuomas; Yu, Thomas; Bot, Brian M; Shen, Liji; Abdallah, Kald; Norman, Thea; Friend, Stephen; Stolovitzky, Gustavo; Soule, Howard; Sweeney, Christopher J; Ryan, Charles J; Scher, Howard I; Sartor, Oliver; Xie, Yang; Aittokallio, Tero; Zhou, Fang Liz; Costello, James C

    2016-01-01

    Background: Improvements to prognostic models in metastatic castration-resistant prostate cancer have the potential to augment clinical trial design and guide treatment strategies. In partnership with Project Data Sphere, a not-for-profit initiative allowing data from cancer clinical trials to be shared broadly with researchers, we designed an open-data, crowdsourced, DREAM (Dialogue for Reverse Engineering Assessments and Methods) challenge to not only identify a better prognostic model for prediction of survival in patients with metastatic castration-resistant prostate cancer but also engage a community of international data scientists to study this disease. Methods: Data from the comparator arms of four phase 3 clinical trials in first-line metastatic castration-resistant prostate cancer were obtained from Project Data Sphere, comprising 476 patients treated with docetaxel and prednisone from the ASCENT2 trial, 526 patients treated with docetaxel, prednisone, and placebo in the MAINSAIL trial, 598 patients treated with docetaxel, prednisone or prednisolone, and placebo in the VENICE trial, and 470 patients treated with docetaxel and placebo in the ENTHUSE 33 trial. Datasets consisting of more than 150 clinical variables were curated centrally, including demographics, laboratory values, medical history, lesion sites, and previous treatments. Data from ASCENT2, MAINSAIL, and VENICE were released publicly to be used as training data to predict the outcome of interest—namely, overall survival. Clinical data were also released for ENTHUSE 33, but data for outcome variables (overall survival and event status) were hidden from the challenge participants so that ENTHUSE 33 could be used for independent validation. Methods were evaluated using the integrated time-dependent area under the curve (iAUC). The reference model, based on eight clinical variables and a penalised Cox proportional-hazards model, was used to compare method performance. Further validation was done using data from a fifth trial—ENTHUSE M1—in which 266 patients with metastatic castration-resistant prostate cancer were treated with placebo alone. Findings: 50 independent methods were developed to predict overall survival and were evaluated through the DREAM challenge. The top performer was based on an ensemble of penalised Cox regression models (ePCR), which uniquely identified predictive interaction effects with immune biomarkers and markers of hepatic and renal function. Overall, ePCR outperformed all other methods (iAUC 0·791; Bayes factor >5) and surpassed the reference model (iAUC 0·743; Bayes factor >20). Both the ePCR model and reference models stratified patients in the ENTHUSE 33 trial into high-risk and low-risk groups with significantly different overall survival (ePCR: hazard ratio 3·32, 95% CI 2·39–4·62, p<0·0001; reference model: 2·56, 1·85–3·53, p<0·0001). The new model was validated further on the ENTHUSE M1 cohort with similarly high performance (iAUC 0·768). Meta-analysis across all methods confirmed previously identified predictive clinical variables and revealed aspartate aminotransferase as an important, albeit previously under-reported, prognostic biomarker. Interpretation: Novel prognostic factors were delineated, and the assessment of 50 methods developed by independent international teams establishes a benchmark for development of methods in the future. The results of this effort show that data-sharing, when combined with a crowdsourced challenge, is a robust and powerful framework to develop new prognostic models in advanced prostate cancer. Funding: Sanofi US Services, Project Data Sphere. PMID:27864015
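
    The winning approach, an ensemble of penalised Cox regression models, can be sketched compactly. The snippet below is an illustration of the concept only, on synthetic data and using the lifelines library; it is not the challenge-winning code, and all variable names and simulation parameters are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age": rng.normal(70, 8, n),
    "ast": rng.lognormal(3.2, 0.4, n),   # aspartate aminotransferase, U/L
    "hb": rng.normal(12.0, 1.5, n),      # haemoglobin, g/dL
})
lin = 0.03 * df["age"] + 0.5 * np.log(df["ast"]) - 0.1 * df["hb"]
df["T"] = rng.exponential(24.0 * np.exp(lin.mean() - lin))  # survival, months
df["E"] = (rng.uniform(size=n) < 0.8).astype(int)           # 1 = death observed

# Ensemble of penalised Cox models: fit on bootstrap resamples, then average
# each patient's predicted log partial hazard across the ensemble.
scores = np.zeros(n)
n_models = 20
for _ in range(n_models):
    boot = df.sample(n, replace=True)
    cph = CoxPHFitter(penalizer=0.5)     # ridge-penalised Cox regression
    cph.fit(boot, duration_col="T", event_col="E")
    scores += np.ravel(cph.predict_log_partial_hazard(df))
scores /= n_models                        # ensemble risk score per patient
print(scores[:5])
```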

  18. Prediction of overall survival for patients with metastatic castration-resistant prostate cancer: development of a prognostic model through a crowdsourced challenge with open clinical trial data.

    PubMed

    Guinney, Justin; Wang, Tao; Laajala, Teemu D; Winner, Kimberly Kanigel; Bare, J Christopher; Neto, Elias Chaibub; Khan, Suleiman A; Peddinti, Gopal; Airola, Antti; Pahikkala, Tapio; Mirtti, Tuomas; Yu, Thomas; Bot, Brian M; Shen, Liji; Abdallah, Kald; Norman, Thea; Friend, Stephen; Stolovitzky, Gustavo; Soule, Howard; Sweeney, Christopher J; Ryan, Charles J; Scher, Howard I; Sartor, Oliver; Xie, Yang; Aittokallio, Tero; Zhou, Fang Liz; Costello, James C

    2017-01-01

    Improvements to prognostic models in metastatic castration-resistant prostate cancer have the potential to augment clinical trial design and guide treatment strategies. In partnership with Project Data Sphere, a not-for-profit initiative allowing data from cancer clinical trials to be shared broadly with researchers, we designed an open-data, crowdsourced, DREAM (Dialogue for Reverse Engineering Assessments and Methods) challenge to not only identify a better prognostic model for prediction of survival in patients with metastatic castration-resistant prostate cancer but also engage a community of international data scientists to study this disease. Data from the comparator arms of four phase 3 clinical trials in first-line metastatic castration-resistant prostate cancer were obtained from Project Data Sphere, comprising 476 patients treated with docetaxel and prednisone from the ASCENT2 trial, 526 patients treated with docetaxel, prednisone, and placebo in the MAINSAIL trial, 598 patients treated with docetaxel, prednisone or prednisolone, and placebo in the VENICE trial, and 470 patients treated with docetaxel and placebo in the ENTHUSE 33 trial. Datasets consisting of more than 150 clinical variables were curated centrally, including demographics, laboratory values, medical history, lesion sites, and previous treatments. Data from ASCENT2, MAINSAIL, and VENICE were released publicly to be used as training data to predict the outcome of interest-namely, overall survival. Clinical data were also released for ENTHUSE 33, but data for outcome variables (overall survival and event status) were hidden from the challenge participants so that ENTHUSE 33 could be used for independent validation. Methods were evaluated using the integrated time-dependent area under the curve (iAUC). The reference model, based on eight clinical variables and a penalised Cox proportional-hazards model, was used to compare method performance. Further validation was done using data from a fifth trial-ENTHUSE M1-in which 266 patients with metastatic castration-resistant prostate cancer were treated with placebo alone. 50 independent methods were developed to predict overall survival and were evaluated through the DREAM challenge. The top performer was based on an ensemble of penalised Cox regression models (ePCR), which uniquely identified predictive interaction effects with immune biomarkers and markers of hepatic and renal function. Overall, ePCR outperformed all other methods (iAUC 0·791; Bayes factor >5) and surpassed the reference model (iAUC 0·743; Bayes factor >20). Both the ePCR model and reference models stratified patients in the ENTHUSE 33 trial into high-risk and low-risk groups with significantly different overall survival (ePCR: hazard ratio 3·32, 95% CI 2·39-4·62, p<0·0001; reference model: 2·56, 1·85-3·53, p<0·0001). The new model was validated further on the ENTHUSE M1 cohort with similarly high performance (iAUC 0·768). Meta-analysis across all methods confirmed previously identified predictive clinical variables and revealed aspartate aminotransferase as an important, albeit previously under-reported, prognostic biomarker. Novel prognostic factors were delineated, and the assessment of 50 methods developed by independent international teams establishes a benchmark for development of methods in the future. The results of this effort show that data-sharing, when combined with a crowdsourced challenge, is a robust and powerful framework to develop new prognostic models in advanced prostate cancer. 
Funding: Sanofi US Services, Project Data Sphere. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Challenges of developing a cardiovascular risk calculator for patients with rheumatoid arthritis.

    PubMed

    Crowson, Cynthia S; Rollefstad, Silvia; Kitas, George D; van Riel, Piet L C M; Gabriel, Sherine E; Semb, Anne Grete

    2017-01-01

    Cardiovascular disease (CVD) risk calculators designed for use in the general population do not accurately predict the risk of CVD among patients with rheumatoid arthritis (RA), who are at increased risk of CVD. The process of developing risk prediction models involves numerous issues. Our goal was to develop a CVD risk calculator for patients with RA. Thirteen cohorts of patients with RA originating from 10 different countries (UK, Norway, Netherlands, USA, Sweden, Greece, South Africa, Spain, Canada and Mexico) were combined. CVD risk factors and RA characteristics at baseline, in addition to information on CVD outcomes, were collected. Cox models were used to develop a CVD risk calculator, considering traditional CVD risk factors and RA characteristics. Model performance was assessed using measures of discrimination and calibration with 10-fold cross-validation. A total of 5638 RA patients without prior CVD were included (mean age: 55 [SD: 14] years, 76% female). During a mean follow-up of 5.8 years (30,139 person-years), 389 patients developed a CVD event. Event rates varied between cohorts, necessitating inclusion of high- and low-risk strata in the models. The multivariable analyses yielded two risk prediction models, each including age, sex, presence of hypertension, current smoking, and the ratio of total cholesterol to high-density lipoprotein cholesterol, along with either the Disease Activity Score based on a 28-joint count and erythrocyte sedimentation rate (DAS28-ESR) or the Health Assessment Questionnaire (HAQ). Unfortunately, performance of these models was similar to that of general-population CVD risk calculators. Efforts to develop a specific CVD risk calculator for patients with RA yielded two potential models including RA disease characteristics, but neither demonstrated improved performance compared with risk calculators designed for use in the general population. Challenges encountered and lessons learned are discussed in detail.
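
    The validation strategy described, discrimination assessed with 10-fold cross-validation, is straightforward to sketch. The snippet below uses synthetic data with lifelines and scikit-learn; the covariates and effect sizes are invented and merely stand in for the cohort variables named above.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({"age": rng.normal(55, 14, n),
                   "das28": rng.normal(3.5, 1.2, n)})
lin = 0.04 * df["age"] + 0.3 * df["das28"]
df["T"] = rng.exponential(6.0 * np.exp(lin.mean() - lin))  # years to CVD event
df["E"] = (rng.uniform(size=n) < 0.3).astype(int)          # event observed?

cindex = []
for tr, te in KFold(n_splits=10, shuffle=True, random_state=0).split(df):
    cph = CoxPHFitter().fit(df.iloc[tr], duration_col="T", event_col="E")
    risk = np.ravel(cph.predict_partial_hazard(df.iloc[te]))
    # Harrell's c-index expects higher scores for longer survival, so negate.
    cindex.append(concordance_index(df["T"].iloc[te], -risk, df["E"].iloc[te]))
print(f"10-fold cross-validated c-index: {np.mean(cindex):.3f}")
```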

  20. Time dependent data, time independent models: challenges of updating Australia's National Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Griffin, J.; Clark, D.; Allen, T.; Ghasemi, H.; Leonard, M.

    2017-12-01

    Standard probabilistic seismic hazard assessment (PSHA) simulates earthquake occurrence as a time-independent process. However, paleoseismic studies in slowly deforming regions such as Australia show compelling evidence that large earthquakes on individual faults cluster within active periods, followed by long periods of quiescence. Therefore the instrumental earthquake catalog, which forms the basis of PSHA earthquake recurrence calculations, may only capture the state of the system over the period of the catalog. Together this means that the data informing our PSHA may not be truly time-independent. This poses challenges in developing PSHAs for typical design probabilities (such as a 10% probability of exceedance in 50 years): Is the present state observed through the instrumental catalog useful for estimating the next 50 years of earthquake hazard? Can paleo-earthquake data, which show variations in earthquake frequency over time scales of tens of thousands of years or more, be robustly included in such PSHA models? Can a single PSHA logic tree be useful over a range of different probabilities of exceedance? In developing an updated PSHA for Australia, decadal-scale data based on instrumental earthquake catalogs (i.e., alternative area-based source models and smoothed seismicity models) are integrated with paleo-earthquake data through inclusion of a fault source model. Use of time-dependent non-homogeneous Poisson models allows earthquake clustering to be modeled on fault sources with sufficient paleo-earthquake data. This study assesses the performance of alternative models by extracting decade-long segments of the instrumental catalog, developing earthquake probability models based on the remaining catalog, and testing performance against the extracted component of the catalog. Although this provides insights into model performance over the short term, for longer timescales it is recognised that model choice is subject to considerable epistemic uncertainty. Therefore a formal expert elicitation process has been used to assign weights to alternative models for the 2018 update to Australia's national PSHA.
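
    To make the time-independence assumption concrete, the sketch below contrasts the standard homogeneous Poisson exceedance probability with a non-homogeneous variant whose rate is elevated during an active period. The rates are illustrative only and are not taken from the Australian hazard model.

```python
import math

def poisson_exceedance(rate_per_year, t_years):
    """P(at least one exceedance in t years), homogeneous Poisson process."""
    return 1.0 - math.exp(-rate_per_year * t_years)

# The common "10% in 50 years" design probability corresponds to a mean
# return period of about 475 years under the time-independent assumption.
print(poisson_exceedance(1.0 / 475.0, 50))          # ~0.10

def nhpp_exceedance(rate_fn, t_years, steps=1000):
    """Same probability for a non-homogeneous Poisson process, integrating
    a time-varying rate lambda(t) numerically."""
    dt = t_years / steps
    integral = sum(rate_fn(i * dt) * dt for i in range(steps))
    return 1.0 - math.exp(-integral)

# A fault inside an earthquake cluster: rate temporarily tripled.
print(nhpp_exceedance(lambda t: 3.0 / 475.0, 50))   # ~0.27
```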

  1. Multi-objective optimization for generating a weighted multi-model ensemble

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2017-12-01

    Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. When only a single evaluation metric is considered, there is a consensus on how to assign weighting factors: the weighting factor for each model is proportional to its performance score or inversely proportional to its error. While this conventional approach can provide appropriate combinations of multiple models, it faces a major challenge when multiple metrics are under consideration: a simple average of multiple performance scores or model ranks does not address the trade-off between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions across a range of evaluation metrics, to combine multiple performance metrics for global climate models and their dynamically downscaled regional climate simulations over North America, and to generate a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly, with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide more reliable future projections.
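
    The trade-off the abstract describes can be illustrated with a toy Pareto (non-dominated) filter: a model is kept only if no other model beats it on every metric, and weights are then spread over the surviving models. Model names and error values below are invented, and the study's actual optimization is more elaborate.

```python
# Two conflicting error metrics per model (bias, RMSE); lower is better.
errors = {
    "GCM-A": (0.2, 1.0),
    "GCM-B": (0.5, 0.6),
    "GCM-C": (0.4, 0.9),
    "GCM-D": (0.6, 1.1),   # dominated by GCM-A on both metrics
}

def pareto_front(models):
    """Keep models not dominated (matched or beaten on all metrics) by another."""
    front = []
    for name, e in models.items():
        dominated = any(
            all(o <= v for o, v in zip(other, e)) and other != e
            for oname, other in models.items() if oname != name
        )
        if not dominated:
            front.append(name)
    return front

front = pareto_front(errors)                       # A, B and C survive
inv = {m: 1.0 / sum(errors[m]) for m in front}     # crude inverse-error score
weights = {m: round(v / sum(inv.values()), 3) for m, v in inv.items()}
print(front, weights)   # a plain average of metrics would hide this trade-off
```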

  2. Food for trans-Atlantic rowers: a menu planning model and case study.

    PubMed

    Clark, Nancy; Coleman, Cato; Figure, Kerri; Mailhot, Tom; Zeigler, John

    2003-06-01

    Every 4 years, rowers from around the world compete in a 50- to 60-day trans-Atlantic rowing challenge. These ultra-distance rowers require a diet that provides adequate calories, protein, vitamins, minerals, and fluids so they can perform well day after day, minimize fatigue, and stay healthy. Yet, the rowers are confronted with menu planning challenges. The food needs to be lightweight, compact, sturdy, non-spoiling in tropical temperatures, calorie dense, easy to prepare, quick to cook, and good tasting. Financial concerns commonly add another menu planning challenge. The purpose of this case study is to summarize the rowers' food experiences and to provide guidance for sports nutrition professionals who work with ultra-endurance athletes embarking on a physical challenge with similar food requirements. The article provides food and nutrition recommendations as well as practical considerations for ultra-distance athletes. We describe an 8,000 calorie per day menu planning model that uses food exchanges based on familiar, tasty, and reasonably priced supermarket foods that provide the required nutrients and help contain financial costs.

  3. Performance of Environmental Resources of a Tourist Destination

    PubMed Central

    2013-01-01

    Despite the apparent importance of destinations’ environmental resources, there appears to be little theoretical and applied research explicitly focusing on destination environmental supply. This research attempts to address this gap in the literature. First, it reviews and evaluates the body of research in tourism environmental resources and proposes a conceptual model to test their performance. The model combines tourism supply–demand view with importance–performance gaps and was used to survey tourism in Slovenia. The results show that the studied destination uses its environmental resources too extensively and that Slovenian environmental tourism experience does not meet visitors’ expectations. This finding challenges Slovenian policy makers, who position Slovenia as a green destination. The proposed model can form the basis for further conceptual and empirical research into the tourism contributions of environmental resources. In its present form, it can be used to examine environmental performance and to suggest policy implications for any destination. PMID:29901033

  4. Revel8or: Model Driven Capacity Planning Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement conducted automatically. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  5. ARC-2009-ACD09-0144-127

    NASA Image and Video Library

    2009-07-19

    MoonFest: From Apollo to LCROSS and Beyond public event at NASA's Ames Research Center, Moffett Field, Calif. The day included scientific talks, model rocket launches on the flight line, musical performances, family-friendly activities and more. At the moon boots booth, walking was quite a challenge.

  6. Applying chemical engineering concepts to non-thermal plasma reactors

    NASA Astrophysics Data System (ADS)

    Affonso Nobrega, Pedro; Gaunand, Alain; Rohani, Vandad; Cauneau, François; Fulcheri, Laurent

    2018-06-01

    Process scale-up remains a considerable challenge for environmental applications of non-thermal plasmas. Understanding the impact of reactor hydrodynamics on the performance of the process is a key step to overcoming this challenge. In this work, we apply chemical engineering concepts to analyse the impact that different non-thermal plasma reactor configurations and regimes, such as laminar or plug flow, may have on reactor performance. We do this in the particular context of the removal of pollutants by non-thermal plasmas, for which a simplified model is available. We generalise this model to different reactor configurations and, under certain hypotheses, we show that a reactor in the laminar regime may behave significantly differently from one in the plug flow regime, which is often assumed in the non-thermal plasma literature. On the other hand, we show that a packed-bed reactor behaves very similarly to one in the plug flow regime. Beyond those results, the reader will find in this work a quick introduction to chemical reaction engineering concepts.
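
    The laminar-versus-plug-flow contrast has a classic chemical-reaction-engineering illustration. The sketch below compares first-order pollutant removal in a plug flow reactor with a laminar-flow reactor modelled through the standard segregated-flow residence time distribution E(t) = tau^2 / (2 t^3) for t >= tau/2. The rate constant and residence time are invented, and this textbook model is only an analogue of the paper's analysis.

```python
import numpy as np
from scipy.integrate import quad

k, tau = 2.0, 1.0                  # rate constant (1/s), mean residence time (s)

# Plug flow: every fluid element spends exactly tau in the reactor.
X_plug = 1.0 - np.exp(-k * tau)

# Laminar flow (segregated): average exp(-k t) over the residence time
# distribution of fully developed pipe flow.
E = lambda t: tau**2 / (2.0 * t**3)     # valid for t >= tau/2
unconverted, _ = quad(lambda t: np.exp(-k * t) * E(t), tau / 2.0, np.inf)
X_laminar = 1.0 - unconverted

print(f"plug flow X = {X_plug:.3f}, laminar X = {X_laminar:.3f}")
# Laminar conversion is lower: fast fluid near the centreline short-circuits.
```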

  7. Drop "impact" on an airfoil surface.

    PubMed

    Wu, Zhenlong

    2018-06-01

    Drop impact on an airfoil surface takes place in drop-laden two-phase flow conditions such as rain and icing, which are encountered by wind turbines or airplanes. This phenomenon is characterized by complex nonlinear interactions that manifest rich flow physics and pose unique modeling challenges. In this article, the state of the art of research on drop impact on airfoil surfaces in the natural drop-laden two-phase flow environment is presented. The potential flow physics, hazards, characteristic parameters, droplet trajectory calculation, drop impact dynamics and effects are discussed. The key points in establishing the governing equations for a drop-laden flow lie in the modeling of raindrop splash and water film. The various factors affecting drop impact dynamics and the effects of drop impact on airfoil aerodynamic performance are summarized. Finally, the principal challenges and future research directions in the field, as well as some promising measures to deal with the adverse effects of drop-laden flows on airfoil performance, are proposed. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. From Here to Autonomy.

    PubMed

    Endsley, Mica R

    2017-02-01

    As autonomous and semiautonomous systems are developed for automotive, aviation, cyber, robotics and other applications, the ability of human operators to effectively oversee and interact with them when needed poses a significant challenge. An automation conundrum exists in which as more autonomy is added to a system, and its reliability and robustness increase, the lower the situation awareness of human operators and the less likely that they will be able to take over manual control when needed. The human-autonomy systems oversight model integrates several decades of relevant autonomy research on operator situation awareness, out-of-the-loop performance problems, monitoring, and trust, which are all major challenges underlying the automation conundrum. Key design interventions for improving human performance in interacting with autonomous systems are integrated in the model, including human-automation interface features and central automation interaction paradigms comprising levels of automation, adaptive automation, and granularity of control approaches. Recommendations for the design of human-autonomy interfaces are presented and directions for future research discussed.

  9. GAIA payload module mechanical development

    NASA Astrophysics Data System (ADS)

    Touzeau, S.; Sein, E.; Lebranchu, C.

    2017-11-01

    Gaia is the European Space Agency's cornerstone mission for global space astrometry. Its goal is to make the largest, most precise three-dimensional map of our Galaxy by surveying an unprecedented number of stars. This paper gives an overview of the mechanical system engineering and verification of the payload module. This development includes several technical challenges. First of all, the very high stability performance required for the mission is a key design driver. This is achieved through the extensive use of Silicon Carbide (Boostec® SiC) for both structures and mirrors, a high degree of mechanical and thermal decoupling between the payload and service modules, and the use of high-performance engineering tools. Compliance of payload mass and volume with launcher capability is another key challenge, as is the development and manufacturing of the 3.2-meter diameter toroidal primary structure. The spacecraft mechanical verification follows an innovative approach, with direct testing on the flight model, without any dedicated structural model.

  10. Economic Modeling Considerations for Rare Diseases.

    PubMed

    Pearson, Isobel; Rothwell, Ben; Olaye, Andrew; Knight, Christopher

    2018-05-01

    To identify challenges that affect the feasibility and rigor of economic models in rare diseases and strategies that manufacturers have employed in health technology assessment submissions to demonstrate the value of new orphan products that have limited study data. Targeted reviews of PubMed, the National Institute for Health and Care Excellence's (NICE's) Highly Specialised Technologies (HST), and the Scottish Medicines Consortium's (SMC's) ultra-orphan submissions were performed. A total of 19 PubMed studies, 3 published NICE HSTs, and 11 ultra-orphan SMC submissions were eligible for inclusion. In rare diseases, a number of different factors may affect the model's ability to comply with good practice recommendations. Many products for the treatment of rare diseases have an incomplete efficacy and safety profile at product launch. In addition, there is often limited available natural history and epidemiology data. Information on the direct and indirect cost burden of an orphan disease also may be limited, making it difficult to estimate the potential economic benefit of treatment. These challenges can prevent accurate estimation of a new product's benefits in relation to costs. Approaches that can address such challenges include using patient and/or clinician feedback to inform model assumptions; data from disease analogues; epidemiological techniques, such as matching-adjusted indirect comparison; and long-term data collection. Modeling in rare diseases is often challenging; however, a number of approaches are available to support the development of model structures and the collation of input parameters and to manage uncertainty. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
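
    One of the approaches named above, matching-adjusted indirect comparison (MAIC), is compact enough to sketch. Below, individual patient data from a hypothetical trial are re-weighted so that the weighted baseline means match published aggregate means from a comparator trial. The data and target values are invented, and the estimator follows the standard method-of-moments formulation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 200
X = np.column_stack([rng.normal(60, 10, n),       # age in our IPD trial
                     rng.binomial(1, 0.4, n)])    # male (0/1)
target = np.array([65.0, 0.55])                   # comparator trial's means

# Method of moments: weights w_i = exp(Xc_i . b) with Xc centred on the
# target means; minimising sum_i exp(Xc_i . b) (a convex objective) zeroes
# the weighted mean of Xc, i.e. matches the weighted means to the target.
Xc = X - target
objective = lambda b: np.sum(np.exp(Xc @ b))
b = minimize(objective, np.zeros(X.shape[1]), method="BFGS").x
w = np.exp(Xc @ b)

print("weighted baseline means:", (w[:, None] * X).sum(axis=0) / w.sum())
print("effective sample size:", w.sum() ** 2 / (w ** 2).sum())
```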

  11. Estimation and uncertainty analysis of dose response in an inter-laboratory experiment

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Rösslein, Matthias; Elliott, John T.; Petersen, Elijah J.

    2016-02-01

    An inter-laboratory experiment for the evaluation of toxic effects of NH2-polystyrene nanoparticles on living human cancer cells was performed with five participating laboratories. Previously published results from nanocytotoxicity assays are often contradictory, mostly due to challenges related to producing a reliable cytotoxicity assay protocol for use with nanomaterials. Specific challenges include reproducibly preparing nanoparticle dispersions, biological variability from testing living cell lines, and the potential for nano-related interference effects. In this experiment, such challenges were addressed by developing a detailed experimental protocol and using a specially designed 96-well plate layout which incorporated a range of control measurements to assess multiple factors such as nanomaterial interference, pipetting accuracy, cell seeding density, and instrument performance. Detailed analysis of these control measurements showed that good control of the experiments was attained by all participants in most cases. The main measurement objective of the study was the estimation of a dose response relationship between concentration of the nanoparticles and metabolic activity of the living cells, under several experimental conditions. The dose curve estimation was achieved by embedding a three-parameter logistic curve in a three-level Bayesian hierarchical model, accounting for uncertainty due to all known experimental conditions as well as between-laboratory variability in a top-down manner. Computation was performed using Markov chain Monte Carlo methods. The fit of the model was evaluated using Bayesian posterior predictive probabilities and found to be satisfactory.
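
    The curve at the heart of that analysis is easy to sketch. Below, a three-parameter logistic dose-response function is fit to synthetic viability data with ordinary nonlinear least squares; the parameter values are invented, and the sketch deliberately omits the paper's three-level Bayesian hierarchy and MCMC treatment of between-laboratory variability.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic3(conc, top, ec50, slope):
    """Three-parameter logistic: full response 'top' decaying around EC50."""
    return top / (1.0 + (conc / ec50) ** slope)

rng = np.random.default_rng(3)
conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])   # ug/mL, illustrative
y = logistic3(conc, top=100.0, ec50=30.0, slope=1.5)
y += rng.normal(0.0, 4.0, conc.size)                    # measurement noise

popt, pcov = curve_fit(logistic3, conc, y, p0=[100.0, 20.0, 1.0])
perr = np.sqrt(np.diag(pcov))                           # 1-sigma uncertainties
for name, v, s in zip(["top", "EC50", "slope"], popt, perr):
    print(f"{name} = {v:.2f} +/- {s:.2f}")
```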

  12. PHARAO laser source flight model: Design and performances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lévèque, T., E-mail: thomas.leveque@cnes.fr; Faure, B.; Esnault, F. X.

    2015-03-15

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser-cooled cesium clock specially designed for operation in space, and the laser source is one of its main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  13. Impact of supplemental instruction leader on the success of supplemental instruction model

    NASA Astrophysics Data System (ADS)

    Mahabaduge, Hasitha; Haslam, Jeanne

    Supplemental instruction utilizes peer-assisted study sessions to provide review of course material and an opportunity to discuss and work through problems. The impact of supplemental instruction on student performance is well researched, and the approach is used in a large number of universities around the world due to its proven success. However, the impact of the student leader, who plays a significant role in this model, is rarely discussed in the literature. We present a case study of the impact of the student leader on the success of the supplemental instruction model. This case study was done for an introductory physics course, correlating student performance with the supplemental instruction sessions students attended. Further analysis revealed that the academic performance and work ethic of the student leader have a significant impact on the success of the supplemental instruction model. Important factors to consider when selecting a student leader, the challenges involved, and possible remedies are also discussed.

  14. Multicore Programming Challenges

    NASA Astrophysics Data System (ADS)

    Perrone, Michael

    The computer industry is facing fundamental challenges that are driving a major change in the design of computer processors. Due to restrictions imposed by quantum physics, one historical path to higher computer processor performance - by increased clock frequency - has come to an end. Increasing clock frequency now leads to power consumption costs that are too high to justify. As a result, we have seen in recent years that the processor frequencies have peaked and are receding from their high point. At the same time, competitive market conditions are giving business advantage to those companies that can field new streaming applications, handle larger data sets, and update their models to market conditions faster. The desire for newer, faster and larger is driving continued demand for higher computer performance.

  15. Unique Challenges for Modeling Defect Dynamics in Concentrated Solid-Solution Alloys

    NASA Astrophysics Data System (ADS)

    Zhao, Shijun; Weber, William J.; Zhang, Yanwen

    2017-11-01

    Recently developed concentrated solid solution alloys (CSAs) are shown to have improved performance under irradiation that depends strongly on the number of alloying elements, alloying species, and their concentrations. In contrast to conventional dilute alloys, CSAs are composed of multiple principal elements situated randomly in a simple crystalline lattice. As a result, the intrinsic disorder has a profound influence on energy dissipation pathways and defect evolution when these CSAs are subjected to energetic particle irradiation. Extraordinary irradiation resistance, including suppression of void formation by two orders of magnitude at an elevated temperature, has been achieved with increasing compositional complexity in CSAs. Unfortunately, the loss of translational invariance associated with the intrinsic chemical disorder poses great challenges to theoretical modeling at the electronic and atomic levels. Based on recent computer simulation results for a set of novel Ni-containing, face-centered cubic CSAs, we review theoretical modeling progress in handling disorder in CSAs and underscore the impact of disorder on defect dynamics. We emphasize in particular the unique challenges associated with the description of defect dynamics in CSAs.

  16. Layout optimization of DRAM cells using rigorous simulation model for NTD

    NASA Astrophysics Data System (ADS)

    Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe

    2014-03-01

    DRAM chip space is mainly determined by the size of the memory cell array patterns, which consist of periodic memory cell features and the edges of the periodic array. Resolution Enhancement Techniques (RET) are used to optimize the periodic pattern process performance. Computational lithography, such as source mask optimization (SMO) to find the optimal off-axis illumination and optical proximity correction (OPC) combined with model-based SRAF placement, is applied to print patterns on target. For 20nm memory cell optimization we see challenges that demand additional tool competence for layout optimization. The first challenge is a memory core pattern of brick-wall type with a k1 of 0.28, which allows only two spectral beams to interfere. We show how to analytically derive the only valid geometrically limited source. Another consequence of the two-beam interference limitation is a "super stable" core pattern, with the advantage of high depth of focus (DoF) but also low sensitivity to proximity corrections or changes of contact aspect ratio. This makes array edge correction very difficult. The edge can be the most critical pattern since it forms the transition from the very stable regime of periodic patterns to the non-periodic periphery, so it combines the most critical pitch with the highest susceptibility to defocus. The above challenge turns layout correction into a complex optimization task that must find a solution with optimal process stability, taking into account DoF, exposure dose latitude (EL), mask error enhancement factor (MEEF) and mask manufacturability constraints. This can only be achieved by simultaneously considering all criteria while placing and sizing SRAFs and main mask features. The second challenge is the use of a negative tone development (NTD) type resist, which has a strong resist effect and is difficult to characterize experimentally because negative resist profile taper angles perturb CD-at-bottom characterization by scanning electron microscope (SEM) measurements. The high resist impact and difficult model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We discuss the need for a rigorous mask optimization process for DRAM contact cell layouts, yielding mask layouts that are optimal in process performance, mask manufacturability and accuracy. In this paper, we show the step-by-step process from analytical illumination source derivation, through NTD- and application-tailored model calibration, to layout optimization such as OPC and SRAF placement. Finally, the work is verified with simulation and experimental results on wafer.

  17. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry.

  18. Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling

    NASA Astrophysics Data System (ADS)

    March, Salvatore T.; Allen, Gove N.

    Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.

  19. Rasch family models in e-learning: analyzing architectural sketching with a digital pen.

    PubMed

    Scalise, Kathleen; Cheng, Nancy Yen-Wen; Oskui, Nargas

    2009-01-01

    Since architecture students studying design drawing are usually assessed qualitatively on the basis of their final products, the challenges and stages of their learning have remained masked. To clarify the challenges in design drawing, we have been using the BEAR Assessment System and Rasch family models to measure levels of understanding for individuals and groups, in order to correct pedagogical assumptions and tune teaching materials. This chapter discusses the analysis of 81 drawings created by architectural students to solve a space layout problem, collected and analyzed with digital pen-and-paper technology. The approach allows us to map developmental performance criteria and perceive achievement overlaps in learning domains assumed separate, and then re-conceptualize a three-part framework to represent learning in architectural drawing. Results and measurement evidence from the assessment and Rasch modeling are discussed.
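
    For readers unfamiliar with the Rasch family, the core of the dichotomous Rasch model is a one-line formula: the probability that a student of ability theta succeeds on an item of difficulty b is exp(theta - b) / (1 + exp(theta - b)). A minimal sketch with invented values:

```python
import math

def rasch_p(theta, b):
    """P(success) under the dichotomous Rasch model (logistic in theta - b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student one logit above an item's difficulty succeeds ~73% of the time;
# at theta == b the probability is exactly 0.5.
print(rasch_p(theta=1.0, b=0.0))   # ~0.731
print(rasch_p(theta=0.0, b=0.0))   # 0.5
```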

  20. Open challenges in structure-based virtual screening: Receptor modeling, target flexibility consideration and active site water molecules description.

    PubMed

    Spyrakis, Francesca; Cavasotto, Claudio N

    2015-10-01

    Structure-based virtual screening is currently an established tool in drug lead discovery projects. Although in the last years the field saw an impressive progress in terms of algorithm development, computational performance, and retrospective and prospective applications in ligand identification, there are still long-standing challenges where further improvement is needed. In this review, we consider the conceptual frame, state-of-the-art and recent developments of three critical "structural" issues in structure-based drug lead discovery: the use of homology modeling to accurately model the binding site when no experimental structures are available, the necessity of accounting for the dynamics of intrinsically flexible systems as proteins, and the importance of considering active site water molecules in lead identification and optimization campaigns. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Scale effect challenges in urban hydrology highlighted with a distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Ichiba, Abdellah; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel; Bompard, Philippe; Ten Veldhuis, Marie-Claire

    2018-01-01

    Hydrological models are extensively used in urban water management, development and evaluation of future scenarios and research activities. There is a growing interest in the development of fully distributed and grid-based models. However, some complex questions related to scale effects are not yet fully understood and still remain open issues in urban hydrology. In this paper we propose a two-step investigation framework to illustrate the extent of scale effects in urban hydrology. First, fractal tools are used to highlight the scale dependence observed within distributed data input into urban hydrological models. Then an intensive multi-scale modelling work is carried out to understand scale effects on hydrological model performance. Investigations are conducted using a fully distributed and physically based model, Multi-Hydro, developed at Ecole des Ponts ParisTech. The model is implemented at 17 spatial resolutions ranging from 100 to 5 m. Results clearly exhibit scale effect challenges in urban hydrology modelling. The applicability of fractal concepts highlights the scale dependence observed within distributed data. Patterns of geophysical data change when the size of the observation pixel changes. The multi-scale modelling investigation confirms scale effects on hydrological model performance. Results are analysed over three ranges of scales identified in the fractal analysis and confirmed through modelling. This work also discusses some remaining issues in urban hydrology modelling related to the availability of high-quality data at high resolutions, and model numerical instabilities as well as the computation time requirements. The main findings of this paper enable a replacement of traditional methods of model calibration by innovative methods of model resolution alteration based on the spatial data variability and scaling of flows in urban hydrology.
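
    The fractal step in the framework above can be illustrated with a toy box-counting estimate on a synthetic binary raster (random fill here, which is not truly fractal; real land-use data would show the scale dependence the paper describes). Grid size, fill fraction and box sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 256
grid = rng.random((N, N)) < 0.2          # e.g. impervious-cover pixels

sizes = [2, 4, 8, 16, 32]
counts = []
for s in sizes:
    # Collapse the raster into (N/s x N/s) boxes; count boxes with any cover.
    boxes = grid.reshape(N // s, s, N // s, s).any(axis=(1, 3))
    counts.append(boxes.sum())

# Slope of log(count) vs log(1/size) estimates the box-counting dimension.
slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
print(f"box-counting dimension ~ {slope:.2f}")
```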

  2. Implementation and Evaluation of Multiple Adaptive Control Technologies for a Generic Transport Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.; Nguyen, Nhan T.; Krishnakumar, Kalmanje S.

    2010-01-01

    Presented here is the evaluation of multiple adaptive control technologies for a generic transport aircraft simulation. For this study, seven model reference adaptive control (MRAC) based technologies were considered. Each technology was integrated into an identical dynamic-inversion control architecture and tuned using a methodology based on metrics and specific design requirements. Simulation tests were then performed to evaluate each technology's sensitivity to time delay, flight condition, model uncertainty, and artificially induced cross-coupling. The resulting robustness and performance characteristics were used to identify potential strengths, weaknesses, and integration challenges of the individual adaptive control technologies.
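
    The common core of the technologies compared can be sketched with a scalar model reference adaptive controller using the classic MIT-rule/gradient update. Plant, reference model and gains below are invented for illustration; the study's controllers wrap much richer dynamic-inversion loops.

```python
dt, T = 0.01, 10.0
a, b = -1.0, 3.0             # "unknown" plant: xdot = a*x + b*u
am, bm = -4.0, 4.0           # reference model: xmdot = am*xm + bm*r
gamma = 2.0                  # adaptation gain

x = xm = 0.0
kx, kr = 0.0, 0.0            # adaptive feedback / feedforward gains
for k in range(int(T / dt)):
    r = 1.0 if (k * dt) % 4 < 2 else -1.0   # square-wave reference command
    u = kx * x + kr * r
    e = x - xm                              # tracking error vs reference model
    # Gradient (MIT-rule style) adaptation laws, valid here since b > 0.
    kx -= gamma * e * x * dt
    kr -= gamma * e * r * dt
    x += (a * x + b * u) * dt               # integrate plant (Euler)
    xm += (am * xm + bm * r) * dt           # integrate reference model
print(f"final tracking error: {x - xm:.4f}")
```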

  3. Structured literature review of responses of cattle to viral and bacterial pathogens causing bovine respiratory disease complex.

    PubMed

    Grissett, G P; White, B J; Larson, R L

    2015-01-01

    Bovine respiratory disease (BRD) is an economically important disease of cattle and continues to be an intensely studied topic. However, literature summarizing the time between pathogen exposure and clinical signs, shedding, and seroconversion is minimal. A structured review of the published literature was performed to determine cattle responses (time from pathogen exposure to clinical signs, shedding, and seroconversion) in challenge models using common BRD viral and bacterial pathogens. After review, a descriptive analysis of the published challenge studies was performed. Inclusion criteria were single-pathogen challenge studies with no treatment or vaccination evaluating the outcomes of interest: clinical signs, shedding, and seroconversion. Pathogens of interest included: bovine viral diarrhea virus (BVDV), bovine herpesvirus type 1 (BHV-1), parainfluenza-3 virus, bovine respiratory syncytial virus, Mannheimia haemolytica, Mycoplasma bovis, Pasteurella multocida, and Histophilus somni. Thirty-five studies and 64 trials were included for analysis. The median time to resolution of clinical signs after BVDV challenge was 15 days, and shedding was not detected on day 12 postchallenge. BHV-1 shedding resolved on day 12 and clinical signs on day 12 postchallenge. Bovine respiratory syncytial virus shedding ceased on day 9, and the median time to resolution of clinical signs was day 12 postchallenge. Clinical signs of M. haemolytica challenge resolved 8 days postchallenge. This literature review and descriptive analysis can serve as a resource to assist in designing challenge model studies and potentially aid in estimating the duration of clinical disease and shedding after natural pathogen exposure. Copyright © 2015 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  4. Successfully performing a university student's role despite disabilities: challenges of an inclusive environment and appropriate task modification.

    PubMed

    Rochette, Annie; Loiselle, Frederic

    2012-01-01

    To reflect on what it means to successfully perform a university student's role despite the presence of impairments. The Disability Creation Process (DCP) model is used as a tool to zoom in on the different activities and tasks required for a successful education, as well as to describe how the social and physical environment can be made as inclusive as possible to compensate for different impairments. One activity in the student's role (reading) is used to illustrate and reflect on potential challenges in compensating for impairments by way of environmental or task modifications. The student's role is a complex one, characterized by different actions such as getting admitted, moving around, attending courses, studying and participating in student life. Environmental factors or time can facilitate or impede the level of participation in the education domain. One challenge may be to differentiate between compensation for learning (processes) as compared to outcomes (competency level for future employment), as well as to determine how much assistance is acceptable. Intuitive single-case analysis should be replaced by a systematic analysis relying on a conceptual model such as the DCP. To avoid discrimination and to ensure transparency, an acceptable amount of compensation for an activity should be defined.

  5. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), and the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
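
    The three-stage execution-time model lends itself to a compact sketch. The additive form and all parameter names below are illustrative assumptions, not the authors' exact formulation:

        def estimate_workflow_time(data_gb, bandwidth_gbps, queue_wait_s,
                                   n_iterations, work_per_iter_gflop,
                                   cluster_gflops):
            """Additive model: data transfer + queue wait + compute."""
            t_transfer = data_gb * 8.0 / bandwidth_gbps            # seconds
            t_compute = n_iterations * work_per_iter_gflop / cluster_gflops
            return t_transfer + queue_wait_s + t_compute

        # Compare two hypothetical sites for the same reconstruction job:
        # a nearby cluster with a long queue versus a remote, faster one
        sites = {"near": (10.0, 3600.0, 5e4), "remote": (1.0, 300.0, 2e5)}
        for name, (bw, wait, flops) in sites.items():
            t = estimate_workflow_time(500.0, bw, wait, 200, 1e4, flops)
            print(f"{name}: {t / 3600:.2f} h")

    Selecting the optimum resource then reduces to evaluating such an estimate per candidate site, which is the kind of decision behind the reported 3.13× speedup.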

  6. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), and the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.

  7. Optimal feature selection using a modified differential evolution algorithm and its effectiveness for prediction of heart disease.

    PubMed

    Vivekanandan, T; Sriman Narayana Iyengar, N Ch

    2017-11-01

    Enormous data growth in multiple domains has posed a great challenge for data processing and analysis techniques. In particular, the traditional record maintenance strategy has been replaced in the healthcare system. It is vital to develop a model that is able to handle the huge amount of e-healthcare data efficiently. In this paper, the challenging tasks of selecting critical features from the enormous set of available features and diagnosing heart disease are carried out. Feature selection is one of the most widely used pre-processing steps in classification problems. A modified differential evolution (DE) algorithm is used to perform feature selection for cardiovascular disease and optimization of selected features. Of the 10 available strategies for the traditional DE algorithm, the seventh strategy, which is represented by DE/rand/2/exp, is considered for comparative study. The performance analysis of the developed modified DE strategy is given in this paper. With the selected critical features, prediction of heart disease is carried out using fuzzy AHP and a feed-forward neural network. Various performance measures of integrating the modified differential evolution algorithm with fuzzy AHP and a feed-forward neural network in the prediction of heart disease are evaluated in this paper. The accuracy of the proposed hybrid model is 83%, which is higher than that of some other existing models. In addition, the prediction time of the proposed hybrid model is also evaluated and has shown promising results. Copyright © 2017 Elsevier Ltd. All rights reserved.
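
    As a sketch of the underlying search, the DE/rand/2 mutation named above can be adapted to feature selection by thresholding a continuous genome into a binary mask. The placeholder fitness function, the parameters, and the binomial (rather than exponential) crossover below are simplifying assumptions, not the paper's configuration:

        import numpy as np

        rng = np.random.default_rng(0)

        def fitness(mask, X, y):
            # Placeholder objective: mean absolute feature-target correlation
            # of the selected subset, penalized by subset size. The paper
            # scores subsets with its fuzzy AHP / neural-network pipeline.
            if not mask.any():
                return -1.0
            corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in np.where(mask)[0]]
            return float(np.mean(corr)) - 0.01 * mask.sum()

        def de_feature_selection(X, y, pop_size=20, F=0.5, CR=0.9, gens=50):
            d = X.shape[1]
            pop = rng.random((pop_size, d))      # continuous genomes in [0, 1]
            scores = np.array([fitness(g > 0.5, X, y) for g in pop])
            for _ in range(gens):
                for i in range(pop_size):
                    # DE/rand/2: base vector plus two difference vectors
                    r = rng.choice([j for j in range(pop_size) if j != i], 5,
                                   replace=False)
                    v = pop[r[0]] + F * (pop[r[1]] - pop[r[2]]) \
                                  + F * (pop[r[3]] - pop[r[4]])
                    # Binomial crossover for brevity; DE/rand/2/exp uses
                    # exponential crossover at this step.
                    trial = np.where(rng.random(d) < CR, v, pop[i])
                    s = fitness(trial > 0.5, X, y)
                    if s >= scores[i]:           # greedy selection
                        pop[i], scores[i] = trial, s
            return pop[np.argmax(scores)] > 0.5  # best binary feature mask

        X = rng.random((100, 13))
        y = (X[:, 0] + X[:, 3] > 1.0).astype(float)
        print("selected features:", np.where(de_feature_selection(X, y))[0])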

  8. Recurrent Convolutional Neural Networks: A Better Model of Biological Object Recognition.

    PubMed

    Spoerer, Courtney J; McClure, Patrick; Kriegeskorte, Nikolaus

    2017-01-01

    Feedforward neural networks provide the dominant model of how the brain performs visual object recognition. However, these networks lack the lateral and feedback connections, and the resulting recurrent neuronal dynamics, of the ventral visual pathway in the human and non-human primate brain. Here we investigate recurrent convolutional neural networks with bottom-up (B), lateral (L), and top-down (T) connections. Combining these types of connections yields four architectures (B, BT, BL, and BLT), which we systematically test and compare. We hypothesized that recurrent dynamics might improve recognition performance in the challenging scenario of partial occlusion. We introduce two novel occluded object recognition tasks to test the efficacy of the models, digit clutter (where multiple target digits occlude one another) and digit debris (where target digits are occluded by digit fragments). We find that recurrent neural networks outperform feedforward control models (approximately matched in parametric complexity) at recognizing objects, both in the absence of occlusion and in all occlusion conditions. Recurrent networks were also found to be more robust to the inclusion of additive Gaussian noise. Recurrent neural networks are better in two respects: (1) they are more neurobiologically realistic than their feedforward counterparts; (2) they are better in terms of their ability to recognize objects, especially under challenging conditions. This work shows that computer vision can benefit from using recurrent convolutional architectures and suggests that the ubiquitous recurrent connections in biological brains are essential for task performance.
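
    A recurrent convolutional layer of the kind described (here with bottom-up and lateral connections; top-down input from a higher layer would be added the same way) can be sketched in a few lines of PyTorch. Channel counts, kernel sizes, and the number of time steps are illustrative assumptions:

        import torch
        import torch.nn as nn

        class BLCell(nn.Module):
            """Bottom-up (B) plus lateral (L) recurrent convolution."""
            def __init__(self, in_ch, out_ch):
                super().__init__()
                self.bottom_up = nn.Conv2d(in_ch, out_ch, 3, padding=1)
                self.lateral = nn.Conv2d(out_ch, out_ch, 3, padding=1)

            def forward(self, x, h_prev):
                # Same input x at every step; the hidden state recurs
                return torch.relu(self.bottom_up(x) + self.lateral(h_prev))

        cell = BLCell(1, 16)
        x = torch.randn(8, 1, 28, 28)          # e.g., occluded digit images
        h = torch.zeros(8, 16, 28, 28)
        for t in range(4):                     # unroll 4 recurrent time steps
            h = cell(x, h)                     # weights shared across steps
        print(h.shape)

    Because the weights are shared across time steps, recurrence adds dynamics and effective depth at roughly the parameter count of a single feedforward layer, which is how comparisons like those above can hold parametric complexity approximately constant.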

  9. Modeling and Simulation of Control Actuation System with Fuzzy-PID Logic Controlled Brushless Motor Drives for Missiles Glider Applications

    PubMed Central

    Muniraj, Murali; Arulmozhiyal, Ramaswamy

    2015-01-01

    A control actuation system has been used extensively in automotive, aerospace, and defense applications. The major challenges in modeling a control actuation system are rise time, maximum peak-to-peak overshoot, and response to a nonlinear system with percentage error. This paper addresses the challenges in modeling and real-time implementation of a control actuation system for missile and glider applications. As an alternative, a fuzzy-PID controller is proposed for the BLDC motor drive, followed by a linkage mechanism to actuate fins in missiles and gliders. The proposed system realizes better rise time and less overshoot while operating in extreme nonlinear dynamic system conditions. A mathematical model of the BLDC motor is derived in state space form. The complete control actuation system is modeled in the MATLAB/Simulink environment and verified by performing simulation studies. A real-time prototype of the control actuation system is developed with a dSPACE-1104 hardware controller and a detailed analysis is carried out to confirm the viability of the proposed system. PMID:26613102
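
    A minimal illustration of the fuzzy-PID idea is a PID loop whose proportional gain is scheduled by fuzzy memberships over the error magnitude. The memberships, gains, and first-order plant below are illustrative placeholders, not the paper's tuned design:

        def tri(x, a, b, c):
            """Triangular membership on [a, c] peaking at b."""
            return max(min((x - a) / (b - a + 1e-12),
                           (c - x) / (c - b + 1e-12)), 0.0)

        def fuzzy_kp(error, kp_low=0.8, kp_high=2.5):
            """Rules: 'small error -> low Kp', 'large error -> high Kp'."""
            e = abs(error)
            w_small = tri(e, -1.0, 0.0, 0.5)
            w_large = tri(e, 0.2, 1.0, 1e9)
            return (w_small * kp_low + w_large * kp_high) / (w_small + w_large + 1e-12)

        # PID loop around a toy first-order motor-speed model, tau = 1 s
        dt, ki, kd = 1e-3, 5.0, 0.02
        speed, integ, prev_e = 0.0, 0.0, 1.0   # prev_e set to initial error
        for _ in range(5000):
            e = 1.0 - speed                    # unit step set-point
            integ += e * dt
            u = fuzzy_kp(e) * e + ki * integ + kd * (e - prev_e) / dt
            prev_e = e
            speed += (-speed + 0.5 * u) * dt
        print(f"speed after 5 s: {speed:.3f}")

    Scheduling the gain this way gives an aggressive response to large errors without the overshoot that a single high fixed gain would produce near the set-point.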

  10. Modeling and Simulation of Control Actuation System with Fuzzy-PID Logic Controlled Brushless Motor Drives for Missiles Glider Applications.

    PubMed

    Muniraj, Murali; Arulmozhiyal, Ramaswamy

    2015-01-01

    A control actuation system has been used extensively in automotive, aerospace, and defense applications. The major challenges in modeling a control actuation system are rise time, maximum peak-to-peak overshoot, and response to a nonlinear system with percentage error. This paper addresses the challenges in modeling and real-time implementation of a control actuation system for missile and glider applications. As an alternative, a fuzzy-PID controller is proposed for the BLDC motor drive, followed by a linkage mechanism to actuate fins in missiles and gliders. The proposed system realizes better rise time and less overshoot while operating in extreme nonlinear dynamic system conditions. A mathematical model of the BLDC motor is derived in state space form. The complete control actuation system is modeled in the MATLAB/Simulink environment and verified by performing simulation studies. A real-time prototype of the control actuation system is developed with a dSPACE-1104 hardware controller and a detailed analysis is carried out to confirm the viability of the proposed system.

  11. A Systems Approach to Scalable Transportation Network Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S

    2006-01-01

    Emerging needs in transportation network modeling and simulation are raising new challenges with respect to scalability of network size and vehicular traffic intensity, speed of simulation for simulation-based optimization, and fidelity of vehicular behavior for accurate capture of event phenomena. Parallel execution is warranted to sustain the required detail, size and speed. However, few parallel simulators exist for such applications, partly due to the challenges underlying their development. Moreover, many simulators are based on time-stepped models, which can be computationally inefficient for the purposes of modeling evacuation traffic. Here an approach is presented to designing a simulator with memory and speed efficiency as the goals from the outset, and, specifically, scalability via parallel execution. The design makes use of discrete event modeling techniques as well as parallel simulation methods. Our simulator, called SCATTER, is being developed, incorporating such design considerations. Preliminary performance results are presented on benchmark road networks, showing scalability to one million vehicles simulated on one processor.
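
    The contrast with time-stepped models is that a discrete event simulator advances each vehicle only when something happens, such as reaching the end of a road link. A minimal sketch with Python's heapq; the network, routes, and travel times are illustrative:

        import heapq

        # Tiny road network: free-flow travel time (s) per directed link
        links = {("A", "B"): 30.0, ("B", "C"): 45.0, ("A", "C"): 90.0}
        routes = {0: ["A", "B", "C"], 1: ["A", "C"], 2: ["A", "B", "C"]}

        events = [(0.0, vid, 0) for vid in routes]   # (time, vehicle, leg index)
        heapq.heapify(events)
        while events:
            t, vid, leg = heapq.heappop(events)
            route = routes[vid]
            if leg == len(route) - 1:
                print(f"t={t:6.1f} s  vehicle {vid} arrived at {route[-1]}")
                continue
            # Schedule the next node arrival; no per-time-step work is done
            # for this vehicle in between, unlike a time-stepped model.
            dt = links[(route[leg], route[leg + 1])]
            heapq.heappush(events, (t + dt, vid, leg + 1))

    With millions of vehicles, skipping the idle time steps between events is where the memory and speed efficiency targeted by such designs comes from.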

  12. A Self-Assisting Protein Folding Model for Teaching Structural Molecular Biology.

    PubMed

    Davenport, Jodi; Pique, Michael; Getzoff, Elizabeth; Huntoon, Jon; Gardner, Adam; Olson, Arthur

    2017-04-04

    Structural molecular biology is now becoming part of the high school science curriculum, thus posing a challenge for teachers who need to convey three-dimensional (3D) structures with conventional text and pictures. In many cases even interactive computer graphics does not go far enough to address these challenges. We have developed a flexible model of the polypeptide backbone using 3D printing technology. With this model we have produced a polypeptide assembly kit to create an idealized model of the triosephosphate isomerase enzyme (TIM), which forms a structure known as the TIM barrel. This kit has been used in a laboratory practical where students perform a step-by-step investigation into the nature of protein folding, starting with the handedness of amino acids and proceeding to the formation of secondary and tertiary structure. Based on the classroom evidence we collected, we conclude that these models are a valuable and inexpensive resource for teaching structural molecular biology. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. FACE-IT. A Science Gateway for Food Security Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montella, Raffaele; Kelly, David; Xiong, Wei

    Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.

  14. Challenges in Real-Time Prediction of Infectious Disease: A Case Study of Dengue in Thailand

    PubMed Central

    Lauer, Stephen A.; Sakrejda, Krzysztof; Iamsirithaworn, Sopon; Hinjoy, Soawapak; Suangtho, Paphanij; Suthachana, Suthanun; Clapham, Hannah E.; Salje, Henrik; Cummings, Derek A. T.; Lessler, Justin

    2016-01-01

    Epidemics of communicable diseases place a huge burden on public health infrastructures across the world. Producing accurate and actionable forecasts of infectious disease incidence at short and long time scales will improve public health response to outbreaks. However, scientists and public health officials face many obstacles in trying to create such real-time forecasts of infectious disease incidence. Dengue is a mosquito-borne virus that annually infects over 400 million people worldwide. We developed a real-time forecasting model for dengue hemorrhagic fever in the 77 provinces of Thailand. We created a practical computational infrastructure that generated multi-step predictions of dengue incidence in Thai provinces every two weeks throughout 2014. These predictions show mixed performance across provinces, out-performing seasonal baseline models in over half of provinces at a 1.5 month horizon. Additionally, to assess the degree to which delays in case reporting make long-range prediction a challenging task, we compared the performance of our real-time predictions with predictions made with fully reported data. This paper provides valuable lessons for the implementation of real-time predictions in the context of public health decision making. PMID:27304062

  15. Challenges in Real-Time Prediction of Infectious Disease: A Case Study of Dengue in Thailand.

    PubMed

    Reich, Nicholas G; Lauer, Stephen A; Sakrejda, Krzysztof; Iamsirithaworn, Sopon; Hinjoy, Soawapak; Suangtho, Paphanij; Suthachana, Suthanun; Clapham, Hannah E; Salje, Henrik; Cummings, Derek A T; Lessler, Justin

    2016-06-01

    Epidemics of communicable diseases place a huge burden on public health infrastructures across the world. Producing accurate and actionable forecasts of infectious disease incidence at short and long time scales will improve public health response to outbreaks. However, scientists and public health officials face many obstacles in trying to create such real-time forecasts of infectious disease incidence. Dengue is a mosquito-borne virus that annually infects over 400 million people worldwide. We developed a real-time forecasting model for dengue hemorrhagic fever in the 77 provinces of Thailand. We created a practical computational infrastructure that generated multi-step predictions of dengue incidence in Thai provinces every two weeks throughout 2014. These predictions show mixed performance across provinces, out-performing seasonal baseline models in over half of provinces at a 1.5 month horizon. Additionally, to assess the degree to which delays in case reporting make long-range prediction a challenging task, we compared the performance of our real-time predictions with predictions made with fully reported data. This paper provides valuable lessons for the implementation of real-time predictions in the context of public health decision making.

  16. Sustainability of cement kiln co-processing of wastes in India: a pilot study.

    PubMed

    Baidya, Rahul; Ghosh, Sadhan Kumar; Parlikar, Ulhas V

    2017-07-01

    Co-processing in cement kilns achieves effective utilization of the material and energy value present in wastes, thereby conserving natural resources by reducing the use of virgin material. In India, a number of multifaceted initiatives have been taken that account for the potential and volume of waste generation. This paper studies the factors which might influence the sustainability of co-processing of waste in cement kilns as a business model, considering the issues and challenges in the supply chain framework in India in view of the four canonical pillars of sustainability. A pilot study on co-processing was carried out in one of the cement plants in India to evaluate environmental, economic, operational and social performance. The findings will help India and other developing countries to introduce effective supply chain management for co-processing while addressing the issues and challenges of co-processing different waste streams in cement kilns.

  17. Challenges of Future High-End Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David; Kutler, Paul (Technical Monitor)

    1998-01-01

    The next major milestone in high performance computing is a sustained rate of one Pflop/s (also written one petaflops, or 10^15 floating-point operations per second). In addition to prodigiously high computational performance, such systems must of necessity feature very large main memories, as well as comparably high I/O bandwidth and huge mass storage facilities. The current consensus of scientists who have studied these issues is that "affordable" petaflops systems may be feasible by the year 2010, assuming that certain key technologies continue to progress at current rates. One important question is whether applications can be structured to perform efficiently on such systems, which are expected to incorporate many thousands of processors and deeply hierarchical memory systems. To answer these questions, advanced performance modeling techniques, including simulation of future architectures and applications, may be required. It may also be necessary to formulate "latency tolerant algorithms" and other completely new algorithmic approaches for certain applications. This talk will give an overview of these challenges.

  18. Predicting the Impacts of Intravehicular Displays on Driving Performance with Human Performance Modeling

    NASA Technical Reports Server (NTRS)

    Mitchell, Diane Kuhl; Wojciechowski, Josephine; Samms, Charneta

    2012-01-01

    A challenge facing the U.S. National Highway Traffic Safety Administration (NHTSA), as well as international safety experts, is the need to educate car drivers about the dangers associated with performing distraction tasks while driving. Researchers working for the U.S. Army Research Laboratory have developed a technique for predicting the increase in mental workload that results when distraction tasks are combined with driving. They implement this technique using human performance modeling. They have predicted workload associated with driving combined with cell phone use. In addition, they have predicted the workload associated with driving military vehicles combined with threat detection. Their technique can be used by safety personnel internationally to demonstrate the dangers of combining distracter tasks with driving and to mitigate the safety risks.

  19. Modeling radiation belt electron dynamics during GEM challenge intervals with the DREAM3D diffusion model

    NASA Astrophysics Data System (ADS)

    Tu, Weichao; Cunningham, G. S.; Chen, Y.; Henderson, M. G.; Camporeale, E.; Reeves, G. D.

    2013-10-01

    As a response to the Geospace Environment Modeling (GEM) "Global Radiation Belt Modeling Challenge," a 3D diffusion model is used to simulate the radiation belt electron dynamics during two intervals of the Combined Release and Radiation Effects Satellite (CRRES) mission, 15 August to 15 October 1990 and 1 February to 31 July 1991. The 3D diffusion model, developed as part of the Dynamic Radiation Environment Assimilation Model (DREAM) project, includes radial, pitch angle, and momentum diffusion and mixed pitch angle-momentum diffusion, which are driven by dynamic wave databases from the statistical CRRES wave data, including plasmaspheric hiss, lower-band, and upper-band chorus. By comparing the DREAM3D model outputs to the CRRES electron phase space density (PSD) data, we find that, with a data-driven boundary condition at Lmax = 5.5, the electron enhancements can generally be explained by radial diffusion, though additional local heating from chorus waves is required. Because the PSD reductions are included in the boundary condition at Lmax = 5.5, our model captures the fast electron dropouts over a large L range, producing better model performance compared to previously published results. Plasmaspheric hiss produces electron losses inside the plasmasphere, but the model still sometimes overestimates the PSD there. Test simulations using reduced radial diffusion coefficients or increased pitch angle diffusion coefficients inside the plasmasphere suggest that better wave models and more realistic radial diffusion coefficients, both inside and outside the plasmasphere, are needed to improve the model performance. Statistically, the results show that, with the data-driven outer boundary condition, including radial diffusion and plasmaspheric hiss is sufficient to model the electrons during geomagnetically quiet times, but to best capture the radiation belt variations during active times, pitch angle and momentum diffusion from chorus waves are required.
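
    For reference, diffusion codes of this type solve a Fokker-Planck equation for the electron phase space density f. A standard form with radial, pitch angle, momentum, and mixed diffusion terms is the following (the usual textbook notation, not necessarily the exact DREAM3D discretization):

        \frac{\partial f}{\partial t}
          = L^2 \frac{\partial}{\partial L}\left(\frac{D_{LL}}{L^2}\frac{\partial f}{\partial L}\right)
          + \frac{1}{G}\frac{\partial}{\partial \alpha}\left[G\left(D_{\alpha\alpha}\frac{\partial f}{\partial \alpha}
              + D_{\alpha p}\frac{\partial f}{\partial p}\right)\right]
          + \frac{1}{G}\frac{\partial}{\partial p}\left[G\left(D_{p\alpha}\frac{\partial f}{\partial \alpha}
              + D_{pp}\frac{\partial f}{\partial p}\right)\right]

    where L is the radial coordinate, \alpha the equatorial pitch angle, p the momentum, the D terms are the diffusion coefficients (driven here by the CRRES wave databases), and G \propto p^2 T(\alpha)\sin\alpha\cos\alpha is the Jacobian of the coordinate transformation.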

  20. Evaluation of cross-protection by immunization with an experimental trivalent companion animal periodontitis vaccine in the mouse periodontitis model.

    PubMed

    Hardham, John; Sfintescu, Cornelia; Evans, Richard T

    2008-03-01

    Companion animal periodontal disease is one of the most prevalent diseases seen by veterinarians. The goal of this study was to evaluate the vaccine performance of a trivalent canine periodontitis vaccine in the mouse oral challenge model of periodontitis. Mice vaccinated subcutaneously with an inactivated, whole-cell vaccine preparation of Porphyromonas denticanis, Porphyromonas gulae, and Porphyromonas salivosa displayed significantly reduced alveolar bone loss in response to heterologous and cross-species challenges as compared to sham vaccinated animals. Based on the results of these studies, a periodontitis vaccine may be a useful tool in preventing the initiation and progression of periodontitis caused by the most commonly isolated pigmenting anaerobic bacteria in animals.

  1. Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models

    PubMed Central

    2017-01-01

    We describe a fully data driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder–decoder architecture that consists of two recurrent neural networks, which has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis. PMID:29104927
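
    The encoder-decoder structure described can be sketched compactly in PyTorch. The GRU cells, dimensions, and random token tensors below are illustrative stand-ins; the paper's model is likewise a recurrent encoder-decoder over tokenized reaction strings, but its exact architecture and vocabulary differ:

        import torch
        import torch.nn as nn

        class Seq2Seq(nn.Module):
            """Minimal recurrent encoder-decoder over token sequences."""
            def __init__(self, vocab=64, emb=128, hid=256):
                super().__init__()
                self.embed = nn.Embedding(vocab, emb)
                self.encoder = nn.GRU(emb, hid, batch_first=True)
                self.decoder = nn.GRU(emb, hid, batch_first=True)
                self.out = nn.Linear(hid, vocab)

            def forward(self, product, reactants_in):
                _, h = self.encoder(self.embed(product))      # encode product
                dec, _ = self.decoder(self.embed(reactants_in), h)
                return self.out(dec)   # per-position reactant token logits

        model = Seq2Seq()
        product = torch.randint(0, 64, (4, 40))     # batch of token id sequences
        reactants = torch.randint(0, 64, (4, 50))
        logits = model(product, reactants[:, :-1])  # teacher forcing
        loss = nn.functional.cross_entropy(logits.reshape(-1, 64),
                                           reactants[:, 1:].reshape(-1))
        print(loss.item())

    Training minimizes the token-level cross-entropy; at inference, reactant tokens are decoded step by step (for example with beam search) from the encoded product representation.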

  2. Polar versus Cartesian velocity models for maneuvering target tracking with IMM

    NASA Astrophysics Data System (ADS)

    Laneuville, Dann

    This paper compares various model sets in different IMM filters for the maneuvering target tracking problem. The aim is to see whether we can improve the tracking performance of what is certainly the most widely used model set in the literature for the maneuvering target tracking problem: a Nearly Constant Velocity model and a Nearly Coordinated Turn model. Our new challenger set consists of a mixed Cartesian position and polar velocity state vector to describe the uniform motion segments, and is augmented with the turn rate to obtain the second model for the maneuvering segments. This paper also gives a general procedure to discretize, up to second order, any non-linear continuous time model with linear diffusion. Comparative simulations on an air defence scenario with a 2D radar show that this new approach significantly improves the tracking performance in this case.
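
    To make the two parameterizations concrete: the conventional set uses a Cartesian state (x, \dot{x}, y, \dot{y}) with white-noise acceleration, while a mixed Cartesian-position/polar-velocity state augmented with the turn rate \omega evolves as (an illustrative continuous-time sketch, not necessarily the paper's exact models)

        \dot{x} = v\cos\psi, \qquad \dot{y} = v\sin\psi, \qquad
        \dot{v} = w_v, \qquad \dot{\psi} = \omega, \qquad \dot{\omega} = w_\omega

    with speed v, heading \psi, and process noises w_v, w_\omega; setting \omega \equiv 0 recovers the uniform-motion model for the non-maneuvering mode.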

  3. Mean curvature and texture constrained composite weighted random walk algorithm for optic disc segmentation towards glaucoma screening.

    PubMed

    Panda, Rashmi; Puhan, N B; Panda, Ganapati

    2018-02-01

    Accurate optic disc (OD) segmentation is an important step in obtaining cup-to-disc ratio-based glaucoma screening using fundus imaging. It is a challenging task because of the subtle OD boundary, blood vessel occlusion and intensity inhomogeneity. In this Letter, the authors propose an improved version of the random walk algorithm for OD segmentation to tackle such challenges. The algorithm incorporates the mean curvature and Gabor texture energy features to define the new composite weight function to compute the edge weights. Unlike the deformable model-based OD segmentation techniques, the proposed algorithm remains unaffected by curve initialisation and local energy minima problem. The effectiveness of the proposed method is verified with DRIVE, DIARETDB1, DRISHTI-GS and MESSIDOR database images using the performance measures such as mean absolute distance, overlapping ratio, dice coefficient, sensitivity, specificity and precision. The obtained OD segmentation results and quantitative performance measures show robustness and superiority of the proposed algorithm in handling the complex challenges in OD segmentation.
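
    In random walk segmentation, edge weights are typically Gaussian functions of inter-pixel feature differences, so a composite weight combining intensity I, mean curvature \kappa, and Gabor texture energy G could take a form such as (an illustrative sketch, not the authors' exact definition)

        w_{ij} = \exp\left(-\beta\left[\lambda_1 (I_i - I_j)^2
                 + \lambda_2 (\kappa_i - \kappa_j)^2
                 + \lambda_3 (G_i - G_j)^2\right]\right)

    where \beta controls edge sensitivity and the \lambda_k weight the three feature channels; the curvature and texture terms keep weights informative across the subtle OD boundary even where raw intensity contrast fails.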

  4. Crowdsourced assessment of common genetic contribution to predicting anti-TNF treatment response in rheumatoid arthritis.

    PubMed

    Sieberts, Solveig K; Zhu, Fan; García-García, Javier; Stahl, Eli; Pratap, Abhishek; Pandey, Gaurav; Pappas, Dimitrios; Aguilar, Daniel; Anton, Bernat; Bonet, Jaume; Eksi, Ridvan; Fornés, Oriol; Guney, Emre; Li, Hongdong; Marín, Manuel Alejandro; Panwar, Bharat; Planas-Iglesias, Joan; Poglayen, Daniel; Cui, Jing; Falcao, Andre O; Suver, Christine; Hoff, Bruce; Balagurusamy, Venkat S K; Dillenberger, Donna; Neto, Elias Chaibub; Norman, Thea; Aittokallio, Tero; Ammad-Ud-Din, Muhammad; Azencott, Chloe-Agathe; Bellón, Víctor; Boeva, Valentina; Bunte, Kerstin; Chheda, Himanshu; Cheng, Lu; Corander, Jukka; Dumontier, Michel; Goldenberg, Anna; Gopalacharyulu, Peddinti; Hajiloo, Mohsen; Hidru, Daniel; Jaiswal, Alok; Kaski, Samuel; Khalfaoui, Beyrem; Khan, Suleiman Ali; Kramer, Eric R; Marttinen, Pekka; Mezlini, Aziz M; Molparia, Bhuvan; Pirinen, Matti; Saarela, Janna; Samwald, Matthias; Stoven, Véronique; Tang, Hao; Tang, Jing; Torkamani, Ali; Vert, Jean-Phillipe; Wang, Bo; Wang, Tao; Wennerberg, Krister; Wineinger, Nathan E; Xiao, Guanghua; Xie, Yang; Yeung, Rae; Zhan, Xiaowei; Zhao, Cheng; Greenberg, Jeff; Kremer, Joel; Michaud, Kaleb; Barton, Anne; Coenen, Marieke; Mariette, Xavier; Miceli, Corinne; Shadick, Nancy; Weinblatt, Michael; de Vries, Niek; Tak, Paul P; Gerlag, Danielle; Huizinga, Tom W J; Kurreeman, Fina; Allaart, Cornelia F; Louis Bridges, S; Criswell, Lindsey; Moreland, Larry; Klareskog, Lars; Saevarsdottir, Saedis; Padyukov, Leonid; Gregersen, Peter K; Friend, Stephen; Plenge, Robert; Stolovitzky, Gustavo; Oliva, Baldo; Guan, Yuanfang; Mangravite, Lara M

    2016-08-23

    Rheumatoid arthritis (RA) affects millions world-wide. While anti-TNF treatment is widely used to reduce disease progression, treatment fails in ∼one-third of patients. No biomarker currently exists that identifies non-responders before treatment. A rigorous community-based assessment of the utility of SNP data for predicting anti-TNF treatment efficacy in RA patients was performed in the context of a DREAM Challenge (http://www.synapse.org/RA_Challenge). An open challenge framework enabled the comparative evaluation of predictions developed by 73 research groups using the most comprehensive available data and covering a wide range of state-of-the-art modelling methodologies. Despite a significant genetic heritability estimate of treatment non-response trait (h^2 = 0.18, P value = 0.02), no significant genetic contribution to prediction accuracy is observed. Results formally confirm the expectations of the rheumatology community that SNP information does not significantly improve predictive performance relative to standard clinical traits, thereby justifying a refocusing of future efforts on collection of other data.

  5. Low latency network and distributed storage for next generation HPC systems: the ExaNeSt project

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Cretaro, P.; Frezza, O.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Paolucci, P. S.; Pastorelli, E.; Pisani, F.; Simula, F.; Vicini, P.; Navaridas, J.; Chaix, F.; Chrysos, N.; Katevenis, M.; Papaeustathiou, V.

    2017-10-01

    With processor architecture evolution, the HPC market has undergone a paradigm shift. The adoption of low-cost, Linux-based clusters extended the reach of HPC from its roots in modelling and simulation of complex physical systems to a broader range of industries, from biotechnology, cloud computing, computer analytics and big data challenges to manufacturing sectors. In this perspective, near-future HPC systems can be envisioned as composed of millions of low-power computing cores, densely packed (and hence cooled by appropriate technology), with a tightly interconnected, low latency and high performance network, and equipped with a distributed storage architecture. Each of these features (dense packing, distributed storage and high performance interconnect) represents a challenge, made all the harder by the need to solve them all at the same time. These challenges lie as stumbling blocks along the road towards Exascale-class systems; the ExaNeSt project acknowledges them and tasks itself with investigating ways around them.

  6. Smart wing wind tunnel model design

    NASA Astrophysics Data System (ADS)

    Martin, Christopher A.; Jasmin, Larry; Flanagan, John S.; Appa, Kari; Kudva, Jayanth N.

    1997-05-01

    To verify the predicted benefits of the smart wing concept, two 16% scale wind tunnel models, one conventional and the other incorporating smart wing design features, were designed, fabricated and tested. Meticulous design of the two models was essential to: (1) ensure the required factor of safety of four for operation in the NASA Langley TDT wind tunnel, (2) efficiently integrate the smart actuation systems, (3) quantify the performance improvements, and (4) facilitate eventual scale-up to operational aircraft. Significant challenges were encountered in designing the attachment of the shape memory alloy control surfaces to the wing box, integration of the SMA torque tube in the wing structure, and development of control mechanisms to protect the model and the tunnel in the event of failure of the smart systems. In this paper, detailed design of the two models are presented. First, dynamic scaling of the models based on the geometry and structural details of the full- scale aircraft is presented. Next, results of the stress, divergence and flutter analyses are summarized. Finally some of the challenges of integrating the smart actuators with the model are highlighted.

  7. Neutronics Conversion Analyses of the Laue-Langevin Institute (ILL) High Flux Reactor (RHF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergeron, A.; Dionne, B.; Calzavara, Y.

    2014-09-30

    The following report describes the neutronics results obtained with the MCNP model of the RHF U7Mo LEU reference design that was established in 2010 during the feasibility analysis. This work constitutes a complete and detailed neutronics analysis of that LEU design using models that have been significantly improved since 2010 and the release of the feasibility report. When possible, the credibility of the neutronics model is tested by comparing the HEU model results with experimental data or with the results of other codes. The results obtained with the LEU model are systematically compared to the HEU model. The changes applied to the neutronics model lead to better agreement with experimental data and improved calculation efficiency, but do not challenge the conclusions of the feasibility analysis. Provided that the U7Mo fuel is commercially available and not cost prohibitive, that a back-end solution is established, and that it is possible to manufacture the proposed element, neutronics analyses show that the performance of the reactor would not be compromised by the conversion to LEU fuel.

  8. Development and application of diurnal thermal modeling for camouflage, concealment, and deception

    NASA Astrophysics Data System (ADS)

    Rodgers, Mark L. B.

    2000-07-01

    The art of camouflage is to make a military asset appear to be part of the natural environment: its background. In order to predict the likely performance of countermeasures in attaining this goal it is necessary to model the signatures of targets, backgrounds and the effect of countermeasures. A library of diurnal thermal models has been constructed covering a range of backgrounds from vegetated and non-vegetated surfaces to snow cover. These models, originally developed for Western Europe, have been validated successfully for theatres of operation from the arctic to the desert. This paper will show the basis for and development of physically based models for the diurnal thermal behavior both of these backgrounds and of major passive countermeasures: camouflage nets and continuous textile materials. The countermeasures pose significant challenges for the thermal modeler, with their low but non-zero thermal inertia and the extent to which they influence local aerodynamic behavior. These challenges have been met, and the necessary extensive validation has shown the ability of the models to predict successfully the behavior of in-service countermeasures.
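
    Diurnal thermal models of this kind are generally built around a surface energy balance. A textbook single-layer form is the following (a sketch only; the in-service background and countermeasure models add layered conduction, moisture, and the aerodynamic effects noted above):

        C \frac{dT_s}{dt} = (1 - \alpha) S(t) + \varepsilon L_{\downarrow}
                            - \varepsilon \sigma T_s^4 - h (T_s - T_{air}) - G_{cond}

    where C is the areal heat capacity (the low but non-zero thermal inertia of nets and textiles), \alpha the solar albedo, S(t) the diurnal solar irradiance, L_{\downarrow} the incoming longwave flux, \varepsilon the emissivity, \sigma the Stefan-Boltzmann constant, h a convective exchange coefficient, and G_{cond} the conduction into the substrate.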

  9. Modeling Self-Heating Effects in Nanoscale Devices

    NASA Astrophysics Data System (ADS)

    Raleva, K.; Shaik, A. R.; Vasileska, D.; Goodnick, S. M.

    2017-08-01

    Accurate thermal modeling and design of microelectronic devices and thin film structures at the micro- and nanoscales pose a challenge to electrical engineers who are less familiar with the basic concepts and ideas in sub-continuum heat transport. This book aims to bridge that gap. Efficient heat removal methods are necessary to increase device performance and device reliability. The authors provide readers with a combination of nanoscale experimental techniques and accurate modeling methods that must be employed in order to determine a device's temperature profile.

  10. Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Hinkley, Jeffrey A.

    2003-01-01

    The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.

  11. Crowd Sourcing for Challenging Technical Problems and Business Model

    NASA Technical Reports Server (NTRS)

    Davis, Jeffrey R.; Richard, Elizabeth

    2011-01-01

    Crowd sourcing may be defined as the act of outsourcing tasks that are traditionally performed by an employee or contractor to an undefined, generally large group of people or community (a crowd) in the form of an open call. The open call may be issued by an organization wishing to find a solution to a particular problem or complete a task, or by an open innovation service provider on behalf of that organization. In 2008, the Space Life Sciences Directorate (SLSD), with the support of Wyle Integrated Science and Engineering, established and implemented pilot projects in open innovation (crowd sourcing) to determine if these new internet-based platforms could indeed find solutions to difficult technical challenges. These unsolved technical problems were converted to problem statements, also called "Challenges" or "Technical Needs" by the various open innovation service providers, and were then posted externally to seek solutions. In addition, an open call was issued internally to NASA employees Agency-wide (10 Field Centers and NASA HQ) using an open innovation service provider crowd sourcing platform, posting NASA challenges from each Center for the others to propose solutions. From 2008 to 2010, the SLSD issued 34 challenges, 14 externally and 20 internally. The 14 external problems or challenges were posted through three different vendors: InnoCentive, Yet2.com and TopCoder. The 20 internal challenges were conducted using the InnoCentive crowd sourcing platform designed for internal use by an organization. This platform was customized for NASA use and promoted as NASA@Work. The results were significant. Of the seven InnoCentive external challenges, two full and five partial awards were made in complex technical areas such as predicting solar flares and long-duration food packaging. Similarly, the TopCoder challenge yielded an optimization algorithm for designing a lunar medical kit. The Yet2.com challenges yielded many new industry and academic contacts in bone imaging, microbial detection and even the use of pharmaceuticals for radiation protection. The internal challenges through NASA@Work drew over 6000 participants across all NASA centers. Challenges conducted by each NASA center elicited ideas and solutions from several other NASA centers and demonstrated rapid and efficient participation by employees at multiple centers in problem solving. Finally, on January 19, 2011, the SLSD conducted a workshop on open collaboration and innovation strategies and best practices through the newly established NASA Human Health and Performance Center (NHHPC). Initial projects will be described, leading to a new business model for SLSD.

  12. Strong leadership and teamwork drive culture and performance change: Ohio State University Medical Center 2000-2006.

    PubMed

    Sanfilippo, Fred; Bendapudi, Neeli; Rucci, Anthony; Schlesinger, Leonard

    2008-09-01

    Several characteristics of academic health centers have the potential to create high levels of internal conflict and misalignment that can pose significant leadership challenges. In September 2000, the positions of Ohio State University (OSU) senior vice president for health sciences, dean of the medical school, and the newly created position of chief executive officer of the OSU Medical Center (OSUMC) were combined under a single leader to oversee the OSUMC. This mandate from the president and trustees was modeled after top institutions with similar structures. The leader who assumed the role was tasked with improving OSUMC's academic, clinical, and financial performance. To achieve this goal, the senior vice president and his team employed the service value chain model of improving performance, based on the premise that leadership behavior/culture drives employee engagement/satisfaction, leading to customer satisfaction and improved organizational performance. Implementing this approach was a seven-step process: (1) selecting the right leadership team, (2) assessing the challenges and opportunities, (3) setting expectations for performance and leadership behavior, (4) aligning structures and functions, (5) engaging constituents, (6) developing leadership skills, and (7) defining strategies and tracking goals. The OSUMC setting during this period provides an observational case study to examine how these stepwise changes, instituted by strong leadership and teamwork, were able to make and implement sound decisions that drove substantial and measurable improvements in the engagement and satisfaction of faculty and staff; the satisfaction of students and patients; and academic, clinical, and financial performance.

  13. Understanding GPU Power. A Survey of Profiling, Modeling, and Simulation Methods

    DOE PAGES

    Bridges, Robert A.; Imam, Neena; Mintz, Tiffany M.

    2016-09-01

    Modern graphics processing units (GPUs) have complex architectures that admit exceptional performance and energy efficiency for high-throughput applications. Though GPUs consume large amounts of power, their use for high-throughput applications facilitates state-of-the-art energy efficiency and performance. Consequently, continued development relies on understanding their power consumption. Our work is a survey of GPU power modeling and profiling methods with increased detail on noteworthy efforts. Moreover, as direct measurement of GPU power is necessary for model evaluation and parameter initiation, internal and external power sensors are discussed. Hardware counters, which are low-level tallies of hardware events, share strong correlation to power use and performance. Statistical correlation between power and performance counters has yielded worthwhile GPU power models, yet the complexity inherent to GPU architectures presents new hurdles for power modeling. Developments and challenges of counter-based GPU power modeling are discussed. Often building on the counter-based models, research efforts in GPU power simulation, which make power predictions from input code and hardware knowledge, provide opportunities for optimization in programming or architectural design. Noteworthy strides in power simulation for GPUs are included along with their performance or functional simulator counterparts when appropriate. Lastly, possible directions for future research are discussed.
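
    The counter-based modeling surveyed here usually amounts to regressing measured power on per-interval hardware event counts. A minimal sketch with synthetic data; the counter names and coefficients are hypothetical:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        # Per-interval event counts, e.g., DRAM accesses, FP ops, SM activity
        counters = rng.random((n, 3))
        true_w = np.array([40.0, 25.0, 60.0])   # hypothetical W per unit activity
        power = 30.0 + counters @ true_w + rng.normal(0.0, 2.0, n)  # 30 W static

        # Least-squares fit of: power = w0 + w . counters
        A = np.hstack([np.ones((n, 1)), counters])
        coef, *_ = np.linalg.lstsq(A, power, rcond=None)
        print("static power estimate:", coef[0])
        print("per-counter weights:", coef[1:])

    The fitted weights then predict power for new kernels from their counter profiles alone, which is what makes such models attractive despite the architectural complexity noted above.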

  14. Understanding GPU Power. A Survey of Profiling, Modeling, and Simulation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A.; Imam, Neena; Mintz, Tiffany M.

    Modern graphics processing units (GPUs) have complex architectures that admit exceptional performance and energy efficiency for high-throughput applications. Though GPUs consume large amounts of power, their use for high-throughput applications facilitates state-of-the-art energy efficiency and performance. Consequently, continued development relies on understanding their power consumption. Our work is a survey of GPU power modeling and profiling methods with increased detail on noteworthy efforts. Moreover, as direct measurement of GPU power is necessary for model evaluation and parameter initiation, internal and external power sensors are discussed. Hardware counters, which are low-level tallies of hardware events, share strong correlation to power use and performance. Statistical correlation between power and performance counters has yielded worthwhile GPU power models, yet the complexity inherent to GPU architectures presents new hurdles for power modeling. Developments and challenges of counter-based GPU power modeling are discussed. Often building on the counter-based models, research efforts in GPU power simulation, which make power predictions from input code and hardware knowledge, provide opportunities for optimization in programming or architectural design. Noteworthy strides in power simulation for GPUs are included along with their performance or functional simulator counterparts when appropriate. Lastly, possible directions for future research are discussed.

  15. A guide to using functional magnetic resonance imaging to study Alzheimer's disease in animal models.

    PubMed

    Asaad, Mazen; Lee, Jin Hyung

    2018-05-18

    Alzheimer's disease is a leading healthcare challenge facing our society today. Functional magnetic resonance imaging (fMRI) of the brain has played an important role in our efforts to understand how Alzheimer's disease alters brain function. Using fMRI in animal models of Alzheimer's disease has the potential to provide us with a more comprehensive understanding of the observations made in human clinical fMRI studies. However, using fMRI in animal models of Alzheimer's disease presents some unique challenges. Here, we highlight some of these challenges and discuss potential solutions for researchers interested in performing fMRI in animal models. First, we briefly summarize our current understanding of Alzheimer's disease from a mechanistic standpoint. We then overview the wide array of animal models available for studying this disease and how to choose the most appropriate model to study, depending on which aspects of the condition researchers seek to investigate. Finally, we discuss the contributions of fMRI to our understanding of Alzheimer's disease and the issues to consider when designing fMRI studies for animal models, such as differences in brain activity based on anesthetic choice and ways to interrogate more specific questions in rodents beyond those that can be addressed in humans. The goal of this article is to provide information on the utility of fMRI, and approaches to consider when using fMRI, for studies of Alzheimer's disease in animal models. © 2018. Published by The Company of Biologists Ltd.

  16. A guide to using functional magnetic resonance imaging to study Alzheimer's disease in animal models

    PubMed Central

    Asaad, Mazen

    2018-01-01

    ABSTRACT Alzheimer's disease is a leading healthcare challenge facing our society today. Functional magnetic resonance imaging (fMRI) of the brain has played an important role in our efforts to understand how Alzheimer's disease alters brain function. Using fMRI in animal models of Alzheimer's disease has the potential to provide us with a more comprehensive understanding of the observations made in human clinical fMRI studies. However, using fMRI in animal models of Alzheimer's disease presents some unique challenges. Here, we highlight some of these challenges and discuss potential solutions for researchers interested in performing fMRI in animal models. First, we briefly summarize our current understanding of Alzheimer's disease from a mechanistic standpoint. We then overview the wide array of animal models available for studying this disease and how to choose the most appropriate model to study, depending on which aspects of the condition researchers seek to investigate. Finally, we discuss the contributions of fMRI to our understanding of Alzheimer's disease and the issues to consider when designing fMRI studies for animal models, such as differences in brain activity based on anesthetic choice and ways to interrogate more specific questions in rodents beyond those that can be addressed in humans. The goal of this article is to provide information on the utility of fMRI, and approaches to consider when using fMRI, for studies of Alzheimer's disease in animal models. PMID:29784664

  17. Predicting detection performance with model observers: Fourier domain or spatial domain?

    PubMed

    Chen, Baiyu; Yu, Lifeng; Leng, Shuai; Kofler, James; Favazza, Christopher; Vrieze, Thomas; McCollough, Cynthia

    2016-02-27

    The use of the Fourier domain model observer is challenged by iterative reconstruction (IR), because IR algorithms are nonlinear and IR images have noise texture different from that of FBP. A modified Fourier domain model observer, which incorporates nonlinear noise and resolution properties, has been proposed for IR and needs to be validated against human detection performance. On the other hand, the spatial domain model observer is theoretically applicable to IR, but more computationally intensive than the Fourier domain method. The purpose of this study is to compare the modified Fourier domain model observer to the spatial domain model observer with both FBP and IR images, using human detection performance as the gold standard. A phantom with inserts of various low contrast levels and sizes was repeatedly scanned 100 times on a third-generation, dual-source CT scanner at 5 dose levels and reconstructed using FBP and IR algorithms. The human detection performance of the inserts was measured via a 2-alternative-forced-choice (2AFC) test. In addition, two model observer performances were calculated: a Fourier domain non-prewhitening model observer and a spatial domain channelized Hotelling observer. The performance of these two model observers was compared in terms of how well each correlated with human observer performance. Our results demonstrated that the spatial domain model observer correlated well with human observers across various dose levels, object contrast levels, and object sizes. The Fourier domain observer correlated well with human observers using FBP images, but overestimated the detection performance using IR images.
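
    For concreteness, the spatial domain channelized Hotelling observer reduces each image to a few channel responses and computes a detectability index from their statistics. A minimal sketch with synthetic images and a random channel bank standing in for the usual Gabor or Laguerre-Gauss channels:

        import numpy as np

        def cho_detectability(signal_imgs, noise_imgs, channels):
            """Channelized Hotelling observer detectability d'.

            signal_imgs, noise_imgs: arrays (n_images, H, W);
            channels: array (n_channels, H*W) of channel templates.
            """
            s = signal_imgs.reshape(len(signal_imgs), -1) @ channels.T
            n = noise_imgs.reshape(len(noise_imgs), -1) @ channels.T
            dmean = s.mean(axis=0) - n.mean(axis=0)
            cov = 0.5 * (np.cov(s, rowvar=False) + np.cov(n, rowvar=False))
            w = np.linalg.solve(cov, dmean)      # Hotelling template
            return float(np.sqrt(dmean @ w))

        rng = np.random.default_rng(0)
        noise = rng.normal(0.0, 1.0, (100, 32, 32))
        signal = rng.normal(0.3, 1.0, (100, 32, 32))   # uniform 'insert'
        channels = rng.normal(0.0, 1.0, (8, 32 * 32))  # placeholder channels
        print(cho_detectability(signal, noise, channels))

    Because the channel statistics are estimated from the reconstructed images themselves, no linearity assumption is needed, which is why the spatial domain approach remains applicable to IR where the Fourier domain observer struggles.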

  18. Predicting detection performance with model observers: Fourier domain or spatial domain?

    PubMed Central

    Chen, Baiyu; Yu, Lifeng; Leng, Shuai; Kofler, James; Favazza, Christopher; Vrieze, Thomas; McCollough, Cynthia

    2016-01-01

    The use of the Fourier domain model observer is challenged by iterative reconstruction (IR), because IR algorithms are nonlinear and IR images have noise texture different from that of FBP. A modified Fourier domain model observer, which incorporates nonlinear noise and resolution properties, has been proposed for IR and needs to be validated against human detection performance. On the other hand, the spatial domain model observer is theoretically applicable to IR, but more computationally intensive than the Fourier domain method. The purpose of this study is to compare the modified Fourier domain model observer to the spatial domain model observer with both FBP and IR images, using human detection performance as the gold standard. A phantom with inserts of various low contrast levels and sizes was repeatedly scanned 100 times on a third-generation, dual-source CT scanner at 5 dose levels and reconstructed using FBP and IR algorithms. The human detection performance of the inserts was measured via a 2-alternative-forced-choice (2AFC) test. In addition, two model observer performances were calculated: a Fourier domain non-prewhitening model observer and a spatial domain channelized Hotelling observer. The performance of these two model observers was compared in terms of how well each correlated with human observer performance. Our results demonstrated that the spatial domain model observer correlated well with human observers across various dose levels, object contrast levels, and object sizes. The Fourier domain observer correlated well with human observers using FBP images, but overestimated the detection performance using IR images. PMID:27239086

  19. Manycore Performance-Portability: Kokkos Multidimensional Array Library

    DOE PAGES

    Edwards, H. Carter; Sunderland, Daniel; Porter, Vicki; ...

    2012-01-01

    Large, complex scientific and engineering application codes have a significant investment in computational kernels that implement their mathematical models. Porting these computational kernels to the collection of modern manycore accelerator devices is a major challenge, in that these devices have diverse programming models, application programming interfaces (APIs), and performance requirements. The Kokkos Array programming model provides a library-based approach to implement computational kernels that are performance-portable to CPU-multicore and GPGPU accelerator devices. This programming model is based upon three fundamental concepts: (1) manycore compute devices, each with its own memory space, (2) data parallel kernels, and (3) multidimensional arrays. Kernel execution performance is, especially for NVIDIA® devices, extremely dependent on data access patterns. The optimal data access pattern can be different for different manycore devices, potentially leading to different implementations of computational kernels specialized for different devices. The Kokkos Array programming model supports performance-portable kernels by (1) separating data access patterns from computational kernels through a multidimensional array API and (2) introducing device-specific data access mappings when a kernel is compiled. An implementation of Kokkos Array is available through Trilinos [Trilinos website, http://trilinos.sandia.gov/, August 2011].
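
    Kokkos itself is C++; as a language-neutral sketch of the core idea, that the optimal data access pattern depends on memory layout, the hypothetical NumPy snippet below traverses the same logical 2-D array stored row-major versus column-major. The mismatched case is slower, which is the effect the Kokkos multidimensional array API hides from kernel code.

        import numpy as np
        import time

        N = 2000
        a_row = np.ones((N, N), order='C')   # row-major storage
        a_col = np.asfortranarray(a_row)     # column-major storage, same values

        def sum_by_rows(a):
            # Row-at-a-time traversal; contiguous only for row-major storage.
            return sum(a[i, :].sum() for i in range(a.shape[0]))

        for name, arr in (("row-major", a_row), ("column-major", a_col)):
            t0 = time.perf_counter()
            sum_by_rows(arr)
            print(name, round(time.perf_counter() - t0, 4))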

  20. An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.

    2009-07-01

    A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.
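
    The abstract's agent-based, discrete-time approach can be sketched generically; the toy below (hypothetical nodes, production rates, and a ring network, none of them from the paper) steps a handful of interoperating assets that ship a single resource toward needier neighbors each tick.

        import numpy as np

        rng = np.random.default_rng(2)
        n_agents, steps = 5, 100
        stock = rng.uniform(50, 100, n_agents)        # initial resource levels
        produce = rng.uniform(0.5, 2.0, n_agents)     # per-step local production
        links = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]  # ring network

        for _ in range(steps):
            stock += produce - 1.0                    # production minus consumption
            for a, b in links:                        # ship from richer to poorer node
                flow = 0.1 * (stock[a] - stock[b])
                stock[a] -= flow
                stock[b] += flow

        print(stock.round(1))  # interoperation evens out resources across the network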

  1. NASA Langley's Approach to the Sandia's Structural Dynamics Challenge Problem

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kenny, Sean P.; Crespo, Luis G.; Elliott, Kenny B.

    2007-01-01

    The objective of this challenge is to develop a data-based probabilistic model of uncertainty to predict the behavior of subsystems (payloads) by themselves and while coupled to a primary (target) system. Although this type of analysis is routinely performed and representative of issues faced in real-world system design and integration, there are still several key technical challenges that must be addressed when analyzing uncertain interconnected systems. For example, one key technical challenge is related to the fact that there is limited data on target configurations. Moreover, it is typical to have multiple data sets from experiments conducted at the subsystem level, but often sample sizes are not sufficient to compute high-confidence statistics. In this challenge problem, additional constraints are placed as ground rules for the participants. One such rule is that mathematical models of the subsystem are limited to linear approximations of the nonlinear physics of the problem at hand. Also, participants are constrained to use these models and the multiple data sets to make predictions about the target system response under completely different input conditions. Our approach initially involved screening several different methods; three of those considered are presented herein. The first is based on the transformation of the modal data to an orthogonal space where the mean and covariance of the data are matched by the model. The other two approaches worked solutions in physical space, where the uncertain parameter set is made of masses, stiffnesses, and damping coefficients; one matches confidence intervals of low-order moments of the statistics via optimization, while the second uses a kernel density estimation approach (see the sketch below). The paper touches on all the approaches, lessons learned, validation metrics and their comparison, data quantity restrictions, and the assumptions/limitations of each approach. Keywords: probabilistic modeling, model validation, uncertainty quantification, kernel density
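
    As a sketch of the kernel density estimation approach named last, assuming a handful of hypothetical (mass, stiffness, damping) samples in place of the challenge data:

        import numpy as np
        from scipy.stats import gaussian_kde

        # Hypothetical subsystem parameter samples; small sample size, as in
        # the challenge. Rows are experiments, columns are parameters.
        samples = np.array([
            [1.02, 4.8e4, 0.012],
            [0.97, 5.1e4, 0.015],
            [1.05, 4.9e4, 0.011],
            [0.99, 5.3e4, 0.014],
            [1.01, 5.0e4, 0.013],
            [0.96, 5.2e4, 0.012],
        ]).T                           # gaussian_kde expects (n_dims, n_samples)

        kde = gaussian_kde(samples)    # nonparametric density over parameters
        draws = kde.resample(1000)     # Monte Carlo draws for propagation
        print(draws.shape)             # (3, 1000)

        # Each draw would be pushed through the linear subsystem model to build
        # a distribution of predicted target-system responses.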

  2. Restoration of in situ fiber degradation and the role of fibrolytic microbes and ruminal pH in cows fed grain-rich diets transiently or continuously.

    PubMed

    Pourazad, P; Khiaosa-Ard, R; Metzler-Zebeli, B U; Klevenhusen, F; Zebeli, Q

    2017-12-01

    In this study, we used two different grain-rich feeding models (continuous or transient) to determine their effects on in situ fiber degradation and the abundances of important fibrolytic microbes in the rumen. The role of the magnitude of the ruminal pH drop during grain feeding in fiber degradation was also determined. The study was performed in eight rumen-fistulated dry cows. They were fed a forage-only diet (baseline) and then challenged with a 60% concentrate diet for 4 weeks, either continuously (n=4 cows) or transiently (n=4 cows); the cows in the transient model had 1 week off concentrate in between. Ruminal degradation of grass silage and fiber-rich hay was determined by the in situ technique, and the microbial abundances attached to incubated samples were analyzed by quantitative PCR. In the continuous model, the in situ trials were performed at the baseline and in the 1st and last weeks of concentrate feeding. In the transient model, the in situ trials were done at the baseline and in the 1st week of the re-challenge with concentrate. In situ degradation of NDF and ADF of the forage samples, and microbial abundances, were determined at 0, 4, 8, 24 and 48 h of incubation. Ruminal pH and temperature during the incubation were recorded using indwelling pH sensors. Compared with the respective baseline, both grain-rich feeding models lowered ruminal pH and increased the duration of pH below 5.5 and 5.8. Results of the grass silage incubation showed that in the continuous model the extent of NDF and ADF degradation was lower in the 1st, but not in the last, week compared with the baseline. For the transient model, degradation of NDF of the silage was lower during the re-challenge compared with the baseline. Degradation of NDF and ADF of the hay was suppressed by both feeding models compared with the respective baseline. Changes in fiber degradation of either grass silage or hay were not related to the magnitude of ruminal pH depression during grain-rich feeding. In both feeding models, total fungal numbers and the relative abundance of Butyrivibrio fibrisolvens attached to the incubated forages were decreased by the challenge. Overall, Fibrobacter succinogenes was more sensitive to the grain challenge than Ruminococcus albus and Ruminococcus flavefaciens. The study provided evidence of restored ruminal fiber degradation after a prolonged period of grain-rich feeding, although the extent depended on the physical and chemical characteristics of the forages.

  3. Modelling challenges for battery materials and electrical energy storage

    NASA Astrophysics Data System (ADS)

    Muller, Richard P.; Schultz, Peter A.

    2013-10-01

    Many vital requirements in world-wide energy production, from the electrification of transportation to better utilization of renewable energy production, depend on developing economical, reliable batteries with improved performance characteristics. Batteries reduce the need for gasoline and liquid hydrocarbons in an electrified transportation fleet, but need to be lighter and longer-lived and have higher energy densities, without sacrificing safety. Lighter and higher-capacity batteries make portable electronics more convenient. Less expensive electrical storage accelerates the introduction of renewable energy to electrical grids by buffering intermittent generation from solar or wind. Meeting these needs will probably require dramatic changes in the materials and chemistry used by batteries for electrical energy storage. New simulation capabilities, in both methods and computational resources, promise to fundamentally accelerate and advance the development of improved materials for electric energy storage. To fulfil this promise, significant challenges remain, both in accurate simulations at the various relevant length scales and in the integration of relevant information across multiple length scales. This focus section of Modelling and Simulation in Materials Science and Engineering surveys the challenges of modelling for energy storage, describes recent successes, identifies remaining challenges, considers various approaches to surmount these challenges and discusses the potential of these methods for future battery development. Zhang et al. begin with atoms and electrons, with a review of first-principles studies of the lithiation of silicon electrodes, and then Fan et al. examine the development and use of interatomic potentials to study the mechanical properties of lithiated silicon in larger atomistic simulations. Marrocchelli et al. study ionic conduction, an important aspect of lithium-ion battery performance, as simulated by molecular dynamics. Emerging high-throughput methods allow rapid screening of promising new candidates for battery materials, illustrated for Li-ion olivine phosphates by Hajiyani et al. This collection includes descriptions of new techniques to model the chemistry at an electrode-electrolyte interface; Gunceler et al. demonstrate coupling an electronic description of the electrode chemistry with the fluid electrolyte in a joint density functional theory method. Bridging to longer length scales to probe mechanical properties and transport, Preiss et al. present a proof-of-concept phase field approach for a permeation model at an electrochemical interface, An and Jiang examine finite element simulations for transient deformation and transport in electrodes, and Haftabaradaran et al. study the application of an analytical model to investigate the critical thickness for fracture in thick-film electrodes. The focus section concludes with a study by Chung et al., which combines modelling and experiment, examining the validity of the Bruggeman relation for porous electrodes. All of the papers were peer-reviewed following the standard procedure established by the Editorial Board of Modelling and Simulation in Materials Science and Engineering.

  4. Understanding Rasch Measurement: Rasch Techniques for Detecting Bias in Performance Assessments: An Example Comparing the Performance of Native and Non-native Speakers on a Test of Academic English.

    ERIC Educational Resources Information Center

    Elder, Catherine; McNamara, Tim; Congdon, Peter

    2003-01-01

    Used Rasch analytic procedures to study item bias or differential item functioning in both dichotomous and scalar items on a test of English for academic purposes. Results for 139 college students on a pilot English language test model the approach and illustrate the measurement challenges posed by a diagnostic instrument to measure English…

  5. The Terrestrial Planet Finder coronagraph dynamics error budget

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart B.; Marchen, Luis; Green, Joseph J.; Lay, Oliver P.

    2005-01-01

    The Terrestrial Planet Finder Coronagraph (TPF-C) demands extreme wave front control and stability to achieve its goal of detecting earth-like planets around nearby stars. We describe the performance models and error budget used to evaluate image plane contrast and derive engineering requirements for this challenging optical system.

  6. Understanding Effective Program Improvement Schools through a Distributed Leadership Task Context Model

    ERIC Educational Resources Information Center

    Gipson, Frances Marie

    2012-01-01

    Federal, state, and local agencies face challenges in organizing resources to create the conditions necessary to create, sustain, and replicate effective high-performing schools. Knowing that leadership does impact achievement outcomes and that school districts tackle growing numbers of sanctioned Program Improvement schools, a distributed…

  7. Coordinating the Commons: Diversity & Dynamics in Open Collaborations

    ERIC Educational Resources Information Center

    Morgan, Jonathan T.

    2013-01-01

    The success of Wikipedia demonstrates that open collaboration can be an effective model for organizing geographically-distributed volunteers to perform complex, sustained work at a massive scale. However, Wikipedia's history also demonstrates some of the challenges that large, long-term open collaborations face: the core community of Wikipedia…

  8. Inviting Parents to the Middle: A Proactive Stance for Improving Student Performance.

    ERIC Educational Resources Information Center

    Peel, Henry A.; Foster, Elizabeth S.

    1993-01-01

    Explores the challenge of keeping parents involved in their children's education beyond the elementary school years. Suggests the invitational model as a proactive approach for keeping parents involved in the lives of their children, particularly during the transition years of middle school education. As background, discusses the nature of…

  9. Performance Evaluation and Accountability for School Psychologists: Challenges and Opportunities

    ERIC Educational Resources Information Center

    Morrison, Julie Q.

    2013-01-01

    The call for school psychologists to demonstrate accountability in the evaluation of services at the individual, group, and system levels comes at a time when school districts nationally are pursuing personnel evaluation models that link teachers' instructional practices to student achievement. School psychologists have an opportunity to take a…

  10. Success and challenges met during the calibration of APEX on large plots

    USDA-ARS?s Scientific Manuscript database

    As the APEX model is increasingly considered for the evaluation of agricultural systems, satisfactory performance of APEX on fields is critical. APEX was applied to 16 replicated large plots established in 1991 in Northeast Missouri. Until 2009, each phase of each rotation was represented every year...

  11. Lesson Study and Pedagogic Literacy in Initial Teacher Education: Challenging Reductive Models

    ERIC Educational Resources Information Center

    Cajkler, Wasyl; Wood, Phil

    2016-01-01

    This paper argues that teacher learning is not reducible to lists of "performative" standards. Funded by the Society for Educational Studies, we used "lesson study" as a vehicle to develop new teacher expertise, following which we concluded that conceptualising "learning to teach" as acquisition of standards is…

  12. Investigating the Impact of Lack of Motorcycle Annual Average Daily Traffic Data in Crash Modeling and the Estimation of Crash Modification Factors

    DOT National Transportation Integrated Search

    2016-10-01

    The development of safety performance functions (SPFs) and crash modification factors (CMFs) requires data on traffic exposure. The analysis of motorcycle crashes can be especially challenging in this regard because few jurisdictions collect motorcyc...

  13. Binding free energy predictions of farnesoid X receptor (FXR) agonists using a linear interaction energy (LIE) approach with reliability estimation: application to the D3R Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Rifai, Eko Aditya; van Dijk, Marc; Vermeulen, Nico P. E.; Geerke, Daan P.

    2018-01-01

    Computational protein binding affinity prediction can play an important role in drug research, but performing efficient and accurate binding free energy calculations is still challenging. In the context of phase 2 of the Drug Design Data Resource (D3R) Grand Challenge 2, we used our automated eTOX ALLIES approach to apply the (iterative) linear interaction energy (LIE) method and evaluated its performance in predicting binding affinities for farnesoid X receptor (FXR) agonists. Efficiency was obtained through our pre-calibrated LIE models and molecular dynamics (MD) simulations at the nanosecond scale, while predictive accuracy was obtained for a small subset of compounds. Using our recently introduced reliability estimation metrics, we could classify predictions with higher confidence by featuring an applicability domain (AD) analysis in combination with protein-ligand interaction profiling. The outcomes of, and agreement between, our AD and interaction-profile analyses highlighted the relevance of sufficiently exploring protein-ligand interactions during training, and demonstrated that whether this is achieved can be evaluated quantitatively and efficiently using simulation data only.
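
    For reference, the LIE estimate has the standard linear form; a minimal sketch with common literature default coefficients (not the study's calibrated eTOX ALLIES parameters):

        def lie_binding_free_energy(d_vdw, d_elec, alpha=0.18, beta=0.5, gamma=0.0):
            """Linear interaction energy estimate of binding free energy.

            d_vdw, d_elec: bound-minus-free differences in ensemble-averaged
            ligand-surrounding van der Waals and electrostatic energies.
            alpha, beta, gamma: empirically calibrated coefficients; the
            defaults here are common literature values, used as placeholders.
            """
            return alpha * d_vdw + beta * d_elec + gamma

        # Example: d_vdw = -20.0, d_elec = -8.0 (kcal/mol) gives -7.6 kcal/mol.
        print(lie_binding_free_energy(-20.0, -8.0))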

  14. Exploring predictive performance: A reanalysis of the geospace model transition challenge

    NASA Astrophysics Data System (ADS)

    Welling, D. T.; Anderson, B. J.; Crowley, G.; Pulkkinen, A. A.; Rastätter, L.

    2017-01-01

    The Pulkkinen et al. (2013) study evaluated the ability of five different geospace models to predict surface dB/dt as a function of upstream solar drivers. This was an important step in the assessment of research models for predicting and ultimately preventing the damaging effects of geomagnetically induced currents. Many questions remain concerning the capabilities of these models. This study presents a reanalysis of the Pulkkinen et al. (2013) results in an attempt to better understand the models' performance. The range of validity of the models is determined by examining the conditions corresponding to the empirical input data. It is found that the empirical conductance models on which global magnetohydrodynamic models rely are frequently used outside the limits of their input data. The prediction error for the models is sorted as a function of solar driving and geomagnetic activity. It is found that all models show a bias toward underprediction, especially during active times. These results have implications for future research aimed at improving operational forecast models.
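
    A minimal sketch of the kind of error sorting described, with synthetic predicted and observed dB/dt arrays standing in for the study's data; a negative binned bias corresponds to the underprediction reported:

        import numpy as np

        def binned_bias(pred, obs, activity, n_bins=4):
            """Mean prediction error (pred - obs) within activity quantile bins."""
            edges = np.quantile(activity, np.linspace(0, 1, n_bins + 1))
            idx = np.clip(np.digitize(activity, edges[1:-1]), 0, n_bins - 1)
            return [float(np.mean(pred[idx == b] - obs[idx == b]))
                    for b in range(n_bins)]

        rng = np.random.default_rng(0)
        obs = rng.lognormal(0.0, 1.0, 500)               # synthetic observed dB/dt
        pred = 0.7 * obs * rng.lognormal(0.0, 0.3, 500)  # a model that underpredicts
        print(binned_bias(pred, obs, activity=obs))      # more negative when active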

  15. Sorbent, Sublimation, and Icing Modeling Methods: Experimental Validation and Application to an Integrated MTSA Subassembly Thermal Model

    NASA Technical Reports Server (NTRS)

    Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.

    2010-01-01

    This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models outlined, and the results of those model correlations relayed. An assessment of the applicability of each modeling methodology to the challenge of simulating the response of the test articles, and its extensibility to a full-scale integrated subassembly model, is given. The independently verified and validated modeling methods are applied to the development of a MTSA subassembly prototype model, and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and the associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.
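
    As a back-of-envelope sketch of the SHX energy balance, assuming a heat-of-sublimation value of roughly 570 kJ/kg for CO2 (an approximate literature figure, not the paper's):

        def lco2_cooling_mass_rate(q_watts, latent_heat=5.7e5):
            """Mass flow (kg/s) of subliming LCO2 needed to absorb q_watts."""
            return q_watts / latent_heat

        # A hypothetical 300 W load would consume about 0.5 g/s of LCO2.
        print(lco2_cooling_mass_rate(300.0))   # ~5.3e-4 kg/s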

  16. The history of head transplantation: a review.

    PubMed

    Lamba, Nayan; Holsgrove, Daniel; Broekman, Marike L

    2016-12-01

    Since the turn of the last century, the prospect of head transplantation has captured the imagination of scientists and the general public. Recently, head transplant has regained attention in popular media, as neurosurgeons have proposed performing this procedure in 2017. Given the potential impact of such a procedure, we were interested in learning the history of the technical hurdles that need to be overcome, and in determining whether it is even technically possible to perform such a procedure on humans today. We conducted a historical review of the available literature on the technical challenges and developments of head transplantation. The many social, psychological, ethical, religious, cultural, and legal questions of head transplantation were beyond the scope of this review. Our historical review identified the following important technical considerations related to performing a head transplant: maintenance of blood flow to an isolated brain via vessel anastomosis; availability of immunosuppressive agents; spinal anastomosis and fusion following cord transection; and pain control in the recipient. Several animal studies have demonstrated success in maintaining recipient cerebral perfusion and achieving immunosuppression. However, there is currently sparse evidence in favor of successful spinal anastomosis and fusion after transection. While recent publications by an Italian group offer novel approaches to this challenge, research on this topic has been sparse and hinges on procedures performed in animal models in the 1970s. How transferable these older methods are to the human nervous system is unclear and warrants further exploration. Our review identified several important considerations related to performing a viable head transplantation. Besides the technical challenges that remain, there are important ethical issues to consider, such as exploitation of vulnerable patients and informed consent, which will also need to be addressed before moving these studies to the clinic.

  17. Modeling of Pressure Drop During Refrigerant Condensation in Pipe Minichannels

    NASA Astrophysics Data System (ADS)

    Sikora, Małgorzata; Bohdal, Tadeusz

    2017-12-01

    Investigation of refrigerant condensation in pipe minichannels is a challenging and complicated issue. Because of the multitude of influencing factors, mathematical and computer modeling is very important: it allows calculations to be performed for many different refrigerants under different flow conditions. The large number of experimental results published in the literature allows experimental verification of the correctness of such models. In this work, a mathematical model is presented for the calculation of flow resistance during condensation of refrigerants in a pipe minichannel. The model was developed on the basis of conservation equations. The calculation results were verified against the authors' own experimental results.

  18. Perspectives to performance of environment and health assessments and models--from outputs to outcomes?

    PubMed

    Pohjola, Mikko V; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T

    2013-06-26

    The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives on assessment and model performance in the scientific literature can be labeled: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness, and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict, and the methods, tools and frameworks in different perspectives may overlap. However, altogether it seems that most approaches to assessment and model performance are relatively narrow in scope. The focus in most approaches is on the outputs and the making of assessments and models. Practical application of the outputs and the consequent outcomes are often left unaddressed. It appears that more comprehensive approaches, combining the essential characteristics of the different perspectives, are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge.

  19. Calibration of X-Ray Observatories

    NASA Technical Reports Server (NTRS)

    Weisskopf, Martin C.; O'Dell, Stephen L.

    2011-01-01

    Accurate calibration of x-ray observatories has proved an elusive goal. Inaccuracies and inconsistencies amongst on-ground measurements, differences between on-ground and in-space performance, in-space performance changes, and the absence of cosmic calibration standards whose physics we truly understand have precluded absolute calibration better than several percent and relative spectral calibration better than a few percent. The philosophy "the model is the calibration" relies upon a complete high-fidelity model of performance and an accurate verification and calibration of this model. As high-resolution x-ray spectroscopy begins to play a more important role in astrophysics, additional issues in accurately calibrating at high spectral resolution become more evident. Here we review the challenges of accurately calibrating the absolute and relative response of x-ray observatories. On-ground x-ray testing by itself is unlikely to achieve a high-accuracy calibration of in-space performance, especially when the performance changes with time. Nonetheless, it remains an essential tool in verifying functionality and in characterizing and verifying the performance model. In the absence of verified cosmic calibration sources, we also discuss the notion of an artificial, in-space x-ray calibration standard.

  20. Controlled human infection models for vaccine development: Zika virus debate.

    PubMed

    Gopichandran, Vijayaprasad

    2018-01-01

    An ethics panel, convened by the National Institute of Health and other research bodies in the USA, disallowed researchers from the Johns Hopkins University and University of Vermont from performing controlled human infection of healthy volunteers to develop a vaccine against Zika virus infection. The members published their ethical analysis and recommendations in February 2017. They have elaborated on the risks posed by human challenge with Zika virus to the volunteers and other uninvolved third parties and have systematically analysed the social value of such a human challenge experiment. They have also posited some mandatory ethical requirements which should be met before allowing the infection of healthy volunteers with the Zika virus. This commentary elaborates on the debate on the ethics of the human challenge model for the development of a Zika virus vaccine and the role of systematic ethical analysis in protecting the interests of research participants. It further analyses the importance of this debate to the development of a Zika vaccine in India.

  1. Responding to home maintenance challenge scenarios: the role of selection, optimization, and compensation in aging-in-place.

    PubMed

    Kelly, Andrew John; Fausset, Cara Bailey; Rogers, Wendy; Fisk, Arthur D

    2014-12-01

    This study examined potential issues faced by older adults in managing their homes and their proposed solutions for overcoming hypothetical difficulties. Forty-four diverse, independently living older adults (66-85) participated in structured group interviews in which they discussed potential solutions to manage difficulties presented in four scenarios: perceptual, mobility, physical, and cognitive difficulties. The proposed solutions were classified using the Selection, Optimization, and Compensation (SOC) model. Participants indicated they would continue performing most tasks and reported a range of strategies to manage home maintenance challenges. Most participants reported that they would manage home maintenance challenges using compensation; the most frequently mentioned compensation strategy was using tools and technologies. There were also differences across the scenarios: Optimization was discussed most frequently with perceptual and cognitive difficulty scenarios. These results provide insights into supporting older adults' potential needs for aging-in-place and provide evidence of the value of the SOC model in applied research. © The Author(s) 2012.

  2. Parents' explanatory models and hopes for outcomes of occupational therapy using a sensory integration approach.

    PubMed

    Cohn, Ellen S; Kramer, Jessica; Schub, Jamie A; May-Benson, Teresa

    2014-01-01

    PURPOSE. To describe parents' concerns and hopes for their children who would be receiving occupational therapy using a sensory integration approach. METHOD. Content analysis of 275 parental responses to three open-ended questions on developmental-sensory history intake forms. FINDINGS. Parents' descriptions of why they sought occupational therapy for their children were categorized into four overarching concerns about their children's challenges: self-regulation, interacting with peers, participating in skilled motor activities, and self-confidence. Parents often linked these concerns together, revealing explanatory models of how they make sense of potential relationships among their children's challenges and how these challenges affect occupational performance. Parents hoped occupational therapy would help their children develop the self-understanding and frustration tolerance to self-regulate their behavior in socially acceptable ways. IMPLICATIONS. Assessment and intervention should explicitly focus on links among self-regulation, social participation, skills, and perceived competence to address parents' expectations. Copyright © 2014 by the American Occupational Therapy Association, Inc.

  3. Professional self-efficacy as a predictor of burnout and engagement: the role of challenge and hindrance demands.

    PubMed

    Ventura, Mercedes; Salanova, Marisa; Llorens, Susana

    2015-01-01

    The objective of the current study is to analyze the role of professional self-efficacy as a predictor of psychosocial well-being (i.e., burnout and engagement), following the Social Cognitive Theory of Albert Bandura (1997). Structural Equation Modeling was performed in samples of secondary school teachers (n = 460) and users of Information and Communication Technology (n = 596). Results show empirical support for the predictive role that professional self-efficacy plays in the perception of challenge (i.e., mental overload) and hindrance demands (i.e., role conflict, lack of control, and lack of social support), which are in turn related to burnout (i.e., an erosion process) and engagement (i.e., a motivational process). Specifically, employees with more professional self-efficacy will perceive more challenge demands and fewer hindrance demands, which will in turn relate to more engagement and less burnout. A multi-group analysis showed that the research model was invariant across both samples. Theoretical and practical implications are discussed.

  4. The General Ensemble Biogeochemical Modeling System (GEMS) and its applications to agricultural systems in the United States: Chapter 18

    USGS Publications Warehouse

    Liu, Shuguang; Tan, Zhengxi; Chen, Mingshi; Liu, Jinxun; Wein, Anne; Li, Zhengpeng; Huang, Shengli; Oeding, Jennifer; Young, Claudia; Verma, Shashi B.; Suyker, Andrew E.; Faulkner, Stephen P.

    2012-01-01

    The General Ensemble Biogeochemical Modeling System (GEMS) was designed to address uncertainty in two ways. First, rather than relying on an individual model, it uses multiple site-scale biogeochemical models to perform model simulations. Second, it adopts Monte Carlo ensemble simulations of each simulation unit (one site/pixel or group of sites/pixels with similar biophysical conditions) to incorporate uncertainties and variability (as measured by variances and covariances) of input variables into model simulations. In this chapter, we illustrate the applications of GEMS at the site and regional scales with an emphasis on incorporating agricultural practices. Challenges in modeling soil carbon dynamics and greenhouse gas emissions are also discussed.

  5. Users matter : multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.

  6. Dst Index in the 2008 GEM Modeling Challenge - Model Performance for Moderate and Strong Magnetic Storms

    NASA Technical Reports Server (NTRS)

    Rastaetter, Lutz; Kuznetsova, Maria; Hesse, Michael; Chulaki, Anna; Pulkkinen, Antti; Ridley, Aaron J.; Gombosi, Tamas; Vapirev, Alexander; Raeder, Joachim; Wiltberger, Michael James

    2010-01-01

    The GEM 2008 modeling challenge efforts are expanding beyond comparing in-situ measurements in the magnetosphere and ionosphere to include the computation of indices to be compared. The Dst index measures the largest deviations of the horizontal magnetic field at 4 equatorial magnetometers from the quiet-time background field and is commonly used to track the strength of the magnetic disturbance of the magnetosphere during storms. Models can calculate a proxy Dst index in various ways, including using the Dessler-Parker-Sckopke relation with the energy of the ring current, and Biot-Savart integration of electric currents in the magnetosphere. The GEM modeling challenge investigates 4 space weather events, and we compare models available at CCMC against each other and against the observed values of Dst. Models used include SWMF/BATSRUS, OpenGGCM, LFM, GUMICS (3D magnetosphere MHD models), Fok-RC, CRCM, RAM-SCB (kinetic drift models of the ring current), WINDMI (magnetosphere-ionosphere electric circuit model), and predictions based on an impulse response function (IRF) model and analytic coupling functions with inputs of solar wind data. In addition to the analysis of model-observation comparisons, we look at the way Dst is computed in global magnetosphere models. The default value of Dst computed by the SWMF model is Bz at the Earth's center. In addition to this, we present results obtained at different locations on the Earth's surface. We choose equatorial locations at local noon, dusk (18:00 hours), midnight, and dawn (6:00 hours). The different virtual observatory locations reveal the variation around the Earth-centered Dst value resulting from the distribution of electric currents in the magnetosphere during different phases of a storm.
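
    The Dessler-Parker-Sckopke relation mentioned above has a standard numerical form; a minimal sketch (assuming the usual equatorial surface field of about 31,000 nT behind the coefficient):

        def dps_dst_proxy(ring_current_energy_kev):
            """Dst proxy (nT) from total ring current particle energy (keV).

            Standard Dessler-Parker-Sckopke coefficient: about -3.98e-30 nT
            per keV of ring current energy.
            """
            return -3.98e-30 * ring_current_energy_kev

        # Roughly 2.5e31 keV (~4e15 J) of ring current energy gives a -100 nT storm.
        print(dps_dst_proxy(2.5e31))   # ~ -99.5 nT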

  7. Business process performance measurement: a structured literature review of indicators, measures and metrics.

    PubMed

    Van Looy, Amy; Shafagatova, Aygun

    2016-01-01

    Measuring the performance of business processes has become a central issue in both academia and business, since organizations are challenged to achieve effective and efficient results. Applying performance measurement models to this purpose ensures alignment with a business strategy, which implies that the choice of performance indicators is organization-dependent. Nonetheless, such measurement models generally suffer from a lack of guidance regarding the performance indicators that exist and how they can be concretized in practice. To fill this gap, we conducted a structured literature review to find patterns or trends in the research on business process performance measurement. The study also documents an extended list of 140 process-related performance indicators in a systematic manner by further categorizing them into 11 performance perspectives in order to gain a holistic view. Managers and scholars can consult the provided list to choose the indicators that are of interest to them, considering each perspective. The structured literature review concludes with avenues for further research.

  8. Effective Clinical Supervision in Substance Use Disorder Treatment Programs and Counselor Job Performance.

    PubMed

    Rothrauff-Laschober, Tanja C; Eby, Lillian Turner de Tormes; Sauer, Julia B

    2013-01-01

    When mental health counselors have limited and/or inadequate training in substance use disorders (SUDs), effective clinical supervision (ECS) may advance their professional development. The purpose of the current study was to investigate whether ECS is related to the job performance of SUD counselors. Data were obtained in person via paper-and-pencil surveys from 392 matched SUD counselor-clinical supervisor dyads working in 27 SUD treatment organizations across the United States. ECS was rated by counselors and measured with five multi-item scales (i.e., sponsoring counselors' careers, providing challenging assignments, role modeling, accepting/confirming counselors' competence, overall supervisor task proficiency). Clinical supervisors rated counselors' job performance, which was measured with two multi-item scales (i.e., task performance, performance within supervisory relationship). Using mixed-effects models, we found that most aspects of ECS are related to SUD counselor job performance. Thus, ECS may indeed enhance counselors' task performance and performance within the supervisory relationship, and, as a consequence, offset limited formal SUD training.

  9. Effective Clinical Supervision in Substance Use Disorder Treatment Programs and Counselor Job Performance

    PubMed Central

    2013-01-01

    When mental health counselors have limited and/or inadequate training in substance use disorders (SUDs), effective clinical supervision (ECS) may advance their professional development. The purpose of the current study was to investigate whether ECS is related to the job performance of SUD counselors. Data were obtained in person via paper-and-pencil surveys from 392 matched SUD counselor-clinical supervisor dyads working in 27 SUD treatment organizations across the United States. ECS was rated by counselors and measured with five multi-item scales (i.e., sponsoring counselors’ careers, providing challenging assignments, role modeling, accepting/confirming counselors’ competence, overall supervisor task proficiency). Clinical supervisors rated counselors’ job performance, which was measured with two multi-item scales (i.e., task performance, performance within supervisory relationship). Using mixed-effects models, we found that most aspects of ECS are related to SUD counselor job performance. Thus, ECS may indeed enhance counselors’ task performance and performance within the supervisory relationship, and, as a consequence, offset limited formal SUD training. PMID:25061265

  10. Development of a Human Motor Model for the Evaluation of an Integrated Alerting and Notification Flight Deck System

    NASA Technical Reports Server (NTRS)

    Daiker, Ron; Schnell, Thomas

    2010-01-01

    A human motor model was developed on the basis of performance data collected in a flight simulator. The motor model is under consideration as one component of a virtual pilot model for the evaluation of NextGen crew alerting and notification systems in flight decks. This model may be used in a digital Monte Carlo simulation to compare flight deck layout design alternatives. The virtual pilot model is being developed as part of a NASA project to evaluate multiple crew alerting and notification flight deck configurations. Model parameters were derived from empirical distributions of pilot data collected in a flight simulator experiment. The goal of this model is to simulate pilot motor performance in the approach-to-landing task. The unique challenges associated with modeling the complex dynamics of humans interacting with the cockpit environment are discussed, along with the current state and future direction of the model.
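
    A hedged sketch of how such a motor model might be used in a digital Monte Carlo comparison; the lognormal form and its parameters are illustrative assumptions, not the empirically derived distributions:

        import numpy as np

        rng = np.random.default_rng(7)

        def sample_movement_time(n, mu=-0.4, sigma=0.3):
            """Draw n reach-and-press movement times (seconds), lognormal."""
            return rng.lognormal(mu, sigma, n)

        # Monte Carlo comparison of two hypothetical flight deck layouts that
        # differ in mean motor cost per alert acknowledgment.
        layout_a = sample_movement_time(10_000)
        layout_b = sample_movement_time(10_000, mu=-0.25)
        print(layout_a.mean(), layout_b.mean())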

  11. Calibrating Building Energy Models Using Supercomputer Trained Machine Learning Agents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanyal, Jibonananda; New, Joshua Ryan; Edwards, Richard

    2014-01-01

    Building Energy Modeling (BEM) is an approach to model the energy usage in buildings for design and retrofit purposes. EnergyPlus is the flagship Department of Energy software that performs BEM for different types of buildings. The input to EnergyPlus can often extend to the order of a few thousand parameters, which have to be calibrated manually by an expert for realistic energy modeling. This makes calibration challenging and expensive, rendering building energy modeling infeasible for smaller projects. In this paper, we describe the Autotune research, which employs machine learning algorithms to generate agents for the different kinds of standard reference buildings in the U.S. building stock. The parametric space and the variety of building locations and types make this a challenging computational problem necessitating the use of supercomputers. Millions of EnergyPlus simulations are run on supercomputers and subsequently used to train machine learning algorithms to generate agents. These agents, once created, can then run in a fraction of the time, thereby allowing cost-effective calibration of building models.
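
    A minimal sketch of the surrogate-agent idea, with a synthetic response standing in for EnergyPlus runs (the real Autotune agents were trained on millions of supercomputer simulations):

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)

        # Hypothetical stand-in: X = sampled building input parameters,
        # y = simulated annual energy use from a synthetic response surface.
        X = rng.uniform(0.0, 1.0, size=(5000, 8))
        y = X @ rng.uniform(10.0, 50.0, 8) + 5.0 * np.sin(6.0 * X[:, 0])

        surrogate = RandomForestRegressor(n_estimators=200, n_jobs=-1).fit(X, y)

        # The trained agent scores candidate parameter sets in milliseconds,
        # so calibration can search the space without re-running the simulator.
        candidate = rng.uniform(0.0, 1.0, size=(1, 8))
        print(surrogate.predict(candidate))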

  12. Association of physical performance and biochemical profile of mice with intrinsic endurance swimming.

    PubMed

    Huang, Wen-Ching; Hsu, Yi-Ju; Wei, Li; Chen, Ying-Ju; Huang, Chi-Chang

    2016-01-01

    We aimed to investigate the potential mediators of, and relationships affecting, congenital exercise performance in an animal model with physical activity challenges, from physiological and biochemical perspectives. A total of 75 male ICR mice (5 weeks old) were adapted for 1 week, then performed a non-loaded exhaustive swimming test and were assigned to 3 groups by exhaustive swimming time: low exercise capacity (LEC) (<3 hr), medium exercise capacity (MEC) (3-5 hr), and high exercise capacity (HEC) (>5 hr). After a 1-week rest, the 3 groups of mice performed an exhaustive swimming test with a 5% and 7.5% weight load and a forelimb grip-strength test, with a 1-week rest between tests. Blood samples were collected immediately after an acute exercise challenge and at the end of the experiment (resting status) to evaluate biochemical blood variables and their relation to physical performance. Physical activity, including exhaustive swimming and grip strength, was greater for HEC mice than for the other groups. Swimming performance and grip strength between groups were moderately correlated (r=0.443, p<0.05). Resting serum ammonium level was moderately correlated with endurance under a 7.5% weight load (r=-0.447, p<0.05) and with lactate level (r=0.598, p<0.05). The pulmonary morphology of the HEC group seemed to indicate benefits for aerobic exercise. Mice showed congenital differences in exercise performance that were significantly correlated across different physical challenges and with biochemical variables. This study may have implications for understanding the intrinsic characteristics underlying exercise performance.

  13. NASA Langley Atmospheric Science Data Centers Near Real-Time Data Products

    NASA Astrophysics Data System (ADS)

    Davenport, T.; Parker, L.; Rinsland, P. L.

    2014-12-01

    Over the past decade, the Atmospheric Science Data Center (ASDC) at NASA Langley Research Center has archived and distributed a variety of satellite mission data sets. NASA's goal in Earth science is to observe, understand, and model the Earth system to discover how it is changing, to better predict change, and to understand the consequences for life on Earth. The ASDC has collaborated with Science Teams to accommodate emerging science users in the climate and modeling communities. The ASDC has expanded its original role to support operational usage by related Earth science satellites, land and ocean assimilations, field campaigns, outreach programs, and application projects for the agriculture and energy industries, bridging the gap between Earth science research results and the adoption of data and prediction capabilities for reliable and sustained use in Decision Support Systems (DSS). For example, these products are being used by the data assimilation community to regulate aerosol mass in global transport models to improve model response and forecast accuracy, to assess the performance of components of a global coupled atmosphere-ocean climate model, to improve the impact of atmospheric motion vectors (winds) on numerical weather prediction models, and to provide internet-based access to parameters specifically tailored to assist in the design of solar and wind powered renewable energy systems. These more focused applications often require Near Real-Time (NRT) products. Generating NRT products poses its own unique set of challenges for the ASDC and the Science Teams. Examples of ASDC NRT products and challenges will be discussed.

  14. Skin test sensitivity to mouse predicts allergic symptoms to nasal challenge in urban adults.

    PubMed

    Chong, Laura K; Ong, Mary Jane; Curtin-Brosnan, Jean; Matsui, Elizabeth C

    2010-01-01

    Epidemiologic studies have shown an association between mouse allergen exposure and asthma morbidity among urban populations, but confirmatory challenge studies in community populations have not been performed. This study was designed to examine the clinical relevance of mouse sensitization using a nasal challenge model. Forty-nine urban adults with asthma underwent skin-prick testing (SPT) and intradermal testing (IDT) with mouse epithelia extract. A positive SPT was defined as a net wheal size ≥3 mm, and a positive IDT as a net wheal size ≥6 mm, using a 1:100 dilution of extract (1:10 w/v extract was obtained from Greer Laboratories (Lenoir, NC) as a single lot [Mus m 1 concentration = 2130 ng/mL]). Mouse-specific IgE (m-IgE) was measured by ImmunoCAP (Phadia, Uppsala, Sweden). Nasal challenge was performed with increasing concentrations of mouse epithelia extract, and symptoms were assessed by visual analog scale. A positive challenge was defined as a 20-mm increase on the scale. The age range of the 49 participants was 18-50 years; 41% were men and 86% were black. Fourteen participants were SPT(+) to mouse, 15 were SPT(-) but IDT(+), and 20 were negative on both tests (SPT(-)/IDT(-)). Sixty-four percent of the SPT(+) group, 40% of the IDT(+) group, and 20% of the SPT(-)/IDT(-) group had a positive nasal challenge. Sixty-seven percent (10/15) of those who were either SPT(+) or m-IgE(+) had a positive nasal challenge. SPT, or the combination of SPT plus m-IgE, performed best in diagnosing mouse allergy. The great majority of mouse-sensitized urban adults with asthma appear to have clinically relevant sensitization. Urban adults with asthma should be evaluated for mouse sensitization using SPT or SPT plus m-IgE testing.

  15. Dynamic shear-lag model for understanding the role of matrix in energy dissipation in fiber-reinforced composites.

    PubMed

    Liu, Junjie; Zhu, Wenqing; Yu, Zhongliang; Wei, Xiaoding

    2018-07-01

    Designing lightweight composites with high impact performance is a major challenge for scientists and engineers. Inspired by well-known biological materials, e.g., bone, spider silk, and the claws of the mantis shrimp, artificial composites have been synthesized for engineering applications. Presently, the design of ballistic-resistant composites mainly emphasizes the utilization of light, high-strength fibers, whereas the contribution from matrix materials receives less attention. However, recent ballistic experiments on fiber-reinforced composites challenge this intuition: the use of a matrix with "low-grade" properties effectively enhances impact performance. In this study, we establish a dynamic shear-lag model to explore energy dissipation through viscous matrix materials in fiber-reinforced composites and the associations of energy dissipation characteristics with the properties and geometries of the constituents. The model suggests that an enhancement in energy dissipation before the material integrity is lost can be achieved by tuning the shear modulus and viscosity of the matrix. Furthermore, the model implies that an appropriately designed staggered microstructure, adopted by many natural composites, can repeatedly activate the energy dissipation process and thus dramatically improve impact performance. This model demonstrates the role of the matrix in energy dissipation and stimulates new advanced material design concepts for ballistic applications. Biological composites found in nature often possess exceptional mechanical properties that man-made materials have not been able to achieve; for example, it is predicted that a pencil-thick spider silk thread could stop a flying Boeing airplane. Here, by proposing a dynamic shear-lag model, we investigate the relationships between the impact performance of a composite and the dimensions and properties of its constituents. Our analysis suggests that the impact performance of fiber-reinforced composites could improve, surprisingly, with "low-grade" matrix materials, and that discontinuities (often regarded as "defects") may play an important role in energy dissipation. Counter-intuitive as it may seem, our work helps to explain the outstanding dynamic properties of some biological materials and inspires novel ideas for man-made composites. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
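
    One piece of the matrix-dissipation argument can be made concrete with the standard Kelvin-Voigt result for energy lost per loading cycle (a textbook relation, not the paper's full dynamic shear-lag solution):

        import numpy as np

        def viscous_dissipation_per_cycle(gamma0, omega, eta):
            """Energy dissipated per unit volume per cycle,
            W = pi * eta * omega * gamma0**2, for a Kelvin-Voigt matrix under
            sinusoidal shear gamma(t) = gamma0 * sin(omega * t).
            """
            return np.pi * eta * omega * gamma0**2

        # Illustrative numbers only: 1% shear amplitude, 1 kHz loading,
        # matrix viscosity 100 Pa*s.
        print(viscous_dissipation_per_cycle(0.01, 2 * np.pi * 1e3, 100.0))  # J/m^3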

  16. [Challenges in the organization of investigator initiated trials: in transplantation medicine].

    PubMed

    Schnitzbauer, A A; Lamby, P E; Mutzbauer, I; von Hassel, J; Geissler, E K; Schlitt, H J

    2011-03-01

    Transplantation medicine raises multiple translational questions which should preferably be converted into clinical evidence. The current gold standard for testing such questions and hypotheses is the prospective randomized controlled trial (RCT). Such trials should be performed independently of the medical industry to avoid conflicts of interest and to guarantee a strict scientific approach. A good model is the investigator initiated trial (IIT), in which academic institutions function as the sponsor and in which a scientific idea normally takes precedence over the marketing interests of a particular medical product. We present a model for an IIT which is sponsored and coordinated by Regensburg University Hospital at 45 sites in 13 nations (SiLVER study), highlight particular pitfalls of this study, and offer alternatives to this approach. Finances: financial support for clinical trials can be obtained from the medical industry. Alternatively, in Germany the Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung) offers annual grants. The expansion of financial support through foundations is desirable. Infrastructure: sponsorship within the pharmaceuticals act (Arzneimittelgesetz) demands excellent infrastructural conditions and a professional team to meet the clinical, logistic, regulatory, legal, and ethical challenges of an RCT. If a large trial has sufficient financial support, certain tasks can be outsourced and delegated to contract research organizations, coordinating centers for clinical trials, or partners in the medical industry. Clinical scientific advances to improve evidence are an enormous challenge when pursued as an IIT. However, academic sponsors can perform (international) IITs when certain rules are followed, and such trials should be regarded as the gold standard when scientific findings have to be established clinically.

  17. Decontaminate feature for tracking: adaptive tracking via evolutionary feature subset

    NASA Astrophysics Data System (ADS)

    Liu, Qiaoyuan; Wang, Yuru; Yin, Minghao; Ren, Jinchang; Li, Ruizhi

    2017-11-01

    Although various visual tracking algorithms have been proposed in the last 2-3 decades, effective tracking under fast motion, deformation, occlusion, etc., remains a challenging problem. Under complex tracking conditions, most tracking models are not discriminative and adaptive enough. When combined feature vectors are input to the visual models, this may lead to redundancy, causing low efficiency, and ambiguity, causing poor performance. An effective tracking algorithm is proposed to decontaminate features for each video sequence adaptively, where visual modeling is treated as an optimization problem from the perspective of evolution. Every feature vector is treated as a biological individual and then decontaminated via classical evolutionary algorithms. With the optimized subsets of features, the "curse of dimensionality" is avoided while the accuracy of the visual model is improved. The proposed algorithm has been tested on several publicly available datasets with various tracking challenges and benchmarked against a number of state-of-the-art approaches. The comprehensive experiments have demonstrated the efficacy of the proposed methodology.
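
    A hedged, generic sketch of evolutionary feature-subset selection (mutation and selection only, with a toy separability fitness; not the paper's specific algorithm or features):

        import numpy as np

        rng = np.random.default_rng(3)

        def fitness(mask, X, y):
            """Toy fitness: class separability of the selected features, minus
            a small per-feature penalty to discourage redundancy."""
            if not mask.any():
                return 0.0
            Xs = X[:, mask]
            mu0, mu1 = Xs[y == 0].mean(0), Xs[y == 1].mean(0)
            spread = Xs[y == 0].var(0).sum() + Xs[y == 1].var(0).sum() + 1e-9
            return float(((mu1 - mu0) ** 2).sum() / spread) - 0.01 * mask.sum()

        def evolve(X, y, pop=30, gens=40, p_mut=0.05):
            d = X.shape[1]
            population = rng.random((pop, d)) < 0.5        # random feature masks
            for _ in range(gens):
                scores = np.array([fitness(m, X, y) for m in population])
                parents = population[np.argsort(scores)[-pop // 2:]]  # selection
                kids = parents[rng.integers(0, len(parents), pop - len(parents))].copy()
                kids ^= rng.random(kids.shape) < p_mut     # bit-flip mutation
                population = np.vstack([parents, kids])
            scores = np.array([fitness(m, X, y) for m in population])
            return population[scores.argmax()]

        # Synthetic data: 2 informative features hidden among 20.
        X = rng.normal(size=(400, 20))
        y = (rng.random(400) < 0.5).astype(int)
        X[y == 1, :2] += 2.0
        print(np.flatnonzero(evolve(X, y)))   # ideally selects features 0 and 1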

  18. Comparing the Performance of Two Dynamic Load Distribution Methods

    NASA Technical Reports Server (NTRS)

    Kale, L. V.

    1987-01-01

    Parallel processing of symbolic computations on a message-passing multiprocessor presents one challenge: to effectively utilize the available processors, the load must be distributed uniformly to all the processors. However, the structure of these computations cannot be predicted in advance, so static scheduling methods are not applicable. In this paper, we compare the performance of two dynamic, distributed load balancing methods with extensive simulation studies. The two schemes are: the Contracting Within a Neighborhood (CWN) scheme proposed by us, and the Gradient Model proposed by Lin and Keller. We conclude that although simpler, the CWN is significantly more effective at distributing the work than the Gradient Model.
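
    Neither CWN nor the Gradient Model is reproduced here, but the family they belong to can be sketched with a plain nearest-neighbor diffusion rule on a ring of processors (a generic toy; both published schemes use more selective transfer rules):

        import numpy as np

        rng = np.random.default_rng(5)
        load = rng.integers(0, 100, 16).astype(float)   # uneven initial work

        for _ in range(50):
            left, right = np.roll(load, 1), np.roll(load, -1)
            # Each processor exchanges a fraction of its imbalance with neighbors.
            load += 0.25 * (left - load) + 0.25 * (right - load)

        print(load.round(1))   # loads converge toward the uniform average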

  19. Impact of Spatial Scales on the Intercomparison of Climate Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Wei; Steptoe, Michael; Chang, Zheng

    2017-01-01

    Scenario analysis has been widely applied in climate science to understand the impact of climate change on the future human environment, but intercomparison and similarity analysis of different climate scenarios based on multiple simulation runs remain challenging. Although spatial heterogeneity plays a key role in modeling climate and human systems, little research has been performed to understand the impact of spatial variations and scales on similarity analysis of climate scenarios. To address this issue, the authors developed a geovisual analytics framework that lets users perform similarity analysis of climate scenarios from the Global Change Assessment Model (GCAM) using a hierarchical clustering approach.
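
    A minimal sketch of the kind of similarity analysis described, assuming scenario outputs can be flattened into numeric vectors; the data here are synthetic, and neither GCAM outputs nor the framework's actual distance measures are reproduced:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Hypothetical scenario outputs: rows = scenarios, columns = gridded values
    scenarios = np.random.default_rng(1).normal(size=(8, 100))

    # Pairwise distances between scenarios, then agglomerative clustering
    dist = pdist(scenarios, metric="euclidean")
    tree = linkage(dist, method="average")
    labels = fcluster(tree, t=3, criterion="maxclust")  # cut into 3 groups
    print(labels)  # scenarios sharing a label are mutually most similar
    ```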

  20. Implementing the Rock Challenge: Teacher Perspectives on a Performing Arts Programme

    ERIC Educational Resources Information Center

    Jones, Mathew; Murphy, Simon; Salmon, Debra; Kimberlee, Richard; Orme, Judy

    2004-01-01

    The Rock Challenge is a school-based performing arts programme that aims to promote healthy lifestyles amongst secondary school students. This paper reports on teacher perspectives on the implementation of The Rock Challenge in nine English schools. This study highlights how performing arts programmes, such as The Rock Challenge, are unlikely to…

  1. COLLABORATE©, Part IV: Ramping Up Competency-Based Performance Management.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    This fourth part of the COLLABORATE© article series expands on and applies previously presented concepts pertaining to the COLLABORATE paradigm of professional case management practice, a model built upon a value-driven foundation. Primary practice setting(s): applicable to all health care sectors where case management is practiced. As an industry, health care continues to evolve. Terrain shifts and new influences continually surface to challenge professional case management practice. The need for top-performing and nimble professionals who are knowledgeable and proficient in the workplace continues to challenge human resource departments. In addition to care setting knowledge, professional case managers must continually invest in their practice competence toolbox to grow skills and abilities that transcend policies and processes. These individuals demonstrate agility in framing (and reframing) their professional practice to facilitate the best possible outcomes for their clients. Therefore, the continued emphasis on practice competence conveyed through the performance management cycle is an essential ingredient of performance management focused on customer service excellence and organizational improvement. Professional case management transcends professional disciplines, educational levels, and practice settings. Business objectives continue to drive work processes and priorities in many practice settings. However, competencies that align with regulatory and accreditation requirements should be the critical driver for consistent, high-quality case management practice. Although there is inherent value in what various disciplines bring to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities.

  2. Probability-based collaborative filtering model for predicting gene-disease associations.

    PubMed

    Zeng, Xiangxiang; Ding, Ningxiang; Rodríguez-Patón, Alfonso; Zou, Quan

    2017-12-28

    Accurately predicting pathogenic human genes has been challenging in recent research. Considering the extensive gene-disease data verified by biological experiments, we can apply computational methods to perform accurate predictions with reduced time and expense. We propose a probability-based collaborative filtering model (PCFM) to predict pathogenic human genes. Several kinds of data sets, containing data on humans and on other, nonhuman species, are integrated in our model. First, on the basis of a typical latent factorization model, we propose model I with an average heterogeneous regularization. Second, we develop a modified model II with personal heterogeneous regularization to enhance the accuracy of the aforementioned model. In this model, vector space similarity or Pearson correlation coefficient metrics and data on related species are also used. We compared the results of PCFM with those of four state-of-the-art approaches, and the results show that PCFM performs better. The PCFM model can be leveraged for predictions of disease genes, especially for new human genes or diseases with no known relationships.
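
    A minimal latent-factor sketch of the general approach: genes and diseases receive low-dimensional vectors whose inner products score associations, fitted by stochastic gradient descent on the known pairs, with plain L2 regularization standing in for the paper's average/personal heterogeneous regularizers. All dimensions, hyperparameters, and the toy data are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def factorize(R, mask, k=8, lam=0.05, lr=0.01, epochs=200):
        """Fit R ~ G @ D.T on observed entries only; lam is a plain L2
        penalty (a simplification of PCFM's heterogeneous regularizers)."""
        n_genes, n_dis = R.shape
        G = 0.1 * rng.normal(size=(n_genes, k))
        D = 0.1 * rng.normal(size=(n_dis, k))
        rows, cols = np.nonzero(mask)
        for _ in range(epochs):
            for i, j in zip(rows, cols):
                err = R[i, j] - G[i] @ D[j]
                G[i] += lr * (err * D[j] - lam * G[i])
                D[j] += lr * (err * G[i] - lam * D[j])
        return G @ D.T  # predicted association scores for every pair

    # Toy binary gene-disease matrix with a few verified associations
    R = np.zeros((20, 10)); mask = np.zeros_like(R)
    for i, j in [(0, 1), (3, 1), (5, 2), (7, 7), (12, 2)]:
        R[i, j] = 1.0; mask[i, j] = 1.0
    scores = factorize(R, mask)
    ```

    In practice one would also sample unobserved pairs as soft negatives; here only the verified entries drive the fit, so the sketch is purely structural.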

  3. A neural model of hierarchical reinforcement learning.

    PubMed

    Rasmussen, Daniel; Voelker, Aaron; Eliasmith, Chris

    2017-01-01

    We develop a novel, biologically detailed neural model of reinforcement learning (RL) processes in the brain. This model incorporates a broad range of biological features that pose challenges to neural RL, such as temporally extended action sequences, continuous environments involving unknown time delays, and noisy/imprecise computations. Most significantly, we expand the model into the realm of hierarchical reinforcement learning (HRL), which divides the RL process into a hierarchy of actions at different levels of abstraction. Here we implement all the major components of HRL in a neural model that captures a variety of known anatomical and physiological properties of the brain. We demonstrate the performance of the model in a range of different environments, in order to emphasize the aim of understanding the brain's general reinforcement learning ability. These results show that the model compares well to previous modelling work and demonstrates improved performance as a result of its hierarchical ability. We also show that the model's behaviour is consistent with available data on human hierarchical RL, and generate several novel predictions.

  4. Real-time individualization of the unified model of performance.

    PubMed

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited to mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects who underwent 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with a greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use on a mobile computing platform. © 2017 European Sleep Research Society.
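
    A sketch of the core idea, real-time parameter adaptation with an extended Kalman filter, using a deliberately simple two-parameter observation model in place of the UMP; the model, its Jacobian, and all numbers are hypothetical:

    ```python
    import numpy as np

    def ekf_update(theta, P, t, y, h, H_jac, R=1.0, Q=1e-4):
        """One EKF step: adapt parameter vector theta as a new performance
        observation y at time t arrives. h(theta, t) is the performance
        model; H_jac its parameter Jacobian (toy stand-in for the UMP)."""
        P = P + Q * np.eye(len(theta))          # random-walk parameter drift
        H = H_jac(theta, t).reshape(1, -1)      # 1 x n Jacobian row
        S = H @ P @ H.T + R                     # innovation variance
        K = (P @ H.T) / S                       # Kalman gain
        theta = theta + (K * (y - h(theta, t))).ravel()
        P = P - K @ H @ P
        return theta, P

    # Hypothetical 2-parameter fatigue model: lapses ~ a * hours_awake + b
    h = lambda th, t: th[0] * t + th[1]
    H_jac = lambda th, t: np.array([t, 1.0])

    theta, P = np.array([0.1, 2.0]), np.eye(2)
    for t, y in [(2, 2.4), (4, 2.9), (6, 3.5), (8, 4.2)]:  # PVT-like samples
        theta, P = ekf_update(theta, P, t, y, h, H_jac)
    print(theta)  # estimates drift toward the individual's response
    ```

    With a nonlinear h, only the Jacobian line changes; the update structure, which is what makes the approach cheap enough for a phone, stays the same.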

  5. Deep learning for healthcare: review, opportunities and challenges.

    PubMed

    Miotto, Riccardo; Wang, Fei; Wang, Shuang; Jiang, Xiaoqian; Dudley, Joel T

    2017-05-06

    Gaining knowledge and actionable insights from complex, high-dimensional and heterogeneous biomedical data remains a key challenge in transforming health care. Various types of data have been emerging in modern biomedical research, including electronic health records, imaging, -omics, sensor data and text, which are complex, heterogeneous, poorly annotated and generally unstructured. Traditional data mining and statistical learning approaches typically need to first perform feature engineering to obtain effective and more robust features from those data, and then build prediction or clustering models on top of them. Both steps pose substantial challenges in scenarios with complicated data and insufficient domain knowledge. The latest advances in deep learning technologies provide new, effective paradigms for obtaining end-to-end learning models from complex data. In this article, we review the recent literature on applying deep learning technologies to advance the health care domain. Based on the analyzed work, we suggest that deep learning approaches could be the vehicle for translating big biomedical data into improved human health. However, we also note limitations and needs for improved method development and application, especially regarding ease of understanding for domain experts and citizen scientists. We discuss such challenges and suggest developing holistic and meaningful interpretable architectures to bridge deep learning models and human interpretability. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

    Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools for integrating our understanding of complex soil systems, and the soil science community has often called for models that include a large number of these diverse processes. However, when such models have been developed, the community's response has not always been enthusiastic, especially once it became clear that these models are consequently highly complex: they require a large number of parameters, not all of which can be easily (or at all) measured or identified and which are often associated with large uncertainties, and they demand from their users deep knowledge of most of the implemented physical, mechanical, chemical, and biological processes. The real, or perceived, complexity of these models then discourages users from applying them even to relatively simple problems for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code intercomparison, likely the most suitable method for assessing code capabilities and model performance, requires the existence of multiple models with similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  7. Development of a computational model on the neural activity patterns of a visual working memory in a hierarchical feedforward Network

    NASA Astrophysics Data System (ADS)

    An, Soyoung; Choi, Woochul; Paik, Se-Bum

    2015-11-01

    Understanding the mechanism of information processing in the human brain remains a unique challenge because the nonlinear interactions between the neurons in the network are extremely complex and because controlling every relevant parameter during an experiment is difficult. Therefore, a simulation using simplified computational models may be an effective approach. In the present study, we developed a general model of neural networks that can simulate nonlinear activity patterns in the hierarchical structure of a neural network system. To test our model, we first examined whether our simulation could match the previously-observed nonlinear features of neural activity patterns. Next, we performed a psychophysics experiment for a simple visual working memory task to evaluate whether the model could predict the performance of human subjects. Our studies show that the model is capable of reproducing the relationship between memory load and performance and may contribute, in part, to our understanding of how the structure of neural circuits can determine the nonlinear neural activity patterns in the human brain.

  8. Real-time economic nonlinear model predictive control for wind turbine control

    NASA Astrophysics Data System (ADS)

    Gros, Sebastien; Schild, Axel

    2017-12-01

    Nonlinear model predictive control (NMPC) is a strong candidate for handling the control challenges emerging in the modern wind energy industry. Recent research suggests that wind turbine (WT) control based on economic NMPC (ENMPC) can improve closed-loop performance and simplify controller design compared to a classical NMPC approach. This paper establishes a formal relationship between the ENMPC controller and the classic NMPC approach, and compares their closed-loop nominal behaviour and performance empirically. The robustness of the performance is assessed under inaccurate modelling of the tower's main fore-aft frequency. Additionally, although a perfect wind preview is assumed here, the effect of a limited preview horizon of the wind speed via a LIght Detection And Ranging (LIDAR) sensor is investigated. Finally, the paper provides new algorithmic solutions for deploying ENMPC for WT control and reports improved computational times.
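
    To make the distinction concrete, the sketch below runs a toy economic MPC loop on a one-dimensional surrogate: instead of tracking a precomputed setpoint, the controller directly optimizes an economic objective (a power-capture proxy) over a receding horizon. The dynamics, objective, and bounds are illustrative and unrelated to the paper's wind turbine model.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy 1-D dynamics x_{k+1} = x_k + dt*(u_k - c*x_k); the economic
    # objective rewards captured "power" x_k*u_k directly rather than
    # penalizing deviation from a reference trajectory.
    dt, c, N = 0.1, 0.5, 10

    def rollout(x, u):
        xs = [x]
        for uk in u:
            xs.append(xs[-1] + dt * (uk - c * xs[-1]))
        return np.array(xs)

    def economic_cost(u, x):
        xs = rollout(x, u)
        return -np.sum(xs[:-1] * u) + 0.01 * np.sum(u ** 2)  # -power + effort

    x = 1.0
    for step in range(3):  # receding horizon: optimize, apply first input, repeat
        res = minimize(economic_cost, np.zeros(N), args=(x,),
                       bounds=[(0.0, 2.0)] * N)
        x = x + dt * (res.x[0] - c * x)
        print(step, round(res.x[0], 3), round(x, 3))
    ```

    A tracking NMPC would replace `economic_cost` with a quadratic penalty around a reference; the economic variant removes that intermediate setpoint-design step, which is the simplification the paper attributes to ENMPC.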

  9. The space shuttle launch vehicle aerodynamic verification challenges

    NASA Technical Reports Server (NTRS)

    Wallace, R. O.; Austin, L. D.; Hondros, J. G.; Surber, T. E.; Gaines, L. M.; Hamilton, J. T.

    1985-01-01

    The Space Shuttle aerodynamics and performance communities were challenged to verify the Space Shuttle vehicle (SSV) aerodynamics and system performance by flight measurements. Historically, launch vehicle flight test programs that faced these same challenges were unmanned, instrumented flights of simple aerodynamically shaped vehicles. The manned SSV flight test program, however, made these challenges more complex because of the unique aerodynamic configuration powered by the first man-rated solid rocket boosters (SRB). Analyses of flight data did not verify the preflight aerodynamic or performance predictions for the first flight of the Space Transportation System (STS-1). However, these analyses have defined the SSV aerodynamics and verified system performance. The aerodynamics community was also challenged to understand the discrepancy between the wind tunnel and flight-defined aerodynamics. The preflight analysis challenges, the aerodynamic extraction challenges, and the postflight analysis challenges that led to the SSV system performance verification, and that will lead to the verification of the operational ascent aerodynamics database, are presented.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ang; Song, Shuaiwen; Brugel, Eric

    To keep pace with Moore's Law, modern parallel machines have become increasingly complex, and effectively tuning application performance for these machines has become a daunting task. Moreover, identifying performance bottlenecks at the application and architecture levels, as well as evaluating various optimization strategies, becomes extremely difficult when numerous correlated factors are entangled. To tackle these challenges, we present a visual analytical model named "X". It is intuitive and sufficiently flexible to track all the typical features of a parallel machine.

  11. An Examination of Individual Performance Using Markov Models in the Hellenic Navy’s Officer-Performance Evaluation System

    DTIC Science & Technology

    2012-03-01

    similar to primary needs, but now emotions have replaced transmitted signals. In the 1940s, Maslow developed the needs-hierarchy theory. ... is the specific design to meet new challenges and realize our potential. McShane and Von Glinow state that, according to Maslow, we are ... ; under these circumstances, individuals seek constant personal development. In addition to Abraham Maslow's needs-hierarchy theory, a recently developed ...

  12. Challenges and opportunities of cloud computing for atmospheric sciences

    NASA Astrophysics Data System (ADS)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of its greatest advantages for scientific research is independence from access to a large cyberinfrastructure when funding or performing a research project. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are the computational cost of, and the uncertainty in, meteorological forecasting and climate projections, and the two are closely related: uncertainty can usually be reduced when computational resources are available to reproduce a phenomenon more faithfully or to perform a larger number of experiments. Here we present results of applying cloud computing resources to climate modelling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.

  13. Towards first principle medical diagnostics: on the importance of disease-disease and sign-sign interactions

    NASA Astrophysics Data System (ADS)

    Ramezanpour, Abolfazl; Mashaghi, Alireza

    2017-07-01

    A fundamental problem in medicine and biology is to assign states, e.g. healthy or diseased, to cells, organs or individuals. State assignment, or making a diagnosis, is often a nontrivial and challenging process and, with the advent of omics technologies, the diagnostic challenge is becoming more and more serious. The challenge lies not only in the increasing number of measured properties and dynamics of the system (e.g. cell or human body) but also in the co-evolution of multiple states, overlapping properties, and degeneracy of states. We develop, from first principles, a generic rational framework for state assignment in cell biology and medicine, and demonstrate its applicability with a few simple theoretical case studies from medical diagnostics. We show how disease-related statistical information can be used to build a comprehensive model that includes the relevant dependencies between clinical and laboratory findings (signs) and diseases. In particular, we include disease-disease and sign-sign interactions and study how one can infer the probability of a disease in a patient with given signs. We perform a comparative analysis with simple benchmark models to check the performance of our models, and find that including interactions can significantly change the statistical importance of the signs and diseases. This first-principles approach, as we show, facilitates the early diagnosis of disease by taking interactions into account, and enables the construction of consensus diagnostic flow charts. Additionally, we envision that our approach will find applications in systems biology, in particular in characterizing the phenome via the metabolome, the proteome, the transcriptome, and the genome.
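
    A toy version of this kind of model, with two diseases and three binary signs: state probabilities come from exponentiated negative energies, and the disease-disease and sign-sign coupling terms are exactly what a naive independent-sign model would omit. All weights are illustrative, not taken from the paper.

    ```python
    import numpy as np
    from itertools import product

    # Couplings between 2 diseases (rows) and 3 signs (columns),
    # plus one disease-disease and one sign-sign interaction term.
    w_ds = np.array([[2.0, 0.5, 0.0],     # disease 1 -> signs
                     [0.0, 1.5, 2.0]])    # disease 2 -> signs
    J_dd = -1.0    # disease-disease interaction (mutual suppression)
    J_ss = 0.8     # sign-sign interaction between signs 2 and 3

    def energy(d, s):
        e = -sum(w_ds[i, j] * d[i] * s[j] for i in range(2) for j in range(3))
        e -= J_dd * d[0] * d[1]
        e -= J_ss * s[1] * s[2]
        return e

    def p_disease_given_signs(signs):
        """Enumerate disease states and normalize exp(-energy)."""
        weights = {d: np.exp(-energy(d, signs))
                   for d in product([0, 1], repeat=2)}
        Z = sum(weights.values())
        return {d: w / Z for d, w in weights.items()}

    print(p_disease_given_signs((1, 1, 0)))  # P(d1, d2 | observed signs)
    ```

    Setting `J_dd` and `J_ss` to zero recovers the interaction-free benchmark, so the two numbers isolate exactly the effect the paper studies.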

  14. Aerodynamic Challenges for the Mars Science Laboratory Entry, Descent and Landing

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; Dyakonov, Artem; Buning, Pieter; Scallion, William; Norman, John Van

    2009-01-01

    An overview of several important aerodynamic challenges new to the Mars Science Laboratory (MSL) entry vehicle is presented. The MSL entry capsule is a 70-degree sphere-cone based on the original Mars Viking entry capsule. Due to payload and landing accuracy requirements, MSL will fly at the highest lift-to-drag ratio of any capsule sent to Mars (L/D = 0.24). The capsule will also fly a guided entry, performing bank maneuvers, a first for Mars entry. The system's mechanical design and increased performance requirements require an expansion of the MSL flight envelope beyond that of historical missions. In certain areas, the experience gained by Viking and other recent Mars missions can no longer be claimed as heritage information, and new analysis and testing are required to ensure the safe flight of the MSL entry vehicle. The challenge topics include: hypersonic gas chemistry and laminar-versus-turbulent flow effects on trim angle; a general risk assessment of flying at greater angles of attack than Viking; quantifying the aerodynamic interactions induced by a new reaction control system; and a risk assessment of recontact with a series of masses jettisoned prior to parachute deployment. An overview of the analysis and tests being conducted to understand and reduce risk in each of these areas is presented. The need for proper modeling and implementation of uncertainties for use in trajectory simulation has resulted in a revision of prior models and additional analysis for the MSL entry vehicle. The six degree-of-freedom uncertainty model and new analysis to quantify roll torque dispersions are presented.

  15. Computational challenges in atomic, molecular and optical physics.

    PubMed

    Taylor, Kenneth T

    2002-06-15

    Six challenges are discussed. These are the laser-driven helium atom; the laser-driven hydrogen molecule and hydrogen molecular ion; electron scattering (with ionization) from one-electron atoms; the vibrational and rotational structure of molecules such as H3+ and water at their dissociation limits; laser-heated clusters; and quantum degeneracy and Bose-Einstein condensation. The first four concern fundamental few-body systems where use of high-performance computing (HPC) is currently making accurate modelling from first principles possible. This leads to reliable predictions and support for laboratory experiment, as well as true understanding of the dynamics. Important aspects of these challenges addressable only via a terascale facility are set out. Such a facility makes the last two challenges in the above list meaningfully accessible for the first time, and the scientific interest, together with the prospective role for HPC in these, is emphasized.

  16. Implementing PAT with Standards

    NASA Astrophysics Data System (ADS)

    Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.

    2016-02-01

    Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent in inconsistent representation of business processes and in interoperability across cap-and-trade mechanisms such as PAT, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves, bringing more industrial sectors and industries into its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforesaid issues affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to advance them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation, for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, and 62325 combined), for information exchange. A detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.

  17. Validation of a national hydrological model

    NASA Astrophysics Data System (ADS)

    McMillan, H. K.; Booker, D. J.; Cattoën, C.

    2016-10-01

    Nationwide predictions of flow time-series are valuable for development of policies relating to environmental flows, calculating reliability of supply to water users, or assessing risk of floods or droughts. This breadth of model utility is possible because various hydrological signatures can be derived from simulated flow time-series. However, producing national hydrological simulations can be challenging due to strong environmental diversity across catchments and a lack of data available to aid model parameterisation. A comprehensive and consistent suite of test procedures to quantify spatial and temporal patterns in performance across various parts of the hydrograph is described and applied to quantify the performance of an uncalibrated national rainfall-runoff model of New Zealand. Flow time-series observed at 485 gauging stations were used to calculate Nash-Sutcliffe efficiency and percent bias when simulating between-site differences in daily series, between-year differences in annual series, and between-site differences in hydrological signatures. The procedures were used to assess the benefit of applying a correction to the modelled flow duration curve based on an independent statistical analysis. They were used to aid understanding of climatological, hydrological and model-based causes of differences in predictive performance by assessing multiple hypotheses that describe where and when the model was expected to perform best. As the procedures produce quantitative measures of performance, they provide an objective basis for model assessment that could be applied when comparing observed daily flow series with competing simulated flow series from any region-wide or nationwide hydrological model. Model performance varied in space and time with better scores in larger and medium-wet catchments, and in catchments with smaller seasonal variations. Surprisingly, model performance was not sensitive to aquifer fraction or rain gauge density.
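
    For reference, the two headline scores used here are straightforward to compute from paired observed/simulated series. A minimal sketch follows; note that sign conventions for percent bias vary between authors, so the one below is only one choice, and the flow values are made up.

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is perfect; 0 means no better
        than predicting the observed mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def percent_bias(obs, sim):
        """Percent bias: positive values indicate over-simulation here."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 100.0 * np.sum(sim - obs) / np.sum(obs)

    obs = [1.2, 3.4, 2.8, 5.1, 4.0]   # observed daily flows (hypothetical units)
    sim = [1.0, 3.0, 3.2, 4.6, 4.4]   # simulated flows
    print(nash_sutcliffe(obs, sim), percent_bias(obs, sim))
    ```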

  18. Multi-scale process and supply chain modelling: from lignocellulosic feedstock to process and products

    PubMed Central

    Hosseini, Seyed Ali; Shah, Nilay

    2011-01-01

    There is a large body of literature regarding the choice and optimization of different processes for converting feedstock to bioethanol and bio-commodities; moreover, there has been some reasonable technological development in bioconversion methods over the past decade. However, the eventual cost and other important metrics relating to sustainability of biofuel production will be determined not only by the performance of the conversion process, but also by the performance of the entire supply chain from feedstock production to consumption. Moreover, in order to ensure world-class biorefinery performance, both the network and the individual components must be designed appropriately, and allocation of resources over the resulting infrastructure must effectively be performed. The goal of this work is to describe the key challenges in bioenergy supply chain modelling and then to develop a framework and methodology to show how multi-scale modelling can pave the way to answer holistic supply chain questions, such as the prospects for second generation bioenergy crops. PMID:22482032

  19. Correction to Petrou, Demerouti, and Schaufeli (2015).

    PubMed

    2016-07-01

    Reports an error in "Job crafting in changing organizations: Antecedents and implications for exhaustion and performance" by Paraskevas Petrou, Evangelia Demerouti and Wilmar B. Schaufeli (Journal of Occupational Health Psychology, 2015[Oct], Vol 20[4], 470-480). In the article, there were misreported variables in one of the figures. The legend for Figure 1 should read "Tested SEM model. χ² = 76.50, df = 30, p = .000, CFI = 0.98, TLI = 0.92, GFI = .98, RMSEA = .05, RMR = .05; significant synchronous correlations are displayed without their coefficients for clarity purposes. *p ≤ .05. **p ≤ .01." (The following abstract of the original article appeared in record 2015-12642-001.) The present study addressed employee job crafting behaviors (i.e., seeking resources, seeking challenges, and reducing demands) in the context of organizational change. We examined predictors of job crafting both at the organizational level (i.e., perceived impact of the implemented changes on the working life of employees) and the individual level (i.e., employee willingness to follow the changes). Job crafting behaviors were expected to predict task performance and exhaustion. Two-wave longitudinal data from 580 police officers undergoing organizational changes were analyzed with structural equation modeling. Findings showed that the degree to which changes influence employees' daily work was linked to reducing demands and exhaustion, whereas employee willingness to change was linked to seeking resources and seeking challenges. Furthermore, while seeking resources and seeking challenges were associated with high task performance and low exhaustion respectively, reducing demands seemed to predict exhaustion positively. Our findings suggest that job crafting can act as a strategy of employees to respond to organizational change. While seeking resources and seeking challenges enhance employee adjustment and should be encouraged by managers, reducing demands seems to have unfavorable implications for employees. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    PubMed

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    Our objective was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We conducted a comprehensive survey of purity methods and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. UBCI can therefore potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  1. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    PubMed Central

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie PMID:29688379
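
    A minimal sketch of the winning idea, up-weighting the most important keywords of a verbose query, using IDF as a stand-in importance measure; the challenge system's actual weighting scheme and retrieval model are not reproduced, and all counts are invented:

    ```python
    import math
    from collections import Counter

    def idf_boosted_query(query_terms, doc_freq, n_docs, top_k=2, boost=2.0):
        """Rank query terms by IDF and up-weight the top few."""
        idf = {t: math.log((n_docs + 1) / (doc_freq.get(t, 0) + 1))
               for t in query_terms}
        boosted = sorted(idf, key=idf.get, reverse=True)[:top_k]
        return {t: (boost if t in boosted else 1.0) * idf[t]
                for t in query_terms}

    def score(doc_terms, weights):
        # Simple weighted term-frequency score for one document
        tf = Counter(doc_terms)
        return sum(w * tf[t] for t, w in weights.items())

    weights = idf_boosted_query(
        ["gene", "expression", "dataset", "human", "liver"],
        doc_freq={"gene": 900, "expression": 800, "dataset": 950,
                  "human": 990, "liver": 120},
        n_docs=1000)
    print(score(["liver", "gene", "expression", "samples"], weights))
    ```

    The rare, discriminative term ("liver") dominates the score, which is the behaviour the boosting is meant to produce on verbose dataset descriptions.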

  2. The Law of Categorical Judgment (Corrected) and the Interpretation of Changes in Psychophysical Performance

    ERIC Educational Resources Information Center

    Rosner, Burton S.; Kochanski, Greg

    2009-01-01

    Signal detection theory (SDT) makes the frequently challenged assumption that decision criteria have no variance. An extended model, the Law of Categorical Judgment, relaxes this assumption. The long accepted equation for the law, however, is flawed: It can generate negative probabilities. The correct equation, the Law of Categorical Judgment…

  3. A New Model for Teaching Ethical Behavior

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2009-01-01

    One can scarcely open the newspaper without finding examples of smart, well-educated people who have behaved in ethically challenged ways. What is frightening about ethical lapses is not that they happen to the ethically outrageous but that they can sneak up on just about everyone. An informal classroom "experiment" recently performed by this…

  4. The Evaluator's Role in Recommending Program Closure: A Model for Decision Making and Professional Responsibility

    ERIC Educational Resources Information Center

    Eddy, Rebecca M.; Berry, Tiffany

    2009-01-01

    Evaluators face challenges when programs consistently fail to meet expectations for performance or improvement and consequently, evaluators may recommend that closing a program is the most prudent course of action. However, the evaluation literature provides little guidance regarding when an evaluator might recommend program closure. Given…

  5. Cardiovascular Performance with E. coli Challenges in a Canine Model of Human Sepsis

    DTIC Science & Technology

    1988-01-01

    ... most severe peritonitis with volume loading. However, in response to volume, ... the largest third-space fluid loss (i.e., such animals should ...). Decreases in ESVI in infected dogs represent a greater third-space loss from peritoneal inflammation and/or a decrease in contractility. Further ...

  6. Learning: Meeting the Challenges of Older Adulthood.

    ERIC Educational Resources Information Center

    Wolf, Mary Alice

    Erik Erikson's model (1963, 1982) is most useful to an understanding of development and aging. He describes lifelong growth as related to tasks that must be performed. At each stage of life, times of stability are followed by developmental crises. Upon resolving the crisis, the individual can enjoy the particular beauty and security of that…

  7. An Overview on Evaluation of E-Learning/Training Response Time Considering Artificial Neural Networks Modeling

    ERIC Educational Resources Information Center

    Mustafa, Hassan M. H.; Tourkia, Fadhel Ben; Ramadan, Ramadan Mohamed

    2017-01-01

    The objective of this piece of research is to systematically interpret and investigate an observed brain functional phenomenon associated with the progression of e-learning processes. More specifically, this work addresses an interesting and challenging educational issue concerned with the dynamic evaluation of e-learning performance considering…

  8. Management and Leadership in UK Universities: Exploring the Possibilities of Change

    ERIC Educational Resources Information Center

    Waring, Matt

    2017-01-01

    This paper considers the case for reform of management structures in UK universities and offers proposals for change. The model of top-down, performance-led management that characterises many institutions is both outmoded and ill-suited to the challenges of an increasingly turbulent higher education sector. Drawing on the experiences of a…

  9. The Isolation of Motivational, Motoric, and Schedule Effects on Operant Performance: A Modeling Approach

    ERIC Educational Resources Information Center

    Brackney, Ryan J.; Cheung, Timothy H. C.; Neisewander, Janet L.; Sanabria, Federico

    2011-01-01

    Dissociating motoric and motivational effects of pharmacological manipulations on operant behavior is a substantial challenge. To address this problem, we applied a response-bout analysis to data from rats trained to lever press for sucrose on variable-interval (VI) schedules of reinforcement. Motoric, motivational, and schedule factors (effort…

  10. Comparison of Methods to Trace Multiple Subskills: Is LR-DBN Best?

    ERIC Educational Resources Information Center

    Xu, Yanbo; Mostow, Jack

    2012-01-01

    A long-standing challenge for knowledge tracing is how to update estimates of multiple subskills that underlie a single observable step. We characterize approaches to this problem by how they model knowledge tracing, fit its parameters, predict performance, and update subskill estimates. Previous methods allocated blame or credit among subskills…
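
    The snippet breaks off, but the blame-assignment problem it names can be illustrated on a conjunctive step (every subskill must fire for a correct answer): the sketch below Bayes-updates each subskill while holding the others at their current means. This is a simple baseline, not LR-DBN, and the guess/slip values are illustrative.

    ```python
    def update_subskills(ps, correct, guess=0.2, slip=0.1):
        """Per-subskill Bayes update for one observed step that depends on
        all subskills in ps (list of P(known) values)."""
        new = []
        for i, p in enumerate(ps):
            # Probability the *other* subskills let the step succeed
            others = 1.0
            for j, q in enumerate(ps):
                if j != i:
                    others *= q * (1 - slip) + (1 - q) * guess
            like_known = (1 - slip) * others    # P(correct | skill i known)
            like_unknown = guess * others       # P(correct | skill i unknown)
            if not correct:
                like_known, like_unknown = 1 - like_known, 1 - like_unknown
            post = like_known * p / (like_known * p + like_unknown * (1 - p))
            new.append(post)
        return new

    # A correct answer moves more credit to the less-certain subskill
    print(update_subskills([0.5, 0.9], correct=True))
    ```

    Methods like LR-DBN instead fit the subskill dependencies jointly; the point of the toy is only to show why naive equal credit/blame is not the right default.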

  11. Challenges in the development of chronic pulmonary hypertension models in large animals

    PubMed Central

    Rothman, Abraham; Wiencek, Robert G.; Davidson, Stephanie; Evans, William N.; Restrepo, Humberto; Sarukhanov, Valeri; Mann, David

    2017-01-01

    Pulmonary hypertension (PH) results in significant morbidity and mortality. Chronic PH animal models may advance the study of PH’s mechanisms, evolution, and therapy. In this report, we describe the challenges and successes in developing three models of chronic PH in large animals: two models (one canine and one swine) utilized repeated infusions of ceramic microspheres into the pulmonary vascular bed, and the third model employed a surgical aorto-pulmonary shunt. In the canine model, seven dogs underwent microsphere infusions that resulted in progressive elevation of pulmonary arterial pressure over a few months. In this model, pulmonary endoarterial tissue was obtained for histology. In the aorto-pulmonary shunt swine model, 17 pigs developed systemic level pulmonary pressures after 2–3 months. In this model, pulmonary endoarterial tissue was sequentially obtained to assess for changes in gene and microRNA expression. In the swine microsphere infusion model, three pigs developed only a modest chronic increase in pulmonary arterial pressure, despite repeated infusions of microspheres (up to 40 in one animal). The main purpose of this model was for vasodilator testing, which was performed successfully immediately after acute microsphere infusions. Chronic PH in large animal models can be successfully created; however, a model’s characteristics need to match the investigational goals. PMID:28680575

  12. A Community Health Worker "logic model": towards a theory of enhanced performance in low- and middle-income countries.

    PubMed

    Naimoli, Joseph F; Frymus, Diana E; Wuliji, Tana; Franco, Lynne M; Newsome, Martha H

    2014-10-02

    There has been a resurgence of interest in national Community Health Worker (CHW) programs in low- and middle-income countries (LMICs). A lack of strong research evidence persists, however, about the most efficient and effective strategies to ensure optimal, sustained performance of CHWs at scale. To facilitate learning and research to address this knowledge gap, the authors developed a generic CHW logic model that proposes a theoretical causal pathway to improved performance. The logic model draws upon available research and expert knowledge on CHWs in LMICs. Construction of the model entailed a multi-stage, inductive, two-year process. It began with the planning and implementation of a structured review of the existing research on community and health system support for enhanced CHW performance. It continued with a facilitated discussion of review findings with experts during a two-day consultation. The process culminated with the authors' review of consultation-generated documentation, additional analysis, and production of multiple iterations of the model. The generic CHW logic model posits that optimal CHW performance is a function of high quality CHW programming, which is reinforced, sustained, and brought to scale by robust, high-performing health and community systems, both of which mobilize inputs and put in place processes needed to fully achieve performance objectives. Multiple contextual factors can influence CHW programming, system functioning, and CHW performance. The model is a novel contribution to current thinking about CHWs. It places CHW performance at the center of the discussion about CHW programming, recognizes the strengths and limitations of discrete, targeted programs, and is comprehensive, reflecting the current state of both scientific and tacit knowledge about support for improving CHW performance. The model is also a practical tool that offers guidance for continuous learning about what works. Despite the model's limitations and several challenges in translating the potential for learning into tangible learning, the CHW generic logic model provides a solid basis for exploring and testing a causal pathway to improved performance.

  13. A comparison of WEC control strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, David G.; Bacelli, Giorgio; Coe, Ryan Geoffrey

    2016-04-01

    The operation of Wave Energy Converter (WEC) devices can pose many challenging problems for the water power community. A key research question is how to significantly improve the performance of these WEC devices through better control system design. This report summarizes an effort to analyze and improve the performance of WECs through the design and implementation of control systems. Controllers were selected to span the WEC control design space, with the aim of building a more comprehensive understanding of different controller capabilities and requirements. To design and evaluate these control strategies, a model-scale test-bed WEC was designed for both numerical and experimental testing (see Section 1.1). Seven control strategies have been developed and applied on a numerical model of the selected WEC. This model is capable of performing at a range of levels, spanning from a fully linear realization to varying levels of nonlinearity. The details of this model and its ongoing development are described in Section 1.2.

  14. The role of computer-aided 3D surgery and stereolithographic modelling for vector orientation in premaxillary and trans-sinusoidal maxillary distraction osteogenesis.

    PubMed

    Varol, Altan; Basa, Selçuk

    2009-06-01

    Maxillary distraction osteogenesis is a challenging procedure when performed with internal submerged distractors, because accurate distraction vectors must be set. Five patients with severe maxillary retrognathy were planned with Mimics 10.01 CMF and Simplant 10.01 software. Distraction vectors and distractor rods were arranged in a 3D environment and on stereolithographic (STL) models. All patients were operated on under general anaesthesia, and a complete Le Fort I downfracture was performed. All distractions were performed according to the oriented vectors. All patients achieved stable occlusion and a satisfactory aesthetic outcome at the end of the treatment period. Preoperative bending of internal maxillary distractors avoids significant loss of operating time. 3D computer-aided surgical simulation and model surgery provide accurate orientation of distraction vectors for premaxillary and internal trans-sinusoidal maxillary distraction. The combination of virtual surgical simulation and stereolithographic model surgery can be validated as an effective method of preoperative planning for complicated maxillofacial surgery cases.

  15. Practices and Processes of Leading High Performance Home Builders in the Upper Midwest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Von Thoma, E.; Ojczyk, C.

    2012-12-01

    The NorthernSTAR Building America Partnership team proposed this study to gain insight into the business, sales, and construction processes of successful high performance builders. The knowledge gained by understanding the high performance strategies used by individual builders, as well as the process each followed to move from traditional builder to high performance builder, will be beneficial in proposing more in-depth research to yield specific action items to assist the industry at large in transforming to high performance new home construction. This investigation identified the best practices of three successful high performance builders in the upper Midwest. In-depth field analysis of the performance levels of their homes, their business models, and their strategies for market acceptance was conducted. All three builders commonly seek ENERGY STAR certification on their homes and implement strategies that would allow them to meet the requirements of the Building America Builders Challenge program. Their desire for continuous improvement, willingness to seek outside assistance, and ambition to be leaders in their field are common themes. Problem solving to overcome challenges was accepted as part of doing business. It was concluded that crossing the gap from code-based building to high-performance building was a natural evolution for these leading builders.

  16. Why women perform better in college than admission scores would predict: Exploring the roles of conscientiousness and course-taking patterns.

    PubMed

    Keiser, Heidi N; Sackett, Paul R; Kuncel, Nathan R; Brothen, Thomas

    2016-04-01

    Women typically obtain higher subsequent college GPAs than men with the same admissions test score. A common reaction is to attribute this to a flaw in the admissions test. We explore the possibility that this underprediction of women's performance reflects gender differences in conscientiousness and college course-taking patterns. In Study 1, we focus on using the ACT to predict performance in a single, large course where performance is decomposed into cognitive components (exam and quiz scores) and less cognitive, discretionary components (discussion and extra credit points). The ACT does not underpredict women's cognitive performance, but it does underpredict their performance on the less cognitive, discretionary components of academic performance, because it fails to measure and account for the personality trait of conscientiousness. In Study 2, we create two course-difficulty indices (Course Challenge and Mean Aptitude in Course) and add them to an HLM regression model to see whether they reduce the degree to which SAT scores underpredict female performance. Including Course Challenge does result in a modest reduction of the gender coefficient; however, including Mean Aptitude in Course does not. Thus, differences in course-taking patterns are a partial (albeit small) explanation for the common finding of differential prediction by gender. (c) 2016 APA, all rights reserved.

  17. Investigation of Interference Models for RFID Systems.

    PubMed

    Zhang, Linchao; Ferrero, Renato; Gandino, Filippo; Rebaudengo, Maurizio

    2016-02-04

    The reader-to-reader collision in an RFID system is a challenging problem for communications technology. In order to model the interference between RFID readers, different interference models have been proposed, mainly based on two approaches: single and additive interference. The former only considers the interference from one reader within a certain range, whereas the latter takes into account the sum of all of the simultaneous interferences in order to emulate a more realistic behavior. Although the difference between the two approaches has been theoretically analyzed in previous research, their effects on the estimated performance of the reader-to-reader anti-collision protocols have not yet been investigated. In this paper, the influence of the interference model on the anti-collision protocols is studied by simulating a representative state-of-the-art protocol. The results presented in this paper highlight that the use of additive models, although more computationally intensive, is mandatory to improve the performance of anti-collision protocols.
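
    The two approaches differ only in whether interference powers accumulate. A minimal sketch with a power-law propagation stand-in (the threshold, exponent, and reader geometry are all illustrative):

    ```python
    import math

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def power(d, alpha=2.0):
        return d ** -alpha  # power-law path loss (illustrative)

    def collides_single(reader, others, threshold, radius):
        # Single-interference model: any one interferer that is in range
        # and strong enough causes a collision.
        return any(dist(reader, o) < radius and power(dist(reader, o)) > threshold
                   for o in others)

    def collides_additive(reader, others, threshold):
        # Additive model: sum the received power of all simultaneous interferers.
        return sum(power(dist(reader, o)) for o in others) > threshold

    reader = (0.0, 0.0)
    others = [(3.0, 0.0), (0.0, 3.0), (2.5, 2.5)]
    print(collides_single(reader, others, threshold=0.15, radius=4.0))  # False
    print(collides_additive(reader, others, threshold=0.15))            # True
    ```

    The example is chosen so that no single interferer exceeds the threshold but their sum does, which is precisely the case where the two models disagree about whether a collision occurs.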

  18. A wearable computing platform for developing cloud-based machine learning models for health monitoring applications.

    PubMed

    Patel, Shyamal; McGinnis, Ryan S; Silva, Ikaro; DiCristofaro, Steve; Mahadevan, Nikhil; Jortberg, Elise; Franco, Jaime; Martin, Albert; Lust, Joseph; Raj, Milan; McGrane, Bryan; DePetrillo, Paolo; Aranyosi, A J; Ceruolo, Melissa; Pindado, Jesus; Ghaffari, Roozbeh

    2016-08-01

    Wearable sensors have the potential to enable clinical-grade ambulatory health monitoring outside the clinic. Technological advances have enabled development of devices that can measure vital signs with great precision and significant progress has been made towards extracting clinically meaningful information from these devices in research studies. However, translating measurement accuracies achieved in the controlled settings such as the lab and clinic to unconstrained environments such as the home remains a challenge. In this paper, we present a novel wearable computing platform for unobtrusive collection of labeled datasets and a new paradigm for continuous development, deployment and evaluation of machine learning models to ensure robust model performance as we transition from the lab to home. Using this system, we train activity classification models across two studies and track changes in model performance as we go from constrained to unconstrained settings.
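
    A minimal sketch of the evaluation loop implied by that paradigm: train on labeled lab data, monitor accuracy as the distribution shifts toward home conditions, and retrain when it degrades. The data, features, and degradation threshold below are synthetic and illustrative, not the paper's system.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    # Hypothetical features from lab ("constrained") and home ("unconstrained")
    X_lab = rng.normal(0.0, 1.0, (500, 10)); y_lab = (X_lab[:, 0] > 0).astype(int)
    X_home = rng.normal(0.3, 1.2, (500, 10)); y_home = (X_home[:, 0] > 0).astype(int)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_lab, y_lab)
    lab_acc = accuracy_score(y_lab, clf.predict(X_lab))
    home_acc = accuracy_score(y_home, clf.predict(X_home))
    print(lab_acc, home_acc)

    # Continuous-deployment step: if home accuracy degrades past an
    # (illustrative) threshold, retrain with newly labeled home data.
    if home_acc < 0.9 * lab_acc:
        clf.fit(np.vstack([X_lab, X_home]), np.hstack([y_lab, y_home]))
    ```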

  19. A visual tracking method based on deep learning without online model updating

    NASA Astrophysics Data System (ADS)

    Tang, Cong; Wang, Yicheng; Feng, Yunsong; Zheng, Chao; Jin, Wei

    2018-02-01

    This paper proposes a visual tracking method based on deep learning without online model updating. In consideration of the advantages of deep learning in feature representation, the deep detection model SSD (Single Shot MultiBox Detector) is used as the object extractor in the tracking model. Simultaneously, the color histogram feature and the HOG (Histogram of Oriented Gradients) feature are combined to select the tracking object. During tracking, a multi-scale object searching map is built to improve the detection performance of the deep detection model and the tracking efficiency. In experiments on eight representative tracking video sequences from the baseline dataset, compared with six state-of-the-art methods, the proposed method is more robust to challenging factors such as deformation, scale variation, rotation, illumination variation, and background clutter; moreover, its overall performance is better than that of the other six tracking methods.
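
    A sketch of the feature-fusion step using OpenCV: a candidate patch is scored by a blend of color-histogram correlation and HOG cosine similarity against a stored template. The 50/50 blend, the bin counts, and the window size are illustrative; the paper's exact fusion and search strategy are not reproduced.

    ```python
    import cv2
    import numpy as np

    HOG = cv2.HOGDescriptor()  # default 64x128 detection window

    def describe(patch_bgr):
        patch = cv2.resize(patch_bgr, (64, 128))
        # 8x8x8-bin BGR color histogram, L2-normalized
        hist = cv2.calcHist([patch], [0, 1, 2], None, [8, 8, 8],
                            [0, 256, 0, 256, 0, 256])
        cv2.normalize(hist, hist)
        gray = cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY)
        return hist, HOG.compute(gray).ravel()

    def appearance_score(candidate_bgr, tmpl_hist, tmpl_hog):
        hist, hog = describe(candidate_bgr)
        color_sim = cv2.compareHist(tmpl_hist, hist, cv2.HISTCMP_CORREL)
        hog_sim = float(np.dot(tmpl_hog, hog) /
                        (np.linalg.norm(tmpl_hog) * np.linalg.norm(hog) + 1e-9))
        return 0.5 * color_sim + 0.5 * hog_sim  # illustrative 50/50 fusion
    ```

    In a tracker, `appearance_score` would rank the candidate boxes produced by the detector, with the highest-scoring box selected as the target.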

  20. PBPK models for the prediction of in vivo performance of oral dosage forms.

    PubMed

    Kostewicz, Edmund S; Aarons, Leon; Bergstrand, Martin; Bolger, Michael B; Galetin, Aleksandra; Hatley, Oliver; Jamei, Masoud; Lloyd, Richard; Pepin, Xavier; Rostami-Hodjegan, Amin; Sjögren, Erik; Tannergren, Christer; Turner, David B; Wagner, Christian; Weitschies, Werner; Dressman, Jennifer

    2014-06-16

    Drug absorption from the gastrointestinal (GI) tract is a highly complex process dependent upon numerous factors including the physicochemical properties of the drug, characteristics of the formulation and interplay with the underlying physiological properties of the GI tract. The ability to accurately predict oral drug absorption during drug product development is becoming more relevant given the current challenges facing the pharmaceutical industry. Physiologically-based pharmacokinetic (PBPK) modeling provides an approach that enables the plasma concentration-time profiles to be predicted from preclinical in vitro and in vivo data and can thus provide a valuable resource to support decisions at various stages of the drug development process. Whilst there have been quite a few successes with PBPK models identifying key issues in the development of new drugs in vivo, there are still many aspects that need to be addressed in order to maximize the utility of the PBPK models to predict drug absorption, including improving our understanding of conditions in the lower small intestine and colon, taking the influence of disease on GI physiology into account and further exploring the reasons behind population variability. Importantly, there is also a need to create more appropriate in vitro models for testing dosage form performance and to streamline data input from these into the PBPK models. As part of the Oral Biopharmaceutical Tools (OrBiTo) project, this review provides a summary of the current status of PBPK models available. The current challenges in PBPK set-ups for oral drug absorption including the composition of GI luminal contents, transit and hydrodynamics, permeability and intestinal wall metabolism are discussed in detail. Further, the challenges regarding the appropriate integration of results from in vitro models, such as consideration of appropriate integration/estimation of solubility and the complexity of the in vitro release and precipitation data, are also highlighted as important steps to advancing the application of PBPK models in drug development. It is expected that the "innovative" integration of in vitro data from more appropriate in vitro models and the enhancement of the GI physiology component of PBPK models, arising from the OrBiTo project, will lead to a significant enhancement in the ability of PBPK models to successfully predict oral drug absorption and advance their role in preclinical and clinical development, as well as for regulatory applications. Copyright © 2013 Elsevier B.V. All rights reserved.
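
    At its simplest, the plasma concentration-time profile that a full PBPK model predicts can be illustrated with a one-compartment, first-order absorption model (the Bateman equation). This is a deliberately reduced cousin of PBPK, with all parameter values hypothetical:

    ```python
    import numpy as np

    def plasma_conc(t, dose=100.0, F=0.8, ka=1.2, ke=0.2, V=50.0):
        """One-compartment oral model: first-order absorption (ka) and
        elimination (ke); F = bioavailable fraction, V = volume of
        distribution. Requires ka != ke."""
        return (F * dose * ka / (V * (ka - ke))) * \
               (np.exp(-ke * t) - np.exp(-ka * t))

    t = np.linspace(0.0, 24.0, 9)          # hours after the oral dose
    print(np.round(plasma_conc(t), 3))     # concentration-time profile
    ```

    A PBPK model replaces this single lumped compartment with physiologically parameterized organs and GI segments, which is exactly where the review's open challenges (luminal composition, transit, permeability, gut-wall metabolism) enter.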

  1. Interactions between soil thermal and hydrological dynamics in the response of Alaska ecosystems to fire disturbance

    USGS Publications Warehouse

    Yi, Shuhua; McGuire, A. David; Harden, Jennifer; Kasischke, Eric; Manies, Kristen L.; Hinzman, Larry; Liljedahl, Anna K.; Randerson, J.; Liu, Heping; Romanovsky, Vladimir E.; Marchenko, Sergey S.; Kim, Yongwon

    2009-01-01

    Soil temperature and moisture are important factors that control many ecosystem processes. However, interactions between soil thermal and hydrological processes are not adequately understood in cold regions, where the frozen soil, fire disturbance, and soil drainage play important roles in controlling interactions among these processes. These interactions were investigated with a new ecosystem model framework, the dynamic organic soil version of the Terrestrial Ecosystem Model, that incorporates an efficient and stable numerical scheme for simulating soil thermal and hydrological dynamics within soil profiles that contain a live moss horizon, fibrous and amorphous organic horizons, and mineral soil horizons. The performance of the model was evaluated for a tundra burn site that had both preburn and postburn measurements, two black spruce fire chronosequences (representing space-for-time substitutions in well and intermediately drained conditions), and a poorly drained black spruce site. Although space-for-time substitutions present challenges in model-data comparison, the model demonstrates substantial ability in simulating the dynamics of evapotranspiration, soil temperature, active layer depth, soil moisture, and water table depth in response to both climate variability and fire disturbance. Several differences between model simulations and field measurements identified key challenges for evaluating/improving model performance that include (1) proper representation of discrepancies between air temperature and ground surface temperature; (2) minimization of precipitation biases in the driving data sets; (3) improvement of the measurement accuracy of soil moisture in surface organic horizons; and (4) proper specification of organic horizon depth/properties, and soil thermal conductivity.

  2. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Keller, J.; Wallen, R.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  3. A Distributed Platform for Global-Scale Agent-Based Models of Disease Transmission

    PubMed Central

    Parker, Jon; Epstein, Joshua M.

    2013-01-01

    The Global-Scale Agent Model (GSAM) is presented. The GSAM is a high-performance distributed platform for agent-based epidemic modeling capable of simulating a disease outbreak in a population of several billion agents. It is unprecedented in its scale, its speed, and its use of Java. Solutions to multiple challenges inherent in distributing massive agent-based models are presented. Communication, synchronization, and memory usage are among the topics covered in detail. The memory usage discussion is Java specific. However, the communication and synchronization discussions apply broadly. We provide benchmarks illustrating the GSAM’s speed and scalability. PMID:24465120
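
    As a point of reference for the modeling style only (the GSAM itself is a distributed Java platform with explicit communication, synchronization, and memory-management layers), a single-process agent-based SIR step fits in a few lines of Python; all rates below are hypothetical.

    ```python
    import random

    N, CONTACTS, P_INFECT, P_RECOVER = 100_000, 10, 0.03, 0.1
    state = ["S"] * N
    for i in random.sample(range(N), 50):      # seed initial infections
        state[i] = "I"

    def step(state):
        infected = [i for i, s in enumerate(state) if s == "I"]
        for i in infected:
            # each infectious agent meets CONTACTS random agents
            for j in random.choices(range(N), k=CONTACTS):
                if state[j] == "S" and random.random() < P_INFECT:
                    state[j] = "I"
            if random.random() < P_RECOVER:
                state[i] = "R"

    for day in range(5):
        step(state)
        print(day, state.count("S"), state.count("I"), state.count("R"))
    ```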

  4. [Research progress on real-time deformable models of soft tissues for surgery simulation].

    PubMed

    Xu, Shaoping; Liu, Xiaoping; Zhang, Hua; Luo, Jie

    2010-04-01

    Biological tissues generally exhibit nonlinear, anisotropic, quasi-incompressible, and viscoelastic material properties. Simulating the behaviour of elastic objects in real time is one of the current objectives of virtual surgery simulation, and accurately depicting the behaviour of human tissues remains a challenge. In this paper, we present a classification of the different deformable models that have been developed and discuss the advantages and disadvantages of each. Finally, we compare the deformable models and evaluate the state of the art and the future of the field.

  5. Computational modeling of electromechanical instabilities in dielectric elastomers (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Park, Harold

    2016-04-01

    Dielectric elastomers are a class of soft, active materials that have recently gained significant interest because they can be electrostatically actuated into undergoing extremely large deformations. An ongoing challenge has been the development of robust and accurate computational models for elastomers, particularly those that can capture the electromechanical instabilities that limit elastomer performance, such as creasing, wrinkling, and snap-through. I discuss in this work a recently developed finite element model for elastomers that is dynamic, nonlinear, and fully electromechanically coupled. The model also significantly alleviates the volumetric locking that arises from the incompressible nature of the elastomers, and incorporates viscoelasticity within a finite deformation framework. Numerical examples are shown that demonstrate the performance of the proposed method in capturing electromechanical instabilities (snap-through, creasing, cratering, wrinkling) that have been observed experimentally.

  6. Fingerstroke time estimates for touchscreen-based mobile gaming interaction.

    PubMed

    Lee, Ahreum; Song, Kiburm; Ryu, Hokyoung Blake; Kim, Jieun; Kwon, Gyuhyun

    2015-12-01

    The growing popularity of gaming applications and ever-faster mobile carrier networks have called attention to an intriguing issue that is closely related to command input performance. A challenging example is the mirroring game service, which simultaneously provides a game to both PC and mobile phone users, allowing them to play against each other with very different control interfaces. For efficient mobile game design, it is thus essential to apply a new predictive model for measuring how touch input compares to the PC interfaces. The present study empirically tests the keystroke-level model (KLM) for predicting the time performance of basic interaction controls on the touch-sensitive smartphone interface (i.e., tapping, pointing, dragging, and flicking). A modified KLM, tentatively called the fingerstroke-level model (FLM), is proposed using time estimates from regression models. Copyright © 2015 Elsevier B.V. All rights reserved.
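
    The core of any KLM-style model is simple: decompose a task into elementary operators and sum their unit times. A minimal sketch follows; the FLM's actual operator times come from the paper's regression models, so the values below are purely hypothetical placeholders.

    ```python
    # KLM/FLM-style prediction: task time = sum of unit operator times.
    # Unit times below are hypothetical, not the paper's estimates.
    unit_time_s = {"tap": 0.30, "point": 0.45, "drag": 0.70, "flick": 0.25}

    def predict_time(sequence):
        return sum(unit_time_s[op] for op in sequence)

    combo = ["point", "tap", "drag", "flick", "tap"]
    print(f"predicted completion time: {predict_time(combo):.2f} s")
    ```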

  7. Dynamic energy models and carbon mitigation policies

    NASA Astrophysics Data System (ADS)

    Tilley, Luke A.

    In this dissertation I examine a specific class of energy models and their implications for carbon mitigation policies. The class of models includes a production function capable of reproducing the empirically observed short-run rigidity and long-run flexibility of energy use in response to energy price changes. I use a theoretical model, parameterized using empirical data, to simulate economic performance under several tax regimes in which taxes are levied on capital income, investment, and energy. I also investigate transitions from one tax regime to another. I find that energy taxes intended to reduce energy use can achieve those goals with minimal or even positive impacts on macroeconomic performance. But the transition paths to new steady states are lengthy, making political commitment to such policies very challenging.

  8. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.

  9. Team Resilience as a Second-Order Emergent State: A Theoretical Model and Research Directions

    PubMed Central

    Bowers, Clint; Kreutzer, Christine; Cannon-Bowers, Janis; Lamb, Jerry

    2017-01-01

    Resilience has been recognized as an important phenomenon for understanding how individuals overcome difficult situations. However, it is not only individuals who face difficulties; it is not uncommon for teams to experience adversity. When they do, they must be able to overcome these challenges without performance decrements. This manuscript presents a theoretical model that might be helpful in conceptualizing this important construct. Specifically, it describes team resilience as a second-order emergent state. We also include research propositions that follow from the model. PMID:28861013

  10. Performance assessment of a compressive sensing single-pixel imaging system

    NASA Astrophysics Data System (ADS)

    Du Bosq, Todd W.; Preece, Bradley L.

    2017-04-01

    Conventional sensors measure the light incident at each pixel in a focal plane array. Compressive sensing (CS) involves capturing a smaller number of unconventional measurements from the scene, and then using a companion process to recover the image. CS has the potential to acquire imagery with equivalent information content to a large format array while using smaller, cheaper, and lower bandwidth components. However, the benefits of CS do not come without compromise. The CS architecture chosen must effectively balance between physical considerations, reconstruction accuracy, and reconstruction speed to meet operational requirements. Performance modeling of CS imagers is challenging due to the complexity and nonlinearity of the system and reconstruction algorithm. To properly assess the value of such systems, it is necessary to fully characterize the image quality, including artifacts and sensitivity to noise. Imagery of a two-handheld-object target set was collected using a shortwave infrared single-pixel CS camera at various ranges and numbers of processed measurements. Human perception experiments were performed to determine the identification performance within the trade space. The performance of the nonlinear CS camera was modeled by mapping the nonlinear degradations to an equivalent linear shift invariant model. Finally, the limitations of CS modeling techniques are discussed.
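
    The linear core of compressive sensing is easy to sketch: take far fewer measurements than unknowns, y = Ax, then recover a sparse x by L1-regularized optimization. The toy below uses random Gaussian measurements and plain ISTA as a generic stand-in; the single-pixel camera's actual measurement patterns and reconstruction algorithm differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 256, 80, 8                      # unknowns, measurements, sparsity
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
    A = rng.normal(size=(m, n)) / np.sqrt(m)  # random measurement matrix
    y = A @ x_true                            # compressive measurements

    # ISTA: gradient step on ||Ax - y||^2, then soft threshold (L1 prior)
    L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant
    lam, x = 0.01, np.zeros(n)
    for _ in range(500):
        g = x - (A.T @ (A @ x - y)) / L
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)

    err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
    print(f"relative reconstruction error: {err:.3f}")
    ```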

  11. Application of seemingly unrelated regression in medical data with intermittently observed time-dependent covariates.

    PubMed

    Keshavarzi, Sareh; Ayatollahi, Seyyed Mohammad Taghi; Zare, Najaf; Pakfetrat, Maryam

    2012-01-01

    BACKGROUND. In many studies with longitudinal data, time-dependent covariates can only be measured intermittently (not at all observation times), and this presents difficulties for standard statistical analyses. This situation is common in medical studies, and methods that deal with this challenge would be useful. METHODS. In this study, we applied seemingly unrelated regression (SUR)-based models, with respect to each observation time in longitudinal data with intermittently observed time-dependent covariates, and compared these models with mixed-effect regression models (MRMs) under three classic imputation procedures. Simulation studies were performed to compare the sample-size properties of the estimated coefficients for different modeling choices. RESULTS. In general, the proposed models showed good performance in the presence of intermittently observed time-dependent covariates. However, when we considered only the observed values of the covariate without any imputation, the resulting biases were greater. The proposed SUR-based models and MRM with classic imputation methods performed similarly, with approximately equal amounts of bias and MSE. CONCLUSION. The simulation study suggests that the SUR-based models work as efficiently as MRM in the case of intermittently observed time-dependent covariates. Thus, they can be used as an alternative to MRM.
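
    For readers unfamiliar with SUR: it estimates several regression equations jointly, exploiting correlation between their error terms via feasible GLS. Below is a minimal two-equation sketch on synthetic data; it shows the generic estimator only, not the paper's time-point-wise longitudinal setup.

    ```python
    import numpy as np
    from scipy.linalg import block_diag

    rng = np.random.default_rng(1)
    n = 200
    X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
    X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
    E = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=n)
    y1 = X1 @ [1.0, 2.0] + E[:, 0]     # errors correlated across equations
    y2 = X2 @ [-1.0, 0.5] + E[:, 1]

    # Stage 1: per-equation OLS residuals estimate the error covariance
    b1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
    b2 = np.linalg.lstsq(X2, y2, rcond=None)[0]
    R = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])
    Sigma = R.T @ R / n

    # Stage 2: feasible GLS on the stacked system, Cov = Sigma kron I_n
    X = block_diag(X1, X2)
    y = np.concatenate([y1, y2])
    W = np.kron(np.linalg.inv(Sigma), np.eye(n))
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    print("SUR estimates:", beta.round(3))   # ~ [1.0, 2.0, -1.0, 0.5]
    ```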

  12. Bayesian techniques for analyzing group differences in the Iowa Gambling Task: A case study of intuitive and deliberate decision-makers.

    PubMed

    Steingroever, Helen; Pachur, Thorsten; Šmíra, Martin; Lee, Michael D

    2018-06-01

    The Iowa Gambling Task (IGT) is one of the most popular experimental paradigms for comparing complex decision-making across groups. Most commonly, IGT behavior is analyzed using frequentist tests to compare performance across groups, and to compare inferred parameters of cognitive models developed for the IGT. Here, we present a Bayesian alternative based on Bayesian repeated-measures ANOVA for comparing performance, and a suite of three complementary model-based methods for assessing the cognitive processes underlying IGT performance. The three model-based methods involve Bayesian hierarchical parameter estimation, Bayes factor model comparison, and Bayesian latent-mixture modeling. We illustrate these Bayesian methods by applying them to test the extent to which differences in intuitive versus deliberate decision style are associated with differences in IGT performance. The results show that intuitive and deliberate decision-makers behave similarly on the IGT, and the modeling analyses consistently suggest that both groups of decision-makers rely on similar cognitive processes. Our results challenge the notion that individual differences in intuitive and deliberate decision styles have a broad impact on decision-making. They also highlight the advantages of Bayesian methods, especially their ability to quantify evidence in favor of the null hypothesis, and that they allow model-based analyses to incorporate hierarchical and latent-mixture structures.

  13. Independence screening for high dimensional nonlinear additive ODE models with applications to dynamic gene regulatory networks.

    PubMed

    Xue, Hongqi; Wu, Shuang; Wu, Yichao; Ramirez Idarraga, Juan C; Wu, Hulin

    2018-05-02

    Mechanism-driven low-dimensional ordinary differential equation (ODE) models are often used to model viral dynamics at cellular levels and epidemics of infectious diseases. However, low-dimensional mechanism-based ODE models are limited for modeling infectious diseases at molecular levels such as transcriptomic or proteomic levels, which is critical to understand pathogenesis of diseases. Although linear ODE models have been proposed for gene regulatory networks (GRNs), nonlinear regulations are common in GRNs. The reconstruction of large-scale nonlinear networks from time-course gene expression data remains an unresolved issue. Here, we use high-dimensional nonlinear additive ODEs to model GRNs and propose a 4-step procedure to efficiently perform variable selection for nonlinear ODEs. To tackle the challenge of high dimensionality, we couple the 2-stage smoothing-based estimation method for ODEs and a nonlinear independence screening method to perform variable selection for the nonlinear ODE models. We have shown that our method possesses the sure screening property and it can handle problems with non-polynomial dimensionality. Numerical performance of the proposed method is illustrated with simulated data and a real data example for identifying the dynamic GRN of Saccharomyces cerevisiae. Copyright © 2018 John Wiley & Sons, Ltd.
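
    A toy version of the two-stage idea: smooth each trajectory, differentiate the smoother, then rank candidate regulators by marginal dependence between the derivative and each candidate. The dependence score below (absolute correlation with a nonlinear transform) is only a stand-in for the paper's sure-screening statistic, and the data are synthetic.

    ```python
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(4)
    t = np.linspace(0, 10, 60)
    p = 50                                       # candidate regulator genes
    Xg = rng.normal(size=(p, t.size)).cumsum(axis=1) * 0.1
    target = np.tanh(Xg[3]) - 0.5 * Xg[17]       # true regulators: 3 and 17
    traj = np.cumsum(target) * (t[1] - t[0]) + rng.normal(0, 0.05, t.size)

    spline = UnivariateSpline(t, traj, s=0.1)
    dtraj = spline.derivative()(t)               # stage 1: smoothed derivative

    # stage 2: screen by marginal dependence with the derivative
    score = [abs(np.corrcoef(np.tanh(Xg[j]), dtraj)[0, 1]) for j in range(p)]
    print("top-ranked candidates:", np.argsort(score)[::-1][:5])
    ```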

  14. Model-driven approach to data collection and reporting for quality improvement

    PubMed Central

    Curcin, Vasa; Woodcock, Thomas; Poots, Alan J.; Majeed, Azeem; Bell, Derek

    2014-01-01

    Continuous data collection and analysis have been shown to be essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding, thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, and grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate required electronic data collection instruments and reporting tools. To that end, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC), for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. PMID:24874182
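
    The SPC machinery behind such live reports is standard. For instance, the individuals (XmR) chart widely used in healthcare improvement derives its control limits from the mean moving range; the sketch below shows only that textbook computation, assumes nothing about WISH's internals, and uses made-up data.

    ```python
    import numpy as np

    def xmr_limits(values):
        """Individuals (XmR) control chart limits; 2.66 = 3/d2 for n = 2."""
        x = np.asarray(values, dtype=float)
        mr_bar = np.mean(np.abs(np.diff(x)))   # mean moving range
        cl = x.mean()
        return cl, cl - 2.66 * mr_bar, cl + 2.66 * mr_bar

    weekly_metric = [12, 15, 11, 14, 13, 18, 12, 16, 14, 13]  # made-up data
    cl, lcl, ucl = xmr_limits(weekly_metric)
    print(f"CL={cl:.1f}  LCL={lcl:.1f}  UCL={ucl:.1f}")
    print("special-cause points:", [v for v in weekly_metric
                                    if not lcl <= v <= ucl])
    ```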

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael; Lethin, Richard

    Programming models and environments play essential roles in high performance computing, enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  16. Collaborative Project: Development of an Isotope-Enabled CESM for Testing Abrupt Climate Changes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhengyu

    One of the most important validations for a state-of-art Earth System Model (ESM) with respect to climate changes is the simulation of the climate evolution and abrupt climate change events of the last 21,000 years of Earth's history. However, one great challenge for model validation is that ESMs usually do not directly simulate the geochemical variables that can be compared directly with past proxy records. In this proposal, we have met this challenge by developing the capability to simulate major isotopes in a state-of-art ESM, the Community Earth System Model (CESM), enabling us to make direct model-data comparisons by comparing the model directly against proxy climate records. Our isotope-enabled ESM incorporates the capability of simulating key isotopes and geotracers, notably δ18O, δD, δ14C, δ13C, Nd, and Pa/Th. The isotope-enabled ESM has been used to perform simulations for the last 21,000 years. The direct comparison of these simulations with proxy records has shed light on the mechanisms of important climate change events.

  17. Challenges in modeling the X-29 flight test performance

    NASA Technical Reports Server (NTRS)

    Hicks, John W.; Kania, Jan; Pearce, Robert; Mills, Glen

    1987-01-01

    Presented are methods, instrumentation, and difficulties associated with drag measurement of the X-29A aircraft. The initial performance objective of the X-29A program emphasized drag polar shapes rather than absolute drag levels. Priorities during the flight envelope expansion restricted the evaluation of aircraft performance. Changes in aircraft configuration, uncertainties in angle-of-attack calibration, and limitations in instrumentation complicated the analysis. Limited engine instrumentation with uncertainties in overall in-flight thrust accuracy made it difficult to obtain reliable values of coefficient of parasite drag. The aircraft was incapable of tracking the automatic camber control trim schedule for optimum wing flaperon deflection during typical dynamic performance maneuvers; this has also complicated the drag polar shape modeling. The X-29A was far enough off the schedule that the developed trim drag correction procedure has proven inadequate. However, good drag polar shapes have been developed throughout the flight envelope. Preliminary flight results have compared well with wind tunnel predictions. A more comprehensive analysis must be done to complete performance models. The detailed flight performance program with a calibrated engine will benefit from the experience gained during this preliminary performance phase.

  18. Challenges in modeling the X-29A flight test performance

    NASA Technical Reports Server (NTRS)

    Hicks, John W.; Kania, Jan; Pearce, Robert; Mills, Glen

    1987-01-01

    The paper presents the methods, instrumentation, and difficulties associated with drag measurement of the X-29A aircraft. The initial performance objective of the X-29A program emphasized drag polar shapes rather than absolute drag levels. Priorities during the flight envelope expansion restricted the evaluation of aircraft performance. Changes in aircraft configuration, uncertainties in angle-of-attack calibration, and limitations in instrumentation complicated the analysis. Limited engine instrumentation with uncertainties in overall in-flight thrust accuracy made it difficult to obtain reliable values of coefficient of parasite drag. The aircraft was incapable of tracking the automatic camber control trim schedule for optimum wing flaperon deflection during typical dynamic performance maneuvers; this has also complicated the drag polar shape modeling. The X-29A was far enough off the schedule that the developed trim drag correction procedure has proven inadequate. Despite these obstacles, good drag polar shapes have been developed throughout the flight envelope. Preliminary flight results have compared well with wind tunnel predictions. A more comprehensive analysis must be done to complete the performance models. The detailed flight performance program with a calibrated engine will benefit from the experience gained during this preliminary performance phase.

  19. A Programming Model Performance Study Using the NAS Parallel Benchmarks

    DOE PAGES

    Shan, Hongzhang; Blagojević, Filip; Min, Seung-Jai; ...

    2010-01-01

    Harnessing the power of multicore platforms is challenging due to the additional levels of parallelism present. In this paper we use the NAS Parallel Benchmarks to study three programming models, MPI, OpenMP and PGAS, to understand their performance and memory usage characteristics on current multicore architectures. To understand these characteristics we use the Integrated Performance Monitoring tool and other ways to measure communication versus computation time, as well as the fraction of the run time spent in OpenMP. The benchmarks are run on two different Cray XT5 systems and an Infiniband cluster. Our results show that in general the three programming models exhibit very similar performance characteristics. In a few cases, OpenMP is significantly faster because it explicitly avoids communication. For these particular cases, we were able to re-write the UPC versions and achieve equal performance to OpenMP. Using OpenMP was also the most advantageous in terms of memory usage. We also compare performance differences between the two Cray systems, which have quad-core and hex-core processors, and show that at scale the performance is almost always slower on the hex-core system because of increased contention for network resources.

  20. Buildings Lean Maintenance Implementation Model

    NASA Astrophysics Data System (ADS)

    Abreu, Antonio; Calado, João; Requeijo, José

    2016-11-01

    Nowadays, companies in global markets have to achieve high levels of performance and competitiveness to stay "alive". In this context, building maintenance cannot be done in a casual and improvised way, given the costs involved. Starting with a discussion of lean management and building maintenance, this paper introduces a model to support the Lean Building Maintenance (LBM) approach. Finally, based on a real case study from a Portuguese company, the benefits, challenges and difficulties are presented and discussed.

  1. A review on channel models in free space optical communication systems

    NASA Astrophysics Data System (ADS)

    Anbarasi, K.; Hemanth, C.; Sangeetha, R. G.

    2017-12-01

    Free Space Optical communication (FSO) is a wireless communication technology which uses light to transmit data in free space. FSO has advantages such as unlicensed spectrum and higher bandwidth. In this paper the merits and demerits of FSO systems, the challenges in FSO, and various channel models are discussed. Techniques for mitigating turbulence in FSO, such as relaying, diversity schemes, and the adoption of different modulation schemes in different channels, are discussed and their performance is compared.
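
    Among the channel models such a review covers, the log-normal model for weak turbulence is the most common and is straightforward to simulate: received irradiance is log-normally distributed and the bit error rate is averaged over its fluctuations. The Monte Carlo sketch below uses illustrative parameters; the exact BER constant depends on modulation and detection conventions.

    ```python
    import numpy as np
    from scipy.special import erfc

    rng = np.random.default_rng(3)
    sigma_x = 0.2                             # log-amplitude standard deviation
    X = rng.normal(-sigma_x**2, sigma_x, 200_000)
    I = np.exp(2 * X)                         # irradiance, normalized E[I] = 1

    gamma = 10 ** (13 / 10)                   # 13 dB average electrical SNR
    Q = lambda v: 0.5 * erfc(v / np.sqrt(2))  # Gaussian Q-function
    ber = Q(I * np.sqrt(gamma))               # conditional OOK error probability
    print(f"mean BER ≈ {ber.mean():.2e}")
    print(f"scintillation index ≈ {I.var():.3f}")  # ≈ exp(4*sigma_x^2) - 1
    ```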

  2. A coupled duration-focused architecture for real-time music-to-score alignment.

    PubMed

    Cont, Arshia

    2010-06-01

    The capacity for real-time synchronization and coordination is a common ability among trained musicians performing a music score, and it presents an interesting challenge for machine intelligence. Compared to speech recognition, which has influenced many music information retrieval systems, music's temporal dynamics and complexity pose challenging problems to common approximations regarding time modeling of data streams. In this paper, we propose a design for a real-time music-to-score alignment system. Given a live recording of a musician playing a music score, the system is capable of following the musician in real time within the score and decoding the tempo (or pace) of its performance. The proposed design features two coupled audio and tempo agents within a unique probabilistic inference framework that adaptively updates its parameters based on the real-time context. Online decoding is achieved through the collaboration of the coupled agents in a Hidden Hybrid Markov/semi-Markov framework, where prediction feedback of one agent affects the behavior of the other. We perform evaluations for both real-time alignment and the proposed temporal model. An implementation of the presented system has been widely used in real concert situations worldwide, and readers are encouraged to access the actual system and experiment with the results.
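
    The Markov core of such a score follower can be illustrated with a toy left-to-right HMM forward pass: states are score events, and each audio frame updates the belief over the current position. The sketch below omits the paper's coupled tempo agent and semi-Markov durations, and the per-frame observation likelihoods are random stand-ins.

    ```python
    import numpy as np

    n_events = 5
    A = np.zeros((n_events, n_events))     # left-to-right: stay or advance
    for s in range(n_events):
        A[s, s] = 0.7
        if s + 1 < n_events:
            A[s, s + 1] = 0.3
    A[-1, -1] = 1.0

    rng = np.random.default_rng(0)
    obs_lik = rng.random((20, n_events))   # stand-in per-frame likelihoods

    belief = np.zeros(n_events)
    belief[0] = 1.0                        # performance starts at event 0
    for frame_lik in obs_lik:
        belief = (belief @ A) * frame_lik  # predict, then weight by evidence
        belief /= belief.sum()
    print("most likely score position:", belief.argmax())
    ```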

  3. From global circulation to flood loss: Coupling models across the scales

    NASA Astrophysics Data System (ADS)

    Felder, Guido; Gomez-Navarro, Juan Jose; Bozhinova, Denica; Zischg, Andreas; Raible, Christoph C.; Roessler, Ole; Martius, Olivia; Weingartner, Rolf

    2017-04-01

    The prediction and prevention of flood losses requires an extensive understanding of the underlying meteorological, hydrological, hydraulic and damage processes. Coupled models help to improve the understanding of such underlying processes and therefore contribute to the understanding of flood risk. Using such a modelling approach to determine potentially flood-affected areas and damages requires a complex coupling between several models operating at different spatial and temporal scales. Although the individual modelling components are well established and commonly used in the literature, a full coupling including a mesoscale meteorological model driven by a global circulation model, a hydrologic model, a hydrodynamic model and a flood impact and loss model has not been reported so far. In the present study, we tackle the application of such a coupled model chain in terms of computational resources, scale effects, and model performance. From a technical point of view, the results show the general applicability of such a coupled model, as well as good model performance. From a practical point of view, such an approach enables the prediction of flood-induced damages, although some future challenges have been identified.

  4. Anisakis simplex allergy: a murine model of anaphylaxis induced by parasitic proteins displays a mixed Th1/Th2 pattern

    PubMed Central

    Baeza, M L; Conejero, L; Higaki, Y; Martín, E; Pérez, C; Infante, S; Rubio, M; Zubeldia, J M

    2005-01-01

    The study of the singular hypersensitivity reactions to Anisakis simplex (A.s) proteins may help us to understand many of the unknown immune interactions between helminth infections and allergy. We have developed a murine model of allergy to A. simplex that mimics human A. simplex allergy, in order to study the specific aspects of anaphylaxis induced by parasites. Male C3H/HeJ mice were intraperitoneally sensitized to A. simplex. Mice were then intravenously or orally challenged with A. simplex. Antigen-specific immunoglobulins, polyclonal IgE, anaphylactic symptoms, plasma histamine levels and cytokine profiles were determined. Comparative IgE immunoblot analyses were also performed. Specific IgE, IgG1 and IgG2a were detected in sensitized mice from week 3. Polyclonal IgE rose and peaked with different kinetics. Intravenous A. simplex challenge produced anaphylaxis in mice, accompanied by plasma histamine release. Oral A. simplex challenge in similarly sensitized mice caused neither symptoms nor histamine release. Numerous A. simplex allergens were recognized by sensitized mouse sera, some of them similar to those recognized by human sera. The A. simplex-stimulated splenocytes released IL-10, IFN-γ, IL-4, IL-13 and IL-5. We describe a new animal model of anaphylaxis that exhibits characteristics of type I hypersensitivity reactions to Anisakis simplex similar to those observed in allergic humans. Different responses to i.v. and oral A. simplex challenges emerged, which did not reflect a window tolerization period. The cytokine profile developed (mixed Th1/Th2 pattern) differed from that observed in classical models of anaphylaxis or allergy to food antigens. This model may permit investigation of the peculiar allergic reactions to parasitic proteins. PMID:16297154

  5. Effect of coccidia challenge and natural betaine supplementation on performance, nutrient utilization, and intestinal lesion scores of broiler chickens fed suboptimal level of dietary methionine

    PubMed Central

    Amerah, A. M.; Ravindran, V.

    2015-01-01

    The aim of the present experiment was to examine the effect of coccidia challenge and natural betaine supplementation on performance, nutrient utilization, and intestinal lesion scores of broiler chickens fed suboptimal level of dietary methionine. The experimental design was a 2 × 2 factorial arrangement of treatments evaluating two levels of betaine supplementation (0 and 960 g betaine/t of feed) without or with coccidia challenge. Each treatment was fed to 8 cages of 8 male broilers (Ross 308) for 1 to 21d. On d 14, birds in the 2 challenged groups received mixed inocula of Eimeria species from a recent field isolate, containing approximately 180,000 E. acervulina, 6,000 E. maxima, and 18,000 E. tenella oocysts. At 21d, digesta from the terminal ileum was collected for the determination of dry matter, energy, nitrogen, amino acids, starch, fat, and ash digestibilities. Lesion scores in the different segments of the small intestine were also measured on d 21. Performance and nutrient digestibility data were analyzed by two-way ANOVA. Lesion score data were analyzed using Pearson chi-square test to identify significant differences between treatments. Orthogonal polynomial contrasts were used to assess the significance of linear or quadratic models to describe the response in the dependent variable to total lesion scores. Coccidia challenge reduced (P < 0.0001) the weight gain and feed intake, and increased (P < 0.0001) the feed conversion ratio. Betaine supplementation had no effect (P > 0.05) on the weight gain or feed intake, but lowered (P < 0.05) the feed conversion ratio. No interaction (P > 0.05) between coccidia challenge and betaine supplementation was observed for performance parameters. Betaine supplementation increased (P < 0.05) the digestibility of dry matter, nitrogen, energy, fat, and amino acids only in birds challenged with coccidia as indicated by the significant interaction (P < 0.0001) between betaine supplementation and coccidia challenge. The main effect of coccidia challenge reduced (P < 0.05) starch digestibility. Betaine supplementation improved (P < 0.05) starch digestibility regardless of the coccidia challenge. For each unit increase in the total lesion score, there was a linear (P < 0.001) decrease in digestibility of mean amino acids, starch, and fat by 3.8, 3.4 and 16%, respectively. Increasing total lesion scores resulted in a quadratic (P < 0.05) decrease in dry matter digestibility and ileal digestible energy. No lesions were found in the intestine or ceca of the unchallenged treatments. In the challenged treatments, betaine supplementation reduced (P < 0.01) the lesion scores at the duodenum, lower jejunum, and total lesion scores compared to the treatment without supplements. In conclusion, coccidia challenge lowered the digestibility of energy and nutrients and increased the feed conversion ratio of broilers. However, betaine supplementation reduced the impact of coccidia challenge and positively affected nutrient digestibility and the feed conversion ratio. PMID:25691757

  6. Simulations in Cyber-Security: A Review of Cognitive Modeling of Network Attackers, Defenders, and Users.

    PubMed

    Veksler, Vladislav D; Buchler, Norbou; Hoffman, Blaine E; Cassenti, Daniel N; Sample, Char; Sugrim, Shridat

    2018-01-01

    Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group-level based on mean tendencies of each subject's subgroup, based on known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting.

  7. Detection and quantification of flow consistency in business process models.

    PubMed

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
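
    One plausible way to quantify flow consistency (a sketch of the general idea, not necessarily one of the paper's three metrics): classify each edge by its dominant compass direction and report the share of edges following the most common one.

    ```python
    import math
    from collections import Counter

    def edge_direction(x1, y1, x2, y2):
        """Bucket an edge into right/up/left/down by its angle."""
        angle = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 360
        return ["right", "up", "left", "down"][int(((angle + 45) % 360) // 90)]

    def flow_consistency(edges):
        counts = Counter(edge_direction(*e) for e in edges)
        return max(counts.values()) / sum(counts.values())

    # hypothetical model: three edges flow right, one flows down
    edges = [(0, 0, 10, 0), (10, 0, 20, 0), (20, 0, 20, -10), (20, -10, 30, -10)]
    print(f"flow consistency: {flow_consistency(edges):.2f}")   # 0.75
    ```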

  8. Frontiers in Atmospheric Chemistry Modelling

    NASA Astrophysics Data System (ADS)

    Colette, Augustin; Bessagnet, Bertrand; Meleux, Frederik; Rouïl, Laurence

    2013-04-01

    The first pan-European kilometre-scale atmospheric chemistry simulation is introduced. The continental-scale air pollution episode of January 2009 is modelled with the CHIMERE offline chemistry-transport model on a massive grid of 2 million horizontal points, performed on 2,000 CPUs of a high performance computing system hosted by the Research and Technology Computing Center at the French Alternative Energies and Atomic Energy Commission (CCRT/CEA). Besides the technical challenge, which demonstrated the robustness of the selected air quality model, we discuss the added value in terms of air pollution modelling and decision support. The comparison with in-situ observations shows that model biases are significantly improved despite some spurious added spatial variability attributed to shortcomings in the emission downscaling process and coarse resolution of the meteorological fields. The increased spatial resolution is clearly beneficial for the detection of exceedances and exposure modelling. We reveal small scale air pollution patterns that highlight the contribution of city plumes to background air pollution levels. Up to a factor-of-5 underestimation of the fraction of population exposed to detrimental levels of pollution can be obtained with a coarse simulation if subgrid-scale corrections such as urban increments are ignored. This experiment opens new perspectives for environmental decision making. After two decades of efforts to reduce air pollutant emissions across Europe, the challenge is now to find the optimal trade-off between national and local air quality management strategies. While the first approach is based on sectoral strategies and energy policies, the latter builds upon new alternatives such as urban development. The strategies, the decision pathways and the involvement of individual citizens differ, and a compromise based on cost and efficiency must be found. We illustrated how high performance computing in atmospheric science can contribute to this aim. Although further developments are still needed to secure the results for routine policy use, the door is now open...

  9. Design and Simulation of a PID Controller for Motion Control Systems

    NASA Astrophysics Data System (ADS)

    Hassan Abdullahi, Zakariyya; Danzomo, Bashir Ahmed; Suleiman Abdullahi, Zainab

    2018-04-01

    Motion control systems play an important role in many industrial applications, including robot systems, missile launching, and positioning systems. However, the performance requirements for these applications, in terms of high accuracy, high speed, insignificant or no overshoot, and robustness, have generated continuous challenges in the field of motion control system design and implementation. To address this challenge, a PID controller was designed using a mathematical model of a DC motor based on the classical root-locus approach. The reason for adopting root-locus design is to reshape the closed-loop response by placing the closed-loop poles of the system at desired points. Adding poles and zeros to the initial open-loop transfer function through the controller provides a way to transform the root locus in order to place the closed-loop poles at the required points. This process can also be used for discrete-time models. The advantage of root locus over other methods is that it gives a better way of pinpointing the controller parameters and can easily predict the behaviour of the whole system. The controller performance was simulated using MATLAB code and a reasonable degree of accuracy was obtained. Implementation of the proposed model was conducted using Simulink, and the results obtained show that the PID controller met the transient performance specifications, with both settling time and overshoot less than 0.1 s and 5%, respectively. In terms of steady-state error, the PID controller gave good responses for both step and ramp inputs.
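
    A minimal simulation of the idea, assuming a first-order DC-motor speed model and hand-picked gains (the paper instead derives its gains by root-locus pole placement, and its plant parameters are not reproduced here):

    ```python
    # Discrete PID loop on a first-order DC-motor speed model
    # dw/dt = (-w + K*u)/tau. All values are hypothetical.
    K, tau = 2.0, 0.5              # plant gain and time constant
    Kp, Ki, Kd = 6.0, 12.0, 0.1    # PID gains (hand-picked, not root locus)
    dt, T = 1e-3, 2.0
    setpoint, w, integ, prev_err = 1.0, 0.0, 0.0, 1.0

    for _ in range(int(T / dt)):
        err = setpoint - w
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = Kp * err + Ki * integ + Kd * deriv
        prev_err = err
        w += dt * (-w + K * u) / tau   # Euler step of the motor model
    print(f"final speed: {w:.4f} (target {setpoint})")
    ```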

  10. An Artificial Intelligence Approach for Modeling and Prediction of Water Diffusion Inside a Carbon Nanotube

    PubMed Central

    2009-01-01

    Modeling of water flow in carbon nanotubes is still a challenge for the classic models of fluid dynamics. In this investigation, an adaptive-network-based fuzzy inference system (ANFIS) is presented to solve this problem. The proposed ANFIS approach can construct an input–output mapping based on both human knowledge in the form of fuzzy if-then rules and stipulated input–output data pairs. Good performance of the designed ANFIS ensures its capability as a promising tool for modeling and prediction of fluid flow at nanoscale where the continuum models of fluid dynamics tend to break down. PMID:20596382

  11. An Artificial Intelligence Approach for Modeling and Prediction of Water Diffusion Inside a Carbon Nanotube.

    PubMed

    Ahadian, Samad; Kawazoe, Yoshiyuki

    2009-06-04

    Modeling of water flow in carbon nanotubes is still a challenge for the classic models of fluid dynamics. In this investigation, an adaptive-network-based fuzzy inference system (ANFIS) is presented to solve this problem. The proposed ANFIS approach can construct an input-output mapping based on both human knowledge in the form of fuzzy if-then rules and stipulated input-output data pairs. Good performance of the designed ANFIS ensures its capability as a promising tool for modeling and prediction of fluid flow at nanoscale where the continuum models of fluid dynamics tend to break down.

  12. How bootstrap can help in forecasting time series with more than one seasonal pattern

    NASA Astrophysics Data System (ADS)

    Cordeiro, Clara; Neves, M. Manuela

    2012-09-01

    The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, which combines exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For the case of more than one seasonal pattern, double seasonal Holt-Winters methods and extensions of the exponential smoothing methods were developed. A new challenge was to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of this partnership is illustrated on some well-known data sets available in the software.
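
    The general idea (fit an exponential smoothing model, then bootstrap its residuals to build a forecast ensemble) can be sketched as below. This is a simplified stand-in for Boot.EXPOS, whose actual resampling scheme is more careful; the series is synthetic, and the Holt-Winters fit uses the statsmodels API.

    ```python
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(0)
    t = np.arange(120)          # synthetic monthly series: trend + season
    y = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, 120)

    fit = ExponentialSmoothing(y, trend="add", seasonal="add",
                               seasonal_periods=12).fit()
    resid = y - fit.fittedvalues
    H, B = 12, 500
    base = fit.forecast(H)                  # deterministic point forecast
    paths = np.array([base + rng.choice(resid, size=H, replace=True)
                      for _ in range(B)])   # re-add resampled residuals
    lo, hi = np.percentile(paths, [5, 95], axis=0)
    print(f"h=1: {paths.mean(axis=0)[0]:.2f}  90% PI [{lo[0]:.2f}, {hi[0]:.2f}]")
    ```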

  13. Design and validation of diffusion MRI models of white matter

    NASA Astrophysics Data System (ADS)

    Jelescu, Ileana O.; Budde, Matthew D.

    2017-11-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open questions and converge towards consensus.
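
    In its most stripped-down form, the compartmental idea reduces to a multi-exponential signal model. The sketch below fits S(b)/S0 = f*exp(-b*Da) + (1-f)*exp(-b*De) to synthetic measurements; real white matter models add fiber orientation, dispersion, and further compartments, and all values here are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_comp(b, f, Da, De):
        """Two-compartment diffusion signal (diffusivities in um^2/ms)."""
        return f * np.exp(-b * Da) + (1 - f) * np.exp(-b * De)

    b = np.linspace(0, 3.0, 15)            # b-values in ms/um^2
    truth = (0.6, 0.8, 2.0)                # f, Da, De (hypothetical)
    rng = np.random.default_rng(2)
    sig = two_comp(b, *truth) + rng.normal(0, 0.01, b.size)

    popt, _ = curve_fit(two_comp, b, sig, p0=(0.5, 1.0, 1.5),
                        bounds=([0, 0, 0], [1, 3, 3]))
    print("estimated f, Da, De:", popt.round(3))
    ```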

  14. Why we need better predictive models of vegetation phenology

    NASA Astrophysics Data System (ADS)

    Richardson, Andrew; Migliavacca, Mirco; Keenan, Trevor

    2014-05-01

    Vegetation phenology is strongly affected by climate change, with warmer temperatures causing earlier spring onset and delayed autumn senescence in most temperate and boreal ecosystems. In arid regions where phenology is driven by the seasonality of soil water availability, shifts in the timing, intensity, and total amount of precipitation are, likewise, affecting the seasonality of vegetation activity. Changes in the duration of the growing season have important implications for ecosystem productivity and uptake of CO2 from the atmosphere, as well as site water balance and runoff, microclimate, ecological interactions within and across trophic levels, and numerous feedbacks to the climate system associated with the surface energy budget. However, an outstanding challenge is that existing phenology sub-models used in ecosystem, land surface, and terrestrial biosphere models fail to adequately represent the seasonality, or sensitivity to environmental drivers, of vegetation phenology. This has two implications. First, these models are therefore likely to perform poorly under future climate scenarios. Second, the seasonality of important ecological processes and interactions, as well as biosphere-atmosphere feedbacks, is likely to be misrepresented as a result. Using data from several recent analyses, and focusing on temperate and boreal ecosystems, we will review current challenges associated with modeling vegetation phenology. We will discuss uncertainties associated with phenology model structure, model parameters, and driver sensitivity (forcing, chilling, and photoperiod). We will show why being able to extrapolate and generalize models (and model parameterization) is essential. We will consider added challenges associated with trying to model autumn phenology. Finally, we will use canopy photosynthesis and uptake of CO2 as an example of why improved understanding of the "rhythm of the seasons" is critically important.

  15. Design and validation of diffusion MRI models of white matter

    PubMed Central

    Jelescu, Ileana O.; Budde, Matthew D.

    2018-01-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open questions and converge towards consensus. PMID:29755979

  16. An entrustable professional activity (EPA) for handoffs as a model for EPA assessment development.

    PubMed

    Aylward, Michael; Nixon, James; Gladding, Sophia

    2014-10-01

    Medical education is moving toward assessment of educational outcomes rather than educational processes. The American Board of Internal Medicine and American Board of Pediatrics milestones and the concept of entrustable professional activities (EPA)--skills essential to the practice of medicine that educators progressively entrust learners to perform--provide new approaches to assessing outcomes. Although some defined EPAs exist for internal medicine and pediatrics, the continued development and implementation of EPAs remains challenging. As residency programs are expected to begin reporting milestone-based performance, however, they will need examples of how to overcome these challenges. The authors describe a model for the development and implementation of an EPA using the resident handoff as an example. The model includes nine steps: selecting the EPA, determining where skills are practiced and assessed, addressing barriers to assessment, determining components of the EPA, determining needed assessment tools, developing new assessments if needed, determining criteria for advancement through entrustment levels, mapping milestones to the EPA, and faculty development. Following implementation, 78% of interns at the University of Minnesota Medical School were observed giving handoffs and provided feedback. The authors suggest that this model of EPA development--which includes engaging stakeholders, an iterative process to describing the behavioral characteristics of each domain at each level of entrustment, and the development of specific assessment tools that support both formative feedback and summative decisions about entrustment--can serve as a model for EPA development for other clinical skills and specialty areas.

  17. SERVER DEVELOPMENT FOR NSLS-II PHYSICS APPLICATIONS AND PERFORMANCE ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, G.; Kraimer, M.

    2011-03-28

    The beam commissioning software framework of the NSLS-II project adopts a client/server based architecture to replace the more traditional monolithic high level application approach. The server software under development is available via an open source sourceforge project named epics-pvdata, which consists of the modules pvData, pvAccess, pvIOC, and pvService. Examples of two services that already exist in the pvService module are itemFinder and gather. Each service uses pvData to store in-memory transient data, pvAccess to transfer data over the network, and pvIOC as the service engine. Performance benchmarks for pvAccess and for both the gather and itemFinder services are presented in this paper, along with a performance comparison between pvAccess and Channel Access. For an ultra-low-emittance synchrotron radiation light source like NSLS-II, the control system requirements, especially for beam control, are tight. To control and manipulate the beam effectively, a use-case study and a theoretical evaluation have been performed. The analysis shows that model-based control is indispensable for beam commissioning and routine operation. However, there are many challenges, such as how to re-use a design model for on-line model-based control, and how to combine numerical methods for modeling a realistic lattice with analytical techniques for analyzing its properties. To satisfy these requirements and challenges, an adequate system architecture for the software framework for beam commissioning and operation is critical. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation and plotting, and error handling. However, none of the existing approaches can satisfy the requirements. A new design has been proposed by introducing service-oriented architecture technology, and the client interface is under development. The design and implementation adopt a new EPICS implementation, namely epics-pvdata [9], which is under active development. The Java implementation of this project is close to stable, and bindings to other languages such as C++ and/or Python are in progress. In this paper, we focus on performance benchmarking and comparison for pvAccess and Channel Access, and on the performance evaluation of the two services, gather and itemFinder, respectively.

  18. Feedforward object-vision models only tolerate small image variations compared to human

    PubMed Central

    Ghodrati, Masoud; Farzmahdi, Amirhossein; Rajaei, Karim; Ebrahimpour, Reza; Khaligh-Razavi, Seyed-Mahdi

    2014-01-01

    Invariant object recognition is a remarkable ability of the primate visual system whose underlying mechanism has constantly been under intense investigation. Computational modeling is a valuable tool toward understanding the processes involved in invariant object recognition. Although recent computational models have shown outstanding performances on challenging image databases, they fail to perform well in image categorization under more complex image variations. Studies have shown that making sparse representation of objects by extracting more informative visual features through a feedforward sweep can lead to higher recognition performances. Here, however, we show that when the complexity of image variations is high, even this approach results in poor performance compared to humans. To assess the performance of models and humans in invariant object recognition tasks, we built a parametrically controlled image database consisting of several object categories varied in different dimensions and levels, rendered from 3D planes. Comparing the performance of several object recognition models with human observers shows that only in low-level image variations do the models perform similarly to humans in categorization tasks. Furthermore, the results of our behavioral experiments demonstrate that, even under difficult experimental conditions (i.e., briefly presented masked stimuli with complex image variations), human observers performed outstandingly well, suggesting that the models are still far from resembling humans in invariant object recognition. Taken together, we suggest that learning sparse informative visual features, although desirable, is not a complete solution for future progress in object-vision modeling. We show that this approach is not of significant help in solving the computational crux of object recognition (i.e., invariant object recognition) when the identity-preserving image variations become more complex. PMID:25100986

  19. Brain Regional Blood Flow and Working Memory Performance Predict Change in Blood Pressure Over 2 Years.

    PubMed

    Jennings, J Richard; Heim, Alicia F; Sheu, Lei K; Muldoon, Matthew F; Ryan, Christopher; Gach, H Michael; Schirda, Claudiu; Gianaros, Peter J

    2017-12-01

    Hypertension is a presumptive risk factor for premature cognitive decline. However, lowering blood pressure (BP) does not uniformly reverse cognitive decline, suggesting that high BP per se may not cause cognitive decline. We hypothesized that essential hypertension has initial effects on the brain that, over time, manifest as cognitive dysfunction in conjunction with both brain vascular abnormalities and systemic BP elevation. Accordingly, we tested whether neuropsychological function and brain blood flow responses to cognitive challenges among prehypertensive individuals would predict subsequent progression of BP. Midlife adults (n=154; mean age, 49; 45% men) with prehypertensive BP underwent neuropsychological testing and assessment of regional cerebral blood flow (rCBF) response to cognitive challenges. Neuropsychological performance measures were derived for verbal and logical memory (memory), executive function, working memory, mental efficiency, and attention. A pseudo-continuous arterial spin labeling magnetic resonance imaging sequence compared rCBF responses between control and active phases of cognitive challenges. Brain areas previously associated with BP were grouped into composites for frontoparietal, frontostriatal, and insular-subcortical rCBF areas. Multiple regression models tested whether BP after 2 years was predicted by initial BP, initial neuropsychological scores, and initial rCBF responses to cognitive challenge. The neuropsychological composite of working memory (standardized beta=-0.276; SE=0.116; P=0.02) and the frontostriatal rCBF response to cognitive challenge (standardized beta=0.234; SE=0.108; P=0.03) significantly predicted follow-up BP. Initial BP failed to significantly predict subsequent cognitive performance or rCBF. Changes in brain function may precede or co-occur with progression of BP toward hypertensive levels in midlife. © 2017 American Heart Association, Inc.
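
    For readers unfamiliar with standardized betas, a minimal sketch (simulated data, invented effect sizes; not the study's actual analysis) of this kind of multiple regression in Python with statsmodels:

      # With z-scored variables, OLS coefficients are standardized betas,
      # directly comparable across predictors. Data here are simulated.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 154
      initial_bp = rng.normal(size=n)
      working_memory = rng.normal(size=n)
      frontostriatal_rcbf = rng.normal(size=n)
      followup_bp = (0.5 * initial_bp - 0.28 * working_memory
                     + 0.23 * frontostriatal_rcbf
                     + rng.normal(scale=0.8, size=n))

      def z(x):  # standardize so coefficients are comparable (betas)
          return (x - x.mean()) / x.std()

      X = sm.add_constant(np.column_stack(
          [z(initial_bp), z(working_memory), z(frontostriatal_rcbf)]))
      fit = sm.OLS(z(followup_bp), X).fit()
      print(fit.summary(xname=["const", "initial_BP", "working_memory",
                               "frontostriatal_rCBF"]))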

  20. NASA Workshop on Distributed Parameter Modeling and Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Marks, Virginia B. (Compiler); Keckler, Claude R. (Compiler)

    1994-01-01

    Although significant advances have been made in modeling and controlling flexible systems, there remains a need for improvements in model accuracy and in control performance. The finite element models of flexible systems are unduly complex and are almost intractable to optimum parameter estimation for refinement using experimental data. Distributed parameter or continuum modeling offers some advantages and some challenges in both modeling and control. Continuum models often result in a significantly reduced number of model parameters, thereby enabling optimum parameter estimation. The dynamic equations of motion of continuum models provide the advantage of allowing the embedding of the control system dynamics, thus forming a complete set of system dynamics. There is also increased insight provided by the continuum model approach.

  1. Community-based benchmarking improves spike rate inference from two-photon calcium imaging data.

    PubMed

    Berens, Philipp; Freeman, Jeremy; Deneux, Thomas; Chenkov, Nikolay; McColgan, Thomas; Speiser, Artur; Macke, Jakob H; Turaga, Srinivas C; Mineault, Patrick; Rupprecht, Peter; Gerhard, Stephan; Friedrich, Rainer W; Friedrich, Johannes; Paninski, Liam; Pachitariu, Marius; Harris, Kenneth D; Bolte, Ben; Machado, Timothy A; Ringach, Dario; Stone, Jasmine; Rogerson, Luke E; Sofroniew, Nicolas J; Reimer, Jacob; Froudarakis, Emmanouil; Euler, Thomas; Román Rosón, Miroslav; Theis, Lucas; Tolias, Andreas S; Bethge, Matthias

    2018-05-01

    In recent years, two-photon calcium imaging has become a standard tool to probe the function of neural circuits and to study computations in neuronal populations. However, the acquired signal is only an indirect measurement of neural activity due to the comparatively slow dynamics of fluorescent calcium indicators. Different algorithms for estimating spike rates from noisy calcium measurements have been proposed in the past, but it is an open question how far performance can be improved. Here, we report the results of the spikefinder challenge, launched to catalyze the development of new spike rate inference algorithms through crowd-sourcing. We present ten of the submitted algorithms which show improved performance compared to previously evaluated methods. Interestingly, the top-performing algorithms are based on a wide range of principles from deep neural networks to generative models, yet provide highly correlated estimates of the neural activity. The competition shows that benchmark challenges can drive algorithmic developments in neuroscience.
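
    As an illustration of how such challenges typically score submissions, a small sketch (synthetic signals; the actual spikefinder metric details may differ) of correlation-based evaluation at several bin widths:

      # Score an inferred spike-rate trace against ground truth by Pearson
      # correlation after resampling to a common bin width. All synthetic.
      import numpy as np

      rng = np.random.default_rng(2)
      true_rate = np.convolve(rng.poisson(0.1, 10_000),
                              np.ones(40) / 40, mode="same")  # smoothed truth
      inferred = true_rate + rng.normal(scale=0.05, size=true_rate.size)

      def bin_signal(x, width):
          n = x.size // width
          return x[: n * width].reshape(n, width).mean(axis=1)

      for width in (10, 25, 100):   # evaluation bin sizes in samples
          r = np.corrcoef(bin_signal(true_rate, width),
                          bin_signal(inferred, width))[0, 1]
          print(f"bin width {width}: Pearson r = {r:.3f}")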

  2. Bayesian analysis of longitudinal dyadic data with informative missing data using a dyadic shared-parameter model.

    PubMed

    Ahn, Jaeil; Morita, Satoshi; Wang, Wenyi; Yuan, Ying

    2017-01-01

    Analyzing longitudinal dyadic data is a challenging task due to the complicated correlations from repeated measurements and within-dyad interdependence, as well as potentially informative (or non-ignorable) missing data. We propose a dyadic shared-parameter model to analyze longitudinal dyadic data with ordinal outcomes and informative intermittent missing data and dropouts. We model the longitudinal measurement process using a proportional odds model, which accommodates the within-dyad interdependence using the concept of the actor-partner interdependence effects, as well as dyad-specific random effects. We model informative dropouts and intermittent missing data using a transition model, which shares the same set of random effects as the longitudinal measurement model. We evaluate the performance of the proposed method through extensive simulation studies. As our approach relies on some untestable assumptions on the missing data mechanism, we perform sensitivity analyses to evaluate how the analysis results change when the missing data mechanism is misspecified. We demonstrate our method using a longitudinal dyadic study of metastatic breast cancer.

  3. Dynamic Modeling, Controls, and Testing for Electrified Aircraft

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph; Stalcup, Erik

    2017-01-01

    Electrified aircraft have the potential to provide significant benefits for efficiency and emissions reductions. To assess these potential benefits, modeling tools are needed to provide rapid evaluation of diverse concepts and to ensure safe operability and peak performance over the mission. The modeling challenge for these vehicles is the ability to show significant benefits over the current highly refined aircraft systems. The STARC-ABL (single-aisle turbo-electric aircraft with an aft boundary layer propulsor) is a new test proposal that builds upon previous N3-X team hybrid designs. This presentation describes the STARC-ABL concept, the NASA Electric Aircraft Testbed (NEAT) which will allow testing of the STARC-ABL powertrain, and the related modeling and simulation efforts to date. Modeling and simulation efforts include a turbofan simulation, the Numerical Propulsion System Simulation (NPSS), which has been integrated with NEAT, and a power systems and control model for predicting testbed performance and evaluating control schemes. Model predictions provide good comparisons with testbed data for an NPSS-integrated test of the single-string configuration of NEAT.

  4. In-situ biogas upgrading process: Modeling and simulations aspects.

    PubMed

    Lovato, Giovanna; Alvarado-Morales, Merlin; Kovalovszki, Adam; Peprah, Maria; Kougias, Panagiotis G; Rodrigues, José Alberto Domingues; Angelidaki, Irini

    2017-12-01

    Biogas upgrading processes by in-situ hydrogen (H2) injection are still challenging and could benefit from a mathematical model to predict system performance. Therefore, a previous model on anaerobic digestion was updated and expanded to include the effect of H2 injection into the liquid phase of a fermenter with the aim of modeling and simulating these processes. This was done by including hydrogenotrophic methanogen kinetics for H2 consumption and inhibition effect on the acetogenic steps. Special attention was paid to gas-to-liquid transfer of H2. The final model was successfully validated considering a set of case studies. Biogas composition and H2 utilization were correctly predicted, with overall deviation below 10% compared to experimental measurements. Parameter sensitivity analysis revealed that the model is highly sensitive to the H2 injection rate and mass transfer coefficient. The model developed is an effective tool for predicting process performance in scenarios with biogas upgrading. Copyright © 2017 Elsevier Ltd. All rights reserved.
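
    A conceptual sketch (placeholder parameter values, not the cited model's) of the two terms the sensitivity analysis highlights, gas-to-liquid transfer and hydrogenotrophic uptake, as a one-state ODE:

      # Dissolved H2 balance: transfer kLa*(H2sat - H2) minus Monod uptake.
      # All parameter values are arbitrary placeholders for illustration.
      from scipy.integrate import solve_ivp

      kla = 50.0        # 1/d, volumetric gas-liquid mass-transfer coefficient
      h2_sat = 0.001    # kg/m3, H2 saturation (equilibrium) concentration
      k_max = 0.5       # kg/(m3*d), max hydrogenotrophic uptake rate
      k_s = 1e-4        # kg/m3, half-saturation constant

      def dh2_dt(t, y):
          h2 = y[0]
          transfer = kla * (h2_sat - h2)      # injection/transfer term
          uptake = k_max * h2 / (k_s + h2)    # Monod consumption
          return [transfer - uptake]

      sol = solve_ivp(dh2_dt, (0, 2), [0.0])
      print("dissolved H2 after 2 d:", sol.y[0, -1])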

  5. GEM-CEDAR Study of Ionospheric Energy Input and Joule Dissipation

    NASA Technical Reports Server (NTRS)

    Rastaetter, Lutz; Kuznetsova, Maria M.; Shim, Jasoon

    2012-01-01

    We are studying ionospheric model performance for six events selected for the GEM-CEDAR modeling challenge. DMSP measurements of electric and magnetic fields are converted into Poynting flux values that estimate the energy input into the ionosphere. Models generate rates of ionospheric Joule dissipation that are compared to the energy influx. Models include the ionosphere models CTIPe and Weimer and the ionospheric electrodynamic outputs of the global magnetosphere models SWMF, LFM, and OpenGGCM. This study evaluates model performance in terms of the overall balance between energy influx and dissipation, and tests the assumption that Joule dissipation occurs locally where electromagnetic energy flux enters the ionosphere. We present results in terms of skill scores now commonly used in metrics and validation studies, and we measure the agreement in the temporal and spatial distribution of dissipation (i.e., the location of auroral activity) along DMSP satellite passes as a function of the passes' proximity to the magnetic pole and the solar wind activity level.
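
    A minimal sketch of one common skill score used in such validation work (a prediction-efficiency score of the form 1 - MSE/variance; the numbers below are invented):

      # Prediction-efficiency skill score: 1 means perfect agreement,
      # 0 means no better than predicting the observed mean.
      import numpy as np

      def skill(obs, model):
          obs, model = np.asarray(obs), np.asarray(model)
          return 1.0 - np.mean((model - obs) ** 2) / np.var(obs)

      joule_obs = [12.0, 30.0, 55.0, 40.0, 18.0]     # e.g., inferred energy influx
      joule_model = [10.0, 28.0, 60.0, 35.0, 20.0]   # e.g., modeled dissipation
      print(f"skill score = {skill(joule_obs, joule_model):.2f}")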

  6. Performance Benefits for Wave Rotor-Topped Gas Turbine Engines

    NASA Technical Reports Server (NTRS)

    Jones, Scott M.; Welch, Gerard E.

    1996-01-01

    The benefits of wave rotor-topping in turboshaft engines, subsonic high-bypass turbofan engines, auxiliary power units, and ground power units are evaluated. The thermodynamic cycle performance is modeled using a one-dimensional steady-state code; wave rotor performance is modeled using one-dimensional design/analysis codes. Design and off-design engine performance is calculated for baseline engines and wave rotor-topped engines, where the wave rotor acts as a high pressure spool. The wave rotor-enhanced engines are shown to have benefits in specific power and specific fuel flow over the baseline engines without increasing turbine inlet temperature. The off-design steady-state behavior of a wave rotor-topped engine is shown to be similar to a conventional engine. Mission studies are performed to quantify aircraft performance benefits for various wave rotor cycle and weight parameters. Gas turbine engine cycles most likely to benefit from wave rotor-topping are identified. Issues of practical integration and the corresponding technical challenges with various engine types are discussed.

  7. Analysis of a rotating spool expander for Organic Rankine Cycle applications

    NASA Astrophysics Data System (ADS)

    Krishna, Abhinav

    Increasing interest in recovering or utilizing low-grade heat for power generation has prompted a search for ways in which the power conversion process may be enhanced. Among the conversion systems, the Organic Rankine Cycle (ORC) has generated enormous interest among researchers and system designers. Nevertheless, component-level technologies need to be developed to match the range of potential applications. In particular, technical challenges associated with scaling expansion machines (turbines) from utility scale to commercial scale have prevented widespread adoption of the technology. In this regard, this work focuses on a novel rotating spool expansion machine at the heart of an Organic Rankine Cycle. A comprehensive, deterministic simulation model of the rotating spool expander is developed. The comprehensive model includes a detailed geometry model of the spool expander and the suction valve mechanism. Sub-models for mass flow, leakage, heat transfer and friction within the expander are also developed. Apart from providing the ability to characterize the expander in a particular system, the model provides a valuable tool to study the impact of various design variables on the performance of the machine. The investigative approach also involved an experimental program to assess the performance of a working prototype. In general, the experimental data showed that the expander performance was sub-par, largely due to the mismatch between the prevailing operating conditions and the expander design criteria. Operating challenges during the shakedown tests and subsequent sub-optimal design changes also detracted from performance. Nevertheless, the results of the experimental program were sufficient for a proof-of-concept assessment of the expander and for model validation over a wide range of operating conditions. The results of the validated model reveal several interesting details concerning the expander design and performance. For example, the match between the design expansion ratio and the system-imposed pressure ratio has a large influence on the performance of the expander. Further exploration shows that from an operating perspective, under-expansion is preferable to over-expansion. The model is also able to provide insight on the dominant leakage paths in the expander and points to the fact that this is the primary loss mechanism in the current expander. Similar insights are obtained from assessing the sensitivity of various other design variables on expander performance. Based on the understanding provided by the sensitivity analysis, exercising the validated model showed that expander efficiencies on the order of 75% are eminently possible in an improved design. Therefore, with sufficient future development, adoption of the spool expander in ORC systems spanning the 50-200 kW range is broadly feasible.
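
    A back-of-envelope sketch (invented parameter values, ideal-gas polytropic expansion; not the dissertation's model) of why the match between built-in volume ratio and system pressure ratio matters:

      # Ideal work per unit inlet volume for a fixed built-in volume ratio
      # machine, with a port term capturing under-expansion (internal
      # pressure above discharge) or over-expansion (below discharge).
      gamma = 1.1        # effective polytropic exponent for the organic fluid
      v_ratio = 3.0      # built-in volume ratio of the expander
      p_in = 10e5        # Pa, inlet pressure

      def specific_work(p_out):
          p_internal = p_in / v_ratio ** gamma        # pressure when port opens
          w_closed = (p_internal * v_ratio - p_in) / (1 - gamma)  # polytropic
          w_port = v_ratio * (p_internal - p_out)     # blow-down / back-flow
          return w_closed + w_port

      for p_out in (1e5, 2e5, 4e5):   # system-imposed discharge pressures (Pa)
          print(f"p_out = {p_out/1e5:.0f} bar -> "
                f"w = {specific_work(p_out)/1e3:.0f} kJ/m3")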

  8. Crowdsourced assessment of common genetic contribution to predicting anti-TNF treatment response in rheumatoid arthritis

    PubMed Central

    Sieberts, Solveig K.; Zhu, Fan; García-García, Javier; Stahl, Eli; Pratap, Abhishek; Pandey, Gaurav; Pappas, Dimitrios; Aguilar, Daniel; Anton, Bernat; Bonet, Jaume; Eksi, Ridvan; Fornés, Oriol; Guney, Emre; Li, Hongdong; Marín, Manuel Alejandro; Panwar, Bharat; Planas-Iglesias, Joan; Poglayen, Daniel; Cui, Jing; Falcao, Andre O.; Suver, Christine; Hoff, Bruce; Balagurusamy, Venkat S. K.; Dillenberger, Donna; Neto, Elias Chaibub; Norman, Thea; Aittokallio, Tero; Ammad-ud-din, Muhammad; Azencott, Chloe-Agathe; Bellón, Víctor; Boeva, Valentina; Bunte, Kerstin; Chheda, Himanshu; Cheng, Lu; Corander, Jukka; Dumontier, Michel; Goldenberg, Anna; Gopalacharyulu, Peddinti; Hajiloo, Mohsen; Hidru, Daniel; Jaiswal, Alok; Kaski, Samuel; Khalfaoui, Beyrem; Khan, Suleiman Ali; Kramer, Eric R.; Marttinen, Pekka; Mezlini, Aziz M.; Molparia, Bhuvan; Pirinen, Matti; Saarela, Janna; Samwald, Matthias; Stoven, Véronique; Tang, Hao; Tang, Jing; Torkamani, Ali; Vert, Jean-Phillipe; Wang, Bo; Wang, Tao; Wennerberg, Krister; Wineinger, Nathan E.; Xiao, Guanghua; Xie, Yang; Yeung, Rae; Zhan, Xiaowei; Zhao, Cheng; Calaza, Manuel; Elmarakeby, Haitham; Heath, Lenwood S.; Long, Quan; Moore, Jonathan D.; Opiyo, Stephen Obol; Savage, Richard S.; Zhu, Jun; Greenberg, Jeff; Kremer, Joel; Michaud, Kaleb; Barton, Anne; Coenen, Marieke; Mariette, Xavier; Miceli, Corinne; Shadick, Nancy; Weinblatt, Michael; de Vries, Niek; Tak, Paul P.; Gerlag, Danielle; Huizinga, Tom W. J.; Kurreeman, Fina; Allaart, Cornelia F.; Louis Bridges Jr., S.; Criswell, Lindsey; Moreland, Larry; Klareskog, Lars; Saevarsdottir, Saedis; Padyukov, Leonid; Gregersen, Peter K.; Friend, Stephen; Plenge, Robert; Stolovitzky, Gustavo; Oliva, Baldo; Guan, Yuanfang; Mangravite, Lara M.

    2016-01-01

    Rheumatoid arthritis (RA) affects millions world-wide. While anti-TNF treatment is widely used to reduce disease progression, treatment fails in ∼one-third of patients. No biomarker currently exists that identifies non-responders before treatment. A rigorous community-based assessment of the utility of SNP data for predicting anti-TNF treatment efficacy in RA patients was performed in the context of a DREAM Challenge (http://www.synapse.org/RA_Challenge). An open challenge framework enabled the comparative evaluation of predictions developed by 73 research groups using the most comprehensive available data and covering a wide range of state-of-the-art modelling methodologies. Despite a significant genetic heritability estimate of the treatment non-response trait (h²=0.18, P=0.02), no significant genetic contribution to prediction accuracy is observed. Results formally confirm the expectations of the rheumatology community that SNP information does not significantly improve predictive performance relative to standard clinical traits, thereby justifying a refocusing of future efforts on collection of other data. PMID:27549343
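
    A schematic of the challenge's central comparison, hedged as a toy simulation in which the response carries no true SNP effect (synthetic data; the real challenge used cross-validated submissions, not this pipeline):

      # Does adding SNP features to clinical covariates improve
      # cross-validated prediction? Here, by construction, it should not.
      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      n, n_snps = 2000, 500
      clinical = rng.normal(size=(n, 5))
      snps = rng.integers(0, 3, size=(n, n_snps)).astype(float)
      response = clinical @ np.array([0.4, -0.3, 0.2, 0.1, 0.0]) \
                 + rng.normal(size=n)            # no true SNP effect

      for name, X in [("clinical only", clinical),
                      ("clinical + SNPs", np.hstack([clinical, snps]))]:
          r2 = cross_val_score(Ridge(alpha=1.0), X, response, cv=5).mean()
          print(f"{name}: mean CV R^2 = {r2:.3f}")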

  9. Projecting future expansion of invasive species: comparing and improving methodologies for species distribution modeling.

    PubMed

    Mainali, Kumar P; Warren, Dan L; Dhileepan, Kunjithapatham; McConnachie, Andrew; Strathie, Lorraine; Hassan, Gul; Karki, Debendra; Shrestha, Bharat B; Parmesan, Camille

    2015-12-01

    Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed some novel approaches to species distribution modeling aimed at reducing the influences of such challenges and improving the realism of projections. We estimated species-environment relationships for Parthenium hysterophorus L. (Asteraceae) with four modeling methods run with multiple scenarios of (i) sources of occurrences and geographically isolated background ranges for absences, (ii) approaches to drawing background (absence) points, and (iii) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved using a global dataset for model training, rather than restricting data input to the species' native range. AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e., into areas of potential future invasion) were very different depending on the modeling methods used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far away from the training region. Models that efficiently fit the dominant pattern, but exclude highly local patterns in the dataset and capture interactions as they appear in data (e.g., boosted regression trees), improved generalization of the models. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post hoc test conducted on a new Parthenium dataset from Nepal validated excellent predictive performance of our 'best' model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for parthenium. However, discrepancies between model predictions and parthenium invasion in Australia indicate successful management for this globally significant weed. © 2015 John Wiley & Sons Ltd.

  10. Race to the Top - Early Learning Challenge: 2014 Annual Performance Report. Wisconsin

    ERIC Educational Resources Information Center

    Race to the Top - Early Learning Challenge, 2015

    2015-01-01

    This Race to the Top - Early Learning Challenge (RTT-ELC) annual performance report for the year 2014 describes Wisconsin's accomplishments, lessons learned, challenges, and strategies Wisconsin will implement to address those challenges. During the second year of the Race to the Top - Early Learning Challenge (RTT-ELC) in Wisconsin, there have…

  11. Race to the Top - Early Learning Challenge: 2014 Annual Performance Report. Delaware

    ERIC Educational Resources Information Center

    Race to the Top - Early Learning Challenge, 2015

    2015-01-01

    This Race to the Top - Early Learning Challenge (RTT-ELC) annual performance report for the year 2014 describes Delaware's accomplishments, lessons learned, challenges, and strategies Delaware will implement to address those challenges. At the end of Year Three of the Early Learning Challenge Grant, Delaware continues to make significant progress…

  12. Test Driven Development of a Parameterized Ice Sheet Component

    NASA Astrophysics Data System (ADS)

    Clune, T.

    2011-12-01

    Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need for simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
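
    A minimal illustration of the TDD pattern described (a toy decay solver, not the ice sheet component): the test encodes a closed-form solution and an explicit error bound, and the implementation is written to satisfy it:

      # Write the test against a closed-form solution first, with an
      # explicit error bound, then implement the numerical routine.
      import math

      def integrate_decay(u0, k, dt, steps):
          """Forward-Euler integration of du/dt = -k*u."""
          u = u0
          for _ in range(steps):
              u += dt * (-k * u)
          return u

      def test_decay_matches_closed_form():
          u0, k, t = 1.0, 2.0, 1.0
          steps = 10_000
          numeric = integrate_decay(u0, k, t / steps, steps)
          exact = u0 * math.exp(-k * t)
          # Forward Euler is first order: error should scale like dt.
          assert abs(numeric - exact) < k * t * exact * (t / steps) * 10

      test_decay_matches_closed_form()
      print("decay solver agrees with closed form within its error bound")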

  13. A pervasive visual-haptic framework for virtual delivery training.

    PubMed

    Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V

    2010-03-01

    Thanks to advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges still remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by means of two hand-based haptic devices able to reproduce force feedback on the fingers and arms, thus enabling much more realistic manipulation with respect to stylus-based solutions. The interactive simulation is not solely driven by an approximated model of complex forces and physical constraints; instead, the whole labor and the assistance/intervention procedures are formally modeled by means of a timed automata network applied to a parametric 3-D model of the anatomy, able to mimic a wide range of configurations. This novel methodology is able to represent not only the sequence of the main events associated with either a spontaneous or an operative childbirth process, but also to help validate the manual intervention, as the actions performed by the user during the simulation are evaluated according to established medical guidelines. A discussion of the first results, as well as of the challenges still unaddressed, is included.

  14. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Robert

    2013-04-20

    Enhancing the performance of SciDAC applications on petascale systems had high priority within DOE SC at the start of the second phase of the SciDAC program, SciDAC-2, as it continues to do so today. Achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, the University of Southern California's Information Sciences Institute organized the Performance Engineering Research Institute (PERI). PERI implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high profile applications. Within PERI, USC's primary research activity was automatic tuning (autotuning) of scientific software. This activity was spurred by the strong user preference for automatic tools and was based on previous successful activities such as ATLAS, which automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our other major component was application engagement, to which we devoted approximately 30% of our effort to work directly with SciDAC-2 applications. This report is a summary of the overall results of the USC PERI effort.
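
    A toy sketch of the empirical autotuning idea (invented kernel and parameter space; real autotuners search far larger spaces of code variants):

      # Empirically search a parameter space (here, a block size for a
      # blocked matrix multiply) and keep the fastest variant on this machine.
      import time
      import numpy as np

      def blocked_matmul(a, b, bs):
          n = a.shape[0]
          c = np.zeros((n, n))
          for i in range(0, n, bs):
              for j in range(0, n, bs):
                  for k in range(0, n, bs):
                      c[i:i+bs, j:j+bs] += a[i:i+bs, k:k+bs] @ b[k:k+bs, j:j+bs]
          return c

      n = 256
      a, b = np.random.rand(n, n), np.random.rand(n, n)
      timings = {}
      for bs in (16, 32, 64, 128, 256):      # candidate code variants
          t0 = time.perf_counter()
          blocked_matmul(a, b, bs)
          timings[bs] = time.perf_counter() - t0
      best = min(timings, key=timings.get)
      print(f"selected block size {best} ({timings[best]*1e3:.1f} ms)")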

  15. Benchmarking novel approaches for modelling species range dynamics

    PubMed Central

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H.; Moore, Kara A.; Zimmermann, Niklaus E.

    2016-01-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species’ range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results affirm the clear merit of using dynamic approaches for modelling species’ response to climate change but also emphasise several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. PMID:26872305
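
    A schematic sketch of the simplest SDM hybrid benchmarked here, a suitability surface gated by dispersal from occupied cells (invented grids and thresholds):

      # Newly suitable cells can only be colonized from occupied ones:
      # propagule pressure is a smoothed footprint of the occupied range.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(4)
      suitability = gaussian_filter(rng.random((100, 100)), sigma=5)
      occupied = np.zeros_like(suitability, dtype=bool)
      occupied[45:55, 45:55] = True            # initial range

      for _ in range(20):                      # yearly colonization steps
          pressure = gaussian_filter(occupied.astype(float), sigma=2)
          colonizable = (pressure > 0.05) & (suitability > 0.6)
          occupied |= colonizable
      print("occupied cells after 20 steps:", int(occupied.sum()))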

  16. Worldwide evaluation of mean and extreme runoff from six global-scale hydrological models that account for human impacts

    NASA Astrophysics Data System (ADS)

    Zaherpour, Jamal; Gosling, Simon N.; Mount, Nick; Müller Schmied, Hannes; Veldkamp, Ted I. E.; Dankers, Rutger; Eisner, Stephanie; Gerten, Dieter; Gudmundsson, Lukas; Haddeland, Ingjerd; Hanasaki, Naota; Kim, Hyungjun; Leng, Guoyong; Liu, Junguo; Masaki, Yoshimitsu; Oki, Taikan; Pokhrel, Yadu; Satoh, Yusuke; Schewe, Jacob; Wada, Yoshihide

    2018-06-01

    Global-scale hydrological models are routinely used to assess water scarcity, flood hazards and droughts worldwide. Recent efforts to incorporate anthropogenic activities in these models have enabled more realistic comparisons with observations. Here we evaluate simulations from an ensemble of six models participating in the second phase of the Inter-Sectoral Impact Model Inter-comparison Project (ISIMIP2a). We simulate monthly runoff in 40 catchments, spatially distributed across eight global hydrobelts. The performance of each model and the ensemble mean is examined with respect to their ability to replicate observed mean and extreme runoff under human-influenced conditions. Application of a novel integrated evaluation metric to quantify the models’ ability to simulate time series of monthly runoff suggests that the models generally perform better in the wetter equatorial and northern hydrobelts than in drier southern hydrobelts. When model outputs are temporally aggregated to assess mean annual and extreme runoff, the models perform better. Nevertheless, we find a general trend in the majority of models towards the overestimation of mean annual runoff and all indicators of upper and lower extreme runoff. The models struggle to capture the timing of the seasonal cycle, particularly in northern hydrobelts, while in southern hydrobelts the models struggle to reproduce the magnitude of the seasonal cycle. It is noteworthy that across all hydrological indicators, the ensemble mean fails to perform better than any individual model—a finding that challenges the commonly held perception that model ensemble estimates deliver superior performance over individual models. The study highlights the need for continued model development and improvement. It also suggests that caution should be taken when summarising the simulations from a model ensemble based upon its mean output.
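
    A minimal sketch of the headline comparison, an efficiency metric per model and for the ensemble mean (synthetic runoff; the study's integrated metric is more elaborate than the Nash-Sutcliffe score used here):

      # Nash-Sutcliffe efficiency per model and for the ensemble mean,
      # which need not outperform the best individual model.
      import numpy as np

      rng = np.random.default_rng(5)
      obs = 50 + 20 * np.sin(np.linspace(0, 6 * np.pi, 120))   # monthly runoff

      def nse(sim, obs):
          return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      models = {f"model_{i}": obs * rng.uniform(0.8, 1.3)      # biased
                              + rng.normal(scale=8, size=obs.size)
                for i in range(6)}
      for name, sim in models.items():
          print(f"{name}: NSE = {nse(sim, obs):.2f}")
      ensemble_mean = np.mean(list(models.values()), axis=0)
      print(f"ensemble mean: NSE = {nse(ensemble_mean, obs):.2f}")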

  17. Benchmarking novel approaches for modelling species range dynamics.

    PubMed

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results affirm the clear merit of using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. © 2016 John Wiley & Sons Ltd.

  18. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.
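
    A schematic of the run-time observation idea (invented names; not the actual VOS interface): the model's time loop calls a hook, and attached observers can inspect or steer state in situ:

      # Lightweight observer hook: attached callbacks see (and may adjust)
      # model state each step, standing in for in-situ analytics/steering.
      class VirtualObserver:
          def __init__(self):
              self.observers = []

          def attach(self, fn):
              self.observers.append(fn)

          def notify(self, step, state):
              for fn in self.observers:
                  fn(step, state)    # may read state and/or adjust it

      vos = VirtualObserver()
      vos.attach(lambda step, s: print(f"step {step}: soil_temp={s['soil_temp']:.2f}"))

      state = {"soil_temp": 280.0}
      for step in range(3):               # stand-in for the land-model time loop
          state["soil_temp"] += 0.5       # stand-in for model physics
          vos.notify(step, state)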

  19. Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations

    DOE PAGES

    Wang, Dali; Yuan, Fengming; Hernandez, Benjamin; ...

    2017-01-01

    Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.

  20. Modelling invasion for a habitat generalist and a specialist plant species

    USGS Publications Warehouse

    Evangelista, P.H.; Kumar, S.; Stohlgren, T.J.; Jarnevich, C.S.; Crall, A.W.; Norman, J. B.; Barnett, D.T.

    2008-01-01

    Predicting suitable habitat and the potential distribution of invasive species is a high priority for resource managers and systems ecologists. Most models are designed to identify habitat characteristics that define the ecological niche of a species with little consideration to individual species' traits. We tested five commonly used modelling methods on two invasive plant species, the habitat generalist Bromus tectorum and habitat specialist Tamarix chinensis, to compare model performances, evaluate predictability, and relate results to distribution traits associated with each species. Most of the tested models performed similarly for each species; however, the generalist species proved to be more difficult to predict than the specialist species. The highest area under the receiver-operating characteristic curve (AUC) values with independent validation datasets were 0.503 for B. tectorum and 0.885 for T. chinensis. Similarly, a confusion matrix for B. tectorum had the highest overall accuracy of 55%, while the overall accuracy for T. chinensis was 85%. Models for the generalist species had varying performances, poor evaluations, and inconsistent results. This may be a result of a generalist's capability to persist in a wide range of environmental conditions that are not easily defined by the data, independent variables, or model design. Models for the specialist species had consistently strong performances, high evaluations, and similar results among different model applications. This is likely a consequence of the specialist's requirement for explicit environmental resources and ecological barriers that are easily defined by predictive models. Although defining new invaders as generalist or specialist species can be challenging, model performances and evaluations may provide valuable information on a species' potential invasiveness.
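
    A minimal illustration of the two evaluation figures quoted above, AUC via its rank-statistic formulation and overall accuracy from thresholded predictions (synthetic scores mimicking a well-predicted specialist and a poorly predicted generalist):

      # AUC as the probability that a random presence outranks a random
      # absence (Mann-Whitney formulation); accuracy from a 0.5 threshold.
      import numpy as np

      rng = np.random.default_rng(6)

      def auc(scores, labels):
          pos, neg = scores[labels == 1], scores[labels == 0]
          return (pos[:, None] > neg[None, :]).mean()

      labels = rng.integers(0, 2, 500)
      specialist = labels + rng.normal(scale=0.4, size=500)   # separable
      generalist = labels + rng.normal(scale=5.0, size=500)   # barely separable
      for name, s in [("specialist", specialist), ("generalist", generalist)]:
          pred = (s > 0.5).astype(int)
          acc = (pred == labels).mean()
          print(f"{name}: AUC = {auc(s, labels):.3f}, accuracy = {acc:.2f}")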

  1. Ontology Performance Profiling and Model Examination: First Steps

    NASA Astrophysics Data System (ADS)

    Wang, Taowei David; Parsia, Bijan

    "[Reasoner] performance can be scary, so much so, that we cannot deploy the technology in our products." - Michael Shepard. What are typical OWL users to do when their favorite reasoner never seems to return? In this paper, we present our first steps considering this problem. We describe the challenges and our approach, and present a prototype tool to help users identify reasoner performance bottlenecks with respect to their ontologies. We then describe 4 case studies on synthetic and real-world ontologies. While the anecdotal evidence suggests that the service can be useful for both ontology developers and reasoner implementors, much more is desired.

  2. What Differentiates Employees' Job Performance Under Stressful Situations: The Role of General Self-Efficacy.

    PubMed

    Lu, Chang-Qin; Du, Dan-Yang; Xu, Xiao-Min

    2016-10-02

    The aim of this research is to verify the two-dimensional challenge-hindrance stressor framework in the Chinese context, and investigate the moderating effect of general self-efficacy in the stress process. Data were collected from 164 Chinese employee-supervisor dyads. The results demonstrated that challenge stressors were positively related to job performance while hindrance stressors were negatively related to job performance. Furthermore, general self-efficacy strengthened the positive relationship between challenge stressors and job performance, whereas the attenuating effect of general self-efficacy on the negative relationship between hindrance stressors and job performance was nonsignificant. These findings qualify the two-dimensional challenge-hindrance stressor framework, and support the notion that employees with high self-efficacy benefit more from the positive effect of challenge stressors in the workplace. By investigating the role of an individual difference variable in the challenge-hindrance stressor framework, this research provides a more accurate picture of the nature of job stress, and enhances our understanding of the job stressor-job performance relationship.
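
    A sketch of the reported moderation test (simulated data with an invented challenge-by-GSE interaction; not the study's dataset), using an interaction term in an OLS model:

      # Job performance regressed on challenge and hindrance stressors,
      # general self-efficacy (GSE), and stressor-by-GSE interactions.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(7)
      n = 164
      df = pd.DataFrame({
          "challenge": rng.normal(size=n),
          "hindrance": rng.normal(size=n),
          "gse": rng.normal(size=n),
      })
      df["performance"] = (0.3 * df.challenge - 0.3 * df.hindrance
                           + 0.2 * df.challenge * df.gse      # moderation
                           + rng.normal(scale=0.8, size=n))

      fit = smf.ols("performance ~ challenge * gse + hindrance * gse", df).fit()
      print(fit.params.round(3))   # the challenge:gse term carries the moderation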

  3. Next generation human skin constructs as advanced tools for drug development.

    PubMed

    Abaci, H E; Guo, Zongyou; Doucet, Yanne; Jacków, Joanna; Christiano, Angela

    2017-11-01

    Many diseases, as well as side effects of drugs, manifest themselves through skin symptoms. Skin is a complex tissue that hosts various specialized cell types and performs many roles including physical barrier, immune and sensory functions. Therefore, modeling skin in vitro presents technical challenges for tissue engineering. Since the first attempts at engineering human epidermis in the 1970s, there has been a growing interest in generating full-thickness skin constructs mimicking physiological functions by incorporating various skin components, such as vasculature and melanocytes for pigmentation. Development of biomimetic in vitro human skin models with these physiological functions provides a new tool for drug discovery, disease modeling, regenerative medicine and basic research for skin biology. This goal, however, has long been delayed by the limited availability of different cell types, the challenges in establishing co-culture conditions, and the ability to recapitulate the 3D anatomy of the skin. Recent breakthroughs in induced pluripotent stem cell (iPSC) technology and microfabrication techniques such as 3D-printing have allowed for building more reliable and complex in vitro skin models for pharmaceutical screening. In this review, we focus on the current developments and prevailing challenges in generating skin constructs with vasculature, skin appendages such as hair follicles, pigmentation, immune response, innervation, and hypodermis. Furthermore, we discuss the promising advances that iPSC technology offers in order to generate in vitro models of genetic skin diseases, such as epidermolysis bullosa and psoriasis. We also discuss how future integration of the next generation human skin constructs onto microfluidic platforms along with other tissues could revolutionize the early stages of drug development by creating reliable evaluation of patient-specific effects of pharmaceutical agents. Impact statement: Skin is a complex tissue that hosts various specialized cell types and performs many roles including barrier, immune, and sensory functions. For human-relevant drug testing, there has been a growing interest in building more physiological skin constructs by incorporating different skin components, such as vasculature, appendages, pigment, innervation, and adipose tissue. This paper provides an overview of the strategies to build complex human skin constructs that can faithfully recapitulate human skin and thus can be used in drug development targeting skin diseases. In particular, we discuss recent developments and remaining challenges in incorporating various skin components, availability of iPSC-derived skin cell types and in vitro skin disease models. In addition, we provide insights on the future integration of these complex skin models with other organs on microfluidic platforms as well as potential readout technologies for high-throughput drug screening.

  4. Understanding the limits of animal models as predictors of human biology: lessons learned from the sbv IMPROVER Species Translation Challenge

    PubMed Central

    Mathis, Carole; Dulize, Rémi H. J.; Ivanov, Nikolai V.; Alexopoulos, Leonidas; Jeremy Rice, J.; Peitsch, Manuel C.; Stolovitzky, Gustavo; Meyer, Pablo; Hoeng, Julia

    2015-01-01

    Motivation: Inferring how humans respond to external cues such as drugs, chemicals, viruses or hormones is an essential question in biomedicine. Very often, however, this question cannot be addressed because it is not possible to perform experiments in humans. A reasonable alternative consists of generating responses in animal models and ‘translating’ those results to humans. The limitations of such translation, however, are far from clear, and systematic assessments of its actual potential are urgently needed. sbv IMPROVER (systems biology verification for Industrial Methodology for PROcess VErification in Research) was designed as a series of challenges to address translatability between humans and rodents. This collaborative crowd-sourcing initiative invited scientists from around the world to apply their own computational methodologies on a multilayer systems biology dataset composed of phosphoproteomics, transcriptomics and cytokine data derived from normal human and rat bronchial epithelial cells exposed in parallel to 52 different stimuli under identical conditions. Our aim was to understand the limits of species-to-species translatability at different levels of biological organization: signaling, transcriptional and release of secreted factors (such as cytokines). Participating teams submitted 49 different solutions across the sub-challenges, two-thirds of which were statistically significantly better than random. Additionally, similar computational methods were found to range widely in their performance within the same challenge, and no single method emerged as a clear winner across all sub-challenges. Finally, computational methods were able to effectively translate some specific stimuli and biological processes in the lung epithelial system, such as DNA synthesis, cytoskeleton and extracellular matrix, translation, immune/inflammation and growth factor/proliferation pathways, better than the expected response similarity between species. Contact: pmeyerr@us.ibm.com or Julia.Hoeng@pmi.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25236459

  5. Understanding the limits of animal models as predictors of human biology: lessons learned from the sbv IMPROVER Species Translation Challenge.

    PubMed

    Rhrissorrakrai, Kahn; Belcastro, Vincenzo; Bilal, Erhan; Norel, Raquel; Poussin, Carine; Mathis, Carole; Dulize, Rémi H J; Ivanov, Nikolai V; Alexopoulos, Leonidas; Rice, J Jeremy; Peitsch, Manuel C; Stolovitzky, Gustavo; Meyer, Pablo; Hoeng, Julia

    2015-02-15

    Inferring how humans respond to external cues such as drugs, chemicals, viruses or hormones is an essential question in biomedicine. Very often, however, this question cannot be addressed because it is not possible to perform experiments in humans. A reasonable alternative consists of generating responses in animal models and 'translating' those results to humans. The limitations of such translation, however, are far from clear, and systematic assessments of its actual potential are urgently needed. sbv IMPROVER (systems biology verification for Industrial Methodology for PROcess VErification in Research) was designed as a series of challenges to address translatability between humans and rodents. This collaborative crowd-sourcing initiative invited scientists from around the world to apply their own computational methodologies on a multilayer systems biology dataset composed of phosphoproteomics, transcriptomics and cytokine data derived from normal human and rat bronchial epithelial cells exposed in parallel to 52 different stimuli under identical conditions. Our aim was to understand the limits of species-to-species translatability at different levels of biological organization: signaling, transcriptional and release of secreted factors (such as cytokines). Participating teams submitted 49 different solutions across the sub-challenges, two-thirds of which were statistically significantly better than random. Additionally, similar computational methods were found to range widely in their performance within the same challenge, and no single method emerged as a clear winner across all sub-challenges. Finally, computational methods were able to effectively translate some specific stimuli and biological processes in the lung epithelial system, such as DNA synthesis, cytoskeleton and extracellular matrix, translation, immune/inflammation and growth factor/proliferation pathways, better than the expected response similarity between species. pmeyerr@us.ibm.com or Julia.Hoeng@pmi.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  6. Effective grouping for energy and performance: Construction of adaptive, sustainable, and maintainable data storage

    NASA Astrophysics Data System (ADS)

    Essary, David S.

    The performance gap between processors and storage systems has grown increasingly critical over the years. Yet the performance disparity remains, and further, storage energy consumption is rapidly becoming a new critical problem. While smarter caching and predictive techniques do much to alleviate this disparity, the problem persists, and data storage remains a growing contributor to latency and energy consumption. Attempts have been made at data layout maintenance, or intelligent physical placement of data, yet in practice, basic heuristics remain predominant. Problems that early studies sought to solve via layout strategies were proven to be NP-hard, and data layout maintenance today remains more art than science. With unknown potential and a domain inherently full of uncertainty, layout maintenance persists as an area largely untapped by modern systems. But uncertainty in workloads does not imply randomness; access patterns have exhibited repeatable, stable behavior. Predictive information can be gathered, analyzed, and exploited to improve data layouts. Our goal is a dynamic, robust, sustainable predictive engine, aimed at improving existing layouts by replicating data at the storage device level. We present a comprehensive discussion of the design and construction of such a predictive engine, including workload evaluation, where we present and evaluate classical workloads as well as our own highly detailed traces collected over an extended period. We demonstrate significant gains through an initial static grouping mechanism, compare against an optimal grouping method of our own construction, and further show significant improvement over competing techniques. We also explore and illustrate the challenges faced when moving from static to dynamic (i.e., online) grouping, and provide motivation and solutions for addressing these challenges. These challenges include metadata storage, appropriate predictive collocation, online performance, and physical placement. We reduced the metadata needed by several orders of magnitude, reducing the required volume from more than 14% of total storage down to less than 0.5%. We also demonstrate how our collocation strategies outperform competing techniques. Finally, we present our complete model and evaluate a prototype implementation against real hardware. This model was demonstrated to be capable of reducing device-level accesses by up to 65%. Keywords: computer systems, collocation, data management, file systems, grouping, metadata, modeling and prediction, operating systems, performance, power, secondary storage.
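
    A toy version of access-pattern-driven grouping in the spirit described above (a tiny hand-made trace; real traces and placement logic are far richer):

      # Count how often block pairs appear in the same access window, then
      # greedily collocate the most strongly co-accessed blocks.
      from collections import Counter
      from itertools import combinations

      trace = [1, 2, 3, 1, 2, 7, 8, 7, 8, 1, 2, 3, 7, 8, 9]
      window = 3
      co_access = Counter()
      for i in range(len(trace) - window + 1):
          for a, b in combinations(sorted(set(trace[i:i + window])), 2):
              co_access[(a, b)] += 1

      groups, placed = [], set()
      for (a, b), _ in co_access.most_common():
          if a not in placed and b not in placed:
              groups.append([a, b])          # collocate this pair on disk
              placed.update((a, b))
      print("strongest co-access groups:", groups)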

  7. Hair loss and regeneration performed on animal models

    PubMed Central

    ORASAN, MEDA SANDRA; ROMAN, IULIA IOANA; CONEAC, ANDREI; MURESAN, ADRIANA; ORASAN, REMUS IOAN

    2016-01-01

    Research into reversing hair loss remains a challenging subject. As Minoxidil 2% or 5% and Finasteride are so far the only FDA-approved treatments for inducing hair regrowth, research is necessary in order to improve the therapeutic approach to alopecia. In vitro studies have focused on cultures of a single cell type (dermal papilla) or organ culture of isolated hair follicles. In vivo research on this topic has been performed on mice, rats, hamsters, rabbits, sheep and monkeys, taking into consideration the advantages and disadvantages of each animal model and the depilation options. Further studies are required not only to compare the efficiency of different therapies but, more importantly, to establish their long-term safety. PMID:27547051

  8. Developing non-technical ward-round skills.

    PubMed

    Harvey, Rachel; Mellanby, Edward; Dearden, Effie; Medjoub, Karima; Edgar, Simon

    2015-10-01

    Conducting clinical 'rounds' is one of the most onerous and important duties that every junior doctor is expected to perform. There is evidence that newly qualified doctors are not adequately prepared by their undergraduate experiences for this task. The aim of this study was to analyse the challenges pertaining to non-technical skills that students would face during ward rounds, and to create a model that facilitates the transition from medical student to doctor. A total of 217 final-year medical students completed a simulated ward round. Free-text responses were analysed using template analysis, applying an a priori template developed from the literature by the research team. This drew on the generic categories of non-technical skills suggested by Flin et al. Ninety-seven per cent of students agreed or strongly agreed that the simulated ward round improved their insight into the challenges of ward rounds and their perceived ability to work efficiently as an active member of the ward round. The responding students (206) submitted written feedback describing the learning that they planned to use: 800 learning points were recorded, and all could be categorised into one of seven non-technical skills. We believe that the improved task efficiency and insight into the challenges of the ward round gained by medical students will lead to an enhancement in performance during clinical rounds, and will have a positive impact on patient safety. We would suggest that undergraduate medical schools consider this model in the preparation for the clinical practice element of the curriculum. © 2015 John Wiley & Sons Ltd.

  9. Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model.

    PubMed

    Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai

    2017-02-08

    Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences.
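
    An illustrative threshold-plus-phases detector in the spirit of the multiphase model described above (invented thresholds and a synthetic trace; not the authors' algorithm):

      # Scan accelerometer magnitude for a free-fall dip, an impact spike,
      # and a quiet rest phase in sequence.
      import numpy as np

      FREE_FALL_G, IMPACT_G, REST_G = 0.6, 2.5, 1.2   # magnitude thresholds (g)

      def detect_fall(mag, fs=50):
          """mag: acceleration magnitude in g, sampled at fs Hz."""
          i = 0
          while i < len(mag):
              if mag[i] < FREE_FALL_G:                          # free-fall phase
                  impact_win = mag[i : i + fs]                  # look ahead 1 s
                  j = np.argmax(impact_win)
                  if impact_win[j] > IMPACT_G:                  # impact phase
                      rest_win = mag[i + j + fs : i + j + 3 * fs]
                      if rest_win.size and np.all(rest_win < REST_G):  # rest phase
                          return True
              i += 1
          return False

      t = np.ones(500)                 # ~1 g standing
      t[200:210] = 0.3                 # free fall
      t[212] = 3.5                     # impact
      t[262:400] = 1.0                 # lying still
      print("fall detected:", detect_fall(t))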

  10. Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model

    PubMed Central

    Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai

    2017-01-01

    Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences. PMID:28208694

  11. Aligning Learning and Talent Development Performance Outcomes with Organizational Objectives: A Proposed Model

    ERIC Educational Resources Information Center

    Ware, Iris

    2017-01-01

    The value proposition for learning and talent development (LTD) is often challenged due to human resources' inability to demonstrate meaningful outcomes in relation to organizational needs and return-on-investment. The primary role of human resources (HR) and the learning and talent development (LTD) function is to produce meaningful outcomes to…

  12. The Defense Life Cycle Management System as a Working Model for Academic Application

    ERIC Educational Resources Information Center

    Burian, Philip E.; Keffel, Leslie M.; Maffei, Francis R., III

    2011-01-01

    Performing the review and assessment of master's-level degree programs can be an overwhelming and challenging endeavor. Getting organized and mapping out the entire review and assessment process can be extremely helpful and, more importantly, provide a path for successfully accomplishing the review and assessment of the entire program. This paper…

  13. Combining Readers Theater, Story Mapping and Video Self-Modeling Interventions to Improve Narrative Reading Comprehension in Children with High-Functioning Autism

    ERIC Educational Resources Information Center

    Schatz, Rochelle B.

    2017-01-01

    Individuals with High-Functioning Autism Spectrum Disorder (HFA) demonstrate atypical development resulting in significant deficits in the areas of perspective-taking and observational learning. These deficits lead to challenges in social interactions and academic performance. In particular, children with HFA tend to struggle with comprehending…

  14. Building Curriculum-Based Concerts: Tired of the Same Old Approach to Your Ensemble's Concert and Festival Schedule?

    ERIC Educational Resources Information Center

    Russell, Joshua A.

    2006-01-01

    Since--and even before--the National Standards for Music Education were published, music educators have tried to balance the expectations associated with the traditional performance curriculum and contemporary models of music education. The Standards-based curriculum challenges directors to consider how student experiences and learning can be…

  15. A Sense of Balance: District Aligns Personalized Learning with School and System Goals

    ERIC Educational Resources Information Center

    Donsky, Debbie; Witherow, Kathy

    2015-01-01

    This article addresses the challenge of personalizing learning while also ensuring alignment with system and school improvement plans. Leaders of the York Region District School Board in Ontario knew that what took their high-performing school district from good to great would not take it from great to excellent. The district's early model of…

  16. Wiki or Word? Evaluating Tools for Collaborative Writing and Editing

    ERIC Educational Resources Information Center

    Dishaw, Mark; Eierman, Michael A.; Iversen, Jakob H.; Philip, George C.

    2011-01-01

    Businesses and other organizations are relying increasingly on virtual teams to perform a range of business activities. A key challenge in utilizing virtual teams is to support collaboration among team members who are separated by distance and/or time. In this paper we use a research model based on a combination of the Technology Acceptance Model…

  17. Challenging Freedom: Neoliberalism and the Erosion of Democratic Education

    ERIC Educational Resources Information Center

    Karaba, Robert

    2016-01-01

    Goodlad, et al. (2002) rightly point out that a culture can either resist or support change. Schein's (2010) model of culture indicates observable behaviors of a culture can be explained by exposing underlying shared values and basic assumptions that give meaning to the performance. Yet culture is many-faceted and complex. So Schein advised a…

  18. Effects and detection of raw material variability on the performance of near-infrared calibration models for pharmaceutical products.

    PubMed

    Igne, Benoit; Shi, Zhenqi; Drennen, James K; Anderson, Carl A

    2014-02-01

    The impact of raw material variability on the prediction ability of a near-infrared calibration model was studied. Calibrations, developed from a quaternary mixture design comprising theophylline anhydrous, lactose monohydrate, microcrystalline cellulose, and soluble starch, were challenged by intentional variation of raw material properties. A design with two theophylline physical forms, three lactose particle sizes, and two starch manufacturers was created to test model robustness. Further challenges to the models were accomplished through environmental conditions. Along with full-spectrum partial least squares (PLS) modeling, variable selection by dynamic backward PLS and genetic algorithms was utilized in an effort to mitigate the effects of raw material variability. In addition to evaluating models based on their prediction statistics, prediction residuals were analyzed by analyses of variance and model diagnostics (Hotelling's T² and Q residuals). Full-spectrum models were significantly affected by lactose particle size. Models developed by selecting variables gave lower prediction errors and proved to be a good approach to limit the effect of changing raw material characteristics. Hotelling's T² and Q residuals provided valuable information that was not detectable when studying only prediction trends. Diagnostic statistics were demonstrated to be critical in the appropriate interpretation of the prediction of quality parameters. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
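
    The diagnostics mentioned here are straightforward to compute from a fitted PLS model. The following Python sketch, using scikit-learn on stand-in data (the spectra and component count are hypothetical, not the paper's), shows Hotelling's T² from the score space and Q residuals from the spectral reconstruction error:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Stand-in for NIR spectra (samples x wavelengths) and an assay value.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=60)

pls = PLSRegression(n_components=4, scale=False).fit(X, y)
T, P = pls.x_scores_, pls.x_loadings_

# Hotelling's T^2: leverage in score space, scaled by score variance.
t2 = np.sum((T / T.std(axis=0, ddof=1)) ** 2, axis=1)

# Q residuals: squared spectral variation left outside the model.
E = (X - X.mean(axis=0)) - T @ P.T
q = np.sum(E ** 2, axis=1)

# Samples with large T^2 or Q merit scrutiny before trusting predictions.
print(t2.max().round(2), q.max().round(2))
```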

  19. Advanced Noise Abatement Procedures for a Supersonic Business Jet

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Jones, Scott M.; Seidel, Jonathan A.; Huff, Dennis L.

    2017-01-01

    Supersonic civil aircraft present a unique noise certification challenge. High specific thrust required for supersonic cruise results in high engine exhaust velocity and high levels of jet noise during takeoff. Aerodynamics of thin, low-aspect-ratio wings equipped with relatively simple flap systems deepen the challenge. Advanced noise abatement procedures have been proposed for supersonic aircraft. These procedures promise to reduce airport noise, but they may require departures from normal reference procedures defined in noise regulations. The subject of this report is a takeoff performance and noise assessment of a notional supersonic business jet. Analytical models of an airframe and a supersonic engine derived from a contemporary subsonic turbofan core are developed. These models are used to predict takeoff trajectories and noise. Results indicate advanced noise abatement takeoff procedures are helpful in reducing noise along lateral sidelines.

  20. Successes and Challenges of HIV Mentoring in Malawi: The Mentee Perspective.

    PubMed

    Chien, Emily; Phiri, Khumbo; Schooley, Alan; Chivwala, Mackenzie; Hamilton, John; Hoffman, Risa M

    2016-01-01

    HIV clinical mentoring has been utilized for capacity building in Africa, but few formal program evaluations have explored mentee perspectives on these programs. EQUIP is a PEPFAR-USAID funded program in Malawi that has been providing HIV mentoring on clinical and health systems since 2010. We sought to understand the successes and challenges of EQUIP's mentorship program. From June to September 2014, we performed semi-structured, in-depth interviews with EQUIP mentees who had received mentoring for ≥ 1 year. Interviews focused on program successes and challenges and were conducted in English, audio-recorded, coded, and analyzed using inductive content analysis with ATLAS.ti v7. Fifty-two mentees from 32 health centers were interviewed. The majority of mentees were 18-40 years old (79%, N = 41), 69% (N = 36) were male, 50% (N = 26) were nurses, 29% (N = 15) medical assistants, and 21% (N = 11) clinical officers. All mentees felt that EQUIP mentorship was successful (100%, N = 52). The most common benefit reported was an increase in clinical knowledge allowing for initiation of antiretroviral therapy (33%, N = 17). One-third of mentees (N = 17) reported increased clinic efficiency and improved systems for patient care due to EQUIP's systems mentoring, including documentation, supply chain, and support for minor construction at clinics. The most common challenge (52%, N = 27) was understaffing at facilities, with mentees having multiple responsibilities during mentorship visits, resulting in an impaired ability to focus on learning. Mentees also reported that medication stock-outs (42%, N = 22) created challenges for the mentoring process. EQUIP's systems-based mentorship and infrastructure improvements allowed for an optimized environment for clinical training. Shortages of health workers at sites pose a challenge for mentoring programs because mentees are pulled from learning experiences to perform non-HIV-related clinic duties. Evaluations of existing mentoring models are needed to continue to improve mentoring strategies that result in sustainable benefits for mentees, facilities, and patients.

  1. Evaluating the morphology of the left atrial appendage by a transesophageal echocardiographic 3-dimensional printed model

    PubMed Central

    Song, Hongning; Zhou, Qing; Zhang, Lan; Deng, Qing; Wang, Yijia; Hu, Bo; Tan, Tuantuan; Chen, Jinling; Pan, Yiteng; He, Fazhi

    2017-01-01

    The novel 3-dimensional printing (3DP) technique has shown its ability to assist personalized cardiac intervention therapy. This study aimed to determine the feasibility of 3D-printed left atrial appendage (LAA) models based on 3D transesophageal echocardiography (3D TEE) data and their application value in treating LAA occlusions. Eighteen patients with transcatheter LAA occlusion and preprocedure 3D TEE and cardiac computed tomography were enrolled. 3D TEE volumetric data of the LAA were acquired and postprocessed for 3DP. Two types of 3D models of the LAA (ie, a hard chamber model and a flexible wall model) were printed by a 3D printer. The morphological classification and lobe identification of the LAA were assessed by the 3D chamber model, and LAA dimensions were measured via the 3D wall model. Additionally, a simulation operative rehearsal was performed on the 3D models in cases of challenging LAA morphology for the purpose of understanding the interactions between the device and the model. Three-dimensional TEE volumetric data of the LAA were successfully reprocessed and printed as 3D LAA chamber models and 3D LAA wall models in all patients. The consistency of the morphological classifications of the LAA based on 3D models and cardiac computed tomography was 0.92 (P < .01). The LAA ostium dimensions and depth measured on the 3D models did not differ significantly from those measured on 3D TEE (P > .05). A simulation occlusion was successfully performed on the 3D models of the 2 challenging cases and compared with the real procedure. The echocardiographic 3DP technique is feasible and accurate in reflecting the spatial morphology of the LAA, which may be promising for the personalized planning of transcatheter LAA occlusion. PMID:28930824

  2. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.; Miller, Dwight Peter

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-systems (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of performance-shaping factors (PSFs) that might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  3. Perspectives to Performance of Environment and Health Assessments and Models—From Outputs to Outcomes?

    PubMed Central

    Pohjola, Mikko V.; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T.

    2013-01-01

    The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives to assessment and model performance in the scientific literature can be called: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict and methods, tools and frameworks in different perspectives may overlap. However, altogether it seems that most approaches to assessment and model performance are relatively narrow in their scope. The focus in most approaches is on the outputs and making of assessments and models. Practical application of the outputs and the consequential outcomes are often left unaddressed. It appears that more comprehensive approaches that combine the essential characteristics of different perspectives are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge. PMID:23803642

  4. Board evaluation and effectiveness: models, components and perspectives.

    PubMed

    Scharf, M; Marty, D; Barnsley, J

    1994-01-01

    Health facility boards are being challenged to increase their effectiveness in the face of the changing health care environment. To this end, accreditation standards require boards to develop methods of evaluating their governing function and performance. During a survey of governance issues, the authors interviewed a group of health service executives with respect to board evaluation at their facilities. The responses yielded insights relating to models and components of evaluation, board missions and policies, mentoring programs and trustee education and orientation.

  5. Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges.

    PubMed

    Chatterji, Madhabi

    2016-12-01

    This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Multi-omics and metabolic modelling pipelines: challenges and tools for systems microbiology.

    PubMed

    Fondi, Marco; Liò, Pietro

    2015-02-01

    Integrated -omics approaches are quickly spreading across microbiology research labs, leading to (i) the possibility of detecting previously hidden features of microbial cells like multi-scale spatial organization and (ii) tracing molecular components across multiple cellular functional states. This promises to reduce the knowledge gap between genotype and phenotype and poses new challenges for computational microbiologists. We underline how the capability to unravel the complexity of microbial life will strongly depend on the integration of the huge and diverse amount of information that can be derived today from -omics experiments. In this work, we present opportunities and challenges of multi-omics data integration in current systems biology pipelines. We here discuss which layers of biological information are important for biotechnological and clinical purposes, with a special focus on bacterial metabolism and modelling procedures. A general review of the most recent computational tools for performing large-scale dataset integration is also presented, together with a possible framework to guide the design of systems biology experiments by microbiologists. Copyright © 2015. Published by Elsevier GmbH.

  7. Implementing the community health worker model within diabetes management: challenges and lessons learned from programs across the United States.

    PubMed

    Cherrington, Andrea; Ayala, Guadalupe X; Amick, Halle; Allison, Jeroan; Corbie-Smith, Giselle; Scarinci, Isabel

    2008-01-01

    The purpose of this qualitative study was to examine methods of implementation of the community health worker (CHW) model within diabetes programs, as well as related challenges and lessons learned. Semi-structured interviews were conducted with program managers. Four databases (PubMed, CINAHL, ISI Web of Knowledge, PsycInfo), the CDC's 1998 directory of CHW programs, and Google Search Engine were used to identify CHW programs. Criteria for inclusion were: DM program; used CHW strategy; occurred in the United States. Two independent reviewers performed content analyses to identify major themes and findings. Sixteen programs were assessed; all but 3 focused on minority populations. Most CHWs were recruited informally; 6 programs required CHWs to have diabetes. CHW roles and responsibilities varied across programs; educator was the most commonly identified role. Training also varied in terms of both content and intensity. All programs gave CHWs remuneration for their work. Common challenges included difficulties with CHW retention, intervention fidelity, and issues related to sustainability. Cultural and gender issues also emerged. Examples of lessons learned included the need for community buy-in and the need to anticipate nondiabetes related issues. Lessons learned from these programs may be useful to others as they apply the CHW model to diabetes management within their own communities. Further research is needed to elucidate the specific features of this model necessary to positively impact health outcomes.

  8. Non-lethal control of the cariogenic potential of an agent-based model for dental plaque.

    PubMed

    Head, David A; Marsh, Phil D; Devine, Deirdre A

    2014-01-01

    Dental caries or tooth decay is a prevalent global disease whose causative agent is the oral biofilm known as plaque. According to the ecological plaque hypothesis, this biofilm becomes pathogenic when external challenges drive it towards a state with a high proportion of acid-producing bacteria. Determining which factors control biofilm composition is therefore desirable when developing novel clinical treatments to combat caries, but is also challenging due to the system complexity and the existence of multiple bacterial species performing similar functions. Here we employ agent-based mathematical modelling to simulate a biofilm consisting of two competing, distinct types of bacterial populations, each parameterised by their nutrient uptake and aciduricity, periodically subjected to an acid challenge resulting from the metabolism of dietary carbohydrates. It was found that one population was progressively eliminated from the system to give either a benign or a pathogenic biofilm, with a tipping point between these two fates depending on a multiplicity of factors relating to microbial physiology and biofilm geometry. Parameter sensitivity was quantified by individually varying the model parameters against putative experimental measures, suggesting non-lethal interventions that can favourably modulate biofilm composition. We discuss how the same parameter sensitivity data can be used to guide the design of validation experiments, and argue for the benefits of in silico modelling in providing an additional predictive capability upstream from in vitro experiments.
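
    The tipping-point behavior described above can be caricatured without any spatial structure. The Python fragment below is a deliberately crude, non-spatial stand-in for the paper's agent-based model, with all rates invented: two populations compete for shared capacity while periodic acid pulses penalize the acid-sensitive one more heavily.

```python
# Non-spatial caricature of two competing plaque populations under a
# periodic acid challenge. All parameters are invented for illustration.
dt, t_end = 0.01, 200.0
benign, acidogenic = 0.50, 0.50
t = 0.0
while t < t_end:
    acid = 1.0 if (t % 8.0) < 1.0 else 0.0        # dietary carbohydrate pulse
    space = 1.0 - benign - acidogenic             # shared carrying capacity
    benign += dt * (0.30 * benign * space - 0.20 * acid * benign)
    acidogenic += dt * (0.28 * acidogenic * space - 0.02 * acid * acidogenic)
    t += dt
# More frequent pulses drive the composition toward the acidogenic state.
print(f"benign={benign:.3f}, acidogenic={acidogenic:.3f}")
```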

  9. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
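
    The low-rank approximation at the heart of this idea is in the Nyström family: evaluate the kernel only between all points and a small prototype set. The Python toy sketch below uses random prototype selection, whereas the paper grounds the choice in the two explicit criteria quoted above:

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian (RBF) kernel between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 8))
m = 50                                     # number of prototype vectors
protos = X[rng.choice(len(X), m, replace=False)]

C = rbf(X, protos)                         # n x m cross-kernel
W = rbf(protos, protos)                    # m x m prototype kernel
K_approx = C @ np.linalg.pinv(W) @ C.T     # low-rank surrogate for the n x n kernel
print(K_approx.shape)                      # (1000, 1000) from m x m algebra only
```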

  10. Improved Doubly Robust Estimation when Data are Monotonely Coarsened, with Application to Longitudinal Studies with Dropout

    PubMed Central

    Tsiatis, Anastasios A.; Davidian, Marie; Cao, Weihua

    2010-01-01

    A routine challenge is that of making inference on parameters in a statistical model of interest from longitudinal data subject to dropout, a special case of the more general setting of monotonely coarsened data. Considerable recent attention has focused on doubly robust estimators, which in this context involve positing models for both the missingness (more generally, coarsening) mechanism and aspects of the distribution of the full data, and which have the appealing property of yielding consistent inferences if only one of these models is correctly specified. Doubly robust estimators have been criticized for potentially disastrous performance when both of these models are even only mildly misspecified. We propose a doubly robust estimator applicable in general monotone coarsening problems that achieves comparable or improved performance relative to existing doubly robust methods, which we demonstrate via simulation studies and by application to data from an AIDS clinical trial. PMID:20731640
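
    For orientation, the simplest doubly robust (augmented inverse-probability-weighted) estimator of a mean outcome, with a single missingness indicator R, a posited missingness model π̂, and an outcome model m̂, is the textbook special case below; the paper works in the far more general monotone-coarsening setting, and its proposed estimator is a refinement of, not identical to, this formula:

\[
\hat{\mu}_{DR} = \frac{1}{n}\sum_{i=1}^{n}\left[\frac{R_i Y_i}{\hat{\pi}(X_i)} - \frac{R_i - \hat{\pi}(X_i)}{\hat{\pi}(X_i)}\,\hat{m}(X_i)\right],
\]

    which is consistent for E(Y) if either π̂ or m̂ is correctly specified, the "double robustness" property cited above.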

  11. Volume Dynamics Propulsion System Modeling for Supersonics Vehicle Research

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; Connolly, Joseph W.; Paxson, Daniel E.; Ma, Peter

    2010-01-01

    Under the NASA Fundamental Aeronautics Program, the Supersonics Project is working to overcome the obstacles to supersonic commercial flight. The proposed vehicles are long slim body aircraft with pronounced aero-servo-elastic modes. These modes can potentially couple with propulsion system dynamics, leading to performance challenges such as aircraft ride quality and stability. Other disturbances upstream of the engine generated from atmospheric wind gusts, angle of attack, and yaw can have similar effects. In addition, for optimal propulsion system performance, normal inlet-engine operations are required to be closer to compressor stall and inlet unstart. To study these phenomena an integrated model is needed that includes both airframe structural dynamics as well as the propulsion system dynamics. This paper covers the propulsion system component volume dynamics modeling of a turbojet engine that will be used for an integrated vehicle Aero-Propulso-Servo-Elastic model and for propulsion efficiency studies.
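
    As a representative (assumed, not taken from the paper) building block for such component volume dynamics, a lumped control volume V with inflow and outflow mass rates gives, from the ideal gas law at constant temperature,

\[
\dot{P} = \frac{R\,T}{V}\left(\dot{m}_{in} - \dot{m}_{out}\right),
\]

    so stacking such volumes along the engine flow path yields the dynamic pressure states that can couple with the airframe's aero-servo-elastic modes.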

  12. Volume Dynamics Propulsion System Modeling for Supersonics Vehicle Research

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; Connolly, Joseph W.; Paxson, Daniel E.; Ma, Peter

    2008-01-01

    Under the NASA Fundamental Aeronautics Program, the Supersonics Project is working to overcome the obstacles to supersonic commercial flight. The proposed vehicles are long slim body aircraft with pronounced aero-servo-elastic modes. These modes can potentially couple with propulsion system dynamics, leading to performance challenges such as aircraft ride quality and stability. Other disturbances upstream of the engine generated from atmospheric wind gusts, angle of attack, and yaw can have similar effects. In addition, for optimal propulsion system performance, normal inlet-engine operations are required to be closer to compressor stall and inlet unstart. To study these phenomena an integrated model is needed that includes both airframe structural dynamics as well as the propulsion system dynamics. This paper covers the propulsion system component volume dynamics modeling of a turbojet engine that will be used for an integrated vehicle Aero-Propulso-Servo-Elastic model and for propulsion efficiency studies.

  14. Validation of pore network simulations of ex-situ water distributions in a gas diffusion layer of proton exchange membrane fuel cells with X-ray tomographic images

    NASA Astrophysics Data System (ADS)

    Agaesse, Tristan; Lamibrac, Adrien; Büchi, Felix N.; Pauchet, Joel; Prat, Marc

    2016-11-01

    Understanding and modeling two-phase flows in the gas diffusion layer (GDL) of proton exchange membrane fuel cells are important in order to improve fuel cell performance. They are scientifically challenging because of the peculiarities of GDL microstructures. In the present work, simulations on a pore network model are compared to X-ray tomographic images of water distributions during an ex-situ water invasion experiment. A method based on watershed segmentation was developed to extract a pore network from the 3D segmented image of the dry GDL. Pore network modeling and a full morphology model were then used to perform two-phase simulations, and the results were compared to the experimental data. The results show good agreement between experimental and simulated microscopic water distributions. Pore network extraction parameters were also benchmarked using the experimental data and results from full morphology simulations.
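
    A compact version of that extraction pipeline can be written with scikit-image. The sketch below is a simplified stand-in, not the authors' code: a toy binary volume replaces the segmented tomogram, and pore bodies are partitioned by a distance-transform watershed.

```python
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Toy binary volume standing in for a segmented GDL scan (True = pore).
pore = np.zeros((48, 48, 48), dtype=bool)
pore[6:22, 6:22, 4:44] = True
pore[26:42, 26:42, 4:44] = True

dist = ndimage.distance_transform_edt(pore)      # distance to the solid phase
coords = peak_local_max(dist, labels=pore, min_distance=4)
markers = np.zeros(pore.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
labels = watershed(-dist, markers, mask=pore)    # one label per pore body
print("pore bodies found:", labels.max())
```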

  15. Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction

    NASA Astrophysics Data System (ADS)

    Mach, J. C.; Budrow, C. J.; Pagan, D. C.; Ruff, J. P. C.; Park, J.-S.; Okasinski, J.; Beaudoin, A. J.; Miller, M. P.

    2017-05-01

    Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present work, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with intent to develop significant gradients in residual stress, be representative of real-world engineering application, yet remain tractable for finely spaced strain measurements with positioning equipment available at synchrotron facilities. The experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.
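
    The diffraction measurements underpinning such a validation rest on two standard relations: Bragg's law locates the lattice-plane spacing d, and the elastic lattice strain follows from its shift against a stress-free reference spacing d₀,

\[
\lambda = 2\,d\sin\theta, \qquad \varepsilon = \frac{d - d_0}{d_0},
\]

    with residual stress then recovered from strain through the material's elastic constants.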

  16. Improved Range Estimation Model for Three-Dimensional (3D) Range Gated Reconstruction

    PubMed Central

    Chua, Sing Yee; Guo, Ningqun; Tan, Ching Seong; Wang, Xin

    2017-01-01

    Accuracy is an important measure of system performance and remains a challenge in 3D range gated reconstruction despite advances in laser and sensor technology. The weighted average model that is commonly used for range estimation is heavily influenced by intensity variation due to various factors. Improving the accuracy of range estimation is therefore important to fully optimise system performance. In this paper, a 3D range gated reconstruction model is derived based on the operating principles of range gated imaging and time slicing reconstruction, the fundamentals of radiant energy, Laser Detection And Ranging (LADAR), and the Bidirectional Reflectance Distribution Function (BRDF). Accordingly, a new range estimation model is proposed to alleviate the effects induced by distance, target reflection, and range distortion. From the experimental results, the proposed model outperforms the conventional weighted average model and improves range estimation for better 3D reconstruction. The outcome demonstrated is of interest to various laser ranging applications and can serve as a reference for future works. PMID:28872589
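
    The conventional weighted average model referenced here forms an intensity-weighted centroid over the gate slices: with I_i the received intensity in slice i centered at range R_i,

\[
\hat{R} = \frac{\sum_i I_i\,R_i}{\sum_i I_i},
\]

    which makes plain why intensity variation from reflectivity, attenuation, or gate distortion biases the estimate, the weakness the proposed model targets.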

  17. GANViz: A Visual Analytics Approach to Understand the Adversarial Game.

    PubMed

    Wang, Junpeng; Gou, Liang; Yang, Hao; Shen, Han-Wei

    2018-06-01

    Generative models hold promise for learning data representations in an unsupervised fashion with deep learning. Generative Adversarial Nets (GAN) is one of the most popular frameworks in this arena. Despite the promising results from different types of GANs, in-depth understanding of the adversarial training process of the models remains a challenge to domain experts. The complexity and the potentially long training process of the models make them hard to evaluate, interpret, and optimize. In this work, guided by practical needs from domain experts, we design and develop a visual analytics system, GANViz, aiming to help experts understand the adversarial process of GANs in depth. Specifically, GANViz evaluates the model performance of two subnetworks of GANs, provides evidence and interpretations of the models' performance, and empowers comparative analysis with the evidence. Through our case studies with two real-world datasets, we demonstrate that GANViz can provide useful insight into helping domain experts understand, interpret, evaluate, and potentially improve GAN models.
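
    The adversarial game that GANViz visualizes is, in its original form (Goodfellow et al., 2014), the minimax objective

\[
\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\!\left[\log D(x)\right] + \mathbb{E}_{z \sim p_z}\!\left[\log\!\left(1 - D(G(z))\right)\right],
\]

    whose training dynamics, the shifting balance between generator G and discriminator D, are exactly what is hard to interpret without tool support.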

  18. Uncertainty quantification and experimental design based on unsupervised machine learning identification of contaminant sources and groundwater types using hydrogeochemical data

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.

    2017-12-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical species. Numerous geochemical constituents and processes may need to be simulated in these models, which further complicates the analyses. As a result, these types of model analyses are typically extremely challenging. Here, we demonstrate a new contaminant source identification approach that performs decomposition of the observed mixtures based on the Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. We also demonstrate how NMFk can be extended to perform uncertainty quantification and experimental design related to real-world site characterization. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios). The NMFk algorithm has been extensively tested on synthetic datasets; NMFk analyses have been actively performed on real-world data collected at the Los Alamos National Laboratory (LANL) groundwater sites related to Chromium and RDX contamination.
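
    The factorization step at the core of NMFk can be illustrated with a generic NMF, shown below in Python with scikit-learn on synthetic mixtures. This sketch covers only the scan over candidate source counts; NMFk itself additionally clusters solutions across random restarts and handles ratios and delta notations, which are omitted here.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic mixtures: 30 wells x 8 constituents from 3 hidden sources.
rng = np.random.default_rng(2)
sources = rng.uniform(0.0, 1.0, size=(3, 8))     # source signatures
ratios = rng.dirichlet(np.ones(3), size=30)      # mixing ratios per well
V = ratios @ sources

for k in range(1, 6):                            # scan candidate source counts
    model = NMF(n_components=k, init="nndsvda", max_iter=2000, random_state=0)
    W = model.fit_transform(V)                   # estimated mixing ratios
    H = model.components_                        # estimated source signatures
    err = np.linalg.norm(V - W @ H)
    print(k, round(err, 4))                      # error flattens at the true rank
```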

  19. Efficient multi-scenario Model Predictive Control for water resources management with ensemble streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Tian, Xin; Negenborn, Rudy R.; van Overloop, Peter-Jules; María Maestre, José; Sadowska, Anna; van de Giesen, Nick

    2017-11-01

    Model Predictive Control (MPC) is one of the most advanced real-time control techniques that has been widely applied to Water Resources Management (WRM). MPC can manage the water system in a holistic manner and has a flexible structure to incorporate specific elements, such as setpoints and constraints. Therefore, MPC has shown its versatile performance in many branches of WRM. Nonetheless, with the in-depth understanding of stochastic hydrology in recent studies, MPC also faces the challenge of how to cope with hydrological uncertainty in its decision-making process. A possible way to embed the uncertainty is to generate an Ensemble Forecast (EF) of hydrological variables, rather than a deterministic one. The combination of MPC and EF results in a more comprehensive approach: Multi-scenario MPC (MS-MPC). In this study, we will first assess the model performance of MS-MPC, considering an ensemble streamflow forecast. Notably, computational inefficiency may be a critical obstacle that hinders the applicability of MS-MPC. In fact, with more scenarios taken into account, the computational burden of solving an optimization problem in MS-MPC accordingly increases. To deal with this challenge, we propose the Adaptive Control Resolution (ACR) approach as a computationally efficient scheme to practically reduce the number of control variables in MS-MPC. In brief, the ACR approach uses a mixed-resolution control time step from the near future to the distant future. The ACR-MPC approach is tested on a real-world case study: an integrated flood control and navigation problem in the North Sea Canal of the Netherlands. Such an approach reduces computation time by 18% or more in our case study. At the same time, the model performance of ACR-MPC remains close to that of conventional MPC.
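
    In its simplest formulation (assumed here; implementations vary), MS-MPC carries one predicted state trajectory per streamflow scenario s with probability p_s, while the control sequence is shared across scenarios:

\[
\min_{u_0,\dots,u_{N-1}} \sum_{s=1}^{S} p_s \sum_{k=0}^{N-1} \ell\!\left(x_k^{s}, u_k\right)
\quad \text{s.t.} \quad x_{k+1}^{s} = f\!\left(x_k^{s}, u_k, w_k^{s}\right),
\]

    so the decision-vector length grows with the horizon N, which is exactly what the ACR approach attacks by coarsening the control time step in the distant future.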

  20. Digital Model-Based Engineering: Expectations, Prerequisites, and Challenges of Infusion

    NASA Technical Reports Server (NTRS)

    Hale, J. P.; Zimmerman, P.; Kukkala, G.; Guerrero, J.; Kobryn, P.; Puchek, B.; Bisconti, M.; Baldwin, C.; Mulpuri, M.

    2017-01-01

    Digital model-based engineering (DMbE) is the use of digital artifacts, digital environments, and digital tools in the performance of engineering functions. DMbE is intended to allow an organization to progress from documentation-based engineering methods to digital methods that may provide greater flexibility, agility, and efficiency. The term 'DMbE' was developed as part of an effort by the Model-Based Systems Engineering (MBSE) Infusion Task team to identify what government organizations might expect in the course of moving to or infusing MBSE into their organizations. The Task team was established by the Interagency Working Group on Engineering Complex Systems, an informal collaboration among government systems engineering organizations. This Technical Memorandum (TM) discusses the work of the MBSE Infusion Task team to date. The Task team identified prerequisites, expectations, initial challenges, and recommendations for areas of study to pursue, as well as examples of efforts already in progress. The team identified the following five expectations associated with DMbE infusion, discussed further in this TM: (1) Informed decision making through increased transparency and greater insight. (2) Enhanced communication. (3) Increased understanding for greater flexibility/adaptability in design. (4) Increased confidence that the capability will perform as expected. (5) Increased efficiency. The team identified the following seven challenges an organization might encounter when looking to infuse DMbE: (1) Assessing value added to the organization. Not all DMbE practices will be applicable to every situation in every organization, and not all implementations will have positive results. (2) Overcoming organizational and cultural hurdles. (3) Adopting contractual practices and technical data management. (4) Redefining configuration management. The DMbE environment changes the range of configuration information to be managed to include performance and design models, database objects, as well as more traditional book-form objects and formats. (5) Developing information technology (IT) infrastructure. Approaches to implementing critical, enabling IT infrastructure capabilities must be flexible, reconfigurable, and updatable. (6) Ensuring security of the single source of truth. (7) Potential overreliance on quantitative data over qualitative data. Executable/computational models and simulations generally incorporate and generate quantitative rather than qualitative data. The Task team also developed several recommendations for government, academia, and industry, as discussed in this TM. The Task team recommends continuing beyond this initial work to further develop the means of implementing DMbE and to look for opportunities to collaborate and share best practices.
