Sample records for tackle model complexity

  1. Toward Modeling the Intrinsic Complexity of Test Problems

    ERIC Educational Resources Information Center

    Shoufan, Abdulhadi

    2017-01-01

    The concept of intrinsic complexity explains why different problems of the same type, tackled by the same problem solver, can require different times to solve and yield solutions of different quality. This paper proposes a general four-step approach that can be used to establish a model for the intrinsic complexity of a problem class in terms of…

  2. Tackling antibiotic resistance in India.

    PubMed

    Wattal, Chand; Goel, Neeraj

    2014-12-01

    Infectious diseases are major causes of mortality in India. This is aggravated by the increasing prevalence of antimicrobial resistance (AMR) both in the community and in hospitals. Due to the emergence of resistance to all effective antibiotics in nosocomial pathogens, the situation calls for emergency measures to tackle AMR in India. India faces huge challenges in tackling AMR, ranging from the lack of mechanisms for surveillance of AMR and antimicrobial use, to the lack of effective hospital infection control policies, poor sanitation, and the non-human use of antimicrobials. The Ministry of Health and Family Welfare of the Government of India has taken initiatives to tackle AMR. Extensive guidelines have been drafted and a model worksheet has been developed as a roadmap to tackle AMR.

  3. Epidemic modeling in complex realities.

    PubMed

    Colizza, Vittoria; Barthélemy, Marc; Barrat, Alain; Vespignani, Alessandro

    2007-04-01

    In our global world, the increasing complexity of social relations and transport infrastructures is a key factor in the spread of epidemics. In recent years, the increasing availability of computer power has made it possible both to obtain reliable data quantifying the complexity of the networks on which epidemics may propagate and to envision computational tools able to tackle the analysis of such propagation phenomena. These advances have exposed the limits of homogeneous assumptions and simple spatial diffusion approaches, and have stimulated the inclusion of complex features and heterogeneities relevant to the description of epidemic diffusion. In this paper, we review recent progress in integrating complex systems and network analysis with epidemic modelling, and focus on the impact of the various complex features of real systems on the dynamics of epidemic spreading.
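
    To make the shift this record describes concrete, the sketch below runs a minimal discrete-time SIR epidemic on a heavy-tailed contact network, where hubs accelerate spreading in a way homogeneous-mixing models miss. It is an illustration only: networkx, the Barabasi-Albert topology and the beta/gamma values are assumptions, not the authors' models.

      import random
      import networkx as nx

      def sir_on_network(G, beta=0.05, gamma=0.1, seed_node=0, steps=200):
          """Discrete-time SIR where infection travels only along edges."""
          state = {n: "S" for n in G}   # S = susceptible, I = infected, R = recovered
          state[seed_node] = "I"
          infected_per_step = []
          for _ in range(steps):
              nxt = dict(state)
              for n in G:
                  if state[n] == "I":
                      for nb in G.neighbors(n):          # contact along an edge
                          if state[nb] == "S" and random.random() < beta:
                              nxt[nb] = "I"
                      if random.random() < gamma:        # recovery
                          nxt[n] = "R"
              state = nxt
              infected_per_step.append(sum(s == "I" for s in state.values()))
          return infected_per_step

      G = nx.barabasi_albert_graph(1000, 3)   # heavy-tailed degree distribution
      print(max(sir_on_network(G)))           # size of the epidemic peak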

  4. Engineering and control of biological systems: A new way to tackle complex diseases.

    PubMed

    Menolascina, Filippo; Siciliano, Velia; di Bernardo, Diego

    2012-07-16

    The ongoing convergence of engineering and biology has contributed to the emerging field of synthetic biology. The defining features of this new discipline are abstraction and standardisation of biological parts, decoupling between parts to prevent undesired cross-talk, and the application of quantitative modelling of synthetic genetic circuits in order to guide their design. Most of the efforts in the field of synthetic biology in the last decade have been devoted to the design and development of functional gene circuits in prokaryotes and unicellular eukaryotes. Researchers have used synthetic biology not only to engineer new functions in the cell, but also to build simpler models of endogenous gene regulatory networks to gain knowledge of the "rules" governing their wiring diagram. However, the need for innovative approaches to study and modify complex signalling and regulatory networks in mammalian cells and multicellular organisms has prompted advances in synthetic biology in these systems as well, thus contributing to the development of innovative ways to tackle human diseases. In this work, we review the latest progress in synthetic biology and the most significant developments achieved so far, both in unicellular and multicellular organisms, with emphasis on human health. Copyright © 2012 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  5. Tackle and impact detection in elite Australian football using wearable microsensor technology.

    PubMed

    Gastin, Paul B; McLean, Owen C; Breed, Ray V P; Spittle, Michael

    2014-01-01

    The effectiveness of a wearable microsensor device (MinimaxX™ S4, Catapult Innovations, Melbourne, VIC, Australia) to automatically detect tackles and impact events in elite Australian football (AF) was assessed during four matches. Video observation was used as the criterion measure. A total of 352 tackles were observed, with 78% correctly detected as tackles by the manufacturer's software. Tackles against (i.e. tackled by an opponent) were more accurately detected than tackles made (90% v 66%). Of the 77 tackles that were not detected at all, the majority (74%) were categorised as low-intensity. In contrast, a total of 1510 "tackle" events were detected, with only 18% of these verified as tackles. A further 57% were from contested ball situations involving player contact. The remaining 25% were in general play where no contact was evident; these were significantly lower in peak Player Load™ than those involving player contact (P < 0.01). The tackle detection algorithm, developed primarily for rugby, was not suitable for tackle detection in AF. The underlying sensor data may have the potential to detect a range of events within contact sports such as AF, yet to do so is a complex task and requires sophisticated sport- and event-specific algorithms.
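
    For intuition about what such a device works with, the sketch below is a deliberately naive impact detector over a resultant-acceleration trace. The threshold, units and refractory window are assumptions for illustration; the manufacturer's tackle algorithm is proprietary and, as this study shows, the real problem needs far more sophisticated sport- and event-specific logic.

      import math

      def detect_impacts(samples, g_threshold=6.0, refractory=10):
          """samples: list of (ax, ay, az) accelerations in g.
          Returns sample indices of candidate impact events."""
          events, cooldown = [], 0
          for i, (ax, ay, az) in enumerate(samples):
              if cooldown:                  # suppress double counts after a spike
                  cooldown -= 1
                  continue
              magnitude = math.sqrt(ax*ax + ay*ay + az*az)   # resultant acceleration
              if magnitude > g_threshold:
                  events.append(i)
                  cooldown = refractory
          return events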

  6. Tackling the challenges of matching biomedical ontologies.

    PubMed

    Faria, Daniel; Pesquita, Catia; Mott, Isabela; Martins, Catarina; Couto, Francisco M; Cruz, Isabel F

    2018-01-15

    Biomedical ontologies pose several challenges to ontology matching due both to the complexity of the biomedical domain and to the characteristics of the ontologies themselves. The biomedical tracks in the Ontology Matching Evaluation Initiative (OAEI) have spurred the development of matching systems able to tackle these challenges, and benchmarked their general performance. In this study, we dissect the strategies employed by matching systems to tackle the challenges of matching biomedical ontologies and gauge the impact of the challenges themselves on matching performance, using the AgreementMakerLight (AML) system as the platform for this study. We demonstrate that the linear complexity of the hash-based searching strategy implemented by most state-of-the-art ontology matching systems is essential for matching large biomedical ontologies efficiently. We show that accounting for all lexical annotations (e.g., labels and synonyms) in biomedical ontologies leads to a substantial improvement in F-measure over using only the primary name, and that accounting for the reliability of different types of annotations generally also leads to a marked improvement. Finally, we show that cross-references are a reliable source of information and that, when using biomedical ontologies as background knowledge, it is generally more reliable to use them as mediators than to perform lexical expansion. We anticipate that translating traditional matching algorithms to the hash-based searching paradigm will be a critical direction for the future development of the field. Improving the evaluation carried out in the biomedical tracks of the OAEI will also be important, as without proper reference alignments there is only so much that can be ascertained about matching systems or strategies. Nevertheless, it is clear that, to tackle the various challenges posed by biomedical ontologies, ontology matching systems must be able to efficiently combine multiple strategies into a mature matching
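
    The hash-based searching credited above with linear-time matching has a simple core: index every label and synonym of one ontology in a hash map, then probe it once per label of the other, so total work grows linearly rather than quadratically with ontology size. A sketch under assumed data structures (plain dicts of class id to label lists, lower-casing as the only normalisation), not AML's actual implementation:

      def build_index(ontology):
          """ontology: dict mapping class_id -> list of labels/synonyms."""
          index = {}
          for cid, labels in ontology.items():
              for label in labels:
                  index.setdefault(label.lower(), set()).add(cid)
          return index

      def lexical_match(source, target):
          index = build_index(target)            # one pass over the target
          mappings = set()
          for s_id, labels in source.items():    # one pass over the source
              for label in labels:
                  for t_id in index.get(label.lower(), ()):
                      mappings.add((s_id, t_id))
          return mappings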

  7. Factors influencing tackle injuries in rugby union football

    PubMed Central

    Garraway, W. M.; Lee, A. J.; Macleod, D. A.; Telfer, J. W.; Deary, I. J.; Murray, G. D.

    1999-01-01

    OBJECTIVES: To assess the influence of selected aspects of lifestyle, personality, and other player related factors on injuries in the tackle. To describe the detailed circumstances in which these tackles occurred. METHODS: A prospective case-control study was undertaken in which the tackling and tackled players ("the cases") involved in a tackle injury were each matched with "control" players who held the same respective playing positions in the opposing teams. A total of 964 rugby matches involving 71 senior clubs drawn from all districts of the Scottish Rugby Union (SRU) were observed by nominated linkmen who administered self report questionnaires to the players identified as cases and controls. Information on lifestyle habits, match preparation, training, and coaching experience was obtained. A validated battery of psychological tests assessed players' trait anger and responses to anger and hostility. The circumstances of the tackles in which injury occurred were recorded by experienced SRU coaching staff in interviews with involved players after the match. RESULTS: A total of 71 tackle injury episodes with correct matching of cases and controls were studied. The following player related factors did not contribute significantly to tackle injuries: alcohol consumption before the match, feeling "below par" through minor illness, the extent of match preparation, previous coaching, or practising tackling. Injured and non-injured players in the tackle did not differ in their disposition toward, or expression of, anger or hostility. Some 85% of tackling players who were injured were three-quarters, and 52% of injuries occurred when the tackle came in behind the tackled player or within his peripheral vision. Either the tackling or tackled player was sprinting or running in all of these injury episodes. One third of injuries occurred in differential speed tackles--that is, when one player was travelling much faster than the other at impact. The player with the lower

  8. Tackling in Youth Football.

    PubMed

    2015-11-01

    American football remains one of the most popular sports for young athletes. The injuries sustained during football, especially those to the head and neck, have been a topic of intense interest recently in both the public media and medical literature. The recognition of these injuries and the potential for long-term sequelae have led some physicians to call for a reduction in the number of contact practices, a postponement of tackling until a certain age, and even a ban on high school football. This statement reviews the literature regarding injuries in football, particularly those of the head and neck, the relationship between tackling and football-related injuries, and the potential effects of limiting or delaying tackling on injury risk. Copyright © 2015 by the American Academy of Pediatrics.

  9. Tackle technique and tackle-related injuries in high-level South African Rugby Union under-18 players: real-match video analysis.

    PubMed

    Burger, Nicholas; Lambert, Michael I; Viljoen, Wayne; Brown, James C; Readhead, Clint; Hendricks, Sharief

    2016-08-01

    The high injury rate associated with rugby union is primarily due to the tackle, and poor contact technique has been identified as a risk factor for injury. We aimed to determine whether tackle technique proficiency scores were different in injurious tackles versus tackles that did not result in injury, using real-match scenarios in high-level youth rugby union. Injury surveillance was conducted at the under-18 Craven Week tournaments (2011-2013). Tackle-related injury information was used to identify injury events in the match video footage, and non-injury events were identified for the injured player cohort. Injury and non-injury events were scored for technique proficiency; Cohen's effect sizes were calculated and the Student t test (p<0.05) was performed to compare injury versus non-injury scores. The overall mean score for front-on ball-carrier proficiency was 7.17±1.90 and 9.02±2.15 for injury and non-injury tackle events, respectively (effect size=moderate; p<0.05). The overall mean score for side/behind ball-carrier proficiency was 4.09±2.12 and 7.68±1.72 for injury and non-injury tackle events, respectively (effect size=large; p<0.01). The overall mean score for front-on tackler proficiency was 7.00±1.95 and 9.35±2.56 for injury and non-injury tackle events, respectively (effect size=moderate; p<0.05). The overall mean score for side/behind tackler proficiency was 5.47±1.60 and 8.14±1.75 for injury and non-injury tackle events, respectively (effect size=large; p<0.01). Higher overall mean and criterion-specific tackle-related technique scores were associated with a non-injury outcome. The ability to perform well during tackle events may decrease the risk of injury and may manifest in superior performance. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  10. Tackling Complex Emergency Response Solutions Evaluation Problems in Sustainable Development by Fuzzy Group Decision Making Approaches with Considering Decision Hesitancy and Prioritization among Assessing Criteria.

    PubMed

    Qi, Xiao-Wen; Zhang, Jun-Ling; Zhao, Shu-Ping; Liang, Chang-Yong

    2017-10-02

    In order to be prepared against potential balance-breaking risks affecting economic development, more and more countries have recognized emergency response solutions evaluation (ERSE) as an indispensable activity in their governance of sustainable development. Traditional multiple criteria group decision making (MCGDM) approaches to ERSE face the simultaneous challenges of decision hesitancy and prioritization relations among assessing criteria, owing to the complexity of practical ERSE problems. Therefore, aiming at the special type of ERSE problems that exhibit these two characteristics, we investigate effective MCGDM approaches by employing the interval-valued dual hesitant fuzzy set (IVDHFS) to comprehensively depict decision hesitancy. To exploit decision information embedded in prioritization relations among criteria, we first define a fuzzy entropy measure for IVDHFS so that its derivative decision models can avoid the potential information distortion of models based on classic IVDHFS distance measures with a subjective supplementing mechanism; further, based on the defined entropy measure, we develop two fundamental prioritized operators for IVDHFS by extending Yager's prioritized operators. Furthermore, building on the above methods, we construct two hesitant fuzzy MCGDM approaches to tackle complex scenarios with or without known weights for decision makers, respectively. Finally, case studies have been conducted to show the effectiveness and practicality of our proposed approaches.

  11. Tackling Complex Emergency Response Solutions Evaluation Problems in Sustainable Development by Fuzzy Group Decision Making Approaches with Considering Decision Hesitancy and Prioritization among Assessing Criteria

    PubMed Central

    Qi, Xiao-Wen; Zhang, Jun-Ling; Zhao, Shu-Ping; Liang, Chang-Yong

    2017-01-01

    In order to be prepared against potential balance-breaking risks affecting economic development, more and more countries have recognized emergency response solutions evaluation (ERSE) as an indispensable activity in their governance of sustainable development. Traditional multiple criteria group decision making (MCGDM) approaches to ERSE face the simultaneous challenges of decision hesitancy and prioritization relations among assessing criteria, owing to the complexity of practical ERSE problems. Therefore, aiming at the special type of ERSE problems that exhibit these two characteristics, we investigate effective MCGDM approaches by employing the interval-valued dual hesitant fuzzy set (IVDHFS) to comprehensively depict decision hesitancy. To exploit decision information embedded in prioritization relations among criteria, we first define a fuzzy entropy measure for IVDHFS so that its derivative decision models can avoid the potential information distortion of models based on classic IVDHFS distance measures with a subjective supplementing mechanism; further, based on the defined entropy measure, we develop two fundamental prioritized operators for IVDHFS by extending Yager's prioritized operators. Furthermore, building on the above methods, we construct two hesitant fuzzy MCGDM approaches to tackle complex scenarios with or without known weights for decision makers, respectively. Finally, case studies have been conducted to show the effectiveness and practicality of our proposed approaches. PMID:28974045
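
    The prioritized aggregation these two records extend can be illustrated with Yager's prioritized average over crisp scores in [0, 1], where a poor score on a high-priority criterion automatically shrinks the weight of every criterion below it. The sketch omits the paper's actual contribution, the interval-valued dual hesitant fuzzy extension:

      def prioritized_average(scores):
          """scores: criterion satisfactions in [0, 1], in descending priority."""
          t, ts = 1.0, []
          for s in scores:
              ts.append(t)      # T_1 = 1; T_j = product of higher-priority scores
              t *= s
          total = sum(ts)
          weights = [x / total for x in ts]
          return sum(w * s for w, s in zip(weights, scores))

      print(prioritized_average([0.9, 0.8, 0.7]))   # ~0.81
      print(prioritized_average([0.2, 0.8, 0.7]))   # ~0.35: top-priority failure dominates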

  12. 46 CFR 184.300 - Ground tackle and mooring lines.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 184.300 Ground tackle and mooring lines. A vessel must be fitted with ground tackle and mooring lines necessary for the vessel to be safely anchored or moored. The ground tackle and mooring lines provided must be...

  13. 46 CFR 184.300 - Ground tackle and mooring lines.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    § 184.300 Ground tackle and mooring lines. A vessel must be fitted with ground tackle and mooring lines necessary for the vessel to be safely anchored or moored. The ground tackle and mooring lines provided must be...

  14. Tackle mechanisms and match characteristics in women's elite football tournaments.

    PubMed

    Tscholl, P; O'Riordan, D; Fuller, C W; Dvorak, J; Junge, A

    2007-08-01

    Several tools have been used for assessing risk situations and for gathering tackle information from international football matches for men but not for women. To analyse activities in women's football and to identify the characteristics and risk potentials of tackles. Retrospective video analysis. Video recordings of 24 representative matches from six women's top-level tournaments were analysed for tackle parameters and their risk potential. 3531 tackles were recorded. Tackles in which the tackling player came from the side and stayed on her feet accounted for nearly half of all challenges for the ball in which body contact occurred. 2.7% of all tackles were classified as risk situations, with sliding-in tackles from behind and the side having the highest risk potential. Match referees sanctioned sliding-in tackles more often than other tackles (20% v 17%). Tackle parameters did not change over the duration of a match; however, there was an increase in the number of injury risk situations and foul plays towards the end of each half. Match properties provide valuable information for a better understanding of injury situations in football. Staying-on-feet and jumping-vertically tackle actions leading to injury were sanctioned significantly more often by the referee than those not leading to injury (p<0.001), but no such difference was seen for sliding-in tackles (previously reported to have the highest injury potential in women's football). Therefore, either the laws of the game are not adequate or match referees in women's football are not able to distinguish between sliding-in tackles leading to and those not leading to injury.

  15. Non-sanctioning of illegal tackles in South African youth community rugby.

    PubMed

    Brown, J C; Boucher, S J; Lambert, M; Viljoen, W; Readhead, C; Hendricks, S; Kraak, W J

    2018-06-01

    The tackle event in rugby union ('rugby') contributes to the majority of players' injuries. Referees can reduce this risk by sanctioning dangerous tackles. A study in elite adult rugby suggests that referees only sanction a minority of illegal tackles. The aim of this study was to assess whether this finding was similar in youth community rugby. Observational study. Using EncodePro, 99 South African Rugby Union U18 Youth Week tournament matches were coded between 2011 and 2015. All tackles were coded by a researcher and an international referee to ensure that laws were interpreted correctly. The inter- and intra-rater reliabilities were 0.97-1.00. A regression analysis compared the non-sanctioned rates over time. In total, 12 216 tackles were coded, of which less than 1% (n=113) were 'illegal'. The majority of the 113 illegal tackles were front-on (75%), high tackles (72%) and occurred in the 2nd/4th quarters (29% each). Of the illegal tackles, only 59% were sanctioned. The proportions of illegal tackles, and of sanctioned illegal tackles, relative to all tackles improved by 0.2% per year from 2011 to 2015 (p<0.05). In these youth community rugby players, 41% of illegal tackles were not sanctioned appropriately. This was better than a previous study in elite adult rugby, where only 7% of illegal tackles were penalised. Moreover, the rates of illegal tackles and non-sanctioned illegal tackles both improved over time. However, it is critical that referees consistently enforce all laws to enhance injury prevention efforts. Further studies should investigate the reasons for non-sanctioning. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  16. 46 CFR 121.300 - Ground tackle and mooring lines.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Miscellaneous Systems and Equipment; Mooring and Towing Equipment. § 121.300 Ground tackle and mooring lines. A vessel must be fitted with ground tackle and mooring lines necessary for the vessel to be safely anchored...

  17. 46 CFR 121.300 - Ground tackle and mooring lines.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Miscellaneous Systems and Equipment; Mooring and Towing Equipment. § 121.300 Ground tackle and mooring lines. A vessel must be fitted with ground tackle and mooring lines necessary for the vessel to be safely anchored...

  18. Modeling the Propagation of Mobile Phone Virus under Complex Network

    PubMed Central

    Yang, Wei; Wei, Xi-liang; Guo, Hao; An, Gang; Guo, Lei

    2014-01-01

    A mobile phone virus is a rogue program written to propagate from one phone to another, which can take control of a mobile device by exploiting its vulnerabilities. In this paper the propagation of mobile phone viruses is modeled to understand how particular factors affect propagation, and to design effective containment strategies for suppressing mobile phone viruses. Two different propagation models of mobile phone viruses under a complex network are proposed in this paper. One is intended to describe the propagation of user-tricking viruses, and the other to describe the propagation of vulnerability-exploiting viruses. Based on traditional epidemic models, the characteristics of mobile phone viruses and the network topology structure are incorporated into our models. A detailed analysis is conducted on the propagation models. Through this analysis, the stable infection-free equilibrium point and the stability condition are derived. Finally, considering the network topology, numerical and simulation experiments are carried out. Results indicate that both models are correct and suitable for describing the spread of the two different mobile phone viruses, respectively. PMID:25133209

  19. Addressing Complex Challenges through Adaptive Leadership: A Promising Approach to Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Nelson, Tenneisha; Squires, Vicki

    2017-01-01

    Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…

  20. Momentum and Kinetic Energy Before the Tackle in Rugby Union

    PubMed Central

    Hendricks, Sharief; Karpul, David; Lambert, Mike

    2014-01-01

    Understanding the physical demands of a tackle in match situations is important for safe and effective training, equipment development and research. Physical components such as momentum and kinetic energy, and their relationship to tackle outcome, are not known. The aim of this study was to compare momenta between ball-carrier and tackler, across levels of play (elite, university and junior) and positions (forwards vs. backs), and to describe the relationship between ball-carrier and tackler mass, velocity and momentum and the tackle outcome. The study also reports on the ball-carrier and tackler kinetic energy before contact and the estimated magnitude of impact (energy distributed between ball-carrier and tackler upon contact). Velocity over the 0.5 seconds before contact was determined using a 2-dimensional scaled version of the field generated by a computer algorithm. Body masses of players were obtained from their player profiles. Momentum and kinetic energy were subsequently calculated for 60 tackle events. Ball-carriers were heavier than the tacklers (ball-carrier 100 ± 14 kg vs. tackler 93 ± 11 kg, d = 0.52, p = 0.0041, n = 60). Ball-carriers as forwards had a significantly higher momentum than backs (forwards 563 ± 226 kg.m.s-1, n = 31 vs. backs 438 ± 135 kg.m.s-1, d = 0.63, p = 0.0012, n = 29). Tacklers dominated 57% of tackles and ball-carriers dominated 43% of tackles. Despite the ball-carrier having a mass advantage before contact more frequently than the tackler, momentum advantage and tackle dominance between the ball-carrier and tackler were proportionally similar. These findings may reflect a characteristic of the modern game of rugby where efficiently heavier players (particularly forwards) are tactically predetermined to carry the ball in contact. Key Points: First study to quantify momentum, kinetic energy, and magnitude of impact in rugby tackles across different levels in matches without a device attached to a player. Physical components alone, of either ball-carrier or

  1. Momentum and kinetic energy before the tackle in rugby union.

    PubMed

    Hendricks, Sharief; Karpul, David; Lambert, Mike

    2014-09-01

    Understanding the physical demands of a tackle in match situations is important for safe and effective training, equipment development and research. Physical components such as momentum and kinetic energy, and their relationship to tackle outcome, are not known. The aim of this study was to compare momenta between ball-carrier and tackler, across levels of play (elite, university and junior) and positions (forwards vs. backs), and to describe the relationship between ball-carrier and tackler mass, velocity and momentum and the tackle outcome. The study also reports on the ball-carrier and tackler kinetic energy before contact and the estimated magnitude of impact (energy distributed between ball-carrier and tackler upon contact). Velocity over the 0.5 seconds before contact was determined using a 2-dimensional scaled version of the field generated by a computer algorithm. Body masses of players were obtained from their player profiles. Momentum and kinetic energy were subsequently calculated for 60 tackle events. Ball-carriers were heavier than the tacklers (ball-carrier 100 ± 14 kg vs. tackler 93 ± 11 kg, d = 0.52, p = 0.0041, n = 60). Ball-carriers as forwards had a significantly higher momentum than backs (forwards 563 ± 226 kg.m.s-1, n = 31 vs. backs 438 ± 135 kg.m.s-1, d = 0.63, p = 0.0012, n = 29). Tacklers dominated 57% of tackles and ball-carriers dominated 43% of tackles. Despite the ball-carrier having a mass advantage before contact more frequently than the tackler, momentum advantage and tackle dominance between the ball-carrier and tackler were proportionally similar. These findings may reflect a characteristic of the modern game of rugby where efficiently heavier players (particularly forwards) are tactically predetermined to carry the ball in contact. Key Points: First study to quantify momentum, kinetic energy, and magnitude of impact in rugby tackles across different levels in matches without a device attached to a player. Physical components alone, of either ball
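
    The central quantities in this pair of records are elementary mechanics: momentum p = m*v and kinetic energy E = 0.5*m*v^2 before contact. A worked example with masses and velocities chosen near the reported means (illustrative numbers, not values from the study's dataset):

      def momentum(mass_kg, velocity_ms):
          return mass_kg * velocity_ms             # kg.m/s

      def kinetic_energy(mass_kg, velocity_ms):
          return 0.5 * mass_kg * velocity_ms**2    # joules

      ball_carrier = momentum(100, 5.6)   # 560 kg.m/s, near the forwards' mean
      tackler = momentum(93, 4.5)         # 418.5 kg.m/s
      print(ball_carrier - tackler)       # momentum advantage before contact
      print(kinetic_energy(100, 5.6))     # 1568 J carried into contact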

  2. Tackle characteristics and injury in a cross section of rugby union football.

    PubMed

    McIntosh, Andrew S; Savage, Trevor N; McCrory, Paul; Fréchède, Bertrand O; Wolfe, Rory

    2010-05-01

    The tackle is the game event in rugby union most associated with injury. This study's main aims were to measure tackle characteristics from video using a qualitative protocol, to assess whether the characteristics differed by level of play, and to measure the associations between tackle characteristics and injury. A cohort study was undertaken. The cohort comprised male rugby players at the following levels: younger than 15 yr, younger than 18 yr, and younger than 20 yr; grade; and elite (Super 12 and Wallabies). All tackle events and technique characteristics were coded in 77 game halves using a standardized qualitative protocol. Game injuries and missed-game injuries were identified and correlated with tackle events. A total of 6618 tackle events, including 81 resulting in a game injury, were observed and coded in the 77 game halves fully analyzed (145 tackle events per hour). An increase in the proportion of active shoulder tackles was observed from younger than 15 yr (13%) to elite (31%). Younger players engaged in more passive tackles and tended to stay on their feet more than experienced players. Younger than 15 yr rugby players had a significantly lower risk of tackle game injury compared with elite players. No specific tackle technique was observed to be associated with a significantly increased risk of game injury. There was a greater risk of game injury associated with two or more tacklers involved in the tackle event, and the greatest risk was associated with simultaneous contact by tacklers, after adjusting for level of play. Tackle characteristics differed between levels of play. The number of tacklers and the sequence of tackler contact with the ball carrier require consideration from an injury prevention perspective.

  3. Decision Making Under Uncertainty and Complexity: A Model-Based Scenario Approach to Supporting Integrated Water Resources Management

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.

    2007-12-01

    Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.

  4. Wind Turbine Failures - Tackling current Problems in Failure Data Analysis

    NASA Astrophysics Data System (ADS)

    Reder, M. D.; Gonzalez, E.; Melero, J. J.

    2016-09-01

    The wind industry has been growing significantly over the past decades, resulting in a remarkable increase in installed wind power capacity. Turbine technologies are rapidly evolving in terms of complexity and size, and there is an urgent need for cost-effective operation and maintenance (O&M) strategies. Unplanned downtime in particular represents one of the main cost drivers of a modern wind farm. Here, reliability and failure prediction models can enable operators to apply preventive O&M strategies rather than corrective actions. In order to develop these models, the failure rates and downtimes of wind turbine (WT) components have to be understood profoundly. This paper is focused on tackling three of the main issues related to WT failure analyses: the non-uniform treatment of data, the scarcity of available failure analyses, and the lack of investigation into alternative data sources. For this, a modernised form of an existing WT taxonomy is introduced. Additionally, an extensive analysis of historical failure and downtime data of more than 4300 turbines is presented. Finally, the possibility of countering the lack of available failure data by complementing historical databases with Supervisory Control and Data Acquisition (SCADA) alarms is evaluated.

  5. Technical determinants of tackle and ruck performance in International rugby union.

    PubMed

    Hendricks, Sharief; van Niekerk, Tiffany; Sin, Drew Wade; Lambert, Mike; den Hollander, Steve; Brown, James; Maree, Willie; Treu, Paul; Till, Kevin; Jones, Ben

    2018-03-01

    The most frequently occurring contact events in rugby union are the tackle and ruck. The ability to repeatedly engage and win the tackle and ruck has been associated with team success. To win the tackle and ruck, players have to perform specific techniques. These techniques have not been studied at the highest level of rugby union. Therefore, the purpose of this study was to identify technical determinants of tackle and ruck performance at the highest level of rugby union. A total of 4479 tackle and 2914 ruck events were coded for the Six Nations and Championship competitions. The relative risk ratio (RR), the ratio of the probability of an outcome occurring when a characteristic was observed (versus when it was not observed), was determined using multinomial logistic regression. Executing front-on tackles reduced the likelihood of offloads and tackle breaks in both competitions (Six Nations: RR 3.0 vs. behind tackle, 95% confidence interval [95% CI] 1.9-4.6, effect size [ES] large, P < 0.001; Championship: RR 2.9 vs. jersey tackle, 95% CI 1.3-6.4, ES moderate, P = 0.01). Fending during contact increased the chances of offloading and breaking the tackle in both competitions (Six Nations: RR 4.5, strong fend, 95% CI 2.2-9.2, ES large, P < 0.001; Championship: RR 5.1, moderate fend, 95% CI 3.5-7.4, ES large, P < 0.001). For the ruck, actively placing the ball increased the probability of maintaining possession (Six Nations: RR 2.2, 95% CI 1.1-4.3, ES moderate, P = 0.03; Championship: RR 4.0, 95% CI 1.3-11.8, ES large, P = 0.01). The techniques identified in this study should be incorporated and emphasised during training to prepare players for competition. Furthermore, these techniques need to be added to coaching manuals for the tackle and ruck.

  6. Collapsed scrums and collision tackles: what is the injury risk?

    PubMed

    Roberts, Simon P; Trewartha, Grant; England, Mike; Stokes, Keith A

    2015-04-01

    To establish the propensity for specific contact events to cause injury in rugby union. Medical staff at participating English community-level rugby clubs reported any injury resulting in the absence for one match or more from the day of the injury during the 2009/2010 (n=46), 2010/2011 (n=67) and 2011/2012 (n=76) seasons. Injury severity was defined as the number of matches missed. Thirty community rugby matches were filmed and the number of contact events (tackles, collision tackles, rucks, mauls, lineouts and scrums) recorded. Of 370 (95% CI 364 to 378) contact events per match, 141 (137 to 145) were tackles, 115 (111 to 119) were rucks and 32 (30 to 33) were scrums. Tackles resulted in the greatest propensity for injury (2.3 (2.2 to 2.4) injuries/1000 events) and the greatest severity (16 (15 to 17) weeks missed/1000 events). Collision tackles (illegal tackles involving a shoulder charge) had a propensity for injury of 15 (12.4 to 18.3) injuries/1000 events and severity was 92 (75 to 112) weeks missed/1000 events, both of which were higher than any other event. Additional scrum analysis showed that only 5% of all scrums collapsed, but the propensity for injury was four times higher (2.9 (1.5 to 5.4) injuries/1000 events) and the severity was six times greater (22 (12 to 42) weeks missed/1000 events) than for non-collapsed scrums. Injury prevention in the tackle should focus on technique with strict enforcement of existing laws for illegal collision tackles. The scrum is a relatively controllable event and further attempts should be made to reduce the frequency of scrum collapse. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
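
    The propensity and severity figures in this record are plain rates per 1000 contact events; a toy computation with invented counts (not the study's raw data):

      def per_1000(numerator, events):
          return 1000 * numerator / events

      # e.g. 97 injuries across 42300 observed tackles:
      print(per_1000(97, 42300))    # ~2.3 injuries/1000 tackles
      print(per_1000(677, 42300))   # ~16 weeks missed/1000 tackles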

  7. [Family Health Strategies to tackle violence involving adolescents].

    PubMed

    Vieira Netto, Moysés Francisco; Deslandes, Suely Ferreira

    2016-05-01

    The Family Health Strategy (FHS) has an acknowledged potential for the promotion of health and the prevention of violence. This integrative bibliographic review of dissertations and theses on healthcare published from 1994 to 2014 aimed to evaluate the performance of FHS professionals in tackling and preventing violence involving adolescents. The collection of 17 dissertations and 2 doctoral theses reveals that these studies are recent. FHS professionals acknowledge the vulnerability of adolescents to both inflicting and being subjected to violence; however, the FHS proves ineffective in tackling and preventing such violence. The predominance of the medical technical care model, deficiencies in public health education in professional training, and the lack of institutional support are seen as the main obstacles. Many of these professionals are unaware of the forms for notification of violence. Family violence and the presence of criminal groups were the aspects most mentioned in the territories. The social representation of adolescents as "problematic" and the lack of FHS actions promoting youth leadership and empowerment were clearly detected.

  8. Severe and Catastrophic Neck Injuries Resulting from Tackle Football

    ERIC Educational Resources Information Center

    Torg, Joseph S.; And Others

    1977-01-01

    Use of the spring-loaded blocking and tackling devices should be discontinued due to severe neck injuries resulting from their use; employment of the head and helmet as the primary assault weapon in blocking, tackling, and head butting should be condemned for the same reason. (MJB)

  9. 46 CFR 121.300 - Ground tackle and mooring lines.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Shipping; Coast Guard, Department of Homeland Security (continued); Small Passenger Vessels Carrying More...; Miscellaneous Systems and Equipment; Mooring and Towing Equipment. § 121.300 Ground tackle and mooring lines. A...

  10. Tackling racism in the NHS.

    PubMed

    Dean, Erin

    2016-11-30

    Essential facts Trade union Unite has developed a policy briefing on a new toolkit to combat racism in the NHS. It can help nurses and other staff tackle racial discrimination in health, with black and minority ethnic (BME) nurses often treated unequally compared with their white colleagues.

  11. An investigation of shoulder forces in active shoulder tackles in rugby union football.

    PubMed

    Usman, Juliana; McIntosh, Andrew S; Fréchède, Bertrand

    2011-11-01

    In rugby union football the tackle is the most frequently executed skill and the one most associated with injury, including shoulder injury to the tackler. Despite the importance of the tackle, little is known about the magnitude of shoulder forces in the tackle and the factors influencing them. The objectives of the study were to measure the shoulder force in the tackle, as well as the effects of shoulder padding, skill level, side of body, player size, and experimental setting on shoulder force. Experiments were conducted in laboratory and field settings using a repeated measures design. Thirty-five participants were recruited to the laboratory and 98 to the field setting. All were males aged over 18 years with rugby experience. The maximum force applied to the shoulder in an active shoulder tackle was measured with a custom-built forceplate incorporated into a 45 kg tackle bag. The overall average maximum shoulder force was 1660 N in the laboratory and 1997 N in the field; this difference was significant. The shoulder force for tackling without shoulder pads was 1684 N compared to 1635 N with shoulder pads. There was no difference between the shoulder forces on the dominant and non-dominant sides. Shoulder force reduced with tackle repetition. No relationship was observed between shoulder force and player skill level or size. A substantial force can be applied to the shoulder and to an opponent in the tackle. This force is within the shoulder's injury tolerance range and is unaffected by shoulder pads. Copyright © 2011 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  12. European Teacher Education: A Fractal Perspective Tackling Complexity

    ERIC Educational Resources Information Center

    Caena, Francesa; Margiotta, Umberto

    2010-01-01

    This article takes stock of the complex scenario of the European education space in its past, present and future developments, which highlights the priorities of the modernisation, improvement and convergence of the goals for education and training systems in the knowledge and learning society. The critical case of teacher education is then…

  13. Does player time-in-game affect tackle technique in elite level rugby union?

    PubMed

    Tierney, Gregory J; Denvir, Karl; Farrell, Garreth; Simms, Ciaran K

    2018-02-01

    It has been hypothesised that fatigue may be a major factor in tackle-related injury risk in rugby union and hence that more injuries occur in the later stages of a game. The aim of this study is to identify changes in ball carrier or tackler proficiency characteristics, using elite level match video data, as player time-in-game increases. Qualitative observational cohort study. Three 2014/15 European Rugby Champions Cup games were selected for ball carrier and tackler proficiency analysis. Analysis was only conducted on players who started and remained on the field for the entire game. A separate analysis was conducted on 10 randomly selected 2014/15 European Rugby Champions Cup/Pro 12 games to assess the time distribution of tackles throughout a game. A Chi-square test and a one-way ANOVA with post-hoc testing were conducted to identify significant differences (p<0.05) in proficiency characteristics and tackle counts between quarters of the game, respectively. Player time-in-game did not affect tackle proficiency for either the ball carrier or the tackler. Any results that showed statistical significance did not indicate a trend of deterioration in proficiency with increased player time-in-game. The analysis of the time distribution of tackles indicated that more tackles occurred in the final quarter of the game than in the first (p=0.04) and second (p<0.01). It appears that player time-in-game does not affect tackler or ball carrier tackle technique proficiency at the elite level. The greater number of tackles occurring in the final quarter of a game provides an alternative explanation for more tackle-related injuries occurring at this stage. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  14. TSCA Section 21 Petition Requesting EPA to Regulate Lead in Fishing Tackle

    EPA Pesticide Factsheets

    This petition requests EPA to promulgate regulations under section 6 of TSCA to protect the environment from fishing tackle containing lead, including fishing weights, sinkers, lures, jigs, and/or other tackle.

  15. Baltimore District Tackles High Suspension Rates

    ERIC Educational Resources Information Center

    Maxwell, Lesli A.

    2007-01-01

    This article reports on how the Baltimore District tackles its high suspension rates. Driven by an increasing belief that zero-tolerance disciplinary policies are ineffective, more educators are embracing strategies that do not exclude misbehaving students from school for offenses such as insubordination, disrespect, cutting class, tardiness, and…

  16. The Effects of Verbal Instruction and Shaping to Improve Tackling by High School Football Players

    ERIC Educational Resources Information Center

    Harrison, Antonio M.; Pyles, David A.

    2013-01-01

    We evaluated verbal instruction and shaping using TAG (teaching with acoustical guidance) to improve tackling by 3 high school football players. Verbal instruction and shaping improved tackling for all 3 participants. In addition, performance was maintained as participants moved more quickly through the tackling procedure.

  17. Periodic reference tracking control approach for smart material actuators with complex hysteretic characteristics

    NASA Astrophysics Data System (ADS)

    Sun, Zhiyong; Hao, Lina; Song, Bo; Yang, Ruiguo; Cao, Ruimin; Cheng, Yu

    2016-10-01

    Micro/nano positioning technologies have been attractive for decades for their various applications in both industrial and scientific fields. The actuators employed in these technologies are typically smart material actuators, which possess inherent hysteresis that may cause systems to behave unexpectedly. Periodic reference tracking capability is fundamental for apparatuses such as the scanning probe microscope, which employs smart material actuators to generate periodic scanning motion. However, traditional controllers such as PID cannot guarantee accurate, fast periodic scanning motion. To tackle this problem and to allow practical implementation in digital devices, this paper proposes a novel control method named the discrete extended unparallel Prandtl-Ishlinskii model based internal model (d-EUPI-IM) control approach. To tackle modeling uncertainties, the robust d-EUPI-IM control approach is investigated, and the associated sufficient stabilizing conditions are derived. The advantages of the proposed controller are: it is designed and represented in discrete form, and is thus practical for implementation in digital devices; the extended unparallel Prandtl-Ishlinskii model can precisely represent forward/inverse complex hysteretic characteristics, reducing modeling uncertainties and benefiting controller design; in addition, the control module based on the internal model principle can be utilized as a natural oscillator for tackling the periodic reference tracking problem. The proposed controller was verified through comparative experiments on a piezoelectric actuator platform, and convincing results have been achieved.
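
    The hysteresis model family named here can be sketched in its classical form: a Prandtl-Ishlinskii model is a weighted sum of play (backlash) operators with different thresholds. The thresholds and weights below are illustrative, and the paper's extended unparallel variant (EUPI) generalises this basic construction rather than matching it:

      def play_operator(x_series, r, y0=0.0):
          """Discrete play (backlash) operator with half-width r."""
          y, out = y0, []
          for x in x_series:
              y = max(x - r, min(x + r, y))
              out.append(y)
          return out

      def prandtl_ishlinskii(x_series, thresholds, weights):
          plays = [play_operator(x_series, r) for r in thresholds]
          return [sum(w * p[k] for w, p in zip(weights, plays))
                  for k in range(len(x_series))]

      # A triangular input sweep traces a hysteresis loop in the output:
      xs = [i / 50 for i in range(51)] + [1 - i / 50 for i in range(1, 51)]
      ys = prandtl_ishlinskii(xs, thresholds=[0.0, 0.1, 0.2], weights=[0.5, 0.3, 0.2])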

  18. Specific tackling situations affect the biomechanical demands experienced by rugby union players.

    PubMed

    Seminati, Elena; Cazzola, Dario; Preatoni, Ezio; Trewartha, Grant

    2017-03-01

    Tackling in Rugby Union is an open skill which can involve high-speed collisions and is the match event associated with the greatest proportion of injuries. This study aimed to analyse the biomechanics of rugby tackling under three conditions: from a stationary position, with dominant and non-dominant shoulder, and moving forward, with dominant shoulder. A specially devised contact simulator, a 50-kg punch bag instrumented with pressure sensors, was translated towards the tackler (n = 15) to evaluate the effect of laterality and tackling approach on the external loads absorbed by the tackler, on head and trunk motion, and on trunk muscle activities. Peak impact force was substantially higher in the stationary dominant (2.84 ± 0.74 kN) than in the stationary non-dominant condition (2.44 ± 0.64 kN), but lower than in the moving condition (3.40 ± 0.86 kN). Muscle activation started on average 300 ms before impact, with higher activation for impact-side trapezius and non-impact-side erector spinae and gluteus maximus muscles. Players' technique for non-dominant-side tackles was less compliant with current coaching recommendations in terms of cervical motion (more neck flexion and lateral bending in the stationary non-dominant condition) and players could benefit from specific coaching focus on non-dominant-side tackles.

  19. Evaluating behavioral skills training to teach safe tackling skills to youth football players.

    PubMed

    Tai, Sharayah S M; Miltenberger, Raymond G

    2017-10-01

    With concussion rates on the rise for football players, there is a need for further research to increase skills and decrease injuries. Behavioral skills training is effective in teaching a wide variety of skills but has yet to be studied in the sports setting. We evaluated behavioral skills training to teach safer tackling techniques to six participants from a Pop Warner football team. Safer tackling techniques increased during practice and generalized to games for the two participants who had opportunities to tackle in games. © 2017 Society for the Experimental Analysis of Behavior.

  20. Tackling Noncommunicable Diseases in Africa: Caveat Lector

    ERIC Educational Resources Information Center

    Mensah, George A.

    2016-01-01

    Noncommunicable disease (NCD), principally cardiovascular diseases, cancer, chronic lung disease, and diabetes, constitutes the major cause of death worldwide. Evidence of a continuing increase in the global burden of these diseases has generated recent urgent calls for global action to tackle and reduce related death and disability. Because the…

  1. Tackling some of the most intricate geophysical challenges via high-performance computing

    NASA Astrophysics Data System (ADS)

    Khosronejad, A.

    2016-12-01

    Recently, the world has been witnessing significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free surface of waterways and water surface waves in the coastal area. This understanding is especially important because turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and are characterized by multi-lateral interactions among various phases (e.g. air/water/sediment phases). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I will present the simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Parts of the simulations I present were performed to gain scientific insights into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others were carried out to predict the effects of climate change and large flood events on societal infrastructure (A. Khosronejad et al. (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi:10.1002/2014JF003423).

  2. A Legal Approach to Tackling Contract Cheating?

    ERIC Educational Resources Information Center

    Draper, Michael J.; Newton, Philip M.

    2017-01-01

    The phenomenon of contract cheating presents, potentially, a serious threat to the quality and standards of Higher Education around the world. There have been suggestions, cited below, to tackle the problem using legal means, but we find that current laws are not fit for this purpose. In this article we present a proposal for a specific new law to…

  3. Using VCL as an Aspect-Oriented Approach to Requirements Modelling

    NASA Astrophysics Data System (ADS)

    Amálio, Nuno; Kelsen, Pierre; Ma, Qin; Glodt, Christian

    Software systems are becoming larger and more complex. By tackling the modularisation of crosscutting concerns, aspect orientation draws attention to modularity as a means to address the problems of scalability, complexity and evolution in software systems development. Aspect-oriented modelling (AOM) applies aspect-orientation to the construction of models. Most existing AOM approaches are designed without a formal semantics, and use multi-view partial descriptions of behaviour. This paper presents an AOM approach based on the Visual Contract Language (VCL): a visual language for abstract and precise modelling, designed with a formal semantics, and comprising a novel approach to visual behavioural modelling based on design by contract where behavioural descriptions are total. By applying VCL to a large case study of a car-crash crisis management system, the paper demonstrates how the modularity of VCL's constructs, at different levels of granularity, helps to tackle complexity. In particular, it shows how VCL's package construct and its associated composition mechanisms are key in supporting separation of concerns, coarse-grained problem decomposition and aspect-orientation. The case study's modelling solution has a clear and well-defined modular structure; the backbone of this structure is a collection of packages encapsulating local solutions to concerns.

  4. Probing the Topological Properties of Complex Networks Modeling Short Written Texts

    PubMed Central

    Amancio, Diego R.

    2015-01-01

    In recent years, graph theory has been widely employed to probe several language properties. More specifically, the so-called word adjacency model has been proven useful for tackling several practical problems, especially those relying on textual stylistic analysis. The most common approach to treat texts as networks has simply considered either large pieces of texts or entire books. This approach has certainly worked well—many informative discoveries have been made this way—but it raises an uncomfortable question: could there be important topological patterns in small pieces of texts? To address this problem, the topological properties of subtexts sampled from entire books were probed. Statistical analyses performed on a dataset comprising 50 novels revealed that most of the traditional topological measurements are stable for short subtexts. When the performance of the authorship recognition task was analyzed, it was found that a proper sampling yields a discriminability similar to the one found with full texts. Surprisingly, the support vector machine classification based on the characterization of short texts outperformed the one performed with entire books. These findings suggest that a local topological analysis of large documents might improve its global characterization. Most importantly, it was verified, as a proof of principle, that short texts can be analyzed with the methods and concepts of complex networks. As a consequence, the techniques described here can be extended in a straightforward fashion to analyze texts as time-varying complex networks. PMID:25719799
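
    The word adjacency model named in this record has a very small core: nodes are words and edges connect words that appear next to each other in the text. A minimal sketch (the crude tokenisation and the choice of clustering as the example measurement are simplifications):

      import networkx as nx

      def word_adjacency_graph(text):
          tokens = [t.strip(".,;:!?").lower() for t in text.split()]
          G = nx.Graph()
          for a, b in zip(tokens, tokens[1:]):   # edge between adjacent words
              if a and b:
                  G.add_edge(a, b)
          return G

      G = word_adjacency_graph("the cat sat on the mat and the cat slept")
      print(G.number_of_nodes(), G.number_of_edges())
      print(nx.average_clustering(G))   # one traditional topological measurement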

  5. On the Way to Appropriate Model Complexity

    NASA Astrophysics Data System (ADS)

    Höge, M.

    2016-12-01

    When statistical models are used to represent natural phenomena they are often too simple or too complex - this is known. But what exactly is model complexity? Among many other definitions, the complexity of a model can be conceptualized as a measure of statistical dependence between observations and parameters (Van der Linde, 2014). However, several issues remain when working with model complexity: A unique definition for model complexity is missing. Assuming a definition is accepted, how can model complexity be quantified? How can we use a quantified complexity to the better of modeling? Generally defined, "complexity is a measure of the information needed to specify the relationships between the elements of organized systems" (Bawden & Robinson, 2015). The complexity of a system changes as the knowledge about the system changes. For models this means that complexity is not a static concept: With more data or higher spatio-temporal resolution of parameters, the complexity of a model changes. There are essentially three categories into which all commonly used complexity measures can be classified: (1) An explicit representation of model complexity as "Degrees of freedom" of a model, e.g. effective number of parameters. (2) Model complexity as code length, a.k.a. "Kolmogorov complexity": The longer the shortest model code, the higher its complexity (e.g. in bits). (3) Complexity defined via information entropy of parametric or predictive uncertainty. Preliminary results show that Bayes theorem allows for incorporating all parts of the non-static concept of model complexity like data quality and quantity or parametric uncertainty. Therefore, we test how different approaches for measuring model complexity perform in comparison to a fully Bayesian model selection procedure. Ultimately, we want to find a measure that helps to assess the most appropriate model.
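
    Category (3) above is the easiest to make concrete: the flatter a parameter's (discretised) distribution, the higher its entropy, so the more information is still needed to pin the model down. A small sketch with made-up posteriors; the four-bin discretisation is an assumption for illustration:

      import math

      def shannon_entropy(probs):
          return -sum(p * math.log2(p) for p in probs if p > 0)   # bits

      sharp_posterior = [0.90, 0.05, 0.03, 0.02]   # well-constrained parameter
      flat_posterior = [0.25, 0.25, 0.25, 0.25]    # poorly constrained parameter
      print(shannon_entropy(sharp_posterior))      # ~0.62 bits
      print(shannon_entropy(flat_posterior))       # 2.0 bits: maximal uncertainty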

  6. Surface complexation modeling

    USDA-ARS?s Scientific Manuscript database

    Adsorption-desorption reactions are important processes that affect the transport of contaminants in the environment. Surface complexation models are chemical models that can account for the effects of variable chemical conditions, such as pH, on adsorption reactions. These models define specific ...

  7. Synthetic biology approaches to biological containment: pre-emptively tackling potential risks

    PubMed Central

    Krüger, Antje; Csibra, Eszter; Gianni, Edoardo

    2016-01-01

    Biocontainment comprises any strategy applied to ensure that harmful organisms are confined to controlled laboratory conditions and not allowed to escape into the environment. Genetically engineered microorganisms (GEMs), regardless of the nature of the modification and how it was established, have potential human or ecological impact if accidentally leaked or voluntarily released into a natural setting. Although all evidence to date indicates that GEMs are unable to compete in the environment, the power of synthetic biology to rewrite life requires a pre-emptive strategy to tackle possible unknown risks. Physical containment barriers have proven effective, but a number of strategies have been developed to further strengthen biocontainment. Research on complex genetic circuits, lethal genes, alternative nucleic acids, genome recoding and synthetic auxotrophies aims to design more effective routes towards biocontainment. Here, we describe recent advances in synthetic biology that contribute to the ongoing efforts to develop new and improved genetic, semantic, metabolic and mechanistic plans for the containment of GEMs. PMID:27903826

  8. Synthetic biology approaches to biological containment: pre-emptively tackling potential risks.

    PubMed

    Torres, Leticia; Krüger, Antje; Csibra, Eszter; Gianni, Edoardo; Pinheiro, Vitor B

    2016-11-30

    Biocontainment comprises any strategy applied to ensure that harmful organisms are confined to controlled laboratory conditions and not allowed to escape into the environment. Genetically engineered microorganisms (GEMs), regardless of the nature of the modification and how it was established, have potential human or ecological impact if accidentally leaked or voluntarily released into a natural setting. Although all evidence to date indicates that GEMs are unable to compete in the environment, the power of synthetic biology to rewrite life requires a pre-emptive strategy to tackle possible unknown risks. Physical containment barriers have proven effective, but a number of strategies have been developed to further strengthen biocontainment. Research on complex genetic circuits, lethal genes, alternative nucleic acids, genome recoding and synthetic auxotrophies aims to design more effective routes towards biocontainment. Here, we describe recent advances in synthetic biology that contribute to the ongoing efforts to develop new and improved genetic, semantic, metabolic and mechanistic plans for the containment of GEMs. © 2016 The Author(s).

  9. Mechanisms and Factors Associated With Tackle-Related Injuries in South African Youth Rugby Union Players.

    PubMed

    Burger, Nicholas; Lambert, Mike Ian; Viljoen, Wayne; Brown, James Craig; Readhead, Clint; den Hollander, Steve; Hendricks, Sharief

    2017-02-01

    The majority of injuries in rugby union occur during tackle events, and the mechanisms and causes of these injuries are well established in senior rugby union. The aim was to use information from an injury database and assess video footage of tackle-related injuries in youth rugby union matches to identify environmental factors and mechanisms potentially contributing to these injuries. Descriptive epidemiological study. Injury surveillance was conducted at the under-18 Craven Week rugby tournament. Tackle-related injury information was used to identify injury events in match video footage (role-matched noninjury tackle events were identified for the cohort of injured players). Events were coded using match situational variables (precontact, contact, and postcontact). The relative risk ratio (RRR; the ratio of the probability of an injury or noninjury outcome occurring when a characteristic was observed) was reported by use of logistic regression. In comparison with the first quarter, injury risk was greater in the third (RRR = 9.75 [95% CI, 1.71-55.64]; P = .010) and fourth quarters (RRR = 6.97 [95% CI, 1.09-44.57]; P = .040) for ball carriers and in the fourth quarter (RRR = 9.63 [95% CI, 1.94-47.79]; P = .006) for tacklers. Ball carriers were less likely to be injured when they were aware of impending contact (RRR = 0.14 [95% CI, 0.03-0.66]; P = .012) or when they executed a moderate fend (hand-off) (RRR = 0.22 [95% CI, 0.06-0.84]; P = .026). Tacklers were less likely to be injured when performing shoulder tackles (same side as leading leg) in comparison to an arm-only tackle (RRR = 0.02 [95% CI, 0.001-0.79]; P = .037). Ball carriers (RRR = 0.09 [95% CI, 0.01-0.89]; P = .040) and tacklers (RRR = 0.02 [95% CI, 0.001-0.32]; P = .006) were less likely to be injured when initial contact was made with the tackler's shoulder/arm instead of his head/neck. The relative risk of tackle-related injury was higher toward the end of matches. Incorrect technique may contribute to increased injury
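
    As a numeric aside, RRRs like those above are exponentiated multinomial-logit coefficients (as reported, for example, by Stata's mlogit); with a binary outcome this coincides with the odds ratio. A minimal sketch with statsmodels on synthetic data; the variables, effect sizes, and sample size are invented for illustration, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 400
    quarter = rng.integers(1, 5, size=n)               # match quarter 1-4
    aware = rng.integers(0, 2, size=n)                 # aware of impending contact
    lin = -2.0 + 0.5 * (quarter == 4) - 1.0 * aware    # true effects (illustrative)
    injured = rng.random(n) < 1 / (1 + np.exp(-lin))

    X = sm.add_constant(pd.DataFrame({"q4": (quarter == 4).astype(int),
                                      "aware": aware}))
    fit = sm.MNLogit(injured.astype(int), X).fit(disp=0)  # multinomial logit
    print(np.exp(fit.params))      # exponentiated coefficients = RRRs
    print(np.exp(fit.conf_int()))  # 95% CIs on the RRR scale
    ```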

  10. Local Communities and Schools Tackling Sustainability and Climate Change

    ERIC Educational Resources Information Center

    Flowers, Rick; Chodkiewicz, Andrew

    2009-01-01

    Local communities and their schools remain key sites for actions tackling issues of sustainability and climate change. A government-funded environmental education initiative, the Australian Sustainable Schools Initiative (AuSSI), working together with state based Sustainable Schools Programs (SSP), has the ability to support the development of…

  11. 2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

  12. 46 CFR 184.300 - Ground tackle and mooring lines.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    46 CFR 184.300 (2014 ed.): Ground tackle and mooring lines. Title 46, Shipping; Coast Guard, Department of Homeland Security; Small Passenger Vessels (Under 100 Gross Tons); Vessel Control and Miscellaneous Systems and Equipment; Mooring and Towing Equipment; § 184.300...

  13. Clinical Complexity in Medicine: A Measurement Model of Task and Patient Complexity.

    PubMed

    Islam, R; Weir, C; Del Fiol, G

    2016-01-01

    Complexity in medicine needs to be reduced to simple components in a way that is comprehensible to researchers and clinicians. Few studies in the current literature propose a measurement model that addresses both task and patient complexity in medicine. The objective of this paper is to develop an integrated approach to understand and measure clinical complexity by incorporating both task and patient complexity components, focusing on the infectious disease domain. The measurement model was adapted and modified for the healthcare domain. Three clinical infectious disease teams were observed, audio-recorded and transcribed. Each team included one infectious diseases expert, one infectious diseases fellow, one physician assistant and one pharmacy resident fellow. The transcripts were parsed and the authors independently coded complexity attributes. This baseline measurement model of clinical complexity was modified in an initial set of coding processes and further validated in a consensus-based iterative process that included several meetings and email discussions by three clinical experts from diverse backgrounds at the Department of Biomedical Informatics at the University of Utah. Inter-rater reliability was calculated using Cohen's kappa. The proposed clinical complexity model consists of two separate components. The first is a clinical task complexity model with 13 clinical complexity-contributing factors and 7 dimensions. The second is the patient complexity model with 11 complexity-contributing factors and 5 dimensions. The measurement model for complexity encompassing both task and patient complexity will be a valuable resource for future researchers and industry to measure and understand complexity in healthcare.
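
    Inter-rater reliability via Cohen's kappa, as used above, is a one-liner with scikit-learn. A minimal sketch; the two raters' labels are invented for illustration.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Two raters' codes for the same 12 transcript segments (illustrative labels)
    rater_a = ["task", "patient", "task", "task", "patient", "task",
               "patient", "patient", "task", "task", "patient", "task"]
    rater_b = ["task", "patient", "task", "patient", "patient", "task",
               "patient", "task", "task", "task", "patient", "task"]

    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"Cohen's kappa = {kappa:.2f}")  # chance-corrected agreement
    ```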

  14. Tackling antibiotic resistance: the environmental framework.

    PubMed

    Berendonk, Thomas U; Manaia, Célia M; Merlin, Christophe; Fatta-Kassinos, Despo; Cytryn, Eddie; Walsh, Fiona; Bürgmann, Helmut; Sørum, Henning; Norström, Madelaine; Pons, Marie-Noëlle; Kreuzinger, Norbert; Huovinen, Pentti; Stefani, Stefania; Schwartz, Thomas; Kisand, Veljo; Baquero, Fernando; Martinez, José Luis

    2015-05-01

    Antibiotic resistance is a threat to human and animal health worldwide, and key measures are required to reduce the risks posed by antibiotic resistance genes that occur in the environment. These measures include the identification of critical points of control, the development of reliable surveillance and risk assessment procedures, and the implementation of technological solutions that can prevent environmental contamination with antibiotic resistant bacteria and genes. In this Opinion article, we discuss the main knowledge gaps, the future research needs and the policy and management options that should be prioritized to tackle antibiotic resistance in the environment.

  15. The QSAR study of flavonoid-metal complexes scavenging the ·OH free radical

    NASA Astrophysics Data System (ADS)

    Wang, Bo-chu; Qian, Jun-zhen; Fan, Ying; Tan, Jun

    2014-10-01

    Flavonoid-metal complexes have antioxidant activities. However, the quantitative structure-activity relationships (QSAR) between flavonoid-metal complexes and their antioxidant activities have still not been tackled. On the basis of 21 structures of flavonoid-metal complexes and their antioxidant activities for scavenging the ·OH free radical, we optimised their structures using the Gaussian 03 software package and subsequently calculated and chose 18 quantum chemistry descriptors such as dipole, charge and energy. We then selected the quantum chemistry descriptors that are most important to the IC50 of flavonoid-metal complexes for scavenging the ·OH free radical through stepwise linear regression, and we obtained 4 new variables through principal component analysis. Finally, we built the QSAR models, using those important quantum chemistry descriptors and the 4 new variables as the independent variables and the IC50 as the dependent variable, with an Artificial Neural Network (ANN), and we validated the two models using experimental data. These results show that the two models in this paper are reliable and predictive.
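
    The descriptor-reduction-plus-ANN workflow maps naturally onto a scikit-learn pipeline. A minimal sketch assuming scikit-learn; the random descriptor matrix, mock IC50 values, component count, and network size are illustrative stand-ins for the paper's data and software.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    X = rng.normal(size=(21, 18))   # 21 complexes x 18 quantum chemistry descriptors
    y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.1, size=21)  # mock IC50 values

    model = make_pipeline(StandardScaler(),
                          PCA(n_components=4),            # 4 principal components
                          MLPRegressor(hidden_layer_sizes=(8,),
                                       max_iter=5000, random_state=0))
    model.fit(X, y)
    print("R^2 on training data:", round(model.score(X, y), 3))
    ```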

  16. Comparing and improving proper orthogonal decomposition (POD) to reduce the complexity of groundwater models

    NASA Astrophysics Data System (ADS)

    Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas

    2017-04-01

    Physically-based modeling is a wide-spread tool in the understanding and management of natural systems. With the high complexity of many such models and the huge number of model runs necessary for parameter estimation and uncertainty analysis, overall run times can be prohibitively long even on modern computer systems. An encouraging strategy to tackle this problem is model reduction. In this contribution, we compare different proper orthogonal decomposition (POD, Siade et al. (2010)) methods and their potential applications to groundwater models. The POD method performs a singular value decomposition on system states as simulated by the complex (e.g., PDE-based) groundwater model taken at several time steps, so-called snapshots. The singular vectors with the highest information content resulting from this decomposition are then used as a basis for projection of the system of model equations onto a subspace of much lower dimensionality than the original complex model, thereby greatly reducing complexity and accelerating run times. In its original form, this method is only applicable to linear problems. Many real-world groundwater models are non-linear, though. These non-linearities are introduced either through model structure (unconfined aquifers) or boundary conditions (certain Cauchy boundaries, like rivers with variable connection to the groundwater table). To date, applications of POD have focused on groundwater models simulating pumping tests in confined aquifers with constant head boundaries. In contrast, POD model reduction either greatly loses accuracy or does not significantly reduce model run time if the above-mentioned non-linearities are introduced. We have also found that variable Dirichlet boundaries are problematic for POD model reduction. An extension to the POD method, called POD-DEIM, has been developed for non-linear groundwater models by Stanko et al. (2016). This method uses spatial interpolation points to build the equation system in the
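
    The snapshot-SVD step at the heart of POD is only a few lines of linear algebra. A minimal sketch with NumPy; the random low-rank "head field" snapshots and the 99.9% energy cutoff are illustrative assumptions, not the authors' setup.

    ```python
    import numpy as np

    # Snapshot matrix: each column is the simulated head field at one time step
    rng = np.random.default_rng(3)
    n_cells, n_snapshots = 2000, 40
    snapshots = rng.normal(size=(n_cells, 3)) @ rng.normal(size=(3, n_snapshots))

    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 0.999)) + 1   # basis size for 99.9% of variance
    basis = U[:, :r]                              # POD basis vectors

    # Reduced-order representation: project a full state down and reconstruct it
    h_full = snapshots[:, 0]
    h_reduced = basis.T @ h_full                  # r coefficients, not n_cells values
    print(r, np.linalg.norm(h_full - basis @ h_reduced))  # tiny reconstruction error
    ```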

  17. Modeling complexes of modeled proteins.

    PubMed

    Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A

    2017-03-01

    Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å Cα RMSD. Many template-based docking predictions fall into the acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and that template-based docking is much less sensitive to inaccuracies of protein models than free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
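
    The Cα RMSD used above to grade model accuracy is computed after optimal superposition, typically with the Kabsch algorithm. A minimal NumPy sketch; the mock 100-residue traces are illustrative, not the benchmark set.

    ```python
    import numpy as np

    def ca_rmsd(P, Q):
        """RMSD between two (N, 3) C-alpha coordinate sets after superposition."""
        P = P - P.mean(axis=0)
        Q = Q - Q.mean(axis=0)
        V, S, Wt = np.linalg.svd(P.T @ Q)         # Kabsch algorithm
        d = np.sign(np.linalg.det(V @ Wt))        # avoid improper rotation
        R = V @ np.diag([1.0, 1.0, d]) @ Wt
        return np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1)))

    rng = np.random.default_rng(4)
    native = rng.normal(size=(100, 3)) * 10.0               # mock C-alpha trace
    model = native + rng.normal(scale=2.0, size=(100, 3))   # perturbed model
    print(f"C-alpha RMSD = {ca_rmsd(model, native):.2f} A")
    ```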

  18. Tackling Health Inequalities in the United Kingdom: The Progress and Pitfalls of Policy

    PubMed Central

    Exworthy, Mark; Blane, David; Marmot, Michael

    2003-01-01

    Goal Assess the progress and pitfalls of current United Kingdom (U.K.) policies to reduce health inequalities. Objectives (1) Describe the context enabling health inequalities to get onto the policy agenda in the United Kingdom. (2) Categorize and assess selected current U.K. policies that may affect health inequalities. (3) Apply the “policy windows” model to understand the issues faced in formulating and implementing such policies. (4) Examine the emerging policy challenges in the U.K. and elsewhere. Data Sources Official documents, secondary analyses, and interviews with policymakers. Study Design Qualitative, policy analysis. Data Collection 2001–2002. The methods were divided into two stages. The first identified policies which were connected with individual inquiry recommendations. The second involved case-studies of three policies areas which were thought to be crucial in tackling health inequalities. Both stages involved interviews with policy-makers and documentary analysis. Principal Findings (1) The current U.K. government stated a commitment to reducing health inequalities. (2) The government has begun to implement policies that address the wider determinants. (3) Some progress is evident but many indicators remain stubborn. (4) Difficulties remain in terms of coordinating policies across government and measuring progress. (5) The “policy windows” model explains the limited extent of progress and highlights current and possible future pitfalls. (6) The U.K.'s experience has lessons for other governments involved in tackling health inequalities. Conclusions Health inequalities are on the agenda of U.K. government policy and steps have been made to address them. There are some signs of progress but much remains to be done including overcoming some of the perverse incentives at the national level, improving joint working, ensuring appropriate measures of performance/progress, and improving monitoring arrangements. A conceptual policy model aids

  19. Tackling childhood obesity: the importance of understanding the context.

    PubMed

    Knai, Cécile; McKee, Martin

    2010-12-01

    Recommendations to tackle major health problems such as childhood obesity may not be appropriate if they fail to take account of the prevailing socio-political, cultural and economic context. We describe the development and application of a qualitative risk analysis approach to identify non-scientific considerations framing the policy response to obesity in Denmark and Latvia. Interviews conducted with key stakeholders in Denmark and Latvia, undertaken following a review of relevant literature on obesity and national policies. A qualitative risk analysis model was developed to help explain the findings in the light of national context. Non-scientific considerations that appeared to influence the response to obesity include the perceived relative importance of childhood obesity; the nature of stakeholder relations and its impact on decision-making; the place of obesity on the policy agenda; the legitimacy of the state to act for population health and views on alliances between public and private sectors. Better recognition of the exogenous factors affecting policy-making may lead to a more adequate policy response. The development and use of a qualitative risk analysis model enabled a better understanding of the contextual factors and processes influencing the response to childhood obesity in each country.

  20. Tackling emerging fungal threats to animal health, food security and ecosystem resilience.

    PubMed

    Fisher, Matthew C; Gow, Neil A R; Gurr, Sarah J

    2016-12-05

    Emerging infections caused by fungi have become a widely recognized global phenomenon. Their notoriety stems from their causing plagues and famines, driving species extinctions, and the difficulty in treating human mycoses alongside the increase of their resistance to antifungal drugs. This special issue comprises a collection of articles resulting from a Royal Society discussion meeting examining why pathogenic fungi are causing more disease now than they did in the past, and how we can tackle this rapidly emerging threat to the health of plants and animals worldwide. This article is part of the themed issue 'Tackling emerging fungal threats to animal health, food security and ecosystem resilience'. © 2016 The Author(s).

  1. Training young scientists across empirical and modeling approaches

    NASA Astrophysics Data System (ADS)

    Moore, D. J.

    2014-12-01

    The "fluxcourse" is a two-week program of study on Flux Measurements and Advanced Modeling (www.fluxcourse.org). Since 2007, this course has trained early-career scientists to use both empirical observations and models to tackle terrestrial ecological questions. The fluxcourse seeks to cross-train young scientists in measurement techniques and advanced modeling approaches for quantifying carbon and water fluxes between the atmosphere and the biosphere. Depending on the year, we invited between ten and twenty volunteer instructors, ranging in experience and expertise, including representatives from industry, university professors and research specialists. The course combines online learning, lecture and discussion with hands-on activities that range from measuring photosynthesis and installing an eddy covariance system to wrangling data and carrying out modeling experiments. Attendees are asked to develop and present two different group projects throughout the course. The overall goal is to provide the next generation of scientists with the tools to tackle complex problems that require collaboration.

  2. Tackling misconceptions in geometrical optics

    NASA Astrophysics Data System (ADS)

    Ceuppens, S.; Deprez, J.; Dehaene, W.; De Cock, M.

    2018-07-01

    To improve the teaching and learning materials for a curriculum it is important to incorporate the findings from educational research. In light of this, we present creative exercises and experiments to elicit, confront and resolve misconceptions in geometrical optics. Since ray diagrams can be both the cause and the solution for many misconceptions we focus strongly on improving understanding of this tool to solve and understand optical phenomena. Through a combination of a conceptual understanding programme (CUP) and provocative exercises with ray diagrams we aim to elicit conceptual or cognitive conflict and exploit this to tackle misconceptions and increase students’ conceptual understanding through inquiry. We describe exercises for image formation by a plane mirror, image formation by a convex lens and indirect and direct observation of a real image formed by a convex lens as examples of our approach.
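
    As a small numeric companion to the ray diagrams discussed above, the thin-lens equation predicts where a convex lens forms a real image. This standard relation is offered here as background, not as code from the paper.

    ```python
    def image_distance(f, d_o):
        """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for image distance d_i."""
        return 1.0 / (1.0 / f - 1.0 / d_o)

    # Object 30 cm in front of a convex lens with f = 10 cm: real image at 15 cm
    print(image_distance(f=10.0, d_o=30.0))
    ```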

  3. Link-prediction to tackle the boundary specification problem in social network surveys

    PubMed Central

    De Wilde, Philippe; Buarque de Lima-Neto, Fernando

    2017-01-01

    Diffusion processes in social networks often cause the emergence of global phenomena from individual behavior within a society. The study of those global phenomena and the simulation of those diffusion processes frequently require a good model of the global network. However, survey data and data from online sources are often restricted to single social groups or features, such as age groups, single schools, companies, or interest groups. Hence, a modeling approach is required that extrapolates the locally restricted data to a global network model. We tackle this Missing Data Problem using Link-Prediction techniques from social network research, network generation techniques from the area of Social Simulation, as well as a combination of both. We found that techniques employing less information may be more adequate to solve this problem, especially when data granularity is an issue. We validated the network models created with our techniques on a number of real-world networks, investigating degree distributions as well as the likelihood of links given the geographical distance between two nodes. PMID:28426826
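
    Link-prediction heuristics of the kind applied above can be tried in a few lines with networkx. A minimal sketch using the Adamic-Adar index on a built-in example graph; the paper combines several techniques and real survey data, so this is only a generic illustration.

    ```python
    import networkx as nx

    # Observed (locally restricted) network; missing links are to be predicted
    G = nx.karate_club_graph()

    # Score unconnected node pairs with the Adamic-Adar index, a classic
    # link-prediction heuristic based on shared neighbours
    scores = sorted(nx.adamic_adar_index(G), key=lambda t: t[2], reverse=True)
    for u, v, score in scores[:5]:
        print(f"predict link ({u}, {v}) with score {score:.2f}")
    ```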

  4. Teacher Modeling Using Complex Informational Texts

    ERIC Educational Resources Information Center

    Fisher, Douglas; Frey, Nancy

    2015-01-01

    Modeling in complex texts requires that teachers analyze the text for factors of qualitative complexity and then design lessons that introduce students to that complexity. In addition, teachers can model the disciplinary nature of content area texts as well as word solving and comprehension strategies. Included is a planning guide for think-alouds.

  5. Multifaceted Modelling of Complex Business Enterprises

    PubMed Central

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, management planning data, expert knowledge, and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control. PMID:26247591

  6. Multifaceted Modelling of Complex Business Enterprises.

    PubMed

    Chakraborty, Subrata; Mengersen, Kerrie; Fidge, Colin; Ma, Lin; Lassen, David

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, management planning data, expert knowledge, and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control.

  7. The impact of tackle football injuries on the American healthcare system with a neurological focus.

    PubMed

    McGinity, Michael J; Grandhi, Ramesh; Michalek, Joel E; Rodriguez, Jesse S; Trevino, Aron M; McGinity, Ashley C; Seifi, Ali

    2018-01-01

    Recent interest in the study of concussion and other neurological injuries has heightened awareness of the medical implications of American tackle football injuries amongst the public. Using the National Emergency Department Sample (NEDS) and the National Inpatient Sample (NIS), the largest publicly available all-payer emergency department and inpatient healthcare databases in the United States, we sought to describe the impact of tackle football injuries on the American healthcare system by delineating injuries, specifically neurological in nature, suffered as a consequence of tackle football between 2010 and 2013. The NEDS and NIS databases were queried to collect data on all patients who presented to the emergency department (ED) and/or were admitted to hospitals with an ICD code for injuries related to American tackle football between the years 2010 and 2013. Subsequently, those with football-related neurological injuries were abstracted using ICD codes for concussion, skull/face injury, intracranial injury, spine injury, and spinal cord injury (SCI). Patient demographics, length of hospital stay (LOS), cost and charge data, neurosurgical interventions, hospital type, and disposition were collected and analyzed. A total of 819,000 patients presented to EDs for evaluation of injuries secondary to American tackle football between 2010 and 2013, with 1.13% having injuries requiring inpatient admission (average length of stay 2.4 days). 80.4% of the ED visits were from the pediatric population. Of note, a statistically significant increase in the number of pediatric concussions over time was demonstrated (OR = 1.1, 95% CI 1.1 to 1.2). Patients were more likely to be admitted to trauma centers, teaching hospitals, the south or west regions, or with private insurance. There were 471 spinal cord injuries and 1,908 total spine injuries. Ten patients died during the study time period. The combined ED and inpatient charges were $1.35 billion. Injuries related to tackle

  8. The impact of tackle football injuries on the American healthcare system with a neurological focus

    PubMed Central

    McGinity, Michael J.; Grandhi, Ramesh; Michalek, Joel E.; Rodriguez, Jesse S.; Trevino, Aron M.; McGinity, Ashley C.

    2018-01-01

    Background Recent interest in the study of concussion and other neurological injuries has heightened awareness of the medical implications of American tackle football injuries amongst the public. Objective Using the National Emergency Department Sample (NEDS) and the National Inpatient Sample (NIS), the largest publicly available all-payer emergency department and inpatient healthcare databases in the United States, we sought to describe the impact of tackle football injuries on the American healthcare system by delineating injuries, specifically neurological in nature, suffered as a consequence of tackle football between 2010 and 2013. Methods The NEDS and NIS databases were queried to collect data on all patients who presented to the emergency department (ED) and/or were admitted to hospitals with an ICD code for injuries related to American tackle football between the years 2010 and 2013. Subsequently, those with football-related neurological injuries were abstracted using ICD codes for concussion, skull/face injury, intracranial injury, spine injury, and spinal cord injury (SCI). Patient demographics, length of hospital stay (LOS), cost and charge data, neurosurgical interventions, hospital type, and disposition were collected and analyzed. Results A total of 819,000 patients presented to EDs for evaluation of injuries secondary to American tackle football between 2010 and 2013, with 1.13% having injuries requiring inpatient admission (average length of stay 2.4 days). 80.4% of the ED visits were from the pediatric population. Of note, a statistically significant increase in the number of pediatric concussions over time was demonstrated (OR = 1.1, 95% CI 1.1 to 1.2). Patients were more likely to be admitted to trauma centers, teaching hospitals, the south or west regions, or with private insurance. There were 471 spinal cord injuries and 1,908 total spine injuries. Ten patients died during the study time period. The combined ED and inpatient charges were $1

  9. Tackling wicked problems in infection prevention and control: a guideline for co-creation with stakeholders.

    PubMed

    van Woezik, Anne F G; Braakman-Jansen, Louise M A; Kulyk, Olga; Siemons, Liseth; van Gemert-Pijnen, Julia E W C

    2016-01-01

    Infection prevention and control can be seen as a wicked public health problem, as there is no consensus regarding problem definition and solution, multiple stakeholders with different needs and values are involved, and there is no clear end-point of the problem-solving process. Co-creation with stakeholders has been proposed as a suitable strategy to tackle wicked problems, yet little information and no clear step-by-step guide exist on how to do this. The objectives of this study were to develop a guideline to assist developers in tackling wicked problems using co-creation with stakeholders, and to apply this guideline to practice with an example case in the field of infection prevention and control. A mixed-method approach consisting of the integration of both quantitative and qualitative research was used. Relevant stakeholders from the veterinary, human health, and public health sectors were identified using a literature scan, expert recommendations, and snowball sampling. The stakeholder salience approach was used to select key stakeholders based on 3 attributes: power, legitimacy, and urgency. Key values of stakeholders (N = 20) were derived by qualitative semi-structured interviews and quantitatively weighted and prioritized using an online survey. Our method showed that stakeholder identification and analysis are prerequisites for understanding the complex stakeholder network that characterizes wicked problems. A total of 73 stakeholders were identified, of which 36 were selected as potential key stakeholders, and only one was seen as a definite stakeholder. In addition, deriving key stakeholder values is a necessity to gain insights into the different problem definitions, solutions and needs stakeholders have regarding the wicked problem. Based on the methods used, we developed a step-by-step guideline for co-creation with stakeholders when tackling wicked problems. The mixed-methods guideline presented here provides a systematic, transparent method to

  10. Tackle-related injury rates and nature of injuries in South African Youth Week tournament rugby union players (under-13 to under-18): an observational cohort study.

    PubMed

    Burger, Nicholas; Lambert, Mike I; Viljoen, Wayne; Brown, James C; Readhead, Clint; Hendricks, Sharief

    2014-08-12

    The tackle situation is most often associated with the high injury rates in rugby union. Tackle injury epidemiology in rugby union has previously been focused on senior cohorts but less is known about younger cohorts. The aim of this study was to report on the nature and rates of tackle-related injuries in South African youth rugby union players representing their provinces at national tournaments. Observational cohort study. Four South African Youth Week tournaments (under-13 Craven Week, under-16 Grant Khomo Week, under-18 Academy Week, under-18 Craven Week). Injury data were collected from 3652 youth rugby union players (population at risk) in 2011 and 2012. Tackle-related injury severity ('time-loss' and 'medical attention'), type and location, injury rate per 1000 h (including 95% CIs). Injury rate ratios (IRR) were calculated and modelled using a Poisson regression. A χ² analysis was used to detect linear trends between injuries and increasing match quarters. The 2012 under-13 Craven Week had a significantly greater 'time-loss' injury rate when compared with the 2012 under-18 Academy Week (IRR=4.43; 95% CI 2.13 to 9.21, p<0.05) and under-18 Craven Week (IRR=3.52; 95% CI 1.54 to 8.00, p<0.05). The Poisson regression also revealed a higher probability of 'overall' ('time-loss' and 'medical attention' combined) and 'time-loss' tackle-related injuries occurring at the under-13 Craven Week. The proportion of 'overall' and 'time-loss' injuries increased significantly with each quarter of the match when all four tournaments were combined (p<0.05). There was a difference in the tackle-related injury rate between the under-13 tournament and the two under-18 tournaments, and the tackle-related injury rate was higher in the final quarter of matches. Ongoing injury surveillance is required to better interpret these findings. Injury prevention strategies targeting the tackle may only be effective once the rate and nature of injuries have been accurately determined
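
    Injury rates per 1000 player-hours with confidence intervals, as reported above, can be computed directly from counts and exposure. A minimal sketch with SciPy using the exact Poisson interval; the counts and exposure hours are invented, not the tournament data.

    ```python
    from scipy.stats import chi2

    def rate_per_1000h(injuries, exposure_hours, alpha=0.05):
        """Injury rate per 1000 player-hours with an exact Poisson 95% CI."""
        lo = chi2.ppf(alpha / 2, 2 * injuries) / 2 if injuries > 0 else 0.0
        hi = chi2.ppf(1 - alpha / 2, 2 * (injuries + 1)) / 2
        scale = 1000.0 / exposure_hours
        return injuries * scale, lo * scale, hi * scale

    # Illustrative numbers only
    rate, lo, hi = rate_per_1000h(injuries=24, exposure_hours=3200)
    print(f"{rate:.1f} injuries/1000 h (95% CI {lo:.1f}-{hi:.1f})")
    ```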

  11. Tackle-related injury rates and nature of injuries in South African Youth Week tournament rugby union players (under-13 to under-18): an observational cohort study

    PubMed Central

    Burger, Nicholas; Lambert, Mike I; Viljoen, Wayne; Brown, James C; Readhead, Clint; Hendricks, Sharief

    2014-01-01

    Objectives The tackle situation is most often associated with the high injury rates in rugby union. Tackle injury epidemiology in rugby union has previously been focused on senior cohorts but less is known about younger cohorts. The aim of this study was to report on the nature and rates of tackle-related injuries in South African youth rugby union players representing their provinces at national tournaments. Design Observational cohort study. Setting Four South African Youth Week tournaments (under-13 Craven Week, under-16 Grant Khomo Week, under-18 Academy Week, under-18 Craven Week). Participants Injury data were collected from 3652 youth rugby union players (population at risk) in 2011 and 2012. Outcome measures Tackle-related injury severity (‘time-loss’ and ‘medical attention’), type and location, injury rate per 1000 h (including 95% CIs). Injury rate ratios (IRR) were calculated and modelled using a Poisson regression. A χ2 analysis was used to detect linear trends between injuries and increasing match quarters. Results The 2012 under-13 Craven Week had a significantly greater ‘time-loss’ injury rate when compared with the 2012 under-18 Academy Week (IRR=4.43; 95% CI 2.13 to 9.21, p<0.05) and under-18 Craven Week (IRR=3.52; 95% CI 1.54 to 8.00, p<0.05). The Poisson regression also revealed a higher probability of ‘overall’ (‘time-loss’ and ‘medical attention’ combined) and ‘time-loss’ tackle-related injuries occurring at the under-13 Craven Week. The proportion of ‘overall’ and ‘time-loss’ injuries increased significantly with each quarter of the match when all four tournaments were combined (p<0.05). Conclusions There was a difference in the tackle-related injury rate between the under-13 tournament and the two under-18 tournaments, and the tackle-related injury rate was higher in the final quarter of matches. Ongoing injury surveillance is required to better interpret these findings. Injury prevention strategies

  12. E-Index for Differentiating Complex Dynamic Traits

    PubMed Central

    Qi, Jiandong; Sun, Jianfeng; Wang, Jianxin

    2016-01-01

    While it is a daunting challenge in current biology to understand how the underlying network of genes regulates complex dynamic traits, functional mapping, a tool for mapping quantitative trait loci (QTLs) and single nucleotide polymorphisms (SNPs), has been applied in a variety of cases to tackle this challenge. Though useful and powerful, functional mapping performs well only when one or more model parameters are clearly responsible for the developmental trajectory, typically a logistic curve. Moreover, it does not work when the curves are more complex than that, especially when they are not monotonic. To overcome this limitation, we propose a mathematical-biological concept and measurement, the E-index (earliness-index), which cumulatively measures how early a variable (or a dynamic trait) increases or decreases its value. Theoretical proofs and simulation studies show that the E-index is more general than functional mapping and can be applied to any complex dynamic traits, including those with logistic curves and those with nonmonotonic curves. Meanwhile, the E-index vector is proposed as well to capture more subtle differences in developmental patterns. PMID:27064292
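
    The abstract does not spell out the E-index formula, so the sketch below is only an invented illustration of a cumulative "earliness" score (area under the normalized cumulative growth curve), not the paper's definition: curves that gain most of their value early score near 1, late risers near 0.

    ```python
    import numpy as np

    def earliness_index(t, y):
        """Illustrative earliness score: area under the normalized cumulative
        growth curve (not the published E-index definition)."""
        y = np.asarray(y, dtype=float)
        growth = np.clip(np.diff(y, prepend=y[0]), 0, None)  # incremental gains
        cum = np.cumsum(growth)
        cum /= cum[-1]                                       # normalize to [0, 1]
        return np.trapz(cum, t) / (t[-1] - t[0])

    t = np.linspace(0, 10, 200)
    early = 1 / (1 + np.exp(-2 * (t - 2)))   # logistic curve rising early
    late = 1 / (1 + np.exp(-2 * (t - 8)))    # same shape, rising late
    print(earliness_index(t, early), earliness_index(t, late))
    ```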

  13. Tackling Work Related Stress in a National Health Service Trust

    ERIC Educational Resources Information Center

    Vick, Donna; Whyatt, Hilary

    2004-01-01

    The challenge of tackling the problem of coping with work related stress in a National Health Service (NHS) Trust was undertaken. Ideas were developed within the context of two different action learning sets and led to actions resulting in a large therapy Taster Session event and the establishment of a centre offering alternative therapies and…

  14. Public health nutrition in the civil service (England): approaches to tackling obesity.

    PubMed

    Blackshaw, J R

    2016-08-01

    The seriousness and scale of the physical, psychological, economic and societal consequences relating to poor diets, inactivity and obesity is unprecedented. Consequently, the contextual factors underpinning the work of a nutritionist in the civil service are complex and significant; however, there are real opportunities to make a difference and help improve the health of the nation. The present paper describes the delivery of public health nutrition through two work programmes, namely action to support young people develop healthier lifestyle choices and more recently the investigation and deployment of local insights to develop action to tackle obesity. Combining the application of nutrition expertise along with broader skills and approaches has enabled the translation of research and evidence into programmes of work to better the public's health. It is evident that the appropriate evaluation of such approaches has helped to deliver engaging and practical learning opportunities for young people. Furthermore, efforts to build on local intelligence and seek collaborative development can help inform the evidence base and seek to deliver public health approaches, which resonate with how people live their lives.

  15. The Role of Empathy in Preparing Teachers to Tackle Bullying

    ERIC Educational Resources Information Center

    Murphy, Helena; Tubritt, John; Norman, James O'Higgins

    2018-01-01

    Much research on bullying behaviour in schools among students has been carried out since the 1970s, when Olweus started a large-scale project in Norway that is now generally regarded as the first scientific study on bullying. Yet, there has been little research on how teachers respond to reports of bullying and tackle bullying behaviour in…

  16. Updating the debate on model complexity

    USGS Publications Warehouse

    Simmons, Craig T.; Hunt, Randall J.

    2012-01-01

    As scientists who are trying to understand a complex natural world that cannot be fully characterized in the field, how can we best inform the society in which we live? This founding context was addressed in a special session, “Complexity in Modeling: How Much is Too Much?” convened at the 2011 Geological Society of America Annual Meeting. The session had a variety of thought-provoking presentations—ranging from philosophy to cost-benefit analyses—and provided some areas of broad agreement that were not evident in discussions of the topic in 1998 (Hunt and Zheng, 1999). The session began with a short introduction during which model complexity was framed borrowing from an economic concept, the Law of Diminishing Returns, and an example of enjoyment derived from eating ice cream. Initially, there is increasing satisfaction gained from eating more ice cream, to a point where the gain in satisfaction starts to decrease, ending at a point when the eater sees no value in eating more ice cream. A traditional view of model complexity is similar—understanding gained from modeling can actually decrease if models become unnecessarily complex. However, oversimplified models—those that omit important aspects of the problem needed to make a good prediction—can also limit and confound our understanding. Thus, the goal of all modeling is to find the “sweet spot” of model sophistication—regardless of whether complexity was added sequentially to an overly simple model or collapsed from an initial highly parameterized framework that uses mathematics and statistics to attain an optimum (e.g., Hunt et al., 2007). Thus, holistic parsimony is attained, incorporating “as simple as possible,” as well as the equally important corollary “but no simpler.”

  17. The evolution of policy and actions to tackle obesity in England.

    PubMed

    Jebb, S A; Aveyard, P N; Hawkes, C

    2013-11-01

    Tackling obesity has been a policy priority in England for more than 20 years. Two formal government strategies on obesity in 2008 and 2011 drew together a range of actions and developed new initiatives to fill perceived gaps. Today, a wide range of policies are in place, including support for breastfeeding and healthy weaning practices, nutritional standards in schools, restrictions on marketing foods high in fat, sugar and salt to children, schemes to boost participation in sport, active travel plans, and weight management services. Data from annual surveys show that the rate of increase in obesity has attenuated in recent years, but has not yet been reversed. This paper considers the actions taken and what is known about the impact of individual policies and the overarching strategy to tackle obesity in England. © 2013 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of the International Association for the Study of Obesity.

  18. Waste management under multiple complexities: Inexact piecewise-linearization-based fuzzy flexible programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Wei; Huang, Guo H. (E-mail: huang@iseis.org; Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan, S4S 0A2)

    2012-06-15

    Highlights: • Inexact piecewise-linearization-based fuzzy flexible programming is proposed. • It is the first application to waste management under multiple complexities. • It tackles nonlinear economies-of-scale effects in interval-parameter constraints. • It estimates costs more accurately than the linear-regression-based model. • Uncertainties are decreased and more satisfactory interval solutions are obtained. Abstract: To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from the objective function to the constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, IPFP2 may underestimate the net system costs while IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3
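
    Piecewise linearization itself is simple to demonstrate: a concave economies-of-scale cost curve is replaced by segment-wise slopes and intercepts that a linear program can handle. A minimal NumPy sketch; the cost function, breakpoints, and segment count are invented for illustration and are not the paper's model.

    ```python
    import numpy as np

    def piecewise_linearize(cost, x_min, x_max, n_segments=4):
        """Approximate a nonlinear cost curve by (slope, intercept) pairs between
        evenly spaced breakpoints, suitable for embedding in linear constraints."""
        xb = np.linspace(x_min, x_max, n_segments + 1)
        yb = cost(xb)
        slopes = np.diff(yb) / np.diff(xb)
        intercepts = yb[:-1] - slopes * xb[:-1]
        return xb, slopes, intercepts

    # Concave economies-of-scale transport cost (illustrative): cost = 50 * x^0.6
    xb, m, b = piecewise_linearize(lambda x: 50 * x**0.6, 10, 200, n_segments=4)
    x = 75.0
    seg = np.searchsorted(xb, x) - 1             # which segment x falls in
    print(f"true {50 * x**0.6:.1f} vs piecewise {m[seg] * x + b[seg]:.1f}")
    ```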

  19. Short-term emergency response planning and risk assessment via an integrated modeling system for nuclear power plants in complex terrain

    NASA Astrophysics Data System (ADS)

    Chang, Ni-Bin; Weng, Yu-Chi

    2013-03-01

    Short-term predictions of potential impacts from accidental releases of various radionuclides at nuclear power plants are acutely needed, especially after the Fukushima accident in Japan. An integrated modeling system that provides expert services to assess the consequences of accidental or intentional releases of radioactive materials to the atmosphere has received wide attention. These scenarios can be initiated either by accident due to human, software, or mechanical failures, or by intentional acts such as sabotage and radiological dispersal devices. Stringent action might be required just minutes after the occurrence of an accidental or intentional release. Previous studies on the basic functions of emergency preparedness and response systems seldom consider the suitability of air pollutant dispersion models or the connectivity between source term, dispersion, and exposure assessment models in a holistic context for decision support. As a result, the Gaussian plume and puff models, which are only suitable for describing neutral air pollutants over flat terrain under limited meteorological situations, are frequently used to predict the impact of accidental releases from industrial sources. In situations with complex terrain or special meteorological conditions, the proposed emergency response actions might be questionable and even intractable for decision-makers responsible for maintaining public health and environmental quality. This study is a preliminary effort to integrate source term, dispersion, and exposure assessment models into a Spatial Decision Support System (SDSS) to tackle the complex issues of short-term emergency response planning and risk assessment at nuclear power plants. Through a series of model screening procedures, we found that the diagnostic (objective) wind field model, with the aid of sufficient on-site meteorological monitoring data, was the most applicable model to promptly address the trend of local wind field patterns
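
    For reference, the Gaussian plume model criticized above reduces to a single closed-form expression. A minimal NumPy sketch of the standard steady-state formula with ground reflection; all input values are illustrative, and in practice the dispersion coefficients depend on downwind distance and stability class.

    ```python
    import numpy as np

    def gaussian_plume(Q, u, y, z, H, sy, sz):
        # Q: emission rate (g/s); u: wind speed (m/s); H: effective stack height (m)
        # sy, sz: lateral/vertical dispersion coefficients (m), normally functions
        # of downwind distance and atmospheric stability class
        lateral = np.exp(-y**2 / (2 * sy**2))
        vertical = (np.exp(-(z - H)**2 / (2 * sz**2)) +
                    np.exp(-(z + H)**2 / (2 * sz**2)))   # ground reflection term
        return Q / (2 * np.pi * u * sy * sz) * lateral * vertical

    # Ground-level centreline concentration for illustrative inputs (g/m^3)
    print(gaussian_plume(Q=1.0, u=5.0, y=0.0, z=0.0, H=50.0, sy=80.0, sz=40.0))
    ```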

  20. Virtual Construction of Space Habitats: Connecting Building Information Models (BIM) and SysML

    NASA Technical Reports Server (NTRS)

    Polit-Casillas, Raul; Howe, A. Scott

    2013-01-01

    Current trends in the design, construction and management of complex projects make use of Building Information Models (BIM), which connect different types of data to geometrical models. This information model allows different types of analysis beyond pure graphical representations. Space habitats, regardless of their size, are also complex systems that require the synchronization of many types of information and disciplines beyond mass, volume, power or other basic volumetric parameters. For this, state-of-the-art model-based systems engineering languages and processes, for instance SysML, represent a solid way to tackle this problem from a programmatic point of view. Nevertheless, integrating this with a powerful geometrical architectural design tool with BIM capabilities could represent a change in the workflow and paradigm of space habitat design applicable to other complex aerospace systems. This paper shows some general findings and overall conclusions based on ongoing research to create a design protocol and method that practically connects a systems engineering approach with BIM-based architectural and engineering design as a complete Model Based Engineering approach. One hypothetical example is created and followed during the design process. To make this possible, the research also tackles the application of IFC categories and parameters in the aerospace field, starting with their application to space habitat design as a way to understand the information flow between disciplines and tools. By building virtual space habitats, we can potentially improve in the near future the way more complex designs are developed, from very little initial detail, from concept to manufacturing.

  1. Tackling reliability and construct validity: the systematic development of a qualitative protocol for skill and incident analysis.

    PubMed

    Savage, Trevor Nicholas; McIntosh, Andrew Stuart

    2017-03-01

    It is important to understand the factors contributing to and directly causing sports injuries in order to improve the effectiveness and safety of sports skills. The characteristics of injury events must be evaluated and described meaningfully and reliably. However, many complex skills cannot be effectively investigated quantitatively because of ethical, technological and validity considerations. Increasingly, qualitative methods are being used to investigate human movement for research purposes, but there are concerns about the reliability and measurement bias of such methods. Using the tackle in Rugby union as an example, we outline a systematic approach for developing a skill analysis protocol with a focus on improving objectivity, validity and reliability. Characteristics for analysis were selected using qualitative analysis, biomechanical theoretical models, and the epidemiological and coaching literature. An expert panel comprising subject matter experts provided feedback, and the inter-rater reliability of the protocol was assessed using ten trained raters. The inter-rater reliability results were reviewed by the expert panel and the protocol was revised and assessed in a second inter-rater reliability study. Mean agreement in the second study improved and was comparable with other studies that have reported inter-rater reliability for qualitative analysis of human movement (52-90% agreement; ICC between 0.6 and 0.9).

  2. Mesoscale modeling: solving complex flows in biology and biotechnology.

    PubMed

    Mills, Zachary Grant; Mao, Wenbin; Alexeev, Alexander

    2013-07-01

    Fluids are involved in practically all physiological activities of living organisms. However, biological and biorelated flows are hard to analyze due to the inherent combination of interdependent effects and processes that occur on a multitude of spatial and temporal scales. Recent advances in mesoscale simulations enable researchers to tackle problems that are central for the understanding of such flows. Furthermore, computational modeling effectively facilitates the development of novel therapeutic approaches. Among other methods, dissipative particle dynamics and the lattice Boltzmann method have become increasingly popular during recent years due to their ability to solve a large variety of problems. In this review, we discuss recent applications of these mesoscale methods to several fluid-related problems in medicine, bioengineering, and biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
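
    Of the two methods named above, the lattice Boltzmann method is the easier to sketch compactly. Below is a minimal one-dimensional (D1Q3) lattice Boltzmann solver for pure diffusion with BGK collisions, a toy illustration rather than a biological flow solver; the grid size, relaxation time, and step count are arbitrary choices.

    ```python
    import numpy as np

    # D1Q3 lattice Boltzmann for pure diffusion (BGK collision).
    # Diffusivity D = c_s^2 * (tau - 0.5) with lattice sound speed c_s^2 = 1/3.
    n, tau, steps = 200, 0.8, 500
    w = np.array([2/3, 1/6, 1/6])          # weights for velocities {0, +1, -1}
    rho = np.zeros(n); rho[n // 2] = 1.0   # initial concentration spike
    f = w[:, None] * rho                   # start at equilibrium

    for _ in range(steps):
        feq = w[:, None] * rho             # equilibrium distributions
        f += (feq - f) / tau               # BGK collision step
        f[1] = np.roll(f[1], 1)            # stream right-movers
        f[2] = np.roll(f[2], -1)           # stream left-movers (periodic boundary)
        rho = f.sum(axis=0)

    print("mass conserved:", np.isclose(rho.sum(), 1.0), "peak:", rho.max())
    ```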

  3. Flexible and structured survival model for a simultaneous estimation of non-linear and non-proportional effects and complex interactions between continuous variables: Performance of this multidimensional penalized spline approach in net survival trend analysis.

    PubMed

    Remontet, Laurent; Uhry, Zoé; Bossard, Nadine; Iwaz, Jean; Belot, Aurélien; Danieli, Coraline; Charvat, Hadrien; Roche, Laurent

    2018-01-01

    Cancer survival trend analyses are essential to describe accurately the way medical practices impact patients' survival according to the year of diagnosis. To this end, survival models should be able to account simultaneously for non-linear and non-proportional effects and for complex interactions between continuous variables. However, in the statistical literature, there is no consensus yet on how to build such models that should be flexible but still provide smooth estimates of survival. In this article, we tackle this challenge by smoothing the complex hypersurface (time since diagnosis, age at diagnosis, year of diagnosis, and mortality hazard) using a multidimensional penalized spline built from the tensor product of the marginal bases of time, age, and year. Considering this penalized survival model as a Poisson model, we assess the performance of this approach in estimating the net survival with a comprehensive simulation study that reflects simple and complex realistic survival trends. The bias was generally small and the root mean squared error was good and often similar to that of the true model that generated the data. This parametric approach offers many advantages and interesting prospects (such as forecasting) that make it an attractive and efficient tool for survival trend analyses.
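
    The one-dimensional building block of such a tensor-product smoother is the penalized B-spline (P-spline): a rich B-spline basis plus a difference penalty on the coefficients. A minimal sketch assuming SciPy 1.8 or later (for BSpline.design_matrix); the knot count, penalty order, and smoothing parameter are illustrative choices, not the paper's settings.

    ```python
    import numpy as np
    from scipy.interpolate import BSpline

    rng = np.random.default_rng(5)
    x = np.sort(rng.uniform(0, 10, 150))
    y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

    k, n_knots = 3, 20
    t = np.r_[[0]*k, np.linspace(0, 10, n_knots), [10]*k]     # clamped knot vector
    B = BSpline.design_matrix(x, t, k).toarray()              # B-spline basis matrix
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)              # 2nd-order differences
    lam = 5.0                                                 # smoothing parameter
    coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)  # penalized LS fit
    smooth = B @ coef
    print("residual SD:", np.std(y - smooth).round(3))
    ```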

  4. A brief introduction to mixed effects modelling and multi-model inference in ecology

    PubMed Central

    Donaldson, Lynda; Correa-Cano, Maria Eugenia; Goodwin, Cecily E.D.

    2018-01-01

    The use of linear mixed effects models (LMMs) is increasingly common in the analysis of biological data. Whilst LMMs offer a flexible approach to modelling a broad range of data types, ecological data are often complex and require complex model structures, and the fitting and interpretation of such models is not always straightforward. The ability to achieve robust biological inference requires that practitioners know how and when to apply these tools. Here, we provide a general overview of current methods for the application of LMMs to biological data, and highlight the typical pitfalls that can be encountered in the statistical modelling process. We tackle several issues regarding methods of model selection, with particular reference to the use of information theory and multi-model inference in ecology. We offer practical solutions and direct the reader to key references that provide further technical detail for those seeking a deeper understanding. This overview should serve as a widely accessible code of best practice for applying LMMs to complex biological problems and model structures, and in doing so improve the robustness of conclusions drawn from studies investigating ecological and evolutionary questions. PMID:29844961

  5. A brief introduction to mixed effects modelling and multi-model inference in ecology.

    PubMed

    Harrison, Xavier A; Donaldson, Lynda; Correa-Cano, Maria Eugenia; Evans, Julian; Fisher, David N; Goodwin, Cecily E D; Robinson, Beth S; Hodgson, David J; Inger, Richard

    2018-01-01

    The use of linear mixed effects models (LMMs) is increasingly common in the analysis of biological data. Whilst LMMs offer a flexible approach to modelling a broad range of data types, ecological data are often complex and require complex model structures, and the fitting and interpretation of such models is not always straightforward. The ability to achieve robust biological inference requires that practitioners know how and when to apply these tools. Here, we provide a general overview of current methods for the application of LMMs to biological data, and highlight the typical pitfalls that can be encountered in the statistical modelling process. We tackle several issues regarding methods of model selection, with particular reference to the use of information theory and multi-model inference in ecology. We offer practical solutions and direct the reader to key references that provide further technical detail for those seeking a deeper understanding. This overview should serve as a widely accessible code of best practice for applying LMMs to complex biological problems and model structures, and in doing so improve the robustness of conclusions drawn from studies investigating ecological and evolutionary questions.

  6. Toward a Learning Science for Complex Crowdsourcing Tasks

    ERIC Educational Resources Information Center

    Doroudi, Shayan; Kamar, Ece; Brunskill, Emma; Horvitz, Eric

    2016-01-01

    We explore how crowdworkers can be trained to tackle complex crowdsourcing tasks. We are particularly interested in training novice workers to perform well on solving tasks in situations where the space of strategies is large and workers need to discover and try different strategies to be successful. In a first experiment, we perform a comparison…

  7. Predictive Surface Complexation Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sverjensky, Dimitri A.

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  8. Are Model Transferability And Complexity Antithetical? Insights From Validation of a Variable-Complexity Empirical Snow Model in Space and Time

    NASA Astrophysics Data System (ADS)

    Lute, A. C.; Luce, Charles H.

    2017-11-01

    The related challenges of predictions in ungauged basins and predictions in ungauged climates point to the need to develop environmental models that are transferable across both space and time. Hydrologic modeling has historically focused on modeling one or only a few basins using highly parameterized conceptual or physically based models. However, model parameters and structures have been shown to change significantly when calibrated to new basins or time periods, suggesting that model complexity and model transferability may be antithetical. Empirical space-for-time models provide a framework within which to assess model transferability and any tradeoff with model complexity. Using 497 SNOTEL sites in the western U.S., we develop space-for-time models of April 1 SWE and Snow Residence Time based on mean winter temperature and cumulative winter precipitation. The transferability of the models to new conditions (in both space and time) is assessed using non-random cross-validation tests with consideration of the influence of model complexity on transferability. As others have noted, the algorithmic empirical models transfer best when minimal extrapolation in input variables is required. Temporal split-sample validations use pseudoreplicated samples, resulting in the selection of overly complex models, which has implications for the design of hydrologic model validation tests. Finally, we show that low- to moderate-complexity models transfer most successfully to new conditions in space and time, providing empirical confirmation of the parsimony principle.
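
    The toy sketch below mimics this design on synthetic data: empirical SWE models of increasing complexity are validated by grouped (non-random) cross-validation so that training and test sets come from different "climates". The variable names, data-generating process, and grouping rule are illustrative stand-ins for the SNOTEL analysis, not a reproduction of it.

```python
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Hypothetical SNOTEL-like data: April 1 SWE predicted from mean winter
# temperature (Tw) and cumulative winter precipitation (Pw).
rng = np.random.default_rng(2)
n = 497
Tw = rng.normal(-4, 3, n)
Pw = rng.gamma(4, 150, n)
swe = np.maximum(0, 0.7 * Pw - 40 * Tw + rng.normal(0, 80, n))
X, y = np.column_stack([Tw, Pw]), swe

# Crude climate "groups" so that test folds require extrapolation.
region = (Tw > np.median(Tw)).astype(int) * 2 + (Pw > np.median(Pw))

# Train in some climates, test in others, for increasing model complexity.
for degree in (1, 2, 3, 5):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    rmses = []
    for tr, te in GroupKFold(n_splits=4).split(X, y, groups=region):
        model.fit(X[tr], y[tr])
        rmses.append(np.sqrt(np.mean((model.predict(X[te]) - y[te]) ** 2)))
    print(f"degree {degree}: mean transfer RMSE = {np.mean(rmses):.1f}")
```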

  9. The politics of partnerships: a study of police and housing collaboration to tackle anti-social behaviour on Australian public housing estates.

    PubMed

    Jacobs, Keith

    2010-01-01

    This paper draws on the findings from a research project on partnership arrangements between the police and housing departments on three Australian public housing estates to tackle problems associated with illicit drug activity and anti-social behaviour (ASB). The analysis focused on the setting up of the partnerships and the interactions that followed from these institutional arrangements. The assumption that informs the paper is that when studying partnerships there is a need for a more critically framed analysis. The temptation to posit "a successful model" of what partnership entails and then to judge practices in relation to this model is considerable, but it inevitably falls into the trap of constructing a narrative of partnership success or failure in terms of individual agency (that is, the degree of commitment from individuals). The analysis undertaken in this paper has therefore sought to fathom a more complex set of organizational processes. Rather than confine the discussion to issues of success and failure, the study foregrounds the subjective accounts of individuals who work within partnership and the constraints they encounter. The paper therefore makes explicit the cultural tensions within and across agencies, contestation as to the extent of the policy "problem," and the divergent perspectives on the appropriate modes of intervention.

  10. Early Results of a Helmetless-Tackling Intervention to Decrease Head Impacts in Football Players

    PubMed Central

    Swartz, Erik E.; Broglio, Steven P.; Cook, Summer B.; Cantu, Robert C.; Ferrara, Michael S.; Guskiewicz, Kevin M.; Myers, Jay L.

    2015-01-01

    Objective: To test a helmetless-tackling behavioral intervention for reducing head impacts in National Collegiate Athletic Association Division I football players. Design: Randomized controlled clinical trial. Setting: Football field. Patients or Other Participants: Fifty collegiate football players (intervention = 25, control = 25). Intervention(s): The intervention group participated in a 5-minute tackling drill without their helmets and shoulder pads twice per week in the preseason and once per week through the season. During this time, the control group performed noncontact football skills. Main Outcome Measure(s): Frequency of head impacts was recorded by an impact sensor for each athlete-exposure (AE). Data were tested with a 2 × 3 (group and time) repeated-measures analysis of variance. Significant interactions and main effects (P < .05) were followed with t tests. Results: Head impacts/AE decreased for the intervention group compared with the control group by the end of the season (9.99 ± 6.10 versus 13.84 ± 7.27, respectively). The intervention group had 30% fewer impacts/AE than the control group by season's end (9.99 ± 6.10 versus 14.32 ± 8.45, respectively). Conclusion: A helmetless-tackling training intervention reduced head impacts in collegiate football players within 1 season. PMID:26651278

  11. Cognition of an expert tackling an unfamiliar conceptual physics problem

    NASA Astrophysics Data System (ADS)

    Schuster, David; Undreiu, Adriana

    2009-11-01

    We have investigated and analyzed the cognition of an expert tackling a qualitative conceptual physics problem of an unfamiliar type. Our goal was to elucidate the detailed cognitive processes and knowledge elements involved, irrespective of final solution form, and consider implications for instruction. The basic but non-trivial problem was to find qualitatively the direction of acceleration of a pendulum bob at various stages of its motion, a problem originally studied by Reif and Allen. Methodology included interviews, introspection, retrospection and self-reported metacognition. Multiple facets of cognition were revealed, with different reasoning strategies used at different stages and for different points on the path. An account is given of the zigzag thinking paths and interplay of reasoning modes and schema elements involved. We interpret the cognitive processes in terms of theoretical concepts that emerged, namely: case-based, principle-based, experiential-intuitive and practical-heuristic reasoning; knowledge elements and schemata; activation; metacognition and epistemic framing. The complexity of cognition revealed in this case study contrasts with the tidy principle-based solutions we present to students. The pervasive role of schemata, case-based reasoning, practical heuristic strategies, and their interplay with physics principles is noteworthy, since these aspects of cognition are generally neither recognized nor taught. The schema/reasoning-mode perspective has direct application in science teaching, learning and problem-solving.

  12. Are Elementary School Teachers Prepared to Tackle Bullying? A Pilot Study

    ERIC Educational Resources Information Center

    Oldenburg, Beau; Bosman, Rie; Veenstra, René

    2016-01-01

    The aim of this pilot study was to investigate to what extent elementary school teachers were prepared to tackle bullying. Interview data from 22 Dutch elementary school teachers (M age = 43.3, 18 classrooms in eight schools) were combined with survey data from 373 students of these teachers (M age = 10.7, grades 3-6, ages 8- to…

  13. Tackling Behaviour in Your Primary School: A Practical Handbook for Teachers

    ERIC Educational Resources Information Center

    Reid, Ken; Morgan, Nicola S.

    2012-01-01

    "Tackling Behaviour in the Primary School" provides ready-made advice and support for classroom professionals and can be used, read and adapted to suit the busy everyday lives of teachers working in primary schools today. This valuable text sets the scene for managing behaviour in the primary classroom in the context of the Children Act 2004…

  14. Balancing model complexity and measurements in hydrology

    NASA Astrophysics Data System (ADS)

    Van De Giesen, N.; Schoups, G.; Weijs, S. V.

    2012-12-01

    The Data Processing Inequality implies that hydrological modeling can only reduce, and never increase, the amount of information available in the original data used to formulate and calibrate hydrological models: I(X;Z(Y)) ≤ I(X;Y). Still, hydrologists around the world seem quite content building models for "their" watersheds to move our discipline forward. Hydrological models tend to have a hybrid character with respect to underlying physics. Most models make use of some well established physical principles, such as mass and energy balances. One could argue that such principles are based on many observations, and therefore add data. These physical principles, however, are applied to hydrological models that often contain concepts that have no direct counterpart in the observable physical universe, such as "buckets" or "reservoirs" that fill up and empty out over time. These not-so-physical concepts are more like the Artificial Neural Networks and Support Vector Machines of the Artificial Intelligence (AI) community. Within AI, one quickly came to the realization that by increasing model complexity, one could basically fit any dataset but that complexity should be controlled in order to be able to predict unseen events. The more data are available to train or calibrate the model, the more complex it can be. Many complexity control approaches exist in AI, with Solomonoff inductive inference being one of the first formal approaches, the Akaike Information Criterion the most popular, and Statistical Learning Theory arguably being the most comprehensive practical approach. In hydrology, complexity control has hardly been used so far. There are a number of reasons for that lack of interest, the more valid ones of which will be presented during the presentation. For starters, there are no readily available complexity measures for our models. Second, some unrealistic simplifications of the underlying complex physics tend to have a smoothing effect on possible model
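
    As a concrete illustration of the complexity-control idea invoked above, the toy below scores polynomial fits of increasing order with the Akaike Information Criterion; the data and model family are arbitrary, chosen only to show goodness of fit being traded against parameter count.

```python
import numpy as np

# Toy complexity control with AIC: fit polynomials of increasing order to a
# noisy smooth signal and pick the order balancing fit and parameter count.
rng = np.random.default_rng(3)
x = np.linspace(0, 1, 60)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)  # truth is smooth, not polynomial

for p in range(1, 9):
    coef = np.polyfit(x, y, p)
    resid = y - np.polyval(coef, x)
    sigma2 = np.mean(resid ** 2)
    k = p + 2                                 # p+1 coefficients plus noise variance
    aic = x.size * np.log(sigma2) + 2 * k     # Gaussian AIC up to a constant
    print(f"order {p}: AIC = {aic:.1f}")
```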

  15. Generalist solutions to complex problems: generating practice-based evidence - the example of managing multi-morbidity

    PubMed Central

    2013-01-01

    Background A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Discussion Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a ‘complex intervention’ (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Summary Answers to the complex problem of multi-morbidity won’t come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity. PMID:23919296

  16. Generalist solutions to complex problems: generating practice-based evidence--the example of managing multi-morbidity.

    PubMed

    Reeve, Joanne; Blakeman, Tom; Freeman, George K; Green, Larry A; James, Paul A; Lucassen, Peter; Martin, Carmel M; Sturmberg, Joachim P; van Weel, Chris

    2013-08-07

    A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a 'complex intervention' (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Answers to the complex problem of multi-morbidity won't come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity.

  17. Combining complex networks and data mining: Why and how

    NASA Astrophysics Data System (ADS)

    Zanin, M.; Papo, D.; Sousa, P. A.; Menasalvas, E.; Nicchi, A.; Kubik, E.; Boccaletti, S.

    2016-05-01

    The increasing power of computer technology does not dispense with the need to extract meaningful information out of data sets of ever growing size, and indeed typically exacerbates the complexity of this task. To tackle this general problem, two methods have emerged, at chronologically different times, that are now commonly used in the scientific community: data mining and complex network theory. Not only do complex network analysis and data mining share the same general goal, that of extracting information from complex systems to ultimately create a new compact quantifiable representation, but they also often address similar problems. Despite this, surprisingly few researchers resort to both methodologies. One may then be tempted to conclude that these two fields are either largely redundant or totally antithetic. The starting point of this review is that this state of affairs should be put down to contingent rather than conceptual differences, and that these two fields can in fact advantageously be used in a synergistic manner. An overview of both fields is first provided, some fundamental concepts of which are illustrated. A variety of contexts in which complex network theory and data mining have been used in a synergistic manner are then presented. Contexts in which the appropriate integration of complex network metrics can lead to improved classification rates with respect to classical data mining algorithms and, conversely, contexts in which data mining can be used to tackle important issues in complex network theory applications are illustrated. Finally, ways to achieve a tighter integration between complex networks and data mining, as well as open lines of research, are discussed.
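
    A small illustration of the synergy this review surveys, using complex-network metrics as input features for a standard data mining classifier, is sketched below; the task (telling Erdos-Renyi from Barabasi-Albert graphs apart) and all parameters are synthetic examples, not drawn from the reviewed applications.

```python
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

def features(g):
    """Network metrics used as data mining features."""
    degs = [d for _, d in g.degree()]
    return [nx.average_clustering(g), nx.density(g), np.var(degs), max(degs)]

# Synthetic two-class task: random (ER) graphs vs scale-free (BA) graphs
# with comparable size and edge count.
X, y = [], []
for _ in range(100):
    X.append(features(nx.gnm_random_graph(50, 150, seed=int(rng.integers(1e6)))))
    y.append(0)
    X.append(features(nx.barabasi_albert_graph(50, 3, seed=int(rng.integers(1e6)))))
    y.append(1)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, np.array(X), y, cv=5)
print("classification accuracy from network metrics:", scores.mean())
```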

  18. Tackling the Triple-Threat Genome of Miscanthus x giganteus (2010 JGI User Meeting)

    ScienceCinema

    Moose, Steve

    2018-02-05

    Steve Moose from the University of Illinois at Urbana-Champaign and the Energy Biosciences Institute presents "Tackling the Triple-Threat Genome of Miscanthus x giganteus" on March 25, 2010 at the 5th Annual DOE JGI User Meeting.

  19. Tackling the Triple-Threat Genome of Miscanthus x giganteus (2010 JGI User Meeting)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moose, Steve

    2010-03-25

    Steve Moose from the University of Illinois at Urbana-Champaign and the Energy Biosciences Institute presents "Tackling the Triple-Threat Genome of Miscanthus x giganteus" on March 25, 2010 at the 5th Annual DOE JGI User Meeting.

  20. Preclinical models for obesity research

    PubMed Central

    Barrett, Perry; Morgan, Peter J.

    2016-01-01

    ABSTRACT A multi-dimensional strategy to tackle the global obesity epidemic requires an in-depth understanding of the mechanisms that underlie this complex condition. Much of the current mechanistic knowledge has arisen from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. These experimental models mimic certain aspects of the human condition and its root causes, particularly the over-consumption of calories and unbalanced diets. As with human obesity, obesity in rodents is the result of complex gene–environment interactions. Here, we review the traditional monogenic models of obesity, their contemporary optogenetic and chemogenetic successors, and the use of dietary manipulations and meal-feeding regimes to recapitulate the complexity of human obesity. We critically appraise the strengths and weaknesses of these different models to explore the underlying mechanisms, including the neural circuits that drive behaviours such as appetite control. We also discuss the use of these models for testing and screening anti-obesity drugs, beneficial bio-actives, and nutritional strategies, with the goal of ultimately translating these findings for the treatment of human obesity. PMID:27821603

  1. Complex systems in metabolic engineering.

    PubMed

    Winkler, James D; Erickson, Keesha; Choudhury, Alaksh; Halweg-Edwards, Andrea L; Gill, Ryan T

    2015-12-01

    Metabolic engineers manipulate intricate biological networks to build efficient biological machines. The inherent complexity of this task, derived from the extensive and often unknown interconnectivity between and within these networks, often prevents researchers from achieving desired performance. Other fields have developed methods to tackle the issue of complexity for their unique subset of engineering problems, but to date, there has not been extensive and comprehensive examination of how metabolic engineers use existing tools to ameliorate this effect on their own research projects. In this review, we examine how complexity affects engineering at the protein, pathway, and genome levels within an organism, and the tools for handling these issues to achieve high-performing strain designs. Quantitative complexity metrics and their applications to metabolic engineering versus traditional engineering fields are also discussed. We conclude by predicting how metabolic engineering practices may advance in light of an explicit consideration of design complexity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine.

    PubMed

    Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner

    2016-01-01

    Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.

  3. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research.

    PubMed

    Bandyopadhyay, Mridula

    2011-11-25

    The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people's social and cultural lives. I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health.

  4. A Primer for Model Selection: The Decisive Role of Model Complexity

    NASA Astrophysics Data System (ADS)

    Höge, Marvin; Wöhling, Thomas; Nowak, Wolfgang

    2018-03-01

    Selecting a "best" model among several competing candidate models poses an often encountered problem in water resources modeling (and other disciplines which employ models). For a modeler, the best model fulfills a certain purpose best (e.g., flood prediction), which is typically assessed by comparing model simulations to data (e.g., stream flow). Model selection methods find the "best" trade-off between good fit with data and model complexity. In this context, the interpretations of model complexity implied by different model selection methods are crucial, because they represent different underlying goals of modeling. Over the last decades, numerous model selection criteria have been proposed, but modelers who primarily want to apply a model selection criterion often face a lack of guidance for choosing the right criterion that matches their goal. We propose a classification scheme for model selection criteria that helps to find the right criterion for a specific goal, i.e., which employs the correct complexity interpretation. We identify four model selection classes which seek to achieve high predictive density, low predictive error, high model probability, or shortest compression of data. These goals can be achieved by following either nonconsistent or consistent model selection and by either incorporating a Bayesian parameter prior or not. We allocate commonly used criteria to these four classes, analyze how they represent model complexity and what this means for the model selection task. Finally, we provide guidance on choosing the right type of criteria for specific model selection tasks. (A quick guide through all key points is given at the end of the introduction.)

  5. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

    Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been enthusiastic, especially after it was discovered that such models are consequently highly complex, requiring not only a large number of parameters (not all of which can be easily, or at all, measured or identified, and which are often associated with large uncertainties), but also deep knowledge, on the part of their users, of most of the implemented physical, mechanical, chemical, and biological processes. The real, or perceived, complexity of these models then discourages users from applying them even to relatively simple problems, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models with similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  6. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research

    PubMed Central

    2011-01-01

    Objective: The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people’s social and cultural lives. Approach: I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. Results: I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. Conclusion: When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Implication: Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health. PMID:22168509

  7. Using Plants to Explore the Nature & Structural Complexity of Life

    ERIC Educational Resources Information Center

    Howard, Ava R.

    2014-01-01

    Use of real specimens brings the study of biology to life. This activity brings easily acquired plant specimens into the classroom to tackle common alternative conceptions regarding life, size, complexity, the nature of science, and plants as multicellular organisms. The activity occurs after a discussion of the characteristics of life and engages…

  8. Modeling the chemistry of complex petroleum mixtures.

    PubMed Central

    Quann, R J

    1998-01-01

    Determining the complete molecular composition of petroleum and its refined products is not feasible with current analytical techniques because of the astronomical number of molecular components. Modeling the composition and behavior of such complex mixtures in refinery processes has accordingly evolved along a simplifying concept called lumping. Lumping reduces the complexity of the problem to a manageable form by grouping the entire set of molecular components into a handful of lumps. This traditional approach does not have a molecular basis and therefore excludes important aspects of process chemistry and molecular property fundamentals from the model's formulation. A new approach called structure-oriented lumping has been developed to model the composition and chemistry of complex mixtures at a molecular level. The central concept is to represent an individual molecule or a set of closely related isomers as a mathematical construct of certain specific and repeating structural groups. A complex mixture such as petroleum can then be represented as thousands of distinct molecular components, each having a mathematical identity. This enables the automated construction of large complex reaction networks with tens of thousands of specific reactions for simulating the chemistry of complex mixtures. Further, the method provides a convenient framework for incorporating molecular physical property correlations, existing group contribution methods, molecular thermodynamic properties, and the structure-activity relationships of chemical kinetics in the development of models. PMID:9860903
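
    A schematic rendering of the structure-oriented lumping idea follows: each molecule (or isomer set) is a vector of counts over repeating structural groups, and a reaction rule is an increment or decrement of those counts. The group names and the single reaction rule below are illustrative inventions, not the published group set.

```python
from collections import Counter

# Illustrative structural groups (hypothetical subset, not the actual
# structure-oriented lumping vocabulary): aromatic ring carbons, naphthenic
# ring, aliphatic chain carbon, methyl substituent.
GROUPS = ("A6", "N6", "R", "me")

def molecule(**groups):
    """A molecule is a count vector over the structural groups."""
    return Counter({g: groups.get(g, 0) for g in GROUPS})

def dealkylate(mol):
    """Example reaction rule: strip one methyl group if present."""
    if mol["me"] > 0:
        out = mol.copy()
        out["me"] -= 1
        return out
    return None  # rule does not apply

toluene_like = molecule(A6=1, me=1)
print(sorted(toluene_like.items()))
print(sorted(dealkylate(toluene_like).items()))  # benzene-like count vector
```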

  9. Refiners Switch to RFG Complex Model

    EIA Publications

    1998-01-01

    On January 1, 1998, domestic and foreign refineries and importers must stop using the "simple" model and begin using the "complex" model to calculate emissions of volatile organic compounds (VOC), toxic air pollutants (TAP), and nitrogen oxides (NOx) from motor gasoline. The primary difference between the application of the two models is that some refineries may have to meet stricter standards for the sulfur and olefin content of the reformulated gasoline (RFG) they produce and all refineries will now be held accountable for NOx emissions. Requirements for calculating emissions from conventional gasoline under the anti-dumping rule similarly change for exhaust TAP and NOx. However, the change to the complex model is not expected to result in an increase in the price premium for RFG or constrain supplies.

  10. Towards an Evidence-Based Approach to Tackling Health Inequalities: The English Experience

    ERIC Educational Resources Information Center

    Killoran, Amanda; Kelly, Michael

    2004-01-01

    This short paper considers the development of an evidence-based approach to tackling health inequalities. Inequalities in health in England at the beginning of the 21st century have widened and are stark. Despite overall improvements in death rates, the growing gap between social groups means that now some parts of England have the same levels of…

  11. A Novel Interdisciplinary Approach to Socio-Technical Complexity

    NASA Astrophysics Data System (ADS)

    Bassetti, Chiara

    The chapter presents a novel interdisciplinary approach that integrates micro-sociological analysis into computer-vision and pattern-recognition modeling and algorithms, the purpose being to tackle socio-technical complexity at a systemic yet micro-grounded level. The approach is empirically-grounded and both theoretically- and analytically-driven, yet systemic and multidimensional, semi-supervised and computable, and oriented towards large scale applications. The chapter describes the proposed approach especially as regards its sociological foundations, and as applied to the analysis of a particular setting, i.e. sport-spectator crowds. Crowds, better defined as large gatherings, are almost ever-present in our societies, and capturing their dynamics is crucial. From social sciences to public safety management and emergency response, modeling and predicting large gatherings' presence and dynamics, thus possibly preventing critical situations and being able to properly react to them, is fundamental. This is where semi-automated technologies can make the difference. The work presented in this chapter is intended as a scientific step towards such an objective.

  12. Complexity-aware simple modeling.

    PubMed

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models, as well as their assumption of modularity and insulation, makes them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach, complexity-aware simple modeling, that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Simplistic and complex thought in medicine: the rationale for a person-centered care model as a medical revolution

    PubMed Central

    Reach, Gérard

    2016-01-01

    According to the concept developed by Thomas Kuhn, a scientific revolution occurs when scientists encounter a crisis due to the observation of anomalies that cannot be explained by the generally accepted paradigm within which scientific progress has thereto been made: a scientific revolution can therefore be described as a change in paradigm aimed at solving a crisis. Described herein is an application of this concept to the medical realm, starting from the reflection that during the past decades, the medical community has encountered two anomalies that, by their frequency and consequences, represent a crisis in the system, as they deeply jeopardize the efficiency of care: nonadherence of patients who do not follow the prescriptions of their doctors, and clinical inertia of doctors who do not comply with good practice guidelines. It is proposed that these phenomena are caused by a contrast between, on the one hand, the complex thought of patients and doctors that sometimes escapes rationalization, and on the other hand, the simplification imposed by the current paradigm of medicine dominated by the technical rationality of evidence-based medicine. It is suggested therefore that this crisis must provoke a change in paradigm, inventing a new model of care defined by an ability to take again into account, on an individual basis, the complex thought of patients and doctors. If this overall analysis is correct, such a person-centered care model should represent a solution to the two problems of patients’ nonadherence and doctors’ clinical inertia, as it tackles their cause. These considerations may have important implications for the teaching and the practice of medicine. PMID:27103790

  14. Trends in modeling Biomedical Complex Systems

    PubMed Central

    Milanesi, Luciano; Romano, Paolo; Castellani, Gastone; Remondini, Daniel; Liò, Petro

    2009-01-01

    In this paper we provide an introduction to the techniques for multi-scale complex biological systems, from the single bio-molecule to the cell, combining theoretical modeling, experiments, informatics tools and technologies suitable for biological and biomedical research, which are becoming increasingly multidisciplinary, multidimensional and information-driven. The most important concepts on mathematical modeling methodologies and statistical inference, bioinformatics and standards tools to investigate complex biomedical systems are discussed, and the prominent literature useful to both the practitioner and the theoretician is presented. PMID:19828068

  15. Elements of complexity in subsurface modeling, exemplified with three case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Truex, Michael J.; Rockhold, Mark

    2017-04-03

    There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this paper, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: 1) modeling approach, 2) description of process, and 3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil vapor extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.

  16. Elements of complexity in subsurface modeling, exemplified with three case studies

    NASA Astrophysics Data System (ADS)

    Freedman, Vicky L.; Truex, Michael J.; Rockhold, Mark L.; Bacon, Diana H.; Freshley, Mark D.; Wellman, Dawn M.

    2017-09-01

    There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this report, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: (1) modeling approach, (2) description of process, and (3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil-vapor-extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.

  17. Genotypic Complexity of Fisher’s Geometric Model

    PubMed Central

    Hwang, Sungmin; Park, Su-Chan; Krug, Joachim

    2017-01-01

    Fisher’s geometric model was originally introduced to argue that complex adaptations must occur in small steps because of pleiotropic constraints. When supplemented with the assumption of additivity of mutational effects on phenotypic traits, it provides a simple mechanism for the emergence of genotypic epistasis from the nonlinear mapping of phenotypes to fitness. Of particular interest is the occurrence of reciprocal sign epistasis, which is a necessary condition for multipeaked genotypic fitness landscapes. Here we compute the probability that a pair of randomly chosen mutations interacts sign epistatically, which is found to decrease with increasing phenotypic dimension n, and varies nonmonotonically with the distance from the phenotypic optimum. We then derive expressions for the mean number of fitness maxima in genotypic landscapes comprised of all combinations of L random mutations. This number increases exponentially with L, and the corresponding growth rate is used as a measure of the complexity of the landscape. The dependence of the complexity on the model parameters is found to be surprisingly rich, and three distinct phases characterized by different landscape structures are identified. Our analysis shows that the phenotypic dimension, which is often referred to as phenotypic complexity, does not generally correlate with the complexity of fitness landscapes and that even organisms with a single phenotypic trait can have complex landscapes. Our results further inform the interpretation of experiments where the parameters of Fisher’s model have been inferred from data, and help to elucidate which features of empirical fitness landscapes can be described by this model. PMID:28450460
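
    A brute-force Monte Carlo sketch of the central quantity studied here, the number of fitness maxima among all combinations of L random mutations, can be written in a few lines; the parameter values below are arbitrary, and the Gaussian fitness function is the standard choice for this model.

```python
import numpy as np
from itertools import product

# Fisher's geometric model: L additive mutations in an n-dimensional
# phenotype space, Gaussian fitness; count local maxima of the resulting
# genotypic landscape under single-locus (Hamming) neighborhood.
rng = np.random.default_rng(6)
n, L = 2, 8                        # phenotypic dimension, number of mutations
z0 = np.full(n, 1.0)               # wild-type displacement from the optimum
dz = rng.normal(0, 0.5, (L, n))    # additive phenotypic effects of mutations

genotypes = np.array(list(product((0, 1), repeat=L)))   # all 2^L genotypes
phen = z0 + genotypes @ dz                               # additive phenotypes
fitness = np.exp(-0.5 * np.sum(phen ** 2, axis=1))      # Gaussian peak at origin

index = {tuple(g): i for i, g in enumerate(genotypes)}

def is_local_max(i):
    g = genotypes[i]
    for j in range(L):             # compare against all single-locus flips
        h = g.copy()
        h[j] ^= 1
        if fitness[index[tuple(h)]] > fitness[i]:
            return False
    return True

n_max = sum(is_local_max(i) for i in range(len(genotypes)))
print(f"{n_max} local maxima among {2 ** L} genotypes")
```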

  18. Cumulative complexity: a functional, patient-centered model of patient complexity can improve research and practice.

    PubMed

    Shippee, Nathan D; Shah, Nilay D; May, Carl R; Mair, Frances S; Montori, Victor M

    2012-10-01

    To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or to how complexity changes over time. The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between patient workload of demands and patient capacity to address demands. Workload encompasses the demands on the patient's time and energy, including demands of treatment, self-care, and life in general. Capacity concerns ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Deterministic ripple-spreading model for complex networks.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Hines, Evor L; Di Paolo, Ezequiel

    2011-04-01

    This paper proposes a deterministic complex network model, which is inspired by the natural ripple-spreading phenomenon. The motivations and main advantages of the model are the following: (i) The establishment of many real-world networks is a dynamic process, where it is often observed that the influence of a few local events spreads out through nodes, and then largely determines the final network topology. Obviously, this dynamic process involves many spatial and temporal factors. By simulating the natural ripple-spreading process, this paper reports a very natural way to set up a spatial and temporal model for such complex networks. (ii) Existing relevant network models are all stochastic models, i.e., with a given input, they cannot output a unique topology. By contrast, the proposed ripple-spreading model can uniquely determine the final network topology, and at the same time, the stochastic feature of complex networks is captured by randomly initializing ripple-spreading related parameters. (iii) The proposed model can use an easily manageable number of ripple-spreading related parameters to precisely describe a network topology, which is more memory efficient when compared with traditional adjacency matrices or similar memory-expensive data structures. (iv) The ripple-spreading model has a very good potential for both extensions and applications.
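
    A loose, illustrative reading of the ripple-spreading idea is sketched below; it is not the authors' exact algorithm, but it conveys the flavor: ripples with decaying amplitude propagate from node to node, links form where a ripple arrives above a threshold, and, once node locations and parameters are fixed, the resulting topology is fully deterministic.

```python
import numpy as np

# Hypothetical ripple-spreading network generator (illustrative only).
# Nodes sit at fixed 2D locations; a ripple starts at node 0, reaches nodes
# whose distance keeps its amplitude above a threshold, links them to the
# source, and each newly reached node emits its own ripple.
rng = np.random.default_rng(7)
N = 30
pts = rng.uniform(0, 1, (N, 2))      # random initialization; spreading itself
amp0, decay, thresh = 1.0, 2.5, 0.25 # is deterministic given pts and params

edges, triggered, queue = set(), {0}, [0]
while queue:
    src = queue.pop(0)
    d = np.linalg.norm(pts - pts[src], axis=1)
    reached = np.where(amp0 * np.exp(-decay * d) > thresh)[0]
    for v in reached:
        if v != src and v not in triggered:
            triggered.add(v)
            queue.append(v)
            edges.add((min(src, v), max(src, v)))
print(f"{len(triggered)} nodes reached, {len(edges)} links")
```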

  20. Modeling OPC complexity for design for manufacturability

    NASA Astrophysics Data System (ADS)

    Gupta, Puneet; Kahng, Andrew B.; Muddu, Swamy; Nakagawa, Sam; Park, Chul-Hong

    2005-11-01

    Increasing design complexity in sub-90nm designs results in increased mask complexity and cost. Resolution enhancement techniques (RET) such as assist feature addition, phase shifting (attenuated PSM) and aggressive optical proximity correction (OPC) help in preserving feature fidelity in silicon but increase mask complexity and cost. Data volume increase with rise in mask complexity is becoming prohibitive for manufacturing. Mask cost is determined by mask write time and mask inspection time, which are directly related to the complexity of features printed on the mask. Aggressive RET increase complexity by adding assist features and by modifying existing features. Passing design intent to OPC has been identified as a solution for reducing mask complexity and cost in several recent works. The goal of design-aware OPC is to relax OPC tolerances of layout features to minimize mask cost, without sacrificing parametric yield. To convey optimal OPC tolerances for manufacturing, design optimization should drive OPC tolerance optimization using models of mask cost for devices and wires. Design optimization should be aware of impact of OPC correction levels on mask cost and performance of the design. This work introduces mask cost characterization (MCC) that quantifies OPC complexity, measured in terms of fracture count of the mask, for different OPC tolerances. MCC with different OPC tolerances is a critical step in linking design and manufacturing. In this paper, we present a MCC methodology that provides models of fracture count of standard cells and wire patterns for use in design optimization. MCC cannot be performed by designers as they do not have access to foundry OPC recipes and RET tools. To build a fracture count model, we perform OPC and fracturing on a limited set of standard cells and wire configurations with all tolerance combinations. Separately, we identify the characteristics of the layout that impact fracture count. Based on the fracture count (FC) data

  1. Emulator-assisted data assimilation in complex models

    NASA Astrophysics Data System (ADS)

    Margvelashvili, Nugzar Yu; Herzfeld, Mike; Rizwi, Farhan; Mongin, Mathieu; Baird, Mark E.; Jones, Emlyn; Schaffelke, Britta; King, Edward; Schroeder, Thomas

    2016-09-01

    Emulators are surrogates of complex models that run orders of magnitude faster than the original model. The utility of emulators for data assimilation into ocean models is still not well understood. The high complexity of ocean models translates into high uncertainty of the corresponding emulators, which may undermine the quality of assimilation schemes based on such emulators. Numerical experiments with a chaotic Lorenz-95 model are conducted to illustrate this point and suggest a strategy to alleviate this problem through the localization of the emulation and data assimilation procedures. Insights gained through these experiments are used to design and implement a data assimilation scenario for a 3D fine-resolution sediment transport model of the Great Barrier Reef (GBR), Australia.
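
    For reference, the chaotic Lorenz-95 system (often written Lorenz-96) used in the paper's numerical experiments is simple to integrate; a standard RK4 implementation with the usual forcing F = 8 is sketched below. The state size, step size, and run length are arbitrary illustrative choices.

```python
import numpy as np

# Lorenz-95 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F.
def lorenz95(x, F=8.0):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt):
    k1 = lorenz95(x)
    k2 = lorenz95(x + 0.5 * dt * k1)
    k3 = lorenz95(x + 0.5 * dt * k2)
    k4 = lorenz95(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

n, dt = 40, 0.05
x = 8.0 * np.ones(n)
x[0] += 0.01                     # small perturbation triggers chaotic behavior
for _ in range(500):
    x = rk4_step(x, dt)
print(x[:5])
```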

  2. The Role of Labour Inspectorates in Tackling the Psychosocial Risks at Work in Europe: Problems and Perspectives

    PubMed Central

    Toukas, Dimitrios; Delichas, Miltiadis; Toufekoula, Chryssoula; Spyrouli, Anastasia

    2015-01-01

    Significant changes have taken place in the world of work in recent years, bringing new challenges with regard to employee safety and health. These changes have led to emerging psychosocial risks (PSRs) at work. The risks are primarily linked to how work is designed, organized, and managed, and to the economic and social context of work. These factors have increased the level of work-related stress and can lead to serious deterioration in mental and physical health. In tackling PSRs, the European labor inspectorates can have an important role by enforcing preventive and/or corrective interventions in the content and context of work. However, to improve working conditions, unilateral interventions in the context and content of work are insufficient and require adopting a common strategy to tackle PSRs, based on a holistic approach. The implementation of a common strategy by the European Labor Inspectorate for tackling PSRs is restricted by the lack of a common legislative frame with regard to PSR evaluation and management, the different levels of labor inspectors' training, and the different levels of employees' and employers' health and safety culture. PMID:26929837

  3. [Social and organizational innovation to tackle the challenge of integrated care of the chronically ill].

    PubMed

    Nuño-Solinís, Roberto

    2014-01-01

    The increase in life expectancy, coupled with other factors, has led to an increase in the prevalence of chronic diseases and multiple morbidity. This has led to the need to develop new health and social care models that allow these conditions to be managed efficiently and sustainably. In particular, there seems to be consensus on the need to move towards integrated, patient-centered, and more proactive care. Thus, in recent years, chronic care models have been developed at international, national and regional level, as well as introducing strategies to tackle the challenge of chronic illness. However, the implementation of actions facilitating the change towards this new model of care does not seem to be an easy task. This paper presents some of the strategic lines and initiatives carried out by the Department of Health of the Basque Government. These actions can be described within a social and organizational innovation framework, as a means for effective implementation of interventions and strategies that shape the model required for the improved care of chronic illnesses within a universal and tax-funded health system. Copyright © 2013 Elsevier España, S.L. All rights reserved.

  4. A novel BA complex network model on color template matching.

    PubMed

    Han, Risheng; Shen, Shigen; Yue, Guangxue; Ding, Hui

    2014-01-01

    A novel BA complex network model of color space is proposed based on the two fundamental rules of the BA scale-free network model: growth and preferential attachment. The scale-free characteristic of color space is discovered by analyzing the evolving process of the template's color distribution. The template's BA complex network model can then be used to select important color pixels, which have much larger effects than other color pixels in the matching process. The proposed BA complex network model of color space can be easily integrated into many traditional template matching algorithms, such as SSD-based and SAD-based matching. Experiments show that the performance of color template matching can be improved with the proposed algorithm. To the best of our knowledge, this is the first study of how to model the color space of images using a proper complex network model and apply that model to template matching.
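
    A minimal generator implementing the two BA rules the abstract builds on, growth and preferential attachment, is sketched below. In the proposed model the nodes would correspond to (quantized) template colors, with high-degree nodes marking the "important" colors, though that mapping and all parameters here are illustrative assumptions.

```python
import random

# Minimal growth + preferential-attachment (BA) generator.
def ba_network(n, m, seed=0):
    random.seed(seed)
    edges = []
    targets = list(range(m))       # start from m seed nodes
    repeated = []                  # node list weighted by degree
    for v in range(m, n):          # growth: add one node at a time
        chosen = set()
        while len(chosen) < m:     # preferential attachment: m links per node
            pool = repeated if repeated else targets
            chosen.add(random.choice(pool))
        for u in chosen:
            edges.append((v, u))
            repeated.extend([v, u])  # each endpoint gains one unit of degree
    return edges

edges = ba_network(200, 3)
deg = {}
for v, u in edges:
    deg[v] = deg.get(v, 0) + 1
    deg[u] = deg.get(u, 0) + 1
print("max degree:", max(deg.values()))  # heavy-tailed: a few hub "colors"
```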

  5. Evaluating Behavioral Skills Training to Teach Safe Tackling Skills to Youth Football Players

    ERIC Educational Resources Information Center

    Tai, Sharayah S. M.; Miltenberger, Raymond G.

    2017-01-01

    With concussion rates on the rise for football players, there is a need for further research to increase skills and decrease injuries. Behavioral skills training is effective in teaching a wide variety of skills but has yet to be studied in the sports setting. We evaluated behavioral skills training to teach safer tackling techniques to six…

  6. Watershed Complexity Impacts on Rainfall-Runoff Modeling

    NASA Astrophysics Data System (ADS)

    Goodrich, D. C.; Grayson, R.; Willgoose, G.; Palacios-Velez, O.; Bloeschl, G.

    2002-12-01

    Application of distributed hydrologic watershed models fundamentally requires watershed partitioning or discretization. In addition to partitioning the watershed into modeling elements, these elements typically represent a further abstraction of the actual watershed surface and its relevant hydrologic properties. A critical issue that must be addressed by any user of these models prior to their application is the definition of an acceptable level of watershed discretization or geometric model complexity. A quantitative methodology to define a level of geometric model complexity commensurate with a specified level of model performance is developed for watershed rainfall-runoff modeling. In the case where watershed contributing areas are represented by overland flow planes, equilibrium discharge storage was used to define the transition from overland- to channel-dominated flow response. The methodology is tested on four subcatchments, covering a range of watershed scales spanning more than three orders of magnitude, in the USDA-ARS Walnut Gulch Experimental Watershed in southeastern Arizona. It was found that distortion of the hydraulic roughness can compensate for a lower level of discretization (fewer channels) up to a point. Beyond this point, hydraulic roughness distortion cannot compensate for the topographic distortion of representing the watershed by fewer elements (e.g., a less complex channel network). Similarly, differences in the representation of topography by different model or digital elevation model (DEM) types (e.g., Triangular Irregular Networks - TINs, contour lines, and regular grid DEMs) also result in differences in runoff routing responses that can be largely compensated for by a distortion in hydraulic roughness.

  7. A complex social-ecological disaster: Environmentally induced forced migration.

    PubMed

    Rechkemmer, Andreas; O'Connor, Ashley; Rai, Abha; Decker Sparks, Jessica L; Mudliar, Pranietha; Shultz, James M

    2016-01-01

    In the 21st century, global issues are increasingly characterized by inter-connectedness and complexity. Global environmental change, and climate change in particular, has become a powerful driver and catalyst of forced migration and internal displacement of people. Environmental migrants may far outnumber any other group of displaced people and refugees in the years to come. Deeper scientific integration, especially across the social sciences, is a prerequisite to tackle this issue.

  8. A Novel BA Complex Network Model on Color Template Matching

    PubMed Central

    Han, Risheng; Yue, Guangxue; Ding, Hui

    2014-01-01

    A novel BA complex network model of color space is proposed based on the two fundamental rules of the BA scale-free network model: growth and preferential attachment. The scale-free characteristic of color space is discovered by analyzing the evolving process of the template's color distribution. The template's BA complex network model can then be used to select important color pixels, which have a much larger effect than other color pixels in the matching process. The proposed BA complex network model of color space can be easily integrated into many traditional template matching algorithms, such as SSD-based and SAD-based matching. Experiments show that the performance of color template matching can be improved with the proposed algorithm. To the best of our knowledge, this is the first study on modeling the color space of images with a proper complex network model and applying that model to template matching. PMID:25243235

  9. Complex networks under dynamic repair model

    NASA Astrophysics Data System (ADS)

    Chaoqi, Fu; Ying, Wang; Kun, Zhao; Yangjun, Gao

    2018-01-01

    Invulnerability is not the only factor of importance when considering the security of complex networks; an effective and reasonable repair strategy is also critical. Existing research on network repair is confined to the static model. The dynamic model makes better use of the redundant capacity of repaired nodes and repairs the damaged network more efficiently than the static model; however, the dynamic repair model is complex and changeable. In this paper, we construct a dynamic repair model and systematically describe the energy-transfer relationships between nodes in the repair process of the failed network. Nodes are divided into three types, corresponding to three structures. We find that the strong coupling structure is responsible for secondary failure of the repaired nodes and propose an algorithm that selects the most suitable targets (nodes or links) to repair the failed network at minimal cost. Two types of repair strategies are identified, with different effects under the two energy-transfer rules. The research results enable a more flexible approach to network repair.
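
    The paper's energy-transfer rules are not reproduced in the abstract, so the sketch below only illustrates the general idea of cost-aware repair-target selection: restore the failed node that buys the most reconnected network per unit cost. The toy graph, the costs, and the connectivity objective are all assumptions for illustration.

    from collections import deque

    def giant_component_size(adj, alive):
        """Size of the largest connected component among 'alive' nodes."""
        seen, best = set(), 0
        for s in alive:
            if s in seen:
                continue
            q, comp = deque([s]), 0
            seen.add(s)
            while q:
                u = q.popleft()
                comp += 1
                for v in adj[u]:
                    if v in alive and v not in seen:
                        seen.add(v)
                        q.append(v)
            best = max(best, comp)
        return best

    def pick_repair_target(adj, alive, failed, cost):
        """Greedy choice: restore the failed node with the best
        connectivity-gain-per-cost ratio."""
        base = giant_component_size(adj, alive)
        def score(n):
            return (giant_component_size(adj, alive | {n}) - base) / cost[n]
        return max(failed, key=score)

    # Toy network: node 2 bridges two clusters, so repairing it pays off most.
    adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {3}}
    alive, failed = {0, 1, 4}, {2, 3}
    cost = {2: 1.0, 3: 1.0}
    print(pick_repair_target(adj, alive, failed, cost))  # -> 2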

  10. Seismic modeling of complex stratified reservoirs

    NASA Astrophysics Data System (ADS)

    Lai, Hung-Liang

    Turbidite reservoirs in deep-water depositional systems, such as the oil fields in the offshore Gulf of Mexico and North Sea, are becoming an important exploration target in the petroleum industry. Accurate seismic reservoir characterization, however, is complicated by the heterogeneity of the sand and shale distribution and also by the lack of resolution when imaging thin channel deposits. Amplitude variation with offset (AVO) is a very important technique that is widely applied to locate hydrocarbons. Inaccurate estimates of seismic reflection amplitudes may result in misleading interpretations because of these problems in application to turbidite reservoirs. Therefore, an efficient, accurate, and robust method of modeling seismic responses for such complex reservoirs is crucial and necessary to reduce exploration risk. A fast and accurate approach to generating synthetic seismograms for such reservoir models combines wavefront-construction ray tracing with composite reflection coefficients in a hybrid modeling algorithm. The wavefront-construction approach is a modern, fast implementation of ray tracing that I have extended to model quasi-shear wave propagation in anisotropic media. Composite reflection coefficients, which are computed using propagator matrix methods, provide the exact seismic reflection amplitude for a stratified reservoir model. This is a distinct improvement over conventional AVO analysis based on a model with only two homogeneous half-spaces. I combine the two methods to compute synthetic seismograms for test models of turbidite reservoirs in the Ursa field, Gulf of Mexico, validating the new results against exact calculations using the discrete wavenumber method. The new method, however, can also be used to generate synthetic seismograms for laterally heterogeneous, complex stratified reservoir models. The results show important frequency dependence that may be useful for exploration. Because turbidite channel systems often display complex

  11. On the dangers of model complexity without ecological justification in species distribution modeling

    Treesearch

    David M. Bell; Daniel R. Schlaepfer

    2016-01-01

    Although biogeographic patterns are the product of complex ecological processes, the increasing complexity of correlative species distribution models (SDMs) is not always motivated by ecological theory, but by model fit. The validity of model projections, such as shifts in a species' climatic niche, becomes questionable particularly during extrapolations, such as for...

  12. Calibration of Complex Subsurface Reaction Models Using a Surrogate-Model Approach

    EPA Science Inventory

    Application of model assessment techniques to complex subsurface reaction models involves numerous difficulties, including non-trivial model selection, parameter non-uniqueness, and excessive computational burden. To overcome these difficulties, this study introduces SAMM (Simult...

  13. Tackling the x-ray cargo inspection challenge using machine learning

    NASA Astrophysics Data System (ADS)

    Jaccard, Nicolas; Rogers, Thomas W.; Morton, Edward J.; Griffin, Lewis D.

    2016-05-01

    The current infrastructure for non-intrusive inspection of cargo containers cannot accommodate exploding commerce volumes and increasingly stringent regulations. There is a pressing need to develop methods to automate parts of the inspection workflow, enabling expert operators to focus on a manageable number of high-risk images. To tackle this challenge, we developed a modular framework for automated X-ray cargo image inspection. Employing state-of-the-art machine learning approaches, including deep learning, we demonstrate high performance for empty container verification and specific threat detection. This work constitutes a significant step towards the partial automation of X-ray cargo image inspection.
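
    The abstract does not specify an architecture, so the following is only a minimal PyTorch sketch of the kind of image classifier such a pipeline might start from (empty vs. non-empty container); every layer size and the input shape are assumptions, not the paper's network.

    import torch
    import torch.nn as nn

    # A deliberately small CNN for binary "empty vs. non-empty" container
    # classification; layer sizes are illustrative only.
    model = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(16 * 16 * 16, 2),   # assumes 64x64 single-channel crops
    )

    x = torch.randn(4, 1, 64, 64)     # a batch of 4 fake X-ray image crops
    logits = model(x)
    print(logits.shape)               # torch.Size([4, 2])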

  14. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  15. Tackling the social determinants of inequalities in health during Phase V of the Healthy Cities Project in Europe.

    PubMed

    Ritsatakis, Anna; Ostergren, Per-Olof; Webster, Premila

    2015-06-01

    The WHO European Healthy Cities Network has from its inception aimed at tackling inequalities in health. In carrying out an evaluation of Phase V of the project (2009-13), an attempt was made to examine how far the concept of equity in health is understood and accepted; whether cities had moved further from a disease/medical model to looking at the social determinants of inequalities in health; how far the HC project contributed to cities determining the extent and causes of inequalities in health; what efforts were made to tackle such inequalities; and how far inequalities in health may have increased or decreased during Phase V. A broader range of resources was utilized for this evaluation than in previous phases of the project. These indicated that most cities were definitely looking at the broader determinants. Equity in health was better understood and had been included as a value in a range of city policies. This was facilitated by stronger involvement of the HC project in city planning processes. Although almost half the cities participating had prepared a City Health Profile, only a few cities had the necessary local-level data to monitor changes in inequalities in health. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Managing Complex Interoperability Solutions using Model-Driven Architecture

    DTIC Science & Technology

    2011-06-01

    ... such as Oracle or MySQL. Each data model for a specific RDBMS is a distinct PSM. Or the system may want to exchange information with other C2 ... reduced number of transformations, e.g., from an RDBMS physical schema to the corresponding SQL script needed to instantiate the tables in a relational ... importance of models. In engineering, a model serves several purposes: 1. It presents an abstract view of a complex system or of a complex information

  17. Foundations for Streaming Model Transformations by Complex Event Processing.

    PubMed

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP), and reactive (event-driven) transformation techniques. Complex event processing allows the identification of relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.
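
    Viatra expresses such rules in its own DSL over live models; purely to illustrate the semantics of a CEP "followed-by" operator over an event stream, here is a tiny stand-alone Python matcher. The event names, the window, and the sample stream are invented.

    def match_followed_by(stream, first, second, window):
        """Emit (t_a, t_b) whenever event `second` occurs within `window`
        time units after event `first` -- a minimal 'followed-by' CEP rule."""
        pending = []                       # timestamps of unmatched `first` events
        for t, kind in stream:
            if kind == first:
                pending.append(t)
            elif kind == second:
                # keep only `first` occurrences still inside the window
                pending = [ta for ta in pending if t - ta <= window]
                for ta in pending:
                    yield (ta, t)
                pending = []

    events = [(0, "insert"), (2, "delete"), (5, "insert"), (9, "delete")]
    print(list(match_followed_by(events, "insert", "delete", window=3)))
    # -> [(0, 2)]  (the second pair is 4 time units apart, outside the window)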

  18. Modeling wildfire incident complexity dynamics.

    PubMed

    Thompson, Matthew P

    2013-01-01

    Wildfire management in the United States and elsewhere is challenged by substantial uncertainty regarding the location and timing of fire events, the socioeconomic and ecological consequences of these events, and the costs of suppression. Escalating U.S. Forest Service suppression expenditures are of particular concern at a time of fiscal austerity, as swelling fire management budgets lead to decreases for non-fire programs and as the likelihood of disruptive within-season borrowing potentially increases. Thus there is a strong interest in better understanding factors influencing suppression decisions and, in turn, their influence on suppression costs. As a step in that direction, this paper presents a probabilistic analysis of geographic and temporal variation in incident management team response to wildfires. The specific focus is incident complexity dynamics through time for fires managed by the U.S. Forest Service. The modeling framework is based on the recognition that large wildfire management entails recurrent decisions across time in response to changing conditions, which can be represented as a stochastic dynamic system. Daily incident complexity dynamics are modeled according to a first-order Markov chain, with containment represented as an absorbing state. A statistically significant difference in complexity dynamics between Forest Service Regions is demonstrated. Incident complexity probability transition matrices and expected times until containment are presented at national and regional levels. Results of this analysis can help improve understanding of geographic variation in incident management and associated cost structures, and can be incorporated into future analyses examining the economic efficiency of wildfire management.
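
    For an absorbing first-order Markov chain like the one described, expected times until containment follow from the fundamental matrix N = (I - Q)^(-1), where Q holds the transitions among transient states. The transition probabilities below are invented for illustration; the paper's estimated matrices are not reproduced in the abstract.

    import numpy as np

    # Hypothetical daily transition matrix over incident states
    # [Type 3, Type 2, Type 1, Contained]; "Contained" is absorbing.
    P = np.array([
        [0.80, 0.10, 0.02, 0.08],
        [0.05, 0.75, 0.10, 0.10],
        [0.01, 0.05, 0.80, 0.14],
        [0.00, 0.00, 0.00, 1.00],
    ])

    Q = P[:3, :3]                      # transitions among transient states
    N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix
    print(N @ np.ones(3))              # expected days to containment per start state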

  19. Wavefield complexity and stealth structures: Resolution constraints by wave physics

    NASA Astrophysics Data System (ADS)

    Nissen-Meyer, T.; Leng, K.

    2017-12-01

    Imaging the Earth's interior relies on understanding how waveforms encode information from heterogeneous multi-scale structure. This relation is given by elastodynamics, but forward modeling in the context of tomography primarily serves to deliver synthetic waveforms and gradients for the inversion procedure. While this is entirely appropriate, it depreciates a wealth of complementary inference that can be obtained from the complexity of the wavefield. Here, we are concerned with the imprint of realistic multi-scale Earth structure on the wavefield, and the question of the inherent physical resolution limit of structures encoded in seismograms. We identify parameter and scattering regimes where structures remain invisible as a function of seismic wavelength, structural multi-scale geometry, scattering strength, and propagation path. Ultimately, this will aid in interpreting tomographic images by acknowledging the scope of "forgotten" structures, and shall offer guidance for optimising the selection of seismic data for tomography. To do so, we use our novel 3D modeling method AxiSEM3D, which tackles global wave propagation in visco-elastic, anisotropic 3D structures with undulating boundaries at unprecedented resolution and efficiency by exploiting the inherent azimuthal smoothness of wavefields via a coupled Fourier expansion-spectral-element approach. The method links computational cost to wavefield complexity and thereby lends itself well to exploring the relation between waveforms and structures. We will show various examples of multi-scale heterogeneities which appear or disappear in the waveform, and argue that the nature of the structural power spectrum plays a central role in this. We introduce the concept of wavefield learning to examine the true wavefield complexity for a complexity-dependent modeling framework and discriminate which scattering structures can be retrieved by surface measurements. This leads to the question of physical invisibility and the

  20. Some Approaches to Modeling Complex Information Systems.

    ERIC Educational Resources Information Center

    Rao, V. Venkata; Zunde, Pranas

    1982-01-01

    Brief discussion of state-of-the-art of modeling complex information systems distinguishes between macrolevel and microlevel modeling of such systems. Network layout and hierarchical system models, simulation, information acquisition and dissemination, databases and information storage, and operating systems are described and assessed. Thirty-four…

  1. Tackling health inequalities: moving theory to action

    PubMed Central

    Signal, Louise; Martin, Jennifer; Reid, Papaarangi; Carroll, Christopher; Howden-Chapman, Philippa; Ormsby, Vera Keefe; Richards, Ruth; Robson, Bridget; Wall, Teresa

    2007-01-01

    Background This paper reports on health inequalities awareness-raising workshops conducted with senior New Zealand health sector staff as part of the Government's goal of reducing inequalities in health, education, employment and housing. Methods The workshops were based on a multi-method needs assessment with senior staff in key health institutions. The workshops aimed to increase the knowledge and skills of health sector staff to act on, and advocate for, eliminating inequalities in health. They were practical, evidence-based, and action oriented and took a social approach to the causes of inequalities in health. The workshops used ethnicity as a case study and explored racism as a driver of inequalities. They focused on the role of institutionalized racism, or racism that is built into health sector institutions. Institutional theory provided a framework for participants to analyse how their institutions create and maintain inequalities and how they can act to change this. Results Participants identified a range of institutional mechanisms that promote inequalities and a range of ways to address them including: undertaking further training, using Māori (the indigenous people) models of health in policy-making, increasing Māori participation and partnership in decision making, strengthening sector relationships with iwi (tribes), funding and supporting services provided 'by Māori for Māori', ensuring a strategic approach to intersectoral work, encouraging stronger community involvement in the work of the institution, requiring all evaluations to assess impact on inequalities, and requiring the sector to report on progress in addressing health inequalities. The workshops were rated highly by participants, who indicated increased commitment to tackle inequalities as a result of the training. Discussion Government and sector leadership were critical to the success of the workshops and subsequent changes in policy and practice. The use of locally adapted equity

  2. Tackling health inequalities: moving theory to action.

    PubMed

    Signal, Louise; Martin, Jennifer; Reid, Papaarangi; Carroll, Christopher; Howden-Chapman, Philippa; Ormsby, Vera Keefe; Richards, Ruth; Robson, Bridget; Wall, Teresa

    2007-10-03

    This paper reports on health inequalities awareness-raising workshops conducted with senior New Zealand health sector staff as part of the Government's goal of reducing inequalities in health, education, employment and housing. The workshops were based on a multi-method needs assessment with senior staff in key health institutions. The workshops aimed to increase the knowledge and skills of health sector staff to act on, and advocate for, eliminating inequalities in health. They were practical, evidence-based, and action oriented and took a social approach to the causes of inequalities in health. The workshops used ethnicity as a case study and explored racism as a driver of inequalities. They focused on the role of institutionalized racism, or racism that is built into health sector institutions. Institutional theory provided a framework for participants to analyse how their institutions create and maintain inequalities and how they can act to change this. Participants identified a range of institutional mechanisms that promote inequalities and a range of ways to address them including: undertaking further training, using Māori (the indigenous people) models of health in policy-making, increasing Māori participation and partnership in decision making, strengthening sector relationships with iwi (tribes), funding and supporting services provided 'by Māori for Māori', ensuring a strategic approach to intersectoral work, encouraging stronger community involvement in the work of the institution, requiring all evaluations to assess impact on inequalities, and requiring the sector to report on progress in addressing health inequalities. The workshops were rated highly by participants, who indicated increased commitment to tackle inequalities as a result of the training. Government and sector leadership were critical to the success of the workshops and subsequent changes in policy and practice. The use of locally adapted equity tools, requiring participants to

  3. A musculoskeletal model of the elbow joint complex

    NASA Technical Reports Server (NTRS)

    Gonzalez, Roger V.; Barr, Ronald E.; Abraham, Lawrence D.

    1993-01-01

    This paper describes a musculoskeletal model that represents human elbow flexion-extension and forearm pronation-supination. Musculotendon parameters and the skeletal geometry were determined for the musculoskeletal model in the analysis of ballistic elbow joint complex movements. The key objective was to develop a computational model, guided by optimal control, to investigate the relationship among patterns of muscle excitation, individual muscle forces, and movement kinematics. The model was verified using experimental kinematic, torque, and electromyographic data from volunteer subjects performing both isometric and ballistic elbow joint complex movements. In general, the model predicted kinematic and muscle excitation patterns similar to what was experimentally measured.

  4. Enhancing metaproteomics-The value of models and defined environmental microbial systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik

    Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new features to study complex microbial communities in order to unravel these black boxes. Some new technical challenges arose that were not an issue for classical proteome analytics before and that could be tackled by the application of different model systems. Here, we review different current and future model systems for metaproteome analysis. We introduce model systems for clinical and biotechnological research questions including acid mine drainage, anaerobic digesters, and activated sludge, following a short introduction to microbial communities and metaproteomics. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, or reliable protein extraction. Moreover, the implementation of model systems can be considered a step forward to better understand microbial community responses and ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.

  5. Enhancing metaproteomics-The value of models and defined environmental microbial systems

    DOE PAGES

    Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik; ...

    2016-01-21

    Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new features to study complex microbial communities in order to unravel these black boxes. Some new technical challenges arose that were not an issue for classical proteome analytics before and that could be tackled by the application of different model systems. Here, we review different current and future model systems for metaproteome analysis. We introduce model systems for clinical and biotechnological research questions including acid mine drainage, anaerobic digesters, and activated sludge, following a short introduction to microbial communities and metaproteomics. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, or reliable protein extraction. Moreover, the implementation of model systems can be considered a step forward to better understand microbial community responses and ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.

  6. A complex social-ecological disaster: Environmentally induced forced migration

    PubMed Central

    Rechkemmer, Andreas; O'Connor, Ashley; Rai, Abha; Decker Sparks, Jessica L.; Mudliar, Pranietha; Shultz, James M.

    2016-01-01

    ABSTRACT In the 21st century, global issues are increasingly characterized by inter-connectedness and complexity. Global environmental change, and climate change in particular, has become a powerful driver and catalyst of forced migration and internal displacement of people. Environmental migrants may far outnumber any other group of displaced people and refugees in the years to come. Deeper scientific integration, especially across the social sciences, is a prerequisite to tackle this issue.

  7. A Practical Philosophy of Complex Climate Modelling

    NASA Technical Reports Server (NTRS)

    Schmidt, Gavin A.; Sherwood, Steven

    2014-01-01

    We give an overview of the practice of developing and using complex climate models, as seen from experiences in a major climate modelling center and through participation in the Coupled Model Intercomparison Project (CMIP). We discuss the construction and calibration of models; their evaluation, especially through use of out-of-sample tests; and their exploitation in multi-model ensembles to identify biases and make predictions. We stress that adequacy or utility of climate models is best assessed via their skill against more naive predictions. The framework we use for making inferences about reality using simulations is naturally Bayesian (in an informal sense), and has many points of contact with more familiar examples of scientific epistemology. While the use of complex simulations in science is a development that changes much in how science is done in practice, we argue that the concepts being applied fit very much into traditional practices of the scientific method, albeit those more often associated with laboratory work.
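
    One standard way to make "skill against more naive predictions" concrete is an MSE-based skill score, 1 - MSE_model / MSE_baseline, with climatology as the naive baseline. The abstract argues this only qualitatively; the formula choice and the numbers below are illustrative assumptions.

    import numpy as np

    def skill(obs, model, baseline):
        """Positive when the model beats the naive baseline (1 = perfect)."""
        mse = lambda pred: np.mean((obs - pred) ** 2)
        return 1.0 - mse(model) / mse(baseline)

    obs = np.array([14.2, 14.5, 14.8, 15.1])          # fake observed means
    naive = np.full_like(obs, obs.mean())             # climatology as baseline
    pred = np.array([14.3, 14.4, 14.9, 15.0])         # fake model output
    print(skill(obs, pred, naive))                    # ~0.9 here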

  8. Theoretical Modeling and Electromagnetic Response of Complex Metamaterials

    DTIC Science & Technology

    2017-03-06

    AFRL-AFOSR-VA-TR-2017-0042: Theoretical Modeling and Electromagnetic Response of Complex Metamaterials, Andrea Alu, University of Texas at Austin, final report, Nov 2016. ... based on parity-time symmetric metasurfaces, and various advances in electromagnetic and acoustic theory and applications. Our findings have opened

  9. Reassessing Geophysical Models of the Bushveld Complex in 3D

    NASA Astrophysics Data System (ADS)

    Cole, J.; Webb, S. J.; Finn, C.

    2012-12-01

    Conceptual geophysical models of the Bushveld Igneous Complex show three possible geometries for its mafic component: 1) separate intrusions with vertical feeders for the eastern and western lobes (Cousins, 1959); 2) separate dipping sheets for the two lobes (Du Plessis and Kleywegt, 1987); 3) a single saucer-shaped unit connected at depth in the central part between the two lobes (Cawthorn et al, 1998). Model three incorporates isostatic adjustment of the crust in response to the weight of the dense mafic material. The model was corroborated by results of a broadband seismic array over southern Africa, known as the Southern African Seismic Experiment (SASE) (Nguuri et al, 2001; Webb et al, 2004). This new information about the crustal thickness only became available in the last decade and could not be considered in the earlier models. Nevertheless, there is still on-going debate as to which model is correct. All of the models published up to now have been done in 2 or 2.5 dimensions, which is not well suited to modelling the complex geometry of the Bushveld intrusion. 3D modelling takes into account the effects of variations in geometry and geophysical properties of lithologies in a full three-dimensional sense and therefore affects the shape and amplitude of calculated fields. The main question is how the new knowledge of the increased crustal thickness, as well as the complexity of the Bushveld Complex, will impact the gravity fields calculated for the existing conceptual models when modelling in 3D. The three published geophysical models were remodelled using full 3D potential-field modelling software, including the crustal thickness obtained from the SASE. The aim was not to construct very detailed models, but to test the existing conceptual models in an equally conceptual way. Firstly, a specific 2D model was recreated in 3D, without crustal thickening, to establish the difference between 2D and 3D results. Then the thicker crust was added. Including the less

  10. Routine Discovery of Complex Genetic Models using Genetic Algorithms

    PubMed Central

    Moore, Jason H.; Hahn, Lance W.; Ritchie, Marylyn D.; Thornton, Tricia A.; White, Bill C.

    2010-01-01

    Simulation studies are useful in various disciplines for a number of reasons including the development and evaluation of new computational and statistical methods. This is particularly true in human genetics and genetic epidemiology where new analytical methods are needed for the detection and characterization of disease susceptibility genes whose effects are complex, nonlinear, and partially or solely dependent on the effects of other genes (i.e. epistasis or gene-gene interaction). Despite this need, the development of complex genetic models that can be used to simulate data is not always intuitive. In fact, only a few such models have been published. We have previously developed a genetic algorithm approach to discovering complex genetic models in which two single nucleotide polymorphisms (SNPs) influence disease risk solely through nonlinear interactions. In this paper, we extend this approach for the discovery of high-order epistasis models involving three to five SNPs. We demonstrate that the genetic algorithm is capable of routinely discovering interesting high-order epistasis models in which each SNP influences risk of disease only through interactions with the other SNPs in the model. This study opens the door for routine simulation of complex gene-gene interactions among SNPs for the development and evaluation of new statistical and computational approaches for identifying common, complex multifactorial disease susceptibility genes. PMID:20948983
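
    The abstract describes evolving penetrance models whose SNPs act only through interaction. As a rough, self-contained illustration (not the authors' algorithm or fitness function), the GA below evolves a 3x3 two-SNP penetrance table to maximize cell variance while penalizing row/column marginal effects, assuming equal genotype weights.

    import random
    import statistics as st

    rng = random.Random(7)

    def fitness(tab):
        # Reward tables whose 9 cells vary (some effect exists) while the
        # row/column marginals stay flat (no main effects) -- a crude
        # stand-in for "interaction-only" penetrance models.
        rows = [tab[i * 3:(i + 1) * 3] for i in range(3)]
        cols = [tab[i::3] for i in range(3)]
        flat = st.pvariance([sum(r) / 3 for r in rows]) + \
               st.pvariance([sum(c) / 3 for c in cols])
        return st.pvariance(tab) - 10.0 * flat

    def evolve(pop=60, gens=150, p_mut=0.1):
        population = [[rng.random() for _ in range(9)] for _ in range(pop)]
        for _ in range(gens):
            nxt = []
            for _ in range(pop):
                # tournament selection, uniform crossover, point mutation
                a = max(rng.sample(population, 3), key=fitness)
                b = max(rng.sample(population, 3), key=fitness)
                child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
                if rng.random() < p_mut:
                    child[rng.randrange(9)] = rng.random()
                nxt.append(child)
            population = nxt
        return max(population, key=fitness)

    print([round(v, 2) for v in evolve()])   # an evolved 3x3 penetrance table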

  11. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    NASA Astrophysics Data System (ADS)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational B-Spline (NURBS) geometry and multiple levels of detail (Mixed and Reverse LoD), based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  12. Tackling saponin diversity in marine animals by mass spectrometry: data acquisition and integration.

    PubMed

    Decroo, Corentin; Colson, Emmanuel; Demeyer, Marie; Lemaur, Vincent; Caulier, Guillaume; Eeckhaut, Igor; Cornil, Jérôme; Flammang, Patrick; Gerbaux, Pascal

    2017-05-01

    Saponin analysis by mass spectrometry methods is nowadays progressively supplementing other analytical methods such as nuclear magnetic resonance (NMR). Indeed, saponin extracts from plants or marine animals often consist of complex mixtures of (slightly) different saponin molecules that require extensive purification and separation steps to meet the requirements of NMR spectroscopy measurements. Based on its intrinsic features, mass spectrometry represents an inescapable tool to access the structures of saponins within extracts by using LC-MS, MALDI-MS, and tandem mass spectrometry experiments. The combination of different MS methods nowadays allows for a detailed description of saponin structures, without extensive purification. However, the structural characterization process is based on low-kinetic-energy CID, which cannot afford a total structure elucidation as far as stereochemistry is concerned. Moreover, the structural difference between saponins in the same extract is often so small that coelution upon LC-MS analysis is unavoidable, rendering the isomeric distinction and characterization by CID challenging or impossible. In the present paper, we introduce ion mobility in combination with liquid chromatography to better tackle the structural complexity of saponin congeners. When analyzing saponin extracts with MS-based methods, handling the data remains problematic for the comprehensive reporting of the results, but also for their efficient comparison. We here introduce an original schematic representation using sector diagrams that are constructed from mass spectrometry data. We strongly believe that the proposed data integration could be useful for data interpretation since it allows for a direct and fast comparison, both in terms of composition and relative proportion of the saponin contents in different extracts. Graphical Abstract: A combination of state-of-the-art mass spectrometry methods, including ion mobility spectroscopy, is developed to afford a

  13. Modeling of protein binary complexes using structural mass spectrometry data

    PubMed Central

    Kamal, J.K. Amisha; Chance, Mark R.

    2008-01-01

    In this article, we describe a general approach to modeling the structure of binary protein complexes using structural mass spectrometry data combined with molecular docking. In the first step, hydroxyl radical mediated oxidative protein footprinting is used to identify residues that experience conformational reorganization due to binding or participate in the binding interface. In the second step, a three-dimensional atomic structure of the complex is derived by computational modeling. Homology modeling approaches are used to define the structures of the individual proteins if footprinting detects significant conformational reorganization as a function of complex formation. A three-dimensional model of the complex is constructed from these binary partners using the ClusPro program, which is composed of docking, energy filtering, and clustering steps. Footprinting data are used to incorporate constraints—positive and/or negative—in the docking step and are also used to decide the type of energy filter—electrostatics or desolvation—in the successive energy-filtering step. By using this approach, we examine the structure of a number of binary complexes of monomeric actin and compare the results to crystallographic data. Based on docking alone, a number of competing models with widely varying structures are observed, one of which is likely to agree with crystallographic data. When the docking steps are guided by footprinting data, accurate models emerge as top scoring. We demonstrate this method with the actin/gelsolin segment-1 complex. We also provide a structural model for the actin/cofilin complex using this approach which does not have a crystal or NMR structure. PMID:18042684
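
    As a toy illustration of how footprinting data can constrain docking (the residue names and poses below are entirely hypothetical, not from the article), one can keep only candidate poses whose predicted interface covers the residues that footprinting showed to be protected upon binding:

    # Residues protected upon binding, as identified by oxidative footprinting
    # (made-up labels for illustration).
    protected = {"R28", "K61", "Y69"}

    # Candidate docking poses mapped to their predicted interface residues.
    poses = {
        "pose_A": {"R28", "K61", "Y69", "D25"},
        "pose_B": {"K61", "E93"},
        "pose_C": {"R28", "Y69", "K61"},
    }

    # Positive constraint: the interface must contain every protected residue.
    consistent = [p for p, iface in poses.items() if protected <= iface]
    print(consistent)   # -> ['pose_A', 'pose_C']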

  14. Mathematic modeling of complex aquifer: Evian Natural Mineral Water case study considering lumped and distributed models.

    NASA Astrophysics Data System (ADS)

    Henriot, abel; Blavoux, bernard; Travi, yves; Lachassagne, patrick; Beon, olivier; Dewandel, benoit; Ladouche, bernard

    2013-04-01

    The Evian Natural Mineral Water (NMW) aquifer is a highly heterogeneous complex of Quaternary glacial deposits composed of three main units, from bottom to top: - The "Inferior Complex", mainly composed of basal and impermeable till lying on the Alpine rocks. It outcrops only at the higher altitudes but is known at depth through drilled holes. - The "Gavot Plateau Complex", an interstratified complex of mainly basal and lateral till up to 400 m thick. It outcrops at elevations between approximately 850 m and 1200 m a.m.s.l. over a 30 km² area and is the main known recharge area for the hydromineral system. - The "Terminal Complex", from which the Evian NMW emerges at 410 m a.m.s.l. It is composed of sand and gravel kame terraces that allow water to flow from the deep permeable layers of the "Gavot Plateau Complex" to the "Terminal Complex". A thick and impermeable terminal till caps and seals the system, so the aquifer is confined in its downstream area. Because of the heterogeneity and complexity of this hydrosystem, distributed modeling tools are difficult to implement at the whole-system scale: strong hypotheses would have to be made about geometry, hydraulic properties, and boundary conditions, for example, and extrapolation would no doubt lead to unacceptable errors. Consequently, a modeling strategy is being developed that also helps improve the conceptual model of the hydrosystem. Lumped models, mainly based on tritium time series, allow the whole hydrosystem to be modeled by combining in series an exponential model (superficial aquifers of the "Gavot Plateau Complex"), a dispersive model (Gavot Plateau interstratified complex), and a piston-flow model (sand and gravel of the kame terraces), with mean transit times of 8, 60, and 2.5 years respectively. These models provide insight into the governing parameters of the whole mineral aquifer. They help improve the current conceptual model and are to be refined with other environmental tracers such as CFCs and SF6. A
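
    Since the three lumped models act in series, the whole-system transit-time distribution is the convolution of the three kernels. A minimal numerical sketch follows; the dispersion parameter PD and the yearly grid are assumptions, and the piston-flow delay is approximated by the nearest grid point.

    import numpy as np

    t = np.arange(1.0, 200.0)            # years, yearly grid

    def exponential(t, T):               # well-mixed reservoir
        return np.exp(-t / T) / T

    def dispersion(t, T, PD=0.5):        # dispersion-model TTD; PD is assumed
        return (np.exp(-(1 - t / T) ** 2 * T / (4 * PD * t))
                / (t * np.sqrt(4 * np.pi * PD * t / T)))

    def piston(t, T):                    # pure delay at the nearest grid point
        g = np.zeros_like(t)
        g[np.argmin(np.abs(t - T))] = 1.0
        return g

    # Reservoirs in series convolve: exponential (8 a) -> dispersive (60 a)
    # -> piston flow (2.5 a).
    g = np.convolve(np.convolve(exponential(t, 8.0), dispersion(t, 60.0)),
                    piston(t, 2.5))
    g = g[:t.size] / g[:t.size].sum()    # renormalize the truncated kernel
    print((g * t).sum())                 # approx. mean transit time (~8 + 60 + 2.5 a)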

  15. How to Tackle Key Challenges in the Promotion of Physical Activity among Older Adults (65+): The AEQUIPA Network Approach

    PubMed Central

    Forberger, Sarah; Bammann, Karin; Bauer, Jürgen; Boll, Susanne; Bolte, Gabriele; Brand, Tilman; Hein, Andreas; Koppelin, Frauke; Lippke, Sonia; Meyer, Jochen; Pischke, Claudia R.; Voelcker-Rehage, Claudia; Zeeb, Hajo

    2017-01-01

    The paper introduces the theoretical framework and methods/instruments used by the Physical Activity and Health Equity: Primary Prevention for Healthy Ageing (AEQUIPA) prevention research network as an interdisciplinary approach to tackle key challenges in the promotion of physical activity among older people (65+). Drawing on the social-ecological model, the AEQUIPA network developed an interdisciplinary methodological design including quantitative/qualitative studies and systematic reviews, while combining expertise from diverse fields: public health, psychology, urban planning, sports sciences, health technology and geriatrics. AEQUIPA tackles key challenges when promoting physical activity (PA) in older adults: tailoring of interventions, fostering community readiness and participation, strengthening intersectoral collaboration, using new technological devices and evaluating intervention generated inequalities. AEQUIPA aims to strengthen the evidence base for age-specific preventive PA interventions and to yield new insights into the explanatory power of individual and contextual factors. Currently, the empirical work is still underway. First experiences indicate that the network has achieved a strong regional linkage with communities, local stakeholders and individuals. However, involving inactive persons and individuals from minority groups remained challenging. A review of existing PA intervention studies among the elderly revealed the potential to assess equity effects. The results will add to the theoretical and methodological discussion on evidence-based age-specific PA interventions and will contribute to the discussion about European and national health targets. PMID:28375177

  16. How to Tackle Key Challenges in the Promotion of Physical Activity among Older Adults (65+): The AEQUIPA Network Approach.

    PubMed

    Forberger, Sarah; Bammann, Karin; Bauer, Jürgen; Boll, Susanne; Bolte, Gabriele; Brand, Tilman; Hein, Andreas; Koppelin, Frauke; Lippke, Sonia; Meyer, Jochen; Pischke, Claudia R; Voelcker-Rehage, Claudia; Zeeb, Hajo

    2017-04-04

    The paper introduces the theoretical framework and methods/instruments used by the Physical Activity and Health Equity: Primary Prevention for Healthy Ageing (AEQUIPA) prevention research network as an interdisciplinary approach to tackle key challenges in the promotion of physical activity among older people (65+). Drawing on the social-ecological model, the AEQUIPA network developed an interdisciplinary methodological design including quantitative/qualitative studies and systematic reviews, while combining expertise from diverse fields: public health, psychology, urban planning, sports sciences, health technology and geriatrics. AEQUIPA tackles key challenges when promoting physical activity (PA) in older adults: tailoring of interventions, fostering community readiness and participation, strengthening intersectoral collaboration, using new technological devices and evaluating intervention generated inequalities. AEQUIPA aims to strengthen the evidence base for age-specific preventive PA interventions and to yield new insights into the explanatory power of individual and contextual factors. Currently, the empirical work is still underway. First experiences indicate that the network has achieved a strong regional linkage with communities, local stakeholders and individuals. However, involving inactive persons and individuals from minority groups remained challenging. A review of existing PA intervention studies among the elderly revealed the potential to assess equity effects. The results will add to the theoretical and methodological discussion on evidence-based age-specific PA interventions and will contribute to the discussion about European and national health targets.

  17. Geometric modeling of subcellular structures, organelles, and multiprotein complexes

    PubMed Central

    Feng, Xin; Xia, Kelin; Tong, Yiying; Wei, Guo-Wei

    2013-01-01

    Recently, the structure, function, stability, and dynamics of subcellular structures, organelles, and multi-protein complexes have emerged as a leading interest in structural biology. Geometric modeling not only provides visualizations of shapes for large biomolecular complexes but also fills the gap between structural information and theoretical modeling, and enables the understanding of function, stability, and dynamics. This paper introduces a suite of computational tools for volumetric data processing, information extraction, surface mesh rendering, geometric measurement, and curvature estimation of biomolecular complexes. Particular emphasis is given to the modeling of cryo-electron microscopy data. Lagrangian triangle meshes are employed for the surface representation. On the basis of this representation, algorithms are developed for surface area and surface-enclosed volume calculation, and curvature estimation. Methods for volumetric meshing are also presented. Because technological developments in computer science and mathematics have led to multiple choices at each stage of geometric modeling, we discuss the rationales in the design and selection of various algorithms. Analytical models are designed to test the computational accuracy and convergence of the proposed algorithms. Finally, we select a set of six cryo-electron microscopy datasets representing typical subcellular complexes to demonstrate the efficacy of the proposed algorithms in handling biomolecular surfaces and explore their capability of geometric characterization of binding targets. This paper offers a comprehensive protocol for the geometric modeling of subcellular structures, organelles, and multiprotein complexes. PMID:23212797
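
    For a closed, outward-oriented triangle mesh, surface area follows from per-face cross products and the enclosed volume from the divergence theorem (a sum of signed tetrahedron volumes against the origin). The NumPy sketch below illustrates the standard formulas on a tetrahedron; it is not the paper's code.

    import numpy as np

    def area_and_volume(verts, faces):
        """Surface area and enclosed volume of a closed triangle mesh.
        Volume uses the divergence theorem: the sum of signed tetrahedron
        volumes spanned by each outward-oriented face and the origin."""
        v = verts[faces]                           # (n_faces, 3, 3)
        cross = np.cross(v[:, 1] - v[:, 0], v[:, 2] - v[:, 0])
        area = 0.5 * np.linalg.norm(cross, axis=1).sum()
        volume = np.einsum('ij,ij->i', v[:, 0], cross).sum() / 6.0
        return area, abs(volume)

    # A regular right-corner tetrahedron keeps the example short.
    verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
    faces = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
    print(area_and_volume(verts, faces))  # area ~2.366, volume = 1/6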

  18. Beyond Individual Behaviour Change: The Role of Power, Knowledge and Strategy in Tackling Climate Change

    ERIC Educational Resources Information Center

    Kenis, Anneleen; Mathijs, Erik

    2012-01-01

    Individual behaviour change is fast becoming a kind of "holy grail" to tackle climate change, in environmental policy, the environmental movement and academic literature. This is contested by those who claim that social structures are the main problem and who advocate collective social action. The objective of the research presented in…

  19. Acquisition of Complex Systemic Thinking: Mental Models of Evolution

    ERIC Educational Resources Information Center

    d'Apollonia, Sylvia T.; Charles, Elizabeth S.; Boyd, Gary M.

    2004-01-01

    We investigated the impact of introducing college students to complex adaptive systems on their subsequent mental models of evolution compared to those of students taught in the same manner but with no reference to complex systems. The students' mental models (derived from similarity ratings of 12 evolutionary terms using the pathfinder algorithm)…

  20. Interactive Visualizations of Complex Seismic Data and Models

    NASA Astrophysics Data System (ADS)

    Chai, C.; Ammon, C. J.; Maceira, M.; Herrmann, R. B.

    2016-12-01

    The volume and complexity of seismic data and models have increased dramatically thanks to dense seismic station deployments and advances in data modeling and processing. Seismic observations such as receiver functions and surface-wave dispersion are multidimensional: latitude, longitude, time, amplitude and latitude, longitude, period, and velocity. Three-dimensional seismic velocity models are characterized with three spatial dimensions and one additional dimension for the speed. In these circumstances, exploring the data and models and assessing the data fits is a challenge. A few professional packages are available to visualize these complex data and models. However, most of these packages rely on expensive commercial software or require a substantial time investment to master, and even when that effort is complete, communicating the results to others remains a problem. A traditional approach during the model interpretation stage is to examine data fits and model features using a large number of static displays. Publications include a few key slices or cross-sections of these high-dimensional data, but this prevents others from directly exploring the model and corresponding data fits. In this presentation, we share interactive visualization examples of complex seismic data and models that are based on open-source tools and are easy to implement. Model and data are linked in an intuitive and informative web-browser based display that can be used to explore the model and the features in the data that influence various aspects of the model. We encode the model and data into HTML files and present high-dimensional information using two approaches. The first uses a Python package to pack both data and interactive plots in a single file. The second approach uses JavaScript, CSS, and HTML to build a dynamic webpage for seismic data visualization. The tools have proven useful and led to deeper insight into 3D seismic models and the data that were used to construct them
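
    The presenters describe packing data and interactive plots into a single HTML file with a Python package; as one open-source way to achieve that (an assumption here, not necessarily their toolchain), Plotly can export a self-contained, shareable page. The velocity slice below is synthetic.

    import numpy as np
    import plotly.graph_objects as go

    # Fake depth slice of a 3D shear-velocity model (latitude x longitude grid).
    lat = np.linspace(30, 50, 60)
    lon = np.linspace(-120, -90, 90)
    vs = 4.4 + 0.1 * np.sin(np.outer(lat, lon) / 40.0)

    fig = go.Figure(go.Heatmap(z=vs, x=lon, y=lat, colorbar=dict(title="Vs (km/s)")))
    fig.update_layout(title="Shear velocity at 100 km depth (synthetic)")
    fig.write_html("vs_slice.html", include_plotlyjs="cdn")  # one shareable HTML file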

  1. Complexation and molecular modeling studies of europium(III)-gallic acid-amino acid complexes.

    PubMed

    Taha, Mohamed; Khan, Imran; Coutinho, João A P

    2016-04-01

    With many metal-based drugs extensively used today in the treatment of cancer, attention has focused on the development of new coordination compounds with antitumor activity, with europium(III) complexes recently introduced as novel anticancer drugs. The aim of this work is to design new Eu(III) complexes with gallic acid, an antioxidant phenolic compound. Gallic acid was chosen because it shows anticancer activity without harming healthy cells. As an antioxidant, it helps to protect human cells against the oxidative damage implicated in DNA damage, cancer, and accelerated cell aging. In this work, the formation of binary and ternary complexes of Eu(III) with gallic acid as primary ligand and the amino acids alanine, leucine, isoleucine, and tryptophan was studied by glass electrode potentiometry in aqueous solution containing 0.1 M NaNO3 at (298.2 ± 0.1) K. Their overall stability constants were evaluated and the concentration distributions of the complex species in solution were calculated. The protonation constants of gallic acid and the amino acids were also determined under our experimental conditions and compared with those predicted by the conductor-like screening model for realistic solvation (COSMO-RS). The geometries of the Eu(III)-gallic acid complexes were characterized by density functional theory (DFT). Spectroscopic UV-visible and photoluminescence measurements were carried out to confirm the formation of Eu(III)-gallic acid complexes in aqueous solutions. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. A simple model clarifies the complicated relationships of complex networks

    PubMed Central

    Zheng, Bojin; Wu, Hongrun; Kuang, Li; Qin, Jun; Du, Wenhua; Wang, Jianmin; Li, Deyi

    2014-01-01

    Real-world networks such as the Internet and the WWW share many common traits. Until now, hundreds of models have been proposed to characterize these traits and so understand the networks. Because different models used very different mechanisms, it is widely believed that these traits originate from different causes. However, we find that a simple model based on optimisation can produce many traits, including scale-free, small-world, ultra-small-world, delta-distribution, compact, fractal, regular, and random networks. Moreover, by revising the proposed model, community-structure networks are generated. With this model and its revised versions, the complicated relationships among complex networks are illustrated. The model brings a new universal perspective to the understanding of complex networks and provides a universal method to model complex networks from the viewpoint of optimisation. PMID:25160506

  3. Antimicrobial Peptides: A Promising Therapeutic Strategy in Tackling Antimicrobial Resistance.

    PubMed

    Nuti, Ramya; Goud, Nerella S; Saraswati, A Prasanth; Alvala, Ravi; Alvala, Mallika

    2017-01-01

    Antimicrobial resistance (AMR) has posed a serious threat to global public health, and it requires immediate action, preferably long term. Current drug therapies have failed to curb this menace due to the ability of microbes to circumvent the mechanisms through which the drugs act. From the drug discovery point of view, the majority of drugs currently employed for antimicrobial therapy are small molecules. Recent trends reveal a surge in the use of peptides as drug candidates, as they offer remarkable advantages over small molecules. Newer synthetic strategies, such as organometallic complexes, peptide-polymer conjugates, solid-phase and liquid-phase synthesis, and recombinant DNA technology, are encouraging the use of peptides as therapeutic agents with a host of chemical functions, tailored for specific applications. In the last decade, many peptide-based drugs have been successfully approved by the Food and Drug Administration (FDA). This success can be attributed to their high specificity, selectivity, and efficacy, high penetrability into tissues, low immunogenicity, and low tissue accumulation. Considering the enormity of AMR, the use of antimicrobial peptides (AMPs) can be a viable alternative to current therapeutic strategies. AMPs are naturally abundant, allowing synthetic chemists to develop semi-synthetic peptide molecules. AMPs have a broad spectrum of activity towards microbes and possess the ability to bypass the resistance-induction mechanisms of microbes. The present review focuses on the potential applications of AMPs against various microbial disorders and their future prospects. Several resistance mechanisms and strategies against them are also discussed to highlight their importance in the current scenario. Breakthroughs in AMP design, peptide synthesis, and biotechnology have shown promise in tackling this challenge and have revived interest in using AMPs as an important weapon in fighting AMR. Copyright © Bentham Science Publishers; For any queries

  4. Debating complexity in modeling

    USGS Publications Warehouse

    Hunt, Randall J.; Zheng, Chunmiao

    1999-01-01

    As scientists trying to understand the natural world, how should our effort be apportioned? We know that the natural world is characterized by complex and interrelated processes. Yet do we need to explicitly incorporate these intricacies to perform the tasks we are charged with? In this era of expanding computer power and development of sophisticated preprocessors and postprocessors, are bigger machines making better models? Put another way, do we understand the natural world better now with all these advancements in our simulation ability? Today the public's patience for long-term projects producing indeterminate results is wearing thin. This increases pressure on the investigator to use the appropriate technology efficiently. On the other hand, bringing scientific results into the legal arena opens up a new dimension to the issue: to the layperson, a tool that includes more of the complexity known to exist in the real world is expected to provide the more scientifically valid answer.

  5. Thinking about complexity in health: A systematic review of the key systems thinking and complexity ideas in health.

    PubMed

    Rusoja, Evan; Haynie, Deson; Sievers, Jessica; Mustafee, Navonil; Nelson, Fred; Reynolds, Martin; Sarriot, Eric; Swanson, Robert Chad; Williams, Bob

    2018-01-30

    Key messages: Systems thinking and complexity science, theories that acknowledge the dynamic, connected, and context-dependent nature of health, are highly relevant to the post-millennium development goal era, yet there is no consensus on their use in relation to health. Although heterogeneous, the terms and concepts (such as emergence, dynamic/dynamical systems, nonlinear(ity), and interdependent/interconnected) and the methods (such as system dynamics modelling and agent-based modelling) that comprise systems thinking and complexity science in the health literature are shared across an increasing number of publications within medical/healthcare disciplines. Planners, practitioners, and theorists who better understand these key systems thinking and complexity science concepts will be better equipped to tackle the challenges of the upcoming development goals. © 2018 John Wiley & Sons, Ltd.

  6. Using Invention to Change How Students Tackle Problems

    PubMed Central

    Smith, Karen M.; van Stolk, Adrian P.; Spiegelman, George B.

    2010-01-01

    Invention activities challenge students to tackle problems that superficially appear unrelated to the course material but illustrate underlying concepts that are fundamental to the material to be presented. During our invention activities in a first-year biology class, students were presented with problems that parallel those that living cells must solve, in weekly sessions over a 13-wk term. We compared students who participated in the invention activity sessions with students who participated in sessions of structured problem solving and with students who did not participate in either activity. When faced with developing a solution to a challenging and unfamiliar biology problem, invention activity students were much quicker to engage with the problem and routinely provided multiple reasonable hypotheses. In contrast, the other students were significantly slower to begin working on the problem and routinely produced relatively few ideas. We suggest that the invention activities develop a highly valuable skill that operates at the initial stages of problem solving. PMID:21123697

  7. Evidence in support of the call to ban the tackle and harmful contact in school rugby: a response to World Rugby.

    PubMed

    Pollock, Allyson M; White, Adam John; Kirkwood, Graham

    2017-08-01

    In a paper published in BJSM (June 2016), World Rugby employees Ross Tucker and Martin Raftery and a third coauthor, Evert Verhagen, took issue with the recent call to ban tackling in school rugby in the UK and Ireland. That call (to ban tackling) was supported by a systematic review published in BJSM. Tucker et al claim that: (1) the mechanisms and risk factors for injury, along with the incidence and severity of injury in youth rugby union, have not been thoroughly identified or understood; (2) rugby players are at no greater risk of injury than other sportspeople; (3) this is particularly the case for children under 15 years; and (4) removing the opportunity to learn the tackle from school pupils might increase rates of injuries. They conclude that a ban 'may be unnecessary and may also lead to unintended consequences such as an increase in the risk of injury later in participation.' Here we aim to rebut the case made by Tucker et al. We share new research that extends the findings of our original systematic review and meta-analysis. A cautionary approach requires the removal of the tackle from school rugby as the quickest and most effective method of reducing high injury rates in youth rugby, a public health priority. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. Modeling protein complexes with BiGGER.

    PubMed

    Krippahl, Ludwig; Moura, José J; Palma, P Nuno

    2003-07-01

    This article describes the method and results of our participation in the Critical Assessment of PRediction of Interactions (CAPRI) experiment, using the protein docking program BiGGER (Bimolecular complex Generation with Global Evaluation and Ranking) (Palma et al., Proteins 2000;39:372-384). Of five target complexes (CAPRI targets 2, 4, 5, 6, and 7), only one was successfully predicted (target 6), but BiGGER generated reasonable models for targets 4, 5, and 7, which could have been identified if additional biochemical information had been available. Copyright 2003 Wiley-Liss, Inc.

  9. Modeling Complex Cross-Systems Software Interfaces Using SysML

    NASA Technical Reports Server (NTRS)

    Mandutianu, Sanda; Morillo, Ron; Simpson, Kim; Liepack, Otfrid; Bonanne, Kevin

    2013-01-01

    The complex flight and ground systems for NASA human space exploration are designed, built, operated and managed as separate programs and projects. However, each system relies on one or more of the other systems in order to accomplish specific mission objectives, creating a complex, tightly coupled architecture. Thus, there is a fundamental need to understand how each system interacts with the others. To determine whether a model-based systems engineering approach could be utilized to assist with understanding the complex system interactions, the NASA Engineering and Safety Center (NESC) sponsored a task to develop an approach for performing cross-system behavior modeling. This paper presents the results of applying Model Based Systems Engineering (MBSE) principles using the Systems Modeling Language (SysML) to define cross-system behaviors and how they map to cross-system software interfaces documented in system-level Interface Control Documents (ICDs).

  10. What do we gain from simplicity versus complexity in species distribution models?

    USGS Publications Warehouse

    Merow, Cory; Smith, Matthew J.; Edwards, Thomas C.; Guisan, Antoine; McMahon, Sean M.; Normand, Signe; Thuiller, Wilfried; Wuest, Rafael O.; Zimmermann, Niklaus E.; Elith, Jane

    2014-01-01

    Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence–environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence–environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building ‘under fit’ models, having insufficient flexibility to describe observed occurrence–environment relationships, we risk misunderstanding the factors shaping species distributions. By building ‘over fit’ models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species

  11. From Complex to Simple: Interdisciplinary Stochastic Models

    ERIC Educational Resources Information Center

    Mazilu, D. A.; Zamora, G.; Mazilu, I.

    2012-01-01

    We present two simple, one-dimensional, stochastic models that lead to a qualitative understanding of very complex systems from biology, nanoscience and social sciences. The first model explains the complicated dynamics of microtubules, stochastic cellular highways. Using the theory of random walks in one dimension, we find analytical expressions…

  12. The development of a model of creative space and its potential for transfer from non-formal to formal education

    NASA Astrophysics Data System (ADS)

    White, Irene; Lorenzi, Francesca

    2016-12-01

    Creativity has been emerging as a key concept in educational policies since the mid-1990s, with many Western countries restructuring their education systems to embrace innovative approaches likely to stimulate creative and critical thinking. But despite current intentions of putting more emphasis on creativity in education policies worldwide, there is still a relative dearth of viable models which capture the complexity of creativity and the conditions for its successful infusion into formal school environments. The push for creativity is in direct conflict with the results-driven, competitive, performance-oriented culture which continues to dominate formal education systems. The authors of this article argue that incorporating creativity into mainstream education is a complex task and is best tackled by taking a systematic and multifaceted approach. They present a multidimensional model designed to help educators in tackling the challenges of the promotion of creativity. Their model encompasses three distinct yet interrelated dimensions of a creative space - physical, social-emotional and critical. The authors use the metaphor of space to refer to the interplay of the three identified dimensions. Drawing on confluence approaches to the theorisation of creativity, this paper exemplifies the development of a model against the background of a growing trend towards systems theories. The aim of the model is to be helpful in systematising creativity by offering parameters - derived from the evaluation of an example offered by a non-formal educational environment - for the development of creative environments within mainstream secondary schools.

  13. Research on complex 3D tree modeling based on L-system

    NASA Astrophysics Data System (ADS)

    Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li

    2018-03-01

    The L-system, as a fractal iterative system, can simulate complex geometric patterns. Based on field observation data of trees and the knowledge of forestry experts, this paper extracted modeling constraint rules and obtained an L-system rule set. Using self-developed L-system modeling software, the rule set was parsed to generate complex 3D tree models. The results showed that the geometric modeling method based on L-systems can describe the morphological structure of complex trees and generate 3D tree models.
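
    The paper's extracted rule set is not reproduced in the abstract, so the sketch below shows only the core L-system mechanism, using a textbook bracketed tree grammar; the axiom, productions and iteration depth are assumptions.

        # Minimal L-system string rewriting. 'F' = draw a segment, '+'/'-' =
        # rotate, '[' / ']' = push/pop the turtle state; interpreting the final
        # string with a 3D turtle yields the branching tree geometry.
        RULES = {"X": "F[+X][-X]FX", "F": "FF"}

        def rewrite(axiom, rules, iterations):
            s = axiom
            for _ in range(iterations):
                s = "".join(rules.get(ch, ch) for ch in s)
            return s

        print(len(rewrite("X", RULES, 5)))   # string length grows rapidly per pass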

  14. Improving Psychosexual Knowledge in Adolescents with Autism Spectrum Disorder: Pilot of the Tackling Teenage Training Program

    ERIC Educational Resources Information Center

    Dekker, Linda P.; van der Vegt, Esther J.; Visser, Kirsten; Tick, Nouchka; Boudesteijn, Frieda; Verhulst, Frank C.; Maras, Athanasios; Greaves-Lord, Kirstin

    2015-01-01

    Previous studies have shown that psychosexual functioning in adolescents with autism spectrum disorder (ASD) is hampered and emphasize the need for a specialized training program tailored to their needs. Therefore, an individual training program was developed; the Tackling Teenage Training (TTT) program. The current pilot study systematically…

  15. A Complex Systems Model Approach to Quantified Mineral Resource Appraisal

    USGS Publications Warehouse

    Gettings, M.E.; Bultman, M.W.; Fisher, F.S.

    2004-01-01

    For federal and state land management agencies, mineral resource appraisal has evolved from value-based to outcome-based procedures wherein the consequences of resource development are compared with those of other management options. Complex systems modeling is proposed as a general framework in which to build models that can evaluate outcomes. Three frequently used methods of mineral resource appraisal (subjective probabilistic estimates, weights of evidence modeling, and fuzzy logic modeling) are discussed to obtain insight into methods of incorporating complexity into mineral resource appraisal models. Fuzzy logic and weights of evidence are most easily utilized in complex systems models. A fundamental product of new appraisals is the production of reusable, accessible databases and methodologies so that appraisals can easily be repeated with new or refined data. The data are representations of complex systems and must be so regarded if all of their information content is to be utilized. The proposed generalized model framework is applicable to mineral assessment and other geoscience problems. We begin with a (fuzzy) cognitive map using (+1,0,-1) values for the links and evaluate the map for various scenarios to obtain a ranking of the importance of various links. Fieldwork and modeling studies identify important links and help identify unanticipated links. Next, the links are given membership functions in accordance with the data. Finally, processes are associated with the links; ideally, the controlling physical and chemical events and equations are found for each link. After calibration and testing, this complex systems model is used for predictions under various scenarios.
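
    As a concrete, hypothetical illustration of the cognitive-map step described above, the sketch below iterates a small signed map to score a scenario; the example weight matrix and the tanh squashing function are assumptions, not the authors' map.

        # Evaluating a (fuzzy) cognitive map: concepts are nodes, links carry
        # +1/0/-1 influence values, and the concept-activation state is iterated
        # until it settles; different initial activations encode scenarios.
        import numpy as np

        W = np.array([              # W[i, j]: influence of concept j on concept i
            [ 0.0,  1.0,  0.0],
            [-1.0,  0.0,  1.0],
            [ 1.0,  0.0,  0.0],
        ])

        def run_scenario(x0, steps=50):
            x = np.asarray(x0, dtype=float)
            for _ in range(steps):
                x = np.tanh(W @ x)   # squash summed influences into [-1, 1]
            return x

        print(run_scenario([1.0, 0.0, 0.0]))   # activate concept 0, read the rest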

  16. Intrinsic Uncertainties in Modeling Complex Systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  17. Surface complexation modeling of americium sorption onto volcanic tuff.

    PubMed

    Ding, M; Kelkar, S; Meijer, A

    2014-10-01

    Results of a surface complexation model (SCM) for americium sorption on volcanic rocks (devitrified and zeolitic tuff) are presented. The model was developed using PHREEQC and based on laboratory data for americium sorption on quartz. Available data for sorption of americium on quartz as a function of pH in dilute groundwater can be modeled with two surface reactions involving an americium sulfate and an americium carbonate complex. It was assumed, in applying the model to volcanic rocks from Yucca Mountain, that the surface properties of volcanic rocks can be represented by a quartz surface. Using groundwaters compositionally representative of Yucca Mountain, americium sorption distribution coefficient (Kd, L/kg) values were calculated as a function of pH. These Kd values are close to the experimentally determined Kd values for americium sorption on volcanic rocks, decreasing with increasing pH in the pH range from 7 to 9. The surface complexation constants derived in this study allow prediction of americium sorption in a natural complex system, taking into account the inherent uncertainty associated with geochemical conditions that occur along transport pathways. Published by Elsevier Ltd.

  18. Network model of bilateral power markets based on complex networks

    NASA Astrophysics Data System (ADS)

    Wu, Yang; Liu, Junyong; Li, Furong; Yan, Zhanxin; Zhang, Li

    2014-06-01

    The bilateral power transaction (BPT) mode has become a typical market organization with the restructuring of the electric power industry, and a proper model that captures its characteristics is urgently needed. Such a model has been lacking, however, because of this market organization's complexity. As a promising approach to modeling complex systems, complex networks can provide a sound theoretical framework for developing a proper simulation model. In this paper, a complex network model of the BPT market is proposed. In this model, a price advantage mechanism is a precondition. Unlike in other general commodity transactions, both the financial layer and the physical layer are considered in the model. Through simulation analysis, the feasibility and validity of the model are verified. At the same time, some typical statistical features of the BPT network are identified: the degree distribution follows a power law, the clustering coefficient is low and the average path length is relatively long. Moreover, the topological stability of the BPT network is tested. The results show that the network displays topological robustness to random market member failures while it is fragile against deliberate attacks, and that the network can resist cascading failure to some extent. These features are helpful for decision making and risk management in BPT markets.
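
    The statistics reported above (heavy-tailed degrees, low clustering, longish paths) are straightforward to compute with networkx; the sketch below does so on a Barabási-Albert surrogate graph, which merely stands in for the paper's simulated BPT network.

        # Degree distribution, clustering coefficient and average path length
        # for a surrogate scale-free graph, using standard networkx calls.
        import networkx as nx

        G = nx.barabasi_albert_graph(n=1000, m=2, seed=42)
        degrees = [d for _, d in G.degree()]
        print("max degree:", max(degrees))
        print("average clustering:", nx.average_clustering(G))
        print("average path length:", nx.average_shortest_path_length(G))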

  19. Tackling Missing Data in Community Health Studies Using Additive LS-SVM Classifier.

    PubMed

    Wang, Guanjin; Deng, Zhaohong; Choi, Kup-Sze

    2018-03-01

    Missing data is a common issue in community health and epidemiological studies. Direct removal of samples with missing data can lead to reduced sample size and information bias, which deteriorates the significance of the results. While data imputation methods are available to deal with missing data, they are limited in performance and can introduce noise into the dataset. Instead of data imputation, a novel method based on the additive least square support vector machine (LS-SVM) is proposed in this paper for predictive modeling when the input features of the model contain missing data. The method also simultaneously determines the influence of the features with missing values on the classification accuracy, using the fast leave-one-out cross-validation strategy. The performance of the method is evaluated by applying it to predict the quality of life (QOL) of elderly people using health data collected in the community. The dataset involves demographics, socioeconomic status, health history, and the outcomes of health assessments of 444 community-dwelling elderly people, with 5% to 60% of the data missing in some of the input features. QOL is measured using a standard questionnaire of the World Health Organization. Results show that the proposed method outperforms four conventional methods for handling missing data (case deletion, feature deletion, mean imputation, and K-nearest neighbor imputation), with the average QOL prediction accuracy reaching 0.7418. It is potentially a promising technique for tackling missing data in community health research and other applications.
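
    The additive, missing-data-aware variant proposed in the paper is specialized, but the LS-SVM core it builds on reduces training to a single linear solve. Below is a minimal sketch of a standard (non-additive) LS-SVM classifier with an RBF kernel on illustrative data; it is not the paper's method.

        # Standard LS-SVM classifier: solve
        #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
        # then predict with sign(K(x, X) @ alpha + b).
        import numpy as np

        def rbf(A, B, s=1.0):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2 * s * s))

        def fit(X, y, gamma=10.0):
            n = len(y)
            M = np.zeros((n + 1, n + 1))
            M[0, 1:] = M[1:, 0] = 1.0
            M[1:, 1:] = rbf(X, X) + np.eye(n) / gamma
            sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
            return sol[0], sol[1:]                    # bias b, dual weights alpha

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
        y = np.array([-1.0] * 20 + [1.0] * 20)
        b, alpha = fit(X, y)
        pred = np.sign(rbf(X, X) @ alpha + b)
        print("training accuracy:", (pred == y).mean())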

  20. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling-qualities features which are apparent to the simulator pilot. The technical approach begins with specification of the features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  1. Elastic Network Model of a Nuclear Transport Complex

    NASA Astrophysics Data System (ADS)

    Ryan, Patrick; Liu, Wing K.; Lee, Dockjin; Seo, Sangjae; Kim, Young-Jin; Kim, Moon K.

    2010-05-01

    RanGTP plays an important role in both nuclear protein import and export cycles. In the nucleus, RanGTP releases macromolecular cargoes from importins and, conversely, facilitates cargo binding to exportins. Although the crystal structure of the nuclear import complex formed by importin Kap95p and RanGTP was recently identified, its molecular mechanism still remains unclear. To understand the relationship between structure and function of a nuclear transport complex, a structure-based mechanical model of the Kap95p:RanGTP complex is introduced; the structure of Kap95p was obtained from the Protein Data Bank (www.pdb.org). In this model, a protein structure is simply modeled as an elastic network in which a set of coarse-grained point masses are connected by linear springs representing biochemical interactions at the atomic level. Harmonic normal mode analysis (NMA) and anharmonic elastic network interpolation (ENI) are performed to predict the modes of vibration and a feasible pathway between the locked and unlocked conformations of Kap95p, respectively. Simulation results imply that the binding of RanGTP to Kap95p induces the release of the cargo in the nucleus as well as prevents any new cargo from attaching to the Kap95p:RanGTP complex.
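
    The elastic-network construction described here is compact enough to sketch: springs connect all residue pairs within a cutoff, and the normal modes are eigenvectors of the resulting Hessian. The cutoff, spring constant and random stand-in coordinates below are assumptions; a real run would read C-alpha coordinates from the PDB entry.

        # Anisotropic elastic network model: build the 3N x 3N Hessian from
        # pairwise springs, then take low-frequency eigenvectors as the modes.
        import numpy as np

        def anm_hessian(coords, cutoff=15.0, gamma=1.0):
            n = len(coords)
            H = np.zeros((3 * n, 3 * n))
            for i in range(n):
                for j in range(i + 1, n):
                    d = coords[j] - coords[i]
                    r2 = d @ d
                    if r2 > cutoff ** 2:
                        continue
                    block = -gamma * np.outer(d, d) / r2   # off-diagonal super-element
                    H[3*i:3*i+3, 3*j:3*j+3] = block
                    H[3*j:3*j+3, 3*i:3*i+3] = block
                    H[3*i:3*i+3, 3*i:3*i+3] -= block       # diagonal accumulates -sum
                    H[3*j:3*j+3, 3*j:3*j+3] -= block
            return H

        coords = np.random.default_rng(1).uniform(0, 30, (50, 3))  # fake C-alphas
        w, v = np.linalg.eigh(anm_hessian(coords))
        print(np.round(w[:8], 5))   # first six ~ 0 (rigid-body); then soft modes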

  2. Orbital Architectures of Dynamically Complex Exoplanet Systems

    NASA Astrophysics Data System (ADS)

    Nelson, Benjamin E.

    2015-01-01

    The most powerful constraints on planet formation will come from characterizing the dynamical state of complex multi-planet systems. Unfortunately, with that complexity comes a number of factors that make analyzing these systems a computationally challenging endeavor: the sheer number of model parameters, an awkwardly shaped posterior distribution, and hundreds to thousands of time series measurements. We develop a differential evolution Markov chain Monte Carlo algorithm (RUN DMC) to tackle these difficult aspects of data analysis. We apply RUN DMC to two classic multi-planet systems from radial velocity surveys, 55 Cancri and GJ 876. For 55 Cancri, we find the innermost planet "e" must be coplanar to within 40 degrees of the outer planets, otherwise Kozai-like perturbations will cause the planet's orbit to cross the stellar surface. We find the orbits of planets "b" and "c" are apsidally aligned and librating with low to median amplitude (50 +6/-10 degrees), but they are not orbiting in a mean-motion resonance. For GJ 876, we can meaningfully constrain the three-dimensional orbital architecture of all the planets based on the radial velocity data alone. By demanding orbital stability, we find the resonant planets have low mutual inclinations (Φ), so they must be roughly coplanar (Φcb = 1.41 +0.62/-0.57 degrees and Φbe = 3.87 +1.99/-1.86 degrees). The three-dimensional Laplace argument librates with an amplitude of 50.5 +7.9/-10.0 degrees, indicating significant past disk migration and ensuring long-term stability. These empirically derived models will provide new challenges for planet formation models and motivate the need for more sophisticated algorithms to analyze exoplanet data.
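
    RUN DMC belongs to the differential evolution MCMC family, whose defining move is easy to sketch: each chain proposes a jump along the difference of two other randomly chosen chains, so the proposal scale and orientation adapt to the posterior automatically. The toy correlated-Gaussian target below is an illustration, not an exoplanet likelihood.

        # Differential evolution MCMC core:  x* = x_i + g * (x_j - x_k) + eps.
        import numpy as np

        rng = np.random.default_rng(1)
        cov_inv = np.linalg.inv(np.array([[1.0, 0.9], [0.9, 1.0]]))
        logp = lambda x: -0.5 * x @ cov_inv @ x        # toy 2-D Gaussian target

        n_chains, n_dim, n_steps = 10, 2, 3000
        g = 2.38 / np.sqrt(2 * n_dim)                  # standard DE-MC step scale
        X = rng.normal(size=(n_chains, n_dim))
        lp = np.array([logp(x) for x in X])
        samples = []

        for t in range(n_steps):
            for i in range(n_chains):
                j, k = rng.choice([c for c in range(n_chains) if c != i], 2,
                                  replace=False)
                prop = X[i] + g * (X[j] - X[k]) + rng.normal(scale=1e-4, size=n_dim)
                lp_prop = logp(prop)
                if np.log(rng.random()) < lp_prop - lp[i]:   # Metropolis accept
                    X[i], lp[i] = prop, lp_prop
            if t >= n_steps // 2:                            # keep post-burn-in states
                samples.extend(X.copy())

        print(np.cov(np.array(samples).T))   # approaches [[1, .9], [.9, 1]]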

  3. Primary care support for tackling obesity: a qualitative study of the perceptions of obese patients.

    PubMed

    Brown, Ian; Thompson, Joanne; Tod, Angela; Jones, Georgina

    2006-09-01

    Obesity has become a major public health issue and there is concern about the response of health services to patients who are obese. The perceptions of obese patients using primary care services have not been studied in depth. To explore obese patients' experiences and perceptions of support in primary care. Qualitative study with semi-structured interviews conducted in participants' homes. Five general practices contrasting in socioeconomic populations in Sheffield. Purposive sampling and semi-structured interviewing of 28 patients with a diverse range of ages, backgrounds, levels of obesity and experiences of primary care services. Participants typically felt reluctance when presenting with concerns about weight and ambivalence about the services received. They also perceived there to be ambivalence and a lack of resources on the part of the health services. Participants showed a strong sense of personal responsibility about their condition and stigma-related cognitions were common. These contributed to their ambivalence about using services and their sensitivity to its features. Good relationships with primary care professionals and more intensive support partly ameliorated these effects. The challenges of improving access to and quality of primary care support in tackling obesity are made more complex by patients' ambivalence and other effects of the stigma associated with obesity.

  4. Complex Networks in Psychological Models

    NASA Astrophysics Data System (ADS)

    Wedemann, R. S.; Carvalho, L. S. A. V. D.; Donangelo, R.

    We develop schematic, self-organizing, neural-network models to describe mechanisms associated with mental processes in terms of a neurocomputational substrate. These models are examples of real-world complex networks with interesting general topological structures. Considering dopaminergic signal-to-noise neuronal modulation in the central nervous system, we propose neural network models to explain the development of cortical map structure and the dynamics of memory access, and to unify different mental processes within a single neurocomputational substrate. Based on our neural network models, neurotic behavior may be understood as an associative memory process in the brain, and the linguistic, symbolic associative process involved in psychoanalytic working-through can be mapped onto a corresponding process of reconfiguration of the neural network. The models are illustrated through computer simulations, where we varied dopaminergic modulation and observed the self-organizing emergent patterns in the resulting semantic map, interpreting them as different manifestations of mental functioning, from psychotic through normal and neurotic behavior, and creativity.

  5. Assessing Complexity in Learning Outcomes--A Comparison between the SOLO Taxonomy and the Model of Hierarchical Complexity

    ERIC Educational Resources Information Center

    Stålne, Kristian; Kjellström, Sofia; Utriainen, Jukka

    2016-01-01

    An important aspect of higher education is to educate students who can manage complex relationships and solve complex problems. Teachers need to be able to evaluate course content with regard to complexity, as well as evaluate students' ability to assimilate complex content and express it in the form of a learning outcome. One model for evaluating…

  6. Building a pseudo-atomic model of the anaphase-promoting complex.

    PubMed

    Kulkarni, Kiran; Zhang, Ziguo; Chang, Leifu; Yang, Jing; da Fonseca, Paula C A; Barford, David

    2013-11-01

    The anaphase-promoting complex (APC/C) is a large E3 ubiquitin ligase that regulates progression through specific stages of the cell cycle by coordinating the ubiquitin-dependent degradation of cell-cycle regulatory proteins. Depending on the species, the active form of the APC/C consists of 14-15 different proteins that assemble into a 20-subunit complex with a mass of approximately 1.3 MDa. A hybrid approach of single-particle electron microscopy and protein crystallography of individual APC/C subunits has been applied to generate pseudo-atomic models of various functional states of the complex. Three approaches for assigning regions of the EM-derived APC/C density map to specific APC/C subunits are described. This information was used to dock atomic models of APC/C subunits, determined either by protein crystallography or homology modelling, to specific regions of the APC/C EM map, allowing the generation of a pseudo-atomic model corresponding to 80% of the entire complex.

  7. The noisy voter model on complex networks.

    PubMed

    Carro, Adrián; Toral, Raúl; San Miguel, Maxi

    2016-04-20

    We propose a new analytical method to study stochastic, binary-state models on complex networks. Moving beyond the usual mean-field theories, this alternative approach is based on the introduction of an annealed approximation for uncorrelated networks, allowing us to deal with the network structure as parametric heterogeneity. As an illustration, we study the noisy voter model, a modification of the original voter model including random changes of state. The proposed method is able to unfold the dependence of the model not only on the mean degree (the mean-field prediction) but also on more complex averages over the degree distribution. In particular, we find that the degree heterogeneity--variance of the underlying degree distribution--has a strong influence on the location of the critical point of a noise-induced, finite-size transition occurring in the model, on the local ordering of the system, and on the functional form of its temporal correlations. Finally, we show how this latter point opens the possibility of inferring the degree heterogeneity of the underlying network by observing only the aggregate behavior of the system as a whole, an issue of interest for systems where only macroscopic, population-level variables can be measured.
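
    A direct Monte Carlo sketch of the model's microscopic rule is given below (graph choice and parameter values are illustrative): with probability a the chosen node flips at random (the noise term); otherwise it copies a random neighbour (the voter term).

        # Noisy voter model simulation on a network.
        import random
        import networkx as nx

        def noisy_voter(G, a=0.01, steps=200_000, seed=0):
            rng = random.Random(seed)
            state = {v: rng.choice((0, 1)) for v in G}
            nodes = list(G)
            for _ in range(steps):
                v = rng.choice(nodes)
                if rng.random() < a:
                    state[v] = 1 - state[v]                       # noise: random flip
                else:
                    state[v] = state[rng.choice(list(G[v]))]      # imitation
            return sum(state.values()) / len(state)

        G = nx.barabasi_albert_graph(500, 3, seed=7)
        print(noisy_voter(G))   # fraction in state 1; behaviour depends on a*N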

  8. Classrooms as Complex Adaptive Systems: A Relational Model

    ERIC Educational Resources Information Center

    Burns, Anne; Knox, John S.

    2011-01-01

    In this article, we describe and model the language classroom as a complex adaptive system (see Logan & Schumann, 2005). We argue that linear, categorical descriptions of classroom processes and interactions do not sufficiently explain the complex nature of classrooms, and cannot account for how classroom change occurs (or does not occur), over…

  9. Mathematical Models to Determine Stable Behavior of Complex Systems

    NASA Astrophysics Data System (ADS)

    Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.

    2018-05-01

    The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. The functioning of the complex dynamic system is described in terms of chaotic states, self-organized criticality and bifurcations. This problem may be resolved by modeling such systems as dynamic ones, without applying stochastic models and taking into account strange attractors.

  10. Mathematical and Computational Modeling in Complex Biological Systems

    PubMed Central

    Li, Wenyang; Zhu, Xiaoliang

    2017-01-01

    The biological processes and molecular functions involved in cancer progression remain difficult for biologists and clinical doctors to understand. Recent developments in high-throughput technologies urge systems biology to achieve more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the developments of high-throughput technologies and systemic modeling of the biological process in cancer research. In this review, we first studied several typical mathematical modeling approaches of biological systems at different scales and deeply analyzed their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling were summarized. To conclude, this review provides an update of important solutions using computational modeling approaches in systems biology. PMID:28386558

  11. Mathematical and Computational Modeling in Complex Biological Systems.

    PubMed

    Ji, Zhiwei; Yan, Ke; Li, Wenyang; Hu, Haigen; Zhu, Xiaoliang

    2017-01-01

    The biological processes and molecular functions involved in cancer progression remain difficult for biologists and clinical doctors to understand. Recent developments in high-throughput technologies urge systems biology to achieve more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the developments of high-throughput technologies and systemic modeling of the biological process in cancer research. In this review, we first studied several typical mathematical modeling approaches of biological systems at different scales and deeply analyzed their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling were summarized. To conclude, this review provides an update of important solutions using computational modeling approaches in systems biology.

  12. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  13. Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology

    NASA Astrophysics Data System (ADS)

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-11-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  14. Improving a regional model using reduced complexity and parameter estimation

    USGS Publications Warehouse

    Kelson, Victor A.; Hunt, Randall J.; Haitjema, Henk M.

    2002-01-01

    The availability of powerful desktop computers and graphical user interfaces for ground water flow models makes possible the construction of ever more complex models. A proposed copper-zinc sulfide mine in northern Wisconsin offers a unique case in which the same hydrologic system has been modeled using a variety of techniques covering a wide range of sophistication and complexity. Early in the permitting process, simple numerical models were used to evaluate the necessary amount of water to be pumped from the mine, reductions in streamflow, and the drawdowns in the regional aquifer. More complex models have subsequently been used in an attempt to refine the predictions. Even after so much modeling effort, questions regarding the accuracy and reliability of the predictions remain. We have performed a new analysis of the proposed mine using the two-dimensional analytic element code GFLOW coupled with the nonlinear parameter estimation code UCODE. The new model is parsimonious, containing fewer than 10 parameters, and covers a region several times larger in areal extent than any of the previous models. The model demonstrates the suitability of analytic element codes for use with parameter estimation codes. The simplified model results are similar to the more complex models; predicted mine inflows and UCODE-derived 95% confidence intervals are consistent with the previous predictions. More important, the large areal extent of the model allowed us to examine hydrological features not included in the previous models, resulting in new insights about the effects that far-field boundary conditions can have on near-field model calibration and parameterization. In this case, the addition of surface water runoff into a lake in the headwaters of a stream while holding recharge constant moved a regional ground watershed divide and resulted in some of the added water being captured by the adjoining basin. Finally, a simple analytical solution was used to clarify the GFLOW model

  15. Improving a regional model using reduced complexity and parameter estimation.

    PubMed

    Kelson, Victor A; Hunt, Randall J; Haitjema, Henk M

    2002-01-01

    The availability of powerful desktop computers and graphical user interfaces for ground water flow models makes possible the construction of ever more complex models. A proposed copper-zinc sulfide mine in northern Wisconsin offers a unique case in which the same hydrologic system has been modeled using a variety of techniques covering a wide range of sophistication and complexity. Early in the permitting process, simple numerical models were used to evaluate the necessary amount of water to be pumped from the mine, reductions in streamflow, and the drawdowns in the regional aquifer. More complex models have subsequently been used in an attempt to refine the predictions. Even after so much modeling effort, questions regarding the accuracy and reliability of the predictions remain. We have performed a new analysis of the proposed mine using the two-dimensional analytic element code GFLOW coupled with the nonlinear parameter estimation code UCODE. The new model is parsimonious, containing fewer than 10 parameters, and covers a region several times larger in areal extent than any of the previous models. The model demonstrates the suitability of analytic element codes for use with parameter estimation codes. The simplified model results are similar to the more complex models; predicted mine inflows and UCODE-derived 95% confidence intervals are consistent with the previous predictions. More important, the large areal extent of the model allowed us to examine hydrological features not included in the previous models, resulting in new insights about the effects that far-field boundary conditions can have on near-field model calibration and parameterization. In this case, the addition of surface water runoff into a lake in the headwaters of a stream while holding recharge constant moved a regional ground watershed divide and resulted in some of the added water being captured by the adjoining basin. Finally, a simple analytical solution was used to clarify the GFLOW model

  16. Musculoskeletal modelling of human ankle complex: Estimation of ankle joint moments.

    PubMed

    Jamwal, Prashant K; Hussain, Shahid; Tsoi, Yun Ho; Ghayesh, Mergen H; Xie, Sheng Quan

    2017-05-01

    A musculoskeletal model of the ankle complex is vital for enhancing the understanding of neuro-mechanical control of ankle motions, diagnosing ankle disorders and assessing subsequent treatments. Motions at the human ankle and foot, however, are complex due to simultaneous movements at the two joints, namely the ankle joint and the subtalar joint. The musculoskeletal elements at the ankle complex, such as ligaments, muscles and tendons, have intricate arrangements and exhibit transient and nonlinear behaviour. This paper develops a musculoskeletal model of the ankle complex considering the biaxial ankle structure. The model provides estimates of the overall mechanical characteristics (motion and moments) of the ankle complex through consideration of forces applied along ligaments and muscle-tendon units. The dynamics of the ankle complex and its surrounding ligaments and muscle-tendon units is modelled and formulated into a state space model to facilitate simulations. A graphical user interface was also developed during this research in order to include visual anatomical information by converting it to quantitative information on coordinates. Validation of the ankle model was carried out by comparing its outputs with those published in the literature as well as with experimental data obtained from an existing parallel ankle rehabilitation robot. Qualitative agreement was observed between the model and the measured data for both passive and active ankle motions during trials, in terms of displacements and moments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Drosophila as an In Vivo Model for Human Neurodegenerative Disease

    PubMed Central

    McGurk, Leeanne; Berson, Amit; Bonini, Nancy M.

    2015-01-01

    With the increase in the ageing population, neurodegenerative disease is devastating to families and poses a huge burden on society. The brain and spinal cord are extraordinarily complex: they consist of a highly organized network of neuronal and support cells that communicate in a highly specialized manner. One approach to tackling problems of such complexity is to address the scientific questions in simpler, yet analogous, systems. The fruit fly, Drosophila melanogaster, has been proven tremendously valuable as a model organism, enabling many major discoveries in neuroscientific disease research. The plethora of genetic tools available in Drosophila allows for exquisite targeted manipulation of the genome. Due to its relatively short lifespan, complex questions of brain function can be addressed more rapidly than in other model organisms, such as the mouse. Here we discuss features of the fly as a model for human neurodegenerative disease. There are many distinct fly models for a range of neurodegenerative diseases; we focus on select studies from models of polyglutamine disease and amyotrophic lateral sclerosis that illustrate the type and range of insights that can be gleaned. In discussion of these models, we underscore strengths of the fly in providing understanding into mechanisms and pathways, as a foundation for translational and therapeutic research. PMID:26447127

  18. Pattern-Based Inverse Modeling for Characterization of Subsurface Flow Models with Complex Geologic Heterogeneity

    NASA Astrophysics Data System (ADS)

    Golmohammadi, A.; Jafarpour, B.; M Khaninezhad, M. R.

    2017-12-01

    Calibration of heterogeneous subsurface flow models leads to ill-posed nonlinear inverse problems, where too many unknown parameters are estimated from limited response measurements. When the underlying parameters form complex (non-Gaussian) structured spatial connectivity patterns, classical variogram-based geostatistical techniques cannot describe the underlying connectivity patterns. Modern pattern-based geostatistical methods that incorporate higher-order spatial statistics are more suitable for describing such complex spatial patterns. Moreover, when the underlying unknown parameters are discrete (geologic facies distribution), conventional model calibration techniques that are designed for continuous parameters cannot be applied directly. In this paper, we introduce a novel pattern-based model calibration method to reconstruct discrete and spatially complex facies distributions from dynamic flow response data. To reproduce complex connectivity patterns during model calibration, we impose a feasibility constraint to ensure that the solution follows the expected higher-order spatial statistics. For model calibration, we adopt a regularized least-squares formulation, involving data mismatch, pattern connectivity, and feasibility constraint terms. Using an alternating directions optimization algorithm, the regularized objective function is divided into a continuous model calibration problem, followed by mapping the solution onto the feasible set. The feasibility constraint to honor the expected spatial statistics is implemented using a supervised machine learning algorithm. The two steps of the model calibration formulation are repeated until the convergence criterion is met. Several numerical examples are used to evaluate the performance of the developed method.
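
    A stripped-down sketch of the alternating-directions loop described above follows: a continuous least-squares update on the parameter field alternates with projection onto a feasible set. The linear toy forward operator and the simple two-facies thresholding used as the "feasibility" step are stand-ins for the paper's flow simulator and its learned higher-order-statistics constraint.

        # Alternate (1) gradient steps on the data-mismatch term with
        # (2) projection of the field onto a discrete facies set.
        import numpy as np

        rng = np.random.default_rng(0)
        n_cells, n_obs = 100, 30
        G = rng.normal(size=(n_obs, n_cells))                # toy forward operator
        m_true = (rng.random(n_cells) > 0.5).astype(float)   # discrete facies field
        d_obs = G @ m_true + 0.01 * rng.normal(size=n_obs)

        m = np.full(n_cells, 0.5)
        for outer in range(20):
            for _ in range(100):                  # continuous calibration step
                m -= 1e-3 * G.T @ (G @ m - d_obs)
            m = (m > 0.5).astype(float)           # feasibility (projection) step

        print("relative data misfit:",
              np.linalg.norm(G @ m - d_obs) / np.linalg.norm(d_obs))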

  19. 2.5D complex resistivity modeling and inversion using unstructured grids

    NASA Astrophysics Data System (ADS)

    Xu, Kaijun; Sun, Jie

    2016-04-01

    The complex resistivity characteristics of rocks and ores have long been recognized. Generally, the Cole-Cole model (CCM) is used to describe complex resistivity. It has been shown that the electrical anomaly of a geologic body can be quantitatively estimated from CCM parameters such as direct resistivity (ρ0), chargeability (m), time constant (τ) and frequency dependence (c). Thus it is very important to obtain the complex parameters of a geologic body. It is difficult to approximate complex structures and terrain using a traditional rectangular grid. In order to enhance the numerical accuracy and rationality of modeling and inversion, we use an adaptive finite-element algorithm for forward modeling of frequency-domain 2.5D complex resistivity and implement the conjugate gradient algorithm in the inversion of 2.5D complex resistivity. An adaptive finite-element method is applied to solve the 2.5D complex resistivity forward modeling of a horizontal electric dipole source. First, the CCM is introduced into Maxwell's equations to calculate the complex resistivity electromagnetic fields. Next, the pseudo-delta function is used to distribute the electric dipole source. Then the electromagnetic fields can be expressed in terms of the primary fields caused by the layered structure and the secondary fields caused by inhomogeneities of anomalous conductivity. Finally, we calculated the electromagnetic field response of complex geoelectric structures such as anticlines, synclines and faults. The modeling results show that adaptive finite-element methods can automatically improve mesh generation and simulate complex geoelectric models using unstructured grids. The 2.5D complex resistivity inversion is implemented based on the conjugate gradient algorithm. The conjugate gradient algorithm does not need to compute the sensitivity matrix explicitly but directly computes the product of the sensitivity matrix, or its transpose, with a vector. In addition, the inversion target zones are
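
    For reference, the Cole-Cole model named above has the closed form rho(omega) = rho0 * (1 - m * (1 - 1/(1 + (i*omega*tau)^c))); the sketch below evaluates it over frequency with illustrative parameter values, not values from the paper.

        # Complex resistivity from the Cole-Cole model; the magnitude falls from
        # rho0 at low frequency toward rho0*(1 - m) at high frequency.
        import numpy as np

        def cole_cole(freq_hz, rho0=100.0, m=0.3, tau=0.1, c=0.5):
            omega = 2 * np.pi * freq_hz
            return rho0 * (1 - m * (1 - 1 / (1 + (1j * omega * tau) ** c)))

        for f in np.logspace(-2, 4, 7):
            r = cole_cole(f)
            print(f"{f:10.2f} Hz   |rho| = {abs(r):7.2f}   "
                  f"phase = {np.angle(r, deg=True):7.2f} deg")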

  20. Complex versus simple models: ion-channel cardiac toxicity prediction.

    PubMed

    Mistry, Hitesh B

    2018-01-01

    There is growing interest in applying detailed mathematical models of the heart to ion-channel-related cardiac toxicity prediction. However, there is a debate as to whether such complex models are required. Here an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from the literature. Each compound was designated a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via leave-one-out cross-validation. Overall, the Bnet model performed as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest data-set. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.
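
    The leave-one-out protocol used for the comparison is simple to reproduce; the sketch below runs it with scikit-learn on synthetic ion-channel block features, with a generic linear classifier standing in for Bnet (the features, labels and model form are all assumptions).

        # Leave-one-out cross-validation of a simple linear model on toy
        # three-channel block data, mimicking the benchmarking protocol.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(3)
        X = rng.random((40, 3))                    # fractional block of 3 channels
        y = (X[:, 0] - 0.5 * X[:, 1] - 0.5 * X[:, 2] > 0).astype(int)  # toy labels

        scores = cross_val_score(LogisticRegression(), X, y, cv=LeaveOneOut())
        print("LOO accuracy:", scores.mean())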

  1. Reduced complexity modeling of Arctic delta dynamics

    NASA Astrophysics Data System (ADS)

    Piliouras, A.; Lauzon, R.; Rowland, J. C.

    2017-12-01

    How water and sediment are routed through deltas has important implications for our understanding of nutrient and sediment fluxes to the coastal ocean. These fluxes may be especially important in Arctic environments, because the Arctic ocean receives a disproportionately large amount of river discharge and high latitude regions are expected to be particularly vulnerable to climate change. The Arctic has some of the world's largest but least studied deltas. This lack of data is due to remote and hazardous conditions, sparse human populations, and limited remote sensing resources. In the absence of data, complex models may be of limited scientific utility in understanding Arctic delta dynamics. To overcome this challenge, we adapt the reduced complexity delta-building model DeltaRCM for Arctic environments to explore the influence of sea ice and permafrost on delta morphology and dynamics. We represent permafrost by increasing the threshold for sediment erosion, as permafrost has been found to increase cohesion and reduce channel migration rates. The presence of permafrost in the model results in the creation of more elongate channels, fewer active channels, and a rougher shoreline. We consider several effects of sea ice, including introducing friction which increases flow resistance, constriction of flow by landfast ice, and changes in effective water surface elevation. Flow constriction and increased friction from ice results in a rougher shoreline, more frequent channel switching, decreased channel migration rates, and enhanced deposition offshore of channel mouths. The reduced complexity nature of the model is ideal for generating a basic understanding of which processes unique to Arctic environments may have important effects on delta evolution, and it allows us to explore a variety of rules for incorporating those processes into the model to inform future Arctic delta modelling efforts. Finally, we plan to use the modeling results to determine how the presence

  2. Turing instability in reaction-diffusion models on complex networks

    NASA Astrophysics Data System (ADS)

    Ide, Yusuke; Izuhara, Hirofumi; Machida, Takuya

    2016-09-01

    In this paper, the Turing instability in reaction-diffusion models defined on complex networks is studied. Here, we focus on three types of models which generate complex networks, i.e. the Erdős-Rényi, the Watts-Strogatz, and the threshold network models. From analysis of the Laplacian matrices of graphs generated by these models, we numerically reveal that stable and unstable regions of a homogeneous steady state on the parameter space of two diffusion coefficients completely differ, depending on the network architecture. In addition, we theoretically discuss the stable and unstable regions in the cases of regular enhanced ring lattices which include regular circles, and networks generated by the threshold network model when the number of vertices is large enough.
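
    The analysis described above has a compact numerical core: on a network, the continuous wavenumber of classical Turing analysis is replaced by the eigenvalues of the graph Laplacian, and the homogeneous state is unstable when J - diag(Du, Dv) * Lambda_k has an eigenvalue with positive real part for some mode k > 0. The Jacobian and diffusion coefficients below form an illustrative activator-inhibitor example, not the paper's systems.

        # Count Turing-unstable modes from the Laplacian spectrum of a
        # Watts-Strogatz graph.
        import numpy as np
        import networkx as nx

        J = np.array([[1.0, -2.0],      # linearisation at the homogeneous state
                      [2.0, -3.0]])     # (stable without diffusion: tr<0, det>0)
        Du, Dv = 0.05, 1.0              # inhibitor diffuses much faster

        G = nx.watts_strogatz_graph(200, 6, 0.1, seed=5)
        lams = np.linalg.eigvalsh(nx.laplacian_matrix(G).toarray())

        growth = [np.linalg.eigvals(J - np.diag([Du, Dv]) * lam).real.max()
                  for lam in lams]
        print("unstable modes:", sum(g > 1e-9 for g in growth), "of", len(lams))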

  3. On the Complexity of Item Response Theory Models.

    PubMed

    Bonifay, Wes; Cai, Li

    2017-01-01

    Complexity in item response theory (IRT) has traditionally been quantified by simply counting the number of freely estimated parameters in the model. However, complexity is also contingent upon the functional form of the model. We examined four popular IRT models (exploratory factor analytic, bifactor, DINA, and DINO) with different functional forms but the same number of free parameters. In comparison, a simpler (unidimensional 3PL) model was specified such that it had 1 more parameter than the previous models. All models were then evaluated according to the minimum description length principle. Specifically, each model was fit to 1,000 data sets that were randomly and uniformly sampled from the complete data space and then assessed using global and item-level fit and diagnostic measures. The findings revealed that the factor analytic and bifactor models possess a strong tendency to fit any possible data. The unidimensional 3PL model displayed minimal fitting propensity, despite the fact that it included an additional free parameter. The DINA and DINO models did not demonstrate a proclivity to fit any possible data, but they did fit well to distinct data patterns. Applied researchers and psychometricians should therefore consider functional form, and not goodness-of-fit alone, when selecting an IRT model.

  4. Epidemic threshold of the susceptible-infected-susceptible model on complex networks

    NASA Astrophysics Data System (ADS)

    Lee, Hyun Keun; Shim, Pyoung-Seop; Noh, Jae Dong

    2013-06-01

    We demonstrate that the susceptible-infected-susceptible (SIS) model on complex networks can have an inactive Griffiths phase characterized by a slow relaxation dynamics. It contrasts with the mean-field theoretical prediction that the SIS model on complex networks is active at any nonzero infection rate. The dynamic fluctuation of infected nodes, ignored in the mean field approach, is responsible for the inactive phase. It is proposed that the question whether the epidemic threshold of the SIS model on complex networks is zero or not can be resolved by the percolation threshold in a model where nodes are occupied in degree-descending order. Our arguments are supported by the numerical studies on scale-free network models.
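
    A brute-force way to probe the threshold question raised above is direct simulation: sweep the infection rate on a scale-free network and record the stationary infected fraction. The sketch below uses synchronous discrete-time updating with a small time step, which is a simplification of the continuous-time SIS dynamics, and illustrative parameters.

        # Stationary infected fraction of the SIS model vs. infection rate.
        import random
        import networkx as nx

        def sis_fraction(G, beta, mu=1.0, dt=0.1, steps=500, seed=0):
            rng = random.Random(seed)
            infected = {v for v in G if rng.random() < 0.1}   # 10% initial seed
            for _ in range(steps):
                nxt = set()
                for v in G:
                    if v in infected:
                        if rng.random() > mu * dt:            # fails to recover
                            nxt.add(v)
                    else:
                        k = sum(u in infected for u in G[v])  # infected neighbours
                        if k and rng.random() < 1 - (1 - beta * dt) ** k:
                            nxt.add(v)
                infected = nxt
                if not infected:                              # absorbing state hit
                    break
            return len(infected) / G.number_of_nodes()

        G = nx.barabasi_albert_graph(2000, 3, seed=1)
        for beta in (0.02, 0.05, 0.1, 0.2):
            print(beta, sis_fraction(G, beta))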

  5. An evaluation of sampling methods and supporting techniques for tackling lead in drinking water in Alberta Province

    EPA Science Inventory

    A collaborative project commenced in August 2013 with the aim of demonstrating a range of techniques that can be used in tackling the problems of lead in drinking water. The main project was completed in March 2014, with supplementary sampling exercises in mid-2014. It involved t...

  6. Can Smoking Cessation Services Be Better Targeted to Tackle Health Inequalities? Evidence from a Cross-Sectional Study

    ERIC Educational Resources Information Center

    Blackman, Tim

    2008-01-01

    Objective: To investigate how smoking cessation services could be more effectively targeted to tackle socioeconomic inequalities in health. Design: Secondary analysis of data from a household interview survey undertaken for Middlesbrough Council in north east England using the technique of Qualitative Comparative Analysis. Setting: Home-based…

  7. A Systematic Review of Conceptual Frameworks of Medical Complexity and New Model Development.

    PubMed

    Zullig, Leah L; Whitson, Heather E; Hastings, Susan N; Beadles, Chris; Kravchenko, Julia; Akushevich, Igor; Maciejewski, Matthew L

    2016-03-01

    Patient complexity is often operationalized by counting multiple chronic conditions (MCC) without considering contextual factors that can affect patient risk for adverse outcomes. Our objective was to develop a conceptual model of complexity addressing gaps identified in a review of published conceptual models. We searched for English-language MEDLINE papers published between 1 January 2004 and 16 January 2014. Two reviewers independently evaluated abstracts and all authors contributed to the development of the conceptual model in an iterative process. From 1606 identified abstracts, six conceptual models were selected. One additional model was identified through reference review. Each model had strengths, but several constructs were not fully considered: 1) contextual factors; 2) dynamics of complexity; 3) patients' preferences; 4) acute health shocks; and 5) resilience. Our Cycle of Complexity model illustrates relationships between acute shocks and medical events, healthcare access and utilization, workload and capacity, and patient preferences in the context of interpersonal, organizational, and community factors. This model may inform studies on the etiology of and changes in complexity, the relationship between complexity and patient outcomes, and intervention development to improve modifiable elements of complex patients.

  8. Industry Responsibilities in Tackling Direct-to-Consumer Marketing of Unproven Stem Cell Treatments.

    PubMed

    Master, Z; Fu, W; Paciulli, D; Sipp, D

    2017-08-01

    The direct-to-consumer marketing of unproven stem cell interventions (SCIs) is a serious public health concern. Regulations and education have had modest impact, indicating that different actors must play a role to stop this unfettered market. We consider the role of the biotech industry in tackling unproven SCIs. Grounded in the concept of corporate social responsibility, we argue that biotech companies should screen consumers to ensure that products and services are being used appropriately and educate employees about unproven SCIs. © 2017 ASCPT.

  9. A modelling tool for policy analysis to support the design of efficient and effective policy responses for complex public health problems.

    PubMed

    Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew

    2015-03-03

    In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools for adequately informing prevention policy and to illustrate the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can reasonably be left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems…

  10. Tackling unsafe abortion in Mauritius.

    PubMed

    Nyong'o, D; Oodit, G

    1996-01-01

    Despite a contraceptive prevalence rate of 75%, Mauritius has a high incidence of unsafe abortion because of unprotected intercourse among many young women in a rapidly industrializing environment. The Mauritius Family Planning Association (MFPA) took up the issue of unsafe abortion in 1993. Abortion is illegal in the country, and the Catholic Church strongly opposes modern family planning methods, so the use of withdrawal and/or calendar methods has been increasing. The MFPA organized an advocacy symposium on unsafe abortion in 1993, which revealed the pressure the Church was exerting with respect to abortion and contraceptives. The MFPA's advocacy campaign aims to have abortion legalized on health grounds and to improve family planning services, especially for young unmarried women and men. The full support of the media was secured on the abortion issue: articles appeared, the press attended meetings, and public relations support was provided. The MFPA also worked closely with parliamentarians. A motion calling for legalization of abortion on health grounds was tabled in the National Assembly in 1994, but the Church squelched its debate. In March 1994 the MFPA hosted the IPPF African Regional Conference on Unsafe Abortion in Mauritius, with over 100 representatives from 20 countries participating, and subsequently a second motion was tabled, again without parliamentary debate. The deliberations were covered by the media, and the Ministry of Women's Rights recognized abortion as an urgent issue in a white paper prepared for the Fourth World Conference on Women held in Beijing in 1995. The campaign changed the policy climate favorably, making the public more conscious of unsafe abortion. The Ministry of Health decided to collect more data, and the newly elected government seems more open on this issue.

  11. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations.

    PubMed

    Hu, Eric Y; Bouteiller, Jean-Marie C; Song, Dong; Baudry, Michel; Berger, Theodore W

    2015-01-01

    Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, such representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model, capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model successfully tracks the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model can efficiently replicate the complex nonlinear dynamics represented in the original mechanistic model, and they provide a method for replicating complex and diverse synaptic transmission within neuron network simulations.
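
    The underlying representation is a Volterra functional power series; a discrete second-order version can be sketched directly. The kernels below are placeholders (the paper estimates its kernels from the mechanistic synapse model):

        import numpy as np

        def volterra_response(x, h0, h1, h2):
            # Discrete second-order Volterra series:
            # y[n] = h0 + sum_i h1[i]*x[n-i] + sum_{i,j} h2[i,j]*x[n-i]*x[n-j]
            M = len(h1)
            y = np.full(len(x), h0, dtype=float)
            xp = np.concatenate([np.zeros(M - 1), x])      # zero-padded history
            for n in range(len(x)):
                w = xp[n:n + M][::-1]                      # x[n], x[n-1], ...
                y[n] += h1 @ w + w @ h2 @ w
            return y

        # Placeholder kernels (illustrative only): exponential first-order
        # response plus a weak depressive second-order interaction.
        M = 10
        h1 = np.exp(-np.arange(M) / 3.0)
        h2 = -0.05 * np.outer(h1, h1)

        spikes = (np.random.default_rng(0).random(200) < 0.1).astype(float)
        epsc = volterra_response(spikes, h0=0.0, h1=h1, h2=h2)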

  12. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations

    PubMed Central

    Hu, Eric Y.; Bouteiller, Jean-Marie C.; Song, Dong; Baudry, Michel; Berger, Theodore W.

    2015-01-01

    Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, such representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model, capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model successfully tracks the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model can efficiently replicate the complex nonlinear dynamics represented in the original mechanistic model, and they provide a method for replicating complex and diverse synaptic transmission within neuron network simulations. PMID:26441622

  13. Modeling of Complex Mixtures: JP-8 Toxicokinetics

    DTIC Science & Technology

    2008-10-01

    generic tissue compartments in which we have combined diffusion limitation and deep tissue (global tissue model). We also applied a QSAR approach for...SUBJECT TERMS jet fuel, JP-8, PBPK modeling, complex mixtures, nonane, decane, naphthalene, QSAR , alternative fuels 16. SECURITY CLASSIFICATION OF...necessary, to apply to the interaction of specific compounds with specific tissues. We have also applied a QSAR approach for estimating blood and tissue

  14. Tackling maize fusariosis: in search of Fusarium graminearum biosuppressors.

    PubMed

    Adeniji, Adetomiwa Ayodele; Babalola, Olubukola Oluranti

    2018-06-22

    This review presents biocontrol agents employed to alleviate the deleterious effects of the pathogen Fusarium graminearum on maize. Control of this mycotoxigenic phytopathogen remains elusive despite the elaborate research conducted on its detection, identification, and molecular fingerprinting. This could be attributed to the fact that in vitro and greenhouse biocontrol studies on F. graminearum have far exceeded the number of field studies. Furthermore, beyond the variance seen among F. graminearum-suppressing biocontrol strains, it is also clear that the majority of research on tackling F. graminearum outbreaks has involved wheat and barley cultivars. Most fusariosis management in maize has targeted other members of Fusarium, such as Fusarium verticillioides, with biocontrol strains from the genera Bacillus and Pseudomonas being used frequently in the experiments. We highlight current techniques relevant to identifying an effective biofungicide for maize fusariosis and recommend alternative approaches to reduce the scarcity of data from indigenous maize field trials.

  15. Hierarchical Model for the Evolution of Cloud Complexes

    NASA Astrophysics Data System (ADS)

    Sánchez D., Néstor M.; Parravano, Antonio

    1999-01-01

    The structure of cloud complexes appears to be well described by a tree structure (i.e., a simplified "stick man") representation when the image is partitioned into "clouds." In this representation, the parent-child relationships are assigned according to containment. Based on this picture, a hierarchical model for the evolution of cloud complexes, including star formation, is constructed. The model follows the mass evolution of each substructure by computing its mass exchange with its parent and children. The parent-child mass exchange (evaporation or condensation) depends on the radiation density at the interface. At the end of the "lineage," stars may be born or die, so that there is a nonstationary mass flow in the hierarchical structure. For a variety of parameter sets the system follows the same series of steps to transform diffuse gas into stars, and the regulation of the mass flux in the tree by previously formed stars dominates the evolution of the star formation. For the set of parameters used here as a reference model, the system tends to produce initial mass functions (IMFs) that have a maximum at a mass that is too high (~2 Msolar), and the characteristic times for evolution seem too long. We show that these undesired properties can be improved by adjusting the model parameters. The model requires further physics (e.g., allowing for multiple stellar systems and clump collisions) before a definitive comparison with observations can be made. Instead, the emphasis here is to illustrate some general properties of this kind of complex nonlinear model of the star formation process. Notwithstanding the simplifications involved, the model reveals an essential feature that will likely remain if additional physical processes are included: the detailed behavior of the system is very sensitive to variations in the initial and external conditions, suggesting that a "universal" IMF is very unlikely. When an ensemble of IMFs corresponding to a…

  16. Computer models of complex multiloop branched pipeline systems

    NASA Astrophysics Data System (ADS)

    Kudinov, I. V.; Kolesnikov, S. V.; Eremin, A. V.; Branfileva, A. N.

    2013-11-01

    This paper describes the principal theoretical concepts of the method used for constructing computer models of complex multiloop branched pipeline networks; the method is based on graph theory and Kirchhoff's two laws for electrical circuits. The models make it possible to calculate velocities, flow rates, and pressures of a fluid medium in any section of a pipeline network treated as a single hydraulic system. On the basis of multivariant calculations, the reasons for existing problems can be identified, the least costly methods of eliminating them can be proposed, and recommendations can be made for planning the modernization of pipeline systems and the construction of new sections. The results obtained can be applied to complex pipeline systems intended for various purposes (water pipelines, petroleum pipelines, etc.). The operability of the model has been verified by constructing a unified computer model of the heat network for the centralized heat supply of the city of Samara.
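
    Under the electrical analogy described here, a linearized (laminar) pipe network maps onto a resistor network: Kirchhoff's current law at every node plus a linear pressure-flow relation per pipe gives a sparse linear system. A toy sketch under that linear assumption (real networks require a nonlinear head-loss relation and iteration):

        import numpy as np
        import networkx as nx

        # Toy branched network; each pipe has a hydraulic conductance C
        # (flow = C * pressure drop), the analogue of electrical conductance.
        G = nx.Graph()
        G.add_weighted_edges_from(
            [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 1.5), (2, 4, 1.0), (3, 4, 2.0)],
            weight="C",
        )

        n = G.number_of_nodes()
        L = np.zeros((n, n))                 # weighted graph Laplacian
        for u, v, d in G.edges(data=True):
            c = d["C"]
            L[u, u] += c; L[v, v] += c
            L[u, v] -= c; L[v, u] -= c

        q = np.zeros(n)
        q[0], q[4] = 1.0, -1.0               # inject at node 0, withdraw at node 4

        P = np.zeros(n)                      # node 4 is the pressure reference
        P[:4] = np.linalg.solve(L[:4, :4], q[:4])

        for u, v, d in G.edges(data=True):
            print(f"pipe {u}-{v}: flow {d['C'] * (P[u] - P[v]):+.3f}")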

  17. Modeling the propagation of mobile malware on complex networks

    NASA Astrophysics Data System (ADS)

    Liu, Wanping; Liu, Chao; Yang, Zheng; Liu, Xiaoyang; Zhang, Yihao; Wei, Zuxue

    2016-08-01

    In this paper, the spreading behavior of malware across mobile devices is addressed. By introducing complex networks that follow a power-law degree distribution to model mobile networks, a novel epidemic model for mobile malware propagation is proposed. The spreading threshold that governs the dynamics of the model is calculated. Theoretically, the asymptotic stability of the malware-free equilibrium is confirmed when the threshold is below unity, and global stability is further proved under some sufficient conditions. The influences of the model parameters and of the network topology on malware propagation are also analyzed. Our theoretical studies and numerical simulations show that networks with higher heterogeneity are conducive to the diffusion of malware, and that complex networks with lower power-law exponents favor malware spreading.
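
    The claim that lower power-law exponents favor spreading can be illustrated with the generic heterogeneous mean-field threshold λ_c = ⟨k⟩/⟨k²⟩ (a standard result used here for illustration; it is not necessarily the specific threshold derived in the paper):

        import numpy as np

        def mean_field_threshold(gamma, k_min=2, k_max=1000):
            # Spreading threshold <k>/<k^2> for a degree distribution
            # P(k) ~ k^(-gamma) on k = k_min..k_max.
            k = np.arange(k_min, k_max + 1, dtype=float)
            p = k**-gamma
            p /= p.sum()
            return (p * k).sum() / (p * k**2).sum()

        for gamma in (2.1, 2.5, 3.0, 3.5):
            print(f"gamma = {gamma}: threshold = {mean_field_threshold(gamma):.4f}")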

  18. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
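
    The object-oriented style described here can be conveyed with a toy receptor-ligand simulation in which each object carries its own state and behaviour (the kinetic parameters below are made up for illustration):

        import random

        class Ligand:
            def __init__(self):
                self.bound = False

        class Receptor:
            # Each object holds its own state; interactions between objects
            # drive the simulation and let aggregate behaviour emerge.
            def __init__(self, p_bind=0.1, p_release=0.02):
                self.ligand = None
                self.p_bind, self.p_release = p_bind, p_release

            def step(self, free_ligands):
                if self.ligand is None:
                    if free_ligands and random.random() < self.p_bind:
                        self.ligand = free_ligands.pop()
                        self.ligand.bound = True
                elif random.random() < self.p_release:
                    self.ligand.bound = False
                    free_ligands.append(self.ligand)
                    self.ligand = None

        random.seed(0)
        receptors = [Receptor() for _ in range(50)]
        free = [Ligand() for _ in range(100)]
        for t in range(500):
            for r in receptors:
                r.step(free)
        print("occupied receptors:", sum(r.ligand is not None for r in receptors))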

  19. Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.

    PubMed

    Haimes, Yacov Y

    2018-01-01

    The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.

  20. Caught in the (Education) Act: Tackling Michael Gove's Education Revolution. Report on 19th November 2011 Conference

    ERIC Educational Resources Information Center

    FORUM: for promoting 3-19 comprehensive education, 2012

    2012-01-01

    A number of significant campaigning organisations and education trades unions--the Anti-Academies Alliance, CASE, Comprehensive Future, Forum, ISCG and the Socialist Educational Association, along with ASCL, ATL, NASUWT and NUT--staged a conference in London on 19 November 2011, with the title 'Caught in the (Education) Act: tackling Michael…

  1. Modeling Electromagnetic Scattering From Complex Inhomogeneous Objects

    NASA Technical Reports Server (NTRS)

    Deshpande, Manohar; Reddy, C. J.

    2011-01-01

    This software innovation is designed to develop a mathematical formulation to estimate the electromagnetic scattering characteristics of complex, inhomogeneous objects using the finite-element-method (FEM) and method-of-moments (MoM) concepts, as well as to develop a FORTRAN code called FEMOM3DS (Finite Element Method and Method of Moments for 3-Dimensional Scattering), which will implement the steps that are described in the mathematical formulation. Very complex objects can be easily modeled, and the operator of the code is not required to know the details of electromagnetic theory to study electromagnetic scattering.

  2. Modelling the evolution of complex conductivity during calcite precipitation on glass beads

    NASA Astrophysics Data System (ADS)

    Leroy, Philippe; Li, Shuai; Jougnot, Damien; Revil, André; Wu, Yuxin

    2017-04-01

    When pH and alkalinity increase, calcite frequently precipitates and hence modifies the petrophysical properties of porous media. The complex conductivity method can be used to directly monitor calcite precipitation in porous media because it is sensitive to the evolution of the mineralogy, pore structure and its connectivity. We have developed a mechanistic grain polarization model considering the electrochemical polarization of the Stern and diffuse layers surrounding calcite particles. Our complex conductivity model depends on the surface charge density of the Stern layer and on the electrical potential at the onset of the diffuse layer, which are computed using a basic Stern model of the calcite/water interface. The complex conductivity measurements of Wu et al. on a column packed with glass beads in which calcite precipitation occurs are reproduced by our surface complexation and complex conductivity models. The evolution of the size and shape of calcite particles during the precipitation experiment is estimated by our complex conductivity model. At the early stage of the experiment, modelled particle sizes increase and calcite particles flatten with time, because calcite crystals nucleate at the surface of glass beads and grow into larger calcite grains. At the later stage, modelled sizes and cementation exponents of calcite particles decrease with time, because large calcite grains aggregate over multiple glass beads and only small calcite crystals polarize.

  3. Modeling of Wall-Bounded Complex Flows and Free Shear Flows

    NASA Technical Reports Server (NTRS)

    Shih, Tsan-Hsing; Zhu, Jiang; Lumley, John L.

    1994-01-01

    Various wall-bounded flows with complex geometries and free shear flows have been studied with a newly developed realizable Reynolds stress algebraic equation model. The model development is based on invariant theory in continuum mechanics, which enables us to formulate a general constitutive relation for the Reynolds stresses. Pope was the first to introduce this kind of constitutive relation to turbulence modeling. In our study, realizability is imposed on the truncated constitutive relation to determine the coefficients so that, unlike the standard k-ε eddy viscosity model, the present model will not produce negative normal stresses in any situation of rapid distortion. Calculations based on the present model have shown encouraging success in modeling complex turbulent flows.

  4. Population-reaction model and microbial experimental ecosystems for understanding hierarchical dynamics of ecosystems.

    PubMed

    Hosoda, Kazufumi; Tsuda, Soichiro; Kadowaki, Kohmei; Nakamura, Yutaka; Nakano, Tadashi; Ishii, Kojiro

    2016-02-01

    Understanding ecosystem dynamics is crucial as contemporary human societies face ecosystem degradation. One of the challenges that needs to be recognized is the complex hierarchical dynamics. Conventional dynamic models in ecology often represent only the population level and have yet to include the dynamics of the sub-organism level, which makes an ecosystem a complex adaptive system that shows characteristic behaviors such as resilience and regime shifts. The sub-organism level has likely been neglected in conventional dynamic models because integrating multiple hierarchical levels makes the models unnecessarily complex unless supporting experimental data are present. Now that large amounts of molecular and ecological data are increasingly accessible in microbial experimental ecosystems, it is worthwhile to tackle questions of their complex hierarchical dynamics. Here, we propose an approach that combines microbial experimental ecosystems with a hierarchical dynamic model named the population-reaction model. We present a simple microbial experimental ecosystem as an example and show how the system can be analyzed with a population-reaction model. We also show that population-reaction models can be applied to various ecological concepts, such as predator-prey interactions, climate change, evolution, and the stability of diversity. Our approach will reveal a path to the general understanding of various ecosystems and organisms. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
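
    The flavour of a population-reaction model, population-level equations coupled to sub-organism reaction variables, can be sketched schematically (our illustration of the modelling style, not the authors' equations):

        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, y):
            # N1: prey population, N2: predator population,
            # r: intracellular resource of the prey (sub-organism level).
            N1, N2, r = y
            growth = 1.5 * r / (0.5 + r)       # population rate set by cell state
            dN1 = growth * N1 - 0.8 * N1 * N2
            dN2 = 0.4 * N1 * N2 - 0.6 * N2
            dr = 1.0 - r - 0.3 * growth        # resource supply minus consumption
            return [dN1, dN2, dr]

        sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.5, 0.8], max_step=0.1)
        print("final state:", sol.y[:, -1])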

  5. GalaxyRefineComplex: Refinement of protein-protein complex model structures driven by interface repacking.

    PubMed

    Heo, Lim; Lee, Hasup; Seok, Chaok

    2016-08-18

    Protein-protein docking methods have been widely used to gain an atomic-level understanding of protein interactions. However, docking methods that employ low-resolution energy functions are popular because of computational efficiency. Low-resolution docking tends to generate protein complex structures that are not fully optimized. GalaxyRefineComplex takes such low-resolution docking structures and refines them to improve model accuracy in terms of both interface contact and inter-protein orientation. This refinement method allows flexibility at the protein interface and in the overall docking structure to capture conformational changes that occur upon binding. Symmetric refinement is also provided for symmetric homo-complexes. This method was validated by refining models produced by available docking programs, including ZDOCK and M-ZDOCK, and was successfully applied to CAPRI targets in a blind fashion. An example of using the refinement method with an existing docking method for ligand binding mode prediction of a drug target is also presented. A web server that implements the method is freely available at http://galaxy.seoklab.org/refinecomplex.

  6. Application of surface complexation models to anion adsorption by natural materials.

    PubMed

    Goldberg, Sabine

    2014-10-01

    Various chemical models of ion adsorption are presented and discussed. Chemical models, such as surface complexation models, provide a molecular description of anion adsorption reactions using an equilibrium approach. Two such models, the constant capacitance model and the triple layer model, are described in the present study. Characteristics common to all the surface complexation models are equilibrium constant expressions, mass and charge balances, and surface activity coefficient electrostatic potential terms. Methods for determining parameter values for surface site density, capacitances, and surface complexation constants also are discussed. Spectroscopic experimental methods of establishing ion adsorption mechanisms include vibrational spectroscopy, nuclear magnetic resonance spectroscopy, electron spin resonance spectroscopy, X-ray absorption spectroscopy, and X-ray reflectivity. Experimental determinations of point-of-zero-charge shifts, the ionic strength dependence of adsorption results, and molecular modeling calculations also can be used to deduce adsorption mechanisms. Applications of the surface complexation models to heterogeneous natural materials, such as soils, using the component additivity and the generalized composite approaches are described. Emphasis is on the generalized composite approach for predicting anion adsorption by soils. Continuing research is needed to develop consistent and realistic protocols for describing ion adsorption reactions on soil minerals and soils. The availability of standardized model parameter databases for use in chemical speciation-transport models is critical. Published 2014 Wiley Periodicals Inc. on behalf of SETAC. This article is a US Government work and, as such, is in the public domain in the United States of America.

  7. Evidence for complex contagion models of social contagion from observational data

    PubMed Central

    Sprague, Daniel A.

    2017-01-01

    Social influence can lead to behavioural ‘fads’ that are briefly popular and quickly die out. Various models have been proposed for these phenomena, but empirical evidence of their accuracy as real-world predictive tools has so far been absent. Here we find that a ‘complex contagion’ model accurately describes the spread of behaviours driven by online sharing. We found that standard, ‘simple’, contagion often fails to capture both the rapid spread and the long tails of popularity seen in real fads, where our complex contagion model succeeds. Complex contagion also has predictive power: it successfully predicted the peak time and duration of the ALS Ice Bucket Challenge. The fast spread and longer duration of fads driven by complex contagion have important implications for activities such as publicity campaigns and charity drives. PMID:28686719

  8. A multi-element cosmological model with a complex space-time topology

    NASA Astrophysics Data System (ADS)

    Kardashev, N. S.; Lipatova, L. N.; Novikov, I. D.; Shatskiy, A. A.

    2015-02-01

    Wormhole models with a complex topology having one entrance and two exits into the same space-time of another universe are considered, as well as models with two entrances from the same space-time and one exit to another universe. These models are used to build a model of a multi-sheeted universe (a multi-element model of the "Multiverse") with a complex topology. Spherical symmetry is assumed in all the models. A Reissner-Nordström black-hole model having no singularity beyond the horizon is constructed. The strength of the central singularity of the black hole is analyzed.

  9. Modeling the assembly order of multimeric heteroprotein complexes

    PubMed Central

    Esquivel-Rodriguez, Juan; Terashi, Genki; Christoffer, Charles; Shin, Woong-Hee

    2018-01-01

    Protein-protein interactions are the cornerstone of numerous biological processes. Although an increasing number of protein complex structures have been determined using experimental methods, relatively few studies have been performed to determine the assembly order of complexes. In addition to the insights into the molecular mechanisms of biological function provided by the structure of a complex, knowing the assembly order is important for understanding the process of complex formation. Assembly order is also practically useful for constructing subcomplexes as a step toward solving the entire complex experimentally, designing artificial protein complexes, and developing drugs that interrupt a critical step in the complex assembly. There are several experimental methods for determining the assembly order of complexes; however, these techniques are resource-intensive. Here, we present a computational method that predicts the assembly order of protein complexes by building the complex structure. The method, named Path-LZerD, uses a multimeric protein docking algorithm that assembles a protein complex structure from individual subunit structures and predicts assembly order by observing the simulated assembly process of the complex. Benchmarked on a dataset of complexes with experimental evidence of assembly order, Path-LZerD was successful in predicting the assembly pathway for the majority of the cases. Moreover, when compared with a simple approach that infers the assembly path from the buried surface area of subunits in the native complex, Path-LZerD has the strong advantage that it can be used for cases where the complex structure is not known. The path prediction accuracy decreased when starting from unbound monomers, particularly for larger complexes of five or more subunits, for which only a part of the assembly path was correctly identified. As the first method of its kind, Path-LZerD opens a new area of computational protein structure modeling and will be…

  10. Modeling the assembly order of multimeric heteroprotein complexes.

    PubMed

    Peterson, Lenna X; Togawa, Yoichiro; Esquivel-Rodriguez, Juan; Terashi, Genki; Christoffer, Charles; Roy, Amitava; Shin, Woong-Hee; Kihara, Daisuke

    2018-01-01

    Protein-protein interactions are the cornerstone of numerous biological processes. Although an increasing number of protein complex structures have been determined using experimental methods, relatively few studies have been performed to determine the assembly order of complexes. In addition to the insights into the molecular mechanisms of biological function provided by the structure of a complex, knowing the assembly order is important for understanding the process of complex formation. Assembly order is also practically useful for constructing subcomplexes as a step toward solving the entire complex experimentally, designing artificial protein complexes, and developing drugs that interrupt a critical step in the complex assembly. There are several experimental methods for determining the assembly order of complexes; however, these techniques are resource-intensive. Here, we present a computational method that predicts the assembly order of protein complexes by building the complex structure. The method, named Path-LZerD, uses a multimeric protein docking algorithm that assembles a protein complex structure from individual subunit structures and predicts assembly order by observing the simulated assembly process of the complex. Benchmarked on a dataset of complexes with experimental evidence of assembly order, Path-LZerD was successful in predicting the assembly pathway for the majority of the cases. Moreover, when compared with a simple approach that infers the assembly path from the buried surface area of subunits in the native complex, Path-LZerD has the strong advantage that it can be used for cases where the complex structure is not known. The path prediction accuracy decreased when starting from unbound monomers, particularly for larger complexes of five or more subunits, for which only a part of the assembly path was correctly identified. As the first method of its kind, Path-LZerD opens a new area of computational protein structure modeling and will be…

  11. Local synchronization of a complex network model.

    PubMed

    Yu, Wenwu; Cao, Jinde; Chen, Guanrong; Lü, Jinhu; Han, Jian; Wei, Wei

    2009-02-01

    This paper introduces a novel complex network model to evaluate the reputation of virtual organizations. Using the Lyapunov function and linear matrix inequality approaches, the local synchronization of the proposed model is investigated. Here, local synchronization means inner synchronization within a group, not synchronization between different groups. Several sufficient conditions are derived to ensure the local synchronization of the proposed network model. Finally, several representative examples are given to show the effectiveness of the proposed methods and theories.

  12. The Use of Behavior Models for Predicting Complex Operations

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2010-01-01

    Modeling and simulation (M&S) plays an important role when complex human-system notions are being proposed, developed, and tested within the system design process. The National Aeronautics and Space Administration (NASA) uses many different types of M&S approaches for predicting human-system interactions, especially early in the development phase of a conceptual design. NASA Ames Research Center possesses a number of M&S capabilities, including airflow, flight-path, aircraft, scheduling, human performance (HPM), and bioinformatics models, among a host of other M&S capabilities used for predicting whether proposed designs will satisfy the specific mission criteria. The Man-Machine Integration Design and Analysis System (MIDAS) is a NASA ARC HPM software tool that integrates many models of human behavior with environment models, equipment models, and procedural/task models. The challenge to model comprehensibility grows as the number of integrated models and the requisite fidelity of the procedural sets increase. Model transparency is needed for some of the more complex HPMs to maintain comprehensibility of the integrated model's performance. This is exemplified in a recent MIDAS v5 application model, and plans for future model refinements are presented.

  13. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  14. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    PubMed

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of seventeen such objects is represented with more than 80 site-specific parameters, about 22 of which are time-dependent, resulting in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding of the more complex modelling approaches by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions, and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown to be capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions. In light of…

  15. Redesigning primary care to tackle the global epidemic of noncommunicable disease.

    PubMed

    Kruk, Margaret E; Nigenda, Gustavo; Knaul, Felicia M

    2015-03-01

    Noncommunicable diseases (NCDs) have become the major contributors to death and disability worldwide. Nearly 80% of the deaths in 2010 occurred in low- and middle-income countries, which have experienced rapid population aging, urbanization, rise in smoking, and changes in diet and activity. Yet the health systems of low- and middle-income countries, historically oriented to infectious disease and often severely underfunded, are poorly prepared for the challenge of caring for people with cardiovascular disease, diabetes, cancer, and chronic respiratory disease. We have discussed how primary care can be redesigned to tackle the challenge of NCDs in resource-constrained countries. We suggest that four changes will be required: integration of services, innovative service delivery, a focus on patients and communities, and adoption of new technologies for communication.

  16. Molecular modeling of the neurophysin I/oxytocin complex

    NASA Astrophysics Data System (ADS)

    Kazmierkiewicz, R.; Czaplewski, C.; Lammek, B.; Ciarkowski, J.

    1997-01-01

    Neurophysins I and II (NPI and NPII) act in the neurosecretory granules as carrier proteins for the neurophyseal hormones oxytocin (OT) and vasopressin (VP), respectively. The NPI/OT functional unit, believed to be an (NPI/OT)2 heterotetramer, was modeled using low-resolution structure information, viz. the Cα carbon atom coordinates of the homologous NPII/dipeptide complex (file 1BN2 in the Brookhaven Protein Databank) as a template. Its all-atom representation was obtained using standard modeling tools available within the INSIGHT/Biopolymer modules supplied by Biosym Technologies Inc. A conformation of the NPI-bound OT, similar to that recently proposed in a transfer NOE experiment, was docked into the ligand-binding site by a superposition of its Cys1-Tyr2 fragment onto the equivalent portion of the dipeptide in the template. The starting complex for the initial refinements was prepared by two alternative strategies, termed Model I and Model II, each ending with a ~100 ps molecular dynamics (MD) simulation in water using the AMBER 4.1 force field. The free homodimer NPI2 was obtained by removal of the two OT subunits from their sites, followed by a similar structure refinement. The use of Model I, consisting of a constrained simulated annealing, resulted in a structure remarkably similar to both the NPII/dipeptide complex and a recently published solid-state structure of the NPII/OT complex. Thus, Model I is recommended as the method of choice for the preparation of the starting all-atom data for MD. The MD simulations indicate that, both in the homodimer and in the heterotetramer, the 3₁₀-helices demonstrate an increased mobility relative to the remaining body of the protein. Also, the C-terminal domains in the NPI2 homodimer are more mobile than the N-terminal ones. Finally, a distinct intermonomer interaction is identified, concentrated around its most prominent, although not unique, contribution provided by an H-bond from Ser25 Oγ in one NPI unit to Glu81 Oɛ in the other…

  17. Alpha-synuclein mitochondrial interaction leads to irreversible translocation and complex I impairment.

    PubMed

    Martínez, Jimena H; Fuentes, Federico; Vanasco, Virginia; Alvarez, Silvia; Alaimo, Agustina; Cassina, Adriana; Coluccio Leskow, Federico; Velazquez, Francisco

    2018-08-01

    α-synuclein is involved in both familial and sporadic Parkinson's disease. Although its interaction with mitochondria has been well documented, several aspects remain unknown or under debate, such as the specific sub-mitochondrial localization or the dynamics of the interaction. It has been suggested that α-synuclein could interact only with ER-associated mitochondria. The wide variety of model systems and experimental conditions makes it difficult to compare results and draw definitive conclusions. Here we tackle this by analyzing, in a simplified system, the interaction between purified α-synuclein and isolated rat brain mitochondria. This work shows that wild-type α-synuclein interacts with isolated mitochondria and translocates into the mitochondrial matrix. This interaction and the irreversibility of α-synuclein translocation depend on incubation time and α-synuclein concentration. FRET experiments show that α-synuclein localizes close to components of the TOM complex, suggesting passive transport of α-synuclein through the outer membrane. In addition, α-synuclein binding alters mitochondrial function at the level of Complex I, leading to a decrease in ATP synthesis and an increase in ROS production. Copyright © 2018. Published by Elsevier Inc.

  18. New approaches in agent-based modeling of complex financial systems

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2017-12-01

    Agent-based modeling is a powerful simulation technique for understanding the collective behavior and microscopic interactions in complex financial systems. Recently, it has been suggested that the key parameters of agent-based models be determined from empirical data rather than set artificially. We first review several agent-based models and the new approaches for determining the key model parameters from historical market data. Based on agents' behaviors with heterogeneous personal preferences and interactions, these models succeed in explaining the microscopic origin of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet-query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.
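
    As a flavour of the approach, a minimal Kirman-style herding sketch (a generic imitation model used for illustration; the reviewed models calibrate agents' interactions and preferences from market and internet-query data):

        import numpy as np

        rng = np.random.default_rng(1)
        N, T = 1000, 2000
        state = rng.choice([-1.0, 1.0], size=N)   # each agent: buy (+1) / sell (-1)
        imbalance = np.empty(T)

        for t in range(T):
            for _ in range(N // 10):
                i = rng.integers(N)
                if rng.random() < 0.95:           # imitate a random other agent
                    state[i] = state[rng.integers(N)]
                else:                             # idiosyncratic switch
                    state[i] = -state[i]
            imbalance[t] = state.mean()

        returns = np.diff(imbalance)              # changes in order imbalance
        print("std of return proxy:", returns.std())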

  19. Describing Ecosystem Complexity through Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Tenhunen, J. D.; Peiffer, S.

    2011-12-01

    Land use and climate change have been implicated in reduced ecosystem services (e.g., high-quality water yield, biodiversity, and agricultural yield). Predicting the ecosystem services expected under future land use decisions and changing climate conditions has become increasingly important. Complex policy and management decisions require the integration of physical, economic, and social data over several scales to assess effects on water resources and ecology. Field-based meteorology, hydrology, soil physics, plant production, solute and sediment transport, economic, and social behavior data were measured in a South Korean catchment. A variety of models are being used to simulate plot- and field-scale experiments within the catchment. Results from each of the local-scale models identify sensitive local-scale parameters, which are then used as inputs to a large-scale watershed model. We used the spatially distributed SWAT model to synthesize the experimental field data throughout the catchment. The premise of our study is that the range of local-scale model parameter results can be used to define the sensitivity and uncertainty of the large-scale watershed model. Further, this example shows how research can be structured to describe complex ecosystems and landscapes, where cross-disciplinary linkages benefit the end result. The field-based and modeling framework described here is being used to develop scenarios examining spatial and temporal changes in land use practices and climatic effects on water quantity, water quality, and sediment transport. Developing accurate modeling scenarios requires understanding the social relationship between individual and policy-driven land management practices and the value of sustainable resources to all stakeholders.

  20. Mathematical modelling of complex contagion on clustered networks

    NASA Astrophysics Data System (ADS)

    O'sullivan, David J.; O'Keeffe, Gary; Fennell, Peter; Gleeson, James

    2015-09-01

    The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of the social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the “complex contagion” effects of social reinforcement are important in such diffusion, in contrast to “simple” contagion models of disease spread, which predict that epidemics grow more efficiently on random networks than on clustered networks. Accurately modeling complex contagion on clustered networks remains a challenge because the usual assumptions (e.g., of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach that was introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010) to study disease spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe, as in Centola's experiments, faster diffusion on clustered topologies than on random networks.
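
    Centola's qualitative finding is easy to reproduce with a threshold cascade (adopt once two or more neighbours have adopted) on a clustered lattice versus its rewired counterpart. A minimal simulation sketch, not the paper's analytical approximation:

        import networkx as nx

        def cascade(G, seeds, theta=2):
            # Complex contagion: a node adopts once >= theta neighbours have.
            adopted, changed = set(seeds), True
            while changed:
                changed = False
                for v in G.nodes():
                    if (v not in adopted and
                            sum(u in adopted for u in G.neighbors(v)) >= theta):
                        adopted.add(v)
                        changed = True
            return adopted

        seeds = [0, 1, 2, 3]                                # adjacent seed cluster
        lattice = nx.watts_strogatz_graph(500, 6, 0.0)      # clustered ring lattice
        rewired = nx.watts_strogatz_graph(500, 6, 1.0, seed=0)  # random counterpart
        print("clustered lattice:", len(cascade(lattice, seeds)))
        print("rewired network:  ", len(cascade(rewired, seeds)))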

  1. Does a SLAP lesion affect shoulder muscle recruitment as measured by EMG activity during a rugby tackle?

    PubMed Central

    2010-01-01

    Background: The study objective was to assess the influence of a SLAP lesion on the onset of EMG activity in shoulder muscles during a front-on rugby football tackle within professional rugby players. Methods: Mixed cross-sectional study evaluating between- and within-group differences in EMG onset times. Testing was carried out within the physiotherapy department of a university sports medicine clinic. The test group consisted of 7 players with clinically diagnosed SLAP lesions, later verified on arthroscopy. The reference group consisted of 15 uninjured, full-time professional rugby players from within the same playing squad. Controlled tackles were performed against a tackle dummy. Onset of EMG activity was assessed from surface EMG of the Pectoralis Major, Biceps Brachii, Latissimus Dorsi, Serratus Anterior, and Infraspinatus muscles relative to the time of impact, and differences in activation timing were analysed between muscles and limbs (injured versus non-injured side, and non-injured side versus matched reference group). Results: Serratus Anterior was activated prior to all other muscles in all subjects (P = 0.001-0.03). In the SLAP-injured shoulder, Biceps was activated later than in the non-injured side. Onset times of all muscles of the non-injured shoulder in the injured player were consistently earlier than in the reference group, whereas, within the injured shoulder, all muscle activation timings were later than in the reference group. Conclusions: This study shows that in shoulders with a SLAP lesion there is a trend towards delayed activation of Biceps and other muscles, with the exception of an associated earlier onset of activation of Serratus Anterior, possibly due to a coping strategy to protect glenohumeral and thoraco-scapular stability. This trend was not statistically significant in all cases. PMID:20184752

  2. Promoting healthy diets and tackling obesity and diet-related chronic diseases: what are the agricultural policy levers?

    PubMed

    Hawkes, Corinna

    2007-06-01

    Diet-related chronic diseases are now a serious global public health problem. Public health groups are calling for the agricultural sector to play a greater role in tackling the threat. Objective: To identify potential points of policy intervention in the agricultural sector that could be leveraged to promote healthy diets and tackle obesity and diet-related chronic diseases. Methods: A review of the literature on the dietary implications of agriculture, a conceptual analysis of the issues, and the identification of relevant examples. There are two main potential points of intervention in the agricultural sector that could be leveraged to promote healthy diets: agricultural policies and agricultural production practices. Agricultural policies and practices affect diet through their influence on food availability, price, and nutrient quality, which in turn affect the food choices available to consumers. Agricultural policies amenable to intervention include input, production, and trade policies; agricultural production practices amenable to intervention include crop breeding, crop fertilization practices, livestock-feeding practices, and crop systems diversity. It is well known that agricultural policies and production practices influence what farmers choose to grow. Agricultural policies and production practices could also play a role in influencing what consumers choose to eat. To identify how agricultural policies and practices can usefully contribute toward promoting healthy diets and tackling obesity and diet-related chronic diseases, health policymakers need to examine whether current agricultural policies and production practices are contributing to, or detracting from, efforts to attain dietary goals; where and how agricultural intervention could help achieve dietary goals; and whether there are trade-offs between these interventions and other important concerns, such as undernutrition and the livelihoods of agricultural producers. Given the potential of agriculture to contribute to…

  3. Complex groundwater flow systems as traveling agent models

    PubMed Central

    Padilla, Pablo; Escolero, Oscar; González, Tomas; Morales-Casique, Eric; Osorio-Olvera, Luis

    2014-01-01

    Analyzing field data from pumping tests, we show that, as with many other natural phenomena, groundwater flow exhibits complex dynamics described by a 1/f power spectrum. This result is studied theoretically from an agent perspective. Using a traveling agent model, we prove that this statistical behavior emerges when the medium is complex. Some heuristic reasoning is provided to justify both spatial and dynamic complexity as the result of the superposition of an infinite number of stochastic processes. Moreover, we show that this implies that non-Kolmogorovian probability is needed for its study, and we provide a set of new partial differential equations for groundwater flow. PMID:25337455
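
    The 1/f diagnostic is straightforward to compute for any drawdown record: estimate the power spectral density and fit the log-log slope. A sketch with synthetic data standing in for the pumping-test series (which are not reproduced in this record):

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.cumsum(rng.standard_normal(4096))   # synthetic stand-in series

        f = np.fft.rfftfreq(len(x), d=1.0)[1:]     # drop the zero frequency
        psd = np.abs(np.fft.rfft(x - x.mean()))[1:]**2

        slope, _ = np.polyfit(np.log(f), np.log(psd), 1)
        print(f"spectral exponent beta = {-slope:.2f}  (S(f) ~ 1/f^beta)")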

  4. The Complex Action Recognition via the Correlated Topic Model

    PubMed Central

    Tu, Hong-bin; Xia, Li-min; Wang, Zheng-wu

    2014-01-01

    Complex human action recognition is an important research area within action recognition. Among the various obstacles to complex action recognition, one of the most challenging is self-occlusion, where one body part occludes another. This paper presents a new method for complex human action recognition based on optical flow and the correlated topic model (CTM). First, a Markov random field is used to represent the occlusion relationships between human body parts in terms of an occlusion state variable. Second, structure from motion (SFM) is used to reconstruct missing data in the point trajectories. Key frames are then extracted from optical-flow motion features, and width-to-height ratios are extracted from the human silhouette. Finally, the correlated topic model (CTM) is used to classify actions. Experiments were performed on the KTH, Weizmann, and UIUC action datasets to test and evaluate the proposed method. The comparative experimental results showed that the proposed method was more effective than the compared methods. PMID:24574920

  5. The persistent problem of lead poisoning in birds from ammunition and fishing tackle

    USGS Publications Warehouse

    Haig, Susan M.; D'Elia, Jesse; Eagles-Smith, Collin A.; Fair, Jeanne M.; Gervais, Jennifer; Herring, Garth; Rivers, James W.; Schulz, John H.

    2014-01-01

    Lead (Pb) is a metabolic poison that can negatively influence biological processes, leading to illness and mortality across a large spectrum of North American avifauna (>120 species) and other organisms. Pb poisoning can result from numerous sources, including ingestion of bullet fragments and shot pellets left in animal carcasses, spent ammunition left in the field, lost fishing tackle, Pb-based paints, large-scale mining, and Pb smelting activities. Although Pb shot has been banned for waterfowl hunting in the United States (since 1991) and Canada (since 1999), Pb exposure remains a problem for many avian species. Despite a large body of scientific literature on exposure to Pb and its toxicological effects on birds, controversy still exists regarding its impacts at a population level. We explore these issues and highlight areas in need of investigation: (1) variation in sensitivity to Pb exposure among bird species; (2) spatial extent and sources of Pb contamination in habitats in relation to bird exposure in those same locations; and (3) interactions between avian Pb exposure and other landscape-level stressors that synergistically affect bird demography. We explore multiple paths taken to reduce Pb exposure in birds that (1) recognize common ground among a range of affected interests; (2) have been applied at local to national scales; and (3) engage governmental agencies, interest groups, and professional societies to communicate the impacts of Pb ammunition and fishing tackle, and to describe approaches for reducing their availability to birds. As they have in previous times, users of fish and wildlife will play a key role in resolving the Pb poisoning issue.

  6. Drosophila as an In Vivo Model for Human Neurodegenerative Disease.

    PubMed

    McGurk, Leeanne; Berson, Amit; Bonini, Nancy M

    2015-10-01

    With the increase in the ageing population, neurodegenerative disease is devastating to families and poses a huge burden on society. The brain and spinal cord are extraordinarily complex: they consist of a highly organized network of neuronal and support cells that communicate in a highly specialized manner. One approach to tackling problems of such complexity is to address the scientific questions in simpler, yet analogous, systems. The fruit fly, Drosophila melanogaster, has proven tremendously valuable as a model organism, enabling many major discoveries in neuroscientific disease research. The plethora of genetic tools available in Drosophila allows for exquisite targeted manipulation of the genome. Due to its relatively short lifespan, complex questions of brain function can be addressed more rapidly than in other model organisms, such as the mouse. Here we discuss features of the fly as a model for human neurodegenerative disease. There are many distinct fly models for a range of neurodegenerative diseases; we focus on select studies from models of polyglutamine disease and amyotrophic lateral sclerosis that illustrate the type and range of insights that can be gleaned. In discussion of these models, we underscore strengths of the fly in providing understanding into mechanisms and pathways, as a foundation for translational and therapeutic research. Copyright © 2015 by the Genetics Society of America.

  7. a Range Based Method for Complex Facade Modeling

    NASA Astrophysics Data System (ADS)

    Adami, A.; Fregonese, L.; Taffurelli, L.

    2011-09-01

    3D modelling of Architectural Heritage does not follow a single well-defined path: it passes through different algorithms and digital forms according to the shape complexity of the object, the main goal of the representation, and the starting data. Even if the process starts from the same data, such as a point cloud acquired by laser scanner, there are different possible ways to realize a digital model. In particular, we can choose between two different approaches: the mesh model and the solid model. In the first case the complexity of the architecture is represented by a dense net of triangular surfaces which approximates the real surface of the object. In the opposite case, the 3D digital model can be realized by the use of simple geometrical shapes, sweeping algorithms, and Boolean operations. Obviously these two models are not the same, and each one is characterized by peculiarities concerning the way of modelling (the choice of a particular triangulation algorithm or the quasi-automatic modelling by known shapes) and the final results (a more detailed and complex mesh versus a more approximate and simpler solid model). Usually the expected final representation and the possibility of publishing lead to one way or the other. In this paper we suggest a semiautomatic process to build 3D digital models of the facades of complex architecture to be used, for example, in city models or in other large-scale representations. This way of modelling also guarantees small files that can be published on the web or transmitted. The modelling procedure starts from laser scanner data which can be processed in the well-known way. Usually more than one scan is necessary to describe a complex architecture and to avoid shadows on the facades. These have to be registered in a single reference system by the use of targets which are surveyed by topography, and then filtered in order to obtain a well-controlled and homogeneous point cloud of

  8. Challenges in process marginality for advanced technology nodes and tackling its contributors

    NASA Astrophysics Data System (ADS)

    Narayana Samy, Aravind; Schiwon, Roberto; Seltmann, Rolf; Kahlenberg, Frank; Katakamsetty, Ushasree

    2013-10-01

    Process margin is becoming critical in the present node-shrinkage scenario because ArF lithography tools are reaching their physical limits (Rayleigh's criterion). k1 is pushed to its minimum for better resolution and to enhance the process margin (k1 = 0.31 for 28 nm metal patterning). In this paper, we give an overview of the various contributors that limit process margins at advanced technology nodes and of how the challenges have been tackled in a modern foundry model. Advanced OPC algorithms are used to make the design content on the mask optimal for patterning. However, as we work at the physical limit, critical features (hot-spots) are very susceptible to litho process variations. Furthermore, etch can have a significant impact as well: a pattern that still looks healthy at litho can fail due to etch interactions. As a result, the traditional 2D contour output from ORC tools cannot accurately predict all defects and hence cannot fully correct them in the early mask tape-out phase. These factors make a huge difference to fast ramp-up and high yield in a competitive foundry market. We explain in this paper how the early introduction of 3D resist-model-based simulation of resist profiles (resist top-loss, bottom bridging, top-rounding, etc.) helped in our prediction and correction of hot-spots in the early 28 nm process development phase. The paper also discusses the other overall process-window-reduction contributors due to mask 3D effects and wafer topography (focus shifts/variations), and how these have been addressed with different simulation efforts in a fast and timely manner.

  9. Research Area 3: Mathematics (3.1 Modeling of Complex Systems)

    DTIC Science & Technology

    2017-10-31

    Research Area 3: Mathematics (3.1 Modeling of Complex Systems). Proposals should be directed to Dr. John Lavery, U.S. Army Research Office, P.O. Box 12211, Research Triangle Park.

  10. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    PubMed

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  11. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    PubMed Central

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  12. Postprocessing of docked protein-ligand complexes using implicit solvation models.

    PubMed

    Lindström, Anton; Edvinsson, Lotta; Johansson, Andreas; Andersson, C David; Andersson, Ida E; Raubacher, Florian; Linusson, Anna

    2011-02-28

    Molecular docking plays an important role in drug discovery as a tool for the structure-based design of small organic ligands for macromolecules. Possible applications of docking are identification of the bioactive conformation of a protein-ligand complex and the ranking of different ligands with respect to their strength of binding to a particular target. We have investigated the effect of implicit water on the postprocessing of binding poses generated by molecular docking using MM-PB/GB-SA (molecular mechanics Poisson-Boltzmann and generalized Born surface area) methodology. The investigation was divided into three parts: geometry optimization, pose selection, and estimation of the relative binding energies of docked protein-ligand complexes. Appropriate geometry optimization afforded more accurate binding poses for 20% of the complexes investigated. The time required for this step was greatly reduced by minimizing the energy of the binding site using GB solvation models rather than minimizing the entire complex using the PB model. By optimizing the geometries of docking poses using the GB(HCT+SA) model then calculating their free energies of binding using the PB implicit solvent model, binding poses similar to those observed in crystal structures were obtained. Rescoring of these poses according to their calculated binding energies resulted in improved correlations with experimental binding data. These correlations could be further improved by applying the postprocessing to several of the most highly ranked poses rather than focusing exclusively on the top-scored pose. The postprocessing protocol was successfully applied to the analysis of a set of Factor Xa inhibitors and a set of glycopeptide ligands for the class II major histocompatibility complex (MHC) A(q) protein. These results indicate that the protocol for the postprocessing of docked protein-ligand complexes developed in this paper may be generally useful for structure-based design in drug discovery.

  13. Intelligent Decisions Need Intelligent Choice of Models and Data - a Bayesian Justifiability Analysis for Models with Vastly Different Complexity

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Schöniger, A.; Wöhling, T.; Illman, W. A.

    2016-12-01

    Model-based decision support requires justifiable models with good predictive capabilities. This, in turn, calls for a fine adjustment between predictive accuracy (small systematic model bias that can be achieved with rather complex models), and predictive precision (small predictive uncertainties that can be achieved with simpler models with fewer parameters). The implied complexity/simplicity trade-off depends on the availability of informative data for calibration. If not available, additional data collection can be planned through optimal experimental design. We present a model justifiability analysis that can compare models of vastly different complexity. It rests on Bayesian model averaging (BMA) to investigate the complexity/performance trade-off dependent on data availability. Then, we disentangle the complexity component from the performance component. We achieve this by replacing actually observed data by realizations of synthetic data predicted by the models. This results in a "model confusion matrix". Based on this matrix, the modeler can identify the maximum model complexity that can be justified by the available (or planned) amount and type of data. As a side product, the matrix quantifies model (dis-)similarity. We apply this analysis to aquifer characterization via hydraulic tomography, comparing four models with a vastly different number of parameters (from a homogeneous model to geostatistical random fields). As a testing scenario, we consider hydraulic tomography data. Using subsets of these data, we determine model justifiability as a function of data set size. The test case shows that geostatistical parameterization requires a substantial amount of hydraulic tomography data to be justified, while a zonation-based model can be justified with more limited data set sizes. The actual model performance (as opposed to model justifiability), however, depends strongly on the quality of prior geological information.
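
    As a rough illustration of the BMA weighting step (not the authors' implementation), the sketch below turns assumed per-model log-evidence values into posterior model weights; all numbers are invented.

    ```python
    import numpy as np

    # Assumed per-model log-evidence values (one per candidate model).
    log_evidence = np.array([-120.4, -118.9, -119.6, -125.1])
    log_prior = np.log(np.full(log_evidence.size, 1.0 / log_evidence.size))

    # Posterior model weights via log-sum-exp for numerical stability.
    log_post = log_prior + log_evidence
    log_post -= log_post.max()
    weights = np.exp(log_post) / np.exp(log_post).sum()
    print(np.round(weights, 3))
    ```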

  14. Uranium(VI) adsorption to ferrihydrite: Application of a surface complexation model

    USGS Publications Warehouse

    Waite, T.D.; Davis, J.A.; Payne, T.E.; Waychunas, G.A.; Xu, N.

    1994-01-01

    A study of U(VI) adsorption by ferrihydrite was conducted over a wide range of U(VI) concentrations, pH, and at two partial pressures of carbon dioxide. A two-site (strong- and weak-affinity sites, FesOH and FewOH, respectively) surface complexation model was able to describe the experimental data well over a wide range of conditions, with only one species formed with each site type: an inner-sphere, mononuclear, bidentate complex of the type (FeO2)UO2. The existence of such a surface species was supported by results of uranium EXAFS spectroscopy performed on two samples with U(VI) adsorption density in the upper range observed in this study (10 and 18% occupancy of total surface sites). Adsorption data in the alkaline pH range suggested the existence of a second surface species, modeled as a ternary surface complex with UO2CO30 binding to a bidentate surface site. Previous surface complexation models for U(VI) adsorption have proposed surface species that are identical to the predominant aqueous species, e.g., multinuclear hydrolysis complexes or several U(VI)-carbonate complexes. The results demonstrate that the speciation of adsorbed U(VI) may be constrained by the coordination environment at the surface, giving rise to surface speciation for U(VI) that is significantly less complex than aqueous speciation.

  15. Modeling Structure and Dynamics of Protein Complexes with SAXS Profiles

    PubMed Central

    Schneidman-Duhovny, Dina; Hammel, Michal

    2018-01-01

    Small-angle X-ray scattering (SAXS) is an increasingly common and useful technique for structural characterization of molecules in solution. A SAXS experiment determines the scattering intensity of a molecule as a function of spatial frequency, termed SAXS profile. SAXS profiles can be utilized in a variety of molecular modeling applications, such as comparing solution and crystal structures, structural characterization of flexible proteins, assembly of multi-protein complexes, and modeling of missing regions in the high-resolution structure. Here, we describe protocols for modeling atomic structures based on SAXS profiles. The first protocol is for comparing solution and crystal structures including modeling of missing regions and determination of the oligomeric state. The second protocol performs multi-state modeling by finding a set of conformations and their weights that fit the SAXS profile starting from a single-input structure. The third protocol is for protein-protein docking based on the SAXS profile of the complex. We describe the underlying software, followed by demonstrating their application on interleukin 33 (IL33) with its primary receptor ST2 and DNA ligase IV-XRCC4 complex. PMID:29605933
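
    The multi-state protocol can be caricatured as a non-negative least-squares problem; the sketch below assumes precomputed per-conformation profiles (here synthetic Gaussians) rather than a real SAXS calculator.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Synthetic per-conformation profiles over scattering vector q;
    # a real application would compute these with a SAXS calculator.
    q = np.linspace(0.01, 0.5, 200)
    profiles = np.stack([np.exp(-(q * s) ** 2) for s in (18.0, 25.0, 32.0)],
                        axis=1)
    i_exp = 0.6 * profiles[:, 0] + 0.4 * profiles[:, 2]  # synthetic "experiment"

    # Non-negative weights for the candidate states that best fit the profile.
    weights, _ = nnls(profiles, i_exp)
    print(np.round(weights / weights.sum(), 3))  # recovered populations
    ```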

  16. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
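
    A minimal sketch of the general idea, not of SPACE itself: sample uncertain inputs of a toy power-capability function by Monte Carlo and read off the distribution of the result; all input distributions here are assumed for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Assumed input uncertainties for a toy power-capability function.
    efficiency = rng.normal(0.29, 0.01, n)    # cell efficiency
    area = rng.normal(375.0, 5.0, n)          # array area [m^2]
    insolation = rng.normal(1367.0, 10.0, n)  # insolation [W/m^2]

    power_kw = efficiency * area * insolation / 1000.0
    print(f"mean {power_kw.mean():.1f} kW, "
          f"5th percentile {np.percentile(power_kw, 5):.1f} kW")
    ```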

  17. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
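
    A sketch of the final export step, assuming the grid is already in memory; it uses the third-party meshio package (not the authors' GO2OGS tools), and the two-cell tetrahedral mesh and the MaterialIDs field are invented.

    ```python
    import numpy as np
    import meshio  # third-party package, assumed installed

    # A made-up two-tetrahedron grid standing in for a converted GOCAD model.
    points = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
                       [0.0, 0.0, 1.0], [1.0, 1.0, 1.0]])
    cells = [("tetra", np.array([[0, 1, 2, 3], [1, 2, 3, 4]]))]

    # MaterialIDs is an invented per-cell field marking geological units.
    mesh = meshio.Mesh(points, cells,
                       cell_data={"MaterialIDs": [np.array([0, 1])]})
    meshio.write("model.vtu", mesh)
    ```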

  18. Transgressive Local Act: Tackling Domestic Violence with Forum and Popular Theatre in "Sisterhood Bound as Yuan Ze Flowers"

    ERIC Educational Resources Information Center

    Wang, Wan-Jung

    2010-01-01

    This paper examines a community theatre project in Kaohsiung County, Taiwan that aimed to tackle domestic violence through a collaboration between local community female elders and the facilitator. The paper investigates how an outside facilitator could unfix the assumed community identities which tend to exclude outsiders or sub-groups, in this…

  19. Accuracy of travel time distribution (TTD) models as affected by TTD complexity, observation errors, and model and tracer selection

    USGS Publications Warehouse

    Green, Christopher T.; Zhang, Yong; Jurgens, Bryant C.; Starn, J. Jeffrey; Landon, Matthew K.

    2014-01-01

    Analytical models of the travel time distribution (TTD) from a source area to a sample location are often used to estimate groundwater ages and solute concentration trends. The accuracies of these models are not well known for geologically complex aquifers. In this study, synthetic datasets were used to quantify the accuracy of four analytical TTD models as affected by TTD complexity, observation errors, model selection, and tracer selection. Synthetic TTDs and tracer data were generated from existing numerical models with complex hydrofacies distributions for one public-supply well and 14 monitoring wells in the Central Valley, California. Analytical TTD models were calibrated to synthetic tracer data, and prediction errors were determined for estimates of TTDs and conservative tracer (NO3−) concentrations. Analytical models included a new, scale-dependent dispersivity model (SDM) for two-dimensional transport from the watertable to a well, and three other established analytical models. The relative influence of the error sources (TTD complexity, observation error, model selection, and tracer selection) depended on the type of prediction. Geological complexity gave rise to complex TTDs in monitoring wells that strongly affected errors of the estimated TTDs. However, prediction errors for NO3− and median age depended more on tracer concentration errors. The SDM tended to give the most accurate estimates of the vertical velocity and other predictions, although TTD model selection had minor effects overall. Adding tracers improved predictions if the new tracers had different input histories. Studies using TTD models should focus on the factors that most strongly affect the desired predictions.
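
    One way to caricature the calibration of an analytical TTD is shown below: the mean age of an exponential (fully mixed) TTD is chosen so that the TTD-weighted tracer input history reproduces an observed concentration. The input history and observation are synthetic, and radioactive decay is ignored for brevity.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Synthetic tracer input history (rising to a 1963 peak, then declining).
    years = np.arange(1950, 2011)
    ages = 2010 - years
    c_in = np.interp(years, [1950, 1963, 2010], [0.0, 100.0, 5.0])

    def c_model(tau):
        """TTD-weighted mean of the input history for mean age tau (years)."""
        g = np.exp(-ages / tau)  # exponential TTD, evaluated per input year
        return np.sum(c_in * g) / np.sum(g)

    c_obs = 22.0  # synthetic observed concentration in 2010
    tau_fit = brentq(lambda tau: c_model(tau) - c_obs, 1.0, 200.0)
    print(f"fitted mean age: {tau_fit:.1f} years")
    ```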

  20. Redesigning Primary Care to Tackle the Global Epidemic of Noncommunicable Disease

    PubMed Central

    Nigenda, Gustavo; Knaul, Felicia M.

    2015-01-01

    Noncommunicable diseases (NCDs) have become the major contributors to death and disability worldwide. Nearly 80% of the deaths in 2010 occurred in low- and middle-income countries, which have experienced rapid population aging, urbanization, rise in smoking, and changes in diet and activity. Yet the health systems of low- and middle-income countries, historically oriented to infectious disease and often severely underfunded, are poorly prepared for the challenge of caring for people with cardiovascular disease, diabetes, cancer, and chronic respiratory disease. We have discussed how primary care can be redesigned to tackle the challenge of NCDs in resource-constrained countries. We suggest that four changes will be required: integration of services, innovative service delivery, a focus on patients and communities, and adoption of new technologies for communication. PMID:25602898

  1. Hierarchical Modeling of Sequential Behavioral Data: Examining Complex Association Patterns in Mediation Models

    ERIC Educational Resources Information Center

    Dagne, Getachew A.; Brown, C. Hendricks; Howe, George W.

    2007-01-01

    This article presents new methods for modeling the strength of association between multiple behaviors in a behavioral sequence, particularly those involving substantively important interaction patterns. Modeling and identifying such interaction patterns becomes more complex when behaviors are assigned to more than two categories, as is the case…

  2. An analysis of urban collisions using an artificial intelligence model.

    PubMed

    Mussone, L; Ferrari, A; Oneta, M

    1999-11-01

    Traditional studies of road accidents estimate the effects of variables (such as vehicular flows, road geometry, and vehicle characteristics) on the number of accidents. A descriptive statistical analysis of the accidents used in the model, covering the period 1992-1995, is presented. The paper describes an alternative method based on artificial neural networks (ANNs) to work out a model for the analysis of vehicular accidents in Milan. The degree of danger of urban intersections under different scenarios is quantified by the ANN model. The first result is methodological: the innovative use of ANNs makes it possible to tackle the modelling of urban vehicular accidents. Other results deal with model outputs: intersection complexity may determine a higher accident index depending on how the intersection is regulated. The highest index for pedestrians being run over occurs at non-signalised intersections at night-time.
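
    A stand-in for the ANN approach, assuming synthetic intersection descriptors and accident indices; the original network architecture and the Milan dataset are not reproduced.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    # Invented features: [traffic flow, intersection complexity, signalised?]
    X = rng.uniform(size=(500, 3))
    y = 0.5 * X[:, 0] + 0.8 * X[:, 1] * (1.0 - X[:, 2]) \
        + rng.normal(0.0, 0.05, 500)  # synthetic accident index

    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(X, y)
    print(net.predict([[0.9, 0.8, 0.0]]))  # busy, complex, non-signalised
    ```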

  3. Expansion of the 'Antibiotic Guardian' one health behavioural campaign across Europe to tackle antibiotic resistance: pilot phase and analysis of AMR knowledge.

    PubMed

    Newitt, Sophie; Anthierens, Sibyl; Coenen, Samuel; Lo Fo Wong, Danilo; Salvi, Cristiana; Puleston, Richard; Ashiru-Oredope, Diane

    2018-06-01

    Antimicrobial resistance (AMR) is a major public health threat. The UK Antibiotic Guardian (AG) behavioural change campaign developed to tackle AMR was expanded across Europe through translation into Russian, Dutch and French. Demographics and knowledge of AGs were analyzed between 01 November 2016 and 31 December 2016. A total of 367 pledges were received with the majority from the public and health care professionals. The pilot has significantly increased the proportion of pledges from Europe (excluding UK) (χ2 = 108.7, P < 0.001). AMR knowledge was greater in AGs (including the public) compared to the EU Eurobarometer survey. Further promotion across Europe is required to measure an impact on tackling AMR.

  4. Reduced Complexity Modelling of Urban Floodplain Inundation

    NASA Astrophysics Data System (ADS)

    McMillan, H. K.; Brasington, J.; Mihir, M.

    2004-12-01

    Significant recent advances in floodplain inundation modelling have been achieved by directly coupling 1d channel hydraulic models with a raster storage cell approximation for floodplain flows. The strengths of this reduced-complexity model structure derive from its explicit dependence on a digital elevation model (DEM) to parameterize flows through riparian areas, providing a computationally efficient algorithm to model heterogeneous floodplains. Previous applications of this framework have generally used mid-range grid scales (10^1-10^2 m), showing the capacity of the models to simulate long reaches (10^3-10^4 m). However, the increasing availability of precision DEMs derived from airborne laser altimetry (LIDAR) enables their use at very high spatial resolutions (10^0-10^1 m). This spatial scale offers the opportunity to incorporate the complexity of the built environment directly within the floodplain DEM and simulate urban flooding. This poster describes a series of experiments designed to explore model functionality at these reduced scales. Important questions are considered, raised by this new approach, about the reliability and representation of the floodplain topography and built environment, and the resultant sensitivity of inundation forecasts. The experiments apply a raster floodplain model to reconstruct a 1:100 year flood event on the River Granta in eastern England, which flooded 72 properties in the town of Linton in October 2001. The simulations use a nested-scale model to maintain efficiency. A 2km by 4km urban zone is represented by a high-resolution DEM derived from single-pulse LIDAR data supplied by the UK Environment Agency, together with surveyed data and aerial photography. Novel methods of processing the raw data to provide the individual structure detail required are investigated and compared. This is then embedded within a lower-resolution model application at the reach scale which provides boundary conditions based on recorded flood stage
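
    A toy version of a raster storage-cell scheme, assuming a random stand-in DEM and a simple linear conveyance rule; real implementations use weir-type rating equations and stability controls, and np.roll makes this toy domain periodic at its edges.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    dem = rng.uniform(0.0, 0.5, (50, 50))  # stand-in DEM [m]
    depth = np.zeros_like(dem)
    depth[25, 25] = 2.0                    # point source of flood water [m]
    k = 0.2                                # toy linear conveyance coefficient

    for _ in range(200):
        for axis, shift in ((0, 1), (0, -1), (1, 1), (1, -1)):
            head = dem + depth
            dh = np.roll(head, shift, axis=axis) - head  # neighbour head excess
            flux = np.minimum(k * np.clip(dh, 0.0, None),
                              np.roll(depth, shift, axis=axis))
            depth += flux                                # gain from neighbour
            depth -= np.roll(flux, -shift, axis=axis)    # neighbour's loss

    print(f"wetted cells: {(depth > 0.01).sum()}")
    ```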

  5. [The challenge of clinical complexity in the 21st century: Could frailty indexes be the answer?]

    PubMed

    Amblàs-Novellas, Jordi; Espaulella-Panicot, Joan; Inzitari, Marco; Rexach, Lourdes; Fontecha, Benito; Romero-Ortuno, Roman

    The number of older people with complex clinical conditions and complex care needs continues to increase in the population. This is presenting many challenges to healthcare professionals and healthcare systems. In the face of these challenges, approaches are required that are practical and feasible. The frailty paradigm may be an excellent opportunity to review and establish some of the principles of Comprehensive Geriatric Assessment in specialties outside Geriatric Medicine. The assessment of frailty using Frailty Indexes provides an aid to the 'situational diagnosis' of complex clinical situations, and may help in tackling uncertainty in a person-centred approach. Copyright © 2016 SEGG. Published by Elsevier España, S.L.U. All rights reserved.

  6. Modeling and complexity of stochastic interacting Lévy type financial price dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Yiduan; Zheng, Shenzhou; Zhang, Wei; Wang, Jun; Wang, Guochao

    2018-06-01

    In an attempt to reproduce and investigate the nonlinear dynamics of security markets, a novel nonlinear random interacting price dynamics, considered as a Lévy type process, is developed and investigated through the combination of lattice-oriented percolation and Potts dynamics, which concern the intrinsic random fluctuation and the fluctuation caused by the spread of the investors' trading attitudes, respectively. To better understand the fluctuation complexity properties of the proposed model, complexity analyses of the random logarithmic price returns and the corresponding volatility series are performed, including power-law distribution, Lempel-Ziv complexity, and fractional sample entropy. In order to verify the rationality of the proposed model, corresponding studies of actual security market datasets are also carried out for comparison. The empirical results reveal that this financial price model can reproduce some important complexity features of actual security markets to some extent. The complexity of returns decreases as the parameters γ1 and β increase; furthermore, the volatility series exhibit lower complexity than the return series.
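
    Of the complexity measures named above, Lempel-Ziv complexity is the easiest to sketch; the snippet below uses a simple LZ78-style phrase count on a sign-binarised synthetic return series, a stand-in for the exact variant used in the paper.

    ```python
    import numpy as np

    def lz_phrase_count(symbols):
        """LZ78-style left-to-right parse: count distinct phrases."""
        phrases, phrase = set(), ""
        for s in symbols:
            phrase += s
            if phrase not in phrases:
                phrases.add(phrase)
                phrase = ""
        return len(phrases) + (1 if phrase else 0)

    # Synthetic log returns, binarised by sign.
    rng = np.random.default_rng(4)
    returns = np.diff(np.log(rng.lognormal(size=1000)))
    binary = "".join("1" if r > 0 else "0" for r in returns)
    print(lz_phrase_count(binary))
    ```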

  7. NGL Viewer: Web-based molecular graphics for large complexes.

    PubMed

    Rose, Alexander S; Bradley, Anthony R; Valasatava, Yana; Duarte, Jose M; Prlic, Andreas; Rose, Peter W

    2018-05-29

    The interactive visualization of very large macromolecular complexes on the web is becoming a challenging problem as experimental techniques advance at an unprecedented rate and deliver structures of increasing size. We have tackled this problem by developing highly memory-efficient and scalable extensions for the NGL WebGL-based molecular viewer and by using MMTF, a binary and compressed Macromolecular Transmission Format. These enable NGL to download and render molecular complexes with millions of atoms interactively on desktop computers and smartphones alike, making it a tool of choice for web-based molecular visualization in research and education. The source code is freely available under the MIT license at github.com/arose/ngl and distributed on NPM (npmjs.com/package/ngl). MMTF-JavaScript encoders and decoders are available at github.com/rcsb/mmtf-javascript. asr.moin@gmail.com.

  8. Surface complexation modeling of zinc sorption onto ferrihydrite.

    PubMed

    Dyer, James A; Trivedi, Paras; Scrivner, Noel C; Sparks, Donald L

    2004-02-01

    A previous study involving lead(II) [Pb(II)] sorption onto ferrihydrite over a wide range of conditions highlighted the advantages of combining molecular- and macroscopic-scale investigations with surface complexation modeling to predict Pb(II) speciation and partitioning in aqueous systems. In this work, an extensive collection of new macroscopic and spectroscopic data was used to assess the ability of the modified triple-layer model (TLM) to predict single-solute zinc(II) [Zn(II)] sorption onto 2-line ferrihydrite in NaNO3 solutions as a function of pH, ionic strength, and concentration. Regression of constant-pH isotherm data, together with potentiometric titration and pH edge data, was a much more rigorous test of the modified TLM than fitting pH edge data alone. When coupled with valuable input from spectroscopic analyses, good fits of the isotherm data were obtained with a one-species, one-Zn-sorption-site model using the bidentate-mononuclear surface complex (≡FeO)2Zn; however, surprisingly, both the density of Zn(II) sorption sites and the value of the best-fit equilibrium "constant" for the bidentate-mononuclear complex had to be adjusted with pH to adequately fit the isotherm data. Although spectroscopy provided some evidence for multinuclear surface complex formation at surface loadings approaching site saturation at pH ≥ 6.5, the assumption of a bidentate-mononuclear surface complex provided acceptable fits of the sorption data over the entire range of conditions studied. Regressing edge data in the absence of isotherm and spectroscopic data resulted in a fair number of surface-species/site-type combinations that provided acceptable fits of the edge data, but unacceptable fits of the isotherm data. A linear relationship between logK((≡FeO)2Zn) and pH was found, given by logK((≡FeO)2Zn, at 1 g/l) = 2.058(pH) - 6.131. In addition, a surface activity coefficient term was introduced to the model to reduce the ionic strength
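
    The reported linear relationship can be used directly; the short sketch below evaluates logK for the bidentate-mononuclear complex at a few pH values, at the stated 1 g/l solids concentration.

    ```python
    # logK for the (≡FeO)2Zn complex at 1 g/l ferrihydrite, per the paper:
    # logK = 2.058*pH - 6.131.
    for ph in (5.0, 6.0, 7.0, 8.0):
        log_k = 2.058 * ph - 6.131
        print(f"pH {ph:.1f}: logK = {log_k:.2f}")
    ```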

  9. Adaptive Online Sequential ELM for Concept Drift Tackling

    PubMed Central

    Basaruddin, Chan

    2016-01-01

    A machine learning method needs to adapt to changes in the environment over time. Such changes are known as concept drift. In this paper, we propose a concept-drift-tackling method as an enhancement of the Online Sequential Extreme Learning Machine (OS-ELM) and Constructive Enhancement OS-ELM (CEOS-ELM) by adding adaptive capability for classification and regression problems. The scheme is named adaptive OS-ELM (AOS-ELM). It is a single-classifier scheme that works well to handle real drift, virtual drift, and hybrid drift. The AOS-ELM also works well for sudden drift and recurrent context change types. The scheme is a simple unified method implemented in a few lines of code. We evaluated AOS-ELM on regression and classification problems using public concept drift data sets (SEA and STAGGER) and other public data sets such as MNIST, USPS, and IDS. Experiments show that our method gives a higher kappa value than a multiclassifier ELM ensemble. Even though AOS-ELM in practice does not need an increase in hidden nodes, we address some issues related to increasing the hidden nodes, such as the error condition and rank values. We propose taking the rank of the pseudoinverse matrix as an indicator parameter to detect the "underfitting" condition. PMID:27594879
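
    The rank indicator suggested above can be checked in a few lines; the sketch below assumes H is the hidden-layer output matrix (here random, with one dependency forced in for the demo) and flags a rank deficit.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    H = rng.standard_normal((200, 40))  # hidden-layer outputs: samples x nodes
    H[:, -1] = H[:, 0]                  # force a linear dependency for the demo

    if np.linalg.matrix_rank(H) < H.shape[1]:
        print("rank-deficient H: possible underfitting, revisit hidden nodes")
    ```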

  10. Efficient algorithms for accurate hierarchical clustering of huge datasets: tackling the entire protein space.

    PubMed

    Loewenstein, Yaniv; Portugaly, Elon; Fromer, Menachem; Linial, Michal

    2008-07-01

    UPGMA (average linking) is probably the most popular algorithm for hierarchical data clustering, especially in computational biology. However, UPGMA requires the entire dissimilarity matrix in memory. Due to this prohibitive requirement, UPGMA is not scalable to very large datasets. We present a novel class of memory-constrained UPGMA (MC-UPGMA) algorithms. Given any practical memory size constraint, this framework guarantees the correct clustering solution without explicitly requiring all dissimilarities in memory. The algorithms are general and are applicable to any dataset. We present a data-dependent characterization of hardness and clustering efficiency. The presented concepts are applicable to any agglomerative clustering formulation. We apply our algorithm to the entire collection of protein sequences to automatically build a comprehensive, evolutionarily driven hierarchy of proteins from sequence alone. The newly created tree captures protein families better than state-of-the-art large-scale methods such as CluSTr, ProtoNet4, or single-linkage clustering. We demonstrate that leveraging the entire mass embodied in all sequence similarities allows us to significantly improve on current protein family clusterings, which are unable to directly tackle the sheer mass of this data. Furthermore, we argue that non-metric constraints are an inherent complexity of the sequence space and should not be overlooked. The robustness of UPGMA allows significant improvement, especially for multidomain proteins and for large or divergent families. A comprehensive tree built from all UniProt sequence similarities, together with navigation and classification tools, will be made available as part of the ProtoNet service. A C++ implementation of the algorithm is available on request.
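
    For contrast with the memory-constrained variant, standard in-memory UPGMA is a one-liner in SciPy ('average' linkage); the dissimilarities below are random stand-ins for sequence distances.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(6)
    m = rng.uniform(0.1, 1.0, (30, 30))
    d = (m + m.T) / 2.0                 # symmetric stand-in dissimilarities
    np.fill_diagonal(d, 0.0)

    tree = linkage(squareform(d), method="average")  # 'average' = UPGMA
    labels = fcluster(tree, t=0.5, criterion="distance")
    print(f"{labels.max()} clusters at distance threshold 0.5")
    ```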

  11. Structure, dynamics and biophysics of the cytoplasmic protein–protein complexes of the bacterial phosphoenolpyruvate: Sugar phosphotransferase system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clore, G. Marius; Venditti, Vincenzo

    2013-10-01

    The bacterial phosphotransferase system (PTS) couples phosphoryl transfer, via a series of bimolecular protein–protein interactions, to sugar transport across the membrane. The multitude of complexes in the PTS provides a paradigm for studying protein interactions, and for understanding how the same binding surface can specifically recognize a diverse array of targets. Fifteen years of work aimed at solving the solution structures of all soluble protein–protein complexes of the PTS has served as a test bed for developing NMR and integrated hybrid approaches to study larger complexes in solution and to probe transient, spectroscopically invisible states, including encounter complexes. We review these approaches, highlighting the problems that can be tackled with these methods, and summarize the current findings on protein interactions.

  12. An Adaptive Complex Network Model for Brain Functional Networks

    PubMed Central

    Gomez Portillo, Ignacio J.; Gleiser, Pablo M.

    2009-01-01

    Brain functional networks are graph representations of activity in the brain, where the vertices represent anatomical regions and the edges their functional connectivity. These networks present a robust small world topological structure, characterized by highly integrated modules connected sparsely by long range links. Recent studies showed that other topological properties such as the degree distribution and the presence (or absence) of a hierarchical structure are not robust, and show different intriguing behaviors. In order to understand the basic ingredients necessary for the emergence of these complex network structures we present an adaptive complex network model for human brain functional networks. The microscopic units of the model are dynamical nodes that represent active regions of the brain, whose interaction gives rise to complex network structures. The links between the nodes are chosen following an adaptive algorithm that establishes connections between dynamical elements with similar internal states. We show that the model is able to describe topological characteristics of human brain networks obtained from functional magnetic resonance imaging studies. In particular, when the dynamical rules of the model allow for integrated processing over the entire network scale-free non-hierarchical networks with well defined communities emerge. On the other hand, when the dynamical rules restrict the information to a local neighborhood, communities cluster together into larger ones, giving rise to a hierarchical structure, with a truncated power law degree distribution. PMID:19738902
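
    A minimal sketch of the adaptive rule described above, assuming scalar node states and an invented similarity threshold; the simulated dynamics are a simple diffusive coupling, not the model's actual node dynamics.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50
    state = rng.uniform(size=n)         # internal state of each region
    adj = np.zeros((n, n), dtype=int)

    for _ in range(2000):
        i, j = rng.choice(n, size=2, replace=False)
        if abs(state[i] - state[j]) < 0.05:   # assumed similarity threshold
            adj[i, j] = adj[j, i] = 1
        # Simple diffusive coupling standing in for the node dynamics.
        state += 0.01 * (adj @ state - adj.sum(axis=1) * state)

    print(f"edges: {adj.sum() // 2}")
    ```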

  13. Are more complex physiological models of forest ecosystems better choices for plot and regional predictions?

    Treesearch

    Wenchi Jin; Hong S. He; Frank R. Thompson

    2016-01-01

    Process-based forest ecosystem models vary from simple physiological, complex physiological, to hybrid empirical-physiological models. Previous studies indicate that complex models provide the best prediction at plot scale with a temporal extent of less than 10 years, however, it is largely untested as to whether complex models outperform the other two types of models...

  14. Tackling NCD in LMIC: Achievements and Lessons Learned From the NHLBI-UnitedHealth Global Health Centers of Excellence Program.

    PubMed

    Engelgau, Michael M; Sampson, Uchechukwu K; Rabadan-Diehl, Cristina; Smith, Richard; Miranda, Jaime; Bloomfield, Gerald S; Belis, Deshiree; Narayan, K M Venkat

    2016-03-01

    Effectively tackling the growing noncommunicable disease (NCD) burden in low- and middle-income countries (LMIC) is a major challenge. To address research needs in this setting for NCDs, in 2009, National Heart, Lung, and Blood Institute (NHLBI) and UnitedHealth Group (UHG) engaged in a public-private partnership that supported a network of 11 LMIC-based research centers and created the NHLBI-UnitedHealth Global Health Centers of Excellence (COE) Program. The Program's overall goal was to contribute to reducing the cardiovascular and lung disease burdens by catalyzing in-country research institutions to develop a global network of biomedical research centers. Key elements of the Program included team science and collaborative approaches, developing research and training platforms for future investigators, and creating a data commons. This Program embraced a strategic approach for tackling NCDs in LMICs and will provide capacity for locally driven research efforts that can identify and address priority health issues in specific countries' settings. Published by Elsevier B.V.

  15. Modeling Real-Time Applications with Reusable Design Patterns

    NASA Astrophysics Data System (ADS)

    Rekhis, Saoussen; Bouassida, Nadia; Bouaziz, Rafik

    Real-Time (RT) applications, which manipulate important volumes of data, need to be managed with RT databases that deal with time-constrained data and time-constrained transactions. In spite of their numerous advantages, RT database development remains a complex task, since developers must study many design issues related to the RT domain. In this paper, we tackle this problem by proposing RT design patterns that allow the modeling of structural and behavioral aspects of RT databases. We show how RT design patterns can provide design assistance through architecture reuse for recurring design problems. In addition, we present a UML profile that represents the patterns and further facilitates their reuse. This profile proposes, on the one hand, UML extensions for modeling the variability of patterns in the RT context and, on the other hand, extensions inspired by the MARTE (Modeling and Analysis of Real-Time Embedded systems) profile.

  16. Utility of Small Animal Models of Developmental Programming.

    PubMed

    Reynolds, Clare M; Vickers, Mark H

    2018-01-01

    Any effective strategy to tackle the global obesity and rising noncommunicable disease epidemic requires an in-depth understanding of the mechanisms that underlie these conditions that manifest as a consequence of complex gene-environment interactions. In this context, it is now well established that alterations in the early life environment, including suboptimal nutrition, can result in an increased risk for a range of metabolic, cardiovascular, and behavioral disorders in later life, a process preferentially termed developmental programming. To date, most of the mechanistic knowledge around the processes underpinning development programming has been derived from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. This review will cover the utility of small animal models in developmental programming, the limitations of such models, and potential future directions that are required to fully maximize information derived from preclinical models in order to effectively translate to clinical use.

  17. Application of surface complexation models to anion adsorption by natural materials

    USDA-ARS?s Scientific Manuscript database

    Various chemical models of ion adsorption will be presented and discussed. Chemical models, such as surface complexation models, provide a molecular description of anion adsorption reactions using an equilibrium approach. Two such models, the constant capacitance model and the triple layer model w...

  18. Understanding large multiprotein complexes: applying a multiple allosteric networks model to explain the function of the Mediator transcription complex.

    PubMed

    Lewis, Brian A

    2010-01-15

    The regulation of transcription and of many other cellular processes involves large multi-subunit protein complexes. In the context of transcription, it is known that these complexes serve as regulatory platforms that connect activator DNA-binding proteins to a target promoter. However, there is still a lack of understanding regarding the function of these complexes. Why do multi-subunit complexes exist? What is the molecular basis of the function of their constituent subunits, and how are these subunits organized within a complex? What is the reason for physical connections between certain subunits and not others? In this article, I address these issues through a model of network allostery and its application to the eukaryotic RNA polymerase II Mediator transcription complex. The multiple allosteric networks model (MANM) suggests that protein complexes such as Mediator exist not only as physical but also as functional networks of interconnected proteins through which information is transferred from subunit to subunit by the propagation of an allosteric state known as conformational spread. Additionally, there are multiple distinct sub-networks within the Mediator complex that can be defined by their connections to different subunits; these sub-networks have discrete functions that are activated when specific subunits interact with other activator proteins.

  19. Modeling complexity in pathologist workload measurement: the Automatable Activity-Based Approach to Complexity Unit Scoring (AABACUS).

    PubMed

    Cheung, Carol C; Torlakovic, Emina E; Chow, Hung; Snover, Dale C; Asa, Sylvia L

    2015-03-01

    Pathologists provide diagnoses relevant to the disease state of the patient and identify specific tissue characteristics relevant to response to therapy and prognosis. As personalized medicine evolves, there is a trend for increased demand of tissue-derived parameters. Pathologists perform increasingly complex analyses on the same 'cases'. Traditional methods of workload assessment and reimbursement, based on number of cases sometimes with a modifier (eg, the relative value unit (RVU) system used in the United States), often grossly underestimate the amount of work needed for complex cases and may overvalue simple, small biopsy cases. We describe a new approach to pathologist workload measurement that aligns with this new practice paradigm. Our multisite institution with geographically diverse partner institutions has developed the Automatable Activity-Based Approach to Complexity Unit Scoring (AABACUS) model that captures pathologists' clinical activities from parameters documented in departmental laboratory information systems (LISs). The model's algorithm includes: 'capture', 'export', 'identify', 'count', 'score', 'attribute', 'filter', and 'assess filtered results'. Captured data include specimen acquisition, handling, analysis, and reporting activities. Activities were counted and complexity units (CUs) generated using a complexity factor for each activity. CUs were compared between institutions, practice groups, and practice types and evaluated over a 5-year period (2008-2012). The annual load of a clinical service pathologist, irrespective of subspecialty, was ∼40,000 CUs using relative benchmarking. The model detected changing practice patterns and was appropriate for monitoring clinical workload for anatomical pathology, neuropathology, and hematopathology in academic and community settings, and encompassing subspecialty and generalist practices. AABACUS is objective, can be integrated with an LIS and automated, is reproducible, backwards compatible
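
    The count-and-score step can be caricatured in a few lines; the activity names and complexity factors below are invented for illustration and are not the published AABACUS weights.

    ```python
    # Invented activity names and factors, not the published AABACUS weights.
    activity_counts = {"specimen_accession": 310, "blocks": 880,
                       "special_stains": 120, "reports": 295}
    complexity_factor = {"specimen_accession": 1.0, "blocks": 0.8,
                         "special_stains": 1.5, "reports": 2.0}

    cus = sum(count * complexity_factor[activity]
              for activity, count in activity_counts.items())
    print(f"complexity units this period: {cus:.0f}")
    ```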

  20. Complex Instruction: A Model for Reaching Up--and Out

    ERIC Educational Resources Information Center

    Tomlinson, Carol Ann

    2018-01-01

    Complex Instruction is a multifaceted instructional model designed to provide highly challenging learning opportunities for students in heterogeneous classrooms. The model provides a rationale for and philosophy of creating equity of access to excellent curriculum and instruction for a broad range of learners, guidance for preparing students for…

  1. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
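
    As a stand-in for the topic-modeling backend such a framework steers (the paper's parameter-space analysis and feedback loop are not reproduced), a minimal LDA fit with scikit-learn on a toy corpus:

    ```python
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    docs = ["rivers flood the valley", "the model simulates river flow",
            "topic models summarize text", "users adjust topic parameters"]

    vec = CountVectorizer(stop_words="english")
    counts = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

    words = vec.get_feature_names_out()
    for k, comp in enumerate(lda.components_):
        print(f"topic {k}:", [words[i] for i in comp.argsort()[-3:]])
    ```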

  2. Entropy, complexity, and Markov diagrams for random walk cancer models

    PubMed Central

    Newton, Paul K.; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-01-01

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential. PMID:25523357

  3. Entropy, complexity, and Markov diagrams for random walk cancer models.

    PubMed

    Newton, Paul K; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-19

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.

  4. Entropy, complexity, and Markov diagrams for random walk cancer models

    NASA Astrophysics Data System (ADS)

    Newton, Paul K.; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-01

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.
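
    The two central quantities, the steady-state distribution and its entropy, follow directly from a transition matrix; the 3-by-3 matrix below is invented, not the autopsy-derived one.

    ```python
    import numpy as np

    # Invented 3-site transition matrix (rows: from-site, columns: to-site).
    P = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.4, 0.5]])

    # Steady state = left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi /= pi.sum()

    entropy = -np.sum(pi * np.log(pi))
    print(f"steady state {np.round(pi, 3)}, entropy {entropy:.3f} nats")
    ```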

  5. Sparkle model for AM1 calculation of lanthanide complexes: improved parameters for europium.

    PubMed

    Rocha, Gerd B; Freire, Ricardo O; Da Costa, Nivan B; De Sá, Gilberto F; Simas, Alfredo M

    2004-04-05

    In the present work, we sought to improve our sparkle model for the calculation of lanthanide complexes, SMLC, in various ways: (i) inclusion of the europium atomic mass, (ii) reparametrization of the model within AM1 from a new response function including all distances of the coordination polyhedron for tris(acetylacetonate)(1,10-phenanthroline)europium(III), (iii) implementation of the model in the software package MOPAC93r2, and (iv) inclusion of spherical Gaussian functions in the expression which computes the core-core repulsion energy. The parametrization results indicate that SMLC II is superior to the previous version of the model because Gaussian functions proved essential for a better description of the geometries of the complexes. In order to validate our parametrization, we carried out calculations on 96 europium(III) complexes, selected from the Cambridge Structural Database 2003, and compared our predicted ground-state geometries with the experimental ones. Our results show that this new parametrization of the SMLC model, with the inclusion of spherical Gaussian functions in the core-core repulsion energy, is better capable of predicting the Eu-ligand distances than the previous version. The unsigned mean error for all Eu-L interatomic distances in all 96 complexes, which is 0.3564 Å for the original SMLC, is lowered to 0.1993 Å when the model is parametrized with the inclusion of two Gaussian functions. Our results also indicate that this model is more applicable to europium complexes with beta-diketone ligands. As such, we conclude that this improved model can be considered a powerful tool for the study of lanthanide complexes and their applications, such as the modeling of light conversion molecular devices.

  6. Assessment of wear dependence parameters in complex model of cutting tool wear

    NASA Astrophysics Data System (ADS)

    Antsev, A. V.; Pasko, N. I.; Antseva, N. V.

    2018-03-01

    This paper addresses the wear dependence of the generic efficient life period of cutting tools, treated as an aggregate of the tool wear rate distribution law and the dependence of this law's parameters on the cutting mode, factoring in randomness as exemplified by the complex model of wear. The complex model of wear takes into account the variance of cutting properties within one batch of tools, variance in machinability within one batch of workpieces, and the stochastic nature of the wear process itself. A technique of assessment of wear dependence parameters in a complex model of cutting tool wear is provided. The technique is supported by a numerical example.

  7. STEPS: Modeling and Simulating Complex Reaction-Diffusion Systems with Python

    PubMed Central

    Wils, Stefan; Schutter, Erik De

    2008-01-01

    We describe how the use of the Python language improved the user interface of the program STEPS. STEPS is a simulation platform for modeling and stochastic simulation of coupled reaction-diffusion systems with complex 3-dimensional boundary conditions. Setting up such models is a complicated process that consists of many phases. Initial versions of STEPS relied on a static input format that did not cleanly separate these phases, limiting modelers in how they could control the simulation and becoming increasingly complex as new features and new simulation algorithms were added. We solved all of these problems by tightly integrating STEPS with Python, using SWIG to expose our existing simulation code. PMID:19623245
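
    Illustrative only: a sketch of how a Python-scripted simulator separates model setup, per-run seeding, and run control into distinct phases, in the spirit of the abstract. Class and method names below are hypothetical stand-ins, not the STEPS API.

        import random

        class ToySim:
            # Stand-in for a wrapped C++ solver exposed to Python (e.g. via SWIG).
            def __init__(self, seed):
                self.rng, self.x = random.Random(seed), 100
            def advance(self, dt):
                # toy stochastic decay step in place of an SSA/diffusion solver
                self.x -= sum(self.rng.random() < 0.05 for _ in range(self.x))
            def count(self, species):
                return self.x

        def run(n_runs=5, t_end=1.0, dt=0.1):
            results = []
            for i in range(n_runs):            # independent stochastic realizations
                sim, trace, t = ToySim(seed=i), [], 0.0
                while t < t_end:
                    sim.advance(dt)            # run control stays in Python scripts
                    trace.append(sim.count("X"))
                    t += dt
                results.append(trace)
            return results

        print([trace[-1] for trace in run()])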

  8. Perpetual Model Validation

    DTIC Science & Technology

    2017-03-01

    The research considered using indirect models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the ... deterioration, for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems ...

  9. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  10. The effects of numerical-model complexity and observation type on estimated porosity values

    USGS Publications Warehouse

    Starn, Jeffrey; Bagtzoglou, Amvrossios C.; Green, Christopher T.

    2015-01-01

    The relative merits of model complexity and types of observations employed in model calibration are compared. An existing groundwater flow model of the Salt Lake Valley, Utah (USA), is coupled with an advective transport simulation, and effective porosity is adjusted until simulated tritium concentrations match concentrations in samples from wells. Two calibration approaches are used: a “complex” highly parameterized porosity field and a “simple” parsimonious model of porosity distribution. The use of an atmospheric tracer (tritium in this case) and apparent ages (from tritium/helium) in model calibration is also discussed. Of the models tested, the complex model (with tritium concentrations and tritium/helium apparent ages) performs best. Although tritium breakthrough curves simulated by the complex and simple models are generally similar, and there is value in the simple model, the complex model is supported by a more realistic porosity distribution and a greater number of estimable parameters. Culling the best quality data did not lead to better calibration, possibly because of processes and aquifer characteristics that are not simulated. Despite many factors that contribute to shortcomings of both the models and the data, useful information is obtained from all the models evaluated. Although any particular prediction of tritium breakthrough may have large errors, overall, the models mimic observed trends.

  11. Mechanistic kinetic models of enzymatic cellulose hydrolysis-A review.

    PubMed

    Jeoh, Tina; Cardona, Maria J; Karuna, Nardrapee; Mudinoor, Akshata R; Nill, Jennifer

    2017-07-01

    Bioconversion of lignocellulose forms the basis for renewable, advanced biofuels, and bioproducts. Mechanisms of hydrolysis of cellulose by cellulases have been actively studied for nearly 70 years with significant gains in understanding of the cellulolytic enzymes. Yet, a full mechanistic understanding of the hydrolysis reaction has been elusive. We present a review to highlight new insights gained since the most recent comprehensive review of cellulose hydrolysis kinetic models by Bansal et al. (2009) Biotechnol Adv 27:833-848. Recent models have taken a two-pronged approach to tackle the challenge of modeling the complex heterogeneous reaction-an enzyme-centric modeling approach centered on the molecularity of the cellulase-cellulose interactions to examine rate limiting elementary steps and a substrate-centric modeling approach aimed at capturing the limiting property of the insoluble cellulose substrate. Collectively, modeling results suggest that at the molecular-scale, how rapidly cellulases can bind productively (complexation) and release from cellulose (decomplexation) is limiting, while the overall hydrolysis rate is largely insensitive to the catalytic rate constant. The surface area of the insoluble substrate and the degrees of polymerization of the cellulose molecules in the reaction both limit initial hydrolysis rates only. Neither enzyme-centric models nor substrate-centric models can consistently capture hydrolysis time course at extended reaction times. Thus, questions of the true reaction limiting factors at extended reaction times and the role of complexation and decomplexation in rate limitation remain unresolved. Biotechnol. Bioeng. 2017;114:1369-1385. © 2017 Wiley Periodicals, Inc.
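
    A toy rate sketch of the complexation/decomplexation picture summarized above, assuming a simple mass-action scheme E + S <-> C -> E + P for the binding (k_on), release (k_off), and catalytic (k_cat) steps; all constants and concentrations are illustrative, not fitted values.

        k_on, k_off, k_cat = 1.0, 0.1, 5.0
        E, S, C, P = 1.0, 10.0, 0.0, 0.0   # free enzyme, free sites, complex, product
        dt, t_end = 1e-3, 20.0
        for _ in range(int(t_end / dt)):
            bind, release, cut = k_on * E * S, k_off * C, k_cat * C
            E += (release + cut - bind) * dt   # enzyme returns on release or after a cut
            S += (release - bind) * dt         # sites are consumed only by binding
            C += (bind - release - cut) * dt
            P += cut * dt
        print(f"conversion after {t_end} time units: {P / 10.0:.2f}")

    Varying k_cat in a sketch like this while keeping binding slow reproduces, qualitatively, the insensitivity of the overall rate to the catalytic constant that the review describes.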

  12. Complexity, accuracy and practical applicability of different biogeochemical model versions

    NASA Astrophysics Data System (ADS)

    Los, F. J.; Blaas, M.

    2010-04-01

    The construction of validated biogeochemical model applications as prognostic tools for the marine environment involves a large number of choices particularly with respect to the level of details of the .physical, chemical and biological aspects. Generally speaking, enhanced complexity might enhance veracity, accuracy and credibility. However, very complex models are not necessarily effective or efficient forecast tools. In this paper, models of varying degrees of complexity are evaluated with respect to their forecast skills. In total 11 biogeochemical model variants have been considered based on four different horizontal grids. The applications vary in spatial resolution, in vertical resolution (2DH versus 3D), in nature of transport, in turbidity and in the number of phytoplankton species. Included models range from 15 year old applications with relatively simple physics up to present state of the art 3D models. With all applications the same year, 2003, has been simulated. During the model intercomparison it has been noticed that the 'OSPAR' Goodness of Fit cost function (Villars and de Vries, 1998) leads to insufficient discrimination of different models. This results in models obtaining similar scores although closer inspection of the results reveals large differences. In this paper therefore, we have adopted the target diagram by Jolliff et al. (2008) which provides a concise and more contrasting picture of model skill on the entire model domain and for the entire period of the simulations. Correctness in prediction of the mean and the variability are separated and thus enhance insight in model functioning. Using the target diagrams it is demonstrated that recent models are more consistent and have smaller biases. Graphical inspection of time series confirms this, as the level of variability appears more realistic, also given the multi-annual background statistics of the observations. Nevertheless, whether the improvements are all genuine for the particular

  13. Using Models to Inform Policy: Insights from Modeling the Complexities of Global Polio Eradication

    NASA Astrophysics Data System (ADS)

    Thompson, Kimberly M.

    Drawing on over 20 years of experience modeling risks in complex systems, this talk will challenge SBP participants to develop models that provide timely and useful answers to critical policy questions when decision makers need them. The talk will include reflections on the opportunities and challenges associated with developing integrated models for complex problems and communicating their results effectively. Dr. Thompson will focus the talk largely on collaborative modeling related to global polio eradication and the application of system dynamics tools. After successful global eradication of wild polioviruses, live polioviruses will still present risks that could potentially lead to paralytic polio cases. This talk will present the insights of efforts to use integrated dynamic, probabilistic risk, decision, and economic models to address critical policy questions related to managing global polio risks. Using a dynamic disease transmission model combined with probabilistic model inputs that characterize uncertainty for a stratified world to account for variability, we find that global health leaders will face some difficult choices, but that they can take actions that will manage the risks effectively. The talk will emphasize the need for true collaboration between modelers and subject matter experts, and the importance of working with decision makers as partners to ensure the development of useful models that actually get used.

  14. Complex Road Intersection Modelling Based on Low-Frequency GPS Track Data

    NASA Astrophysics Data System (ADS)

    Huang, J.; Deng, M.; Zhang, Y.; Liu, H.

    2017-09-01

    It is widely accepted that digital maps have become an indispensable guide for human daily travel. Traditional road network maps are produced in time-consuming and labour-intensive ways, such as digitizing printed maps and extraction from remote sensing images. At present, the large volume of GPS trajectory data collected by floating vehicles makes it practical to extract highly detailed and up-to-date road network information. Road intersections are often accident-prone areas and are critical to route planning, and the connectivity of road networks is mainly determined by the topological geometry of road intersections. A few studies have paid attention to detecting complex road intersections and mining the attached traffic information (e.g., connectivity, topology and turning restrictions) from massive GPS traces. To the authors' knowledge, recent studies mainly used high frequency (1 s sampling rate) trajectory data to detect crossroads regions or extract rough intersection models. It is still difficult to use low frequency (20-100 s), easily available trajectory data to model complex road intersections geometrically and semantically. This paper thus attempts to construct precise models of complex road intersections using low frequency GPS traces. We propose to firstly extract the complex road intersections by an LCSS-based (Longest Common Subsequence) trajectory clustering method, then delineate the geometry shapes of complex road intersections by a K-segment principal curve algorithm, and finally infer the traffic constraint rules inside the complex intersections.
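
    A compact dynamic-programming sketch of the LCSS similarity named above, assuming two points "match" when both coordinates agree within a tolerance; the tolerance and the two sample traces are illustrative.

        def lcss(traj_a, traj_b, eps=0.001):
            # Longest Common Subsequence length between two point sequences,
            # normalized by the shorter trajectory to give a similarity in [0, 1].
            n, m = len(traj_a), len(traj_b)
            dp = [[0] * (m + 1) for _ in range(n + 1)]
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    (xa, ya), (xb, yb) = traj_a[i - 1], traj_b[j - 1]
                    if abs(xa - xb) < eps and abs(ya - yb) < eps:
                        dp[i][j] = dp[i - 1][j - 1] + 1
                    else:
                        dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
            return dp[n][m] / min(n, m)

        # Two hypothetical lon/lat traces passing through the same intersection.
        print(lcss([(0.0, 0.0), (0.001, 0.0005), (0.002, 0.001)],
                   [(0.0005, 0.0), (0.001, 0.0005), (0.002, 0.0015)]))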

  15. Tackling Critical Catalytic Residues in Helicobacter pylori l-Asparaginase

    PubMed Central

    Maggi, Maristella; Chiarelli, Laurent R; Valentini, Giovanna; Scotti, Claudia

    2015-01-01

    Bacterial asparaginases (amidohydrolases, EC 3.5.1.1) are important enzymes in cancer therapy, especially for Acute Lymphoblastic Leukemia. They are tetrameric enzymes able to catalyze the deamination of l-ASN and, to a variable extent, of l-GLN, on which leukemia cells are dependent for survival. In contrast to other known l-asparaginases, Helicobacter pylori CCUG 17874 type II enzyme (HpASNase) is cooperative and has a low affinity towards l-GLN. In this study, some critical amino acids forming the active site of HpASNase (T16, T95 and E289) have been tackled by rational engineering in the attempt to better define their role in catalysis and to achieve a deeper understanding of the peculiar cooperative behavior of this enzyme. Mutations T16E, T95D and T95H led to a complete loss of enzymatic activity. Mutation E289A dramatically reduced the catalytic activity of the enzyme, but increased its thermostability. Interestingly, E289 belongs to a loop that is very variable in l-asparaginases from the structure, sequence and length point of view, and which could be a main determinant of their different catalytic features. PMID:25826146

  16. Tackling Critical Catalytic Residues in Helicobacter pylori L-Asparaginase.

    PubMed

    Maggi, Maristella; Chiarelli, Laurent R; Valentini, Giovanna; Scotti, Claudia

    2015-03-27

    Bacterial asparaginases (amidohydrolases, EC 3.5.1.1) are important enzymes in cancer therapy, especially for Acute Lymphoblastic Leukemia. They are tetrameric enzymes able to catalyze the deamination of L-ASN and, to a variable extent, of L-GLN, on which leukemia cells are dependent for survival. In contrast to other known L-asparaginases, Helicobacter pylori CCUG 17874 type II enzyme (HpASNase) is cooperative and has a low affinity towards L-GLN. In this study, some critical amino acids forming the active site of HpASNase (T16, T95 and E289) have been tackled by rational engineering in the attempt to better define their role in catalysis and to achieve a deeper understanding of the peculiar cooperative behavior of this enzyme. Mutations T16E, T95D and T95H led to a complete loss of enzymatic activity. Mutation E289A dramatically reduced the catalytic activity of the enzyme, but increased its thermostability. Interestingly, E289 belongs to a loop that is very variable in L-asparaginases from the structure, sequence and length point of view, and which could be a main determinant of their different catalytic features.

  17. Structured analysis and modeling of complex systems

    NASA Technical Reports Server (NTRS)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  18. A model of clutter for complex, multivariate geospatial displays.

    PubMed

    Lohrenz, Maura C; Trafton, J Gregory; Beck, R Melissa; Gendron, Marlin L

    2009-02-01

    A novel model of measuring clutter in complex geospatial displays was compared with human ratings of subjective clutter as a measure of convergent validity. The new model is called the color-clustering clutter (C3) model. Clutter is a known problem in displays of complex data and has been shown to affect target search performance. Previous clutter models are discussed and compared with the C3 model. Two experiments were performed. In Experiment 1, participants performed subjective clutter ratings on six classes of information visualizations. Empirical results were used to set two free parameters in the model. In Experiment 2, participants performed subjective clutter ratings on aeronautical charts. Both experiments compared and correlated empirical data to model predictions. The first experiment resulted in a .76 correlation between ratings and C3. The second experiment resulted in a .86 correlation, significantly better than results from a model developed by Rosenholtz et al. Outliers to our correlation suggest further improvements to C3. We suggest that (a) the C3 model is a good predictor of subjective impressions of clutter in geospatial displays, (b) geospatial clutter is a function of color density and saliency (primary C3 components), and (c) pattern analysis techniques could further improve C3. The C3 model could be used to improve the design of electronic geospatial displays by suggesting when a display will be too cluttered for its intended audience.

  19. Modeling complexity in engineered infrastructure system: Water distribution network as an example

    NASA Astrophysics Data System (ADS)

    Zeng, Fang; Li, Xiang; Li, Ke

    2017-02-01

    The complex topology and adaptive behavior of infrastructure systems are driven by both self-organization of the demand and rigid engineering solutions. Therefore, engineering complex systems requires a method balancing holism and reductionism. To model the growth of water distribution networks, a complex network model was developed that combines local optimization rules with engineering considerations. The demand node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs on some structural properties. Comparison with different modeling approaches indicates that a realistic demand node distribution and co-evolution of demand nodes and network are important for the simulation of real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. In contrast, the improvement of efficiency by engineering optimization is limited and relatively insignificant. The redundancy and robustness, on the other hand, can be significantly improved through engineering methods.
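
    A toy growth sketch in the spirit of the approach described (each new demand node attaches under a local cost rule that trades pipe length against connecting to well-connected nodes); this is an illustration, not the authors' model, and the cost weighting is arbitrary.

        import math, random

        random.seed(1)
        nodes = [(0.5, 0.5)]        # initial source node
        degree = {0: 0}
        edges = []
        for _ in range(50):
            x, y = random.random(), random.random()   # new demand node location
            new = len(nodes)
            # Local rule: minimize pipe length minus a small bonus for degree.
            best = min(range(len(nodes)),
                       key=lambda i: math.dist(nodes[i], (x, y)) - 0.02 * degree[i])
            nodes.append((x, y))
            edges.append((best, new))
            degree[best] += 1
            degree[new] = 1
        print(len(nodes), "nodes,", len(edges), "pipes")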

  20. Modeling complex tone perception: grouping harmonics with combination-sensitive neurons.

    PubMed

    Medvedev, Andrei V; Chiao, Faye; Kanwal, Jagmeet S

    2002-06-01

    Perception of complex communication sounds is a major function of the auditory system. To create a coherent percept of these sounds the auditory system may instantaneously group or bind multiple harmonics within complex sounds. This perception strategy simplifies further processing of complex sounds and facilitates their meaningful integration with other sensory inputs. Based on experimental data and a realistic model, we propose that associative learning of combinations of harmonic frequencies and nonlinear facilitation of responses to those combinations, also referred to as "combination-sensitivity," are important for spectral grouping. For our model, we simulated combination sensitivity using Hebbian and associative types of synaptic plasticity in auditory neurons. We also provided a parallel tonotopic input that converges and diverges within the network. Neurons in higher-order layers of the network exhibited an emergent property of multifrequency tuning that is consistent with experimental findings. Furthermore, this network had the capacity to "recognize" the pitch or fundamental frequency of a harmonic tone complex even when the fundamental frequency itself was missing.
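
    A minimal sketch of the Hebbian update such a model relies on: repeated exposure to a harmonic complex strengthens exactly the co-active tonotopic inputs, yielding multifrequency tuning. The channel layout and learning rate are illustrative, not the paper's network.

        import numpy as np

        rng = np.random.default_rng(0)
        n_channels, eta = 20, 0.05              # tonotopic channels, learning rate
        w = rng.uniform(0.0, 0.1, n_channels)   # synaptic weights onto one neuron

        def stimulus(f0):
            # Activity pattern for a harmonic complex with fundamental f0:
            # unit activity at channels f0, 2*f0, 3*f0 (channel indices, illustrative).
            x = np.zeros(n_channels)
            x[[f0, 2 * f0, 3 * f0]] = 1.0
            return x

        for _ in range(200):                    # repeated exposure to the same complex
            x = stimulus(3)
            y = max(0.0, w @ x)                 # rectified postsynaptic response
            w += eta * y * x                    # Hebbian: co-active inputs strengthen
            w = np.clip(w, 0.0, 1.0)            # simple saturation keeps weights bounded

        print(w.round(2))                       # weights cluster on the harmonic channels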

  1. Tackling causes and costs of ED presentation for American football injuries: a population-level study.

    PubMed

    Smart, Blair J; Haring, R Sterling; Asemota, Anthony O; Scott, John W; Canner, Joseph K; Nejim, Besma J; George, Benjamin P; Alsulaim, Hatim; Kirsch, Thomas D; Schneider, Eric B

    2016-07-01

    American tackle football is the most popular high-energy impact sport in the United States, with approximately 9 million participants competing annually. Previous epidemiologic studies of football-related injuries have generally focused on specific geographic areas or pediatric age groups. Our study sought to examine patient characteristics and outcomes, including hospital charges, among athletes presenting for emergency department (ED) treatment of football-related injury across all age groups in a large nationally representative data set. Patients presenting for ED treatment of injuries sustained playing American tackle football (identified using International Classification of Diseases, Ninth Revision, Clinical Modification code E007.0) from 2010 to 2011 were studied in the Nationwide Emergency Department Sample. Patient-specific injuries were identified using the primary International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis code and categorized by type and anatomical region. Standard descriptive methods examined patient demographics, diagnosis categories, and ED and inpatient outcomes and charges. During the study period, 397,363 football players presented for ED treatment, 95.8% of whom were male. Sprains/strains (25.6%), limb fractures (20.7%), and head injuries (including traumatic brain injury; 17.5%) were the most common presenting injuries. Overall, 97.9% of patients underwent routine ED discharge, with 1.1% admitted directly and fewer than 11 patients in the 2-year study period dying prior to discharge. The proportion of admitted patients who required surgical interventions was 15.7%, of which 89.9% were orthopedic, 4.7% neurologic, and 2.6% abdominal. Among individuals admitted to inpatient care, mean hospital length of stay was 2.4 days (95% confidence interval, 2.2-2.6) and 95.6% underwent routine discharge home. The mean total charge for all patients was $1941 (95% confidence interval, $1890-$1992) with substantial

  2. Modeling alpine grasslands with two integrated hydrologic models: a comparison of the different process representation in CATHY and GEOtop

    NASA Astrophysics Data System (ADS)

    Camporese, M.; Bertoldi, G.; Bortoli, E.; Wohlfahrt, G.

    2017-12-01

    Integrated hydrologic surface-subsurface models (IHSSMs) are increasingly used as prediction tools to solve simultaneously for states and fluxes in and between multiple terrestrial compartments (e.g., snow cover, surface water, groundwater), in an attempt to tackle environmental problems holistically. Two such models, CATHY and GEOtop, are used in this study to investigate their capabilities to reproduce hydrological processes in alpine grasslands. The two models differ significantly in the complexity of their representation of the surface energy balance and their solution of the Richards equation for water flow in the variably saturated subsurface. The main goal of this research is to show how these differences in process representation can lead to different predictions of hydrologic states and fluxes, in the simulation of an experimental site located in the Venosta Valley (South Tyrol, Italy). Here, a large set of relevant hydrological data (e.g., evapotranspiration, soil moisture) has been collected, with ground and remote sensing observations. The area of interest is part of a Long-Term Ecological Research (LTER) site, a steep, heterogeneous mountain slope where the predominant land use types are meadow, pasture, and forest. The comparison between data and model predictions, as well as between simulations with the two IHSSMs, contributes to advancing our understanding of the tradeoffs between different complexities in the models' process representation, model accuracy, and the ability to explain observed hydrological dynamics in alpine environments.
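
    For reference, the variably saturated subsurface flow equation at issue is Richards' equation, written here in mixed form from the standard literature:

        \frac{\partial \theta(\psi)}{\partial t} = \nabla \cdot \bigl[ K(\psi)\, \nabla(\psi + z) \bigr] + q

    with \theta the volumetric water content, \psi the pressure head, K(\psi) the unsaturated hydraulic conductivity, z the elevation head, and q a source/sink term; as noted above, the two models differ in how they solve this equation and in their surface energy balance.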

  3. A computational framework for modeling targets as complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network-based framework called the complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, the framework provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.

  4. High School Football Players Use Their Helmets to Tackle Other Players Despite Knowing the Risks

    PubMed Central

    Kuriyama, Andrew M; Nakatsuka, Austin S

    2017-01-01

    There is greater attention to head-related injuries and concussions in American football. The helmet's structural safety and the way that football players use their helmets are important in preventing head injuries. Current strategies include penalizing players for high-risk behavior such as leading with their helmet or hitting an opposing player above the shoulder. Passive strategies include helmet modification to better protect the head of the players or to change the playing style of the players. Hawai‘i high school varsity football players were surveyed to determine how they use their helmets and how a new helmet design would affect their style of play. One hundred seventy-seven surveys were completed; 79% said that they used their helmet to hit an opposing player during a tackle and 46% said they made this contact intentionally. When asked about modifying helmets with a soft material on the outside, 48% said they thought putting a soft cover over a regular helmet would protect their head better. However, many participants said that putting a soft cover over their regular helmet was a bad idea for various reasons. Most young football players use their helmets to block or tackle despite being taught they would be penalized or potentially injured if they did so. By gaining a better understanding of why and how players use their helmets and how they would respond to new helmet designs, steps can be taken to reduce head injuries for all levels of play. PMID:28352493

  5. High School Football Players Use Their Helmets to Tackle Other Players Despite Knowing the Risks.

    PubMed

    Kuriyama, Andrew M; Nakatsuka, Austin S; Yamamoto, Loren G

    2017-03-01

    There is greater attention to head-related injuries and concussions in American football. The helmet's structural safety and the way that football players use their helmets are important in preventing head injuries. Current strategies include penalizing players for high-risk behavior such as leading with their helmet or hitting an opposing player above the shoulder. Passive strategies include helmet modification to better protect the head of the players or to change the playing style of the players. Hawai'i high school varsity football players were surveyed to determine how they use their helmets and how a new helmet design would affect their style of play. One hundred seventy-seven surveys were completed; 79% said that they used their helmet to hit an opposing player during a tackle and 46% said they made this contact intentionally. When asked about modifying helmets with a soft material on the outside, 48% said they thought putting a soft cover over a regular helmet would protect their head better. However, many participants said that putting a soft cover over their regular helmet was a bad idea for various reasons. Most young football players use their helmets to block or tackle despite being taught they would be penalized or potentially injured if they did so. By gaining a better understanding of why and how players use their helmets and how they would respond to new helmet designs, steps can be taken to reduce head injuries for all levels of play.

  6. Contrasting model complexity under a changing climate in a headwaters catchment.

    NASA Astrophysics Data System (ADS)

    Foster, L.; Williams, K. H.; Maxwell, R. M.

    2017-12-01

    Alpine, snowmelt-dominated catchments are the source of water for more than 1/6th of the world's population. These catchments are topographically complex, leading to steep weather gradients and nonlinear relationships between water and energy fluxes. Recent evidence suggests that alpine systems are more sensitive to climate warming, but these regions are vastly simplified in climate models and operational water management tools due to computational limitations. Simultaneously, point-scale observations are often extrapolated to larger regions where feedbacks can both exacerbate or mitigate locally observed changes. It is critical to determine whether projected climate impacts are robust to different methodologies, including model complexity. Using high performance computing and an integrated model of a representative headwater catchment we determined the hydrologic response from 30 projected climate changes to precipitation, temperature and vegetation for the Rocky Mountains. Simulations were run at 100 m and 1 km resolution, and with and without lateral subsurface flow, in order to vary model complexity. We found that model complexity alters nonlinear relationships between water and energy fluxes. Higher-resolution models predicted larger changes per degree of temperature increase than lower resolution models, suggesting that reductions to snowpack, surface water, and groundwater due to warming may be underestimated in simple models. Increases in temperature were found to have a larger impact on water fluxes and stores than changes in precipitation, corroborating previous research showing that mountain systems are significantly more sensitive to temperature changes than to precipitation changes and that increases in winter precipitation are unlikely to compensate for increased evapotranspiration in a higher energy environment. These numerical experiments help to (1) bracket the range of uncertainty in the published literature of climate change impacts on headwater

  7. Schizophrenia: an integrative approach to modelling a complex disorder

    PubMed Central

    Robertson, George S.; Hori, Sarah E.; Powell, Kelly J.

    2006-01-01

    The discovery of candidate susceptibility genes for schizophrenia and the generation of mice lacking proteins that reproduce biochemical processes that are disrupted in this mental illness offer unprecedented opportunities for improved modelling of this complex disorder. Several lines of evidence indicate that obstetrical complications, as well as fetal or neonatal exposure to viral infection, are predisposing events for some forms of schizophrenia. These environmental events can be modelled in animals, resulting in some of the characteristic features of schizophrenia; however, animal models have yet to be developed that encompass both environmental and genetic aspects of this mental illness. A large number of candidate schizophrenia susceptibility genes have been identified that encode proteins implicated in the regulation of synaptic plasticity, neurotransmission, neuronal migration, cell adherence, signal transduction, energy metabolism and neurite outgrowth. In support of the importance of these processes in schizophrenia, mice that have reduced levels or completely lack proteins that control glutamatergic neurotransmission, neuronal migration, cell adherence, signal transduction, neurite outgrowth and synaptic plasticity display many features reminiscent of schizophrenia. In the present review, we discuss strategies for modelling schizophrenia that involve treating mice that bear these mutations in a variety of ways to better model both environmental and genetic factors responsible for this complex mental illness according to a “two-hit hypothesis.” Because rodents are able to perform complex cognitive tasks using odour but not visual or auditory cues, we hypothesize that olfactory-based tests of cognitive performance should be used to search for novel therapeutics that ameliorate the cognitive deficits that are a feature of this devastating mental disorder. PMID:16699601

  8. Modeling and simulation for fewer-axis grinding of complex surface

    NASA Astrophysics Data System (ADS)

    Li, Zhengjian; Peng, Xiaoqiang; Song, Ci

    2017-10-01

    As the basis of fewer-axis grinding of complex surfaces, the grinding mathematical model is of great importance. A mathematical model of the grinding wheel was established, and then the coordinates and normal vector of the wheel profile could be calculated. Through normal vector matching at the cutter contact point and a coordinate system transformation, the grinding mathematical model was established to work out the coordinates of the cutter location point. Based on the model, interference analysis was simulated to find the right position and posture of the workpiece for grinding. Then positioning errors of the workpiece, including the translation positioning error and the rotation positioning error, were analyzed respectively, and the main locating datum was obtained. According to the analysis results, the grinding tool path was planned and generated to grind the complex surface, and good form accuracy was obtained. The grinding mathematical model is simple, feasible and can be widely applied.
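
    A small sketch of the cutter-location step described above, under a simplifying ball-type tool assumption: the cutter location (CL) point is offset from the cutter contact (CC) point along the unit surface normal by the tool radius. Names and values are illustrative, not the paper's wheel model.

        import numpy as np

        def cutter_location(cc_point, surface_normal, tool_radius):
            # Normalize the surface normal (the "normal vector matching" step),
            # then offset the contact point along it by the tool radius.
            n = np.asarray(surface_normal, dtype=float)
            n /= np.linalg.norm(n)
            return np.asarray(cc_point, dtype=float) + tool_radius * n

        print(cutter_location([10.0, 5.0, 2.0], [0.0, 0.0, 1.0], 3.0))  # -> [10. 5. 5.]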

  9. A modeling framework for exposing risks in complex systems.

    PubMed

    Sharit, J

    2000-08-01

    This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.

  10. Nonlinear model of epidemic spreading in a complex social network.

    PubMed

    Kosiński, Robert A; Grabowski, A

    2007-10-01

    The epidemic spreading in a human society is a complex process, which can be described on the basis of a nonlinear mathematical model. In such an approach the complex and hierarchical structure of the social network (which has implications for the spreading of pathogens and can be treated as a complex network) can be taken into account. In our model each individual has one of five permitted states: susceptible, infected, infective, unsusceptible, or dead. This extends the SEIR model used in epidemiology. The state of an individual changes in time, depending on the previous state and the interactions with other individuals. The description of the interpersonal contacts is based on experimental observations of social relations in the community. It includes spatial localization of the individuals and the hierarchical structure of interpersonal interactions. Numerical simulations were performed for different types of epidemics, giving the progress of the spreading process and typical relationships (e.g. range of epidemic in time, the epidemic curve). The spreading process has a complex and spatially chaotic character. The time dependence of the number of infective individuals shows the nonlinear character of the spreading process. We investigate the influence of preventive vaccinations on the spreading process. In particular, for a critical value of preventively vaccinated individuals the percolation threshold is observed and the epidemic is suppressed.
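
    A bare-bones sketch of this kind of discrete-state simulation on a contact network, reduced to susceptible/infective/removed states for brevity; the paper's model adds latent and dead states and a hierarchical, spatially localized network, and all probabilities below are illustrative.

        import random

        random.seed(42)
        N, p_edge, p_inf, p_rec = 200, 0.03, 0.2, 0.1
        # Random contact graph stands in for the hierarchical social network.
        neighbors = {i: set() for i in range(N)}
        for i in range(N):
            for j in range(i + 1, N):
                if random.random() < p_edge:
                    neighbors[i].add(j)
                    neighbors[j].add(i)

        state = {i: "S" for i in range(N)}   # S, I (infective), R (unsusceptible)
        state[0] = "I"
        for step in range(100):
            new = dict(state)
            for i, s in state.items():
                if s == "S" and any(state[j] == "I" for j in neighbors[i]):
                    if random.random() < p_inf:
                        new[i] = "I"
                elif s == "I" and random.random() < p_rec:
                    new[i] = "R"   # synchronous update per time step
            state = new
        print(sum(s != "S" for s in state.values()), "ever infected")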

  11. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    PubMed

    Simic, Vladimir

    2016-06-01

    As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on the management of ELVs is necessary in order to tackle this important environmental challenge more successfully. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicle management under rigorous environmental regulations. The proposed model can incorporate various kinds of uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem involving uncertainty. The presented model has advantages in providing a basis for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
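
    For orientation, a chance constraint of the kind used here requires each system constraint to hold with a prescribed probability, and is converted to a deterministic equivalent at the corresponding quantile (generic notation, not the paper's):

        \Pr\Bigl\{ \sum_j a_{ij} x_j \le b_i(\omega) \Bigr\} \ge 1 - p_i
        \quad\Longleftrightarrow\quad
        \sum_j a_{ij} x_j \le b_i^{(p_i)}

    where b_i^{(p_i)} is the p_i-quantile of the random right-hand side b_i(\omega); lowering the violation probability p_i trades economic efficiency for system reliability, as in the solutions discussed above.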

  12. Deciphering the complexity of acute inflammation using mathematical models.

    PubMed

    Vodovotz, Yoram

    2006-01-01

    Various stresses elicit an acute, complex inflammatory response, leading to healing but sometimes also to organ dysfunction and death. We constructed both equation-based models (EBM) and agent-based models (ABM) of various degrees of granularity--which encompass the dynamics of relevant cells, cytokines, and the resulting global tissue dysfunction--in order to begin to unravel these inflammatory interactions. The EBMs describe and predict various features of septic shock and trauma/hemorrhage (including the response to anthrax, preconditioning phenomena, and irreversible hemorrhage) and were used to simulate anti-inflammatory strategies in clinical trials. The ABMs that describe the interrelationship between inflammation and wound healing yielded insights into intestinal healing in necrotizing enterocolitis, vocal fold healing during phonotrauma, and skin healing in the setting of diabetic foot ulcers. Modeling may help in understanding the complex interactions among the components of inflammation and response to stress, and therefore aid in the development of novel therapies and diagnostics.

  13. Low frequency complex dielectric (conductivity) response of dilute clay suspensions: Modeling and experiments.

    PubMed

    Hou, Chang-Yu; Feng, Ling; Seleznev, Nikita; Freed, Denise E

    2018-09-01

    In this work, we establish an effective medium model to describe the low-frequency complex dielectric (conductivity) dispersion of dilute clay suspensions. We use previously obtained low-frequency polarization coefficients for a charged oblate spheroidal particle immersed in an electrolyte as the building block for the Maxwell Garnett mixing formula to model the dilute clay suspension. The complex conductivity phase dispersion exhibits a near-resonance peak when the clay grains have a narrow size distribution. The peak frequency is associated with the size distribution as well as the shape of clay grains and is often referred to as the characteristic frequency. In contrast, if the size of the clay grains has a broad distribution, the phase peak is broadened and can disappear into the background of the canonical phase response of the brine. To benchmark our model, the low-frequency dispersion of the complex conductivity of dilute clay suspensions is measured using a four-point impedance measurement, which can be reliably calibrated in the frequency range between 0.1 Hz and 10 kHz. By using a minimal number of fitting parameters when reliable information is available as input for the model and carefully examining the issue of potential over-fitting, we found that our model can be used to fit the measured dispersion of the complex conductivity with reasonable parameters. The good match between the modeled and experimental complex conductivity dispersion allows us to argue that our simplified model captures the essential physics for describing the low-frequency dispersion of the complex conductivity of dilute clay suspensions. Copyright © 2018 Elsevier Inc. All rights reserved.
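
    For reference, the classical Maxwell Garnett mixing rule underlying the effective-medium construction is, for spherical inclusions of complex permittivity \varepsilon_i at volume fraction f in a host medium \varepsilon_m:

        \frac{\varepsilon_{\mathrm{eff}} - \varepsilon_m}{\varepsilon_{\mathrm{eff}} + 2\varepsilon_m} = f\, \frac{\varepsilon_i - \varepsilon_m}{\varepsilon_i + 2\varepsilon_m}

    In the model described above, the simple spherical polarizability on the right is replaced by the low-frequency polarization coefficients of a charged oblate spheroid in an electrolyte, which is what produces the near-resonance phase peak for narrow grain-size distributions.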

  14. Cx-02 Program, workshop on modeling complex systems

    USGS Publications Warehouse

    Mossotti, Victor G.; Barragan, Jo Ann; Westergard, Todd D.

    2003-01-01

    This publication contains the abstracts and program for the workshop on complex systems that was held on November 19-21, 2002, in Reno, Nevada. Complex systems are ubiquitous within the realm of the earth sciences. Geological systems consist of a multiplicity of linked components with nested feedback loops; the dynamics of these systems are non-linear, iterative, multi-scale, and operate far from equilibrium. That notwithstanding, it appears that, with the exception of papers on seismic studies, geology and geophysics work has been disproportionately underrepresented at regional and national meetings on complex systems relative to papers in the life sciences. This is somewhat puzzling because geologists and geophysicists are, in many ways, preadapted to thinking in terms of complex-system mechanisms. Geologists and geophysicists think about processes involving large volumes of rock below the sunlit surface of Earth, the accumulated consequence of processes extending hundreds of millions of years into the past. Not only do geologists think in the abstract by virtue of these vast time spans, but most of the evidence is also out of sight. A primary goal of this workshop is to begin to bridge the gap between the Earth sciences and life sciences through demonstration of the universality of complex systems science, both philosophically and in model structures.

  15. An Ontology for Modeling Complex Inter-relational Organizations

    NASA Astrophysics Data System (ADS)

    Wautelet, Yves; Neysen, Nicolas; Kolp, Manuel

    This paper presents an ontology for organizational modeling through multiple complementary aspects. The primary goal of the ontology is to provide an adequate set of related concepts for studying complex organizations involved in many relationships at the same time. In this paper, we define complex organizations as networked organizations involved in a market eco-system that are playing several roles simultaneously. In such a context, traditional approaches focus on the macro analytic level of transactions; this is supplemented here with a micro analytic study of the actors' rationale. At first, the paper reviews the enterprise ontologies literature to position our proposal and expose its contributions and limitations. The ontology is then brought to an advanced level of formalization: a meta-model in the form of a UML class diagram provides an overview of the ontology concepts and their relationships, which are formally defined. Finally, the paper presents the case study on which the ontology has been validated.

  16. Enabling Controlling Complex Networks with Local Topological Information.

    PubMed

    Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene

    2018-03-15

    Complex networks characterize the nature of internal/external interactions in real-world systems including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost of driving the network to a predefined state with a given number of control inputs. For large complex networks without global information on network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, where every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
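
    For contrast with the distributed method, a sketch of the centralized matching baseline it approximates: in the maximum-matching picture of structural controllability, every node whose "in" copy is unmatched must receive a control input. The edge list below is illustrative.

        import networkx as nx
        from networkx.algorithms import bipartite

        # Split each node v into v_out / v_in copies, take a maximum matching of
        # the bipartite representation, and drive the nodes whose in-copy is unmatched.
        edges = [(1, 2), (2, 3), (2, 4), (4, 5)]
        nodes = {u for e in edges for u in e}
        B = nx.Graph()
        tops = [f"{u}_out" for u in nodes]
        B.add_nodes_from(tops, bipartite=0)
        B.add_edges_from((f"{u}_out", f"{v}_in") for u, v in edges)
        matching = bipartite.maximum_matching(B, top_nodes=tops)
        matched_in = {k for k in matching if k.endswith("_in")}
        drivers = sorted(v for v in nodes if f"{v}_in" not in matched_in)
        print("driver nodes:", drivers)   # minimum driver-set size = N - |matching|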

  17. Efficient algorithms for accurate hierarchical clustering of huge datasets: tackling the entire protein space

    PubMed Central

    Loewenstein, Yaniv; Portugaly, Elon; Fromer, Menachem; Linial, Michal

    2008-01-01

    Motivation: UPGMA (average linking) is probably the most popular algorithm for hierarchical data clustering, especially in computational biology. However, UPGMA requires the entire dissimilarity matrix in memory. Due to this prohibitive requirement, UPGMA is not scalable to very large datasets. Application: We present a novel class of memory-constrained UPGMA (MC-UPGMA) algorithms. Given any practical memory size constraint, this framework guarantees the correct clustering solution without explicitly requiring all dissimilarities in memory. The algorithms are general and are applicable to any dataset. We present a data-dependent characterization of hardness and clustering efficiency. The presented concepts are applicable to any agglomerative clustering formulation. Results: We apply our algorithm to the entire collection of protein sequences, to automatically build a comprehensive evolutionary-driven hierarchy of proteins from sequence alone. The newly created tree captures protein families better than state-of-the-art large-scale methods such as CluSTr, ProtoNet4 or single-linkage clustering. We demonstrate that leveraging the entire mass embodied in all sequence similarities allows us to significantly improve on current protein family clusterings, which are unable to directly tackle the sheer mass of this data. Furthermore, we argue that non-metric constraints are an inherent complexity of the sequence space and should not be overlooked. The robustness of UPGMA allows significant improvement, especially for multidomain proteins, and for large or divergent families. Availability: A comprehensive tree built from all UniProt sequence similarities, together with navigation and classification tools, will be made available as part of the ProtoNet service. A C++ implementation of the algorithm is available on request. Contact: lonshy@cs.huji.ac.il PMID:18586742
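
    For contrast with the memory-constrained version, the textbook UPGMA step that MC-UPGMA preserves: repeatedly merge the pair of clusters with the smallest size-weighted average inter-cluster dissimilarity. A naive in-memory sketch with illustrative distances.

        def upgma(dist, names):
            # dist: full symmetric dissimilarity matrix as a dict of dicts.
            # MC-UPGMA computes the same tree without holding all of it in memory.
            clusters = {n: 1 for n in names}   # cluster label -> size
            while len(clusters) > 1:
                a, b = min(((x, y) for x in clusters for y in clusters if x < y),
                           key=lambda p: dist[p[0]][p[1]])
                merged, sa, sb = f"({a},{b})", clusters[a], clusters[b]
                dist[merged] = {}
                for c in clusters:
                    if c not in (a, b):   # average linkage, weighted by cluster size
                        d = (sa * dist[a][c] + sb * dist[b][c]) / (sa + sb)
                        dist[merged][c] = d
                        dist[c][merged] = d
                for x in (a, b):
                    clusters.pop(x)
                clusters[merged] = sa + sb
            return next(iter(clusters))

        D = {"A": {"B": 2.0, "C": 6.0}, "B": {"A": 2.0, "C": 5.0}, "C": {"A": 6.0, "B": 5.0}}
        print(upgma(D, ["A", "B", "C"]))   # -> ((A,B),C)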

  18. 40 CFR 80.49 - Fuels to be used in augmenting the complex emission model through vehicle testing.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... complex emission model through vehicle testing. 80.49 Section 80.49 Protection of Environment... Reformulated Gasoline § 80.49 Fuels to be used in augmenting the complex emission model through vehicle testing... augmenting the complex emission model with a parameter not currently included in the complex emission model...

  19. 40 CFR 80.49 - Fuels to be used in augmenting the complex emission model through vehicle testing.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... complex emission model through vehicle testing. 80.49 Section 80.49 Protection of Environment... Reformulated Gasoline § 80.49 Fuels to be used in augmenting the complex emission model through vehicle testing... augmenting the complex emission model with a parameter not currently included in the complex emission model...

  20. 40 CFR 80.49 - Fuels to be used in augmenting the complex emission model through vehicle testing.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... complex emission model through vehicle testing. 80.49 Section 80.49 Protection of Environment... Reformulated Gasoline § 80.49 Fuels to be used in augmenting the complex emission model through vehicle testing... augmenting the complex emission model with a parameter not currently included in the complex emission model...

  1. 40 CFR 80.49 - Fuels to be used in augmenting the complex emission model through vehicle testing.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... complex emission model through vehicle testing. 80.49 Section 80.49 Protection of Environment... Reformulated Gasoline § 80.49 Fuels to be used in augmenting the complex emission model through vehicle testing... augmenting the complex emission model with a parameter not currently included in the complex emission model...

  2. A new multi-objective optimization model for preventive maintenance and replacement scheduling of multi-component systems

    NASA Astrophysics Data System (ADS)

    Moghaddam, Kamran S.; Usher, John S.

    2011-07-01

    In this article, a new multi-objective optimization model is developed to determine the optimal preventive maintenance and replacement schedules in a repairable and maintainable multi-component system. In this model, the planning horizon is divided into discrete and equally-sized periods in which three possible actions must be planned for each component, namely maintenance, replacement, or do nothing. The objective is to determine a plan of actions for each component in the system while minimizing the total cost and maximizing overall system reliability simultaneously over the planning horizon. Because of the complex, combinatorial, and highly nonlinear structure of the mathematical model, two metaheuristic solution methods, a generational genetic algorithm and simulated annealing, are applied to tackle the problem. The Pareto optimal solutions that provide good tradeoffs between the total cost and the overall reliability of the system can be obtained by the solution approach. Such a modeling approach should be useful for maintenance planners and engineers tasked with the problem of developing recommended maintenance plans for complex systems of components.
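
    The selection core shared by both metaheuristics is the Pareto test implied above: a schedule is kept if no other schedule is at least as good on both objectives (cost minimized, reliability maximized) and strictly better on one. A sketch with hypothetical objective tuples, not the paper's evaluations.

        def dominates(a, b):
            # a, b are (total_cost, reliability); cost minimized, reliability maximized.
            return (a[0] <= b[0] and a[1] >= b[1]) and (a[0] < b[0] or a[1] > b[1])

        def pareto_front(solutions):
            return [s for s in solutions if not any(dominates(t, s) for t in solutions)]

        # Hypothetical (cost, system reliability) evaluations of candidate schedules.
        candidates = [(120.0, 0.90), (100.0, 0.85), (150.0, 0.97), (130.0, 0.88)]
        print(pareto_front(candidates))   # (130, 0.88) is dominated by (120, 0.90)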

  3. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  4. A complex fermionic tensor model in d dimensions

    NASA Astrophysics Data System (ADS)

    Prakash, Shiroman; Sinha, Ritam

    2018-02-01

    In this note, we study a melonic tensor model in d dimensions based on three-index Dirac fermions with a four-fermion interaction. Summing the melonic diagrams at strong coupling allows one to define a formal large-N saddle point in arbitrary d and calculate the spectrum of scalar bilinear singlet operators. For d = 2 - ɛ the theory is an infrared fixed point, which we find has a purely real spectrum that we determine numerically for arbitrary d < 2, and analytically as a power series in ɛ. The theory appears to be weakly interacting when ɛ is small, suggesting that fermionic tensor models in 1 dimension can be studied in an ɛ expansion. For d > 2, the spectrum can still be calculated using the saddle point equations, which may define a formal large-N ultraviolet fixed point analogous to the Gross-Neveu model in d > 2. For 2 < d < 6, we find that the spectrum contains at least one complex scalar eigenvalue (similar to the complex eigenvalue present in the bosonic tensor model recently studied by Giombi, Klebanov and Tarnopolsky) which indicates that the theory is unstable. We also find that the fixed point is weakly interacting when d = 6 (or more generally d = 4n + 2) and has a real spectrum for 6 < d < 6.14, which we present as a power series in ɛ in 6 + ɛ dimensions.

  5. High-resolution dust modelling over complex terrains in West Asia

    NASA Astrophysics Data System (ADS)

    Basart, S.; Vendrell, L.; Baldasano, J. M.

    2016-12-01

    The present work demonstrates the impact of model resolution on dust propagation in a complex-terrain region such as West Asia. For this purpose, two simulations using the NMMB/BSC-Dust model are performed and analysed, one with a high horizontal resolution (at 0.03° × 0.03°) and one with a lower horizontal resolution (at 0.33° × 0.33°). Both model experiments cover two intense dust storms that occurred on 17-20 March 2012 as a consequence of strong northwesterly Shamal winds that spanned thousands of kilometres in West Asia. The comparison with ground-based (surface weather stations and sunphotometers) and satellite aerosol observations (Aqua/MODIS and MSG/SEVIRI) shows that, despite differences in the magnitude of the simulated dust concentrations, the model is able to reproduce these two dust outbreaks. Differences in the simulated dust spread between the two runs arise in regional dust transport areas in south-western Saudi Arabia, Yemen and Oman. The complex orography in south-western Saudi Arabia, Yemen and Oman (with peaks higher than 3000 m) has an impact on the transported dust concentration fields over mountain regions. Differences between the two model configurations are mainly associated with the channelization of the dust flow through valleys and with differences in the modelled altitude of the mountains, which alter the meteorology and block the dust fronts, limiting dust transport. These results demonstrate how dust prediction in the vicinity of complex terrain improves with high-horizontal-resolution simulations.

  6. Capturing complexity in work disability research: application of system dynamics modeling methodology.

    PubMed

    Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J

    2016-01-01

    Work disability (WD) is characterized by variable and occasionally undesirable outcomes. The underlying determinants of WD outcomes include patterns of dynamic relationships among health, personal, organizational and regulatory factors that have been challenging to characterize, and inadequately represented by contemporary WD models. System dynamics modeling (SDM) methodology applies a sociotechnical systems thinking lens to view WD systems as comprising a range of influential factors linked by feedback relationships. SDM can potentially overcome limitations in contemporary WD models by uncovering causal feedback relationships, and conceptualizing dynamic system behaviors. It employs a collaborative and stakeholder-based model building methodology to create a visual depiction of the system as a whole. SDM can also enable researchers to run dynamic simulations to provide evidence of anticipated or unanticipated outcomes that could result from policy and programmatic intervention. SDM may advance rehabilitation research by providing greater insights into the structure and dynamics of WD systems while helping to understand inherent complexity. Challenges related to data availability, determining validity, and the extensive time and technical skill requirements for model building may limit SDM's use in the field and should be considered. Contemporary work disability (WD) models provide limited insight into complexity associated with WD processes. System dynamics modeling (SDM) has the potential to capture complexity through a stakeholder-based approach that generates a simulation model consisting of multiple feedback loops. SDM may enable WD researchers and practitioners to understand the structure and behavior of the WD system as a whole, and inform development of improved strategies to manage straightforward and complex WD cases.
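    To make the stock-and-flow idea concrete, the sketch below integrates a toy two-stock system with a balancing feedback loop using Euler steps. The stocks, flows, and parameters are hypothetical illustrations of SDM mechanics, not a validated work disability model.

```python
# Toy system dynamics sketch: a caseload stock and a return-to-work capacity
# stock linked by a balancing feedback loop. All quantities are hypothetical.
dt, t_end = 0.1, 100.0
caseload = 100.0        # stock: workers currently off work
capacity = 10.0         # stock: employer return-to-work capacity

for _ in range(int(t_end / dt)):
    # flow: returns to work rise with capacity (saturating effect)
    returns = 0.02 * caseload * capacity / (capacity + 5.0)
    # feedback: a large caseload erodes capacity; a small one rebuilds it
    capacity_change = 0.5 - 0.003 * caseload
    caseload += dt * (1.5 - returns)      # constant inflow of new cases
    capacity = max(0.0, capacity + dt * capacity_change)

print(f"equilibrium caseload ~ {caseload:.1f}, capacity ~ {capacity:.1f}")
```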

  7. Structural model of control system for hydraulic stepper motor complex

    NASA Astrophysics Data System (ADS)

    Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.

    2018-03-01

    The article considers the problem of developing a structural model of the control system for a hydraulic stepper drive complex. A comparative analysis of stepper drives and an assessment of the applicability of hydraulic stepper motors (HSM) for solving problems requiring accurate displacement in space with subsequent positioning of the object are carried out. The presented structural model of the automated control system of the multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as the process of its control based on the transfer of control signals to the solenoid valves by the controller. The models and methods described in the article can be used to formalize the control process in technical systems based on the application of hydraulic stepper drives and allow switching from mechanical control to automated control.

  8. Bayesian Mixed-Membership Models of Complex and Evolving Networks

    DTIC Science & Technology

    2006-12-01


  9. Comment on "A dynamic network model of mTOR signaling reveals TSC-independent mTORC2 regulation": building a model of the mTOR signaling network with a potentially faulty tool.

    PubMed

    Manning, Brendan D

    2012-07-10

    In their study published in Science Signaling (Research Article, 27 March 2012, DOI: 10.1126/scisignal.2002469), Dalle Pezze et al. tackle the dynamic and complex wiring of the signaling network involving the protein kinase mTOR, which exists within two distinct protein complexes (mTORC1 and mTORC2) that differ in their regulation and function. The authors use a combination of immunoblotting for specific phosphorylation events and computational modeling. The primary experimental tool employed is to monitor the autophosphorylation of mTOR on Ser(2481) in cell lysates as a surrogate for mTOR activity, which the authors conclude is a specific readout for mTORC2. However, Ser(2481) phosphorylation occurs on both mTORC1 and mTORC2 and will dynamically change as the network through which these two complexes are connected is manipulated. Therefore, models of mTOR network regulation built using this tool are inherently imperfect and open to alternative explanations. Specific issues with the main conclusion made in this study, involving the TSC1-TSC2 (tuberous sclerosis complex 1 and 2) complex and its potential regulation of mTORC2, are discussed here. A broader goal of this Letter is to clarify to other investigators the caveats of using mTOR Ser(2481) phosphorylation in cell lysates as a specific readout for either of the two mTOR complexes.

  10. Reducing the Complexity of an Agent-Based Local Heroin Market Model

    PubMed Central

    Heard, Daniel; Bobashev, Georgiy V.; Morris, Robert J.

    2014-01-01

    This project explores techniques for reducing the complexity of an agent-based model (ABM). The analysis involved a model developed from the ethnographic research of Dr. Lee Hoffer in the Larimer area heroin market, which involved drug users, drug sellers, homeless individuals and police. The authors used statistical techniques to create a reduced version of the original model which maintained simulation fidelity while reducing computational complexity. This involved identifying key summary quantities of individual customer behavior as well as overall market activity and replacing some agents with probability distributions and regressions. The model was then extended to allow external market interventions in the form of police busts. Extensions of this research perspective, as well as its strengths and limitations, are discussed. PMID:25025132

  11. Agent-based modelling in synthetic biology.

    PubMed

    Gorochowski, Thomas E

    2016-11-30

    Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).
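    As a concrete illustration of the approach, the sketch below implements a minimal agent-based simulation of quorum sensing: agents divide stochastically, secrete a shared signal, and switch a gene on once the signal exceeds their individual threshold. All parameters are hypothetical, and the tools reviewed in the article are far more detailed (spatial, mechanical, and genetic).

```python
import numpy as np

# Minimal agent-based sketch of quorum sensing (all parameters hypothetical):
# each entry of `thresholds` is one cell agent with its own switching point.
rng = np.random.default_rng(1)
thresholds = rng.normal(50.0, 10.0, size=10)
signal, dt = 0.0, 0.1

for step in range(600):
    # stochastic division: each agent divides with a density-limited probability
    p_div = max(dt * 0.2 * (1.0 - len(thresholds) / 500.0), 0.0)
    divides = rng.random(len(thresholds)) < p_div
    if divides.any():
        daughters = rng.normal(50.0, 10.0, size=divides.sum())
        thresholds = np.concatenate([thresholds, daughters])
    # shared signal: secretion proportional to population, first-order decay
    signal += dt * (0.5 * len(thresholds) - 0.2 * signal)

on = (signal > thresholds).mean()   # fraction of agents with the gene "on"
print(f"{len(thresholds)} agents, signal {signal:.0f}, fraction ON {on:.2f}")
```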

  12. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points).
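    To ground the GSA/UA machinery, here is a minimal variance-based (Sobol) sensitivity sketch in pure NumPy, using the common Saltelli/Jansen estimators on the well-known Ishigami test function; a real application would wrap the environmental model itself, and dedicated packages offer more robust sampling schemes.

```python
import numpy as np

# Ishigami test function; any scalar model of k inputs can be substituted.
def model(x):
    return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 \
        + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(0)
k, n = 3, 100_000
lo, hi = -np.pi, np.pi

# Saltelli-style sampling: two independent input matrices A and B.
A = rng.uniform(lo, hi, (n, k))
B = rng.uniform(lo, hi, (n, k))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # swap column i of A for column i of B
    fABi = model(ABi)
    S1 = np.mean(fB * (fABi - fA)) / var         # first-order index (Saltelli)
    ST = 0.5 * np.mean((fA - fABi) ** 2) / var   # total-order index (Jansen)
    print(f"input {i}: S1 = {S1:.3f}, ST = {ST:.3f}")
```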

  13. Beyond Aspirations: Deploying the Capability Approach to Tackle the Under-Representation in Higher Education of Young People from Deprived Communities

    ERIC Educational Resources Information Center

    Campbell, Laurie Anne; McKendrick, John H.

    2017-01-01

    This paper re-examines the low participation of young people from deprived communities through the lens of the capability approach. A fundamental problem for tackling widening participation is that much of the thinking of policymakers is grounded on the flawed "poverty of aspiration thesis". This paper contends that Sen's [1992.…

  14. Border Security: A Conceptual Model of Complexity

    DTIC Science & Technology

    2013-12-01

    This research applies complexity and system dynamics theory to the idea of border security, culminating in the development of...alternative policy options. E. LIMITATIONS OF RESEARCH AND MODEL This research explores whether border security is a living system. In other words, whether...border inspections. Washington State, for example, experienced a 50% drop in tourism and lost over $100 million in local revenue because of the

  15. Scale Mixture Models with Applications to Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Qin, Zhaohui S.; Damien, Paul; Walker, Stephen

    2003-11-01

    Scale mixtures of uniform distributions are used to model non-normal data in time series and econometrics in a Bayesian framework. Heteroscedastic and skewed data models are also tackled using scale mixtures of uniform distributions.
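    A standard concrete instance of the construction is the normal distribution expressed as a scale mixture of uniforms: with a Gamma mixing density (shape 3/2, rate 1/2), the uniform conditional integrates to a Gaussian marginal, and non-normal models follow by changing the mixing distribution.

```latex
% Scale-mixture-of-uniforms representation of the normal distribution
% (Gamma in the shape-rate parametrization):
X \mid u \;\sim\; \mathrm{Uniform}\!\left(\mu - \sigma\sqrt{u},\; \mu + \sigma\sqrt{u}\right),
\qquad
u \;\sim\; \mathrm{Gamma}\!\left(\tfrac{3}{2},\, \tfrac{1}{2}\right)
\;\;\Longrightarrow\;\;
X \;\sim\; \mathcal{N}\!\left(\mu,\, \sigma^{2}\right)
```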

  16. Persistent model order reduction for complex dynamical systems using smooth orthogonal decomposition

    NASA Astrophysics Data System (ADS)

    Ilbeigi, Shahab; Chelidze, David

    2017-11-01

    Full-scale complex dynamic models are not effective for parametric studies due to the inherent constraints on available computational power and storage resources. A persistent reduced order model (ROM) that is robust, stable, and provides high-fidelity simulations for a relatively wide range of parameters and operating conditions can provide a solution to this problem. The fidelity of a new framework for persistent model order reduction of large and complex dynamical systems is investigated. The framework is validated using several numerical examples including a large linear system and two complex nonlinear systems with material and geometrical nonlinearities. While the framework is used for identifying the robust subspaces obtained from both proper and smooth orthogonal decompositions (POD and SOD, respectively), the results show that SOD outperforms POD in terms of stability, accuracy, and robustness.
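    The contrast between the two decompositions can be sketched in a few lines: POD keeps the most energetic directions of the state covariance, while SOD solves a generalized eigenproblem that also penalizes temporal roughness. The toy data below stand in for the paper's full-scale structural models.

```python
import numpy as np
from scipy.linalg import eigh

# Synthetic snapshot data: two temporal modes mixed into 8 degrees of freedom.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 1000)
modes = np.stack([np.sin(2 * np.pi * t), 0.1 * np.sin(6 * np.pi * t)], axis=1)
mixing = rng.normal(size=(2, 8))
X = modes @ mixing + 0.01 * rng.normal(size=(1000, 8))  # snapshots (time x dof)
V = np.gradient(X, t, axis=0)                           # time derivatives

# POD: dominant directions of the state covariance, via SVD of centered data.
_, s, Wt = np.linalg.svd(X - X.mean(0), full_matrices=False)
pod_modes = Wt.T

# SOD: generalized eigenproblem between state and velocity covariances,
# favoring modes that are both energetic and smooth in time.
Sx = np.cov(X, rowvar=False)
Sv = np.cov(V, rowvar=False)
vals, sod_modes = eigh(Sx, Sv)     # ascending; last columns are smoothest
print("POD singular values:", np.round(s[:3], 2))
print("SOD eigenvalues:", np.round(vals[-3:], 2))
```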

  17. Calibration of an Unsteady Groundwater Flow Model for a Complex, Strongly Heterogeneous Aquifer

    NASA Astrophysics Data System (ADS)

    Curtis, Z. K.; Liao, H.; Li, S. G.; Phanikumar, M. S.; Lusch, D.

    2016-12-01

    Modeling of groundwater systems characterized by complex three-dimensional structure and heterogeneity remains a significant challenge. Most of today's groundwater models are developed from relatively simple conceptual representations, chosen in favor of ease of calibration. As more complexities are modeled, e.g., by adding more layers and/or zones, or by introducing transient processes, more parameters have to be estimated, and issues related to ill-posed groundwater problems and non-unique calibration arise. Here, we explore the use of an alternative conceptual representation for groundwater modeling that is fully three-dimensional and can capture complex 3D heterogeneity (both systematic and "random") without over-parameterizing the aquifer system. In particular, we apply Transition Probability (TP) geostatistics to high-resolution borehole data from a water well database to characterize the complex 3D geology. Different aquifer material classes, e.g., `AQ' (aquifer material), `MAQ' (marginal aquifer material), `PCM' (partially confining material), and `CM' (confining material), are simulated, with the hydraulic properties of each material type as tuning parameters during calibration. The TP-based approach is applied to simulate unsteady groundwater flow in a large, complex, and strongly heterogeneous glacial aquifer system in Michigan across multiple spatial and temporal scales. The resulting model is calibrated to observed static water level data over a time span of 50 years. The results show that the TP-based conceptualization enables much more accurate and robust calibration/simulation than conventional deterministic layer/zone-based conceptual representations.
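    The core idea of TP geostatistics can be illustrated with a one-dimensional Markov-chain simulation of material classes down a borehole; the transition matrix below is hypothetical, whereas in a study like this it would be estimated from the borehole database (and extended to three dimensions).

```python
import numpy as np

# Toy 1D transition-probability simulation of aquifer material classes along
# a borehole. The transition matrix is illustrative, not calibrated to data.
rng = np.random.default_rng(3)
classes = ["AQ", "MAQ", "PCM", "CM"]
# P[i, j]: probability the next depth interval is class j given class i now
P = np.array([
    [0.80, 0.10, 0.05, 0.05],
    [0.15, 0.70, 0.10, 0.05],
    [0.05, 0.10, 0.70, 0.15],
    [0.05, 0.05, 0.15, 0.75],
])

state, column = 0, []
for _ in range(100):                 # simulate 100 depth intervals
    column.append(classes[state])
    state = rng.choice(4, p=P[state])
print(" ".join(column[:20]), "...")
```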

  18. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    PubMed

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  19. Modeling the surface tension of complex, reactive organic-inorganic mixtures

    NASA Astrophysics Data System (ADS)

    Schwier, A. N.; Viglione, G. A.; Li, Z.; McNeill, V. Faye

    2013-11-01

    Atmospheric aerosols can contain thousands of organic compounds which impact aerosol surface tension, affecting aerosol properties such as heterogeneous reactivity, ice nucleation, and cloud droplet formation. We present new experimental data for the surface tension of complex, reactive organic-inorganic aqueous mixtures mimicking tropospheric aerosols. Each solution contained 2-6 organic compounds, including methylglyoxal, glyoxal, formaldehyde, acetaldehyde, oxalic acid, succinic acid, leucine, alanine, glycine, and serine, with and without ammonium sulfate. We test two semi-empirical surface tension models and find that most reactive, complex, aqueous organic mixtures which do not contain salt are well described by a weighted Szyszkowski-Langmuir (S-L) model which was first presented by Henning et al. (2005). Two approaches for modeling the effects of salt were tested: (1) the Tuckermann approach (an extension of the Henning model with an additional explicit salt term), and (2) a new implicit method proposed here which employs experimental surface tension data obtained for each organic species in the presence of salt used with the Henning model. We recommend the use of method (2) for surface tension modeling of aerosol systems because the Henning model (using data obtained from organic-inorganic systems) and the Tuckermann approach provide similar modeling results and goodness-of-fit (χ2) values, yet the Henning model is a simpler and more physical approach to modeling the effects of salt, requiring fewer empirically determined parameters.
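    For reference, a common single-solute form of the Szyszkowski-Langmuir equation used in the aerosol literature is shown below; the weighted multi-solute version of Henning et al. (2005) replaces the single concentration term with a composition-weighted sum, and the exact parametrization should be taken from the original papers.

```latex
% One common single-solute Szyszkowski-Langmuir form (notation varies):
\sigma \;=\; \sigma_{w} \;-\; a\,T\,\ln\!\left(1 + b\,C\right)
% sigma_w: surface tension of pure water; T: temperature;
% C: solute concentration; a, b: compound-specific fitted parameters.
```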

  20. Modeling the surface tension of complex, reactive organic-inorganic mixtures

    NASA Astrophysics Data System (ADS)

    Schwier, A. N.; Viglione, G. A.; Li, Z.; McNeill, V. F.

    2013-01-01

    Atmospheric aerosols can contain thousands of organic compounds which impact aerosol surface tension, affecting aerosol properties such as cloud condensation nuclei (CCN) ability. We present new experimental data for the surface tension of complex, reactive organic-inorganic aqueous mixtures mimicking tropospheric aerosols. Each solution contained 2-6 organic compounds, including methylglyoxal, glyoxal, formaldehyde, acetaldehyde, oxalic acid, succinic acid, leucine, alanine, glycine, and serine, with and without ammonium sulfate. We test two surface tension models and find that most reactive, complex, aqueous organic mixtures which do not contain salt are well described by a weighted Szyszkowski-Langmuir (S-L) model which was first presented by Henning et al. (2005). Two approaches for modeling the effects of salt were tested: (1) the Tuckermann approach (an extension of the Henning model with an additional explicit salt term), and (2) a new implicit method proposed here which employs experimental surface tension data obtained for each organic species in the presence of salt used with the Henning model. We recommend the use of method (2) for surface tension modeling because the Henning model (using data obtained from organic-inorganic systems) and the Tuckermann approach provide similar modeling fits and goodness-of-fit (χ2) values, yet the Henning model is a simpler and more physical approach to modeling the effects of salt, requiring fewer empirically determined parameters.

  1. Evaluating models of remember-know judgments: complexity, mimicry, and discriminability.

    PubMed

    Cohen, Andrew L; Rotello, Caren M; Macmillan, Neil A

    2008-10-01

    Remember-know judgments provide additional information in recognition memory tests, but the nature of this information and the attendant decision process are in dispute. Competing models have proposed that remember judgments reflect a sum of familiarity and recollective information (the one-dimensional model), are based on a difference between these strengths (STREAK), or are purely recollective (the dual-process model). A choice among these accounts is sometimes made by comparing the precision of their fits to data, but this strategy may be muddied by differences in model complexity: some models that appear to provide good fits may simply be better able to mimic the data produced by other models. To evaluate this possibility, we simulated data with each of the models in each of three popular remember-know paradigms, then fit those data to each of the models. We found that the one-dimensional model is generally less complex than the others, but despite this handicap, it dominates the others as the best-fitting model. For both reasons, the one-dimensional model should be preferred. In addition, we found that some empirical paradigms are ill-suited for distinguishing among models. For example, data collected by soliciting remember/know/new judgments (that is, the trinary task) provide particularly weak grounds for distinguishing models. Additional tables and figures may be downloaded from the Psychonomic Society's Archive of Norms, Stimuli, and Data, at www.psychonomic.org/archive.

  2. Process consistency in models: The importance of system signatures, expert knowledge, and process complexity

    NASA Astrophysics Data System (ADS)

    Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J.; Savenije, H. H. G.; Gascuel-Odoux, C.

    2014-09-01

    Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus, ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study, the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by four calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce a suite of hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by "prior constraints," inferred from expert knowledge to ensure a model which behaves well with respect to the modeler's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model setup exhibited increased performance in the independent test period and skill to better reproduce all tested signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if counter-balanced by prior constraints, can significantly increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge-driven strategy of constraining models.
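    Two simple examples of such signatures, computed here on synthetic placeholder series, are the flow duration curve and the lag-1 autocorrelation; a consistency check compares each signature between simulated and observed flows rather than relying only on the calibrated hydrograph fit.

```python
import numpy as np

# Sketch of signature-based model evaluation. The daily streamflow series
# below are synthetic placeholders, not the study's catchment data.
rng = np.random.default_rng(4)
q_obs = np.exp(rng.normal(0.0, 1.0, 3650))
q_sim = np.exp(rng.normal(0.1, 0.9, 3650))

def flow_duration_curve(q):
    """Flows sorted from most to least exceeded (a classic signature)."""
    return np.sort(q)[::-1]

def autocorr(q, lag=1):
    """Lag-k autocorrelation, a signature of hydrograph smoothness."""
    return np.corrcoef(q[:-lag], q[lag:])[0, 1]

fdc_obs, fdc_sim = flow_duration_curve(q_obs), flow_duration_curve(q_sim)
mid = len(fdc_obs) // 2
print(f"median flow: obs {fdc_obs[mid]:.2f} vs sim {fdc_sim[mid]:.2f}")
print(f"lag-1 autocorr: obs {autocorr(q_obs):.2f} vs sim {autocorr(q_sim):.2f}")
```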

  3. Socio-Environmental Resilience and Complex Urban Systems Modeling

    NASA Astrophysics Data System (ADS)

    Deal, Brian; Petri, Aaron; Pan, Haozhi; Goldenberg, Romain; Kalantari, Zahra; Cvetkovic, Vladimir

    2017-04-01

    The increasing pressure of climate change has inspired two normative agendas; socio-technical transitions and socio-ecological resilience, both sharing a complex-systems epistemology (Gillard et al. 2016). Socio-technical solutions include a continuous, massive data gathering exercise now underway in urban places under the guise of developing a 'smart'(er) city. This has led to the creation of data-rich environments where large data sets have become central to monitoring and forming a response to anomalies. Some have argued that these kinds of data sets can help in planning for resilient cities (Norberg and Cumming 2008; Batty 2013). In this paper, we focus on a more nuanced, ecologically based, socio-environmental perspective of resilience planning that is often given less consideration. Here, we broadly discuss (and model) the tightly linked, mutually influenced, social and biophysical subsystems that are critical for understanding urban resilience. We argue for the need to incorporate these subsystem linkages into the resilience planning lexicon through the integration of systems models and planning support systems. We make our case by first providing a context for urban resilience from a socio-ecological and planning perspective. We highlight the data needs for this type of resilient planning and compare it to currently collected data streams in various smart city efforts. This helps to define an approach for operationalizing socio-environmental resilience planning using robust systems models and planning support systems. For this, we draw from our experiences in coupling a spatio-temporal land use model (the Landuse Evolution and impact Assessment Model (LEAM)) with water quality and quantity models in Stockholm Sweden. We describe the coupling of these systems models using a robust Planning Support System (PSS) structural framework. We use the coupled model simulations and PSS to analyze the connection between urban land use transformation (social) and water

  4. Mathematical concepts for modeling human behavior in complex man-machine systems

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.

    1979-01-01

    Many human behavior (e.g., manual control) models have been found to be inadequate for describing processes in certain real complex man-machine systems. An attempt is made to find a way to overcome this problem by examining the range of applicability of existing mathematical models with respect to the hierarchy of human activities in real complex tasks. Automobile driving is chosen as a baseline scenario, and a hierarchy of human activities is derived by analyzing this task in general terms. A structural description leads to a block diagram and a time-sharing computer analogy.

  5. Decoding the complex genetic causes of heart diseases using systems biology.

    PubMed

    Djordjevic, Djordje; Deshpande, Vinita; Szczesnik, Tomasz; Yang, Andrian; Humphreys, David T; Giannoulatou, Eleni; Ho, Joshua W K

    2015-03-01

    The pace of disease gene discovery is still much slower than expected, even with the use of cost-effective DNA sequencing and genotyping technologies. It is increasingly clear that many inherited heart diseases have a more complex polygenic aetiology than previously thought. Understanding the role of gene-gene interactions, epigenetics, and non-coding regulatory regions is becoming increasingly critical in predicting the functional consequences of genetic mutations identified by genome-wide association studies and whole-genome or exome sequencing. A systems biology approach is now being widely employed to systematically discover genes that are involved in heart diseases in humans or relevant animal models through bioinformatics. The overarching premise is that the integration of high-quality causal gene regulatory networks (GRNs), genomics, epigenomics, transcriptomics and other genome-wide data will greatly accelerate the discovery of the complex genetic causes of congenital and complex heart diseases. This review summarises state-of-the-art genomic and bioinformatics techniques that are used in accelerating the pace of disease gene discovery in heart diseases. Accompanying this review, we provide an interactive web-resource for systems biology analysis of mammalian heart development and diseases, CardiacCode (http://CardiacCode.victorchang.edu.au/). CardiacCode features a dataset of over 700 pieces of manually curated genetic or molecular perturbation data, which enables the inference of a cardiac-specific GRN of 280 regulatory relationships between 33 regulator genes and 129 target genes. We believe this growing resource will fill an urgent unmet need to fully realise the true potential of predictive and personalised genomic medicine in tackling human heart disease.

  6. A Randomized Controlled Trial to Examine the Effects of the Tackling Teenage Psychosexual Training Program for Adolescents with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Visser, Kirsten; Greaves-Lord, Kirstin; Tick, Nouchka T.; Verhulst, Frank C.; Maras, Athanasios; van der Vegt, Esther J. M.

    2017-01-01

    Background: Previous research underscores the importance of psychosexual guidance for adolescents with autism spectrum disorder (ASD). Such guidance is provided in the Tackling Teenage Training (TTT) program, in which adolescents with ASD receive psycho-education and practice communicative skills regarding topics related to puberty, sexuality, and…

  7. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
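    For readers unfamiliar with this model family, the dynamic-programming recursion underlying SCFG-based folding can be illustrated with the classic Nussinov base-pair maximization algorithm: it is far simpler than the nearest-neighbor grammars TORNADO parses, but shares the same recursive structure.

```python
# Minimal Nussinov-style dynamic program for RNA base-pair maximization,
# shown only to illustrate the recursion shared by grammar-based folding.
def nussinov(seq, min_loop=3):
    pairs = {("A", "U"), ("U", "A"), ("G", "C"),
             ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]          # dp[i][j]: max pairs in seq[i..j]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                         # i left unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)  # i pairs with j
            for k in range(i + 1, j):                   # bifurcation
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))   # maximum number of base pairs
```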

  8. Evaluating and Mitigating the Impact of Complexity in Software Models

    DTIC Science & Technology

    2015-12-01

    Internal use:* Permission to reproduce this material and to prepare derivative works from this material for internal use is granted, provided the...introduction) provides our motivation to study complexity and the essential re- search questions that we address in this effort. Some background information... provides the reader with a basis for the work and related areas explored. Section 2 (The Impact of Complexity) discusses the impact of model-based

  9. Developing and Modeling Complex Social Interventions: Introducing the Connecting People Intervention

    ERIC Educational Resources Information Center

    Webber, Martin; Reidy, Hannah; Ansari, David; Stevens, Martin; Morris, David

    2016-01-01

    Objectives: Modeling the processes involved in complex social interventions is important in social work practice, as it facilitates their implementation and translation into different contexts. This article reports the process of developing and modeling the connecting people intervention (CPI), a model of practice that supports people with mental…

  10. Are our dynamic water quality models too complex? A comparison of a new parsimonious phosphorus model, SimplyP, and INCA-P

    NASA Astrophysics Data System (ADS)

    Jackson-Blake, L. A.; Sample, J. E.; Wade, A. J.; Helliwell, R. C.; Skeffington, R. A.

    2017-07-01

    Catchment-scale water quality models are increasingly popular tools for exploring the potential effects of land management, land use change and climate change on water quality. However, the dynamic, catchment-scale nutrient models in common usage are complex, with many uncertain parameters requiring calibration, limiting their usability and robustness. A key question is whether this complexity is justified. To explore this, we developed a parsimonious phosphorus model, SimplyP, incorporating a rainfall-runoff model and a biogeochemical model able to simulate daily streamflow, suspended sediment, and particulate and dissolved phosphorus dynamics. The model's complexity was compared to one popular nutrient model, INCA-P, and the performance of the two models was compared in a small rural catchment in northeast Scotland. For three land use classes, less than six SimplyP parameters must be determined through calibration, the rest may be based on measurements, while INCA-P has around 40 unmeasurable parameters. Despite substantially simpler process-representation, SimplyP performed comparably to INCA-P in both calibration and validation and produced similar long-term projections in response to changes in land management. Results support the hypothesis that INCA-P is overly complex for the study catchment. We hope our findings will help prompt wider model comparison exercises, as well as debate among the water quality modeling community as to whether today's models are fit for purpose. Simpler models such as SimplyP have the potential to be useful management and research tools, building blocks for future model development (prototype code is freely available), or benchmarks against which more complex models could be evaluated.
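    The flavor of a parsimonious structure can be sketched in a few lines: a single linear soil-water store drives flow, and stream phosphorus relaxes toward a sorption equilibrium. The structure and parameters below are illustrative only; the real SimplyP model and its released prototype code are described in the paper.

```python
import numpy as np

# Toy parsimonious catchment phosphorus sketch (illustrative, not SimplyP).
rng = np.random.default_rng(5)
rain = np.maximum(rng.normal(3.0, 4.0, 365), 0.0)  # mm/day, placeholder input
soil, k_soil = 50.0, 0.05    # soil water store (mm) and recession coefficient
epc0, kf = 0.08, 0.02        # equilibrium P concentration (mg/l), exchange rate
soil_p = 0.12                # soil-water P concentration (mg/l)

q_out, p_out = [], []
for p_rain in rain:
    soil += p_rain
    q = k_soil * soil          # linear-reservoir streamflow (mm/day)
    soil -= q
    soil_p += kf * (epc0 - soil_p)   # relax toward sorption equilibrium
    q_out.append(q)
    p_out.append(q * soil_p)         # P load exported with flow

print(f"mean flow {np.mean(q_out):.2f} mm/day, final P conc {soil_p:.3f} mg/l")
```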

  11. Computational modeling of carbohydrate recognition in protein complex

    NASA Astrophysics Data System (ADS)

    Ishida, Toyokazu

    2017-11-01

    To understand the mechanistic principle of carbohydrate recognition in proteins, we propose a systematic computational modeling strategy to identify the conformations of a complex carbohydrate chain on a reduced 2D free energy surface (2D-FES), determined by MD sampling combined with QM/MM energy corrections. In this article, we first report a detailed atomistic simulation study of the norovirus capsid proteins with carbohydrate antigens, based on ab initio QM/MM combined with MD-FEP simulations. The present result clearly shows that the binding geometries of the complex carbohydrate antigen are determined not by one single, rigid carbohydrate structure, but rather by the sum of averaged conformations mapped onto the minimum free energy region of the QM/MM 2D-FES.

  12. On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi

    2008-01-01

    Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments recently changed rapidly by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments is developed for single-digit-watt power consumption, small size and light weight, and that it delivers super-computing capabilities. The conflict between the actual development cost of newer complex instruments and the heritage cost model predictions for their electronics components seems to be irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining the complexity parameters, the complexity index, and their use in an enhanced cost model.

  13. Recent advances in non-LTE stellar atmosphere models

    NASA Astrophysics Data System (ADS)

    Sander, Andreas A. C.

    2017-11-01

    In the last decades, stellar atmosphere models have become a key tool in understanding massive stars. Applied for spectroscopic analysis, these models provide quantitative information on stellar wind properties as well as fundamental stellar parameters. The intricate non-LTE conditions in stellar winds dictate the development of adequately sophisticated model atmosphere codes. The increase in both computational power and our understanding of physical processes in stellar atmospheres has led to increasing complexity in the models. As a result, codes emerged that can tackle a wide range of stellar and wind parameters. After a brief review of the fundamentals of stellar atmosphere modeling, the current state of clumped and line-blanketed model atmospheres will be discussed. Finally, the path for the next generation of stellar atmosphere models will be outlined. Apart from discussing multi-dimensional approaches, I will emphasize the coupling of hydrodynamics with a sophisticated treatment of the radiative transfer. This next generation of models will be able to predict wind parameters from first principles, which could open new doors for our understanding of the various facets of massive star physics, evolution, and death.

  14. A framework for modelling the complexities of food and water security under globalisation

    NASA Astrophysics Data System (ADS)

    Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.

    2018-01-01

    We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.

  15. Learning in the model space for cognitive fault diagnosis.

    PubMed

    Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin

    2014-01-01

    The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
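    The essence of learning in the model space can be sketched directly: fit a small model to each sliding window of the signal, then do anomaly detection on the fitted coefficients rather than the raw samples. The window length, AR order, and one-class learner below are illustrative choices, not the paper's reservoir-model framework.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Sketch of "learning in the model space": represent each signal window by
# the coefficients of a fitted AR(2) model, then classify in that space.
rng = np.random.default_rng(6)
healthy = np.sin(np.linspace(0, 200, 4000)) + 0.1 * rng.normal(size=4000)
faulty = np.sin(np.linspace(0, 200, 4000)) ** 3 + 0.1 * rng.normal(size=4000)

def ar2_coeffs(segment):
    """Least-squares AR(2) fit; the coefficient vector is the 'model'."""
    y = segment[2:]
    X = np.stack([segment[1:-1], segment[:-2]], axis=1)
    return np.linalg.lstsq(X, y, rcond=None)[0]

def models(signal, width=200, step=100):
    return np.array([ar2_coeffs(signal[s:s + width])
                     for s in range(0, len(signal) - width, step)])

# One-class learning on healthy models only; faults show up as outliers.
clf = OneClassSVM(nu=0.05, gamma="scale").fit(models(healthy))
print("healthy windows flagged:", (clf.predict(models(healthy)) == -1).mean())
print("faulty windows flagged: ", (clf.predict(models(faulty)) == -1).mean())
```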

  16. Low-complexity stochastic modeling of wall-bounded shear flows

    NASA Astrophysics Data System (ADS)

    Zare, Armin

    Turbulent flows are ubiquitous in nature and they appear in many engineering applications. Transition to turbulence, in general, increases skin-friction drag in air/water vehicles compromising their fuel-efficiency and reduces the efficiency and longevity of wind turbines. While traditional flow control techniques combine physical intuition with costly experiments, their effectiveness can be significantly enhanced by control design based on low-complexity models and optimization. In this dissertation, we develop a theoretical and computational framework for the low-complexity stochastic modeling of wall-bounded shear flows. Part I of the dissertation is devoted to the development of a modeling framework which incorporates data-driven techniques to refine physics-based models. We consider the problem of completing partially known sample statistics in a way that is consistent with underlying stochastically driven linear dynamics. Neither the statistics nor the dynamics are precisely known. Thus, our objective is to reconcile the two in a parsimonious manner. To this end, we formulate optimization problems to identify the dynamics and directionality of input excitation in order to explain and complete available covariance data. For problem sizes that general-purpose solvers cannot handle, we develop customized optimization algorithms based on alternating direction methods. The solution to the optimization problem provides information about critical directions that have maximal effect in bringing model and statistics in agreement. In Part II, we employ our modeling framework to account for statistical signatures of turbulent channel flow using low-complexity stochastic dynamical models. We demonstrate that white-in-time stochastic forcing is not sufficient to explain turbulent flow statistics and develop models for colored-in-time forcing of the linearized Navier-Stokes equations. We also examine the efficacy of stochastically forced linearized NS equations and their
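    Schematically, the covariance-completion problem at the heart of this framework can be written as the convex program below; this is a generic form from the covariance-completion literature, and the dissertation's exact formulation, weights, and algorithms may differ.

```latex
% Covariance completion (schematic): find a completed state covariance X and
% a forcing-correlation term Z consistent with the linearized dynamics A and
% with the partially known second-order statistics G.
\begin{aligned}
  \underset{X,\,Z}{\text{minimize}}\quad
    & -\log\det(X) \;+\; \gamma\,\lVert Z \rVert_{*} \\
  \text{subject to}\quad
    & A X + X A^{*} + Z = 0, \\
    & \mathcal{L}(X) = G
\end{aligned}
% The nuclear norm promotes low-complexity (low-rank) forcing; gamma trades
% this off against the log-det barrier that keeps X positive definite.
```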

  17. Molecular modelling, spectroscopic characterization and biological studies of tetraazamacrocyclic metal complexes

    NASA Astrophysics Data System (ADS)

    Rathi, Parveen; Sharma, Kavita; Singh, Dharam Pal

    2014-09-01

    Macrocyclic complexes of the type [MLX]X2; where L is (C30H28N4), a macrocyclic ligand, M = Cr(III) and Fe(III) and X = Cl-, CH3COO- or NO3-, have been synthesized by template condensation reaction of 1,8-diaminonaphthalene and acetylacetone in the presence of trivalent metal salts in a methanolic medium. The complexes have been formulated as [MLX]X2 due to 1:2 electrolytic nature of these complexes. The complexes have been characterized with the help of elemental analyses, molar conductance measurements, magnetic susceptibility measurements, electronic, infrared, far infrared, Mass spectral studies and molecular modelling. Molecular weight of these complexes indicates their monomeric nature. On the basis of all these studies, a five coordinated square pyramidal geometry has been proposed for all these complexes. These metal complexes have also been screened for their in vitro antimicrobial activities.

  18. Comparing flood loss models of different complexity

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Riggelsen, Carsten; Scherbaum, Frank; Merz, Bruno

    2013-04-01

    Any deliberation on flood risk requires the consideration of potential flood losses. In particular, reliable flood loss models are needed to evaluate the cost-effectiveness of mitigation measures, to assess vulnerability, and for comparative risk analysis and financial appraisal during and after floods. In recent years, considerable improvements have been made both concerning the data basis and the methodological approaches used for the development of flood loss models. Despite that, flood loss models remain an important source of uncertainty. Likewise, the temporal and spatial transferability of flood loss models is still limited. This contribution investigates the predictive capability of different flood loss models in a split-sample, cross-regional validation approach. For this purpose, flood loss models of different complexity, i.e. based on different numbers of explaining variables, are learned from a set of damage records that was obtained from a survey after the Elbe flood in 2002. The validation of model predictions is carried out for different flood events in the Elbe and Danube river basins in 2002, 2005 and 2006, for which damage records are available from surveys after the flood events. The models investigated are a stage-damage model, the rule-based model FLEMOps+r, as well as novel model approaches derived using the data mining techniques of regression trees and Bayesian networks. The Bayesian network approach to flood loss modelling provides attractive additional information concerning the probability distribution of both model predictions and explaining variables.
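    The simplest member of the compared family, a stage-damage model, maps water depth to a relative loss; the sketch below uses an illustrative depth-damage curve, not the functions fitted to the Elbe 2002 records, and the more complex models in the study add further explanatory variables.

```python
import numpy as np

# Illustrative stage-damage model: relative loss as a function of water depth.
depth_grid = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0])       # water depth (m)
loss_ratio = np.array([0.0, 0.10, 0.25, 0.40, 0.55, 0.75])  # fraction of value

def stage_damage(depth_m, building_value):
    """Interpolate the relative loss from depth and scale by asset value."""
    return np.interp(depth_m, depth_grid, loss_ratio) * building_value

print(f"loss at 1.2 m on a 200k asset: {stage_damage(1.2, 200_000):,.0f}")
```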

  19. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    PubMed

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade, sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and

  20. Natural killer cells as a promising tool to tackle cancer-A review of sources, methodologies, and potentials.

    PubMed

    Preethy, Senthilkumar; Dedeepiya, Vidyasagar Devaprasad; Senthilkumar, Rajappa; Rajmohan, Mathaiyan; Karthick, Ramalingam; Terunuma, Hiroshi; Abraham, Samuel J K

    2017-07-04

    Immune cell-based therapies are emerging as a promising tool to tackle malignancies, both solid tumors and selected hematological tumors. Extensive experience reported in the literature has documented their safety and the added survival benefit when such cell-based therapies are combined with existing treatment options. Numerous methodologies for processing and in vitro expansion of immune cells, such as dendritic cells, natural killer (NK) cells, NKT cells, αβ T cells, so-called activated T lymphocytes, γδ T cells, cytotoxic T lymphocytes, and lymphokine-activated killer cells, have been reported for use in cell-based therapies. Among this handful of immune cells of significance, NK cells stand apart from the rest for not only their direct cytotoxic ability against cancer cells but also their added advantages, which include the capability of (i) acting through both innate and adaptive immune mechanisms, (ii) tackling viruses too, giving benefits in conditions where viral infections culminate in cancer, and (iii) destroying cancer stem cells, thereby preventing resistance to chemotherapy and radiotherapy. This review thoroughly analyses the sources of such NK cells, methods for expansion, and the future potential of taking in vitro expanded allogeneic NK cells with good cytotoxic ability as a drug for treating cancer and/or viral infection, and even as a prophylactic tool for prevention of cancer after initial remission.

  1. On explicit algebraic stress models for complex turbulent flows

    NASA Technical Reports Server (NTRS)

    Gatski, T. B.; Speziale, C. G.

    1992-01-01

    Explicit algebraic stress models that are valid for three-dimensional turbulent flows in noninertial frames are systematically derived from a hierarchy of second-order closure models. This represents a generalization of the model derived by Pope, who based his analysis on the Launder, Reece, and Rodi model restricted to two-dimensional turbulent flows in an inertial frame. The relationship between the new models and traditional algebraic stress models, as well as anisotropic eddy viscosity models, is theoretically established. The need for regularization is demonstrated in an effort to explain why traditional algebraic stress models have failed in complex flows. It is also shown that these explicit algebraic stress models can shed new light on what second-order closure models predict for the equilibrium states of homogeneous turbulent flows and can serve as a useful alternative in practical computations.
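    For orientation, explicit algebraic stress models of this kind express the Reynolds-stress anisotropy in a finite tensor basis built from the normalized mean strain-rate and rotation tensors; the three-term form below is the classic two-dimensional representation (coefficients depend on the underlying second-order closure, and the paper's three-dimensional, noninertial generalization carries more terms).

```latex
% Three-term tensor-basis representation of the anisotropy tensor b for 2D
% mean flows; S and W are the (suitably normalized) mean strain-rate and
% rotation tensors, and the alpha_i follow from the chosen closure.
b \;=\; \alpha_{1}\,S
   \;+\; \alpha_{2}\,\left(S\,W - W\,S\right)
   \;+\; \alpha_{3}\,\left(S^{2} - \tfrac{1}{3}\,\mathrm{tr}\!\left(S^{2}\right) I\right)
```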

  2. QMU as an approach to strengthening the predictive capabilities of complex models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.

    2010-09-01

    Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear, and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and a relative departure from the classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability and performance requirements have been met after a system has been developed. In this sense, QMU is used as a sort of check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (i.e., the Internet, electrical distribution grids, etc.), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity

  3. Tackling the conformational sampling of larger flexible compounds and macrocycles in pharmacology and drug discovery.

    PubMed

    Chen, I-Jen; Foloppe, Nicolas

    2013-12-15

    Computational conformational sampling underpins much of molecular modeling and design in pharmaceutical work. The sampling of smaller drug-like compounds has been an active area of research. However, few studies have tested in detail the sampling of larger, more flexible compounds, which are also relevant to drug discovery, including therapeutic peptides, macrocycles, and inhibitors of protein-protein interactions. Here, we extensively investigate mainstream conformational sampling methods on three carefully curated compound sets, namely the 'Drug-like', larger 'Flexible', and 'Macrocycle' compounds. These test molecules are chemically diverse, with reliable X-ray protein-bound bioactive structures. The compared sampling methods include Stochastic Search and the recent LowModeMD from MOE, all the low-mode based approaches from MacroModel, and MD/LLMOD, recently developed for macrocycles. In addition to default settings, key parameters of the sampling protocols were explored. The performance of the computational protocols was assessed via (i) the reproduction of the X-ray bioactive structures, (ii) the size, coverage and diversity of the output conformational ensembles, (iii) the compactness/extendedness of the conformers, and (iv) the ability to locate the global energy minimum. The influence of the stochastic nature of the searches on the results was also examined. Much better results were obtained by adopting search parameters enhanced over the default settings, while maintaining computational tractability. In MOE, the recent LowModeMD emerged as the method of choice. Mixed torsional/low-mode sampling from MacroModel performed as well as LowModeMD, and MD/LLMOD performed well for macrocycles. The low-mode based approaches yielded very encouraging results with the flexible and macrocycle sets. Thus, one can productively tackle the computational conformational search of larger flexible compounds for drug discovery, including macrocycles. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. First results from the International Urban Energy Balance Model Comparison: Model Complexity

    NASA Astrophysics Data System (ADS)

    Blackett, M.; Grimmond, S.; Best, M.

    2009-04-01

    A great variety of urban energy balance models has been developed. These vary in complexity from simple schemes that represent the city as a slab, through those which model various facets (i.e. road, walls and roof), to more complex urban forms (including street canyons with intersections) and features (such as vegetation cover and anthropogenic heat fluxes). Some schemes also incorporate detailed representations of momentum and energy fluxes distributed throughout various layers of the urban canopy layer. The models differ in the parameters they require to describe the site and in the demands they make on computational processing power. Many of these models have been evaluated using observational datasets but, to date, no controlled comparisons have been conducted. Urban surface energy balance models provide a means to predict the energy exchange processes which influence factors such as urban temperature, humidity, atmospheric stability and winds. These all need to be modelled accurately to capture features such as the urban heat island effect and to provide key information for dispersion and air quality modelling. A comparison of the various models available will assist in improving current and future models and will assist in formulating research priorities for future observational campaigns within urban areas. In this presentation we will summarise the initial results of this international urban energy balance model comparison. In particular, the relative performance of the models involved will be compared based on their degree of complexity. These results will inform us on ways in which we can improve the modelling of air quality within, and climate impacts of, global megacities. The methodology employed in conducting this comparison followed that used in PILPS (the Project for Intercomparison of Land-Surface Parameterization Schemes), which is also endorsed by the GEWEX Global Land Atmosphere System Study (GLASS) panel. In all cases, models were run

  5. Effect of shoulder model complexity in upper-body kinematics analysis of the golf swing.

    PubMed

    Bourgain, M; Hybois, S; Thoreux, P; Rouillon, O; Rouch, P; Sauret, C

    2018-06-25

    The golf swing is a complex full-body movement during which the spine and shoulders are highly involved. In order to determine shoulder kinematics during this movement, multibody kinematics optimization (MKO) can be recommended to limit the effect of the soft tissue artifact and to avoid joint dislocations or bone penetration in reconstructed kinematics. Classically, in golf biomechanics research, the shoulder is represented by a 3 degrees-of-freedom model representing the glenohumeral joint. More complex and physiological models are already provided in the scientific literature. In particular, the model used in this study was a full-body model that also described the motions of the clavicles and scapulae. This study aimed at quantifying the effect of utilizing a more complex and physiological shoulder model when studying the golf swing. Results obtained on 20 golfers showed that a more complex and physiologically-accurate model can more efficiently track experimental markers, which resulted in differences in joint kinematics. Hence, the model with 3 degrees-of-freedom between the humerus and the thorax may be inadequate when combined with MKO, and a more physiological model would be beneficial. Finally, results would also be improved through a subject-specific approach for the determination of segment lengths. Copyright © 2018 Elsevier Ltd. All rights reserved.
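
    At its core, MKO solves, frame by frame, a least-squares problem: find the joint coordinates of a rigid-body chain that best reproduce the measured skin-marker positions while keeping the joints intact. A minimal planar sketch with SciPy is given below; the two-segment geometry, segment lengths, and noise level are illustrative placeholders, not the full-body model of the study.

      # Minimal sketch of multibody kinematics optimization (MKO): solve for the
      # joint angles of a toy two-segment planar arm that best reproduce measured
      # marker positions. Segment lengths and marker placements are illustrative.
      import numpy as np
      from scipy.optimize import least_squares

      L1, L2 = 0.30, 0.25   # assumed segment lengths (m)

      def forward_markers(q):
          """Marker positions (elbow, wrist) for joint angles q = (shoulder, elbow)."""
          s, e = q
          elbow = np.array([L1 * np.cos(s), L1 * np.sin(s)])
          wrist = elbow + L2 * np.array([np.cos(s + e), np.sin(s + e)])
          return np.hstack([elbow, wrist])

      def mko_frame(measured, q0):
          """One frame of MKO: least-squares marker tracking with consistent joints."""
          res = least_squares(lambda q: forward_markers(q) - measured, q0)
          return res.x

      measured = forward_markers(np.array([0.6, 0.9])) + 0.005 * np.random.randn(4)
      print(mko_frame(measured, q0=np.zeros(2)))   # recovers ~ (0.6, 0.9)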

  6. The Emergence of an Amplified Mindset of Design: Implications for Postgraduate Design Education

    ERIC Educational Resources Information Center

    Moreira, Mafalda; Murphy, Emma; McAra-McWilliam, Irene

    2016-01-01

    In a global scenario of complexity, research shows that emerging design practices are changing and expanding, creating a complex and ambiguous disciplinary landscape. This directly impacts on the field of design education, calling for new, flexible models able to tackle future practitioners' needs, unknown markets and emergent societal cultures.…

  7. Dynamical Behaviors in Complex-Valued Love Model With or Without Time Delays

    NASA Astrophysics Data System (ADS)

    Deng, Wei; Liao, Xiaofeng; Dong, Tao

    2017-12-01

    In this paper, a novel version of a nonlinear model, i.e. a complex-valued love model with two time delays between two individuals in a love affair, has been proposed. A notable feature of this model is that we separate the emotion of one individual into real and imaginary parts to represent the variation and complexity of psychophysiological emotion in a romantic relationship, rather than working in the real domain alone, which brings our model much closer to reality. This is because love is a complicated cognitive and social phenomenon, full of complexity, diversity and unpredictability, which refers to the coexistence of different aspects of feelings, states and attitudes ranging from joy and trust to sadness and disgust. By analyzing the associated characteristic equation of the linearized equations for our model, it is found that a Hopf bifurcation occurs when the sum of the time delays passes through a sequence of critical values. Stability of the bifurcating cyclic love dynamics is also derived by applying the normal form theory and the center manifold theorem. In addition, it is shown that, for some appropriately chosen parameters, chaotic behaviors can appear even without time delay.
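
    The abstract does not give the model equations, so the sketch below integrates a generic complex-valued two-individual system with a single shared delay tau by fixed-step Euler with a history buffer; all coefficients and the functional form are invented for illustration. Varying tau and watching whether the emotion magnitudes settle or begin to cycle mimics the Hopf-bifurcation experiment described above.

      # Sketch of a complex-valued "love" model with delay tau, integrated by
      # fixed-step Euler with a history buffer. Coefficients are illustrative
      # assumptions; the paper's exact equations are not given in the abstract.
      import numpy as np

      def simulate(a=-0.1 + 0.5j, b=0.8, c=-0.6, tau=2.0, dt=0.01, T=100.0):
          n_delay = int(tau / dt)
          steps = int(T / dt)
          x = np.zeros(steps + 1, dtype=complex)   # emotion of individual 1 (Re + i*Im)
          y = np.zeros(steps + 1, dtype=complex)   # emotion of individual 2
          x[0], y[0] = 0.1 + 0.1j, -0.2 + 0.05j    # constant initial history
          for k in range(steps):
              xd = x[max(k - n_delay, 0)]          # delayed states
              yd = y[max(k - n_delay, 0)]
              x[k + 1] = x[k] + dt * (a * x[k] + b * yd)
              y[k + 1] = y[k] + dt * (a * y[k] + c * xd)
          return x, y

      x, y = simulate()
      print("final |x|, |y|:", abs(x[-1]), abs(y[-1]))  # settling vs. cycling hints at the Hopf regime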

  8. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more

    PubMed Central

    Rivas, Elena; Lang, Raymond; Eddy, Sean R.

    2012-01-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases. PMID:22194308
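
    TORNADO's grammar architectures are far richer than can be reproduced here, but the dynamic programming that underlies every such parser can be illustrated with the classic Nussinov base-pair-maximization recursion, the simplest possible "grammar" for single-sequence secondary structure (integer pair counts instead of probabilities or energies):

      # Nussinov-style dynamic programming: maximize base pairs in one RNA
      # sequence. A deliberately minimal stand-in for the SCFG / nearest-neighbor
      # architectures TORNADO parses; min_loop enforces 3 unpaired loop bases.
      def nussinov(seq, min_loop=3):
          pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
          n = len(seq)
          dp = [[0] * n for _ in range(n)]
          for span in range(min_loop + 1, n):
              for i in range(n - span):
                  j = i + span
                  best = dp[i + 1][j]                          # i left unpaired
                  if (seq[i], seq[j]) in pairs:
                      best = max(best, dp[i + 1][j - 1] + 1)   # i pairs with j
                  for k in range(i + 1, j):                    # bifurcation
                      best = max(best, dp[i][k] + dp[k + 1][j])
                  dp[i][j] = best
          return dp[0][n - 1]

      print(nussinov("GGGAAAUCC"))  # -> 3 (a small stem closing an AAA loop)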

  9. OpenMinds: creating a mental health workshop for teenagers to tackle stigma and raise awareness.

    PubMed

    Jones, Sammy; Sinha, Kathryn; Swinton, Martin; Millar, Christina; Rayment, Dane; Simmons, Meinou

    2011-09-01

    As a group of four clinical medical students from Cambridge University, we undertook a Student Selected Module (SSC, "OpenMinds") whereby we designed and delivered a workshop about mental health to year 9 pupils. The aim of our SSC was to produce an interactive, informative lesson which addressed the complex issues of stigma and discrimination against those suffering from a mental illness, as well as teaching the pupils how to recognise mental health problems and providing them with guidance on how to seek help. We split a fifty-minute session into the following sections: tackling stigma; how common mental illness is; celebrity examples; real life examples; role play; and small group work. To engage the pupils we used a combination of teaching modalities targeting all learning styles. We delivered the workshop to four separate classes and received feedback from the pupils after each. We used this feedback to adapt and improve our presentation and to assess its efficacy. Feedback was overwhelmingly positive, with the striking results of 101/109 pupils saying that they would recommend the workshop to a friend and 68/109 pupils saying they enjoyed all aspects. Our SSC built upon work by a contingent of trainee psychiatrists who undertook a similar project of mental health education for teenagers, called "Heads above the rest", in Northern Ireland with great success. By continuing their work we were able to demonstrate that medical students can successfully complete the same project under the guidance of a psychiatrist, thus increasing the sustainability of the project by reducing the time burden on psychiatrists. Participating in the project was also valuable to our own personal development of teaching skills.

  10. Tackling children's road safety through edutainment: an evaluation of effectiveness.

    PubMed

    Zeedyk, M S; Wallace, L

    2003-08-01

    The burgeoning market in electronic media has encouraged a trend toward 'edutainment', where entertaining, media-based materials are used to facilitate educational outcomes. In this study, we evaluated the effectiveness of a video that has recently been released by a popular children's entertainment group to help tackle Britain's poor record on children's road safety. We wished to determine whether the video had an impact on either children's knowledge or parents' awareness of pedestrian skills, when used in a standard home-based fashion. A total of 120 families participated, all of whom had children 5 years of age. Half the families received videos at the beginning of the study, while the other half served as a control group against which to measure change in the treatment group. Data were gathered at baseline and again 1 month later, using a series of tailored questionnaire items. A robust pattern of null findings indicated that the video, when used in this casual fashion, had no educational impact on either parents or children. Crucially, however, parents strongly believed that it had. The discussion explores the implications of such a mismatch and highlights similarities with outcomes of other health education interventions.

  11. Petascale Many Body Methods for Complex Correlated Systems

    NASA Astrophysics Data System (ADS)

    Pruschke, Thomas

    2012-02-01

    Correlated systems constitute an important class of materials in modern condensed matter physics. Correlations among electrons are at the heart of all ordering phenomena and many intriguing novel aspects, such as quantum phase transitions or topological insulators, observed in a variety of compounds. Yet, theoretically describing these phenomena is still a formidable task, even if one restricts the models used to the smallest possible set of degrees of freedom. Here, modern computer architectures play an essential role, and the joint effort to devise efficient algorithms and implement them on state-of-the-art hardware has become an extremely active field in condensed-matter research. To tackle this task single-handed is quite obviously not possible. The NSF-OISE funded PIRE collaboration "Graduate Education and Research in Petascale Many Body Methods for Complex Correlated Systems" is a successful initiative to bring together leading experts around the world to form a virtual international organization for addressing these emerging challenges and educating the next generation of computational condensed matter physicists. The collaboration includes research groups developing novel theoretical tools to reliably and systematically study correlated solids, experts in efficient computational algorithms needed to solve the emerging equations, and those able to use modern heterogeneous computer architectures to make them working tools for the growing community.

  12. Visualizing and modelling complex rockfall slopes using game-engine hosted models

    NASA Astrophysics Data System (ADS)

    Ondercin, Matthew; Hutchinson, D. Jean; Harrap, Rob

    2015-04-01

    Innovations in computing in the past few decades have resulted in entirely new ways to collect 3D geological data and visualize it. For example, new tools and techniques relying on high performance computing capabilities have become widely available, allowing us to model rockfalls with more attention to the complexity of the rock slope geometry and rockfall path, with significantly higher quality base data, and with more analytical options. Model results are used to design mitigation solutions, considering the potential paths of the rockfall events and the energy they impart on impacted structures. Such models are currently implemented as general-purpose GIS tools and in specialized programs. These tools are used to inspect geometrical and geomechanical data, model rockfalls, and communicate results to researchers and the larger community. The research reported here explores the notion that 3D game engines provide a high-speed, widely accessible platform on which to build rockfall modelling workflows and a new and accessible outreach method. Taking advantage of the in-built physics capability of the 3D game codes, and their ability to handle large terrains, these models are rapidly deployed and generate realistic visualizations of rockfall trajectories. Their utility in this area is as yet unproven, but preliminary research shows that they are capable of producing results that are comparable to existing approaches. Furthermore, modelling of case histories shows that the output matches the behaviour that is observed in the field. The key advantage of game-engine hosted models is their accessibility to the general public and to people with little to no knowledge of rockfall hazards. With much of the younger generation being very familiar with 3D environments such as Minecraft, the idea of a game-like simulation is intuitive and thus offers new ways to communicate to the general public. We present results from using the Unity game engine to develop 3D voxel worlds…

  13. Per Aspera ad Astra: Through Complex Population Modeling to Predictive Theory.

    PubMed

    Topping, Christopher J; Alrøe, Hugo Fjelsted; Farrell, Katharine N; Grimm, Volker

    2015-11-01

    Population models in ecology are often not good at predictions, even if they are complex and seem to be realistic enough. The reason for this might be that Occam's razor, which is key for minimal models exploring ideas and concepts, has been too uncritically adopted for more realistic models of systems. This can tie models too closely to certain situations, thereby preventing them from predicting the response to new conditions. We therefore advocate a new kind of parsimony to improve the application of Occam's razor. This new parsimony balances two contrasting strategies for avoiding errors in modeling: avoiding inclusion of nonessential factors (false inclusions) and avoiding exclusion of sometimes-important factors (false exclusions). It involves a synthesis of traditional modeling and analysis, used to describe the essentials of mechanistic relationships, with elements that are included in a model because they have been reported to be or can arguably be assumed to be important under certain conditions. The resulting models should be able to reflect how the internal organization of populations change and thereby generate representations of the novel behavior necessary for complex predictions, including regime shifts.

  14. Model development and validation of geometrically complex eddy current coils using finite element methods

    NASA Astrophysics Data System (ADS)

    Brown, Alexander; Eviston, Connor

    2017-02-01

    Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Capable, realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance of several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for alternate project applications.

  15. Knowledge-based grouping of modeled HLA peptide complexes.

    PubMed

    Kangueane, P; Sakharkar, M K; Lim, K S; Hao, H; Lin, K; Chee, R E; Kolatkar, P R

    2000-05-01

    Human leukocyte antigens are the most polymorphic of human genes, and multiple sequence alignment shows that such polymorphisms are clustered in the functional peptide binding domains. Because of such polymorphism among the peptide binding residues, the prediction of peptides that bind to specific HLA molecules is very difficult. In recent years two different types of computer-based prediction methods have been developed, and both methods have their own advantages and disadvantages. The nonavailability of allele-specific binding data restricts the use of knowledge-based prediction methods for a wide range of HLA alleles. Alternatively, the modeling scheme appears to be a promising predictive tool for the selection of peptides that bind to specific HLA molecules. The scoring of the modeled HLA-peptide complexes is a major concern. The use of knowledge-based rules (van der Waals clashes and solvent-exposed hydrophobic residues) to distinguish binders from nonbinders is applied in the present study. The rules based on (1) the number of observed atomic clashes between the modeled peptide and the HLA structure, and (2) the number of solvent-exposed hydrophobic residues on the modeled peptide effectively discriminate experimentally known binders from poor/nonbinders. Solved crystal complexes show no vdW clash (vdWC) in 95% of cases, and no solvent-exposed hydrophobic peptide residues (SEHPR) were seen in 86% of cases. In our attempt to compare experimental binding data with the scores predicted by this scheme, 77% of the peptides were correctly grouped as good binders, with a sensitivity of 71%.
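
    The two rules lend themselves to a very small rule-based classifier. In the sketch below, the distance cutoff defining a clash, the relative-exposure threshold, and the binder decision rule are illustrative assumptions chosen to mirror the description above, not the study's exact parameters.

      # Sketch of the two knowledge-based rules for scoring modeled HLA-peptide
      # complexes: (1) count hard van der Waals clashes between peptide and HLA
      # atoms, (2) count solvent-exposed hydrophobic peptide residues (SEHPR).
      import numpy as np

      HYDROPHOBIC = {"ALA", "VAL", "LEU", "ILE", "PHE", "MET", "TRP", "PRO"}

      def count_clashes(peptide_xyz, hla_xyz, cutoff=2.5):
          """Atom pairs closer than `cutoff` angstroms count as vdW clashes (vdWC)."""
          d = np.linalg.norm(peptide_xyz[:, None, :] - hla_xyz[None, :, :], axis=-1)
          return int((d < cutoff).sum())

      def count_sehpr(residue_names, relative_sasa, exposed_frac=0.25):
          """Hydrophobic peptide residues whose relative solvent exposure is high."""
          return sum(1 for name, sasa in zip(residue_names, relative_sasa)
                     if name in HYDROPHOBIC and sasa > exposed_frac)

      def is_predicted_binder(peptide_xyz, hla_xyz, residue_names, relative_sasa):
          # Pattern observed in solved complexes: binders show no clashes, no SEHPR.
          return (count_clashes(peptide_xyz, hla_xyz) == 0 and
                  count_sehpr(residue_names, relative_sasa) == 0)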

  16. Prediction of Complex Aerodynamic Flows with Explicit Algebraic Stress Models

    NASA Technical Reports Server (NTRS)

    Abid, Ridha; Morrison, Joseph H.; Gatski, Thomas B.; Speziale, Charles G.

    1996-01-01

    An explicit algebraic stress equation, developed by Gatski and Speziale, is used in the framework of K-epsilon formulation to predict complex aerodynamic turbulent flows. The nonequilibrium effects are modeled through coefficients that depend nonlinearly on both rotational and irrotational strains. The proposed model was implemented in the ISAAC Navier-Stokes code. Comparisons with the experimental data are presented which clearly demonstrate that explicit algebraic stress models can predict the correct response to nonequilibrium flow.

  17. The Model of Complex Structure of Quark

    NASA Astrophysics Data System (ADS)

    Liu, Rongwu

    2017-09-01

    In Quantum Chromodynamics, the quark is known as a kind of point-like fundamental particle which carries mass, charge, color, and flavor; strong interaction takes place between quarks by means of exchanging intermediate particles-gluons. An important consequence of this theory is that strong interaction is a kind of short-range force, and it has the features of "asymptotic freedom" and "quark confinement". In order to reveal the nature of strong interaction, the "bag" model of vacuum and the "string" model of string theory were proposed in the context of quantum mechanics, but neither of them can provide a clear interaction mechanism. This article formulates a new mechanism by proposing a model of complex structure of quark, which can be outlined as follows: (1) The quark (as well as the electron, etc.) is a kind of complex structure, composed of a fundamental particle (fundamental matter mass and electricity) and a fundamental volume field (fundamental matter flavor and color) which exists in the form of limited volume; the fundamental particle lies in the center of the fundamental volume field and forms the "nucleus" of the quark. (2) Like the static electric force, the color field force between quarks has classical form; it is proportional to the square of the color quantity carried by each color field, and inversely proportional to the area of the cross section of overlapping color fields along the force direction; it has the properties of overlap, saturation, non-centrality, and constancy. (3) Any volume field undergoes deformation when interacting with another volume field; the deformation force follows Hooke's law. (4) The phenomena of "asymptotic freedom" and "quark confinement" are the result of color field force and deformation force.

  18. Lattice Boltzmann Modeling of Complex Flows for Engineering Applications

    NASA Astrophysics Data System (ADS)

    Montessori, Andrea; Falcucci, Giacomo

    2018-01-01

    Nature continuously presents a huge number of complex and multiscale phenomena, which in many cases, involve the presence of one or more fluids flowing, merging and evolving around us. Since the very first years of the third millennium, the Lattice Boltzmann method (LB) has seen an exponential growth of applications, especially in the fields connected with the simulation of complex and soft matter flows. LB, in fact, has shown a remarkable versatility in different fields of applications from nanoactive materials, free surface flows, and multiphase and reactive flows to the simulation of the processes inside engines and fluid machinery. In this book, the authors present the most recent advances of the application of the LB to complex flow phenomena of scientific and technical interest with focus on the multiscale modeling of heterogeneous catalysis within nano-porous media and multiphase, multicomponent flows.

  19. EVALUATING PREDICTIVE ERRORS OF A COMPLEX ENVIRONMENTAL MODEL USING A GENERAL LINEAR MODEL AND LEAST SQUARE MEANS

    EPA Science Inventory

    A General Linear Model (GLM) was used to evaluate the deviation of predicted values from expected values for a complex environmental model. For this demonstration, we used the default level interface of the Regional Mercury Cycling Model (R-MCM) to simulate epilimnetic total mercury…
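
    As a sketch of the evaluation idea (not the EPA workflow itself), one can regress the deviations of predicted from expected values on candidate explanatory factors with a linear model; every variable name below is a hypothetical placeholder, not an R-MCM input.

      # Hedged sketch: explain model deviations (predicted - expected) with a GLM.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "deviation": rng.normal(0.0, 0.1, 200),    # predicted - expected (synthetic)
          "lake_ph": rng.uniform(5.5, 8.5, 200),     # hypothetical factor
          "doc": rng.uniform(2.0, 12.0, 200),        # hypothetical dissolved organic carbon
      })
      fit = smf.ols("deviation ~ lake_ph + doc", data=df).fit()
      print(fit.summary())                           # which factors drive predictive error?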

  20. Modeling complex aquifer systems: a case study in Baton Rouge, Louisiana (USA)

    NASA Astrophysics Data System (ADS)

    Pham, Hai V.; Tsai, Frank T.-C.

    2017-05-01

    This study targets two challenges in groundwater model development: grid generation and model calibration for aquifer systems that are fluvial in origin. Realistic hydrostratigraphy can be developed using a large quantity of well log data to capture the complexity of an aquifer system. However, generating valid groundwater model grids to be consistent with the complex hydrostratigraphy is non-trivial. Model calibration can also become intractable for groundwater models that intend to match the complex hydrostratigraphy. This study uses the Baton Rouge aquifer system, Louisiana (USA), to illustrate a technical need to cope with grid generation and model calibration issues. A grid generation technique is introduced based on indicator kriging to interpolate 583 wireline well logs in the Baton Rouge area to derive a hydrostratigraphic architecture with fine vertical discretization. Then, an upscaling procedure is developed to determine a groundwater model structure with 162 layers that captures facies geometry in the hydrostratigraphic architecture. To handle model calibration for such a large model, this study utilizes a derivative-free optimization method in parallel computing to complete parameter estimation in a few months. The constructed hydrostratigraphy indicates the Baton Rouge aquifer system is fluvial in origin. The calibration result indicates hydraulic conductivity for Miocene sands is higher than that for Pliocene to Holocene sands and indicates the Baton Rouge fault and the Denham Springs-Scotlandville fault to be low-permeability leaky aquifers. The modeling result shows significantly low groundwater level in the "2,000-foot" sand due to heavy pumping, indicating potential groundwater upward flow from the "2,400-foot" sand.

  1. Modeling complex treatment strategies: construction and validation of a discrete event simulation model for glaucoma.

    PubMed

    van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G

    2010-01-01

    Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity) and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. There is added value of DES models in complex treatment strategies such as glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
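
    The event-queue mechanics that give DES its flexibility are easy to show in miniature. The sketch below schedules competing progression, treatment-switch, and death events for one simulated patient using a priority queue; every state name and rate is invented for illustration and is not the published glaucoma model.

      # Minimal DES skeleton: competing events drawn from exponential clocks and
      # processed in time order from a heap. Rates and states are illustrative.
      import heapq
      import random

      def simulate_patient(horizon_years=20.0, seed=1):
          rng = random.Random(seed)
          state, events, history = "ocular_hypertension", [], []
          heapq.heappush(events, (rng.expovariate(0.10), "progression"))
          heapq.heappush(events, (rng.expovariate(0.05), "treatment_switch"))
          heapq.heappush(events, (rng.expovariate(0.02), "death"))
          while events:
              t, kind = heapq.heappop(events)
              if t > horizon_years or kind == "death":
                  break                                  # censoring or terminal event
              if kind == "progression":
                  state = "glaucoma"
                  heapq.heappush(events, (t + rng.expovariate(0.08), "progression"))
              elif kind == "treatment_switch":
                  heapq.heappush(events, (t + rng.expovariate(0.05), "treatment_switch"))
              history.append((round(t, 2), kind, state))
          return history

      print(simulate_patient())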

  2. A density-based clustering model for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Zhao, Xiang; Li, Yantao; Qu, Zehui

    2018-04-01

    Network clustering (or graph partitioning) is an important technique for uncovering the underlying community structures in complex networks, which has been widely applied in various fields including astronomy, bioinformatics, sociology, and bibliometrics. In this paper, we propose a density-based clustering model for community detection in complex networks (DCCN). The key idea is to find group centers with a higher density than their neighbors and a relatively large integrated-distance from nodes with higher density. The experimental results indicate that our approach is efficient and effective for community detection in complex networks.
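
    The key idea (centers are nodes of high local density that sit far, in graph distance, from any denser node) can be sketched compactly. The density and distance definitions below are generic density-peak choices and may differ from DCCN's exact formulation.

      # Density-peak style community detection on a graph: rho = local density,
      # delta = hop distance to the nearest higher-density node; nodes with high
      # rho * delta become community centers. Definitions are a generic sketch.
      import networkx as nx

      def density_peaks(G, d_cut=1, n_centers=2):
          dist = dict(nx.all_pairs_shortest_path_length(G))
          rho = {v: sum(1 for u in G if u != v and dist[v].get(u, 1e9) <= d_cut) for v in G}
          delta = {}
          for v in G:
              higher = [dist[v][u] for u in G if rho[u] > rho[v]]
              delta[v] = min(higher) if higher else max(dist[v].values())
          score = {v: rho[v] * delta[v] for v in G}
          centers = sorted(score, key=score.get, reverse=True)[:n_centers]
          # assign every node to its nearest center
          return {v: min(centers, key=lambda c: dist[v].get(c, 1e9)) for v in G}

      G = nx.barbell_graph(5, 1)   # two cliques joined by a path: two communities
      print(density_peaks(G))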

  3. Delineating parameter unidentifiabilities in complex models

    NASA Astrophysics Data System (ADS)

    Raman, Dhruva V.; Anderson, James; Papachristodoulou, Antonis

    2017-03-01

    Scientists use mathematical modeling as a tool for understanding and predicting the properties of complex physical systems. In highly parametrized models there often exist relationships between parameters over which model predictions are identical, or nearly identical. These are known as structural or practical unidentifiabilities, respectively. They are hard to diagnose and make reliable parameter estimation from data impossible. They furthermore imply the existence of an underlying model simplification. We describe a scalable method for detecting unidentifiabilities, as well as the functional relations defining them, for generic models. This allows for model simplification, and appreciation of which parameters (or functions thereof) cannot be estimated from data. Our algorithm can identify features such as redundant mechanisms and fast time-scale subsystems, as well as the regimes in parameter space over which such approximations are valid. We base our algorithm on a quantification of regional parametric sensitivity that we call "multiscale sloppiness". Traditionally, the link between parametric sensitivity and the conditioning of the parameter estimation problem is made locally, through the Fisher information matrix. This is valid in the regime of infinitesimal measurement uncertainty. We demonstrate the duality between multiscale sloppiness and the geometry of confidence regions surrounding parameter estimates made where measurement uncertainty is non-negligible. Further theoretical relationships are provided linking multiscale sloppiness to the likelihood-ratio test. From this, we show that a local sensitivity analysis (as typically done) is insufficient for determining the reliability of parameter estimation, even with simple (non)linear systems. Our algorithm can provide a tractable alternative. We finally apply our methods to a large-scale, benchmark systems biology model of nuclear factor (NF)-κB, uncovering unidentifiabilities.
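
    For contrast with the paper's multiscale approach, the standard local analysis it builds on is easy to sketch: assemble the Fisher information matrix from finite-difference output sensitivities and inspect its eigenvalue spread. The toy exponential model below, observed only at early times where roughly only the product of its two parameters matters, shows the hallmark "sloppy" spectrum; it is illustrative, not the paper's NF-κB model.

      # Local sensitivity / Fisher information sketch: near-zero eigenvalues of
      # J^T J flag (locally) sloppy or unidentifiable parameter combinations.
      import numpy as np

      def model(theta, t):
          a, b = theta
          return a * np.exp(-b * t)

      def fisher_spectrum(theta, t, h=1e-6):
          J = np.empty((len(t), len(theta)))
          for j in range(len(theta)):
              dp = np.array(theta, float)
              dp[j] += h
              J[:, j] = (model(dp, t) - model(theta, t)) / h   # d y / d theta_j
          return np.linalg.eigvalsh(J.T @ J)

      t = np.linspace(0.0, 0.05, 20)          # short observation window: a, b trade off
      print(fisher_spectrum([2.0, 5.0], t))   # wide eigenvalue spread => sloppiness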

  4. Inference, simulation, modeling, and analysis of complex networks, with special emphasis on complex networks in systems biology

    NASA Astrophysics Data System (ADS)

    Christensen, Claire Petra

    Across diverse fields ranging from physics to biology, sociology, and economics, the technological advances of the past decade have engendered an unprecedented explosion of data on highly complex systems with thousands, if not millions of interacting components. These systems exist at many scales of size and complexity, and it is becoming ever-more apparent that they are, in fact, universal, arising in every field of study. Moreover, they share fundamental properties---chief among these, that the individual interactions of their constituent parts may be well-understood, but the characteristic behaviour produced by the confluence of these interactions---by these complex networks---is unpredictable; in a nutshell, the whole is more than the sum of its parts. There is, perhaps, no better illustration of this concept than the discoveries being made regarding complex networks in the biological sciences. In particular, though the sequencing of the human genome in 2003 was a remarkable feat, scientists understand that the "cellular-level blueprints" for the human being are cellular-level parts lists, but they say nothing (explicitly) about cellular-level processes. The challenge of modern molecular biology is to understand these processes in terms of the networks of parts---in terms of the interactions among proteins, enzymes, genes, and metabolites---as it is these processes that ultimately differentiate animate from inanimate, giving rise to life! It is the goal of systems biology---an umbrella field encapsulating everything from molecular biology to epidemiology in social systems---to understand processes in terms of fundamental networks of core biological parts, be they proteins or people. By virtue of the fact that there are literally countless complex systems, not to mention tools and techniques used to infer, simulate, analyze, and model these systems, it is impossible to give a truly comprehensive account of the history and study of complex systems. The author…

  5. Designing management strategies for carbon dioxide storage and utilization under uncertainty using inexact modelling

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2017-06-01

    Effective application of carbon capture, utilization and storage (CCUS) systems could help to alleviate the influence of climate change by reducing carbon dioxide (CO2) emissions. The research objective of this study is to develop an equilibrium chance-constrained programming model with bi-random variables (ECCP model) for supporting the CCUS management system under random circumstances. The major advantage of the ECCP model is that it tackles random variables as bi-random variables with a normal distribution, where the mean values follow a normal distribution. This could avoid irrational assumptions and oversimplifications in the process of parameter design and enrich the theory of stochastic optimization. The ECCP model is solved by an equilibrium chance-constrained programming algorithm, which provides convenience for decision makers to rank the solution set using the natural order of real numbers. The ECCP model is applied to a CCUS management problem, and the solutions could be useful in helping managers to design and generate rational CO2-allocation patterns under complexities and uncertainties.
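
    The standard trick beneath such models is to replace a probabilistic constraint by its deterministic equivalent. The toy allocation LP below does this for a normally distributed capacity; the ECCP model goes further, letting the mean itself be random (bi-random). All coefficients are invented for illustration.

      # Chance constraint made deterministic: Pr(a.x <= b) >= alpha with
      # b ~ Normal(mu, sigma^2) tightens to a.x <= mu + sigma * Phi^{-1}(1 - alpha).
      import numpy as np
      from scipy.optimize import linprog
      from scipy.stats import norm

      alpha = 0.95                               # required satisfaction probability
      mu, sigma = 100.0, 8.0                     # storage capacity mean, std (Mt CO2)
      b_det = mu + sigma * norm.ppf(1 - alpha)   # tightened deterministic capacity

      c = [-3.0, -2.0]                  # maximize utilization benefit (negated for linprog)
      A = [[1.0, 1.0]]                  # total allocated CO2 must respect capacity
      res = linprog(c, A_ub=A, b_ub=[b_det], bounds=[(0, 80), (0, 80)])
      print(res.x, "allocated vs. tightened capacity", round(b_det, 2))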

  6. Complex Constructivism: A Theoretical Model of Complexity and Cognition

    ERIC Educational Resources Information Center

    Doolittle, Peter E.

    2014-01-01

    Education has long been driven by its metaphors for teaching and learning. These metaphors have influenced both educational research and educational practice. Complexity and constructivism are two theories that provide functional and robust metaphors. Complexity provides a metaphor for the structure of myriad phenomena, while constructivism…

  7. Formative feedback and scaffolding for developing complex problem solving and modelling outcomes

    NASA Astrophysics Data System (ADS)

    Frank, Brian; Simper, Natalie; Kaupp, James

    2018-07-01

    This paper discusses the use and impact of formative feedback and scaffolding to develop outcomes for complex problem solving in a required first-year course in engineering design and practice at a medium-sized research-intensive Canadian university. In 2010, the course began to use team-based, complex, open-ended contextualised problems to develop problem solving, communications, teamwork, modelling, and professional skills. Since then, formative feedback has been incorporated into: task and process-level feedback on scaffolded tasks in-class, formative assignments, and post-assignment review. Development in complex problem solving and modelling has been assessed through analysis of responses from student surveys, direct criterion-referenced assessment of course outcomes from 2013 to 2015, and an external longitudinal study. The findings suggest that students are improving in outcomes related to complex problem solving over the duration of the course. Most notably, the addition of new feedback and scaffolding coincided with improved student performance.

  8. Development of structural model of adaptive training complex in ergatic systems for professional use

    NASA Astrophysics Data System (ADS)

    Obukhov, A. D.; Dedov, D. L.; Arkhipov, A. E.

    2018-03-01

    The article considers the structural model of the adaptive training complex (ATC), which reflects the interrelations between the hardware, software and mathematical model of the ATC and describes the processes in this subject area. A description of the main components of the software and hardware complex, their interaction and their functioning within the common system is given. The article also gives a brief description of the mathematical models of personnel activity, the technical system and the influences, whose interactions formalize the regularities of ATC functioning. Studies of the main objects of training complexes and the connections between them will make it possible to implement ATC practically in ergatic systems for professional use.

  9. Contingency Detection in a Complex World: A Developmental Model and Implications for Atypical Development

    ERIC Educational Resources Information Center

    Northrup, Jessie Bolz

    2017-01-01

    The present article proposes a new developmental model of how young infants adapt and respond to complex contingencies in their environment, and how this influences development. The model proposes that typically developing infants adjust to an increasingly complex environment in ways that make it easier for them to allocate limited attentional…

  10. Process Consistency in Models: the Importance of System Signatures, Expert Knowledge and Process Complexity

    NASA Astrophysics Data System (ADS)

    Hrachowitz, Markus; Fovet, Ophelie; Ruiz, Laurent; Gascuel-Odoux, Chantal; Savenije, Hubert

    2014-05-01

    Hydrological models are frequently characterized by what is often considered to be adequate calibration performance. In many cases, however, these models experience a substantial uncertainty and performance decrease in validation periods, thus resulting in poor predictive power. Besides the likely presence of data errors, this observation can point towards wrong or insufficient representations of the underlying processes and their heterogeneity. In other words, right results are generated for the wrong reasons. Ways are therefore sought to increase model consistency and thereby satisfy the contrasting priorities of a) increasing model complexity and b) limiting model equifinality. In this study a stepwise model development approach is chosen to test the value of an exhaustive and systematic combined use of hydrological signatures, expert knowledge and readily available, yet anecdotal and rarely exploited, hydrological information for increasing model consistency towards generating the right answer for the right reasons. A simple 3-box, 7-parameter, conceptual HBV-type model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph with comparatively high values for the 4 objective functions in the 5-year calibration period. However, closer inspection of the results showed a dramatic decrease of model performance in the 5-year validation period. In addition, assessing the model's skill to reproduce a range of 20 hydrological signatures including, amongst others, the flow duration curve, the autocorrelation function and the rising limb density, showed that it could not adequately reproduce the vast majority of these signatures, indicating a lack of model consistency. Subsequently model complexity was increased in a stepwise way to allow for more process heterogeneity. To limit model equifinality, the increase in complexity was counter-balanced by a stepwise application of "realism constraints", inferred from expert knowledge…
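
    Three of the signatures named above are easily computed from any discharge series. The sketch below uses synthetic data and common textbook definitions, which may differ in detail from those adopted in the study.

      # Hydrological signatures: flow duration curve, lag-1 autocorrelation,
      # rising limb density. `q` is a daily discharge series (here synthetic).
      import numpy as np

      def flow_duration_curve(q):
          """Exceedance probability vs. flows sorted from high to low."""
          qs = np.sort(q)[::-1]
          exceed = np.arange(1, len(qs) + 1) / (len(qs) + 1)
          return exceed, qs

      def autocorr(q, lag=1):
          a, b = q[:-lag] - q.mean(), q[lag:] - q.mean()
          return (a * b).sum() / ((q - q.mean()) ** 2).sum()

      def rising_limb_density(q):
          """Number of rising limbs divided by total time spent rising."""
          rising = np.diff(q) > 0
          starts = np.sum(np.diff(rising.astype(int)) == 1) + int(rising[0])
          return starts / max(rising.sum(), 1)

      q = np.abs(np.random.default_rng(0).normal(5, 2, 365)).cumsum() * 0.01 + \
          np.sin(np.linspace(0, 12 * np.pi, 365)) + 2
      exceed, qs = flow_duration_curve(q)
      print("Q5 (high flow):", round(float(qs[int(0.05 * len(qs))]), 2))
      print("lag-1 autocorr:", round(autocorr(q), 3),
            "| rising limb density:", round(rising_limb_density(q), 3))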

  11. Surface complexation model of uranyl sorption on Georgia kaolinite

    USGS Publications Warehouse

    Payne, T.E.; Davis, J.A.; Lumpkin, G.R.; Chisari, R.; Waite, T.D.

    2004-01-01

    The adsorption of uranyl on standard Georgia kaolinites (KGa-1 and KGa-1B) was studied as a function of pH (3-10), total U (1 and 10 μmol/l), and mass loading of clay (4 and 40 g/l). The uptake of uranyl in air-equilibrated systems increased with pH and reached a maximum in the near-neutral pH range. At higher pH values, the sorption decreased due to the presence of aqueous uranyl carbonate complexes. One kaolinite sample was examined after the uranyl uptake experiments by transmission electron microscopy (TEM), using energy dispersive X-ray spectroscopy (EDS) to determine the U content. It was found that uranium was preferentially adsorbed by Ti-rich impurity phases (predominantly anatase), which are present in the kaolinite samples. Uranyl sorption on the Georgia kaolinites was simulated with U sorption reactions on both titanol and aluminol sites, using a simple non-electrostatic surface complexation model (SCM). The relative amounts of U-binding >TiOH and >AlOH sites were estimated from the TEM/EDS results. A ternary uranyl carbonate complex on the titanol site improved the fit to the experimental data in the higher pH range. The final model contained only three optimised log K values, and was able to simulate adsorption data across a wide range of experimental conditions. The >TiOH (anatase) sites appear to play an important role in retaining U at low uranyl concentrations. As kaolinite often contains trace TiO2, its presence may need to be taken into account when modelling the results of sorption experiments with radionuclides or trace metals on kaolinite. © 2004 Elsevier B.V. All rights reserved.

  12. Using multi-criteria analysis of simulation models to understand complex biological systems

    Treesearch

    Maureen C. Kennedy; E. David Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
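
    The Pareto-optimality idea referred to above can be sketched compactly: given several error criteria per simulated parameter set (lower is better), keep every set that no other set beats on all criteria at once, instead of collapsing the criteria into one weighted score. The data below are synthetic.

      # Non-dominated (Pareto) filtering of multi-output model runs.
      import numpy as np

      def pareto_front(errors):
          """Rows of `errors` are parameter sets, columns are criteria (minimize)."""
          keep = []
          for i, e in enumerate(errors):
              dominated = any(np.all(f <= e) and np.any(f < e)
                              for j, f in enumerate(errors) if j != i)
              if not dominated:
                  keep.append(i)
          return keep

      errors = np.random.default_rng(1).random((50, 3))   # 50 runs, 3 output criteria
      front = pareto_front(errors)
      print(f"{len(front)} non-dominated parameter sets:", front)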

  13. Modelling Complex Fenestration Systems using physical and virtual models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanachareonkit, Anothai; Scartezzini, Jean-Louis

    2010-04-15

    Physical or virtual models are commonly used to visualize the conceptual ideas of architects, lighting designers and researchers; they are also employed to assess the daylighting performance of buildings, particularly in cases where Complex Fenestration Systems (CFS) are considered. Recent studies have however revealed a general tendency of physical models to over-estimate this performance, compared to those of real buildings; these discrepancies can be attributed to several reasons. In order to identify the main error sources, a series of comparisons between a real building (a single office room within a test module) and the corresponding physical and virtual models was undertaken. The physical model was placed in outdoor conditions, which were strictly identical to those of the real building, as well as underneath a scanning sky simulator. The virtual model simulations were carried out by way of the Radiance program using the GenSky function; an alternative evaluation method, named the Partial Daylight Factor method (PDF method), was also employed with the physical model together with sky luminance distributions acquired by a digital sky scanner during the monitoring of the real building. The overall daylighting performances of the physical and virtual models were assessed and compared. The causes of discrepancies between the daylighting performance of the real building and the models were analysed. The main identified sources of errors are the reproduction of building details, the CFS modelling and the mocking-up of the geometrical and photometrical properties. To study the impact of these errors on daylighting performance assessment, computer simulation models created using the Radiance program were also used to carry out a sensitivity analysis of modelling errors. The study of the models showed that large discrepancies can occur in daylighting performance assessment. In case of improper mocking-up of the glazing for instance, relative divergences of 25…

  14. From the baker to the bedside: yeast models of Parkinson's disease

    PubMed Central

    Menezes, Regina; Tenreiro, Sandra; Macedo, Diana; Santos, Cláudia N.; Outeiro, Tiago F.

    2015-01-01

    The baker’s yeast Saccharomyces cerevisiae has been extensively explored for our understanding of fundamental cell biology processes highly conserved in the eukaryotic kingdom. In this context, they have proven invaluable in the study of complex mechanisms such as those involved in a variety of human disorders. Here, we first provide a brief historical perspective on the emergence of yeast as an experimental model and on how the field evolved to exploit the potential of the model for tackling the intricacies of various human diseases. In particular, we focus on existing yeast models of the molecular underpinnings of Parkinson’s disease (PD), focusing primarily on the central role of protein quality control systems. Finally, we compile and discuss the major discoveries derived from these studies, highlighting their far-reaching impact on the elucidation of PD-associated mechanisms as well as in the identification of candidate therapeutic targets and compounds with therapeutic potential. PMID:28357302

  15. Modeling ultrasound propagation through material of increasing geometrical complexity.

    PubMed

    Odabaee, Maryam; Odabaee, Mostafa; Pelekanos, Matthew; Leinenga, Gerhard; Götz, Jürgen

    2018-06-01

    Ultrasound is increasingly being recognized as a neuromodulatory and therapeutic tool, inducing a broad range of bio-effects in the tissue of experimental animals and humans. To achieve these effects in a predictable manner in the human brain, the thick cancellous skull presents a problem, causing attenuation. In order to overcome this challenge, as a first step, the acoustic properties of a set of simple bone-modeling resin samples that displayed an increasing geometrical complexity (increasing step sizes) were analyzed. Using two Non-Destructive Testing (NDT) transducers, we found that Wiener deconvolution predicted the Ultrasound Acoustic Response (UAR) and attenuation caused by the samples. However, whereas the UAR of samples with step sizes larger than the wavelength could be accurately estimated, the prediction was not accurate when the sample had a smaller step size. Furthermore, a Finite Element Analysis (FEA) performed in ANSYS determined that the scattering and refraction of sound waves was significantly higher in complex samples with smaller step sizes compared to simple samples with a larger step size. Together, this reveals an interaction of frequency and geometrical complexity in predicting the UAR and attenuation. These findings could in future be applied to poro-visco-elastic materials that better model the human skull. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
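
    Wiener deconvolution itself is a one-line frequency-domain filter. The sketch below recovers a synthetic two-reflector response from a toy pulse; the noise-to-signal regularizer and all signals are illustrative stand-ins, not the NDT measurements of the study.

      # Wiener deconvolution: estimate the acoustic response h from
      # received = h * transmitted, regularized so near-zero frequencies
      # in the transmitted spectrum do not blow up.
      import numpy as np

      def wiener_deconvolve(received, transmitted, nsr=1e-2):
          """Recover the ultrasound acoustic response (UAR)."""
          n = len(received)
          T = np.fft.rfft(transmitted, n)
          R = np.fft.rfft(received, n)
          H = R * np.conj(T) / (np.abs(T) ** 2 + nsr)   # Wiener filter
          return np.fft.irfft(H, n)

      # toy check: convolve a known response with a pulse, then recover it
      t = np.linspace(0, 1, 512)
      pulse = np.exp(-((t - 0.1) / 0.01) ** 2) * np.sin(2 * np.pi * 40 * t)
      true_h = np.zeros_like(t)
      true_h[60], true_h[200] = 1.0, 0.4            # two reflectors
      received = np.convolve(pulse, true_h)[:512]
      print(np.argmax(wiener_deconvolve(received, pulse)))   # ~60 (strong reflector)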

  16. Predictive model of complexity in early palliative care: a cohort of advanced cancer patients (PALCOM study).

    PubMed

    Tuca, Albert; Gómez-Martínez, Mónica; Prat, Aleix

    2018-01-01

    The model of early palliative care (PC) integrated in oncology is based on shared care from diagnosis to the end of life and is mainly focused on patients with greater complexity. However, there is no definition of PC complexity, nor tools to evaluate it. The objectives of the study were to identify the factors influencing the determination of the level of complexity, to propose predictive models, and to build a PC complexity scale. We performed a prospective, observational, multicenter study in a cohort of advanced cancer patients with an estimated prognosis ≤ 6 months. An ad hoc structured evaluation including socio-demographic and clinical data, symptom burden, functional and cognitive status, psychosocial problems, and existential-ethic dilemmas was recorded systematically. According to this multidimensional evaluation, investigators classified patients as having high, medium, or low palliative complexity, associated with the need for basic or specialized PC. Logistic regression was used to identify the variables influencing determination of the level of PC complexity and to explore predictive models. We included 324 patients; 41% were classified as having high PC complexity and 42.9% as medium, both levels being associated with specialized PC. Variables influencing determination of PC complexity were as follows: high symptom burden (OR 3.19, 95% CI 1.72-6.17), difficult pain (OR 2.81, 95% CI 1.64-4.9), functional status (OR 0.99, 95% CI 0.98-0.9), and social-ethical existential risk factors (OR 3.11, 95% CI 1.73-5.77). Logistic analysis of these variables allowed the construction of a complexity model and structured scales (PALCOM 1 and 2) with high predictive value (AUC ROC 76%). This study provides a new model and tools to assess complexity in palliative care, which may be very useful for managing referral to specialized PC services and for agreeing on the intensity of their intervention in a model of early shared care integrated in oncology.

  17. Using machine learning tools to model complex toxic interactions with limited sampling regimes.

    PubMed

    Bertin, Matthew J; Moeller, Peter; Guillette, Louis J; Chapman, Robert W

    2013-03-19

    A major impediment to understanding the impact of environmental stress, including toxins and other pollutants, on organisms, is that organisms are rarely challenged by one or a few stressors in natural systems. Thus, linking laboratory experiments that are limited by practical considerations to a few stressors and a few levels of these stressors to real world conditions is constrained. In addition, while the existence of complex interactions among stressors can be identified by current statistical methods, these methods do not provide a means to construct mathematical models of these interactions. In this paper, we offer a two-step process by which complex interactions of stressors on biological systems can be modeled in an experimental design that is within the limits of practicality. We begin with the notion that environment conditions circumscribe an n-dimensional hyperspace within which biological processes or end points are embedded. We then randomly sample this hyperspace to establish experimental conditions that span the range of the relevant parameters and conduct the experiment(s) based upon these selected conditions. Models of the complex interactions of the parameters are then extracted using machine learning tools, specifically artificial neural networks. This approach can rapidly generate highly accurate models of biological responses to complex interactions among environmentally relevant toxins, identify critical subspaces where nonlinear responses exist, and provide an expedient means of designing traditional experiments to test the impact of complex mixtures on biological responses. Further, this can be accomplished with an astonishingly small sample size.
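
    A compact sketch of the two-step process: random sampling of the stressor hyperspace stands in for the experimental design, and a small artificial neural network is then fit to the measured responses. The response surface below is synthetic, a placeholder for a real bioassay.

      # Step 1: sample the n-dimensional stressor hyperspace at random.
      # Step 2: fit an artificial neural network to the observed responses.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 1, size=(200, 3))        # 3 stressors, 200 sampled conditions
      y = X[:, 0] * X[:, 1] + np.sin(3 * X[:, 2]) + rng.normal(0, 0.05, 200)  # interaction + noise

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
      net.fit(X_tr, y_tr)
      print("held-out R^2:", round(net.score(X_te, y_te), 3))  # captures the nonlinear interaction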

  18. Experimental determination and modeling of arsenic complexation with humic and fulvic acids.

    PubMed

    Fakour, Hoda; Lin, Tsair-Fuh

    2014-08-30

    The complexation of humic acid (HA) and fulvic acid (FA) with arsenic (As) in water was studied. Experimental results indicate that arsenic may form complexes with HA and FA with a higher affinity for arsenate than for arsenite. With the presence of iron oxide based adsorbents, binding of arsenic to HA/FA in water was significantly suppressed, probably due to adsorption of As and HA/FA. A two-site ligand binding model, considering only strong and weak site types of binding affinity, was successfully developed to describe the complexation of arsenic on the two natural organic fractions. The model showed that the numbers of weak sites were more than 10 times those of strong sites on both HA and FA for both arsenic species studied. The numbers of both types of binding sites were found to be proportional to the HA concentrations, while the apparent stability constants, defined for describing binding affinity between arsenic and the sites, are independent of the HA concentrations. To the best of our knowledge, this is the first study to characterize the impact of HA concentrations on the applicability of the ligand binding model, and to extrapolate the model to FA. The obtained results may give insights on the complexation of arsenic in HA/FA laden groundwater and on the selection of more effective adsorption-based treatment methods for natural waters. Copyright © 2014 Elsevier B.V. All rights reserved.
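
    The two-site idea is a sum of two Langmuir-type terms, one per site class, each with its own capacity and stability constant, and can be fit directly to binding data. The parameterization below is a common form assumed for illustration (the paper's exact formulation may differ), and the data are synthetic.

      # Two-site ligand binding sketch: strong + weak site classes fit by
      # nonlinear least squares. Parameter values and data are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def two_site(c_free, b_strong, k_strong, b_weak, k_weak):
          """Bound As vs. free As for strong + weak site classes (Langmuir-type)."""
          return (b_strong * k_strong * c_free / (1 + k_strong * c_free) +
                  b_weak * k_weak * c_free / (1 + k_weak * c_free))

      c = np.logspace(-2, 2, 25)               # free As (umol/L)
      true = (0.5, 5.0, 8.0, 0.05)             # strong sites scarcer, higher affinity
      rng = np.random.default_rng(3)
      bound = two_site(c, *true) * (1 + rng.normal(0, 0.03, c.size))

      popt, _ = curve_fit(two_site, c, bound, p0=(1, 1, 10, 0.01), maxfev=20000)
      print(dict(zip(["b_strong", "k_strong", "b_weak", "k_weak"], popt.round(3))))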

  19. From global circulation to flood loss: Coupling models across the scales

    NASA Astrophysics Data System (ADS)

    Felder, Guido; Gomez-Navarro, Juan Jose; Bozhinova, Denica; Zischg, Andreas; Raible, Christoph C.; Ole, Roessler; Martius, Olivia; Weingartner, Rolf

    2017-04-01

    The prediction and prevention of flood losses require an extensive understanding of the underlying meteorological, hydrological, hydraulic and damage processes. Coupled models help to improve the understanding of such underlying processes and therefore contribute to the understanding of flood risk. Using such a modelling approach to determine potentially flood-affected areas and damages requires a complex coupling between several models operating at different spatial and temporal scales. Although the isolated parts of the single modelling components are well established and commonly used in the literature, a full coupling including a mesoscale meteorological model driven by a global circulation model, a hydrologic model, a hydrodynamic model and a flood impact and loss model has not been reported so far. In the present study, we tackle the application of such a coupled model chain in terms of computational resources, scale effects, and model performance. From a technical point of view, results show the general applicability of such a coupled model, as well as good model performance. From a practical point of view, such an approach enables the prediction of flood-induced damages, although some future challenges have been identified.

  20. The Complexity of Developmental Predictions from Dual Process Models

    ERIC Educational Resources Information Center

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  1. Fischer and Schrock Carbene Complexes: A Molecular Modeling Exercise

    ERIC Educational Resources Information Center

    Montgomery, Craig D.

    2015-01-01

    An exercise in molecular modeling that demonstrates the distinctive features of Fischer and Schrock carbene complexes is presented. Semi-empirical calculations (PM3) demonstrate the singlet ground electronic state, restricted rotation about the C-Y bond, the positive charge on the carbon atom, and hence, the electrophilic nature of the Fischer…

  2. Health governance by collaboration: a case study on an area-based programme to tackle health inequalities in the Dutch city of the Hague.

    PubMed

    Plochg, Thomas; Schmidt, Melanie; Klazinga, Niek S; Stronks, Karien

    2013-12-01

    Area-based programmes are seen as a promising strategy for tackling health inequalities. In these programmes, local authorities and other local actors collaborate to employ health promoting interventions and policies. Little is known about the underlying processes of collaborative governance. To unravel this black box, we explored how the authority of The Hague, The Netherlands, developed a programme tackling health inequalities drawing on a collaborative mode of governance. Case study drawing on qualitative semi-structured interviews and document review. Data were inductively analysed against the concept of collaborative governance. The authority's ambition was to co-produce a programme on tackling health inequalities with local actors. Three stages could be distinguished in the governing process: (i) formulating policy objectives, (ii) translating policy objectives into interventions and (iii) executing health interventions. In the stage of formulating policy objectives, the collaboration led to a reframing of the initial objectives. Furthermore, the translation of the policy objectives into health interventions was rather pragmatic and loosely based on health needs and/or evidence. As a result, the concrete actions that ensued from the programme did not necessarily reflect the initial objectives. In a local system of health governance by collaboration, factors other than the stated policy objectives played a role, eventually undermining the effectiveness of the programme in reducing health inequalities. To be effective, the processes of collaborative governance underlying area-based programmes require the attention of the local authority, including the building and governing of networks, a competent public health workforce and supportive infrastructures.

  3. Impact of a United Kingdom-wide campaign to tackle antimicrobial resistance on self-reported knowledge and behaviour change.

    PubMed

    Chaintarli, Katerina; Ingle, Suzanne M; Bhattacharya, Alex; Ashiru-Oredope, Diane; Oliver, Isabel; Gobin, Maya

    2016-05-12

    As part of the 2014 European Antibiotic Awareness Day plans, a new campaign called Antibiotic Guardian (AG) was launched in the United Kingdom, including an online pledge system to increase commitment from healthcare professionals and members of the public to reduce antimicrobial resistance (AMR). The aim of this evaluation was to determine the impact of the campaign on self-reported knowledge and behaviour around AMR. An online survey was sent to 9016 Antibiotic Guardians (AGs) to assess changes in self-reported knowledge and behaviour (outcomes) following the campaign. Logistic regression models, adjusted for variables including age, sex and pledge group (pledging as a member of the public or as a healthcare professional), were used to estimate associations between outcomes and AG characteristics. 2478 AGs responded to the survey (27.5% response rate), of whom 1696 (68.4%) pledged as healthcare professionals and 782 (31.6%) as members of the public (similar proportions to the total number of AGs). 96.3% of all AGs who responded had prior knowledge of AMR. 73.5% of participants were female, and participants were most commonly between 45 and 54 years old. Two thirds (63.4%) of participants reported always acting according to their pledge. Members of the public were more likely to act in line with their pledge than professionals (Odds Ratio (OR) = 3.60, 95% Confidence Interval (CI): 2.88-4.51). Approximately half of participants (44.5%) (both healthcare professionals and members of the public) reported that they acquired more knowledge about AMR post-campaign. People who were confused about AMR prior to the campaign acquired more knowledge after the campaign (OR = 3.10, 95% CI: 1.36-7.09). More participants reported a sense of personal responsibility towards tackling AMR post-campaign, increasing from 58.3% of participants pre-campaign to 70.5% post-campaign. This study demonstrated that the campaign increased commitment to tackling AMR in both healthcare professionals and members of the public.

  4. Development and evaluation of a musculoskeletal model of the elbow joint complex

    NASA Technical Reports Server (NTRS)

    Gonzalez, Roger V.; Hutchins, E. L.; Barr, Ronald E.; Abraham, Lawrence D.

    1993-01-01

    This paper describes the development and evaluation of a musculoskeletal model that represents human elbow flexion-extension and forearm pronation-supination. The length, velocity, and moment arm for each of the eight musculotendon actuators were based on skeletal anatomy and position. Musculotendon parameters were determined for each actuator and verified by comparing analytical torque-angle curves with experimental joint torque data. The parameters and skeletal geometry were also utilized in the musculoskeletal model for the analysis of ballistic elbow joint complex movements. The key objective was to develop a computational model, guided by parameterized optimal control, to investigate the relationship among patterns of muscle excitation, individual muscle forces, and movement kinematics. The model was verified using experimental kinematic, torque, and electromyographic data from volunteer subjects performing ballistic elbow joint complex movements.

  5. The Skilled Counselor Training Model: Skills Acquisition, Self-Assessment, and Cognitive Complexity

    ERIC Educational Resources Information Center

    Little, Cassandra; Packman, Jill; Smaby, Marlowe H.; Maddux, Cleborne D.

    2005-01-01

    The authors evaluated the effectiveness of the Skilled Counselor Training Model (SCTM; M. H. Smaby, C. D. Maddux, E. Torres-Rivera, & R. Zimmick, 1999) in teaching counseling skills and in fostering counselor cognitive complexity. Counselor trainees who completed the SCTM had better counseling skills and higher levels of cognitive complexity than…

  6. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    NASA Astrophysics Data System (ADS)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally-increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long term process studies and the current cold regions literature allows for comparison of process representations and, importantly, of their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner may yield important insights and aid in the development of improved process representations.

  7. Mapping injustice, visualizing equity: why theory, metaphors and images matter in tackling inequalities.

    PubMed

    Krieger, N; Dorling, D; McCartney, G

    2012-03-01

    This symposium discussed "Mapping injustice, visualizing equity: why theory, metaphors and images matter in tackling inequalities". It sought to provoke critical thinking about the current theories used to analyze the health impact of injustice, variously referred to as "health inequalities" in the UK, "social inequalities in health" in the US, and "health inequities" more globally. Our focus was the types of explanations, images, and metaphors these theories employ. Building on frameworks that emphasize politics, agency, and accountability, we suggested that it was essential to engage the general public in the politics of health inequities if progress is to be made. We showcased some examples of such engagement before inviting the audience to consider how this might apply in their own areas of responsibility. Copyright © 2012 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  8. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

    Sampling limitations in electron microscopy raise the question of whether an analysis is representative of the bulk material, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
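    The acquisition and stitching tools themselves are in-house software, but the core stitching step can be illustrated. A minimal sketch, assuming plain numpy and two equally sized, partially overlapping tiles, estimates their relative shift by phase correlation:

        # Minimal sketch: estimate the (row, col) shift between two
        # overlapping (S)TEM tiles by phase correlation. Illustrative only;
        # not the TU/e toolbox itself.
        import numpy as np

        def phase_correlation_shift(tile_a, tile_b):
            """Return the integer shift that best aligns tile_b onto tile_a."""
            fa = np.fft.fft2(tile_a)
            fb = np.fft.fft2(tile_b)
            cross_power = fa * np.conj(fb)
            cross_power /= np.abs(cross_power) + 1e-12  # keep phase only
            correlation = np.fft.ifft2(cross_power).real
            peak = np.unravel_index(np.argmax(correlation), correlation.shape)
            # Peaks in the upper half-range wrap around to negative shifts.
            return tuple(p - s if p > s // 2 else p
                         for p, s in zip(peak, correlation.shape))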

  9. Computer-aided molecular modeling techniques for predicting the stability of drug cyclodextrin inclusion complexes in aqueous solutions

    NASA Astrophysics Data System (ADS)

    Faucci, Maria Teresa; Melani, Fabrizio; Mura, Paola

    2002-06-01

    Molecular modeling was used to investigate factors influencing complex formation between cyclodextrins and guest molecules and predict their stability through a theoretical model based on the search for a correlation between experimental stability constants (Ks) and some theoretical parameters describing complexation (docking energy, host-guest contact surfaces, intermolecular interaction fields) calculated from complex structures at a minimum conformational energy, obtained through stochastic methods based on molecular dynamic simulations. Naproxen, ibuprofen, ketoprofen and ibuproxam were used as model drug molecules. Multiple Regression Analysis allowed identification of the significant factors for the complex stability. A mathematical model (r = 0.897) related log Ks with complex docking energy and lipophilic molecular fields of cyclodextrin and drug.
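    A minimal sketch of the kind of multiple regression described, relating log Ks to two computed descriptors (the arrays below are placeholders, not the paper's data):

        # Sketch of a two-descriptor linear model:
        # log Ks = b0 + b1 * E_dock + b2 * MLP. Placeholder values.
        import numpy as np

        log_ks = np.array([2.9, 3.1, 2.5, 2.7])          # experimental log Ks
        e_dock = np.array([-31.2, -33.5, -28.4, -30.1])  # docking energy (kcal/mol)
        mlp = np.array([1.8, 2.2, 1.1, 1.5])             # lipophilic field descriptor

        X = np.column_stack([np.ones_like(e_dock), e_dock, mlp])
        coeffs, *_ = np.linalg.lstsq(X, log_ks, rcond=None)

        predicted = X @ coeffs
        r = np.corrcoef(log_ks, predicted)[0, 1]  # multiple correlation, cf. r = 0.897
        print(coeffs, r)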

  10. Comparison of new generation low-complexity flood inundation mapping tools with a hydrodynamic model

    NASA Astrophysics Data System (ADS)

    Afshari, Shahab; Tavakoly, Ahmad A.; Rajib, Mohammad Adnan; Zheng, Xing; Follum, Michael L.; Omranian, Ehsan; Fekete, Balázs M.

    2018-01-01

    The objective of this study is to compare two new generation low-complexity tools, AutoRoute and Height Above the Nearest Drainage (HAND), with a two-dimensional hydrodynamic model (Hydrologic Engineering Center-River Analysis System, HEC-RAS 2D). The assessment was conducted on two hydrologically different and geographically distant test-cases in the United States, including the 16,900 km2 Cedar River (CR) watershed in Iowa and a 62 km2 domain along the Black Warrior River (BWR) in Alabama. For BWR, twelve different configurations were set up for each of the models, including four different terrain setups (e.g. with and without channel bathymetry and a levee), and three flooding conditions representing moderate to extreme hazards at 10-, 100-, and 500-year return periods. For the CR watershed, models were compared with a simplistic terrain setup (without bathymetry and any form of hydraulic controls) and one flooding condition (100-year return period). Input streamflow forcing data representing these hypothetical events were constructed by applying a new fusion approach on National Water Model outputs. Simulated inundation extent and depth from AutoRoute, HAND, and HEC-RAS 2D were compared with one another and with the corresponding FEMA reference estimates. Irrespective of the configurations, the low-complexity models were able to produce inundation extents similar to HEC-RAS 2D, with AutoRoute showing slightly higher accuracy than the HAND model. Among four terrain setups, the one including both levee and channel bathymetry showed lowest fitness score on the spatial agreement of inundation extent, due to the weak physical representation of low-complexity models compared to a hydrodynamic model. For inundation depth, the low-complexity models showed an overestimating tendency, especially in the deeper segments of the channel. Based on such reasonably good prediction skills, low-complexity flood models can be considered as a suitable alternative for fast
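    Agreement on inundation extent is commonly scored with a fitness statistic such as intersection over union of the wet cells; the exact metric used in the study may differ. A minimal sketch on binary rasters:

        # Sketch: fitness score F = |A ∩ B| / |A ∪ B| for two binary
        # inundation rasters of identical shape (1 = wet, 0 = dry).
        import numpy as np

        def fitness_score(model_wet, reference_wet):
            a = np.asarray(model_wet, dtype=bool)
            b = np.asarray(reference_wet, dtype=bool)
            intersection = np.logical_and(a, b).sum()
            union = np.logical_or(a, b).sum()
            return intersection / union if union else float("nan")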

  11. Comparing GWAS Results of Complex Traits Using Full Genetic Model and Additive Models for Revealing Genetic Architecture

    PubMed Central

    Monir, Md. Mamun; Zhu, Jun

    2017-01-01

    Most of the genome-wide association studies (GWASs) for human complex diseases have ignored dominance, epistasis and ethnic interactions. We conducted comparative GWASs for total cholesterol using a full model and additive models, which illustrates the impact of ignoring these genetic effects on analysis results and demonstrates how the genetic effects of multiple loci can differ across ethnic groups. The full model identified 15 quantitative trait loci (13 individual loci and 3 pairs of epistatic loci), whereas the multi-locus additive model identified only 14 loci (9 loci in common and 5 different loci). Moreover, 4 loci detected by the full model were not detected by the multi-locus additive model. PLINK analysis identified two loci and GCTA analysis detected only one locus with genome-wide significance. The full model identified three previously reported genes as well as several new genes. Bioinformatics analysis showed that some of the new genes are related to cholesterol-associated chemicals and/or diseases. Analyses of the cholesterol data and simulation studies revealed that the full model performed better than the additive models in terms of detection power and unbiased estimation of the genetic variants of complex traits. PMID:28079101

  12. KRISSY: user's guide to modeling three-dimensional wind flow in complex terrain

    Treesearch

    Michael A. Fosberg; Michael L. Sestak

    1986-01-01

    KRISSY is a computer model for generating three-dimensional wind flows in complex terrain from data that were not or perhaps cannot be collected. The model is written in FORTRAN IV. This guide describes data requirements, modeling, and output from an applications viewpoint rather than from that of programming or theoretical modeling. KRISSY is designed to minimize...

  13. An Associational Model for the Diffusion of Complex Innovations.

    ERIC Educational Resources Information Center

    Barnett, George A.

    A paradigm for the study of the diffusion of complex innovations through a society is presented in this paper; the paradigm is useful for studying sociocultural change as innovations diffuse. The model is designed to account for change within social systems rather than in individuals, although it would also be consistent with information…

  14. Fitting Meta-Analytic Structural Equation Models with Complex Datasets

    ERIC Educational Resources Information Center

    Wilson, Sandra Jo; Polanin, Joshua R.; Lipsey, Mark W.

    2016-01-01

    A modification of the first stage of the standard procedure for two-stage meta-analytic structural equation modeling for use with large complex datasets is presented. This modification addresses two common problems that arise in such meta-analyses: (a) primary studies that provide multiple measures of the same construct and (b) the correlation…

  15. Multiagent model and mean field theory of complex auction dynamics

    NASA Astrophysics Data System (ADS)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng

    2015-09-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, a recently emerged class of online auction games. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of the winner's attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
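    A toy version of a LUBA round conveys the setting: each agent submits a bid, and the lowest bid chosen by exactly one agent wins. The sketch below draws bids from a simple low-price-skewed distribution; the paper's model instead couples the bid field to the winner's attractiveness:

        # Toy LUBA round: N agents bid integer amounts; lowest *unique* bid wins.
        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(0)

        def luba_round(n_agents=200, max_bid=100):
            # Skew bids toward low prices (geometric); an illustrative choice.
            bids = rng.geometric(p=0.05, size=n_agents).clip(max=max_bid)
            counts = Counter(bids)
            unique_bids = sorted(b for b, c in counts.items() if c == 1)
            return unique_bids[0] if unique_bids else None  # winning bid, if any

        winning_bids = [luba_round() for _ in range(1000)]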

  16. Seeing & Feeling How Enzymes Work Using Tangible Models

    ERIC Educational Resources Information Center

    Lau, Kwok-chi

    2013-01-01

    This article presents a tangible model used to help students tackle some misconceptions about enzyme actions, particularly the induced-fit model, enzyme-substrate complementarity, and enzyme inhibition. The model can simulate how substrates induce a change in the shape of the active site and the role of attraction force during enzyme-substrate…

  17. Student Cognitive Difficulties and Mental Model Development of Complex Earth and Environmental Systems

    NASA Astrophysics Data System (ADS)

    Sell, K.; Herbert, B.; Schielack, J.

    2004-05-01

    Students organize scientific knowledge and reason about environmental issues through manipulation of mental models. The nature of the environmental sciences, which are focused on the study of complex, dynamic systems, may present cognitive difficulties to students in their development of authentic, accurate mental models of environmental systems. The inquiry project seeks to develop and assess the coupling of information technology (IT)-based learning with physical models in order to foster rich mental model development of environmental systems in geoscience undergraduate students. The manipulation of multiple representations, the development and testing of conceptual models based on available evidence, and exposure to authentic, complex and ill-constrained problems were the components of investigation utilized to reach the learning goals. Upper-level undergraduate students enrolled in an environmental geology course at Texas A&M University participated in this research, which served as a pilot study. Data based on rubric evaluations, interpreted by principal component analyses, suggest that students' understanding of the nature of scientific inquiry is limited and that the ability to cross scales and link systems is problematic. Results were categorized into content knowledge and cognition processes, with reasoning, critical thinking and cognitive load being driving factors behind difficulties in student learning. Student mental model development revealed multiple misconceptions and lacked the complexity and completeness needed to represent the studied systems. Further, the positive learning impacts of the implemented modules favored the physical model over the IT-based learning projects, likely due to cognitive load issues. This study illustrates the need to better understand student difficulties in solving complex problems when using IT, where appropriate scaffolding can then be implemented to enhance student learning of the earth system sciences.

  18. Applicability study of classical and contemporary models for effective complex permittivity of metal powders.

    PubMed

    Kiley, Erin M; Yakovlev, Vadim V; Ishizaki, Kotaro; Vaucher, Sebastien

    2012-01-01

    Microwave thermal processing of metal powders has recently been a topic of substantial interest; however, experimental data on the physical properties of mixtures involving metal particles are often unavailable. In this paper, we perform a systematic analysis of classical and contemporary models of complex permittivity of mixtures and discuss the use of these models for determining the effective permittivity of dielectric matrices with metal inclusions. Results from various mixture and core-shell mixture models are compared to experimental data for a titanium/stearic acid mixture and a boron nitride/graphite mixture (both obtained through the original measurements), and for a tungsten/Teflon mixture (from literature). We find that for certain experiments, the average error in determining the effective complex permittivity using Lichtenecker's, Maxwell Garnett's, Bruggeman's, Buchelnikov's, and Ignatenko's models is about 10%. This suggests that, for multiphysics computer models describing the processing of metal powder in the full temperature range, input data on effective complex permittivity obtained from direct measurement has, up to now, no substitute.
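    Two of the classical mixture rules named above are compact enough to state directly. A minimal sketch, with f the inclusion volume fraction and eps_m, eps_i the complex permittivities of matrix and inclusions (the numerical values in the example are illustrative):

        # Sketch of two classical effective-permittivity mixture rules.
        import cmath

        def maxwell_garnett(eps_m, eps_i, f):
            # eps_eff = eps_m * [eps_i + 2 eps_m + 2f (eps_i - eps_m)]
            #                 / [eps_i + 2 eps_m -  f (eps_i - eps_m)]
            num = eps_i + 2 * eps_m + 2 * f * (eps_i - eps_m)
            den = eps_i + 2 * eps_m - f * (eps_i - eps_m)
            return eps_m * num / den

        def lichtenecker(eps_m, eps_i, f):
            # Logarithmic mixing: ln(eps_eff) = f ln(eps_i) + (1-f) ln(eps_m)
            return cmath.exp(f * cmath.log(eps_i) + (1 - f) * cmath.log(eps_m))

        print(maxwell_garnett(2.1 + 0.02j, 50 + 20j, 0.3))  # illustrative values
        print(lichtenecker(2.1 + 0.02j, 50 + 20j, 0.3))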

  19. [Analysis of a three-dimensional finite element model of atlas and axis complex fracture].

    PubMed

    Tang, X M; Liu, C; Huang, K; Zhu, G T; Sun, H L; Dai, J; Tian, J W

    2018-05-22

    Objective: To explore the clinical application of a three-dimensional finite element model of atlantoaxial complex fracture. Methods: A three-dimensional finite element model of the cervical spine (FEM/intact) was established with the software Abaqus 6.12. On the basis of this model, three-dimensional finite element models of four types of atlantoaxial complex fracture were established: C1 fracture (Jefferson) + type II odontoid fracture, Jefferson + type III odontoid fracture, Jefferson + Hangman fracture, and Jefferson + stable C2 fracture (FEM/fracture). The range of motion under flexion, extension, lateral bending and axial rotation was measured and compared with the intact cervical spine model. Results: The three-dimensional finite element models of the four types of atlantoaxial complex fracture were similar in geometry and profile. The range of motion (ROM) of the different segments changed to different degrees. Compared with the normal model, the ROM of C0/1 and C1/2 in the C1 combined type II odontoid fracture model in flexion/extension, lateral bending and rotation increased by 57.45%, 29.34%, 48.09% and 95.49%, 88.52%, 36.71%, respectively. The ROM of C0/1 and C1/2 in the C1 combined type III odontoid fracture model in flexion/extension, lateral bending and rotation increased by 47.01%, 27.30%, 45.31% and 90.38%, 27.30%, 30.0%, respectively. The ROM of C0/1 and C1/2 in the C1 combined Hangman fracture model in flexion/extension, lateral bending and rotation increased by 32.68%, 79.34%, 77.62% and 60.53%, 81.20%, 21.48%, respectively. The ROM of C0/1 and C1/2 in the C1 combined axis fracture model in flexion/extension, lateral bending and rotation increased by 15.00%, 29.30%, 8.47% and 37.87%, 75.57%, 8.30%, respectively. Conclusions: The three-dimensional finite element model can be used to simulate the biomechanics of atlantoaxial complex fracture. The ROM of the atlantoaxial complex fracture models is larger than that of the normal model, which indicates that surgical treatment should be performed.

  20. Petri net model for analysis of concurrently processed complex algorithms

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1986-01-01

    This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.
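    For readers unfamiliar with the formalism, the token-game semantics that make Petri nets suitable for describing data/control flow fit in a few lines. A minimal sketch (place and transition names are illustrative, not the paper's model):

        # Minimal Petri net: places hold tokens; a transition is enabled when
        # every input place has a token, and firing consumes one token from
        # each input place and produces one in each output place.
        marking = {"p_data_ready": 1, "p_proc_free": 1, "p_result": 0}

        transitions = {
            "t_dispatch": {"inputs": ["p_data_ready", "p_proc_free"],
                           "outputs": ["p_result"]},
        }

        def enabled(t):
            return all(marking[p] > 0 for p in transitions[t]["inputs"])

        def fire(t):
            assert enabled(t), f"{t} is not enabled"
            for p in transitions[t]["inputs"]:
                marking[p] -= 1
            for p in transitions[t]["outputs"]:
                marking[p] += 1

        fire("t_dispatch")  # marking: data and processor consumed, result produced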

  1. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
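    A relative complexity metric of this kind combines several standardized per-module measures into a single score used only for ranking. A minimal sketch, assuming equal weights (the study's actual models and weighting are not reproduced here):

        # Sketch: rank modules by a composite of z-scored complexity measures.
        import numpy as np

        modules = ["io.c", "parse.c", "sched.c"]
        # Rows: modules; columns: e.g. cyclomatic complexity, LOC, fan-out.
        metrics = np.array([[12, 340, 7],
                            [25, 910, 15],
                            [8, 150, 3]], dtype=float)

        z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
        relative_complexity = z.mean(axis=1)  # equal weights (an assumption)

        ranking = sorted(zip(modules, relative_complexity), key=lambda kv: -kv[1])
        print(ranking)  # most maintenance-prone modules first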

  2. Modeling relations in nature and eco-informatics: a practical application of rosennean complexity.

    PubMed

    Kineman, John J

    2007-10-01

    The purpose of eco-informatics is to communicate critical information about organisms and ecosystems. To accomplish this, it must reflect the complexity of natural systems. Present information systems are designed around mechanistic concepts that do not capture complexity. Robert Rosen's relational theory offers a way of representing complexity in terms of information entailments that are part of an ontologically implicit 'modeling relation'. This relation has corresponding epistemological components that can be captured empirically, the components being structure (associated with model encoding) and function (associated with model decoding). Relational complexity, thus, provides a long-awaited theoretical underpinning for these concepts that ecology has found indispensable. Structural information pertains to the material organization of a system, which can be represented by data. Functional information specifies potential change, which can be inferred from experiment and represented as models or descriptions of state transformations. Contextual dependency (of structure or function) implies meaning. Biological functions imply internalized or system-dependent laws. Complexity can be represented epistemologically by relating structure and function in two different ways. One expresses the phenomenal relation that exists in any present or past instance, and the other draws the ontology of a system into the empirical world in terms of multiple potentials subject to natural forms of selection and optimality. These act as system attractors. Implementing these components and their theoretical relations in an informatics system will provide more-complete ecological informatics than is possible from a strictly mechanistic point of view. This approach will enable many new possibilities for supporting science and decision making.

  3. A mouse model of mitochondrial complex III dysfunction induced by myxothiazol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davoudi, Mina; Kallijärvi, Jukka; Marjavaara, Sanna

    2014-04-18

    Highlights: • Reversible chemical inhibition of complex III in wild type mouse. • Myxothiazol causes decreased complex III activity in mouse liver. • The model is useful for therapeutic trials to improve mitochondrial function. - Abstract: Myxothiazol is a respiratory chain complex III (CIII) inhibitor that binds to the ubiquinol oxidation site Qo of CIII. It blocks electron transfer from ubiquinol to cytochrome b and thus inhibits CIII activity. It has been utilized as a tool in studies of respiratory chain function in in vitro and cell culture models. We developed a mouse model of biochemically induced and reversible CIII inhibition using myxothiazol. We administered myxothiazol intraperitoneally at a dose of 0.56 mg/kg to C57Bl/J6 mice every 24 h and assessed CIII activity, histology, lipid content, supercomplex formation, and gene expression in the livers of the mice. A reversible CIII activity decrease to 50% of control value occurred at 2 h post-injection. At 74 h only minor histological changes in the liver were found, supercomplex formation was preserved and no significant changes in the expression of genes indicating hepatotoxicity or inflammation were found. Thus, myxothiazol-induced CIII inhibition can be induced in mice for four days in a row without overt hepatotoxicity or lethality. This model could be utilized in further studies of respiratory chain function and pharmacological approaches to mitochondrial hepatopathies.

  4. Atomic Resolution Modeling of the Ferredoxin:[FeFe] Hydrogenase Complex from Chlamydomonas reinhardtii

    PubMed Central

    Chang, Christopher H.; King, Paul W.; Ghirardi, Maria L.; Kim, Kwiseon

    2007-01-01

    The [FeFe] hydrogenases HydA1 and HydA2 in the green alga Chlamydomonas reinhardtii catalyze the final reaction in a remarkable metabolic pathway allowing this photosynthetic organism to produce H2 from water in the chloroplast. A [2Fe-2S] ferredoxin is a critical branch point in electron flow from Photosystem I toward a variety of metabolic fates, including proton reduction by hydrogenases. To better understand the binding determinants involved in ferredoxin:hydrogenase interactions, we have modeled Chlamydomonas PetF1 and HydA2 based on amino-acid sequence homology, and produced two promising electron-transfer model complexes by computational docking. To characterize these models, quantitative free energy calculations at atomic resolution were carried out, and detailed analysis of the interprotein interactions undertaken. The protein complex model we propose for ferredoxin:HydA2 interaction is energetically favored over the alternative candidate by 20 kcal/mol. This proposed model of the electron-transfer complex between PetF1 and HydA2 permits a more detailed view of the molecular events leading up to H2 evolution, and suggests potential mutagenic strategies to modulate electron flow to HydA2. PMID:17660315

  5. Atomic resolution modeling of the ferredoxin:[FeFe] hydrogenase complex from Chlamydomonas reinhardtii.

    PubMed

    Chang, Christopher H; King, Paul W; Ghirardi, Maria L; Kim, Kwiseon

    2007-11-01

    The [FeFe] hydrogenases HydA1 and HydA2 in the green alga Chlamydomonas reinhardtii catalyze the final reaction in a remarkable metabolic pathway allowing this photosynthetic organism to produce H2 from water in the chloroplast. A [2Fe-2S] ferredoxin is a critical branch point in electron flow from Photosystem I toward a variety of metabolic fates, including proton reduction by hydrogenases. To better understand the binding determinants involved in ferredoxin:hydrogenase interactions, we have modeled Chlamydomonas PetF1 and HydA2 based on amino-acid sequence homology, and produced two promising electron-transfer model complexes by computational docking. To characterize these models, quantitative free energy calculations at atomic resolution were carried out, and detailed analysis of the interprotein interactions undertaken. The protein complex model we propose for ferredoxin:HydA2 interaction is energetically favored over the alternative candidate by 20 kcal/mol. This proposed model of the electron-transfer complex between PetF1 and HydA2 permits a more detailed view of the molecular events leading up to H2 evolution, and suggests potential mutagenic strategies to modulate electron flow to HydA2.

  6. Rethinking the Psychogenic Model of Complex Regional Pain Syndrome: Somatoform Disorders and Complex Regional Pain Syndrome

    PubMed Central

    Hill, Renee J.; Chopra, Pradeep; Richardi, Toni

    2012-01-01

    Explaining the etiology of Complex Regional Pain Syndrome (CRPS) from the psychogenic model is exceedingly unsophisticated, because neurocognitive deficits, neuroanatomical abnormalities, and distortions in cognitive mapping are features of CRPS pathology. More importantly, many people who have developed CRPS have no history of mental illness. The psychogenic model offers comfort to physicians and mental health practitioners (MHPs) who have difficulty understanding pain maintained by newly uncovered neuroinflammatory processes. With increased education about CRPS through a biopsychosocial perspective, both physicians and MHPs can better diagnose, treat, and manage CRPS symptomatology. PMID:24223338

  7. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, together with the impact of the operator's effectiveness on the system as a whole. The necessity of creating complex simulation models that reflect the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic representation of the human as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of an unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the MATLAB/Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed, in simplified form, as an automatic control system consisting of three main interconnected subsystems: sensory organs (perception sensors); the central nervous system; and executive organs (muscles of the arms, legs and back). A theoretical data model for predicting the level of the operator's information load in ergatic systems is proposed, which allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of the operator's activity during takeoff, based on Petri nets, has been synthesized.

  8. A Simple Model for Complex Dynamical Transitions in Epidemics

    NASA Astrophysics Data System (ADS)

    Earn, David J. D.; Rohani, Pejman; Bolker, Benjamin M.; Grenfell, Bryan T.

    2000-01-01

    Dramatic changes in patterns of epidemics have been observed throughout this century. For childhood infectious diseases such as measles, the major transitions are between regular cycles and irregular, possibly chaotic epidemics, and from regionally synchronized oscillations to complex, spatially incoherent epidemics. A simple model can explain both kinds of transitions as the consequences of changes in birth and vaccination rates. Measles is a natural ecological system that exhibits different dynamical transitions at different times and places, yet all of these transitions can be predicted as bifurcations of a single nonlinear model.
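    The mechanism can be illustrated with a seasonally forced SIR model in which birth and vaccination rates scale the recruitment of susceptibles; changing them moves the system across bifurcations between regular and irregular cycles. A minimal sketch with illustrative, measles-like parameters (not the paper's):

        # Seasonally forced SIR with births and vaccination; lowering the
        # effective birth rate (or raising vaccine uptake v) shifts dynamics.
        import numpy as np
        from scipy.integrate import solve_ivp

        mu, gamma = 0.02, 365.0 / 13   # birth/death and recovery rates (per year)
        beta0, beta1, v = 500.0, 0.08, 0.0  # mean contact rate, seasonality, uptake

        def sir(t, y):
            s, i = y
            beta = beta0 * (1 + beta1 * np.cos(2 * np.pi * t))
            ds = mu * (1 - v) - beta * s * i - mu * s
            di = beta * s * i - (gamma + mu) * i
            return [ds, di]

        sol = solve_ivp(sir, (0.0, 100.0), [0.06, 1e-4],
                        method="LSODA", rtol=1e-8, atol=1e-10)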

  9. Discovering Link Communities in Complex Networks by an Integer Programming Model and a Genetic Algorithm

    PubMed Central

    Li, Zhenping; Zhang, Xiang-Sun; Wang, Rui-Sheng; Liu, Hongwei; Zhang, Shihua

    2013-01-01

    Identification of communities in complex networks is an important topic and issue in many fields such as sociology, biology, and computer science. Communities are often defined as groups of related nodes or links that correspond to functional subunits in the corresponding complex systems. While most conventional approaches have focused on discovering communities of nodes, some recent studies start partitioning links to find overlapping communities straightforwardly. In this paper, we propose a new quantity function for link community identification in complex networks. Based on this quantity function we formulate the link community partition problem into an integer programming model which allows us to partition a complex network into overlapping communities. We further propose a genetic algorithm for link community detection which can partition a network into overlapping communities without knowing the number of communities. We test our model and algorithm on both artificial networks and real-world networks. The results demonstrate that the model and algorithm are efficient in detecting overlapping community structure in complex networks. PMID:24386268

  10. Physical modelling of the nuclear pore complex

    PubMed Central

    Fassati, Ariberto; Ford, Ian J.; Hoogenboom, Bart W.

    2013-01-01

    Physically interesting behaviour can arise when soft matter is confined to nanoscale dimensions. A highly relevant biological example of such a phenomenon is the Nuclear Pore Complex (NPC) found perforating the nuclear envelope of eukaryotic cells. In the central conduit of the NPC, of ∼30–60 nm diameter, a disordered network of proteins regulates all macromolecular transport between the nucleus and the cytoplasm. In spite of a wealth of experimental data, the selectivity barrier of the NPC has yet to be explained fully. Experimental and theoretical approaches are complicated by the disordered and heterogeneous nature of the NPC conduit. Modelling approaches have focused on the behaviour of the partially unfolded protein domains in the confined geometry of the NPC conduit, and have demonstrated that within the range of parameters thought relevant for the NPC, widely varying behaviour can be observed. In this review, we summarise recent efforts to physically model the NPC barrier and function. We illustrate how attempts to understand NPC barrier function have employed many different modelling techniques, each of which have contributed to our understanding of the NPC.

  11. Tackling obesity: new therapeutic agents for assisted weight loss

    PubMed Central

    Karam, JG; McFarlane, SI

    2010-01-01

    The pandemic of overweight and obesity continues to rise at an alarming rate in western countries and around the globe, representing a major public health challenge in desperate need of new strategies for tackling obesity. In the United States, nearly two thirds of the population is overweight or obese. Worldwide, the number of persons who are overweight or obese exceeds 1.6 billion. These rising figures have been clearly associated with increased morbidity and mortality. For example, in the Framingham study, the risk of death increases with each additional pound of weight gain, even in the relatively younger population between 30 and 42 years of age. Overweight and obesity are also associated with increased co-morbid conditions such as diabetes, hypertension and cardiovascular disease, as well as certain types of cancer. In this review we discuss the epidemic of obesity, highlighting the pathophysiologic mechanisms of weight gain. We also provide an overview of the assessment of overweight and obese individuals, discussing possible secondary causes of obesity. In a detailed section we discuss the currently approved therapeutic interventions for obesity, highlighting their mechanisms of action and the evidence of their efficacy and safety as provided in clinical trials. Finally, we discuss novel therapeutic interventions that are in various stages of development, with a special section on the weight loss effects of anti-diabetic medications. These agents are particularly attractive options for our growing population of obese diabetic individuals. PMID:21437080

  12. McSnow: A Monte-Carlo Particle Model for Riming and Aggregation of Ice Particles in a Multidimensional Microphysical Phase Space

    NASA Astrophysics Data System (ADS)

    Brdar, S.; Seifert, A.

    2018-01-01

    We present a novel Monte-Carlo ice microphysics model, McSnow, to simulate the evolution of ice particles due to deposition, aggregation, riming, and sedimentation. The model is an application and extension of the super-droplet method of Shima et al. (2009) to the more complex problem of rimed ice particles and aggregates. For each individual super-particle, the ice mass, rime mass, rime volume, and the number of monomers are predicted establishing a four-dimensional particle-size distribution. The sensitivity of the model to various assumptions is discussed based on box model and one-dimensional simulations. We show that the Monte-Carlo method provides a feasible approach to tackle this high-dimensional problem. The largest uncertainty seems to be related to the treatment of the riming processes. This calls for additional field and laboratory measurements of partially rimed snowflakes.
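    In the super-droplet spirit, each super-particle carries a multiplicity plus the four predicted properties. A minimal data-structure sketch (attribute names and the rime density value are assumptions, not McSnow's code):

        # Sketch of a super-particle record in the spirit of the super-droplet
        # method: one record stands for `multiplicity` real ice particles.
        from dataclasses import dataclass

        @dataclass
        class SuperParticle:
            multiplicity: float   # number of real particles represented
            ice_mass: float       # kg, unrimed ice mass
            rime_mass: float      # kg, accreted rime mass
            rime_volume: float    # m^3, tracks rime density
            n_monomers: int       # crystals per aggregate
            height: float         # m, for sedimentation

        def riming_step(p, dm, rime_density=400.0):
            """Accrete supercooled liquid of mass dm (kg) onto the particle."""
            p.rime_mass += dm
            p.rime_volume += dm / rime_density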

  13. Teacher Stress: Complex Model Building with LISREL. Pedagogical Reports, No. 16.

    ERIC Educational Resources Information Center

    Tellenback, Sten

    This paper presents a complex causal model of teacher stress based on data received from the responses of 1,466 teachers from Malmo, Sweden to a questionnaire. Also presented is a method for treating the model variables as higher-order factors or higher-order theoretical constructs. The paper's introduction presents a brief review of teacher…

  14. A complex speciation–richness relationship in a simple neutral model

    PubMed Central

    Desjardins-Proulx, Philippe; Gravel, Dominique

    2012-01-01

    Speciation is the “elephant in the room” of community ecology. As the ultimate source of biodiversity, its integration in ecology's theoretical corpus is necessary to understand community assembly. Yet, speciation is often completely ignored or stripped of its spatial dimension. Recent approaches based on network theory have allowed ecologists to effectively model complex landscapes. In this study, we use this framework to model allopatric and parapatric speciation in networks of communities. We focus on the relationship between speciation, richness, and the spatial structure of communities. We find a strong opposition between speciation and local richness, with speciation being more common in isolated communities and local richness being higher in more connected communities. Unlike previous models, we also find a transition to a positive relationship between speciation and local richness when dispersal is low and the number of communities is small. We use several measures of centrality to characterize the effect of network structure on diversity. The degree, the simplest measure of centrality, is the best predictor of local richness and speciation, although it loses some of its predictive power as connectivity grows. Our framework shows how a simple neutral model can be combined with network theory to reveal complex relationships between speciation, richness, and the spatial organization of populations. PMID:22957181

  15. Spinning Q-balls in the complex signum-Gordon model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arodz, H.; Karkowski, J.; Swierczynski, Z.

    2009-09-15

    Rotational excitations of compact Q-balls in the complex signum-Gordon model in 2+1 dimensions are investigated. We find that almost all such spinning Q-balls have the form of a ring of strictly finite width. In the limit of large angular momentum M_z, their energy is proportional to |M_z|^{1/5}.

  16. Waste management under multiple complexities: inexact piecewise-linearization-based fuzzy flexible programming.

    PubMed

    Sun, Wei; Huang, Guo H; Lv, Ying; Li, Gongchen

    2012-06-01

    To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from objective function to constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand-sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, the IPFP2 may underestimate the net system costs while the IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities. Copyright © 2012 Elsevier Ltd. All rights reserved.
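    The piecewise-linearization step — replacing a concave economies-of-scale cost curve with linear segments so that it can enter a mixed-integer program — can be sketched independently of the full IPFP model (breakpoints and the power-law exponent below are illustrative):

        # Sketch: approximate a concave EOS cost c(x) = a * x**0.8 by linear
        # segments between breakpoints, as done before embedding in a MILP.
        import numpy as np

        a, exponent = 120.0, 0.8
        breakpoints = np.array([0.0, 50.0, 150.0, 400.0])  # waste flow (t/day)
        costs = a * breakpoints ** exponent

        def piecewise_cost(x):
            k = np.searchsorted(breakpoints, x) - 1
            k = int(np.clip(k, 0, len(breakpoints) - 2))
            slope = (costs[k + 1] - costs[k]) / (breakpoints[k + 1] - breakpoints[k])
            return costs[k] + slope * (x - breakpoints[k])

        print(piecewise_cost(100.0), a * 100.0 ** exponent)  # approx vs exact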

  17. Waste collection in developing countries - Tackling occupational safety and health hazards at their source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bleck, Daniela, E-mail: bleck.daniela@baua.bund.de; Wettberg, Wieland, E-mail: wettberg.wieland@baua.bund.de

    2012-11-15

    Waste management procedures in developing countries are associated with occupational safety and health risks. Gastro-intestinal infections, respiratory and skin diseases as well as muscular-skeletal problems and cutting injuries are commonly found among waste workers around the globe. In order to find efficient, sustainable solutions to reduce occupational risks of waste workers, a methodological risk assessment has to be performed and counteractive measures have to be developed according to an internationally acknowledged hierarchy. From a case study in Addis Ababa, Ethiopia, suggestions for the transferral of collected household waste into roadside containers are given. With construction of ramps to dump collected household waste straight into roadside containers and an adaptation of pushcarts and collection procedures, the risk is tackled at the source.

  18. Generalized model of electromigration with 1:1 (analyte:selector) complexation stoichiometry: part I. Theory.

    PubMed

    Dubský, Pavel; Müllerová, Ludmila; Dvořák, Martin; Gaš, Bohuslav

    2015-03-06

    The model of electromigration of a multivalent weak acidic/basic/amphoteric analyte that undergoes complexation with a mixture of selectors is introduced. The model provides an extension of the series of models starting with the single-selector model without dissociation by Wren and Rowe in 1992, continuing with the monovalent weak analyte/single-selector model by Rawjee, Williams and Vigh in 1993 and that by Lelièvre in 1994, and ending with the multi-selector overall model without dissociation developed by our group in 2008. The new multivalent analyte multi-selector model shows that the effective mobility of the analyte obeys the original Wren and Rowe's formula. The overall complexation constant, mobility of the free analyte and mobility of complex can be measured and used in a standard way. The mathematical expressions for the overall parameters are provided. We further demonstrate mathematically that the pH dependent parameters for weak analytes can be simply used as an input into the multi-selector overall model and, in reverse, the multi-selector overall parameters can serve as an input into the pH-dependent models for the weak analytes. These findings can greatly simplify rational method development in analytical electrophoresis, specifically enantioseparations. Copyright © 2015 Elsevier B.V. All rights reserved.
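    The Wren and Rowe expression referred to above weights the free and complexed mobilities by the degree of complexation. A minimal sketch of the single-selector form, which the paper shows also holds with the overall multi-selector parameters (the example values are illustrative):

        # Sketch of the single-selector Wren-Rowe effective-mobility formula.
        def effective_mobility(mu_free, mu_complex, K, c_selector):
            """mu_eff = (mu_free + mu_complex * K * c) / (1 + K * c)."""
            return (mu_free + mu_complex * K * c_selector) / (1 + K * c_selector)

        # Example: a neutral selector slows a charged analyte as its
        # concentration rises (illustrative numbers, SI-like units).
        print(effective_mobility(mu_free=20e-9, mu_complex=5e-9,
                                 K=300.0, c_selector=0.01))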

  19. A Corticothalamic Circuit Model for Sound Identification in Complex Scenes

    PubMed Central

    Otazu, Gonzalo H.; Leibold, Christian

    2011-01-01

    The identification of the sound sources present in the environment is essential for the survival of many animals. However, these sounds are not presented in isolation, as natural scenes consist of a superposition of sounds originating from multiple sources. The identification of a source under these circumstances is a complex computational problem that is readily solved by most animals. We present a model of the thalamocortical circuit that performs level-invariant recognition of auditory objects in complex auditory scenes. The circuit identifies the objects present from a large dictionary of possible elements and operates reliably for real sound signals with multiple concurrently active sources. The key model assumption is that the activities of some cortical neurons encode the difference between the observed signal and an internal estimate. Reanalysis of awake auditory cortex recordings revealed neurons with patterns of activity corresponding to such an error signal. PMID:21931668

  20. An egalitarian network model for the emergence of simple and complex cells in visual cortex

    PubMed Central

    Tao, Louis; Shelley, Michael; McLaughlin, David; Shapley, Robert

    2004-01-01

    We explain how simple and complex cells arise in a large-scale neuronal network model of the primary visual cortex of the macaque. Our model consists of ≈4,000 integrate-and-fire, conductance-based point neurons, representing the cells in a small, 1-mm2 patch of an input layer of the primary visual cortex. In the model the local connections are isotropic and nonspecific, and convergent input from the lateral geniculate nucleus confers cortical cells with orientation and spatial phase preference. The balance between lateral connections and lateral geniculate nucleus drive determines whether individual neurons in this recurrent circuit are simple or complex. The model reproduces qualitatively the experimentally observed distributions of both extracellular and intracellular measures of simple and complex response. PMID:14695891

  1. History matching of a complex epidemiological model of human immunodeficiency virus transmission by using variance emulation.

    PubMed

    Andrianakis, I; Vernon, I; McCreesh, N; McKinley, T J; Oakley, J E; Nsubuga, R N; Goldstein, M; White, R G

    2017-08-01

    Complex stochastic models are commonplace in epidemiology, but their utility depends on their calibration to empirical data. History matching is a (pre)calibration method that has been applied successfully to complex deterministic models. In this work, we adapt history matching to stochastic models, by emulating the variance in the model outputs, and therefore accounting for its dependence on the model's input values. The method proposed is applied to a real complex epidemiological model of human immunodeficiency virus in Uganda with 22 inputs and 18 outputs, and is found to increase the efficiency of history matching, requiring 70% of the time and 43% fewer simulator evaluations compared with a previous variant of the method. The insight gained into the structure of the human immunodeficiency virus model, and the constraints placed on it, are then discussed.
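    History matching rules out input regions using an implausibility measure built from the emulator mean and variance; with a stochastic simulator, the emulated output variance enters the denominator as well. A minimal sketch for a single output (variable names are illustrative):

        # Sketch: implausibility I(x) for one output, in the spirit of
        # history matching with an emulated mean and emulated variance.
        import numpy as np

        def implausibility(z_obs, em_mean, em_var, obs_var, discrepancy_var):
            """I(x) = |z - E[f(x)]| / sqrt(Var_em + Var_obs + Var_discrepancy)."""
            return np.abs(z_obs - em_mean) / np.sqrt(em_var + obs_var + discrepancy_var)

        # Inputs with I(x) above a cutoff (commonly 3) are ruled out.
        keep = implausibility(0.12, em_mean=0.10, em_var=1e-4,
                              obs_var=4e-4, discrepancy_var=1e-4) < 3.0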

  2. Geomatic methods at the service of water resources modelling

    NASA Astrophysics Data System (ADS)

    Molina, José-Luis; Rodríguez-Gonzálvez, Pablo; Molina, Mª Carmen; González-Aguilera, Diego; Espejo, Fernando

    2014-02-01

    Acquisition, management and/or use of spatial information are crucial for the quality of water resources studies. In this sense, several geomatic methods arise at the service of water modelling, aiming at the generation of cartographic products, especially 3D models and orthophotos. They may also serve as tools for problem solving and decision making. However, choosing the right geomatic method is still a challenge in this field, mostly due to the complexity of the different applications and variables involved in water resources management. This study aims to provide a guide to best practices in this context through a deep review of geomatic methods and an assessment of their suitability for the following study types: Surface Hydrology, Groundwater Hydrology, Hydraulics, Agronomy, Morphodynamics and Geotechnical Processes. The assessment is driven by several decision variables grouped in two categories, classified by nature as geometric or radiometric. As a result, the reader comes away with the best choice or choices of method, depending on the type of water resources modelling study at hand.

  3. Variable speed limit strategies analysis with mesoscopic traffic flow model based on complex networks

    NASA Astrophysics Data System (ADS)

    Li, Shu-Bin; Cao, Dan-Ni; Dang, Wen-Xiu; Zhang, Lin

    As a new cross-discipline, complexity science has penetrated every field of economy and society. With the arrival of big data, research in complexity science has reached a new summit. In recent years, it has offered a new perspective on traffic control through the theory of complex networks. The interactions of various kinds of information in a traffic system form a huge complex system. A mesoscopic traffic flow model is improved with variable speed limits (VSL), and the simulation process is designed based on complex networks theory combined with the proposed model. This paper studies the effect of VSL on dynamic traffic flow and then analyzes the optimal control strategy of VSL in different network topologies. The conclusions of this research are useful for putting forward reasonable transportation plans and for developing effective traffic management and control measures to support traffic management departments.

  4. Virtual enterprise model for the electronic components business in the Nuclear Weapons Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferguson, T.J.; Long, K.S.; Sayre, J.A.

    1994-08-01

    The electronic components business within the Nuclear Weapons Complex spans organizational and Department of Energy contractor boundaries. An assessment of the current processes indicates a need for fundamentally changing the way electronic components are developed, procured, and manufactured. A model is provided based on a virtual enterprise that recognizes distinctive competencies within the Nuclear Weapons Complex and at the vendors. The model incorporates changes that reduce component delivery cycle time and improve cost effectiveness while delivering components of the appropriate quality.

  5. Some aspects of mathematical and chemical modeling of complex chemical processes

    NASA Technical Reports Server (NTRS)

    Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.

    1983-01-01

    Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical processes are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.

  6. Systematic coarse-grained modeling of complexation between small interfering RNA and polycations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Zonghui; Luijten, Erik, E-mail: luijten@northwestern.edu; Department of Materials Science and Engineering, Northwestern University, Evanston, Illinois 60208

    All-atom molecular dynamics simulations can provide insight into the properties of polymeric gene-delivery carriers by elucidating their interactions and detailed binding patterns with nucleic acids. However, to explore nanoparticle formation through complexation of these polymers and nucleic acids and study their behavior at experimentally relevant time and length scales, a reliable coarse-grained model is needed. Here, we systematically develop such a model for the complexation of small interfering RNA (siRNA) and grafted polyethyleneimine copolymers, a promising candidate for siRNA delivery. We compare the predictions of this model with all-atom simulations and demonstrate that it is capable of reproducing detailed binding patterns, charge characteristics, and water release kinetics. Since the coarse-grained model accelerates the simulations by one to two orders of magnitude, it will make it possible to quantitatively investigate nanoparticle formation involving multiple siRNA molecules and cationic copolymers.

  7. Balancing the stochastic description of uncertainties as a function of hydrologic model complexity

    NASA Astrophysics Data System (ADS)

    Del Giudice, D.; Reichert, P.; Albert, C.; Kalcic, M.; Logsdon Muenich, R.; Scavia, D.; Bosch, N. S.; Michalak, A. M.

    2016-12-01

    Uncertainty analysis is becoming an important component of forecasting water and pollutant fluxes in urban and rural environments. Properly accounting for errors in the modeling process can help to robustly assess the uncertainties associated with the inputs (e.g. precipitation) and outputs (e.g. runoff) of hydrological models. In recent years we have investigated several Bayesian methods to infer the parameters of a mechanistic hydrological model along with those of the stochastic error component. The latter describes the uncertainties of model outputs and possibly inputs. We have adapted our framework to a variety of applications, ranging from predicting floods in small stormwater systems to nutrient loads in large agricultural watersheds. Given practical constraints, we discuss how in general the number of quantities to infer probabilistically varies inversely with the complexity of the mechanistic model. Most often, when evaluating a hydrological model of intermediate complexity, we can infer the parameters of the model as well as of the output error model. Describing the output errors as a first order autoregressive process can realistically capture the "downstream" effect of inaccurate inputs and structure. With simpler runoff models we can additionally quantify input uncertainty by using a stochastic rainfall process. For complex hydrologic transport models, instead, we show that keeping model parameters fixed and just estimating time-dependent output uncertainties could be a viable option. The common goal across all these applications is to create time-dependent prediction intervals which are both reliable (cover the nominal amount of validation data) and precise (are as narrow as possible). In conclusion, we recommend focusing both on the choice of the hydrological model and of the probabilistic error description. The latter can include output uncertainty only, if the model is computationally-expensive, or, with simpler models, it can separately account
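    The first-order autoregressive output-error description mentioned above can be written as eta_t = phi * eta_{t-1} + eps_t. A minimal sketch of how it yields time-dependent prediction intervals around a deterministic simulation (parameter values are assumptions, not the study's estimates):

        # Sketch: AR(1) output-error realisations around a deterministic
        # hydrograph, yielding time-dependent prediction intervals.
        import numpy as np

        rng = np.random.default_rng(1)
        q_model = np.sin(np.linspace(0, 6 * np.pi, 500)) ** 2 + 0.5  # stand-in hydrograph

        phi, sigma = 0.95, 0.05   # AR(1) persistence and innovation std (assumed)
        n_real = 1000
        errors = np.zeros((n_real, q_model.size))
        for t in range(1, q_model.size):
            errors[:, t] = phi * errors[:, t - 1] + rng.normal(0, sigma, n_real)

        ensemble = q_model + errors
        lower, upper = np.percentile(ensemble, [2.5, 97.5], axis=0)  # 95% band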

  8. [Repercussions of the Maria da Penha law in tackling gender violence].

    PubMed

    Meneghel, Stela Nazareth; Mueller, Betânia; Collaziol, Marceli Emer; de Quadros, Maíra Meneghel

    2013-03-01

    This paper presents the declarations about the Maria da Penha law made by a sample of women victims and care workers who handle situations of gender violence in the city of Porto Alegre. The data are part of a study that investigated the critical path followed by women who decide to denounce violence. The statements were selected from 45 semi-structured interviews answered by 21 women and 25 professionals from the police, legal, social and health services and nongovernmental institutions. Data were analyzed using NVivo software and one of the categories selected was the Maria da Penha law. Most respondents mentioned the positive and innovatory aspects of the law, though they also pointed out its limitations. The care workers see the legal device as an important tool for tackling violence, aligned with international conventions, bringing innovations and broadening women's access to justice. In terms of weaknesses, both women and care workers stress the inefficiency in the implementation of protective measures, the lack of material resources and manpower, the fragmentation of the health care network and the movement of conservative sectors in society to delegitimize the law.

  9. Syntactic Complexity as an Aspect of Text Complexity

    ERIC Educational Resources Information Center

    Frantz, Roger S.; Starr, Laura E.; Bailey, Alison L.

    2015-01-01

    Students' ability to read complex texts is emphasized in the Common Core State Standards (CCSS) for English Language Arts and Literacy. The standards propose a three-part model for measuring text complexity. Although the model presents a robust means for determining text complexity based on a variety of features inherent to a text as well as…

  10. Understanding Complex Natural Systems by Articulating Structure-Behavior-Function Models

    ERIC Educational Resources Information Center

    Vattam, Swaroop S.; Goel, Ashok K.; Rugaber, Spencer; Hmelo-Silver, Cindy E.; Jordan, Rebecca; Gray, Steven; Sinha, Suparna

    2011-01-01

    Artificial intelligence research on creative design has led to Structure-Behavior-Function (SBF) models that emphasize functions as abstractions for organizing understanding of physical systems. Empirical studies on understanding complex systems suggest that novice understanding is shallow, typically focusing on their visible structures and…

  11. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.

    PubMed

    Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan

    2016-04-01

    The research paradigm in biomaterials science and engineering is evolving from low-throughput, iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict the in vivo performance of a given biomaterial are largely lacking, as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high-throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2012 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  12. [Comparison of predictive models for the selection of high-complexity patients].

    PubMed

    Estupiñán-Ramírez, Marcos; Tristancho-Ajamil, Rita; Company-Sancho, María Consuelo; Sánchez-Janáriz, Hilda

    2017-08-18

    To compare the concordance of complexity weights between Clinical Risk Groups (CRG) and Adjusted Morbidity Groups (AMG), to determine which one better predicts patient admission, and to optimise the method used to select the 0.5% of patients of highest complexity for inclusion in an intervention protocol. Cross-sectional analytical study in 18 Canary Island health areas; 385,049 citizens were enrolled, using sociodemographic variables from health cards; diagnoses and use of healthcare resources obtained from primary health care electronic records (PCHR) and the basic minimum set of hospital data; the functional status recorded in the PCHR; and the drugs prescribed through the electronic prescription system. The correlation between stratifiers was estimated from these data. The ability of each stratifier to predict patient admissions was evaluated and prediction optimisation models were constructed. Concordance between the complexity weights of the two stratifiers was strong (rho = 0.735) and the correlation between categories of complexity was moderate (weighted kappa = 0.515). The AMG complexity weight predicts patient admission better than the CRG weight (AUC: 0.696 [0.695-0.697] versus 0.692 [0.691-0.693]). Other predictive variables were added to the AMG weight; the best AUC (0.708 [0.707-0.708]) was obtained by the model composed of AMG, sex, age, Pfeiffer and Barthel scales, re-admissions and number of prescribed therapeutic groups. In conclusion, strong concordance was found between stratifiers, with a higher predictive capacity for admission from AMG, which can be increased by adding other dimensions. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
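
    The comparison described above amounts to fitting admission models with and without the extra covariates and comparing their AUCs. A minimal sketch of that workflow follows; the data are purely synthetic (so the printed AUC values mean nothing) and the column layout is an assumption.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Hypothetical design matrix: column 0 stands in for the AMG complexity
        # weight; the other columns stand in for sex, age, Pfeiffer and Barthel
        # scores, re-admissions and number of prescribed therapeutic groups.
        rng = np.random.default_rng(0)
        X_full = rng.normal(size=(5000, 7))
        y = rng.integers(0, 2, size=5000)  # admission yes/no (synthetic)
        X_base = X_full[:, :1]

        for name, X in [("AMG weight only", X_base), ("AMG + covariates", X_full)]:
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
            model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
            auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
            print(f"{name}: AUC = {auc:.3f}")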

  13. The big data-big model (BDBM) challenges in ecological research

    NASA Astrophysics Data System (ADS)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from the sensors of those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe the major processes underlying complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include the interoperability of multiple data sources.

  14. Scientific and technical complex for modeling, researching and testing of rocket-space vehicles’ electric power installations

    NASA Astrophysics Data System (ADS)

    Bezruchko, Konstantin; Davidov, Albert

    2009-01-01

    This article describes the scientific and technical complex for modeling, researching and testing of rocket-space vehicles' power installations created in the Power Source Laboratory of the National Aerospace University "KhAI". This scientific and technical complex makes it possible to replace full-scale tests with model tests and to reduce the financial and time expenditure of modeling, researching and testing rocket-space vehicles' power installations. Using the complex, it is possible to solve problems of designing and researching rocket-space vehicles' power installations efficiently, and also to carry out experimental research on physical processes and tests of solar and chemical batteries of rocket-space complexes and space vehicles. The scientific and technical complex also supports accelerated testing, diagnostics, life-time control and restoration of chemical accumulators for rocket-space vehicles' power supply systems.

  15. The application of CFD to the modelling of fires in complex geometries

    NASA Astrophysics Data System (ADS)

    Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.

    The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.

  16. Increasing Model Complexity: Unit Testing and Validation of a Coupled Electrical Resistive Heating and Macroscopic Invasion Percolation Model

    NASA Astrophysics Data System (ADS)

    Molnar, I. L.; Krol, M.; Mumford, K. G.

    2016-12-01

    Geoenvironmental models are becoming increasingly sophisticated as they incorporate rising numbers of mechanisms and process couplings to describe environmental scenarios. When combined with advances in computing and numerical techniques, these already complicated models are experiencing large increases in code complexity and simulation time. Although this complexity has enabled breakthroughs in the ability to describe environmental problems, it is difficult to ensure that complex models are sufficiently robust and behave as intended. Many development tools used for testing software robustness have not seen widespread use in the geoenvironmental sciences despite an increasing reliance on complex numerical models, leaving many models at risk of undiscovered errors and potentially improper validations. This study explores the use of unit testing, which independently examines small code elements to ensure each unit is working as intended, as well as their integrated behaviour, to test the functionality and robustness of a coupled Electrical Resistive Heating (ERH) - Macroscopic Invasion Percolation (MIP) model. ERH is a thermal remediation technique where the soil is heated until boiling and volatile contaminants are stripped from the soil. There is significant interest in improving the efficiency of ERH, including taking advantage of low-temperature co-boiling behaviour, which may reduce energy consumption. However, at lower co-boiling temperatures gas bubbles can form, mobilize and collapse in cooler areas, potentially contaminating previously clean zones. The ERH-MIP model was created to simulate the behaviour of gas bubbles in the subsurface and to evaluate ERH during co-boiling. This study demonstrates how unit testing ensures that the model behaves in an expected manner and examines the robustness of every component within the ERH-MIP model. Once unit testing is established, the MIP module (a discrete gas transport algorithm for gas expansion, mobilization and collapse) can be examined in isolation.
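
    As a concrete illustration of the unit-testing approach described above, the sketch below tests two behaviours one would expect of a discrete gas-transport component: mass conservation under expansion and a mobilization threshold. The GasCluster class and its thresholds are invented stand-ins, not the authors' ERH-MIP code.

        import unittest

        class GasCluster:
            """Minimal stand-in for one unit of a discrete gas-transport model."""
            def __init__(self, moles=1.0, height=0.0, critical_height=0.05):
                self.moles = moles
                self.height = height
                self.critical_height = critical_height
                self.volume = 1.0

            def expand(self, dT):
                # Isobaric ideal-gas expansion: volume grows, moles are conserved
                self.volume *= 1.0 + dT / 273.15

            def mobilizes(self):
                # A cluster mobilizes only once it exceeds a critical gas height
                return self.height > self.critical_height

        class TestGasCluster(unittest.TestCase):
            def test_expansion_conserves_mass(self):
                c = GasCluster(moles=2.5)
                c.expand(dT=10.0)
                self.assertEqual(c.moles, 2.5)

            def test_no_mobilization_below_threshold(self):
                c = GasCluster(height=0.01, critical_height=0.05)
                self.assertFalse(c.mobilizes())

        if __name__ == "__main__":
            unittest.main()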

  17. Modeling uranium(VI) adsorption onto montmorillonite under varying carbonate concentrations: A surface complexation model accounting for the spillover effect on surface potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournassat, C.; Tinnacher, R. M.; Grangeon, S.

    The prediction of U(VI) adsorption onto montmorillonite clay is confounded by the complexities of: (1) the montmorillonite structure in terms of adsorption sites on basal and edge surfaces, and the complex interactions between the electrical double layers at these surfaces, and (2) U(VI) solution speciation, which can include cationic, anionic and neutral species. Previous U(VI)-montmorillonite adsorption and modeling studies have typically expanded classical surface complexation modeling approaches, initially developed for simple oxides, to include both cation exchange and surface complexation reactions. However, previous models have not taken into account the unique characteristics of electrostatic surface potentials that occur at montmorillonite edge sites, where the electrostatic surface potential of basal plane cation exchange sites influences the surface potential of neighboring edge sites (‘spillover’ effect).

  18. Modeling uranium(VI) adsorption onto montmorillonite under varying carbonate concentrations: A surface complexation model accounting for the spillover effect on surface potential

    DOE PAGES

    Tournassat, C.; Tinnacher, R. M.; Grangeon, S.; ...

    2017-10-06

    The prediction of U(VI) adsorption onto montmorillonite clay is confounded by the complexities of: (1) the montmorillonite structure in terms of adsorption sites on basal and edge surfaces, and the complex interactions between the electrical double layers at these surfaces, and (2) U(VI) solution speciation, which can include cationic, anionic and neutral species. Previous U(VI)-montmorillonite adsorption and modeling studies have typically expanded classical surface complexation modeling approaches, initially developed for simple oxides, to include both cation exchange and surface complexation reactions. However, previous models have not taken into account the unique characteristics of electrostatic surface potentials that occur at montmorillonite edge sites, where the electrostatic surface potential of basal plane cation exchange sites influences the surface potential of neighboring edge sites (‘spillover’ effect).

  19. Complex networks generated by the Penna bit-string model: Emergence of small-world and assortative mixing

    NASA Astrophysics Data System (ADS)

    Li, Chunguang; Maini, Philip K.

    2005-10-01

    The Penna bit-string model successfully encompasses many phenomena of population evolution, including inheritance, mutation, evolution, and aging. If we consider social interactions among individuals in the Penna model, the population will form a complex network. In this paper, we first modify the Verhulst factor to control only the birth rate, and introduce activity-based preferential reproduction of offspring in the Penna model. The social interactions among individuals are generated by both inheritance and activity-based preferential increase. Then we study the properties of the complex network generated by the modified Penna model. We find that the resulting complex network has a small-world effect and the assortative mixing property.
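
    The core dynamics of the modified model are easy to sketch. The following Python toy keeps the bit-string aging rules and applies the Verhulst factor to births only, as described above; all parameter values are illustrative, and the network-generation step (inheritance links plus activity-based preferential reproduction) is omitted.

        import numpy as np

        rng = np.random.default_rng(1)
        GENOME_BITS, LETHAL, MUT_RATE, N0, CAPACITY = 64, 3, 2, 500, 10_000

        genomes = np.zeros((N0, GENOME_BITS), dtype=bool)  # deleterious mutations
        ages = np.zeros(N0, dtype=int)

        for step in range(100):
            ages += 1
            # Death: LETHAL active mutations (bits set up to current age) kill
            active = np.array([g[:a].sum() for g, a in zip(genomes, ages)])
            alive = (active < LETHAL) & (ages < GENOME_BITS)
            genomes, ages = genomes[alive], ages[alive]
            # Verhulst factor restricted to the birth rate (the modification)
            birth_p = 1.0 - len(genomes) / CAPACITY
            mature = ages >= 8  # reproduction age (assumed)
            mothers = genomes[mature & (rng.random(len(genomes)) < birth_p)]
            if len(mothers):
                kids = mothers.copy()
                flips = rng.integers(0, GENOME_BITS, size=(len(kids), MUT_RATE))
                for k, f in zip(kids, flips):
                    k[f] = True  # each child acquires new deleterious mutations
                genomes = np.vstack([genomes, kids])
                ages = np.concatenate([ages, np.zeros(len(kids), dtype=int)])

        print("population after 100 steps:", len(genomes))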

  20. Compact Q-balls in the complex signum-Gordon model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arodz, H.; Lis, J.

    2008-05-15

    We discuss Q-balls in the complex signum-Gordon model in d-dimensional space for d = 1, 2, 3. The Q-balls have strictly finite size. Their total energy is a power-like function of the conserved U(1) charge with the exponent equal to (d+2)/(d+3). In the cases d = 1 and d = 3 explicit analytic solutions are presented.
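
    In display form, the reported charge-energy relation reads as follows; the explicit exponent values follow directly from the quoted formula.

        E(Q) \;\propto\; Q^{\frac{d+2}{d+3}}, \qquad
        \frac{d+2}{d+3} = \tfrac{3}{4},\ \tfrac{4}{5},\ \tfrac{5}{6}
        \quad \text{for } d = 1,\, 2,\, 3.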

  1. An integrated water system model considering hydrological and biogeochemical processes at basin scale: model construction and application

    NASA Astrophysics Data System (ADS)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.

    2014-08-01

    Integrated water system modeling is a reasonable approach to provide scientific understanding and possible solutions to tackle the severe water crisis faced over the world and to promote the implementation of integrated river basin management. Such a modeling practice has become more feasible nowadays due to better computing facilities and available data sources. In this study, a process-oriented water system model (HEXM) is developed by integrating multiple water-related processes, including hydrology, biogeochemistry, environment and ecology, as well as the interference of human activities. The model was tested in the Shaying River Catchment, the largest, highly regulated and heavily polluted tributary of the Huai River Basin in China. The results show that HEXM is well integrated, with good performance on the key water-related components in complex catchments. The simulated daily runoff series at all the regulated and less-regulated stations match observations, especially for high and low flow events. The average values of the correlation coefficient and coefficient of efficiency are 0.81 and 0.63, respectively. The dynamics of observed daily ammonia-nitrogen (NH4N) concentration, an important index for assessing water environmental quality in China, are well captured, with an average correlation coefficient of 0.66. Furthermore, the spatial patterns of nonpoint source pollutant load and grain yield are also simulated properly, and the outputs agree well with statistics at the city scale. The model shows clearly superior performance in both calibration and validation in comparison with the widely used SWAT model. It is expected to provide a useful reference for water system modeling in complex basins, a scientific foundation for the implementation of integrated river basin management worldwide, and technical guidance for the reasonable regulation of dams and sluices and for environmental improvement in river basins.
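
    Assuming the "coefficient of efficiency" quoted above is the standard Nash-Sutcliffe measure, both reported scores can be computed from paired daily series as in this short Python sketch; the runoff numbers are made up.

        import numpy as np

        def nse(obs, sim):
            """Nash-Sutcliffe coefficient of efficiency (1 = perfect fit)."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def correlation(obs, sim):
            """Pearson correlation between observed and simulated series."""
            return np.corrcoef(obs, sim)[0, 1]

        # Hypothetical daily runoff series at one station
        obs = np.array([1.2, 3.4, 8.9, 5.1, 2.2, 1.0])
        sim = np.array([1.0, 3.0, 9.5, 4.8, 2.6, 1.1])
        print(f"r = {correlation(obs, sim):.2f}, NSE = {nse(obs, sim):.2f}")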

  2. Modeling uranium(VI) adsorption onto montmorillonite under varying carbonate concentrations: A surface complexation model accounting for the spillover effect on surface potential

    NASA Astrophysics Data System (ADS)

    Tournassat, C.; Tinnacher, R. M.; Grangeon, S.; Davis, J. A.

    2018-01-01

    The prediction of U(VI) adsorption onto montmorillonite clay is confounded by the complexities of: (1) the montmorillonite structure in terms of adsorption sites on basal and edge surfaces, and the complex interactions between the electrical double layers at these surfaces, and (2) U(VI) solution speciation, which can include cationic, anionic and neutral species. Previous U(VI)-montmorillonite adsorption and modeling studies have typically expanded classical surface complexation modeling approaches, initially developed for simple oxides, to include both cation exchange and surface complexation reactions. However, previous models have not taken into account the unique characteristics of electrostatic surface potentials that occur at montmorillonite edge sites, where the electrostatic surface potential of basal plane cation exchange sites influences the surface potential of neighboring edge sites ('spillover' effect). A series of U(VI) - Na-montmorillonite batch adsorption experiments was conducted as a function of pH, with variable U(VI), Ca, and dissolved carbonate concentrations. Based on the experimental data, a new type of surface complexation model (SCM) was developed for montmorillonite, that specifically accounts for the spillover effect using the edge surface speciation model by Tournassat et al. (2016a). The SCM allows for a prediction of U(VI) adsorption under varying chemical conditions with a minimum number of fitting parameters, not only for our own experimental results, but also for a number of published data sets. The model agreed well with many of these datasets without introducing a second site type or including the formation of ternary U(VI)-carbonato surface complexes. The model predictions were greatly impacted by utilizing analytical measurements of dissolved inorganic carbon (DIC) concentrations in individual sample solutions rather than assuming solution equilibration with a specific partial pressure of CO2, even when the gas phase was

  3. Turbulent Dispersion Modelling in a Complex Urban Environment - Data Analysis and Model Development

    DTIC Science & Technology

    2010-02-01

    Technology Laboratory (Dstl) is used as a benchmark for comparison. Comparisons are also made with some more practically oriented computational fluid dynamics...predictions. To achieve clarity in the range of approaches available for practical models of contaminant dispersion in urban areas, an overview of...complexity of those methods is simplified to a degree that allows straightforward practical implementation and application. Using these results as a

  4. A fast mass spring model solver for high-resolution elastic objects

    NASA Astrophysics Data System (ADS)

    Zheng, Mianlun; Yuan, Zhiyong; Zhu, Weixu; Zhang, Guian

    2017-03-01

    Real-time simulation of elastic objects is of great importance for computer graphics and virtual reality applications. The fast mass spring model solver can achieve visually realistic simulation in an efficient way. Unfortunately, this method suffers from resolution limitations and a lack of mechanical realism for a surface geometry model, which greatly restricts its application. To tackle these problems, in this paper we propose a fast mass spring model solver for high-resolution elastic objects. First, we project the complex surface geometry model into a set of uniform grid cells serving as cages, using the mean value coordinates method, to reflect its internal structure and mechanical properties. Then, we replace the original Cholesky decomposition method in the fast mass spring model solver with a conjugate gradient method, which makes the fast mass spring model solver more efficient for detailed surface geometry models. Finally, we propose a graphics processing unit accelerated parallel algorithm for the conjugate gradient method. Experimental results show that our method can achieve efficient deformation simulation of 3D elastic objects with visual realism and physical fidelity, and has great potential for applications in computer animation.
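
    The solver substitution described above can be sketched compactly: an implicit mass-spring step leads to a sparse, symmetric positive-definite system, which conjugate gradients solves using only matrix-vector products. The 1D chain, step size and stiffness below are assumptions for illustration, not the paper's setup.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import cg

        # Hypothetical chain of n masses joined by springs: an implicit time step
        # yields the SPD system (M + h^2 K) v = b.
        n, h, k, m = 1000, 1e-2, 50.0, 1.0
        K = diags([-k, 2.0 * k, -k], [-1, 0, 1], shape=(n, n))  # stiffness matrix
        A = diags([m], [0], shape=(n, n)) + h ** 2 * K          # system matrix
        b = np.random.default_rng(0).normal(size=n)             # stand-in RHS

        # Conjugate gradients needs only products A @ v, so each iteration is
        # cheap and easy to parallelise (e.g. on a GPU), which is the advantage
        # over a dense Cholesky factorisation for high-resolution models.
        v, info = cg(A, b)
        assert info == 0  # 0 signals convergence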

  5. Modelling DC responses of 3D complex fracture networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beskardes, Gungor Didem; Weiss, Chester Joseph

    Here, the determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretizations of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.

  6. Modelling DC responses of 3D complex fracture networks

    DOE PAGES

    Beskardes, Gungor Didem; Weiss, Chester Joseph

    2018-03-01

    Here, the determination of the geometrical properties of fractures plays a critical role in many engineering problems to assess the current hydrological and mechanical states of geological media and to predict their future states. However, numerical modeling of geoelectrical responses in realistic fractured media has been challenging due to the explosive computational cost imposed by the explicit discretizations of fractures at multiple length scales, which often brings about a tradeoff between computational efficiency and geologic realism. Here, we use the hierarchical finite element method to model electrostatic response of realistically complex 3D conductive fracture networks with minimal computational cost.

  7. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making are based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promote good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. It then describes a methodology for paired model usage through which predictive

  8. A role for low-order system dynamics models in urban health policy making.

    PubMed

    Newell, Barry; Siri, José

    2016-10-01

    Cities are complex adaptive systems whose responses to policy initiatives emerge from feedback interactions between their parts. Urban policy makers must routinely deal with both detail and dynamic complexity, coupled with high levels of diversity, uncertainty and contingency. In such circumstances, it is difficult to generate reliable predictions of health-policy outcomes. In this paper we explore the potential for low-order system dynamics (LOSD) models to make a contribution towards meeting this challenge. By definition, LOSD models have few state variables (≤5), illustrate the non-linear effects caused by feedback and accumulation, and focus on endogenous dynamics generated within well-defined boundaries. We suggest that experience with LOSD models can help practitioners to develop an understanding of basic principles of system dynamics, giving them the ability to 'see with new eyes'. Because efforts to build a set of LOSD models can help a transdisciplinary group to develop a shared, coherent view of the problems that they seek to tackle, such models can also become the foundations of 'powerful ideas'. Powerful ideas are conceptual metaphors that provide the members of a policy-making group with the a priori shared context required for effective communication, the co-production of knowledge, and the collaborative development of effective public health policies. Copyright © 2016 Elsevier Ltd. All rights reserved.
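
    A low-order system dynamics model of the kind described is small enough to write in a few lines. The sketch below integrates a two-stock toy (population and attractiveness linked through feedback); the stocks, rates and parameter values are invented for illustration and are not from the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        def losd(t, y, growth=0.03, erosion=0.05, capacity=1e6):
            population, attractiveness = y
            # Reinforcing loop: attractiveness drives in-migration and growth
            dp = growth * population * attractiveness
            # Balancing loop: crowding erodes attractiveness as capacity nears
            da = -erosion * attractiveness * (population / capacity)
            return [dp, da]

        sol = solve_ivp(losd, (0.0, 200.0), [1e4, 1.0])
        print("final population:", sol.y[0, -1])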

  9. Turbulence modeling needs of commercial CFD codes: Complex flows in the aerospace and automotive industries

    NASA Technical Reports Server (NTRS)

    Befrui, Bizhan A.

    1995-01-01

    This viewgraph presentation discusses the following: STAR-CD computational features; STAR-CD turbulence models; common features of industrial complex flows; industry-specific CFD development requirements; applications and experiences of industrial complex flows, including flow in rotating disc cavities, diffusion hole film cooling, internal blade cooling, and external car aerodynamics; and conclusions on turbulence modeling needs.

  10. MASS BALANCE MODELLING OF PCBS IN THE FOX RIVER/GREEN BAY COMPLEX

    EPA Science Inventory

    The USEPA Office of Research and Development developed and applied a multimedia, mass balance modeling approach to the Fox River/Green Bay complex to aid managers with remedial decision-making. The suite of models was applied to PCBs due to the long history of contamination and ...

  11. Multi-level emulation of complex climate model responses to boundary forcing data

    NASA Astrophysics Data System (ADS)

    Tran, Giang T.; Oliver, Kevin I. C.; Holden, Philip B.; Edwards, Neil R.; Sóbester, András; Challenor, Peter

    2018-04-01

    Climate model components involve both high-dimensional input and output fields. It is desirable to efficiently generate spatio-temporal outputs of these models for applications in integrated assessment modelling, or to assess the statistical relationship between such sets of inputs and outputs, for example in uncertainty analysis. However, the need for efficiency often compromises the fidelity of output through the use of low-complexity models. Here, we develop a technique which combines statistical emulation with a dimensionality reduction technique to emulate a wide range of outputs from an atmospheric general circulation model, PLASIM, as functions of the boundary forcing prescribed by the ocean component of a lower-complexity climate model, GENIE-1. Although accurate and detailed spatial information on atmospheric variables such as precipitation and wind speed is well beyond the capability of GENIE-1's energy-moisture balance model of the atmosphere, this study demonstrates that the output of this model is useful in predicting PLASIM's spatio-temporal fields through multi-level emulation. Meaningful information from the fast model, GENIE-1, was extracted by utilising the correlation between variables of the same type in the two models and between variables of different types in PLASIM. We present here the construction and validation of several PLASIM variable emulators and discuss their potential use in developing a hybrid model with statistical components.
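
    A minimal sketch of the generic emulation-plus-dimension-reduction pattern the study builds on: reduce the high-dimensional output fields with PCA, then fit one Gaussian-process emulator per retained component. The data shapes and numbers below are placeholders; the actual multi-level PLASIM/GENIE-1 construction is considerably richer.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 5))     # fast-model (boundary forcing) summaries
        Y = rng.normal(size=(60, 2048))  # expensive-model spatio-temporal fields

        # Emulate a handful of principal components rather than 2048 grid cells
        pca = PCA(n_components=5).fit(Y)
        scores = pca.transform(Y)
        gps = [GaussianProcessRegressor().fit(X, scores[:, j]) for j in range(5)]

        def emulate(x_new):
            """Predict a full field for new forcing by re-projecting GP outputs."""
            z = np.column_stack([gp.predict(x_new) for gp in gps])
            return pca.inverse_transform(z)

        print(emulate(rng.normal(size=(1, 5))).shape)  # -> (1, 2048)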

  12. Adsorption of uranium(VI) to manganese oxides: X-ray absorption spectroscopy and surface complexation modeling.

    PubMed

    Wang, Zimeng; Lee, Sung-Woo; Catalano, Jeffrey G; Lezama-Pacheco, Juan S; Bargar, John R; Tebo, Bradley M; Giammar, Daniel E

    2013-01-15

    The mobility of hexavalent uranium in soil and groundwater is strongly governed by adsorption to mineral surfaces. As strong naturally occurring adsorbents, manganese oxides may significantly influence the fate and transport of uranium. Models for U(VI) adsorption over a broad range of chemical conditions can improve predictive capabilities for uranium transport in the subsurface. This study integrated batch experiments of U(VI) adsorption to synthetic and biogenic MnO(2), surface complexation modeling, ζ-potential analysis, and molecular-scale characterization of adsorbed U(VI) with extended X-ray absorption fine structure (EXAFS) spectroscopy. The surface complexation model included inner-sphere monodentate and bidentate surface complexes and a ternary uranyl-carbonato surface complex, which was consistent with the EXAFS analysis. The model could successfully simulate adsorption results over a broad range of pH and dissolved inorganic carbon concentrations. U(VI) adsorption to synthetic δ-MnO(2) appears to be stronger than to biogenic MnO(2), and the differences in adsorption affinity and capacity are not associated with any substantial difference in U(VI) coordination.

  13. Tackling drug and alcohol misuse in Brazil: priorities and challenges for nurses.

    PubMed

    Rassool, G H; Villar-Luis, M

    2004-12-01

    To provide an overview of the extent of drug and alcohol misuse in Brazil and the policies and approaches in tackling substance misuse. An examination of the challenges facing the nursing profession in working with substance misusers is presented. Alcohol, cocaine, and cannabis are the most commonly misused psychoactive substances in Brazil. One of the biggest public health problems is the interface between the misuse of psychoactive substances and HIV prevalence and other sexually transmitted diseases. Findings from a recent study suggest that undergraduate nurses in Brazil are not adequately prepared in the care and management of substance misuse problems. The nursing profession in Brazil faces numerous challenges in the development of professional competence of nurses in this field. A strategy proposed is the creation of regional centres in Brazil to study the integration of substance use and misuse in the nursing undergraduate curriculum and the giving of specific support in teaching and research to nurse teachers. Nurses have a key role to play in the early recognition, assessment, prevention, and treatment of substance misuse.

  14. Surface Complexation Modeling of Eu(III) and U(VI) Interactions with Graphene Oxide.

    PubMed

    Xie, Yu; Helvenston, Edward M; Shuller-Nickles, Lindsay C; Powell, Brian A

    2016-02-16

    Graphene oxide (GO) has great potential for actinide removal due to its extremely high sorption capacity, but the mechanism of sorption remains unclear. In this study, the carboxylic functional group and an unexpected sulfonate functional group on GO were characterized as the reactive surface sites and quantified via diffuse layer modeling of the GO acid/base titrations. The presence of the sulfonate functional group on GO was confirmed using elemental analysis and X-ray photoelectron spectroscopy. Batch experiments of Eu(III) and U(VI) sorption to GO as a function of pH (1-8) and of analyte concentration (10-100,000 ppb) at a constant pH ≈ 5 were conducted; the batch sorption results were modeled simultaneously using surface complexation modeling (SCM). The SCM indicated that Eu(III) and U(VI) complexation to the carboxylate functional group is the main mechanism for their sorption to GO; their complexation to the sulfonate site occurred in the lower pH range, and the complexation of Eu(III) to the sulfonate site is more significant than that of U(VI). Eu(III)- and U(VI)-facilitated GO aggregation was observed at high Eu(III) and U(VI) concentrations and may be caused by surface charge neutralization of GO after sorption.

  15. Tackling U.S. energy challenges and opportunities: preliminary policy recommendations for enhancing energy innovation in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anadon, Laura Diaz; Gallagher, Kelly Sims; Bunn, Matthew

    2009-02-18

    The report offers preliminary recommendations for near-term actions to strengthen the U.S. effort to develop and deploy advanced energy technologies. The report comes as the Obama Administration and the 111th U.S. Congress face enormous challenges and opportunities in tackling the pressing security, economic, and environmental problems posed by the energy sector. Improving the technologies of energy supply and end-use is a prerequisite for surmounting these challenges in a timely and cost-effective way, and this report elaborates on how policy can support develop of these important energy technologies.

  16. Diffusion in higher dimensional SYK model with complex fermions

    NASA Astrophysics Data System (ADS)

    Cai, Wenhe; Ge, Xian-Hui; Yang, Guo-Hong

    2018-01-01

    We construct a new higher dimensional SYK model with complex fermions on bipartite lattices. As an extension of the original zero-dimensional SYK model, we focus on the one-dimensional case; similar Hamiltonians can be obtained in higher dimensions. This model has a conserved U(1) fermion number Q and a conjugate chemical potential μ. We evaluate the thermal and charge diffusion constants via a large-q expansion in the low temperature limit. The results show that the diffusivity depends on the ratio of free Majorana fermions to Majorana fermions with SYK interactions. The transport properties and the butterfly velocity are accordingly calculated at low temperature. The specific heat and the thermal conductivity are proportional to the temperature. The electrical resistivity also has a linear temperature dependence term.

  17. How will a life course framework be used to tackle wider social determinants of health?

    PubMed

    Nicolau, Belinda; Marcenes, Wagner

    2012-10-01

    The life course framework, proposed by Kuh and Schlomo in 1997, offers policy makers the means to understand the interaction between nature and nurture. This conceptual model illustrates how an individual's biological resources are influenced by their genetic endowment, their prenatal and postnatal development and their social and physical environment, both in early life and throughout the life course. Health is conceptualized as a dynamic process connecting biological and social elements that are affected by previous experiences and by present circumstances. Therefore, exposure at different stages of people's lives can either enhance or deplete the individual's health resources. Indeed, life course processes are of many kinds, including parent-child relationships, levels of social deprivation, the acquisition of emotional and behavioural assets in adolescence and the long-term effects of occupational hazards and work stress. The long-term effects of nature and nurture combine to influence disease outcomes. It is only in the last decade that theories, methods and new data have begun to be amalgamated, allowing us to further our understanding of health over the life course in ways that may eventually lead to more effective health policies and better health care. This article discusses life course concepts and how this framework can enlighten our understanding of wider social determinants of health, and provides a few examples of potential interventions to tackle their impact on health. © 2012 John Wiley & Sons A/S.

  18. Understanding Transportation Systems : An Integrated Approach to Modeling Complex Transportation Systems

    DOT National Transportation Integrated Search

    2013-01-01

    The ability to model and understand the complex dynamics of intelligent agents as they interact within a transportation system could lead to revolutionary advances in transportation engineering and intermodal surface transportation in the United Stat...

  19. A 3D modeling approach to complex faults with multi-source data

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Xu, Hua; Zou, Xukai; Lei, Hongzhuan

    2015-04-01

    Fault modeling is a very important step in making an accurate and reliable 3D geological model. Typical existing methods demand enough fault data to be able to construct complex fault models, however, it is well known that the available fault data are generally sparse and undersampled. In this paper, we propose a workflow of fault modeling, which can integrate multi-source data to construct fault models. For the faults that are not modeled with these data, especially small-scale or approximately parallel with the sections, we propose the fault deduction method to infer the hanging wall and footwall lines after displacement calculation. Moreover, using the fault cutting algorithm can supplement the available fault points on the location where faults cut each other. Increasing fault points in poor sample areas can not only efficiently construct fault models, but also reduce manual intervention. By using a fault-based interpolation and remeshing the horizons, an accurate 3D geological model can be constructed. The method can naturally simulate geological structures no matter whether the available geological data are sufficient or not. A concrete example of using the method in Tangshan, China, shows that the method can be applied to broad and complex geological areas.

  20. Progress on Complex Langevin simulations of a finite density matrix model for QCD

    NASA Astrophysics Data System (ADS)

    Bloch, Jacques; Glesaaen, Jonas; Verbaarschot, Jacobus; Zafeiropoulos, Savvas

    2018-03-01

    We study the Stephanov model, an RMT model for QCD at finite density, using the Complex Langevin algorithm. A naive implementation of the algorithm shows convergence towards the phase-quenched or quenched theory rather than to the intended theory with dynamical quarks. A detailed analysis of this issue and a potential resolution of the failure of this algorithm are discussed. We study the effect of gauge cooling on the Dirac eigenvalue distribution and the time evolution of the norm for various cooling norms, which were specifically designed to remove the pathologies of the complex Langevin evolution. The cooling is further supplemented with a shifted representation for the random matrices. Unfortunately, none of these modifications generates a substantial improvement in the complex Langevin evolution, and the final results still do not agree with the analytical predictions.

  1. Introduction to a special section on ecohydrology of semiarid environments: Confronting mathematical models with ecosystem complexity

    NASA Astrophysics Data System (ADS)

    Svoray, Tal; Assouline, Shmuel; Katul, Gabriel

    2015-11-01

    The current literature provides a large number of publications about ecohydrological processes and their effect on the biota in drylands. Given the limited laboratory and field experiments in such systems, many of these publications are based on mathematical models of varying complexity. The underlying implicit assumption is that the data set used to evaluate these models covers the parameter space of conditions that characterize drylands and that the models represent the actual processes with acceptable certainty. However, a question raised is to what extent these mathematical models are valid when confronted with observed ecosystem complexity. This Introduction reviews the 16 papers that comprise the Special Section on Ecohydrology of Semiarid Environments: Confronting Mathematical Models with Ecosystem Complexity. The subjects studied in these papers include rainfall regime, infiltration and preferential flow, evaporation and evapotranspiration, annual net primary production, dispersal and invasion, and vegetation greening. The findings in the papers published in this Special Section show that innovative mathematical modeling approaches can represent actual field measurements. Hence, there are strong grounds for suggesting that mathematical models can contribute to greater understanding of ecosystem complexity through characterization of space-time dynamics of biomass and water storage as well as their multiscale interactions. However, the generality of the models and their low-dimensional representation of many processes may also be a "curse" that results in failures when the particulars of an ecosystem are required. It is envisaged that the search for a unifying "general" model, while seductive, may remain elusive in the foreseeable future. It is for this reason that improving the merger between experiments and models of various degrees of complexity continues to shape the future research agenda.

  2. Synchronization Experiments With A Global Coupled Model of Intermediate Complexity

    NASA Astrophysics Data System (ADS)

    Selten, Frank; Hiemstra, Paul; Shen, Mao-Lin

    2013-04-01

    In the super modeling approach, an ensemble of imperfect models is connected through nudging terms that nudge the solution of each model towards the solutions of all other models in the ensemble. The goal is to obtain, through a proper choice of connection strengths, a synchronized state that closely tracks the trajectory of the true system. For the super modeling approach to be successful, the connections should be dense and strong enough for synchronization to occur. In this study we analyze the behavior of an ensemble of connected global atmosphere-ocean models of intermediate complexity. All atmosphere models are connected to the same ocean model through the surface fluxes of heat, water and momentum, and the ocean is integrated using weighted-average surface fluxes. In particular, we analyze the degree of synchronization between the atmosphere models and the characteristics of the ensemble mean solution. The results are interpreted using a low-order atmosphere-ocean toy model.
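
    The nudging idea is easy to demonstrate on a toy system. In the sketch below, two imperfect copies of a chaotic model (Lorenz-63 with perturbed parameters, standing in for the imperfect ensemble members) are coupled by linear nudging terms and synchronize when the connection is strong enough; all coefficient values are illustrative.

        import numpy as np
        from scipy.integrate import solve_ivp

        def lorenz(s, sigma, rho, beta=8.0 / 3.0):
            x, y, z = s
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        def supermodel(t, s, c=1.0):
            a, b = s[:3], s[3:]
            # Two imperfect models (perturbed parameters) nudged toward each other
            da = lorenz(a, sigma=9.0, rho=29.0) + c * (b - a)
            db = lorenz(b, sigma=11.0, rho=27.0) + c * (a - b)
            return np.concatenate([da, db])

        s0 = np.array([1.0, 1.0, 1.0, -1.0, 2.0, 0.5])
        sol = solve_ivp(supermodel, (0.0, 20.0), s0, max_step=0.01)
        print("final disagreement:", np.abs(sol.y[:3, -1] - sol.y[3:, -1]).max())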

  3. New Age of 3D Geological Modelling or Complexity is not an Issue Anymore

    NASA Astrophysics Data System (ADS)

    Mitrofanov, Aleksandr

    2017-04-01

    A geological model has significant value in almost all types of research related to regional mapping and geodynamics, and especially to the structural and resource geology of mineral deposits. A well-developed geological model must take into account all vital features of the modelled object without over-simplification, and should also adequately represent the interpretation of the geologist. In recent years, with the gradual exhaustion of deposits with relatively simple morphology, geologists from all over the world have been faced with the necessity of building representative models for more and more structurally complex objects. Meanwhile, the set of tools used for that has not significantly changed in the last two to three decades. The most widespread method of wireframe geological modelling was developed in the 1990s and is fully based on the engineering design toolset (so-called CAD). Strings and polygons representing the section-based interpretation are used as an intermediate step in the process of wireframe generation. Despite the significant time required for this type of modelling, it can still provide sufficient results for simple and medium-complexity geological objects. However, with increasing complexity, more and more vital features of the deposit are sacrificed because of the fundamental inability (or much greater modelling time) of CAD-based explicit techniques to produce wireframes of the appropriate complexity. At the same time, an alternative technology, which is not based on the sectional approach and which uses fundamentally different mathematical algorithms, is being actively developed in a variety of other disciplines: medicine, advanced industrial design, and the game and cinema industries. In recent years this implicit technology has started to be developed for geological modelling purposes, and nowadays it is represented by a very powerful set of tools that has been integrated into almost all major commercial software packages.

  4. Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models

    DTIC Science & Technology

    2015-09-12

    Report documentation page residue only; recoverable fields: report number AFRL-AFOSR-VA-TR-2015-0278, grant FA9550-11-1-0239, subject terms: optimization, derivative-free optimization, statistical machine learning.

  5. Generative complexity of Gray-Scott model

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    2018-03-01

    In the Gray-Scott reaction-diffusion system one reactant is constantly fed into the system, while another reactant is reproduced by consuming the supplied reactant and is also converted to an inert product. The rate of feeding one reactant into the system and the rate of removing the other reactant from the system determine the configurations of concentration profiles: stripes, spots, waves. We calculate the generative complexity (a morphological complexity of concentration profiles grown from a point-wise perturbation of the medium) of the Gray-Scott system for a range of the feeding and removal rates. The morphological complexity is evaluated using Shannon entropy, Simpson diversity, an approximation of Lempel-Ziv complexity, and expressivity (Shannon entropy divided by space-filling). We analyse the behaviour of the systems with the highest values of generative morphological complexity and show that the Gray-Scott systems expressing the highest levels of complexity are composed of wave-fragments (similar to wave-fragments in sub-excitable media) and travelling localisations (similar to quasi-dissipative solitons and gliders in Conway's Game of Life).
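
    Two of the measures quoted above are simple to compute on a discretized concentration field. The sketch below evaluates Shannon entropy directly and uses a compression ratio as a cheap proxy for the Lempel-Ziv approximation; the bin count and the random stand-in field are arbitrary choices.

        import zlib
        import numpy as np

        def shannon_entropy(grid, bins=8):
            """Shannon entropy (bits) of the field's value distribution."""
            hist, _ = np.histogram(grid, bins=bins)
            p = hist[hist > 0] / hist.sum()
            return float(-(p * np.log2(p)).sum())

        def lz_proxy(grid, bins=8):
            """Compression ratio as a stand-in for Lempel-Ziv complexity."""
            edges = np.linspace(grid.min(), grid.max(), bins)
            raw = np.digitize(grid, edges).astype(np.uint8).tobytes()
            return len(zlib.compress(raw)) / len(raw)

        field = np.random.default_rng(0).random((256, 256))  # stand-in profile
        print(shannon_entropy(field), lz_proxy(field))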

  6. Tackling Social Cognition in Schizophrenia: A Randomized Feasibility Trial.

    PubMed

    Taylor, Rumina; Cella, Matteo; Csipke, Emese; Heriot-Maitland, Charles; Gibbs, Caroline; Wykes, Til

    2016-05-01

    Social cognition difficulties in schizophrenia are seen as a barrier to recovery. Interventions tackling problems in this domain have the potential to facilitate functioning and recovery. Social Cognition and Interaction Training (SCIT) is a manual-based psychological therapy designed to improve social functioning in schizophrenia. The aim of this study is to evaluate the feasibility and acceptability of a modified version of SCIT for inpatient forensic wards. The potential benefits of the intervention were also assessed. This study is a randomized single-blind controlled design, with participants randomized to receive SCIT (N = 21) or treatment as usual (TAU; N = 15). SCIT consisted of 8 weeks of therapy sessions twice per week. Participants were assessed at week 0 and one week after the intervention on measures of social cognition. Feasibility was assessed through group attendance and attrition. Participant acceptability and outcome were evaluated through post-group satisfaction and achievement of social goals. The intervention was well received by all participants and the majority reported that their confidence improved. The SCIT group showed a significant improvement in facial affect recognition compared to TAU. Almost all participants agreed they had achieved their social goal as a result of the intervention. It is feasible to deliver SCIT in a forensic ward setting; however, some adaptation to the protocol may need to be considered in order to accommodate the reduced social contact within forensic wards.

  7. The evaluative imaging of mental models - Visual representations of complexity

    NASA Technical Reports Server (NTRS)

    Dede, Christopher

    1989-01-01

    The paper deals with some design issues involved in building a system that could visually represent the semantic structures of training materials and their underlying mental models. In particular, hypermedia-based semantic networks that instantiate classification problem solving strategies are thought to be a useful formalism for such representations; the complexity of these web structures can be best managed through visual depictions. It is also noted that a useful approach to implement in these hypermedia models would be some metrics of conceptual distance.

  8. The electronic structure of vanadium monochloride cation (VCl+): Tackling the complexities of transition metal species

    NASA Astrophysics Data System (ADS)

    DeYonker, Nathan J.; Halfen, DeWayne T.; Allen, Wesley D.; Ziurys, Lucy M.

    2014-11-01

    Six electronic states (X ^4Σ^-, A ^4Π, B ^4Δ, ^2Φ, ^2Δ, ^2Σ^+) of the vanadium monochloride cation (VCl+) are described using large basis set coupled cluster theory. For the two lowest quartet states (X ^4Σ^- and A ^4Π), a focal point analysis (FPA) approach was used that conjoined a correlation-consistent family of basis sets up to aug-cc-pwCV5Z-DK with high-order coupled cluster theory through pentuple (CCSDTQP) excitations. FPA adiabatic excitation energies (T_0) and spectroscopic constants (r_e, r_0, B_e, B_0, D_e, H_e, ω_e, ν_0, α_e, ω_e x_e) were extrapolated to the valence complete basis set Douglas-Kroll (DK) aug-cc-pV∞Z-DK CCSDT level of theory, and additional treatments accounted for higher-order valence electron correlation, core correlation, and spin-orbit coupling. Due to the delicate interplay between dynamical and static electronic correlation, single reference coupled cluster theory is able to provide the correct ground electronic state (X ^4Σ^-), while multireference configuration interaction theory cannot. Perturbations from the first- and second-order spin-orbit coupling of low-lying states with quartet spin multiplicity reveal an immensely complex rotational spectrum relative to the isovalent species VO, VS, and TiCl. Computational data on the doublet manifold suggest that the lowest-lying doublet state (^2Γ) has a T_e of ~11,200 cm^-1. Overall, this study shows that laboratory and theoretical rotational spectroscopists must work more closely in tandem to better understand the bonding and structure of molecules containing transition metals.
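
    Focal point analyses of this kind typically extrapolate the Hartree-Fock and correlation energies separately in the basis-set cardinal number X; a commonly used pair of forms (an assumption here, not necessarily the exact formulas of this paper) is

        E_{\mathrm{HF}}(X) = E_{\mathrm{HF}}^{\infty} + a\, e^{-bX}, \qquad
        E_{\mathrm{corr}}(X) = E_{\mathrm{corr}}^{\infty} + c\, X^{-3},

    where X is the cardinal number of the aug-cc-pVXZ series and the fitted limits combine into the complete basis set estimate.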

  9. Fort Collins Science Center Ecosystem Dynamics branch--interdisciplinary research for addressing complex natural resource issues across landscapes and time

    USGS Publications Warehouse

    Bowen, Zachary H.; Melcher, Cynthia P.; Wilson, Juliette T.

    2013-01-01

    The Ecosystem Dynamics Branch of the Fort Collins Science Center offers an interdisciplinary team of talented and creative scientists with expertise in biology, botany, ecology, geology, biogeochemistry, physical sciences, geographic information systems, and remote-sensing, for tackling complex questions about natural resources. As demand for natural resources increases, the issues facing natural resource managers, planners, policy makers, industry, and private landowners are increasing in spatial and temporal scope, often involving entire regions, multiple jurisdictions, and long timeframes. Needs for addressing these issues include (1) a better understanding of biotic and abiotic ecosystem components and their complex interactions; (2) the ability to easily monitor, assess, and visualize the spatially complex movements of animals, plants, water, and elements across highly variable landscapes; and (3) the techniques for accurately predicting both immediate and long-term responses of system components to natural and human-caused change. The overall objectives of our research are to provide the knowledge, tools, and techniques needed by the U.S. Department of the Interior, state agencies, and other stakeholders in their endeavors to meet the demand for natural resources while conserving biodiversity and ecosystem services. Ecosystem Dynamics scientists use field and laboratory research, data assimilation, and ecological modeling to understand ecosystem patterns, trends, and mechanistic processes. This information is used to predict the outcomes of changes imposed on species, habitats, landscapes, and climate across spatiotemporal scales. The products we develop include conceptual models to illustrate system structure and processes; regional baseline and integrated assessments; predictive spatial and mathematical models; literature syntheses; and frameworks or protocols for improved ecosystem monitoring, adaptive management, and program evaluation. The descriptions

  10. Coevolution at protein complex interfaces can be detected by the complementarity trace with important impact for predictive docking

    PubMed Central

    Madaoui, Hocine; Guerois, Raphaël

    2008-01-01

    Protein surfaces are under significant selection pressure to maintain interactions with their partners throughout evolution. Capturing how selection pressure acts at the interfaces of protein–protein complexes is a fundamental issue with high interest for the structural prediction of macromolecular assemblies. We tackled this issue under the assumption that, throughout evolution, mutations should minimally disrupt the physicochemical compatibility between specific clusters of interacting residues. This constraint drove the development of the so-called Surface COmplementarity Trace in Complex History score (SCOTCH), which was found to discriminate with high efficiency the structure of biological complexes. SCOTCH performances were assessed not only with respect to other evolution-based approaches, such as conservation and coevolution analyses, but also with respect to statistically based scoring methods. Validated on a set of 129 complexes of known structure exhibiting both permanent and transient intermolecular interactions, SCOTCH appears as a robust strategy to guide the prediction of protein–protein complex structures. Of particular interest, it also provides a basic framework to efficiently track how protein surfaces could evolve while keeping their partners in contact. PMID:18511568

  11. Recommended Research Directions for Improving the Validation of Complex Systems Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vugrin, Eric D.; Trucano, Timothy G.; Swiler, Laura Painton

    Improved validation for models of complex systems has been a primary focus over the past year for the Resilience in Complex Systems Research Challenge. This document describes a set of research directions that are the result of distilling those ideas into three categories of research -- epistemic uncertainty, strong tests, and value of information. The content of this document can be used to transmit valuable information to future research activities, update the Resilience in Complex Systems Research Challenge's roadmap, inform the upcoming FY18 Laboratory Directed Research and Development (LDRD) call and research proposals, and facilitate collaborations between Sandia and external organizations. The recommended research directions can provide topics for collaborative research, development of proposals, workshops, and other opportunities.

  12. Performance of Random Effects Model Estimators under Complex Sampling Designs

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…
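
    To make the weighting idea concrete, here is a minimal, generic sketch of an inverse-probability-weighted (Hajek-type) mean. It is an illustration of the principle only, not one of the specific random-effects estimators compared in the article.

        import numpy as np

        def weighted_mean(y, selection_prob):
            """Weight each observation by the inverse of its selection
            probability, then normalize by the summed weights."""
            w = 1.0 / np.asarray(selection_prob)
            y = np.asarray(y)
            return np.sum(w * y) / np.sum(w)

        # High-y units were oversampled; the unweighted mean is biased upward,
        # while the weighted mean corrects for unequal selection probabilities.
        y = np.array([10.0, 12.0, 30.0, 32.0])
        p = np.array([0.1, 0.1, 0.5, 0.5])   # high-y units 5x more likely sampled
        print(y.mean())              # naive estimate: 21.0
        print(weighted_mean(y, p))   # design-weighted estimate: ~14.3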

  13. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration that facilitates concurrent system- and subsystem-level exploration and the generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
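
    The second ingredient, statistical experimentation with approximation, is commonly realized as a response-surface surrogate. The sketch below is generic rather than the dissertation's specific formulation: fit a quadratic surrogate to a handful of expensive simulation runs by least squares, then query the cheap surrogate during design exploration.

        import numpy as np

        # Fit a quadratic response surface y ~ b0 + b1*x + b2*x^2 to a few
        # expensive simulation runs, then evaluate the cheap surrogate instead.

        def fit_quadratic(x, y):
            X = np.column_stack([np.ones_like(x), x, x**2])   # design matrix
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coeffs

        def surrogate(coeffs, x):
            return coeffs[0] + coeffs[1] * x + coeffs[2] * x**2

        # Pretend these came from a handful of expensive subsystem simulations.
        x_samples = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
        y_samples = np.array([1.0, 0.65, 0.55, 0.70, 1.05])  # noisy bowl shape

        c = fit_quadratic(x_samples, y_samples)
        x_grid = np.linspace(0, 1, 101)
        x_best = x_grid[np.argmin(surrogate(c, x_grid))]
        print(f"surrogate minimum near x = {x_best:.2f}")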

  14. An electrostatic model for the determination of magnetic anisotropy in dysprosium complexes.

    PubMed

    Chilton, Nicholas F; Collison, David; McInnes, Eric J L; Winpenny, Richard E P; Soncini, Alessandro

    2013-01-01

    Understanding the anisotropic electronic structure of lanthanide complexes is important in areas as diverse as magnetic resonance imaging, luminescent cell labelling and quantum computing. Here we present an intuitive strategy based on a simple electrostatic method, capable of predicting the magnetic anisotropy of dysprosium(III) complexes, even in low symmetry. The strategy relies only on knowing the X-ray structure of the complex and the well-established observation that, in the absence of high symmetry, the ground state of dysprosium(III) is a doublet quantized along the anisotropy axis with an angular momentum quantum number mJ = ±15/2. The magnetic anisotropy axes of 14 low-symmetry monometallic dysprosium(III) complexes, computed via high-level ab initio calculations, are very well reproduced by our electrostatic model. Furthermore, we show that the magnetic anisotropy is equally well predicted in a selection of low-symmetry polymetallic complexes.
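
    A toy version of such an electrostatic approach, sketched under strong simplifying assumptions and not the authors' published model: treat the oblate mJ = ±15/2 density as a quadrupole and choose the quantization axis that minimizes its repulsion with the ligand point charges, scored here with the second Legendre polynomial P2.

        import numpy as np

        # Toy anisotropy-axis picker: the oblate electron density bulges
        # perpendicular to its axis, so repulsion toward a negative ligand
        # charge at angle theta from the axis is modeled (very crudely) as
        # |q| * (1 - P2(cos theta)) / r^3.

        def p2(c):
            return 0.5 * (3.0 * c**2 - 1.0)

        def anisotropy_axis(ligand_xyz, charge_mag, n_trials=20000, seed=0):
            """Brute-force scan over trial axes; Dy sits at the origin."""
            rng = np.random.default_rng(seed)
            axes = rng.normal(size=(n_trials, 3))
            axes /= np.linalg.norm(axes, axis=1, keepdims=True)
            r = np.linalg.norm(ligand_xyz, axis=1)
            cos_t = axes @ ligand_xyz.T / r
            energy = (charge_mag * (1.0 - p2(cos_t)) / r**3).sum(axis=1)
            return axes[np.argmin(energy)]

        # Two negative charges on +/-z: the oblate density avoids them best by
        # pointing its axis straight at them, so the axis should come out ~z.
        xyz = np.array([[0.0, 0.0, 2.3], [0.0, 0.0, -2.3]])
        axis = anisotropy_axis(xyz, charge_mag=np.array([1.0, 1.0]))
        print(axis)   # |z component| should be close to 1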

  15. An example of complex modelling in dentistry using Markov chain Monte Carlo (MCMC) simulation.

    PubMed

    Helfenstein, Ulrich; Menghini, Giorgio; Steiner, Marcel; Murati, Francesca

    2002-09-01

    In the usual regression setting, one regression line is computed for a whole data set. In a more complex situation, each person may be observed, for example, at several points in time, and thus a regression line might be calculated for each person. Additional complexities, such as various forms of errors in covariables, may make a straightforward statistical evaluation difficult or even impossible. During recent years, methods have been developed that allow convenient analysis of problems where the data and the corresponding models show these and many other forms of complexity. The methodology makes use of a Bayesian approach and Markov chain Monte Carlo (MCMC) simulations. The methods allow the construction of increasingly elaborate models by building them up from local sub-models. The essential structure of the models can be represented visually by directed acyclic graphs (DAG). This attractive property allows communication and discussion of the essential structure and the substantial meaning of a complex model without needing algebra. After presenting the statistical methods, an example from dentistry is given to demonstrate their application and use. The dataset of the example had a complex structure: each child in a set of children was followed up over several years, and the number of new fillings in permanent teeth was recorded at several ages. The dependent variables were markedly different from the normal distribution and could not be transformed to normality. In addition, explanatory variables were assumed to be measured with different forms of error. We illustrate how the corresponding models can be estimated conveniently via MCMC simulation, in particular 'Gibbs sampling', using the freely available software BUGS, and we explore how measurement error may influence the estimates of the corresponding coefficients. It is demonstrated that the effect of the independent variable on the dependent variable may be markedly…
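
    To make the Gibbs-sampling idea concrete, here is a minimal sampler in Python for a generic random-intercept model, y_ij ~ N(theta_i, sigma^2) with theta_i ~ N(mu, tau^2). It uses the standard conjugate full conditionals and, for brevity, fixes the variances; it illustrates the technique only and is not the article's BUGS model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated data: 5 children, 20 observations each, child-specific means.
        true_theta = rng.normal(10.0, 2.0, size=5)
        data = [rng.normal(t, 1.0, size=20) for t in true_theta]

        sigma2, tau2 = 1.0, 4.0        # fixed within/between variances (for brevity)
        mu, theta = 0.0, np.zeros(5)   # initial values
        draws = []

        for it in range(5000):
            # Full conditional of each theta_i: Normal (conjugate update).
            for i, y in enumerate(data):
                n = len(y)
                var = 1.0 / (n / sigma2 + 1.0 / tau2)
                mean = var * (y.sum() / sigma2 + mu / tau2)
                theta[i] = rng.normal(mean, np.sqrt(var))
            # Full conditional of mu given theta (flat prior): Normal.
            mu = rng.normal(theta.mean(), np.sqrt(tau2 / len(theta)))
            if it >= 1000:             # discard burn-in
                draws.append(mu)

        print("posterior mean of mu:", np.mean(draws))
        print("true grand mean:", true_theta.mean())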

  16. XML Encoding of Features Describing Rule-Based Modeling of Reaction Networks with Multi-Component Molecular Complexes

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2011-01-01

    Multi-state molecules and multi-component complexes are commonly involved in cellular signaling. Accounting for molecules that have multiple potential states, such as a protein that may be phosphorylated on multiple residues, and molecules that combine to form heterogeneous complexes located among multiple compartments, generates an effect of combinatorial complexity. Models involving relatively few signaling molecules can include thousands of distinct chemical species. Several software tools (StochSim, BioNetGen) are already available to deal with combinatorial complexity. Such tools need information standards if models are to be shared, jointly evaluated and developed. Here we discuss XML conventions that can be adopted for modeling biochemical reaction networks described by user-specified reaction rules. These could form a basis for possible future extensions of the Systems Biology Markup Language (SBML). PMID:21464833
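
    As a flavor of what such an encoding could look like, the sketch below serializes one reaction rule, phosphorylation of a receptor residue when a ligand site is bound, using Python's standard library. The element and attribute names are hypothetical illustrations, not the conventions proposed in the paper or defined by SBML.

        import xml.etree.ElementTree as ET

        # Hypothetical XML layout for one reaction rule; names are illustrative.
        rule = ET.Element("reactionRule", id="R1", name="phosphorylate_when_bound")

        reactant = ET.SubElement(rule, "reactantPattern")
        mol = ET.SubElement(reactant, "molecule", name="Receptor")
        ET.SubElement(mol, "component", name="ligand_site", state="bound")
        ET.SubElement(mol, "component", name="Y1068", state="unphosphorylated")

        product = ET.SubElement(rule, "productPattern")
        mol_p = ET.SubElement(product, "molecule", name="Receptor")
        ET.SubElement(mol_p, "component", name="ligand_site", state="bound")
        ET.SubElement(mol_p, "component", name="Y1068", state="phosphorylated")

        ET.SubElement(rule, "rateLaw", type="massAction", k="0.5")

        ET.indent(rule)  # pretty-print (Python 3.9+)
        print(ET.tostring(rule, encoding="unicode"))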

  17. Using iMCFA to Perform the CFA, Multilevel CFA, and Maximum Model for Analyzing Complex Survey Data.

    PubMed

    Wu, Jiun-Yu; Lee, Yuan-Hsuan; Lin, John J H

    2018-01-01

    We provide iMCFA (integrated Multilevel Confirmatory Analysis) to construct CFA, MCFA, and maximum MCFA models with LISREL v.8 and below and to examine the potential multilevel factorial structure in complex survey data. Modeling multilevel structure for complex survey data is complicated because building a multilevel model is not an infallible statistical strategy unless the hypothesized model is close to the real data structure. Methodologists have suggested using different modeling techniques to investigate the potential multilevel structure of survey data. Using iMCFA, researchers can visually set the between- and within-level factorial structure to fit MCFA, CFA, and/or MAX MCFA models for complex survey data. iMCFA can then yield between- and within-level variance-covariance matrices, calculate intraclass correlations, perform the analyses, and generate the outputs for the respective models. The summary of the analytical outputs from LISREL is gathered and tabulated for further model comparison and interpretation. iMCFA also provides LISREL syntax of the different models for researchers' future use. An empirical and a simulated multilevel dataset, with complex and simple structures at the within or between level, were used to illustrate the usability and effectiveness of the iMCFA procedure for analyzing complex survey data. The analytic results of iMCFA using Muthén's limited-information estimator were compared with those of Mplus using full-information maximum likelihood regarding the effectiveness of different estimation methods.
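
    The variance decomposition that underlies MCFA can be sketched directly. The generic computation below (not the iMCFA tool itself) splits the sample covariance into a pooled within-cluster matrix and a between-cluster matrix and derives rough per-variable intraclass correlations for a balanced design.

        import numpy as np

        def mcfa_matrices(X, groups):
            """Pooled within-cluster (S_pw) and between-cluster (S_b)
            covariance matrices for rows X (n x p) with cluster labels
            `groups`, plus per-variable ICCs assuming a balanced design."""
            X = np.asarray(X, dtype=float)
            labels = np.unique(groups)
            n, p = X.shape
            G = len(labels)
            S_pw, S_b = np.zeros((p, p)), np.zeros((p, p))
            grand = X.mean(axis=0)
            sizes = []
            for g in labels:
                Xg = X[groups == g]
                sizes.append(len(Xg))
                d_w = Xg - Xg.mean(axis=0)
                S_pw += d_w.T @ d_w
                d_b = (Xg.mean(axis=0) - grand)[:, None]
                S_b += len(Xg) * (d_b @ d_b.T)
            S_pw /= (n - G)
            S_b /= (G - 1)
            c = np.mean(sizes)                  # common cluster size (balanced case)
            var_between = np.clip(np.diag(S_b - S_pw) / c, 0, None)
            icc = var_between / (var_between + np.diag(S_pw))
            return S_pw, S_b, icc

        # 30 clusters of 10, equal between/within variance -> ICCs near 0.5.
        rng = np.random.default_rng(0)
        means = rng.normal(0, 1.0, size=(30, 2))
        X = np.vstack([m + rng.normal(0, 1.0, size=(10, 2)) for m in means])
        groups = np.repeat(np.arange(30), 10)
        _, _, icc = mcfa_matrices(X, groups)
        print("ICCs:", icc)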

  18. A GIS-based atmospheric dispersion model for pollutants emitted by complex source areas.

    PubMed

    Teggi, Sergio; Costanzini, Sofia; Ghermandi, Grazia; Malagoli, Carlotta; Vinceti, Marco

    2018-01-01

    Gaussian dispersion models are widely used to simulate the concentrations and deposition fluxes of pollutants emitted by source areas. Very often, calculation time limits the number of sources and receptors, and the geometry of the sources must be simple and without holes. This paper presents CAREA, a new GIS-based Gaussian model for complex source areas. CAREA is coded in the Python language and is largely based on a simplified formulation of the very popular and well-recognized AERMOD model. The model allows users to define, in a GIS environment, thousands of gridded or scattered receptors and thousands of complex sources with hundreds of vertices and holes. CAREA computes ground-level, or near-ground-level, concentrations and dry deposition fluxes of pollutants. The input/output and the runs of the model can be managed completely in a GIS environment (e.g. inside a GIS project). The paper presents the CAREA formulation and its application to very complex test cases. The tests show that processing times are satisfactory and that the definition of sources and receptors and the retrieval of output are quite easy in a GIS environment. CAREA and AERMOD are compared using simple and reproducible test cases. The comparison shows that CAREA satisfactorily reproduces AERMOD simulations and is considerably faster than AERMOD. Copyright © 2017 Elsevier B.V. All rights reserved.
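
    The Gaussian point-source kernel at the heart of such models is compact enough to state directly. The sketch below is the textbook plume equation with full reflection at the ground, not CAREA's area-source formulation; in practice sigma_y and sigma_z grow with downwind distance according to stability-class curves, and here they are simply assumed.

        import numpy as np

        def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
            """Textbook Gaussian plume concentration at crosswind offset y and
            height z: emission rate q (g/s), wind speed u (m/s), effective
            stack height h (m), with full reflection at the ground surface."""
            lateral = np.exp(-y**2 / (2 * sigma_y**2))
            vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                        + np.exp(-(z + h)**2 / (2 * sigma_z**2)))
            return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

        # Ground-level concentration on the plume centerline for a 20 m stack;
        # the dispersion parameters below are assumed, not computed.
        print(gaussian_plume(q=1.0, u=3.0, y=0.0, z=0.0,
                             h=20.0, sigma_y=35.0, sigma_z=18.0))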

  19. Large eddy simulation modeling of particle-laden flows in complex terrain

    NASA Astrophysics Data System (ADS)

    Salesky, S.; Giometto, M. G.; Chamecki, M.; Lehning, M.; Parlange, M. B.

    2017-12-01

    The transport, deposition, and erosion of heavy particles over complex terrain in the atmospheric boundary layer is an important process for hydrology, air quality forecasting, biology, and geomorphology. However, in situ observations can be challenging in complex terrain due to spatial heterogeneity. Furthermore, there is a need to develop numerical tools that can accurately represent the physics of these multiphase flows over complex surfaces. We present a new numerical approach to accurately model the transport and deposition of heavy particles in complex terrain using large eddy simulation (LES). Particle transport is represented through solution of the advection-diffusion equation including terms that represent gravitational settling and inertia. The particle conservation equation is discretized in a cut-cell finite volume framework in order to accurately enforce mass conservation. Simulation results will be validated with experimental data, and numerical considerations required to enforce boundary conditions at the surface will be discussed. Applications will be presented in the context of snow deposition and transport, as well as urban dispersion.
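
    A one-dimensional finite-volume toy of the governing balance (a sketch of generic advection-diffusion with settling, not the authors' cut-cell LES code): the particle concentration in a vertical column evolves under a constant settling velocity and eddy diffusivity, with fluxes evaluated at cell faces so that mass is conserved up to deposition through the bottom boundary.

        import numpy as np

        # 1D column: dc/dt = -d/dz(w_s c) + d/dz(K dc/dz), finite volume.
        nz, dz, dt = 100, 0.1, 0.01
        w_s, K = 0.05, 0.01              # settling velocity (m/s), diffusivity (m^2/s)
        z = (np.arange(nz) + 0.5) * dz
        c = np.exp(-((z - 5.0) ** 2))    # initial particle cloud at 5 m

        for _ in range(2000):
            f_adv = -w_s * c             # advective flux (settling is downward)
            f_dif = -K * np.diff(c) / dz # diffusive flux at interior faces
            flux = np.zeros(nz + 1)
            flux[1:-1] = f_adv[1:] + f_dif   # upwind cell is the one above
            flux[0] = -w_s * c[0]            # deposition out of the bottom cell
            c -= dt * np.diff(flux) / dz     # conservative update

        print("remaining suspended mass:", c.sum() * dz)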

  20. Interactive Visualization of Complex Seismic Data and Models Using Bokeh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, Chengping; Ammon, Charles J.; Maceira, Monica

    Visualizing multidimensional data and models becomes more challenging as the volume and resolution of seismic data and models increase. Thanks to the development of powerful and accessible computer systems, however, a modern web browser can be used to visualize complex scientific data and models dynamically. In this paper, we present four examples of seismic model visualization using the open-source Python package Bokeh. One example is a visualization of a surface-wave dispersion data set, another presents a view of three-component seismograms, and two illustrate methods to explore a 3D seismic-velocity model. Unlike other 3D visualization packages, our visualization approach imposes minimal requirements on users and is relatively easy to develop, provided the developer has reasonable programming skills. Finally, utilizing familiar web-browsing interfaces, the dynamic tools provide an effective and efficient approach to exploring large data sets and models.
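
    A minimal Bokeh example, ours rather than one of the four visualizations from the paper, shows how little code a browser-based interactive plot requires; the pan, zoom, and save toolbar comes for free:

        import numpy as np
        from bokeh.plotting import figure, output_file, show

        # A toy 'seismogram': interactive tools are attached by default.
        t = np.linspace(0, 60, 3000)                      # time (s)
        trace = np.exp(-t / 20) * np.sin(2 * np.pi * t)   # decaying oscillation

        p = figure(title="Synthetic seismogram", width=800, height=300,
                   x_axis_label="time (s)", y_axis_label="amplitude")
        p.line(t, trace, line_width=1)

        output_file("seismogram.html")   # self-contained HTML, viewable anywhere
        show(p)                          # open in the default web browser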