Toward Modeling the Intrinsic Complexity of Test Problems
ERIC Educational Resources Information Center
Shoufan, Abdulhadi
2017-01-01
The concept of intrinsic complexity explains why different problems of the same type, tackled by the same problem solver, can require different times to solve and yield solutions of different quality. This paper proposes a general four-step approach that can be used to establish a model for the intrinsic complexity of a problem class in terms of…
ERIC Educational Resources Information Center
Nelson, Tenneisha; Squires, Vicki
2017-01-01
Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…
NASA Astrophysics Data System (ADS)
Sun, Zhiyong; Hao, Lina; Song, Bo; Yang, Ruiguo; Cao, Ruimin; Cheng, Yu
2016-10-01
Micro/nano positioning technologies have been attractive for decades for their various applications in both industrial and scientific fields. The actuators employed in these technologies are typically smart material actuators, which possess inherent hysteresis that may cause systems to behave unexpectedly. Periodic reference tracking capability is fundamental for apparatuses such as the scanning probe microscope, which employs smart material actuators to generate periodic scanning motion. However, traditional controllers such as the PID method cannot guarantee accurate, fast periodic scanning motion. To tackle this problem, and to enable practical implementation in digital devices, this paper proposes a novel control method named the discrete extended unparallel Prandtl-Ishlinskii model based internal model (d-EUPI-IM) control approach. To tackle modeling uncertainties, the robust d-EUPI-IM control approach is investigated, and the associated sufficient stabilizing conditions are derived. The advantages of the proposed controller are: it is designed and represented in discrete form, and is thus practical for implementation in digital devices; the extended unparallel Prandtl-Ishlinskii model can precisely represent forward/inverse complex hysteretic characteristics, thus reducing modeling uncertainties and benefiting controller design; and the internal model principle based control module can be utilized as a natural oscillator for tackling the periodic reference tracking problem. The proposed controller was verified through comparative experiments on a piezoelectric actuator platform, and convincing results were achieved.
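The classical Prandtl-Ishlinskii model that this abstract builds on can be sketched as a weighted superposition of discrete play (backlash) operators. The following is a minimal illustration of that classical form only, not the paper's extended unparallel d-EUPI variant; the thresholds and weights are invented for the sketch:

```python
import numpy as np

def play_operator(u, r, y0=0.0):
    """Discrete play (backlash) operator with threshold r: the output is
    dragged along only once the input moves more than r away from it."""
    y = np.empty(len(u))
    prev = y0
    for k, uk in enumerate(u):
        prev = min(max(prev, uk - r), uk + r)   # clamp into [uk - r, uk + r]
        y[k] = prev
    return y

def pi_model(u, thresholds, weights, y0=0.0):
    """Classical Prandtl-Ishlinskii output: weighted sum of play operators."""
    return sum(w * play_operator(u, r, y0) for w, r in zip(weights, thresholds))

# Driving a triangle wave up and then down leaves the output lagging the
# input by up to r in each direction: the hysteresis loop a controller
# for a smart material actuator has to compensate.
u = np.concatenate([np.linspace(0, 1, 50), np.linspace(1, 0, 50)])
y = pi_model(u, thresholds=[0.0, 0.1, 0.3], weights=[1.0, 0.5, 0.25])
```

Sweeping the input up and down through the same value yields different outputs, which is exactly the memory effect that makes PID alone insufficient for fast periodic tracking.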
Using VCL as an Aspect-Oriented Approach to Requirements Modelling
NASA Astrophysics Data System (ADS)
Amálio, Nuno; Kelsen, Pierre; Ma, Qin; Glodt, Christian
Software systems are becoming larger and more complex. By tackling the modularisation of crosscutting concerns, aspect orientation draws attention to modularity as a means to address the problems of scalability, complexity and evolution in software systems development. Aspect-oriented modelling (AOM) applies aspect-orientation to the construction of models. Most existing AOM approaches are designed without a formal semantics, and use multi-view partial descriptions of behaviour. This paper presents an AOM approach based on the Visual Contract Language (VCL): a visual language for abstract and precise modelling, designed with a formal semantics, and comprising a novel approach to visual behavioural modelling based on design by contract, where behavioural descriptions are total. By applying VCL to a large case study of a car-crash crisis management system, the paper demonstrates how the modularity of VCL's constructs, at different levels of granularity, helps to tackle complexity. In particular, it shows how VCL's package construct and its associated composition mechanisms are key in supporting separation of concerns, coarse-grained problem decomposition and aspect-orientation. The case study's modelling solution has a clear and well-defined modular structure; the backbone of this structure is a collection of packages encapsulating local solutions to concerns.
The Emergence of an Amplified Mindset of Design: Implications for Postgraduate Design Education
ERIC Educational Resources Information Center
Moreira, Mafalda; Murphy, Emma; McAra-McWilliam, Irene
2016-01-01
In a global scenario of complexity, research shows that emerging design practices are changing and expanding, creating a complex and ambiguous disciplinary landscape. This directly impacts on the field of design education, calling for new, flexible models able to tackle future practitioners' needs, unknown markets and emergent societal cultures.…
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.
Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan
2016-04-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high-throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field.
Training young scientists across empirical and modeling approaches
NASA Astrophysics Data System (ADS)
Moore, D. J.
2014-12-01
The "fluxcourse" is a two-week program of study on Flux Measurements and Advanced Modeling (www.fluxcourse.org). Since 2007, this course has trained early career scientists to use both empirical observations and models to tackle terrestrial ecological questions. The fluxcourse seeks to cross-train young scientists in measurement techniques and advanced modeling approaches for quantifying carbon and water fluxes between the atmosphere and the biosphere. Depending on the year, we invited between ten and twenty volunteer instructors ranging in experience and expertise, including representatives from industry, university professors, and research specialists. The course combines online learning, lectures and discussion with hands-on activities that range from measuring photosynthesis and installing an eddy covariance system to wrangling data and carrying out modeling experiments. Attendees are asked to develop and present two different group projects throughout the course. The overall goal is to provide the next generation of scientists with the tools to tackle complex problems that require collaboration.
Virtual Construction of Space Habitats: Connecting Building Information Models (BIM) and SysML
NASA Technical Reports Server (NTRS)
Polit-Casillas, Raul; Howe, A. Scott
2013-01-01
Current trends in the design, construction and management of complex projects make use of Building Information Models (BIM), connecting different types of data to geometrical models. This information model allows different types of analysis beyond pure graphical representation. Space habitats, regardless of their size, are also complex systems that require the synchronization of many types of information and disciplines beyond mass, volume, power or other basic volumetric parameters. For this, state-of-the-art model-based systems engineering languages and processes - for instance, SysML - represent a solid way to tackle this problem from a programmatic point of view. Nevertheless, integrating this with a powerful geometrical architectural design tool with BIM capabilities could represent a change in the workflow and paradigm of space habitat design, applicable to other complex aerospace systems. This paper shows some general findings and overall conclusions based on ongoing research to create a design protocol and method that practically connects a systems engineering approach with BIM architectural and engineering design as a complete Model Based Engineering approach. A hypothetical example is therefore created and followed during the design process. To make this possible, the research also tackles the application of IFC categories and parameters in the aerospace field, starting with space habitat design, as a way to understand the information flow between disciplines and tools. By building virtual space habitats we can potentially improve, in the near future, the way more complex designs are developed from very little detail, from concept to manufacturing.
Epidemic modeling in complex realities.
Colizza, Vittoria; Barthélemy, Marc; Barrat, Alain; Vespignani, Alessandro
2007-04-01
In our global world, the increasing complexity of social relations and transport infrastructures is a key factor in the spread of epidemics. In recent years, the increasing availability of computer power has made it possible both to obtain reliable data quantifying the complexity of the networks on which epidemics may propagate and to envision computational tools able to tackle the analysis of such propagation phenomena. These advances have exposed the limits of homogeneous assumptions and simple spatial diffusion approaches, and have stimulated the inclusion of complex features and heterogeneities relevant to the description of epidemic diffusion. In this paper, we review recent progress in integrating complex systems and network analysis with epidemic modelling, and focus on the impact of the various complex features of real systems on the dynamics of epidemic spreading.
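As a toy illustration of why network heterogeneity matters here, the following is a minimal discrete-time stochastic SIR simulation on an explicit contact network. The topologies, rates, and seeding below are invented for the sketch and are not from the paper:

```python
import random

def sir_on_network(adj, beta, gamma, seed_node, steps, rng):
    """Discrete-time stochastic SIR on a contact network (adjacency lists).

    beta: per-contact, per-step infection probability
    gamma: per-step recovery probability
    """
    state = {v: "S" for v in adj}
    state[seed_node] = "I"
    for _ in range(steps):
        infections, recoveries = [], []
        for v, s in state.items():
            if s != "I":
                continue
            for nb in adj[v]:
                if state[nb] == "S" and rng.random() < beta:
                    infections.append(nb)
            if rng.random() < gamma:
                recoveries.append(v)
        for v in infections:
            state[v] = "I"
        for v in recoveries:
            state[v] = "R"
    return state

# A hub topology (one highly connected node) lets one step reach everyone,
# whereas a homogeneous chain of the same size spreads only one hop per step:
# the degree heterogeneity that homogeneous-mixing models miss.
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

Running both networks from the same seed node makes the effect of the contact structure on spreading speed directly visible.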
Tackle and impact detection in elite Australian football using wearable microsensor technology.
Gastin, Paul B; McLean, Owen C; Breed, Ray V P; Spittle, Michael
2014-01-01
The effectiveness of a wearable microsensor device (MinimaxX™ S4, Catapult Innovations, Melbourne, VIC, Australia) to automatically detect tackles and impact events in elite Australian football (AF) was assessed during four matches. Video observation was used as the criterion measure. A total of 352 tackles were observed, with 78% correctly detected as tackles by the manufacturer's software. Tackles against (i.e. tackled by an opponent) were more accurately detected than tackles made (90% vs. 66%). Of the 77 tackles that were not detected at all, the majority (74%) were categorised as low-intensity. In contrast, a total of 1510 "tackle" events were detected, with only 18% of these verified as tackles. A further 57% were from contested ball situations involving player contact. The remaining 25% were in general play where no contact was evident; these were significantly lower in peak Player Load™ than those involving player contact (P < 0.01). The tackle detection algorithm, developed primarily for rugby, was not suitable for tackle detection in AF. The underlying sensor data may have the potential to detect a range of events within contact sports such as AF, yet to do so is a complex task and requires sophisticated sport- and event-specific algorithms.
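The manufacturer's detection algorithm is proprietary, but the generic shape of such an event detector can be sketched as threshold crossing on the resultant acceleration with a refractory (dead-time) window. The threshold, sampling rate, and dead time below are invented for illustration and are not Catapult's:

```python
import numpy as np

def detect_impacts(accel, fs, threshold_g=5.0, dead_time_s=0.5):
    """Return sample indices where the resultant acceleration crosses a
    threshold, ignoring re-triggers inside a refractory dead-time window."""
    mag = np.linalg.norm(accel, axis=1)      # resultant of the x/y/z axes, in g
    dead = int(dead_time_s * fs)
    events, last = [], -dead
    for i, m in enumerate(mag):
        if m >= threshold_g and i - last >= dead:
            events.append(i)
            last = i
    return events

# Synthetic 100 Hz trace: quiet 1 g baseline with three spikes; the second
# spike falls inside the dead time of the first and is suppressed.
fs = 100
accel = np.tile([0.0, 0.0, 1.0], (300, 1))
accel[50] = [0.0, 0.0, 8.0]
accel[55] = [0.0, 0.0, 9.0]
accel[200] = [6.0, 0.0, 0.0]
events = detect_impacts(accel, fs)
```

A detector of this shape cannot, on its own, distinguish a tackle from any other hard contact, which is consistent with the high false-positive rate the study reports.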
Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion
NASA Astrophysics Data System (ADS)
Li, Z.; Ghaith, M.
2017-12-01
Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
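A chaos expansion of the kind described can be sketched with probabilists' Hermite polynomials in a standard normal variable. This toy projects a scalar model onto the basis by Gauss-Hermite quadrature, standing in for the paper's collocation on a full hydrological model; the example model is invented:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coeffs(model, order, n_quad=20):
    """Project y = model(xi), with xi ~ N(0, 1), onto probabilists' Hermite
    polynomials He_0..He_order: c_k = E[y * He_k(xi)] / k!, with the
    expectation taken by Gauss-Hermite quadrature."""
    x, w = hermegauss(n_quad)        # nodes/weights for weight exp(-x**2 / 2)
    w = w / np.sqrt(2 * np.pi)       # renormalise to the standard normal pdf
    y = model(x)
    coeffs = []
    for k in range(order + 1):
        He_k = hermeval(x, [0.0] * k + [1.0])
        coeffs.append(np.sum(w * y * He_k) / math.factorial(k))  # E[He_k**2] = k!
    return np.array(coeffs)

# A model that is linear in xi is recovered exactly, y = 2 + 3*xi -> c = [2, 3, 0, 0];
# hermeval(xi, c) then evaluates the cheap surrogate at any new xi, which is what
# makes a PCE a fast proxy for the underlying model.
c = pce_coeffs(lambda xi: 2.0 + 3.0 * xi, order=3)
```

Once the coefficients are known, moments of the output come for free (the mean is c[0], the variance is a weighted sum of squared higher coefficients), which is why the approach is cheaper than MCMC sampling of the full model.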
Dynamical analysis of the global business-cycle synchronization.
Lopes, António M; Tenreiro Machado, J A; Huffstot, John S; Mata, Maria Eugénia
2018-01-01
This paper reports a dynamical analysis of the business cycles of 12 (developed and developing) countries over the last 56 years, applying computational techniques used for tackling complex systems. The results reveal long-term convergence and country-level interconnections arising from close contagion effects caused by bilateral networking exposure. Interconnectivity determines the magnitude of cross-border impacts. Local features and shock propagation complexity may also be true engines for the local configuration of cycles. The algorithmic modeling proves to be a solid approach to studying the complex dynamics involved in the world economies.
Trusted Autonomy: Concept Development in Technology Foresight
2015-09-01
(DST-Group-TR-3153) executing the response. Previous models explored autonomy through the lenses of ... with users to learn to tackle complex problems; robots are able to internalise and use world models; controllers have multiple modules based on learning ... technology across society. It aims to describe usage or uptake, and evolving trends, in technological development over time. Through doing so, it seeks to
Qi, Xiao-Wen; Zhang, Jun-Ling; Zhao, Shu-Ping; Liang, Chang-Yong
2017-10-02
In order to be prepared against potential balance-breaking risks affecting economic development, more and more countries have recognized emergency response solutions evaluation (ERSE) as an indispensable activity in their governance of sustainable development. Traditional multiple criteria group decision making (MCGDM) approaches to ERSE have been facing the simultaneously challenging characteristics of decision hesitancy and prioritization relations among assessing criteria, due to the complexity of practical ERSE problems. Therefore, aiming at the special type of ERSE problems that exhibit these two characteristics, we investigate effective MCGDM approaches by employing the interval-valued dual hesitant fuzzy set (IVDHFS) to comprehensively depict decision hesitancy. To exploit decision information embedded in prioritization relations among criteria, we first define a fuzzy entropy measure for IVDHFS so that its derivative decision models can avoid the potential information distortion of models based on classic IVDHFS distance measures with a subjective supplementing mechanism; further, based on the defined entropy measure, we develop two fundamental prioritized operators for IVDHFS by extending Yager's prioritized operators. Furthermore, on the strength of the above methods, we construct two hesitant fuzzy MCGDM approaches to tackle complex scenarios with or without known weights for decision makers, respectively. Finally, case studies have been conducted to show the effectiveness and practicality of our proposed approaches.
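For crisp (non-fuzzy) satisfaction degrees, the Yager prioritized averaging operator that the paper extends to IVDHFS can be sketched as follows; the criterion scores are invented for illustration:

```python
def prioritized_average(scores):
    """Yager's prioritized averaging operator for crisp satisfaction degrees
    in [0, 1], listed from highest to lowest priority: criterion i gets
    importance T_i = product of the satisfactions of all higher-priority
    criteria (T_1 = 1), then a normalised weighted average is taken."""
    T, Ts = 1.0, []
    for s in scores:
        Ts.append(T)
        T *= s
    return sum(t * s for t, s in zip(Ts, scores)) / sum(Ts)

# Poor performance on a high-priority criterion discounts everything below it,
# so the same two scores aggregate differently under the two orderings.
a = prioritized_average([1.0, 0.5])   # top criterion fully satisfied
b = prioritized_average([0.5, 1.0])   # top criterion only half satisfied
```

This order sensitivity is what lets the operator encode prioritization relations among criteria without requiring the decision maker to supply explicit weights.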
Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine.
Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner
2016-01-01
Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.
Transforming patient experience: health web science meets medicine 2.0.
McHattie, Lynn-Sayers; Cumming, Grant; French, Tara
2014-01-01
Until recently, the Western biomedical paradigm has been effective in delivering health care; however, this model is not positioned to tackle complex societal challenges or to solve the current problems facing health care and its delivery. The future of medicine requires a shift to a patient-centric model, and in so doing the Internet has a significant role to play. The disciplines of Health Web Science and Medicine 2.0 are pivotal to this approach. This viewpoint paper argues that these disciplines, together with the field of design, can tackle these challenges. Drawing together ideas from design practice and research, complexity theory, and participatory action research, we depict design as an approach that is fundamentally social and linked to concepts of person-centered care. We discuss the role of design, specifically co-design, in understanding the social, psychological, and behavioral dimensions of illness and the implications for the design of future care towards transforming the patient experience. This paper builds on the presentations and subsequent interdisciplinary dialogue that developed from the panel session "Transforming Patient Experience: Health Web Science Meets Web 2.0" at the 2013 Medicine 2.0 conference in London.
NASA Astrophysics Data System (ADS)
White, Irene; Lorenzi, Francesca
2016-12-01
Creativity has been emerging as a key concept in educational policies since the mid-1990s, with many Western countries restructuring their education systems to embrace innovative approaches likely to stimulate creative and critical thinking. But despite current intentions of putting more emphasis on creativity in education policies worldwide, there is still a relative dearth of viable models which capture the complexity of creativity and the conditions for its successful infusion into formal school environments. The push for creativity is in direct conflict with the results-driven, competitive, performance-oriented culture which continues to dominate formal education systems. The authors of this article argue that incorporating creativity into mainstream education is a complex task and is best tackled by taking a systematic and multifaceted approach. They present a multidimensional model designed to help educators in tackling the challenges of the promotion of creativity. Their model encompasses three distinct yet interrelated dimensions of a creative space: physical, social-emotional and critical. The authors use the metaphor of space to refer to the interplay of the three identified dimensions. Drawing on confluence approaches to the theorisation of creativity, this paper exemplifies the development of a model against the background of a growing trend towards systems theories. The aim of the model is to be helpful in systematising creativity by offering parameters, derived from the evaluation of an example offered by a non-formal educational environment, for the development of creative environments within mainstream secondary schools.
The QSAR study of flavonoid-metal complexes scavenging the ·OH free radical
NASA Astrophysics Data System (ADS)
Wang, Bo-chu; Qian, Jun-zhen; Fan, Ying; Tan, Jun
2014-10-01
Flavonoid-metal complexes have antioxidant activities. However, the quantitative structure-activity relationship (QSAR) between flavonoid-metal complexes and their antioxidant activities has still not been tackled. On the basis of 21 structures of flavonoid-metal complexes and their antioxidant activities for scavenging the ·OH free radical, we optimised their structures using the Gaussian 03 software package, and subsequently calculated and chose 18 quantum chemistry descriptors such as dipole, charge and energy. We then selected the quantum chemistry descriptors most important to the IC50 of flavonoid-metal complexes for scavenging the ·OH free radical through stepwise linear regression, and obtained 4 new variables through principal component analysis. Finally, we built QSAR models with those important quantum chemistry descriptors and the 4 new variables as the independent variables and the IC50 as the dependent variable, using an Artificial Neural Network (ANN), and validated the two models against experimental data. These results show that the two models in this paper are reliable and predictive.
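The descriptor-reduction-plus-regression stage of such a QSAR pipeline can be sketched as principal components (via SVD) feeding a least-squares fit. This stands in for the paper's stepwise regression and ANN, and the "descriptors" and "IC50" values below are synthetic:

```python
import numpy as np

def fit_pca_regression(X, y, n_components):
    """Fit y on the leading principal components of descriptor matrix X
    (rows: compounds, columns: descriptors); returns a predictor function."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components]                      # top principal directions
    scores = Xc @ V.T                          # PC scores of each compound
    A = np.column_stack([scores, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(X_new):
        s = (X_new - mean) @ V.T
        return np.column_stack([s, np.ones(len(X_new))]) @ coef

    return predict

# Synthetic "quantum chemistry descriptors" and a linear "IC50" response.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 3.0
predict_ic50 = fit_pca_regression(X, y, n_components=4)
```

Replacing the final linear fit with a small neural network, as the paper does, follows the same train/validate pattern but can also capture nonlinear descriptor-activity relationships.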
Enhancing metaproteomics-The value of models and defined environmental microbial systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik
Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new features to study complex microbial communities in order to unravel these black boxes. Some new technical challenges arose that were not an issue for classical proteome analytics before, and these could be tackled by the application of different model systems. Here, we review different current and future model systems for metaproteome analysis. We introduce model systems for clinical and biotechnological research questions including acid mine drainage, anaerobic digesters, and activated sludge, following a short introduction to microbial communities and metaproteomics. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, or reliable protein extraction. Moreover, the implementation of model systems can be considered a step forward to better understand microbial community responses and the ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.
2007-12-01
Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.
Engineering and control of biological systems: A new way to tackle complex diseases.
Menolascina, Filippo; Siciliano, Velia; di Bernardo, Diego
2012-07-16
The ongoing merger of engineering and biology has given rise to the emerging field of synthetic biology. The defining features of this new discipline are the abstraction and standardisation of biological parts, decoupling between parts to prevent undesired cross-talk, and the application of quantitative modelling of synthetic genetic circuits in order to guide their design. Most of the efforts in the field of synthetic biology in the last decade have been devoted to the design and development of functional gene circuits in prokaryotes and unicellular eukaryotes. Researchers have used synthetic biology not only to engineer new functions in the cell, but also to build simpler models of endogenous gene regulatory networks to gain knowledge of the "rules" governing their wiring diagram. However, the need for innovative approaches to study and modify complex signalling and regulatory networks in mammalian cells and multicellular organisms has prompted advances in synthetic biology in these species as well, thus contributing to the development of innovative ways to tackle human diseases. In this work, we review the latest progress in synthetic biology and the most significant developments achieved so far, both in unicellular and multicellular organisms, with emphasis on human health. Copyright © 2012 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
Reliability of a k-out-of-n: G System with Identical Repairable Elements
NASA Astrophysics Data System (ADS)
Sharifi, M.; Nia, A. Torabi; Shafie, P.; Norozi-Zare, F.; Sabet-Ghadam, A.
2009-09-01
k-out-of-n models are among the most useful models for calculating the reliability of complex systems such as electrical and mechanical devices. In this paper, we consider a k-out-of-n: G system with identical elements. The failure rate of each element is constant. The elements are repairable, and the repair rate of each element is constant. The system works when at least k elements work. The system of equations is established and solved for parameters such as the MTTF in real-time situations. It appears that this model can tackle more realistic situations.
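As an illustration of the kind of computation such a model supports, the steady-state availability of a k-out-of-n:G system with identical repairable elements can be sketched as follows. This is a minimal sketch, not the authors' model: it assumes each element fails at constant rate lam, is repaired at constant rate mu by its own repair crew, and behaves independently, so each element is up with probability mu/(lam + mu).

```python
from math import comb

def element_availability(lam, mu):
    # steady-state probability that a single repairable element is working,
    # with constant failure rate lam and constant repair rate mu
    return mu / (lam + mu)

def system_availability(n, k, lam, mu):
    # k-out-of-n:G system: works when at least k of n identical,
    # independently repaired elements are up (binomial tail probability)
    a = element_availability(lam, mu)
    return sum(comb(n, j) * a**j * (1 - a)**(n - j) for j in range(k, n + 1))
```

With dependent repairs (e.g., a single shared repair crew), the binomial form no longer holds and the birth-death system of equations from the paper must be solved instead.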
2013-01-01
Background A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Discussion Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a ‘complex intervention’ (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Summary Answers to the complex problem of multi-morbidity won’t come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity. PMID:23919296
Reeve, Joanne; Blakeman, Tom; Freeman, George K; Green, Larry A; James, Paul A; Lucassen, Peter; Martin, Carmel M; Sturmberg, Joachim P; van Weel, Chris
2013-08-07
A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a 'complex intervention' (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Answers to the complex problem of multi-morbidity won't come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ang; Song, Shuaiwen; Brugel, Eric
To continuously comply with Moore's Law, modern parallel machines have become increasingly complex. Effectively tuning application performance for these machines has therefore become a daunting task. Moreover, identifying performance bottlenecks at the application and architecture levels, as well as evaluating various optimization strategies, becomes extremely difficult when numerous correlated factors are entangled. To tackle these challenges, we present a visual analytical model named "X". It is intuitive and sufficiently flexible to track all the typical features of a parallel machine.
Neuroanatomy and transgenic technologies
USDA-ARS?s Scientific Manuscript database
This is a short review that introduces recent advances of neuroanatomy and transgenic technologies. The anatomical complexity of the nervous system remains a subject of tremendous fascination among neuroscientists. In order to tackle this extraordinary complexity, powerful transgenic technologies a...
A brief introduction to mixed effects modelling and multi-model inference in ecology
Donaldson, Lynda; Correa-Cano, Maria Eugenia; Goodwin, Cecily E.D.
2018-01-01
The use of linear mixed effects models (LMMs) is increasingly common in the analysis of biological data. Whilst LMMs offer a flexible approach to modelling a broad range of data types, ecological data are often complex and require complex model structures, and the fitting and interpretation of such models is not always straightforward. The ability to achieve robust biological inference requires that practitioners know how and when to apply these tools. Here, we provide a general overview of current methods for the application of LMMs to biological data, and highlight the typical pitfalls that can be encountered in the statistical modelling process. We tackle several issues regarding methods of model selection, with particular reference to the use of information theory and multi-model inference in ecology. We offer practical solutions and direct the reader to key references that provide further technical detail for those seeking a deeper understanding. This overview should serve as a widely accessible code of best practice for applying LMMs to complex biological problems and model structures, and in doing so improve the robustness of conclusions drawn from studies investigating ecological and evolutionary questions. PMID:29844961
A brief introduction to mixed effects modelling and multi-model inference in ecology.
Harrison, Xavier A; Donaldson, Lynda; Correa-Cano, Maria Eugenia; Evans, Julian; Fisher, David N; Goodwin, Cecily E D; Robinson, Beth S; Hodgson, David J; Inger, Richard
2018-01-01
The use of linear mixed effects models (LMMs) is increasingly common in the analysis of biological data. Whilst LMMs offer a flexible approach to modelling a broad range of data types, ecological data are often complex and require complex model structures, and the fitting and interpretation of such models is not always straightforward. The ability to achieve robust biological inference requires that practitioners know how and when to apply these tools. Here, we provide a general overview of current methods for the application of LMMs to biological data, and highlight the typical pitfalls that can be encountered in the statistical modelling process. We tackle several issues regarding methods of model selection, with particular reference to the use of information theory and multi-model inference in ecology. We offer practical solutions and direct the reader to key references that provide further technical detail for those seeking a deeper understanding. This overview should serve as a widely accessible code of best practice for applying LMMs to complex biological problems and model structures, and in doing so improve the robustness of conclusions drawn from studies investigating ecological and evolutionary questions.
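One concrete piece of the multi-model-inference machinery discussed in this overview is the conversion of AIC scores into Akaike weights, i.e. the relative support for each candidate model. The sketch below implements that standard information-theoretic formula; it is an illustration, not code from the paper:

```python
import math

def akaike_weights(aics):
    # Akaike weights: exp(-0.5 * delta_i) normalised over the candidate set,
    # where delta_i is each model's AIC minus the best (smallest) AIC
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]
```

The weights sum to one and can be read as the probability that each model is the best approximating model in the set, which is the basis for multi-model averaging of parameter estimates.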
Preclinical models for obesity research
Barrett, Perry; Morgan, Peter J.
2016-01-01
ABSTRACT A multi-dimensional strategy to tackle the global obesity epidemic requires an in-depth understanding of the mechanisms that underlie this complex condition. Much of the current mechanistic knowledge has arisen from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. These experimental models mimic certain aspects of the human condition and its root causes, particularly the over-consumption of calories and unbalanced diets. As with human obesity, obesity in rodents is the result of complex gene–environment interactions. Here, we review the traditional monogenic models of obesity, their contemporary optogenetic and chemogenetic successors, and the use of dietary manipulations and meal-feeding regimes to recapitulate the complexity of human obesity. We critically appraise the strengths and weaknesses of these different models to explore the underlying mechanisms, including the neural circuits that drive behaviours such as appetite control. We also discuss the use of these models for testing and screening anti-obesity drugs, beneficial bio-actives, and nutritional strategies, with the goal of ultimately translating these findings for the treatment of human obesity. PMID:27821603
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun Wei; Huang, Guo H., E-mail: huang@iseis.org; Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan, S4S 0A2
2012-06-15
Highlights: • Inexact piecewise-linearization-based fuzzy flexible programming is proposed. • It is the first application to waste management under multiple complexities. • It tackles nonlinear economies-of-scale effects in interval-parameter constraints. • It estimates costs more accurately than the linear-regression-based model. • Uncertainties are decreased and more satisfactory interval solutions are obtained. - Abstract: To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from the objective function to the constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, IPFP2 may underestimate the net system costs while IPFP can estimate the costs more accurately.
The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness in providing more satisfactory interval solutions than IPFP3. Following this first application to waste management, the IPFP can potentially be applied to other environmental problems under multiple complexities.
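The core idea of piecewise linearization, approximating a nonlinear economies-of-scale cost curve by linear segments between breakpoints, can be sketched independently of the IPFP formulation. The concave cost function and breakpoints below are illustrative assumptions, not data from the study:

```python
def piecewise_linear(breakpoints, f):
    # build a piecewise-linear approximation of f over the given breakpoints;
    # between adjacent breakpoints the value is linearly interpolated
    pts = [(x, f(x)) for x in breakpoints]

    def approx(x):
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)
        raise ValueError("x outside breakpoint range")

    return approx
```

For a concave EOS-style curve the chords lie below the function, so the approximation is exact at breakpoints and underestimates in between; adding breakpoints trades model size for accuracy.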
Tackling antibiotic resistance in India.
Wattal, Chand; Goel, Neeraj
2014-12-01
Infectious diseases are major causes of mortality in India. This is aggravated by the increasing prevalence of antimicrobial resistance (AMR), both in the community and in hospitals. With the emergence of resistance to all effective antibiotics in nosocomial pathogens, the situation calls for emergency measures to tackle AMR in India. India faces huge challenges in tackling AMR, ranging from the lack of surveillance mechanisms for monitoring AMR and antimicrobial use, effective hospital infection control policies, and sanitation, to the non-human use of antimicrobials. The Ministry of Health and Family Welfare of the Govt. of India has taken initiatives to tackle AMR. Extensive guidelines have been drafted, and a model worksheet has been developed as a roadmap to tackle AMR.
The Use of Cellular Automata in the Learning of Emergence
ERIC Educational Resources Information Center
Faraco, G.; Pantano, P.; Servidio, R.
2006-01-01
In recent years, research efforts on complex systems have contributed to improve our ability in investigating, at different levels of complexity, the emergent behaviour shown by a system in the course of its evolution. The study of emergence, an intrinsic property of a large number of complex systems, can be tackled by making use of Cellular…
Boosting Learning Algorithm for Stock Price Forecasting
NASA Astrophysics Data System (ADS)
Wang, Chengzhang; Bai, Xiaoming
2018-03-01
To tackle the complexity and uncertainty of stock market behavior, a growing number of studies have introduced machine learning algorithms to forecast stock prices. The ANN (artificial neural network) is one of the most successful and promising of these applications. We propose a boosting-ANN model in this paper to predict the stock close price. On the basis of boosting theory, multiple weak predicting machines, i.e. ANNs, are assembled to build a stronger predictor, i.e. the boosting-ANN model. New error criteria for the weak learning machines and rules for weight updating are adopted in this study. We select technical factors from financial markets as forecasting input variables. Final results demonstrate that the boosting-ANN model works better than the alternatives for stock price forecasting.
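The boosting principle behind such a model, assembling weak predictors fitted to the residuals of the ensemble so far, can be sketched with simple regression stumps standing in for the paper's ANN weak learners. A toy sketch under that substitution (learning rate, number of rounds, and data are made up for illustration):

```python
def fit_stump(xs, ys):
    # weak learner: best 1-D threshold split, predicting the mean on each side
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        lm = sum(left) / len(left) if left else 0.0
        rm = sum(right) / len(right) if right else 0.0
        err = sum((y - (lm if x <= t else rm)) ** 2 for x, y in zip(xs, ys))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, rounds=20, lr=0.5):
    # additive ensemble: each round fits a weak learner to current residuals
    models = []
    resid = list(ys)
    for _ in range(rounds):
        stump = fit_stump(xs, resid)
        models.append(stump)
        resid = [r - lr * stump(x) for x, r in zip(xs, resid)]
    return lambda x: sum(lr * m(x) for m in models)
```

The residuals shrink round by round, so the combined predictor is stronger than any single weak learner, which is the essence of the boosting-ANN construction.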
Li, Qu; Yao, Min; Yang, Jianhua; Xu, Ning
2014-01-01
Online friend recommendation is a fast developing topic in web mining. In this paper, we used SVD matrix factorization to model user and item feature vectors and used stochastic gradient descent to update the parameters and improve accuracy. To tackle the cold start problem and data sparsity, we used a KNN model to influence the user feature vectors. At the same time, we used graph theory to partition communities with fairly low time and space complexity. What is more, matrix factorization can combine online and offline recommendation. Experiments showed that the hybrid recommendation algorithm is able to recommend online friends with good accuracy.
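The SVD-style factorization with stochastic gradient descent described above can be sketched in a few lines. This is a generic FunkSVD-flavoured illustration, not the paper's implementation; the latent dimension, learning rate, and regularization values are arbitrary:

```python
import random

def factorize(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02, epochs=500, seed=0):
    # ratings: list of (user, item, value) triples; learn latent factor
    # matrices P (n_users x k) and Q (n_items x k) so that P[u] . Q[i] ~ value
    rng = random.Random(seed)
    P = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                # SGD step on squared error with L2 regularization
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q
```

The KNN smoothing of cold-start user vectors mentioned in the abstract would act on P after (or during) this fit; it is omitted here.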
Wind Turbine Failures - Tackling current Problems in Failure Data Analysis
NASA Astrophysics Data System (ADS)
Reder, M. D.; Gonzalez, E.; Melero, J. J.
2016-09-01
The wind industry has been growing significantly over the past decades, resulting in a remarkable increase in installed wind power capacity. Turbine technologies are rapidly evolving in terms of complexity and size, and there is an urgent need for cost-effective operation and maintenance (O&M) strategies. Unplanned downtime in particular represents one of the main cost drivers of a modern wind farm. Here, reliability and failure prediction models can enable operators to apply preventive O&M strategies rather than corrective actions. In order to develop these models, the failure rates and downtimes of wind turbine (WT) components have to be understood profoundly. This paper is focused on tackling three of the main issues related to WT failure analyses: the non-uniform treatment of data, the scarcity of available failure analyses, and the lack of investigation of alternative data sources. For this, a modernised form of an existing WT taxonomy is introduced. Additionally, an extensive analysis of historical failure and downtime data of more than 4300 turbines is presented. Finally, the possibilities for countering the lack of available failure data by complementing historical databases with Supervisory Control and Data Acquisition (SCADA) alarms are evaluated.
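The headline reliability quantities such an analysis produces can be illustrated with two one-line estimators (a sketch only; the paper's taxonomy and data treatment are far richer, and the numbers below are invented):

```python
def failures_per_turbine_year(n_failures, n_turbines, years):
    # average failure rate normalised by fleet size and observation window
    return n_failures / (n_turbines * years)

def availability(period_hours, downtime_hours):
    # fraction of the observation period the turbine was able to operate
    return 1.0 - downtime_hours / period_hours
```

Consistent component taxonomies matter precisely because these ratios are only comparable across studies when "failure" and "downtime" are counted the same way.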
Hosoda, Kazufumi; Tsuda, Soichiro; Kadowaki, Kohmei; Nakamura, Yutaka; Nakano, Tadashi; Ishii, Kojiro
2016-02-01
Understanding ecosystem dynamics is crucial as contemporary human societies face ecosystem degradation. One of the challenges that needs to be recognized is the complex hierarchical dynamics. Conventional dynamic models in ecology often represent only the population level and have yet to include the dynamics of the sub-organism level, which makes an ecosystem a complex adaptive system that shows characteristic behaviors such as resilience and regime shifts. The neglect of the sub-organism level in the conventional dynamic models would be because integrating multiple hierarchical levels makes the models unnecessarily complex unless supporting experimental data are present. Now that large amounts of molecular and ecological data are increasingly accessible in microbial experimental ecosystems, it is worthwhile to tackle the questions of their complex hierarchical dynamics. Here, we propose an approach that combines microbial experimental ecosystems and a hierarchical dynamic model named population-reaction model. We present a simple microbial experimental ecosystem as an example and show how the system can be analyzed by a population-reaction model. We also show that population-reaction models can be applied to various ecological concepts, such as predator-prey interactions, climate change, evolution, and stability of diversity. Our approach will reveal a path to the general understanding of various ecosystems and organisms. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
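As a concrete example of the population-level dynamics such hierarchical models build on, here is a forward-Euler sketch of the classic Lotka-Volterra predator-prey system. Parameter values are illustrative; the population-reaction approach additionally couples in sub-organism-level reactions, which are omitted here:

```python
def lotka_volterra(prey0, pred0, a=1.0, b=0.5, c=0.5, d=0.2, dt=0.001, steps=5000):
    # a: prey growth rate, b: predation rate,
    # c: predator conversion efficiency, d: predator death rate
    x, y = prey0, pred0
    traj = [(x, y)]
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt   # prey: growth minus predation
        dy = (c * x * y - d * y) * dt   # predators: conversion minus mortality
        x += dx
        y += dy
        traj.append((x, y))
    return traj
```

Even this two-variable system oscillates; adding a sub-organism reaction layer multiplies the state dimension, which is why supporting experimental data are needed before such models pay off.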
Networking at the Protein Society symposium.
McKnight, C James; Cordes, Matthew H J
2005-10-01
From the complex behavior of multicomponent signaling networks to the structures of large protein complexes and aggregates, questions once viewed as daunting are now being tackled fearlessly by protein scientists. The 19th Annual Symposium of the Protein Society in Boston highlighted the maturation of systems biology as applied to proteins.
Making Textbook Reading Meaningful
ERIC Educational Resources Information Center
Guthrie, John T.; Klauda, Susan Lutz
2012-01-01
When students enter middle school, they are confronted with the necessity of learning from complex content-area textbooks. Many students find these texts boring, and they may lack the higher-order reading comprehension skills they need to tackle complex text. Yet the ability to read informational text is essential to success in middle school and…
Theoretical and experimental aspects of chaos control by time-delayed feedback.
Just, Wolfram; Benner, Hartmut; Reibold, Ekkehard
2003-03-01
We review recent developments in the control of chaos by time-delayed feedback methods. While such methods are easily applied even in quite complex experimental contexts, the theoretical analysis yields infinite-dimensional differential-difference systems which are hard to tackle. The essential ideas of a general theoretical approach are sketched and the results are compared to electronic circuits and to high-power ferromagnetic resonance experiments. Our results show that the control performance can be understood on the basis of experimentally accessible quantities without resort to any model of the internal dynamics.
Modeling the Propagation of Mobile Phone Virus under Complex Network
Yang, Wei; Wei, Xi-liang; Guo, Hao; An, Gang; Guo, Lei
2014-01-01
Mobile phone virus is a rogue program written to propagate from one phone to another, which can take control of a mobile device by exploiting its vulnerabilities. In this paper, the propagation model of mobile phone viruses is tackled to understand how particular factors can affect their propagation, and to design effective containment strategies to suppress mobile phone viruses. Two different propagation models of mobile phone viruses under a complex network are proposed in this paper. One is intended to describe the propagation of the user-tricking virus, and the other to describe the propagation of the vulnerability-exploiting virus. Based on traditional epidemic models, the characteristics of mobile phone viruses and the network topology structure are incorporated into our models. A detailed analysis of the propagation models is conducted. Through this analysis, the stable infection-free equilibrium point and the stability condition are derived. Finally, considering the network topology, numerical and simulation experiments are carried out. Results indicate that both models are correct and suitable for describing the spread of the two different mobile phone viruses, respectively. PMID:25133209
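The epidemic backbone of such propagation models can be sketched with the simplest susceptible-infected (SI) compartment model, here integrated with the forward Euler method. This assumes homogeneous mixing with an illustrative infection rate; the paper's models add network topology and user-behaviour terms on top of this:

```python
def si_epidemic(beta, i0, dt=0.01, steps=1000):
    # SI model: susceptible phones become infected at rate beta * s * i;
    # returns the infected fraction over time (populations normalised to 1)
    s, i = 1.0 - i0, i0
    history = [i]
    for _ in range(steps):
        new_inf = beta * s * i * dt
        s -= new_inf
        i += new_inf
        history.append(i)
    return history
```

With no recovery compartment the infected fraction grows logistically toward 1; containment strategies act by reducing the effective beta or removing nodes, which is where the network-aware models depart from this sketch.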
Quaternion error-based optimal control applied to pinpoint landing
NASA Astrophysics Data System (ADS)
Ghiglino, Pablo
Accurate control techniques for pinpoint planetary landing, i.e., the goal of achieving landing errors on the order of 100 m for unmanned missions, pose a complex problem that has been tackled in different ways in the available literature. Among other challenges, this kind of control is also affected by the well-known trade-off in UAV control that for complex underlying models the control is sub-optimal, while optimal control is applied to simplified models. The goal of this research has been the development of new control algorithms able to tackle these challenges, and the results are two novel optimal control algorithms, namely OQTAL and HEX2OQTAL. These controllers share three key properties that are thoroughly proven and shown in this thesis: stability, accuracy and adaptability. Stability is rigorously demonstrated for both controllers. Accuracy is shown by comparing these novel controllers with other industry-standard algorithms in several different scenarios: there is a gain in accuracy of at least 15% for each controller, and in many cases much more than that. A new tuning algorithm based on swarm heuristics optimisation was also developed as part of this research in order to tune, in an online manner, the standard Proportional-Integral-Derivative (PID) controllers used for benchmarking. Finally, the adaptability of these controllers can be seen as a combination of four elements: mathematical model extensibility, cost matrix tuning, reduced computation time required, and no prior knowledge of the navigation or guidance strategies needed. Further simulations on real planetary landing trajectories have shown that these controllers have the capacity to achieve landing errors on the order of pinpoint landing requirements, making them not only very precise UAV controllers, but also potential candidates for pinpoint landing unmanned missions.
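The quaternion attitude error at the heart of quaternion-error-based control can be sketched directly. This is the generic construction (Hamilton convention, scalar-first), not OQTAL's internals:

```python
def quat_conj(q):
    # conjugate (= inverse for unit quaternions)
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_mul(q1, q2):
    # Hamilton product, scalar-first (w, x, y, z)
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_error(q_des, q_cur):
    # error quaternion between desired and current attitude;
    # equals the identity quaternion when the attitudes coincide
    return quat_mul(quat_conj(q_des), q_cur)
```

Feeding the vector part of this error into the control law avoids the singularities of Euler-angle errors, which is one reason quaternion formulations suit large-angle manoeuvres such as landing.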
Simic, Vladimir
2016-06-01
As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on the management of ELVs is necessary in order to tackle this important environmental challenge more successfully. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicle management under rigorous environmental regulations. The proposed model can incorporate various uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem involving uncertainty. The presented model has advantages in providing a basis for determining long-term ELV management plans with desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
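The chance-constraint mechanism such a model relies on can be illustrated in isolation: a probabilistic constraint P(load ≤ capacity) ≥ 1 − p is replaced by a deterministic equivalent at the p-quantile of the capacity distribution. The sketch below assumes a normally distributed capacity, purely for illustration:

```python
from statistics import NormalDist

def deterministic_capacity(mean_cap, std_cap, violation_prob):
    # chance constraint P(load <= capacity) >= 1 - p becomes the
    # deterministic constraint load <= p-quantile of the capacity
    return mean_cap + NormalDist().inv_cdf(violation_prob) * std_cap
```

Smaller violation probabilities give a more conservative (lower) usable capacity, which is the cost-reliability trade-off the abstract examines across different probability levels.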
Drosophila as an In Vivo Model for Human Neurodegenerative Disease
McGurk, Leeanne; Berson, Amit; Bonini, Nancy M.
2015-01-01
With the increase in the ageing population, neurodegenerative disease is devastating to families and poses a huge burden on society. The brain and spinal cord are extraordinarily complex: they consist of a highly organized network of neuronal and support cells that communicate in a highly specialized manner. One approach to tackling problems of such complexity is to address the scientific questions in simpler, yet analogous, systems. The fruit fly, Drosophila melanogaster, has been proven tremendously valuable as a model organism, enabling many major discoveries in neuroscientific disease research. The plethora of genetic tools available in Drosophila allows for exquisite targeted manipulation of the genome. Due to its relatively short lifespan, complex questions of brain function can be addressed more rapidly than in other model organisms, such as the mouse. Here we discuss features of the fly as a model for human neurodegenerative disease. There are many distinct fly models for a range of neurodegenerative diseases; we focus on select studies from models of polyglutamine disease and amyotrophic lateral sclerosis that illustrate the type and range of insights that can be gleaned. In discussion of these models, we underscore strengths of the fly in providing understanding into mechanisms and pathways, as a foundation for translational and therapeutic research. PMID:26447127
Drosophila as an In Vivo Model for Human Neurodegenerative Disease.
McGurk, Leeanne; Berson, Amit; Bonini, Nancy M
2015-10-01
With the increase in the ageing population, neurodegenerative disease is devastating to families and poses a huge burden on society. The brain and spinal cord are extraordinarily complex: they consist of a highly organized network of neuronal and support cells that communicate in a highly specialized manner. One approach to tackling problems of such complexity is to address the scientific questions in simpler, yet analogous, systems. The fruit fly, Drosophila melanogaster, has been proven tremendously valuable as a model organism, enabling many major discoveries in neuroscientific disease research. The plethora of genetic tools available in Drosophila allows for exquisite targeted manipulation of the genome. Due to its relatively short lifespan, complex questions of brain function can be addressed more rapidly than in other model organisms, such as the mouse. Here we discuss features of the fly as a model for human neurodegenerative disease. There are many distinct fly models for a range of neurodegenerative diseases; we focus on select studies from models of polyglutamine disease and amyotrophic lateral sclerosis that illustrate the type and range of insights that can be gleaned. In discussion of these models, we underscore strengths of the fly in providing understanding into mechanisms and pathways, as a foundation for translational and therapeutic research. Copyright © 2015 by the Genetics Society of America.
Remontet, Laurent; Uhry, Zoé; Bossard, Nadine; Iwaz, Jean; Belot, Aurélien; Danieli, Coraline; Charvat, Hadrien; Roche, Laurent
2018-01-01
Cancer survival trend analyses are essential to describe accurately the way medical practices impact patients' survival according to the year of diagnosis. To this end, survival models should be able to account simultaneously for non-linear and non-proportional effects and for complex interactions between continuous variables. However, in the statistical literature, there is no consensus yet on how to build such models that should be flexible but still provide smooth estimates of survival. In this article, we tackle this challenge by smoothing the complex hypersurface (time since diagnosis, age at diagnosis, year of diagnosis, and mortality hazard) using a multidimensional penalized spline built from the tensor product of the marginal bases of time, age, and year. Considering this penalized survival model as a Poisson model, we assess the performance of this approach in estimating the net survival with a comprehensive simulation study that reflects simple and complex realistic survival trends. The bias was generally small and the root mean squared error was good and often similar to that of the true model that generated the data. This parametric approach offers many advantages and interesting prospects (such as forecasting) that make it an attractive and efficient tool for survival trend analyses.
Toward a Learning Science for Complex Crowdsourcing Tasks
ERIC Educational Resources Information Center
Doroudi, Shayan; Kamar, Ece; Brunskill, Emma; Horvitz, Eric
2016-01-01
We explore how crowdworkers can be trained to tackle complex crowdsourcing tasks. We are particularly interested in training novice workers to perform well on solving tasks in situations where the space of strategies is large and workers need to discover and try different strategies to be successful. In a first experiment, we perform a comparison…
European Teacher Education: A Fractal Perspective Tackling Complexity
ERIC Educational Resources Information Center
Caena, Francesca; Margiotta, Umberto
2010-01-01
This article takes stock of the complex scenario of the European education space in its past, present and future developments, which highlights the priorities of the modernisation, improvement and convergence of the goals for education and training systems in the knowledge and learning society. The critical case of teacher education is then…
Using Plants to Explore the Nature & Structural Complexity of Life
ERIC Educational Resources Information Center
Howard, Ava R.
2014-01-01
Use of real specimens brings the study of biology to life. This activity brings easily acquired plant specimens into the classroom to tackle common alternative conceptions regarding life, size, complexity, the nature of science, and plants as multicellular organisms. The activity occurs after a discussion of the characteristics of life and engages…
NASA Astrophysics Data System (ADS)
Moghaddam, Kamran S.; Usher, John S.
2011-07-01
In this article, a new multi-objective optimization model is developed to determine the optimal preventive maintenance and replacement schedules in a repairable and maintainable multi-component system. In this model, the planning horizon is divided into discrete and equally-sized periods in which three possible actions must be planned for each component, namely maintenance, replacement, or do nothing. The objective is to determine a plan of actions for each component in the system while simultaneously minimizing the total cost and maximizing overall system reliability over the planning horizon. Because of the complex, combinatorial, and highly nonlinear structure of the mathematical model, two metaheuristic solution methods, a generational genetic algorithm and simulated annealing, are applied to tackle the problem. The solution approach yields Pareto optimal solutions that provide good tradeoffs between the total cost and the overall reliability of the system. Such a modeling approach should be useful for maintenance planners and engineers tasked with developing recommended maintenance plans for complex systems of components.
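A scalarized simulated-annealing pass over such a maintain/replace/do-nothing plan can be sketched as follows. This is a toy, not the authors' formulation: the action costs, the age-based unreliability proxy, and the weighted scalarization are invented for illustration (a Pareto front would be traced by re-running with different weights `w`).

```python
import math
import random

ACTION_COST = {0: 0.0, 1: 2.0, 2: 5.0}   # do nothing / maintain / replace

def objectives(plan):
    """Total action cost plus a crude unreliability proxy: component age,
    reset to zero by replacement and halved by maintenance."""
    cost = sum(ACTION_COST[a] for row in plan for a in row)
    unrel = 0.0
    for row in plan:
        age = 0.0
        for a in row:
            age = 0.0 if a == 2 else (age / 2.0 if a == 1 else age)
            age += 1.0
            unrel += age
    return cost, unrel

def anneal(n_comp=3, n_per=8, w=0.5, iters=2000, t0=5.0, seed=0):
    rng = random.Random(seed)
    scal = lambda p: w * objectives(p)[0] + (1 - w) * objectives(p)[1]
    plan = [[0] * n_per for _ in range(n_comp)]   # start: do nothing
    cur = scal(plan)
    best, best_f, T = [r[:] for r in plan], cur, t0
    for _ in range(iters):
        c, t = rng.randrange(n_comp), rng.randrange(n_per)
        old, plan[c][t] = plan[c][t], rng.choice([0, 1, 2])  # random move
        cand = scal(plan)
        if cand <= cur or rng.random() < math.exp((cur - cand) / T):
            cur = cand                    # accept (always, if improving)
            if cand < best_f:
                best, best_f = [r[:] for r in plan], cand
        else:
            plan[c][t] = old              # reject: undo the move
        T *= 0.999                        # geometric cooling
    return best, best_f
```

Starting from the all-do-nothing plan, the annealer trades maintenance cost against the accumulating age penalty and returns the best plan seen.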
Manning, Brendan D
2012-07-10
In their study published in Science Signaling (Research Article, 27 March 2012, DOI: 10.1126/scisignal.2002469), Dalle Pezze et al. tackle the dynamic and complex wiring of the signaling network involving the protein kinase mTOR, which exists within two distinct protein complexes (mTORC1 and mTORC2) that differ in their regulation and function. The authors use a combination of immunoblotting for specific phosphorylation events and computational modeling. The primary experimental tool employed is to monitor the autophosphorylation of mTOR on Ser(2481) in cell lysates as a surrogate for mTOR activity, which the authors conclude is a specific readout for mTORC2. However, Ser(2481) phosphorylation occurs on both mTORC1 and mTORC2 and will dynamically change as the network through which these two complexes are connected is manipulated. Therefore, models of mTOR network regulation built using this tool are inherently imperfect and open to alternative explanations. Specific issues with the main conclusion made in this study, involving the TSC1-TSC2 (tuberous sclerosis complex 1 and 2) complex and its potential regulation of mTORC2, are discussed here. A broader goal of this Letter is to clarify to other investigators the caveats of using mTOR Ser(2481) phosphorylation in cell lysates as a specific readout for either of the two mTOR complexes.
Youth Football Injuries: A Prospective Cohort
Peterson, Andrew R.; Kruse, Adam J.; Meester, Scott M.; Olson, Tyler S.; Riedle, Benjamin N.; Slayman, Tyler G.; Domeyer, Todd J.; Cavanaugh, Joseph E.; Smoot, M. Kyle
2017-01-01
Background: There are approximately 2.8 million youth football players between the ages of 7 and 14 years in the United States. Rates of injury in this population are poorly described. Recent studies have reported injury rates between 2.3% and 30.4% per season and between 8.5 and 43 per 1000 exposures. Hypothesis: Youth flag football has a lower injury rate than youth tackle football. The concussion rates in flag football are lower than in tackle football. Study Design: Cohort study; Level of evidence, 3. Methods: Three large youth (grades 2-7) football leagues with a total of 3794 players were enrolled. Research personnel partnered with the leagues to provide electronic attendance and injury reporting systems. Researchers had access to deidentified player data and injury information. Injury rates for both the tackle and flag leagues were calculated and compared using Poisson regression with a log link. The probabilities that an injury was severe and that an injury resulted in a concussion were modeled using logistic regression. For these 2 responses, best subset model selection was performed, and the model with the minimum Akaike information criterion value was chosen as best. Kaplan-Meier curves were examined to compare time loss due to injury for various subgroups of the population. Finally, time loss was modeled using Cox proportional hazards regression models. Results: A total of 46,416 exposures and 128 injuries were reported. The mean age at injury was 10.64 years. The hazard ratio for tackle football (compared with flag football) was 0.45 (95% CI, 0.25-0.80; P = .0065). The rate of severe injuries per exposure for tackle football was 1.1 (95% CI, 0.33-3.4; P = .93) times that of the flag league. The rate for concussions in tackle football per exposure was 0.51 (95% CI, 0.16-1.7; P = .27) times that of the flag league. Conclusion: Injury is more likely to occur in youth flag football than in youth tackle football.
Severe injuries and concussions were not significantly different between leagues. Concussion was more likely to occur during games than during practice. Players in the sixth or seventh grade were more likely to suffer a concussion than were younger players. PMID:28255566
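For a two-group comparison, the Poisson regression with a log link used above reduces to a closed-form incidence-rate ratio. A minimal sketch with purely illustrative counts (the abstract reports 128 injuries over 46,416 exposures in total but not the per-league split, so the numbers below are invented):

```python
import math

def rate_ratio(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Incidence-rate ratio of group A vs. group B with a Wald 95% CI;
    equivalent to a two-group Poisson model with a log link and a
    log-exposure offset."""
    rr = (events_a / exposure_a) / (events_b / exposure_b)
    se = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(rr)
    return rr, (rr * math.exp(-z * se), rr * math.exp(z * se))

# illustrative numbers only -- not the study's actual per-league counts
rr, (lo, hi) = rate_ratio(40, 20000, 88, 26416)
```

A ratio below 1 with a CI excluding 1 would indicate a lower injury rate per exposure in the first league.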
A quantum annealing approach for fault detection and diagnosis of graph-based systems
NASA Astrophysics Data System (ADS)
Perdomo-Ortiz, A.; Fluegemann, J.; Narasimhan, S.; Biswas, R.; Smelyanskiy, V. N.
2015-02-01
Diagnosing the minimal set of faults capable of explaining a set of given observations, e.g., from sensor readouts, is a hard combinatorial optimization problem usually tackled with artificial intelligence techniques. We present the mapping of this combinatorial problem to quadratic unconstrained binary optimization (QUBO), and the experimental results of instances embedded onto a quantum annealing device with 509 quantum bits. Besides being the first time a quantum approach has been proposed for problems in the advanced diagnostics community, to the best of our knowledge this work is also the first research utilizing the route Problem → QUBO → Direct embedding into quantum hardware, where we are able to implement and tackle problem instances with sizes that go beyond previously reported toy-model proof-of-principle quantum annealing implementations; this is a significant leap in the solution of problems via direct-embedding adiabatic quantum optimization. We discuss some of the programmability challenges in the current generation of the quantum device as well as a few possible ways to extend this work to more complex arbitrary network graphs.
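The Problem → QUBO route can be illustrated at toy scale. Below, a hypothetical three-fault diagnosis is encoded as a QUBO: each diagonal entry charges a unit cost per asserted fault, and a penalty term (constant offset dropped) forces at least one of faults 0 and 1 to explain an observation. A brute-force loop stands in for the annealer; everything here is invented for illustration.

```python
import itertools

import numpy as np

def solve_qubo(Q):
    """Exhaustively minimize x^T Q x over binary x. Only feasible for small
    n -- a quantum annealer samples the same objective in hardware."""
    best_x, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=Q.shape[0]):
        x = np.array(bits)
        e = float(x @ Q @ x)
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

P = 3.0                        # penalty weight for a violated observation
Q = np.diag([1.0, 1.0, 1.0])   # unit cost per asserted fault
# observation "fault 0 or fault 1 must be on": P*(1-x0)*(1-x1)
# expands to P - P*x0 - P*x1 + P*x0*x1 (the constant P is dropped)
Q[0, 0] -= P
Q[1, 1] -= P
Q[0, 1] += P

x, e = solve_qubo(Q)
```

The minimum asserts exactly one of the two candidate faults and leaves the third off, i.e. the minimal diagnosis consistent with the observation.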
ERIC Educational Resources Information Center
Imhof, Margarete; Starker, Ulrike; Spaude, Elena
2016-01-01
Building on Dörner's (1996) theory of complex problem-solving, a learning scenario for teacher students was created and tested. Classroom management is interpreted as a complex problem, which requires the integration of competing interests and tackling multiple, simultaneous tasks under time pressure and with limited information. In addition,…
TERRA: Building New Communities for Advanced Biofuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cornelius, Joe; Mockler, Todd; Tuinstra, Mitch
ARPA-E’s Transportation Energy Resources from Renewable Agriculture (TERRA) program is bringing together top experts from different disciplines – agriculture, robotics and data analytics – to rethink the production of advanced biofuel crops. ARPA-E Program Director Dr. Joe Cornelius discusses the TERRA program and explains how ARPA-E’s model enables multidisciplinary collaboration among diverse communities. The video focuses on two TERRA projects—Donald Danforth Center and Purdue University—that are developing and integrating cutting-edge remote sensing platforms, complex data analytics tools and plant breeding technologies to tackle the challenge of sustainably increasing biofuel stocks.
Advancing migratory bird conservation and management by using radar: An interagency collaboration
Ruth, Janet M.; Barrow, Wylie C.; Sojda, Richard S.; Dawson, Deanna K.; Diehl, Robert H.; Manville, Albert; Green, Michael T.; Krueper, David J.; Johnston, Scott
2005-01-01
Many technical issues make this work difficult, including complex data structures, massive data sets, digital recognition of birds, large areas not covered by weather radar, and model validation; however, progress will only be furthered by tackling the challenge. The new coalition will meet its goals by: (1) facilitating a productive collaboration with NOAA, Department of the Interior bureaus, state wildlife agencies, universities, power companies, and other potential partners; (2) building and strengthening scientific capabilities within USGS; (3) addressing key migratory bird management issues; and (4) ensuring full funding for the collaborative effort.
Jacobs, Keith
2010-01-01
This paper draws on the findings from a research project on partnership arrangements between the police and housing departments on three Australian public housing estates to tackle problems associated with illicit drug activity and anti-social behaviour (ASB). The analysis focused on the setting up of the partnerships and the interactions that followed from these institutional arrangements. The assumption that informs the paper is that when studying partnerships there is a need for a more critically framed analysis. The temptation to posit "a successful model" of what partnership entails and then to judge practices in relation to this model is considerable, but it inevitably falls into the trap of constructing a narrative of partnership success or failure in terms of individual agency (that is, the degree of commitment from individuals). The analysis undertaken in this paper has therefore sought to fathom a more complex set of organizational processes. Rather than confine the discussion to issues of success and failure, the study foregrounds the subjective accounts of individuals who work within partnership and the constraints they encounter. The paper therefore makes explicit the cultural tensions within and across agencies, contestation as to the extent of the policy "problem," and the divergent perspectives on the appropriate modes of intervention.
High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics
Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis
2014-07-28
The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.
An analysis of urban collisions using an artificial intelligence model.
Mussone, L; Ferrari, A; Oneta, M
1999-11-01
Traditional studies on road accidents estimate the effect of variables (such as vehicular flows, road geometry, and vehicle characteristics) on the number of accidents. A descriptive statistical analysis of the accidents used in the model, covering the period 1992-1995, is proposed. The paper describes an alternative method based on artificial neural networks (ANN) to work out a model for analysing vehicular accidents in Milan. The ANN model quantifies the degree of danger of urban intersections under different scenarios. The first result is methodological: the innovative use of ANNs to model urban vehicular accidents. Other results concern the model outputs: intersection complexity may determine a higher accident index depending on how the intersection is regulated. The highest index for pedestrians being run over occurs at non-signalised intersections at night-time.
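A minimal sketch of the ANN idea on hypothetical synthetic data (not the Milan data set): a small multilayer perceptron regresses an invented "danger index" on intersection features such as flow, number of approaches, signalisation, and time of day.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# hypothetical features: [vehicular flow, approaches, signalised?, night?]
X = rng.random((300, 4))
X[:, 2] = (X[:, 2] > 0.5).astype(float)   # signalised: yes/no
X[:, 3] = (X[:, 3] > 0.5).astype(float)   # night-time: yes/no
# synthetic danger index: grows with flow, drops when signalised, rises at night
y = (2.0 * X[:, 0] + 0.5 * X[:, 1] - 1.0 * X[:, 2] + 0.8 * X[:, 3]
     + rng.normal(0.0, 0.1, 300))

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                   random_state=0).fit(X, y)
```

Once trained, the network can score each intersection/scenario combination, which is the role the ANN plays in the study.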
Cloud Computing Techniques for Space Mission Design
NASA Technical Reports Server (NTRS)
Arrieta, Juan; Senent, Juan
2014-01-01
The overarching objective of space mission design is to tackle complex problems while producing better results, faster. In developing the methods and tools to fulfill this objective, the user interacts with the different layers of a computing system.
The food-energy-water nexus and urban complexity
NASA Astrophysics Data System (ADS)
Romero-Lankao, Patricia; McPhearson, Timon; Davidson, Debra J.
2017-04-01
While tackling interdependencies among food, energy, and water security is promising, three fundamental challenges to effective operationalization need addressing: the feasibility of science-policy integration, cross-scale inequalities, and path-dependencies in infrastructure and socio-institutional practices.
Tackling the challenges of matching biomedical ontologies.
Faria, Daniel; Pesquita, Catia; Mott, Isabela; Martins, Catarina; Couto, Francisco M; Cruz, Isabel F
2018-01-15
Biomedical ontologies pose several challenges to ontology matching due both to the complexity of the biomedical domain and to the characteristics of the ontologies themselves. The biomedical tracks in the Ontology Matching Evaluation Initiative (OAEI) have spurred the development of matching systems able to tackle these challenges, and benchmarked their general performance. In this study, we dissect the strategies employed by matching systems to tackle the challenges of matching biomedical ontologies and gauge the impact of the challenges themselves on matching performance, using the AgreementMakerLight (AML) system as the platform for this study. We demonstrate that the linear complexity of the hash-based searching strategy implemented by most state-of-the-art ontology matching systems is essential for matching large biomedical ontologies efficiently. We show that accounting for all lexical annotations (e.g., labels and synonyms) in biomedical ontologies leads to a substantial improvement in F-measure over using only the primary name, and that accounting for the reliability of different types of annotations generally also leads to a marked improvement. Finally, we show that cross-references are a reliable source of information and that, when using biomedical ontologies as background knowledge, it is generally more reliable to use them as mediators than to perform lexical expansion. We anticipate that translating traditional matching algorithms to the hash-based searching paradigm will be a critical direction for the future development of the field. Improving the evaluation carried out in the biomedical tracks of the OAEI will also be important, as without proper reference alignments there is only so much that can be ascertained about matching systems or strategies. 
Nevertheless, it is clear that, to tackle the various challenges posed by biomedical ontologies, ontology matching systems must be able to efficiently combine multiple strategies into a mature matching approach.
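The hash-based searching strategy, and the gain from using all lexical annotations rather than only the primary name, can be sketched with toy ontologies. The class ids and labels below are invented for illustration; this is not AML's code.

```python
def build_index(ontology):
    """Hash every normalized label/synonym to its class ids. Each lookup is
    O(1), so matching stays linear in the total number of annotations."""
    index = {}
    for cls, labels in ontology.items():
        for lab in labels:
            index.setdefault(lab.lower().strip(), set()).add(cls)
    return index

def match(source, target):
    index = build_index(target)
    return {(cls, hit)
            for cls, labels in source.items()
            for lab in labels
            for hit in index.get(lab.lower().strip(), ())}

# first annotation = primary name, the rest = synonyms (all invented)
src = {"A:1": ["heart", "cor"], "A:2": ["lung", "pulmo"]}
tgt = {"B:7": ["Cor"], "B:8": ["Lung"], "B:9": ["kidney"]}

all_pairs = match(src, tgt)                                 # uses synonyms
primary_only = match({c: labs[:1] for c, labs in src.items()}, tgt)
```

Matching on all annotations finds the synonym-mediated mapping ("cor" ↔ "Cor") that primary names alone miss, mirroring the F-measure improvement reported above.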
NASA Astrophysics Data System (ADS)
Brdar, S.; Seifert, A.
2018-01-01
We present a novel Monte-Carlo ice microphysics model, McSnow, to simulate the evolution of ice particles due to deposition, aggregation, riming, and sedimentation. The model is an application and extension of the super-droplet method of Shima et al. (2009) to the more complex problem of rimed ice particles and aggregates. For each individual super-particle, the ice mass, rime mass, rime volume, and the number of monomers are predicted establishing a four-dimensional particle-size distribution. The sensitivity of the model to various assumptions is discussed based on box model and one-dimensional simulations. We show that the Monte-Carlo method provides a feasible approach to tackle this high-dimensional problem. The largest uncertainty seems to be related to the treatment of the riming processes. This calls for additional field and laboratory measurements of partially rimed snowflakes.
Telerobotic system performance measurement - Motivation and methods
NASA Technical Reports Server (NTRS)
Kondraske, George V.; Khoury, George J.
1992-01-01
A systems performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described. A developmental testbed consisting of a distributed telerobotics network, and initial efforts to implement the strategy described, is presented. Consideration is given to general systems performance theory (GSPT), originally developed to tackle human performance problems, as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and the characterization of performance of subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource economic principles to govern the interface of systems to tasks. It provides a comprehensive modeling/measurement strategy applicable to complex systems including both human and artificial components. Application is presented within the framework of a distributed telerobotics network as a testbed. Insight into the design of test protocols that elicit application-independent data is also provided.
Tackling some of the most intricate geophysical challenges via high-performance computing
NASA Astrophysics Data System (ADS)
Khosronejad, A.
2016-12-01
Recently, the world has witnessed significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free surface of waterways and water surface waves in the coastal area. This understanding is especially important because turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and are characterized by multi-lateral interactions among various phases (e.g. air/water/sediment). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I will present the simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Parts of the simulations I present are performed to gain scientific insights into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others are carried out to predict the effects of climate change and large flood events on societal infrastructures (A. Khosronejad et al. (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi: 10.1002/2014JF003423).
A Novel Interdisciplinary Approach to Socio-Technical Complexity
NASA Astrophysics Data System (ADS)
Bassetti, Chiara
The chapter presents a novel interdisciplinary approach that integrates micro-sociological analysis into computer-vision and pattern-recognition modeling and algorithms, the purpose being to tackle socio-technical complexity at a systemic yet micro-grounded level. The approach is empirically grounded and both theoretically and analytically driven, yet systemic and multidimensional, semi-supervised and computable, and oriented towards large-scale applications. The chapter describes the proposed approach, especially its sociological foundations, and its application to the analysis of a particular setting, i.e. sport-spectator crowds. Crowds, better defined as large gatherings, are almost ever-present in our societies, and capturing their dynamics is crucial. From social sciences to public safety management and emergency response, modeling and predicting large gatherings' presence and dynamics, and thus possibly preventing critical situations and being able to properly react to them, is fundamental. This is where semi-automated technologies can make the difference. The work presented in this chapter is intended as a scientific step towards such an objective.
Mechanistic kinetic models of enzymatic cellulose hydrolysis-A review.
Jeoh, Tina; Cardona, Maria J; Karuna, Nardrapee; Mudinoor, Akshata R; Nill, Jennifer
2017-07-01
Bioconversion of lignocellulose forms the basis for renewable, advanced biofuels, and bioproducts. Mechanisms of hydrolysis of cellulose by cellulases have been actively studied for nearly 70 years with significant gains in understanding of the cellulolytic enzymes. Yet, a full mechanistic understanding of the hydrolysis reaction has been elusive. We present a review to highlight new insights gained since the most recent comprehensive review of cellulose hydrolysis kinetic models by Bansal et al. (2009) Biotechnol Adv 27:833-848. Recent models have taken a two-pronged approach to tackle the challenge of modeling the complex heterogeneous reaction: an enzyme-centric modeling approach centered on the molecularity of the cellulase-cellulose interactions to examine rate-limiting elementary steps, and a substrate-centric modeling approach aimed at capturing the limiting property of the insoluble cellulose substrate. Collectively, modeling results suggest that at the molecular scale, how rapidly cellulases can bind productively (complexation) and release from cellulose (decomplexation) is limiting, while the overall hydrolysis rate is largely insensitive to the catalytic rate constant. The surface area of the insoluble substrate and the degrees of polymerization of the cellulose molecules in the reaction both limit initial hydrolysis rates only. Neither enzyme-centric models nor substrate-centric models can consistently capture hydrolysis time courses at extended reaction times. Thus, questions of the true reaction limiting factors at extended reaction times and the role of complexation and decomplexation in rate limitation remain unresolved. Biotechnol. Bioeng. 2017;114: 1369-1385. © 2017 Wiley Periodicals, Inc.
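The molecular-scale claim (turnover limited by complexation/decomplexation, insensitive to the catalytic constant) can be illustrated with a minimal enzyme-centric sketch: a single binding-site pool with explicit k_on/k_off/k_cat, integrated by forward Euler. All rate constants and concentrations below are invented for illustration, not fitted values.

```python
def hydrolysis_product(k_on=1.0, k_off=0.1, k_cat=10.0,
                       enzyme=1.0, substrate=10.0, dt=1e-3, t_end=1.0):
    """Forward-Euler integration of E + S <-> ES -> E + P.
    Returns the product formed by t_end."""
    E, ES, S, P = enzyme, 0.0, substrate, 0.0
    for _ in range(int(t_end / dt)):
        bind = k_on * E * S        # complexation
        rel = k_off * ES           # decomplexation
        cat = k_cat * ES           # catalysis
        E += (rel + cat - bind) * dt
        ES += (bind - rel - cat) * dt
        S += (rel - bind) * dt
        P += cat * dt
    return P

p_slow = hydrolysis_product(k_cat=10.0)
p_fast = hydrolysis_product(k_cat=100.0)   # 10x the catalytic rate constant
```

With these toy numbers, a tenfold increase in k_cat raises product formation by less than twofold, because productive binding, not catalysis, limits turnover.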
Microtubules and cellulose microfibrils: how intimate is their relationship?
Emons, Anne Mie C; Höfte, Herman; Mulder, Bela M
2007-07-01
The recent visualization of the motion of fluorescently labeled cellulose synthase complexes by Alexander Paredez and colleagues heralds the start of a new era in the science of the plant cell wall. Upon drug-induced complete depolymerization, the movement of the complexes does not become disordered but instead establishes an apparently self-organized novel pattern. The ability to label complexes in vivo has provided us with the ideal tool for tackling the intriguing question of the underlying default mechanisms at play.
Modeling Real-Time Applications with Reusable Design Patterns
NASA Astrophysics Data System (ADS)
Rekhis, Saoussen; Bouassida, Nadia; Bouaziz, Rafik
Real-Time (RT) applications, which manipulate important volumes of data, need to be managed with RT databases that deal with time-constrained data and time-constrained transactions. In spite of their numerous advantages, RT database development remains a complex task, since developers must study many design issues related to the RT domain. In this paper, we tackle this problem by proposing RT design patterns that allow the modeling of structural and behavioral aspects of RT databases. We show how RT design patterns can provide design assistance through architecture reuse for recurring design problems. In addition, we present a UML profile that represents patterns and further facilitates their reuse. This profile proposes, on the one hand, UML extensions for modeling the variability of patterns in the RT context and, on the other hand, extensions inspired by the MARTE (Modeling and Analysis of Real-Time Embedded systems) profile.
From the baker to the bedside: yeast models of Parkinson's disease
Menezes, Regina; Tenreiro, Sandra; Macedo, Diana; Santos, Cláudia N.; Outeiro, Tiago F.
2015-01-01
The baker’s yeast Saccharomyces cerevisiae has been extensively explored for our understanding of fundamental cell biology processes highly conserved in the eukaryotic kingdom. In this context, they have proven invaluable in the study of complex mechanisms such as those involved in a variety of human disorders. Here, we first provide a brief historical perspective on the emergence of yeast as an experimental model and on how the field evolved to exploit the potential of the model for tackling the intricacies of various human diseases. In particular, we focus on existing yeast models of the molecular underpinnings of Parkinson’s disease (PD), focusing primarily on the central role of protein quality control systems. Finally, we compile and discuss the major discoveries derived from these studies, highlighting their far-reaching impact on the elucidation of PD-associated mechanisms as well as in the identification of candidate therapeutic targets and compounds with therapeutic potential. PMID:28357302
Utility of Small Animal Models of Developmental Programming.
Reynolds, Clare M; Vickers, Mark H
2018-01-01
Any effective strategy to tackle the global obesity and rising noncommunicable disease epidemic requires an in-depth understanding of the mechanisms that underlie these conditions that manifest as a consequence of complex gene-environment interactions. In this context, it is now well established that alterations in the early life environment, including suboptimal nutrition, can result in an increased risk for a range of metabolic, cardiovascular, and behavioral disorders in later life, a process preferentially termed developmental programming. To date, most of the mechanistic knowledge around the processes underpinning development programming has been derived from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. This review will cover the utility of small animal models in developmental programming, the limitations of such models, and potential future directions that are required to fully maximize information derived from preclinical models in order to effectively translate to clinical use.
Complex Event Extraction using DRUM
2015-10-01
[Figure 9: Evaluation results for eleven teams; the diamond (◆) represents the results of our system.]
Commentary: Rural Histories, Rural Boundaries, Rural Change
ERIC Educational Resources Information Center
Tieken, Mara Casey
2017-01-01
Cross-sector collaborations can generate the resources and political will necessary to tackle urgent, complex issues. Because these partnerships involve local leaders, they are typically responsive to their surrounding communities, addressing local concerns, and capitalizing upon local assets. These strengths-oriented, locally driven…
Won, Sungho; Choi, Hosik; Park, Suyeon; Lee, Juyoung; Park, Changyi; Kwon, Sunghoon
2015-01-01
Owing to recent improvements in genotyping technology, large-scale genetic data can be utilized to identify disease susceptibility loci, and such findings have substantially improved our understanding of complex diseases. In spite of these successes, however, most of the genetic effects for many complex diseases were found to be very small, which has been a big hurdle to building disease prediction models. Recently, many statistical methods based on penalized regressions have been proposed to tackle the so-called "large P and small N" problem. Penalized regressions, including the least absolute selection and shrinkage operator (LASSO) and ridge regression, limit the space of parameters, and this constraint enables the estimation of effects for a very large number of SNPs. Various extensions have been suggested, and, in this report, we compare their accuracy by applying them to several complex diseases. Our results show that penalized regressions are usually robust and provide better accuracy than the existing methods for at least the diseases under consideration.
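The "large P, small N" setting can be sketched on synthetic genotype-like data (not the paper's analysis): with far more predictors than samples, LASSO shrinks most coefficients to exactly zero, leaving a sparse set of candidate variants.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 100, 1000                       # many more predictors than samples
X = rng.standard_normal((n, p))        # stand-in for coded SNP genotypes
beta = np.zeros(p)
beta[:5] = 0.8                         # only five truly associated variants
y = X @ beta + rng.normal(0.0, 0.5, n)

lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)  # indices with nonzero effect
```

Ridge regression (`sklearn.linear_model.Ridge`) would instead keep all p coefficients nonzero but shrunken, which is why the two penalties behave differently for variant selection versus prediction.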
Hepatic transporter drug-drug interactions: an evaluation of approaches and methodologies.
Williamson, Beth; Riley, Robert J
2017-12-01
Drug-drug interactions (DDIs) continue to account for 5% of hospital admissions and therefore remain a major regulatory concern. Effective, quantitative prediction of DDIs will reduce unexpected clinical findings and encourage projects to frontload DDI investigations rather than concentrating on risk management ('manage the baggage') later in drug development. A key challenge in DDI prediction is the discrepancies between reported models. Areas covered: The current synopsis focuses on four recent influential publications on hepatic drug transporter DDIs using static models that tackle interactions with individual transporters and in combination with other drug transporters and metabolising enzymes. These models vary in their assumptions (including input parameters), transparency, reproducibility and complexity. In this review, these facets are compared and contrasted with recommendations made as to their application. Expert opinion: Over the past decade, static models have evolved from simple [I]/k i models to incorporate victim and perpetrator disposition mechanisms including the absorption rate constant, the fraction of the drug metabolised/eliminated and/or clearance concepts. Nonetheless, models that comprise additional parameters and complexity do not necessarily out-perform simpler models with fewer inputs. Further, consideration of the property space to exploit some drug target classes has also highlighted the fine balance required between frontloading and back-loading studies to design out or 'manage the baggage'.
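The evolution from the simple [I]/k_i ratio to static models that incorporate victim disposition can be sketched as follows; the concentrations and constants are illustrative only.

```python
def auc_ratio(inhibitor_conc, ki, fm=1.0):
    """Static-model fold-change in victim AUC for reversible inhibition of
    a pathway clearing fraction fm of the dose. With fm = 1 this reduces
    to the basic 1 + [I]/Ki model."""
    inhibited = fm / (1.0 + inhibitor_conc / ki)   # remaining pathway CL
    return 1.0 / (inhibited + (1.0 - fm))          # parallel CL unaffected

basic = auc_ratio(1.0, 0.5)            # 1 + [I]/Ki = 3.0
with_fm = auc_ratio(1.0, 0.5, fm=0.5)  # parallel pathway caps the DDI
```

The comparison illustrates the review's point: adding the fraction metabolized changes the prediction substantially (here 3.0-fold vs. 1.5-fold), yet the model remains a simple closed form.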
Providing data science support for systems pharmacology and its implications to drug discovery.
Hart, Thomas; Xie, Lei
2016-01-01
The conventional one-drug-one-target-one-disease drug discovery process has been less successful at tackling multi-genic, multi-faceted complex diseases. Systems pharmacology has emerged as a new discipline to tackle the current challenges in drug discovery. Its goal is to transform huge, heterogeneous, and dynamic biological and clinical data into interpretable and actionable mechanistic models for decision making in drug discovery and patient treatment. Thus, big data technology and data science will play an essential role in systems pharmacology. This paper critically reviews the impact of three fundamental concepts of data science on systems pharmacology: similarity inference, overfitting avoidance, and disentangling causality from correlation. The authors then discuss recent advances and future directions in applying these concepts to drug discovery, with a focus on proteome-wide, context-specific quantitative drug target deconvolution and personalized adverse drug reaction prediction. Data science will facilitate reducing the complexity of systems pharmacology modeling, detecting hidden correlations between complex data sets, and distinguishing causation from correlation. The power of data science can only be fully realized when integrated with mechanism-based multi-scale modeling that explicitly takes into account the hierarchical organization of biological systems, from nucleic acids to proteins, molecular interaction networks, cells, tissues, patients, and populations.
E-Index for Differentiating Complex Dynamic Traits
Qi, Jiandong; Sun, Jianfeng; Wang, Jianxin
2016-01-01
While it is a daunting challenge in current biology to understand how the underlying network of genes regulates complex dynamic traits, functional mapping, a tool for mapping quantitative trait loci (QTLs) and single nucleotide polymorphisms (SNPs), has been applied in a variety of cases to tackle this challenge. Though useful and powerful, functional mapping performs well only when one or more model parameters are clearly responsible for the developmental trajectory, typically a logistic curve; it does not work when the curves are more complex, especially when they are not monotonic. To overcome this limitation, we propose a mathematical-biological concept and measurement, the E-index (earliness-index), which cumulatively measures how early a variable (or dynamic trait) increases or decreases its value. Theoretical proofs and simulation studies show that the E-index is more general than functional mapping and can be applied to any complex dynamic trait, including those with logistic curves and those with non-monotonic curves. An E-index vector is also proposed to capture more subtle differences in developmental patterns. PMID:27064292
NASA Astrophysics Data System (ADS)
Najafi, Amir Abbas; Pourahmadi, Zahra
2016-04-01
Selecting the optimal combination of assets in a portfolio is one of the most important decisions in investment management. Because investment is a long-term undertaking, optimizing a portfolio over a single period may forgo opportunities that could be exploited over a longer horizon. We therefore extend the problem from a single-period to a multi-period model and include trading costs and uncertain conditions, which makes the model more realistic but also more complex. We propose an efficient heuristic method to tackle this problem. The efficiency of the method is examined and compared with the results of rolling single-period optimization and of the buy-and-hold strategy, which shows the superiority of the proposed method.
Mesoscale modeling: solving complex flows in biology and biotechnology.
Mills, Zachary Grant; Mao, Wenbin; Alexeev, Alexander
2013-07-01
Fluids are involved in practically all physiological activities of living organisms. However, biological and biorelated flows are hard to analyze due to the inherent combination of interdependent effects and processes that occur on a multitude of spatial and temporal scales. Recent advances in mesoscale simulations enable researchers to tackle problems that are central for the understanding of such flows. Furthermore, computational modeling effectively facilitates the development of novel therapeutic approaches. Among other methods, dissipative particle dynamics and the lattice Boltzmann method have become increasingly popular during recent years due to their ability to solve a large variety of problems. In this review, we discuss recent applications of these mesoscale methods to several fluid-related problems in medicine, bioengineering, and biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.
El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher
2018-01-01
Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
A role for low-order system dynamics models in urban health policy making.
Newell, Barry; Siri, José
2016-10-01
Cities are complex adaptive systems whose responses to policy initiatives emerge from feedback interactions between their parts. Urban policy makers must routinely deal with both detail and dynamic complexity, coupled with high levels of diversity, uncertainty and contingency. In such circumstances, it is difficult to generate reliable predictions of health-policy outcomes. In this paper we explore the potential for low-order system dynamics (LOSD) models to make a contribution towards meeting this challenge. By definition, LOSD models have few state variables (≤5), illustrate the non-linear effects caused by feedback and accumulation, and focus on endogenous dynamics generated within well-defined boundaries. We suggest that experience with LOSD models can help practitioners to develop an understanding of basic principles of system dynamics, giving them the ability to 'see with new eyes'. Because efforts to build a set of LOSD models can help a transdisciplinary group to develop a shared, coherent view of the problems that they seek to tackle, such models can also become the foundations of 'powerful ideas'. Powerful ideas are conceptual metaphors that provide the members of a policy-making group with the a priori shared context required for effective communication, the co-production of knowledge, and the collaborative development of effective public health policies. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.
2015-11-01
We offer a versatile workflow to convert geological models built with the Paradigm GOCAD (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for use in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
NASA Astrophysics Data System (ADS)
Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong
2017-06-01
Effective application of carbon capture, utilization and storage (CCUS) systems could help to alleviate the influence of climate change by reducing carbon dioxide (CO2) emissions. The research objective of this study is to develop an equilibrium chance-constrained programming model with bi-random variables (ECCP model) for supporting the CCUS management system under random circumstances. The major advantage of the ECCP model is that it treats random variables as bi-random variables with a normal distribution whose mean values themselves follow a normal distribution. This avoids irrational assumptions and oversimplifications in parameter design and enriches the theory of stochastic optimization. The ECCP model is solved by an equilibrium chance-constrained programming algorithm, which allows decision makers to rank the solution set using the natural order of real numbers. The ECCP model is applied to a CCUS management problem, and the solutions could help managers to design and generate rational CO2-allocation patterns under complexity and uncertainty.
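A minimal sketch of how a chance constraint with a (bi-)random bound reduces to a deterministic equivalent, here treating the bi-random cap by its marginal normal distribution; the function name and the numbers are invented for illustration, and the actual ECCP algorithm is considerably more involved.

```python
from statistics import NormalDist

def deterministic_capacity(m, s, sigma, alpha):
    """Deterministic equivalent of P(emission <= cap) >= alpha.

    The cap is bi-random: cap ~ N(mu, sigma^2) with mu itself ~ N(m, s^2).
    Marginally cap ~ N(m, sigma^2 + s^2), so the chance constraint becomes
        emission <= m + sqrt(sigma^2 + s^2) * Phi^{-1}(1 - alpha),
    a plain linear bound that an optimizer can use directly.
    """
    total_sd = (sigma ** 2 + s ** 2) ** 0.5
    z = NormalDist().inv_cdf(1 - alpha)   # negative for alpha > 0.5
    return m + total_sd * z
```

With m = 100, s = 3, sigma = 4 and alpha = 0.95, the admissible emission level drops to about 91.8: demanding 95% reliability tightens the nominal cap of 100 by roughly 1.645 combined standard deviations.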
Combining complex networks and data mining: Why and how
NASA Astrophysics Data System (ADS)
Zanin, M.; Papo, D.; Sousa, P. A.; Menasalvas, E.; Nicchi, A.; Kubik, E.; Boccaletti, S.
2016-05-01
The increasing power of computer technology does not dispense with the need to extract meaningful information out of data sets of ever growing size, and indeed typically exacerbates the complexity of this task. To tackle this general problem, two methods have emerged, at chronologically different times, that are now commonly used in the scientific community: data mining and complex network theory. Not only do complex network analysis and data mining share the same general goal, that of extracting information from complex systems to ultimately create a new compact quantifiable representation, but they also often address similar problems. Despite this, surprisingly few researchers resort to both methodologies. One may then be tempted to conclude that these two fields are either largely redundant or totally antithetic. The starting point of this review is that this state of affairs should be put down to contingent rather than conceptual differences, and that the two fields can in fact advantageously be used in a synergistic manner. An overview of both fields is first provided and some fundamental concepts are illustrated. A variety of contexts in which complex network theory and data mining have been used synergistically are then presented: contexts in which the appropriate integration of complex network metrics can lead to improved classification rates with respect to classical data mining algorithms and, conversely, contexts in which data mining can be used to tackle important issues in complex network theory applications. Finally, ways to achieve a tighter integration between complex networks and data mining, and open lines of research, are discussed.
Inviting Uncertainty into the Classroom
ERIC Educational Resources Information Center
Beghetto, Ronald A.
2017-01-01
Most teachers try to avoid having students experience uncertainty in their schoolwork. But if we want to prepare students to tackle complex problems (and the uncertainty that accompanies such problems), we must give them learning experiences that involve feeling unsure and sometimes even confused. Beghetto presents five strategies that help…
A complex social-ecological disaster: Environmentally induced forced migration
Rechkemmer, Andreas; O'Connor, Ashley; Rai, Abha; Decker Sparks, Jessica L.; Mudliar, Pranietha; Shultz, James M.
2016-01-01
In the 21st century, global issues are increasingly characterized by inter-connectedness and complexity. Global environmental change, and climate change in particular, has become a powerful driver and catalyst of forced migration and internal displacement of people. Environmental migrants may far outnumber any other group of displaced people and refugees in the years to come. Deeper scientific integration, especially across the social sciences, is a prerequisite to tackle this issue.
Dühring, Sybille; Germerodt, Sebastian; Skerka, Christine; Zipfel, Peter F.; Dandekar, Thomas; Schuster, Stefan
2015-01-01
The diploid, polymorphic yeast Candida albicans is one of the most important human pathogenic fungi. C. albicans can grow, proliferate and coexist as a commensal on or within the human host for a long time. However, alterations in the host environment can render C. albicans virulent. In this review, we describe the immunological cross-talk between C. albicans and the human innate immune system. We give an overview, in the form of paired strategies, of human defenses (immunological mechanisms as well as general stressors such as nutrient limitation, pH and fever) and the corresponding fungal response and evasion mechanisms. Furthermore, computational systems biology approaches to model and investigate these complex interactions are highlighted, with a special focus on game-theoretical methods and agent-based models. An outlook on interesting questions to be tackled by systems biology regarding entangled defense and evasion mechanisms is given. PMID:26175718
NASA Astrophysics Data System (ADS)
Xuan, Hejun; Wang, Yuping; Xu, Zhanqi; Hao, Shanshan; Wang, Xiaoli
2017-11-01
Virtualization technology can greatly improve the efficiency of networks by allowing virtual optical networks to share the resources of the physical networks. However, it faces some challenges, such as finding efficient strategies for virtual node mapping, virtual link mapping and spectrum assignment. The problem is even more complex and challenging when the physical elastic optical networks use multi-core fibers. To tackle these challenges, we establish a constrained optimization model to determine the optimal schemes of optical network mapping, core allocation and spectrum assignment. To solve the model efficiently, a tailor-made encoding scheme and crossover and mutation operators are designed. Based on these, an efficient genetic algorithm is proposed to obtain the optimal schemes of virtual node mapping, virtual link mapping and core allocation. The simulation experiments are conducted on three widely used networks, and the experimental results show the effectiveness of the proposed model and algorithm.
From global circulation to flood loss: Coupling models across the scales
NASA Astrophysics Data System (ADS)
Felder, Guido; Gomez-Navarro, Juan Jose; Bozhinova, Denica; Zischg, Andreas; Raible, Christoph C.; Roessler, Ole; Martius, Olivia; Weingartner, Rolf
2017-04-01
The prediction and the prevention of flood losses require an extensive understanding of the underlying meteorological, hydrological, hydraulic and damage processes. Coupled models help to improve the understanding of such underlying processes and therefore contribute to the understanding of flood risk. Using such a modelling approach to determine potentially flood-affected areas and damages requires a complex coupling between several models operating at different spatial and temporal scales. Although the isolated parts of the single modelling components are well established and commonly used in the literature, a full coupling including a mesoscale meteorological model driven by a global circulation one, a hydrologic model, a hydrodynamic model and a flood impact and loss model has not been reported so far. In the present study, we tackle the application of such a coupled model chain in terms of computational resources, scale effects, and model performance. From a technical point of view, results show the general applicability of such a coupled model, as well as good model performance. From a practical point of view, such an approach enables the prediction of flood-induced damages, although some future challenges have been identified.
Benson, Neil
2015-08-01
Phase II attrition remains the most important challenge for drug discovery. Tackling the problem requires improved understanding of the complexity of disease biology. Systems biology approaches to this problem can, in principle, deliver this. This article reviews the reports of the application of mechanistic systems models to drug discovery questions and discusses the added value. Although we are on the journey to the virtual human, the length, path and rate of learning from this remain an open question. Success will be dependent on the will to invest and make the most of the insight generated along the way. Copyright © 2015 Elsevier Ltd. All rights reserved.
Exploring first-order phase transitions with population annealing
NASA Astrophysics Data System (ADS)
Barash, Lev Yu.; Weigel, Martin; Shchur, Lev N.; Janke, Wolfhard
2017-03-01
Population annealing is a hybrid of sequential and Markov chain Monte Carlo methods geared towards the efficient parallel simulation of systems with complex free-energy landscapes. Systems with first-order phase transitions are among the problems in computational physics that are difficult to tackle with standard methods such as local-update simulations in the canonical ensemble, for example with the Metropolis algorithm. It is hence interesting to see whether such transitions can be more easily studied using population annealing. We report here our preliminary observations from population annealing runs for the two-dimensional Potts model with q > 4, where it undergoes a first-order transition.
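A minimal population-annealing sketch for the q-state Potts model (here the tiny q = 2, 4x4 case rather than the q > 4 system studied above); lattice size, population size and the annealing schedule are illustrative, and a production code would use local energy differences and adaptive resampling.

```python
import math
import random

random.seed(1)

L, Q = 4, 2                # tiny periodic lattice; Q = 2 is the Ising case
N_POP = 50                 # number of replicas in the population
BETAS = [0.1 * k for k in range(11)]   # annealing schedule beta = 0.0 ... 1.0

def energy(spins):
    """Potts energy: -1 for each aligned nearest-neighbour pair (periodic)."""
    e = 0
    for i in range(L):
        for j in range(L):
            s = spins[i][j]
            e -= (s == spins[(i + 1) % L][j]) + (s == spins[i][(j + 1) % L])
    return e

def sweep(spins, beta):
    """One Metropolis sweep: L*L single-spin update attempts."""
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        old, e_old = spins[i][j], energy(spins)
        spins[i][j] = random.randrange(Q)
        d_e = energy(spins) - e_old
        if d_e > 0 and random.random() >= math.exp(-beta * d_e):
            spins[i][j] = old          # reject the move

def population_annealing(sweeps_per_step=3):
    """Anneal a population from beta = 0 to 1 with resampling at each step."""
    pop = [[[random.randrange(Q) for _ in range(L)] for _ in range(L)]
           for _ in range(N_POP)]
    for b_prev, b in zip(BETAS, BETAS[1:]):
        # Resample replicas with weights exp(-(b - b_prev) * E), then equilibrate
        weights = [math.exp(-(b - b_prev) * energy(s)) for s in pop]
        pop = [[row[:] for row in s]
               for s in random.choices(pop, weights=weights, k=N_POP)]
        for s in pop:
            for _ in range(sweeps_per_step):
                sweep(s, b)
    return pop
```

The resampling step is what distinguishes the method from plain simulated annealing: low-energy replicas are duplicated and high-energy ones culled, which is precisely the mechanism that helps the population cross the free-energy barriers of a first-order transition.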
Evolution for Young Victorians
ERIC Educational Resources Information Center
Lightman, Bernard
2012-01-01
Evolution was a difficult topic to tackle when writing books for the young in the wake of the controversies over Darwin's "Origin of Species." Authors who wrote about evolution for the young experimented with different ways of making the complex concepts of evolutionary theory accessible and less controversial. Many authors presented…
Deaf Children's Construction of Writing
ERIC Educational Resources Information Center
Massone, Maria Ignacia; Baez, Monica
2009-01-01
High illiteracy rates among the Argentine deaf population, even after long years of schooling, point to the need to revise certain approaches to deaf literacy, particularly in school settings. Qualitative change in deaf literacy requires the use of multiple conceptual tools if learners are to be able to tackle its complexity without reductionism…
Evaluation and expression analysis of alfalfa genotypes in response to prolonged salt stress
USDA-ARS?s Scientific Manuscript database
Salinity is one of the most important abiotic stresses that adversely affect plant growth and productivity globally. In order to tackle this complex problem, it is important to link the biochemical and physiological responses with the underlying genetic mechanisms. In this study, we used 12 previous...
Mapping a Semester: Using Cultural Mapping in an Honors Humanities Course
ERIC Educational Resources Information Center
Martin, Robyn S.
2013-01-01
Grand Canyon Semesters (GCS) at Northern Arizona University are integrated learning experiences in the humanities and sciences. Students study the environmental and social challenges confronting us in the twenty-first century using an interdisciplinary approach to the curriculum. During previous semesters, participants have tackled complex issues…
Improving Collaborative Learning in Online Software Engineering Education
ERIC Educational Resources Information Center
Neill, Colin J.; DeFranco, Joanna F.; Sangwan, Raghvinder S.
2017-01-01
Team projects are commonplace in software engineering education. They address a key educational objective, provide students critical experience relevant to their future careers, allow instructors to set problems of greater scale and complexity than could be tackled individually, and are a vehicle for socially constructed learning. While all…
Tackling environmental, economic, and social sustainability issues with community stakeholders will often lead to choices that are costly, complex and uncertain. A formal process with proper guidance is needed to understand the issues, identify sources of disagreement, consider t...
The Future Problem Solving Program.
ERIC Educational Resources Information Center
Crabbe, Anne B.
1989-01-01
Describes the Future Problem Solving Program, in which students from the U.S. and around the world are tackling some complex challenges facing society, ranging from acid rain to terrorism. The program uses a creative problem solving process developed for business and industry. A sixth-grade toxic waste cleanup project illustrates the process.…
Consultancy on Strategic Information Planning.
ERIC Educational Resources Information Center
Pejova, Zdravka, Ed.; Horton, Forest W., Ed.
At the workshop, better management through strategic planning of information and consultancy was discussed as one way in which developing and Eastern European countries could tackle the complex information problems they are facing during the transition to a market economy. The sixteen papers in this volume are grouped into three basic categories:…
NASA Astrophysics Data System (ADS)
Barati Farimani, Amir; Gomes, Joseph; Pande, Vijay
2017-11-01
We have developed a new data-driven modeling paradigm for the rapid inference and solution of the constitutive equations of fluid mechanics using deep learning models. Using generative adversarial networks (GANs), we train models for the direct generation of solutions to steady-state heat conduction and incompressible fluid flow without knowledge of the underlying governing equations. Rather than using artificial neural networks to approximate the solution of the constitutive equations, GANs can directly generate the solutions to these equations conditional upon an arbitrary set of boundary conditions. Both models predict temperature, velocity and pressure fields with high test accuracy (>99.5%). Our framework for inferring and generating the solutions of partial differential equations can be applied to any physical phenomenon and can be used to learn directly from experiments where the underlying physical model is complex or unknown. We have also shown that our framework can be used to couple multiple physics simultaneously, making it amenable to tackling multi-physics problems.
Extension of specification language for soundness and completeness of service workflow
NASA Astrophysics Data System (ADS)
Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn
2018-05-01
A Service Workflow is an aggregation of distributed services that together fulfill specific functionalities. With ever more services available, methodologies for selecting services against given requirements have become a main research subject in multiple disciplines. A few researchers have contributed formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec is demonstrated by examples addressing soundness, completeness, and consistency.
Cross-Dependency Inference in Multi-Layered Networks: A Collaborative Filtering Perspective.
Chen, Chen; Tong, Hanghang; Xie, Lei; Ying, Lei; He, Qing
2017-08-01
The increasingly connected world has catalyzed the fusion of networks from different domains, which facilitates the emergence of a new network model: multi-layered networks. Examples of such network systems include critical infrastructure networks, biological systems, organization-level collaborations, and cross-platform e-commerce. One crucial structure that distinguishes multi-layered networks from other network models is the cross-layer dependency, which describes the associations between nodes from different layers. Needless to say, cross-layer dependency plays an essential role in many data mining applications such as system robustness analysis and complex network control. However, knowing the exact dependency relationships remains a daunting task due to noise, limited accessibility, and so forth. In this article, we tackle the cross-layer dependency inference problem by modeling it as a collective collaborative filtering problem. Based on this idea, we propose an effective algorithm, Fascinate, that can reveal unobserved dependencies with linear complexity. Moreover, we derive Fascinate-ZERO, an online variant of Fascinate that can respond to a newly added node in a timely manner by checking its neighborhood dependencies. We perform extensive evaluations on real datasets to substantiate the superiority of our proposed approaches.
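The collaborative-filtering view of dependency inference can be illustrated with plain low-rank matrix factorization over the observed cross-layer links; the actual Fascinate algorithm adds within-layer network regularization and careful complexity analysis, and the function names and data below are invented for illustration.

```python
import random

random.seed(0)

def factorize(obs, n_rows, n_cols, rank=2, lr=0.05, reg=0.01, epochs=2000):
    """Plain matrix factorization over observed cross-layer dependencies.

    obs: dict {(i, j): value} of known dependencies between layer-A node i
    and layer-B node j (1.0 = link, 0.0 = known non-link). Returns factor
    matrices F (n_rows x rank) and G (n_cols x rank); the predicted score
    for an unobserved pair (i, j) is dot(F[i], G[j]).
    """
    F = [[random.uniform(-0.1, 0.1) for _ in range(rank)] for _ in range(n_rows)]
    G = [[random.uniform(-0.1, 0.1) for _ in range(rank)] for _ in range(n_cols)]
    for _ in range(epochs):
        for (i, j), v in obs.items():
            err = v - sum(F[i][k] * G[j][k] for k in range(rank))
            for k in range(rank):
                f, g = F[i][k], G[j][k]
                # SGD step on squared error with L2 regularization
                F[i][k] += lr * (err * g - reg * f)
                G[j][k] += lr * (err * f - reg * g)
    return F, G

def score(F, G, i, j):
    """Predicted dependency strength for pair (i, j)."""
    return sum(F[i][k] * G[j][k] for k in range(len(F[i])))
```

On a toy two-block dependency matrix with two entries held out, the learned factors recover the block structure: the held-out within-block pair scores high and the held-out cross-block pair scores low, which is the "reveal unobserved dependencies" behaviour described above.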
Multiple hypothesis tracking for cluttered biological image sequences.
Chenouard, Nicolas; Bloch, Isabelle; Olivo-Marin, Jean-Christophe
2013-11-01
In this paper, we present a method for simultaneously tracking thousands of targets in biological image sequences, which is of major importance in modern biology. The complexity and inherent randomness of the problem lead us to propose a unified probabilistic framework for tracking biological particles in microscope images. The framework includes realistic models of particle motion and existence and of fluorescence image features. For the track extraction process per se, the very cluttered conditions motivate the adoption of a multiframe approach that enforces tracking decision robustness to poor imaging conditions and to random target movements. We tackle the large-scale nature of the problem by adapting the multiple hypothesis tracking algorithm to the proposed framework, resulting in a method with a favorable tradeoff between the model complexity and the computational cost of the tracking procedure. When compared to the state-of-the-art tracking techniques for bioimaging, the proposed algorithm is shown to be the only method providing high-quality results despite the critically poor imaging conditions and the dense target presence. We thus demonstrate the benefits of advanced Bayesian tracking techniques for the accurate computational modeling of dynamical biological processes, which is promising for further developments in this domain.
Advice to Policy Makers Who Would Tackle Syria: The Problem with Problem Solving
2014-01-01
consensus on the specific way forward in Syria, there is one thing most do agree on; Syria is complex. It is complex in the familiar use of that term... strengthen stabilizing loops (ones that keep things from getting worse) or virtuous cycles (ones that make... things worse and worse over time). Second, and more importantly, affecting patterns can be the key to solving the problem of strained or insufficient
Recent advances in non-LTE stellar atmosphere models
NASA Astrophysics Data System (ADS)
Sander, Andreas A. C.
2017-11-01
In the last decades, stellar atmosphere models have become a key tool for understanding massive stars. Applied for spectroscopic analysis, these models provide quantitative information on stellar wind properties as well as fundamental stellar parameters. The intricate non-LTE conditions in stellar winds dictate the development of adequately sophisticated model atmosphere codes. The increase in both computational power and our understanding of physical processes in stellar atmospheres has led to increasing complexity in the models. As a result, codes emerged that can tackle a wide range of stellar and wind parameters. After a brief address of the fundamentals of stellar atmosphere modeling, the current stage of clumped and line-blanketed model atmospheres will be discussed. Finally, the path for the next generation of stellar atmosphere models will be outlined. Apart from discussing multi-dimensional approaches, I will emphasize the coupling of hydrodynamics with a sophisticated treatment of the radiative transfer. This next generation of models will be able to predict wind parameters from first principles, which could open new doors for our understanding of the various facets of massive star physics, evolution, and death.
Savage, Trevor Nicholas; McIntosh, Andrew Stuart
2017-03-01
It is important to understand the factors contributing to and directly causing sports injuries in order to improve the effectiveness and safety of sports skills. The characteristics of injury events must be evaluated and described meaningfully and reliably. However, many complex skills cannot be effectively investigated quantitatively because of ethical, technological and validity considerations. Increasingly, qualitative methods are being used to investigate human movement for research purposes, but there are concerns about the reliability and measurement bias of such methods. Using the tackle in Rugby union as an example, we outline a systematic approach for developing a skill analysis protocol with a focus on improving objectivity, validity and reliability. Characteristics for analysis were selected using qualitative analysis, biomechanical theoretical models, and the epidemiological and coaching literature. An expert panel of subject matter experts provided feedback, and the inter-rater reliability of the protocol was assessed using ten trained raters. The inter-rater reliability results were reviewed by the expert panel, and the protocol was revised and assessed in a second inter-rater reliability study. Mean agreement in the second study improved and was comparable with other studies that have reported inter-rater reliability for qualitative analysis of human movement (52-90% agreement; ICC between 0.6 and 0.9).
NASA Astrophysics Data System (ADS)
Heylighen, Francis
2017-01-01
The world is confronted with a variety of interdependent problems, including scarcity, unsustainability, inequality, pollution and poor governance. Tackling such complex challenges requires coordinated action. The present paper proposes the development of a self-organizing system for coordination, called an "offer network", that would use the distributed intelligence of the Internet to match the offers and needs of all human, technological and natural agents on the planet. This would maximize synergy and thus minimize waste and scarcity of resources. Implementing such coordination requires a protocol that formally defines agents, offers, needs, and the network of condition-action rules or reactions that interconnect them. Matching algorithms can then determine self-sustaining subnetworks in which each consumed resource (need) is also produced (offer). After sketching the elements of a mathematical foundation for offer networks, the paper proposes a roadmap for their practical implementation. This includes step-by-step integration with technologies such as the Semantic Web, ontologies, the Internet of Things, reputation and recommendation systems, reinforcement learning, governance through legal constraints and nudging, and ecosystem modeling. The resulting intelligent platform should be able to tackle nearly all practical and theoretical problems in a bottom-up, distributed manner, thus functioning like a Global Brain for humanity.
Translational Systems Biology and Voice Pathophysiology
Li, Nicole Y. K.; Abbott, Katherine Verdolini; Rosen, Clark; An, Gary; Hebda, Patricia A.; Vodovotz, Yoram
2011-01-01
Objectives/Hypothesis: Personalized medicine has been called upon to tailor healthcare to an individual's needs. Evidence-based medicine (EBM) has advocated using randomized clinical trials with large populations to evaluate treatment effects. However, due to large variations across patients, the results are likely not to apply to an individual patient. We suggest that a complementary, systems biology approach using computational modeling may help tackle biological complexity in order to improve ultimate patient care. The purpose of the article is: 1) to review the pros and cons of EBM, and 2) to discuss the alternative systems biology method and present its utility in clinical voice research. Study Design: Tutorial. Methods: Literature review and discussion. Results: We propose that translational systems biology can address many of the limitations of EBM pertinent to voice and other health care domains, and thus complement current health research models. In particular, recent work using mathematical modeling suggests that systems biology has the ability to quantify the highly complex biologic processes underlying voice pathophysiology. Recent data support the premise that this approach can be applied specifically in the case of phonotrauma and surgically induced vocal fold trauma, and may have particular power to address personalized medicine. Conclusions: We propose that evidence around vocal health and disease be expanded beyond a population-based method to consider more fully issues of complexity and systems interactions, especially in implementing personalized medicine in voice care and beyond. PMID:20025041
Factors influencing tackle injuries in rugby union football
Garraway, W. M.; Lee, A. J.; Macleod, D. A.; Telfer, J. W.; Deary, I. J.; Murray, G. D.
1999-01-01
OBJECTIVES: To assess the influence of selected aspects of lifestyle, personality, and other player related factors on injuries in the tackle. To describe the detailed circumstances in which these tackles occurred. METHODS: A prospective case-control study was undertaken in which the tackling and tackled players ("the cases") involved in a tackle injury were each matched with "control" players who held the same respective playing positions in the opposing teams. A total of 964 rugby matches involving 71 senior clubs drawn from all districts of the Scottish Rugby Union (SRU) were observed by nominated linkmen who administered self report questionnaires to the players identified as cases and controls. Information on lifestyle habits, match preparation, training, and coaching experience was obtained. A validated battery of psychological tests assessed players' trait anger and responses to anger and hostility. The circumstances of the tackles in which injury occurred were recorded by experienced SRU coaching staff in interviews with involved players after the match. RESULTS: A total of 71 tackle injury episodes with correct matching of cases and controls were studied. The following player related factors did not contribute significantly to tackle injuries: alcohol consumption before the match, feeling "below par" through minor illness, the extent of match preparation, previous coaching, or practising tackling. Injured and non-injured players in the tackle did not differ in their disposition toward, or expression of, anger or hostility. Some 85% of tackling players who were injured were three-quarters, and 52% of injuries occurred when the tackle came in behind the tackled player or within his peripheral vision. Either the tackling or tackled player was sprinting or running in all of these injury episodes. One third of injuries occurred in differential speed tackles--that is, when one player was travelling much faster than the other at impact.
The player with the lower momentum was injured in 80% of these cases. Forceful or crunching tackles resulting in injury mostly occurred head on or within the tackled player's side vision. CONCLUSIONS: Attention should be focused on high speed tackles going in behind the tackled player's line of vision. Comparative information on the circumstances of the vast majority of tackles in which no injury occurs is required before any changes are considered to reduce injuries in the tackle. PMID:10027056
A fast mass spring model solver for high-resolution elastic objects
NASA Astrophysics Data System (ADS)
Zheng, Mianlun; Yuan, Zhiyong; Zhu, Weixu; Zhang, Guian
2017-03-01
Real-time simulation of elastic objects is of great importance for computer graphics and virtual reality applications. The fast mass spring model solver can achieve visually realistic simulation in an efficient way. Unfortunately, this method suffers from resolution limitations and a lack of mechanical realism for a surface geometry model, which greatly restricts its application. To tackle these problems, in this paper we propose a fast mass spring model solver for high-resolution elastic objects. First, we project the complex surface geometry model into a set of uniform grid cells serving as cages through the mean value coordinates method to reflect its internal structure and mechanical properties. Then, we replace the original Cholesky decomposition method in the fast mass spring model solver with a conjugate gradient method, which makes the fast mass spring model solver more efficient for detailed surface geometry models. Finally, we propose a graphics processing unit accelerated parallel algorithm for the conjugate gradient method. Experimental results show that our method can realize efficient deformation simulation of 3D elastic objects with visual realism and physical fidelity, which has great potential for applications in computer animation.
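The solver swap the abstract describes (Cholesky factorization replaced by conjugate gradients for the global linear solve) can be sketched generically. This is a minimal, textbook CG implementation for a symmetric positive-definite system, not the paper's code; the toy matrix standing in for the mass-spring system matrix is invented:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual
    p = r.copy()             # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Toy SPD system standing in for the mass-spring global step
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

Unlike a Cholesky factorization, CG needs only matrix-vector products, which is why it maps well onto GPU parallelism for high-resolution meshes.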
Information Retrieval and Graph Analysis Approaches for Book Recommendation.
Benkoussas, Chahinez; Bellot, Patrice
2015-01-01
A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a network of related documents connected by social links. We call this network, constructed from documents and the social information each provides, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
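The PageRank computation applied to a directed graph of documents can be sketched with power iteration; the tiny adjacency matrix and the damping value below are illustrative, not taken from the paper:

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-9, max_iter=200):
    """Power-iteration PageRank. adj[i, j] = 1 if document i links to j."""
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # Row-stochastic transition matrix; dangling nodes spread mass uniformly
    T = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg, 1)[:, None],
                 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1 - damping) / n + damping * (T.T @ r)
        if np.abs(r_new - r).sum() < tol:
            r = r_new
            break
        r = r_new
    return r

# Tiny "Directed Graph of Documents": 0 -> 1, 0 -> 2, 1 -> 2, 2 -> 0
adj = np.array([[0, 1, 1],
                [0, 0, 1],
                [1, 0, 0]], dtype=float)
scores = pagerank(adj)
```

Document 2, which receives links from both other documents, ends up with the highest score; the scores sum to one and can be interpolated with retrieval-model scores for reranking.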
Martínez, Jimena H; Fuentes, Federico; Vanasco, Virginia; Alvarez, Silvia; Alaimo, Agustina; Cassina, Adriana; Coluccio Leskow, Federico; Velazquez, Francisco
2018-08-01
α-synuclein is involved in both familial and sporadic Parkinson's disease. Although its interaction with mitochondria has been well documented, several aspects remain unknown or under debate, such as the specific sub-mitochondrial localization or the dynamics of the interaction. It has been suggested that α-synuclein could only interact with ER-associated mitochondria. The wide variety of model systems and experimental conditions makes it difficult to compare results and draw definitive conclusions. Here we tackle this by analyzing, in a simplified system, the interaction between purified α-synuclein and isolated rat brain mitochondria. This work shows that wild type α-synuclein interacts with isolated mitochondria and translocates into the mitochondrial matrix. This interaction and the irreversibility of α-synuclein translocation depend on incubation time and α-synuclein concentration. FRET experiments show that α-synuclein localizes close to components of the TOM complex, suggesting passive transport of α-synuclein through the outer membrane. In addition, α-synuclein binding alters mitochondrial function at the level of Complex I, leading to a decrease in ATP synthesis and an increase in ROS production. Copyright © 2018. Published by Elsevier Inc.
Beyond Words: Comics in the Social Work Classroom
ERIC Educational Resources Information Center
Akesson, Bree; Oba, Olufunke
2017-01-01
Equipping future social workers to interrogate social justice, human rights, and cultural issues requires a revision of social work education. Culturally relevant teaching is increasingly important in today's globalized world. In this article, we explore the role of comics as a form of social work pedagogy to tackle complex social issues. The…
Science Education: A (Pending) Chapter in the Curriculum Transformation in Argentina
ERIC Educational Resources Information Center
Labate, Hugo
2007-01-01
The article documents the complex process of changing Argentina's science curriculum and implementing those changes over the last 15 years. It recounts how reformers tackled the challenges of balancing national (federal) unity in education with local (provincial) autonomy from the political, social and pedagogical points of view. It also analyzes…
Young Adult Literature: From Romance to Realism
ERIC Educational Resources Information Center
Cart, Michael
2010-01-01
Today's young adult (YA) literature is every bit as complex as the audience it's written for, unflinchingly addressing such topics as homosexuality, mental illness, AIDS and drug abuse. In this much expanded revision of his 1996 book, veteran author Michael Cart shows how the best of contemporary YA lit has evolved to tackle such daunting subjects…
Integrating ecological and social knowledge: learning from CHANS research
Bruce Shindler; Thomas A. Spies; John P. Bolte; Jeffrey D. Kline
2017-01-01
Scientists are increasingly called upon to integrate across ecological and social disciplines to tackle complex coupled human and natural system (CHANS) problems. Integration of these disciplines is challenging and many scientists do not have experience with large integrated research projects. However, much can be learned about the complicated process of integration...
Urban stewardship as a catalyst for recovery and change
Erika S. Svendsen; Lindsay K. Campbell; Nancy F. Sonti; Gillian Baine
2015-01-01
Current scientific conversation and practice often emphasizes the importance of interdisciplinary research in tackling complex, contemporary issues. Direct observation is one of the most abiding, and sometimes overlooked, scientific methods that is common across most disciplines. On a summer afternoon in 2012, our USDA Forest Service research team went for a hike along...
Infusing Ethics into the Development of Engineers: Exemplary Education Activities and Programs
ERIC Educational Resources Information Center
National Academies Press, 2016
2016-01-01
Ethical practice in engineering is critical for ensuring public trust in the field and in its practitioners, especially as engineers increasingly tackle international and socially complex problems that combine technical and ethical challenges. This report aims to raise awareness of the variety of exceptional programs and strategies for improving…
A Critical Approach towards Dyslexia
ERIC Educational Resources Information Center
Leonard, Bobby
2005-01-01
This article discusses dyslexia (one of the many complex issues that affects students) and the ways to tackle it appropriately. Dyslexia is described as a syndrome in which a person's reading and/or writing ability is significantly lower than that which would be predicted by his or her general level of intelligence. People are diagnosed as…
ERIC Educational Resources Information Center
Heras, Maria; Ruiz-Mallén, Isabel
2017-01-01
The emerging paradigm of responsible research and innovation (RRI) in the European Commission policy discourse identifies science education as a key agenda for better equipping students with skills and knowledge to tackle complex societal challenges and foster active citizenship in democratic societies. The operationalisation of this broad…
Teaching and Learning Processes for Social Transformation: Engaging a Kaleidoscope of Learners
ERIC Educational Resources Information Center
Rutherford, Gayle E.; Walsh, Christine A.; Rook, John
2011-01-01
To tackle the complexity of issues associated with homelessness, an interdisciplinary lens with direct input from service providers and community members is necessary. Within a community-university partnership between a larger inner-city multiservice shelter serving the homeless population, and faculties of social work and nursing in a Canadian…
Prose and Cons: Theatrical Encounters with Students and Prisoners in Ma'asiyahu, Israel
ERIC Educational Resources Information Center
Kuftinec, Sonja; Alon, Chen
2007-01-01
This article details how a unique educational project conducted through Tel Aviv University's Community Theatre program tackled the complex dynamics of the prison-political system over nine months in 2005-2006. The program focused on theatrical facilitations between mainly female students and male prisoners - two more or less homogeneous groups…
We Need More Migration Between the Sciences
NASA Astrophysics Data System (ADS)
Wiesner, Karoline
The most exciting prospects for complexity science today are in the social sciences. Migration is a good example. According to the UN, 720 million people worldwide are currently internal migrants and 120 million are international migrants. How many will there be in 2030? From where and to where will they migrate, why, at what cost, and with what consequences? We require a cross-disciplinary effort involving tools from complexity science, political science, social science, environmental science, psychology, epidemiology, biochemistry, and mathematics to tackle these questions...
Scale Mixture Models with Applications to Bayesian Inference
NASA Astrophysics Data System (ADS)
Qin, Zhaohui S.; Damien, Paul; Walker, Stephen
2003-11-01
Scale mixtures of uniform distributions are used to model non-normal data in time series and econometrics in a Bayesian framework. Heteroscedastic and skewed data models are also tackled using scale mixtures of uniform distributions.
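A well-known identity underlying this line of work (stated here as a standard result, not as the paper's specific construction) is that the standard normal is itself a scale mixture of uniforms: if V ~ Gamma(shape 3/2, rate 1/2) and X | V ~ Uniform(-√V, √V), then X ~ N(0, 1). A quick Monte Carlo check of that representation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Mixing distribution: Gamma(shape=3/2, rate=1/2); NumPy parameterizes by
# scale = 1/rate, so scale = 2.
v = rng.gamma(shape=1.5, scale=2.0, size=n)

# Conditional layer: X | V ~ Uniform(-sqrt(V), sqrt(V))
x = rng.uniform(-np.sqrt(v), np.sqrt(v))

# x should now be (approximately) standard normal: mean 0, variance 1,
# and about 5% of the mass beyond |x| > 1.96.
```

In a Gibbs sampler this two-layer representation is what makes the conditional distributions tractable: given V the likelihood is a product of uniform densities.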
The big data-big model (BDBM) challenges in ecological research
NASA Astrophysics Data System (ADS)
Luo, Y.
2015-12-01
The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies across the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine the feedback of the terrestrial carbon cycle to global change. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors in those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe the major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges.
Those challenges include interoperability of multiple, heterogeneous data sets; intractability of structural complexity of big models; equifinality of model structure selection and parameter estimation; and computational demand of global optimization with Big Models.
Non-sanctioning of illegal tackles in South African youth community rugby.
Brown, J C; Boucher, S J; Lambert, M; Viljoen, W; Readhead, C; Hendricks, S; Kraak, W J
2018-06-01
The tackle event in rugby union ('rugby') contributes to the majority of players' injuries. Referees can reduce this risk by sanctioning dangerous tackles. A study in elite adult rugby suggests that referees only sanction a minority of illegal tackles. The aim of this study was to assess if this finding was similar in youth community rugby. Observational study. Using EncodePro, 99 South African Rugby Union U18 Youth Week tournament matches were coded between 2011 and 2015. All tackles were coded by a researcher and an international referee to ensure that laws were interpreted correctly. The inter- and intra-rater reliabilities were 0.97-1.00. A regression analysis compared the non-sanctioned rates over time. In total, 12 216 tackles were coded, of which less than 1% (n=113) were 'illegal'. The majority of the 113 illegal tackles were front-on (75%), high tackles (72%) and occurred in the 2nd/4th quarters (29% each). Of the illegal tackles, only 59% were sanctioned. The proportions of illegal tackles and sanctioning of these illegal tackles to all tackles improved by 0.2% per year from 2011-2015 (p<0.05). In these youth community rugby players, 59% of illegal tackles were not sanctioned appropriately. This was better than a previous study in elite adult rugby, where only 7% of illegal tackles were penalised. Moreover, the rates of illegal tackles and non-sanctioned illegal tackles both improved over time. However, it is critical that referees consistently enforce all laws to enhance injury prevention efforts. Further studies should investigate the reasons for non-sanctioning. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Tackle mechanisms and match characteristics in women's elite football tournaments.
Tscholl, P; O'Riordan, D; Fuller, C W; Dvorak, J; Junge, A
2007-08-01
Several tools have been used for assessing risk situations and for gathering tackle information from international football matches for men, but not for women. To analyse activities in women's football and to identify the characteristics and risk potentials of tackles. Retrospective video analysis. Video recordings of 24 representative matches from six women's top-level tournaments were analysed for tackle parameters and their risk potential. 3531 tackles were recorded. Tackles in which the tackling player came from the side and stayed on her feet accounted for nearly half of all challenges for the ball in which body contact occurred. 2.7% of all tackles were classified as risk situations, with sliding-in tackles from behind and from the side having the highest risk potential. Match referees sanctioned sliding-in tackles more often than other tackles (20% v 17%). Tackle parameters did not change over the duration of a match; however, there was an increase in the number of injury risk situations and foul plays towards the end of each half. Match properties provide valuable information for a better understanding of injury situations in football. Tackle actions in which the player stayed on her feet or jumped vertically were sanctioned significantly more often by the referee when they led to injury than when they did not (p<0.001), but no such difference was seen for sliding-in tackles (previously reported to have the highest injury potential in women's football). Therefore, either the laws of the game are not adequate or match referees in women's football are not able to distinguish between sliding-in tackles leading to injury and those not leading to injury.
More than a meal: integrating non-feeding interactions into food webs
Kéfi, Sonia; Berlow, Eric L.; Wieters, Evie A.; Navarrete, Sergio A.; Petchey, Owen L.; Wood, Spencer A.; Boit, Alice; Joppa, Lucas N.; Lafferty, Kevin D.; Williams, Richard J.; Martinez, Neo D.; Menge, Bruce A.; Blanchette, Carol A.; Iles, Alison C.; Brose, Ulrich
2012-01-01
Organisms eating each other are only one of many types of well documented and important interactions among species. Other such types include habitat modification, predator interference and facilitation. However, ecological network research has been typically limited to either pure food webs or to networks of only a few (<3) interaction types. The great diversity of non-trophic interactions observed in nature has been poorly addressed by ecologists and largely excluded from network theory. Herein, we propose a conceptual framework that organises this diversity into three main functional classes defined by how they modify specific parameters in a dynamic food web model. This approach provides a path forward for incorporating non-trophic interactions in traditional food web models and offers a new perspective on tackling ecological complexity that should stimulate both theoretical and empirical approaches to understanding the patterns and dynamics of diverse species interactions in nature.
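The framework above classifies non-trophic interactions by which parameter of a dynamic food web model they modify. A toy illustration of that idea (species, parameter values, and the saturating functional form are all invented for this sketch): a consumer-resource pair in which a third species' density modifies the consumer's attack rate, here as interference that weakens the trophic link.

```python
import numpy as np

def simulate(engineer_density, steps=20_000, dt=0.001):
    """Euler integration of a logistic resource R and linear consumer C.

    The non-trophic interaction enters by modifying a food-web parameter:
    the engineer's density scales the attack rate a downward (interference).
    """
    r, a0, e, m = 1.0, 1.5, 0.5, 0.4        # growth, attack, efficiency, mortality
    a = a0 / (1.0 + engineer_density)        # modified trophic parameter
    R, C = 1.0, 0.5
    for _ in range(steps):
        dR = r * R * (1 - R) - a * R * C
        dC = e * a * R * C - m * C
        R += dt * dR
        C += dt * dC
    return R, C

R_no, C_no = simulate(engineer_density=0.0)   # pure food web
R_eng, C_eng = simulate(engineer_density=2.0) # with non-trophic interference
```

With interference the resource is partially released from predation (higher R, lower C), showing how a single non-trophic modifier changes food-web dynamics without adding a new feeding link.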
How linear response shaped models of neural circuits and the quest for alternatives.
Herfurth, Tim; Tchumatchenko, Tatjana
2017-10-01
In the past decades, many mathematical approaches to solve complex nonlinear systems in physics have been successfully applied to neuroscience. One of these tools is the concept of linear response functions. However, phenomena observed in the brain emerge from fundamentally nonlinear interactions and feedback loops rather than from a composition of linear filters. Here, we review the successes achieved by applying the linear response formalism to topics, such as rhythm generation and synchrony and by incorporating it into models that combine linear and nonlinear transformations. We also discuss the challenges encountered in the linear response applications and argue that new theoretical concepts are needed to tackle feedback loops and non-equilibrium dynamics which are experimentally observed in neural networks but are outside of the validity regime of the linear response formalism. Copyright © 2017 Elsevier Ltd. All rights reserved.
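The core point, that linear response is accurate only for weak perturbations of a nonlinear system, can be made concrete with a minimal sketch (the kernel, stimulus, and saturating nonlinearity below are generic illustrations, not a model from the review):

```python
import numpy as np

dt = 0.001
t = np.arange(0, 1.0, dt)
stim = np.sin(2 * np.pi * 5 * t)          # periodic drive
kernel = np.exp(-t / 0.02) * dt           # exponential linear-response kernel

def filtered(amplitude):
    """Linear-response prediction: convolution of input with the kernel."""
    return np.convolve(amplitude * stim, kernel)[: t.size]

def output(amplitude):
    """The 'true' system: the same filter followed by a saturating nonlinearity."""
    return np.tanh(filtered(amplitude))

# For weak drive, tanh(x) ~ x and linear response is nearly exact;
# for strong drive, saturation makes the linear prediction fail.
weak_err = np.max(np.abs(output(0.1) - filtered(0.1)))
strong_err = np.max(np.abs(output(100.0) - filtered(100.0)))
```

The weak-drive error is negligible while the strong-drive error is order one, which is precisely the regime where the review argues new theoretical tools beyond linear response are needed.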
Simulation Models of Obesity: A Review of the Literature and Implications for Research and Policy
Levy, David T.; Mabry, Patricia L.; Wang, Y. Claire; Gortmaker, Steve; Huang, Terry T-K; Marsh, Tim; Moodie, Marj; Swinburn, Boyd
2015-01-01
Simulation models (SMs) combine information from a variety of sources to provide a useful tool for examining how the effects of obesity unfold over time and impact population health. SMs can aid in the understanding of the complex interaction of the drivers of diet and activity and their relation to health outcomes. As emphasized in a recently released report of the Institute of Medicine, SMs can be especially useful for considering the potential impact of an array of policies that will be required to tackle the obesity problem. The purpose of this paper is to present an overview of existing SMs for obesity. First, a background section introduces the different types of models, explains how models are constructed, shows the utility of SMs, and discusses their strengths and weaknesses. Using these typologies, we then briefly review extant obesity SMs. We categorize these models according to their focus: health and economic outcomes, trends in obesity as a function of past trends, physiologically-based behavioral models, environmental contributors to obesity, and policy interventions. Finally, we suggest directions for future research. PMID:20973910
Burger, Nicholas; Lambert, Mike I; Viljoen, Wayne; Brown, James C; Readhead, Clint; Hendricks, Sharief
2014-08-12
The tackle situation is most often associated with the high injury rates in rugby union. Tackle injury epidemiology in rugby union has previously been focused on senior cohorts but less is known about younger cohorts. The aim of this study was to report on the nature and rates of tackle-related injuries in South African youth rugby union players representing their provinces at national tournaments. Observational cohort study. Four South African Youth Week tournaments (under-13 Craven Week, under-16 Grant Khomo Week, under-18 Academy Week, under-18 Craven Week). Injury data were collected from 3652 youth rugby union players (population at risk) in 2011 and 2012. Tackle-related injury severity ('time-loss' and 'medical attention'), type and location, injury rate per 1000 h (including 95% CIs). Injury rate ratios (IRR) were calculated and modelled using a Poisson regression. A χ2 analysis was used to detect linear trends between injuries and increasing match quarters. The 2012 under-13 Craven Week had a significantly greater 'time-loss' injury rate when compared with the 2012 under-18 Academy Week (IRR=4.43; 95% CI 2.13 to 9.21, p<0.05) and under-18 Craven Week (IRR=3.52; 95% CI 1.54 to 8.00, p<0.05). The Poisson regression also revealed a higher probability of 'overall' ('time-loss' and 'medical attention' combined) and 'time-loss' tackle-related injuries occurring at the under-13 Craven Week. The proportion of 'overall' and 'time-loss' injuries increased significantly with each quarter of the match when all four tournaments were combined (p<0.05). There was a difference in the tackle-related injury rate between the under-13 tournament and the two under-18 tournaments, and the tackle-related injury rate was higher in the final quarter of matches. Ongoing injury surveillance is required to better interpret these findings. Injury prevention strategies targeting the tackle may only be effective once the rate and nature of injuries have been accurately determined.
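The abstract above reports injury rates per 1000 player-hours and Poisson injury rate ratios (IRRs) with 95% CIs. The standard calculation behind those quantities can be sketched as follows; the injury counts and exposure hours below are hypothetical, not the study's data:

```python
import math

def injury_rate(injuries, exposure_hours):
    """Injuries per 1000 player-hours of exposure."""
    return 1000.0 * injuries / exposure_hours

def rate_ratio_ci(inj_a, hours_a, inj_b, hours_b, z=1.96):
    """Poisson rate ratio of group A vs group B with a 95% CI.

    Uses the usual log-scale standard error sqrt(1/n_a + 1/n_b).
    """
    irr = (inj_a / hours_a) / (inj_b / hours_b)
    se_log = math.sqrt(1.0 / inj_a + 1.0 / inj_b)
    lo = irr * math.exp(-z * se_log)
    hi = irr * math.exp(z * se_log)
    return irr, lo, hi

# Hypothetical: 30 injuries in 2000 h (group A) vs 12 injuries in 3000 h (group B)
irr, lo, hi = rate_ratio_ci(30, 2000, 12, 3000)
```

A CI that excludes 1 (as here, since the lower bound is above 1) is what underlies the "significantly greater injury rate" statements in the abstract.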
Doing Literary Criticism: Helping Students Engage with Challenging Texts
ERIC Educational Resources Information Center
Gillespie, Tim
2010-01-01
One of the greatest challenges for English language arts teachers today is the call to engage students in more complex texts. Tim Gillespie, who has taught in public schools for almost four decades, has found the lenses of literary criticism a powerful tool for helping students tackle challenging literary texts. Tim breaks down the dense language…
Tackling the Law and Raising the Issues: Summer Program Prepares Students.
ERIC Educational Resources Information Center
Bowannie, Mary
2003-01-01
An intensive 8-week summer program in New Mexico prepares American Indian and Alaska Native students to succeed in law school, focusing on law research, analysis, and writing. Two program graduates who went on to complete law school discuss the complexities of federal Indian law and the Native lawyers' responsibility to their communities--an…
ERIC Educational Resources Information Center
Kodama, Corinne M.; Dugan, John P.
2013-01-01
Cultivating leaders who are prepared to tackle complex social issues is positioned as a critical outcome of higher education and a tool for diversification of the workforce. Both leadership studies literature and leadership development practice, however, are negligent in the attention directed at understanding the role of social identity in…
Sola, Christophe
2015-06-01
The natural history of tuberculosis may be tackled by various means, among which is the record of molecular scars registered in the Mycobacterium tuberculosis complex (MTBC) genomes transmitted from patient to patient for tens of thousands of years and possibly more. Recently discovered polymorphic loci, the CRISPR sequences, are indirect witnesses of the historical phage-bacteria struggle, and may be related to the time when the ancestors of today's tubercle bacilli were environmental bacteria, i.e. before they became intracellular parasites. In this article, we present what CRISPRs are and summarize almost 20 years of research results obtained using the genetic diversity of the CRISPR loci in MTBC as a perspective for studying new models. We show that the study of the diversity of CRISPR sequences, thanks to «spoligotyping», has played a great role in our global understanding of the population structure of the MTBC. Copyright © 2015 Elsevier Ltd. All rights reserved.
Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
2016-11-01
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
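The Morris screening step described above can be illustrated with a simplified one-at-a-time elementary-effects sketch (the study used the full trajectory-based Morris method; `model` here is a hypothetical stand-in for the System Dynamics stroke model, with parameters scaled to [0, 1]):

```python
import random

def morris_mu_star(model, n_params, n_traj=20, delta=0.1, seed=0):
    """Screen parameters by mean absolute elementary effect (mu*).

    For each of n_traj random base points, perturb each parameter by
    delta and record |(f(x + delta*e_i) - f(x)) / delta|. Parameters
    with small mu* barely influence the output and can be fixed.
    """
    rng = random.Random(seed)
    sums = [0.0] * n_params
    for _ in range(n_traj):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(n_params)]
        base = model(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] += delta
            sums[i] += abs((model(xp) - base) / delta)
    return [s / n_traj for s in sums]
```

On a toy linear model `f(p) = 10*p[0] + 0.1*p[1]`, mu* recovers the coefficients, correctly flagging the first parameter as influential and the second as a candidate for fixing at its best-guess value.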
Learning in the model space for cognitive fault diagnosis.
Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin
2014-01-01
The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
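The learning-in-the-model-space idea can be sketched with ordinary least-squares autoregressive (AR) fits as the window-level models and Euclidean distance between coefficient vectors as a simple stand-in for the paper's model distance (the framework itself uses richer reservoir models and a one-class learner; everything below is an illustrative simplification):

```python
import numpy as np

def fit_ar(window, order=2):
    """Least-squares AR(order) coefficients for one signal segment."""
    X = np.column_stack([window[i:len(window) - order + i] for i in range(order)])
    y = window[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def model_space(signal, win=50, step=25, order=2):
    """Slide a window over the signal; map each segment to its fitted model."""
    return np.array([fit_ar(signal[s:s + win], order)
                     for s in range(0, len(signal) - win + 1, step)])

def fault_scores(models, healthy_models):
    """Distance of each model to its nearest healthy model (one-class view):
    large scores indicate segments whose dynamics deviate from the healthy set."""
    d = np.linalg.norm(models[:, None, :] - healthy_models[None, :, :], axis=2)
    return d.min(axis=1)
```

Two signals with identical amplitude but different dynamics (e.g. sinusoids of different frequency) are indistinguishable by simple signal statistics, yet map to clearly separated points in this model space.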
Agent-based modelling in synthetic biology.
Gorochowski, Thomas E
2016-11-30
Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).
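The quorum-sensing example lends itself to a minimal, well-mixed agent-based sketch (all parameter values and names below are illustrative assumptions, not from any particular tool):

```python
class Cell:
    """Minimal agent: switches its reporter on above a signal threshold."""
    def __init__(self):
        self.on = False

def step(cells, signal, production=1.0, decay=0.2, threshold=40.0):
    """One step: every cell secretes signal, the pool decays, cells sense it."""
    signal = signal * (1.0 - decay) + production * len(cells)
    for c in cells:
        c.on = signal >= threshold
    return signal

def simulate(n_cells, n_steps=100):
    """Run a colony of n_cells agents; return the agents and final signal."""
    cells = [Cell() for _ in range(n_cells)]
    signal = 0.0
    for _ in range(n_steps):
        signal = step(cells, signal)
    return cells, signal
```

The collective behaviour emerges from the individual rule: the steady-state signal is `production * n / decay`, so with these parameters colonies well above ~8 cells switch on while small colonies stay off, even though no single cell encodes a "colony size" anywhere.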
Sabne, Amit J.; Sakdhnagool, Putt; Lee, Seyong; ...
2015-07-13
Accelerator-based heterogeneous computing is gaining momentum in the high-performance computing arena. However, the increased complexity of heterogeneous architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle this problem. Although the abstraction provided by OpenACC offers productivity, it raises questions concerning both functional and performance portability. In this article, the authors propose HeteroIR, a high-level, architecture-independent intermediate representation, to map high-level programming models, such as OpenACC, to heterogeneous architectures. They present a compiler approach that translates OpenACC programs into HeteroIR and accelerator kernels to obtain OpenACC functional portability. They then evaluate the performance portability obtained by OpenACC with their approach on 12 OpenACC programs on Nvidia CUDA, AMD GCN, and Intel Xeon Phi architectures. They study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.
Bumblebees minimize control challenges by combining active and passive modes in unsteady winds
NASA Astrophysics Data System (ADS)
Ravi, Sridhar; Kolomenskiy, Dmitry; Engels, Thomas; Schneider, Kai; Wang, Chun; Sesterhenn, Jörn; Liu, Hao
2016-10-01
The natural wind environment that volant insects encounter is unsteady and highly complex, posing significant flight-control and stability challenges. It is critical to understand the strategies insects employ to safely navigate in natural environments. We combined experiments on free flying bumblebees with high-fidelity numerical simulations and lower-order modeling to identify the mechanics that mediate insect flight in unsteady winds. We trained bumblebees to fly upwind towards an artificial flower in a wind tunnel under steady wind and in a von Kármán street formed in the wake of a cylinder. Analysis revealed that at lower frequencies in both steady and unsteady winds the bees mediated lateral movement with body roll - typical casting motion. Numerical simulations of a bumblebee in similar conditions permitted the separation of the passive and active components of the flight trajectories. Consequently, we derived simple mathematical models that describe these two motion components. Comparison between the free-flying live and modeled bees revealed a novel mechanism that enables bees to passively ride out high-frequency perturbations while performing active maneuvers at lower frequencies. The capacity of maintaining stability by combining passive and active modes at different timescales provides a viable means for animals and machines to tackle the challenges posed by complex airflows.
Jenkins, Daniel P; Salmon, Paul M; Stanton, Neville A; Walker, Guy H; Rafferty, Laura
2011-02-01
Understanding why an individual acted in a certain way is of fundamental importance to the human factors community, especially when the choice of action results in an undesirable outcome. This challenge is typically tackled by applying retrospective interview techniques to generate models of what happened, recording deviations from a 'correct procedure'. While such approaches may have great utility in tightly constrained procedural environments, they are less applicable in complex sociotechnical systems that require individuals to modify procedures in real time to respond to a changing environment. For complex sociotechnical systems, a formative approach is required that maps the information available to the individual and considers its impact on performance and action. A context-specific, activity-independent, constraint-based model forms the basis of this approach. To illustrate, an example of the Stockwell shooting is used, where an innocent man, mistaken for a suicide bomber, was shot dead. Transferable findings are then presented. STATEMENT OF RELEVANCE: This paper presents a new approach that can be applied proactively to consider how sociotechnical system design, and the information available to an individual, can affect their performance. The approach is proposed to be complementary to the existing tools in the mental models phase of the cognitive work analysis framework.
The Earthquake‐Source Inversion Validation (SIV) Project
Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf
2016-01-01
Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.
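The quantitative waveform comparison underlying such rankings can be sketched generically as a normalized misfit plus a zero-lag correlation (a simple stand-in; the SIV project's actual statistical tools are richer, and the function names below are assumptions):

```python
import numpy as np

def waveform_misfit(obs, syn):
    """Normalized L2 misfit and zero-lag correlation of two waveforms."""
    obs = np.asarray(obs, dtype=float)
    syn = np.asarray(syn, dtype=float)
    l2 = np.linalg.norm(obs - syn) / np.linalg.norm(obs)
    cc = float(np.dot(obs, syn) /
               (np.linalg.norm(obs) * np.linalg.norm(syn)))
    return l2, cc

def rank_solutions(observed, candidates):
    """Rank candidate source models' synthetic waveforms by misfit, best first.

    `candidates` maps a model name to its synthetic waveform."""
    scored = [(name, waveform_misfit(observed, syn)[0])
              for name, syn in candidates.items()]
    return sorted(scored, key=lambda pair: pair[1])
```

A benchmark then reduces to scoring each team's synthetics against the reference data and comparing the resulting orderings.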
NASA Astrophysics Data System (ADS)
Camporese, M.; Bertoldi, G.; Bortoli, E.; Wohlfahrt, G.
2017-12-01
Integrated hydrologic surface-subsurface models (IHSSMs) are increasingly used as prediction tools to solve simultaneously states and fluxes in and between multiple terrestrial compartments (e.g., snow cover, surface water, groundwater), in an attempt to tackle environmental problems in a holistic approach. Two such models, CATHY and GEOtop, are used in this study to investigate their capabilities to reproduce hydrological processes in alpine grasslands. The two models differ significantly in the complexity of the representation of the surface energy balance and the solution of Richards equation for water flow in the variably saturated subsurface. The main goal of this research is to show how these differences in process representation can lead to different predictions of hydrologic states and fluxes, in the simulation of an experimental site located in the Venosta Valley (South Tyrol, Italy). Here, a large set of relevant hydrological data (e.g., evapotranspiration, soil moisture) has been collected, with ground and remote sensing observations. The area of interest is part of a Long-Term Ecological Research (LTER) site, a steep, heterogeneous mountain slope, where the predominant land use types are meadow, pasture, and forest. The comparison between data and model predictions, as well as between simulations with the two IHSSMs, contributes to advance our understanding of the tradeoffs between different complexities in models' process representation, model accuracy, and the ability to explain observed hydrological dynamics in alpine environments.
Versatile clinical information system design for emergency departments.
Amouh, Teh; Gemo, Monica; Macq, Benoît; Vanderdonckt, Jean; El Gariani, Abdul Wahed; Reynaert, Marc S; Stamatakis, Lambert; Thys, Frédéric
2005-06-01
Compared to other hospital units, the emergency department presents some distinguishing characteristics of its own. Emergency health-care delivery is a collaborative process involving the contribution of several individuals who accomplish their tasks while working autonomously under pressure and sometimes with limited resources. Effective computerization of the emergency department information system presents a real challenge due to the complexity of the scenario. Current computerized support suffers from several problems, including inadequate data models, clumsy user interfaces, and poor integration with other clinical information systems. To tackle such complexity, we propose an approach combining three points of view, namely the transactions (in and out of the department), the (mono and multi) user interfaces and data management. Unlike current systems, we pay particular attention to the user-friendliness and versatility of our system. This means that intuitive user interfaces have been conceived and specific software modeling methodologies have been applied to provide our system with the flexibility and adaptability necessary for the individual and group coordinated tasks. Our approach has been implemented by prototyping a web-based, multiplatform, multiuser, and versatile clinical information system built upon multitier software architecture, using the Java programming language.
Hasegawa, Yoshinori; Shiota, Yuki; Ota, Chihiro; Yoneda, Takeshi; Tahara, Shigeyuki; Maki, Nobukazu; Matsuura, Takahiro; Sekiguchi, Masahiro; Itoigawa, Yoshiaki; Tateishi, Tomohiko; Kaneko, Kazuo
2018-01-01
Objectives To characterise the tackler’s head position during one-on-one tackling in rugby and to determine the incidence of head, neck and shoulder injuries through analysis of game videos, injury records and a questionnaire completed by the tacklers themselves. Methods We randomly selected 28 game videos featuring two university teams in competitions held in 2015 and 2016. Tackles were categorised according to tackler’s head position. The ‘pre-contact phase’ was defined; its duration and the number of steps taken by the ball carrier prior to a tackle were evaluated. Results In total, 3970 tackles, including 317 (8.0%) with the tackler’s head incorrectly positioned (ie, in front of the ball carrier) were examined. Thirty-two head, neck or shoulder injuries occurred for an injury incidence of 0.8% (32/3970). The incidence of injury in tackles with incorrect head positioning was 69.4/1000 tackles; the injury incidence with correct head positioning (ie, behind or to one side of the ball carrier) was 2.7/1000 tackles. Concussions, neck injuries, ‘stingers’ and nasal fractures occurred significantly more often during tackles with incorrect head positioning than during tackles with correct head positioning. Significantly fewer steps were taken before tackles with incorrect head positioning that resulted in injury than before tackles that did not result in injury. Conclusion Tackling with incorrect head position relative to the ball carrier resulted in a significantly higher incidence of concussions, neck injuries, stingers and nasal fractures than tackling with correct head position. Tackles with shorter duration and distance before contact resulted in more injuries. PMID:29162618
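The reported incidences can be reproduced directly from the abstract's counts; note that the per-group injury split below (22 and 10) is back-calculated from the published rates, not stated explicitly in the abstract:

```python
def incidence_per_1000(injuries, tackles):
    """Injuries per 1000 tackle events."""
    return 1000.0 * injuries / tackles

total_tackles = 3970
incorrect_head = 317                    # head in front of the ball carrier
correct_head = total_tackles - incorrect_head

# Split of the 32 injuries consistent with the reported rates (back-calculated)
injuries_incorrect = 22
injuries_correct = 10

rate_incorrect = incidence_per_1000(injuries_incorrect, incorrect_head)  # ~69.4
rate_correct = incidence_per_1000(injuries_correct, correct_head)        # ~2.7
```

The roughly 25-fold rate difference between the two head positions is what drives the paper's conclusion, even though incorrect positioning occurred in only 8% of tackles.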
Tackle characteristics and injury in a cross section of rugby union football.
McIntosh, Andrew S; Savage, Trevor N; McCrory, Paul; Fréchède, Bertrand O; Wolfe, Rory
2010-05-01
The tackle is the game event in rugby union most associated with injury. This study's main aims were to measure tackle characteristics from video using a qualitative protocol, to assess whether the characteristics differed by level of play, and to measure the associations between tackle characteristics and injury. A cohort study was undertaken. The cohort comprised male rugby players in the following levels: younger than 15 yr, 18 yr, and 20 yr, grade, and elite (Super 12 and Wallabies). All tackle events and technique characteristics were coded in 77 game halves using a standardized qualitative protocol. Game injuries and missed-game injuries were identified and correlated with tackle events. A total of 6618 tackle events, including 81 resulting in a game injury, were observed and coded in the 77 game halves fully analyzed (145 tackle events per hour). An increase in the proportion of active shoulder tackles was observed from younger than 15 yr (13%) to elite (31%). Younger players engaged in more passive tackles and tended to stay on their feet more than experienced players. Younger than 15 yr rugby players had a significantly lower risk of tackle game injury compared with elite players. No specific tackle technique was observed to be associated with a significantly increased risk of game injury. There was a greater risk of game injury associated with two or more tacklers involved in the tackle event, and the greatest risk was associated with simultaneous contact by tacklers, after adjusting for level of play. Tackle characteristics differed between levels of play. The number of tacklers and the sequence of tackler contact with the ball carrier require consideration from an injury prevention perspective.
Understanding DNA under oxidative stress and sensitization: the role of molecular modeling
Dumont, Elise; Monari, Antonio
2015-01-01
DNA is constantly exposed to damaging threats coming from oxidative stress, i.e., from the presence of free radicals and reactive oxygen species. Sensitization from exogenous and endogenous compounds that strongly enhance the frequency of light-induced lesions also plays an important role. The experimental determination of DNA lesions, though a difficult subject, is now well established and allows even extremely rare DNA lesions to be elucidated. In parallel, molecular modeling has become fundamental to clearly understand the fine mechanisms related to DNA defect induction. Indeed, it offers unprecedented access to atomistic or even electronic resolution. Ab initio molecular dynamics may also describe the time-evolution of the molecular system and its reactivity. Yet the modeling of DNA (photo-)reactions necessitates elaborate multi-scale methodologies, because damage-induction reactivity takes place in a complex environment. The double-stranded DNA environment is first characterized by a very high flexibility, but also a strongly inhomogeneous electrostatic embedding. Additionally, one aims at capturing more subtle effects, such as the sequence selectivity, which is of critical importance for DNA damage. The structure and dynamics of the DNA/sensitizer complexes, as well as the photo-induced electron- and energy-transfer phenomena taking place upon sensitization, should be carefully modeled. Finally, the factors inducing different repair ratios for different lesions should also be rationalized. In this review we will critically analyze the different computational strategies used to model DNA lesions. A clear picture of the complex interplay between reactivity and structural factors will be sketched. The use of proper multi-scale modeling leads to the in-depth comprehension of DNA lesion mechanisms and also to the rational design of new chemo-therapeutic agents. PMID:26236706
Big questions, big science: meeting the challenges of global ecology
David Schimel; Michael Keller
2015-01-01
Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer...
K-8 Science Education: Elements That Matter. A Report from the 2007 North Carolina Science Summit
ERIC Educational Resources Information Center
James B. Hunt Jr. Institute for Educational Leadership and Policy, 2007
2007-01-01
Today's world is challenged by complex issues, and citizens need ever-increasing scientific literacy to understand the impact of issues such as global warming, alternative energy, and genetic engineering on their daily lives and to make informed decisions. Students need to be prepared to tackle issues such as these by being exposed to learning…
ERIC Educational Resources Information Center
Shuldburg, Sara; Carroll, Jennifer
2017-01-01
An advanced undergraduate experiment involving the synthesis and characterization of a series of six unique cinnamamides is described. This experiment allows for a progressive mastery of skills students need to tackle more complex NMR structure elucidation problems. Characterization of the products involves IR spectroscopy, GCMS, and proton,…
ERIC Educational Resources Information Center
Hayton, Annette Ruth; Haste, Polly; Jones, Alison
2015-01-01
Students studying art at university in the United Kingdom tend to be female, from higher social classes and from majority ethnic groups. This paper considers some of the complex and deeply-rooted social and economic factors that militate against wider participation in the arts and describes how we started to tackle under-representation at…
Perez-Temprano, Monica Helvia; Martínez de Salinas, Sara; Mudarra, Angel Luis; Benet-Buchholz, Jordi; Parella, Teodor; Maseras, Feliu
2018-05-23
This work describes the employment of discrete "AgCF₃" complexes, including unique (Cat)[Ag(CF₃)₂] salts, as efficient transmetalating agents to Pd(II) in order to tackle some of the usually overshadowed limitations related to this step within the trifluoromethylation area. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Warwick, Ian; Aggleton, Peter
2014-01-01
In countries such as the UK, schools have a responsibility to prevent all forms of bullying, including those related to sexual orientation. However, relatively little is known about how schools go about this work successfully. This study aimed to identify how three secondary schools in south London, England, were addressing homophobia. Three…
"A sharp bend in the road, showing how the horses ...
"A sharp bend in the road, showing how the horses are hitched in 'blocking.' The remainder of the team has been hitched to the block and tackle." San Joaquin Light and Power Magazine, Vol. I, No. 12, December 1913, p. 553 - Tule River Hydroelectric Complex, CA Highway 190 at North Fork of Middle Fork of Tule River, Springville, Tulare County, CA
Language Guardian BBC? Investigating the BBC's Language Advice in Its 2003 "News Styleguide"
ERIC Educational Resources Information Center
Ebner, Carmen
2016-01-01
In this paper, the BBC's stance on English language use is investigated by analysing its language guidelines provided in the 2003 BBC "News Styleguide." Before the analysis is tackled, a brief discussion of the use of language and style guides in the media is given to illustrate its complexities and effects on news providers. In order to…
USDA-ARS?s Scientific Manuscript database
The rates of foodborne disease caused by gastrointestinal pathogens continue to be a concern in both the developed and developing worlds. The growing world population, the increasing complexity of agri-food networks and the wide range of foods now associated with STEC are potential drivers for incre...
ERIC Educational Resources Information Center
Windisch, Hendrickje Catriona
2016-01-01
Low basic skills levels of adults are a complex policy problem which has neither straightforward causes nor solutions, and successful interventions are still relatively rare. Tackling serious literacy and numeracy weaknesses among adults is challenging, partly because the task itself is difficult, and partly because even if accomplished…
46 CFR 184.300 - Ground tackle and mooring lines.
Code of Federal Regulations, 2010 CFR
2010-10-01
A vessel must be fitted with ground tackle and mooring lines necessary for the vessel to be safely anchored or moored. The ground tackle and mooring lines provided must be...
46 CFR 184.300 - Ground tackle and mooring lines.
Code of Federal Regulations, 2011 CFR
2011-10-01
A vessel must be fitted with ground tackle and mooring lines necessary for the vessel to be safely anchored or moored. The ground tackle and mooring lines provided must be...
The structural bioinformatics library: modeling in biomolecular science and beyond.
Cazals, Frédéric; Dreyfus, Tom
2017-04-01
Software in structural bioinformatics has mainly been application driven. To favor practitioners seeking off-the-shelf applications, but also developers seeking advanced building blocks to develop novel applications, we undertook the design of the Structural Bioinformatics Library (SBL, http://sbl.inria.fr), a generic C++/Python cross-platform software library targeting complex problems in structural bioinformatics. Its tenet is based on a modular design offering a rich and versatile framework allowing the development of novel applications requiring well specified complex operations, without compromising robustness and performances. The SBL involves four software components (1-4 thereafter). For end-users, the SBL provides ready to use, state-of-the-art (1) applications to handle molecular models defined by unions of balls, to deal with molecular flexibility, and to model macro-molecular assemblies. These applications can also be combined to tackle integrated analysis problems. For developers, the SBL provides a broad C++ toolbox with modular design, involving core (2) algorithms, (3) biophysical models and (4) modules, the latter being especially suited to develop novel applications. The SBL comes with a thorough documentation consisting of user and reference manuals, and a bugzilla platform to handle community feedback. The SBL is available from http://sbl.inria.fr. Contact: Frederic.Cazals@inria.fr. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Declarative Programming with Temporal Constraints, in the Language CG.
Negreanu, Lorina
2015-01-01
Specifying and interpreting temporal constraints are key elements of knowledge representation and reasoning, with applications in temporal databases, agent programming, and ambient intelligence. We present and formally characterize the language CG, which tackles this issue. In CG, users are able to develop time-dependent programs, in a flexible and straightforward manner. Such programs can, in turn, be coupled with evolving environments, thus empowering users to control the environment's evolution. CG relies on a structure for storing temporal information, together with a dedicated query mechanism. Hence, we explore the computational complexity of our query satisfaction problem. We discuss previous implementation attempts of CG and introduce a novel prototype which relies on logic programming. Finally, we address the issue of consistency and correctness of CG program execution, using the Event-B modeling approach.
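CG's own syntax is not reproduced in the abstract, but the core idea of storing and querying time-dependent facts can be sketched generically (the interval representation and operator names below are illustrative assumptions, not CG constructs):

```python
from dataclasses import dataclass

@dataclass
class Fact:
    """A proposition holding over the half-open time interval [start, end)."""
    name: str
    start: float
    end: float

def holds(facts, name, t):
    """Point query: does `name` hold at time t?"""
    return any(f.name == name and f.start <= t < f.end for f in facts)

def always(facts, name, start, end, step=1.0):
    """Sampled 'always' operator: `name` holds at every step in [start, end)."""
    t = start
    while t < end:
        if not holds(facts, name, t):
            return False
        t += step
    return True
```

Even in this toy form, answering `always` requires checking the fact store at many time points, which hints at why the computational complexity of query satisfaction is a central question for such a language.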
[The challenge of clinical complexity in the 21st century: Could frailty indexes be the answer?]
Amblàs-Novellas, Jordi; Espaulella-Panicot, Joan; Inzitari, Marco; Rexach, Lourdes; Fontecha, Benito; Romero-Ortuno, Roman
The number of older people with complex clinical conditions and complex care needs continues to increase in the population, presenting many challenges to healthcare professionals and healthcare systems. In the face of these challenges, practical and feasible approaches are required. The frailty paradigm may be an excellent opportunity to review and establish some of the principles of Comprehensive Geriatric Assessment in specialties outside Geriatric Medicine. The assessment of frailty using Frailty Indexes provides an aid to the 'situational diagnosis' of complex clinical situations, and may help in tackling uncertainty in a person-centred approach. Copyright © 2016 SEGG. Published by Elsevier España, S.L.U. All rights reserved.
Tian, Xiaolin; Zhu, Mingwei; Li, Long; Wu, Chunlai
2013-01-01
Genetic screens conducted using Drosophila melanogaster (fruit fly) have made numerous milestone discoveries in advancing the biological sciences. However, the use of biochemical screens aimed at extending the knowledge gained from genetic analysis was explored only recently. Here we describe a method to purify the protein complex that associates with any protein of interest from adult fly heads. This method takes advantage of the Drosophila GAL4/UAS system to express a bait protein fused with a Tandem Affinity Purification (TAP) tag in fly neurons in vivo, and then implements two rounds of purification using a TAP procedure similar to the one originally established in yeast to purify the interacting protein complex. At the end of this procedure, a mixture of multiple protein complexes is obtained whose molecular identities can be determined by mass spectrometry. Validation of the candidate proteins will benefit from the resource and ease of performing loss-of-function studies in flies. Similar approaches can be applied to other fly tissues. We believe that the combination of genetic manipulations and this proteomic approach in the fly model system holds tremendous potential for tackling fundamental problems in the field of neurobiology and beyond. PMID:24335807
Pinfold, Vanessa; Byrne, Peter; Toulmin, Hilary
2005-06-01
Stigma and discrimination experienced by people with mental health problems have been identified as major obstacles to treatment and recovery. Less is known about how to effectively tackle stigma and discrimination, although there are numerous international, national and local programmes attempting to improve public mental health literacy and evidence-based anti-discrimination practice. To explore mental health service users' views on how campaigns to address stigma and discrimination should prioritise their actions. Qualitative study using focus group discussions, involving 33 persons aged between 25 and 75. A triad of diminished credibility, disempowerment with particular reference to communication problems, and avoidance by their social network defined experiences of stigma. Reactions to stigma can be placed in four categories: avoid stigma, resign yourself to it, challenge it, or distance yourself from others with a mental health problem. A range of solutions was discussed, with most favouring changes within the health services that are currently supporting them over traditional educational programmes with the public. For mental health service users, stigma must be tackled on many different levels, reflecting the varied and complex impact that negative social reactions have on an individual's life. When asked to prioritise one area, most service users in our sample highlighted reforms within the health service for tackling stigma and discrimination.
Geomatic methods at the service of water resources modelling
NASA Astrophysics Data System (ADS)
Molina, José-Luis; Rodríguez-Gonzálvez, Pablo; Molina, Mª Carmen; González-Aguilera, Diego; Espejo, Fernando
2014-02-01
Acquisition, management and/or use of spatial information is crucial for the quality of water resources studies. In this sense, several geomatic methods arise at the service of water modelling, aiming at the generation of cartographic products, especially 3D models and orthophotos. They may also perform as tools for problem solving and decision making. However, choosing the right geomatic method is still a challenge in this field, mostly due to the complexity of the different applications and variables involved in water resources management. This study aims to provide a guide to best practice in this context through a deep review of geomatic methods and an assessment of their suitability for the following study types: Surface Hydrology, Groundwater Hydrology, Hydraulics, Agronomy, Morphodynamics and Geotechnical Processes. This assessment is driven by several decision variables grouped in two categories, classified by their nature as geometric or radiometric. As a result, the reader is provided with the best choice(s) of method, depending on the type of water resources modelling study at hand.
Shen, Chung-Wei; Chen, Yi-Hau
2015-10-01
Missing observations and covariate measurement error commonly arise in longitudinal data. However, existing methods for model selection in marginal regression analysis of longitudinal data fail to address the potential bias resulting from these issues. To tackle this problem, we propose a new model selection criterion, the Generalized Longitudinal Information Criterion, which is based on an approximately unbiased estimator for the expected quadratic error of a considered marginal model, accounting for both data missingness and covariate measurement error. The simulation results reveal that the proposed method performs quite well in the presence of missing data and covariate measurement error. In contrast, naive procedures that ignore such complexity in the data may perform quite poorly. The proposed method is applied to data from the Taiwan Longitudinal Study on Aging to assess the relationship of depression with health and social status in the elderly, accommodating measurement error in the covariate as well as missing observations. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
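The Generalized Longitudinal Information Criterion itself is tailored to marginal longitudinal models with missingness and measurement error and is not available in standard libraries. The underlying idea, though — ranking candidate mean models by an approximately unbiased estimate of expected quadratic error — can be illustrated in miniature with Mallows' Cp for ordinary linear regression (all data simulated; no missingness or measurement error is modelled here, so this is an analogue of the principle, not the criterion from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)   # true mean model uses x1 only

def cp(X, y, sigma2):
    """Mallows' Cp = RSS/sigma2 - n + 2p: an approximately unbiased
    estimate of (scaled) expected quadratic error; smaller is better."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return rss / sigma2 - len(y) + 2 * X.shape[1]

ones = np.ones(n)
full = np.column_stack([ones, x1, x2])
beta_f, *_ = np.linalg.lstsq(full, y, rcond=None)
sigma2 = np.sum((y - full @ beta_f) ** 2) / (n - full.shape[1])

candidates = {
    "intercept only": np.column_stack([ones]),
    "x1": np.column_stack([ones, x1]),
    "x1 + x2": full,
}
scores = {name: cp(X, y, sigma2) for name, X in candidates.items()}
best = min(scores, key=scores.get)   # the under-fitted model scores worst
```

GLIC plays the same role for marginal longitudinal models, with correction terms for the missingness and measurement-error mechanisms.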
Antigravity ESD - double-balloon-assisted underwater with traction hybrid technique.
Sharma, Sam K; Hiratsuka, Takahiro; Hara, Hisashi; Milsom, Jeffrey W
2018-06-01
Complex colorectal polyps or those positioned in difficult anatomic locations are an endoscopic therapeutic challenge. Underwater endoscopic submucosal dissection (UESD) is a potential technical solution to facilitate efficient polyp removal. In addition, endoscopic tissue retraction has been confined to limited methods of varying efficacy and complexity. The aim of this study was to evaluate the efficiency of a unique UESD technique for removing complex polyps using double-balloon-assisted retraction (R). Using fresh ex-vivo porcine rectum, 4-cm polyps were created using electrosurgery and positioned at "6 o'clock" within an established ESD model. Six resections were performed in each group. Underwater techniques were facilitated using a novel double-balloon platform (Dilumen, Lumendi, Westport, Connecticut, United States). UESD-R had a significantly shorter total procedural time than cap-assisted ESD and UESD alone (24 vs. 58 vs. 56 minutes). UESD-R achieved an average dissection time of 5 minutes, attributable to the retraction provided. There was also a subjective, significant reduction in electrosurgical smoke with the underwater techniques, contributing to improved visualization. Here we report the first ex-vivo experience of a unique double-balloon endoscopic platform optimized for UESD with tissue traction capability. UESD-R removed complex lesions in significantly shorter time than conventional means. The combined benefits of UESD and retraction appeared to be additive when tackling complex polyps and should be studied further.
NASA Astrophysics Data System (ADS)
Imperiale, Alexandre; Chatillon, Sylvain; Darmon, Michel; Leymarie, Nicolas; Demaldent, Edouard
2018-04-01
The high frequency models gathered in the CIVA software allow fast computations and provide satisfactory quantitative predictions in a wide range of situations. However, the domain of validity of these models is limited since they do not accurately predict the ultrasound response in configurations involving subwavelength complex phenomena. In addition, when modelling backwall breaking defects inspection, an important challenge remains to capture the propagation of the creeping waves that are generated at the critical angle. Hybrid models combining numerical and asymptotic methods have already been shown to be an effective strategy to overcome these limitations in 2D [1]. However, 3D simulations remain a crucial issue for industrial applications because of the computational cost of the numerical solver. A dedicated three dimensional high order finite element model combined with a domain decomposition method has been recently proposed to tackle 3D limitations [2]. In this communication, we will focus on the specific case of planar backwall breaking defects, with an adapted coupling strategy in order to efficiently model the propagation of creeping waves. Numerical and experimental validations will be proposed on various configurations.
Wavefield complexity and stealth structures: Resolution constraints by wave physics
NASA Astrophysics Data System (ADS)
Nissen-Meyer, T.; Leng, K.
2017-12-01
Imaging the Earth's interior relies on understanding how waveforms encode information from heterogeneous multi-scale structure. This relation is given by elastodynamics, but forward modeling in the context of tomography primarily serves to deliver synthetic waveforms and gradients for the inversion procedure. While this is entirely appropriate, it leaves untapped a wealth of complementary inference that can be obtained from the complexity of the wavefield. Here, we are concerned with the imprint of realistic multi-scale Earth structure on the wavefield, and the question of the inherent physical resolution limit of structures encoded in seismograms. We identify parameter and scattering regimes where structures remain invisible as a function of seismic wavelength, structural multi-scale geometry, scattering strength, and propagation path. Ultimately, this will aid in interpreting tomographic images by acknowledging the scope of "forgotten" structures, and shall offer guidance for optimising the selection of seismic data for tomography. To do so, we use our novel 3D modeling method AxiSEM3D, which tackles global wave propagation in visco-elastic, anisotropic 3D structures with undulating boundaries at unprecedented resolution and efficiency by exploiting the inherent azimuthal smoothness of wavefields via a coupled Fourier expansion-spectral-element approach. The method links computational cost to wavefield complexity and thereby lends itself well to exploring the relation between waveforms and structures. We will show various examples of multi-scale heterogeneities which appear or disappear in the waveform, and argue that the nature of the structural power spectrum plays a central role in this. We introduce the concept of wavefield learning to examine the true wavefield complexity for a complexity-dependent modeling framework and to discriminate which scattering structures can be retrieved by surface measurements.
This leads to the question of physical invisibility and the tomographic resolution limit, and offers insight as to why tomographic images still show stark differences for smaller-scale heterogeneities despite progress in modeling and data resolution. Finally, we give an outlook on how we expand this modeling framework towards an inversion procedure guided by wavefield complexity.
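The property AxiSEM3D exploits — that a field smooth in azimuth is captured by a few Fourier modes, so computational cost tracks wavefield complexity — can be sketched with a toy azimuthal profile (the field below is illustrative, not a real wavefield):

```python
import numpy as np

# Sketch: a field smooth in azimuth phi is represented exactly by a small
# number of Fourier modes -- the smoothness AxiSEM3D exploits to cut cost.
nphi = 256
phi = np.linspace(0, 2 * np.pi, nphi, endpoint=False)
field = (1.0 + 0.5 * np.cos(phi)
         + 0.2 * np.sin(3 * phi)
         + 0.05 * np.cos(5 * phi))       # only modes 0, 1, 3, 5 present

coeffs = np.fft.rfft(field) / nphi       # azimuthal Fourier coefficients
kept = 6                                 # keep modes m = 0..5
trunc = coeffs.copy()
trunc[kept:] = 0.0                       # discard all higher modes
recon = np.fft.irfft(trunc * nphi, n=nphi)

rel_err = np.linalg.norm(recon - field) / np.linalg.norm(field)
# rel_err is at round-off level: six modes reproduce this field exactly
```

For rougher structure the required mode count grows, which is exactly how the method links cost to wavefield complexity.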
Bowen, Zachary H.; Melcher, Cynthia P.; Wilson, Juliette T.
2013-01-01
The Ecosystem Dynamics Branch of the Fort Collins Science Center offers an interdisciplinary team of talented and creative scientists with expertise in biology, botany, ecology, geology, biogeochemistry, physical sciences, geographic information systems, and remote-sensing, for tackling complex questions about natural resources. As demand for natural resources increases, the issues facing natural resource managers, planners, policy makers, industry, and private landowners are increasing in spatial and temporal scope, often involving entire regions, multiple jurisdictions, and long timeframes. Needs for addressing these issues include (1) a better understanding of biotic and abiotic ecosystem components and their complex interactions; (2) the ability to easily monitor, assess, and visualize the spatially complex movements of animals, plants, water, and elements across highly variable landscapes; and (3) the techniques for accurately predicting both immediate and long-term responses of system components to natural and human-caused change. The overall objectives of our research are to provide the knowledge, tools, and techniques needed by the U.S. Department of the Interior, state agencies, and other stakeholders in their endeavors to meet the demand for natural resources while conserving biodiversity and ecosystem services. Ecosystem Dynamics scientists use field and laboratory research, data assimilation, and ecological modeling to understand ecosystem patterns, trends, and mechanistic processes. This information is used to predict the outcomes of changes imposed on species, habitats, landscapes, and climate across spatiotemporal scales. The products we develop include conceptual models to illustrate system structure and processes; regional baseline and integrated assessments; predictive spatial and mathematical models; literature syntheses; and frameworks or protocols for improved ecosystem monitoring, adaptive management, and program evaluation. 
The descriptions in this fact sheet provide snapshots of our three research emphases, followed by descriptions of select current projects.
A unified approach to VLSI layout automation and algorithm mapping on processor arrays
NASA Technical Reports Server (NTRS)
Venkateswaran, N.; Pattabiraman, S.; Srinivasan, Vinoo N.
1993-01-01
Development of software tools for designing supercomputing systems is highly complex and cost-ineffective. To tackle this, a special-purpose PAcube silicon compiler that integrates different design levels, from cell to processor arrays, has been proposed. As a part of this, we present in this paper a novel methodology which unifies the problems of Layout Automation and Algorithm Mapping.
A Case Study of Introducing Innovation Through Design
2014-03-01
contacts, freeing more of their mental energy to assist the CO in developing and tackling the overall complexities of the mission. With more energy ...organizations experiencing change while design thinking is devoted to finding solutions to difficult problems by harnessing the creative energy inherent...change. “Rather than focusing on one major opportunity, [embedded actors] pepper the landscape with many cultivated opportunities.”53 (2) Fitting the
From Campus Tug-of-War to Pulling Together: Using the Lean Approach
ERIC Educational Resources Information Center
MacIntyre, Stephen; Meade, Kelly; McEwen, Melissa
2009-01-01
Some days seem like bouts in an endless game of tug-of-war. At one end of the rope, facilities professionals must do more--tackle deferred maintenance, develop a climate strategy, and meet the energy and operational needs for a complex mix of building types and stakeholders. Tugging on the other end are the obstacles of less money, staff, and…
Head impact exposure measured in a single youth football team during practice drills.
Kelley, Mireille E; Kane, Joeline M; Espeland, Mark A; Miller, Logan E; Powers, Alexander K; Stitzel, Joel D; Urban, Jillian E
2017-11-01
OBJECTIVE This study evaluated the frequency, magnitude, and location of head impacts in practice drills within a youth football team to determine how head impact exposure varies among different types of drills. METHODS On-field head impact data were collected from athletes participating in a youth football team for a single season. Each athlete wore a helmet instrumented with a Head Impact Telemetry (HIT) System head acceleration measurement device during all preseason, regular season, and playoff practices. Video was recorded for all practices, and video analysis was performed to verify head impacts and assign each head impact to a specific drill. Eleven drills were identified: dummy/sled tackling, install, special teams, Oklahoma, one-on-one, open-field tackling, passing, position skill work, multiplayer tackle, scrimmage, and tackling drill stations. Generalized linear models were fitted to log-transformed data, and Wald tests were used to assess differences in head accelerations and impact rates. RESULTS A total of 2125 impacts were measured during 30 contact practices in 9 athletes (mean age 11.1 ± 0.6 years, mean mass 44.9 ± 4.1 kg). Open-field tackling had the highest median and 95th percentile linear accelerations (24.7 g and 97.8 g, respectively) and resulted in significantly higher mean head accelerations than several other drills. The multiplayer tackle drill resulted in the highest head impact frequency, with an average of 0.59 impacts per minute per athlete, but the lowest 95th percentile linear accelerations of all drills. The front of the head was the most common impact location for all drills except dummy/sled tackling. CONCLUSIONS Head impact exposure varies significantly in youth football practice drills, with several drills exposing athletes to high-magnitude and/or high-frequency head impacts. 
These data suggest that further study of practice drills is an important step in developing evidence-based recommendations for modifying or eliminating certain high-intensity drills to reduce head impact exposure and injury risk for all levels of play.
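The study above fits generalized linear models to log-transformed accelerations and compares drills with Wald tests. A minimal two-sample version of that idea, on simulated lognormal accelerations with hypothetical drill names (real HIT System data would replace the simulated lists), might look like:

```python
import math
import random

random.seed(1)

# Simulated linear head accelerations (in g) for two hypothetical drills.
open_field = [random.lognormvariate(3.2, 0.5) for _ in range(120)]
dummy_sled = [random.lognormvariate(2.6, 0.5) for _ in range(120)]

def wald_z(a, b):
    """Wald statistic for the difference in mean log-acceleration."""
    la = [math.log(v) for v in a]
    lb = [math.log(v) for v in b]
    ma, mb = sum(la) / len(la), sum(lb) / len(lb)
    va = sum((v - ma) ** 2 for v in la) / (len(la) - 1)
    vb = sum((v - mb) ** 2 for v in lb) / (len(lb) - 1)
    se = math.sqrt(va / len(la) + vb / len(lb))
    return (ma - mb) / se

z = wald_z(open_field, dummy_sled)  # |z| > 1.96 -> difference at the 5% level
```

The paper's analysis is richer (repeated measures per athlete, multiple drills), but the Wald comparison of log-scale means is the common core.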
NASA Astrophysics Data System (ADS)
Jin, Yongmei
In recent years, theoretical modeling and computational simulation of microstructure evolution and materials properties have been attracting much attention. While significant advances have been made, two major challenges remain. One is the integration of multiple physical phenomena for simulation of complex materials behavior; the other is the bridging over multiple length and time scales in materials modeling and simulation. The research presented in this Thesis is focused mainly on tackling the first major challenge. In this Thesis, a unified Phase Field Microelasticity (PFM) approach is developed. This approach is an advanced version of the phase field method that takes into account the exact elasticity of arbitrarily anisotropic, elastically and structurally inhomogeneous systems. The proposed theory and models are applicable to infinite solids, elastic half-spaces, and finite bodies with arbitrarily shaped free surfaces, which may undergo various concomitant physical processes. The Phase Field Microelasticity approach is employed to formulate the theories and models of martensitic transformation, dislocation dynamics, and crack evolution in single crystal and polycrystalline solids. It is also used to study strain relaxation in heteroepitaxial thin films through misfit dislocation and surface roughening. Magnetic domain evolution in nanocrystalline thin films is also investigated. Numerous simulation studies are performed. Comparisons with analytical predictions and experimental observations are presented; agreement verifies the theory and models as realistic simulation tools for computational materials science and engineering. The same Phase Field Microelasticity formalism of individual models of different physical phenomena makes it easy to integrate multiple physical processes into one unified simulation model, where multiple phenomena are treated as various relaxation modes that together act as one common cooperative phenomenon.
The model does not impose a priori constraints on possible microstructure evolution paths. This gives the model predictive power: the material system itself "chooses" the optimal path for multiple processes. The advances made in this Thesis present a significant step toward overcoming the first challenge, mesoscale multi-physics modeling and simulation of materials. At the end of this Thesis, the way to tackle the second challenge, bridging over multiple length and time scales in materials modeling and simulation, is discussed based on the connection between mesoscale Phase Field Microelasticity modeling and microscopic atomistic calculation as well as macroscopic continuum theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clore, G. Marius; Venditti, Vincenzo
2013-10-01
The bacterial phosphotransferase system (PTS) couples phosphoryl transfer, via a series of bimolecular protein–protein interactions, to sugar transport across the membrane. The multitude of complexes in the PTS provides a paradigm for studying protein interactions, and for understanding how the same binding surface can specifically recognize a diverse array of targets. Fifteen years of work aimed at solving the solution structures of all soluble protein–protein complexes of the PTS has served as a test bed for developing NMR and integrated hybrid approaches to study larger complexes in solution and to probe transient, spectroscopically invisible states, including encounter complexes. We review these approaches, highlighting the problems that can be tackled with these methods, and summarize the current findings on protein interactions.
Kernel methods for large-scale genomic data analysis
Xing, Eric P.; Schaid, Daniel J.
2015-01-01
Machine learning methods, particularly kernel methods, have been demonstrated as a promising new tool to tackle the challenges imposed by today's explosive data growth in genomics. They provide a practical and principled approach to learning how a large number of genetic variants are associated with complex phenotypes, helping to reveal the complexity in the relationship between the genetic markers and the outcome of interest. In this review, we highlight the potential key role they will have in modern genomic data processing, especially with regard to integration with classical methods for gene prioritization, prediction and data fusion. PMID:25053743
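As a concrete flavor of the kernel approach described above, a minimal kernel ridge regression on a toy genotype matrix (0/1/2 allele counts; all data, kernel choice, and hyperparameters below are illustrative assumptions) might look like:

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 100, 50
G = rng.integers(0, 3, size=(n, p)).astype(float)  # toy SNP genotypes 0/1/2
beta = np.zeros(p)
beta[:5] = 0.8                                     # 5 "causal" variants
y = G @ beta + rng.normal(scale=0.5, size=n)       # simulated phenotype

def rbf_kernel(A, B, gamma=0.01):
    """Gaussian (RBF) kernel matrix between rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

lam = 0.1                                          # ridge penalty
K = rbf_kernel(G, G)
alpha = np.linalg.solve(K + lam * np.eye(n), y)    # dual weights

y_hat = K @ alpha                                  # in-sample predictions
corr = np.corrcoef(y, y_hat)[0, 1]                 # in-sample fit quality
```

The kernel trick is what lets such models scale to very large variant sets: the n × n kernel matrix, not the n × p genotype matrix, is all the solver touches.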
Seeing & Feeling How Enzymes Work Using Tangible Models
ERIC Educational Resources Information Center
Lau, Kwok-chi
2013-01-01
This article presents a tangible model used to help students tackle some misconceptions about enzyme actions, particularly the induced-fit model, enzyme-substrate complementarity, and enzyme inhibition. The model can simulate how substrates induce a change in the shape of the active site and the role of attraction force during enzyme-substrate…
Complex systems in metabolic engineering.
Winkler, James D; Erickson, Keesha; Choudhury, Alaksh; Halweg-Edwards, Andrea L; Gill, Ryan T
2015-12-01
Metabolic engineers manipulate intricate biological networks to build efficient biological machines. The inherent complexity of this task, derived from the extensive and often unknown interconnectivity between and within these networks, often prevents researchers from achieving desired performance. Other fields have developed methods to tackle the issue of complexity for their unique subset of engineering problems, but to date there has not been an extensive and comprehensive examination of how metabolic engineers use existing tools to ameliorate this effect on their own research projects. In this review, we examine how complexity affects engineering at the protein, pathway, and genome levels within an organism, and the tools for handling these issues to achieve high-performing strain designs. Quantitative complexity metrics and their applications to metabolic engineering versus traditional engineering fields are also discussed. We conclude by predicting how metabolic engineering practices may advance in light of an explicit consideration of design complexity. Copyright © 2015 Elsevier Ltd. All rights reserved.
Sun, Wei; Huang, Guo H; Lv, Ying; Li, Gongchen
2012-06-01
To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from the objective function to the constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, IPFP2 may underestimate the net system costs while IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness in providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities. Copyright © 2012 Elsevier Ltd. All rights reserved.
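Economies of scale make treatment cost a concave function of throughput (e.g. cost ∝ Q^b with b < 1), and piecewise linearization replaces that curve by chords between breakpoints so each segment stays linear. A minimal sketch with an illustrative cost curve and breakpoints (not the paper's coefficients):

```python
# Sketch: piecewise-linear approximation of a concave economies-of-scale
# cost curve cost(Q) = a * Q**b, b < 1. Coefficients and breakpoints are
# illustrative; IPFP would embed such segments in its constraints.

def eos_cost(q, a=100.0, b=0.6):
    return a * q ** b

def linearize(f, breakpoints):
    """Return (q_lo, q_hi, slope, intercept) chords of f between breakpoints."""
    segs = []
    for lo, hi in zip(breakpoints, breakpoints[1:]):
        slope = (f(hi) - f(lo)) / (hi - lo)
        segs.append((lo, hi, slope, f(lo) - slope * lo))
    return segs

def pw_cost(q, segs):
    for lo, hi, m, c in segs:
        if lo <= q <= hi:
            return m * q + c
    raise ValueError("q outside linearized range")

segs = linearize(eos_cost, [1, 10, 50, 100, 200])
err = abs(pw_cost(75, segs) - eos_cost(75))  # approximation gap at Q = 75
```

For a concave curve the chords lie below the true cost, so a coarse breakpoint grid underestimates cost between breakpoints; adding breakpoints shrinks the gap, which is the accuracy/size trade-off the IPFP formulation manages.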
Different Approaches to Covariate Inclusion in the Mixture Rasch Model
ERIC Educational Resources Information Center
Li, Tongyun; Jiao, Hong; Macready, George B.
2016-01-01
The present study investigates different approaches to adding covariates and the impact in fitting mixture item response theory models. Mixture item response theory models serve as an important methodology for tackling several psychometric issues in test development, including the detection of latent differential item functioning. A Monte Carlo…
Small-Scale Fabrication of Biomimetic Structures for Periodontal Regeneration
Green, David W.; Lee, Jung-Seok; Jung, Han-Sung
2016-01-01
The periodontium is the supporting tissue for the tooth organ and is vulnerable to destruction arising from overpopulating pathogenic bacteria and spirochaetes. The presence of microbes together with host responses can destroy large parts of the periodontium, sometimes leading to tooth loss. Permanent tissue replacements are made possible with tissue engineering techniques. However, existing periodontal biomaterials cannot promote proper tissue architectures, the necessary tissue volumes within the periodontal pocket, or a "water-tight" barrier, and so have not become clinically acceptable. New kinds of small-scale engineered biomaterials, with increasing biological complexity, are needed to guide proper biomimetic regeneration of periodontal tissues. The ability to make compound structures with small modules, filled with tissue components, is therefore a promising design strategy for simulating the anatomical complexity of the periodontium attachment complexes along the tooth root and the abutment with the tooth collar. Anatomical structures such as intima and adventitia, and special compartments such as the epithelial cell rests of Malassez or a stellate reticulum niche, need to be engineered from the start of regeneration to produce a proper periodontium replacement. It is our contention that the positioning of tissue components at the origin is also necessary to promote self-organizing cell–cell and cell–matrix connections. This leads to accelerated, synchronized and well-formed tissue architectures and anatomies. This strategy is a highly effective preparation for tackling periodontitis and periodontium tissue resorption, and ultimately for preventing tooth loss. Furthermore, such biomimetic tissue replacements will tackle problems associated with dental implant support and peri-implantitis. PMID:26903872
Improving collaborative learning in online software engineering education
NASA Astrophysics Data System (ADS)
Neill, Colin J.; DeFranco, Joanna F.; Sangwan, Raghvinder S.
2017-11-01
Team projects are commonplace in software engineering education. They address a key educational objective, provide students critical experience relevant to their future careers, allow instructors to set problems of greater scale and complexity than could be tackled individually, and are a vehicle for socially constructed learning. While all student teams experience challenges, those in fully online programmes must also deal with remote working, asynchronous coordination, and computer-mediated communications all of which contribute to greater social distance between team members. We have developed a facilitation framework to aid team collaboration and have demonstrated its efficacy, in prior research, with respect to team performance and outcomes. Those studies indicated, however, that despite experiencing improved project outcomes, students working in effective software engineering teams did not experience significantly improved individual achievement. To address this deficiency we implemented theoretically grounded refinements to the collaboration model based upon peer-tutoring research. Our results indicate a modest, but statistically significant (p = .08), improvement in individual achievement using this refined model.
Artificial neural network methods in quantum mechanics
NASA Astrophysics Data System (ADS)
Lagaris, I. E.; Likas, A.; Fotiadis, D. I.
1997-08-01
In a previous article we have shown how one can employ Artificial Neural Networks (ANNs) in order to solve non-homogeneous ordinary and partial differential equations. In the present work we consider the solution of eigenvalue problems for differential and integrodifferential operators, using ANNs. We start by considering the Schrödinger equation for the Morse potential that has an analytically known solution, to test the accuracy of the method. We then proceed with the Schrödinger and the Dirac equations for a muonic atom, as well as with a nonlocal Schrödinger integrodifferential equation that models the n + α system in the framework of the resonating group method. In two dimensions we consider the well-studied Henon-Heiles Hamiltonian and in three dimensions the model problem of three coupled anharmonic oscillators. The method in all of the treated cases proved to be highly accurate, robust and efficient. Hence it is a promising tool for tackling problems of higher complexity and dimensionality.
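The core device in this line of work is a trial solution that satisfies the boundary conditions by construction, with a neural network supplying the free part. A heavily simplified sketch of that idea for the boundary-value problem y'' + y = 0, y(0) = 0, y(1) = sin(1) (exact solution y = sin x): here the hidden tanh weights are fixed random, so the output weights come from a single linear least-squares solve on the collocation residual, unlike the fully gradient-trained networks (and eigenvalue problems) of the article; all sizes and seeds are illustrative.

```python
import numpy as np

# Lagaris-style trial form: y(x) = sin(1)*x + x*(1-x)*N(x) satisfies both
# boundary conditions for ANY network N. We pick N(x) = sum_j c_j tanh(w_j x + b_j)
# with fixed random w, b, and solve for c by least squares on the residual
# of y'' + y = 0 at collocation points.
rng = np.random.default_rng(3)
m = 20                                   # hidden units
w = rng.normal(scale=2.0, size=m)
b = rng.normal(scale=2.0, size=m)
x = np.linspace(0.0, 1.0, 80)[:, None]   # collocation points, shape (80, 1)

h = np.tanh(w * x + b)                   # hidden activations, (80, m)
h1 = w * (1 - h ** 2)                    # dh/dx
h2 = -2 * w ** 2 * h * (1 - h ** 2)      # d2h/dx2

g = x * (1 - x)
s = np.sin(1.0)
# With y = s*x + g*N:  y'' + y = g*N'' + 2(1-2x)*N' + (g-2)*N + s*x,
# which is linear in the output weights c.
A = g * h2 + 2 * (1 - 2 * x) * h1 + (g - 2) * h
rhs = -s * x[:, 0]
c, *_ = np.linalg.lstsq(A, rhs, rcond=None)

def y_trial(t):
    hh = np.tanh(w * t + b)
    return s * t + t * (1 - t) * (hh @ c)

err = abs(y_trial(0.5) - np.sin(0.5))    # small: trial tracks sin(x)
```

The boundary conditions hold exactly for any weights; only the differential equation residual is minimized, which is what makes the trial-form construction attractive for the eigenvalue problems treated in the article.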
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
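The MapReduce pattern at the core of the framework can be illustrated with a pure-Python sketch, here aggregating per-grid-cell temperatures (the records are made up; in the paper the input would stream from HBase and the phases would run on distributed workers):

```python
from collections import defaultdict

# Toy records: (grid_cell_id, temperature).
records = [("c1", 10.0), ("c2", 14.0), ("c1", 12.0), ("c2", 16.0), ("c1", 11.0)]

def map_phase(recs):
    """Emit (key, value) pairs; here simply (cell, temperature)."""
    for cell, temp in recs:
        yield cell, temp

def shuffle(pairs):
    """Group values by key, as the MapReduce runtime does between phases."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    """Per-cell mean temperature."""
    return {k: sum(v) / len(v) for k, v in groups.items()}

means = reduce_phase(shuffle(map_phase(records)))
# means == {"c1": 11.0, "c2": 15.0}
```

Because map and reduce are side-effect-free per key, the runtime can parallelize them freely, which is what makes the pattern suit the multi-dimensional gridded data described above.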
Metareasoning and Social Evaluations in Cognitive Agents
NASA Astrophysics Data System (ADS)
Pinyol, Isaac; Sabater-Mir, Jordi
Reputation mechanisms have been recognized as one of the key technologies for designing multi-agent systems. They are especially relevant in complex open environments, serving as a non-centralized mechanism to control interactions among agents. Cognitive agents tackling such complex societies must use reputation information not only for selecting partners to interact with, but also in metareasoning processes to change reasoning rules. This is the focus of this paper. We argue for the necessity of allowing, as cognitive system designers, a certain degree of freedom in the reasoning rules of the agents. We also describe cognitive approaches to agency that support this idea. Furthermore, taking as a base the computational reputation model Repage and its integration in a BDI architecture, we use the previous ideas to specify metarules and processes to modify at run-time the reasoning paths of the agent. Concretely, we propose a metarule to update the link between Repage and the belief base, and a metarule and a process to update an axiom incorporated in the belief logic of the agent. Regarding this last issue, we also provide empirical results that show the evolution of agents that use it.
Tait, E. W.; Ratcliff, L. E.; Payne, M. C.; ...
2016-04-20
Experimental techniques for electron energy loss spectroscopy (EELS) combine high energy resolution with high spatial resolution. They are therefore powerful tools for investigating the local electronic structure of complex systems such as nanostructures, interfaces and even individual defects. Interpretation of experimental electron energy loss spectra is often challenging and can require theoretical modelling of candidate structures, which themselves may be large and complex, beyond the capabilities of traditional cubic-scaling density functional theory. In this work, we present functionality to compute electron energy loss spectra within the onetep linear-scaling density functional theory code. We first demonstrate that simulated spectra agree with those computed using conventional plane wave pseudopotential methods to a high degree of precision. The ability of onetep to tackle large problems is then exploited to investigate convergence of spectra with respect to supercell size. As a result, we apply the novel functionality to a study of the electron energy loss spectra of defects on the (1 0 1) surface of an anatase slab and determine concentrations of defects which might be experimentally detectable.
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive volume of data is both computing- and data-intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. The framework leverages cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists.
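The MapReduce pattern this framework builds on can be sketched in plain Python. This is a hypothetical toy illustration only (the actual framework runs over HBase on distributed computers; all names here are invented):

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to every record, yielding (key, value) pairs."""
    for record in records:
        yield from mapper(record)

def reduce_phase(pairs, reducer):
    """Group values by key, then reduce each group to a single result."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: reducer(values) for key, values in groups.items()}

# Toy geoscience-flavoured job: mean reading per grid cell.
readings = [("cell_a", 14.2), ("cell_a", 15.8), ("cell_b", 9.0)]
means = reduce_phase(
    map_phase(readings, lambda r: [r]),       # identity mapper
    lambda values: sum(values) / len(values)  # mean reducer
)
```

In a real deployment each map and reduce task would run on a separate node; the single-process version above only shows the data flow.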
Toward edge minability for role mining in bipartite networks
NASA Astrophysics Data System (ADS)
Dong, Lijun; Wang, Yi; Liu, Ran; Pi, Benjie; Wu, Liuyi
2016-11-01
Bipartite network models have been extensively used in information security to automatically generate role-based access control (RBAC) configurations from datasets, a process called role mining. However, not all topologies of bipartite networks are suitable for role mining; some edges may even reduce the quality of the result. This causes unnecessary time consumption, as role mining is NP-hard. Therefore, to improve the quality of role mining results, the capability of an edge to compose roles with other edges, called the minability of the edge, needs to be identified. We tackle the problem from the angle of edge importance in complex networks; that is, an edge easily covered by roles is considered more important. Based on this idea, the k-shell decomposition of complex networks is extended to reveal the differing minability of edges. In this way, a bipartite network can be quickly purified by excluding low-minability edges from role mining, and thus the quality of role mining can be effectively improved. Extensive experiments on real-world datasets confirm these claims.
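As a rough illustration of the idea (not the paper's exact algorithm), a k-shell-style pruning over bipartite edges might look like the sketch below: edges that survive more pruning rounds land in higher shells and are treated as having higher minability. The specific shell rule is an assumption for illustration:

```python
from collections import Counter

def edge_minability_shells(edges):
    """Assign each edge a shell index by iteratively pruning edges whose
    endpoints have low degree; higher shell = higher assumed minability."""
    remaining = set(edges)
    shells = {}
    k = 1
    while remaining:
        while True:
            deg = Counter()
            for u, v in remaining:
                deg[u] += 1
                deg[v] += 1
            drop = {e for e in remaining if min(deg[e[0]], deg[e[1]]) <= k}
            if not drop:
                break
            for e in drop:
                shells[e] = k
            remaining -= drop
        k += 1
    return shells

# Toy user-permission bipartite graph: (user, permission) edges.
edges = [("u1", "p1"), ("u1", "p2"), ("u2", "p1"),
         ("u2", "p2"), ("u3", "p3")]
shells = edge_minability_shells(edges)
```

Here the peripheral edge ("u3", "p3") falls in shell 1 and would be excluded first, while the four edges forming a dense user-permission block survive into shell 2.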
A 16-bit Coherent Ising Machine for One-Dimensional Ring and Cubic Graph Problems
NASA Astrophysics Data System (ADS)
Takata, Kenta; Marandi, Alireza; Hamerly, Ryan; Haribara, Yoshitaka; Maruo, Daiki; Tamate, Shuhei; Sakaguchi, Hiromasa; Utsunomiya, Shoko; Yamamoto, Yoshihisa
2016-09-01
Many tasks in modern life, such as planning efficient travel, processing images and optimizing integrated circuit designs, can be modeled as complex combinatorial optimization problems with binary variables. Such problems can be mapped to finding a ground state of the Ising Hamiltonian, and various physical systems have therefore been studied to emulate and solve this Ising problem. Recently, networks of mutually injected optical oscillators, called coherent Ising machines, have been developed as promising solvers for the problem, benefiting from programmability, scalability and room-temperature operation. Here, we report a 16-bit coherent Ising machine based on a network of time-division-multiplexed femtosecond degenerate optical parametric oscillators. The system experimentally achieves success rates of more than 99.6% for one-dimensional Ising ring and nondeterministic polynomial-time (NP) hard instances. The experimental and numerical results indicate that gradual pumping of the network, combined with the multiple spectral and temporal modes of the femtosecond pulses, can improve the computational performance of the Ising machine, offering a new path for tackling larger and more complex instances.
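The underlying mapping can be made concrete with a brute-force sketch: for small n, the ground state of an Ising ring is found by exhaustive search over all spin configurations (the coherent Ising machine replaces this exponential search with physical dynamics). The code is an illustrative toy, not the authors' setup:

```python
from itertools import product

def ising_energy(spins, couplings):
    """Energy of H = -sum_{(i,j)} J_ij * s_i * s_j."""
    return -sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

def ground_state(n, couplings):
    """Brute-force search over all 2^n spin configurations."""
    return min(product((-1, 1), repeat=n),
               key=lambda s: ising_energy(s, couplings))

# Antiferromagnetic 4-spin ring: J = -1 on every ring edge, so the
# ground state alternates spins around the ring (energy -4 here).
n = 4
couplings = {(i, (i + 1) % n): -1 for i in range(n)}
best = ground_state(n, couplings)
```

For 16 spins this search already visits 65 536 configurations, which is why physical annealing-style solvers become attractive as instances grow.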
Adamson, M W; Morozov, A Y; Kuzenkov, O A
2016-09-01
Mathematical models in biology are highly simplified representations of a complex underlying reality, and there is always a high degree of uncertainty in the specification of model functions. This uncertainty becomes critical for models in which different functions fitting the same dataset can yield substantially different predictions, a property known as structural sensitivity. Thus, even if the model is purely deterministic, uncertainty in the model functions carries through into uncertainty in model predictions, and new frameworks are required to tackle this fundamental problem. Here, we consider a framework that uses partially specified models, in which some functions are not represented by a specific form. The main idea is to project the infinite-dimensional function space into a low-dimensional space that takes biological constraints into account. The key question of how to carry out this projection has so far remained a serious mathematical challenge and has hindered the use of partially specified models. Here, we propose and demonstrate a potentially powerful technique to perform such a projection by using optimal control theory to construct functions with the specified global properties. This approach opens up the prospect of a flexible and easy-to-use method for the uncertainty analysis of biological models.
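Structural sensitivity can be illustrated with a toy example: two standard response functions, with parameters solved by hand purely for illustration, that agree exactly at the available data points yet diverge under extrapolation:

```python
import math

# Two candidate response functions whose parameters were chosen so that
# they agree exactly at the "data points" x = 1 and x = 2:
def holling(x):            # Holling type II: 3x / (1 + 0.5x)
    return 3.0 * x / (1.0 + 0.5 * x)

def ivlev(x):              # Ivlev: 4 * (1 - 2^(-x))
    return 4.0 * (1.0 - math.exp(-math.log(2.0) * x))

# Both functions fit the data equally well...
fit_gap = max(abs(holling(x) - ivlev(x)) for x in (1.0, 2.0))
# ...but their predictions diverge away from the data:
prediction_gap = holling(5.0) - ivlev(5.0)
```

Both forms pass through (1, 2) and (2, 3), yet at x = 5 they differ by roughly 0.4, about 10% of the predicted value: a deterministic model built on either would give materially different forecasts.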
Emulation of complex open quantum systems using superconducting qubits
NASA Astrophysics Data System (ADS)
Mostame, Sarah; Huh, Joonsuk; Kreisbeck, Christoph; Kerman, Andrew J.; Fujita, Takatoshi; Eisfeld, Alexander; Aspuru-Guzik, Alán
2017-02-01
With quantum computers out of reach for now, quantum simulators are alternative devices for efficient and accurate simulation of problems that are challenging to tackle using conventional computers. Quantum simulators are classified into analog and digital, with the possibility of constructing "hybrid" simulators by combining both techniques. Here we focus on analog quantum simulators of open quantum systems and address the limit at which they can outperform classical computers. In particular, as an example, we discuss simulation of the chlorosome light-harvesting antenna from green sulfur bacteria, with over 250 phonon modes coupled to each electronic state. Furthermore, we propose physical setups that can be used to reproduce the quantum dynamics of standard and multiple-mode Holstein models. The proposed scheme is based on currently available superconducting-circuit technology consisting of flux qubits and quantum oscillators.
Dangerous dogs: culprits or victims?
Mills, Georgina
2014-12-06
Dangerous dogs and dog bite incidents are rarely out of the news and are a matter of great public interest, but what can be done to tackle this issue and are the dogs really to blame? A debate at the BVA Congress at the London Vet Show discussed the complexities surrounding dog bites and dog behaviour, and looked at possible ways of preventing future incidents. Georgina Mills reports. British Veterinary Association.
Managing obesity in primary care.
Goldie, Christine; Brown, Jenny
Obesity is a complex problem and often difficult to tackle in primary care. A year-long pilot of a practice nurse-led scheme that used a holistic approach towards self-care in obesity management was set up to reduce the cardiovascular risk of patients who were obese and improve their quality of life. This person-centred approach may offer an important tool in the management of these patients in the GP surgery.
,
2009-01-01
In the Southeast, U.S. Geological Survey (USGS) scientists are researching issues through technical studies of water availability and quality, geologic processes (marine, coastal, and terrestrial), geographic complexity, and biological resources. The USGS is prepared to tackle multifaceted questions associated with global climate change and resulting weather patterns such as drought through expert scientific skill, innovative research approaches, and accurate information technology.
ERIC Educational Resources Information Center
Kollmann, Elizabeth Kunz; Reich, Christine; Bell, Larry; Goss, Juli
2013-01-01
In a world of increasing scientific and technological complexity, where science and technology play an expanding role in our lives, there is need for a democratic citizenry that is skilled at discussing and making choices that are informed by science and shaped by individual and collective values. Although an oft argued rationale for teaching…
Comfortable with Chaos: Operational Design in the Naval Special Warfare Planning Process
2011-05-08
When President Alvaro Uribe Velez took office, Colombia was enduring a multi-faceted and interactively complex strategic situation. Three major insurgent groups...President Uribe took office and designed a comprehensive strategy to tackle the "wicked" problem. President Uribe designed an operational approach that...government, unattainable by previous presidents. From 2002 to 2006, the Uribe administration reframed their understanding of the problem and
ERIC Educational Resources Information Center
Nowrouzian, Forough L.; Farewell, Anne
2013-01-01
Teamwork has become an integral part of most organisations today, and it is clearly important in Science and other disciplines. In Science, research teams increase in size while the number of single-authored papers and patents declines. Teamwork in laboratory sciences permits the tackling of projects that are too big or complex for one individual.…
NASA Astrophysics Data System (ADS)
Xia, Xilin; Liang, Qiuhua; Ming, Xiaodong; Hou, Jingming
2017-05-01
Numerical models solving the full 2-D shallow water equations (SWEs) have been increasingly used to simulate overland flows and better understand the transient flow dynamics of flash floods in a catchment. However, key challenges remain unresolved for the development of fully dynamic overland flow models, related to (1) the difficulty of maintaining numerical stability and accuracy in the limit of disappearing water depth and (2) inaccurate estimation of velocities and discharges on slopes as a result of the strong nonlinearity of the friction terms. This paper aims to tackle these challenges and presents a new numerical scheme for accurately and efficiently modeling large-scale transient overland flows over complex terrains. The proposed scheme features a novel surface reconstruction method (SRM) to correctly compute slope source terms and maintain numerical stability at small water depths, and a new implicit discretization method to handle the highly nonlinear friction terms. The resulting shallow water overland flow model is first validated against analytical and experimental test cases and then applied to simulate a hypothetical rainfall event in the 42 km2 Haltwhistle Burn catchment, UK.
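The role of an implicit friction discretization can be sketched as follows. This is a generic point-implicit (linearised) Manning friction update, not the paper's exact scheme, and the parameter values are illustrative:

```python
def implicit_friction_update(u_star, h, dt, n_manning=0.03, g=9.81):
    """Point-implicit Manning friction: linearise |u| about the
    intermediate velocity u_star, giving
        u_new = u_star / (1 + dt * g * n^2 * |u_star| / h^(4/3)).
    The denominator grows without bound as h -> 0, so the velocity stays
    bounded on nearly dry cells, where an explicit friction source term
    would blow up and destroy stability."""
    if h <= 0.0:
        return 0.0               # dry cell carries no momentum
    c = g * n_manning ** 2 / h ** (4.0 / 3.0)
    return u_star / (1.0 + dt * c * abs(u_star))

# Shallow, fast flow: friction strongly damps the intermediate velocity.
u_new = implicit_friction_update(u_star=1.0, h=0.001, dt=0.1)
```

The update never changes the sign of the velocity and always reduces its magnitude, which is the qualitative behaviour a friction term must have.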
Philosophy and Sociology of Science Evolution and History
NASA Astrophysics Data System (ADS)
Rosen, Joe
The following sections are included: * Concrete Versus Abstract Theoretical Models * Introduction: concrete and abstract in Kepler's contribution * Einstein's theory of gravitation and Mach's principle * Unitary symmetry and the structure of hadrons * Conclusion * Dedication * Symmetry, Entropy and Complexity * Introduction * Symmetry Implies Abstraction and Loss of Information * Broken Symmetries - Imposed or Spontaneous * Symmetry, Order and Information * References * Cosmological Surrealism: More Than "Eternal Reality" Is Needed * Pythagoreanism in atomic, nuclear and particle physics * Introduction: Pythagoreanism as part of the Greek scientific world view — and the three questions I will tackle * Point 1: the impact of Gersonides and Crescas, two scientific anti-Aristotelian rebels * Point 2: Kepler's spheres to Bohr's orbits — Pythagoreanisms at last! * Point 3: Aristotle to Maupertuis, Emmy Noether, Schwinger * References * Paradigm Completion For Generalized Evolutionary Theory With Application To Epistemology * Evolution Fully Generalized * Entropy: Gravity as Model * Evolution and Entropy: Measures of Complexity * Extinctions and a Balanced Evolutionary Paradigm * The Evolution of Human Society - the Age of Information as example * High-Energy Physics and the World Wide Web * Twentieth Century Epistemology has Strong (de facto) Evolutionary Elements * The discoveries towards the beginning of the XXth Century * Summary and Conclusions * References * Evolutionary Epistemology and Invalidation * Introduction * Extinctions and A New Evolutionary Paradigm * Evolutionary Epistemology - Active Mutations * Evolutionary Epistemology: Invalidation as An Extinction * References
The Effects of Verbal Instruction and Shaping to Improve Tackling by High School Football Players
ERIC Educational Resources Information Center
Harrison, Antonio M.; Pyles, David A.
2013-01-01
We evaluated verbal instruction and shaping using TAG (teaching with acoustical guidance) to improve tackling by 3 high school football players. Verbal instruction and shaping improved tackling for all 3 participants. In addition, performance was maintained as participants moved more quickly through the tackling procedure.
NASA Astrophysics Data System (ADS)
Berraud-Pache, Romain; Garcia-Iriepa, Cristina; Navizet, Isabelle
2018-04-01
In less than half a century, the hybrid QM/MM method has become one of the most widely used techniques to model molecules embedded in a complex environment. A well-known application of the QM/MM method is to biological systems: nowadays, one can understand how enzymatic reactions work or compute spectroscopic properties, such as the wavelength of emission. Here, we have tackled the issue of modelling chemical reactions inside proteins. We have studied a bioluminescent system, fireflies, and investigated whether a keto-enol tautomerization is possible inside the protein. The two tautomers are candidates to be the emissive molecule of the bioluminescence, but no consensus has been reached. One hypothesis is to consider a possible keto-enol tautomerization, as has already been observed in water. A joint approach combining extensive MD simulations with computation of key intermediates such as transition states (TS) using QM/MM calculations is presented in this publication. We also describe the procedure and difficulties met during this approach in order to provide a guide for modelling this kind of chemical reaction using QM/MM methods.
A Synergetic Approach to Describe the Stability and Variability of Motor Behavior
NASA Astrophysics Data System (ADS)
Witte, Kerstin; Bock, Holger; Storb, Ulrich; Blaser, Peter
At the beginning of the 20th century, the Russian physiologist and biomechanist Bernstein developed his cyclograms, in which he showed the non-repetition of the same movement under constant conditions. We can also observe this phenomenon when we analyze several cyclic sports movements. For example, we investigated the trajectories of single joints and segments of the body in breaststroke, walking, and running. The problem of the stability and variability of movement, and the relation between the two, cannot be satisfactorily tackled by means of linear methods. Thus, several authors (Turvey, 1977; Kugler et al., 1980; Haken et al., 1985; Schöner et al., 1986; Mitra et al., 1997; Kay et al., 1991; Ganz et al., 1996; Schöllhorn, 1999) use nonlinear models to describe human movement. These models and approaches have shown that nonlinear theories of complex systems provide a new understanding of the stability and variability of motor control. The purpose of this chapter is the presentation of a common synergetic model of motor behavior and its application to foot tapping, walking, and running.
Haynes, Abby; Brennan, Sue; Carter, Stacy; O'Connor, Denise; Schneider, Carmen Huckel; Turner, Tari; Gallego, Gisselle
2014-09-27
Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies. We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed. This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. 
Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.
Technical determinants of tackle and ruck performance in International rugby union.
Hendricks, Sharief; van Niekerk, Tiffany; Sin, Drew Wade; Lambert, Mike; den Hollander, Steve; Brown, James; Maree, Willie; Treu, Paul; Till, Kevin; Jones, Ben
2018-03-01
The most frequently occurring contact events in rugby union are the tackle and ruck. The ability to repeatedly engage in and win the tackle and ruck has been associated with team success. To win the tackle and ruck, players have to perform specific techniques, which have not been studied at the highest level of rugby union. Therefore, the purpose of this study was to identify technical determinants of tackle and ruck performance at the highest level of rugby union. A total of 4479 tackle and 2914 ruck events were coded for the Six Nations and Championship competitions. The relative risk ratio (RR), the ratio of the probability of an outcome occurring when a characteristic was observed (versus not observed), was determined using multinomial logistic regression. Executing front-on tackles reduced the likelihood of offloads and tackle breaks in both competitions (Six Nations RR 3.0 behind tackle, 95% confidence interval [95% CI]: 1.9-4.6, effect size [ES] = large, P < 0.001; Championship RR 2.9 jersey tackle, 95% CI: 1.3-6.4, ES = moderate, P = 0.01). Fending during contact increased the chances of offloading and breaking the tackle in both competitions (Six Nations RR 4.5 strong, 95% CI: 2.2-9.2, ES = large, P < 0.001; Championship RR 5.1 moderate, 95% CI: 3.5-7.4, ES = large, P < 0.001). For the ruck, actively placing the ball increased the probability of maintaining possession (Six Nations RR 2.2, 95% CI: 1.1-4.3, ES = moderate, P = 0.03; Championship RR 4.0, 95% CI: 1.3-11.8, ES = large, P = 0.01). The techniques identified in this study should be incorporated and emphasised during training to prepare players for competition. Furthermore, these techniques should be added to coaching manuals for the tackle and ruck.
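The risk ratio itself reduces to a ratio of conditional probabilities. A minimal sketch with hypothetical counts (the study estimated RRs via multinomial logistic regression, not this raw calculation):

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk ratio: P(outcome | characteristic observed) divided by
    P(outcome | characteristic not observed)."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Hypothetical counts: tackle breaks with a strong fend vs. without one.
rr = relative_risk(30, 100, 10, 150)   # 0.300 / 0.067 -> RR of 4.5
```

An RR of 4.5 would mean the outcome is 4.5 times as likely when the characteristic is present, matching how the RRs in the abstract are read.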
Burger, Nicholas; Lambert, Michael I; Viljoen, Wayne; Brown, James C; Readhead, Clint; Hendricks, Sharief
2016-08-01
The high injury rate associated with rugby union is primarily due to the tackle, and poor contact technique has been identified as a risk factor for injury. We aimed to determine whether the tackle technique proficiency scores were different in injurious tackles versus tackles that did not result in injury using real-match scenarios in high-level youth rugby union. Injury surveillance was conducted at the under-18 Craven Week tournaments (2011-2013). Tackle-related injury information was used to identify injury events in the match video footage and non-injury events were identified for the injured player cohort. Injury and non-injury events were scored for technique proficiency and Cohen's effect sizes were calculated and the Student t test (p<0.05) was performed to compare injury versus non-injury scores. The overall mean score for front-on ball-carrier proficiency was 7.17±1.90 and 9.02±2.15 for injury and non-injury tackle events, respectively (effect size=moderate; p<0.05). The overall mean score for side/behind ball-carrier proficiency was 4.09±2.12 and 7.68±1.72 for injury and non-injury tackle events, respectively (effect size=large; p<0.01). The overall mean score for front-on tackler proficiency was 7.00±1.95 and 9.35±2.56 for injury and non-injury tackle events, respectively (effect size=moderate; p<0.05). The overall mean score for side/behind tackler proficiency was 5.47±1.60 and 8.14±1.75 for injury and non-injury tackle events, respectively (effect size=large; p<0.01). Higher overall mean and criterion-specific tackle-related technique scores were associated with a non-injury outcome. The ability to perform well during tackle events may decrease the risk of injury and may manifest in superior performance.
2017-03-01
models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the...deterioration, for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems
Automated model optimisation using the Cylc workflow engine (Cyclops v1.0)
NASA Astrophysics Data System (ADS)
Gorman, Richard M.; Oliver, Hilary J.
2018-06-01
Most geophysical models include many parameters that are not fully determined by theory and can be tuned to improve the model's agreement with available data. We might attempt to automate this tuning process in an objective way by employing an optimisation algorithm to find the set of parameters that minimises a cost function derived from comparing model outputs with measurements. A number of algorithms are available for solving optimisation problems, in various programming languages, but interfacing such software to a complex geophysical model simulation presents certain challenges. To tackle this problem, we have developed an optimisation suite (Cyclops) based on the Cylc workflow engine that implements a wide selection of optimisation algorithms from the NLopt Python toolbox (Johnson, 2014). The Cyclops optimisation suite can be used to calibrate any modelling system that has itself been implemented as a (separate) Cylc model suite, provided it includes computation and output of the desired scalar cost function. A growing number of institutions are using Cylc to orchestrate complex distributed suites of interdependent cycling tasks within their operational forecast systems, and in such cases application of the optimisation suite is particularly straightforward. As a test case, we applied Cyclops to calibrate a global implementation of the WAVEWATCH III (v4.18) third-generation spectral wave model, forced by ERA-Interim input fields. The model was calibrated over a 1-year period (1997), before the calibrated model was applied to a full (1979-2016) wave hindcast. The chosen error metric was the spatial average of the root mean square error of hindcast significant wave height compared with collocated altimeter records. We describe the results of a calibration in which up to 19 parameters were optimised.
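The calibration loop Cyclops automates can be caricatured in a few lines. Here a stand-in cost function and a simple derivative-free search replace the full model run and the NLopt algorithms, so every name below is illustrative:

```python
import math

def run_model(param):
    """Stand-in for a full model run plus cost evaluation: in Cyclops this
    would be a Cylc suite run producing a scalar cost. Here it is a toy
    quadratic with its minimum at param = 1.3."""
    return (param - 1.3) ** 2 + 0.5

def golden_section_minimise(cost, lo, hi, tol=1e-5):
    """Derivative-free 1-D minimisation, the kind of algorithm an
    optimisation driver can delegate each cost evaluation to."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = cost(c), cost(d)
    while b - a > tol:
        if fc < fd:              # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = cost(c)
        else:                    # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = cost(d)
    return (a + b) / 2.0

best = golden_section_minimise(run_model, 0.0, 3.0)
```

In the real suite each `run_model` call is an expensive distributed simulation, which is why embedding the loop in a workflow engine (restartable, parallelisable tasks) matters.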
Link-prediction to tackle the boundary specification problem in social network surveys
De Wilde, Philippe; Buarque de Lima-Neto, Fernando
2017-01-01
Diffusion processes in social networks often cause the emergence of global phenomena from individual behavior within a society. The study of those global phenomena and the simulation of those diffusion processes frequently require a good model of the global network. However, survey data and data from online sources are often restricted to single social groups or features, such as age groups, single schools, companies, or interest groups. Hence, a modeling approach is required that extrapolates the locally restricted data to a global network model. We tackle this Missing Data Problem using Link-Prediction techniques from social network research, network generation techniques from the area of Social Simulation, as well as a combination of both. We found that techniques employing less information may be more adequate to solve this problem, especially when data granularity is an issue. We validated the network models created with our techniques on a number of real-world networks, investigating degree distributions as well as the likelihood of links given the geographical distance between two nodes.
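A common-neighbour link-prediction score of the kind referenced here can be sketched briefly; Adamic-Adar is one standard choice, and the toy graph below is purely illustrative:

```python
from collections import defaultdict
from math import log

def adamic_adar(adj, u, v):
    """Adamic-Adar link-prediction score: shared neighbours weighted by
    the inverse log of their degree, so rare shared contacts count more."""
    return sum(1.0 / log(len(adj[w]))
               for w in adj[u] & adj[v] if len(adj[w]) > 1)

# Toy survey graph; adjacency stored as sets for cheap intersection.
adj = defaultdict(set)
for a, b in [("ann", "bob"), ("ann", "cat"), ("bob", "cat"),
             ("bob", "dan"), ("cat", "dan")]:
    adj[a].add(b)
    adj[b].add(a)

score = adamic_adar(adj, "ann", "dan")   # bob and cat are shared neighbours
```

Ranking all unobserved node pairs by such a score and adding the top-ranked links is one simple way to extrapolate a partially observed network toward a global model.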
Probing the Topological Properties of Complex Networks Modeling Short Written Texts
Amancio, Diego R.
2015-01-01
In recent years, graph theory has been widely employed to probe several language properties. More specifically, the so-called word adjacency model has proven useful for tackling several practical problems, especially those relying on textual stylistic analysis. The most common approach to treating texts as networks has simply considered either large pieces of texts or entire books. This approach has certainly worked well (many informative discoveries have been made this way), but it raises an uncomfortable question: could there be important topological patterns in small pieces of texts? To address this problem, the topological properties of subtexts sampled from entire books were probed. Statistical analyses performed on a dataset comprising 50 novels revealed that most of the traditional topological measurements are stable for short subtexts. When the performance of the authorship recognition task was analyzed, it was found that a proper sampling yields a discriminability similar to the one found with full texts. Surprisingly, the support vector machine classification based on the characterization of short texts outperformed the one performed with entire books. These findings suggest that a local topological analysis of large documents might improve their global characterization. Most importantly, it was verified, as a proof of principle, that short texts can be analyzed with the methods and concepts of complex networks. As a consequence, the techniques described here can be extended in a straightforward fashion to analyze texts as time-varying complex networks.
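The word adjacency model itself is straightforward to sketch: nodes are words and edges connect words that appear next to each other, after which standard topological measurements can be taken. A minimal illustration (the tokenisation rule is an assumption):

```python
from collections import defaultdict
import re

def word_adjacency_network(text):
    """Build an undirected word-adjacency network: nodes are words and
    edges link words that occur next to each other in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    adj = defaultdict(set)
    for w1, w2 in zip(words, words[1:]):
        if w1 != w2:
            adj[w1].add(w2)
            adj[w2].add(w1)
    return adj

net = word_adjacency_network("the cat sat on the mat and the cat slept")
degree = {word: len(neighbours) for word, neighbours in net.items()}
```

Topological measurements such as the degree distribution, clustering, or shortest paths of `net` then become stylistic features; the paper's point is that they remain stable even when `text` is a short sample of a book.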
46 CFR 121.300 - Ground tackle and mooring lines.
Code of Federal Regulations, 2010 CFR
2010-10-01
46 Shipping, Part 121 (2010-10-01), Miscellaneous Systems and Equipment, Mooring and Towing Equipment, § 121.300 Ground tackle and mooring lines: A vessel must be fitted with ground tackle and mooring lines necessary for the vessel to be safely anchored...
An investigation of shoulder forces in active shoulder tackles in rugby union football.
Usman, Juliana; McIntosh, Andrew S; Fréchède, Bertrand
2011-11-01
In rugby union football, the tackle is the most frequently executed skill and the one most associated with injury, including shoulder injury to the tackler. Despite the importance of the tackle, little is known about the magnitude of shoulder forces in the tackle and the factors that influence them. The objectives of the study were to measure the shoulder force in the tackle, as well as the effects of shoulder padding, skill level, side of body, player size, and experimental setting on shoulder force. Experiments were conducted in laboratory and field settings using a repeated measures design. Thirty-five participants were recruited to the laboratory and 98 to the field setting. All were male, aged over 18 years, with rugby experience. The maximum force applied to the shoulder in an active shoulder tackle was measured with a custom-built forceplate incorporated into a 45 kg tackle bag. The overall average maximum shoulder force was 1660 N in the laboratory and 1997 N in the field; this difference was significant. The shoulder force for tackling without shoulder pads was 1684 N, compared to 1635 N with shoulder pads. There was no difference between the shoulder forces on the dominant and non-dominant sides. Shoulder force reduced with tackle repetition. No relationship was observed between shoulder force and player skill level or size. A substantial force can be applied to the shoulder and to an opponent in the tackle. This force is within the shoulder's injury tolerance range and is unaffected by shoulder pads.
Catastrophic rugby injuries of the spinal cord: changing patterns of injury.
Scher, A T
1991-01-01
In reports from the UK and New Zealand, it is noted that the incidence of rugby injuries to the cervical spinal cord has dropped and that the percentage of players injured in the tackle has similarly decreased. In contrast, this does not appear to be the pattern in South Africa, and an analysis has therefore been made of 40 rugby players sustaining injuries to the spinal cord during the period 1985 to 1989. The radiological appearances on admission have been correlated with the circumstances of injury, associated orthopaedic injuries and neurological deficits. The tackle was responsible for the majority of injuries, causing more than the scrum did. Tackles were also responsible for more cases of complete, permanent quadriplegia than the scrum. The commonest cause of injury in players being tackled was the high tackle around the neck, while the commonest cause of injury in players making the tackle was the dive tackle. This survey has shown that the tackle is now the major cause of spinal cord injury in South African rugby, in contrast to earlier analyses in which the scrum was identified as the most common cause. PMID:1913034
The neurobiology of psychopathy.
Glenn, Andrea L; Raine, Adrian
2008-09-01
Numerous studies have tackled the complex challenge of understanding the neural substrates of psychopathy, revealing that brain abnormalities exist on several levels and in several structures. As we discover more about complex neural networks, it becomes increasingly difficult to clarify how these systems interact with each other to produce the distinct pattern of behavioral and personality characteristics observed in psychopathy. The authors review the recent research on the neurobiology of psychopathy, beginning with molecular neuroscience work and progressing to the level of brain structures and their connectivity. Potential factors that may affect the development of brain impairments, as well as how some systems may be targeted for potential treatment, are discussed.
Lade, Steven J; Niiranen, Susa; Hentati-Sundberg, Jonas; Blenckner, Thorsten; Boonstra, Wiebren J; Orach, Kirill; Quaas, Martin F; Österblom, Henrik; Schlüter, Maja
2015-09-01
Regime shifts triggered by human activities and environmental changes have led to significant ecological and socioeconomic consequences in marine and terrestrial ecosystems worldwide. Ecological processes and feedbacks associated with regime shifts have received considerable attention, but human individual and collective behavior is rarely treated as an integrated component of such shifts. Here, we used generalized modeling to develop a coupled social-ecological model that integrated rich social and ecological data to investigate the role of social dynamics in the 1980s Baltic Sea cod boom and collapse. We showed that psychological, economic, and regulatory aspects of fisher decision making, in addition to ecological interactions, contributed both to the temporary persistence of the cod boom and to its subsequent collapse. These features of the social-ecological system also would have limited the effectiveness of stronger fishery regulations. Our results provide quantitative, empirical evidence that incorporating social dynamics into models of natural resources is critical for understanding how resources can be managed sustainably. We also show that generalized modeling, which is well-suited to collaborative model development and does not require detailed specification of causal relationships between system variables, can help tackle the complexities involved in creating and analyzing social-ecological models.
Solving lot-sizing problem with quantity discount and transportation cost
NASA Astrophysics Data System (ADS)
Lee, Amy H. I.; Kang, He-Yau; Lai, Chun-Mei
2013-04-01
Owing to today's increasingly competitive market and ever-changing manufacturing environment, the inventory problem is becoming more complicated to solve. The incorporation of heuristic methods has become a new trend for tackling complex problems in the past decade. This article considers a lot-sizing problem, and the objective is to minimise total costs, where the costs include ordering, holding, purchase and transportation costs, under the requirement that no inventory shortage is allowed in the system. We first formulate the lot-sizing problem as a mixed integer programming (MIP) model. Next, an efficient genetic algorithm (GA) model is constructed for solving large-scale lot-sizing problems. An illustrative example with two cases from a touch panel manufacturer demonstrates the practicality of these models, and a sensitivity analysis is applied to understand the impact of changes in parameters on the outcomes. The results demonstrate that both the MIP model and the GA model are effective and relatively accurate tools for determining multi-period replenishment for touch panel manufacturing with quantity discounts and batch transportation. The contributions of this article are to construct an MIP model that obtains an optimal solution when the problem is not too complicated and to present a GA model that finds a near-optimal solution efficiently when it is.
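As a rough illustration of the GA approach described above (not the authors' actual model), the sketch below evolves order quantities for a toy single-item, four-period instance with an all-units quantity discount and a per-truck transportation cost; all data and GA settings are invented for illustration.

```python
import random

# All data below are invented for illustration (single item, 4 periods).
DEMAND = [80, 120, 60, 100]
ORDER_COST = 50.0                    # fixed cost per order placed
HOLD_COST = 0.5                      # holding cost per unit per period
TRUCK_CAP, TRUCK_COST = 100, 30.0    # batch transportation

def unit_price(q):
    """All-units quantity discount: larger orders get a lower unit price."""
    return 2.0 if q >= 150 else 2.5

def total_cost(plan):
    """Ordering + purchase + transportation + holding; shortages penalized."""
    inv, cost = 0, 0.0
    for q, d in zip(plan, DEMAND):
        if q > 0:
            trucks = -(-q // TRUCK_CAP)      # ceiling division
            cost += ORDER_COST + q * unit_price(q) + trucks * TRUCK_COST
        inv += q - d
        if inv < 0:                          # no inventory shortage allowed
            return 1e9
        cost += inv * HOLD_COST
    return cost

def ga(pop_size=40, gens=200, seed=1):
    """Tiny GA: elitist survival, one-point crossover, random-reset mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(0, 301, 10) for _ in DEMAND] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=total_cost)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(DEMAND))
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:
                child[rng.randrange(len(DEMAND))] = rng.randrange(0, 301, 10)
            children.append(child)
        pop = parents + children
    best = min(pop, key=total_cost)
    return best, total_cost(best)
```

The penalty value 1e9 enforces the no-shortage requirement softly; a production model would instead repair infeasible chromosomes or encode cumulative feasibility directly.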
Dynamical systems approach to the study of a sociophysics agent-based model
NASA Astrophysics Data System (ADS)
Timpanaro, André M.; Prado, Carmen P. C.
2011-03-01
The Sznajd model is a Potts-like model that has been studied in the context of sociophysics [1,2] (where spins are interpreted as opinions). In a recent work [3], we generalized the Sznajd model to include asymmetric interactions between the spins (interpreted as biases towards opinions) and used dynamical systems techniques to tackle its mean-field version, given by the flow: η̇_σ = Σ_{σ'=1}^{M} η_σ η_{σ'} (η_σ ρ_{σ'→σ} − η_{σ'} ρ_{σ→σ'}), where η_σ is the proportion of agents with opinion (spin) σ, M is the number of opinions and ρ_{σ→σ'} is the probability weight for an agent with opinion σ being convinced by another agent with opinion σ'. We made Monte Carlo simulations of the model on a complex network (using Barabási-Albert networks [4]) and they displayed the same attractors as the mean-field flow. Using linear stability analysis, we were able to determine the mean-field attractor structure analytically and to show that it has connections with well-known graph theory problems (maximal independent sets and positive fluxes in directed graphs). Our dynamical systems approach is quite simple and can be used also in other models, like the voter model.
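A minimal numerical sketch of the mean-field flow above, assuming the reconstructed form η̇_σ = Σ_{σ'} η_σ η_{σ'}(η_σ ρ_{σ'→σ} − η_{σ'} ρ_{σ→σ'}); with symmetric convincing weights the initial majority opinion should reach consensus:

```python
import numpy as np

def sznajd_mean_field(eta0, rho, dt=0.01, steps=5000):
    """Forward-Euler integration of the generalized Sznajd mean-field flow."""
    eta = np.array(eta0, dtype=float)
    M = len(eta)
    for _ in range(steps):
        d = np.zeros(M)
        for s in range(M):
            for sp in range(M):
                # eta_s * eta_s' * (eta_s * rho[s'->s] - eta_s' * rho[s->s'])
                d[s] += eta[s] * eta[sp] * (eta[s] * rho[sp, s]
                                            - eta[sp] * rho[s, sp])
        eta += dt * d
    return eta

# Symmetric weights: a toy 3-opinion run starting from a 50/30/20 split.
M = 3
rho = np.ones((M, M))
eta = sznajd_mean_field([0.5, 0.3, 0.2], rho)
```

The flow conserves Σ_σ η_σ, so the state stays on the simplex; here the trajectory converges to the consensus attractor of the majority opinion.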
Pérez-Del-Olmo, A; Montero, F E; Fernández, M; Barrett, J; Raga, J A; Kostadinova, A
2010-10-01
We address the effect of spatial scale and temporal variation on model generality when forming predictive models for fish assignment using a new data mining approach, Random Forests (RF), to variable biological markers (parasite community data). Models were implemented for a fish host-parasite system sampled along the Mediterranean and Atlantic coasts of Spain and were validated using independent datasets. We considered 2 basic classification problems in evaluating the importance of variations in parasite infracommunities for assignment of individual fish to their populations of origin: multiclass (2-5 population models, using 2 seasonal replicates from each of the populations) and 2-class task (using 4 seasonal replicates from 1 Atlantic and 1 Mediterranean population each). The main results are that (i) RF are well suited for multiclass population assignment using parasite communities in non-migratory fish; (ii) RF provide an efficient means for model cross-validation on the baseline data and this allows sample size limitations in parasite tag studies to be tackled effectively; (iii) the performance of RF is dependent on the complexity and spatial extent/configuration of the problem; and (iv) the development of predictive models is strongly influenced by seasonal change and this stresses the importance of both temporal replication and model validation in parasite tagging studies.
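The RF assignment setup can be sketched roughly as follows, using scikit-learn on invented parasite-abundance data (the real study's data, taxa and populations differ); out-of-bag scoring and cross-validation mirror the model-validation emphasis of the abstract:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical example: assign fish to 3 populations of origin from
# parasite infracommunity data (abundances of 8 parasite taxa per fish).
rng = np.random.default_rng(0)
n_per_pop, n_taxa = 60, 8
X, y = [], []
for pop in range(3):
    mean = rng.uniform(0, 5, n_taxa)                 # population-specific profile
    X.append(rng.poisson(mean, (n_per_pop, n_taxa))) # counts per fish
    y += [pop] * n_per_pop
X, y = np.vstack(X), np.array(y)

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)
scores = cross_val_score(rf, X, y, cv=5)             # multiclass assignment accuracy
```

The out-of-bag score gives a cheap internal estimate of assignment accuracy, which is one way the sample-size limitations mentioned above can be handled.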
Room for improvement: tackling high-cost patients for high potential return.
2004-06-01
Assuming that most DM programs do a decent job of managing the typical chronically ill patient, a California company decided to take on the atypical patient with a program that seeks out highly complex, high-cost patients and uses some non-traditional interventions to get these patients stabilized. In fact, pilot study results suggest developers may be on the right track with their "SWAT team" approach to care.
Reconfigurable Computing for Computational Science: A New Focus in High Performance Computing
2006-11-01
in the past decade. Researchers are regularly employing the power of large computing systems and parallel processing to tackle larger and more...complex problems in all of the physical sciences. For the past decade or so, most of this growth in computing power has been “free” with increased...the scientific computing community as a means to continued growth in computing capability. This paper offers a glimpse of the hardware and
Bayesian Hierarchical Modeling for Big Data Fusion in Soil Hydrology
NASA Astrophysics Data System (ADS)
Mohanty, B.; Kathuria, D.; Katzfuss, M.
2016-12-01
Soil moisture datasets from remote sensing (RS) platforms (such as SMOS and SMAP) and reanalysis products from land surface models are typically available on a coarse spatial granularity of several square km. Ground based sensors on the other hand provide observations on a finer spatial scale (meter scale or less) but are sparsely available. Soil moisture is affected by high variability due to complex interactions between geologic, topographic, vegetation and atmospheric variables. Hydrologic processes usually occur at a scale of 1 km or less and therefore spatially ubiquitous and temporally periodic soil moisture products at this scale are required to aid local decision makers in agriculture, weather prediction and reservoir operations. Past literature has largely focused on downscaling RS soil moisture for a small extent of a field or a watershed and hence the applicability of such products has been limited. The present study employs a spatial Bayesian Hierarchical Model (BHM) to derive soil moisture products at a spatial scale of 1 km for the state of Oklahoma by fusing point scale Mesonet data and coarse scale RS data for soil moisture and its auxiliary covariates such as precipitation, topography, soil texture and vegetation. It is seen that the BHM model handles change of support problems easily while performing accurate uncertainty quantification arising from measurement errors and imperfect retrieval algorithms. The computational challenge arising due to the large number of measurements is tackled by utilizing basis function approaches and likelihood approximations. The BHM model can be considered as a complex Bayesian extension of traditional geostatistical prediction methods (such as Kriging) for large datasets in the presence of uncertainties.
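The geostatistical flavor of this fusion (not the paper's actual BHM, which adds hierarchical priors, covariates and basis-function approximations) can be illustrated with a toy 1-D simple-kriging correction of a coarse prior by sparse point sensors; all numbers below are invented:

```python
import numpy as np

def simple_kriging(x_obs, y_obs, x_new, prior_mean,
                   length=0.3, sill=1.0, nugget=0.05):
    """Toy 1-D simple kriging: point observations correct a coarse prior mean.
    Exponential covariance; all hyper-parameters are illustrative."""
    def cov(a, b):
        return sill * np.exp(-np.abs(a[:, None] - b[None, :]) / length)
    K = cov(x_obs, x_obs) + nugget * np.eye(len(x_obs))   # obs covariance
    weights = np.linalg.solve(K, y_obs - prior_mean(x_obs))
    return prior_mean(x_new) + cov(x_new, x_obs) @ weights

# Coarse model says 0.25 m^3/m^3 everywhere; three point sensors disagree.
x_obs = np.array([0.2, 0.5, 0.8])
y_obs = np.array([0.30, 0.20, 0.35])
prior = lambda x: np.full_like(x, 0.25)
pred = simple_kriging(x_obs, y_obs, np.array([0.2, 5.0]), prior)
```

Near a sensor the prediction moves toward the observation; far from all sensors it falls back to the coarse prior, which is the qualitative behavior of the point/coarse fusion described above.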
NASA Astrophysics Data System (ADS)
Chang, Ni-Bin; Weng, Yu-Chi
2013-03-01
Short-term predictions of potential impacts from accidental releases of various radionuclides at nuclear power plants are acutely needed, especially after the Fukushima accident in Japan. An integrated modeling system that provides expert services to assess the consequences of accidental or intentional releases of radioactive materials to the atmosphere has received wide attention. These scenarios can be initiated either by accident due to human, software, or mechanical failures, or by intentional acts such as sabotage and radiological dispersal devices. Stringent action might be required just minutes after the occurrence of an accidental or intentional release. Previous studies seldom consider the suitability of air pollutant dispersion models, or the connectivity between source term, dispersion, and exposure assessment models, in the holistic context needed for the basic functions of emergency preparedness and response systems. As a result, the Gaussian plume and puff models, which are only suitable for describing neutral air pollutants over flat terrain under limited meteorological situations, are frequently used to predict the impact of accidental releases from industrial sources. In situations with complex terrain or special meteorological conditions, the resulting emergency response actions might be questionable and even intractable for decision makers responsible for maintaining public health and environmental quality. This study is a preliminary effort to integrate the source term, dispersion, and exposure assessment models into a Spatial Decision Support System (SDSS) to tackle the complex issues of short-term emergency response planning and risk assessment at nuclear power plants. Through a series of model screening procedures, we found that the diagnostic (objective) wind field model, with the aid of sufficient on-site meteorological monitoring data, was the most applicable model to promptly address the trend of local wind field patterns.
However, most of the hazardous materials being released into the environment from nuclear power plants are not neutral pollutants, so the particle and multi-segment puff models can be regarded as the most suitable models to incorporate into the output of the diagnostic wind field model in a modern emergency preparedness and response system. The proposed SDSS illustrates the state-of-the-art system design based on the situation of complex terrain in South Taiwan. This system design of SDSS with 3-dimensional animation capability using a tailored source term model in connection with ArcView® Geographical Information System map layers and remote sensing images is useful for meeting the design goal of nuclear power plants located in complex terrain.
NASA Astrophysics Data System (ADS)
Mohamed, Raihani; Perumal, Thinagaran; Sulaiman, Md Nasir; Mustapha, Norwati; Zainudin, M. N. Shah
2017-10-01
Owing to human-centric concerns and the need for non-obtrusiveness, ambient sensor technology has been widely selected, accepted and embedded in smart environments. Everyday human activities are gradually becoming more complex, which complicates activity inference when multiple residents share the same smart environment. Current solutions focus on separate models for residents, activities and interactions. Some studies use data association and auxiliary graphical nodes to model human tracking information in the environment, and some produce a separate framework to incorporate interaction features. Thus, recognizing activities and identifying which resident performs them at the same time are vital for smart home development and future applications. This paper addresses the above issue with a simple, efficient method based on the multi-label classification framework. This approach eliminates time-consuming pre-processing tasks and simplifies the pipeline compared with previous approaches. Application to multi-resident, multi-label learning in smart homes shows that Label Combination (LC) with a Decision Tree (DT) as the base classifier can tackle the above problems.
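A minimal sketch of the Label Combination (label powerset) idea with a Decision Tree base classifier, on invented two-resident sensor data (not the paper's dataset):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class LabelCombination:
    """Label Combination (label powerset): encode each multi-label row as one
    combined class, train a single base classifier, decode predictions back."""
    def __init__(self, base=None):
        self.base = base or DecisionTreeClassifier(random_state=0)

    def fit(self, X, Y):
        keys = [''.join(map(str, row)) for row in Y]   # e.g. [1, 0] -> "10"
        self.classes_ = sorted(set(keys))
        self.base.fit(X, [self.classes_.index(k) for k in keys])
        return self

    def predict(self, X):
        return np.array([[int(c) for c in self.classes_[i]]
                         for i in self.base.predict(X)])

# Invented two-resident data: features are (kitchen sensor, bedroom sensor);
# each label vector marks which residents are active.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]] * 5)
Y = np.array([[1, 0], [1, 0], [0, 1], [1, 1]] * 5)
model = LabelCombination().fit(X, Y)
```

Treating each observed label combination as a single class lets one off-the-shelf classifier capture label correlations (e.g. both residents active together) without the separate interaction models the abstract criticizes.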
Momentum and Kinetic Energy Before the Tackle in Rugby Union
Hendricks, Sharief; Karpul, David; Lambert, Mike
2014-01-01
Understanding the physical demands of a tackle in match situations is important for safe and effective training, developing equipment, and research. Physical components such as momentum and kinetic energy, and their relationship to tackle outcome, are not known. The aim of this study was to compare momenta between ball-carrier and tackler, levels of play (elite, university and junior) and positions (forwards vs. backs), and to describe the relationship between ball-carrier and tackler mass, velocity and momentum and the tackle outcome, and also to report the ball-carrier and tackler kinetic energy before contact and the estimated magnitude of impact (energy distributed between ball-carrier and tackler upon contact). Velocity over the 0.5 seconds before contact was determined using a 2-dimensional scaled version of the field generated by a computer algorithm. Body masses of players were obtained from their player profiles. Momentum and kinetic energy were subsequently calculated for 60 tackle events. Ball-carriers were heavier than the tacklers (ball-carrier 100 ± 14 kg vs. tackler 93 ± 11 kg, d = 0.52, p = 0.0041, n = 60). Ball-carriers who were forwards had significantly higher momentum than backs (forwards 563 ± 226 kg·m·s⁻¹, n = 31 vs. backs 438 ± 135 kg·m·s⁻¹, d = 0.63, p = 0.0012, n = 29). Tacklers dominated 57% of tackles and ball-carriers dominated 43% of tackles. Despite the ball-carrier having a mass advantage before contact more frequently than the tackler, momentum advantage and tackle dominance between the ball-carrier and tackler were proportionally similar. These findings may reflect a characteristic of the modern game of rugby, where heavier players (particularly forwards) are tactically predetermined to carry the ball in contact. Key Points: First study to quantify momentum, kinetic energy, and magnitude of impact in rugby tackles across different levels in matches without a device attached to a player. Physical components alone, of either ball-carrier or tackler, are not good predictors of tackle dominance. The range of magnitudes of impact of injury-free tackles observed in this study provides evidence for the physical tolerance of players during the tackle. PMID:25177182
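The quantities above follow directly from mass and pre-contact velocity; for example, the reported mean forward ball-carrier momentum of about 563 kg·m·s⁻¹ at 100 kg implies a velocity of roughly 5.6 m/s (a back-calculated, illustrative figure, not a value from the paper):

```python
def momentum(mass_kg, velocity_ms):
    """Linear momentum p = m * v, in kg·m/s."""
    return mass_kg * velocity_ms

def kinetic_energy(mass_kg, velocity_ms):
    """Kinetic energy E = 0.5 * m * v^2, in joules."""
    return 0.5 * mass_kg * velocity_ms ** 2

# Back-calculated illustration: 100 kg forward with 563 kg·m/s momentum
# implies v ≈ 5.63 m/s and a pre-contact kinetic energy near 1.6 kJ.
p = momentum(100, 5.63)
e = kinetic_energy(100, 5.63)
```

Because kinetic energy scales with v² while momentum scales with v, a lighter but faster tackler can match a heavier ball-carrier's momentum yet bring considerably more energy into the collision.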
Synthetic biology approaches to biological containment: pre-emptively tackling potential risks
Torres, Leticia; Krüger, Antje; Csibra, Eszter; Gianni, Edoardo; Pinheiro, Vitor B
2016-01-01
Biocontainment comprises any strategy applied to ensure that harmful organisms are confined to controlled laboratory conditions and not allowed to escape into the environment. Genetically engineered microorganisms (GEMs), regardless of the nature of the modification and how it was established, have potential human or ecological impact if accidentally leaked or voluntarily released into a natural setting. Although all evidence to date is that GEMs are unable to compete in the environment, the power of synthetic biology to rewrite life requires a pre-emptive strategy to tackle possible unknown risks. Physical containment barriers have proven effective but a number of strategies have been developed to further strengthen biocontainment. Research on complex genetic circuits, lethal genes, alternative nucleic acids, genome recoding and synthetic auxotrophies aim to design more effective routes towards biocontainment. Here, we describe recent advances in synthetic biology that contribute to the ongoing efforts to develop new and improved genetic, semantic, metabolic and mechanistic plans for the containment of GEMs. PMID:27903826
Does player time-in-game affect tackle technique in elite level rugby union?
Tierney, Gregory J; Denvir, Karl; Farrell, Garreth; Simms, Ciaran K
2018-02-01
It has been hypothesised that fatigue may be a major factor in tackle-related injury risk in rugby union, and hence that more injuries occur in the later stages of a game. The aim of this study is to identify changes in ball-carrier or tackler proficiency characteristics, using elite-level match video data, as player time-in-game increases. Qualitative observational cohort study. Three 2014/15 European Rugby Champions Cup games were selected for ball-carrier and tackler proficiency analysis. Analysis was only conducted on players who started and remained on the field for the entire game. A separate analysis was conducted on 10 randomly selected 2014/15 European Rugby Champions Cup/Pro 12 games to assess the time distribution of tackles throughout a game. A Chi-square test and a one-way ANOVA with post-hoc testing were conducted to identify significant differences (p<0.05) in proficiency characteristics and tackle counts between quarters of the game, respectively. Player time-in-game did not affect tackle proficiency for either the ball carrier or the tackler. Any results that showed statistical significance did not indicate a trend of deteriorating proficiency with increased player time-in-game. The time-distribution analysis indicated that more tackles occur in the final quarter of the game than in the first (p=0.04) and second (p<0.01). It appears that player time-in-game does not affect tackler or ball-carrier tackle technique proficiency at the elite level. More tackles occurring in the final quarter of a game provides an alternative explanation for more tackle-related injuries occurring at this stage. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Chieu, Vu Minh; Luengo, Vanda; Vadcard, Lucile; Tonetti, Jerome
2010-01-01
Cognitive approaches have been used for student modeling in intelligent tutoring systems (ITSs). Many of those systems have tackled fundamental subjects such as mathematics, physics, and computer programming. The change of the student's cognitive behavior over time, however, has not been considered and modeled systematically. Furthermore, the…
USDA-ARS?s Scientific Manuscript database
The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
Numerical comparisons of ground motion predictions with kinematic rupture modeling
NASA Astrophysics Data System (ADS)
Yuan, Y. O.; Zurek, B.; Liu, F.; deMartin, B.; Lacasse, M. D.
2017-12-01
Recent advances in large-scale wave simulators allow for the computation of seismograms at unprecedented levels of detail and for areas sufficiently large to be relevant to small regional studies. In some instances, detailed information of the mechanical properties of the subsurface has been obtained from seismic exploration surveys, well data, and core analysis. Using kinematic rupture modeling, this information can be used with a wave propagation simulator to predict the ground motion that would result from an assumed fault rupture. The purpose of this work is to explore the limits of wave propagation simulators for modeling ground motion in different settings, and in particular, to explore the numerical accuracy of different methods in the presence of features that are challenging to simulate such as topography, low-velocity surface layers, and shallow sources. In the main part of this work, we use a variety of synthetic three-dimensional models and compare the relative costs and benefits of different numerical discretization methods in computing the seismograms of realistic-size models. The finite-difference method, the discontinuous-Galerkin method, and the spectral-element method are compared for a range of synthetic models having different levels of complexity such as topography, large subsurface features, low-velocity surface layers, and the location and characteristics of fault ruptures represented as an array of seismic sources. While some previous studies have already demonstrated that unstructured-mesh methods can sometimes tackle complex problems (Moczo et al.), we investigate the trade-off between unstructured-mesh methods and regular-grid methods for a broad range of models and source configurations. Finally, for comparison, our direct simulation results are briefly contrasted with those predicted by a few phenomenological ground-motion prediction equations, and a workflow for accurately predicting ground motion is proposed.
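As a toy example of the regular-grid methods compared above, a 1-D constant-velocity acoustic finite-difference scheme (2nd order in space and time, Ricker source) can be sketched as follows; the grid, velocity and source parameters are illustrative and far simpler than the 3-D elastic models in the study:

```python
import numpy as np

def fd_wave_1d(nx=200, nt=500, dx=10.0, dt=1e-3, c=2000.0, src=100, f0=25.0):
    """1-D constant-velocity acoustic wave equation, 2nd-order centered
    finite differences in space and time, fixed (zero) boundaries."""
    assert c * dt / dx < 1.0                     # CFL stability condition
    r2 = (c * dt / dx) ** 2
    u_prev, u = np.zeros(nx), np.zeros(nx)
    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]   # discrete Laplacian
        u_next = 2.0 * u - u_prev + r2 * lap
        a = (np.pi * f0 * (it * dt - 0.04)) ** 2
        u_next[src] += (1.0 - 2.0 * a) * np.exp(-a)  # Ricker wavelet source
        u_prev, u = u, u_next
    return u

wavefield = fd_wave_1d()
```

Unstructured-mesh methods such as discontinuous-Galerkin or spectral-element schemes replace this fixed-spacing Laplacian with element-local operators, which is what lets them follow topography and material interfaces at the cost of mesh generation.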
Managing routine food choices in UK families: the role of convenience consumption.
Carrigan, Marylyn; Szmigin, Isabelle; Leek, Sheena
2006-11-01
The paper explores the meaning of convenience food for UK mothers, investigating the relationship between mothers and their families' food. The study examines the role of convenience food within the food strategies of contemporary UK families, and aims to elicit consumption meanings in the broader social context of family relationships with food, their rituals, routines and conventions. The findings reveal convenience has multiple meanings for UK women, and that convenience food has been incorporated into reinterpreted versions of homemade and "proper" meals. A hierarchy of acceptable convenience food is presented by the mothers, who tackle complex and conflicting family routines by introducing convenience solutions. Rules of eating have evolved, yet remain essentially controlled by the mother in terms of nutrition. While the traditional model of "proper" food remains aspirational, contemporary family lifestyles require that convenience food become part of the equation.
Bassett, Danielle S; Sporns, Olaf
2017-01-01
Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844
2010 Diffraction Methods in Structural Biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Ana Gonzalez
2011-03-10
Advances in basic methodologies have played a major role in the dramatic progress in macromolecular crystallography over the past decade, both in terms of overall productivity and in the increasing complexity of the systems being successfully tackled. The 2010 Gordon Research Conference on Diffraction Methods in Structural Biology will, as in the past, focus on the most recent developments in methodology, covering all aspects of the process from crystallization to model building and refinement, complemented by examples of structural highlights and complementary methods. Extensive discussion will be encouraged and it is hoped that all attendees will participate by giving oral or poster presentations, the latter using the excellent poster display area available at Bates College. The relatively small size and informal atmosphere of the meeting provides an excellent opportunity for all participants, especially younger scientists, to meet and exchange ideas with leading methods developers.
Crowd motion segmentation and behavior recognition fusing streak flow and collectiveness
NASA Astrophysics Data System (ADS)
Gao, Mingliang; Jiang, Jun; Shen, Jin; Zou, Guofeng; Fu, Guixia
2018-04-01
Crowd motion segmentation and crowd behavior recognition are two active issues in computer vision. A number of methods have been proposed to tackle these two problems. Among these methods, flow dynamics is utilized to model the crowd motion, with little consideration of collective properties. Moreover, the traditional crowd behavior recognition methods treat the local feature and the dynamic feature separately and overlook the interconnection of topological and dynamical heterogeneity in complex crowd processes. A crowd motion segmentation method and a crowd behavior recognition method are proposed based on streak flow and crowd collectiveness. The streak flow is adopted to reveal the dynamical property of crowd motion, and the collectiveness is incorporated to reveal the structural property. Experimental results show that the proposed methods improve the crowd motion segmentation accuracy and the crowd recognition rates compared with the state-of-the-art methods.
TimeBench: a data model and software library for visual analytics of time-oriented data.
Rind, Alexander; Lammarsch, Tim; Aigner, Wolfgang; Alsallakh, Bilal; Miksch, Silvia
2013-12-01
Time-oriented data play an essential role in many Visual Analytics scenarios such as extracting medical insights from collections of electronic health records or identifying emerging problems and vulnerabilities in network traffic. However, many software libraries for Visual Analytics treat time as a flat numerical data type and insufficiently tackle the complexity of the time domain such as calendar granularities and intervals. Therefore, developers of advanced Visual Analytics designs need to implement temporal foundations in their application code over and over again. We present TimeBench, a software library that provides foundational data structures and algorithms for time-oriented data in Visual Analytics. Its expressiveness and developer accessibility have been evaluated through application examples demonstrating a variety of challenges with time-oriented data and long-term developer studies conducted in the scope of research and student projects.
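The calendar granularities and intervals that the abstract says flat numerical time types fail to capture can be made concrete with a small sketch. TimeBench itself is a Java library; the Python below is purely illustrative, and the type and method names are hypothetical, not TimeBench's actual API:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical granularity-aware interval type, loosely in the spirit of
# the data structures TimeBench provides (names are illustrative only).
@dataclass(frozen=True)
class Interval:
    start: date          # inclusive
    end: date            # inclusive
    granularity: str     # e.g. "day", "month" -- the calendar unit of the data

    def contains(self, other: "Interval") -> bool:
        return self.start <= other.start and other.end <= self.end

    def overlaps(self, other: "Interval") -> bool:
        return self.start <= other.end and other.start <= self.end

q1 = Interval(date(2013, 1, 1), date(2013, 3, 31), "month")
jan = Interval(date(2013, 1, 1), date(2013, 1, 31), "day")
```

Modeling time as intervals with an explicit granularity, rather than as a single number, is what lets a toolkit answer questions like "does this quarter contain this month" directly.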
Nidumolu, Ram; Ellison, Jib; Whalen, John; Billman, Erin
2014-04-01
Addressing global sustainability challenges--including climate change, resource depletion, and ecosystem loss--is beyond the individual capabilities of even the largest companies. To tackle these threats, and unleash new value, companies and other stakeholders must collaborate in new ways that treat fragile and complex ecosystems as a whole. In this article, the authors draw on cases including the Latin American Water Funds Partnership, the Sustainable Apparel Coalition (led by Nike, Patagonia, and Walmart), and Action to Accelerate Recycling (a partnership between Alcoa, consumer packaged goods companies, and local governments, among others) to describe four new collaboration models that create shared value and address environmental protection across the value stream. Optimal collaborations focus on improving either business processes or outcomes. They start with a small group of key organizations, bring in project management expertise, link self-interest to shared interest, encourage productive competition, create quick wins, and, above all, build and maintain trust.
TSCA Section 21 Petition Requesting EPA to Regulate Lead in Fishing Tackle
This petition requests EPA to promulgate regulations under section 6 of TSCA to protect the environment from fishing tackle containing lead including fishing weights, sinkers, lures, jigs, and/or other tackle.
CellML and associated tools and techniques.
Garny, Alan; Nickerson, David P; Cooper, Jonathan; Weber dos Santos, Rodrigo; Miller, Andrew K; McKeever, Steve; Nielsen, Poul M F; Hunter, Peter J
2008-09-13
We have, in the last few years, witnessed the development and availability of an ever increasing number of computer models that describe complex biological structures and processes. The multi-scale and multi-physics nature of these models makes their development particularly challenging, not only from a biological or biophysical viewpoint but also from a mathematical and computational perspective. In addition, the issue of sharing and reusing such models has proved to be particularly problematic, with the published models often lacking information that is required to accurately reproduce the published results. The International Union of Physiological Sciences Physiome Project was launched in 1997 with the aim of tackling the aforementioned issues by providing a framework for the modelling of the human body. As part of this initiative, the specifications of the CellML mark-up language were released in 2001. Now, more than 7 years later, the time has come to assess the situation, in particular with regard to the tools and techniques that are now available to the modelling community. Thus, after introducing CellML, we review and discuss existing editors, validators, online repository, code generators and simulation environments, as well as the CellML Application Program Interface. We also address possible future directions including the need for additional mark-up languages.
Anderson, Ross P; Jimenez, Geronimo; Bae, Jin Yung; Silver, Diana; Macinko, James; Porfiri, Maurizio
2016-01-01
Detecting and explaining the relationships among interacting components has long been a focal point of dynamical systems research. In this paper, we extend these types of data-driven analyses to the realm of public policy, whereby individual legislative entities interact to produce changes in their legal and political environments. We focus on the U.S. public health policy landscape, whose complexity determines our capacity as a society to effectively tackle pressing health issues. It has long been thought that some U.S. states innovate and enact new policies, while others mimic successful or competing states. However, the extent to which states learn from others, and the state characteristics that lead two states to influence one another, are not fully understood. Here, we propose a model-free, information-theoretical method to measure the existence and direction of influence of one state's policy or legal activity on others. Specifically, we tailor a popular notion of causality to handle the slow time-scale of policy adoption dynamics and unravel relationships among states from their recent law enactment histories. The method is validated using surrogate data generated from a new stochastic model of policy activity. Through the analysis of real data in alcohol, driving safety, and impaired driving policy, we provide evidence for the role of geography, political ideology, risk factors, and demographic and economic indicators on a state's tendency to learn from others when shaping its approach to public health regulation. Our method offers a new model-free approach to uncover interactions and establish cause-and-effect in slowly-evolving complex dynamical systems.
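The model-free, information-theoretic influence measure the abstract describes can be illustrated with a plug-in transfer-entropy estimate on binary time series. This is a generic textbook sketch, not the authors' tailored causality measure for slow policy-adoption dynamics:

```python
import math
import random
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in estimate of T(src -> dst) for binary series, history length 1:
    sum over (d_t, d_{t-1}, s_{t-1}) of
    p(d_t, d_{t-1}, s_{t-1}) * log2[ p(d_t | d_{t-1}, s_{t-1}) / p(d_t | d_{t-1}) ]."""
    triples = Counter(zip(dst[1:], dst[:-1], src[:-1]))
    pairs_ds = Counter(zip(dst[:-1], src[:-1]))
    pairs_dd = Counter(zip(dst[1:], dst[:-1]))
    singles = Counter(dst[:-1])
    n = len(dst) - 1
    te = 0.0
    for (dt, dp, sp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_ds[(dp, sp)]       # p(d_t | d_{t-1}, s_{t-1})
        p_cond_dst = pairs_dd[(dt, dp)] / singles[dp]  # p(d_t | d_{t-1})
        te += p_joint * math.log2(p_cond_full / p_cond_dst)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]              # y copies x with one step of lag
te_fwd = transfer_entropy(x, y)   # strong influence of x on y
te_rev = transfer_entropy(y, x)   # essentially none in reverse
```

Because y is a lagged copy of x, the estimate is large in the x-to-y direction and near zero in reverse, recovering the direction of influence from the data alone.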
Orbital Architectures of Dynamically Complex Exoplanet Systems
NASA Astrophysics Data System (ADS)
Nelson, Benjamin E.
2015-01-01
The most powerful constraints on planet formation will come from characterizing the dynamical state of complex multi-planet systems. Unfortunately, with that complexity comes a number of factors that make analyzing these systems a computationally challenging endeavor: the sheer number of model parameters, a wonky-shaped posterior distribution, and hundreds to thousands of time series measurements. We develop a differential evolution Markov chain Monte Carlo (RUN DMC) to tackle these difficult aspects of data analysis. We apply RUN DMC to two classic multi-planet systems from radial velocity surveys, 55 Cancri and GJ 876. For 55 Cancri, we find the inner-most planet "e" must be coplanar to within 40 degrees of the outer planets, otherwise Kozai-like perturbations will cause the planet's orbit to cross the stellar surface. We find the orbits of planets "b" and "c" are apsidally aligned and librating with low to median amplitude (50 +6/-10 degrees), but they are not orbiting in a mean-motion resonance. For GJ 876, we can meaningfully constrain the three-dimensional orbital architecture of all the planets based on the radial velocity data alone. By demanding orbital stability, we find the resonant planets have low mutual inclinations (Φ) so they must be roughly coplanar (Φcb = 1.41 +0.62/-0.57 degrees and Φbe = 3.87 +1.99/-1.86 degrees). The three-dimensional Laplace argument librates with an amplitude of 50.5 +7.9/-10.0 degrees, indicating significant past disk migration and ensuring long-term stability. These empirically derived models will provide new challenges for planet formation models and motivate the need for more sophisticated algorithms to analyze exoplanet data.
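The core differential evolution Markov chain move underlying samplers such as RUN DMC can be sketched on a toy one-dimensional target. RUN DMC itself adds many refinements (adaptive scales, parallel chains over high-dimensional orbital parameters) that this illustration omits:

```python
import math
import random

def de_mc(n_chains=8, n_iter=3000, gamma=0.7, eps=1e-4, seed=1):
    """Toy differential-evolution MCMC targeting a 1-D standard normal.
    Proposal for chain i: x_i + gamma * (x_a - x_b) + small jitter, where
    a and b are two other randomly chosen chains. This is the basic DE-MC
    move; the difference vector automatically adapts the proposal scale
    to the shape of the posterior."""
    rng = random.Random(seed)
    log_p = lambda z: -0.5 * z * z            # unnormalized N(0, 1)
    chains = [rng.uniform(-3, 3) for _ in range(n_chains)]
    samples = []
    for _ in range(n_iter):
        for i in range(n_chains):
            a, b = rng.sample([j for j in range(n_chains) if j != i], 2)
            prop = chains[i] + gamma * (chains[a] - chains[b]) + rng.gauss(0, eps)
            # Metropolis accept/reject step
            if math.log(rng.random()) < log_p(prop) - log_p(chains[i]):
                chains[i] = prop
            samples.append(chains[i])
    return samples

s = de_mc()
mean = sum(s) / len(s)
var = sum((z - mean) ** 2 for z in s) / len(s)
```

With enough iterations the pooled samples reproduce the target's mean and variance; for a real radial velocity posterior the same move operates on full parameter vectors rather than scalars.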
Reach, Gérard
2016-01-01
According to the concept developed by Thomas Kuhn, a scientific revolution occurs when scientists encounter a crisis due to the observation of anomalies that cannot be explained by the generally accepted paradigm within which scientific progress has thereto been made: a scientific revolution can therefore be described as a change in paradigm aimed at solving a crisis. Described herein is an application of this concept to the medical realm, starting from the reflection that during the past decades, the medical community has encountered two anomalies that, by their frequency and consequences, represent a crisis in the system, as they deeply jeopardize the efficiency of care: nonadherence of patients who do not follow the prescriptions of their doctors, and clinical inertia of doctors who do not comply with good practice guidelines. It is proposed that these phenomena are caused by a contrast between, on the one hand, the complex thought of patients and doctors that sometimes escapes rationalization, and on the other hand, the simplification imposed by the current paradigm of medicine dominated by the technical rationality of evidence-based medicine. It is suggested therefore that this crisis must provoke a change in paradigm, inventing a new model of care defined by an ability to take again into account, on an individual basis, the complex thought of patients and doctors. If this overall analysis is correct, such a person-centered care model should represent a solution to the two problems of patients’ nonadherence and doctors’ clinical inertia, as it tackles their cause. These considerations may have important implications for the teaching and the practice of medicine. PMID:27103790
Dynamic optimization of chemical processes using ant colony framework.
Rajesh, J; Gupta, K; Kusumakar, H S; Jayaraman, V K; Kulkarni, B D
2001-11-01
The ant colony framework is illustrated by considering the dynamic optimization of six important benchmark examples. This new computational tool is simple to implement and can tackle problems with state as well as terminal constraints in a straightforward fashion. It requires fewer grid points to reach the global optimum at relatively low computational effort. The examples analyzed here, with varying degrees of complexity, illustrate its potential for solving a large class of process optimization problems in chemical engineering.
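A bare-bones sketch of ant colony search over a discretized control profile conveys the flavor of such a framework. The paper's actual algorithm and benchmark problems are more elaborate; everything below (function names, the toy objective) is illustrative:

```python
import random

def aco_grid(objective, grid, n_stages, n_ants=20, n_iter=60, rho=0.1, seed=0):
    """Toy ant-colony search over a discretized control profile: at each
    stage an ant picks one grid value with probability proportional to the
    pheromone on that (stage, value) pair; after each iteration the
    pheromone evaporates and the best profile found so far is reinforced."""
    rng = random.Random(seed)
    tau = [[1.0] * len(grid) for _ in range(n_stages)]   # pheromone trails
    best_u, best_f = None, float("-inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            idx = []
            for t in range(n_stages):                    # roulette-wheel pick
                w = tau[t]
                r = rng.random() * sum(w)
                acc = 0.0
                for k, wk in enumerate(w):
                    acc += wk
                    if r <= acc:
                        idx.append(k)
                        break
            u = [grid[k] for k in idx]
            f = objective(u)
            if f > best_f:
                best_f, best_u, best_idx = f, u, idx
        for t in range(n_stages):                        # evaporate + reinforce
            tau[t] = [(1 - rho) * wk for wk in tau[t]]
            tau[t][best_idx[t]] += 1.0
    return best_u, best_f

grid = [0.0, 0.25, 0.5, 0.75, 1.0]
obj = lambda u: -sum((u[t] - 0.5) ** 2 for t in range(4))  # optimum: all 0.5
u, f = aco_grid(obj, grid, n_stages=4)
```

The same loop structure extends to dynamic optimization by letting each stage correspond to a time grid point of the control and evaluating the objective by integrating the process model.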
Neural Networks Based Approach to Enhance Space Hardware Reliability
NASA Technical Reports Server (NTRS)
Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.
2011-01-01
This paper demonstrates the use of Neural Networks as a device modeling tool to increase the reliability analysis accuracy of circuits targeted for space applications. The paper tackles a number of case studies of relevance to the design of Flight hardware. The results show that the proposed technique generates more accurate models than the ones regularly used to model circuits.
Modeling and Intervening across Time in Scientific Inquiry Exploratory Learning Environment
ERIC Educational Resources Information Center
Ting, Choo-Yee; Phon-Amnuaisuk, Somnuk; Chong, Yen-Kuan
2008-01-01
This article aims at discussing how Dynamic Decision Network (DDN) can be employed to tackle the challenges in modeling temporally variable scientific inquiry skills and provision of adaptive pedagogical interventions in INQPRO, a scientific inquiry exploratory learning environment for learning O'level Physics. We begin with an overview of INQPRO…
Training Emotional and Social Competences in Higher Education: The Seminar Methodology
ERIC Educational Resources Information Center
Oberst, Ursula; Gallifa, Josep; Farriols, Nuria; Vilaregut, Anna
2009-01-01
This article discusses the importance of emotional and social competences in higher education and presents a training model. In 1991, Ramon Llull University of Barcelona (Spain) created the Seminar methodology to tackle these challenges. A general model derived from the Emotional Intelligence concept and the general principles of this methodology…
Decision makers often need assistance in understanding dynamic interactions and linkages among economic, environmental and social systems in coastal watersheds. They also need scientific input to better evaluate potential costs and benefits of alternative policy interventions. Th...
Depth-color fusion strategy for 3-D scene modeling with Kinect.
Camplani, Massimo; Mantecon, Tomas; Salgado, Luis
2013-12-01
Low-cost depth cameras, such as Microsoft Kinect, have completely changed the world of human-computer interaction through controller-free gaming applications. Depth data provided by the Kinect sensor presents several noise-related problems that have to be tackled to improve the accuracy of the depth data, thus obtaining more reliable game control platforms and broadening its applicability. In this paper, we present a depth-color fusion strategy for 3-D modeling of indoor scenes with Kinect. Accurate depth and color models of the background elements are iteratively built, and used to detect moving objects in the scene. Kinect depth data is processed with an innovative adaptive joint-bilateral filter that efficiently combines depth and color by analyzing an edge-uncertainty map and the detected foreground regions. Results show that the proposed approach efficiently tackles the main Kinect data problems: distance-dependent depth maps, spatial noise, and temporal random fluctuations are dramatically reduced; object depth boundaries are refined, and nonmeasured depth pixels are interpolated. Moreover, a robust depth and color background model and accurate moving object silhouettes are generated.
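The joint (cross) bilateral filtering idea at the heart of this fusion strategy can be reduced to one dimension for illustration: depth samples are averaged with range weights taken from the color guide, so depth edges aligned with color edges survive. The paper's actual filter is adaptive and also uses the edge-uncertainty map and foreground masks, which this sketch omits:

```python
import math

def joint_bilateral_1d(depth, guide, radius=2, sigma_s=1.5, sigma_r=10.0):
    """Cross/joint bilateral filter on a 1-D signal: each depth sample is a
    weighted mean of its neighbors, with spatial Gaussian weights and range
    weights computed from the *guide* (color) signal rather than from the
    noisy depth itself."""
    out = []
    n = len(depth)
    for i in range(n):
        acc = wsum = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)) * \
                math.exp(-((guide[i] - guide[j]) ** 2) / (2 * sigma_r ** 2))
            acc += w * depth[j]
            wsum += w
        out.append(acc / wsum)
    return out

# Noisy depth step; the color guide has a clean edge at the same position.
depth = [10, 11, 9, 10, 30, 29, 31, 30]
guide = [0, 0, 0, 0, 255, 255, 255, 255]
smoothed = joint_bilateral_1d(depth, guide)
```

Because the range weight across the guide's edge is essentially zero, the two sides are smoothed independently: noise is averaged out while the depth discontinuity is preserved.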
[Family Health Strategies to tackle violence involving adolescents].
Vieira Netto, Moysés Francisco; Deslandes, Suely Ferreira
2016-05-01
The Family Health Strategy (FHS) has an acknowledged potential for the promotion of health and the prevention of violence. This is an integrative bibliographic review with the aim of evaluating the performance of FHS professionals in tackling and preventing violence involving adolescents, covering dissertations and theses on healthcare published from 1994 to 2014. The collection of 17 dissertations and 2 doctoral theses reveals that these studies are recent. The FHS professionals acknowledge the vulnerability of adolescents to inflicting and being subjected to violence; however, the FHS proves ineffective in tackling and preventing such violence. The predominance of the medical technical care model, the deficiencies in Public Health education in professional training and the lack of institutional support are seen as the main obstacles. Many of these professionals are unaware of the forms for notification of violence. The existence of family violence and criminal groups were the aspects most mentioned in the territories. The social representation of adolescents as "problematic" and the lack of FHS actions that promote youth leadership and empowerment were also clearly detected.
Tucker, Ross; Raftery, Martin; Kemp, Simon; Brown, James; Fuller, Gordon; Hester, Ben; Cross, Matthew; Quarrie, Ken
2017-08-01
The tackle is responsible for the majority of head injuries during rugby union. In order to address head injury risk, risk factors during the tackle must first be identified. This study analysed tackle characteristics in the professional game in order to inform potential interventions. 464 tackles resulting in a head injury assessment (HIA) were analysed in detail, with tackle type, direction, speed, acceleration, nature of head contact and player body position the characteristics of interest. Propensity to cause an HIA was significantly greater for active shoulder tackles, front-on tackles, high-speed tackles and an accelerating tackler. Head contact between a tackler's head and ball carrier's head or shoulder was significantly more likely to cause an HIA than contact below the level of the shoulder (incident rate ratio (IRR) 4.25, 95% CI 3.38 to 5.35). The tackler experiences the majority (78%) of HIAs when head-to-head contact occurs. An upright tackler was 1.5 times more likely to experience an HIA than a bent-at-the-waist tackler (IRR 1.44, 95% CI 1.18 to 1.76). This study confirms that energy transfer in the tackle is a risk factor for head injury, since direction, type and speed all influence HIA propensity. The study provides evidence that body position and the height of tackles should be a focus for interventions, since lowering height and adopting a bent-at-the-waist body position is associated with reduced risk for both tacklers and ball carriers. To this end, World Rugby has implemented law change based on the present data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Validi, AbdoulAhad
2014-03-01
This study introduces a non-intrusive approach in the context of low-rank separated representation to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov Chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternative least-square regression with Tikhonov regularization using a roughening matrix computing the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation linearly depends on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued separated representation-based model, in comparison to the available scalar-valued case, leads to a significant reduction in the cost of approximation by an order of magnitude equal to the vector size. The performance of the method is studied through its application to three numerical examples including a 41-dimensional elliptic PDE and a 21-dimensional cavity flow.
Chemical Memory Reactions Induced Bursting Dynamics in Gene Expression
Tian, Tianhai
2013-01-01
Memory is a ubiquitous phenomenon in biological systems in which the present system state is not entirely determined by the current conditions but also depends on the time evolutionary path of the system. Specifically, many memorial phenomena are characterized by chemical memory reactions that may fire under particular system conditions. These conditional chemical reactions contradict the extant stochastic approaches for modeling chemical kinetics and have increasingly posed significant challenges to mathematical modeling and computer simulation. To tackle the challenge, I proposed a novel theory consisting of the memory chemical master equations and memory stochastic simulation algorithm. A stochastic model for single-gene expression was proposed to illustrate the key function of memory reactions in inducing bursting dynamics of gene expression that has been observed in experiments recently. The importance of memory reactions has been further validated by the stochastic model of the p53-MDM2 core module. Simulations showed that memory reactions are a major mechanism for realizing both sustained oscillations of p53 protein numbers in single cells and damped oscillations over a population of cells. These successful applications of the memory modeling framework suggested that this innovative theory is an effective and powerful tool to study memory processes and conditional chemical reactions in a wide range of complex biological systems. PMID:23349679
Accelerated Edge-Preserving Image Restoration Without Boundary Artifacts
Matakos, Antonios; Ramani, Sathish; Fessler, Jeffrey A.
2013-01-01
To reduce blur in noisy images, regularized image restoration methods have been proposed that use non-quadratic regularizers (like l1 regularization or total-variation) that suppress noise while preserving edges in the image. Most of these methods assume a circulant blur (periodic convolution with a blurring kernel) that can lead to wraparound artifacts along the boundaries of the image due to the implied periodicity of the circulant model. Using a non-circulant model could prevent these artifacts at the cost of increased computational complexity. In this work we propose to use a circulant blur model combined with a masking operator that prevents wraparound artifacts. The resulting model is non-circulant, so we propose an efficient algorithm using variable splitting and augmented Lagrangian (AL) strategies. Our variable splitting scheme, when combined with the AL framework and alternating minimization, leads to simple linear systems that can be solved non-iteratively using FFTs, eliminating the need for more expensive CG-type solvers. The proposed method can also efficiently tackle a variety of convex regularizers including edge-preserving (e.g., total-variation) and sparsity promoting (e.g., l1 norm) regularizers. Simulation results show fast convergence of the proposed method, along with improved image quality at the boundaries where the circulant model is inaccurate. PMID:23372080
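One common way to write the kind of variable splitting the abstract describes is sketched below; the paper's exact operators and notation may differ:

```latex
% Masked circulant restoration (generic form):
%   y = M B x + \text{noise},  with B circulant (blur) and M a masking operator.
\min_{x}\; \tfrac12\|y - MBx\|_2^2 + \lambda\,\phi(\Psi x)
\;\Longrightarrow\;
\min_{x,\,u,\,v}\; \tfrac12\|y - Mu\|_2^2 + \lambda\,\phi(v)
\quad \text{s.t. } u = Bx,\; v = \Psi x.
% Alternating minimization of the augmented Lagrangian then decouples into:
%  - an x-update solving (B^{\top}B + \nu\,\Psi^{\top}\Psi)\,x = \dots,
%    done with FFTs when B and \Psi are circulant;
%  - a u-update with (M^{\top}M + \mu I)^{-1}, diagonal for a 0/1 mask;
%  - a v-update that is a shrinkage/proximal step for \phi
%    (soft-thresholding when \phi is the l1 norm).
```

The point of the splitting is exactly what the abstract claims: every subproblem is either diagonal or FFT-diagonalizable, so no inner CG-type solver is needed.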
NASA Astrophysics Data System (ADS)
Sanchez, M. J.; Santamarina, C.; Gai, X., Sr.; Teymouri, M., Sr.
2017-12-01
Stability and behavior of Hydrate Bearing Sediments (HBS) are characterized by the metastable character of the gas hydrate structure, which strongly depends on thermo-hydro-chemo-mechanical (THCM) actions. Hydrate formation, dissociation and methane production from hydrate bearing sediments are coupled THCM processes that involve, among others, exothermic formation and endothermic dissociation of hydrate and ice phases, mixed fluid flow and large changes in fluid pressure. The analysis of available data from past field and laboratory experiments, and the optimization of future field production studies, require a formal and robust numerical framework able to capture the very complex behavior of this type of soil. A comprehensive fully coupled THCM formulation has been developed and implemented into a finite element code to tackle problems involving gas hydrate sediments. Special attention is paid to the geomechanical behavior of HBS, and particularly to their response upon hydrate dissociation under loading. The numerical framework has been validated against recent experiments conducted under controlled conditions in the laboratory that challenge the proposed approach and highlight the complex interaction among THCM processes in HBS. The performance of the models in these case studies is highly satisfactory. Finally, the numerical code is applied to analyze the behavior of gas hydrate soils under field-scale conditions, exploring different features of material behavior under possible reservoir conditions.
Shape optimization techniques for musical instrument design
NASA Astrophysics Data System (ADS)
Henrique, Luis; Antunes, Jose; Carvalho, Joao S.
2002-11-01
The design of musical instruments is still mostly based on empirical knowledge and costly experimentation. One interesting improvement is the shape optimization of resonating components, given a number of constraints (allowed parameter ranges, shape smoothness, etc.), so that vibrations occur at specified modal frequencies. Each admissible geometrical configuration generates an error between computed eigenfrequencies and the target set. Typically, error surfaces present many local minima, corresponding to suboptimal designs. This difficulty can be overcome using global optimization techniques, such as simulated annealing. However, these methods are greedy in terms of the number of function evaluations required. Thus, the computational effort can be unacceptable if complex problems, such as bell optimization, are tackled. Those issues are addressed in this paper, and a method for improving optimization procedures is proposed. Instead of using the local geometric parameters as searched variables, the system geometry is modeled in terms of truncated series of orthogonal space functions, and optimization is performed on their amplitude coefficients. Fourier series and orthogonal polynomials are typical such functions. This technique considerably reduces the number of searched variables, and has a potential for significant computational savings in complex problems. It is illustrated by optimizing the shapes of both current and uncommon marimba bars.
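The variable-reduction idea can be illustrated with a truncated cosine series parameterizing a bar's thickness profile. The parameterization and names below are illustrative assumptions, not the authors' exact formulation:

```python
import math

def profile(coeffs, x):
    """Shape parameterized by a truncated cosine series on [0, 1]:
    h(x) = c0 + sum_k c_k * cos(k * pi * x). Optimizing the handful of
    coefficients c_k, instead of one thickness per mesh node, is the
    search-space reduction described in the abstract (the paper also
    mentions orthogonal polynomials as an alternative basis)."""
    return coeffs[0] + sum(c * math.cos(k * math.pi * x)
                           for k, c in enumerate(coeffs[1:], start=1))

# 4 coefficients stand in for, say, 100 nodal thickness variables.
coeffs = [1.0, -0.3, 0.1, 0.05]
heights = [profile(coeffs, i / 99) for i in range(100)]
```

A simulated annealing loop would then perturb `coeffs` (4 numbers) rather than all 100 nodal thicknesses, with the smoothness of the basis keeping every candidate shape physically plausible.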
Sambo, Francesco; de Oca, Marco A Montes; Di Camillo, Barbara; Toffolo, Gianna; Stützle, Thomas
2012-01-01
Reverse engineering is the problem of inferring the structure of a network of interactions between biological variables from a set of observations. In this paper, we propose an optimization algorithm, called MORE, for the reverse engineering of biological networks from time series data. The model inferred by MORE is a sparse system of nonlinear differential equations, complex enough to realistically describe the dynamics of a biological system. MORE tackles separately the discrete component of the problem, the determination of the biological network topology, and the continuous component of the problem, the strength of the interactions. This approach allows us both to enforce system sparsity, by globally constraining the number of edges, and to integrate a priori information about the structure of the underlying interaction network. Experimental results on simulated and real-world networks show that the mixed discrete/continuous optimization approach of MORE significantly outperforms standard continuous optimization and that MORE is competitive with the state of the art in terms of accuracy of the inferred networks.
Implementation of a platform dedicated to the biomedical analysis terminologies management
Cormont, Sylvie; Vandenbussche, Pierre-Yves; Buemi, Antoine; Delahousse, Jean; Lepage, Eric; Charlet, Jean
2011-01-01
Background and objectives. Assistance Publique - Hôpitaux de Paris (AP-HP) is implementing a new laboratory management system (LMS) common to the 12 hospital groups. The first step in this process was to acquire a biological analysis dictionary. This dictionary is interfaced with the international nomenclature LOINC, and has been developed in collaboration with experts from all biological disciplines. In this paper we describe in three steps (modeling, data migration and integration/verification) the implementation of a platform for publishing and maintaining the AP-HP laboratory data dictionary (AnaBio). Material and Methods. Due to data complexity and volume, setting up a platform dedicated to terminology management was a key requirement. This is an enhancement tackling identified weaknesses of the previous spreadsheet tool. Our core model allows interoperability regarding data exchange standards and dictionary evolution. Results. We completed our goals within one year. In addition, structuring the data representation has led to a significant data quality improvement (impacting more than 10% of the data). The platform is active in the 21 hospitals of the institution, spread across 165 laboratories. PMID:22195205
Tension-compression viscoelastic behaviors of the periodontal ligament.
Wang, Chen-Ying; Su, Ming-Zen; Chang, Hao-Hueng; Chiang, Yu-Chih; Tao, Shao-Huan; Cheng, Jung-Ho; Fuh, Lih-Jyh; Lin, Chun-Pin
2012-09-01
Although exhaustively studied, the mechanism responsible for tooth support and the mechanical properties of the periodontal ligament (PDL) remain a subject of considerable controversy. In the past, various experimental techniques and theoretical analyses have been employed to tackle this intricate problem. The aim of this study was to investigate the viscoelastic behaviors of the PDL using three-dimensional finite element analysis. Three dentoalveolar complex models were established to simulate the tissue behaviors of the PDL: (1) deviatoric viscoelastic model; (2) volumetric viscoelastic model; and (3) tension-compression volumetric viscoelastic model. These modified models took into consideration the presence of tension and compression along the PDL during both loading and unloading. The inverse parameter identification process was developed to determine the mechanical properties of the PDL from the results of previously reported in vitro and in vivo experiments. The results suggest that the tension-compression volumetric viscoelastic model is a good approximation of normal PDL behavior during the loading-unloading process, and the deviatoric viscoelastic model is a good representation of how a damaged PDL behaves under loading conditions. Moreover, fluid appears to be the main creep source in the PDL. We believe that the biomechanical properties of the PDL established via retrograde calculation in this study can lead to the construction of more accurate extra-oral models and a comprehensive understanding of the biomechanical behavior of the PDL. Copyright © 2012. Published by Elsevier B.V.
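As a hedged illustration of an inverse parameter-identification step (not the paper's finite-element procedure; the function name, the material law, and the grid-search scheme are assumptions), one can fit a standard-linear-solid relaxation law E(t) = E_inf + E1·exp(-t/tau) to relaxation data by separable least squares:

```python
import numpy as np

def fit_sls_relaxation(t, E_obs, tau_grid):
    """Fit E(t) = E_inf + E1 * exp(-t/tau) to stress-relaxation data.

    Separable least squares: for each candidate relaxation time tau the
    moduli (E_inf, E1) enter linearly and are solved in closed form;
    the best tau is picked by residual error over the grid."""
    best = None
    for tau in tau_grid:
        M = np.column_stack([np.ones_like(t), np.exp(-t / tau)])
        coef, *_ = np.linalg.lstsq(M, E_obs, rcond=None)
        resid = np.sum((E_obs - M @ coef) ** 2)
        if best is None or resid < best[0]:
            best = (resid, tau, coef[0], coef[1])
    _, tau, E_inf, E1 = best
    return E_inf, E1, tau
```

Real PDL identification, as in the study, iterates a full viscoelastic finite-element model against in vitro/in vivo load-displacement curves; this sketch only shows the separable-fit idea on a one-dimensional relaxation law.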
Xie, Y L; Li, Y P; Huang, G H; Li, Y F; Chen, L R
2011-04-15
In this study, an inexact-chance-constrained water quality management (ICC-WQM) model is developed for planning regional environmental management under uncertainty. This method is based on an integration of interval linear programming (ILP) and chance-constrained programming (CCP) techniques. ICC-WQM allows uncertainties presented as both probability distributions and interval values to be incorporated within a general optimization framework. Complexities in environmental management systems can be systematically reflected, and the applicability of the modeling process is thus greatly enhanced. The developed method is applied to planning chemical-industry development in Binhai New Area of Tianjin, China. Interval solutions associated with different risk levels of constraint violation have been obtained. They can be used for generating decision alternatives and thus help decision makers identify desired policies under various system-reliability constraints on the water environment's capacity for pollutants. Tradeoffs between system benefits and constraint-violation risks can also be tackled. They are helpful for supporting (a) decisions on wastewater discharge and government investment, (b) formulation of local policies regarding water consumption, economic development and industry structure, and (c) analysis of interactions among economic benefits, system reliability and pollutant discharges. Copyright © 2011 Elsevier B.V. All rights reserved.
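A minimal sketch of the CCP ingredient, under the common textbook assumption of a normally distributed right-hand side (the paper's actual constraint set and distributions are not reproduced here; all names and numbers below are illustrative):

```python
from statistics import NormalDist

def deterministic_capacity(mu_b, sigma_b, alpha):
    """Deterministic equivalent of the chance constraint
    P(a.x <= b) >= 1 - alpha with b ~ Normal(mu_b, sigma_b):
    the random capacity b is replaced by mu_b + sigma_b * Phi^{-1}(alpha).
    A smaller alpha (stricter reliability) yields a tighter capacity."""
    return mu_b + sigma_b * NormalDist().inv_cdf(alpha)

def interval_solutions(profit, a, mu_b_interval, sigma_b, alpha):
    """One-variable ILP/CCP toy: maximise profit*x subject to a*x <= capacity,
    where the mean capacity is only known as an interval [lo, hi]. Solving at
    both interval ends gives the lower- and upper-bound objective values."""
    lo = deterministic_capacity(mu_b_interval[0], sigma_b, alpha) / a
    hi = deterministic_capacity(mu_b_interval[1], sigma_b, alpha) / a
    return lo * profit, hi * profit
```

Varying alpha reproduces the "different risk levels of constraint violation" idea: each reliability level gives a different interval solution, exposing the tradeoff between system benefit and violation risk.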
Investigating Integration Capabilities Between Ifc and Citygml LOD3 for 3d City Modelling
NASA Astrophysics Data System (ADS)
Floros, G.; Pispidikis, I.; Dimopoulou, E.
2017-10-01
Smart cities are applied to an increasing number of application fields. This evolution, however, demands large-scale data collection and integration, and major issues arise that need to be tackled. One of the most important challenges is the heterogeneity of the collected data, especially when those data derive from different standards and vary in terms of geometry, topology and semantics. Another key challenge is the efficient analysis and visualization of spatial data, which, given the complexity of physical reality in the modern world, 2D GIS struggles to cope with. So, in order to facilitate data analysis and enhance the role of smart cities, the third dimension needs to be implemented. Standards such as CityGML and IFC fulfill that necessity, but they present major differences in their schemas that render their integration a challenging task. This paper focuses on addressing those differences, examines the up-to-date research work and investigates an alternative methodology in order to bridge the gap between those standards. Within this framework, a generic IFC model is generated and converted to a CityGML model, which is validated and evaluated for its geometrical correctness and semantic coherence. General results as well as future research considerations are presented.
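A toy illustration of the semantic side of such a conversion (the geometric and coordinate-system transformation is the hard part and is omitted; the crosswalk below is a simplified, partly hypothetical mapping commonly cited in the integration literature, not the paper's full methodology):

```python
# Hypothetical semantic crosswalk from IFC building entities to CityGML
# LOD3 boundary-surface classes. Real conversions must also transform
# geometry (solid/BRep to surface), resolve the georeferencing, and
# handle entities with no direct counterpart.
IFC_TO_CITYGML = {
    "IfcWall":             "bldg:WallSurface",
    "IfcWallStandardCase": "bldg:WallSurface",
    "IfcRoof":             "bldg:RoofSurface",
    "IfcSlab":             "bldg:GroundSurface",
    "IfcWindow":           "bldg:Window",
    "IfcDoor":             "bldg:Door",
}

def map_entity(ifc_class):
    """Return the CityGML target class, or None when the IFC entity has
    no LOD3 counterpart and needs manual or rule-based handling."""
    return IFC_TO_CITYGML.get(ifc_class)
```

The `None` branch is where most of the schema mismatch discussed in the paper lives: entities that exist in one standard but have no clean equivalent in the other.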
Haywood, Alan M.; Dowsett, Harry J.; Dolan, Aisling M.; Rowley, David; Abe-Ouchi, Ayako; Otto-Bliesner, Bette; Chandler, Mark A.; Hunter, Stephen J.; Lunt, Daniel J.; Pound, Matthew; Salzmann, Ulrich
2016-01-01
Finally, we have designed a suite of prioritized experiments that tackle, in a discrete way, issues surrounding the basic understanding of the Pliocene and its relevance in the context of future climate change.
Regulation by consensus: The expanded use of regulatory negotiation under the Clean Air Act
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claiborne, M.L.
This article discusses the consensus building approach, which stems from the more formal regulatory negotiation process under the Negotiated Rulemaking Act of 1990, for improving air quality. The article uses as examples the joint plan to improve air quality and visibility in the Grand Canyon and 15 other national parks and wilderness areas in the SW USA, and the Southern Appalachian Mountain initiative tackling more complex issues including visibility, ground ozone, acid deposition, etc.
2017-09-14
Application to Wind Turbines. Aviation 2015, Dallas, TX. 18. Glegg, S., A. Buono, J. Grant, F. Lachowski, W. Devenport and N. Alexander (2015). Sound...complicate the interaction. The over-arching goal of this work is to tackle the turbulence ingestion noise (TIN) problem in a carefully controlled... Wind tunnel tests were performed to study the sound generated by the ingestion of the wake, as well as the unsteady upwash correlations on the blades
Optimizing the management of elderly colorectal surgery patients.
Tan, Kok-Yang; Konishi, Fumio; Tan, Lawrence; Chin, Wui-Kin; Ong, Hean-Yee; Tan, Phyllis
2010-11-01
With the ever-increasing number of geriatric surgical patients, there is a need to develop efficient processes that address all of the potential issues faced by patients during the perioperative period. This article explores the physiological changes in elderly surgical patients and the outcomes achieved after major abdominal surgery. Perioperative management strategies for elderly surgical patients, in line with the practices of the Geriatric Surgical Team of Alexandra Health, Singapore, are also presented. A coordinated transdisciplinary approach best tackles the complexities encountered in these patients.
Severe and Catastrophic Neck Injuries Resulting from Tackle Football
ERIC Educational Resources Information Center
Torg, Joseph S.; And Others
1977-01-01
Use of the spring-loaded blocking and tackling devices should be discontinued due to severe neck injuries resulting from their use; employment of the head and helmet as the primary assault weapon in blocking, tackling, and head butting should be condemned for the same reason. (MJB)
NGL Viewer: Web-based molecular graphics for large complexes.
Rose, Alexander S; Bradley, Anthony R; Valasatava, Yana; Duarte, Jose M; Prlic, Andreas; Rose, Peter W
2018-05-29
The interactive visualization of very large macromolecular complexes on the web is becoming a challenging problem as experimental techniques advance at an unprecedented rate and deliver structures of increasing size. We have tackled this problem by developing highly memory-efficient and scalable extensions for the NGL WebGL-based molecular viewer and by using MMTF, a binary and compressed Macromolecular Transmission Format. These enable NGL to download and render molecular complexes with millions of atoms interactively on desktop computers and smartphones alike, making it a tool of choice for web-based molecular visualization in research and education. The source code is freely available under the MIT license at github.com/arose/ngl and distributed on NPM (npmjs.com/package/ngl). MMTF-JavaScript encoders and decoders are available at github.com/rcsb/mmtf-javascript. asr.moin@gmail.com.
Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew
2015-03-03
In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). 
Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.
46 CFR 121.300 - Ground tackle and mooring lines.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 4 2012-10-01 2012-10-01 false Ground tackle and mooring lines. 121.300 Section 121.300 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS CARRYING MORE... MISCELLANEOUS SYSTEMS AND EQUIPMENT Mooring and Towing Equipment § 121.300 Ground tackle and mooring lines. A...
Detecting and estimating errors in 3D restoration methods using analog models.
NASA Astrophysics Data System (ADS)
José Ramón, Ma; Pueyo, Emilio L.; Briz, José Luis
2015-04-01
Some geological scenarios may be important for a number of socio-economic reasons, such as water or energy resources, but the available underground information is often limited, scarce and heterogeneous. A truly 3D reconstruction, which is still necessary during the decision-making process, may have important social and economic implications. For this reason, restoration methods were developed. By honoring some geometric or mechanical laws, they help build a reliable image of the subsurface. Pioneering methods were first applied in 2D (balanced and restored cross-sections) during the sixties and seventies. Later on, owing to improvements in computational capabilities, they were extended to 3D. Currently, there are several academic and commercial restoration solutions: Unfold (Université de Grenoble), Move (Midland Valley Exploration), Kine3D (on gOcad code, by Paradigm) and Dynel3D (igeoss-Schlumberger). We have developed our own restoration method, Pmag3Drest (IGME-Universidad de Zaragoza), which is designed to tackle complex geometrical scenarios using paleomagnetic vectors as a pseudo-3D indicator of deformation. However, all these methods have limitations arising from the assumptions they need to establish. For this reason, detecting and estimating uncertainty in 3D restoration methods is of key importance for trusting the reconstructions. Checking the reliability and internal consistency of every method, as well as comparing the results among restoration tools, is a critical issue never tackled so far because of the impossibility of testing the results against Nature. To overcome this problem we have developed a technique using analog models. We built complex geometric models, inspired by real cases of superposed and/or conical folding, at laboratory scale. The stratigraphic volumes were modeled using EVA (ethylene vinyl acetate) sheets.
Their rheology (tensile and tear strength, elongation, density, etc.) and thickness can be chosen from a large number of values, allowing many geologic settings to be simulated. Besides this, we also developed a novel technique to reconstruct the deformation ellipsoid. It consists of screen-printing an orthogonal net on every EVA plate. A CT scan of the stack of plates allows the nodes to be numbered in 3D. Then, the geologic geometry is simulated and the stack scanned again. Comparing the nets before and after deformation allows the distribution of strain ellipsoids in 3D to be computed. After extracting the principal axes, we can calculate dilation, total anisotropy, etc., with a density proportional to the mesh size. The resultant geometry is perfectly known, and thus so is the expected result of applying any restoration method. In this contribution we will show the first results obtained after subjecting several restoration methods to this stress test.
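The ellipsoid computation can be sketched as follows, assuming matched node coordinates before and after deformation (a generic homogeneous-deformation estimate per mesh cell, not the authors' exact processing pipeline):

```python
import numpy as np

def deformation_gradient(X_ref, x_def):
    """Best-fit homogeneous deformation gradient F with x ≈ F X + c,
    estimated by least squares from matched node coordinates
    (X_ref, x_def: arrays of shape (n_nodes, 3))."""
    Xc = X_ref - X_ref.mean(axis=0)
    xc = x_def - x_def.mean(axis=0)
    # Solve Xc @ F.T ≈ xc for F (translation removed by centring)
    F_T, *_ = np.linalg.lstsq(Xc, xc, rcond=None)
    return F_T.T

def strain_ellipsoid(F):
    """Principal stretches (semi-axes of the strain ellipsoid, ascending)
    and dilation, from the right Cauchy-Green tensor C = F^T F."""
    C = F.T @ F
    stretches = np.sqrt(np.linalg.eigvalsh(C))
    dilation = float(np.linalg.det(F))
    return stretches, dilation
```

Applying this cell by cell over the CT-derived net gives a strain-ellipsoid distribution with a density proportional to the mesh size, as described above.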
NASA Astrophysics Data System (ADS)
Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.
2014-08-01
Integrated water system modeling is a reasonable approach to provide scientific understanding of, and possible solutions to, the severe water crisis faced across the world, and to promote the implementation of integrated river basin management. Such a modeling practice has become more feasible nowadays due to better computing facilities and available data sources. In this study, the process-oriented water system model (HEXM) is developed by integrating multiple water-related processes, including hydrology, biogeochemistry, environment and ecology, as well as the interference of human activities. The model was tested in the Shaying River Catchment, the largest, highly regulated and heavily polluted tributary of the Huai River Basin in China. The results show that HEXM is well integrated, with good performance on the key water-related components in complex catchments. The simulated daily runoff series at all the regulated and less-regulated stations match the observations, especially for high and low flow events. The average values of the correlation coefficient and the coefficient of efficiency are 0.81 and 0.63, respectively. The dynamics of observed daily ammonia-nitrogen (NH4N) concentration, an important index of water environmental quality in China, are well captured, with an average correlation coefficient of 0.66. Furthermore, the spatial patterns of nonpoint source pollutant load and grain yield are also simulated properly, and the outputs agree well with city-scale statistics. Our model shows clearly superior performance in both calibration and validation in comparison with the widely used SWAT model. This model is expected to provide a strong reference for water system modeling in complex basins, a scientific foundation for the implementation of integrated river basin management worldwide, and technical guidance for the reasonable regulation of dams and sluices and for environmental improvement in river basins.
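The two reported skill scores are standard hydrological metrics and easy to reproduce; a minimal sketch (the coefficient of efficiency here is the Nash-Sutcliffe form, which is the usual reading of that term):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe coefficient of efficiency: 1 is a perfect fit;
    0 means the simulation is no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def correlation(obs, sim):
    """Pearson correlation coefficient between observed and simulated series."""
    return float(np.corrcoef(obs, sim)[0, 1])
```

Reported values of 0.81 (correlation) and 0.63 (efficiency) thus indicate simulations that track the observed dynamics well and clearly outperform a mean-flow baseline.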
Public health nutrition in the civil service (England): approaches to tackling obesity.
Blackshaw, J R
2016-08-01
The seriousness and scale of the physical, psychological, economic and societal consequences of poor diets, inactivity and obesity are unprecedented. Consequently, the contextual factors underpinning the work of a nutritionist in the civil service are complex and significant; however, there are real opportunities to make a difference and help improve the health of the nation. The present paper describes the delivery of public health nutrition through two work programmes, namely action to support young people in developing healthier lifestyle choices and, more recently, the investigation and deployment of local insights to develop action to tackle obesity. Combining the application of nutrition expertise with broader skills and approaches has enabled the translation of research and evidence into programmes of work that better the public's health. It is evident that appropriate evaluation of such approaches has helped to deliver engaging and practical learning opportunities for young people. Furthermore, efforts to build on local intelligence and pursue collaborative development can help inform the evidence base and deliver public health approaches that resonate with how people live their lives.
Giabbanelli, Philippe J; Crutzen, Rik
2017-01-01
In many western countries, most adults are overweight or obese. Several population-level interventions on the physical, economic, political, or sociocultural environment have thus attempted to achieve a healthier weight. These interventions have involved different weight-related behaviours, such as food behaviours. Agent-based models (ABMs) have the potential to help policymakers evaluate food behaviour interventions from a systems perspective. However, fully realizing this potential involves a complex procedure, starting with obtaining and analyzing data to populate the model and eventually identifying more efficient cross-sectoral policies. Current procedures for ABMs of food behaviours are mostly rooted in one technique, often ignore the food environment beyond home and work, and underutilize rich datasets. In this paper, we address some of these limitations to better support policymakers through two contributions. First, via a scoping review, we highlight readily available datasets and techniques to deal with these limitations independently. Second, we propose a three-step process to tackle all the limitations together and discuss its use in developing future models of food behaviours. We acknowledge that this integrated process is a leap forward for ABMs. However, this long-term objective is well worth addressing, as it can generate robust findings to effectively inform the design of food behaviour interventions.
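To make the ABM idea concrete, here is a deliberately minimal sketch of agents whose food choices mix a baseline preference with a local food environment (all parameters and behavioural rules are hypothetical and far simpler than the models the paper surveys):

```python
import random

class Agent:
    def __init__(self, susceptibility):
        # susceptibility in [0, 1]: how strongly the environment sways choice
        self.susceptibility = susceptibility
        self.healthy_meals = 0

    def choose_meal(self, p_healthy_env, rng):
        # Choice probability blends a 50/50 baseline with the environment.
        p = 0.5 * (1 - self.susceptibility) + p_healthy_env * self.susceptibility
        if rng.random() < p:
            self.healthy_meals += 1

def run(n_agents=100, n_days=30, p_healthy_env=0.3, seed=1):
    """Return the population share of healthy meals under one environment."""
    rng = random.Random(seed)
    agents = [Agent(rng.random()) for _ in range(n_agents)]
    for _ in range(n_days):
        for a in agents:
            a.choose_meal(p_healthy_env, rng)
    return sum(a.healthy_meals for a in agents) / (n_agents * n_days)
```

Comparing `run()` under different `p_healthy_env` values is the toy analogue of evaluating an environmental intervention from a systems perspective; policy-grade ABMs add heterogeneous environments (beyond home and work), social networks, and data-driven calibration.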
Madaoui, Hocine; Guerois, Raphaël
2008-01-01
Protein surfaces are under significant selection pressure to maintain interactions with their partners throughout evolution. Capturing how selection pressure acts at the interfaces of protein–protein complexes is a fundamental issue with high interest for the structural prediction of macromolecular assemblies. We tackled this issue under the assumption that, throughout evolution, mutations should minimally disrupt the physicochemical compatibility between specific clusters of interacting residues. This constraint drove the development of the so-called Surface COmplementarity Trace in Complex History score (SCOTCH), which was found to discriminate with high efficiency the structure of biological complexes. SCOTCH performances were assessed not only with respect to other evolution-based approaches, such as conservation and coevolution analyses, but also with respect to statistically based scoring methods. Validated on a set of 129 complexes of known structure exhibiting both permanent and transient intermolecular interactions, SCOTCH appears as a robust strategy to guide the prediction of protein–protein complex structures. Of particular interest, it also provides a basic framework to efficiently track how protein surfaces could evolve while keeping their partners in contact. PMID:18511568
Farrer, Emily C; Ashton, Isabel W; Knape, Jonas; Suding, Katharine N
2014-04-01
Two sources of complexity make predicting plant community response to global change particularly challenging. First, realistic global change scenarios involve multiple drivers of environmental change that can interact with one another to produce non-additive effects. Second, in addition to these direct effects, global change drivers can indirectly affect plants by modifying species interactions. In order to tackle both of these challenges, we propose a novel population modeling approach, requiring only measurements of abundance and climate over time. To demonstrate the applicability of this approach, we model population dynamics of eight abundant plant species in a multifactorial global change experiment in alpine tundra where we manipulated nitrogen, precipitation, and temperature over 7 years. We test whether indirect and interactive effects are important to population dynamics and whether explicitly incorporating species interactions can change predictions when models are forecast under future climate change scenarios. For three of the eight species, population dynamics were best explained by direct effect models, for one species neither direct nor indirect effects were important, and for the other four species indirect effects mattered. Overall, global change had negative effects on species population growth, although species responded to different global change drivers, and single-factor effects were slightly more common than interactive direct effects. When the fitted population dynamic models were extrapolated under changing climatic conditions to the end of the century, forecasts of community dynamics and diversity loss were largely similar using direct effect models that do not explicitly incorporate species interactions or best-fit models; however, inclusion of species interactions was important in refining the predictions for two of the species. 
The modeling approach proposed here is a powerful way of analyzing readily available datasets which should be added to our toolbox to tease apart complex drivers of global change. © 2013 John Wiley & Sons Ltd.
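A hedged sketch of the kind of abundance-plus-climate population model described above (a discrete-time Gompertz form with a single climate covariate; the authors' models also include species-interaction and multi-driver terms, omitted here for brevity, and all function names are illustrative):

```python
import numpy as np

def fit_population_model(N, climate):
    """Fit per-capita growth log(N[t+1]/N[t]) = a + b*log N[t] + c*climate[t]
    by ordinary least squares. N: abundance series; climate: series of the
    same length (last value unused). Returns (a, b, c)."""
    y = np.log(N[1:] / N[:-1])
    X = np.column_stack([np.ones(len(y)), np.log(N[:-1]), climate[:-1]])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(N0, climate_future, coef):
    """Iterate the fitted model forward under a future climate scenario."""
    a, b, c = coef
    N = [N0]
    for clim in climate_future:
        N.append(N[-1] * np.exp(a + b * np.log(N[-1]) + c * clim))
    return np.array(N)
```

Adding other species' (log-)abundances as extra covariates in `X` is how indirect effects via species interactions enter such models; forecasting then requires iterating all species jointly.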
Collapsed scrums and collision tackles: what is the injury risk?
Roberts, Simon P; Trewartha, Grant; England, Mike; Stokes, Keith A
2015-04-01
To establish the propensity for specific contact events to cause injury in rugby union, medical staff at participating English community-level rugby clubs reported any injury resulting in an absence of one match or more from the day of the injury during the 2009/2010 (n=46), 2010/2011 (n=67) and 2011/2012 (n=76) seasons. Injury severity was defined as the number of matches missed. Thirty community rugby matches were filmed and the number of contact events (tackles, collision tackles, rucks, mauls, lineouts and scrums) recorded. Of 370 (95% CI 364 to 378) contact events per match, 141 (137 to 145) were tackles, 115 (111 to 119) were rucks and 32 (30 to 33) were scrums. Tackles resulted in the greatest propensity for injury (2.3 (2.2 to 2.4) injuries/1000 events) and the greatest severity (16 (15 to 17) weeks missed/1000 events). Collision tackles (illegal tackles involving a shoulder charge) had a propensity for injury of 15 (12.4 to 18.3) injuries/1000 events, and their severity was 92 (75 to 112) weeks missed/1000 events, both higher than for any other event. Additional scrum analysis showed that only 5% of all scrums collapsed, but their propensity for injury was four times higher (2.9 (1.5 to 5.4) injuries/1000 events) and their severity six times greater (22 (12 to 42) weeks missed/1000 events) than for non-collapsed scrums. Injury prevention in the tackle should focus on technique, with strict enforcement of existing laws for illegal collision tackles. The scrum is a relatively controllable event and further attempts should be made to reduce the frequency of scrum collapse. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
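The propensity figures are exposure-normalised rates (injuries divided by the number of events, scaled to 1000 events); a sketch with a Poisson normal-approximation confidence interval (the paper's exact interval method is not stated, so this CI form is an assumption):

```python
import math

def propensity(injuries, events, per=1000):
    """Point estimate and approximate 95% CI for injuries per 1000 events,
    treating the injury count as Poisson (normal approximation, so only
    reasonable when the count is not very small)."""
    rate = injuries / events * per
    half = 1.96 * math.sqrt(injuries) / events * per
    return rate, (rate - half, rate + half)
```

For example, a propensity of 2.3 injuries/1000 tackles corresponds to roughly 230 reported tackle injuries over about 100 000 observed tackle events (illustrative numbers, not the study's actual counts).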
Evaluating behavioral skills training to teach safe tackling skills to youth football players.
Tai, Sharayah S M; Miltenberger, Raymond G
2017-10-01
With concussion rates on the rise for football players, there is a need for further research to increase skills and decrease injuries. Behavioral skills training is effective in teaching a wide variety of skills but has yet to be studied in the sports setting. We evaluated behavioral skills training to teach safer tackling techniques to six participants from a Pop Warner football team. Safer tackling techniques increased during practice and generalized to games for the two participants who had opportunities to tackle in games. © 2017 Society for the Experimental Analysis of Behavior.
Hunter, David J
2015-03-12
Health systems have entered a third era, embracing whole-systems thinking and posing complex policy and management challenges. Understanding how such systems work, and agreeing what needs to be put in place to enable them to undergo effective and sustainable change, are more pressing issues than ever for policy-makers. The theory-policy-practice gap and its four dimensions, as articulated by Chinitz and Rodwin, are acknowledged. It is suggested that insights derived from political science can both enrich our understanding of the gap and suggest what changes are needed to tackle the complex challenges facing health systems. © 2015 by Kerman University of Medical Sciences.
Non-cooperative Brownian donkeys: A solvable 1D model
NASA Astrophysics Data System (ADS)
Jiménez de Cisneros, B.; Reimann, P.; Parrondo, J. M. R.
2003-12-01
A paradigmatic 1D model for Brownian motion in a spatially symmetric, periodic system is tackled analytically. Upon application of an external static force F the system's response is an average current which is positive for F < 0 and negative for F > 0 (absolute negative mobility). Under suitable conditions, the system approaches 100% efficiency when working against the external force F.
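The standard machinery for such models is an overdamped Langevin equation integrated with the Euler-Maruyama scheme. The sketch below measures the average current for a plain tilted washboard potential, which exhibits ordinary positive mobility; the absolute negative mobility described above requires the paper's specific model, and the potential and parameters here are purely illustrative:

```python
import math
import random

def average_current(F, steps=20000, dt=0.01, D=0.05, seed=42):
    """Euler-Maruyama integration of dx = (-V'(x) + F) dt + sqrt(2 D dt) xi,
    with V(x) = A*cos(2*pi*x); returns the mean velocity (average current)."""
    A = 0.1                      # potential amplitude (illustrative)
    rng = random.Random(seed)
    x = x0 = 0.0
    for _ in range(steps):
        # -V'(x) = 2*pi*A*sin(2*pi*x); add the static tilt F and thermal noise
        drift = 2 * math.pi * A * math.sin(2 * math.pi * x) + F
        x += drift * dt + math.sqrt(2 * D * dt) * rng.gauss(0.0, 1.0)
    return (x - x0) / (steps * dt)
```

In this ordinary system the current has the same sign as F; the interest of the model in the paper is precisely that its solvable structure reverses this sign over a range of forces.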
Nuño-Solinís, Roberto
2014-01-01
The increase in life expectancy, coupled with other factors, has led to an increase in the prevalence of chronic diseases and multiple morbidity. This has created the need to develop new health and social care models that will allow these conditions to be managed efficiently and sustainably. In particular, there seems to be consensus on the need to move towards integrated, patient-centered, and more proactive care. Thus, in recent years, chronic care models have been developed at international, national and regional levels, along with strategies to tackle the challenge of chronic illness. However, the implementation of actions facilitating the change towards this new model of care does not seem to be an easy task. This paper presents some of the strategic lines and initiatives carried out by the Department of Health of the Basque Government. These actions can be described within a social and organizational innovation framework, as a means for the effective implementation of interventions and strategies that shape the model required for improved care of chronic illnesses within a universal, tax-funded health system. Copyright © 2013 Elsevier España, S.L. All rights reserved.
A demonstration of mixed-methods research in the health sciences.
Katz, Janet; Vandermause, Roxanne; McPherson, Sterling; Barbosa-Leiker, Celestina
2016-11-18
Background The growth of patient-, community- and population-centred nursing research is a rationale for the use of research methods that can examine complex healthcare issues not only from a biophysical perspective, but also from cultural, psychosocial and political viewpoints. This need for multiple perspectives requires mixed-methods research. Both philosophy and practicality are needed to plan and conduct mixed-methods research and to make it more broadly accessible to the health sciences research community. The traditions of, and dichotomy between, qualitative and quantitative research make the application of mixed methods a challenge. Aim To propose an integrated model for a research project containing steps from start to finish, and to use the unique strengths brought by each approach to meet the health needs of patients and communities. Discussion Mixed-methods research is a practical approach to inquiry that focuses on asking questions and on how best to answer them to improve the health of individuals, communities and populations. An integrated model of research begins with the research question(s) and moves in a continuum. The lines dividing methods do not dissolve, but become permeable boundaries where two or more methods can be used to answer research questions more completely. Rigorous and expert methodologists work together to solve common problems. Conclusion Mixed-methods research enables discussion among researchers from varied traditions. There is a plethora of methodological approaches available. Combining expertise by communicating across disciplines and professions is one way to tackle large and complex healthcare issues. Implications for practice The model presented in this paper exemplifies the integration of multiple approaches in a unified focus on identified phenomena. The dynamic nature of the model signals a need to be open to the data generated and the methodological directions implied by the findings.
Structural identifiability of cyclic graphical models of biological networks with latent variables.
Wang, Yulin; Lu, Na; Miao, Hongyu
2016-06-13
Graphical models have long been used to describe biological networks for a variety of important tasks, such as the determination of key biological parameters, and the structure of a graphical model ultimately determines whether such unknown parameters can be unambiguously obtained from experimental observations (i.e., the identifiability problem). Limited by resources or technical capacities, complex biological networks are usually only partially observed in experiments, which thus introduces latent variables into the corresponding graphical models. A number of previous studies have tackled the parameter identifiability problem for graphical models such as linear structural equation models (SEMs) with or without latent variables. However, the limited resolution and efficiency of existing approaches necessarily call for the further development of novel structural identifiability analysis algorithms. In this study, an efficient structural identifiability analysis algorithm is developed for a broad range of network structures. The proposed method adopts Wright's path coefficient method to generate identifiability equations in the form of symbolic polynomials, and then converts these symbolic equations to binary matrices (called identifiability matrices). Several matrix operations are introduced for identifiability matrix reduction while maintaining system equivalency. Based on the reduced identifiability matrices, the structural identifiability of each parameter is determined. A number of benchmark models are used to verify the validity of the proposed approach. Finally, the network module for influenza A virus replication is employed as a real example to illustrate the application of the proposed approach in practice. The proposed approach can deal with cyclic networks with latent variables. Its key advantage is that it intentionally avoids symbolic computation and is thus highly efficient.
Also, this method is capable of determining the identifiability of each single parameter and is thus of higher resolution in comparison with many existing approaches. Overall, this study provides a basis for systematic examination and refinement of graphical models of biological networks from the identifiability point of view, and it has a significant potential to be extended to more complex network structures or high-dimensional systems.
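The binary-matrix idea can be illustrated with a minimal sketch. The row/parameter encoding and the single-unknown elimination rule below are illustrative assumptions, not the paper's actual matrix operations:

```python
# Illustrative sketch only: a binary "identifiability matrix" where each row
# is an identifiability equation and each entry names a parameter appearing
# in it. A row whose equation contains exactly one still-unknown parameter
# determines that parameter, so we mark it identifiable and iterate to a
# fixpoint. The real algorithm uses richer matrix reductions that preserve
# system equivalency.

def reduce_identifiability_matrix(rows):
    """rows: list of sets of parameter indices appearing in each equation."""
    identifiable = set()
    changed = True
    while changed:
        changed = False
        for eq in rows:
            unresolved = eq - identifiable
            if len(unresolved) == 1:
                identifiable |= unresolved
                changed = True
    return sorted(identifiable)

# p0 is determined directly, then p1 via the second equation; p2 and p3
# remain entangled in a single equation, hence structurally unidentifiable.
print(reduce_identifiability_matrix([{0}, {0, 1}, {2, 3}]))  # [0, 1]
```

Because only set operations on a boolean incidence structure are involved, no symbolic computation is needed, which mirrors the efficiency argument made in the abstract.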
Decision makers often need assistance in understanding dynamic interactions and linkages among economic, environmental and social systems in coastal watersheds. They also need scientific input to better evaluate potential costs and benefits of alternative policy interventions. EP...
ERIC Educational Resources Information Center
Cummings, Lynda; Winston, Michael
1998-01-01
Describes the Solutions model used at Shelley High School in Idaho which gives students the opportunity to gain practical experience while tackling community problems. This approach is built on the three fundamentals of an integrated curriculum, a problem-solving focus, and service-based learning. Sample problems include increasing certain trout…
Kelemen, Arpad; Vasilakos, Athanasios V; Liang, Yulan
2009-09-01
Comprehensive evaluation of common genetic variations through association of single-nucleotide polymorphism (SNP) structure with common complex disease on the genome-wide scale is currently a hot area in human genome research due to the recent development of the Human Genome Project and HapMap Project. Computational science, which includes computational intelligence (CI), has recently become the third method of scientific enquiry besides theory and experimentation. There has been fast-growing interest in developing and applying CI in disease mapping using SNP and haplotype data. Some recent studies have demonstrated the promise and importance of CI for common complex diseases in genomic association studies using SNP/haplotype data, especially for tackling challenges such as gene-gene and gene-environment interactions, and the notorious "curse of dimensionality" problem. This review provides coverage of recent developments in CI approaches for complex diseases in genetic association studies with SNP/haplotype data.
Pomorska, Grazyna; Ockene, Judith K
2017-11-01
The goal of this article was to look at the problem of Alzheimer's disease (AD) through the lens of a socioecological resilience-thinking framework to help expand our view of the prevention and treatment of AD. This serious and complex public health problem requires a holistic systems approach. We present the view that resilience thinking, a theoretical framework that offers multidisciplinary approaches in ecology and natural resource management to solve environmental problems, can be applied to the prevention and treatment of AD. Resilience thinking explains a natural process that occurs in all complex systems in response to stressful challenges. The brain is a complex system, much like an ecosystem, and AD is a disturbance (allostatic overload) within the ecosystem of the brain. Resilience thinking gives us guidance, direction, and ideas about how to comprehensively prevent and treat AD and tackle the AD epidemic.
Progress of genome wide association study in domestic animals
2012-01-01
Domestic animals are invaluable resources for study of the molecular architecture of complex traits. Although the mapping of quantitative trait loci (QTL) responsible for economically important traits in domestic animals has achieved remarkable results in recent decades, not all of the genetic variation in the complex traits has been captured because of the low density of markers used in QTL mapping studies. The genome wide association study (GWAS), which utilizes high-density single-nucleotide polymorphism (SNP), provides a new way to tackle this issue. Encouraging achievements in dissection of the genetic mechanisms of complex diseases in humans have resulted from the use of GWAS. At present, GWAS has been applied to the field of domestic animal breeding and genetics, and some advances have been made. Many genes or markers that affect economic traits of interest in domestic animals have been identified. In this review, advances in the use of GWAS in domestic animals are described. PMID:22958308
Extending SME to Handle Large-Scale Cognitive Modeling.
Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre
2017-07-01
Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality, a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time, O(n² log n); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.
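The greedy-merging step can be caricatured with a toy pairing problem. This is not SME's actual kernel-merging machinery; the base/target names and scores are invented, and real SME kernels carry full relational structure:

```python
# Hedged sketch of greedy merging: local match hypotheses (here reduced to
# simple base->target pairings with scores) are visited in descending score
# order and merged into one interpretation, skipping any hypothesis that
# violates the one-to-one mapping constraint. This greedy pass is what keeps
# interpretation construction polynomial rather than exponential.

def greedy_merge(hypotheses):
    """hypotheses: list of (score, base_item, target_item) tuples."""
    mapping, used_base, used_target, total = {}, set(), set(), 0.0
    for score, b, t in sorted(hypotheses, reverse=True):
        if b not in used_base and t not in used_target:
            mapping[b] = t
            used_base.add(b)
            used_target.add(t)
            total += score
    return mapping, total

hyps = [(0.9, "sun", "nucleus"), (0.8, "planet", "electron"),
        (0.5, "sun", "electron")]  # conflicts with both higher-scored pairs
mapping, score = greedy_merge(hyps)
print(mapping)  # {'sun': 'nucleus', 'planet': 'electron'}
```

The losing hypothesis is simply dropped; in SME, lower-scored consistent kernels can instead seed alternative interpretations.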
van Woezik, Anne F G; Braakman-Jansen, Louise M A; Kulyk, Olga; Siemons, Liseth; van Gemert-Pijnen, Julia E W C
2016-01-01
Infection prevention and control can be seen as a wicked public health problem: there is no consensus regarding problem definition and solution, multiple stakeholders with different needs and values are involved, and there is no clear end-point to the problem-solving process. Co-creation with stakeholders has been proposed as a suitable strategy for tackling wicked problems, yet little information and no clear step-by-step guide exist on how to do this. The objectives of this study were to develop a guideline to assist developers in tackling wicked problems through co-creation with stakeholders, and to apply this guideline in practice to an example case in the field of infection prevention and control. A mixed-method approach integrating quantitative and qualitative research was used. Relevant stakeholders from the veterinary, human health, and public health sectors were identified using a literature scan, expert recommendations, and snowball sampling. The stakeholder salience approach was used to select key stakeholders based on three attributes: power, legitimacy, and urgency. Key values of stakeholders (N = 20) were derived through qualitative semi-structured interviews and quantitatively weighted and prioritized using an online survey. Our method showed that stakeholder identification and analysis are prerequisites for understanding the complex stakeholder network that characterizes wicked problems. A total of 73 stakeholders were identified, of which 36 were selected as potential key stakeholders, and only one was seen as a definitive stakeholder. In addition, deriving key stakeholder values is necessary to gain insight into the different problem definitions, solutions, and needs stakeholders have regarding the wicked problem. Based on the methods used, we developed a step-by-step guideline for co-creation with stakeholders when tackling wicked problems.
The mixed-methods guideline presented here provides a systematic, transparent method to identify, analyze, and co-create with stakeholders, and to recognize and prioritize their values, problem definitions, and solutions in the context of wicked problems. The guideline consists of a general framework and, although it was applied in an eHealth context, it may be relevant outside of eHealth as well.
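The power/legitimacy/urgency selection step used in the study can be sketched as a tiny classifier. The category names follow Mitchell et al.'s salience typology; reducing salience to a count of attributes is a deliberate simplification of that framework:

```python
# Hedged sketch of stakeholder-salience classification: each stakeholder is
# scored on three binary attributes (power, legitimacy, urgency). Holding
# all three makes a stakeholder "definitive", two "expectant", one "latent".
# Real applications weight and combine these attributes more carefully.

def salience(power, legitimacy, urgency):
    n = sum([power, legitimacy, urgency])
    return {3: "definitive", 2: "expectant", 1: "latent", 0: "none"}[n]

print(salience(True, True, True))    # definitive
print(salience(True, False, True))   # expectant
print(salience(False, True, False))  # latent
```

Under this scheme, the study's finding that only 1 of 73 identified stakeholders was definitive corresponds to only one actor holding all three attributes at once.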
Rusoja, Evan; Haynie, Deson; Sievers, Jessica; Mustafee, Navonil; Nelson, Fred; Reynolds, Martin; Sarriot, Eric; Swanson, Robert Chad; Williams, Bob
2018-01-30
As the Sustainable Development Goals are rolled out worldwide, development leaders will be looking to the experiences of the past to improve implementation in the future. Systems thinking and complexity science (ST/CS) propose that health and the health system are composed of dynamic actors constantly evolving in response to each other and their context. While offering practical guidance for steering the next development agenda, there is no consensus as to how these important ideas are discussed in relation to health. This systematic review sought to identify and describe some of the key terms, concepts, and methods in recent ST/CS literature. Using the search terms "systems thinking* AND health OR complexity theor* AND health OR complex adaptive system* AND health," we identified 516 relevant full texts out of 3982 titles across the search period (2002-2015). The number of articles peaked in 2014 (83), with journals specifically focused on medicine/healthcare (265), and particularly the Journal of Evaluation in Clinical Practice (37), representing the largest number by volume. Dynamic/dynamical systems (n = 332), emergence (n = 294), complex adaptive system(s) (n = 270), and interdependent/interconnected (n = 263) were the most common terms, with systems dynamic modelling (58) and agent-based modelling (43) the most common methods. The review offered several important conclusions. First, while there was no core ST/CS "canon," certain terms appeared frequently across the reviewed texts. Second, even as these ideas are gaining traction in academic and practitioner communities, most are concentrated in a few journals. Finally, articles on ST/CS remain largely theoretical, illustrating the need for further study and practical application. Given the challenge posed by the next phase of development, gaining a better understanding of ST/CS ideas and their use may lead to improvements in the implementation and practice of the Sustainable Development Goals.
Key messages: Systems thinking and complexity science, theories that acknowledge the dynamic, connected, and context-dependent nature of health, are highly relevant to the post-millennium development goal era, yet lack consensus on their use in relation to health. Although heterogeneous, terms and concepts like emergence, dynamic/dynamical systems, nonlinear(ity), and interdependent/interconnected, as well as methods like systems dynamic modelling and agent-based modelling, comprise systems thinking and complexity science in the health literature and are shared across an increasing number of publications within medical/healthcare disciplines. Planners, practitioners, and theorists who can better understand these key systems thinking and complexity science concepts will be better equipped to tackle the challenges of the upcoming development goals. © 2018 John Wiley & Sons, Ltd.
Topics in QCD at Nonzero Temperature and Density
NASA Astrophysics Data System (ADS)
Pangeni, Kamal
Understanding the behavior of matter at ultra-high density, such as in neutron stars, requires knowledge of the ground-state properties of quantum chromodynamics (QCD) at finite chemical potential. However, this task has turned out to be very difficult for two main reasons: 1) QCD may still be strongly coupled in those regimes, making perturbative calculations unreliable, and 2) QCD at finite density suffers from the sign problem, which makes the use of lattice simulation problematic and even affects phenomenological models. In the first part of this thesis, we show that the sign problem in analytical calculations of finite density models can be solved by considering the CK-symmetric complex saddle points of the effective action, where C is charge conjugation and K is complex conjugation. We then explore the properties and consequences of such complex saddle points at non-zero temperature and density. Due to CK symmetry, the mass matrix eigenvalues in these models are not always real but can be complex, which results in damped oscillation of the density-density correlation function, a new feature of finite density models. To address the generality of such behavior, we next consider a lattice model of QCD with static quarks at strong coupling. Computation of the mass spectrum confirms the existence of complex eigenvalues in much of the temperature-chemical potential plane. This provides an independent confirmation of our results obtained using phenomenological models of QCD. The existence of regions in parameter space where the density-density correlation function exhibits damped oscillation is one of the hallmarks of a typical liquid-gas system. The formalism developed to tackle the sign problem in QCD models in fact gives a simple understanding of the existence of such behavior in liquid-gas systems. To this end, we develop a generic field theoretic model for the treatment of the liquid-gas phase transition.
An effective field theory at finite density derived from a fundamental four-dimensional field theory turns out to be complex but CK symmetric. The existence of CK symmetry results in complex mass eigenvalues, which in turn lead to damped oscillatory behavior of the density-density correlation function. In the last part of this thesis, we study the effect of large-amplitude density oscillations on the transport properties of superfluid nuclear matter. In nuclear matter at neutron-star densities and temperatures, Cooper pairing leads to the formation of a gap in the nucleon excitation spectra, resulting in exponentially strong Boltzmann suppression of many transport coefficients. Previous calculations have shown evidence that density oscillations of sufficiently large amplitude can overcome this suppression for flavor-changing beta processes via the mechanism of "gap bridging". We address the simplifications made in that initial work, and show that gap bridging can counteract Boltzmann suppression of neutrino emissivity for the realistic case of modified Urca processes in matter with ³P₂ neutron pairing.
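The damped oscillation mentioned above can be written schematically: if the mass matrix has a complex-conjugate pair of eigenvalues m_R ± i m_I, the connected density-density correlator at large separation r behaves roughly as (normalization, prefactors, and the phase φ are omitted or illustrative; this is the generic form implied by complex eigenvalues, not the thesis's exact expression):

```latex
\langle n(0)\, n(r) \rangle_{c} \;\sim\; e^{-m_R r}\,\cos\!\left(m_I r + \varphi\right)
```

A purely real eigenvalue (m_I = 0) recovers ordinary exponential decay, which is why the oscillatory component is a distinctive signature of the complex saddle points.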
A cyclostationary multi-domain analysis of fluid instability in Kaplan turbines
NASA Astrophysics Data System (ADS)
Pennacchi, P.; Borghesani, P.; Chatterton, S.
2015-08-01
Hydraulic instabilities represent a critical problem for Francis and Kaplan turbines, reducing their useful life due to increased fatigue on the components and cavitation phenomena. Whereas an exhaustive list of publications on computational fluid-dynamic models of hydraulic instability is available, the possibility of applying diagnostic techniques based on vibration measurements has not been investigated sufficiently, partly because hydro turbine units are seldom equipped with the appropriate sensors. The aim of this study is to fill this knowledge gap and to fully exploit, for this purpose, the potential of combining cyclostationary analysis tools, able to describe complex dynamics such as those of fluid-structure interactions, with order tracking procedures, which allow domain transformations and consequently the separation of synchronous and non-synchronous components. This paper focuses on experimental data obtained on a full-scale Kaplan turbine unit operating in a real power plant, tackling the issues of adapting such diagnostic tools to the analysis of hydraulic instabilities and proposing techniques and methodologies for a highly automated condition monitoring system.
Panda, Rashmi; Puhan, N B; Panda, Ganapati
2018-02-01
Accurate optic disc (OD) segmentation is an important step in cup-to-disc-ratio-based glaucoma screening using fundus imaging. It is a challenging task because of the subtle OD boundary, blood vessel occlusion, and intensity inhomogeneity. In this Letter, the authors propose an improved version of the random walk algorithm for OD segmentation to tackle such challenges. The algorithm incorporates mean curvature and Gabor texture energy features into a new composite weight function used to compute the edge weights. Unlike deformable-model-based OD segmentation techniques, the proposed algorithm is unaffected by curve initialisation and the local-energy-minima problem. The effectiveness of the proposed method is verified on DRIVE, DIARETDB1, DRISHTI-GS and MESSIDOR database images using performance measures such as mean absolute distance, overlapping ratio, dice coefficient, sensitivity, specificity, and precision. The obtained OD segmentation results and quantitative performance measures show the robustness and superiority of the proposed algorithm in handling the complex challenges of OD segmentation.
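A composite random-walker edge weight in the spirit of the Letter can be sketched as follows. The Gaussian form and the beta coefficients are illustrative assumptions borrowed from the standard random-walker formulation, not the authors' exact weight function:

```python
# Hedged sketch: the weight between two neighbouring pixels decays with
# their differences in intensity, mean curvature, and Gabor texture energy.
# Similar pixels get weights near 1 (the walker crosses easily); a strong
# feature difference drives the weight toward 0, marking a likely boundary
# such as the optic disc rim.

import math

def composite_weight(d_intensity, d_curvature, d_texture,
                     beta_i=90.0, beta_c=30.0, beta_t=30.0):
    return math.exp(-(beta_i * d_intensity ** 2
                      + beta_c * d_curvature ** 2
                      + beta_t * d_texture ** 2))

print(round(composite_weight(0.01, 0.0, 0.0), 3))  # near 1: homogeneous region
print(composite_weight(0.3, 0.1, 0.2) < 0.01)      # True: strong edge
```

Combining several features this way is what lets the walker ignore blood-vessel occlusions that would fool a purely intensity-based weight.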
DOE Office of Scientific and Technical Information (OSTI.GOV)
Visweswara Sathanur, Arun; Choudhury, Sutanay; Joslyn, Cliff A.
Property graphs can be used to represent heterogeneous networks with attributed vertices and edges. Given one property graph, simulating another graph of the same or greater size with identical statistical properties with respect to the attributes and connectivity is critical for privacy preservation and benchmarking purposes. In this work we tackle the problem of capturing the statistical dependence of the edge connectivity on the vertex labels and using the same distribution to regenerate property graphs of the same or expanded size in a scalable manner. However, accurate simulation becomes a challenge when the attributes do not completely explain the network structure. We propose the Property Graph Model (PGM) approach, which uses an attribute (or label) augmentation strategy to mitigate the problem and preserve the graph connectivity as measured via the degree distribution, vertex label distributions, and edge connectivity. Our proposed algorithm is scalable, with complexity linear in the number of edges of the target graph. We illustrate the efficacy of the PGM approach in regenerating and expanding datasets through two distinct illustrations.
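The core label-conditioned regeneration idea can be sketched in a few lines. The function names and the independent Bernoulli edge sampling are illustrative simplifications of the PGM approach, which additionally augments labels when attributes underdetermine the structure:

```python
# Hedged sketch: estimate, from a source graph, the probability that an edge
# connects each unordered pair of vertex labels, then resample edges for a
# (possibly larger) vertex set with the same label distribution. Each
# candidate edge is drawn independently, so the expected label-pair edge
# densities of the output match the source graph.

import itertools
import random
from collections import Counter

def edge_prob_by_label(labels, edges):
    observed = Counter(tuple(sorted((labels[u], labels[v]))) for u, v in edges)
    possible = Counter(tuple(sorted(p))
                       for p in itertools.combinations(labels, 2))
    return {lp: observed[lp] / possible[lp] for lp in possible}

def regenerate(new_labels, probs, seed=0):
    rng = random.Random(seed)
    n = len(new_labels)
    return [(u, v) for u in range(n) for v in range(u + 1, n)
            if rng.random() < probs.get(
                tuple(sorted((new_labels[u], new_labels[v]))), 0.0)]

labels = ["A", "A", "B", "B"]
edges = [(0, 2), (1, 3), (0, 1)]          # two A-B edges plus one A-A edge
probs = edge_prob_by_label(labels, edges)
print(probs[("A", "B")])                  # 2 of 4 possible A-B edges -> 0.5
```

Both passes touch each candidate edge once, consistent with the linear-in-edges scalability claim for sparse sampling variants.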
Murla, Damian; Gutierrez, Oriol; Martinez, Montse; Suñer, David; Malgrat, Pere; Poch, Manel
2016-04-15
During heavy rainfall, the capacity of sewer systems and wastewater treatment plants may be surcharged, producing uncontrolled wastewater discharges and a deterioration of environmental quality. There is therefore a need for advanced management tools to tackle these complex problems. In this paper an environmental decision support system (EDSS), based on the integration of mathematical modeling and knowledge-based systems, has been developed for the coordinated management of urban wastewater systems (UWS) to control and minimize uncontrolled wastewater spills. The effectiveness of the EDSS has been tested in a specially designed virtual UWS, including two sewer systems, two WWTPs, and one river subjected to typical Mediterranean rain conditions. Results show that sewer systems, retention tanks, and wastewater treatment plants improve their performance under wet weather conditions and that an EDSS can be a very effective tool to improve management and prevent the system from possible uncontrolled wastewater discharges. Copyright © 2016 Elsevier B.V. All rights reserved.
Decoding the complex genetic causes of heart diseases using systems biology.
Djordjevic, Djordje; Deshpande, Vinita; Szczesnik, Tomasz; Yang, Andrian; Humphreys, David T; Giannoulatou, Eleni; Ho, Joshua W K
2015-03-01
The pace of disease gene discovery is still much slower than expected, even with the use of cost-effective DNA sequencing and genotyping technologies. It is increasingly clear that many inherited heart diseases have a more complex polygenic aetiology than previously thought. Understanding the role of gene-gene interactions, epigenetics, and non-coding regulatory regions is becoming increasingly critical in predicting the functional consequences of genetic mutations identified by genome-wide association studies and whole-genome or exome sequencing. A systems biology approach is now being widely employed to systematically discover genes that are involved in heart diseases in humans or relevant animal models through bioinformatics. The overarching premise is that the integration of high-quality causal gene regulatory networks (GRNs), genomics, epigenomics, transcriptomics and other genome-wide data will greatly accelerate the discovery of the complex genetic causes of congenital and complex heart diseases. This review summarises state-of-the-art genomic and bioinformatics techniques that are used in accelerating the pace of disease gene discovery in heart diseases. Accompanying this review, we provide an interactive web-resource for systems biology analysis of mammalian heart development and diseases, CardiacCode ( http://CardiacCode.victorchang.edu.au/ ). CardiacCode features a dataset of over 700 pieces of manually curated genetic or molecular perturbation data, which enables the inference of a cardiac-specific GRN of 280 regulatory relationships between 33 regulator genes and 129 target genes. We believe this growing resource will fill an urgent unmet need to fully realise the true potential of predictive and personalised genomic medicine in tackling human heart disease.
Techniques for Single System Integration of Elastic Simulation Features
NASA Astrophysics Data System (ADS)
Mitchell, Nathan M.
Techniques for simulating the behavior of elastic objects have matured considerably over the last several decades, tackling diverse problems from non-linear models for incompressibility to accurate self-collisions. Alongside these contributions, advances in parallel hardware design and algorithms have made simulation more efficient and affordable than ever before. However, prior research has often had to commit to design choices that compromise certain simulation features to better optimize others, resulting in a fragmented landscape of solutions. For complex, real-world tasks, such as virtual surgery, a holistic approach is desirable, where complex behavior, performance, and ease of modeling are supported equally. This dissertation addresses this goal through several interconnected threads of investigation, each of which contributes a piece of a unified solution. First, it will be demonstrated how various non-linear materials can be combined with lattice deformers to yield simulations with behavioral richness and a high potential for parallelism. This potential will be exploited to show how a hybrid solver approach based on large macroblocks can accelerate the convergence of these deformers. Further extensions of the lattice concept with non-manifold topology will allow for efficient processing of self-collisions and topology change. Finally, these concepts will be explored in the context of a case study on virtual plastic surgery, demonstrating a real-world problem space where these ideas can be combined to build an expressive authoring tool, allowing surgeons to record procedures digitally for future reference or education.
NASA Astrophysics Data System (ADS)
Bosman, Peter A. N.; Alderliesten, Tanja
2016-03-01
We recently demonstrated the strong potential of using dual-dynamic transformation models when tackling deformable image registration problems involving large anatomical differences. Dual-dynamic transformation models employ two moving grids instead of the common single moving grid for the target image (and single fixed grid for the source image). We previously employed powerful optimization algorithms to make use of the additional flexibility offered by a dual-dynamic transformation model with good results, directly obtaining insight into the trade-off between important registration objectives as a result of taking a multi-objective approach to optimization. However, optimization has so far been initialized using two regular grids, which still leaves a great potential of dual-dynamic transformation models untapped: a priori grid alignment with image structures/areas that are expected to deform more. This allows (far) fewer grid points to be used, compared to using a sufficiently refined regular grid, leading to (far) more efficient optimization, or, equivalently, more accurate results using the same number of grid points. We study the implications of exploiting this potential by experimenting with two new smart grid initialization procedures: one manual expert-based and one automated image-feature-based. We consider a CT test case with large differences in bladder volume, with and without a multi-resolution scheme, and find a substantial benefit of using smart grid initialization.
NASA Astrophysics Data System (ADS)
Le Bars, M.; Wacheul, J. B.
2015-12-01
Telluric planet formation involved the settling of large amounts of liquid iron, coming from impacting planetesimals, into an ambient viscous magma ocean. The initial state of planets was mostly determined by exchanges of heat and elements during this iron rain. Up to now, most models of planet formation simply assume that the metal rapidly equilibrated with the whole mantle. Other models account for simplified dynamics of the iron rain, involving the settling of single-size drops at the Stokes velocity. But the fluid dynamics of iron sedimentation is much more complex and is influenced by the large viscosity ratio between the metal and the ambient fluid, as shown in studies of rising gas bubbles (e.g. Bonometti and Magnaudet 2006). We aim to develop a global understanding of the iron-rain dynamics. Our study relies on a model experiment, consisting of popping a balloon of heated liquid metal at the top of a tank filled with viscous liquid. The experiments reach the relevant turbulent planetary regime and tackle the whole range of expected viscosity ratios. High-speed videos allow us to determine the dynamics of drop clouds, as well as the statistics of drop sizes, shapes, and velocities. We also develop an analytical model of turbulent diffusion during settling, validated by measuring the temperature decrease of the metal blob. We finally present the consequences for models of planet formation.
Azorin-Lopez, Jorge; Fuster-Guillo, Andres; Saval-Calvo, Marcelo; Mora-Mora, Higinio; Garcia-Chamizo, Juan Manuel
2017-01-01
The use of visual information is a very well known input from different kinds of sensors. However, most perception problems are modeled and tackled individually. It is necessary to provide a general imaging model that allows us to parametrize different input systems as well as their problems and possible solutions. In this paper, we present an active vision model that considers the imaging system as a whole (including camera, lighting system, and the object to be perceived) in order to propose solutions for automated visual systems that exhibit perception problems. As a concrete case study, we instantiate the model in a real and still challenging application: automated visual inspection. It is one of the most widely used quality control systems for detecting defects on manufactured objects; however, it presents problems for specular products. We model these perception problems taking into account environmental conditions and camera parameters that allow a system to properly perceive the specific object characteristics needed to determine defects on surfaces. The validation of the model has been carried out using simulations, providing an efficient way to perform a large set of tests (different environment conditions and camera parameters) as a preliminary step before experimentation in real manufacturing environments, which are more complex in terms of instrumentation and more expensive. Results prove the success of the model in adjusting scale, viewpoint, and lighting conditions to detect structural and color defects on specular surfaces. PMID:28640211
Homophobic and Transphobic Bullying: Barriers and Supports to School Intervention
ERIC Educational Resources Information Center
O'Donoghue, Kate; Guerin, Suzanne
2017-01-01
This study explores the perceived barriers and supports identified by school staff in tackling homophobic and transphobic bullying, using Bronfenbrenner's ecological model as a framework. Semi-structured interviews were conducted with participants from five separate second-level/high schools (two designated disadvantaged schools, and three…
ERIC Educational Resources Information Center
2002
This 40-minute videotape tackles the issue of childhood bullying and unwanted teasing and torment. The videotape features real school children handling dramatic roles and "doing the right thing" (aka "positive modeling"). The film is divided into two distinct parts: first act themes include bullying, girl bullies, children without one or both…
Effects of axial coordination on immobilized Mn(salen) catalysts.
Teixeira, Filipe; Mosquera, Ricardo A; Melo, André; Freire, Cristina; Cordeiro, M Natália D S
2014-11-13
The consequences of anchoring Mn(salen) catalysts onto a supporting material using one of the vacant positions of the metal center are tackled by studying several Mn(salen) complexes with different axial ligands attached. This is accomplished using Density Functional Theory at the X3LYP/triple-ζ level of theory and the Atoms In Molecules formalism. The results suggest that both Mn(salen) complexes and their oxo derivatives should lie in a triplet ground state. Also, the choice of the axial ligand has a moderate effect on the energy involved in the oxidation of the former to oxo-Mn(salen) complexes, as well as on the stability of such complexes toward ligand removal by HCl. AIM analysis further suggests that the salen ligand acts as a "charge reservoir" for the metal center, with strong correlations obtained between the charge of salen and the electron population donated by the axial ligand to the metal center. Moreover, the results suggest that the Mn atom in Mn(salen) complexes exhibits different hybridizations of its valence orbitals depending on the type of axial ligand present in the system.
Petascale Many Body Methods for Complex Correlated Systems
NASA Astrophysics Data System (ADS)
Pruschke, Thomas
2012-02-01
Correlated systems constitute an important class of materials in modern condensed matter physics. Correlations among electrons are at the heart of all ordering phenomena and many intriguing novel aspects, such as quantum phase transitions or topological insulators, observed in a variety of compounds. Yet theoretically describing these phenomena is still a formidable task, even if one restricts the models used to the smallest possible set of degrees of freedom. Here, modern computer architectures play an essential role, and the joint effort to devise efficient algorithms and implement them on state-of-the-art hardware has become an extremely active field in condensed-matter research. To tackle this task single-handedly is quite obviously not possible. The NSF-OISE funded PIRE collaboration "Graduate Education and Research in Petascale Many Body Methods for Complex Correlated Systems" is a successful initiative to bring together leading experts around the world to form a virtual international organization for addressing these emerging challenges and educating the next generation of computational condensed matter physicists. The collaboration includes research groups developing novel theoretical tools to reliably and systematically study correlated solids, experts in the efficient computational algorithms needed to solve the emerging equations, and those able to use modern heterogeneous computer architectures to make them working tools for the growing community.
Li, Y; Nielsen, P V
2011-12-01
There has been a rapid growth of scientific literature on the application of computational fluid dynamics (CFD) in the research of ventilation and indoor air science. With a 1000-10,000 times increase in computer hardware capability in the past 20 years, CFD has become an integral part of scientific research and engineering development of complex air distribution and ventilation systems in buildings. This review discusses the major and specific challenges of CFD in terms of turbulence modelling, numerical approximation, and boundary conditions relevant to building ventilation. We emphasize the growing need for CFD verification and validation, suggest ongoing needs for analytical and experimental methods to support the numerical solutions, and discuss the growing capacity of CFD in opening up new research areas. We suggest that CFD has not become a replacement for experiment and theoretical analysis in ventilation research; rather, it has become an increasingly important partner. We believe that an effective scientific approach for ventilation studies is still to combine experiments, theory, and CFD. We argue that CFD verification and validation are becoming more crucial than ever as more complex ventilation problems are solved. It is anticipated that ventilation problems at the city scale will be tackled by CFD in the next 10 years. © 2011 John Wiley & Sons A/S.
Constructing Acceptable RWM Approaches: The Politics of Participation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laes, E.; Bombaerts, G.
2006-07-01
Public participation in a complex technological issue such as the management of radioactive waste needs to be based on a simultaneous construction of scientific, ethical and socio-political foundations. Confronting this challenge is in no way straightforward. The problem is not only that the 'hard' technocrats downplay the importance of socio-political and ethical factors; also, our 'soft' ethical vocabularies (e.g. Habermasian 'discourse ethics') seem to be ill-equipped for tackling such complex questions (in terms of finding concrete solutions). On the other hand, professionals in the field, confronted with a (sometimes urgent) need for finding workable solutions, cannot wait for armchair philosophers to formulate the correct academic answers to their questions. Different public participation and communication models have been developed and tested in real-world conditions, for instance in the Belgian 'partnership approach' to the siting of a low-level waste management facility. Starting from the confrontation of theoretical outlooks and pragmatic solutions, this paper identifies a number of 'dilemmas of participation' that can only be resolved by inherently political choices. Successfully negotiating these dilemmas is of course difficult and conditional on many contextual factors, but nevertheless at the end of the paper an attempt is made to sketch the contours of three possible future scenarios (each with their own limits and possibilities). (authors)
Viard, Frédérique; Arnaud, Jean-François; Delescluse, Maxime; Cuguen, Joël
2004-06-01
Hybrids between transgenic crops and wild relatives have been documented in a wide range of cultivated species, with implications for conservation and biosafety management. Nonetheless, the magnitude and frequency of hybridization in the wild is still an open question, in particular when considering several populations at the landscape level. The Beta vulgaris complex provides an excellent biological model to tackle this issue. Weed beets contaminating sugar beet fields are expected to act as a relay between wild populations and crops and from crop to crop. In one major European sugar beet production area, nine wild populations and 12 weed populations were genetically characterized using cytoplasmic markers specific to the cultivated lines and nuclear microsatellite loci. Strong overall genetic differentiation between neighbouring wild and weed populations was observed. However, genetic admixture analyses at the individual level revealed clear evidence for gene flow between wild and weed populations. In particular, one wild population displayed a high degree of nuclear genetic admixture, reinforced by direct seed flow as evidenced by cytoplasmic markers. Altogether, weed beets were shown to act as a relay for gene flow from crops to wild populations and from crop to crop via pollen and seeds at the landscape level.
Social regulation of emotion: messy layers
Kappas, Arvid
2013-01-01
Emotions are evolved systems of intra- and interpersonal processes that are regulatory in nature, dealing mostly with issues of personal or social concern. They regulate social interaction and, by extension, the social sphere. In turn, processes in the social sphere regulate emotions of individuals and groups. In other words, intrapersonal processes project into interpersonal space, and inversely, interpersonal experiences deeply influence intrapersonal processes. Thus, I argue that the concepts of emotion generation and regulation should not be artificially separated. Similarly, interpersonal emotions should not be reduced to interacting systems of intraindividual processes. Instead, we can consider emotions at different social levels, ranging from dyads to large-scale e-communities. The interaction between these levels is complex and involves more than influences from one level to the next. In this sense the levels of emotion/regulation are messy and a challenge for empirical study. In this article, I discuss the concepts of emotions and regulation at different intra- and interpersonal levels. I extend the concept of auto-regulation of emotions (Kappas, 2008, 2011a,b) to social processes. Furthermore, I argue for the necessity of including mediated communication, particularly in cyberspace, in contemporary models of emotion/regulation. Lastly, I suggest the use of concepts from systems dynamics and complex systems to tackle the challenge of the “messy layers.” PMID:23424049
Human genetics as a model for target validation: finding new therapies for diabetes.
Thomsen, Soren K; Gloyn, Anna L
2017-06-01
Type 2 diabetes is a global epidemic with major effects on healthcare expenditure and quality of life. Currently available treatments are inadequate for the prevention of comorbidities, yet progress towards new therapies remains slow. A major barrier is the insufficiency of traditional preclinical models for predicting drug efficacy and safety. Human genetics offers a complementary model to assess causal mechanisms for target validation. Genetic perturbations are 'experiments of nature' that provide a uniquely relevant window into the long-term effects of modulating specific targets. Here, we show that genetic discoveries over the past decades have accurately predicted therapeutic mechanisms for type 2 diabetes that are now established. These findings highlight the potential for use of human genetic variation for prospective target validation, and establish a framework for future applications. Studies into rare, monogenic forms of diabetes have also provided proof-of-principle for precision medicine, and the applicability of this paradigm to complex disease is discussed. Finally, we highlight some of the limitations that are relevant to the use of genome-wide association studies (GWAS) in the search for new therapies for diabetes. A key outstanding challenge is the translation of GWAS signals into disease biology, and we outline possible solutions for tackling this experimental bottleneck.
Large eddy simulation of flows in industrial compressors: a path from 2015 to 2035
Gourdain, N.; Sicot, F.; Duchaine, F.; Gicquel, L.
2014-01-01
A better understanding of turbulent unsteady flows is a necessary step towards a breakthrough in the design of modern compressors. Owing to high Reynolds numbers and very complex geometry, the flow that develops in such industrial machines is extremely hard to predict. At this time, the most popular method to simulate these flows is still based on a Reynolds-averaged Navier–Stokes approach. However, there is some evidence that this formalism is not accurate for these components, especially when a description of time-dependent turbulent flows is desired. With the increase in computing power, large eddy simulation (LES) emerges as a promising technique to improve both knowledge of complex physics and reliability of flow solver predictions. The objective of the paper is thus to give an overview of the current status of LES for industrial compressor flows as well as to propose future research axes regarding the use of LES for compressor design. While the use of wall-resolved LES for industrial multistage compressors at realistic Reynolds number should not be ready before 2035, some possibilities exist to reduce the cost of LES, such as wall modelling and the adaptation of the phase-lag condition. This paper also points out the necessity of combining LES with techniques able to tackle complex geometries. Indeed LES alone, i.e. without prior knowledge of such flows for grid construction or the prohibitive yet ideal use of fully homogeneous meshes to predict compressor flows, is quite limited today. PMID:25024422
A CRISPR Path to Engineering New Genetic Mouse Models for Cardiovascular Research.
Miano, Joseph M; Zhu, Qiuyu Martin; Lowenstein, Charles J
2016-06-01
Previous efforts to target the mouse genome for the addition, subtraction, or substitution of biologically informative sequences required complex vector design and a series of arduous steps only a handful of laboratories could master. The facile and inexpensive clustered regularly interspaced short palindromic repeats (CRISPR) method has now superseded traditional means of genome modification such that virtually any laboratory can quickly assemble reagents for developing new mouse models for cardiovascular research. Here, we briefly review the history of CRISPR in prokaryotes, highlighting major discoveries leading to its formulation for genome modification in the animal kingdom. Core components of CRISPR technology are reviewed and updated. Practical pointers for 2-component and 3-component CRISPR editing are summarized with many applications in mice including frameshift mutations, deletion of enhancers and noncoding genes, nucleotide substitution of protein-coding and gene regulatory sequences, incorporation of loxP sites for conditional gene inactivation, and epitope tag integration. Genotyping strategies are presented and topics of genetic mosaicism and inadvertent targeting discussed. Finally, clinical applications and ethical considerations are addressed as the biomedical community eagerly embraces this astonishing innovation in genome editing to tackle previously intractable questions. © 2016 American Heart Association, Inc.
Essential Readings in Gifted Education: 12 Volume Set
ERIC Educational Resources Information Center
Reis, Sally M., Ed.
2004-01-01
The National Association for Gifted Children series "Essential Readings in Gifted Education," edited by Sally M. Reis, is a comprehensive collection of the leading research, theories, and findings that span more than 25 years. Each volume tackles the major issues, chronicles chief trends, and imparts effective models and solutions for gifted…
ERIC Educational Resources Information Center
Gipson, Frances Marie
2012-01-01
Federal, state, and local agencies face challenges organizing resources that create the conditions necessary to create, sustain, and replicate effective high performing schools. Knowing that leadership does impact achievement outcomes and that school districts tackle growing numbers of sanctioned Program Improvement schools, a distributed…
ERIC Educational Resources Information Center
Lawlor, Hugh
2010-01-01
At the heart of the vision for the Diploma in Science is a multidisciplinary approach to learning by tackling scientific challenges and questions in applied work-related contexts. This, together with the innovative delivery model offered by a consortia approach, will bridge a significant gap in the provision of science and mathematics education.…
Re-Modelling as De-Professionalisation
ERIC Educational Resources Information Center
Thompson, Meryl
2006-01-01
The article sets out the consequences of the British Government's remodelling agenda and its emphasis on less demarcation, for the professional status of teachers in England. It describes how the National Agreement on Raising Standards and Tackling Workload, reached between five of the six trade unions for teachers and headteachers paves the way…
Data Integration Approaches to Longitudinal Growth Modeling
ERIC Educational Resources Information Center
Marcoulides, Katerina M.; Grimm, Kevin J.
2017-01-01
Synthesizing results from multiple studies is a daunting task during which researchers must tackle a variety of challenges. The task is even more demanding when studying developmental processes longitudinally and when different instruments are used to measure constructs. Data integration methodology is an emerging field that enables researchers to…
Alvaro, C.; Jackson, L. A.; Kirk, S.; McHugh, T. L.; Hughes, J.; Chircop, A.; Lyons, R. F.
2011-01-01
This paper explores why Canadian government policies, particularly those related to obesity, are ‘stuck’ at promoting individual lifestyle change. Key concepts within complexity and critical theories are considered as a basis for understanding the continued emphasis on lifestyle factors in spite of strong evidence indicating that changes in the environment and in the conditions of poverty are needed to tackle obesity. Opportunities to get ‘unstuck’ from individual-level lifestyle interventions are also suggested by critical concepts found within these two theories, although getting ‘unstuck’ will also require cross-sectoral collective action. Our discussion focuses on the Canadian context but will undoubtedly be relevant to other countries where health promoters and others engage in similar struggles for fundamental government policy change. PMID:20709791
Pasolini's Edipo Re: myth, play, and autobiography.
Pipolo, Tony
2013-08-01
The pervasive influence of the Oedipus complex on world culture is a given, yet throughout the long history of motion pictures only one major filmmaker has tackled the literary source that inspired Freud. The film, Edipo Re, directed by Italian poet, novelist, and social and political activist Pier Paolo Pasolini, not only reconstructs the myth and adapts Sophocles' tragedy, but uses both as a basis of cinematic autobiography. This paper is a detailed analysis of the formal, stylistic, and thematic dimensions of this film, illustrating the complex manner in which Pasolini interweaves myth, play, and autobiography into a unique cinematic achievement. This analysis is followed by speculations on the implications of the film's structure and techniques and on what they reveal about Pasolini's character, his sexual profile, and the ignominious murder that ended his life.
L-hop percolation on networks with arbitrary degree distributions and its applications
NASA Astrophysics Data System (ADS)
Shang, Yilun; Luo, Weiliang; Xu, Shouhuai
2011-09-01
Site percolation has been used to help understand analytically the robustness of complex networks in the presence of random node deletion (or failure). In this paper we move a further step beyond random node deletion by considering that a node can be deleted because it is chosen or because it is within some L-hop distance of a chosen node. Using the generating functions approach, we present analytic results on the percolation threshold as well as the mean size, and size distribution, of nongiant components of complex networks under such operations. The introduction of parameter L is both conceptually interesting because it accommodates a sort of nonindependent node deletion, which is often difficult to tackle analytically, and practically interesting because it offers useful insights for cybersecurity (such as botnet defense).
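The generating-function machinery can be illustrated concretely in the classical L = 0 special case (independent random node deletion), where the percolation threshold of a configuration-model network follows directly from the first two moments of the degree distribution. The sketch below is illustrative only and is not the authors' code.

```python
# Classical site-percolation threshold for a configuration-model network,
# computed from its degree distribution: phi_c = <k> / (<k^2> - <k>).
# Hypothetical helper for illustration; the paper's L-hop results generalize this.
from math import exp, factorial

def percolation_threshold(degree_probs):
    """degree_probs[k] = P(node has degree k)."""
    mean_k = sum(k * p for k, p in enumerate(degree_probs))
    mean_k2 = sum(k * k * p for k, p in enumerate(degree_probs))
    return mean_k / (mean_k2 - mean_k)

# Poisson degree distribution with mean z = 4, truncated at k = 50
z = 4.0
pk = [exp(-z) * z**k / factorial(k) for k in range(51)]
print(percolation_threshold(pk))  # ~ 1/z = 0.25 for a Poisson network
```

For a Poisson degree distribution <k^2> - <k> = z^2, so the threshold reduces to 1/z, matching the well-known Erdős–Rényi result.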
Multiple crack detection in 3D using a stable XFEM and global optimization
NASA Astrophysics Data System (ADS)
Agathos, Konstantinos; Chatzi, Eleni; Bordas, Stéphane P. A.
2018-02-01
A numerical scheme is proposed for the detection of multiple cracks in three dimensional (3D) structures. The scheme is based on a variant of the extended finite element method (XFEM) and a hybrid optimizer solution. The proposed XFEM variant is particularly well-suited for the simulation of 3D fracture problems, and as such serves as an efficient solution to the so-called forward problem. A set of heuristic optimization algorithms are recombined into a multiscale optimization scheme. The introduced approach proves effective in tackling the complex inverse problem involved, where identification of multiple flaws is sought on the basis of sparse measurements collected near the structural boundary. The potential of the scheme is demonstrated through a set of numerical case studies of varying complexity.
[Parricide, abuse and emotional processes: a review starting from some paradigmatic cases].
Grattagliano, I; Greco, R; Di Vella, G; Campobasso, C P; Corbi, G; Romanelli, M C; Petruzzelli, N; Ostuni, A; Brunetti, V; Cassibba, R
2015-01-01
The authors of this study tackle the complex subject of parricide, which is a rare and often brutal form of homicide. Parricide has a high emotional impact on public opinion and on our collective imagination, especially in light of the fact that the perpetrators are often minors. Three striking cases of parricide, taken from various documented sources and judicial files from the "N. Fornelli" Juvenile Penal Institute (Bari, Italy), are presented here. A review of the literature on the topic has revealed differences between parricides committed by adults and those committed by minors. In the end, the complex issues underlying such an unusual crime are connected to the abuse and maltreatment that minor perpetrators of parricide have suffered, and especially to the emotional processes thereby activated.
Big Data Analytics with Datalog Queries on Spark.
Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo
2016-01-01
There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
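The recursion support described above amounts to fixpoint evaluation of recursive rules. A minimal plain-Python sketch of semi-naive evaluation for a transitive-closure query, the canonical recursive Datalog example, is shown below; it illustrates the technique only and does not use the BigDatalog or Spark APIs.

```python
# Semi-naive evaluation of a recursive Datalog query (transitive closure):
# in each round, only the newly derived facts (the "delta") are joined
# with the base relation, avoiding rederivation of known facts.
def transitive_closure(edges):
    """edges: set of (src, dst) pairs. Returns all reachable pairs."""
    total = set(edges)
    delta = set(edges)           # newly derived facts only
    while delta:
        # join the delta with the base edge relation (semi-naive step)
        new = {(a, d) for (a, b) in delta for (c, d) in edges if b == c}
        delta = new - total      # keep only facts not derived before
        total |= delta
    return total

print(sorted(transitive_closure({(1, 2), (2, 3), (3, 4)})))
# -> [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

On a distributed engine, the same loop becomes iterated joins over partitioned relations, which is where the compilation and optimization techniques mentioned above come into play.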
NASA Astrophysics Data System (ADS)
Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas
2017-04-01
Physically-based modeling is a wide-spread tool in understanding and management of natural systems. With the high complexity of many such models and the huge amount of model runs necessary for parameter estimation and uncertainty analysis, overall run times can be prohibitively long even on modern computer systems. An encouraging strategy to tackle this problem is model reduction. In this contribution, we compare different proper orthogonal decomposition (POD, Siade et al. (2010)) methods and their potential applications to groundwater models. The POD method performs a singular value decomposition on system states as simulated by the complex (e.g., PDE-based) groundwater model taken at several time-steps, so-called snapshots. The singular vectors with the highest information content resulting from this decomposition are then used as a basis for projection of the system of model equations onto a subspace of much lower dimensionality than the original complex model, thereby greatly reducing complexity and accelerating run times. In its original form, this method is only applicable to linear problems. Many real-world groundwater models are non-linear, though. These non-linearities are introduced either through model structure (unconfined aquifers) or boundary conditions (certain Cauchy boundaries, like rivers with variable connection to the groundwater table). To date, applications of POD have focused on groundwater models simulating pumping tests in confined aquifers with constant head boundaries. In contrast, POD model reduction either greatly loses accuracy or does not significantly reduce model run time if the above-mentioned non-linearities are introduced. We have also found that variable Dirichlet boundaries are problematic for POD model reduction. An extension to the POD method, called POD-DEIM, has been developed for non-linear groundwater models by Stanko et al. (2016).
This method uses spatial interpolation points to build the equation system in the reduced model space, thereby allowing the recalculation of system matrices at every time-step necessary for non-linear models while retaining the speed of the reduced model. This makes POD-DEIM applicable to groundwater models simulating unconfined aquifers. However, in our analysis, the method struggled to reproduce variable river boundaries accurately and gave no advantage for variable Dirichlet boundaries compared to the original POD method. We have developed another extension for POD that addresses these remaining problems by performing a second POD operation on the model matrix on the left-hand side of the equation. The method aims to at least reproduce the accuracy of the other methods where they are applicable while outperforming them for setups with changing river boundaries or variable Dirichlet boundaries. We compared the new extension with original POD and POD-DEIM for different combinations of model structures and boundary conditions. The new method shows the potential of POD extensions for applications to non-linear groundwater systems and complex boundary conditions that go beyond the current, relatively limited range of applications. References: Siade, A. J., Putti, M., and Yeh, W. W.-G. (2010). Snapshot selection for groundwater model reduction using proper orthogonal decomposition. Water Resour. Res., 46(8):W08539. Stanko, Z. P., Boyce, S. E., and Yeh, W. W.-G. (2016). Nonlinear model reduction of unconfined groundwater flow using POD and DEIM. Advances in Water Resources, 97:130-143.
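The core POD step described above can be sketched in a few lines: collect snapshot states, take an SVD, and project the system matrix onto the dominant singular vectors. The example below uses random stand-in matrices rather than an actual groundwater model; dimensions and names are illustrative assumptions.

```python
# Sketch of POD model reduction for a linear system A x = b:
# build a reduced basis from state snapshots via SVD, solve in the
# reduced space, then lift the solution back to full dimension.
import numpy as np

rng = np.random.default_rng(0)
n, n_snap, r = 200, 30, 5              # full dim, snapshot count, reduced dim

snapshots = rng.standard_normal((n, n_snap))   # columns = simulated states
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :r]                       # dominant singular vectors

A = rng.standard_normal((n, n))        # stand-in system matrix
b = rng.standard_normal(n)

A_r = basis.T @ A @ basis              # r x r reduced operator
b_r = basis.T @ b
x_r = np.linalg.solve(A_r, b_r)        # cheap solve in reduced space
x_approx = basis @ x_r                 # lift back to full space
print(x_approx.shape)                  # (200,)
```

For non-linear models, A depends on the state, which is exactly why POD-DEIM (and the second-POD extension described above) recomputes or approximates the reduced operator at each time-step.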
NASA Astrophysics Data System (ADS)
Jacquey, Antoine; Cacace, Mauro
2017-04-01
Utilization of the underground for energy-related purposes has received increasing attention in the last decades as a source of carbon-free energy and for safe storage solutions. Understanding the key processes controlling fluid and heat flow around geological discontinuities such as faults and fractures, as well as their mechanical behaviours, is therefore of interest in order to design safe and sustainable reservoir operations. These processes occur in a naturally complex geological setting, comprising natural or engineered discrete heterogeneities such as faults and fractures; they span a relatively large spectrum of temporal and spatial scales and interact in a highly non-linear fashion. In this regard, numerical simulators have become necessary in geological studies to model coupled processes and complex geological geometries. In this study, we present a new simulator, GOLEM, which uses multiphysics coupling to characterize geological reservoirs. In particular, special attention is given to discrete geological features such as faults and fractures. GOLEM is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE). The MOOSE framework provides a powerful and flexible platform to solve multiphysics problems implicitly and in a tightly coupled manner on unstructured meshes, which is of interest for the considered non-linear context. Governing equations in 3D for fluid flow, heat transfer (conductive and advective), saline transport as well as deformation (elastic and plastic) have been implemented in the GOLEM application. Coupling between rock deformation and fluid and heat flow is considered using the theories of poroelasticity and thermoelasticity. Furthermore, treating material properties such as density and viscosity and transport properties such as porosity as dependent on the state variables (based on the International Association for the Properties of Water and Steam models) increases the coupling complexity of the problem.
The GOLEM application therefore aims at integrating more of the physical processes observed in the field or in the laboratory to simulate more realistic scenarios. The use of high-level nonlinear solver technology allows us to tackle these complex multiphysics problems in three dimensions. Basic concepts behind the GOLEM simulator are presented in this study, as well as a few application examples to illustrate its main features.
Systems Medicine: Sketching the Landscape.
Kirschner, Marc
2016-01-01
To understand the meaning of the term Systems Medicine and to distinguish it from seemingly related expressions currently in use, such as precision, personalized, -omics, or big data medicine, its underlying history and development into the present need to be highlighted. With this development in mind, it becomes evident that Systems Medicine is a genuine concept as well as a novel way of tackling the manifold complexity that occurs in today's clinical medicine, and not just a rebranding of what has been done in the past. Looking back, it seems clear to many in the field that Systems Medicine has its origin in an integrative method to unravel biocomplexity, namely, Systems Biology. Here scientists have by now gained useful experience that is on the verge of implementation in clinical research and practice. Systems Medicine and Systems Biology share the same underlying theoretical principle of systems-based thinking, a methodology for understanding complexity that can be traced back to ancient Greece. During the last decade, however, and due to rapid methodological development in the life sciences and computing/IT technologies, Systems Biology has evolved from a scientific concept into an independent discipline most competent to tackle key questions of biocomplexity, with the potential to transform medicine and how it will be practiced in the future. To understand this process in more detail, the following section gives a short summary of the foundation of systems-based thinking and the different developmental stages, including systems theory, the development of modern Systems Biology, and its transition into clinical practice. These are the components that pave the way toward Systems Medicine.
Fleştea, Alina Maria; Fodor, Oana Cătălina; Curşeu, Petru Lucian; Miclea, Mircea
2017-01-01
Multi-team systems (MTS) are used to tackle unpredictable events and to respond effectively to fast-changing environmental contingencies. Their effectiveness is influenced by within-team as well as between-team processes (i.e. communication, coordination) and emergent phenomena (i.e. situational awareness). The present case study explores the way in which emergent structures and the involvement of bystanders intertwine with the dynamics of processes and emergent states both within and between the component teams. Our findings show that an inefficient transition process and ambiguous leadership generated poor coordination and hindered the development of emergent phenomena within the whole system. Emergent structures and bystanders substituted for leadership functions and provided a pool of critical resources for the MTS. Their involvement fostered the emergence of situational awareness and facilitated contingency planning processes. However, bystander involvement impaired the emergence of cross-understandings and interfered with coordination processes between the component teams. Practitioner Summary: Based on a real emergency situation, the present research provides important theoretical and practical insights about the role of bystander involvement in the dynamics of multi-team systems composed to tackle complex tasks and respond to fast-changing and unpredictable environmental contingencies.
Hormiga, J A; Vera, J; Frías, I; Torres Darias, N V
2008-10-10
The well-documented ability to degrade lignin and a variety of complex chemicals shown by the white-rot fungus Phanerochaete chrysosporium has made it the subject of many studies in areas of environmental concern, including pulp bioleaching and bioremediation technologies. However, until now, most of the work in this field has been focused on the ligninolytic sub-system; due to the great complexity of the processes involved, less progress has been made in understanding the biochemical regulatory structure that could explain growth dynamics, substrate utilization and the production of the ligninolytic system itself. In this work we tackle this problem from the perspectives and approaches of systems biology, which have been shown to be effective in the case of complex systems. We use a top-down approach to the construction of this model, aiming to identify the cellular sub-systems that play a major role in the whole process. We have investigated growth dynamics, substrate consumption and lignin peroxidase production of the P. chrysosporium wild type under a set of defined culture conditions. Based on data gathered from different authors and on our own experimental determinations, we built a model using a GMA power-law representation, which was used as a platform for predictive simulations. Thereby, we could assess the consistency of some current assumptions about the regulatory structure of the overall process. The model parameters were estimated from time-series experimental measurements by means of an algorithm previously adapted and optimized for power-law models. The model was subsequently checked for quality by comparing its predictions with the experimental behavior observed in new, different experimental settings and through perturbation analysis aimed at testing the robustness of the model.
Hence, the model proved able to predict the dynamics of two critical variables, biomass and lignin peroxidase activity, under conditions of nutrient deprivation and after pulses of veratryl alcohol. Moreover, it successfully predicts the evolution of these variables during both the active growth phase and the period after the deprivation shock. The close agreement between predicted and observed behavior, together with the advanced understanding of the system's kinetic structure and regulatory features, provides the necessary background for the design and optimization of a biotechnological set-up for the continuous production of the ligninolytic system.
2015-11-01
American football remains one of the most popular sports for young athletes. The injuries sustained during football, especially those to the head and neck, have been a topic of intense interest recently in both the public media and medical literature. The recognition of these injuries and the potential for long-term sequelae have led some physicians to call for a reduction in the number of contact practices, a postponement of tackling until a certain age, and even a ban on high school football. This statement reviews the literature regarding injuries in football, particularly those of the head and neck, the relationship between tackling and football-related injuries, and the potential effects of limiting or delaying tackling on injury risk. Copyright © 2015 by the American Academy of Pediatrics.
Water resources of the Black Sea Basin at high spatial and temporal resolution
NASA Astrophysics Data System (ADS)
Rouholahnejad, Elham; Abbaspour, Karim C.; Srinivasan, Raghavan; Bacu, Victor; Lehmann, Anthony
2014-07-01
The pressure on water resources, deteriorating water quality, and the uncertainties associated with climate change create an environment of conflict in large and complex river systems. The Black Sea Basin (BSB), in particular, suffers from ecological unsustainability and inadequate resource management, leading to severe environmental, social, and economic problems. To better tackle future challenges, we used the Soil and Water Assessment Tool (SWAT) to model the hydrology of the BSB, coupling water quantity, water quality, and crop yield components. The hydrological model of the BSB was calibrated and validated with sensitivity and uncertainty analysis. River discharges, nitrate loads, and crop yields were used to calibrate the model. Employing grid technology improved calibration computation time by more than an order of magnitude. We calculated components of water resources such as river discharge, infiltration, aquifer recharge, soil moisture, and actual and potential evapotranspiration. Furthermore, available water resources were calculated at the subbasin spatial and monthly temporal levels. Within this framework, a comprehensive database of the BSB was created to fill the existing gaps in water resources data in the region. In this paper, we discuss the challenges of building a large-scale model in fine spatial and temporal detail. This study provides the basis for further research on the impacts of climate and land use change on water resources in the BSB.
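Calibration of such hydrological models is scored against observed series with an objective function; the abstract does not name its criterion, so the following is a hedged, generic sketch of the commonly used Nash-Sutcliffe efficiency with invented discharge values:

```python
# Nash-Sutcliffe efficiency (NSE): 1 means a perfect fit, 0 means the model
# is no better than predicting the observed mean. All values are invented.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

obs = [10.0, 12.0, 8.0, 15.0, 11.0]   # e.g. monthly river discharge (m3/s)
sim = [9.0, 13.0, 7.5, 14.0, 12.0]    # corresponding model output
print(round(nse(obs, sim), 3))         # → 0.841
```

In a calibration loop, parameter sets that raise this score are retained; validation repeats the computation on a held-out period.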
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is, however, computationally expensive and requires expert knowledge for the selection of the RTM, its parametrization, the look-up table generation, and its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has recently become a natural alternative. Emulators are statistical constructs that approximate the RTM at a fraction of the computational cost, while also providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in the emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low-density regions and flatness of the interpolation function. We illustrate the capabilities of our emulators on toy examples, on the leaf- and canopy-level PROSPECT and PROSAIL RTMs, and on the construction of an optimal look-up table for atmospheric correction based on MODTRAN5.
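AGAPE itself is not reproduced here; the sketch below only illustrates the general idea the abstract describes, assuming a squared-exponential kernel, using maximal predictive variance as a stand-in acquisition function, and substituting a toy sine function for a costly RTM run:

```python
import numpy as np

def rbf(a, b, ell=0.3):
    # squared-exponential kernel between two 1-D input arrays
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-8):
    # standard GP predictive mean and variance (prior variance rbf(x, x) = 1)
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mean, np.maximum(var, 0.0)

expensive_model = lambda x: np.sin(6 * x)   # stand-in for a costly RTM

x = np.array([0.0, 0.5, 1.0])               # initial design
y = expensive_model(x)
grid = np.linspace(0.0, 1.0, 201)           # candidate inputs
for _ in range(5):                          # active-learning loop
    _, var = gp_posterior(x, y, grid)
    x_new = grid[np.argmax(var)]            # sample where uncertainty is highest
    x = np.append(x, x_new)
    y = np.append(y, expensive_model(x_new))

mean, var = gp_posterior(x, y, grid)
print(round(float(np.max(np.abs(mean - expensive_model(grid)))), 3))
```

After only eight model runs the emulator interpolates the toy "RTM" closely, and the remaining predictive variance is small everywhere on the grid.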
Unconditional or Conditional Logistic Regression Model for Age-Matched Case-Control Data?
Kuo, Chia-Ling; Duan, Yinghui; Grady, James
2018-01-01
Matching on demographic variables is commonly used in case-control studies to adjust for confounding at the design stage. There is a presumption that matched data need to be analyzed by matched methods, and conditional logistic regression has become the standard for matched case-control data because it tackles the sparse data problem. The sparse data problem, however, may not be a concern for loose-matching data, where the matching between cases and controls is not unique and a case could have been matched to other controls without substantially changing the association. Data matched on a few demographic variables are clearly loose-matching data, and we hypothesize that unconditional logistic regression is an appropriate method of analysis. To address the hypothesis, we compare unconditional and conditional logistic regression models in terms of precision of estimates and hypothesis testing, using simulated matched case-control data. Our results support the hypothesis; however, the unconditional model is not as robust as the conditional model to the matching distortion whereby the matching process makes cases and controls similar not only on the matching variables but also on exposure status. When the study design involves other complex features or the computational burden is high, matching in loose-matching data can be ignored, with negligible loss in testing and estimation, provided the distributions of the matching variables are not extremely different between cases and controls.
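For 1:1 matched pairs with a binary exposure, the conditional-logistic estimate of the log odds ratio has a well-known closed form: the log of the ratio of the two discordant-pair counts. A small simulation (all parameter values invented) shows it recovering a true effect even when exposure depends on the matching variable:

```python
import math, random

random.seed(42)
expit = lambda z: 1.0 / (1.0 + math.exp(-z))
BETA = 1.0                       # true log odds ratio for the exposure

n10 = n01 = 0                    # discordant-pair counts
for _ in range(20000):           # one age-matched case-control pair per loop
    age = random.gauss(50, 10)
    p_control = expit(-3.0 + 0.04 * age)        # exposure odds depend on age...
    p_case = expit(-3.0 + 0.04 * age + BETA)    # ...and on case status
    case_exposed = random.random() < p_case
    control_exposed = random.random() < p_control
    if case_exposed and not control_exposed:
        n10 += 1
    elif control_exposed and not case_exposed:
        n01 += 1

# For 1:1 pairs the conditional-logistic MLE has the closed form log(n10/n01),
# because P(case exposed, control not) / P(control exposed, case not)
# equals the exposure odds ratio regardless of the matched age.
beta_hat = math.log(n10 / n01)
print(round(beta_hat, 2))
```

The estimate lands close to the true log odds ratio of 1.0; an unconditional fit on the pooled data with age as a covariate is what the paper compares this against.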
Evidence-based ergonomics: a model and conceptual structure proposal.
Silveira, Dierci Marcio
2012-01-01
In Human Factors and Ergonomics Science (HFES), it is difficult to identify the best approach to the workplace and systems design problems that need to be solved, and the question of "How do we solve the human factors and ergonomics problems that are identified?" has also been advocated as a transdisciplinary and multidisciplinary issue. The proposition of this study is to combine the theoretical approach of Sustainability Science, the taxonomy of the Human Factors and Ergonomics (HFE) discipline, and the framework of Evidence-Based Medicine, in an attempt to apply them to Human Factors and Ergonomics. Applications of ontologies are known in the fields of medical research and computer science. By scrutinizing the key requirements for the structuring of HFES knowledge, a reference model was designed. First, the important requirements for HFES concept structuring, as regarded by Meister, were identified. Second, an evidence-based ergonomics framework was developed as a reference model composed of six levels based on these requirements. Third, a mapping tool was devised, using linguistic resources, to translate human work, systems environments, and the complexities inherent in their hierarchical relationships, to support future development at Level 2 of the reference model and to meet the two major challenges for HFES: identifying which problems should be addressed in HFE as an autonomous science, and proposing solutions by integrating the concepts and methods applied in HFES to those problems.
Current Romanian research in post-Newtonian dynamics
NASA Astrophysics Data System (ADS)
Mioc, V.; Stavinschi, M.
2007-05-01
We survey the recent Romanian results in the study of the two-body problem in post-Newtonian fields. Such a field is characterized, in general, by a potential of the form U(q) = |q|^{-1} + a correction term (typically, though not necessarily, small). We distinguish several classes of post-Newtonian models: relativistic (Schwarzschild, Fock, Einstein PN, Reissner-Nordström, Schwarzschild - de Sitter, etc.) and nonrelativistic (Manev, Mücket-Treder, Seeliger, gravito-elastic, etc.). Generalized models (the zonal-satellite problem, quasihomogeneous fields), as well as special cases (anisotropic Manev-type and Schwarzschild-type models, the Popovici and Popovici-Manev photogravitational problems), were also tackled. The methods used in such studies are various: analytical (mainly perturbation theory, but also functions of a complex variable, variational calculus, etc.), geometric (the qualitative approach of dynamical systems theory), and numerical (especially the Poincaré-section technique). The areas of interest and the general results obtained focus on: exact or approximate analytical solutions; characteristics of local flows (especially at limit situations: collision and escape); quasiperiodic and periodic orbits; equilibria; symmetries; chaoticity; and the geometric description of the global flow (with a physical interpretation of the phase-space structure). We emphasize some special features that cannot be met within the Newtonian framework: the black-hole effect, oscillatory collisions, radial librations, bounded orbits for nonnegative energy, unstable circular motion (or unstable rest), symmetric periodic orbits within anisotropic models, etc.
Fagan, William F; Lutscher, Frithjof
2006-04-01
Spatially explicit models for populations are often difficult to tackle mathematically and, in addition, require detailed data on individual movement behavior that are not easily obtained. An approximation known as the "average dispersal success" provides a tool for converting complex models, which may include stage structure and a mechanistic description of dispersal, into a simple matrix model. This simpler matrix model has two key advantages. First, it is easier to parameterize from the types of empirical data typically available to conservation biologists, such as survivorship, fecundity, and the fraction of juveniles produced in a study area that also recruit within the study area. Second, it is more amenable to theoretical investigation. Here, we use the average dispersal success approximation to develop estimates of the critical reserve size for systems comprising single patches or simple metapopulations. The quantitative approach can be used for both plants and animals; however, to provide a concrete example of the technique's utility, we focus on a special case pertinent to animals. Specifically, for territorial animals, we can characterize such an estimate of minimum viable habitat area in terms of the number of home ranges that the reserve contains. Consequently, the average dispersal success approximation provides a framework through which home range size, natal dispersal distances, and metapopulation dynamics can be linked to reserve design. We briefly illustrate the approach using empirical data for the swift fox (Vulpes velox).
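The paper's actual parameterization is not given in the abstract; the following is a hypothetical two-stage sketch in which a dispersal-success factor s (the fraction of dispersing juveniles that recruit locally) discounts recruitment, and the critical value where the population growth rate lambda equals 1 is found by bisection:

```python
def leading_eigenvalue(m):
    # power iteration on a 2x2 non-negative matrix (max-norm normalization)
    v = [1.0, 1.0]
    lam = 0.0
    for _ in range(500):
        w = [m[0][0] * v[0] + m[0][1] * v[1],
             m[1][0] * v[0] + m[1][1] * v[1]]
        lam = max(abs(w[0]), abs(w[1]))
        v = [w[0] / lam, w[1] / lam]
    return lam

def matrix(s):
    # illustrative juvenile/adult model: fecundity f, juvenile survival sj,
    # adult survival sa; dispersal success s discounts juvenile recruitment
    f, sj, sa = 4.0, 0.3, 0.5
    return [[0.0, f],
            [s * sj, sa]]

# bisection for the critical dispersal success where lambda = 1
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if leading_eigenvalue(matrix(mid)) < 1.0:
        lo = mid        # population declines: dispersal success too low
    else:
        hi = mid
print(round(hi, 3))     # → 0.417
```

With these invented rates, the population persists only if at least about 42% of juveniles recruit; shrinking the reserve lowers s, which is how the approximation links reserve size to viability.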
A high-resolution global flood hazard model
NASA Astrophysics Data System (ADS)
Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul D.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.
2015-09-01
Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ˜90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ˜1 km, mean absolute error in flooded fraction falls to ˜5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.
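The validation statistics quoted above (the fraction of benchmark at-risk area captured, and false positives) can be computed from binary wet/dry masks; a minimal sketch with invented masks:

```python
# Toy 1-D "benchmark" and "model" flood masks (1 = wet). Values are invented.
benchmark = [1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1, 1]
model     = [1, 1, 0, 0, 1, 1, 1, 0, 0, 0, 1, 0]

hits   = sum(1 for b, m in zip(benchmark, model) if b == 1 and m == 1)
misses = sum(1 for b, m in zip(benchmark, model) if b == 1 and m == 0)
false_ = sum(1 for b, m in zip(benchmark, model) if b == 0 and m == 1)

hit_rate = hits / (hits + misses)       # fraction of benchmark wet area captured
false_alarm = false_ / (hits + false_)  # fraction of modelled wet area that is spurious
csi = hits / (hits + misses + false_)   # critical success index combines both
print(round(hit_rate, 3), round(false_alarm, 3), round(csi, 3))  # → 0.714 0.167 0.625
```

The paper's "two thirds to three quarters of at-risk area captured" corresponds to a hit rate in the 0.67-0.75 range on real benchmark grids.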
Specific tackling situations affect the biomechanical demands experienced by rugby union players.
Seminati, Elena; Cazzola, Dario; Preatoni, Ezio; Trewartha, Grant
2017-03-01
Tackling in Rugby Union is an open skill which can involve high-speed collisions and is the match event associated with the greatest proportion of injuries. This study aimed to analyse the biomechanics of rugby tackling under three conditions: from a stationary position, with dominant and non-dominant shoulder, and moving forward, with dominant shoulder. A specially devised contact simulator, a 50-kg punch bag instrumented with pressure sensors, was translated towards the tackler (n = 15) to evaluate the effect of laterality and tackling approach on the external loads absorbed by the tackler, on head and trunk motion, and on trunk muscle activities. Peak impact force was substantially higher in the stationary dominant (2.84 ± 0.74 kN) than in the stationary non-dominant condition (2.44 ± 0.64 kN), but lower than in the moving condition (3.40 ± 0.86 kN). Muscle activation started on average 300 ms before impact, with higher activation for impact-side trapezius and non-impact-side erector spinae and gluteus maximus muscles. Players' technique for non-dominant-side tackles was less compliant with current coaching recommendations in terms of cervical motion (more neck flexion and lateral bending in the stationary non-dominant condition) and players could benefit from specific coaching focus on non-dominant-side tackles.
Now for the Hard Stuff: Next Steps in ECB Research and Practice
ERIC Educational Resources Information Center
Preskill, Hallie
2014-01-01
Though several excellent literature reviews and research syntheses have been conducted, and thoughtful frameworks and models have been proposed, I believe it is time for the evaluation field to tackle the "hard stuff" of evaluation capacity building (ECB). This entails engaging staff in ECB activities, building the evaluation capacity of…
A Guide to Implementing Response to Intervention in Long-Term Residential Juvenile Justice Schools
ERIC Educational Resources Information Center
McDaniel, Sara; Heil, Kristen M.; Houchins, David E.; Duchaine, Ellen L.
2011-01-01
Since the passage of the Individuals with Disabilities Education Improvement Act (2004), public schools have been permitted to use a response to intervention model to address academic and social problems of students and identify students with disabilities. As the collective educational community tackles implementation of response to intervention…
Solar Car, Solar Boat: Model Classroom Projects. Seattle Tech Prep.
ERIC Educational Resources Information Center
Seattle Community Coll. District, Washington.
This booklet shows how teachers at Ingraham High School and Madison Middle School in Seattle (Washington) challenged their students to tackle demanding technical projects. It also shows how well the students responded to that challenge. The booklet begins with the background of the project, the framework for which would be a university-sponsored…
NASA Astrophysics Data System (ADS)
Cristallini, Achille
2016-07-01
A new and intriguing machine may be obtained by replacing the moving pulley of a gun tackle with a fixed point in the rope. Its most important feature is its asymptotic efficiency. Here we obtain a satisfactory description of this machine by means of vector calculus and elementary trigonometry. The mathematical model has been compared with experimental data and is briefly discussed.
Wu, Guo-sheng; Lin, Hui-hua; Zhu, He-jian; Sha, Jin-ming; Dai, Wen-yuan
2011-07-01
Based on 1988, 2000, and 2007 remote sensing images of a typical red-soil eroded region (Changting County, Fujian Province) and a digital elevation model (DEM), the eroded landscape types were mapped, and the changes in the eroded landscape pattern of the region from 1988 to 2007 were analyzed with a spatial mathematical model. In 1988-2007, the different eroded landscape types in the region transferred among one another, mainly from seriously eroded to lightly eroded types, although a small amount of transfer from lightly eroded to seriously eroded types still occurred. Little change was observed in the centroid of the eroded landscape: Hetian Town remained the erosion center of the county throughout. During the study period, the landscape pattern indices showed a tendency toward low heterogeneity, low fragmentation, and high regularization at the landscape level, but an overall improvement and expansion of lightly eroded, easy-to-tackle patches as well as partial improvement and fragmentation of seriously eroded, difficult-to-tackle patches at the patch level.
Code of Federal Regulations, 2014 CFR
2014-10-01
...: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Fishing Tackle Kits, Emergency, for Merchant Vessels § 160.061-5... FISHING TACKLE KIT OPEN ONLY FOR ACTUAL EMERGENCY USE NOT FOR INSPECTION (b) [Reserved] ...
Implementing Genome-Driven Oncology
Hyman, David M.; Taylor, Barry S.; Baselga, José
2017-01-01
Early successes in identifying and targeting individual oncogenic drivers, together with the increasing feasibility of sequencing tumor genomes, have brought forth the promise of genome-driven oncology care. As we expand the breadth and depth of genomic analyses, the biological and clinical complexity of its implementation will be unparalleled. Challenges include target credentialing and validation, implementing drug combinations, clinical trial designs, targeting tumor heterogeneity, and deploying technologies beyond DNA sequencing, among others. We review how contemporary approaches are tackling these challenges and will ultimately serve as an engine for biological discovery and increase our insight into cancer and its treatment. PMID:28187282
Simulation of acid water movement in canals
NASA Astrophysics Data System (ADS)
Van Truong, To; Tat Dac, Nguyen; Ngoc Phien, Huynh
1996-05-01
An attempt to tackle the problem of the propagation of acid water in canals is described, and a mathematical model to simulate the acid water movement is developed, in which the jurbanite equilibrium is found to prevail. The processes of settling owing to sedimentation, precipitation and redissolution have been considered in the modelling. Data available from Tan Thanh, in the Plain of Reeds of the Mekong Delta in Viet Nam, are used as a case study.
Tackling Health Inequalities in the United Kingdom: The Progress and Pitfalls of Policy
Exworthy, Mark; Blane, David; Marmot, Michael
2003-01-01
Goal: Assess the progress and pitfalls of current United Kingdom (U.K.) policies to reduce health inequalities. Objectives: (1) Describe the context that enabled health inequalities to get onto the policy agenda in the United Kingdom. (2) Categorize and assess selected current U.K. policies that may affect health inequalities. (3) Apply the "policy windows" model to understand the issues faced in formulating and implementing such policies. (4) Examine the emerging policy challenges in the U.K. and elsewhere. Data Sources: Official documents, secondary analyses, and interviews with policymakers. Study Design: Qualitative policy analysis. Data Collection: 2001–2002. The methods were divided into two stages: the first identified policies connected with individual inquiry recommendations; the second involved case studies of three policy areas thought to be crucial in tackling health inequalities. Both stages involved interviews with policymakers and documentary analysis. Principal Findings: (1) The current U.K. government has stated a commitment to reducing health inequalities. (2) The government has begun to implement policies that address the wider determinants. (3) Some progress is evident, but many indicators remain stubborn. (4) Difficulties remain in coordinating policies across government and measuring progress. (5) The "policy windows" model explains the limited extent of progress and highlights current and possible future pitfalls. (6) The U.K.'s experience has lessons for other governments involved in tackling health inequalities. Conclusions: Health inequalities are on the agenda of U.K. government policy, and steps have been made to address them. There are some signs of progress, but much remains to be done, including overcoming perverse incentives at the national level, improving joint working, ensuring appropriate measures of performance and progress, and improving monitoring arrangements. A conceptual policy model aids understanding and points to ways of sustaining and extending the recent progress and overcoming pitfalls. PMID:14727803
Technical Aspects for the Creation of a Multi-Dimensional Land Information System
NASA Astrophysics Data System (ADS)
Ioannidis, Charalabos; Potsiou, Chryssy; Soile, Sofia; Verykokou, Styliani; Mourafetis, George; Doulamis, Nikolaos
2016-06-01
The complexity of modern urban environments and civil demands for fast, reliable and affordable decision-making require not only a 3D Land Information System (LIS), which tends to replace traditional 2D LIS architectures, but also the ability to address the time and scale parameters, that is, the 3D geometry of buildings at various time instances (4th dimension) and at various levels of detail (LoDs - 5th dimension). This paper describes and proposes solutions for the technical aspects that need to be addressed in the 5D modelling pipeline. Such solutions include the creation of a 3D model, the application of a selective modelling procedure between various time instances and at various LoDs, enriched with cadastral and other spatial data, and a procedural modelling approach for the representation of the inner parts of buildings. The methodology is based on automatic change detection algorithms for spatio-temporal analysis of the changes that took place in subsequent time periods, using dense image matching and structure-from-motion algorithms. The selective modelling approach allows detailed modelling only in the areas where spatial changes are detected. The procedural modelling techniques use programming languages for the textual semantic description of a building; they require the modeller to describe its part-to-whole relationships. Finally, a 5D viewer is developed in order to tackle the limitations that accompany the use of global systems, such as Google Earth or Google Maps, as visualization software. An application of the proposed methodology in an urban area is presented and provides satisfactory results.
Computational models of epilepsy.
Stefanescu, Roxana A; Shivakeshavan, R G; Talathi, Sachin S
2012-12-01
Approximately 30% of epilepsy patients suffer from medically refractory epilepsy, in which seizures cannot be controlled by the use of anti-epileptic drugs (AEDs). Understanding the mechanisms underlying these forms of drug-resistant epileptic seizures and developing alternative effective treatment strategies are fundamental challenges for modern epilepsy research. In this context, computational modeling has gained prominence as an important tool for tackling the complexity of the epileptic phenomenon. In this review article, we present a survey of computational models of epilepsy from the point of view that epilepsy is a dynamical brain disease primarily characterized by unprovoked spontaneous epileptic seizures. We introduce key concepts from the mathematical theory of dynamical systems, such as multi-stability and bifurcations, and explain how these concepts aid our understanding of the brain mechanisms involved in the emergence of epileptic seizures. We present a literature survey of the different computational modeling approaches used in the study of epilepsy. Special emphasis is placed on highlighting the fine balance between the degree of model simplification and the extent of biological realism that modelers seek in order to address relevant questions. In this context, we discuss three specific examples from the published literature, which exemplify different approaches used for developing computational models of epilepsy. We further explore the potential of recently developed optogenetic tools to provide a novel avenue for seizure control. We conclude with a discussion of the utility of computational models for the development of new epilepsy treatment protocols. Copyright © 2012 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
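The multi-stability concept can be illustrated with the simplest bistable system; this is a generic caricature, not any specific epilepsy model from the review:

```python
# Bistability in a toy dynamical system: dx/dt = x - x**3.
# Two stable fixed points (x = -1 and x = +1) coexist, and which one the
# state settles into depends only on the initial condition -- a caricature
# of a brain state sitting in either a "normal" or a "seizure" attractor.
def settle(x0, dt=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        x += dt * (x - x ** 3)      # explicit Euler step
    return x

print(round(settle(0.2), 3), round(settle(-0.2), 3))   # → 1.0 -1.0
```

A bifurcation, in this picture, is a parameter change that destroys one of the attractors, forcing the state into the other; that is the dynamical-systems reading of seizure onset the review describes.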
A cut-cell immersed boundary technique for fire dynamics simulation
NASA Astrophysics Data System (ADS)
Vanella, Marcos; McDermott, Randall; Forney, Glenn
2015-11-01
Fire simulation around complex geometry is gaining increasing attention in the performance-based design of fire protection systems, fire-structure interaction, and pollutant transport in complex terrain, among other applications. This presentation focuses on our present effort to improve the capability of FDS (the Fire Dynamics Simulator developed at the Fire Research Division, NIST; https://github.com/firemodels/fds-smv) to represent fire scenarios around complex bodies. Velocities in the vicinity of the bodies are reconstructed using a classical immersed boundary scheme (Fadlun and co-workers, J. Comput. Phys., 161:35-60, 2000). A conservative treatment of the scalar transport equations (i.e. for chemical species) will also be presented. In our method, discrete conservation and no penetration of species across solid boundaries are enforced using a cut-cell finite volume scheme. The small-cell problem inherent to the method is tackled using explicit-implicit domain decomposition for scalars within the FDS time integration scheme. Some details on the derivation, implementation, and numerical tests of this scheme will be discussed.
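The discrete-conservation property the scheme enforces can be illustrated in one dimension with a plain upwind finite-volume advection step (a toy periodic setup, not the FDS cut-cell scheme):

```python
# Minimal conservative finite-volume sketch: 1-D advection with upwind
# face fluxes on a periodic grid. Total "species mass" is conserved to
# round-off because every flux leaving one cell enters its neighbour --
# the same bookkeeping a cut-cell scheme enforces at immersed boundaries.
n, u, dx, dt = 50, 1.0, 1.0 / 50, 0.5 * (1.0 / 50)     # CFL number = 0.5
phi = [1.0 if 10 <= i < 20 else 0.0 for i in range(n)]  # initial species field
mass0 = sum(phi) * dx

for _ in range(200):
    flux = [u * phi[i - 1] for i in range(n)]           # upwind face fluxes (u > 0)
    phi = [phi[i] - dt / dx * (flux[(i + 1) % n] - flux[i]) for i in range(n)]

print(abs(sum(phi) * dx - mass0) < 1e-12)               # → True
```

Upwinding at this CFL number is also monotone, so the species field stays bounded in [0, 1]; the small-cell problem arises when a cut cell's volume shrinks and the same explicit step would violate the stability limit, motivating the explicit-implicit treatment.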
NASA Astrophysics Data System (ADS)
Ferreira, Maria Teodora; Follmann, Rosangela; Domingues, Margarete O.; Macau, Elbert E. N.; Kiss, István Z.
2017-08-01
Phase synchronization may emerge from mutually interacting non-linear oscillators, even under weak coupling, when phase differences are bounded while amplitudes remain uncorrelated. However, the detection of this phenomenon can be a challenging problem to tackle. In this work, we apply the Discrete Complex Wavelet Approach (DCWA) for phase assignment, considering signals from coupled chaotic systems and experimental data. The DCWA is based on the Dual-Tree Complex Wavelet Transform (DT-CWT), which is a discrete transformation. Due to its multi-scale properties in the context of phase characterization, it is possible to obtain very good results from scalar time series, even for non-phase-coherent chaotic systems, without state space reconstruction or pre-processing. The method correctly predicts the phase synchronization for a chemical experiment with three locally coupled, non-phase-coherent chaotic processes. The impact of different time-scales on the synchronization process is demonstrated, which outlines the advantages of DCWA for the analysis of experimental data.
Sorting of Streptomyces Cell Pellets Using a Complex Object Parametric Analyzer and Sorter
Petrus, Marloes L. C.; van Veluw, G. Jerre; Wösten, Han A. B.; Claessen, Dennis
2014-01-01
Streptomycetes are filamentous soil bacteria that are used in industry for the production of enzymes and antibiotics. When grown in bioreactors, these organisms form networks of interconnected hyphae, known as pellets, which are heterogeneous in size. Here we describe a method to analyze and sort mycelial pellets using a Complex Object Parametric Analyzer and Sorter (COPAS). Detailed instructions are given for the use of the instrument and the basic statistical analysis of the data. We furthermore describe how pellets can be sorted according to user-defined settings, which enables downstream processing such as the analysis of the RNA or protein content. Using this methodology the mechanism underlying heterogeneous growth can be tackled. This will be instrumental for improving streptomycetes as a cell factory, considering the fact that productivity correlates with pellet size. PMID:24561666
NASA Astrophysics Data System (ADS)
Min, Byungjoon
2018-01-01
Identifying the most influential spreaders is one of the outstanding problems in the physics of complex systems. So far, many approaches have attempted to rank the influence of nodes, but they still lack the accuracy to single out influential spreaders. Here, we directly tackle the problem of finding important spreaders by solving analytically for the expected size of epidemic outbreaks when spreading originates from a single seed. We derive and validate a theory for calculating the size of epidemic outbreaks with a single seed using a message-passing approach. In addition, we find that the probability that an epidemic outbreak occurs depends strongly on the location of the seed, but the size of an outbreak, once it occurs, is insensitive to the seed. We also show that our approach can be successfully adapted to weighted networks.
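The seed-dependent outbreak statistics described above can also be approximated numerically. Below is a minimal Monte Carlo sketch (a bond-percolation style SIR simulation on a hypothetical toy network, not the authors' analytic message-passing theory) that estimates the mean outbreak size for every choice of seed:

```python
import random

def outbreak(graph, seed, p, rng):
    """One SIR-like outbreak grown from a single seed: each edge
    transmits independently with probability p."""
    infected = {seed}
    frontier = [seed]
    while frontier:
        node = frontier.pop()
        for nbr in graph[node]:
            if nbr not in infected and rng.random() < p:
                infected.add(nbr)
                frontier.append(nbr)
    return infected

def mean_outbreak_sizes(graph, p, trials=2000, rng=None):
    """Estimate the expected outbreak size for every possible seed node."""
    rng = rng or random.Random(0)
    return {s: sum(len(outbreak(graph, s, p, rng)) for _ in range(trials)) / trials
            for s in graph}

# Toy undirected network: node 0 is a hub, node 4 a peripheral leaf.
graph = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}
stats = mean_outbreak_sizes(graph, p=0.5)
```

On this toy graph the hub seed yields a larger mean outbreak than the leaf seed, illustrating how seed location shapes outbreak statistics.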
Puspitasari, Hanni P.; Aslani, Parisa; Krass, Ines
2014-01-01
Background As primary healthcare professionals, community pharmacists have both opportunity and potential to contribute to the prevention and progression of chronic diseases. Using cardiovascular disease (CVD) as a case study, we explored factors that influence community pharmacists’ everyday practice in this area. We also propose a model to best illustrate relationships between influencing factors and the scope of community pharmacy practice in the care of clients with established CVD. Methods In-depth, semi-structured interviews were conducted with 21 community pharmacists in New South Wales, Australia. All interviews were audio-recorded, transcribed verbatim, and analysed using a “grounded-theory” approach. Results Our model shows that community pharmacists work within a complex system and their practice is influenced by interactions between three main domains: the “people” factors, including their own attitudes and beliefs as well as those of clients and doctors; the “environment” within and beyond the control of community pharmacy; and outcomes of their professional care. Despite the complexity of factors and interactions, our findings shed some light on the interrelationships between these various influences. The overarching obstacle to maximizing the community pharmacists’ contribution is the lack of integration within health systems. However, achieving better integration of community pharmacists in primary care is a challenge since the systems of remuneration for healthcare professional services do not currently support this integration. Conclusion Tackling chronic diseases such as CVD requires mobilization of all sources of support in the community through innovative policies which facilitate inter-professional collaboration and team care to achieve the best possible healthcare outcomes for society. PMID:25409194
CMU DeepLens: deep learning for automatic image-based galaxy-galaxy strong lens finding
NASA Astrophysics Data System (ADS)
Lanusse, François; Ma, Quanbin; Li, Nan; Collett, Thomas E.; Li, Chun-Liang; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Póczos, Barnabás
2018-01-01
Galaxy-scale strong gravitational lensing can not only provide a valuable probe of the dark matter distribution of massive galaxies, but also yield valuable cosmological constraints, either by studying the population of strong lenses or by measuring time delays in lensed quasars. Due to the rarity of galaxy-scale strongly lensed systems, fast and reliable automated lens finding methods will be essential in the era of large surveys such as the Large Synoptic Survey Telescope, Euclid and the Wide-Field Infrared Survey Telescope. To tackle this challenge, we introduce CMU DeepLens, a new fully automated galaxy-galaxy lens finding method based on deep learning. This supervised machine learning approach does not require any tuning after the training step, which only requires realistic image simulations of strongly lensed systems. We train and validate our model on a set of 20 000 LSST-like mock observations including a range of lensed systems of various sizes and signal-to-noise ratios (S/N). We find on our simulated data set that for a rejection rate of non-lenses of 99 per cent, a completeness of 90 per cent can be achieved for lenses with Einstein radii larger than 1.4 arcsec and S/N larger than 20 on individual g-band LSST exposures. Finally, we emphasize the importance of realistically complex simulations for training such machine learning methods by demonstrating that the performance of models of significantly different complexities cannot be distinguished on simpler simulations. We make our code publicly available at https://github.com/McWilliamsCenter/CMUDeepLens.
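The quoted operating point (99 per cent rejection of non-lenses at 90 per cent completeness) amounts to choosing a score threshold on the classifier output. A hedged sketch of that bookkeeping with synthetic scores (not the CMU DeepLens code or data; the score distributions are invented):

```python
import random

def completeness_at_rejection(lens_scores, nonlens_scores, rejection=0.99):
    """Pick the score threshold that rejects the given fraction of
    non-lenses, then report the fraction of true lenses kept."""
    ranked = sorted(nonlens_scores)
    k = min(int(rejection * len(ranked)), len(ranked) - 1)
    threshold = ranked[k]
    kept = sum(1 for s in lens_scores if s > threshold)
    return kept / len(lens_scores), threshold

rng = random.Random(1)
# Synthetic classifier scores (illustrative only): non-lenses score low,
# lenses score high, with some overlap between the two populations.
nonlens = [rng.random() * 0.6 for _ in range(1000)]
lens = [0.4 + rng.random() * 0.6 for _ in range(1000)]
comp, thr = completeness_at_rejection(lens, nonlens)
```

With heavily overlapping synthetic distributions the completeness at 99 per cent rejection is well below the paper's 90 per cent, which is why realistic, well-separated classifier scores matter.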
Subgrid Combustion Modeling for the Next Generation National Combustion Code
NASA Technical Reports Server (NTRS)
Menon, Suresh; Sankaran, Vaidyanathan; Stone, Christopher
2003-01-01
In the first year of this research, a subgrid turbulent mixing and combustion methodology developed earlier at Georgia Tech has been provided to researchers at NASA/GRC for incorporation into the next generation National Combustion Code (called NCCLES hereafter). A key feature of this approach is that scalar mixing and combustion processes are simulated within the LES grid using a stochastic 1D model. The subgrid simulation approach recovers locally molecular diffusion and reaction kinetics exactly without requiring closure and thus provides an attractive feature for simulating complex, highly turbulent reacting flows of interest. Data acquisition algorithms and statistical analysis strategies and routines to analyze NCCLES results have also been provided to NASA/GRC. The overall goal of this research is to systematically develop and implement LES capability into the current NCC. For this purpose, issues regarding initialization and running LES are also addressed in the collaborative effort. In parallel to this technology transfer effort (which is continuously ongoing), research has also been underway at Georgia Tech to enhance the LES capability to tackle more complex flows. In particular, the subgrid scalar mixing and combustion method has been evaluated in three distinctly different flow fields in order to demonstrate its generality: (a) flame-turbulence interactions using premixed combustion, (b) spatially evolving supersonic mixing layers, and (c) temporal single and two-phase mixing layers. The configurations chosen are such that they can be implemented in NCCLES and used to evaluate the ability of the new code. Future development and validation will be in spray combustion in gas turbine engines and supersonic scalar mixing.
Puspitasari, Hanni P; Aslani, Parisa; Krass, Ines
2014-01-01
As primary healthcare professionals, community pharmacists have both opportunity and potential to contribute to the prevention and progression of chronic diseases. Using cardiovascular disease (CVD) as a case study, we explored factors that influence community pharmacists' everyday practice in this area. We also propose a model to best illustrate relationships between influencing factors and the scope of community pharmacy practice in the care of clients with established CVD. In-depth, semi-structured interviews were conducted with 21 community pharmacists in New South Wales, Australia. All interviews were audio-recorded, transcribed verbatim, and analysed using a "grounded-theory" approach. Our model shows that community pharmacists work within a complex system and their practice is influenced by interactions between three main domains: the "people" factors, including their own attitudes and beliefs as well as those of clients and doctors; the "environment" within and beyond the control of community pharmacy; and outcomes of their professional care. Despite the complexity of factors and interactions, our findings shed some light on the interrelationships between these various influences. The overarching obstacle to maximizing the community pharmacists' contribution is the lack of integration within health systems. However, achieving better integration of community pharmacists in primary care is a challenge since the systems of remuneration for healthcare professional services do not currently support this integration. Tackling chronic diseases such as CVD requires mobilization of all sources of support in the community through innovative policies which facilitate inter-professional collaboration and team care to achieve the best possible healthcare outcomes for society.
Using Predictive Analytics to Predict Power Outages from Severe Weather
NASA Astrophysics Data System (ADS)
Wanik, D. W.; Anagnostou, E. N.; Hartman, B.; Frediani, M. E.; Astitha, M.
2015-12-01
The distribution of reliable power is essential to businesses, public services, and our daily lives. With the growing abundance of data being collected and created by industry (e.g. outage data), government agencies (e.g. land cover), and academia (e.g. weather forecasts), we can begin to tackle problems that previously seemed too complex to solve. In this session, we will present newly developed tools to aid decision-support challenges at electric distribution utilities that must mitigate, prepare for, respond to and recover from severe weather. We will show a performance evaluation of outage predictive models built for Eversource Energy (formerly Connecticut Light & Power) for storms of all types (i.e. blizzards, thunderstorms and hurricanes) and magnitudes (from 20 to >15,000 outages). High resolution weather simulations (simulated with the Weather and Research Forecast Model) were joined with utility outage data to calibrate four types of models: a decision tree (DT), random forest (RF), boosted gradient tree (BT) and an ensemble (ENS) decision tree regression that combined predictions from DT, RF and BT. The study shows that the ENS model forced with weather, infrastructure and land cover data was superior to the other models we evaluated, especially in terms of predicting the spatial distribution of outages. This research has the potential to be used for other critical infrastructure systems (such as telecommunications, drinking water and gas distribution networks), and can be readily expanded to the entire New England region to facilitate better planning and coordination among decision-makers when severe weather strikes.
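The ENS model combines the predictions of the three tree-based regressors. A minimal sketch of such prediction averaging, with trivial stand-in base models (all functions and numbers below are hypothetical illustrations, not Eversource's calibrated models):

```python
def ensemble_predict(models, x):
    """Average the predictions of several base regressors; a stand-in
    for the ENS model that combined DT, RF and BT outage predictions."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

# Hypothetical base models mapping peak wind gust (m/s) to outage counts.
dt = lambda gust: 12.0 * gust          # stand-in for the decision tree
rf = lambda gust: 10.0 * gust + 30.0   # stand-in for the random forest
bt = lambda gust: 11.0 * gust + 10.0   # stand-in for the boosted tree

pred = ensemble_predict([dt, rf, bt], 25.0)  # forecast for a 25 m/s gust event
```

Averaging tends to cancel the idiosyncratic errors of the individual learners, which is one plausible reason the ENS model outperformed its components.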
NASA Astrophysics Data System (ADS)
Thomas, Stephanie Margarete; Beierkuhnlein, Carl
2013-05-01
The occurrence of ectotherm disease vectors outside of their previous distribution area and the emergence of vector-borne diseases can be increasingly observed at a global scale and are accompanied by a growing number of studies which investigate the vast range of determining factors and their causal links. Consequently, a broad span of scientific disciplines is involved in tackling these complex phenomena. First, we evaluate the citation behaviour of relevant scientific literature in order to clarify the question "do scientists consider results of other disciplines to extend their expertise?" We then highlight emerging tools and concepts useful for risk assessment. Correlative models (regression-based, machine-learning and profile techniques), mechanistic models (basic reproduction number R0) and methods of spatial regression, interaction and interpolation are described. We discuss further steps towards multidisciplinary approaches regarding new tools and emerging concepts to combine existing approaches such as Bayesian geostatistical modelling, mechanistic models which avoid the need for parameter fitting, joined correlative and mechanistic models, multi-criteria decision analysis and geographic profiling. We take the quality of both occurrence data for vector, host and disease cases, and data of the predictor variables into consideration as both determine the accuracy of risk area identification. Finally, we underline the importance of multidisciplinary research approaches. Even if the establishment of communication networks between scientific disciplines and the sharing of specific methods is time-consuming, it promises new insights for the surveillance and control of vector-borne diseases worldwide.
NASA Astrophysics Data System (ADS)
Erickson, M.; Olaguer, J.; Wijesinghe, A.; Colvin, J.; Neish, B.; Williams, J.
2014-12-01
It is becoming increasingly important to understand the emissions and health effects of industrial facilities. Many areas have no or limited sustained monitoring capabilities, making it difficult to quantify the major pollution sources affecting human health, especially in fence line communities. Developments in real-time monitoring and micro-scale modeling offer unique ways to tackle these complex issues. This presentation will demonstrate the capability of coupling real-time observations with micro-scale modeling to provide real-time information and near real-time source attribution. The Houston Advanced Research Center constructed the Mobile Acquisition of Real-time Concentrations (MARC) laboratory. MARC consists of a Ford E-350 passenger van outfitted with a Proton Transfer Reaction Mass Spectrometer (PTR-MS) and meteorological equipment. This allows for the fast measurement of various VOCs important to air quality. The data recorded from the van is uploaded to an off-site database and the information is broadcast to a website in real-time. This provides for off-site monitoring of MARC's observations, which allows off-site personnel to provide immediate input to the MARC operators on how to best achieve project objectives. The information stored in the database can also be used to provide near real-time source attribution. An inverse model has been used to ascertain the amount, location, and timing of emissions based on MARC measurements in the vicinity of industrial sites. The inverse model is based on a 3D micro-scale Eulerian forward and adjoint air quality model known as the HARC model. The HARC model uses output from the Quick Urban and Industrial Complex (QUIC) wind model and requires a 3D digital model of the monitored facility based on lidar or industrial permit data. MARC is one of the instrument platforms deployed during the 2014 Benzene and other Toxics Exposure Study (BEE-TEX) in Houston, TX. The main goal of the study is to quantify and explain the origin of ambient exposure to hazardous air pollutants in an industrial fence line community near the Houston Ship Channel. Preliminary results derived from analysis of MARC observations during the BEE-TEX experiment will be presented.
3D and 4D Bioprinting of the Myocardium: Current Approaches, Challenges, and Future Prospects
Ong, Chin Siang; Nam, Lucy; Ong, Kingsfield; Krishnan, Aravind; Huang, Chen Yu; Fukunishi, Takuma
2018-01-01
3D and 4D bioprinting of the heart are exciting notions in the modern era. However, myocardial bioprinting has proven to be challenging. This review outlines the methods, materials, cell types, issues, challenges, and future prospects in myocardial bioprinting. Advances in 3D bioprinting technology have significantly improved the manufacturing process. While scaffolds have traditionally been utilized, 3D bioprinters that do not require scaffolds are increasingly being employed. Improved understanding of the cardiac cellular composition and multiple strategies to tackle the issues of vascularization and viability have led to progress in this field. In vivo studies utilizing small animal models have been promising. 4D bioprinting is a new concept that has potential to advance the field of 3D bioprinting further by incorporating the fourth dimension of time. Clinical translation will require multidisciplinary collaboration to tackle the pertinent issues facing this field. PMID:29850546
Adoption of multivariate copulae in prognostication of economic growth by means of interest rate
NASA Astrophysics Data System (ADS)
Saputra, Dewi Tanasia; Indratno, Sapto Wahyu, Dr.
2015-12-01
Inflation, at a healthy rate, is a sign of a growing economy. Nonetheless, when the inflation rate grows uncontrollably, it negatively influences economic growth. Many tackle this problem by increasing the interest rate to help protect the value of money eroded by inflation. Few, however, study the effects of the interest rate on economic growth. The main purposes of this paper are to find how changes in the interest rate affect economic growth and to use this relationship in the prognostication of economic growth. Using an expenditure model, a linear relationship between economic growth and the interest rate is developed. The result is then used for prediction with a normal copula and a Vine Archimedean copula. It is shown that increasing the interest rate to tackle inflation is a poor solution, whereas applying copulas to predict economic growth yields accurate results, with no more than a 0.5% difference.
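A Gaussian (normal) copula of the kind used for prediction here can be sketched with the standard library alone, since `statistics.NormalDist` supplies the normal CDF. This is an illustrative sketch (the correlation value and its interest-rate/growth interpretation are hypothetical, and the Vine Archimedean construction is omitted):

```python
import math
import random
from statistics import NormalDist

def gaussian_copula_pairs(rho, n, rng=None):
    """Draw n pairs of uniforms whose dependence follows a bivariate
    Gaussian copula with correlation rho."""
    rng = rng or random.Random(0)
    nd = NormalDist()
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        # Map correlated normals to uniform margins via the normal CDF.
        pairs.append((nd.cdf(z1), nd.cdf(z2)))
    return pairs

# Illustrative negative dependence: higher interest rate, lower growth.
pairs = gaussian_copula_pairs(rho=-0.7, n=5000)
```

The uniform margins can then be transformed to any fitted marginal distributions for the interest rate and growth, which is the usual way copulas separate dependence from marginal behaviour.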
Robust optimization based energy dispatch in smart grids considering demand uncertainty
NASA Astrophysics Data System (ADS)
Nassourou, M.; Puig, V.; Blesa, J.
2017-01-01
In this study we discuss the application of robust optimization to the problem of economic energy dispatch in smart grids. Robust optimization based MPC strategies for tackling uncertain load demands are developed. Unexpected additive disturbances are modelled by defining an affine dependence between the control inputs and the uncertain load demands. The developed strategies were applied to a hybrid power system connected to an electrical power grid. Furthermore, to demonstrate the superiority of the standard Economic MPC over MPC tracking, a comparison (e.g. average daily cost) between standard MPC tracking, standard Economic MPC, and the integration of both in one-layer and two-layer approaches was carried out. The goal of this research is to design a controller based on Economic MPC strategies that tackles uncertainties, in order to minimise economic costs and guarantee service reliability of the system.
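The core robust idea, planning against the worst realization of an uncertain demand, can be shown in a deliberately simplified single-step sketch (a brute-force worst-case cost minimization over a demand interval, not the authors' affine-feedback MPC formulation; all numbers are hypothetical):

```python
def robust_dispatch(levels, demand_lo, demand_hi, unit_cost, shortfall_penalty):
    """Choose the generation setpoint minimising the worst-case cost over
    an uncertain demand interval [demand_lo, demand_hi]."""
    def worst_case_cost(g):
        # For this cost structure the worst case sits at an interval endpoint.
        costs = []
        for d in (demand_lo, demand_hi):
            shortfall = max(0.0, d - g)
            costs.append(unit_cost * g + shortfall_penalty * shortfall)
        return max(costs)
    return min(levels, key=worst_case_cost)

levels = [i * 10.0 for i in range(0, 21)]            # candidate setpoints (MW)
g = robust_dispatch(levels, 80.0, 120.0, 5.0, 50.0)  # demand in [80, 120] MW
```

Because the shortfall penalty dominates the generation cost here, the robust choice covers the upper end of the demand interval; a receding-horizon controller would repeat such a decision at every step.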
Evaluating Variability and Uncertainty of Geological Strength Index at a Specific Site
NASA Astrophysics Data System (ADS)
Wang, Yu; Aladejare, Adeyemi Emman
2016-09-01
Geological Strength Index (GSI) is an important parameter for estimating rock mass properties. GSI can be estimated from a quantitative GSI chart, as an alternative to the direct observational method, which requires vast geological experience with rock. The GSI chart was developed from past observations and engineering experience, with either empiricism or some theoretical simplifications. The GSI chart thereby contains model uncertainty which arises from its development. The presence of such model uncertainty affects the GSI estimated from the GSI chart at a specific site; it is, therefore, imperative to quantify and incorporate the model uncertainty during GSI estimation from the GSI chart. A major challenge for quantifying the GSI chart model uncertainty is the lack of the original datasets that were used to develop the GSI chart, since the chart was developed from past experience without referring to specific datasets. This paper tackles this problem by developing a Bayesian approach for quantifying the model uncertainty in the GSI chart when using it to estimate GSI at a specific site. The model uncertainty in the GSI chart and the inherent spatial variability in GSI are modeled explicitly in the Bayesian approach. The Bayesian approach generates equivalent samples of GSI from the integrated knowledge of the GSI chart, prior knowledge and observation data available from site investigation. Equations are derived for the Bayesian approach, and the proposed approach is illustrated using data from a drill and blast tunnel project. The proposed approach effectively tackles the problem of how to quantify the model uncertainty that arises from using the GSI chart for characterization of site-specific GSI in a transparent manner.
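The idea of generating "equivalent samples" from prior knowledge plus site observations can be illustrated with a generic random-walk Metropolis sampler. This is a hedged sketch under simplified assumptions (a normal prior and a normal measurement-error model for a single site-mean GSI; the readings and hyperparameters are hypothetical, not the authors' formulation):

```python
import math
import random

def metropolis_mean(data, prior_mu, prior_sd, noise_sd, n_samples=5000, rng=None):
    """Draw posterior samples of a mean under a normal prior and a normal
    measurement-error model, via random-walk Metropolis."""
    rng = rng or random.Random(0)
    def log_post(mu):
        lp = -0.5 * ((mu - prior_mu) / prior_sd) ** 2
        lp += sum(-0.5 * ((x - mu) / noise_sd) ** 2 for x in data)
        return lp
    mu, samples = prior_mu, []
    for _ in range(n_samples):
        prop = mu + rng.gauss(0.0, 2.0)
        if math.log(rng.random()) < log_post(prop) - log_post(mu):
            mu = prop
        samples.append(mu)
    return samples

# Hypothetical chart-based GSI readings at one site.
gsi_obs = [55.0, 60.0, 58.0, 62.0]
samples = metropolis_mean(gsi_obs, prior_mu=50.0, prior_sd=10.0, noise_sd=5.0)
```

After burn-in, the samples concentrate between the prior mean and the data mean, weighted by their respective precisions, which is the sense in which the observations update the chart-based prior.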
Burger, Nicholas; Lambert, Mike Ian; Viljoen, Wayne; Brown, James Craig; Readhead, Clint; den Hollander, Steve; Hendricks, Sharief
2017-02-01
The majority of injuries in rugby union occur during tackle events. The mechanisms and causes of these injuries are well established in senior rugby union. To use information from an injury database and assess video footage of tackle-related injuries in youth rugby union matches to identify environmental factors and mechanisms that are potentially confounding to these injuries. Descriptive epidemiological study. Injury surveillance was conducted at the under-18 Craven Week rugby tournament. Tackle-related injury information was used to identify injury events in match video footage (role-matched noninjury tackle events were identified for the cohort of injured players). Events were coded using match situational variables (precontact, contact, and postcontact). Relative risk ratio (RRR; ratio of probability of an injury or noninjury outcome occurring when a characteristic was observed) was reported by use of logistic regression. In comparison with the first quarter, injury risk was greater in the third (RRR = 9.75 [95% CI, 1.71-55.64]; P = .010) and fourth quarters (RRR = 6.97 [95% CI, 1.09-44.57]; P = .040) for ball carriers and in the fourth quarter (RRR = 9.63 [95% CI, 1.94-47.79]; P = .006) for tacklers. Ball carriers were less likely to be injured when they were aware of impending contact (RRR = 0.14 [95% CI, 0.03-0.66]; P = .012) or when they executed a moderate fend (hand-off) (RRR = 0.22 [95% CI, 0.06-0.84]; P = .026). Tacklers were less likely to be injured when performing shoulder tackles (same side as leading leg) in comparison to an arm-only tackle (RRR = 0.02 [95% CI, 0.001-0.79]; P = .037). Ball carriers (RRR = 0.09 [95% CI, 0.01-0.89]; P = .040) and tacklers (RRR = 0.02 [95% CI, 0.001-0.32]; P = .006) were less likely to be injured when initial contact was made with the tackler's shoulder/arm instead of his head/neck. The relative risk of tackle-related injury was higher toward the end of matches. Incorrect technique may contribute to increased injury risk. Implementing recovery strategies between matches, training safe and effective techniques, and improving levels of conditioning may counter the negative effects of fatigue. These findings may assist stakeholders in youth rugby to formulate injury prevention strategies and may improve the preparation of field-side medical staff for managing tackle-related injuries at these or similar tournaments.
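The RRRs above come from logistic regression models. As a much simpler, hedged illustration of the underlying quantity, the sketch below computes an unadjusted risk ratio from hypothetical quarter-by-quarter counts (the numbers are invented, not the study's data):

```python
def relative_risk(events_a, total_a, events_b, total_b):
    """Risk of the outcome in group A relative to group B (unadjusted)."""
    return (events_a / total_a) / (events_b / total_b)

# Hypothetical injury counts per observed tackle event, by match quarter.
rr = relative_risk(events_a=14, total_a=200,   # fourth quarter
                   events_b=2, total_b=200)    # first quarter
```

A regression-based RRR plays the same comparative role while adjusting for other coded tackle characteristics, which a raw ratio like this cannot do.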
NASA Astrophysics Data System (ADS)
Salomone, Horacio D.; Olivieri, Néstor A.; Véliz, Maximiliano E.; Raviola, Lisandro A.
2018-05-01
In the context of fluid mechanics courses, it is customary to consider the problem of a sphere falling under the action of gravity inside a viscous fluid. Under suitable assumptions, this phenomenon can be modelled using Stokes’ law and is routinely reproduced in teaching laboratories to determine terminal velocities and fluid viscosities. In many cases, however, the measured physical quantities show important deviations with respect to the predictions deduced from the simple Stokes’ model, and the causes of these apparent ‘anomalies’ (for example, whether the flow is laminar or turbulent) are seldom discussed in the classroom. On the other hand, there are various variable-mass problems that students tackle during elementary mechanics courses and which are discussed in many textbooks. In this work, we combine both kinds of problems and analyse—both theoretically and experimentally—the evolution of a system composed of a sphere pulled by a chain of variable length inside a tube filled with water. We investigate the effects of different forces acting on the system such as weight, buoyancy, viscous friction and drag force. By means of a sequence of mathematical models of increasing complexity, we obtain a progressive fit that accounts for the experimental data. The contrast between the various models exposes the strengths and weaknesses of each one. The proposed experience can be useful for integrating concepts of elementary mechanics and fluids, and is suitable as a laboratory practice, stressing the importance of the experimental validation of theoretical models and showing the model-building processes in a didactic framework.
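The Stokes'-law prediction reduces to a one-line formula, and the 'anomalies' mentioned above can be anticipated by checking the Reynolds number. A minimal sketch (the sphere and fluid properties are illustrative values, not the paper's apparatus):

```python
def stokes_terminal_velocity(radius, rho_sphere, rho_fluid, mu, g=9.81):
    """Terminal velocity under Stokes' law:
    v = 2 r^2 g (rho_s - rho_f) / (9 mu)."""
    return 2.0 * radius ** 2 * g * (rho_sphere - rho_fluid) / (9.0 * mu)

def reynolds_number(v, radius, rho_fluid, mu):
    """Re = rho_f * v * (2 r) / mu; Stokes' law assumes Re << 1."""
    return rho_fluid * v * 2.0 * radius / mu

# A 1 mm diameter steel sphere in water (illustrative SI values).
v = stokes_terminal_velocity(5e-4, 7800.0, 1000.0, 1.0e-3)
re = reynolds_number(v, 5e-4, 1000.0, 1.0e-3)
# For this example Re comes out far above 1, so the creeping-flow
# assumption fails; this is exactly the kind of deviation from the
# simple Stokes model that the abstract discusses.
```

Checking Re against the Stokes prediction is a quick classroom test of whether the laminar model can be trusted before fitting data to it.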
Simone, Joanne; Hoyt, Mary Jo; Storm, Deborah S; Finocchario-Kessler, Sarah
2018-06-05
Preconception care can improve maternal and infant outcomes by identifying and modifying health risks before pregnancy and reducing unplanned pregnancies. However, information about how preconception care is provided to persons living with HIV (PLWH) is lacking. This study uses qualitative interviews with HIV care providers to describe current models of preconception care and explore factors influencing services. Single, anonymous, telephone interviews were conducted with 92 purposively selected HIV healthcare providers in Atlanta, Baltimore, Houston, Kansas City, Newark, Philadelphia, and San Francisco in 2013-2014. Content analysis and a grounded theory approach were used to analyze data. Participants included 57% physicians with a median of 10 [interquartile range (IQR) = 5-17] years HIV care experience; the mean proportion of female patients was 45%. Participants described Individual Provider (48.9%), Team-based (43.2%), and Referral-only (7.6%) models of preconception care, with 63% incorporating referrals outside their clinics. Thematic analysis identified five key elements influencing the provision of preconception care within and across models: consistency of delivery, knowledge and attitudes, clinic characteristics, coordination of care, and referral accessibility. Described models of preconception care reflect the complexity of our healthcare system. Qualitative analysis offers insights about how HIV clinicians provide preconception care and how key elements influence services. However, additional research about the models and outcomes of preconception care services is needed. To improve preconception care for PLWH, research and quality improvement initiatives must utilize available strengths and tackle existing barriers, identified by our study and others, to define and implement effective models of preconception care services.
Forberger, Sarah; Bammann, Karin; Bauer, Jürgen; Boll, Susanne; Bolte, Gabriele; Brand, Tilman; Hein, Andreas; Koppelin, Frauke; Lippke, Sonia; Meyer, Jochen; Pischke, Claudia R.; Voelcker-Rehage, Claudia; Zeeb, Hajo
2017-01-01
The paper introduces the theoretical framework and methods/instruments used by the Physical Activity and Health Equity: Primary Prevention for Healthy Ageing (AEQUIPA) prevention research network as an interdisciplinary approach to tackle key challenges in the promotion of physical activity among older people (65+). Drawing on the social-ecological model, the AEQUIPA network developed an interdisciplinary methodological design including quantitative/qualitative studies and systematic reviews, while combining expertise from diverse fields: public health, psychology, urban planning, sports sciences, health technology and geriatrics. AEQUIPA tackles key challenges when promoting physical activity (PA) in older adults: tailoring of interventions, fostering community readiness and participation, strengthening intersectoral collaboration, using new technological devices and evaluating intervention generated inequalities. AEQUIPA aims to strengthen the evidence base for age-specific preventive PA interventions and to yield new insights into the explanatory power of individual and contextual factors. Currently, the empirical work is still underway. First experiences indicate that the network has achieved a strong regional linkage with communities, local stakeholders and individuals. However, involving inactive persons and individuals from minority groups remained challenging. A review of existing PA intervention studies among the elderly revealed the potential to assess equity effects. The results will add to the theoretical and methodological discussion on evidence-based age-specific PA interventions and will contribute to the discussion about European and national health targets. PMID:28375177
Forberger, Sarah; Bammann, Karin; Bauer, Jürgen; Boll, Susanne; Bolte, Gabriele; Brand, Tilman; Hein, Andreas; Koppelin, Frauke; Lippke, Sonia; Meyer, Jochen; Pischke, Claudia R; Voelcker-Rehage, Claudia; Zeeb, Hajo
2017-04-04
The paper introduces the theoretical framework and methods/instruments used by the Physical Activity and Health Equity: Primary Prevention for Healthy Ageing (AEQUIPA) prevention research network as an interdisciplinary approach to tackle key challenges in the promotion of physical activity among older people (65+). Drawing on the social-ecological model, the AEQUIPA network developed an interdisciplinary methodological design including quantitative/qualitative studies and systematic reviews, while combining expertise from diverse fields: public health, psychology, urban planning, sports sciences, health technology and geriatrics. AEQUIPA tackles key challenges when promoting physical activity (PA) in older adults: tailoring of interventions, fostering community readiness and participation, strengthening intersectoral collaboration, using new technological devices and evaluating intervention generated inequalities. AEQUIPA aims to strengthen the evidence base for age-specific preventive PA interventions and to yield new insights into the explanatory power of individual and contextual factors. Currently, the empirical work is still underway. First experiences indicate that the network has achieved a strong regional linkage with communities, local stakeholders and individuals. However, involving inactive persons and individuals from minority groups remained challenging. A review of existing PA intervention studies among the elderly revealed the potential to assess equity effects. The results will add to the theoretical and methodological discussion on evidence-based age-specific PA interventions and will contribute to the discussion about European and national health targets.
Ritsatakis, Anna; Ostergren, Per-Olof; Webster, Premila
2015-06-01
The WHO European Healthy Cities Network has from its inception aimed at tackling inequalities in health. In carrying out an evaluation of Phase V of the project (2009-13), an attempt was made to examine how far the concept of equity in health is understood and accepted; whether cities had moved further from a disease/medical model to looking at the social determinants of inequalities in health; how far the HC project contributed to cities determining the extent and causes of inequalities in health; what efforts were made to tackle such inequalities; and how far inequalities in health may have increased or decreased during Phase V. A broader range of resources was utilized for this evaluation than in previous phases of the project. These indicated that most cities were definitely looking at the broader determinants. Equity in health was better understood and had been included as a value in a range of city policies. This was facilitated by stronger involvement of the HC project in city planning processes. Although almost half the participating cities had prepared a City Health Profile, only a few cities had the necessary local-level data to monitor changes in inequalities in health.
ERIC Educational Resources Information Center
Gonzalez, John A.
2012-01-01
A critical goal of many school and training interventions is to provide learners with the strategies and foundational knowledge that will allow them to tackle novel problems encountered under circumstances different than the learning situations. This is also quite often referred to as the ability to transfer learning. Theories of transfer posit…
Gamification for Non-Majors Mathematics: An Innovative Assignment Model
ERIC Educational Resources Information Center
Leong, Siow Hoo; Tang, Howe Eng
2017-01-01
The most important ingredient of the pedagogy for teaching non-majors is getting their engagement. This paper proposes to use gamification to engage non-majors. An innovative game termed as Cover the Hungarian's Zeros is designed to tackle the common weakness of non-majors mathematics in solving the assignment problem using the Hungarian Method.…
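The assignment problem the abstract's game teaches can be stated concretely. The 3x3 cost matrix below is invented for illustration; the brute force over permutations merely defines the optimum that the Hungarian Method reaches in O(n^3) via its row/column reductions and zero-covering steps.

```python
# cost[i][j] = hypothetical cost of assigning worker i to task j.
from itertools import permutations

cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]

n = len(cost)
# Enumerate all n! assignments and keep the cheapest one.
best = min(permutations(range(n)),
           key=lambda p: sum(cost[i][p[i]] for i in range(n)))
best_cost = sum(cost[i][best[i]] for i in range(n))
print(best, best_cost)  # -> (1, 0, 2) 5: worker 0 takes task 1, etc.
```

Covering all zeros of the reduced matrix with a minimum number of lines, the step the game drills, is what lets the Hungarian Method certify such an optimum without enumerating permutations.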
A Fuzzy Group Decision Making Model for Ordinal Peer Assessment
ERIC Educational Resources Information Center
Capuano, Nicola; Loia, Vincenzo; Orciuoli, Francesco
2017-01-01
Massive Open Online Courses (MOOCs) are becoming an increasingly popular choice for education but, to reach their full extent, they require the resolution of new issues like assessing students at scale. A feasible approach to tackle this problem is peer assessment, in which students also play the role of assessor for assignments submitted by…
A Pragmatic Study of Exaggeration in British and American Novels
ERIC Educational Resources Information Center
Abbas, Qassim; Al-Tufaili, Dhayef
2016-01-01
The main concern of this study is to tackle exaggeration in British and American situations taken from "Mrs. Dalloway" and "The Great Gatsby" novels. From a pragmatic point of view, exaggeration in the field of literature has not been given enough attention. Accordingly, this study is an attempt to develop a model for the…
Karatzias, Thanos; Shevlin, Mark; Hyland, Philip; Brewin, Chris R; Cloitre, Marylene; Bradley, Aoife; Kitchiner, Neil J; Jumbe, Sandra; Bisson, Jonathan I; Roberts, Neil P
2018-06-01
We set out to investigate the association between negative trauma-related cognitions, emotional regulation strategies, attachment style, and complex post-traumatic stress disorder (CPTSD). As the evidence regarding the treatment of CPTSD is emerging, investigating psychological factors that are associated with CPTSD can inform the adaptation or development of effective interventions for CPTSD. A cross-sectional design was employed. Measures of CPTSD, negative trauma-related cognitions, emotion regulation strategies, and attachment style were completed by a British clinical sample of trauma-exposed patients (N = 171). Logistic regression analysis was used to assess the predictive utility of these psychological factors for a diagnosis of CPTSD as compared to PTSD. The most important factor in the diagnosis of CPTSD was negative trauma-related cognitions about the self, followed by attachment anxiety and expressive suppression. Targeting negative thoughts and attachment representations while promoting skills acquisition in emotional regulation holds promise in the treatment of CPTSD. Further research is required on the development of appropriate treatment models for CPTSD that tackle skills deficits in these areas. Results suggest that cognitive-behavioural interventions might be useful for the treatment of CPTSD.
Marchal, Bruno; Hoerée, Tom; da Silveira, Valéria Campos; Van Belle, Sara; Prashanth, Nuggehalli S; Kegels, Guy
2014-04-17
Performance of health care systems is a key concern of policy makers and health service managers all over the world. It is also a major challenge, given its multidimensional nature, which easily leads to conceptual and methodological confusion. This is reflected by a scarcity of models that comprehensively analyse health system performance. In health, one of the most comprehensive performance frameworks was developed by the team of Leggat and Sicotte. Their framework integrates four key organisational functions (goal attainment, production, adaptation to the environment, and values and culture) and the tensions between these functions. We modified this framework to better fit the assessment of the performance of health organisations in the public service domain and propose an analytical strategy that takes into account the social complexity of health organisations. The resulting multipolar performance framework (MPF) is a meta-framework that facilitates the analysis of the relations and interactions between the multiple actors that influence the performance of health organisations. Using the MPF in a dynamic, reiterative mode helps managers to identify not only the bottlenecks that hamper performance, but also the unintended effects and feedback loops that emerge. Similarly, it helps policymakers and programme managers at the central level to better anticipate the potential results, side effects and required conditions of health policies and programmes and to steer their implementation accordingly.
Evolutionary Games of Multiplayer Cooperation on Graphs
Arranz, Jordi; Traulsen, Arne
2016-01-01
There has been much interest in studying evolutionary games in structured populations, often modeled as graphs. However, most analytical results so far have only been obtained for two-player or linear games, while the study of more complex multiplayer games has usually been tackled by computer simulations. Here we investigate evolutionary multiplayer games on graphs updated with a Moran death-Birth process. For cycles, we obtain an exact analytical condition for cooperation to be favored by natural selection, given in terms of the payoffs of the game and a set of structure coefficients. For regular graphs of degree three and larger, we estimate this condition using a combination of pair approximation and diffusion approximation. For a large class of cooperation games, our approximations suggest that graph-structured populations are stronger promoters of cooperation than populations lacking spatial structure. Computer simulations validate our analytical approximations for random regular graphs and cycles, but show systematic differences for graphs with many loops such as lattices. In particular, our simulation results show that these kinds of graphs can even lead to more stringent conditions for the evolution of cooperation than well-mixed populations. Overall, we provide evidence suggesting that the complexity arising from many-player interactions and spatial structure can be captured by pair approximation in the case of random graphs, but that it needs to be handled with care for graphs with high clustering. PMID:27513946
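The Moran death-Birth update on a cycle that the abstract analyzes can be sketched minimally. All specifics below are invented for illustration: a 3-player public-goods-style payoff stands in for the general class of multiplayer games the paper treats, and the benefit/cost values and initial population are arbitrary.

```python
import random

def payoff(state, i, b=3.0, c=1.0):
    # Stand-in multiplayer game: player i plays in the 3-player group
    # centered on itself; cooperators (1) pay cost c, and benefit b per
    # cooperator is pooled and shared equally among the group.
    n = len(state)
    group = (state[(i - 1) % n], state[i], state[(i + 1) % n])
    return sum(group) * b / 3.0 - (c if state[i] else 0.0)

def death_birth_step(state, rng):
    # Moran death-Birth: a random individual dies; its two neighbors on
    # the cycle compete for the empty site proportionally to payoff.
    n = len(state)
    dead = rng.randrange(n)
    left, right = (dead - 1) % n, (dead + 1) % n
    f_left, f_right = payoff(state, left), payoff(state, right)
    total = f_left + f_right
    # total <= 0 guards the case where both neighbors earn zero payoff
    winner = left if total <= 0 or rng.random() < f_left / total else right
    new = list(state)
    new[dead] = state[winner]
    return new

rng = random.Random(1)
state = [1, 1, 1, 0, 0, 0, 0, 0]  # 1 = cooperator, 0 = defector
for _ in range(200):
    state = death_birth_step(state, rng)
print(sum(state))  # cooperators remaining after 200 updates
```

Averaging many such runs over initial conditions is how simulation estimates of the cooperation condition are typically obtained; the paper's contribution is the exact analytical counterpart for cycles.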
Model Checking the Remote Agent Planner
NASA Technical Reports Server (NTRS)
Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Norvig, Peter (Technical Monitor)
2001-01-01
This work tackles the problem of using Model Checking to verify the HSTS (Scheduling Testbed System) planning system. HSTS is the planner and scheduler of the Remote Agent autonomous control system deployed on Deep Space One (DS1). Model Checking allows for the verification of domain models as well as planning entries. We have chosen the real-time model checker UPPAAL for this work. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we sketch the mapping of HSTS models into UPPAAL and present samples of plan model properties one may want to verify.
Tackling saponin diversity in marine animals by mass spectrometry: data acquisition and integration.
Decroo, Corentin; Colson, Emmanuel; Demeyer, Marie; Lemaur, Vincent; Caulier, Guillaume; Eeckhaut, Igor; Cornil, Jérôme; Flammang, Patrick; Gerbaux, Pascal
2017-05-01
Saponin analysis by mass spectrometry methods is nowadays progressively supplementing other analytical methods such as nuclear magnetic resonance (NMR). Indeed, saponin extracts from plants or marine animals often constitute a complex mixture of (slightly) different saponin molecules that requires extensive purification and separation steps to meet the requirements for NMR spectroscopy measurements. Based on its intrinsic features, mass spectrometry represents an indispensable tool for accessing the structures of saponins within extracts by using LC-MS, MALDI-MS, and tandem mass spectrometry experiments. The combination of different MS methods nowadays allows for a detailed description of saponin structures without extensive purification. However, the structural characterization process is based on low-kinetic-energy CID, which cannot afford a total structure elucidation as far as stereochemistry is concerned. Moreover, the structural difference between saponins in the same extract is often so small that coelution upon LC-MS analysis is unavoidable, rendering isomeric distinction and characterization by CID challenging or impossible. In the present paper, we introduce ion mobility in combination with liquid chromatography to better tackle the structural complexity of saponin congeners. When analyzing saponin extracts with MS-based methods, handling the data remains problematic for comprehensive reporting of the results, but also for their efficient comparison. We here introduce an original schematic representation using sector diagrams constructed from mass spectrometry data. We strongly believe that the proposed data integration could be useful for data interpretation since it allows for a direct and fast comparison, both in terms of composition and relative proportion of the saponin contents in different extracts.
Graphical Abstract: A combination of state-of-the-art mass spectrometry methods, including ion mobility spectrometry, is developed to afford a complete description of the saponin molecules in natural extracts.
Strategic Studies Quarterly. Volume 5, Number 1, Spring 2011
2011-01-01
2010). 17. The White House, National Space Policy of the United States of America (Washington: White House, 28 June 2010), 3. 18. John Oneal and Bruce... censors and vigilantes) model operating on many levels at once. In this model, China is expressing a long-standing concern for the stability and...Ansfield, “China’s Censors Tackle and Trip Over the Internet,” New York Times, 8 April 2010. 32. Ching Cheong, “Fighting the Digital War with the
Özkan, Şeyda; Vitali, Andrea; Lacetera, Nicola; Amon, Barbara; Bannink, André; Bartley, Dave J; Blanco-Penedo, Isabel; de Haas, Yvette; Dufrasne, Isabelle; Elliott, John; Eory, Vera; Fox, Naomi J; Garnsworthy, Phil C; Gengler, Nicolas; Hammami, Hedi; Kyriazakis, Ilias; Leclère, David; Lessire, Françoise; Macleod, Michael; Robinson, Timothy P; Ruete, Alejandro; Sandars, Daniel L; Shrestha, Shailesh; Stott, Alistair W; Twardy, Stanislaw; Vanrobays, Marie-Laure; Ahmadi, Bouda Vosough; Weindl, Isabelle; Wheelhouse, Nick; Williams, Adrian G; Williams, Hefin W; Wilson, Anthony J; Østergaard, Søren; Kipling, Richard P
2016-11-01
Climate change has the potential to impair livestock health, with consequences for animal welfare, productivity, greenhouse gas emissions, and human livelihoods and health. Modelling has an important role in assessing the impacts of climate change on livestock systems and the efficacy of potential adaptation strategies, to support decision making for more efficient, resilient and sustainable production. However, a coherent set of challenges and research priorities for modelling livestock health and pathogens under climate change has not previously been available. To identify such challenges and priorities, researchers from across Europe were engaged in a horizon-scanning study, involving workshop- and questionnaire-based exercises and focussed literature reviews. Eighteen key challenges were identified and grouped into six categories based on subject-specific and capacity-building requirements. Across a number of challenges, the need for inventories relating model types to different applications (e.g. the pathogen species, region, scale of focus and purpose to which they can be applied) was identified, in order to identify gaps in capability in relation to the impacts of climate change on animal health. The need for collaboration and learning across disciplines was highlighted in several challenges, e.g. to better understand and model complex ecological interactions between pathogens, vectors, wildlife hosts and livestock in the context of climate change. Collaboration between socio-economic and biophysical disciplines was seen as important for better engagement with stakeholders and for improved modelling of the costs and benefits of poor livestock health. The need for more comprehensive validation of empirical relationships, for harmonising terminology and measurements, and for building capacity for under-researched nations, systems and health problems indicated the importance of joined-up approaches across nations.
The challenges and priorities identified can help focus the development of modelling capacity and future research structures in this vital field. Well-funded networks capable of managing the long-term development of shared resources are required in order to create a cohesive modelling community equipped to tackle the complex challenges of climate change.
From naturalistic neuroscience to modeling radical embodiment with narrative enactive systems
Tikka, Pia; Kaipainen, Mauri Ylermi
2014-01-01
Mainstream cognitive neuroscience has begun to accept the idea of the embodied mind, which assumes that the human mind is fundamentally constituted by the dynamical interactions of the brain, body, and environment. In today's paradigm of naturalistic neuroscience, subjects are exposed to rich contexts, such as video sequences or entire films, under relatively controlled conditions, against which researchers can interpret changes in neural responses within a time window. However, from the point of view of radical embodied cognitive neuroscience, increasing complexity alone will not suffice as the explanatory apparatus for the dynamical embodiment and situatedness of the mind. We suggest that narrative enactive systems with dynamically adaptive content as stimuli may serve better to account for the embodied mind engaged with the surrounding world. Among the ensuing challenges for neuroimaging studies is how to interpret brain data against broad temporal contexts of previous experiences that condition the unfolding experience of nowness. We propose means to tackle this issue, as well as ways to limit the exponentially growing combinatorics of narrative paths to a controllable number. PMID:25339890
The use of soft system methodology (SSM) in a service-focussed study on the personal tutor's role.
Por, Jitna
2008-09-01
Soft system methodology (SSM) is described as a system-based methodology for tackling real-world problems. SSM may be used as a means of articulating complex social processes in a particular way. SSM allows people's viewpoints and assumptions about the world to be brought to light, challenged and tested. This paper reports on the use of SSM in a service-focussed study (SFS) to explore the role of the personal tutor in nurse education. Checkland (1981; Systems Thinking, Systems Practice. John Wiley and Sons, Chichester) highlighted the importance of considering cultural, social and political systems in the analysis. The seven stages of SSM are discussed in relation to the SFS and some of the findings are expressed through a 'Rich Picture'. SSM encourages commitment, brings diverse interests together and opens up the organizational culture. It also enables feasible and desirable changes to be recommended within the context of limited resources and competing demands upon lecturers' time. SSM was an appropriate systematic model for this study and could be potentially useful in nurse education research.
Improving long time behavior of Poisson bracket mapping equation: A non-Hamiltonian approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hyun Woo; Rhee, Young Min, E-mail: ymrhee@postech.ac.kr
2014-05-14
Understanding nonadiabatic dynamics in complex systems is a challenging subject. A series of semiclassical approaches have been proposed to tackle the problem in various settings. The Poisson bracket mapping equation (PBME) utilizes a partial Wigner transform and a mapping representation for its formulation, and has been developed to describe nonadiabatic processes in an efficient manner. Operationally, it is expressed as a set of Hamilton's equations of motion, similar to more conventional classical molecular dynamics. However, this original Hamiltonian PBME sometimes suffers from a large deviation in accuracy, especially in the long time limit. Here, we propose a non-Hamiltonian variant of PBME to improve its behavior especially in that limit. As a benchmark, we simulate spin-boson and photosynthetic model systems and find that it consistently outperforms the original PBME and its Ehrenfest-style variant. We explain the source of this improvement by decomposing the components of the mapping Hamiltonian and by assessing the energy flow between the system and the bath. We discuss strengths and weaknesses of our scheme with a viewpoint of offering future prospects.
Barrett, Meredith; Combs, Veronica; Su, Jason G; Henderson, Kelly; Tuffli, Michael
2018-04-01
Cross-sector partnerships benefit public health by leveraging ideas, resources, and expertise from a wide range of partners. In this study we documented the process and impact of AIR Louisville (a collaboration forged among the Louisville Metro Government, a nonprofit institute, and a technology company) in successfully tackling a complex public health challenge: asthma. We enrolled residents of Louisville, Kentucky, with asthma and used electronic inhaler sensors to monitor where and when they used medication. We found that the use of the digital health platform achieved positive clinical outcomes, including a 78 percent reduction in rescue inhaler use and a 48 percent improvement in symptom-free days. Moreover, the crowdsourced real-world data on inhaler use, combined with environmental data, led to policy recommendations including enhancing tree canopy, tree removal mitigation, zoning for air pollution emission buffers, recommended truck routes, and developing a community asthma notification system. AIR Louisville represents a model that can be replicated to address many public health challenges by simultaneously guiding individual, clinical, and policy decisions.
Orme, J; Pilkington, P; Gray, S; Rao, M
2009-12-01
This paper examines the development and achievements of the Teaching Public Health Networks (TPHNs) in England, an initiative that aimed to catalyse collaborative working between the public health workforce and further and higher education, to enhance public health knowledge in the wider workforce with a view to building capacity to tackle inequalities and meet public health targets. This paper highlights activities under three outcomes: mobilizing resources (people, money and materials); building capacity through training and infrastructure development; and raising public and political awareness. The TPHN approach is shown to have led to innovative developments in public health education and training, including engagement with professionals who have not previously had exposure to public health. This paper aims to disseminate the learning from this complex public health initiative, now in its third year of development, and to share examples of good practice. It is hoped that other countries can use the TPHN approach as a model to address the various common and country-specific challenges in public health workforce development.
Trash track--active location sensing for evaluating e-waste transportation.
Offenhuber, Dietmar; Wolf, Malima I; Ratti, Carlo
2013-02-01
Waste and recycling systems are complex and far-reaching, but their mechanisms are poorly understood by the public, in some cases by government organizations, and even by the waste management sector itself. The lack of empirical data makes it challenging to assess the environmental impact of trash collection, removal and disposal. This is especially the case for the global movement of electronic waste. Senseable City Lab's Trash Track project tackles this scarcity of data by following the trajectories of individual objects. The project presents a methodology involving active location sensors that were placed on end-of-life products donated by volunteers in the Seattle, Washington area. These tags sent location messages chronicling their journey, some over the course of a month or more. In this paper, the authors focus on the analysis of traces acquired from 146 items of electronic waste, evaluating the environmental impact, including the travel distances and end-of-life treatments for the products. Combining this information with impact evaluations from the US Environmental Protection Agency's Waste Reduction Model (WARM) allows for the creation of environmental impact profiles for individual pieces of trash.
PPDB - A tool for investigation of plants physiology based on gene ontology.
Sharma, Ajay Shiv; Gupta, Hari Om; Prasad, Rajendra
2014-09-02
Representing the way forward, from functional genomics and its ontology to functional understanding and physiological models, in a computationally tractable fashion is one of the ongoing challenges faced by computational biology. To tackle this challenge, we herein feature the applications of contemporary database management in the development of PPDB, a searching and browsing tool for the Plants Physiology Database that is based upon the mining of the large amount of gene ontology data currently available. The working principles and search options associated with the PPDB are publicly available and freely accessible on-line ( http://www.iitr.ernet.in/ajayshiv/ ) through a user friendly environment generated by means of Drupal-6.24. Given that genes are expressed in temporally and spatially characteristic patterns, and that their functionally distinct products often reside in specific cellular compartments and may be part of one or more multi-component complexes, this work is intended to be relevant for investigating the functional relationships of gene products at a system level and thus brings us closer to a full picture of plant physiology.
PPDB: A Tool for Investigation of Plants Physiology Based on Gene Ontology.
Sharma, Ajay Shiv; Gupta, Hari Om; Prasad, Rajendra
2015-09-01
Representing the way forward, from functional genomics and its ontology to functional understanding and physiological models, in a computationally tractable fashion is one of the ongoing challenges faced by computational biology. To tackle this challenge, we herein feature the applications of contemporary database management in the development of PPDB, a searching and browsing tool for the Plants Physiology Database that is based upon the mining of the large amount of gene ontology data currently available. The working principles and search options associated with the PPDB are publicly available and freely accessible online ( http://www.iitr.ac.in/ajayshiv/ ) through a user-friendly environment generated by means of Drupal-6.24. Given that genes are expressed in temporally and spatially characteristic patterns, and that their functionally distinct products often reside in specific cellular compartments and may be part of one or more multicomponent complexes, this work is intended to be relevant for investigating the functional relationships of gene products at a system level and thus brings us closer to a full picture of plant physiology.
Transitions between homogeneous phases of polar active liquids
NASA Astrophysics Data System (ADS)
Dauchot, Olivier; Nguyen Thu Lam, Khanh Dang; Schindler, Michael; EC2M Team; PCT Team
2015-03-01
Polar active liquids, composed of aligning self-propelled particles, exhibit large-scale collective motion. Simulations of Vicsek-like models of constant-speed point particles, aligning with their neighbors in the presence of noise, have revealed the existence of a transition towards a true long-range-ordered polar-motion phase. Generically, the homogeneous polar state is unstable; non-linear propagative structures develop; and the transition is discontinuous. The long-range dynamics of these systems has been successfully captured using various schemes of kinetic theory. However, the complexity of the dynamics close to the transition has somewhat hindered more basic questions. Is there a simple way to predict the existence and the order of a transition to collective motion for a given microscopic dynamics? What would be the physically meaningful and relevant quantity to answer this question? Here, we tackle these questions, restricting ourselves to the study of the homogeneous phases of polar active liquids in the low-density limit, and obtain a very intuitive understanding of the conditions which particle interactions must satisfy to induce a transition towards collective motion.
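The Vicsek-like dynamics the abstract refers to can be sketched minimally (box size, speed, interaction radius, and noise amplitude below are invented for illustration): each constant-speed particle adopts the mean heading of its neighbors within radius r, plus angular noise of amplitude eta, in a periodic box.

```python
import math
import random

def vicsek_step(pos, ang, L=10.0, v=0.3, r=1.0, eta=0.4, rng=random):
    # One update of a minimal Vicsek model in a periodic LxL box.
    new_ang = []
    for xi, yi in pos:
        sx = sy = 0.0
        for (xj, yj), aj in zip(pos, ang):
            # minimum-image displacement under periodic boundaries
            dx = (xj - xi + L / 2) % L - L / 2
            dy = (yj - yi + L / 2) % L - L / 2
            if dx * dx + dy * dy <= r * r:  # neighbor (includes self)
                sx += math.cos(aj)
                sy += math.sin(aj)
        noise = eta * math.pi * (2.0 * rng.random() - 1.0)
        new_ang.append(math.atan2(sy, sx) + noise)
    new_pos = [((x + v * math.cos(a)) % L, (y + v * math.sin(a)) % L)
               for (x, y), a in zip(pos, new_ang)]
    return new_pos, new_ang

def polarization(ang):
    # Polar order parameter: 1 = perfect collective motion, ~0 = disorder.
    return math.hypot(sum(math.cos(a) for a in ang),
                      sum(math.sin(a) for a in ang)) / len(ang)
```

Sweeping eta (or density) while monitoring the polarization is the standard way to locate the order/disorder transition discussed above.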
Variety, Palatability, and Obesity
Johnson, Fiona; Wardle, Jane
2014-01-01
Among the key characteristics of the Western obesogenic food environment is a highly palatable and varied food supply. Laboratory investigations of eating behavior in both humans and animals established key roles for palatability and variety in stimulating appetite, delaying satiety, and promoting excessive energy intake. There is a robust effect of food palatability and variety on short-term food intake, and increased variety and palatability also cause weight gain in animal models. However, laboratory paradigms do not replicate the complexities of eating in a natural setting, and there is a shortage of evidence to estimate the magnitude of effects on weight in humans. There are substantial individual differences in susceptibility to the palatability effect and this may be a key determinant in individual vulnerability to weight gain. The understanding of pathways through which palatability and variety can affect eating is advancing, and epidemiologic and intervention studies are needed to translate laboratory findings into applications in public health or clinical domains, and to establish whether there is a role for greater regulation of the food environment in tackling increases in obesity. PMID:25398751
Putting the Barker Theory into the Future: Time to Act on Preventing Pediatric Obesity.
Pietrobelli, Angelo; Agosti, Massimo; Zuccotti, Gianvincenzo
2016-11-17
Growth and development are key characteristics of childhood and sensitive markers of health and adequate nutrition. The first 1000 days of life-conception through 24 months of age-represent a fundamental period for development and thus the prevention of childhood obesity and its adverse consequences is mandatory. There are many growth drivers during this complex phase of life, such as nutrition, genetic and epigenetic factors, and hormonal regulation. The challenge thus involves maximizing the potential for normal growth without increasing the risk of associated disorders. The Mediterranean Nutrition Group (MeNu Group), a group of researchers of the Mediterranean Region, in this Special Issue titled "Prevent Obesity in the First 1000 Days", presented results that advanced the science of obesity risk factors in early life, coming both from animal model studies and studies in humans. In the future, early-life intervention designs for the prevention of pediatric obesity will need to look at different strategies, and the MeNu Group is available for guidance regarding an appropriate conceptual framework to accomplish either prevention or treatment strategies to tackle pediatric obesity.
NASA Astrophysics Data System (ADS)
Seyedhosseini, Seyed Mohammad; Makui, Ahmad; Shahanaghi, Kamran; Torkestani, Sara Sadat
2016-09-01
Determining a location that will remain profitable over the facility's lifetime is an important decision for public and private firms, which is why dynamic location problems (DLPs) are of critical significance. This paper presents a comprehensive review of research on DLPs published from 1968 to the present and classifies it into two parts. The first covers mathematical models developed with different characteristics: type of parameters (deterministic, probabilistic or stochastic), number and type of objective functions, numbers of commodities and modes, relocation time, number of relocations and relocating facilities, time horizon, budget and capacity constraints, and their applicability. The second part presents solution algorithms, main specifications, applications and some real-world case studies of DLPs. We conclude that in the current DLP literature, distribution systems and production-distribution systems with simplifying assumptions to tame the complexity of these models have been studied more than any other field, while the concepts of variety of services (hierarchical networks), reliability, sustainability, relief management, waiting time for services (queuing theory) and risk of facility disruption need further investigation. The categories based on different criteria, the solution methods and their applicability, and the gaps and analysis presented in this paper suggest directions for future research.
A stepwise-cluster microbial biomass inference model in food waste composting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun Wei; Huang, Guo H., E-mail: huangg@iseis.or; Chinese Research Academy of Environmental Science, North China Electric Power University, Beijing 100012-102206
2009-12-15
A stepwise-cluster microbial biomass inference (SMI) model was developed by introducing stepwise-cluster analysis (SCA) into composting process modeling to tackle the nonlinear relationships among state variables and microbial activities. The essence of SCA is to form a classification tree through a series of cutting or mergence processes according to given statistical criteria. Eight runs of designed experiments in bench-scale reactors in a laboratory were conducted to demonstrate the feasibility of the proposed method. The results indicated that SMI could help establish a statistical relationship between state variables and composting microbial characteristics where discrete and nonlinear complexities exist. Significance levels of cutting/merging were provided such that the accuracies of the developed forecasting trees were controllable. Through an attempted definition of input effects on the output in SMI, the effects of the state variables on thermophilic bacteria were ranked in descending order as: Time (day) > moisture content (%) > ash content (%, dry) > Lower Temperature (°C) > pH > NH₄⁺-N (mg/kg, dry) > Total N (%, dry) > Total C (%, dry); the effects on mesophilic bacteria were ordered as: Time > Upper Temperature (°C) > Total N > moisture content > NH₄⁺-N > Total C > pH. This study made the first attempt at applying SCA to mapping the nonlinear and discrete relationships in composting processes.
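The cutting step at the heart of stepwise-cluster analysis can be illustrated with a minimal sketch. All specifics are invented: a within-group variance criterion stands in for the significance-test-based cutting criteria the method actually uses, and the tiny data set mimics a state variable with two composting regimes.

```python
def sse(v):
    # sum of squared errors around the group mean
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v)

def best_cut(xs, ys):
    # Try every threshold on predictor xs; return (sse, threshold) of the
    # binary cut minimizing total within-group SSE of the response ys.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    best = None
    for k in range(1, len(xs)):
        left = [ys[order[i]] for i in range(k)]
        right = [ys[order[i]] for i in range(k, len(xs))]
        total = sse(left) + sse(right)
        if best is None or total < best[0]:
            best = (total, xs[order[k - 1]])
    return best

xs = [1, 2, 3, 10, 11, 12]   # e.g. a state variable such as moisture (%)
ys = [5, 6, 5, 20, 21, 19]   # e.g. a microbial biomass response
total, thr = best_cut(xs, ys)
print(thr)  # -> 3: the tree cuts between the two regimes
```

SCA repeats such cuts (and merges statistically indistinguishable branches) to grow the forecasting tree whose accuracy the abstract describes as controllable via significance levels.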
The roadmap for estimation of cell-type-specific neuronal activity from non-invasive measurements
Uhlirova, Hana; Kılıç, Kıvılcım; Tian, Peifang; Sakadžić, Sava; Thunemann, Martin; Desjardins, Michèle; Saisan, Payam A.; Nizar, Krystal; Yaseen, Mohammad A.; Hagler, Donald J.; Vandenberghe, Matthieu; Djurovic, Srdjan; Andreassen, Ole A.; Silva, Gabriel A.; Masliah, Eliezer; Vinogradov, Sergei; Buxton, Richard B.; Einevoll, Gaute T.; Boas, David A.; Dale, Anders M.; Devor, Anna
2016-01-01
The computational properties of the human brain arise from an intricate interplay between billions of neurons connected in complex networks. However, our ability to study these networks in the healthy human brain is limited by the necessity to use non-invasive technologies. This is in contrast to animal models where a rich, detailed view of cellular-level brain function with cell-type-specific molecular identity has become available due to recent advances in microscopic optical imaging and genetics. Thus, a central challenge facing neuroscience today is leveraging these mechanistic insights from animal studies to accurately draw physiological inferences from non-invasive signals in humans. An essential step towards this goal is the development of a detailed ‘bottom-up’ forward model bridging neuronal activity at the level of cell-type-specific populations to non-invasive imaging signals. The general idea is that specific neuronal cell types have identifiable signatures in the way they drive changes in cerebral blood flow, cerebral metabolic rate of O2 (measurable with quantitative functional Magnetic Resonance Imaging), and electrical currents/potentials (measurable with magneto/electroencephalography). This forward model would then provide the ‘ground truth’ for the development of new tools for tackling the inverse problem—estimation of neuronal activity from multimodal non-invasive imaging data. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574309
Role of the plurality rule in multiple choices
NASA Astrophysics Data System (ADS)
Calvão, A. M.; Ramos, M.; Anteneodo, C.
2016-02-01
People are often challenged to select one among several alternatives. This situation is present not only in decisions about complex issues, e.g. political or academic choices, but also about trivial ones, such as in daily purchases at a supermarket. We tackle this scenario by means of the tools of statistical mechanics. Following this approach, we introduce and analyse a model of opinion dynamics, using a Potts-like state variable to represent the multiple choices, including the ‘undecided state’, which represents the individuals who do not make a choice. We investigate the dynamics over Erdős–Rényi and Barabási-Albert networks, two paradigmatic classes with the small-world property, and we show the impact of the type of network on the opinion dynamics. Depending on the number of available options q and on the degree distribution of the network of contacts, different final steady states are accessible: from a wide distribution of choices to a state where a given option largely dominates. The abrupt transition between them is consistent with the sudden viral dominance of a given option over many similar ones. Moreover, the probability distributions produced by the model are validated by real data. Finally, we show that the model also contemplates the real situation of overchoice, where a large number of similar alternatives makes the choice process harder and indecision prevails.
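The abstract does not spell out the microscopic update rule, so the sketch below assumes a simple plurality-adoption dynamic on an Erdős–Rényi graph: each agent holds a Potts-like state (0 = undecided, 1..q = the available options) and repeatedly adopts the most common decided choice among its neighbours, with random tie-breaking. All parameter values (n, p, q, steps) are illustrative, not the paper's.

```python
import random
from collections import Counter

def erdos_renyi(n, p, rng):
    """Adjacency lists for a G(n, p) random graph."""
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def simulate(n=200, p=0.05, q=3, steps=20000, seed=1):
    rng = random.Random(seed)
    # state 0 = undecided; states 1..q = the q available options
    state = [0] * n
    adj = erdos_renyi(n, p, rng)
    for _ in range(steps):
        i = rng.randrange(n)
        opinions = [state[j] for j in adj[i] if state[j] != 0]
        if not opinions:
            continue  # an agent with only undecided neighbours stays put
        counts = Counter(opinions).most_common()
        top = counts[0][1]
        # plurality rule: adopt a (randomly tie-broken) most common neighbour choice
        state[i] = rng.choice([s for s, c in counts if c == top])
    return Counter(state)

final = simulate()
```

Wait — with all agents starting undecided, no option ever appears; a realistic run would seed a few decided agents, e.g. `state[:q] = range(1, q + 1)` before the loop, which is the kind of initial condition one would vary to explore the wide-distribution vs single-dominant-option regimes the abstract describes. Swapping the G(n, p) generator for a Barabási-Albert one reproduces the paper's network comparison.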
Jabbour, Charbel Jose Chiappetta; Jugend, Daniel; Jabbour, Ana Beatriz Lopes de Sousa; Govindan, Kannan; Kannan, Devika; Leal Filho, Walter
2018-01-15
Considering the unique relevance of Brazilian biodiversity, this research aims to investigate the main barriers to biodiversity-based R&D and eco-design development in a leading national company which has been commended for its innovation and sustainability. The methodology for this research was based on on-location visits, in-depth interviews, and consensus building among R&D, sustainability, and quality managers. A multi-criteria decision-making (MCDM) approach was adopted through interpretive structural modelling (ISM), a method that assists decision makers in transforming complex models with unclear data into structural models. Some of the most influential barriers to biodiversity-based eco-design initiatives are "lack of legal incentive", "not enough demand from the market", and "not enough available knowledge/scientific data." The most relevant barrier was "no legal incentive" from government. Consequently, managers should concentrate their efforts on tackling those barriers that may affect other barriers, known as 'key barriers'. Government should work decisively toward promoting a framework of legal incentives for bio-based eco-design; otherwise, metaphorically, "there is no carnival without the samba singer who pushes the rhythm". The results given here reveal the barriers to bio-based eco-design in a leading Brazilian company, and this is the first work applying ISM to the barriers to biodiversity R&D and eco-design. Copyright © 2017 Elsevier Ltd. All rights reserved.
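The ISM step the abstract describes — turning pairwise "barrier i influences barrier j" judgments into a structure that exposes 'key barriers' — rests on a Boolean transitive closure of the influence matrix followed by a driving-power count. A minimal sketch, in which the barrier names echo the abstract but the adjacency entries are hypothetical:

```python
# Hypothetical influence matrix: adj[i][j] = 1 if barrier i influences barrier j
# (diagonal set to 1 by ISM convention). Entries are illustrative, not the study's data.
barriers = ["no legal incentive", "low market demand", "scarce scientific data"]
adj = [
    [1, 1, 1],   # legal-incentive gaps assumed to drive the other two barriers
    [0, 1, 0],
    [0, 1, 1],   # scarce data assumed to dampen market demand as well
]

def transitive_closure(m):
    """Boolean reachability matrix via Warshall's algorithm."""
    n = len(m)
    r = [row[:] for row in m]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

reach = transitive_closure(adj)
# driving power = how many barriers (including itself) each barrier can reach
driving = {barriers[i]: sum(reach[i]) for i in range(len(barriers))}
key = max(driving, key=driving.get)  # the barrier that drives the most others
```

With these illustrative entries the closure identifies "no legal incentive" as the key barrier, mirroring the study's headline finding; a full ISM analysis would also partition barriers into levels using reachability and antecedent sets.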
Estimation for the Linear Model With Uncertain Covariance Matrices
NASA Astrophysics Data System (ADS)
Zachariah, Dave; Shariati, Nafiseh; Bengtsson, Mats; Jansson, Magnus; Chatterjee, Saikat
2014-03-01
We derive a maximum a posteriori estimator for the linear observation model, where the signal and noise covariance matrices are both uncertain. The uncertainties are treated probabilistically by modeling the covariance matrices with prior inverse-Wishart distributions. The nonconvex problem of jointly estimating the signal of interest and the covariance matrices is tackled by a computationally efficient fixed-point iteration as well as an approximate variational Bayes solution. The statistical performance of estimators is compared numerically to state-of-the-art estimators from the literature and shown to perform favorably.
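The alternating structure of such a fixed-point iteration can be sketched as follows. This is an illustrative reconstruction from the abstract only — a Gaussian MAP signal update alternated with inverse-Wishart MAP covariance updates — not the paper's exact algorithm, and the prior scale matrices and degrees of freedom below are arbitrary choices.

```python
import numpy as np

def map_fixed_point(y, H, Psi_r, nu_r, Psi_p, nu_p, iters=50):
    """Jointly estimate signal x and covariances (R, P) for y = Hx + n,
    n ~ N(0, R), x ~ N(0, P), with inverse-Wishart priors on R and P."""
    m, n = H.shape
    R = Psi_r / (nu_r + m + 1)   # start each covariance at its prior mode
    P = Psi_p / (nu_p + n + 1)
    for _ in range(iters):
        Ri, Pi = np.linalg.inv(R), np.linalg.inv(P)
        # MAP signal estimate for fixed covariances (Gaussian posterior mean)
        x = np.linalg.solve(H.T @ Ri @ H + Pi, H.T @ Ri @ y)
        r = y - H @ x
        # inverse-Wishart MAP updates using the current residual/signal outer products
        R = (Psi_r + np.outer(r, r)) / (nu_r + m + 2)
        P = (Psi_p + np.outer(x, x)) / (nu_p + n + 2)
    return x, R, P

rng = np.random.default_rng(0)
H = rng.standard_normal((6, 3))
x_true = np.array([1.0, -2.0, 0.5])
y = H @ x_true + 0.1 * rng.standard_normal(6)
x_hat, R_hat, P_hat = map_fixed_point(y, H, np.eye(6), 8.0, np.eye(3), 5.0)
```

Because the prior scale matrices keep R and P bounded away from singularity, each iterate is well defined; the variational Bayes alternative mentioned in the abstract would instead propagate full distributions over the covariances.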
Mathematical and computational modelling of skin biophysics: a review
2017-01-01
The objective of this paper is to provide a review on some aspects of the mathematical and computational modelling of skin biophysics, with special focus on constitutive theories based on nonlinear continuum mechanics from elasticity, through anelasticity, including growth, to thermoelasticity. Microstructural and phenomenological approaches combining imaging techniques are also discussed. Finally, recent research applications on skin wrinkles will be presented to highlight the potential of physics-based modelling of skin in tackling global challenges such as ageing of the population and the associated skin degradation, diseases and traumas. PMID:28804267
Mathematical and computational modelling of skin biophysics: a review
NASA Astrophysics Data System (ADS)
Limbert, Georges
2017-07-01
The objective of this paper is to provide a review on some aspects of the mathematical and computational modelling of skin biophysics, with special focus on constitutive theories based on nonlinear continuum mechanics from elasticity, through anelasticity, including growth, to thermoelasticity. Microstructural and phenomenological approaches combining imaging techniques are also discussed. Finally, recent research applications on skin wrinkles will be presented to highlight the potential of physics-based modelling of skin in tackling global challenges such as ageing of the population and the associated skin degradation, diseases and traumas.
Mazaris, Antonios D; Germond, Basil
2018-09-01
For the past two decades, the need to shield strategic maritime interests, to tackle criminality and terrorism at or from the sea and to conserve valuable marine resources has been recognized at the highest political level. Acknowledging and accounting for the interplay between climate change, the vulnerability of coastal populations and the occurrence of maritime criminality should be part of any ocean governance process. Still, given the complex interactions between climate change and socio-economic components of the marine realm, it has become urgent to establish a solid methodological framework that could lead to sound and effective decisions. We propose that any such framework should not be built from scratch. The adaptation of well-tested, existing uncertainty-management tools, such as Cumulative Effect Assessments, could serve as a solid basis to account for the magnitude and directionality of the dependencies between the impacts of climate change and the occurrence of maritime criminality, offering spatially explicit risk evaluations. Multi-Criteria Decision Making could then be employed to inform decision-makers better and faster. These mechanisms could provide a framework for comparing alternative mitigation and adaptation actions and are essential in assessing responses to tackle maritime crime in the context of climate change. Copyright © 2018 Elsevier B.V. All rights reserved.
The membrane as the gatekeeper of infection: Cholesterol in host-pathogen interaction.
Kumar, G Aditya; Jafurulla, Md; Chattopadhyay, Amitabha
2016-09-01
The cellular plasma membrane serves as a portal for the entry of intracellular pathogens. An essential step for an intracellular pathogen to gain entry into a host cell therefore is to be able to cross the cell membrane. In this review, we highlight the role of host membrane cholesterol in regulating the entry of intracellular pathogens using insights obtained from work on the interaction of Leishmania and Mycobacterium with host cells. The entry of these pathogens is known to be dependent on host membrane cholesterol. Importantly, pathogen entry is inhibited either upon depletion (or complexation), or enrichment of membrane cholesterol. In other words, an optimum level of host membrane cholesterol is necessary for efficient infection by pathogens. In this overall context, we propose a general mechanism, based on cholesterol-induced conformational changes, involving cholesterol binding sites in host cell surface receptors that are implicated in this process. A therapeutic strategy targeting modulation of membrane cholesterol would have the advantage of avoiding the commonly encountered problem of drug resistance in tackling infection by intracellular pathogens. Insights into the role of host membrane cholesterol in pathogen entry would be instrumental in the development of novel therapeutic strategies to effectively tackle intracellular pathogenesis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Primary care support for tackling obesity: a qualitative study of the perceptions of obese patients.
Brown, Ian; Thompson, Joanne; Tod, Angela; Jones, Georgina
2006-09-01
Obesity has become a major public health issue and there is concern about the response of health services to patients who are obese. The perceptions of obese patients using primary care services have not been studied in depth. To explore obese patients' experiences and perceptions of support in primary care. Qualitative study with semi-structured interviews conducted in participants' homes. Five general practices contrasting in socioeconomic populations in Sheffield. Purposive sampling and semi-structured interviewing of 28 patients with a diverse range of ages, backgrounds, levels of obesity and experiences of primary care services. Participants typically felt reluctance when presenting with concerns about weight and ambivalence about the services received. They also perceived there to be ambivalence and a lack of resources on the part of the health services. Participants showed a strong sense of personal responsibility about their condition and stigma-related cognitions were common. These contributed to their ambivalence about using services and their sensitivity to its features. Good relationships with primary care professionals and more intensive support partly ameliorated these effects. The challenges of improving access to and quality of primary care support in tackling obesity are made more complex by patients' ambivalence and other effects of the stigma associated with obesity.
Cognition of an expert tackling an unfamiliar conceptual physics problem
NASA Astrophysics Data System (ADS)
Schuster, David; Undreiu, Adriana
2009-11-01
We have investigated and analyzed the cognition of an expert tackling a qualitative conceptual physics problem of an unfamiliar type. Our goal was to elucidate the detailed cognitive processes and knowledge elements involved, irrespective of final solution form, and consider implications for instruction. The basic but non-trivial problem was to find qualitatively the direction of acceleration of a pendulum bob at various stages of its motion, a problem originally studied by Reif and Allen. Methodology included interviews, introspection, retrospection and self-reported metacognition. Multiple facets of cognition were revealed, with different reasoning strategies used at different stages and for different points on the path. An account is given of the zigzag thinking paths and interplay of reasoning modes and schema elements involved. We interpret the cognitive processes in terms of theoretical concepts that emerged, namely: case-based, principle-based, experiential-intuitive and practical-heuristic reasoning; knowledge elements and schemata; activation; metacognition and epistemic framing. The complexity of cognition revealed in this case study contrasts with the tidy principle-based solutions we present to students. The pervasive role of schemata, case-based reasoning, practical heuristic strategies, and their interplay with physics principles is noteworthy, since these aspects of cognition are generally neither recognized nor taught. The schema/reasoning-mode perspective has direct application in science teaching, learning and problem-solving.
The study of production performance of water heater manufacturing by using simulation method
NASA Astrophysics Data System (ADS)
Iqbal, M.; Bamatraf, OAA; Tadjuddin, M.
2018-02-01
In industrial companies, as demand increases, decision-making about increasing production becomes difficult due to the complexity of the model systems. Companies try to find optimal methods to tackle such problems so that resources are fully utilized and production is increased. One line system of a manufacturing company in Malaysia was considered in this research. The company produces several types of water heater, and each type goes through many processes, divided into twenty-six sections. Each section has several operations. The main product type is the 10G water heater, which is produced in the largest volume compared to the other types; hence it was selected for study in this research. It was difficult to identify the critical section whose improvement would raise the company's production. This research employed Delmia Quest software, Distribution Analyser software and Design of Experiments (DOE) software to simulate one model system taken from the company and to find the critical section that would improve the production system. As a result, the inner and outer tank assembly section was found to be the bottleneck. Adding one station at the bottleneck increases the production rate by four products a day, and the buffer size determined by the experiment was six items.
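The bottleneck logic the study exploits — a serial line's steady-state throughput is set by its slowest section, and duplicating that station shifts the bottleneck elsewhere — can be illustrated with hypothetical cycle times. The section names and all numbers below are made up for illustration; they are not the company's data.

```python
# Hypothetical cycle times (minutes per unit) for a few sections of the line
cycle_times = {
    "forming": 4.0,
    "inner/outer tank assembly": 7.5,   # the bottleneck in this sketch
    "testing": 5.0,
    "packing": 3.0,
}

def throughput_per_day(times, minutes_per_day=480):
    """A serial line's rate is capped by its slowest section."""
    bottleneck = max(times, key=times.get)
    return bottleneck, minutes_per_day / times[bottleneck]

def with_parallel_station(times, section):
    """Duplicating a station halves its effective cycle time."""
    relaxed = dict(times)
    relaxed[section] = times[section] / 2
    return relaxed

b, rate = throughput_per_day(cycle_times)                        # 64 units/day
_, rate2 = throughput_per_day(with_parallel_station(cycle_times, b))
```

Here duplicating the assembly station lifts daily output from 64 to 96 units, because the bottleneck moves to testing (5.0 min/unit); a discrete-event simulation, as in the study, additionally captures buffer sizes and variability that this static calculation ignores.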
Saitou, Takashi; Imamura, Takeshi
2016-01-01
Cell cycle progression is strictly coordinated to ensure proper tissue growth, development, and regeneration of multicellular organisms. Spatiotemporal visualization of cell cycle phases directly helps us to obtain a deeper understanding of controlled, multicellular, cell cycle progression. The fluorescent ubiquitination-based cell cycle indicator (Fucci) system allows us to monitor, in living cells, the G1 and the S/G2/M phases of the cell cycle in red and green fluorescent colors, respectively. Since the introduction of Fucci technology, it has found numerous applications in the characterization of the timing of cell cycle phase transitions under diverse conditions and various biological processes. However, due to the complexity of cell cycle dynamics, understanding of specific patterns of cell cycle progression is still far from complete. In order to tackle this issue, quantitative approaches combined with mathematical modeling seem to be essential. Here, we review several studies that attempted to integrate Fucci technology and mathematical models to obtain quantitative information regarding cell cycle regulatory patterns. Focusing on the technological development of utilizing mathematics to retrieve meaningful information from the data that Fucci produces, we discuss how the combined methods advance a quantitative understanding of cell cycle regulation. © 2015 Japanese Society of Developmental Biologists.
NASA Astrophysics Data System (ADS)
Çelik, Emre; Uzun, Yunus; Kurt, Erol; Öztürk, Nihat; Topaloğlu, Nurettin
2018-01-01
An application of an artificial neural network (ANN) has been implemented in this article to model the nonlinear relationship of the harvested electrical power of a recently developed piezoelectric pendulum with respect to its resistive load R_L and magnetic excitation frequency f. Prediction of harvested power over a wide range is a difficult task, because the power increases dramatically as f approaches the natural frequency f_0 of the system. The neural model of the concerned system is designed on the basis of a standard multi-layer network with a back-propagation learning algorithm. Input data (input patterns) presented to the network and the corresponding output data (output patterns) describing the desired network output were carefully collected from the experiment under several conditions in order to train the developed network accurately. Results have indicated that the designed ANN is an effective means for predicting the harvested power of the piezoelectric harvester as a function of R_L and f, with a root mean square error of 6.65 × 10^-3 for training and 1.40 for different test conditions. Using the proposed approach, the harvested power can be estimated reasonably without tackling the difficulty of experimental studies and the complexity of analytical formulas representing the concerned system.
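The modelling approach can be sketched with a small NumPy multi-layer network trained by back-propagation on input patterns (R_L, f) and output patterns (power). The resonance-shaped surrogate function, network size, and learning rate below are illustrative assumptions standing in for the paper's experimental data and architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic surrogate for harvested power with a peak near f0 = 40 Hz
# (NOT the paper's measured data; purely illustrative).
def power(r_load, f, f0=40.0):
    return r_load / ((r_load + 1.0) * (1.0 + ((f - f0) / 2.0) ** 2))

X = rng.uniform([0.1, 20.0], [10.0, 60.0], size=(400, 2))  # (R_L, f) patterns
y = power(X[:, 0], X[:, 1])[:, None]

# Normalize inputs; one tanh hidden layer; plain batch gradient descent
Xn = (X - X.mean(0)) / X.std(0)
W1 = rng.standard_normal((2, 16)) * 0.5; b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5; b2 = np.zeros(1)

lr = 0.05
for _ in range(3000):
    h = np.tanh(Xn @ W1 + b1)              # forward pass
    err = (h @ W2 + b2) - y
    # back-propagation of the mean-squared-error gradient
    gW2 = h.T @ err / len(y); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = Xn.T @ dh / len(y); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

h = np.tanh(Xn @ W1 + b1)
rmse = float(np.sqrt(np.mean((h @ W2 + b2 - y) ** 2)))
```

After training, the network's RMSE should fall well below that of a constant predictor, the same kind of fit-quality comparison the paper reports for its trained ANN.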
Efficient integration of spectral features for vehicle tracking utilizing an adaptive sensor
NASA Astrophysics Data System (ADS)
Uzkent, Burak; Hoffman, Matthew J.; Vodacek, Anthony
2015-03-01
Object tracking in urban environments is an important and challenging problem that is traditionally tackled using visible and near-infrared wavelengths. By incorporating extended data such as the spectral features of the objects, one can improve the reliability of the identification process. However, the huge increase in data created by hyperspectral imaging is usually prohibitive. To overcome this complexity problem, we propose a persistent air-to-ground target tracking system inspired by a state-of-the-art, adaptive, multi-modal sensor. The adaptive sensor is capable of providing panchromatic images as well as the spectra of desired pixels. This addresses the data challenge of hyperspectral tracking by recording spectral data only as needed. Spectral likelihoods are integrated into a data association algorithm in a Bayesian fashion to minimize the likelihood of misidentification. A framework for controlling spectral data collection is developed by incorporating motion segmentation information and prior information from Gaussian Sum filter (GSF) movement predictions drawn from a multi-model forecasting set. An intersection mask of the surveillance area is extracted from the OpenStreetMap source and incorporated into the tracking algorithm to perform online refinement of the multiple-model set. The proposed system is tested using challenging and realistic scenarios generated in an adverse environment.
Design of a Modular Monolithic Implicit Solver for Multi-Physics Applications
NASA Technical Reports Server (NTRS)
Carton De Wiart, Corentin; Diosady, Laslo T.; Garai, Anirban; Burgess, Nicholas; Blonigan, Patrick; Ekelschot, Dirk; Murman, Scott M.
2018-01-01
The design of a modular multi-physics high-order space-time finite-element framework is presented together with its extension to allow monolithic coupling of different physics. One of the main objectives of the framework is to perform efficient high-fidelity simulations of capsule/parachute systems. This problem requires simulating multiple physics including, but not limited to, the compressible Navier-Stokes equations, the dynamics of a moving body with mesh deformations and adaptation, the linear shell equations, non-reflective boundary conditions and wall modeling. The solver is based on high-order space-time finite-element methods. Continuous, discontinuous and C1-discontinuous Galerkin methods are implemented, allowing one to discretize various physical models. Tangent and adjoint sensitivity analyses are also targeted in order to conduct gradient-based optimization, error estimation, mesh adaptation, and flow control, adding another layer of complexity to the framework. The decisions made to tackle these challenges are presented. The discussion focuses first on the "single-physics" solver and later on its extension to the monolithic coupling of different physics. The implementation of different physics modules, relevant to the capsule/parachute system, is also presented. Finally, examples of coupled computations are presented, paving the way to the simulation of the full capsule/parachute system.
Experimental Economies and Tax Evasion: The Order Beyond the Market
NASA Astrophysics Data System (ADS)
Bernhofer, Juliana
Research on tax evasion will probably never get old. As long as there are taxes, there will also be policy-makers all over the world eager to tackle deviant conduct in the most efficient and efficacious way. For this purpose a number of theoretical and empirical frameworks have been developed in economics over the last decades, starting from the classical models of Allingham and Sandmo (1972), where individuals were assumed to be perfectly rational, following a pure cost-benefit logic. Today, however, we look at a body of literature which has opened up to a number of new and interdisciplinary findings, also thanks to the inclusion of behavioral aspects that do not necessarily follow the paradigms of the homo economicus. To this end, the discipline of Experimental Economics has developed numerous ways to overcome the distance between economic theory and human behavior. The aim of this survey is to take the reader on a tour through some of these methodologies applied to the analysis of tax evasion, arguing that further research should focus on integrating multi-agent simulation models with outcomes from human subject experiments in order to create useful and necessary tools to administer, consolidate and represent the complex theoretical, empirical and experimental panorama of tax evasion research.
A multivariate prediction model for Rho-dependent termination of transcription.
Nadiras, Cédric; Eveno, Eric; Schwartz, Annie; Figueroa-Bossi, Nara; Boudvillain, Marc
2018-06-21
Bacterial transcription termination proceeds via two main mechanisms triggered either by simple, well-conserved (intrinsic) nucleic acid motifs or by the motor protein Rho. Although bacterial genomes can harbor hundreds of termination signals of either type, only intrinsic terminators are reliably predicted. Computational tools to detect the more complex and diversiform Rho-dependent terminators are lacking. To tackle this issue, we devised a prediction method based on Orthogonal Projections to Latent Structures Discriminant Analysis [OPLS-DA] of a large set of in vitro termination data. Using previously uncharacterized genomic sequences for biochemical evaluation and OPLS-DA, we identified new Rho-dependent signals and quantitative sequence descriptors with significant predictive value. Most relevant descriptors specify features of transcript C>G skewness, secondary structure, and richness in regularly-spaced 5'CC/UC dinucleotides that are consistent with known principles for Rho-RNA interaction. Descriptors collectively warrant OPLS-DA predictions of Rho-dependent termination with a ∼85% success rate. Scanning of the Escherichia coli genome with the OPLS-DA model identifies significantly more termination-competent regions than anticipated from transcriptomics and predicts that regions intrinsically refractory to Rho are primarily located in open reading frames. Altogether, this work delineates features important for Rho activity and describes the first method able to predict Rho-dependent terminators in bacterial genomes.
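Two of the named sequence descriptors are straightforward to compute. The sketch below implements C>G skewness directly, plus a crude proxy for "richness in regularly-spaced 5'CC/UC dinucleotides"; the window length and spacing are illustrative assumptions, not the paper's actual OPLS-DA descriptors.

```python
def cg_skew(seq):
    """C>G skew of an RNA/DNA string: (C - G) / (C + G)."""
    c, g = seq.count("C"), seq.count("G")
    return (c - g) / (c + g) if c + g else 0.0

def yc_density(seq, spacing=12, window=78):
    """Illustrative proxy for regularly spaced CC/UC dinucleotides: over sliding
    windows, the best count of positions i on a regular grid where seq[i] == 'C'
    and the preceding base is C or U. Spacing/window values are assumptions."""
    best = 0
    for start in range(0, max(1, len(seq) - window + 1)):
        w = seq[start:start + window]
        for offset in range(spacing):
            hits = sum(1 for i in range(offset + 1, len(w), spacing)
                       if w[i] == "C" and w[i - 1] in "CU")
            best = max(best, hits)
    return best
```

Feeding such per-sequence descriptors into a discriminant model (OPLS-DA in the paper) is what turns these raw sequence statistics into a termination-probability prediction.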
TSOS and TSOS-FK hybrid methods for modelling the propagation of seismic waves
NASA Astrophysics Data System (ADS)
Ma, Jian; Yang, Dinghui; Tong, Ping; Ma, Xiao
2018-05-01
We develop a new time-space optimized symplectic (TSOS) method for numerically solving elastic wave equations in heterogeneous isotropic media. We use the phase-preserving symplectic partitioned Runge-Kutta method to evaluate the time derivatives and optimized explicit finite-difference (FD) schemes to discretize the space derivatives. We introduce the averaged medium scheme into the TSOS method to further increase its capability of dealing with heterogeneous media and match the boundary-modified scheme for implementing free-surface boundary conditions and the auxiliary differential equation complex frequency-shifted perfectly matched layer (ADE CFS-PML) non-reflecting boundaries with the TSOS method. A comparison of the TSOS method with analytical solutions and standard FD schemes indicates that the waveform generated by the TSOS method is more similar to the analytic solution and has a smaller error than other FD methods, which illustrates the efficiency and accuracy of the TSOS method. Subsequently, we focus on the calculation of synthetic seismograms for teleseismic P- or S-waves entering and propagating in the local heterogeneous region of interest. To improve the computational efficiency, we successfully combine the TSOS method with the frequency-wavenumber (FK) method and apply the ADE CFS-PML to absorb the scattered waves caused by the regional heterogeneity. The TSOS-FK hybrid method is benchmarked against semi-analytical solutions provided by the FK method for a 1-D layered model. Several numerical experiments, including a vertical cross-section of the Chinese capital area crustal model, illustrate that the TSOS-FK hybrid method works well for modelling waves propagating in complex heterogeneous media and remains stable for long-time computation. These numerical examples also show that the TSOS-FK method can tackle the converted and scattered waves of the teleseismic plane waves caused by local heterogeneity. 
Thus, the TSOS and TSOS-FK methods proposed in this study present an essential tool for the joint inversion of local, regional, and teleseismic waveform data.
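The time-stepping idea — a symplectic partitioned update in time combined with a high-order centered finite-difference stencil in space — can be illustrated on the 1-D scalar wave equation. This toy uses a standard velocity-Verlet split (a simple symplectic partitioned integrator) with a 4th-order stencil and periodic boundaries, not the paper's optimized phase-preserving coefficients, elastic equations, or PML boundaries.

```python
import numpy as np

def step(u, v, c, dx, dt):
    """One symplectic (velocity-Verlet) step of u_tt = c^2 u_xx, periodic BCs."""
    def lap(w):  # 4th-order centered second derivative
        return (-np.roll(w, 2) + 16 * np.roll(w, 1) - 30 * w
                + 16 * np.roll(w, -1) - np.roll(w, -2)) / (12 * dx ** 2)
    v_half = v + 0.5 * dt * c ** 2 * lap(u)   # half kick
    u_new = u + dt * v_half                   # drift
    v_new = v_half + 0.5 * dt * c ** 2 * lap(u_new)  # half kick
    return u_new, v_new

n, dx, c = 200, 0.05, 1.0
x = dx * np.arange(n)
u = np.exp(-((x - 5.0) / 0.5) ** 2)  # Gaussian pulse initial condition
v = np.zeros(n)
dt = 0.4 * dx / c                    # CFL-limited time step
for _ in range(400):
    u, v = step(u, v, c, dx, dt)
```

The symplectic split keeps the discrete energy bounded over long runs, which is the property that lets schemes of this family "remain stable for long-time computation" as the abstract notes; the pulse simply splits into two counter-propagating halves here.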
Contact events in rugby union and their propensity to cause injury
Fuller, Colin W; Brooks, John H M; Cancea, Rebecca J; Hall, John; Kemp, Simon P T
2007-01-01
Objective The objective of this study was to determine the incidence of contact events in professional rugby union matches and to assess their propensity to cause injury. Design The study was a two‐season (2003/2004 and 2005/2006) prospective cohort design. It included 645 professional rugby union players from 13 English Premiership rugby union clubs. The main outcome measures were: incidence of match contact events (events per game); incidence (injuries per 1000 player‐hours and per 1000 contact events), risk (days lost per 1000 player‐hours and per 1000 contact events) and diagnosis of injury; referee's decision. Risk factors were player–player contact, position on pitch and period of play. Results Tackles (221.0 events/game) and rucks (142.5 events/game) were the most common events and mauls (13.6%) and scrums (12.6%) the most penalised. Tackles (701.6 days/1000 player‐hours) were responsible for the greatest loss of time but scrums (213.2 days lost/1000 events) and collisions (199.8 days lost/1000 events) presented the highest risk per event. Conclusions Tackles were the game event responsible for the highest number of injuries and the greatest loss of time in rugby union because they were by far the most common contact event. Collisions were 70% more likely to result in an injury than a tackle and scrums carried a 60% greater risk of injury than a tackle. The relative propensities for contact events to cause injury were rated as: lineout – very low; ruck – low; maul and tackle – average; collision and scrum – high. PMID:17513332
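The per-event propensities reported above follow from a simple rate conversion between per-game event counts and per-player-hour exposure. Assuming an 80-minute match with 30 players on the pitch (40 player-hours per game) — an assumption, since the paper defines its own exposure — the tackle figures convert as follows:

```python
# 30 players on the pitch for an 80-minute match = 40 player-hours per game
PLAYER_HOURS_PER_GAME = 30 * 80 / 60  # 40.0

def events_per_1000_player_hours(events_per_game):
    return events_per_game * 1000 / PLAYER_HOURS_PER_GAME

def days_lost_per_1000_events(days_lost_per_1000ph, events_per_game):
    # propensity per event = exposure-based burden divided by event frequency
    return days_lost_per_1000ph / events_per_1000_player_hours(events_per_game) * 1000

tackle_rate = events_per_1000_player_hours(221.0)        # 5525 tackles / 1000 ph
tackle_burden = days_lost_per_1000_events(701.6, 221.0)  # ~127 days / 1000 tackles
```

This is why tackles dominate total time lost (sheer frequency) while scrums (213.2 days/1000 events) and collisions (199.8 days/1000 events) carry a higher propensity per individual event.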
Contact events in rugby union and their propensity to cause injury.
Fuller, Colin W; Brooks, John H M; Cancea, Rebecca J; Hall, John; Kemp, Simon P T
2007-12-01
The objective of this study was to determine the incidence of contact events in professional rugby union matches and to assess their propensity to cause injury. The study was a two-season (2003/2004 and 2005/2006) prospective cohort design. It included 645 professional rugby union players from 13 English Premiership rugby union clubs. The main outcome measures were: incidence of match contact events (events per game); incidence (injuries per 1000 player-hours and per 1000 contact events), risk (days lost per 1000 player-hours and per 1000 contact events) and diagnosis of injury; referee's decision. Risk factors were player-player contact, position on pitch and period of play. Tackles (221.0 events/game) and rucks (142.5 events/game) were the most common events and mauls (13.6%) and scrums (12.6%) the most penalised. Tackles (701.6 days/1000 player-hours) were responsible for the greatest loss of time but scrums (213.2 days lost/1000 events) and collisions (199.8 days lost/1000 events) presented the highest risk per event. Tackles were the game event responsible for the highest number of injuries and the greatest loss of time in rugby union because they were by far the most common contact event. Collisions were 70% more likely to result in an injury than a tackle and scrums carried a 60% greater risk of injury than a tackle. The relative propensities for contact events to cause injury were rated as: lineout--very low; ruck--low; maul and tackle--average; collision and scrum--high.
International Vision Care: Issues and Approaches.
Khanna, Rohit C; Marmamula, Srinivas; Rao, Gullapalli N
2017-09-15
Globally, 32.4 million individuals are blind and 191 million have moderate or severe visual impairment (MSVI); 80% of cases of blindness and MSVI are avoidable. However, great efforts are needed to tackle blindness and MSVI, as eye care in most places is delivered in isolation from and without significant integration with general health sectors. Success stories, including control of vitamin A deficiency, onchocerciasis, and trachoma, showed that global partnerships, multisectoral collaboration, public-private partnerships, corporate philanthropy, support from nongovernmental organizations-both local and international-and governments are responsible for the success of these programs. Hence, the World Health Organization's universal eye health global action plan for 2014-2019 has a goal of reducing the public health problem of blindness and ensuring access to comprehensive eye care; the plan aims to integrate eye health into health systems, thus providing universal eye health coverage (UEHC). This article discusses the challenges faced by low- and middle-income countries in strengthening the six building blocks of the health system. It discusses how the health systems in these countries need to be geared toward tackling the issues of emerging noncommunicable eye diseases, existing infectious diseases, and the common causes of blindness and visual impairment, such as cataract and refractive error. It also discusses how some of the comprehensive eye care models in the developing world have addressed these challenges. Moving ahead, if we are to achieve UEHC, we need to develop robust, sustainable, good-quality, comprehensive eye care programs throughout the world, focusing on the areas of greatest need. We also need to develop public health approaches for more complex problems such as diabetic retinopathy, glaucoma, childhood blindness, corneal blindness, and low vision. 
There is also a great need to train high-level human resources of all cadres in adequate numbers and quality. In addition to this, we need to exploit the benefits of modern technological innovations in information, communications, biomedical technology, and other domains to enhance quality of, access to, and equity in eye care.
NASA Astrophysics Data System (ADS)
Carotenuto, Federico; Georgiadis, Teodoro; Gioli, Beniamino; Leyronas, Christel; Morris, Cindy E.; Nardino, Marianna; Wohlfahrt, Georg; Miglietta, Franco
2017-12-01
Microbial aerosols (mainly composed of bacterial and fungal cells) may constitute up to 74 % of the total aerosol volume. These biological aerosols are not only relevant to the dispersion of pathogens, but they also have geochemical implications. Some bacteria and fungi may, in fact, serve as cloud condensation or ice nuclei, potentially affecting cloud formation and precipitation, and are active at higher temperatures than their inorganic counterparts. Simulations of the impact of microbial aerosols on climate are still hindered by the lack of information regarding their emissions from ground sources. The present work tackles this knowledge gap by (i) applying a rigorous micrometeorological approach to the estimation of microbial net fluxes above a Mediterranean grassland and (ii) developing a deterministic model (the PLAnET model) to estimate these emissions on the basis of a few meteorological parameters that are easy to obtain. The grassland is characterized by an abundance of positive net microbial fluxes, and the model proves to be a promising tool capable of capturing the day-to-day variability in microbial fluxes with a relatively small bias and sufficient accuracy. PLAnET is still in its infancy and will benefit from future campaigns extending the available training dataset, as well as from the inclusion of ever more complex and critical phenomena triggering the emission of microbial aerosols (such as rainfall). The model itself is also adaptable as an emission module for dispersion and chemical transport models, allowing further exploration of the impact of land-cover-driven microbial aerosols on the atmosphere and climate.
2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS ...
2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR
Regulation of the mammalian heat shock factor 1.
Dayalan Naidu, Sharadha; Dinkova-Kostova, Albena T
2017-06-01
Living organisms are endowed with the capability to tackle various forms of cellular stress due to the presence of molecular chaperone machinery complexes that are ubiquitous throughout the cell. During conditions of proteotoxic stress, the transcription factor heat shock factor 1 (HSF1) mediates the elevation of heat shock proteins, which are crucial components of the chaperone complex machinery and function to ameliorate protein misfolding and aggregation and restore protein homeostasis. In addition, HSF1 orchestrates a versatile transcriptional programme that includes genes involved in repair and clearance of damaged macromolecules and maintenance of cell structure and metabolism, and provides protection against a broad range of cellular stress mediators, beyond heat shock. Here, we discuss the structure and function of the mammalian HSF1 and its regulation by post-translational modifications (phosphorylation, sumoylation and acetylation), proteasomal degradation, and small-molecule activators and inhibitors. © 2017 Federation of European Biochemical Societies.
The complex networks approach for authorship attribution of books
NASA Astrophysics Data System (ADS)
Mehri, Ali; Darooneh, Amir H.; Shariati, Ashrafalsadat
2012-04-01
Authorship analysis by means of textual features is an important task in linguistic studies. We employ complex network theory to tackle this disputed problem. In this work, we focus on some measurable quantities of the word co-occurrence network of each book for authorship characterization. Based on the network features, an attribution probability is defined for authorship identification. Furthermore, two scaling exponents, the q-parameter and the α-exponent, are combined to classify personal writing style with acceptably high resolution power. The q-parameter, generally known as the nonextensivity measure, is calculated for the degree distribution, and the α-exponent comes from a power-law relationship between the number of links and the number of nodes in the co-occurrence networks constructed for different books written by each author. The applicability of the presented method is evaluated in an experiment with thirty-six books by five Persian litterateurs. Our results show a high accuracy rate in authorship attribution.
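The core data structure behind this approach — a word co-occurrence network, from which degree distributions and link/node counts are measured — can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline; the window size and the toy token list are assumptions.

```python
from collections import defaultdict

def cooccurrence_network(words, window=2):
    """Build an undirected word co-occurrence network from a token list.

    Nodes are distinct words; an edge links two distinct words that
    appear within `window` tokens of each other."""
    edges = set()
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + window, len(words))):
            if w != words[j]:
                edges.add(frozenset((w, words[j])))
    return set(words), edges

def degree_distribution(nodes, edges):
    """Degree of each node: the number of distinct co-occurring words."""
    deg = defaultdict(int)
    for e in edges:
        for w in e:
            deg[w] += 1
    return dict(deg)

# Toy example (illustrative tokens, not from the study's corpus)
tokens = "the cat sat on the mat and the cat slept".split()
nodes, edges = cooccurrence_network(tokens)
deg = degree_distribution(nodes, edges)
```

In the paper's setting, the number of links versus the number of nodes across books yields the α-exponent, and the degree distribution feeds the q-parameter fit.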
Amalberti, René; Nicklin, Wendy; Braithwaite, Jeffrey
2016-06-01
Healthcare systems across the world are experiencing increased financial, organizational and social pressures attributable to a range of critical issues, including the challenge of ageing populations. Health systems need to adapt in order to sustainably provide quality care to the widest range of patients, particularly those with chronic and complex diseases, and especially those in vulnerable and low-income groups. We report on a workshop designed to tackle such issues under the auspices of ISQua, with representatives from Argentina, Australia, Canada, Colombia, Denmark, Emirates, France, Ireland, Jordan, Qatar, Malaysia, Norway, Oman, UK, South Africa and Switzerland. We discuss some of the challenges facing healthcare systems, from countries that are ageing rapidly to those less so, and touch on current and future reform options. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
The food environment is a complex social network.
Brown, David R; Brewster, Luther G
2015-05-01
The lack of demonstrated impact of the South LA fast food ban suggests that the policy was too narrowly crafted. Healthy food deserts like South LA are simultaneously unhealthy food swamps; and face myriad interrelated social, economic, and environmental challenges. The food environment is a complex social network impacted by social, economic and political factors at the neighborhood, regional, national, and international levels. Banning one subtype of unhealthy food venue is not likely to limit the availability of unhealthy processed and packaged foods nor result in increased access to affordable healthy foods. Food deserts and food insecurity are symptoms of the interacting pathologies of poverty, distressed communities, and unhealthy global macroeconomic and industrial policies. Policies that seek to impact urban health disparities need to tackle root causes including poverty and the global production and distribution of cheap, addictive, unhealthy products that promote unhealthy lifestyles. Copyright © 2015 Elsevier Ltd. All rights reserved.
Research, the lifeline of medicine.
Kornberg, A
1976-05-27
Advances in medicine spring from discoveries in physics, chemistry and biology. Among key contributions to the diagnosis, treatment and prevention of cardiovascular and pulmonary diseases, a recent Comroe-Dripps analysis shows two-thirds to have been basic rather than applied research. Without a firm foundation in basic knowledge, innovations perceived as advances prove hollow and collapse. Strong social, economic and political pressures now threaten the acquisition of basic knowledge. Scientists feel driven to undertake excessively complex problems and gamble against the historical record that science generally progresses by tackling discrete and well-defined questions. Regardless of circumstances, professional standards require the physician and scientist to be creative and to enlarge the fund of knowledge.
Managing Personal and Group Collections of Information
NASA Technical Reports Server (NTRS)
Wolfe, Shawn R.; Wragg, Stephen D.; Chen, James R.; Koga, Dennis (Technical Monitor)
1999-01-01
The internet revolution has dramatically increased the amount of information available to users. Various tools such as search engines have been developed to help users find the information they need in this vast repository. Users also need tools to help manipulate the growing amount of useful information they have discovered. Current tools available for this purpose are typically local components of web browsers designed to manage URL bookmarks, and they provide limited functionality for handling highly complex information. To tackle this problem, we have created DIAMS, an agent-based tool to help users or groups manage their information collections and share them with others. The main features of DIAMS are described here.
Systems thinking in combating infectious diseases.
Xia, Shang; Zhou, Xiao-Nong; Liu, Jiming
2017-09-11
The transmission of infectious diseases is a dynamic process determined by multiple factors originating from disease pathogens and/or parasites, vector species, and human populations. These factors interact with each other and reveal the intrinsic mechanisms of disease transmission temporally, spatially, and socially. In this article, we provide a comprehensive perspective, termed systems thinking, for investigating disease dynamics and the associated impact factors, by emphasizing the entirety of a system's components and the complexity of their interrelated behaviors. We further develop general steps for applying a systems approach to tackling infectious diseases in real-world settings, so as to expand our abilities to understand, predict, and mitigate infectious diseases.
Data-driven approaches in the investigation of social perception
Adolphs, Ralph; Nummenmaa, Lauri; Todorov, Alexander; Haxby, James V.
2016-01-01
The complexity of social perception poses a challenge to traditional approaches to understand its psychological and neurobiological underpinnings. Data-driven methods are particularly well suited to tackling the often high-dimensional nature of stimulus spaces and of neural representations that characterize social perception. Such methods are more exploratory, capitalize on rich and large datasets, and attempt to discover patterns often without strict hypothesis testing. We present four case studies here: behavioural studies on face judgements, two neuroimaging studies of movies, and eyetracking studies in autism. We conclude with suggestions for particular topics that seem ripe for data-driven approaches, as well as caveats and limitations. PMID:27069045
ERIC Educational Resources Information Center
Somech, Anit; Oplatka, Izhar
2009-01-01
Purpose: The current literature's call for a more ecological approach to violence theory, research, and practice stimulated the current study. This model postulates that teachers' willingness to engage in behaviors intended to tackle violence in school as part of their in-role duties (role breadth) will affect school violence. Specifically, the…
A Model of Active Ageing through Elder Learning: The Elder Academy Network in Hong Kong
ERIC Educational Resources Information Center
Tam, Maureen
2013-01-01
This article presents the Elder Academy (EA) Network as the policy and practice in promoting active ageing through elder learning in Hong Kong. First, the article examines how the change in demographics and the prevalent trend of an ageing population have propelled the government in Hong Kong to tackle issues and challenges brought about by an…
ERIC Educational Resources Information Center
Leniz, Ane; Zuza, Kristina; Guiasola, Jenaro
2017-01-01
This study examines the causal reasoning that university students use to explain how dc circuits work. We analyze how students use the concepts of electric field and potential difference in their explanatory models of dc circuits, and what kinds of reasoning they use at the macroscopic and microscopic levels in their explanations. This knowledge…
Tackling childhood obesity: the importance of understanding the context.
Knai, Cécile; McKee, Martin
2010-12-01
Recommendations to tackle major health problems such as childhood obesity may not be appropriate if they fail to take account of the prevailing socio-political, cultural and economic context. We describe the development and application of a qualitative risk analysis approach to identify non-scientific considerations framing the policy response to obesity in Denmark and Latvia. Interviews were conducted with key stakeholders in Denmark and Latvia, following a review of relevant literature on obesity and national policies. A qualitative risk analysis model was developed to help explain the findings in the light of national context. Non-scientific considerations that appeared to influence the response to obesity include the perceived relative importance of childhood obesity; the nature of stakeholder relations and its impact on decision-making; the place of obesity on the policy agenda; the legitimacy of the state to act for population health; and views on alliances between public and private sectors. Better recognition of the exogenous factors affecting policy-making may lead to a more adequate policy response. The development and use of a qualitative risk analysis model enabled a better understanding of the contextual factors and processes influencing the response to childhood obesity in each country.
NASA Astrophysics Data System (ADS)
Daude, F.; Galon, P.
2018-06-01
A Finite-Volume scheme for the numerical computation of compressible single- and two-phase flows in flexible pipelines is proposed, based on an approximate Godunov-type approach. The spatial discretization is obtained using the HLLC scheme. In addition, the numerical treatment of abrupt changes in cross-section and of networks of several pipelines connected at junctions is also considered. The proposed approach is based on the integral form of the governing equations, making it possible to tackle general equations of state. A coupled approach for the resolution of fluid-structure interaction of compressible fluid flowing in flexible pipes is considered. The structural problem is solved using Euler-Bernoulli beam finite elements. The present Finite-Volume method is applied to an ideal gas and to two-phase steam-water flows based on the Homogeneous Equilibrium Model (HEM) in conjunction with a tabulated equation of state in order to demonstrate its ability to tackle general equations of state. The extensive application of the scheme to both shock tube and other transient flow problems demonstrates its capability to resolve such problems accurately and robustly. Finally, the proposed 1-D fluid-structure interaction model appears to be computationally efficient.
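The HLLC flux at the interface of such a Godunov-type scheme can be sketched for the 1-D Euler equations with an ideal gas. This is a minimal illustration using Davis wave-speed estimates and an assumed gamma = 1.4, not the paper's HEM/tabulated-equation-of-state implementation.

```python
import math

GAMMA = 1.4  # ideal-gas ratio of specific heats (assumed for this sketch)

def flux(rho, u, p):
    """Exact Euler flux (mass, momentum, energy) of a primitive state."""
    E = p / (GAMMA - 1.0) + 0.5 * rho * u * u
    return [rho * u, rho * u * u + p, u * (E + p)]

def hllc_flux(left, right):
    """HLLC approximate Riemann flux for the 1-D Euler equations.

    `left` and `right` are (rho, u, p) primitive states; returns the
    numerical flux of the conserved variables at the interface."""
    rL, uL, pL = left
    rR, uR, pR = right
    aL = math.sqrt(GAMMA * pL / rL)   # sound speeds
    aR = math.sqrt(GAMMA * pR / rR)
    EL = pL / (GAMMA - 1.0) + 0.5 * rL * uL * uL
    ER = pR / (GAMMA - 1.0) + 0.5 * rR * uR * uR
    # Davis estimates for the fastest left/right waves, and the contact speed
    sL = min(uL - aL, uR - aR)
    sR = max(uL + aL, uR + aR)
    sM = (pR - pL + rL * uL * (sL - uL) - rR * uR * (sR - uR)) / \
         (rL * (sL - uL) - rR * (sR - uR))
    if sL >= 0.0:
        return flux(rL, uL, pL)
    if sR <= 0.0:
        return flux(rR, uR, pR)

    def star_state(r, u, p, E, s):
        # Conserved state between the outer wave `s` and the contact `sM`
        rs = r * (s - u) / (s - sM)
        return [rs, rs * sM,
                rs * (E / r + (sM - u) * (sM + p / (r * (s - u))))]

    if sM >= 0.0:
        U0 = [rL, rL * uL, EL]
        Us = star_state(rL, uL, pL, EL, sL)
        F0 = flux(rL, uL, pL)
        return [f + sL * (a - b) for f, a, b in zip(F0, Us, U0)]
    U0 = [rR, rR * uR, ER]
    Us = star_state(rR, uR, pR, ER, sR)
    F0 = flux(rR, uR, pR)
    return [f + sR * (a - b) for f, a, b in zip(F0, Us, U0)]
```

A quick sanity check: for identical left and right states the HLLC flux reduces to the exact flux of that state.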
Quantitative Diagnosis of Continuous-Valued, Steady-State Systems
NASA Technical Reports Server (NTRS)
Rouquette, N.
1995-01-01
Quantitative diagnosis involves numerically estimating the values of unobservable parameters that best explain the observed parameter values. We consider quantitative diagnosis for continuous, lumped-parameter, steady-state physical systems because such models are easy to construct and the diagnosis problem is considerably simpler than that for corresponding dynamic models. To further tackle the difficulties of numerically inverting a simulation model to compute a diagnosis, we propose to decompose a physical system model in terms of feedback loops. This decomposition reduces the dimension of the problem and consequently decreases the diagnosis search space. We illustrate this approach on a model of a thermal control system studied in earlier research.
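The core idea — numerically inverting a steady-state simulation model to recover an unobservable parameter from an observation — can be illustrated with a toy lumped-parameter thermal model inverted by bisection. The model and parameter names here are hypothetical, not the system studied in the abstract.

```python
def steady_state_temp(conductance, power=10.0, ambient=20.0):
    """Toy lumped-parameter model: at steady state, heat in equals heat
    out, so the component temperature is ambient + power / conductance."""
    return ambient + power / conductance

def diagnose(observed_temp, lo=0.01, hi=100.0):
    """Invert the model by bisection: find the (unobservable) conductance
    whose predicted steady-state temperature matches the observation.
    Temperature decreases monotonically as conductance increases."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if steady_state_temp(mid) > observed_temp:
            lo = mid   # predicted too hot: conductance must be larger
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A feedback-loop decomposition, as in the paper, would split a larger model into several such one-dimensional inversions instead of searching all parameters jointly.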
Bayesian state space models for dynamic genetic network construction across multiple tissues.
Liang, Yulan; Kelemen, Arpad
2016-08-01
Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research on complex diseases, and estimating the dynamic changes of the temporal correlations and non-stationarity is key to this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge, inferring the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-variant and include temporal correlation structures in the covariance matrix estimation in the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models, using Markov chain Monte Carlo and Gibbs sampling algorithms, are employed to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state space models to Affymetrix time-course data sets from multiple tissues (liver, skeletal muscle, and kidney) following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and the gene-gene interactions in response to CS treatment can be well captured by the proposed models. The proposed dynamic hierarchical Bayesian state space modeling approaches could be expanded and applied to other large-scale genomic data, such as next-generation sequencing (NGS) data combined with real-time and time-varying electronic health record (EHR) data, for more comprehensive and robust systematic and network-based analysis in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.
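A full hierarchical Bayesian model with Gibbs sampling is beyond a short sketch, but the underlying state-space machinery — including the treatment of unseen time points as hidden states handled by prediction-only steps — can be illustrated with a scalar linear-Gaussian model and a Kalman filter. All parameter values here are illustrative assumptions, not the paper's estimates.

```python
def kalman_filter_1d(observations, a=0.9, q=0.1, r=0.5, m0=0.0, p0=1.0):
    """Filter a scalar linear-Gaussian state space model
         x_t = a * x_{t-1} + w_t,   y_t = x_t + v_t,
    with process variance q and observation variance r.

    `observations` may contain None for unseen time points; those steps
    run prediction only, so the state stays hidden but is still tracked."""
    m, p = m0, p0
    means = []
    for y in observations:
        # predict the next state from the transition model
        m = a * m
        p = a * a * p + q
        # update with the measurement, unless the time point is unobserved
        if y is not None:
            k = p / (p + r)          # Kalman gain
            m = m + k * (y - m)
            p = (1.0 - k) * p
        means.append(m)
    return means
```

In the paper's setting, the transition and observation matrices are themselves time-variant and given priors, with MCMC/Gibbs sampling replacing the fixed parameters assumed above.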
Tackling the Triple-Threat Genome of Miscanthus x giganteus (2010 JGI User Meeting)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moose, Steve
2010-03-25
Steve Moose from the University of Illinois at Urbana-Champaign and the Energy Biosciences Institute on "Tackling the Triple-Threat Genome of Miscanthus x giganteus" on March 25, 2010 at the 5th Annual DOE JGI User Meeting.
Ferreira da Costa, Joana; Silva, David; Caamaño, Olga; Brea, José M; Loza, Maria Isabel; Munteanu, Cristian R; Pazos, Alejandro; García-Mera, Xerardo; González-Díaz, Humbert
2018-06-25
Predicting drug-protein interactions (DPIs) for target proteins involved in dopamine pathways is a very important goal in medicinal chemistry. We can tackle this problem using Molecular Docking or Machine Learning (ML) models for one specific protein. Unfortunately, these models fail to account for large and complex big data sets of preclinical assays reported in public databases. This includes multiple conditions of assays, such as different experimental parameters, biological assays, target proteins, cell lines, organism of the target, or organism of assay. On the other hand, perturbation theory (PT) models allow us to predict the properties of a query compound or molecular system in experimental assays with multiple boundary conditions based on a previously known case of reference. In this work, we report the first PTML (PT + ML) study of a large ChEMBL data set of preclinical assays of compounds targeting dopamine pathway proteins. The best PTML model found predicts 50000 cases with accuracy of 70-91% in training and external validation series. We also compared the linear PTML model with alternative PTML models trained with multiple nonlinear methods (artificial neural network (ANN), Random Forest, Deep Learning, etc.). Some of the nonlinear methods outperform the linear model but at the cost of a notable increment of the complexity of the model. We illustrated the practical use of the new model with a proof-of-concept theoretical-experimental study. We reported for the first time the organic synthesis, chemical characterization, and pharmacological assay of a new series of l-prolyl-l-leucyl-glycinamide (PLG) peptidomimetic compounds. In addition, we performed a molecular docking study for some of these compounds with the software Vina AutoDock. The work ends with a PTML model predictive study of the outcomes of the new compounds in a large number of assays. 
Therefore, this study offers a new computational methodology for predicting the outcome for any compound in new assays. This PTML method focuses on the prediction, with a simple linear model, of multiple pharmacological parameters (IC50, EC50, Ki, etc.) for compounds in assays involving different cell lines, organisms of the protein target, or organisms of assay for proteins in the dopamine pathway.
Association between community socioeconomic characteristics and access to youth flag football.
Kroshus, Emily; Sonnen, Aly J; Chrisman, Sara Pd; Rivara, Frederick P
2018-01-12
The American Academy of Pediatrics has recommended that opportunities for non-tackling American football (e.g., flag football) be expanded, given concerns about the risks of brain trauma from tackle football. This study tested the hypothesis that flag football would be more accessible in communities characterised by higher socioeconomic status residents. In July 2017, the locations of community-based organisations offering youth flag and tackle football for youth between the ages of 6 and 13 in two US states (Georgia and Washington) were aggregated (n=440). Organisations were coded in terms of the availability of tackle and/or flag football teams for youth at each year of age between 6 and 13. Multivariate logistic regression analyses were used to assess the odds of a community-based football organisation offering flag football, by community socioeconomic and demographic characteristics. In both states, communities with more educated residents were more likely to offer flag football for youth aged 6-12. For example, among 6 year-olds every 10% increase in the number of adult residents with a college education was associated with 1.51 times the odds of flag football availability (95% CI 1.22 to 1.86, P<0.001). These results suggest that youth living in communities characterised by low educational attainment are less likely than other youth to have the option of a lower contact alternative to tackle football. Relying on voluntary community-level adoption of lower contact alternatives to tackle football may result in inequitable access to such sport options. This may contribute to an inequitable burden of brain trauma from youth sport. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
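The reported effect — every 10-point increase in college-educated residents associated with 1.51 times the odds of flag football availability — is the standard conversion of a logistic-regression coefficient to an odds ratio. A minimal sketch of that arithmetic (the coefficient below is back-derived from the abstract's odds ratio purely for illustration):

```python
import math

def odds_ratio(beta, delta=1.0):
    """Odds ratio for a `delta`-unit increase in a predictor whose
    logistic-regression coefficient is `beta` (on the log-odds scale)."""
    return math.exp(beta * delta)

# Illustrative: a per-percentage-point coefficient of log(1.51)/10
# implies an odds ratio of 1.51 per 10-point increase, matching the
# abstract's example for 6-year-olds.
beta = math.log(1.51) / 10
```

The same conversion, applied per unit instead of per 10 units, gives the odds ratio for a 1-point change: exp(beta).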
Looking behind the bars: emerging health issues for people in prison.
Stürup-Toft, S; O'Moore, E J; Plugge, E H
2018-03-01
There are more than 10 million people imprisoned worldwide. These individuals experience a higher burden of communicable and non-communicable disease, mental health and substance misuse problems than the general population and often come from marginalized and underserved groups in the community. Prisons offer an important opportunity for tackling health problems in a way that can deliver benefits to the individual and to the community. This paper focuses specifically on emerging health issues for prisons across the world. This paper uses sources of international data from published systematic reviews and research studies, the Ministry of Justice for England and Wales, the Prisons and Probations Ombudsmen Review and other United Kingdom government briefing papers. Deaths in custody are a key concern for the justice system as well as the health system. Suicide is the leading cause of mortality in prisons worldwide but non-communicable diseases, such as cardiovascular disease, are increasing in importance in high-income countries and are now the leading cause of mortality in prisons in England and Wales. The prison population is ageing in most high-income countries. Older people in prison typically have multiple and complex medical and social care needs including reduced mobility and personal care needs as well as poor health. Further research is needed to understand the complex relationship between sentencing patterns, the ageing prison population and deaths in custody; to model its impact on prisons and healthcare provision in the future and to determine effective and cost-effective models of care. Research into the health of prisoners is important in improving the health of prisoners but there is considerable variation in quantity and quality between countries. Recent innovations seek to address this disparity and facilitate the sharing of good practice.
NASA Astrophysics Data System (ADS)
Uen, T. S.; Tsai, W. P.; Chang, F. J.; Huang, A.
2016-12-01
In recent years, urbanization has had a great effect on population growth and on the management of the water, food and energy nexus (WFE nexus) in Taiwan. Resource shortages in the WFE nexus have become a long-term and thorny issue due to its complex interactions. In view of rapid socio-economic development, it is imperative to explore an efficient and practical approach to WFE resource management. This study aims to search for the optimal solution to the WFE nexus and to construct a stable water supply system for multiple stakeholders. The Shimen and Feitsui Reservoirs in northern Taiwan are chosen for joint operation for water supply. This study intends to achieve water resource allocation from the two reservoirs subject to different operating rules and allocation restrictions. The joint operation has multiple objectives: maximizing hydro-power synergistic gains while minimizing water supply deficits as well as food shortages. We propose a multi-objective evolutionary optimization model for analyzing the hydro-power synergistic gains and suggesting the most favorable solutions in terms of tradeoffs between WFE. First, this study collected data from the two reservoirs and the Taiwan Power Company. Next, we built a WFE nexus model based on system dynamics. Finally, this study optimized the joint operation of the two reservoirs and calculated the synergy of hydro-power generation. The proposed methodology can tackle complex joint reservoir operation problems. The results can suggest a reliable policy for joint reservoir operation to create a green economic city under the lowest water supply risks.
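Multi-objective evolutionary optimization of this kind ultimately reports a Pareto front of tradeoff solutions. The dominance filter at its core can be sketched as follows; the three objectives and the candidate values are invented for illustration (hydropower is negated so that all objectives are minimized), and this is not the study's actual model.

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (all objectives
    to be minimized): no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep the non-dominated solutions (each a tuple of objective values)."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical candidate operating policies, scored as
# (-hydropower_gain, water_supply_deficit, food_shortage):
candidates = [(-100, 5, 2), (-90, 4, 1), (-100, 6, 2), (-80, 7, 3)]
front = pareto_front(candidates)
```

An evolutionary algorithm such as NSGA-II repeatedly applies a filter like this to rank populations of candidate reservoir-operation policies.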
Theodosopoulou, Eleni; Papanastasiou, Elena
2014-01-01
Objective This study aims to investigate the prevalence of multimorbidity in Cyprus and the extent to which citizens are satisfied with the currently provided healthcare, and to provide recommendations on the basis of the findings. Design A nationally based survey conducted through personal interviews, using a structured questionnaire designed for this survey. Setting Cyprus rural and urban areas (excluding Turkish occupied areas). Participants Four hundred and sixty-five Cypriot adults, average age 53 years. Main outcome measures Lifetime prevalence of self-reported non-communicable diseases. Results This study demonstrated initial evidence for a high prevalence of non-age-specific multimorbidity among Cypriots and dissatisfaction with their doctors, especially regarding the time allocated to discuss their general state of health. Recommendations focus on a new cost-effective, person-centred model of healthcare. The model prioritizes prevention rather than treatment, targeting the determinants of complexity before their influences create conditions that demand high-cost interventions, and it is based on three fundamental principles: (1) tackling health as a political issue, (2) empowering the patient and (3) introducing Applied Nutrition in the system. Conclusions This study shed light on the issue of patient complexity and revealed people's unmet needs and expectations for more person-centred care, providing a first challenge to the single-disease-based system of healthcare in Cyprus. The findings of the study may have important implications for government policies and highlight the need for more research in this area to inform policy makers, particularly in view of the fact that a new Health System is currently being designed. PMID:25057367
From Panoramic Photos to a Low-Cost Photogrammetric Workflow for Cultural Heritage 3d Documentation
NASA Astrophysics Data System (ADS)
D'Annibale, E.; Tassetti, A. N.; Malinverni, E. S.
2013-07-01
The research aims to optimize a workflow for architecture documentation: starting from panoramic photos, tackling available instruments and technologies to propose an integrated, quick and low-cost solution for Virtual Architecture. The broader research background shows how to use spherical panoramic images for architectural metric survey. The input data (oriented panoramic photos), the level of reliability and Image-based Modeling methods constitute an integrated and flexible 3D reconstruction approach: from the professional survey of cultural heritage to its communication in a virtual museum. The proposed work results from the integration and implementation of different techniques (Multi-Image Spherical Photogrammetry, Structure from Motion, Image-based Modeling) with the aim of achieving high metric accuracy and photorealistic performance. Different documentation options are possible within the proposed workflow: from the virtual navigation of spherical panoramas to complex solutions of simulation and virtual reconstruction. VR tools allow the integration of different technologies and the development of new solutions for virtual navigation. Image-based Modeling techniques allow 3D model reconstruction with photorealistic and high-resolution texture. The high resolution of the panoramic photos and the algorithms for panorama orientation and photogrammetric restitution ensure high accuracy and high-resolution texture. Automated techniques and their subsequent integration are the subject of this research. Data, suitably processed and integrated, provide different levels of analysis and virtual reconstruction, joining photogrammetric accuracy to the photorealistic performance of the shaped surfaces. Lastly, a new virtual navigation solution is tested: within a single environment, it offers the chance to interact with high-resolution oriented spherical panoramas and the 3D reconstructed model at once.
The Earthquake Source Inversion Validation (SIV) - Project: Summary, Status, Outlook
NASA Astrophysics Data System (ADS)
Mai, P. M.
2017-12-01
Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, this kinematic source inversion is ill-posed and returns non-unique solutions, as seen for instance in multiple source models for the same earthquake, obtained by different research teams, that often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversions and to understand strengths and weaknesses of various methods, the Source Inversion Validation (SIV) project developed a set of forward-modeling exercises and inversion benchmarks. Several research teams then use these validation exercises to test their codes and methods, but also to develop and benchmark new approaches. In this presentation I will summarize the SIV strategy, the existing benchmark exercises and corresponding results. Using various waveform-misfit criteria and newly developed statistical comparison tools to quantify source-model (dis)similarities, the SIV platform is able to rank solutions and identify particularly promising source inversion approaches. Existing SIV exercises (with related data and descriptions) and all computational tools remain available via the open online collaboration platform; additional exercises and benchmark tests will be uploaded once they are fully developed. I encourage source modelers to use the SIV benchmarks for developing and testing new methods. The SIV efforts have already led to several promising new techniques for tackling the earthquake-source imaging problem. I expect that future SIV benchmarks will provide further innovations and insights into earthquake source kinematics that will ultimately help to better understand the dynamics of the rupture process.
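One of the simplest waveform-misfit criteria used to rank candidate source models is a normalized L2 misfit between observed and synthetic seismograms. The sketch below is a generic illustration, not the SIV project's actual metrics or comparison tools.

```python
def l2_misfit(observed, synthetic):
    """Normalized L2 waveform misfit between an observed and a synthetic
    seismogram sampled at the same time points (0 means a perfect fit)."""
    num = sum((o - s) ** 2 for o, s in zip(observed, synthetic))
    den = sum(o ** 2 for o in observed)
    return (num / den) ** 0.5

def rank_models(observed, models):
    """Rank candidate source models (name -> synthetic waveform) from
    best- to worst-fitting by their misfit to the observed waveform."""
    return sorted(models, key=lambda name: l2_misfit(observed, models[name]))
```

In practice the SIV benchmarks combine several such criteria with statistical comparisons of slip models, since a single waveform norm cannot capture all (dis)similarities.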
Enabling Controlling Complex Networks with Local Topological Information.
Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene
2018-03-15
Complex networks characterize the nature of internal/external interactions in real-world systems including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving full control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost of driving the network to a predefined state with a given number of control inputs. For large complex networks without global information of network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, in which every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
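The structural-controllability side of this abstract can be made concrete with a short centralized sketch: by the minimum-inputs result of structural control theory, the minimum number of driver nodes equals N minus the size of a maximum matching of the directed network, which the paper's distributed local-game method approximates using only local information. The toy network and the augmenting-path matcher below are illustrative, not the authors' code.

```python
# Minimum driver nodes for structural controllability via maximum matching
# (centralized baseline; the paper's distributed method approximates this
# with local information only). Toy network, not from the paper.

def max_matching(nodes, edges):
    """Maximum matching on the directed network's bipartite representation
    (left = out-copies of nodes, right = in-copies), via augmenting paths."""
    adj = {u: [] for u in nodes}
    for u, v in edges:
        adj[u].append(v)
    match = {}  # right node -> matched left node

    def augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if v not in match or augment(match[v], seen):
                match[v] = u
                return True
        return False

    return sum(augment(u, set()) for u in nodes)

nodes = [1, 2, 3, 4]
edges = [(1, 2), (1, 3), (2, 4), (3, 4)]  # small directed network
m = max_matching(nodes, edges)
drivers = len(nodes) - m  # unmatched nodes must each be driven directly
print(m, drivers)  # → 2 2
```

Nodes left unmatched by the maximum matching are exactly the ones that need an external control input, which is why the distributed matching game in the paper translates directly into a driver-node placement.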
Tackling 'wicked' health promotion problems: a New Zealand case study.
Signal, Louise N; Walton, Mat D; Ni Mhurchu, Cliona; Maddison, Ralph; Bowers, Sharron G; Carter, Kristie N; Gorton, Delvina; Heta, Craig; Lanumata, Tolotea S; McKerchar, Christina W; O'Dea, Des; Pearce, Jamie
2013-03-01
This paper reports on a complex environmental approach to addressing 'wicked' health promotion problems devised to inform policy for enhancing food security and physical activity among Māori, Pacific and low-income people in New Zealand. This multi-phase research utilized literature reviews, focus groups, stakeholder workshops and key informant interviews. Participants included members of affected communities, policy-makers and academics. Results suggest that food security and physical activity 'emerge' from complex systems. Key areas for intervention include availability of money within households; the cost of food; improvements in urban design and culturally specific physical activity programmes. Seventeen prioritized intervention areas were explored in-depth and recommendations for action identified. These include healthy food subsidies, increasing the statutory minimum wage rate and enhancing open space and connectivity in communities. This approach has moved away from seeking individual solutions to complex social problems. In doing so, it has enabled the mapping of the relevant systems and the identification of a range of interventions while taking account of the views of affected communities and the concerns of policy-makers. The complex environmental approach used in this research provides a method to identify how to intervene in complex systems that may be relevant to other 'wicked' health promotion problems.
Dean, Erin
2016-11-30
Essential facts Trade union Unite has developed a policy briefing on a new toolkit to combat racism in the NHS. It can help nurses and other staff tackle racial discrimination in health, with black and minority ethnic (BME) nurses often treated unequally compared with their white colleagues.
The impact of tackle football injuries on the American healthcare system with a neurological focus.
McGinity, Michael J; Grandhi, Ramesh; Michalek, Joel E; Rodriguez, Jesse S; Trevino, Aron M; McGinity, Ashley C; Seifi, Ali
2018-01-01
Recent interest in the study of concussion and other neurological injuries has heightened awareness of the medical implications of American tackle football injuries amongst the public. Using the National Emergency Department Sample (NEDS) and the National Inpatient Sample (NIS), the largest publicly available all-payer emergency department and inpatient healthcare databases in the United States, we sought to describe the impact of tackle football injuries on the American healthcare system by delineating injuries, specifically neurological in nature, suffered as a consequence of tackle football between 2010 and 2013. The NEDS and NIS databases were queried to collect data on all patients presented to the emergency department (ED) and/or were admitted to hospitals with an ICD code for injuries related to American tackle football between the years 2010 and 2013. Subsequently those with football-related neurological injuries were abstracted using ICD codes for concussion, skull/face injury, intracranial injury, spine injury, and spinal cord injury (SCI). Patient demographics, length of hospital stay (LOS), cost and charge data, neurosurgical interventions, hospital type, and disposition were collected and analyzed. A total of 819,000 patients presented to EDs for evaluation of injuries secondary to American tackle football between 2010 and 2013, with 1.13% having injuries requiring inpatient admission (average length of stay 2.4 days). 80.4% of the ED visits were from the pediatric population. Of note, a statistically significant increase in the number of pediatric concussions over time was demonstrated (OR = 1.1, 95% CI 1.1 to 1.2). Patients were more likely to be admitted to trauma centers, teaching hospitals, the south or west regions, or with private insurance. There were 471 spinal cord injuries and 1,908 total spine injuries. Ten patients died during the study time period. The combined ED and inpatient charges were $1.35 billion. 
Injuries related to tackle football are a frequent cause of emergency room visits, specifically in the pediatric population, but severe acute trauma requiring inpatient admission or operative interventions are rare. Continued investigation in the long-term health impact of football related concussion and other repetitive lower impact trauma is warranted.
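The concussion trend above is reported as an odds ratio with a 95% confidence interval (OR = 1.1, 95% CI 1.1 to 1.2). As a reminder of how such figures are derived, this sketch computes an odds ratio with a Wald confidence interval from a 2x2 table; the counts are invented for illustration and are not from the NEDS/NIS data.

```python
# Odds ratio and Wald 95% CI for a 2x2 table.
# The counts below are hypothetical, NOT from the study's databases.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = outcome yes/no in group 1; c,d = outcome yes/no in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(1200, 9000, 1000, 8200)  # hypothetical counts
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Because the log odds ratio is approximately normal, the interval is built on the log scale and exponentiated back, which is the standard construction behind CIs like the one reported in the study.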
Untethered Recyclable Tubular Actuators with Versatile Locomotion for Soft Continuum Robots.
Qian, Xiaojie; Chen, Qiaomei; Yang, Yang; Xu, Yanshuang; Li, Zhen; Wang, Zhenhua; Wu, Yahe; Wei, Yen; Ji, Yan
2018-05-27
Stimuli-responsive materials offer a distinguished platform for building tether-free compact soft robots, which can combine sensing and actuation without a linked power supply. In the past, tubular soft robots had to be made from multiple components with various internal channels or complex cavities assembled together. Moreover, robust processing, complex locomotion, simple structure, and easy recyclability represent major challenges in this area. Here, it is shown that those challenges can be tackled by liquid crystalline elastomers with allyl sulfide functional groups. The light-controlled exchange reaction between allyl sulfide groups allows flexible processing of tubular soft robots/actuators without any assisting materials. Complex locomotion demonstrated here includes reversible simultaneous bending and elongation; reversible diameter expansion; and omnidirectional bending via remote infrared light control. Different modes of actuation can be programmed into the same tube without the routine assembly of multiple tubes as used in the past. In addition, the exchange reaction also makes it possible to use the same single tube repeatedly to perform different functions by erasing and reprogramming. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Reverse membrane bioreactor: Introduction to a new technology for biofuel production.
Mahboubi, Amir; Ylitervo, Päivi; Doyen, Wim; De Wever, Heleen; Taherzadeh, Mohammad J
2016-01-01
The novel concept of the reverse membrane bioreactor (rMBR) introduced in this review is a new membrane-assisted cell retention technique that benefits from the advantageous properties of both conventional MBRs and cell encapsulation techniques to tackle issues in the bioconversion and fermentation of complex feeds. The rMBR applies high local cell density and membrane separation of cells and feed to the conventional immersed membrane bioreactor (iMBR) setup. Moreover, this new membrane configuration functions on the basis of concentration-driven diffusion rather than the pressure-driven convection previously used in conventional MBRs. These new features give rMBRs an exceptional ability to handle complex bioconversion and fermentation feeds containing high concentrations of inhibitory compounds, a variety of sugar sources and a high suspended-solids content. In the current review, the similarities and differences between the rMBR, conventional MBRs and cell encapsulation are presented and compared with regard to their advantages, disadvantages, principles and applications for biofuel production. Moreover, the potential of rMBRs in the bioconversion of specific complex substrates of interest, such as lignocellulosic hydrolysate, is thoroughly examined. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
de Zelicourt, Diane; Ge, Liang; Sotiropoulos, Fotis; Yoganathan, Ajit
2008-11-01
Image-guided computational fluid dynamics has recently gained attention as a tool for predicting the outcome of different surgical scenarios. Cartesian Immersed-Boundary methods constitute an attractive option to tackle the complexity of real-life anatomies. However, when such methods are applied to the branching, multi-vessel configurations typically encountered in cardiovascular anatomies the majority of the grid nodes of the background Cartesian mesh end up lying outside the computational domain, increasing the memory and computational overhead without enhancing the numerical resolution in the region of interest. To remedy this situation, the method presented here superimposes local mesh refinement onto an unstructured Cartesian grid formulation. A baseline unstructured Cartesian mesh is generated by eliminating all nodes that reside in the exterior of the flow domain from the grid structure, and is locally refined in the vicinity of the immersed-boundary. The potential of the method is demonstrated by carrying out systematic mesh refinement studies for internal flow problems ranging in complexity from a 90 deg pipe bend to an actual, patient-specific anatomy reconstructed from magnetic resonance.
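The core idea of the unstructured Cartesian formulation above — drop the background grid nodes that lie outside the flow domain, then locally refine near the immersed boundary — can be sketched in a few lines. The disk geometry, spacing, and refinement band below are illustrative stand-ins for the patient-specific anatomies, not the authors' solver.

```python
# Sketch of an unstructured Cartesian grid: exterior nodes are excluded
# from the grid structure, and nodes near the immersed boundary are
# flagged for refinement. Geometry and spacings are illustrative only.
import math

def inside(x, y, R=1.0):
    return math.hypot(x, y) <= R  # toy flow domain: the unit disk

def build_grid(h, R=1.0, band=0.15):
    nodes, refined = [], []
    n = int(2 * R / h)
    for i in range(n + 1):
        for j in range(n + 1):
            x, y = -R + i * h, -R + j * h
            if not inside(x, y, R):
                continue  # exterior nodes never enter the grid structure
            nodes.append((x, y))
            # flag nodes within a band of the immersed boundary
            if abs(math.hypot(x, y) - R) < band:
                refined.append((x, y))
    return nodes, refined

nodes, refined = build_grid(h=0.25)
print(len(nodes), len(refined))  # far fewer nodes than the full 81-node box
```

Discarding exterior nodes is what keeps memory proportional to the flow domain rather than to the bounding box, and the refinement flags mark where a finer mesh would be superimposed near the boundary.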
NASA Astrophysics Data System (ADS)
Larasati, Ophilia; Puspita Dirgahayani, Eng., Dr.
2018-05-01
Transport services are essential to support daily life. A lack of transport supply leads to the existence of transport disadvantaged (TDA) groups who are vulnerable to social exclusion, which occurs when a particular group or individual has difficulty accessing activities that are considered normal in society. To tackle this phenomenon, an understanding of the influence of TDA variables on social exclusion is needed. The aim of this study is to analyze the influence of TDA variables on social exclusion in a rural context, with Cibeureum Village (Bandung Barat Regency) and Bunikasih Village (Subang Regency) as case studies. The two cases exhibit different accessibility characteristics. Partial Least Squares (PLS) Structural Equation Modeling (SEM) is chosen as the method to analyze the influence of TDA variables on social exclusion. The PLS-SEM model is developed from the social exclusion variable and four TDA variables, i.e., accessibility, individual characteristics, private vehicle ownership, and travel behavior. An importance-performance map analysis (IPMA) is conducted after the PLS-SEM model is evaluated. The study reveals that among the four TDA variables, accessibility has the most influence on social exclusion; hence, interventions that improve accessibility are needed to tackle social exclusion. More specifically, the provision of alternative modes is needed in both study areas, while in Bunikasih Village the cost of travel is also an important variable to consider.
Modeling an integrated hospital management planning problem using integer optimization approach
NASA Astrophysics Data System (ADS)
Sitepu, Suryati; Mawengkang, Herman; Irvan
2017-09-01
Hospitals are very important institutions for providing health care to people, and it is not surprising that demand for hospital services is increasing. However, due to the rising cost of healthcare services, hospitals need to consider efficiencies in order to cope with both rising demand and rising costs. This paper deals with an integrated strategy of staff capacity management and bed allocation planning to tackle these problems. Mathematically, the strategy can be modeled as an integer linear programming problem. We solve the model using a direct neighborhood search approach based on the notion of superbasic variables.
Optimization Model for Capacity Management and Bed Scheduling for Hospital
NASA Astrophysics Data System (ADS)
Sitepu, Suryati; Mawengkang, Herman; Husein, Ismail
2018-01-01
Hospitals are very important institutions for providing health care to people, and it is not surprising that demand for hospital services is increasing. However, due to the rising cost of healthcare services, hospitals need to consider efficiencies in order to cope with both rising demand and rising costs. This paper deals with an integrated strategy of staff capacity management and bed allocation planning to tackle these problems. Mathematically, the strategy can be modeled as an integer linear programming problem. We solve the model using a direct neighborhood search approach based on the notion of superbasic variables.
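The shape of such a staff/bed allocation model can be illustrated with a toy integer program. The paper solves its model with a direct neighborhood search on superbasic variables; the brute-force enumeration and all numbers below are invented purely to show the structure (integer decision variables, a capacity constraint, demand feasibility, and a cost objective).

```python
# Toy bed-allocation integer program, solved by exhaustive search.
# The paper uses a direct neighborhood search; this sketch only
# illustrates the model's shape. All numbers are made up.

demand = {"surgery": 30, "medicine": 45}      # expected patients per ward
cost_per_bed = {"surgery": 5, "medicine": 3}  # staffing cost per bed
total_beds = 80                               # hospital capacity

best = None
for s in range(total_beds + 1):               # integer decision variable
    alloc = {"surgery": s, "medicine": total_beds - s}
    # feasibility: each ward must cover its expected demand
    if any(alloc[w] < demand[w] for w in demand):
        continue
    cost = sum(alloc[w] * cost_per_bed[w] for w in demand)
    if best is None or cost < best[0]:
        best = (cost, dict(alloc))

print(best)  # → (300, {'surgery': 30, 'medicine': 50})
```

A real instance would have far more wards, shifts, and staffing variables, which is exactly why enumeration gives way to the neighborhood-search strategy described in the abstract.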
Code of Federal Regulations, 2010 CFR
2010-07-01
... the Enemy Act (TWEA) Penalties § 501.701 Penalties. (a) Attention is directed to section 16 of the..., papers, or other articles or documents, or any vessel, together with its tackle, apparel, furniture, and... articles or documents, or any vessel, together with its tackle, apparel, furniture, and equipment, that is...
Baltimore District Tackles High Suspension Rates
ERIC Educational Resources Information Center
Maxwell, Lesli A.
2007-01-01
This article reports on how the Baltimore District tackles its high suspension rates. Driven by an increasing belief that zero-tolerance disciplinary policies are ineffective, more educators are embracing strategies that do not exclude misbehaving students from school for offenses such as insubordination, disrespect, cutting class, tardiness, and…
McCaugherty, D
1991-09-01
Kurt Lewin, the originator of action research, proposed that it was valuable not only for innovating change, but also because the process of change could lead to new insights into the nature of the problem being tackled. This action research project developed and evaluated a teaching model that aimed to help RGN (registered general nurse) students to bridge the theory-practice gap. During the course of this work, the possible reasons for a theory-practice gap started to become clear. This paper provides a discussion of these factors. The viewpoint for this discussion is that of the student nurse: the student is assumed to 'own' the problem, and it is from her perspective that the theory-practice gap is analysed. The paper includes a critical examination of books, lectures, the school curriculum and ward nursing practice. Finally, possible solutions to the theory-practice problem are discussed, in the hope that these will provide a rational basis for tackling the problem.
Pereira, Anieli G; Sterli, Juliana; Moreira, Filipe R R; Schrago, Carlos G
2017-08-01
Despite their complex evolutionary history and the rich fossil record, the higher level phylogeny and historical biogeography of living turtles have not been investigated in a comprehensive and statistical framework. To tackle these issues, we assembled a large molecular dataset, maximizing both taxonomic and gene sampling. As different models provide alternative biogeographical scenarios, we have explicitly tested such hypotheses in order to reconstruct a robust biogeographical history of Testudines. We scanned publicly available databases for nucleotide sequences and composed a dataset comprising 13 loci for 294 living species of Testudines, which accounts for all living genera and 85% of their extant species diversity. Phylogenetic relationships and species divergence times were estimated using a thorough evaluation of fossil information as calibration priors. We then carried out the analysis of historical biogeography of Testudines in a fully statistical framework. Our study recovered the first large-scale phylogeny of turtles with well-supported relationships following the topology proposed by phylogenomic works. Our dating result consistently indicated that the origin of the main clades, Pleurodira and Cryptodira, occurred in the early Jurassic. The phylogenetic and historical biogeographical inferences permitted us to clarify how geological events affected the evolutionary dynamics of crown turtles. For instance, our analyses support the hypothesis that the breakup of Pangaea would have driven the divergence between the cryptodiran and pleurodiran lineages. The reticulated pattern in the ancestral distribution of the cryptodiran lineage suggests a complex biogeographic history for the clade, which was supposedly related to the complex paleogeographic history of Laurasia. On the other hand, the biogeographical history of Pleurodira indicated a tight correlation with the paleogeography of the Gondwanan landmasses. Copyright © 2017 Elsevier Inc. 
All rights reserved.
The NEUF-DIX space project - Non-EquilibriUm Fluctuations during DIffusion in compleX liquids.
Baaske, Philipp; Bataller, Henri; Braibanti, Marco; Carpineti, Marina; Cerbino, Roberto; Croccolo, Fabrizio; Donev, Aleksandar; Köhler, Werner; Ortiz de Zárate, José M; Vailati, Alberto
2016-12-01
Diffusion and thermal diffusion processes in a liquid mixture are accompanied by long-range non-equilibrium fluctuations, whose amplitude is orders of magnitude larger than that of equilibrium fluctuations. The mean-square amplitude of the non-equilibrium fluctuations exhibits a scale-free power-law behavior q^-4 as a function of the wave vector q, but the divergence of the amplitude of the fluctuations at small wave vectors is prevented by the presence of gravity. In microgravity conditions the non-equilibrium fluctuations are fully developed and span all the available length scales, up to the macroscopic size of the system in the direction parallel to the applied gradient. Available theoretical models are based on linearized hydrodynamics and provide an adequate description of the statics and dynamics of the fluctuations in the presence of small temperature/concentration gradients and under stationary or quasi-stationary conditions. We describe a project aimed at the investigation of Non-EquilibriUm Fluctuations during DIffusion in compleX liquids (NEUF-DIX). The focus of the project is the investigation, in microgravity conditions, of non-equilibrium fluctuations in complex liquids, tackling several challenging problems that emerged during the latest years: the theoretical prediction of Casimir-like forces induced by non-equilibrium fluctuations; the understanding of non-equilibrium fluctuations in multi-component mixtures including a polymer, both in relation to the transport coefficients and to their behavior close to a glass transition; the understanding of non-equilibrium fluctuations in concentrated colloidal suspensions, a problem closely related to the detection of Casimir forces; and the investigation of the development of fluctuations during transient diffusion. We envision paralleling these experiments with state-of-the-art multi-scale simulations.
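The gravitational saturation described above can be sketched with a common model form for the static amplitude of non-equilibrium fluctuations, S(q) ∝ 1/(q^4 + q_ro^4), where the roll-off wave vector q_ro is set by gravity and moves toward zero in microgravity. The prefactor and the numerical values below are illustrative assumptions, not parameters from the project.

```python
# Hedged sketch of the static amplitude of non-equilibrium fluctuations:
# S(q) ~ 1/(q^4 + q_ro^4). Units and q_ro values are illustrative.

def s_neq(q, q_ro):
    """Mean-square fluctuation amplitude vs wave vector (arbitrary units)."""
    return 1.0 / (q**4 + q_ro**4)

q_ro_ground = 1.0   # gravity caps the divergence at small q on the ground
q_ro_micro = 1e-3   # in microgravity the roll-off moves to tiny wave vectors

for q in (0.01, 0.1, 1.0, 10.0):
    print(q, s_neq(q, q_ro_ground), s_neq(q, q_ro_micro))
# At large q both curves follow the scale-free q^-4 law; at small q the
# ground curve saturates while the microgravity curve keeps growing.
```

This is why microgravity experiments can observe fully developed fluctuations spanning all available length scales, as the abstract states.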
How models can support ecosystem-based management of coral reefs
NASA Astrophysics Data System (ADS)
Weijerman, Mariska; Fulton, Elizabeth A.; Janssen, Annette B. G.; Kuiper, Jan J.; Leemans, Rik; Robson, Barbara J.; van de Leemput, Ingrid A.; Mooij, Wolf M.
2015-11-01
Despite the importance of coral reef ecosystems to the social and economic welfare of coastal communities, the condition of these marine ecosystems has generally degraded over the past decades. With an increased knowledge of coral reef ecosystem processes and a rise in computer power, dynamic models are useful tools in assessing the synergistic effects of local and global stressors on ecosystem functions. We review representative approaches for dynamically modeling coral reef ecosystems and categorize them as minimal, intermediate and complex models. The categorization was based on the leading principle for model development and their level of realism and process detail. This review aims to improve the knowledge of concurrent approaches in coral reef ecosystem modeling and highlights the importance of choosing an appropriate approach based on the type of question(s) to be answered. We contend that minimal and intermediate models are generally valuable tools to assess the response of key states to main stressors and, hence, contribute to understanding ecological surprises. As has been shown in freshwater resources management, insight into these conceptual relations profoundly influences how natural resource managers perceive their systems and how they manage ecosystem recovery. We argue that adaptive resource management requires integrated thinking and decision support, which demands a diversity of modeling approaches. Integration can be achieved through complementary use of models or through integrated models that systemically combine all relevant aspects in one model. Such whole-of-system models can be useful tools for quantitatively evaluating scenarios. These models allow an assessment of the interactive effects of multiple stressors on various, potentially conflicting, management objectives. All models simplify reality and, as such, have their weaknesses. While minimal models lack multidimensionality, system models can be difficult to interpret, as deciphering their numerous interactions and feedback loops requires considerable effort. Given the breadth of questions to be tackled when dealing with coral reefs, the best-practice approach uses multiple model types and thus benefits from the strengths of the different types.
Tackling Noncommunicable Diseases in Africa: Caveat Lector
ERIC Educational Resources Information Center
Mensah, George A.
2016-01-01
Noncommunicable disease (NCD), principally cardiovascular diseases, cancer, chronic lung disease, and diabetes, constitutes the major cause of death worldwide. Evidence of a continuing increase in the global burden of these diseases has generated recent urgent calls for global action to tackle and reduce related death and disability. Because the…
Local Communities and Schools Tackling Sustainability and Climate Change
ERIC Educational Resources Information Center
Flowers, Rick; Chodkiewicz, Andrew
2009-01-01
Local communities and their schools remain key sites for actions tackling issues of sustainability and climate change. A government-funded environmental education initiative, the Australian Sustainable Schools Initiative (AuSSI), working together with state based Sustainable Schools Programs (SSP), has the ability to support the development of…
29 CFR 1917.114 - Cargo doors.
Code of Federal Regulations, 2010 CFR
2010-07-01
... counterweights shall be guarded. (2) Lift trucks and cranes shall not be used to move mechanically operated doors.... (1) The door shall be connected to its lifting tackle with shackles or equally secure means. (2) Lifting bridles and tackles shall have a safety factor of five, based upon maximum anticipated static...
46 CFR 184.300 - Ground tackle and mooring lines.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 7 2014-10-01 2014-10-01 false Ground tackle and mooring lines. 184.300 Section 184.300 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) VESSEL CONTROL AND MISCELLANEOUS SYSTEMS AND EQUIPMENT Mooring and Towing Equipment § 184.300...
2014-01-01
Background Performance of health care systems is a key concern of policy makers and health service managers all over the world. It is also a major challenge, given its multidimensional nature, which easily leads to conceptual and methodological confusion. This is reflected by a scarcity of models that comprehensively analyse health system performance. Discussion In health, one of the most comprehensive performance frameworks was developed by the team of Leggat and Sicotte. Their framework integrates four key organisational functions (goal attainment, production, adaptation to the environment, and values and culture) and the tensions between these functions. We modified this framework to better fit the assessment of the performance of health organisations in the public service domain and propose an analytical strategy that takes into account the social complexity of health organisations. The resulting multipolar performance framework (MPF) is a meta-framework that facilitates the analysis of the relations and interactions between the multiple actors that influence the performance of health organisations. Summary Using the MPF in a dynamic, reiterative mode not only helps managers to identify the bottlenecks that hamper performance, but also the unintended effects and feedback loops that emerge. Similarly, it helps policymakers and programme managers at central level to better anticipate the potential results and side effects of, and required conditions for, health policies and programmes, and to steer their implementation accordingly. PMID:24742181
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
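The DICE environment itself is not reproduced here; as a minimal stand-in for spreading a scientific kernel across cooperating workers, the sketch below splits a matrix-vector product over a worker pool. A real cluster would distribute rows to workstations over the local area network rather than to local threads, but the decomposition pattern is the same.

```python
# Minimal sketch of distributing a linear-algebra kernel over workers.
# Threads stand in for the cooperating workstations of a DICE-like cluster.
from concurrent.futures import ThreadPoolExecutor

def row_dot(args):
    """One worker's task: the dot product of a single matrix row with x."""
    row, x = args
    return sum(a * b for a, b in zip(row, x))

def parallel_matvec(A, x, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(row_dot, ((row, x) for row in A)))

A = [[2, 0], [0, 3], [1, 1]]
x = [4, 5]
print(parallel_matvec(A, x))  # → [8, 15, 9]
```

Row-wise decomposition is the kind of task splitting that lets a cluster of otherwise idle workstations act as a single larger machine for simultaneous-equation solvers like those investigated in the project.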
Current trends in protein crystallization.
Gavira, José A
2016-07-15
Proteins belong to the most complex colloidal systems in terms of their physicochemical properties, size and conformational flexibility. This complexity contributes to their great sensitivity to any external change and dictates the uncertainty of crystallization. The need for 3D models to understand their functionality and interaction mechanisms with neighbouring (macro)molecules has driven the tremendous effort put into the field of crystallography, which has also permeated other fields trying to shed some light on reluctant-to-crystallize proteins. This review is aimed at revisiting protein crystallization from a regular-laboratory point of view. It is also devoted to highlighting the latest developments and achievements to produce, identify and deliver high-quality protein crystals for XFEL, Micro-ED or neutron diffraction. The low likelihood of protein crystallization is rationalized by considering the intrinsic nature of the polypeptide (folded state, surface charge, etc.), followed by a description of the standard crystallization methods (batch, vapour diffusion and counter-diffusion), including high-throughput advances. Other methodologies aimed at determining protein features in solution (NMR, SAS, DLS) or at gathering structural information from single particles, such as Cryo-EM, are also discussed. Finally, current approaches showing the convergence of different structural biology techniques, and the adaptation of methodologies across fields to tackle the most difficult problems, are presented. Current advances in the crystallization of biomacromolecules, from nanocrystals for XFEL and Micro-ED to large crystals for neutron diffraction, are covered with special emphasis on methodologies applicable at laboratory scale. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.
2015-12-01
The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, have led scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data is extremely voluminous and, to be useful, must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machine (SVM) to predict the air quality one day in advance. In order to overcome the computational requirements for large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and the Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is well suited to tackling large-scale air pollution prediction problems.
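As an illustration of the MapReduce programming model the abstract adopts, here is a minimal pure-Python sketch applied to a toy preprocessing step: averaging pollutant readings per station. On Hadoop the map and reduce functions would run on many nodes; the station names and readings below are invented, and SVM training is omitted.

```python
# Pure-Python sketch of the map -> shuffle -> reduce pattern; not the
# paper's system, just the programming model it relies on.
from collections import defaultdict

def map_phase(record):
    station, reading = record
    yield station, reading          # emit (key, value) pairs

def shuffle(mapped):
    groups = defaultdict(list)      # group values by key, as Hadoop does
    for key, value in mapped:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values) / len(values)   # per-station mean reading

records = [("Azadi", 80.0), ("Azadi", 100.0), ("Tajrish", 60.0)]
mapped = [kv for rec in records for kv in map_phase(rec)]
result = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
# result == {"Azadi": 90.0, "Tajrish": 60.0}
```

The same three-stage shape scales to the voluminous sensor data the paper targets because map and reduce calls are independent and can be distributed freely.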
Modeling and simulation of multi-physics multi-scale transport phenomenain bio-medical applications
NASA Astrophysics Data System (ADS)
Kenjereš, Saša
2014-08-01
We present a short overview of some of our most recent work that combines the mathematical modeling, advanced computer simulations and state-of-the-art experimental techniques of physical transport phenomena in various bio-medical applications. In the first example, we tackle predictions of complex blood flow patterns in the patient-specific vascular system (carotid artery bifurcation) and transfer of the so-called "bad" cholesterol (low-density lipoprotein, LDL) within the multi-layered artery wall. This two-way coupling between the blood flow and corresponding mass transfer of LDL within the artery wall is essential for predictions of regions where atherosclerosis can develop. It is demonstrated that a recently developed mathematical model, which takes into account the complex multi-layer arterial-wall structure, produced LDL profiles within the artery wall in good agreement with in-vivo experiments in rabbits, and it can be used for predictions of locations where the initial stage of development of atherosclerosis may take place. The second example includes a combination of pulsating blood flow and medical drug delivery and deposition controlled by external magnetic field gradients in the patient specific carotid artery bifurcation. The results of numerical simulations are compared with own PIV (Particle Image Velocimetry) and MRI (Magnetic Resonance Imaging) in the PDMS (silicon-based organic polymer) phantom. A very good agreement between simulations and experiments is obtained for different stages of the pulsating cycle. Application of the magnetic drug targeting resulted in an increase of up to ten fold in the efficiency of local deposition of the medical drug at desired locations. Finally, the LES (Large Eddy Simulation) of the aerosol distribution within the human respiratory system that includes up to eight bronchial generations is performed. A very good agreement between simulations and MRV (Magnetic Resonance Velocimetry) measurements is obtained. 
Magnetic steering of aerosols towards the left or right part of lungs proved to be possible, which can open new strategies for medical treatment of respiratory diseases.
Efficient parallel simulation of CO2 geologic sequestration in saline aquifers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Keni; Doughty, Christine; Wu, Yu-Shu
2007-01-01
An efficient parallel simulator for large-scale, long-term CO2 geologic sequestration in saline aquifers has been developed. The parallel simulator is a three-dimensional, fully implicit model that solves large, sparse linear systems arising from discretization of the partial differential equations for mass and energy balance in porous and fractured media. The simulator is based on the ECO2N module of the TOUGH2 code and inherits all the process capabilities of the single-CPU TOUGH2 code, including a comprehensive description of the thermodynamics and thermophysical properties of H2O-NaCl-CO2 mixtures, modeling single- and/or two-phase isothermal or non-isothermal flow processes, two-phase mixtures, fluid phases appearing or disappearing, as well as salt precipitation or dissolution. The new parallel simulator uses MPI for parallel implementation, the METIS software package for simulation domain partitioning, and the iterative parallel linear solver package Aztec for solving linear equations by multiple processors. In addition, the parallel simulator has been implemented with an efficient communication scheme. Test examples show that a linear or super-linear speedup can be obtained on Linux clusters as well as on supercomputers. Because of the significant improvement in both simulation time and memory requirement, the new simulator provides a powerful tool for tackling larger scale and more complex problems than can be solved by single-CPU codes. A high-resolution simulation example is presented that models buoyant convection, induced by a small increase in brine density caused by dissolution of CO2.
Chica, Manuel
2012-11-01
A novel method for authenticating pollen grains in bright-field microscopic images is presented in this work. The usefulness of this new method is clear in many application fields, such as the bee-keeping sector, where laboratory experts need to identify fraudulent bee pollen samples against local known pollen types. Our system is based on image processing and one-class classification to reject unknown pollen grain objects. The latter classification technique allows us to tackle the major difficulty of the problem: the existence of many possible fraudulent pollen types and the impossibility of modeling all of them. Different one-class classification paradigms are compared to identify the most suitable technique for solving the problem. In addition, feature selection algorithms are applied to reduce the complexity and increase the accuracy of the models. For each local pollen type, a one-class classifier is trained and aggregated into a multiclassifier model. This multiclassification scheme combines the output of all the one-class classifiers into a unique final response. The proposed method is validated by authenticating pollen grains belonging to different Spanish bee pollen types. The overall accuracy of the system in classifying fraudulent microscopic pollen grain objects is 92.3%. The system is able to rapidly reject pollen grains that belong to nonlocal pollen types, reducing laboratory work and effort. The number of possible applications of this authentication method in the microscopy research field is unlimited. Copyright © 2012 Wiley Periodicals, Inc.
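The aggregation idea in the abstract can be sketched in a few lines: one one-class classifier per local pollen type, combined so that a grain accepted by no classifier is flagged as non-local. The centroid-plus-threshold classifier below is a stand-in for illustration only; the paper compares much richer one-class paradigms, and the feature vectors are invented.

```python
# Hypothetical sketch of a one-class multiclassifier: each local pollen
# type gets its own acceptance model, and rejection by all models marks
# a grain as non-local (possibly fraudulent).

def train_one_class(samples, slack=1.5):
    dim = len(samples[0])
    centroid = [sum(s[i] for s in samples) / len(samples) for i in range(dim)]
    dist = lambda p: sum((p[i] - centroid[i]) ** 2 for i in range(dim)) ** 0.5
    threshold = slack * max(dist(s) for s in samples)
    return lambda p: (dist(p) <= threshold, dist(p))

def classify(grain, classifiers):
    accepted = [(name, d) for name, clf in classifiers.items()
                for ok, d in [clf(grain)] if ok]
    if not accepted:
        return "non-local"               # rejected by every one-class model
    return min(accepted, key=lambda t: t[1])[0]   # closest accepting type

classifiers = {
    "type_A": train_one_class([(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]),
    "type_B": train_one_class([(5.0, 5.0), (5.1, 4.8), (4.9, 5.2)]),
}
```

A grain near the type_A training cluster is assigned to type_A, while a grain far from every cluster is rejected outright, which mirrors how the system avoids having to model every possible fraudulent type.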
Ambiguity: A new way of thinking about responses to climate change.
Fleming, A; Howden, S M
2016-11-15
Diversity, interdisciplinarity and transdisciplinarity are now recognized as vital to tackling wicked problems such as those presented by a changing climate (Nature editorial, 2015; Ledford, 2015; Dick et al., 2016). Including diverse disciplines in science projects enables a range of different views, which often facilitates the creation of innovative solutions. Supporting multiple views and options requires a different way of working beyond traditional reductionist approaches to science, communication and decision-making. To embrace diversity in scientific project teams in order to tackle complex, integrated and urgent issues, yet to expect singular and linear pathways forward, is paradoxical. Much has been written about the need for the scientific community to embrace uncertainty (e.g. Popper, Lempert & Bankes, 2005; Lempert et al., 2004; Nelson, Howden & Hayman, 2013; Bammer & Smithson, 2008). We argue that this in itself will not suffice, and that there is also a need to embrace ambiguity in certain situations. Thus, in this article we explore: (1) what ambiguity is, including the benefits it can offer to climate adaptation in particular, using existing approaches to ambiguity in the arts and humanities as examples; (2) practical meanings of ambiguity in relation to climate change; (3) possible next steps for bringing ambiguity into interdisciplinary practice; and (4) some challenges and necessary preconditions to successfully and appropriately embracing ambiguity. Copyright © 2016 Elsevier B.V. All rights reserved.
Perspective: The language of leadership.
Souba, Chip
2010-10-01
Human meaning is not given before language in and by some detached, prelinguistic domain and then labeled with words. Rather, language itself, always already ardently at play in our lives, is constitutive of the realities of our experience, opening up to us a uniquely human world. Language is the bridge between the created present and the uncreated future, affording leaders of medical schools an underused opportunity to transform academic medicine. In creating and exchanging meaning, good leaders translate ambiguity into clear messages that convey the rationale for change and enroll others in a compelling strategy that fosters alignment and commitment. Because language influences our thinking and emotions, it is most powerful and effective for tackling challenges that rely heavily on conceptual, innovative solutions as opposed to those problems whose solutions are simple and technical in nature. However, many leaders in academic medicine spend much of their time in the domain of content, where issues are understandable, strategies are familiar, and solutions are seemingly apparent. Complex problems cannot be tackled by solely addressing content; the issue in question must be situated within an appropriate conversational context to provide a basis for action. Leaders do this by creating linguistic distinctions that prompt cognitive shifts in others, jarring them loose from their entrenched worldviews. This property of language--its ability to bring forth, out of the unspoken realm, innovative ideas and possibilities--will determine the future of our health care system and our world.
Development of modelling algorithm of technological systems by statistical tests
NASA Astrophysics Data System (ADS)
Shemshura, E. A.; Otrokov, A. V.; Chernyh, V. G.
2018-03-01
The paper tackles the problem of economic assessment of design efficiency for various technological systems at the stage of their operation. A modelling algorithm for a technological system, built on statistical tests and taking account of the reliability index, allows estimating the level of machinery technical excellence and evaluating design reliability against performance. The economic feasibility of its application is determined on the basis of the service quality of a technological system, with further forecasting of the volumes and range of spare parts supply.
Using Web 2.0 Techniques To Bring Global Climate Modeling To More Users
NASA Astrophysics Data System (ADS)
Chandler, M. A.; Sohl, L. E.; Tortorici, S.
2012-12-01
The Educational Global Climate Model has been used for many years in undergraduate courses and professional development settings to teach the fundamentals of global climate modeling and climate change simulation to students and teachers. While course participants have reported a high level of satisfaction in these courses and overwhelmingly claim that EdGCM projects are worth the effort, there is often a high level of frustration during the initial learning stages. Many of the problems stem from issues related to installation of the software suite and to the length of time it can take to run initial experiments. Two or more days of continuous run time may be required before enough data has been gathered to begin analyses. Asking users to download existing simulation data has not been a solution because the GCM data sets are several gigabytes in size, requiring substantial bandwidth and stable dedicated internet connections. As a means of getting around these problems we have been developing a Web 2.0 utility called EzGCM (Easy GCM) which emphasizes that participants learn the steps involved in climate modeling research: constructing a hypothesis, designing an experiment, running a computer model and assessing when an experiment has finished (reached equilibrium), using scientific visualization to support analysis, and finally communicating the results through social networking methods. We use classic climate experiments that can be "rediscovered" through exercises with EzGCM and are attempting to make this Web 2.0 tool an entry point into climate modeling for teachers with little time to cover the subject, users with limited computer skills, and those who want an introduction to the process before tackling more complex projects with EdGCM.
Harnessing dendritic cells in inflammatory skin diseases
Chu, Chung-Ching; Di Meglio, Paola; Nestle, Frank O.
2011-01-01
The skin immune system harbors a complex network of dendritic cells (DCs). Recent studies highlight a diverse functional specialization of skin DC subsets. In addition to generating cellular and humoral immunity against pathogens, skin DCs are involved in tolerogenic mechanisms to ensure the maintenance of immune homeostasis, as well as in the pathogenesis of chronic inflammation in the skin when excessive immune responses are initiated and unrestrained. Harnessing DCs by directly targeting DC-derived molecules or selectively modulating DC subsets is a promising strategy to tackle inflammatory skin diseases. In this review we discuss recent advances underscoring the functional specialization of skin DCs and consider the potential implications for future DC-based therapeutic strategies. PMID:21295490
Tracking Activities in Complex Settings Using Smart Environment Technologies.
Singla, Geetika; Cook, Diane J; Schmitter-Edgecombe, Maureen
2009-01-01
The pervasive sensing technologies found in smart homes offer unprecedented opportunities for providing health monitoring and assistance to individuals experiencing difficulties living independently at home. A primary challenge in meeting this need is the ability to recognize and track the functional activities that people perform in their own homes and everyday settings. In this paper we look at approaches to perform real-time recognition of Activities of Daily Living. We enhance other related research efforts to develop approaches that are effective when activities are interrupted and interleaved. To evaluate the accuracy of our recognition algorithms, we assess them using real data collected from participants performing activities in our on-campus smart apartment testbed.
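One common baseline for the recognition task described above is a naive Bayes classifier over bags of sensor events in a sliding window. The sketch below is a hedged illustration under that assumption; the paper's own models may differ, and the sensor and activity names are invented.

```python
# Hypothetical sketch: naive Bayes activity recognition over windows of
# discrete sensor events, with Laplace smoothing for unseen events.
import math
from collections import Counter, defaultdict

def train(windows):  # windows: list of (sensor_event_list, activity_label)
    counts, priors = defaultdict(Counter), Counter()
    for events, activity in windows:
        priors[activity] += 1
        counts[activity].update(events)
    return counts, priors

def predict(events, model, vocab_size):
    counts, priors = model
    total = sum(priors.values())
    def score(act):
        n = sum(counts[act].values())
        s = math.log(priors[act] / total)        # log prior
        for e in events:                          # smoothed log likelihoods
            s += math.log((counts[act][e] + 1) / (n + vocab_size))
        return s
    return max(priors, key=score)

train_data = [
    (["kettle_on", "cup_cabinet", "kettle_off"], "make_tea"),
    (["kettle_on", "cup_cabinet"], "make_tea"),
    (["front_door", "hall_motion"], "leave_home"),
    (["hall_motion", "front_door"], "leave_home"),
]
model = train(train_data)
label = predict(["kettle_on", "cup_cabinet"], model, vocab_size=5)
```

Because the model treats a window as a bag of events, it degrades gracefully when activities are interleaved, which is one reason such baselines are a common starting point in this literature.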
Streamlined Genome Sequence Compression using Distributed Source Coding
Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel
2014-01-01
We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity needs of the client side. Based on the variation between source and reference, our protocol adaptively picks either syndrome coding or hash coding to compress subsequences of varying code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
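The adaptive mode choice the abstract describes can be illustrated with a deliberately simplified toy. This is not the paper's codec: real syndrome coding uses channel-code syndromes decoded against side information, and the constants below are invented. The sketch only shows the decision rule: send a cheap substitution list when a block differs little from the reference, and a short hash (leaving the decoder to search its candidates) when it differs a lot.

```python
# Toy illustration of adaptive per-block encoding against a reference.
import hashlib

def encode_block(src, ref, max_subs=2):
    # Compare the source block to the aligned reference block.
    diffs = [(i, c) for i, (c, r) in enumerate(zip(src, ref)) if c != r]
    if len(diffs) <= max_subs:
        return ("subs", diffs)      # low variation: substitution list
    return ("hash", hashlib.sha1(src.encode()).hexdigest()[:8])

def decode_block(code, ref, candidates):
    kind, payload = code
    if kind == "subs":
        out = list(ref)
        for i, c in payload:        # patch the reference
            out[i] = c
        return "".join(out)
    for cand in candidates:         # hash mode: decoder-side search
        if hashlib.sha1(cand.encode()).hexdigest()[:8] == payload:
            return cand
    return None
```

In both modes the encoder does almost no work, which is the point of pushing complexity to the decoder side for miniaturized sequencing devices.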
Elucidating Reaction Mechanisms on Quantum Computers
NASA Astrophysics Data System (ADS)
Wiebe, Nathan; Reiher, Markus; Svore, Krysta; Wecker, Dave; Troyer, Matthias
We show how a quantum computer can be employed to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical-computer simulations for such problems, to significantly increase their accuracy and enable hitherto intractable simulations. Detailed resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. This demonstrates that quantum computers will realistically be able to tackle important problems in chemistry that are both scientifically and economically significant.
A Thick-Restart Lanczos Algorithm with Polynomial Filtering for Hermitian Eigenvalue Problems
Li, Ruipeng; Xi, Yuanzhe; Vecharynski, Eugene; ...
2016-08-16
Polynomial filtering can provide a highly effective means of computing all eigenvalues of a real symmetric (or complex Hermitian) matrix that are located in a given interval, anywhere in the spectrum. This paper describes a technique for tackling this problem by combining a thick-restart version of the Lanczos algorithm with deflation ("locking") and a new type of polynomial filter obtained from a least-squares technique. Furthermore, the resulting algorithm can be utilized in a "spectrum-slicing" approach whereby a very large number of eigenvalues and associated eigenvectors of the matrix are computed by extracting eigenpairs located in different subintervals independently from one another.
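The core idea of polynomial filtering can be sketched numerically. The paper derives its filter from a least-squares technique; the hedged stand-in below uses a plain Chebyshev expansion of the indicator function of the target interval [a, b], assuming the spectrum has already been scaled into [-1, 1]. Applying the filter damps eigencomponents outside [a, b], so a simple Rayleigh quotient then recovers the eigenvalue inside.

```python
# Sketch (assumed details, not the paper's least-squares filter):
# Chebyshev approximation of the step function on [a, b], applied to a
# vector via the three-term recurrence T_{k+1}(A)v = 2A T_k(A)v - T_{k-1}(A)v.
import math

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

def cheb_step_coeffs(a, b, degree):
    ta, tb = math.acos(a), math.acos(b)   # acos is decreasing, so tb < ta
    coeffs = [(ta - tb) / math.pi]
    for k in range(1, degree + 1):
        coeffs.append(2.0 * (math.sin(k * ta) - math.sin(k * tb))
                      / (k * math.pi))
    return coeffs

def apply_filter(A, v, coeffs):
    w_prev, w = v, matvec(A, v)           # T_0(A)v and T_1(A)v
    out = [coeffs[0] * p + coeffs[1] * q for p, q in zip(w_prev, w)]
    for c in coeffs[2:]:
        w_prev, w = w, [2 * t - p for t, p in zip(matvec(A, w), w_prev)]
        out = [o + c * t for o, t in zip(out, w)]
    return out

# Diagonal test matrix with eigenvalues -0.8, 0.1, 0.9; target interval [0, 0.3].
A = [[-0.8, 0, 0], [0, 0.1, 0], [0, 0, 0.9]]
coeffs = cheb_step_coeffs(0.0, 0.3, 40)
v = [1.0, 1.0, 1.0]
for _ in range(3):                        # repeated filtering suppresses
    v = apply_filter(A, v, coeffs)        # eigencomponents outside [a, b]
rayleigh = (sum(x * y for x, y in zip(v, matvec(A, v)))
            / sum(x * x for x in v))      # converges to the eigenvalue in [a, b]
```

In the full algorithm the filtered operator p(A) replaces A inside a thick-restart Lanczos iteration, so that the Krylov subspace is built almost entirely from eigenvectors of the target slice.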