Sample records for improved complex variable

  1. Variable Complexity Optimization of Composite Structures

    NASA Technical Reports Server (NTRS)

    Haftka, Raphael T.

    2002-01-01

    The use of several levels of modeling in design has been dubbed variable complexity modeling. The work under the grant focused on developing variable complexity modeling strategies with emphasis on response surface techniques. Applications included design of stiffened composite plates for improved damage tolerance, the use of response surfaces for fitting weights obtained by structural optimization, and design against uncertainty using response surface techniques.
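
    As a rough illustration of the response-surface idea underlying variable complexity modeling, the sketch below fits a cheap quadratic surrogate to a handful of expensive structural-weight evaluations. The two design variables and the weight function are invented for illustration and are not from the grant work.

    ```python
    # Minimal response-surface sketch: fit a quadratic surface to a handful of
    # "expensive" structural-optimization results, then query it cheaply.
    # The design variables and weight function below are synthetic placeholders.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.uniform([1.0, 0.1], [3.0, 0.5], size=(30, 2))  # (skin thickness, stiffener area)
    w = 5.0 * X[:, 0] + 40.0 * X[:, 1] ** 2 + rng.normal(0, 0.1, 30)  # surrogate "weights"

    surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    surface.fit(X, w)

    # The cheap approximation replaces the expensive analysis inside an optimizer loop.
    print(surface.predict([[2.0, 0.3]]))
    ```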

  2. Statistical Assessment of Variability of Terminal Restriction Fragment Length Polymorphism Analysis Applied to Complex Microbial Communities

    PubMed Central

    Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof

    2009-01-01

The variability of terminal restriction fragment length polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure that provided a high level of reproducibility. PMID:19749066

  3. Variable sensory perception in autism.

    PubMed

    Haigh, Sarah M

    2018-03-01

Autism is associated with sensory and cognitive abnormalities. Individuals with autism generally show normal or superior early sensory processing abilities compared to healthy controls, but deficits in complex sensory processing. In the current opinion paper, it will be argued that sensory abnormalities impact cognition by limiting the amount of signal that can be used to interpret and interact with the environment. There is a growing body of literature showing that individuals with autism exhibit greater trial-to-trial variability in behavioural and cortical sensory responses. If multiple sensory signals that are highly variable are added together to process more complex sensory stimuli, then this might destabilise later perception and impair cognition. Methods to improve sensory processing have been shown to improve more general cognition. Studies that specifically investigate differences in sensory trial-to-trial variability in autism, and the potential changes in variability before and after treatment, could ascertain whether trial-to-trial variability is a good mechanism to target for treatment in autism.

  4. The trajectory of life. Decreasing physiological network complexity through changing fractal patterns

    PubMed Central

    Sturmberg, Joachim P.; Bennett, Jeanette M.; Picard, Martin; Seely, Andrew J. E.

    2015-01-01

In this position paper, we submit a synthesis of theoretical models based on physiology, non-equilibrium thermodynamics, and non-linear time-series analysis. Based on an understanding of the human organism as a system of interconnected complex adaptive systems, we seek to examine the relationship between health, complexity, variability, and entropy production, as it may be useful in understanding aging and improving care for patients. We observe that the trajectory of life is characterized by the growth, plateauing, and subsequent loss of adaptive function of organ systems, associated with loss of functioning and coordination between systems. Understanding development and aging requires the examination of interdependence among these organ systems. Increasing evidence suggests that network interconnectedness and complexity can be captured and measured via the degree and complexity of healthy biologic rhythm variability (e.g., heart and respiratory rate variability). We review physiological mechanisms linking the omics, arousal/stress systems, immune function, and mitochondrial bioenergetics, highlighting their interdependence in normal physiological function and aging. We argue that aging, known to be characterized by a loss of variability, is manifested at multiple scales: within functional units at the small scale, and reflected by diagnostic features at the larger scale. While still controversial and under investigation, it appears conceivable that the integrity of whole-body complexity may be, at least partially, reflected in the degree and variability of intrinsic biologic rhythms, which we believe are related to overall system complexity that may be a defining feature of health and its loss through aging. Harnessing this information for the development of therapeutic and preventative strategies may hold an opportunity to significantly improve the health of our patients across the trajectory of life. PMID:26082722

  5. An Improved Estimation Using Polya-Gamma Augmentation for Bayesian Structural Equation Models with Dichotomous Variables

    ERIC Educational Resources Information Center

    Kim, Seohyun; Lu, Zhenqiu; Cohen, Allan S.

    2018-01-01

Bayesian algorithms have been used successfully in the social and behavioral sciences to analyze dichotomous data, particularly with complex structural equation models. In this study, we investigate the use of the Polya-Gamma data augmentation method with Gibbs sampling to improve estimation of structural equation models with dichotomous variables.…
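
    The record describes Polya-Gamma augmentation for structural equation models; the hedged sketch below shows the same augmentation scheme in its simplest setting, Bayesian logistic regression (Polson-Scott-Windle style). The PG(1, c) draws use a crude truncated-series approximation; exact samplers (e.g., the `polyagamma` package) are preferable in practice, and the data are synthetic.

    ```python
    # Sketch of Polya-Gamma-augmented Gibbs sampling, shown for plain Bayesian
    # logistic regression rather than the paper's structural equation model.
    import numpy as np

    rng = np.random.default_rng(1)

    def sample_pg1(c, n_terms=200):
        """Approximate PG(1, c) draw via its infinite-sum-of-gammas representation."""
        k = np.arange(1, n_terms + 1)
        g = rng.standard_exponential(n_terms)          # Gamma(1, 1) variates
        return 0.5 / np.pi**2 * np.sum(g / ((k - 0.5) ** 2 + c**2 / (4 * np.pi**2)))

    def gibbs_logistic(X, y, n_iter=1000, prior_var=10.0):
        n, p = X.shape
        beta = np.zeros(p)
        B0_inv = np.eye(p) / prior_var                 # N(0, prior_var * I) prior
        kappa = y - 0.5
        draws = np.empty((n_iter, p))
        for t in range(n_iter):
            omega = np.array([sample_pg1(xi @ beta) for xi in X])  # PG(1, x_i' beta)
            V = np.linalg.inv(X.T @ (X * omega[:, None]) + B0_inv)
            m = V @ (X.T @ kappa)
            beta = rng.multivariate_normal(m, V)       # conditionally Gaussian update
            draws[t] = beta
        return draws

    # Tiny synthetic check: recover a known coefficient vector (~[0.5, 1.5]).
    X = np.c_[np.ones(200), rng.normal(size=200)]
    y = (rng.random(200) < 1 / (1 + np.exp(-(0.5 + 1.5 * X[:, 1])))).astype(float)
    print(gibbs_logistic(X, y, 500)[250:].mean(axis=0))
    ```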

  6. [Variable selection methods combined with local linear embedding theory used for optimization of near infrared spectral quantitative models].

    PubMed

    Hao, Yong; Sun, Xu-Dong; Yang, Qiang

    2012-12-01

A variable selection strategy combined with local linear embedding (LLE) was introduced for the analysis of complex samples by near-infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE combined with SPA, were used for eliminating redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling the complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of the models. Compared with PLSR models, LLE-PLSR models achieved more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
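
    A minimal sketch of the MCUVE screening idea: variables whose PLS coefficients are unstable across random subsamples are treated as uninformative and dropped. The synthetic spectra and the median cutoff are assumptions for illustration only.

    ```python
    # Monte Carlo uninformative variable elimination (MCUVE), sketched: fit PLS
    # on many random subsamples and keep variables with stable coefficients.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    X = rng.normal(size=(100, 50))                    # toy "spectra": 50 wavelengths
    y = X[:, :5] @ np.array([2.0, -1.0, 1.5, 0.5, -2.0]) + rng.normal(0, 0.1, 100)

    n_runs, frac = 500, 0.8
    coefs = np.empty((n_runs, X.shape[1]))
    for i in range(n_runs):
        idx = rng.choice(len(X), int(frac * len(X)), replace=False)
        pls = PLSRegression(n_components=5).fit(X[idx], y[idx])
        coefs[i] = pls.coef_.ravel()

    stability = np.abs(coefs.mean(axis=0)) / coefs.std(axis=0)  # mean/SD reliability
    keep = stability > np.median(stability)            # illustrative cutoff
    print("retained variables:", np.flatnonzero(keep))
    ```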

  7. What School Boards Can Do to Improve Teacher Competency.

    ERIC Educational Resources Information Center

    Karagan, Nicholas J.

    The school board's role in improving teacher competency involves avoiding incompetent teachers, improving competent ones, and maintaining highly competent ones. Because teacher competence is a complex social phenomenon, affected by many different variables, boards should keep in mind that actions to improve competency may not be preferred by…

  8. Weighting Test Samples in IRT Linking and Equating: Toward an Improved Sampling Design for Complex Equating. Research Report. ETS RR-13-39

    ERIC Educational Resources Information Center

    Qian, Jiahe; Jiang, Yanming; von Davier, Alina A.

    2013-01-01

    Several factors could cause variability in item response theory (IRT) linking and equating procedures, such as the variability across examinee samples and/or test items, seasonality, regional differences, native language diversity, gender, and other demographic variables. Hence, the following question arises: Is it possible to select optimal…

  9. Transcriptome sequencing of diverse peanut (Arachis) wild species and the cultivated species reveals a wealth of untapped genetic variability

    USDA-ARS's Scientific Manuscript database

    Next generation sequencing technologies and improved bioinformatics methods have provided opportunities to study sequence variability in complex polyploid transcriptomes. In this study, we used a diverse panel of twenty-two Arachis accessions representing seven Arachis hypogaea market classes, A-, B...

  10. Complex systems and the technology of variability analysis

    PubMed Central

    Seely, Andrew JE; Macklem, Peter T

    2004-01-01

    Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients. PMID:15566580
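
    Of the techniques the review catalogues, detrended fluctuation analysis is compact enough to sketch; the toy implementation below estimates the scaling exponent alpha from the log-log slope of fluctuation versus window size. First-order detrending and the window sizes are illustrative choices.

    ```python
    # Compact detrended fluctuation analysis (DFA) sketch. alpha ~ 0.5 for
    # white noise and ~ 1.0 for 1/f-like (fractal) signals.
    import numpy as np

    def dfa(x, scales=(4, 8, 16, 32, 64)):
        y = np.cumsum(x - np.mean(x))                 # integrated profile
        F = []
        for n in scales:
            n_win = len(y) // n
            msq = []
            for i in range(n_win):
                seg = y[i * n:(i + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
                msq.append(np.mean((seg - trend) ** 2))
            F.append(np.sqrt(np.mean(msq)))           # fluctuation at scale n
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    print(dfa(np.random.default_rng(3).normal(size=4096)))  # expect ~0.5
    ```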

  11. SPATIAL VARIABILITY IN POLLUTANTS: IMPLICATIONS FOR EXPOSURE ASSESSMENT

    EPA Science Inventory

Efforts to evaluate the value of improved exposure metrics for relating those metrics to outcomes in complex systems have met with varying degrees of success. This work describes the results of recent efforts, mostly involving air pollutants, to improve the sop...

  12. Variable speed limit strategies analysis with mesoscopic traffic flow model based on complex networks

    NASA Astrophysics Data System (ADS)

    Li, Shu-Bin; Cao, Dan-Ni; Dang, Wen-Xiu; Zhang, Lin

As a new cross-discipline, complexity science has penetrated every field of the economy and society. With the arrival of big data, research in complexity science has reached a new summit. In recent years, it has offered a new perspective on traffic control through complex networks theory. The interaction of various kinds of information in a traffic system forms a huge complex system. A new mesoscopic traffic flow model is improved with variable speed limits (VSL), and the simulation process is designed based on complex networks theory combined with the proposed model. This paper studies the effect of VSL on dynamic traffic flow and then analyzes the optimal VSL control strategy in different network topologies. The conclusions of this research are useful for putting forward reasonable transportation plans and developing effective traffic management and control measures.

  13. Inversion of the anomalous diffraction approximation for variable complex index of refraction near unity. [numerical tests for water-haze aerosol model

    NASA Technical Reports Server (NTRS)

    Smith, C. B.

    1982-01-01

The Fymat analytic inversion method for retrieving a particle-area distribution function from anomalous diffraction multispectral extinction data and total area is generalized to the case of a variable complex refractive index m(lambda) near unity depending on spectral wavelength lambda. Inversion tests are presented for a water-haze aerosol model. An upper phase-shift limit of 5π/2 retrieved an accurate peak area distribution profile. Analytical corrections using both the total number and area improved the inversion.

  14. Understanding thermal circulations and near-surface turbulence processes in a small mountain valley

    NASA Astrophysics Data System (ADS)

    Pardyjak, E.; Dupuy, F.; Durand, P.; Gunawardena, N.; Thierry, H.; Roubin, P.

    2017-12-01

The interaction of turbulence and thermal circulations in complex terrain can be significantly different from that over idealized flat terrain. In particular, near-surface horizontal spatial and temporal variability of winds and thermodynamic variables can be significant even over very small spatial scales. The KASCADE (KAtabatic winds and Stability over CAdarache for Dispersion of Effluents) 2017 campaign, conducted from January through March 2017, was designed to address these issues and ultimately to improve prediction of dispersion in complex terrain, particularly during stable atmospheric conditions. We used a relatively large number of sensors to improve our understanding of the spatial and temporal development, evolution, and breakdown of topographically driven flows. KASCADE 2017 consisted of continuous observations and fourteen Intensive Observation Periods (IOPs) conducted in the Cadarache Valley in southeastern France. The Cadarache Valley is a relatively small valley (5 km x 1 km) with modest slopes and relatively small elevation differences between the valley floor and nearby hilltops (~100 m). During winter, winds in the valley are light and stably stratified at night, leading to thermal circulations as well as complex near-surface atmospheric layering. In this presentation, we present results quantifying the spatial variability of thermodynamic and turbulence variables as a function of different large-scale forcing conditions (e.g., quiescent conditions, strong westerly flow, and Mistral flow). In addition, we attempt to characterize highly regular nocturnal horizontal wind meandering and the associated turbulence statistics.

  15. Independent variable complexity for regional regression of the flow duration curve in ungauged basins

    NASA Astrophysics Data System (ADS)

    Fouad, Geoffrey; Skupin, André; Hope, Allen

    2016-04-01

    The flow duration curve (FDC) is one of the most widely used tools to quantify streamflow. Its percentile flows are often required for water resource applications, but these values must be predicted for ungauged basins with insufficient or no streamflow data. Regional regression is a commonly used approach for predicting percentile flows that involves identifying hydrologic regions and calibrating regression models to each region. The independent variables used to describe the physiographic and climatic setting of the basins are a critical component of regional regression, yet few studies have investigated their effect on resulting predictions. In this study, the complexity of the independent variables needed for regional regression is investigated. Different levels of variable complexity are applied for a regional regression consisting of 918 basins in the US. Both the hydrologic regions and regression models are determined according to the different sets of variables, and the accuracy of resulting predictions is assessed. The different sets of variables include (1) a simple set of three variables strongly tied to the FDC (mean annual precipitation, potential evapotranspiration, and baseflow index), (2) a traditional set of variables describing the average physiographic and climatic conditions of the basins, and (3) a more complex set of variables extending the traditional variables to include statistics describing the distribution of physiographic data and temporal components of climatic data. The latter set of variables is not typically used in regional regression, and is evaluated for its potential to predict percentile flows. The simplest set of only three variables performed similarly to the other more complex sets of variables. Traditional variables used to describe climate, topography, and soil offered little more to the predictions, and the experimental set of variables describing the distribution of basin data in more detail did not improve predictions. These results are largely reflective of cross-correlation existing in hydrologic datasets, and highlight the limited predictive power of many traditionally used variables for regional regression. A parsimonious approach including fewer variables chosen based on their connection to streamflow may be more efficient than a data mining approach including many different variables. Future regional regression studies may benefit from having a hydrologic rationale for including different variables and attempting to create new variables related to streamflow.
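
    A hedged sketch of the study's parsimonious option: a log-space regional regression of one percentile flow on the three variables named above. The power-law form and the synthetic basin data are illustrative assumptions, not the study's dataset.

    ```python
    # Parsimonious regional regression for one percentile flow (e.g. Q50) from
    # three basin descriptors, fitted in log space (power-law form).
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(4)
    n = 200
    MAP = rng.uniform(300, 2000, n)        # mean annual precipitation (mm)
    PET = rng.uniform(500, 1500, n)        # potential evapotranspiration (mm)
    BFI = rng.uniform(0.1, 0.9, n)         # baseflow index (-)
    Q50 = 1e-3 * MAP**1.4 * PET**-0.5 * BFI**0.8 * rng.lognormal(0, 0.2, n)

    X = np.log(np.c_[MAP, PET, BFI])
    model = LinearRegression().fit(X, np.log(Q50))
    print(model.coef_)                      # recovers roughly [1.4, -0.5, 0.8]
    ```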

  16. Effects of a hydrotherapy programme on symbolic and complexity dynamics of heart rate variability and aerobic capacity in fibromyalgia patients.

    PubMed

    Zamunér, Antonio Roberto; Andrade, Carolina P; Forti, Meire; Marchi, Andrea; Milan, Juliana; Avila, Mariana Arias; Catai, Aparecida Maria; Porta, Alberto; Silva, Ester

    2015-01-01

To evaluate the effects of a hydrotherapy programme on aerobic capacity and on linear and non-linear dynamics of heart rate variability (HRV) in women with fibromyalgia syndrome (FMS), 20 women with FMS and 20 healthy controls (HC) took part in the study. The FMS group was evaluated at baseline and after a 16-week hydrotherapy programme. All participants underwent cardiopulmonary exercise testing on a cycle ergometer and RR-interval recording in supine and standing positions. The HRV was analysed by linear and non-linear methods. The current level of pain, the tender points, the pressure pain threshold and the impact of FMS on quality of life were assessed. The FMS patients presented higher cardiac sympathetic modulation, lower vagal modulation and lower complexity of HRV in the supine position than the HC. Only the HC decreased the complexity indices of HRV during the orthostatic stimulus. After the 16-week hydrotherapy programme, the FMS patients increased aerobic capacity, decreased cardiac sympathetic modulation and increased vagal modulation and the complexity dynamics of HRV in the supine position. The FMS patients also improved their cardiac autonomic adjustments to the orthostatic stimulus. Associations between improvements in non-linear dynamics of HRV and improvements in pain and in the impact of FMS on quality of life were found. A 16-week hydrotherapy programme proved to be effective in ameliorating symptoms, aerobic functional capacity and cardiac autonomic control in FMS patients. Improvements in the non-linear dynamics of HRV were related to improvements in pain and in the impact of FMS on quality of life.

  17. Improving the Prediction of Mortality and the Need for Life-Saving Interventions in Trauma Patients Using Standard Vital Signs With Heart-Rate Variability and Complexity

    DTIC Science & Technology

    2015-06-01

Trauma 69:S10–S13, 2010. 2. Liu NT, Holcomb JB, Wade CE, Darrah MI, Salinas J: Utility of vital signs, heart-rate variability and complexity, and machine learning for identifying the need for life-saving interventions in trauma patients. Shock 42:108–114, 2014. 3. Pickering TG, Shimbo D, Hass D... Ann Emerg Med 45:68–76, 2005. 8. Liu NT, Holcomb JB, Wade CE, Batchinsky AI, Cancio LC, Darrah MI, Salinas J: Development and validation of a machine

  18. Floodplain complexity and surface metrics: influences of scale and geomorphology

    USGS Publications Warehouse

    Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.

    2015-01-01

Many studies of fluvial geomorphology and landscape ecology examine a single river or landscape and thus lack generality, making it difficult to develop a general understanding of the linkages between landscape patterns and larger-scale driving variables. We examined the spatial complexity of eight floodplain surfaces in widely different geographic settings and determined how patterns measured at different scales relate to different environmental drivers. Floodplain surface complexity is defined as having highly variable surface conditions that are also highly organised in space. These two components of floodplain surface complexity were measured across multiple sampling scales from LiDAR-derived DEMs. The surface character and variability of each floodplain were measured using four surface metrics; namely, standard deviation, skewness, coefficient of variation, and standard deviation of curvature from a series of moving window analyses ranging from 50 to 1000 m in radius. The spatial organisation of each floodplain surface was measured using spatial correlograms of the four surface metrics. Surface character, variability, and spatial organisation differed among the eight floodplains; and random, fragmented, highly patchy, and simple gradient spatial patterns were exhibited, depending upon the metric and window size. Differences in surface character and variability among the floodplains became statistically stronger with increasing sampling scale (window size), as did their associations with environmental variables. Sediment yield was consistently associated with differences in surface character and variability, as were flow discharge and variability at smaller sampling scales. Floodplain width was associated with differences in the spatial organisation of surface conditions at smaller sampling scales, while valley slope was weakly associated with differences in spatial organisation at larger scales. A comparison of floodplain landscape patterns measured at different scales would improve our understanding of the role that different environmental variables play at different scales and in different geomorphic settings.
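
    A small sketch of the moving-window surface metrics: local standard deviation of elevation computed with box filters over several window sizes. The synthetic DEM and window radii are placeholders for the LiDAR data.

    ```python
    # Moving-window local standard deviation on a DEM via two box filters:
    # var = E[z^2] - E[z]^2 within each window.
    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(5)
    z = rng.normal(size=(500, 500)).cumsum(axis=0).cumsum(axis=1)  # toy floodplain DEM

    def local_std(z, size):
        m = uniform_filter(z, size)                 # local mean
        m2 = uniform_filter(z * z, size)            # local mean of squares
        return np.sqrt(np.clip(m2 - m * m, 0, None))

    for size in (5, 21, 101):                       # analogous to 50-1000 m windows
        print(size, local_std(z, size).mean())
    ```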

  19. Improved resolution of major clades within Tuber and taxonomy of species within the Tuber gibbosum complex

    Treesearch

    Gregory M. Bonito; James M. Trappe; Pat Rawlinson; Rytas Vilgalys

    2010-01-01

    Tuber gibbosum Harkn., described from northern California, originally was thought to be a single, variable species that fruited from autumn through winter to spring. It has become popular as a culinary truffle in northwestern USA, where it is commercially harvested. Morphological studies suggested it might be a complex that includes at least two...

  20. Understanding patient safety performance and educational needs using the 'Safety-II' approach for complex systems.

    PubMed

    McNab, Duncan; Bowie, Paul; Morrison, Jill; Ross, Alastair

    2016-11-01

    Participation in projects to improve patient safety is a key component of general practice (GP) specialty training, appraisal and revalidation. Patient safety training priorities for GPs at all career stages are described in the Royal College of General Practitioners' curriculum. Current methods that are taught and employed to improve safety often use a 'find-and-fix' approach to identify components of a system (including humans) where performance could be improved. However, the complex interactions and inter-dependence between components in healthcare systems mean that cause and effect are not always linked in a predictable manner. The Safety-II approach has been proposed as a new way to understand how safety is achieved in complex systems that may improve quality and safety initiatives and enhance GP and trainee curriculum coverage. Safety-II aims to maximise the number of events with a successful outcome by exploring everyday work. Work-as-done often differs from work-as-imagined in protocols and guidelines and various ways to achieve success, dependent on work conditions, may be possible. Traditional approaches to improve the quality and safety of care often aim to constrain variability but understanding and managing variability may be a more beneficial approach. The application of a Safety-II approach to incident investigation, quality improvement projects, prospective analysis of risk in systems and performance indicators may offer improved insight into system performance leading to more effective change. The way forward may be to combine the Safety-II approach with 'traditional' methods to enhance patient safety training, outcomes and curriculum coverage.

  1. Variability in Cadence During Forced Cycling Predicts Motor Improvement in Individuals With Parkinson’s Disease

    PubMed Central

    Ridgel, Angela L.; Abdar, Hassan Mohammadi; Alberts, Jay L.; Discenzo, Fred M.; Loparo, Kenneth A.

    2014-01-01

Variability in severity and progression of Parkinson’s disease symptoms makes it challenging to design therapy interventions that provide maximal benefit. Previous studies showed that forced cycling, at greater pedaling rates, results in greater improvements in motor function than voluntary cycling. The precise mechanism for differences in function following exercise is unknown. We examined the complexity of biomechanical and physiological features of forced and voluntary cycling and correlated these features to improvements in motor function as measured by the Unified Parkinson’s Disease Rating Scale (UPDRS). Heart rate, cadence, and power were analyzed using entropy signal processing techniques. Pattern variability in heart rate and power was greater in the voluntary group than in the forced group. In contrast, variability in cadence was higher during forced cycling. UPDRS Motor III scores predicted from the pattern variability data were highly correlated with measured scores in the forced group. This study shows how time series analysis of biomechanical and physiological parameters of exercise can be used to predict improvements in motor function. This knowledge will be important in the development of optimal exercise-based rehabilitation programs for Parkinson’s disease. PMID:23144045

  2. OLYMPEX Data Workshop: GPM View

    NASA Technical Reports Server (NTRS)

    Petersen, W.

    2017-01-01

    OLYMPEX Primary Objectives: Datasets to enable: (1) Direct validation over complex terrain at multiple scales, liquid and frozen precip types, (a) Do we capture terrain and synoptic regime transitions, orographic enhancements/structure, full range of precipitation intensity (e.g., very light to heavy) and types, spatial variability? (b) How well can we estimate space/time-accumulated precipitation over terrain (liquid + frozen)? (2) Physical validation of algorithms in mid-latitude cold season frontal systems over ocean and complex terrain, (a) What are the column properties of frozen, melting, liquid hydrometeors-their relative contributions to estimated surface precipitation, transition under the influence of terrain gradients, and systematic variability as a function of synoptic regime? (3) Integrated hydrologic validation in complex terrain, (a) Can satellite estimates be combined with modeling over complex topography to drive improved products (assimilation, downscaling) [Level IV products] (b) What are capabilities and limitations for use of satellite-based precipitation estimates in stream/river flow forecasting?

  3. Effects of additional data on Bayesian clustering.

    PubMed

    Yamazaki, Keisuke

    2017-10-01

Hierarchical probabilistic models, such as mixture models, are used for cluster analysis. These models have two types of variables: observable and latent. In cluster analysis, the latent variable is estimated, and it is expected that additional information will improve the accuracy of the estimation of the latent variable. Many proposed learning methods are able to use additional data; these include semi-supervised learning and transfer learning. However, from a statistical point of view, a complex probabilistic model that encompasses both the initial and additional data might be less accurate due to having a higher-dimensional parameter. The present paper presents a theoretical analysis of the accuracy of such a model and clarifies which factor has the greatest effect on its accuracy, the advantages of obtaining additional data, and the disadvantages of increasing the complexity.

  4. Designing eHealth Applications to Reduce Cognitive Effort for Persons With Severe Mental Illness: Page Complexity, Navigation Simplicity, and Comprehensibility

    PubMed Central

    Spring, Michael R; Hanusa, Barbara H; Eack, Shaun M; Haas, Gretchen L

    2017-01-01

Background: eHealth technologies offer great potential for improving the use and effectiveness of treatments for those with severe mental illness (SMI), including schizophrenia and schizoaffective disorder. This potential can be muted by poor design. There is limited research on designing eHealth technologies for those with SMI, others with cognitive impairments, and those who are not technology savvy. We previously tested a design model, the Flat Explicit Design Model (FEDM), to create eHealth interventions for individuals with SMI. Subsequently, we developed the design concept of page complexity, defined via the design variables we created of distinct topic areas, distinct navigation areas, and number of columns used to organize contents, together with the variables of text reading level, text reading ease (a newly added variable to the FEDM), and the number of hyperlinks and number of words on a page. Objective: The objective of our study was to report the influence that the 19 variables of the FEDM have on the ability of individuals with SMI to use a website, ratings of a website’s ease of use, and performance on a novel usability task we created, termed content disclosure (a measure of the influence of a homepage’s design on the understanding users gain of a website). Finally, we assessed the performance of 3 groups or dimensions we developed that organize the 19 variables of the FEDM, termed page complexity, navigational simplicity, and comprehensibility. Methods: We measured 4 website usability outcomes: ability to find information, time to find information, ease of use, and a user’s ability to accurately judge a website’s contents. A total of 38 persons with SMI (chart diagnosis of schizophrenia or schizoaffective disorder) and 5 mental health websites were used to evaluate the importance of the new design concepts, as well as the other variables in the FEDM. Results: We found that 11 of the FEDM’s 19 variables were significantly associated with all 4 usability outcomes. Most other variables were significantly related to 2 or 3 of these usability outcomes. With the 5 tested websites, 7 of the 19 variables of the FEDM overlapped with other variables, resulting in 12 distinct variable groups. The 3 design dimensions had acceptable coefficient alphas. Both navigational simplicity and comprehensibility were significantly related to correctly identifying whether information was available on a website. Page complexity and navigational simplicity were significantly associated with the ability and time to find information and with ease-of-use ratings. Conclusions: The 19 variables and 3 dimensions (page complexity, navigational simplicity, and comprehensibility) of the FEDM offer evidence-based design guidance intended to reduce the cognitive effort required to effectively use eHealth applications, particularly for persons with SMI, and potentially others, including those with cognitive impairments and limited skills or experience with technology. The new variables we examined (topic areas, navigational areas, columns) offer additional and very simple ways to improve simplicity. PMID:28057610
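
    Two of the comprehensibility variables (word count and text reading ease) are easy to sketch. The naive syllable counter below makes the Flesch score only approximate and is an assumption, not the study's instrument.

    ```python
    # Toy page-comprehensibility metrics: word count and Flesch reading ease,
    # with a crude vowel-group heuristic standing in for a syllable counter.
    import re

    def syllables(word):
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syl = sum(syllables(w) for w in words)
        return 206.835 - 1.015 * len(words) / sentences - 84.6 * syl / len(words)

    page = "Take your medicine every day. Call us if you feel worse."
    print(len(page.split()), flesch_reading_ease(page))   # short words -> high ease
    ```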

  5. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    NASA Astrophysics Data System (ADS)

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase; thus, it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool for projecting future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes prioritizing mechanisms for implementation in models challenging. This paper therefore takes a novel approach to systematically mine and organize existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but they also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield, and we can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of the literature. The method can easily be applied to many other research fields.
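
    A toy sketch of the network idea: encode driver-to-process interactions as a weighted directed graph and rank node importance. The nodes, edges, and the use of PageRank on the reversed graph are illustrative stand-ins for the paper's 350 curated interactions.

    ```python
    # Rank drivers of yield variability in a small, invented literature network.
    import networkx as nx

    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("temperature", "photosynthesis", 0.9),
        ("precipitation", "soil moisture", 0.8),
        ("soil moisture", "photosynthesis", 0.6),
        ("photosynthesis", "biomass", 0.9),
        ("biomass", "yield variability", 0.7),
        ("temperature", "phenology", 0.5),
        ("phenology", "yield variability", 0.4),
    ])

    # PageRank on the reversed graph scores how strongly each node feeds into
    # sinks such as "yield variability" -- one plausible importance proxy.
    scores = nx.pagerank(G.reverse(), weight="weight")
    for node, s in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{node:18s} {s:.3f}")
    ```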

  6. Range expansion through fragmented landscapes under a variable climate

    PubMed Central

    Bennie, Jonathan; Hodgson, Jenny A; Lawson, Callum R; Holloway, Crispin TR; Roy, David B; Brereton, Tom; Thomas, Chris D; Wilson, Robert J

    2013-01-01

    Ecological responses to climate change may depend on complex patterns of variability in weather and local microclimate that overlay global increases in mean temperature. Here, we show that high-resolution temporal and spatial variability in temperature drives the dynamics of range expansion for an exemplar species, the butterfly Hesperia comma. Using fine-resolution (5 m) models of vegetation surface microclimate, we estimate the thermal suitability of 906 habitat patches at the species' range margin for 27 years. Population and metapopulation models that incorporate this dynamic microclimate surface improve predictions of observed annual changes to population density and patch occupancy dynamics during the species' range expansion from 1982 to 2009. Our findings reveal how fine-scale, short-term environmental variability drives rates and patterns of range expansion through spatially localised, intermittent episodes of expansion and contraction. Incorporating dynamic microclimates can thus improve models of species range shifts at spatial and temporal scales relevant to conservation interventions. PMID:23701124

  7. Complexity, Training Paradigm Design, and the Contribution of Memory Subsystems to Grammar Learning

    PubMed Central

    Ettlinger, Marc; Wong, Patrick C. M.

    2016-01-01

    Although there is variability in nonnative grammar learning outcomes, the contributions of training paradigm design and memory subsystems are not well understood. To examine this, we presented learners with an artificial grammar that formed words via simple and complex morphophonological rules. Across three experiments, we manipulated training paradigm design and measured subjects' declarative, procedural, and working memory subsystems. Experiment 1 demonstrated that passive, exposure-based training boosted learning of both simple and complex grammatical rules, relative to no training. Additionally, procedural memory correlated with simple rule learning, whereas declarative memory correlated with complex rule learning. Experiment 2 showed that presenting corrective feedback during the test phase did not improve learning. Experiment 3 revealed that structuring the order of training so that subjects are first exposed to the simple rule and then the complex improved learning. The cumulative findings shed light on the contributions of grammatical complexity, training paradigm design, and domain-general memory subsystems in determining grammar learning success. PMID:27391085

  8. Exhaust emission reduction for intermittent combustion aircraft engines

    NASA Technical Reports Server (NTRS)

    Moffett, R. N.

    1979-01-01

Three concepts for optimizing the performance, increasing the fuel economy, and reducing the exhaust emissions of the piston aircraft engine were investigated. High-energy multiple-spark discharge and spark plug tip penetration, ultrasonic fuel vaporization, and variable valve timing were evaluated individually. Ultrasonic fuel vaporization did not demonstrate sufficient improvement in distribution to offset the performance loss caused by the additional manifold restriction. High-energy ignition and the revised spark plug tip location provided no change in performance or emissions. Variable valve timing provided some performance benefit; however, even greater performance improvement was obtained through induction system tuning, which could be accomplished with far less complexity.

  9. Improving multi-objective reservoir operation optimization with sensitivity-informed dimension reduction

    NASA Astrophysics Data System (ADS)

    Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.

    2015-08-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
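
    A minimal sketch of the screening step, using the classic SALib API (recent releases also offer SALib.sample.sobol): estimate Sobol total-order indices for a placeholder objective, then keep only influential decision variables for the expensive search. The toy objective and the 0.01 cutoff are assumptions.

    ```python
    # Sensitivity-informed screening with Sobol total indices (SALib).
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    d = 10
    problem = {"num_vars": d,
               "names": [f"x{i}" for i in range(d)],
               "bounds": [[0.0, 1.0]] * d}

    def objective(x):                      # placeholder: only x0-x2 really matter
        return 3 * x[0] + 2 * x[1] ** 2 + x[0] * x[2] + 0.01 * x[3:].sum()

    X = saltelli.sample(problem, 1024)     # N must be a power of 2
    Y = np.apply_along_axis(objective, 1, X)
    Si = sobol.analyze(problem, Y)

    keep = [n for n, st in zip(problem["names"], Si["ST"]) if st > 0.01]
    print("decision variables kept for the full optimization:", keep)
    ```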

  10. Improving multi-objective reservoir operation optimization with sensitivity-informed problem decomposition

    NASA Astrophysics Data System (ADS)

    Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.

    2015-04-01

This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high-quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving complex multi-objective reservoir operation problems.

  11. An improved partial least-squares regression method for Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Momenpour Tehran Monfared, Ali; Anis, Hanan

    2017-10-01

It is known that the performance of partial least-squares (PLS) regression analysis can be improved using the backward variable selection method (BVSPLS). In this paper, we further improve BVSPLS based on a novel selection mechanism. The proposed method is based on sorting the weighted regression coefficients, and then the importance of each variable in the sorted list is evaluated using the root mean square error of prediction (RMSEP) criterion in each iteration step. Our improved BVSPLS (IBVSPLS) method has been applied to leukemia and heparin data sets and led to an improvement in the limit of detection of Raman biosensing ranging from 10% to 43% compared to PLS. Our IBVSPLS was also compared to the jack-knifing (simpler) and genetic algorithm (more complex) methods. Our method was consistently better than the jack-knifing method and showed either similar or better performance compared to the genetic algorithm.
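
    A hedged sketch of backward variable selection for PLS: rank variables by coefficient magnitude and drop the weakest while cross-validated RMSEP keeps improving. The ranking and stopping rules here are simplified, not the paper's exact IBVSPLS mechanism.

    ```python
    # Backward variable selection for PLS guided by cross-validated RMSEP.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(6)
    X = rng.normal(size=(120, 40))
    y = X[:, :4] @ np.array([1.5, -2.0, 1.0, 0.8]) + rng.normal(0, 0.2, 120)

    def rmsep(X, y):
        pred = cross_val_predict(PLSRegression(n_components=4), X, y, cv=5)
        return np.sqrt(np.mean((y - pred.ravel()) ** 2))

    active = list(range(X.shape[1]))
    best = rmsep(X, y)
    while len(active) > 4:
        coef = PLSRegression(n_components=4).fit(X[:, active], y).coef_.ravel()
        weakest = active[int(np.argmin(np.abs(coef)))]   # smallest |coefficient|
        trial = [v for v in active if v != weakest]
        err = rmsep(X[:, trial], y)
        if err > best:                      # stop once removal hurts prediction
            break
        active, best = trial, err

    print("selected variables:", active, "RMSEP:", round(best, 3))
    ```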

  12. Influence of learner knowledge and case complexity on handover accuracy and cognitive load: results from a simulation study.

    PubMed

    Young, John Q; van Dijk, Savannah M; O'Sullivan, Patricia S; Custers, Eugene J; Irby, David M; Ten Cate, Olle

    2016-09-01

The handover represents a high-risk event in which errors are common and lead to patient harm. A better understanding of the cognitive mechanisms of handover errors is essential to improving handover education and practice. This paper reports on an experiment conducted to study the effects of learner knowledge, case complexity (i.e. cases with or without a clear diagnosis) and their interaction on handover accuracy and cognitive load. Participants were 52 Dutch medical students in Years 2 and 6. The experiment employed a repeated-measures design with two explanatory variables: case complexity (simple or complex) as the within-subject variable, and learner knowledge (as indicated by illness script maturity) as the between-subject covariate. The dependent variables were handover accuracy and cognitive load. Each participant performed a total of four simulated handovers involving two simple cases and two complex cases. Higher illness script maturity predicted increased handover accuracy (p < 0.001) and lower cognitive load (p = 0.007). Case complexity did not independently affect either outcome. For handover accuracy, there was no interaction between case complexity and illness script maturity. For cognitive load, there was an interaction effect between illness script maturity and case complexity, indicating that more mature illness scripts reduced cognitive load less in complex cases than in simple cases. Students with more mature illness scripts performed more accurate handovers and experienced lower cognitive load. For cognitive load, these effects were more pronounced in simple than complex cases. If replicated, these findings suggest that handover curricula and protocols should provide support that varies according to the knowledge of the trainee.

  13. Use of complex hydraulic variables to predict the distribution and density of unionids in a side channel of the Upper Mississippi River

    USGS Publications Warehouse

    Steuer, J.J.; Newton, T.J.; Zigler, S.J.

    2008-01-01

Previous attempts to predict the importance of abiotic and biotic factors to unionids in large rivers have been largely unsuccessful. Many simple physical habitat descriptors (e.g., current velocity, substrate particle size, and water depth) have limited ability to predict unionid density. However, more recent studies have found that complex hydraulic variables (e.g., shear velocity, boundary shear stress, and Reynolds number) may be more useful predictors of unionid density. We performed a retrospective analysis with unionid density, current velocity, and substrate particle size data from 1987 to 1988 in a 6-km reach of the Upper Mississippi River near Prairie du Chien, Wisconsin. We used these data to model simple and complex hydraulic variables under low and high flow conditions. We then used classification and regression tree analysis to examine the relationships between hydraulic variables and unionid density. We found that boundary Reynolds number, Froude number, boundary shear stress, and grain size were the best predictors of density. Models with complex hydraulic variables were a substantial improvement over previously published discriminant models and correctly classified 65-88% of the observations for the total mussel fauna and six species. These data suggest that unionid beds may be constrained by threshold limits at both ends of the flow regime. Under low flow, mussels may require a minimum value of a hydraulic variable (Re*, Fr) to transport nutrients, oxygen, and waste products. Under high flow, areas with relatively low boundary shear stress may provide a hydraulic refuge for mussels. Data on hydraulic preferences and identification of other conditions that constitute unionid habitat are needed to help restore and enhance habitats for unionids in rivers.
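
    A sketch of the two steps combined here: derive complex hydraulic variables from standard open-channel relations, then fit a regression tree for density. The data, the log-law roughness assumption for shear velocity, and the density model are invented for illustration.

    ```python
    # (1) Complex hydraulic variables from depth, velocity, and grain size;
    # (2) a CART model relating them to a synthetic mussel density.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(7)
    n = 300
    h = rng.uniform(0.5, 5.0, n)            # depth (m)
    U = rng.uniform(0.1, 1.5, n)            # mean velocity (m/s)
    d50 = rng.uniform(0.2e-3, 50e-3, n)     # median grain size (m)
    g, nu, rho = 9.81, 1.0e-6, 1000.0

    Fr = U / np.sqrt(g * h)                                  # Froude number
    u_star = U / (5.75 * np.log10(12 * h / (3 * d50)))       # shear velocity (log law)
    tau = rho * u_star**2                                    # boundary shear stress (Pa)
    Re_star = u_star * d50 / nu                              # boundary Reynolds number

    density = np.where((Re_star > 50) & (tau < 5), 20, 2) + rng.poisson(2, n)
    X = np.c_[Fr, u_star, tau, Re_star, d50]
    tree = DecisionTreeRegressor(max_depth=3).fit(X, density)
    print(dict(zip(["Fr", "u*", "tau", "Re*", "d50"], tree.feature_importances_)))
    ```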

  14. Treatment of VGKC complex antibody-associated limbic encephalitis: a systematic review.

    PubMed

    Radja, Guirindhra Koumar; Cavanna, Andrea Eugenio

    2013-01-01

    Limbic encephalitis is an autoimmune neuropsychiatric condition characterized by subacute cognitive symptoms, seizures, and affective changes. Although limbic encephalitis is usually caused by an immune reaction secondary to neoplasms, different types of potentially treatable non-paraneoplastic limbic encephalitis (nPLE) have recently been described. In particular, published studies have reported variable responses to immunosuppressive therapy in Voltage-Gated Potassium Channel (VGKC) complex antibody-associated nPLE. This systematic literature review found that the most significant improvements were reported by patients presenting with affective symptoms and consistent neuroradiological changes. In these patients, improved clinical outcomes correlated with the largest decreases in antibody titers.

  15. Contracting to improve your revenue cycle performance.

    PubMed

    Welter, Terri L; Semko, George A; Miller, Tony; Lauer, Roberta

    2007-09-01

The following key drivers of commercial contract variability can have a material effect on your hospital's revenue cycle: claim form variance, benefit design, contract complexity, coding variance, medical necessity, precertification/authorization, claim adjudication/appeal requirements, additional documentation requirements, timeliness of payment, and third-party payer activity.

  16. Diurnal Evolution and Annual Variability of Boundary Layer Height in the Columbia River Gorge through the 'Eye' of Wind Profiling Radars

    NASA Astrophysics Data System (ADS)

    Bianco, L.; Djalalova, I.; Konopleva-Akish, E.; Kenyon, J.; Olson, J. B.; Wilczak, J. M.

    2016-12-01

The Second Wind Forecast Improvement Project (WFIP2) is a DoE- and NOAA-sponsored program whose goal is to improve the accuracy of numerical weather prediction (NWP) forecasts in complex terrain. WFIP2 consists of an 18-month (October 2015 - March 2017) field campaign held in the Columbia River basin, in the Pacific Northwest of the U.S. As part of WFIP2 a large suite of in-situ and remote sensing instrumentation has been deployed, including, among several others, a network of eight 915-MHz wind profiling radars (WPRs) equipped with radio acoustic sounding systems (RASSs), and many surface meteorological stations. The diurnal evolution and annual variability of boundary layer height in the area of WFIP2 will be investigated through the 'eye' of WPRs, employing state-of-the-art automated algorithms based on fuzzy logic and artificial intelligence. The results will be used to evaluate possible errors in NWP models in this area of complex terrain.

  17. Noise and poise: Enhancement of postural complexity in the elderly with a stochastic-resonance based therapy

    NASA Astrophysics Data System (ADS)

    Costa, M.; Priplata, A. A.; Lipsitz, L. A.; Wu, Z.; Huang, N. E.; Goldberger, A. L.; Peng, C.-K.

    2007-03-01

Pathologic states are associated with a loss of dynamical complexity. Therefore, therapeutic interventions that increase physiologic complexity may enhance health status. Using multiscale entropy analysis, we show that the postural sway dynamics of healthy young and healthy elderly subjects are more complex than those of elderly subjects with a history of falls. Application of subsensory noise to the feet has been demonstrated to improve postural stability in the elderly. We next show that this therapy significantly increases the multiscale complexity of sway fluctuations in healthy elderly subjects. Quantification of changes in the dynamical complexity of biologic variability may be the basis of a new approach to assessing risk and to predicting the efficacy of clinical interventions, including noise-based therapies.
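
    A minimal multiscale entropy sketch in the spirit of this analysis: coarse-grain the signal at each scale and compute sample entropy with a fixed tolerance. The parameters m = 2 and r = 0.15 SD are common conventions, not necessarily the paper's settings, and the test signal is synthetic.

    ```python
    # Multiscale entropy (MSE): sample entropy of coarse-grained series.
    import numpy as np

    def sampen(x, m=2, r=0.2):
        def count(m):
            templ = np.lib.stride_tricks.sliding_window_view(x, m)
            d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=-1)
            n = len(templ)
            return (np.sum(d <= r) - n) / (n * (n - 1))   # exclude self-matches
        return -np.log(count(m + 1) / count(m))

    def mse(x, max_scale=5):
        r = 0.15 * np.std(x)                # tolerance fixed from the original SD
        return [sampen(x[:len(x) // t * t].reshape(-1, t).mean(axis=1), r=r)
                for t in range(1, max_scale + 1)]

    rng = np.random.default_rng(8)
    print(np.round(mse(rng.normal(size=1000)), 2))  # white noise: entropy falls with scale
    ```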

  18. Exploring total cardiac variability in healthy and pathophysiological subjects using improved refined multiscale entropy.

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2017-02-01

Multiscale entropy (MSE) and refined multiscale entropy (RMSE) techniques are widely used to evaluate the complexity of a time series across multiple time scales 't'. Both these techniques, at certain time scales (sometimes over the entire range of time scales, in the case of RMSE), assign higher entropy to the HRV time series of certain pathologies than to those of healthy subjects, and to their corresponding randomized surrogate time series. This incorrect assessment of signal complexity may be due to the fact that these techniques suffer from the following limitations: (1) the threshold value 'r' is updated as a function of the long-term standard deviation and hence is unable to explore the short-term variability, as well as the substantial variability inherent in the beat-to-beat fluctuations, of a long-term HRV time series; (2) in RMSE, the entropy values assigned to the different filtered scaled time series are the result of changes in variance and do not completely reflect the real structural organization inherent in the original time series. In the present work, we propose an improved RMSE (I-RMSE) technique by introducing a new procedure to set the threshold value that takes into account the period-to-period variability inherent in a signal, and we evaluated it on simulated and real HRV databases. The proposed I-RMSE assigns higher entropy to age-matched healthy subjects than to patients suffering from atrial fibrillation, congestive heart failure, sudden cardiac death and diabetes mellitus, over the entire range of time scales. The results strongly support the reduction in complexity of HRV time series in the female group, the old-aged, patients suffering from severe cardiovascular and non-cardiovascular diseases, and in their corresponding surrogate time series.

  19. Complexity quantification of cardiac variability time series using improved sample entropy (I-SampEn).

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2016-09-01

The sample entropy (SampEn) has been widely used to quantify the complexity of RR-interval time series. It is a fact that higher complexity, and hence entropy, is associated with the RR-interval time series of healthy subjects. But SampEn suffers from the disadvantage that it assigns higher entropy to randomized surrogate time series, as well as to certain pathological time series, which is a misleading observation. This wrong estimation of the complexity of a time series may be due to the fact that the existing SampEn technique updates the threshold value as a function of the long-term standard deviation (SD) of a time series. However, the time series of certain pathologies exhibit substantial variability in beat-to-beat fluctuations, so the SD of the first-order difference (short-term SD) of the time series should be considered when updating the threshold value, to account for the period-to-period variations inherent in a time series. In the present work, improved sample entropy (I-SampEn), a new methodology, is proposed in which the threshold value is updated by considering the period-to-period variations of a time series. The I-SampEn technique assigns higher entropy values to age-matched healthy subjects than to patients suffering from atrial fibrillation (AF) and diabetes mellitus (DM). Our results are in agreement with the theory of reduced complexity of RR-interval time series in patients suffering from chronic cardiovascular and non-cardiovascular diseases.
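
    In essence, the I-SampEn modification reduces to how the tolerance r is set; a two-line comparison (0.2 is the customary multiplier, and the RR series below is a toy):

    ```python
    # Classic vs. improved tolerance for sample entropy of an RR series.
    import numpy as np

    rr = np.random.default_rng(9).normal(0.8, 0.05, 1000)   # toy RR intervals (s)
    r_classic = 0.2 * np.std(rr)             # classic SampEn: long-term SD
    r_improved = 0.2 * np.std(np.diff(rr))   # I-SampEn: SD of first-order differences
    print(r_classic, r_improved)
    ```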

  20. Hybrid genetic algorithm with an adaptive penalty function for fitting multimodal experimental data: application to exchange-coupled non-Kramers binuclear iron active sites.

    PubMed

    Beaser, Eric; Schwartz, Jennifer K; Bell, Caleb B; Solomon, Edward I

    2011-09-26

A genetic algorithm (GA) is a stochastic optimization technique based on the mechanisms of biological evolution. These algorithms have been successfully applied in many fields to solve a variety of complex nonlinear problems. While they have been used with some success in chemical problems such as fitting spectroscopic and kinetic data, many have avoided their use because of the unconstrained nature of the fitting process. In engineering, this problem is now being addressed through the incorporation of adaptive penalty functions, but their transfer to other fields has been slow. This study updates the Nanakorn adaptive penalty function theory, expanding its validity beyond maximization problems to minimization as well. The expanded theory, using a hybrid genetic algorithm with an adaptive penalty function, was applied to analyze variable-temperature, variable-field magnetic circular dichroism (VTVH MCD) spectroscopic data collected on exchange-coupled Fe(II)Fe(II) enzyme active sites. The data obtained are described by a complex nonlinear multimodal solution space with at least 6 to 13 interdependent variables and are costly to search efficiently. The use of the hybrid GA is shown to improve the probability of detecting the global optimum. It also provides large gains in computational and user efficiency. This method allows a full search of a multimodal solution space, greatly improving the quality of and confidence in the final solution obtained, and can be applied to other complex systems such as the fitting of other spectroscopic or kinetics data.
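
    A simplified sketch of a GA with an adaptive penalty: the penalty weight is rescaled each generation from the current population so that infeasible candidates cannot dominate feasible ones. The test problem, operators, and rescaling rule are stand-ins for the paper's hybrid scheme.

    ```python
    # GA minimizing f(x) = x0^2 + x1^2 subject to x0 + x1 >= 1, with a
    # population-adaptive penalty weight. Optimum is near (0.5, 0.5).
    import numpy as np

    rng = np.random.default_rng(10)

    def f(x):
        return np.sum(x**2, axis=1)

    def violation(x):                        # constraint violation, >= 0
        return np.maximum(0.0, 1.0 - x[:, 0] - x[:, 1])

    pop = rng.uniform(-2, 2, size=(60, 2))
    for gen in range(100):
        fit, v = f(pop), violation(pop)
        feas = v == 0
        # adaptive weight: scale so infeasible points can't beat feasible ones
        w = (fit[feas].max() - fit.min()) / v.max() if feas.any() and v.max() > 0 else 1e3
        score = fit + w * v
        # tournament selection plus Gaussian mutation (deliberately minimal operators)
        i, j = rng.integers(0, len(pop), (2, len(pop)))
        parents = np.where((score[i] < score[j])[:, None], pop[i], pop[j])
        pop = parents + rng.normal(0, 0.1, parents.shape)

    print(pop[np.argmin(f(pop) + 1e3 * violation(pop))])   # expected near [0.5, 0.5]
    ```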

  1. Factor complexity of crash occurrence: An empirical demonstration using boosted regression trees.

    PubMed

    Chung, Yi-Shih

    2013-12-01

Factor complexity is a characteristic of traffic crashes. This paper proposes a novel method, namely boosted regression trees (BRT), to investigate the complex and nonlinear relationships in high-variance traffic crash data. Taiwanese 2004-2005 single-vehicle motorcycle crash data are used to demonstrate the utility of BRT. Traditional logistic regression and classification and regression tree (CART) models are also used to compare estimation results and external validities. Both the in-sample cross-validation and out-of-sample validation results show that an increase in tree complexity provides improved, although diminishing, classification performance, indicating a limited factor complexity of single-vehicle motorcycle crashes. The effects of crucial variables, including geographical, time, and sociodemographic factors, explain some fatal crashes. Relatively unique fatal crashes are better approximated by interaction terms, especially combinations of behavioral factors. BRT models generally provide better transferability than conventional logistic regression and CART models. This study also discusses the implications of the results for devising safety policies.
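
    A small sketch of the paper's style of demonstration: boosted trees of increasing depth (interaction complexity) versus cross-validated accuracy, on synthetic crash-like data containing a single two-way interaction. Accuracy should improve from depth 1 to 2 and then plateau, mirroring "limited factor complexity".

    ```python
    # Boosted regression trees: effect of tree depth on predictive accuracy.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(11)
    X = rng.normal(size=(2000, 8))                   # geographic/time/demographic stand-ins
    logit = 0.8 * X[:, 0] + 0.6 * X[:, 1] * X[:, 2]  # one two-way interaction only
    y = (rng.random(2000) < 1 / (1 + np.exp(-logit))).astype(int)

    for depth in (1, 2, 4):
        brt = GradientBoostingClassifier(n_estimators=200, max_depth=depth)
        print(depth, cross_val_score(brt, X, y, cv=5).mean())
    ```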

  2. Force control compensation method with variable load stiffness and damping of the hydraulic drive unit force control system

    NASA Astrophysics Data System (ADS)

    Kong, Xiangdong; Ba, Kaixian; Yu, Bin; Cao, Yuan; Zhu, Qixin; Zhao, Hualong

    2016-05-01

    Each joint of a hydraulic drive quadruped robot is driven by a hydraulic drive unit (HDU), and the contact between the robot foot and the ground is complex and variable, which inevitably increases the difficulty of force control. In recent years many control methods, such as disturbance-rejection control, parameter self-adaptive control, and impedance control, have been investigated to improve the force control performance of the HDU, but its robustness still needs improvement. Two key issues are therefore addressed in this paper: how to simulate the complex and variable load characteristics of the environment structure, and how to ensure that the HDU retains excellent force control performance under those load characteristics. The force control system of the HDU is modeled mechanistically, and theoretical models of a novel force control compensation method and of a load characteristics simulation method are derived for different environment structures, taking into account the dynamics of the load stiffness and load damping. Simulation effects of variable load stiffness and damping under step and sinusoidal load forces are then analyzed experimentally on the HDU force control performance test platform, providing the foundation for the compensation experiments. Optimized PID control parameters are designed so that the HDU has good force control performance at a nominal load stiffness and damping; the compensation method is then introduced, and the robustness of the force control system is compared experimentally under several constant load characteristics and under variable load characteristics. The results indicate that, when the load characteristics are known, the proposed compensation method counteracts variations in load characteristics: it reduces their effect on force control performance and enhances robustness with constant PID parameters, so that complex online PID parameter tuning need not be adopted. This work provides a theoretical and experimental foundation for highly robust force control of quadruped robot joints.
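
    The paper's compensator is derived from the HDU mechanism model, which the record does not reproduce; the sketch below is only a toy illustration of the idea under stated assumptions: a spring-damper load (stiffness k, damping c) and a feed-forward term that cancels the estimated load force inside a fixed-gain PI force loop. All parameters are invented.

```python
import numpy as np

# Toy discrete-time model: a piston of mass m drives a spring-damper load
# (stiffness k, damping c); a fixed-gain PI force loop tracks f_ref, with an
# optional feed-forward term that compensates the estimated load force.
dt, T = 1e-3, 2.0
k, c, m = 5e4, 2e2, 10.0        # load stiffness, damping, moving mass
kp, ki = 0.5, 2.0               # fixed controller gains

def simulate(compensate, f_ref=100.0):
    x = v = integ = 0.0
    f_hist = []
    for _ in range(int(T / dt)):
        f_load = k * x + c * v            # measured contact force
        e = f_ref - f_load
        integ += e * dt
        u = kp * e + ki * integ
        if compensate:
            u += f_load                   # feed-forward load compensation
        a = (u - f_load) / m              # piston dynamics
        v += a * dt
        x += v * dt
        f_hist.append(f_load)
    return np.array(f_hist)

for comp in (False, True):
    err = np.mean(np.abs(simulate(comp) - 100.0))
    print(f"compensation={comp}: mean |force error| = {err:.1f} N")
```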

  3. Integrating Decision Making and Mental Health Interventions Research: Research Directions

    PubMed Central

    Wills, Celia E.; Holmes-Rovner, Margaret

    2006-01-01

    The importance of incorporating patient and provider decision-making processes is in the forefront of the National Institute of Mental Health (NIMH) agenda for improving mental health interventions and services. Key concepts in patient decision making are highlighted within a simplified model of patient decision making that links patient-level/“micro” variables to services-level/“macro” variables via the decision-making process that is a target for interventions. The prospective agenda for incorporating decision-making concepts in mental health research includes (a) improved measures for characterizing decision-making processes that are matched to study populations, complexity, and types of decision making; (b) testing decision aids in effectiveness research for diverse populations and clinical settings; and (c) improving the understanding and incorporation of preference concepts in enhanced intervention designs. PMID:16724158

  4. Utilizing multiple scale models to improve predictions of extra-axial hemorrhage in the immature piglet.

    PubMed

    Scott, Gregory G; Margulies, Susan S; Coats, Brittany

    2016-10-01

    Traumatic brain injury (TBI) is a leading cause of death and disability in the USA. To help understand and better predict TBI, researchers have developed complex finite element (FE) models of the head which incorporate many biological structures such as scalp, skull, meninges, brain (with gray/white matter differentiation), and vasculature. However, most models drastically simplify the membranes and substructures between the pia and arachnoid membranes. We hypothesize that substructures in the pia-arachnoid complex (PAC) contribute substantially to brain deformation following head rotation, and that including them in FE models improves the accuracy of extra-axial hemorrhage prediction. To test these hypotheses, microscale FE models of the PAC were developed to span the variability of PAC substructure anatomy and regional density. The constitutive response of these models was then integrated into an existing macroscale FE model of the immature piglet brain to identify changes in cortical stress distribution and predictions of extra-axial hemorrhage (EAH). Incorporating regional variability of PAC substructures substantially altered the distribution of principal stress on the cortical surface of the brain compared to a uniform representation of the PAC. Simulations of 24 non-impact rapid head rotations in an immature piglet animal model resulted in improved accuracy of EAH prediction (to 94% sensitivity, 100% specificity), as well as high accuracy in regional hemorrhage prediction (to 82-100% sensitivity, 100% specificity). We conclude that including biofidelic PAC substructure variability in FE models of the head is essential for improved predictions of hemorrhage at the brain/skull interface.

  5. Handling Practicalities in Agricultural Policy Optimization for Water Quality Improvements

    EPA Science Inventory

    Bilevel and multi-objective optimization methods are often useful to spatially target agri-environmental policy throughout a watershed. This type of problem is complex and is comprised of a number of practicalities: (i) a large number of decision variables, (ii) at least two inte...

  6. Enhanced leaf photosynthesis as a target to increase grain yield: insights from transgenic rice lines with variable Rieske FeS protein content in the cytochrome b6/f complex.

    PubMed

    Yamori, Wataru; Kondo, Eri; Sugiura, Daisuke; Terashima, Ichiro; Suzuki, Yuji; Makino, Amane

    2016-01-01

    Although photosynthesis is the most important source for biomass and grain yield, a lack of correlation between photosynthesis and plant yield among different genotypes of various crop species has been frequently observed. Such observations contribute to the ongoing debate whether enhancing leaf photosynthesis can improve yield potential. Here, transgenic rice plants that contain variable amounts of the Rieske FeS protein in the cytochrome (cyt) b6/f complex between 10 and 100% of wild-type levels have been used to investigate the effect of reductions of these proteins on photosynthesis, plant growth and yield. Reductions of the cyt b6/f complex did not affect the electron transport rates through photosystem I but decreased electron transport rates through photosystem II, leading to concomitant decreases in CO2 assimilation rates. There was a strong control of plant growth and grain yield by the rate of leaf photosynthesis, leading to the conclusion that enhancing photosynthesis at the single-leaf level would be a useful target for improving crop productivity and yield both via conventional breeding and biotechnology. The data here also suggest that changing photosynthetic electron transport rates via manipulation of the cyt b6/f complex could be a potential target for enhancing photosynthetic capacity in higher plants. © 2015 John Wiley & Sons Ltd.

  7. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
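
    The record does not include the Arena model; as a rough flavor of DES applied to a planning pipeline, here is a minimal sketch in Python with the simpy library, assuming an invented three-stage process (contouring, dose planning, review) and illustrative staffing and duration parameters, not BCCA data.

```python
import random
import simpy

random.seed(3)

# Toy DES of a radiation therapy planning pipeline: contour -> plan -> review.
def patient(env, oncologist, planner, log):
    arrive = env.now
    with oncologist.request() as req:
        yield req
        yield env.timeout(random.expovariate(1 / 2.0))   # contouring, ~2 days
    with planner.request() as req:
        yield req
        yield env.timeout(random.expovariate(1 / 3.0))   # dose planning, ~3 days
    with oncologist.request() as req:
        yield req
        yield env.timeout(random.expovariate(1 / 1.0))   # plan review, ~1 day
    log.append(env.now - arrive)

def run(n_oncologists=2, n_planners=3, n_patients=200):
    env = simpy.Environment()
    onc = simpy.Resource(env, capacity=n_oncologists)
    pln = simpy.Resource(env, capacity=n_planners)
    log = []

    def source():
        for _ in range(n_patients):
            env.process(patient(env, onc, pln, log))
            yield env.timeout(random.expovariate(1 / 0.5))  # arrivals

    env.process(source())
    env.run()
    return sum(log) / len(log)

print("mean planning time:", round(run(), 1), "days")
print("with a third oncologist:", round(run(n_oncologists=3), 1), "days")
```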

  8. Parallel-Vector Algorithm For Rapid Structural Analysis

    NASA Technical Reports Server (NTRS)

    Agarwal, Tarun R.; Nguyen, Duc T.; Storaasli, Olaf O.

    1993-01-01

    New algorithm developed to overcome deficiency of skyline storage scheme by use of variable-band storage scheme. Exploits both parallel and vector capabilities of modern high-performance computers. Gives engineers and designers opportunity to include more design variables and constraints during optimization of structures. Enables use of more refined finite-element meshes to obtain improved understanding of complex behaviors of aerospace structures leading to better, safer designs. Not only attractive for current supercomputers but also for next generation of shared-memory supercomputers.

  9. The Effect of a Six-Month Dancing Program on Motor-Cognitive Dual-Task Performance in Older Adults.

    PubMed

    Hamacher, Dennis; Hamacher, Daniel; Rehfeld, Kathrin; Hökelmann, Anita; Schega, Lutz

    2015-10-01

    Dancing is a complex sensorimotor activity involving physical and mental elements which have positive effects on cognitive functions and motor control. The present randomized controlled trial aims to analyze the effects of a dancing program on the performance on a motor-cognitive dual task. Data of 35 older adults, who were assigned to a dancing group or a health-related exercise group, are presented in the study. In pretest and posttest, we assessed cognitive performance and variability of minimum foot clearance, stride time, and stride length while walking. Regarding the cognitive performance and the stride-to-stride variability of minimum foot clearance, interaction effects have been found, indicating that dancing lowers gait variability to a higher extent than conventional health-related exercise. The data show that dancing improves minimum foot clearance variability and cognitive performance in a dual-task situation. Multi-task exercises (like dancing) might be a powerful tool to improve motor-cognitive dual-task performance.

  10. Beginning the Principalship: A Practical Guide for New School Leaders.

    ERIC Educational Resources Information Center

    Daresh, John C.; Playko, Marsha A.

    The most critical variable for school success is the leadership behavior of the school principal. However, the principal's role is becoming increasingly complex. This book offers suggestions for the beginning principal. Each chapter is geared to help new principals develop personal plans for professional development and improvement. Following the…

  11. Regression Analysis of Optical Coherence Tomography Disc Variables for Glaucoma Diagnosis.

    PubMed

    Richter, Grace M; Zhang, Xinbo; Tan, Ou; Francis, Brian A; Chopra, Vikas; Greenfield, David S; Varma, Rohit; Schuman, Joel S; Huang, David

    2016-08-01

    To report diagnostic accuracy of optical coherence tomography (OCT) disc variables using both time-domain (TD) and Fourier-domain (FD) OCT, and to improve the use of OCT disc variable measurements for glaucoma diagnosis through regression analyses that adjust for optic disc size and axial length-based magnification error. Observational, cross-sectional. In total, 180 normal eyes of 112 participants and 180 eyes of 138 participants with perimetric glaucoma from the Advanced Imaging for Glaucoma Study. Diagnostic variables evaluated from TD-OCT and FD-OCT were: disc area, rim area, rim volume, optic nerve head volume, vertical cup-to-disc ratio (CDR), and horizontal CDR. These were compared with overall retinal nerve fiber layer thickness and ganglion cell complex. Regression analyses were performed that corrected for optic disc size and axial length. Area-under-receiver-operating curves (AUROC) were used to assess diagnostic accuracy before and after the adjustments. An index based on multiple logistic regression that combined optic disc variables with axial length was also explored with the aim of improving diagnostic accuracy of disc variables. Comparison of diagnostic accuracy of disc variables, as measured by AUROC. The unadjusted disc variables with the highest diagnostic accuracies were: rim volume for TD-OCT (AUROC=0.864) and vertical CDR (AUROC=0.874) for FD-OCT. Magnification correction significantly worsened diagnostic accuracy for rim variables, and while optic disc size adjustments partially restored diagnostic accuracy, the adjusted AUROCs were still lower. Axial length adjustments to disc variables in the form of multiple logistic regression indices led to a slight but insignificant improvement in diagnostic accuracy. Our various regression approaches were not able to significantly improve disc-based OCT glaucoma diagnosis. However, disc rim area and vertical CDR had very high diagnostic accuracy, and these disc variables can serve to complement additional OCT measurements for diagnosis of glaucoma.
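
    To make the regression-index idea concrete, here is a hedged sketch with scikit-learn on synthetic data (invented variables and effect sizes, not the AIGS dataset): single disc variables are scored by AUROC, then a multiple logistic regression index combines disc variables with axial length.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 360
# Synthetic stand-ins for the study variables.
axial_len = rng.normal(24.0, 1.2, n)           # mm
disc_area = rng.normal(1.9, 0.4, n)            # mm^2
glaucoma = rng.random(n) < 0.5
vertical_cdr = np.clip(0.45 + 0.25 * glaucoma + 0.05 * (disc_area - 1.9)
                       + rng.normal(0, 0.1, n), 0, 1)
rim_area = np.clip(1.3 - 0.5 * glaucoma + 0.3 * (disc_area - 1.9)
                   + rng.normal(0, 0.15, n), 0.1, None)

# Single-variable diagnostic accuracy (AUROC); rim area is negated because
# smaller rims indicate disease.
print("vertical CDR alone :", roc_auc_score(glaucoma, vertical_cdr).round(3))
print("rim area alone     :", roc_auc_score(glaucoma, -rim_area).round(3))

# Multiple logistic regression index combining disc variables with axial
# length (fit and scored in-sample, which is optimistic; a sketch only).
X = np.column_stack([vertical_cdr, rim_area, disc_area, axial_len])
index = LogisticRegression(max_iter=1000).fit(X, glaucoma).predict_proba(X)[:, 1]
print("combined index     :", roc_auc_score(glaucoma, index).round(3))
```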

  12. Improved result on stability analysis of discrete stochastic neural networks with time delay

    NASA Astrophysics Data System (ADS)

    Wu, Zhengguang; Su, Hongye; Chu, Jian; Zhou, Wuneng

    2009-04-01

    This Letter investigates the problem of exponential stability for discrete stochastic time-delay neural networks. By defining a novel Lyapunov functional, an improved delay-dependent exponential stability criterion is established in terms of the linear matrix inequality (LMI) approach. Meanwhile, the computational complexity of the newly established stability condition is reduced because fewer variables are involved. A numerical example is given to illustrate the effectiveness and benefits of the proposed method.

  13. Datamining approaches for modeling tumor control probability.

    PubMed

    Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D

    2010-11-01

    Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed, including dose-volume metrics, equivalent uniform dose, a mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs=0.68 on leave-one-out testing compared to logistic regression (rs=0.4), Poisson-based TCP (rs=0.33), and the cell kill equivalent uniform dose model (rs=0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
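
    The workflow the abstract describes (two selected predictors, leave-one-out testing, Spearman rs) can be sketched with scikit-learn and scipy on synthetic stand-ins; the data-generating model and coefficients below are invented, not the NSCLC dataset.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

rng = np.random.default_rng(5)
n = 56
# Synthetic stand-ins for the two selected predictors.
gtv_volume = rng.lognormal(3.5, 0.8, n)          # cm^3
v75 = rng.uniform(0, 40, n)                      # % volume receiving >= 75 Gy
p = 1 / (1 + np.exp(-(-1.0 - 0.01 * gtv_volume + 0.08 * v75)))
control = rng.random(n) < p                      # tumor control outcome
X = np.column_stack([np.log(gtv_volume), v75])

for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("SVM-RBF ", SVC(kernel="rbf", probability=True))]:
    preds = np.empty(n)
    for train, test in LeaveOneOut().split(X):   # leave-one-out testing
        fit = model.fit(X[train], control[train])
        preds[test] = fit.predict_proba(X[test])[:, 1]
    rs, _ = spearmanr(preds, control)
    print(f"{name}: leave-one-out Spearman rs = {rs:.2f}")
```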

  14. Development of a computerized assessment of clinician adherence to a treatment guideline for patients with bipolar disorder.

    PubMed

    Dennehy, Ellen B; Suppes, Trisha; John Rush, A; Lynn Crismon, M; Witte, B; Webster, J

    2004-01-01

    The adoption of treatment guidelines for complex psychiatric illness is increasing. Treatment decisions in psychiatry depend on a number of variables, including severity of symptoms, past treatment history, patient preferences, medication tolerability, and clinical response. While patient outcomes may be improved by the use of treatment guidelines, there is no agreed upon standard by which to assess the degree to which clinician behavior corresponds to those recommendations. This report presents a method to assess clinician adherence to the complex multidimensional treatment guideline for bipolar disorder utilized in the Texas Medication Algorithm Project. The steps involved in the development of this system are presented, including the reliance on standardized documentation, defining core variables of interest, selecting criteria for operationalization of those variables, and computerization of the assessment of adherence. The computerized assessment represents an improvement over other assessment methods, which have relied on laborious and costly chart reviews to extract clinical information and to analyze provider behavior. However, it is limited by the specificity of decisions that guided the adherence scoring process. Preliminary findings using this system with 2035 clinical visits conducted for the bipolar disorder module of TMAP Phase 3 are presented. These data indicate that this system of guideline adherence monitoring is feasible.

  15. The effect of workstation and task variables on forces applied during simulated meat cutting.

    PubMed

    McGorry, Raymond W; Dempsey, Patrick G; O'Brien, Niall V

    2004-12-01

    The purpose of the study was to investigate factors related to force and postural exposure during a simulated meat cutting task. The hypothesis was that workstation, tool and task variables would affect the dependent kinetic variables of gripping force, cutting moment and the dependent kinematic variables of elbow elevation and wrist angular displacement in the flexion/extension and radial/ulnar deviation planes. To evaluate this hypothesis a 3 x 3 x 2 x 2 x 2 (surface orientation by surface height by blade angle by cut complexity by work pace) within-subject factorial design was conducted with 12 participants. The results indicated that the variables can act and interact to modify the kinematics and kinetics of a cutting task. Participants used greater grip force and cutting moment when working at a pace based on productivity. The interactions of the work surface height and orientation indicated that the use of an adjustable workstation could minimize wrist deviation from neutral and improve shoulder posture during cutting operations. Angling the knife blade also interacted with workstation variables to improve wrist and upper extremity posture, but this benefit must be weighed against the potential for small increases in force exposure.

  16. A system of three-dimensional complex variables

    NASA Technical Reports Server (NTRS)

    Martin, E. Dale

    1986-01-01

    Some results of a new theory of multidimensional complex variables are reported, including analytic functions of a three-dimensional (3-D) complex variable. Three-dimensional complex numbers are defined, including vector properties and rules of multiplication. The necessary conditions for a function of a 3-D variable to be analytic are given and shown to be analogous to the 2-D Cauchy-Riemann equations. A simple example also demonstrates the analogy between the newly defined 3-D complex velocity and 3-D complex potential and the corresponding ordinary complex velocity and complex potential in two dimensions.
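
    The record states the analogy but not the equations; for reference, the classical 2-D relations being generalized are (standard results, not the paper's 3-D conditions):

```latex
% Analyticity of f(z) = u(x,y) + i v(x,y), with z = x + iy (Cauchy-Riemann):
\[
\frac{\partial u}{\partial x} = \frac{\partial v}{\partial y},
\qquad
\frac{\partial u}{\partial y} = -\frac{\partial v}{\partial x}.
\]
% 2-D complex potential and complex velocity for ideal flow with velocity
% field (v_x, v_y), potential \phi and stream function \psi:
\[
F(z) = \phi + i\,\psi,
\qquad
w(z) = \frac{dF}{dz} = v_x - i\,v_y .
\]
```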

  17. Effects of organizational complexity and resources on construction site risk.

    PubMed

    Forteza, Francisco J; Carretero-Gómez, Jose M; Sesé, Albert

    2017-09-01

    Our research is aimed at studying the relationship between risk level and organizational complexity and resources on construction sites. Our general hypothesis is that site complexity increases risk, whereas more resources in the organizational structure decrease risk. A Structural Equation Model (SEM) approach was adopted to validate our theoretical model. To develop our study, 957 building sites in Spain were visited and assessed in 2003-2009. All needed data were obtained using a specific tool developed by the authors to assess site risk, structure, and resources (Construction Sites Risk Assessment Tool, or CONSRAT). This tool operationalizes the variables to fit our model, specifically via a site risk index (SRI) and 10 organizational variables. Our random sample is composed largely of small building sites with generally high levels of risk, moderate complexity, and low on-site resources. The model obtained adequate fit, and the results provide empirical evidence that the factors of complexity and resources can be considered predictors of site risk level. Consequently, these results can help companies, construction managers, and regulators identify which organizational aspects should be improved to prevent risks on sites and, in turn, accidents. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  18. Genotype-environment interaction and sociology: contributions and complexities.

    PubMed

    Seabrook, Jamie A; Avison, William R

    2010-05-01

    Genotype-environment interaction (G x E) refers to situations in which genetic effects connected to a phenotype are dependent upon variability in the environment, or when genes modify an organism's sensitivity to particular environmental features. Using a typology suggested in the G x E literature, we provide an overview of recent papers that show how social context can trigger a genetic vulnerability, compensate for a genetic vulnerability, control behaviors for which a genetic vulnerability exists, and improve adaptation via proximal causes. We argue that to improve their understanding of social structure, sociologists can take advantage of research in behavior genetics by assessing the impact of within-group variance of various health outcomes and complex human behaviors that are explainable by genotype, environment and their interaction. Insights from life course sociology can aid in ensuring that the dynamic nature of the environment in G x E has been accounted for. Identification of an appropriate entry point for sociologists interested in G x E research could begin with the choice of an environmental feature of interest, a genetic factor of interest, and/or behavior of interest. Optimizing measurement in order to capture the complexity of G x E is critical. Examining the interaction between poorly measured environmental factors and well measured genetic variables will overestimate the effects of genetic variables while underestimating the effect of environmental influences, thereby distorting the interaction between genotype and environment. Although the expense of collecting environmental data is very high, reliable and precise measurement of an environmental pathogen enhances a study's statistical power. Copyright 2010 Elsevier Ltd. All rights reserved.

  19. Stop Trying to Make Kids "Ready" for Kindergarten

    ERIC Educational Resources Information Center

    Pretti-Frontczak, Kristie

    2014-01-01

    The author of this article asks: What is readiness for kindergarten? How do we know when a child is ready? Unfortunately, as with many topics in education reform and improvement, policy makers ignore the complex questions about readiness and instead focus narrowly on select variables. The focus for kindergarten readiness is on select literacy and…

  20. Amazon forest structure generates diurnal and seasonal variability in light utilization

    Treesearch

    Douglas C. Morton; Jeremy Rubio; Bruce D. Cook; Jean-Philippe Gastellu-Etchegorry; Marcos Longo; Hyeungu Choi; Maria Hunter; Michael Keller

    2016-01-01

    The complex three-dimensional (3-D) structure of tropical forests generates a diversity of light environments for canopy and understory trees. Understanding diurnal and seasonal changes in light availability is critical for interpreting measurements of net ecosystem exchange and improving ecosystem models. Here, we used the Discrete Anisotropic Radiative Transfer (DART...

  1. Seeing the forests and the trees—innovative approaches to exploring heterogeneity in systematic reviews of complex interventions to enhance health system decision-making: a protocol

    PubMed Central

    2014-01-01

    Background To improve quality of care and patient outcomes, health system decision-makers need to identify and implement effective interventions. An increasing number of systematic reviews document the effects of quality improvement programs to assist decision-makers in developing new initiatives. However, limitations in the reporting of primary studies and current meta-analysis methods (including approaches for exploring heterogeneity) reduce the utility of existing syntheses for health system decision-makers. This study will explore the role of innovative meta-analysis approaches and the added value of enriched and updated data for increasing the utility of systematic reviews of complex interventions. Methods/Design We will use the dataset from our recent systematic review of 142 randomized trials of diabetes quality improvement programs to evaluate novel approaches for exploring heterogeneity. These will include exploratory methods, such as multivariate meta-regression analyses and all-subsets combinatorial meta-analysis. We will then update our systematic review to include new trials and enrich the dataset by surveying authors of all included trials. In doing so, we will explore the impact of variables not reported in previous publications, such as details of study context, on the effectiveness of the intervention. We will use innovative analytical methods on the enriched and updated dataset to identify key success factors in the implementation of quality improvement interventions for diabetes. Decision-makers will be involved throughout to help identify and prioritize variables to be explored and to aid in the interpretation and dissemination of results. Discussion This study will inform future systematic reviews of complex interventions and describe the value of enriching and updating data for exploring heterogeneity in meta-analysis. It will also result in an updated comprehensive systematic review of diabetes quality improvement interventions that will be useful to health system decision-makers in developing interventions to improve outcomes for people with diabetes. Systematic review registration PROSPERO registration no. CRD42013005165 PMID:25115289

  2. Characterization of local complex structures in a recurrence plot to improve nonlinear dynamic discriminant analysis.

    PubMed

    Ding, Hang

    2014-01-01

    Structures in recurrence plots (RPs), preserving the rich information of nonlinear invariants and trajectory characteristics, have been increasingly analyzed in dynamic discrimination studies. The conventional analysis of RPs is mainly focused on quantifying the overall diagonal and vertical line structures through a method, called recurrence quantification analysis (RQA). This study extensively explores the information in RPs by quantifying local complex RP structures. To do this, an approach was developed to analyze the combination of three major RQA variables: determinism, laminarity, and recurrence rate (DLR) in a metawindow moving over a RP. It was then evaluated in two experiments discriminating (1) ideal nonlinear dynamic series emulated from the Lorenz system with different control parameters and (2) data sets of human heart rate regulations with normal sinus rhythms (n = 18) and congestive heart failure (n = 29). Finally, the DLR was compared with seven major RQA variables in terms of discriminatory power, measured by standardized mean difference (DSMD). In the two experiments, DLR resulted in the highest discriminatory power with DSMD = 2.53 and 0.98, respectively, which were 7.41 and 2.09 times the best performance from RQA. The study also revealed that the optimal RP structures for the discriminations were neither typical diagonal structures nor vertical structures. These findings indicate that local complex RP structures contain some rich information unexploited by RQA. Therefore, future research to extensively analyze complex RP structures would potentially improve the effectiveness of the RP analysis in dynamic discrimination studies.
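
    The DLR metawindow computation is not specified in the record; below is only a baseline sketch of the underlying RQA quantities it builds on (recurrence rate and determinism), computed from scratch on toy series. The threshold choice and the no-embedding simplification are assumptions.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Thresholded recurrence plot of a scalar series (no embedding)."""
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps).astype(int)

def rqa_basic(R, lmin=2):
    """Recurrence rate (RR) and determinism (DET) from a recurrence plot.

    DET = fraction of off-diagonal recurrence points lying on diagonal
    lines of length >= lmin; a crude loop-based estimate for illustration.
    """
    n = R.shape[0]
    rr = R.mean()
    diag_points = 0
    for k in range(1, n):                       # diagonals above the main one
        run = 0
        for v in np.append(np.diag(R, k), 0):   # sentinel ends the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    det = 2 * diag_points / max(R.sum() - n, 1)  # exclude main diagonal
    return rr, det

rng = np.random.default_rng(6)
x = np.empty(500); x[0] = 0.4
for i in range(1, 500):                          # logistic map, chaotic
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])
noise = rng.uniform(0, 1, 500)
for name, s in [("logistic", x), ("noise   ", noise)]:
    rr, det = rqa_basic(recurrence_matrix(s, eps=0.1 * s.std()))
    print(f"{name}: RR = {rr:.3f}, DET = {det:.2f}")
```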

  3. Myocardial infarction in the elderly.

    PubMed

    Carro, Amelia; Kaski, Juan Carlos

    2011-04-01

    Advances in pharmacological treatment and effective early myocardial revascularization have, in recent years, led to improved clinical outcomes in patients with acute myocardial infarction (AMI). However, it has been suggested that compared to younger subjects, elderly AMI patients are less likely to receive evidence-based treatment, including myocardial revascularization therapy. Several reasons have been postulated to explain this trend, including uncertainty regarding the true benefits of the interventions commonly used in this setting as well as increased risk mainly associated with comorbidities. The diagnosis, management, and post-hospitalization care of elderly patients presenting with an acute coronary syndrome pose many difficulties at present. A complex interplay of variables such as comorbidities, functional and socioeconomic status, side effects associated with multiple drug administration, and individual biologic variability, all contribute to creating a complex clinical scenario. In this complex setting, clinicians are often required to extrapolate evidence-based results obtained in cardiovascular trials from which older patients are often, implicitly or explicitly, excluded. This article reviews current recommendations regarding management of AMI in the elderly.

  5. Design and performance of optimal detectors for guided wave structural health monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dib, G.; Udpa, L.

    2016-01-01

    Ultrasonic guided wave measurements in a long term structural health monitoring system are affected by measurement noise, environmental conditions, transducer aging and malfunction. This results in measurement variability which affects detection performance, especially in complex structures where baseline data comparison is required. This paper derives the optimal detector structure, within the framework of detection theory, where a guided wave signal at the sensor is represented by a single feature value that can be used for comparison with a threshold. Three different types of detectors are derived depending on the underlying structure's complexity: (i) simple structures where defect reflections can be identified without the need for baseline data; (ii) simple structures that require baseline data due to overlap of defect scatter with scatter from structural features; (iii) complex structures with dense structural features that require baseline data. The detectors are derived by modeling the effects of variabilities and uncertainties as random processes. Analytical solutions for the performance of the detectors in terms of the probability of detection and false alarm are derived. A finite element model is used to generate guided wave signals, and the performance results of a Monte-Carlo simulation are compared with the theoretical performance. Initial results demonstrate that the problems of signal complexity and environmental variability can in fact be exploited to improve detection performance.

  6. The role of phospholipid as a solubility- and permeability-enhancing excipient for the improved delivery of the bioactive phytoconstituents of Bacopa monnieri.

    PubMed

    Saoji, Suprit D; Dave, Vivek S; Dhore, Pradip W; Bobde, Yamini S; Mack, Connor; Gupta, Deepak; Raut, Nishikant A

    2017-10-15

    In an attempt to improve the solubility and permeability of Standardized Bacopa Extract (SBE), a complexation approach based on phospholipid was employed. A solvent evaporation method was used to prepare the SBE-phospholipid complex (Bacopa Naturosome, BN). The formulation and process variables were optimized using a central-composite design. The formation of BN was confirmed by photomicroscopy, Scanning Electron Microscopy (SEM), Fourier Transform Infrared Spectroscopy (FTIR), Differential Scanning Calorimetry (DSC), and Powder X-ray Diffraction (PXRD). The saturation solubility, the in-vitro dissolution, and the ex-vivo permeability studies were used for the functional evaluation of the prepared complex. BN exhibited a significantly higher aqueous solubility compared to the pure SBE (20-fold), or the physical mixture of SBE and the phospholipid (13-fold). Similarly, the in-vitro dissolution revealed a significantly higher efficiency of the prepared complex (BN) in releasing the SBE (>97%) in comparison to the pure SBE (~42%), or the physical mixture (~47%). The ex-vivo permeation studies showed that the prepared BN significantly improved the permeation of SBE (>90%), compared to the pure SBE (~21%), or the physical mixture (~24%). Drug-phospholipid complexation may thus be a promising strategy for solubility enhancement of bioactive phytoconstituents. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education by rendering variables in ways that enhance human perception and cognition of complex curriculum data. The positive results from our small-scale evaluation of a medical curriculum suggest the need to expand this method to an entire medical curriculum. Because our approach keeps complexity low, it opens a promising new direction in medical education informatics research.

  8. Etiological Beliefs, Treatments, Stigmatizing Attitudes toward Schizophrenia. What Do Italians and Israelis Think?

    PubMed

    Mannarini, Stefania; Boffo, Marilisa; Rossi, Alessandro; Balottin, Laura

    2017-01-01

    Background: Although scientific research on the etiology of mental disorders has improved the knowledge of biogenetic and psychosocial aspects related to the onset of mental illness, stigmatizing attitudes and behaviors are still very prevalent and pose a significant social problem. Aim: The aim of this study was to deepen the knowledge of how attitudes toward people with mental illness are affected by specific personal beliefs and characteristics, such as the culture and religion of the perceiver. More precisely, the main purpose was to define a structure of variables, namely perceived dangerousness, social closeness, and avoidance of the ill person, together with beliefs about the best treatment to be undertaken and the sick person's gender, capable of describing the complexity of the stigma construct, particularly with respect to schizophrenia. Method: The study involved 305 university students, 183 from the University of Padua, Italy, and 122 from the University of Haifa, Israel. For the analyses, a latent class analysis (LCA) approach was chosen to identify a latent categorical structure accounting for the covariance between the observed variables. This latent structure was expected to be moderated by cultural background (Italy versus Israel) and religious beliefs, whereas causal beliefs, recommended treatment, dangerousness, social closeness, and public avoidance were the manifest variables, namely the observed indicators of the latent variable. Results: Two sets of results were obtained. First, the relevance of the manifest variables as indicators of the hypothesized latent variable was highlighted. Second, a two-latent-class categorical dimension represented by prejudicial attitudes, causal beliefs, and treatments concerning schizophrenia was found. Specifically, the differential effects of the two cultures and of religious beliefs on the latent structure and their relations highlighted the relevance of the observed variables as indicators of the expected latent variable. Conclusion: The present study contributes to an improved understanding of how attitudes toward people with mental illness are affected by specific personal beliefs and characteristics of the perceiver. The definition of a structure of variables capable of describing the complexity of the stigma construct, particularly with respect to schizophrenia, was achieved from a cross-cultural perspective.

  9. Water Quality Variable Estimation using Partial Least Squares Regression and Multi-Scale Remote Sensing.

    NASA Astrophysics Data System (ADS)

    Peterson, K. T.; Wulamu, A.

    2017-12-01

    Water, essential to all living organisms, is one of the Earth's most precious resources. Remote sensing offers an ideal approach to monitoring water quality compared with traditional in-situ techniques, which are highly time- and resource-consuming. Using a multi-scale approach, data from handheld spectroscopy, UAS-based hyperspectral imaging, and satellite multispectral images were collected in coordination with in-situ water quality samples for two midwestern watersheds. The remote sensing data were modeled and correlated with the in-situ water quality variables, including chlorophyll content (Chl), turbidity, and total dissolved solids (TDS), using Normalized Difference Spectral Indices (NDSI) and Partial Least Squares Regression (PLSR). The results supported the original hypothesis that correlating water quality variables with remotely sensed data benefits greatly from more complex modeling and regression techniques such as PLSR. The PLSR analysis produced much higher R2 values for all variables than NDSI. The combination of NDSI and PLSR analysis also identified key wavelengths for identification that aligned with previous studies' findings. This research demonstrates the advantages of, and the future for, complex modeling and machine learning techniques in improving water quality variable estimation from spectral data.
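
    The NDSI-versus-PLSR comparison can be sketched with scikit-learn on synthetic spectra; the toy reflectance model, band choices, and chlorophyll effect below are invented for illustration, not the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_bands = 60, 120
wavelengths = np.linspace(400, 900, n_bands)           # nm, illustrative
# Synthetic reflectance spectra whose shape depends on chlorophyll (toy model).
chl = rng.uniform(2, 60, n_samples)                    # ug/L
peak = np.exp(-((wavelengths - 700) / 40) ** 2)[None, :]
spectra = (0.05 + 0.3 * peak * (chl[:, None] / 60)
           + rng.normal(0, 0.005, (n_samples, n_bands)))

# NDSI with one fixed band pair vs. PLSR over the full spectrum.
b1 = np.argmin(np.abs(wavelengths - 700))
b2 = np.argmin(np.abs(wavelengths - 670))
ndsi = (spectra[:, b1] - spectra[:, b2]) / (spectra[:, b1] + spectra[:, b2])
r2_ndsi = np.corrcoef(ndsi, chl)[0, 1] ** 2
r2_plsr = cross_val_score(PLSRegression(n_components=5), spectra, chl,
                          cv=5, scoring="r2").mean()
print(f"NDSI R^2 = {r2_ndsi:.2f}, PLSR cross-validated R^2 = {r2_plsr:.2f}")
```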

  10. Refining Collective Coordinates and Improving Free Energy Representation in Variational Enhanced Sampling.

    PubMed

    Yang, Yi Isaac; Parrinello, Michele

    2018-06-12

    Collective variables are used often in many enhanced sampling methods, and their choice is a crucial factor in determining sampling efficiency. However, at times, searching for good collective variables can be challenging. In a recent paper, we combined time-lagged independent component analysis with well-tempered metadynamics in order to obtain improved collective variables from metadynamics runs that use lower quality collective variables [ McCarty, J.; Parrinello, M. J. Chem. Phys. 2017 , 147 , 204109 ]. In this work, we extend these ideas to variationally enhanced sampling. This leads to an efficient scheme that is able to make use of the many advantages of the variational scheme. We apply the method to alanine-3 in water. From an alanine-3 variationally enhanced sampling trajectory in which all the six dihedral angles are biased, we extract much better collective variables able to describe in exquisite detail the protein complex free energy surface in a low dimensional representation. The success of this investigation is helped by a more accurate way of calculating the correlation functions needed in the time-lagged independent component analysis and from the introduction of a new basis set to describe the dihedral angles arrangement.
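
    The variational machinery itself requires a dedicated enhanced-sampling engine and is beyond a snippet, but the time-lagged independent component analysis step that extracts the improved collective variables can be sketched. Below is a generic TICA implementation on toy data; the symmetrization choice and the synthetic "dihedral" features are assumptions, not the paper's protocol.

```python
import numpy as np
from scipy.linalg import eigh

def tica(X, lag=10):
    """Time-lagged independent component analysis on trajectory X.

    Solves C(tau) v = lambda C(0) v; eigenvectors with the largest
    eigenvalues are the slowest collective coordinates.
    """
    X = X - X.mean(axis=0)
    C0 = X.T @ X / len(X)
    Ct = X[:-lag].T @ X[lag:] / (len(X) - lag)
    Ct = 0.5 * (Ct + Ct.T)           # symmetrized lagged covariance
    evals, evecs = eigh(Ct, C0)      # generalized eigenproblem
    order = np.argsort(evals)[::-1]
    return evals[order], evecs[:, order]

rng = np.random.default_rng(8)
# Toy trajectory: 6 "dihedral" features carrying one slow mode plus fast noise.
slow = np.cumsum(rng.normal(size=5000)) * 0.02
X = np.outer(slow, rng.normal(size=6)) + rng.normal(0, 1.0, (5000, 6))
evals, evecs = tica(X)
print("TICA eigenvalues (slowness):", np.round(evals[:3], 3))
cv = X @ evecs[:, 0]   # the improved collective variable for further biasing
```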

  11. Variable camber rotor study

    NASA Technical Reports Server (NTRS)

    Dadone, L.; Cowan, J.; Mchugh, F. J.

    1982-01-01

    Deployment of variable camber concepts on helicopter rotors was analytically assessed. It was determined that variable camber extended the operating range of helicopters provided that the correct compromise can be obtained between performance/loads gains and mechanical complexity. A number of variable camber concepts were reviewed on a two dimensional basis to determine the usefulness of leading edge, trailing edge and overall camber variation schemes. The most powerful method to vary camber was through the trailing edge flaps undergoing relatively small motions (-5 deg to +15 deg). The aerodynamic characteristics of the NASA/Ames A-1 airfoil with 35% and 50% plain trailing edge flaps were determined by means of current subcritical and transonic airfoil design methods and used by rotor performance and loads analysis codes. The most promising variable camber schedule reviewed was a configuration with a 35% plain flap deployment in an on/off mode near the tip of a blade. Preliminary results show approximately 11% reduction in power is possible at 192 knots and a rotor thrust coefficient of 0.09. The potential demonstrated indicates a significant potential for expanding the operating envelope of the helicopter. Further investigation into improving the power saving and defining the improvement in the operational envelope of the helicopter is recommended.

  12. Advanced multivariable control of a turboexpander plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Altena, D.; Howard, M.; Bullin, K.

    1998-12-31

    This paper describes an application of advanced multivariable control on a natural gas plant and compares its performance to the previous conventional feed-back control. This control algorithm utilizes simple models from existing plant data and/or plant tests to hold the process at the desired operating point in the presence of disturbances and changes in operating conditions. The control software is able to accomplish this due to effective handling of process variable interaction, constraint avoidance and feed-forward of measured disturbances. The economic benefit of improved control lies in operating closer to the process constraints while avoiding significant violations. The South Texas facility where this controller was implemented experienced reduced variability in process conditions, which increased liquids recovery because the plant was able to operate much closer to the customer-specified impurity constraint. An additional benefit of this implementation of multivariable control is the ability to set performance criteria beyond simple setpoints, including process variable constraints, relative variable merit and optimizing use of manipulated variables. The paper also details the control scheme applied to the complex turboexpander process and some of the safety features included to improve reliability.

  13. An improved switching converter model using discrete and average techniques

    NASA Technical Reports Server (NTRS)

    Shortt, D. J.; Lee, F. C.

    1982-01-01

    The nonlinear modeling and analysis of dc-dc converters has been done by averaging and discrete-sampling techniques. The averaging technique is simple, but inaccurate as the modulation frequencies approach the theoretical limit of one-half the switching frequency. The discrete technique is accurate even at high frequencies, but is very complex and cumbersome. An improved model is developed by combining the aforementioned techniques. This new model is easy to implement in circuit and state variable forms and is accurate to the theoretical limit.
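
    The record names the techniques but gives no equations; as background, here is a hedged sketch of the state-space averaging step for an ideal buck converter (illustrative component values), the kind of averaged model whose accuracy degrades as modulation frequencies approach half the switching frequency.

```python
import numpy as np

# State-space averaging sketch for an ideal buck converter (L-C filter with
# load R). x = [inductor current, capacitor voltage]; values are illustrative.
L, C, R, Vin = 100e-6, 100e-6, 5.0, 12.0

A = np.array([[0.0, -1.0 / L],
              [1.0 / C, -1.0 / (R * C)]])   # same topology on/off for a buck
B_on = np.array([Vin / L, 0.0])             # switch closed
B_off = np.array([0.0, 0.0])                # switch open

def averaged_steady_state(duty):
    # Averaged model: x' = A x + (d*B_on + (1-d)*B_off); solve x' = 0.
    B_avg = duty * B_on + (1 - duty) * B_off
    return np.linalg.solve(-A, B_avg)

i_L, v_out = averaged_steady_state(duty=0.5)
print(f"averaged model: Vout ~ {v_out:.2f} V (ideal buck: d*Vin = 6.00 V)")
# The averaged model is accurate only well below half the switching
# frequency; the paper combines it with discrete sampling to remain
# accurate up to that theoretical limit.
```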

  14. Solving the Inverse-Square Problem with Complex Variables

    ERIC Educational Resources Information Center

    Gauthier, N.

    2005-01-01

    The equation of motion for a mass that moves under the influence of a central, inverse-square force is formulated and solved as a problem in complex variables. To find the solution, the constancy of angular momentum is first established using complex variables. Next, the complex position coordinate and complex velocity of the particle are assumed…
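
    The record only sketches the approach; in standard notation (our rendering, not the article's text), the complex-variable formulation is:

```latex
% Position z = x + iy; central inverse-square attraction of strength k:
\[
m\,\ddot{z} \;=\; -\,\frac{k\,z}{\lvert z\rvert^{3}},
\qquad
L \;=\; m\,\operatorname{Im}\!\left(\bar{z}\,\dot{z}\right)
      \;=\; m\,r^{2}\dot{\theta} \;=\; \text{const},
\]
% where constancy of L follows from multiplying the equation of motion
% by \bar{z} and taking the imaginary part.
```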

  15. Undergraduate Student Task Group Approach to Complex Problem Solving Employing Computer Programming.

    ERIC Educational Resources Information Center

    Brooks, LeRoy D.

    A project formulated a computer simulation game for use as an instructional device to improve financial decision making. The author constructed a hypothetical firm, specifying its environment, variables, and a maximization problem. Students, assisted by a professor and computer consultants and having access to B5500 and B6700 facilities, held 16…

  16. Improving the Quality of Home Health Care for Children With Medical Complexity.

    PubMed

    Nageswaran, Savithri; Golden, Shannon L

    2017-08-01

    The objectives of this study are to describe the quality of home health care services for children with medical complexity, identify barriers to delivering optimal home health care, and discuss potential solutions to improve home health care delivery. In this qualitative study, we conducted 20 semistructured in-depth interviews with primary caregivers of children with medical complexity, and 4 focus groups with 18 home health nurses. During an iterative analysis process, we identified themes related to quality of home health care. There is substantial variability between home health nurses in the delivery of home health care to children. Lack of skills in nurses is common and has serious negative health consequences for children with medical complexity, including hospitalizations, emergency room visits, and need for medical procedures. Inadequate home health care also contributes to caregiver burden. A major barrier to delivering optimal home health care is the lack of training of home health nurses in pediatric care and technology use. Potential solutions for improving care include home health agencies training nurses in the care of children with medical complexity, support for nurses in clinical problem solving, and reimbursement for training nurses in pediatric home care. Caregiver-level interventions includes preparation of caregivers about: providing medical care for their children at home and addressing problems with home health care services. There are problems in the quality of home health care delivered to children with medical complexity. Training nurses in the care of children with medical complexity and preparing caregivers about home care could improve home health care quality. Copyright © 2017 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  17. Stability of uncertain impulsive complex-variable chaotic systems with time-varying delays.

    PubMed

    Zheng, Song

    2015-09-01

    In this paper, the robust exponential stabilization of uncertain impulsive complex-variable chaotic delayed systems is considered with parameters perturbation and delayed impulses. It is assumed that the considered complex-variable chaotic systems have bounded parametric uncertainties together with the state variables on the impulses related to the time-varying delays. Based on the theories of adaptive control and impulsive control, some less conservative and easily verified stability criteria are established for a class of complex-variable chaotic delayed systems with delayed impulses. Some numerical simulations are given to validate the effectiveness of the proposed criteria of impulsive stabilization for uncertain complex-variable chaotic delayed systems. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Double symbolic joint entropy in nonlinear dynamic complexity analysis

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-07-01

    Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods (the symbolic transformations of Wessel's symbolic entropy and of base-scale entropy) and two local dynamic ones (the symbolizations of permutation entropy and of differential entropy) constitute four double symbolic joint entropies, all of which detect complexity accurately in chaotic models (logistic and Henon map series). In nonlinear dynamical analysis of different kinds of heart rate variability, the heartbeats of healthy young subjects have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by double symbolic joint entropy, among which the combination of base-scale and differential symbolizations gives the best complexity analysis. Test results show that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
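
    As a hedged illustration of combining a global static symbolization with a local dynamic one via joint entropy, here is a sketch with generic stand-ins (quantile binning for the static scheme, ordinal patterns for the dynamic one; not the paper's exact base-scale or differential schemes):

```python
import numpy as np
from itertools import permutations

def permutation_symbols(x, m=3):
    """Local dynamic symbolization: ordinal pattern of each m-window."""
    pats = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([pats[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def static_symbols(x, m=3, bins=4):
    """Global static symbolization: quantile bin of each point, aligned
    to the same windows as the permutation symbols."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(x, edges)[:len(x) - m + 1]

def joint_entropy(a, b):
    """Shannon entropy (bits) of the joint symbol distribution."""
    joint = a * (b.max() + 1) + b                 # pair -> single code
    p = np.bincount(joint).astype(float)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

x = np.empty(2000); x[0] = 0.4
for i in range(1, 2000):                          # logistic map, chaotic
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])
noise = np.random.default_rng(9).uniform(0, 1, 2000)
for name, s in [("logistic", x), ("noise   ", noise)]:
    print(name, round(joint_entropy(static_symbols(s),
                                    permutation_symbols(s)), 2))
```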

  19. Hsp70 enhances presentation of FMDV antigen to bovine CD4+ T cells in vitro

    PubMed Central

    McLaughlin, Kerry; Seago, Julian; Robinson, Lucy; Kelly, Charles; Charleston, Bryan

    2010-01-01

    Foot-and-mouth disease virus (FMDV) is the causative agent of a highly contagious acute vesicular disease affecting cloven-hoofed animals, including cattle, sheep and pigs. The current vaccine induces a rapid humoral response, but the duration of the protective antibody response is variable, possibly associated with a variable specific CD4+ T cell response. We investigated the use of heat shock protein 70 (Hsp70) as a molecular chaperone to target viral antigen to the Major Histocompatibility Complex (MHC) class II pathway of antigen presenting cells and generate enhanced MHC II-restricted CD4+ T cell responses in cattle. Monocytes and CD4+ T cells from FMDV vaccinated cattle were stimulated in vitro with complexes of Hsp70 and FMDV peptide, or peptide alone. Hsp70 was found to consistently improve the presentation of a 25-mer FMDV peptide to CD4+ T cells, as measured by T cell proliferation. Complex formation was required for the enhanced effects and Hsp70 alone did not stimulate proliferation. This study provides further evidence that Hsp70:peptide complexes can enhance antigen-specific CD4+ T cell responses in vitro for an important pathogen of livestock. PMID:20167197

  20. Exploratory Spectroscopy of Magnetic Cataclysmic Variables Candidates and Other Variable Objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira, A. S.; Palhares, M. S.; Rodrigues, C. V.

    2017-04-01

    The increasing number of synoptic surveys made by small robotic telescopes, such as the photometric Catalina Real-Time Transient Survey (CRTS), provides a unique opportunity to discover variable sources and improve the statistical samples of such classes of objects. Our goal is the discovery of magnetic Cataclysmic Variables (mCVs). These are rare objects that probe interesting accretion scenarios controlled by the white-dwarf magnetic field. In particular, improved statistics of mCVs would help to address open questions on their formation and evolution. We performed an optical spectroscopy survey to search for signatures of magnetic accretion in 45 variable objects selected mostly from the CRTS. In this sample, we found 32 CVs, 22 being mCV candidates, 13 of which were previously unreported as such. If the proposed classifications are confirmed, it would represent an increase of 4% in the number of known polars and 12% in the number of known IPs. A fraction of our initial sample was classified as extragalactic sources or other types of variable stars by inspection of the identification spectra. Despite the inherent complexity in identifying a source as an mCV, variability-based selection, followed by spectroscopic snapshot observations, has proved to be an efficient strategy for their discovery, being a relatively inexpensive approach in terms of telescope time.

  1. Towards Improved Forecasts of Atmospheric and Oceanic Circulations over the Complex Terrain of the Eastern Mediterranean

    NASA Technical Reports Server (NTRS)

    Chronis, Themis; Case, Jonathan L.; Papadopoulos, Anastasios; Anagnostou, Emmanouil N.; Mecikalski, John R.; Haines, Stephanie L.

    2008-01-01

    Forecasting atmospheric and oceanic circulations accurately over the Eastern Mediterranean has proved to be an exceptional challenge. Fine-scale topographic variability (land/sea coverage) and seasonal dynamics can create strong spatial gradients in temperature, wind, and other state variables, which numerical models may have difficulty capturing. The Hellenic Center for Marine Research (HCMR) is one of the main operational centers for wave forecasting in the eastern Mediterranean. Currently, HCMR's operational numerical weather/ocean prediction model is based on the coupled Eta/Princeton Ocean Model (POM). Since 1999, HCMR has also operated the POSEIDON floating buoys as a means of state-of-the-art, real-time observation of several oceanic and surface atmospheric variables. This study presents a first assessment of improving both atmospheric and oceanic prediction by initializing a regional Numerical Weather Prediction (NWP) model with high-resolution sea surface temperatures (SSTs) from remote sensing platforms in order to capture these small-scale characteristics.

  2. Cost drivers and resource allocation in military health care systems.

    PubMed

    Fulton, Larry; Lasdon, Leon S; McDaniel, Reuben R

    2007-03-01

    This study illustrates the feasibility of incorporating technical efficiency considerations in the funding of military hospitals and identifies the primary drivers for hospital costs. Secondary data collected for 24 U.S.-based Army hospitals and medical centers for the years 2001 to 2003 are the basis for this analysis. Technical efficiency was measured by using data envelopment analysis; subsequently, efficiency estimates were included in logarithmic-linear cost models that specified cost as a function of volume, complexity, efficiency, time, and facility type. These logarithmic-linear models were compared against stochastic frontier analysis models. A parsimonious, three-variable, logarithmic-linear model composed of volume, complexity, and efficiency variables exhibited a strong linear relationship with observed costs (R(2) = 0.98). This model also proved reliable in forecasting (R(2) = 0.96). Based on our analysis, as much as $120 million might be reallocated to improve the United States-based Army hospital performance evaluated in this study.
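
    The record describes the model form but not the data; the sketch below fits the three-variable logarithmic-linear specification (cost as a function of volume, complexity, and a DEA-style efficiency score) to synthetic hospital-year records with invented elasticities.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 72  # e.g., 24 hospitals x 3 years, synthetic
volume = rng.lognormal(8, 0.5, n)          # workload
complexity = rng.lognormal(0, 0.2, n)      # case-mix index
efficiency = rng.uniform(0.6, 1.0, n)      # DEA score in (0, 1]
cost = (500 * volume**0.9 * complexity**1.2 * efficiency**-0.8
        * rng.lognormal(0, 0.05, n))       # toy data-generating process

# Log-linear model: log(cost) = b0 + b1 log(volume) + b2 log(cmx) + b3 log(eff)
X = np.column_stack([np.ones(n), np.log(volume),
                     np.log(complexity), np.log(efficiency)])
beta, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
resid = np.log(cost) - X @ beta
r2 = 1 - np.var(resid) / np.var(np.log(cost))
print("elasticities (volume, complexity, efficiency):", beta[1:].round(2))
print("R^2 =", round(r2, 3))
```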

  3. Envisioning, quantifying, and managing thermal regimes on river networks

    USGS Publications Warehouse

    Steel, E. Ashley; Beechie, Timothy J.; Torgersen, Christian E.; Fullerton, Aimee H.

    2017-01-01

    Water temperatures fluctuate in time and space, creating diverse thermal regimes on river networks. Temporal variability in these thermal landscapes has important biological and ecological consequences because of nonlinearities in physiological reactions; spatial diversity in thermal landscapes provides aquatic organisms with options to maximize growth and survival. However, human activities and climate change threaten to alter the dynamics of riverine thermal regimes. New data and tools can identify particular facets of the thermal landscape that describe ecological and management concerns and that are linked to human actions. The emerging complexity of thermal landscapes demands innovations in communication, opens the door to exciting research opportunities on the human impacts to and biological consequences of thermal variability, suggests improvements in monitoring programs to better capture empirical patterns, provides a framework for suites of actions to restore and protect the natural processes that drive thermal complexity, and indicates opportunities for better managing thermal landscapes.

  4. VLSI implementation of a new LMS-based algorithm for noise removal in ECG signal

    NASA Astrophysics Data System (ADS)

    Satheeskumaran, S.; Sabrigiriraj, M.

    2016-06-01

    Least mean square (LMS)-based adaptive filters are widely deployed for removing artefacts in the electrocardiogram (ECG) because of their low computational cost. However, they possess a high mean square error (MSE) in noisy environments. The transform domain variable step-size LMS algorithm reduces the MSE at the cost of computational complexity. In this paper, a variable step-size delayed LMS adaptive filter is used to remove the artefacts from the ECG signal for improved feature extraction. Dedicated digital signal processors provide fast processing, but they are not flexible. By using field programmable gate arrays, pipelined architectures can be used to enhance system performance. The pipelined architecture can enhance the operating efficiency of the adaptive filter and save power consumption. This technique provides a high signal-to-noise ratio and low MSE with reduced computational complexity; hence, it is a useful method for monitoring patients with heart-related problems.
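
    As an illustration of the underlying principle, a minimal variable step-size LMS noise canceller is sketched below in Python. It uses a Kwong-Johnston-style step-size recursion on generic inputs; it is not the paper's delayed-LMS VLSI architecture, and all parameter values are illustrative.

    ```python
    import numpy as np

    def vs_lms(primary, reference, n_taps=16, mu0=0.01,
               alpha=0.97, gamma=0.01, mu_min=1e-5, mu_max=0.1):
        """Variable step-size LMS noise canceller (illustrative sketch).

        primary   : noisy ECG (signal plus noise correlated with `reference`)
        reference : noise reference input
        Returns the error signal, which approximates the cleaned ECG.
        """
        w = np.zeros(n_taps)
        mu = mu0
        e = np.zeros(len(primary), dtype=float)
        for n in range(n_taps, len(primary)):
            x = reference[n - n_taps:n][::-1]   # tap-delay line, newest first
            y = w @ x                           # adaptive noise estimate
            e[n] = primary[n] - y               # error = cleaned sample
            # step size grows with recent error energy, within safe bounds
            mu = np.clip(alpha * mu + gamma * e[n] ** 2, mu_min, mu_max)
            w += 2.0 * mu * e[n] * x            # LMS weight update
        return e
    ```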

  5. Using Principal Component Analysis to Improve Fallout Characterization

    DTIC Science & Technology

    2017-03-23

    …between actinide location and elemental composition in fallout from historic atmospheric nuclear weapons testing. Fifty spherical fallout samples were… mathematical approach to solving the complex system of elemental variables while establishing correlations to actinide incorporation within the fallout… [Figure 1: the double-hump curve for uranium-235, showing the effective fission yield by mass number for thermal neutrons.]

  6. Coordinated crew performance in commercial aircraft operations

    NASA Technical Reports Server (NTRS)

    Murphy, M. R.

    1977-01-01

    A specific methodology is proposed for an improved system of coding and analyzing crew member interaction. The complexity and lack of precision of many crew and task variables suggest the usefulness of fuzzy linguistic techniques for modeling and computer simulation of the crew performance process. Other research methodologies and concepts that have promise for increasing the effectiveness of research on crew performance are identified.

  7. Shape optimization techniques for musical instrument design

    NASA Astrophysics Data System (ADS)

    Henrique, Luis; Antunes, Jose; Carvalho, Joao S.

    2002-11-01

    The design of musical instruments is still mostly based on empirical knowledge and costly experimentation. One interesting improvement is the shape optimization of resonating components, given a number of constraints (allowed parameter ranges, shape smoothness, etc.), so that vibrations occur at specified modal frequencies. Each admissible geometrical configuration generates an error between computed eigenfrequencies and the target set. Typically, error surfaces present many local minima, corresponding to suboptimal designs. This difficulty can be overcome using global optimization techniques, such as simulated annealing. However, these methods are greedy in terms of the number of function evaluations required. Thus, the computational effort can be unacceptable if complex problems, such as bell optimization, are tackled. Those issues are addressed in this paper, and a method for improving optimization procedures is proposed. Instead of using the local geometric parameters as searched variables, the system geometry is modeled in terms of truncated series of orthogonal space-functions, and optimization is performed on their amplitude coefficients. Fourier series and orthogonal polynomials are typical such functions. This technique reduces considerably the number of searched variables and has the potential for significant computational savings in complex problems. It is illustrated by optimizing the shapes of both current and uncommon marimba bars.
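
    The parameterization idea can be sketched compactly. In the Python toy below, a bar profile is described by a handful of cosine-series amplitudes and a global optimizer searches only those coefficients; the `modal_surrogate` function is an explicitly fake stand-in for the eigenfrequency solver (a finite element modal analysis would be used in a real design loop), and all numbers are arbitrary.

    ```python
    import numpy as np
    from scipy.optimize import dual_annealing

    x = np.linspace(0.0, 1.0, 200)            # normalized bar length
    dx = x[1] - x[0]
    target = np.array([1.0, 2.1, 3.2])        # desired modal values (toy units)

    def profile(coeffs):
        """Shape as a truncated cosine series around a base thickness."""
        h = 0.5 + sum(a * np.cos((k + 1) * np.pi * x)
                      for k, a in enumerate(coeffs))
        return np.clip(h, 0.05, None)         # keep the profile positive

    def modal_surrogate(h):
        """Fake 'modal solver': weighted integrals of the profile, NOT physics."""
        return np.array([10.0 * (h * np.cos(k * np.pi * x)).sum() * dx + k
                         for k in (1, 2, 3)])

    def cost(coeffs):
        return np.sum((modal_surrogate(profile(coeffs)) - target) ** 2)

    # The searched variables are the few series amplitudes, not nodal thicknesses.
    res = dual_annealing(cost, bounds=[(-0.4, 0.4)] * 4, maxiter=200, seed=1)
    print(res.x, res.fun)
    ```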

  8. Incorporating biological information in sparse principal component analysis with application to genomic data.

    PubMed

    Li, Ziyi; Safo, Sandra E; Long, Qi

    2017-07-11

    Sparse principal component analysis (PCA) is a popular tool for dimensionality reduction, pattern recognition, and visualization of high dimensional data. It has been recognized that complex biological mechanisms occur through concerted relationships of multiple genes working in networks that are often represented by graphs. Recent work has shown that incorporating such biological information improves feature selection and prediction performance in regression analysis, but there has been limited work on extending this approach to PCA. In this article, we propose two new sparse PCA methods called Fused and Grouped sparse PCA that enable incorporation of prior biological information in variable selection. Our simulation studies suggest that, compared to existing sparse PCA methods, the proposed methods achieve higher sensitivity and specificity when the graph structure is correctly specified, and are fairly robust to misspecified graph structures. Application to a glioblastoma gene expression dataset identified pathways that are suggested in the literature to be related with glioblastoma. The proposed sparse PCA methods Fused and Grouped sparse PCA can effectively incorporate prior biological information in variable selection, leading to improved feature selection and more interpretable principal component loadings and potentially providing insights on molecular underpinnings of complex diseases.
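
    The proposed Fused and Grouped penalties are not in standard libraries, but the baseline they extend is. As a point of reference, here is a minimal sketch with scikit-learn's generic SparsePCA on synthetic data, where an l1 penalty alone drives loadings to zero; the paper's graph-guided penalties would additionally tie together genes linked in a network.

    ```python
    import numpy as np
    from sklearn.decomposition import SparsePCA

    # Synthetic expression matrix: 100 samples x 50 genes, with the first five
    # genes sharing a common "pathway" signal.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(100, 50))
    X[:, :5] += 2.0 * rng.normal(size=(100, 1))

    spca = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)
    nonzero = (np.abs(spca.components_) > 1e-8).sum(axis=1)
    print("nonzero loadings per component:", nonzero)  # sparsity = selection
    ```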

  9. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.
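
    To make one of the reviewed domains concrete, the sketch below implements sample entropy, a common informational-domain technique, in plain NumPy on synthetic RR intervals. The parameters follow the usual defaults (m = 2, tolerance r = 0.2 SD), and the implementation is illustrative rather than optimized.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.2):
        """Sample entropy: -log of the conditional probability that subsequences
        matching for m points also match for m + 1 (tolerance r = r_frac * SD)."""
        x = np.asarray(x, dtype=float)
        r = r_frac * x.std()

        def match_count(mm):
            w = np.lib.stride_tricks.sliding_window_view(x, mm)
            # Chebyshev distance between every pair of length-mm templates
            d = np.abs(w[:, None, :] - w[None, :, :]).max(axis=2)
            return (d <= r).sum() - len(w)    # exclude self-matches

        return -np.log(match_count(m + 1) / match_count(m))

    rr = np.random.default_rng(4).normal(0.8, 0.05, 300)  # synthetic RR intervals
    print(sample_entropy(rr))  # higher values = less regular, more "complex"
    ```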

  10. Spike-In Normalization of ChIP Data Using DNA-DIG-Antibody Complex.

    PubMed

    Eberle, Andrea B

    2018-01-01

    Chromatin immunoprecipitation (ChIP) is a widely used method to determine the occupancy of specific proteins within the genome, helping to unravel the function and activity of specific genomic regions. In ChIP experiments, normalization of the obtained data by a suitable internal reference is crucial. However, particularly when comparing differently treated samples, such a reference is difficult to identify. Here, a simple method to improve the accuracy and reliability of ChIP experiments with the help of an external reference is described. An artificial molecule, composed of a well-defined digoxigenin (DIG) labeled DNA fragment in complex with an anti-DIG antibody, is synthesized and added to each chromatin sample before immunoprecipitation. During the ChIP procedure, the DNA-DIG-antibody complex undergoes the same treatments as the chromatin and is therefore purified and quantified together with the chromatin of interest. This external reference compensates for variability during the ChIP routine and improves the similarity between replicates, thereby emphasizing the biological differences between samples.

  11. Molecular Species Delimitation in the Racomitrium canescens Complex (Grimmiaceae) and Implications for DNA Barcoding of Species Complexes in Mosses

    PubMed Central

    Stech, Michael; Veldman, Sarina; Larraín, Juan; Muñoz, Jesús; Quandt, Dietmar; Hassel, Kristian; Kruijer, Hans

    2013-01-01

    In bryophytes a morphological species concept is still most commonly employed, but delimitation of closely related species based on morphological characters is often difficult. Here we test morphological species circumscriptions in a species complex of the moss genus Racomitrium, the R. canescens complex, based on variable DNA sequence markers from the plastid (rps4-trnT-trnL region) and nuclear (nrITS) genomes. The extensive morphological variability within the complex has led to different opinions about the number of species and intraspecific taxa to be distinguished. Molecular phylogenetic reconstructions made it possible to clearly distinguish all eight currently recognised species of the complex plus a ninth species that was inferred to belong to the complex in earlier molecular analyses. The taxonomic significance of intraspecific sequence variation is discussed. The present molecular data do not support the division of the R. canescens complex into two groups of species (subsections or sections). Most morphological characters, albeit in part difficult to apply, are reliable for species identification in the R. canescens complex. However, misidentification of collections that were morphologically intermediate between species questioned the suitability of leaf shape as a diagnostic character. Four partitions of the molecular markers (rps4-trnT, trnT-trnL, ITS1, ITS2) that could potentially be used for molecular species identification (DNA barcoding) performed almost equally well concerning amplification and sequencing success. Of these, ITS1 provided the highest species discrimination capacity and should be considered as a DNA barcoding marker for mosses, especially in complexes of closely related species. Molecular species identification should be complemented by redefining morphological characters, to develop a set of easy-to-use molecular and non-molecular identification tools for improving biodiversity assessments and ecological research including mosses. PMID:23341927

  12. Elastic scattering spectroscopy for detection of cancer risk in Barrett's esophagus: experimental and clinical validation of error removal by orthogonal subtraction for increasing accuracy

    NASA Astrophysics Data System (ADS)

    Zhu, Ying; Fearn, Tom; MacKenzie, Gary; Clark, Ben; Dunn, Jason M.; Bigio, Irving J.; Bown, Stephen G.; Lovat, Laurence B.

    2009-07-01

    Elastic scattering spectroscopy (ESS) may be used to detect high-grade dysplasia (HGD) or cancer in Barrett's esophagus (BE). When spectra are measured in vivo by a hand-held optical probe, variability among replicated spectra from the same site can hinder the development of a diagnostic model for cancer risk. An experiment was carried out on excised tissue to investigate how two potential sources of this variability, pressure and angle, influence spectral variability, and the results were compared with the variations observed in spectra collected in vivo from patients with Barrett's esophagus. A statistical method called error removal by orthogonal subtraction (EROS) was applied to model and remove this measurement variability, which accounted for 96.6% of the variation in the spectra, from the in vivo data. Its removal allowed the construction of a diagnostic model with specificity improved from 67% to 82% (with sensitivity fixed at 90%). The improvement was maintained in predictions on an independent in vivo data set. EROS works well as an effective pretreatment for Barrett's in vivo data by identifying measurement variability and ameliorating its effect. The procedure reduces the complexity and increases the accuracy and interpretability of the model for classification and detection of cancer risk in Barrett's esophagus.
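
    A minimal sketch of the EROS idea, reconstructed from the description above: within-site replicate deviations are assumed to carry measurement (not diagnostic) variability, their dominant subspace is found by SVD, and that subspace is projected out of all spectra. The 0.966 default mirrors the 96.6% of replicate variation reported; the function names and the exact subspace-size rule are our own.

    ```python
    import numpy as np

    def eros_fit(replicate_sets, var_fraction=0.966):
        """Estimate the measurement-variability subspace (EROS-style sketch).

        replicate_sets : list of (n_replicates_i, n_wavelengths) arrays holding
                         repeated spectra from the same tissue site.
        Returns V, an orthonormal basis of the nuisance subspace (columns).
        """
        # Within-site deviations carry measurement, not diagnostic, variation.
        diffs = np.vstack([r - r.mean(axis=0) for r in replicate_sets])
        _, s, vt = np.linalg.svd(diffs, full_matrices=False)
        cum = np.cumsum(s ** 2) / np.sum(s ** 2)
        k = int(np.searchsorted(cum, var_fraction)) + 1
        return vt[:k].T

    def eros_apply(X, V):
        """Orthogonally subtract the nuisance subspace from spectra X."""
        return X - (X @ V) @ V.T

    # Usage: V = eros_fit(replicates); X_clean = eros_apply(X_train, V)
    ```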

  13. Predicting radiotherapy outcomes using statistical learning techniques

    NASA Astrophysics Data System (ADS)

    El Naqa, Issam; Bradley, Jeffrey D.; Lindsay, Patricia E.; Hope, Andrew J.; Deasy, Joseph O.

    2009-09-01

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to data not seen before. In this work, several types of linear and nonlinear kernels to generate interaction terms and approximate the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model variables. These models have the capacity to predict on unseen data. Part of this work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
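
    The core comparison, a nonlinear kernel versus a linear classifier under leave-one-out testing, can be reproduced in a few lines of scikit-learn. The sketch below uses synthetic data with a multiplicative (nonlinear) interaction between two hypothetical prognostic variables; it illustrates the methodology, not the paper's clinical datasets.

    ```python
    import numpy as np
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical cohort: rows = patients, columns = prognostic variables; the
    # endpoint depends on a multiplicative (nonlinear) variable interaction.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 5))
    y = (X[:, 0] * X[:, 1] + 0.5 * rng.normal(size=60) > 0).astype(int)

    for kernel in ("linear", "rbf"):
        model = make_pipeline(StandardScaler(), SVC(kernel=kernel, C=1.0))
        acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
        print(f"{kernel:6s} leave-one-out accuracy: {acc:.2f}")
    ```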

  14. An evaluation of human factors research for ultrasonic inservice inspection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pond, D.J.; Donohoo, D.T.; Harris, R.V. Jr.

    1998-03-01

    This work was undertaken to determine if human factors research has yielded information applicable to upgrading requirements in ASME Boiler and Pressure Vessel Code Section XI, improving methods and techniques in Section V, and/or suggesting relevant research. A preference was established for information and recommendations which have become accepted and standard practice. Manual Ultrasonic Testing/Inservice Inspection (UT/ISI) is a complex task subject to influence by dozens of variables. This review frequently revealed equivocal findings regarding effects of environmental variables as well as repeated indications that inspection performance may be more, and more reliably, influenced by the workers' social environment, including managerial practices, than by other situational variables. Also of significance are each inspector's relevant knowledge, skills, and abilities, and determination of these is seen as a necessary first step in upgrading requirements, methods, and techniques as well as in focusing research in support of such programs. While understanding the effects and mediating mechanisms of the variables impacting inspection performance is a worthwhile pursuit for researchers, initial improvements in industrial UT/ISI performance may be achieved by implementing practices already known to mitigate the effects of potentially adverse conditions. 52 refs., 2 tabs.

  16. Why "improved" water sources are not always safe.

    PubMed

    Shaheed, Ameer; Orgill, Jennifer; Montgomery, Maggie A; Jeuland, Marc A; Brown, Joe

    2014-04-01

    Existing and proposed metrics for household drinking-water services are intended to measure the availability, safety and accessibility of water sources. However, these attributes can be highly variable over time and space and this variation complicates the task of creating and implementing simple and scalable metrics. In this paper, we highlight those factors - especially those that relate to so-called improved water sources - that contribute to variability in water safety but may not be generally recognized as important by non-experts. Problems in the provision of water in adequate quantities and of adequate quality - interrelated problems that are often influenced by human behaviour - may contribute to an increased risk of poor health. Such risk may be masked by global water metrics that indicate that we are on the way to meeting the world's drinking-water needs. Given the complexity of the topic and current knowledge gaps, international metrics for access to drinking water should be interpreted with great caution. We need further targeted research on the health impacts associated with improvements in drinking-water supplies.

  16. Introduction of Transplant Registry Unified Management Program 2 (TRUMP2): scripts for TRUMP data analyses, part I (variables other than HLA-related data).

    PubMed

    Atsuta, Yoshiko

    2016-01-01

    Collection and analysis of information on diseases and post-transplant courses of allogeneic hematopoietic stem cell transplant recipients have played important roles in improving therapeutic outcomes in hematopoietic stem cell transplantation. Efficient, high-quality data collection systems are essential. The introduction of the Second-Generation Transplant Registry Unified Management Program (TRUMP2) is intended to improve data quality and enable more efficient data management. The TRUMP2 system will also expand possible uses of data, as it is capable of building a more complex relational database. The construction of an accessible system for adequate data utilization by researchers would promote greater research activity. Study approval and management processes and authorship guidelines also need to be organized within this context. Quality control of processes for data manipulation and analysis will also affect study outcomes. Shared scripts have been introduced to define variables according to standard definitions for quality control and improving efficiency of registry studies using TRUMP data.

  17. Providing Social Support for Underrepresented Racial and Ethnic Minority PhD Students in the Biomedical Sciences: A Career Coaching Model

    ERIC Educational Resources Information Center

    Williams, Simon N.; Thakore, Bhoomi K.; McGee, Richard

    2017-01-01

    Improvement in the proportion of underrepresented racial and ethnic minorities (URMs) in academic positions has been unsatisfactory. Although this is a complex problem, one key issue is that graduate students often rely on research mentors for career-related support, the effectiveness of which can be variable. We present results from a novel…

  18. Changes in the neural control of a complex motor sequence during learning

    PubMed Central

    Otchy, Timothy M.; Goldberg, Jesse H.; Aronov, Dmitriy; Fee, Michale S.

    2011-01-01

    The acquisition of complex motor sequences often proceeds through trial-and-error learning, requiring the deliberate exploration of motor actions and the concomitant evaluation of the resulting performance. Songbirds learn their song in this manner, producing highly variable vocalizations as juveniles. As the song improves, vocal variability is gradually reduced until it is all but eliminated in adult birds. In the present study we examine how the motor program underlying such a complex motor behavior evolves during learning by recording from the robust nucleus of the arcopallium (RA), a motor cortex analog brain region. In young birds, neurons in RA exhibited highly variable firing patterns that throughout development became more precise, sparse, and bursty. We further explored how the developing motor program in RA is shaped by its two main inputs: LMAN, the output nucleus of a basal ganglia-forebrain circuit, and HVC, a premotor nucleus. Pharmacological inactivation of LMAN during singing made the song-aligned firing patterns of RA neurons adultlike in their stereotypy without dramatically affecting the spike statistics or the overall firing patterns. Removing the input from HVC, on the other hand, resulted in a complete loss of stereotypy of both the song and the underlying motor program. Thus our results show that a basal ganglia-forebrain circuit drives motor exploration required for trial-and-error learning by adding variability to the developing motor program. As learning proceeds and the motor circuits mature, the relative contribution of LMAN is reduced, allowing the premotor input from HVC to drive an increasingly stereotyped song. PMID:21543758

  19. The theory and method of variable frequency directional seismic wave under the complex geologic conditions

    NASA Astrophysics Data System (ADS)

    Jiang, T.; Yue, Y.

    2017-12-01

    It is well known that mono-frequency directional seismic wave technology can concentrate seismic waves into a beam. However, little work has been done on the method and effect of variable frequency directional seismic waves under complex geological conditions. We studied variable frequency directional wave theory in several aspects. Firstly, we studied the relation between the directional parameters and the direction of the main beam. Secondly, we analyzed the parameters that significantly affect the width of the main beam, such as vibrator spacing, wavelet dominant frequency, and number of vibrators. In addition, we studied the different characteristics of variable frequency directional seismic waves in typical velocity models. In order to examine the propagation characteristics of directional seismic waves, we designed appropriate parameters according to the character of the directional parameters, which is capable of enhancing the energy in the main beam direction. Directional seismic waves were studied further from the viewpoint of the power spectrum. The results indicate that the energy intensity in the main beam direction increased 2 to 6 times for a multi-ore-body velocity model. This shows that variable frequency directional seismic technology provides an effective way to strengthen target signals under complex geological conditions. For a concave interface model, we introduced complicated directional seismic technology which supports multiple main beams to obtain high quality data. Finally, we applied the 9-element variable frequency directional seismic wave technology to process raw data acquired in an oil-shale exploration area. The results show that the depth of exploration increased 4 times with the directional seismic wave method. Based on the above analysis, we draw the conclusion that variable frequency directional seismic wave technology can improve target signals under different geologic conditions and increase exploration depth at little cost. Owing to the inconvenience of hydraulic vibrators in complicated surface areas, we suggest that the combination of a high frequency portable vibrator and the variable frequency directional seismic wave method is an alternative technology to increase the depth of exploration or prospecting.
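
    The directional principle rests on phased-array delay steering. A minimal sketch, assuming a plane-wave model with idealized point vibrators: delaying the k-th source by k·d·sin(θ)/v makes the wavefronts add in phase along direction θ. Parameter values below are illustrative only.

    ```python
    import numpy as np

    def steering_delays(n_sources, spacing, angle_deg, velocity):
        """Firing delays (s) that steer a linear vibrator array's main beam.

        Delaying source k by k * d * sin(theta) / v aligns the wavefronts
        along direction theta under a plane-wave assumption.
        """
        theta = np.radians(angle_deg)
        return np.arange(n_sources) * spacing * np.sin(theta) / velocity

    # e.g., 9 vibrators 20 m apart, steering 30 degrees in a 3000 m/s medium
    print(steering_delays(9, 20.0, 30.0, 3000.0))
    ```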

  1. Soft-sensing model of temperature for aluminum reduction cell on improved twin support vector regression

    NASA Astrophysics Data System (ADS)

    Li, Tao

    2018-06-01

    The complexity of the aluminum electrolysis process makes the temperature of aluminum reduction cells hard to measure directly. However, temperature is the control center of aluminum production. To solve this problem, combining practice data from an aluminum plant, this paper presents a soft-sensing model of temperature for the aluminum electrolysis process based on Improved Twin Support Vector Regression (ITSVR). ITSVR eliminates the slow learning speed of Support Vector Regression (SVR) and the over-fitting risk of Twin Support Vector Regression (TSVR) by introducing a regularization term into the objective function of TSVR, which ensures the structural risk minimization principle and lower computational complexity. Finally, with some other process parameters as auxiliary variables, the model predicts the temperature by ITSVR. The simulation results show that the ITSVR-based soft-sensing model is less time-consuming and generalizes better.

  2. Optimisation of spray-drying process variables for dry powder inhalation (DPI) formulations of corticosteroid/cyclodextrin inclusion complexes.

    PubMed

    Cabral-Marques, Helena; Almeida, Rita

    2009-09-01

    This study aims to develop and characterise a beclomethasone dipropionate:gamma-cyclodextrin (BDP:gamma-CYD) complex and to optimise the variables of the spray-drying process, in order to obtain a powder with the most suitable characteristics for lung delivery. The spray-dried powder--in a mass ratio of 2:5 (BDP:gamma-CYD)--was physically mixed with three carriers of different particle sizes and in different ratios. Particle-size distribution, shape and morphology, moisture content, and uniformity in BDP content of formulations were studied. In vitro aerosolisation behaviour of the formulations was evaluated using the Rotahaler, and the performance was characterised based on the uniformity of emitted dose and aerodynamic particle-size distribution (respirable fraction (RF), as a percentage of nominal dose (RFN) and emitted dose (RFE)). The most suitable conditions for the preparation of BDP:gamma-CYD complexes were obtained with a solution flow of 5 ml/min, T(in) of 70 degrees C and T(out) of 50 degrees C. Statistically significant differences in the aerodynamic performances were obtained for formulations containing BDP:gamma-CYD complexes prepared using different solution flows and different T(in) (p<0.05). RFN and RFE vary in direct proportion with T(in), while an inverse relationship was observed for the solution flow. A direct correlation between the RFE and the T(out) was identified. Performance of the formulations was compared with an established commercial product (Beclotaide Rotacaps 100 microg) with improved performance of RF: formulations with respitose carrier attained RFN and RFE twofold greater, and formulations based on 63-90 microm fraction lactose and trehalose achieved a threefold improvement; also, all formulations showed that the percentage of the BDP dose deposited in the "oropharynx" compartment was reduced to half.

  3. TopoSCALE v.1.0: downscaling gridded climate data in complex terrain

    NASA Astrophysics Data System (ADS)

    Fiddes, J.; Gruber, S.

    2014-02-01

    Simulation of land surface processes is problematic in heterogeneous terrain due to the high resolution required of model grids to capture strong lateral variability caused by, for example, topography, and due to the lack of accurate meteorological forcing data at the site or scale at which it is required. Gridded data products produced by atmospheric models can fill this gap, however often not at a spatial resolution appropriate to drive land-surface simulations. In this study we describe a method that uses the well-resolved description of the atmospheric column provided by climate models, together with high-resolution digital elevation models (DEMs), to downscale coarse-grid climate variables to a fine-scale subgrid. The main aim of this approach is to provide high-resolution driving data for a land-surface model (LSM). The method makes use of an interpolation of pressure-level data according to the topographic height of the subgrid. An elevation and topography correction is used to downscale short-wave radiation. Long-wave radiation is downscaled by deriving a cloud component of all-sky emissivity at grid level and using downscaled temperature and relative humidity fields to describe variability with elevation. Precipitation is downscaled with a simple non-linear lapse rate and optionally disaggregated using a climatology approach. We test the method against a large evaluation dataset (up to 210 stations per variable) in the Swiss Alps, in comparison with unscaled grid-level data and a set of reference methods. We demonstrate that the method can be used to derive meteorological inputs in complex terrain, with the most significant improvements (with respect to reference methods) seen in variables derived from pressure levels: air temperature, relative humidity, wind speed and incoming long-wave radiation. This method may be of use in improving inputs to numerical simulations in heterogeneous and/or remote terrain, especially when statistical methods are not possible due to a lack of observations (i.e. remote areas or future periods).
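
    The core of the pressure-level downscaling step can be sketched very simply: given the atmospheric column from the climate model, variables such as temperature are interpolated to the elevation of each DEM subgrid cell. The Python sketch below shows only this vertical-interpolation kernel with made-up numbers; the full TopoSCALE scheme adds the radiation and precipitation corrections described above.

    ```python
    import numpy as np

    def downscale_temperature(z_levels, t_levels, z_subgrid):
        """Interpolate coarse-grid pressure-level temperature to DEM elevations.

        z_levels  : geopotential heights of the pressure levels (m), ascending
        t_levels  : temperature on those levels (K)
        z_subgrid : elevations of the high-resolution DEM cells (m)
        """
        return np.interp(z_subgrid, z_levels, t_levels)

    # Example column with a roughly 6.5 K/km lapse rate (made-up values)
    z = np.array([500.0, 1500.0, 3000.0, 5000.0])
    t = np.array([288.0, 281.5, 271.8, 258.8])
    dem = np.array([620.0, 1480.0, 2890.0])
    print(downscale_temperature(z, t, dem))  # per-cell air temperature (K)
    ```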

  4. Improving Synoptic and Intra-Seasonal Variability in CFS via a Better Representation of Organized Convection

    NASA Astrophysics Data System (ADS)

    Khouider, B.; Goswami, B. B.; Majda, A.; Krishna, R. P. M. M.; Mukhopadhyay, P.

    2016-12-01

    Improvements in the capability of climate models to realistically capture the synoptic and intra-seasonal variability associated with tropical rainfall are conditioned on better representation of the subgrid variability due to organized convection and of the underlying two-way interactions across multiple scales, thus breaking with the quasi-equilibrium bottleneck. By design, the stochastic multi-cloud model (SMCM) mimics the life cycle of organized tropical convective systems and the interactions of the associated cloud types with each other and with the large scales, as observed. It is based on a lattice particle-interaction model in which predefined microscopic (subgrid) sites make random transitions from one cloud type to another, conditioned on the large-scale state. In return, the SMCM provides the cloud type area fractions in the form of a Markov chain model which can be run in parallel with the climate model without any significant computational overhead. The SMCM was previously successfully tested in both reduced-complexity tropical models and an aquaplanet global atmospheric model. Here, we report for the first time the results of its implementation in the fully coupled NCEP climate model (CFSv2) through the use of prescribed vertical profiles of heating and drying obtained from observations. While many known biases in CFSv2 have been slightly improved, there is no noticeable degradation in the simulated mean climatology. Nonetheless, comparison with observations shows that the improvements in terms of synoptic and intra-seasonal variability are spectacular, despite the fact that CFSv2 is one of the best models in this regard. In particular, while CFSv2 exaggerates the intra-seasonal variance at the expense of the synoptic contribution, the CFS-SMCM shows a good balance between the two, as in the observations.
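
    A heavily reduced sketch of the SMCM mechanism: a lattice of subgrid sites, each clear or occupied by one of three cloud types, makes random transitions whose rates depend on a large-scale predictor, and the cloud-type area fractions are read off the lattice. The transition rates and the single "dryness" predictor below are illustrative inventions, not the calibrated SMCM rates.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def smcm_step(sites, dryness, dt=0.1):
        """One lattice update: sites are clear (0) or hold a cloud type
        (1=congestus, 2=deep, 3=stratiform); transition rates are toy values
        modulated by a single large-scale 'dryness' predictor."""
        rates = {
            (0, 1): 0.5 * dryness,        # clear -> congestus
            (1, 2): 0.8 * (1 - dryness),  # congestus -> deep
            (2, 3): 0.6,                  # deep -> stratiform
            (1, 0): 0.3, (2, 0): 0.2, (3, 0): 0.5,  # decay to clear
        }
        out = sites.copy()
        for (a, b), r in rates.items():
            jump = (sites == a) & (rng.random(sites.shape) < r * dt)
            out[jump] = b
        return out

    sites = np.zeros((32, 32), dtype=int)
    for _ in range(200):
        sites = smcm_step(sites, dryness=0.4)
    # Cloud-type area fractions, the quantity the SMCM hands back to the GCM
    print(np.bincount(sites.ravel(), minlength=4) / sites.size)
    ```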

  5. Forecasting seasonal hydrologic response in major river basins

    NASA Astrophysics Data System (ADS)

    Bhuiyan, A. M.

    2014-05-01

    Seasonal precipitation variation due to natural climate variation influences stream flow and the apparent frequency and severity of extreme hydrological conditions such as flood and drought. To study hydrologic response and understand the occurrence of extreme hydrological events, the relevant forcing variables must be identified. This study attempts to assess and quantify the historical occurrence and context of extreme hydrologic flow events and quantify the relation between relevant climate variables. Once identified, the flow data and climate variables are evaluated to identify the primary relationship indicators of hydrologic extreme event occurrence. Existing studies focus on developing basin-scale forecasting techniques based on climate anomalies in El Nino/La Nina episodes linked to global climate. Building on earlier work, the goal of this research is to quantify variations in historical river flows at the seasonal temporal scale and at regional to continental spatial scales. The work identifies and quantifies runoff variability of major river basins and correlates flow with environmental forcing variables such as El Nino, La Nina, and the sunspot cycle. These variables are expected to be the primary external natural indicators of inter-annual and inter-seasonal patterns of regional precipitation and river flow. Relations between continental-scale hydrologic flows and external climate variables are evaluated through direct correlations in a seasonal context with environmental phenomena such as sunspot numbers (SSN), the Southern Oscillation Index (SOI), and the Pacific Decadal Oscillation (PDO). Methods including stochastic time series analysis and artificial neural networks are developed to represent the seasonal variability evident in the historical records of river flows. River flows are categorized into low, average and high flow levels to evaluate and simulate flow variations under associated climate variable variations. Results demonstrated that no particular method is best suited to represent scenarios leading to extreme flow conditions. For selected flow scenarios, the persistence model performance may be comparable to more complex multivariate approaches, and complex methods did not always improve flow estimation. Overall model performance indicates that including river flows and forcing variables on average improves extreme event forecasting skill. As a means to further refine the flow estimation, an ensemble forecast method is implemented to provide a likelihood-based indication of expected river flow magnitude and variability. Results indicate seasonal flow variations are well captured in the ensemble range; therefore the ensemble approach can often prove efficient in estimating extreme river flow conditions. The discriminant prediction approach, a probabilistic measure to forecast streamflow, is also adopted to assess model performance. Results show the efficiency of the method in terms of representing uncertainties in the forecasts.

  6. Defining drug response for stratified medicine.

    PubMed

    Lonergan, Mike; Senn, Stephen J; McNamee, Christine; Daly, Ann K; Sutton, Robert; Hattersley, Andrew; Pearson, Ewan; Pirmohamed, Munir

    2017-01-01

    The premise for stratified medicine is that drug efficacy, drug safety, or both, vary between groups of patients, and biomarkers can be used to facilitate more targeted prescribing, with the aim of improving the benefit:risk ratio of treatment. However, many factors can contribute to the variability in response to drug treatment. Inadequate characterisation of the nature and degree of variability can lead to the identification of biomarkers that have limited utility in clinical settings. Here, we discuss the complexities associated with the investigation of variability in drug efficacy and drug safety, and how consideration of these issues a priori, together with standardisation of phenotypes, can increase both the efficiency of stratification procedures and identification of biomarkers with the potential for clinical impact. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Detection of quasars in the time domain

    NASA Astrophysics Data System (ADS)

    Graham, Matthew J.; Djorgovski, S. G.; Stern, Daniel J.; Drake, Andrew; Mahabal, Ashish

    2017-06-01

    The time domain is the emerging forefront of astronomical research with new facilities and instruments providing unprecedented amounts of data on the temporal behavior of astrophysical populations. Dealing with data of this size and complexity requires new techniques and methodologies. Quasars are an ideal work set for developing and applying these: they vary in a detectable but not easily quantifiable manner whose physical origins are poorly understood. In this paper, we will review how quasars are identified by their variability and how these techniques can be improved, what physical insights into their variability can be gained from studying extreme examples of variability, and what approaches can be taken to increase the number of quasars known. These will demonstrate how astroinformatics is essential to discovering and understanding this important population.

  8. Fluid Mechanics and Complex Variable Theory: Getting Past the 19th Century

    ERIC Educational Resources Information Center

    Newton, Paul K.

    2017-01-01

    The subject of fluid mechanics is a rich, vibrant, and rapidly developing branch of applied mathematics. Historically, it has developed hand-in-hand with the elegant subject of complex variable theory. The Westmont College NSF-sponsored workshop on the revitalization of complex variable theory in the undergraduate curriculum focused partly on…

  9. Revealing hidden clonal complexity in Mycobacterium tuberculosis infection by qualitative and quantitative improvement of sampling.

    PubMed

    Pérez-Lago, L; Palacios, J J; Herranz, M; Ruiz Serrano, M J; Bouza, E; García-de-Viedma, D

    2015-02-01

    The analysis of microevolution events, their functional relevance, and their impact on molecular epidemiology strategies constitutes one of the most challenging aspects of the study of clonal complexity in infection by Mycobacterium tuberculosis. In this study, we retrospectively evaluated whether two improved sampling schemes could provide access to the clonal complexity that is undetected by the current standards (analysis of one isolate from one sputum). We evaluated in 48 patients the analysis by mycobacterial interspersed repetitive unit-variable number tandem repeat of M. tuberculosis isolates cultured from bronchial aspirate (BAS) or bronchoalveolar lavage (BAL) and, in another 16 cases, the analysis of a higher number of isolates from independent sputum samples. Analysis of the isolates from BAS/BAL specimens revealed clonal complexity in a very high proportion of cases (5/48); in most of these cases, complexity was not detected when the isolates from sputum samples were analysed. Systematic analysis of isolates from multiple sputum samples also improved the detection of clonal complexity. We found coexisting clonal variants in two of 16 cases that would have gone undetected in the analysis of the isolate from a single sputum specimen. Our results suggest that analysis of isolates from BAS/BAL specimens is highly efficient for recording the true clonal composition of M. tuberculosis in the lungs. When these samples are not available, we recommend increasing the number of isolates from independent sputum specimens, because they might not harbour the same pool of bacteria. Our data suggest that the degree of clonal complexity in tuberculosis has been underestimated because of the deficiencies inherent in a simplified procedure. Copyright © 2014 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  10. Variability of hand tremor in rest and in posture--a pilot study.

    PubMed

    Rahimi, Fariborz; Bee, Carina; South, Angela; Debicki, Derek; Jog, Mandar

    2011-01-01

    Previous studies have demonstrated variability in the frequency and amplitude of tremor between subjects and between trials in both healthy individuals and those with disease states. However, to date, few studies have examined the composition of tremor. Efficacy of treatment for tremor using techniques such as Botulinum neurotoxin type A (BoNT A) injection may benefit from a better understanding of tremor variability, but more importantly, tremor composition. In the present study, we evaluated tremor variability and composition in 8 participants with either essential tremor or Parkinson disease tremor using kinematic recording methods. Our preliminary findings suggest that while individual patients may have more intra-trial and intra-task variability, overall, task effect was significant only for amplitude of tremor. Composition of tremor varied among patients and the data suggest that tremor composition is complex, involving multiple muscle groups. These results may support the value of kinematic assessment methods and the improved understanding of tremor composition in the management of tremor.

  11. Intrinsic movement variability at work. How long is the path from motor control to design engineering?

    PubMed

    Gaudez, C; Gilles, M A; Savin, J

    2016-03-01

    For several years, increasing numbers of studies have highlighted the existence of movement variability. Before that, it was neglected in movement analysis, and it is still almost completely ignored in workstation design. This article reviews motor control theories and factors influencing movement execution, and indicates how intrinsic movement variability is part of task completion. These background clarifications should help ergonomists and workstation designers to gain a better understanding of these concepts, which can then be used to improve design tools. We also question which techniques--kinematics, kinetics or muscular activity--and descriptors are most appropriate for describing intrinsic movement variability and for integration into design tools. In this way, simulations generated by designers for workstation design should be closer to the real movements performed by workers. This review emphasises the complexity of identifying, describing and processing intrinsic movement variability in occupational activities. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  12. Prediction of enteric methane production, yield, and intensity in dairy cattle using an intercontinental database.

    PubMed

    Niu, Mutian; Kebreab, Ermias; Hristov, Alexander N; Oh, Joonpyo; Arndt, Claudia; Bannink, André; Bayat, Ali R; Brito, André F; Boland, Tommy; Casper, David; Crompton, Les A; Dijkstra, Jan; Eugène, Maguy A; Garnsworthy, Phil C; Haque, Md Najmul; Hellwing, Anne L F; Huhtanen, Pekka; Kreuzer, Michael; Kuhla, Bjoern; Lund, Peter; Madsen, Jørgen; Martin, Cécile; McClelland, Shelby C; McGee, Mark; Moate, Peter J; Muetzel, Stefan; Muñoz, Camila; O'Kiely, Padraig; Peiren, Nico; Reynolds, Christopher K; Schwarm, Angela; Shingfield, Kevin J; Storlien, Tonje M; Weisbjerg, Martin R; Yáñez-Ruiz, David R; Yu, Zhongtang

    2018-02-16

    Enteric methane (CH4) production from cattle contributes to global greenhouse gas emissions. Measurement of enteric CH4 is complex, expensive, and impractical at large scales; therefore, models are commonly used to predict CH4 production. However, building robust prediction models requires extensive data from animals under different management systems worldwide. The objectives of this study were to (1) collate a global database of enteric CH4 production from individual lactating dairy cattle; (2) determine the availability of key variables for predicting enteric CH4 production (g/day per cow), yield [g/kg dry matter intake (DMI)], and intensity (g/kg energy corrected milk) and their respective relationships; (3) develop intercontinental and regional models and cross-validate their performance; and (4) assess the trade-off between availability of on-farm inputs and CH4 prediction accuracy. The intercontinental database covered Europe (EU), the United States (US), and Australia (AU). A sequential approach was taken by incrementally adding key variables to develop models with increasing complexity. Methane emissions were predicted by fitting linear mixed models. Within model categories, an intercontinental model with the most available independent variables performed best with root mean square prediction error (RMSPE) as a percentage of mean observed value of 16.6%, 14.7%, and 19.8% for intercontinental, EU, and United States regions, respectively. Less complex models requiring only DMI had predictive ability comparable to complex models. Enteric CH4 production, yield, and intensity prediction models developed on an intercontinental basis had similar performance across regions; however, intercepts and slopes were different, with implications for prediction. Revised CH4 emission conversion factors for specific regions are required to improve CH4 production estimates in national inventories. In conclusion, information on DMI is required for good prediction, and other factors, such as dietary neutral detergent fiber (NDF) concentration, improve the prediction. For enteric CH4 yield and intensity prediction, information on milk yield and composition is required for better estimation. © 2018 John Wiley & Sons Ltd.
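
    The modeling approach, linear mixed models of increasing complexity with study-level grouping, can be sketched with statsmodels. The data below are synthetic stand-ins for the intercontinental database and the coefficients are invented; the point is the comparison of a DMI-only model against a DMI + NDF model by RMSPE.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in for the intercontinental database: CH4 (g/day) driven
    # mainly by dry matter intake (DMI), with study-level random intercepts.
    rng = np.random.default_rng(1)
    n, n_studies = 200, 10
    df = pd.DataFrame({
        "study": rng.integers(0, n_studies, n),
        "dmi": rng.uniform(12.0, 28.0, n),   # kg/day
        "ndf": rng.uniform(25.0, 40.0, n),   # dietary NDF, % of DM
    })
    study_effect = rng.normal(0.0, 15.0, n_studies)
    df["ch4"] = (14.0 * df["dmi"] + 2.0 * df["ndf"]
                 + study_effect[df["study"]] + rng.normal(0.0, 20.0, n))

    # DMI-only versus DMI + NDF: added complexity, incremental RMSPE gain.
    for formula in ("ch4 ~ dmi", "ch4 ~ dmi + ndf"):
        fit = smf.mixedlm(formula, df, groups=df["study"]).fit()
        rmspe = np.sqrt(np.mean((df["ch4"] - fit.fittedvalues) ** 2))
        print(f"{formula:16s} RMSPE = {rmspe:5.1f} g/day")
    ```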

  13. Improved Sparse Multi-Class SVM and Its Application for Gene Selection in Cancer Classification

    PubMed Central

    Huang, Lingkang; Zhang, Hao Helen; Zeng, Zhao-Bang; Bushel, Pierre R.

    2013-01-01

    Background: Microarray techniques provide promising tools for cancer diagnosis using gene expression profiles. However, molecular diagnosis based on high-throughput platforms presents great challenges due to the overwhelming number of variables versus the small sample size and the complex nature of multi-type tumors. Support vector machines (SVMs) have shown superior performance in cancer classification due to their ability to handle high dimensional low sample size data. The multi-class SVM algorithm of Crammer and Singer provides a natural framework for multi-class learning. Despite its effective performance, the procedure utilizes all variables without selection. In this paper, we propose to improve the procedure by imposing shrinkage penalties in learning to enforce solution sparsity. Results: The original multi-class SVM of Crammer and Singer is effective for multi-class classification but does not conduct variable selection. We improved the method by introducing soft-thresholding type penalties to incorporate variable selection into multi-class classification for high dimensional data. The new methods were applied to simulated data and two cancer gene expression data sets. The results demonstrate that the new methods can select a small number of genes for building accurate multi-class classification rules. Furthermore, the important genes selected by the methods overlap significantly, suggesting general agreement among different variable selection schemes. Conclusions: High accuracy and sparsity make the new methods attractive for cancer diagnostics with gene expression data and defining targets of therapeutic intervention. Availability: The source MATLAB code is available from http://math.arizona.edu/~hzhang/software.html. PMID:23966761
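
    The key ingredient the authors add to the Crammer-Singer SVM is a soft-thresholding penalty. The operator itself is a one-liner; the sketch below shows how it zeroes out small classifier weights so that the surviving nonzero entries act as the selected genes. The full method embeds this inside the multi-class SVM optimization.

    ```python
    import numpy as np

    def soft_threshold(w, lam):
        """S(w, lam) = sign(w) * max(|w| - lam, 0): shrinks every weight toward
        zero and sets small ones exactly to zero, performing variable selection."""
        return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

    w = np.array([2.3, -0.4, 0.1, -1.7, 0.05])  # hypothetical gene weights
    print(soft_threshold(w, 0.5))               # -> [ 1.8 -0.   0.  -1.2  0. ]
    ```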

  14. Weighted augmented Jacobian matrix with a variable coefficient method for kinematics mapping of space teleoperation based on human-robot motion similarity

    NASA Astrophysics Data System (ADS)

    Shi, Zhong; Huang, Xuexiang; Hu, Tianjian; Tan, Qian; Hou, Yuzhuo

    2016-10-01

    Space teleoperation is an important space technology, and human-robot motion similarity can improve the flexibility and intuition of space teleoperation. This paper aims to obtain an appropriate kinematics mapping method of coupled Cartesian-joint space for space teleoperation. First, the coupled Cartesian-joint similarity principles concerning kinematics differences are defined. Then, a novel weighted augmented Jacobian matrix with a variable coefficient (WAJM-VC) method for kinematics mapping is proposed. The Jacobian matrix is augmented to achieve a global similarity of human-robot motion. A clamping weighted least norm scheme is introduced to achieve local optimizations, and the operating ratio coefficient is variable to pursue similarity in the elbow joint. Similarity in Cartesian space and the property of joint constraint satisfaction are analysed to determine the damping factor and clamping velocity. Finally, a teleoperation system based on human motion capture is established, and the experimental results indicate that the proposed WAJM-VC method can improve the flexibility and intuition of space teleoperation to complete complex space tasks.
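
    For orientation, the generic building block behind such schemes is a weighted damped-least-squares step for a redundant arm. The sketch below is that textbook step, not the paper's exact WAJM-VC formulation: per-joint weights discourage motion of selected joints, and the damping factor bounds joint velocities near singularities.

    ```python
    import numpy as np

    def weighted_dls_step(J, dx, joint_weights, damping=0.05):
        """One weighted damped-least-squares resolved-rate step (textbook form).

        J             : (m, n) manipulator Jacobian
        dx            : (m,) desired end-effector displacement this cycle
        joint_weights : (n,) positive weights; larger => that joint moves less
        damping       : damps joint velocities near singular configurations
        """
        W_inv = np.diag(1.0 / np.asarray(joint_weights, dtype=float))
        JW = J @ W_inv
        m = J.shape[0]
        # dq = W^-1 J^T (J W^-1 J^T + lambda^2 I)^-1 dx
        dq = W_inv @ J.T @ np.linalg.solve(JW @ J.T + damping**2 * np.eye(m), dx)
        return dq

    # Example: 3-DOF planar arm tracking a small 2-D displacement, with the
    # "elbow" (joint 2) weighted to move less than the other joints.
    J = np.array([[1.0, 0.6, 0.2],
                  [0.0, 0.8, 0.5]])
    print(weighted_dls_step(J, np.array([0.01, -0.02]), [1.0, 4.0, 1.0]))
    ```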

  15. Accurate and efficient integration for molecular dynamics simulations at constant temperature and pressure

    NASA Astrophysics Data System (ADS)

    Lippert, Ross A.; Predescu, Cristian; Ierardi, Douglas J.; Mackenzie, Kenneth M.; Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.

    2013-10-01

    In molecular dynamics simulations, control over temperature and pressure is typically achieved by augmenting the original system with additional dynamical variables to create a thermostat and a barostat, respectively. These variables generally evolve on timescales much longer than those of particle motion, but typical integrator implementations update the additional variables along with the particle positions and momenta at each time step. We present a framework that replaces the traditional integration procedure with separate barostat, thermostat, and Newtonian particle motion updates, allowing thermostat and barostat updates to be applied infrequently. Such infrequent updates provide a particularly substantial performance advantage for simulations parallelized across many computer processors, because thermostat and barostat updates typically require communication among all processors. Infrequent updates can also improve accuracy by alleviating certain sources of error associated with limited-precision arithmetic. In addition, separating the barostat, thermostat, and particle motion update steps reduces certain truncation errors, bringing the time-average pressure closer to its target value. Finally, this framework, which we have implemented on both general-purpose and special-purpose hardware, reduces software complexity and improves software modularity.
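
    A minimal sketch of the decoupled-update idea, assuming a simple Langevin-style velocity refresh as the thermostat: Newtonian velocity-Verlet steps run every time step, while the (communication-heavy) thermostat update is applied only every `thermostat_interval` steps. The barostat is omitted, and this is not the paper's exact scheme.

    ```python
    import numpy as np

    def integrate(x, v, force, masses, dt, n_steps,
                  thermostat_interval=100, gamma=1.0, kT=1.0, rng=None):
        """Velocity-Verlet dynamics with an infrequent Langevin-style thermostat.

        x, v   : (N, 3) positions and velocities; masses : (N,) particle masses
        force  : callable returning (N, 3) forces for given positions
        The Newtonian update runs every step; the thermostat (which, in a
        parallel code, requires global communication) runs only every
        `thermostat_interval` steps.
        """
        rng = rng or np.random.default_rng()
        inv_m = 1.0 / masses[:, None]
        f = force(x)
        for step in range(n_steps):
            v += 0.5 * dt * f * inv_m              # half-kick
            x += dt * v                            # drift
            f = force(x)
            v += 0.5 * dt * f * inv_m              # half-kick
            if (step + 1) % thermostat_interval == 0:
                # partial velocity refresh toward the target temperature kT
                c = np.exp(-gamma * dt * thermostat_interval)
                sigma = np.sqrt((1.0 - c * c) * kT / masses)[:, None]
                v = c * v + sigma * rng.normal(size=v.shape)
        return x, v
    ```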

  16. Self-referenced continuous-variable measurement-device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Yijun; Wang, Xudong; Li, Jiawei; Huang, Duan; Zhang, Ling; Guo, Ying

    2018-05-01

    We propose a scheme to remove the need to transmit a high-brightness local oscillator (LO) in the continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocol, which we call self-referenced (SR) CV-MDI QKD. We show that our scheme is immune to side-channel attacks, such as calibration attacks, wavelength attacks and LO fluctuation attacks, all of which exploit security loopholes introduced by transmitting the LO. Besides, the proposed scheme waives the necessity of complex multiplexers and demultiplexers, which can greatly simplify the QKD processes and improve the transmission efficiency. The numerical simulations under collective attacks show that all the improvements brought about by our scheme come only at the expense of a slight shortening of the transmission distance. This scheme shows an available method to mend the security loopholes incurred by transmitting the LO in CV-MDI QKD.

  17. Improved approximations for control augmented structural synthesis

    NASA Technical Reports Server (NTRS)

    Thomas, H. L.; Schmit, L. A.

    1990-01-01

    A methodology for control-augmented structural synthesis is presented for structure-control systems which can be modeled as an assemblage of beam, truss, and nonstructural mass elements augmented by a noncollocated direct output feedback control system. Truss areas, beam cross sectional dimensions, nonstructural masses and rotary inertias, and controller position and velocity gains are treated simultaneously as design variables. The structural mass and a control-system performance index can be minimized simultaneously, with design constraints placed on static stresses and displacements, dynamic harmonic displacements and forces, structural frequencies, and closed-loop eigenvalues and damping ratios. Intermediate design-variable and response-quantity concepts are used to generate new approximations for displacements and actuator forces under harmonic dynamic loads and for system complex eigenvalues. This improves the overall efficiency of the procedure by reducing the number of complete analyses required for convergence. Numerical results which illustrate the effectiveness of the method are given.

  18. Virtual reality for gait training: can it induce motor learning to enhance complex walking and reduce fall risk in patients with Parkinson's disease?

    PubMed

    Mirelman, Anat; Maidan, Inbal; Herman, Talia; Deutsch, Judith E; Giladi, Nir; Hausdorff, Jeffrey M

    2011-02-01

    Gait and cognitive disturbances are common in Parkinson's disease (PD). These deficits exacerbate fall risk and difficulties with mobility, especially during complex or dual-task walking. Traditional gait training generally fails to fully address these complex gait activities. Virtual reality (VR) incorporates principles of motor learning while delivering engaging and challenging training in complex environments. We hypothesized that VR may be applied to address the multifaceted deficits associated with fall risk in PD. Twenty patients received 18 sessions (3 per week) of progressive intensive treadmill training with virtual obstacles (TT + VR). Outcome measures included gait under usual-walking and dual-task conditions and while negotiating physical obstacles. Cognitive function and functional performance were also assessed. Patients were 67.1 ± 6.5 years and had a mean disease duration of 9.8 ± 5.6 years. Posttraining, gait speed significantly improved during usual walking, during dual task, and while negotiating overground obstacles. Dual-task gait variability decreased (ie, improved) and Trail Making Test times (parts A and B) improved. Gains in functional performance measures and retention effects, 1 month later, were also observed. To our knowledge, this is the first time that TT + VR has been used for gait training in PD. The results indicate that TT + VR is viable in PD and may significantly improve physical performance, gait during complex challenging conditions, and even certain aspects of cognitive function. These findings have important implications for understanding motor learning in the presence of PD and for treating fall risk in PD, aging, and others who share a heightened risk of falls.

  19. Developing, delivering and evaluating primary mental health care: the co-production of a new complex intervention.

    PubMed

    Reeve, Joanne; Cooper, Lucy; Harrington, Sean; Rosbottom, Peter; Watkins, Jane

    2016-09-06

    Health services face the challenges created by complex problems, and so need complex intervention solutions. However, they also experience ongoing difficulties in translating research findings in this area into quality improvement changes on the ground. BounceBack was a service development innovation project which sought to examine this issue through the implementation and evaluation, in a primary care setting, of a novel complex intervention. The project was a collaboration between a local mental health charity, an academic unit, and GP practices. The aim was to translate the charity's model of care into practice-based evidence describing delivery and impact. Normalisation Process Theory (NPT) was used to support the implementation of the new model of primary mental health care in six GP practices. An integrated process evaluation assessed the process and impact of care. Implementation quickly stalled as we identified problems with the described model of care when applied in a changing and variable primary care context. The team therefore switched to using the NPT framework to support the systematic identification and modification of the components of the complex intervention, including the core components that made it distinct (the consultation approach) and the variable components (organisational issues) that made it work in practice. The extra work significantly reduced the time available for outcome evaluation. However, findings demonstrated moderately successful implementation of the model and a suggestion of the hypothesised changes in outcomes. The BounceBack project demonstrates the development of a complex intervention from practice. It highlights the use of Normalisation Process Theory to support development, and not just implementation, of a complex intervention; and describes the use of the research process in the generation of practice-based evidence. Implications for future translational complex intervention research supporting practice change through scholarship are discussed.

  20. Dissociable effects of practice variability on learning motor and timing skills.

    PubMed

    Caramiaux, Baptiste; Bevilacqua, Frédéric; Wanderley, Marcelo M; Palmer, Caroline

    2018-01-01

    Motor skill acquisition inherently depends on the way one practices the motor task. The amount of motor task variability during practice has been shown to foster transfer of the learned skill to other similar motor tasks. In addition, variability in a learning schedule, in which a task and its variations are interweaved during practice, has been shown to help the transfer of learning in motor skill acquisition. However, there is little evidence on how motor task variations and variability schedules during practice act on the acquisition of complex motor skills such as music performance, in which a performer learns both the right movements (motor skill) and the right time to perform them (timing skill). This study investigated the impact of rate (tempo) variability and the schedule of tempo change during practice on timing and motor skill acquisition. Complete novices, with no musical training, practiced a simple musical sequence on a piano keyboard at different rates. Each novice was assigned to one of four learning conditions designed to manipulate the amount of tempo variability across trials (large or small tempo set) and the schedule of tempo change (randomized or non-randomized order) during practice. At test, the novices performed the same musical sequence at a familiar tempo and at novel tempi (testing tempo transfer), as well as two novel (but related) sequences at a familiar tempo (testing spatial transfer). We found that practice conditions had little effect on learning and transfer performance of the timing skill. Interestingly, practice conditions influenced motor skill learning (reduction of movement variability): lower temporal variability during practice facilitated transfer to new tempi and new sequences, and a non-randomized learning schedule improved transfer to new tempi and new sequences. Tempo (rate) and sequence difficulty (spatial manipulation) affected performance variability in both timing and movement. These findings suggest a dissociable effect of practice variability on learning complex skills that involve both motor and timing constraints.

  1. The Complexity of Threats to Nuclear Strategic Deterrence Posture

    DTIC Science & Technology

    2017-02-07

    environment, a status quo but things are improving, a status quo but things are getting worse, and the occurrence of a game-changing event. Findings...adversaries. This information will certainly inform this research, as game-changing technology will in due course affect the strategic nuclear deterrence...Congressional Research Service, RAND Corporation, and certain peer-reviewed and scholarly articles. The fourth independent variable, "Occurrence of a Game

  2. A systematic review of the psychological literature on interruption and its patient safety implications.

    PubMed

    Li, Simon Y W; Magrabi, Farah; Coiera, Enrico

    2012-01-01

    The aim was to understand the complex effects of interruption in healthcare. As interruptions have been well studied in other domains, the authors undertook a systematic review of experimental studies in psychology and human-computer interaction to identify the task types and variables influencing interruption effects. Sixty-three studies were identified from 812 articles retrieved by systematic searches. On the basis of interruption profiles for generic tasks, it was found that clinical tasks can be distinguished into three broad types: procedural, problem-solving, and decision-making. Twelve experimental variables that influence interruption effects were identified. Of these, six are the most important, based on the number of studies and their centrality to interruption effects: working memory load, interruption position, similarity, modality, handling strategies, and practice effect. The variables are explained by three main theoretical frameworks: the activation-based goal memory model, prospective memory, and multiple resource theory. This review provides a useful starting point for a more comprehensive examination of interruptions, potentially leading to an improved understanding of the impact of this phenomenon on patient safety and task efficiency. The authors provide some recommendations to counter interruption effects. The effects of interruption are the outcome of a complex set of variables and should not be considered as uniformly predictable or bad. The task types, variables, and theories should help us better identify which clinical tasks and contexts are most susceptible and assist in the design of information systems and processes that are resilient to interruption.

  3. Decadal predictions of Southern Ocean sea ice: testing different initialization methods with an Earth-system Model of Intermediate Complexity

    NASA Astrophysics Data System (ADS)

    Zunz, Violette; Goosse, Hugues; Dubinkina, Svetlana

    2013-04-01

    The sea ice extent in the Southern Ocean has increased since 1979, but the causes of this expansion have not been firmly identified. In particular, the contributions of internal variability and external forcing to this positive trend have not been fully established. In this region, the lack of observations and the overestimation of the internal variability of the sea ice by contemporary General Circulation Models (GCMs) make it difficult to understand the behaviour of the sea ice. Nevertheless, if its evolution is governed by the internal variability of the system, and if this internal variability is in some way predictable, a suitable initialization method should lead to simulation results that better fit reality. Current GCM decadal predictions are generally initialized through a nudging towards some observed fields. This relatively simple method does not seem appropriate for initializing sea ice in the Southern Ocean. The present study aims at identifying an initialization method that could improve the quality of predictions of Southern Ocean sea ice at decadal timescales. We use LOVECLIM, an Earth-system Model of Intermediate Complexity that allows us to perform, within a reasonable computational time, the large number of simulations required to test different initialization procedures systematically. These involve three data assimilation methods: a nudging, a particle filter, and an efficient particle filter. In a first step, simulations are performed in an idealized framework, i.e. data from a reference simulation of LOVECLIM are used instead of observations, hereinafter called pseudo-observations. In this configuration, the internal variability of the model obviously agrees with that of the pseudo-observations. This allows us to get rid of the issues related to the overestimation of internal variability by models compared to the observed one, and to work out a suitable methodology for assessing the efficiency of the initialization procedures tested. It also allows us to determine the upper limit of improvement that can be expected if more sophisticated initialization methods are used in decadal prediction simulations and if models have an internal variability agreeing with the observed one. Furthermore, since pseudo-observations are available everywhere at any time step, we also analyse the differences between simulations initialized with a complete dataset of pseudo-observations and ones in which pseudo-observations are not assimilated everywhere. In a second step, simulations are performed in a realistic framework, i.e. using actually available observations. The same data assimilation methods are tested in order to check whether more sophisticated methods can improve the reliability and accuracy of decadal prediction simulations, even when they are performed with models that overestimate the internal variability of sea ice extent in the Southern Ocean.
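
    The particle filter mentioned above can be sketched in a few lines. The following toy Python bootstrap particle filter assimilates scalar pseudo-observations into a red-noise model; it only illustrates the propagate-weight-resample cycle and makes no attempt to reproduce the LOVECLIM setup or the authors' efficient variant.

        import numpy as np

        rng = np.random.default_rng(0)

        def particle_filter_step(particles, observation, obs_std, forward):
            """One assimilation cycle of a bootstrap particle filter."""
            # 1. Propagate each particle with the (stochastic) model.
            particles = forward(particles)
            # 2. Weight particles by the likelihood of the observation.
            weights = np.exp(-0.5 * ((particles - observation) / obs_std) ** 2)
            weights /= weights.sum()
            # 3. Resample: particles close to the observation are duplicated,
            #    particles far from it are dropped.
            idx = rng.choice(len(particles), size=len(particles), p=weights)
            return particles[idx]

        # Toy red-noise stand-in for a climate state, plus synthetic observations.
        forward = lambda x: 0.9 * x + rng.normal(0.0, 0.3, size=x.shape)
        particles = rng.normal(0.0, 1.0, size=500)
        for obs in [0.8, 1.1, 0.9, 1.3]:
            particles = particle_filter_step(particles, obs, obs_std=0.5, forward=forward)
            print(f"analysis mean = {particles.mean():.2f}")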

  4. Case studies on forecasting for innovative technologies: frequent revisions improve accuracy.

    PubMed

    Lerner, Jeffrey C; Robertson, Diane C; Goldstein, Sara M

    2015-02-01

    Health technology forecasting is designed to provide reliable predictions about costs, utilization, diffusion, and other market realities before the technologies enter routine clinical use. In this article we address three questions central to forecasting's usefulness: Are early forecasts sufficiently accurate to help providers acquire the most promising technology and payers to set effective coverage policies? What variables contribute to inaccurate forecasts? How can forecasters manage the variables to improve accuracy? We analyzed forecasts published between 2007 and 2010 by the ECRI Institute on four technologies: single-room proton beam radiation therapy for various cancers; digital breast tomosynthesis imaging technology for breast cancer screening; transcatheter aortic valve replacement for serious heart valve disease; and minimally invasive robot-assisted surgery for various cancers. We then examined revised ECRI forecasts published in 2013 (digital breast tomosynthesis) and 2014 (the other three topics) to identify inaccuracies in the earlier forecasts and explore why they occurred. We found that five of twenty early predictions were inaccurate when compared with the updated forecasts. The inaccuracies pertained to two technologies that had more time-sensitive variables to consider. The case studies suggest that frequent revision of forecasts could improve accuracy, especially for complex technologies whose eventual use is governed by multiple interactive factors. Project HOPE—The People-to-People Health Foundation, Inc.

  5. Measuring quality in anatomic pathology.

    PubMed

    Raab, Stephen S; Grzybicki, Dana Marie

    2008-06-01

    This article focuses mainly on diagnostic accuracy in measuring quality in anatomic pathology, noting that measuring any quality metric is complex and demanding. The authors discuss standardization and its variability within and across areas of care delivery and efforts involving defining and measuring error to achieve pathology quality and patient safety. They propose that data linking error to patient outcome are critical for developing quality improvement initiatives targeting errors that cause patient harm in addition to using methods of root cause analysis, beyond those traditionally used in cytologic-histologic correlation, to assist in the development of error reduction and quality improvement plans.

  6. Effect of hydrogenation on the electrical and optical properties of CdZnTe substrates and HgCdTe epitaxial layers

    NASA Astrophysics Data System (ADS)

    Sitharaman, S.; Raman, R.; Durai, L.; Pal, Surendra; Gautam, Madhukar; Nagpal, Anjana; Kumar, Shiv; Chatterjee, S. N.; Gupta, S. C.

    2005-12-01

    In this paper, we report experimental observations on the effect of plasma hydrogenation in passivating intrinsic point defects, shallow/deep levels, and extended defects in low-resistivity undoped CdZnTe crystals. Optical absorption studies show transmittance improvement in the below-gap absorption spectrum. Using the variable-temperature Hall measurement technique, the shallow defect level with which the penetrating hydrogen forms a complex has been identified. In 'compensated' n-type HgCdTe epitaxial layers, hydrogenation can improve the resistivity by two orders of magnitude.

  7. Beyond the G-spot: clitourethrovaginal complex anatomy in female orgasm.

    PubMed

    Jannini, Emmanuele A; Buisson, Odile; Rubio-Casillas, Alberto

    2014-09-01

    The search for the legendary, highly erogenous vaginal region, the Gräfenberg spot (G-spot), has produced important data, substantially improving understanding of the complex anatomy and physiology of sexual responses in women. Modern imaging techniques have enabled visualization of dynamic interactions of female genitals during self-sexual stimulation or coitus. Although no single structure consistent with a distinct G-spot has been identified, the vagina is not a passive organ but a highly dynamic structure with an active role in sexual arousal and intercourse. The anatomical relationships and dynamic interactions between the clitoris, urethra, and anterior vaginal wall have led to the concept of a clitourethrovaginal (CUV) complex, defining a variable, multifaceted morphofunctional area that, when properly stimulated during penetration, could induce orgasmic responses. Knowledge of the anatomy and physiology of the CUV complex might help to avoid damage to its neural, muscular, and vascular components during urological and gynaecological surgical procedures.

  8. Structural resolution of inorganic nanotubes with complex stoichiometry.

    PubMed

    Monet, Geoffrey; Amara, Mohamed S; Rouzière, Stéphan; Paineau, Erwan; Chai, Ziwei; Elliott, Joshua D; Poli, Emiliano; Liu, Li-Min; Teobaldi, Gilberto; Launois, Pascale

    2018-05-23

    Determination of the atomic structure of inorganic single-walled nanotubes with complex stoichiometry remains elusive because too many atomic coordinates must be fitted against X-ray diffractograms that inherently exhibit rather broad features. Here we introduce a methodology to reduce the number of fitted variables and enable resolution of the atomic structure for inorganic nanotubes with complex stoichiometry. We apply it to recently synthesized methylated aluminosilicate and aluminogermanate imogolite nanotubes of nominal composition (OH)3Al2O3Si(Ge)CH3. Fitting of X-ray scattering diagrams, supported by Density Functional Theory simulations, reveals an unexpected rolling mode for these systems. The transferability of the approach opens the way to an improved understanding of structure-property relationships of inorganic nanotubes, to the benefit of fundamental and applied research on these systems.

  9. Genetics and Genomics of Acute Neurologic Disorders.

    PubMed

    Maserati, Megan; Alexander, Sheila A

    2018-01-01

    Neurologic diseases and injuries are complex and multifactorial, making risk prediction, targeted treatment modalities, and outcome prognostication difficult and elusive. Genetics and genomics have affected clinical practice in many aspects in medicine, particularly cancer treatment. Advancements in knowledge of genetic and genomic variability in neurologic disease and injury are growing rapidly. Although these data are not yet ready for use in clinical practice, research continues to progress and elucidate information that eventually will provide answers to complex neurologic questions and serve as a platform to provide individualized care plans aimed at improving outcomes. This article provides a focused review of relevant literature on genetics, genomics, and common complex neurologic disease and injury likely to be seen in the acute care setting. ©2018 American Association of Critical-Care Nurses.

  10. Modelling Pseudocalanus elongatus stage-structured population dynamics embedded in a water column ecosystem model for the northern North Sea

    NASA Astrophysics Data System (ADS)

    Moll, Andreas; Stegert, Christoph

    2007-01-01

    This paper outlines an approach to couple a structured zooplankton population model, with state variables for eggs, nauplii, two copepodite stages, and adults, adapted to Pseudocalanus elongatus, to the complex marine ecosystem model ECOHAM2, whose 13 state variables resolve the carbon and nitrogen cycles. Different temperature and food scenarios derived from laboratory culture studies were examined to improve the process parameterisation of stage-dependent copepod development. To study annual cycles under realistic weather and hydrographic conditions, the coupled ecosystem-zooplankton model is applied to a water column in the northern North Sea. The main ecosystem state variables were validated against observed monthly mean values. Vertical profiles of selected state variables were then compared to the physical forcing to study the differences between treating zooplankton as a single biomass state variable and partitioning it into five population state variables. Simulated generation times are more affected by temperature than by food conditions, except during the spring phytoplankton bloom. Up to six generations within the annual cycle can be discerned in the simulation.
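
    A stage-structured population model of this kind can be illustrated with a minimal Python sketch: five compartments (eggs, nauplii, two copepodite groups, adults) coupled by development, mortality, and egg production. All rate values below are hypothetical placeholders, not ECOHAM2 parameterisations, and temperature and food dependence are omitted.

        import numpy as np

        # Stages: eggs, nauplii, copepodites C1-3, copepodites C4-5, adults.
        # d = stage-specific development (transfer) rates [1/day],
        # m = stage-specific mortality rates [1/day],
        # egg_rate = eggs produced per adult per day. Values are illustrative.
        def step(N, dt, d, m, egg_rate):
            E, Np, C1, C2, A = N
            dE  = egg_rate * A - (d[0] + m[0]) * E
            dNp = d[0] * E     - (d[1] + m[1]) * Np
            dC1 = d[1] * Np    - (d[2] + m[2]) * C1
            dC2 = d[2] * C1    - (d[3] + m[3]) * C2
            dA  = d[3] * C2    - m[4] * A
            return N + dt * np.array([dE, dNp, dC1, dC2, dA])

        N = np.array([100.0, 20.0, 10.0, 5.0, 2.0])    # initial abundances
        d = np.array([0.20, 0.15, 0.10, 0.08])         # development rates
        m = np.array([0.05, 0.05, 0.04, 0.03, 0.02])   # mortalities
        for day in range(120):
            N = step(N, dt=1.0, d=d, m=m, egg_rate=1.5)
        print("abundances after 120 days:", np.round(N, 1))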

  11. Complexity and Hopf Bifurcation Analysis on a Kind of Fractional-Order IS-LM Macroeconomic System

    NASA Astrophysics Data System (ADS)

    Ma, Junhai; Ren, Wenbo

    Building on our previous research, we deepen and complete a macroeconomic IS-LM model using fractional-order calculus, which captures the memory characteristics of economic variables well; we also focus on the influence of these variables on the real system, improving the ability of traditional economic models to represent the actual macroeconomic environment. The conditions for Hopf bifurcation in fractional-order system models are briefly demonstrated, and the fractional order at which Hopf bifurcation occurs is calculated, showing the inherently complex dynamic characteristics of the system. Numerical simulation yields bifurcations, strange attractors, limit cycles, waveforms, and other complex dynamic characteristics, and the order condition is obtained with respect to time. We find that the system order has an important influence on the running state of the system: the system exhibits periodic motion when the order meets the conditions of Hopf bifurcation, and the fractional-order system gradually stabilizes with changes in the order and parameters, while the corresponding integer-order system diverges. This study has significance for policy-making on macroeconomic regulation and control.
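
    Fractional-order systems of this type are commonly simulated with the Grünwald-Letnikov discretization, in which the fractional derivative becomes a weighted sum over the full solution history, the memory effect the abstract refers to. The sketch below shows this standard scheme on a toy scalar equation, not the authors' IS-LM system.

        import numpy as np

        def gl_coefficients(alpha, n):
            """Coefficients c_k = (-1)^k * C(alpha, k) of the Grunwald-Letnikov
            derivative, via the recurrence c_0 = 1, c_k = c_{k-1}*(1-(alpha+1)/k)."""
            c = np.empty(n)
            c[0] = 1.0
            for k in range(1, n):
                c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
            return c

        def simulate_fractional(f, alpha, x0, h, steps):
            """Integrate D^alpha x = f(x) with the explicit GL scheme:
            x_n = f(x_{n-1}) * h^alpha - sum_{k>=1} c_k * x_{n-k}."""
            c = gl_coefficients(alpha, steps + 1)
            x = np.empty(steps + 1)
            x[0] = x0
            for n in range(1, steps + 1):
                memory = np.dot(c[1:n + 1], x[n - 1::-1])  # history ("memory") term
                x[n] = f(x[n - 1]) * h ** alpha - memory
            return x

        # Toy scalar example: D^0.9 x = -x, x(0) = 1 (decays roughly like exp(-t)).
        x = simulate_fractional(lambda x: -x, alpha=0.9, x0=1.0, h=0.01, steps=500)
        print(x[::100])

    For alpha = 1 the coefficients reduce to (1, -1, 0, ...), recovering the ordinary explicit Euler step, which is a quick sanity check on the recurrence.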

  12. A Marked Poisson Process Driven Latent Shape Model for 3D Segmentation of Reflectance Confocal Microscopy Image Stacks of Human Skin.

    PubMed

    Ghanta, Sindhu; Jordan, Michael I; Kose, Kivanc; Brooks, Dana H; Rajadhyaksha, Milind; Dy, Jennifer G

    2017-01-01

    Segmenting objects of interest from 3D data sets is a common problem encountered in biological data. Small field of view and intrinsic biological variability combined with optically subtle changes of intensity, resolution, and low contrast in images make the task of segmentation difficult, especially for microscopy of unstained living or freshly excised thick tissues. Incorporating shape information in addition to the appearance of the object of interest can often help improve segmentation performance. However, the shapes of objects in tissue can be highly variable, and the design of a flexible shape model that encompasses these variations is challenging. To address such complex segmentation problems, we propose a unified probabilistic framework that can incorporate the uncertainty associated with complex shapes, variable appearance, and unknown locations. The driving application that inspired the development of this framework is a biologically important segmentation problem: the task of automatically detecting and segmenting the dermal-epidermal junction (DEJ) in 3D reflectance confocal microscopy (RCM) images of human skin. RCM imaging allows noninvasive observation of cellular, nuclear, and morphological detail. The DEJ is an important morphological feature as it is where disorder, disease, and cancer usually start. Detecting the DEJ is challenging because it is a 2D surface in a 3D volume with a strong but highly variable pattern of irregularly spaced and variably shaped "peaks and valleys." In addition, RCM imaging resolution, contrast, and intensity vary with depth. Thus, a prior model needs to incorporate the intrinsic structure while allowing variability in essentially all its parameters. We propose a model which can incorporate objects of interest with complex shapes and variable appearance in an unsupervised setting by utilizing domain knowledge to build appropriate priors of the model. Our novel strategy to model this structure combines a spatial Poisson process with shape priors and performs inference using Gibbs sampling. Experimental results show that the proposed unsupervised model is able to automatically detect the DEJ with physiologically relevant accuracy in the range of 10–20 μm.

  14. Effect of task-oriented training and high-variability practice on gross motor performance and activities of daily living in children with spastic diplegia.

    PubMed

    Kwon, Hae-Yeon; Ahn, So-Yoon

    2016-10-01

    [Purpose] This study investigated how a task-oriented training and high-variability practice program affects gross motor performance and activities of daily living in children with spastic diplegia, and provides an effective and reliable clinical database for the future improvement of motor performance skills. [Subjects and Methods] This study randomly assigned seven children with spastic diplegia to each of three intervention groups: a control group, a task-oriented training group, and a high-variability practice group. The control group received only neurodevelopmental treatment for 40 minutes, while the other two intervention groups additionally implemented a task-oriented training or high-variability practice program for 8 weeks (twice a week, 60 min per session). To compare intra- and inter-group differences among the three intervention groups, this study measured the gross motor performance measure (GMPM) and the functional independence measure for children (WeeFIM) before and after 8 weeks of training. [Results] There were statistically significant differences in the amount of change before and after training among the three intervention groups for the gross motor performance measure and the functional independence measure. [Conclusion] Applying high-variability practice in a task-oriented training course may be an efficient intervention to improve motor performance skills, attuning movement to the needs of daily living through motor experience, the learning of new skills, and the transfer of learned tasks to complex environments or situations similar to high-variability practice.

  15. Some elements of a theory of multidimensional complex variables. I - General theory. II - Expansions of analytic functions and application to fluid flows

    NASA Technical Reports Server (NTRS)

    Martin, E. Dale

    1989-01-01

    The paper introduces a new theory of N-dimensional complex variables and analytic functions which, for N greater than 2, is both a direct generalization and a close analog of the theory of ordinary complex variables. The algebra in the present theory is a commutative ring, not a field. Functions of a three-dimensional variable were defined and the definition of the derivative then led to analytic functions.

  16. Dannie Heineman Prize for Mathematical Physics: Applying mathematical techniques to solve important problems in quantum theory

    NASA Astrophysics Data System (ADS)

    Bender, Carl

    2017-01-01

    The theory of complex variables is extremely useful because it helps to explain the mathematical behavior of functions of a real variable. Complex variable theory also provides insight into the nature of physical theories. For example, it provides a simple and beautiful picture of quantization and it explains the underlying reason for the divergence of perturbation theory. By using complex-variable methods one can generalize conventional Hermitian quantum theories into the complex domain. The result is a new class of parity-time-symmetric (PT-symmetric) theories whose remarkable physical properties have been studied and verified in many recent laboratory experiments.

  17. Analytical closed-form solutions to the elastic fields of solids with dislocations and surface stress

    NASA Astrophysics Data System (ADS)

    Ye, Wei; Paliwal, Bhasker; Ougazzaden, Abdallah; Cherkaoui, Mohammed

    2013-07-01

    The concept of eigenstrain is adopted to derive a general analytical framework for solving the elastic fields of 3D anisotropic solids with general defects while accounting for surface stress. The formulation shows that the elastic constants and geometrical features of the surface play an important role in determining the elastic fields of the solid. As an application, analytical closed-form solutions for the stress fields of an infinite isotropic circular nanowire are obtained. The stress fields are compared with the classical solutions and with those of the complex variable method. The stress fields from this work demonstrate the impact of surface stress as the size of the nanowire shrinks, an effect that becomes negligible at the macroscopic scale. Compared with the power-series solutions of the complex variable method, the analytical solutions in this work provide a better platform and are more flexible in various applications. More importantly, the proposed analytical framework substantially advances the study of general 3D anisotropic materials with surface effects.

  18. Effects of CPAP therapy on cognitive and psychomotor performances in patients with severe obstructive sleep apnea: a prospective 1-year study.

    PubMed

    Pecotic, Renata; Dodig, Ivana Pavlinac; Valic, Maja; Galic, Tea; Kalcina, Linda Lusic; Ivkovic, Natalija; Dogas, Zoran

    2018-02-16

    We prospectively investigated the effects of continuous positive airway pressure (CPAP) on long-term cognitive and psychomotor performance and excessive daytime sleepiness in patients with severe obstructive sleep apnea (OSA). A total of 40 patients were recruited, and 23 patients with severe OSA fully completed the study protocol to investigate the effects of CPAP therapy on psychomotor performance at 1, 3, and 6 months and 1 year following initiation of the therapy. Psychomotor CRD-series tests measuring reaction times for light stimulus perception, solving simple arithmetic operations, and complex psychomotor limb coordination were used in this study. The data collected following CPAP therapy were compared to baseline values prior to the CPAP treatment for each patient. All of the measured variables improved following CPAP treatment. However, the most pronounced effect was observed in the improvement of reaction times on the complex psychomotor limb coordination test (p < 0.05). Self-reported excessive daytime sleepiness measured by the Epworth Sleepiness Scale (ESS) showed a significant decrease, from 10.0 ± 1.1 before treatment to 3.5 ± 0.5 after 1 year on CPAP therapy (p < 0.001). The CPAP therapy improved cognitive and psychomotor performance on the CRD-series tests, with the most significant improvement observed in the complex psychomotor limb coordination of severe OSA patients.

  19. Quantitative predictions of streamflow variability in the Susquehanna River Basin

    NASA Astrophysics Data System (ADS)

    Alexander, R.; Boyer, E. W.; Leonard, L. N.; Duffy, C.; Schwarz, G. E.; Smith, R. A.

    2012-12-01

    Hydrologic researchers and water managers have increasingly sought an improved understanding of the major processes that control fluxes of water and solutes across diverse environmental settings and large spatial scales. Regional analyses of observed streamflow data have led to advances in our knowledge of relations among land use, climate, and streamflow, with methodologies ranging from statistical assessments of multiple monitoring sites to the regionalization of the parameters of catchment-scale mechanistic simulation models. However, gaps remain in our understanding of the best ways to transfer the knowledge of hydrologic response and governing processes among locations, including methods for regionalizing streamflow measurements and model predictions. We developed an approach to predict variations in streamflow using the SPARROW (SPAtially Referenced Regression On Watershed attributes) modeling infrastructure, with mechanistic functions, mass conservation constraints, and statistical estimation of regional and sub-regional parameters. We used the model to predict discharge in the Susquehanna River Basin (SRB) under varying hydrological regimes that are representative of contemporary flow conditions. The resulting basin-scale water balance describes mean monthly flows in stream reaches throughout the entire SRB (represented at a 1:100,000 scale using the National Hydrologic Data network), with water supply and demand components that are inclusive of a range of hydrologic, climatic, and cultural properties (e.g., precipitation, evapotranspiration, soil and groundwater storage, runoff, baseflow, water use). We compare alternative models of varying complexity that reflect differences in the number and types of explanatory variables and functional expressions as well as spatial and temporal variability in the model parameters. Statistical estimation of the models reveals the levels of complexity that can be uniquely identified, subject to the information content and uncertainties of the hydrologic and climate measurements. Assessment of spatial variations in the model parameters and predictions provides an improved understanding of how much of the hydrologic response to land use, climate, and other properties is unique to specific locations versus more universally observed across catchments of the SRB. This approach advances understanding of water cycle variability at any location throughout the stream network, as a function of both landscape characteristics (e.g., soils, vegetation, land use) and external forcings (e.g., precipitation quantity and frequency). These improvements in predictions of streamflow dynamics will advance the ability to predict spatial and temporal variability in key solutes, such as nutrients, and their delivery to the Chesapeake Bay.
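
    As a minimal illustration of the kind of monthly water-balance bookkeeping described above, the Python sketch below runs a single-bucket model with precipitation, evapotranspiration, storage, saturation-excess runoff, and baseflow. It is a didactic stand-in only; SPARROW's mass-conservation structure and statistical parameter estimation are far richer, and all parameter values here are invented.

        import numpy as np

        def bucket_model(precip, pet, capacity=150.0, baseflow_coef=0.1):
            """precip, pet: monthly series [mm]; returns simulated streamflow [mm]."""
            storage, flows = capacity / 2, []
            for p, e in zip(precip, pet):
                storage += p - min(e, storage + p)      # ET limited by available water
                runoff = max(0.0, storage - capacity)   # saturation-excess runoff
                storage -= runoff
                baseflow = baseflow_coef * storage      # slow drainage from storage
                storage -= baseflow
                flows.append(runoff + baseflow)
            return np.array(flows)

        precip = np.array([90, 80, 85, 75, 95, 60, 55, 50, 65, 70, 85, 95], float)
        pet    = np.array([10, 15, 30, 55, 90, 120, 130, 115, 80, 45, 20, 10], float)
        print(np.round(bucket_model(precip, pet), 1))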

  20. Using an adaptive expertise lens to understand the quality of teachers' classroom implementation of computer-supported complex systems curricula in high school science

    NASA Astrophysics Data System (ADS)

    Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric

    2015-05-01

    Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to better understand teachers' classroom practices as they attempt to navigate myriad variables in the implementation of biology units that include working with computer simulations, and learning about and teaching through complex systems ideas. Sample: Research participants were three high school biology teachers, two females and one male, ranging in teaching experience from six to 16 years. Their teaching contexts also ranged in student achievement from 14-47% advanced science proficiency. Design and methods: We used a holistic multiple case study methodology and collected data during the 2011-2012 school year. Data sources include classroom observations, teacher and student surveys, and interviews. Data analyses and trustworthiness measures were conducted through qualitative mining of data sources and triangulation of findings. Results: We illustrate the characteristics of adaptive expertise of more or less successful teaching and learning when implementing complex systems curricula. We also demonstrate differences between case study teachers in terms of particular variables associated with adaptive expertise. Conclusions: This research contributes to scholarship on practices and professional development needed to better support teachers to teach through a complex systems pedagogical and curricular approach.

  1. Extension of optical lithography by mask-litho integration with computational lithography

    NASA Astrophysics Data System (ADS)

    Takigawa, T.; Gronlund, K.; Wiley, J.

    2010-05-01

    Wafer lithography process windows can be enlarged by using source mask co-optimization (SMO). Recently, SMO including freeform wafer scanner illumination sources has been developed. Freeform sources are generated by a programmable illumination system using a micro-mirror array or by custom Diffractive Optical Elements (DOE). The combination of freeform sources and complex masks generated by SMO shows an increased wafer lithography process window and reduced MEEF. Full-chip mask optimization using a source optimized by SMO can generate complex masks with small, variably sized sub-resolution assist features (SRAFs). These complex masks create challenges for accurate mask pattern writing and low false-defect inspection. The accuracy of the small, variably sized mask SRAF patterns is degraded by short-range mask process proximity effects. To achieve the accuracy needed for these complex masks, we developed a highly accurate mask process correction (MPC) capability. It is also difficult to achieve low false-defect inspection of complex masks with conventional mask defect inspection systems. A printability check system, Mask Lithography Manufacturability Check (M-LMC), was therefore developed and integrated with the 199-nm high-NA inspection system, NPI. M-LMC successfully identifies printable defects from the mass of raw defect images collected during the inspection of a complex mask. Long-range mask CD uniformity errors are compensated by scanner dose control: a mask CD uniformity error map obtained by a mask metrology system is used as input data to the scanner, improving wafer CD uniformity. As reviewed above, mask-litho integration technology with computational lithography is becoming increasingly important.

  2. Towards understanding the complexity of cardiovascular oscillations: Insights from information theory.

    PubMed

    Javorka, Michal; Krohova, Jana; Czippelova, Barbora; Turianikova, Zuzana; Lazarova, Zuzana; Wiszt, Radovan; Faes, Luca

    2018-07-01

    Cardiovascular complexity is a feature of healthy physiological regulation that stems from the simultaneous activity of several cardiovascular reflexes and other non-reflex physiological mechanisms. It is manifested in the rich dynamics characterizing spontaneous heart rate and blood pressure variability (HRV and BPV). The present study faces the challenge of disclosing the origin of short-term HRV and BPV from the statistical perspective offered by information theory. To dissect the physiological mechanisms giving rise to cardiovascular complexity in different conditions, measures of predictive information, information storage, information transfer, and information modification were applied to the beat-to-beat variability of heart period (HP), systolic arterial pressure (SAP), and respiratory volume recorded non-invasively in 61 healthy young subjects at supine rest and during head-up tilt (HUT) and mental arithmetic (MA). Information decomposition enabled the simultaneous assessment of several expected and newly inferred physiological phenomena, including: (i) the decreased complexity of HP during HUT and the increased complexity of SAP during MA; (ii) the suppressed cardiorespiratory information transfer, related to weakened respiratory sinus arrhythmia, under both challenges; (iii) the altered balance of the information transferred along the two arms of the cardiovascular loop during HUT, with larger baroreflex involvement and smaller feedforward mechanical effects; and (iv) an increased importance of direct respiratory effects on SAP during HUT, and on both HP and SAP during MA. We demonstrate that a decomposition of the information contained in cardiovascular oscillations can reveal subtle changes in system dynamics and improve our understanding of the complexity changes during physiological challenges. Copyright © 2018. Published by Elsevier Ltd.
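
    Measures of information transfer such as those above are often computed under a linear-Gaussian approximation, where transfer entropy reduces to a log-ratio of prediction-error variances. The Python sketch below implements this reduced form on synthetic series; it is an illustrative estimator, not the decomposition framework used in the study.

        import numpy as np

        def lag_matrix(x, p):
            # Row t contains [x_{t-1}, ..., x_{t-p}], for t = p .. n-1.
            return np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])

        def residual_variance(target, design):
            design = np.column_stack([np.ones(len(design)), design])  # intercept
            beta, *_ = np.linalg.lstsq(design, target, rcond=None)
            return (target - design @ beta).var()

        def gaussian_te(x, y, p=2):
            """TE(y -> x) in nats under a linear-Gaussian approximation:
            0.5 * ln(error variance from past x only / error variance from past x and y)."""
            target = x[p:]
            past_x = lag_matrix(x, p)
            past_xy = np.hstack([past_x, lag_matrix(y, p)])
            return 0.5 * np.log(residual_variance(target, past_x)
                                / residual_variance(target, past_xy))

        # Toy example: y drives x with a one-beat delay, so TE(y->x) > TE(x->y).
        rng = np.random.default_rng(1)
        y = rng.normal(size=2000)
        x = np.roll(y, 1) * 0.8 + rng.normal(scale=0.5, size=2000)
        print(gaussian_te(x, y), gaussian_te(y, x))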

  3. Applied Routh approximation

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.

    1978-01-01

    The Routh approximation technique for reducing the complexity of system models was applied in the frequency domain to a 16th-order state-variable model of the F100 engine and to a 43rd-order transfer-function model of a launch vehicle boost pump pressure regulator. The results motivate extending the frequency-domain formulation of the Routh method to the time domain in order to handle the state-variable formulation directly. The time-domain formulation was derived, and a characterization that specifies all possible Routh similarity transformations was given. The characterization was computed by solving two eigenvalue-eigenvector problems. The application of the time-domain Routh technique to the state-variable engine model is described, and some results are given. Additional computational problems are discussed, including an optimization procedure that can improve the approximation accuracy by taking advantage of the transformation characterization.

  4. A mismatch between population health literacy and the complexity of health information: an observational study.

    PubMed

    Rowlands, Gillian; Protheroe, Joanne; Winkley, John; Richardson, Marty; Seed, Paul T; Rudd, Rima

    2015-06-01

    Low health literacy is associated with poorer health and higher mortality. Complex health materials are a barrier to health. To assess the literacy and numeracy skills required to understand and use commonly used English health information materials, and to describe population skills in relation to these. An English observational study comparing health materials with national working-age population skills. Health materials were sampled using a health literacy framework. Competency thresholds to understand and use the materials were identified. The proportion of the population above and below these thresholds, and the sociodemographic variables associated with a greater risk of being below the thresholds, were described. Sixty-four health materials were sampled. Two competency thresholds were identified: text (literacy) only, and text + numeracy; 2515/5795 participants (43%) were below the text-only threshold, while 2905/4767 (61%) were below the text + numeracy threshold. Univariable analyses of social determinants of health showed that those groups more at risk of socioeconomic deprivation had higher odds of being below the health literacy competency threshold than those at lower risk of deprivation. Multivariable analysis resulted in some variables becoming non-significant or reduced in effect. Levels of low health literacy mirror those found in other industrialised countries, with a mismatch between the complexity of health materials and the skills of the English adult working-age population. Those most in need of health information have the least access to it. Efficacious strategies are building population skills, improving health professionals' communication, and improving written health information. © British Journal of General Practice 2015.

  5. COED Transactions, Vol. IX, No. 3, March 1977. Evaluation of a Complex Variable Using Analog/Hybrid Computation Techniques.

    ERIC Educational Resources Information Center

    Marcovitz, Alan B., Ed.

    Described is the use of an analog/hybrid computer installation to study those physical phenomena that can be described through the evaluation of an algebraic function of a complex variable. This is an alternative way to study such phenomena on an interactive graphics terminal. The typical problem used, involving complex variables, is that of…

  6. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    USGS Publications Warehouse

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

    The U.S. Geological Survey National Water-Quality Assessment seeks to map the estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that the methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporating these resources into process-based models at differing levels of detail could affect the resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimates of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulating vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulating heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system than simulating uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, it generally seemed to follow a similar pattern to that of the water levels. Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate whether these model design considerations are similarly important for the primary modeling objective: to simulate reasonable groundwater age distributions.

  7. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    NASA Astrophysics Data System (ADS)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of the high-fidelity model. Second, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient of an aircraft are provided to demonstrate the approximation capability of the proposed approach against three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
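
    The central trick of hierarchical kriging, using a scaled low-fidelity prediction as the trend of the high-fidelity model, can be sketched with scikit-learn. The toy functions, sample sizes, and the simple least-squares scaling factor below are illustrative assumptions; the polynomial response surface and sequential sampling criterion of ASM-IHK are not reproduced here.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        # Toy pair of models: cheap low-fidelity (LF) and expensive high-fidelity (HF).
        def lf(x): return 0.5 * np.sin(8 * x) + 0.2 * x
        def hf(x): return np.sin(8 * x) + 0.3 * x

        X_lf = np.linspace(0, 1, 40).reshape(-1, 1)   # many cheap LF samples
        X_hf = np.linspace(0, 1, 6).reshape(-1, 1)    # few expensive HF samples

        # Step 1: fit a GP to the abundant LF data.
        gp_lf = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(0.2))
        gp_lf.fit(X_lf, lf(X_lf).ravel())

        # Step 2: hierarchical kriging idea -- use the scaled LF prediction as the
        # trend of the HF model, and fit a second GP to the remaining discrepancy.
        trend = gp_lf.predict(X_hf)
        rho = np.linalg.lstsq(trend.reshape(-1, 1), hf(X_hf).ravel(), rcond=None)[0][0]
        gp_disc = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(0.2))
        gp_disc.fit(X_hf, hf(X_hf).ravel() - rho * trend)

        # VF prediction = scaled LF trend + discrepancy correction.
        X_test = np.linspace(0, 1, 5).reshape(-1, 1)
        vf_pred = rho * gp_lf.predict(X_test) + gp_disc.predict(X_test)
        print(np.round(vf_pred, 3), np.round(hf(X_test).ravel(), 3))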

  8. Optimisation by hierarchical search

    NASA Astrophysics Data System (ADS)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
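
    The flavour of optimising groups of variables while the rest are frozen can be seen in a flat block-coordinate descent on a quadratic cost, sketched below in Python. This is a generic illustration of group-wise optimisation, not the authors' hierarchical algorithm, and the cost function is an arbitrary toy.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy quadratic cost f(x) = x^T Q x with a random positive-definite Q.
        n = 16
        A = rng.normal(size=(n, n))
        Q = A @ A.T + n * np.eye(n)
        cost = lambda x: x @ Q @ x

        x = rng.normal(size=n)
        blocks = np.array_split(np.arange(n), 4)   # groups of variables

        # Repeatedly minimise the cost over one block while freezing the others;
        # for a quadratic cost the block subproblem has a closed-form solution:
        # Q[b,b] x_b = -Q[b,others] x_others.
        for sweep in range(10):
            for b in blocks:
                others = np.setdiff1d(np.arange(n), b)
                x[b] = np.linalg.solve(Q[np.ix_(b, b)], -Q[np.ix_(b, others)] @ x[others])
            print(f"sweep {sweep}: cost = {cost(x):.4e}")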

  9. The perception of complex pitch in cochlear implants: A comparison of monopolar and tripolar stimulation.

    PubMed

    Fielden, Claire A; Kluk, Karolina; Boyle, Patrick J; McKay, Colette M

    2015-10-01

    Cochlear implant listeners typically perform poorly in tasks of complex pitch perception (e.g., musical pitch and voice pitch). One explanation is that wide current spread during implant activation creates channel interactions that may interfere with perception of temporal fundamental frequency information contained in the amplitude modulations within channels. Current focusing using a tripolar mode of stimulation has been proposed as a way of reducing channel interactions, minimising spread of excitation and potentially improving place and temporal pitch cues. The present study evaluated the effect of mode in a group of cochlear implant listeners on a pitch ranking task using male and female singing voices separated by either a half or a quarter octave. Results were variable across participants, but on average, pitch ranking was at chance level when the pitches were a quarter octave apart and improved when the difference was a half octave. No advantage was observed for tripolar over monopolar mode at either pitch interval, suggesting that previously published psychophysical advantages for focused modes may not translate into improvements in complex pitch ranking. Evaluation of the spectral centroid of the stimulation pattern, plus a lack of significant difference between male and female voices, suggested that participants may have had difficulty in accessing temporal pitch cues in either mode.

  10. A process-based hierarchical framework for monitoring glaciated alpine headwaters

    USGS Publications Warehouse

    Weekes, Anne A.; Torgersen, Christian E.; Montgomery, David R.; Woodward, Andrea; Bolton, Susan M.

    2012-01-01

    Recent studies have demonstrated the geomorphic complexity and wide range of hydrologic regimes found in alpine headwater channels that provide complex habitats for aquatic taxa. These geohydrologic elements are fundamental to better understand patterns in species assemblages and indicator taxa and are necessary to aquatic monitoring protocols that aim to track changes in physical conditions. Complex physical variables shape many biological and ecological traits, including life history strategies, but these mechanisms can only be understood if critical physical variables are adequately represented within the sampling framework. To better align sampling design protocols with current geohydrologic knowledge, we present a conceptual framework that incorporates regional-scale conditions, basin-scale longitudinal profiles, valley-scale glacial macroform structure, valley segment-scale (i.e., colluvial, alluvial, and bedrock), and reach-scale channel types. At the valley segment- and reach-scales, these hierarchical levels are associated with differences in streamflow and sediment regime, water source contribution and water temperature. Examples of linked physical-ecological hypotheses placed in a landscape context and a case study using the proposed framework are presented to demonstrate the usefulness of this approach for monitoring complex temporal and spatial patterns and processes in glaciated basins. This approach is meant to aid in comparisons between mountain regions on a global scale and to improve management of potentially endangered alpine species affected by climate change and other stressors.

  11. Syntactic Complexity, Lexical Variation and Accuracy as a Function of Task Complexity and Proficiency Level in L2 Writing and Speaking

    ERIC Educational Resources Information Center

    Kuiken, Folkert; Vedder, Ineke

    2012-01-01

    The research project reported in this chapter consists of three studies in which syntactic complexity, lexical variation and fluency appear as dependent variables. The independent variables are task complexity and proficiency level, as the three studies investigate the effect of task complexity on the written and oral performance of L2 learners of…

  12. National-scale aboveground biomass geostatistical mapping with FIA inventory and GLAS data: Preparation for sparsely sampled lidar assisted forest inventory

    NASA Astrophysics Data System (ADS)

    Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.

    2017-12-01

    Upcoming satellite lidar missions, such as GEDI and IceSat-2, are designed to collect laser altimetry data from space for narrow bands along orbital tracks. As a result, lidar metric sets derived from these sources will not provide complete spatial coverage. This lack of complete coverage, or sparsity, means traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian Process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales ranging from individual pixel estimates of AGB density to total AGB for the continental US, with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and IceSat-2. Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the potential to improve forest AGB accounting certainty and provide maps for post-model-fitting analysis of the spatial distribution of AGB.

  13. Use of neural networks to model complex immunogenetic associations of disease: human leukocyte antigen impact on the progression of human immunodeficiency virus infection.

    PubMed

    Ioannidis, J P; McQueen, P G; Goedert, J J; Kaslow, R A

    1998-03-01

    Complex immunogenetic associations of disease involving a large number of gene products are difficult to evaluate with traditional statistical methods and may require complex modeling. The authors evaluated the performance of feed-forward backpropagation neural networks in predicting rapid progression to acquired immunodeficiency syndrome (AIDS) for patients with human immunodeficiency virus (HIV) infection on the basis of major histocompatibility complex variables. Networks were trained on data from patients from the Multicenter AIDS Cohort Study (n = 139) and then validated on patients from the DC Gay cohort (n = 102). The outcome of interest was rapid disease progression, defined as progression to AIDS in <6 years from seroconversion. Human leukocyte antigen (HLA) variables were selected as network inputs with multivariate regression and a previously described algorithm selecting markers with extreme point estimates for progression risk. Network performance was compared with that of logistic regression. Networks with 15 HLA inputs and a single hidden layer of five nodes achieved a sensitivity of 87.5% and specificity of 95.6% in the training set, vs. 77.0% and 76.9%, respectively, achieved by logistic regression. When validated on the DC Gay cohort, networks averaged a sensitivity of 59.1% and specificity of 74.3%, vs. 53.1% and 61.4%, respectively, for logistic regression. Neural networks offer further support to the notion that HIV disease progression may be dependent on complex interactions between different class I and class II alleles and transporters associated with antigen processing variants. The effect in the current models is of moderate magnitude, and more data as well as other host and pathogen variables may need to be considered to improve the performance of the models. Artificial intelligence methods may complement linear statistical methods for evaluating immunogenetic associations of disease.
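
    A feed-forward backpropagation network of the size described (15 inputs, one hidden layer of five nodes, one output) is small enough to write out directly. The numpy sketch below trains such a network on synthetic binary marker data; the input encoding and labels are invented stand-ins for the HLA variables and progression outcome, not the study's data.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic stand-in for 15 binary HLA-marker inputs and a binary
        # rapid-progression label (the real encoding is not reproduced here).
        X = rng.integers(0, 2, size=(200, 15)).astype(float)
        w_true = rng.normal(size=15)
        y = (X @ w_true + rng.normal(scale=0.5, size=200) > 0).astype(float)

        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

        # One hidden layer of 5 nodes, as in the networks described above.
        W1 = rng.normal(scale=0.1, size=(15, 5)); b1 = np.zeros(5)
        W2 = rng.normal(scale=0.1, size=5);       b2 = 0.0
        lr = 0.5

        for epoch in range(2000):
            # Forward pass.
            h = sigmoid(X @ W1 + b1)            # hidden activations, shape (200, 5)
            p = sigmoid(h @ W2 + b2)            # predicted probabilities, shape (200,)
            # Backward pass: gradients of the mean cross-entropy loss.
            dz2 = (p - y) / len(y)
            dW2 = h.T @ dz2; db2 = dz2.sum()
            dh = np.outer(dz2, W2) * h * (1 - h)
            dW1 = X.T @ dh; db1 = dh.sum(axis=0)
            # Gradient-descent update.
            W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

        p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
        print("training accuracy:", ((p > 0.5) == y).mean())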

  14. The 3of5 web application for complex and comprehensive pattern matching in protein sequences.

    PubMed

    Seiler, Markus; Mehrle, Alexander; Poustka, Annemarie; Wiemann, Stefan

    2006-03-16

    The identification of patterns in biological sequences is a key challenge in genome analysis and in proteomics. Frequently such patterns are complex and highly variable, especially in protein sequences. They are commonly described using regular expression (RegEx) terms because of the user-friendly syntax. Limitations arise as patterns grow more complex, creating a need for enhanced matching capabilities. This is especially true for patterns containing ambiguous characters and positions and/or length ambiguities. We have implemented the 3of5 web application in order to enable complex pattern matching in protein sequences. 3of5 is named after a special use of its main feature, the novel n-of-m pattern type. This feature allows for an extensive specification of variable patterns where the individual elements may vary in their position, order, and content within a defined stretch of sequence. The number of distinct elements can be constrained by operators, and individual characters may be excluded. The n-of-m pattern type can be combined with common regular expression terms and thus also allows for a comprehensive description of complex patterns. 3of5 increases the fidelity of pattern matching and finds all possible solutions in protein sequences in cases of length-ambiguous patterns, instead of simply reporting the longest or shortest hits. Grouping and combined search for patterns provide a hierarchical arrangement of larger pattern sets. The algorithm is implemented as an internet application and is freely accessible. The application is available at http://dkfz.de/mga2/3of5/3of5.html. The 3of5 application offers an extended vocabulary for the definition of search patterns and thus allows the user to comprehensively specify and identify peptide patterns with variable elements. The n-of-m pattern type offers improved accuracy for pattern matching in combination with the ability to find all solutions, without compromising the user friendliness of regular expression terms.
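
    The n-of-m idea can be sketched in a few lines of Python. The toy matcher below (sequence, residue classes, and thresholds are all invented) enumerates every qualifying stretch rather than reporting only the longest or shortest hit:

        def n_of_m_matches(seq, elements, n, window):
            # report every stretch of `window` residues containing at least
            # n of the m element classes (each class: a string of allowed residues)
            hits = []
            for start in range(len(seq) - window + 1):
                stretch = seq[start:start + window]
                found = sum(any(c in stretch for c in elem) for elem in elements)
                if found >= n:
                    hits.append((start, stretch, found))
            return hits

        seq = "MKLSDEECPHHARNDKLLYWFPGQ"          # hypothetical protein fragment
        elements = ["C", "H", "DE", "KR", "W"]   # five residue classes (m = 5)
        for start, stretch, found in n_of_m_matches(seq, elements, n=3, window=6):
            print(start, stretch, found)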

  15. Effects of 4D-Var data assimilation using remote sensing precipitation products in a WRF model over the complex Heihe River Basin

    NASA Astrophysics Data System (ADS)

    Pan, Xiaoduo; Li, Xin; Cheng, Guodong

    2017-04-01

    Traditionally, ground-based, in situ observations, remote sensing, and regional climate modeling, individually, cannot provide the high-quality precipitation data required for hydrological prediction, especially over complex terrain. Data assimilation techniques are often used to assimilate ground observations and remote sensing products into models for dynamic downscaling. In this study, the Weather Research and Forecasting (WRF) model was used to assimilate two satellite precipitation products (TRMM 3B42 and FY-2D) using the 4D-Var data assimilation method. The results show that the assimilation of remote sensing precipitation products can improve the initial WRF fields of humidity and temperature, thereby improving precipitation forecasting and decreasing the spin-up time. Hence, assimilating TRMM and FY-2D remote sensing precipitation products using WRF 4D-Var can be viewed as a positive step toward improving the accuracy and lead time of numerical weather prediction models, particularly for short-term weather forecasting. Future work is proposed to assimilate a suite of remote sensing data, e.g., the combination of precipitation and soil moisture data, into a WRF model to improve 7-8 day forecasts of precipitation and other atmospheric variables.
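
    The variational idea can be made concrete with a scalar toy problem; this is not WRF's 4D-Var, and the linear model, error variances, and observation times below are assumptions for the sketch. The analysis state minimizes a cost that combines a background misfit with observation misfits accumulated over the assimilation window.

        import numpy as np
        from scipy.optimize import minimize

        a, nsteps = 0.95, 10                    # toy linear dynamics x_{t+1} = a x_t
        def forecast(x0):
            xs = [x0]
            for _ in range(nsteps):
                xs.append(a * xs[-1])
            return np.array(xs)

        xb, B, R = 1.0, 0.25, 0.01              # background, its variance, obs variance
        t_obs = np.array([3, 6, 9])             # observation times in the window
        rng = np.random.default_rng(0)
        y = forecast(1.4)[t_obs] + rng.normal(0, np.sqrt(R), len(t_obs))

        def cost(v):
            # J(x0) = background misfit + observation misfits over the window
            xs = forecast(v[0])
            return (v[0] - xb) ** 2 / B + np.sum((xs[t_obs] - y) ** 2 / R)

        xa = minimize(cost, x0=[xb]).x[0]
        print(f"background {xb:.2f} -> analysis {xa:.3f} (truth 1.40)")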

  16. Application of geologic-mathematical 3D modeling for complex structure deposits by the example of Lower-Cretaceous period depositions in the Western Ust-Balykh oil field (Khanty-Mansiysk Autonomous District)

    NASA Astrophysics Data System (ADS)

    Perevertailo, T.; Nedolivko, N.; Prisyazhnyuk, O.; Dolgaya, T.

    2015-11-01

    The complex structure of the Lower-Cretaceous formation, using the example of the reservoir BC101 in the Western Ust-Balykh Oil Field (Khanty-Mansiysk Autonomous District), has been studied. Reservoir range relationships have been identified. A 3D geologic-mathematical modeling technique considering the heterogeneity and variability of a natural reservoir structure has been suggested. To improve the integrity of the modeled geological structure of the deposit, methods of mathematical statistics were applied, which, in turn, made it possible to obtain equally probable models from the same input data and to account for the formation conditions of the reservoir rocks and cap rocks.

  17. Enhanced Requirements for Assessment in a Competency-Based, Time-Variable Medical Education System.

    PubMed

    Gruppen, Larry D; Ten Cate, Olle; Lingard, Lorelei A; Teunissen, Pim W; Kogan, Jennifer R

    2018-03-01

    Competency-based, time-variable medical education has reshaped the perceptions and practices of teachers, curriculum designers, faculty developers, clinician educators, and program administrators. This increasingly popular approach highlights the fact that learning among different individuals varies in duration, foundation, and goal. Time variability places particular demands on the assessment data that are so necessary for making decisions about learner progress. These decisions may be formative (e.g., feedback for improvement) or summative (e.g., decisions about advancing a student). This article identifies challenges to collecting assessment data and to making assessment decisions in a time-variable system. These challenges include managing assessment data, defining and making valid assessment decisions, innovating in assessment, and modeling the considerable complexity of assessment in real-world settings and richly interconnected social systems. There are hopeful signs of creativity in assessment both from researchers and practitioners, but the transition from a traditional to a competency-based medical education system will likely continue to create much controversy and offer opportunities for originality and innovation in assessment.

  18. Systems engineering and integration: Cost estimation and benefits analysis

    NASA Technical Reports Server (NTRS)

    Dean, ED; Fridge, Ernie; Hamaker, Joe

    1990-01-01

    Space Transportation Avionics hardware and software cost has traditionally been estimated in Phase A and B using cost techniques which predict cost as a function of various cost-predictive variables such as weight, lines of code, functions to be performed, quantities of test hardware, quantities of flight hardware, design and development heritage, complexity, etc. The output of such analyses has been life cycle costs, economic benefits, and related data. The major objectives of Cost Estimation and Benefits analysis are twofold: (1) to play a role in the evaluation of potential new space transportation avionics technologies, and (2) to benefit from emerging technological innovations. Both aspects of cost estimation and technology are discussed here. The role of cost analysis in the evaluation of potential technologies should be one of offering additional quantitative and qualitative information to aid decision-making. The cost analysis process needs to be fully integrated into the design process in such a way that cost trades, optimizations, and sensitivities are understood. Current hardware cost models tend to primarily use weights, functional specifications, quantities, design heritage, and complexity as metrics to predict cost. Software models mostly use functionality, volume of code, heritage, and complexity as cost-descriptive variables. Basic research needs to be initiated to develop metrics more responsive to the trades which are required for future launch vehicle avionics systems. These would include cost estimating capabilities that are sensitive to technological innovations such as improved materials and fabrication processes, computer-aided design and manufacturing, self-checkout, and many others. In addition to basic cost estimating improvements, the process must be sensitive to the fact that no cost estimate can be quoted without also quoting a confidence associated with the estimate. In order to achieve this, better cost risk evaluation techniques are needed, as well as improved usage of risk data by decision-makers. More and better ways to display and communicate cost and cost risk to management are required.

  19. multiUQ: An intrusive uncertainty quantification tool for gas-liquid multiphase flows

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2017-11-01

    Uncertainty quantification (UQ) can improve our understanding of the sensitivity of gas-liquid multiphase flows to variability in inflow conditions and fluid properties, creating a valuable tool for engineers. While non-intrusive UQ methods (e.g., Monte Carlo) are simple and robust, the cost associated with these techniques can render them impractical. In contrast, intrusive UQ techniques modify the governing equations by replacing deterministic variables with stochastic variables, adding complexity but making UQ cost-effective. Our numerical framework, called multiUQ, introduces an intrusive UQ approach for gas-liquid flows, leveraging a polynomial chaos expansion of the stochastic variables: density, momentum, pressure, viscosity, and surface tension. The gas-liquid interface is captured using a conservative level set approach, including a modified reinitialization equation which is robust and quadrature-free. A least-squares method is leveraged to compute the stochastic interface normal and curvature needed in the continuum surface force method for surface tension. The solver is tested by applying uncertainty to one or two variables and verifying results against the Monte Carlo approach. NSF Grant #1511325.
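
    A one-variable, non-intrusive polynomial chaos sketch (a caricature of the intrusive approach described above; the response function and expansion order are invented) shows how expansion coefficients reproduce Monte Carlo moments with far fewer model evaluations:

        import math
        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        g = lambda xi: np.exp(0.3 * xi)          # toy response to one uncertain input

        # Project g onto probabilists' Hermite polynomials He_k (orthogonal under N(0,1))
        nodes, weights = hermegauss(20)          # quadrature for weight exp(-x^2/2)
        weights = weights / np.sqrt(2.0 * np.pi) # -> expectation under the standard normal
        coeffs = [np.sum(weights * g(nodes) * hermeval(nodes, [0.0] * k + [1.0]))
                  / math.factorial(k) for k in range(6)]

        mean_pce = coeffs[0]
        var_pce = sum(c * c * math.factorial(k) for k, c in enumerate(coeffs) if k)

        xi = np.random.default_rng(0).standard_normal(200_000)   # Monte Carlo reference
        print(f"mean: PCE {mean_pce:.5f}  MC {g(xi).mean():.5f}")
        print(f"var : PCE {var_pce:.5f}  MC {g(xi).var():.5f}")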

  20. Resolving combinatorial ambiguities in dilepton t t¯ event topologies with constrained M2 variables

    NASA Astrophysics Data System (ADS)

    Debnath, Dipsikha; Kim, Doojin; Kim, Jeong Han; Kong, Kyoungchul; Matchev, Konstantin T.

    2017-10-01

    We advocate the use of on-shell constrained M2 variables in order to mitigate the combinatorial problem in supersymmetry-like events with two invisible particles at the LHC. We show that in comparison to other approaches in the literature, the constrained M2 variables provide superior ansätze for the unmeasured invisible momenta and therefore can be usefully applied to resolve combinatorial ambiguities. We illustrate our procedure with the example of dilepton t t¯ events. We critically review the existing methods based on the Cambridge MT2 variable and MAOS reconstruction of invisible momenta, and show that the algorithm can be simplified without loss of sensitivity, due to a perfect correlation between events with complex solutions for the invisible momenta and events exhibiting a kinematic endpoint violation. Then we demonstrate that the efficiency for selecting the correct partition is further improved by utilizing the M2 variables instead. Finally, we also consider the general case when the underlying mass spectrum is unknown and no kinematic endpoint information is available.

  1. Co-optimization of lithographic and patterning processes for improved EPE performance

    NASA Astrophysics Data System (ADS)

    Maslow, Mark J.; Timoshkov, Vadim; Kiers, Ton; Jee, Tae Kwon; de Loijer, Peter; Morikita, Shinya; Demand, Marc; Metz, Andrew W.; Okada, Soichiro; Kumar, Kaushik A.; Biesemans, Serge; Yaegashi, Hidetami; Di Lorenzo, Paolo; Bekaert, Joost P.; Mao, Ming; Beral, Christophe; Larivière, Stephane

    2017-03-01

    Complementary lithography is already being used for advanced logic patterns. The tight pitches for 1D Metal layers are expected to be created using spacer-based multiple-patterning ArF-i exposures, and the more complex cut/block patterns are made using EUV exposures. At the same time, control requirements of CDU, pattern shift, and pitch-walk are approaching sub-nanometer levels to meet edge placement error (EPE) requirements. Local variability, such as Line Edge Roughness (LER), local CDU, and Local Placement Error (LPE), are dominant factors in the total edge placement error budget. In the lithography process, improving the imaging contrast when printing the core pattern has been shown to improve the local variability. In the etch process, it has been shown that the fusion of atomic-level etching and deposition can also improve these local variations. Co-optimization of lithography and etch processing is expected to further improve the performance over individual optimizations alone. To meet the scaling requirements and keep process complexity to a minimum, EUV is increasingly seen as the platform for delivering the exposures for both the grating and the cut/block patterns beyond N7. In this work, we evaluated the overlay and pattern fidelity of an EUV block printed in a negative-tone resist on an ArF-i SAQP grating. High-order overlay modeling and corrections during the exposure can reduce overlay error after development, a significant component of the total EPE. During etch, additional degrees of freedom are available to improve the pattern placement error in single-layer processes. Process control of the advanced-pitch, nanoscale multi-patterning techniques described above is exceedingly complicated in a high-volume manufacturing (HVM) environment. Incorporating potential patterning optimizations into both design and HVM controls for the lithography process is expected to bring a combined benefit over individual optimizations. In this work we show the EPE performance improvement for a 32nm pitch SAQP + block patterned Metal 2 layer by co-optimizing the lithography and etch processes. Recommendations for further improvements and alternative processes are given.

  2. Bridging gaps: On the performance of airborne LiDAR to model wood mouse-habitat structure relationships in pine forests.

    PubMed

    Jaime-González, Carlos; Acebes, Pablo; Mateos, Ana; Mezquida, Eduardo T

    2017-01-01

    LiDAR technology has firmly contributed to strengthening the knowledge of habitat structure-wildlife relationships, though there is an evident bias towards flying vertebrates. To bridge this gap, we investigated and compared the performance of LiDAR and field data to model habitat preferences of the wood mouse (Apodemus sylvaticus) in a Mediterranean high mountain pine forest (Pinus sylvestris). We recorded nine field and 13 LiDAR variables that were summarized by means of Principal Component Analyses (PCA). We then analyzed the wood mouse's habitat preferences using three different models based on: (i) field PC predictors, (ii) LiDAR PC predictors, and (iii) both sets of predictors in a combined model, including a variance partitioning analysis. Elevation was also included as a predictor in the three models. Our results indicate that LiDAR-derived variables were better predictors than field-based variables. The model combining both data sets slightly improved the predictive power of the model. Field-derived variables indicated that the wood mouse was positively influenced by the gradient of increasing shrub cover and negatively affected by elevation. Regarding LiDAR data, two LiDAR PCs, i.e. gradients in canopy openness and complexity in forest vertical structure, positively influenced the wood mouse, although elevation interacted negatively with the complexity in vertical structure, indicating the wood mouse's preference for plots at lower elevations but with complex forest vertical structure. The combined model was similar to the LiDAR-based model and included the gradient of shrub cover measured in the field. Variance partitioning showed that LiDAR-based variables, together with elevation, were the most important predictors and that part of the variation explained by shrub cover was shared. LiDAR-derived variables were good surrogates of the environmental characteristics explaining habitat preferences of the wood mouse. Our LiDAR metrics represented structural features of the forest patch, such as the presence and cover of shrubs, as well as other characteristics likely including time since perturbation, food availability, and predation risk. Our results suggest that LiDAR is a promising technology for further exploring habitat preferences of small mammal communities.

  3. Atmospheric icing of structures: Observations and simulations

    NASA Astrophysics Data System (ADS)

    Ágústsson, H.; Elíasson, Á. J.; Thorsteins, E.; Rögnvaldsson, Ó.; Ólafsson, H.

    2012-04-01

    This study compares observed icing on a test span in complex orography at Hallormsstaðaháls (575 m) in East Iceland with parameterized icing based on an icing model and dynamically downscaled weather at high horizontal resolution. Four icing events have been selected from an extensive dataset of observed atmospheric icing in Iceland. A total of 86 test spans have been erected since 1972 at 56 locations in complex terrain, with more than 1000 icing events documented. The events used here have peak observed ice loads between 4 and 36 kg/m. Most of the ice accretion is in-cloud icing, but it may partly be mixed with freezing drizzle and wet snow icing. The calculation of atmospheric icing is made in two steps. First, the atmospheric data are created by dynamically downscaling the ECMWF analysis to high resolution using the non-hydrostatic mesoscale Advanced Research WRF model. Horizontal resolutions of 9, 3, 1 and 0.33 km are necessary to allow the atmospheric model to correctly reproduce local weather in the complex terrain of Iceland. Second, the Makkonen model is used to calculate the ice accretion rate on the conductors based on the simulated temperature, wind, cloud, and precipitation variables from the atmospheric data. In general, the atmospheric model correctly simulates the atmospheric variables, and icing calculations based on these variables correctly identify the observed icing events, but underestimate the load because the simulated ice accretion is too slow. This is most obvious when the temperature is slightly below 0°C and the observed icing is most intense. The model results improve significantly when additional weather observations from an upstream weather station are used to nudge the atmospheric model. However, the large variability in the simulated atmospheric variables results in high temporal and spatial variability in the calculated ice accretion. Furthermore, the icing model is highly sensitive to the droplet size, and some of the icing may be due to freezing drizzle or wet snow instead of in-cloud icing of super-cooled droplets. In addition, the icing model (Makkonen) may not be accurate for the highest icing loads observed.
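
    The accretion-rate core of the Makkonen model is dM/dt = α1·α2·α3·w·V·A, the product of the collision, sticking, and accretion efficiencies with the liquid water content, wind speed, and collecting area. The sketch below integrates it over a mock event; the constant efficiencies, hourly forcing, and conductor geometry are placeholder assumptions, whereas the full model computes the efficiencies from droplet size, wind, and the heat balance.

        import numpy as np

        def makkonen_rate(w, V, D, a1=0.5, a2=1.0, a3=1.0):
            # dM/dt = a1*a2*a3 * w * V * A, with A = D per unit length of cylinder
            return a1 * a2 * a3 * w * V * D

        w = np.array([2e-4, 4e-4, 5e-4, 3e-4, 1e-4])   # liquid water content, kg/m^3
        V = np.array([8.0, 10.0, 12.0, 9.0, 6.0])      # wind speed, m/s

        rho_ice, d0 = 700.0, 0.03      # ice density (kg/m^3), bare conductor diameter (m)
        M, D = 0.0, d0                 # accreted load (kg/m), current iced diameter (m)
        for wi, Vi in zip(w, V):
            M += makkonen_rate(wi, Vi, D) * 3600.0               # integrate over 1 h
            D = np.sqrt(d0**2 + 4.0 * M / (np.pi * rho_ice))     # uniform ice sleeve
        print(f"event load ~ {M:.2f} kg/m, iced diameter ~ {D*100:.1f} cm")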

  4. Cleft Lip Repair, Nasoalveolar Molding, and Primary Cleft Rhinoplasty.

    PubMed

    Bhuskute, Aditi A; Tollefson, Travis T

    2016-11-01

    Cleft lip and palate together constitute the fourth most common congenital birth defect. Management requires multidisciplinary care owing to the effects of these clefts on midface growth, dentition, Eustachian tube function, and lip and nasal cosmesis. Repair requires planning, but can be performed systematically to reduce variability of outcomes. The use of primary rhinoplasty at the time of cleft lip repair can improve nasal symmetry and reduce nasal deformity. The use of nasoalveolar molding, ranging from lip taping to preoperative infant orthopedics, has played an important role in improving the functional and cosmetic results of cleft lip repair. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Maximally Informative Statistics for Localization and Mapping

    NASA Technical Reports Server (NTRS)

    Deans, Matthew C.

    2001-01-01

    This paper presents an algorithm for localization and mapping for a mobile robot using monocular vision and odometry as its means of sensing. The approach uses the Variable State Dimension Filter (VSDF) framework to combine aspects of Extended Kalman filtering and nonlinear batch optimization. This paper describes two primary improvements to the VSDF. The first is to use an interpolation scheme based on Gaussian quadrature to linearize measurements rather than relying on analytic Jacobians. The second is to replace the inverse covariance matrix in the VSDF with its Cholesky factor to reduce the computational complexity. Results of applying the filter to the problem of localization and mapping with omnidirectional vision are presented.
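
    The second improvement can be demonstrated in isolation: solving linear systems through a Cholesky factor avoids ever forming an explicit inverse covariance. This is a generic linear-algebra sketch, not the VSDF code.

        import numpy as np
        from scipy.linalg import cho_factor, cho_solve

        rng = np.random.default_rng(0)
        n = 200
        A = rng.standard_normal((n, n))
        P = A @ A.T + n * np.eye(n)          # SPD stand-in for an inverse covariance
        b = rng.standard_normal(n)

        x_inv = np.linalg.inv(P) @ b         # explicit inverse: costly, least stable

        c, low = cho_factor(P)               # P = L L^T, factored once
        x_chol = cho_solve((c, low), b)      # then two cheap triangular solves

        print(np.allclose(x_inv, x_chol))    # same solution, better complexity profile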

  6. Variable bright-darkfield-contrast, a new illumination technique for improved visualizations of complex structured transparent specimens.

    PubMed

    Piper, Timm; Piper, Jörg

    2012-04-01

    Variable bright-darkfield contrast (VBDC) is a new technique in light microscopy which promises significant improvements in the imaging of transparent colorless specimens, especially those characterized by a high regional thickness and a complex three-dimensional architecture. By a particular light pathway, a brightfield-like and a darkfield-like partial image are simultaneously superimposed, so that the brightfield-like absorption image based on the principal zeroth-order maximum interferes with the darkfield-like reflection image based on the secondary maxima. The background brightness and character of the resulting image can be continuously modulated from a brightfield-dominated to a darkfield-dominated appearance. When the weighting of the dark- and brightfield components is balanced, a medium background brightness results, showing the specimen in a phase- or interference-contrast-like manner. Specimens can be illuminated either axially/concentrically or obliquely/eccentrically. In oblique illumination, the angle of incidence and grade of eccentricity can be continuously changed. The condenser aperture diaphragm can be used to improve the image quality in the same manner as usual in standard brightfield illumination. By this means, the illumination can be optimally adjusted to the specific properties of the specimen. In VBDC, the image contrast is higher than in normal brightfield illumination, blooming and scattering are lower than in standard darkfield examinations, and any haloing is significantly reduced or absent. Although axial resolution and depth of field are higher than in concurrent standard techniques, the lateral resolution is not visibly reduced. Three-dimensional structures, reliefs, and fine textures can be perceived with superior clarity. Copyright © 2011 Wiley-Liss, Inc.

  7. Introducing a new semi-active engine mount using force controlled variable stiffness

    NASA Astrophysics Data System (ADS)

    Azadi, Mojtaba; Behzadipour, Saeed; Faulkner, Gary

    2013-05-01

    This work introduces a new concept in designing semi-active engine mounts. Engine mounts are under continuous development to provide better and more cost-effective engine vibration control. Passive engine mounts do not provide a satisfactory solution. Available semi-active and active mounts provide better solutions, but they are more complex and expensive. The variable stiffness engine mount (VSEM) is a semi-active engine mount with a simple ON-OFF control strategy. However, unlike available semi-active engine mounts that work based on damping change, the VSEM works based on static stiffness change, by using a new fast-response, force-controlled variable spring. The VSEM is an improved version of the vibration mount introduced by the authors in their previous work. The results showed significant performance improvements over a passive rubber mount. The VSEM also provides better vibration control than a hydromount at idle speed. Low hysteresis and the ability to be modelled by a linear model at low frequencies are the advantages of the VSEM over the vibration isolator introduced earlier and over available hydromounts. These characteristics facilitate the use of the VSEM in the automotive industry; however, further evaluation and development are needed for this purpose.

  8. Modeling the probability of arsenic in groundwater in New England as a tool for exposure assessment.

    PubMed

    Ayotte, Joseph D; Nolan, Bernard T; Nuckols, John R; Cantor, Kenneth P; Robinson, Gilpin R; Baris, Dalsu; Hayes, Laura; Karagas, Margaret; Bress, William; Silverman, Debra T; Lubin, Jay H

    2006-06-01

    We developed a process-based model to predict the probability of arsenic exceeding 5 µg/L in drinking water wells in New England bedrock aquifers. The model is being used for exposure assessment in an epidemiologic study of bladder cancer. One important study hypothesis that may explain increased bladder cancer risk is elevated concentrations of inorganic arsenic in drinking water. In eastern New England, 20-30% of private wells exceed the arsenic drinking water standard of 10 µg/L. Our predictive model significantly improves the understanding of factors associated with arsenic contamination in New England. Specific rock types, high arsenic concentrations in stream sediments, geochemical factors related to areas of Pleistocene marine inundation and proximity to intrusive granitic plutons, and hydrologic and landscape variables relating to groundwater residence time increase the probability of arsenic occurrence in groundwater. Previous studies suggest that arsenic in bedrock groundwater may be partly from past arsenical pesticide use. Variables representing historic agricultural inputs do not improve the model, indicating that this source does not significantly contribute to current arsenic concentrations. Due to the complexity of the fractured bedrock aquifers in the region, well depth and related variables also are not significant predictors.

  9. Application of copulas to improve covariance estimation for partial least squares.

    PubMed

    D'Angelo, Gina M; Weissfeld, Lisa A

    2013-02-20

    Dimension reduction techniques, such as partial least squares, are useful for computing summary measures and examining relationships in complex settings. Partial least squares requires an estimate of the covariance matrix as a first step in the analysis, making this estimate critical to the results. In addition, the covariance matrix also forms the basis for other techniques in multivariate analysis, such as principal component analysis and independent component analysis. This paper was motivated by an example from an imaging study in Alzheimer's disease where there is complete separation between Alzheimer's and control subjects for one of the imaging modalities. This separation occurs in one block of variables and does not occur with the second block of variables, resulting in inaccurate estimates of the covariance. We propose the use of a copula to obtain estimates of the covariance in this setting, where one set of variables comes from a mixture distribution. Simulation studies show that the proposed estimator is an improvement over the standard estimators of covariance. We illustrate the methods using the motivating example from the Alzheimer's disease imaging study. Copyright © 2012 John Wiley & Sons, Ltd.
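
    The normal-scores construction behind a Gaussian copula estimate is brief: map each variable through its empirical CDF to normal scores, then correlate the scores. In the sketch below the bimodal column loosely mimics the complete-separation setting described above; the data are synthetic.

        import numpy as np
        from scipy.stats import norm, rankdata

        def copula_corr(X):
            # Gaussian-copula (normal-scores) correlation: rank-transform each
            # column to (0,1), map to normal scores, correlate the scores
            n = X.shape[0]
            U = rankdata(X, axis=0) / (n + 1)
            Z = norm.ppf(U)
            return np.corrcoef(Z, rowvar=False)

        rng = np.random.default_rng(1)
        z = rng.standard_normal(300)
        x0 = np.where(rng.random(300) < 0.5, z - 4, z + 4)   # bimodal mixture column
        x1 = 0.8 * z + 0.6 * rng.standard_normal(300)        # correlated Gaussian column
        X = np.column_stack([x0, x1])
        print("Pearson :", np.corrcoef(X, rowvar=False)[0, 1].round(3))
        print("copula  :", copula_corr(X)[0, 1].round(3))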

  10. On Chaotic and Hyperchaotic Complex Nonlinear Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Mahmoud, Gamal M.

    Dynamical systems described by real and complex variables are currently one of the most popular areas of scientific research. These systems play an important role in several fields of physics, engineering, and computer science, for example, laser systems, control (or chaos suppression), secure communications, and information science. Basic dynamical properties, chaos (hyperchaos) synchronization, chaos control, and the generation of hyperchaotic behavior of these systems are briefly summarized. The main advantage of introducing complex variables is the reduction of phase space dimensions by a half. They are also used to describe and simulate the physics of detuned lasers and thermal convection of liquid flows, where the electric field and the atomic polarization amplitudes are both complex. Clearly, if the variables of the system are complex, the equations involve twice as many variables and control parameters, thus making it that much harder for a hostile agent to intercept and decipher the coded message. Chaotic and hyperchaotic complex systems are stated as examples. Finally, there are many open problems in the study of chaotic and hyperchaotic complex nonlinear dynamical systems which need further investigation. Some of these open problems are given.
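
    The dimension-halving point is easy to demonstrate numerically: one complex ODE carries the same dynamics as two coupled real ODEs. The sketch uses a toy damped oscillator, not one of the chaotic systems surveyed.

        import numpy as np

        # One complex equation  z' = (i*omega - gamma) * z  replaces the two real
        # equations  x' = -gamma*x - omega*y  and  y' = omega*x - gamma*y.
        omega, gamma, dt = 2.0, 0.1, 1e-3
        z = 1.0 + 0.0j
        x, y = 1.0, 0.0

        for _ in range(5000):                 # forward Euler, same step for both forms
            z = z + dt * (1j * omega - gamma) * z
            dx = -gamma * x - omega * y
            dy = omega * x - gamma * y
            x, y = x + dt * dx, y + dt * dy

        print(abs(z - (x + 1j * y)))          # ~0: the trajectories coincide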

  11. Moving Beyond Univariate Post-Hoc Testing in Exercise Science: A Primer on Descriptive Discriminate Analysis.

    PubMed

    Barton, Mitch; Yeatts, Paul E; Henson, Robin K; Martin, Scott B

    2016-12-01

    There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent variables. However, this univariate approach decreases power, increases the risk of Type 1 error, and contradicts the rationale for conducting multivariate tests in the first place. The purpose of this study was to provide a user-friendly primer on conducting descriptive discriminant analysis (DDA), a post hoc strategy for MANOVA that takes into account the complex relationships among multiple dependent variables. A real-world example using Statistical Package for the Social Sciences syntax and data from 1,095 middle school students on their body composition and body image is provided to explain and interpret the results from DDA. While univariate post hocs increased the risk of Type 1 error to 76%, the DDA identified which dependent variables contributed to group differences and which groups were different from each other. For example, students in the very lean and Healthy Fitness Zone categories for body mass index experienced less pressure to lose weight, more satisfaction with their body, and higher physical self-concept than the Needs Improvement Zone groups. However, perceived pressure to gain weight did not contribute to group differences because it was a suppressor variable. Researchers are encouraged to use DDA when investigating group differences on multiple correlated dependent variables to determine which variables contributed to group differences.
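
    In practice a DDA is computed with a linear discriminant analysis, interpreting structure coefficients (the correlations between each dependent variable and the discriminant scores) together with group centroids. The sketch below uses scikit-learn on synthetic group data rather than the SPSS syntax used in the article; the group names and effect sizes are invented.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        # Three groups measured on three correlated dependent variables
        means = {"lean": [0.8, 0.5, 0.6], "hfz": [0.4, 0.3, 0.3],
                 "niz": [-0.8, -0.5, -0.6]}
        X = np.vstack([rng.multivariate_normal(m, np.eye(3) * 0.8, 100)
                       for m in means.values()])
        y = np.repeat(list(means), 100)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        scores = lda.transform(X)             # discriminant function scores

        # Structure coefficients: correlation of each DV with the first function
        r = [np.corrcoef(X[:, j], scores[:, 0])[0, 1] for j in range(X.shape[1])]
        print("structure coefficients:", np.round(r, 2))
        print("group centroids:",
              {g: scores[y == g, 0].mean().round(2) for g in means})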

  12. Fun Science: The Use of Variable Manipulation to Avoid Content Instruction

    NASA Astrophysics Data System (ADS)

    Peters-Burton, Erin E.; Hiller, Suzanne E.

    2013-02-01

    This study examined the beliefs and rationale pre-service elementary teachers used to choose activities for upper-elementary students in a 1-week intensive science camp. Six undergraduate elementary pre-service teachers were observed as they took a semester-long science methods class that culminated in a 1-week science camp. This qualitative, phenomenological study found that counselors chose activities with fun as the priority rather than teaching content, even after they were confronted with campers who demanded more content. Additionally, all six of the counselors agreed that activities involving variable manipulation were the most successful, even though content knowledge was not required to complete the activities. The counselors felt the variable-manipulation activities were successful because students were constructing products and therefore reaching the end of the activity. Implications include building awareness of the complexity of science-teaching self-efficacy and outcome expectancy in order to improve teacher education programs.

  13. Mitochondria and the non-genetic origins of cell-to-cell variability: More is different.

    PubMed

    Guantes, Raúl; Díaz-Colunga, Juan; Iborra, Francisco J

    2016-01-01

    Gene expression activity is heterogeneous in a population of isogenic cells. Identifying the molecular basis of this variability will improve our understanding of phenomena like tumor resistance to drugs, virus infection, or cell fate choice. The complexity of the molecular steps and machines involved in transcription and translation could introduce sources of randomness at many levels, but a common constraint on most of these processes is their energy dependence. In eukaryotic cells, most of this energy is provided by mitochondria. A clonal population of cells may show a large variability in the number and functionality of mitochondria. Here, we discuss how differences in the mitochondrial content of each cell contribute to heterogeneity in gene products. Changes in the amount of mitochondria can also entail drastic alterations of a cell's gene expression program, which ultimately leads to phenotypic diversity. © 2015 WILEY Periodicals, Inc.

  14. Applications of MIDAS regression in analysing trends in water quality

    NASA Astrophysics Data System (ADS)

    Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.

    2014-04-01

    We discuss novel statistical methods for analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological, and meteorological variables. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection. Typically, water quality variables are sampled fortnightly, whereas rainfall data are sampled daily. The advantage of using MIDAS regression lies in the flexible and parsimonious modelling of the influence of rain and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed-sampling nature of the data.
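
    The heart of MIDAS is a parsimonious weight function that aggregates the high-frequency regressor into the low-frequency regression. The sketch below uses exponential Almon weights and synthetic daily rainfall with fortnightly "water quality" observations; the lag length, parameters, and data are all invented.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(0)
        days, lag = 1400, 14                          # daily rain, fortnightly sampling
        rain = rng.gamma(0.8, 2.0, size=days)

        def almon_weights(theta1, theta2, K=lag):
            j = np.arange(K)
            w = np.exp(theta1 * j + theta2 * j**2)
            return w / w.sum()                        # normalized exponential Almon lags

        # Synthetic "truth": quality responds to recent rain with decaying weights
        w_true = almon_weights(-0.3, 0.0)
        t_obs = np.arange(lag, days, lag)             # fortnightly observation days
        agg_true = np.array([rain[t - lag:t][::-1] @ w_true for t in t_obs])
        wq = 2.0 + 1.5 * agg_true + rng.normal(0, 0.1, len(t_obs))

        def midas_model(t_idx, b0, b1, theta1, theta2):
            w = almon_weights(theta1, theta2)
            agg = np.array([rain[t - lag:t][::-1] @ w for t in t_idx.astype(int)])
            return b0 + b1 * agg

        p, _ = curve_fit(midas_model, t_obs.astype(float), wq, p0=[0, 1, -0.1, 0.0])
        print("b0, b1, theta1, theta2 =", np.round(p, 2))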

  15. Optimization of an angle-beam ultrasonic approach for characterization of impact damage in composites

    NASA Astrophysics Data System (ADS)

    Henry, Christine; Kramb, Victoria; Welter, John T.; Wertz, John N.; Lindgren, Eric A.; Aldrin, John C.; Zainey, David

    2018-04-01

    NDE method development is greatly aided by model-guided experimentation. In the case of ultrasonic inspections, models which provide insight into complex mode conversion processes and sound propagation paths are essential for understanding the experimental data and inverting the experimental data into relevant information. However, models must also be verified using experimental data obtained under well-documented and understood conditions. Ideally, researchers would utilize the model simulations and experimental approach to efficiently converge on the optimal solution. However, variability in experimental parameters introduces extraneous signals that are difficult to differentiate from the anticipated response. This paper discusses the results of an ultrasonic experiment designed to evaluate the effect of controllable variables on the anticipated signal, and the effect of unaccounted-for experimental variables on the uncertainty in those results. Controlled experimental parameters include the transducer frequency, incidence beam angle, and focal depth.

  16. Improvement of the R-SWAT-FME framework to support multiple variables and multi-objective functions

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2014-01-01

    Application of numerical models is a common practice in the environmental field for the investigation and prediction of natural and anthropogenic processes. However, process knowledge, parameter identifiability, sensitivity, and uncertainty analyses are still a challenge for large and complex mathematical models such as the hydrological/water quality model, Soil and Water Assessment Tool (SWAT). In this study, the previously developed R-SWAT-FME framework (coupling SWAT with the Flexible Modeling Environment in the R programming language) was improved to support multiple model variables and objectives at multiple time steps (i.e., daily, monthly, and annually). This expansion is significant because there is usually more than one variable (e.g., water, nutrients, and pesticides) of interest for environmental models like SWAT. To further facilitate its ease of use, we also simplified its application requirements without compromising its merits, such as the user-friendly interface. To evaluate the performance of the improved framework, we used a case study focusing on both streamflow and nitrate nitrogen in the Upper Iowa River Basin (above Marengo) in the United States. Results indicated that the R-SWAT-FME performs well and is comparable to the built-in auto-calibration tool in multi-objective model calibration. Overall, the enhanced R-SWAT-FME can be useful for the SWAT community, and the methods we used can also be valuable for wrapping potential R packages with other environmental models.

  17. An improved standardization procedure to remove systematic low frequency variability biases in GCM simulations

    NASA Astrophysics Data System (ADS)

    Mehrotra, Rajeshwar; Sharma, Ashish

    2012-12-01

    The quality of the absolute estimates of general circulation models (GCMs) calls into question the direct use of GCM outputs for climate change impact assessment studies, particularly at regional scales. Statistical correction of GCM output is often necessary when significant systematic biases occur between the modeled output and observations. A common procedure is to correct the GCM output by removing the systematic biases in low-order moments relative to observations or to reanalysis data at daily, monthly, or seasonal timescales. In this paper, we present an extension of a recently published nested bias correction (NBC) technique to correct for the low- as well as higher-order moment biases in the GCM-derived variables across selected multiple timescales. The proposed recursive nested bias correction (RNBC) approach offers an improved basis for applying bias correction at multiple timescales over the original NBC procedure. The method ensures that the bias-corrected series exhibits improvements that are consistently spread over all of the timescales considered. Different variations of the approach, starting from the standard NBC to the more complex recursive alternatives, are tested to assess their impacts on a range of GCM-simulated atmospheric variables of interest in downscaling applications related to hydrology and water resources. Results of the study suggest that RNBCs with three to five iterations are the most effective in removing distributional and persistence-related biases across the timescales considered.
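
    The basic nested move, correct at one timescale, aggregate, then correct again at the next, can be sketched as follows. This is a two-level, additive toy for a temperature-like variable; the full RNBC also corrects lag-1 autocorrelations and iterates the recursion.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        idx = pd.date_range("1981-01-01", "1990-12-31", freq="D")
        doy = idx.dayofyear.to_numpy()
        obs = pd.Series(10 + 8 * np.sin(2 * np.pi * doy / 365)
                        + rng.normal(0, 2.0, len(idx)), idx)
        gcm = pd.Series(13 + 6 * np.sin(2 * np.pi * doy / 365)
                        + rng.normal(0, 3.5, len(idx)), idx)    # biased "GCM"

        def standardize(x, ref, key):
            # match mean and standard deviation of x to ref within each group
            gx, gr = x.groupby(key), ref.groupby(key)
            z = (x - gx.transform("mean")) / gx.transform("std")
            return z * gr.transform("std") + gr.transform("mean")

        # Level 1: correct daily values, calendar month by calendar month
        daily = standardize(gcm, obs, gcm.index.month)

        # Level 2: correct the monthly means, then push the shift back to days
        m_day = daily.resample("MS").mean()
        m_obs = obs.resample("MS").mean()
        shift = standardize(m_day, m_obs, m_day.index.month) - m_day
        nbc = daily + shift.reindex(daily.index, method="ffill")

        for name, s in [("raw GCM", gcm), ("nested-corrected", nbc)]:
            print(f"{name:17s} mean {s.mean():5.2f} std {s.std():4.2f} "
                  f"(obs: {obs.mean():5.2f}, {obs.std():4.2f})")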

  18. Simulation of changes on the psychosocial risk in the nursing personnel after implementing the policy of good practices on the risk treatment.

    PubMed

    Bolívar Murcia, María Paula; Cruz González, Joan Paola; Rodríguez Bello, Luz Angélica

    2018-02-01

    To evaluate the change over time of psychosocial risk management for the nursing personnel of an intermediate-complexity clinic in Bogota (Colombia). Descriptive and correlational research was performed under a risk-management approach (identification, analysis, assessment, and treatment). The psychosocial risk of the nursing personnel was studied through 10-year system dynamics models (with and without the implementation of the policy of good practices on risk treatment) in two scenarios: when the nursing personnel work 6-hour shifts (morning or afternoon) and when they work over 12 hours (double shift or night shift). When implementing a policy of good practices on risk treatment, the double-shift scenario shows an improvement of 25% to 88% in the variables of health, labor motivation, burnout, service level, and productivity, as well as in the organizational variables associated with the number of patients, nursing personnel, and profit. Likewise, the single-shift scenario with good practices improves in all the above-mentioned variables and stabilizes the variables of absenteeism and resignations. The best scenario is the single-shift scenario with the application of good practices of risk treatment, in comparison with the double-shift scenario with good practices, which leads to the conclusion that the good practices have a positive effect on the nursing personnel variables and on those associated with the organization. Copyright© by the Universidad de Antioquia.

  19. Continuous Process Improvement at Tinker Air Logistics Complex

    DTIC Science & Technology

    2013-03-01

    Excerpt fragments (scanned report; text recovered only in part): "... principles of Lean thinking published by Womack and Jones (US Air Force, 2008). The goal of AFSO21 was to eliminate waste from organizational ..."; "... survey because we did not think a survey could accurately capture the depth and [...] of the independent variables. Intangible elements of leadership and ..."; "... workers within the firm itself. People are at the heart of the TPS. One of Ohno's 14 principles of the Toyota Way is 'Develop exceptional people and ...'"

  20. Quantifying the Complex Hydrologic Response of a Desert Ephemeral Wash

    DTIC Science & Technology

    2011-04-19

    Excerpt fragments (scanned report; text recovered only in part): "... who graduated during this period and will receive scholarships or fellowships for further studies in science, mathematics, engineering or technology ..."; "... precipitation, soil moisture response, and evaporative losses across variable terrain provides an opportunity to improve water balance estimates for ..."; "... threshold of +/- 0.5 °C for the Oceanic Nino Index (ONI) (Source: NOAA/CPC, 2010) ..." [the remainder is residue of a seasonal ONI table for 2005]

  1. Short-term to seasonal variability in factors driving primary productivity in a shallow estuary: Implications for modeling production

    NASA Astrophysics Data System (ADS)

    Canion, Andy; MacIntyre, Hugh L.; Phipps, Scott

    2013-10-01

    The inputs of primary productivity models may be highly variable on short timescales (hourly to daily) in turbid estuaries, but modeling of productivity in these environments is often implemented with data collected over longer timescales. Daily, seasonal, and spatial variability in the primary productivity model parameters, namely chlorophyll a concentration (Chla), the downwelling light attenuation coefficient (kd), and the photosynthesis-irradiance response parameters (PmChl, αChl), was characterized in Weeks Bay, a nitrogen-impacted shallow estuary in the northern Gulf of Mexico. Variability in primary productivity model parameters in response to environmental forcing, nutrients, and microalgal taxonomic marker pigments was analysed in monthly and short-term datasets. Microalgal biomass (as Chla) was strongly related to total phosphorus concentration on seasonal scales. Hourly data support wind-driven resuspension as a major source of short-term variability in Chla and light attenuation (kd). The empirical relationship between areal primary productivity and a combined variable of biomass and light attenuation showed that variability in the photosynthesis-irradiance response contributed little to the overall variability in primary productivity, and Chla alone could account for 53-86% of the variability in primary productivity. Efforts to model productivity in similar shallow systems with highly variable microalgal biomass may benefit the most by investing resources in improving the spatial and temporal resolution of chlorophyll a measurements before increasing the complexity of the models used in productivity modeling.
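
    The parameters named above combine into a standard depth-integrated productivity estimate: Beer-Lambert attenuation with kd sets the light field, and a P-I curve (here the Jassby-Platt tanh form) scaled by Chla converts it to production. The parameter values below are illustrative, not the Weeks Bay calibration.

        import numpy as np

        def productivity_profile(I0, Chla, kd, Pm=5.0, alpha=0.03, zmax=3.0):
            # I(z) by Beer-Lambert; P(z) by the Jassby-Platt tanh P-I curve
            # with chlorophyll-specific parameters (illustrative values)
            z = np.linspace(0.0, zmax, 200)
            I = I0 * np.exp(-kd * z)                   # umol photons m^-2 s^-1
            P = Chla * Pm * np.tanh(alpha * I / Pm)    # mg C m^-3 h^-1
            return z, P

        z, P = productivity_profile(I0=1500.0, Chla=12.0, kd=2.5)
        areal = np.sum(0.5 * (P[1:] + P[:-1]) * np.diff(z))   # trapezoid integral
        print(f"areal productivity ~ {areal:.0f} mg C m^-2 h^-1")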

  2. Input variable selection for data-driven models of Coriolis flowmeters for two-phase flow measurement

    NASA Astrophysics Data System (ADS)

    Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao

    2017-03-01

    Input variable selection is an essential step in the development of data-driven models for environmental, biological, and industrial applications. Through input variable selection to eliminate irrelevant or redundant variables, a suitable subset of variables is identified as the input of a model. Meanwhile, through input variable selection the complexity of the model structure is simplified and the computational efficiency is improved. This paper describes the input variable selection procedure for data-driven models for the measurement of liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, including partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN), and tree-based iterative input selection (IIS), are applied in this study. Typical data-driven models incorporating support vector machine (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM-based data-driven models and sensitivity analysis. The validation and analysis results suggest that the input variables selected from the PMI algorithm provide more effective information for the models to measure liquid mass flowrate, while the IIS algorithm provides fewer but more effective variables for the models to predict gas volume fraction.
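
    The selection-then-modelling workflow can be sketched with plain mutual information standing in for PMI (PMI additionally conditions on inputs already selected); the feature roles and data below are synthetic.

        import numpy as np
        from sklearn.feature_selection import mutual_info_regression
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        n = 400
        # Ten candidate sensor-derived inputs; only three actually drive the target
        X = rng.standard_normal((n, 10))
        y = 2 * X[:, 0] + np.sin(3 * X[:, 3]) + 0.5 * X[:, 7] ** 2 \
            + rng.normal(0, 0.3, n)

        mi = mutual_info_regression(X, y, random_state=0)
        top = np.argsort(mi)[::-1][:3]               # keep the three most informative
        print("selected inputs:", sorted(top.tolist()))

        for cols, label in [(slice(None), "all 10"), (top, "selected 3")]:
            score = cross_val_score(SVR(C=10.0), X[:, cols], y,
                                    cv=5, scoring="r2").mean()
            print(f"SVR with {label} inputs: R2 = {score:.2f}")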

  3. Elimination of chromatographic and mass spectrometric problems in GC-MS analysis of Lavender essential oil by multivariate curve resolution techniques: Improving the peak purity assessment by variable size moving window-evolving factor analysis.

    PubMed

    Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan

    2015-03-01

    In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak shape deformation (non-Gaussian peaks), low S/N ratio, and co-elution (overlapped and/or embedded peaks) force researchers to handle them in order to save time, money, and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, which extended the number of identified constituents from 56 to 143 components. It was found that the most abundant constituents of the Iranian Lavender essential oil are α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%), and camphene (6.50%). This indicates that Iranian-type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed a compositional similarity between Iranian and Italian (Sardinia Island) Lavenders. Published by Elsevier B.V.
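
    The window-wise rank logic of moving-window EFA, of which VSMW-EFA is a variable-window refinement, can be sketched with an SVD over simulated two-component co-elution; the matrix sizes, noise level, and rank threshold are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 200)
        gauss = lambda mu, sig: np.exp(-0.5 * ((t - mu) / sig) ** 2)

        C = np.column_stack([gauss(4.5, 0.6), gauss(5.4, 0.6)])  # overlapping elution
        S = rng.random((2, 5))                                   # two 5-channel "spectra"
        D = C @ S + rng.normal(0, 0.01, (200, 5))                # noisy data matrix

        def local_rank(D, window=15, thresh=0.1):
            # number of singular values above the noise threshold per elution window
            return [int((np.linalg.svd(D[i:i + window],
                                       compute_uv=False) > thresh).sum())
                    for i in range(len(D) - window + 1)]

        ranks = local_rank(D)
        print("baseline / upslope / overlap local rank:",
              ranks[10], ranks[60], ranks[90])   # expect 0, 1, 2 co-eluting components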

  4. Exploratory Spectroscopy of Magnetic Cataclysmic Variables Candidates and Other Variable Objects

    NASA Astrophysics Data System (ADS)

    Oliveira, A. S.; Rodrigues, C. V.; Cieslinski, D.; Jablonski, F. J.; Silva, K. M. G.; Almeida, L. A.; Rodríguez-Ardila, A.; Palhares, M. S.

    2017-04-01

    The increasing number of synoptic surveys made by small robotic telescopes, such as the photometric Catalina Real-Time Transient Survey (CRTS), provides a unique opportunity to discover variable sources and improves the statistical samples of such classes of objects. Our goal is the discovery of magnetic Cataclysmic Variables (mCVs). These are rare objects that probe interesting accretion scenarios controlled by the white-dwarf magnetic field. In particular, improved statistics of mCVs would help to address open questions on their formation and evolution. We performed an optical spectroscopy survey to search for signatures of magnetic accretion in 45 variable objects selected mostly from the CRTS. In this sample, we found 32 CVs, 22 being mCV candidates, 13 of which were previously unreported as such. If the proposed classifications are confirmed, it would represent an increase of 4% in the number of known polars and 12% in the number of known IPs. A fraction of our initial sample was classified as extragalactic sources or other types of variable stars by the inspection of the identification spectra. Despite the inherent complexity in identifying a source as an mCV, variability-based selection, followed by spectroscopic snapshot observations, has proved to be an efficient strategy for their discoveries, being a relatively inexpensive approach in terms of telescope time. Based on observations obtained at the Observatório do Pico dos Dias/LNA, and at the Southern Astrophysical Research (SOAR) telescope, which is a joint project of the Ministério da Ciência, Tecnologia, e Inovação (MCTI) da República Federativa do Brasil, the U.S. National Optical Astronomy Observatory (NOAO), the University of North Carolina at Chapel Hill (UNC), and Michigan State University (MSU).

  5. Up-scaling of multi-variable flood loss models from objects to land use units at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai; Merz, Bruno

    2016-05-01

    Flood risk management increasingly relies on risk analyses, including loss modelling. Most of the flood loss models usually applied in standard practice have in common that complex damaging processes are described by simple approaches like stage-damage functions. Novel multi-variable models significantly improve loss estimation on the micro-scale and may also be advantageous for large-scale applications. However, more input parameters also introduce additional uncertainty, even more so in up-scaling procedures for meso-scale applications, where the parameters need to be estimated on a regional, area-wide basis. To gain more knowledge about the challenges associated with the up-scaling of multi-variable flood loss models, the following approach is applied: single- and multi-variable micro-scale flood loss models are up-scaled and applied on the meso-scale, namely on the basis of ATKIS land-use units. Application and validation are undertaken in 19 municipalities, which were affected during the 2002 flood by the River Mulde in Saxony, Germany, by comparison to official loss data provided by the Saxon Relief Bank (SAB). In the meso-scale, case-study-based model validation, most multi-variable models show smaller errors than the uni-variable stage-damage functions. The results show the suitability of the up-scaling approach and, in accordance with micro-scale validation studies, that multi-variable models are an improvement in flood loss modelling also on the meso-scale. However, uncertainties remain high, stressing the importance of uncertainty quantification. Thus, the development of probabilistic loss models, like the BT-FLEMO model used in this study, which inherently provide uncertainty information, is the way forward.

  6. Newtonian nudging for a Richards equation-based distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Paniconi, Claudio; Marrocu, Marino; Putti, Mario; Verbunt, Mark

    The objective of data assimilation is to provide physically consistent estimates of spatially distributed environmental variables. In this study a relatively simple data assimilation method has been implemented in a relatively complex hydrological model. The data assimilation technique is Newtonian relaxation or nudging, in which model variables are driven towards observations by a forcing term added to the model equations. The forcing term is proportional to the difference between simulation and observation (relaxation component) and contains four-dimensional weighting functions that can incorporate prior knowledge about the spatial and temporal variability and characteristic scales of the state variable(s) being assimilated. The numerical model couples a three-dimensional finite element Richards equation solver for variably saturated porous media and a finite difference diffusion wave approximation based on digital elevation data for surface water dynamics. We describe the implementation of the data assimilation algorithm for the coupled model and report on the numerical and hydrological performance of the resulting assimilation scheme. Nudging is shown to be successful in improving the hydrological simulation results, and it introduces little computational cost, in terms of CPU and other numerical aspects of the model's behavior, in some cases even improving numerical performance compared to model runs without nudging. We also examine the sensitivity of the model to nudging term parameters including the spatio-temporal influence coefficients in the weighting functions. Overall the nudging algorithm is quite flexible, for instance in dealing with concurrent observation datasets, gridded or scattered data, and different state variables, and the implementation presented here can be readily extended to any of these features not already incorporated. Moreover the nudging code and tests can serve as a basis for implementation of more sophisticated data assimilation techniques in a Richards equation-based hydrological model.
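
    A scalar toy version of the nudging term makes the scheme concrete; the gain, temporal weighting width, and observation spacing below are invented, and the model is a stand-in for the coupled Richards-equation solver.

        import numpy as np

        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        rng = np.random.default_rng(0)
        step = lambda x, forcing: x + dt * (-x + forcing)

        truth = np.zeros_like(t)                 # truth is driven by a forcing the
        for k in range(1, len(t)):               # imperfect model does not know about
            truth[k] = step(truth[k - 1], np.sin(t[k - 1]))

        obs_idx = np.arange(0, len(t), 100)      # sparse, noisy observations
        obs = truth[obs_idx] + rng.normal(0, 0.02, len(obs_idx))

        G, tau = 1.5, 0.5                        # nudging gain, temporal weight width
        x_free, x_nudge = np.zeros_like(t), np.zeros_like(t)
        for k in range(1, len(t)):
            x_free[k] = step(x_free[k - 1], 0.0)
            j = np.argmin(np.abs(t[obs_idx] - t[k - 1]))          # nearest observation
            w = np.exp(-((t[k - 1] - t[obs_idx][j]) / tau) ** 2)  # temporal weight
            x_nudge[k] = step(x_nudge[k - 1], 0.0) \
                + dt * G * w * (obs[j] - x_nudge[k - 1])          # relaxation term

        rmse = lambda x: float(np.sqrt(np.mean((x - truth) ** 2)))
        print(f"free run RMSE {rmse(x_free):.3f}, nudged RMSE {rmse(x_nudge):.3f}")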

  7. Complexity Variability Assessment of Nonlinear Time-Varying Cardiovascular Control

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Garcia, Ronald G.; Taylor, Jessica Noggle; Toschi, Nicola; Barbieri, Riccardo

    2017-02-01

    The application of complex systems theory to physiology and medicine has provided meaningful information about the nonlinear aspects underlying the dynamics of a wide range of biological processes and their disease-related aberrations. However, no studies have investigated whether meaningful information can be extracted by quantifying second-order moments of time-varying cardiovascular complexity. To this extent, we introduce a novel mathematical framework termed complexity variability, in which the variance of instantaneous Lyapunov spectra estimated over time serves as a reference quantifier. We apply the proposed methodology to four exemplary studies involving disorders which stem from cardiology, neurology and psychiatry: Congestive Heart Failure (CHF), Major Depressive Disorder (MDD), Parkinson’s Disease (PD), and Post-Traumatic Stress Disorder (PTSD) patients with insomnia under a yoga training regime. We show that complexity assessments derived from simple time-averaging are not able to discern pathology-related changes in autonomic control, and we demonstrate that between-group differences in measures of complexity variability are consistent across pathologies. Pathological states such as CHF, MDD, and PD are associated with an increased complexity variability when compared to healthy controls, whereas wellbeing derived from yoga in PTSD is associated with lower time-variance of complexity.
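
    The proposed quantifier, the variance rather than the mean of instantaneous complexity estimates, can be caricatured on the logistic map, whose local Lyapunov exponents are available in closed form. This univariate sketch is far simpler than the authors' point-process cardiovascular estimator.

        import numpy as np

        def local_lyapunov(r, n=4000, x0=0.3, burn=500):
            # logistic map x' = r x (1 - x); instantaneous exponent
            # lambda_t = ln|f'(x_t)| = ln|r (1 - 2 x_t)|
            x, lams = x0, []
            for _ in range(n):
                lams.append(np.log(abs(r * (1.0 - 2.0 * x)) + 1e-12))
                x = r * x * (1.0 - x)
            return np.array(lams[burn:])          # drop the transient

        for r in (3.60, 3.99):
            lam = local_lyapunov(r)
            print(f"r={r}: mean complexity {lam.mean():+.3f}, "
                  f"complexity variability {lam.var():.3f}")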

  8. Aging and the complexity of cardiovascular dynamics

    NASA Technical Reports Server (NTRS)

    Kaplan, D. T.; Furman, M. I.; Pincus, S. M.; Ryan, S. M.; Lipsitz, L. A.; Goldberger, A. L.

    1991-01-01

    Biomedical signals often vary in a complex and irregular manner. Analysis of variability in such signals generally does not directly address their complexity, and so may miss potentially useful information. We analyze the complexity of heart rate and beat-to-beat blood pressure using two methods motivated by nonlinear dynamics (chaos theory). A comparison of a group of healthy elderly subjects with healthy young adults indicates that the complexity of cardiovascular dynamics is reduced with aging. This suggests that the complexity of variability may be a useful physiological marker.
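
    The abstract does not name its two measures, but approximate entropy, introduced by one of the authors (Pincus), is the standard regularity statistic in this literature; a minimal implementation follows, with conventional default choices of m and r.

        import numpy as np

        def approximate_entropy(x, m=2, r_frac=0.2):
            # ApEn(m, r): lower values indicate a more regular signal;
            # r is conventionally a fraction of the series' standard deviation
            x = np.asarray(x, dtype=float)
            r = r_frac * x.std()
            def phi(m):
                n = len(x) - m + 1
                emb = np.array([x[i:i + m] for i in range(n)])    # m-length templates
                # Chebyshev distances between all template pairs (self-match included)
                dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
                C = (dist <= r).mean(axis=1)                      # match frequencies
                return np.mean(np.log(C))
            return phi(m) - phi(m + 1)

        rng = np.random.default_rng(0)
        t = np.arange(800)
        print("sine :", round(approximate_entropy(np.sin(0.2 * t)), 3))      # regular, low
        print("noise:", round(approximate_entropy(rng.standard_normal(800)), 3))  # high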

  9. [Pain and workplace. Sociodemographic variables influence in therapeutic response and labor productivity].

    PubMed

    Vicente-Herrero, M T; López-González, Á A; Ramírez Iñiguez de la Torre, M V; Capdevila García, L M; Terradillos García, M J; Aguilar Jiménez, E

    2016-09-01

    Pain is a major cause of medical consultation. The complexity of managing it is due to its long duration and intensity, and it sometimes requires a combination of multiple drugs. The objective of this study is to assess the use of drugs for pain in workers, the clinical response obtained, its influence on estimated work productivity, and its relationship to sociodemographic variables and the type of drug used. A cross-sectional study was conducted on 1,080 workers, aged 18-65 years, during periodic health surveillance examinations in service-sector companies in Spain. Treatments used, clinical efficacy, influence on work productivity, and sociodemographic variables (age, gender) were evaluated. The Brief Pain Inventory questionnaire, validated for Spain, was used to assess pain, and the SPSS(®) 20.0 package was used for the statistical analysis. NSAIDs and simple analgesics have higher percentages of improvement in pain (P=.032 and P<.0001, respectively). Men respond better to NSAIDs, and women to simple analgesics. Improved productivity is higher in men than in women (P=.042). No significant differences were observed for age, pain improvement or productivity, except in those over 55 years. The prescription of analgesics for pain conditions must consider the age and gender of the patient, as well as the type of drug. The choice of drug should be based on the aetiology and on aspects beyond the clinical variables, such as sociodemographic, occupational, or psychosocial factors. Copyright © 2015 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Publicado por Elsevier España, S.L.U. All rights reserved.

  10. Evaluation of an improved intermediate complexity snow scheme in the ORCHIDEE land surface model

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Ottlé, Catherine; Boone, Aaron; Ciais, Philippe; Brun, Eric; Morin, Samuel; Krinner, Gerhard; Piao, Shilong; Peng, Shushi

    2013-06-01

    Snow plays an important role in land surface models (LSMs) for climate and hydrological studies, but its current treatment as a single layer of constant density and thermal conductivity in ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems) induces significant deficiencies. The intermediate-complexity snow scheme ISBA-ES (Interaction between Soil, Biosphere and Atmosphere-Explicit Snow), which includes the key snow processes, has been adapted and implemented into ORCHIDEE, referred to here as ORCHIDEE-ES. In this study, the adapted scheme is evaluated against observations from the alpine site Col de Porte (CDP), with a continuous 18-year data set, and from sites distributed across northern Eurasia. At CDP, comparisons of snow depth, snow water equivalent, surface temperature, snow albedo, and snowmelt runoff reveal that the improved scheme in ORCHIDEE simulates the internal snow processes better than the original one. Preliminary sensitivity tests indicate that the snow albedo parameterization is the main cause of the large differences in snow-related variables, but not of those in simulated soil temperature; differences in soil temperature result mainly from the better representation of snow thermal conductivity in ORCHIDEE-ES. These findings are confirmed by a sensitivity analysis of ORCHIDEE-ES parameters using the Morris method. These features should enable a more realistic investigation of the interactions between snow and the soil thermal regime (and related soil carbon decomposition). When the two models are compared over sites located in northern Eurasia from 1979 to 1993, snow-related variables and 20 cm soil temperature are better reproduced by ORCHIDEE-ES than by ORCHIDEE, revealing a more accurate representation of spatio-temporal variability.

  11. Cross-scale modeling of surface temperature and tree seedling establishment in mountain landscapes

    USGS Publications Warehouse

    Dingman, John; Sweet, Lynn C.; McCullough, Ian M.; Davis, Frank W.; Flint, Alan L.; Franklin, Janet; Flint, Lorraine E.

    2013-01-01

    Abstract: Introduction: Estimating surface temperature from above-ground field measurements is important for understanding the complex landscape patterns of plant seedling survival and establishment, processes which occur at heights of only several centimeters. Current climate models predict temperature at 2 m above the ground, leaving the ground-surface microclimate poorly characterized. Methods: Using a network of field temperature sensors and climate models, a ground-surface temperature method was used to estimate microclimate variability in minimum and maximum temperature. Temperature lapse rates were derived from the field sensors and distributed across the landscape, capturing differences in solar radiation and cold-air drainage modeled at a 30 m spatial resolution. Results: The surface temperature estimation method successfully estimated minimum surface temperatures on north-facing, south-facing, valley, and ridgeline topographic settings; compared to measured temperatures, it yielded R2 values of 0.88, 0.80, 0.88, and 0.80, respectively. Maximum surface temperatures generally had slightly more spatial variability than minimum surface temperatures, with R2 values of 0.86, 0.77, 0.72, and 0.79 for the north-facing, south-facing, valley, and ridgeline settings. Quasi-Poisson regressions predicting recruitment of Quercus kelloggii (black oak) seedlings from temperature variables were significantly improved using these estimates of surface temperature compared to air temperature modeled at 2 m. Conclusion: Predicting minimum and maximum ground-surface temperatures using a downscaled climate model coupled with temperature lapse rates estimated from field measurements provides a method for modeling temperature effects on plant recruitment. Such methods could be applied to improve projections of species’ range shifts under climate change. Areas of complex topography can provide intricate microclimates that may allow species to redistribute locally as climate changes.
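    The core of the downscaling step is simple enough to sketch. The following Python fragment is a simplified analogue under stated assumptions, not the authors' implementation: a coarse temperature is distributed to a fine grid with a sensor-derived lapse rate plus additive terms for modeled radiation and cold-air drainage; all names and values are illustrative.

    ```python
    import numpy as np

    def downscale_surface_t(t_coarse, elev_fine, elev_coarse, lapse_rate,
                            radiation_adj=0.0, cold_air_adj=0.0):
        """Coarse temperature -> fine surface grid via a lapse-rate correction,
        with optional additive offsets for solar radiation and cold-air pooling."""
        t = t_coarse + lapse_rate * (np.asarray(elev_fine) - elev_coarse)
        return t + radiation_adj + cold_air_adj

    # e.g., a valley-bottom pixel 150 m below the coarse cell, with cold-air pooling
    t_min = downscale_surface_t(12.0, elev_fine=1450.0, elev_coarse=1600.0,
                                lapse_rate=-0.0065, cold_air_adj=-1.5)
    print(t_min)   # 12.0 + (-0.0065) * (-150) - 1.5 ≈ 11.5 °C
    ```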

  12. Variability in Rheumatology day care hospitals in Spain: VALORA study.

    PubMed

    Hernández Miguel, María Victoria; Martín Martínez, María Auxiliadora; Corominas, Héctor; Sanchez-Piedra, Carlos; Sanmartí, Raimon; Fernandez Martinez, Carmen; García-Vicuña, Rosario

    To describe the variability of the day care hospital units (DCHUs) of Rheumatology in Spain, in terms of structural resources and operating processes. Multicenter descriptive study with data from a self-completed DCHU self-assessment questionnaire based on the DCHU quality standards of the Spanish Society of Rheumatology. Structural resources and operating processes were analyzed and stratified by hospital complexity (regional, general, major and complex). Variability was determined using the coefficient of variation (CV) of the clinically relevant variable that presented statistically significant differences when compared across centers. A total of 89 hospitals (16 autonomous regions and Melilla) were included in the analysis. Of these, 11.2% were regional hospitals, 22.5% general, 27% major, and 39.3% complex. A total of 92% of DCHUs were polyvalent. The number of treatments applied, the coordination between DCHUs and hospital pharmacy, and the postgraduate training process were the variables that showed statistically significant differences depending on hospital complexity. The highest rate of rheumatologic treatments was found in complex hospitals (2.97 per 1,000 population), and the lowest in general hospitals (2.01 per 1,000 population). The CV was 0.88 in major hospitals, 0.86 in regional, 0.76 in general, and 0.72 in complex hospitals. There was variability in the number of treatments delivered in DCHUs, greater in major hospitals and then in regional centers. Nonetheless, the variability in terms of structure and function does not seem attributable to differences in center complexity. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.

  13. An Ensemble Successive Project Algorithm for Liquor Detection Using Near Infrared Sensor.

    PubMed

    Qu, Fangfang; Ren, Dong; Wang, Jihua; Zhang, Zhong; Lu, Na; Meng, Lei

    2016-01-11

    Spectral analysis based on near infrared (NIR) sensors is a powerful tool for complex information processing and high-precision recognition, and it has been widely applied to quality analysis and online inspection of agricultural products. This paper proposes a new method to address the instability of the successive projections algorithm (SPA) with small sample sizes, as well as the weak association between the selected variables and the analyte. The proposed method is an evaluated bootstrap ensemble SPA (EBSPA) based on a variable evaluation index (EI) for variable selection, and it is applied to the quantitative prediction of alcohol concentration in liquor using an NIR sensor. In the experiments, the proposed EBSPA is combined with three kinds of modeling methods to test its performance. In addition, EBSPA combined with partial least squares is compared with other state-of-the-art variable selection methods. The results show that the proposed method overcomes the defects of SPA and achieves the best generalization performance and stability. Furthermore, the physical meaning of the variables selected from the near infrared sensor data is clear, effectively reducing the number of variables while improving prediction accuracy.
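    The bootstrap-ensemble idea can be sketched compactly. The Python sketch below is an illustrative stand-in, not the published EBSPA: a plain |correlation| ranking replaces each SPA run, and selection frequency plays the role of the evaluation index. Variables are voted on across bootstrap resamples and the most frequently selected wavelengths feed a PLS model.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def ensemble_select(X, y, n_boot=50, n_keep=10, seed=0):
        """Vote for variables across bootstrap resamples; on each resample a
        simple |correlation| ranking stands in for one SPA run."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        votes = np.zeros(p)
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)                    # bootstrap sample
            Xb, yb = X[idx], y[idx]
            corr = np.abs([np.corrcoef(Xb[:, j], yb)[0, 1] for j in range(p)])
            votes[np.argsort(corr)[-n_keep:]] += 1
        return np.sort(np.argsort(votes)[-n_keep:])        # top-voted variables

    # Usage with hypothetical spectra X (samples x wavelengths) and alcohol y:
    # sel = ensemble_select(X, y)
    # model = PLSRegression(n_components=5).fit(X[:, sel], y)
    ```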

  14. Joint effects of climate variability and socioecological factors on dengue transmission: epidemiological evidence.

    PubMed

    Akter, Rokeya; Hu, Wenbiao; Naish, Suchithra; Banu, Shahera; Tong, Shilu

    2017-06-01

    To assess the epidemiological evidence on the joint effects of climate variability and socioecological factors on dengue transmission. Following PRISMA guidelines, a detailed literature search was conducted in PubMed, Web of Science and Scopus. Peer-reviewed, freely available, full-text articles, considering both climate and socioecological factors in relation to dengue, published in English from January 1993 to October 2015, were included in this review. Twenty studies met the inclusion criteria and assessed the impact of both climatic and socioecological factors on dengue dynamics. Among those, four studies further investigated the relative importance of climate variability and socioecological factors for dengue transmission. A few studies also developed predictive models including both climatic and socioecological factors. Due to insufficient data, methodological issues and the contextual variability of the studies, it is difficult to draw conclusions on the joint effects of climate variability and socioecological factors on dengue transmission. Future research should take into account socioecological factors in combination with climate variables for a better understanding of the complex nature of dengue transmission, as well as for improving the predictive capability of dengue forecasting models, in order to develop effective and reliable early warning systems. © 2017 John Wiley & Sons Ltd.

  15. Learning to recognize letters in the periphery: Effects of repeated exposure, letter frequency, and letter complexity

    PubMed Central

    Husk, Jesse S.; Yu, Deyue

    2017-01-01

    Patients with central vision loss must rely on their peripheral vision for reading. Unfortunately, limitations of peripheral vision, such as crowding, pose significant challenges to letter recognition. As a result, there is a need for developing effective training methods for improving crowded letter recognition in the periphery. Several studies have shown that extensive practice with letter stimuli is beneficial to peripheral letter recognition. Here, we explore stimulus-related factors that might influence the effectiveness of peripheral letter recognition training. Specifically, we examined letter exposure (number of letter occurrences), frequency of letter use in English print, and letter complexity, and evaluated their contributions to the amount of improvement observed in crowded letter recognition following training. We analyzed data collected across a range of training protocols. Using linear regression, we identified the best-fitting model and observed that all three stimulus-related factors contributed to the improvement in peripheral letter recognition, with letter exposure being the most important factor. Pretest accuracy was included in the model as an important explanatory variable to avoid biased estimates, and it was shown to influence the relationship between training improvement and letter exposure. When developing training protocols for peripheral letter recognition, it may be beneficial not only to consider the overall length of training, but also to tailor the number of stimulus occurrences for each letter according to its initial performance level, frequency, and complexity. PMID:28265651

  16. Show and Tell: Video Modeling and Instruction Without Feedback Improves Performance but Is Not Sufficient for Retention of a Complex Voice Motor Skill.

    PubMed

    Look, Clarisse; McCabe, Patricia; Heard, Robert; Madill, Catherine J

    2018-02-02

    Modeling and instruction are frequent components of both traditional and technology-assisted voice therapy. This study investigated the value of video modeling and instruction in the early acquisition and short-term retention of a complex voice task without external feedback. Thirty participants were randomized to two conditions and trained to produce a vocal siren over 40 trials. One group received a model and verbal instructions; the other group received a model only. Sirens were analyzed for phonation time, vocal intensity, cepstral peak prominence, peak-to-peak time, and root-mean-square error at five time points. The model-and-instruction group showed significant improvement on more outcome measures than the model-only group. There was an interaction effect for vocal intensity, which showed that instructions facilitated greater improvement when they were first introduced. However, neither group reproduced the model's siren performance across all parameters or retained the skill 1 day later. Providing verbal instruction with a model appears more beneficial than providing a model only in the prepractice phase of acquiring a complex voice skill. Improved performance was observed; however, the higher level of performance was not retained after 40 trials in either condition. Other prepractice variables may need to be considered. Findings have implications for traditional and technology-assisted voice therapy. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  17. Variationally Optimized Free-Energy Flooding for Rate Calculation.

    PubMed

    McCarty, James; Valsson, Omar; Tiwary, Pratyush; Parrinello, Michele

    2015-08-14

    We propose a new method to obtain kinetic properties of infrequent events from molecular dynamics simulation. The procedure employs a recently introduced variational approach [Valsson and Parrinello, Phys. Rev. Lett. 113, 090601 (2014)] to construct a bias potential as a function of several collective variables that is designed to flood the associated free energy surface up to a predefined level. The resulting bias potential effectively accelerates transitions between metastable free energy minima while ensuring bias-free transition states, thus allowing accurate kinetic rates to be obtained. We test the method on a few illustrative systems for which we obtain an order of magnitude improvement in efficiency relative to previous approaches and several orders of magnitude relative to unbiased molecular dynamics. We expect an even larger improvement in more complex systems. This and the ability of the variational approach to deal efficiently with a large number of collective variables will greatly enhance the scope of these calculations. This work is a vindication of the potential that the variational principle has if applied in innovative ways.
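    The rate-recovery step admits a worked illustration. In flooding-type approaches, transition times observed in the biased run are mapped back to physical time through an acceleration factor; the Python sketch below is a generic hyperdynamics-style rescaling under the assumption of bias-free transition states, not the authors' code, and all names are illustrative.

    ```python
    import numpy as np

    def physical_time(bias_along_traj, dt, kT):
        """Accelerated-dynamics time rescaling: t_phys = sum_t dt * exp(V(s_t)/kT).

        bias_along_traj: bias potential evaluated along the biased trajectory.
        Valid when the bias floods free-energy minima only and leaves transition
        states untouched, as the flooding construction is designed to ensure."""
        v = np.asarray(bias_along_traj, dtype=float)
        return dt * np.exp(v / kT).sum()

    # Mean first-passage time from several independent biased runs (kJ/mol units):
    # tau = np.mean([physical_time(v, dt=2e-15, kT=2.494) for v in runs])
    ```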

  18. Non-linear modelling and control of semi-active suspensions with variable damping

    NASA Astrophysics Data System (ADS)

    Chen, Huang; Long, Chen; Yuan, Chao-Chun; Jiang, Hao-Bin

    2013-10-01

    Electro-hydraulic dampers can provide variable damping force that is modulated by varying the command current; furthermore, they offer advantages such as lower power, rapid response, lower cost, and simple hardware. However, accurate characterisation of non-linear f-v properties in pre-yield and force saturation in post-yield is still required. Meanwhile, traditional linear or quarter vehicle models contain various non-linearities. The development of a multi-body dynamics model is very complex, and therefore, SIMPACK was used with suitable improvements for model development and numerical simulations. A semi-active suspension was built based on a belief-desire-intention (BDI)-agent model framework. Vehicle handling dynamics were analysed, and a co-simulation analysis was conducted in SIMPACK and MATLAB to evaluate the BDI-agent controller. The design effectively improved ride comfort, handling stability, and driving safety. A rapid control prototype was built based on dSPACE to conduct a real vehicle test. The test and simulation results were consistent, which verified the simulation.

  19. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    The THOR neutral particle transport code enables simulation of complex geometries for various problems, from reactor simulations to nuclear non-proliferation. It is undergoing a thorough V&V effort requiring computational efficiency. This has motivated various improvements, including angular parallelization, outer iteration acceleration, and development of peripheral tools. For guiding future improvements to the code’s efficiency, better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL’s Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. An evaluation of several possible sources of variability resulted in a communication model and a parallel portion model. The former’s accuracy is bounded by the variability of communication on Falcon, while the latter has an error on the order of 1%.
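    As a flavor of what such a model looks like, the sketch below is a toy analogue only; the actual PPM's functional forms and constants are fitted empirically on Falcon. It combines a perfectly parallel work term with a postal latency-bandwidth communication term; every name and number is illustrative.

    ```python
    def comm_time(n_bytes, latency=2e-6, bandwidth=5e9):
        """Postal model of one message over a link: T = alpha + n / beta."""
        return latency + n_bytes / bandwidth

    def sweep_time(n_cells, n_angles, n_groups, t_unit, n_ranks,
                   n_msgs, msg_bytes):
        """Toy runtime model: per-cell/angle/group work divided across ranks,
        plus the messaging cost of one transport sweep."""
        work = n_cells * n_angles * n_groups * t_unit / n_ranks
        return work + n_msgs * comm_time(msg_bytes)

    print(sweep_time(1e6, 48, 8, 2e-8, 256, n_msgs=500, msg_bytes=64e3))
    ```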

  20. Modeling Psychological Attributes in Psychology – An Epistemological Discussion: Network Analysis vs. Latent Variables

    PubMed Central

    Guyon, Hervé; Falissard, Bruno; Kop, Jean-Luc

    2017-01-01

    Network Analysis is considered a new method that challenges Latent Variable models in inferring psychological attributes. With Network Analysis, psychological attributes are derived from a complex system of components without the need to call on any latent variables. But the ontological status of psychological attributes is not adequately defined with Network Analysis, because a psychological attribute is both a complex system and a property emerging from this complex system. The aim of this article is to reappraise the legitimacy of latent variable models by engaging in an ontological and epistemological discussion on psychological attributes. Psychological attributes relate to the mental equilibrium of individuals embedded in their social interactions, as robust attractors within complex dynamic processes with emergent properties, distinct from physical entities located in precise areas of the brain. Latent variables thus possess legitimacy, because the emergent properties can be conceptualized and analyzed on the sole basis of their manifestations, without exploring the upstream complex system. However, in opposition to the usual Latent Variable models, this article favors the integration of a dynamic system of manifestations. Latent Variable models and Network Analysis thus appear as complementary approaches. New approaches combining Latent Network Models and Network Residuals are certainly a promising way to infer psychological attributes, placing psychological attributes in an inter-subjective dynamic approach. Pragmatism-realism appears to be the epistemological framework required if we are to use latent variables as representations of psychological attributes. PMID:28572780

  1. Complex Variables throughout the Curriculum

    ERIC Educational Resources Information Center

    D'Angelo, John P.

    2017-01-01

    We offer many specific detailed examples, several of which are new, that instructors can use (in lecture or as student projects) to revitalize the role of complex variables throughout the curriculum. We conclude with three primary recommendations: revise the syllabus of Calculus II to allow early introductions of complex numbers and linear…

  2. Advanced metrology by offline SEM data processing

    NASA Astrophysics Data System (ADS)

    Lakcher, Amine; Schneider, Loïc; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime

    2017-06-01

    Today's technology nodes contain more and more complex designs, bringing increasing challenges to chip manufacturing process steps. Efficient metrology is necessary to assess the process variability of these complex patterns and thus extract relevant data to generate process-aware design rules and to improve OPC models. Today, process variability is mostly addressed through the analysis of in-line monitoring features, which are often designed to support robust measurements and, as a consequence, are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in the chip manufacturing process, but it is challenged when it comes to measuring metrics like tip-to-tip, tip-to-line, area, or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and to convert them into a GDS. Contours can be seen as the signature of a shape, since they contain all of its dimensional data. The methodology is thus to use the CD-SEM to take high-quality images, generate SEM contours from them, and build a database out of those contours. The contours are used to feed an offline metrology tool that processes them to extract different metrics. Two previous papers showed that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously is expanded to improve its robustness and combined with the use of phylogeny to classify the SEM images according to their geometrical proximities.

  3. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures

    PubMed Central

    Theobald, Douglas L.; Wuttke, Deborah S.

    2008-01-01

    Summary THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. PMID:16777907

  4. Computational Models of Consumer Confidence from Large-Scale Online Attention Data: Crowd-Sourcing Econometrics

    PubMed Central

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting. PMID:25826692

  5. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    PubMed

    Dong, Xianlei; Bollen, Johan

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.

  6. Steady-state configuration and tension calculations of marine cables under complex currents via separated particle swarm optimization

    NASA Astrophysics Data System (ADS)

    Xu, Xue-song

    2014-12-01

    Under complex currents, the governing equations of motion for marine cables are complex and nonlinear, and the calculation of cable configuration and tension becomes difficult compared with that under uniform or simple currents. To obtain numerical results, the usual Newton-Raphson iteration is often adopted, but its stability depends on the initial guess for the solution of the governing equations. To improve the stability of the numerical calculation, this paper proposes the separated particle swarm optimization, in which the variables are separated into several groups and the dimension of the search space is reduced to facilitate the particle swarm optimization. Via the separated particle swarm optimization, the governing nonlinear equations can be solved successfully from any initial solution, and the numerical calculation is very stable. For the calculation of the configuration and tension of marine cables under complex currents, the proposed separated particle swarm optimization is more effective than other particle swarm optimizations.
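    The block-wise idea is easy to prototype. The Python sketch below is a minimal interpretation of variable separation, not the paper's algorithm: groups are optimized in turn with a plain PSO while the remaining variables stay fixed, illustrating how the per-run search dimension is reduced; all parameter values are illustrative.

    ```python
    import numpy as np

    def pso(f, lo, hi, n_particles=30, iters=100, seed=1):
        """Plain PSO minimizing f over the box [lo, hi]."""
        rng = np.random.default_rng(seed)
        dim = len(lo)
        x = rng.uniform(lo, hi, (n_particles, dim))
        v = np.zeros_like(x)
        pbest = x.copy()
        pval = np.apply_along_axis(f, 1, x)
        g = pbest[pval.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            fx = np.apply_along_axis(f, 1, x)
            better = fx < pval
            pbest[better], pval[better] = x[better], fx[better]
            g = pbest[pval.argmin()].copy()
        return g, pval.min()

    def separated_pso(f, lo, hi, groups, sweeps=5):
        """Optimize each variable group in turn, holding the others fixed."""
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        x = (lo + hi) / 2.0
        for _ in range(sweeps):
            for grp in groups:
                def sub(z, grp=grp):
                    xt = x.copy()
                    xt[grp] = z
                    return f(xt)
                x[grp], _ = pso(sub, lo[grp], hi[grp])
        return x, f(x)

    # e.g., minimizing a residual norm of discretized cable equations:
    # x_best, r = separated_pso(residual_norm, lo, hi,
    #                           groups=[np.arange(0, 10), np.arange(10, 20)])
    ```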

  7. Variability in Second Language Learning: The Roles of Individual Differences, Learning Conditions, and Linguistic Complexity

    ERIC Educational Resources Information Center

    Tagarelli, Kaitlyn M.; Ruiz, Simón; Vega, José Luis Moreno; Rebuschat, Patrick

    2016-01-01

    Second language learning outcomes are highly variable, due to a variety of factors, including individual differences, exposure conditions, and linguistic complexity. However, exactly how these factors interact to influence language learning is unknown. This article examines the relationship between these three variables in language learners.…

  8. New preconditioning strategy for Jacobian-free solvers for variably saturated flows with Richards’ equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lipnikov, Konstantin; Moulton, David; Svyatskiy, Daniil

    2016-04-29

    We develop a new approach for solving the nonlinear Richards’ equation arising in variably saturated flow modeling. The growing complexity of geometric models for simulation of subsurface flows leads to the necessity of using unstructured meshes and advanced discretization methods. Typically, a numerical solution is obtained by first discretizing the PDEs and then solving the resulting system of nonlinear discrete equations with a Newton-Raphson-type method. The efficiency and robustness of existing solvers rely on many factors, including empiric quality control of intermediate iterates, the complexity of the employed discretization method, and a customized preconditioner. We propose and analyze a new preconditioning strategy that is based on a stable discretization of the continuum Jacobian. We show with numerical experiments for challenging problems in subsurface hydrology that this new preconditioner improves the convergence of existing Jacobian-free solvers 3-20 times. Furthermore, we show that the Picard method with this preconditioner becomes a more efficient nonlinear solver than several widely used Jacobian-free solvers.
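    The role of the preconditioner in a Jacobian-free solver can be sketched with SciPy. Below, a 1-D nonlinear diffusion problem stands in for Richards' equation, and an ILU factorization of a frozen-coefficient operator is passed as the inner Krylov preconditioner; this is only a simple analogue of the idea (the paper's preconditioner is a stable discretization of the continuum Jacobian), and every constant is illustrative.

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov
    from scipy.sparse import diags
    from scipy.sparse.linalg import LinearOperator, spilu

    n = 100
    h = 1.0 / (n + 1)

    def residual(u):
        """Steady nonlinear diffusion: d/dx( k(u) du/dx ) = 0, u(0)=1, u(1)=0."""
        ue = np.concatenate(([1.0], u, [0.0]))     # Dirichlet boundary values
        ke = 1.0 + ue**2                           # state-dependent conductivity
        kf = 0.5 * (ke[:-1] + ke[1:])              # face-averaged conductivity
        flux = kf * np.diff(ue) / h
        return np.diff(flux) / h

    # Preconditioner: ILU of a frozen-coefficient (linear) diffusion operator
    A = (diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2).tocsc()
    ilu = spilu(A)
    M = LinearOperator((n, n), matvec=ilu.solve)

    u0 = np.linspace(1.0, 0.0, n)
    u = newton_krylov(residual, u0, inner_M=M)     # Jacobian-free Newton-Krylov
    print(abs(residual(u)).max())
    ```

    Passing `inner_M` preconditions only the inner Krylov iterations, so the nonlinear outer loop remains Jacobian-free, which mirrors the separation the paper exploits.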

  9. Diminished heart rate complexity in adolescent girls: a sign of vulnerability to anxiety disorders?

    PubMed

    Fiol-Veny, Aina; De la Torre-Luque, Alejandro; Balle, Maria; Bornas, Xavier

    2018-07-01

    Diminished heart rate variability has been found to be associated with high anxiety symptomatology. Since adolescence is the period of onset for many anxiety disorders, this study aimed to determine sex- and anxiety-related differences in heart rate variability and complexity in adolescents. We created four groups according to sex and anxiety symptomatology: high-anxiety girls (n = 24) and boys (n = 25), and low-anxiety girls (n = 22) and boys (n = 24), and recorded their cardiac function while they performed regular school activities. A series of two-way (sex and anxiety) MANOVAs was performed on time-domain variability, frequency-domain variability, and non-linear complexity. We obtained no multivariate interaction effects between sex and anxiety, but highly anxious participants had lower heart rate variability than the low-anxiety group. Regarding sex, girls showed lower heart rate variability and complexity than boys. The results suggest that adolescent girls have a less flexible cardiac system, which could be a marker of girls' vulnerability to developing anxiety disorders.

  10. Development of motor speed and associated movements from 5 to 18 years.

    PubMed

    Gasser, Theo; Rousson, Valentin; Caflisch, Jon; Jenni, Oskar G

    2010-03-01

    To study the development of motor speed and associated movements in participants aged 5 to 18 years with respect to age, sex, and laterality. Ten motor tasks of the Zurich Neuromotor Assessment (repetitive and alternating movements of hands and feet, repetitive and sequential finger movements, the pegboard, static and dynamic balance, diadochokinesis) were administered to 593 right-handed participants (286 males, 307 females). A strong improvement with age was observed in motor speed from age 5 to 10, followed by a levelling-off between 12 and 18 years. Simple tasks and the pegboard matured early, and complex tasks later. Simple tasks showed no associated movements beyond early childhood; in complex tasks, associated movements persisted until early adulthood. The two sexes differed only marginally in speed, but markedly in associated movements. A significant laterality (p<0.001) in speed was found for all tasks except static balance; the pegboard was the most lateralized, and sequential finger movements the least. Associated movements were lateralized only for a few complex tasks. We also noted substantial interindividual variability. Motor speed and associated movements improve strongly in childhood and weakly in adolescence, and both are of developmental relevance. Because they correlate weakly, they provide complementary information.

  11. Effects of temporal averaging on short-term irradiance variability under mixed sky conditions

    NASA Astrophysics Data System (ADS)

    Lohmann, Gerald M.; Monahan, Adam H.

    2018-05-01

    Characterizations of short-term variability in solar radiation are required to successfully integrate large numbers of photovoltaic power systems into the electrical grid. Previous studies have used ground-based irradiance observations with a range of different temporal resolutions, and a systematic analysis of the effects of temporal averaging on the representation of variability has been lacking. Using high-resolution surface irradiance data with original temporal resolutions between 0.01 and 1 s from six different locations in the Northern Hemisphere, we characterize the changes in the representation of temporal variability resulting from time averaging. In this analysis, we condition all data on states of mixed skies, which are the most potentially problematic in terms of local PV power volatility. Statistics of the clear-sky index k* and its increments Δk*_τ (i.e., normalized surface irradiance and the changes therein over specified intervals of time τ) are considered separately. Our results indicate that a temporal averaging time scale of around 1 s marks a transition in representing single-point irradiance variability, such that longer averages result in substantial underestimates of variability. Higher-resolution data increase the complexity of data management and quality control without appreciably improving the representation of variability. The results do not show any substantial discrepancies between locations or seasons.
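    The two quantities are straightforward to compute, which makes the averaging effect easy to reproduce. In the Python sketch below (the synthetic data and all parameter values are illustrative assumptions), a clear-sky index series is block-averaged to coarser resolutions and the spread of its increments at a fixed physical lag is compared:

    ```python
    import numpy as np

    def increments(k, lag):
        """Clear-sky index increments Δk*_τ = k*(t + τ) − k*(t)."""
        return k[lag:] - k[:-lag]

    def block_average(x, n):
        """Average consecutive blocks of n samples (coarser temporal resolution)."""
        m = len(x) // n
        return x[:m * n].reshape(m, n).mean(axis=1)

    # Synthetic 10 Hz clear-sky index for one hour of mixed skies:
    # a slow drift plus step-like cloud passages
    rng = np.random.default_rng(0)
    k = 1.0 + np.cumsum(rng.normal(0.0, 0.002, 36000))
    k -= 0.4 * ((rng.random(36000) < 1e-3).cumsum() % 2)

    for n_avg in (1, 10, 100):                     # 0.1 s, 1 s, 10 s averages
        ka = block_average(k, n_avg)
        dk = increments(ka, 600 // n_avg)          # fixed physical lag τ = 60 s
        print(f"resolution {n_avg / 10:5.1f} s: std(Δk*_60s) = {dk.std():.4f}")
    ```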

  12. Biopsy variability of lymphocytic infiltration in breast cancer subtypes and the ImmunoSkew score

    NASA Astrophysics Data System (ADS)

    Khan, Adnan Mujahid; Yuan, Yinyin

    2016-11-01

    The number of tumour biopsies required for a good representation of a tumour has been controversial. An important factor to consider is intra-tumour heterogeneity, which can vary among cancer types and subtypes. Immune cells in particular often display complex infiltrative patterns; however, there is a lack of quantitative understanding of the spatial heterogeneity of immune cells and of how this fundamental biological feature of human tumours influences biopsy variability and treatment resistance. We systematically investigate biopsy variability for the lymphocytic infiltrate in 998 breast tumours using a novel virtual biopsy method. Across all breast cancers, we observe a nonlinear increase in concordance between the biopsy and whole-tumour scores of lymphocytic infiltrate with an increasing number of biopsies, yet little improvement is gained with more than four biopsies. Interestingly, biopsy variability of the lymphocytic infiltrate differs considerably among breast cancer subtypes, with the human epidermal growth factor receptor 2-positive (HER2+) subtype having the highest variability. We subsequently identify a quantitative measure of spatial variability that predicts disease-specific survival in the HER2+ subtype independent of standard clinical variables (node status, tumour size and grade). Our study demonstrates how systematic methods provide new insights that can influence future study design based on a quantitative knowledge of tumour heterogeneity.
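    A virtual biopsy experiment of this kind reduces to resampling. The Python sketch below is an illustrative reconstruction, not the authors' pipeline; `region_scores` is a hypothetical list holding per-region lymphocyte scores for each tumour. It estimates the concordance between n-biopsy means and the whole-tumour score:

    ```python
    import numpy as np

    def concordance(region_scores, n_biopsies, n_trials=200, seed=0):
        """Correlation, across tumours, between the mean of n randomly sampled
        regions (virtual biopsies) and the whole-tumour score."""
        rng = np.random.default_rng(seed)
        whole = np.array([r.mean() for r in region_scores])
        cors = []
        for _ in range(n_trials):
            biopsy = np.array([rng.choice(r, n_biopsies, replace=False).mean()
                               for r in region_scores])
            cors.append(np.corrcoef(biopsy, whole)[0, 1])
        return float(np.mean(cors))

    # Expect a nonlinear rise that flattens beyond roughly four biopsies:
    # for n in (1, 2, 4, 8):
    #     print(n, concordance(region_scores, n))
    ```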

  13. A method for work modeling at complex systems: towards applying information systems in family health care units.

    PubMed

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on the understanding of the practices performed. To adopt technological devices that support activities in environments where work is complex, characterized by interdependence among a large number of variables, understanding how work is done not only takes on even greater importance, but also becomes a more difficult task. This study therefore presents a method for modeling work in complex systems, which improves knowledge about the way activities are performed in settings where work does not simply amount to executing procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, it seeks to provide a detailed and accurate view of how people perform their tasks, in order to apply information systems that support work in organizations.

  14. Biopharmaceutical considerations and characterizations in development of colon targeted dosage forms for inflammatory bowel disease.

    PubMed

    Malayandi, Rajkumar; Kondamudi, Phani Krishna; Ruby, P K; Aggarwal, Deepika

    2014-04-01

    Colon targeted dosage forms have been extensively studied for the localized treatment of inflammatory bowel disease. These dosage forms not only improve therapeutic efficacy but also reduce the incidence of adverse drug reactions, and hence improve patient compliance. However, the complex and highly variable gastrointestinal physiology limits the clinical success of these dosage forms. Biopharmaceutical characteristics of these dosage forms play a key role in rapid formulation development and help ensure clinical success. The complexity of product development and the clinical success of colon targeted dosage forms depend on biopharmaceutical characteristics such as the physicochemical properties of the drug substance, the pharmaceutical characteristics of the dosage form, physiological conditions, and the pharmacokinetic properties of the drug substance as well as the drug product. Various in vitro and in vivo techniques have been employed in the past to characterize the biopharmaceutical properties of colon targeted dosage forms. This review focuses on the factors influencing the biopharmaceutical performance of these dosage forms, in vitro characterization techniques, and in vivo studies.

  15. An IPSO-SVM algorithm for security state prediction of mine production logistics system

    NASA Astrophysics Data System (ADS)

    Zhang, Yanliang; Lei, Junhui; Ma, Qiuli; Chen, Xin; Bi, Runfang

    2017-06-01

    To reveal the laws governing the security state of mine production logistics, a theoretical basis for corporate security warning and resource regulation was provided. Considering that mine production logistics systems are complex and that their variables are difficult to acquire, a security status prediction model for the mine production logistics system based on improved particle swarm optimization and support vector machine (IPSO-SVM) is proposed in this paper. Firstly, through linear adjustment of the inertia weight and the learning weights, the convergence speed and search accuracy are enhanced, with the aim of dealing with the changeable complexity and the difficulty of data acquisition. The improved particle swarm optimization (IPSO) is then introduced to resolve the problem of parameter settings in traditional support vector machines (SVM). At the same time, a security status index system is built to determine the classification standards of the safety status. The feasibility and effectiveness of this method are finally verified using the experimental results.
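    The "improved" ingredients are the weight schedules and the wrapped SVM fitness, both small enough to sketch. In the Python fragment below (one common linear-adjustment variant; the paper's exact schedule constants are not given here, so all values are assumptions), each particle encodes log10(C, γ) of an RBF-SVM and is scored by cross-validated accuracy:

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def ipso_weights(t, T, w=(0.9, 0.4), c1=(2.5, 0.5), c2=(0.5, 2.5)):
        """Linear adjustment of inertia (w) and learning weights (c1, c2):
        exploration early in the run, exploitation and social pull late."""
        frac = t / T
        return (w[0] + (w[1] - w[0]) * frac,
                c1[0] + (c1[1] - c1[0]) * frac,
                c2[0] + (c2[1] - c2[0]) * frac)

    def fitness(log_params, X, y):
        """Negative cross-validated accuracy of an RBF-SVM; particles search
        log10(C) and log10(gamma) so the box is easy to bound."""
        C, gamma = 10.0 ** np.asarray(log_params)
        return -cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

    # In the PSO velocity update, w, c1, c2 = ipso_weights(t, T) replace constants.
    ```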

  16. Improving informed consent to chemotherapy: a randomized controlled trial of written information versus an interactive multimedia CD-ROM.

    PubMed

    Olver, Ian N; Whitford, Hayley S; Denson, Linley A; Peterson, Melissa J; Olver, Scott I

    2009-02-01

    This randomized controlled trial aimed to determine whether an interactive CD-ROM improved cancer patients' recall of chemotherapy treatment information over standard written information, and whether demographic, cognitive, and psychological factors predicted recall better than this format of delivery. One hundred and one new patients about to commence chemotherapy were randomized to receive written information or a CD-ROM containing treatment information before giving informed consent. Patients' recall, concentration, short-term memory, reading comprehension, anxiety, depression, and coping styles were assessed with standardized measures pre-treatment. Seventy-seven patients completed tests for recall of treatment information before their second chemotherapy session. Intention-to-treat analyses indicated no significant differences between the written information and CD-ROM groups across recall questions about the number of drugs received (p=.43), treatment length (p=.23), and treatment goal (p=.69). Binary logistic regressions indicated that, with the groups combined, different variables predicted each of the recall questions. An interactive CD-ROM did not improve cancer patients' recall of treatment information enough to warrant changes in consent procedures. Different variables predicted recall of different treatment aspects, highlighting the complex nature of attempting to improve patient recall. Attending to the effect of depression on patient knowledge and understanding appears paramount.

  17. Automatic Information Processing and High Performance Skills: Individual Differences and Mechanisms of Performance Improvement in Search-Detection and Complex Task

    DTIC Science & Technology

    1992-09-01

    [Abstract available only in fragments] An ability-structure measurement model is fit along with an autoregressive process; the influences of within-group age and sex on search performance were initially included as controls. Correlations between all of the ability measures, age, and sex were examined, and age and sex were retained as control variables in subsequent analyses of young adults, whose ages spanned a 15-year range.

  18. Shimmed electron beam welding process

    DOEpatents

    Feng, Ganjiang; Nowak, Daniel Anthony; Murphy, John Thomas

    2002-01-01

    A modified electron beam welding process effects welding of joints between superalloy materials by inserting a weldable shim in the joint and heating the superalloy materials with an electron beam. The process ensures full penetration of joints with a consistent percentage of filler material, and thereby improves the fatigue life of the joint by three to four times compared with the prior art. The process also accommodates variable shim thicknesses and joint fit-up gaps, providing increased manufacturing flexibility when joining complex airfoil structures and the like.

  19. A redefinition of water masses in the Vietnamese upwelling area

    NASA Astrophysics Data System (ADS)

    Dippner, Joachim W.; Loick-Wilde, Natalie

    2011-01-01

    A redefinition of the characteristic water masses in the Vietnamese upwelling area is presented, based on five interdisciplinary cruises during different seasons and 331 CTD casts. In contrast to the previous definition of Rojana-anawat et al. (2001), we can clearly define water masses that serve as end members of mixing. This new definition is useful for an improved understanding of the geographical positions of the different water masses with respect to climate variability, the phytoplankton distribution, and the complex nutrient cycle in this area.

  20. Technical development to improve satellite sounding over radiatively complex terrain

    NASA Technical Reports Server (NTRS)

    Schreiner, A. J.

    1985-01-01

    High resolution topography was acquired and applied on the McIDAS system. A technique for finding the surface skin temperature in the presence of cloud and reflected sunlight was implemented in the ALPEX retrieval software, and the variability of surface emissivity at microwave wavelengths was examined. Data containing raw radiances for all HIRS and MSU channels of NOAA-6 and -7 were used. METEOSAT data were used to derive cloud-drift and water-vapor winds over the Alpine region.

  1. Moderation and mediation in the psychological and drug treatment of chronic tension-type headache: the role of disorder severity and psychiatric comorbidity.

    PubMed

    Holroyd, Kenneth A; Labus, Jenifer S; Carlson, Bruce

    2009-06-01

    We evaluated two putative moderators of treatment outcome, as well as the role of Headache Management Self-Efficacy (HMSE) in mediating treatment outcomes, in the drug and non-drug treatment of chronic tension-type headache (CTTH). Subjects were 169 participants (M=38 yrs.; 77% female; M headache days/mo.=22) who received one of four treatments in the treatment of CTTH trial (JAMA, 2001; 285: 2208-15): tricyclic antidepressant medication, placebo, (cognitive-behavioral) stress-management therapy plus placebo, and stress-management therapy plus antidepressant medication. Severity of the CTTH disorder and the presence of a psychiatric (mood or anxiety) disorder were found to moderate the outcomes obtained with the three active treatments and with placebo, as well as to moderate the role of HMSE in mediating improvements. Both moderator effects appeared to reflect the differing influence of the moderator variable on each of the three active treatments, as well as the fact that the moderator variables exerted the opposite effect on placebo than on the active treatments. HMSE mediated treatment outcomes in the two stress-management conditions, but the pattern of HMSE mediation was complex, varying with the treatment condition, the outcome measure, and the moderator variable. Irrespective of the severity of the CTTH disorder, HMSE fully mediated the observed improvements in headache activity in the two stress-management conditions. However, for patients with a mood or anxiety disorder, HMSE only partially mediated improvements in headache disability, suggesting that an additional therapeutic mechanism is required to explain the observed improvements in headache disability in the two stress-management conditions.

  2. Repeated Listening Increases the Liking for Music Regardless of Its Complexity: Implications for the Appreciation and Aesthetics of Music

    PubMed Central

    Madison, Guy; Schiölde, Gunilla

    2017-01-01

    Psychological and aesthetic theories predict that music is appreciated at optimal, peak levels of familiarity and complexity, and that appreciation of music exhibits an inverted U-shaped relationship with familiarity as well as complexity. Because increased familiarity conceivably leads to improved processing and less perceived complexity, we test whether there is an interaction between familiarity and complexity. Specifically, increased familiarity should render the music subjectively less complex, and therefore move the apex of the U curve toward greater complexity. A naturalistic listening experiment was conducted, featuring 40 music examples (ME) divided by experts into 4 levels of complexity prior to the main experiment. The MEs were presented 28 times each across a period of approximately 4 weeks, and individual ratings were assessed throughout the experiment. Ratings of liking increased monotonically with repeated listening at all levels of complexity; both the simplest and the most complex MEs were liked more as a function of listening time, without any indication of a U-shaped relation. Although the MEs were previously unknown to the participants, the strongest predictor of liking was familiarity in terms of having listened to similar music before, i.e., familiarity with musical style. We conclude that familiarity is the single most important variable for explaining differences in liking among music, regardless of the complexity of the music. PMID:28408864

  3. Design of experiments for identification of complex biochemical systems with applications to mitochondrial bioenergetics.

    PubMed

    Vinnakota, Kalyan C; Beard, Daniel A; Dash, Ranjan K

    2009-01-01

    Identification of a complex biochemical system model requires appropriate experimental data. Models constructed on the basis of data from the literature often contain parameters that are not identifiable with high sensitivity and therefore require additional experimental data to identify those parameters. Here we report the application of a local sensitivity analysis to design experiments that will improve the identifiability of previously unidentifiable model parameters in a model of mitochondrial oxidative phosphorylation and the tricarboxylic acid cycle. Experiments were designed based on measurable biochemical reactants in a dilute suspension of purified cardiac mitochondria, with experimentally feasible perturbations to this system. The experimental perturbations and variables yielding the largest number of parameters above a 5% sensitivity level are presented and discussed.
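    The selection criterion is easy to express numerically. The Python sketch below is a generic finite-difference version under the assumption that a hypothetical `model(theta)` returns the observable outputs for parameter vector theta (nonzero outputs assumed); it computes normalized local sensitivities and counts how many parameters clear the 5% threshold for a candidate experiment:

    ```python
    import numpy as np

    def local_sensitivities(model, theta, rel_step=1e-3):
        """Normalized local sensitivities S[i, j] = (theta_j / y_i) * dy_i/dtheta_j,
        estimated by forward finite differences around the nominal parameters."""
        theta = np.asarray(theta, dtype=float)
        y0 = np.asarray(model(theta), dtype=float)
        S = np.zeros((y0.size, theta.size))
        for j, tj in enumerate(theta):
            dt = rel_step * tj
            tp = theta.copy()
            tp[j] += dt
            S[:, j] = (np.asarray(model(tp)) - y0) / dt * tj / y0
        return S

    def n_identifiable(S, threshold=0.05):
        """Parameters whose best sensitivity across outputs exceeds the cutoff."""
        return int((np.abs(S).max(axis=0) > threshold).sum())

    # Rank candidate experiments (perturbation/observable combinations) by
    # n_identifiable(local_sensitivities(model_for_experiment, theta_nominal))
    ```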

  4. Patent information - towards simplicity or complexity?

    NASA Astrophysics Data System (ADS)

    Shenton, Kathleen; Norton, Peter; Onodera, Natsuo (translator)

    Since the advent of online services, the ability to search and find chemical patent information has improved immeasurably. Recently, the integration of a multitude of files (through file merging as well as cross-file/simultaneous searches), 'intelligent' interfaces, and optical technology for large amounts of data seem to be achieving greater simplicity and convenience in the retrieval of patent information. In spite of this progress, a more essential problem remains and increases complexity: the tendency to expand indefinitely the range of claims for chemical substances through ultra-generic descriptions of structure (overuse of optional substituents, variable divalent groups, repeating groups, etc.) and long listings of prophetic examples. Not only does this tendency worry the producers and searchers of patent databases, it also obstructs truly worthy inventions in the future.

  5. Improving the Representation of Snow Crystal Properties within a Single-Moment Microphysics Scheme

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew L.; Petersen, Walter A.; Case, Jonathan L.; Dembek, Scott R.

    2010-01-01

    The assumptions of a single-moment microphysics scheme (NASA Goddard) were evaluated using a variety of surface, aircraft and radar data sets. Fixed distribution intercepts and snow bulk densities fail to represent the vertical variability and diversity of crystal populations for this event. Temperature-based equations have merit, but they can be adversely affected by complex temperature profiles that are inverted or isothermal. Column-based approaches can mitigate complex profiles of temperature but are restricted by the ability of the model to represent cloud depth. Spheres are insufficient for use in CloudSat reflectivity comparisons due to Mie resonance, but reasonable for Rayleigh scattering applications. Microphysics schemes will benefit from a greater range of snow crystal characteristics to accommodate naturally occurring diversity.

  6. The QSAR study of flavonoid-metal complexes scavenging ·OH free radicals

    NASA Astrophysics Data System (ADS)

    Wang, Bo-chu; Qian, Jun-zhen; Fan, Ying; Tan, Jun

    2014-10-01

    Flavonoid-metal complexes have antioxidant activities. However, the quantitative structure-activity relationship (QSAR) between flavonoid-metal complexes and their antioxidant activities has still not been tackled. On the basis of 21 structures of flavonoid-metal complexes and their antioxidant activities for scavenging the ·OH free radical, we optimised the structures using the Gaussian 03 software package and subsequently calculated and chose 18 quantum chemistry descriptors such as dipole moment, charge, and energy. We then selected the quantum chemistry descriptors most important to the IC50 of flavonoid-metal complexes for scavenging the ·OH free radical through stepwise linear regression, and we obtained 4 new variables through principal component analysis. Finally, we built QSAR models with those important quantum chemistry descriptors and the 4 new variables as the independent variables and the IC50 as the dependent variable using an Artificial Neural Network (ANN), and we validated the two models using experimental data. These results show that the two models in this paper are reliable and predictive.
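    The descriptor-reduction-plus-ANN workflow maps directly onto a standard pipeline. The Python sketch below is a generic reconstruction with scikit-learn, not the authors' software; the array names, layer size, and component count are assumptions echoing the "4 new variables". It chains standardization, PCA, and a small neural network:

    ```python
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def fit_qsar(X, y):
        """X: 21 x 18 matrix of quantum-chemistry descriptors, y: IC50 values
        (hypothetical arrays standing in for the Gaussian 03 outputs)."""
        model = make_pipeline(
            StandardScaler(),
            PCA(n_components=4),            # the four principal-component variables
            MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0),
        )
        return model.fit(X, y)

    # qsar = fit_qsar(X, y); predicted_ic50 = qsar.predict(X_new)
    ```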

  7. Improving protein-protein interaction prediction using evolutionary information from low-quality MSAs.

    PubMed

    Várnai, Csilla; Burkoff, Nikolas S; Wild, David L

    2017-01-01

    Evolutionary information stored in multiple sequence alignments (MSAs) has been used to identify the interaction interface of protein complexes, by measuring either the co-conservation or the co-mutation of amino acid residues across the interface. Recently, maximum entropy related correlated mutation measures (CMMs), such as direct information, which decouple direct from indirect interactions, have been developed to identify residue pairs interacting across the protein complex interface. These studies have focussed on carefully selected protein complexes with large, good-quality MSAs. In this work, we study protein complexes with a more typical MSA consisting of fewer than 400 sequences, using a set of 79 intramolecular protein complexes. Using a maximum entropy based CMM at the residue level, we develop an interface-level CMM score to be used in re-ranking docking decoys. We demonstrate that our interface-level CMM score compares favourably to the complementarity trace score, an evolutionary information-based score measuring co-conservation, when combined with the number of interface residues, a knowledge-based potential, and the variability score of individual amino acid sites. We also demonstrate that, since co-mutation and co-complementarity in the MSA contain orthogonal information, the best prediction performance using evolutionary information can be achieved by combining the co-mutation information of the CMM with the co-conservation information of a complementarity trace score, predicting a near-native structure as the top prediction for 41% of the dataset. The method presented is not restricted to small MSAs, and will likely improve interface prediction also for complexes with large, good-quality MSAs.

  8. Preparation, characterization and in vivo evaluation of formulation of repaglinide with hydroxypropyl-β-cyclodextrin.

    PubMed

    Liu, Meina; Cao, Wen; Sun, Yinghua; He, Zhonggui

    2014-12-30

    The therapeutic efficacy of repaglinide (RPG) is limited by its low and variable oral bioavailability, owing to its limited aqueous solubility. In the present study, the development and evaluation of an inclusion complex with hydroxypropyl-β-cyclodextrin (HP-β-CD) for improving the oral bioavailability of repaglinide was investigated systematically. The inclusion complex of repaglinide was prepared by a lyophilization technique using a drug:hydroxypropyl-β-cyclodextrin molar ratio of 1:15. The prepared complex was characterized by differential scanning calorimetry (DSC), X-ray diffractometry (XRD), and NMR spectroscopy, and evaluated by dissolution studies. (1)H NMR was used to study the structure of the repaglinide-HP-β-CD (RPG-HP-β-CD) inclusion complex; the analysis indicated a high probability that the repaglinide A-ring inserts into the narrow rim of the β-cyclodextrin molecule. All of the characterization data confirmed the formation of the RPG-HP-β-CD inclusion complex. The in vivo pharmacokinetics of RPG-HP-β-CD and the corresponding physical mixture were evaluated in beagle dogs. For the first time, a simple, rapid, and sensitive LC-MS/MS method for the determination of RPG in beagle dog plasma was developed. The Cmax and AUC0-t of RPG-HP-β-CD were 2.5 and 2 times higher, respectively, than those of the physical mixture. These results suggest that the interaction of repaglinide with HP-β-CD can notably improve the dissolution rate and bioavailability of repaglinide compared with its physical mixture. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Worldwide end-of-life practice for patients in ICUs.

    PubMed

    Wong, Wai-Tat; Phua, Jason; Joynt, Gavin M

    2018-04-01

    Published data and practice recommendations on end-of-life (EOL) care generally reflect Western practice frameworks. Understanding worldwide practices is important because improving economic conditions are promoting rapid expansion of intensive care services in many previously disadvantaged regions, and increasing migration has brought a new cultural diversity to previously predominantly unicultural societies. This review explores current knowledge of the similarities and differences in EOL practice between regions, and the possible causes and implications of these differences. Recent observational and survey data show marked variability in the practice of withholding and withdrawing life-sustaining therapy worldwide. Some evidence supports the view that culture, religion, and socioeconomic factors influence EOL practice and, individually or together, account for the differences observed. There are also likely to be commonly desired values and expectations for EOL practice, and recent attempts at establishing where worldwide consensus may lie have improved our understanding of shared values and practices. Awareness of differences, understanding of their likely complex causes, and use of this knowledge to inform individualized care at EOL are likely to improve the quality of care for patients. Further research should clarify the causes of EOL practice variability, monitor trends, and objectively evaluate the quality of EOL practice worldwide.

  10. Drugs meeting the molecular basis of diabetic kidney disease: bridging from molecular mechanism to personalized medicine.

    PubMed

    Lambers Heerspink, Hiddo J; Oberbauer, Rainer; Perco, Paul; Heinzel, Andreas; Heinze, Georg; Mayer, Gert; Mayer, Bernd

    2015-08-01

    Diabetic kidney disease (DKD) is a complex, multifactorial disease and is associated with a high risk of renal and cardiovascular morbidity and mortality. Clinical practice guidelines for diabetes recommend essentially identical treatments for all patients without taking into account how the individual responds to the instituted therapy. Yet, individuals vary widely in how they respond to medications, and optimal therapy therefore differs between individuals. Understanding the molecular mechanisms underlying variability in drug response will help tailor optimal therapy. Polymorphisms in genes related to drug pharmacokinetics have been used to explore mechanisms of response variability in DKD, but with limited success. The complex interaction between genetic make-up and environmental factors on the abundance of proteins and metabolites renders pharmacogenomics alone insufficient to fully capture response variability. A complementary approach is to attribute drug response variability to individual variability in the underlying molecular mechanisms involved in the progression of disease. The interplay of different processes (e.g. inflammation, fibrosis, angiogenesis, oxidative stress) appears to drive disease progression, but the individual contribution of each process varies. Drugs, on the other hand, address specific targets and thereby interfere in certain disease-associated processes. At this level, biomarkers may help to gain insight into which specific pathophysiological processes are involved in an individual, followed by a rational assessment of whether a specific drug's mode of action indeed targets the relevant process. This article describes the conceptual background and data-driven workflow developed by the SysKid consortium, aimed at improving characterization of the molecular mechanisms underlying DKD and of how individual drugs interfere with them at the molecular level, in order to tailor optimal therapy to individual patients. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  11. Riparian vegetation structure under desertification scenarios

    NASA Astrophysics Data System (ADS)

    Rosário Fernandes, M.; Segurado, Pedro; Jauch, Eduardo; Ferreira, M. Teresa

    2015-04-01

    Riparian areas are responsible for many ecological and ecosystem services, including the filtering function, that are considered crucial to the preservation of water quality and social benefits. The main goal of this study is to quantify and understand riparian variability under desertification scenario(s) and to identify the optimal riparian indicators for water scarcity and droughts (WS&D), thereby improving river basin management. This study was performed in the Iberian Tâmega basin, using riparian woody patches, mapped by visual interpretation of Google Earth imagery, along 130 sampling units consisting of 250 m long river stretches. Eight riparian structural indicators, related to the lateral dimension, weighted area and shape complexity of riparian patches, were calculated using the Patch Analyst extension for ArcGis 10. A set of 29 hydrological, climatic, and hydrogeomorphological variables were computed by a water modelling system (MOHID), using monthly meteorological data between 2008 and 2014. Land-use classes were also calculated, in a 250 m buffer surrounding each sampling unit, using a classification system based on Corine Land Cover. Boosted Regression Trees identified Mean width (MW) as the optimal riparian indicator for water scarcity and drought, followed by the Weighted Class Area (WCA) (classification accuracy = 0.79 and 0.69, respectively). Average Flow and Strahler number were consistently selected by all boosted models as the most important explanatory variables. However, a combined effect of hydrogeomorphology and land use can explain the high variability found in riparian width, mainly in the Tâmega tributaries. Riparian patches are larger towards the Tâmega river mouth, although with lower shape complexity, probably related to more continuous and almost monospecific stands. Climatic, hydrological and land-use scenarios, singly and combined, were used to quantify the riparian variability in response to these changes, and to assess the loss of riparian functions such as nutrient incorporation and sediment flux alterations.

  12. An analysis of the financial crisis in the KOSPI market using Hurst exponents

    NASA Astrophysics Data System (ADS)

    Yim, Kyubin; Oh, Gabjin; Kim, Seunghwan

    2014-09-01

    Recently, the study of the financial crisis has progressed to include the concept of the complex system, thereby improving the understanding of this extreme event from a neoclassical economic perspective. To determine which variables are related to the financial event caused by the 2008 US subprime crisis using temporal correlations, we investigate the diverse variables that may explain the financial system. These variables include return, volatility, trading volume and inter-trade duration data sets within the TAQ data for 27 highly capitalized individual companies listed on the KOSPI stock market. During 2008 and 2009, the Hurst exponent for the return time series over the whole period was less than 0.5, and the Hurst exponents for other variables, such as the volatility, trading volume and inter-trade duration, were greater than 0.5. Additionally, we analyze the relationships between the variation of temporal correlation and market instability based on these Hurst exponents and the degree of multifractality. We find that for the data related to trading volume, the Hurst exponents do not allow us to detect changes in market status, such as changes from normal to abnormal status, whereas other variables, including the return, volatility and weekly inter-trade duration, indicate a significant change in market status after the Lehman Brothers' bankruptcy. In addition, the multifractality and the measurement defined by subtracting the Hurst exponent of the return time series from that of the volatility time series decrease sharply after the US subprime event and recover approximately 50 days after the Lehman Brothers' collapse. Our findings suggest that the temporal features of financial quantities in the TAQ data set and the market complexity perform very well at diagnosing financial market stability.
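
    The Hurst exponents underlying this analysis can be estimated, for example, by classical rescaled-range (R/S) analysis; the sketch below is a minimal generic implementation, not the authors' code (their multifractal measures are not shown).

    ```python
    import numpy as np

    def hurst_rs(x, min_chunk=8):
        """Estimate the Hurst exponent of a 1-D series by rescaled-range
        (R/S) analysis: log-log slope of mean R/S versus window size."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sizes = np.unique(np.floor(np.logspace(np.log10(min_chunk),
                                               np.log10(n // 2), 12)).astype(int))
        rs_means = []
        for s in sizes:
            rs_vals = []
            for start in range(0, n - s + 1, s):  # non-overlapping windows
                seg = x[start:start + s]
                dev = np.cumsum(seg - seg.mean())  # cumulative deviation
                sd = seg.std()
                if sd > 0:
                    rs_vals.append((dev.max() - dev.min()) / sd)
            rs_means.append(np.mean(rs_vals))
        slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
        return slope  # H < 0.5 anti-persistent, H > 0.5 persistent

    rng = np.random.default_rng(0)
    print(hurst_rs(rng.normal(size=4096)))  # ~0.5 for white noise
    ```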

  13. Environmental variability and indicators: a few observations

    Treesearch

    William F. Laudenslayer

    1991-01-01

    The environment of the earth is exceedingly complex and variable. Indicator species are used to reduce that complexity and variability to a level that can be more easily understood. In recent years, use of indicators has increased dramatically. For the Forest Service, as an example, regulations that interpret the National Forest Management Act require the use...

  14. Environmental variability and acoustic signals: a multi-level approach in songbirds.

    PubMed

    Medina, Iliana; Francis, Clinton D

    2012-12-23

    Among songbirds, growing evidence suggests that acoustic adaptation of song traits occurs in response to habitat features. Despite extensive study, most research supporting acoustic adaptation has only considered acoustic traits averaged for species or populations, overlooking intraindividual variation of song traits, which may facilitate effective communication in heterogeneous and variable environments. Fewer studies have explicitly incorporated sexual selection, which, if strong, may favour variation across environments. Here, we evaluate the prevalence of acoustic adaptation among 44 species of songbirds by determining how environmental variability and sexual selection intensity are associated with song variability (intraindividual and intraspecific) and short-term song complexity. We show that variability in precipitation can explain short-term song complexity among taxonomically diverse songbirds, and that precipitation seasonality and the intensity of sexual selection are related to intraindividual song variation. Our results link song complexity to environmental variability, something previously found for mockingbirds (Family Mimidae). Perhaps more importantly, our results illustrate that individual variation in song traits may be shaped by both environmental variability and strength of sexual selection.

  15. Early in-session cognitive-emotional problem-solving predicts 12-month outcomes in depression with personality disorder.

    PubMed

    McCarthy, Kye L; Mergenthaler, Erhard; Grenyer, Brin F S

    2014-01-01

    Therapist-patient verbalizations reveal complex cognitive-emotional linguistic data. How these variables contribute to change requires further research. Emotional-cognitive text analysis using the Ulm cycles model software was applied to transcripts of the third session of psychotherapy for 20 patients with depression and personality disorder. Results showed that connecting-cycle sequences of problem-solving in the third hour predicted 12-month clinical outcomes. The therapist-patient dyads that improved most spent significantly more time early in the session in connecting cycles, whilst the least improved moved into connecting cycles late in the session. For this particular sample, it was clear that positive emotional problem-solving in therapy was beneficial.

  16. Newtonian Nudging For A Richards Equation-based Distributed Hydrological Model

    NASA Astrophysics Data System (ADS)

    Paniconi, C.; Marrocu, M.; Putti, M.; Verbunt, M.

    In this study a relatively simple data assimilation method has been implemented in a relatively complex hydrological model. The data assimilation technique is Newtonian relaxation or nudging, in which model variables are driven towards observations by a forcing term added to the model equations. The forcing term is proportional to the difference between simulation and observation (relaxation component) and contains four-dimensional weighting functions that can incorporate prior knowledge about the spatial and temporal variability and characteristic scales of the state variable(s) being assimilated. The numerical model couples a three-dimensional finite element Richards equation solver for variably saturated porous media and a finite difference diffusion wave approximation based on digital elevation data for surface water dynamics. We describe the implementation of the data assimilation algorithm for the coupled model and report on the numerical and hydrological performance of the resulting assimilation scheme. Nudging is shown to be successful in improving the hydrological simulation results, and it introduces little computational cost, in terms of CPU and other numerical aspects of the model's behavior, in some cases even improving numerical performance compared to model runs without nudging. We also examine the sensitivity of the model to nudging term parameters including the spatio-temporal influence coefficients in the weighting functions. Overall the nudging algorithm is quite flexible, for instance in dealing with concurrent observation datasets, gridded or scattered data, and different state variables, and the implementation presented here can be readily extended to any features not already incorporated. Moreover the nudging code and tests can serve as a basis for implementation of more sophisticated data assimilation techniques in a Richards equation-based hydrological model.
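
    The nudging forcing term described above has the generic form du/dt = f(u) + G·W·(obs − u). The following toy sketch (scalar state, explicit Euler, illustrative parameter names) shows how the relaxation component drives the model state towards an observation; it is a minimal stand-in, not the coupled Richards-equation model.

    ```python
    def step_with_nudging(u, f, obs, G, w, dt):
        """One explicit Euler step of du/dt = f(u) + G * w * (obs - u).
        G sets the relaxation strength; w is a space-time weight in [0, 1]
        standing in for the four-dimensional weighting functions."""
        return u + dt * (f(u) + G * w * (obs - u))

    # Toy scalar example: the model drifts, nudging pulls it toward the observation.
    f = lambda u: -0.1 * u
    u, obs = 1.0, 0.4
    for _ in range(100):
        u = step_with_nudging(u, f, obs, G=0.5, w=1.0, dt=0.1)
    print(u)  # settles at a balance between model dynamics and the observation
    ```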

  17. Permutation importance: a corrected feature importance measure.

    PubMed

    Altmann, André; Toloşi, Laura; Sander, Oliver; Lengauer, Thomas

    2010-05-15

    In life sciences, interpretability of machine learning models is as important as their prediction accuracy. Linear models are probably the most frequently used methods for assessing feature relevance, despite their relative inflexibility. However, in the past years effective estimators of feature relevance have been derived for highly complex or non-parametric models such as support vector machines and RandomForest (RF) models. Recently, it has been observed that RF models are biased in such a way that categorical variables with a large number of categories are preferred. In this work, we introduce a heuristic for normalizing feature importance measures that can correct the feature importance bias. The method is based on repeated permutations of the outcome vector for estimating the distribution of measured importance for each variable in a non-informative setting. The P-value of the observed importance provides a corrected measure of feature importance. We apply our method to simulated data and demonstrate that (i) non-informative predictors do not receive significant P-values, (ii) informative variables can successfully be recovered among non-informative variables and (iii) P-values computed with permutation importance (PIMP) are very helpful for deciding the significance of variables, and therefore improve model interpretability. Furthermore, PIMP was used to correct RF-based importance measures for two real-world case studies. We propose an improved RF model that uses the significant variables with respect to the PIMP measure and show that its prediction accuracy is superior to that of other existing models. R code for the method presented in this article is available at http://www.mpi-inf.mpg.de/~altmann/download/PIMP.R. Contact: altmann@mpi-inf.mpg.de, laura.tolosi@mpi-inf.mpg.de. Supplementary data are available at Bioinformatics online.
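
    The PIMP heuristic can be sketched generically as follows: fit a random forest, then repeatedly permute the outcome vector, refit, and compare the observed importances against the resulting null distribution. This is a simplified Python stand-in for the authors' R implementation (scikit-learn instead of R's randomForest, and empirical rather than fitted-distribution p-values).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def pimp_pvalues(X, y, n_perm=100, seed=0):
        """Permutation importance (PIMP): permute the outcome to build a null
        distribution of RF importances, then report a p-value per feature."""
        rng = np.random.default_rng(seed)
        rf = RandomForestClassifier(n_estimators=200, random_state=seed).fit(X, y)
        observed = rf.feature_importances_
        null = np.empty((n_perm, X.shape[1]))
        for b in range(n_perm):
            y_perm = rng.permutation(y)  # break any feature-outcome association
            null[b] = RandomForestClassifier(
                n_estimators=200, random_state=b).fit(X, y_perm).feature_importances_
        # p-value: fraction of null importances at least as large as observed
        return (null >= observed).mean(axis=0)
    ```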

  18. Stride-to-stride variability and complexity between novice and experienced runners during a prolonged run at anaerobic threshold speed.

    PubMed

    Mo, Shiwei; Chow, Daniel H K

    2018-05-19

    Motor control, related to running performance and running-related injuries, is affected by the progression of fatigue during a prolonged run. Distance runners are usually recommended to train at or slightly above anaerobic threshold (AT) speed to improve performance. However, running at AT speed may result in accelerated fatigue. It is not clear how one adapts one's running gait pattern during a prolonged run at AT speed and whether there are differences between runners with different training experience. The aim was to compare characteristics of stride-to-stride variability and complexity during a prolonged run at AT speed between novice runners (NR) and experienced runners (ER). Both NR (n = 17) and ER (n = 17) performed a treadmill run for 31 min at his/her AT speed. Stride interval dynamics were obtained throughout the run, with the middle 30 min equally divided into six time intervals (denoted as T1, T2, T3, T4, T5 and T6). Mean, coefficient of variation (CV) and scaling exponent alpha of stride intervals were calculated for each interval for each group. The study revealed that mean stride interval increased significantly with running time, following a non-linear trend (p<0.001). Stride interval variability (CV) remained relatively constant for NR (p = 0.22) and changed non-linearly for ER (p = 0.023) throughout the run. Alpha differed significantly between groups at T2, T5 and T6, and changed non-linearly with running time for both groups, with slight differences. These findings provide insights into how the motor control system adapts to the progression of fatigue, and evidence that long-term training enhances motor control. Although both ER and NR could regulate gait complexity to maintain AT speed throughout the prolonged run, ER also regulated stride interval variability to achieve the goal. Copyright © 2018. Published by Elsevier B.V.
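
    The scaling exponent alpha reported for stride-interval series is typically obtained by detrended fluctuation analysis (DFA); the abstract does not name the estimator, so the minimal generic sketch below (illustrative window sizes, linear detrending) should be read as one common choice rather than the authors' exact procedure.

    ```python
    import numpy as np

    def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
        """Detrended fluctuation analysis of a stride-interval series.
        alpha ~ 0.5 uncorrelated; alpha > 0.5 persistent long-range correlations."""
        y = np.cumsum(np.asarray(x, float) - np.mean(x))  # integrated profile
        flucts = []
        for s in scales:
            n_win = len(y) // s
            f2 = []
            for k in range(n_win):
                seg = y[k * s:(k + 1) * s]
                t = np.arange(s)
                coef = np.polyfit(t, seg, 1)  # linear detrending per window
                f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha
    ```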

  19. Evaluation of a Specialized Yoga Program for Persons Admitted to a Complex Continuing Care Hospital: A Pilot Study

    PubMed Central

    Kuluski, Kerry; Bechsgaard, Gitte; Ridgway, Jennifer; Katz, Joel

    2016-01-01

    Introduction. The purpose of this study was to evaluate a specialized yoga intervention for inpatients in a rehabilitation and complex continuing care hospital. Design. Single-cohort repeated measures design. Methods. Participants (N = 10) admitted to a rehabilitation and complex continuing care hospital were recruited to participate in a 50–60 min Hatha Yoga class (modified for wheelchair users/seated position) once a week for eight weeks, with assigned homework practice. Questionnaires on pain (pain, pain interference, and pain catastrophizing), psychological variables (depression, anxiety, and experiences with injustice), mindfulness, self-compassion, and spiritual well-being were collected at three intervals: pre-, mid-, and post-intervention. Results. Repeated measures ANOVAs revealed a significant main effect of time indicating improvements over the course of the yoga program on the (1) anxiety subscale of the Hospital Anxiety and Depression Scale, F(2,18) = 4.74, p < .05, ηp² = .35, (2) Self-Compassion Scale-Short Form, F(2,18) = 3.71, p < .05, ηp² = .29, and (3) Magnification subscale of the Pain Catastrophizing Scale, F(2,18) = 3.66, p < .05, ηp² = .29. Discussion. The results suggest that an 8-week Hatha Yoga program improves pain-related factors and psychological experiences in individuals admitted to a rehabilitation and complex continuing care hospital. PMID:28115969

  20. Mapping SOC (Soil Organic Carbon) using LiDAR-derived vegetation indices in a random forest regression model

    NASA Astrophysics Data System (ADS)

    Will, R. M.; Glenn, N. F.; Benner, S. G.; Pierce, J. L.; Spaete, L.; Li, A.

    2015-12-01

    Quantifying SOC (Soil Organic Carbon) storage in complex terrain is challenging due to high spatial variability. Generally, the challenge is met by transforming point data to the entire landscape using surrogate, spatially-distributed, variables like elevation or precipitation. In many ecosystems, remotely sensed information on above-ground vegetation (e.g. NDVI) is a good predictor of below-ground carbon stocks. In this project, we are attempting to improve this predictive method by incorporating LiDAR-derived vegetation indices. LiDAR provides a mechanism for improved characterization of aboveground vegetation by providing structural parameters such as vegetation height and biomass. In this study, a random forest model is used to predict SOC using a suite of LiDAR-derived vegetation indices as predictor variables. The Reynolds Creek Experimental Watershed (RCEW) is an ideal location for a study of this type since it encompasses a strong elevation/precipitation gradient that supports lower biomass sagebrush ecosystems at low elevations and forests with more biomass at higher elevations. Sagebrush ecosystems composed of Wyoming, Low and Mountain Sagebrush have SOC values ranging from 0.4 to 1% (top 30 cm), while higher biomass ecosystems composed of aspen, juniper and fir have SOC values approaching 4% (top 30 cm). Large differences in SOC have been observed between canopy and interspace locations and high resolution vegetation information is likely to explain plot scale variability in SOC. Mapping of the SOC reservoir will help identify underlying controls on SOC distribution and provide insight into which processes are most important in determining SOC in semi-arid mountainous regions. In addition, airborne LiDAR has the potential to characterize vegetation communities at a high resolution and could be a tool for improving estimates of SOC at larger scales.
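
    As a schematic of the modelling setup described (LiDAR-derived vegetation indices as predictors of SOC in a random forest regression), here is a minimal sketch on synthetic stand-in data; the variable names, feature columns and numbers are illustrative only.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # X columns might hold LiDAR-derived indices (vegetation height, canopy
    # cover, a biomass proxy) plus elevation; y is plot-scale SOC (%).
    rng = np.random.default_rng(1)
    X = rng.uniform(size=(200, 4))                     # synthetic stand-in
    y = 0.4 + 3.5 * X[:, 0] * X[:, 1] + 0.2 * rng.normal(size=200)

    rf = RandomForestRegressor(n_estimators=500, random_state=1)
    print(cross_val_score(rf, X, y, cv=5, scoring="r2").mean())
    rf.fit(X, y)
    print(rf.feature_importances_)  # which indices carry predictive weight
    ```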

  1. A New, Highly Improved Two-Cycle Engine

    NASA Technical Reports Server (NTRS)

    Wiesen, Bernard

    2008-01-01

    The figure presents a cross-sectional view of a supercharged, variable-compression, two-cycle, internal-combustion engine that offers significant advantages over prior such engines. The improvements are embodied in a combination of design changes that contribute synergistically to improvements in performance and economy. Although the combination of design changes and the principles underlying them are complex, one of the main effects of the changes on the overall engine design is reduced (relative to prior two-cycle designs) mechanical complexity, which translates directly to reduced manufacturing cost and increased reliability. Other benefits include increases in the efficiency of both scavenging and supercharging. The improvements retain the simplicity and other advantages of two-cycle engines while affording increases in volumetric efficiency and performance across a wide range of operating conditions that heretofore have been accessible to four-cycle engines but not to conventionally scavenged two-cycle ones, thereby increasing the range of usefulness of the two-cycle engine into all areas now dominated by the four-cycle engine. The design changes and benefits are too numerous to describe here in detail, but it is possible to summarize the major improvements. Reciprocating shuttle inlet valve: the entire reciprocating shuttle inlet valve and its operating gear are constructed as a single member. The shuttle valve is actuated in a lost-motion arrangement in which, at the ends of its stroke, projections on the shuttle valve come to rest against abutments at the ends of grooves in a piston skirt. This shuttle-valve design obviates the customary complex valve mechanism, actuated from an engine crankshaft or camshaft, yet it is effective with every type of two-cycle engine, from small high-speed single-cylinder model engines to large low-speed multiple-cylinder engines.

  2. Complexity, accuracy and practical applicability of different biogeochemical model versions

    NASA Astrophysics Data System (ADS)

    Los, F. J.; Blaas, M.

    2010-04-01

    The construction of validated biogeochemical model applications as prognostic tools for the marine environment involves a large number of choices, particularly with respect to the level of detail of the physical, chemical and biological aspects. Generally speaking, enhanced complexity might enhance veracity, accuracy and credibility. However, very complex models are not necessarily effective or efficient forecast tools. In this paper, models of varying degrees of complexity are evaluated with respect to their forecast skills. In total 11 biogeochemical model variants have been considered based on four different horizontal grids. The applications vary in spatial resolution, in vertical resolution (2DH versus 3D), in nature of transport, in turbidity and in the number of phytoplankton species. Included models range from 15 year old applications with relatively simple physics up to present state of the art 3D models. With all applications the same year, 2003, has been simulated. During the model intercomparison it has been noticed that the 'OSPAR' Goodness of Fit cost function (Villars and de Vries, 1998) leads to insufficient discrimination of different models. This results in models obtaining similar scores although closer inspection of the results reveals large differences. In this paper therefore, we have adopted the target diagram by Jolliff et al. (2008), which provides a concise and more contrasting picture of model skill on the entire model domain and for the entire period of the simulations. Correctness in predicting the mean and the variability is shown separately, which enhances insight into model functioning. Using the target diagrams it is demonstrated that recent models are more consistent and have smaller biases. Graphical inspection of time series confirms this, as the level of variability appears more realistic, also given the multi-annual background statistics of the observations. Nevertheless, whether the improvements are all genuine for the particular year cannot be judged due to the low sampling frequency of the traditional monitoring data at hand. Specifically, the overall results for chlorophyll-a are rather consistent throughout all models, but regionally recent models are better; resolution is crucial for the accuracy of transport and more important than the nature of the forcing of the transport; SPM strongly affects the biomass simulation and species composition, but even the most recent SPM results do not yet obtain a good overall score; coloured dissolved organic matter (CDOM) should be included in the calculation of the light regime; more complexity in the phytoplankton model improves the chlorophyll-a simulation, but the simulated species composition needs further improvement for some of the functional groups.
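
    The target diagram of Jolliff et al. (2008) separates a model's bias from its unbiased (centred) RMSD, both normalized by the standard deviation of the observations, so that the distance from the origin equals the total normalized RMSD. A minimal sketch of these statistics follows; it is a generic reconstruction, not the paper's code.

    ```python
    import numpy as np

    def target_stats(model, obs):
        """Bias and unbiased RMSD, normalized by the observation standard
        deviation, as plotted on a Jolliff-style target diagram
        (bias on the y-axis, signed uRMSD on the x-axis)."""
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        bias = model.mean() - obs.mean()
        urmsd = np.sqrt(np.mean(((model - model.mean())
                                 - (obs - obs.mean())) ** 2))
        sign = np.sign(model.std() - obs.std())  # negative: model under-disperses
        s = obs.std()
        return bias / s, sign * urmsd / s  # distance from origin = total nRMSD
    ```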

  3. Optimal control in microgrid using multi-agent reinforcement learning.

    PubMed

    Li, Fu-Dong; Wu, Min; He, Yong; Chen, Xin

    2012-11-01

    This paper presents an improved reinforcement learning method to minimize electricity costs on the premise of satisfying the power balance and generation limits of units in a microgrid with grid-connected mode. Firstly, the microgrid control requirements are analyzed and the objective function of optimal control for the microgrid is proposed. Then, a state variable, "Average Electricity Price Trend", which expresses the most probable transitions of the system, is developed so as to reduce the complexity and randomness of the microgrid, and a multi-agent architecture including agents, state variables, action variables and a reward function is formulated. Furthermore, dynamic hierarchical reinforcement learning, based on the change rate of a key state variable, is established to carry out optimal policy exploration. The analysis shows that the proposed method is beneficial for handling the problem of the "curse of dimensionality" and speeds up learning in an unknown large-scale world. Finally, the simulation results under JADE (Java Agent Development Framework) demonstrate the validity of the presented method for optimal control of a microgrid with grid-connected mode. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
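
    A minimal tabular Q-learning loop conveys the flavour of the reinforcement learning machinery involved, though the paper's dynamic hierarchical method and multi-agent architecture are considerably richer; the states, actions, and stand-in environment below are purely illustrative.

    ```python
    import numpy as np

    # States combine a battery level with the "Average Electricity Price Trend"
    # (falling / flat / rising); actions are charge / idle / discharge.
    n_states, n_actions = 5 * 3, 3
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, eps = 0.1, 0.95, 0.1
    rng = np.random.default_rng(0)

    def env_step(s, a):
        # Stand-in environment: reward = negative electricity cost of the action.
        return rng.integers(n_states), -float(a) * rng.uniform(0.0, 1.0)

    s = 0
    for _ in range(10000):
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s2, r = env_step(s, a)
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])  # TD update
        s = s2
    ```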

  4. Adaptive Synchronization of Fractional Order Complex-Variable Dynamical Networks via Pinning Control

    NASA Astrophysics Data System (ADS)

    Ding, Da-Wei; Yan, Jie; Wang, Nian; Liang, Dong

    2017-09-01

    In this paper, the synchronization of fractional order complex-variable dynamical networks is studied using an adaptive pinning control strategy based on close center degree. Some effective criteria for global synchronization of fractional order complex-variable dynamical networks are derived based on the Lyapunov stability theory. From the theoretical analysis, one concludes that under appropriate conditions, the complex-variable dynamical networks can realize the global synchronization by using the proper adaptive pinning control method. Meanwhile, we succeed in solving the problem about how much coupling strength should be applied to ensure the synchronization of the fractional order complex networks. Therefore, compared with the existing results, the synchronization method in this paper is more general and convenient. This result extends the synchronization condition of the real-variable dynamical networks to the complex-valued field, which makes our research more practical. Finally, two simulation examples show that the derived theoretical results are valid and the proposed adaptive pinning method is effective. Supported by National Natural Science Foundation of China under Grant No. 61201227, National Natural Science Foundation of China Guangdong Joint Fund under Grant No. U1201255, the Natural Science Foundation of Anhui Province under Grant No. 1208085MF93, 211 Innovation Team of Anhui University under Grant Nos. KJTD007A and KJTD001B, and also supported by Chinese Scholarship Council
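
    A generic form of such a pinned fractional-order complex-variable network is sketched below in LaTeX; this is an illustrative reconstruction of the usual model class, and the paper's exact system, coupling and adaptive pinning laws may differ.

    ```latex
    % Pinned fractional-order complex-variable network (illustrative form):
    \[
    D^{q} z_i(t) = f\bigl(z_i(t)\bigr)
      + c \sum_{j=1}^{N} a_{ij}\, \Gamma z_j(t)
      - c\, d_i \bigl(z_i(t) - s(t)\bigr), \qquad i = 1, \dots, N,
    \]
    % where 0 < q < 1 is the fractional order, z_i the complex-valued state,
    % c the coupling strength, A = (a_ij) the network topology, \Gamma the
    % inner coupling matrix, s(t) the synchronization target, and d_i >= 0 the
    % adaptive feedback gain, nonzero only on the pinned nodes.
    ```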

  5. Sleep Consolidates Motor Learning of Complex Movement Sequences in Mice.

    PubMed

    Nagai, Hirotaka; de Vivo, Luisa; Bellesi, Michele; Ghilardi, Maria Felice; Tononi, Giulio; Cirelli, Chiara

    2017-02-01

    Sleep-dependent consolidation of motor learning has been extensively studied in humans, but it remains unclear why some, but not all, learned skills benefit from sleep. Here, we compared 2 different motor tasks, both requiring the mice to run on an accelerating device. In the rotarod task, mice learn to maintain balance while running on a small rod, while in the complex wheel task, mice run on an accelerating wheel with an irregular rung pattern. In the rotarod task, performance improved to the same extent after sleep or after sleep deprivation (SD). Overall, using 7 different experimental protocols (41 sleep deprived mice, 26 sleeping controls), we found large interindividual differences in the learning and consolidation of the rotarod task, but sleep before/after training did not account for this variability. By contrast, using the complex wheel, we found that sleep after training, relative to SD, led to better performance from the beginning of the retest session, and longer sleep was correlated with greater subsequent performance. As in humans, the effects of sleep showed large interindividual variability and varied between fast and slow learners, with sleep favoring the preservation of learned skills in fast learners and leading to a net offline gain in the performance in slow learners. Using Fos expression as a proxy for neuronal activation, we also found that complex wheel training engaged motor cortex and hippocampus more than the rotarod training. Sleep specifically consolidates a motor skill that requires complex movement sequences and strongly engages both motor cortex and hippocampus. © Sleep Research Society 2016. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  6. Modeling the probability of arsenic in groundwater in New England as a tool for exposure assessment

    USGS Publications Warehouse

    Ayotte, J.D.; Nolan, B.T.; Nuckols, J.R.; Cantor, K.P.; Robinson, G.R.; Baris, D.; Hayes, L.; Karagas, M.; Bress, W.; Silverman, D.T.; Lubin, J.H.

    2006-01-01

    We developed a process-based model to predict the probability of arsenic exceeding 5 μg/L in drinking water wells in New England bedrock aquifers. The model is being used for exposure assessment in an epidemiologic study of bladder cancer. One important study hypothesis that may explain increased bladder cancer risk is elevated concentrations of inorganic arsenic in drinking water. In eastern New England, 20-30% of private wells exceed the arsenic drinking water standard of 10 micrograms per liter. Our predictive model significantly improves the understanding of factors associated with arsenic contamination in New England. Specific rock types, high arsenic concentrations in stream sediments, geochemical factors related to areas of Pleistocene marine inundation and proximity to intrusive granitic plutons, and hydrologic and landscape variables relating to groundwater residence time increase the probability of arsenic occurrence in groundwater. Previous studies suggest that arsenic in bedrock groundwater may be partly from past arsenical pesticide use. Variables representing historic agricultural inputs do not improve the model, indicating that this source does not significantly contribute to current arsenic concentrations. Due to the complexity of the fractured bedrock aquifers in the region, well depth and related variables also are not significant predictors. © 2006 American Chemical Society.
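
    Probability-of-exceedance models of this kind are commonly fitted as logistic regressions on geologic and hydrologic predictors; the sketch below shows the general form on synthetic data and is emphatically not the USGS model itself (its predictors and fitting procedure are more involved).

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # Stand-in predictors: e.g. rock-type score, stream-sediment As, residence time.
    X = rng.normal(size=(1000, 3))
    logit = 0.8 * X[:, 0] + 0.6 * X[:, 1] + 0.4 * X[:, 2] - 1.0
    y = (rng.random(1000) < 1 / (1 + np.exp(-logit))).astype(int)

    model = LogisticRegression().fit(X, y)
    print(model.predict_proba(X[:5])[:, 1])  # P(arsenic > 5 ug/L) for five wells
    ```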

  7. Specific transfer effects following variable priority dual-task training in older adults.

    PubMed

    Lussier, Maxime; Bugaiska, Aurélia; Bherer, Louis

    2017-01-01

    Past divided attention training studies in older adults have suggested that variable priority training (VPT) tends to show larger improvement than fixed priority training (FPT). However, it remains unclear whether VPT leads to larger transfer effects. In this study, eighty-three older adults aged between 55 and 65 received five 1-hour sessions of VPT, FPT or an active placebo. VPT and FPT subjects trained on a complex dual-task condition with variable stimulus timings in order to promote more flexible and self-guided strategies with regard to attentional priority devoted to the concurrent tasks. Real-time individualized feedback was provided to encourage improvement. The active placebo group attended computer classes. Near and far modality transfer tasks were used to assess the generalization of transfer effects. Results showed that VPT induced significantly larger transfer effects than FPT on a near modality transfer task. Evidence for larger transfer effects in VPT than FPT on a far modality transfer task was also observed. Furthermore, the superiority of VPT over FPT in transfer effects was specific to the ability to coordinate two concurrent tasks. Results of this study help better understand the benefits of VPT attentional training on transfer effects, which is an essential outcome for cognitive training effectiveness and relevancy.

  8. A RESEARCH DATABASE FOR IMPROVED DATA MANAGEMENT AND ANALYSIS IN LONGITUDINAL STUDIES

    PubMed Central

    BIELEFELD, ROGER A.; YAMASHITA, TOYOKO S.; KEREKES, EDWARD F.; ERCANLI, EHAT; SINGER, LYNN T.

    2014-01-01

    We developed a research database for a five-year prospective investigation of the medical, social, and developmental correlates of chronic lung disease during the first three years of life. We used the Ingres database management system and the Statit statistical software package. The database includes records containing 1300 variables each, the results of 35 psychological tests, each repeated five times (providing longitudinal data on the child, the parents, and behavioral interactions), both raw and calculated variables, and both missing and deferred values. The four-layer menu-driven user interface incorporates automatic activation of complex functions to handle data verification, missing and deferred values, static and dynamic backup, determination of calculated values, display of database status, reports, bulk data extraction, and statistical analysis. PMID:7596250

  9. Conservation and Variability of Meiosis Across the Eukaryotes.

    PubMed

    Loidl, Josef

    2016-11-23

    Comparisons among a variety of eukaryotes have revealed considerable variability in the structures and processes involved in their meiosis. Nevertheless, conventional forms of meiosis occur in all major groups of eukaryotes, including early-branching protists. This finding confirms that meiosis originated in the common ancestor of all eukaryotes and suggests that primordial meiosis may have had many characteristics in common with conventional extant meiosis. However, it is possible that the synaptonemal complex and the delicate crossover control related to its presence were later acquisitions. Later still, modifications to meiotic processes occurred within different groups of eukaryotes. Better knowledge on the spectrum of derived and uncommon forms of meiosis will improve our understanding of many still mysterious aspects of the meiotic process and help to explain the evolutionary basis of functional adaptations to the meiotic program.

  10. Complex mean circulation over the inner shelf south of Martha's Vineyard revealed by observations and a high-resolution model

    USGS Publications Warehouse

    Ganju, Neil K.; Lentz, Steven J.; Kirincich, Anthony R.; Farrar, J. Thomas

    2011-01-01

    Inner-shelf circulation is governed by the interaction between tides, baroclinic forcing, winds, waves, and frictional losses; the mean circulation ultimately governs exchange between the coast and ocean. In some cases, oscillatory tidal currents interact with bathymetric features to generate a tidally rectified flow. Recent observational and modeling efforts in an overlapping domain centered on the Martha's Vineyard Coastal Observatory (MVCO) provided an opportunity to investigate the spatial and temporal complexity of circulation on the inner shelf. ADCP and surface radar observations revealed a mean circulation pattern that was highly variable in the alongshore and cross-shore directions. Nested modeling incrementally improved representation of the mean circulation as grid resolution increased and indicated tidal rectification as the generation mechanism of a counter-clockwise gyre near the MVCO. The loss of model skill with decreasing resolution is attributed to insufficient representation of the bathymetric gradients (Δh/h), which is important for representing nonlinear interactions between currents and bathymetry. The modeled momentum balance was characterized by large spatial variability of the pressure gradient and horizontal advection terms over short distances, suggesting that observed inner-shelf momentum balances may be confounded. Given the available observational and modeling data, this work defines the spatially variable mean circulation and its formation mechanism—tidal rectification—and illustrates the importance of model resolution for resolving circulation and constituent exchange near the coast. The results of this study have implications for future observational and modeling studies near the MVCO and other inner-shelf locations with alongshore bathymetric variability.

  11. Application of quality by design concept to develop a dual gradient elution stability-indicating method for cloxacillin forced degradation studies using combined mixture-process variable models.

    PubMed

    Zhang, Xia; Hu, Changqin

    2017-09-08

    Penicillins are typical of complex ionic samples, which are likely to contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development for cloxacillin. Rules for the structures, retention and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and response surface methodology (RSM) was used to determine the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to estimate their probability of meeting the specifications of the CQAs. A Plackett-Burman design was performed to test the robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study using an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. PBSM3D: A finite volume, scalar-transport blowing snow model for use with variable resolution meshes

    NASA Astrophysics Data System (ADS)

    Marsh, C.; Wayand, N. E.; Pomeroy, J. W.; Wheater, H. S.; Spiteri, R. J.

    2017-12-01

    Blowing snow redistribution results in heterogeneous snowcovers that are ubiquitous in cold, windswept environments. Capturing this spatial and temporal variability is important for melt and runoff simulations. Point scale blowing snow transport models are difficult to apply in fully distributed hydrological models due to landscape heterogeneity and complex wind fields. Many existing distributed snow transport models have empirical wind flow and/or simplified wind direction algorithms that perform poorly in calculating snow redistribution where there are divergent wind flows, sharp topography, and over large spatial extents. Herein, a steady-state scalar transport model is discretized using the finite volume method (FVM), with parameterizations from the Prairie Blowing Snow Model (PBSM). PBSM has been applied in hydrological response units and grids to prairie, arctic, glacier, and alpine terrain and shows a good capability to represent snow redistribution over complex terrain. The FVM discretization takes advantage of the variable resolution mesh in the Canadian Hydrological Model (CHM) to ensure efficient calculations over small and large spatial extents. Variable resolution unstructured meshes preserve surface heterogeneity but result in fewer computational elements versus high-resolution structured (raster) grids. Snowpack, soil moisture, and streamflow observations were used to evaluate CHM-modelled outputs in a sub-arctic and an alpine basin. Newly developed remotely sensed snowcover indices allowed for validation over large basins. CHM simulations of snow hydrology were improved by inclusion of the blowing snow model. The results demonstrate the key role of snow transport processes in creating pre-melt snowcover heterogeneity and therefore governing post-melt soil moisture and runoff generation dynamics.
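
    To illustrate a finite volume discretization of scalar transport in its simplest setting, here is a minimal first-order upwind sketch on a one-dimensional periodic grid; the actual PBSM3D model operates on variable-resolution unstructured meshes with blowing-snow parameterizations not shown here.

    ```python
    import numpy as np

    def upwind_fvm(c, u, dx, dt, n_steps, source=0.0):
        """First-order upwind finite-volume update for 1-D scalar transport
        dc/dt + d(uc)/dx = source, with u > 0 on a periodic domain."""
        for _ in range(n_steps):
            flux = u * c                          # upwind flux for u > 0
            c = c - dt / dx * (flux - np.roll(flux, 1)) + dt * source
        return c

    c0 = np.zeros(100)
    c0[10:20] = 1.0                               # initial blowing-snow "plume"
    c = upwind_fvm(c0, u=1.0, dx=1.0, dt=0.5, n_steps=50)  # CFL = 0.5, stable
    ```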

  13. Random Survival Forest in practice: a method for modelling complex metabolomics data in time to event analysis.

    PubMed

    Dietrich, Stefan; Floegel, Anna; Troll, Martina; Kühn, Tilman; Rathmann, Wolfgang; Peters, Anette; Sookthai, Disorn; von Bergen, Martin; Kaaks, Rudolf; Adamski, Jerzy; Prehn, Cornelia; Boeing, Heiner; Schulze, Matthias B; Illig, Thomas; Pischon, Tobias; Knüppel, Sven; Wang-Sattler, Rui; Drogan, Dagmar

    2016-10-01

    The application of metabolomics in prospective cohort studies is statistically challenging. Given the importance of appropriate statistical methods for the selection of disease-associated metabolites in highly correlated complex data, we combined random survival forest (RSF) with an automated backward elimination procedure that addresses such issues. Our RSF approach was illustrated with data from the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam study, with concentrations of 127 serum metabolites as exposure variables and time to development of type 2 diabetes mellitus (T2D) as the outcome variable. An analysis of this data set using Cox regression with a stepwise selection method was published recently. The methodological comparison (RSF versus Cox regression) was replicated in two independent cohorts. Finally, the R code for implementing the metabolite selection procedure in the RSF syntax is provided. The application of the RSF approach in EPIC-Potsdam resulted in the identification of 16 incident T2D-associated metabolites which slightly improved the prediction of T2D when used in addition to traditional T2D risk factors and also when used together with classical biomarkers. The identified metabolites partly agreed with previous findings using Cox regression, though RSF selected a higher number of highly correlated metabolites. The RSF method appeared to be a promising approach for the identification of disease-associated variables in complex data with time to event as the outcome. The demonstrated RSF approach provides comparable findings to the generally used Cox regression, but also addresses the problem of multicollinearity and is suitable for high-dimensional data. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
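
    A rough Python stand-in for RSF with automated backward elimination is sketched below, assuming the scikit-survival package (`sksurv`, where `y` is a structured array of event indicator and time); the consortium's actual R procedure differs in its importance measure and elimination schedule.

    ```python
    import numpy as np
    from sksurv.ensemble import RandomSurvivalForest

    def backward_eliminate(X, y, names, drop_frac=0.2, min_feats=5, seed=0):
        """Repeatedly fit an RSF and drop the least 'important' metabolites,
        scored here by a simple permutation drop in concordance."""
        keep = list(range(X.shape[1]))
        rng = np.random.default_rng(seed)
        while len(keep) > min_feats:
            rsf = RandomSurvivalForest(n_estimators=200,
                                       random_state=seed).fit(X[:, keep], y)
            base = rsf.score(X[:, keep], y)       # concordance index
            imp = []
            for pos in range(len(keep)):
                Xp = X[:, keep].copy()
                Xp[:, pos] = rng.permutation(Xp[:, pos])
                imp.append(base - rsf.score(Xp, y))
            order = np.argsort(imp)               # least important first
            n_drop = max(1, int(drop_frac * len(keep)))
            keep = [keep[i] for i in sorted(order[n_drop:])]
        return [names[i] for i in keep]
    ```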

  14. Mesoscale Convective Complex versus Non-Mesoscale Convective Complex Thunderstorms: A Comparison of Selected Meteorological Variables.

    DTIC Science & Technology

    1986-08-01

    [Only fragments of this report's list of tables survive in the record: root mean square errors for selected variables; variable ranges and mean values for MCC and non-MCC cases; and the alpha levels at which the two mean values are determined to be significantly different (e.g. none for vorticity advection, .20 for the 700 mb vertical velocity forecast). These alpha levels express the probability of erroneously concluding that the means differ.]

  15. Chaos, complexity and complicatedness: lessons from rocket science.

    PubMed

    Norman, Geoff

    2011-06-01

    Recently several authors have drawn parallels between educational research and some theories of natural science, in particular complexity theory and chaos theory. The central claim is that both the natural science theories are useful metaphors for education research in that they deal with phenomena that involve many variables interacting in complex, non-linear and unstable ways, and leading to effects that are neither reproducible nor comprehensible. This paper presents a counter-argument. I begin by carefully examining the concepts of uncertainty, complexity and chaos, as described in physical science. I distinguish carefully between systems that are, respectively, complex, chaotic and complicated. I demonstrate that complex and chaotic systems have highly specific characteristics that are unlikely to be present in education systems. I then suggest that, in fact, there is ample evidence that human learning can be understood adequately with conventional linear models. The implications of these opposing world views are substantial. If education science has the properties of complex or chaotic systems, we should abandon any attempt at control or understanding. However, as I point out, to do so would ignore a number of recent developments in our understanding of learning that hold promise to yield substantial improvements in effectiveness and efficiency of learning. © Blackwell Publishing Ltd 2011.

  16. Nutraceutical approaches to metabolic syndrome.

    PubMed

    Sirtori, Cesare R; Pavanello, Chiara; Calabresi, Laura; Ruscica, Massimiliano

    2017-12-01

    Metabolic Syndrome (MetS), affecting at least 30% of adults in the Western world, is characterized by the presence of three out of five criteria, ranging from high triglycerides to elevated waist circumference and blood pressure. MetS is not characterized by elevated cholesterolemia, but is rather the consequence of a complex interaction of factors generally leading to increased insulin resistance. Drug treatments are difficult to manage, whereas well-characterized nutraceuticals may offer an effective alternative. Among these, functional foods, e.g. plant proteins, have been shown to improve insulin resistance and reduce triglyceride secretion. Pro- and pre-biotics, which are able to modify the intestinal microbiome, reduce absorption of specific nutrients and improve the metabolic handling of energy-rich foods. Finally, specific nutraceuticals have proven to be of benefit, in particular red-yeast rice, berberine, curcumin and vitamin D. All of these can improve lipid handling by the liver as well as ameliorate insulin resistance. While lifestyle approaches, such as the Mediterranean diet, may prove too complex for the single patient, better knowledge of selected nutraceuticals and more appropriate formulations leading to improved bioavailability will certainly widen the use of these agents, already in large use for the management of these very frequent patient groups. Key messages: Functional foods, e.g. plant proteins, improve insulin resistance. Pro- and pre-biotics improve the metabolic handling of energy-rich foods. Nutraceuticals can offer significant help in managing MetS patients as part of lifestyle recommendations.

  17. Multivariate analysis: greater insights into complex systems

    USDA-ARS?s Scientific Manuscript database

    Many agronomic researchers measure and collect multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate (MV) statistical methods encompass the simultaneous analysis of all random variables (RV) measured on each experimental or sampling ...

  18. Cognitive and Adaptive Functioning after Liver Transplantation for Maple Syrup Urine Disease: A Case Series

    PubMed Central

    Shellmer, D. A.; Dabbs, A. DeVito; Dew, M. A.; Noll, R. B.; Feldman, H.; Strauss, K.; Morton, D. H.; Vockley, G.; Mazariegos, G. V.

    2011-01-01

    MSUD is a complex metabolic disorder that has been associated with central nervous system damage, developmental delays, and neurocognitive deficits. Although liver transplantation provides a metabolic cure for MSUD, changes in cognitive and adaptive functioning following transplantation have not been investigated. In this report we present data from 14 patients who completed cognitive and adaptive functioning testing pre- and one year and/or three years post-liver transplantation. Findings show either no significant change or improvement in IQ scores pre- to post-liver transplantation. Greater variability was observed in adaptive functioning scores, but the majority of patients evidenced either no significant change or improvement in adaptive scores. In general, findings may indicate that liver transplantation curtails additional central nervous system damage and neurocognitive decline providing an opportunity for stabilization or improvement in functioning. PMID:20946191

  19. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.
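
    The exactness property cited for one-dimensional linear steady-state heat transfer can be demonstrated even with ordinary two-node linear elements; the sketch below solves −kT″ = q with fixed-temperature ends and checks the nodal values against the analytical solution (the paper's nodeless-variable elements extend this idea to nonlinear and transient cases).

    ```python
    import numpy as np

    n_el, L, k, q = 4, 1.0, 1.0, 5.0      # elements, length, conductivity, source
    h = L / n_el
    K = np.zeros((n_el + 1, n_el + 1))
    F = np.full(n_el + 1, q * h)          # consistent load: q*h interior nodes,
    F[[0, -1]] = q * h / 2                # q*h/2 at the two end nodes
    for e in range(n_el):                 # assemble 2-node element stiffness
        K[e:e + 2, e:e + 2] += (k / h) * np.array([[1, -1], [-1, 1]])

    # Apply T(0) = T(L) = 0 by reducing the system to the interior nodes:
    T = np.zeros(n_el + 1)
    T[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])
    x = np.linspace(0, L, n_el + 1)
    print(np.allclose(T, q * x * (L - x) / (2 * k)))  # exact nodal values: True
    ```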

  20. Antigenic variability: Obstacles on the road to vaccines against traditionally difficult targets.

    PubMed

    Servín-Blanco, R; Zamora-Alvarado, R; Gevorkian, G; Manoutcharian, K

    2016-10-02

    Despite the impressive impact of vaccines on public health, the success of vaccines targeting many important pathogens and cancers has to date been limited. The burden of infectious diseases today is mainly caused by antigenically variable pathogens (AVPs), which escape immune responses induced by prior infection or vaccination through changes in molecular structures recognized by antibodies or T cells. Extensive genetic and antigenic variability is the major obstacle for the development of new or improved vaccines against "difficult" targets. Alternative, qualitatively new approaches leading to the generation of disease- and patient-specific vaccine immunogens that incorporate complex permanently changing epitope landscapes of intended targets accompanied by appropriate immunomodulators are urgently needed. In this review, we highlight some of the most critical common issues related to the development of vaccines against many pathogens and cancers that escape protective immune responses owing to antigenic variation, and discuss recent efforts to overcome the obstacles by applying alternative approaches for the rational design of new types of immunogens.

  1. New parameters in adaptive testing of ferromagnetic materials utilizing magnetic Barkhausen noise

    NASA Astrophysics Data System (ADS)

    Pal'a, Jozef; Ušák, Elemír

    2016-03-01

    A new method of magnetic Barkhausen noise (MBN) measurement and optimization of the measured data processing, with respect to the non-destructive evaluation of ferromagnetic materials, was tested. Using this method we tried to find out whether it is possible to enhance the sensitivity and stability of measurement results by replacing the traditional MBN parameter (root mean square) with a new parameter. In the tested method, a complex set of MBN data from minor hysteresis loops is measured. Afterwards, the MBN data are collected into suitably designed matrices, and the MBN parameters with maximum sensitivity to the evaluated variable are sought. The method was verified on plastically deformed steel samples. It was shown that the proposed measuring method and data processing improve sensitivity to the evaluated variable compared with measuring the traditional MBN parameter. Moreover, we found a parameter of MBN which is highly resistant to changes of the applied field amplitude and at the same time is noticeably more sensitive to the evaluated variable.

  2. Ultra compact spectrometer using linear variable filters

    NASA Astrophysics Data System (ADS)

    Dami, M.; De Vidi, R.; Aroldi, G.; Belli, F.; Chicarella, L.; Piegari, A.; Sytchkova, A.; Bulir, J.; Lemarquis, F.; Lequime, M.; Abel Tibérini, L.; Harnisch, B.

    2017-11-01

    The Linearly Variable Filters (LVF) are complex optical devices that, integrated on a CCD, can realize a "single chip spectrometer". In the framework of an ESA study, a team of industries and institutes led by SELEX-Galileo explored the design principles and manufacturing techniques, realizing and characterizing LVF samples based on both All-Dielectric (AD) and Metal-Dielectric (MD) coating structures in the VNIR and SWIR spectral ranges. In particular, the achieved performances on spectral gradient, transmission bandwidth and Spectral Attenuation (SA) are presented and critically discussed, and potential improvements are highlighted. In addition, the results of a feasibility study of a SWIR Linear Variable Filter are presented, with a comparison of design predictions and measured performances. Finally, criticalities related to the filter-CCD packaging are discussed. The main achievements reached during these activities have been: - to evaluate, by design, manufacture and test of LVF samples, the achievable performances compared with target requirements; - to evaluate the reliability of the projects by analyzing their repeatability; - to define suitable measurement methodologies

  3. Engineering of routes to heparin and related polysaccharides.

    PubMed

    Bhaskar, Ujjwal; Sterner, Eric; Hickey, Anne Marie; Onishi, Akihiro; Zhang, Fuming; Dordick, Jonathan S; Linhardt, Robert J

    2012-01-01

    Anticoagulant heparin has been shown to possess important biological functions that vary according to its fine structure. Variability within heparin's structure occurs owing to its biosynthesis and animal tissue-based recovery and adds another dimension to its complex polymeric structure. The structural variations in chain length and sulfation patterns mediate its interaction with many heparin-binding proteins, thereby eliciting complex biological responses. The advent of novel chemical and enzymatic approaches for polysaccharide synthesis coupled with high throughput combinatorial approaches for drug discovery have facilitated an increased effort to understand heparin's structure-activity relationships. An improved understanding would offer potential for new therapeutic development through the engineering of polysaccharides. Such a bioengineering approach requires the amalgamation of several different disciplines, including carbohydrate synthesis, applied enzymology, metabolic engineering, and process biochemistry.

  4. KASCADE2017 - An experimental study of thermal circulations and turbulence in complex terrain

    NASA Astrophysics Data System (ADS)

    Pardyjak, Eric; Dupuy, Florian; Durand, Pierre; Gunawardena, Nipun; Hedde, Thierry; Rubin, Pierre

    2017-04-01

    The KASCADE (KAtabatic winds and Stability over CAdarache for Dispersion of Effluents) 2017 experiment was conducted during winter 2017 with the overarching objective of improving the prediction of dispersion in complex terrain during stable atmospheric conditions. The experiment builds on knowledge gathered during the first KASCADE experiment conducted in 2013 (Duine et al., 2016), which provided detailed observations of the vertical structure of the atmosphere during stable conditions. In spite of this improved understanding, considerable uncertainty remains regarding the near-surface horizontal spatial and temporal variability of winds and thermodynamic variables. For this specific campaign, the general aim has been to use a large number of sensors to improve our understanding of the spatial and temporal development, evolution and breakdown of topographically driven flows. KASCADE 2017 consisted of continuous observations, which were broadened during ten Intensive Observation Periods (IOPs) conducted in the Cadarache Valley, located in south-eastern France, from January through March 2017. The Cadarache Valley is a relatively small valley (6 km x 1 km) with modest slopes and elevation differences between the valley floor and nearby peaks (~100 m). The valley is embedded in the larger Durance Valley drainage system, leading to multi-scale flow interactions. During the winter, winds are light and conditions are stably stratified, leading to thermal circulations as well as complex near-surface atmospheric layering that impacts the dispersion of contaminants. The continuously operating instrumentation deployed included mean near-surface (2-m) and sub-surface observations from 12 low-cost Local Energy-budget Measurement Stations (LEMS), four sonic anemometer masts, one full surface flux station, sodar measurements at two locations, wind and temperature measurements from a tall 110 m tower, and two additional met stations. During IOPs, additional deployments included a low-cost tethered balloon temperature profiler as well as regular (every 3 hours) radiosoundings (including recoverable and reusable probes). The presentation will provide an overview of the experiment and several interesting "first results." First results will include data characterizing highly regular nocturnal horizontal wind meandering and associated turbulence statistics. In addition, we present data on the development of strong near-surface stable stratification hours before sunset.

  5. Robust joint score tests in the application of DNA methylation data analysis.

    PubMed

    Li, Xuan; Fu, Yuejiao; Wang, Xiaogang; Qiu, Weiliang

    2018-05-18

    Recently, differential variability has been shown to be valuable in evaluating the association of DNA methylation with the risks of complex human diseases. Statistical tests based on both differential methylation level and differential variability can be more powerful than those based only on differential methylation level. Anh and Wang (2013) proposed a joint score test (AW) to simultaneously detect differential methylation and differential variability. However, AW's method seems to be quite conservative and has not been fully compared with existing joint tests. We proposed three improved joint score tests, namely iAW.Lev, iAW.BF, and iAW.TM, and have made extensive comparisons with the joint likelihood ratio test (jointLRT), the Kolmogorov-Smirnov (KS) test, and the AW test. Systematic simulation studies showed that: 1) the three improved tests performed better (i.e., having larger power while keeping nominal Type I error rates) than the other three tests for data with outliers and different variances between cases and controls; 2) for data from normal distributions, the three improved tests had slightly lower power than jointLRT and AW. Analyses of two Illumina HumanMethylation27 data sets (GSE37020 and GSE20080) and one Illumina Infinium MethylationEPIC data set (GSE107080) demonstrated that the three improved tests had higher true validation rates than jointLRT, KS, and AW. The three proposed joint score tests are thus robust against violation of the normality assumption and the presence of outlying observations, in comparison with the three existing tests. Among the three proposed tests, iAW.BF appears to be the most robust and effective across all simulated scenarios and in the real data analyses.
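
    The iAW statistics themselves are score tests whose exact form is given in the paper; as a rough illustration of the underlying idea - testing location and scale jointly, with a robust scale test - here is a sketch pairing Welch's t-test with a Brown-Forsythe test and combining them via Fisher's method. All choices here (tests, combination rule) are ours, not the authors'.

    ```python
    # Illustrative only: a joint location/scale test in the spirit of the
    # methods above. The published AW/iAW joint score statistics differ.
    import numpy as np
    from scipy import stats

    def joint_location_scale_test(cases, controls):
        """Combined p-value for differential mean and differential variance."""
        # Location component: Welch's t-test (no equal-variance assumption).
        _, p_mean = stats.ttest_ind(cases, controls, equal_var=False)
        # Scale component: Brown-Forsythe (Levene centred at the median),
        # robust to outliers and non-normality.
        _, p_var = stats.levene(cases, controls, center='median')
        # Fisher's method: -2*sum(log p) ~ chi-square(2k) under H0, assuming
        # the two component tests are independent (an approximation).
        chi2 = -2.0 * (np.log(p_mean) + np.log(p_var))
        return stats.chi2.sf(chi2, df=4)

    rng = np.random.default_rng(0)
    cases = rng.normal(0.30, 0.15, size=50)    # shifted mean, larger spread
    controls = rng.normal(0.20, 0.05, size=50)
    print(joint_location_scale_test(cases, controls))
    ```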

  6. Improving machine learning reproducibility in genetic association studies with proportional instance cross validation (PICV).

    PubMed

    Piette, Elizabeth R; Moore, Jason H

    2018-01-01

    Machine learning methods and conventions are increasingly employed for the analysis of large, complex biomedical data sets, including genome-wide association studies (GWAS). Reproducibility of machine learning analyses of GWAS can be hampered by biological and statistical factors, particularly so for the investigation of non-additive genetic interactions. Application of traditional cross validation to a GWAS data set may result in poor consistency between the training and testing data set splits due to an imbalance of the interaction genotypes relative to the data as a whole. We propose a new cross validation method, proportional instance cross validation (PICV), that preserves the original distribution of an independent variable when splitting the data set into training and testing partitions. We apply PICV to simulated GWAS data with epistatic interactions of varying minor allele frequencies and prevalences and compare performance to that of a traditional cross validation procedure in which individuals are randomly allocated to training and testing partitions. Sensitivity and positive predictive value are significantly improved across all tested scenarios for PICV compared to traditional cross validation. We also apply PICV to GWAS data from a study of primary open-angle glaucoma to investigate a previously-reported interaction, which fails to significantly replicate; PICV however improves the consistency of testing and training results. Application of traditional machine learning procedures to biomedical data may require modifications to better suit intrinsic characteristics of the data, such as the potential for highly imbalanced genotype distributions in the case of epistasis detection. The reproducibility of genetic interaction findings can be improved by considering this variable imbalance in cross validation implementation, such as with PICV. This approach may be extended to problems in other domains in which imbalanced variable distributions are a concern.
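
    In scikit-learn terms, PICV behaves like cross validation stratified on the genotype variable rather than on the outcome. A minimal sketch follows; the published PICV procedure may differ in detail, and the genotype frequencies below are invented:

    ```python
    # Distribution-preserving cross validation in the spirit of PICV, using
    # StratifiedKFold to keep rare genotype classes balanced across splits.
    import numpy as np
    from sklearn.model_selection import StratifiedKFold

    rng = np.random.default_rng(42)
    n = 1000
    # Hypothetical genotype classes (0-8 for two biallelic loci); classes 2
    # and 5 are rare interaction genotypes that naive random splits would
    # distribute unevenly between training and testing partitions.
    p = [0.25, 0.10, 0.02, 0.20, 0.10, 0.02, 0.15, 0.10, 0.06]
    genotype = rng.choice(9, size=n, p=p)

    skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    # Stratify on the genotype variable itself so each split preserves the
    # original genotype distribution.
    for fold, (train, test) in enumerate(skf.split(np.zeros((n, 1)), genotype)):
        print(f"fold {fold}: rare-genotype frequency "
              f"train={np.mean(genotype[train] == 2):.3f} "
              f"test={np.mean(genotype[test] == 2):.3f}")
    ```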

  7. Effects of head-down bed rest on complex heart rate variability: Response to LBNP testing

    NASA Technical Reports Server (NTRS)

    Goldberger, Ary L.; Mietus, Joseph E.; Rigney, David R.; Wood, Margie L.; Fortney, Suzanne M.

    1994-01-01

    Head-down bed rest is used to model physiological changes during spaceflight. We postulated that bed rest would decrease the degree of complex physiological heart rate variability. We analyzed continuous heart rate data from digitized Holter recordings in eight healthy female volunteers (age 28-34 yr) who underwent a 13-day 6 deg head-down bed rest study with serial lower body negative pressure (LBNP) trials. Heart rate variability was measured on 4-min data sets using conventional time and frequency domain measures as well as with a new measure of signal 'complexity' (approximate entropy). Data were obtained pre-bed rest (control), during bed rest (day 4 and day 9 or 11), and 2 days post-bed rest (recovery). Tolerance to LBNP was significantly reduced on both bed rest days vs. pre-bed rest. Heart rate variability was assessed at peak LBNP. Heart rate approximate entropy was significantly decreased at day 4 and day 9 or 11, returning toward normal during recovery. Heart rate standard deviation and the ratio of high- to low-power frequency did not change significantly. We conclude that short-term bed rest is associated with a decrease in the complex variability of heart rate during LBNP testing in healthy young adult women. Measurement of heart rate complexity, using a method derived from nonlinear dynamics ('chaos theory'), may provide a sensitive marker of this loss of physiological variability, complementing conventional time and frequency domain statistical measures.
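
    Approximate entropy is simple to compute; a minimal implementation follows. The parameter choices m=2 and r=0.2·SD are common practice in heart rate studies, not necessarily those used in this paper.

    ```python
    # Minimal approximate entropy (ApEn) per Pincus's definition: lower
    # values indicate a more regular (less complex) signal.
    import numpy as np

    def approximate_entropy(x, m=2, r_factor=0.2):
        x = np.asarray(x, dtype=float)
        n = len(x)
        r = r_factor * np.std(x)

        def phi(m):
            # Embed the series into overlapping vectors of length m.
            emb = np.array([x[i:i + m] for i in range(n - m + 1)])
            # Chebyshev distance between every pair of template vectors.
            dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            # C_i = fraction of vectors within tolerance r of vector i
            # (self-matches included, as in the original definition).
            c = np.mean(dist <= r, axis=1)
            return np.mean(np.log(c))

        return phi(m) - phi(m + 1)

    rng = np.random.default_rng(1)
    regular = np.sin(np.linspace(0, 20 * np.pi, 400))   # low ApEn
    irregular = rng.normal(size=400)                    # high ApEn
    print(approximate_entropy(regular), approximate_entropy(irregular))
    ```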

  8. A Geometric View of Complex Trigonometric Functions

    ERIC Educational Resources Information Center

    Hammack, Richard

    2007-01-01

    Given that the sine and cosine functions of a real variable can be interpreted as the coordinates of points on the unit circle, the author of this article asks whether there is something similar for complex variables, and shows that indeed there is.
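
    For reference, the standard extensions of sine and cosine to a complex argument (textbook identities, not the article's own exposition) are:

    ```latex
    % For z = x + iy:
    \sin z = \sin x\,\cosh y + i\,\cos x\,\sinh y, \qquad
    \cos z = \cos x\,\cosh y - i\,\sin x\,\sinh y,
    % and the "unit circle" relation survives unchanged:
    \sin^2 z + \cos^2 z = 1.
    ```

    The geometric picture rests on the fact that the pair (cos z, sin z) still satisfies this relation, even though each coordinate is now itself complex.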

  9. Spatio-temporal error growth in the multi-scale Lorenz'96 model

    NASA Astrophysics Data System (ADS)

    Herrera, S.; Fernández, J.; Rodríguez, M. A.; Gutiérrez, J. M.

    2010-07-01

    The influence of multiple spatio-temporal scales on the error growth and predictability of atmospheric flows is analyzed throughout the paper. To this aim, we consider the two-scale Lorenz'96 model and study the interplay of the slow and fast variables on the error growth dynamics. It is shown that when the coupling between slow and fast variables is weak the slow variables dominate the evolution of fluctuations whereas in the case of strong coupling the fast variables impose a non-trivial complex error growth pattern on the slow variables with two different regimes, before and after saturation of fast variables. This complex behavior is analyzed using the recently introduced Mean-Variance Logarithmic (MVL) diagram.
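
    For reference, the two-scale Lorenz'96 system in its standard formulation (the paper's specific parameter values are not reproduced here): X_k are the K slow variables, Y_{j,k} the J fast variables coupled to each X_k, F is the forcing, and h, c and b set the coupling strength, time-scale separation and amplitude ratio:

    ```latex
    \frac{dX_k}{dt} = X_{k-1}(X_{k+1} - X_{k-2}) - X_k + F
                      - \frac{hc}{b}\sum_{j=1}^{J} Y_{j,k},
    \qquad
    \frac{dY_{j,k}}{dt} = c\,b\,Y_{j+1,k}(Y_{j-1,k} - Y_{j+2,k}) - c\,Y_{j,k}
                          + \frac{hc}{b}\,X_k.
    ```

    The weak- and strong-coupling regimes discussed above correspond to small and large values of the coupling parameter h, respectively.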

  10. Why “improved” water sources are not always safe

    PubMed Central

    Shaheed, Ameer; Orgill, Jennifer; Montgomery, Maggie A; Jeuland, Marc A; Brown, Joe

    2014-01-01

    Existing and proposed metrics for household drinking-water services are intended to measure the availability, safety and accessibility of water sources. However, these attributes can be highly variable over time and space and this variation complicates the task of creating and implementing simple and scalable metrics. In this paper, we highlight those factors – especially those that relate to so-called improved water sources – that contribute to variability in water safety but may not be generally recognized as important by non-experts. Problems in the provision of water in adequate quantities and of adequate quality – interrelated problems that are often influenced by human behaviour – may contribute to an increased risk of poor health. Such risk may be masked by global water metrics that indicate that we are on the way to meeting the world’s drinking-water needs. Given the complexity of the topic and current knowledge gaps, international metrics for access to drinking water should be interpreted with great caution. We need further targeted research on the health impacts associated with improvements in drinking-water supplies. PMID:24700996

  11. Artificial neural networks using complex numbers and phase encoded weights.

    PubMed

    Michel, Howard E; Awwal, Abdul Ahad S

    2010-04-01

    The model of a simple perceptron using phase-encoded inputs and complex-valued weights is proposed. The aggregation function, activation function, and learning rule for the proposed neuron are derived and applied to Boolean logic functions and simple computer vision tasks. The complex-valued neuron (CVN) is shown to be superior to traditional perceptrons. An improvement of 135% over the theoretical maximum of 104 linearly separable problems (of three variables) solvable by conventional perceptrons is achieved without additional logic, neuron stages, or higher order terms such as those required in polynomial logic gates. The application of CVN in distortion invariant character recognition and image segmentation is demonstrated. Implementation details are discussed, and the CVN is shown to be very attractive for optical implementation since optical computations are naturally complex. The cost of the CVN is less in all cases than the traditional neuron when implemented optically. Therefore, all the benefits of the CVN can be obtained without additional cost. However, on those implementations dependent on standard serial computers, CVN will be more cost effective only in those applications where its increased power can offset the requirement for additional neurons.
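
    The paper's aggregation, activation, and learning rules are more general than can be shown here; the toy below only demonstrates the core idea that phase encoding plus a magnitude-threshold activation lets a single neuron compute XOR, a problem that is not linearly separable for a conventional perceptron. The specific encoding (bit b -> e^{i·pi·b}) and threshold are our illustrative choices, not the authors'.

    ```python
    # Toy demonstration of phase-encoded inputs with a complex-valued neuron.
    import numpy as np

    def encode(bit):
        # Illustrative phase encoding: 0 -> e^{i*0} = 1, 1 -> e^{i*pi} = -1.
        return np.exp(1j * np.pi * bit)

    def cvn_output(x1, x2, w=(1.0, 1.0), bias=0.0):
        s = w[0] * encode(x1) + w[1] * encode(x2) + bias  # complex aggregation
        return int(abs(s) < 1.0)  # activation: threshold on magnitude

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", cvn_output(x1, x2))  # reproduces XOR
    ```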

  12. An efficient approach to BAC based assembly of complex genomes.

    PubMed

    Visendi, Paul; Berkman, Paul J; Hayashi, Satomi; Golicz, Agnieszka A; Bayer, Philipp E; Ruperao, Pradeep; Hurgobin, Bhavna; Montenegro, Juan; Chan, Chon-Kit Kenneth; Staňková, Helena; Batley, Jacqueline; Šimková, Hana; Doležel, Jaroslav; Edwards, David

    2016-01-01

    There has been an exponential growth in the number of genome sequencing projects since the introduction of next generation DNA sequencing technologies. Genome projects have increasingly involved assembly of whole genome data, which produces inferior assemblies compared to traditional Sanger sequencing of genomic fragments cloned into bacterial artificial chromosomes (BACs). While whole genome shotgun sequencing using next generation sequencing (NGS) is relatively fast and inexpensive, this method is extremely challenging for highly complex genomes, where polyploidy or high repeat content confounds accurate assembly, or where a highly accurate 'gold' reference is required. Several attempts have been made to improve genome sequencing approaches by incorporating NGS methods, with variable success. We present the application of a novel BAC sequencing approach which combines indexed pools of BACs, Illumina paired read sequencing, a sequence assembler specifically designed for complex BAC assembly, and a custom bioinformatics pipeline. We demonstrate this method by sequencing and assembling BAC cloned fragments from bread wheat and sugarcane genomes. We demonstrate that our assembly approach is accurate, robust, cost effective and scalable, with applications for complete genome sequencing in large and complex genomes.

  13. Deep Blue Phosphorescent Organic Light-Emitting Diodes with CIEy Value of 0.11 and External Quantum Efficiency up to 22.5%.

    PubMed

    Li, Xiaoyue; Zhang, Juanye; Zhao, Zifeng; Wang, Liding; Yang, Hannan; Chang, Qiaowen; Jiang, Nan; Liu, Zhiwei; Bian, Zuqiang; Liu, Weiping; Lu, Zhenghong; Huang, Chunhui

    2018-03-01

    Organic light-emitting diodes (OLEDs) based on red and green phosphorescent iridium complexes have been successfully commercialized in displays and solid-state lighting. However, blue ones still remain a challenge on account of their relatively unsatisfactory Commission Internationale de l'Eclairage (CIE) coordinates and low efficiency. After analyzing the blue iridium complexes reported in the literature, a new deep-blue-emitting iridium complex with improved photoluminescence quantum yield is designed and synthesized. By rationally screening host materials with high triplet energy levels in neat film, as well as the OLED architecture, to balance electron and hole recombination, highly efficient deep-blue-emission OLEDs with CIE coordinates of (0.15, 0.11) and a maximum external quantum efficiency (EQE) of up to 22.5% are demonstrated. Based on transition dipole moment vector measurements with a variable-angle spectroscopic ellipsometry method, the ultrahigh EQE is attributed to a preferred horizontal dipole orientation of the iridium complex in the doped film, which is beneficial for light extraction from the OLEDs. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Automatic identification of variables in epidemiological datasets using logic regression.

    PubMed

    Lorenz, Matthias W; Abdi, Negin Ashtiani; Scheckenbach, Frank; Pflug, Anja; Bülbül, Alpaslan; Catapano, Alberico L; Agewall, Stefan; Ezhov, Marat; Bots, Michiel L; Kiechl, Stefan; Orth, Andreas

    2017-04-13

    For an individual participant data (IPD) meta-analysis, multiple datasets must be transformed into a consistent format, e.g. using uniform variable names. When large numbers of datasets have to be processed, this can be a time-consuming and error-prone task. Automated or semi-automated identification of variables can help to reduce the workload and improve data quality. For semi-automation, high sensitivity in the recognition of matching variables is particularly important, because it allows software to present, for each target variable, a choice of candidate source variables from which a user can choose the matching one, with only a low risk of having missed a correct source variable. For each variable in a set of target variables, a number of simple rules were manually created. With logic regression, an optimal Boolean combination of these rules was searched for every target variable, using a random subset of a large database of epidemiological and clinical cohort data (construction subset). In a second subset of this database (validation subset), these optimal combination rules were validated. In the construction sample, the 41 target variables were allocated with an average positive predictive value (PPV) of 34% and a negative predictive value (NPV) of 95%. In the validation sample, PPV was 33%, whereas NPV remained at 94%. In the construction sample, PPV was 50% or less for 63% of all variables; in the validation sample, for 71% of all variables. We demonstrated that the application of logic regression to a complex data management task in large epidemiological IPD meta-analyses is feasible. However, the performance of the algorithm is poor, which may require backup strategies.
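
    A toy of the rule-combination idea (real logic regression searches Boolean combination trees, typically with simulated annealing; the rules, variables, and scoring below are invented for illustration):

    ```python
    # Hand-written boolean rules over variable metadata are combined with
    # AND/OR, and the combination scoring best on labelled examples is kept.
    import re

    candidates = [
        {"name": "sbp_mmhg", "unit": "mmHg",  "match": True},
        {"name": "sys_bp",   "unit": "mm Hg", "match": True},
        {"name": "dbp",      "unit": "mmHg",  "match": False},
        {"name": "weight",   "unit": "kg",    "match": False},
    ]
    # Simple rules for a hypothetical target variable "systolic blood pressure".
    rules = [
        lambda v: bool(re.search(r"s.?bp|sys", v["name"])),  # name pattern
        lambda v: "mm" in v["unit"].lower(),                 # plausible unit
        lambda v: not v["name"].startswith("d"),             # not diastolic
    ]

    def score(combine):
        preds = [combine([rule(v) for rule in rules]) for v in candidates]
        return sum(p == v["match"] for p, v in zip(preds, candidates))

    combos = {"r0 and r1": lambda b: b[0] and b[1],
              "r0 and r2": lambda b: b[0] and b[2],
              "(r0 or r1) and r2": lambda b: (b[0] or b[1]) and b[2]}
    best = max(combos, key=lambda k: score(combos[k]))
    print("best Boolean combination:", best, "score:", score(combos[best]))
    ```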

  15. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
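
    A minimal sketch of the surrogate-modeling and feature-selection step under stated assumptions: the parameter names, the toy melt-pool response, and the random forest surrogate below are ours, not the paper's.

    ```python
    # Fit a cheap surrogate to (hypothetical) simulation samples, then read
    # off feature importances to pick the variables worth deeper modeling.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(7)
    n = 200
    X = np.column_stack([
        rng.uniform(50, 400, n),    # laser power (W)        - illustrative
        rng.uniform(0.1, 2.0, n),   # scan speed (m/s)       - illustrative
        rng.uniform(10, 100, n),    # beam diameter (um)     - illustrative
    ])
    # Toy stand-in for an expensive melt-pool simulation: depth grows with
    # power and shrinks with speed; beam diameter matters little here.
    y = 0.05 * X[:, 0] / np.sqrt(X[:, 1]) + rng.normal(0, 1.0, n)

    surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    for name, imp in zip(["power", "speed", "beam"], surrogate.feature_importances_):
        print(f"{name}: {imp:.2f}")
    # The surrogate can now stand in for the simulator when exploring the
    # design space or performing uncertainty analysis.
    ```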

  16. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  17. Active Learning to Understand Infectious Disease Models and Improve Policy Making

    PubMed Central

    Vladislavleva, Ekaterina; Broeckhove, Jan; Beutels, Philippe; Hens, Niel

    2014-01-01

    Modeling plays a major role in policy making, especially for infectious disease interventions, but such models can be complex and computationally intensive. A more systematic exploration is needed to gain a thorough systems understanding. We present an active learning approach based on machine learning techniques as iterative surrogate modeling and model-guided experimentation to systematically analyze both common and edge manifestations of complex model runs. Symbolic regression is used for nonlinear response surface modeling with automatic feature selection. First, we illustrate our approach using an individual-based model for influenza vaccination. After optimizing the parameter space, we observe an inverse relationship between vaccination coverage and cumulative attack rate reinforced by herd immunity. Second, we demonstrate the use of surrogate modeling techniques on input-response data from a deterministic dynamic model, which was designed to explore the cost-effectiveness of varicella-zoster virus vaccination. We use symbolic regression to handle high dimensionality and correlated inputs and to identify the most influential variables. The insight provided is used to focus research, reduce dimensionality and decrease decision uncertainty. We conclude that active learning is needed to fully understand complex systems behavior. Surrogate models can be readily explored at no computational expense, and can also be used as an emulator to improve rapid policy making in various settings. PMID:24743387
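
    A compact sketch of the iterative surrogate-modeling loop, with a Gaussian process standing in for the symbolic regression the authors actually use: fit a cheap surrogate, then spend the next expensive model run where the surrogate is least certain. The "expensive" model below is a toy.

    ```python
    # Active learning by maximum-uncertainty sampling over a GP surrogate.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def expensive_model(x):          # placeholder for a costly simulation run
        return np.sin(3 * x) + 0.5 * x

    rng = np.random.default_rng(3)
    X = rng.uniform(0, 3, size=(5, 1))          # small initial design
    y = expensive_model(X).ravel()
    candidates = np.linspace(0, 3, 200).reshape(-1, 1)

    for step in range(10):
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        mean, std = gp.predict(candidates, return_std=True)
        x_new = candidates[np.argmax(std)]      # model-guided experimentation
        X = np.vstack([X, x_new])
        y = np.append(y, expensive_model(x_new))

    print(f"{len(y)} model runs; max posterior std now {std.max():.3f}")
    ```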

  18. Active learning to understand infectious disease models and improve policy making.

    PubMed

    Willem, Lander; Stijven, Sean; Vladislavleva, Ekaterina; Broeckhove, Jan; Beutels, Philippe; Hens, Niel

    2014-04-01

    Modeling plays a major role in policy making, especially for infectious disease interventions, but such models can be complex and computationally intensive. A more systematic exploration is needed to gain a thorough systems understanding. We present an active learning approach based on machine learning techniques as iterative surrogate modeling and model-guided experimentation to systematically analyze both common and edge manifestations of complex model runs. Symbolic regression is used for nonlinear response surface modeling with automatic feature selection. First, we illustrate our approach using an individual-based model for influenza vaccination. After optimizing the parameter space, we observe an inverse relationship between vaccination coverage and cumulative attack rate reinforced by herd immunity. Second, we demonstrate the use of surrogate modeling techniques on input-response data from a deterministic dynamic model, which was designed to explore the cost-effectiveness of varicella-zoster virus vaccination. We use symbolic regression to handle high dimensionality and correlated inputs and to identify the most influential variables. The insight provided is used to focus research, reduce dimensionality and decrease decision uncertainty. We conclude that active learning is needed to fully understand complex systems behavior. Surrogate models can be readily explored at no computational expense, and can also be used as an emulator to improve rapid policy making in various settings.

  19. Health technology assessment review: Computerized glucose regulation in the intensive care unit - how to create artificial control

    PubMed Central

    2009-01-01

    Current care guidelines recommend glucose control (GC) in critically ill patients. To achieve GC, many ICUs have implemented a (nurse-based) protocol on paper. However, such protocols are often complex, time-consuming, and can cause iatrogenic hypoglycemia. Computerized glucose regulation protocols may improve patient safety, efficiency, and nurse compliance. Such computerized clinical decision support systems (CDSSs) use more complex logic to provide an insulin infusion rate based on previous blood glucose levels and other parameters. A computerized CDSS for glucose control has the potential to reduce overall workload, reduce the chance of human cognitive failure, and improve glucose control. Several computer-assisted glucose regulation programs have been published recently. In order of increasing complexity, the three main types of algorithms used are computerized flowcharts, Proportional-Integral-Derivative (PID), and Model Predictive Control (MPC). PID is essentially a closed-loop feedback system, whereas MPC models the behavior of glucose and insulin in ICU patients. Although the best approach has not yet been determined, it should be noted that PID controllers are generally thought to be more robust than MPC systems. The computerized CDSSs most likely to emerge are those that are fully a part of the routine workflow, use patient-specific characteristics and apply variable sampling intervals. PMID:19849827
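
    As a minimal illustration of the PID option, here is a discrete PID controller applied to toy glucose readings. The gains, units, and setpoint are invented; a real CDSS adds safety bounds, variable sampling intervals, and patient-specific logic.

    ```python
    # Discrete PID controller of the kind the review contrasts with MPC.
    class PID:
        def __init__(self, kp, ki, kd, setpoint, dt):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.setpoint, self.dt = setpoint, dt
            self.integral = 0.0
            self.prev_error = None

        def update(self, measurement):
            # Error is positive when glucose is above target.
            error = measurement - self.setpoint
            self.integral += error * self.dt
            deriv = 0.0 if self.prev_error is None \
                else (error - self.prev_error) / self.dt
            self.prev_error = error
            # Output: insulin infusion rate, clamped to be non-negative.
            return max(0.0, self.kp * error + self.ki * self.integral
                       + self.kd * deriv)

    # Illustrative gains and target (mg/dL); dt in hours.
    controller = PID(kp=0.02, ki=0.001, kd=0.01, setpoint=120.0, dt=1.0)
    for glucose in [180, 165, 150, 140, 130, 124]:   # toy measurements
        rate = controller.update(glucose)
        print(f"glucose {glucose} -> insulin rate {rate:.3f} U/h")
    ```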

  20. Complex socio-technical systems: Characterization and management guidelines.

    PubMed

    Righi, Angela Weber; Saurin, Tarcisio Abreu

    2015-09-01

    Although ergonomics has paid increasing attention to the perspective of complexity, methods for its operationalization are scarce. This study introduces a framework for the operationalization of the "attribute view" of complexity, which involves: (i) the delimitation of the socio-technical system (STS); (ii) the description of four complexity attributes, namely a large number of elements in dynamic interactions, a wide diversity of elements, unexpected variability, and resilience; (iii) the assessment of six management guidelines, namely design slack, give visibility to processes and outcomes, anticipate and monitor the impacts of small changes, monitor the gap between prescription and practice, encourage diversity of perspectives when making decisions, and create an environment that supports resilience; and (iv) the identification of leverage points for improving the STS design, based on both the analysis of relationships among the attributes and their classification as irreducible/manageable complexity, and liability/asset. The use of the framework is illustrated by the study of an emergency department of a University hospital. Data collection involved analysis of documents, observations of work at the front-line, interviews with employees, and the application of questionnaires. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Reversible heart rhythm complexity impairment in patients with primary aldosteronism

    NASA Astrophysics Data System (ADS)

    Lin, Yen-Hung; Wu, Vin-Cent; Lo, Men-Tzung; Wu, Xue-Ming; Hung, Chi-Sheng; Wu, Kwan-Dun; Lin, Chen; Ho, Yi-Lwun; Stowasser, Michael; Peng, Chung-Kang

    2015-08-01

    Excess aldosterone secretion in patients with primary aldosteronism (PA) impairs their cardiovascular system. Heart rhythm complexity analysis, derived from heart rate variability (HRV), is a powerful tool to quantify the complex regulatory dynamics of human physiology. We prospectively analyzed 20 patients with aldosterone-producing adenoma (APA) who underwent adrenalectomy and 25 patients with essential hypertension (EH). The heart rate data were analyzed by conventional HRV and heart rhythm complexity analysis including detrended fluctuation analysis (DFA) and multiscale entropy (MSE). We found that APA patients had significantly decreased DFAα2 on DFA analysis and decreased area 1-5, area 6-15, and area 6-20 on MSE analysis (all p < 0.05). Area 1-5, area 6-15 and area 6-20 in the MSE study correlated significantly with log-transformed renin activity and log-transformed aldosterone-renin ratio (all p ≤ 0.01). The conventional HRV parameters were comparable between PA and EH patients. After adrenalectomy, all the altered DFA and MSE parameters improved significantly (all p < 0.05), whereas the conventional HRV parameters did not change. Our results suggest that heart rhythm complexity is impaired in APA patients and that this is at least partially reversed by adrenalectomy.
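
    A sketch of the MSE computation: coarse-grain the RR-interval series at increasing scales and take sample entropy at each; "area 1-5" is then the sum of the curve over scales 1-5. The parameters m=2 and r=0.15·SD are common defaults, not necessarily the authors' choices.

    ```python
    # Multiscale entropy (MSE): sample entropy of coarse-grained series.
    import numpy as np

    def sample_entropy(x, m, r):
        x = np.asarray(x, float)
        def matches(mm):
            emb = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            # Unordered template-match pairs, self-matches excluded.
            return (np.sum(d <= r) - len(emb)) / 2.0
        b, a = matches(m), matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def coarse_grain(x, scale):
        n = len(x) // scale
        return np.asarray(x[:n * scale], float).reshape(n, scale).mean(axis=1)

    rng = np.random.default_rng(5)
    rr = rng.normal(0.8, 0.05, 1000)      # toy RR-interval series (seconds)
    r = 0.15 * np.std(rr)                 # tolerance fixed from the raw series
    mse = [sample_entropy(coarse_grain(rr, s), 2, r) for s in range(1, 6)]
    print("MSE curve:", np.round(mse, 3), "| area 1-5 analogue:", round(sum(mse), 3))
    ```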

  2. Synthesis of triple-stranded complexes using bis(dipyrromethene) ligands.

    PubMed

    Zhang, Zhan; Dolphin, David

    2010-12-20

    The reaction of an α-free, β,β'-linked bis(dipyrromethene) ligand with Fe(3+) or Co(3+) led to noninterconvertible triple-stranded helicates and mesocates. In the present context, a stable α-free ligand 2 has been developed and complexation of ligands 1 and 2 with diamagnetic Co(3+), Ga(3+), and In(3+) has been studied. The triple-stranded M(2)1(3) (M = Ga, In) and M(2)2(3) (M = Co, Ga, In) complexes were characterized using matrix-assisted laser desorption ionization time-of-flight spectrometry, (1)H NMR and UV-vis spectroscopy, and X-ray crystallography. Again, the (1)H NMR analysis showed that both the triple-stranded helicates and mesocates were generated in this metal-directed assembly. Consistent with our previous finding on coordinatively inert Co(3+) complexes, variable-temperature NMR spectroscopy indicated that the triple-stranded helicate and mesocate of labile In(3+) did not interconvert in solution, either. However, the diastereoselectivity of the M(2)2(3) complexes was found to improve with an increase in the reaction temperature. Taken together, this study complements the coordination chemistry of poly(dipyrromethene) ligands and provides further insight into the formation of helicates versus mesocates.

  3. Characterizing a Century of Climate and Hydrological Variability of a Mediterranean and Mountainous Watersheds: the Durance River Case-Study

    NASA Astrophysics Data System (ADS)

    Mathevet, T.; Kuentz, A.; Gailhard, J.; Andreassian, V.

    2013-12-01

    Improving the understanding of the hydrological variability of mountain watersheds is an important scientific issue for both researchers and water resources managers such as Electricite de France (Energy and Hydropower Company). The past and current context of climate variability enhances interest in this topic, since multi-purpose water resources management is highly sensitive to this variability. The Durance River watershed (14000 km2), situated in the French Alps, is a good example of the complexity of this issue. It is characterized by a variety of hydrological processes (from snowy to Mediterranean regimes) and a wide range of anthropogenic influences (hydropower, irrigation, flood control, tourism and water supply), mixing potential causes of changes in its hydrological regimes. As water-related stakes are numerous in this watershed, improving knowledge of the hydrological variability of the Durance River appears to be essential. In this presentation, we focus on a methodology we developed to build long-term historical hydrometeorological time-series, based on atmospheric reanalysis (20CR: 20th Century Reanalysis) and historical local observations. This methodology allowed us to generate precipitation, air temperature and streamflow time-series at a daily time-step for a sample of 22 watersheds, for the 1883-2010 period. These long-term streamflow reconstructions have been validated against historical searches that brought to light ten long historical series of daily streamflows beginning in the early 20th century. The reconstructions have rather good statistical properties, with good correlation (greater than 0.8) and limited mean and variance bias (less than 5%). These long-term hydrometeorological time-series then allowed us to characterize past variability in terms of available water resources, droughts and hydrological regime. These analyses help water resources managers to better understand the range of hydrological variability, which is usually greatly underestimated with the classical available time-series (less than 50 years).

  4. Computational analysis of liquid hypergolic propellant rocket engines

    NASA Technical Reports Server (NTRS)

    Krishnan, A.; Przekwas, A. J.; Gross, K. W.

    1992-01-01

    The combustion process in liquid rocket engines depends on a number of complex phenomena such as atomization, vaporization, spray dynamics, mixing, and reaction mechanisms. A computational tool to study their mutual interactions is developed to help analyze these processes with a view to improving existing designs and optimizing future designs of the thrust chamber. The focus of the article is on the analysis of the Variable Thrust Engine for the Orbit Maneuvering Vehicle. This engine uses a hypergolic liquid bipropellant combination of monomethyl hydrazine as fuel and nitrogen tetroxide as oxidizer.

  5. [Representation and mathematical analysis of human crystalline lens].

    PubMed

    Tălu, Stefan; Giovanzana, Stefano; Tălu, Mihai

    2011-01-01

    The surface of the human crystalline lens can be described and analyzed using mathematical models based on parametric representations, used in biomechanical studies and 3D solid modeling of the lens. The mathematical models used in lens biomechanics allow the study of crystalline lens behavior under variable and complex dynamic loads. Lens biomechanics also has the potential to improve outcomes in the development of intraocular lenses and in cataract surgery. The paper presents the most representative mathematical models currently used for modeling the human crystalline lens, both optically and biomechanically.

  6. Influences on the implementation of TQM in health care organizations: professional bureaucracies, ownership and complexity.

    PubMed

    Badrick, T; Preston, A

    2001-01-01

    TQM is introduced into many organisations in an attempt to improve productivity and quality. There are a number of organisational variables that have been recognised as influencing the success of TQM implementation including leadership, teamwork, and suppliers. This paper presents findings of a study of the implementation of TQM in Australian health care organisations. Structural factors were observed to affect the progress of TQM. Professional bureaucracies were less successful than machine bureaucracies. Private organisations were more successful than their public counterparts.

  7. Molecular Diagnostic Testing for Aspergillus

    PubMed Central

    Powers-Fletcher, Margaret V.

    2016-01-01

    The direct detection of Aspergillus nucleic acid in clinical specimens has the potential to improve the diagnosis of aspergillosis by offering more rapid and sensitive identification of invasive infections than is possible with traditional techniques, such as culture or histopathology. Molecular tests for Aspergillus have been limited historically by lack of standardization and variable sensitivities and specificities. Recent efforts have been directed at addressing these limitations and optimizing assay performance using a variety of specimen types. This review provides a summary of standardization efforts and outlines the complexities of molecular testing for Aspergillus in clinical mycology. PMID:27487954

  8. Integrating Patient Concerns into Parkinson's Disease Management.

    PubMed

    Lim, Shen-Yang; Tan, Ai Huey; Fox, Susan H; Evans, Andrew H; Low, Soon Chai

    2017-01-01

    Parkinson's disease (PD) is a complex motor and non-motor disorder and management is often challenging. In this review, we explore emerging approaches to improve the care of patients, drawing from the literature regarding patient-centred care, patient and caregiver perspectives and priorities, gaps in knowledge among patients and caregivers and the need for accurate information, individual variability in disease manifestations, prognostication of disease course, new developments in health technologies and personalized medicine, specialty care, pharmacological and non-pharmacological management, financial burden, lifestyle and work-related issues, support groups and palliative care.

  9. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    NASA Astrophysics Data System (ADS)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software used to calculate the values of measurands. Due to the number and nature of the variables affecting coordinate measurement results and the complex character and multi-dimensionality of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve upon the results obtained with classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
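
    A minimal comparison of simple random sampling with Latin Hypercube Sampling using SciPy's qmc module (SciPy >= 1.7); the 2-D integrand below is a stand-in for a measurand evaluation.

    ```python
    # LHS places exactly one sample in each equal-probability bin along every
    # input dimension, typically reducing the variance of the estimated mean.
    import numpy as np
    from scipy.stats import qmc

    def model(u):                    # toy 2-D "measurand" function
        return np.sin(np.pi * u[:, 0]) * u[:, 1] ** 2

    n, seed = 256, 0
    rng = np.random.default_rng(seed)

    plain = model(rng.random((n, 2)))                          # simple MC
    lhs = model(qmc.LatinHypercube(d=2, seed=seed).random(n))  # stratified

    print(f"plain MC mean {plain.mean():.4f}, LHS mean {lhs.mean():.4f}")
    ```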

  10. Mining temporal data sets: hypoplastic left heart syndrome case study

    NASA Astrophysics Data System (ADS)

    Kusiak, Andrew; Caldarone, Christopher A.; Kelleher, Michael D.; Lamb, Fred S.; Persoon, Thomas J.; Gan, Yuan; Burns, Alex

    2003-03-01

    Hypoplastic left heart syndrome (HLHS) affects infants and is uniformly fatal without surgery. Post-surgery mortality rates are highly variable and dependent on postoperative management. The high mortality after the first-stage surgery usually occurs within the first few days after the procedure. Typically, the deaths are attributed to the unstable balance between the pulmonary and systemic circulations. An experienced team of physicians, nurses, and therapists is required to successfully manage the infant. However, even the most experienced teams report significant mortality due to the extremely complex relationships among physiologic parameters in a given patient. A data acquisition system was developed for the simultaneous collection of 73 physiologic, laboratory, and nurse-assessed variables. Data records were created at intervals of 30 seconds. An expert-validated wellness score was computed for each data record. A training data set consisting of over 5000 data records from multiple patients was collected. Preliminary results demonstrated that the knowledge discovery approach was over 94.57% accurate in predicting the "wellness score" of an infant. The discovered knowledge can improve care of complex patients through the development of an intelligent simulator that can be used to support decisions.

  11. Network collaboration of organisations for homeless individuals in the Montreal region

    PubMed Central

    Fleury, Marie-Josée; Grenier, Guy; Lesage, Alain; Ma, Nan; Ngui, André Ngamini

    2014-01-01

    Introduction: We know little about the intensity and determinants of interorganisational collaboration within the homeless network. This study describes the characteristics and relationships (along with the variables predicting their degree of interorganisational collaboration) of 68 organisations of such a network in Montreal (Quebec, Canada). Theory and methods: Data were collected primarily through a self-administered questionnaire. Descriptive analyses were conducted followed by social network and multivariate analyses. Results: The Montreal homeless network has a high density (50.5%) and a decentralised structure and maintains a mostly informal collaboration with the public and cross-sectorial sectors. The network density showed more frequent contacts among four types of organisations, which could point to the existence of cliques. Four variables predicted interorganisational collaboration: organisation type, number of services offered, volume of referrals and satisfaction with the relationships with public organisations. Conclusions and discussion: The Montreal homeless network seems adequate to address non-complex homelessness problems. Considering, however, that most homeless individuals present chronic and complex profiles, it appears necessary to have a more formal and better integrated network of homeless organisations, particularly in the health and social service sectors, in order to improve services. PMID:24520216
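
    For readers unfamiliar with the density figure: network density is the fraction of possible ties actually present. A minimal illustration with networkx (a random graph, not the Montreal data):

    ```python
    # Density = 2*edges / (n*(n-1)) for an undirected graph of n nodes.
    import networkx as nx

    G = nx.gnp_random_graph(68, 0.505, seed=0)  # 68 organisations, ~50.5% tie probability
    print(f"{nx.density(G):.3f}")               # close to the reported 0.505
    ```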

  12. Workspace Program for Complex-Number Arithmetic

    NASA Technical Reports Server (NTRS)

    Patrick, M. C.; Howell, Leonard W., Jr.

    1986-01-01

    COMPLEX is a workspace program designed to empower APL with complex-number capabilities. Complex-variable methods provide analytical tools invaluable for applications in mathematics, science, and engineering. COMPLEX is written in APL.

  13. A multi-model approach to monitor emissions of CO2 and CO from an urban-industrial complex

    NASA Astrophysics Data System (ADS)

    Super, Ingrid; Denier van der Gon, Hugo A. C.; van der Molen, Michiel K.; Sterk, Hendrika A. M.; Hensen, Arjan; Peters, Wouter

    2017-11-01

    Monitoring urban-industrial emissions is often challenging because observations are scarce and regional atmospheric transport models are too coarse to represent the high spatiotemporal variability in the resulting concentrations. In this paper we apply a new combination of an Eulerian model (Weather Research and Forecast, WRF, with chemistry) and a Gaussian plume model (Operational Priority Substances - OPS). The modelled mixing ratios are compared to observed CO2 and CO mole fractions at four sites along a transect from an urban-industrial complex (Rotterdam, the Netherlands) towards rural conditions for October-December 2014. Urban plumes are well-mixed at our semi-urban location, making this location suited for an integrated emission estimate over the whole study area. The signals at our urban measurement site (with average enhancements of 11 ppm CO2 and 40 ppb CO over the baseline) are highly variable due to the presence of distinct source areas dominated by road traffic/residential heating emissions or industrial activities. This causes different emission signatures that are translated into a large variability in observed ΔCO : ΔCO2 ratios, which can be used to identify dominant source types. We find that WRF-Chem is able to represent synoptic variability in CO2 and CO (e.g. the median CO2 mixing ratio is 9.7 ppm, observed, against 8.8 ppm, modelled), but it fails to reproduce the hourly variability of daytime urban plumes at the urban site (R2 up to 0.05). For the urban site, adding a plume model to the model framework is beneficial to adequately represent plume transport especially from stack emissions. The explained variance in hourly, daytime CO2 enhancements from point source emissions increases from 30 % with WRF-Chem to 52 % with WRF-Chem in combination with the most detailed OPS simulation. The simulated variability in ΔCO : ΔCO2 ratios decreases drastically from 1.5 to 0.6 ppb ppm-1, which agrees better with the observed standard deviation of 0.4 ppb ppm-1. This is partly due to improved wind fields (increase in R2 of 0.10) but also due to improved point source representation (increase in R2 of 0.05) and dilution (increase in R2 of 0.07). Based on our analysis we conclude that a plume model with detailed and accurate dispersion parameters adds substantially to top-down monitoring of greenhouse gas emissions in urban environments with large point source contributions within a ~10 km radius from the observation sites.
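
    For context, OPS belongs to the Gaussian plume family of dispersion models. A textbook form of the plume equation for a point source of strength Q at effective height H, with wind speed u and dispersion widths σ_y(x), σ_z(x) (OPS's own parameterizations differ in detail), is:

    ```latex
    C(x, y, z) = \frac{Q}{2\pi\, u\, \sigma_y(x)\, \sigma_z(x)}
      \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
      \left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
          + \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right].
    ```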

  14. Batch-mode Reinforcement Learning for improved hydro-environmental systems management

    NASA Astrophysics Data System (ADS)

    Castelletti, A.; Galelli, S.; Restelli, M.; Soncini-Sessa, R.

    2010-12-01

    Despite the great progress made in recent decades, the optimal management of hydro-environmental systems still remains a very active and challenging research area. The combination of multiple, often conflicting interests, high non-linearities of the physical processes and the management objectives, strong uncertainties in the inputs, and a high-dimensional state makes the problem challenging and intriguing. Stochastic Dynamic Programming (SDP) is one of the most suitable methods for designing (Pareto) optimal management policies preserving the original problem complexity. However, it suffers from a dual curse, which, de facto, prevents its practical application to even reasonably complex water systems. (i) Computational requirements grow exponentially with state and control dimension (Bellman's curse of dimensionality), so that SDP cannot be used with water systems where the state vector includes more than a few (2-3) units. (ii) An explicit model of each system's component is required (curse of modelling) to anticipate the effects of the system transitions, i.e. any information included in the SDP framework can only be either a state variable described by a dynamic model or a stochastic disturbance, independent in time, with the associated pdf. Any exogenous information that could effectively improve the system operation cannot be explicitly considered in taking the management decision, unless a dynamic model is identified for each additional piece of information, thus adding to the problem complexity through the curse of dimensionality (additional state variables). To mitigate this dual curse, the combined use of batch-mode Reinforcement Learning (bRL) and Dynamic Model Reduction (DMR) techniques is explored in this study. bRL overcomes the curse of modelling by replacing explicit modelling with an external simulator and/or historical observations. The curse of dimensionality is averted using a functional approximation of the SDP value function based on proper non-linear regressors. DMR reduces the complexity and the associated computational requirements of non-linear distributed process-based models, making them suitable for inclusion in optimization schemes. Results from real-world applications of the approach are also presented, including reservoir operation with both quality and quantity targets.
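
    A sketch of one standard bRL algorithm, fitted Q-iteration, on a toy reservoir: historical (state, action, reward, next-state) tuples replace an explicit model, which is exactly how bRL sidesteps the curse of modelling. The dynamics, reward, and regressor below are illustrative, not the study's.

    ```python
    # Fitted Q-iteration with an extra-trees regressor as value-function
    # approximator (a classic choice for FQI).
    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor

    rng = np.random.default_rng(11)
    n, actions, gamma = 2000, np.array([0.0, 0.5, 1.0]), 0.95

    # Toy historical batch: state = reservoir storage, action = release fraction.
    s = rng.uniform(0, 1, n)
    a = rng.choice(actions, n)
    s_next = np.clip(s - 0.3 * a + rng.normal(0.15, 0.05, n), 0, 1)
    reward = -np.abs(s_next - 0.6)     # toy objective: track a target storage

    q = ExtraTreesRegressor(n_estimators=50, random_state=0)
    q.fit(np.column_stack([s, a]), reward)   # Q_1 = immediate reward
    for _ in range(20):   # Q_{k+1}(s,a) = r + gamma * max_a' Q_k(s',a')
        next_q = np.max(
            [q.predict(np.column_stack([s_next, np.full(n, act)]))
             for act in actions], axis=0)
        q.fit(np.column_stack([s, a]), reward + gamma * next_q)

    best = actions[np.argmax([q.predict([[0.8, act]]) for act in actions])]
    print("greedy release fraction at storage 0.8:", best)
    ```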

  15. An isotopic view of water and nitrate transport through the vadose zone in Oregon's southern Willamette Valley's Groundwater Management Area

    NASA Astrophysics Data System (ADS)

    Brooks, J. R.; Pearlstein, S.; Hutchins, S.; Faulkner, B. R.; Rugh, W.; Willard, K.; Coulombe, R.; Compton, J.

    2017-12-01

    Groundwater nitrate contamination affects thousands of households in Oregon's southern Willamette Valley and many more across the USA. The southern Willamette Valley Groundwater Management Area (GWMA) was established in 2004 due to nitrate levels in the groundwater exceeding the human health standard of 10 mg nitrate-N L-1. Much of the nitrogen (N) inputs to the GWMA comes from agricultural fertilizers, and thus efforts to reduce N inputs to groundwater are focused upon improving N management. However, the effectiveness of these improvements on groundwater quality is unclear because of the complexity of nutrient transport through the vadose zone and long groundwater residence times. Our objective was to focus on vadose zone transport and understand the dynamics and timing of N and water movement below the rooting zone in relation to N management and water inputs. Stable isotopes are a powerful tool for tracking water movement, and understanding N transformations. In partnership with local farmers and state agencies, we established lysimeters and groundwater wells in multiple agricultural fields in the GWMA, and have monitored nitrate, nitrate isotopes, and water isotopes weekly for multiple years. Our results indicate that vadose zone transport is highly complex, and the residence time of water collected in lysimeters was much longer than expected. While input precipitation water isotopes were highly variable over time, lysimeter water isotopes were surprisingly consistent, more closely resembling long-term precipitation isotope means rather than recent precipitation isotopic signatures. However, some particularly large precipitation events with unique isotopic signatures revealed high spatial variability in transport, with some lysimeters showing greater proportions of recent precipitation inputs than others. In one installation where we have groundwater wells and lysimeters at multiple depths, nitrate/nitrite concentrations decreased with depth. N concentrations and δ15N values indicated leaching at 1 m and denitrification at 3 m depth. However, these relationships showed spatial and temporal complexity. We are exploring how these vadose zone complexities can be incorporated into practical understanding of the impacts of N management on groundwater inputs.

  16. An open-chain imaginary-time path-integral sampling approach to the calculation of approximate symmetrized quantum time correlation functions.

    PubMed

    Cendagorta, Joseph R; Bačić, Zlatko; Tuckerman, Mark E

    2018-03-14

    We introduce a scheme for approximating quantum time correlation functions numerically within the Feynman path integral formulation. Starting with the symmetrized version of the correlation function expressed as a discretized path integral, we introduce a change of integration variables often used in the derivation of trajectory-based semiclassical methods. In particular, we transform to sum and difference variables between forward and backward complex-time propagation paths. Once the transformation is performed, the potential energy is expanded in powers of the difference variables, which allows us to perform the integrals over these variables analytically. The manner in which this procedure is carried out results in an open-chain path integral (in the remaining sum variables) with a modified potential that is evaluated using imaginary-time path-integral sampling rather than requiring the generation of a large ensemble of trajectories. Consequently, any number of path integral sampling schemes can be employed to compute the remaining path integral, including Monte Carlo, path-integral molecular dynamics, or enhanced path-integral molecular dynamics. We believe that this approach constitutes a different perspective in semiclassical-type approximations to quantum time correlation functions. Importantly, we argue that our approximation can be systematically improved within a cumulant expansion formalism. We test this approximation on a set of one-dimensional problems that are commonly used to benchmark approximate quantum dynamical schemes. We show that the method is at least as accurate as the popular ring-polymer molecular dynamics technique and linearized semiclassical initial value representation for correlation functions of linear operators in most of these examples and improves the accuracy of correlation functions of nonlinear operators.
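
    For reference, the symmetrized quantum time correlation function referred to above is conventionally written as (standard definition; the paper's notation may differ):

    ```latex
    G_{AB}(t) = \frac{1}{Z}\,\mathrm{Tr}\!\left[e^{-\beta\hat H/2}\,\hat A\,
      e^{-\beta\hat H/2}\, e^{i\hat H t/\hbar}\,\hat B\, e^{-i\hat H t/\hbar}\right],
    \qquad Z = \mathrm{Tr}\,e^{-\beta\hat H},
    ```

    which equals the standard correlation function analytically continued to complex time, C_{AB}(t + iβħ/2); the forward and backward complex-time propagators appearing here are the paths the authors discretize and transform to sum and difference variables.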

  17. An open-chain imaginary-time path-integral sampling approach to the calculation of approximate symmetrized quantum time correlation functions

    NASA Astrophysics Data System (ADS)

    Cendagorta, Joseph R.; Bačić, Zlatko; Tuckerman, Mark E.

    2018-03-01

    We introduce a scheme for approximating quantum time correlation functions numerically within the Feynman path integral formulation. Starting with the symmetrized version of the correlation function expressed as a discretized path integral, we introduce a change of integration variables often used in the derivation of trajectory-based semiclassical methods. In particular, we transform to sum and difference variables between forward and backward complex-time propagation paths. Once the transformation is performed, the potential energy is expanded in powers of the difference variables, which allows us to perform the integrals over these variables analytically. The manner in which this procedure is carried out results in an open-chain path integral (in the remaining sum variables) with a modified potential that is evaluated using imaginary-time path-integral sampling rather than requiring the generation of a large ensemble of trajectories. Consequently, any number of path integral sampling schemes can be employed to compute the remaining path integral, including Monte Carlo, path-integral molecular dynamics, or enhanced path-integral molecular dynamics. We believe that this approach constitutes a different perspective in semiclassical-type approximations to quantum time correlation functions. Importantly, we argue that our approximation can be systematically improved within a cumulant expansion formalism. We test this approximation on a set of one-dimensional problems that are commonly used to benchmark approximate quantum dynamical schemes. We show that the method is at least as accurate as the popular ring-polymer molecular dynamics technique and linearized semiclassical initial value representation for correlation functions of linear operators in most of these examples and improves the accuracy of correlation functions of nonlinear operators.

  18. Conformational Flexibility and Subunit Arrangement of the Modular Yeast Spt-Ada-Gcn5 Acetyltransferase Complex*

    PubMed Central

    Setiaputra, Dheva; Ross, James D.; Lu, Shan; Cheng, Derrick T.; Dong, Meng-Qiu; Yip, Calvin K.

    2015-01-01

    The Spt-Ada-Gcn5 acetyltransferase (SAGA) complex is a highly conserved, 19-subunit histone acetyltransferase complex that activates transcription through acetylation and deubiquitination of nucleosomal histones in Saccharomyces cerevisiae. Because SAGA has been shown to display conformational variability, we applied gradient fixation to stabilize purified SAGA and systematically analyzed this flexibility using single-particle EM. Our two- and three-dimensional studies show that SAGA adopts three major conformations, and mutations of specific subunits affect the distribution among these. We also located the four functional modules of SAGA using electron microscopy-based labeling and transcriptional activator binding analyses and show that the acetyltransferase module is localized in the most mobile region of the complex. We further comprehensively mapped the subunit interconnectivity of SAGA using cross-linking mass spectrometry, revealing that the Spt and Taf subunits form the structural core of the complex. These results provide the necessary restraints for us to generate a model of the spatial arrangement of all SAGA subunits. According to this model, the chromatin-binding domains of SAGA are all clustered in one face of the complex that is highly flexible. Our results relate information of overall SAGA structure with detailed subunit level interactions, improving our understanding of its architecture and flexibility. PMID:25713136

  19. [Real time 3D echocardiography]

    NASA Technical Reports Server (NTRS)

    Bauer, F.; Shiota, T.; Thomas, J. D.

    2001-01-01

    Three-dimensional representation of the heart is a long-standing concern. Usually, 3D reconstruction of the cardiac mass is made by successive acquisition of 2D sections, the spatial localisation and orientation of which require complex guiding systems. More recently, the concept of volumetric acquisition has been introduced. A matricial emitter-receiver probe complex with parallel data processing provides instantaneous acquisition of a pyramidal 64 degrees x 64 degrees volume. The image is displayed in real time and is composed of three planes (including planes B and C) which can be displaced in all spatial directions at any time during acquisition. The flexibility of this acquisition system allows volume and mass measurement with greater accuracy and reproducibility, limiting inter-observer variability. Free navigation of the planes of investigation allows reconstruction for qualitative and quantitative analysis of valvular heart disease and other pathologies. Although real time 3D echocardiography is ready for clinical use, some improvements are still needed to make it more user-friendly. Real time 3D echocardiography could then become an essential tool for the understanding, diagnosis and management of patients.

  20. Development of an Enhanced Metaproteomic Approach for Deepening the Microbiome Characterization of the Human Infant Gut

    PubMed Central

    2015-01-01

    The establishment of early life microbiota in the human infant gut is highly variable and plays a crucial role in host nutrient availability/uptake and maturation of immunity. Although high-performance mass spectrometry (MS)-based metaproteomics is a powerful method for the functional characterization of complex microbial communities, the acquisition of comprehensive metaproteomic information in human fecal samples is inhibited by the presence of abundant human proteins. To alleviate this restriction, we have designed a novel metaproteomic strategy based on double filtering (DF) of the raw samples, a method that fractionates microbial from human cells to enhance microbial protein identification and characterization in complex fecal samples from healthy premature infants. This method dramatically improved the overall depth of infant gut proteome measurement, with an increase in the number of identified low-abundance proteins and a greater than 2-fold improvement in microbial protein identification and quantification. This enhancement of proteome measurement depth enabled a more extensive microbiome comparison between infants, not only increasing the confidence of identified microbial functional categories but also revealing previously undetected categories. PMID:25350865

  1. Evaluation of alternative model selection criteria in the analysis of unimodal response curves using CART

    USGS Publications Warehouse

    Ribic, C.A.; Miller, T.W.

    1998-01-01

    We investigated CART performance with a unimodal response curve for one continuous response and four continuous explanatory variables, where two variables were important (i.e., directly related to the response) and the other two were not. We explored performance under three relationship strengths and two explanatory-variable conditions: equal importance, and one variable four times as important as the other. We compared CART variable selection performance using three tree-selection rules ('minimum risk', 'minimum risk complexity', 'one standard error') to stepwise polynomial ordinary least squares (OLS) under four sample size conditions. The one-standard-error and minimum-risk-complexity methods performed about as well as stepwise OLS with large sample sizes when the relationship was strong. With weaker relationships, equally important explanatory variables and larger sample sizes, the one-standard-error and minimum-risk-complexity rules performed better than stepwise OLS. With weaker relationships and explanatory variables of unequal importance, tree-structured methods did not perform as well as stepwise OLS. Comparing performance within tree-structured methods, the one-standard-error rule was more likely than the other tree-selection rules to choose the correct model with a strong relationship and equally important explanatory variables; this also held 1) with weaker relationships and equally important explanatory variables, and 2) under all relationship strengths when explanatory variables were of unequal importance and sample sizes were lower.
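
    The 'one standard error' rule is easy to state in code: along the cost-complexity pruning path, pick the simplest tree whose cross-validated risk is within one standard error of the minimum. A sketch with scikit-learn on a toy unimodal response (the paper's simulation design is not reproduced):

    ```python
    # Minimum-risk vs one-standard-error tree selection via CART-style
    # cost-complexity pruning.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(2)
    X = rng.uniform(-2, 2, size=(400, 4))   # x0, x1 relevant; x2, x3 noise
    y = np.exp(-(X[:, 0] ** 2 + X[:, 1] ** 2)) + rng.normal(0, 0.1, 400)

    path = DecisionTreeRegressor(random_state=0).cost_complexity_pruning_path(X, y)
    scores = []
    for alpha in path.ccp_alphas[::5]:      # thin the path for speed
        cv = cross_val_score(
            DecisionTreeRegressor(ccp_alpha=alpha, random_state=0),
            X, y, cv=5, scoring="neg_mean_squared_error")
        scores.append((alpha, -cv.mean(), cv.std() / np.sqrt(len(cv))))

    alphas, risks, ses = map(np.array, zip(*scores))
    best = np.argmin(risks)                 # 'minimum risk' rule
    # 1-SE rule: largest alpha (simplest tree) whose CV risk is within one
    # standard error of the minimum.
    one_se = alphas[risks <= risks[best] + ses[best]].max()
    print(f"min-risk alpha={alphas[best]:.4g}, one-SE alpha={one_se:.4g}")
    ```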

  2. Therapeutic drug monitoring in patients with inflammatory bowel disease

    PubMed Central

    Yarur, Andres J; Abreu, Maria T; Deshpande, Amar R; Kerman, David H; Sussman, Daniel A

    2014-01-01

    Thiopurine analogs and anti-tumor necrosis factor (TNF) agents have dramatically changed the therapeutics of inflammatory bowel diseases (IBD), improving short- and long-term outcomes. Unfortunately, some patients do not respond to therapy and others lose response over time. The pharmacokinetic properties of these drugs are complex, with high inter-patient variability. Thiopurine analogs are metabolized through a series of pathways, which vary according to the patients’ pharmacogenetic profile. This profile largely determines the ratios of metabolites, which are in turn associated with likelihoods of clinical efficacy and/or toxicity. Understanding these mechanisms allows for manipulation of drug dose, aiming to reduce the development of toxicity while improving the efficacy of treatment. The efficacy of anti-TNF drugs is influenced by many pharmacodynamic variables. Several factors may alter drug clearance, including the concomitant use of immunomodulators (thiopurine analogs and methotrexate), systemic inflammation, the presence of anti-drug antibodies, and body mass. The treatment of IBD has evolved with the understanding of the pharmacologic profiles of immunomodulating and TNF-inhibiting medications, with good evidence for improvement in patient outcomes observed when measuring metabolic pathway indices. The role of routine measurement of metabolite/drug levels and antibodies warrants further prospective studies as we enter the era of personalized IBD care. PMID:24707130

  3. Can purchasing information be used to predict adherence to cardiovascular medications? An analysis of linked retail pharmacy and insurance claims data

    PubMed Central

    Krumme, Alexis A; Sanfélix-Gimeno, Gabriel; Franklin, Jessica M; Isaman, Danielle L; Mahesri, Mufaddal; Matlin, Olga S; Shrank, William H; Brennan, Troyen A; Brill, Gregory; Choudhry, Niteesh K

    2016-01-01

    Objective: The use of retail purchasing data may improve adherence prediction over approaches using healthcare insurance claims alone. Design: Retrospective. Setting and participants: A cohort of patients who received prescription medication benefits through CVS Caremark, used a CVS Pharmacy ExtraCare Health Care (ECHC) loyalty card, and initiated a statin medication in 2011. Outcome: We evaluated associations between retail purchasing patterns and optimal adherence to statins in the 12 subsequent months. Results: Among 11 010 statin initiators, 43% were optimally adherent at 12 months of follow-up. Greater numbers of store visits per month and dollar amount per visit were positively associated with optimal adherence, as was making a purchase on the same day as filling a prescription (p<0.0001 for all). Models to predict adherence using retail purchase variables had low discriminative ability (C-statistic: 0.563), while models with both clinical and retail purchase variables achieved a C-statistic of 0.617. Conclusions: While the use of retail purchases may improve the discriminative ability of claims-based approaches, these data alone appear inadequate for adherence prediction, even with the addition of more complex analytical approaches. Nevertheless, associations between retail purchasing behaviours and adherence could inform the development of quality improvement interventions. PMID:28186924

  4. Replica exchange and expanded ensemble simulations as Gibbs sampling: simple improvements for enhanced mixing.

    PubMed

    Chodera, John D; Shirts, Michael R

    2011-11-21

    The widespread popularity of replica exchange and expanded ensemble algorithms for simulating complex molecular systems in chemistry and biophysics has generated much interest in discovering new ways to enhance the phase space mixing of these protocols in order to improve sampling of uncorrelated configurations. Here, we demonstrate how both of these classes of algorithms can be considered as special cases of Gibbs sampling within a Markov chain Monte Carlo framework. Gibbs sampling is a well-studied scheme in the field of statistical inference in which different random variables are alternately updated from conditional distributions. While the update of the conformational degrees of freedom by Metropolis Monte Carlo or molecular dynamics unavoidably generates correlated samples, we show how judicious updating of the thermodynamic state indices--corresponding to thermodynamic parameters such as temperature or alchemical coupling variables--can substantially increase mixing while still sampling from the desired distributions. We show how state update methods in common use can lead to suboptimal mixing, and present some simple, inexpensive alternatives that can increase mixing of the overall Markov chain, reducing simulation times necessary to obtain estimates of the desired precision. These improved schemes are demonstrated for several common applications, including an alchemical expanded ensemble simulation, parallel tempering, and multidimensional replica exchange umbrella sampling.
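
    As a concrete illustration of the state-update idea, the following minimal Python sketch draws a new thermodynamic state index from its full conditional, p(k | x) ∝ exp(g_k − β_k U(x)), instead of proposing only neighbor swaps. It is a toy under an assumed reduced-potential form, and the function and variable names (gibbs_state_update, log_weights) are illustrative, not taken from the paper.

        import numpy as np

        def gibbs_state_update(energy, betas, log_weights, rng):
            """Draw a new state index k from p(k | x) ~ exp(g_k - beta_k * U(x)).

            This is independence ("Gibbs") sampling over the state index: the
            chain can jump several temperatures at once, one way such schemes
            can mix faster than neighbor-only swap moves."""
            log_p = log_weights - betas * energy   # unnormalized log-probabilities
            log_p -= log_p.max()                   # stabilize the exponential
            p = np.exp(log_p)
            return rng.choice(len(betas), p=p / p.sum())

        # Toy usage: five inverse temperatures, flat log-weights g_k = 0.
        rng = np.random.default_rng(0)
        betas = 1.0 / np.linspace(1.0, 5.0, 5)
        k = gibbs_state_update(energy=2.3, betas=betas, log_weights=np.zeros(5), rng=rng)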

  5. Primer in Genetics and Genomics, Article 2-Advancing Nursing Research With Genomic Approaches.

    PubMed

    Lee, Hyunhwa; Gill, Jessica; Barr, Taura; Yun, Sijung; Kim, Hyungsuk

    2017-03-01

    Nurses investigate reasons for variable patient symptoms and responses to treatments to inform how best to improve outcomes. Genomics has the potential to guide nursing research exploring contributions to individual variability. This article is meant to serve as an introduction to the novel methods available through genomics for addressing this critical issue and includes a review of methodological considerations for selected genomic approaches. This review presents essential concepts in genetics and genomics that will allow readers to identify upcoming trends in genomics nursing research and improve research practice. It introduces general principles of genomic research and provides an overview of the research process. It also highlights selected nursing studies that serve as clinical examples of the use of genomic technologies. Finally, the authors provide suggestions about how to apply genomic technology in nursing research along with directions for future research. Using genomic approaches in nursing research can advance the understanding of the complex pathophysiology of disease susceptibility and different patient responses to interventions. Nurses should be incorporating genomics into education, clinical practice, and research as the influence of genomics in health-care research and practice continues to grow. Nurses are also well placed to translate genomic discoveries into improved methods for patient assessment and intervention.

  6. Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques

    NASA Astrophysics Data System (ADS)

    Elliott, Louie C.

    This dissertation reports on the application of numerical optimization techniques to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
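
    The "numerical derivatives with complex variables" mentioned above commonly refers to the complex-step method, in which f'(x) ≈ Im[f(x + ih)]/h with a tiny step h. The sketch below is a generic illustration, not code from the dissertation.

        import numpy as np

        def complex_step_derivative(f, x, h=1e-30):
            """df/dx via the complex-step method: Im[f(x + ih)] / h.

            There is no subtractive cancellation (unlike finite differences),
            so h can be made extremely small; f must accept complex input."""
            return np.imag(f(x + 1j * h)) / h

        # Example: d/dx [exp(x) * sin(x)] at x = 1.5 matches the analytic value.
        f = lambda x: np.exp(x) * np.sin(x)
        print(complex_step_derivative(f, 1.5))
        print(np.exp(1.5) * (np.sin(1.5) + np.cos(1.5)))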

  7. Making Student Online Teams Work

    ERIC Educational Resources Information Center

    Olsen, Joel; Kalinski, Ray

    2017-01-01

    Online professors typically assign teams based on time zones, performance, or alphabet, but are these the best ways to position student virtual teams for success? Personality and task complexity could provide additional direction. Personality and task complexity were used as independent variables related to the dependent variable of team…

  8. 'Potentially inappropriate or specifically appropriate?' Qualitative evaluation of general practitioners views on prescribing, polypharmacy and potentially inappropriate prescribing in older people.

    PubMed

    Clyne, Barbara; Cooper, Janine A; Hughes, Carmel M; Fahey, Tom; Smith, Susan M

    2016-08-11

    Potentially inappropriate prescribing (PIP) is common in older people in primary care, as evidenced by a significant body of quantitative research. However, relatively few qualitative studies have investigated the phenomenon of PIP and its underlying processes from the perspective of general practitioners (GPs). The aim of this paper is to qualitatively explore GP perspectives regarding prescribing and PIP in older primary care patients. Semi-structured qualitative interviews were conducted with GPs participating in a randomised controlled trial (RCT) of an intervention to decrease PIP in older patients (≥70 years) in Ireland. Interviews were conducted with GP participants (both intervention and control) from the OPTI-SCRIPT cluster RCT as part of the trial process evaluation between January and July 2013. Interviews were conducted by one interviewer and audio recorded. Interviews were transcribed verbatim and a thematic analysis was conducted. Seventeen semi-structured interviews were conducted (13 male; 4 female). Three main, inter-related themes emerged: a complex prescribing environment, a paternalistic doctor-patient relationship, and the relevance of the PIP concept. Patient complexity (e.g. polypharmacy, multimorbidity), as well as prescriber complexity (e.g. multiple prescribers, poor communication, restricted autonomy), were identified as factors contributing to a complex prescribing environment where PIP could occur, as was a paternalistic doctor-patient relationship. The concept of PIP was perceived to be of variable usefulness to GPs, and the criteria to measure it may be at odds with the complex processes of prescribing for this patient population. Several inter-related factors contributing to the occurrence of PIP were identified, some of which may be amenable to intervention. Improvement strategies focused on better management of polypharmacy and multimorbidity, and on communication across primary and secondary care, could result in substantial reductions in PIP. Current controlled trials ISRCTN41694007.

  9. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. Various methods exist, such as time-domain measures, minimum variance benchmarks, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that signals originating from industrial installations have multifractal properties, and such an analysis may extend the standard approach with further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process, helping to discover internal dependencies and human factors that are otherwise hard to detect.
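
    The abstract does not name a specific estimator; one standard way to test for multifractality in such signals is multifractal detrended fluctuation analysis (MFDFA), sketched below. The scales, moments q, and linear detrending order are illustrative assumptions.

        import numpy as np

        def mfdfa_hurst(x, scales, qs):
            """Generalized Hurst exponents h(q) via a minimal MFDFA.

            If the slope h(q) of log F_q(s) vs. log s varies with q, the
            signal is multifractal rather than monofractal."""
            y = np.cumsum(x - np.mean(x))                 # integrated profile
            hq = []
            for q in qs:
                logF = []
                for s in scales:
                    n = len(y) // s
                    segs = y[:n * s].reshape(n, s)
                    t = np.arange(s)
                    f2 = np.empty(n)
                    for i, seg in enumerate(segs):        # linear detrend per window
                        c = np.polyfit(t, seg, 1)
                        f2[i] = np.mean((seg - np.polyval(c, t)) ** 2)
                    if q == 0:
                        Fq = np.exp(0.5 * np.mean(np.log(f2)))
                    else:
                        Fq = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
                    logF.append(np.log(Fq))
                hq.append(np.polyfit(np.log(scales), logF, 1)[0])
            return np.array(hq)

        # White noise is monofractal: h(q) stays near 0.5 for all q.
        x = np.random.default_rng(1).standard_normal(4096)
        print(mfdfa_hurst(x, scales=[16, 32, 64, 128, 256], qs=[-2, 0, 2]))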

  10. Prediction of biodiversity hotspots in the Anthropocene: The case of veteran oaks.

    PubMed

    Skarpaas, Olav; Blumentrath, Stefan; Evju, Marianne; Sverdrup-Thygeson, Anne

    2017-10-01

    Over the past centuries, humans have transformed large parts of the biosphere, and there is a growing need to understand and predict the distribution of biodiversity hotspots influenced by the presence of humans. Our basic hypothesis is that human influence in the Anthropocene is ubiquitous, and we predict that biodiversity hotspot modeling can be improved by addressing three challenges raised by the increasing ecological influence of humans: (i) anthropogenically modified responses to individual ecological factors, (ii) fundamentally different processes and predictors in landscape types shaped by different land use histories and (iii) a multitude and complexity of natural and anthropogenic processes that may require many predictors and even multiple models in different landscape types. We modeled the occurrence of veteran oaks in Norway and found, in accordance with our basic hypothesis and predictions, that humans influence the distribution of veteran oaks throughout its range, but in different ways in forests and open landscapes. In forests, geographical and topographic variables related to the oak niche are still important, but the occurrence of veteran oaks is shifted toward steeper slopes, where logging is difficult. In open landscapes, land cover variables are more important, and veteran oaks are more common toward the north than expected from the fundamental oak niche. In both landscape types, multiple predictor variables representing ecological and human-influenced processes were needed to build a good model, and several models performed almost equally well. Models accounting for the different anthropogenic influences on landscape structure and processes consistently performed better than models based exclusively on natural biogeographical and ecological predictors. Thus, our results for veteran oaks clearly illustrate the challenges to distribution modeling raised by the ubiquitous influence of humans, even in a moderately populated region, but also show that predictions can be improved by explicitly addressing these anthropogenic complexities.

  11. Utilizing NASA DISCOVER-AQ Data to Examine Spatial Gradients in Complex Emission Environments

    NASA Astrophysics Data System (ADS)

    Buzanowicz, M. E.; Moore, W.; Crawford, J. H.; Schroeder, J.

    2017-12-01

    Although many regulations have been enacted with the goal of improving air quality, many parts of the US are still classified as `non-attainment areas' because they frequently violate federal air quality standards. Adequately monitoring the spatial distribution of pollutants both within and outside of non-attainment areas has been an ongoing challenge for regulators. Observations of near-surface pollution from space-based platforms would provide an unprecedented view of the spatial distribution of pollution, but this goal has not yet been realized due to fundamental limitations of satellites, specifically because the footprint of satellite measurements may not be sufficiently small to capture true gradients in pollution and instead represents an average over a large area. NASA's DISCOVER-AQ was a multi-year field campaign aimed at improving our understanding of the role that remote sensing, including satellite-based remote sensing, could play in air quality monitoring systems. DISCOVER-AQ data will be utilized to create a metric to examine spatial gradients and how satellites can capture those gradients in areas with complex emission environments. Examining horizontal variability within a vertical column is critical to understanding mixing within the atmosphere. Aircraft spirals conducted during DISCOVER-AQ were divided into octants, and averages of a given species were calculated, with certain points receiving a flag. These flags were determined by calculating gradients between subsequent octants. Initial calculations have shown that over areas with large point source emissions, such as Platteville and Denver-La Casa in Colorado, and Essex, Maryland, satellite retrievals may not adequately capture spatial variability in the atmosphere, thus complicating satellite inversion techniques and limiting our ability to understand human exposure on sub-grid scales. Further calculations at other locations and for other trace gases are necessary to determine the effects of vertical variability within the atmosphere.
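
    The octant-averaging and flagging step might look like the minimal sketch below. Whether octants were defined by spiral azimuth or by altitude is not stated in the abstract; azimuthal octants and a 20% relative-gradient threshold are assumptions for illustration only.

        import numpy as np

        def flag_octant_gradients(azimuth_deg, species, rel_threshold=0.2):
            """Average a measured species in eight azimuthal octants of an
            aircraft spiral and flag octants whose change from the previous
            octant exceeds a relative threshold (assumes every octant was
            sampled)."""
            az = np.asarray(azimuth_deg, dtype=float) % 360.0
            vals = np.asarray(species, dtype=float)
            octant = (az // 45.0).astype(int)                  # octant index 0..7
            means = np.array([vals[octant == k].mean() for k in range(8)])
            flags = np.zeros(8, dtype=bool)
            for k in range(1, 8):
                denom = max(abs(means[k - 1]), 1e-12)          # guard divide-by-zero
                flags[k] = abs(means[k] - means[k - 1]) / denom > rel_threshold
            return means, flags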

  12. Aortic arch atherosclerosis in patients with severe aortic stenosis can be argued by greater day-by-day blood pressure variability.

    PubMed

    Iwata, Shinichi; Sugioka, Kenichi; Fujita, Suwako; Ito, Asahiro; Matsumura, Yoshiki; Hanatani, Akihisa; Takagi, Masahiko; Di Tullio, Marco R; Homma, Shunichi; Yoshiyama, Minoru

    2015-07-01

    Although it is well known that the prevalence of aortic arch plaques, one of the risk factors for ischemic stroke, is high in patients with severe aortic stenosis, the underlying mechanisms are not well understood. Increased day-by-day blood pressure (BP) variability is also known to be associated with stroke; however, little is known about the association between day-by-day BP variability and aortic arch atherosclerosis in patients with aortic stenosis. Our objective was to clarify the association between day-by-day BP variables (average values and variability) and aortic arch atherosclerosis in patients with severe aortic stenosis. The study population consisted of 104 consecutive patients (mean age 75 ± 8 years) with severe aortic stenosis who were scheduled for aortic valve replacement. BP was measured in the morning on at least 4 consecutive days (mean 6.8 days) prior to the day of surgery. Large (≥4 mm), ulcerated, or mobile plaques were defined as complex plaques using transesophageal echocardiography. Cigarette smoking and all systolic BP variables were associated with the presence of complex plaques (p < 0.05), whereas diastolic BP variables were not. Multiple regression analysis indicated that day-by-day mean systolic BP and day-by-day systolic BP variability remained independently associated with the presence of complex plaques (p < 0.05) after adjustment for age, male sex, cigarette smoking, hypertension, hypercholesterolemia, and diabetes mellitus. These findings suggest that higher day-by-day mean systolic BP and day-by-day systolic BP variability are associated with complex plaques in the aortic arch, and consequently with stroke risk, in patients with aortic stenosis. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
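
    For reference, "day-by-day variability" in studies of this kind is usually the standard deviation (sometimes the coefficient of variation) of the morning readings across days; the abstract does not state which index was used, so both are shown in this minimal sketch with illustrative values.

        import numpy as np

        def day_by_day_bp_summary(systolic_mmHg):
            """Mean, SD, and coefficient of variation of morning systolic BP
            readings taken on consecutive days."""
            x = np.asarray(systolic_mmHg, dtype=float)
            mean = x.mean()
            sd = x.std(ddof=1)              # day-by-day variability
            cv = 100.0 * sd / mean          # variability as % of the mean
            return mean, sd, cv

        # Six consecutive mornings (illustrative values, mmHg).
        print(day_by_day_bp_summary([142, 155, 138, 149, 160, 144]))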

  13. Health and household air pollution from solid fuel use: the need for improved exposure assessment.

    PubMed

    Clark, Maggie L; Peel, Jennifer L; Balakrishnan, Kalpana; Breysse, Patrick N; Chillrud, Steven N; Naeher, Luke P; Rodes, Charles E; Vette, Alan F; Balbus, John M

    2013-10-01

    Nearly 3 billion people worldwide rely on solid fuel combustion to meet basic household energy needs. The resulting exposure to air pollution causes an estimated 4.5% of the global burden of disease. Large variability and a lack of resources for research and development have resulted in highly uncertain exposure estimates. We sought to identify research priorities for exposure assessment that will more accurately and precisely define exposure-response relationships of household air pollution necessary to inform future cleaner-burning cookstove dissemination programs. As part of an international workshop in May 2011, an expert group characterized the state of the science and developed recommendations for exposure assessment of household air pollution. The following priority research areas were identified to explain variability and reduce uncertainty of household air pollution exposure measurements: improved characterization of spatial and temporal variability for studies examining both short- and long-term health effects; development and validation of measurement technology and approaches to conduct complex exposure assessments in resource-limited settings with a large range of pollutant concentrations; and development and validation of biomarkers for estimating dose. Addressing these priority research areas, which will inherently require an increased allocation of resources for cookstove research, will lead to better characterization of exposure-response relationships. Although the type and extent of exposure assessment will necessarily depend on the goal and design of the cookstove study, without improved understanding of exposure-response relationships, the level of air pollution reduction necessary to meet the health targets of cookstove interventions will remain uncertain.

  14. Selection of specific protein binders for pre-defined targets from an optimized library of artificial helicoidal repeat proteins (alphaRep).

    PubMed

    Guellouz, Asma; Valerio-Lepiniec, Marie; Urvoas, Agathe; Chevrel, Anne; Graille, Marc; Fourati-Kammoun, Zaineb; Desmadril, Michel; van Tilbeurgh, Herman; Minard, Philippe

    2013-01-01

    We previously designed a new family of artificial proteins named αRep based on a subgroup of thermostable helicoidal HEAT-like repeats. We have now assembled a large optimized αRep library. In this library, the side chains at each variable position are not fully randomized but instead encoded by a distribution of codons based on the natural frequency of side chains of the natural repeats family. The library construction is based on a polymerization of micro-genes and therefore results in a distribution of proteins with a variable number of repeats. We improved the library construction process using a "filtration" procedure to retain only fully coding modules that were recombined to recreate sequence diversity. The final library, named Lib2.1, contains 1.7×10^9 independent clones. Here, we used phage display to select, from the previously described library or from the new library, new specific αRep proteins binding to four different non-related predefined protein targets. Specific binders were selected in each case. The results show that binders of various sizes are selected, including relatively long sequences with up to 7 repeats. ITC-measured affinities vary, with Kd values ranging from the micromolar to the nanomolar range. The formation of complexes is associated with a significant thermal stabilization of the bound target protein. The crystal structures of two complexes between αRep proteins and their cognate targets were solved and show that the new interfaces are established by the variable surfaces of the repeated modules, as well as by the variable N-cap residues. These results suggest that the αRep library is a new and versatile source of tight and specific binding proteins with favorable biophysical properties.

  15. Learning from adaptive neural dynamic surface control of strict-feedback systems.

    PubMed

    Wang, Min; Wang, Cong

    2015-06-01

    Learning plays an essential role in autonomous control systems. However, how to achieve learning in a nonstationary environment for nonlinear systems is a challenging problem. In this paper, we present a learning method for a class of nth-order strict-feedback systems based on adaptive dynamic surface control (DSC) technology, which achieves the human-like ability of learning by doing and then doing with learned knowledge. To achieve the learning, this paper first proposes a stable adaptive DSC with auxiliary first-order filters, which ensures the boundedness of all the signals in the closed-loop system and the convergence of tracking errors in a finite time. With the help of DSC, the derivative of the filter output variable is used as the neural network (NN) input instead of traditional intermediate variables. As a result, the proposed adaptive DSC method greatly reduces the dimension of NN inputs, especially for high-order systems. After the stable DSC design, we decompose the stable closed-loop system into a series of linear time-varying perturbed subsystems. Using a recursive design, the recurrent property of the NN input variables is easily verified since the complexity is overcome using DSC. Subsequently, the partial persistent excitation condition of the radial basis function NN is satisfied. By combining a state transformation, accurate approximations of the closed-loop system dynamics are recursively achieved in a local region along recurrent orbits. Then, a learning control method using the learned knowledge is proposed to achieve closed-loop stability and improved control performance. Simulation studies demonstrate that the proposed scheme can not only reuse the learned knowledge to achieve better control performance, with a faster tracking convergence rate and a smaller tracking error, but also greatly alleviate the computational burden by reducing the number and complexity of NN input variables.

  16. Selection of Specific Protein Binders for Pre-Defined Targets from an Optimized Library of Artificial Helicoidal Repeat Proteins (alphaRep)

    PubMed Central

    Chevrel, Anne; Graille, Marc; Fourati-Kammoun, Zaineb; Desmadril, Michel; van Tilbeurgh, Herman; Minard, Philippe

    2013-01-01

    We previously designed a new family of artificial proteins named αRep based on a subgroup of thermostable helicoidal HEAT-like repeats. We have now assembled a large optimized αRep library. In this library, the side chains at each variable position are not fully randomized but instead encoded by a distribution of codons based on the natural frequency of side chains of the natural repeats family. The library construction is based on a polymerization of micro-genes and therefore results in a distribution of proteins with a variable number of repeats. We improved the library construction process using a “filtration” procedure to retain only fully coding modules that were recombined to recreate sequence diversity. The final library, named Lib2.1, contains 1.7×10^9 independent clones. Here, we used phage display to select, from the previously described library or from the new library, new specific αRep proteins binding to four different non-related predefined protein targets. Specific binders were selected in each case. The results show that binders of various sizes are selected, including relatively long sequences with up to 7 repeats. ITC-measured affinities vary, with Kd values ranging from the micromolar to the nanomolar range. The formation of complexes is associated with a significant thermal stabilization of the bound target protein. The crystal structures of two complexes between αRep proteins and their cognate targets were solved and show that the new interfaces are established by the variable surfaces of the repeated modules, as well as by the variable N-cap residues. These results suggest that the αRep library is a new and versatile source of tight and specific binding proteins with favorable biophysical properties. PMID:24014183

  17. Self-consistent adjoint analysis for topology optimization of electromagnetic waves

    NASA Astrophysics Data System (ADS)

    Deng, Yongbo; Korvink, Jan G.

    2018-05-01

    In topology optimization of electromagnetic waves, the Gâteaux differentiability of the conjugate operator to the complex field variable results in complexity of the adjoint sensitivity, which evolves the originally real-valued design variable into a complex one during the iterative solution procedure. The adjoint sensitivity is therefore self-inconsistent. To enforce self-consistency, the real-part operator has been used to extract the real part of the sensitivity and keep the design variable real-valued. However, this enforced self-consistency can cause the derived structural topology to depend unreasonably on the phase of the incident wave. To solve this problem, this article focuses on the self-consistent adjoint analysis of topology optimization problems for electromagnetic waves. The analysis is implemented by splitting the complex variables of the wave equations into their real and imaginary parts and substituting them into the wave equations to derive coupled equations equivalent to the original wave equations, where the infinite free space is truncated by perfectly matched layers. The topology optimization problems of electromagnetic waves are thereby transformed into forms defined on real instead of complex functional spaces; the adjoint analysis is carried out on real functional spaces, removing the variation of the conjugate operator; the self-consistent adjoint sensitivity is derived; and the phase-dependence problem of the derived structural topology is avoided. Several numerical examples demonstrate the robustness of the derived self-consistent adjoint analysis.
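
    To make the splitting step concrete, consider a scalar Helmholtz model problem with a complex permittivity controlled by a real design variable ρ; writing the field and permittivity in real and imaginary parts yields a coupled real system (a simplified scalar illustration, not the paper's full vector formulation):

        % u = u_r + i u_i and eps(rho) = eps_r + i eps_i in the
        % scalar Helmholtz equation grad^2 u + k^2 eps(rho) u = 0.
        % Separating real and imaginary parts gives the coupled real system
        \begin{align}
          \nabla^2 u_r + k^2\left(\varepsilon_r u_r - \varepsilon_i u_i\right) &= 0, \\
          \nabla^2 u_i + k^2\left(\varepsilon_r u_i + \varepsilon_i u_r\right) &= 0.
        \end{align}

    The adjoint of this real coupled system involves no conjugate operator, so the sensitivity with respect to the real design variable is real by construction, which is the self-consistency the article enforces.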

  18. An adaptive moving finite volume scheme for modeling flood inundation over dry and complex topography

    NASA Astrophysics Data System (ADS)

    Zhou, Feng; Chen, Guoxian; Huang, Yuefei; Yang, Jerry Zhijian; Feng, Hui

    2013-04-01

    A new geometrical conservative interpolation on unstructured meshes is developed for preserving still-water equilibrium and positivity of water depth at each iteration of mesh movement, leading to an adaptive moving finite volume (AMFV) scheme for modeling flood inundation over dry and complex topography. Unlike traditional schemes involving position-fixed meshes, the iteration process of the AMFV scheme adaptively moves a smaller number of meshes in response to flow variables calculated in prior solutions and then simulates their posterior values on the new meshes. At each time step of the simulation, the AMFV scheme consists of three parts: an adaptive mesh movement to shift the vertex positions, a geometrical conservative interpolation that remaps the flow variables by summing the total mass over old meshes to avoid the generation of spurious waves, and a partial differential equation (PDE) discretization to update the flow variables for a new time step. Five different test cases are presented to verify the computational advantages of the proposed scheme over nonadaptive methods. The results reveal three attractive features: (i) the AMFV scheme preserves still-water equilibrium and positivity of water depth within both the mesh movement and PDE discretization steps; (ii) it improves the shock-capturing capability for handling topographic source terms and wet-dry interfaces by moving triangular meshes to approximate the spatial distribution of time-variant flood processes; (iii) it solves the shallow water equations with relatively higher accuracy and spatial resolution at a lower computational cost.

  19. The neurovascular complexity index as a potential indicator of traumatic brain injury severity: A case-series study.

    PubMed

    Howard, Jeffrey T; Janak, Jud C; Bukhman, Vladislav; Robertson, Claudia; Frolov, Iurii; Nawn, Corinne D; Schiller, Alicia M; Convertino, Victor A

    2017-07-01

    Multimodal monitoring of brain physiology following a traumatic brain injury (TBI) shows promise as a strategy to improve management and outcomes of TBI patients within civilian and military trauma. Valid and reliable measures of different aspects of brain physiology following a TBI could prove critical to accurately capturing these changes. Using a case-series design with a control-subject group comparison, we evaluated a new proprietary algorithm called the Neurovascular Complexity Index (NCI), using transcranial Doppler to noninvasively obtain measures of cerebral blood flow variability. Baseline NCI data from 169 control subjects were compared with 12 patients with moderate to severe TBI. Patients with TBI exhibited significantly greater mean and variability in NCI scores compared with control subjects (F = 195.48; p < 0.001). The mean absolute deviation (MAD) of NCI scores increased significantly and in a monotonic fashion with severity of injury: control subjects exhibited a small MAD of 0.44, patients with moderate TBI had a higher MAD of 4.20, and patients with severe TBI had a MAD of 6.51 (p < 0.001). Advancement in multimodal monitoring of TBI patients is important in reducing the potential risk of secondary injury. This study reports results indicating that a new quantifiable assessment of TBI, based on a noninvasive measure of cerebral blood flow variability, shows potential for continuous monitoring and early identification of brain-injured patients, deployable in far-forward military environments, to better inform individualized management. Case series, level IV.
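
    The MAD statistic reported here is straightforward to compute; a minimal sketch follows (centering about the mean is assumed, since the abstract does not state the centering used).

        import numpy as np

        def mean_absolute_deviation(scores):
            """Mean absolute deviation of NCI scores about their mean."""
            x = np.asarray(scores, dtype=float)
            return np.mean(np.abs(x - x.mean()))

        # Illustrative (toy) score vectors: wider spread -> larger MAD.
        print(mean_absolute_deviation([10.1, 10.4, 9.8, 10.2]))   # tight cluster
        print(mean_absolute_deviation([4.0, 12.0, 7.5, 16.0]))    # dispersed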

  20. State estimation and prediction using clustered particle filters.

    PubMed

    Lee, Yoonsang; Majda, Andrew J

    2016-12-20

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors.

  1. What model resolution is required in climatological downscaling over complex terrain?

    NASA Astrophysics Data System (ADS)

    El-Samra, Renalda; Bou-Zeid, Elie; El-Fadel, Mutasem

    2018-05-01

    This study presents results from the Weather Research and Forecasting (WRF) model applied to climatological downscaling simulations over highly complex terrain along the Eastern Mediterranean. We sequentially downscale general circulation model results, for a mild and wet year (2003) and a hot and dry year (2010), to three local horizontal resolutions of 9, 3 and 1 km. Simulated near-surface hydrometeorological variables are compared at different time scales against data from an observational network over the study area comprising rain gauges, anemometers, and thermometers. The overall performance of WRF at 1 and 3 km horizontal resolution was satisfactory, with significant improvement over the 9 km downscaling simulation. The total yearly precipitation from WRF's 1 km and 3 km domains exhibited < 10% bias with respect to observational data. The errors in minimum and maximum temperatures were reduced by the downscaling, along with a high-quality delineation of temperature variability and extremes for both the 1 and 3 km resolution runs. Wind speeds, on the other hand, were generally overestimated at all model resolutions, particularly at coastal stations (up to 50%) compared with inland stations (up to 40%). The findings therefore indicate that a 3 km resolution is sufficient for the downscaling, especially as it would allow more years and scenarios to be investigated at the same computational effort compared to the higher 1 km resolution. In addition, the results provide a quantitative measure of the potential errors for various hydrometeorological variables.

  2. State estimation and prediction using clustered particle filters

    PubMed Central

    Lee, Yoonsang; Majda, Andrew J.

    2016-01-01

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors. PMID:27930332

  3. Synthesising empirical results to improve predictions of post-wildfire runoff and erosion response

    USGS Publications Warehouse

    Shakesby, Richard A.; Moody, John A.; Martin, Deborah A.; Robichaud, Peter R.

    2016-01-01

    Advances in research into wildfire impacts on runoff and erosion have demonstrated increasing complexity of controlling factors and responses, which, combined with changing fire frequency, present challenges for modellers. We convened a conference attended by experts and practitioners in post-wildfire impacts, meteorology and related research, including modelling, to focus on priority research issues. The aim was to improve our understanding of controls and responses and the predictive capabilities of models. This conference led to the eight selected papers in this special issue. They address aspects of the distinctiveness in the controls and responses among wildfire regions, spatiotemporal rainfall variability, infiltration, runoff connectivity, debris flow formation and modelling applications. Here we summarise key findings from these papers and evaluate their contribution to improving understanding and prediction of post-wildfire runoff and erosion under changes in climate, human intervention and population pressure on wildfire-prone areas.

  4. Bundled Payments in Total Joint Replacement: Keeping Our Care Affordable and High in Quality.

    PubMed

    McLawhorn, Alexander S; Buller, Leonard T

    2017-09-01

    The purpose of this review was to evaluate the literature regarding bundled payment reimbursement models for total joint arthroplasty (TJA). From an economic standpoint, TJA is cost-effective, but it represents a substantial expense to the Centers for Medicare & Medicaid Services (CMS). Historically, fee-for-service payment models resulted in highly variable cost and quality. CMS introduced Bundled Payments for Care Improvement (BPCI) in 2012 and subsequently the Comprehensive Care for Joint Replacement (CJR) reimbursement model in 2016 to improve the value of TJA from the perspectives of both CMS and patients, by improving quality via cost control. Early results of bundled payments are promising, but preserving access to care for patients with high comorbidity burdens and those requiring more complex care is a lingering concern. Hospitals, regardless of current participation in bundled payments, should develop care pathways for TJA to maximize efficiency and patient safety.

  5. Landscape analysis of methane flux across complex terrain

    NASA Astrophysics Data System (ADS)

    Kaiser, K. E.; McGlynn, B. L.; Dore, J. E.

    2014-12-01

    Greenhouse gas (GHG) fluxes into and out of the soil are influenced by environmental conditions, resulting in landscape-mediated patterns of spatial heterogeneity. The temporal variability of inputs (e.g. precipitation), internal redistribution (e.g. groundwater flow), and dynamics (e.g. microbial communities) makes predicting these fluxes challenging. Complex terrain can provide a laboratory for improving understanding of the spatial patterns, temporal dynamics, and drivers of trace gas flux rates, requisite to constraining current GHG budgets and future scenarios. Our research builds on previous carbon cycle research at the USFS Tenderfoot Creek Experimental Forest, Little Belt Mountains, Montana, that highlighted the relationships between landscape position and seasonal CO2 efflux, induced by the topographic redistribution of water. Spatial patterns and landscape-scale mediation of CH4 fluxes in seasonally aerobic soils have not yet been elucidated. We measured soil methane concentrations and fluxes across a full range of landscape positions, leveraging topographic and seasonal gradients, to examine the relationships between environmental variables, hydrologic dynamics, and CH4 production and consumption. We determined that a threshold of ~30% volumetric water content (VWC) distinguished the direction of flux at individual time points, with the riparian area and uplands having distinct source and sink characteristics, respectively. Riparian locations were either strong sources or fluctuated between sink and source behavior, resulting in near-neutral seasonal flux. Upland sites, however, exhibited significant relationships between sink strength and topographic/energy balance indices. Our results highlight spatial and temporal coherence in landscape-scale heterogeneity of CH4 dynamics that can improve estimates of landscape-scale CH4 balances and sensitivity to change.

  6. Reversal of dabigatran anticoagulation ex vivo: Porcine study comparing prothrombin complex concentrates and idarucizumab.

    PubMed

    Honickel, Markus; Treutler, Stefanie; van Ryn, Joanne; Tillmann, Sabine; Rossaint, Rolf; Grottke, Oliver

    2015-04-01

    Urgent surgery or life-threatening bleeding requires prompt reversal of the anticoagulant effects of dabigatran. This study assessed the ability of three- and four-factor prothrombin complex concentrate (PCC) and idarucizumab (specific antidote for dabigatran) to reverse the anticoagulant effects of dabigatran in a porcine model of trauma. Twelve animals were given dabigatran etexilate (DE) orally and dabigatran intravenously, before infliction of trauma. Six animals received tranexamic acid plus fibrinogen concentrate 12 minutes post-injury. Six PCCs (each 30 and 60 U/kg) and idarucizumab (30 and 60 mg/kg) were added to blood samples ex vivo. Coagulation was assessed by several coagulation assays. All coagulation parameters were altered after dabigatran infusion (plasma level: 442 ± 138 ng/ml). Both three- and four-factor PCCs mostly or completely reversed the effects of dabigatran on thromboelastometry variables and PT but not on aPTT. Idarucizumab neutralised plasma concentrations of dabigatran, and reversed the effects of the drug on coagulation variables. Thrombin generation showed dose-dependent over-correction following the addition of PCC, implying that elevated levels of thrombin are required to overcome dabigatran-induced coagulopathy. In contrast, treatment with idarucizumab returned thrombin generation to baseline levels. Following trauma, therapy with tranexamic acid plus fibrinogen improved correction of coagulation parameters by PCC, and thromboelastometry parameters by idarucizumab. All investigated PCCs improved dabigatran- and trauma-induced coagulopathy to a similar degree. In conclusion, this study shows that three- and four-factor PCCs are similarly effective for dabigatran reversal. Idarucizumab also reversed the effects of dabigatran and, unlike PCCs, was not associated with over-correction of thrombin generation.

  7. Training to walk amid uncertainty with Re-Step: measurements and changes with perturbation training for hemiparesis and cerebral palsy.

    PubMed

    Bar-Haim, Simona; Harries, Netta; Hutzler, Yeshayahu; Belokopytov, Mark; Dobrov, Igor

    2013-09-01

    To describe Re-Step™, a novel mechatronic shoe system that measures center of pressure (COP) gait parameters and the complexity of COP dispersion while walking, and to demonstrate these measurements in healthy controls and individuals with hemiparesis and cerebral palsy (CP) before and after perturbation training. The Re-Step™ was used to induce programmed chaotic perturbations to the feet while walking for 30 min per session, for 36 sessions over 12 weeks of training, in two subjects with hemiparesis and two with CP. Baseline measurements of complexity indices (fractal dimension and approximate entropy) tended to be higher in controls than in those with disabilities, while COP variability, mean and variability of step time, and COP dispersion were lower. After training, these measurement values in the disabled subjects tended toward those of the controls, along with a decrease in step time, 10 m walk time, average step time, and percentage of double support, and an increased Berg balance score. This pilot trial reveals the feasibility and applicability of this unique measurement and perturbation system for evaluating functional disabilities and changes with interventions to improve walking. Implications for Rehabilitation: Walking of individuals with cerebral palsy and hemiparesis following stroke can be viewed in terms of a rigid motor behavior that prevents adaptation to changing environmental conditions. The Re-Step system (a) measures and records linear and non-linear gait parameters during free walking to provide a detailed evaluation of walking disabilities, and (b) is an intervention training modality that applies unexpected perturbations during walking. This perturbation intervention may improve gait and motor functions of individuals with hemiparesis and cerebral palsy.

  8. Wind Power Curve Modeling in Simple and Complex Terrain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bulaevskaya, V.; Wharton, S.; Irons, Z.

    2015-02-09

    Our previous work on wind power curve modeling using statistical models focused on a location with a moderately complex terrain in the Altamont Pass region in northern California (CA). The work described here is the follow-up to that work, but at a location with a simple terrain in northern Oklahoma (OK). The goal of the present analysis was to determine the gain in predictive ability afforded by adding information beyond the hub-height wind speed, such as wind speeds at other heights, as well as other atmospheric variables, to the power prediction model at this new location and compare the results to those obtained at the CA site in the previous study. While we reach some of the same conclusions at both sites, many results reported for the CA site do not hold at the OK site. In particular, using the entire vertical profile of wind speeds improves the accuracy of wind power prediction relative to using the hub-height wind speed alone at both sites. However, in contrast to the CA site, the rotor equivalent wind speed (REWS) performs almost as well as the entire profile at the OK site. Another difference is that at the CA site, adding wind veer as a predictor significantly improved the power prediction accuracy. The same was true for that site when air density was added to the model separately instead of using the standard air density adjustment. At the OK site, these additional variables result in no significant benefit for the prediction accuracy.
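
    For context, the rotor equivalent wind speed is conventionally defined as the cube-weighted average of wind speeds measured at several heights across the rotor disk, REWS = (Σ_i w_i u_i³)^(1/3), where w_i is the fraction of rotor area in the horizontal slice around measurement height i. A minimal sketch with illustrative area weights:

        import numpy as np

        def rotor_equivalent_wind_speed(speeds, area_fractions):
            """REWS = (sum_i w_i * u_i^3)^(1/3): cube-weighted average of
            wind speeds over horizontal slices of the rotor disk."""
            u = np.asarray(speeds, dtype=float)
            w = np.asarray(area_fractions, dtype=float)
            w = w / w.sum()                      # defensive normalization
            return float(np.sum(w * u ** 3) ** (1.0 / 3.0))

        # Three measurement heights; the hub-height slice carries most rotor area.
        print(rotor_equivalent_wind_speed([6.8, 7.5, 8.1], [0.25, 0.5, 0.25]))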

  9. Geometrical accuracy improvement in flexible roll forming lines

    NASA Astrophysics Data System (ADS)

    Larrañaga, J.; Berner, S.; Galdos, L.; Groche, P.

    2011-01-01

    General interest in producing profiles with variable cross sections in a cost-effective way has increased in the last few years. The flexible roll forming process allows producing profiles with a lengthwise variable cross section in a continuous way. Until now, only a few flexible roll forming lines have been developed and built. Apart from flange wrinkling along the transition zone of u-profiles with variable cross section, the process limits have not been investigated, and solutions for shape deviations are unknown. During the PROFOM project, a flexible roll forming machine has been developed with the objective of producing high-technology components for automotive body structures. In order to investigate the limits of the process, different profile geometries and steel grades, including high strength steels, have been applied. During the first experimental tests, several errors were identified, resulting from the complex stress states generated during the forming process. In order to improve the accuracy of the target profiles and to meet the tolerance demands of the automotive industry, a thermo-mechanical solution has been proposed. Additional mechanical devices, flexibly supporting the roll forming process, have been implemented in the roll forming line together with local heating techniques. The combination of both methods shows a significant increase in accuracy. In the present investigation, the experimental results of the validation process are presented.

  10. An index of floodplain surface complexity

    USGS Publications Warehouse

    Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.

    2016-01-01

    Floodplain surface topography is an important component of floodplain ecosystems. It is the primary physical template upon which ecosystem processes are acted out, and complexity in this template can contribute to the high biodiversity and productivity of floodplain ecosystems. There has been a limited appreciation of floodplain surface complexity because of the traditional focus on temporal variability in floodplains as well as limitations to quantifying spatial complexity. An index of floodplain surface complexity (FSC) is developed in this paper and applied to eight floodplains from different geographic settings. The index is based on two key indicators of complexity, variability in surface geometry (VSG) and the spatial organisation of surface conditions (SPO), and was determined at three sampling scales. FSC, VSG, and SPO varied between the eight floodplains and these differences depended upon sampling scale. Relationships between these measures of spatial complexity and seven geomorphological and hydrological drivers were investigated. There was a significant decline in all complexity measures with increasing floodplain width, which was explained by either a power, logarithmic, or exponential function. There was an initial rapid decline in surface complexity as floodplain width increased from 1.5 to 5 km, followed by little change in floodplains wider than 10 km. VSG also increased significantly with increasing sediment yield. No significant relationships were determined between any of the four hydrological variables and floodplain surface complexity.

  11. Climate and dengue transmission: evidence and implications.

    PubMed

    Morin, Cory W; Comrie, Andrew C; Ernst, Kacey

    2013-01-01

    Climate influences dengue ecology by affecting vector dynamics, agent development, and mosquito/human interactions. Although these relationships are known, the impact climate change will have on transmission is unclear. Climate-driven statistical and process-based models are being used to refine our knowledge of these relationships and predict the effects of projected climate change on dengue fever occurrence, but results have been inconsistent. We sought to identify major climatic influences on dengue virus ecology and to evaluate the ability of climate-based dengue models to describe associations between climate and dengue, simulate outbreaks, and project the impacts of climate change. We reviewed the evidence for direct and indirect relationships between climate and dengue generated from laboratory studies, field studies, and statistical analyses of associations between vectors, dengue fever incidence, and climate conditions. We assessed the potential contribution of climate-driven, process-based dengue models and provide suggestions to improve their performance. Relationships between climate variables and factors that influence dengue transmission are complex. A climate variable may increase dengue transmission potential through one aspect of the system while simultaneously decreasing transmission potential through another. This complexity may at least partly explain inconsistencies in statistical associations between dengue and climate. Process-based models can account for the complex dynamics but often omit important aspects of dengue ecology, notably virus development and host-species interactions. Synthesizing and applying current knowledge of climatic effects on all aspects of dengue virus ecology will help direct future research and enable better projections of climate change effects on dengue incidence.

  12. Real-world hydrologic assessment of a fully-distributed hydrological model in a parallel computing environment

    NASA Astrophysics Data System (ADS)

    Vivoni, Enrique R.; Mascaro, Giuseppe; Mniszewski, Susan; Fasel, Patricia; Springer, Everett P.; Ivanov, Valeriy Y.; Bras, Rafael L.

    2011-10-01

    A major challenge in the use of fully-distributed hydrologic models has been the lack of computational capabilities for high-resolution, long-term simulations in large river basins. In this study, we present the parallel model implementation and real-world hydrologic assessment of the Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator (tRIBS). Our parallelization approach is based on the decomposition of a complex watershed using the channel network as a directed graph. The resulting sub-basin partitioning divides effort among processors and handles hydrologic exchanges across boundaries. Through numerical experiments in a set of nested basins, we quantify parallel performance relative to serial runs for a range of processors, simulation complexities and lengths, and sub-basin partitioning methods, while accounting for inter-run variability on a parallel computing system. In contrast to serial simulations, the parallel model speed-up depends on the variability of hydrologic processes. Load balancing significantly improves parallel speed-up, with proportionally faster runs as simulation complexity (domain resolution and channel network extent) increases. The best strategy for large river basins is to combine a balanced partitioning with an extended channel network, with potential savings through a lower TIN resolution. Based on these advances, a wider range of applications for fully-distributed hydrologic models is now possible. This is illustrated through a set of ensemble forecasts that account for precipitation uncertainty derived from a statistical downscaling model.
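
    The graph-based decomposition can be illustrated with a toy computation of subtree sizes on a channel network, where each node drains to a parent and subtree size stands in for sub-basin workload; the names and the workload proxy are assumptions for illustration, not the tRIBS implementation.

        from collections import defaultdict

        def subtree_sizes(parent):
            """Sizes of drainage subtrees in a channel network given as a
            directed graph: parent[node] is the node it drains to (None at
            the outlet). Balanced partitions can then be cut from subtrees
            of comparable size."""
            children = defaultdict(list)
            for node, p in parent.items():
                if p is not None:
                    children[p].append(node)
            sizes = {}
            def size(n):
                if n not in sizes:
                    sizes[n] = 1 + sum(size(c) for c in children[n])
                return sizes[n]
            for n, p in parent.items():
                if p is None:        # start from each outlet
                    size(n)
            return sizes

        # Toy network: outlet O drains tributaries A (two headwaters) and B (one).
        net = {"O": None, "A": "O", "B": "O", "A1": "A", "A2": "A", "B1": "B"}
        print(subtree_sizes(net))   # A has subtree size 3, B has 2, O has 6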

  13. Outline of a new approach to the analysis of complex systems and decision processes.

    NASA Technical Reports Server (NTRS)

    Zadeh, L. A.

    1973-01-01

    Development of a conceptual framework for dealing with systems which are too complex or too ill-defined to admit of precise quantitative analysis. The approach outlined is based on the premise that the key elements in human thinking are not numbers, but labels of fuzzy sets - i.e., classes of objects in which the transition from membership to nonmembership is gradual rather than abrupt. The approach in question has three main distinguishing features - namely, the use of so-called 'linguistic' variables in place of or in addition to numerical variables, the characterization of simple relations between variables by conditional fuzzy statements, and the characterization of complex relations by fuzzy algorithms.
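
    As a concrete illustration of these ideas, a short sketch follows: a 'linguistic' variable whose terms are fuzzy sets with gradual membership, and a conditional fuzzy statement combined by the min operator. The variable, term parameters, and rule are hypothetical, not drawn from Zadeh's paper.

        import numpy as np

        def trapezoid(x, a, b, c, d):
            """Trapezoidal membership: 0 below a, rising to 1 on [b, c], 0 above d."""
            return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

        # Linguistic variable "height" with fuzzy terms instead of crisp classes.
        heights = np.array([150.0, 170.0, 185.0, 200.0])  # cm
        short = trapezoid(heights, 140, 150, 160, 170)
        tall = trapezoid(heights, 170, 185, 210, 220)

        # Conditional fuzzy statement "IF height is tall THEN stride is long":
        # min-implication caps the consequent (assumed membership 0.9) at the
        # degree to which the antecedent holds.
        long_stride = np.minimum(tall, 0.9)
        print(short, tall, long_stride)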

  14. Complex and Simple Clinical Reaction Times Are Associated with Gait, Balance, and Major Fall Injury in Older Subjects with Diabetic Peripheral Neuropathy

    PubMed Central

    Richardson, James K.; Eckner, James T.; Allet, Lara; Kim, Hogene; Ashton-Miller, James

    2016-01-01

    Objective: To identify relationships between complex and simple clinical measures of reaction time (RTclin) and indicators of balance in older subjects with and without diabetic peripheral neuropathy (DPN). Design: Prospective cohort design. Complex RTclin Accuracy, Simple RTclin Latency, and their ratio were determined using a novel device in 42 subjects (age = 69.1 ± 8.3 yrs), 26 with DPN and 16 without. Dependent variables included unipedal stance time (UST), step width variability and range on an uneven surface, and major fall-related injury over 12 months. Results: In the DPN subjects, the ratio of Complex RTclin Accuracy to Simple RTclin Latency was strongly associated with longer UST (r/p = .653/.004) and with decreased step width variability and range (r/p = −.696/.001 and −.782/<.001, respectively) on an uneven surface. Additionally, the two DPN subjects sustaining major injuries had a lower ratio of Complex RTclin Accuracy to Simple RTclin Latency than those without. Conclusions: The ratio of Complex RTclin Accuracy to Simple RTclin Latency is a potent predictor of UST and frontal plane gait variability in response to perturbations, and may predict major fall injury in older subjects with DPN. These short-latency neurocognitive measures may compensate for lower limb neuromuscular impairments and provide a more comprehensive understanding of balance and fall risk. PMID:27552354

  15. Complexity in relational processing predicts changes in functional brain network dynamics.

    PubMed

    Cocchi, Luca; Halford, Graeme S; Zalesky, Andrew; Harding, Ian H; Ramm, Brentyn J; Cutmore, Tim; Shum, David H K; Mattingley, Jason B

    2014-09-01

    The ability to link variables is critical to many high-order cognitive functions, including reasoning. It has been proposed that limits in relating variables depend critically on relational complexity, defined formally as the number of variables to be related in solving a problem. In humans, the prefrontal cortex is known to be important for reasoning, but recent studies have suggested that such processes are likely to involve widespread functional brain networks. To test this hypothesis, we used functional magnetic resonance imaging and a classic measure of deductive reasoning to examine changes in brain networks as a function of relational complexity. As expected, behavioral performance declined as the number of variables to be related increased. Likewise, increments in relational complexity were associated with proportional enhancements in brain activity and task-based connectivity within and between two cognitive control networks: a cingulo-opercular network for maintaining task set, and a fronto-parietal network for implementing trial-by-trial control. Changes in effective connectivity as a function of increased relational complexity suggested a key role for the left dorsolateral prefrontal cortex in integrating and implementing task set in a trial-by-trial manner. Our findings show that limits in relational processing are manifested in the brain as complexity-dependent modulations of large-scale networks.

  16. First steps of ecological restoration in Mediterranean lagoons: Shifts in phytoplankton communities

    NASA Astrophysics Data System (ADS)

    Leruste, A.; Malet, N.; Munaron, D.; Derolez, V.; Hatey, E.; Collos, Y.; De Wit, R.; Bec, B.

    2016-10-01

    Along the French Mediterranean coast, a complex of eight lagoons underwent intensive eutrophication over four decades, mainly related to nutrient over-enrichment from continuous sewage discharges. The lagoon complex displayed a wide trophic gradient from mesotrophy to hypertrophy, and primary production was dominated by phytoplankton communities. In 2005, the implementation of an 11 km offshore outfall system diverted the treated sewage effluents, leading to a drastic reduction of anthropogenic inputs of nitrogen and phosphorus into the lagoons. Time series data have been examined from 2000 to 2013 for physical, chemical, and biological (phytoplankton) variables of the water column during the summer period. Since 2006, total nitrogen and phosphorus concentrations as well as chlorophyll biomass have strongly decreased, revealing an improvement in lagoon water quality. In summertime, the decline in phytoplankton biomass was accompanied by shifts in community structure and composition that could be explained by a functional approach based on the common functional traits of the main algal groups. These phytoplankton communities were dominated by functional groups of small-sized and fast-growing algae (diatoms, cryptophytes, and green algae). The trajectories of summer phytoplankton communities displayed a complex response to changing nutrient loads over time. While diatoms were the major group in 2006 in all the lagoons, the summer phytoplankton composition in hypertrophic lagoons has shifted towards green algae, which are particularly well adapted to summertime conditions. All lagoons showed increasing proportion and occurrence of peridinin-rich dinophytes over time, probably related to their capacity for mixotrophy. The diversity patterns were marked by strong variability in eutrophic and hypertrophic lagoons, whereas phytoplankton community structure reached the highest diversity and stability in mesotrophic lagoons. We observe that during the re-oligotrophication process in coastal lagoons, phytoplankton shows complex trajectories with similarities to those observed in freshwater lake systems.

  17. Enhanced conformational sampling using replica exchange with concurrent solute scaling and Hamiltonian biasing realized in one dimension.

    PubMed

    Yang, Mingjun; Huang, Jing; MacKerell, Alexander D

    2015-06-09

    Replica exchange (REX) is a powerful computational tool for overcoming the quasi-ergodic sampling problem of complex molecular systems. Recently, several multidimensional extensions of this method have been developed to realize exchanges in both temperature and biasing potential space, or to use multiple biasing potentials to improve sampling efficiency. However, the increased computational cost due to the multidimensionality of exchanges becomes challenging for complex systems under explicit solvent conditions. In this study, we develop a one-dimensional (1D) REX algorithm to concurrently combine the advantages of overall enhanced sampling from Hamiltonian solute scaling with the specific enhancement of collective variables using Hamiltonian biasing potentials. In the present Hamiltonian replica exchange method, termed HREST-BP, Hamiltonian solute scaling is applied to the solute subsystem and its interactions with the environment to enhance overall conformational transitions, and biasing potentials are added along selected collective variables associated with specific conformational transitions, thereby balancing the sampling of different hierarchical degrees of freedom. The two enhanced sampling approaches are implemented concurrently, allowing for the use of a small number of replicas (e.g., 6 to 8) in 1D, thus greatly reducing the computational cost in complex system simulations. The present method is applied to conformational sampling of two nitrogen-linked glycans (N-glycans) found on the HIV gp120 envelope protein. Considering the general importance of the conformational sampling problem, HREST-BP represents an efficient procedure for the study of complex saccharides, and, more generally, the method is anticipated to be of general utility for conformational sampling in a wide range of macromolecular systems.
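
    The exchange step at the heart of any Hamiltonian replica exchange scheme is a Metropolis test on the cross-evaluated potentials. The sketch below shows that generic criterion with toy one-dimensional potentials standing in for the scaled and biased Hamiltonians; it is not the HREST-BP implementation, and the potentials and constants are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        kT = 0.596  # kcal/mol at roughly 300 K

        def exchange_accepted(U_i, U_j, x_i, x_j):
            """Metropolis criterion for swapping configurations between two
            replicas that share a temperature but differ in Hamiltonian."""
            delta = (U_i(x_j) + U_j(x_i) - U_i(x_i) - U_j(x_j)) / kT
            return rng.random() < min(1.0, np.exp(-delta))

        # Toy replica Hamiltonians: an unbiased double well and a solute-scaled,
        # collective-variable-biased variant (both hypothetical).
        U0 = lambda x: (x**2 - 1.0) ** 2
        U1 = lambda x: 0.5 * (x**2 - 1.0) ** 2 + 0.3 * x
        print(exchange_accepted(U0, U1, -1.0, 1.0))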

  18. Examining neural correlates of skill acquisition in a complex videogame training program.

    PubMed

    Prakash, Ruchika S; De Leon, Angeline A; Mourany, Lyla; Lee, Hyunkyu; Voss, Michelle W; Boot, Walter R; Basak, Chandramallika; Fabiani, Monica; Gratton, Gabriele; Kramer, Arthur F

    2012-01-01

    Acquisition of complex skills is a universal feature of human behavior that has been conceptualized as a process that starts with intense resource dependency, requires effortful cognitive control, and ends in relative automaticity on the multi-faceted task. The present study examined the effects of different theoretically based training strategies on cortical recruitment during acquisition of complex video game skills. Seventy-five participants were recruited and assigned to one of three training groups: (1) Fixed Emphasis Training (FET), in which participants practiced the game; (2) Hybrid Variable-Priority Training (HVT), in which participants practiced using a combination of part-task training and variable priority training; or (3) a Control group that received limited game play. After 30 h of training, game data indicated a significant advantage for the two training groups relative to the control group. The HVT group demonstrated enhanced benefits of training, as indexed by an improvement in overall game score and a reduction in cortical recruitment post-training. Specifically, while both groups demonstrated a significant reduction of activation in attentional control areas, namely the right middle frontal gyrus, right superior frontal gyrus, and the ventral medial prefrontal cortex, participants in the control group continued to engage these areas post-training, suggesting a sustained reliance on attentional regions during challenging task demands. The HVT group showed a further reduction in neural resources post-training compared to the FET group in these cognitive control regions, along with reduced activation in the motor and sensory cortices and the posteromedial cortex. Findings suggest that training, specifically training that emphasizes cognitive flexibility, can reduce the attentional demands of a complex cognitive task and the reliance on the motor network.

  20. HDMR methods to assess reliability in slope stability analyses

    NASA Astrophysics Data System (ADS)

    Kozubal, Janusz; Pula, Wojciech; Vessia, Giovanna

    2014-05-01

    Stability analyses of complex rock-soil deposits must take into account the complex structure of discontinuities within the rock mass and the embedded soil layers. These materials are characterized by high variability in physical and mechanical properties. Thus, to calculate the slope safety factor in stability analyses, two issues must be taken into account: (1) the uncertainties related to the structural setting of the rock-slope mass and (2) the variability in the mechanical properties of soils and rocks. High Dimensional Model Representation (HDMR) (Chowdhury et al. 2009; Chowdhury and Rao 2010) can be used to compute the reliability index of complex rock-soil slopes when numerous random variables with high coefficients of variation are considered. HDMR implements inverse reliability analysis, meaning that the unknown design parameters are sought such that prescribed reliability index values are attained. This approach uses implicit response functions, as in the Response Surface Method (RSM). Simple RSM can be applied efficiently when fewer than four random variables are considered; as the number of variables increases, the efficiency of reliability index estimation decreases because of the large amount of computation required. The HDMR method is therefore used to improve computational accuracy. In this study, sliding mechanisms in the Polish Flysch Carpathians have been studied by means of HDMR. The southern part of Poland, where the Carpathian Mountains are located, is characterized by a rather complicated sedimentary pattern of flysch rock-soil deposits that can be simplified into three main categories: (1) normal flysch, consisting of adjacent sandstone and shale beds of approximately equal thickness; (2) shale flysch, where shale beds are thicker than adjacent sandstone beds; and (3) sandstone flysch, where the opposite holds. Landslides occur in all flysch deposit types, thus several configurations of possible unstable settings (within fractured rock-soil masses) resulting in sliding mechanisms have been investigated in this study. The reliability index values obtained from the HDMR method have been compared with conventional approaches such as neural networks; the efficiency of HDMR is shown in the case studied. References: Chowdhury R., Rao B.N., Prasad A.M., 2009. High-dimensional model representation for structural reliability analysis. Commun. Numer. Meth. Engng, 25: 301-337. Chowdhury R., Rao B., 2010. Probabilistic stability assessment of slopes using high dimensional model representation. Computers and Geotechnics, 37: 876-884.
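
    The response-surface idea that HDMR generalizes can be sketched in a few lines: fit an inexpensive surrogate to samples of a limit-state function, then estimate the failure probability and reliability index by Monte Carlo on the surrogate. The limit-state function, variable distributions, and quadratic basis below are illustrative assumptions, not the paper's slope model.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(1)

        def g(c, phi):
            """Hypothetical limit-state function: g < 0 denotes sliding failure."""
            return c / 20.0 + np.tan(phi) - 0.9

        # Random variables: cohesion (kPa) and friction angle (rad).
        c = rng.normal(25.0, 5.0, 200)
        phi = rng.normal(0.5, 0.05, 200)

        # Fit a quadratic response surface g ~ Z @ beta from a small design set.
        Z = np.column_stack([np.ones_like(c), c, phi, c**2, phi**2, c * phi])
        beta, *_ = np.linalg.lstsq(Z, g(c, phi), rcond=None)

        # Monte Carlo on the cheap surrogate instead of the full slope model.
        cs, ps = rng.normal(25.0, 5.0, 10**6), rng.normal(0.5, 0.05, 10**6)
        Zs = np.column_stack([np.ones_like(cs), cs, ps, cs**2, ps**2, cs * ps])
        pf = np.mean(Zs @ beta < 0.0)
        print("Pf =", pf, " reliability index =", -norm.ppf(pf))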

  1. ESCAPE: Eco-Behavioral System for Complex Assessments of Preschool Environments. Research Draft.

    ERIC Educational Resources Information Center

    Carta, Judith J.; And Others

    The manual details an observational code designed to track a child during an entire day in a preschool setting. The Eco-Behavioral System for Complex Assessments of Preschool Environments (ESCAPE) encompasses assessment of the following three major categories of variables with their respective subcategories: (1) ecological variables (designated…

  2. Evaluation of a laser scanning sensor on detection of complex shaped targets for variable-rate sprayer development

    USDA-ARS?s Scientific Manuscript database

    Sensors that can accurately measure canopy structures are prerequisites for development of advanced variable-rate sprayers. A 270° radial range laser sensor was evaluated for its accuracy to measure dimensions of target surfaces with complex shapes and sizes. An algorithm for data acquisition and 3-...

  3. How does complex terrain influence responses of carbon and water cycle processes to climate variability and climate change?

    EPA Science Inventory

    We are pursuing the ambitious goal of understanding how complex terrain influences the responses of carbon and water cycle processes to climate variability and climate change. Our studies take place in H.J. Andrews Experimental Forest, an LTER (Long Term Ecological Research) site...

  4. Subgrid-scale effects in compressible variable-density decaying turbulence

    DOE PAGES

    GS, Sidharth; Candler, Graham V.

    2018-05-08

    Many turbulent flows are characterized by complex scale interactions and vorticity generation caused by compressibility and variable-density effects. In the large-eddy simulation of variable-density flows, these processes manifest themselves as subgrid-scale (SGS) terms that interact with the resolved-scale flow. This paper studies the effect of the variable-density SGS terms and quantifies their relative importance. We consider the SGS terms appearing in the density-weighted Favre-filtered equations and in the unweighted Reynolds-filtered equations. The conventional form of the Reynolds-filtered momentum equation is complicated by a temporal SGS term; therefore, we derive a new form of the Reynolds-filtered governing equations that does not contain this term and has only double-correlation SGS terms. The new form of the filtered equations has terms that represent the SGS mass flux, pressure-gradient acceleration, and velocity-dilatation correlation. To evaluate the dynamical significance of the variable-density SGS effects, we carry out direct numerical simulations of compressible decaying turbulence at a turbulent Mach number of 0.3. Two different initial thermodynamic conditions are investigated: homentropic, and a thermally inhomogeneous gas with regions of differing densities. The simulated flow fields are explicitly filtered to evaluate the SGS terms. The importance of the variable-density SGS terms is quantified relative to the SGS specific stress, which is the only SGS term active in incompressible constant-density turbulence. It is found that while the variable-density SGS terms in the homentropic case are negligible, they are dynamically significant in the thermally inhomogeneous flows. Investigation of the variable-density SGS terms is therefore important, not only to develop variable-density closures but also to improve the understanding of scale interactions in variable-density flows.

  6. Increasing algal photosynthetic productivity by integrating ecophysiology with systems biology.

    PubMed

    Peers, Graham

    2014-11-01

    Oxygenic photosynthesis is the process by which plants, algae, and cyanobacteria convert sunlight and CO2 into chemical energy and biomass. Previously published estimates suggest that algal photosynthesis is, at best, able to convert approximately 5-7% of incident light energy to biomass and there is opportunity for improvement. Recent analyses of in situ photophysiology in mass cultures of algae and cyanobacteria show that cultivation methods can have detrimental effects on a cell's photophysiology - reinforcing the need to understand the complex responses of cell biology to a highly variable environment. A systems-based approach to understanding the stresses and efficiencies associated with light-energy harvesting, CO2 fixation, and carbon partitioning will be necessary to make major headway toward improving photosynthetic yields.

  7. Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations

    NASA Technical Reports Server (NTRS)

    Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.

    2017-01-01

    A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted by integrating the sensitivity components from each discipline of the coupled system. Numerical results verify the accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. The complex-variable implementation of the sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of the adjoint formulations for the FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and the FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
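
    The complex-variable approach referred to here is commonly realized as the complex-step derivative: perturbing an analytic function along the imaginary axis yields a first derivative with no subtractive cancellation. A minimal sketch follows; the test function is a standard illustration and an assumption, not a DYMORE quantity.

        import numpy as np

        def complex_step_derivative(f, x, h=1e-30):
            """df/dx via the complex step: f must be analytic (complex-safe)."""
            return np.imag(f(x + 1j * h)) / h

        # A smooth test function standing in for a structural response.
        f = lambda x: np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)

        # Compare with a forward finite difference, which loses digits to cancellation.
        fd = (f(1.5 + 1e-8) - f(1.5)) / 1e-8
        print(complex_step_derivative(f, 1.5), fd)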

  8. Intelligent mobility research for robotic locomotion in complex terrain

    NASA Astrophysics Data System (ADS)

    Trentini, Michael; Beckman, Blake; Digney, Bruce; Vincent, Isabelle; Ricard, Benoit

    2006-05-01

    The objective of the Autonomous Intelligent Systems Section of Defence R&D Canada - Suffield is best described by its mission statement, which is "to augment soldiers and combat systems by developing and demonstrating practical, cost effective, autonomous intelligent systems capable of completing military missions in complex operating environments." The mobility of ground-based mobile systems operating in urban settings must increase significantly if robotic technology is to augment human efforts in these roles and environments. The intelligence required for autonomous systems to operate in complex environments demands advances in many fields of robotics. This has resulted in large bodies of research in areas of perception, world representation, and navigation, but the problem of locomotion in complex terrain has largely been ignored. In order to achieve its objective, the Autonomous Intelligent Systems Section is pursuing research that explores the use of intelligent mobility algorithms designed to improve robot mobility. Intelligent mobility uses sensing, control, and learning algorithms to extract measured variables from the world, control vehicle dynamics, and learn by experience. These algorithms seek to exploit available world representations of the environment and the inherent dexterity of the robot to allow the vehicle to interact with its surroundings and produce locomotion in complex terrain. The primary focus of the paper is to present the intelligent mobility research within the framework of the research methodology, plan, and direction defined at Defence R&D Canada - Suffield. It discusses the progress and future direction of intelligent mobility research and presents the research tools, topics, and plans to address this critical research gap. This research will create effective intelligence to improve the mobility of ground-based mobile systems operating in urban settings to assist the Canadian Forces in their future urban operations.

  9. Initial fractal exponent of heart-rate variability is associated with success of early resuscitation in patients with severe sepsis or septic shock: a prospective cohort study

    PubMed Central

    Brown, Samuel M.; Tate, Quinn; Jones, Jason P.; Knox, Daniel; Kuttler, Kathryn G.; Lanspa, Michael; Rondina, Matthew T.; Grissom, Colin K.; Behera, Subhasis; Mathews, V.J.; Morris, Alan

    2013-01-01

    Introduction: Heart-rate variability reflects autonomic nervous system tone as well as the overall health of the baroreflex system. We hypothesized that loss of complexity in heart-rate variability upon ICU admission would be associated with unsuccessful early resuscitation of sepsis. Methods: We prospectively enrolled patients admitted to ICUs with severe sepsis or septic shock from 2009 to 2011. We studied 30 minutes of EKG, sampled at 500 Hz, at ICU admission and calculated heart-rate complexity via detrended fluctuation analysis. Primary outcome was vasopressor independence at 24 hours after ICU admission. Secondary outcome was 28-day mortality. Results: We studied 48 patients, of whom 60% were vasopressor independent at 24 hours. Five (10%) died within 28 days. The ratio of fractal alpha parameters was associated with both vasopressor independence and 28-day mortality (p=0.04) after controlling for mean heart rate. In the optimal model, SOFA score and the long-term fractal alpha parameter were associated with vasopressor independence. Conclusions: Loss of complexity in heart rate variability is associated with worse outcome early in severe sepsis and septic shock. Further work should evaluate whether complexity of heart rate variability (HRV) could guide treatment in sepsis. PMID:23958243
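
    Detrended fluctuation analysis itself is compact enough to sketch: integrate the mean-centered R-R series, detrend it in windows of increasing size, and read the fractal exponent off the log-log slope of fluctuation versus window size. The sketch below computes a single exponent on synthetic data; the study's ratio of short- and long-term alpha parameters would fit separate slopes over two scale ranges.

        import numpy as np

        def dfa_alpha(rr, scales=(4, 8, 16, 32, 64)):
            """Scaling exponent alpha: slope of log F(n) versus log n."""
            y = np.cumsum(rr - np.mean(rr))  # integrated, mean-centered series
            F = []
            for n in scales:
                m = len(y) // n
                segs = y[: m * n].reshape(m, n)
                t = np.arange(n)
                # Remove a least-squares linear trend within each window.
                trend = [np.polyval(np.polyfit(t, s, 1), t) for s in segs]
                F.append(np.sqrt(np.mean((segs - np.array(trend)) ** 2)))
            slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
            return slope

        # Synthetic R-R intervals (seconds); white noise should give alpha near 0.5.
        rr = 0.8 + 0.05 * np.random.default_rng(2).standard_normal(1000)
        print(dfa_alpha(rr))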

  10. Increased ventilatory variability and complexity in patients with hyperventilation disorder.

    PubMed

    Bokov, Plamen; Fiamma, Marie-Noëlle; Chevalier-Bidaud, Brigitte; Chenivesse, Cécile; Straus, Christian; Similowski, Thomas; Delclaux, Christophe

    2016-05-15

    It has been hypothesized that hyperventilation disorders could be characterized by an abnormal ventilatory control leading to enhanced variability of resting ventilation. The variability of tidal volume (VT) often shows a nonnormal distribution that can be described by the negative slope of the relationship between the probability density distribution of VT and VT on a log-log scale, which characterizes augmented breaths. The objectives of this study were to describe the variability of resting ventilation [coefficient of variation (CV) of VT and slope], the stability of respiratory control (loop, controller, and plant gains characterizing ventilatory-chemoresponsiveness interactions), and the chaotic-like dynamics (embedding dimension, Kappa values characterizing complexity) of resting ventilation in patients with a well-defined dysfunctional breathing pattern characterized by air hunger and constantly decreased PaCO2 during a cardiopulmonary exercise test. Compared with 14 healthy subjects with similar anthropometrics, 23 patients with hyperventilation were characterized by increased variability of resting tidal ventilation (CV of VT, median [interquartile]: 26% [19-35] vs. 36% [28-48], P = 0.020; slope: -6.63 [-7.65; -5.36] vs. -3.88 [-5.91; -2.66], P = 0.004) that was not related to increased chemical drive (loop gain: 0.051 [0.039-0.221] vs. 0.044 [0.012-0.087], P = 0.149) but was related to increased ventilatory complexity (Kappa values, P < 0.05). Plant gain was decreased in patients and correlated with complexity (with Kappa 5 - degree 5: Rho = -0.48, P = 0.006). In conclusion, well-defined patients suffering from hyperventilation disorder are characterized by increased variability of their resting ventilation due to increased ventilatory complexity, with stable ventilatory-chemoresponsiveness interactions.

  11. Evaluation of terrain complexity by autocorrelation. [geomorphology and geobotany

    NASA Technical Reports Server (NTRS)

    Craig, R. G.

    1982-01-01

    The topographic complexity of various sections of the Ozark, Appalachian, and Interior Low Plateaus, as well as of the New England, Piedmont, Blue Ridge, Ouachita, and Valley and Ridge Provinces of the Eastern United States, was characterized. The variability of autocorrelation within a small area (a 7 1/2-minute quadrangle) was compared to the variability at widely separated and diverse areas within the same physiographic region, to measure the degree of uniformity of the processes that can be expected to be encountered within a given physiographic province. The variability of autocorrelation across the eight geomorphic regions was compared and contrasted. The total study area was partitioned into subareas homogeneous in terrain complexity. The relation between the complexity measured, the geomorphic process mix implied, and the way in which geobotanical information is modified into a more or less recognizable entity is demonstrated. Sampling strategy is described.

  12. Mathematics for Physics

    NASA Astrophysics Data System (ADS)

    Stone, Michael; Goldbart, Paul

    2009-07-01

    Preface; 1. Calculus of variations; 2. Function spaces; 3. Linear ordinary differential equations; 4. Linear differential operators; 5. Green functions; 6. Partial differential equations; 7. The mathematics of real waves; 8. Special functions; 9. Integral equations; 10. Vectors and tensors; 11. Differential calculus on manifolds; 12. Integration on manifolds; 13. An introduction to differential topology; 14. Groups and group representations; 15. Lie groups; 16. The geometry of fibre bundles; 17. Complex analysis I; 18. Applications of complex variables; 19. Special functions and complex variables; Appendixes; References; Index.

  13. Regional-scale brine migration along vertical pathways due to CO2 injection - Part 2: A simulated case study in the North German Basin

    NASA Astrophysics Data System (ADS)

    Kissinger, Alexander; Noack, Vera; Knopf, Stefan; Konrad, Wilfried; Scheer, Dirk; Class, Holger

    2017-06-01

    Saltwater intrusion into potential drinking water aquifers due to the injection of CO2 into deep saline aquifers is one of the hazards associated with the geological storage of CO2. Thus, in a site-specific risk assessment, models for predicting the fate of the displaced brine are required. Practical simulation of brine displacement involves decisions regarding the complexity of the model. The choice of an appropriate level of model complexity depends on multiple criteria: the target variable of interest, the relevant physical processes, the computational demand, the availability of data, and the data uncertainty. In this study, we set up a regional-scale geological model for a realistic (but not real) onshore site in the North German Basin with characteristic geological features for that region. A major aim of this work is to identify the relevant parameters controlling saltwater intrusion in a complex structural setting and to test the applicability of different model simplifications. The model that is used to identify relevant parameters fully couples flow in shallow freshwater aquifers and deep saline aquifers. This model also includes variable-density transport of salt and realistically incorporates surface boundary conditions with groundwater recharge. The complexity of this model is then reduced in several steps, by neglecting physical processes (two-phase flow near the injection well, variable-density flow) and by simplifying the complex geometry of the geological model. The results indicate that the initial salt distribution prior to the injection of CO2 is one of the key parameters controlling shallow aquifer salinization. However, determining the initial salt distribution involves large uncertainties in the regional-scale hydrogeological parameterization and requires complex and computationally demanding models (regional-scale variable-density salt transport). In order to evaluate strategies for minimizing leakage into shallow aquifers, other target variables can be considered, such as the volumetric leakage rate into shallow aquifers or the pressure buildup in the injection horizon. Our results show that simplified models, which neglect variable-density salt transport, can reach an acceptable agreement with more complex models.

  14. A Brief History of the use of Electromagnetic Induction Techniques in Soil Survey

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Doolittle, James

    2017-04-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools and increased the amount and types of data that can be gathered with a single pass. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales. The future should witness a greater use of multiple-frequency and multiple-coil EMI sensors and integration with other sensors to assess the spatial variability of soil properties. Data analysis will be improved with advanced processing and presentation systems and more sophisticated geostatistical modeling algorithms will be developed and used to interpolate EMI data, improve the resolution of subsurface features, and assess soil properties.

  15. Suicide and suicidal behaviour

    PubMed Central

    Turecki, Gustavo; Brent, David A.

    2017-01-01

    Summary Suicide is a complex public health problem of global dimension. Suicidal behaviour (SB) shows marked differences between genders, age groups, geographic regions and socio-political realities, and variably associates with different risk factors, underscoring likely etiological heterogeneity. Although there is no effective algorithm to predict suicide in clinical practice, improved recognition and understanding of clinical, psychological, sociological, and biological factors may facilitate the detection of high-risk individuals and assist in treatment selection. Psychotherapeutic, pharmacological, or neuromodulatory treatments of mental disorders can often prevent SB; additionally, regular follow-up of suicide attempters by mental health services is key to prevent future SB. PMID:26385066

  16. Short-term Wind Forecasting at Wind Farms using WRF-LES and Actuator Disk Model

    NASA Astrophysics Data System (ADS)

    Kirkil, Gokhan

    2017-04-01

    Short-term wind forecasts are obtained for a wind farm on mountainous terrain using WRF-LES. Multi-scale simulations are also performed using different PBL parameterizations, and turbines are parameterized using an actuator disc model. The LES models improve the forecasts. Statistical error analysis is performed and ramp events are analyzed. The complex topography of the study area affects model performance; in particular, the accuracy of the wind forecasts is poor for cross valley-mountain flows. By means of LES, we gain new knowledge about the sources of spatial and temporal variability of wind fluctuations, such as the configuration of wind turbines.

  17. Observations of strain accumulation across the San Andreas fault near Palmdale, California, with a two-color geodimeter

    USGS Publications Warehouse

    Langbein, J.O.; Linker, M.F.; McGarr, A.; Slater, L.E.

    1982-01-01

    Two-color laser ranging measurements during a 15-month period over a geodetic network spanning the San Andreas fault near Palmdale, California, indicate that the crust expands and contracts aseismically in episodes as short as 2 weeks. Shear strain parallel to the fault has accumulated monotonically since November 1980, but at a variable rate. Improvements in measurement precision and temporal resolution over those of previous geodetic studies near Palmdale have resulted in the definition of a time history of crustal deformation that is much more complex than formerly realized.

  18. The growth receptors and their role in wound healing.

    PubMed

    Rolfe, Kerstin J; Grobbelaar, Adriaan O

    2010-11-01

    Abnormal wound healing is a major problem in healthcare today, with both scarring and chronic wounds affecting large numbers of individuals worldwide. Wound healing is a complex process involving several variables, including growth factors and their receptors. Chronic wounds fail to complete the wound healing process, while scarring is considered to be an overzealous wound healing process. Growth factor receptors and their ligands are being investigated to assess their potential in the development of therapeutic strategies to improve wound healing. This review discusses potential therapeutics for manipulating growth factors and their corresponding receptors for the treatment of abnormal wound healing.

  19. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    PubMed

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.

  20. Factors that influence sexual arousal in men: a focus group study.

    PubMed

    Janssen, Erick; McBride, Kimberly R; Yarber, William; Hill, Brandon J; Butler, Scott M

    2008-04-01

    The goal of this study was to improve our understanding of men's sexual response and its components as well as the factors or types of situations that men describe as facilitating or interfering with sexual arousal. Six focus groups, involving 50 mostly white, heterosexual men (M age = 35.2 years; range, 18-70), were conducted. As was previously found in women (Graham, Sanders, Milhausen, & McBride, Archives of Sexual Behavior, 33, 527-538, 2004), men described a wide range of physical (genital as well as nongenital) and cognitive/affective cues for sexual arousal. Men also described the relationship between sexual desire and arousal as variable and complex, reported a wide range of factors that increased or decreased sexual arousal, and showed substantial variability in both the importance and the direction of those factors' effects. The findings may help further development of models of sexual response and inform discussions about gender differences in sexual desire and arousal.

  1. Parenting Stress, Parental Reactions, and Externalizing Behavior From Ages 4 to 10.

    PubMed

    Mackler, Jennifer S; Kelleher, Rachael T; Shanahan, Lilly; Calkins, Susan D; Keane, Susan P; O'Brien, Marion

    2015-04-01

    The association between parenting stress and child externalizing behavior, and the mediating role of parenting, has yielded inconsistent findings; however, the literature has typically been cross-sectional and unidirectional. In the current study the authors examined the longitudinal transactions among parenting stress, perceived negative parental reactions, and child externalizing at 4, 5, 7, and 10 years old. Models examining parent effects (parenting stress to child behavior), child effects (externalizing to parental reactions and stress), indirect effects of parental reactions, and the transactional associations among all variables were compared. The transactional model best fit the data, and longitudinal reciprocal effects emerged between parenting stress and externalizing behavior. The mediating role of parental reactions was not supported; however, indirect effects suggest that parenting stress both is affected by and affects parent and child behavior. The complex associations among parent and child variables indicate the importance of interventions aimed at improving the parent-child relationship and reducing parenting stress.

  2. Hydropower Modeling Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoll, Brady; Andrade, Juan; Cohen, Stuart

    Hydropower facilities are important assets for the electric power sector and represent a key source of flexibility for electric grids with large amounts of variable generation. As variable renewable generation sources expand, understanding the capabilities and limitations of the flexibility from hydropower resources is important for grid planning. Appropriately modeling these resources, however, is difficult because of the wide variety of constraints these plants face that other generators do not. These constraints can be broadly categorized as environmental, operational, and regulatory. This report highlights several key issues in incorporating these constraints when modeling hydropower operations in production-cost and capacity-expansion models. Many of these challenges involve a lack of data to adequately represent the constraints, or issues of model complexity and run time. We present several potential methods for improving the accuracy of hydropower representation in these models to allow for a better understanding of hydropower's capabilities.

  3. Improved Quantification of Free and Ester-Bound Gallic Acid in Foods and Beverages by UHPLC-MS/MS.

    PubMed

    Newsome, Andrew G; Li, Yongchao; van Breemen, Richard B

    2016-02-17

    Hydrolyzable tannins are measured routinely during the characterization of food and beverage samples. Most methods for the determination of hydrolyzable tannins use hydrolysis or methanolysis to convert complex tannins to small molecules (gallic acid, methyl gallate, and ellagic acid) for quantification by HPLC-UV. Often unrecognized, the analytical limitations and variability inherent in these approaches for the measurement of hydrolyzable tannins include the variable mass fraction (0-0.90) that is released as analyte, contributions of sources other than tannins to hydrolyzable gallate (which can exceed 10 wt %), the measurement of both free and total analyte, and the lack of controls to account for degradation. An accurate, specific, sensitive, and higher-throughput approach for the determination of hydrolyzable gallate, based on ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), that overcomes these limitations was developed.

  4. An effective pseudospectral method for constraint dynamic optimisation problems with characteristic times

    NASA Astrophysics Data System (ADS)

    Xiao, Long; Liu, Xinggao; Ma, Liang; Zhang, Zeyin

    2018-03-01

    Dynamic optimisation problems with characteristic times arise in many areas and are an active focus of dynamic optimisation research. This paper considers a class of dynamic optimisation problems with constraints that depend on interior points, either fixed or variable, and presents a novel direct pseudospectral method using Legendre-Gauss (LG) collocation points for solving them. A formula for the state at the terminal time of each subdomain is derived, expressing it as a linear combination of the state at the LG points in the subdomain and thereby avoiding complex nonlinear integration. The sensitivities of the state at the collocation points with respect to the variable characteristic times are derived to improve the efficiency of the method. Three well-known characteristic-time dynamic optimisation problems are solved and the results compared in detail with methods reported in the literature, demonstrating the effectiveness of the proposed method.
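
    The terminal-state relation at the core of the method can be illustrated with plain Legendre-Gauss quadrature: the state at the end of a subdomain is a weighted linear combination of derivative values at the interior LG points. In the sketch below the derivative is a known function rather than collocated dynamics, which is a simplifying assumption.

        import numpy as np

        def lg_terminal_state(xdot, x0, t0, tf, n=8):
            """Terminal state of one subdomain as a weighted combination of
            the state derivative evaluated at the Legendre-Gauss points."""
            tau, w = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]
            t = 0.5 * (tf - t0) * (tau + 1.0) + t0       # map to [t0, tf]
            return x0 + 0.5 * (tf - t0) * np.dot(w, xdot(t))

        # Check against known dynamics: xdot = cos(t), x(0) = 0, so x(tf) = sin(tf).
        print(lg_terminal_state(np.cos, 0.0, 0.0, 1.0), np.sin(1.0))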

  5. An adaptive gridless methodology in one dimension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, N.T.; Hailey, C.E.

    1996-09-01

    Gridless numerical analysis offers great potential for accurately solving for flow about complex geometries or moving-boundary problems. Because gridless methods do not require point connectivity, the mesh cannot twist or distort. The gridless method utilizes a Taylor series about each point to obtain the unknown derivative terms from the current field variable estimates. The governing equation is then numerically integrated to determine the field variables for the next iteration. Effects of point spacing and Taylor series order on accuracy are studied, and they follow trends similar to those of traditional numerical techniques. Introducing adaptation by point movement, using a spring analogy, allows the solution method to track a moving boundary. The adaptive gridless method models linear, nonlinear, steady, and transient problems. Comparison with known analytic solutions is given for these examples. Although point-movement adaptation does not provide a significant increase in accuracy, it helps capture important features and provides an improved solution.
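
    The Taylor-series step described here can be sketched directly: at each point, fit a local second-order expansion to scattered neighbors by least squares and read off the derivative estimates. The sample points and test function below are illustrative assumptions.

        import numpy as np

        def gridless_derivatives(x0, xs, fs):
            """Estimate f'(x0) and f''(x0) from scattered (non-grid) neighbors by
            fitting a second-order Taylor expansion in a least-squares sense."""
            dx = np.asarray(xs) - x0
            A = np.column_stack([np.ones_like(dx), dx, 0.5 * dx**2])
            coef, *_ = np.linalg.lstsq(A, fs, rcond=None)
            return coef[1], coef[2]

        # Irregularly spaced points around x0 = 1.0 for f = sin.
        xs = np.array([0.62, 0.87, 1.05, 1.31, 1.44])
        d1, d2 = gridless_derivatives(1.0, xs, np.sin(xs))
        print(d1, np.cos(1.0))   # first-derivative estimate vs. exact
        print(d2, -np.sin(1.0))  # second-derivative estimate vs. exact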

  6. Gaming science innovations to integrate health systems science into medical education and practice

    PubMed Central

    White, Earla J; Lewis, Joy H; McCoy, Lise

    2018-01-01

    Health systems science (HSS) is an emerging discipline addressing multiple, complex, interdependent variables that affect providers’ abilities to deliver patient care and influence population health. New perspectives and innovations are required as physician leaders and medical educators strive to accelerate changes in medical education and practice to meet the needs of evolving populations and systems. The purpose of this paper is to introduce gaming science as a lens to magnify HSS integration opportunities in the scope of medical education and practice. Evidence supports gaming science innovations as effective teaching and learning tools to promote learner engagement in scientific and systems thinking for decision making in complex scenarios. Valuable insights and lessons gained through the history of war games have resulted in strategic thinking to minimize risk and save lives. In health care, where decisions can affect patient and population outcomes, gaming science innovations have the potential to provide safe learning environments to practice crucial decision-making skills. Research of gaming science limitations, gaps, and strategies to maximize innovations to further advance HSS in medical education and practice is required. Gaming science holds promise to equip health care teams with HSS knowledge and skills required for transformative practice. The ultimate goals are to empower providers to work in complex systems to improve patient and population health outcomes and experiences, and to reduce costs and improve care team well-being.

  7. Improving the Fitness of High-Dimensional Biomechanical Models via Data-Driven Stochastic Exploration

    PubMed Central

    Bustamante, Carlos D.; Valero-Cuevas, Francisco J.

    2010-01-01

    The field of complex biomechanical modeling has begun to rely on Monte Carlo techniques to investigate the effects of parameter variability and measurement uncertainty on model outputs, search for optimal parameter combinations, and define model limitations. However, advanced stochastic methods to perform data-driven explorations, such as Markov chain Monte Carlo (MCMC), become necessary as the number of model parameters increases. Here, we demonstrate the feasibility and, to our knowledge, the first use of an MCMC approach to improve the fitness of realistically large biomechanical models. We used a Metropolis–Hastings algorithm to search increasingly complex parameter landscapes (3, 8, 24, and 36 dimensions) to uncover underlying distributions of anatomical parameters of a "truth model" of the human thumb on the basis of simulated kinematic data (thumbnail location, orientation, and linear and angular velocities) polluted by zero-mean, uncorrelated multivariate Gaussian "measurement noise." Driven by these data, ten Markov chains searched each model parameter space for the subspace that best fit the data (posterior distribution). As expected, the convergence time increased, more local minima were found, and marginal distributions broadened as the parameter space complexity increased. In the 36-D scenario, some chains found local minima but the majority of chains converged to the true posterior distribution (confirmed using a cross-validation dataset), thus demonstrating the feasibility and utility of these methods for realistically large biomechanical problems. PMID:19272906
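
    A random-walk Metropolis–Hastings step of the kind described is only a few lines. The sketch below targets a hypothetical two-parameter posterior built from a Gaussian misfit to simulated data; the step size, data, and posterior are assumptions, not the paper's 36-dimensional thumb model.

        import numpy as np

        rng = np.random.default_rng(3)

        def metropolis_hastings(log_post, theta0, n_steps=5000, step=0.1):
            """Random-walk Metropolis sampler over a parameter vector."""
            theta = np.array(theta0, dtype=float)
            lp = log_post(theta)
            chain = []
            for _ in range(n_steps):
                prop = theta + step * rng.standard_normal(theta.size)
                lp_prop = log_post(prop)
                if np.log(rng.random()) < lp_prop - lp:  # accept or reject
                    theta, lp = prop, lp_prop
                chain.append(theta.copy())
            return np.array(chain)

        # Hypothetical posterior: Gaussian misfit between parameters and data.
        data = np.array([1.2, 0.7])
        log_post = lambda th: -0.5 * np.sum((th - data) ** 2) / 0.05**2
        chain = metropolis_hastings(log_post, [0.0, 0.0])
        print(chain[2500:].mean(axis=0))  # should approach `data` after burn-in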

  9. Multi-Constraint Multi-Variable Optimization of Source-Driven Nuclear Systems

    NASA Astrophysics Data System (ADS)

    Watkins, Edward Francis

    1995-01-01

    A novel approach to the search for optimal designs of source-driven nuclear systems is investigated. Such systems include radiation shields, fusion reactor blankets, and various neutron spectrum-shaping assemblies. The novel approach involves the replacement of the steepest-descent optimization algorithm incorporated in the code SWAN by a significantly more general and efficient sequential quadratic programming optimization algorithm provided by the code NPSOL. The resulting SWAN/NPSOL code system can be applied to more general, multi-variable, multi-constraint shield optimization problems. The constraints it accounts for may include simple bounds on variables, linear constraints, and smooth nonlinear constraints. It may also be applied to unconstrained, bound-constrained, and linearly constrained optimization. The shield optimization capabilities of the SWAN/NPSOL code system are tested and verified in a variety of optimization problems: dose minimization at constant cost, cost minimization at constant dose, and multiple-nonlinear-constraint optimization. The replacement of the optimization part of SWAN with NPSOL is found to be feasible and leads to a very substantial improvement in the complexity of optimization problems that can be handled efficiently.
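
    NPSOL itself is a proprietary FORTRAN library, but the same sequential-quadratic-programming family is available in SciPy's SLSQP method, which handles the same mix of simple bounds, linear constraints, and smooth nonlinear constraints. The toy shield problem below (cost and dose models, coefficients) is entirely hypothetical.

        import numpy as np
        from scipy.optimize import minimize

        # Toy shield design: x = two layer thicknesses. Minimize cost subject
        # to a dose limit, mimicking "cost minimization at constant dose".
        cost = lambda x: 3.0 * x[0] + 5.0 * x[1]
        dose = lambda x: 10.0 * np.exp(-0.8 * x[0] - 1.2 * x[1])  # hypothetical model

        res = minimize(
            cost,
            x0=[1.0, 1.0],
            method="SLSQP",
            bounds=[(0.0, 5.0), (0.0, 5.0)],  # simple bounds on variables
            constraints=[{"type": "ineq", "fun": lambda x: 1.0 - dose(x)}],  # dose <= 1
        )
        print(res.x, cost(res.x), dose(res.x))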

  10. Review article: closed-loop systems in anesthesia: is there a potential for closed-loop fluid management and hemodynamic optimization?

    PubMed

    Rinehart, Joseph; Liu, Ngai; Alexander, Brenton; Cannesson, Maxime

    2012-01-01

    Closed-loop (automated) controllers are encountered in all aspects of modern life in applications ranging from air-conditioning to spaceflight. Although these systems are virtually ubiquitous, they are infrequently used in anesthesiology because of the complexity of physiologic systems and the difficulty in obtaining reliable and valid feedback data from the patient. Despite these challenges, closed-loop systems are being increasingly studied and improved for medical use. Two recent developments have made fluid administration a candidate for closed-loop control. First, the further description and development of dynamic predictors of fluid responsiveness provides a strong parameter for use as a control variable to guide fluid administration. Second, rapid advances in noninvasive monitoring of cardiac output and other hemodynamic variables make goal-directed therapy applicable for a wide range of patients in a variety of clinical care settings. In this article, we review the history of closed-loop controllers in clinical care, discuss the current understanding and limitations of the dynamic predictors of fluid responsiveness, and examine how these variables might be incorporated into a closed-loop fluid administration system.
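
    To make the control idea concrete, here is a toy proportional rule in Python using pulse pressure variation (PPV) as the dynamic predictor. It is purely illustrative; the threshold, gain, and bolus cap are invented, and this is not a clinical algorithm from the review.

    ```python
    def fluid_controller(ppv_percent, threshold=13.0, gain=25.0, max_bolus_ml=250.0):
        """Toy proportional rule: when the dynamic predictor (PPV) exceeds a
        fluid-responsiveness threshold, recommend a bolus scaled to the excess."""
        if ppv_percent <= threshold:
            return 0.0  # patient unlikely to be fluid-responsive; give no fluid
        return min(gain * (ppv_percent - threshold), max_bolus_ml)

    for ppv in (8.0, 14.0, 20.0):
        print(f"PPV {ppv}% -> bolus {fluid_controller(ppv):.0f} mL")
    ```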

  11. Determining the Ocean's Role on the Variable Gravity Field and Earth Rotation

    NASA Technical Reports Server (NTRS)

    Ponte, Rui M.; Frey, H. (Technical Monitor)

    2000-01-01

    A number of ocean models of different complexity have been used to study changes in the oceanic angular momentum (OAM) and mass fields and their relation to the variable Earth rotation and gravity field. Time scales examined range from seasonal to a few days. Results point to the importance of oceanic signals in driving polar motion, in particular the Chandler and annual wobbles. Results also show that oceanic signals have a measurable impact on length-of-day variations. Various circulation features and associated mass signals, including the North Pacific subtropical gyre, the equatorial currents, and the Antarctic Circumpolar Current, play a significant role in oceanic angular momentum variability. The impact on OAM values of an optimization procedure that uses available data to constrain ocean model results was also tested for the first time. The optimization procedure yielded substantial changes in OAM, related to adjustments in both motion and mass fields, as well as in the wind stress torques acting on the ocean. Constrained OAM values were found to yield noticeable improvements in the agreement with the observed Earth rotation parameters, particularly at the seasonal timescale.

  12. Analysing inter-relationships among water, governance, human development variables in developing countries

    NASA Astrophysics Data System (ADS)

    Dondeynaz, C.; Carmona Moreno, C.; Céspedes Lorente, J. J.

    2012-10-01

    The "Integrated Water Resources Management" principle was formally laid down at the International Conference on Water and Sustainable development in Dublin 1992. One of the main results of this conference is that improving Water and Sanitation Services (WSS), being a complex and interdisciplinary issue, passes through collaboration and coordination of different sectors (environment, health, economic activities, governance, and international cooperation). These sectors influence or are influenced by the access to WSS. The understanding of these interrelations appears as crucial for decision makers in the water sector. In this framework, the Joint Research Centre (JRC) of the European Commission (EC) has developed a new database (WatSan4Dev database) containing 42 indicators (called variables in this paper) from environmental, socio-economic, governance and financial aid flows data in developing countries. This paper describes the development of the WatSan4Dev dataset, the statistical processes needed to improve the data quality, and finally, the analysis to verify the database coherence is presented. Based on 25 relevant variables, the relationships between variables are described and organised into five factors (HDP - Human Development against Poverty, AP - Human Activity Pressure on water resources, WR - Water Resources, ODA - Official Development Aid, CEC - Country Environmental Concern). Linear regression methods are used to identify key variables having influence on water supply and sanitation. First analysis indicates that the informal urbanisation development is an important factor negatively influencing the percentage of the population having access to WSS. Health, and in particular children's health, benefits from the improvement of WSS. Irrigation is also enhancing Water Supply service thanks to multi-purpose infrastructure. Five country profiles are also created to deeper understand and synthetize the amount of information gathered. This new classification of countries is useful in identifying countries with a less advanced position and weaknesses to be tackled. The relevance of indicators gathered to represent environmental and water resources state is questioned in the discussion section. The paper concludes with the necessity to increase the reliability of current indicators and calls for further research on specific indicators, in particular on water quality at national scale, in order to better include environmental state in analysis to WSS.

  13. Family relations and eating disorders. The effectiveness of an integrated approach in the treatment of anorexia and bulimia in teenagers: results of a case-control systemic research.

    PubMed

    Onnis, L; Barbara, E; Bernardini, M; Caggese, A; Di Giacomo, S; Giambartolomei, A; Leonelli, A; Mule', A M; Nicoletti, P G; Vietri, A

    2012-03-01

    This article presents the results of a broader clinical research project into the effectiveness of integrated treatments for teenage eating disorders, carried out at the Complex Operative Unit of Psychotherapy (Unità Operativa Complessa or U.O.C.) of the Department of Psychiatric Sciences and Psychological Medicine in collaboration with the Department of Neuropsychiatric Science for Child Development (Dipartimento di Scienze Neuropsichiatriche dell'Età Evolutiva), both at the "La Sapienza" University of Rome. The hypothesis of this research project is that in diagnosable conditions such as anorexia or bulimia, an integrated and multidisciplinary treatment, which combines medical-nutritional interventions and family psychotherapy, achieves better results than a single kind of treatment, namely the usual medical-nutritional intervention supported by psychiatric counselling. Twenty-eight cases (16 of bulimia and 12 of anorexia) were selected and then subdivided, by randomized distribution, into two homogeneous groups (experimental and control) of 14 patients each. The grouping variables were the diagnosis, the disorder's severity and duration, BMI, gender, age, family composition and social status. The variables examined in this article are the clinical parameters, which were evaluated in accordance with the DSM IV-TR criteria, and relational parameters, which were explored through the use of the W.F.T. Test (Wiltwyck Family Tasks). These parameters were tested at the beginning as well as at the end of therapy, in both the experimental group and the control group. Statistical analysis showed that the experimental group, which received the integrated treatment, experienced a significant improvement in the parameters related to dysfunctional family interaction modalities, and that this improvement was correlated with the positive evolution of the clinical parameters. This improvement was not present, or not of the same degree, in the control group. The results, moreover, demonstrate the effectiveness of an integrated systemic treatment based on a complex approach compared to a reductionist approach.

  14. Weak conservation of structural features in the interfaces of homologous transient protein–protein complexes

    PubMed Central

    Sudha, Govindarajan; Singh, Prashant; Swapna, Lakshmipuram S; Srinivasan, Narayanaswamy

    2015-01-01

    Residue types at the interface of protein–protein complexes (PPCs) are known to be reasonably well conserved. However, we show, using a dataset of known 3-D structures of homologous transient PPCs, that the 3-D location of interfacial residues and their interaction patterns are only moderately and poorly conserved, respectively. Another surprising observation is that a residue at the interface that is conserved is not necessarily in the interface in the homolog. Such differences in homologous complexes are manifested by substitution of the residues that are spatially proximal to the conserved residue and structural differences at the interfaces as well as differences in spatial orientations of the interacting proteins. Conservation of interface location and the interaction pattern at the core of the interfaces is higher than at the periphery of the interface patch. Extents of variability of various structural features reported here for homologous transient PPCs are higher than the variation in homologous permanent homomers. Our findings suggest that straightforward extrapolation of interfacial nature and inter-residue interaction patterns from template to target could lead to serious errors in the modeled complex structure. Understanding the evolution of interfaces provides insights to improve comparative modeling of PPC structures. PMID:26311309

  15. Efficient computation of the joint sample frequency spectra for multiple populations.

    PubMed

    Kamm, John A; Terhorst, Jonathan; Song, Yun S

    2017-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.
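
    For a single population of constant size, the expected SFS has a simple closed form, E[xi_i] = theta / i, which momi generalizes to multiple populations with complex histories. A minimal sketch of the constant-size case (not momi's algorithm):

    ```python
    import numpy as np

    def expected_sfs_constant_size(n, theta=1.0):
        """Expected unfolded SFS for a constant-size population:
        E[xi_i] = theta / i for i = 1..n-1 (standard coalescent result)."""
        i = np.arange(1, n)
        return theta / i

    sfs = expected_sfs_constant_size(n=10, theta=2.0)
    print(sfs / sfs.sum())   # normalized spectrum, dominated by singletons
    ```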

  16. Efficient computation of the joint sample frequency spectra for multiple populations

    PubMed Central

    Kamm, John A.; Terhorst, Jonathan; Song, Yun S.

    2016-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity. PMID:28239248

  17. A Comparative Study of the Variables Used to Measure Syntactic Complexity and Accuracy in Task-Based Research

    ERIC Educational Resources Information Center

    Inoue, Chihiro

    2016-01-01

    The constructs of complexity, accuracy and fluency (CAF) have been used extensively to investigate learner performance on second language tasks. However, a serious concern is that the variables used to measure these constructs are sometimes used conventionally without any empirical justification. It is crucial for researchers to understand how…

  18. A Program Complexity Metric Based on Variable Usage for Algorithmic Thinking Education of Novice Learners

    ERIC Educational Resources Information Center

    Fuwa, Minori; Kayama, Mizue; Kunimune, Hisayoshi; Hashimoto, Masami; Asano, David K.

    2015-01-01

    We have explored educational methods for algorithmic thinking for novices and implemented a block programming editor and a simple learning management system. In this paper, we propose a program/algorithm complexity metric specified for novice learners. This metric is based on the variable usage in arithmetic and relational formulas in learner's…

  19. Mathematical Methods for Physics and Engineering Third Edition Paperback Set

    NASA Astrophysics Data System (ADS)

    Riley, Ken F.; Hobson, Mike P.; Bence, Stephen J.

    2006-06-01

    Prefaces; 1. Preliminary algebra; 2. Preliminary calculus; 3. Complex numbers and hyperbolic functions; 4. Series and limits; 5. Partial differentiation; 6. Multiple integrals; 7. Vector algebra; 8. Matrices and vector spaces; 9. Normal modes; 10. Vector calculus; 11. Line, surface and volume integrals; 12. Fourier series; 13. Integral transforms; 14. First-order ordinary differential equations; 15. Higher-order ordinary differential equations; 16. Series solutions of ordinary differential equations; 17. Eigenfunction methods for differential equations; 18. Special functions; 19. Quantum operators; 20. Partial differential equations: general and particular; 21. Partial differential equations: separation of variables; 22. Calculus of variations; 23. Integral equations; 24. Complex variables; 25. Application of complex variables; 26. Tensors; 27. Numerical methods; 28. Group theory; 29. Representation theory; 30. Probability; 31. Statistics; Index.

  20. Geoelectrical characterisation of basement aquifers: the case of Iberekodo, southwestern Nigeria

    NASA Astrophysics Data System (ADS)

    Aizebeokhai, Ahzegbobor P.; Oyeyemi, Kehinde D.

    2018-03-01

    Basement aquifers, which occur within the weathered and fractured zones of crystalline bedrocks, are important groundwater resources in tropical and subtropical regions. The development of basement aquifers is complex owing to their high spatial variability. Geophysical techniques are used to obtain information about the hydrologic characteristics of the weathered and fractured zones of the crystalline basement rocks, which relates to the occurrence of groundwater in the zones. The spatial distributions of these hydrologic characteristics are then used to map the spatial variability of the basement aquifers. Thus, knowledge of the spatial variability of basement aquifers is useful in siting wells and boreholes for optimal and perennial yield. Geoelectrical resistivity is one of the most widely used geophysical methods for assessing the spatial variability of the weathered and fractured zones in groundwater exploration efforts in basement complex terrains. The presented study focuses on combining vertical electrical sounding with two-dimensional (2D) geoelectrical resistivity imaging to characterise the weathered and fractured zones in a crystalline basement complex terrain in southwestern Nigeria. The basement aquifer was delineated, and the nature, extent and spatial variability of the delineated basement aquifer were assessed based on the spatial variability of the weathered and fractured zones. The study shows that a multiple-gradient array for 2D resistivity imaging is sensitive to vertical and near-surface stratigraphic features, which have hydrological implications. The integration of resistivity sounding with 2D geoelectrical resistivity imaging is efficient and enhances near-surface characterisation in basement complex terrain.
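
    For vertical electrical sounding, the apparent resistivity follows from the array's geometric factor; here is a sketch for the common Schlumberger array (the measurement values are invented for illustration):

    ```python
    import numpy as np

    def schlumberger_apparent_resistivity(ab2, mn, delta_v, current):
        """Apparent resistivity (ohm-m) for a Schlumberger array:
        K = pi * ((AB/2)^2 - (MN/2)^2) / MN,  rho_a = K * dV / I."""
        k = np.pi * (ab2**2 - (mn / 2.0) ** 2) / mn
        return k * delta_v / current

    # AB/2 = 50 m, MN = 2 m, dV = 12 mV, I = 100 mA (illustrative values)
    print(schlumberger_apparent_resistivity(ab2=50.0, mn=2.0, delta_v=0.012, current=0.1))
    ```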

  1. Improving cardiac operating room to intensive care unit handover using a standardised handover process.

    PubMed

    Gleicher, Yehoshua; Mosko, Jeffrey David; McGhee, Irene

    2017-01-01

    Handovers from the cardiovascular operating room (CVOR) to the cardiovascular intensive care unit (CVICU) are complex processes involving the transfer of information, equipment and responsibility at a time when the patient is most vulnerable. This transfer is typically variable in structure, content and execution. This variability can lead to the omission and miscommunication of critical information, leading to patient harm. We set out to improve the quality of patient handover from the CVOR to the CVICU by introducing a standardised handover protocol. This study is an interventional time-series study over a 4-month period at an adult cardiac surgery centre. A standardised handover protocol was developed using quality improvement methodologies. The protocol included a handover content checklist and the introduction of a formal 'sterile cockpit' timeout. Implementation of the protocol was refined using monthly iterative Plan-Do-Study-Act cycles. The primary outcome was the quality of handovers, measured by a Handover Score comprising handover content, teamwork and patient care planning indicators. Secondary outcomes included handover duration, adherence to the standardised handover protocol and handover team satisfaction surveys. 37 handovers were observed (6 pre-intervention and 31 post-intervention). The mean handover score increased from 6.5 to 14.0 (maximum 18 points). Specific improvements included fewer handover interruptions and more frequent postoperative patient care planning. Average handover duration increased slightly, from 2 min 40 s to 2 min 57 s. Caregivers noted improvements in teamwork, content received and patient care planning. The majority (>95%) agreed that the intervention was a valuable addition to the CVOR to CVICU handover process. Implementation of a standardised handover protocol for post-cardiac surgery patients was associated with fewer interruptions during handover, more reliable transfer of critical content and improved patient care planning.

  2. [Variable magnetic fields in the treatment of tics disorders - preliminary results].

    PubMed

    Pasek, Jarosław; Jędrzejewska, Anna; Jagodziński, Leszek; Obuchowicz, Anna; Flak, Maria; Sieroń, Aleksander

    Tic disorders are a frequent pathological syndrome, particularly typical of childhood. The symptoms of the disorder vary, and their intensity is individual, which makes an unequivocal diagnosis difficult. Tic disorders most often involve the muscles of the face, head, upper limbs and trunk. The study group consisted of 16 patients (11 boys and 5 girls) with complex tic disorders of unknown etiology, mainly involving the muscles of the face and upper limbs. The treatment consisted of magnetotherapy and magnetostimulation, applied once daily for 3 weeks, in two series. The frequency of tics and a percentage rating of treatment effectiveness were assessed. After 10 weeks, a decrease in the frequency of involuntary movements, of about 75% on the percentage scale, was observed in 14 patients. Subjective mood ratings showed that the decreased tic frequency translated directly into improved mood in all children, and into satisfaction in their parents. The use of variable magnetic fields reduced the frequency of tic disorders and improved the quality of life of the treated patients.

  3. [Variable magnetic fields in the treatment of tics disorders - preliminary results].

    PubMed

    Pasek, Jarosław; Jędrzejewska, Anna; Jagodziński, Leszek; Obuchowicz, Anna; Flak, Maria; Sieroń, Aleksander

    2016-01-01

    Tic disorders are a frequent pathological syndrome, particularly typical of childhood. The symptoms of the disorder vary, and their intensity is individual, which makes an unequivocal diagnosis difficult. Tic disorders most often involve the muscles of the face, head, upper limbs and trunk. The study group consisted of 16 patients (11 boys and 5 girls) with complex tic disorders of unknown etiology, mainly involving the muscles of the face and upper limbs. The treatment consisted of magnetotherapy and magnetostimulation, applied once daily for 3 weeks, in two series. The frequency of tics and a percentage rating of treatment effectiveness were assessed. After 10 weeks, a decrease in the frequency of involuntary movements, of about 75% on the percentage scale, was observed in 14 patients. Subjective mood ratings showed that the decreased tic frequency translated directly into improved mood in all children, and into satisfaction in their parents. The use of variable magnetic fields reduced the frequency of tic disorders and improved the quality of life of the treated patients.

  4. Medical Image Compression Based on Vector Quantization with Variable Block Sizes in Wavelet Domain

    PubMed Central

    Jiang, Huiyan; Ma, Zhiyuan; Hu, Yang; Yang, Benqiang; Zhang, Libo

    2012-01-01

    An optimized medical image compression algorithm based on wavelet transform and improved vector quantization is introduced. The goal of the proposed method is to maintain the diagnostically relevant information of the medical image at a high compression ratio. Wavelet transformation was first applied to the image. For the lowest-frequency subband of wavelet coefficients, a lossless compression method was used; for each of the high-frequency subbands, an optimized vector quantization with variable block size was implemented. In the novel vector quantization method, the local fractal dimension (LFD) was used to analyze the local complexity of each subband of wavelet coefficients. An optimal quadtree method was then employed to partition each subband into sub-blocks of several sizes. After that, a modified K-means approach based on an energy function was used in the codebook training phase. Finally, vector quantization coding was applied to the different types of sub-blocks. To verify the effectiveness of the proposed algorithm, JPEG, JPEG2000, and a fractal coding approach were chosen as reference algorithms. Experimental results show that the proposed method improves compression performance and achieves a balance between the compression ratio and the visual quality of the image. PMID:23049544
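
    The overall pipeline can be sketched with PyWavelets and scikit-learn: decompose the image, then vector-quantize a high-frequency subband against a trained codebook. This sketch uses fixed 4x4 blocks and plain k-means, whereas the paper varies block size via an LFD-driven quadtree and modifies k-means with an energy function.

    ```python
    import numpy as np
    import pywt
    from sklearn.cluster import KMeans

    image = np.random.rand(128, 128)               # stand-in for a medical image
    coeffs = pywt.wavedec2(image, "db2", level=2)  # coeffs[0] is the lowest-frequency subband

    # Vector-quantize one high-frequency subband with fixed 4x4 blocks
    band = coeffs[1][0]
    h, w = (band.shape[0] // 4) * 4, (band.shape[1] // 4) * 4
    blocks = band[:h, :w].reshape(h // 4, 4, w // 4, 4).swapaxes(1, 2).reshape(-1, 16)

    codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(blocks)
    indices = codebook.predict(blocks)             # the codes that would be transmitted
    reconstructed = codebook.cluster_centers_[indices]
    print("quantization MSE:", np.mean((blocks - reconstructed) ** 2))
    ```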

  5. Medical image compression based on vector quantization with variable block sizes in wavelet domain.

    PubMed

    Jiang, Huiyan; Ma, Zhiyuan; Hu, Yang; Yang, Benqiang; Zhang, Libo

    2012-01-01

    An optimized medical image compression algorithm based on wavelet transform and improved vector quantization is introduced. The goal of the proposed method is to maintain the diagnostically relevant information of the medical image at a high compression ratio. Wavelet transformation was first applied to the image. For the lowest-frequency subband of wavelet coefficients, a lossless compression method was used; for each of the high-frequency subbands, an optimized vector quantization with variable block size was implemented. In the novel vector quantization method, the local fractal dimension (LFD) was used to analyze the local complexity of each subband of wavelet coefficients. An optimal quadtree method was then employed to partition each subband into sub-blocks of several sizes. After that, a modified K-means approach based on an energy function was used in the codebook training phase. Finally, vector quantization coding was applied to the different types of sub-blocks. To verify the effectiveness of the proposed algorithm, JPEG, JPEG2000, and a fractal coding approach were chosen as reference algorithms. Experimental results show that the proposed method improves compression performance and achieves a balance between the compression ratio and the visual quality of the image.

  6. Exploring the potential of Saccharomyces eubayanus as a parent for new interspecies hybrid strains in winemaking.

    PubMed

    Magalhães, Frederico; Krogerus, Kristoffer; Castillo, Sandra; Ortiz-Julien, Anne; Dequin, Sylvie; Gibson, Brian

    2017-08-01

    Yeast cryotolerance brings advantages for wine fermentations, including improved aromatic complexity of white wines. Naturally cold-tolerant strains are generally less adept at wine fermentation, but fermentative fitness can potentially be improved through hybridization. Here we studied the potential of using hybrids of Saccharomyces eubayanus and a S. cerevisiae wine strain for low-temperature winemaking. By screening performance in response to variable concentrations of sugar and nitrogen and variable temperatures, we isolated one hybrid strain that exhibited superior performance. This hybrid strain was propagated and dried at pilot scale and tested for the fermentation of Macabeu and Sauvignon blanc grape musts. We obtained highly viable active dry yeast, which was able to efficiently ferment the grape musts with superior production of aroma-active volatiles, in particular 2-phenylethanol. The genome sequences of the hybrid strains revealed variable chromosome inheritance among hybrids, particularly within the S. cerevisiae subgenome. With the present paper, we expand knowledge of the potential of using S. eubayanus hybrids in the industrial fermentation of beverages other than lager beer. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. The Effects of Educational Diversity in a National Sample of Law Students: Fitting Multilevel Latent Variable Models in Data With Categorical Indicators.

    PubMed

    Gottfredson, Nisha C; Panter, A T; Daye, Charles E; Allen, Walter F; Wightman, Linda F

    2009-01-01

    Controversy surrounding the use of race-conscious admissions can be partially resolved with improved empirical knowledge of the effects of racial diversity in educational settings. We use a national sample of law students nested in 64 law schools to test the complex and largely untested theory regarding the effects of educational diversity on student outcomes. Social scientists who study these outcomes frequently encounter both latent variables and nested data within a single analysis. Yet, until recently, an appropriate modeling technique has been computationally infeasible, and consequently few applied researchers have estimated appropriate models to test their theories, sometimes limiting the scope of their research question. Our results, based on disaggregated multilevel structural equation models, show that racial diversity is related to a reduction in prejudiced attitudes and increased perceived exposure to diverse ideas and that these effects are mediated by more frequent interpersonal contact with diverse peers. These findings provide support for the idea that administrative manipulation of educational diversity may lead to improved student outcomes. Admitting a racially/ethnically diverse student body provides an educational experience that encourages increased exposure to diverse ideas and belief systems.

  8. Emotional Experience Improves With Age: Evidence Based on Over 10 Years of Experience Sampling

    PubMed Central

    Carstensen, Laura L.; Turan, Bulent; Scheibe, Susanne; Ram, Nilam; Ersner-Hershfield, Hal; Samanez-Larkin, Gregory R.; Brooks, Kathryn P.; Nesselroade, John R.

    2012-01-01

    Recent evidence suggests that emotional well-being improves from early adulthood to old age. This study used experience-sampling to examine the developmental course of emotional experience in a representative sample of adults spanning early to very late adulthood. Participants (N = 184, Wave 1; N = 191, Wave 2; N = 178, Wave 3) reported their emotional states at five randomly selected times each day for a one week period. Using a measurement burst design, the one-week sampling procedure was repeated five and then ten years later. Cross-sectional and growth curve analyses indicate that aging is associated with more positive overall emotional well-being, with greater emotional stability and with more complexity (as evidenced by greater co-occurrence of positive and negative emotions). These findings remained robust after accounting for other variables that may be related to emotional experience (personality, verbal fluency, physical health, and demographic variables). Finally, emotional experience predicted mortality; controlling for age, sex, and ethnicity, individuals who experienced relatively more positive than negative emotions in everyday life were more likely to have survived over a 13 year period. Findings are discussed in the theoretical context of socioemotional selectivity theory. PMID:20973600

  9. Reasoning strategies used by students to solve stoichiometry problems and its relationship to alternative conceptions, prior knowledge, and cognitive variables

    NASA Astrophysics Data System (ADS)

    de Astudillo, Luisa Rojas; Niaz, Mansoor

    1996-06-01

    Achievement in science depends on a series of factors that characterize the cognitive abilities of the students and the complex interactions between these factors and the environment that intervenes in the formation of students' background. The objective of this study is to: a) investigate reasoning strategies students use in solving stoichiometric problems; b) explore the relation between these strategies and alternative conceptions, prior knowledge and cognitive variables; and c) interpret the results within an epistemological framework. Results obtained show how stoichiometric relations produce conflicting situations for students, leading to conceptual misunderstanding of concepts, such as mass, atoms and moles. The wide variety of strategies used by students attest to the presence of competing and conflicting frameworks (progressive transitions, cf. Lakatos, 1970), leading to greater conceptual understanding. It is concluded that the methodology developed in this study (based on a series of closely related probing questions, generally requiring no calculations, that elicit student conceptual understanding to varying degrees within an intact classroom context) was influential in improving student performance. This improvement in performance, however, does not necessarily affect students' hard core of beliefs.

  10. The ins and outs of change of shift handoffs between nurses: a communication challenge.

    PubMed

    Carroll, John S; Williams, Michele; Gallivan, Theresa M

    2012-07-01

    Communication breakdowns have been identified as a source of problems in complex work settings such as hospital-based healthcare. The authors conducted a multi-method study of change of shift handoffs between nurses, including interviews, survey, audio taping and direct observation of handoffs, posthandoff questionnaires, and archival coding of clinical records. The authors found considerable variability across units, nurses and, surprisingly, roles. Incoming and outgoing nurses had different expectations for a good handoff: incoming nurses wanted a conversation with questions and eye contact, whereas outgoing nurses wanted to tell their story without interruptions. More experienced nurses abbreviated their reports when incoming nurses knew the patient, but the incoming nurses responded with a large number of questions, creating a contest for control. Nurses' ratings did not correspond to expert ratings of information adequacy, suggesting that nurses consider other functions of handoffs beyond information processing, such as social interaction and learning. These results suggest that variability across roles as information provider versus receiver and experience level (as well as across individual and organisational contexts) are reasons why improvement efforts directed at standardising and improving handoffs have been challenging in nursing and in other healthcare professions as well.

  11. [Modulator effect of socio-emotional variables on training in elaboration strategies in Compulsory Secondary Education (CSE): paraphrase and applications].

    PubMed

    Martín-Antón, Luis Jorge; Carbonero Martín, Miguel Angel; Román Sánchez, José María

    2012-02-01

    The purpose of this work is to verify the modulation of motivation, self-concept, and causal attributions in the efficacy of a training program of strategies to elaborate information in the stage of Compulsory Secondary Education (CSE). We selected 328 students from CSE, 179 from second grade and 149 from fourth grade, and three measurement moments: pretest, posttest, and follow-up. The results indicate greater use of learning strategies by students with higher intrinsic motivation, in contrast to students with higher extrinsic motivation, who use learning strategies less frequently. With regard to self-concept, the results differ as a function of the course. In second grade, we found modulation of the variable Academic self-concept, whereas in fourth grade, such modulation is produced by General self-concept and Private self-concept. In general, there is a tendency towards more enduring significant improvements in students with medium and high self-concept, especially in their perception of the use of strategies or in complex tasks that involve relating the contents to be learned with experiences from their daily life. However, students with low self-concept significantly improve strategies associated with learning how to perform specific tasks.

  12. Application of Complex Fluids in Lignocellulose Processing

    NASA Astrophysics Data System (ADS)

    Carrillo Lugo, Carlos A.

    Complex fluids such as emulsions, microemulsions and foams have been used for many applications owing to the multiplicity of properties they possess. In the present work, such fluids are introduced as effective media for processing lignocellulosic biomass. A demonstration of the generic benefits of complex fluids is presented: enhancing biomass impregnation, facilitating pretreatment for fiber deconstruction, and compatibilizing cellulose fibrils with hydrophobic polymers during composite manufacture. Improved impregnation of woody biomass was accomplished by applying water-continuous microemulsions. Microemulsions with high water content (> 85%) were formulated, and wood samples were impregnated by wicking and capillary flooding at atmospheric pressure and temperature. Formulations were designed to impregnate different wood species in shorter times and to a larger extent than the single components of the microemulsions (water, oil or surfactant solutions). The viscosity of the microemulsions and their interactions with cell wall constituents in fibers were critical in defining the extent of impregnation and solubilization. The relation between composition and formulation variables and the extent of microemulsion penetration into different woody substrates was studied. Formulation variables such as the salinity of the aqueous phase and the type of surfactant were elucidated, as were composition variables such as the water-to-oil ratio and the surfactant concentration. These variables affected the characteristics of the microemulsions and determined their effectiveness in wood treatment. The interactions between the surfactant and the substrate also made an important contribution to microemulsion penetration into the capillary structure of wood. Microemulsions were also studied as an alternative pretreatment for the manufacture of cellulose nanofibrils (CNFs). They were applied to pretreat lignin-free and lignin-containing fibers obtained from various processes, and incorporating active agents into the microemulsion facilitated fiber pretreatment before deconstruction via grinding and microfluidization. The energy consumed during the manufacture of cellulose nanofibrils was reduced by up to 55% and 32% for lignin-containing and lignin-free fibers, respectively. Moreover, the pretreatment did not negatively affect the mechanical properties of films prepared with the produced CNFs. CNFs were also used to enhance the stability of normal and multiple emulsions of the water-in-oil-in-water (W/O/W) type and to prevent their creaming; this was achieved through the marked increase in viscosity of the aqueous phase in the presence of CNFs. Finally, water-continuous emulsions were used to prepare nanocomposite fibers containing polystyrene and CNFs. The morphology of the composite fibers obtained after electrospinning of emulsions incorporating polystyrene and CNFs was affected by parameters such as the concentration of surfactant additives present in the microemulsion and the conductivity of the aqueous phase. Overall, emulsions and microemulsions are presented as a convenient platform for improving the compatibility between polymers of different hydrophilicity and for facilitating their processing and integration into composites.

  13. Complexity reduction of biochemical rate expressions.

    PubMed

    Schmidt, Henning; Madsen, Mads F; Danø, Sune; Cedersund, Gunnar

    2008-03-15

    The current trend in dynamical modelling of biochemical systems is to construct more and more mechanistically detailed and thus complex models. The complexity is reflected in the number of dynamic state variables and parameters, as well as in the complexity of the kinetic rate expressions. However, a greater level of complexity, or level of detail, does not necessarily imply better models, or a better understanding of the underlying processes. Data often does not contain enough information to discriminate between different model hypotheses, and such overparameterization makes it hard to establish the validity of the various parts of the model. Consequently, there is an increasing demand for model reduction methods. We present a new reduction method that reduces complex rational rate expressions, such as those often used to describe enzymatic reactions. The method is a novel term-based identifiability analysis, which is easy to use and allows for user-specified reductions of individual rate expressions in complete models. The method is one of the first methods to meet the classical engineering objective of improved parameter identifiability without losing the systems biology demand of preserved biochemical interpretation. The method has been implemented in the Systems Biology Toolbox 2 for MATLAB, which is freely available from http://www.sbtoolbox2.org. The Supplementary Material contains scripts that show how to use it by applying the method to the example models, discussed in this article.
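
    The flavor of term-based reduction can be illustrated with SymPy: drop a denominator term of a rational rate law that is negligible over the operating range, then check the approximation numerically. The rate expression and parameter values are generic textbook stand-ins, not the authors' models.

    ```python
    import sympy as sp

    S, I, Vmax, Km, Ki = sp.symbols("S I Vmax Km Ki", positive=True)

    # Full rational rate expression (competitive inhibition)
    v_full = Vmax * S / (Km * (1 + I / Ki) + S)

    # If Km*(1 + I/Ki) << S over the operating range, that term can be dropped,
    # leaving the saturated rate v = Vmax
    v_reduced = sp.limit(v_full, Km, 0)
    print(v_reduced)                      # -> Vmax

    # Numerical check at a high-substrate operating point
    vals = {Vmax: 1.0, Km: 0.1, Ki: 5.0, I: 1.0, S: 50.0}
    print(float(v_full.subs(vals)), float(v_reduced.subs(vals)))
    ```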

  14. Intraindividual variability is related to cognitive change in older adults: evidence for within-person coupling.

    PubMed

    Bielak, Allison A M; Hultsch, David F; Strauss, Esther; MacDonald, Stuart W S; Hunter, Michael A

    2010-09-01

    In this study, the authors addressed the longitudinal nature of intraindividual variability over 3 years. A sample of 304 community-dwelling older adults, initially between the ages of 64 and 92 years, completed 4 waves of annual testing on a battery of accuracy- and latency-based tests covering a wide range of cognitive complexity. Increases in response-time inconsistency on moderately and highly complex tasks were associated with increasing age, but there were significant individual differences in change across the entire sample. The time-varying covariation between cognition and inconsistency was significant across the 1-year intervals and remained stable across both time and age. On occasions when intraindividual variability was high, participants' cognitive performance was correspondingly low. The strength of the coupling relationship was greater for more fluid cognitive domains such as memory, reasoning, and processing speed than for more crystallized domains such as verbal ability. Variability based on moderately and highly complex tasks provided the strongest prediction. These results suggest that intraindividual variability is highly sensitive to even subtle changes in cognitive ability. (c) 2010 APA, all rights reserved.
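
    Response-time inconsistency of the kind analyzed here is typically a within-person standard deviation computed after removing systematic trends; here is a minimal sketch (the simple linear detrending is an assumption, a simplification of what such studies do):

    ```python
    import numpy as np

    def intraindividual_sd(rt_trials):
        """Within-person SD of trial-level response times after removing
        a linear practice trend across trials."""
        rt = np.asarray(rt_trials, dtype=float)
        trial = np.arange(rt.size)
        slope, intercept = np.polyfit(trial, rt, 1)   # simple linear detrending
        residuals = rt - (slope * trial + intercept)
        return residuals.std(ddof=1)

    rng = np.random.default_rng(1)
    steady = 500 + rng.normal(0, 20, 60)     # consistent responder (RTs in ms)
    erratic = 500 + rng.normal(0, 80, 60)    # inconsistent responder
    print(intraindividual_sd(steady), intraindividual_sd(erratic))
    ```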

  15. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
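
    A closely related construction can be checked numerically: mixing an exponential variable over a gamma-distributed rate yields a q-exponential (Lomax) law, one standard superstatistical route to these densities. The sketch below verifies the tail against the analytic survival function; it illustrates the gamma-variable connection rather than reproducing the paper's exact two-gamma representation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # An exponential variable whose rate is itself gamma-distributed marginalizes
    # to a Lomax (q-exponential) law with survival function (1 + theta*x)**(-k)
    k, theta, n = 3.0, 1.0, 200_000
    rates = rng.gamma(shape=k, scale=theta, size=n)
    x = rng.exponential(1.0 / rates)

    xs = np.linspace(0.0, 5.0, 6)
    empirical = np.array([(x > v).mean() for v in xs])
    analytic = (1.0 + theta * xs) ** (-k)
    print(np.round(empirical, 4))
    print(np.round(analytic, 4))
    ```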

  16. Research activities in the field of human factors: Evaluation and prospects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larchier-Boulanger, J.; Grosdeva, T.

    1988-01-01

    Industrial systems are sociotechnical, i.e., conceived, directed, checked, run, and repaired by individuals belonging to structured organizations for either individual or group work. Hence, a better understanding of their behavior, competences, and know-how is a must. At the DER, the ESF department's human factors group is given the mission to enlarge, through a pluridisciplinary approach, the knowledge of human factors in complex systems. Human interventions are analyzed both for their positive aspects (competences and know-how to retrieve complex situations) and their negative aspects (human weaknesses). For safety reasons such analyses are mainly directed toward nuclear plant operators, considered individually (intervention of one operator) or as a team (group behavior). The aims of the studies on human factors are various, and such studies justify research in this field. They make it possible, through better consideration of the variables specific to individuals, to provide the enterprise with means for: (1) increasing reliability and helping performance, (2) improving the adjustment of work demands to the real environment, and (3) creating a better synergy between the individual and his/her enterprise. The variables specific to human factors keep developing; the prospects for research in this field are thus to recenter and redefine the studies undertaken.

  17. Implementing high-performance work practices in healthcare organizations: qualitative and conceptual evidence.

    PubMed

    McAlearney, Ann Scheck; Robbins, Julie; Garman, Andrew N; Song, Paula H

    2013-01-01

    Studies across industries suggest that the systematic use of high-performance work practices (HPWPs) may be an effective but underused strategy to improve quality of care in healthcare organizations. Optimal use of HPWPs depends on how they are implemented, yet we know little about their implementation in healthcare. We conducted 67 key informant interviews in five healthcare organizations, each considered to have exemplary work practices in place and to deliver high-quality care, as part of an extensive study of HPWP use in healthcare. We analyzed interview transcripts inductively and deductively to examine why and how organizations implement HPWPs. We used an evidence-based model of complex innovation adoption to guide our exploration of factors that facilitate HPWP implementation. We found considerable variability in interviewees' reasons for implementing HPWPs, including macro-organizational (strategic level) and micro-organizational (individual level) reasons. This variability highlighted the complex context for HPWP implementation in many organizations. We also found that our application of an innovation implementation model helped clarify and categorize facilitators of HPWP implementation, thus providing insight on how these factors can contribute to implementation effectiveness. Focusing efforts on clarifying definitions, building commitment, and ensuring consistency in the application of work practices may be particularly important elements of successful implementation.

  18. A Three-Dimensional Finite-Element Model for Simulating Water Flow in Variably Saturated Porous Media

    NASA Astrophysics Data System (ADS)

    Huyakorn, Peter S.; Springer, Everett P.; Guvanasen, Varut; Wadsworth, Terry D.

    1986-12-01

    A three-dimensional finite-element model for simulating water flow in variably saturated porous media is presented. The model formulation is general and capable of accommodating complex boundary conditions associated with seepage faces and infiltration or evaporation on the soil surface. Included in this formulation is an improved Picard algorithm designed to cope with severely nonlinear soil moisture relations. The algorithm is formulated for both rectangular and triangular prism elements. The element matrices are evaluated using an "influence coefficient" technique that avoids costly numerical integration. Spatial discretization of a three-dimensional region is performed using a vertical slicing approach designed to accommodate complex geometry with irregular boundaries, layering, and/or lateral discontinuities. Matrix solution is achieved using a slice successive overrelaxation scheme that permits a fairly large number of nodal unknowns (on the order of several thousand) to be handled efficiently on small minicomputers. Six examples are presented to verify and demonstrate the utility of the proposed finite-element model. The first four examples concern one- and two-dimensional flow problems used as sample problems to benchmark the code. The remaining examples concern three-dimensional problems. These problems are used to illustrate the performance of the proposed algorithm in three-dimensional situations involving seepage faces and anisotropic soil media.
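
    The Picard linearization at the heart of such models freezes the nonlinear coefficient at the previous iterate and re-solves; under-relaxation helps on severely nonlinear problems. A scalar toy sketch of this idea follows (the model itself solves large sparse systems, not a scalar equation):

    ```python
    def picard_solve(coeff, rhs, x0=0.0, tol=1e-10, max_iter=100, relax=0.7):
        """Solve K(x)*x = b by lagging the nonlinear coefficient K (Picard
        iteration), with under-relaxation to stabilize severe nonlinearity."""
        x = float(x0)
        for _ in range(max_iter):
            x_new = rhs / coeff(x)                 # linearized solve with frozen K
            x_new = relax * x_new + (1 - relax) * x
            if abs(x_new - x) < tol:
                return x_new
            x = x_new
        raise RuntimeError("Picard iteration did not converge")

    # Toy analogue of a stiffening soil-moisture relation: K(x) = 1 + x**2
    # The unrelaxed iteration oscillates here; relax=0.7 converges to x = 1
    print(picard_solve(lambda x: 1 + x**2, rhs=2.0))
    ```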

  19. Dominant Lyapunov exponent and approximate entropy in heart rate variability during emotional visual elicitation

    PubMed Central

    Valenza, Gaetano; Allegrini, Paolo; Lanatà, Antonio; Scilingo, Enzo Pasquale

    2012-01-01

    In this work we characterized the non-linear complexity of Heart Rate Variability (HRV) in short time series. The complexity of the HRV signal was evaluated during emotional visual elicitation by using Dominant Lyapunov Exponents (DLEs) and Approximate Entropy (ApEn). We adopted a simplified model of emotion derived from the Circumplex Model of Affects (CMAs), in which emotional mechanisms are conceptualized in two dimensions, termed valence and arousal. Following the CMA model, a set of visual stimuli standardized in terms of arousal and valence, gathered from the International Affective Picture System (IAPS), was administered to a group of 35 healthy volunteers. The experimental session consisted of eight sessions alternating neutral images with high-arousal-content images. Several works in the literature show chaotic dynamics of HRV during rest or relaxed conditions. The outcomes of this work showed a clear switching mechanism between regular and chaotic dynamics when passing from neutral to arousal elicitation. Accordingly, the mean ApEn decreased with statistical significance during arousal elicitation and the DLE became negative. Results showed a clear distinction between the neutral and the arousal elicitation and could be profitably exploited to improve the accuracy of emotion recognition systems based on HRV time series analysis. PMID:22393320
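
    Approximate Entropy is a standard, easily implemented regularity statistic; here is a compact version (the embedding dimension and tolerance follow the common m = 2, r = 0.2*SD convention, which is an assumption, not necessarily the paper's exact settings):

    ```python
    import numpy as np

    def approximate_entropy(x, m=2, r_factor=0.2):
        """ApEn (Pincus): lower values indicate more regular dynamics.
        The tolerance r is scaled to the series standard deviation."""
        x = np.asarray(x, dtype=float)
        r = r_factor * x.std()

        def phi(m):
            emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
            # Chebyshev distance between all pairs of embedded vectors
            dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            c = (dist <= r).mean(axis=1)   # self-matches included, so c > 0
            return np.log(c).mean()

        return phi(m) - phi(m + 1)

    rng = np.random.default_rng(3)
    regular = np.sin(np.linspace(0, 20 * np.pi, 400))   # regular dynamics: low ApEn
    irregular = rng.normal(size=400)                    # irregular dynamics: high ApEn
    print(approximate_entropy(regular), approximate_entropy(irregular))
    ```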

  20. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    NASA Astrophysics Data System (ADS)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally-increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long term process studies and the current cold regions literature allows for comparison of process representations and importantly, their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner can hopefully derive important insights and aid in development of improved process representations.

  1. Performance Variability as a Predictor of Response to Aphasia Treatment.

    PubMed

    Duncan, E Susan; Schmah, Tanya; Small, Steven L

    2016-10-01

    Performance variability in individuals with aphasia is typically regarded as a nuisance factor complicating assessment and treatment. We present the alternative hypothesis that intraindividual variability represents a fundamental characteristic of an individual's functioning and an important biomarker for therapeutic selection and prognosis. A total of 19 individuals with chronic aphasia participated in a 6-week trial of imitation-based speech therapy. We assessed improvement both on overall language functioning and repetition ability. Furthermore, we determined which pretreatment variables best predicted improvement on the repetition test. Significant gains were made on the Western Aphasia Battery-Revised (WAB) Aphasia Quotient, Cortical Quotient, and 2 subtests as well as on a separate repetition test. Using stepwise regression, we found that pretreatment intraindividual variability was the only predictor of improvement in performance on the repetition test, with greater pretreatment variability predicting greater improvement. Furthermore, the degree of reduction in this variability over the course of treatment was positively correlated with the degree of improvement. Intraindividual variability may be indicative of potential for improvement on a given task, with more uniform performance suggesting functioning at or near peak potential. © The Author(s) 2016.

  2. Replication of Simulated Prebiotic Amphiphilic Vesicles in a Finite Environment Exhibits Complex Behavior That Includes High Progeny Variability and Competition

    PubMed Central

    Armstrong, Don L.; Lancet, Doron

    2018-01-01

    Abstract We studied the simulated replication and growth of prebiotic vesicles composed of 140 phospholipids and cholesterol using our R-GARD (Real Graded Autocatalysis Replication Domain) formalism that utilizes currently extant lipids that have known rate constants of lipid-vesicle interactions from published experimental data. R-GARD normally modifies kinetic parameters of lipid-vesicle interactions based on vesicle composition and properties. Our original R-GARD model tracked the growth and division of one vesicle at a time in an environment with unlimited lipids at a constant concentration. We explore here a modified model where vesicles compete for a finite supply of lipids. We observed that vesicles exhibit complex behavior including initial fast unrestricted growth, followed by intervesicle competition for diminishing resources, then a second growth burst driven by better-adapted vesicles, and ending with a final steady state. Furthermore, in simulations without kinetic parameter modifications (“invariant kinetics”), the initial replication was an order of magnitude slower, and vesicles' composition variability at the final steady state was much lower. The complex kinetic behavior was not observed either in the previously published R-GARD simulations or in additional simulations presented here with only one lipid component. This demonstrates that both a finite environment (inducing selection) and multiple components (providing variation for selection to act upon) are crucial for portraying evolution-like behavior. Such properties can improve survival in a changing environment by increasing the ability of early protocellular entities to respond to rapid environmental fluctuations likely present during abiogenesis both on Earth and possibly on other planets. This in silico simulation predicts that a relatively simple in vitro chemical system containing only lipid molecules might exhibit properties that are relevant to prebiotic processes. Key Words: Phospholipid vesicles—Prebiotic compartments—Prebiotic vesicle competition—Prebiotic vesicle variability. Astrobiology 18, 419–430. PMID:29634319

  3. Risk modeling for ventricular assist device support in post-cardiotomy shock.

    PubMed

    Alsoufi, Bahaaldin; Rao, Vivek; Tang, Augustine; Maganti, Manjula; Cusimano, Robert

    2012-04-01

    Post-cardiotomy shock (PCS) has a complex etiology. Although treatment with inotropes and intra-aortic balloon pump (IABP) support improves cardiac performance, end-organ injuries are common and lead to prolonged ICU stay, extended hospitalization and increased mortality. Early consideration of mechanical circulatory support may prevent such complications and improve outcome. Between January 1997 and January 2002, 321 patients required IABP and inotropic support for PCS following coronary artery bypass grafting (CABG) at our institution. Perioperative variables including age, mixed venous saturation (MVO2), inotropic requirements and LV function were analyzed using multivariate statistical methods. All explanatory variables with a univariate p value <0.10 were entered into a stepwise logistic regression model to predict hospital mortality. Odds ratios from significant variables (p < 0.05) in the regression model were used to compose a risk score. Overall hospital mortality was 16%. The independent risk factors for mortality in this population were: MVO2 < 60% (OR = 3.2), milrinone > 0.5 μg/kg/min (OR = 3.2), age > 75 (OR = 2.7), adrenaline > 0.1 μg/kg/min (OR = 1.5). A 15-point risk score was developed based on the regression model. Hospital mortality in patients with a score >6 was 46% (n = 13/28), for a score of 3-6 it was 31% (n = 9/29), and for a score <3 it was 11% (n = 29/264). A significant proportion of patients with PCS continue to face high mortality despite IABP and inotropic support. Advanced age, heavy inotropic dependency and poor oxygen delivery all predicted increased risk of death. Further investigation is needed to assess whether early institution of VAD support could improve outcome in this high-risk group of patients.
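
    The points-from-odds-ratios construction can be sketched as follows; the data are synthetic and the simulated coefficients are seeded to mimic the reported odds ratios, so this is illustrative rather than a reconstruction of the study's model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(7)
    n = 321
    # Four dichotomized predictors: MVO2<60%, milrinone>0.5, age>75, adrenaline>0.1
    X = rng.integers(0, 2, size=(n, 4))
    true_log_or = np.array([1.16, 1.16, 0.99, 0.41])   # ln(3.2), ln(3.2), ln(2.7), ln(1.5)
    mortality_prob = 1.0 / (1.0 + np.exp(-(-2.2 + X @ true_log_or)))
    y = rng.random(n) < mortality_prob                 # simulated hospital mortality

    model = LogisticRegression().fit(X, y)
    odds_ratios = np.exp(model.coef_[0])
    points = np.round(odds_ratios).astype(int)         # points proportional to the ORs
    scores = X @ points
    print(dict(zip(["MVO2", "milrinone", "age", "adrenaline"], points)))
    print("observed mortality with score > 6:", y[scores > 6].mean())
    ```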

  4. Newborn screening: A disease-changing intervention for glutaric aciduria type 1.

    PubMed

    Boy, Nikolas; Mengler, Katharina; Thimm, Eva; Schiergens, Katharina A; Marquardt, Thorsten; Weinhold, Natalie; Marquardt, Iris; Das, Anibh M; Freisinger, Peter; Grünert, Sarah C; Vossbeck, Judith; Steinfeld, Robert; Baumgartner, Matthias R; Beblo, Skadi; Dieckmann, Andrea; Näke, Andrea; Lindner, Martin; Heringer, Jana; Hoffmann, Georg F; Mühlhausen, Chris; Maier, Esther M; Ensenauer, Regina; Garbade, Sven F; Kölker, Stefan

    2018-05-01

    Untreated individuals with glutaric aciduria type 1 (GA1) commonly present with a complex, predominantly dystonic movement disorder (MD) following acute or insidious onset striatal damage. Implementation of GA1 into newborn screening (NBS) programs has improved the short-term outcome. It remains unclear, however, whether NBS changes the long-term outcome and which variables are predictive. This prospective, observational, multicenter study includes 87 patients identified by NBS, 4 patients missed by NBS, and 3 women with GA1 identified by positive NBS results of their unaffected children. The study population comprises 98.3% of individuals with GA1 identified by NBS in Germany during 1999-2016. Overall, cumulative sensitivity of NBS is 95.6%, but it is lower (84%) for patients with low excreter phenotype. The neurologic outcome of patients missed by NBS is as poor as in the pre-NBS era, and the clinical phenotype of diagnosed patients depends on the quality of therapeutic interventions rather than noninterventional variables. Presymptomatic start of treatment according to current guideline recommendations clearly improves the neurologic outcome (MD: 7% of patients), whereas delayed emergency treatment results in acute onset MD (100%), and deviations from maintenance treatment increase the risk of insidious onset MD (50%). Independent of the neurologic phenotype, kidney function tends to decline with age, a nonneurologic manifestation not predicted by any variable included in this study. NBS is a beneficial, disease-changing intervention for GA1. However, improved neurologic outcome critically depends on adherence to recommended therapy, whereas kidney dysfunction does not appear to be impacted by recommended therapy. Ann Neurol 2018;83:970-979. © 2018 American Neurological Association.

  5. Analytic complexity of functions of two variables

    NASA Astrophysics Data System (ADS)

    Beloshapka, V. K.

    2007-09-01

    The definition of analytic complexity of an analytic function of two variables is given. It is proved that the class of functions of a chosen complexity is a differential-algebraic set. A differential polynomial defining the functions of first class is constructed. An algorithm for obtaining relations defining an arbitrary class is described. Examples of functions are given whose order of complexity is equal to zero, one, two, and infinity. It is shown that the formal order of complexity of the Cardano and Ferrari formulas is significantly higher than their analytic complexity. The complexity classes turn out to be invariant with respect to a certain infinite-dimensional transformation pseudogroup. In this connection, we describe the orbits of the action of this pseudogroup in the jets of orders one, two, and three. The notion of complexity order is extended to plane (or “planar”) 3-webs. It is discovered that webs of complexity order one are the hexagonal webs. Some problems are posed.
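    As a hedged sketch of the inductive construction this notion rests on (the paper's exact formulation may differ in detail), the complexity classes can be written as follows:

```latex
% Cl_0: functions of at most one of the variables.
\[
  Cl_0 = \{\, f(x,y) = a(x) \ \text{or} \ a(y) : a \ \text{analytic} \,\}
\]
% Each further class applies one outer analytic function of one variable
% to a sum of two simpler terms.
\[
  Cl_{n+1} = \{\, f(x,y) = c\bigl(A(x,y) + B(x,y)\bigr) : A, B \in Cl_n,
  \ c \ \text{analytic of one variable} \,\}
\]
% Example: f(x,y) = xy lies in Cl_1, since xy = \exp(\ln x + \ln y).
```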

  6. An outline of graphical Markov models in dentistry.

    PubMed

    Helfenstein, U; Steiner, M; Menghini, G

    1999-12-01

    In the usual multiple regression model there is one response variable and one block of several explanatory variables. In contrast, in reality there may be a block of several possibly interacting response variables one would like to explain. In addition, the explanatory variables may split into a sequence of several blocks, each block containing several interacting variables. The variables in the second block are explained by those in the first block; the variables in the third block by those in the first and the second block etc. During recent years methods have been developed allowing analysis of problems where the data set has the above complex structure. The models involved are called graphical models or graphical Markov models. The main result of an analysis is a picture, a conditional independence graph with precise statistical meaning, consisting of circles representing variables and lines or arrows representing significant conditional associations. The absence of a line between two circles signifies that the corresponding two variables are independent conditional on the presence of other variables in the model. An example from epidemiology is presented in order to demonstrate application and use of the models. The data set in the example has a complex structure consisting of successive blocks: the variable in the first block is year of investigation; the variables in the second block are age and gender; the variables in the third block are indices of calculus, gingivitis and mutans streptococci and the final response variables in the fourth block are different indices of caries. Since the statistical methods may not be easily accessible to dentists, this article presents them in an introductory form. Graphical models may be of great value to dentists in allowing analysis and visualisation of complex structured multivariate data sets consisting of a sequence of blocks of interacting variables and, in particular, several possibly interacting responses in the final block.
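    To make the block structure concrete, the sketch below encodes the dental example as a directed graph using Python and networkx. The variable names are taken from the abstract; the edge set shown is only the set of candidate arrows from earlier to later blocks, since a fitted graphical model would retain just the significant conditional associations.

```python
import networkx as nx

# Blocks of the dental example: variables in later blocks are explained
# by those in earlier blocks (names abbreviated from the abstract).
blocks = [
    ["year"],                                  # block 1
    ["age", "gender"],                         # block 2
    ["calculus", "gingivitis", "mutans"],      # block 3
    ["caries_index_1", "caries_index_2"],      # block 4 (final responses)
]

G = nx.DiGraph()
for b in blocks:
    G.add_nodes_from(b)
# Candidate arrows run only from earlier blocks to later ones; fitting
# would prune these down to the significant conditional associations.
for i, src in enumerate(blocks):
    for later in blocks[i + 1:]:
        G.add_edges_from((u, v) for u in src for v in later)

print(G.number_of_nodes(), "variables,", G.number_of_edges(), "candidate edges")
```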

  7. Videogame training strategy-induced change in brain function during a complex visuomotor task.

    PubMed

    Lee, Hyunkyu; Voss, Michelle W; Prakash, Ruchika Shaurya; Boot, Walter R; Vo, Loan T K; Basak, Chandramallika; Vanpatter, Matt; Gratton, Gabriele; Fabiani, Monica; Kramer, Arthur F

    2012-07-01

    Although changes in brain function induced by cognitive training have been examined, functional plasticity associated with specific training strategies is still relatively unexplored. In this study, we examined changes in brain function during a complex visuomotor task following training using the Space Fortress video game. To assess brain function, participants completed functional magnetic resonance imaging (fMRI) before and after 30 h of training with one of two training regimens: Hybrid Variable-Priority Training (HVT), with a focus on improving specific skills and managing task priority, or Full Emphasis Training (FET), in which participants simply practiced the game to obtain the highest overall score. Control participants received only 6 h of FET. Compared to FET, HVT learners reached higher performance on the game and showed less brain activation in areas related to visuo-spatial attention and goal-directed movement after training. Compared to the control group, HVT exhibited less brain activation in right dorsolateral prefrontal cortex (DLPFC), coupled with greater performance improvement. Region-of-interest analysis revealed that the reduction in brain activation was correlated with improved performance on the task. This study sheds light on the neurobiological mechanisms of improved learning from directed training (HVT) over non-directed training (FET), which is related to visuo-spatial attention and goal-directed motor planning, while separating the practice-based benefit, which is related to executive control and rule management. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Effects of Sex, Strain, and Energy Intake on Hallmarks of Aging in Mice.

    PubMed

    Mitchell, Sarah J; Madrigal-Matute, Julio; Scheibye-Knudsen, Morten; Fang, Evandro; Aon, Miguel; González-Reyes, José A; Cortassa, Sonia; Kaushik, Susmita; Gonzalez-Freire, Marta; Patel, Bindi; Wahl, Devin; Ali, Ahmed; Calvo-Rubio, Miguel; Burón, María I; Guiterrez, Vincent; Ward, Theresa M; Palacios, Hector H; Cai, Huan; Frederick, David W; Hine, Christopher; Broeskamp, Filomena; Habering, Lukas; Dawson, John; Beasley, T Mark; Wan, Junxiang; Ikeno, Yuji; Hubbard, Gene; Becker, Kevin G; Zhang, Yongqing; Bohr, Vilhelm A; Longo, Dan L; Navas, Placido; Ferrucci, Luigi; Sinclair, David A; Cohen, Pinchas; Egan, Josephine M; Mitchell, James R; Baur, Joseph A; Allison, David B; Anson, R Michael; Villalba, José M; Madeo, Frank; Cuervo, Ana Maria; Pearson, Kevin J; Ingram, Donald K; Bernier, Michel; de Cabo, Rafael

    2016-06-14

    Calorie restriction (CR) is the most robust non-genetic intervention to delay aging. However, there are a number of emerging experimental variables that alter CR responses. We investigated the role of sex, strain, and level of CR on health and survival in mice. CR did not always correlate with lifespan extension, although it consistently improved health across strains and sexes. Transcriptional and metabolomics changes driven by CR in liver indicated anaplerotic filling of the Krebs cycle together with fatty acid fueling of mitochondria. CR prevented age-associated decline in the liver proteostasis network while increasing mitochondrial number, preserving mitochondrial ultrastructure and function with age. Abrogation of mitochondrial function negated life-prolonging effects of CR in yeast and worms. Our data illustrate the complexity of CR in the context of aging, with a clear separation of outcomes related to health and survival, highlighting complexities of translation of CR into human interventions. Published by Elsevier Inc.

  9. An artificial neural network improves prediction of observed survival in patients with laryngeal squamous carcinoma.

    PubMed

    Jones, Andrew S; Taktak, Azzam G F; Helliwell, Timothy R; Fenton, John E; Birchall, Martin A; Husband, David J; Fisher, Anthony C

    2006-06-01

    The accepted method of modelling and predicting failure/survival, Cox's proportional hazards model, is theoretically inferior to neural network derived models for analysing highly complex systems with large datasets. We performed a blinded comparison of the neural network versus Cox's model in predicting survival, utilising data from 873 treated patients with laryngeal cancer. These were divided randomly and equally into a training set and a study set, and Cox's and neural network models were applied in turn. Data were then divided into seven sets of binary covariates and the analysis repeated. Overall survival was not significantly different on Kaplan-Meier plot, or with either test model. Although the network produced qualitatively similar results to Cox's model, it was significantly more sensitive to differences in survival curves for age and N stage. We propose that neural networks are capable of prediction in systems involving complex interactions between variables and non-linearity.
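    A minimal sketch of such a comparison on simulated data, using the lifelines package for Cox's model and a small scikit-learn network as a fixed-horizon stand-in for the paper's neural model. The cohort below is synthetic, and censoring is handled naively for brevity; this is not the authors' pipeline.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 873  # same size as the laryngeal cancer cohort; data simulated here
X = pd.DataFrame({"age": rng.normal(65, 10, n),
                  "n_stage": rng.integers(0, 4, n).astype(float)})
hazard = 0.01 * np.exp(0.03 * (X["age"] - 65) + 0.4 * X["n_stage"])
event_time = rng.exponential(1 / hazard)
censor_time = rng.exponential(8, n)
df = X.assign(T=np.minimum(event_time, censor_time),
              E=(event_time <= censor_time).astype(int))

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(cph.summary[["coef", "p"]])

# Fixed-horizon surrogate for the network: classify 5-year survival
# (censoring ignored here for brevity).
y = (df["T"] > 5).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)
print("network accuracy at 5 years:", round(net.score(Xte, yte), 2))
```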

  10. Development of sensor augmented robotic weld systems for aerospace propulsion system fabrication

    NASA Technical Reports Server (NTRS)

    Jones, C. S.; Gangl, K. J.

    1986-01-01

    In order to meet stringent performance goals for power and reusability, the Space Shuttle Main Engine was designed with many complex, difficult welded joints that provide maximum strength and minimum weight. To this end, the SSME requires 370 meters of welded joints. Automation of some welds has improved welding productivity significantly over manual welding. Application has previously been limited by accessibility constraints, requirements for complex process control, low production volumes, high part variability, and stringent quality requirements. Development of robots for welding in this application requires that a unique set of constraints be addressed. This paper shows how robotic welding can enhance production of aerospace components by addressing their specific requirements. A development program at the Marshall Space Flight Center combining industrial robots with state-of-the-art sensor systems and computer simulation is providing technology for the automation of welds in Space Shuttle Main Engine production.

  11. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow one i) to determine natural groups or clusters of control strategies with similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of the complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
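    A hedged Python sketch of the three techniques applied to a made-up evaluation matrix (strategies by criteria); the matrix, criteria, and cluster count below are hypothetical, not the BSM2 outputs used in the paper.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Hypothetical evaluation matrix: 30 control strategies x 8 criteria
# (e.g. effluent quality, aeration energy, sludge production, ...).
E = rng.normal(size=(30, 8))
Z = StandardScaler().fit_transform(E)

clusters = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(Z)  # CA
scores = PCA(n_components=2).fit_transform(Z)                              # PCA/FA
lda = LinearDiscriminantAnalysis().fit(Z, clusters)                        # DA
# |scalings_| ranks criteria by how strongly they separate the clusters.
print(np.abs(lda.scalings_[:, 0]).round(2))
```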

  12. Mapping wildland fuels for fire management across multiple scales: integrating remote sensing, GIS, and biophysical modeling

    USGS Publications Warehouse

    Keane, Robert E.; Burgan, Robert E.; Van Wagtendonk, Jan W.

    2001-01-01

    Fuel maps are essential for computing spatial fire hazard and risk and simulating fire growth and intensity across a landscape. However, fuel mapping is an extremely difficult and complex process requiring expertise in remotely sensed image classification, fire behavior, fuels modeling, ecology, and geographical information systems (GIS). This paper first presents the challenges of mapping fuels: canopy concealment, fuelbed complexity, fuel type diversity, fuel variability, and fuel model generalization. Then, four approaches to mapping fuels are discussed with examples provided from the literature: (1) field reconnaissance; (2) direct mapping methods; (3) indirect mapping methods; and (4) gradient modeling. A fuel mapping method is proposed that uses current remote sensing and image processing technology. Future fuel mapping needs are also discussed which include better field data and fuel models, accurate GIS reference layers, improved satellite imagery, and comprehensive ecosystem models.

  13. A Graph Based Backtracking Algorithm for Solving General CSPs

    NASA Technical Reports Server (NTRS)

    Pang, Wanlin; Goodwin, Scott D.

    2003-01-01

    Many AI tasks can be formalized as constraint satisfaction problems (CSPs), which involve finding values for variables subject to constraints. While solving a CSP is an NP-complete task in general, tractable classes of CSPs have been identified based on the structure of the underlying constraint graphs. Much effort has been spent on exploiting structural properties of the constraint graph to improve the efficiency of finding a solution. These efforts contributed to the development of a class of CSP solving algorithms called decomposition algorithms. The strength of CSP decomposition is that its worst-case complexity depends on the structural properties of the constraint graph and is usually better than the worst-case complexity of search methods. Its practical application is limited, however, since it cannot be applied if the CSP is not decomposable. In this paper, we propose a graph based backtracking algorithm called omega-CDBT, which shares the merits and overcomes the weaknesses of both the decomposition and search approaches.
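    For contrast with the structure-exploiting approach, here is a plain chronological backtracking solver in Python; it is a generic baseline, not the proposed omega-CDBT algorithm, which additionally exploits the decomposition of the constraint graph.

```python
def solve_csp(variables, domains, constraint):
    """Chronological backtracking for a binary CSP.

    constraint(var1, val1, var2, val2) -> bool for every variable pair.
    """
    assignment = {}

    def consistent(var, val):
        return all(constraint(var, val, v, assignment[v]) for v in assignment)

    def backtrack(i):
        if i == len(variables):
            return dict(assignment)
        var = variables[i]
        for val in domains[var]:
            if consistent(var, val):
                assignment[var] = val
                result = backtrack(i + 1)
                if result is not None:
                    return result
                del assignment[var]
        return None  # dead end: undo and try the next value one level up

    return backtrack(0)

# Toy example: color a triangle graph with 3 colors, adjacent nodes differ.
edges = {("A", "B"), ("B", "C"), ("A", "C")}
def neq_if_adjacent(u, x, v, y):
    return x != y if (u, v) in edges or (v, u) in edges else True

print(solve_csp(["A", "B", "C"], {v: ["r", "g", "b"] for v in "ABC"},
                neq_if_adjacent))
```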

  14. Prader-Willi Syndrome: Clinical Aspects

    PubMed Central

    Elena, Grechi; Bruna, Cammarata; Benedetta, Mariani; Stefania, Di Candia; Giuseppe, Chiumello

    2012-01-01

    Prader-Willi Syndrome (PWS) is a complex multisystem genetic disorder that shows great variability, with changing clinical features during a patient's life. The syndrome is due to the loss of expression of several genes encoded on the proximal long arm of chromosome 15 (15q11.2–q13). The complex phenotype is most probably caused by a hypothalamic dysfunction that is responsible for hormonal dysfunctions and for absence of the sense of satiety. For this reason a Prader-Willi (PW) child develops hyperphagia during the initial stage of infancy that can lead to obesity and its complications. During infancy many PW children display a range of behavioural problems that become more noticeable in adolescence and adulthood and interfere mostly with quality of life. Early diagnosis of PWS is important for effective long-term management, and an early multidisciplinary approach is fundamental to improve quality of life, prevent complications, and prolong life expectancy. PMID:23133744

  15. Defining Long-Duration Traverses of Lunar Volcanic Complexes with LROC NAC Images

    NASA Technical Reports Server (NTRS)

    Stopar, J. D.; Lawrence, S. J.; Joliff, B. L.; Speyerer, E. J.; Robinson, M. S.

    2016-01-01

    A long-duration lunar rover [e.g., 1] would be ideal for investigating large volcanic complexes like the Marius Hills (MH) (approximately 300 x 330 km), where widely spaced sampling points are needed to explore the full geologic and compositional variability of the region. Over these distances, a rover would encounter varied surface morphologies (ranging from impact craters to rugged lava shields), each of which needs to be considered during the rover design phase. Previous rovers, including Apollo, Lunokhod, and most recently Yutu, successfully employed pre-mission orbital data for planning (at scales significantly coarser than that of the surface assets). LROC was specifically designed to provide mission-planning observations at scales useful for accurate rover traverse planning (crewed and robotic) [2]. After-the-fact analyses of the planning data can help improve predictions of future rover performance [e.g., 3-5].

  16. Factors associated to acceptable treatment adherence among children with chronic kidney disease in Guatemala

    PubMed Central

    Cerón, Alejandro; Méndez-Alburez, Luis Pablo; Lou-Meda, Randall

    2017-01-01

    Pediatric patients with Chronic Kidney Disease face several barriers to medication adherence that, if addressed, may improve clinical care outcomes. A cross sectional questionnaire was administered in the Foundation for Children with Kidney Disease (FUNDANIER, Guatemala City) from September of 2015 to April of 2016 to identify the predisposing factors, enabling factors and need factors related to medication adherence. Sample size was calculated using simple random sampling with a confidence level of 95%, confidence interval of 0.05 and a proportion of 87%. A total of 103 participants responded to the questionnaire (calculated sample size was 96). Independent variables were defined and described, and the bivariate relationship to dependent variables was determined using odds ratios. Multivariate analysis was carried out using logistic regression. The mean adherence of the study population was 78% (SD 0.08, max = 96%, min = 55%). The mean adherence in transplant patients was 82% (SD 7.8, max 96%, min 63%), and the mean adherence in dialysis patients was 76% (SD 7.8, max 90%, min 55%). Adherence was positively associated with the mother’s educational level and with higher monthly household income. Together, predisposing, enabling and need factors illustrate the complexities surrounding adherence in this pediatric CKD population. Public policy strategies aimed at improving access to comprehensive treatment regimens may facilitate treatment access, alleviating economic strain on caregivers, and may improve adherence outcomes. PMID:29036228
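    The reported design values reproduce the stated sample size only once a finite-population correction is applied; the sketch below works through the arithmetic. The source population size N is not stated in the record, so the value used here is an assumption chosen to match the published figure.

```python
from math import ceil

z, p, e = 1.96, 0.87, 0.05            # design values reported in the record
n0 = z**2 * p * (1 - p) / e**2        # Cochran's formula, infinite population
print(ceil(n0))                        # -> 174

# A finite-population correction n = n0 / (1 + (n0 - 1) / N) brings this
# near the published calculated size of 96 for a clinic population of
# roughly N = 215 patients (N is illustrative, not stated in the record).
N = 215
print(ceil(n0 / (1 + (n0 - 1) / N)))   # -> 97
```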

  17. Comparing and improving reconstruction methods for proxies based on compositional data

    NASA Astrophysics Data System (ADS)

    Nolan, C.; Tipton, J.; Booth, R.; Jackson, S. T.; Hooten, M.

    2017-12-01

    Many types of studies in paleoclimatology and paleoecology involve compositional data. Often, these studies aim to use compositional data to reconstruct an environmental variable of interest; the reconstruction is usually done via the development of a transfer function. Transfer functions have been developed using many different methods. Existing methods tend to relate the compositional data and the reconstruction target in very simple ways. Additionally, the results from different methods are rarely compared. Here we seek to address these two issues. First, we introduce a new hierarchical Bayesian multivariate Gaussian process model; this model allows for the relationship between each species in the compositional dataset and the environmental variable to be modeled in a way that captures the underlying complexities. Then, we compare this new method to machine learning techniques and commonly used existing methods. The comparisons are based on reconstructing the water table depth history of Caribou Bog (an ombrotrophic Sphagnum peat bog in Old Town, Maine, USA) from a new 7500-year-long record of testate amoebae assemblages. The resulting reconstructions from different methods diverge in both their resulting means and uncertainties. In particular, uncertainty tends to be drastically underestimated by some common methods. These results will help to improve inference of water table depth from testate amoebae. Furthermore, this approach can be applied to test and improve inferences of past environmental conditions from a broad array of paleo-proxies based on compositional data.
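    A simplified Python sketch of the transfer-function idea, using a single scikit-learn Gaussian process regression. The study's hierarchical Bayesian model instead models each taxon's response to the environmental variable, so this stand-in only conveys the flavour of GP-based reconstruction; all data below are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
# Hypothetical training set: relative abundances of 5 testate amoeba taxa
# at sites with known water table depth (WTD, cm).
n, taxa = 60, 5
comp = rng.dirichlet(np.ones(taxa), size=n)        # compositional predictors
wtd = 30 * comp[:, 0] - 20 * comp[:, 3] + rng.normal(0, 2, n)

gp = GaussianProcessRegressor(kernel=RBF(0.5) + WhiteKernel(1.0),
                              normalize_y=True).fit(comp, wtd)

# Reconstruct WTD (with uncertainty) for a new fossil assemblage.
new_sample = rng.dirichlet(np.ones(taxa), size=1)
mean, sd = gp.predict(new_sample, return_std=True)
print(f"reconstructed WTD = {mean[0]:.1f} +/- {sd[0]:.1f} cm")
```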

  18. Factors associated to acceptable treatment adherence among children with chronic kidney disease in Guatemala.

    PubMed

    Ramay, Brooke M; Cerón, Alejandro; Méndez-Alburez, Luis Pablo; Lou-Meda, Randall

    2017-01-01

    Pediatric patients with Chronic Kidney Disease face several barriers to medication adherence that, if addressed, may improve clinical care outcomes. A cross sectional questionnaire was administered in the Foundation for Children with Kidney Disease (FUNDANIER, Guatemala City) from September of 2015 to April of 2016 to identify the predisposing factors, enabling factors and need factors related to medication adherence. Sample size was calculated using simple random sampling with a confidence level of 95%, confidence interval of 0.05 and a proportion of 87%. A total of 103 participants responded to the questionnaire (calculated sample size was 96). Independent variables were defined and described, and the bivariate relationship to dependent variables was determined using odds ratios. Multivariate analysis was carried out using logistic regression. The mean adherence of the study population was 78% (SD 0.08, max = 96%, min = 55%). The mean adherence in transplant patients was 82% (SD 7.8, max 96%, min 63%), and the mean adherence in dialysis patients was 76% (SD 7.8, max 90%, min 55%). Adherence was positively associated with the mother's educational level and with higher monthly household income. Together, predisposing, enabling and need factors illustrate the complexities surrounding adherence in this pediatric CKD population. Public policy strategies aimed at improving access to comprehensive treatment regimens may facilitate treatment access, alleviating economic strain on caregivers, and may improve adherence outcomes.

  19. Optimizing ACS NSQIP modeling for evaluation of surgical quality and risk: patient risk adjustment, procedure mix adjustment, shrinkage adjustment, and surgical focus.

    PubMed

    Cohen, Mark E; Ko, Clifford Y; Bilimoria, Karl Y; Zhou, Lynn; Huffman, Kristopher; Wang, Xue; Liu, Yaoming; Kraemer, Kari; Meng, Xiangju; Merkow, Ryan; Chow, Warren; Matel, Brian; Richards, Karen; Hart, Amy J; Dimick, Justin B; Hall, Bruce L

    2013-08-01

    The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) collects detailed clinical data from participating hospitals using standardized data definitions, analyzes these data, and provides participating hospitals with reports that permit risk-adjusted comparisons with a surgical quality standard. Since its inception, the ACS NSQIP has worked to refine surgical outcomes measurements and enhance statistical methods to improve the reliability and validity of this hospital profiling. From an original focus on controlling for between-hospital differences in patient risk factors with logistic regression, ACS NSQIP has added a variable to better adjust for the complexity and risk profile of surgical procedures (procedure mix adjustment) and stabilized estimates derived from small samples by using a hierarchical model with shrinkage adjustment. New models have been developed focusing on specific surgical procedures (eg, "Procedure Targeted" models), which provide opportunities to incorporate indication and other procedure-specific variables and outcomes to improve risk adjustment. In addition, comparative benchmark reports given to participating hospitals have been expanded considerably to allow more detailed evaluations of performance. Finally, procedures have been developed to estimate surgical risk for individual patients. This article describes the development of, and justification for, these new statistical methods and reporting strategies in ACS NSQIP. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
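    A hedged sketch of the shrinkage-adjustment idea on synthetic hospitals: empirical-Bayes (gamma-Poisson) shrinkage pulls small-sample observed/expected ratios toward the overall mean. The prior strength below is an illustrative constant; ACS NSQIP estimates the analogous quantity within a hierarchical model.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical hospitals: expected event counts come from a patient-level
# risk model; observed counts vary with volume and latent quality.
expected = rng.uniform(2, 80, size=10)
true_quality = rng.gamma(20, 1 / 20, size=10)   # latent O/E, centered on 1
observed = rng.poisson(true_quality * expected)

raw_oe = observed / expected
# Shrinkage toward the overall mean, weighted by hospital volume: small
# hospitals are pulled strongly toward the average, stabilizing estimates
# derived from small samples; large hospitals barely move.
k, m = 20.0, raw_oe.mean()
shrunk_oe = (observed + k * m) / (expected + k)
for e, r, s in zip(expected, raw_oe, shrunk_oe):
    print(f"expected={e:5.1f}  raw O/E={r:4.2f}  shrunk O/E={s:4.2f}")
```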

  20. Individuals with Type 1 and Type 2 Diabetes Mellitus Trade Increased Hyperglycemia for Decreased Hypoglycemia When Glycemic Variability is not Improved.

    PubMed

    Jangam, Sujit R; Hayter, Gary; Dunn, Timothy C

    2018-02-01

    Glycemic variability refers to oscillations in blood glucose within a day and differences in blood glucose at the same time on different days. Glycemic variability is linked to hypoglycemia and hyperglycemia. The relationship among these three important metrics is examined here, specifically to show how reduction in both hypo- and hyperglycemia risk is dependent on changes in variability. To understand the importance of glycemic variability in the simultaneous reduction of hypoglycemia and hyperglycemia risk, we introduce the glycemic risk plot-estimated HbA1c % (eA1c) vs. minutes below 70 mg/dl (MB70) with constant variability contours for predicting post-intervention risks in the absence of a change in glycemic variability. The glycemic risk plot illustrates that individuals who do not reduce glycemic variability improve one of the two metrics (hypoglycemia risk or hyperglycemia risk) at the cost of the other. It is important to reduce variability to improve both risks. These results were confirmed by data collected in a randomized controlled trial consisting of individuals with type 1 and type 2 diabetes on insulin therapy. For type 1, a total of 28 individuals out of 35 (80%) showed improvement in at least one of the risks (hypo and/or hyper) during the 100-day course of the study. Seven individuals (20%) showed improvement in both. Similar data were observed for type 2 where a total of 36 individuals out of 43 (84%) showed improvement in at least one risk and 8 individuals (19%) showed improvement in both. All individuals in the study who showed improvement in both hypoglycemia and hyperglycemia risk also showed a reduction in variability. Therapy changes intended to improve an individual's hypoglycemia or hyperglycemia risk often result in the reduction of one risk at the expense of another. It is important to improve glucose variability to reduce both risks or at least maintain one risk while reducing the other. Abbott Diabetes Care.
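    A small Python sketch of the two plot axes computed from a CGM trace. The eA1c here uses the ADAG regression of Nathan et al. (2008) as an assumed stand-in for the abstract's definition, and the traces are synthetic, chosen to show movement along a constant-variability contour.

```python
import numpy as np

def glycemic_risk_point(glucose_mg_dl, minutes_per_sample=5):
    """Map a CGM trace onto the axes of a glycemic risk plot."""
    g = np.asarray(glucose_mg_dl, dtype=float)
    ea1c = (g.mean() + 46.7) / 28.7                  # estimated HbA1c, %
    mb70 = int((g < 70).sum()) * minutes_per_sample  # minutes below 70 mg/dl
    cv = g.std() / g.mean()                          # glycemic variability
    return round(ea1c, 2), mb70, round(cv, 2)

# Two hypothetical 24 h traces with identical variability: lowering the
# mean trades hyperglycemia (lower eA1c) for hypoglycemia (higher MB70),
# i.e. movement along a constant-variability contour.
t = np.linspace(0, 24, 288, endpoint=False)
swing = 60 * np.sin(2 * np.pi * t / 24)
print(glycemic_risk_point(150 + swing))  # higher eA1c, no time below 70
print(glycemic_risk_point(120 + swing))  # lower eA1c, some time below 70
```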

  1. The effect of muscle fatigue and low back pain on lumbar movement variability and complexity.

    PubMed

    Bauer, C M; Rast, F M; Ernst, M J; Meichtry, A; Kool, J; Rissanen, S M; Suni, J H; Kankaanpää, M

    2017-04-01

    Changes in movement variability and complexity may reflect an adaptation strategy to fatigue. One unresolved question is whether this adaptation is hampered by the presence of low back pain (LBP). This study investigated if changes in movement variability and complexity after fatigue are influenced by the presence of LBP. It is hypothesised that pain free people and people suffering from LBP differ in their response to fatigue. The effect of an isometric endurance test on lumbar movement was tested in 27 pain free participants and 59 participants suffering from LBP. Movement variability and complexity were quantified with %determinism and sample entropy of lumbar angular displacement and velocity. Generalized linear models were fitted for each outcome. Bayesian estimation of the group-fatigue effect with 95% highest posterior density intervals (95%HPDI) was performed. After fatiguing, %determinism decreased and sample entropy increased in the pain free group, compared to the LBP group. The corresponding group-fatigue effects were 3.7 (95%HPDI: 2.3-7.1) and -1.4 (95%HPDI: -2.7 to -0.1). These effects manifested in angular velocity, but not in angular displacement. The effects indicate that pain free participants showed more complex and less predictable lumbar movement with a lower degree of structure in its variability following fatigue, while participants suffering from LBP did not. This may reflect a physiological response to avoid overload of fatigued tissue and increase endurance, or a consequence of reduced movement control caused by fatigue. Copyright © 2017 Elsevier Ltd. All rights reserved.
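    A compact Python implementation of sample entropy (after Richman and Moorman) for readers who want to reproduce the complexity measure; the window and tolerance defaults below are the common choices, not necessarily the study's settings, and the template counting is a compact variant of the reference algorithm.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series.

    Lower values indicate more regular (less complex) movement; the study
    applies this kind of measure to lumbar angular velocity.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()     # common tolerance: 20% of the series SD
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to all later templates (no self-matches).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(4)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = regular + rng.normal(0, 0.5, 500)
print(sample_entropy(regular), sample_entropy(noisy))  # noisy series is higher
```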

  2. Modelling the meteorological forest fire niche in heterogeneous pyrologic conditions.

    PubMed

    De Angelis, Antonella; Ricotta, Carlo; Conedera, Marco; Pezzatti, Gianni Boris

    2015-01-01

    Fire regimes are strongly related to weather conditions that directly and indirectly influence fire ignition and propagation. Identifying the most important meteorological fire drivers is thus fundamental for daily fire risk forecasting. In this context, several fire weather indices have been developed focussing mainly on fire-related local weather conditions and fuel characteristics. The specificity of the conditions for which fire danger indices are developed makes their direct transfer and applicability to different areas or other fuel types problematic. In this paper we used the low-to-intermediate fire-prone region of Canton Ticino as a case study to develop a new daily fire danger index by implementing a niche modelling approach (Maxent). In order to identify the most suitable weather conditions for fires, different combinations of input variables were tested (meteorological variables, existing fire danger indices or a combination of both). Our findings demonstrate that such combinations of input variables increase the predictive power of the resulting index; surprisingly, even using meteorological variables alone allows similar or better performance than using the complex Canadian Fire Weather Index (FWI). Furthermore, the niche modelling approach based on Maxent resulted in slightly improved model performance and in a reduced number of selected variables with respect to the classical logistic approach. Factors influencing final model robustness were the number of fire events considered and the specificity of the meteorological conditions leading to fire ignition.
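    Maxent on presence-background data is closely related to penalized logistic regression, so a rough Python stand-in can convey the modelling idea; the weather variables, distributions, and values below are invented, and this is not the Maxent software used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
# Hypothetical daily weather on fire days (presence) vs random days
# (background): temperature, relative humidity, wind, days since rain.
n_fire, n_bg = 200, 2000
fire = np.column_stack([rng.normal(27, 4, n_fire), rng.normal(35, 10, n_fire),
                        rng.normal(20, 8, n_fire), rng.exponential(9, n_fire)])
bg = np.column_stack([rng.normal(15, 8, n_bg), rng.normal(65, 15, n_bg),
                      rng.normal(10, 6, n_bg), rng.exponential(3, n_bg)])
X = np.vstack([fire, bg])
y = np.r_[np.ones(n_fire), np.zeros(n_bg)]

# Penalized logistic regression on presence vs background as a simple
# proxy for a Maxent-style daily fire danger index.
model = LogisticRegression(C=0.5, max_iter=1000).fit(X, y)
today = [[30.0, 25.0, 25.0, 14.0]]  # hot, dry, windy, long drought
print("relative fire danger:", model.predict_proba(today)[0, 1])
```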

  3. Neural Network Machine Learning and Dimension Reduction for Data Visualization

    NASA Technical Reports Server (NTRS)

    Liles, Charles A.

    2014-01-01

    Neural network machine learning in computer science is a continuously developing field of study. Although neural network models have been developed which can accurately predict a numeric value or nominal classification, a general-purpose method for constructing neural network architecture has yet to be developed. Computer scientists are often forced to rely on a trial-and-error process of developing and improving accurate neural network models. In many cases, models are constructed from a large number of input parameters. It is often difficult to surmise which input parameters have the greatest impact on the prediction of the model, especially when the number of input variables is very high. This challenge is often labeled the "curse of dimensionality" in scientific fields. However, techniques exist for reducing the dimensionality of problems to just two dimensions. Once a problem's dimensions have been mapped to two dimensions, it can be easily plotted and understood by humans. The ability to visualize a multi-dimensional dataset can provide a means of identifying which input variables have the highest effect on determining a nominal or numeric output. Identifying these variables can provide a better means of training neural network models; models can be more easily and quickly trained using only input variables which appear to affect the outcome variable. The purpose of this project is to explore varying means of training neural networks and to utilize dimensional reduction for visualizing and understanding complex datasets.
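    A minimal Python example of reducing a high-dimensional dataset to two dimensions for plotting, using PCA and t-SNE on a standard 64-dimensional dataset; the project's own data and choice of technique may well differ.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# Reduce a 64-dimensional dataset to 2-D so it can be scatter-plotted and
# inspected for which inputs drive the output classes.
X, y = load_digits(return_X_y=True)
X2_pca = PCA(n_components=2).fit_transform(X)
X2_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
print(X2_pca.shape, X2_tsne.shape)  # both (1797, 2): ready for plotting
```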

  4. Modelling the Meteorological Forest Fire Niche in Heterogeneous Pyrologic Conditions

    PubMed Central

    De Angelis, Antonella; Ricotta, Carlo; Conedera, Marco; Pezzatti, Gianni Boris

    2015-01-01

    Fire regimes are strongly related to weather conditions that directly and indirectly influence fire ignition and propagation. Identifying the most important meteorological fire drivers is thus fundamental for daily fire risk forecasting. In this context, several fire weather indices have been developed focussing mainly on fire-related local weather conditions and fuel characteristics. The specificity of the conditions for which fire danger indices are developed makes their direct transfer and applicability to different areas or other fuel types problematic. In this paper we used the low-to-intermediate fire-prone region of Canton Ticino as a case study to develop a new daily fire danger index by implementing a niche modelling approach (Maxent). In order to identify the most suitable weather conditions for fires, different combinations of input variables were tested (meteorological variables, existing fire danger indices or a combination of both). Our findings demonstrate that such combinations of input variables increase the predictive power of the resulting index; surprisingly, even using meteorological variables alone allows similar or better performance than using the complex Canadian Fire Weather Index (FWI). Furthermore, the niche modelling approach based on Maxent resulted in slightly improved model performance and in a reduced number of selected variables with respect to the classical logistic approach. Factors influencing final model robustness were the number of fire events considered and the specificity of the meteorological conditions leading to fire ignition. PMID:25679957

  5. Rivastigmine for gait stability in patients with Parkinson's disease (ReSPonD): a randomised, double-blind, placebo-controlled, phase 2 trial.

    PubMed

    Henderson, Emily J; Lord, Stephen R; Brodie, Matthew A; Gaunt, Daisy M; Lawrence, Andrew D; Close, Jacqueline C T; Whone, A L; Ben-Shlomo, Y

    2016-03-01

    Falls are a frequent and serious complication of Parkinson's disease and are related partly to an underlying cholinergic deficit that contributes to gait and cognitive dysfunction in these patients. Gait dysfunction can lead to an increased variability of gait from one step to another, raising the likelihood of falls. In the ReSPonD trial we aimed to assess whether ameliorating this cholinergic deficit with the acetylcholinesterase inhibitor rivastigmine would reduce gait variability. We did this randomised, double-blind, placebo-controlled, phase 2 trial at the North Bristol NHS Trust Hospital, Bristol, UK, in patients with Parkinson's disease recruited from community and hospital settings in the UK. We included patients who had fallen at least once in the year before enrolment, were able to walk 18 m without an aid, had no previous exposure to an acetylcholinesterase inhibitor, and did not have dementia. Our clinical trials unit randomly assigned (1:1) patients to oral rivastigmine or placebo capsules (both taken twice a day) using a computer-generated randomisation sequence and web-based allocation. Rivastigmine was uptitrated from 3 mg per day to the target dose of 12 mg per day over 12 weeks. Both the trial team and patients were masked to treatment allocation. Masking was achieved with matched placebo capsules and a dummy uptitration schedule. The primary endpoint was difference in step time variability between the two groups at 32 weeks, adjusted for baseline age, cognition, step time variability, and number of falls in the previous year. We measured step time variability with a triaxial accelerometer during an 18 m walking task in three conditions: normal walking, simple dual task with phonemic verbal fluency (walking while naming words beginning with a single letter), and complex dual task switching with phonemic verbal fluency (walking while naming words, alternating between two letters of the alphabet). Analysis was by modified intention to treat; we excluded from the primary analysis patients who withdrew, died, or did not attend the 32 week assessment. This trial is registered with ISRCTN, number 19880883. Between Oct 4, 2012 and March 28, 2013, we enrolled 130 patients and randomly assigned 65 to the rivastigmine group and 65 to the placebo group. At week 32, compared with patients assigned to placebo (59 assessed), those assigned to rivastigmine (55 assessed) had improved step time variability for normal walking (ratio of geometric means 0.72, 95% CI 0.58-0.88; p=0.002) and the simple dual task (0.79; 0.62-0.99; p=0.045). Improvements in step time variability for the complex dual task did not differ between groups (0.81, 0.60-1.09; p=0.17). Gastrointestinal side-effects were more common in the rivastigmine group than in the placebo group (p<0.0001); 20 (31%) patients in the rivastigmine group versus three (5%) in the placebo group had nausea and 15 (17%) versus three (5%) had vomiting. Rivastigmine can improve gait stability and might reduce the frequency of falls. A phase 3 study is needed to confirm these findings and show cost-effectiveness of rivastigmine treatment. Parkinson's UK. Copyright © 2016 Henderson et al. Open Access article distributed under the terms of CC BY. Published by Elsevier Ltd. All rights reserved.
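    The primary endpoint, a ratio of geometric means of step time variability, is straightforward to compute; the sketch below does so on synthetic per-participant values (the group sizes match the week-32 assessments, but the numbers themselves are invented).

```python
import numpy as np

def ratio_of_geometric_means(treated, control):
    """exp of the difference in mean log values; a ratio < 1 means lower
    step time variability in the treated group, matching the trial's
    primary endpoint (e.g. 0.72 for normal walking)."""
    return float(np.exp(np.log(treated).mean() - np.log(control).mean()))

rng = np.random.default_rng(6)
# Hypothetical per-participant step time variability (s); group sizes
# match the week-32 assessments (55 rivastigmine, 59 placebo).
riva = rng.lognormal(mean=np.log(0.030), sigma=0.4, size=55)
plac = rng.lognormal(mean=np.log(0.040), sigma=0.4, size=59)
print(round(ratio_of_geometric_means(riva, plac), 2))  # ~0.75 in expectation
```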

  6. Domestication to Crop Improvement: Genetic Resources for Sorghum and Saccharum (Andropogoneae)

    PubMed Central

    Dillon, Sally L.; Shapter, Frances M.; Henry, Robert J.; Cordeiro, Giovanni; Izquierdo, Liz; Lee, L. Slade

    2007-01-01

    Background: Both sorghum (Sorghum bicolor) and sugarcane (Saccharum officinarum) are members of the Andropogoneae tribe in the Poaceae and are each other's closest relatives amongst cultivated plants. Both are relatively recent domesticates and comparatively little of the genetic potential of these taxa and their wild relatives has been captured by breeding programmes to date. This review assesses the genetic gains made by plant breeders since domestication and the progress in the characterization of genetic resources and their utilization in crop improvement for these two related species. Genetic Resources: The genome of sorghum has recently been sequenced providing a great boost to our knowledge of the evolution of grass genomes and the wealth of diversity within S. bicolor taxa. Molecular analysis of the Sorghum genus has identified close relatives of S. bicolor with novel traits, endosperm structure and composition that may be used to expand the cultivated gene pool. Mutant populations (including TILLING populations) provide a useful addition to genetic resources for this species. Sugarcane is a complex polyploid with a large and variable number of copies of each gene. The wild relatives of sugarcane represent a reservoir of genetic diversity for use in sugarcane improvement. Techniques for quantitative molecular analysis of gene or allele copy number in this genetically complex crop have been developed. SNP discovery and mapping in sugarcane has been advanced by the development of high-throughput techniques for ecoTILLING in sugarcane. Genetic linkage maps of the sugarcane genome are being improved for use in breeding selection. The improvement of both sorghum and sugarcane will be accelerated by the incorporation of more diverse germplasm into the domesticated gene pools using molecular tools and the improved knowledge of these genomes. PMID:17766842

  7. Decade-long bird community response to the spatial pattern of variable retention harvesting in red pine (Pinus resinosa) forests

    Treesearch

    Eddie L. Shea; Lisa A. Schulte; Brian J. Palik

    2017-01-01

    Structural complexity is widely recognized as an inherent characteristic of unmanaged forests critical to their function and resilience, but often reduced in their managed counterparts. Variable retention harvesting (VRH) has been proposed as a way to restore or enhance structural complexity in managed forests, and thereby sustain attendant biodiversity and ecosystem...

  8. Analysis and Design of Complex Network Environments

    DTIC Science & Technology

    2014-02-01

    …entanglements among unmeasured variables. This “potential entanglement” type of network complexity is previously unaddressed in the literature, yet it … Appreciating the power of structural representations that allow for potential entanglement among unmeasured variables to simplify network inference problems … rely on the idea of subsystems and allows for potential entanglement among unmeasured states. As a result, inferring a system’s signal structure …

  9. Spatial variability of shortwave radiative fluxes in the context of snowmelt

    NASA Astrophysics Data System (ADS)

    Pinker, Rachel T.; Ma, Yingtao; Hinkelman, Laura; Lundquist, Jessica

    2014-05-01

    Snow-covered mountain ranges are a major source of water supply for run-off and groundwater recharge. Snowmelt supplies as much as 75% of surface water in basins of the western United States. Factors that affect the rate of snow melt include incoming shortwave and longwave radiation, surface albedo, snow emissivity, snow surface temperature, sensible and latent heat fluxes, ground heat flux, and energy transferred to the snowpack from deposited snow or rain. The net radiation generally makes up about 80% of the energy balance and is dominated by the shortwave radiation. Complex terrain poses a great challenge for obtaining the needed information on radiative fluxes from satellites due to elevation issues, spatially-variable cloud cover, rapidly changing surface conditions during snow fall and snow melt, lack of high quality ground truth for evaluation of the satellite based estimates, as well as scale issues between the ground observations and the satellite footprint. In this study we utilize observations of high spatial resolution (5-km) as available from the Moderate Resolution Imaging Spectro-radiometer (MODIS) to derive surface shortwave radiative fluxes in complex terrain, with attention to the impact of slopes on the amount of radiation received. The methodology developed has been applied to several water years (January to July during 2003, 2004, 2005 and 2009) over the western part of the United States, and the available information was used to derive metrics on spatial and temporal variability in the shortwave fluxes. It is planned to apply the findings from this study for testing improvements in Snow Water Equivalent (SWE) estimates.
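    The slope effect the study accounts for can be sketched with standard incidence-angle geometry; the Python function below computes direct-beam shortwave on a tilted surface and is a simplification for illustration, not the authors' full satellite retrieval.

```python
import numpy as np

def direct_on_slope(dni, zenith_deg, sun_az_deg, slope_deg, aspect_deg):
    """Direct-beam shortwave received by a tilted surface (W m-2)."""
    z, a = np.radians(zenith_deg), np.radians(sun_az_deg)
    s, asp = np.radians(slope_deg), np.radians(aspect_deg)
    # Cosine of the angle between the sun and the slope normal.
    cos_inc = np.cos(s) * np.cos(z) + np.sin(s) * np.sin(z) * np.cos(a - asp)
    return dni * np.clip(cos_inc, 0.0, None)   # self-shaded slopes get zero

# Morning sun from the southeast: a SE-facing slope receives far more beam
# radiation than an NW-facing one at the same instant, driving spatial
# variability in melt energy.
print(direct_on_slope(800, zenith_deg=60, sun_az_deg=135,
                      slope_deg=30, aspect_deg=135))   # sun-facing: ~693
print(direct_on_slope(800, zenith_deg=60, sun_az_deg=135,
                      slope_deg=30, aspect_deg=315))   # opposite aspect: 0
```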

  10. The Impact of Interactivity on Comprehending 2D and 3D Visualizations of Movement Data.

    PubMed

    Amini, Fereshteh; Rufiange, Sebastien; Hossain, Zahid; Ventura, Quentin; Irani, Pourang; McGuffin, Michael J

    2015-01-01

    GPS, RFID, and other technologies have made it increasingly common to track the positions of people and objects over time as they move through two-dimensional spaces. Visualizing such spatio-temporal movement data is challenging because each person or object involves three variables (two spatial variables as a function of the time variable), and simply plotting the data on a 2D geographic map can result in overplotting and occlusion that hides details. This also makes it difficult to understand correlations between space and time. Software such as GeoTime can display such data with a three-dimensional visualization, where the 3rd dimension is used for time. This allows for the disambiguation of spatially overlapping trajectories, and in theory, should make the data clearer. However, previous experimental comparisons of 2D and 3D visualizations have so far found little advantage in 3D visualizations, possibly due to the increased complexity of navigating and understanding a 3D view. We present a new controlled experimental comparison of 2D and 3D visualizations, involving commonly performed tasks that have not been tested before, and find advantages in 3D visualizations for more complex tasks. In particular, we tease out the effects of various basic interactions and find that the 2D view relies significantly on "scrubbing" the timeline, whereas the 3D view relies mainly on 3D camera navigation. Our work helps to improve understanding of 2D and 3D visualizations of spatio-temporal data, particularly with respect to interactivity.

  11. Cognitive tasks promote automatization of postural control in young and older adults.

    PubMed

    Potvin-Desrochers, Alexandra; Richer, Natalie; Lajoie, Yves

    2017-09-01

    Researchers looking at the effects of performing a concurrent cognitive task on postural control in young and older adults using traditional center-of-pressure measures and complexity measures found discordant results. Results of experiments showing improvements of stability have suggested the use of strategies such as automatization of postural control or a stiffening strategy. This experiment aimed to confirm in healthy young and older adults that performing a cognitive task while standing leads to improvements that are due to automaticity of sway by using sample entropy. Twenty-one young adults and twenty-five older adults were asked to stand on a force platform while performing a cognitive task. There were four cognitive tasks: simple reaction time, go/no-go reaction time, equation and occurrence of a digit in a number sequence. Results demonstrated decreased sway area and variability as well as increased sample entropy for both groups when performing a cognitive task. Results suggest that performing a concurrent cognitive task promotes the adoption of automatic postural control in young and older adults, as evidenced by increased postural stability and postural sway complexity. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Development of an Enhanced Metaproteomic Approach for Deepening the Microbiome Characterization of the Human Infant Gut

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Weili; Giannone, Richard J.; Morowitz, Michael J.

    The early-life microbiota establishment in the human infant gut is highly variable and plays a crucial role in the maturation of host nutrition and immunity. While high-performance mass spectrometry (MS)-based metaproteomics is a powerful method for the functional characterization of complex microbial communities, the construction of comprehensive metaproteomic information in human fecal samples is inhibited by the presence of abundant human proteins. To alleviate this restriction, we have designed a novel metaproteomic strategy based on Double Filtering (DF) to enhance microbial protein characterization in complex fecal samples from healthy premature infants. We improved the overall depth of infant gut proteome measurement, with an increase in the number of identified low abundance proteins, and observed greater than twofold improvement in metrics for microbial protein identifications and quantifications, with a relatively high rank correlation to control. We further showed the substantial enhancement of this approach for extensively interpreting microbial functional categories between infants by affording more detailed and confident identified categories. This approach provided an avenue for in-depth measurement of the microbial component of infant fecal samples and thus comprehensive characterization of infant gut microbiome functionality.

  13. Two-step sensitivity testing of parametrized and regionalized life cycle assessments: methodology and case study.

    PubMed

    Mutel, Christopher L; de Baan, Laura; Hellweg, Stefanie

    2013-06-04

    Comprehensive sensitivity analysis is a significant tool to interpret and improve life cycle assessment (LCA) models, but is rarely performed. Sensitivity analysis will increase in importance as inventory databases become regionalized, increasing the number of system parameters, and parametrized, adding complexity through variables and nonlinear formulas. We propose and implement a new two-step approach to sensitivity analysis. First, we identify parameters with high global sensitivities for further examination and analysis with a screening step, the method of elementary effects. Second, the more computationally intensive contribution to variance test is used to quantify the relative importance of these parameters. The two-step sensitivity test is illustrated on a regionalized, nonlinear case study of the biodiversity impacts from land use of cocoa production, including a worldwide cocoa products trade model. Our simplified trade model can be used for transformable commodities where one is assessing market shares that vary over time. In the case study, the highly uncertain characterization factors for the Ivory Coast and Ghana contributed more than 50% of variance for almost all countries and years examined. The two-step sensitivity test allows for the interpretation, understanding, and improvement of large, complex, and nonlinear LCA systems.
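    A compact Python implementation of the screening step, the Morris method of elementary effects, on a toy model; the trajectory count, step size, and test function are illustrative choices, not the case study's configuration.

```python
import numpy as np

def elementary_effects(f, k, r=10, delta=0.1, seed=0):
    """Morris screening: r randomized one-at-a-time trajectories in [0,1]^k.

    Returns mu* (mean |EE| per parameter: overall influence) and sigma
    (spread of EEs: nonlinearity or interaction).
    """
    rng = np.random.default_rng(seed)
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0, 1 - delta, size=k)
        for i in rng.permutation(k):        # perturb each input once
            x2 = x.copy()
            x2[i] += delta
            ee[t, i] = (f(x2) - f(x)) / delta
            x = x2
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# Toy model: x0 strong linear, x1 weak, x2 acts only through x0.
f = lambda x: 5 * x[0] + 0.5 * x[1] + 4 * x[0] * x[2]
mu_star, sigma = elementary_effects(f, k=3)
print(mu_star.round(2), sigma.round(2))  # x0 largest mu*; x2 large sigma
```

The high-mu* parameters identified here would then go forward to the more expensive contribution-to-variance step described in the abstract.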

  14. [Geographical distribution of family physicians: which solutions for a complex problem?].

    PubMed

    Touati, Nassera

    2013-01-01

    This article examines the geographical distribution of family physicians, focusing on attraction issues. This analysis is based on a configurational approach. In simple terms, this approach stipulates that the impacts of an intervention are related, on the one hand, to the internal consistency between the characteristics of the intervention and, on the other hand, to the consistency between the intervention and its context. A longitudinal case study was performed, corresponding to the Quebec experience over a 35-year period. The measures implemented essentially consisted of training, incentives (positive and negative), support, and, since 2004, a certain degree of coercion. Note that selection of applicants for medical training programmes according to certain individual variables likely to have an impact on the subsequent site of practice was only rarely used. An improvement of the efficacy of the combination of measures was observed over time: this improvement can be interpreted in terms of the consistency between the characteristics of the intervention and the consistency between the intervention and its context.

  15. Development of an Enhanced Metaproteomic Approach for Deepening the Microbiome Characterization of the Human Infant Gut

    DOE PAGES

    Xiong, Weili; Giannone, Richard J.; Morowitz, Michael J.; ...

    2014-10-28

    The early-life microbiota establishment in the human infant gut is highly variable and plays a crucial role in the maturation of host nutrition and immunity. While high-performance mass spectrometry (MS)-based metaproteomics is a powerful method for the functional characterization of complex microbial communities, the construction of comprehensive metaproteomic information in human fecal samples is inhibited by the presence of abundant human proteins. To alleviate this restriction, we have designed a novel metaproteomic strategy based on Double Filtering (DF) to enhance microbial protein characterization in complex fecal samples from healthy premature infants. We improved the overall depth of infant gut proteome measurement, with an increase in the number of identified low abundance proteins, and observed greater than twofold improvement in metrics for microbial protein identifications and quantifications, with a relatively high rank correlation to control. We further showed the substantial enhancement of this approach for extensively interpreting microbial functional categories between infants by affording more detailed and confident identified categories. This approach provided an avenue for in-depth measurement of the microbial component of infant fecal samples and thus comprehensive characterization of infant gut microbiome functionality.

  16. Sleep Telemedicine: An Emerging Field's Latest Frontier.

    PubMed

    Zia, Subaila; Fields, Barry G

    2016-06-01

    There is a widening gap between sleep provider access and patient demand for it. An American Academy of Sleep Medicine position paper recently recognized sleep telemedicine as one tool to narrow that divide. We define the term sleep telemedicine as the use of sleep-related medical information exchanged from one site to another via electronic communications to improve a patient's health. Applicable data transfer methods include telephone, video, smartphone applications, and the Internet. Their usefulness for the treatment of insomnia and sleep-disordered breathing is highlighted. Sleep telemedicine programs range in complexity from telephone-based patient feedback systems to comprehensive treatment pathways incorporating real-time video, telephone, and the Internet. While large, randomized trials are lacking, smaller studies comparing telemedicine with in-person care suggest noninferiority in terms of patient satisfaction, adherence to treatment, and symptomatic improvement. Sleep telemedicine is feasible from a technological and quality-driven perspective, but cost uncertainties, complex reimbursement structures, and variable licensing rules remain significant challenges to its feasibility on a larger scale. As legislative reform pends, larger randomized trials are needed to elucidate impact on patient outcomes, cost, and health-care system accessibility. Published by Elsevier Inc.

  17. Development of a novel virtual reality gait intervention.

    PubMed

    Boone, Anna E; Foreman, Matthew H; Engsberg, Jack R

    2017-02-01

    Improving gait speed and kinematics can be a time-consuming and tiresome process. We hypothesize that incorporating virtual reality videogame play into gait-variable improvement goals will improve levels of enjoyment and motivation and lead to improved gait performance. To develop a feasible, engaging VR gait intervention for improving gait variables. Completing this investigation involved four steps: 1) identify gait variables that could be manipulated to improve gait speed and kinematics using the Microsoft Kinect and free software, 2) identify free internet videogames that could successfully manipulate the chosen gait variables, 3) experimentally evaluate the ability of the videogames and software to manipulate the gait variables, and 4) evaluate the enjoyment and motivation from a small sample of persons without disability. The Kinect sensor was able to detect stride length, cadence, and joint angles. FAAST software was able to identify predetermined gait variable thresholds and use the thresholds to play free online videogames. Videogames that involved continuous pressing of a keyboard key were found to be most appropriate for manipulating the gait variables. Five participants without disability evaluated the effectiveness for modifying the gait variables and enjoyment and motivation during play. Participants were able to modify gait variables to permit successful videogame play. Motivation and enjoyment were high. A clinically feasible and engaging virtual intervention for improving gait speed and kinematics has been developed and initially tested. It may provide an engaging avenue for achieving the thousands of repetitions necessary for neural plastic changes and improved gait. Copyright © 2016 Elsevier B.V. All rights reserved.
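    A minimal Python stand-in for the FAAST-style mapping the authors describe: when a Kinect-derived gait variable crosses a therapist-set threshold, an event (here simply printed) would drive a key press in the browser game. The threshold and stride values are hypothetical.

```python
def gait_event_stream(stride_lengths_m, threshold_m=0.55):
    """Yield a key-press event for every stride exceeding the threshold,
    mimicking a threshold-to-keystroke mapping."""
    for i, stride in enumerate(stride_lengths_m):
        if stride >= threshold_m:
            yield f"stride {i}: length {stride:.2f} m -> press SPACE"

strides = [0.42, 0.51, 0.58, 0.61, 0.49, 0.63]  # hypothetical Kinect output
for event in gait_event_stream(strides):
    print(event)
```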

  18. Improving the assessment of predator functional responses by considering alternate prey and predator interactions.

    PubMed

    Chan, K; Boutin, S; Hossie, T J; Krebs, C J; O'Donoghue, M; Murray, D L

    2017-07-01

    To improve understanding of the complex and variable patterns of predator foraging behavior in natural systems, it is critical to determine how density-dependent predation and predator hunting success are mediated by alternate prey or predator interference. Despite considerable theory and debate seeking to place predator-prey interactions in a more realistic context, few empirical studies have quantified the role of alternate prey or intraspecific interactions on predator-prey dynamics. We assessed functional responses of two similarly sized, sympatric carnivores, lynx (Lynx canadensis) and coyotes (Canis latrans), foraging on common primary (snowshoe hares; Lepus americanus) and alternate (red squirrels; Tamiasciurus hudsonicus) prey in a natural system. Lynx exhibited a hyperbolic prey-dependent response to changes in hare density, which is characteristic of predators relying primarily on a single prey species. In contrast, the lynx-squirrel response was found to be linear ratio dependent, or inversely dependent on hare density. The coyote-hare and coyote-squirrel interactions also were linear and influenced by predator density. We explain these novel results by apparent use of spatial and temporal refuges by prey, and the likelihood that predators commonly experience interference and lack of satiation when foraging. Our study provides empirical support from a natural predator-prey system that (1) predation rate may not be limited at high prey densities when prey are small or rarely captured; (2) interference competition may influence the predator functional response; and (3) predator interference has a variable role across different prey types. Ultimately, the distinct functional responses of predators to different prey types illustrate the complexity associated with predator-prey interactions in natural systems and highlight the need to investigate predator behavior and predation rate in relation to the broader ecological community. © 2017 by the Ecological Society of America.
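
    The two response forms contrasted above can be written down compactly. The sketch below implements the standard Holling type II (hyperbolic prey-dependent) form and a linear ratio-dependent form; the parameter values are invented for illustration and are not the study's estimates.

    ```python
    # Illustrative functional-response forms. N = prey density,
    # P = predator density, a = attack rate, h = handling time.

    def holling_type_ii(N, a=0.6, h=0.05):
        """Hyperbolic prey-dependent response; saturates at 1/h."""
        return a * N / (1.0 + a * h * N)

    def linear_ratio_dependent(N, P, a=0.6):
        """Linear ratio-dependent response: kill rate scales with N/P."""
        return a * N / P

    for N in (5, 20, 80):
        print(f"N={N:3d}  type II: {holling_type_ii(N):6.2f}  "
              f"ratio-dependent (P=4): {linear_ratio_dependent(N, 4):6.2f}")
    ```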

  19. RE-PERG, a new procedure for electrophysiologic diagnosis of glaucoma that may improve PERG specificity.

    PubMed

    Mavilio, Alberto; Sisto, Dario; Ferreri, Paolo; Cardascia, Nicola; Alessio, Giovanni

    2017-01-01

    A significant variability of the second harmonic (2ndH) phase of the steady-state pattern electroretinogram (SS-PERG) on intrasession retest has recently been described in glaucoma patients (GP), which has not been found in healthy subjects. To evaluate the reliability of phase variability on retest (a procedure called RE-PERG or REPERG) in the presence of cataract, which is known to affect the standard PERG, we tested this procedure in GP, normal controls (NC), and cataract patients (CP). The procedure was performed on 50 GP, 35 NC, and 27 CP. All subjects were examined with RE-PERG and SS-PERG and also with spectral-domain optical coherence tomography and standard automated perimetry. The standard deviation of the phase and the amplitude value of the 2ndH were correlated, by means of one-way analysis of variance and Pearson correlation, with the mean deviation and pattern standard deviation assessed by standard automated perimetry and with the retinal nerve fiber layer and ganglion cell complex thickness assessed by spectral-domain optical coherence tomography. Receiver operating characteristics were calculated in cohort populations with and without cataract. The standard deviation of the phase of the 2ndH was significantly higher in GP with respect to NC (P < 0.001) and CP (P < 0.001), and it correlated with retinal nerve fiber layer (r = -0.5, P < 0.001) and ganglion cell complex (r = -0.6, P < 0.001) defects in GP. Receiver operating characteristic evaluation showed higher specificity of RE-PERG (86.4%; area under the curve 0.93) with respect to SS-PERG (54.5%; area under the curve 0.68) in CP. RE-PERG may improve the specificity of SS-PERG in clinical practice in the discrimination of GP.
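
    As a rough illustration of the retest idea, the sketch below extracts the phase of the second harmonic from repeated blocks of a synthetic steady-state PERG trace and reports its standard deviation. The sampling rate, reversal frequency, noise level and phase jitter are arbitrary stand-ins for recorded data.

    ```python
    # Second-harmonic phase variability across retest blocks (synthetic data).
    import numpy as np

    fs, f_rev, dur = 1000.0, 8.0, 2.0          # sampling rate, reversal rate, s
    t = np.arange(0, dur, 1 / fs)

    def second_harmonic_phase(block, fs, f_rev):
        spec = np.fft.rfft(block)
        freqs = np.fft.rfftfreq(block.size, 1 / fs)
        k = np.argmin(np.abs(freqs - 2 * f_rev))   # bin of the 2nd harmonic
        return np.angle(spec[k])

    rng = np.random.default_rng(0)
    phases = []
    for _ in range(5):                             # five intrasession blocks
        jitter = rng.normal(0, 0.05)               # phase instability
        block = np.sin(2 * np.pi * 2 * f_rev * t + jitter) \
                + rng.normal(0, 0.5, t.size)       # 2ndH component + noise
        phases.append(second_harmonic_phase(block, fs, f_rev))

    print("SD of 2nd-harmonic phase (rad):", np.std(phases))
    ```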

  20. Utilizing Structures of CYP2D6 and BACE1 Complexes To Reduce Risk of Drug–Drug Interactions with a Novel Series of Centrally Efficacious BACE1 Inhibitors

    PubMed Central

    2016-01-01

    In recent years, the first generation of β-secretase (BACE1) inhibitors advanced into clinical development for the treatment of Alzheimer’s disease (AD). However, the alignment of drug-like properties and selectivity remains a major challenge. Herein, we describe the discovery of a novel class of potent, low clearance, CNS penetrant BACE1 inhibitors represented by thioamidine 5. Further profiling suggested that a high fraction of the metabolism (>95%) was due to CYP2D6, increasing the potential risk for victim-based drug–drug interactions (DDI) and variable exposure in the clinic due to the polymorphic nature of this enzyme. To guide future design, we solved crystal structures of CYP2D6 complexes with substrate 5 and its corresponding metabolic product pyrazole 6, which provided insight into the binding mode and movements between substrate/inhibitor complexes. Guided by the BACE1 and CYP2D6 crystal structures, we designed and synthesized analogues with reduced risk for DDI, central efficacy, and improved hERG therapeutic margins. PMID:25781223

  1. Secondary instabilities modulate cortical complexity in the mammalian brain

    NASA Astrophysics Data System (ADS)

    Budday, Silvia; Steinmann, Paul; Kuhl, Ellen

    2015-10-01

    Disclosing the origin of convolutions in the mammalian brain remains a scientific challenge. Primary folds form before we are born: they are static, well defined and highly preserved across individuals. Secondary folds occur and disappear throughout our entire lifetime: they are dynamic, irregular and highly variable among individuals. While extensive research has improved our understanding of primary folding in the mammalian brain, secondary folding remains understudied and poorly understood. Here, we show that secondary instabilities can explain the increasing complexity of our brain surface as we age. Using the nonlinear field theories of mechanics supplemented by the theory of finite growth, we explore the critical conditions for secondary instabilities. We show that with continuing growth, our brain surface continues to bifurcate into increasingly complex morphologies. Our results suggest that even small geometric variations can have a significant impact on surface morphogenesis. Secondary bifurcations, and with them morphological changes during childhood and adolescence, are closely associated with the formation and loss of neuronal connections. Understanding the correlation between neuronal connectivity, cortical thickness, surface morphology and ultimately behaviour could have important implications for the diagnostics, classification and treatment of neurological disorders.
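
    For context, the classical film-on-substrate estimates for the onset of primary wrinkling are often quoted as a first-order benchmark; the secondary-instability criteria derived in the paper follow from its finite-growth theory and are not reproduced here. With μ_f and μ_s the shear moduli of the stiff cortical layer and soft subcortical substrate, and t the cortical thickness, the classical result reads:

    ```latex
    % Classical stiff-film-on-soft-substrate wrinkling estimates, given
    % only as a standard benchmark, not as the paper's own criterion.
    \varepsilon_{\mathrm{crit}} = \frac{1}{4}\left(\frac{3\mu_s}{\mu_f}\right)^{2/3},
    \qquad
    \lambda_{\mathrm{crit}} = 2\pi t \left(\frac{\mu_f}{3\mu_s}\right)^{1/3}
    ```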

  2. Health and Household Air Pollution from Solid Fuel Use: The Need for Improved Exposure Assessment

    PubMed Central

    Peel, Jennifer L.; Balakrishnan, Kalpana; Breysse, Patrick N.; Chillrud, Steven N.; Naeher, Luke P.; Rodes, Charles E.; Vette, Alan F.; Balbus, John M.

    2013-01-01

    Background: Nearly 3 billion people worldwide rely on solid fuel combustion to meet basic household energy needs. The resulting exposure to air pollution causes an estimated 4.5% of the global burden of disease. Large variability and a lack of resources for research and development have resulted in highly uncertain exposure estimates. Objective: We sought to identify research priorities for exposure assessment that will more accurately and precisely define exposure–response relationships of household air pollution necessary to inform future cleaner-burning cookstove dissemination programs. Data Sources: As part of an international workshop in May 2011, an expert group characterized the state of the science and developed recommendations for exposure assessment of household air pollution. Synthesis: The following priority research areas were identified to explain variability and reduce uncertainty of household air pollution exposure measurements: improved characterization of spatial and temporal variability for studies examining both short- and long-term health effects; development and validation of measurement technology and approaches to conduct complex exposure assessments in resource-limited settings with a large range of pollutant concentrations; and development and validation of biomarkers for estimating dose. Addressing these priority research areas, which will inherently require an increased allocation of resources for cookstove research, will lead to better characterization of exposure–response relationships. Conclusions: Although the type and extent of exposure assessment will necessarily depend on the goal and design of the cookstove study, without improved understanding of exposure–response relationships, the level of air pollution reduction necessary to meet the health targets of cookstove interventions will remain uncertain. Citation: Clark ML, Peel JL, Balakrishnan K, Breysse PN, Chillrud SN, Naeher LP, Rodes CE, Vette AF, Balbus JM. 2013. Health and household air pollution from solid fuel use: the need for improved exposure assessment. Environ Health Perspect 121:1120–1128; http://dx.doi.org/10.1289/ehp.1206429 PMID:23872398

  3. Modest weight loss in moderately overweight postmenopausal women improves heart rate variability.

    PubMed

    Mouridsen, Mette Rauhe; Bendsen, Nathalie Tommerup; Astrup, Arne; Haugaard, Steen Bendix; Binici, Zeynep; Sajadieh, Ahmad

    2013-08-01

    To evaluate the effects of weight loss on heart rate (HR) and heart rate variability (HRV) parameters in overweight postmenopausal women. Forty-nine overweight postmenopausal women with an average body mass index of 28.8 ± 1.9 kg/m² underwent a 12-week dietary weight-loss programme. Accepted variables for characterization of HRV were analysed before and after the weight loss by 24-h ambulatory ECG monitoring: the mean and standard deviation of the time between normal-to-normal complexes (MeanNN and SDNN, respectively), and the mean of the standard deviations of normal-to-normal intervals for each 5-min period (SDNNindex). Baseline body fat mass (FM%) and changes in body composition were determined by dual X-ray absorptiometry. Before and after the weight-loss period, total abdominal fat, intra-abdominal fat (IAAT), and subcutaneous abdominal fat (SCAT) were measured by single-slice MRI at L3. The weight loss of 3.9 ± 2.0 kg was accompanied by an improvement in HRV: SDNN increased by 9.2% (p = 0.003) and SDNNindex increased by 11.4% (p = 0.0003). MeanNN increased by 2.4%, reflecting a decrease in mean heart rate from 74.1 to 72.3 beats/min (p = 0.033). Systolic blood pressure (SBP) decreased by 2.7%, total cholesterol by 5.1% and high-sensitivity C-reactive protein (hsCRP) by 15.8% (p = 0.002). Improvements in SDNN and cholesterol were correlated with weight loss (r = -0.329, p = 0.024 and r = 0.327, p = 0.020, respectively), but changes in HR, SBP, and hsCRP were not. IAAT and the IAAT/SCAT ratio were found to be negatively associated with HRV parameters, but changes in body composition were not associated with changes in HRV. The observed improvement in HRV seems to be facilitated by weight loss. IAAT and the IAAT/SCAT ratio were found to be associated with low HRV.
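
    The three time-domain measures named above have direct computational definitions. The sketch below computes MeanNN, SDNN and SDNNindex from a synthetic series of normal-to-normal (NN) intervals; the interval statistics are invented, and only full 5-minute windows are used.

    ```python
    # Time-domain HRV measures from NN intervals (milliseconds).
    import numpy as np

    rng = np.random.default_rng(1)
    nn_ms = rng.normal(820, 40, size=100_000)   # ~23 h of beats at ~73 bpm
    t_s = np.cumsum(nn_ms) / 1000.0             # beat times in seconds

    mean_nn = nn_ms.mean()                      # MeanNN
    sdnn = nn_ms.std(ddof=1)                    # SDNN over the full recording

    # SDNNindex: mean of the SDs computed within successive full 5-min windows
    edges = np.arange(0.0, t_s[-1] - 300.0, 300.0)
    sdnn_index = np.mean([nn_ms[(t_s >= e) & (t_s < e + 300.0)].std(ddof=1)
                          for e in edges])

    print(f"MeanNN {mean_nn:.0f} ms | HR {60000 / mean_nn:.1f} bpm | "
          f"SDNN {sdnn:.1f} ms | SDNNindex {sdnn_index:.1f} ms")
    ```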

  4. Analysing inter-relationships among water, governance, human development variables in developing countries: WatSan4Dev database coherency analysis

    NASA Astrophysics Data System (ADS)

    Dondeynaz, C.; Carmona Moreno, C.; Céspedes Lorente, J. J.

    2012-01-01

    The "Integrated Water Resources Management" principle was formally laid down at the International Conference on Water and Sustainable development in Dublin 1992. One of the main results of this conference is that improving Water and Sanitation Services (WSS), being a complex and interdisciplinary issue, passes through collaboration and coordination of different sectors (environment, health, economic activities, governance, and international cooperation). These sectors influence or are influenced by the access to WSS. The understanding of these interrelations appears as crucial for decision makers in the water sector. In this framework, the Joint Research Centre (JRC) of the European Commission (EC) has developed a new database (WatSan4Dev database) containing 45 indicators (called variables in this paper) from environmental, socio-economic, governance and financial aid flows data in developing countries. This paper describes the development of the WatSan4Dev dataset, the statistical processes needed to improve the data quality; and, finally, the analysis to verify the database coherence is presented. At the light of the first analysis, WatSan4Dev Dataset shows the coherency among the different variables that are confirmed by the direct field experience and/or the scientific literature in the domain. Preliminary analysis of the relationships indicates that the informal urbanisation development is an important factor influencing negatively the percentage of the population having access to WSS. Health, and in particular children health, benefits from the improvement of WSS. Efficient environmental governance is also an important factor for providing improved water supply services. The database would be at the base of posterior analyses to better understand the interrelationships between the different indicators associated in the water sector in developing countries. A data model using the different indicators will be realised on the next phase of this research work.

  5. Linking Inflammation, Cardiorespiratory Variability, and Neural Control in Acute Inflammation via Computational Modeling

    PubMed Central

    Dick, Thomas E.; Molkov, Yaroslav I.; Nieman, Gary; Hsieh, Yee-Hsee; Jacono, Frank J.; Doyle, John; Scheff, Jeremy D.; Calvano, Steve E.; Androulakis, Ioannis P.; An, Gary; Vodovotz, Yoram

    2012-01-01

    Acute inflammation leads to organ failure by engaging catastrophic feedback loops in which stressed tissue evokes an inflammatory response and, in turn, inflammation damages tissue. Manifestations of this maladaptive inflammatory response include cardio-respiratory dysfunction that may be reflected in reduced heart rate and ventilatory pattern variabilities. We have developed signal-processing algorithms that quantify non-linear deterministic characteristics of variability in biologic signals. Now, coalescing under the aegis of the NIH Computational Biology Program and the Society for Complexity in Acute Illness, two research teams performed iterative experiments and computational modeling on inflammation and cardio-pulmonary dysfunction in sepsis as well as on neural control of respiration and ventilatory pattern variability. These teams, with additional collaborators, have recently formed a multi-institutional, interdisciplinary consortium, whose goal is to delineate the fundamental interrelationship between the inflammatory response and physiologic variability. Multi-scale mathematical modeling and complementary physiological experiments will provide insight into autonomic neural mechanisms that may modulate the inflammatory response to sepsis and simultaneously reduce heart rate and ventilatory pattern variabilities associated with sepsis. This approach integrates computational models of neural control of breathing and cardio-respiratory coupling with models that combine inflammation, cardiovascular function, and heart rate variability. The resulting integrated model will provide mechanistic explanations for the phenomena of respiratory sinus-arrhythmia and cardio-ventilatory coupling observed under normal conditions, and the loss of these properties during sepsis. This approach holds the potential of modeling cross-scale physiological interactions to improve both basic knowledge and clinical management of acute inflammatory diseases such as sepsis and trauma. PMID:22783197

  6. Linking Inflammation, Cardiorespiratory Variability, and Neural Control in Acute Inflammation via Computational Modeling.

    PubMed

    Dick, Thomas E; Molkov, Yaroslav I; Nieman, Gary; Hsieh, Yee-Hsee; Jacono, Frank J; Doyle, John; Scheff, Jeremy D; Calvano, Steve E; Androulakis, Ioannis P; An, Gary; Vodovotz, Yoram

    2012-01-01

    Acute inflammation leads to organ failure by engaging catastrophic feedback loops in which stressed tissue evokes an inflammatory response and, in turn, inflammation damages tissue. Manifestations of this maladaptive inflammatory response include cardio-respiratory dysfunction that may be reflected in reduced heart rate and ventilatory pattern variabilities. We have developed signal-processing algorithms that quantify non-linear deterministic characteristics of variability in biologic signals. Now, coalescing under the aegis of the NIH Computational Biology Program and the Society for Complexity in Acute Illness, two research teams performed iterative experiments and computational modeling on inflammation and cardio-pulmonary dysfunction in sepsis as well as on neural control of respiration and ventilatory pattern variability. These teams, with additional collaborators, have recently formed a multi-institutional, interdisciplinary consortium, whose goal is to delineate the fundamental interrelationship between the inflammatory response and physiologic variability. Multi-scale mathematical modeling and complementary physiological experiments will provide insight into autonomic neural mechanisms that may modulate the inflammatory response to sepsis and simultaneously reduce heart rate and ventilatory pattern variabilities associated with sepsis. This approach integrates computational models of neural control of breathing and cardio-respiratory coupling with models that combine inflammation, cardiovascular function, and heart rate variability. The resulting integrated model will provide mechanistic explanations for the phenomena of respiratory sinus-arrhythmia and cardio-ventilatory coupling observed under normal conditions, and the loss of these properties during sepsis. This approach holds the potential of modeling cross-scale physiological interactions to improve both basic knowledge and clinical management of acute inflammatory diseases such as sepsis and trauma.

  7. Climate variability and vulnerability to climate change: a review

    PubMed Central

    Thornton, Philip K; Ericksen, Polly J; Herrero, Mario; Challinor, Andrew J

    2014-01-01

    The focus of the great majority of climate change impact studies is on changes in mean climate. In terms of climate model output, these changes are more robust than changes in climate variability. By concentrating on changes in climate means, the full impacts of climate change on biological and human systems are probably being seriously underestimated. Here, we briefly review the possible impacts of changes in climate variability and the frequency of extreme events on biological and food systems, with a focus on the developing world. We present new analysis that tentatively links increases in climate variability with increasing food insecurity in the future. We consider the ways in which people deal with climate variability and extremes and how they may adapt in the future. Key knowledge and data gaps are highlighted. These include the timing and interactions of different climatic stresses on plant growth and development, particularly at higher temperatures, and the impacts on crops, livestock and farming systems of changes in climate variability and extreme events on pest-weed-disease complexes. We highlight the need to reframe research questions in such a way that they can provide decision makers throughout the food system with actionable answers, and the need for investment in climate and environmental monitoring. Improved understanding of the full range of impacts of climate change on biological and food systems is a critical step in being able to address effectively the effects of climate variability and extreme events on human vulnerability and food security, particularly in agriculturally based developing countries facing the challenge of having to feed rapidly growing populations in the coming decades. PMID:24668802

  8. Decadal climate variability and the spatial organization of deep hydrological drought

    NASA Astrophysics Data System (ADS)

    Barros, Ana P.; Hodes, Jared L.; Arulraj, Malarvizhi

    2017-10-01

    Empirical Orthogonal Function (EOF), wavelet, and wavelet coherence analysis of baseflow time-series from 126 streamgauges (record-length > 50 years; small and mid-size watersheds) in the US South Atlantic (USSA) region reveal three principal modes of space-time variability: (1) a region-wide dominant mode tied to annual precipitation that exhibits non-stationary decadal variability after the mid 1990s concurrent with the warming of the AMO (Atlantic Multidecadal Oscillation); (2) two spatial modes, east and west of the Blue Ridge, exhibiting nonstationary seasonal to sub-decadal variability before and after 1990 attributed to complex nonlinear interactions between ENSO and AMO impacting precipitation and recharge; and (3) deep (decadal) and shallow (< 6 years) space-time modes of groundwater variability separating basins with high and low annual mean baseflow fraction (MBF) by physiographic region. The results explain the propagation of multiscale climate variability into the regional groundwater system through recharge modulated by topography, geomorphology, and geology to determine the spatial organization of baseflow variability at decadal (and longer) time-scales, that is, deep hydrologic drought. Further, these findings suggest potential for long-range predictability of hydrological drought in small and mid-size watersheds, where baseflow is a robust indicator of nonstationary yield capacity of the underlying groundwater basins. Predictive associations between climate mode indices and deep baseflow (e.g. persistent decreases of the decadal-scale components of baseflow during the cold phase of the AMO in the USSA) can be instrumental toward improving forecast lead-times and long-range mitigation of severe drought.
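
    For readers unfamiliar with EOF analysis, the decomposition underlying the modes described above is a singular value decomposition of the (time × station) anomaly matrix. The sketch below recovers a shared region-wide mode from synthetic baseflow series; the dimensions mirror the study's 126 gauges, but the data are invented.

    ```python
    # Minimal EOF analysis of a (time x station) anomaly matrix via SVD.
    import numpy as np

    rng = np.random.default_rng(2)
    n_t, n_s = 600, 126                         # months x streamgauges
    shared = np.sin(np.linspace(0, 60, n_t))[:, None]   # region-wide signal
    X = 3 * shared * rng.normal(1, 0.2, (1, n_s)) + rng.normal(0, 1, (n_t, n_s))

    X -= X.mean(axis=0)                         # work with anomalies
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)

    pc1 = U[:, 0] * s[0]                        # leading principal component
    eof1 = Vt[0]                                # leading spatial pattern
    print("variance explained by first 3 EOFs:", np.round(var_frac[:3], 2))
    ```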

  9. Influence of seasonal and inter-annual hydro-meteorological variability on surface water fecal coliform concentration under varying land-use composition.

    PubMed

    St Laurent, Jacques; Mazumder, Asit

    2014-01-01

    Quantifying the influence of hydro-meteorological variability on surface source water fecal contamination is critical to the maintenance of safe drinking water. Historically, this has not been possible due to the scarcity of data on fecal indicator bacteria (FIB). We examined the relationship between hydro-meteorological variability and the most commonly measured FIB, fecal coliform (FC), concentration for 43 surface water sites within the hydro-climatologically complex region of British Columbia. The strength of relationship was highly variable among sites, but tended to be stronger in catchments with nival (snowmelt-dominated) hydro-meteorological regimes and greater land-use impacts. We observed positive relationships between inter-annual FC concentration and hydro-meteorological variability for around 50% of the 19 sites examined. These sites are likely to experience increased fecal contamination due to the projected intensification of the hydrological cycle. Seasonal FC concentration variability appeared to be driven by snowmelt and rainfall-induced runoff for around 30% of the 43 sites examined. Earlier snowmelt in nival catchments may advance the timing of peak contamination, and the projected decrease in annual snow-to-precipitation ratio is likely to increase fecal contamination levels during summer, fall, and winter among these sites. Safeguarding drinking water quality in the face of such impacts will require increased monitoring of FIB and waterborne pathogens, especially during periods of high hydro-meteorological variability. These data can then be used to develop predictive models, inform source water protection measures, and improve drinking water treatment. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. The importance of topography controlled sub-grid process heterogeneity in distributed hydrological models

    NASA Astrophysics Data System (ADS)

    Nijzink, R. C.; Samaniego, L.; Mai, J.; Kumar, R.; Thober, S.; Zink, M.; Schäfer, D.; Savenije, H. H. G.; Hrachowitz, M.

    2015-12-01

    Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated in the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 % respectively, compared to the base case of the unconstrained mHM. The most significant improvements in signature representations were achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability, as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.

  11. The importance of topography-controlled sub-grid process heterogeneity and semi-quantitative prior constraints in distributed hydrological models

    NASA Astrophysics Data System (ADS)

    Nijzink, Remko C.; Samaniego, Luis; Mai, Juliane; Kumar, Rohini; Thober, Stephan; Zink, Matthias; Schäfer, David; Savenije, Hubert H. G.; Hrachowitz, Markus

    2016-03-01

    Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated into the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and whether (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. Most significant improvements in signature representations were, in particular, achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
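
    The overall performance measure used above has a simple form: each signature's error is normalized, and the Euclidean norm of the resulting error vector measures the distance from the optimal (zero-error) model. The sketch below illustrates this computation; the signature names and values are invented, not the study's.

    ```python
    # Euclidean distance of normalized signature errors from the optimal model.
    import numpy as np

    signatures = ["runoff_ratio", "baseflow_index", "Q5_low_flow", "flashiness"]
    obs = np.array([0.45, 0.60, 0.12, 0.30])    # "observed" signature values
    sim = np.array([0.50, 0.52, 0.10, 0.36])    # "simulated" signature values

    rel_err = (sim - obs) / obs                 # dimensionless error per signature
    distance = np.linalg.norm(rel_err)          # 0 would be the optimal model
    print(f"Euclidean distance to optimal model: {distance:.3f}")
    ```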

  12. The Role of Global Hydrologic Processes in Interannual and Long-Term Climate Variability

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.

    1997-01-01

    The earth's climate and its variability are linked inextricably with the presence of water on our planet. El Nino/Southern Oscillation, the major mode of interannual variability, is characterized by strong perturbations in oceanic evaporation, tropical rainfall, and radiation. On longer time scales, the major feedback mechanism in CO2-induced global warming is actually that due to the increased water vapor holding capacity of the atmosphere. The effects of the global hydrologic cycle on climate are manifested through the influence of cloud and water vapor on energy fluxes at the top of the atmosphere and at the surface. Surface moisture anomalies retain the "memory" of past precipitation anomalies and subsequently alter the partitioning of latent and sensible heat fluxes at the surface. At the top of the atmosphere, water vapor and cloud perturbations alter the net amount of radiation that the earth's climate system receives. These pervasive linkages between water, radiation, and surface processes present major complexities for observing and modeling climate variations. Major uncertainties in the observations include the vertical structure of clouds and water vapor, the surface energy balance, and the transport of water and heat by wind fields. Modeling climate variability and change on a physical basis requires accurate but simplified submodels of radiation, cloud formation, radiative exchange, surface biophysics, and oceanic energy flux. In the past, we may safely say that being "data poor" has limited our depth of understanding and impeded model validation and improvement. Beginning with pre-EOS data sets, many of these barriers are being removed. EOS platforms, with their suite of measurements dedicated to specific science questions, are part of our most cost-effective path to improved understanding and predictive capability. This talk will highlight some of the major questions confronting global hydrology and the prospects for significant progress afforded by EOS-era measurements.

  13. Spatio-temporal Variability of Stratified Snowpack Cold Content Observed in the Rocky Mountains

    NASA Astrophysics Data System (ADS)

    Schmidt, J. S.; Sexstone, G. A.; Serreze, M. C.

    2017-12-01

    Snowpack cold content (CCsnow) is the energy required to bring a snowpack to an isothermal temperature of 0.0 °C. The spatio-temporal variability of CCsnow is complex, as it is a measure that integrates the response of a snowpack to each component of the snow-cover energy balance. Snow and ice at high elevation are climate-sensitive water storage for the western U.S. Therefore, an improved understanding of the spatio-temporal variability of CCsnow may provide insight into snowpack dynamics and sensitivity to climate change. In this study, stratified snowpit observations of snow water equivalent (SWE) and snow temperature (Tsnow) from the USGS Rocky Mountain Snowpack network (USGS RMS) were used to evaluate vertical CCsnow profiles over a 16-year period in Montana, Idaho, Wyoming, Colorado and New Mexico. Since 1993, USGS RMS has collected snow chemistry, snow temperature, and SWE data throughout the Rocky Mountain region, making it well positioned for Anthropocene cryosphere benchmarking and climate change interpretation. Spatial grouping of locations based on similar CCsnow characteristics was evaluated and trend analyses were performed. Additionally, we evaluated the regional relation of CCsnow to snowmelt timing. CCsnow was more precisely calculated, and more representative, using vertically stratified field-observed values rather than bulk values, which highlights the utility of the snowpack dataset presented here. Location-specific annual and 16-year mean stratified snowpit profiles of SWE, Tsnow, and CCsnow represent well the physical geography and past weather patterns acting on the snowpack. The observed trends and spatial variability of CCsnow profiles explored by this study provide an improved understanding of changing snowpack behavior in the western U.S. and will be useful for assessing the regional sensitivity of snowpacks to future climate change.
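
    A common way to compute cold content per unit area from a stratified profile is to sum, over layers, the energy needed to warm each layer's water equivalent to 0 °C. The sketch below uses this definition with illustrative layer values; sign conventions vary (CCsnow is often reported as a negative energy deficit).

    ```python
    # Layered snowpack cold content with illustrative layer values.
    CI = 2102.0     # specific heat of ice, J kg^-1 K^-1
    RHO_W = 1000.0  # density of water, kg m^-3

    # (layer SWE in metres of water equivalent, mean layer temperature in deg C)
    layers = [(0.10, -6.5), (0.15, -4.0), (0.20, -1.5)]

    # Energy required to bring every layer to 0 deg C, per unit area
    cc_j_m2 = sum(CI * RHO_W * swe * (0.0 - t) for swe, t in layers)
    print(f"CCsnow = {cc_j_m2 / 1e6:.2f} MJ m^-2")
    ```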

  14. An unusual kind of complex synchronizations and its applications in secure communications

    NASA Astrophysics Data System (ADS)

    Mahmoud, Emad E.

    2017-11-01

    In this paper, we introduce the concept of complex anti-synchronization (CAS) of hyperchaotic nonlinear systems with complex variables and uncertain parameters. This type of synchronization can be analysed only for complex nonlinear systems. The CAS combines two types of synchronization (complete synchronization and anti-synchronization): in the CAS, the attractors of the master and slave systems move opposite or orthogonal to each other with a similar form, a phenomenon that does not exist in the literature. Based on a Lyapunov function and an adaptive control strategy, a scheme is designed to achieve the CAS of two identical hyperchaotic attractors of these systems. The effectiveness of the obtained results is illustrated by a simulation example. Numerical results are plotted showing the state variables, synchronization errors, module errors, and phase errors of the hyperchaotic attractors after synchronization, confirming that the CAS is achieved. These results provide a possible foundation for secure communication applications. The CAS of hyperchaotic complex systems, in which a state variable of the master system synchronizes with a different state variable of the slave system, is a promising type of synchronization, as it offers excellent security in secure communications. During such secure communication, the synchronization between transmitter and receiver is achieved and the message signals are recovered. The encryption and recovery of the signals are simulated numerically.
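
    The abstract does not give the error convention, but one common way to formalize a synchronization type that combines complete synchronization and anti-synchronization of complex states is sketched below (an assumption, not necessarily the paper's exact definition): with master state z = z^r + i z^i and slave state w = w^r + i w^i,

    ```latex
    % Assumed CAS error convention: real parts anti-synchronize while
    % imaginary parts synchronize completely.
    e = w + \bar{z} \;\Longrightarrow\; e^r = w^r + z^r \to 0,
    \qquad e^i = w^i - z^i \to 0
    ```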

  15. Distribution, abundance, and diversity of stream fishes under variable environmental conditions

    Treesearch

    Christopher M. Taylor; Thomas L. Holder; Richard A. Fiorillo; Lance R. Williams; R. Brent Thomas; Melvin L. Warren

    2006-01-01

    The effects of stream size and flow regime on spatial and temporal variability of stream fish distribution, abundance, and diversity patterns were investigated. Assemblage variability and species richness were each significantly associated with a complex environmental gradient contrasting smaller, hydrologically variable stream localities with larger localities...

  16. 89. 22'X34' original vellum, Variable-Angle Launcher 'ELEVATION OF LAUNCHER BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    89. 22'X34' original vellum, Variable-Angle Launcher 'ELEVATION OF LAUNCHER BRIDGE ON TEMPORARY SUPPORT' drawn at 1'=20'. (BUORD Sketch # 209786, PAPW 1932). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  17. 90. 22'X34' original blueprint, Variable-Angle Launcher, 'FRONT ELEVATION OF LAUNCHER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    90. 22'X34' original blueprint, Variable-Angle Launcher, 'FRONT ELEVATION OF LAUNCHER BRIDGE, CONNECTING BRIDGE AND BARGES' drawn at 1/4'=1'0'. (BUORD Sketch # 208247). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  18. Student Solution Manual for Mathematical Methods for Physics and Engineering Third Edition

    NASA Astrophysics Data System (ADS)

    Riley, K. F.; Hobson, M. P.

    2006-03-01

    Preface; 1. Preliminary algebra; 2. Preliminary calculus; 3. Complex numbers and hyperbolic functions; 4. Series and limits; 5. Partial differentiation; 6. Multiple integrals; 7. Vector algebra; 8. Matrices and vector spaces; 9. Normal modes; 10. Vector calculus; 11. Line, surface and volume integrals; 12. Fourier series; 13. Integral transforms; 14. First-order ordinary differential equations; 15. Higher-order ordinary differential equations; 16. Series solutions of ordinary differential equations; 17. Eigenfunction methods for differential equations; 18. Special functions; 19. Quantum operators; 20. Partial differential equations: general and particular; 21. Partial differential equations: separation of variables; 22. Calculus of variations; 23. Integral equations; 24. Complex variables; 25. Application of complex variables; 26. Tensors; 27. Numerical methods; 28. Group theory; 29. Representation theory; 30. Probability; 31. Statistics.

  19. Coulomb wave functions with complex values of the variable and the parameters

    NASA Astrophysics Data System (ADS)

    Dzieciol, Aleksander; Yngve, Staffan; Fröman, Per Olof

    1999-12-01

    The motivation for the present paper lies in the fact that the literature concerning the Coulomb wave functions FL(η,ρ) and GL(η,ρ) is a jungle in which it may be hard to find a safe way when one needs general formulas for the Coulomb wave functions with complex values of the variable ρ and the parameters L and η. For the Coulomb wave functions and certain linear combinations of these functions we discuss the connection with the Whittaker function, the Coulomb phase shift, Wronskians, reflection formulas (L→-L-1), integral representations, series expansions, circuital relations (ρ→ρe±iπ) and asymptotic formulas on a Riemann surface for the variable ρ. The parameters L and η are allowed to assume complex values.
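
    For numerical work with complex arguments, one option is the arbitrary-precision library mpmath, which provides Coulomb wave functions for general complex L, η and ρ. A hedged example follows (function names as in the mpmath documentation; the argument values are arbitrary):

    ```python
    # Coulomb wave functions F and G with complex variable and parameters.
    from mpmath import mp, coulombf, coulombg

    mp.dps = 15                        # working precision in decimal digits

    L, eta, rho = 0.5 + 0.25j, 1.0 - 0.5j, 2.0 + 1.0j
    print("F_L(eta, rho) =", coulombf(L, eta, rho))
    print("G_L(eta, rho) =", coulombg(L, eta, rho))
    ```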

  20. Cognitive Agility Measurement in a Complex Environment

    DTIC Science & Technology

    2017-04-01

    EEA3: Do the Make Goal cognitive agility variables correlate with their corresponding historical psychology tests? EEA3.1: Does the variable for Make Goal cognitive flexibility correlate with the Stroop Test cognitive flexibility variable? EEA3.2: Does the variable for Make Goal cognitive openness correlate with the AUT cognitive openness variable? EEA3.3: Does the variable for Make Goal focused attention correlate with the Go, No Go Paradigm focused attention variable?

  1. Relationships between landscape pattern, wetland characteristics, and water quality in agricultural catchments.

    PubMed

    Moreno-Mateos, David; Mander, Ulo; Comín, Francisco A; Pedrocchi, César; Uuemaa, Evelyn

    2008-01-01

    Water quality in streams is dependent on landscape metrics at catchment and wetland scales. A study was undertaken to evaluate the correlation between landscape metrics, namely patch density and area, shape, heterogeneity, aggregation, connectivity, and land-use ratio, and water quality variables (salinity, nutrients, sediments, alkalinity, other potential pollutants and pH) in the agricultural areas of a semiarid Mediterranean region dominated by irrigated farmlands (NE Spain). The study also aims to develop wetland construction criteria for agricultural catchments. The percentage of arable land and landscape homogeneity (low value of the Simpson index) are significantly correlated with salinity (r² = 0.72) and NO₃-N variables (r² = 0.49) at the catchment scale. The number of stock farms was correlated (Spearman's corr. = 0.60; p < 0.01) with the TP concentration in stream water. The relative abundance of wetlands and the aggregation of their patches influence salinity variables at the wetland scale (r² = 0.59 for Na⁺ and K⁺ concentrations). The number and aggregation of wetland patches are closely correlated with the landscape complexity of catchments, measured as patch density (r² = 0.69), patch size (r² = 0.53), and landscape heterogeneity (r² = 0.62). These results suggest that more effective water quality improvement would be achieved by acting at both catchment and wetland scales, especially by reducing landscape homogeneity and creating numerous wetlands scattered throughout the catchment. A set of guidelines for planners and decision makers is provided for future agricultural developments or to improve existing ones.

  2. Influence of Excipients and Spray Drying on the Physical and Chemical Properties of Nutraceutical Capsules Containing Phytochemicals from Black Bean Extract.

    PubMed

    Guajardo-Flores, Daniel; Rempel, Curtis; Gutiérrez-Uribe, Janet A; Serna-Saldívar, Sergio O

    2015-12-03

    Black beans (Phaseolus vulgaris L.) are a rich source of flavonoids and saponins with proven health benefits. Spray-dried black bean extract powders were used in different formulations for the production of nutraceutical capsules with reduced batch-to-batch weight variability. Factorial designs were used to find an adequate maltodextrin-extract ratio for the spray-drying process used to produce black bean extract powders. Several flowability properties were used to determine the composite flow index of the produced powders. The powder containing 6% maltodextrin had the highest yield (78.6%) and the best recovery of flavonoids and saponins (>56% and >73%, respectively). The new complexes formed by the interaction of black bean powder with maltodextrin, microcrystalline cellulose 50 and starch exhibited not only bigger particles but also a rougher structure than those formed using only maltodextrin and starch as excipients. A drying process prior to capsule production improved powder flowability, increasing capsule weight and reducing variability. The formulation containing 25.0% maltodextrin, 24.1% microcrystalline cellulose 50, 50% starch and 0.9% magnesium stearate produced capsules with less than 2.5% weight variability. Spray drying is thus a feasible technique for producing free-flowing extract powders that combine valuable phytochemicals with low-cost excipients to reduce end-product variability.

  3. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    PubMed

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several developed and improved combinatorial techniques to optimize processing conditions and material properties of organic thin films. The combinatorial approach allows investigation of multi-variable dependencies and is well suited to investigating organic thin films for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow identifying precise trends for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.

  4. Computationally-efficient stochastic cluster dynamics method for modeling damage accumulation in irradiated materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoang, Tuan L.; Marian, Jaime

    2015-11-01

    An improved version of a recently developed stochastic cluster dynamics (SCD) method (Marian and Bulatov, 2012) [6] is introduced as an alternative to rate theory (RT) methods for solving coupled ordinary differential equation (ODE) systems for irradiation damage simulations. SCD circumvents by design the curse of dimensionality of the variable space that renders traditional ODE-based RT approaches inefficient when handling complex defect population comprised of multiple (more than two) defect species. Several improvements introduced here enable efficient and accurate simulations of irradiated materials up to realistic (high) damage doses characteristic of next-generation nuclear systems. The first improvement is a procedure for efficiently updating the defect reaction-network and event selection in the context of a dynamically expanding reaction-network. Next is a novel implementation of the τ-leaping method that speeds up SCD simulations by advancing the state of the reaction network in large time increments when appropriate. Lastly, a volume rescaling procedure is introduced to control the computational complexity of the expanding reaction-network through occasional reductions of the defect population while maintaining accurate statistics. The enhanced SCD method is then applied to model defect cluster accumulation in iron thin films subjected to triple ion-beam (Fe3+, He+ and H+) irradiations, for which standard RT or spatially-resolved kinetic Monte Carlo simulations are prohibitively expensive.
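
    The τ-leaping idea mentioned above can be shown in miniature: instead of executing one stochastic event at a time, each reaction channel fires a Poisson-distributed number of times per fixed time increment. The two-channel defect "network" below is purely illustrative (rates, stoichiometry and the crude negativity clamp are not from the paper).

    ```python
    # Minimal tau-leaping sketch for a toy two-channel defect network.
    import numpy as np

    rng = np.random.default_rng(3)
    x = np.array([1000, 0])            # populations: [monomer defects, clusters]

    def propensities(x):
        k_agg, k_diss = 1e-4, 5e-3
        return np.array([k_agg * x[0] * (x[0] - 1) / 2,  # 2 monomers -> cluster
                         k_diss * x[1]])                 # cluster -> 2 monomers

    stoich = np.array([[-2, +1],       # state change per firing, channel 1
                       [+2, -1]])      # state change per firing, channel 2

    tau, t = 0.1, 0.0
    while t < 50.0:
        k = rng.poisson(propensities(x) * tau)   # firings in this leap
        x = np.maximum(x + k @ stoich, 0)        # update; clamp negatives
        t += tau

    print("final populations [monomers, clusters]:", x)
    ```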

  5. Quality in transitional care of the elderly: Key challenges and relevant improvement measures

    PubMed Central

    Storm, Marianne; Siemsen, Inger Margrete D.; Laugaland, Kristin; Dyrstad, Dagrunn Nåden; Aase, Karina

    2014-01-01

    Introduction: Elderly people aged over 75 years with multifaceted care needs are often in need of hospital treatment. Transfer across care levels for this patient group increases the risk of adverse events. The aim of this paper is to establish knowledge of quality in transitional care of the elderly in two Norwegian hospital regions by identifying issues affecting the quality of transitional care and, based on these issues, to suggest improvement measures. Methodology: Included in the study were elderly patients (75+) receiving health care in the municipality who were admitted to a hospital emergency department or discharged to community health care with a hip fracture or a general medical diagnosis. Participant observations of admission and discharge transitions (n = 41) were carried out by two researchers. Results: Six main challenges with associated descriptions have been identified: (1) next of kin (bridging providers, advocacy, support, information brokering), (2) patient characteristics (level of satisfaction, level of insecurity, complex clinical conditions), (3) health care personnel's competence (professional, system, awareness of others' roles), (4) information exchange (oral, written, electronic), (5) context (stability, variability, change incentives, number of patient handovers) and (6) patient assessment (complex clinical picture, patient description, clinical assessment). Conclusion: Related to the six main challenges, several measures have been suggested to improve quality in transitional care, e.g. information to and involvement of patients and next of kin, staff training, standardisation of routines and inter-organisational staff meetings. PMID:24868196

  6. Computationally-efficient stochastic cluster dynamics method for modeling damage accumulation in irradiated materials

    NASA Astrophysics Data System (ADS)

    Hoang, Tuan L.; Marian, Jaime; Bulatov, Vasily V.; Hosemann, Peter

    2015-11-01

    An improved version of a recently developed stochastic cluster dynamics (SCD) method (Marian and Bulatov, 2012) [6] is introduced as an alternative to rate theory (RT) methods for solving coupled ordinary differential equation (ODE) systems for irradiation damage simulations. SCD circumvents by design the curse of dimensionality of the variable space that renders traditional ODE-based RT approaches inefficient when handling complex defect population comprised of multiple (more than two) defect species. Several improvements introduced here enable efficient and accurate simulations of irradiated materials up to realistic (high) damage doses characteristic of next-generation nuclear systems. The first improvement is a procedure for efficiently updating the defect reaction-network and event selection in the context of a dynamically expanding reaction-network. Next is a novel implementation of the τ-leaping method that speeds up SCD simulations by advancing the state of the reaction network in large time increments when appropriate. Lastly, a volume rescaling procedure is introduced to control the computational complexity of the expanding reaction-network through occasional reductions of the defect population while maintaining accurate statistics. The enhanced SCD method is then applied to model defect cluster accumulation in iron thin films subjected to triple ion-beam (Fe3+, He+ and H+) irradiations, for which standard RT or spatially-resolved kinetic Monte Carlo simulations are prohibitively expensive.

  7. Iterative refinement of implicit boundary models for improved geological feature reproduction

    NASA Astrophysics Data System (ADS)

    Martin, Ryan; Boisvert, Jeff B.

    2017-12-01

    Geological domains contain non-stationary features that cannot be described by a single direction of continuity. Non-stationary estimation frameworks generate more realistic curvilinear interpretations of subsurface geometries. A radial basis function (RBF) based implicit modeling framework using domain decomposition is developed that permits introduction of locally varying orientations and magnitudes of anisotropy for boundary models to better account for the local variability of complex geological deposits. The interpolation framework is paired with a method to automatically infer the locally predominant orientations, which results in a rapid and robust iterative non-stationary boundary modeling technique that can refine locally anisotropic geological shapes automatically from the sample data. The method also permits quantification of the volumetric uncertainty associated with the boundary modeling. The methodology is demonstrated on a porphyry dataset and shows improved local geological features.
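
    The basic implicit-modeling step the framework builds on can be illustrated with a global RBF fit: samples inside the domain get +1, samples outside get -1, and the interpolant's zero level set is the modeled boundary. The sketch below uses SciPy's RBFInterpolator on synthetic points; the paper's contribution (domain decomposition, locally varying anisotropy, iterative refinement) sits on top of this idea.

    ```python
    # Implicit boundary via a signed-indicator RBF fit (synthetic data).
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(4)
    pts = rng.uniform(-2, 2, (200, 2))                   # sample locations
    inside = np.hypot(pts[:, 0], pts[:, 1]) < 1.0        # disc-shaped "geology"
    vals = np.where(inside, 1.0, -1.0)

    rbf = RBFInterpolator(pts, vals, kernel="thin_plate_spline", smoothing=1e-3)

    # Evaluate on a coarse grid; the sign change locates the implicit boundary.
    g = np.linspace(-2, 2, 5)
    grid = np.array(np.meshgrid(g, g)).reshape(2, -1).T
    print(rbf(grid).reshape(5, 5).round(2))
    ```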

  8. Land-cover classification in a moist tropical region of Brazil with Landsat TM imagery.

    PubMed

    Li, Guiying; Lu, Dengsheng; Moran, Emilio; Hetrick, Scott

    2011-01-01

    This research aims to improve land-cover classification accuracy in a moist tropical region in Brazil by examining the use of different remote sensing-derived variables and classification algorithms. Different scenarios based on Landsat Thematic Mapper (TM) spectral data and derived vegetation indices and textural images, and different classification algorithms - maximum likelihood classification (MLC), artificial neural network (ANN), classification tree analysis (CTA), and object-based classification (OBC), were explored. The results indicated that a combination of vegetation indices as extra bands into Landsat TM multispectral bands did not improve the overall classification performance, but the combination of textural images was valuable for improving vegetation classification accuracy. In particular, the combination of both vegetation indices and textural images into TM multispectral bands improved overall classification accuracy by 5.6% and kappa coefficient by 6.25%. Comparison of the different classification algorithms indicated that CTA and ANN have poor classification performance in this research, but OBC improved primary forest and pasture classification accuracies. This research indicates that use of textural images or use of OBC are especially valuable for improving the vegetation classes such as upland and liana forest classes having complex stand structures and having relatively large patch sizes.
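
    One common way to derive the textural images referred to above is from gray-level co-occurrence matrix (GLCM) statistics computed over moving windows. The sketch below shows the per-window computation with scikit-image (whose functions are spelled `greycomatrix`/`greycoprops` in versions before 0.19); the band patch is synthetic.

    ```python
    # GLCM texture measures for one window of a quantized band (synthetic).
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(5)
    band = rng.integers(0, 64, (9, 9), dtype=np.uint8)   # quantized TM-band patch

    glcm = graycomatrix(band, distances=[1], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "dissimilarity"):
        print(prop, float(graycoprops(glcm, prop).mean()))
    ```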

  9. Land-cover classification in a moist tropical region of Brazil with Landsat TM imagery

    PubMed Central

    LI, GUIYING; LU, DENGSHENG; MORAN, EMILIO; HETRICK, SCOTT

    2011-01-01

    This research aims to improve land-cover classification accuracy in a moist tropical region in Brazil by examining the use of different remote sensing-derived variables and classification algorithms. Different scenarios based on Landsat Thematic Mapper (TM) spectral data and derived vegetation indices and textural images, and different classification algorithms – maximum likelihood classification (MLC), artificial neural network (ANN), classification tree analysis (CTA), and object-based classification (OBC), were explored. The results indicated that a combination of vegetation indices as extra bands into Landsat TM multispectral bands did not improve the overall classification performance, but the combination of textural images was valuable for improving vegetation classification accuracy. In particular, the combination of both vegetation indices and textural images into TM multispectral bands improved overall classification accuracy by 5.6% and kappa coefficient by 6.25%. Comparison of the different classification algorithms indicated that CTA and ANN have poor classification performance in this research, but OBC improved primary forest and pasture classification accuracies. This research indicates that use of textural images or use of OBC are especially valuable for improving the vegetation classes such as upland and liana forest classes having complex stand structures and having relatively large patch sizes. PMID:22368311

  10. RMS Spectral Modelling - a powerful tool to probe the origin of variability in Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Mallick, Labani; Dewangan, Gulab chand; Misra, Ranjeev

    2016-07-01

    The broadband energy spectra of Active Galactic Nuclei (AGN) are complex in nature, with contributions from many ingredients: accretion disk, corona, jets, broad-line region (BLR), narrow-line region (NLR) and Compton-thick absorbing cloud or torus. This complexity of the broadband AGN spectra gives rise to mean spectral model degeneracy; e.g., there are competing models for the broad feature near 5-7 keV in terms of blurred reflection and complex absorption. The most reliable approach to overcoming this energy spectral model degeneracy is to study the RMS (root mean square) variability spectrum, which connects the energy spectrum with temporal variability. The origin of variability could be pivoting of the primary continuum, reflection and/or absorption. In this work, we study the energy-dependent variability of AGN by developing a theoretical RMS spectral model in ISIS (Interactive Spectral Interpretation System) for different input energy spectra. In this talk, I will present results of RMS spectral modelling for a few radio-loud and radio-quiet AGN observed by XMM-Newton, Suzaku, NuSTAR and ASTROSAT, and probe the dichotomy between these two classes of AGN.
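
    The quantity plotted in an RMS spectrum is commonly the fractional RMS variability amplitude per energy band (e.g. the standard estimator of Vaughan et al. 2003), where S² is the variance of the N flux measurements xᵢ in a band, the overline denotes their mean squared measurement error, and x̄ is the mean flux:

    ```latex
    F_{\mathrm{var}}(E) =
    \sqrt{\frac{S^{2} - \overline{\sigma^{2}_{\mathrm{err}}}}{\bar{x}^{2}}}
    ```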

  11. Modelling Inter-relationships among water, governance, human development variables in developing countries with Bayesian networks.

    NASA Astrophysics Data System (ADS)

    Dondeynaz, C.; Lopez-Puga, J.; Carmona-Moreno, C.

    2012-04-01

    Improving Water and Sanitation Services (WSS), a complex and interdisciplinary issue, requires collaboration and coordination across different sectors (environment, health, economic activities, governance, and international cooperation). This inter-dependency has been recognised with the adoption of the "Integrated Water Resources Management" principles, which push for the integration of the various dimensions involved in WSS delivery to ensure efficient and sustainable management. Understanding these interrelations is crucial for decision makers in the water sector, in particular in developing countries where WSS still represent an important lever for livelihood improvement. In this framework, the Joint Research Centre of the European Commission has developed a coherent database (the WatSan4Dev database) containing 29 indicators of environmental, socio-economic, governance and financial aid flows data focusing on developing countries (Celine et al., 2011, under publication). The aim of this work is to model the WatSan4Dev dataset using probabilistic models to identify the key variables influencing, or being influenced by, water supply and sanitation access levels. Bayesian network models are suitable for mapping the conditional dependencies between variables and also allow variables to be ordered by their level of influence on the dependent variable. Separate models have been built for water supply and for sanitation because of their different behaviour. The models are validated against both statistical criteria and scientific knowledge from the literature. A two-step approach has been adopted to build the structure of the model: a Bayesian network is first built for each thematic cluster of variables (e.g. governance, agricultural pressure, or human development), keeping a detailed level for later interpretation; a global model is then built from the significant indicators of each previously modelled cluster. The structure of the relationships between variables is set a priori according to the literature and/or field experience (expert knowledge). Statistical validation is verified according to the classification error rate and the significance of the variables. Sensitivity analysis has also been performed to characterise the relative influence of every single variable in the model. Once validated, the models allow the impact of each variable on water supply or sanitation to be estimated, providing an interesting means to test scenarios and predict variable behaviour. The choices made, the methods, and descriptions of the various models, for each cluster as well as the global models for water supply and sanitation, will be presented. Key results and interpretation of the relationships depicted by the models will be detailed during the conference.
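
    For readers unfamiliar with the machinery, a toy Bayesian network evaluated by exact enumeration illustrates the conditional-dependency reasoning described above. The structure and probability tables below are invented for illustration; they are not the WatSan4Dev models.

```python
import itertools

# Toy structure: Governance (G) and HumanDev (H) are parents of WaterAccess (W).
# All CPT values are illustrative assumptions.
p_g = {1: 0.4, 0: 0.6}                      # P(G)
p_h = {1: 0.5, 0: 0.5}                      # P(H)
p_w = {(1, 1): 0.9, (1, 0): 0.6,            # P(W=1 | G, H)
       (0, 1): 0.5, (0, 0): 0.2}

def joint(g, h, w):
    """Joint probability factorised along the network structure."""
    pw1 = p_w[(g, h)]
    return p_g[g] * p_h[h] * (pw1 if w == 1 else 1 - pw1)

def query_w_given(g=None, h=None):
    """P(W=1 | evidence) by enumerating the full joint distribution."""
    num = den = 0.0
    for gg, hh, ww in itertools.product([0, 1], repeat=3):
        if (g is not None and gg != g) or (h is not None and hh != h):
            continue
        p = joint(gg, hh, ww)
        den += p
        if ww == 1:
            num += p
    return num / den

print(query_w_given())        # marginal P(W=1)
print(query_w_given(g=1))     # P(W=1 | good governance)
```

    Ranking variables by how much conditioning on them shifts such a query is the enumeration-level analogue of the sensitivity analysis the abstract describes.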

  12. Comparing and improving proper orthogonal decomposition (POD) to reduce the complexity of groundwater models

    NASA Astrophysics Data System (ADS)

    Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas

    2017-04-01

    Physically-based modeling is a widespread tool in the understanding and management of natural systems. With the high complexity of many such models and the huge number of model runs necessary for parameter estimation and uncertainty analysis, overall run times can be prohibitively long even on modern computer systems. An encouraging strategy to tackle this problem is model reduction. In this contribution, we compare different proper orthogonal decomposition (POD, Siade et al. (2010)) methods and their potential applications to groundwater models. The POD method performs a singular value decomposition on system states as simulated by the complex (e.g., PDE-based) groundwater model taken at several time steps, so-called snapshots. The singular vectors with the highest information content resulting from this decomposition are then used as a basis for projection of the system of model equations onto a subspace of much lower dimensionality than the original complex model, thereby greatly reducing complexity and accelerating run times. In its original form, this method is only applicable to linear problems. Many real-world groundwater models are non-linear, though. These non-linearities are introduced either through model structure (unconfined aquifers) or boundary conditions (certain Cauchy boundaries, like rivers with variable connection to the groundwater table). To date, applications of POD have focused on groundwater models simulating pumping tests in confined aquifers with constant head boundaries. In contrast, POD model reduction either greatly loses accuracy or does not significantly reduce model run time if the above-mentioned non-linearities are introduced. We have also found that variable Dirichlet boundaries are problematic for POD model reduction. An extension to the POD method, called POD-DEIM, has been developed for non-linear groundwater models by Stanko et al. (2016). This method uses spatial interpolation points to build the equation system in the reduced model space, thereby allowing the recalculation of system matrices at every time step necessary for non-linear models while retaining the speed of the reduced model. This makes POD-DEIM applicable to groundwater models simulating unconfined aquifers. However, in our analysis, the method struggled to reproduce variable river boundaries accurately and gave no advantage over the original POD method for variable Dirichlet boundaries. We have developed another extension of POD that addresses these remaining problems by performing a second POD operation on the model matrix on the left-hand side of the equation. The method aims to at least reproduce the accuracy of the other methods where they are applicable, while outperforming them for setups with changing river boundaries or variable Dirichlet boundaries. We compared the new extension with original POD and POD-DEIM for different combinations of model structures and boundary conditions. The new method shows the potential of POD extensions for applications to non-linear groundwater systems and complex boundary conditions that go beyond the current, relatively limited range of applications. References: Siade, A. J., Putti, M., and Yeh, W. W.-G. (2010). Snapshot selection for groundwater model reduction using proper orthogonal decomposition. Water Resour. Res., 46(8):W08539. Stanko, Z. P., Boyce, S. E., and Yeh, W. W.-G. (2016). Nonlinear model reduction of unconfined groundwater flow using POD and DEIM. Advances in Water Resources, 97:130-143.
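
    The snapshot-POD step described above is compact enough to sketch. Below, a POD basis is extracted from snapshots of a linear, diffusion-like operator and used for a Galerkin projection of the linear system. The operator, snapshot count, and energy threshold are illustrative assumptions, and none of the non-linear POD-DEIM machinery is included.

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """POD basis from an (n_nodes x n_snapshots) snapshot matrix via SVD."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1   # modes retaining `energy`
    return U[:, :r]

# Illustrative linear system A h = b (1-D Laplacian as a stand-in model)
rng = np.random.default_rng(0)
n = 200
A = (np.diag(np.full(n, -2.0))
     + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
snaps = np.column_stack([np.linalg.solve(A, rng.normal(size=n))
                         for _ in range(20)])   # training snapshots

Phi = pod_basis(snaps)                          # n x r, with r << n
A_r = Phi.T @ A @ Phi                           # reduced operator (Galerkin)
b = rng.normal(size=n)
h_red = Phi @ np.linalg.solve(A_r, Phi.T @ b)   # reduced-order solution
h_full = np.linalg.solve(A, b)
print(Phi.shape, np.linalg.norm(h_full - h_red) / np.linalg.norm(h_full))
```

    Every solve in the reduced space costs O(r^3) instead of O(n^3), which is the source of the run-time savings; the accuracy question the abstract raises is whether the basis built from snapshots still spans the solution once non-linear or variable boundaries enter.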

  13. Development and optimization of carvedilol orodispersible tablets: enhancement of pharmacokinetic parameters in rabbits

    PubMed Central

    Aljimaee, Yazeed HM; El-Helw, Abdel-Rahim M; Ahmed, Osama AA; El-Say, Khalid M

    2015-01-01

    Background Carvedilol (CVD) is used for the treatment of essential hypertension, heart failure, and systolic dysfunction after myocardial infarction. Due to its low aqueous solubility and extensive first-pass metabolism, the absolute bioavailability of CVD does not exceed 30%. To overcome these drawbacks, the objective of this work was to improve the solubility and onset of action of CVD through complexation with hydroxypropyl-β-cyclodextrin and formulation of the prepared complex as orodispersible tablets (ODTs). Methods Compatibility between CVD and all tablet excipients was first assessed using differential scanning calorimetry and Fourier transform infrared spectroscopy, followed by complexation of CVD with different polymers and determination of the solubility of CVD in the prepared complexes. A Box-Behnken design (BBD) was used to study the effect of tablet formulation variables on the characteristics of the prepared tablets and to optimize preparation conditions. According to the BBD, 15 formulations of CVD-ODTs were prepared by direct compression and then evaluated for their quality attributes. The relative pharmacokinetic parameters of the optimized CVD-ODTs were compared with those of the marketed CVD tablet. A single dose, equivalent to 2.5 mg/kg CVD, was administered orally to New Zealand white rabbits using a double-blind, randomized, crossover design. Results The solubility of CVD was improved from 7.32 to 22.92 mg/mL after complexation with hydroxypropyl-β-cyclodextrin at a molar ratio of 1:2 (CVD to cyclodextrin). The formulated CVD-ODTs showed satisfactory results concerning tablet hardness (5.35 kg/cm2), disintegration time (18 seconds), and maximum amount of CVD released (99.72%). The pharmacokinetic data for the optimized CVD-ODT showed a significant (P<0.05) increase in maximum plasma concentration from 363.667 to 496.4 ng/mL, and a shortening of the time taken to reach maximum plasma concentration to 2 hours in comparison with the marketed tablet. Conclusion The optimized CVD-ODTs showed improved oral absorption of CVD and a subsequent acceleration of clinical effect, which is favored for hypertensive and cardiac patients. PMID:25834396
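
    The 15-run, three-factor Box-Behnken design mentioned above (12 edge-midpoint runs plus 3 centre replicates) can be generated directly. A sketch follows; the factor names and level ranges used for decoding are purely illustrative, not the formulation variables of the study.

```python
import itertools
import numpy as np

def box_behnken_3(n_center=3):
    """15-run Box-Behnken design for 3 factors in coded units (-1, 0, +1)."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):   # each factor pair
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]                            # third factor at midpoint
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * n_center                     # centre replicates
    return np.array(runs)

design = box_behnken_3()
print(design.shape)        # (15, 3): 12 edge runs + 3 centre points

# Decode coded levels to hypothetical real factor ranges (illustrative only)
lows, highs = np.array([2.0, 10.0, 0.5]), np.array([8.0, 30.0, 2.0])
real_levels = lows + (design + 1) / 2 * (highs - lows)
```

    Fitting a quadratic response surface to the 15 measured responses (hardness, disintegration time, release) then lets the optimizer pick the factor combination predicted to be best, which is the role the BBD plays in the study.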

  14. Student Solution Manual for Essential Mathematical Methods for the Physical Sciences

    NASA Astrophysics Data System (ADS)

    Riley, K. F.; Hobson, M. P.

    2011-02-01

    1. Matrices and vector spaces; 2. Vector calculus; 3. Line, surface and volume integrals; 4. Fourier series; 5. Integral transforms; 6. Higher-order ODEs; 7. Series solutions of ODEs; 8. Eigenfunction methods; 9. Special functions; 10. Partial differential equations; 11. Solution methods for PDEs; 12. Calculus of variations; 13. Integral equations; 14. Complex variables; 15. Applications of complex variables; 16. Probability; 17. Statistics.

  15. Essential Mathematical Methods for the Physical Sciences

    NASA Astrophysics Data System (ADS)

    Riley, K. F.; Hobson, M. P.

    2011-02-01

    1. Matrices and vector spaces; 2. Vector calculus; 3. Line, surface and volume integrals; 4. Fourier series; 5. Integral transforms; 6. Higher-order ODEs; 7. Series solutions of ODEs; 8. Eigenfunction methods; 9. Special functions; 10. Partial differential equations; 11. Solution methods for PDEs; 12. Calculus of variations; 13. Integral equations; 14. Complex variables; 15. Applications of complex variables; 16. Probability; 17. Statistics; Appendices; Index.

  16. Monthly streamflow forecasting using continuous wavelet and multi-gene genetic programming combination

    NASA Astrophysics Data System (ADS)

    Hadi, Sinan Jasim; Tombul, Mustafa

    2018-06-01

    Streamflow is an essential component of the hydrologic cycle at the regional and global scales and the main source of fresh water supply. It is highly associated with natural disasters such as droughts and floods. Therefore, accurate streamflow forecasting is essential. Forecasting streamflow in general, and monthly streamflow in particular, is a complex process that cannot be handled by data-driven models (DDMs) alone and requires pre-processing. Wavelet transformation is one such pre-processing technique; however, continuous wavelet transformation (CWT) produces many scales, which degrades the performance of any DDM because of the large number of redundant variables. This study proposes multi-gene genetic programming (MGGP) as a selection tool: after the CWT analysis, it selects the important scales to be fed into an artificial neural network (ANN). A basin located in southeastern Turkey is selected as the case study to demonstrate the forecasting ability of the proposed model. One-month-ahead downstream flow is used as the output, and downstream flow, upstream flow, rainfall, temperature, and potential evapotranspiration with associated lags are used as inputs. Before modeling, wavelet coherence transformation (WCT) analysis was conducted to analyze the relationships between the variables in the time-frequency domain. Several combinations were developed to investigate the effect of the variables on streamflow forecasting. The results indicated a high localized correlation between streamflow and the other variables, especially upstream flow. In the standalone layout, where the data entered the ANN and MGGP without CWT, performance was poor. In the best-scale layout, where the most highly correlated CWT scale is chosen and entered into the ANN and MGGP, performance increased slightly. Using the proposed model, performance improved dramatically, particularly in forecasting peak values, because of the inclusion of several scales in which seasonality and irregularity can be captured. Using hydrological and meteorological variables also improved the ability to forecast streamflow.
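
    A compressed sketch of the CWT-then-ANN part of the pipeline, assuming PyWavelets and scikit-learn are available. The synthetic series, the Morlet wavelet, and the hand-picked scales standing in for the MGGP selection step are all assumptions for illustration.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
flow = np.sin(np.linspace(0, 20 * np.pi, 600)) + 0.3 * rng.normal(size=600)

# Continuous wavelet transform: one coefficient series per scale
scales = np.arange(1, 64)
coefs, _ = pywt.cwt(flow, scales, 'morl')    # shape (n_scales, n_time)

# Keep a handful of "selected" scales (a stand-in for the MGGP selection
# step described in the abstract) and forecast one step ahead with an ANN
selected = coefs[[2, 8, 24]].T               # illustrative choice of scales
X, y = selected[:-1], flow[1:]
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:500], y[:500])
print(model.score(X[500:], y[500:]))         # out-of-sample R^2
```

    Feeding all 63 scales into the ANN instead of a selected few is exactly the redundancy problem the abstract describes; the selection step is what keeps the input space small and informative.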

  17. Quantifying the sources of variability in equine faecal egg counts: implications for improving the utility of the method.

    PubMed

    Denwood, M J; Love, S; Innocent, G T; Matthews, L; McKendrick, I J; Hillary, N; Smith, A; Reid, S W J

    2012-08-13

    The faecal egg count (FEC) is the most widely used means of quantifying the nematode burden of horses, and is frequently used in clinical practice to inform treatment and prevention. The statistical process underlying the FEC is complex, comprising a Poisson counting error process for each sample, compounded with an underlying continuous distribution of means between samples. Being able to quantify the sources of variability contributing to this distribution of means is a necessary step towards providing estimates of statistical power for future FEC and faecal egg count reduction test (FECRT) studies, and may help to improve the usefulness of the FEC technique by identifying and minimising unwanted sources of variability. Obtaining such estimates requires a hierarchical statistical model coupled with repeated FEC observations from a single animal over a short period of time. Here, we use this approach to provide the first comparative estimate of multiple sources of within-horse FEC variability. The results demonstrate that a substantial proportion of the observed variation in FEC between horses occurs as a result of variation in FEC within an animal, the major sources being aggregation of eggs within faeces and variation in egg concentration between faecal piles. The McMaster procedure itself is associated with a comparatively small coefficient of variation, and is therefore highly repeatable when a sufficiently large number of eggs are observed to reduce the error associated with the counting process. We conclude that the variation between samples taken from the same animal is substantial, but can be reduced through the use of larger homogenised faecal samples. Estimates are provided for the coefficient of variation (cv) associated with each within-animal source of variability in observed FEC, allowing the usefulness of individual FECs to be quantified, and providing a basis for future FEC and FECRT studies. Copyright © 2012 Elsevier B.V. All rights reserved.
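
    The hierarchical counting process described above (a Poisson counting error compounded with continuous between-sample variation in means) is easy to simulate, which makes the variance decomposition concrete. All variance components and the McMaster multiplication factor below are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(7)
n_horses, n_piles, n_counts = 50, 5, 4
mult = 50   # hypothetical McMaster factor: eggs/g per egg counted

# Between-horse means, then between-pile variation within each horse
horse_mu = rng.lognormal(mean=np.log(200), sigma=0.9, size=n_horses)
pile_mu = horse_mu[:, None] * rng.lognormal(-0.08, 0.4, (n_horses, n_piles))

# Poisson counting error on each replicate count, scaled back to eggs/g
counts = rng.poisson(pile_mu[..., None] / mult, (n_horses, n_piles, n_counts))
fec = counts * mult

within_cv = fec.std(axis=(1, 2)) / fec.mean(axis=(1, 2))   # per-horse CV
between_cv = fec.mean(axis=(1, 2)).std() / fec.mean()
print("mean within-horse CV:", round(float(within_cv.mean()), 2))
print("between-horse CV:", round(float(between_cv), 2))
```

    Increasing the effective sample mass (a larger `mult`-adjusted count from a homogenised sample) shrinks the Poisson term, which is the mechanism behind the paper's recommendation of larger homogenised faecal samples.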

  18. Understanding the Complexity of Temperature Dynamics in Xinjiang, China, from Multitemporal Scale and Spatial Perspectives

    PubMed Central

    Chen, Yaning; Li, Weihong; Liu, Zuhan; Wei, Chunmeng; Tang, Jie

    2013-01-01

    Based on the observed data from 51 meteorological stations during the period from 1958 to 2012 in Xinjiang, China, we investigated the complexity of temperature dynamics from the temporal and spatial perspectives by using a comprehensive approach including the correlation dimension (CD), classical statistics, and geostatistics. The main conclusions are as follows: (1) the integer CD values indicate that the temperature dynamics are a complex and chaotic system, which is sensitive to the initial conditions; (2) the complexity of temperature dynamics decreases as the temporal scale increases: to describe the temperature dynamics, at least 3 independent variables are needed at the daily scale, whereas at least 2 independent variables are needed at the monthly, seasonal, and annual scales; (3) the spatial patterns of CD values at different temporal scales indicate that the complex temperature dynamics derive from the complex landform. PMID:23843732
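
    A minimal Grassberger-Procaccia sketch of correlation dimension (CD) estimation, the core tool of the study: delay-embed the series, compute the correlation sum C(r), and fit the slope of log C(r) against log r. The logistic-map test series and the choice of scaling region are illustrative assumptions.

```python
import numpy as np

def correlation_dimension(x, m=3, tau=1):
    """Grassberger-Procaccia slope of log C(r) vs log r after delay embedding."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]                   # all pairwise distances
    rs = np.geomspace(np.percentile(d, 5), np.percentile(d, 50), 10)
    C = np.array([np.mean(d < r) for r in rs])       # correlation sum C(r)
    slope, _ = np.polyfit(np.log(rs), np.log(C), 1)
    return slope

# Chaotic logistic map as an illustrative scalar series
x = np.empty(1000)
x[0] = 0.4
for i in range(999):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])
print(round(correlation_dimension(x, m=2), 2))       # near 1 for this 1-D map
```

    The "at least k independent variables" statements in the abstract follow from rounding the estimated CD up to the next integer, the usual reading of a saturated correlation dimension.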

  19. Loss of 'complexity' and aging. Potential applications of fractals and chaos theory to senescence

    NASA Technical Reports Server (NTRS)

    Lipsitz, L. A.; Goldberger, A. L.

    1992-01-01

    The concept of "complexity," derived from the field of nonlinear dynamics, can be adapted to measure the output of physiologic processes that generate highly variable fluctuations resembling "chaos." We review data suggesting that physiologic aging is associated with a generalized loss of such complexity in the dynamics of healthy organ system function and hypothesize that such loss of complexity leads to an impaired ability to adapt to physiologic stress. This hypothesis is supported by observations showing an age-related loss of complex variability in multiple physiologic processes including cardiovascular control, pulsatile hormone release, and electroencephalographic potentials. If further research supports this hypothesis, measures of complexity based on chaos theory and the related geometric concept of fractals may provide new ways to monitor senescence and test the efficacy of specific interventions to modify the age-related decline in adaptive capacity.

  20. X-ray penumbral imaging diagnostic developments at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Bachmann, B.; Abu-Shawareb, H.; Alexander, N.; Ayers, J.; Bailey, C. G.; Bell, P.; Benedetti, L. R.; Bradley, D.; Collins, G.; Divol, L.; Döppner, T.; Felker, S.; Field, J.; Forsman, A.; Galbraith, J. D.; Hardy, C. M.; Hilsabeck, T.; Izumi, N.; Jarrot, C.; Kilkenny, J.; Kramer, S.; Landen, O. L.; Ma, T.; MacPhee, A.; Masters, N.; Nagel, S. R.; Pak, A.; Patel, P.; Pickworth, L. A.; Ralph, J. E.; Reed, C.; Rygg, J. R.; Thorn, D. B.

    2017-08-01

    X-ray penumbral imaging has been successfully fielded on a variety of inertial confinement fusion (ICF) capsule implosion experiments on the National Ignition Facility (NIF). We have demonstrated sub-5 μm resolution imaging of stagnated plasma cores (hot spots) at x-ray energies from 6 to 30 keV. These measurements are crucial for improving our understanding of the hot deuterium-tritium fuel assembly, which can be affected by various mechanisms, including complex 3-D perturbations caused by the support tent, fill tube or capsule surface roughness. Here we present the progress on several approaches to improve x-ray penumbral imaging experiments on the NIF. We will discuss experimental setups that include penumbral imaging from multiple lines-of-sight, target mounted penumbral apertures and variably filtered penumbral images. Such setups will improve the signal-to-noise ratio and the spatial imaging resolution, with the goal of enabling spatially resolved measurements of the hot spot electron temperature and material mix in ICF implosions.

  1. Utilizing food effects to overcome challenges in delivery of lipophilic bioactives: structural design of medical and functional foods.

    PubMed

    McClements, David Julian

    2013-12-01

    The oral bioavailability of many lipophilic bioactives, such as pharmaceuticals and nutraceuticals, is relatively low due to their poor solubility, permeability and/or chemical stability within the human gastrointestinal tract (GIT). The oral bioavailability of lipophilic bioactives can be improved by designing food matrices that control their release, solubilization, transport and absorption within the GIT. This article discusses the challenges associated with delivering lipophilic bioactive components, the impact of food composition and structure on oral bioavailability and the design of functional and medical foods for improving the oral bioavailability of lipophilic bioactives. Food-based delivery systems can be used to improve the oral bioavailability of lipophilic bioactives. There are a number of potential advantages to delivering lipophilic bioactives using functional or medical foods: greater compliance than conventional delivery forms; increased bioavailability and efficacy; and reduced variability in biological effects. However, food matrices are structurally complex multicomponent materials and research is still needed to identify optimum structures and compositions for particular bioactives.

  2. A visual tracking method based on improved online multiple instance learning

    NASA Astrophysics Data System (ADS)

    He, Xianhui; Wei, Yuxing

    2016-09-01

    Visual tracking is an active research topic in the field of computer vision and has been well studied in recent decades. The method based on multiple instance learning (MIL) was recently introduced into the tracking task and handles the template-drift problem well. However, the MIL method has relatively poor running efficiency and accuracy, because its strong-classifier update strategy is complicated and the speed of the classifier update does not always match the rate of change of the target's appearance. In this paper, we present a novel online effective MIL (EMIL) tracker. A new update strategy for the strong classifier is proposed to improve the running efficiency of the MIL method. In addition, to improve the tracking accuracy and stability of the MIL method, a new dynamic mechanism for renewing the classifier learning rate and a variable search window are proposed. Experimental results show that our method performs well in complex scenes, with strong stability and high efficiency.

  3. Biotemplating pores with size and shape diversity for Li-oxygen Battery Cathodes.

    PubMed

    Oh, Dahyun; Ozgit-Akgun, Cagla; Akca, Esin; Thompson, Leslie E; Tadesse, Loza F; Kim, Ho-Cheol; Demirci, Gökhan; Miller, Robert D; Maune, Hareem

    2017-04-04

    Synthetic porogens provide an easy way to create porous structures, but their usage is limited due to synthetic difficulties, process complexities and prohibitive costs. Here we investigate the use of bacteria, sustainable and naturally abundant materials, as a pore template. The bacteria require no chemical synthesis, come in variable sizes and shapes, degrade more easily and are approximately a million times cheaper than conventional porogens. We fabricate free-standing porous multiwalled carbon nanotube (MWCNT) films using cultured, harmless bacteria as porogens, and demonstrate substantial Li-oxygen battery performance improvement by porosity control. Pore volume as well as shape in the cathodes were easily tuned to improve oxygen evolution efficiency by 30% and double the full discharge capacity in repeated cycles compared to the compact MWCNT electrode films. The interconnected pores produced by the templates greatly improve the accessibility of reactants, allowing the achievement of 4,942 W/kg (8,649 Wh/kg) at 2 A/ge (1.7 mA/cm2).

  4. Biotemplating pores with size and shape diversity for Li-oxygen Battery Cathodes

    PubMed Central

    Oh, Dahyun; Ozgit-Akgun, Çagla; Akca, Esin; Thompson, Leslie E.; Tadesse, Loza F.; Kim, Ho-Cheol; Demirci, Gökhan; Miller, Robert D.; Maune, Hareem

    2017-01-01

    Synthetic porogens provide an easy way to create porous structures, but their usage is limited due to synthetic difficulties, process complexities and prohibitive costs. Here we investigate the use of bacteria, sustainable and naturally abundant materials, as a pore template. The bacteria require no chemical synthesis, come in variable sizes and shapes, degrade more easily and are approximately a million times cheaper than conventional porogens. We fabricate free-standing porous multiwalled carbon nanotube (MWCNT) films using cultured, harmless bacteria as porogens, and demonstrate substantial Li-oxygen battery performance improvement by porosity control. Pore volume as well as shape in the cathodes were easily tuned to improve oxygen evolution efficiency by 30% and double the full discharge capacity in repeated cycles compared to the compact MWCNT electrode films. The interconnected pores produced by the templates greatly improve the accessibility of reactants, allowing the achievement of 4,942 W/kg (8,649 Wh/kg) at 2 A/ge (1.7 mA/cm2). PMID:28374862

  5. Incongruity, incongruity resolution, and mental states: The measure and modification of situational awareness and control

    NASA Technical Reports Server (NTRS)

    Derks, Peter L.; Gillikin, Lynn S.

    1993-01-01

    The research reported here describes the process of induction of various mental states. Our goals were to measure and to manipulate both the behavioral and the neurological correlates of particular mental states that have previously been demonstrated to be either beneficial or deleterious in in-flight performance situations. The experimental paradigm involved developing a context of which the participants were aware, followed by the introduction of an incongruity into that context. The empirical questions involved how the incongruity was resolved and the consequent effects on mental state. The dependent variables were measures of both the short-term ERP changes and the longer-term brain-mapping indications of predominant mental states. The mission of NASA's Flight Management Division and Human/Automation Integration Branch centers on understanding and improving the interaction between a complex system and a human operator. Specifically, the goal is improved efficiency through better operating procedures and control strategies. More efficient performance in demanding flight environments depends on improved situational awareness and replanning for fault management.

  6. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    PubMed

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables is used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors, and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in using RF to develop predictive models with large environmental data sets.
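
    A compact sketch of the backward-elimination procedure the paper evaluates, using scikit-learn's random forest with its out-of-bag (OOB) score; the synthetic data set, drop fraction, and stopping rule are assumptions. Note that the paper's caveat applies to this sketch too: OOB accuracy measured inside the selection loop is upwardly biased.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Illustrative stand-in for the StreamCat predictors (p = 212 in the paper)
X, y = make_classification(n_samples=600, n_features=60, n_informative=8,
                           random_state=0)

def backward_elimination(X, y, min_features=5, drop_frac=0.2):
    """Iteratively drop the least important features, tracking OOB accuracy."""
    keep = np.arange(X.shape[1])
    history = []
    while len(keep) >= min_features:
        rf = RandomForestClassifier(n_estimators=300, oob_score=True,
                                    random_state=0, n_jobs=-1)
        rf.fit(X[:, keep], y)
        history.append((len(keep), rf.oob_score_))
        n_drop = max(1, int(drop_frac * len(keep)))
        order = np.argsort(rf.feature_importances_)   # ascending importance
        keep = keep[order[n_drop:]]                   # drop the weakest
    return history

for n_feat, oob in backward_elimination(X, y):
    print(f"{n_feat:3d} features  OOB accuracy {oob:.3f}")
```

    An honest accuracy estimate requires cross-validation folds held entirely outside this loop, which is exactly the external-validation design the paper uses.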

  7. The Generalization of Mutual Information as the Information between a Set of Variables: The Information Correlation Function Hierarchy and the Information Structure of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Wolf, David R.

    2004-01-01

    The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to describing the emergence of complex behaviors due to many-body or many-agent processes. They are particularly well suited to quantifying how the information carried among a set of variables or agents decomposes over its subsets. In more graphical language, they provide the information-theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a powerful toolkit for analyzing the complexity structure of complex many-agent systems. The information correlation functions are the natural generalization, to an arbitrary number of sets of variables, of the sequence starting with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information (entropy) and mutual information.
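
    A small worked example of the generalization the paper describes: starting from a joint probability table, the entropy and mutual-information sequence extends to a three-variable quantity (the total correlation, one common multi-variable generalization). The XOR distribution below is a standard illustration of purely synergistic information, chosen as an assumption here; it is not an example from the paper.

```python
import numpy as np

def entropy(p, keep=None):
    """Shannon entropy (bits) of the marginal of joint table p on axes `keep`."""
    if keep is not None:
        drop = tuple(i for i in range(p.ndim) if i not in keep)
        if drop:
            p = p.sum(axis=drop)    # marginalize out the other variables
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution of three binary variables with Z = X xor Y
pxyz = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        pxyz[x, y, x ^ y] = 0.25

H = entropy
total_corr = H(pxyz, {0}) + H(pxyz, {1}) + H(pxyz, {2}) - H(pxyz)
mi_xy = H(pxyz, {0}) + H(pxyz, {1}) - H(pxyz, {0, 1})
print(total_corr, mi_xy)    # 1.0 bit of three-way dependence, 0 pairwise MI
```

    Every pairwise mutual information in this system is zero, yet the three variables jointly carry one bit of dependence: the kind of synergistic structure that only a multi-variable hierarchy of information functions can expose.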

  8. Effects of visual feedback-induced variability on motor learning of handrim wheelchair propulsion.

    PubMed

    Leving, Marika T; Vegter, Riemer J K; Hartog, Johanneke; Lamoth, Claudine J C; de Groot, Sonja; van der Woude, Lucas H V

    2015-01-01

    It has been suggested that a higher intra-individual variability benefits the motor learning of wheelchair propulsion. The present study evaluated whether feedback-induced variability on wheelchair propulsion technique variables would also enhance the motor learning process. Learning was operationalized as an improvement in mechanical efficiency and propulsion technique, which are thought to be closely related during the learning process. Seventeen participants received visual feedback-based practice (feedback group) and 15 participants received regular practice (natural learning group). Both groups received an equal practice dose of 80 min, over 3 weeks, at 0.24 W/kg at a treadmill speed of 1.11 m/s. To compare the two groups, the pre- and post-tests were performed without feedback. The feedback group received real-time visual feedback on seven propulsion variables, with instruction to manipulate the presented variable to achieve the highest possible variability (first 4-min block) and to optimize it in the prescribed direction (second 4-min block). To increase motor exploration, the participants were unaware of the exact variable they received feedback on. Energy consumption and the propulsion technique variables, with their respective coefficients of variation, were calculated to evaluate the amount of intra-individual variability. The feedback group, which practiced with higher intra-individual variability, improved the propulsion technique between pre- and post-test to the same extent as the natural learning group. Mechanical efficiency improved between pre- and post-test in the natural learning group but remained unchanged in the feedback group. These results suggest that feedback-induced variability inhibited the improvement in mechanical efficiency. Moreover, since both groups improved propulsion technique but only the natural learning group improved mechanical efficiency, it can be concluded that improvements in mechanical efficiency and propulsion technique do not always appear simultaneously during the motor learning process. Their relationship is most likely modified by other factors, such as the amount of intra-individual variability.

  9. Effects of Visual Feedback-Induced Variability on Motor Learning of Handrim Wheelchair Propulsion

    PubMed Central

    Leving, Marika T.; Vegter, Riemer J. K.; Hartog, Johanneke; Lamoth, Claudine J. C.; de Groot, Sonja; van der Woude, Lucas H. V.

    2015-01-01

    Background It has been suggested that a higher intra-individual variability benefits the motor learning of wheelchair propulsion. The present study evaluated whether feedback-induced variability on wheelchair propulsion technique variables would also enhance the motor learning process. Learning was operationalized as an improvement in mechanical efficiency and propulsion technique, which are thought to be closely related during the learning process. Methods Seventeen participants received visual feedback-based practice (feedback group) and 15 participants received regular practice (natural learning group). Both groups received an equal practice dose of 80 min, over 3 weeks, at 0.24 W/kg at a treadmill speed of 1.11 m/s. To compare the two groups, the pre- and post-tests were performed without feedback. The feedback group received real-time visual feedback on seven propulsion variables, with instruction to manipulate the presented variable to achieve the highest possible variability (first 4-min block) and to optimize it in the prescribed direction (second 4-min block). To increase motor exploration, the participants were unaware of the exact variable they received feedback on. Energy consumption and the propulsion technique variables, with their respective coefficients of variation, were calculated to evaluate the amount of intra-individual variability. Results The feedback group, which practiced with higher intra-individual variability, improved the propulsion technique between pre- and post-test to the same extent as the natural learning group. Mechanical efficiency improved between pre- and post-test in the natural learning group but remained unchanged in the feedback group. Conclusion These results suggest that feedback-induced variability inhibited the improvement in mechanical efficiency. Moreover, since both groups improved propulsion technique but only the natural learning group improved mechanical efficiency, it can be concluded that improvements in mechanical efficiency and propulsion technique do not always appear simultaneously during the motor learning process. Their relationship is most likely modified by other factors, such as the amount of intra-individual variability. PMID:25992626

  10. Diversity pattern in Sesamum mutants selected for a semi-arid cropping system.

    PubMed

    Murty, B R; Oropeza, F

    1989-02-01

    Due to the complex requirements of moisture stress, substantial genetic diversity with a wide array of character combinations, together with effective simultaneous selection for several variables, is necessary for improving the productivity and adaptation of a component crop in order for it to fit into a cropping system under semi-arid tropical conditions. Sesamum indicum L. is grown in Venezuela after rice, sorghum, or maize under such conditions. A mutation breeding program was undertaken using six locally adapted varieties to develop genotypes suitable for the above system. The diversity pattern for nine variables was assessed by multivariate analysis in 301 M4 progenies. Analysis of the characteristic roots and principal components in three methods of selection, i.e., M2 bulks (A), individual plant selection throughout (B), and selection in M3 for a single variable (C), revealed differences in the pattern of variation between varieties, selection methods, and variety x method interactions. Method B was superior to the others and gave 17 of the 21 best M5 progenies. The 'Piritu' and 'CF' varieties yielded the most productive progenies in M5 and M6. Diversity was large, and selection was effective for such developmental traits as earliness and synchrony, combined with multiple disease resistance, which could be related to their importance by multivariate analyses. Considerable differences in the variety of character combinations among the high-yielding M5 progenies of 'CF' and 'Piritu' suggested possible further yield improvement. The superior response of 'Piritu' and 'CF' over other varieties in yield and adaptation was due to major changes in plant type and character associations. Multilocation testing of M5 generations revealed that the mutant progenies had a 40%-100% yield superiority over the parents; this was combined with earliness, synchrony, and multiple disease resistance, and was confirmed in the M6 generation grown on a commercial scale. This study showed that multivariate analysis is an effective tool for assessing diversity patterns and for choosing the appropriate variety and selection methodology in order to make rapid progress in meeting the complex requirements of semi-arid cropping systems.
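
    The "characteristic roots and principal components" step above is an eigendecomposition of the trait correlation matrix, i.e. ordinary PCA on the progeny-by-trait matrix. A minimal sketch follows, with a random stand-in matrix (301 progenies x 9 traits, matching the study's dimensions) since the original data are not available.

```python
import numpy as np

rng = np.random.default_rng(5)
traits = rng.normal(size=(301, 9))          # stand-in: 301 progenies, 9 traits

# Standardize, then eigendecompose the correlation matrix
z = (traits - traits.mean(axis=0)) / traits.std(axis=0)
corr = (z.T @ z) / (len(z) - 1)
eigvals, eigvecs = np.linalg.eigh(corr)     # characteristic roots and vectors
order = np.argsort(eigvals)[::-1]           # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = z @ eigvecs                        # principal component scores
explained = eigvals / eigvals.sum()
print(np.round(explained[:3], 2))           # variance share of the first 3 PCs
```

    Plotting progenies in the space of the leading component scores is what reveals the diversity pattern between varieties and selection methods that the abstract describes.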

  11. Preventing healthcare-associated infections through human factors engineering.

    PubMed

    Jacob, Jesse T; Herwaldt, Loreen A; Durso, Francis T

    2018-05-24

    Human factors engineering (HFE) approaches are increasingly being used in healthcare, but have been applied in relatively limited ways to infection prevention and control (IPC). Previous studies have focused on using selected HFE tools, but newer literature supports a system-based HFE approach to IPC. Cross-contamination and the existence of workarounds suggest that healthcare workers need better support to reduce and simplify the steps involved in delivering care. Simplifying workflow can lead to a better understanding of why a process fails and allow for improvements that reduce errors and increase efficiency. Hand hygiene can be improved using visual cues and nudges based on room layout. Using personal protective equipment appropriately appears simple, but involves a complex interaction of workload, behavior, emotion, and environmental variables, including product placement. HFE can help prevent pathogen transmission by improving environmental cleaning and the appropriate use of medical devices. Emerging evidence suggests that HFE can be applied in IPC to reduce healthcare-associated infections. Collaboration between HFE and IPC can help improve many basic best practices, including the use of hand hygiene and personal protective equipment by healthcare workers during patient care.

  12. Microbes as engines of ecosystem function: When does community structure enhance predictions of ecosystem processes?

    DOE PAGES

    Graham, Emily B.; Knelman, Joseph E.; Schindlbacher, Andreas; ...

    2016-02-24

    Microorganisms are vital in mediating the earth’s biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: ‘When do we need to understand microbial community structure to accurately predict function?’ We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology.

  13. Microbes as engines of ecosystem function: When does community structure enhance predictions of ecosystem processes?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, Emily B.; Knelman, Joseph E.; Schindlbacher, Andreas

    Microorganisms are vital in mediating the earth’s biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: ‘When do we need to understand microbial community structure to accurately predict function?’ We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology.

  14. Microbes as Engines of Ecosystem Function: When Does Community Structure Enhance Predictions of Ecosystem Processes?

    PubMed Central

    Graham, Emily B.; Knelman, Joseph E.; Schindlbacher, Andreas; Siciliano, Steven; Breulmann, Marc; Yannarell, Anthony; Beman, J. M.; Abell, Guy; Philippot, Laurent; Prosser, James; Foulquier, Arnaud; Yuste, Jorge C.; Glanville, Helen C.; Jones, Davey L.; Angel, Roey; Salminen, Janne; Newton, Ryan J.; Bürgmann, Helmut; Ingram, Lachlan J.; Hamer, Ute; Siljanen, Henri M. P.; Peltoniemi, Krista; Potthast, Karin; Bañeras, Lluís; Hartmann, Martin; Banerjee, Samiran; Yu, Ri-Qing; Nogaro, Geraldine; Richter, Andreas; Koranda, Marianne; Castle, Sarah C.; Goberna, Marta; Song, Bongkeun; Chatterjee, Amitava; Nunes, Olga C.; Lopes, Ana R.; Cao, Yiping; Kaisermann, Aurore; Hallin, Sara; Strickland, Michael S.; Garcia-Pausas, Jordi; Barba, Josep; Kang, Hojeong; Isobe, Kazuo; Papaspyrou, Sokratis; Pastorelli, Roberta; Lagomarsino, Alessandra; Lindström, Eva S.; Basiliko, Nathan; Nemergut, Diana R.

    2016-01-01

    Microorganisms are vital in mediating the earth’s biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: ‘When do we need to understand microbial community structure to accurately predict function?’ We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology. PMID:26941732

  15. Microbes as Engines of Ecosystem Function: When Does Community Structure Enhance Predictions of Ecosystem Processes?

    PubMed

    Graham, Emily B; Knelman, Joseph E; Schindlbacher, Andreas; Siciliano, Steven; Breulmann, Marc; Yannarell, Anthony; Beman, J M; Abell, Guy; Philippot, Laurent; Prosser, James; Foulquier, Arnaud; Yuste, Jorge C; Glanville, Helen C; Jones, Davey L; Angel, Roey; Salminen, Janne; Newton, Ryan J; Bürgmann, Helmut; Ingram, Lachlan J; Hamer, Ute; Siljanen, Henri M P; Peltoniemi, Krista; Potthast, Karin; Bañeras, Lluís; Hartmann, Martin; Banerjee, Samiran; Yu, Ri-Qing; Nogaro, Geraldine; Richter, Andreas; Koranda, Marianne; Castle, Sarah C; Goberna, Marta; Song, Bongkeun; Chatterjee, Amitava; Nunes, Olga C; Lopes, Ana R; Cao, Yiping; Kaisermann, Aurore; Hallin, Sara; Strickland, Michael S; Garcia-Pausas, Jordi; Barba, Josep; Kang, Hojeong; Isobe, Kazuo; Papaspyrou, Sokratis; Pastorelli, Roberta; Lagomarsino, Alessandra; Lindström, Eva S; Basiliko, Nathan; Nemergut, Diana R

    2016-01-01

    Microorganisms are vital in mediating the earth's biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: 'When do we need to understand microbial community structure to accurately predict function?' We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology.
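
    A sketch of the kind of model comparison run across the 82 datasets: cross-validated variance explained using environmental predictors alone versus environmental plus microbial predictors. The synthetic data, the linear model, and the effect sizes are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 120
env = rng.normal(size=(n, 4))        # e.g. temperature, moisture, pH, C:N
microbe = rng.normal(size=(n, 3))    # e.g. diversity, gene abundance, biomass

# Simulated process rate: mostly environmental, plus one microbial effect
rate = env @ [1.0, 0.6, 0.3, 0.2] + 0.8 * microbe[:, 0] + rng.normal(0, 1, n)

r2_env = cross_val_score(LinearRegression(), env, rate,
                         cv=5, scoring='r2').mean()
r2_both = cross_val_score(LinearRegression(), np.hstack([env, microbe]), rate,
                          cv=5, scoring='r2').mean()
print(f"env only R2 = {r2_env:.2f}, env + microbial R2 = {r2_both:.2f}")
```

    The gap between the two scores is the dataset-level quantity the meta-analysis tallies: how often, and by how much, community information improves on environmental predictors alone.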

  16. Multiscale entropy-based methods for heart rate variability complexity analysis

    NASA Astrophysics Data System (ADS)

    Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio

    2015-03-01

    Physiologic complexity is an important concept for characterizing time series from biological systems, and combined with multiscale analysis it can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e. SDiffqmax, qmax and qzero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the areas under the curves were computed for three physiological situations. Heart rate variability (HRV) time series from normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate at lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here shows potential for assessing complex physiological time series and deserves further investigation in a wider context.
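
    The paper's metrics derive from nonadditive entropy, but the shared multiscale pipeline (coarse-grain, then compute an entropy at each scale) can be sketched with the classic sample-entropy variant. The template length, tolerance, and white-noise test series below are conventional assumptions; this is not the authors' SDiffqmax/qmax/qzero machinery.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m+1 (Chebyshev norm)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def matches(mm):
        t = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=-1)
        return (np.sum(d <= r) - len(t)) / 2    # pairs, excluding self-matches
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def coarse_grain(x, scale):
    """Non-overlapping window means: the coarse-graining step of MSE."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(0)
rr = rng.normal(size=1000)          # stand-in for an RR-interval series
r0 = 0.2 * rr.std()                 # tolerance fixed at scale 1, as usual
mse = [sample_entropy(coarse_grain(rr, s), r=r0) for s in range(1, 6)]
print(np.round(mse, 2))             # for white noise, entropy falls with scale
```

    The diagnostic content lives in the shape of this entropy-versus-scale curve: uncorrelated noise decays with scale, whereas healthy HRV tends to sustain entropy across scales.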

  17. Accuracy Improvement in Magnetic Field Modeling for an Axisymmetric Electromagnet

    NASA Technical Reports Server (NTRS)

    Ilin, Andrew V.; Chang-Diaz, Franklin R.; Gurieva, Yana L.; Il'in, Valery P.

    2000-01-01

    This paper examines the accuracy and calculation speed of magnetic field computation in an axisymmetric electromagnet. Different numerical techniques, based on an adaptive nonuniform grid, high-order finite difference approximations, and semi-analytical calculation of boundary conditions, are considered. These techniques are being applied to the modeling of the Variable Specific Impulse Magnetoplasma Rocket. For high-accuracy calculations, a fourth-order scheme offers dramatic advantages over a second-order scheme. For complex physical configurations of interest in plasma propulsion, a second-order scheme with a nonuniform mesh gives the best results. The relative advantages of the various methods are also described for cases where the speed of computation is an important consideration.
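
    The accuracy-order comparison at the heart of the paper can be demonstrated with one-dimensional central differences: halving the step cuts a second-order error by roughly 4x and a fourth-order error by roughly 16x. The test function is an illustrative assumption.

```python
import numpy as np

# Central differences for f'(x): second order vs fourth order
def d1_o2(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

def d1_o4(f, x, h):
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12 * h)

f, df = np.sin, np.cos                 # test function with known derivative
x = 1.0
for h in (0.1, 0.05, 0.025):
    e2 = abs(d1_o2(f, x, h) - df(x))
    e4 = abs(d1_o4(f, x, h) - df(x))
    print(f"h={h:<6} O(h^2) err={e2:.2e}  O(h^4) err={e4:.2e}")
# Halving h cuts the O(h^2) error ~4x and the O(h^4) error ~16x
```

    The trade-off the paper reports follows from the wider stencil of the fourth-order formula: it pays off on smooth fields, but on strongly nonuniform meshes around complex geometry a second-order scheme with locally refined spacing can win.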

  18. Antifungal Therapy in Birds: Old Drugs in a New Jacket.

    PubMed

    Antonissen, Gunther; Martel, An

    2018-05-01

    The use of antifungals in birds is characterized by interspecies and interindividual variability in the pharmacokinetics, affecting drug safety and efficacy. Oral antifungal drug absorption is a complex process affected by drug formulation characteristics, gastrointestinal anatomy, and physiology. New antifungal drug delivery systems can enhance drug stability, reduce off-target side effects, prolong residence time in the blood, and improve efficacy. Topical administration of antifungals through nebulization shows promising results. However, therapeutic output is highly influenced by drug formulation and type of nebulizer, indicating these factors should be taken into account when selecting this medication route. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. LinguisticBelief: a java application for linguistic evaluation using belief, fuzzy sets, and approximate reasoning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darby, John L.

    LinguisticBelief is a Java computer code that evaluates combinations of linguistic variables using an approximate reasoning rule base. Each variable is composed of fuzzy sets, and a rule base describes the reasoning on combinations of the variables' fuzzy sets. Uncertainty is considered and propagated through the rule base using the belief/plausibility measure. The mathematics of fuzzy sets, approximate reasoning, and belief/plausibility are complex. Without an automated tool, this complexity precludes their application to all but the simplest of problems. LinguisticBelief automates the use of these techniques, allowing complex problems to be evaluated easily. LinguisticBelief can be used free of charge on any Windows XP machine. This report documents the use and structure of the LinguisticBelief code and the deployment package for installation on client machines.
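
    A stripped-down sketch of the fuzzy-set and min-max approximate-reasoning layer that LinguisticBelief automates; the membership functions, variables, and rules here are invented for illustration, and the belief/plausibility propagation layer is not included.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside [a, d], 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Linguistic variables with illustrative fuzzy sets
likelihood = {'low':  lambda x: trapezoid(x, -1, 0, 0.2, 0.4),
              'high': lambda x: trapezoid(x, 0.3, 0.6, 1, 2)}
severity = {'minor': lambda x: trapezoid(x, -1, 0, 0.3, 0.5),
            'major': lambda x: trapezoid(x, 0.4, 0.7, 1, 2)}

def infer_risk(l, s):
    """Rule base: min for AND within a rule, max across rules (Mamdani-style)."""
    high = min(likelihood['high'](l), severity['major'](s))
    low = max(min(likelihood['low'](l), severity['minor'](s)),
              min(likelihood['low'](l), severity['major'](s)))
    return {'high': high, 'low': low}

print(infer_risk(0.8, 0.9))   # strongly activates 'risk is high'
print(infer_risk(0.1, 0.2))   # strongly activates 'risk is low'
```

    In the full tool, each such rule evaluation would additionally carry a belief/plausibility interval through the rule base rather than a single membership degree.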

  20. Variation of M···H-C Interactions in Square-Planar Complexes of Nickel(II), Palladium(II), and Platinum(II) Probed by Luminescence Spectroscopy and X-ray Diffraction at Variable Pressure.

    PubMed

    Poirier, Stéphanie; Lynn, Hudson; Reber, Christian; Tailleur, Elodie; Marchivie, Mathieu; Guionneau, Philippe; Probert, Michael R

    2018-06-12

    Luminescence spectra of isoelectronic square-planar d8 complexes with 3d, 4d, and 5d metal centers show d-d luminescence with an energetic order different from that of the spectrochemical series, indicating that additional structural effects, such as different ligand-metal-ligand angles, are important factors. Variable-pressure luminescence spectra of square-planar nickel(II), palladium(II), and platinum(II) complexes with dimethyldithiocarbamate ({CH3}2DTC) ligands and their deuterated analogues show unexpected variations of the shifts of their maxima. High-resolution crystal structures and crystal structures at variable pressure for [Pt{(CH3)2DTC}2] indicate that intermolecular M···H-C interactions are at the origin of these different shifts.
