Multi-fidelity methods for uncertainty quantification in transport problems
NASA Astrophysics Data System (ADS)
Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.
2016-12-01
We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, a re-scaled Multi Level Monte Carlo (rMLMC) method. The rMLMC is based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss the advantages of each approach.
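The control-variate structure behind MLMC-style estimators can be sketched in a few lines: many cheap low-fidelity samples estimate the bulk of the expectation, and a small number of paired high-fidelity evaluations correct the bias. The two model functions below are illustrative stand-ins, not the flow-and-transport solvers from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cheap (low-fidelity) and expensive (high-fidelity) models of
# the same quantity of interest; both are assumptions for illustration.
def low_fidelity(x):
    return np.sin(x) + 0.1 * x

def high_fidelity(x):
    return np.sin(x) + 0.1 * x + 0.05 * np.cos(5 * x)

# Two-level estimator: many cheap samples for the low-fidelity mean, plus a
# few paired samples of the high-minus-low correction to remove the bias.
x_coarse = rng.normal(size=100_000)   # cheap level: large sample
x_fine = rng.normal(size=500)         # expensive level: small, paired sample

estimate = (low_fidelity(x_coarse).mean()
            + (high_fidelity(x_fine) - low_fidelity(x_fine)).mean())
```

The correction term is evaluated on matched inputs, so its variance (and hence the number of expensive samples needed) stays small.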
Bittig, Arne T; Uhrmacher, Adelinde M
2017-01-01
Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.
Zhang, Xinyan; Li, Bingzong; Han, Huiying; Song, Sha; Xu, Hongxia; Hong, Yating; Yi, Nengjun; Zhuang, Wenzhuo
2018-05-10
Multiple myeloma (MM), like other cancers, is caused by the accumulation of genetic abnormalities. Heterogeneity exists in patients' responses to treatments, for example, bortezomib. This urges efforts to identify biomarkers from numerous molecular features and build predictive models for identifying patients that can benefit from a certain treatment scheme. However, previous studies treated the multi-level ordinal drug response as a binary response in which only responsive and non-responsive groups are considered. It is desirable to analyze the multi-level drug response directly, rather than collapsing it into two groups. In this study, we present a novel method to identify significantly associated biomarkers and then develop an ordinal genomic classifier using the hierarchical ordinal logistic model. The proposed hierarchical ordinal logistic model employs a heavy-tailed Cauchy prior on the coefficients and is fitted by an efficient quasi-Newton algorithm. We apply our hierarchical ordinal regression approach to analyze two publicly available datasets for MM with five-level drug response and numerous gene expression measures. Our results show that our method is able to identify genes associated with the multi-level drug response and to generate powerful predictive models for predicting the multi-level response. The proposed method allows us to jointly fit numerous correlated predictors and thus build efficient models for predicting the multi-level drug response. The predictive model for the multi-level drug response can be more informative than previous approaches. Thus, the proposed approach provides a powerful tool for predicting multi-level drug response and has an important impact on cancer studies.
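A minimal sketch of MAP fitting for a cumulative-logit (proportional odds) model with a heavy-tailed Cauchy prior on the coefficients, optimized with a quasi-Newton method as the abstract describes. The toy data, cutpoint parameterization, and prior scale below are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)

# Toy data: n samples, p predictors, K = 3 ordinal response levels (0, 1, 2).
n, p = 200, 4
X = rng.normal(size=(n, p))
true_beta = np.array([1.0, -1.0, 0.0, 0.0])
latent = X @ true_beta + rng.logistic(size=n)
y = np.digitize(latent, [-1.0, 1.0])          # ordinal levels 0..2

def neg_log_posterior(params, scale=2.5):
    # params = [cut1, log(gap), beta...]; cutpoints kept ordered via log-gap.
    c1, log_gap = params[0], params[1]
    cuts = np.array([c1, c1 + np.exp(log_gap)])
    beta = params[2:]
    eta = X @ beta
    cum = expit(cuts[None, :] - eta[:, None])          # P(y <= k), shape (n, 2)
    probs = np.diff(np.hstack([np.zeros((n, 1)), cum, np.ones((n, 1))]), axis=1)
    loglik = np.log(probs[np.arange(n), y] + 1e-12).sum()
    # heavy-tailed Cauchy prior on the coefficients (scale is an assumption)
    log_prior = -np.log(1 + (beta / scale) ** 2).sum()
    return -(loglik + log_prior)

# quasi-Newton optimization of the posterior mode
res = minimize(neg_log_posterior, np.zeros(2 + p), method="L-BFGS-B")
beta_hat = res.x[2:]
```

With 200 samples the fitted coefficients recover the signs of the two informative predictors while the Cauchy prior lightly shrinks the rest.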
Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl
2015-11-01
Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
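The Kalman-filter component of such a tracker can be sketched with a constant-velocity motion model; the noise covariances and the simulated particle below are illustrative assumptions, not the authors' settings.

```python
import numpy as np

# Constant-velocity Kalman filter for 2D particle tracking (a sketch of the
# probabilistic tracking component; all matrices are assumed values).
dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)   # state transition: position + velocity
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)   # detections observe position only
Q = 0.01 * np.eye(4)                  # process noise (assumed)
R = 0.25 * np.eye(2)                  # detection noise (assumed)

def kalman_step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the associated detection z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Track a particle moving with constant velocity (1.0, 0.5) per frame.
rng = np.random.default_rng(2)
x, P = np.zeros(4), np.eye(4)
for t in range(1, 30):
    z = np.array([t * 1.0, t * 0.5]) + rng.normal(0, 0.5, 2)
    x, P = kalman_step(x, P, z)
```

In the full approach the detection fed to the update step is the one chosen by the multi-frame association algorithm rather than the raw nearest detection.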
Vickers, T. Winston; Ernest, Holly B.; Boyce, Walter M.
2017-01-01
The importance of examining multiple hierarchical levels when modeling resource use for wildlife has been acknowledged for decades. Multi-level resource selection functions have recently been promoted as a method to synthesize resource use across nested organizational levels into a single predictive surface. Analyzing multiple scales of selection within each hierarchical level further strengthens multi-level resource selection functions. We extend this multi-level, multi-scale framework to modeling resistance for wildlife by combining multi-scale resistance surfaces from two data types, genetic and movement. Resistance estimation has typically been conducted with one of these data types, or compared between the two. However, we contend it is not an either/or issue and that resistance may be better modeled using a combination of resistance surfaces that represent processes at different hierarchical levels. Resistance surfaces estimated from genetic data characterize temporally broad-scale dispersal and successful breeding over generations, whereas resistance surfaces estimated from movement data represent fine-scale travel and contextualized movement decisions. We used telemetry and genetic data from a long-term study on pumas (Puma concolor) in a highly developed landscape in southern California to develop a multi-level, multi-scale resource selection function and a multi-level, multi-scale resistance surface. We used these multi-level, multi-scale surfaces to identify resource use patches and resistant kernel corridors. Across levels, we found pumas avoided urban areas, agricultural areas, and roads and preferred riparian areas and more rugged terrain. For other landscape features, selection differed among levels, as did the scales of selection for each feature. With these results, we developed a conservation plan for one of the most isolated puma populations in the U.S.
Our approach captured a wide spectrum of ecological relationships for a population, resulted in effective conservation planning, and can be readily applied to other wildlife species. PMID:28609466
Adaptive multi-GPU Exchange Monte Carlo for the 3D Random Field Ising Model
NASA Astrophysics Data System (ADS)
Navarro, Cristóbal A.; Huang, Wei; Deng, Youjin
2016-08-01
This work presents an adaptive multi-GPU Exchange Monte Carlo approach for the simulation of the 3D Random Field Ising Model (RFIM). The design is based on a two-level parallelization. The first level, spin-level parallelism, maps the parallel computation as optimal 3D thread-blocks that simulate blocks of spins in shared memory with minimal halo surface, assuming a constant block volume. The second level, replica-level parallelism, uses multi-GPU computation to handle the simulation of an ensemble of replicas. CUDA's concurrent kernel execution feature is used in order to fill the occupancy of each GPU with many replicas, providing a performance boost that is more pronounced at the smallest values of L. In addition to the two-level parallel design, the work proposes an adaptive multi-GPU approach that dynamically builds a proper temperature set free of exchange bottlenecks. The strategy is based on mid-point insertions at the temperature gaps where the exchange rate is most compromised. The extra work generated by the insertions is balanced across the GPUs independently of where the mid-point insertions were performed. Performance results show that spin-level performance is approximately two orders of magnitude faster than a single-core CPU version and one order of magnitude faster than a parallel multi-core CPU version running on 16 cores. Multi-GPU performance is highly convenient under a weak scaling setting, reaching up to 99% efficiency as long as the number of GPUs and L increase together. The combination of the adaptive approach with the parallel multi-GPU design has extended our possibilities of simulation to sizes of L = 32, 64 for a workstation with two GPUs. Sizes beyond L = 64 can eventually be studied using larger multi-GPU systems.
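The mid-point insertion strategy can be sketched independently of the GPU machinery: find the temperature gap with the lowest exchange (swap) acceptance rate and bisect it until no gap is a bottleneck. The acceptance threshold and the optimistic rate update below are assumptions for illustration.

```python
import numpy as np

# Adaptive temperature-set refinement: insert a mid-point temperature at the
# gap whose replica-exchange acceptance rate is most compromised.
def refine_temperatures(temps, exchange_rates, min_rate=0.2):
    temps = list(temps)
    while min(exchange_rates) < min_rate:
        worst = int(np.argmin(exchange_rates))
        mid = 0.5 * (temps[worst] + temps[worst + 1])
        temps.insert(worst + 1, mid)
        # In practice the two new gaps' rates would be re-measured by further
        # simulation; here we optimistically double the old rate (assumption).
        new_rate = min(2 * exchange_rates[worst], 1.0)
        exchange_rates[worst:worst + 1] = [new_rate, new_rate]
    return temps

# Gap (2.0, 4.0) has a 0.1 acceptance rate, so it receives a mid-point.
temps = refine_temperatures([1.0, 2.0, 4.0, 8.0], [0.5, 0.1, 0.4])
```

After refinement the bottleneck gap is bisected at 3.0 and every gap clears the threshold.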
Automatic multi-organ segmentation using learning-based segmentation and level set optimization.
Kohlberger, Timo; Sofka, Michal; Zhang, Jingdan; Birkbeck, Neil; Wetzl, Jens; Kaftan, Jens; Declerck, Jérôme; Zhou, S Kevin
2011-01-01
We present a novel generic segmentation system for the fully automatic multi-organ segmentation from CT medical images. We thereby combine the advantages of learning-based approaches on a point-cloud-based shape representation, such as speed, robustness, and point correspondences, with those of PDE-optimization-based level set approaches, such as high accuracy and the straightforward prevention of segment overlaps. In a benchmark on 10-100 annotated datasets for the liver, the lungs, and the kidneys we show that the proposed system yields segmentation accuracies of 1.17-2.89 mm average surface error. The level set segmentation (which is initialized by the learning-based segmentations) thereby contributes a 20%-40% increase in accuracy.
ERIC Educational Resources Information Center
Uiboleht, Kaire; Karm, Mari; Postareff, Liisa
2016-01-01
Teaching approaches in higher education are well researched at the general level. This research has identified not only the two broad categories of content-focused and learning-focused approaches to teaching, but also consonance and dissonance between the aspects of teaching. Consonance means that theoretically coherent teaching practices are employed, but…
Using Nonlinear Programming in International Trade Theory: The Factor-Proportions Model
ERIC Educational Resources Information Center
Gilbert, John
2004-01-01
Students at all levels benefit from a multi-faceted approach to learning abstract material. The most commonly used technique in teaching the pure theory of international trade is a combination of geometry and algebraic derivations. Numerical simulation can provide a valuable third support to these approaches. The author describes a simple…
Civilisation on the Couch: Theorising Multi-Levelled Psychoanalytical Arts Practice
ERIC Educational Resources Information Center
Benjamin, Garfield
2014-01-01
This paper combines two psychological approaches to art to theorise a both subjective and cultural methodology for practice-based arts research. The first psychoanalytical approach will follow the work of Deleuze and Guattari's Schizoanalysis, considering the role of the artist in order to assess their work in relation to society from an…
Arbabi, Vahid; Pouran, Behdad; Weinans, Harrie; Zadpoor, Amir A
2016-09-06
Analytical and numerical methods have been used to extract essential engineering parameters such as elastic modulus, Poisson's ratio, permeability and diffusion coefficient from experimental data in various types of biological tissues. The major limitation associated with analytical techniques is that they are often only applicable to problems with simplified assumptions. Numerical multi-physics methods, on the other hand, enable minimizing the simplified assumptions but require substantial computational expertise, which is not always available. In this paper, we propose a novel approach that combines inverse and forward artificial neural networks (ANNs) which enables fast and accurate estimation of the diffusion coefficient of cartilage without any need for computational modeling. In this approach, an inverse ANN is trained using our multi-zone biphasic-solute finite-bath computational model of diffusion in cartilage to estimate the diffusion coefficient of the various zones of cartilage given the concentration-time curves. Robust estimation of the diffusion coefficients, however, requires introducing certain levels of stochastic variation during the training process. Determining the required level of stochastic variation is performed by coupling the inverse ANN with a forward ANN that receives the diffusion coefficient as input and returns the concentration-time curve as output. Combined together, forward-inverse ANNs enable computationally inexperienced users to obtain fast and accurate estimates of the diffusion coefficients of cartilage zones. The diffusion coefficients estimated using the proposed approach are compared with those determined using direct scanning of the parameter space as the optimization approach. It has been shown that both approaches yield comparable results.
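The forward-inverse coupling can be illustrated with a toy diffusion model: a learned inverse map estimates the diffusion coefficient from a concentration-time curve after training on simulated curves perturbed by stochastic variation, and the forward model closes the loop by checking the reconstructed curve. A linear least-squares regressor stands in for the inverse ANN here; the curve model, parameter range, and noise level are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model (an assumption standing in for the finite-element model):
# concentration approaching saturation, c(t) = 1 - exp(-D * t).
t = np.linspace(0.1, 5.0, 20)

def forward(D):
    return 1.0 - np.exp(-D * t)

# "Inverse model" stand-in: linear regression from noisy curves to D, trained
# on simulated data with stochastic variation added, per the paper's idea.
D_train = rng.uniform(0.2, 2.0, 500)
curves = np.array([forward(D) for D in D_train])
curves += rng.normal(0, 0.02, curves.shape)   # stochastic training variation
A = np.hstack([curves, np.ones((500, 1))])
w, *_ = np.linalg.lstsq(A, D_train, rcond=None)

def inverse(curve):
    return float(np.append(curve, 1.0) @ w)

# Closing the loop: estimate D for an unseen curve, then re-simulate it.
D_true = 1.0
D_est = inverse(forward(D_true))
residual = np.max(np.abs(forward(D_est) - forward(D_true)))
```

The feature noise doubles as implicit regularization, which is loosely analogous to the role stochastic variation plays in making the inverse network robust.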
NASA Astrophysics Data System (ADS)
Joyce, Steven; Hartley, Lee; Applegate, David; Hoek, Jaap; Jackson, Peter
2014-09-01
Forsmark in Sweden has been proposed as the site of a geological repository for spent high-level nuclear fuel, to be located at a depth of approximately 470 m in fractured crystalline rock. The safety assessment for the repository has required a multi-disciplinary approach to evaluate the impact of hydrogeological and hydrogeochemical conditions close to the repository and in a wider regional context. Assessing the consequences of potential radionuclide releases requires quantitative site-specific information concerning the details of groundwater flow on the scale of individual waste canister locations (1-10 m) as well as details of groundwater flow and composition on the scale of groundwater pathways between the facility and the surface (500 m to 5 km). The purpose of this article is to provide an illustration of multi-scale modeling techniques and the results obtained when combining aspects of local-scale flows in fractures around a potential contaminant source with regional-scale groundwater flow and transport subject to natural evolution of the system. The approach set out is novel, as it incorporates both different scales of model and different levels of detail, combining discrete fracture network and equivalent continuous porous medium representations of fractured bedrock.
Experimental Design for Multi-drug Combination Studies Using Signaling Networks
Huang, Hengzhen; Fang, Hong-Bin; Tan, Ming T.
2017-01-01
Combinations of multiple drugs are an important approach to maximize the chance of therapeutic success by inhibiting multiple pathways/targets. Analytic methods for studying drug combinations have received increasing attention because major advances in biomedical research have made available a large number of potential agents for testing. The preclinical experiment on multi-drug combinations plays a key role in (especially cancer) drug development because of the complex nature of the disease and the need to reduce development time and costs. Despite recent progress in statistical methods for assessing drug interaction, there is an acute lack of methods for designing experiments on multi-drug combinations. The number of combinations grows exponentially with the number of drugs and dose levels, quickly precluding laboratory testing. Utilizing experimental dose-response data of single drugs and a few combinations, along with pathway/network information, to obtain an in silico estimate of the functional structure of the dose-response relationship, we propose in this paper an optimal design that allows exploration of the dose-effect surface with the smallest possible sample size. Simulation studies show that our proposed methods perform well. PMID:28960231
Lakerveld, Jeroen; Brug, Johannes; Bot, Sandra; Teixeira, Pedro J; Rutter, Harry; Woodward, Euan; Samdal, Oddrun; Stockley, Lynn; De Bourdeaudhuij, Ilse; van Assema, Patricia; Robertson, Aileen; Lobstein, Tim; Oppert, Jean-Michel; Adány, Róza; Nijpels, Giel
2012-09-17
The prevalence of overweight and obesity in Europe is high. It is a major cause of the overall rates of many of the main chronic (or non-communicable) diseases in this region and is characterized by an unequal socio-economic distribution within the population. Obesity is largely determined by modifiable lifestyle behaviours such as low physical activity levels, sedentary behaviour and consumption of energy dense diets. It is increasingly being recognised that effective responses must go beyond interventions that only focus on a specific individual, social or environmental level and instead embrace system-based multi-level intervention approaches that address both the individual and environment. The EU-funded project "sustainable prevention of obesity through integrated strategies" (SPOTLIGHT) aims to increase and combine knowledge on the wide range of determinants of obesity in a systematic way, and to identify multi-level intervention approaches that are strong in terms of Reach, Efficacy, Adoption, Implementation and Maintenance (RE-AIM). SPOTLIGHT comprises a series of systematic reviews on: individual-level predictors of success in behaviour change obesity interventions; social and physical environmental determinants of obesity; and on the RE-AIM of multi-level interventions. An interactive web-atlas of currently running multi-level interventions will be developed, and enhancing and impeding factors for implementation will be described. At the neighbourhood level, these elements will inform the development of methods to assess obesogenicity of diverse environments, using remote imaging techniques linked to geographic information systems. The validity of these methods will be evaluated using data from surveys of health and lifestyles of adults residing in the neighbourhoods surveyed. At both the micro- and macro-levels (national and international) the different physical, economical, political and socio-cultural elements will be assessed.
SPOTLIGHT offers the potential to develop approaches that combine an understanding of the obesogenicity of environments in Europe, and thus how they can be improved, with an appreciation of the individual factors that explain why people respond differently to such environments. Its findings will inform governmental authorities and professionals, academics, NGOs and private sector stakeholders engaged in the development and implementation of policies to tackle the obesity epidemic in Europe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Passel, Steven (Steven.vanpassel@uhasselt.be; University of Antwerp, Department Bioscience Engineering, Groenenborgerlaan 171, 2020 Antwerp); Meul, Marijke
Sustainability assessment is needed to build sustainable farming systems. A broad range of sustainability concepts, methodologies and applications already exists. They differ in level, focus, orientation, measurement, scale, presentation and intended end-users. In this paper we illustrate that a smart combination of existing methods with different levels of application can make sustainability assessment more profound, and that it can broaden the insights of different end-user groups. An overview of sustainability assessment tools on different levels and for different end-users shows the complementarities and the opportunities of using different methods. In a case study, a combination of the sustainable value approach (SVA) and MOTIFS is used to perform a sustainability evaluation of farming systems in Flanders. SVA is used to evaluate sustainability at sector level, and is especially useful to support policy makers, while MOTIFS is used to support and guide farmers towards sustainability at farm level. The combined use of the two methods with complementary goals can widen the insights of both farmers and policy makers, without losing the particularities of the different approaches. To stimulate and support further research and applications, we propose guidelines for multilevel and multi-user sustainability assessments.
Highlights: We give an overview of sustainability assessment tools for agricultural systems. SVA and MOTIFS are used to evaluate the sustainability of dairy farming in Flanders. Combination of methods with different levels broadens the insights of different end-user groups. We propose guidelines for multilevel and multi-user sustainability assessments.
A Multi-targeted Approach to Suppress Tumor-Promoting Inflammation
Samadi, Abbas K.; Georgakilas, Alexandros G.; Amedei, Amedeo; Amin, Amr; Bishayee, Anupam; Lokeshwar, Bal L.; Grue, Brendan; Panis, Carolina; Boosani, Chandra S.; Poudyal, Deepak; Stafforini, Diana M.; Bhakta, Dipita; Niccolai, Elena; Guha, Gunjan; Rupasinghe, H.P. Vasantha; Fujii, Hiromasa; Honoki, Kanya; Mehta, Kapil; Aquilano, Katia; Lowe, Leroy; Hofseth, Lorne J.; Ricciardiello, Luigi; Ciriolo, Maria Rosa; Singh, Neetu; Whelan, Richard L.; Chaturvedi, Rupesh; Ashraf, S. Salman; Kumara, HMC Shantha; Nowsheen, Somaira; Mohammed, Sulma I.; Helferich, William G.; Yang, Xujuan
2015-01-01
Cancers harbor significant genetic heterogeneity and patterns of relapse following many therapies are due to evolved resistance to treatment. While efforts have been made to combine targeted therapies, significant levels of toxicity have stymied efforts to effectively treat cancer with multi-drug combinations using currently approved therapeutics. We discuss the relationship between tumor-promoting inflammation and cancer as part of a larger effort to develop a broad-spectrum therapeutic approach aimed at a wide range of targets to address this heterogeneity. Specifically, macrophage migration inhibitory factor, cyclooxygenase-2, transcription factor nuclear factor-kappaB, tumor necrosis factor alpha, inducible nitric oxide synthase, protein kinase B, and CXC chemokines are reviewed as important anti-inflammatory targets while curcumin, resveratrol, epigallocatechin gallate, genistein, lycopene, and anthocyanins are reviewed as low-cost, low-toxicity means by which these targets might all be reached simultaneously. Future translational work will need to assess the resulting synergies of rationally designed anti-inflammatory mixtures (employing low-toxicity constituents), and then combine this with similar approaches targeting the most important pathways across the range of cancer hallmark phenotypes. PMID:25951989
A hybrid solution approach for a multi-objective closed-loop logistics network under uncertainty
NASA Astrophysics Data System (ADS)
Mehrbod, Mehrdad; Tu, Nan; Miao, Lixin
2015-06-01
The design of closed-loop logistics (forward and reverse logistics) has attracted growing attention with the stringent pressures of customer expectations, environmental concerns and economic factors. This paper considers a multi-product, multi-period and multi-objective closed-loop logistics network model with regard to facility expansion as a facility location-allocation problem, which more closely approximates real-world conditions. A multi-objective mixed integer nonlinear programming formulation is linearized by defining new variables and adding new constraints to the model. By considering the aforementioned model under uncertainty, this paper develops a hybrid solution approach by combining an interactive fuzzy goal programming approach and robust counterpart optimization based on three well-known robust counterpart optimization formulations. Finally, this paper compares the results of the three formulations using different test scenarios and parameter-sensitive analysis in terms of the quality of the final solution, CPU time, the level of conservatism, the degree of closeness to the ideal solution, the degree of balance involved in developing a compromise solution, and satisfaction degree.
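The max-min step of an interactive fuzzy goal programming approach can be sketched as a small linear program: each objective gets a linear membership function, and an auxiliary variable λ (the minimum satisfaction level across objectives) is maximized. The two objectives, goal intervals, and constraint below are illustrative assumptions, not the paper's logistics network model.

```python
from scipy.optimize import linprog

# Two objectives on x = (x1, x2) with x1 + x2 <= 10, x >= 0:
#   z1 = 3*x1 +   x2  (maximize), fuzzy ideal 30, worst 0
#   z2 =   x1 + 4*x2  (maximize), fuzzy ideal 40, worst 0
# Linear membership mu_i = z_i / ideal_i, so mu_i >= lam  <=>  z_i >= lam*ideal_i.
# Decision vector for linprog: [x1, x2, lam]; linprog minimizes, so use -lam.
c = [0, 0, -1]
A_ub = [
    [1, 1, 0],       # x1 + x2 <= 10
    [-3, -1, 30],    # 30*lam - z1 <= 0
    [-1, -4, 40],    # 40*lam - z2 <= 0
]
b_ub = [10, 0, 0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (0, 1)])
x1, x2, lam = res.x
```

The compromise solution balances the two objectives: both memberships are tight at λ = 11/17 ≈ 0.647, with the capacity constraint binding.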
Krippendorff, Ben-Fillippo; Oyarzún, Diego A; Huisinga, Wilhelm
2012-04-01
Cell-level kinetic models for therapeutically relevant processes increasingly benefit the early stages of drug development. Later stages of the drug development processes, however, rely on pharmacokinetic compartment models while cell-level dynamics are typically neglected. We here present a systematic approach to integrate cell-level kinetic models and pharmacokinetic compartment models. Incorporating target dynamics into pharmacokinetic models is especially useful for the development of therapeutic antibodies because their effect and pharmacokinetics are inherently interdependent. The approach is illustrated by analysing the F(ab)-mediated inhibitory effect of therapeutic antibodies targeting the epidermal growth factor receptor. We build a multi-level model for anti-EGFR antibodies by combining a systems biology model with in vitro determined parameters and a pharmacokinetic model based on in vivo pharmacokinetic data. Using this model, we investigated in silico the impact of biochemical properties of anti-EGFR antibodies on their F(ab)-mediated inhibitory effect. The multi-level model suggests that the F(ab)-mediated inhibitory effect saturates with increasing drug-receptor affinity, thereby limiting the impact of increasing antibody affinity on improving the effect. This indicates that observed differences in the therapeutic effects of high affinity antibodies in the market and in clinical development may result mainly from Fc-mediated indirect mechanisms such as antibody-dependent cell cytotoxicity.
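The saturation effect the model predicts can be reproduced with a minimal coupled system: a one-compartment PK equation for free drug linked to a receptor-binding ODE, with affinity entering through KD. All parameter values are illustrative assumptions, not the fitted anti-EGFR parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

kel = 0.1      # free-drug elimination rate, 1/h (assumed)
R_tot = 1.0    # total receptor, normalized (assumed)

def rhs(t, y, kon, koff):
    C, RC = y                                 # free drug, drug-receptor complex
    bind = kon * C * (R_tot - RC) - koff * RC
    return [-kel * C - bind, bind]

def receptor_occupancy(kd, c0=10.0, t_end=24.0):
    kon = 1.0
    koff = kon * kd                           # affinity enters via KD = koff/kon
    sol = solve_ivp(rhs, (0.0, t_end), [c0, 0.0], args=(kon, koff))
    return sol.y[1, -1] / R_tot               # bound receptor fraction at t_end

# Higher affinity (smaller KD) raises occupancy, but with diminishing returns:
occ = [receptor_occupancy(kd) for kd in (1.0, 0.1, 0.01)]
```

Once free drug exceeds KD, further affinity gains barely move occupancy, mirroring the saturation of the F(ab)-mediated effect described in the abstract.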
Zhou, Yuan; Shi, Tie-Mao; Hu, Yuan-Man; Gao, Chang; Liu, Miao; Song, Lin-Qi
2011-12-01
Based on geographic information system (GIS) technology and a multi-objective location-allocation (LA) model, and considering four relatively independent objective factors (population density level, air pollution level, urban heat island effect level, and urban land use pattern), an optimized location selection for the urban parks within the Third Ring of Shenyang was conducted, and the selection results were compared with the spatial distribution of existing parks, aiming to evaluate the rationality of the spatial distribution of urban green spaces. In the location selection of urban green spaces in the study area, air pollution was the most important factor, and, compared with a single objective factor, the weighted analysis of multi-objective factors could provide an optimized spatial location selection of new urban green spaces. The combination of GIS technology with the LA model would be a new approach for the spatial optimization of urban green spaces.
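The weighted multi-objective overlay at the core of such a GIS analysis can be sketched directly: normalize each factor raster, weight it (with air pollution weighted highest, as in the study), and sum into a suitability surface. The grids and weight values below are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Normalized factor rasters on a toy 4x4 grid (stand-ins for the real layers).
factors = {
    "population":    rng.random((4, 4)),
    "air_pollution": rng.random((4, 4)),
    "heat_island":   rng.random((4, 4)),
    "land_use":      rng.random((4, 4)),
}
# Weights sum to 1; air pollution is weighted highest (values are assumed).
weights = {"population": 0.25, "air_pollution": 0.40,
           "heat_island": 0.20, "land_use": 0.15}

suitability = sum(w * factors[k] for k, w in weights.items())
best_cell = np.unravel_index(np.argmax(suitability), suitability.shape)
```

An LA model would then allocate candidate park sites over this surface subject to service-coverage constraints, rather than simply picking the single best cell.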
Multi-Scale Computational Models for Electrical Brain Stimulation
Seo, Hyeon; Jun, Sung C.
2017-01-01
Electrical brain stimulation (EBS) is an appealing method to treat neurological disorders. To achieve optimal stimulation effects and a better understanding of the underlying brain mechanisms, neuroscientists have proposed computational modeling studies for a decade. Recently, multi-scale models that combine a volume conductor head model and multi-compartmental models of cortical neurons have been developed to predict stimulation effects on the macroscopic and microscopic levels more precisely. As the need for better computational models continues to increase, we overview here recent multi-scale modeling studies; we focused on approaches that coupled a simplified or high-resolution volume conductor head model and multi-compartmental models of cortical neurons, and constructed realistic fiber models using diffusion tensor imaging (DTI). Further implications for achieving better precision in estimating cellular responses are discussed. PMID:29123476
Kapellusch, Jay M; Silverstein, Barbara A; Bao, Stephen S; Thiese, Mathew S; Merryweather, Andrew S; Hegmann, Kurt T; Garg, Arun
2018-02-01
The Strain Index (SI) and the American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value for hand activity level (TLV for HAL) have been shown to be associated with prevalence of distal upper-limb musculoskeletal disorders such as carpal tunnel syndrome (CTS). The SI and TLV for HAL disagree on more than half of task exposure classifications. Similarly, time-weighted average (TWA), peak, and typical exposure techniques used to quantify physical exposure from multi-task jobs have shown between-technique agreement ranging from 61% to 93%, depending upon whether the SI or TLV for HAL model was used. This study compared exposure-response relationships between each model-technique combination and prevalence of CTS. Physical exposure data from 1,834 workers (710 with multi-task jobs) were analyzed using the SI and TLV for HAL and the TWA, typical, and peak multi-task job exposure techniques. Additionally, exposure classifications from the SI and TLV for HAL were combined into a single measure and evaluated. Prevalent CTS cases were identified using symptoms and nerve-conduction studies. Mixed effects logistic regression was used to quantify exposure-response relationships between categorized (i.e., low, medium, and high) physical exposure and CTS prevalence for all model-technique combinations, and for multi-task workers, mono-task workers, and all workers combined. Except for the TWA TLV for HAL, all model-technique combinations showed monotonic increases in risk of CTS with increased physical exposure. The combined-models approach showed stronger association than the SI or TLV for HAL for multi-task workers. Despite differences in exposure classifications, nearly all model-technique combinations showed exposure-response relationships with prevalence of CTS for the combined sample of mono-task and multi-task workers. Both the TLV for HAL and the SI, with the TWA or typical techniques, appear useful for epidemiological studies and surveillance.
However, the utility of TWA, typical, and peak techniques for job design and intervention is dubious.
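As a hedged illustration of the three multi-task summary techniques named above, the sketch below computes TWA, peak, and typical exposure for a hypothetical multi-task job; the task durations and Strain Index scores are invented for the example and are not data from the study.

```python
# Sketch of three multi-task exposure summary techniques (TWA, peak, typical).
# Task durations and Strain Index scores are hypothetical illustrations.

def twa(scores, hours):
    """Time-weighted average: weight each task's score by its daily duration."""
    return sum(s * h for s, h in zip(scores, hours)) / sum(hours)

def peak(scores, hours):
    """Peak: the highest score among the worker's tasks."""
    return max(scores)

def typical(scores, hours):
    """Typical: the score of the task occupying the most time."""
    return scores[hours.index(max(hours))]

si_scores = [3.0, 9.0, 13.5]   # Strain Index score per task (hypothetical)
task_hours = [4.0, 3.0, 1.0]   # hours/day spent on each task

print(twa(si_scores, task_hours))      # 6.5625
print(peak(si_scores, task_hours))     # 13.5
print(typical(si_scores, task_hours))  # 3.0
```

The three techniques can classify the same job into different exposure categories, which is the disagreement the abstract quantifies.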
Cilfone, Nicholas A.; Kirschner, Denise E.; Linderman, Jennifer J.
2015-01-01
Biologically related processes operate across multiple spatiotemporal scales. For computational modeling methodologies to mimic this biological complexity, individual scale models must be linked in ways that allow for dynamic exchange of information across scales. A powerful methodology is to combine a discrete modeling approach, agent-based models (ABMs), with continuum models to form hybrid models. Hybrid multi-scale ABMs have been used to simulate emergent responses of biological systems. Here, we review two aspects of hybrid multi-scale ABMs: linking individual scale models and efficiently solving the resulting model. We discuss the computational choices associated with aspects of linking individual scale models while simultaneously maintaining model tractability. We demonstrate implementations of existing numerical methods in the context of hybrid multi-scale ABMs. Using an example model describing Mycobacterium tuberculosis infection, we show relative computational speeds of various combinations of numerical methods. Efficient linking and solution of hybrid multi-scale ABMs is key to model portability, modularity, and their use in understanding biological phenomena at a systems level. PMID:26366228
Multi Response Optimization of Laser Micro Marking Process: A Grey-Fuzzy Approach
NASA Astrophysics Data System (ADS)
Shivakoti, I.; Das, P. P.; Kibria, G.; Pradhan, B. B.; Mustafa, Z.; Ghadai, R. K.
2017-07-01
The selection of an optimal parametric combination for efficient machining has always been a challenging issue for manufacturing researchers. The optimal parametric combination provides better machining, which improves productivity and product quality and subsequently reduces production cost and time. This paper presents a hybrid approach combining grey relational analysis and fuzzy logic to obtain the optimal parametric combination for laser beam micro marking of gallium nitride (GaN) work material. Response surface methodology has been implemented for the design of experiments, considering three parameters at five levels each. Current, frequency, and scanning speed were taken as the process parameters, and mark width, mark depth, and mark intensity as the process responses.
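The grey relational core of such a hybrid approach can be sketched as follows; the response values, the larger-the-better choices, and the distinguishing coefficient ζ = 0.5 are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of grey relational analysis (GRA) for multi-response
# optimization; the experimental response values below are hypothetical.

def grey_relational_grade(responses, larger_better, zeta=0.5):
    """responses: (n_runs, n_responses) array; returns one grade per run."""
    r = np.asarray(responses, dtype=float)
    lo, hi = r.min(axis=0), r.max(axis=0)
    # Normalize each response to [0, 1], where 1 is the ideal value.
    norm = np.where(larger_better, (r - lo) / (hi - lo), (hi - r) / (hi - lo))
    delta = 1.0 - norm                      # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)               # grey relational grade per run

# Three runs; responses: mark width (smaller better), mark depth and
# mark intensity (larger better).
runs = [[40.0, 12.0, 0.70],
        [35.0, 15.0, 0.80],
        [50.0, 10.0, 0.60]]
grades = grey_relational_grade(runs, larger_better=[False, True, True])
print(int(np.argmax(grades)))   # index of the best parametric combination -> 1
```

The run with the highest grade is the recommended parametric combination; fuzzy logic, in the paper's hybrid scheme, then aggregates the responses in place of the plain mean used here.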
Multi-variants synthesis of Petri nets for FPGA devices
NASA Astrophysics Data System (ADS)
Bukowiec, Arkadiusz; Doligalski, Michał
2015-09-01
A new method for the synthesis of application-specific logic controllers for FPGA devices is presented. The control algorithm is specified with a control-interpreted Petri net (PT type), which allows parallel processes to be specified easily. The Petri net is decomposed into state-machine-type subnets, each representing one parallel process; Petri net coloring algorithms are applied for this purpose. Two approaches to such decomposition are presented: with doublers of macroplaces, or with one global wait place. Next, the subnets are implemented as a two-level logic circuit of the controller, the levels being obtained by architectural decomposition. The first-level combinational circuit generates next places, and the second-level decoder generates output symbols. Two variants of such circuits are worked out: with one shared operational memory, or with many flexible distributed memories as the decoder. Variants of Petri net decomposition and logic circuit structures can be combined without restriction, leading to four variants in this multi-variant synthesis.
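As a rough illustration of the place/transition (PT) Petri net semantics underlying the specification, the sketch below fires transitions of a tiny interpreted net with a parallel fork and join; the net structure and input conditions are hypothetical, not from the paper.

```python
# Minimal sketch of interpreted Petri net execution (PT type): a transition
# fires when all its input places hold tokens and its input condition holds.
# Net structure and conditions are illustrative assumptions.

transitions = {
    "t1": {"in": ["p1"], "out": ["p2", "p3"], "cond": lambda x: x["start"]},
    "t2": {"in": ["p2", "p3"], "out": ["p4"], "cond": lambda x: True},
}

def step(marking, inputs):
    for t in transitions.values():
        if all(marking.get(p, 0) > 0 for p in t["in"]) and t["cond"](inputs):
            for p in t["in"]:
                marking[p] -= 1
            for p in t["out"]:
                marking[p] = marking.get(p, 0) + 1
            return marking               # fire one transition per step
    return marking

m = {"p1": 1}
m = step(m, {"start": True})    # t1 fires: fork into parallel places p2, p3
print(sorted(p for p, n in m.items() if n))   # ['p2', 'p3']
m = step(m, {"start": False})   # t2 fires: join back into p4
print(sorted(p for p, n in m.items() if n))   # ['p4']
```

The fork into p2 and p3 is the kind of parallelism that the coloring-based decomposition splits into separate state-machine subnets.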
Jing, Luyang; Wang, Taiyong; Zhao, Ming; Wang, Peng
2017-01-01
A fault diagnosis approach based on multi-sensor data fusion is a promising tool to deal with complicated damage detection problems of mechanical systems. Nevertheless, this approach suffers from two challenges, which are (1) the feature extraction from various types of sensory data and (2) the selection of a suitable fusion level. It is usually difficult to choose an optimal feature or fusion level for a specific fault diagnosis task, and extensive domain expertise and human labor are also highly required during these selections. To address these two challenges, we propose an adaptive multi-sensor data fusion method based on deep convolutional neural networks (DCNN) for fault diagnosis. The proposed method can learn features from raw data and optimize a combination of different fusion levels adaptively to satisfy the requirements of any fault diagnosis task. The proposed method is tested on a planetary gearbox test rig. Handcrafted features, manually selected fusion levels, single sensory data, and two traditional intelligent models, back-propagation neural networks (BPNN) and a support vector machine (SVM), are used as comparisons in the experiment. The results demonstrate that the proposed method is able to detect the conditions of the planetary gearbox effectively with the best diagnosis accuracy among all comparative methods in the experiment. PMID:28230767
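A minimal sketch of two of the fusion levels contrasted above, data-level versus feature-level fusion; the sensor signals and the toy feature extractor are assumptions for illustration and are not the paper's DCNN method.

```python
import numpy as np

# Sketch of two sensor-fusion levels: data-level fusion (stack raw signals
# before any learning) versus feature-level fusion (extract features per
# sensor, then concatenate). Signals and features are illustrative.

rng = np.random.default_rng(1)
vib_x = rng.standard_normal(1024)   # e.g. vibration channel 1 (synthetic)
vib_y = rng.standard_normal(1024)   # e.g. vibration channel 2 (synthetic)

# Data-level fusion: stack raw channels into one multi-channel input.
data_level = np.stack([vib_x, vib_y])             # shape (2, 1024)

def features(sig):
    """Toy per-sensor features: RMS, absolute peak, 4th moment."""
    return np.array([np.sqrt(np.mean(sig**2)), np.max(np.abs(sig)),
                     np.mean(sig**4)])

# Feature-level fusion: concatenate per-sensor feature vectors.
feature_level = np.concatenate([features(vib_x), features(vib_y)])
print(data_level.shape, feature_level.shape)      # (2, 1024) (6,)
```

The adaptive method in the abstract learns which of these (and decision-level) combinations to weight, rather than fixing one by hand as done here.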
Feng, Yan; Mitchison, Timothy J; Bender, Andreas; Young, Daniel W; Tallarico, John A
2009-07-01
Multi-parameter phenotypic profiling of small molecules provides important insights into their mechanisms of action, as well as a systems-level understanding of biological pathways and their responses to small molecule treatments. It therefore deserves more attention at an early step in the drug discovery pipeline. Here, we summarize the technologies that are currently in use for phenotypic profiling in the drug discovery context, including mRNA-, protein-, and imaging-based multi-parameter profiling. We think that an earlier integration of phenotypic profiling technologies, combined with effective experimental and in silico target identification approaches, can improve success rates of lead selection and optimization in the drug discovery process.
Comparison of multi-subject ICA methods for analysis of fMRI data
Erhardt, Erik Barry; Rachakonda, Srinivas; Bedrick, Edward; Allen, Elena; Adali, Tülay; Calhoun, Vince D.
2010-01-01
Spatial independent component analysis (ICA) applied to functional magnetic resonance imaging (fMRI) data identifies functionally connected networks by estimating spatially independent patterns from their linearly mixed fMRI signals. Several multi-subject ICA approaches estimating subject-specific time courses (TCs) and spatial maps (SMs) have been developed; however, there has not yet been a full comparison of the implications of their use. Here, we provide extensive comparisons of four multi-subject ICA approaches in combination with data reduction methods for simulated and fMRI task data. For multi-subject ICA, the data first undergo reduction at the subject and group levels using principal component analysis (PCA). Comparisons of subject-specific, spatial concatenation, and group data mean subject-level reduction strategies using PCA and probabilistic PCA (PPCA) show that computationally intensive PPCA is equivalent to PCA, and that subject-specific and group data mean subject-level PCA are preferred because of well-estimated TCs and SMs. Second, aggregate independent components are estimated using either noise free ICA or probabilistic ICA (PICA). Third, subject-specific SMs and TCs are estimated using back-reconstruction. We compare several direct group ICA (GICA) back-reconstruction approaches (GICA1-GICA3) and an indirect back-reconstruction approach, spatio-temporal regression (STR, or dual regression). Results show the earlier group ICA (GICA1) approximates STR; however, STR has contradictory assumptions and may show mixed-component artifacts in estimated SMs. Our evidence-based recommendation is to use GICA3, introduced here, with subject-specific PCA and noise-free ICA, providing the most robust and accurate estimated SMs and TCs in addition to offering an intuitive interpretation. PMID:21162045
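The spatio-temporal (dual) regression back-reconstruction described above reduces to two least-squares steps; the sketch below runs them on synthetic, noiseless data, so the dimensions and signals are assumptions made for illustration.

```python
import numpy as np

# Sketch of spatio-temporal (dual) regression: group-level spatial maps are
# regressed against one subject's data to estimate subject-specific time
# courses, which are then regressed back to get subject-specific maps.
# All data here are synthetic and noiseless.

rng = np.random.default_rng(0)
T, V, C = 100, 500, 3                   # time points, voxels, components

group_sms = rng.standard_normal((C, V))     # group spatial maps (C x V)
true_tcs = rng.standard_normal((T, C))      # subject time courses (T x C)
data = true_tcs @ group_sms                 # subject data (T x V)

# Step 1: spatial regression -> subject-specific time courses (T x C)
tcs = data @ np.linalg.pinv(group_sms)
# Step 2: temporal regression -> subject-specific spatial maps (C x V)
sms = np.linalg.pinv(tcs) @ data

print(np.allclose(tcs, true_tcs))   # True in this noiseless example
```

With noise and subject variability the recovery is no longer exact, which is where the assumptions of STR versus the GICA back-reconstructions start to matter.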
Multi-Scale Approach to Understanding Source-Sink Dynamics of Amphibians
2015-12-01
spotted salamander, A. maculatum) at Fort Leonard Wood (FLW), Missouri. We used a multi-faceted approach in which we combined ecological, genetic... spotted salamander, A. maculatum) at Fort Leonard Wood, Missouri through a combination of intensive ecological field studies, genetic analyses, and... spatial demographic networks to identify optimal locations for wetland construction and restoration. Ecological Applications. Walls, S. C., Ball, L. C
Molins, C; Hogendoorn, E A; Dijkman, E; Heusinkveld, H A; Baumann, R A
2000-02-11
The combination of microwave-assisted solvent extraction (MASE) and reversed-phase liquid chromatography (RPLC) with UV detection has been investigated for the efficient determination of phenylurea herbicides in soils involving the single-residue method (SRM) approach (linuron) and the multi-residue method (MRM) approach (monuron, monolinuron, isoproturon, metobromuron, diuron and linuron). Critical parameters of MASE, viz. extraction temperature, water content and extraction solvent, were varied in order to optimise recoveries of the analytes while simultaneously minimising co-extraction of soil interferences. The optimised extraction procedure was applied to different types of soil with an organic carbon content of 0.4-16.7%. Besides freshly spiked soil samples, method validation included the analysis of samples with aged residues. A comparative study between the applicability of RPLC-UV without and with the use of column switching for the processing of uncleaned extracts was carried out. For some of the tested analyte/matrix combinations the one-column approach (LC mode) is feasible. In comparison to LC, coupled-column LC (LC-LC mode) provides high selectivity in single-residue analysis (linuron) and, although less pronounced in multi-residue analysis (all six phenylurea herbicides), the clean-up performance of LC-LC improves both time of analysis and sample throughput. In the MRM approach the developed procedure involving MASE and LC-LC-UV provided acceptable recoveries (range, 80-120%) and RSDs (<12%) at levels of 10 microg/kg (n=9) and 50 microg/kg (n=7), respectively, for most analyte/matrix combinations. Recoveries from aged residue samples spiked at a level of 100 microg/kg (n=7) ranged, depending on the analyte/soil type combination, from 41-113% with RSDs ranging from 1-35%. In the SRM approach the developed LC-LC procedure was applied for the determination of linuron in 28 sandy soil samples collected in a field study.
Linuron could be determined in soil with a limit of quantitation of 10 microg/kg.
ERIC Educational Resources Information Center
Kefford, Colin W.
This description of a unit for teaching about the environment at the junior high level is an experimental study. The focus of the program is the integration of several media; films and tapes play a large role in the unit. Students perform a combination of classroom work, field work, and simulated exercises; assessment procedures are described.…
ERIC Educational Resources Information Center
Begland, Robert R.
In reviewing the Army Continuing Education System in 1979, the Assistant Secretary of the Army found a basic skills program based on traditional academic level goals was inadequate to meet the Army's requirement to provide functional, job-related basic skill education. Combining the shrinking manpower pool and projected basic skill deficiencies of…
NASA Astrophysics Data System (ADS)
Taubenböck, H.; Wurm, M.; Netzband, M.; Zwenzner, H.; Roth, A.; Rahman, A.; Dech, S.
2011-02-01
Estimating flood risks and managing disasters combines knowledge in climatology, meteorology, hydrology, hydraulic engineering, statistics, planning and geography, making it a complex, multi-faceted problem. This study focuses on the capabilities of multi-source remote sensing data to support decision-making before, during and after a flood event. With our focus on urbanized areas, sample methods and applications show multi-scale products from the hazard and vulnerability perspective of the risk framework. From the hazard side, we present capabilities with which to assess flood-prone areas before an expected disaster. Then we map the spatial impact during or after a flood and finally, we analyze damage grades after a flood disaster. From the vulnerability side, we monitor urbanization over time on an urban footprint level, classify urban structures on an individual building level, assess building stability and quantify the potentially affected population. The results provide a large database for sustainable development and for developing mitigation strategies, ad-hoc coordination of relief measures and organizing rehabilitation.
Few, Roger; Lake, Iain; Hunter, Paul R; Tran, Pham Gia; Thien, Vu Trong
2009-12-21
Understanding how risks to human health change as a result of seasonal variations in environmental conditions is likely to become of increasing importance in the context of climatic change, especially in lower-income countries. A multi-disciplinary approach can be a useful tool for improving understanding, particularly in situations where existing data resources are limited but the environmental health implications of seasonal hazards may be high. This short article describes a multi-disciplinary approach combining analysis of changes in levels of environmental contamination, seasonal variations in disease incidence and a social scientific analysis of health behaviour. The methodology was field-tested in a peri-urban environment in the Mekong Delta, Vietnam, where poor households face alternate seasonal extremes in the local environment as the water level in the Delta changes from flood to dry season. Low-income households in the research sites rely on river water for domestic uses, including provision of drinking water, and it is commonly perceived that the seasonal changes alter risk from diarrhoeal diseases and other diseases associated with contamination of water. The discussion focuses on the implementation of the methodology in the field, and draws lessons from the research process that can help in refining and developing the approach for application in other locations where seasonal dynamics of disease risk may have important consequences for public health.
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Xu; Tuo, Rui; Jeff Wu, C. F.
2017-01-31
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs with a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. From simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.
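For orientation, the classical expected improvement criterion that EQI and EQIE extend can be sketched as follows; the Gaussian process posterior values fed to it are hypothetical inputs, and this is not the authors' EQI/EQIE implementation.

```python
import math

# Sketch of the expected improvement (EI) criterion for minimization: given
# a GP posterior mean mu and standard deviation sigma at a candidate input,
# and the best observed value f_min, EI scores the expected gain from
# evaluating that candidate. Input values below are illustrative.

def expected_improvement(mu, sigma, f_min):
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # normal pdf
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # normal cdf
    return (f_min - mu) * Phi + sigma * phi

# A candidate with low predicted mean and high uncertainty scores highest.
print(expected_improvement(mu=0.2, sigma=0.5, f_min=1.0) >
      expected_improvement(mu=0.9, sigma=0.1, f_min=1.0))    # True
```

EQI and EQIE additionally account for the accuracy level of each candidate run, which plain EI, being single-accuracy, cannot do.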
ERIC Educational Resources Information Center
Hallberg, Kelly; Cook, Thomas D.; Figlio, David
2013-01-01
The goal of this paper is to provide guidance for applied education researchers in using multi-level data to study the effects of interventions implemented at the school level. Two primary approaches are currently employed in observational studies of the effect of school-level interventions. One approach employs intact school matching: matching…
Combining multiple decisions: applications to bioinformatics
NASA Astrophysics Data System (ADS)
Yukinawa, N.; Takenouchi, T.; Oba, S.; Ishii, S.
2008-01-01
Multi-class classification is one of the fundamental tasks in bioinformatics and typically arises in cancer diagnosis studies by gene expression profiling. This article reviews two recent approaches to multi-class classification by combining multiple binary classifiers, which are formulated based on a unified framework of error-correcting output coding (ECOC). The first approach is to construct a multi-class classifier in which each binary classifier to be aggregated has a weight value to be optimally tuned based on the observed data. In the second approach, misclassification of each binary classifier is formulated as a bit inversion error with a probabilistic model by making an analogy to the context of information transmission theory. Experimental studies using various real-world datasets including cancer classification problems reveal that both of the new methods are superior or comparable to other multi-class classification methods.
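A minimal sketch of ECOC decoding, the shared framework behind both approaches reviewed: each class gets a binary codeword, one binary classifier predicts each bit, and a sample is assigned the class with the nearest codeword. The code matrix below is an illustrative example with minimum Hamming distance 3, so a single bit inversion by one binary classifier is still decoded correctly.

```python
import numpy as np

# Sketch of error-correcting output coding (ECOC) decoding by minimum
# Hamming distance. The code matrix is an illustrative assumption.

# Rows = classes, columns = binary classifiers (bits); min distance 3.
code_matrix = np.array([
    [0, 0, 0, 0, 0, 0],   # class 0
    [1, 1, 1, 0, 0, 0],   # class 1
    [0, 0, 1, 1, 1, 0],   # class 2
    [1, 1, 0, 1, 0, 1],   # class 3
])

def ecoc_decode(bits):
    """bits: predictions of the binary classifiers; may contain bit errors."""
    dists = (code_matrix != np.asarray(bits)).sum(axis=1)
    return int(np.argmin(dists))

print(ecoc_decode([0, 0, 1, 1, 1, 0]))  # exact match with class 2 -> 2
print(ecoc_decode([1, 1, 1, 0, 0, 1]))  # one bit flipped from class 1 -> 1
```

The two reviewed methods refine this scheme by weighting the binary classifiers and by modeling bit inversions probabilistically, respectively.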
Multi-layer plastic/glass microfluidic systems containing electrical and mechanical functionality.
Han, Arum; Wang, Olivia; Graff, Mason; Mohanty, Swomitra K; Edwards, Thayne L; Han, Ki-Ho; Bruno Frazier, A
2003-08-01
This paper describes an approach for fabricating multi-layer microfluidic systems from a combination of glass and plastic materials. Methods and characterization results for the microfabrication technologies underlying the process flow are presented. The approach is used to fabricate and characterize multi-layer plastic/glass microfluidic systems containing electrical and mechanical functionality. Hot embossing, heat staking of plastics, injection molding, microstenciling of electrodes, and stereolithography were combined with conventional MEMS fabrication techniques to realize the multi-layer systems. The approach enabled the integration of multiple plastic/glass materials into a single monolithic system, provided a solution for the integration of electrical functionality throughout the system, provided a mechanism for the inclusion of microactuators such as micropumps/valves, and provided an interconnect technology for interfacing fluids and electrical components between the micro system and the macro world.
A multi-disciplinary approach for the integrated assessment of multiple risks in delta areas.
NASA Astrophysics Data System (ADS)
Sperotto, Anna; Torresan, Silvia; Critto, Andrea; Marcomini, Antonio
2016-04-01
The assessment of climate change related risks is notoriously difficult due to the complex and uncertain combinations of hazardous events that might happen, the multiplicity of physical processes involved, and the continuous changes and interactions of environmental and socio-economic systems. One important challenge lies in predicting and modelling cascades of natural and man-made hazard events which can be triggered by climate change, encompassing different spatial and temporal scales. Another regards the potentially difficult integration of environmental, social and economic disciplines in the multi-risk concept. Finally, the effective interaction between scientists and stakeholders is essential to ensure that multi-risk knowledge is translated into efficient adaptation and management strategies. The assessment is even more complex at the scale of deltaic systems, which are particularly vulnerable to global environmental changes due to the fragile equilibrium between the presence of valuable natural ecosystems and relevant economic activities. Improving our capacity to assess the combined effects of multiple hazards (e.g. sea-level rise, storm surges, reduction in sediment load, local subsidence, saltwater intrusion) is therefore essential to identify timely opportunities for adaptation. A holistic multi-risk approach is here proposed to integrate terminology, metrics and methodologies from different research fields (i.e. environmental, social and economic sciences), thus creating shared knowledge areas to advance multi-risk assessment and management in delta regions. A first test of the approach, including the application of Bayesian network analysis for the assessment of impacts of climate change on key natural systems (e.g. wetlands, protected areas, beaches) and socio-economic activities (e.g. agriculture, tourism), is carried out in the Po river delta in Northern Italy.
The approach is based on a bottom-up process involving local stakeholders early in different stages of the multi-risk assessment process (i.e. identification of objectives, collection of data, definition of risk thresholds and indicators). The results of the assessment will allow the development of multi-risk scenarios enabling the evaluation and prioritization of risk management and adaptation options under changing climate conditions.
González-Domínguez, Raúl; Santos, Hugo Miguel; Bebianno, Maria João; García-Barrera, Tamara; Gómez-Ariza, José Luis; Capelo, José Luis
2016-12-15
Estuaries are very important ecosystems with great ecological and economic value, but usually highly impacted by anthropogenic pressure. Thus, the assessment of pollution levels in these habitats is critical in order to evaluate their environmental quality. In this work, we combined complementary metallomic and proteomic approaches with the aim to monitor the effects of environmental pollution on Scrobicularia plana clams captured in three estuarine systems from the south coast of Portugal; Arade estuary, Ria Formosa and Guadiana estuary. Multi-elemental profiling of digestive glands was carried out to evaluate the differential pollution levels in the three study areas. Then, proteomic analysis by means of two-dimensional gel electrophoresis and mass spectrometry revealed twenty-one differential proteins, which could be associated with multiple toxicological mechanisms induced in environmentally stressed organisms. Accordingly, it could be concluded that the combination of different omic approaches presents a great potential in environmental research.
NASA Astrophysics Data System (ADS)
Chiariotti, P.; Martarelli, M.; Revel, G. M.
2017-12-01
A novel non-destructive testing procedure for delamination detection, based on the exploitation of the simultaneous time and spatial sampling provided by Continuous Scanning Laser Doppler Vibrometry (CSLDV) and the feature extraction capability of multi-level wavelet-based processing, is presented in this paper. The processing procedure consists of a multi-step approach. Once the optimal mother wavelet is selected as the one maximizing the Energy to Shannon Entropy Ratio criterion among the mother-wavelet space, a pruning operation aiming at identifying the best combination of nodes inside the full binary tree given by Wavelet Packet Decomposition (WPD) is performed. The pruning algorithm exploits, in a two-step way, a measure of the randomness of the point pattern distribution on the damage map space together with an analysis of the energy concentration of the wavelet coefficients on those nodes provided by the first pruning operation. A combination of the point pattern distributions provided by each node of the ensemble node set from the pruning algorithm allows for setting a Damage Reliability Index associated with the final damage map. The effectiveness of the whole approach is proven on both simulated and real test cases. A sensitivity analysis related to the influence of noise on the CSLDV signal provided to the algorithm is also discussed, showing that the processing developed is robust to measurement noise. The method is promising: damages are well identified on different materials and for different damage-structure varieties.
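The Energy to Shannon Entropy Ratio criterion used above for mother-wavelet selection can be sketched directly on coefficient arrays; the synthetic coefficients below stand in for wavelet packet outputs and are not data from the paper.

```python
import numpy as np

# Sketch of the energy-to-Shannon-entropy ratio: energy concentrated in few
# wavelet coefficients yields a high ratio, so the mother wavelet producing
# the most concentrated decomposition is preferred. Coefficients are synthetic.

def energy_to_shannon_entropy_ratio(coeffs):
    c = np.asarray(coeffs, dtype=float)
    energy = np.sum(c ** 2)
    p = c ** 2 / energy              # relative energy per coefficient
    p = p[p > 0]                     # drop zero terms (0 * log 0 = 0)
    entropy = -np.sum(p * np.log(p))
    return energy / entropy

concentrated = [10.0, 0.1, 0.1, 0.1]   # energy packed into one coefficient
spread = [5.0, 5.0, 5.0, 5.0]          # energy spread evenly
print(energy_to_shannon_entropy_ratio(concentrated) >
      energy_to_shannon_entropy_ratio(spread))   # True
```

In the full procedure this score is computed per candidate mother wavelet over the WPD coefficients, and the maximizer is kept before pruning the node tree.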
Li, Ying Hong; Wang, Pan Pan; Li, Xiao Xu; Yu, Chun Yan; Yang, Hong; Zhou, Jin; Xue, Wei Wei; Tan, Jun; Zhu, Feng
2016-01-01
The human kinome is one of the most productive classes of drug target, and there is an emerging need to treat complex diseases by means of polypharmacology (multi-target drugs and combination products). However, the advantages of multi-target drugs and combination products are still under debate. A comparative analysis between FDA-approved multi-target drugs and combination products targeting the human kinome was conducted by mapping targets onto the phylogenetic tree of the human kinome. The network medicine approach of illustrating drug-target interactions was applied to identify popular targets of multi-target drugs and combination products. The multi-target drugs tended to inhibit target pairs in the human kinome, especially in the receptor tyrosine kinase family, while the combination products were able to act against targets of distant homology. This finding suggests combination products as the better solution for designing drugs aimed at targets of distant homology. Moreover, sub-networks of drug-target interactions in specific diseases were generated, and mechanisms shared by multi-target drugs and combination products were identified. In conclusion, this study performed an analysis between approved multi-target drugs and combination products against the human kinome, which could assist the discovery of next-generation polypharmacology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.
2013-07-15
Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
A MultiAir®/MultiFuel Approach to Enhancing Engine System Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reese, Ronald
2015-05-20
FCA US LLC (formerly known as Chrysler Group LLC, and hereinafter “Chrysler”) was awarded an American Recovery and Reinvestment Act (ARRA) funded project by the Department of Energy (DOE) titled “A MultiAir®/MultiFuel Approach to Enhancing Engine System Efficiency” (hereinafter “project”). This award was issued after Chrysler submitted a proposal for Funding Opportunity Announcement DE-FOA-0000079, “Systems Level Technology Development, Integration, and Demonstration for Efficient Class 8 Trucks (SuperTruck) and Advanced Technology Powertrains for Light-Duty Vehicles (ATP-LD).” Chrysler started work on this project on June 01, 2010 and completed testing activities on August 30, 2014. The overall objectives of this project were to: demonstrate a 25% improvement in combined Federal Test Procedure (FTP) City and Highway fuel economy over a 2009 Chrysler minivan; accelerate the development of highly efficient engine and powertrain systems for light-duty vehicles, while meeting future emissions standards; and create and retain jobs in accordance with the American Recovery and Reinvestment Act of 2009.
A multi-objective constraint-based approach for modeling genome-scale microbial ecosystems.
Budinich, Marko; Bourdon, Jérémie; Larhlimi, Abdelhalim; Eveillard, Damien
2017-01-01
Interplay within microbial communities impacts ecosystems on several scales, and elucidation of the consequent effects is a difficult task in ecology. In particular, the integration of genome-scale data within quantitative models of microbial ecosystems remains elusive. This study advocates the use of constraint-based modeling to build predictive models from recent high-resolution -omics datasets. Following recent studies that have demonstrated the accuracy of constraint-based models (CBMs) for simulating single-strain metabolic networks, we sought to study microbial ecosystems as a combination of single-strain metabolic networks that exchange nutrients. This study presents two multi-objective extensions of CBMs for modeling communities: multi-objective flux balance analysis (MO-FBA) and multi-objective flux variability analysis (MO-FVA). Both methods were applied to a hot spring mat model ecosystem. As a result, multiple trade-offs between nutrients and growth rates, as well as thermodynamically favorable relative abundances at community level, were emphasized. We expect this approach to be used for integrating genomic information in microbial ecosystems. Models following this approach will provide insights into behaviors (including diversity) that take place at the ecosystem scale.
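As a hedged sketch of the constraint-based core that MO-FBA extends, the toy flux balance problem below maximizes a growth flux under steady-state mass balance; the 3-reaction network, its bounds, and the use of SciPy's generic LP solver are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis (FBA): maximize a "growth" flux subject to
# steady-state mass balance S @ v = 0 and flux bounds. The network is
# hypothetical: v0 uptake (ext -> A), v1 conversion (A -> B), v2 growth (B ->).

S = np.array([[1, -1,  0],    # metabolite A: produced by v0, consumed by v1
              [0,  1, -1]])   # metabolite B: produced by v1, consumed by v2
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units

res = linprog(c=[0, 0, -1],                # maximize v2 == minimize -v2
              A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(round(-res.fun, 6))   # optimal growth flux -> 10.0, limited by uptake
```

MO-FBA replaces the single objective with several (e.g. one growth rate per community member) and explores the resulting trade-off surface instead of a single optimum.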
NASA Astrophysics Data System (ADS)
Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.
2016-03-01
The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using proposed approach compared to that using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables the subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes.
NASA Astrophysics Data System (ADS)
Sweeney, C.; Kort, E. A.; Rella, C.; Conley, S. A.; Karion, A.; Lauvaux, T.; Frankenberg, C.
2015-12-01
Along with a boom in oil and natural gas production in the US, there has been a substantial effort to understand the true environmental impact of these operations on air and water quality, as well as net radiation balance. This multi-institution effort funded by both governmental and non-governmental agencies has provided a case study for identification and verification of emissions using a multi-scale, top-down approach. This approach leverages a combination of remote sensing to identify areas that need specific focus and airborne in-situ measurements to quantify both regional and large- to mid-size single-point emitters. Ground-based networks of mobile and stationary measurements provide the bottom tier of measurements from which process-level information can be gathered to better understand the specific sources and temporal distribution of the emitters. The motivation for this type of approach is largely driven by recent work in the Barnett Shale region in Texas as well as the San Juan Basin in New Mexico and Colorado; these studies suggest that relatively few single-point emitters dominate the regional emissions of CH4.
NASA Astrophysics Data System (ADS)
Tabik, S.; Romero, L. F.; Mimica, P.; Plata, O.; Zapata, E. L.
2012-09-01
A broad area in astronomy focuses on simulating extragalactic objects based on Very Long Baseline Interferometry (VLBI) radio-maps. Several algorithms in this scope simulate what the observed radio-maps would be if emitted from a predefined extragalactic object. This work analyzes the performance and scaling of this kind of algorithm on multi-socket, multi-core architectures. In particular, we evaluate a sharing approach, a privatizing approach and a hybrid approach on systems with a complex memory hierarchy that includes a shared Last Level Cache (LLC). In addition, we investigate which manual processes can be systematized and then automated in future work. The experiments show that the data-privatizing model scales efficiently on medium-scale multi-socket, multi-core systems (up to 48 cores), while, regardless of algorithmic and scheduling optimizations, the sharing approach is unable to reach acceptable scalability on more than one socket. However, the hybrid model with a specific level of data-sharing provides the best scalability over all the multi-socket, multi-core systems used.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ajami, N K; Duan, Q; Gao, X
2005-04-11
This paper examines several multi-model combination techniques: the Simple Multi-model Average (SMA), the Multi-Model Super Ensemble (MMSE), the Modified Multi-Model Super Ensemble (M3SE) and the Weighted Average Method (WAM). These model combination techniques were evaluated using the results from the Distributed Model Intercomparison Project (DMIP), an international project sponsored by the National Weather Service (NWS) Office of Hydrologic Development (OHD). All of the multi-model combination results were obtained using uncalibrated DMIP model outputs and were compared against the best uncalibrated as well as the best calibrated individual model results. The purpose of this study is to understand how different combination techniques affect the skill levels of the multi-model predictions. This study revealed that the multi-model predictions obtained from uncalibrated single model predictions are generally better than any single member model predictions, even the best calibrated single model predictions. Furthermore, more sophisticated multi-model combination techniques that incorporate bias correction steps work better than simple multi-model average predictions or multi-model predictions without bias correction.
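The contrast between a simple multi-model average and a weighted combination can be sketched minimally as follows. This is an illustrative sketch, not the DMIP study's implementation: the inverse-MSE weighting, the toy hydrographs, and all variable names here are assumptions for demonstration only.

```python
import numpy as np

def simple_multimodel_average(predictions):
    """SMA: unweighted mean of the member-model predictions."""
    return np.mean(predictions, axis=0)

def weighted_average(predictions, observations):
    """WAM-style combination (illustrative): weight each member by its
    inverse mean-squared error against observations, so better models
    contribute more. The actual WAM weights are derived differently."""
    errors = np.array([np.mean((p - observations) ** 2) for p in predictions])
    weights = 1.0 / errors
    weights /= weights.sum()
    return np.tensordot(weights, predictions, axes=1)

# Three hypothetical model hydrographs against an observed series
obs = np.array([1.0, 2.0, 3.0, 2.0])
preds = np.array([
    [1.1, 2.1, 3.1, 2.1],   # nearly unbiased member
    [0.5, 1.5, 2.5, 1.5],   # low-biased member
    [2.0, 3.0, 4.0, 3.0],   # high-biased member
])
sma = simple_multimodel_average(preds)
wam = weighted_average(preds, obs)
```

With this weighting the combined prediction leans heavily on the best member, so its error against the observations is smaller than the plain average's.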
Single Cell Multi-Omics Technology: Methodology and Application.
Hu, Youjin; An, Qin; Sheu, Katherine; Trejo, Brandon; Fan, Shuxin; Guo, Ying
2018-01-01
In the era of precision medicine, multi-omics approaches enable the integration of data from diverse omics platforms, providing multi-faceted insight into the interrelation of these omics layers on disease processes. Single cell sequencing technology can dissect the genotypic and phenotypic heterogeneity of bulk tissue and promises to deepen our understanding of the underlying mechanisms governing both health and disease. Through modification and combination of single cell assays available for transcriptome, genome, epigenome, and proteome profiling, single cell multi-omics approaches have been developed to simultaneously and comprehensively study not only the unique genotypic and phenotypic characteristics of single cells, but also the combined regulatory mechanisms evident only at single cell resolution. In this review, we summarize the state-of-the-art single cell multi-omics methods and discuss their applications, challenges, and future directions.
Gorsevski, Pece V; Donevska, Katerina R; Mitrovski, Cvetko D; Frizado, Joseph P
2012-02-01
This paper presents a GIS-based multi-criteria decision analysis approach for evaluating the suitability for landfill site selection in the Polog Region, Macedonia. The multi-criteria decision framework considers environmental and economic factors which are standardized by fuzzy membership functions and combined by integration of analytical hierarchy process (AHP) and ordered weighted average (OWA) techniques. The AHP is used for the elicitation of attribute weights while the OWA operator function is used to generate a wide range of decision alternatives for addressing uncertainty associated with interaction between multiple criteria. The usefulness of the approach is illustrated by different OWA scenarios that report landfill suitability on a scale between 0 and 1. The OWA scenarios are intended to quantify the level of risk taking (i.e., optimistic, pessimistic, and neutral) and to facilitate a better understanding of patterns that emerge from decision alternatives involved in the decision making process. Copyright © 2011 Elsevier Ltd. All rights reserved.
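The OWA operator described above applies position-based weights to the rank-ordered criterion scores, which is how the optimistic-to-pessimistic risk spectrum is generated. A minimal sketch, with hypothetical scores and weight vectors chosen only for illustration:

```python
import numpy as np

def owa(criteria_scores, order_weights):
    """Ordered Weighted Average: sort criterion scores in descending
    order, then take the dot product with position weights that sum to 1."""
    ranked = np.sort(np.asarray(criteria_scores, dtype=float))[::-1]
    return float(np.dot(order_weights, ranked))

# Standardized criterion values for one candidate landfill cell (hypothetical)
scores = [0.9, 0.4, 0.7]
optimistic  = owa(scores, [1.0, 0.0, 0.0])      # all weight on the best score
pessimistic = owa(scores, [0.0, 0.0, 1.0])      # all weight on the worst score
neutral     = owa(scores, [1/3, 1/3, 1/3])      # reduces to the arithmetic mean
```

Moving the weight mass from the top-ranked position toward the bottom-ranked one shifts the decision attitude from risk-taking to risk-averse, with the neutral vector recovering an unweighted average.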
Energy Efficient Image/Video Data Transmission on Commercial Multi-Core Processors
Lee, Sungju; Kim, Heegon; Chung, Yongwha; Park, Daihee
2012-01-01
In transmitting image/video data over Video Sensor Networks (VSNs), energy consumption must be minimized while maintaining high image/video quality. Although image/video compression is well known for its efficiency and usefulness in VSNs, the excessive costs associated with encoding computation and complexity still hinder its adoption for practical use. However, it is anticipated that high-performance handheld multi-core devices will be used as VSN processing nodes in the near future. In this paper, we propose a way to improve the energy efficiency of image and video compression with multi-core processors while maintaining the image/video quality. We improve the compression efficiency at the algorithmic level or derive the optimal parameters for the combination of a machine and compression based on the tradeoff between the energy consumption and the image/video quality. Based on experimental results, we confirm that the proposed approach can improve the energy efficiency of the straightforward approach by a factor of 2∼5 without compromising image/video quality. PMID:23202181
Multi-level trellis coded modulation and multi-stage decoding
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Wu, Jiantian; Lin, Shu
1990-01-01
Several constructions for multi-level trellis codes are presented and many codes with better performance than previously known codes are found. These codes provide a flexible trade-off between coding gain, decoding complexity, and decoding delay. New multi-level trellis coded modulation schemes using generalized set partitioning methods are developed for Quadrature Amplitude Modulation (QAM) and Phase Shift Keying (PSK) signal sets. New rotationally invariant multi-level trellis codes which can be combined with differential encoding to resolve phase ambiguity are presented.
Efficient Data Mining for Local Binary Pattern in Texture Image Analysis
Kwak, Jin Tae; Xu, Sheng; Wood, Bradford J.
2015-01-01
Local binary pattern (LBP) is a simple gray-scale descriptor that characterizes the local distribution of the grey levels in an image. Multi-resolution LBPs and/or combinations of LBPs have been shown to be effective in texture image analysis. However, it is unclear which resolutions or combinations to choose for texture analysis. Examining all the possible cases is impractical and intractable due to the exponential growth of the feature space. This limits the accuracy and the time- and space-efficiency of LBP. Here, we propose a data mining approach for LBP, which efficiently explores a high-dimensional feature space and finds a relatively small number of discriminative features. The features can be any combinations of LBPs. These may not be achievable with conventional approaches. Hence, our approach not only fully utilizes the capability of LBP but also maintains low computational complexity. We incorporated three different descriptors (LBP, local contrast measure, and local directional derivative measure) with three spatial resolutions and evaluated our approach using two comprehensive texture databases. The results demonstrated the effectiveness and robustness of our approach across different experimental designs and texture images. PMID:25767332
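The basic LBP descriptor mentioned above is simple to state: each neighbour is thresholded at the centre pixel's value and the resulting bits are read off as a binary code. A minimal single-pixel sketch for the 8-neighbour, radius-1 case (the neighbour ordering and sample values are arbitrary choices for illustration):

```python
import numpy as np

def lbp_3x3(patch):
    """Basic 8-neighbour LBP code for the centre pixel of a 3x3 patch:
    each neighbour >= centre contributes a 1-bit, read clockwise from
    the top-left corner, giving a code in [0, 255]."""
    center = patch[1, 1]
    # clockwise neighbour order starting at the top-left corner
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2],
                  patch[1, 2], patch[2, 2], patch[2, 1],
                  patch[2, 0], patch[1, 0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= center:
            code |= 1 << bit
    return code

patch = np.array([[5, 9, 1],
                  [3, 6, 7],
                  [2, 6, 8]])
code = lbp_3x3(patch)  # -> 58: bits set for neighbours 9, 7, 8, 6
```

A texture image is then described by the histogram of these codes over all pixels; the multi-resolution variants in the paper change the neighbour radius and count.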
ICPL: Intelligent Cooperative Planning and Learning for Multi-agent Systems
2012-02-29
objective was to develop a new planning approach for teams of multiple UAVs that tightly integrates learning and cooperative control algorithms at multiple levels of the planning architecture. The research results enabled a team of mobile agents to learn to adapt and react to uncertainty in... expressive representation that incorporates feature conjunctions. Our algorithm is simple to implement, fast to execute, and can be combined with any
Information fusion-based approach for studying influence on Twitter using belief theory.
Azaza, Lobna; Kirgizov, Sergey; Savonnet, Marinette; Leclercq, Éric; Gastineau, Nicolas; Faiz, Rim
2016-01-01
Influence in Twitter has recently become a hot research topic, since this micro-blogging service is widely used to share and disseminate information. Some users are more able than others to influence and persuade peers. Thus, studying the most influential users helps reach a large-scale information diffusion area, something very useful in marketing or political campaigns. In this study, we propose a new approach for multi-level influence assessment on multi-relational networks, such as Twitter. We define a social graph to model the relationships between users as a multiplex graph where users are represented by nodes, and links model the different relations between them (e.g., retweets, mentions, and replies). We explore how relations between nodes in this graph could reveal the influence degree and propose a generic computational model to assess the influence degree of a certain node. This is based on the conjunctive combination rule from belief functions theory to combine different types of relations. We apply the proposed method to a large amount of data gathered from Twitter during the European Elections 2014 and deduce the top influential candidates. The results show that our model is flexible enough to consider multiple interaction combinations according to social scientists' needs or requirements and that the numerical results of the belief theory are accurate. We also evaluate the approach over the CLEF RepLab 2014 data set and show that our approach leads to quite interesting results.
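The conjunctive combination rule the abstract refers to is standard in belief functions theory: the mass of every intersection of focal sets is the product of the component masses, with conflict accumulating on the empty set. A minimal sketch over a two-element frame; the relation names and mass values are hypothetical, not taken from the paper:

```python
from itertools import product

def conjunctive_combine(m1, m2):
    """Unnormalized conjunctive rule: for every pair of focal sets B, C,
    add m1(B)*m2(C) to the mass of B & C; conflict lands on frozenset()."""
    combined = {}
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        a = b & c
        combined[a] = combined.get(a, 0.0) + mb * mc
    return combined

# Frame of discernment: a user is influential (I) or not (N)
I, N = frozenset({"I"}), frozenset({"N"})
IN = I | N  # total ignorance

m_retweets = {I: 0.6, IN: 0.4}          # hypothetical evidence from retweets
m_mentions = {I: 0.3, N: 0.2, IN: 0.5}  # hypothetical evidence from mentions
m = conjunctive_combine(m_retweets, m_mentions)
# m[I] = 0.60, m[N] = 0.08, m[IN] = 0.20, and 0.12 of conflict on the empty set
```

Because the rule is unnormalized, the conflict mass on the empty set is kept explicit, which is useful when sources (here, relation types) may disagree.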
ERIC Educational Resources Information Center
Chadli, Abdelhafid; Bendella, Fatima; Tranvouez, Erwan
2015-01-01
In this paper we present an Agent-based evaluation approach in a context of Multi-agent simulation learning systems. Our evaluation model is based on a two stage assessment approach: (1) a Distributed skill evaluation combining agents and fuzzy sets theory; and (2) a Negotiation based evaluation of students' performance during a training…
NASA Astrophysics Data System (ADS)
Zhu, Aichun; Wang, Tian; Snoussi, Hichem
2018-03-01
This paper addresses the problems of graphical-model-based human pose estimation in still images, including the diversity of appearances and confounding background clutter. We present a new architecture for estimating human pose using a Convolutional Neural Network (CNN). Firstly, a Relative Mixture Deformable Model (RMDM) is defined by each pair of connected parts to compute the relative spatial information in the graphical model. Secondly, a Local Multi-Resolution Convolutional Neural Network (LMR-CNN) is proposed to train and learn the multi-scale representation of each body part by combining different levels of part context. Thirdly, a LMR-CNN based hierarchical model is defined to explore the context information of limb parts. Finally, the experimental results demonstrate the effectiveness of the proposed deep learning approach for human pose estimation.
Novel Wireless-Communicating Textiles Made from Multi-Material and Minimally-Invasive Fibers
Gorgutsa, Stepan; Bélanger-Garnier, Victor; Ung, Bora; Viens, Jeff; Gosselin, Benoit; LaRochelle, Sophie; Messaddeq, Younes
2014-01-01
The ability to integrate multiple materials into miniaturized fiber structures enables the realization of novel biomedical textile devices with higher-level functionalities and minimally-invasive attributes. In this work, we present novel textile fabrics integrating unobtrusive multi-material fibers that communicate through 2.4 GHz wireless networks with excellent signal quality. The conductor elements of the textiles are embedded within the fibers themselves, providing electrical and chemical shielding against the environment, while preserving the mechanical and cosmetic properties of the garments. These multi-material fibers combine insulating and conducting materials into a well-defined geometry, and represent a cost-effective and minimally-invasive approach to sensor fabrics and bio-sensing textiles connected in real time to mobile communications infrastructures, suitable for a variety of health and life science applications. PMID:25325335
Multi-atlas and label fusion approach for patient-specific MRI based skull estimation.
Torrado-Carvajal, Angel; Herraiz, Joaquin L; Hernandez-Tamames, Juan A; San Jose-Estepar, Raul; Eryaman, Yigitcan; Rozenholc, Yves; Adalsteinsson, Elfar; Wald, Lawrence L; Malpica, Norberto
2016-04-01
MRI-based skull segmentation is a useful procedure for many imaging applications. This study describes a methodology for automatic segmentation of the complete skull from a single T1-weighted volume. The skull is estimated using a multi-atlas segmentation approach. Using a whole head computed tomography (CT) scan database, the skull in a new MRI volume is detected by nonrigid image registration of the volume to every CT, and combination of the individual segmentations by label-fusion. We have compared Majority Voting, Simultaneous Truth and Performance Level Estimation (STAPLE), Shape Based Averaging (SBA), and the Selective and Iterative Method for Performance Level Estimation (SIMPLE) algorithms. The pipeline has been evaluated quantitatively using images from the Retrospective Image Registration Evaluation database (reaching an overlap of 72.46 ± 6.99%), a clinical CT-MR dataset (maximum overlap of 78.31 ± 6.97%), and a whole head CT-MRI pair (maximum overlap 78.68%). A qualitative evaluation has also been performed on MRI acquisition of volunteers. It is possible to automatically segment the complete skull from MRI data using a multi-atlas and label fusion approach. This will allow the creation of complete MRI-based tissue models that can be used in electromagnetic dosimetry applications and attenuation correction in PET/MR. © 2015 Wiley Periodicals, Inc.
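Of the label-fusion strategies compared above, Majority Voting is the simplest: each registered atlas casts one vote per voxel and the most-voted label wins. A minimal sketch with tiny hypothetical binary skull masks (STAPLE, SBA, and SIMPLE are more involved and not shown):

```python
import numpy as np

def majority_vote(label_maps):
    """Majority-voting label fusion: at each voxel, pick the label
    proposed by the most atlases (ties resolve to the lowest label,
    via argmax's first-occurrence behaviour)."""
    stacked = np.stack(label_maps)                 # (n_atlases, ...) int labels
    n_labels = int(stacked.max()) + 1
    # vote count per label at every voxel, then argmax over the label axis
    votes = np.stack([(stacked == lab).sum(axis=0) for lab in range(n_labels)])
    return votes.argmax(axis=0)

# Three hypothetical 2x2 binary skull masks propagated from CT atlases
a1 = np.array([[1, 0], [1, 1]])
a2 = np.array([[1, 0], [0, 1]])
a3 = np.array([[0, 0], [1, 1]])
fused = majority_vote([a1, a2, a3])  # -> [[1, 0], [1, 1]]
```

In the actual pipeline the inputs would be the CT-derived skull labels warped into the subject's MRI space by nonrigid registration; the fusion step itself is exactly this per-voxel vote.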
Butel, Jean; Braun, Kathryn L; Novotny, Rachel; Acosta, Mark; Castro, Rose; Fleming, Travis; Powers, Julianne; Nigg, Claudio R
2015-12-01
Addressing complex chronic disease prevention, like childhood obesity, requires a multi-level, multi-component culturally relevant approach with broad reach. Models are lacking to guide fidelity monitoring across multiple levels, components, and sites engaged in such interventions. The aim of this study is to describe the fidelity-monitoring approach of The Children's Healthy Living (CHL) Program, a multi-level multi-component intervention in five Pacific jurisdictions. A fidelity-monitoring rubric was developed. About halfway during the intervention, community partners were randomly selected and interviewed independently by local CHL staff and by Coordinating Center representatives to assess treatment fidelity. Ratings were compared and discussed by local and Coordinating Center staff. There was good agreement between the teams (Kappa = 0.50, p < 0.001), and intervention improvement opportunities were identified through data review and group discussion. Fidelity for the multi-level, multi-component, multi-site CHL intervention was successfully assessed, identifying adaptations as well as ways to improve intervention delivery prior to the end of the intervention.
Structural Optimization of a Force Balance Using a Computational Experiment Design
NASA Technical Reports Server (NTRS)
Parker, P. A.; DeLoach, R.
2002-01-01
This paper proposes a new approach to force balance structural optimization featuring a computational experiment design. Currently, this multi-dimensional design process requires the designer to perform a simplification by executing parameter studies on a small subset of design variables. This one-factor-at-a-time approach varies a single variable while holding all others at a constant level. Consequently, subtle interactions among the design variables, which can be exploited to achieve the design objectives, go undetected. The proposed method combines Modern Design of Experiments techniques to direct the exploration of the multi-dimensional design space, and a finite element analysis code to generate the experimental data. To efficiently search for an optimum combination of design variables and minimize the computational resources, a sequential design strategy was employed. Experimental results from the optimization of a non-traditional force balance measurement section are presented. An approach to overcome the unique problems associated with the simultaneous optimization of multiple response criteria is described. A quantitative single-point design procedure that reflects the designer's subjective impression of the relative importance of various design objectives, and a graphical multi-response optimization procedure that provides further insights into available tradeoffs among competing design objectives, are illustrated. The proposed method enhances the intuition and experience of the designer by providing new perspectives on the relationships between the design variables and the competing design objectives, providing a systematic foundation for advancements in structural design.
Taxonomy of multi-focal nematode image stacks by a CNN based image fusion approach.
Liu, Min; Wang, Xueping; Zhang, Hongzhong
2018-03-01
In the biomedical field, digital multi-focal images are very important for documentation and communication of specimen data, because the morphological information for a transparent specimen can be captured in the form of a stack of high-quality images. Given biomedical image stacks containing multi-focal images, how to efficiently extract effective features from all layers to classify the image stacks is still an open question. We present a deep convolutional neural network (CNN) image fusion based multilinear approach for the taxonomy of multi-focal image stacks. A deep CNN based image fusion technique is used to combine relevant information of multi-focal images within a given image stack into a single image, which is more informative and complete than any single image in the given stack. Besides, multi-focal images within a stack are fused along 3 orthogonal directions, and multiple features extracted from the fused images along different directions are combined by canonical correlation analysis (CCA). Because multi-focal image stacks represent the effect of different factors - texture, shape, different instances within the same class and different classes of objects - we embed the deep CNN based image fusion method within a multilinear framework to propose an image fusion based multilinear classifier. The experimental results on nematode multi-focal image stacks demonstrated that the deep CNN image fusion based multilinear classifier can reach a higher classification rate (95.7%) than the previous multilinear based approach (88.7%), even though we only use the texture feature instead of the combination of texture and shape features as in the previous work. The proposed deep CNN image fusion based multilinear approach shows great potential in building an automated nematode taxonomy system for nematologists, and it is effective for classifying multi-focal image stacks. Copyright © 2018 Elsevier B.V. All rights reserved.
Kim, Dokyoon; Joung, Je-Gun; Sohn, Kyung-Ah; Shin, Hyunjung; Park, Yu Rang; Ritchie, Marylyn D; Kim, Ju Han
2015-01-01
Objective: Cancer can involve gene dysregulation via multiple mechanisms, so no single level of genomic data fully elucidates tumor behavior due to the presence of numerous genomic variations within or between levels in a biological system. We have previously proposed a graph-based integration approach that combines multi-omics data including copy number alteration, methylation, miRNA, and gene expression data for predicting clinical outcome in cancer. However, genomic features likely interact with other genomic features in complex signaling or regulatory networks, since cancer is caused by alterations in pathways or complete processes. Methods: Here we propose a new graph-based framework for integrating multi-omics data and genomic knowledge to improve power in predicting clinical outcomes and elucidate interplay between different levels. To highlight the validity of our proposed framework, we used an ovarian cancer dataset from The Cancer Genome Atlas for predicting stage, grade, and survival outcomes. Results: Integrating multi-omics data with genomic knowledge to construct pre-defined features resulted in higher performance in clinical outcome prediction and higher stability. For the grade outcome, the model with gene expression data produced an area under the receiver operating characteristic curve (AUC) of 0.7866. However, models of the integration with pathway, Gene Ontology, chromosomal gene set, and motif gene set consistently outperformed the model with genomic data only, attaining AUCs of 0.7873, 0.8433, 0.8254, and 0.8179, respectively. Conclusions: Integrating multi-omics data and genomic knowledge to improve understanding of molecular pathogenesis and underlying biology in cancer should improve diagnostic and prognostic indicators and the effectiveness of therapies. PMID:25002459
Multi-spectral black meta-infrared detectors (Conference Presentation)
NASA Astrophysics Data System (ADS)
Krishna, Sanjay
2016-09-01
There is an increased emphasis on obtaining imaging systems with on-demand spectro-polarimetric information at the pixel level. Meta-infrared detectors in which infrared detectors are combined with metamaterials are a promising way to realize this. The infrared region is appealing due to the low metallic loss, large penetration depth of the localized field and the larger feature sizes compared to the visible region. I will discuss approaches to realize multispectral detectors including our recent work on double metal meta-material design combined with Type II superlattices that have demonstrated enhanced quantum efficiency (collaboration with Padilla group at Duke University).
Optimizing Multi-Product Multi-Constraint Inventory Control Systems with Stochastic Replenishments
NASA Astrophysics Data System (ADS)
Allah Taleizadeh, Ata; Aryanezhad, Mir-Bahador; Niaki, Seyed Taghi Akhavan
Multi-periodic inventory control problems are mainly studied employing two assumptions. The first is the continuous review, where depending on the inventory level orders can happen at any time and the other is the periodic review, where orders can only happen at the beginning of each period. In this study, we relax these assumptions and assume that the periodic replenishments are stochastic in nature. Furthermore, we assume that the periods between two replenishments are independent and identically random variables. For the problem at hand, the decision variables are of integer-type and there are two kinds of space and service level constraints for each product. We develop a model of the problem in which a combination of back-order and lost-sales are considered for the shortages. Then, we show that the model is of an integer-nonlinear-programming type and in order to solve it, a search algorithm can be utilized. We employ a simulated annealing approach and provide a numerical example to demonstrate the applicability of the proposed methodology.
A Summary of the Naval Postgraduate School Research Program
1989-08-30
Fundamental Theory for Automatically Combining Changes to Software Systems; Database-System Approach to Software Engineering Environments (SEEs); Multilevel Database Security; Temporal Database Management and Real-Time Database Computers; The Multi-lingual, Multi-Model, Multi-Backend Database
NASA Astrophysics Data System (ADS)
Torres-Martínez, J. A.; Seddaiu, M.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; González-Aguilera, D.
2015-02-01
The complexity of archaeological sites makes it difficult to obtain an integral model using current Geomatic techniques (i.e. aerial and close-range photogrammetry and terrestrial laser scanning) individually, so a multi-sensor approach is proposed as the best solution to provide a 3D reconstruction and visualization of these complex sites. Sensor registration represents a critical milestone when automation is required and when aerial and terrestrial datasets must be integrated. To this end, several problems must be solved: coordinate system definition, geo-referencing, co-registration of point clouds, geometric and radiometric homogeneity, etc. Last but not least, safeguarding tangible archaeological heritage and its associated intangible expressions entails a multi-source data approach in which heterogeneous material (historical documents, drawings, archaeological techniques, habits of living, etc.) should be collected and combined with the resulting hybrid 3D models. The proposed multi-data-source and multi-sensor approach is applied to the case study of the "Tolmo de Minateda" archaeological site. A total extension of 9 ha is reconstructed, with an adapted level of detail, by an ultralight aerial platform (paratrike), an unmanned aerial vehicle, a terrestrial laser scanner and terrestrial photogrammetry. In addition, the defensive nature of the site (i.e. the presence of three different defensive walls) together with the considerable stratification of the archaeological site (i.e. different archaeological surfaces and constructive typologies) requires that tangible and intangible archaeological heritage expressions be integrated with the hybrid 3D models obtained, so that different experts and heritage stakeholders can analyse, understand and exploit the archaeological site.
Håkansson, Asa; Bränning, Camilla; Adawi, Diya; Molin, Göran; Nyman, Margareta; Jeppsson, Bengt; Ahrné, Siv
2009-01-01
The enteric microbiota is a pivotal factor in the development of intestinal inflammation in humans but probiotics, dietary fibres and phytochemicals can have anti-inflammatory effects. The aim of this study was to evaluate the therapeutic effect of multi-strain probiotics and two conceivable prebiotics in an experimental colitis model. Sprague-Dawley rats were fed a fibre-free diet alone or in combination with Lactobacillus crispatus DSM 16743, L. gasseri DSM 16737 and Bifidobacterium infantis DSM 15158 and/or rye bran and blueberry husks. Colitis was induced by 5% dextran sulphate sodium (DSS) given by oro-gastric tube. Colitis severity, inflammatory markers, gut-load of lactobacilli and Enterobacteriaceae, bacterial translocation and formation of carboxylic acids (CAs) were analysed. The disease activity index (DAI) was lower in all treatment groups. Viable counts of Enterobacteriaceae were reduced and correlated positively with colitis severity, while DAI was negatively correlated with several CAs, e.g. butyric acid. The addition of probiotics to blueberry husks lowered the level of caecal acetic acid and increased that of propionic acid, while rye bran in combination with probiotics increased caecal CA levels and decreased distal colonic levels. Blueberry husks with probiotics reduced the incidence of bacterial translocation to the liver, colonic levels of myeloperoxidase, malondialdehyde and serum interleukin-12. Acetic and butyric acids in colonic content correlated negatively to malondialdehyde. A combination of probiotics and blueberry husks or rye bran enhanced the anti-inflammatory effects compared with probiotics or dietary fibres alone. These combinations can be used as a preventive or therapeutic approach to dietary amelioration of intestinal inflammation.
Multi person detection and tracking based on hierarchical level-set method
NASA Astrophysics Data System (ADS)
Khraief, Chadia; Benzarti, Faouzi; Amiri, Hamid
2018-04-01
In this paper, we propose an efficient unsupervised method for multi-person tracking based on a hierarchical level-set approach. The proposed method uses both edge and region information in order to detect objects effectively. The persons are tracked in each frame of the sequence by minimizing an energy functional that combines color, texture and shape information. These features are encoded in a covariance matrix used as a region descriptor. The method is fully automated, with no need to manually specify the initial level-set contour: it is based on combined person detection and background subtraction. The edge information is employed to maintain a stable evolution, guide the segmentation towards apparent boundaries and inhibit region fusion. The computational cost of the level-set evolution is reduced by using a narrow-band technique. Experimental results on challenging video sequences show the effectiveness of the proposed method.
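The covariance region descriptor mentioned above can be sketched as follows. The per-pixel features used here (intensity and gradient magnitudes) are an assumption for illustration; the abstract only states that color, texture and shape information are enrolled in the matrix.

```python
# Sketch of a covariance region descriptor: each pixel in a region
# contributes a feature vector; the region is summarized by the sample
# covariance matrix of those vectors (a common descriptor in tracking).

def covariance_descriptor(features):
    """features: list of equal-length feature vectors, one per pixel."""
    n = len(features)
    d = len(features[0])
    mean = [sum(f[k] for f in features) / n for k in range(d)]
    cov = [[0.0] * d for _ in range(d)]
    for f in features:
        for i in range(d):
            for j in range(d):
                cov[i][j] += (f[i] - mean[i]) * (f[j] - mean[j])
    return [[cov[i][j] / (n - 1) for j in range(d)] for i in range(d)]

# Toy region: one (intensity, |dx|, |dy|) vector per pixel.
region = [(10, 1, 0), (12, 1, 1), (11, 2, 0), (13, 0, 1)]
C = covariance_descriptor(region)
```

The resulting d x d matrix is compact and independent of region size, which is what makes it convenient inside an energy functional.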
Ashraf, Muhammad Tahir; Schmidt, Jens Ejbye
2018-02-01
A biorefinery based on multi-feedstock lignocellulose can be viable where a sustainable supply of a single substrate is limited, for example in arid regions. Processing of mixed feedstocks has been studied at lab scale; its economics, however, are less studied. In this study, an economic comparison was made between separate and combined (mixed) processing approaches for multi-feedstock lignocellulose for the production of monomeric sugars. This modular focus on the sugar platform makes the results applicable to many applications using the sugars as feedstock. The feedstocks considered were green and woody lignocellulose residues: Bermuda grass, Jasmine hedges, and date palm fronds. Results showed that, at an identical total feed rate, combined processing was more advantageous than separate processing. A further sensitivity analysis on mixed combined processing showed that the cellulase enzyme price and the feed price are the two major factors affecting production cost. Copyright © 2017 Elsevier Ltd. All rights reserved.
A closed-loop multi-level model of glucose homeostasis
Uluseker, Cansu; Simoni, Giulia; Dauriz, Marco; Matone, Alice
2018-01-01
Background The pathophysiologic processes underlying the regulation of glucose homeostasis are considerably complex at both cellular and systemic level. A comprehensive and structured specification for the several layers of abstraction of glucose metabolism is often elusive, an issue currently solvable with the hierarchical description provided by multi-level models. In this study we propose a multi-level closed-loop model of whole-body glucose homeostasis, coupled with the molecular specifications of the insulin signaling cascade in adipocytes, under the experimental conditions of normal glucose regulation and type 2 diabetes. Methodology/Principal findings The ordinary differential equations of the model, describing the dynamics of glucose and key regulatory hormones and their reciprocal interactions among gut, liver, muscle and adipose tissue, were designed for being embedded in a modular, hierarchical structure. The closed-loop model structure allowed self-sustained simulations to represent an ideal in silico subject that adjusts its own metabolism to the fasting and feeding states, depending on the hormonal context and invariant to circadian fluctuations. The cellular level of the model provided a seamless dynamic description of the molecular mechanisms downstream the insulin receptor in the adipocytes by accounting for variations in the surrounding metabolic context. Conclusions/Significance The combination of a multi-level and closed-loop modeling approach provided a fair dynamic description of the core determinants of glucose homeostasis at both cellular and systemic scales. This model architecture is intrinsically open to incorporate supplementary layers of specifications describing further individual components influencing glucose metabolism. PMID:29420588
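As a toy illustration of the closed-loop idea (not the authors' model, whose equations couple gut, liver, muscle and adipose tissue), a minimal glucose-insulin feedback loop can be written as two coupled ODEs and integrated with forward Euler; all parameter values below are invented placeholders.

```python
# Minimal closed-loop glucose-insulin sketch (hypothetical parameters):
# glucose G is produced at a basal rate and cleared in proportion to
# insulin I; insulin is secreted when G exceeds a threshold and decays
# back to its basal level. A "meal" adds a transient glucose input.

def simulate(meal=0.0, t_end=600.0, dt=0.1):
    G, I = 90.0, 10.0           # mg/dL, arbitrary insulin units
    prod, k_clear = 1.0, 0.001  # basal production, insulin-mediated clearance
    k_sec, k_deg = 0.05, 0.05   # insulin secretion and degradation rates
    for step in range(int(t_end / dt)):
        t = step * dt
        dG = prod + (meal if t < 30 else 0.0) - k_clear * G * I
        dI = k_sec * max(G - 90.0, 0.0) - k_deg * (I - 10.0)
        G += dt * dG
        I += dt * dI
    return G, I

G_fast, _ = simulate(meal=0.0)  # fasting: settles near its set point
G_fed, _ = simulate(meal=2.0)   # fed: transient excursion, then recovery
```

The closed loop is what makes the simulation self-sustained: the feeding perturbation is absorbed and the state returns to the same operating point without any externally prescribed trajectory.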
Modeling human diseases with induced pluripotent stem cells: from 2D to 3D and beyond.
Liu, Chun; Oikonomopoulos, Angelos; Sayed, Nazish; Wu, Joseph C
2018-03-08
The advent of human induced pluripotent stem cells (iPSCs) presents unprecedented opportunities to model human diseases. Differentiated cells derived from iPSCs in two-dimensional (2D) monolayers have proven to be a relatively simple tool for exploring disease pathogenesis and underlying mechanisms. In this Spotlight article, we discuss the progress and limitations of the current 2D iPSC disease-modeling platform, as well as recent advancements in the development of human iPSC models that mimic in vivo tissues and organs at the three-dimensional (3D) level. Recent bioengineering approaches have begun to combine different 3D organoid types into a single '4D multi-organ system'. We summarize the advantages of this approach and speculate on the future role of 4D multi-organ systems in human disease modeling. © 2018. Published by The Company of Biologists Ltd.
Operation Exodus: The Massacre of 44 Philippine Police Commandos In Mamasapano Clash
2016-09-01
strategic thinking, utilizing Game Theory and Multi-Attribute Decision Making; the combination of these two dynamic tools is used to evaluate their potential... Contents include: A. Game Theoretic Approach; B. Applying Game Theory to OPLAN: EXODUS.
Handwritten Word Recognition Using Multi-view Analysis
NASA Astrophysics Data System (ADS)
de Oliveira, J. J.; de A. Freitas, C. O.; de Carvalho, J. M.; Sabourin, R.
This paper contributes to the problem of efficiently recognizing handwritten words from a limited-size lexicon. To that end, a multiple-classifier system has been developed that analyzes the words at three different approximation levels, in order to obtain a computational approach inspired by the human reading process. For each approximation level, a three-module architecture is defined, composed of a zoning mechanism (pseudo-segmenter), a feature extractor and a classifier. The proposed application is the recognition of the Portuguese handwritten names of the months, for which a best recognition rate of 97.7% was obtained using classifier combination.
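The final combination step can be sketched with the standard sum and product rules for fusing per-class posterior estimates from several classifiers; the abstract does not say which rule the authors used, so this is a generic illustration with made-up scores.

```python
# Generic classifier-combination rules: each classifier outputs a
# posterior distribution over the lexicon; fusion merges the
# distributions and picks the top-scoring word.

def combine(posteriors, rule="sum"):
    """posteriors: list of dicts {word: probability}, one per classifier."""
    words = posteriors[0].keys()
    if rule == "sum":
        scores = {w: sum(p[w] for p in posteriors) for w in words}
    else:  # product rule
        scores = {w: 1.0 for w in words}
        for p in posteriors:
            for w in words:
                scores[w] *= p[w]
    return max(scores, key=scores.get)

# Three classifiers scoring three Portuguese month names (toy values).
clf_outputs = [
    {"janeiro": 0.6, "junho": 0.3, "julho": 0.1},
    {"janeiro": 0.5, "junho": 0.2, "julho": 0.3},
    {"janeiro": 0.4, "junho": 0.5, "julho": 0.1},
]
best = combine(clf_outputs, rule="sum")
```

The sum rule is known to be more tolerant of individual-classifier estimation errors than the product rule, which a single near-zero posterior can dominate.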
Quantitative multi-modal NDT data analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heideklang, René; Shokouhi, Parisa
2014-02-18
A single NDT technique is often not adequate to assess the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task that involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these are discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of eddy current, GMR and thermography measurements on a reference metallic specimen with built-in grooves is presented. Results show that fusion is able to outperform the best single sensor in detection specificity, while retaining the same level of sensitivity.
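A high-level (decision-level) fusion scheme of the kind described can be sketched as simple voting over per-modality detection maps; the three binary maps and the vote threshold below are invented for illustration.

```python
# Decision-level fusion: each NDT modality yields a binary detection
# map over the same grid; a defect is declared where at least k
# modalities agree, trading sensitivity against specificity.

def fuse_votes(maps, k=2):
    rows, cols = len(maps[0]), len(maps[0][0])
    return [[int(sum(m[r][c] for m in maps) >= k) for c in range(cols)]
            for r in range(rows)]

# Toy 2x3 detection maps from three modalities.
eddy   = [[0, 1, 1], [0, 0, 1]]
gmr    = [[0, 1, 0], [0, 0, 1]]
thermo = [[1, 1, 0], [0, 0, 1]]
fused = fuse_votes([eddy, gmr, thermo], k=2)
```

Raising k suppresses single-modality false alarms (higher specificity), which matches the redundancy argument in the abstract; lowering k recovers sensitivity.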
[Research on the methods for multi-class kernel CSP-based feature extraction].
Wang, Jinjia; Zhang, Lingzhi; Hu, Bei
2012-04-01
To relax the assumption of strictly linear patterns in common spatial patterns (CSP), we studied kernel CSP (KCSP). A new multi-class KCSP (MKCSP) approach is proposed in this paper, combining the kernel approach with the multi-class CSP technique. In this approach, we computed kernel spatial patterns for each class against all others and extracted signal components specific to one condition from EEG data sets covering multiple conditions. Classification was then performed using a logistic linear classifier. Brain-computer interface (BCI) Competition III dataset IIIa was used in the experiment, which showed that this approach can decompose raw EEG signals into spatial patterns extracted from multi-class single-trial EEG and obtain good classification results.
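For reference, classical two-class CSP solves the generalized eigenvalue problem below; the kernel variant evaluates the same quantities in a kernel-induced feature space. The notation is generic, not taken from the paper.

```latex
% \Sigma_c is the spatial covariance matrix estimated from class-c trials.
\Sigma_1 \, w = \lambda \, (\Sigma_1 + \Sigma_2) \, w
% Filters w with the largest and smallest eigenvalues \lambda maximize
% the variance ratio
% J(w) = \frac{w^{\top} \Sigma_1 w}{w^{\top} \Sigma_2 w}.
% In the one-vs-rest multi-class setting used here, \Sigma_2 is the
% pooled covariance of all remaining classes.
```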
de Oliveira Dal'Molin, Cristiana G; Orellana, Camila; Gebbie, Leigh; Steen, Jennifer; Hodson, Mark P; Chrysanthopoulos, Panagiotis; Plan, Manuel R; McQualter, Richard; Palfreyman, Robin W; Nielsen, Lars K
2016-01-01
The urgent need for major gains in industrial crop productivity and in biofuel production from bioenergy grasses has reinforced attention on understanding C4 photosynthesis. Systems biology studies of C4 model plants may reveal important features of C4 metabolism. Here we chose foxtail millet (Setaria italica) as a C4 model plant and developed protocols to perform systems biology studies. As part of the systems approach, we developed and used a genome-scale metabolic reconstruction in combination with multi-omics technologies to gain more insight into the metabolism of S. italica. mRNA, protein and metabolite abundances were measured in mature and immature stem/leaf phytomers, and the multi-omics data were integrated into the metabolic reconstruction framework to capture key metabolic features at different developmental stages of the plant. RNA-Seq reads were mapped to the S. italica genome, covering 83% of its protein-coding genes. Besides revealing similarities and differences in the central metabolism of mature and immature tissues, transcriptome analysis indicates significant expression of two malic enzyme isoforms (NADP-ME and NAD-ME). Although much greater expression of NADP-ME genes is observed, and confirmed by the corresponding protein abundances in the samples, the expression of multiple genes combined with the significant abundance of metabolites that participate in the C4 metabolism of the NAD-ME and NADP-ME subtypes suggests that S. italica may use mixed decarboxylation modes of the C4 photosynthetic pathway at different developmental stages. The overall analysis also indicates different levels of regulation in mature and immature tissues in carbon fixation, glycolysis, the TCA cycle, and amino acid, fatty acid, lignin and cellulose synthesis. Altogether, the multi-omics analysis reveals different biological entities and their interrelation and regulation over plant development.
With this study, we demonstrated that this systems approach is powerful enough to complement the functional metabolic annotation of bioenergy grasses. PMID:27559337
Combination Approaches to Combat Multi-Drug Resistant Bacteria
Worthington, Roberta J.; Melander, Christian
2013-01-01
The increasing prevalence of infections caused by multi-drug resistant bacteria is a global health problem that is exacerbated by the dearth of novel classes of antibiotics entering the clinic over the past 40 years. Herein we describe recent developments toward combination therapies for the treatment of multi-drug resistant bacterial infections. These efforts include antibiotic-antibiotic combinations, and the development of adjuvants that either directly target resistance mechanisms such as the inhibition of β-lactamase enzymes, or indirectly target resistance by interfering with bacterial signaling pathways such as two-component systems. We also discuss screening of libraries of previously approved drugs to identify non-obvious antimicrobial adjuvants. PMID:23333434
Revisiting the Robustness of PET-Based Textural Features in the Context of Multi-Centric Trials.
Bailly, Clément; Bodet-Milin, Caroline; Couespel, Solène; Necib, Hatem; Kraeber-Bodéré, Françoise; Ansquer, Catherine; Carlier, Thomas
2016-01-01
This study aimed to investigate the variability of textural features (TFs) as a function of acquisition and reconstruction parameters within the context of multi-centric trials. The robustness of 15 selected TFs was studied as a function of the number of iterations, the post-filtering level, input data noise, the reconstruction algorithm and the matrix size. A combination of several reconstruction and acquisition settings was devised to mimic multi-centric conditions. We retrospectively studied data from 26 patients enrolled in a diagnostic study that aimed to evaluate the performance of 68Ga-DOTANOC PET/CT in gastro-entero-pancreatic neuroendocrine tumors. Forty-one tumors were extracted and served as the database. The coefficient of variation (COV), or the absolute deviation for the noise study, was derived and compared statistically with SUVmax and SUVmean results. The majority of the investigated TFs can be used in a multi-centric context when each parameter is considered individually. The impact of voxel size and of noise in the input data was predominant, as only four TFs presented high or intermediate robustness relative to SUV-based metrics (Entropy, Homogeneity, RP and ZP). When combining several reconstruction settings to mimic multi-centric conditions, most of the investigated TFs were robust relative to SUVmax, except Correlation, Contrast, LGRE, LGZE and LZLGE. Considering previously published results on reproducibility and on sensitivity to the delineation approach, together with our findings, it is feasible to consider Homogeneity, Entropy, Dissimilarity, HGRE, HGZE and ZP as relevant for use in multi-centric trials.
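The robustness metric used above, the coefficient of variation across reconstruction settings, is straightforward to compute; the feature values below are invented for illustration, not taken from the study.

```python
# Coefficient of variation (COV = std/mean, in percent) of a texture
# feature measured for one tumor across several reconstruction settings;
# a low COV suggests the feature is robust to those settings.

from statistics import mean, stdev

def cov_percent(values):
    return 100.0 * stdev(values) / mean(values)

entropy_vals  = [6.10, 6.05, 6.20, 6.12]    # stable across settings
contrast_vals = [120.0, 95.0, 160.0, 70.0]  # strongly setting-dependent

robust = cov_percent(entropy_vals) < 10.0
fragile = cov_percent(contrast_vals) > 10.0
```

Comparing each feature's COV against the COV of SUVmax or SUVmean, as in the study, then classifies its robustness as high, intermediate or low.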
Multi-scale heat and mass transfer modelling of cell and tissue cryopreservation
Xu, Feng; Moon, Sangjun; Zhang, Xiaohui; Shao, Lei; Song, Young Seok; Demirci, Utkan
2010-01-01
Cells and tissues undergo complex physical processes during cryopreservation. Understanding the underlying physical phenomena is critical to improve current cryopreservation methods and to develop new techniques. Here, we describe multi-scale approaches for modelling cell and tissue cryopreservation including heat transfer at macroscale level, crystallization, cell volume change and mass transport across cell membranes at microscale level. These multi-scale approaches allow us to study cell and tissue cryopreservation. PMID:20047939
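The microscale water-transport component referenced above is typically written as a single ODE for cell volume driven by the osmotic concentration difference across the membrane; the formalism and all parameter values below are generic placeholders, not taken from the paper.

```python
# Osmotic dehydration sketch: cell volume V shrinks as water leaves in
# proportion to the concentration difference across the membrane.
#   dV/dt = -Lp * A * (C_ext - C_int),   C_int = n_s / (V - V_b)
# with membrane permeability Lp, surface area A, intracellular solute
# amount n_s, and osmotically inactive volume V_b (all dimensionless here).

def dehydrate(V0=1.0, Vb=0.2, n_s=0.24, C_ext=0.5, Lp=0.05, A=1.0,
              dt=0.01, t_end=200.0):
    V = V0
    for _ in range(int(t_end / dt)):
        C_int = n_s / (V - Vb)          # intracellular concentration
        V += dt * (-Lp * A * (C_ext - C_int))
    return V

V_final = dehydrate()  # shrinks until C_int matches C_ext
```

At equilibrium C_int = C_ext, i.e. V = Vb + n_s / C_ext = 0.68 for these values, which the forward-Euler integration approaches.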
2010-03-24
COIN response contributed to the movement's growth. Conclusion: a multi-pronged COIN strategy that combines the use of force with concrete steps to... excesses provided the spark that turned the agitation into a conflagration of violent struggle. The resultant violence soon spread to other areas and... coordinate a comprehensive COIN approach against the Maoists.
Multi-scale model for the hierarchical architecture of native cellulose hydrogels.
Martínez-Sanz, Marta; Mikkelsen, Deirdre; Flanagan, Bernadine; Gidley, Michael J; Gilbert, Elliot P
2016-08-20
The structure of protiated and deuterated cellulose hydrogels has been investigated using a multi-technique approach combining small-angle scattering with diffraction, spectroscopy and microscopy. A model for the multi-scale structure of native cellulose hydrogels is proposed which highlights the essential role of water at different structural levels characterised by: (i) the existence of cellulose microfibrils containing an impermeable crystalline core surrounded by a partially hydrated paracrystalline shell, (ii) the creation of a strong network of cellulose microfibrils held together by hydrogen bonding to form cellulose ribbons and (iii) the differential behaviour of tightly bound water held within the ribbons compared to bulk solvent. Deuterium labelling provides an effective platform on which to further investigate the role of different plant cell wall polysaccharides in cellulose composite formation through the production of selectively deuterated cellulose composite hydrogels. Copyright © 2016 Elsevier Ltd. All rights reserved.
Finegan, Donal P; Scheel, Mario; Robinson, James B; Tjaden, Bernhard; Di Michiel, Marco; Hinds, Gareth; Brett, Dan J L; Shearing, Paul R
2016-11-16
Catastrophic failure of lithium-ion batteries occurs across multiple length scales and over very short time periods. A combination of high-speed operando tomography, thermal imaging and electrochemical measurements is used to probe the degradation mechanisms leading up to overcharge-induced thermal runaway of a LiCoO2 pouch cell, through its interrelated dynamic structural, thermal and electrical responses. Failure mechanisms across multiple length scales are explored using a post-mortem multi-scale tomography approach, revealing significant morphological and phase changes in the LiCoO2 electrode microstructure and location-dependent degradation. This combined operando and multi-scale X-ray computed tomography (CT) technique is demonstrated as a comprehensive approach to understanding battery degradation and failure.
A computational intelligent approach to multi-factor analysis of violent crime information system
NASA Astrophysics Data System (ADS)
Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing
2017-02-01
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between bi-factors has also been extensively studied including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
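The rough-set part of such a pipeline rests on the dependency degree of the decision attribute on a subset of condition attributes; a reduct is a minimal subset preserving that degree. A minimal sketch follows, with toy records invented for illustration (not real crime data).

```python
# Rough-set dependency degree gamma(B): the fraction of records whose
# B-indiscernibility class is consistent with a single decision value.
# A reduct is a minimal attribute subset preserving gamma of the full set.

def dependency(records, attrs, decision):
    classes = {}
    for rec in records:
        key = tuple(rec[a] for a in attrs)
        classes.setdefault(key, set()).add(rec[decision])
    consistent = sum(
        sum(1 for rec in records if tuple(rec[a] for a in attrs) == key)
        for key, decs in classes.items() if len(decs) == 1)
    return consistent / len(records)

data = [
    {"age": "young", "impulsivity": "high", "violent": "yes"},
    {"age": "young", "impulsivity": "low",  "violent": "no"},
    {"age": "old",   "impulsivity": "high", "violent": "yes"},
    {"age": "old",   "impulsivity": "low",  "violent": "no"},
]
full = dependency(data, ["age", "impulsivity"], "violent")
reduced = dependency(data, ["impulsivity"], "violent")
```

Here the single attribute preserves the full dependency, so it forms a reduct on its own; the fuzzy and particle-swarm elements in the article extend this discrete search to find multiple such reducts (multi-knowledge).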
Methodological flaws introduce strong bias into molecular analysis of microbial populations.
Krakat, N; Anjum, R; Demirel, B; Schröder, P
2017-02-01
In this study, we report how different cell disruption methods, PCR primers and in silico analyses can seriously bias results from microbial population studies, with consequences for the credibility and reproducibility of the findings. Our results emphasize the pitfalls of commonly used experimental methods that can seriously weaken the interpretation of results. Four different cell lysis methods, three commonly used primer pairs and various computer-based analyses were applied to investigate the microbial diversity of a fermentation sample composed of chicken dung. The fault-prone, but still frequently used, amplified rRNA gene restriction analysis was chosen to identify common weaknesses. In contrast to other studies, we focused on the complete analytical process, from cell disruption to in silico analysis, and identified potential error rates. This identified a wide disagreement of results between applied experimental approaches leading to very different community structures depending on the chosen approach. The interpretation of microbial diversity data remains a challenge. In order to accurately investigate the taxonomic diversity and structure of prokaryotic communities, we suggest a multi-level approach combining DNA-based and DNA-independent techniques. The identified weaknesses of commonly used methods to study microbial diversity can be overcome by a multi-level approach, which produces more reliable data about the fate and behaviour of microbial communities of engineered habitats such as biogas plants, so that the best performance can be ensured. © 2016 The Society for Applied Microbiology.
Simeonov, Plamen L
2017-12-01
The goal of this paper is to advance an extensible theory of living systems using an approach to biomathematics and biocomputation that suitably addresses self-organized, self-referential and anticipatory systems with multi-temporal multi-agents. Our first step is to provide foundations for modelling of emergent and evolving dynamic multi-level organic complexes and their sustentative processes in artificial and natural life systems. Main applications are in life sciences, medicine, ecology and astrobiology, as well as robotics, industrial automation, man-machine interface and creative design. Since 2011 over 100 scientists from a number of disciplines have been exploring a substantial set of theoretical frameworks for a comprehensive theory of life known as Integral Biomathics. That effort identified the need for a robust core model of organisms as dynamic wholes, using advanced and adequately computable mathematics. The work described here for that core combines the advantages of a situation and context aware multivalent computational logic for active self-organizing networks, Wandering Logic Intelligence (WLI), and a multi-scale dynamic category theory, Memory Evolutive Systems (MES), hence WLIMES. This is presented to the modeller via a formal augmented reality language as a first step towards practical modelling and simulation of multi-level living systems. Initial work focuses on the design and implementation of this visual language and calculus (VLC) and its graphical user interface. The results will be integrated within the current methodology and practices of theoretical biology and (personalized) medicine to deepen and to enhance the holistic understanding of life. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Taher, M.; Hamidah, I.; Suwarma, I. R.
2017-09-01
This paper outlines the results of an experimental study on the effect of a multi-representation approach to learning Archimedes' Law on the improvement of students' mental models. The multi-representation techniques implemented in the study were verbal, pictorial, mathematical and graphical representations. Students' mental models were classified into three levels, i.e. scientific, synthetic and initial, based on the students' level of understanding. The study employed a pre-experimental methodology using a one-group pretest-posttest design. The subjects were 32 eleventh-grade students at a public senior high school in Riau Province. The research instrument included a mental model test on the hydrostatic pressure concept, in the form of an essay test judged by experts. The findings showed a positive change in students' mental models, indicating that the multi-representation approach was effective in improving them.
Wels, Michael; Carneiro, Gustavo; Aplas, Alexander; Huber, Martin; Hornegger, Joachim; Comaniciu, Dorin
2008-01-01
In this paper we present a fully automated approach to the segmentation of pediatric brain tumors in multi-spectral 3-D magnetic resonance images. It is a top-down segmentation approach based on a Markov random field (MRF) model that combines probabilistic boosting trees (PBT) and lower-level segmentation via graph cuts. The PBT algorithm provides a strong discriminative observation model that classifies tumor appearance while a spatial prior takes into account the pair-wise homogeneity in terms of classification labels and multi-spectral voxel intensities. The discriminative model relies not only on observed local intensities but also on surrounding context for detecting candidate regions for pathology. A mathematically sound formulation for integrating the two approaches into a unified statistical framework is given. The proposed method is applied to the challenging task of detection and delineation of pediatric brain tumors. This segmentation task is characterized by a high non-uniformity of both the pathology and the surrounding non-pathologic brain tissue. A quantitative evaluation illustrates the robustness of the proposed method. Despite dealing with more complicated cases of pediatric brain tumors the results obtained are mostly better than those reported for current state-of-the-art approaches to 3-D MR brain tumor segmentation in adult patients. The entire processing of one multi-spectral data set does not require any user interaction, and takes less time than previously proposed methods.
[Multi-mathematical modelings for compatibility optimization of Jiangzhi granules].
Yang, Ming; Zhang, Li; Ge, Yingli; Lu, Yanliu; Ji, Guang
2011-12-01
To investigate a method of multi-activity-index evaluation and combination optimization for multi-component Chinese herbal formulas. Following a uniform experimental design, with efficacy experiments, multi-index evaluation, least absolute shrinkage and selection operator (LASSO) modeling, an evolutionary optimization algorithm and validation experiments, we optimized the combination of Jiangzhi granules based on the activity indexes of serum ALT, AST, TG, TC, HDL and LDL, the TG level of liver tissue, and the liver-to-body-weight ratio. The analytic hierarchy process (AHP) combined with criteria importance through intercriteria correlation (CRITIC) was more reasonable and objective for multi-activity-index evaluation, as it reflected both the ordering information of the activity indexes and the objective sample data. LASSO modeling could accurately capture the relationship between different combinations of Jiangzhi granules and the comprehensive activity indexes. The optimized combination of Jiangzhi granules showed better comprehensive activity index values than the original formula in the validation experiment. AHP combined with CRITIC can be used for multi-activity-index evaluation, and the LASSO algorithm is suitable for combination optimization of Chinese herbal formulas.
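The CRITIC weighting mentioned above derives objective weights from the data alone: each criterion is weighted by its standard deviation times its total conflict (one minus correlation) with the other criteria. A sketch with made-up normalized index values:

```python
# CRITIC objective weighting: w_j proportional to sigma_j * sum_k (1 - r_jk),
# where r_jk is the Pearson correlation between criteria j and k.

from statistics import mean, stdev

def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def critic_weights(columns):
    m = len(columns)
    raw = []
    for j in range(m):
        conflict = sum(1 - pearson(columns[j], columns[k])
                       for k in range(m) if k != j)
        raw.append(stdev(columns[j]) * conflict)
    total = sum(raw)
    return [r / total for r in raw]

# Each inner list is one criterion's normalized values over three
# hypothetical formula variants.
criteria = [[0.2, 0.9, 0.5], [0.5, 0.4, 0.9], [0.9, 0.1, 0.3]]
w = critic_weights(criteria)
```

A criterion that varies strongly and correlates weakly with the others carries the most independent information and so receives the largest weight; AHP then contributes the subjective ordering component.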
A critical review of nanotechnologies for composite aerospace structures
NASA Astrophysics Data System (ADS)
Kostopoulos, Vassilis; Masouras, Athanasios; Baltopoulos, Athanasios; Vavouliotis, Antonios; Sotiriadis, George; Pambaguian, Laurent
2017-03-01
Over the past decade, extensive efforts have been invested in understanding the nano-scale and revealing the capabilities that nanotechnology products offer to structural materials. Integrating nano-particles into fiber composites yields multi-scale reinforced composites and has opened a wide range of multi-functional materials to industry. In this direction, a variety of carbon-based nano-fillers has been proposed and employed, individually or combined in hybrid forms, to approach the desired performance. Nevertheless, a major issue, faced more seriously of late owing to industrial interest, is how to incorporate these nano-species into the final composite structure through existing manufacturing processes and infrastructure. This interest originates from several industrial application needs that call for new multi-functional materials combining enhanced mechanical, electrical and thermal properties. In this work, we review the most representative processes and related performances reported in the literature and the experience gained with nano-enabling technologies for fiber composite materials. The review focuses on the two main composite manufacturing technologies used by the aerospace industry: prepreg/autoclave and resin transfer technologies. It addresses several approaches to nano-enabling composites via these two routes and reports the latest performance results for nano-enabled fiber-reinforced composites extracted from the literature. Finally, this review identifies the gap between available nanotechnology integration routes and established industrial composite manufacturing techniques, and the challenges in raising the Technology Readiness Level to meet the demands of aerospace applications.
Cohen, Alexander D; Nencka, Andrew S; Lebel, R Marc; Wang, Yang
2017-01-01
A novel sequence has been introduced that combines multiband imaging with a multi-echo acquisition for simultaneous high spatial resolution pseudo-continuous arterial spin labeling (ASL) and blood-oxygenation-level dependent (BOLD) echo-planar imaging (MBME ASL/BOLD). Resting-state connectivity in healthy adult subjects was assessed using this sequence. Four echoes were acquired with a multiband acceleration of four, in order to increase spatial resolution, shorten repetition time, and reduce slice-timing effects on the ASL signal. In addition, by acquiring four echoes, advanced multi-echo independent component analysis (ME-ICA) denoising could be employed to increase the signal-to-noise ratio (SNR) and BOLD sensitivity. Seed-based and dual-regression approaches were utilized to analyze functional connectivity. Cerebral blood flow (CBF) and BOLD coupling was also evaluated by correlating the perfusion-weighted timeseries with the BOLD timeseries. These metrics were compared between single echo (E2), multi-echo combined (MEC), multi-echo combined and denoised (MECDN), and perfusion-weighted (PW) timeseries. Temporal SNR increased for the MECDN data compared to the MEC and E2 data. Connectivity also increased, in terms of correlation strength and network size, for the MECDN compared to the MEC and E2 datasets. CBF and BOLD coupling was increased in major resting-state networks, and that correlation was strongest for the MECDN datasets. These results indicate our novel MBME ASL/BOLD sequence, which collects simultaneous high-resolution ASL/BOLD data, could be a powerful tool for detecting functional connectivity and dynamic neurovascular coupling during the resting state. The collection of more than two echoes facilitates the use of ME-ICA denoising to greatly improve the quality of resting state functional connectivity MRI.
NASA Astrophysics Data System (ADS)
Khan, F. A.; Yousaf, A.; Reindl, L. M.
2018-04-01
This paper presents a multi-segment capacitive level-monitoring sensor based on a distributed E-field approach called Glocal. The approach has the advantage of analyzing the build-up problem via local E-fields as well as monitoring the fluid level via global E-fields. The multi-segment capacitive approach presented in this work addresses the main problem of unwanted parasitic capacitance generated by the copper (Cu) strips by applying an active shielding concept. Polyvinyl chloride (PVC) is used for isolation, and parafilm is used to create an artificial build-up on a capacitive level sensor (CLS).
Delmotte, Sylvestre; Lopez-Ridaura, Santiago; Barbier, Jean-Marc; Wery, Jacques
2013-11-15
Evaluating the impacts of developing alternative agricultural systems, such as organic or low-input cropping systems, in the context of an agricultural region requires specific tools and methodologies. These should allow a prospective (using scenarios), multi-scale (taking into account the field, farm and regional levels), integrated (notably multi-criteria) and participatory assessment, abbreviated PIAAS (Participatory Integrated Assessment of Agricultural Systems). In this paper, we compare the possible contributions to PIAAS of three modeling approaches: Bio-Economic Modeling (BEM), Agent-Based Modeling (ABM) and statistical Land-Use/Land-Cover Change (LUCC) models. After presenting each approach, we analyze their advantages and drawbacks, and identify their possible complementarities for PIAAS. Statistical LUCC modeling is a suitable approach for multi-scale analysis of past changes and can be used to start discussions about the future with stakeholders. The BEM and ABM approaches have complementary features for scenario assessment at different scales. While ABM has been widely used for participatory assessment, BEM has rarely been used satisfactorily in a participatory manner. On the basis of these results, we propose to combine the three approaches in a framework targeted at PIAAS. Copyright © 2013 Elsevier Ltd. All rights reserved.
Wang, Wei; Song, Wei-Guo; Liu, Shi-Xing; Zhang, Yong-Ming; Zheng, Hong-Yang; Tian, Wei
2011-04-01
An improved cloud-detection method combining K-means clustering and a multi-spectral threshold approach is described. On the basis of landmark spectrum analysis, MODIS data are first categorized into two major classes by the K-means method. The first class includes clouds, smoke and snow, and the second class includes vegetation, water and land. A multi-spectral threshold detection is then applied to the first class to eliminate interference such as smoke and snow. The method was tested with MODIS data acquired at different times and under different underlying-surface conditions. Visual assessment showed that the algorithm can effectively detect small areas of cloud pixels and exclude interference from the underlying surface, providing a good foundation for a subsequent fire-detection step.
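The two-stage idea above (coarse clustering, then per-band thresholds) can be sketched in a few lines. This is a minimal illustration, not the paper's MODIS algorithm: the band names, threshold values, and the tiny 1-D K-means are all hypothetical stand-ins.

```python
# Sketch of the two-stage scheme: K-means splits pixels into a bright class
# (cloud/smoke/snow) and a dark class (vegetation/water/land); per-band
# thresholds then remove non-cloud members of the bright class.
# Band names ("vis", "ir") and all threshold values are hypothetical.

def kmeans_1d(values, iters=50):
    """Tiny two-cluster 1-D K-means; returns per-point labels and centers."""
    centers = [min(values), max(values)]  # spread the initial centers
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [0 if abs(v - centers[0]) <= abs(v - centers[1]) else 1
                  for v in values]
        for c in (0, 1):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels, centers

def detect_cloud(pixels):
    """pixels: list of dicts with hypothetical 'vis' and 'ir' band values."""
    labels, centers = kmeans_1d([p["vis"] for p in pixels])
    bright = 0 if centers[0] >= centers[1] else 1  # cloud-like class
    # Second stage: clouds are bright in VIS *and* cold in IR;
    # smoke/snow fail one of the two threshold tests.
    return [l == bright and p["vis"] > 0.4 and p["ir"] < 265.0
            for p, l in zip(pixels, labels)]

pixels = [
    {"vis": 0.8, "ir": 240.0},  # cloud: bright and cold
    {"vis": 0.7, "ir": 285.0},  # snow/smoke-like: bright but warm
    {"vis": 0.1, "ir": 300.0},  # land: dark
]
print(detect_cloud(pixels))  # → [True, False, False]
```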
NASA Astrophysics Data System (ADS)
Chiarelli, Antonio Maria; Croce, Pierpaolo; Merla, Arcangelo; Zappasodi, Filippo
2018-06-01
Objective. Brain–computer interface (BCI) refers to procedures that link the central nervous system to a device. BCI was historically performed using electroencephalography (EEG). In recent years, encouraging results have been obtained by combining EEG with other neuroimaging technologies, such as functional near-infrared spectroscopy (fNIRS). A crucial step of BCI is brain-state classification from recorded signal features. Deep artificial neural networks (DNNs) have recently reached unprecedented performance on complex classification tasks. These performances were achieved through increased computational power, efficient learning algorithms, valuable activation functions, and restricted or back-fed neuron connections. Expecting significant overall BCI performance gains, we investigated the capabilities of combining EEG and fNIRS recordings with state-of-the-art deep learning procedures. Approach. We performed a guided left- and right-hand motor imagery task on 15 subjects with a fixed classification response time of 1 s and an overall experiment length of 10 min. The left-versus-right classification accuracy of a DNN in the multi-modal recording modality was estimated and compared to standalone EEG and fNIRS and to other classifiers. Main results. At the group level, we obtained a significant increase in performance when considering multi-modal recordings and the DNN classifier, with a synergistic effect. Significance. BCI performance can be significantly improved by employing multi-modal recordings, which provide electrical and hemodynamic brain activity information, in combination with advanced non-linear deep learning classification procedures.
Uncovering Hidden Layers of Cell Cycle Regulation through Integrative Multi-omic Analysis
Aviner, Ranen; Shenoy, Anjana; Elroy-Stein, Orna; Geiger, Tamar
2015-01-01
Studying the complex relationship between transcription, translation and protein degradation is essential to our understanding of biological processes in health and disease. The limited correlations observed between mRNA and protein abundance suggest pervasive regulation of post-transcriptional steps and support the importance of profiling mRNA levels in parallel to protein synthesis and degradation rates. In this work, we applied an integrative multi-omic approach to study gene expression along the mammalian cell cycle through side-by-side analysis of mRNA, translation and protein levels. Our analysis sheds new light on the significant contribution of both protein synthesis and degradation to the variance in protein expression. Furthermore, we find that translation regulation plays an important role at S-phase, while progression through mitosis is predominantly controlled by changes in either mRNA levels or protein stability. Specific molecular functions are found to be co-regulated and share similar patterns of mRNA, translation and protein expression along the cell cycle. Notably, these include genes and entire pathways not previously implicated in cell cycle progression, demonstrating the potential of this approach to identify novel regulatory mechanisms beyond those revealed by traditional expression profiling. Through this three-level analysis, we characterize different mechanisms of gene expression, discover new cycling gene products and highlight the importance and utility of combining datasets generated using different techniques that monitor distinct steps of gene expression. PMID:26439921
Revisiting the Robustness of PET-Based Textural Features in the Context of Multi-Centric Trials
Bailly, Clément; Bodet-Milin, Caroline; Couespel, Solène; Necib, Hatem; Kraeber-Bodéré, Françoise; Ansquer, Catherine; Carlier, Thomas
2016-01-01
Purpose This study aimed to investigate the variability of textural features (TF) as a function of acquisition and reconstruction parameters within the context of multi-centric trials. Methods The robustness of 15 selected TFs was studied as a function of the number of iterations, the post-filtering level, input data noise, the reconstruction algorithm and the matrix size. A combination of several reconstruction and acquisition settings was devised to mimic multi-centric conditions. We retrospectively studied data from 26 patients enrolled in a diagnostic study that aimed to evaluate the performance of PET/CT 68Ga-DOTANOC in gastro-entero-pancreatic neuroendocrine tumors. Forty-one tumors were extracted and served as the database. The coefficient of variation (COV) or the absolute deviation (for the noise study) was derived and compared statistically with SUVmax and SUVmean results. Results The majority of investigated TFs can be used in a multi-centric context when each parameter is considered individually. The impact of voxel size and of noise in the input data was predominant, as only 4 TFs presented high/intermediate robustness against SUV-based metrics (Entropy, Homogeneity, RP and ZP). When several reconstruction settings were combined to mimic multi-centric conditions, most of the investigated TFs were robust enough against SUVmax, except Correlation, Contrast, LGRE, LGZE and LZLGE. Conclusion Considering previously published results on reproducibility and on sensitivity to the delineation approach, together with our findings, Homogeneity, Entropy, Dissimilarity, HGRE, HGZE and ZP can be considered suitable for use in multi-centric trials. PMID:27467882
Kaufman, Michelle R; Cornish, Flora; Zimmerman, Rick S; Johnson, Blair T
2014-08-15
Despite increasing recent emphasis on the social and structural determinants of HIV-related behavior, empirical research and interventions lag behind, partly because of the complexity of social-structural approaches. This article provides a comprehensive and practical review of the diverse literature on multi-level approaches to HIV-related behavior change in the interest of contributing to the ongoing shift to more holistic theory, research, and practice. It has the following specific aims: (1) to provide a comprehensive list of relevant variables/factors related to behavior change at all points on the individual-structural spectrum, (2) to map out and compare the characteristics of important recent multi-level models, (3) to reflect on the challenges of operating with such complex theoretical tools, and (4) to identify next steps and make actionable recommendations. Using a multi-level approach implies incorporating increasing numbers of variables and increasingly context-specific mechanisms, overall producing greater intricacies. We conclude with recommendations on how best to respond to this complexity, which include: using formative research and interdisciplinary collaboration to select the most appropriate levels and variables in a given context; measuring social and institutional variables at the appropriate level to ensure meaningful assessments of multiple levels are made; and conceptualizing intervention and research with reference to theoretical models and mechanisms to facilitate transferability, sustainability, and scalability.
Integrated microsystems packaging approach with LCP
NASA Astrophysics Data System (ADS)
Jaynes, Paul; Shacklette, Lawrence W.
2006-05-01
Within the government communication market there is an increasing push to further miniaturize systems with the use of chip-scale packages, flip-chip bonding, and other advances over traditional packaging techniques. Harris' approach to miniaturization includes these traditional packaging advances, but goes beyond this level of miniaturization by combining the functional and structural elements of a system, thus creating a Multi-Functional Structural Circuit (MFSC). An emerging high-frequency, near hermetic, thermoplastic electronic substrate material, Liquid Crystal Polymer (LCP), is the material that will enable the combination of the electronic circuit and the physical structure of the system. The first embodiment of this vision for Harris is the development of a battlefield acoustic sensor module. This paper will introduce LCP and its advantages for MFSC, present an example of the work that Harris has performed, and speak to LCP MFSCs' potential benefits to miniature communications modules and sensor platforms.
A Multi-Level Decision Fusion Strategy for Condition Based Maintenance of Composite Structures
Sharif Khodaei, Zahra; Aliabadi, M.H.
2016-01-01
In this work, a multi-level decision fusion strategy is proposed which weighs the Value of Information (VoI) against the intended functions of a Structural Health Monitoring (SHM) system. This paper presents a multi-level approach for three different maintenance strategies in which the performance of the SHM systems is evaluated against its intended functions. Level 1 diagnosis results in damage existence with minimum sensors covering a large area by finding the maximum energy difference for the guided waves propagating in pristine structure and the post-impact state; Level 2 diagnosis provides damage detection and approximate localization using an approach based on Electro-Mechanical Impedance (EMI) measures, while Level 3 characterizes damage (exact location and size) in addition to its detection by utilising a Weighted Energy Arrival Method (WEAM). The proposed multi-level strategy is verified and validated experimentally by detection of Barely Visible Impact Damage (BVID) on a curved composite fuselage panel. PMID:28773910
Optimal maintenance policy incorporating system level and unit level for mechanical systems
NASA Astrophysics Data System (ADS)
Duan, Chaoqun; Deng, Chao; Wang, Bingran
2018-04-01
This study develops a multi-level maintenance policy combining the system level and the unit level under soft and hard failure modes. The system experiences system-level preventive maintenance (SLPM) when the conditional reliability of the entire system exceeds the SLPM threshold, and each unit is additionally subject to a two-level maintenance scheme: one action is initiated when a unit exceeds its own preventive maintenance (PM) threshold, and the other is performed opportunistically whenever any other unit is undergoing maintenance. The units experience both periodic inspections and aperiodic inspections triggered by failures of hard-type units. To model practical situations, two types of economic dependence are taken into account: set-up cost dependence and maintenance-expertise dependence, since the same technology and tools/equipment can be utilised. The optimisation problem is formulated and solved in a semi-Markov decision process framework. The objective is to find the optimal system-level threshold and unit-level thresholds by minimising the long-run expected average cost per unit time. A formula for the mean residual life is derived for the proposed multi-level maintenance policy. The method is illustrated by a real case study of the feed subsystem of a boring machine, and a comparison with other policies demonstrates the effectiveness of the approach.
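The multi-level trigger logic described above can be illustrated with a toy decision rule. This is only a sketch of the general pattern (unit PM thresholds, opportunistic piggy-backing, a system-level override), not the paper's semi-Markov policy: the health index, threshold values, and unit names are all made up.

```python
# Hedged sketch of a multi-level maintenance trigger: unit-level PM when a
# unit's degradation exceeds its own threshold, opportunistic PM of other
# units once maintenance is already scheduled (shared set-up cost), and
# system-level PM when a crude system health index crosses the SLPM level.
# All thresholds and the health index are illustrative, not from the paper.

def maintenance_actions(degradation, unit_pm, opportunistic_pm, slpm):
    """degradation: dict unit -> degradation level in [0, 1]."""
    actions = {u: "none" for u in degradation}
    system_health = 1.0 - max(degradation.values())  # crude system index
    if system_health < slpm:
        return {u: "system-level PM" for u in degradation}
    triggered = [u for u, d in degradation.items() if d >= unit_pm]
    if triggered:
        for u, d in degradation.items():
            if u in triggered:
                actions[u] = "PM"
            elif d >= opportunistic_pm:  # piggyback on the shared set-up
                actions[u] = "opportunistic PM"
    return actions

state = {"spindle": 0.75, "guideway": 0.55, "motor": 0.10}
print(maintenance_actions(state, unit_pm=0.7, opportunistic_pm=0.5, slpm=0.2))
```

The real policy optimizes these thresholds over the long-run average cost; here they are fixed inputs.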
Joint multi-object registration and segmentation of left and right cardiac ventricles in 4D cine MRI
NASA Astrophysics Data System (ADS)
Ehrhardt, Jan; Kepp, Timo; Schmidt-Richberg, Alexander; Handels, Heinz
2014-03-01
The diagnosis of cardiac function based on cine MRI requires the segmentation of cardiac structures in the images, but the problem of automatic cardiac segmentation is still open, due to the imaging characteristics of cardiac MR images and the anatomical variability of the heart. In this paper, we present a variational framework for joint segmentation and registration of multiple structures of the heart. To enable the simultaneous segmentation and registration of multiple objects, a shape prior term is introduced into a region competition approach for multi-object level set segmentation. The proposed algorithm is applied for simultaneous segmentation of the myocardium as well as the left and right ventricular blood pool in short axis cine MRI images. Two experiments are performed: first, intra-patient 4D segmentation with a given initial segmentation for one time-point in a 4D sequence, and second, a multi-atlas segmentation strategy is applied to unseen patient data. Evaluation of segmentation accuracy is done by overlap coefficients and surface distances. An evaluation based on clinical 4D cine MRI images of 25 patients shows the benefit of the combined approach compared to sole registration and sole segmentation.
All-fiber 7x1 signal combiner for incoherent laser beam combining
NASA Astrophysics Data System (ADS)
Noordegraaf, D.; Maack, M. D.; Skovgaard, P. M. W.; Johansen, J.; Becker, F.; Belke, S.; Blomqvist, M.; Laegsgaard, J.
2011-02-01
We demonstrate an all-fiber 7x1 signal combiner for incoherent laser beam combining, a potential key component for reaching several kW of stable laser output power. The combiner couples the output of 7 single-mode (SM) fiber lasers into a single multi-mode (MM) fiber. The input signal fibers have a core diameter of 17 μm, and the output MM fiber has a core diameter of 100 μm. In a tapered section, light gradually leaks out of the SM fibers and is captured by a surrounding fluorine-doped cladding. The combiner was tested up to 2.5 kW of combined output power, and only a minor increase in device temperature was observed. At an intermediate power level of 600 W, a beam parameter product (BPP) of 2.22 mm x mrad was measured, corresponding to an M2 value of 6.5. These values approach the theoretical limit dictated by brightness conservation.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-31
... that is a unique combination of: (1) multi-gradient Single Point Imaging involving global phase...-encoding gradients. The combination approach of single point imaging with the spin-echo signal detection...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Djara, V.; Cherkaoui, K.; Negara, M. A.
2015-11-28
An alternative multi-frequency inversion-charge pumping (MFICP) technique was developed to directly separate the inversion charge density (N_inv) from the trapped charge density in high-k/InGaAs metal-oxide-semiconductor field-effect transistors (MOSFETs). The approach relies on fitting the frequency response of border traps, obtained from inversion-charge pumping measurements performed over a wide range of frequencies at room temperature on a single MOSFET, using a modified charge-trapping model. The fitted model yielded the capture time constant and the density of border traps located at energy levels aligned with the InGaAs conduction band. Moreover, the combination of MFICP and pulsed I_d-V_g measurements enabled an accurate extraction and analysis of effective mobility versus N_inv. The data obtained using the MFICP approach are consistent with the most recent reports on high-k/InGaAs.
Horizontal and vertical combination of multi-tenancy patterns in service-oriented applications
NASA Astrophysics Data System (ADS)
Mietzner, Ralph; Leymann, Frank; Unger, Tobias
2011-02-01
Software as a service (SaaS) providers exploit economies of scale by offering the same instance of an application to multiple customers typically in a single-instance multi-tenant architecture model. Therefore the applications must be scalable, multi-tenant aware and configurable. In this article, we show how the services in a service-oriented SaaS application can be deployed using different multi-tenancy patterns. We describe how services in different multi-tenancy patterns can be composed on the application level. In addition to that, we also describe how these multi-tenancy patterns can be applied to middleware and hardware components. We then show with some real world examples how the different multi-tenancy patterns can be combined.
Wang, Bao-Zhen; Chen, Zhi
2013-01-01
This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source, multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of the data required for air quality modeling, including emission sources, air quality monitoring, meteorological data, and spatial location information, is brought into an integrated modeling environment. This allows spatial variations in source distribution and meteorological conditions to be analyzed quantitatively and in greater detail. The developed modeling approach has been applied to predict the spatial concentration distributions of four air pollutants (CO, NO2, SO2 and PM2.5) for the State of California. The modeling results are compared with the monitoring data, and the good agreement obtained demonstrates that the developed approach can deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
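The point-source component mentioned above rests on the standard steady-state Gaussian plume formula. The sketch below sums that formula over several sources at one receptor; the power-law dispersion coefficients are illustrative placeholders (real applications derive sigma_y and sigma_z from atmospheric stability classes), and the source data are invented.

```python
import math

# Hedged sketch of multi-source Gaussian plume superposition. The dispersion
# fits (0.08*x^0.9, 0.06*x^0.9) are hypothetical, not from the paper.

def plume(q, u, x, y, z, h):
    """Concentration from one point source (standard Gaussian plume with
    ground reflection). q: emission rate (g/s); u: wind speed (m/s);
    (x, y, z): receptor downwind/crosswind/vertical offsets (m);
    h: effective stack height (m)."""
    if x <= 0:
        return 0.0  # receptor is upwind of the source
    sig_y, sig_z = 0.08 * x ** 0.9, 0.06 * x ** 0.9  # hypothetical fits
    crosswind = math.exp(-y ** 2 / (2 * sig_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sig_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sig_z ** 2)))  # reflection term
    return q / (2 * math.pi * u * sig_y * sig_z) * crosswind * vertical

def total_concentration(sources, receptor, u):
    """Superpose all point sources; each source is (x, y, stack height, rate)."""
    rx, ry, rz = receptor
    return sum(plume(q, u, rx - sx, ry - sy, rz, h) for sx, sy, h, q in sources)

# Two hypothetical stacks; receptor 1 km downwind at ground level.
sources = [(0.0, 0.0, 50.0, 100.0), (-500.0, 100.0, 30.0, 50.0)]
print(f"{total_concentration(sources, (1000.0, 0.0, 0.0), u=4.0):.3e} g/m^3")
```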
Wu, Chi; Xie, Zuowei; Zhang, Guangzhao; Zi, Guofu; Tu, Yingfeng; Yang, Yali; Cai, Ping; Nie, Ting
2002-12-07
A combination of polymer physics and synthetic chemistry has enabled us to develop self-assembly assisted polymerization (SAAP), leading to the preparation of long multi-block copolymers with an ordered chain sequence and controllable block lengths.
Hybrid Compounds as Anti-infective Agents.
Sbaraglini, María Laura; Talevi, Alan
2017-01-01
Hybrid drugs are multi-target chimeric chemicals combining two or more drugs or pharmacophores covalently linked in a single molecule. In the field of anti-infective agents, they have been proposed as a possible solution to drug-resistance issues, presumably having a broader spectrum of activity and a lower probability of eliciting high-level resistance linked to a single gene product. Although less frequently explored, they could also be useful in the treatment of frequently occurring co-infections. Here, we review recent advances in the field of hybrid antimicrobials. Furthermore, we discuss some cutting-edge approaches to the development of designed multi-target agents in the era of omics and big data, namely the analysis of gene signatures and multitask QSAR models. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
The Integrated Multi-Level Bilingual Teaching of "Social Research Methods"
ERIC Educational Resources Information Center
Zhu, Yanhan; Ye, Jian
2012-01-01
"Social Research Methods," as a methodology course, combines theories and practices closely. Based on the synergy theory, this paper tries to establish an integrated multi-level bilingual teaching mode. Starting from the transformation of teaching concepts, we should integrate interactions, experiences, and researches together and focus…
An Integer Programming Model for Multi-Echelon Supply Chain Decision Problem Considering Inventories
NASA Astrophysics Data System (ADS)
Harahap, Amin; Mawengkang, Herman; Siswadi; Effendi, Syahril
2018-01-01
In this paper we address a problem of significance to industry, namely the optimal decision-making for a multi-echelon supply chain and the associated inventory systems. Using the guaranteed-service approach to model the multi-echelon inventory system, we develop a mixed integer programming model to simultaneously optimize the transportation, inventory and network structure of a multi-echelon supply chain. To solve the model we develop a direct search approach using a strategy of releasing nonbasic variables from their bounds, combined with the "active constraint" method. This strategy is used to force the appropriate non-integer basic variables to move to their neighbouring integer points.
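The final step above, moving fractional variables to nearby integer points, can be illustrated with a toy version: enumerate the floor/ceil neighbourhood of a relaxed solution and keep the cheapest feasible point. This is a generic sketch of neighbourhood rounding, not the paper's active-constraint direct search; the cost and constraint functions are hypothetical.

```python
import math
from itertools import product

# Toy neighbourhood integer search: given a fractional relaxation solution,
# test every floor/ceil combination and return the cheapest feasible one.
# Cost, constraint, and the two-variable example are all illustrative.

def neighbourhood_search(frac_solution, cost, feasible):
    candidates = product(*[(math.floor(v), math.ceil(v)) for v in frac_solution])
    feas = [c for c in candidates if feasible(c)]
    return min(feas, key=cost) if feas else None

# Toy 2-echelon example: x = (shipments, safety stock), demand of 7 units.
cost = lambda x: 3 * x[0] + 2 * x[1]       # shipping is pricier than holding
feasible = lambda x: x[0] + x[1] >= 7      # total supply must cover demand
print(neighbourhood_search((4.4, 2.6), cost, feasible))  # → (4, 3)
```

A real solver would interleave this with the LP basis (the "active constraint" method); the sketch shows only the integer-neighbourhood idea.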
Active Learning by Querying Informative and Representative Examples.
Huang, Sheng-Jun; Jin, Rong; Zhou, Zhi-Hua
2014-10-01
Active learning reduces labeling cost by iteratively selecting the most valuable data to query their labels. It has attracted a lot of interest given the abundance of unlabeled data and the high cost of labeling. Most active learning approaches select either informative or representative unlabeled instances to query, which can significantly limit their performance. Although several active learning algorithms have been proposed to combine the two query selection criteria, they are usually ad hoc in finding unlabeled instances that are both informative and representative. We address this limitation by developing a principled approach, termed QUIRE, based on the min-max view of active learning. The proposed approach provides a systematic way of measuring and combining the informativeness and representativeness of an unlabeled instance. Further, by incorporating the correlation among labels, we extend the QUIRE approach to multi-label learning by actively querying instance-label pairs. Extensive experimental results show that the proposed QUIRE approach outperforms several state-of-the-art active learning approaches in both single-label and multi-label learning.
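The core idea above, scoring unlabeled points by both informativeness and representativeness, can be sketched with a much-simplified stand-in. QUIRE itself solves a min-max problem; the sketch below merely combines boundary proximity with a density proxy additively, and all inputs are invented.

```python
# Simplified combined query selection (NOT the QUIRE formulation):
# informativeness = closeness to the decision boundary (margin near 0),
# representativeness = average similarity to the other unlabeled points.

def select_query(margins, similarities, alpha=0.5):
    """margins[i]: classifier margin for unlabeled point i (near 0 = uncertain).
    similarities[i][j]: similarity between unlabeled points i and j."""
    n = len(margins)
    best, best_val = None, float("-inf")
    for i in range(n):
        informativeness = -abs(margins[i])               # boundary proximity
        others = [similarities[i][j] for j in range(n) if j != i]
        representativeness = sum(others) / len(others)   # density proxy
        value = alpha * informativeness + (1 - alpha) * representativeness
        if value > best_val:
            best, best_val = i, value
    return best

margins = [0.9, 0.05, 0.1]          # point 1 is the most uncertain
sims = [[1.0, 0.2, 0.2],
        [0.2, 1.0, 0.9],
        [0.2, 0.9, 1.0]]            # points 1 and 2 sit in a dense region
print(select_query(margins, sims))  # → 1: both uncertain and representative
```

A purely informative criterion would also pick point 1 here; the combined score matters when the most uncertain point is an outlier with low similarity to everything else.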
Mathematical model comparing of the multi-level economics systems
NASA Astrophysics Data System (ADS)
Brykalov, S. M.; Kryanev, A. V.
2017-12-01
A mathematical model (scheme) for multi-level comparison of economic systems characterized by systems of indices is developed. The model can incorporate expert assessments and forecasts of the indicators of the economic systems under consideration, and can take into account uncertainty in the estimated parameter values or expert estimates. It uses a multi-criteria approach based on Pareto solutions.
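The Pareto-based comparison mentioned above keeps every system whose index vector is not dominated by any other system's. A minimal sketch, with made-up index names and values:

```python
# Pareto set over index vectors, all indices treated as higher-is-better.

def dominates(a, b):
    """a dominates b if a >= b on every index and > b on at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_set(systems):
    """systems: dict name -> tuple of index values; returns non-dominated names."""
    return {name for name, v in systems.items()
            if not any(dominates(w, v)
                       for other, w in systems.items() if other != name)}

systems = {  # hypothetical (growth index, stability index) pairs
    "A": (3.0, 1.0),
    "B": (2.0, 2.0),
    "C": (1.0, 1.5),   # dominated by B on both indices
}
print(sorted(pareto_set(systems)))  # → ['A', 'B']
```

A and B are incomparable (each beats the other on one index), which is exactly the situation a multi-criteria comparison must preserve rather than collapse into a single score.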
Publics and biobanks: Pan-European diversity and the challenge of responsible innovation.
Gaskell, George; Gottweis, Herbert; Starkbaum, Johannes; Gerber, Monica M; Broerse, Jacqueline; Gottweis, Ursula; Hobbs, Abbi; Helén, Ilpo; Paschou, Maria; Snell, Karoliina; Soulier, Alexandra
2013-01-01
This article examines public perceptions of biobanks in Europe using a multi-method approach combining quantitative and qualitative data. It is shown that public support for biobanks in Europe is variable and dependent on a range of interconnected factors: people's engagement with biobanks; concerns about privacy and data security, and trust in the socio-political system, key actors and institutions involved in biobanks. We argue that the biobank community needs to acknowledge the impact of these factors if they are to successfully develop and integrate biobanks at a pan-European level.
SVM-Fold: a tool for discriminative multi-class protein fold and superfamily recognition
Melvin, Iain; Ie, Eugene; Kuang, Rui; Weston, Jason; Stafford, William Noble; Leslie, Christina
2007-01-01
Background Predicting a protein's structural class from its amino acid sequence is a fundamental problem in computational biology. Much recent work has focused on developing new representations for protein sequences, called string kernels, for use with support vector machine (SVM) classifiers. However, while some of these approaches exhibit state-of-the-art performance at the binary protein classification problem, i.e. discriminating between a particular protein class and all other classes, few of these studies have addressed the real problem of multi-class superfamily or fold recognition. Moreover, there are only limited software tools and systems for SVM-based protein classification available to the bioinformatics community. Results We present a new multi-class SVM-based protein fold and superfamily recognition system and web server called SVM-Fold, which can be found at . Our system uses an efficient implementation of a state-of-the-art string kernel for sequence profiles, called the profile kernel, where the underlying feature representation is a histogram of inexact matching k-mer frequencies. We also employ a novel machine learning approach to solve the difficult multi-class problem of classifying a sequence of amino acids into one of many known protein structural classes. Binary one-vs-the-rest SVM classifiers that are trained to recognize individual structural classes yield prediction scores that are not comparable, so that standard "one-vs-all" classification fails to perform well. Moreover, SVMs for classes at different levels of the protein structural hierarchy may make useful predictions, but one-vs-all does not try to combine these multiple predictions. To deal with these problems, our method learns relative weights between one-vs-the-rest classifiers and encodes information about the protein structural hierarchy for multi-class prediction. 
In large-scale benchmark results based on the SCOP database, our code weighting approach significantly improves on the standard one-vs-all method for both the superfamily and fold prediction in the remote homology setting and on the fold recognition problem. Moreover, our code weight learning algorithm strongly outperforms nearest-neighbor methods based on PSI-BLAST in terms of prediction accuracy on every structure classification problem we consider. Conclusion By combining state-of-the-art SVM kernel methods with a novel multi-class algorithm, the SVM-Fold system delivers efficient and accurate protein fold and superfamily recognition. PMID:17570145
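The fix described above, learning relative weights so that one-vs-rest SVM scores become comparable, reduces at prediction time to a weighted argmax over the binary classifiers' outputs. The sketch below shows only that final step; SVM-Fold learns the weights (and the hierarchy codes) from data, whereas here they are given, and the class names are invented.

```python
# Weighted one-vs-rest prediction: raw binary scores are rescaled by learned
# per-classifier weights before taking the argmax. Weights are assumed given.

def predict_class(raw_scores, weights):
    """raw_scores[c]: output of the one-vs-rest classifier for class c.
    weights[c]: learned scale making scores comparable across classifiers."""
    scored = {c: weights[c] * s for c, s in raw_scores.items()}
    return max(scored, key=scored.get)

raw = {"fold_a": 2.0, "fold_b": 1.5, "fold_c": -0.5}
w = {"fold_a": 0.4, "fold_b": 1.2, "fold_c": 1.0}  # fold_a's scores run hot
print(predict_class(raw, w))  # → fold_b
```

With uniform weights the naive one-vs-all rule would have picked fold_a on its inflated raw score; the learned weights correct for that, which is the failure mode the abstract describes.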
NASA Astrophysics Data System (ADS)
Mesbah, Mostefa; Balakrishnan, Malarvili; Colditz, Paul B.; Boashash, Boualem
2012-12-01
This article proposes a new method for newborn seizure detection that uses information extracted from both multi-channel electroencephalogram (EEG) and a single channel electrocardiogram (ECG). The aim of the study is to assess whether additional information extracted from ECG can improve the performance of seizure detectors based solely on EEG. Two different approaches were used to combine this extracted information. The first approach, known as feature fusion, involves combining features extracted from EEG and heart rate variability (HRV) into a single feature vector prior to feeding it to a classifier. The second approach, called classifier or decision fusion, is achieved by combining the independent decisions of the EEG and the HRV-based classifiers. Tested on recordings obtained from eight newborns with identified EEG seizures, the proposed neonatal seizure detection algorithms achieved 95.20% sensitivity and 88.60% specificity for the feature fusion case and 95.20% sensitivity and 94.30% specificity for the classifier fusion case. These results are considerably better than those involving classifiers using EEG only (80.90%, 86.50%) or HRV only (85.70%, 84.60%).
Shamwell, E Jared; Nothwang, William D; Perlis, Donald
2018-05-04
Aimed at improving size, weight, and power (SWaP)-constrained robotic vision-aided state estimation, we describe our unsupervised, deep convolutional-deconvolutional sensor fusion network, Multi-Hypothesis DeepEfference (MHDE). MHDE learns to intelligently combine noisy heterogeneous sensor data to predict several probable hypotheses for the dense, pixel-level correspondence between a source image and an unseen target image. We show how our multi-hypothesis formulation provides increased robustness against dynamic, heteroscedastic sensor and motion noise by computing hypothesis image mappings and predictions at 76-357 Hz depending on the number of hypotheses being generated. MHDE fuses noisy, heterogeneous sensory inputs using two parallel, inter-connected architectural pathways and n (1-20 in this work) multi-hypothesis generating sub-pathways to produce n global correspondence estimates between a source and a target image. We evaluated MHDE on the KITTI Odometry dataset and benchmarked it against the vision-only DeepMatching and Deformable Spatial Pyramids algorithms and were able to demonstrate a significant runtime decrease and a performance increase compared to the next-best performing method.
Kampmann, Peter; Kirchner, Frank
2014-01-01
With the increasing complexity of robotic missions and the development towards long-term autonomous systems, the need for multi-modal sensing of the environment increases. Until now, the use of tactile sensor systems has been mostly based on sensing one modality of forces in the robotic end-effector. We motivate the use of a multi-modal tactile sensing system that combines static and dynamic force sensor arrays with an absolute force measurement system. This publication focuses on the development of a compact sensor interface for a fiber-optic sensor array, as optic measurement principles tend to have a bulky interface. Mechanical, electrical and software approaches are combined to realize an integrated structure that provides decentralized data pre-processing of the tactile measurements. Local behaviors are implemented using this setup to show the effectiveness of this approach. PMID:24743158
A multi-scale spatial approach to address environmental effects of small hydropower development.
McManamay, Ryan A; Samu, Nicole; Kao, Shih-Chieh; Bevelhimer, Mark S; Hetrick, Shelaine C
2015-01-01
Hydropower development continues to grow worldwide in developed and developing countries. While the ecological and physical responses to dam construction have been well documented, translating this information into planning for hydropower development is extremely difficult. Very few studies have conducted environmental assessments to guide site-specific or widespread hydropower development. Herein, we propose a spatial approach for estimating environmental effects of hydropower development at multiple scales, as opposed to individual site-by-site assessments (e.g., environmental impact assessment). Because the complex, process-driven effects of future hydropower development may be uncertain or, at best, limited by available information, we invested considerable effort in describing novel approaches to represent environmental concerns using spatial data and in developing the spatial footprint of hydropower infrastructure. We then use two case studies in the US, one at the scale of the conterminous US and another within two adjoining river basins, to examine how environmental concerns can be identified and related to areas of varying energy capacity. We use combinations of reserve-design planning and multi-metric ranking to visualize tradeoffs among environmental concerns and potential energy capacity. Spatial frameworks, like the one presented, are not meant to replace more in-depth environmental assessments, but to identify information gaps and measure the sustainability of multi-development scenarios so as to inform policy decisions at the basin or national level. Most importantly, the approach should foster discussions among environmental scientists and stakeholders regarding solutions to optimize energy development and environmental sustainability.
2013-01-01
Background Despite progress in the development of combined antiretroviral therapies (cART), HIV infection remains a significant challenge for human health. Current problems of cART include multi-drug-resistant virus variants, long-term toxicity and enormous treatment costs. Therefore, the identification of novel effective drugs is urgently needed. Methods We developed a straightforward screening approach for simultaneously evaluating the sensitivity of multiple HIV gag-pol mutants to antiviral drugs in one assay. Our technique is based on multi-colour lentiviral self-inactivating (SIN) LeGO vector technology. Results We demonstrated the successful use of this approach for screening compounds against up to four HIV gag-pol variants (wild-type and three mutants) simultaneously. Importantly, the technique was adapted to Biosafety Level 1 conditions by utilising ecotropic pseudotypes. This allowed upscaling to a large-scale screening protocol exploited by pharmaceutical companies in a successful proof-of-concept experiment. Conclusions The technology developed here facilitates fast screening for anti-HIV activity of individual agents from large compound libraries. Although drugs targeting gag-pol variants were used here, our approach permits screening compounds that target several different, key cellular and viral functions of the HIV life-cycle. The modular principle of the method also allows the easy exchange of various mutations in HIV sequences. In conclusion, the methodology presented here provides a valuable new approach for the identification of novel anti-HIV drugs. PMID:23286882
Multi-Hazard Interactions in Guatemala
NASA Astrophysics Data System (ADS)
Gill, Joel; Malamud, Bruce D.
2017-04-01
In this paper, we combine physical and social science approaches to develop a multi-scale regional framework for natural hazard interactions in Guatemala. The identification and characterisation of natural hazard interactions is an important input for comprehensive multi-hazard approaches to disaster risk reduction at a regional level. We use five transdisciplinary evidence sources to organise and populate our framework: (i) internationally-accessible literature; (ii) civil protection bulletins; (iii) field observations; (iv) stakeholder interviews (hazard and civil protection professionals); and (v) stakeholder workshop results. These five evidence sources are synthesised to determine an appropriate natural hazard classification scheme for Guatemala (6 hazard groups, 19 hazard types, and 37 hazard sub-types). For a national spatial extent (Guatemala), we construct and populate a "21×21" hazard interaction matrix, identifying 49 possible interactions between 21 hazard types. For a sub-national spatial extent (Southern Highlands, Guatemala), we construct and populate a "33×33" hazard interaction matrix, identifying 112 possible interactions between 33 hazard sub-types. Evidence sources are also used to constrain anthropogenic processes that could trigger natural hazards in Guatemala, and characterise possible networks of natural hazard interactions (cascades). The outcomes of this approach are among the most comprehensive interaction frameworks for national and sub-national spatial scales in the published literature. These can be used to support disaster risk reduction and civil protection professionals in better understanding natural hazards and potential disasters at a regional scale.
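The hazard interaction matrix described above has a straightforward computational form: rows are triggering hazards, columns are triggered hazards, and a nonzero entry marks an identified interaction. The sketch below uses a small invented hazard set, not the paper's 21-type Guatemala classification; the cascade enumeration illustrates how networks of interactions can be read off such a matrix.

```python
hazards = ["earthquake", "landslide", "flood", "volcanic eruption"]

# interactions[i][j] = 1 if hazard i can trigger hazard j (illustrative).
interactions = [
    [0, 1, 0, 0],  # earthquake can trigger landslides
    [0, 0, 1, 0],  # landslide can dam rivers and trigger floods
    [0, 1, 0, 0],  # flood can trigger landslides
    [1, 1, 0, 0],  # eruption can trigger earthquakes and landslides
]

def count_interactions(matrix):
    # Total number of identified triggering relationships.
    return sum(sum(row) for row in matrix)

def cascades(matrix, start, depth=2):
    # Enumerate simple interaction chains (cascades) up to `depth` links,
    # avoiding revisits of the same hazard within one chain.
    chains = []
    def walk(node, path):
        if len(path) > depth:
            return
        chains.append(path)
        for j, linked in enumerate(matrix[node]):
            if linked and j not in path:
                walk(j, path + [j])
    walk(start, [start])
    return [c for c in chains if len(c) > 1]

print(count_interactions(interactions))
print(cascades(interactions, hazards.index("earthquake")))
```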
Constraint Based Modeling Going Multicellular.
Martins Conde, Patricia do Rosario; Sauter, Thomas; Pfau, Thomas
2016-01-01
Constraint based modeling has seen applications in many microorganisms. For example, there are now established methods to determine potential genetic modifications and external interventions to increase the efficiency of microbial strains in chemical production pipelines. In addition, multiple models of multicellular organisms have been created including plants and humans. While initially the focus here was on modeling individual cell types of the multicellular organism, this focus recently started to switch. Models of microbial communities, as well as multi-tissue models of higher organisms have been constructed. These models thereby can include different parts of a plant, like root, stem, or different tissue types in the same organ. Such models can elucidate details of the interplay between symbiotic organisms, as well as the concerted efforts of multiple tissues and can be applied to analyse the effects of drugs or mutations on a more systemic level. In this review we give an overview of the recent development of multi-tissue models using constraint based techniques and the methods employed when investigating these models. We further highlight advances in combining constraint based models with dynamic and regulatory information and give an overview of these types of hybrid or multi-level approaches.
Efficient search, mapping, and optimization of multi-protein genetic systems in diverse bacteria
Farasat, Iman; Kushwaha, Manish; Collens, Jason; Easterbrook, Michael; Guido, Matthew; Salis, Howard M
2014-01-01
Developing predictive models of multi-protein genetic systems to understand and optimize their behavior remains a combinatorial challenge, particularly when measurement throughput is limited. We developed a computational approach to build predictive models and identify optimal sequences and expression levels, while circumventing combinatorial explosion. Maximally informative genetic system variants were first designed by the RBS Library Calculator, an algorithm to design sequences for efficiently searching a multi-protein expression space across a > 10,000-fold range with tailored search parameters and well-predicted translation rates. We validated the algorithm's predictions by characterizing 646 genetic system variants, encoded in plasmids and genomes, expressed in six gram-positive and gram-negative bacterial hosts. We then combined the search algorithm with system-level kinetic modeling, requiring the construction and characterization of 73 variants to build a sequence-expression-activity map (SEAMAP) for a biosynthesis pathway. Using model predictions, we designed and characterized 47 additional pathway variants to navigate its activity space, find optimal expression regions with desired activity response curves, and relieve rate-limiting steps in metabolism. Creating sequence-expression-activity maps accelerates the optimization of many protein systems and allows previous measurements to quantitatively inform future designs. PMID:24952589
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) used to predict macroscopic behavior are simulated using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load and environment dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
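The core probabilistic idea, propagating constituent-level uncertainty to a structural failure probability, can be sketched by Monte Carlo sampling. The strength surrogate and the distributions below are hypothetical stand-ins, not IPACS's micromechanics or its multi-factor interaction equation.

```python
import random

def laminate_strength(fiber_E, matrix_E):
    # Toy rule-of-mixtures surrogate for a macroscopic strength measure;
    # a real analysis would use probabilistic composite mechanics.
    return 0.6 * fiber_E + 0.4 * matrix_E

def failure_probability(load, n_samples=50_000, seed=42):
    # Monte Carlo estimate of P(strength < load) under assumed
    # constituent-property distributions (means/stds are illustrative).
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        fiber_E = rng.gauss(230.0, 15.0)
        matrix_E = rng.gauss(3.5, 0.4)
        if laminate_strength(fiber_E, matrix_E) < load:
            failures += 1
    return failures / n_samples

# Reliability improves as the design load margin increases.
print(failure_probability(load=130.0))
print(failure_probability(load=120.0))
```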
A mixed parallel strategy for the solution of coupled multi-scale problems at finite strains
NASA Astrophysics Data System (ADS)
Lopes, I. A. Rodrigues; Pires, F. M. Andrade; Reis, F. J. P.
2018-02-01
A mixed parallel strategy for the solution of homogenization-based multi-scale constitutive problems undergoing finite strains is proposed. The approach aims to reduce the computational time and memory requirements of non-linear coupled simulations that use finite element discretization at both scales (FE^2). In the first level of the algorithm, a non-conforming domain decomposition technique, based on the FETI method combined with a mortar discretization at the interface of macroscopic subdomains, is employed. A master-slave scheme, which distributes tasks by macroscopic element and adopts dynamic scheduling, is then used for each macroscopic subdomain composing the second level of the algorithm. This strategy allows the parallelization of FE^2 simulations in computers with either shared memory or distributed memory architectures. The proposed strategy preserves the quadratic rates of asymptotic convergence that characterize the Newton-Raphson scheme. Several examples are presented to demonstrate the robustness and efficiency of the proposed parallel strategy.
NASA Astrophysics Data System (ADS)
Ma, Weiwei; Gong, Cailan; Hu, Yong; Li, Long; Meng, Peng
2015-10-01
Remote sensing technology has been broadly recognized for its convenience and efficiency in mapping vegetation, particularly in high-altitude and inaccessible areas where in-situ observations are lacking. In this study, Landsat Thematic Mapper (TM) images and Chinese environmental mitigation satellite CCD sensor (HJ-1 CCD) images, both at 30 m spatial resolution, were employed for identifying and monitoring vegetation types in an area of western China, the Qinghai Lake Watershed (QHLW). A decision classification tree (DCT) algorithm using multiple characteristics, including seasonal TM/HJ-1 CCD time series data combined with a digital elevation model (DEM) dataset, and a supervised maximum likelihood classification (MLC) algorithm with a single-date TM image were applied to vegetation classification. Accuracy of the two algorithms was assessed using field observation data. Based on the produced vegetation classification maps, it was found that the DCT using multi-season data and geomorphologic parameters was superior to the MLC algorithm using a single-date image, improving the overall accuracy by 11.86% at the second class level and significantly reducing the "salt and pepper" noise. The DCT algorithm applied to TM/HJ-1 CCD time series data and geomorphologic parameters appeared to be a valuable and reliable tool for monitoring vegetation at the first class level (5 vegetation classes) and the second class level (8 vegetation subclasses). The DCT algorithm using multiple characteristics might provide a theoretical basis and general approach to automatic extraction of vegetation types from remote sensing imagery over plateau areas.
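A decision classification tree of this kind amounts to a cascade of threshold rules over seasonal spectral features and terrain parameters. The sketch below is illustrative only: the thresholds, indices, and class names are invented, not the study's calibrated tree.

```python
def classify_pixel(ndvi_summer, ndvi_winter, elevation_m):
    # Hypothetical DCT-style rules combining multi-season NDVI with a
    # DEM-derived elevation, in the spirit of the approach above.
    if ndvi_summer < 0.2:
        return "water_or_bare"          # low greenness year-round
    if elevation_m > 3800:
        return "alpine_meadow"          # terrain constraint from the DEM
    if ndvi_winter > 0.3:
        return "evergreen_shrub"        # stays green in winter
    return "grassland"                  # green in summer, senescent in winter

print(classify_pixel(0.65, 0.15, 3400))
```

Multi-season inputs let the tree separate classes (e.g., evergreen vs. seasonal vegetation) that a single-date image cannot, which is consistent with the accuracy gain reported above.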
De Jong, Joop T V M
2010-01-01
Political violence, armed conflicts and human rights violations are produced by a variety of political, economic and socio-cultural factors. Conflicts can be analyzed with an interdisciplinary approach to obtain a global understanding of the relative contribution of risk and protective factors. A public health framework was designed to address these risk factors and protective factors. The framework resulted in a matrix that combined primary, secondary and tertiary interventions with their implementation on the levels of the society-at-large, the community, and the family and individual. Subsequently, the risk and protective factors were translated into multi-sectoral, multi-modal and multi-level preventive interventions involving the economy, governance, diplomacy, the military, human rights, agriculture, health, and education. Then the interventions were slotted in their appropriate place in the matrix. The interventions can be applied in an integrative form by international agencies, governments and non-governmental organizations, and molded to meet the requirements of the historic, political-economic and socio-cultural context. The framework maps the complementary fit among the different actors while engaging themselves in preventive, rehabilitative and reconstructive interventions. The framework shows how the economic, diplomatic, political, criminal justice, human rights, military, health and rural development sectors can collaborate to promote peace or prevent the aggravation or continuation of violence. A deeper understanding of the association between risk and protective factors and the developmental pathways of generic, country-specific and culture-specific factors leading to political violence is needed.
Dinh, Duy; Tamine, Lynda; Boubekeur, Fatiha
2013-02-01
The aim of this work is to evaluate a set of indexing and retrieval strategies based on the integration of several biomedical terminologies on the available TREC Genomics collections for an ad hoc information retrieval (IR) task. We propose a multi-terminology based concept extraction approach to selecting the best concepts from free text by means of voting techniques. We instantiate this general approach on four terminologies (MeSH, SNOMED, ICD-10 and GO). We particularly focus on the effect of integrating terminologies into a biomedical IR process, and the utility of using voting techniques for combining the extracted concepts from each document in order to provide a list of unique concepts. Experimental studies conducted on the TREC Genomics collections show that our multi-terminology IR approach based on voting techniques yields statistically significant improvements over the baseline. For example, tested on the 2005 TREC Genomics collection, our multi-terminology based IR approach provides an improvement rate of +6.98% in terms of MAP (mean average precision) (p<0.05) compared to the baseline. In addition, our experimental results show that document expansion using preferred terms in combination with query expansion using terms from top ranked expanded documents improves biomedical IR effectiveness. We have evaluated several voting models for combining concepts issued from multiple terminologies. Through this study, we presented many factors affecting the effectiveness of a biomedical IR system, including term weighting, query expansion, and document expansion models. The appropriate combination of those factors could be useful to improve IR performance. Copyright © 2012 Elsevier B.V. All rights reserved.
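The simplest voting model for combining concepts from multiple terminologies treats each terminology's extraction as one vote per concept and ranks concepts by support. The concept names and the count-based rule below are illustrative assumptions; the paper evaluates several voting models, of which this is only the most basic.

```python
from collections import Counter

def combine_by_voting(extractions, min_votes=2):
    # extractions: mapping terminology name -> set of extracted concept IDs.
    votes = Counter()
    for concepts in extractions.values():
        votes.update(concepts)
    # Rank by support (then alphabetically for a stable order) and keep
    # unique concepts backed by at least `min_votes` terminologies.
    ranked = sorted(votes.items(), key=lambda kv: (-kv[1], kv[0]))
    return [concept for concept, v in ranked if v >= min_votes]

extractions = {
    "MeSH":   {"neoplasms", "apoptosis", "p53"},
    "SNOMED": {"neoplasms", "p53"},
    "ICD-10": {"neoplasms"},
    "GO":     {"apoptosis", "p53"},
}
print(combine_by_voting(extractions))
```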
NASA Astrophysics Data System (ADS)
Ajadi, O. A.; Meyer, F. J.
2014-12-01
Automatic oil spill detection and tracking from Synthetic Aperture Radar (SAR) images is a difficult task, due in large part to the inhomogeneous properties of the sea surface, the high level of speckle inherent in SAR data, the complexity and the highly non-Gaussian nature of amplitude information, and the low temporal sampling that is often achieved with SAR systems. This research presents a promising new oil spill detection and tracking method that is based on time series of SAR images. Through the combination of a number of advanced image processing techniques, the developed approach is able to mitigate some of these previously mentioned limitations of SAR-based oil-spill detection and enables fully automatic spill detection and tracking across a wide range of spatial scales. The method combines an initial automatic texture analysis with a consecutive change detection approach based on multi-scale image decomposition. The first step of the approach, a texture transformation of the original SAR images, is performed in order to normalize the ocean background and enhance the contrast between oil-covered and oil-free ocean surfaces. The Lipschitz regularity (LR), a local texture parameter, is used here due to its proven ability to normalize the reflectivity properties of ocean water and maximize the visibility of oil in water. To calculate LR, the images are decomposed using a two-dimensional continuous wavelet transform (2D-CWT) and transformed into Hölder space to measure LR. After texture transformation, the now normalized images are inserted into our multi-temporal change detection algorithm. The multi-temporal change detection approach is a two-step procedure including (1) data enhancement and filtering and (2) multi-scale automatic change detection. The performance of the developed approach is demonstrated by an application to oil spill areas in the Gulf of Mexico.
In this example, areas affected by oil spills were identified from a series of ALOS PALSAR images acquired in 2010. The comparison showed exceptional performance of our method. This method can be applied to emergency management and decision support systems with a need for real-time data, and it shows great potential for rapid data analysis in other areas, including volcano detection, flood boundaries, forest health, and wildfires.
MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*
CHAHINE, Georges L.; HSIAO, Chao-Tsung
2012-01-01
Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary and avoid deleterious effects requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to study the dynamics combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696
Ambient and smartphone sensor assisted ADL recognition in multi-inhabitant smart environments.
Roy, Nirmalya; Misra, Archan; Cook, Diane
2016-02-01
Activity recognition in smart environments is an evolving research problem due to the advancement and proliferation of sensing, monitoring and actuation technologies that make large-scale, real-world deployment possible. While activities in a smart home are interleaved, complex and volatile, the number of inhabitants in the environment is also dynamic. A key challenge in designing robust smart home activity recognition approaches is to exploit the users' spatiotemporal behavior and location, focus on the availability of a multitude of devices capable of providing different dimensions of information, and fulfill the underpinning needs for scaling the system beyond a single user or a home environment. In this paper, we propose a hybrid approach for recognizing complex activities of daily living (ADL) that lies in between the two extremes of intensive use of body-worn sensors and the use of ambient sensors. Our approach harnesses the power of simple ambient sensors (e.g., motion sensors) to provide additional 'hidden' context (e.g., room-level location) of an individual, and then combines this context with smartphone-based sensing of micro-level postural/locomotive states. The major novelty is our focus on multi-inhabitant environments, where we show how the use of spatiotemporal constraints along with a multitude of data sources can be used to significantly improve the accuracy and reduce the computational overhead of traditional activity recognition approaches such as coupled hidden Markov models. Experimental results on two separate smart home datasets demonstrate that this approach improves the accuracy of complex ADL classification by over 30%, compared to pure smartphone-based solutions.
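The hybrid idea above, ambient sensors constraining the hypothesis space before smartphone sensing ranks the remaining candidates, can be shown in a toy form. The activity tables and scores below are illustrative assumptions, not the paper's model (which builds on coupled hidden Markov models).

```python
# Room-level location (from ambient motion sensors) -> plausible ADLs.
ROOM_ACTIVITIES = {
    "kitchen": {"cooking", "eating"},
    "bedroom": {"sleeping", "dressing"},
    "living":  {"watching_tv", "eating"},
}

# Hypothetical P(posture | activity) from smartphone sensing.
POSTURE_SCORES = {
    ("standing", "cooking"): 0.8, ("sitting", "cooking"): 0.2,
    ("sitting", "eating"): 0.9, ("standing", "eating"): 0.1,
    ("lying", "sleeping"): 0.95, ("standing", "dressing"): 0.7,
    ("sitting", "watching_tv"): 0.85,
}

def recognize(room, posture):
    # Spatial constraint first (prunes the candidate set), then
    # posture-based ranking over what remains.
    candidates = ROOM_ACTIVITIES.get(room, set())
    scored = [(POSTURE_SCORES.get((posture, a), 0.0), a) for a in candidates]
    return max(scored)[1] if scored else None

print(recognize("kitchen", "sitting"))
```

Pruning by room before scoring is also where the computational saving comes from: the classifier only evaluates activities consistent with the ambient context.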
Multi-level systems modeling and optimization for novel aircraft
NASA Astrophysics Data System (ADS)
Subramanian, Shreyas Vathul
This research combines the disciplines of system-of-systems (SoS) modeling, platform-based design, optimization and evolving design spaces to achieve a novel capability for designing solutions to key aeronautical mission challenges. A central innovation in this approach is the confluence of multi-level modeling (from sub-systems to the aircraft system to aeronautical system-of-systems) in a way that coordinates the appropriate problem formulations at each level and enables parametric search in design libraries for solutions that satisfy level-specific objectives. The work here addresses the topic of SoS optimization and discusses problem formulation, solution strategy, the need for new algorithms that address special features of this problem type, and also demonstrates these concepts using two example application problems - a surveillance UAV swarm problem, and the design of noise optimal aircraft and approach procedures. This topic is critical since most new capabilities in aeronautics will be provided not just by a single air vehicle, but by aeronautical Systems of Systems (SoS). At the same time, many new aircraft concepts are pressing the boundaries of cyber-physical complexity through the myriad of dynamic and adaptive sub-systems that are rising up the TRL (Technology Readiness Level) scale. This compositional approach is envisioned to be active at three levels: validated sub-systems are integrated to form conceptual aircraft, which are further connected with others to perform a challenging mission capability at the SoS level. While these multiple levels represent layers of physical abstraction, each discipline is associated with tools of varying fidelity forming strata of 'analysis abstraction'. 
Further, the design (composition) will be guided by a suitable hierarchical complexity metric formulated for the management of complexity in both the problem (as part of the generative procedure and selection of fidelity level) and the product (i.e., is the mission best achieved via a large collection of interacting simple systems, or a relatively few highly capable, complex air vehicles). The vastly unexplored area of optimization in evolving design spaces will be studied and incorporated into the SoS optimization framework. We envision a framework that resembles a multi-level, multi-fidelity, multi-disciplinary assemblage of optimization problems. The challenge is not simply one of scaling up to a new level (the SoS), but recognizing that the aircraft sub-systems and the integrated vehicle are now intensely cyber-physical, with hardware and software components interacting in complex ways that give rise to new and improved capabilities. The work presented here is a step closer to modeling the information flow that exists in realistic SoS optimization problems between sub-contractors, contractors and the SoS architect.
Radac, Mircea-Bogdan; Precup, Radu-Emil; Roman, Raul-Cristian
2018-02-01
This paper proposes a combined Virtual Reference Feedback Tuning-Q-learning model-free control approach, which tunes nonlinear static state feedback controllers to achieve output model reference tracking in an optimal control framework. The novel iterative Batch Fitted Q-learning strategy uses two neural networks to represent the value function (critic) and the controller (actor), and it is referred to as a mixed Virtual Reference Feedback Tuning-Batch Fitted Q-learning approach. Learning convergence of the Q-learning schemes generally depends, among other settings, on the efficient exploration of the state-action space. Handcrafting test signals for efficient exploration is difficult even for input-output stable unknown processes. Virtual Reference Feedback Tuning can ensure an initial stabilizing controller to be learned from few input-output data, and it can then be used to collect substantially more input-state data in a controlled mode, in a constrained environment, by compensating the process dynamics. This data is used to learn significantly superior nonlinear state feedback neural network controllers for model reference tracking, using the proposed Batch Fitted Q-learning iterative tuning strategy, motivating the original combination of the two techniques. The mixed Virtual Reference Feedback Tuning-Batch Fitted Q-learning approach is experimentally validated for water level control of a multi input-multi output nonlinear constrained coupled two-tank system. Discussions on the observed control behavior are offered. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Yang, Yan; Geng, Chao; Li, Feng; Huang, Guan; Li, Xinyang
2017-10-30
Multi-aperture receiver with optical combining architecture is an effective approach to overcome the turbulent atmosphere effect on the performance of free-space optical (FSO) communications, in which how to combine the multiple laser beams received by the sub-apertures efficiently is one of the key technologies. In this paper, we focus on the combining module based on fiber couplers, and propose all-fiber coherent beam combining (CBC) with two architectures by using active phase locking. To validate the feasibility of the proposed combining module, corresponding experiments and simulations on the CBC of four laser beams are carried out. The experimental results show that the phase differences among the input beams can be compensated and the combining efficiency can be stably promoted by active phase locking in CBC with both of the two architectures. The simulation results show that the combining efficiency fluctuates when turbulent atmosphere is considered, and the effectiveness of the combining module decreases as the turbulence increases. We believe that the combining module proposed in this paper has great potential, and the results can provide useful guidance for researchers when building such a multi-aperture receiver with optical combining architecture for FSO communication systems.
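The principle behind active phase locking can be sketched numerically: the combined power of N beams is maximized when their piston phase differences are compensated. The loop below is a generic stochastic parallel gradient descent (SPGD)-style controller, a common choice for such systems but an assumption here, since the paper does not specify its algorithm in code.

```python
import cmath
import random

def combined_power(phases, amplitudes):
    # Intensity of the coherently combined field.
    field = sum(a * cmath.exp(1j * p) for a, p in zip(amplitudes, phases))
    return abs(field) ** 2

def phase_lock(phases, amplitudes, gain=0.5, dither=0.05, iters=3000, seed=1):
    # SPGD-style loop: apply small random phase dithers, measure the
    # resulting change in combined power, and step each phase along the
    # estimated gradient.
    rng = random.Random(seed)
    phases = list(phases)
    for _ in range(iters):
        deltas = [rng.uniform(-dither, dither) for _ in phases]
        plus = combined_power([p + d for p, d in zip(phases, deltas)], amplitudes)
        minus = combined_power([p - d for p, d in zip(phases, deltas)], amplitudes)
        grad = plus - minus
        phases = [p + gain * grad * d for p, d in zip(phases, deltas)]
    return phases

amps = [1.0] * 4                     # four sub-aperture beams, equal power
start = [0.0, 1.3, -2.1, 0.7]        # piston phase errors (e.g., turbulence)
locked = phase_lock(start, amps)
print(combined_power(start, amps))   # degraded combining efficiency
print(combined_power(locked, amps))  # approaches the ideal (sum of amps)^2 = 16
```

This also illustrates the simulation finding above: if the phase errors fluctuate faster than the control loop can track, the achieved combining efficiency drops.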
A Case Study on Neural Inspired Dynamic Memory Management Strategies for High Performance Computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vineyard, Craig Michael; Verzi, Stephen Joseph
As high performance computing architectures pursue more computational power there is a need for increased memory capacity and bandwidth as well. A multi-level memory (MLM) architecture addresses this need by combining multiple memory types with different characteristics as varying levels of the same architecture. How to efficiently utilize this memory infrastructure is an unknown challenge, and in this research we sought to investigate whether neural inspired approaches can meaningfully help with memory management. In particular we explored neurogenesis-inspired resource allocation, and were able to show a neural inspired mixed controller policy can beneficially impact how MLM architectures utilize memory.
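A toy model makes the MLM management problem concrete: a small, fast level backed by a large, slower level, with an allocation policy deciding placement. The neural-inspired controller from the report is not reproduced here; this sketch shows only the kind of baseline "hot data to fast memory" policy such a controller would tune.

```python
class MultiLevelMemory:
    def __init__(self, fast_capacity):
        self.fast_capacity = fast_capacity
        self.fast = {}   # buffer name -> size (fast, small level)
        self.slow = {}   # buffer name -> size (slow, large level)

    def allocate(self, name, size, hotness):
        # Place frequently-accessed (hot) buffers in fast memory when
        # capacity allows; everything else falls back to slow memory.
        used = sum(self.fast.values())
        if hotness > 0.5 and used + size <= self.fast_capacity:
            self.fast[name] = size
        else:
            self.slow[name] = size

mlm = MultiLevelMemory(fast_capacity=100)
mlm.allocate("matrix_A", 60, hotness=0.9)
mlm.allocate("scratch", 80, hotness=0.9)   # hot, but no room left in fast memory
mlm.allocate("log_buf", 10, hotness=0.1)   # cold: goes straight to slow memory
print(sorted(mlm.fast), sorted(mlm.slow))
```

The hard part, which motivates learned controllers, is that `hotness` is not known in advance and changes over an application's lifetime.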
Upadhyay, Manas V.; Patra, Anirban; Wen, Wei; ...
2018-05-08
In this paper, we propose a multi-scale modeling approach that can simulate the microstructural and mechanical behavior of metal or alloy parts with complex geometries subjected to multi-axial load path changes. The model is used to understand the biaxial load path change behavior of 316L stainless steel cruciform samples. At the macroscale, a finite element approach is used to simulate the cruciform geometry and numerically predict the gauge stresses, which are difficult to obtain analytically. At each material point in the finite element mesh, the anisotropic viscoplastic self-consistent model is used to simulate the role of texture evolution on the mechanical response. At the single crystal level, a dislocation density based hardening law that appropriately captures the role of multi-axial load path changes on slip activity is used. The combined approach is experimentally validated using cruciform samples subjected to uniaxial load and unload followed by different biaxial reloads in the angular range [27°, 90°]. Polycrystalline yield surfaces before and after load path changes are generated using the full-field elasto-viscoplastic fast Fourier transform model to study the influence of the deformation history and reloading direction on the mechanical response, including the Bauschinger effect, of these cruciform samples. Results reveal that the Bauschinger effect is strongly dependent on the first loading direction and strain, intergranular and macroscopic residual stresses after first load, and the reloading angle. The microstructural origins of the mechanical response are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Investigating the Group-Level Impact of Advanced Dual-Echo fMRI Combinations
Kettinger, Ádám; Hill, Christopher; Vidnyánszky, Zoltán; Windischberger, Christian; Nagy, Zoltán
2016-01-01
Multi-echo fMRI data acquisition has been widely investigated and suggested to optimize sensitivity for detecting the BOLD signal. Several methods have also been proposed for the combination of data with different echo times. The aim of the present study was to investigate whether these advanced echo combination methods provide advantages over the simple averaging of echoes when state-of-the-art group-level random-effect analyses are performed. Both resting-state and task-based dual-echo fMRI data were collected from 27 healthy adult individuals (14 male, mean age = 25.75 years) using standard echo-planar acquisition methods at 3T. Both resting-state and task-based data were subjected to a standard image pre-processing pipeline. Subsequently the two echoes were combined as a weighted average, using four different strategies for calculating the weights: (1) simple arithmetic averaging, (2) BOLD sensitivity weighting, (3) temporal-signal-to-noise ratio weighting and (4) temporal BOLD sensitivity weighting. Our results clearly show that the simple averaging of data with the different echoes is sufficient. Advanced echo combination methods may provide advantages on a single-subject level but when considering random-effects group level statistics they provide no benefit regarding sensitivity (i.e., group-level t-values) compared to the simple echo-averaging approach. One possible reason for the lack of clear advantages may be that apart from increasing the average BOLD sensitivity at the single-subject level, the advanced weighted averaging methods also inflate the inter-subject variance. As the echo combination methods provide very similar results, the recommendation is to choose between them depending on the availability of time for collecting additional resting-state data or whether subject-level or group-level analyses are planned. PMID:28018165
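The weighted-averaging strategies compared in this study can be sketched in a few lines of NumPy. The sketch below covers only the simple arithmetic average and the tSNR-weighted variant, with hypothetical data shapes; it is illustrative, not a reproduction of the authors' pipeline.

```python
import numpy as np

def combine_echoes(echo1, echo2, method="average"):
    """Combine two echo time series, each shaped (timepoints, voxels).

    method: "average" - simple arithmetic mean of the echoes
            "tsnr"    - weight each echo per voxel by its temporal SNR
                        (mean/std), a common proxy for BOLD sensitivity
    """
    echoes = [np.asarray(echo1, float), np.asarray(echo2, float)]
    if method == "average":
        weights = [np.ones(echoes[0].shape[1]) for _ in echoes]
    elif method == "tsnr":
        weights = [e.mean(axis=0) / e.std(axis=0) for e in echoes]
    else:
        raise ValueError(method)
    # weighted average: sum(w_i * S_i) / sum(w_i), broadcast over time
    num = sum(w * e for w, e in zip(weights, echoes))
    den = sum(weights)
    return num / den
```

The abstract's conclusion is that, at the group level, the choice of `method` makes little practical difference.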
A Generalized Mixture Framework for Multi-label Classification
Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos
2015-01-01
We develop a novel probabilistic ensemble framework for multi-label classification that is based on the mixtures-of-experts architecture. In this framework, we combine multi-label classification models in the classifier chains family that decompose the class posterior distribution P(Y1, …, Yd|X) using a product of posterior distributions over components of the output space. Our approach captures different input–output and output–output relations that tend to change across data. As a result, we can recover a rich set of dependency relations among inputs and outputs that a single multi-label classification model cannot capture due to its modeling simplifications. We develop and present algorithms for learning the mixtures-of-experts models from data and for performing multi-label predictions on unseen data instances. Experiments on multiple benchmark datasets demonstrate that our approach achieves highly competitive results and outperforms the existing state-of-the-art multi-label classification methods. PMID:26613069
Multilayer composition coatings for cutting tools: formation and performance properties
NASA Astrophysics Data System (ADS)
Tabakov, Vladimir P.; Vereschaka, Anatoly S.; Vereschaka, Alexey A.
2018-03-01
The paper considers the concept of a multi-layer coating architecture in which each layer has a predetermined functionality. The latest generation of multi-layered coatings for cutting tools must serve a dual purpose: the coating should not only improve the mechanical and physical characteristics of the cutting tool material, but also reduce the thermo-mechanical load on the cutting tool that determines wear intensity. We present the results of developing combined methods for forming multi-layer coatings with improved properties. A combined coating-formation method using pulsed laser processing reduced excessively high levels of compressive residual stress and increased the microhardness of the multilayered coatings. Tests of coated HSS tools showed that the additional pulsed laser processing increases tool life by up to 3 times. Using filtered cathodic vacuum arc deposition to generate multilayer coatings based on the TiAlN compound increased the wear resistance of carbide tools twofold compared with the tool life of cutting tools with commercial TiN coatings. The aim of this study was to develop an innovative methodological approach to the deposition of multilayer coatings for cutting tools, in which the architecture, properties, and parameters of the coating are selected based on sound knowledge of coating failure in the machining process.
Diffusion-Based Design of Multi-Layered Ophthalmic Lenses for Controlled Drug Release
Pimenta, Andreia F. R.; Serro, Ana Paula; Paradiso, Patrizia; Saramago, Benilde
2016-01-01
The study of ocular drug delivery systems has been one of the most covered topics in drug delivery research. One potential drug carrier solution is the use of materials that are already commercially available in ophthalmic lenses for the correction of refractive errors. In this study, we present a diffusion-based mathematical model in which the parameters can be adjusted based on experimental results obtained under controlled conditions. The model allows for the design of multi-layered therapeutic ophthalmic lenses for controlled drug delivery. We show that the proper combination of materials with adequate drug diffusion coefficients, thicknesses and interfacial transport characteristics allows for the control of the delivery of drugs from multi-layered ophthalmic lenses, such that drug bursts can be minimized, and the release time can be maximized. As far as we know, this combination of a mathematical modelling approach with experimental validation of non-constant activity source lamellar structures, made of layers of different materials, accounting for the interface resistance to the drug diffusion, is a novel approach to the design of drug loaded multi-layered contact lenses. PMID:27936138
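A minimal explicit finite-difference sketch of the kind of layered diffusion problem described above, in hypothetical dimensionless units. The boundary conditions (perfect sink at the outer face, sealed inner face), grid resolution, and parameters are illustrative assumptions, not the paper's model.

```python
import numpy as np

def release_profile(d_coeffs, thicknesses, c0, n_per_layer=20, t_end=1.0):
    """1-D drug diffusion through stacked layers (dimensionless sketch).

    d_coeffs, thicknesses, c0: per-layer diffusion coefficient, thickness,
    and initial concentration. Outer boundary is a perfect sink (e.g. the
    tear film); inner boundary is sealed. Returns the cumulative fraction
    of drug released after each time step.
    """
    dx = min(thicknesses) / n_per_layer
    # piecewise-constant D and initial concentration on a uniform grid
    D, c = [], []
    for Di, Li, ci in zip(d_coeffs, thicknesses, c0):
        n = max(1, round(Li / dx))
        D += [Di] * n
        c += [ci] * n
    D, c = np.array(D, float), np.array(c, float)
    dt = 0.2 * dx**2 / D.max()          # explicit-scheme stability limit
    total0 = c.sum()
    released = [0.0]
    for _ in range(int(t_end / dt)):
        flux = np.zeros(len(c) + 1)      # flux[0] = 0: sealed inner face
        # harmonic-mean D at cell interfaces handles layer boundaries
        Dface = 2 * D[:-1] * D[1:] / (D[:-1] + D[1:])
        flux[1:-1] = -Dface * (c[1:] - c[:-1]) / dx
        flux[-1] = D[-1] * c[-1] / dx    # perfect-sink outer boundary
        c -= dt / dx * (flux[1:] - flux[:-1])
        released.append(1.0 - c.sum() / total0)
    return np.array(released)
```

Loading the drug in an inner layer and capping it with a low-diffusivity outer layer suppresses the initial burst and extends the release time, which is the design principle the abstract describes.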
Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.
Sakamoto, Takuto
2016-01-01
Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level.
NASA Astrophysics Data System (ADS)
Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan
2015-10-01
Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
Xu, Huile; Liu, Jinyi; Hu, Haibo; Zhang, Yi
2016-12-02
Wearable-sensor-based human activity recognition introduces many useful applications and services in health care, rehabilitation training, elderly monitoring and many other areas of human interaction. Existing works in this field mainly focus on recognizing activities by using traditional features extracted from the Fourier transform (FT) or wavelet transform (WT). However, these signal processing approaches are suitable for linear signals but not for nonlinear signals. In this paper, we investigate the characteristics of the Hilbert-Huang transform (HHT) for dealing with activity data with properties such as nonlinearity and non-stationarity. A multi-feature extraction method based on the HHT is then proposed to improve the accuracy of activity recognition. The extracted multi-features include instantaneous amplitude (IA) and instantaneous frequency (IF) obtained by means of empirical mode decomposition (EMD), as well as instantaneous energy density (IE) and marginal spectrum (MS) derived from Hilbert spectral analysis. Experimental studies are performed to verify the proposed approach by using the PAMAP2 dataset from the University of California, Irvine for wearable-sensor-based activity recognition. Moreover, the effect of combining multi-features vs. a single feature is investigated and discussed in the dependent-subject scenario. The experimental results show that the multi-feature combination can further improve the performance measures. Finally, we test the effect of the multi-feature combination in the independent-subject scenario. Our experimental results show that we achieve recall, precision, F-measure, and accuracy of 0.9337, 0.9417, 0.9353, and 0.9377, respectively, all better than the results of related works.
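The Hilbert-spectral half of such a pipeline can be sketched with SciPy. A full HHT would first decompose the signal into intrinsic mode functions via EMD (typically with a dedicated EMD library), which is omitted here; the feature names and summary statistics below are illustrative choices, not the paper's exact feature set.

```python
import numpy as np
from scipy.signal import hilbert

def hilbert_features(x, fs):
    """Instantaneous amplitude (IA) and frequency (IF) via the analytic
    signal -- the Hilbert-spectral stage of an HHT pipeline."""
    analytic = hilbert(x)
    ia = np.abs(analytic)                          # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    inst_f = np.diff(phase) / (2 * np.pi) * fs     # instantaneous frequency (Hz)
    return ia, inst_f

def activity_features(x, fs):
    """Summary statistics of the kind used as classifier inputs."""
    ia, inst_f = hilbert_features(x, fs)
    return {
        "ia_mean": ia.mean(), "ia_std": ia.std(),
        "if_mean": inst_f.mean(), "if_std": inst_f.std(),
        "energy": float(np.sum(ia**2) / len(ia)),  # mean energy density
    }
```

For a pure sinusoid the IA should sit near the signal amplitude and the IF near the oscillation frequency, which makes the sketch easy to sanity-check.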
Automated diagnosis of Alzheimer's disease with multi-atlas based whole brain segmentations
NASA Astrophysics Data System (ADS)
Luo, Yuan; Tang, Xiaoying
2017-03-01
Voxel-based analysis is widely used in quantitative analysis of structural brain magnetic resonance imaging (MRI) and automated disease detection, such as for Alzheimer's disease (AD). However, noise at the voxel level may cause low sensitivity to AD-induced structural abnormalities. This can be addressed with a whole brain structural segmentation approach, which greatly reduces the dimension of the features (the number of voxels). In this paper, we propose an automatic AD diagnosis system that combines such whole brain segmentations with advanced machine learning methods. We used a multi-atlas segmentation technique to parcellate T1-weighted images into 54 distinct brain regions and extract their structural volumes to serve as the features for principal-component-analysis-based dimension reduction and support-vector-machine-based classification. The relationship between the number of retained principal components (PCs) and the diagnosis accuracy was systematically evaluated, in a leave-one-out fashion, based on 28 AD subjects and 23 age-matched healthy subjects. Our approach yielded strong classification results, with 96.08% overall accuracy achieved using the three foremost PCs. In addition, our approach yielded 96.43% specificity, 100% sensitivity, and 0.9891 area under the receiver operating characteristic curve.
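The described pipeline (regional volumes, then PCA dimension reduction, then an SVM, evaluated leave-one-out) maps directly onto scikit-learn. The sketch below uses illustrative choices not stated in the abstract (feature standardization, a linear kernel) and synthetic data in place of the real volumes.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def loo_accuracy(volumes, labels, n_components=3):
    """Leave-one-out accuracy of a PCA + linear-SVM classifier on
    regional brain volumes (n_subjects x n_regions). The scaler and
    kernel are illustrative choices; PCA and the SVM are refit inside
    each fold so no test information leaks into training."""
    clf = make_pipeline(StandardScaler(),
                        PCA(n_components=n_components),
                        SVC(kernel="linear"))
    scores = cross_val_score(clf, volumes, labels, cv=LeaveOneOut())
    return scores.mean()
```

Varying `n_components` reproduces the kind of PCs-versus-accuracy sweep the paper evaluates.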
Crowe, A S; Booty, W G
1995-05-01
A multi-level pesticide assessment methodology has been developed to permit regulatory personnel to undertake a variety of assessments of the potential for pesticides used in agricultural areas to contaminate the groundwater regime, at an increasingly detailed geographical scale of investigation. A multi-level approach accounts for a variety of assessment objectives and levels of detail required in the assessment, restrictions on the availability and accuracy of data, the time available to undertake the assessment, and the expertise of the decision maker. The Level 1 (regional) scale is designed to prioritize districts having a potentially high risk of groundwater contamination from the application of a specific pesticide for a particular crop. The Level 2 (local) scale is used to identify critical areas for groundwater contamination, at a soil polygon scale, within a district. The Level 3 (soil profile) scale allows the user to evaluate specific factors influencing pesticide leaching and persistence, and to determine the extent and timing of leaching, through simulation of the migration of a pesticide within a soil profile. Because of the scale of investigation, the limited amount of data required, and the qualitative nature of the assessment results, the Level 1 and Level 2 assessments are designed primarily for quick and broad guidance related to management practices. A Level 3 assessment is more complex, requires considerably more data and expertise on the part of the user, and hence is designed to verify the potential for contamination identified during a Level 1 or 2 assessment. The system combines environmental modelling, geographical information systems, extensive databases, data management systems, expert systems, and pesticide assessment models to form an environmental information system for assessing the potential for pesticides to contaminate groundwater.
A Multi-Level Approach to Outreach for Geologic Sequestration Projects
Greenberg, S.E.; Leetaru, H.E.; Krapac, I.G.; Hnottavange-Telleen, K.; Finley, R.J.
2009-01-01
Public perception of carbon capture and sequestration (CCS) projects represents a potential barrier to commercialization. Outreach to stakeholders at the local, regional, and national level is needed to create familiarity with and potential acceptance of CCS projects. This paper highlights the Midwest Geological Sequestration Consortium (MGSC) multi-level outreach approach which interacts with multiple stakeholders. The MGSC approach focuses on external and internal communication. External communication has resulted in building regional public understanding of CCS. Internal communication, through a project Risk Assessment process, has resulted in enhanced team communication and preparation of team members for outreach roles. © 2009 Elsevier Ltd. All rights reserved.
Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information
Wang, Xiaohong; Wang, Lizhi
2017-01-01
Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system. PMID:28926930
Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)
NASA Astrophysics Data System (ADS)
OConnor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.
2016-02-01
Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high skill predictions; however, even higher skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, the best method in a given environment requires expert knowledge of the models and combination methods. We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km² region, and day given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to effectively stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate the benefits of our approach to produce extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models.
The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are better still; and forecasts produced using our approach most often have the highest rank probability skill score.
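As a much simpler stand-in for the machine-learned, per-region and per-condition weighting described above, one can learn non-negative combination weights for the ensemble members from hindcast data by non-negative least squares. This is a sketch of the general idea only; the paper's actual learner and features are not reproduced.

```python
import numpy as np
from scipy.optimize import nnls

def learn_weights(hindcasts, observations):
    """Fit non-negative member weights from hindcasts (n_times x n_models)
    against observed values (n_times,), then normalise to a convex
    combination so the result stays a weighted average."""
    w, _ = nnls(np.asarray(hindcasts, float), np.asarray(observations, float))
    if w.sum() > 0:
        w = w / w.sum()
    return w

def combined_forecast(forecasts, weights):
    """Apply the learned weights to new member forecasts."""
    return np.asarray(forecasts, float) @ weights
```

With members of unequal quality, the learned combination should match or beat the best single member on held-out data, mirroring the "weighted combinations are strictly better" finding.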
Approaching human language with complex networks
NASA Astrophysics Data System (ADS)
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
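A toy illustration of the first line of research surveyed above: building a word co-occurrence network (one of the most common linguistic network types) and computing a simple system-level measure. The whitespace tokenization and adjacency window are illustrative choices, not a method from the survey.

```python
from collections import defaultdict

def cooccurrence_network(sentences, window=2):
    """Build an undirected word co-occurrence network and report a
    simple system-level measure (average degree). Words co-occurring
    within `window` positions in a sentence are linked; edge weights
    count co-occurrences."""
    edges = defaultdict(int)
    nodes = set()
    for sent in sentences:
        words = sent.lower().split()
        nodes.update(words)
        for i, w in enumerate(words):
            for v in words[i + 1:i + window]:
                if v != w:
                    edges[tuple(sorted((w, v)))] += 1
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    avg_degree = sum(degree.values()) / len(nodes) if nodes else 0.0
    return dict(edges), avg_degree
```

On a real corpus, measures such as the degree distribution of this network are the kind of topological quantities the surveyed work relates to typological and syntactic properties of languages.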
Multi-Level Adaptation in End-User Development of 3D Virtual Chemistry Experiments
ERIC Educational Resources Information Center
Liu, Chang; Zhong, Ying
2014-01-01
Multi-level adaptation in end-user development (EUD) is an effective way to enable non-technical end users such as educators to gradually introduce more functionality with increasing complexity to 3D virtual learning environments developed by themselves using EUD approaches. Parameterization, integration, and extension are three levels of…
Supermodeling With A Global Atmospheric Model
NASA Astrophysics Data System (ADS)
Wiegerinck, Wim; Burgers, Willem; Selten, Frank
2013-04-01
In weather and climate prediction studies, it often turns out that the multi-model ensemble mean prediction has the best prediction skill scores. One possible explanation is that the major part of the model error is random and is averaged out in the ensemble mean. In the standard multi-model ensemble approach, the models are integrated in time independently and the predicted states are combined a posteriori. Recently, an alternative ensemble prediction approach has been proposed in which the models exchange information during the simulation and synchronize on a common solution that is closer to the truth than any of the individual model solutions in the standard multi-model ensemble approach, or a weighted average of these. This approach is called the supermodeling approach (SUMO). The potential of the SUMO approach has been demonstrated in the context of simple, low-order, chaotic dynamical systems. The information exchange takes the form of linear nudging terms in the dynamical equations that nudge the solution of each model to the solutions of all other models in the ensemble. With a suitable choice of the connection strengths, the models synchronize on a common solution that is indeed closer to the true system than any of the individual model solutions without nudging. This approach is called connected SUMO. An alternative approach is to integrate a weighted-average model, the weighted SUMO. At each time step, all models in the ensemble calculate their tendencies; these tendencies are weighted-averaged, and the state is integrated one time step into the future with this weighted-average tendency. It was shown that, when the connected SUMO synchronizes perfectly, it follows the weighted-average trajectory and both approaches yield the same solution.
In this study we pioneer both approaches in the context of a global, quasi-geostrophic, three-level atmosphere model that is capable of simulating quite realistically the extra-tropical circulation in the Northern Hemisphere winter.
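The weighted-SUMO idea is easy to sketch on a low-order chaotic system such as Lorenz-63, used here as a stand-in for the three-level atmosphere model: each imperfect model computes its tendency from the common state, and the weighted-average tendency is integrated. Because the Lorenz tendencies are linear in the parameters, equal weights on two models with opposite parameter errors recover the true model exactly; the Euler integrator and parameter values are illustrative.

```python
import numpy as np

def lorenz_tendency(state, sigma, rho, beta):
    """Lorenz-63 tendencies -- a standard low-order chaotic test system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def weighted_sumo_step(state, params_list, weights, dt):
    """One Euler step of the weighted SUMO: average the imperfect models'
    tendencies at the common state, then integrate once."""
    tend = sum(w * lorenz_tendency(state, *p)
               for w, p in zip(weights, params_list))
    return state + dt * tend

def run(state, params_list, weights, dt, nsteps):
    traj = [state]
    for _ in range(nsteps):
        state = weighted_sumo_step(state, params_list, weights, dt)
        traj.append(state)
    return np.array(traj)
```

In practice the weights would be trained against observations rather than chosen to cancel known parameter errors; the sketch only demonstrates the tendency-averaging mechanism.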
Multi-platform metabolomics assays for human lung lavage fluids in an air pollution exposure study.
Surowiec, Izabella; Karimpour, Masoumeh; Gouveia-Figueira, Sandra; Wu, Junfang; Unosson, Jon; Bosson, Jenny A; Blomberg, Anders; Pourazar, Jamshid; Sandström, Thomas; Behndig, Annelie F; Trygg, Johan; Nording, Malin L
2016-07-01
Metabolomics protocols are used to comprehensively characterize the metabolite content of biological samples by exploiting cutting-edge analytical platforms, such as gas chromatography (GC) or liquid chromatography (LC) coupled to mass spectrometry (MS) assays, as well as nuclear magnetic resonance (NMR) assays. We have developed novel sample preparation procedures combined with GC-MS, LC-MS, and NMR metabolomics profiling for analyzing bronchial wash (BW) and bronchoalveolar lavage (BAL) fluid from 15 healthy volunteers following exposure to biodiesel exhaust and filtered air. Our aim was to investigate the responsiveness of metabolite profiles in the human lung to air pollution exposure derived from combustion of biofuels, such as rapeseed methyl ester biodiesel, which are increasingly being promoted as alternatives to conventional fossil fuels. Our multi-platform approach enabled us to detect the greatest number of unique metabolites yet reported in BW and BAL fluid (82 in total). All of the metabolomics assays indicated that the metabolite profiles of the BW and BAL fluids differed appreciably, with 46 metabolites showing significantly different levels in the corresponding lung compartments. Furthermore, the GC-MS assay revealed an effect of biodiesel exhaust exposure on the levels of 1-monostearylglycerol, sucrose, inosine, nonanoic acid, and ethanolamine (in BAL) and pentadecanoic acid (in BW), whereas the LC-MS assay indicated a shift in the levels of niacinamide (in BAL). The NMR assay only identified lactic acid (in BW) as being responsive to biodiesel exhaust exposure. Our findings demonstrate that the proposed multi-platform approach is useful for wide metabolomics screening of BW and BAL fluids and can facilitate elucidation of metabolites responsive to biodiesel exhaust exposure. Graphical abstract: illustration of the study workflow. Abbreviations: NMR, nuclear magnetic resonance; LC-TOFMS, liquid chromatography-time-of-flight mass spectrometry; GC-MS, gas chromatography-mass spectrometry.
Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei
2013-01-01
Given a single index, the receiver operating characteristic (ROC) curve analysis is routinely utilized for characterizing performance in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
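The logical combination rules described here are straightforward to implement once each index has been thresholded into a binary call per subject. A sketch with hypothetical inputs; the thresholding and ROC-sweep machinery of the full multiV-ROC method are not shown.

```python
import numpy as np

def combined_sens_spec(index_calls, labels, rule="and", n=None):
    """Sensitivity/specificity of logically combined binary index calls.

    index_calls: (n_subjects x n_indices) boolean array, True = positive call
    labels:      (n_subjects,) boolean array, True = diseased
    rule:        "and", "or", or "at_least" (positive on >= n indices)
    """
    calls = np.asarray(index_calls, bool)
    labels = np.asarray(labels, bool)
    if rule == "and":
        combined = calls.all(axis=1)
    elif rule == "or":
        combined = calls.any(axis=1)
    elif rule == "at_least":
        combined = calls.sum(axis=1) >= n
    else:
        raise ValueError(rule)
    sens = np.mean(combined[labels])       # true positive rate
    spec = np.mean(~combined[~labels])     # true negative rate
    return sens, spec
```

As expected from the logic, "AND" trades sensitivity for specificity, "OR" does the opposite, and "at least n" interpolates between them (n = 1 is "OR"; n equal to the number of indices is "AND").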
Biodiversity conservation in Swedish forests: ways forward for a 30-year-old multi-scaled approach.
Gustafsson, Lena; Perhans, Karin
2010-12-01
A multi-scaled model for biodiversity conservation in forests was introduced in Sweden 30 years ago, which makes it a pioneer example of an integrated ecosystem approach. Trees are set aside for biodiversity purposes at multiple scale levels varying from individual trees to areas of thousands of hectares, with landowner responsibility at the lowest level and with increasing state involvement at higher levels. Ecological theory supports the multi-scaled approach, and retention efforts at every harvest occasion stimulate landowners' interest in conservation. We argue that the model has large advantages but that in a future with intensified forestry and global warming, development based on more progressive thinking is necessary to maintain and increase biodiversity. Suggestions for the future include joint planning for several forest owners, consideration of cost-effectiveness, accepting opportunistic work models, adjusting retention levels to stand and landscape composition, introduction of temporary reserves, creation of "receiver habitats" for species escaping climate change, and protection of young forests.
Effectiveness of Social Media for Communicating Health Messages in Ghana
ERIC Educational Resources Information Center
Bannor, Richard; Asare, Anthony Kwame; Bawole, Justice Nyigmah
2017-01-01
Purpose: The purpose of this paper is to develop an in-depth understanding of the effectiveness, evolution and dynamism of the current health communication media used in Ghana. Design/methodology/approach: This paper uses a multi-method approach which utilizes a combination of qualitative and quantitative approaches. In-depth interviews are…
Can we use Earth Observations to improve monthly water level forecasts?
NASA Astrophysics Data System (ADS)
Slater, L. J.; Villarini, G.
2017-12-01
Dynamical-statistical hydrologic forecasting approaches benefit from different strengths in comparison with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and `learn' from a broad selection of input data (e.g., General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g. multi-model blending, post-processing and ensembling techniques). Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and model forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.
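As a rough illustration of the dynamical-statistical idea (not the authors' model), one can regress monthly water level on a GCM precipitation forecast plus a land-cover predictor and issue a forecast with simple uncertainty bounds. All data, coefficients, and the urban-fraction predictor below are invented:

```python
# Toy dynamical-statistical forecast: fit water level on a GCM precipitation
# forecast and an annual land-cover fraction, then predict a new month.
import numpy as np

rng = np.random.default_rng(2)
n = 120  # months of hypothetical training data
gcm_precip = rng.gamma(2.0, 30.0, n)      # invented GCM precipitation (mm)
urban_frac = np.linspace(0.10, 0.16, n)   # invented land-cover trend
# Synthetic "truth": level responds to both predictors plus noise
level = 0.01 * gcm_precip + 5.0 * urban_frac + rng.normal(0, 0.1, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), gcm_precip, urban_frac])
beta, *_ = np.linalg.lstsq(X, level, rcond=None)
resid_sd = np.std(level - X @ beta)

# Forecast for a new month, with a crude ~95% interval from residual spread
x_new = np.array([1.0, 80.0, 0.17])
mean = x_new @ beta
print(f"forecast {mean:.2f} +/- {1.96 * resid_sd:.2f} m")
```

Comparing `resid_sd` from models with and without the land-cover column is one simple way to evaluate the skill gain the abstract describes.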
Multi-element fiber technology for space-division multiplexing applications.
Jain, S; Rancaño, V J F; May-Smith, T C; Petropoulos, P; Sahu, J K; Richardson, D J
2014-02-24
A novel technological approach to space division multiplexing (SDM) based on the use of multiple individual fibers embedded in a common polymer coating material is presented, which is referred to as Multi-Element Fiber (MEF). The approach ensures ultralow crosstalk between spatial channels and allows for cost-effective ways of realizing multi-spatial-channel amplification and signal multiplexing/demultiplexing. Both the fabrication and characterization of a passive 3-element MEF for data transmission, and of an active 5-element erbium/ytterbium-doped MEF for cladding-pumped optical amplification that uses one of the elements as an integrated pump delivery fiber, are reported. Finally, both components were combined to emulate an optical fiber network comprising SDM transmission lines and amplifiers, illustrating the compatibility of the approach with existing installed single-mode WDM fiber systems.
Surface tension models for a multi-material ALE code with AMR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Wangyi; Koniges, Alice; Gott, Kevin
A number of surface tension models have been implemented in a 3D multi-physics multi-material code, ALE–AMR, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR). ALE–AMR is unique in its ability to model hot radiating plasmas, cold fragmenting solids, and most recently, the deformation of molten material. The surface tension models implemented include a diffuse interface approach with special numerical techniques to remove parasitic flow and a height function approach in conjunction with a volume-fraction interface reconstruction package. These surface tension models are benchmarked with a variety of test problems. In conclusion, based on the results, the height function approach using volume fractions was chosen to simulate droplet dynamics associated with extreme ultraviolet (EUV) lithography.
Portet, Anaïs; Pinaud, Silvain; Tetreau, Guillaume; Galinier, Richard; Cosseau, Céline; Duval, David; Grunau, Christoph; Mitta, Guillaume; Gourbal, Benjamin
2017-10-01
The fresh water snail Biomphalaria glabrata is one of the vectors of the trematode pathogen Schistosoma mansoni, one of the agents responsible for human schistosomiasis. In this host-parasite interaction, co-evolutionary dynamics result in an infectivity mosaic known as compatibility polymorphism. Integrative approaches, including large-scale molecular approaches, have been conducted in recent years to improve our understanding of the mechanisms underlying compatibility. This review presents the combination of integrated Multi-Omic approaches leading to the discovery of two repertoires of polymorphic and/or diversified interacting molecules: the parasite antigens S. mansoni polymorphic mucins (SmPoMucs) and the B. glabrata immune receptors fibrinogen-related proteins (FREPs). We argue that their interactions may be major components for defining the compatible/incompatible status of a specific snail/schistosome combination.
ERIC Educational Resources Information Center
Bathrellou, Eirini; Yannakoulia, Mary; Papanikolaou, Katerina; Pehlivanidis, Artemios; Pervanidou, Panagiota; Kanaka-Gantenbein, Christina; Tsiantis, John; Chrousos, George P.; Sidossis, Labros S.
2010-01-01
Along the lines of the evidence-based recommendations, we developed a multi-disciplinary intervention for overweight children 7- to 12-years-old, primarily aiming at helping children to adopt healthier eating habits and a physically active lifestyle. The program combined nutrition intervention, based on a non-dieting approach, with physical…
NASA Astrophysics Data System (ADS)
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has changed from traditional single-plant to multi-site supply chain where multiple plants are serving customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer satisfaction demand level is developed. The proposed solution approach yields to a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
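The Pareto-filtering step behind such a multi-objective plan can be sketched as follows; the candidate plans and their objective values are invented for illustration (cost is minimized, quality and service level maximized):

```python
# Extract the Pareto-optimal front from candidate supply-chain plans.
# Each plan is (cost, quality, service level): cost down, others up.

def dominates(a, b):
    """True if plan a dominates b: no worse in every objective,
    strictly better in at least one."""
    no_worse = a[0] <= b[0] and a[1] >= b[1] and a[2] >= b[2]
    strictly = a[0] < b[0] or a[1] > b[1] or a[2] > b[2]
    return no_worse and strictly

def pareto_front(plans):
    """Keep only plans not dominated by any other plan."""
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

plans = [
    (100, 0.90, 0.95),   # cheap and good
    (120, 0.95, 0.90),   # pricier, higher quality
    (110, 0.85, 0.80),   # dominated by the first plan
    (150, 0.99, 0.99),   # expensive, best quality/service
]
front = pareto_front(plans)
print(front)
```

The subsequent AHP step would then rank the surviving front members by pairwise comparison of the decision maker's preferences; that ranking layer is omitted here.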
Principles of dynamical modularity in biological regulatory networks
Deritei, Dávid; Aird, William C.; Ercsey-Ravasz, Mária; Regan, Erzsébet Ravasz
2016-01-01
Intractable diseases such as cancer are associated with breakdown in multiple individual functions, which conspire to create unhealthy phenotype-combinations. An important challenge is to decipher how these functions are coordinated in health and disease. We approach this by drawing on dynamical systems theory. We posit that distinct phenotype-combinations are generated by interactions among robust regulatory switches, each in control of a discrete set of phenotypic outcomes. First, we demonstrate the advantage of characterizing multi-switch regulatory systems in terms of their constituent switches by building a multiswitch cell cycle model which points to novel, testable interactions critical for early G2/M commitment to division. Second, we define quantitative measures of dynamical modularity, namely that global cell states are discrete combinations of switch-level phenotypes. Finally, we formulate three general principles that govern the way coupled switches coordinate their function. PMID:26979940
Lab Demonstration of the Hybrid Doppler Wind Lidar (HDWL) Transceiver
NASA Technical Reports Server (NTRS)
Marx, Catherine T.; Gentry, Bruce; Jordan, Patrick; Dogoda, Peter; Faust, Ed; Kavaya, Michael
2013-01-01
The recommended design approach for the 3D Tropospheric Winds mission is a hybrid Doppler lidar which combines the best elements of both a coherent aerosol Doppler lidar operating at 2 microns and a direct detection molecular Doppler lidar operating at 0.355 microns. In support of the mission, we built a novel, compact, light-weighted multi-field-of-view transceiver where multiple telescopes are used to cover the required four fields of view. A small mechanism sequentially selects both the "transmit" and "receive" fields of view. The four fields are combined to stimulate both the 0.355 micron receiver and the 2 micron receiver. This version is scaled (0.2 m diameter aperture) from the space-based version but still demonstrates the feasibility of the hybrid approach. The primary mirrors were conventionally light-weighted and coated with dielectric, high-reflectivity coatings with high laser damage thresholds at both 2 microns and 0.355 microns. The mechanical structure and mounts were fabricated from composites to achieve dimensional stability while significantly reducing the mass. In the laboratory, we demonstrated the system-level functionality at 0.355 microns and at 2 microns, raising the Technology Readiness Level (TRL) from 2 to 4.
NASA Astrophysics Data System (ADS)
Huang, C. L.; Hsu, N. S.; Yeh, W. W. G.; Hsieh, I. H.
2017-12-01
This study develops an innovative calibration method for regional groundwater modeling using multi-class empirical orthogonal functions (EOFs). The developed method is an iterative approach. Prior to carrying out the iterative procedures, the groundwater storage hydrographs associated with the observation wells are calculated. The combined multi-class EOF amplitudes and EOF expansion coefficients of the storage hydrographs are then used to compute the initial guess of the temporal and spatial patterns of the multiple recharges. The initial guess of the hydrogeological parameters is assigned according to in-situ pumping experiments. The recharges include net rainfall recharge and boundary recharge, and the hydrogeological parameters are riverbed leakage conductivity, horizontal hydraulic conductivity, vertical hydraulic conductivity, storage coefficient, and specific yield. The first step of the iterative algorithm is to run the numerical model (i.e., MODFLOW) with the initial or adjusted values of the recharges and parameters. Second, in order to determine the best EOF combination of the error storage hydrographs for computing the correction vectors, the objective function is devised as minimizing the root mean square error (RMSE) of the simulated storage hydrographs. The error storage hydrographs are the differences between the storage hydrographs computed from observed and simulated groundwater level fluctuations. Third, the values of the recharges and parameters are adjusted and the procedure is repeated until the stopping criterion is reached. The established methodology was applied to the groundwater system of the Ming-Chu Basin, Taiwan. The study period is from January 1st to December 2nd, 2012. Results showed that the optimal EOF combination for the multiple recharges and hydrogeological parameters can decrease the RMSE of the simulated storage hydrographs dramatically within three calibration iterations.
This demonstrates that the iterative approach using EOF techniques can capture the groundwater flow tendency and detect the correction vectors of the simulated error sources. Hence, the established EOF-based methodology can effectively and accurately identify the multiple recharges and hydrogeological parameters.
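A minimal sketch of the EOF machinery the calibration relies on, using invented hydrographs: each column of the data matrix is one well's storage hydrograph, and a singular value decomposition yields the spatial patterns (EOFs) and temporal expansion coefficients. The data, well count, and RMSE objective below are illustrative only:

```python
# EOF decomposition of storage hydrographs via SVD, plus the RMSE
# objective used to compare observed and simulated hydrographs.
import numpy as np

rng = np.random.default_rng(0)
n_time, n_wells = 120, 5
t = np.linspace(0, 4 * np.pi, n_time)
# Hypothetical hydrographs: a shared seasonal signal scaled per well, plus noise
X = np.outer(np.sin(t), rng.uniform(0.5, 1.5, n_wells))
X += 0.05 * rng.standard_normal((n_time, n_wells))

Xc = X - X.mean(axis=0)              # remove the temporal mean per well
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
eofs = Vt                            # spatial patterns, one row per mode
amplitudes = U * s                   # temporal expansion coefficients

# Fraction of variance explained by the leading mode
explained = s[0] ** 2 / np.sum(s ** 2)
print(f"leading EOF explains {explained:.1%} of variance")

def rmse(observed, simulated):
    """Calibration objective: RMSE between hydrographs."""
    return np.sqrt(np.mean((observed - simulated) ** 2))

print(rmse(Xc, amplitudes @ eofs))   # full reconstruction: error near zero
```

Truncating `amplitudes @ eofs` to the leading modes gives the reduced "EOF combination" whose RMSE against observations the calibration iterates on.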
The Development of Multi-Level Audio-Visual Teaching Aids for Earth Science.
ERIC Educational Resources Information Center
Pitt, William D.
The project consisted of making a multi-level teaching film titled "Rocks and Minerals of the Ouachita Mountains," which runs for 25 minutes and is in color. The film was designed to be interesting to earth science students from junior high to college, and consists of dialogue combined with motion pictures of charts, sequential diagrams, outcrops,…
Sauer, J; Darioly, A; Mast, M Schmid; Schmid, P C; Bischof, N
2010-11-01
The article proposes a multi-level approach for evaluating communication skills training (CST) as an important element of crew resource management (CRM) training. Within this methodological framework, the present work examined the effectiveness of CST in matching or mismatching team compositions with regard to hierarchical status and competence. There is little experimental research that evaluated the effectiveness of CRM training at multiple levels (i.e. reaction, learning, behaviour) and in teams composed of members of different status and competence. An experiment with a two (CST: with vs. without) by two (competence/hierarchical status: congruent vs. incongruent) design was carried out. A total of 64 participants were trained for 2.5 h on a simulated process control environment, with the experimental group being given 45 min of training on receptiveness and influencing skills. Prior to the 1-h experimental session, participants were assigned to two-person teams. The results showed overall support for the use of such a multi-level approach of training evaluation. Stronger positive effects of CST were found for subjective measures than for objective performance measures. STATEMENT OF RELEVANCE: This work provides some guidance for the use of a multi-level evaluation of CRM training. It also emphasises the need to collect objective performance data for training evaluation in addition to subjective measures with a view to gain a more accurate picture of the benefits of such training approaches.
Aguirre-Rubí, J; Luna-Acosta, A; Ortiz-Zarragoitia, M; Zaldibar, B; Izagirre, U; Ahrens, M J; Villamil, L; Marigómez, I
2018-03-15
This investigation was aimed at contributing to the development of a suitable multi-biomarker approach for pollution monitoring in mangrove-lined Caribbean coastal systems, using the mangrove cupped oyster, Crassostrea rhizophorae, as sentinel species. A pilot field study was carried out in 8 localities (3 in Nicaragua; 5 in Colombia), characterized by different environmental conditions and subjected to different levels and types of pollution. Samples were collected in the rainy and dry seasons of 2012-2013. The biological effects at different levels of biological complexity (Stress-on-Stress response, reproduction, condition index, tissue-level biomarkers and histopathology) were determined as indicators of health disturbance, integrated as the IBR/n index, and compared with tissue burdens of contaminants in order to achieve an integrative biomonitoring approach. Though modulated by natural variables and confounding factors, different indicators of oyster health, alone and in combination, were related to the different profiles and low-to-moderate levels of contaminants present. Different mixtures of persistent (As, Cd, PAHs) and emerging chemical pollutants (musk fragrances), in combination with different levels of organic and particulate matter resulting from seasonal oceanographic variability and sewage discharges, and environmental factors (salinity, temperature), elicited different degrees of disturbance in ecosystem health condition, as reflected in the sentinel C. rhizophorae. As a result, IBR/n was correlated with pollution indices, even though the levels of biological indicators of health disturbance and pollutants were low-to-moderate, and seasonality and the incidence of confounding factors were remarkable. Our study supports the use of simple methodological approaches to diagnose anomalies in the health status of oysters from different localities, to identify potential causative agents, and to reflect disturbances in ecosystem health.
Consequently, the straightforward methodological approach used herein is useful for the assessment of health disturbance in a variety of mangrove-lined Caribbean coastal systems using mangrove cupped oysters as sentinel species.
NASA Astrophysics Data System (ADS)
Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.
2012-04-01
Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations.
For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from the parametric uncertainty. To quantify the conceptual uncertainty for a given site, we combine the outputs from the different conceptual models using Bayesian model averaging. The weight for each model is obtained by integrating available data and expert knowledge using Bayesian belief networks. The multi-model approach is applied to a contaminated site. At the site a DNAPL (dense non-aqueous phase liquid) spill consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlying a limestone aquifer. The exact shape and nature of the source is unknown, and so is the importance of transport in the fractures. The result of the multi-model approach is a visual representation of the uncertainty of the mass discharge estimates for the site, which can be used to support the management options.
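The model-averaging step can be illustrated schematically. The model weights and per-model Monte Carlo samples below are invented, not taken from the site study; in the paper the weights come from Bayesian belief networks:

```python
# Bayesian model averaging over conceptual models: each model contributes
# its own Monte Carlo distribution of mass discharge, mixed by model weight.
import random
random.seed(1)

# Hypothetical posterior model weights (stand-in for the belief-network output)
weights = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}

# Hypothetical per-model Monte Carlo samples of mass discharge (kg/yr)
samples = {
    "model_A": [random.gauss(10, 2) for _ in range(1000)],
    "model_B": [random.gauss(15, 4) for _ in range(1000)],
    "model_C": [random.gauss(6, 1) for _ in range(1000)],
}

def bma_mean(samples, weights):
    """Mixture mean: weighted average of the per-model sample means."""
    return sum(w * (sum(samples[m]) / len(samples[m]))
               for m, w in weights.items())

def bma_draw(samples, weights):
    """One draw from the mixture: pick a model by weight, then a sample."""
    m = random.choices(list(weights), weights=list(weights.values()))[0]
    return random.choice(samples[m])

print(f"BMA mean discharge ~ {bma_mean(samples, weights):.1f} kg/yr")
```

Repeated `bma_draw` calls build the mixed uncertainty distribution whose spread reflects both parametric (within-model) and conceptual (between-model) uncertainty.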
Automated diagnosis of interstitial lung diseases and emphysema in MDCT imaging
NASA Astrophysics Data System (ADS)
Fetita, Catalin; Chang Chien, Kuang-Che; Brillet, Pierre-Yves; Prêteux, Françoise
2007-09-01
Diffuse lung diseases (DLD) include a heterogeneous group of non-neoplastic diseases resulting from damage to the lung parenchyma by varying patterns of inflammation. Characterization and quantification of DLD severity using MDCT, mainly in interstitial lung diseases and emphysema, is an important issue in clinical research for the evaluation of new therapies. This paper develops a 3D automated approach for detection and diagnosis of diffuse lung diseases such as fibrosis/honeycombing, ground glass and emphysema. The proposed methodology combines multi-resolution 3D morphological filtering (exploiting the sup-constrained connection cost operator) and graph-based classification for a full characterization of the parenchymal tissue. The morphological filtering performs a multi-level segmentation of the low- and medium-attenuated lung regions as well as their classification with respect to a granularity criterion (multi-resolution analysis). The original intensity range of the CT data volume is thus reduced in the segmented data to a number of levels equal to the resolution depth used (generally ten levels). The specificity of such morphological filtering is to extract tissue patterns locally contrasting with their neighborhood and of size inferior to the resolution depth, while preserving their original shape. A multi-valued hierarchical graph describing the segmentation result is built up according to the resolution level and the adjacency of the different segmented components. The graph nodes are then enriched with the textural information carried by their associated components. A graph analysis and reorganization based on the node attributes delivers the final classification of the lung parenchyma into normal and ILD/emphysematous regions. It also makes it possible to discriminate between different types, or development stages, among the same class of diseases.
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valued information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven industrial process parameter model from mechanical vibration and acoustic frequency spectra, based on selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
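One simple variant of adaptive weighted fusion (a sketch, not necessarily the authors' exact MLSEN scheme) weights each selected sub-model by the inverse of its validation error before summing their outputs. The sub-model errors and predictions below are invented:

```python
# Adaptive weighted fusion of selective-ensemble sub-model outputs:
# weights inversely proportional to each sub-model's validation RMSE.

def adaptive_weights(errors):
    """Normalize inverse validation errors into fusion weights."""
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return [w / total for w in inv]

def fuse(predictions, weights):
    """Weighted sum of the selected sub-model predictions."""
    return sum(w * p for w, p in zip(weights, predictions))

# Hypothetical: three selected sub-models predicting a mill-load parameter
val_rmse = [0.8, 0.4, 1.6]            # validation errors per sub-model
w = adaptive_weights(val_rmse)
print([round(x, 3) for x in w])       # best sub-model gets the largest weight

preds = [52.0, 50.0, 58.0]
print(round(fuse(preds, w), 2))
```

In the paper's two-layer setup this fusion is applied once inside each feature-subset ensemble and again across the inside-layer sub-models.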
Distributed and Lumped Parameter Models for the Characterization of High Throughput Bioreactors
Conoscenti, Gioacchino; Cutrì, Elena; Tuan, Rocky S.; Raimondi, Manuela T.; Gottardi, Riccardo
2016-01-01
Next generation bioreactors are being developed to generate multiple human cell-based tissue analogs within the same fluidic system, to better recapitulate the complexity and interconnection of human physiology [1, 2]. The effective development of these devices requires a solid understanding of their interconnected fluidics, to predict the transport of nutrients and waste through the constructs and improve the design accordingly. In this work, we focus on a specific model of bioreactor, with multiple input/outputs, aimed at generating osteochondral constructs, i.e., a biphasic construct in which one side is cartilaginous in nature, while the other is osseous. We next develop a general computational approach to model the microfluidics of a multi-chamber, interconnected system that may be applied to human-on-chip devices. This objective requires overcoming several challenges at the level of computational modeling. The main one consists of addressing the multi-physics nature of the problem that combines free flow in channels with hindered flow in porous media. Fluid dynamics is also coupled with advection-diffusion-reaction equations that model the transport of biomolecules throughout the system and their interaction with living tissues and constructs. Ultimately, we aim at providing a predictive approach useful for the general organ-on-chip community. To this end, we have developed a lumped parameter approach that allows us to analyze the behavior of multi-unit bioreactor systems with modest computational effort, provided that the behavior of a single unit can be fully characterized. PMID:27669413
A Novel Fiber Optic Based Surveillance System for Prevention of Pipeline Integrity Threats.
Tejedor, Javier; Macias-Guarasa, Javier; Martins, Hugo F; Piote, Daniel; Pastor-Graells, Juan; Martin-Lopez, Sonia; Corredera, Pedro; Gonzalez-Herraez, Miguel
2017-02-12
This paper presents a novel surveillance system aimed at the detection and classification of threats in the vicinity of a long gas pipeline. The sensing system is based on phase-sensitive optical time domain reflectometry (ϕ-OTDR) technology for signal acquisition and pattern recognition strategies for threat identification. The proposal incorporates contextual information at the feature level and applies a system combination strategy for pattern classification. The contextual information at the feature level is based on the tandem approach (using feature representations produced by discriminatively-trained multi-layer perceptrons) and employs feature vectors that span different temporal contexts. The system combination strategy is based on a posterior combination of likelihoods computed from different pattern classification processes. The system operates in two different modes: (1) machine + activity identification, which recognizes the activity being carried out by a certain machine, and (2) threat detection, aimed at detecting threats regardless of the actual activity being conducted. In comparison with a previous system based on the same rigorous experimental setup, the results show that the system combination from the contextual feature information improves the results for each individual class in both operational modes, as well as the overall classification accuracy, with statistically significant improvements.
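The posterior combination of likelihoods described above can be sketched as a weighted log-linear product of per-class posteriors from several classifiers; the class names and weights below are hypothetical illustrations, not taken from the paper.

```python
import math

def combine_posteriors(systems, weights=None):
    """Log-linear (weighted-product) combination of per-class posteriors
    from several classifiers, renormalized so the result sums to one."""
    weights = weights or [1.0 / len(systems)] * len(systems)
    classes = list(systems[0])
    # weighted sum of log-posteriors = log of the weighted product
    log_scores = {c: sum(w * math.log(p[c]) for w, p in zip(weights, systems))
                  for c in classes}
    m = max(log_scores.values())            # subtract max for numerical safety
    unnorm = {c: math.exp(s - m) for c, s in log_scores.items()}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}
```

The decision is then the class with the highest combined posterior; averaging in the log domain rewards classes on which the component systems agree.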
Characterizing multi-pollutant air pollution in China: Comparison of three air quality indices.
Hu, Jianlin; Ying, Qi; Wang, Yungang; Zhang, Hongliang
2015-11-01
Multi-pollutant air pollution (i.e., several pollutants reaching very high concentrations simultaneously) frequently occurs in many regions across China. The air quality index (AQI) is used worldwide to inform the public about levels of air pollution and the associated health risks. The current AQI approach used in China is based on the maximum value of the individual pollutants and does not consider the combined health effects of exposure to multiple pollutants. In this study, two novel alternative indices, the aggregate air quality index (AAQI) and the health-risk based air quality index (HAQI), were calculated based on data collected in six megacities of China (Beijing, Shanghai, Guangzhou, Shijiazhuang, Xi'an, and Wuhan) from 2013 to 2014. Both the AAQI and the HAQI take into account the combined health effects of various pollutants, and the HAQI considers the exposure (or concentration)-response relationships of the pollutants. The AAQI and HAQI were compared to the AQI to examine the effectiveness of the current AQI in characterizing multi-pollutant air pollution in China. The AAQI and HAQI values are higher than the AQI on days when two or more pollutants simultaneously exceed the Chinese Ambient Air Quality Standards (CAAQS) 24-hour Grade II standards. A comparison of the risk categories assigned by the three indices indicates that the current AQI approach underestimates the severity of the health risk associated with exposure to multi-pollutant air pollution. For the AQI-based risk category of 'unhealthy', 96% and 80% of the days would be 'very unhealthy' or 'hazardous' if based on the AAQI and HAQI, respectively; and for the AQI-based risk category of 'very unhealthy', 67% and 75% of the days would be 'hazardous' if based on the AAQI and HAQI, respectively. The results suggest that the general public, especially sensitive population groups such as children and the elderly, should take more stringent actions than those currently suggested based on the AQI approach during high air pollution events. Sensitivity studies were conducted to examine the assumptions used in the AAQI and HAQI approaches. The results show that the AAQI is sensitive to the choice of the pollutant-irrelevant constant, while the HAQI is sensitive to the choice of both the threshold values and the pollutants included in the total risk calculation.
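The contrast between a maximum-based index and an aggregate-style index can be illustrated with a toy computation. The breakpoint table and the aggregation weight `gamma` below are invented stand-ins, not the AAQI/HAQI formulas or the CAAQS breakpoints from the study.

```python
def sub_index(conc, breakpoints):
    """Standard piecewise-linear AQI sub-index interpolation.
    breakpoints: list of (conc_lo, conc_hi, index_lo, index_hi) tuples."""
    for c_lo, c_hi, i_lo, i_hi in breakpoints:
        if c_lo <= conc <= c_hi:
            return i_lo + (i_hi - i_lo) * (conc - c_lo) / (c_hi - c_lo)
    raise ValueError("concentration outside breakpoint table")

def current_aqi(sub_indices):
    # maximum-based index: only the worst pollutant is reported
    return max(sub_indices)

def aggregate_aqi(sub_indices, gamma=0.2):
    # aggregate-style index: the worst sub-index plus a weighted sum of the
    # rest, so simultaneous exceedances raise the reported index
    ordered = sorted(sub_indices, reverse=True)
    return ordered[0] + gamma * sum(ordered[1:])
```

On a day with sub-indices 150, 120, and 80, the maximum-based index reports 150, while the aggregate version reports 190, reflecting the combined burden of several elevated pollutants.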
A Multi-Modal Face Recognition Method Using Complete Local Derivative Patterns and Depth Maps
Yin, Shouyi; Dai, Xu; Ouyang, Peng; Liu, Leibo; Wei, Shaojun
2014-01-01
In this paper, we propose a multi-modal 2D + 3D face recognition method for a smart city application based on a Wireless Sensor Network (WSN) and various kinds of sensors. Depth maps are exploited for the 3D face representation. For feature extraction, we propose a new feature called the Complete Local Derivative Pattern (CLDP). It adopts the idea of layering and has four layers. In the whole system, we apply CLDP separately to Gabor features extracted from the 2D image and from the depth map, obtaining two features: CLDP-Gabor and CLDP-Depth. The two features, weighted by the corresponding coefficients, are combined at the decision level to compute the total classification distance. Finally, the probe face is assigned the identity with the smallest classification distance. Extensive experiments are conducted on three different databases. The results demonstrate the robustness and superiority of the new approach. The experimental results also show that the proposed multi-modal 2D + 3D method is superior to other multi-modal ones and that CLDP performs better than other Local Binary Pattern (LBP) based features. PMID:25333290
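The decision-level fusion step can be sketched in a few lines: a weighted sum of the per-modality distances, followed by a nearest-identity decision. The weights and distance values below are hypothetical, not the coefficients from the paper.

```python
def fused_distance(d_gabor, d_depth, w_gabor=0.6, w_depth=0.4):
    """Decision-level fusion: weighted sum of the CLDP-Gabor and
    CLDP-Depth classification distances (illustrative weights)."""
    return w_gabor * d_gabor + w_depth * d_depth

def identify(probe_distances, gallery_ids):
    """Assign the probe to the gallery identity with the smallest
    fused classification distance."""
    fused = [fused_distance(dg, dd) for dg, dd in probe_distances]
    return gallery_ids[fused.index(min(fused))]
```

In practice the weights would be tuned on a validation set so the more discriminative modality dominates the decision.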
Le, Bao; Powers, Ginny L; Tam, Yu Tong; Schumacher, Nicholas; Malinowski, Rita L; Steinke, Laura; Kwon, Glen; Marker, Paul C
2017-01-01
Advanced prostate cancers that are resistant to all current therapies create a need for new therapeutic strategies. One recent innovative approach to cancer therapy is the simultaneous use of multiple FDA-approved drugs to target multiple pathways. A challenge for this approach arises from the different solubility requirements of each individual drug, resulting in the need for a drug vehicle that is non-toxic and capable of carrying multiple water-insoluble antitumor drugs. Micelles have recently been shown to be promising candidate drug solubilizers for anticancer therapy. This study set out to examine the potential use of multi-drug loaded micelles for prostate cancer treatment in preclinical models, including cell line and mouse models of prostate cancers with Pten deletions. Specifically, the antimitotic agent docetaxel, the mTOR inhibitor rapamycin, and the HSP90 inhibitor 17-N-allylamino-17-demethoxygeldanamycin were incorporated into the micelle system (DR17) and tested for antitumor efficacy. In vitro growth inhibition of prostate cancer cells was greater when all three drugs were used in combination than with each individual drug, and packaging the drugs into micelles enhanced the cytotoxic effects. At the molecular level, DR17 simultaneously targeted several signaling axes important in prostate cancer, including the androgen receptor, mTOR, and PI3K/AKT. In a mouse genetic model of prostate cancer, DR17 treatment decreased prostate weight, which was achieved both by increasing caspase-dependent cell death and by decreasing cell proliferation. Similar effects were also observed when DR17 was administered to nude mice bearing prostate cancer cell xenografts. These results suggest that combining these three cancer drugs in multi-drug loaded micelles may be a promising strategy for prostate cancer therapy.
NASA Astrophysics Data System (ADS)
Phenglengdi, Butsari
This research evaluates the use of a molecular level visualisation approach in Thai secondary schools. The goal is to obtain insights about the usefulness of this approach, and to examine possible improvements in how the approach might be applied in the future. The research methodology combined qualitative and quantitative approaches. Data were collected in the form of pre- and post-intervention multiple choice questions, open-ended questions, drawing exercises, one-to-one interviews and video recordings of class activity. The research was conducted in two phases, involving a total of 261 students from the 11th Grade in Thailand. The use of VisChem animations in three studies was evaluated in Phase I. Study 1 was a pilot study exploring the benefits of incorporating VisChem animations to portray the molecular level. Study 2 compared test results between students exposed to these animations of molecular level events and those not exposed. Finally, in Study 3, test results were gathered from different types of schools (a rural school, a city school, and a university school). The results showed that students (and teachers) had misconceptions at the molecular level, and that VisChem animations could help students understand chemistry concepts at the molecular level across all three types of schools. While the animation treatment group had a better score on the topic of states of water, the non-animation treatment group had a better score on the topic of dissolving sodium chloride in water. The molecular level visualisation approach as a learning design was evaluated in Phase II. This approach involved a combination of VisChem animations, pictures, and diagrams together with the seven-step VisChem learning design.
The study involved three classes of students, each with a different treatment: Class A - traditional approach; Class B - VisChem animations with the traditional approach; and Class C - molecular level visualisation approach. Pre-test and post-test scores were compared across the three classes. The results from the multiple choice and calculation tests showed that the Class C (molecular level visualisation approach) group demonstrated a deeper understanding of chemistry concepts than students in Classes A and B. However, the results showed that all the students were unable to perform satisfactorily on the calculation tests because they had insufficient prior knowledge about stoichiometry to connect with the new knowledge. In the drawing tests, the students exposed to the molecular level visualisation approach had a better mental model than the other classes, albeit with some remaining misconceptions. The findings highlight the intersecting nature of the teacher, the student, and modelling in chemistry teaching. Use of a multi-step molecular level visualisation approach that encourages observation, reflection on prior understanding, and multiple opportunities for viewing (using various visualisation elements) is a key element leading to a deeper understanding of chemistry. Presentation of the multi-step molecular level visualisation approach must be coupled with careful consideration of student prior knowledge, and with adequate guidance from a teacher who understands the topics at a deep level.
Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel; Malamud, Bruce D.
2016-04-01
Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
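A network of triggering relationships of the kind described above can be represented as a directed graph, and a cascade explored by a simple traversal. The hazards and edges below are illustrative examples, not a case study from the paper.

```python
def cascade(triggers, start):
    """Follow triggering relationships outward from an initial hazard
    (simple depth-first traversal of the interaction network).
    triggers: dict mapping a hazard to the hazards it can trigger."""
    reached, frontier = {start}, [start]
    while frontier:
        hazard = frontier.pop()
        for nxt in triggers.get(hazard, []):
            if nxt not in reached:
                reached.add(nxt)
                frontier.append(nxt)
    return reached
```

A 'multi-layer single hazard' assessment would consider each node in isolation; the traversal makes explicit the additional hazards reachable through interactions.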
Language Model Combination and Adaptation Using Weighted Finite State Transducers
NASA Technical Reports Server (NTRS)
Liu, X.; Gales, M. J. F.; Hieronymus, J. L.; Woodland, P. C.
2010-01-01
In speech recognition systems, language models (LMs) are often constructed by training and combining multiple n-gram models. These can either represent different genres or tasks found in diverse text sources, or capture the stochastic properties of different linguistic symbol sequences, for example, syllables and words. Unsupervised LM adaptation may also be used to further improve robustness to varying styles or tasks. When using these techniques, extensive software changes are often required. In this paper an alternative and more general approach based on weighted finite state transducers (WFSTs) is investigated for LM combination and adaptation. As it is entirely based on well-defined WFST operations, minimal change to decoding tools is needed. A wide range of LM combination configurations can be flexibly supported. An efficient on-the-fly WFST decoding algorithm is also proposed. Significant error rate gains of 7.3% relative were obtained on a state-of-the-art broadcast audio recognition task using a history-dependently adapted multi-level LM modelling both syllable and word sequences.
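Composing WFSTs requires an FST toolkit, but the LM combination being encoded can be sketched as linear interpolation of the component model probabilities. The toy bigram tables, probabilities, and interpolation weights below are invented for illustration.

```python
def interpolate_lms(lms, weights, floor=1e-10):
    """Linear interpolation of n-gram models: P(w|h) = sum_i w_i * P_i(w|h).
    Each model is a dict mapping (history, word) -> probability; unseen
    n-grams are backed off to a small floor probability."""
    def prob(history, word):
        return sum(w * lm.get((history, word), floor)
                   for lm, w in zip(lms, weights))
    return prob
```

In the WFST formulation the same weighted combination is realized by standard transducer operations (union, composition, weight pushing), so the decoder itself needs no modification.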
Kim, Ha Yeon; Cappella, Elise
2016-03-01
Understanding the social context of classrooms has been a central goal of research focused on the promotion of academic development. Building on the current literature on classroom social settings and guided by a risk and protection framework, this study examines the unique and combined contributions of individual relationships and the quality of classroom interactions to behavioral engagement among low-income Latino students in kindergarten to fifth grade (N = 111). Findings indicate that individual relationships with teachers and peers and classroom quality each independently predicted behavioral engagement. Moreover, high-quality classrooms buffered the negative influence of students' difficulties in individual relationships on behavioral engagement. Findings illuminate the need to consider multiple layers of classroom social relationships and interactions and suggest the potential benefit of targeting classroom quality as a mechanism for improving behavioral engagement in urban elementary schools.
A Framework for Low-Cost Multi-Platform VR and AR Site Experiences
NASA Astrophysics Data System (ADS)
Wallgrün, J. O.; Huang, J.; Zhao, J.; Masrur, A.; Oprean, D.; Klippel, A.
2017-11-01
Low-cost consumer-level immersive solutions have the potential to revolutionize education and research in many fields by providing virtual experiences of sites that are either inaccessible, too dangerous, or too expensive to visit, or by augmenting in-situ experiences using augmented and mixed reality methods. We present our approach for creating low-cost multi-platform virtual and augmented reality site experiences of real world places for education and research purposes, making extensive use of Structure-from-Motion methods as well as 360° photography and videography. We discuss several example projects, for the Mayan City of Cahal Pech, Iceland's Thrihnukar volcano, the Santa Marta informal settlement in Rio, and for the Penn State Campus, and we propose a framework for creating and maintaining such applications by combining declarative content specification methods with a central linked-data based spatio-temporal information system.
Application of multi-grid methods for solving the Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Demuren, A. O.
1989-01-01
The application of a class of multi-grid methods to the solution of the Navier-Stokes equations for two-dimensional laminar flow problems is discussed. The methods consist of combining the full approximation scheme-full multi-grid technique (FAS-FMG) with point-, line-, or plane-relaxation routines for solving the Navier-Stokes equations in primitive variables. The performance of the multi-grid methods is compared to that of several single-grid methods. The results show that much faster convergence can be procured through the use of the multi-grid approach than through the various suggestions for improving single-grid methods. The importance of the choice of relaxation scheme for the multi-grid method is illustrated.
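For a linear model problem the full approximation scheme reduces to the standard correction scheme, so the core multi-grid idea can be sketched compactly. The following is an illustrative 1D Poisson V-cycle with weighted-Jacobi relaxation standing in for the point-relaxation routines; it is our own sketch, not the paper's Navier-Stokes solver.

```python
import numpy as np

def smooth(u, f, h, sweeps, omega=2.0/3.0):
    """Weighted-Jacobi relaxation for -u'' = f on a uniform grid."""
    for _ in range(sweeps):
        u_new = 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
        u[1:-1] += omega * (u_new - u[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def v_cycle(u, f, h, pre=3, post=3):
    """One multi-grid V-cycle: smooth, restrict the residual, recurse on the
    coarse grid, prolongate the correction, smooth again."""
    n = len(u) - 1                       # number of intervals (power of two)
    if n <= 2:                           # coarsest grid: solve directly
        u[1] = 0.5 * (u[0] + u[2] + h * h * f[1])
        return u
    smooth(u, f, h, pre)
    r = residual(u, f, h)
    rc = np.zeros(n // 2 + 1)            # restrict residual (full weighting)
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h, pre, post)
    e = np.zeros_like(u)                 # prolongate correction (linear interp.)
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    u += e
    return smooth(u, f, h, post)
```

A handful of cycles drives the algebraic error below the discretization error at a cost proportional to the number of unknowns, which is the speed-up over single-grid relaxation discussed above.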
Extending the Multi-level Method for the Simulation of Stochastic Biological Systems.
Lester, Christopher; Baker, Ruth E; Giles, Michael B; Yates, Christian A
2016-08-01
The multi-level method for discrete-state systems, first introduced by Anderson and Higham (SIAM Multiscale Model Simul 10(1):146-179, 2012), is a highly efficient simulation technique that can be used to elucidate statistical characteristics of biochemical reaction networks. A single point estimator is produced in a cost-effective manner by combining a number of estimators of differing accuracy in a telescoping sum, and, as such, the method has the potential to revolutionise the field of stochastic simulation. In this paper, we present several refinements of the multi-level method which render it easier to understand and implement, and also more efficient. Given the substantial and complex nature of the multi-level method, the first part of this work reviews existing literature, with the aim of providing a practical guide to the use of the multi-level method. The second part provides the means for a deft implementation of the technique and concludes with a discussion of a number of open problems.
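The telescoping-sum estimator at the heart of the method can be sketched generically: the level-0 estimator is corrected by sample averages of differences between adjacent levels, with fine and coarse samples coupled through shared randomness. The toy sampler and sample counts in the test are invented for illustration.

```python
import random

def mlmc(sampler, levels, samples_per_level):
    """Multi-level Monte Carlo telescoping estimator:
    E[P_L] = E[P_0] + sum_{l=1..L} E[P_l - P_{l-1}].
    sampler(level, u) returns one realization of the level-l estimator
    driven by the shared random input u, which couples the levels so the
    correction terms have small variance."""
    estimate = 0.0
    for level, n in zip(range(levels + 1), samples_per_level):
        acc = 0.0
        for _ in range(n):
            u = random.random()                    # shared randomness
            fine = sampler(level, u)
            coarse = sampler(level - 1, u) if level > 0 else 0.0
            acc += fine - coarse
        estimate += acc / n
    return estimate
```

Because the corrections shrink with the level, most samples can be taken cheaply at level 0 and only a few at the expensive fine levels, which is the source of the cost saving.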
NASA Astrophysics Data System (ADS)
Khan, Faisal; Enzmann, Frieder; Kersten, Michael
2016-03-01
Image processing of X-ray-computed polychromatic cone-beam micro-tomography (μXCT) data of geological samples mainly involves artefact reduction and phase segmentation. For the former, the main beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. A MATLAB code for this approach is provided in the Appendix. The final BH-corrected image is extracted from the residual data, or from the difference between the surface elevation values and the original grey-scale values. For the segmentation, we propose a novel least-squares support vector machine (LS-SVM, an algorithm for pixel-based multi-phase classification) approach. A receiver operating characteristic (ROC) analysis was performed on BH-corrected and uncorrected samples to show that BH correction is in fact an important prerequisite for accurate multi-phase classification. The combination of the two approaches was then used to successfully classify three multi-phase rock core samples of differing complexity.
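The published code is in MATLAB; a rough Python sketch of the best-fit quadratic surface idea (an illustrative re-implementation under our own assumptions, not the Appendix code) fits the six-term surface by least squares and keeps the residual as the corrected slice:

```python
import numpy as np

def remove_beam_hardening(img):
    """Fit a quadratic surface z = a + bx + cy + dx^2 + exy + fy^2 to a
    reconstructed slice and return the residual plus the mean grey level
    (a BH-corrected slice, sketched)."""
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx].astype(float)
    A = np.column_stack([np.ones(img.size), x.ravel(), y.ravel(),
                         (x * x).ravel(), (x * y).ravel(), (y * y).ravel()])
    coeffs, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    surface = (A @ coeffs).reshape(img.shape)
    return img - surface + surface.mean()   # keep the mean grey level
```

On a slice whose low-frequency cupping is approximately quadratic, the fit absorbs the artefact while the phase contrast survives in the residual.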
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orphan, Victoria; Tyson, Gene; Meile, Christof
The global biological CH4 cycle is largely controlled through coordinated and often intimate microbial interactions between archaea and bacteria, the majority of which are still unknown or have been only cursorily identified. Members of the methanotrophic archaea, aka 'ANME', are believed to play a major role in the cycling of methane in anoxic environments coupled to sulfate, nitrate, and possibly iron and manganese oxides, frequently forming diverse physical and metabolic partnerships with a range of bacteria. The thermodynamic challenges overcome by the ANME and their bacterial partners and corresponding slow rates of growth are common characteristics in anaerobic ecosystems, and, in stark contrast to most cultured microorganisms, this type of energy and resource limited microbial lifestyle is likely the norm in the environment. While we have gained an in-depth systems level understanding of fast-growing, energy-replete microorganisms, comparatively little is known about the dynamics of cell respiration, growth, protein turnover, gene expression, and energy storage in the slow-growing microbial majority. These fundamental properties, combined with the observed metabolic and symbiotic versatility of methanotrophic ANME, make these cooperative microbial systems a relevant (albeit challenging) system to study and for which to develop and optimize culture-independent methodologies, which enable a systems-level understanding of microbial interactions and metabolic networks. We used an integrative systems biology approach to study anaerobic sediment microcosms and methane-oxidizing bioreactors and expanded our understanding of the methanotrophic ANME archaea, their interactions with physically-associated bacteria, ecophysiological characteristics, and underlying genetic basis for cooperative microbial methane-oxidation linked with different terminal electron acceptors.
Our approach is inherently multi-disciplinary and multi-scaled, combining transcriptional and proteomic analyses with high resolution microscopy techniques, and stable isotopic and chemical analyses that span community level 'omics investigations (cm scale) to interspecies consortia (µm scale), to the individual cell and its subcellular components (nm scale). We have organized our methodological approach into three broad categories, RNA-based, Protein-targeted and Geochemical, each encompassing a range of scales, with many techniques and resulting datasets that are highly complementary with one another, and together, offer a unique systems-level perspective of methane-based microbial interactions.
The LET Procedure for Prosthetic Myocontrol: Towards Multi-DOF Control Using Single-DOF Activations.
Nowak, Markus; Castellini, Claudio
2016-01-01
Simultaneous and proportional myocontrol of dexterous hand prostheses is to a large extent still an open problem. With the advent of commercially and clinically available multi-fingered hand prostheses there are now more independent degrees of freedom (DOFs) in prostheses than can be effectively controlled using surface electromyography (sEMG), the current standard human-machine interface for hand amputees. In particular, it is uncertain, whether several DOFs can be controlled simultaneously and proportionally by exclusively calibrating the intended activation of single DOFs. The problem is currently solved by training on all required combinations. However, as the number of available DOFs grows, this approach becomes overly long and poses a high cognitive burden on the subject. In this paper we present a novel approach to overcome this problem. Multi-DOF activations are artificially modelled from single-DOF ones using a simple linear combination of sEMG signals, which are then added to the training set. This procedure, which we named LET (Linearly Enhanced Training), provides an augmented data set to any machine-learning-based intent detection system. In two experiments involving intact subjects, one offline and one online, we trained a standard machine learning approach using the full data set containing single- and multi-DOF activations as well as using the LET-augmented data set in order to evaluate the performance of the LET procedure. The results indicate that the machine trained on the latter data set obtains worse results in the offline experiment compared to the full data set. However, the online implementation enables the user to perform multi-DOF tasks with almost the same precision as single-DOF tasks without the need of explicitly training multi-DOF activations. Moreover, the parameters involved in the system are statistically uniform across subjects.
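The core of the LET procedure, synthesizing multi-DOF training samples as linear combinations of single-DOF sEMG patterns, can be sketched as follows. The DOF names, feature vectors, and the unit-weight sum are illustrative assumptions; the paper's actual linear combination may be weighted differently.

```python
import numpy as np

def let_augment(dof_patterns, combos):
    """Sketch of Linearly Enhanced Training: build artificial multi-DOF
    samples by summing single-DOF sEMG feature vectors.
    dof_patterns: dict mapping DOF name -> recorded single-DOF pattern.
    combos: iterable of tuples of DOF names to activate together.
    Returns (signal, activation-target) pairs to append to the training set."""
    dofs = sorted(dof_patterns)
    samples = []
    for combo in combos:
        signal = np.sum([dof_patterns[d] for d in combo], axis=0)
        target = np.array([1.0 if d in combo else 0.0 for d in dofs])
        samples.append((signal, target))
    return samples
```

The augmented pairs are then fed to any regression-based intent detector alongside the genuine single-DOF calibration data, sparing the subject from performing every combination explicitly.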
Improved blood glucose estimation through multi-sensor fusion.
Xiong, Feiyu; Hipszer, Brian R; Joseph, Jeffrey; Kam, Moshe
2011-01-01
Continuous glucose monitoring systems are an integral component of diabetes management. Efforts to improve the accuracy and robustness of these systems are at the forefront of diabetes research. Towards this goal, a multi-sensor approach was evaluated in hospitalized patients. In this paper, we report on a multi-sensor fusion algorithm to combine glucose sensor measurements in a retrospective fashion. The results demonstrate the algorithm's ability to improve the accuracy and robustness of the blood glucose estimation with current glucose sensor technology.
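One standard fusion rule for combining simultaneous readings is inverse-variance weighting, the minimum-variance linear combination of independent sensors. The abstract does not specify the algorithm used, so the following is a generic sketch with invented numbers, not the reported method.

```python
def fuse_sensors(measurements, variances):
    """Inverse-variance weighted fusion of simultaneous sensor readings:
    each measurement is weighted by 1/variance, so more reliable sensors
    dominate the fused estimate."""
    weights = [1.0 / v for v in variances]
    return sum(w * m for w, m in zip(weights, measurements)) / sum(weights)
```

For example, fusing readings of 100 and 110 mg/dL from sensors with variances 1.0 and 4.0 pulls the estimate toward the more reliable sensor.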
NASA Astrophysics Data System (ADS)
Yang, Ruitao; Pollinger, Florian; Meiners-Hagen, Karl; Krystek, Michael; Tan, Jiubin; Bosse, Harald
2015-08-01
We present a dual-comb-based heterodyne multi-wavelength absolute interferometer capable of long distance measurements. The phase information of the various comb modes is extracted in parallel by a multi-channel digital lock-in phase detection scheme. Several synthetic wavelengths of the same order are constructed and the corresponding phases are averaged to deduce the absolute lengths with significantly reduced uncertainty. Comparison experiments with an incremental HeNe reference interferometer show a combined relative measurement uncertainty of 5.3 × 10⁻⁷ at a measurement distance of 20 m. Combining the advantages of synthetic wavelength interferometry and dual-comb interferometry, our compact and simple approach provides sufficient precision for many industrial applications.
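The synthetic-wavelength idea can be sketched in two lines: two optical wavelengths beat to a much longer synthetic wavelength, whose phase yields an absolute distance with a correspondingly larger unambiguous range. The wavelength values are illustrative, and the half-wavelength factor assumes a reflective (round-trip) geometry.

```python
import math

def synthetic_wavelength(lam1, lam2):
    """Beat ('synthetic') wavelength of two optical wavelengths (m)."""
    return lam1 * lam2 / abs(lam1 - lam2)

def distance_from_phase(phase, n_periods, lam_synth):
    """Absolute distance from the synthetic-wavelength phase: the integer
    number of synthetic half-periods plus the fractional part recovered
    from the measured phase (round-trip geometry assumed)."""
    return 0.5 * lam_synth * (n_periods + phase / (2.0 * math.pi))
```

Averaging the phases of several synthetic wavelengths of the same order, as the paper does, reduces the phase noise and hence the distance uncertainty.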
Battling Wormy Apples in the Home Orchard Using a SOFT Approach
USDA-ARS?s Scientific Manuscript database
A program was developed for use by homeowners to control codling moth in backyard apple and pear trees. Coined SOFT (Selective Organic Fruit Tree), this management program uses a combination of granulosis virus, parasitic nematodes, and a trap and lure for females. This multi-tactic approach reduced...
Evaluation of Brazed Joints Using Failure Assessment Diagram
NASA Technical Reports Server (NTRS)
Flom, Yury
2012-01-01
A fitness-for-service approach was used to perform structural analysis of brazed joints consisting of several base metal / filler metal combinations. Failure Assessment Diagrams (FADs) based on tensile and shear stress ratios were constructed and experimentally validated. It was shown that such FADs can provide a conservative estimate of safe combinations of stresses in the brazed joints. Based on this approach, Margins of Safety (MS) of brazed joints subjected to multi-axial loading conditions can be evaluated.
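A margin-of-safety check against an FAD locus can be sketched as follows. The paper's locus was constructed from tensile and shear test data; the linear interaction locus and the stress values here are an illustrative conservative stand-in.

```python
def margin_of_safety(sigma, tau, sigma_ult, tau_ult):
    """Failure assessment with a linear interaction locus
    sigma/sigma_ult + tau/tau_ult = 1 (an illustrative conservative
    choice, not the experimentally fitted FAD). MS > 0 means the
    combined tensile and shear stresses lie inside the safe region."""
    load_ratio = sigma / sigma_ult + tau / tau_ult
    return 1.0 / load_ratio - 1.0
```

For a joint loaded to a quarter of its tensile and a quarter of its shear capacity, the load point sits at half the distance to the locus, giving MS = 1.0.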
From sedentary to active: Shifting the movement paradigm in workplaces.
Das, Bhibha M; Mailey, Emily; Murray, Kate; Phillips, Siobhan M; Torres, Cam; King, Abby C
2016-06-08
Increased sedentary behavior and reduced physical activity are risk factors for morbidity and mortality. As adults spend a significant portion of their time at work where the default is to spend the majority of the day sitting, shifting workplace norms to decrease sedentary time and increase active time could have a public health impact. Workplaces offer a unique setting for multi-level interventions that can reach diverse populations. Traditional worksite wellness initiatives have produced equivocal results in terms of increasing physical activity. One reason for this may be the focus on corporate-fitness type programs and health education with little change in workplace culture. More innovative approaches combining theory-based worksite wellness components with behavioral economics approaches promoting incidental physical activity at the workplace to make activity the default may be necessary. This article discusses strategies to shift the workplace paradigm from being sedentary to more active using a range of approaches.
Informatics Approaches for Predicting, Understanding, and Testing Cancer Drug Combinations.
Tang, Jing
2017-01-01
Making cancer treatment more effective is one of the grand challenges in our health care system. However, many drugs have entered clinical trials but so far have shown limited efficacy or induced rapid development of resistance. We urgently need multi-targeted drug combinations that selectively inhibit the cancer cells and block the emergence of drug resistance. This chapter focuses on mathematical and computational tools to facilitate the discovery of the most promising drug combinations to improve efficacy and prevent resistance. Data integration approaches that leverage drug-target interactions, cancer molecular features, and signaling pathways for predicting, understanding, and testing drug combinations are critically reviewed.
Object-Part Attention Model for Fine-Grained Image Classification
NASA Astrophysics Data System (ADS)
Peng, Yuxin; He, Xiangteng; Zhao, Junjie
2018-03-01
Fine-grained image classification aims to recognize hundreds of subcategories belonging to the same basic-level category, such as 200 subcategories of birds, which is highly challenging due to the large variance within the same subcategory and the small variance among different subcategories. Existing methods generally first locate the objects or parts and then discriminate which subcategory the image belongs to. However, they mainly have two limitations: (1) they rely on object or part annotations, which are heavily labor-consuming; and (2) they ignore the spatial relationships between the object and its parts as well as among those parts, both of which are significantly helpful for finding discriminative parts. Therefore, this paper proposes the object-part attention model (OPAM) for weakly supervised fine-grained image classification, with two main novelties: (1) The object-part attention model integrates two levels of attention: object-level attention localizes objects in images, and part-level attention selects discriminative parts of the object. Both are jointly employed to learn multi-view and multi-scale features and enhance their mutual promotion. (2) The object-part spatial constraint model combines two spatial constraints: the object spatial constraint ensures that the selected parts are highly representative, and the part spatial constraint eliminates redundancy and enhances the discrimination of the selected parts. Both are jointly employed to exploit the subtle and local differences that distinguish the subcategories. Importantly, neither object nor part annotations are used in our proposed approach, which avoids the heavy labor cost of labeling. Compared with more than 10 state-of-the-art methods on 4 widely used datasets, our OPAM approach achieves the best performance.
Straube, Andreas; Aicher, Bernhard; Fiebich, Bernd L; Haag, Gunther
2011-03-31
Pain in general and headache in particular are characterized by a change in activity in brain areas involved in pain processing. The therapeutic challenge is to identify drugs with molecular targets that restore the healthy state, resulting in meaningful pain relief or even freedom from pain. Different aspects of pain perception, i.e. sensory and affective components, also explain why there is not just one single target structure for therapeutic approaches to pain. A network of brain areas (the "pain matrix") is involved in pain perception and pain control. This diversification of the pain system explains why a wide range of molecularly different substances can be used in the treatment of different pain states and why in recent years more and more studies have described a superior efficacy of a precise multi-target combination therapy compared to therapy with monotherapeutics. In this article, we discuss the available literature on the effects of several fixed-dose combinations in the treatment of headaches and discuss the evidence supporting the role of combination therapy in the pharmacotherapy of pain, particularly of headaches. The scientific rationale behind multi-target combinations is a therapeutic benefit that could not be achieved by the individual constituents alone: the single substances of the combinations act together additively or even multiplicatively and cooperate to achieve the full desired therapeutic effect. As an example, the fixed-dose combination of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine is reviewed in detail. The major advantage of using such a fixed combination is that the active ingredients act on different but distinct molecular targets and thus are able to act on more signalling cascades involved in pain than most single analgesics, without adding more side effects to the therapy.
Multitarget therapeutics like combined analgesics broaden the array of therapeutic options, enable the completeness of the therapeutic effect, and allow doctors (and, in self-medication with OTC medications, the patients themselves) to customize treatment to the patient's specific needs. There is substantial clinical evidence that such a multi-component therapy is more effective than mono-component therapies.
NASA Astrophysics Data System (ADS)
Georgiou, Harris
2009-10-01
Medical informatics and the application of modern signal processing to assist the diagnostic process in medical imaging is one of the more recent and active research areas today. This thesis addresses a variety of issues related to the general problem of medical image analysis, specifically in mammography, and presents a series of algorithms and design approaches for all the intermediate levels of a modern system for computer-aided diagnosis (CAD). The diagnostic problem is analyzed with a systematic approach, first defining the imaging characteristics and features that are relevant to probable pathology in mammograms. Next, these features are quantified and fused into new, integrated radiological systems that exhibit embedded digital signal processing, in order to improve the final result and minimize the radiological dose for the patient. At a higher level, special algorithms are designed for detecting and encoding these clinically interesting imaging features, so that they can be used as input to advanced pattern classifiers and machine learning models. Finally, these approaches are extended to multi-classifier models under the scope of game theory and optimal collective decision-making, in order to produce efficient solutions for combining classifiers with minimal computational cost for advanced diagnostic systems. The material covered in this thesis is related to a total of 18 published papers, 6 in scientific journals and 12 in international conferences.
Sea-level change during the last 2500 years in New Jersey, USA
Kemp, Andrew C.; Horton, Benjamin P.; Vane, Christopher H.; Bernhardt, Christopher E.; Corbett, D. Reide; Engelhart, Simon E.; Anisfeld, Shimon C.; Parnell, Andrew C.; Cahill, Niamh
2013-01-01
Relative sea-level changes during the last ∼2500 years in New Jersey, USA were reconstructed to test whether late Holocene sea level was stable or included persistent and distinctive phases of variability. Foraminifera and bulk-sediment δ13C values were combined to reconstruct paleomarsh elevation with decimeter precision from sequences of salt-marsh sediment at two sites using a multi-proxy approach. The additional paleoenvironmental information provided by bulk-sediment δ13C values reduced vertical uncertainty in the sea-level reconstruction by about one third relative to that estimated from foraminifera alone using a transfer function. The history of sediment deposition was constrained by a composite chronology. An age–depth model developed for each core enabled reconstruction of sea level with multi-decadal resolution. Following correction for land-level change (1.4 mm/yr), four successive and sustained (multi-centennial) sea-level trends were objectively identified and quantified (95% confidence interval) using errors-in-variables change point analysis to account for age and sea-level uncertainties. From at least 500 BC to 250 AD, sea level fell at 0.11 mm/yr. Sea level then rose at 0.62 mm/yr from 250 AD to 733 AD. Between 733 AD and 1850 AD, sea level fell at 0.12 mm/yr. The reconstructed rate of sea-level rise since ∼1850 AD was 3.1 mm/yr and represents the most rapid period of change for at least 2500 years. This trend began between 1830 AD and 1873 AD. Since this change point, reconstructed sea-level rise is in agreement with regional tide-gauge records and exceeds the global average estimate for the 20th century. These positive and negative departures from background rates demonstrate that late Holocene sea level was not stable in New Jersey.
Multi-objective experimental design for (13)C-based metabolic flux analysis.
Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel
2015-10-01
(13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and non-linear approach (S-criterion). Both approaches generate almost the same input mixture, however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further enhanced when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position one labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design.
The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective design should stimulate its application within the field of (13)C-based metabolic flux analysis. Copyright © 2015 Elsevier Inc. All rights reserved.
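The linear D-criterion at the heart of such screening is inexpensive to compute: for a linearized model with sensitivity matrix X, it is det(XᵀX), and the candidate mixture maximizing it is preferred. A minimal Python sketch under made-up two-parameter sensitivities (the mixture names and numbers are illustrative, not real (13)C labeling data):

```python
# Sketch: ranking candidate tracer mixtures by the linear D-criterion.
# Sensitivity rows below are toy numbers, not real 13C labeling data.

def d_criterion(X):
    """det(X^T X) for a 2-parameter model; larger = more informative design."""
    # accumulate the symmetric 2x2 information matrix M = X^T X
    m00 = sum(r[0] * r[0] for r in X)
    m01 = sum(r[0] * r[1] for r in X)
    m11 = sum(r[1] * r[1] for r in X)
    return m00 * m11 - m01 * m01  # determinant of the 2x2 matrix

# each candidate mixture -> rows of flux sensitivities (hypothetical)
designs = {
    "100% U-13C glucose":       [(1.0, 0.2), (0.9, 0.3), (0.2, 0.1)],
    "80% 1,2-13C2 + 20% U-13C": [(1.0, 0.1), (0.3, 1.0), (0.8, 0.7)],
}
best = max(designs, key=lambda k: d_criterion(designs[k]))
```

Because the criterion is cheap to evaluate, it supports exactly the kind of high-throughput tracer screening the authors describe; the non-linear S-criterion would replace `d_criterion` with a far costlier simulation-based score.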
Multi-Tier Mental Health Program for Refugee Youth
ERIC Educational Resources Information Center
Ellis, B. Heidi; Miller, Alisa B.; Abdi, Saida; Barrett, Colleen; Blood, Emily A.; Betancourt, Theresa S.
2013-01-01
Objective: We sought to establish that refugee youths who receive a multi-tiered approach to services, Project SHIFA, would show high levels of engagement in treatment appropriate to their level of mental health distress, improvements in mental health symptoms, and a decrease in resource hardships. Method: Study participants were 30 Somali and…
USDA-ARS?s Scientific Manuscript database
A continuous monitoring of daily evapotranspiration (ET) at field scale can be achieved by combining thermal infrared remote sensing data information from multiple satellite platforms. Here, an integrated approach to field scale ET mapping is described, combining multi-scale surface energy balance e...
Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method
NASA Astrophysics Data System (ADS)
Lee, G.; Jun, K. S.; Cung, E. S.
2014-09-01
This study proposes an improved group decision-making (GDM) framework that combines the VIKOR method with fuzzified data to quantify spatial flood vulnerability across multi-criteria evaluation indicators. In general, a GDM method is an effective tool for formulating a compromise solution that involves various decision makers, since different stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-ranking method can be used to obtain a near-ideal solution according to all established criteria. Triangular fuzzy numbers are used to account for the uncertainty of weights and the crisp data of proxy variables. This approach can effectively propose compromise decisions by combining the GDM method with the fuzzy VIKOR method. The spatial flood vulnerability of the south Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with results from general MCDM methods, such as fuzzy TOPSIS, and classical GDM methods, such as those developed by Borda, Condorcet, and Copeland. The evaluated priorities were significantly dependent on the employed decision-making method. The proposed fuzzy GDM approach can reduce uncertainty in data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
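The VIKOR compromise ranking underlying this framework can be sketched compactly. A crisp (non-fuzzy) toy version in Python with hypothetical sub-basin scores; the fuzzy variant described above replaces each score and weight with a triangular fuzzy number:

```python
# Crisp VIKOR compromise ranking on toy flood-vulnerability scores.
# All numbers (alternatives, criteria, weights) are hypothetical.

def vikor(matrix, weights, v=0.5):
    """matrix[i][j]: score of alternative i on benefit criterion j.
    Returns Q values; lower Q = closer to the compromise solution."""
    ncrit = len(weights)
    best = [max(row[j] for row in matrix) for j in range(ncrit)]
    worst = [min(row[j] for row in matrix) for j in range(ncrit)]
    S, R = [], []
    for row in matrix:
        d = [weights[j] * (best[j] - row[j]) / ((best[j] - worst[j]) or 1)
             for j in range(ncrit)]
        S.append(sum(d))   # group utility (weighted total regret)
        R.append(max(d))   # individual (worst-criterion) regret
    s_star, s_minus = min(S), max(S)
    r_star, r_minus = min(R), max(R)
    return [v * (S[i] - s_star) / ((s_minus - s_star) or 1)
            + (1 - v) * (R[i] - r_star) / ((r_minus - r_star) or 1)
            for i in range(len(matrix))]

# three sub-basins scored on two benefit criteria (made-up data)
q = vikor([[0.9, 0.4], [0.5, 0.8], [0.2, 0.2]], [0.6, 0.4])
```

The weight `v` balances group utility against individual regret; setting it per stakeholder group and aggregating the resulting rankings is one simple route to the consensus building the abstract describes.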
May, Christian P; Kolokotroni, Eleni; Stamatakos, Georgios S; Büchler, Philippe
2011-10-01
Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning. Copyright © 2011 Elsevier Ltd. All rights reserved.
Angelis, Aris; Kanavos, Panos
2017-09-01
Escalating drug prices have catalysed the generation of numerous "value frameworks" with the aim of informing payers, clinicians and patients on the assessment and appraisal process of new medicines for the purpose of coverage and treatment selection decisions. Although this is an important step towards a more inclusive Value Based Assessment (VBA) approach, aspects of these frameworks are based on weak methodologies and could potentially result in misleading recommendations or decisions. In this paper, a Multiple Criteria Decision Analysis (MCDA) methodological process, based on Multi Attribute Value Theory (MAVT), is adopted for building a multi-criteria evaluation model. A five-stage model-building process is followed, using a top-down "value-focused thinking" approach, involving literature reviews and expert consultations. A generic value tree is structured capturing decision-makers' concerns for assessing the value of new medicines in the context of Health Technology Assessment (HTA) and in alignment with decision theory. The resulting value tree (Advance Value Tree) consists of three levels of criteria (top level criteria clusters, mid-level criteria, bottom level sub-criteria or attributes) relating to five key domains that can be explicitly measured and assessed: (a) burden of disease, (b) therapeutic impact, (c) safety profile (d) innovation level and (e) socioeconomic impact. A number of MAVT modelling techniques are introduced for operationalising (i.e. estimating) the model, for scoring the alternative treatment options, assigning relative weights of importance to the criteria, and combining scores and weights. Overall, the combination of these MCDA modelling techniques for the elicitation and construction of value preferences across the generic value tree provides a new value framework (Advance Value Framework) enabling the comprehensive measurement of value in a structured and transparent way. 
Given its flexibility to meet diverse requirements and its ready adaptability across different settings, the Advance Value Framework could be offered as a decision-support tool for evaluators and payers to aid coverage and reimbursement decisions for new medicines. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Multi-Scale Analyses of Three Dimensional Woven Composite 3D Shell With a Cut Out Circle
NASA Astrophysics Data System (ADS)
Nguyen, Duc Hai; Wang, Hu
2018-06-01
Composite materials are made by combining two or more constituent materials to obtain the desired material properties for each product type. The matrix material can be a polymer, while fibers are used as the reinforcing material. Currently, polymer-matrix composites are widely used in many fields with differently designed structures, such as automotive, aviation, aerospace, and marine structures, because of their excellent mechanical properties; in addition, they possess high hardness and durability together with a significant reduction in weight compared to traditional materials. However, during the structural design process, many cut-outs are created for the purpose of assembling structures together or for other design purposes. When such a structure bears load, failure therefore occurs at these cut-outs due to stress concentration. This paper proposes multi-scale modeling and optimization strategies to evaluate the effectiveness of fiber orientation in an E-glass/epoxy woven composite 3D shell with a circular hole at the center, investigated using FEA results. A multi-scale model was developed to predict the mechanical behavior of the woven composite 3D shell with a central circular hole under different designs of material and structural parameters. Based on the analysis results for the laminae, we found that the 3D shell with a fiber orientation of 45° shows the best stress and strain bearing capacity. Thus, combining several layers with 45° fiber orientation in a multi-layer composite 3D shell reduces the stresses concentrated at the cut-outs of the structure.
Ngendahimana, David K.; Fagerholm, Cara L.; Sun, Jiayang; Bruckman, Laura S.
2017-01-01
Accelerated weathering exposures were performed on poly(ethylene-terephthalate) (PET) films. Longitudinal multi-level predictive models as a function of PET grades and exposure types were developed for the change in yellowness index (YI) and haze (%). Exposures with similar change in YI were modeled using a linear fixed-effects modeling approach. Due to the complex nature of haze formation, measurement uncertainty, and the differences in the samples’ responses, the change in haze (%) depended on individual samples’ responses and a linear mixed-effects modeling approach was used. When compared to fixed-effects models, the addition of random effects in the haze formation models significantly increased the variance explained. For both modeling approaches, diagnostic plots confirmed independence and homogeneity with normally distributed residual errors. Predictive R2 values for true prediction error and predictive power of the models demonstrated that the models were not subject to over-fitting. These models enable prediction under pre-defined exposure conditions for a given exposure time (or photo-dosage in case of UV light exposure). PET degradation under cyclic exposures combining UV light and condensing humidity is caused by photolytic and hydrolytic mechanisms causing yellowing and haze formation. Quantitative knowledge of these degradation pathways enable cross-correlation of these lab-based exposures with real-world conditions for service life prediction. PMID:28498875
Grimm, Marcus O W; Michaelson, Daniel M; Hartmann, Tobias
2017-11-01
In the last decade, it has become obvious that Alzheimer's disease (AD) is closely linked to changes in lipids or lipid metabolism. One of the main pathological hallmarks of AD is amyloid-β (Aβ) deposition. Aβ is derived from sequential proteolytic processing of the amyloid precursor protein (APP). Interestingly, both APP and all APP secretases are transmembrane proteins that cleave APP close to and in the lipid bilayer. Moreover, apoE4 has been identified as the most prevalent genetic risk factor for AD. ApoE is the main lipoprotein in the brain and has a central role in lipid transport and brain lipid metabolism. Several lipidomic approaches have revealed changes in lipid levels in cerebrospinal fluid and in post mortem AD brains. Here, we review the impact of apoE and lipids in AD, focusing on the major brain lipid classes, sphingomyelin, plasmalogens, gangliosides, sulfatides, DHA, and EPA, as well as on lipid signaling molecules such as ceramide and sphingosine-1-phosphate. As nutritional approaches have shown limited beneficial effects in clinical studies, the opportunities for combining different supplements in multi-nutritional approaches are discussed and summarized. Copyright © 2017 by the American Society for Biochemistry and Molecular Biology, Inc.
Fabritius, Helge-Otto; Ziegler, Andreas; Friák, Martin; Nikolov, Svetoslav; Huber, Julia; Seidl, Bastian H M; Ruangchai, Sukhum; Alagboso, Francisca I; Karsten, Simone; Lu, Jin; Janus, Anna M; Petrov, Michal; Zhu, Li-Fang; Hemzalová, Pavlína; Hild, Sabine; Raabe, Dierk; Neugebauer, Jörg
2016-09-09
The crustacean cuticle is a composite material that covers the whole animal and forms the continuous exoskeleton. Nano-fibers composed of chitin and protein molecules form most of the organic matrix of the cuticle that, at the macroscale, is organized in up to eight hierarchical levels. At least two of them, the exo- and endocuticle, contain a mineral phase of mainly Mg-calcite, amorphous calcium carbonate and phosphate. The high number of hierarchical levels and the compositional diversity provide a high degree of freedom for varying the physical, in particular mechanical, properties of the material. This makes the cuticle a versatile material ideally suited to form a variety of skeletal elements that are adapted to different functions and the eco-physiological strains of individual species. This review presents our recent analytical, experimental and theoretical studies on the cuticle, summarising at which hierarchical levels structure and composition are modified to achieve the required physical properties. We describe our multi-scale hierarchical modeling approach based on the results from these studies, aiming at systematically predicting the structure-composition-property relations of cuticle composites from the molecular level to the macro-scale. This modeling approach provides a tool to facilitate the development of optimized biomimetic materials within a knowledge-based design approach.
Approaching human language with complex networks.
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks has been on the rise in recent years, and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics). Copyright © 2014 Elsevier B.V. All rights reserved.
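As an illustration of the network-construction step such studies rely on, a toy Python sketch that builds a word co-occurrence network from a sentence and reads off node degrees, one of the microscopic quantities linked to system-level topology (window size and sentence are arbitrary choices, not from the survey):

```python
# Toy word co-occurrence network: an undirected edge links any two distinct
# words that appear within `window` tokens of each other.
from collections import defaultdict

def cooccurrence_network(tokens, window=2):
    """Return adjacency sets for an undirected co-occurrence graph."""
    edges = defaultdict(set)
    for i, w in enumerate(tokens):
        for u in tokens[max(0, i - window):i]:  # words in the left window
            if u != w:                          # skip self-loops
                edges[u].add(w)
                edges[w].add(u)
    return edges

g = cooccurrence_network("the cat sat on the mat".split())
degree = {w: len(neighbors) for w, neighbors in g.items()}
```

From such a graph, the global properties the survey discusses (degree distribution, clustering, average path length) follow by standard graph analysis.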
An Approach to Speed up Single-Frequency PPP Convergence with Quad-Constellation GNSS and GIM.
Cai, Changsheng; Gong, Yangzhao; Gao, Yang; Kuang, Cuilin
2017-06-06
The single-frequency precise point positioning (PPP) technique has attracted increasing attention due to its high accuracy and low cost. However, a very long convergence time, normally a few hours, is required in order to achieve a positioning accuracy level of a few centimeters. In this study, an approach is proposed to accelerate the single-frequency PPP convergence by combining quad-constellation global navigation satellite system (GNSS) and global ionospheric map (GIM) data. In this proposed approach, the GPS, GLONASS, BeiDou, and Galileo observations are directly used in an uncombined observation model and as a result the ionospheric and hardware delay (IHD) can be estimated together as a single unknown parameter. The IHD values acquired from the GIM product and the multi-GNSS differential code bias (DCB) product are then utilized as pseudo-observables of the IHD parameter in the observation model. A time varying weight scheme has also been proposed for the pseudo-observables to gradually decrease its contribution to the position solutions during the convergence period. To evaluate the proposed approach, datasets from twelve Multi-GNSS Experiment (MGEX) stations on seven consecutive days are processed and analyzed. The numerical results indicate that the single-frequency PPP with quad-constellation GNSS and GIM data are able to reduce the convergence time by 56%, 47%, 41% in the east, north, and up directions compared to the GPS-only single-frequency PPP.
NASA Astrophysics Data System (ADS)
Yoon, Kyungho; Lee, Wonhye; Croce, Phillip; Cammalleri, Amanda; Yoo, Seung-Schik
2018-05-01
Transcranial focused ultrasound (tFUS) is emerging as a non-invasive brain stimulation modality. Complicated interactions between acoustic pressure waves and osseous tissue introduce many challenges in the accurate targeting of an acoustic focus through the cranium. Image-guidance accompanied by a numerical simulation is desired to predict the intracranial acoustic propagation through the skull; however, such simulations typically demand heavy computation, which warrants an expedited processing method to provide on-site feedback for the user in guiding the acoustic focus to a particular brain region. In this paper, we present a multi-resolution simulation method based on the finite-difference time-domain formulation to model the transcranial propagation of acoustic waves from a single-element transducer (250 kHz). The multi-resolution approach improved computational efficiency by providing the flexibility in adjusting the spatial resolution. The simulation was also accelerated by utilizing parallelized computation through the graphic processing unit. To evaluate the accuracy of the method, we measured the actual acoustic fields through ex vivo sheep skulls with different sonication incident angles. The measured acoustic fields were compared to the simulation results in terms of focal location, dimensions, and pressure levels. The computational efficiency of the presented method was also assessed by comparing simulation speeds at various combinations of resolution grid settings. The multi-resolution grids consisting of 0.5 and 1.0 mm resolutions gave acceptable accuracy (under 3 mm in terms of focal position and dimension, less than 5% difference in peak pressure ratio) with a speed compatible with semi real-time user feedback (within 30 s). The proposed multi-resolution approach may serve as a novel tool for simulation-based guidance for tFUS applications.
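The finite-difference time-domain formulation underlying such simulations can be illustrated in one dimension. A minimal Python sketch on a uniform staggered grid, a far simpler cousin of the paper's 3-D multi-resolution GPU solver; the material constants are generic water-like values, not skull parameters:

```python
# Minimal 1-D acoustic FDTD leapfrog scheme: pressure p at cell centers,
# particle velocity v at cell faces, rigid (zero-velocity) boundaries.
# Material constants are illustrative, not skull properties.

def fdtd_1d(n=200, steps=300, c=1500.0, rho=1000.0, dx=0.5e-3):
    dt = 0.5 * dx / c                     # CFL-stable time step
    p = [0.0] * n                         # pressure at cell centers
    v = [0.0] * (n + 1)                   # velocity at cell faces
    p[n // 2] = 1.0                       # impulsive source at the center
    for _ in range(steps):
        for i in range(1, n):             # velocity update from pressure gradient
            v[i] -= dt / (rho * dx) * (p[i] - p[i - 1])
        for i in range(n):                # pressure update from velocity divergence
            p[i] -= rho * c * c * dt / dx * (v[i + 1] - v[i])
    return p

p = fdtd_1d()
```

Since the rigid boundaries block outflow, the discrete scheme conserves total pressure, a quick sanity check on the update equations. The paper's multi-resolution idea amounts to running such updates on nested grids of different `dx`, exchanging values at the grid interfaces.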
Eliseyev, Andrey; Aksenova, Tetiana
2016-01-01
In the current paper, decoding algorithms for motor-related BCI systems for continuous upper-limb trajectory prediction are considered. Two methods for smooth prediction, namely Sobolev and Polynomial Penalized Multi-Way Partial Least Squares (PLS) regressions, are proposed. The methods are compared to the Multi-Way Partial Least Squares and Kalman Filter approaches. The comparison demonstrated that the proposed methods combined the prediction accuracy of the algorithms of the PLS family with the trajectory smoothness of the Kalman Filter. In addition, the prediction delay is significantly lower for the proposed algorithms than for the Kalman Filter approach. The proposed methods could be applied in a wide range of applications beyond neuroscience. PMID:27196417
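For context, the Kalman filter baseline against which the penalized PLS decoders are compared can be sketched in scalar form. A toy Python version with a random-walk state model; the noise parameters and measurements are arbitrary, not from the study:

```python
# Scalar Kalman filter with a random-walk state model, the classic
# trajectory-smoothing baseline. q: process noise, r: measurement noise
# (both made-up values for illustration).

def kalman_1d(zs, q=1e-3, r=0.5):
    """Filter noisy measurements zs; returns the smoothed estimates."""
    x, var = zs[0], 1.0                # initial state estimate and variance
    out = []
    for z in zs:
        var += q                       # predict: state uncertainty grows
        k = var / (var + r)            # Kalman gain
        x += k * (z - x)               # update toward the measurement
        var *= (1 - k)                 # shrink posterior variance
        out.append(x)
    return out

raw = [0.0, 1.0, 0.2, 0.9, 0.1, 1.0]   # jittery toy trajectory samples
smooth = kalman_1d(raw)
```

The filter trades responsiveness for smoothness through the gain `k`, which is the source of the prediction delay the abstract notes the penalized PLS methods avoid.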
Anastasio, Thomas J.
2015-01-01
Like other neurodegenerative diseases, Alzheimer Disease (AD) has a prominent inflammatory component mediated by brain microglia. Reducing microglial inflammation could potentially halt or at least slow the neurodegenerative process. A major challenge in the development of treatments targeting brain inflammation is the sheer complexity of the molecular mechanisms that determine whether microglia become inflammatory or take on a more neuroprotective phenotype. The process is highly multifactorial, raising the possibility that a multi-target/multi-drug strategy could be more effective than conventional monotherapy. This study takes a computational approach in finding combinations of approved drugs that are potentially more effective than single drugs in reducing microglial inflammation in AD. This novel approach exploits the distinct advantages of two different computer programming languages, one imperative and the other declarative. Existing programs written in both languages implement the same model of microglial behavior, and the input/output relationships of both programs agree with each other and with data on microglia over an extensive test battery. Here the imperative program is used efficiently to screen the model for the most efficacious combinations of 10 drugs, while the declarative program is used to analyze in detail the mechanisms of action of the most efficacious combinations. Of the 1024 possible drug combinations, the simulated screen identifies only 7 that are able to move simulated microglia at least 50% of the way from a neurotoxic to a neuroprotective phenotype. Subsequent analysis shows that of the 7 most efficacious combinations, 2 stand out as superior both in strength and reliability. The model offers many experimentally testable and therapeutically relevant predictions concerning effective drug combinations and their mechanisms of action. PMID:26097457
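The exhaustive screen over 10 on/off drugs can be sketched generically; the per-drug effect sizes and the additive phenotype score below are invented stand-ins for the paper's microglia model, kept only to show how 2^10 = 1024 combinations are enumerated and filtered at the 50% threshold.

```python
from itertools import product

# Hypothetical stand-in for the microglia model: each drug nudges a phenotype
# score toward neuroprotective (+) or neurotoxic (-); the effect values are
# invented for illustration, not taken from the study.
drug_effects = [0.21, -0.05, 0.18, 0.02, 0.15, -0.08, 0.12, 0.09, 0.04, 0.11]

def phenotype_shift(combo):
    # fraction of the way from neurotoxic (0.0) to neuroprotective (1.0)
    return min(1.0, max(0.0, sum(e for on, e in zip(combo, drug_effects) if on)))

# exhaustive screen over all 2**10 = 1024 on/off combinations
hits = [c for c in product((0, 1), repeat=10) if phenotype_shift(c) >= 0.5]
```

In the study the scoring function is the full imperative model; only the enumeration pattern carries over.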
Manda, Prashanti; McCarthy, Fiona; Bridges, Susan M
2013-10-01
The Gene Ontology (GO), a set of three sub-ontologies, is one of the most popular bio-ontologies used for describing gene product characteristics. GO annotation data containing terms from multiple sub-ontologies and at different levels in the ontologies is an important source of implicit relationships between terms from the three sub-ontologies. Data mining techniques such as association rule mining that are tailored to mine from multiple ontologies at multiple levels of abstraction are required for effective knowledge discovery from GO annotation data. We present a data mining approach, Multi-ontology data mining at All Levels (MOAL) that uses the structure and relationships of the GO to mine multi-ontology multi-level association rules. We introduce two interestingness measures: Multi-ontology Support (MOSupport) and Multi-ontology Confidence (MOConfidence) customized to evaluate multi-ontology multi-level association rules. We also describe a variety of post-processing strategies for pruning uninteresting rules. We use publicly available GO annotation data to demonstrate our methods with respect to two applications (1) the discovery of co-annotation suggestions and (2) the discovery of new cross-ontology relationships. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
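Plain association-rule support and confidence, which MOSupport and MOConfidence generalize across sub-ontologies and levels, can be computed directly. The annotation sets below are invented GO-style examples, one per gene product.

```python
# Sketch of classical support and confidence for association rules over
# annotation sets; term names are invented, GO-style placeholders.
annotations = [
    {"BP:apoptosis", "MF:kinase", "CC:nucleus"},
    {"BP:apoptosis", "MF:kinase"},
    {"BP:apoptosis", "CC:membrane"},
    {"MF:kinase", "CC:nucleus"},
]

def support(itemset):
    # fraction of annotation sets containing every term of the itemset
    return sum(itemset <= a for a in annotations) / len(annotations)

def confidence(antecedent, consequent):
    # conditional frequency of the consequent given the antecedent
    return support(antecedent | consequent) / support(antecedent)

s = support({"BP:apoptosis", "MF:kinase"})       # 2 of 4 sets -> 0.5
c = confidence({"BP:apoptosis"}, {"MF:kinase"})  # 0.5 / 0.75 = 2/3
```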
Center-Within-Trial Versus Trial-Level Evaluation of Surrogate Endpoints.
Renfro, Lindsay A; Shi, Qian; Xue, Yuan; Li, Junlong; Shang, Hongwei; Sargent, Daniel J
2014-10-01
Evaluation of candidate surrogate endpoints using individual patient data from multiple clinical trials is considered the gold standard approach to validate surrogates at both patient and trial levels. However, this approach assumes the availability of patient-level data from a relatively large collection of similar trials, which may not be possible to achieve for a given disease application. One common solution to the problem of too few similar trials involves performing trial-level surrogacy analyses on trial sub-units (e.g., centers within trials), thereby artificially increasing the trial-level sample size for feasibility of the multi-trial analysis. To date, the practical impact of treating trial sub-units (centers) identically to trials in multi-trial surrogacy analyses remains unexplored, and conditions under which this ad hoc solution may in fact be reasonable have not been identified. We perform a simulation study to identify such conditions, and demonstrate practical implications using a multi-trial dataset of patients with early stage colon cancer.
Self-balanced modulation and magnetic rebalancing method for parallel multilevel inverters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Hui; Shi, Yanjun
A self-balanced modulation method and a closed-loop magnetic flux rebalancing control method for parallel multilevel inverters. The combination of the two methods provides for balancing of the magnetic flux of the inter-cell transformers (ICTs) of the parallel multilevel inverters without deteriorating the quality of the output voltage. In various embodiments a parallel multi-level inverter modulator is provided, including a multi-channel comparator to generate a multiplexed digitized ideal waveform for a parallel multi-level inverter and a finite state machine (FSM) module coupled to the parallel multi-channel comparator, the FSM module to receive the multiplexed digitized ideal waveform and to generate a pulse width modulated gate-drive signal for each switching device of the parallel multi-level inverter. The system and method provide for optimization of the output voltage spectrum without influencing the magnetic balancing.
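A generic multi-channel comparison stage for parallel cells can be sketched as phase-shifted carrier PWM: each cell compares the same sinusoidal reference against a triangle carrier shifted by 1/N of the carrier period. This shows only the comparator idea; the patent's self-balanced modulation and FSM-based flux rebalancing logic are not reproduced, and all waveform parameters below are assumptions.

```python
import numpy as np

# Phase-shifted carrier PWM for N parallel inverter cells (illustrative only).
N_CELLS = 4
f_ref, f_car = 50.0, 2000.0                        # assumed reference/carrier frequencies
t = np.linspace(0, 0.02, 4000, endpoint=False)     # one reference period

ref = 0.8 * np.sin(2 * np.pi * f_ref * t)          # modulation index 0.8 (assumed)

def triangle(phase):
    # unit-amplitude triangle carrier at the carrier frequency, phase in [0, 1)
    x = (f_car * t + phase) % 1.0
    return 4 * np.abs(x - 0.5) - 1

# each cell's gate signal: reference above its own shifted carrier
gates = np.array([ref > triangle(k / N_CELLS) for k in range(N_CELLS)])
# summing the per-cell switched outputs yields a stepped multi-level waveform
multilevel = gates.sum(axis=0)
```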
Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng
2014-01-01
Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution and revealing finer details within a region of interest of a sample larger than the field of view than can be achieved using conventional techniques. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials. PMID:24763649
A general CFD framework for fault-resilient simulations based on multi-resolution information fusion
NASA Astrophysics Data System (ADS)
Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em
2017-10-01
We develop a general CFD framework for multi-resolution simulations to target not only multiscale problems but also resilience in exascale simulations, where faulty processors may lead to gappy, in space-time, simulated fields. We combine approximation theory and domain decomposition together with statistical learning techniques, e.g. coKriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation (a) on a small number of spatial "patches" distributed across the domain, simulated by finite differences at fine resolution and (b) on the entire domain simulated at very low resolution, thus fusing multi-resolution models to obtain the final answer. Second, we simulate the flow in a lid-driven cavity in an analogous fashion, by fusing finite difference solutions obtained with fine and low resolution assuming gappy data sets. We investigate the influence of various parameters for this framework, including the correlation kernel, the size of a buffer employed in estimating boundary conditions, the coarseness of the resolution of auxiliary data, and the communication frequency across different patches in fusing the information at different resolution levels. In addition to its robustness and resilience, the new framework can be employed to generalize previous multiscale approaches involving heterogeneous discretizations or even fundamentally different flow descriptions, e.g. in continuum-atomistic simulations.
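The fusion step can be sketched in a minimal multi-fidelity form in the spirit of coKriging: model the fine field as the coarse solution plus a Gaussian-process-interpolated discrepancy learned at a few fine-resolution "patches". The test function, patch locations, and kernel width below are invented; the paper's actual correlation kernels and buffer treatment are not reproduced.

```python
import numpy as np

# Minimal discrepancy-GP fusion of a biased coarse field with sparse fine data.
def rbf(a, b, ell=0.15):
    # squared-exponential correlation kernel (width ell is an assumption)
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

x = np.linspace(0, 1, 101)
u_fine = np.sin(2 * np.pi * x)                  # "truth" (fine-resolution field)
u_coarse = np.sin(2 * np.pi * x) * 0.8 + 0.1    # biased low-resolution solution

xp = np.array([0.05, 0.3, 0.55, 0.8])           # locations of fine patches
dp = np.interp(xp, x, u_fine - u_coarse)        # observed discrepancy at patches

K = rbf(xp, xp) + 1e-8 * np.eye(len(xp))        # jitter for numerical stability
delta = rbf(x, xp) @ np.linalg.solve(K, dp)     # GP posterior mean of discrepancy
u_fused = u_coarse + delta

err_coarse = np.max(np.abs(u_coarse - u_fine))
err_fused = np.max(np.abs(u_fused - u_fine))    # fusion reduces the error
```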
Multi-Level Building Reconstruction for Automatic Enhancement of High Resolution Dsms
NASA Astrophysics Data System (ADS)
Arefi, H.; Reinartz, P.
2012-07-01
In this article a multi-level approach is proposed for reconstruction-based improvement of high resolution Digital Surface Models (DSMs). The concept of Levels of Detail (LOD) defined by the CityGML standard has been considered as the basis for abstraction levels of building roof structures. Here, LOD1 and LOD2, which are related to prismatic and parametric roof shapes, are reconstructed. Besides proposing a new approach for automatic LOD1 and LOD2 generation from high resolution DSMs, the algorithm contains two generalization levels, namely horizontal and vertical. Both generalization levels are applied to the prismatic model of buildings. The horizontal generalization allows controlling the approximation level of building footprints, which is similar to the cartographic generalization concept of urban maps. In vertical generalization, the prismatic model is formed using an individual building height and continues to include all flat structures located at different height levels. The concept of LOD1 generation is based on approximation of the building footprints into rectangular or non-rectangular polygons. For a rectangular building containing one main orientation, a method based on the Minimum Bounding Rectangle (MBR) is employed. In contrast, a Combined Minimum Bounding Rectangle (CMBR) approach is proposed for regularization of non-rectilinear polygons, i.e. buildings without perpendicular edge directions. Both MBR- and CMBR-based approaches are iteratively employed on building segments to reduce the original building footprints to a minimum number of nodes with maximum similarity to the original shapes. A model driven approach based on the analysis of the 3D points of DSMs in a 2D projection plane is proposed for LOD2 generation. Accordingly, a building block is divided into smaller parts according to the direction and number of existing ridge lines.
The 3D model is derived for each building part and finally, a complete parametric model is formed by merging all the 3D models of the individual parts and adjusting the nodes after the merging step. In order to provide an enhanced DSM, a surface model is provided for each building by interpolation of the internal points of the generated models. All interpolated models are situated on a Digital Terrain Model (DTM) of the corresponding area to shape the enhanced DSM. The proposed DSM enhancement approach has been tested on a dataset from the Munich central area. The original DSM is created using robust stereo matching of Worldview-2 stereo images. A quantitative assessment of the new DSM by comparing the heights of the ridges and eaves shows a standard deviation of better than 50 cm.
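The MBR step can be illustrated with a brute-force rotation search: rotate the footprint points through candidate orientations, take the axis-aligned box, and keep the orientation of minimal area. This is a simple stand-in for the paper's MBR/CMBR regularization, applied to an invented L-shaped footprint.

```python
import numpy as np

# Invented L-shaped footprint (vertices in metres, axis-aligned by construction)
pts = np.array([[0, 0], [4, 0], [4, 1], [1, 1], [1, 3], [0, 3]], float)

def mbr(points, n_angles=180):
    # Minimum Bounding Rectangle by sampling orientations in [0, pi/2]:
    # rotate, measure the axis-aligned box, keep the smallest area.
    best = (np.inf, None)
    for theta in np.linspace(0, np.pi / 2, n_angles):
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        q = points @ R.T
        w, h = q.max(axis=0) - q.min(axis=0)
        if w * h < best[0]:
            best = (w * h, theta)
    return best  # (area, orientation in radians)

area, theta = mbr(pts)   # for this footprint the 4 x 3 axis-aligned box wins
```

Production implementations use rotating calipers on the convex hull instead of angle sampling; the search above is only for clarity.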
Multi-objective optimization for generating a weighted multi-model ensemble
NASA Astrophysics Data System (ADS)
Lee, H.
2017-12-01
Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. 
Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide reliable future projections.
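The core of the multi-objective weighting idea can be sketched with a Pareto (non-dominated) filter over per-model errors under two conflicting metrics; the error values and the inverse-error weighting rule below are invented for illustration and are not the study's optimization procedure.

```python
import numpy as np

# Invented "errors" for five models under two conflicting metrics (lower = better).
errors = np.array([
    [0.2, 0.9],   # model A: good on metric 1, poor on metric 2
    [0.9, 0.2],   # model B: the reverse trade-off
    [0.4, 0.4],   # model C: balanced
    [0.5, 0.5],   # model D: dominated by C
    [0.3, 0.6],   # model E
])

def pareto_front(err):
    # a model is dominated if another is no worse on all metrics and better on one
    n = len(err)
    dominated = [any(np.all(err[j] <= err[i]) and np.any(err[j] < err[i])
                     for j in range(n) if j != i) for i in range(n)]
    return np.array([not d for d in dominated])

front = pareto_front(errors)
# weight Pareto-optimal members inversely to their summed error, zero otherwise
raw = np.where(front, 1.0 / errors.sum(axis=1), 0.0)
weights = raw / raw.sum()
```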
A multi-domain spectral method for time-fractional differential equations
NASA Astrophysics Data System (ADS)
Chen, Feng; Xu, Qinwu; Hesthaven, Jan S.
2015-07-01
This paper proposes an approach for high-order time integration within a multi-domain setting for time-fractional differential equations. Since the kernel is singular or nearly singular, two main difficulties arise after the domain decomposition: how to properly account for the history/memory part and how to perform the integration accurately. To address these issues, we propose a novel hybrid approach for the numerical integration based on the combination of three-term-recurrence relations of Jacobi polynomials and high-order Gauss quadrature. The different approximations used in the hybrid approach are justified theoretically and through numerical examples. Based on this, we propose a new multi-domain spectral method for high-order accurate time integrations and study its stability properties by identifying the method as a generalized linear method. Numerical experiments confirm hp-convergence for both time-fractional differential equations and time-fractional partial differential equations.
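The two named ingredients, three-term recurrences and high-order Gauss quadrature, can be shown in their simplest form with Legendre polynomials (the alpha = beta = 0 Jacobi case); the paper's singular-kernel treatment and domain splitting are not reproduced here.

```python
import numpy as np

def legendre(n, x):
    # Three-term recurrence: P_0 = 1, P_1 = x,
    # (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x)
    p_prev, p = np.ones_like(x), x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

nodes, wts = np.polynomial.legendre.leggauss(8)
# 8-point Gauss-Legendre quadrature is exact for polynomials up to degree 15
integral = wts @ nodes**6                                    # int_{-1}^{1} x^6 dx = 2/7
orthogonal = wts @ (legendre(3, nodes) * legendre(5, nodes)) # ~0 by orthogonality
```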
Martins, Marta
2011-05-01
The emergence of resistance in tuberculosis has become a serious problem for the control of this disease. For that reason, new therapeutic strategies that can be implemented in the clinical setting are urgently needed. The design of new compounds active against mycobacteria must take into account that tuberculosis is mainly an intracellular infection of the alveolar macrophage and therefore must maintain activity within the host cells. An alternative therapeutic approach will be described in this review, focusing on the activation of the phagocytic cell and the subsequent killing of the internalized bacteria. This approach explores the combined use of antibiotics and phenothiazines, or Ca(2+) and K(+) flux inhibitors, in the infected macrophage. Targeting the infected macrophage and not the internalized bacteria could overcome the problem of bacterial multi-drug resistance. This will potentially eliminate the appearance of new multi-drug resistant tuberculosis (MDR-TB) cases and subsequently prevent the emergence of extensively-drug resistant tuberculosis (XDR-TB). Patents resulting from this novel and innovative approach could be extremely valuable if they can be implemented in the clinical setting. Other patents will also be discussed such as the treatment of TB using immunomodulator compounds (for example: betaglycans).
Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework
NASA Astrophysics Data System (ADS)
Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.
2016-03-01
A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic label assessment, able to address uncertainty and deal with different levels of precision. This method is based on qualitative reasoning as an artificial intelligence technique for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. This method is suitable for problems in the social framework, such as energy planning, which require the construction of a dialogue process among many social actors with a high level of complexity and uncertainty. The method is compared with an existing approach, which has been applied previously to the wind farm location problem. This approach, consisting of an outranking method, is based on Condorcet's original method. The results obtained by both approaches are analysed and their performance in the selection of the wind farm location is compared across aggregation procedures. Although results show that both methods lead to similar alternative rankings, the study highlights both their advantages and drawbacks.
USDA-ARS?s Scientific Manuscript database
In the last few years, modeling of surface processes, such as water and carbon balances, vegetation growth and energy budgets, has focused on integrated approaches that combine aspects of hydrology, biology and meteorology into unified analyses. In this context, remotely sensed data often have a cor...
NASA Astrophysics Data System (ADS)
Oliveira, Miguel; Santos, Cristina P.; Costa, Lino
2012-09-01
In this paper, a study based on sensitivity analysis is performed for a gait multi-objective optimization system that combines bio-inspired Central Pattern Generators (CPGs) and a multi-objective evolutionary algorithm based on NSGA-II. In this system, CPGs are modeled as autonomous differential equations that generate the necessary limb movement to perform the required walking gait. In order to optimize the walking gait, a multi-objective problem with three conflicting objectives is formulated: maximization of the velocity, the wide stability margin and the behavioral diversity. The experimental results highlight the effectiveness of this multi-objective approach and the importance of the objectives to find different walking gait solutions for the quadruped robot.
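A CPG modeled as an autonomous differential equation can be illustrated with a single Hopf oscillator integrated by forward Euler; solutions converge to a stable limit cycle of radius sqrt(mu), providing the rhythmic limb signal. The specific oscillator and parameters are illustrative assumptions, not the paper's CPG network.

```python
import numpy as np

# Hopf oscillator: dx/dt = (mu - r^2) x - omega y, dy/dt = (mu - r^2) y + omega x
mu, omega, dt = 1.0, 2 * np.pi, 1e-3   # assumed: unit-radius cycle at 1 Hz
x, y = 0.1, 0.0                        # start near the unstable fixed point

trace = []
for _ in range(20000):                 # 20 s of simulated time, forward Euler
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    x, y = x + dt * dx, y + dt * dy
    trace.append(x)

radius = np.hypot(x, y)                # settles near sqrt(mu) = 1
```

In a full gait system, several coupled oscillators of this kind (one per joint or limb) are phase-locked, and the evolutionary algorithm tunes their parameters against the three objectives.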
Multi-Detection Events, Probability Density Functions, and Reduced Location Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Schrom, Brian T.
2016-03-01
Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach rather than a simple dilution field of regard approach to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
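The Bayesian combination of detections and non-detections can be sketched over a 1-D grid of candidate source locations: each station multiplies the posterior by its detection likelihood, or its complement for a non-detection. The Gaussian "transport" model and station layout below are invented placeholders, not real ATM output.

```python
import numpy as np

grid = np.linspace(0.0, 100.0, 201)          # candidate source locations, km
prior = np.full(grid.shape, 1.0 / grid.size) # uniform prior

def p_detect(source, station, spread=15.0):
    # assumed probability a station detects the plume from a given source
    return 0.95 * np.exp(-((source - station) ** 2) / (2 * spread ** 2))

stations = [(20.0, True), (50.0, True), (90.0, False)]   # (position, detected?)

posterior = prior.copy()
for pos, detected in stations:
    like = p_detect(grid, pos)
    # Bayes update: detections use the likelihood, non-detections its complement
    posterior *= like if detected else (1.0 - like)
posterior /= posterior.sum()

best = grid[np.argmax(posterior)]   # peaks between the two detecting stations
```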
A novel method for a multi-level hierarchical composite with brick-and-mortar structure
Brandt, Kristina; Wolff, Michael F. H.; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A.
2013-01-01
The fascination for hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-properties-relationship. During the last decades this numerously motivated the synthesis of composites, mimicking the brick-and-mortar structure of nacre. However, there is still a lack in synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic 2-level hierarchical composites by combining different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The microstructure achieved reveals a brick-and-mortar hierarchical structure with distinct, however not yet optimized mechanical properties on each level. It opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects to multi-functional structure-properties relationships. PMID:23900554
Model and Analytic Processes for Export License Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.
2011-09-29
This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level.
An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
Person-city personality fit and entrepreneurial success: An explorative study in China.
Zhou, Mingjie; Zhou, Yixin; Zhang, Jianxin; Obschonka, Martin; Silbereisen, Rainer K
2017-08-13
While the study of personality differences is a traditional psychological approach in entrepreneurship research, economic research directs attention towards the entrepreneurial ecosystems in which entrepreneurial activity is embedded. We combine both approaches and quantify the interplay between the individual personality make-up of entrepreneurs and the local personality composition of ecosystems, with a special focus on person-city personality fit. Specifically, we analyse personality data from N = 26,405 Chinese residents across 42 major Chinese cities, including N = 1091 Chinese entrepreneurs. Multi-level polynomial regression and response surface plots revealed that: (a) individual-level conscientiousness had a positive effect and individual-level agreeableness and neuroticism had a negative effect on entrepreneurial success, (b) city-level conscientiousness had a positive, and city-level neuroticism had a negative effect on entrepreneurial success, and (c) additional person-city personality fit effects existed for agreeableness, conscientiousness and neuroticism. For example, entrepreneurs who are high in agreeableness and conduct their business in a city with a low agreeableness level show the lowest entrepreneurial success. In contrast, entrepreneurs who are low in agreeableness and conduct their business in a city with a high agreeableness level show relatively high entrepreneurial success. Implications for research and practice are discussed. © 2017 International Union of Psychological Science.
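The polynomial response-surface regression behind such fit analyses typically uses the five terms p, c, p^2, p*c, c^2 of person (p) and city (c) scores. The sketch below fits that surface to synthetic data with known coefficients (the p*c term carrying the "fit" effect), so the estimates should recover the truth; it is not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
p = rng.normal(size=n)                        # person-level trait score (synthetic)
c = rng.normal(size=n)                        # city-level trait score (synthetic)

# assumed ground-truth surface; the p*c coefficient encodes person-city congruence
beta_true = np.array([0.5, 0.3, -0.2, 0.0, 0.4, 0.0])
X = np.column_stack([np.ones(n), p, c, p**2, p * c, c**2])
y = X @ beta_true + 0.1 * rng.normal(size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # recovers beta_true closely
```

Plotting y over the (p, c) plane from `beta_hat` gives the response surface examined in such studies.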
Structural diversity: a multi-dimensional approach to assess recreational services in urban parks.
Voigt, Annette; Kabisch, Nadja; Wurster, Daniel; Haase, Dagmar; Breuste, Jürgen
2014-05-01
Urban green spaces provide important recreational services for urban residents. In general, when park visitors enjoy "the green," they are in actuality appreciating a mix of biotic, abiotic, and man-made park infrastructure elements and qualities. We argue that these three dimensions of structural diversity have an influence on how people use and value urban parks. We present a straightforward approach for assessing urban parks that combines multi-dimensional landscape mapping and questionnaire surveys. We discuss the method as well as the results from its application to differently sized parks in Berlin and Salzburg.
Martelli, Nicolas; Hansen, Paul; van den Brink, Hélène; Boudard, Aurélie; Cordonnier, Anne-Laure; Devaux, Capucine; Pineau, Judith; Prognon, Patrice; Borget, Isabelle
2016-02-01
At the hospital level, decisions about purchasing new and oftentimes expensive medical devices must take into account multiple criteria simultaneously. Multi-criteria decision analysis (MCDA) is increasingly used for health technology assessment (HTA). One of the most successful hospital-based HTA approaches is mini-HTA, of which a notable example is the Matrix4value model. To develop a funding decision-support tool combining MCDA and mini-HTA, based on Matrix4value, suitable for medical devices for individual patient use in French university hospitals - known as the IDA tool, short for 'innovative device assessment'. Criteria for assessing medical devices were identified from a literature review and a survey of 18 French university hospitals. Weights for the criteria, representing their relative importance, were derived from a survey of 25 members of a medical devices committee using an elicitation technique involving pairwise comparisons. As a test of its usefulness, the IDA tool was applied to two new drug-eluting beads (DEBs) for transcatheter arterial chemoembolization. The IDA tool comprises five criteria and weights for each of two over-arching categories: risk and value. The tool revealed that the two new DEBs conferred no additional value relative to DEBs currently available. Feedback from participating decision-makers about the IDA tool was very positive. The tool could help to promote a more structured and transparent approach to HTA decision-making in French university hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.
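Deriving criterion weights from pairwise comparisons can be sketched with the geometric-mean method; the abstract only states that pairwise comparisons were elicited, so the 3x3 matrix below is an invented, perfectly consistent example (entry [i, j] = how many times criterion i matters more than criterion j).

```python
import numpy as np

# Invented, consistent pairwise-comparison matrix for three criteria
A = np.array([
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
])

gm = A.prod(axis=1) ** (1.0 / A.shape[1])   # row geometric means
weights = gm / gm.sum()                     # normalized: [4/7, 2/7, 1/7]
```

For a consistent matrix the geometric-mean weights coincide with the principal-eigenvector weights used in AHP-style elicitation.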
Multi-interface Level Sensors and New Development in Monitoring and Control of Oil Separators
Bukhari, Syed Faisal Ahmed; Yang, Wuqiang
2006-01-01
In the oil industry, huge savings may be made if suitable multi-interface level measurement systems are employed for effective monitoring of crude oil separators and efficient control of their operation. A number of techniques, e.g. externally mounted displacers, differential pressure transmitters and capacitance rod devices, have been developed to measure the separation process with gas, oil, water and other components. Because of the unavailability of suitable multi-interface level measurement systems, oil separators are currently operated by trial and error. In this paper some conventional techniques, which have been used for level measurement in industry, and new developments are discussed.
Uncertainties in the projection of species distributions related to general circulation models
Goberville, Eric; Beaugrand, Grégory; Hautekèete, Nina-Coralie; Piquot, Yves; Luczak, Christophe
2015-01-01
Ecological Niche Models (ENMs) are increasingly used by ecologists to project species' potential future distributions. However, the application of such models may be challenging, and some caveats have already been identified. While studies have generally shown that projections may be sensitive to the ENM applied or the emission scenario, to name just a few factors, the sensitivity of ENM-based scenarios to General Circulation Models (GCMs) has often been underappreciated. Here, using a multi-GCM and multi-emission-scenario approach, we evaluated the variability in projected distributions under future climate conditions. We modeled the ecological realized niche (sensu Hutchinson) and predicted the baseline distribution of species with contrasting spatial patterns and representative of two major functional groups of European trees: the dwarf birch and the sweet chestnut. Their future distributions were then projected onto future climatic conditions derived from seven GCMs and four emissions scenarios using the new Representative Concentration Pathways (RCPs) developed for the Intergovernmental Panel on Climate Change (IPCC) AR5 report. Uncertainties arising from GCMs and those resulting from emissions scenarios were quantified and compared. Our study reveals that scenarios of future species distribution exhibit broad differences, depending not only on emissions scenarios but also on GCMs. We found that the between-GCM variability was greater than the between-RCP variability for the next decades, and that both types of variability reached a similar level at the end of this century. Our results highlight that a combined multi-GCM and multi-RCP approach is needed to better consider potential trajectories and uncertainties in future species distributions. In all cases, between-GCM variability increases with the level of warming, and if nothing is done to alleviate global warming, future species spatial distributions may become more and more difficult to anticipate.
When future species spatial distributions are examined, we propose to use a large number of GCMs and RCPs to better anticipate potential trajectories and quantify uncertainties. PMID:25798227
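The between-GCM versus between-RCP variance comparison described above can be sketched with a small GCM × RCP matrix of projections. The numbers below are invented; the study used real projections from 7 GCMs and 4 RCPs.

```python
from statistics import pvariance, mean

# Hypothetical projected range sizes (arbitrary units) for one species,
# indexed as proj[gcm][rcp]; the real study used 7 GCMs x 4 RCPs.
proj = [
    [10.0, 11.0, 12.0, 13.0],   # GCM A
    [14.0, 15.0, 16.0, 17.0],   # GCM B
    [ 8.0,  9.0, 10.0, 11.0],   # GCM C
]

n_gcm, n_rcp = len(proj), len(proj[0])

# Between-GCM variability: variance across GCMs, averaged over RCPs.
between_gcm = mean(pvariance([proj[g][r] for g in range(n_gcm)])
                   for r in range(n_rcp))
# Between-RCP variability: variance across RCPs, averaged over GCMs.
between_rcp = mean(pvariance(proj[g]) for g in range(n_gcm))

# In this toy example the between-GCM spread exceeds the between-RCP spread,
# mirroring the paper's finding for the coming decades.
print(between_gcm, between_rcp)
```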
Efficient multi-atlas abdominal segmentation on clinically acquired CT with SIMPLE context learning.
Xu, Zhoubing; Burke, Ryan P; Lee, Christopher P; Baucom, Rebeccah B; Poulose, Benjamin K; Abramson, Richard G; Landman, Bennett A
2015-08-01
Abdominal segmentation on clinically acquired computed tomography (CT) has been a challenging problem given the inter-subject variance of human abdomens and complex 3-D relationships among organs. Multi-atlas segmentation (MAS) provides a potentially robust solution by leveraging label atlases via image registration and statistical fusion. We posit that the efficiency of atlas selection requires further exploration in the context of substantial registration errors. The selective and iterative method for performance level estimation (SIMPLE) method is a MAS technique integrating atlas selection and label fusion that has proven effective for prostate radiotherapy planning. Herein, we revisit atlas selection and fusion techniques for segmenting 12 abdominal structures using clinically acquired CT. Using a re-derived SIMPLE algorithm, we show that performance on multi-organ classification can be improved by accounting for exogenous information through Bayesian priors (so-called 'context learning'). These innovations are integrated with the joint label fusion (JLF) approach to reduce the impact of correlated errors among selected atlases for each organ, and a graph cut technique is used to regularize the combined segmentation. In a study of 100 subjects, the proposed method outperformed other comparable MAS approaches, including majority vote, SIMPLE, JLF, and the Wolz locally weighted vote technique. The proposed technique provides consistent improvement over state-of-the-art approaches (median improvement of 7.0% and 16.2% in DSC over JLF and Wolz, respectively) and moves toward efficient segmentation of large-scale clinically acquired CT data for biomarker screening, surgical navigation, and data mining. Copyright © 2015 Elsevier B.V. All rights reserved.
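The simplest fusion baseline mentioned above, majority vote, can be sketched directly. Each registered atlas proposes a label per voxel and the fused label is the most frequent vote; the labels below are hypothetical, and the paper's actual method adds SIMPLE atlas selection, context-learning priors, JLF, and graph-cut regularization on top of such fusion.

```python
from collections import Counter

# Minimal majority-vote label fusion over toy 1-D "volumes": one label list
# per registered atlas, equally long; ties break toward the first-seen label.

def majority_vote(atlas_labels):
    """atlas_labels: list of equally long per-atlas label lists."""
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*atlas_labels)]

atlas_a = [0, 1, 1, 2]   # hypothetical organ labels for 4 voxels
atlas_b = [0, 1, 2, 2]
atlas_c = [1, 1, 1, 2]

print(majority_vote([atlas_a, atlas_b, atlas_c]))  # → [0, 1, 1, 2]
```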
Automatic Prediction of Protein 3D Structures by Probabilistic Multi-template Homology Modeling.
Meier, Armin; Söding, Johannes
2015-10-01
Homology modeling predicts the 3D structure of a query protein based on the sequence alignment with one or more template proteins of known structure. Its great importance for biological research is owed to its speed, simplicity, reliability and wide applicability, covering more than half of the residues in protein sequence space. Although multiple templates have been shown to generally increase model quality over single templates, the information from multiple templates has so far been combined using empirically motivated, heuristic approaches. We present here a rigorous statistical framework for multi-template homology modeling. First, we find that the query proteins' atomic distance restraints can be accurately described by two-component Gaussian mixtures. This insight allowed us to apply the standard laws of probability theory to combine restraints from multiple templates. Second, we derive theoretically optimal weights to correct for the redundancy among related templates. Third, a heuristic template selection strategy is proposed. We improve the average GDT-ha model quality score by 11% over single template modeling and by 6.5% over a conventional multi-template approach on a set of 1000 query proteins. Robustness with respect to wrong constraints is likewise improved. We have integrated our multi-template modeling approach with the popular MODELLER homology modeling software in our free HHpred server http://toolkit.tuebingen.mpg.de/hhpred and also offer open source software for running MODELLER with the new restraints at https://bitbucket.org/soedinglab/hh-suite.
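The probabilistic combination described above can be illustrated with a toy version: each template contributes a two-component Gaussian mixture over a query atomic distance, and, assuming independence, the combined (unnormalised) log-density is a weighted sum of per-template log-mixtures. The mixture parameters and template weights below are invented stand-ins for the paper's fitted mixtures and redundancy-corrected weights.

```python
import math

# Toy combination of distance restraints from two templates. All numeric
# parameters are hypothetical; the paper fits mixtures to data and derives
# theoretically optimal template weights.

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def log_mixture(x, comps):
    """comps: list of (component weight, mu, sigma); weights sum to 1."""
    return math.log(sum(w * gauss(x, mu, s) for w, mu, s in comps))

templates = [  # (template weight, two-component mixture over distance in Å)
    (0.7, [(0.8, 5.0, 0.3), (0.2, 5.0, 1.5)]),
    (0.3, [(0.8, 5.4, 0.4), (0.2, 5.4, 2.0)]),
]

def combined_logpdf(x):
    return sum(tw * log_mixture(x, comps) for tw, comps in templates)

# The combined restraint favours distances near the templates' consensus:
print(combined_logpdf(5.1) > combined_logpdf(7.0))  # → True
```

The broad second mixture component is what makes such restraints robust: a distance far from one template's mean is penalised, but not catastrophically, which matches the improved robustness to wrong constraints reported above.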
Recommendations for level-determined sampling in wells
NASA Astrophysics Data System (ADS)
Lerner, David N.; Teutsch, Georg
1995-10-01
Level-determined samples of groundwater are increasingly important for hydrogeological studies. The techniques for collecting them range from the use of purpose drilled wells, sometimes with sophisticated dedicated multi-level samplers in them, to a variety of methods used in open wells. Open, often existing, wells are frequently used on cost grounds, but there are risks of obtaining poor and unrepresentative samples. Alternative approaches to level-determined sampling incorporate seven concepts: depth sampling; packer systems; individual wells; dedicated multi-level systems; separation pumping; baffle systems; multi-port sock samplers. These are outlined and evaluated in terms of the environment to be sampled, and the features and performance of the methods. Recommendations are offered to match methods to sampling problems.
Generating multi-double-scroll attractors via nonautonomous approach.
Hong, Qinghui; Xie, Qingguo; Shen, Yi; Wang, Xiaoping
2016-08-01
It is a common phenomenon that multi-scroll attractors are realized by introducing various nonlinear functions with multiple breakpoints into double-scroll chaotic systems. In contrast, we present a nonautonomous approach for generating multi-double-scroll attractors (MDSA) without changing the original nonlinear functions. By using the multi-level-logic pulse excitation technique in double-scroll chaotic systems, MDSA can be generated. A Chua's circuit, a Jerk circuit, and a modified Lorenz system are given as design examples, and the Matlab simulation results are presented. Furthermore, the corresponding realization circuits are designed. The Pspice results are in agreement with the numerical simulation results, which verifies the validity and feasibility of this method.
Eberhardt, Martin; Lai, Xin; Tomar, Namrata; Gupta, Shailendra; Schmeck, Bernd; Steinkasserer, Alexander; Schuler, Gerold; Vera, Julio
2016-01-01
The understanding of the immune response is right now at the center of biomedical research. There are growing expectations that immune-based interventions will in the midterm provide new, personalized, and targeted therapeutic options for many severe and highly prevalent diseases, from aggressive cancers to infectious and autoimmune diseases. To this end, immunology should surpass its current descriptive and phenomenological nature and become quantitative, and thereby predictive. Immunology is an ideal field for deploying the tools, methodologies, and philosophy of systems biology, an approach that combines quantitative experimental data, computational biology, and mathematical modeling. This is because, from an organism-wide perspective, immunity is a biological system of systems, a paradigmatic instance of a multi-scale system. At the molecular scale, the critical phenotypic responses of immune cells are governed by large biochemical networks, enriched in nested regulatory motifs such as feedback and feedforward loops. This network complexity confers on them the capacity for highly nonlinear behavior, including remarkable examples of homeostasis, ultra-sensitivity, hysteresis, and bistability. Moving up from the cellular level, different immune cell populations communicate with each other by direct physical contact or by secreting and receiving signaling molecules such as cytokines. Moreover, the interaction of the immune system with its potential targets (e.g., pathogens or tumor cells) is far from simple, as it involves a number of attack and counterattack mechanisms that ultimately constitute a tightly regulated multi-feedback loop system.
From a more practical perspective, this means that today's immunologists face an ever-increasing challenge of integrating massive quantities of data from multiple experimental platforms. In this chapter, we support the idea that the analysis of the immune system demands the use of systems-level approaches to ensure success in the search for more effective and personalized immune-based therapies.
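The bistability attributed above to feedback-rich signalling networks can be demonstrated with a minimal, generic positive-feedback model (a Hill-type ODE, not any specific immunological circuit; all parameters are illustrative).

```python
# Toy bistable switch: dx/dt = basal + vmax * x^n / (K^n + x^n) - deg * x.
# Generic positive-feedback Hill kinetics with made-up parameters, integrated
# with forward Euler; two initial conditions settle on two distinct states.

def simulate(x0, steps=20000, dt=0.01,
             basal=0.05, vmax=1.0, K=0.5, n=4, deg=1.0):
    x = x0
    for _ in range(steps):
        dx = basal + vmax * x**n / (K**n + x**n) - deg * x
        x += dt * dx
    return x

low = simulate(0.0)    # settles on the low ("off") steady state
high = simulate(2.0)   # settles on the high ("on") steady state
print(round(low, 3), round(high, 3))
```

The same parameters yield two stable steady states depending only on history, which is the hysteresis/bistability behaviour the chapter describes.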
Integration of ultra-high field MRI and histology for connectome based research of brain disorders
Yang, Shan; Yang, Zhengyi; Fischer, Karin; Zhong, Kai; Stadler, Jörg; Godenschweger, Frank; Steiner, Johann; Heinze, Hans-Jochen; Bernstein, Hans-Gert; Bogerts, Bernhard; Mawrin, Christian; Reutens, David C.; Speck, Oliver; Walter, Martin
2013-01-01
Ultra-high field magnetic resonance imaging (MRI) has become increasingly relevant for in vivo neuroscientific research because of its improved spatial resolution. However, microscopic resolution is still the unchallenged domain of histological studies, which have long played an important role in the investigation of neuropsychiatric disorders. While the field of biological psychiatry has advanced strongly at macroscopic levels, current developments are rediscovering the richness of immunohistological information when attempting a multi-level systematic approach to brain function and dysfunction. In most studies, histological sections lose the information required for three-dimensional reconstruction. Translating histological sections into 3D volumes would thus not only allow for multi-stain and multi-subject alignment in post mortem data, but also provide a crucial step in big-data initiatives involving the network analyses currently performed with in vivo MRI. We therefore investigated potential pitfalls during the integration of MR and histological information where no additional blockface information is available. We demonstrate that the strengths and requirements of both methods can be effectively combined at a spatial resolution of 200 μm. However, the success of this approach is heavily dependent on choices of hardware, sequence, and reconstruction. We provide a fully automated pipeline that optimizes histological 3D reconstructions, providing a potentially powerful solution not only for human post mortem research institutions in neuropsychiatric research, but also to help alleviate the massive workloads in neuroanatomical atlas initiatives. We further demonstrate, for the first time, the feasibility and quality of ultra-high spatial resolution (150 μm isotropic) MRI of the entire human brain at 7 T, offering new opportunities for analyses of MR-derived information. PMID:24098272
Medical image classification based on multi-scale non-negative sparse coding.
Zhang, Ruijie; Shen, Jian; Wei, Fushan; Li, Xiong; Sangaiah, Arun Kumar
2017-11-01
With the rapid development of modern medical imaging technology, medical image classification has become more and more important in medical diagnosis and clinical practice. Conventional medical image classification algorithms usually neglect the semantic gap between low-level features and high-level image semantics, which largely degrades classification performance. To solve this problem, we propose a multi-scale non-negative sparse coding based medical image classification algorithm. First, medical images are decomposed into multiple scale layers, so that diverse visual details can be extracted from different scale layers. Second, for each scale layer, a non-negative sparse coding model with Fisher discriminant analysis is constructed to obtain a discriminative sparse representation of the medical images. Then, the obtained multi-scale non-negative sparse coding features are combined to form a multi-scale feature histogram as the final representation of a medical image. Finally, an SVM classifier is used to perform medical image classification. The experimental results demonstrate that our proposed algorithm can effectively utilize multi-scale and contextual spatial information of medical images, reduce the semantic gap to a large degree and improve medical image classification performance. Copyright © 2017 Elsevier B.V. All rights reserved.
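The multi-scale decomposition and per-scale descriptor concatenation can be sketched with plain histograms in place of the paper's non-negative sparse coding (which, together with the Fisher analysis, is omitted here; the image values are invented).

```python
# Sketch of the multi-scale idea only: decompose an image by repeated 2x2
# average-pooling, describe each scale by an intensity histogram, and
# concatenate the per-scale descriptors. The paper replaces the plain
# histograms with non-negative sparse coding, omitted here for brevity.

def downsample(img):
    """Halve resolution by averaging non-overlapping 2x2 blocks."""
    h, w = len(img), len(img[0])
    return [[(img[r][c] + img[r][c+1] + img[r+1][c] + img[r+1][c+1]) / 4.0
             for c in range(0, w, 2)] for r in range(0, h, 2)]

def histogram(img, bins=4):
    hist = [0] * bins
    for row in img:
        for v in row:             # intensities assumed in [0, 1]
            hist[min(int(v * bins), bins - 1)] += 1
    return hist

def multi_scale_descriptor(img, n_scales=3):
    feats = []
    for i in range(n_scales):
        feats.extend(histogram(img))
        if i < n_scales - 1:
            img = downsample(img)
    return feats

img = [[0.0, 0.2, 0.9, 1.0],
       [0.1, 0.3, 0.8, 0.9],
       [0.0, 0.1, 0.2, 0.3],
       [0.1, 0.2, 0.3, 0.4]]
print(multi_scale_descriptor(img))  # → [8, 4, 0, 4, 2, 1, 0, 1, 0, 1, 0, 0]
```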
Kukafka, Rita; Johnson, Stephen B; Linfante, Allison; Allegrante, John P
2003-06-01
Many interventions to improve the success of information technology (IT) implementations are grounded in behavioral science, using theories and models to identify the conditions and determinants of successful use. However, each model in the IT literature has evolved to address specific theoretical problems of particular disciplinary concerns, and each model has been tested and has evolved using, in most cases, a more or less restricted set of IT implementation procedures. Functionally, this limits the perspective for taking into account the multiple factors at the individual, group, and organizational levels that influence use behavior. While a rich body of literature has emerged, employing prominent models such as the Technology Acceptance Model, Social Cognitive Theory, and Diffusion of Innovations Theory, the complexity of defining a suitable multi-level intervention has largely been overlooked. A gap exists between the implementation of IT and the integration of theories and models that can be utilized to develop multi-level approaches to identify factors that impede usage behavior. We present a novel framework that is intended to guide the synthesis of more than one theoretical perspective for the purpose of planning multi-level interventions to enhance IT use. This integrative framework is adapted from PRECEDE/PROCEED, a conceptual framework used by health planners in hundreds of published studies to direct interventions that account for the multiple determinants of behavior. Since we claim that the literature on IT use behavior does not now include a multi-level approach, we undertook a systematic literature analysis to confirm this assertion. Our framework facilitated organizing this literature synthesis, and our analysis was aimed at determining whether the IT implementation approaches in the published literature were characterized by an approach that considered at least two levels of IT usage determinants.
We found that while 61% of studies mentioned or referred to theory, none considered two or more levels. In other words, although the researchers employ behavioral theory, they omit two fundamental propositions: (1) IT usage is influenced by multiple factors and (2) interventions must be multi-dimensional. Our literature synthesis may provide additional insight into the reason for high failure rates associated with underutilized systems, and underscores the need to move beyond the current dominant approach that employs a single model to guide IT implementation plans that aim to address factors associated with IT acceptance and subsequent positive use behavior.
Overweight and obesity in India: policy issues from an exploratory multi-level analysis.
Siddiqui, Md Zakaria; Donato, Ronald
2016-06-01
This article analyses a nationally representative household dataset, the National Family Health Survey (NFHS-3) conducted in 2005-2006, to examine factors influencing the prevalence of overweight/obesity in India. The dataset was disaggregated into four sub-population groups (urban and rural females and males), and multi-level logit regression models were used to estimate the impact of particular covariates on the likelihood of overweight/obesity. The multi-level modelling approach aimed to identify individual and macro-level contextual factors influencing this health outcome. In contrast to most studies on low-income developing countries, the findings reveal that, for females, education beyond a particular level of attainment exhibits a negative relationship with the likelihood of overweight/obesity. This relationship was not observed for males. Muslim females and all Sikh sub-populations have a higher likelihood of overweight/obesity, suggesting the importance of socio-cultural influences. The results also show that the relationship between wealth and the probability of overweight/obesity is stronger for males than females, highlighting the differential impact of increasing socio-economic status by gender. Multi-level analysis reveals that states exerted an independent influence on the likelihood of overweight/obesity beyond individual-level covariates, reflecting the importance of spatially related contextual factors. While this study does not disentangle macro-level 'obesogenic' environmental factors from socio-cultural network influences, the results highlight the need to refrain from adopting a 'one size fits all' policy approach in addressing the overweight/obesity epidemic facing India. Instead, policy implementation requires a more nuanced and targeted approach that incorporates the growing recognition of socio-cultural and spatial contextual factors impacting on healthy behaviours. © The Author 2015.
Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Gao, Jie; Roan, Esra; Williams, John L.
2015-01-01
The physis, or growth plate, is a complex disc-shaped cartilage structure that is responsible for longitudinal bone growth. In this study, a multi-scale computational approach was undertaken to better understand how physiological loads are experienced by chondrocytes embedded inside chondrons when subjected to moderate strain under instantaneous compressive loading of the growth plate. Models of representative samples of compressed bone/growth-plate/bone from a 0.67 mm thick 4-month old bovine proximal tibial physis were subjected to a prescribed displacement equal to 20% of the growth plate thickness. At the macroscale level, the applied compressive deformation resulted in an overall compressive strain across the proliferative-hypertrophic zone of 17%. The microscale model predicted that chondrocytes sustained compressive height strains of 12% and 6% in the proliferative and hypertrophic zones, respectively, in the interior regions of the plate. This pattern was reversed within the outer 300 μm region at the free surface where cells were compressed by 10% in the proliferative and 26% in the hypertrophic zones, in agreement with experimental observations. This work provides a new approach to study growth plate behavior under compression and illustrates the need for combining computational and experimental methods to better understand the chondrocyte mechanics in the growth plate cartilage. While the current model is relevant to fast dynamic events, such as heel strike in walking, we believe this approach provides new insight into the mechanical factors that regulate bone growth at the cell level and provides a basis for developing models to help interpret experimental results at varying time scales. PMID:25885547
Reaching Mars: multi-criteria R&D portfolio selection for Mars exploration technology planning
NASA Technical Reports Server (NTRS)
Smith, J. H.; Dolgin, B. P.; Weisbin, C. R.
2003-01-01
The exploration of Mars has been the focus of increasing scientific interest about the planet and its relationship to Earth. A multi-criteria decision-making approach was developed to address the question: "Given a Mars program composed of mission concepts dependent on a variety of alternative technology development programs, which combination of technologies would enable missions to maximize science return under a constrained budget?"
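The budget-constrained selection question above has the shape of a knapsack problem, which a toy exhaustive search makes concrete. The program names, costs, and science-return scores are invented; the actual study used a multi-criteria analysis over real Mars mission concepts, not this single-objective simplification.

```python
from itertools import combinations

# Toy portfolio selection: choose the subset of technology programs that
# maximises science return within a fixed budget. All names and numbers
# below are hypothetical illustrations.

programs = {            # name: (cost in $M, science return score)
    "aerocapture":   (40, 60),
    "precision_EDL": (55, 75),
    "sample_drill":  (30, 40),
    "surface_power": (25, 30),
}

def best_portfolio(programs, budget):
    best, best_return = (), 0
    names = list(programs)
    for k in range(len(names) + 1):
        for combo in combinations(names, k):
            cost = sum(programs[n][0] for n in combo)
            ret = sum(programs[n][1] for n in combo)
            if cost <= budget and ret > best_return:
                best, best_return = combo, ret
    return best, best_return

print(best_portfolio(programs, budget=100))
```

Exhaustive search is fine for a handful of programs; a real planning exercise with many technologies would use integer programming or dynamic programming instead.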
A Social-Ecological Framework of Theory, Assessment, and Prevention of Suicide
Cramer, Robert J.; Kapusta, Nestor D.
2017-01-01
The juxtaposition of increasing suicide rates with continued calls for suicide prevention efforts begs for new approaches. Grounded in the Centers for Disease Control and Prevention (CDC) framework for tackling health issues, this personal-views work integrates the relevant suicide risk/protective factor, assessment, and intervention/prevention literatures. Based on these components of suicide risk, we articulate a Social-Ecological Suicide Prevention Model (SESPM) which provides an integration of general and population-specific risk and protective factors. We also use this multi-level perspective to provide a structured approach to understanding current theories and intervention/prevention efforts concerning suicide. Following similar multi-level prevention efforts in the interpersonal violence and Human Immunodeficiency Virus (HIV) domains, we offer recommendations for social-ecologically informed suicide prevention theory, training, research, assessment, and intervention programming. Although the SESPM calls for further empirical testing, it provides a suitable backdrop for tailoring current prevention and intervention programs to population-specific needs. Moreover, the multi-level model shows promise for moving suicide risk assessment forward (e.g., the development of multi-level suicide risk algorithms or structured professional judgment instruments) to overcome current limitations in the field. Finally, we articulate a set of characteristics of social-ecologically based suicide prevention programs. These include the need to address risk and protective factors with the strongest degree of empirical support at each level, incorporate a comprehensive program evaluation strategy, and use a variety of prevention techniques across levels of prevention. PMID:29062296
Granovsky, Alexander A
2011-06-07
The distinctive desirable features, both mathematically and physically meaningful, for all partially contracted multi-state multi-reference perturbation theories (MS-MR-PT) are explicitly formulated. An original approach to MS-MR-PT theory, called extended multi-configuration quasi-degenerate perturbation theory (XMCQDPT), having most, if not all, of the desirable properties, is introduced. The new method is applied at the second order of perturbation theory (XMCQDPT2) to the 1(1)A(')-2(1)A(') conical intersection in the allene molecule, the avoided crossing in the LiF molecule, and the 1(1)A(1) to 2(1)A(1) electronic transition in cis-1,3-butadiene. The new theory has several advantages over well-established approaches, such as second-order multi-configuration quasi-degenerate perturbation theory and multi-state second-order complete active space perturbation theory. The analysis of the prevalent approaches to MS-MR-PT theory performed within the framework of the XMCQDPT theory unveils the origin of their common inherent problems. We describe an efficient implementation strategy that makes XMCQDPT2 an especially useful general-purpose tool for high-level modeling of small to large molecular systems. © 2011 American Institute of Physics.
Kia, Seyed Mostafa; Pedregosa, Fabian; Blumenthal, Anna; Passerini, Andrea
2017-06-15
The use of machine learning models to discriminate between patterns of neural activity has in recent years become a standard analysis approach in neuroimaging studies. Whenever these models are linear, the estimated parameters can be visualized in the form of brain maps which can aid in understanding how brain activity in space and time underlies a cognitive function. However, the recovered brain maps often suffer from a lack of interpretability, especially in group analysis of multi-subject data. To facilitate the application of brain decoding in group-level analysis, we present an application of multi-task joint feature learning for group-level multivariate pattern recovery in single-trial magnetoencephalography (MEG) decoding. The proposed method allows for recovering sparse yet consistent patterns across different subjects, and therefore enhances the interpretability of the decoding model. Our experimental results demonstrate that the multi-task joint feature learning framework is capable of recovering more meaningful patterns of varying spatio-temporally distributed brain activity across individuals while still maintaining excellent generalization performance. We compare the performance of multi-task joint feature learning in terms of generalization, reproducibility, and quality of pattern recovery against traditional single-subject and pooling approaches on both simulated and real MEG datasets. These results can facilitate the use of brain decoding for the characterization of fine-level distinctive patterns in group-level inference. Considering the importance of group-level analysis, the proposed approach can provide a methodological shift towards more interpretable brain decoding models. Copyright © 2017 Elsevier B.V. All rights reserved.
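A common way to realise multi-task joint feature learning is an L2,1 (group-lasso) penalty that groups each feature's weights across subjects and shrinks whole rows to zero, so all subjects share one sparse feature support. The sketch below shows only the row-wise group soft-thresholding (proximal) step of such a scheme, with invented numbers, not the paper's full decoding pipeline.

```python
import math

# Row-wise group soft-thresholding, the proximal operator of the L2,1 norm.
# W is a (features x tasks) weight matrix as a list of rows; each row holds
# one feature's weights across subjects/tasks.

def group_soft_threshold(W, lam):
    out = []
    for row in W:
        norm = math.sqrt(sum(v * v for v in row))
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        out.append([scale * v for v in row])
    return out

W = [[0.9, 1.1, 1.0],      # feature consistent across 3 subjects: kept
     [0.05, -0.04, 0.02]]  # weak, inconsistent feature: zeroed out
print(group_soft_threshold(W, lam=0.5))
```

Because the whole row is kept or discarded together, the selected features are identical for every subject, which is what yields the sparse yet consistent group-level patterns described above.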
Estimation of Solar Radiation on Building Roofs in Mountainous Areas
NASA Astrophysics Data System (ADS)
Agugiaro, G.; Remondino, F.; Stevanato, G.; De Filippi, R.; Furlanello, C.
2011-04-01
The aim of this study is to estimate solar radiation on building roofs in complex mountain landscapes. A multi-scale solar radiation estimation methodology is proposed that combines 3D data ranging from the regional scale to the architectural scale. Both terrain and nearby-building shadowing effects are considered. The approach is modular, and several alternative roof models, obtained by surveying and modelling techniques at varying levels of detail, can be embedded in a DTM, e.g. that of an Alpine valley surrounded by mountains. The solar radiation maps obtained from raster models at different resolutions are compared and evaluated in order to obtain information on the benefits and disadvantages of each roof modelling approach. The solar radiation estimation is performed within the open-source GRASS GIS environment using r.sun and its ancillary modules.
NASA Astrophysics Data System (ADS)
Riccio, A.; Giunta, G.; Galmarini, S.
2007-04-01
In this paper we present an approach for the statistical analysis of multi-model ensemble results. The models considered here are operational long-range transport and dispersion models, also used for the real-time simulation of pollutant dispersion or the accidental release of radioactive nuclides. We first introduce the theoretical basis (rooted in Bayes' theorem) and then apply this approach to the analysis of model results obtained during the ETEX-1 exercise. We recover some interesting results, supporting the heuristic approach called the "median model", originally introduced in Galmarini et al. (2004a, b). This approach also provides a way to systematically reduce (and quantify) model uncertainties, thus supporting the decision-making process and/or regulatory-purpose activities in a very effective manner.
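The "median model" heuristic analysed above is simple to state: at each receptor or grid point, take the median of the concentrations predicted by the ensemble members. The model names and values below are invented; ETEX-1 involved real dispersion-model output.

```python
from statistics import median

# Median-model ensemble combination: per-station median across members.
# All model names and concentration values are hypothetical.

ensemble = {             # model name -> predicted concentration per station
    "model_a": [1.0, 4.0, 0.2],
    "model_b": [1.2, 9.0, 0.1],
    "model_c": [0.9, 5.0, 0.4],
}

def median_model(ensemble):
    return [median(vals) for vals in zip(*ensemble.values())]

print(median_model(ensemble))  # → [1.0, 5.0, 0.2]
```

Note how the median discards model_b's outlying value at the second station, which is the robustness that makes the heuristic attractive for decision support.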
NASA Astrophysics Data System (ADS)
Zhang, Ying; Feng, Yuanming; Wang, Wei; Yang, Chengwen; Wang, Ping
2017-03-01
A novel and versatile “bottom-up” approach is developed to estimate the radiobiological effect of clinical radiotherapy. The model consists of multi-scale Monte Carlo simulations from the organ level down to the cell level. At the cellular level, accumulated damage is computed using a spectrum-based accumulation algorithm and a predefined cellular damage database. The damage repair mechanism is modeled by an expanded reaction-rate two-lesion kinetic model, which was calibrated by replicating a radiobiological experiment. Multi-scale modeling is then performed on a lung cancer patient under conventional fractionated irradiation. The cell killing effects of two representative voxels (the isocenter and a peripheral voxel of the tumor) are computed and compared. At the microscopic level, the nucleus dose and damage yields vary among the nuclei within each voxel. A slightly larger percentage of cDSB yield is observed for the peripheral voxel (55.0%) compared with the isocenter voxel (52.5%). For the isocenter voxel, the survival fraction increases monotonically in reduced-oxygen environments. Under an extreme anoxic condition (0.001%), the survival fraction is calculated to be 80% and the hypoxia reduction factor reaches a maximum value of 2.24. In conclusion, with biology-related variations, the proposed multi-scale approach is more versatile than existing approaches for evaluating personalized radiobiological effects in radiotherapy.
Studies of a new multi-layer compression bandage for the treatment of venous ulceration.
Scriven, J M; Bello, M; Taylor, L E; Wood, A J; London, N J
2000-03-01
This study aimed to develop an alternative graduated compression bandage for the treatment of venous leg ulcers. Alternative bandage components were identified and assessed for optimal performance as a graduated multi-layer compression bandage. Subsequently the physical characteristics and clinical efficacy of the optimal bandage combination were prospectively examined. Ten healthy limbs were used to develop the optimal combination and 20 limbs with venous ulceration to compare the physical properties of the two bandage types. Subsequently 42 consecutive ulcerated limbs were prospectively treated to examine the efficacy of the new bandage combination. The new combination produced graduated median (range) sub-bandage pressures (mmHg) as follows: ankle 59 (42-100), calf 36 (27-67) and knee 35 (16-67). Over a seven-day period this combination maintained a level of compression comparable with the Charing Cross system, and achieved an overall healing rate at one year of 88%. The described combination should be brought to the attention of healthcare professionals treating venous ulcers as a possible alternative to other forms of multi-layer graduated compression bandages pending prospective, randomised clinical trials.
A Multi-level Fuzzy Evaluation Method for Smart Distribution Network Based on Entropy Weight
NASA Astrophysics Data System (ADS)
Li, Jianfang; Song, Xiaohui; Gao, Fei; Zhang, Yu
2017-05-01
The smart distribution network is considered the future trend of distribution networks. In order to comprehensively evaluate the construction level of smart distribution networks and give guidance to construction practice, a multi-level fuzzy evaluation method based on entropy weight is proposed. Firstly, focusing on both the conventional characteristics of distribution networks and new characteristics of smart distribution networks such as self-healing and interaction, a multi-level evaluation index system covering power supply capability, power quality, economy, reliability and interaction is established. Then, a combination weighting method based on the Delphi method and the entropy weight method is put forward, which takes into account not only the importance of each evaluation index in the experts' subjective view, but also the objective information carried by the index values. Thirdly, a multi-level evaluation method based on fuzzy theory is put forward. Lastly, an example is conducted based on statistical data from several cities' distribution networks, and the evaluation method is shown to be effective and rational.
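The entropy weight step can be sketched as follows: each evaluation index (column) whose values differ more across alternatives carries more information and receives a larger objective weight. This is a standard-form sketch, not the paper's exact implementation; the combination with Delphi weights is omitted.

```python
import math

def entropy_weights(X):
    """Objective entropy weights for an m x n decision matrix.

    X: m alternatives x n indices, non-negative benefit-type values.
    Returns one weight per index (column), summing to 1.
    """
    m, n = len(X), len(X[0])
    k = 1.0 / math.log(m)  # normalizing constant so entropy lies in [0, 1]
    divergence = []
    for j in range(n):
        col = [X[i][j] for i in range(m)]
        s = sum(col)
        p = [x / s for x in col]
        # Shannon entropy of the column (with 0 * ln 0 := 0)
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergence.append(1.0 - e)  # more divergence -> more information
    total = sum(divergence)
    return [d / total for d in divergence]
```

A column that is identical across all alternatives has maximum entropy and hence zero weight, which matches the intuition that an index with no variation cannot discriminate between construction levels.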
Control of Thermo-Acoustics Instabilities: The Multi-Scale Extended Kalman Approach
NASA Technical Reports Server (NTRS)
Le, Dzu K.; DeLaat, John C.; Chang, Clarence T.
2003-01-01
"Multi-Scale Extended Kalman" (MSEK) is a novel model-based control approach recently found to be effective for suppressing combustion instabilities in gas turbines. A control law formulated in this approach for fuel modulation demonstrated steady suppression of a high-frequency combustion instability (less than 500 Hz) in a liquid-fuel combustion test rig under engine-realistic conditions. To make up for severe transport delays in the control path, the MSEK controller combines a wavelet-like multi-scale analysis and an Extended Kalman Observer to predict the thermo-acoustic states of combustion pressure perturbations. The commanded fuel modulation is composed of a damper action based on the predicted states and a tone-suppression action based on the multi-scale estimation of thermal excitations and other transient disturbances. The controller automatically adjusts the gain and phase of these actions to minimize the time-scale-averaged variances of the pressures inside the combustion zone and upstream of the injector. The successful demonstration of active combustion control with this MSEK controller completed an important NASA milestone for current research in advanced combustion technologies.
Using Evaluation Research as a Means for Policy Analysis in a "New" Mission-Oriented Policy Context
ERIC Educational Resources Information Center
Amanatidou, Effie; Cunningham, Paul; Gök, Abdullah; Garefi, Ioanna
2014-01-01
Grand challenges stress the importance of multi-disciplinary research, a multi-actor approach in examining the current state of affairs and exploring possible solutions, multi-level governance and policy coordination across geographical boundaries and policy areas, and a policy environment for enabling change both in science and technology and in…
Device Independent Layout and Style Editing Using Multi-Level Style Sheets
NASA Astrophysics Data System (ADS)
Dees, Walter
This paper describes a layout and styling framework based on the multi-level style sheets approach. It shows some of the techniques that can be used to add layout and style information to a UI in a device-independent manner, and how layout and style information can be reused to create user interfaces for different devices.
Achieving bifunctional cloak via combination of passive and active schemes
NASA Astrophysics Data System (ADS)
Lan, Chuwen; Bi, Ke; Gao, Zehua; Li, Bo; Zhou, Ji
2016-11-01
In this study, a simple and elegant approach to manipulating multiple physical fields simultaneously through a combination of passive and active schemes is proposed. In the design, one physical field is manipulated with a passive scheme while the other is manipulated with an active scheme. As a proof of concept, a bifunctional device is designed and fabricated to act as an electric and thermal invisibility cloak simultaneously. The experimental results agree well with the simulations, confirming the feasibility of our method. Furthermore, the proposed method can be extended to other multi-physics fields, which might lead to potential applications in the thermal, electric, and acoustic areas.
Pupils' Views of Religious Education in a Pluralistic Educational Context
ERIC Educational Resources Information Center
Kuusisto, Arniika; Kallioniemi, Arto
2014-01-01
This article examines Finnish pupils' views of religious education (RE) in a pluralistic educational context. The focus is on pupils' views of the aims and different approaches to RE in a multi-faith school. The study utilised a mixed method approach, combining quantitative and qualitative data. It employed a survey (n = 1301) and interviews (n =…
Single Channel EEG Artifact Identification Using Two-Dimensional Multi-Resolution Analysis.
Taherisadr, Mojtaba; Dehzangi, Omid; Parsaei, Hossein
2017-12-13
Electroencephalogram (EEG) signals can be decoded by signal processing methodologies for various health monitoring and diagnostic purposes. However, EEG recordings are contaminated by interference, particularly facial and ocular artifacts generated by the user. This is especially an issue during continuous EEG recording sessions, so separating such artifacts from useful EEG components is a key step in using EEG signals for either physiological monitoring and diagnosis or brain-computer interfaces. In this study, we design a new generic framework that processes and characterizes an EEG recording as a multi-component, non-stationary signal, with the aim of localizing and identifying its components (e.g., artifacts). In the proposed method, we bring together three complementary algorithms to enhance the efficiency of the system: time-frequency (TF) analysis and representation, two-dimensional multi-resolution analysis (2D MRA), and feature extraction and classification. A combination of spectro-temporal and geometric features is then extracted by combining key instantaneous TF space descriptors, which enables the system to characterize the non-stationarities in the EEG dynamics. We apply a curvelet transform (as an MRA method) to the 2D TF representation of EEG segments to decompose the given space into various levels of resolution. Such a decomposition efficiently improves the analysis of TF spaces with different characteristics (e.g., resolution). Our experimental results demonstrate that the combination of expansion to TF space, analysis using MRA, and extraction of a suitable feature set with a proper predictive model is effective in enhancing EEG artifact identification performance. We also compare the performance of the designed system with another common EEG signal processing technique, namely the 1D wavelet transform.
Our experimental results reveal that the proposed method outperforms the 1D wavelet approach.
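The first stage of the pipeline, the 2D time-frequency representation that the MRA stage later decomposes, can be sketched with a naive sliding-window DFT spectrogram. This is a generic illustration, not the paper's exact TF method, and the window/hop values are arbitrary.

```python
import cmath

def spectrogram(x, win=64, hop=32):
    """Naive magnitude spectrogram of a 1D signal.

    Returns a 2D TF image (rows: time frames, cols: frequency bins),
    the kind of representation a curvelet-based MRA stage would decompose.
    """
    frames = []
    for start in range(0, len(x) - win + 1, hop):
        seg = x[start:start + win]
        mags = []
        for k in range(win // 2):  # keep the non-redundant half of the spectrum
            c = sum(seg[n] * cmath.exp(-2j * cmath.pi * k * n / win)
                    for n in range(win))
            mags.append(abs(c))
        frames.append(mags)
    return frames
```

A pure tone shows up as a ridge along one frequency bin, whereas a transient ocular artifact smears energy across bins in a few frames; it is this geometric difference in the TF image that the multi-resolution decomposition exploits.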
Li, L; Guennel, T; Marshall, S; Cheung, L W-K
2014-10-01
Delivering on the promise of personalized medicine has become a focus of the pharmaceutical industry as the era of the blockbuster drug is fading. Central to realizing this promise is the need for improved analytical strategies for effectively integrating information across various biological assays (for example, copy number variation and targeted protein expression) toward identification of a treatment-specific subgroup-identifying the right patients. We propose a novel combination of elastic net followed by a maximal χ² and semiparametric bootstrap. The combined approaches are presented in a two-stage strategy that estimates patient-specific multi-marker molecular signatures (MMMS) to identify and directly test for a biomarker-driven subgroup with enhanced treatment effect. This flexible strategy provides for incorporation of business-specific needs, such as confining the search space to a subgroup size that is commercially viable, ultimately resulting in actionable information for use in empirically based decision making.
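A maximal χ² procedure of the kind mentioned above can be sketched as a scan over candidate cutpoints on a signature score, computing a 2x2 treatment-by-response χ² statistic in the high-score subgroup at each cut and keeping the maximising cut. This is a hypothetical simplification of the paper's two-stage strategy; the elastic net and bootstrap calibration steps are omitted, and all names and data are illustrative.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

def maximal_chi2_cut(scores, treated, responded):
    """Scan cutpoints on a signature score; return (cut, statistic) that
    maximises the treatment-by-response chi-square in the subgroup
    with score above the cut."""
    best = (None, -1.0)
    for cut in sorted(set(scores))[:-1]:
        hi = [i for i, s in enumerate(scores) if s > cut]
        a = sum(1 for i in hi if treated[i] and responded[i])
        b = sum(1 for i in hi if treated[i] and not responded[i])
        c = sum(1 for i in hi if not treated[i] and responded[i])
        d = sum(1 for i in hi if not treated[i] and not responded[i])
        stat = chi2_2x2(a, b, c, d)
        if stat > best[1]:
            best = (cut, stat)
    return best
```

Because the cut is chosen to maximise the statistic, its null distribution is not the nominal χ² distribution; that is precisely why a resampling step such as the semiparametric bootstrap is needed to obtain an honest p-value.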
Fisz, Jacek J
2006-12-07
An optimization approach based on the genetic algorithm (GA) combined with the multiple linear regression (MLR) method is discussed. The GA-MLR optimizer is designed for nonlinear least-squares problems in which the model functions are linear combinations of nonlinear functions. GA optimizes the nonlinear parameters, and the linear parameters are calculated from MLR. GA-MLR is an intuitive optimization approach and it exploits all advantages of the genetic algorithm technique. This optimization method results from an appropriate combination of two well-known optimization methods. The MLR method is embedded in the GA optimizer and linear and nonlinear model parameters are optimized in parallel. The MLR method is the only strictly mathematical "tool" involved in GA-MLR. The GA-MLR approach simplifies and considerably accelerates the optimization process because the linear parameters are not among the fitted ones. Its properties are exemplified by the analysis of the kinetic biexponential fluorescence decay surface corresponding to a two-excited-state interconversion process. A short discussion of the variable projection (VP) algorithm, designed for the same class of optimization problems, is presented. VP is a very advanced mathematical formalism that involves the methods of nonlinear functionals, the algebra of linear projectors, and the formalism of Fréchet derivatives and pseudo-inverses. Additional explanatory comments are added on the application of the recently introduced GA-NR optimizer to the simultaneous recovery of linear and weakly nonlinear parameters occurring in the same optimization problem together with nonlinear parameters. The GA-NR optimizer combines the GA method with the NR method, in which the minimum-value condition for the quadratic approximation to χ², obtained from the Taylor series expansion of χ², is recovered by means of the Newton-Raphson algorithm.
The application of the GA-NR optimizer to model functions which are multi-linear combinations of nonlinear functions is indicated. The VP algorithm does not distinguish the weakly nonlinear parameters from the nonlinear ones and it does not apply to model functions which are multi-linear combinations of nonlinear functions.
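The separable structure GA-MLR exploits can be sketched for the biexponential decay case: given trial decay times (the nonlinear parameters), the amplitudes (the linear parameters) follow directly from linear least squares, so only the decay times need to be searched. In this sketch a plain random search stands in for the GA; all function names are illustrative, not from the paper.

```python
import math, random

def mlr_fit(t, y, taus):
    """Given two nonlinear decay times, solve the two linear amplitudes
    by least squares via the 2x2 normal equations (A^T A) a = A^T y."""
    A = [[math.exp(-ti / tau) for tau in taus] for ti in t]
    g11 = sum(r[0] * r[0] for r in A)
    g12 = sum(r[0] * r[1] for r in A)
    g22 = sum(r[1] * r[1] for r in A)
    b1 = sum(r[0] * yi for r, yi in zip(A, y))
    b2 = sum(r[1] * yi for r, yi in zip(A, y))
    det = g11 * g22 - g12 * g12
    a1 = (b1 * g22 - b2 * g12) / det
    a2 = (g11 * b2 - g12 * b1) / det
    resid = sum((yi - a1 * r[0] - a2 * r[1]) ** 2 for r, yi in zip(A, y))
    return (a1, a2), resid

def ga_mlr(t, y, trials=2000, seed=0):
    """Search the nonlinear taus (random search as a stand-in for the GA);
    the amplitudes always come from mlr_fit, so they are never fitted
    directly -- the key idea behind GA-MLR."""
    rng = random.Random(seed)
    best = (None, None, float("inf"))
    for _ in range(trials):
        taus = sorted(rng.uniform(0.1, 10.0) for _ in range(2))
        amps, resid = mlr_fit(t, y, taus)
        if resid < best[2]:
            best = (taus, amps, resid)
    return best
```

Halving the search space in this way is what makes the separable approach faster than fitting all four parameters with a general-purpose optimizer.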
Performance Evaluation of Fusing Protected Fingerprint Minutiae Templates on the Decision Level
Yang, Bian; Busch, Christoph; de Groot, Koen; Xu, Haiyun; Veldhuis, Raymond N. J.
2012-01-01
In a biometric authentication system using protected templates, a pseudonymous identifier is the part of a protected template that can be directly compared. Each compared pair of pseudonymous identifiers results in a decision testing whether both identifiers are derived from the same biometric characteristic. Compared to an unprotected system, most existing biometric template protection methods cause some degree of degradation in biometric performance. Fusion is therefore a promising way to enhance the biometric performance in template-protected biometric systems. Compared with feature-level fusion and score-level fusion, decision-level fusion has not only the least fusion complexity, but also the maximum interoperability across different biometric features, template protection and recognition algorithms, template formats, and comparison score rules. However, performance improvement via decision-level fusion is not obvious: it is influenced by both the dependency among, and the performance gap between, the tests conducted for fusion. We investigate in this paper several fusion scenarios (multi-sample, multi-instance, multi-sensor, multi-algorithm, and their combinations) at the binary decision level, and evaluate their biometric performance and fusion efficiency on a multi-sensor fingerprint database with 71,994 samples. PMID:22778583
Hazard interactions and interaction networks (cascades) within multi-hazard methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel C.; Malamud, Bruce D.
2016-08-01
This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. 
Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
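A hazard interaction matrix of the kind proposed above can be represented as a mapping from ordered hazard pairs to an interaction type, from which interaction networks (cascades) can be enumerated. The relationship labels follow the three types named in the paper; the specific hazard pairs below are illustrative examples only, not the paper's case-study data.

```python
# Hypothetical interaction matrix: (source hazard, secondary hazard) -> type.
INTERACTIONS = {
    ("earthquake", "landslide"): "triggering",
    ("earthquake", "tsunami"): "triggering",
    ("drought", "wildfire"): "increased_probability",
    ("deforestation", "landslide"): "catalysis",
}

def cascade_from(hazard, interactions=INTERACTIONS):
    """Follow interaction links transitively to enumerate the network of
    secondary hazards (cascade) reachable from one primary hazard."""
    frontier, seen = [hazard], {hazard}
    while frontier:
        src = frontier.pop()
        for (a, b), rel in interactions.items():
            if a == src and b not in seen:
                seen.add(b)
                frontier.append(b)
    return sorted(seen - {hazard})
```

Encoding the matrix this way makes the difference between a multi-layer single-hazard view (rows considered in isolation) and a multi-hazard view (transitive closure over the rows) explicit.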
Combination therapies - the next logical step for the treatment of synucleinopathies?
Valera, E.; Masliah, E.
2015-01-01
Currently there are no disease-modifying alternatives for the treatment of most neurodegenerative disorders. The available therapies for diseases such as Parkinson’s disease (PD), PD dementia (PDD), Dementia with Lewy bodies (DLB) and Multiple system atrophy (MSA), in which the protein alpha-synuclein (α-syn) accumulates within neurons and glial cells with toxic consequences, are focused on managing the disease symptoms. However, utilizing strategic drug combinations and/or multi-target drugs might increase the treatment efficiency when compared to monotherapies. Synucleinopathies are complex disorders that progress through several stages, and toxic α-syn aggregates exhibit prion-like behavior spreading from cell to cell. Therefore, it follows that these neurodegenerative disorders might require equally complex therapeutic approaches in order to obtain significant and long-lasting results. Hypothetically, therapies aimed at reducing α-syn accumulation and cell-to-cell transfer, such as immunotherapy against α-syn, could be combined with agents that reduce neuroinflammation with potential synergistic outcomes. Here we review the current evidence supporting this type of approach, suggesting that such rational therapy combinations, together with the use of multi-target drugs, may hold promise as the next logical step for the treatment of synucleinopathies. PMID:26388203
TMS combined with EEG in genetic generalized epilepsy: A phase II diagnostic accuracy study.
Kimiskidis, Vasilios K; Tsimpiris, Alkiviadis; Ryvlin, Philippe; Kalviainen, Reetta; Koutroumanidis, Michalis; Valentin, Antonio; Laskaris, Nikolaos; Kugiumtzis, Dimitris
2017-02-01
(A) To develop a TMS-EEG stimulation and data analysis protocol in genetic generalized epilepsy (GGE). (B) To investigate the diagnostic accuracy of TMS-EEG in GGE. Pilot experiments resulted in the development and optimization of a paired-pulse TMS-EEG protocol at rest, during hyperventilation (HV), and post-HV combined with multi-level data analysis. This protocol was applied in 11 controls (C) and 25 GGE patients (P), further dichotomized into responders to antiepileptic drugs (R, n=13) and non-responders (n-R, n=12). Features (n=57) extracted from TMS-EEG responses after multi-level analysis were given to a feature selection scheme and a Bayesian classifier, and the accuracy of assigning participants into the classes P-C and R-nR was computed. On the basis of the optimal feature subset, the cross-validated accuracy of TMS-EEG for the classification P-C was 0.86 at rest, 0.81 during HV and 0.92 at post-HV, whereas for R-nR the corresponding figures are 0.80, 0.78 and 0.65, respectively. Applying a fusion approach on all conditions resulted in an accuracy of 0.84 for the classification P-C and 0.76 for the classification R-nR. TMS-EEG can be used for diagnostic purposes and for assessing the response to antiepileptic drugs. TMS-EEG holds significant diagnostic potential in GGE. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
3D Digital Surveying and Modelling of Cave Geometry: Application to Paleolithic Rock Art.
González-Aguilera, Diego; Muñoz-Nieto, Angel; Gómez-Lahoz, Javier; Herrero-Pascual, Jesus; Gutierrez-Alonso, Gabriel
2009-01-01
3D digital surveying and modelling of cave geometry represents a relevant approach for the research, management and preservation of our cultural and geological legacy. In this paper, a multi-sensor approach based on a terrestrial laser scanner, a high-resolution digital camera and a total station is presented. Two emblematic caves of Paleolithic human occupation situated in northern Spain, "Las Caldas" and "Peña de Candamo", have been chosen to put this approach into practice. As a result, an integral and multi-scalable 3D model is generated which may allow other scientists, pre-historians, geologists…, to work on two different levels, integrating different Paleolithic Art datasets: (1) a basic level based on the accurate and metric support provided by the laser scanner; and (2) an advanced level using range- and image-based modelling.
The design of multi-core DSP parallel model based on message passing and multi-level pipeline
NASA Astrophysics Data System (ADS)
Niu, Jingyu; Hu, Jian; He, Wenjing; Meng, Fanrong; Li, Chuanrong
2017-10-01
Currently, the design of embedded signal processing systems is often based on a specific application, but this idea is not conducive to the rapid development of signal processing technology. In this paper, a parallel processing model architecture based on a multi-core DSP platform is designed; it is mainly suitable for complex algorithms composed of different modules. This model combines the ideas of multi-level pipeline parallelism and message passing, and incorporates the advantages of the mainstream multi-core DSP models (the Master-Slave model and the Data Flow model), so that it achieves better performance. This paper uses a three-dimensional image generation algorithm to validate the efficiency of the proposed model by comparison with the Master-Slave and Data Flow models.
NASA Astrophysics Data System (ADS)
Darker, Iain T.; Kuo, Paul; Yang, Ming Yuan; Blechko, Anastassia; Grecos, Christos; Makris, Dimitrios; Nebel, Jean-Christophe; Gale, Alastair G.
2009-05-01
Findings from the current UK national research programme, MEDUSA (Multi Environment Deployable Universal Software Application), are presented. MEDUSA brings together two approaches to facilitate the design of an automatic, CCTV-based firearm detection system: a psychological approach, to elicit the strategies used by CCTV operators, and a machine vision approach, to identify key cues derived from camera imagery. Potentially effective human- and machine-based strategies have been identified; these will form elements of the final system. The efficacy of these algorithms in discriminating between firearms and matched distractor objects has been tested on staged CCTV footage. Early results indicate the potential of this combined approach.
Bourne, Amanda; Holness, Stephen; Holden, Petra; Scorgie, Sarshen; Donatti, Camila I.; Midgley, Guy
2016-01-01
Climate change adds an additional layer of complexity to existing sustainable development and biodiversity conservation challenges. The impacts of global climate change are felt locally, and thus local governance structures will increasingly be responsible for preparedness and local responses. Ecosystem-based adaptation (EbA) options are gaining prominence as relevant climate change solutions. Local government officials seldom have an appropriate understanding of the role of ecosystem functioning in sustainable development goals, or access to relevant climate information. Thus the use of ecosystems in helping people adapt to climate change is limited partially by the lack of information on where ecosystems have the highest potential to do so. To begin overcoming this barrier, Conservation South Africa in partnership with local government developed a socio-ecological approach for identifying spatial EbA priorities at the sub-national level. Using GIS-based multi-criteria analysis and vegetation distribution models, the authors have spatially integrated relevant ecological and social information at a scale appropriate to inform local level political, administrative, and operational decision makers. This is the first systematic approach of which we are aware that highlights spatial priority areas for EbA implementation. Nodes of socio-ecological vulnerability are identified, and the inclusion of areas that provide ecosystem services and ecological resilience to future climate change is innovative. The purpose of this paper is to present and demonstrate a methodology for combining complex information into user-friendly spatial products for local level decision making on EbA. The authors focus on illustrating the kinds of products that can be generated from combining information in the suggested ways, and do not discuss the nuance of climate models nor present specific technical details of the model outputs here. 
Two representative case studies from rural South Africa demonstrate the replicability of this approach in rural and peri-urban areas of other developing and least developed countries around the world. PMID:27227671
NASA Astrophysics Data System (ADS)
Mohamed, Raihani; Perumal, Thinagaran; Sulaiman, Md Nasir; Mustapha, Norwati; Zainudin, M. N. Shah
2017-10-01
Owing to human-centric concerns and the need for non-obtrusiveness, ambient sensor technology has been selected, accepted and embedded in smart environments in a resilient style. Everyday human activities are gradually becoming more complex, which complicates the inference of activities when multiple residents share the same smart environment. Current solutions model the residents, the activities and their interactions separately. Some studies use data association and extra auxiliary graphical nodes to model human tracking information in an environment, and some produce a separate framework to incorporate auxiliary features for interaction modelling. Recognizing both the activities and which resident performs each activity at the same time in the smart home is therefore vital for smart home development and future applications. This paper addresses the above issue with a simplified and efficient method based on the multi-label classification framework. This approach eliminates time-consuming steps and simplifies many pre-processing tasks compared with previous approaches. Application to multi-resident multi-label learning in smart home problems shows that Label Combination (LC) using a Decision Tree (DT) as the base classifier can tackle the above problems.
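The Label Combination (label powerset) transform at the heart of this framework can be sketched in a few lines: each distinct multi-label vector (e.g., which residents are active) is mapped to a single class id, so any single-label base classifier, such as a decision tree, can be trained on the encoded targets. The label layout below is illustrative, not from the paper's dataset.

```python
def lc_encode(Y):
    """Label Combination (label powerset) transform.

    Y: iterable of multi-label vectors, e.g. (resident_A_active,
    resident_B_active). Returns (encoded class ids, decode map) so a
    single-label classifier can be trained and its predictions mapped
    back to label vectors.
    """
    classes = {}
    encoded = []
    for labels in Y:
        key = tuple(labels)
        if key not in classes:
            classes[key] = len(classes)  # new combined class
        encoded.append(classes[key])
    decode = {v: k for k, v in classes.items()}
    return encoded, decode
```

The transform captures correlations between residents' labels for free (each observed combination is its own class), at the cost of being unable to predict combinations never seen in training.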
On multi-site damage identification using single-site training data
NASA Astrophysics Data System (ADS)
Barthorpe, R. J.; Manson, G.; Worden, K.
2017-11-01
This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged-state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology represents an advancement on state-of-the-art approaches to multi-site damage identification, which require either: (1) full damaged-state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site model predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure - a Piper Tomahawk trainer aircraft wing - with its performance compared to classifiers trained using the full damaged-state dataset.
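One simple way to combine per-site binary decisions into a multi-site locator is to report every site whose binary classifier fires, falling back to the strongest decision value when none does. This is a hedged sketch of the combination step only (the paper evaluates several combination methods); the site names and margin values are illustrative.

```python
def combine_binary_locators(margins):
    """Combine single-site binary classifiers into a multi-site locator.

    margins: dict mapping site name -> signed decision value from the
    binary classifier trained on that single damage site. Returns the
    list of sites declared damaged: all sites whose classifier fires
    (margin > 0), or the single strongest site if none fires.
    """
    fired = sorted(site for site, m in margins.items() if m > 0)
    if fired:
        return fired
    # no classifier fired: report the least-confidently-rejected site
    return [max(margins, key=margins.get)]
```

Because each classifier is trained only on its own single-site damage data, a multi-site case simply fires several classifiers at once, which is what lets the locator handle damage combinations it was never trained on.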
Digital equalization of time-delay array receivers on coherent laser communications.
Belmonte, Aniceto
2017-01-15
Field conjugation arrays use adaptive combining techniques on multi-aperture receivers to improve the performance of coherent laser communication links by mitigating the consequences of atmospheric turbulence on the down-converted coherent power. However, this motivates the use of complex receivers as optical signals collected by different apertures need to be adaptively processed, co-phased, and scaled before they are combined. Here, we show that multiple apertures, coupled with optical delay lines, combine retarded versions of a signal at a single coherent receiver, which uses digital equalization to obtain diversity gain against atmospheric fading. We found in our analysis that, instead of field conjugation arrays, digital equalization of time-delay multi-aperture receivers is a simpler and more versatile approach to accomplish reduction of atmospheric fading.
NASA Astrophysics Data System (ADS)
Lee, Jongpil; Nam, Juhan
2017-08-01
Music auto-tagging is often handled in a similar manner to image classification by regarding the 2D audio spectrogram as image data. However, music auto-tagging is distinguished from image classification in that the tags are highly diverse and have different levels of abstraction. Considering this issue, we propose a convolutional neural network (CNN)-based architecture that embraces multi-level and multi-scale features. The architecture is trained in three steps. First, we conduct supervised feature learning to capture local audio features using a set of CNNs with different input sizes. Second, we extract audio features from each layer of the pre-trained convolutional networks separately and aggregate them over a long audio clip. Finally, we put them into fully-connected networks and make final predictions of the tags. Our experiments show that using the combination of multi-level and multi-scale features is highly effective in music auto-tagging, and the proposed method outperforms previous state-of-the-art methods on the MagnaTagATune dataset and the Million Song Dataset. We further show that the proposed architecture is useful in transfer learning.
Generating multi-double-scroll attractors via nonautonomous approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Qinghui; Xie, Qingguo, E-mail: qgxie@mail.hust.edu.cn; Shen, Yi
It is a common phenomenon that multi-scroll attractors are realized by introducing various nonlinear functions with multiple breakpoints into double-scroll chaotic systems. In contrast, we present a nonautonomous approach for generating multi-double-scroll attractors (MDSA) without changing the original nonlinear functions. By applying the multi-level-logic pulse excitation technique to double-scroll chaotic systems, MDSA can be generated. A Chua's circuit, a Jerk circuit, and a modified Lorenz system are given as design examples, and the Matlab simulation results are presented. Furthermore, the corresponding realization circuits are designed. The Pspice results are in agreement with the numerical simulation results, which verifies the availability and feasibility of this method.
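As a rough illustration of the nonautonomous idea, the sketch below integrates a dimensionless Chua-type double-scroll system with a piecewise-constant multi-level pulse added to the first state equation. The parameters, pulse levels, and period are illustrative assumptions, not the values used in the paper.

```python
# Illustrative sketch (not the paper's exact circuits or parameters):
# a dimensionless Chua system driven by a multi-level-logic pulse,
# integrated with fixed-step RK4.

def chua_nonlinearity(x, m0=-1.143, m1=-0.714):
    """Classic piecewise-linear Chua diode characteristic."""
    return m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))

def multi_level_pulse(t, levels=(-2.0, 0.0, 2.0), period=5.0):
    """Piecewise-constant excitation cycling through several logic levels."""
    return levels[int(t / period) % len(levels)]

def chua_rhs(t, state, alpha=9.0, beta=14.286):
    x, y, z = state
    dx = alpha * (y - x - chua_nonlinearity(x)) + multi_level_pulse(t)
    dy = x - y + z
    dz = -beta * y
    return (dx, dy, dz)

def rk4(f, state, t, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(t, state)
    k2 = f(t + dt / 2, tuple(s + dt / 2 * k for s, k in zip(state, k1)))
    k3 = f(t + dt / 2, tuple(s + dt / 2 * k for s, k in zip(state, k2)))
    k4 = f(t + dt, tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state, dt = (0.1, 0.0, 0.0), 0.001
trajectory = []
for step in range(20000):
    trajectory.append(state)
    state = rk4(chua_rhs, state, step * dt, dt)
```

Plotting x against z over a longer run would reveal whether the pulse excitation shifts the trajectory between multiple double-scroll regions, as the paper describes.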
NASA Astrophysics Data System (ADS)
Jäckel, Nicolas; Dargel, Vadim; Shpigel, Netanel; Sigalov, Sergey; Levi, Mikhael D.; Daikhin, Leonid; Aurbach, Doron; Presser, Volker
2017-12-01
Intercalation-induced dimensional changes of composite battery electrodes containing either a stiff or a soft polymeric binder are among the many factors determining cycling performance and ageing. Herein, we report dimensional changes in bulk composite electrodes measured by in situ electrochemical dilatometry (eD) combined with electrochemical quartz-crystal microbalance with dissipation monitoring (EQCM-D). The latter tracks the mechanical properties at the level of the electrode particle size. Lithium iron phosphate (LiFePO4, LFP) electrodes with a stiff binder (PVdF) and a soft binder (NaCMC) were investigated by cycling in lithium sulfate (Li2SO4) aqueous solution. The electrochemical and mechanical electrode performances depend on the electrode cycling history. Based on the combined eD and EQCM-D measurements, we provide evidence of which properties are preferable for a binder used in a composite Li-ion battery electrode.
Risk Governance of Multiple Natural Hazards: Centralized versus Decentralized Approach in Europe
NASA Astrophysics Data System (ADS)
Komendantova, Nadejda; Scolobig, Anna; Vinchon, Charlotte
2014-05-01
The multi-risk approach is a relatively new field, and its definition includes the need to consider multiple hazards and vulnerabilities in their interdependency (Selva, 2013); recent multi-hazard disasters, such as the 2011 Tohoku earthquake, tsunami and nuclear catastrophe, have shown the need for a multi-risk approach in hazard mitigation and management. Our knowledge about multi-risk assessment, including studies from different scientific disciplines and developed assessment tools, is constantly growing (White et al., 2001). However, the link between scientific knowledge, its implementation and the results in terms of improved governance and decision-making has gained significantly less attention (IRGC, 2005; Kappes et al., 2012), even though interest in risk governance in general has increased significantly during recent years (Verweij and Thompson, 2006). Therefore, the key research question is how risk assessment is implemented and what the potential is for the implementation of a multi-risk approach in different governance systems across Europe. More precisely, how do the characteristics of risk governance, such as the degree of centralization versus decentralization, influence the implementation of a multi-risk approach? The methodology of this research includes a comparative case study analysis of top-down and bottom-up interactions in governance in the city of Naples (Italy), where the institutional landscape is marked by significant autonomy of Italian regions in decision-making processes for assessing the majority of natural risks, excluding volcanic risk, and in Guadeloupe, French West Indies, an overseas department of France, where the decision-making process is marked by greater centralization, associated with well-established state governance within regions, delegated to the prefect and decentralized services of central ministries.
The research design included documentary analysis and extensive empirical work involving policy makers, private sector actors and practitioners in risk and emergency management. This work was informed by 36 semi-structured interviews, three workshops with over seventy participants from eleven different countries, feedback from questionnaires and focus group discussions (Scolobig et al., 2013). The results show that both governance systems have their own strengths and weaknesses (Komendantova et al., 2013). Elements of the centralized multi-risk governance system could lead to improvements in interagency communication and the creation of an inter-agency environment, where the different departments at the national level can exchange information, identify the communities that are most exposed to multiple risks and set priorities, while providing consistent information about and responses to multi-risk to the relevant stakeholders at the local level. A decentralised multi-risk governance system, by contrast, can favour the creation of local multi-risk commissions to conduct discussions between experts in meteorological, geological and technological risks and practitioners, to elaborate risk and hazard maps, and to develop local capacities which would include educational and training activities. Both governance systems suffer from common deficiencies, the most important being the frequent lack of capacities at the local level, especially financial, but sometimes also technical and institutional ones, as the responsibilities for disaster risk management are often transferred from the national to local levels without sufficient resources for implementation of programs on risk management (UNISDR, 2013). The difficulty in balancing available resources between short-term and medium-term priorities often complicates the issue.
Our recommendations are that the implementation of a multi-risk approach can be facilitated through knowledge exchange and dialogue between different disciplinary communities, such as the geological and meteorological ones, and between the natural and social sciences. The implementation of a multi-risk approach can be strengthened through the creation of multi-risk platforms and multi-risk commissions, which can liaise between risk management experts and local communities and unify numerous actions on natural hazard management. However, the multi-risk approach cannot be subsidiary to a single-risk approach, and both have to be pursued. References: IRGC (2011). Concept note: Improving the management of emerging risks: Risks from new technologies, system interactions, and unforeseen or changing circumstances. International Risk Governance Council (IRGC), Geneva. Kappes, M. S., Keiler, M., von Elverfeldt, K., & Glade, T. (2012). Challenges of analyzing multi-hazard risk: A review. Natural Hazards, 64(2), 1925-1958. doi: 10.1007/s11069-012-0294-2. Komendantova, N., Scolobig, A., & Vinchon, C. (2013). Multi-risk approach in centralized and decentralized risk governance systems: Case studies of Naples, Italy and Guadeloupe, France. International Relations and Diplomacy, 1(3), 224-239. Scolobig, A., Vinchon, C., Komendantova, N., Bengoubou-Valerius, M., & Patt, A. (2013). Social and institutional barriers to effective multi-hazard and multi-risk decision-making governance. D6.3 MATRIX project. Selva, J. (2013). Long-term multi-risk assessment: statistical treatment of interaction among risks. Natural Hazards, 67(2), 701-722. UNISDR (2013). Implementing the Hyogo Framework for Action in Europe: Regional synthesis report 2011-2013. Verweij, M., & Thompson, M. (Eds.). (2006). Clumsy solutions for a complex world: Governance, politics, and plural perceptions. New York: Palgrave Macmillan. White, G., Kates, R., & Burton, I. (2001). Knowing better and losing even more: the use of knowledge in hazards management. Environmental Hazards, 3, 81-92.
Whole-body diffusion-weighted MR image stitching and alignment to anatomical MRI
NASA Astrophysics Data System (ADS)
Ceranka, Jakub; Polfliet, Mathias; Lecouvet, Frederic; Michoux, Nicolas; Vandemeulebroucke, Jef
2017-02-01
Whole-body diffusion-weighted (WB-DW) MRI in combination with anatomical MRI has shown a great potential in bone and soft tissue tumour detection, evaluation of lymph nodes and treatment response assessment. Because of the vast body coverage, whole-body MRI is acquired in separate stations, which are subsequently combined into a whole-body image. However, inter-station and inter-modality image misalignments can occur due to image distortions and patient motion during acquisition, which may lead to inaccurate representations of patient anatomy and hinder visual assessment. Automated and accurate whole-body image formation and alignment of the multi-modal MRI images is therefore crucial. We investigated several registration approaches for the formation or stitching of the whole-body image stations, followed by a deformable alignment of the multi-modal whole-body images. We compared a pairwise approach, where diffusion-weighted (DW) image stations were sequentially aligned to a reference station (pelvis), to a groupwise approach, where all stations were simultaneously mapped to a common reference space while minimizing the overall transformation. For each, a choice of input images and corresponding metrics was investigated. Performance was evaluated by assessing the quality of the obtained whole-body images, and by verifying the accuracy of the alignment with whole-body anatomical sequences. The groupwise registration approach provided the best compromise between the formation of WB-DW images and multi-modal alignment. The fully automated method was found to be robust, making its use in the clinic feasible.
NASA Astrophysics Data System (ADS)
Shirazi, M. R.; Mohamed Taib, J.; De La Rue, R. M.; Harun, S. W.; Ahmad, H.
2015-03-01
Dynamic characteristics of a multi-wavelength Brillouin-Raman fiber laser (MBRFL) assisted by four-wave mixing have been investigated through the development of Stokes and anti-Stokes lines under different combinations of Brillouin and Raman pump power levels and different Raman pumping schemes in a ring cavity. For a Stokes line of order higher than three, the threshold power was less than the saturation power of the previous-order Stokes line. By increasing the Brillouin pump power, the nth-order anti-Stokes and the (n+4)th-order Stokes power levels unexpectedly increased by almost the same amount below the Stokes line threshold power. It was also found that the SBS threshold reduction (SBSTR) depended linearly on the gain factor for the 1st and 2nd Stokes lines, taken as the first set. For the 3rd and 4th Stokes lines, taken as the second set, this relation was almost linear with the same slope below an SBSTR of -6 dB; it then approached the linear relation of the first set when the gain factor was increased to 50 dB. Therefore, the threshold power levels of Stokes lines for a given Raman gain can be readily estimated simply by knowing the threshold power levels in the absence of Raman amplification.
Ko, Linda K; Rillamas-Sun, Eileen; Bishop, Sonia; Cisneros, Oralia; Holte, Sarah; Thompson, Beti
2018-04-01
Hispanic children are disproportionately overweight and obese compared to their non-Hispanic white counterparts in the US. Community-wide, multi-level interventions have been successful in promoting healthier nutrition, increased physical activity (PA), and weight loss. Using a community-based participatory research (CBPR) approach that engages community members in rural Hispanic communities is a promising way to promote behavior change, and ultimately weight loss, among Hispanic children. Led by a community-academic partnership, the Together We STRIDE (Strategizing Together Relevant Interventions for Diet and Exercise) study aims to test the effectiveness of a community-wide, multi-level intervention to promote healthier diets, increased PA, and weight loss among Hispanic children. Together We STRIDE is a parallel quasi-experimental trial with a goal of recruiting 900 children aged 8-12 years nested within two communities (one intervention and one comparison). Children will be recruited from their respective elementary schools. Components of the 2-year multi-level intervention include comic books (individual level), multi-generational nutrition and PA classes (family level), teacher-led PA breaks and media literacy education (school level), and family nights, a farmer's market, and a community PA event (known as a ciclovia) at the community level. Children from the comparison community will receive two newsletters. Height and weight measures will be collected from children in both communities at three time points (baseline, 6 months, and 18 months). The Together We STRIDE study aims to promote a healthier diet and increased PA to produce healthy weight among Hispanic children. The use of a CBPR approach and the engagement of the community will springboard strategies for the intervention's sustainability. Clinical Trials Registration Number: NCT02982759 Retrospectively registered. Copyright © 2018 Elsevier Inc. All rights reserved.
The Kirkendall and Frenkel effects during 2D diffusion process
NASA Astrophysics Data System (ADS)
Wierzba, Bartek
2014-11-01
The two-dimensional approach to inter-diffusion and void generation is presented. The evolution and growth of voids are discussed. This approach is based on the bi-velocity (Darken) method, which combines the Darken and Brenner concepts that the volume velocity is essential in defining the local material velocity in a multi-component mixture at non-equilibrium. The model is formulated for arbitrary multi-component two-dimensional systems. It is shown that void growth is due to the drift velocity and vacancy migration. The radius of a void can be easily estimated. The distributions of (1) components, (2) vacancies and (3) void radius over distance are presented.
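For intuition, the Darken idea that the interdiffusion coefficient is a composition-weighted mix of the intrinsic diffusivities can be shown in a minimal 1D finite-difference sketch; the paper's full 2D bi-velocity model additionally tracks drift velocity and vacancies, and the diffusivity values here are arbitrary.

```python
# Minimal 1D sketch of the Darken relation: the interdiffusion
# coefficient D = x_A*D_B + x_B*D_A mixes the intrinsic diffusivities
# by composition. An explicit flux-form step evolves an A/B diffusion
# couple; zero-flux boundaries conserve the total amount of A.

D_A, D_B = 1.0e-2, 4.0e-2     # intrinsic diffusivities (arbitrary units)
N, dx, dt = 50, 1.0, 1.0      # grid and time step (dt < dx**2 / (2*Dmax))

x_A = [1.0 if i < N // 2 else 0.0 for i in range(N)]  # step-profile couple

def darken_D(xa):
    """Composition-weighted interdiffusion coefficient."""
    return xa * D_B + (1.0 - xa) * D_A

for _ in range(2000):
    flux = [0.0] * (N + 1)  # fluxes at cell faces, zero at both ends
    for i in range(1, N):
        D_face = darken_D(0.5 * (x_A[i - 1] + x_A[i]))
        flux[i] = -D_face * (x_A[i] - x_A[i - 1]) / dx
    x_A = [x_A[i] - dt * (flux[i + 1] - flux[i]) / dx for i in range(N)]

total_A = sum(x_A) * dx  # conserved quantity, equals 25.0 here
```

Because D varies with composition, the resulting profile is asymmetric, which is the 1D signature of the Kirkendall effect the paper builds on.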
An Approach to Speed up Single-Frequency PPP Convergence with Quad-Constellation GNSS and GIM
Cai, Changsheng; Gong, Yangzhao; Gao, Yang; Kuang, Cuilin
2017-01-01
The single-frequency precise point positioning (PPP) technique has attracted increasing attention due to its high accuracy and low cost. However, a very long convergence time, normally a few hours, is required in order to achieve a positioning accuracy level of a few centimeters. In this study, an approach is proposed to accelerate single-frequency PPP convergence by combining quad-constellation global navigation satellite system (GNSS) and global ionospheric map (GIM) data. In this approach, the GPS, GLONASS, BeiDou, and Galileo observations are directly used in an uncombined observation model, and as a result the ionospheric and hardware delay (IHD) can be estimated together as a single unknown parameter. The IHD values acquired from the GIM product and the multi-GNSS differential code bias (DCB) product are then utilized as pseudo-observables of the IHD parameter in the observation model. A time-varying weight scheme is also proposed for the pseudo-observables to gradually decrease their contribution to the position solutions during the convergence period. To evaluate the proposed approach, datasets from twelve Multi-GNSS Experiment (MGEX) stations on seven consecutive days are processed and analyzed. The numerical results indicate that single-frequency PPP with quad-constellation GNSS and GIM data is able to reduce the convergence time by 56%, 47%, and 41% in the east, north, and up directions, respectively, compared to GPS-only single-frequency PPP. PMID:28587305
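The abstract does not give the exact form of the time-varying weight scheme; the sketch below illustrates one plausible choice, an exponentially decaying weight that lets the GIM pseudo-observable dominate early in convergence and fade afterwards. The function names and decay constant are assumptions, not the paper's implementation.

```python
# Hypothetical time-varying weight for a GIM-derived pseudo-observable
# of the ionospheric and hardware delay (IHD). Early epochs trust the
# GIM constraint; later epochs trust the filter's own IHD estimate.

import math

def pseudo_obs_weight(t_seconds, w0=1.0, tau=600.0):
    """Weight of the GIM pseudo-observable at epoch t (decays with tau)."""
    return w0 * math.exp(-t_seconds / tau)

def blended_ihd(ihd_estimated, ihd_gim, t_seconds):
    """Weighted combination of the filter estimate and the pseudo-observable."""
    w = pseudo_obs_weight(t_seconds)
    return (ihd_estimated + w * ihd_gim) / (1.0 + w)

print(blended_ihd(2.0, 3.0, 0.0))  # at t=0 the two values average: 2.5
```

In a full PPP filter the same idea would appear as an epoch-dependent variance on the pseudo-observation rather than an explicit blend.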
ERIC Educational Resources Information Center
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-01-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective,…
Review on Graph Clustering and Subgraph Similarity Based Analysis of Neurological Disorders
Thomas, Jaya; Seo, Dongmin; Sael, Lee
2016-01-01
How can complex relationships among molecular or clinico-pathological entities of neurological disorders be represented and analyzed? Graphs seem to be the current answer to the question no matter the type of information: molecular data, brain images or neural signals. We review a wide spectrum of graph representation and graph analysis methods and their application in the study of both the genomic and the phenotypic level of neurological disorders. We find numerous research works that create, process and analyze graphs formed from one or a few data types to gain an understanding of specific aspects of neurological disorders. Furthermore, with the increasing number of data of various types becoming available for neurological disorders, we find that integrative analysis approaches that combine several types of data are being recognized as a way to gain a global understanding of the diseases. Although there are still not many integrative analyses of graphs due to the complexity of the analysis, multi-layer graph analysis is a promising framework that can incorporate various data types. We describe and discuss the benefits of the multi-layer graph framework for studies of neurological disease. PMID:27258269
Carcinogenic Air Toxics Exposure and Their Cancer-Related Health Impacts in the United States.
Zhou, Ying; Li, Chaoyang; Huijbregts, Mark A J; Mumtaz, M Moiz
2015-01-01
Public health protection from air pollution can be achieved more effectively by shifting from a single-pollutant approach to a multi-pollutant approach. To develop such multi-pollutant approaches, it is essential to identify which air pollutants are present together most frequently. This study aims to determine the most frequently found combinations of carcinogenic air toxics, or hazardous air pollutants (HAPs), across the United States, as well as to analyze the health impacts of developing cancer due to exposure to these HAPs. To identify the most commonly found carcinogenic air toxics combinations, we first identified HAPs with a cancer risk greater than one in a million in more than 5% of the census tracts across the United States, based on the National-Scale Air Toxics Assessment (NATA) by the U.S. EPA for the year 2005. We then calculated the frequencies of their two-component (binary) and three-component (ternary) combinations. To quantify the cancer-related health impacts, we focused on the 10 most frequently found HAPs with a national average cancer risk greater than one in a million. Their cancer-related health impacts were calculated by converting the lifetime cancer risk reported in NATA 2005 to years of healthy life lost, or Disability-Adjusted Life Years (DALYs). We found that the most frequently found air toxics with a cancer risk greater than one in a million are formaldehyde, carbon tetrachloride, acetaldehyde, and benzene. The most frequently occurring binary pairs and ternary mixtures are the various combinations of these four air toxics. Analysis of urban and rural HAPs did not reveal significant differences in the top combinations of these chemicals. The cumulative annual cancer-related health impact of inhaling the top 10 carcinogenic air toxics was about 1,600 DALYs in the United States, or 0.6 DALYs per 100,000 people. Formaldehyde and benzene together contribute nearly 60 percent of the total cancer-related health impacts.
Our study shows that although there are many carcinogenic air toxics, only a few of them affect public health significantly at the national level in the United States, based on the frequency of occurrence of air toxics mixtures and cancer-related public health impacts. Future research is needed on their joint toxicity and cumulative health impacts.
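The per-capita figure reported above can be checked with simple arithmetic; the population value is an assumption, roughly the 2005 US population.

```python
# Back-of-the-envelope check of the reported figures: ~1,600 annual
# DALYs nationwide and ~0.6 DALYs per 100,000 people. The population
# value is an assumption (approximately the 2005 US population).

us_population = 296_000_000   # assumed, ~2005
total_dalys = 1600            # annual DALYs from the top 10 air toxics

dalys_per_100k = total_dalys / us_population * 100_000
print(round(dalys_per_100k, 2))  # prints 0.54, consistent with ~0.6
```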
A multi-sensor remote sensing approach for measuring primary production from space
NASA Technical Reports Server (NTRS)
Gautier, Catherine
1989-01-01
It is proposed to develop a multi-sensor remote sensing method for computing marine primary productivity from space, based on the capability to measure the primary ocean variables which regulate photosynthesis. The three variables and the sensors which measure them are: (1) downwelling photosynthetically available irradiance, measured by the VISSR sensor on the GOES satellite, (2) sea-surface temperature from AVHRR on NOAA series satellites, and (3) chlorophyll-like pigment concentration from the Nimbus-7/CZCS sensor. These and other measured variables would be combined within empirical or analytical models to compute primary productivity. With this proposed capability of mapping primary productivity on a regional scale, we could begin realizing a more precise and accurate global assessment of its magnitude and variability. Applications would include supplementation and expansion on the horizontal scale of ship-acquired biological data, which is more accurate and which supplies the vertical components of the field, monitoring oceanic response to increased atmospheric carbon dioxide levels, correlation with observed sedimentation patterns and processes, and fisheries management.
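A toy version of the proposed combination step might look as follows; the functional form and coefficients are invented for illustration and are not any published productivity algorithm.

```python
# Hypothetical empirical model of the kind described in the proposal:
# primary productivity as a product of a light-saturation term (from
# PAR), a mild temperature response (from SST), and the chlorophyll
# concentration. All coefficients are illustrative assumptions.

def primary_productivity(par, sst_celsius, chlorophyll):
    """Toy PP estimate (arbitrary units) from the three satellite variables."""
    light_term = par / (par + 5.0)                     # saturating in PAR
    temp_term = 1.2 ** ((sst_celsius - 20.0) / 10.0)   # mild Q10-style factor
    return 100.0 * chlorophyll * light_term * temp_term

pp = primary_productivity(par=40.0, sst_celsius=25.0, chlorophyll=0.3)
```

A real implementation would substitute a validated algorithm and map such a function over gridded VISSR, AVHRR, and CZCS fields.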
A versatile clearing agent for multi-modal brain imaging
Costantini, Irene; Ghobril, Jean-Pierre; Di Giovanna, Antonino Paolo; Mascaro, Anna Letizia Allegra; Silvestri, Ludovico; Müllenbroich, Marie Caroline; Onofri, Leonardo; Conti, Valerio; Vanzi, Francesco; Sacconi, Leonardo; Guerrini, Renzo; Markram, Henry; Iannello, Giulio; Pavone, Francesco Saverio
2015-01-01
Extensive mapping of neuronal connections in the central nervous system requires high-throughput µm-scale imaging of large volumes. In recent years, different approaches have been developed to overcome the limitations due to tissue light scattering. These methods are generally developed to improve the performance of a specific imaging modality, thus limiting comprehensive neuroanatomical exploration by multi-modal optical techniques. Here, we introduce a versatile brain clearing agent (2,2′-thiodiethanol; TDE) suitable for various applications and imaging techniques. TDE is cost-efficient, water-soluble and of low viscosity and, more importantly, it preserves fluorescence, is compatible with immunostaining and does not cause deformations at the sub-cellular level. We demonstrate the effectiveness of this method in different applications: in fixed samples, by imaging a whole mouse hippocampus with serial two-photon tomography; in combination with CLARITY, by reconstructing an entire mouse brain with light sheet microscopy; and in translational research, by imaging immunostained human dysplastic brain tissue. PMID:25950610
Reasoning about real-time systems with temporal interval logic constraints on multi-state automata
NASA Technical Reports Server (NTRS)
Gabrielian, Armen
1991-01-01
Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high-level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.
Xu, Shi-Zhou; Wang, Chun-Jie; Lin, Fang-Li; Li, Shi-Xiang
2017-10-31
Multi-device open-circuit faults are a common fault type in the ANPC (Active Neutral-Point Clamped) three-level inverter and affect the operating stability of the whole system. To improve operating stability, this paper first summarizes the main existing solutions and analyzes all possible states of multi-device open-circuit faults. Secondly, an order-reduction optimal control strategy is proposed for multi-device open-circuit faults to realize fault-tolerant control, based on the topology, control requirements, and operating stability of the ANPC three-level inverter. This control strategy can handle faults in different operation states, and can work in an order-reduction state under specific open-circuit faults of specific device combinations, sacrificing control quality to obtain stability-priority control. Finally, simulation and experiment prove the effectiveness of the proposed strategy.
Multicolor Super-Resolution Fluorescence Imaging via Multi-Parameter Fluorophore Detection
Bates, Mark; Dempsey, Graham T; Chen, Kok Hao; Zhuang, Xiaowei
2012-01-01
Understanding the complexity of the cellular environment will benefit from the ability to unambiguously resolve multiple cellular components, simultaneously and with nanometer-scale spatial resolution. Multicolor super-resolution fluorescence microscopy techniques have been developed to achieve this goal, yet challenges remain in terms of the number of targets that can be simultaneously imaged and the crosstalk between color channels. Herein, we demonstrate multicolor stochastic optical reconstruction microscopy (STORM) based on a multi-parameter detection strategy, which uses both the fluorescence activation wavelength and the emission color to discriminate between photo-activatable fluorescent probes. First, we obtained two-color super-resolution images using the near-infrared cyanine dye Alexa 750 in conjunction with a red cyanine dye Alexa 647, and quantified color crosstalk levels and image registration accuracy. Combinatorial pairing of these two switchable dyes with fluorophores which enhance photo-activation enabled multi-parameter detection of six different probes. Using this approach, we obtained six-color super-resolution fluorescence images of a model sample. The combination of multiple fluorescence detection parameters for improved fluorophore discrimination promises to substantially enhance our ability to visualize multiple cellular targets with sub-diffraction-limit resolution. PMID:22213647
Biomaterial science meets computational biology.
Hutmacher, Dietmar W; Little, J Paige; Pettet, Graeme J; Loessner, Daniela
2015-05-01
There is a pressing need for a predictive tool capable of revealing a holistic understanding of fundamental elements in the normal and pathological cell physiology of organoids in order to decipher the mechanoresponse of cells. Therefore, the integration of a systems bioengineering approach into a validated mathematical model is necessary to develop a new simulation tool. This tool can only be innovative by combining biomaterials science with computational biology. Systems-level and multi-scale experimental data are incorporated into a single framework, thus representing both single cells and collective cell behaviour. Such a computational platform needs to be validated in order to discover key mechano-biological factors associated with cell-cell and cell-niche interactions.
NASA Astrophysics Data System (ADS)
Bashiri, Mahdi; Farshbaf-Geranmayeh, Amir; Mogouie, Hamed
2013-11-01
In this paper, a new method is proposed to optimize a multi-response optimization problem based on the Taguchi method for processes where the controllable factors are smaller-the-better (STB)-type variables and the analyst seeks an optimal solution with smaller amounts of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since not all possible combinations of factor levels are considered in the Taguchi method, the response values of the unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned by the central composite design (CCD) and the genetic algorithm (GA). Then data envelopment analysis (DEA) is applied to determine the efficiency of each treatment. Although the philosophy of DEA is the maximization of outputs versus the minimization of inputs, this issue has been neglected in previous similar studies of multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified on a plastic molding process. Moreover, a sensitivity analysis was performed using an efficiency-estimator neural network. The results show the efficiency of the proposed approach.
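In the special case of a single input and a single output, the DEA efficiency score reduces to each treatment's output/input ratio normalized by the best ratio in the set, which makes the input-minimizing, output-maximizing philosophy easy to see. The treatment values below are illustrative, not from the plastic molding study.

```python
# Single-input, single-output special case of DEA (CCR) efficiency:
# each treatment's output/input ratio divided by the best ratio.
# A treatment with score 1.0 lies on the efficient frontier.

treatments = {            # treatment: (input usage, output quality)
    "T1": (4.0, 8.0),
    "T2": (2.0, 6.0),
    "T3": (5.0, 5.0),
}

ratios = {name: out / inp for name, (inp, out) in treatments.items()}
best = max(ratios.values())
efficiency = {name: r / best for name, r in ratios.items()}

most_efficient = max(efficiency, key=efficiency.get)
print(most_efficient, round(efficiency["T3"], 2))
```

The general multi-input, multi-output case requires solving one linear program per treatment, which is what the paper's DEA step does.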
Encarnação, L Miguel; Bimber, Oliver
2002-01-01
Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches promise to provide valuable means for the interactive data analysis involved, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical employment. This paper addresses two of the shortcomings of such technology: intuitive interaction with multi-dimensional data in immersive and semi-immersive environments, as well as stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.
Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.
Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny
2010-12-01
A multi-enzyme biocatalytic cascade that simultaneously processes five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on the concerted function of eight Boolean AND logic gates, yielding a decision about the physiological condition from logic analysis of complex patterns of the biomarkers. The system represents the first example of a multi-step/multi-enzyme biosensor with built-in logic for the analysis of complex combinations of biochemical inputs. The approach builds on recent advances in enzyme-based biocomputing systems, and the present paper demonstrates the potential applicability of biocomputing to developing novel digital biosensor networks.
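The digital-logic layer of such a biosensor can be sketched in software; the gate wiring, marker names, and thresholds below are hypothetical stand-ins, not the paper's actual 8-gate enzymatic network:

```python
def to_bit(concentration, threshold):
    """Digitize a biomarker: logic 1 at or above its pathological threshold."""
    return 1 if concentration >= threshold else 0

def and_gate(*bits):
    """Boolean AND: output 1 only when every input is 1."""
    return 1 if all(bits) else 0

def injury_decision(markers, thresholds):
    """Toy two-gate cascade (hypothetical wiring): each injury type is
    flagged only when all of its biomarkers are simultaneously high."""
    bits = {k: to_bit(markers[k], thresholds[k]) for k in thresholds}
    tbi = and_gate(bits["enolase"], bits["s100b"])
    sti = and_gate(bits["ck"], bits["ldh"], bits["myoglobin"])
    return {"TBI": tbi, "STI": sti}
```

In the real system the AND operations are carried out chemically by enzyme cascades rather than in code.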
A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)
NASA Astrophysics Data System (ADS)
Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.
2012-12-01
A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework that also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze the comprehensive PTHA for the city of Naples (Italy), including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards, and a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the southern Italian shoreline (we also consider the effects of onshore seismic sources and the associated active faults, for which we provide rupture properties), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources, mainly pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are preliminarily analyzed and combined here, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.
Multi-element stochastic spectral projection for high quantile estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ko, Jordan, E-mail: jordan.ko@mac.com; Garnier, Josselin
2013-06-15
We investigate quantile estimation by a multi-element generalized Polynomial Chaos (gPC) metamodel, where the exact numerical model is approximated by complementary metamodels in overlapping domains that mimic the model's exact response. The gPC metamodel is constructed by the non-intrusive stochastic spectral projection approach, and function evaluation on the gPC metamodel can be considered essentially free. Thus, a large number of Monte Carlo samples from the metamodel can be used to estimate the α-quantile for moderate values of α. As the gPC metamodel is an expansion about the means of the inputs, its accuracy may worsen away from these mean values, where the extreme events may occur. By increasing the approximation accuracy of the metamodel we may eventually improve the accuracy of quantile estimation, but this is very expensive. A multi-element approach is therefore proposed, combining a global metamodel in the standard normal space with supplementary local metamodels constructed in bounded domains about the design points corresponding to the extreme events. To improve the accuracy and to minimize the sampling cost, sparse-tensor and anisotropic-tensor quadratures are tested in addition to the full-tensor Gauss quadrature in the construction of local metamodels; different bounds of the gPC expansion are also examined. The global and local metamodels are combined in the multi-element gPC (MEgPC) approach, and it is shown that MEgPC can be more accurate than Monte Carlo or importance sampling methods for high quantile estimation for input dimensions roughly below N=8, a limit that is very much case- and α-dependent.
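Because evaluating a gPC surrogate is essentially free, a quantile can be read off a large Monte Carlo sample of the surrogate. A minimal one-dimensional sketch with probabilists' Hermite polynomials (the expansion coefficients here are illustrative, not from the paper):

```python
import random

def gpc_eval(xi, coeffs):
    """Evaluate a 1-D gPC expansion in probabilists' Hermite polynomials
    He0=1, He1=xi, He2=xi^2-1 at a standard-normal germ xi."""
    basis = [1.0, xi, xi * xi - 1.0]
    return sum(c * b for c, b in zip(coeffs, basis))

def metamodel_quantile(coeffs, alpha, n=200000, seed=1):
    """Estimate the alpha-quantile of the surrogate output from n cheap samples."""
    rng = random.Random(seed)
    samples = sorted(gpc_eval(rng.gauss(0.0, 1.0), coeffs) for _ in range(n))
    return samples[min(n - 1, int(alpha * n))]
```

The multi-element refinement in the paper would replace the single global expansion with local expansions near the extreme-event design points.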
Ramirez-Sarmiento, Cesar A; Komives, Elizabeth A
2018-04-06
Hydrogen-deuterium exchange mass spectrometry (HDXMS) has emerged as a powerful approach for revealing folding and allostery in protein-protein interactions. The advent of higher resolution mass spectrometers combined with ion mobility separation and ultra performance liquid chromatographic separations have allowed the complete coverage of large protein sequences and multi-protein complexes. Liquid-handling robots have improved the reproducibility and accurate temperature control of the sample preparation. Many researchers are also appreciating the power of combining biophysical approaches such as stopped-flow fluorescence, single molecule FRET, and molecular dynamics simulations with HDXMS. In this review, we focus on studies that have used a combination of approaches to reveal (re)folding of proteins as well as on long-distance allosteric changes upon interaction. Copyright © 2018 Elsevier Inc. All rights reserved.
Kreider, Wayne; Yuldashev, Petr V.; Sapozhnikov, Oleg A.; Farr, Navid; Partanen, Ari; Bailey, Michael R.; Khokhlova, Vera A.
2014-01-01
High-intensity focused ultrasound (HIFU) is a treatment modality that relies on the delivery of acoustic energy to remote tissue sites to induce thermal and/or mechanical tissue ablation. To ensure the safety and efficacy of this medical technology, standard approaches are needed for accurately characterizing the acoustic pressures generated by clinical ultrasound sources under operating conditions. Characterization of HIFU fields is complicated by nonlinear wave propagation and the complexity of phased-array transducers. Previous work has described aspects of an approach that combines measurements and modeling, and here we demonstrate this approach for a clinical phased array transducer. First, low-amplitude hydrophone measurements were performed in water over a scan plane between the array and the focus. Second, these measurements were used to holographically reconstruct the surface vibrations of the transducer and to set a boundary condition for a 3-D acoustic propagation model. Finally, nonlinear simulations of the acoustic field were carried out over a range of source power levels. Simulation results were compared to pressure waveforms measured directly by hydrophone at both low and high power levels, demonstrating that details of the acoustic field including shock formation are quantitatively predicted. PMID:25004539
NASA Astrophysics Data System (ADS)
Parfenov, D. I.; Bolodurina, I. P.
2018-05-01
The article presents the results of developing an approach to detecting and protecting against network attacks on corporate infrastructure deployed on a multi-cloud platform. The proposed approach is based on the combination of two technologies: software-configurable networks and virtualization of network functions. Anomalous traffic is detected with a hybrid neural network consisting of a self-organizing Kohonen network and a multilayer perceptron. The prototype attack-detection system, the method of forming a training sample, and the experiments are described. The study showed that the proposed approach makes it possible to increase the effectiveness of detecting various types of attacks without reducing the performance of the network.
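The Kohonen (self-organizing map) half of such a hybrid detector can be sketched compactly; this is a generic SOM used for anomaly scoring, not the authors' hybrid SOM + perceptron network, and all parameters are illustrative:

```python
import numpy as np

def train_som(data, grid=(3, 3), epochs=100, seed=0):
    """Fit a small Kohonen self-organizing map to 'normal' traffic features."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    w = rng.normal(size=(rows * cols, data.shape[1]))
    coords = np.array([(i, j) for i in range(rows) for j in range(cols)], float)
    for t in range(epochs):
        lr = 0.5 * (1.0 - t / epochs)          # learning rate decays to zero
        sig = 1.5 * (1.0 - t / epochs) + 0.3   # neighbourhood radius shrinks
        for x in data:
            bmu = int(np.argmin(((w - x) ** 2).sum(axis=1)))
            h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * sig ** 2))
            w += lr * h[:, None] * (x - w)
    return w

def anomaly_score(w, x):
    """Distance to the best-matching unit: large for traffic unlike the training data."""
    return float(np.sqrt(((w - np.asarray(x, float)) ** 2).sum(axis=1).min()))
```

In the hybrid scheme, windows with a high SOM score would be passed to a multilayer perceptron for attack-type classification.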
Evaluation of prostate segmentation algorithms for MRI: the PROMISE12 challenge
Litjens, Geert; Toth, Robert; van de Ven, Wendy; Hoeks, Caroline; Kerkstra, Sjoerd; van Ginneken, Bram; Vincent, Graham; Guillard, Gwenael; Birbeck, Neil; Zhang, Jindang; Strand, Robin; Malmberg, Filip; Ou, Yangming; Davatzikos, Christos; Kirschner, Matthias; Jung, Florian; Yuan, Jing; Qiu, Wu; Gao, Qinquan; Edwards, Philip “Eddie”; Maan, Bianca; van der Heijden, Ferdinand; Ghose, Soumya; Mitra, Jhimli; Dowling, Jason; Barratt, Dean; Huisman, Henkjan; Madabhushi, Anant
2014-01-01
Prostate MRI image segmentation has been an area of intense research due to the increased use of MRI as a modality for the clinical workup of prostate cancer. Segmentation is useful for various tasks, e.g. to accurately localize prostate boundaries for radiotherapy or to initialize multi-modal registration algorithms. In the past, it has been difficult for research groups to evaluate prostate segmentation algorithms on multi-center, multi-vendor and multi-protocol data. Especially because we are dealing with MR images, image appearance, resolution and the presence of artifacts are affected by differences in scanners and/or protocols, which in turn can have a large influence on algorithm accuracy. The Prostate MR Image Segmentation (PROMISE12) challenge was set up to allow a fair and meaningful comparison of segmentation methods on the basis of performance and robustness. In this work we discuss the initial results of the online PROMISE12 challenge, and the results obtained in the live challenge workshop hosted by the MICCAI 2012 conference. In the challenge, 100 prostate MR cases from 4 different centers were included, with differences in scanner manufacturer, field strength and protocol. A total of 11 teams from academic research groups and industry participated. The algorithms showed a wide variety of methods and implementations, including active appearance models, atlas registration and level sets. Evaluation was performed using boundary- and volume-based metrics, which were combined into a single score relating the metrics to human expert performance. The winners of the challenge were the algorithms by teams Imorphics and ScrAutoProstate, with overall scores of 85.72 and 84.29. Both algorithms were significantly better than all other algorithms in the challenge (p < 0.05) and had efficient implementations, with run times of 8 minutes and 3 seconds per case, respectively.
Overall, active appearance model based approaches seemed to outperform other approaches like multi-atlas registration, both on accuracy and computation time. Although average algorithm performance was good to excellent and the Imorphics algorithm outperformed the second observer on average, we showed that algorithm combination might lead to further improvement, indicating that optimal performance for prostate segmentation is not yet obtained. All results are available online at http://promise12.grand-challenge.org/. PMID:24418598
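Volume-overlap metrics of the kind combined into the challenge score can be computed directly from binary masks; the Dice similarity coefficient, for instance (masks flattened to 0/1 sequences):

```python
def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    2|A∩B| / (|A| + |B|); 1.0 for identical masks, 0.0 for disjoint ones."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    return 2.0 * inter / (sum(a) + sum(b))
```

Boundary-distance metrics (e.g. the 95% Hausdorff distance) complement Dice by penalizing outline errors that barely change the overlap volume.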
NASA Astrophysics Data System (ADS)
Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.
2017-12-01
We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non-iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid the multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated on a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for standard and typical high-order iterative reinitialization methods; we observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface-interaction methods, shows about a 10-times efficiency improvement over the standard method while maintaining the same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.
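The core idea of non-iterative reinitialization from a minimal set of interface cells can be illustrated with a brute-force sketch: locate zero crossings between neighbouring cells, then rebuild the field as signed distance to the nearest crossing. This is an illustration of the principle, not the paper's forward-tracing algorithm:

```python
import math

def reinitialize(phi, h=1.0):
    """Single-pass reinitialization of a 2-D level-set field (list of lists):
    find zero crossings between grid neighbours, then set each cell to the
    signed distance (grid spacing h) to the nearest interface point.
    Assumes the interface crosses the grid at least once."""
    ny, nx = len(phi), len(phi[0])
    pts = []
    for i in range(ny):
        for j in range(nx):
            for di, dj in ((0, 1), (1, 0)):
                ii, jj = i + di, j + dj
                if ii < ny and jj < nx and phi[i][j] * phi[ii][jj] < 0:
                    # linear interpolation locates the zero crossing
                    t = phi[i][j] / (phi[i][j] - phi[ii][jj])
                    pts.append((i + di * t, j + dj * t))
    out = [[0.0] * nx for _ in range(ny)]
    for i in range(ny):
        for j in range(nx):
            d = h * min(math.hypot(i - p, j - q) for p, q in pts)
            out[i][j] = math.copysign(d, phi[i][j])
    return out
```

The paper's contribution is doing this efficiently (and for the companion extending step) without the all-pairs distance search used here.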
Modelling strategies to predict the multi-scale effects of rural land management change
NASA Astrophysics Data System (ADS)
Bulygina, N.; Ballard, C. E.; Jackson, B. M.; McIntyre, N.; Marshall, M.; Reynolds, B.; Wheater, H. S.
2011-12-01
Changes to the rural landscape due to agricultural land management are ubiquitous, yet predicting the multi-scale effects of land management change on hydrological response remains an important scientific challenge. Much empirical research has been of little generic value due to inadequate design and funding of monitoring programmes, while the modelling issues challenge the capability of data-based, conceptual and physics-based modelling approaches. In this paper we report on a major UK research programme, motivated by a national need to quantify effects of agricultural intensification on flood risk. Working with a consortium of farmers in upland Wales, a multi-scale experimental programme (from experimental plots to 2nd order catchments) was developed to address issues of upland agricultural intensification. This provided data support for a multi-scale modelling programme, in which highly detailed physics-based models were conditioned on the experimental data and used to explore effects of potential field-scale interventions. A meta-modelling strategy was developed to represent detailed modelling in a computationally-efficient manner for catchment-scale simulation; this allowed catchment-scale quantification of potential management options. For more general application to data-sparse areas, alternative approaches were needed. Physics-based models were developed for a range of upland management problems, including the restoration of drained peatlands, afforestation, and changing grazing practices. Their performance was explored using literature and surrogate data; although subject to high levels of uncertainty, important insights were obtained, of practical relevance to management decisions. In parallel, regionalised conceptual modelling was used to explore the potential of indices of catchment response, conditioned on readily-available catchment characteristics, to represent ungauged catchments subject to land management change. 
Although based in part on speculative relationships, significant predictive power was derived from this approach. Finally, using a formal Bayesian procedure, these different sources of information were combined with local flow data in a catchment-scale conceptual model application, i.e. using small-scale physical properties, regionalised signatures of flow and available flow measurements.
NASA Astrophysics Data System (ADS)
Zunoubi, Mohammad R.; Anderson, Brian; Naderi, Shadi A.; Madden, Timothy J.; Dajani, Iyad
2017-03-01
The development of high-power fiber lasers is of great interest due to the advantages they offer relative to other laser technologies. Currently, the maximum power from a reportedly single-mode fiber amplifier stands at 10 kW. Though impressive, this power level was achieved at the cost of a large spectral linewidth, making the laser unsuitable for the coherent or spectral beam combination techniques required to reach the power levels necessary for airborne tactical applications. An effective approach to limiting the stimulated Brillouin scattering (SBS) effect is to insert an electro-optic phase modulator at the low-power end of a master oscillator power amplifier (MOPA) system. As a result, the optical power is spread among spectral sidebands, thus raising the overall SBS threshold of the amplifier. The purpose of this work is to present a comprehensive numerical scheme, based on the extended nonlinear Schrödinger equations, that allows for accurate analysis of phase-modulated fiber amplifier systems with respect to group velocity dispersion (GVD) and Kerr nonlinearities and their effect on coherent beam combining efficiency. We have simulated a high-power MOPA system modulated via a filtered pseudo-random bit sequence format for different clock rates and power levels. We show that at clock rates of ≥30 GHz, the combination of GVD and self-phase modulation may lead to a drastic drop in beam combining efficiency at the multi-kW level. Furthermore, we extend our work to study the effect of cross-phase modulation when an amplifier is seeded with two laser sources.
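Propagation under GVD and self-phase modulation is commonly integrated with the split-step Fourier method. A textbook sketch of the scalar nonlinear Schrödinger equation in normalized units (first-order splitting; this is not the authors' extended multi-field model):

```python
import numpy as np

def split_step_nlse(a0, dz, nz, beta2, gamma, dt):
    """Split-step Fourier integration of the scalar NLSE with dispersion
    (beta2, applied in the frequency domain) and Kerr self-phase modulation
    (gamma, applied in the time domain). Normalized, illustrative units."""
    n = len(a0)
    w = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)   # angular frequency grid
    disp = np.exp(-0.5j * beta2 * w ** 2 * dz)  # linear half of the operator
    a = np.asarray(a0, complex)
    for _ in range(nz):
        a = np.fft.ifft(np.fft.fft(a) * disp)             # dispersion step
        a = a * np.exp(1j * gamma * np.abs(a) ** 2 * dz)  # SPM step
    return a
```

Both sub-steps are pure phase factors, so pulse energy is conserved to machine precision, a useful sanity check on any such solver.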
Detecting bursts in the EEG of very and extremely premature infants using a multi-feature approach.
O'Toole, John M; Boylan, Geraldine B; Lloyd, Rhodri O; Goulding, Robert M; Vanhatalo, Sampsa; Stevenson, Nathan J
2017-07-01
To develop a method that segments preterm EEG into bursts and inter-bursts by extracting and combining multiple EEG features. Two EEG experts annotated bursts in individual EEG channels for 36 preterm infants with gestational age < 30 weeks. The feature set included spectral, amplitude, and frequency-weighted energy features. Using a consensus annotation, feature selection removed redundant features and a support vector machine combined the remaining features. Area under the receiver operating characteristic curve (AUC) and Cohen's kappa (κ) evaluated performance within a cross-validation procedure. The proposed channel-independent method improves AUC by 4-5% over existing methods (p < 0.001, n=36), with median (95% confidence interval) AUC of 0.989 (0.973-0.997) and sensitivity-specificity of 95.8-94.4%. Agreement rates between the detector and the experts' annotations, κ=0.72 (0.36-0.83) and κ=0.65 (0.32-0.81), are comparable to inter-rater agreement, κ=0.60 (0.21-0.74). Automating the visual identification of bursts in preterm EEG is achievable with a high level of accuracy. Multiple features, combined using a data-driven approach, improve on existing single-feature methods. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
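Two of the feature families the detector draws on, amplitude and frequency-weighted energy, can be sketched per analysis window; the Teager-Kaiser operator below is a generic stand-in for the paper's frequency-weighted energy features, and the fixed weights are illustrative (the paper learns them with an SVM):

```python
import math

def window_features(x):
    """Per-window burst features: RMS amplitude and the mean Teager-Kaiser
    energy x[n]^2 - x[n-1]*x[n+1], a frequency-weighted energy measure."""
    rms = math.sqrt(sum(v * v for v in x) / len(x))
    tk = sum(x[i] * x[i] - x[i - 1] * x[i + 1]
             for i in range(1, len(x) - 1)) / (len(x) - 2)
    return rms, tk

def linear_score(features, weights, bias=0.0):
    """Combine features with learned weights (a linear SVM decision function
    is exactly such a weighted sum); a positive score flags a burst."""
    return sum(w * f for w, f in zip(weights, features)) + bias
```

A real detector would standardize each feature and learn the weights and bias from annotated EEG rather than fixing them by hand.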
Combined non-parametric and parametric approach for identification of time-variant systems
NASA Astrophysics Data System (ADS)
Dziedziech, Kajetan; Czop, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz
2018-03-01
Identification of systems, structures and machines with variable physical parameters is a challenging task, especially when time-varying vibration modes are involved. The paper proposes a new combined, two-step (non-parametric and parametric) modelling approach to determine time-varying vibration modes from input-output measurements. In the first step, single-degree-of-freedom (SDOF) vibration modes are extracted from a multi-degree-of-freedom (MDOF) non-parametric system representation using time-frequency wavelet-based filters. The second step involves a time-varying parametric representation of the extracted modes using recursive linear autoregressive-moving-average with exogenous inputs (ARMAX) models. The combined approach is demonstrated through system identification analysis of an experimental mass-varying MDOF frame-like structure subjected to random excitation. The results show that the proposed combined method correctly captures the dynamics of the analysed structure using minimal a priori information on the model.
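The recursive parametric step can be illustrated with recursive least squares for a simpler ARX model (the paper fits ARMAX models; this sketch drops the moving-average noise term). A forgetting factor below 1 lets the estimates track time-varying parameters:

```python
def rls_arx(u, y, na=1, nb=1, lam=1.0):
    """Recursive least squares for an ARX model
    y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb].
    Returns [a1..a_na, b1..b_nb]; lam < 1 adds exponential forgetting."""
    n = na + nb
    theta = [0.0] * n
    P = [[1000.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(max(na, nb), len(y)):
        phi = [-y[k - i - 1] for i in range(na)] + [u[k - i - 1] for i in range(nb)]
        Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
        denom = lam + sum(phi[i] * Pphi[i] for i in range(n))
        K = [p / denom for p in Pphi]                       # gain vector
        err = y[k] - sum(t * p for t, p in zip(theta, phi)) # prediction error
        theta = [t + K[i] * err for i, t in enumerate(theta)]
        P = [[(P[i][j] - K[i] * Pphi[j]) / lam for j in range(n)] for i in range(n)]
    return theta
```

On noiseless data with persistent excitation the estimates converge to the true coefficients; with lam around 0.95-0.99 the same recursion tracks slowly drifting parameters.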
NASA Technical Reports Server (NTRS)
Tarabalka, Y.; Tilton, J. C.; Benediktsson, J. A.; Chanussot, J.
2012-01-01
The Hierarchical SEGmentation (HSEG) algorithm, which combines region object finding with region object clustering, has given good performances for multi- and hyperspectral image analysis. This technique produces at its output a hierarchical set of image segmentations. The automated selection of a single segmentation level is often necessary. We propose and investigate the use of automatically selected markers for this purpose. In this paper, a novel Marker-based HSEG (M-HSEG) method for spectral-spatial classification of hyperspectral images is proposed. Two classification-based approaches for automatic marker selection are adapted and compared for this purpose. Then, a novel constrained marker-based HSEG algorithm is applied, resulting in a spectral-spatial classification map. Three different implementations of the M-HSEG method are proposed and their performances in terms of classification accuracies are compared. The experimental results, presented for three hyperspectral airborne images, demonstrate that the proposed approach yields accurate segmentation and classification maps, and thus is attractive for remote sensing image analysis.
Resilience and Robustness in Long-Term Planning of the National Energy and Transportation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibanez, Eduardo; Lavrenz, Steven; Gkritza, Konstantina
2016-01-01
The most significant energy-consuming infrastructures and the greatest contributors to greenhouse gases for any developed nation today are the electric and freight/passenger transportation systems. Technological alternatives for producing, transporting and converting energy for electric and transportation systems are numerous. Addressing the costs, sustainability and resilience of electric and transportation needs requires long-term assessment, since these capital-intensive infrastructures take years to build, with lifetimes approaching a century. Yet the advent of electrically driven transportation, including cars, trucks and trains, creates potential interdependencies between the two infrastructures that may be both problematic and beneficial. We are developing modelling capability to perform long-term electric and transportation infrastructure design at a national level, accounting for their interdependencies. The approach combines network flow modelling with a multi-objective solution method. We describe and compare it to the state of the art in energy planning models. An example is presented to illustrate important features of this new approach.
A multi-objective optimization approach accurately resolves protein domain architectures
Bernardes, J.S.; Vieira, F.R.J.; Zaverucha, G.; Carbone, A.
2016-01-01
Motivation: Given a protein sequence and a number of potential domains matching it, what are the domain content and the most likely domain architecture for the sequence? This problem is of fundamental importance in protein annotation, constituting one of the main steps of all predictive annotation strategies. On the other hand, when potential domains are several and in conflict because of overlapping domain boundaries, finding a solution for the problem might become difficult. An accurate prediction of the domain architecture of a multi-domain protein provides important information for function prediction, comparative genomics and molecular evolution. Results: We developed DAMA (Domain Annotation by a Multi-objective Approach), a novel approach that identifies architectures through a multi-objective optimization algorithm combining scores of domain matches, previously observed multi-domain co-occurrence and domain overlapping. DAMA has been validated on a known benchmark dataset based on CATH structural domain assignments and on the set of Plasmodium falciparum proteins. When compared with existing tools on both datasets, it outperforms all of them. Availability and implementation: DAMA software is implemented in C++ and the source code can be found at http://www.lcqb.upmc.fr/DAMA. Contact: juliana.silva_bernardes@upmc.fr or alessandra.carbone@lip6.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26458889
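The combinatorial core of the problem, choosing a best-scoring set of non-overlapping domain matches, can be illustrated with weighted interval scheduling. This is a simplified single-objective stand-in for DAMA's multi-objective search (which also weighs co-occurrence and allows limited overlaps):

```python
from bisect import bisect_right

def best_architecture(matches):
    """matches: list of (start, end, score) domain hits on a sequence.
    Returns (total_score, chosen_matches) maximizing the summed score over
    non-overlapping hits (touching boundaries allowed), via the classic
    weighted-interval-scheduling dynamic program."""
    ms = sorted(matches, key=lambda m: m[1])
    ends = [m[1] for m in ms]
    score, choice = [0.0], [[]]
    for idx, (s, e, sc) in enumerate(ms):
        j = bisect_right(ends, s, 0, idx)  # hits ending at or before this start
        take = score[j] + sc
        if take > score[idx]:
            score.append(take)
            choice.append(choice[j] + [(s, e, sc)])
        else:
            score.append(score[idx])
            choice.append(choice[idx])
    return score[-1], choice[-1]
```

Replacing the scalar score with a vector of objectives and a dominance test is the step that turns this into a multi-objective formulation.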
A Visual Analytics Approach for Station-Based Air Quality Data
Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui
2016-01-01
With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover the linear and periodical patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt the changes of data size and granularity in trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision making support. PMID:28029117
Measuring sustainable development using a multi-criteria model: a case study.
Boggia, Antonio; Cortina, Carla
2010-11-01
This paper shows how Multi-criteria Decision Analysis (MCDA) can help in a complex process such as the assessment of the level of sustainability of a given area. The paper presents the results of a study in which a model for measuring sustainability was implemented to better aid public policy decisions regarding sustainability. In order to assess sustainability in specific areas, a methodological approach based on multi-criteria analysis has been developed. The aim is to rank areas in order to understand the specific technical and/or financial support they need to develop sustainable growth. The case study presented is an assessment of the level of sustainability in different areas of an Italian region using the MCDA approach. Our results show that MCDA is a suitable approach for sustainability assessment. The results are easy to understand, and the evaluation path is clear and transparent. This is what decision makers need to support their decisions. The multi-criteria evaluation model has been developed in accordance with sustainable development economic theory, so that the final results have a clear meaning in terms of sustainability. Copyright 2010 Elsevier Ltd. All rights reserved.
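One common MCDA aggregation, a weighted sum over min-max normalised criteria, can be sketched as follows; this is a generic illustration, not necessarily the specific multi-criteria model used in the study:

```python
def mcda_rank(areas, weights, benefit):
    """Rank alternatives by a weighted sum of min-max normalised criteria.
    areas: {name: tuple of raw criterion values}; benefit[i] is True when
    larger raw values are better for criterion i. Returns (ranking, scores)."""
    cols = list(zip(*areas.values()))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]

    def norm(v, i):
        if hi[i] == lo[i]:
            return 0.0
        x = (v - lo[i]) / (hi[i] - lo[i])
        return x if benefit[i] else 1.0 - x  # cost criteria are inverted

    scores = {a: sum(w * norm(v, i)
                     for i, (v, w) in enumerate(zip(vals, weights)))
              for a, vals in areas.items()}
    return sorted(scores, key=scores.get, reverse=True), scores
```

The resulting ranking is what would guide the targeting of technical or financial support across areas.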
Wu, Dan; Ma, Ting; Ceritoglu, Can; Li, Yue; Chotiyanonta, Jill; Hou, Zhipeng; Hsu, John; Xu, Xin; Brown, Timothy; Miller, Michael I; Mori, Susumu
2016-01-15
Technologies for multi-atlas brain segmentation of T1-weighted MRI images have progressed rapidly in recent years, with highly promising results. This approach, however, relies on a large number of atlases with accurate and consistent structural identifications. Here, we introduce our atlas inventories (n=90), which cover ages 4 to 82 years with unique hierarchical structural definitions (286 structures at the finest level). This multi-atlas library resource provides the flexibility to choose appropriate atlases for various studies with different age ranges and structure-definition criteria. In this paper, we describe the details of the atlas resources and demonstrate the improved accuracy achievable with a dynamic age-matching approach, in which the atlases that most closely match the subject's age are dynamically selected. The advanced atlas creation strategy, together with the atlas pre-selection principles, is expected to support the further development of multi-atlas image segmentation. Copyright © 2015 Elsevier Inc. All rights reserved.
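The dynamic age-matching idea reduces, at its simplest, to a nearest-neighbour selection over atlas ages (the selection size k and ages below are illustrative):

```python
def select_atlases(atlas_ages, subject_age, k=5):
    """Return the indices of the k atlases whose ages are closest to the
    subject's age, sorted by increasing |age gap| (dynamic age matching)."""
    order = sorted(range(len(atlas_ages)),
                   key=lambda i: abs(atlas_ages[i] - subject_age))
    return order[:k]
```

The selected atlases would then be registered to the subject and their labels fused, as in standard multi-atlas segmentation pipelines.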
Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)
NASA Astrophysics Data System (ADS)
Li, L.; Wu, Y.
2017-12-01
The gravity and magnetic fields are potential fields, which leads to inherently non-unique interpretations. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to identify homologous gravity and magnetic anomalies and reduce this ambiguity. The traditional combined analysis uses linear regression of the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation through the correlation coefficient, slope and intercept. In this calculation, due to the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in such cases, homologous gravity and magnetic anomalies appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed into a pseudomagnetic tensor matrix for magnetization along the geomagnetic field direction under the homology condition. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and linear regression analysis is carried out. The resulting correlation coefficient, slope and intercept indicate the degree of homology, the Poisson's ratio and the distribution of remanent magnetization, respectively. We test the approach using a synthetic model under complex magnetization; the results show that it can still identify a common source under strong remanence and establish the Poisson's ratio. Finally, the approach is applied to data from China, and the results demonstrate that it is feasible.
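One widely used eigenvalue form of the NSS (from the literature on magnetic gradient tensor analysis; the abstract does not state which definition the authors use) can be sketched directly:

```python
import numpy as np

def nss(tensor):
    """Normalized source strength of a 3x3 symmetric (traceless) magnetic
    gradient tensor, using the eigenvalue form mu = sqrt(-l2^2 - l1*l3)
    with eigenvalues ordered l1 >= l2 >= l3. One common definition in the
    literature; clamped at zero against small negative round-off."""
    lam3, lam2, lam1 = np.linalg.eigvalsh(np.asarray(tensor, float))
    return float(np.sqrt(max(0.0, -lam2 * lam2 - lam1 * lam3)))
```

Because it depends only on the tensor eigenvalues, this quantity is insensitive to the magnetization direction, which is what makes it robust to remanence.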
NASA Astrophysics Data System (ADS)
Affouri, Aida; Dezileau, Laurent; Kallel, Nejib
2017-06-01
Climate models project that rising atmospheric carbon dioxide concentrations will increase the frequency and severity of some extreme weather events. Flood events represent a major risk for populations and infrastructures settled on coastal lowlands. Recent studies of lagoon sediments have enhanced our knowledge of extreme hydrological events such as palaeo-storms and of their relation with climate change over the last millennium. However, few studies have been undertaken to reconstruct past flood events from lagoon sediments. Here, past flood activity was investigated using a multi-proxy approach combining sedimentological and geochemical analyses of surface sediments from a southeastern Tunisian catchment in order to trace the origin of sediment deposits in the El Bibane Lagoon. Three sediment sources were identified: marine, fluvial and aeolian. Applying this multi-proxy approach to core BL12-10, recovered from the El Bibane Lagoon, we find that finer material, a high content of clay and silt, and high elemental ratios (Fe/Ca and Ti/Ca) characterise the sedimentological signature of the palaeo-flood levels identified in the lagoonal sequence. For the last century, the period covered by the BL12-10 short core, three palaeo-flood events were identified. The ages of these flood events were determined by 210Pb and 137Cs chronology as AD 1995 ± 6, 1970 ± 9 and 1945 ± 9. These results show a good temporal correlation with historical flood events recorded in southern Tunisia in the last century (AD 1932, 1969, 1979 and 1995). Our findings suggest that reconstruction of the history of extreme hydrological events during the upper Holocene is possible at this location through the use of sedimentary archives.
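The geochemical screening step, flagging core depths where terrigenous-input ratios (Fe/Ca, Ti/Ca) stand out, can be sketched with a simple exceedance rule; the median-multiple threshold here is an invented heuristic for illustration, not the paper's criterion:

```python
from statistics import median

def flag_flood_layers(fe, ca, ti, factor=2.0):
    """Flag sample indices where both Fe/Ca and Ti/Ca exceed `factor`
    times their down-core medians, a crude proxy for flood layers rich in
    terrigenous (fluvial) material."""
    feca = [f / c for f, c in zip(fe, ca)]
    tica = [t / c for t, c in zip(ti, ca)]
    m1, m2 = median(feca), median(tica)
    return [i for i in range(len(feca))
            if feca[i] > factor * m1 and tica[i] > factor * m2]
```

Flagged depths would then be cross-checked against grain-size data and the 210Pb/137Cs age model, as in the multi-proxy approach described above.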
Large Margin Multi-Modal Multi-Task Feature Extraction for Image Classification.
Yong Luo; Yonggang Wen; Dacheng Tao; Jie Gui; Chao Xu
2016-01-01
The features used in many image analysis-based applications are frequently of very high dimension. Feature extraction offers several advantages in high-dimensional cases, and many recent studies have used multi-task feature extraction approaches, which often outperform single-task feature extraction approaches. However, most of these methods are limited in that they only consider data represented by a single type of feature, even though images are usually described by features from multiple modalities. We therefore propose a novel large margin multi-modal multi-task feature extraction (LM3FE) framework for handling multi-modal features for image classification. In particular, LM3FE simultaneously learns the feature extraction matrix for each modality and the modality combination coefficients. In this way, LM3FE not only handles correlated and noisy features, but also utilizes the complementarity of different modalities to further help reduce feature redundancy in each modality. The large margin principle employed also helps to extract strongly predictive features, so that they are more suitable for prediction (e.g., classification). An alternating algorithm is developed for problem optimization, and each subproblem can be efficiently solved. Experiments on two challenging real-world image data sets demonstrate the effectiveness and superiority of the proposed method.
NASA Astrophysics Data System (ADS)
Roshanian, Jafar; Jodei, Jahangir; Mirshams, Mehran; Ebrahimi, Reza; Mirzaee, Masood
A new automated multi-level-of-fidelity Multi-Disciplinary Design Optimization (MDO) methodology has been developed at the MDO Laboratory of K.N. Toosi University of Technology. This paper explains a new design approach through the formulation of the developed disciplinary modules. A conceptual design for a small, solid-propellant launch vehicle was considered with a two-level fidelity structure. Low- and medium-level-of-fidelity (LoF) disciplinary codes were developed and linked. Appropriate design and analysis codes were defined according to their effect on the conceptual design process. Simultaneous optimization of the launch vehicle was performed at the discipline level and the system level. Propulsion, aerodynamics, structure and trajectory disciplinary codes were used. To reach the minimum launch weight, the low-LoF code first searches the whole design space to achieve the mission requirements. Then the medium-LoF code receives the output of the low-LoF code and gives a value near the optimum launch weight with more detail and higher fidelity.
Bazyk, Susan; Winne, Rebecca
2013-04-01
Obesity in children and youth is a major public health concern known to have a significant impact on physical and mental health. Although traditional approaches to obesity have emphasized diet and exercise at the individual level, broader attention to the mental health consequences of obesity is crucial. Individuals who are obese live in a world where they are often less accepted, resulting in social exclusion and discrimination. A public health multi-tiered approach to obesity focusing on mental health promotion, prevention, and individualized intervention is presented.
A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol.
Zeng, Ping; Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun
2017-01-01
In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on-all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications.
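As a rough illustration of the core idea (not the published MH algorithm itself), multi-pattern matching from a fixed starting position can bucket the patterns by length and compare a hash of the corresponding URL prefix before falling back to an exact check, replacing symbol-by-symbol scanning with numerical comparison. Function and variable names here are invented:

```python
# Bucket patterns by length; within each bucket, index them by hash so a
# candidate URL prefix is rejected with one integer comparison in the
# common case. An exact string check guards against hash collisions.
def build_index(patterns):
    index = {}  # length -> {hash(pattern): [patterns with that hash]}
    for p in patterns:
        index.setdefault(len(p), {}).setdefault(hash(p), []).append(p)
    return index

def match_prefix(url, index):
    """Return all indexed patterns matching url from position 0."""
    hits = []
    for length, bucket in index.items():
        prefix = url[:length]
        if len(prefix) == length:               # url long enough
            for cand in bucket.get(hash(prefix), []):
                if cand == prefix:              # collision guard
                    hits.append(cand)
    return hits

idx = build_index(["/api/", "/api/v2/", "/static/"])
print(match_prefix("/api/v2/users", idx))  # ['/api/', '/api/v2/']
```

Because every lookup starts at a fixed position, there is no sliding window as in Aho-Corasick-style scanning; the work per URL is one hash and one comparison per distinct pattern length.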
NASA Astrophysics Data System (ADS)
Fei, Peng; Lee, Juhyun; Packard, René R. Sevag; Sereti, Konstantina-Ioanna; Xu, Hao; Ma, Jianguo; Ding, Yichen; Kang, Hanul; Chen, Harrison; Sung, Kevin; Kulkarni, Rajan; Ardehali, Reza; Kuo, C.-C. Jay; Xu, Xiaolei; Ho, Chih-Ming; Hsiai, Tzung K.
2016-03-01
Light Sheet Fluorescence Microscopy (LSFM) enables multi-dimensional and multi-scale imaging by illuminating specimens with a thin, separate sheet of laser light. It allows rapid plane illumination for reduced photo-damage and superior axial resolution and contrast. We hereby demonstrate cardiac LSFM (c-LSFM) imaging to assess the functional architecture of zebrafish embryos with a retrospective cardiac synchronization algorithm for four-dimensional reconstruction (3-D space + time). By combining our approach with tissue clearing techniques, we reveal the entire cardiac structures and hypertrabeculation of adult zebrafish hearts in response to doxorubicin treatment. By integrating the resolution enhancement technique with c-LSFM to increase the resolving power under a large field-of-view, we demonstrate the use of a low-power objective to resolve the entire architecture of large-scale neonatal mouse hearts, revealing the helical orientation of individual myocardial fibers. Therefore, our c-LSFM imaging approach provides multi-scale visualization of architecture and function to drive cardiovascular research, with translational implications in congenital heart diseases.
Multi-stage robust scheme for citrus identification from high resolution airborne images
NASA Astrophysics Data System (ADS)
Amorós-López, Julia; Izquierdo Verdiguier, Emma; Gómez-Chova, Luis; Muñoz-Marí, Jordi; Zoilo Rodríguez-Barreiro, Jorge; Camps-Valls, Gustavo; Calpe-Maravilla, Javier
2008-10-01
Identification of land cover types is one of the most critical activities in remote sensing. Nowadays, managing land resources using remote sensing techniques is becoming a common procedure to speed up the process while reducing costs. However, data analysis procedures should satisfy the accuracy figures demanded by institutions and governments for further administrative actions. This paper presents a methodological scheme to update the citrus Geographical Information System (GIS) of the Comunidad Valenciana autonomous region (Spain). The proposed approach introduces a multi-stage automatic scheme to reduce visual photointerpretation and ground validation tasks. First, an object-oriented feature extraction process is carried out for each cadastral parcel from very high spatial resolution (VHR) images (0.5 m) acquired in the visible and near infrared. Next, several automatic classifiers (decision trees, multilayer perceptrons, and support vector machines) are trained and combined to improve the final accuracy of the results. The proposed strategy fulfills the high accuracy demanded by policy makers by combining automatic classification methods with the available visual photointerpretation resources. A level of confidence based on the agreement between classifiers allows effective management by fixing the number of parcels to be reviewed. The proposed methodology can be applied to similar problems and applications.
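The agreement-based confidence idea can be sketched in a few lines. This is an illustrative sketch, not the authors' code: each trained classifier casts a vote per parcel, the majority label is kept, and the fraction of agreeing votes becomes the confidence used to decide which parcels go to photointerpretation. The labels and threshold are invented:

```python
from collections import Counter

def combine(votes, review_threshold=1.0):
    """Majority-combine classifier votes for one cadastral parcel.

    votes: predicted labels from the individual classifiers.
    Returns (label, agreement, needs_review); parcels whose agreement
    falls below the threshold are flagged for manual review.
    """
    label, count = Counter(votes).most_common(1)[0]
    agreement = count / len(votes)          # 1.0 means unanimous
    needs_review = agreement < review_threshold
    return label, agreement, needs_review

print(combine(["citrus", "citrus", "citrus"]))      # unanimous, no review
print(combine(["citrus", "citrus", "non-citrus"]))  # 2/3 agreement, review
```

Raising or lowering the threshold directly fixes the number of parcels sent to review, which is how the trade-off between accuracy and manual effort is managed.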
Groundwater-abstraction induced land subsidence and groundwater regulation in the North China Plain
NASA Astrophysics Data System (ADS)
Guo, H.; Wang, L.; Cheng, G.; Zhang, Z.
2015-11-01
Land subsidence can be induced when geological and hydrogeological conditions combine with intensive groundwater abstraction. The development and utilization of groundwater in the North China Plain (NCP) has brought great benefits, but has also led to a series of environmental and geological problems accompanying groundwater-level declines, including land subsidence. Subsidence occurs commonly in the NCP, and analyses show that multi-layer aquifer systems with deep confined aquifers and thick compressible clay layers are the key geological and hydrogeological conditions responsible for its development in this region. Groundwater overdraft causes aquifer-system compaction, which results in subsidence. A calibrated, transient groundwater-flow numerical model of the Beijing plain portion of the NCP was developed using MODFLOW. According to the available water supply and demand in the Beijing plain, several groundwater regulation scenarios were designed. These regulation scenarios were simulated with the groundwater model and assessed using a multi-criteria fuzzy pattern recognition model. This approach proves very useful for scientific analysis of the sustainable development and utilization of groundwater resources. The evaluation results show that sustainable development of groundwater resources may be achieved in the Beijing plain when measures such as control of groundwater abstraction and increased artificial recharge are combined favourably.
Fate and Transport of Tungsten at Camp Edwards Small Arms Ranges
2007-08-01
area into the lower berm and/or trough. A similar approach was used in the lower berm area with samples collected from soil sloughing from the...bucket auger to collect samples beneath the bullet pockets and the trough. A multi-increment, subsurface soil sample was made by combining the...range. From these soil profiles, a total of 72 multi-increment subsurface soil samples was collected (Table 2). The auger was cleaned between holes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Li; He, Ya-Ling; Kang, Qinjun
2013-12-15
A coupled (hybrid) simulation strategy spatially combining the finite volume method (FVM) and the lattice Boltzmann method (LBM), called CFVLBM, is developed to simulate coupled multi-scale multi-physicochemical processes. In the CFVLBM, the computational domain of multi-scale problems is divided into two sub-domains, i.e., an open, free-fluid region and a region filled with porous materials. The FVM and LBM are used for these two regions, respectively, with information exchanged at the interface between the two sub-domains. A general reconstruction operator (RO) is proposed to derive the distribution functions in the LBM from the corresponding macro scalar, the governing equation of which obeys the convection–diffusion equation. The CFVLBM and the RO are validated on several typical physicochemical problems and are then applied to simulate complex multi-scale coupled fluid flow, heat transfer, mass transport, and chemical reaction in a wall-coated micro-reactor. The maximum ratio of the grid size between the FVM and LBM regions is explored and discussed. Highlights: • A coupled simulation strategy for simulating multi-scale phenomena is developed. • Finite volume method and lattice Boltzmann method are coupled. • A reconstruction operator is derived to transfer information at the sub-domain interface. • Coupled multi-scale multiple physicochemical processes in a micro-reactor are simulated. • Techniques to save computational resources and improve efficiency are discussed.
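A minimal sketch of what such a reconstruction operator can look like for a convection–diffusion scalar, using the standard D2Q9 lattice (the published RO may differ in detail): the FVM side supplies the macroscopic concentration and velocity at the interface, from which equilibrium distribution functions are rebuilt for the LBM side.

```python
# Standard D2Q9 weights and discrete velocities (rest, 4 axis, 4 diagonal).
W = [4/9] + [1/9] * 4 + [1/36] * 4
E = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
CS2 = 1 / 3  # lattice speed of sound squared

def reconstruct(C, u):
    """Equilibrium reconstruction for a convection-diffusion scalar:
    f_i = w_i * C * (1 + e_i . u / cs^2), given the macroscopic
    concentration C and velocity u handed over by the FVM sub-domain."""
    return [w * C * (1 + (ex * u[0] + ey * u[1]) / CS2)
            for w, (ex, ey) in zip(W, E)]

f = reconstruct(C=2.0, u=(0.05, 0.0))
print(sum(f))  # zeroth moment recovers C = 2.0
```

By construction, the zeroth moment of the reconstructed populations returns the scalar C and the first moment returns the flux C·u, so the hand-off between the two sub-domains conserves the transported quantity.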
Analyzing the quality robustness of chemotherapy plans with respect to model uncertainties.
Hoffmann, Anna; Scherrer, Alexander; Küfer, Karl-Heinz
2015-01-01
Mathematical models of chemotherapy planning problems contain various biomedical parameters whose values are difficult to quantify and are thus subject to some uncertainty. This uncertainty propagates into the therapy plans computed on these models, which raises the question of how robust the expected therapy quality is. This work introduces a combined approach for analyzing the quality robustness of plans in terms of dosing levels with respect to model uncertainties in chemotherapy planning. It uses concepts from multi-criteria decision making for studying parameters related to the balancing of the different therapy goals, and concepts from sensitivity analysis for examining parameters describing the underlying biomedical processes and their interplay. This approach allows for a profound assessment of how stable a therapy plan's quality is with respect to parametric changes in the underlying mathematical model. Copyright © 2014 Elsevier Inc. All rights reserved.
A gantry-based tri-modality system for bioluminescence tomography
Yan, Han; Lin, Yuting; Barber, William C.; Unlu, Mehmet Burcin; Gulsen, Gultekin
2012-01-01
A gantry-based tri-modality system that combines bioluminescence tomography (BLT), diffuse optical tomography (DOT), and x-ray computed tomography (XCT) in the same setting is presented here. The purpose of this system is to perform bioluminescence tomography using a multi-modality imaging approach. As parts of this hybrid system, XCT and DOT provide anatomical information and background optical property maps. This structural and functional a priori information is used to guide and constrain the bioluminescence reconstruction algorithm and ultimately improve the BLT results. The performance of the combined system is evaluated using multi-modality phantoms. In particular, a cylindrical heterogeneous multi-modality phantom that contains regions with higher optical absorption and x-ray attenuation is constructed. We showed that a 1.5 mm diameter bioluminescence inclusion can be localized accurately with the functional a priori information, while its source strength can be recovered more accurately using both the structural and the functional a priori information. PMID:22559540
3D Digital Surveying and Modelling of Cave Geometry: Application to Paleolithic Rock Art
González-Aguilera, Diego; Muñoz-Nieto, Angel; Gómez-Lahoz, Javier; Herrero-Pascual, Jesus; Gutierrez-Alonso, Gabriel
2009-01-01
3D digital surveying and modelling of cave geometry represents a relevant approach for research, management and preservation of our cultural and geological legacy. In this paper, a multi-sensor approach based on a terrestrial laser scanner, a high-resolution digital camera and a total station is presented. Two emblematic caves of Paleolithic human occupation situated in northern Spain, “Las Caldas” and “Peña de Candamo”, have been chosen to put this approach into practice. As a result, an integral and multi-scalable 3D model is generated which may allow other scientists (pre-historians, geologists, etc.) to work on two different levels, integrating different Paleolithic Art datasets: (1) a basic level based on the accurate and metric support provided by the laser scanner; and (2) an advanced level using range- and image-based modelling. PMID:22399958
NASA Astrophysics Data System (ADS)
Kopf, S.; McGlynn, S.; Cowley, E.; Green, A.; Newman, D. K.; Orphan, V. J.
2014-12-01
Metabolic rates of microbial communities constitute a key physiological parameter for understanding the in situ growth constraints for life in any environment. Isotope labeling techniques provide a powerful approach for measuring such biological activity, because isotopically enriched substrate tracers incorporated into biological materials can be detected with high sensitivity by isotope-ratio mass spectrometry. Nanometer-scale secondary ion mass spectrometry (NanoSIMS) combined with stable isotope labeling provides a unique tool for studying the spatiometabolic activity of microbial populations at the single-cell level in order to assess both community structure and population diversity. However, assessing the distribution and range of microbial activity in complex environmental systems with slow-growing organisms, diverse carbon and nitrogen sources, or heterotrophic subpopulations poses a tremendous technical challenge, because the introduction of isotopically labeled substrates frequently changes nutrient availability and can inflate or bias measures of activity. Here, we present hydrogen isotope labeling with deuterated water as an important new addition to the isotopic toolkit and apply it to the determination of single-cell microbial activities by NanoSIMS imaging. This labeling technique minimally alters the aquatic chemical environment, delivers a strong label even at minimal additions (the natural background is very low), provides an equally universal substrate for all forms of life even in complex, carbon- and nitrogen-saturated systems, and can be combined with other isotopic tracers. The combination of heavy-water labeling with the most commonly used NanoSIMS tracer, 15N, is technically challenging but opens up a powerful new set of multi-tracer experiments for the study of microbial activity in complex communities.
We present the first truly simultaneous single cell triple isotope system measurements of 2H/1H, 13C/12C and 15N/14N and apply it to study of microbial metabolic heterogeneity and nitrogen metabolism in a continuous culture case study. Our data provide insight into both the diversity of microbial activity rates, as well as patterns of ammonium utilization at the single cell level.
NASA Astrophysics Data System (ADS)
Helm, P. Johannes; Reppen, Trond; Heggelund, Paul
2009-02-01
Multi-Photon Laser Scanning Microscopy (MPLSM) is today one of the most powerful experimental tools in cellular neurophysiology, notably in studies of the functional dynamics of signal processing in single neurons. Simultaneous recording of fluorescence signals at high spatial and temporal resolution and of electric signals by means of multi-electrode patch clamp techniques has provided new paths for the systematic investigation of neuronal mechanisms. In particular, this approach has opened the way for direct studies of dendritic signal processing in neurons. We report on a setup optimized for simultaneous electrophysiological multi-electrode patch clamp and multi-photon laser scanning fluorescence microscopy experiments on brain slices. The microscopic system is based on a modified, commercially available confocal laser scanning microscope (CLSM). From a technical and operational point of view, two developments are important. Firstly, to reduce the workload of the experimentalist, who in general must concentrate on controlling the electrophysiological parameters during the recordings, a system of shutters has been installed, together with dedicated electronic modules that protect the photodetectors against destructive light levels caused by erroneous opening or closing of microscopic light paths. Secondly, the standard detection unit has been improved by installing the photomultiplier tubes (PMTs) in a Peltier-cooled thermal box that shields the detector from both room temperature and distortions caused by external electromagnetic fields. The electrophysiological system is based on an industry-standard multi-patch clamp unit ergonomically arranged around the microscope stage. The electrophysiological and scanning processes can be time-coordinated by standard trigger electronics.
NASA Astrophysics Data System (ADS)
Kuzle, A.
2018-06-01
The important role that metacognition plays as a predictor of student mathematical learning and mathematical problem-solving has been extensively documented. But only recently has attention turned to the primary grades, and more research is needed at this level. The goals of this paper are threefold: (1) to present a metacognitive framework for mathematics problem-solving, (2) to describe a multi-method interview approach developed to study student mathematical metacognition, and (3) to empirically evaluate the utility of the model and the adaptation of the approach in the context of grade 2 and grade 4 mathematics problem-solving. The results are discussed with regard not only to further development of the adapted multi-method interview approach, but also to its theoretical and practical implications.
Kuhlmann, Ellen; Larsen, Christa
2015-12-01
Health workforce needs have moved up the reform agendas, but policymaking often remains 'piece-meal work' and does not respond to the complexity of health workforce challenges. This article argues for innovation in healthcare governance as a key to greater sustainability of health human resources. The aim is to develop a multi-level approach that helps to identify gaps in governance and improve policy interventions. Pilot research into nursing and medicine in Germany, carried out between 2013 and 2015 using a qualitative methodology, serves to illustrate systems-based governance weaknesses. Three explorative cases address major responses to health workforce shortages, comprising migration/mobility of nurses, reform of nursing education, and gender-sensitive work management of hospital doctors. The findings illustrate a lack of connections between transnational/EU and organizational governance, between national and local levels, between occupational and sector governance, and between organizations/hospital management and professional development. Consequently, health workforce innovations need a multi-level governance approach to gain transformative potential and to help close the existing gaps in governance. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Multiframe video coding for improved performance over wireless channels.
Budagavi, M; Gibson, J D
2001-01-01
We propose and evaluate a multi-frame extension to block motion compensation (BMC) coding of videoconferencing-type video signals for wireless channels. The multi-frame BMC (MF-BMC) coder makes use of the redundancy that exists across multiple frames in typical videoconferencing sequences to achieve additional compression over that obtained by the single-frame BMC (SF-BMC) approach, such as in the base-level H.263 codec. The MF-BMC approach also has an inherent ability to overcome some transmission errors and is thus more robust than the SF-BMC approach. We model the error propagation process in MF-BMC coding as a multiple Markov chain and use Markov chain analysis to infer that the use of multiple frames in motion compensation increases robustness. The Markov chain analysis is also used to devise a simple scheme that randomizes the selection of the frame (among the multiple previous frames) used in BMC to achieve additional robustness. The proposed MF-BMC coders are a multi-frame extension of the base-level H.263 coder and are found to be more robust than the base-level H.263 coder when subjected to simulated errors commonly encountered on wireless channels.
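The randomized frame-selection scheme can be illustrated in miniature. This is a toy sketch, not the paper's codec: instead of always predicting from the immediately previous frame, the encoder draws the reference uniformly from the last N decoded frames, which is what breaks long error-propagation chains in the Markov-chain analysis. Only frame indices are modelled here; no actual motion compensation is performed, and the window size is an assumption:

```python
import random

def choose_reference(current_index, num_ref_frames=3, rng=random):
    """Pick a reference frame index uniformly among the last
    `num_ref_frames` decoded frames preceding `current_index`."""
    oldest = max(0, current_index - num_ref_frames)
    return rng.randrange(oldest, current_index)

rng = random.Random(0)
refs = [choose_reference(i, rng=rng) for i in range(1, 8)]
print(refs)  # each reference lies within 3 frames of the frame it predicts
```

Because a corrupted frame is only referenced with probability 1/N at each step, the expected length of an error-propagation chain shrinks compared with always referencing the previous frame.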
Velpuri, N.M.; Senay, G.B.; Asante, K.O.
2011-01-01
Managing limited surface water resources is a great challenge in areas where ground-based data are limited or unavailable. Direct or indirect measurement of surface water resources through remote sensing offers several advantages for monitoring in ungauged basins. A physically based hydrologic technique for monitoring lake water levels in ungauged basins using multi-source satellite data, such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, a digital elevation model, and other data, is presented. This approach is applied to model Lake Turkana water levels from 1998 to 2009. Modelling results showed that the model can reasonably capture the patterns and seasonal variations of the lake water level fluctuations. A composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data is used for model calibration (1998-2000) and model validation (2001-2009). Validation results showed that model-based lake levels are in good agreement with observed satellite altimetry data: the Pearson's correlation coefficient was 0.81 during the validation period. The model efficiency, estimated using the Nash-Sutcliffe coefficient of efficiency (NSCE), is 0.93, 0.55 and 0.66 for the calibration, validation and combined periods, respectively. Further, the model-based estimates showed a root mean square error of 0.62 m and a mean absolute error of 0.46 m, with a positive mean bias error of 0.36 m, for the validation period (2001-2009). These errors are less than 15% of the natural variability of the lake, giving high confidence in the modelled lake level estimates.
The approach presented in this paper can be used to (a) simulate patterns of lake water level variations in data scarce regions, (b) operationally monitor lake water levels in ungauged basins, (c) derive historical lake level information using satellite rainfall and evapotranspiration data, and (d) augment the information provided by the satellite altimetry systems on changes in lake water levels. © Author(s) 2011.
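The skill scores quoted above (NSCE, RMSE, MAE, mean bias) are all computed from paired modelled/observed lake levels; a minimal sketch with made-up numbers, not the Lake Turkana data:

```python
import math

def skill(obs, sim):
    """Nash-Sutcliffe efficiency, RMSE, MAE and mean bias (sim - obs)."""
    n = len(obs)
    mean_obs = sum(obs) / n
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    nsce = 1 - sse / sum((o - mean_obs) ** 2 for o in obs)
    rmse = math.sqrt(sse / n)
    mae = sum(abs(o - s) for o, s in zip(obs, sim)) / n
    bias = sum(s - o for o, s in zip(obs, sim)) / n
    return nsce, rmse, mae, bias

# Illustrative five-point series of lake levels in metres.
obs = [362.1, 362.6, 363.0, 362.4, 361.9]
sim = [362.3, 362.5, 363.2, 362.6, 362.0]
print(skill(obs, sim))
```

NSCE equal to 1 is a perfect fit and 0 means the model is no better than the observed mean, so values of 0.55-0.93 as reported above indicate useful skill; a positive bias, as here, means the model tends to sit above the observations.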
A Comprehensive Prevention Approach to Reducing Assault Offenses and Assault Injuries Among Youth
Heinze, Justin E.; Reischl, Thomas M.; Bai, Mengqiao; Roche, Jessica S.; Morrel-Samuels, Susan; Cunningham, Rebecca M.; Zimmerman, Marc A.
2018-01-01
Since 2011, the CDC-funded Michigan Youth Violence Prevention Center (MI-YVPC), working with community partners, has implemented a comprehensive prevention approach to reducing youth violence in Flint, MI, based on public health principles. MI-YVPC employed an intervention strategy that capitalizes on existing community resources and the application of evidence-based programs using a social-ecological approach to change. We evaluated the combined effect of six programs in reducing assaults and injury among 10-24 year olds in the intervention area relative to a matched comparison community. We used generalized linear mixed models to examine change in the intervention area's counts of reported assault offenses and assault-injury presentations, relative to the comparison area, over a period of six years before and two and a half years after the start of the intervention. Results indicated that youth victimization and assault injuries fell in the intervention area subsequent to the initiation of the interventions and that these reductions were sustained over time. Our evaluation demonstrated that a comprehensive multi-level approach can be effective for reducing youth violence and injury. PMID:26572898
Weatherill, John; Krause, Stefan; Voyce, Kevin; Drijfhout, Falko; Levy, Amir; Cassidy, Nigel
2014-03-01
Integrated approaches for the identification of pollutant linkages between aquifers and streams are of crucial importance for evaluating the environmental risks posed by industrial contaminants like trichloroethene (TCE). This study presents a systematic, multi-scale approach to characterising groundwater TCE discharge to a 'gaining' UK lowland stream receiving baseflow from a major Permo-Triassic sandstone aquifer. Beginning with a limited number of initial monitoring points, we aim to provide a 'first pass' mechanistic understanding of the plume's fate at the aquifer/stream interface using a novel combination of streambed diffusion samplers, riparian monitoring wells and drive-point mini-piezometers in a spatially nested sampling configuration. Our results indicate the potential discharge zone of the plume to extend along a stream reach of 120 m in length, delineated by a network of 60 in-situ diffusion samplers. Within this section, a 40 m long sub-reach of higher concentration (>10 μg L-1) was identified, centred on a meander bend in the floodplain. 25 multi-level mini-piezometers installed to target this down-scaled reach revealed even higher TCE concentrations (20-40 μg L-1), significantly above alluvial groundwater samples (<6 μg L-1) from 15 riparian monitoring wells. Significant lateral and vertical spatial heterogeneity in TCE concentrations within the top 1 m of the streambed was observed with the decimetre-scale vertical resolution provided by multi-level mini-piezometers. It appears that the distribution of fine-grained material in the Holocene deposits of the riparian floodplain and below the channel is exerting significant local-scale geological controls on the location and magnitude of the TCE discharge. Large-scale in-situ biodegradation of the plume was not evident during the monitoring campaigns.
However, detections of cis-1,2-dichloroethene and vinyl chloride in discrete sections of the sediment profile indicate that shallow (e.g., <20 cm) TCE transformation may be significant at a local scale in the streambed deposits. Our findings highlight the need for efficient multi-scale monitoring strategies in geologically heterogeneous lowland stream/aquifer systems in order to more adequately quantify the risk to surface water ecological receptors posed by point-source groundwater contaminants like TCE. Copyright © 2013 Elsevier B.V. All rights reserved.
Shahamiri, Seyed Reza; Salim, Siti Salwah Binti
2014-09-01
Automatic speech recognition (ASR) can be very helpful for speakers who suffer from dysarthria, a neurological disability that damages the control of the motor speech articulators. Although a few attempts have been made to apply ASR technologies to dysarthric speech, previous studies show that such ASR systems have not attained an adequate level of performance. In this study, a dysarthric multi-networks speech recognizer (DM-NSR) model is provided using a realization of the multi-views multi-learners approach called multi-nets artificial neural networks, which tolerates the variability of dysarthric speech. In particular, the DM-NSR model employs several ANNs (as learners) to approximate the likelihood of ASR vocabulary words and to deal with the complexity of dysarthric speech. The proposed DM-NSR approach was presented in both speaker-dependent and speaker-independent paradigms. To highlight the performance of the proposed model over legacy models, multi-views single-learner versions of the DM-NSRs were also provided and their efficiencies compared in detail. Moreover, a comparison between the prominent dysarthric ASR methods and the proposed one is provided. The results show that the DM-NSR improved the recognition rate by up to 24.67% and reduced the error rate by up to 8.63% relative to the reference model.
A multi-part matching strategy for mapping LOINC with laboratory terminologies
Lee, Li-Hui; Groß, Anika; Hartung, Michael; Liou, Der-Ming; Rahm, Erhard
2014-01-01
Objective To address the problem of mapping local laboratory terminologies to Logical Observation Identifiers Names and Codes (LOINC). To study different ontology matching algorithms and investigate how the probability of term combinations in LOINC helps to increase match quality and reduce manual effort. Materials and methods We proposed two matching strategies: full-name and multi-part. The multi-part approach also considers the occurrence probability of combined concept parts. It can further recommend possible combinations of concept parts to allow more local terms to be mapped. Three real-world laboratory databases from Taiwanese hospitals were used to validate the proposed strategies with respect to different quality measures and execution run time. A comparison with the commonly used tool, Regenstrief LOINC Mapping Assistant (RELMA) Lab Auto Mapper (LAM), was also carried out. Results The new multi-part strategy yields the best match quality, with F-measure values between 89% and 96%. It can automatically match 70–85% of the laboratory terminologies to LOINC. The recommendation step can further propose mappings to (proposed) LOINC concepts for 9–20% of the local terminology concepts. On average, 91% of the local terminology concepts can be correctly mapped to existing or newly proposed LOINC concepts. Conclusions The mapping quality of the multi-part strategy is significantly better than that of LAM. It enables domain experts to perform LOINC matching with little manual work. The probability of term combinations proved to be a valuable strategy for increasing the quality of match results, providing recommendations for proposed LOINC concepts, and decreasing the run time for match processing. PMID:24363318
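The multi-part idea can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: a local term and a LOINC fullname are split into parts (e.g., component, property, system), part-wise similarity is scored, and the candidate is weighted by how probable that part combination is; the similarity measure, part names and probability value are all invented here:

```python
def part_sim(a, b):
    """Crude token-overlap (Jaccard) similarity between two part strings."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def score(local_parts, loinc_parts, combo_prob):
    """Average part-wise similarity, weighted by the occurrence
    probability of the candidate's part combination in LOINC."""
    sims = [part_sim(a, b) for a, b in zip(local_parts, loinc_parts)]
    return (sum(sims) / len(sims)) * combo_prob

# Toy example: (component, property, system) triples.
local = ("glucose", "mass concentration", "serum")
cand = ("glucose", "mass concentration", "serum plasma")
print(score(local, cand, combo_prob=0.9))
```

Weighting by the combination probability pushes frequent, plausible part combinations up the candidate list, which is also what makes it possible to recommend not-yet-existing LOINC concepts built from common part combinations.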
Microbially assisted phytoremediation approaches for two multi-element contaminated sites.
Langella, Francesca; Grawunder, Anja; Stark, Romy; Weist, Aileen; Merten, Dirk; Haferburg, Götz; Büchel, Georg; Kothe, Erika
2014-01-01
Phytoremediation is an environmentally friendly, cost-effective technology for the soft restoration of abandoned mine sites. The grasses Agrostis capillaris, Deschampsia flexuosa and Festuca rubra, and the annual herb Helianthus annuus were combined with microbial consortia in pot experiments on multi-metal polluted substrates collected at a former uranium mine near Ronneburg, Germany, and a historic copper mine in Kopparberg, Sweden, to test for phytoextraction versus phytostabilization abilities. Metal uptake into plant biomass was evaluated to identify optimal plant-microbe combinations for each substrate. Metal bioavailability was found to be plant species and element specific, and influenced by the applied bacterial consortia of 10 strains, each isolated from the same soil to which it was applied. H. annuus showed high extraction capacity for several metals on the German soil independent of inoculation. Our study also showed a significant enhancement of extraction for F. rubra and A. capillaris when combined with the bacterial consortium, although grasses are usually considered metal excluder species. On the Swedish mixed substrate, which owing to its toxicity was amended with 30% bark compost, A. capillaris inoculated with the respective consortium was able to extract multi-metal contaminants.
Yu, Dongjun; Wu, Xiaowei; Shen, Hongbin; Yang, Jian; Tang, Zhenmin; Qi, Yong; Yang, Jingyu
2012-12-01
Membrane proteins account for roughly 30% of the proteins encoded in the genome and play important roles in living organisms. Previous studies have revealed that membrane proteins' structures and functions show obvious cell organelle-specific properties. Hence, it is highly desirable to predict a membrane protein's subcellular location from its primary sequence, considering the extreme difficulties of membrane protein wet-lab studies. Although many models have been developed for predicting protein subcellular locations, only a few are specific to membrane proteins. Existing prediction approaches were constructed with statistical machine learning algorithms on a serial combination of multi-view features, i.e., different feature vectors are simply concatenated to form a super feature vector. However, such simple combination of features simultaneously increases the information redundancy, which could, in turn, deteriorate the final prediction accuracy. This is why prediction success rates in the serial super space were often found to be even lower than those in a single-view space. The purpose of this paper is to investigate a proper method for fusing multiple multi-view protein sequential features for subcellular location prediction. Instead of the serial strategy, we propose a novel parallel framework for fusing multiple membrane protein multi-view attributes that represents protein samples in complex spaces. We also propose generalized principal component analysis (GPCA) for feature reduction in the complex geometry. All the experimental results, obtained with different machine learning algorithms on benchmark membrane protein subcellular localization datasets, demonstrate that the newly proposed parallel strategy outperforms the traditional serial approach. We also demonstrate the efficacy of the parallel strategy on a soluble protein subcellular localization dataset, indicating that the parallel technique is flexible enough to suit other computational biology problems.
The software and datasets are available at: http://www.csbio.sjtu.edu.cn/bioinf/mpsp.
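The contrast between the serial and parallel fusion strategies described above can be sketched in a few lines: serial fusion concatenates the feature views (growing the dimension), while parallel fusion pads the views to equal length and combines them as the real and imaginary parts of one complex vector. This is a minimal illustration of the complex-space idea, with made-up toy vectors; the paper's GPCA reduction step is not shown.

```python
import numpy as np

def serial_fusion(u, v):
    """Serial strategy: concatenate the two feature views."""
    return np.concatenate([u, v])

def parallel_fusion(u, v):
    """Parallel strategy: zero-pad the shorter view, then combine the
    views as real + imaginary parts of a single complex vector."""
    n = max(len(u), len(v))
    u = np.pad(u, (0, n - len(u)))
    v = np.pad(v, (0, n - len(v)))
    return u + 1j * v

u = np.array([0.2, 0.5, 0.1])   # e.g. one sequence-derived feature view
v = np.array([0.7, 0.3])        # e.g. a second, shorter feature view
z = parallel_fusion(u, v)       # dimension 3, vs. 5 for serial fusion
```

The parallel representation keeps the dimension at max(len(u), len(v)) rather than len(u) + len(v), which is one way redundancy growth is avoided before downstream reduction.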
Elastic all-optical multi-hop interconnection in data centers with adaptive spectrum allocation
NASA Astrophysics Data System (ADS)
Hong, Yuanyuan; Hong, Xuezhi; Chen, Jiajia; He, Sailing
2017-01-01
In this paper, a novel flex-grid all-optical interconnect scheme that supports transparent multi-hop connections in data centers is proposed. An inter-rack all-optical multi-hop connection is realized with an optical loop employed at flex-grid wavelength selective switches (WSSs) in an intermediate rack rather than by relaying through optical-electric-optical (O-E-O) conversions. Compared with the conventional O-E-O based approach, the proposed all-optical scheme is able to off-load the traffic at intermediate racks, leading to a reduction of the power consumption and cost. The transmission performance of the proposed flex-grid multi-hop all-optical interconnect scheme with various modulation formats, including both coherently detected and directly detected approaches, is investigated by Monte-Carlo simulations. To enhance the spectrum efficiency (SE), number-of-hop adaptive bandwidth allocation is introduced. Numerical results show that the SE can be improved by up to 33.3% at 40 Gbps, and by up to 25% at 100 Gbps. The impact of parameters, such as the targeted bit error rate (BER) level and the insertion loss of components, on the transmission performance of the proposed approach is also explored. The results show that the maximum SE improvement of the adaptive approach over the non-adaptive one is enhanced with the decrease of the targeted BER levels and the component insertion loss.
Pirpinia, Kleopatra; Bosman, Peter A N; Loo, Claudette E; Winter-Warnars, Gonneke; Janssen, Natasja N Y; Scholten, Astrid N; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja
2017-06-23
Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. 
Here, patient-specific multi-objective weight optimization is needed; it obtains a mean TRE of 13.6 mm without guidance information, reduced to 7.3 mm with guidance information, while also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.
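The Pareto front at the heart of this approach is simply the set of non-dominated trade-offs. A generic sketch (made-up objective values, not the paper's evolutionary algorithm) of filtering sampled weight combinations down to their Pareto front:

```python
def pareto_front(points):
    """Return the points (tuples of objective values, all to be
    minimized) that are not dominated by any other point."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (dissimilarity, deformation magnitude) outcomes from
# randomly sampled weight combinations:
outcomes = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = pareto_front(outcomes)  # (3.0, 4.0) is dominated by (2.0, 3.0)
```

A user navigating such a front trades one objective against the other directly, instead of guessing weight values by trial and error.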
A Multi-Faceted Approach to Successful Transition for Students with Intellectual Disabilities
ERIC Educational Resources Information Center
Dubberly, Russell G.
2011-01-01
This report summarizes the multi-faceted, dynamic instructional model implemented to increase positive transition outcomes for high school students with intellectual disabilities. This report is based on the programmatic methods implemented within a secondary-level school in an urban setting. This pedagogical model facilitates the use of…
Constrained Multi-Level Algorithm for Trajectory Optimization
NASA Astrophysics Data System (ADS)
Adimurthy, V.; Tandon, S. R.; Jessy, Antony; Kumar, C. Ravi
The emphasis on low cost access to space inspired many recent developments in the methodology of trajectory optimization. Ref.1 uses a spectral patching method for optimization, where global orthogonal polynomials are used to describe the dynamical constraints. A two-tier approach of optimization is used in Ref.2 for a missile mid-course trajectory optimization. A hybrid analytical/numerical approach is described in Ref.3, where an initial analytical vacuum solution is taken and gradually atmospheric effects are introduced. Ref.4 emphasizes the fact that the nonlinear constraints which occur in the initial and middle portions of the trajectory behave very nonlinearly with respect to the variables, making the optimization very difficult to solve in the direct and indirect shooting methods. The problem is made more complex when different phases of the trajectory have different objectives of optimization and also have different path constraints. Such problems can be effectively addressed by multi-level optimization. In the multi-level methods reported so far, optimization is first done in identified sub-level problems, where some coordination variables are kept fixed for global iteration. After all the sub optimizations are completed, a higher-level optimization iteration with all the coordination and main variables is done. This is followed by further sub system optimizations with new coordination variables. This process is continued until convergence. In this paper we use a multi-level constrained optimization algorithm which avoids the repeated local sub system optimizations and which also removes the problem of non-linear sensitivity inherent in the single step approaches. Fall-zone constraints, structural load constraints and thermal constraints are considered. In this algorithm, there is only a single multi-level sequence of state and multiplier updates in a framework of an augmented Lagrangian.
Han-Tapia multiplier updates are used in view of their special role in diagonalised methods, being the only single update with quadratic convergence. For a single level, the diagonalised multiplier method (DMM) is described in Ref.5. The main advantage of the two-level analogue of the DMM approach is that it avoids the inner loop optimizations required in the other methods. The scheme also introduces a gradient change measure to reduce the computational time needed to calculate the gradients. It is demonstrated that the new multi-level scheme leads to a robust procedure to handle the sensitivity of the constraints, and the multiple objectives of different trajectory phases. Ref. 1. Fahroo, F. and Ross, M., "A Spectral Patching Method for Direct Trajectory Optimization", The Journal of the Astronautical Sciences, Vol. 48, 2000, pp. 269-286. Ref. 2. Phillips, C.A. and Drake, J.C., "Trajectory Optimization for a Missile using a Multitier Approach", Journal of Spacecraft and Rockets, Vol. 37, 2000, pp. 663-669. Ref. 3. Gath, P.F. and Calise, A.J., "Optimization of Launch Vehicle Ascent Trajectories with Path Constraints and Coast Arcs", Journal of Guidance, Control, and Dynamics, Vol. 24, 2001, pp. 296-304. Ref. 4. Betts, J.T., "Survey of Numerical Methods for Trajectory Optimization", Journal of Guidance, Control, and Dynamics, Vol. 21, 1998, pp. 193-207. Ref. 5. Adimurthy, V., "Launch Vehicle Trajectory Optimization", Acta Astronautica, Vol. 15, 1987, pp. 845-850.
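The single-sequence idea, one state update and one multiplier update per iteration inside an augmented Lagrangian with no inner-loop sub-optimizations, can be illustrated on a toy equality-constrained problem. The problem, step size, and penalty value below are invented for illustration; this is not the paper's trajectory formulation or its Han-Tapia update.

```python
def solve(rho=10.0, step=0.02, iters=2000):
    """Minimize f(x) = x1^2 + x2^2 subject to c(x) = x1 + x2 - 1 = 0
    via the augmented Lagrangian L = f + lam*c + (rho/2)*c^2,
    interleaving one gradient step on x with one multiplier update
    per iteration (no inner optimization loop)."""
    x1 = x2 = lam = 0.0
    for _ in range(iters):
        c = x1 + x2 - 1.0
        g1 = 2 * x1 + lam + rho * c   # dL/dx1
        g2 = 2 * x2 + lam + rho * c   # dL/dx2
        x1 -= step * g1               # single state update
        x2 -= step * g2
        lam += rho * (x1 + x2 - 1.0)  # single multiplier update
    return x1, x2, lam

x1, x2, lam = solve()  # converges to x1 = x2 = 0.5, lam = -1
```

The analytic optimum is x1 = x2 = 0.5 with multiplier -1 (from 2x + lam = 0 on the constraint), so the interleaved updates recover both the primal solution and the multiplier without ever fully solving a subproblem.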
NASA Astrophysics Data System (ADS)
Fatrias, D.; Kamil, I.; Meilani, D.
2018-03-01
Coordinating business operations with suppliers becomes increasingly important to survive and prosper in a dynamic business environment. A good partnership with suppliers not only increases efficiency, but also strengthens corporate competitiveness. Motivated by this concern, this study aims to develop a practical approach to multi-criteria supplier evaluation using the combined methods of the Taguchi loss function (TLF), the best-worst method (BWM) and VIse Kriterijumska Optimizacija kompromisno Resenje (VIKOR). A new integrative framework adopting these methods is our main contribution to the supplier evaluation literature. In this integrated approach, a compromised supplier ranking list based on the loss scores of suppliers is obtained using the efficient steps of a pairwise-comparison-based decision making process. Implementation on a case problem with real data from the crumb rubber industry shows the usefulness of the proposed approach. Finally, a suitable managerial implication is presented.
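The VIKOR step of such a pipeline can be sketched compactly: compute each alternative's group utility S and individual regret R against the per-criterion best and worst values, then rank by the compromise index Q. The supplier scores and weights below are hypothetical, and the sketch assumes benefit criteria and distinct S and R ranges; it is not the paper's TLF/BWM-integrated procedure.

```python
def vikor(scores, weights, v=0.5):
    """Rank alternatives with VIKOR. scores[i][j] is the performance of
    alternative i on benefit criterion j (higher is better)."""
    m, n = len(scores), len(scores[0])
    f_best = [max(scores[i][j] for i in range(m)) for j in range(n)]
    f_worst = [min(scores[i][j] for i in range(m)) for j in range(n)]
    S, R = [], []
    for i in range(m):
        d = [weights[j] * (f_best[j] - scores[i][j]) / (f_best[j] - f_worst[j])
             for j in range(n)]
        S.append(sum(d))   # group utility (weighted sum of regrets)
        R.append(max(d))   # individual regret (worst single criterion)
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    Q = [v * (S[i] - s_min) / (s_max - s_min)
         + (1 - v) * (R[i] - r_min) / (r_max - r_min) for i in range(m)]
    return sorted(range(m), key=lambda i: Q[i])  # best alternative first

# Hypothetical supplier scores on (quality, delivery, cost-performance),
# with weights that BWM could have produced:
ranking = vikor([[7, 9, 6], [9, 6, 8], [8, 8, 9]], [0.5, 0.3, 0.2])
```

Supplier 2 wins here because it is never far from the best on any criterion, which is exactly the compromise behavior VIKOR rewards.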
NASA Astrophysics Data System (ADS)
Becker, P.; Idelsohn, S. R.; Oñate, E.
2015-06-01
This paper describes a strategy to solve multi-fluid and fluid-structure interaction (FSI) problems using Lagrangian particles combined with a fixed finite element (FE) mesh. Our approach is an extension of the fluid-only PFEM-2 (Idelsohn et al., Eng Comput 30(2):2-2, 2013; Idelsohn et al., J Numer Methods Fluids, 2014) which uses explicit integration over the streamlines to improve accuracy. As a result, the convective term does not appear in the set of equations solved on the fixed mesh. Enrichments in the pressure field are used to improve the description of the interface between phases.
A scale space feature based registration technique for fusion of satellite imagery
NASA Technical Reports Server (NTRS)
Raghavan, Srini; Cromp, Robert F.; Campbell, William C.
1997-01-01
Feature based registration is one of the most reliable methods to register multi-sensor images (both active and passive imagery) since features are often more reliable than intensity or radiometric values. The only situation where a feature based approach will fail is when the scene is completely homogeneous or densely textural, in which case a combination of feature and intensity based methods may yield better results. In this paper, we present some preliminary results of testing our scale space feature based registration technique, a modified version of the feature based method developed earlier for classification of multi-sensor imagery. The proposed approach removes the sensitivity in parameter selection experienced in the earlier version, as explained later.
The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meier, David E.; Coble, Jamie B.; Jordan, David V.
The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
NASA Astrophysics Data System (ADS)
Ravnik, Domen; Jerman, Tim; Pernuš, Franjo; Likar, Boštjan; Špiclin, Žiga
2018-03-01
Performance of a convolutional neural network (CNN) based white-matter lesion segmentation in magnetic resonance (MR) brain images was evaluated under various conditions involving different levels of image preprocessing and augmentation applied and different compositions of the training dataset. On images of sixty multiple sclerosis patients, half acquired on one scanner and half on another scanner from a different vendor, we first created highly accurate multi-rater consensus-based lesion segmentations, which were used in several experiments to evaluate the CNN segmentation result. First, the CNN was trained and tested without preprocessing the images and by using various combinations of preprocessing techniques, namely histogram-based intensity standardization, normalization by whitening, and training dataset augmentation by flipping the images across the midsagittal plane. Then, the CNN was trained and tested on images of the same, different or interleaved scanner datasets using a cross-validation approach. The results indicate that image preprocessing has little impact on performance in a same-scanner situation, while between-scanner performance benefits most from intensity standardization and normalization, but also further from incorporating heterogeneous multi-scanner datasets in the training phase. Under such conditions the between-scanner performance of the CNN approaches that of the ideal situation, when the CNN is trained and tested on the same scanner dataset.
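The whitening-style normalization mentioned above amounts to rescaling each volume's foreground intensities to zero mean and unit variance, which makes images from different scanners commensurable. A minimal sketch on a toy volume (the shapes, mask, and intensity model are invented; the study's histogram-based standardization is a separate, more elaborate step):

```python
import numpy as np

def standardize(image, mask):
    """Zero-mean / unit-variance intensity normalization ('whitening')
    computed over the voxels inside the (brain) mask."""
    vals = image[mask]
    return (image - vals.mean()) / vals.std()

rng = np.random.default_rng(1)
img = rng.normal(100.0, 20.0, (4, 4, 4))  # toy MR volume
mask = np.ones_like(img, dtype=bool)      # toy brain mask
norm = standardize(img, mask)
```

After this step two scanners with different intensity offsets and gains map to comparable value ranges, which is one plausible reason the between-scanner setting benefits most from it.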
Guirro, Maria; Costa, Andrea; Gual-Grau, Andreu; Mayneris-Perxachs, Jordi; Torrell, Helena; Herrero, Pol; Canela, Núria; Arola, Lluís
2018-02-10
Over the last few years, the application of high-throughput meta-omics methods has provided great progress in improving the knowledge of the gut ecosystem and linking its biodiversity to host health conditions, offering complementary support to classical microbiology. Gut microbiota plays a crucial role in relevant diseases such as obesity or cardiovascular disease (CVD), and its regulation is closely influenced by several factors, such as dietary composition. In fact, polyphenol-rich diets are the most palatable treatment to prevent hypertension associated with CVD, although the polyphenol-microbiota interactions have not been completely elucidated. For this reason, the aim of this study was to evaluate the microbiota effect in obese rats supplemented with hesperidin, after being fed a cafeteria or standard diet, using a multi meta-omics approach combining metagenomics and metaproteomics analyses. We report that the cafeteria diet induces obesity, resulting in changes in the microbiota composition, which are related to functional alterations at the proteome level. In addition, hesperidin supplementation alters microbiota diversity and also proteins involved in important metabolic pathways. Overall, going deeper into strategies to integrate the omics sciences is necessary to understand the complex relationships between the host, gut microbiota, and diet.
Hybrid BCI approach to control an artificial tibio-femoral joint.
Mercado, Luis; Rodriguez-Linan, Angel; Torres-Trevino, Luis M; Quiroz, G
2016-08-01
Brain-Computer Interfaces (BCIs) for disabled people should allow them to use their remaining functionalities as control possibilities. BCIs connect the brain with external devices to perform the volition or intent of movement, regardless of whether the individual is unable to perform the task due to body impairments. In this work we fuse electromyographic (EMG) with electroencephalographic (EEG) activity in a framework called the "Hybrid-BCI" (hBCI) approach to control the movement of a simulated tibio-femoral joint. Two mathematical models of a tibio-femoral joint are used to emulate the kinematic and dynamic behavior of the knee. The interest is to reproduce different velocities of the human gait cycle. The EEG signals are used to classify the user's intent, namely the velocity changes, while the surface EMG signals are used to estimate the amplitude of that intent. A multi-level controller is used to solve the trajectory tracking problem involved. The lower level consists of an individual controller for each model; it solves the tracking of the desired trajectory even considering different velocities of the human gait cycle. The mid-level uses a combination of a logical operator and a finite state machine for switching between models. Finally, the highest level consists of a support vector machine to classify the desired activity.
Keren, Ilai N; Menalled, Fabian D; Weaver, David K; Robison-Cox, James F
2015-01-01
Worldwide, the landscape homogeneity of extensive monocultures that characterizes conventional agriculture has resulted in the development of specialized and interacting multitrophic pest complexes. While integrated pest management emphasizes the need to consider the ecological context where multiple species coexist, management recommendations are often based on single-species tactics. This approach may not provide satisfactory solutions when confronted with the complex interactions occurring between organisms at the same or different trophic levels. Replacement of the single-species management model with more sophisticated, multi-species programs requires an understanding of the direct and indirect interactions occurring between the crop and all categories of pests. We evaluated a modeling framework to make multi-pest management decisions taking into account direct and indirect interactions among species belonging to different trophic levels. We adopted a Bayesian decision theory approach in combination with path analysis to evaluate interactions between Bromus tectorum (downy brome, cheatgrass) and Cephus cinctus (wheat stem sawfly) in wheat (Triticum aestivum) systems. We assessed their joint responses to weed management tactics, seeding rates, and cultivar tolerance to insect stem boring or competition. Our results indicated that C. cinctus oviposition behavior varied as a function of B. tectorum pressure. Crop responses were more readily explained by the joint effects of management tactics on both categories of pests and their interactions than by the direct impact of any particular management scheme on yield. Accordingly, a C. cinctus-tolerant variety should be planted at a low seeding rate under high insect pressure. However, as B. tectorum levels increase, the C. cinctus-tolerant variety should be replaced by a competitive and drought tolerant cultivar at high seeding rates despite C. cinctus infestation.
This study exemplifies the necessity of accounting for direct and indirect biological interactions occurring within agroecosystems and propagating this information from the statistical analysis stage to the management stage.
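The Bayesian decision theory step described above reduces, in its simplest form, to choosing the management action that minimizes expected loss under the current belief about pest pressure. The states, actions, probabilities, and loss values below are invented for illustration, not the study's fitted path model.

```python
def best_action(actions, states, prob, loss):
    """Bayesian decision rule: pick the action minimizing expected loss
    over the probability distribution `prob` of pest-pressure states."""
    def expected_loss(a):
        return sum(prob[s] * loss[(a, s)] for s in states)
    return min(actions, key=expected_loss)

# Hypothetical yield-loss fractions for two cultivar/seeding choices
# under joint (B. tectorum, C. cinctus) pressure states:
states = ["low_weed", "high_weed"]
actions = ["sawfly_tolerant_low_rate", "competitive_high_rate"]
loss = {
    ("sawfly_tolerant_low_rate", "low_weed"): 0.10,
    ("sawfly_tolerant_low_rate", "high_weed"): 0.40,
    ("competitive_high_rate", "low_weed"): 0.20,
    ("competitive_high_rate", "high_weed"): 0.15,
}
choice = best_action(actions, states,
                     {"low_weed": 0.3, "high_weed": 0.7}, loss)
```

With weed pressure believed to be high, the rule switches to the competitive cultivar, mirroring the qualitative recommendation in the abstract.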
NASA Astrophysics Data System (ADS)
Choubey, Gautam; Pandey, K. M.
2018-04-01
The multi-strut injector is one of the most promising options for improving mixing between hydrogen and the high-speed air stream, and its parametric investigation has drawn increasing attention among researchers. Hence the flow-field aspects of a particular multi-strut based scramjet combustor have been investigated numerically with the addition of four wall injectors, and at the same time the influence of combining different strut and wall injector schemes on the performance of a multi-strut scramjet engine has also been explored. Moreover, the current computational approach has been validated against experimental data available in the open literature for a single-strut scramjet engine. The attained results reveal that combining the multi-strut arrangement with two wall injectors improves the efficiency of the scramjet compared with the other multi-strut + wall injection schemes, as this combination achieves a greater penetration height, which promotes a wider high-temperature and more robust combustion region adjacent to the wall. Again, the presence of extra H2 in the separated-flow region immediately ahead of the wall injection region is mainly responsible for the abrupt decrease in the mixing and combustion efficiency plots in all the multi-strut + wall injection strategies.
Swarm Intelligence for Urban Dynamics Modelling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghnemat, Rawan; Bertelle, Cyrille; Duchamp, Gerard H. E.
2009-04-16
In this paper, we propose swarm intelligence algorithms to deal with dynamical and spatial organization emergence. The goal is to model and simulate the development of spatial centers using multiple criteria. We combine a decentralized approach based on emergent clustering mixed with spatial constraints or attractions. We propose an extension of the ant nest building algorithm with a multi-center and adaptive process. Typically, this model is suitable for analysing and simulating urban dynamics like gentrification or the dynamics of cultural facilities in urban areas.
[Media for 21st century--towards human communication media].
Harashima, H
2000-05-01
Today, with the approach of the 21st century, attention is focused on multi-media communications combining computer, visual and audio technologies. This article discusses the communication media target and the technological problems constituting the nucleus of multi-media. The communication media is becoming an environment from which no one can escape. Since the media has such great power, what is needed now is not to predict future technologies, but to envision the future world and take responsibility for future environments.
Binot, Aurelie; Duboz, Raphaël; Promburom, Panomsak; Phimpraphai, Waraphon; Cappelle, Julien; Lajaunie, Claire; Goutard, Flavie Luce; Pinyopummintr, Tanu; Figuié, Muriel; Roger, François Louis
2015-12-01
As Southeast Asia (SEA) is characterized by high human and domestic animal densities, growing intensification of trade, drastic land use changes and biodiversity erosion, this region appears to be a hotspot for studying the complex dynamics of zoonosis emergence and health issues at the Animal-Human-Environment interface. Zoonotic diseases and environmental health issues can have devastating socioeconomic and wellbeing impacts. Assessing and managing the related risks implies taking into account the ecological and social dynamics at play, in connection with epidemiological patterns. The implementation of a One Health (OH) approach in this context calls for improved integration among disciplines and improved cross-sectoral collaboration, involving stakeholders at different levels. Such integration is certainly not achieved spontaneously; it requires methodological guidelines and incurs transaction costs. We explore pathways for implementing such collaboration in the SEA context, highlighting the main challenges to be faced by researchers and other target groups involved in OH actions. On this basis, we propose a conceptual framework of OH integration. Through three components (field-based data management, professional training workshops and higher education), we suggest developing a new culture of networking involving actors from various disciplines, sectors and levels (from the municipality to the Ministries) through a participatory modelling process, fostering synergies and cooperation. This framework could stimulate a long-term dialogue process, based on the combination of case study implementation and capacity building. It aims to implement both institutional OH dynamics (multi-stakeholder and cross-sectoral) and research approaches promoting systems thinking and involving the social sciences to follow up and strengthen collective action.
Multi-objective optimisation and decision-making of space station logistics strategies
NASA Astrophysics Data System (ADS)
Zhu, Yue-he; Luo, Ya-zhong
2016-10-01
Space station logistics strategy optimisation is a complex engineering problem with multiple objectives. Finding a decision-maker-preferred compromise solution becomes more significant when solving such a problem. However, the designer-preferred solution is not easy to determine using the traditional method. Thus, a hybrid approach that combines the multi-objective evolutionary algorithm, physical programming, and differential evolution (DE) algorithm is proposed to deal with the optimisation and decision-making of space station logistics strategies. A multi-objective evolutionary algorithm is used to acquire a Pareto frontier and help determine the range parameters of the physical programming. Physical programming is employed to convert the four-objective problem into a single-objective problem, and a DE algorithm is applied to solve the resulting physical programming-based optimisation problem. Five kinds of objective preference are simulated and compared. The simulation results indicate that the proposed approach can produce good compromise solutions corresponding to different decision-makers' preferences.
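The final stage of the hybrid approach, a differential evolution search over a scalarized objective, can be sketched with a minimal DE/rand/1/bin loop. The four-objective aggregate below is a stand-in for the physical-programming scalarization, and the bounds, population size, and weights are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate(x):
    """Stand-in scalarization: a weighted sum of four hypothetical
    logistics objectives of the design vector x."""
    f = np.array([np.sum(x**2), np.sum((x - 1)**2),
                  np.sum(np.abs(x)), np.sum((x + 1)**2)])
    return float(np.dot([0.4, 0.3, 0.2, 0.1], f))

def differential_evolution(func, dim=3, pop=20, gens=200, F=0.7, CR=0.9):
    """Minimal DE/rand/1/bin minimizer on the box [-2, 2]^dim."""
    X = rng.uniform(-2, 2, (pop, dim))
    fit = np.array([func(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            a, b, c = X[rng.choice(pop, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), -2, 2)     # mutation
            cross = rng.random(dim) < CR                 # binomial crossover
            trial = np.where(cross, mutant, X[i])
            ft = func(trial)
            if ft < fit[i]:                              # greedy selection
                X[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return X[best], fit[best]

x_best, f_best = differential_evolution(aggregate)
```

For this separable aggregate the per-coordinate optimum sits at x = 0.125 (where the weighted gradient vanishes), so the DE run should land near (0.125, 0.125, 0.125).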
Exploring High-D Spaces with Multiform Matrices and Small Multiples
MacEachren, Alan; Dai, Xiping; Hardisty, Frank; Guo, Diansheng; Lengerich, Gene
2011-01-01
We introduce an approach to visual analysis of multivariate data that integrates several methods from information visualization, exploratory data analysis (EDA), and geovisualization. The approach leverages the component-based architecture implemented in GeoVISTA Studio to construct a flexible, multiview, tightly (but generically) coordinated, EDA toolkit. This toolkit builds upon traditional ideas behind both small multiples and scatterplot matrices in three fundamental ways. First, we develop a general, MultiForm, Bivariate Matrix and a complementary MultiForm, Bivariate Small Multiple plot in which different bivariate representation forms can be used in combination. We demonstrate the flexibility of this approach with matrices and small multiples that depict multivariate data through combinations of: scatterplots, bivariate maps, and space-filling displays. Second, we apply a measure of conditional entropy to (a) identify variables from a high-dimensional data set that are likely to display interesting relationships and (b) generate a default order of these variables in the matrix or small multiple display. Third, we add conditioning, a kind of dynamic query/filtering in which supplementary (undisplayed) variables are used to constrain the view onto variables that are displayed. Conditioning allows the effects of one or more well understood variables to be removed from the analysis, making relationships among remaining variables easier to explore. We illustrate the individual and combined functionality enabled by this approach through application to analysis of cancer diagnosis and mortality data and their associated covariates and risk factors. PMID:21947129
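The conditional-entropy ordering in the second step can be sketched as follows. This is a minimal illustration, not the authors' implementation: variables are discretised into histogram bins (the bin count is an arbitrary choice here), and pairs with lower conditional entropy H(Y|X) are treated as more strongly related and therefore ordered earlier in the matrix display.

```python
import numpy as np

def conditional_entropy(x, y, bins=8):
    """Estimate H(Y|X) in bits from a 2-D histogram of discretised samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    # Zero-probability cells produce nan terms; nansum treats them as 0,
    # which matches the p*log(p) -> 0 limit.
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -np.nansum(p_xy * np.log2(p_xy / p_x))
    return float(h)

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
noisy = x + 0.1 * rng.normal(size=2000)   # strongly related to x
indep = rng.normal(size=2000)             # unrelated to x
h_related = conditional_entropy(x, noisy)
h_unrelated = conditional_entropy(x, indep)
# Lower conditional entropy -> stronger relationship -> earlier in the order
```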
Multi-Atlas Based Segmentation of Brainstem Nuclei from MR Images by Deep Hyper-Graph Learning.
Dong, Pei; Guo, Yangrong; Gao, Yue; Liang, Peipeng; Shi, Yonghong; Wang, Qian; Shen, Dinggang; Wu, Guorong
2016-10-01
Accurate segmentation of brainstem nuclei (red nucleus and substantia nigra) is very important in various neuroimaging applications such as deep brain stimulation and the investigation of imaging biomarkers for Parkinson's disease (PD). Due to iron deposition during aging, image contrast in the brainstem is very low in Magnetic Resonance (MR) images. The resulting ambiguity of patch-wise similarity makes it difficult for the recently successful multi-atlas patch-based label fusion methods to perform as competitively as they do when segmenting cortical and sub-cortical regions from MR images. To address this challenge, we propose a novel multi-atlas brainstem nuclei segmentation method using deep hyper-graph learning. Specifically, we achieve this goal in three ways. First, we employ a hyper-graph to combine the advantage of maintaining spatial coherence from graph-based segmentation approaches with the benefit of harnessing population priors from the multi-atlas framework. Second, besides using low-level image appearance, we also extract high-level context features to measure the complex patch-wise relationships. Since the context features are calculated on a tentatively estimated label probability map, we eventually turn our hyper-graph learning based label propagation into a deep and self-refining model. Third, since anatomical labels on some voxels (usually located in uniform regions) can be identified much more reliably than on other voxels (usually located at the boundary between two regions), we allow these reliable voxels to propagate their labels to nearby difficult-to-label voxels. This hierarchical strategy makes our proposed label fusion method deep and dynamic. We evaluate our proposed label fusion method in segmenting the substantia nigra (SN) and red nucleus (RN) from 3.0 T MR images, where it achieves significant improvement over state-of-the-art label fusion methods.
Strengthening Indonesia's health workforce through partnerships.
Kurniati, A; Rosskam, E; Afzal, M M; Suryowinoto, T B; Mukti, A G
2015-09-01
Indonesia faces critical challenges pertaining to human resources for health (HRH). These relate to HRH policy, planning, the mismatch between production and demand, quality, remuneration, and maldistribution. This paper provides a state-of-the-art review of the existing conditions in Indonesia, innovations to tackle the problems, results of the innovations to date, and a picture of the ongoing challenges that have yet to be met. Reversing this crisis-level shortage of HRH requires an inclusive approach to address the underlying challenges. In 2010 the government initiated multi-stakeholder coordination for HRH, using the Country Coordination and Facilitation approach. The process requires committed engagement and coordination of relevant stakeholders to address priority health needs. This manuscript is a formative evaluation of the program using documentary study and analysis. Consistent with Indonesia's decentralized health system, since 2011 local governments have also started establishing provincial multi-stakeholder committees and working groups for HRH development. Through this multi-stakeholder approach, with high-level government support and leadership, Indonesia was able to carry out HRH planning by engaging 164 stakeholders. Multi-stakeholder coordination has produced positive results in Indonesia by bringing about a number of innovations in HRH development to achieve UHC; it has fostered partnerships, attracted international attention, and galvanized multi-stakeholder support in improving the HRH situation. This approach has also facilitated mobilizing technical and financial support from domestic and international partners for HRH development. Applying the multi-stakeholder engagement and coordination process in Indonesia has proved instrumental in advancing the country's work to achieve Universal Health Coverage and the Millennium Development Goals by 2015.
Indonesia continues to face an HRH crisis but the collaborative process provides an opportunity to achieve results. Indonesia's experience indicates that irrespective of geographical or economic status, countries can benefit from multi-stakeholder coordination and engagement to increase access to health workers, strengthen health systems, as well as achieve and sustain UHC. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Erfanian, A.; Fomenko, L.; Wang, G.
2016-12-01
The multi-model ensemble (MME) average is considered the most reliable approach for simulating both present-day and future climates, and it has been a primary reference for drawing conclusions in major coordinated studies, e.g., the IPCC Assessment Reports and CORDEX. The biases of individual models cancel each other out in the MME average, enabling the ensemble mean to outperform individual members in simulating the mean climate. This enhancement, however, comes at tremendous computational cost, which is especially inhibiting for regional climate modeling, where model uncertainties can originate from both the RCMs and the driving GCMs. Here we propose the Ensemble-based Reconstructed Forcings (ERF) approach to regional climate modeling, which achieves a similar level of bias reduction at a fraction of the cost of the conventional MME approach. The new method constructs a single set of initial and boundary conditions (IBCs) by averaging the IBCs of multiple GCMs, and drives the RCM with this ensemble average of IBCs in a single run. Using a regional climate model (RegCM4.3.4-CLM4.5), we tested the method over West Africa for multiple combinations of (up to six) GCMs. Our results indicate that the performance of the ERF method is comparable to that of the MME average in simulating the mean climate. The bias reduction seen in ERF simulations is achieved by using more realistic IBCs in solving the system of equations underlying the RCM physics and dynamics. This endows the new method with a theoretical advantage in addition to its reduced computational cost: the ERF output is an unaltered solution of the RCM, as opposed to a climate state that might not be physically plausible due to the averaging of multiple solutions in the conventional MME approach. The ERF approach should be considered for use in major international efforts such as CORDEX. Key words: Multi-model ensemble, ensemble analysis, ERF, regional climate modeling
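The core of the ERF construction is a plain average of co-gridded IBC fields, one per GCM. A minimal sketch, assuming the fields have already been regridded to a common lattice (the array shapes and values below are purely illustrative):

```python
import numpy as np

def reconstruct_forcings(gcm_ibcs):
    """ERF sketch: average the initial/boundary-condition fields of several
    GCMs into a single forcing set used to drive one RCM run.
    Each element is assumed to be a (time, lat, lon) array on a common grid."""
    stacked = np.stack(gcm_ibcs)        # shape: (n_gcms, time, lat, lon)
    return stacked.mean(axis=0)

# Three hypothetical GCM temperature boundary fields on a 4x5 grid, 2 time steps
rng = np.random.default_rng(0)
ibcs = [rng.normal(loc=288.0, scale=2.0, size=(2, 4, 5)) for _ in range(3)]
ensemble_ibc = reconstruct_forcings(ibcs)   # would drive a single RCM integration
```

The contrast with conventional MME is where the averaging happens: MME averages the outputs of many RCM runs, while ERF averages the inputs and runs the RCM once.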
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behafarid, F.; Shaver, D. R.; Bolotnov, I. A.
The required technological and safety standards for future Gen IV Reactors can only be achieved if advanced simulation capabilities become available, which combine high performance computing with the necessary level of modeling detail and high accuracy of predictions. The purpose of this paper is to present new results of multi-scale three-dimensional (3D) simulations of the inter-related phenomena which occur as a result of fuel element heat-up and cladding failure, including the injection of a jet of gaseous fission products into a partially blocked Sodium Fast Reactor (SFR) coolant channel, and gas/molten sodium transport along the coolant channels. The computational approach to the analysis of the overall accident scenario is based on using two different inter-communicating computational multiphase fluid dynamics (CMFD) codes: a CFD code, PHASTA, and a RANS code, NPHASE-CMFD. Using the geometry and time history of cladding failure and the gas injection rate, direct numerical simulations (DNS) of two-phase turbulent flow, combined with the Level Set method, have been performed by the PHASTA code. The model allows one to track the evolution of gas/liquid interfaces at a centimeter scale. The simulated phenomena include the formation and breakup of the jet of fission products injected into the liquid sodium coolant. The PHASTA outflow has been averaged over time to obtain mean phasic velocities and volumetric concentrations, as well as the liquid turbulent kinetic energy and turbulence dissipation rate, all of which have served as the input to the core-scale simulations using the NPHASE-CMFD code. A sliding-window time averaging has been used to capture mean flow parameters for transient cases. The results presented in the paper include testing and validation of the proposed models, as well as the predictions of fission-gas/liquid-sodium transport along a multi-rod fuel assembly of an SFR during a partial loss-of-flow accident. (authors)
NASA Astrophysics Data System (ADS)
Schmidt, M.; Hugentobler, U.; Jakowski, N.; Dettmering, D.; Liang, W.; Limberger, M.; Wilken, V.; Gerzen, T.; Hoque, M.; Berdermann, J.
2012-04-01
Near real-time, high-resolution and high-precision ionosphere models are needed for a large number of applications, e.g. in navigation, positioning, telecommunications or astronautics. Today these ionosphere models are mostly empirical, i.e., based purely on mathematical approaches. In the DFG project 'Multi-scale model of the ionosphere from the combination of modern space-geodetic satellite techniques (MuSIK)', the complex phenomena within the ionosphere are described vertically by combining the Chapman electron density profile with a plasmasphere layer. In order to consider the horizontal and temporal behaviour, the fundamental target parameters of this physics-motivated approach are modelled by series expansions in terms of tensor products of localizing B-spline functions depending on longitude, latitude and time. For testing the procedure the model will be applied to an appropriate region in South America, which covers relevant ionospheric processes and phenomena such as the Equatorial Anomaly. The project connects the expertise of the three project partners, namely Deutsches Geodätisches Forschungsinstitut (DGFI) Munich, the Institute of Astronomical and Physical Geodesy (IAPG) of the Technical University Munich (TUM) and the German Aerospace Center (DLR), Neustrelitz. In this presentation we focus on the current status of the project. In the first year of the project we studied the behaviour of the ionosphere in the test region, set up appropriate test periods covering high and low solar activity as well as winter and summer, and started the data collection, analysis, pre-processing and archiving. We partly developed the mathematical-physical modelling approach and performed first computations based on simulated input data. Here we present information on the data coverage for the area and the time periods of our investigations, and we outline challenges of the multi-dimensional mathematical-physical modelling approach.
We show first results, discuss problems in modelling and possible solution strategies and finally, we address open questions.
A Regional Multi-permit Market for Ecosystem Services
NASA Astrophysics Data System (ADS)
Bernknopf, R.; Amos, P.; Zhang, E.
2014-12-01
Regional cap-and-trade programs have been in operation since the 1970s to reduce environmental externalities (NOx and SOx emissions) and have been shown to be beneficial. Air quality and water quality limits are enforced through numerous Federal and State laws and regulations, while local communities are seeking ways to protect regional green infrastructure and their ecosystem services. Why not combine them in a market approach that reduces many environmental externalities simultaneously? In a multi-permit market program, reforestation (land offsets) as part of a nutrient or carbon sequestration trading program would provide a means to reduce agrochemical discharges into streams, rivers, and groundwater. Land conversions also improve the quality and quantity of other environmental externalities such as air pollution. Collocated nonmarket ecosystem services have societal benefits that can expand the crediting system into a multi-permit trading program. At a regional scale it is possible to combine the regulation of water quality, air emissions and quality, and habitat conservation and restoration into one program. This research addresses the economic feasibility of a Philadelphia regional multi-permit (cap-and-trade) program for ecosystem services. Instead of establishing individual markets for ecosystem services, the spatial portfolio approach rests on the interdependence of ecosystem functions, so that market credits encompass a range of ecosystem services. Using an existing example, the components of the approach are described in terms of scenarios of land portfolios and the calculation of expected return on investment and risk. An experiment in the Schuylkill Watershed is described for ecosystem services such as nutrients in water and populations of bird species, along with greenhouse gases. The Philadelphia regional market includes urban-nonurban economic and environmental interactions and impacts.
An adaptive multi-level simulation algorithm for stochastic biological systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lester, C., E-mail: lesterc@maths.ox.ac.uk; Giles, M. B.; Baker, R. E.
2015-01-14
Discrete-state, continuous-time Markov models are widely used in the modeling of biochemical reaction networks. Their complexity often precludes analytic solution, and we rely on stochastic simulation algorithms (SSA) to estimate system statistics. The Gillespie algorithm is exact, but computationally costly as it simulates every single reaction. As such, approximate stochastic simulation algorithms such as the tau-leap algorithm are often used. While potentially more efficient computationally, the system statistics these generate suffer from significant bias unless tau is relatively small, in which case the computational time can be comparable to that of the Gillespie algorithm. The multi-level method [Anderson and Higham, “Multi-level Monte Carlo for continuous time Markov chains, with applications in biochemical kinetics,” SIAM Multiscale Model. Simul. 10(1), 146–179 (2012)] tackles this problem. A base estimator is computed using many (cheap) sample paths at low accuracy. The bias inherent in this estimator is then reduced using a number of corrections. Each correction term is estimated using a collection of paired sample paths where one path of each pair is generated at a higher accuracy compared to the other (and so more expensive). By sharing random variables between these paired paths, the variance of each correction estimator can be reduced. This renders the multi-level method very efficient as only a relatively small number of paired paths are required to calculate each correction term. In the original multi-level method, each sample path is simulated using the tau-leap algorithm with a fixed value of τ. This approach can result in poor performance when the reaction activity of a system changes substantially over the timescale of interest. By introducing a novel adaptive time-stepping approach where τ is chosen according to the stochastic behaviour of each sample path, we extend the applicability of the multi-level method to such cases.
We demonstrate the efficiency of our method using a number of examples.
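The multi-level structure described above can be sketched for the simplest possible system, a decay reaction X -> 0, using a fixed τ rather than the paper's adaptive scheme. The coupling of each fine/coarse pair via a shared Poisson variate follows the Anderson-Higham construction cited above; all rates and sample counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def tau_leap(x0, c, T, tau):
    """Plain tau-leap for X -> 0 with propensity c*x."""
    x, t = x0, 0.0
    while t < T - 1e-12:
        x = max(x - rng.poisson(c * x * tau), 0)
        t += tau
    return x

def coupled_pair(x0, c, T, tau_c):
    """One fine/coarse path pair (step tau_c/2 vs tau_c). The shared part of
    the two propensities uses a common Poisson variate, so the paths stay
    strongly correlated and the correction estimator has low variance."""
    tau_f = tau_c / 2
    xf = xc = x0
    t = 0.0
    while t < T - 1e-12:
        a_c = c * xc                 # coarse propensity frozen over tau_c
        firings_c = 0
        for _ in range(2):           # two fine sub-steps per coarse step
            a_f = c * xf
            shared = min(a_f, a_c)
            n_shared = rng.poisson(shared * tau_f)
            n_f = n_shared + rng.poisson((a_f - shared) * tau_f)
            n_c = n_shared + rng.poisson((a_c - shared) * tau_f)
            xf = max(xf - n_f, 0)
            firings_c += n_c
        xc = max(xc - firings_c, 0)
        t += tau_c
    return xf, xc

# Two-level estimate of E[X(T)]: many cheap coarse paths + few coupled pairs
x0, c, T, tau = 1000, 1.0, 1.0, 0.1
base = np.mean([tau_leap(x0, c, T, tau) for _ in range(2000)])
diffs = [coupled_pair(x0, c, T, tau) for _ in range(200)]
correction = np.mean([xf - xc for xf, xc in diffs])
estimate = base + correction         # approximates x0 * exp(-c*T)
```

In the full method a telescoping sum of such corrections runs over several levels, and the adaptive variant picks τ per path instead of fixing it.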
Season-ahead water quality forecasts for the Schuylkill River, Pennsylvania
NASA Astrophysics Data System (ADS)
Block, P. J.; Leung, K.
2013-12-01
Anticipating and preparing for elevated water quality parameter levels in critical water sources, using weather forecasts, is not uncommon. In this study, we explore the feasibility of extending this prediction scale to a season-ahead for the Schuylkill River in Philadelphia, utilizing both statistical and dynamical prediction models, to characterize the season. This advance information has relevance for recreational activities, ecosystem health, and water treatment, as the Schuylkill provides 40% of Philadelphia's water supply. The statistical model associates large-scale climate drivers with streamflow and water quality parameter levels; numerous variables from NOAA's CFSv2 model are evaluated for the dynamical approach. A multi-model combination is also assessed. Results indicate moderately skillful prediction of average summertime total coliform and wintertime turbidity, using season-ahead oceanic and atmospheric variables, predominantly from the North Atlantic Ocean. Models predicting the number of elevated turbidity events across the wintertime season are also explored.
Brimblecombe, J; Bailie, R; van den Boogaard, C; Wood, B; Liberato, S C; Ferguson, M; Coveney, J; Jaenke, R; Ritchie, J
2017-12-01
Food insecurity underlies and compounds many of the development issues faced by remote Indigenous communities in Australia. Multi-sector approaches offer promise to improve food security. We assessed the feasibility of a novel multi-sector approach to enhance community food security in remote Indigenous Australia. A longitudinal comparative multi-site case study, the Good Food Systems Good Food for All Project, was conducted (2009-2013) with four Aboriginal communities. Continuous improvement meetings were held in each community. Data from project documents and store sales were used to assess feasibility according to engagement, uptake and sustainability of action, and impact on community diet, as well as identifying conditions facilitating or hindering these. Engagement was established where: the community perceived a need for the approach; where trust was developed between the community and facilitators; where there was community stability; and where flexibility was applied in the timing of meetings. The approach enabled stakeholders in each community to collectively appraise the community food system and plan action. Actions that could be directly implemented within available resources resulted from developing collaborative capacity. Actions requiring advocacy, multi-sectoral involvement, commitment or further resources were less frequently used. Positive shifts in community diet were associated with key areas where actions were implemented. A multi-sector participatory approach seeking continuous improvement engaged committed Aboriginal and non-Aboriginal stakeholders and was shown to have potential to shift community diet. Provision of clear mechanisms to link this approach with higher level policy and decision-making structures, clarity of roles and responsibilities, and processes to prioritise and communicate actions across sectors should further strengthen capacity for food security improvement. 
Integrating this approach, which enables local decision-making, into community governance structures with adequate resourcing is an imperative.
Confidence level estimation in multi-target classification problems
NASA Astrophysics Data System (ADS)
Chang, Shi; Isaacs, Jason; Fu, Bo; Shin, Jaejeong; Zhu, Pingping; Ferrari, Silvia
2018-04-01
This paper presents an approach for estimating the confidence level in automatic multi-target classification performed by an imaging sensor on an unmanned vehicle. An automatic target recognition algorithm comprised of a deep convolutional neural network in series with a support vector machine classifier detects and classifies targets based on the image matrix. The joint posterior probability mass function of target class, features, and classification estimates is learned from labeled data, and recursively updated as additional images become available. Based on the learned joint probability mass function, the approach presented in this paper predicts the expected confidence level of future target classifications, prior to obtaining new images. The proposed approach is tested with a set of simulated sonar image data. The numerical results show that the estimated confidence level provides a close approximation to the actual confidence level value determined a posteriori, i.e. after the new image is obtained by the on-board sensor. Therefore, the expected confidence level function presented in this paper can be used to adaptively plan the path of the unmanned vehicle so as to optimize the expected confidence levels and ensure that all targets are classified with satisfactory confidence after the path is executed.
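The expected-confidence computation can be illustrated with a fully discrete toy model. The class priors, feature distributions, and classifier accuracy below are all invented for illustration; the point is the structure: learn the joint pmf of (class, feature, estimate) from labeled data, then average P(correct | feature) over the feature marginal to predict confidence before any new image arrives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete setup: 2 target classes, 3 feature bins.
# counts[c, f, chat] tallies (true class, feature bin, classifier estimate).
counts = np.zeros((2, 3, 2))
for _ in range(5000):
    c = rng.integers(2)
    f = rng.choice(3, p=[0.6, 0.3, 0.1] if c == 0 else [0.1, 0.3, 0.6])
    chat = c if rng.random() < 0.9 else 1 - c        # a 90%-accurate classifier
    counts[c, f, chat] += 1
joint = counts / counts.sum()                        # learned joint pmf

# Expected confidence of a *future* classification, before seeing the image:
# sum over feature bins of P(f) * P(correct | f).
p_f = joint.sum(axis=(0, 2))
p_correct_given_f = np.array(
    [sum(joint[c, f, c] for c in range(2)) / p_f[f] for f in range(3)]
)
expected_confidence = float(p_f @ p_correct_given_f)
```

In the paper's setting, this predicted confidence is what the path planner would optimise when choosing where the unmanned vehicle images next.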
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNab, W; Ezzedine, S; Detwiler, R
2007-02-26
Industrial organic solvents such as trichloroethylene (TCE) and tetrachloroethylene (PCE) constitute a principal class of groundwater contaminants. Cleanup of groundwater plume source areas associated with these compounds is problematic, in part, because the compounds often exist in the subsurface as dense nonaqueous phase liquids (DNAPLs). Ganglia (or 'blobs') of DNAPL serve as persistent sources of contaminants that are difficult to locate and remediate (e.g. Fenwick and Blunt, 1998). Current understanding of the physical and chemical processes associated with dissolution of DNAPLs in the subsurface is incomplete and yet is critical for evaluating long-term behavior of contaminant migration, groundwater cleanup, and the efficacy of source area cleanup technologies. As such, a goal of this project has been to contribute to this critical understanding by investigating the multi-phase, multi-component physics of DNAPL dissolution using state-of-the-art experimental and computational techniques. Through this research, we have explored efficient and accurate conceptual and numerical models for source area contaminant transport that can be used to better inform the modeling of source area contaminants, including those at the LLNL Superfund sites, to re-evaluate existing remediation technologies, and to inspire or develop new remediation strategies. The problem of DNAPL dissolution in natural porous media must be viewed in the context of several scales (Khachikian and Harmon, 2000), including the microscopic level at which capillary forces, viscous forces, and gravity/buoyancy forces are manifested at the scale of individual pores (Wilson and Conrad, 1984; Chatzis et al., 1988), the mesoscale where dissolution rates are strongly influenced by the local hydrodynamics, and the field-scale.
Historically, the physico-chemical processes associated with DNAPL dissolution have been addressed through the use of lumped mass transfer coefficients which attempt to quantify the dissolution rate in response to local dissolved-phase concentrations distributed across the source area using a volume-averaging approach (Figure 1). The fundamental problem with the lumped mass transfer parameter is that its value is typically derived empirically through column-scale experiments that combine the effects of pore-scale flow, diffusion, and pore-scale geometry in a manner that does not provide a robust theoretical basis for upscaling. In our view, upscaling processes from the pore-scale to the field-scale requires new computational approaches (Held and Celia, 2001) that are directly linked to experimental studies of dissolution at the pore scale. As such, our investigation has been multi-pronged, combining theory, experiments, numerical modeling, new data analysis approaches, and a synthesis of previous studies (e.g. Glass et al, 2001; Keller et al., 2002) aimed at quantifying how the mechanisms controlling dissolution at the pore-scale control the long-term dissolution of source areas at larger scales.
Heideklang, René; Shokouhi, Parisa
2016-01-01
This article focuses on the fusion of flaw indications from multi-sensor nondestructive materials testing. Because each testing method makes use of a different physical principle, a multi-method approach has the potential of effectively differentiating actual defect indications from the many false alarms, thus enhancing detection reliability. In this study, we propose a new technique for aggregating scattered two- or three-dimensional sensory data. Using a density-based approach, the proposed method explicitly addresses localization uncertainties such as registration errors. This feature marks one of the major advantages of this approach over pixel-based image fusion techniques. We provide guidelines on how to set all the key parameters and demonstrate the technique’s robustness. Finally, we apply our fusion approach to experimental data and demonstrate its capability to locate small defects by substantially reducing false alarms under conditions where no single-sensor method is adequate. PMID:26784200
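A minimal sketch of the density-based fusion idea: each sensor's scattered indications are kernel-smoothed (the bandwidth absorbing registration error), and the per-sensor density maps are combined multiplicatively so that only locations supported by several methods retain a high score. The sensor layouts and bandwidth below are illustrative assumptions, not the paper's parameter guidelines.

```python
import numpy as np

def fused_density(grid, detections_per_sensor, bandwidth=1.0):
    """Score each grid point by the product over sensors of a Gaussian
    kernel density of that sensor's 2-D flaw indications."""
    fused = np.ones(len(grid))
    for dets in detections_per_sensor:
        d2 = ((grid[:, None, :] - dets[None, :, :]) ** 2).sum(-1)
        fused *= np.exp(-d2 / (2 * bandwidth**2)).sum(axis=1) + 1e-12
    return fused

# Two hypothetical sensors: both see a defect near (5, 5), slightly offset by
# registration error; each also reports one isolated false alarm.
g = np.stack(np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21)), -1).reshape(-1, 2)
s1 = np.array([[5.1, 4.9], [1.0, 8.0]])
s2 = np.array([[4.9, 5.2], [9.0, 2.0]])
score = fused_density(g, [s1, s2])
best = g[np.argmax(score)]     # the location supported by both sensors
```

The multiplicative combination is what suppresses the single-sensor false alarms: a location with high density in only one map scores near zero overall.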
Systems design of transformation toughened blast-resistant naval hull steels
NASA Astrophysics Data System (ADS)
Saha, Arup
A systems approach to computational materials design has demonstrated a new class of ultratough, weldable secondary hardened plate steels combining new levels of strength and toughness while meeting processability requirements. A first prototype alloy has achieved property goals motivated by projected naval hull applications requiring extreme fracture toughness (Cv > 85 ft-lbs (115 J), corresponding to K_Id > 200 ksi·in^1/2 (220 MPa·m^1/2)) at strength levels of 150--180 ksi (1034--1241 MPa) yield strength in weldable, formable plate steels. A theoretical design concept was explored integrating the mechanism of precipitated nickel-stabilized dispersed austenite for transformation toughening in an alloy strengthened by combined precipitation of M2C carbides and BCC copper, both at an optimal ~3 nm particle size for efficient strengthening. This concept was adapted to plate steel design by employing a mixed bainitic/martensitic matrix microstructure produced by air-cooling after solution-treatment and constraining the composition to low carbon content for weldability. With optimized levels of copper and M2C carbide formers based on a quantitative strength model, a required alloy nickel content of 6.5 wt% was predicted for optimal austenite stability for transformation toughening at the desired strength level of 160 ksi (1100 MPa) yield strength. A relatively high Cu level of 3.65 wt% was employed to allow a carbon limit of 0.05 wt% for good weldability. Hardness and tensile tests conducted on the designed prototype confirmed predicted precipitation strengthening behavior in quench and tempered material. Multi-step tempering conditions were employed to achieve the optimal austenite stability, resulting in a significant increase of impact toughness to 130 ft-lb (176 J) at a strength level of 160 ksi (1100 MPa).
Comparison with the baseline toughness-strength combination determined by isochronal tempering studies indicates a transformation toughening increment of 60% in Charpy energy. Predicted Cu particle number densities and the heterogeneous nucleation of optimal-stability, high-Ni, 5 nm austenite on nanometer-scale copper precipitates in the multi-step tempered samples were confirmed using three-dimensional atom probe microscopy. Charpy impact tests and fractography demonstrate ductile fracture with Cv > 90 ft-lbs (122 J) down to -40°C, with a substantial toughness peak at 25°C consistent with designed transformation toughening behavior. The properties demonstrated in this first prototype represent a substantial advance over existing naval hull steels.
Chung, Dongjun; Kuan, Pei Fen; Li, Bo; Sanalkumar, Rajendran; Liang, Kun; Bresnick, Emery H; Dewey, Colin; Keleş, Sündüz
2011-07-01
Chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) is rapidly replacing chromatin immunoprecipitation combined with genome-wide tiling array analysis (ChIP-chip) as the preferred approach for mapping transcription-factor binding sites and chromatin modifications. The state of the art for analyzing ChIP-seq data relies on using only reads that map uniquely to a relevant reference genome (uni-reads). This can lead to the omission of up to 30% of alignable reads. We describe a general approach for utilizing reads that map to multiple locations on the reference genome (multi-reads). Our approach is based on allocating multi-reads as fractional counts using a weighted alignment scheme. Using human STAT1 and mouse GATA1 ChIP-seq datasets, we illustrate that incorporation of multi-reads significantly increases sequencing depths, leads to detection of novel peaks that are not otherwise identifiable with uni-reads, and improves detection of peaks in mappable regions. We investigate various genome-wide characteristics of peaks detected only by utilization of multi-reads via computational experiments. Overall, peaks from multi-read analysis have similar characteristics to peaks that are identified by uni-reads except that the majority of them reside in segmental duplications. We further validate a number of GATA1 multi-read only peaks by independent quantitative real-time ChIP analysis and identify novel target genes of GATA1. These computational and experimental results establish that multi-reads can be of critical importance for studying transcription factor binding in highly repetitive regions of genomes with ChIP-seq experiments.
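The fractional-count allocation can be sketched as a small EM-style iteration: each multi-read is split across its candidate loci in proportion to the current coverage estimate, and coverage is then re-estimated. This is an illustrative simplification of the weighted alignment scheme, not the authors' exact model.

```python
import numpy as np

def allocate_multireads(uni_counts, multi_maps, n_iter=20):
    """Fractionally allocate multi-reads across their candidate loci,
    weighting by the current estimated coverage (EM-style sketch).
    uni_counts: per-locus uni-read counts; multi_maps: list of candidate
    locus indices for each multi-read."""
    coverage = uni_counts.astype(float) + 1e-6   # small prior avoids 0/0
    for _ in range(n_iter):
        alloc = np.zeros_like(coverage)
        for loci in multi_maps:
            w = coverage[loci]
            alloc[loci] += w / w.sum()           # each read contributes 1 in total
        coverage = uni_counts + alloc
    return coverage

uni = np.array([50.0, 5.0, 0.0])                 # uni-read counts at three loci
multis = [[0, 1]] * 10 + [[1, 2]] * 4            # each multi-read maps to two loci
cov = allocate_multireads(uni, multis)
```

Total mass is conserved (each multi-read contributes exactly one fractional count), and loci already supported by uni-reads absorb most of the ambiguous reads, which is the intuition behind detecting peaks in repetitive regions.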
Integrative Analysis of Longitudinal Metabolomics Data from a Personal Multi-Omics Profile
Stanberry, Larissa; Mias, George I.; Haynes, Winston; Higdon, Roger; Snyder, Michael; Kolker, Eugene
2013-01-01
The integrative personal omics profile (iPOP) is a pioneering study that combines genomics, transcriptomics, proteomics, metabolomics and autoantibody profiles from a single individual over a 14-month period. The observation period includes two episodes of viral infection: a human rhinovirus and a respiratory syncytial virus. The profile studies give an informative snapshot into the biological functioning of an organism. We hypothesize that pathway expression levels are associated with disease status. To test this hypothesis, we use biological pathways to integrate metabolomics and proteomics iPOP data. The approach computes the pathways’ differential expression levels at each time point, while taking into account the pathway structure and the longitudinal design. The resulting pathway levels show strong association with the disease status. Further, we identify temporal patterns in metabolite expression levels. The changes in metabolite expression levels also appear to be consistent with the disease status. The results of the integrative analysis suggest that changes in biological pathways may be used to predict and monitor the disease. The iPOP experimental design, data acquisition and analysis issues are discussed within the broader context of personal profiling. PMID:24958148
Automatic CT Brain Image Segmentation Using Two Level Multiresolution Mixture Model of EM
NASA Astrophysics Data System (ADS)
Jiji, G. Wiselin; Dehmeshki, Jamshid
2014-04-01
Tissue classification in computed tomography (CT) brain images is an important issue in the analysis of several brain dementias. A combination of different approaches for the segmentation of brain images is presented in this paper. A multi-resolution algorithm is proposed that extends the expectation-maximization (EM) algorithm using scaled image versions obtained with Gaussian filtering and wavelet analysis. It is less sensitive to noise and produces more accurate segmentations than traditional EM. The algorithm was applied to 20 CT datasets of the human brain and compared with other works; the segmentation results demonstrate the advantages of the proposed method and were validated by clinicians.
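The coarse-to-fine idea behind such a multi-resolution EM can be illustrated with a one-dimensional two-component Gaussian mixture: run EM on a smoothed, downsampled version of the data (a stand-in for the Gaussian/wavelet pyramid), then use the coarse means to initialize EM at full resolution. A hedged sketch only; the paper works on 2D images with more tissue classes.

```python
import math

def em_gmm(data, init_means, n_iter=50):
    """EM for a two-component 1D Gaussian mixture; returns the two means."""
    m1, m2 = init_means
    s1 = s2 = 1.0
    w = 0.5  # mixing weight of component 1
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each point.
        r = []
        for x in data:
            p1 = w * math.exp(-0.5 * ((x - m1) / s1) ** 2) / s1
            p2 = (1 - w) * math.exp(-0.5 * ((x - m2) / s2) ** 2) / s2
            r.append(p1 / (p1 + p2))
        # M-step: update weight, means and standard deviations.
        n1 = sum(r)
        n2 = len(data) - n1
        w = n1 / len(data)
        m1 = sum(ri * x for ri, x in zip(r, data)) / n1
        m2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = max(1e-3, math.sqrt(sum(ri * (x - m1) ** 2 for ri, x in zip(r, data)) / n1))
        s2 = max(1e-3, math.sqrt(sum((1 - ri) * (x - m2) ** 2 for ri, x in zip(r, data)) / n2))
    return m1, m2

def coarse_to_fine_em(data, block=4):
    """Coarse stage on block averages, fine stage on the full data."""
    coarse = [sum(data[i:i + block]) / block for i in range(0, len(data), block)]
    c1, c2 = em_gmm(coarse, (min(coarse), max(coarse)), n_iter=20)
    return em_gmm(data, (c1, c2))
```

The coarse stage suppresses noise, which is why the pyramid initialization tends to be more robust than running EM directly on noisy full-resolution data.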
Curry, Joanne; Fitzgerald, Anneke; Prodan, Ante; Dadich, Ann; Sloan, Terry
2014-01-01
This article focuses on a framework that will investigate the integration of two disparate methodologies: patient journey modelling and visual multi-agent simulation, and its impact on the speed and quality of knowledge translation to healthcare stakeholders. Literature describes patient journey modelling and visual simulation as discrete activities. This paper suggests that their combination and their impact on translating knowledge to practitioners are greater than the sum of the two technologies. The test-bed is ambulatory care and the goal is to determine if this approach can improve health services delivery, workflow, and patient outcomes and satisfaction. The multidisciplinary research team is comprised of expertise in patient journey modelling, simulation, and knowledge translation.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-05
..., multi-level interventions; and community and public health approaches. To improve program design... prevention services and an evidence-based approach are provided for States to use in their SNAP-Ed programming. These definitions provide States with greater flexibility to include environmental approaches and...
A functional language approach in high-speed digital simulation
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Lu, S.-L.
1983-01-01
A functional programming approach for a multi-microprocessor architecture is presented. The language, based on Backus FP, its intermediate form and the translation process are discussed and illustrated with an example. The approach allows performance analysis to be performed at a high level as an aid in program partitioning.
Merging information from multi-model flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
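The flavor of pooling model estimates while allowing for shared discrepancy can be conveyed with a much simpler stand-in: classical random-effects pooling (DerSimonian-Laird), where a between-model variance tau^2 inflates each model's weight denominator. This is not the paper's hierarchical Bayesian machinery, just a hedged sketch of the pooling idea; inputs are assumed per-model flood-magnitude estimates with standard errors.

```python
def pool_estimates(means, ses):
    """Pool per-model estimates (mean, standard error), k >= 2 models.

    Returns (pooled estimate, its standard error, between-model
    variance tau^2)."""
    k = len(means)
    w = [1.0 / se ** 2 for se in ses]
    sw = sum(w)
    # Fixed-effect estimate and heterogeneity statistic Q.
    fixed = sum(wi * m for wi, m in zip(w, means)) / sw
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)  # between-model variance
    # Random-effects weights include the shared-discrepancy term.
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * m for wi, m in zip(w_re, means)) / sum(w_re)
    se_pooled = (1.0 / sum(w_re)) ** 0.5
    return pooled, se_pooled, tau2
```

When the models disagree more than their individual uncertainties explain, tau^2 grows and the pooled interval widens, mirroring the "shared multi-model discrepancy" term in the abstract.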
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Sebastian; Marquetand, Philipp; González, Leticia
2014-08-21
An efficient perturbational treatment of spin-orbit coupling within the framework of high-level multi-reference techniques has been implemented in the most recent version of the COLUMBUS quantum chemistry package, extending the existing fully variational two-component (2c) multi-reference configuration interaction singles and doubles (MRCISD) method. The proposed scheme follows related implementations of quasi-degenerate perturbation theory (QDPT) model space techniques. Our model space is built either from uncontracted, large-scale scalar relativistic MRCISD wavefunctions or based on the scalar-relativistic solutions of the linear-response-theory-based multi-configurational averaged quadratic coupled cluster method (LRT-MRAQCC). The latter approach allows for a consistent, approximately size-consistent and size-extensive treatment of spin-orbit coupling. The approach is described in detail and compared to a number of related techniques. The inherent accuracy of the QDPT approach is validated by comparing cuts of the potential energy surfaces of acrolein and its S, Se, and Te analogues with the corresponding data obtained from matching fully variational spin-orbit MRCISD calculations. The conceptual availability of approximate analytic gradients with respect to geometrical displacements is an attractive feature of the 2c-QDPT-MRCISD and 2c-QDPT-LRT-MRAQCC methods for structure optimization and ab initio molecular dynamics simulations.
Guillou, S; Lerasle, M; Simonin, H; Anthoine, V; Chéret, R; Federighi, M; Membré, J-M
2016-09-16
A multi-criteria framework combining safety, hygiene and sensorial quality was developed to investigate the possibility of extending the shelf-life and/or removing lactate by applying High Hydrostatic Pressure (HHP) in a ready-to-cook (RTC) poultry product. For this purpose, Salmonella and Listeria monocytogenes were considered as safety indicators and Escherichia coli as hygienic indicator. Predictive modeling was used to determine the influence of HHP and lactate concentration on microbial growth and survival of these indicators. To that end, probabilistic assessment exposure models developed in a previous study (Lerasle, M., Guillou, S., Simonin, H., Anthoine, V., Chéret, R., Federighi, M., Membré, J.M. 2014. Assessment of Salmonella and L. monocytogenes level in ready-to-cook poultry meat: Effect of various high pressure treatments and potassium lactate concentrations. International Journal of Food Microbiology 186, 74-83) were used for L. monocytogenes and Salmonella. Besides, for E. coli, an exposure assessment model was built by modeling data from challenge-test experiments. Finally, sensory tests and color measurements were performed to evaluate the effect of HHP on the organoleptic quality of an RTC product. Quantitative rules of decision based on safety, hygienic and organoleptic criteria were set. Hygienic and safety criteria were associated with probability to exceed maximum contamination levels of L. monocytogenes, Salmonella and E. coli at the end of the shelf-life whereas organoleptic criteria corresponded to absence of statistical difference between pressurized and unpressurized products. A tradeoff between safety and hygienic risk, color and taste, was then applied to define process and formulation enabling shelf-life extension. In the resulting operating window, one condition was experimentally assayed on naturally contaminated RTC products to validate the multi-criteria approach. 
In conclusion, the framework was validated; it was possible to extend the shelf-life of an RTC poultry product containing 1.8% (w/w) lactate by one week, despite slight color alteration. This approach could be profitably implemented by food processors as a decision support tool for shelf-life determination. Copyright © 2016 Elsevier B.V. All rights reserved.
Qi, Yu; Wang, Hui; Wei, Kai; Yang, Ya; Zheng, Ru-Yue; Kim, Ick Soo; Zhang, Ke-Qin
2017-03-03
The biological performance of artificial biomaterials is closely related to their structure characteristics. Cell adhesion, migration, proliferation, and differentiation are all strongly affected by the different scale structures of biomaterials. Silk fibroin (SF), extracted mainly from silkworms, has become a popular biomaterial due to its excellent biocompatibility, exceptional mechanical properties, tunable degradation, ease of processing, and sufficient supply. As a material with excellent processability, SF can be processed into various forms with different structures, including particulate, fiber, film, and three-dimensional (3D) porous scaffolds. This review discusses and summarizes the various constructions of SF-based materials, from single structures to multi-level structures, and their applications. In combination with single structures, new techniques for creating special multi-level structures of SF-based materials, such as micropatterning and 3D-printing, are also briefly addressed.
Luo, Jie; Cai, Limei; Qi, Shihua; Wu, Jian; Sophie Gu, Xiaowen
2017-12-15
Multiple techniques for soil decontamination were combined to enhance the phytoremediation efficiency of Eucalyptus globulese and alleviate the corresponding environmental risks. The approach, consisting of chelating agent application, electrokinetic remediation, plant hormone foliar application and phytoremediation, was designed to remediate multi-metal contaminated soils from a notorious e-waste recycling town. The decontamination ability of E. globulese increased from 1.35, 58.47 and 119.18 mg per plant for Cd, Pb and Cu in planting controls to 7.57, 198.68 and 174.34 mg per plant in individual EDTA treatments, respectively, but simultaneously, 0.9-11.5 times more metals leached from chelator treatments relative to controls. Low (2 V) and moderate (4 V) voltage electric fields promoted the growth of the species while high voltage (10 V) had the opposite effect, and metal concentrations in the plants rose with increasing voltage. Volumes of the leachate decreased from 1224 to 134 mL as voltage increased from 0 to 10 V, due to electroosmosis and electrolysis. Compared with individual phytoremediation, foliar cytokinin treatments produced 56% more biomass and intercepted 2.5 times more leachate, attributable to the enhanced transpiration rate. The synergistic combination of these techniques resulted in the greatest biomass production and metal accumulation by the species under stress conditions relative to the other methods. The time required for the multi-technique approach to decontaminate Cd, Pb and Cu from soil was 2.1-10.4 times less than for individual chelator addition, electric field application or plant hormone utilization. Notably, almost no leachate (60 mL in total) was collected from the multi-technique system. This approach is a suitable method for remediating metal-polluted sites given its decontamination efficiency and negligible associated environmental risk. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Das, Bankim Chandra; Bhattacharyya, Dipankar; Das, Arpita; Chakrabarti, Shrabana; De, Sankar
2016-12-01
We report here simultaneous experimental observation of Electromagnetically Induced Transparency (EIT) and Electromagnetically Induced Absorption (EIA) in a multi-level V-type system in the D2 transition of 87Rb, i.e., F=2→F′, with a strong pump and a weak probe beam. We studied the probe spectrum by locking the probe beam to the transition F=2→F′=2 while the pump is scanned over F=2→F′. EIA is observed for the open transition (F=2→F′=2) whereas EIT is observed in the closed transition (F=2→F′=3). Sub-natural line-width is observed for the EIA. To simulate the observed spectra theoretically, the Liouville equation for the three-level V-type system is solved analytically with a multi-mode approach for the density matrix elements. We assumed both the pump and the probe beams can couple the excited states. A multi-mode approach for the coherence terms facilitates the study of all the frequency contributions due to the pump and the probe fields. Since the terms contain higher harmonics of the pump and the probe frequencies, we expressed them in Fourier-transformed form. To simulate the probe spectrum, we have solved inhomogeneous difference equations for the coherence terms using the Green's function technique and continued fraction theory. The experimental line-widths of the EIT and the EIA are compared with our theoretical model. Our system can be useful in optical switching applications as it can be precisely tuned to render the medium opaque and transparent simultaneously.
Semantic classification of business images
NASA Astrophysics Data System (ADS)
Erol, Berna; Hull, Jonathan J.
2006-01-01
Digital cameras are becoming increasingly common for capturing information in business settings. In this paper, we describe a novel method for classifying images into the following semantic classes: document, whiteboard, business card, slide, and regular images. Our method is based on combining low-level image features, such as text color, layout, and handwriting features, with high-level OCR output analysis. Several Support Vector Machine classifiers are combined for multi-class classification of input images. The system yields 95% accuracy in classification.
Lu, Victor M; Zhang, Lucy; Scherman, Daniel B; Rao, Prashanth J; Mobbs, Ralph J; Phan, Kevin
2017-02-01
The traditional surgical approach to treat multi-level cervical disc disease (mCDD) has been anterior cervical discectomy and fusion (ACDF). There has been recent development of other surgical approaches to further improve clinical outcomes. Collectively, when elements of these different approaches are combined in surgery, it is known as hybrid surgery (HS) which remains a novel treatment option. A systematic review and meta-analysis was conducted to compare the outcomes of HS versus ACDF for the treatment of mCDD. Relevant articles were identified from six electronic databases from their inception to January 2016. From 8 relevant studies identified, 169 patients undergoing HS were compared with 193 ACDF procedures. Operative time was greater after HS by 42 min (p < 0.00001), with less intraoperative blood loss by 26 mL (p < 0.00001) and shorter return to work by 32 days (p < 0.00001). In terms of clinical outcomes, HS was associated with greater C2-C7 range of motion (ROM) preservation (p < 0.00001) and less functional impairment (p = 0.008) after surgery compared to ACDF. There was no significant difference between HS and ACDF with respect to postoperative pain (p = 0.12). The postoperative course following HS was not significantly different to ACDF in terms of length of stay (p = 0.24) and postoperative complication rates (p = 0.18). HS is a novel surgical approach to treat mCDD, associated with a greater operative time, less intraoperative blood loss and comparable if not superior clinical outcomes compared to ACDF. While it remains a viable consideration, there is a lack of robust clinical evidence in the literature. Future large prospective registries and randomised trials are warranted to validate the findings of this study.
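The inverse-variance pooling behind such a meta-analysis of continuous outcomes (e.g., operative time or blood loss) can be sketched as follows. The input numbers below are made up for illustration, not taken from the eight included studies, and the normal-approximation 95% interval is a simplification.

```python
def pooled_mean_difference(studies):
    """Fixed-effect (inverse-variance) pooled mean difference.

    studies: list of (m1, sd1, n1, m2, sd2, n2) tuples, one per study,
    where arm 1 is HS and arm 2 is ACDF.
    Returns the pooled difference and its 95% confidence interval."""
    ds, ws = [], []
    for m1, sd1, n1, m2, sd2, n2 in studies:
        d = m1 - m2                           # mean difference for this study
        var = sd1 ** 2 / n1 + sd2 ** 2 / n2   # its (approximate) variance
        ds.append(d)
        ws.append(1.0 / var)
    pooled = sum(w * d for w, d in zip(ws, ds)) / sum(ws)
    se = (1.0 / sum(ws)) ** 0.5
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```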
NASA Astrophysics Data System (ADS)
Held, H.; Gerstengarbe, F.-W.; Hattermann, F.; Pinto, J. G.; Ulbrich, U.; Böhm, U.; Born, K.; Büchner, M.; Donat, M. G.; Kücken, M.; Leckebusch, G. C.; Nissen, K.; Nocke, T.; Österle, H.; Pardowitz, T.; Werner, P. C.; Burghoff, O.; Broecker, U.; Kubik, A.
2012-04-01
We present an overview of a complementary-approaches impact project dealing with the consequences of climate change for the natural hazard branch of the insurance industry in Germany. The project was conducted by four academic institutions together with the German Insurance Association (GDV) and finalized in autumn 2011. A causal chain is modeled that goes from global warming projections over regional meteorological impacts to regional economic losses for private buildings, thereby fully covering the area of Germany. This presentation focuses on wind-storm-related losses, although the method developed has also been applied in part to hail and flood impact losses. For the first time, the GDV supplied its collected set of insurance cases, dating back for decades, for such an impact study. These data were used to calibrate and validate event-based damage functions, which in turn were driven by three different types of regional climate models to generate storm loss projections. The regional models were driven by a triplet of ECHAM5 experiments following the A1B scenario which were found representative in the recent ENSEMBLES intercomparison study. In our multi-modeling approach we used two types of regional climate models that differ maximally in concept: a dynamical model (CCLM) and a statistical model based on the idea of biased bootstrapping (STARS). As a third option we pursued a hybrid approach (statistical-dynamical downscaling). For the assessment of climate change impacts, the buildings' infrastructure and their economic value are kept at current values. For all three approaches, a significant increase of average storm losses and extreme event return levels in the German private building sector is found for future decades assuming an A1B scenario. However, the three projections differ somewhat in terms of magnitude and regional differentiation.
We have developed a formalism that allows us to express the combined effect of multi-source uncertainty on return levels within the framework of a generalized Pareto distribution.
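For reference, the T-year return level implied by a generalized Pareto fit above a threshold u, with scale σ, shape ξ and an average of λ threshold exceedances per year, is z_T = u + (σ/ξ)((λT)^ξ − 1). A minimal sketch with assumed parameter names (the project's formalism additionally propagates multi-source uncertainty through these parameters):

```python
import math

def gpd_return_level(u, sigma, xi, lam, T):
    """Return level of the T-year event from a generalized Pareto fit.

    u: threshold, sigma: scale, xi: shape, lam: mean number of
    threshold exceedances per year, T: return period in years."""
    if abs(xi) < 1e-9:  # exponential limit as xi -> 0
        return u + sigma * math.log(lam * T)
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)
```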
Joint Facial Action Unit Detection and Feature Fusion: A Multi-conditional Learning Approach.
Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja
2016-10-05
Automated analysis of facial expressions can benefit many domains, from marketing to clinical diagnosis of neurodevelopmental disorders. Facial expressions are typically encoded as a combination of facial muscle activations, i.e., action units. Depending on context, these action units co-occur in specific patterns, and rarely in isolation. Yet, most existing methods for automatic action unit detection fail to exploit dependencies among them, and the corresponding facial features. To address this, we propose a novel multi-conditional latent variable model for simultaneous fusion of facial features and joint action unit detection. Specifically, the proposed model performs feature fusion in a generative fashion via a low-dimensional shared subspace, while simultaneously performing action unit detection using a discriminative classification approach. We show that by combining the merits of both approaches, the proposed methodology outperforms existing purely discriminative/generative methods for the target task. To reduce the number of parameters, and avoid overfitting, a novel Bayesian learning approach based on Monte Carlo sampling is proposed, to integrate out the shared subspace. We validate the proposed method on posed and spontaneous data from three publicly available datasets (CK+, DISFA and Shoulder-pain), and show that both feature fusion and joint learning of action units leads to improved performance compared to the state-of-the-art methods for the task.
Sanbonmatsu, David M; Strayer, David L; Medeiros-Ward, Nathan; Watson, Jason M
2013-01-01
The present study examined the relationship between personality and individual differences in multi-tasking ability. Participants enrolled at the University of Utah completed measures of multi-tasking activity, perceived multi-tasking ability, impulsivity, and sensation seeking. In addition, they performed the Operation Span in order to assess their executive control and actual multi-tasking ability. The findings indicate that the persons who are most capable of multi-tasking effectively are not the persons who are most likely to engage in multiple tasks simultaneously. To the contrary, multi-tasking activity as measured by the Media Multitasking Inventory and self-reported cell phone usage while driving were negatively correlated with actual multi-tasking ability. Multi-tasking activity was positively correlated with participants' perceived multi-tasking ability, which was found to be significantly inflated. Participants with a strong approach orientation and a weak avoidance orientation--high levels of impulsivity and sensation seeking--reported greater multi-tasking behavior. Finally, the findings suggest that people often engage in multi-tasking because they are less able to block out distractions and focus on a singular task. Participants with less executive control--low scorers on the Operation Span task and persons high in impulsivity--tended to report higher levels of multi-tasking activity.
LONG-TERM PERFORMANCE OF PERMEABLE REACTIVE BARRIERS: AN UPDATE ON A U.S. MULTI-AGENCY INITIATIVE
Permeable reactive barriers (PRBs) are an emerging alternative in-situ approach for remediating contaminated groundwater that combines subsurface fluid-flow management with a passive chemical treatment zone. PRBs are a potentially more cost-effective treatment option at seve...
Lee, Junghoon; Carass, Aaron; Jog, Amod; Zhao, Can; Prince, Jerry L
2017-02-01
Accurate CT synthesis, sometimes called electron density estimation, from MRI is crucial for successful MRI-based radiotherapy planning and dose computation. Existing CT synthesis methods are able to synthesize normal tissues but are unable to accurately synthesize abnormal tissues (i.e., tumor), thus providing a suboptimal solution. We propose a multi-atlas-based hybrid synthesis approach that combines multi-atlas registration and patch-based synthesis to accurately synthesize both normal and abnormal tissues. Multi-parametric atlas MR images are registered to the target MR images by multi-channel deformable registration, from which the atlas CT images are deformed and fused by locally-weighted averaging using a structural similarity measure (SSIM). Synthetic MR images are also computed from the registered atlas MRIs by using the same weights used for the CT synthesis; these are compared to the target patient MRIs allowing for the assessment of the CT synthesis fidelity. Poor synthesis regions are automatically detected based on the fidelity measure and refined by a patch-based synthesis. The proposed approach was tested on brain cancer patient data, and showed a noticeable improvement for the tumor region.
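At a single voxel, the locally weighted fusion reduces to a similarity-weighted average. Below is a minimal stand-in where `similarities` plays the role of the local SSIM between each registered atlas MR and the target MR; the names and the equal-weight fallback for all-zero similarities are assumptions, not the paper's exact scheme.

```python
def fuse_atlases(atlas_cts, similarities):
    """Locally weighted fusion of registered atlas CT values at one voxel.

    atlas_cts: list of per-atlas CT values.
    similarities: list of non-negative local similarity scores."""
    total = sum(similarities)
    if total == 0:
        # No atlas is locally similar: fall back to a plain average.
        return sum(atlas_cts) / len(atlas_cts)
    return sum(s * v for s, v in zip(similarities, atlas_cts)) / total
```

In the paper's pipeline, voxels where all similarities are low would instead be flagged as poor-synthesis regions and refined by the patch-based step.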
Sobieranski, Antonio C; Inci, Fatih; Tekin, H Cumhur; Yuksekkaya, Mehmet; Comunello, Eros; Cobra, Daniel; von Wangenheim, Aldo; Demirci, Utkan
2017-01-01
In this paper, an irregular displacement-based lensless wide-field microscopy imaging platform is presented by combining digital in-line holography and computational pixel super-resolution using multi-frame processing. The samples are illuminated by a nearly coherent illumination system, and the hologram shadows are projected onto a complementary metal-oxide semiconductor-based imaging sensor. To increase the resolution, a multi-frame pixel super-resolution approach is employed to produce a single holographic image from multiple frame observations of the scene with small planar displacements. Displacements are resolved by a hybrid approach: (i) alignment of the LR images by a fast feature-based registration method, and (ii) fine adjustment of the sub-pixel information using a continuous optimization approach designed to find the global optimum solution. A numerical phase-retrieval method is applied to decode the signal and reconstruct the morphological details of the analyzed sample. The presented approach was evaluated with various biological samples, including sperm and platelets, whose dimensions are on the order of a few microns. The obtained results demonstrate a spatial resolution of 1.55 µm over a field-of-view of ≈30 mm2. PMID:29657866
Enhancing the Teaching of Introductory Economics with a Team-Based, Multi-Section Competition
ERIC Educational Resources Information Center
Beaudin, Laura; Berdiev, Aziz N.; Kaminaga, Allison Shwachman; Mirmirani, Sam; Tebaldi, Edinaldo
2017-01-01
The authors describe a unique approach to enhancing student learning at the introductory economics level that utilizes a multi-section, team-based competition. The competition is structured to supplement learning throughout the entire introductory course. Student teams are presented with current economic issues, trends, or events, and use economic…
Kim, Sungjin; Jinich, Adrián; Aspuru-Guzik, Alán
2017-04-24
We propose a multiple descriptor multiple kernel (MultiDK) method for efficient molecular discovery using machine learning. We show that the MultiDK method improves both the speed and accuracy of molecular property prediction. We apply the method to the discovery of electrolyte molecules for aqueous redox flow batteries. Using multiple-type (as opposed to single-type) descriptors, we obtain more relevant features for machine learning. Following the principle of "wisdom of the crowds", the combination of multiple-type descriptors significantly boosts prediction performance. Moreover, by employing multiple kernels (more than one kernel function for a set of the input descriptors), MultiDK exploits nonlinear relations between molecular structure and properties better than a linear regression approach. The multiple kernels consist of a Tanimoto similarity kernel and a linear kernel for a set of binary descriptors and a set of nonbinary descriptors, respectively. Using MultiDK, we achieve an average performance of r² = 0.92 with a test set of molecules for solubility prediction. We also extend MultiDK to predict pH-dependent solubility and apply it to a set of quinone molecules with different ionizable functional groups to assess their performance as flow battery electrolytes.
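The combined-kernel idea can be sketched with kernel ridge regression: a Tanimoto kernel on binary fingerprints added to a linear kernel on real-valued descriptors. A hedged sketch assuming non-empty fingerprints and a tiny ridge penalty; the actual MultiDK descriptor sets, kernel weighting and hyperparameters differ.

```python
import numpy as np

def tanimoto_kernel(A, B):
    """Tanimoto similarity between rows of two binary fingerprint matrices."""
    inter = A @ B.T
    counts_a = A.sum(axis=1)[:, None]
    counts_b = B.sum(axis=1)[None, :]
    return inter / (counts_a + counts_b - inter)

def multidk_fit_predict(fp_train, x_train, y_train, fp_test, x_test, lam=1e-3):
    """Kernel ridge regression with a combined kernel: Tanimoto on binary
    fingerprints plus a linear kernel on real-valued descriptors."""
    K = tanimoto_kernel(fp_train, fp_train) + x_train @ x_train.T
    alpha = np.linalg.solve(K + lam * np.eye(len(y_train)), y_train)
    K_test = tanimoto_kernel(fp_test, fp_train) + x_test @ x_train.T
    return K_test @ alpha
```

Because kernels add, the sum of a Tanimoto kernel and a linear kernel is itself a valid kernel, which is what lets the two descriptor types be fused in one regression.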
NASA Astrophysics Data System (ADS)
Xu, Z.; Guan, K.; Peng, B.; Casler, N. P.; Wang, S. W.
2017-12-01
Landscape has complex three-dimensional features. These 3D features are difficult to extract using conventional methods. Small-footprint LiDAR provides an ideal way of capturing them. Existing approaches, however, have been limited to raster- or metric-based (two-dimensional) feature extraction from the upper or bottom layer, and thus are not suitable for resolving morphological and intensity features that could be important to fine-scale land cover mapping. Therefore, this research combines airborne LiDAR and multi-temporal Landsat imagery to classify land cover types of Williamson County, Illinois, which has diverse and mixed landscape features. Specifically, we applied a 3D convolutional neural network (CNN) method to extract features from LiDAR point clouds by (1) creating occupancy and intensity grids at 1-meter resolution, and then (2) normalizing and feeding the data into a 3D CNN feature extractor for many epochs of learning. The learned features (e.g., morphological features, intensity features, etc.) were combined with multi-temporal spectral data to enhance the performance of land cover classification based on a Support Vector Machine classifier. We used photo interpretation for training and testing data generation. The classification results show that our approach outperforms traditional methods using LiDAR-derived feature maps, and promises to serve as an effective methodology for creating high-quality land cover maps through fusion of complementary types of remote sensing data.
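The occupancy/intensity gridding step that feeds the 3D CNN can be sketched as a simple voxelization. A minimal dictionary-based version with an assumed point format (x, y, z, intensity); the study uses dense 1-meter grids rather than sparse dictionaries.

```python
def voxelize(points, resolution=1.0):
    """Build occupancy and mean-intensity grids from LiDAR returns.

    points: iterable of (x, y, z, intensity) tuples.
    Returns (occupancy, mean_intensity) dicts keyed by integer voxel index."""
    occupancy, intensity_sum = {}, {}
    for x, y, z, intensity in points:
        key = (int(x // resolution), int(y // resolution), int(z // resolution))
        occupancy[key] = occupancy.get(key, 0) + 1
        intensity_sum[key] = intensity_sum.get(key, 0.0) + intensity
    mean_intensity = {k: intensity_sum[k] / occupancy[k] for k in occupancy}
    return occupancy, mean_intensity
```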
Optimized swimmer tracking system based on a novel multi-related-targets approach
NASA Astrophysics Data System (ADS)
Benarab, D.; Napoléon, T.; Alfalou, A.; Verney, A.; Hellard, P.
2017-02-01
Robust tracking is a crucial step in automatic swimmer evaluation from video sequences. We designed a robust swimmer tracking system using a new multi-related-targets approach. The main idea is to consider the swimmer as a bloc of connected subtargets that advance at the same speed. If one of the subtargets is partially or totally occluded, it can be localized by knowing the position of the others. In this paper, we first introduce the two-dimensional direct linear transformation technique that we used to calibrate the videos. Then, we present the classical tracking approach based on dynamic fusion. Next, we highlight the main contribution of our work, which is the multi-related-targets tracking approach. This approach, the classical head-only approach and the ground truth are then compared through testing on a database of high-level swimmers in training, national and international competitions (French National Championships, Limoges 2015, and World Championships, Kazan 2015). Tracking percentage and the accuracy of the instantaneous speed are evaluated, and the findings show that our new approach is significantly more accurate than the classical approach.
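The core of the multi-related-targets idea (subtargets rigidly offset from a common reference point and advancing at the same speed) can be sketched as follows. The subtarget names, fixed offsets and constant-velocity assumption are illustrative only; the actual tracker estimates these from the video.

```python
def recover_occluded(positions, offsets, speed, dt=1.0):
    """Predict all subtarget positions in the next frame, including occluded ones.

    positions: dict subtarget -> (x, y) in the current frame, None if occluded.
    offsets: dict subtarget -> (dx, dy) fixed offset from the bloc reference.
    speed: (vx, vy) common velocity of the bloc."""
    # Bloc reference point estimated from the visible subtargets.
    visible = [(p[0] - offsets[k][0], p[1] - offsets[k][1])
               for k, p in positions.items() if p is not None]
    ref_x = sum(v[0] for v in visible) / len(visible)
    ref_y = sum(v[1] for v in visible) / len(visible)
    # Advance the bloc at the common speed, then re-attach every subtarget
    # at its fixed offset; occluded subtargets are thereby localized too.
    ref_x += speed[0] * dt
    ref_y += speed[1] * dt
    return {k: (ref_x + offsets[k][0], ref_y + offsets[k][1]) for k in offsets}
```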
Santos, Guido; Lai, Xin; Eberhardt, Martin; Vera, Julio
2018-01-01
Pneumococcal infection is the most frequent cause of pneumonia, and one of the most prevalent diseases worldwide. The population groups at high risk of death from bacterial pneumonia are infants, elderly and immunosuppressed people. These groups are more vulnerable because they have immature or impaired immune systems, the efficacy of their response to vaccines is lower, and antibiotic treatment often does not take place until the inflammatory response triggered is already overwhelming. The immune response to bacterial lung infections involves dynamic interactions between several types of cells whose activation is driven by intracellular molecular networks. A feasible approach to the integration of knowledge and data linking tissue, cellular and intracellular events and the construction of hypotheses in this area is the use of mathematical modeling. For this paper, we used a multi-level computational model to analyse the role of cellular and molecular interactions during the first 10 h after alveolar invasion of Streptococcus pneumoniae bacteria. By "multi-level" we mean that we simulated the interplay between different temporal and spatial scales in a single computational model. In this instance, we included the intracellular scale of processes driving lung epithelial cell activation together with the scale of cell-to-cell interactions at the alveolar tissue. In our analysis, we combined systematic model simulations with logistic regression analysis and decision trees to find genotypic-phenotypic signatures that explain differences in bacteria strain infectivity. According to our simulations, pneumococci benefit from a high dwelling probability and a high proliferation rate during the first stages of infection. In addition to this, the model predicts that during the very early phases of infection the bacterial capsule could be an impediment to the establishment of the alveolar infection because it impairs bacterial colonization.
NASA Astrophysics Data System (ADS)
Harvey, Natalie J.; Huntley, Nathan; Dacre, Helen F.; Goldstein, Michael; Thomson, David; Webster, Helen
2018-01-01
Following the disruption to European airspace caused by the eruption of Eyjafjallajökull in 2010 there has been a move towards producing quantitative predictions of volcanic ash concentration using volcanic ash transport and dispersion simulators. However, there is no formal framework for determining the uncertainties of these predictions and performing many simulations using these complex models is computationally expensive. In this paper a Bayesian linear emulation approach is applied to the Numerical Atmospheric-dispersion Modelling Environment (NAME) to better understand the influence of source and internal model parameters on the simulator output. Emulation is a statistical method for predicting the output of a computer simulator at new parameter choices without actually running the simulator. A multi-level emulation approach is applied using two configurations of NAME with different numbers of model particles. Information from many evaluations of the computationally faster configuration is combined with results from relatively few evaluations of the slower, more accurate, configuration. This approach is effective when it is not possible to run the accurate simulator many times and when there is also little prior knowledge about the influence of parameters. The approach is applied to the mean ash column loading in 75 geographical regions on 14 May 2010. Through this analysis it has been found that the parameters that contribute the most to the output uncertainty are initial plume rise height, mass eruption rate, free tropospheric turbulence levels and precipitation threshold for wet deposition. This information can be used to inform future model development and observational campaigns and routine monitoring. The analysis presented here suggests the need for further observational and theoretical research into parameterisation of atmospheric turbulence. 
Furthermore, it can be used to identify the most important parameter perturbations for a small operational ensemble of simulations. The use of an emulator also identifies the input and internal parameters that do not contribute significantly to simulator uncertainty. Finally, the analysis highlights that the faster, less accurate configuration of NAME can, on its own, provide useful information for the problem of predicting average column load over large areas.
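The multi-level idea described above, combining many cheap runs with few expensive ones, can be sketched in miniature. This is not the Bayesian linear emulation used with NAME; it is a minimal illustration in which both "simulators" are toy functions (assumptions): a linear emulator is fit to the fast level, and a second fit to the coarse-to-fine discrepancy corrects it using only three expensive evaluations.

```python
# Two-level emulation sketch: emulate the fast simulator from many runs,
# then correct it with a discrepancy model fit to a few slow-simulator runs.
import random

def fast_sim(x):      # cheap, biased stand-in simulator (assumption)
    return 2.0 * x + 0.5

def slow_sim(x):      # expensive, accurate stand-in simulator (assumption)
    return 2.0 * x + 0.1 * x * x

def linfit(xs, ys):   # ordinary least squares for y ~ a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

random.seed(0)
many = [random.uniform(0.0, 1.0) for _ in range(200)]   # cheap design points
few = [0.1, 0.5, 0.9]                                   # expensive design points

a1, b1 = linfit(many, [fast_sim(x) for x in many])              # fast level
a2, b2 = linfit(few, [slow_sim(x) - fast_sim(x) for x in few])  # discrepancy

def emulator(x):
    return (a1 + b1 * x) + (a2 + b2 * x)   # fast-level emulator + correction

print(abs(emulator(0.5) - slow_sim(0.5)))  # small, at a fraction of the cost
```

The same structure, with a statistical emulator and proper uncertainty bands in place of the least-squares fits, underlies the two-configuration NAME analysis.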
A multi-resolution approach to electromagnetic modelling
NASA Astrophysics Data System (ADS)
Cherevatova, M.; Egbert, G. D.; Smirnov, M. Yu
2018-07-01
We present a multi-resolution approach for 3-D magnetotelluric forward modelling. Our approach is motivated by the fact that fine-grid resolution is typically required at shallow levels to adequately represent near-surface inhomogeneities, topography and bathymetry, while a much coarser grid may be adequate at depth where the diffusively propagating electromagnetic fields are much smoother. With a conventional structured finite difference grid, the fine discretization required to adequately represent rapid variations near the surface is continued to all depths, resulting in higher computational costs. Increasing the computational efficiency of the forward modelling is especially important for solving regularized inversion problems. We implement a multi-resolution finite difference scheme that allows us to decrease the horizontal grid resolution with depth, as is done with vertical discretization. In our implementation, the multi-resolution grid is represented as a vertical stack of subgrids, with each subgrid being a standard Cartesian tensor-product staggered grid. Thus, our approach is similar to the octree discretization previously used for electromagnetic modelling, but simpler in that we allow refinement only with depth. The major difficulty arose in deriving the forward modelling operators on interfaces between adjacent subgrids. We considered three ways of handling the interface layers and suggest a preferable one, which results in accuracy similar to the staggered-grid solution while retaining the symmetry of the coefficient matrix. A comparison between multi-resolution and staggered solvers for various models shows that the multi-resolution approach improves computational efficiency without compromising the accuracy of the solution.
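A back-of-the-envelope count shows where the savings of the vertical subgrid stack come from. The grid dimensions below are arbitrary illustrative numbers, assuming each deeper subgrid halves the horizontal resolution, as the abstract's stack-of-subgrids construction allows.

```python
# Cell count for a multi-resolution stack (horizontal coarsening by 2 per
# subgrid) versus a uniform grid that carries the fine resolution to all depths.
nx = ny = 256                 # fine horizontal resolution at the surface
layers_per_subgrid = 10       # vertical layers in each subgrid (assumed)
subgrids = 4                  # stack depth; each subgrid coarsens by 2

multi = sum((nx >> s) * (ny >> s) * layers_per_subgrid for s in range(subgrids))
uniform = nx * ny * layers_per_subgrid * subgrids

print(multi, uniform, uniform / multi)   # roughly 3x fewer cells in this setup
```

Because the per-level cost shrinks geometrically, most of the work concentrates in the shallow fine subgrid, which is exactly where the fields vary most rapidly.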
Vasconcelos, A G; Almeida, R M; Nobre, F F
2001-08-01
This paper introduces an approach that includes non-quantitative factors for the selection and assessment of multivariate complex models in health. A goodness-of-fit based methodology combined with a fuzzy multi-criteria decision-making approach is proposed for model selection. Models were obtained using the Path Analysis (PA) methodology in order to explain the interrelationship between health determinants and the post-neonatal component of infant mortality in 59 municipalities of Brazil in the year 1991. Socioeconomic and demographic factors were used as exogenous variables, and environmental, health service and agglomeration factors as endogenous variables. Five PA models were developed and accepted by statistical criteria of goodness-of-fit. These models were then submitted to a group of experts, seeking to characterize their preferences according to predefined criteria that tried to evaluate model relevance and plausibility. Fuzzy set techniques were used to rank the alternative models according to the number of times a model was superior to ("dominated") the others. The best-ranked model explained above 90% of the endogenous variables' variation, and showed the favorable influences of income and education levels on post-neonatal mortality. It also showed the unfavorable effect on mortality of fast population growth, through precarious dwelling conditions and decreased access to sanitation. It was possible to aggregate expert opinions in model evaluation. The proposed procedure for model selection allowed the inclusion of subjective information in a clear and systematic manner.
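The dominance-count ranking step can be sketched directly. The criterion scores below are hypothetical membership degrees in [0, 1] for three made-up models, not values from the study, and the criteria names are assumptions.

```python
# Rank candidate models by how many others each one dominates.
scores = {
    "PA-1": [0.9, 0.8, 0.7],   # relevance, plausibility, fit (assumed criteria)
    "PA-2": [0.6, 0.9, 0.5],
    "PA-3": [0.5, 0.4, 0.6],
}

def dominates(a, b):
    """a dominates b: no worse on every criterion, strictly better on one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

counts = {m: sum(dominates(s, scores[o]) for o in scores if o != m)
          for m, s in scores.items()}
ranking = sorted(counts, key=counts.get, reverse=True)
print(ranking)   # models ordered by dominance count, best first
```

In the paper the scores themselves come from fuzzy aggregation of expert preferences; the ranking-by-domination logic is the same.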
Reis, Yara; Wolf, Thomas; Brors, Benedikt; Hamacher-Brady, Anne; Eils, Roland; Brady, Nathan R.
2012-01-01
Mitochondria exist as a network of interconnected organelles undergoing constant fission and fusion. Current approaches to study mitochondrial morphology are limited by low data sampling coupled with manual identification and classification of complex morphological phenotypes. Here we propose an integrated mechanistic and data-driven modeling approach to analyze heterogeneous, quantified datasets and infer relations between mitochondrial morphology and apoptotic events. We initially performed high-content, multi-parametric measurements of mitochondrial morphological, apoptotic, and energetic states by high-resolution imaging of human breast carcinoma MCF-7 cells. Subsequently, decision tree-based analysis was used to automatically classify networked, fragmented, and swollen mitochondrial subpopulations, at the single-cell level and within cell populations. Our results revealed subtle but significant differences in morphology class distributions in response to various apoptotic stimuli. Furthermore, key mitochondrial functional parameters including mitochondrial membrane potential and Bax activation, were measured under matched conditions. Data-driven fuzzy logic modeling was used to explore the non-linear relationships between mitochondrial morphology and apoptotic signaling, combining morphological and functional data as a single model. Modeling results are in accordance with previous studies, where Bax regulates mitochondrial fragmentation, and mitochondrial morphology influences mitochondrial membrane potential. In summary, we established and validated a platform for mitochondrial morphological and functional analysis that can be readily extended with additional datasets. We further discuss the benefits of a flexible systematic approach for elucidating specific and general relationships between mitochondrial morphology and apoptosis. PMID:22272225
Miller-Rosales, Chris; Sterling, Stacy A; Wood, Sabrina B; Ross, Thekla; Makki, Mojdeh; Zamudio, Cindy; Kane, Irene M; Richardson, Megan C; Samayoa, Claudia; Charvat-Aguilar, Nancy; Lu, Wendy Y; Vo, Michelle; Whelan, Kimberly; Uratsu, Connie S; Grant, Richard W
2017-12-01
Cardiovascular disease (CVD) is the leading cause of death in the US. Many patients do not benefit from traditional disease management approaches to CVD risk reduction. Here we describe the rationale, development, and implementation of a multi-component behavioral intervention targeting patients who have persistently not met goals of CVD risk factor control. Informed by published evidence, relevant theoretical frameworks, stakeholder advice, and patient input, we developed a group-based intervention (Changing Results: Engage and Activate to Enhance Wellness; "CREATE Wellness") to address the complex needs of patients with elevated or unmeasured CVD-related risk factors. We are testing this intervention in a randomized trial among patients with persistent (i.e > 2 years) sub-optimal risk factor control despite being enrolled in an advanced and highly successful CVD disease management program. The CREATE Wellness intervention is designed as a 3 session, group-based intervention combining proven elements of patient activation, health system engagement skills training, shared decision making, care planning, and identification of lifestyle change barriers. Our key learnings in designing the intervention included the value of multi-level stakeholder input and the importance of pragmatic skills training to address barriers to care. The CREATE Wellness intervention represents an evidence-based, patient-centered approach for patients not responding to traditional disease management. The trial is currently underway at three medical facilities within Kaiser Permanente Northern California and next steps include an evaluation of efficacy, adaptation for non-English speaking patient populations, and modification of the curriculum for web- or phone-based versions. NCT02302612.
NASA Astrophysics Data System (ADS)
Goodwin, I. D.; Mortlock, T.
2016-02-01
Geohistorical archives of shoreline and foredune planform geometry provide a unique evidence-based record of the time-integral response to coupled directional wave climate and sediment supply variability on annual to multi-decadal time scales. We develop conceptual shoreline modelling from the geohistorical shoreline archive using a novel combination of methods, including: LIDAR DEM and field mapping of coastal geology; a decadal-scale climate reconstruction of sea-level pressure, marine windfields, and paleo-storm synoptic type and frequency; and historical bathymetry. The conceptual modelling allows for the discrimination of directional wave climate shifts and the relative contributions of cross-shore and along-shore sand supply rates at multi-decadal resolution. We present regional examples from south-eastern Australia over a large latitudinal gradient from subtropical Queensland (S 25°) to mid-latitude Bass Strait (S 40°) that illustrate the morphodynamic evolution and reorganization in response to wave climate change. We then use the conceptual modelling to inform a two-dimensional coupled spectral wave-hydrodynamic-morphodynamic model to investigate the shoreface response to paleo-directional wind and wave climates. Unlike one-line shoreline modelling, this fully dynamical approach allows for the investigation of cumulative and spatial bathymetric change due to wave-induced currents, as well as proxy-shoreline change. The fusion of the two modelling approaches allows for: (i) the identification of the natural range of coastal planform geometries in response to wave climate shifts; and, (ii) the decomposition of the multidecadal coastal change into the cross-shore and along-shore sand supply drivers, according to the best-matching planforms.
Rahbar, Mohammad H; Choi, Sangbum; Hong, Chuan; Zhu, Liang; Jeon, Sangchoon; Gardiner, Joseph C
2018-01-01
We propose a nonparametric shrinkage estimator for the median survival times from several independent samples of right-censored data, which combines the samples and hypothesis information to improve the efficiency. We compare efficiency of the proposed shrinkage estimation procedure to the unrestricted estimator and the combined estimator through extensive simulation studies. Our results indicate that performance of these estimators depends on the strength of homogeneity of the medians. When homogeneity holds, the combined estimator is the most efficient estimator. However, it becomes inconsistent when homogeneity fails. On the other hand, the proposed shrinkage estimator remains efficient. Its efficiency decreases as the survival medians deviate from equality, but it is expected to be at least as good as the unrestricted estimator. Our simulation studies also indicate that the proposed shrinkage estimator is robust to moderate levels of censoring. We demonstrate application of these methods to estimating median time for trauma patients to receive red blood cells in the Prospective Observational Multi-center Major Trauma Transfusion (PROMMTT) study.
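The trade-off between the unrestricted, combined, and shrinkage estimators can be illustrated on uncensored data (the paper's estimator additionally handles right-censoring, which is omitted here). The group data and the fixed shrinkage weight are assumptions; in the paper the degree of shrinkage is determined from the data and the homogeneity hypothesis.

```python
# Shrinkage of group medians toward the pooled median (simplified sketch).
import statistics

groups = [
    [3.1, 4.0, 4.4, 5.2, 6.0],
    [3.9, 4.1, 4.6, 5.0, 5.8],
    [2.8, 3.5, 4.2, 4.9, 7.1],
]

pooled = statistics.median([t for g in groups for t in g])   # combined estimator
unrestricted = [statistics.median(g) for g in groups]        # per-group medians

lam = 0.5   # shrinkage strength (hypothetical; data-driven in the paper)
shrunk = [lam * pooled + (1.0 - lam) * m for m in unrestricted]

print(pooled, unrestricted, shrunk)
```

With lam = 1 this reduces to the combined estimator (efficient under homogeneity, inconsistent otherwise); with lam = 0 it is the unrestricted estimator; intermediate values trade bias for variance.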
NASA Astrophysics Data System (ADS)
Chen, Yen-Luan; Chang, Chin-Chih; Sheu, Dwan-Fang
2016-04-01
This paper proposes generalised random and age replacement policies for a multi-state system composed of multi-state elements. The degradation of a multi-state element is assumed to follow a non-homogeneous continuous-time Markov process, i.e. a continuous-time, discrete-state process. A recursive approach is presented to efficiently compute the time-dependent state probability distribution of the multi-state element. The state and performance distribution of the entire multi-state system is evaluated via the combination of the stochastic process and the Lz-transform method. The concept of a customer-centred reliability measure is developed based on the system performance and the customer demand. We develop random and age replacement policies for an aging multi-state system subject to imperfect maintenance in a failure (or unacceptable) state. For each policy, the optimum replacement schedule which minimises the mean cost rate is derived analytically and discussed numerically.
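The time-dependent state probabilities of a non-homogeneous continuous-time Markov process can be computed by stepping the Kolmogorov forward equations, which is one simple way to realize the recursive computation the abstract mentions. The 3-state degradation model and its time-increasing rates below are hypothetical, not taken from the paper.

```python
# Step p'(t) = p(t) Q(t) with a small time step (forward Euler) for a toy
# 3-state aging element: 0 = good, 1 = degraded, 2 = failed (absorbing).
def Q(t):
    lam01 = 0.2 + 0.05 * t    # aging: degradation rates grow with time (assumed)
    lam12 = 0.1 + 0.05 * t
    return [[-lam01, lam01, 0.0],
            [0.0, -lam12, lam12],
            [0.0, 0.0, 0.0]]

def state_probs(t_end, dt=1e-3):
    p = [1.0, 0.0, 0.0]               # start in the best state
    for k in range(int(t_end / dt)):
        q = Q(k * dt)
        p = [p[j] + dt * sum(p[i] * q[i][j] for i in range(3))
             for j in range(3)]
    return p

p = state_probs(2.0)
print(p, sum(p))   # remains a probability distribution (sums to ~1)
```

Curves of p over t feed directly into performance and reliability measures, since each state carries a performance level against which customer demand is compared.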
Interconnecting Multidisciplinary Data Infrastructures: From Federation to Brokering Framework
NASA Astrophysics Data System (ADS)
Nativi, S.
2014-12-01
Standardization and federation activities have played an essential role in advancing interoperability at the disciplinary and cross-disciplinary level. However, they have proven insufficient to resolve important interoperability challenges, including disciplinary heterogeneity, cross-organization diversities and cultural differences. Significant international initiatives like GEOSS, IODE, and CEOS demonstrated that a federation system dealing with a global and multi-disciplinary domain turns out to be rather complex, further raising the already high entry barriers for both providers and users. In particular, GEOSS demonstrated that standardization and federation actions must be accompanied and complemented by a brokering approach. Brokering architectures and their implementing technologies are able to realize an effective level of interoperability among multi-disciplinary systems, lowering the entry barriers for both data providers and users. This presentation will discuss the brokering philosophy as a complementary approach to standardization and federation for interconnecting existing and heterogeneous infrastructures and systems. The GEOSS experience will be analyzed in particular.
System and Method for Multi-Wavelength Optical Signal Detection
NASA Technical Reports Server (NTRS)
McGlone, Thomas D. (Inventor)
2017-01-01
The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.
Optimization as a Tool for Consistency Maintenance in Multi-Resolution Simulation
NASA Technical Reports Server (NTRS)
Drewry, Darren T; Reynolds, Paul F, Jr; Emanuel, William R
2006-01-01
The need for new approaches to the consistent simulation of related phenomena at multiple levels of resolution is great. While many fields of application would benefit from a complete and approachable solution to this problem, such solutions have proven extremely difficult. We present a multi-resolution simulation methodology that uses numerical optimization as a tool for maintaining external consistency between models of the same phenomena operating at different levels of temporal and/or spatial resolution. Our approach follows from previous work in the disparate fields of inverse modeling and spacetime constraint-based animation. As a case study, our methodology is applied to two environmental models of forest canopy processes that make overlapping predictions under unique sets of operating assumptions, and which execute at different temporal resolutions. Experimental results are presented and future directions are addressed.
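A minimal instance of the consistency-maintenance idea: choose a coarse model's free parameter by optimization so that its output best matches the fine model's output aggregated to the coarse resolution. Both toy models (an hourly sinusoidal "fine" process and a daily "coarse" one) and the least-squares objective are illustrative assumptions, not the forest canopy models of the paper.

```python
# External consistency via optimization: fit the daily model's scale so it
# matches the hourly model aggregated to days (closed-form least squares here;
# a general numerical optimizer would play the same role).
import math

hours = range(24 * 5)
fine = [10.0 + 5.0 * math.sin(2 * math.pi * (h % 24) / 24.0) for h in hours]

# aggregate the fine (hourly) output to the coarse (daily) resolution
daily = [sum(fine[d * 24:(d + 1) * 24]) / 24.0 for d in range(5)]

coarse_shape = [1.0] * 5       # coarse model's unscaled prediction (assumed)

# minimize sum_d (a * coarse_shape[d] - daily[d])^2 over the scale a
a = sum(c * y for c, y in zip(coarse_shape, daily)) / \
    sum(c * c for c in coarse_shape)

resid = max(abs(a * c - y) for c, y in zip(coarse_shape, daily))
print(a, resid)   # the optimized coarse model reproduces the aggregated fine one
```

With richer models the closed form disappears and a numerical optimizer takes over, which is the role optimization plays in the methodology above.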
Improved NSGA model for multi objective operation scheduling and its evaluation
NASA Astrophysics Data System (ADS)
Li, Weining; Wang, Fuyu
2017-09-01
Reasonable operation scheduling can increase the income of a hospital and improve patient satisfaction. In this paper, a multi-objective operation scheduling method based on an improved NSGA algorithm is used to shorten operation time, reduce operation cost and lower operation risk. A multi-objective optimization model is established for flexible operation scheduling, the Pareto solution set is obtained through MATLAB simulation, and the resulting data are standardized. The optimal scheduling scheme is then selected using a combined entropy-weight and TOPSIS method. The results show that the algorithm is feasible for solving the multi-objective operation scheduling problem and provides a reference for hospital operation scheduling.
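The entropy-weight + TOPSIS selection step, used above to pick one schedule from the Pareto set, can be sketched as follows. The decision matrix (3 candidate schedules scored on time, cost and risk, all treated as cost-type criteria) is hypothetical.

```python
# Entropy weighting followed by TOPSIS ranking of candidate schedules.
import math

X = [[4.0, 100.0, 0.2],    # schedule A: time, cost, risk (lower is better)
     [5.0, 80.0, 0.3],     # schedule B
     [6.0, 90.0, 0.1]]     # schedule C
m, n = len(X), len(X[0])

# entropy weights: criteria whose values spread more get larger weights
P = [[X[i][j] / sum(X[k][j] for k in range(m)) for j in range(n)]
     for i in range(m)]
E = [-sum(P[i][j] * math.log(P[i][j]) for i in range(m)) / math.log(m)
     for j in range(n)]
w = [(1 - E[j]) / sum(1 - E[k] for k in range(n)) for j in range(n)]

# weighted vector normalization, then distance to ideal / anti-ideal points
V = [[X[i][j] / math.sqrt(sum(X[k][j] ** 2 for k in range(m))) * w[j]
      for j in range(n)] for i in range(m)]
ideal = [min(V[i][j] for i in range(m)) for j in range(n)]   # cost criteria: min
anti = [max(V[i][j] for i in range(m)) for j in range(n)]

def closeness(row):
    dp = math.sqrt(sum((v - b) ** 2 for v, b in zip(row, ideal)))
    dm = math.sqrt(sum((v - a) ** 2 for v, a in zip(row, anti)))
    return dm / (dp + dm)

scores = [closeness(V[i]) for i in range(m)]
print(scores, max(range(m), key=scores.__getitem__))   # index of chosen schedule
```

The schedule with the highest closeness coefficient (nearest the ideal, farthest from the anti-ideal) is selected.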
A novel visual saliency detection method for infrared video sequences
NASA Astrophysics Data System (ADS)
Wang, Xin; Zhang, Yuzhen; Ning, Chen
2017-12-01
Infrared video applications such as target detection and recognition, moving target tracking, and so forth can benefit greatly from visual saliency detection, which is essentially a method to automatically localize the "important" content in videos. In this paper, a novel visual saliency detection method for infrared video sequences is proposed. Specifically, for infrared video saliency detection, both spatial saliency and temporal saliency are considered. For spatial saliency, we adopt a mutual consistency-guided spatial cues combination-based method to capture the regions with obvious luminance contrast and contour features. For temporal saliency, a multi-frame symmetric difference approach is proposed to discriminate salient moving regions of interest from background motions. Then, the spatial saliency and temporal saliency are combined to compute the spatiotemporal saliency using an adaptive fusion strategy. Besides, to highlight the spatiotemporal salient regions uniformly, a multi-scale fusion approach is embedded into the spatiotemporal saliency model. Finally, a Gestalt theory-inspired optimization algorithm is designed to further improve the reliability of the final saliency map. Experimental results demonstrate that our method outperforms many state-of-the-art saliency detection approaches for infrared videos under various backgrounds.
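The multi-frame symmetric difference for temporal saliency can be illustrated in one dimension. A pixel is flagged as moving only where the current frame differs from both its past and future neighbours, which suppresses one-off background changes. The 1-D "frames" and the threshold are toy assumptions (real infrared frames are 2-D intensity arrays).

```python
# Symmetric frame difference: require large change both backward and forward.
prev_f = [10, 10, 10, 10, 10]
curr_f = [10, 10, 80, 10, 10]   # a bright target has entered at index 2
next_f = [10, 10, 10, 10, 10]

def sym_diff(prev, curr, nxt, thresh=20):
    d1 = [abs(c - p) for c, p in zip(curr, prev)]    # backward difference
    d2 = [abs(c - n) for c, n in zip(curr, nxt)]     # forward difference
    # both differences must exceed the threshold for a pixel to be salient
    return [1 if a > thresh and b > thresh else 0 for a, b in zip(d1, d2)]

mask = sym_diff(prev_f, curr_f, next_f)
print(mask)
```

A simple two-frame difference would also fire on pixels the target has just left; requiring agreement of both differences keeps only the target's current position.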
A Parallel Approach To Optimum Actuator Selection With a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Rogers, James L.
2000-01-01
Recent discoveries in smart technologies have created a variety of aerodynamic actuators which have great potential to enable entirely new approaches to aerospace vehicle flight control. For a revolutionary concept such as a seamless aircraft with no moving control surfaces, there is a large set of candidate locations for placing actuators, resulting in a substantially larger number of combinations to examine in order to find an optimum placement satisfying the mission requirements. The placement of actuators on a wing determines the control effectiveness of the airplane. One approach to placement maximizes the moments about the pitch, roll, and yaw axes, while minimizing the coupling. Genetic algorithms have been instrumental in achieving good solutions to discrete optimization problems, such as the actuator placement problem. As a proof of concept, a genetic algorithm has been developed to find the minimum number of actuators required to provide uncoupled pitch, roll, and yaw control for a simplified, untapered, unswept wing model. To find the optimum placement by searching all possible combinations would require 1,100 hours. Formulating the problem as a multi-objective problem and modifying it to take advantage of the parallel processing capabilities of a multi-processor computer reduces the optimization time to 22 hours.
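A placement search of this kind can be sketched with a tiny genetic algorithm. Everything below is a toy stand-in for the real problem: a bitstring marks which candidate locations carry an actuator, the per-location "moment contributions" are made up, and the fitness simply rewards reaching a target control authority with as few actuators as possible (the paper's coupled pitch/roll/yaw objectives are far richer).

```python
# Toy GA: evolve a bitstring choosing actuator locations.
import random

random.seed(1)
moments = [3, 1, 4, 1, 5, 9, 2, 6]      # hypothetical per-location authority
TARGET = 12                              # required total authority (assumed)

def fitness(bits):
    total = sum(m for m, b in zip(moments, bits) if b)
    if total < TARGET:                   # infeasible placements: heavy penalty
        return -1000 + total
    return -sum(bits)                    # feasible: fewer actuators is better

def evolve(pop_size=30, gens=60, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in moments] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]    # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(moments))      # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))   # a feasible placement using few actuators
```

Evaluating the population's fitnesses is embarrassingly parallel, which is the property the paper exploits to cut the optimization time on a multi-processor machine.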
Advances in combined endoscopic fluorescence confocal microscopy and optical coherence tomography
NASA Astrophysics Data System (ADS)
Risi, Matthew D.
Confocal microendoscopy provides real-time, high-resolution, cellular-level images via a minimally invasive procedure. Results from an ongoing clinical study to detect ovarian cancer with a novel confocal fluorescent microendoscope are presented. As an imaging modality, confocal fluorescence microendoscopy typically requires exogenous fluorophores, has a relatively limited penetration depth (100 μm), and often employs specialized aperture configurations to achieve real-time imaging in vivo. Two primary research directions designed to overcome these limitations and improve diagnostic capability are presented. Ideal confocal imaging performance is obtained with scanning point illumination and a confocal aperture, but this approach is often unsuitable for real-time, in vivo biomedical imaging. By scanning a slit aperture in one direction, image acquisition speeds are greatly increased, but at the cost of a reduction in image quality. The design, implementation, and experimental verification of a custom multi-point-scanning modification to a slit-scanning multi-spectral confocal microendoscope is presented. This new design improves the axial resolution while maintaining real-time imaging rates. In addition, the multi-point aperture geometry greatly reduces the effects of tissue scatter on imaging performance. Optical coherence tomography (OCT) has seen wide acceptance and FDA approval as a technique for ophthalmic retinal imaging, and has been adapted for endoscopic use. As a minimally invasive imaging technique, it provides morphological characteristics of tissues at a cellular level without requiring the use of exogenous fluorophores. OCT is capable of imaging deeper into biological tissue (~1-2 mm) than confocal fluorescence microscopy. A theoretical analysis of the use of a fiber-bundle in spectral-domain OCT systems is presented.
The fiber-bundle enables a flexible endoscopic design and provides fast, parallelized acquisition of the optical coherence tomography data. However, the multi-mode characteristic of the fibers in the fiber-bundle affects the depth sensitivity of the imaging system. A description of light interference in a multi-mode fiber is presented along with numerical simulations and experimental studies to illustrate the theoretical analysis.
NASA Astrophysics Data System (ADS)
Chang Chien, Kuang-Che; Fetita, Catalin; Brillet, Pierre-Yves; Prêteux, Françoise; Chang, Ruey-Feng
2009-02-01
Multi-detector computed tomography (MDCT) has high accuracy and specificity for volumetrically capturing serial images of the lung. It increases the capability of computerized classification for lung tissue in medical research. This paper proposes a three-dimensional (3D) automated approach based on mathematical morphology and fuzzy logic for quantifying and classifying interstitial lung diseases (ILDs) and emphysema. The proposed methodology is composed of several stages: (1) an image multi-resolution decomposition scheme based on a 3D morphological filter is used to detect and analyze the different density patterns of the lung texture. Then, (2) for each pattern in the multi-resolution decomposition, six features are computed, for which fuzzy membership functions define a probability of association with a pathology class. Finally, (3) for each pathology class, the probabilities are combined according to the weight assigned to each membership function, and two threshold values are used to decide the final class of the pattern. The proposed approach was tested on 10 MDCT cases and the classification accuracy was: emphysema: 95%, fibrosis/honeycombing: 84% and ground glass: 97%.
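Stage (3), the weighted combination of fuzzy memberships followed by a two-threshold decision, can be sketched for one pattern and one class. The feature values, membership functions, weights and thresholds below are all illustrative assumptions, not those of the paper.

```python
# Weighted fuzzy membership combination with a two-threshold decision.
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

features = {"density": -850.0, "size": 4.0}   # hypothetical pattern features

# per-feature membership functions and weights for one class (assumed)
emphysema = {
    "density": (lambda x: tri(x, -1000.0, -900.0, -800.0), 0.7),
    "size": (lambda x: tri(x, 0.0, 5.0, 10.0), 0.3),
}

score = sum(w * mf(features[name]) for name, (mf, w) in emphysema.items())

T_LOW, T_HIGH = 0.3, 0.6                      # decision thresholds (assumed)
label = ("emphysema" if score >= T_HIGH
         else "uncertain" if score >= T_LOW else "other")
print(round(score, 3), label)
```

In the full method a score like this is computed for each pathology class, and the thresholds arbitrate between a confident class assignment and an indeterminate pattern.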
Extension of the HAL QCD approach to inelastic and multi-particle scatterings in lattice QCD
NASA Astrophysics Data System (ADS)
Aoki, S.
We extend the HAL QCD approach, with which potentials between two hadrons can be obtained in QCD at energies below inelastic thresholds, to inelastic and multi-particle scatterings. We first derive asymptotic behaviors of the Nambu-Bethe-Salpeter (NBS) wave function at large space separations for systems with more than 2 particles, in terms of the on-shell $T$-matrix constrained by the unitarity of quantum field theories. We show that its asymptotic behavior contains phase shifts and mixing angles of $n$-particle scatterings. This property is one of the essential ingredients of the HAL QCD scheme to define "potential" from the NBS wave function in quantum field theories such as QCD. We next construct energy-independent but non-local potentials above inelastic thresholds, in terms of these NBS wave functions. We demonstrate the existence of energy-independent coupled-channel potentials with a non-relativistic approximation, where momenta of all particles are small compared with their own masses. Combining these two results, we can employ the HAL QCD approach also to investigate inelastic and multi-particle scatterings.
Uddin, Shahadat
2016-02-04
A patient-centric care network can be defined as a network among a group of healthcare professionals who provide treatments to common patients. Various multi-level attributes of the members of this network have a substantial influence on its perceived level of performance. In order to assess the impact of different multi-level attributes of patient-centric care networks on healthcare outcomes, this study first captured patient-centric care networks for 85 hospitals using a health insurance claim dataset. From these networks, this study then constructed physician collaboration networks based on the concept of a patient-sharing network among physicians. A multi-level regression model was then developed to explore the impact of different attributes, organised at two levels, on hospitalisation cost and hospital length of stay. For the Level-1 model, the average visit per physician significantly predicted both hospitalisation cost and hospital length of stay. The number of different physicians significantly predicted only the hospitalisation cost, which was significantly moderated by age, gender and comorbidity score of patients. All Level-1 findings showed significant variance across physician collaboration networks having different community structure and density. These findings could be utilised as a reflective measure by healthcare decision makers. Moreover, healthcare managers could consider them in developing effective healthcare environments.
Wang, Huilin; Wang, Mingjun; Tan, Hao; Li, Yuan; Zhang, Ziding; Song, Jiangning
2014-01-01
X-ray crystallography is the primary approach to solve the three-dimensional structure of a protein. However, a major bottleneck of this method is the failure of multi-step experimental procedures to yield diffraction-quality crystals, including sequence cloning, protein material production, purification, crystallization and ultimately, structural determination. Accordingly, prediction of the propensity of a protein to successfully undergo these experimental procedures based on the protein sequence may help narrow down laborious experimental efforts and facilitate target selection. A number of bioinformatics methods based on protein sequence information have been developed for this purpose. However, our knowledge on the important determinants of propensity for a protein sequence to produce high diffraction-quality crystals remains largely incomplete. In practice, most of the existing methods display poorer performance when evaluated on larger and updated datasets. To address this problem, we constructed an up-to-date dataset as the benchmark, and subsequently developed a new approach termed 'PredPPCrys' using the support vector machine (SVM). Using a comprehensive set of multifaceted sequence-derived features in combination with a novel multi-step feature selection strategy, we identified and characterized the relative importance and contribution of each feature type to the prediction performance of five individual experimental steps required for successful crystallization. The resulting optimal candidate features were used as inputs to build the first-level SVM predictor (PredPPCrys I). Next, prediction outputs of PredPPCrys I were used as the input to build second-level SVM classifiers (PredPPCrys II), which led to significantly enhanced prediction performance. Benchmarking experiments indicated that our PredPPCrys method outperforms most existing procedures on both up-to-date and previous datasets. 
In addition, the predicted crystallization targets of currently non-crystallizable proteins were provided as compendium data, which are anticipated to facilitate target selection and design for the worldwide structural genomics consortium. PredPPCrys is freely available at http://www.structbioinfor.org/PredPPCrys.
An integrated multi-electrode-optrode array for in vitro optogenetics
Welkenhuysen, Marleen; Hoffman, Luis; Luo, Zhengxiang; De Proft, Anabel; Van den Haute, Chris; Baekelandt, Veerle; Debyser, Zeger; Gielen, Georges; Puers, Robert; Braeken, Dries
2016-01-01
Modulation of a group of cells or a tissue needs to be very precise in order to exercise effective control over the cell population under investigation. Optogenetic tools have already proven to be of great value in the study of neuronal circuits and in neuromodulation. Ideally, they should permit very accurate resolution, preferably down to the single-cell level. Further, to address a spatially distributed sample, multiple independently addressable optical outputs should be present. Current techniques fail to fulfil at least one of these requirements. In addition, it is of interest to directly monitor feedback of the modulation by electrically registering the activity of the stimulated cells. Here, we present the fabrication and characterization of a fully integrated silicon-based multi-electrode-optrode array (MEOA) for in vitro optogenetics. We demonstrate that this device allows for artifact-free electrical recording. Moreover, the MEOA was used to reliably elicit spiking activity from ChR2-transduced neurons. Thanks to its single-cell resolution stimulation capability, we could determine spatial and temporal activation patterns and spike latencies of the neuronal network. This integrated approach to multi-site combined optical stimulation and electrical recording significantly advances today's tool set for neuroscientists in their search to unravel neuronal network dynamics. PMID:26832455
NASA Astrophysics Data System (ADS)
Wałach, Daniel; Sagan, Joanna; Gicala, Magdalena
2017-10-01
The paper presents an environmental and economic analysis of material solutions for a multi-level garage. The construction project considered a reinforced concrete structure built either with ordinary concrete or with high-performance concrete (HPC). The use of HPC allowed a significant reduction of reinforcing steel, mainly in compression elements (columns) of the structure. The analysis includes elements of the Integrated Life Cycle Design (ILCD) methodology. Through a multi-criteria analysis based on established weights for the economic and environmental parameters, three solutions were evaluated and compared within the material production phase (information modules A1-A3).
A Multi-Resolution Approach for an Automated Fusion of Different Low-Cost 3D Sensors
Dupuis, Jan; Paulus, Stefan; Behmann, Jan; Plümer, Lutz; Kuhlmann, Heiner
2014-01-01
The 3D acquisition of object structures has become a common technique in many fields of work, e.g., industrial quality management, cultural heritage or crime scene documentation. The requirements on the measuring devices are versatile, because spacious scenes have to be imaged with a high level of detail for selected objects. Thus, the used measuring systems are expensive and require an experienced operator. With the rise of low-cost 3D imaging systems, their integration into the digital documentation process is possible. However, common low-cost sensors have the limitation of a trade-off between range and accuracy, providing either a low resolution of single objects or a limited imaging field. Therefore, the use of multiple sensors is desirable. We show the combined use of two low-cost sensors, the Microsoft Kinect and the David laserscanning system, to achieve low-resolved scans of the whole scene and a high level of detail for selected objects, respectively. Afterwards, the high-resolved David objects are automatically assigned to their corresponding Kinect object by the use of surface feature histograms and SVM-classification. The corresponding objects are fitted using an ICP-implementation to produce a multi-resolution map. The applicability is shown for a fictional crime scene and the reconstruction of a ballistic trajectory. PMID:24763255
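The ICP fitting step mentioned in the abstract can be sketched in a few lines. The point sets, the small misalignment, and the brute-force nearest-neighbour matching below are illustrative assumptions, not the authors' pipeline; the rigid-transform update uses the standard Kabsch/SVD solution.

```python
# Toy ICP: align a rotated, shifted copy of a point set back onto the
# original by alternating nearest-neighbour matching and a rigid update.
import numpy as np

def best_rigid_transform(A, B):
    """Least-squares rotation R and translation t with R @ a + t ~= b."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = (U @ Vt).T
    return R, cb - R @ ca

def icp(src, dst, iters=30):
    P = src.copy()
    for _ in range(iters):
        d2 = ((P[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]   # nearest-neighbour correspondences
        R, t = best_rigid_transform(P, matched)
        P = P @ R.T + t
    return P

rng = np.random.default_rng(1)
dst = rng.normal(size=(25, 3))                       # "scene" points
theta = 0.2                                          # small misalignment
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
src = dst @ Rz.T + np.array([0.1, -0.05, 0.05])      # misaligned scan
aligned = icp(src, dst)
err_before = np.abs(src - dst).mean()
err_after = np.abs(aligned - dst).mean()
print(err_before, err_after)
```

Real multi-resolution fusion would first coarsely pre-align the high-resolution object to its candidate, since plain ICP only converges from small initial misalignments like the one constructed here.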
Adaptation of a multi-resolution adversarial model for asymmetric warfare
NASA Astrophysics Data System (ADS)
Rosenberg, Brad; Gonsalves, Paul G.
2006-05-01
Recent military operations have demonstrated the use by adversaries of non-traditional or asymmetric military tactics to offset US military might. Rogue nations with links to trans-national terrorists have created a highly unpredictable and potentially dangerous environment for US military operations. These threats are characterized by extremist beliefs and are global in nature, non-state oriented, and highly networked and adaptive, making such adversaries less vulnerable to conventional military approaches. Additionally, US forces must contend with more traditional state-based threats that continue to evolve their military fighting strategies and capabilities. What is needed are solutions to assist our forces in prosecuting operations against these diverse threat types and their atypical strategies and tactics. To address this issue, we present a system that allows for the adaptation of a multi-resolution adversarial model. The developed model can then be used to support both training and simulation-based acquisition requirements to effectively respond to such an adversary. The described system produces a combined adversarial model by merging behavior modeling at the individual level with aspects at the group and organizational level via network analysis. Adaptation of this adversarial model is performed by means of an evolutionary algorithm to build a suitable model for the chosen adversary.
NASA Astrophysics Data System (ADS)
Liu, Likun
2018-01-01
In the field of remote sensing image processing, image segmentation is a preliminary step for subsequent analysis, semi-automatic human interpretation, and fully automatic machine recognition and learning. Since 2000, object-oriented remote sensing image processing has prevailed; its core is the Fractal Net Evolution Approach (FNEA) multi-scale segmentation algorithm. This paper researches and improves that algorithm: existing segmentation algorithms are analyzed and the watershed algorithm is selected as the optimal initialization. The algorithm is then modified by adjusting an area parameter and further combining the area parameter with a heterogeneity parameter. Several experiments are carried out to show that the modified FNEA algorithm achieves a better segmentation result than a traditional pixel-based method (an FCM algorithm based on neighborhood information) and than the plain combination of FNEA and watershed.
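At the heart of FNEA-style multi-scale segmentation is a merge cost: two adjacent segments are fused only if the increase in weighted heterogeneity stays below a scale parameter. The sketch below is a simplified illustration of that cost, not the paper's modified algorithm: the shape term is omitted, the weight is invented, and "segments" are just arrays of multi-band pixel values.

```python
# Simplified FNEA-style merge cost: area-weighted increase in spectral
# heterogeneity caused by fusing two segments. Shape heterogeneity omitted.
import numpy as np

def color_h(pixels):
    """Area-weighted spectral heterogeneity: n * sum of per-band std devs."""
    return len(pixels) * np.std(pixels, axis=0).sum()

def merge_cost(seg1, seg2, w_color=0.7):
    merged = np.vstack([seg1, seg2])
    dh_color = color_h(merged) - (color_h(seg1) + color_h(seg2))
    return w_color * dh_color

rng = np.random.default_rng(0)
similar_a = rng.normal(0.0, 0.1, size=(40, 3))   # 3-band pixel values
similar_b = rng.normal(0.0, 0.1, size=(40, 3))   # spectrally similar segment
different = rng.normal(5.0, 0.1, size=(40, 3))   # spectrally distant segment

print(merge_cost(similar_a, similar_b), merge_cost(similar_a, different))
```

Merging two spectrally similar segments barely increases heterogeneity, while merging dissimilar ones is heavily penalized; a scale parameter thresholding this cost controls the object size.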
Optimisation of Combined Cycle Gas Turbine Power Plant in Intraday Market: Riga CHP-2 Example
NASA Astrophysics Data System (ADS)
Ivanova, P.; Grebesh, E.; Linkevics, O.
2018-02-01
This research evaluates the influence on the generation portfolio of optimising a combined cycle gas turbine unit according to the previously developed EM & OM approach for use in the intraday market. The portfolio consists of two combined cycle gas turbine units. The introduced evaluation algorithm preserves the power and heat balance before and after application of the EM & OM approach while changing the generation profile of the units; its aim is profit maximisation of the generation portfolio. The evaluation algorithm is implemented in the multi-paradigm numerical computing environment MATLAB on the example of Riga CHP-2. The results show that use of the EM & OM approach in the intraday market can be profitable or unprofitable, depending on the initial state of the generation units in the intraday market and on the content of the generation portfolio.
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei
2017-09-01
This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically-oriented gray level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from landslide training data. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
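The fuzzy operators compared in the abstract have simple standard definitions: AND is the minimum of the membership values, OR the maximum, and MEAN the arithmetic mean. A small illustration (the membership values below are invented, not taken from the study):

```python
# Combining per-criterion fuzzy memberships for one candidate landslide
# object with the three most common fuzzy operators.
import numpy as np

# hypothetical memberships, e.g. from slope, GLCM texture, shape, spectra
memberships = np.array([0.9, 0.7, 0.8, 0.95])

fuzzy_and = memberships.min()     # conservative: all criteria must agree
fuzzy_or = memberships.max()      # permissive: any single criterion suffices
fuzzy_mean = memberships.mean()   # compromise between the two extremes

print(fuzzy_and, fuzzy_mean, fuzzy_or)
```

The ordering min <= mean <= max always holds, which is why the AND operator produces the most conservative (and here the most accurate) landslide maps, while OR tends to over-detect.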
NASA Astrophysics Data System (ADS)
D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco P.; Pasquariello, Guido
2018-03-01
High-resolution, remotely sensed images of the Earth's surface have proven helpful in producing detailed flood maps, thanks to their synoptic overview of the flooded area and frequent revisits. However, flood scenarios can be complex situations, requiring the integration of different data in order to provide accurate and robust flood information. Several processing approaches have recently been proposed to efficiently combine and integrate heterogeneous information sources. In this paper, we introduce DAFNE, a Matlab®-based, open source toolbox conceived to produce flood maps from remotely sensed and other ancillary information through a data fusion approach. DAFNE is based on Bayesian networks and is composed of several independent modules, each performing a different task. Multi-temporal and multi-sensor data can be easily handled, with the possibility of following the evolution of an event through multi-temporal output flood maps. Each DAFNE module can be easily modified or upgraded to meet different user needs. The DAFNE suite is presented together with an example of its application.
NASA Astrophysics Data System (ADS)
Kebede, Abiy S.; Nicholls, Robert J.; Allan, Andrew; Arto, Inaki; Cazcarro, Ignacio; Fernandes, Jose A.; Hill, Chris T.; Hutton, Craig W.; Kay, Susan; Lawn, Jon; Lazar, Attila N.; Whitehead, Paul W.
2017-04-01
Coastal deltas are home to over 500 million people globally, and they have been identified as one of the most vulnerable coastal environments of the 21st century. They are susceptible to multiple climatic (e.g., sea-level rise, storm surges, changes in temperature and precipitation) and socio-economic (e.g., human-induced subsidence, population and urbanisation changes, GDP growth) drivers of change. These drivers also operate at multiple scales, ranging from local to global and short- to long-term. This highlights the complex challenges deltas face in terms of both their long-term sustainability and the well-being of their residents and the health of the ecosystems that support the livelihoods of large (often very poor) populations under uncertain, changing conditions. A holistic understanding of these challenges and of the potential impacts of future climate and socio-economic changes is central to devising robust adaptation policies. Scenario analysis has long been identified as a strategic management tool for exploring future climate change and its impacts to support robust decision-making under uncertainty. This work presents the overall scenario framework, methodology, and processes adopted for the development of scenarios in the DECCMA* project. DECCMA is analysing the future of three deltas in South Asia and West Africa: (i) the Ganges-Brahmaputra-Meghna (GBM) delta (Bangladesh/India), (ii) the Mahanadi delta (India), and (iii) the Volta delta (Ghana). This includes comparisons between these three deltas.
Hence, the scenario framework comprises a multi-scale hybrid approach, with six levels of scenario considerations: (i) global (climate change, e.g., sea-level rise, temperature change; and socio-economic assumptions, e.g., population and urbanisation changes, GDP growth); (ii) regional catchments (e.g., river flow modelling); (iii) regional seas (e.g., fisheries modelling); (iv) regional politics (e.g., transboundary disputes); (v) national (e.g., socio-economic factors); and (vi) delta-scale (e.g., future adaptation and migration policies) scenarios. The framework includes and combines expert-based and participatory approaches and provides improved specification of the role of scenarios in analysing the future state of adaptation and migration across the three deltas. It facilitates the development of appropriate and consistent endogenous and exogenous scenario futures: (i) at the delta scale, (ii) across all deltas, and (iii) within wider climate change, environmental change, and adaptation & migration research.
Key words: Coastal deltas, sea-level rise, migration and adaptation, multi-scale scenarios, participatory approach
*The DECCMA (Deltas, Vulnerability & Climate Change: Migration & Adaptation) project is part of the Collaborative Adaptation Research Initiative in Africa and Asia (CARIAA), with financial support from the UK Government's Department for International Development (DFID) and the International Development Research Centre (IDRC), Canada.
Non-Mutually Exclusive Deep Neural Network Classifier for Combined Modes of Bearing Fault Diagnosis.
Duong, Bach Phi; Kim, Jong-Myon
2018-04-07
The simultaneous occurrence of various types of defects in bearings makes their diagnosis more challenging owing to the resultant complexity of the constituent parts of the acoustic emission (AE) signals. To address this issue, a new approach is proposed in this paper for the detection of multiple combined faults in bearings. The proposed methodology uses a deep neural network (DNN) architecture to effectively diagnose the combined defects. The DNN structure is based on the stacked denoising autoencoder non-mutually exclusive classifier (NMEC) method for combined modes. The NMEC-DNN is trained using data for a single fault and it classifies both single faults and multiple combined faults. The results of experiments conducted on AE data collected through an experimental test-bed demonstrate that the DNN achieves good classification performance with a maximum accuracy of 95%. The proposed method is compared with a multi-class classifier based on support vector machines (SVMs). The NMEC-DNN yields better diagnostic performance in comparison to the multi-class classifier based on SVM. The NMEC-DNN reduces the number of necessary data collections and improves the bearing fault diagnosis performance.
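The "non-mutually exclusive" idea in the abstract amounts to using independent sigmoid output units (multi-label classification) instead of a softmax, so a model trained only on single faults can activate several outputs at once for a combined fault. The sketch below illustrates only that output-layer idea with a tiny numpy logistic model on synthetic "AE features"; it is not the paper's stacked denoising autoencoder.

```python
# Multi-label (non-mutually exclusive) outputs: one independent sigmoid
# unit per fault type, trained on synthetic single-fault style labels.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# two ground-truth "fault directions" in feature space (orthogonal by design)
W_true = np.array([[2.0, 0.0],
                   [0.0, 2.0],
                   [1.0, 0.0],
                   [0.0, 1.0],
                   [0.5, 0.0],
                   [0.0, 0.5]])

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))            # synthetic AE-derived features
Y = (X @ W_true > 0).astype(float)       # independent label per fault type

# gradient descent on the per-output logistic loss
W = np.zeros((6, 2))
for _ in range(500):
    W -= 0.1 * X.T @ (sigmoid(X @ W) - Y) / len(X)

# a sample exciting both fault signatures at once: both outputs fire
x_combined = W_true[:, 0] + W_true[:, 1]
probs = sigmoid(x_combined @ W)
print(probs)
```

With a softmax head the two outputs would compete and sum to one; with sigmoids both probabilities can be high simultaneously, which is what lets single-fault training generalize to combined faults.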
Design of supply chain in fuzzy environment
NASA Astrophysics Data System (ADS)
Rao, Kandukuri Narayana; Subbaiah, Kambagowni Venkata; Singh, Ganja Veera Pratap
2013-05-01
Nowadays, customer expectations are increasing and organizations are prone to operate in an uncertain environment. Under this uncertainty, the ultimate success of the firm depends on its ability to integrate business processes among supply chain partners. Supply chain management emphasizes cross-functional links to improve the competitive strategy of organizations. Companies are now moving from decoupled decision processes towards more integrated design and control of their components to achieve strategic fit. In this paper, a new approach is developed to design a multi-echelon, multi-facility, multi-product supply chain in a fuzzy environment. At the strategic level, a mixed integer programming problem is formulated through fuzzy goal programming, with supply chain cost and volume flexibility as fuzzy goals; these fuzzy goals are aggregated using the minimum operator. At the tactical level, a continuous review policy for controlling raw material inventories in the supplier echelon and finished product inventories in the plant and distribution center echelons is considered via fuzzy goals, and a non-linear programming model is formulated through fuzzy goal programming using the minimum operator. The proposed approach is illustrated with a numerical example.
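Fuzzy goal programming with the minimum operator maximizes the smallest goal membership: find the decision that makes the worst-satisfied goal as satisfied as possible. A toy one-variable illustration (the membership shapes and bounds below are invented for the sketch, not the paper's model):

```python
# Max-min aggregation of two fuzzy goals over a single decision variable x
# (e.g. a production quantity), solved by a simple grid scan.
import numpy as np

def mu_cost(x):    # cost goal: fully satisfied at low x, fades out by x = 100
    return np.clip((100 - x) / 60, 0, 1)

def mu_flex(x):    # volume-flexibility goal: improves as more capacity is used
    return np.clip((x - 20) / 50, 0, 1)

xs = np.linspace(0, 120, 1201)
lam = np.minimum(mu_cost(xs), mu_flex(xs))   # minimum-operator aggregation
best = xs[lam.argmax()]
print(f"best x = {best:.1f}, overall satisfaction = {lam.max():.3f}")
```

The optimum sits where the two membership functions cross, which is exactly the balance the minimum operator enforces; in the paper's setting the same idea is embedded as "maximize lambda subject to each membership >= lambda" inside a mixed integer program.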
Gene prioritization and clustering by multi-view text mining
2010-01-01
Background Text mining has become a useful tool for biologists trying to understand the genetics of diseases. In particular, it can help identify the most interesting candidate genes for a disease for further experimental analysis. Many text mining approaches have been introduced, but the effect of disease-gene identification varies in different text mining models. Thus, the idea of incorporating more text mining models may be beneficial to obtain more refined and accurate knowledge. However, how to effectively combine these models still remains a challenging question in machine learning. In particular, it is a non-trivial issue to guarantee that the integrated model performs better than the best individual model. Results We present a multi-view approach to retrieve biomedical knowledge using different controlled vocabularies. These controlled vocabularies are selected on the basis of nine well-known bio-ontologies and are applied to index the vast amounts of gene-based free-text information available in the MEDLINE repository. The text mining result specified by a vocabulary is considered as a view and the obtained multiple views are integrated by multi-source learning algorithms. We investigate the effect of integration in two fundamental computational disease gene identification tasks: gene prioritization and gene clustering. The performance of the proposed approach is systematically evaluated and compared on real benchmark data sets. In both tasks, the multi-view approach demonstrates significantly better performance than other comparing methods. Conclusions In practical research, the relevance of specific vocabulary pertaining to the task is usually unknown. In such case, multi-view text mining is a superior and promising strategy for text-based disease gene identification. PMID:20074336
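The late-integration flavour of multi-view learning described above can be reduced to a very small example: each controlled vocabulary yields a view-specific relevance score per gene, and views are combined before ranking. The gene names, views, and scores below are invented for illustration; the actual paper uses kernel-based multi-source learning algorithms rather than plain averaging.

```python
# Equal-weight late integration of three "views" for gene prioritization.
import numpy as np

genes = ["TP53", "BRCA1", "GENEX", "GENEY"]      # GENEX/GENEY: placeholders
views = {                     # per-view relevance of each gene (invented)
    "GO":   np.array([0.8, 0.9, 0.3, 0.2]),
    "MeSH": np.array([0.9, 0.7, 0.4, 0.1]),
    "OMIM": np.array([0.75, 0.8, 0.2, 0.3]),
}

combined = np.mean(list(views.values()), axis=0)  # multi-source integration
ranking = [genes[i] for i in np.argsort(-combined)]
print(ranking)
```

A gene that scores moderately well in every vocabulary can outrank one that excels in a single view, which is the intended robustness benefit when the relevance of individual vocabularies is unknown in advance.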
NASA Astrophysics Data System (ADS)
Balaji, V.; Benson, Rusty; Wyman, Bruce; Held, Isaac
2016-10-01
Climate models represent a large variety of processes on a variety of timescales and space scales, a canonical example of multi-physics multi-scale modeling. Current hardware trends, such as Graphical Processing Units (GPUs) and Many Integrated Core (MIC) chips, are based on, at best, marginal increases in clock speed, coupled with vast increases in concurrency, particularly at the fine grain. Multi-physics codes face particular challenges in achieving fine-grained concurrency, as different physics and dynamics components have different computational profiles, and universal solutions are hard to come by. We propose here one approach for multi-physics codes. These codes are typically structured as components interacting via software frameworks. The component structure of a typical Earth system model consists of a hierarchical and recursive tree of components, each representing a different climate process or dynamical system. This recursive structure generally encompasses a modest level of concurrency at the highest level (e.g., atmosphere and ocean on different processor sets) with serial organization underneath. We propose to extend concurrency much further by running more and more lower- and higher-level components in parallel with each other. Each component can further be parallelized on the fine grain, potentially offering a major increase in the scalability of Earth system models. We present here first results from this approach, called coarse-grained component concurrency, or CCC. Within the Geophysical Fluid Dynamics Laboratory (GFDL) Flexible Modeling System (FMS), the atmospheric radiative transfer component has been configured to run in parallel with a composite component consisting of every other atmospheric component, including the atmospheric dynamics and all other atmospheric physics components. We will explore the algorithmic challenges involved in such an approach, and present results from such simulations. 
Plans to achieve even greater levels of coarse-grained concurrency by extending this approach within other components, such as the ocean, will be discussed.
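The coarse-grained component concurrency (CCC) pattern described above can be caricatured in a few lines: within each coupling timestep, the radiation component and the rest of the atmosphere run in parallel, and their tendencies are combined before the state advances. The components, fields, and numbers below are toy stand-ins, not FMS code.

```python
# Schematic CCC coupling loop: two components execute concurrently per
# timestep, then their tendencies are summed into the shared state.
from concurrent.futures import ThreadPoolExecutor

def radiation(state):
    return {"T_tend_rad": -0.01 * state["T"]}    # toy radiative cooling

def dynamics_and_physics(state):
    return {"T_tend_dyn": 0.02}                  # toy dynamical heating

state = {"T": 288.0}
with ThreadPoolExecutor(max_workers=2) as pool:
    for _ in range(3):                           # coupling timesteps
        f_rad = pool.submit(radiation, state)    # components run in parallel
        f_dyn = pool.submit(dynamics_and_physics, state)
        tend = {**f_rad.result(), **f_dyn.result()}
        state["T"] += tend["T_tend_rad"] + tend["T_tend_dyn"]

print(round(state["T"], 4))
```

The essential constraint is visible even in this caricature: both components read the state from the start of the step, so radiation sees fields that are one coupling interval old, which is the accuracy trade-off the CCC approach accepts in exchange for concurrency.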
Utilization Management in the Blood Transfusion Service
Peña, Jeremy Ryan Andrew; Dzik, Walter “Sunny”
2015-01-01
The scope of activity of the Blood Transfusion Service (BTS) makes it unique among the clinical laboratories. The combination of therapeutic and diagnostic roles necessitates a multi-faceted approach to utilization management in the BTS. We present our experience with utilization management in a large academic medical center. PMID:24080431
Creating a Down-to-Earth Approach to Teaching Science, Math and Technology.
ERIC Educational Resources Information Center
Williamson, Robert; Smoak, Ellen
1999-01-01
Down-to-Earth is a program designed to increase 9- to 12-year-olds' critical thinking and problem solving by teaching gardening through the scientific method. The combination of multi- and interdisciplinary topics has increased achievement and resulted in attitudinal and behavioral changes. (SK)
Competitive Learning Neural Network Ensemble Weighted by Predicted Performance
ERIC Educational Resources Information Center
Ye, Qiang
2010-01-01
Ensemble approaches have been shown to enhance classification by combining the outputs from a set of voting classifiers. Diversity in error patterns among base classifiers promotes ensemble performance. Multi-task learning is an important characteristic for Neural Network classifiers. Introducing a secondary output unit that receives different…
Steady groundwater flow through many cylindrical inhomogeneities in a multi-aquifer system
NASA Astrophysics Data System (ADS)
Bakker, Mark
2003-06-01
A new approach is presented for the simulation of steady-state groundwater flow in multi-aquifer systems that contain many cylindrical inhomogeneities. The hydraulic conductivity of all aquifers and the resistance of all leaky layers may be different inside each cylinder. The approach is based on separation of variables and combines principles of the theory for multi-aquifer flow with principles of the analytic element method. The solution fulfills the governing differential equations exactly everywhere; the head, flow, and leakage between aquifers may be computed analytically at any point in the aquifer system. The boundary conditions along the circumference of the cylinder are satisfied approximately, but may be met at any precision. Two examples are discussed to illustrate the accuracy of the approach and the significance of inhomogeneities in multi-aquifer systems. The first application simulates the vertical and horizontal, advective spreading of a conservative tracer in a homogeneous aquifer that is overlain by an aquifer with cylindrical inclusions of higher permeability. The second application concerns the three-dimensional shape of the capture zone of a well that is screened in the bottom aquifer of a three-aquifer system. The capture zone extends to the top aquifer due to cylindrical holes of lower resistance in the separating clay layers.
A Bayesian alternative for multi-objective ecohydrological model specification
NASA Astrophysics Data System (ADS)
Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori
2018-01-01
Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad-hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. 
The methodology implemented here provides insight into the usefulness of multi-objective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior distributions in such approaches.
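The core idea of replacing ad-hoc multi-objective weights with a joint likelihood can be shown with a one-parameter toy model. Everything below is synthetic: the exponential "streamflow" and "LAI" responses, the fixed error scales (standing in for the priors on error parameters), and the Metropolis sampler are illustrative, not the HYMOD-BGM framework.

```python
# Metropolis sampling of a joint posterior that combines two objectives
# (streamflow-like and LAI-like residuals) for a single parameter k.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
k_true = 2.0
q_obs = np.exp(-k_true * t) + 0.02 * rng.normal(size=t.size)        # "streamflow"
lai_obs = 1 - np.exp(-k_true * t) + 0.05 * rng.normal(size=t.size)  # "LAI"

def log_post(k, s_q=0.02, s_l=0.05):
    if k <= 0:
        return -np.inf
    r_q = q_obs - np.exp(-k * t)
    r_l = lai_obs - (1 - np.exp(-k * t))
    # the error scales s_q, s_l act as the (here fixed) multi-objective weights
    return -0.5 * (r_q @ r_q / s_q**2 + r_l @ r_l / s_l**2)

k, chain = 1.0, []
for _ in range(4000):
    prop = k + 0.1 * rng.normal()
    if np.log(rng.random()) < log_post(prop) - log_post(k):
        k = prop
    chain.append(k)
print(f"posterior mean k = {np.mean(chain[1000:]):.2f}")
```

In the full framework the error scales themselves get prior distributions and are sampled, which is what makes the objective weighting an inferred quantity rather than a tuning choice, in contrast to fixed Pareto weights.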
Sequeira, Daniela P; Correia, Ricardo; Carrondo, Manuel J T; Roldão, António; Teixeira, Ana P; Alves, Paula M
2018-05-24
Safer and broadly protective vaccines are needed to cope with the continuous evolution of circulating influenza virus strains and promising approaches based on the expression of multiple hemagglutinins (HA) in a virus-like particle (VLP) have been proposed. However, expression of multiple genes in the same vector can lead to its instability due to tandem repetition of similar sequences. By combining stable with transient expression systems we can rationally distribute the number of genes to be expressed per platform and thus mitigate this risk. In this work, we developed a modular system comprising stable and baculovirus-mediated expression in insect cells for production of multi-HA influenza enveloped VLPs. First, a stable insect High Five cell population expressing two different HA proteins from subtype H3 was established. Infection of this cell population with a baculovirus vector encoding three other HA proteins from H3 subtype proved to be as competitive as traditional co-infection approaches in producing a pentavalent H3 VLP. Aiming at increasing HA expression, the stable insect cell population was infected at increasingly higher cell concentrations (CCI). However, cultures infected at CCI of 3×10⁶ cells/mL showed lower HA titers per cell in comparison to standard CCI of 2×10⁶ cells/mL, a phenomenon named "cell density effect". To lessen the negative impact of this phenomenon, a tailor-made refeed strategy was designed based on the exhaustion of key nutrients during cell growth. Noteworthy, cultures supplemented and infected at a CCI of 4×10⁶ cells/mL showed comparable HA titers per cell to those of CCI of 2×10⁶ cells/mL, thus leading to an increase of up to 4-fold in HA titers per mL. Scalability of the modular strategy herein proposed was successfully demonstrated in 2 L stirred tank bioreactors with comparable HA protein levels observed between bioreactor and shake flasks cultures.
Overall, this work demonstrates the suitability of combining stable with baculovirus-mediated expression in insect cells as an efficient platform for production of multi-HA influenza VLPs, surpassing the drawbacks of traditional co-infection strategies and/or the use of larger, unstable vectors. Copyright © 2017 Elsevier Ltd. All rights reserved.
Trispectrum from co-dimension 2(n) Galileons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fasiello, Matteo, E-mail: mrf65@case.edu
2013-12-01
A generalized theory of multi-field Galileons has recently been put forward. This model stems from the ongoing effort to embed generic Galileon theories within brane constructions. Such an approach has proved very useful in connecting interesting and essential features of these theories with geometric properties of the brane embedding. We investigate the cosmological implications of a very restrictive multi-field Galileon theory whose leading interaction is solely quartic in the scalar field π and lends itself nicely to an interesting cosmology. The bispectrum is characterized by a naturally small amplitude (f_NL ≲ 1) and an equilateral shape function. The trispectrum of curvature fluctuations has features which are quite distinctive with respect to their P(X,φ) counterpart. We also show that, despite the absence of a cubic Lagrangian in the full theory, non-Gaussianities in this model cannot produce the combination of a small bispectrum alongside a large trispectrum. We further expand on this point to draw a lesson on what having a symmetry in the full background-independent theory entails at the level of fluctuations, and vice versa.
Zhao, Fei-Ya; Tao, Ai-En; Xia, Cong-Long
2018-01-01
Paris is a commonly used traditional Chinese medicine (TCM) with antitumor, antibacterial, sedative, analgesic and hemostatic effects. It has been used as an ingredient of 81 Chinese patent medicines, with wide application and large market demand. Based on data retrieved from the State Intellectual Property Office patent database, a comprehensive analysis of Paris patents was made to explore their current features in terms of domestic patent output, development trend, technology field distribution, time dimension, technology growth rate and patent applicants, and to reveal the development trend of China's Paris industry. In addition, based on current Paris resource application and development, a sustainable, multi-channel and multi-level industrial development approach was proposed. The results show that studies of Paris in China are in a period of rapid development, with a good development trend. However, because wild Paris resources tend towards exhaustion, studies of artificial cultivation technology should be strengthened to promote industrial development. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Villiger, Arturo; Schaer, Stefan; Dach, Rolf; Prange, Lars; Jäggi, Adrian
2017-04-01
It is common to handle code biases in the Global Navigation Satellite System (GNSS) data analysis as conventional differential code biases (DCBs): P1-C1, P1-P2, and P2-C2. Due to the increasing number of signals and systems in conjunction with various tracking modes for the different signals (as defined in RINEX3 format), the number of DCBs would increase drastically and the bookkeeping becomes almost unbearable. The Center for Orbit Determination in Europe (CODE) has thus changed its processing scheme to observable-specific signal biases (OSB). This means that for each observation involved all related satellite and receiver biases are considered. The OSB contributions from various ionosphere analyses (geometry-free linear combination) using different observables and frequencies and from clock analyses (ionosphere-free linear combination) are then combined on normal equation level. By this, one consistent set of OSB values per satellite and receiver can be obtained that contains all information needed for GNSS-related processing. This advanced procedure of code bias handling is now also applied to the IGS (International GNSS Service) MGEX (Multi-GNSS Experiment) procedure at CODE. Results for the biases from the legacy IGS solution as well as the CODE MGEX processing (considering GPS, GLONASS, Galileo, BeiDou, and QZSS) are presented. The consistency with the traditional method is confirmed and the new results are discussed regarding the long-term stability. When processing code data, it is essential to know the true observable types in order to correct for the associated biases. CODE has been verifying the receiver tracking technologies for GPS based on estimated DCB multipliers (for the RINEX 2 case). With the change to OSB, the original verification approach was extended to search for the best fitting observable types based on known OSB values. In essence, a multiplier parameter is estimated for each involved GNSS observable type. 
This implies that, for receivers tracking a combination of signals, even the factors of these combinations can be recovered. Verification of the observable types is crucial for identifying the correct observable types in RINEX 2 data (which, unlike RINEX 3, does not record the signal modulation). Correct knowledge of the observable types used is essential for precise point positioning (PPP) applications and GNSS ambiguity resolution. Multi-GNSS OSBs and verified receiver tracking modes are essential to obtain the best possible multi-GNSS solutions for geodynamic purposes and other applications.
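The multiplier test can be pictured with a toy example. The sketch below is not CODE's actual algorithm; the candidate observable-type names, the OSB values, and the closest-to-one selection rule are illustrative assumptions only.

```python
def fit_multiplier(observed, known_osb):
    """Least-squares scale factor m minimizing sum((observed - m * known_osb)^2).
    A multiplier close to 1.0 suggests the candidate observable type matches
    the one the receiver actually tracked."""
    num = sum(o * k for o, k in zip(observed, known_osb))
    den = sum(k * k for k in known_osb)
    return num / den

def best_observable_type(observed, candidates):
    """Return the candidate type whose fitted multiplier is closest to 1,
    together with all fitted multipliers."""
    multipliers = {name: fit_multiplier(observed, osb)
                   for name, osb in candidates.items()}
    best = min(multipliers, key=lambda n: abs(multipliers[n] - 1.0))
    return best, multipliers
```

With known OSB values for two hypothetical tracking modes, e.g. `{"C1C": [1.2, -0.5, 0.8], "C1W": [0.3, 0.9, -1.1]}` (one value per satellite, arbitrary units), an observed bias vector close to the first set yields a multiplier near 1 for "C1C", and that type is selected.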
A multi-model assessment of the co-benefits of climate mitigation for global air quality
NASA Astrophysics Data System (ADS)
Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Aleluia Reis, Lara; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.
2016-12-01
We present a model comparison study that combines multiple integrated assessment models with a reduced-form global air quality model to assess the potential co-benefits of global climate mitigation policies in relation to the World Health Organization (WHO) goals on air quality and health. We include in our assessment a range of alternative assumptions on the implementation of current and planned pollution control policies. The resulting air pollution emission ranges significantly extend those in the Representative Concentration Pathways. Climate mitigation policies complement current efforts on air pollution control through technology and fuel transformations in the energy system. A combination of stringent policies on air pollution control and climate change mitigation results in 40% of the global population being exposed to PM levels below the WHO air quality guideline, with the largest improvements estimated for India, China, and the Middle East. Our results stress the importance of integrated multi-sector policy approaches to achieve the Sustainable Development Goals.
Neuroeconomics: A bridge for translational research
Sharp, Carla; Monterosso, John; Montague, Read
2014-01-01
Neuroeconomic methods combine behavioral economic experiments to parameterize aspects of reward-related decision-making with neuroimaging techniques to record corresponding brain activity. In this introductory paper to the current special issue, we propose that neuroeconomics is a potential bridge for translational research in psychiatry for several reasons. First, neuroeconomics-derived theoretical predictions about optimal adaptation in a changing environment provide an objective metric against which to examine psychopathology. Second, neuroeconomics provides a ‘multi-level’ research approach that combines performance (behavioral) measures with intermediate measures between behavior and neurobiology (e.g., neuroimaging) and uses a common metaphor to describe decision-making across multiple levels of explanation. As such, ecologically valid behavioral paradigms closely mirror the physical mechanisms of reward processing. Third, neuroeconomics provides a platform for investigators from neuroscience, economics, psychiatry and social and clinical psychology to develop a common language for studying reward-related decision-making in psychiatric disorders. Therefore, neuroeconomics can provide promising candidate endophenotypes that may help clarify the basis of the high heritability associated with psychiatric disorders and that may, in turn, inform treatment. PMID:22727459
Velpuri, Naga Manohar; Senay, Gabriel B.
2012-01-01
Lake Turkana, the largest desert lake in the world, is fed by ungauged or poorly gauged river systems. To meet the demand for electricity in the East African region, Ethiopia is currently building the Gibe III hydroelectric dam on the Omo River, which supplies more than 80% of the inflows to Lake Turkana. On completion, the Gibe III dam will be the tallest dam in Africa, with a height of 241 m. However, the nature of interactions and potential impacts of regulated inflows on Lake Turkana are not well understood due to its remote location and the unavailability of reliable in-situ datasets. In this study, we used 12 years (1998–2009) of existing multi-source satellite and model-assimilated global weather data. We use a calibrated, multi-source satellite data-driven water balance model for Lake Turkana that takes into account model-routed runoff, lake/reservoir evapotranspiration, direct rain on lakes/reservoirs and releases from the dam to compute lake water levels. The model evaluates the impact of the Gibe III dam using three different approaches (a historical approach, a knowledge-based approach, and a nonparametric bootstrap resampling approach) to generate rainfall–runoff scenarios. All the approaches provided comparable and consistent results. Model results indicated that the hydrological impact of the dam on Lake Turkana would vary with the magnitude and distribution of rainfall after dam commencement. On average, the reservoir would take 8–10 months after commencement to reach a minimum operation level of 201 m depth of water. During the dam-filling period, the lake level would drop by up to 2 m (95% confidence) compared to the lake level modelled without the dam. The lake-level variability caused by regulated inflows after dam commissioning was found to be within the natural variability of the lake of 4.8 m.
Moreover, modelling results indicated that the hydrological impact of the Gibe III dam would depend on the initial lake level at the time of dam commencement. Areas along the Lake Turkana shoreline that are vulnerable to fluctuations in lake levels were also identified. This study demonstrates the effectiveness of using existing multi-source satellite data in a basic modeling framework to assess the potential hydrological impact of an upstream dam on a terminal downstream lake. The results obtained from this study could also be used to evaluate alternate dam-filling scenarios and assess the potential impact of the dam on Lake Turkana under different operational strategies.
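The bookkeeping behind such a water balance model can be sketched in a few lines. The structure (routed runoff plus direct rain minus open-water evaporation, divided by lake area) follows the description above, but all the numbers, the monthly time step, and the fixed-area assumption are illustrative simplifications, not the calibrated model.

```python
def simulate_lake_levels(initial_level_m, area_km2, months):
    """Monthly lake-level update: level change = net volume change / lake area.
    Each month is a dict of volume terms in km^3; a fixed lake surface area is
    assumed (the real model would let area vary with level)."""
    level = initial_level_m
    levels = []
    for m in months:
        net_km3 = m["runoff_km3"] + m["rain_km3"] - m["evap_km3"]
        level += net_km3 / area_km2 * 1000.0  # km^3 / km^2 = km; convert to m
        levels.append(level)
    return levels
```

For example, a month with 1.0 km³ of routed runoff, 0.2 km³ of direct rain and 0.5 km³ of evaporation over a (hypothetical) 6750 km² lake raises the level by roughly 0.1 m.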
Robust Targeting for the Smartphone Video Guidance Sensor
NASA Technical Reports Server (NTRS)
Carter, Christopher
2017-01-01
The Smartphone Video Guidance Sensor (SVGS) is a miniature, self-contained autonomous rendezvous and docking sensor developed using a commercial off-the-shelf Android-based smartphone. It aims to provide a miniaturized solution for rendezvous and docking, enabling small satellites to conduct proximity operations and formation flying while minimizing interference with a primary payload. Previously, the sensor was limited by a slow (2 Hz) refresh rate and its use of retro-reflectors, both of which contributed to a limited operating environment. To advance the technology readiness level, a modified approach was developed, combining a multi-colored LED target with a focused target-detection algorithm. Alone, the use of an LED system was determined to be much more reliable, though slower, than the retro-reflector system. The focused target-detection system was developed in response to this problem to mitigate the speed reduction of using color. However, it also improved the reliability. In combination, these two methods have been demonstrated to dramatically increase sensor speed and allow the sensor to select the target even with significant noise interfering with the sensor, providing millimeter-level accuracy at a range of two meters with a 1U target.
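One way to picture color-based target detection is a per-color mask-and-centroid pass over the image. This is only a schematic of the idea, not the SVGS algorithm: the per-channel tolerance and the RGB reference colors are made-up parameters.

```python
def find_led_centroids(image, led_colors, tol=30):
    """For each reference (R, G, B) color, collect the pixels within `tol` of it
    in every channel and return their centroid (row, col), or None if no pixel
    matches. `image` is a list of rows, each pixel an (R, G, B) tuple."""
    centroids = []
    for color in led_colors:
        hits = [(r, c) for r, row in enumerate(image)
                for c, px in enumerate(row)
                if all(abs(px[i] - color[i]) <= tol for i in range(3))]
        if hits:
            centroids.append((sum(h[0] for h in hits) / len(hits),
                              sum(h[1] for h in hits) / len(hits)))
        else:
            centroids.append(None)
    return centroids
```

Matching each LED color to a centroid gives the 2-D image points from which a pose solver could then recover relative position and attitude.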
NASA Astrophysics Data System (ADS)
Ding, Zhongan; Gao, Chen; Yan, Shengteng; Yang, Canrong
2017-10-01
The power user electric energy data acquisition system (PUEEDAS) is an important part of the smart grid. This paper builds a multi-objective optimization model for the performance of the PUEEDAS that jointly considers comprehensive benefits and cost. The Chebyshev decomposition approach is used to decompose the multi-objective optimization problem, and a MOEA/D evolutionary algorithm is designed to solve it. By analyzing the Pareto-optimal solution set of the multi-objective optimization problem and comparing it with monitored values, the direction for optimizing the performance of the PUEEDAS can be identified. Finally, an example is designed for specific analysis.
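The Chebyshev (Tchebycheff) decomposition turns a multi-objective problem into a family of single-objective subproblems, one per weight vector. A minimal sketch of that scalarization follows; the toy objectives stand in for the paper's benefit and cost terms and are invented for illustration.

```python
def tchebycheff(f_values, weights, ideal):
    """Tchebycheff scalarization used by MOEA/D:
    g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|."""
    return max(w * abs(f - z) for f, w, z in zip(f_values, weights, ideal))

def best_for_weights(candidates, objectives, weights, ideal):
    """Pick the candidate minimizing the scalarized objective for one
    subproblem (one weight vector)."""
    return min(candidates,
               key=lambda x: tchebycheff([f(x) for f in objectives],
                                         weights, ideal))
```

For instance, minimizing f1(x) = x² and f2(x) = (x − 2)² with equal weights and ideal point (0, 0) selects the balanced solution x = 1 from the candidates {0, 1, 2}; sweeping the weights traces out an approximation of the Pareto front.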
Bio-monitoring of mycotoxin exposure in Cameroon using a urinary multi-biomarker approach.
Abia, Wilfred A; Warth, Benedikt; Sulyok, Michael; Krska, Rudolf; Tchana, Angele; Njobeh, Patrick B; Turner, Paul C; Kouanfack, Charles; Eyongetah, Mbu; Dutton, Mike; Moundipa, Paul F
2013-12-01
Bio-monitoring of human exposure to mycotoxins has mostly been limited to a few individually measured mycotoxin biomarkers. This study aimed to determine the frequency and level of exposure to multiple mycotoxins in human urine from Cameroonian adults. In total, 175 urine samples (83% from HIV-positive individuals) and food frequency questionnaire responses were collected from consenting Cameroonians and analyzed for 15 mycotoxins and relevant metabolites using LC-ESI-MS/MS. Eleven analytes were detected individually or in combination in 110/175 (63%) samples, including the biomarkers aflatoxin M1, fumonisin B1, ochratoxin A and total deoxynivalenol. Additionally, important mycotoxins and metabolites thereof, such as fumonisin B2, nivalenol and zearalenone, were determined, some for the first time in urine following dietary exposure. Multi-mycotoxin contamination was common, with one HIV-positive individual exposed to five mycotoxins, a severe case of co-exposure never before reported in adults. For the first time in Africa or elsewhere, this study quantified eleven mycotoxin biomarkers and bio-measures in urine from adults. For several mycotoxins, estimates indicate that the tolerable daily intake is being exceeded in this study population. Given that many mycotoxins adversely affect the immune system, future studies will examine whether combinations of mycotoxins negatively impact the Cameroonian population, particularly immune-suppressed individuals. Copyright © 2013 Elsevier Ltd. All rights reserved.
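Urinary biomarker concentrations are commonly converted to a probable daily intake by a first-order back-calculation, which is how intakes can be compared against a tolerable daily intake. The sketch below shows that standard formula with invented example numbers; it is not necessarily the exact calculation used in this study, and the excretion fraction in particular is toxin-specific.

```python
def probable_daily_intake(conc_ng_per_ml, urine_l_per_day,
                          excretion_fraction, body_weight_kg):
    """Probable daily intake (ng per kg body weight per day) back-calculated
    from a urinary biomarker concentration, assuming a fixed fraction of the
    ingested mycotoxin is excreted in 24 h urine."""
    excreted_ng = conc_ng_per_ml * urine_l_per_day * 1000.0  # L -> mL
    return excreted_ng / (excretion_fraction * body_weight_kg)
```

With a hypothetical 0.5 ng/mL concentration, 1.5 L daily urine volume, 50% excretion fraction and 60 kg body weight, the estimate is 25 ng/kg bw/day, which could then be compared against the toxin's tolerable daily intake.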
Longitudinal analysis on the development of hospital quality management systems in the Netherlands.
Dückers, Michel; Makai, Peter; Vos, Leti; Groenewegen, Peter; Wagner, Cordula
2009-10-01
Many changes have been initiated in the Dutch hospital sector to optimize health-care delivery: national agenda-setting, increased competition and transparency, a new system of hospital reimbursement based on diagnosis-treatment combinations, intensified monitoring of quality and a multi-layered organizational development programme based on quality improvement collaboratives. The objective is to answer the question of whether these changes were accompanied by a further development of hospital quality management systems, and to what extent the development within the multi-layered programme hospitals differed from that in other hospitals. Longitudinal data were collected in 1995, 2000, 2005 and 2007 using a validated questionnaire. Descriptive analyses and multi-level modelling were applied to test whether: (1) quality management system development stages in hospitals differ over time, (2) development stages and trends differ between hospitals participating or not participating in the multi-layered programme and (3) hospital size has an effect on development stage. The setting was the Dutch hospital sector between 1995 and 2007; the study units were hospital organizations; changes through time were examined; and the main outcome measure was quality management system development stage. Since 1995, hospital quality management systems have reached higher development levels. Programme participants have developed their quality management systems more rapidly than non-participants. However, this effect is confounded by hospital size. Study results suggest that the combination of policy measures at the macro level was accompanied by an increase in hospital size and the further development of quality management systems. Hospitals are entering the stage of systematic quality improvement.
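The trend comparison can be pictured with a much simpler two-stage stand-in for the multi-level model: fit a linear development trend per hospital, then average the trends per group. All the data below are invented, and a real multi-level model would of course pool information across hospitals and adjust for hospital size rather than averaging independent fits.

```python
def slope(points):
    """Ordinary least-squares slope for (x, y) pairs."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    return (sum((x - mx) * (y - my) for x, y in points)
            / sum((x - mx) ** 2 for x, _ in points))

def group_trends(records):
    """records: hospital -> (group, [(year, development_stage), ...]).
    Returns the mean per-hospital development trend for each group."""
    per_group = {}
    for group, obs in records.values():
        per_group.setdefault(group, []).append(slope(obs))
    return {g: sum(s) / len(s) for g, s in per_group.items()}
```

A programme hospital advancing 0.2 stages per year versus a non-participant advancing 0.1 would show up as a higher mean trend for the programme group.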
Speaker-sensitive emotion recognition via ranking: Studies on acted and spontaneous speech☆
Cao, Houwei; Verma, Ragini; Nenkova, Ani
2014-01-01
We introduce a ranking approach for emotion recognition which naturally incorporates information about the general expressivity of speakers. We demonstrate that our approach leads to substantial gains in accuracy compared to conventional approaches. We train ranking SVMs for individual emotions, treating the data from each speaker as a separate query, and combine the predictions from all rankers to perform multi-class prediction. The ranking method provides two natural benefits. It captures speaker-specific information even in speaker-independent training/testing conditions. It also incorporates the intuition that each utterance can express a mix of possible emotions and that considering the degree to which each emotion is expressed can be productively exploited to identify the dominant emotion. We compare the performance of the rankers and their combination to standard SVM classification approaches on two publicly available datasets of acted emotional speech, Berlin and LDC, as well as on spontaneous emotional data from the FAU Aibo dataset. On acted data, ranking approaches exhibit significantly better performance compared to SVM classification both in distinguishing a specific emotion from all others and in multi-class prediction. On the spontaneous data, which contains mostly neutral utterances with a relatively small portion of less intense emotional utterances, ranking-based classifiers again achieve much higher precision in identifying emotional utterances than conventional SVM classifiers. In addition, we discuss the complementarity of conventional SVM and ranking-based classifiers. On all three datasets we find dramatically higher accuracy for the test items on whose prediction the two methods agree compared to the accuracy of individual methods. Furthermore, on the spontaneous data the ranking and standard classification are complementary and we obtain marked improvement when we combine the two classifiers by late-stage fusion. PMID:25422534
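The per-speaker ranking idea can be sketched with a tiny pairwise hinge-loss ranker, a crude stand-in for the ranking SVMs used in the paper: within each speaker's utterances (one "query"), utterances expressing the target emotion should score above the rest. The features and data below are invented.

```python
def train_pairwise_ranker(X, y, speakers, epochs=100, lr=0.1):
    """Learn weights w so that, within each speaker's utterances, target-emotion
    utterances (y=1) outscore the others (y=0). Trained by subgradient descent
    on the pairwise hinge loss max(0, 1 - w.(x_pos - x_neg))."""
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for q in set(speakers):
            idx = [i for i, s in enumerate(speakers) if s == q]
            for p in (i for i in idx if y[i] == 1):
                for n in (i for i in idx if y[i] == 0):
                    diff = [a - b for a, b in zip(X[p], X[n])]
                    if sum(wi * di for wi, di in zip(w, diff)) < 1.0:
                        w = [wi + lr * di for wi, di in zip(w, diff)]
    return w
```

Because only within-speaker pairs generate training constraints, a feature that merely reflects a speaker's overall expressivity (constant within a query) cancels out of every pairwise difference, which is exactly the speaker-normalizing property the paper exploits.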
A Cognitive Computing Approach for Classification of Complaints in the Insurance Industry
NASA Astrophysics Data System (ADS)
Forster, J.; Entrup, B.
2017-10-01
In this paper we present and evaluate a cognitive computing approach for the classification of dissatisfaction and four complaint-specific classes in correspondence documents between insurance clients and an insurance company. The cognitive computing approach combines classical natural language processing methods, machine learning algorithms and the evaluation of hypotheses. Specifically, it combines a MaxEnt machine learning algorithm with language modelling, tf-idf and sentiment analytics to create a multi-label text classification model. The result is trained and tested with a set of 2500 original insurance communication documents written in German, which have been manually annotated by the partnering insurance company. With an F1-score of 0.9, a reliable text classification component has been implemented and evaluated. We close with an outlook towards a cognitive computing insurance assistant.
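A MaxEnt classifier over tf-idf features is equivalent to logistic regression, so the core of such a pipeline can be sketched compactly. The toy documents, labels and hyperparameters below are invented, and the sketch omits the paper's language-modelling and sentiment features.

```python
import math
from collections import Counter

def tfidf(docs):
    """tf-idf document vectors over the corpus vocabulary."""
    toks = [d.lower().split() for d in docs]
    df = Counter(t for ts in toks for t in set(ts))
    vocab = sorted(df)
    n = len(docs)
    vecs = []
    for ts in toks:
        tf = Counter(ts)
        vecs.append([tf[t] * math.log((1 + n) / (1 + df[t])) for t in vocab])
    return vocab, vecs

def train_maxent(X, y, epochs=300, lr=0.5):
    """Binary MaxEnt (logistic regression) via stochastic gradient ascent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for x, label in zip(X, y):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            g = label - 1.0 / (1.0 + math.exp(-z))  # gradient of log-likelihood
            w = [wi + lr * g * xi for wi, xi in zip(w, x)]
            b += lr * g
    return w, b

def predict(x, w, b):
    """Probability of the positive (dissatisfaction) class."""
    return 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
```

A multi-label setup would simply train one such binary model per complaint class over the same tf-idf features.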
Turner, Simon; Vasilakis, Christos; Utley, Martin; Foster, Paul; Kotecha, Aachal; Fulop, Naomi J
2018-05-01
The development and implementation of innovation by healthcare providers is understood as a multi-determinant and multi-level process. Theories at different analytical levels (i.e. micro and organisational) are needed to capture the processes that influence innovation by providers. This article combines a micro theory of innovation, actor-network theory, with organisational level processes using the 'resource based view of the firm'. It examines the influence of, and interplay between, innovation-seeking teams (micro) and underlying organisational capabilities (meso) during innovation processes. We used ethnographic methods to study service innovations in relation to ophthalmology services run by a specialist English NHS Trust at multiple locations. Operational research techniques were used to support the ethnographic methods by mapping the care process in the existing and redesigned clinics. Deficiencies in organisational capabilities for supporting innovation were identified, including manager-clinician relations and organisation-wide resources. The article concludes that actor-network theory can be combined with the resource-based view to highlight the influence of organisational capabilities on the management of innovation. Equally, actor-network theory helps to address the lack of theory in the resource-based view on the micro practices of implementing change. © 2018 The Authors. Sociology of Health & Illness published by John Wiley & Sons Ltd on behalf of Foundation for SHIL.
Qi, Yu; Wang, Hui; Wei, Kai; Yang, Ya; Zheng, Ru-Yue; Kim, Ick Soo; Zhang, Ke-Qin
2017-01-01
The biological performance of artificial biomaterials is closely related to their structure characteristics. Cell adhesion, migration, proliferation, and differentiation are all strongly affected by the different scale structures of biomaterials. Silk fibroin (SF), extracted mainly from silkworms, has become a popular biomaterial due to its excellent biocompatibility, exceptional mechanical properties, tunable degradation, ease of processing, and sufficient supply. As a material with excellent processability, SF can be processed into various forms with different structures, including particulate, fiber, film, and three-dimensional (3D) porous scaffolds. This review discusses and summarizes the various constructions of SF-based materials, from single structures to multi-level structures, and their applications. In combination with single structures, new techniques for creating special multi-level structures of SF-based materials, such as micropatterning and 3D-printing, are also briefly addressed. PMID:28273799
Multi-level structure in the large scale distribution of optically luminous galaxies
NASA Astrophysics Data System (ADS)
Deng, Xin-fa; Deng, Zu-gan; Liu, Yong-zhen
1992-04-01
Fractal dimensions in the large-scale distribution of galaxies have been calculated with the method given by Wen et al. [1]. Samples are taken from the CfA redshift survey [2] in the northern and southern galactic hemispheres, respectively, and results from these two regions are compared with each other. There are significant differences between the distributions in the two regions; however, our analyses do show some common features. All subsamples distinctly show a multi-level fractal character. Combining this with results from analyses of IRAS galaxy samples and of pencil-beam redshift surveys [3,4], we suggest that multi-level fractal structure is most likely a general and important character of the large-scale distribution of galaxies. The possible implications of this character are discussed.
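Fractal character in a point distribution is typically quantified through a box-counting dimension: count occupied boxes N(s) at a sequence of scales s and fit the slope of log N(s) against log(1/s). The 2-D sketch below uses a synthetic point set, not the CfA data, and is not the specific estimator of Wen et al.

```python
import math

def box_counting_dimension(points, scales):
    """Estimate the box-counting dimension of a point set: count occupied
    boxes at each scale, then take the least-squares slope of
    log N(s) versus log(1/s)."""
    xs, ys = [], []
    for s in scales:
        boxes = {tuple(math.floor(c / s) for c in p) for p in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

A space-filling 2-D point set yields a dimension near 2; a "multi-level" fractal would show different slopes over different ranges of scales, which is why the fit is done scale range by scale range.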
NASA Astrophysics Data System (ADS)
Indarsih, Indrati, Ch. Rini
2016-02-01
In this paper, we define the variance of fuzzy random variables through alpha levels. We prove a theorem showing that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem with fuzzy random objective function coefficients and solve it by a variance approach. The approach transforms the MOLP with fuzzy random objective function coefficients into an MOLP with fuzzy objective function coefficients. By the weighted method, we obtain a linear program with fuzzy coefficients, which we solve by the simplex method for fuzzy linear programming.
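The weighted method collapses the multiple objectives into a single linear program. The toy below uses crisp (defuzzified) coefficients purely for illustration, and, to stay self-contained, solves the resulting 2-variable LP by vertex enumeration rather than the simplex method for fuzzy linear programming used in the paper.

```python
from itertools import combinations

def weighted_sum(objectives, weights):
    """Weighted method: combine objective vectors c_k into sum_k w_k * c_k."""
    return tuple(sum(w * c[j] for w, c in zip(weights, objectives))
                 for j in range(len(objectives[0])))

def solve_2var_lp(c, constraints):
    """Maximize c.x over {x >= 0, a.x <= b for each (a, b)} by enumerating
    vertices - adequate only for this tiny 2-variable illustration."""
    cons = constraints + [((1.0, 0.0), 0.0), ((0.0, 1.0), 0.0)]  # axes x_i = 0
    best, best_x = None, None
    for (a1, b1), (a2, b2) in combinations(cons, 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue  # parallel boundaries: no vertex
        x = ((b1 * a2[1] - b2 * a1[1]) / det,
             (a1[0] * b2 - a2[0] * b1) / det)
        if x[0] < -1e-9 or x[1] < -1e-9:
            continue
        if all(a[0] * x[0] + a[1] * x[1] <= b + 1e-9 for a, b in constraints):
            val = c[0] * x[0] + c[1] * x[1]
            if best is None or val > best:
                best, best_x = val, x
    return best_x, best
```

For example, maximizing the two objectives c1 = (3, 1) and c2 = (1, 2) with equal weights under x1 + x2 ≤ 4 and x1 ≤ 3 reduces to maximizing (2, 1.5)·x, with optimum at the vertex (3, 1).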
The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2002-01-01
The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) System will rely on global satellite navigation and on ground-based and satellite-based communications via Multi-Protocol Networks (e.g., the combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in efficiency and safety of operations to meet increasing levels of air traffic. This paper discusses the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and different operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode, and with flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.
Schlottfeldt, S; Walter, M E M T; Carvalho, A C P L F; Soares, T N; Telles, M P C; Loyola, R D; Diniz-Filho, J A F
2015-06-18
Biodiversity crises have led scientists to develop strategies for achieving conservation goals. The underlying principle of these strategies lies in systematic conservation planning (SCP), in which there are at least two conflicting objectives, making it a good candidate for multi-objective optimization. Although SCP is typically applied at the species level (or hierarchically higher), it can be used at lower hierarchical levels, such as using alleles as the basic units of analysis, for conservation genetics. Here, we propose a method of SCP using a multi-objective approach. We used the non-dominated sorting genetic algorithm II (NSGA-II) to identify the smallest set of local populations of Dipteryx alata (baru) (a Brazilian Cerrado species) for conservation, representing the known genetic diversity and using allele frequency information associated with heterozygosity and Hardy-Weinberg equilibrium. We considered three variations of the problem. First, we reproduced a previous experiment, but using a multi-objective approach. We found that the smallest set of populations needed to represent all alleles under study was 7, corroborating the results of the previous study, but with more distinct solutions. In the second and third variations, we performed simultaneous optimization of 4 and 5 objectives, respectively. We found similar but refined results for 7 populations, and a larger portfolio considering intra-specific diversity and persistence, with solutions ranging from 8 to 22 populations. This is the first study to apply multi-objective algorithms to an SCP problem using alleles at the population level as the basic units of analysis.
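The core coverage objective (represent every allele with as few populations as possible) is a set-cover problem. A greedy sketch gives a quick baseline for that single objective; unlike NSGA-II it cannot trade coverage off against heterozygosity or persistence, and the allele data below are invented.

```python
def greedy_population_cover(pop_alleles):
    """Greedy set cover: repeatedly add the population contributing the most
    still-missing alleles until every allele is represented.
    pop_alleles maps population name -> set of allele identifiers."""
    all_alleles = set().union(*pop_alleles.values())
    covered, chosen = set(), []
    while covered != all_alleles:
        best = max(pop_alleles, key=lambda p: len(pop_alleles[p] - covered))
        chosen.append(best)
        covered |= pop_alleles[best]
    return chosen
```

Greedy set cover is not guaranteed optimal (only within a logarithmic factor), which is one motivation for the evolutionary multi-objective search used in the study.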
Balicer, Ran; Bitterman, Haim; Shadmi, Efrat
2012-07-01
Technological advances combined with the aging of the population bring about a growing number of patients with chronic conditions and multi-morbidity. Multi-morbidity, the co-occurrence of chronic and/or non-chronic conditions in an individual, is the norm among elderly patients, and is becoming increasingly common among younger adults. The Israeli health system, like other systems worldwide, is faced with the challenges posed by the increase in complex multi-morbidity, in an era of growing fiscal constraints, a situation that can induce financial and organizational crises. To effectively cope with such circumstances, a paradigm shift is needed. Health systems need to focus on overall morbidity burden and multi-morbidity (rather than the prevailing "one disease at a time" approach) and on better care integration. The Israeli health system includes many of the essential elements for addressing the challenges of integrated care, including universal health coverage and advanced health information technology systems. Yet, like other health systems, there is a need for care management support mechanisms that are more effectively tailored to meet the needs of highly multi-morbid patients. This review outlines the organizational approach required to better align care for the main customers of health care in the 21st century: patients with multi-morbidity. We focus on four domains: assessment of morbidity burden according to measures that account for the interaction and synergism amongst conditions; integration across the care continuum; enhancement of primary care and self-management support approaches; and provision of uniquely tailored care management solutions for the highest-risk multi-morbid patients.
NASA Astrophysics Data System (ADS)
Fritsche, H.; Koch, Ralf; Krusche, B.; Ferrario, F.; Grohe, Andreas; Pflueger, S.; Gries, W.
2014-05-01
Generating high-power laser radiation with diode lasers is commonly realized by geometrical stacking of diode bars, which results in high output power but a poor beam parameter product (BPP). The accessible brightness in this approach is limited by the fill factor, in both the slow and fast axis. By using a geometry that accesses the BPP of the individual diodes, generating a multi-kilowatt diode laser with a BPP comparable to fiber lasers is possible. We demonstrate such a modular approach for generating multi-kilowatt lasers by combining single-emitter diode lasers. Single-emitter diodes have advantages over bars, mainly simplified cooling, better reliability and higher brightness per emitter. Additionally, because single emitters can be arranged in many different geometries, they allow building laser modules where the brightness of the single emitters is preserved. In order to maintain the high brightness of the single emitters, we developed a modular laser design which uses single emitters in a staircase arrangement, and then couples two such bases via polarization combination to form our basic module. These modules generate up to 160 W with a BPP better than 7.5 mm·mrad. For further power scaling, wavelength stabilization is crucial. The wavelength is stabilized with only one Volume Bragg Grating (VBG) in front of a base, providing the very same feedback to all of the laser diodes. This results in a bandwidth of < 0.5 nm and a wavelength stability of better than 250 MHz over one hour. Dense spectral combination with dichroic mirrors and narrow channel spacing allows us to combine multiple wavelength channels, resulting in a 2 kW laser module with a BPP better than 7.5 mm·mrad, which can easily be coupled into a 100 μm fiber with 0.15 NA.
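The quoted figures are self-consistent: a step-index fiber accepts a beam parameter product of roughly core radius times numerical aperture (small-angle approximation), so a 100 µm, 0.15 NA fiber matches the stated 7.5 mm·mrad module BPP. A quick check:

```python
def fiber_bpp_mm_mrad(core_diameter_um, numerical_aperture):
    """BPP a step-index fiber can accept: core radius (mm) times acceptance
    half-angle (mrad), using sin(theta) ~ theta for small NA."""
    radius_mm = core_diameter_um / 2.0 / 1000.0
    half_angle_mrad = numerical_aperture * 1000.0
    return radius_mm * half_angle_mrad
```

Since polarization and dense spectral combining add power without degrading the spatial BPP, the combined 2 kW beam still fits the same fiber.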
Field Trials of the Multi-Source Approach for Resistivity and Induced Polarization Data Acquisition
NASA Astrophysics Data System (ADS)
LaBrecque, D. J.; Morelli, G.; Fischanger, F.; Lamoureux, P.; Brigham, R.
2013-12-01
Implementing systems of distributed receivers and transmitters for resistivity and induced polarization data acquisition is an almost inevitable result of the availability of wireless data communication modules and GPS modules offering precise timing and instrument locations. Such systems have a number of advantages; for example, they can be deployed around obstacles such as rivers, canyons, or mountains, which would be difficult with traditional 'hard-wired' systems. However, deploying a system of identical, small, battery-powered transceivers, each capable of injecting a known current and measuring the induced potential, has an additional and less obvious advantage in that multiple units can inject current simultaneously. The original purpose for using multiple simultaneous current sources (multi-source) was to increase signal levels. In traditional systems, to double the received signal you inject twice the current, which requires you to apply twice the voltage and thus four times the power. Alternatively, one approach to increasing signal levels for large-scale surveys collected using small, battery-powered transceivers is to allow multiple units to transmit in parallel. In theory, using four 400 watt transmitters on separate, parallel dipoles yields roughly the same signal as a single 6400 watt transmitter. Furthermore, implementing the multi-source approach creates the opportunity to apply more complex current flow patterns than simple, parallel dipoles. For a perfect, noise-free system, multi-sourcing adds no new information to a data set that contains a comprehensive set of data collected using single sources. However, for realistic, noisy systems, it appears that multi-source data can substantially impact survey results. In preliminary model studies, the multi-source data produced such startling improvements in subsurface images that even the authors questioned their veracity.
Between December of 2012 and July of 2013, we completed multi-source surveys at five sites with depths of exploration ranging from 150 to 450 m. The sites included shallow geothermal sites near Reno, Nevada; Pomarance, Italy; and Volterra, Italy; a mineral exploration site near Timmins, Quebec; and a landslide investigation near the Vajont Dam in northern Italy. These sites provided a series of challenges in survey design and deployment, including some extremely difficult terrain and a broad range of background resistivity and induced polarization values. Despite these challenges, comparison of multi-source results to resistivity and induced polarization data collected with more traditional methods supports the thesis that the multi-source approach is capable of providing substantial improvements in both depth of penetration and resolution over conventional approaches.
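The power-scaling argument in the abstract above (four 400 W transmitters matching one 6400 W unit) follows from the signal being proportional to injected current while power grows with the square of that current across a fixed ohmic load; a minimal sketch of the arithmetic:

```python
# Power needed to scale the received signal by a factor k: a single
# ohmic transmitter pays k^2 in power, while k parallel units on
# separate dipoles each pay only the base power.

def single_transmitter_power(base_power_w: float, k: int) -> float:
    # k times the current requires k times the voltage across the same
    # load, so power grows with the square of the signal gain.
    return base_power_w * k ** 2

def multi_source_power(base_power_w: float, k: int) -> float:
    # k identical units each contribute the base current in parallel.
    return base_power_w * k

print(single_transmitter_power(400, 4))  # 6400 W, as in the abstract
print(multi_source_power(400, 4))        # 1600 W total across four units
```

The four-unit configuration reaches the same theoretical signal level at a quarter of the total power.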
Chedid, Mokbel K; Tundo, Kelly M; Block, Jon E; Muir, Jeffrey M
2015-01-01
Autologous iliac crest bone graft is the preferred option for spinal fusion, but the morbidity associated with bone harvest and the need for graft augmentation in more demanding cases necessitates combining local bone with bone substitutes. The purpose of this study was to document the clinical effectiveness and safety of a novel hybrid biosynthetic scaffold material consisting of poly(D,L-lactide-co-glycolide) (PLGA, 75:25) combined by lyophilization with unmodified high molecular weight hyaluronic acid (10-12% wt:wt) as an extender for a broad range of spinal fusion procedures. We retrospectively evaluated all patients undergoing single- and multi-level posterior lumbar interbody fusion at an academic medical center over a 3-year period. A total of 108 patients underwent 109 procedures (245 individual vertebral levels). Patient-related outcomes included pain measured on a Visual Analog Scale. Radiographic outcomes were assessed at 6 weeks, 3-6 months, and 1 year postoperatively. Radiographic fusion or progression of fusion was documented in 221 of 236 index levels (93.6%) at a mean (±SD) time to fusion of 10.2 ± 4.1 months. Single and multi-level fusions were not associated with significantly different success rates. Mean pain scores (±SD) for all patients improved from 6.8 ± 2.5 at baseline to 3.6 ± 2.9 at approximately 12 months. Improvements in VAS were greatest in patients undergoing one- or two-level fusion, with patients undergoing multi-level fusion demonstrating lesser but still statistically significant improvements. Overall, stable fusion was observed in 64.8% of vertebral levels; partial fusion was demonstrated in 28.8% of vertebral levels. Only 15 of 236 levels (6.4%) were non-fused at final follow-up.
Life Cycle Assessment of Mixed Municipal Solid Waste: Multi-input versus multi-output perspective.
Fiorentino, G; Ripa, M; Protano, G; Hornsby, C; Ulgiati, S
2015-12-01
This paper analyses four strategies for managing Mixed Municipal Solid Waste (MMSW) in terms of their environmental impacts and potential advantages by means of the Life Cycle Assessment (LCA) methodology. To this aim, both a multi-input and a multi-output approach are applied to evaluate the effect of these perspectives on selected impact categories. The analyzed management options include direct landfilling with energy recovery (S-1), Mechanical-Biological Treatment (MBT) followed by Waste-to-Energy (WtE) conversion (S-2), a combination of an innovative MBT/MARSS (Material Advanced Recovery Sustainable Systems) process and landfill disposal (S-3), and finally a combination of the MBT/MARSS process with WtE conversion (S-4). The MARSS technology, developed within a European LIFE PLUS framework and currently implemented at pilot plant scale, is an innovative MBT plant whose main goal is to yield a Renewable Refined Biomass Fuel (RRBF) to be used for combined heat and power (CHP) production under the regulations enforced for biomass-based plants instead of Waste-to-Energy systems, for increased environmental performance. The four scenarios are characterized by different resource investments for plant and infrastructure construction and different quantities of matter, heat and electricity recovery and recycling. Results, calculated per unit mass of waste treated and per unit exergy delivered, under both multi-input and multi-output LCA perspectives, point to improved performance for scenarios characterized by increased matter and energy recovery. Although none of the investigated scenarios is capable of providing the best performance in all the analyzed impact categories, scenario S-4 shows the best LCA results in the human toxicity and freshwater eutrophication categories, i.e. the ones with the highest impacts in all waste management processes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Kankaanpää, Aino; Ariniemi, Kari; Heinonen, Mari; Kuoppasalmi, Kimmo; Gunnar, Teemu
2016-10-15
No single measure is able to provide a complete picture of population- or community-level drug abuse and its current trends. Therefore, a multi-indicator approach is needed. The aim of this study was to combine wastewater-based epidemiology (WBE) with data from other national indicators, namely driving under the influence of drugs (DUID) statistics, drug seizures, and drug use surveys. Furthermore, drug market size estimates and a comparison of confiscated drugs to drugs actually consumed by users were performed using the WBE approach. Samples for wastewater analysis were collected during one-week sampling periods in 2012, 2014 and 2015, with a maximum of 14 cities participating. The samples were analysed with a validated ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) methodology for various common drugs of abuse. The results were then compared with data from the other national indicators available. Joint interpretation of the data shows that the use of amphetamine and MDMA increased in Finland from 2012 to 2014. A similar trend was also observed for cocaine, although its use remains at a very low level compared to many other European countries. Heroin was practically absent from the Finnish drug market during the study period. The retail markets for the most common stimulant drugs were estimated to have been worth EUR 70 million for amphetamine and around EUR 10 million each for methamphetamine and cocaine, in 2014 in Finland. Copyright © 2016 Elsevier B.V. All rights reserved.
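The back-calculation step of wastewater-based epidemiology is commonly done along the following lines; this is a generic sketch with hypothetical parameter values, not numbers or code from the study:

```python
# Illustrative WBE back-calculation: influent concentration of a drug
# (or its metabolite) -> population-normalized consumption. All inputs
# below are hypothetical, not values from the Finnish study.

def consumption_mg_per_day_per_1000(conc_ng_per_l: float,
                                    flow_l_per_day: float,
                                    population: int,
                                    correction_factor: float) -> float:
    """The correction factor folds in the excretion fraction and, when a
    metabolite is measured, the parent/metabolite molar-mass ratio."""
    load_mg_per_day = conc_ng_per_l * flow_l_per_day * 1e-6  # ng -> mg
    return load_mg_per_day * correction_factor / population * 1000.0

# e.g. 500 ng/L, 50 ML/day plant inflow, 100 000 inhabitants served
print(consumption_mg_per_day_per_1000(500.0, 5e7, 100_000, 1.0))  # 250.0
```

Multiplying such per-capita loads by typical doses and street prices is what underlies market-size estimates like the EUR figures quoted above.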
NASA Astrophysics Data System (ADS)
Linard, J.; Leib, K.; Colorado Water Science Center
2010-12-01
Elevated levels of salinity and dissolved selenium can detrimentally affect the quality of water for anthropogenic and natural uses. In areas such as the lower Gunnison Basin of western Colorado, salinity and selenium are such a concern that control projects are implemented to limit their mobilization. To prioritize the locations in which control projects are implemented, multi-parameter regression models were developed to identify subbasins in the lower Gunnison River Basin that were most likely to have elevated salinity and dissolved selenium levels. The drainage area is about 5,900 mi² and is underlain by Cretaceous marine shale, which is the most common source of salinity and dissolved selenium. To characterize the complex hydrologic and chemical processes governing constituent mobilization, geospatial variables representing 70 different environmental characteristics were correlated to mean seasonal (irrigation and non-irrigation seasons) salinity and selenium yields estimated at 154 sampling sites. The variables generally represented characteristics of the physical basin, precipitation, soil, geology, land use, and irrigation water delivery systems. Irrigation and non-irrigation seasons were selected due to documented effects of irrigation on constituent mobilization. Following a stepwise approach, combinations of the geospatial variables were used to develop four multi-parameter regression models. These models predicted salinity and selenium yield, within a 95 percent confidence range, at individual points in the lower Gunnison Basin for irrigation and non-irrigation seasons. The corresponding subbasins were ranked according to their potential to yield salinity and selenium, and the rankings were used to prioritize areas that would most benefit from control projects.
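A stepwise build-up of a multi-parameter regression over many candidate variables, as described above, can be sketched with a simple forward-selection loop (toy data and variable names are ours, not the study's):

```python
import numpy as np

# Toy forward-stepwise selection over candidate predictors, sketching
# the kind of procedure described in the abstract.

def forward_stepwise(X, y, max_terms=4):
    """Greedily add the column that most reduces the residual sum of squares."""
    n, p = X.shape
    selected = []
    for _ in range(max_terms):
        A = np.column_stack([np.ones(n), X[:, selected]])  # current model
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        best, best_rss = None, np.sum((y - A @ beta) ** 2)
        for j in range(p):                 # try each unused predictor
            if j in selected:
                continue
            Aj = np.column_stack([A, X[:, j]])
            bj, *_ = np.linalg.lstsq(Aj, y, rcond=None)
            rss = np.sum((y - Aj @ bj) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        if best is None:                   # no predictor improves the fit
            break
        selected.append(best)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))              # 6 candidate geospatial variables
y = 3.0 * X[:, 2] - 2.0 * X[:, 5] + rng.normal(scale=0.1, size=100)
print(forward_stepwise(X, y, max_terms=2))  # selects columns 2 and 5
```

A production version would add a stopping criterion (e.g. an F-test or adjusted R²) rather than a fixed term count.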
Persson, Ulf; Norlin, J M
2018-04-01
Many pharmaceuticals are effective in multiple indications, and the degree of effectiveness may differ. A product-based pricing and reimbursement system with a single price per product is insufficient to reflect how value varies between indications. The objective of this article is to present examples of actual pricing and reimbursement decisions under current value-based pricing in Sweden and to discuss their implications and possible solutions. The value of several cancer drugs was estimated for various indications based on a willingness-to-pay threshold of 1 million SEK (EUR 104,000) per QALY gained. For some drugs, the estimated value was higher than the drug acquisition cost in several indications, whilst in others, the estimated value was lower than the drug acquisition cost. Drugs used in combination present a special case. If a drug prolongs survival and consequently also a continued use of the anchor drug, the combination use may not be cost effective even at a zero price. In a product-based pricing and reimbursement system, patients may not get access to drugs, or access may be delayed, and manufacturers may be discouraged from investing in future indications. To overcome these issues, there are several approaches to link price and value. One approach is a "weighted-average" price based on an average of the value across all indications. Another is "multi-indication pricing," which enables price differentiation between indications. However, there are several barriers to applying multi-indication pricing and reimbursement schemes. One barrier is the lack of existing administrative infrastructure to track patients' indications.
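The 'weighted-average' price mentioned above reduces to simple volume-weighted arithmetic; a minimal sketch with hypothetical per-indication values and patient volumes:

```python
# A "weighted-average" single price across indications: each
# indication's value-based price weighted by its expected patient
# volume. All figures below are hypothetical.

def weighted_average_price(prices, volumes):
    total = sum(volumes)
    return sum(p * v for p, v in zip(prices, volumes)) / total

# indication A: high value, 1000 patients; indication B: lower value, 3000
print(weighted_average_price([80_000, 20_000], [1000, 3000]))  # 35000.0
```

Multi-indication pricing instead keeps the two prices separate, which is where the tracking infrastructure mentioned above becomes necessary.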
Multi-level molecular modelling for plasma medicine
NASA Astrophysics Data System (ADS)
Bogaerts, Annemie; Khosravian, Narjes; Van der Paal, Jonas; Verlackt, Christof C. W.; Yusupov, Maksudbek; Kamaraj, Balu; Neyts, Erik C.
2016-02-01
Modelling at the molecular or atomic scale can be very useful for obtaining a better insight into plasma medicine. This paper gives an overview of different atomic/molecular scale modelling approaches that can be used to study the direct interaction of plasma species with biomolecules or the consequences of these interactions for the biomolecules on a somewhat longer time-scale. These approaches include density functional theory (DFT), density functional based tight binding (DFTB), classical reactive and non-reactive molecular dynamics (MD) and united-atom or coarse-grained MD, as well as hybrid quantum mechanics/molecular mechanics (QM/MM) methods. Specific examples are given for three important types of biomolecules present in human cells, i.e. proteins, DNA and the phospholipids found in the cell membrane. The results show that each of these modelling approaches has its specific strengths and limitations, and is particularly useful for certain applications. A multi-level approach is therefore most suitable for obtaining a global picture of plasma-biomolecule interactions.
The Effect of Visual Information on the Manual Approach and Landing
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1982-01-01
The effect of visual information, in combination with basic display information, on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.
Multi-jet Merging with NLO Matrix Elements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siegert, Frank (Freiburg U.); Hoche, Stefan
2011-08-18
In the algorithm presented here, the ME+PS approach for merging samples of tree-level matrix elements into inclusive event samples is combined with the POWHEG method, which includes exact next-to-leading order matrix elements in the parton shower. The advantages of the method are discussed and the quality of its implementation in SHERPA is exemplified by results for e+e- annihilation into hadrons at LEP, for deep-inelastic lepton-nucleon scattering at HERA, for Drell-Yan lepton-pair production at the Tevatron and for W+W- production at LHC energies. The simulation of hard QCD radiation in parton-shower Monte Carlos has seen tremendous progress over the last years. It was largely stimulated by the need for more precise predictions at LHC energies, where the large available phase space allows additional hard QCD radiation alongside known Standard Model processes or even signals from new physics. Two types of algorithms have been developed which make it possible to improve upon the soft-collinear approximations made in the parton shower, such that hard radiation is simulated according to exact matrix elements. In the ME+PS approach [1], higher-order tree-level matrix elements for different final-state jet multiplicities are merged with each other and with subsequent parton shower emissions to generate an inclusive sample. Such a prescription is invaluable for analyses which are sensitive to final states with a large jet multiplicity. The only remaining deficiency of such tree-level calculations is the large uncertainty stemming from scale variations. The POWHEG method [2] solves this problem for the lowest-multiplicity subprocess by combining full NLO matrix elements with the parton shower. While this leads to NLO accuracy in the inclusive cross section and the exact radiation pattern for the first emission, it fails to describe higher-order emissions with improved accuracy.
Thus it is not sufficient when final states with high jet multiplicities are considered. Given the complementary advantages of these two approaches, the question naturally arises whether it is possible to combine them into an even more powerful one. Such a combined algorithm was independently developed in [5] and [6]. Here a summary of the algorithm is given and corresponding Monte-Carlo predictions are presented.
Hirai, Tadayoshi; Oikawa, Akira; Matsuda, Fumio; Fukushima, Atsushi; Arita, Masanori; Watanabe, Shin; Yano, Megumu; Hiwasa-Tanase, Kyoko; Ezura, Hiroshi; Saito, Kazuki
2011-01-01
As metabolomics can provide a biochemical snapshot of an organism's phenotype, it is a promising approach for charting the unintended effects of genetic modification. A critical obstacle for this application is the inherently limited metabolomic coverage of any single analytical platform. We propose using multiple analytical platforms for the direct acquisition of an interpretable data set of estimable chemical diversity. As an example, we report an application of our multi-platform approach that assesses the substantial equivalence of tomatoes over-expressing the taste-modifying protein miraculin. In combination, the chosen platforms detected compounds that represent 86% of the estimated chemical diversity of the metabolites listed in the LycoCyc database. Following a proof-of-safety approach, we show that % had an acceptable range of variation while simultaneously indicating a reproducible transformation-related metabolic signature. We conclude that multi-platform metabolomics is an approach that is both sensitive and robust and that it constitutes a good starting point for characterizing genetically modified organisms. PMID:21359231
A self-consistent first-principle based approach to model carrier mobility in organic materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meded, Velimir; Friederich, Pascal; Symalla, Franz
2015-12-31
Transport through thin organic amorphous films, utilized in OLEDs and OPVs, has been a challenge to model using ab-initio methods. Charge carrier mobility depends strongly on the disorder strength and reorganization energy, both of which are significantly affected by the details of the environment of each molecule. Here we present a multi-scale approach to describe carrier mobility in which the materials morphology is generated using DEPOSIT, a Monte Carlo based atomistic simulation approach, or, alternatively, by molecular dynamics calculations performed with GROMACS. From this morphology we extract the material-specific hopping rates, as well as the on-site energies, using a fully self-consistent embedding approach to compute the electronic structure parameters, which are then used in an analytic expression for the carrier mobility. We apply this strategy to compute the carrier mobility for a set of widely studied molecules and obtain good agreement between experiment and theory, varying over several orders of magnitude in the mobility, without any freely adjustable parameters. The work focuses on the quantum mechanical step of the multi-scale workflow and explains the concept along with the recently published workflow optimization, which combines density functional with semi-empirical tight binding approaches. This is followed by a discussion of the analytic formula and its agreement with established percolation fits as well as kinetic Monte Carlo numerical approaches. Finally, we sketch a unified multi-disciplinary approach that integrates materials science simulation and high performance computing, developed within the EU project MMM@HPC.
Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza
2016-12-01
Risk assessment can be classified into two broad categories: traditional and modern. This paper is aimed at contrasting the functional resonance analysis method (FRAM), as a modern approach, with fault tree analysis (FTA), as a traditional method, with regard to assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. Also, a FRAM network is executed with regard to the nonlinear interaction of human and organizational levels to assess the safety of technological systems. The methodology is implemented for the lifting of structures deep offshore. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.
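For the FTA side of the comparison, the top-event probability of a fault tree with independent basic events follows from AND/OR gate algebra; a minimal sketch with hypothetical event probabilities (not values from the paper):

```python
# Minimal fault-tree evaluation assuming independent basic events.
# Gate structure and probabilities below are hypothetical.

from math import prod

def and_gate(probs):
    """Output event occurs only if all inputs occur."""
    return prod(probs)

def or_gate(probs):
    """Output event occurs if any input occurs (independence assumed)."""
    return 1.0 - prod(1.0 - p for p in probs)

# e.g. dropped load = (sling failure OR hook failure) AND operator error
p_top = and_gate([or_gate([0.01, 0.02]), 0.05])
print(p_top)   # ~0.00149
```

FRAM, by contrast, has no such closed-form combinatorics, which is why the paper argues the two views are complementary.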
NASA Astrophysics Data System (ADS)
Hofmann, Dietrich; Dittrich, Paul-Gerald; Gärtner, Claudia; Klemm, Richard
2013-03-01
The aim of this paper is to orient research and development toward a completely new approach to innovative in-field and point-of-care diagnostics in industry, biology and medicine. The central functional modules are smartphones and/or smartpads supplemented by additional hardware apps and software apps. Specific examples are given for numerous practical applications concerning optodigital instrumentation. The methodical classification distinguishes between different levels of combination of hardware apps (hwapps) and software apps (swapps) with smartphones and/or smartpads. These methods are fundamental enablers for the transformation from stationary conventional laboratory diagnostics to mobile innovative in-field and point-of-care diagnostics. The innovative approach opens so far untapped, enormous markets due to the convenience, reliability and affordability of smartphone and/or smartpad instruments. A highly visible advantage of smartphones and/or smartpads is their huge installed base, their worldwide connectivity via cloud services, and their users' experience with practical operation.
Multi-atlas segmentation enables robust multi-contrast MRI spleen segmentation for splenomegaly
NASA Astrophysics Data System (ADS)
Huo, Yuankai; Liu, Jiaqi; Xu, Zhoubing; Harrigan, Robert L.; Assad, Albert; Abramson, Richard G.; Landman, Bennett A.
2017-02-01
Non-invasive spleen volume estimation is essential in detecting splenomegaly. Magnetic resonance imaging (MRI) has been used to facilitate splenomegaly diagnosis in vivo. However, achieving accurate spleen volume estimation from MR images is challenging given the great inter-subject variance of human abdomens and wide variety of clinical images/modalities. Multi-atlas segmentation has been shown to be a promising approach to handle heterogeneous data and difficult anatomical scenarios. In this paper, we propose to use multi-atlas segmentation frameworks for MRI spleen segmentation for splenomegaly. To the best of our knowledge, this is the first work that integrates multi-atlas segmentation for splenomegaly as seen on MRI. To address the particular concerns of spleen MRI, automated and novel semi-automated atlas selection approaches are introduced. The automated approach interactively selects a subset of atlases using selective and iterative method for performance level estimation (SIMPLE) approach. To further control the outliers, semi-automated craniocaudal length based SIMPLE atlas selection (L-SIMPLE) is proposed to introduce a spatial prior in a fashion to guide the iterative atlas selection. A dataset from a clinical trial containing 55 MRI volumes (28 T1 weighted and 27 T2 weighted) was used to evaluate different methods. Both automated and semi-automated methods achieved median DSC > 0.9. The outliers were alleviated by the L-SIMPLE (≈1 min manual efforts per scan), which achieved 0.9713 Pearson correlation compared with the manual segmentation. The results demonstrated that the multi-atlas segmentation is able to achieve accurate spleen segmentation from the multi-contrast splenomegaly MRI scans.
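The SIMPLE-style iterative atlas selection described above can be illustrated with a toy version: fuse the current atlas set by majority vote, score each atlas against the fused estimate, and drop poor performers. The thresholding rule and data below are simplifications of ours, not the paper's implementation:

```python
import numpy as np

# Toy SIMPLE-style iterative atlas selection on 2-D boolean masks.

def dice(a, b):
    """Dice overlap between two boolean masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def simple_select(atlas_masks, n_iter=3, alpha=0.7):
    """Iteratively drop atlases whose Dice against the fused estimate
    falls below a fraction of the best atlas's score (a simplified rule)."""
    keep = list(range(len(atlas_masks)))
    for _ in range(n_iter):
        fused = np.mean([atlas_masks[i] for i in keep], axis=0) >= 0.5
        scores = {i: dice(atlas_masks[i], fused) for i in keep}
        cutoff = alpha * max(scores.values())
        keep = [i for i in keep if scores[i] >= cutoff]
    return keep

rng = np.random.default_rng(1)
truth = np.zeros((32, 32), dtype=bool)
truth[8:24, 8:24] = True                     # toy "spleen" region
good = [truth ^ (rng.random(truth.shape) < 0.02) for _ in range(4)]
bad = [rng.random(truth.shape) < 0.5]        # one uninformative atlas
print(simple_select(good + bad))             # the outlier (index 4) is dropped
```

L-SIMPLE adds a craniocaudal-length prior on top of this loop so that anatomically implausible atlases are penalized even when their overlap scores look acceptable.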
Investigation of some selected strategies for multi-GNSS instantaneous RTK positioning
NASA Astrophysics Data System (ADS)
Paziewski, Jacek; Wielgosz, Pawel
2017-01-01
It is clear that we can benefit from multi-constellation GNSS in precise relative positioning. On the other hand, it is still an open problem how to combine multi-GNSS signals in a single functional model. This study presents the methodology and a quality assessment of selected methods for combining multi-GNSS observations in relative kinematic positioning using baselines up to tens of kilometers. Specifically, this paper characterizes loose and tight integration strategies applied to the ionosphere- and troposphere-weighted model. Performance assessment of the established strategies was based on analyses of the integer ambiguity resolution and rover coordinate repeatability obtained in medium-range instantaneous RTK positioning with the use of full-constellation dual-frequency GPS and Galileo signals. Since a full constellation of Galileo satellites is not yet available, the observational data were obtained from a hardware GNSS signal simulator using regular geodetic GNSS receivers. The results indicate similarly high performance of the loose integration and of the tight integration with calibrated receiver inter-system biases (ISBs). These approaches have an undeniable advantage over single-system positioning in terms of the reliability of the integer ambiguity resolution as well as rover coordinate repeatability.
NASA Astrophysics Data System (ADS)
Andrade, Xavier; Alberdi-Rodriguez, Joseba; Strubbe, David A.; Oliveira, Micael J. T.; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Louie, Steven G.; Aspuru-Guzik, Alán; Rubio, Angel; Marques, Miguel A. L.
2012-06-01
Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures.
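The 'blocks of Kohn-Sham states' strategy mentioned above pays off because applying an operator to a block of states is one matrix-matrix product instead of many matrix-vector products, which is exactly the shape of work BLAS libraries and GPUs parallelize well. A toy dense-matrix illustration (in octopus the operator is a real-space grid stencil, not a dense matrix):

```python
import numpy as np

# State-by-state application vs. one blocked application of a toy
# Hermitian operator to a block of orbitals: identical results, but the
# blocked form is a single GEMM that hardware can parallelize.

rng = np.random.default_rng(0)
n_grid, n_states = 512, 64
H = rng.normal(size=(n_grid, n_grid))
H = 0.5 * (H + H.T)                          # Hermitian toy Hamiltonian
psi_block = rng.normal(size=(n_grid, n_states))

one_by_one = np.column_stack([H @ psi_block[:, i] for i in range(n_states)])
blocked = H @ psi_block                      # single matrix-matrix product
print(np.allclose(one_by_one, blocked))      # True
```

The same reshaping is what makes the strategy useful on standard CPUs too, as the abstract notes: vectorized BLAS3 calls beat repeated BLAS2 calls.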
A multi-criteria spatial deprivation index to support health inequality analyses.
Cabrera-Barona, Pablo; Murphy, Thomas; Kienberger, Stefan; Blaschke, Thomas
2015-03-20
Deprivation indices are useful measures for analyzing health inequalities. There are several methods to construct these indices; however, few studies have used Geographic Information Systems (GIS) and Multi-Criteria methods to construct a deprivation index. Therefore, this study applies Multi-Criteria Evaluation to calculate weights for the indicators that make up the deprivation index, and a GIS-based fuzzy approach to create different scenarios of this index is also implemented. The Analytical Hierarchy Process (AHP) is used to obtain the weights for the indicators of the index. The Ordered Weighted Averaging (OWA) method using linguistic quantifiers is applied in order to create different deprivation scenarios. Geographically Weighted Regression (GWR) and a Moran's I analysis are employed to explore spatial relationships between the different deprivation measures and two health factors: the distance to health services and the percentage of people that have never had a live birth. This last indicator was the dependent variable in the GWR. The case study is Quito City, in Ecuador. The AHP-based deprivation index shows medium and high levels of deprivation (0.511 to 1.000) in specific zones of the study area, even though most of the study area has low values of deprivation. The OWA results show deprivation scenarios that can be evaluated considering the different attitudes of decision makers. The GWR results indicate that the deprivation index and its OWA scenarios can be considered local estimators for health-related phenomena. Moran's I calculations demonstrate that several deprivation scenarios, in combination with the 'distance to health services' factor, could be explanatory variables to predict the percentage of people that have never had a live birth.
The AHP-based deprivation index and the OWA deprivation scenarios developed in this study are Multi-Criteria instruments that can support the identification of highly deprived zones and can support health inequality analyses in combination with different health factors. The methodology described in this study can be applied in other regions of the world to develop spatial deprivation indices based on Multi-Criteria analysis.
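The two Multi-Criteria building blocks described above can be sketched compactly: AHP derives indicator weights from the principal eigenvector of a pairwise-comparison matrix, and OWA attaches weights to rank positions rather than to specific criteria. The comparison matrix and scores below are hypothetical, not the study's indicators:

```python
import numpy as np

# AHP weights from a pairwise-comparison matrix, and an OWA aggregation
# whose weights apply to sorted (largest-first) scores.

def ahp_weights(pairwise):
    """Principal right eigenvector of the comparison matrix, normalized."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def owa(scores, order_weights):
    """Order weights attach to rank positions, not to criteria."""
    return float(np.sort(scores)[::-1] @ order_weights)

# 3 indicators: A judged 3x as important as B and 5x as important as C
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(P)
print(w.round(3))                 # roughly [0.65, 0.23, 0.12]
print(owa([0.9, 0.2, 0.5], [0.6, 0.3, 0.1]))   # "optimistic" quantifier
```

Varying the order weights is what produces the different OWA scenarios: front-loaded weights give an optimistic (high-deprivation-emphasizing) aggregation, back-loaded weights a pessimistic one.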