ERIC Educational Resources Information Center
Hallberg, Kelly; Cook, Thomas D.; Figlio, David
2013-01-01
The goal of this paper is to provide guidance for applied education researchers in using multi-level data to study the effects of interventions implemented at the school level. Two primary approaches are currently employed in observational studies of the effect of school-level interventions. One approach employs intact school matching: matching…
NASA Astrophysics Data System (ADS)
Tabik, S.; Romero, L. F.; Mimica, P.; Plata, O.; Zapata, E. L.
2012-09-01
A broad area in astronomy focuses on simulating extragalactic objects based on Very Long Baseline Interferometry (VLBI) radio-maps. Several algorithms in this scope simulate what the observed radio-maps would be if emitted from a predefined extragalactic object. This work analyzes the performance and scaling of this kind of algorithm on multi-socket, multi-core architectures. In particular, we evaluate a sharing approach, a privatizing approach and a hybrid approach on systems with a complex memory hierarchy that includes a shared Last Level Cache (LLC). In addition, we investigate which manual processes can be systematized and then automated in future work. The experiments show that the data-privatizing model scales efficiently on medium-scale multi-socket, multi-core systems (up to 48 cores), while the sharing approach, regardless of algorithmic and scheduling optimizations, is unable to reach acceptable scalability on more than one socket. The hybrid model with a specific level of data-sharing, however, provides the best scalability over all of the multi-socket, multi-core systems used.
Butel, Jean; Braun, Kathryn L; Novotny, Rachel; Acosta, Mark; Castro, Rose; Fleming, Travis; Powers, Julianne; Nigg, Claudio R
2015-12-01
Addressing complex chronic disease prevention, like childhood obesity, requires a multi-level, multi-component, culturally relevant approach with broad reach. Models are lacking to guide fidelity monitoring across the multiple levels, components, and sites engaged in such interventions. The aim of this study is to describe the fidelity-monitoring approach of the Children's Healthy Living (CHL) Program, a multi-level, multi-component intervention in five Pacific jurisdictions. A fidelity-monitoring rubric was developed. About halfway through the intervention, community partners were randomly selected and interviewed independently by local CHL staff and by Coordinating Center representatives to assess treatment fidelity. Ratings were compared and discussed by local and Coordinating Center staff. There was good agreement between the teams (Kappa = 0.50, p < 0.001), and intervention improvement opportunities were identified through data review and group discussion. Fidelity for the multi-level, multi-component, multi-site CHL intervention was successfully assessed, identifying adaptations as well as ways to improve intervention delivery prior to the end of the intervention.
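The agreement statistic reported above (Kappa = 0.50) is Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch of computing it for two raters follows; the function name and example labels are illustrative, not taken from the CHL study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # proportion of items on which the two raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal label frequencies
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

With identical ratings kappa is 1; with agreement no better than chance it is 0.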
Zhang, Xinyan; Li, Bingzong; Han, Huiying; Song, Sha; Xu, Hongxia; Hong, Yating; Yi, Nengjun; Zhuang, Wenzhuo
2018-05-10
Multiple myeloma (MM), like other cancers, is caused by the accumulation of genetic abnormalities. Heterogeneity exists in patients' response to treatments, for example, bortezomib. This urges efforts to identify biomarkers from numerous molecular features and build predictive models for identifying patients that can benefit from a certain treatment scheme. However, previous studies treated the multi-level ordinal drug response as a binary response, considering only responsive and non-responsive groups. It is desirable to analyze the multi-level drug response directly, rather than collapsing it into two groups. In this study, we present a novel method to identify significantly associated biomarkers and then develop an ordinal genomic classifier using the hierarchical ordinal logistic model. The proposed hierarchical ordinal logistic model employs the heavy-tailed Cauchy prior on the coefficients and is fitted by an efficient quasi-Newton algorithm. We apply our hierarchical ordinal regression approach to analyze two publicly available datasets for MM with five-level drug response and numerous gene expression measures. Our results show that our method is able to identify genes associated with the multi-level drug response and to generate powerful predictive models for predicting the multi-level response. The proposed method allows us to jointly fit numerous correlated predictors and thus build efficient models for predicting the multi-level drug response. The predictive model for the multi-level drug response can be more informative than previous approaches. Thus, the proposed approach provides a powerful tool for predicting multi-level drug response and has an important impact on cancer studies.
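A hedged sketch of the kind of model described: a proportional-odds (ordinal logistic) likelihood with a Cauchy prior on the coefficients, fitted by a quasi-Newton (BFGS) optimizer. This is a generic illustration, not the authors' implementation; the function names, prior scale, and three-level example are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(params, X, y, n_levels, scale=2.5):
    """Proportional-odds likelihood plus a Cauchy(0, scale) prior on beta."""
    n_cut = n_levels - 1
    cuts = np.sort(params[:n_cut])            # increasing cutpoints
    beta = params[n_cut:]
    eta = X @ beta
    # cumulative probabilities P(y <= k) = sigmoid(c_k - eta)
    cum = 1.0 / (1.0 + np.exp(-(cuts[None, :] - eta[:, None])))
    cum = np.hstack([np.zeros((len(y), 1)), cum, np.ones((len(y), 1))])
    probs = cum[np.arange(len(y)), y + 1] - cum[np.arange(len(y)), y]
    nll = -np.sum(np.log(np.clip(probs, 1e-12, None)))
    prior = np.sum(np.log(1.0 + (beta / scale) ** 2))  # heavy-tailed shrinkage
    return nll + prior

def fit_ordinal(X, y, n_levels):
    """Fit cutpoints and coefficients with a quasi-Newton method (BFGS)."""
    n_cut = n_levels - 1
    x0 = np.concatenate([np.linspace(-1, 1, n_cut), np.zeros(X.shape[1])])
    res = minimize(neg_log_posterior, x0, args=(X, y, n_levels), method="BFGS")
    return np.sort(res.x[:n_cut]), res.x[n_cut:]
```

On simulated data with a positive true effect, the fitted coefficient recovers the correct sign.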
Multi-scale heat and mass transfer modelling of cell and tissue cryopreservation
Xu, Feng; Moon, Sangjun; Zhang, Xiaohui; Shao, Lei; Song, Young Seok; Demirci, Utkan
2010-01-01
Cells and tissues undergo complex physical processes during cryopreservation. Understanding the underlying physical phenomena is critical to improve current cryopreservation methods and to develop new techniques. Here, we describe multi-scale approaches for modelling cell and tissue cryopreservation including heat transfer at macroscale level, crystallization, cell volume change and mass transport across cell membranes at microscale level. These multi-scale approaches allow us to study cell and tissue cryopreservation. PMID:20047939
Adaptive multi-GPU Exchange Monte Carlo for the 3D Random Field Ising Model
NASA Astrophysics Data System (ADS)
Navarro, Cristóbal A.; Huang, Wei; Deng, Youjin
2016-08-01
This work presents an adaptive multi-GPU Exchange Monte Carlo approach for the simulation of the 3D Random Field Ising Model (RFIM). The design is based on a two-level parallelization. The first level, spin-level parallelism, maps the parallel computation onto optimal 3D thread-blocks that simulate blocks of spins in shared memory with minimal halo surface, assuming a constant block volume. The second level, replica-level parallelism, uses multi-GPU computation to handle the simulation of an ensemble of replicas. CUDA's concurrent kernel execution feature is used to fill the occupancy of each GPU with many replicas, providing a performance boost that is most pronounced at the smallest values of L. In addition to the two-level parallel design, the work proposes an adaptive multi-GPU approach that dynamically builds a proper temperature set free of exchange bottlenecks. The strategy is based on mid-point insertions at the temperature gaps where the exchange rate is most compromised. The extra work generated by the insertions is balanced across the GPUs independently of where the mid-point insertions were performed. Performance results show that spin-level performance is approximately two orders of magnitude faster than a single-core CPU version and one order of magnitude faster than a parallel multi-core CPU version running on 16 cores. Multi-GPU performance scales well in a weak scaling setting, reaching up to 99% efficiency as long as the number of GPUs and L increase together. The combination of the adaptive approach with the parallel multi-GPU design has extended the range of accessible simulations to sizes of L = 32, 64 on a workstation with two GPUs. Sizes beyond L = 64 can eventually be studied using larger multi-GPU systems.
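The adaptive temperature-set strategy above can be sketched as a mid-point insertion wherever the measured exchange rate between neighbouring temperatures falls below some acceptable level. The threshold value and function name here are hypothetical, for illustration only.

```python
def insert_midpoints(temps, exchange_rates, threshold=0.2):
    """Insert a mid-point temperature into every gap whose replica
    exchange rate falls below `threshold` (hypothetical cutoff).

    temps: sorted temperature list of length n
    exchange_rates: measured rates for the n-1 neighbouring pairs
    """
    new_temps = [temps[0]]
    for t_lo, t_hi, rate in zip(temps, temps[1:], exchange_rates):
        if rate < threshold:
            new_temps.append(0.5 * (t_lo + t_hi))  # mid-point insertion
        new_temps.append(t_hi)
    return new_temps
```

Only the bottleneck gap receives an extra replica; well-mixing gaps are left unchanged.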
NASA Astrophysics Data System (ADS)
Taher, M.; Hamidah, I.; Suwarma, I. R.
2017-09-01
This paper outlines the results of an experimental study on the effects of a multi-representation approach to learning Archimedes' Law on students' mental model improvement. The multi-representation techniques implemented in the study were verbal, pictorial, mathematical, and graphical representations. Students' mental models were classified into three levels, i.e. scientific, synthetic, and initial, based on the students' level of understanding. The present study employed a pre-experimental methodology, using a one-group pretest-posttest design. The subjects of the study were 32 eleventh-grade students in a public senior high school in Riau Province. The research instrument included a mental model test on the hydrostatic pressure concept, in the form of an essay test judged by experts. The findings showed a positive change in students' mental models, indicating that the multi-representation approach was effective in improving students' mental models.
Multi-fidelity methods for uncertainty quantification in transport problems
NASA Astrophysics Data System (ADS)
Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.
2016-12-01
We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, a re-scaled Multi Level Monte Carlo (rMLMC) method. The rMLMC is based on the idea that the statistics of quantities of interest depends on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss advantages of each approach.
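MLMC variants such as the rMLMC above rest on the telescoping identity E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}]: many cheap low-fidelity samples estimate the base term, while few expensive paired samples estimate the corrections. A minimal sketch follows; the sampler interface is an assumption for illustration, not the paper's code.

```python
import numpy as np

def mlmc_estimate(samplers, n_samples, rng):
    """Multi Level Monte Carlo estimator.

    samplers[l](n, rng) returns n paired draws (coarse, fine) of the
    quantity of interest at level l; level 0 uses only its 'fine' draws.
    """
    total = 0.0
    for level, (sampler, n) in enumerate(zip(samplers, n_samples)):
        coarse, fine = sampler(n, rng)
        if level == 0:
            total += np.mean(fine)           # base level: plain Monte Carlo
        else:
            total += np.mean(fine - coarse)  # correction term E[Q_l - Q_{l-1}]
    return total
```

Because the corrections have small variance, few high-fidelity samples are needed, which is the source of the cost reduction.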
NASA Astrophysics Data System (ADS)
Khan, F. A.; Yousaf, A.; Reindl, L. M.
2018-04-01
This paper presents a multi-segment capacitive level monitoring sensor based on the distributed E-fields approach Glocal. This approach has the advantage of analyzing the build-up problem through the local E-fields as well as monitoring the fluid level through the global E-fields. The multi-segment capacitive approach presented in this work addresses the main problem of unwanted parasitic capacitance generated by the Copper (Cu) strips by applying an active shielding concept. Polyvinyl chloride (PVC) is used for isolation, and parafilm is used for creating artificial build-up on a CLS.
Kaufman, Michelle R; Cornish, Flora; Zimmerman, Rick S; Johnson, Blair T
2014-08-15
Despite increasing recent emphasis on the social and structural determinants of HIV-related behavior, empirical research and interventions lag behind, partly because of the complexity of social-structural approaches. This article provides a comprehensive and practical review of the diverse literature on multi-level approaches to HIV-related behavior change in the interest of contributing to the ongoing shift to more holistic theory, research, and practice. It has the following specific aims: (1) to provide a comprehensive list of relevant variables/factors related to behavior change at all points on the individual-structural spectrum, (2) to map out and compare the characteristics of important recent multi-level models, (3) to reflect on the challenges of operating with such complex theoretical tools, and (4) to identify next steps and make actionable recommendations. Using a multi-level approach implies incorporating increasing numbers of variables and increasingly context-specific mechanisms, overall producing greater intricacies. We conclude with recommendations on how best to respond to this complexity, which include: using formative research and interdisciplinary collaboration to select the most appropriate levels and variables in a given context; measuring social and institutional variables at the appropriate level to ensure meaningful assessments of multiple levels are made; and conceptualizing intervention and research with reference to theoretical models and mechanisms to facilitate transferability, sustainability, and scalability.
A Multi-Level Decision Fusion Strategy for Condition Based Maintenance of Composite Structures
Sharif Khodaei, Zahra; Aliabadi, M.H.
2016-01-01
In this work, a multi-level decision fusion strategy is proposed which weighs the Value of Information (VoI) against the intended functions of a Structural Health Monitoring (SHM) system. This paper presents a multi-level approach for three different maintenance strategies in which the performance of the SHM systems is evaluated against its intended functions. Level 1 diagnosis results in damage existence with minimum sensors covering a large area by finding the maximum energy difference for the guided waves propagating in pristine structure and the post-impact state; Level 2 diagnosis provides damage detection and approximate localization using an approach based on Electro-Mechanical Impedance (EMI) measures, while Level 3 characterizes damage (exact location and size) in addition to its detection by utilising a Weighted Energy Arrival Method (WEAM). The proposed multi-level strategy is verified and validated experimentally by detection of Barely Visible Impact Damage (BVID) on a curved composite fuselage panel. PMID:28773910
Mathematical model for comparing multi-level economic systems
NASA Astrophysics Data System (ADS)
Brykalov, S. M.; Kryanev, A. V.
2017-12-01
A mathematical model (scheme) for the multi-level comparison of economic systems, each characterized by a system of indices, is worked out. In this mathematical model, expert assessments and forecasts of the indicators of the economic systems under consideration can be used. The model can take into account uncertainty in the estimated parameter values and in the expert estimates. The model uses a multi-criteria approach based on Pareto solutions.
Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl
2015-11-01
Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
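A minimal sketch of the Kalman filter predict/update cycle that underlies the probabilistic tracking step described above. The matrices and the constant-velocity example are generic textbook choices, not the paper's configuration.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P: prior state estimate and covariance
    z:    new measurement; F, H: motion and measurement models
    Q, R: process and measurement noise covariances
    """
    # predict the state forward under the motion model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # update with the measurement
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

For a 1D particle moving at constant velocity, feeding positions 1, 2, 3, ... drives the velocity estimate toward 1.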
Bittig, Arne T; Uhrmacher, Adelinde M
2017-01-01
Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.
A Multi-Level Approach to Outreach for Geologic Sequestration Projects
Greenberg, S.E.; Leetaru, H.E.; Krapac, I.G.; Hnottavange-Telleen, K.; Finley, R.J.
2009-01-01
Public perception of carbon capture and sequestration (CCS) projects represents a potential barrier to commercialization. Outreach to stakeholders at the local, regional, and national level is needed to create familiarity with and potential acceptance of CCS projects. This paper highlights the Midwest Geological Sequestration Consortium (MGSC) multi-level outreach approach which interacts with multiple stakeholders. The MGSC approach focuses on external and internal communication. External communication has resulted in building regional public understanding of CCS. Internal communication, through a project Risk Assessment process, has resulted in enhanced team communication and preparation of team members for outreach roles. © 2009 Elsevier Ltd. All rights reserved.
Multi-Level Adaptation in End-User Development of 3D Virtual Chemistry Experiments
ERIC Educational Resources Information Center
Liu, Chang; Zhong, Ying
2014-01-01
Multi-level adaptation in end-user development (EUD) is an effective way to enable non-technical end users such as educators to gradually introduce more functionality with increasing complexity to 3D virtual learning environments developed by themselves using EUD approaches. Parameterization, integration, and extension are three levels of…
Biodiversity conservation in Swedish forests: ways forward for a 30-year-old multi-scaled approach.
Gustafsson, Lena; Perhans, Karin
2010-12-01
A multi-scaled model for biodiversity conservation in forests was introduced in Sweden 30 years ago, which makes it a pioneer example of an integrated ecosystem approach. Trees are set aside for biodiversity purposes at multiple scale levels varying from individual trees to areas of thousands of hectares, with landowner responsibility at the lowest level and with increasing state involvement at higher levels. Ecological theory supports the multi-scaled approach, and retention efforts at every harvest occasion stimulate landowners' interest in conservation. We argue that the model has large advantages but that in a future with intensified forestry and global warming, development based on more progressive thinking is necessary to maintain and increase biodiversity. Suggestions for the future include joint planning for several forest owners, consideration of cost-effectiveness, accepting opportunistic work models, adjusting retention levels to stand and landscape composition, introduction of temporary reserves, creation of "receiver habitats" for species escaping climate change, and protection of young forests.
NASA Astrophysics Data System (ADS)
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has shifted from the traditional single plant to multi-site supply chains in which multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model is developed that aims to simultaneously minimize the total cost, maximize product quality, and maximize the customer demand satisfaction level. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
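A front of Pareto-optimal solutions, as produced above, is the set of solutions not dominated by any other. A minimal sketch of extracting it by pairwise dominance follows, with all objectives expressed as minimization (a maximized objective such as quality can be negated); names are illustrative.

```python
def pareto_front(solutions):
    """Return the non-dominated solutions.

    Each solution is a tuple of objective values, all to be minimized.
    """
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

The decision maker (e.g. via AHP, as in the article) then picks one solution from this front.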
Sauer, J; Darioly, A; Mast, M Schmid; Schmid, P C; Bischof, N
2010-11-01
The article proposes a multi-level approach for evaluating communication skills training (CST) as an important element of crew resource management (CRM) training. Within this methodological framework, the present work examined the effectiveness of CST in matching or mismatching team compositions with regard to hierarchical status and competence. There is little experimental research that has evaluated the effectiveness of CRM training at multiple levels (i.e. reaction, learning, behaviour) and in teams composed of members of differing status and competence. An experiment with a two (CST: with vs. without) by two (competence/hierarchical status: congruent vs. incongruent) design was carried out. A total of 64 participants were trained for 2.5 h on a simulated process control environment, with the experimental group being given 45 min of training on receptiveness and influencing skills. Prior to the 1-h experimental session, participants were assigned to two-person teams. The results showed overall support for the use of such a multi-level approach to training evaluation. Stronger positive effects of CST were found for subjective measures than for objective performance measures. STATEMENT OF RELEVANCE: This work provides some guidance for the use of a multi-level evaluation of CRM training. It also emphasises the need to collect objective performance data for training evaluation in addition to subjective measures, with a view to gaining a more accurate picture of the benefits of such training approaches.
Multi-Tier Mental Health Program for Refugee Youth
ERIC Educational Resources Information Center
Ellis, B. Heidi; Miller, Alisa B.; Abdi, Saida; Barrett, Colleen; Blood, Emily A.; Betancourt, Theresa S.
2013-01-01
Objective: We sought to establish that refugee youths who receive a multi-tiered approach to services, Project SHIFA, would show high levels of engagement in treatment appropriate to their level of mental health distress, improvements in mental health symptoms, and a decrease in resource hardships. Method: Study participants were 30 Somali and…
May, Christian P; Kolokotroni, Eleni; Stamatakos, Georgios S; Büchler, Philippe
2011-10-01
Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning. Copyright © 2011 Elsevier Ltd. All rights reserved.
Manda, Prashanti; McCarthy, Fiona; Bridges, Susan M
2013-10-01
The Gene Ontology (GO), a set of three sub-ontologies, is one of the most popular bio-ontologies used for describing gene product characteristics. GO annotation data, containing terms from multiple sub-ontologies at different levels of the ontologies, is an important source of implicit relationships between terms from the three sub-ontologies. Data mining techniques, such as association rule mining, tailored to mine from multiple ontologies at multiple levels of abstraction are required for effective knowledge discovery from GO annotation data. We present a data mining approach, Multi-ontology data mining at All Levels (MOAL), that uses the structure and relationships of the GO to mine multi-ontology multi-level association rules. We introduce two interestingness measures, Multi-ontology Support (MOSupport) and Multi-ontology Confidence (MOConfidence), customized to evaluate multi-ontology multi-level association rules. We also describe a variety of post-processing strategies for pruning uninteresting rules. We use publicly available GO annotation data to demonstrate our methods with respect to two applications: (1) the discovery of co-annotation suggestions and (2) the discovery of new cross-ontology relationships. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
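MOSupport and MOConfidence are customized measures defined in the paper; as a baseline, the standard support and confidence they extend can be sketched over annotation "transactions" (sets of GO terms per gene product). This is a generic association-rule illustration, not the paper's definitions.

```python
def support(rule_items, transactions):
    """Fraction of transactions (term sets) containing all items of the rule."""
    hits = sum(rule_items <= t for t in transactions)  # subset test
    return hits / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Estimated P(consequent | antecedent) over the transactions."""
    return (support(antecedent | consequent, transactions)
            / support(antecedent, transactions))
```

A multi-ontology rule would draw antecedent and consequent terms from different GO sub-ontologies; the counting itself is unchanged.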
Center-Within-Trial Versus Trial-Level Evaluation of Surrogate Endpoints.
Renfro, Lindsay A; Shi, Qian; Xue, Yuan; Li, Junlong; Shang, Hongwei; Sargent, Daniel J
2014-10-01
Evaluation of candidate surrogate endpoints using individual patient data from multiple clinical trials is considered the gold standard approach to validate surrogates at both patient and trial levels. However, this approach assumes the availability of patient-level data from a relatively large collection of similar trials, which may not be possible to achieve for a given disease application. One common solution to the problem of too few similar trials involves performing trial-level surrogacy analyses on trial sub-units (e.g., centers within trials), thereby artificially increasing the trial-level sample size for feasibility of the multi-trial analysis. To date, the practical impact of treating trial sub-units (centers) identically to trials in multi-trial surrogacy analyses remains unexplored, and conditions under which this ad hoc solution may in fact be reasonable have not been identified. We perform a simulation study to identify such conditions, and demonstrate practical implications using a multi-trial dataset of patients with early stage colon cancer.
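Trial-level surrogacy is often summarized by the squared correlation between per-unit treatment effects on the surrogate and on the true endpoint, whether the units are whole trials or centers within trials. A minimal sketch of that metric follows (a generic illustration, not the authors' exact model).

```python
import numpy as np

def trial_level_r2(effects_surrogate, effects_true):
    """Trial-level surrogacy: squared correlation between the per-unit
    treatment effects on the surrogate and on the true endpoint."""
    r = np.corrcoef(effects_surrogate, effects_true)[0, 1]
    return r ** 2
```

Treating centers as trials inflates the number of units; the simulation study in the abstract asks when the resulting R² remains a reasonable estimate.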
Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework
NASA Astrophysics Data System (ADS)
Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.
2016-03-01
A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic label assessment, able to address uncertainty and deal with different levels of precision. The method is based on qualitative reasoning, an artificial intelligence technique, for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. It is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors with a high level of complexity and uncertainty. The method is compared with an existing approach that has previously been applied to the wind farm location problem. This approach, an outranking method, is based on Condorcet's original method. The results obtained by both approaches are analysed and their performance in the selection of the wind farm location is compared across aggregation procedures. Although the results show that both methods lead to similar rankings of alternatives, the study highlights both their advantages and drawbacks.
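The outranking comparison based on Condorcet's method rests on pairwise majority counts: alternative a beats b if a majority of rankings place a before b. A minimal sketch follows; the scoring convention (counting pairwise wins) is an assumption for illustration.

```python
def condorcet_scores(rankings):
    """Count pairwise majority wins for each alternative.

    rankings: list of rankings, each a list of the same alternatives
    ordered from most to least preferred.
    """
    alts = rankings[0]
    wins = {a: 0 for a in alts}
    n = len(rankings)
    for a in alts:
        for b in alts:
            if a == b:
                continue
            # how many rankings prefer a to b
            prefer_a = sum(r.index(a) < r.index(b) for r in rankings)
            if prefer_a > n / 2:
                wins[a] += 1
    return wins
```

An alternative that beats every other one (a Condorcet winner) tops the resulting ranking.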
Vickers, T. Winston; Ernest, Holly B.; Boyce, Walter M.
2017-01-01
The importance of examining multiple hierarchical levels when modeling resource use for wildlife has been acknowledged for decades. Multi-level resource selection functions have recently been promoted as a method to synthesize resource use across nested organizational levels into a single predictive surface. Analyzing multiple scales of selection within each hierarchical level further strengthens multi-level resource selection functions. We extend this multi-level, multi-scale framework to modeling resistance for wildlife by combining multi-scale resistance surfaces from two data types, genetic and movement. Resistance estimation has typically been conducted with one of these data types, or compared between the two. However, we contend it is not an either/or issue and that resistance may be better-modeled using a combination of resistance surfaces that represent processes at different hierarchical levels. Resistance surfaces estimated from genetic data characterize temporally broad-scale dispersal and successful breeding over generations, whereas resistance surfaces estimated from movement data represent fine-scale travel and contextualized movement decisions. We used telemetry and genetic data from a long-term study on pumas (Puma concolor) in a highly developed landscape in southern California to develop a multi-level, multi-scale resource selection function and a multi-level, multi-scale resistance surface. We used these multi-level, multi-scale surfaces to identify resource use patches and resistant kernel corridors. Across levels, we found puma avoided urban, agricultural areas, and roads and preferred riparian areas and more rugged terrain. For other landscape features, selection differed among levels, as did the scales of selection for each feature. With these results, we developed a conservation plan for one of the most isolated puma populations in the U.S. 
Our approach captured a wide spectrum of ecological relationships for a population, resulted in effective conservation planning, and can be readily applied to other wildlife species. PMID:28609466
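A resource selection function of the kind described is commonly fitted as a used-versus-available logistic regression over landscape covariates. A minimal single-scale sketch follows (the paper's multi-level, multi-scale machinery and resistance estimation are not reproduced; names and settings are assumptions).

```python
import numpy as np

def fit_rsf(X, used, lr=0.1, steps=2000):
    """Fit a resource selection function via logistic regression.

    X:    rows of landscape covariates at used and available points
    used: 1.0 for used locations, 0.0 for available ones
    Positive coefficients indicate selection for a covariate.
    """
    X1 = np.hstack([np.ones((len(X), 1)), X])  # add intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))      # predicted use probability
        w += lr * X1.T @ (used - p) / len(X)   # gradient ascent step
    return w
```

A negative coefficient on, say, an urban-cover covariate would reflect the avoidance of urban areas reported above.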
Zeller, Katherine A; Vickers, T Winston; Ernest, Holly B; Boyce, Walter M
2017-01-01
The importance of examining multiple hierarchical levels when modeling resource use for wildlife has been acknowledged for decades. Multi-level resource selection functions have recently been promoted as a method to synthesize resource use across nested organizational levels into a single predictive surface. Analyzing multiple scales of selection within each hierarchical level further strengthens multi-level resource selection functions. We extend this multi-level, multi-scale framework to modeling resistance for wildlife by combining multi-scale resistance surfaces from two data types, genetic and movement. Resistance estimation has typically been conducted with one of these data types, or compared between the two. However, we contend it is not an either/or issue and that resistance may be better-modeled using a combination of resistance surfaces that represent processes at different hierarchical levels. Resistance surfaces estimated from genetic data characterize temporally broad-scale dispersal and successful breeding over generations, whereas resistance surfaces estimated from movement data represent fine-scale travel and contextualized movement decisions. We used telemetry and genetic data from a long-term study on pumas (Puma concolor) in a highly developed landscape in southern California to develop a multi-level, multi-scale resource selection function and a multi-level, multi-scale resistance surface. We used these multi-level, multi-scale surfaces to identify resource use patches and resistant kernel corridors. Across levels, we found puma avoided urban, agricultural areas, and roads and preferred riparian areas and more rugged terrain. For other landscape features, selection differed among levels, as did the scales of selection for each feature. With these results, we developed a conservation plan for one of the most isolated puma populations in the U.S. 
Our approach captured a wide spectrum of ecological relationships for a population, resulted in effective conservation planning, and can be readily applied to other wildlife species.
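The abstract describes combining genetic- and movement-based resistance surfaces but does not give the combination rule. As a minimal hypothetical sketch, assuming both rasters are already scaled to a common resistance range, a weighted geometric mean is one common way such layers are blended:

```python
import numpy as np

def combine_resistance(genetic, movement, w_genetic=0.5):
    """Blend two resistance surfaces (same shape, values >= 1)
    with a weighted geometric mean, preserving the 1..max range."""
    genetic = np.asarray(genetic, dtype=float)
    movement = np.asarray(movement, dtype=float)
    return genetic ** w_genetic * movement ** (1.0 - w_genetic)

# Toy 3x3 surfaces: the genetic layer flags the centre cell as highly
# resistant; the movement layer flags the corners.
gen = np.array([[1, 1, 1], [1, 10, 1], [1, 1, 1]], dtype=float)
mov = np.array([[10, 1, 10], [1, 1, 1], [10, 1, 10]], dtype=float)
combined = combine_resistance(gen, mov, w_genetic=0.5)
```

With equal weights the combined surface retains both the genetic and the movement hotspots; shifting `w_genetic` emphasizes one hierarchical level over the other.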
Multi-interface Level Sensors and New Development in Monitoring and Control of Oil Separators
Bukhari, Syed Faisal Ahmed; Yang, Wuqiang
2006-01-01
In the oil industry, huge savings may be made if suitable multi-interface level measurement systems are employed for effective monitoring of crude oil separators and efficient control of their operation. A number of techniques, e.g. externally mounted displacers, differential pressure transmitters and capacitance rod devices, have been developed to measure the separation process with gas, oil, water and other components. Because of the unavailability of suitable multi-interface level measurement systems, oil separators are currently operated by a trial-and-error approach. In this paper, some conventional techniques that have been used for level measurement in industry are discussed, together with new developments.
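The abstract names the sensing hardware but no detection algorithm, so the following is a hypothetical sketch: given a vertical profile of sensor readings through a separator, the gas/oil and oil/water interfaces can be located at the steepest changes in the signal. The depths and profile values are invented for illustration:

```python
import numpy as np

def find_interfaces(depths, readings, n_interfaces=2):
    """Locate the n steepest-gradient depths in a vertical sensor
    profile; a crude proxy for gas/oil and oil/water boundaries."""
    grad = np.abs(np.diff(readings))
    idx = np.argsort(grad)[-n_interfaces:]          # largest steps
    # each interface is taken midway between adjacent sample depths
    return sorted((depths[i] + depths[i + 1]) / 2 for i in idx)

depths = np.arange(0.0, 2.0, 0.1)                   # metres from top
# synthetic profile: gas ~ 1, oil ~ 3, water ~ 9 (arbitrary units)
readings = np.where(depths < 0.5, 1.0, np.where(depths < 1.2, 3.0, 9.0))
interfaces = find_interfaces(depths, readings)
```

On this synthetic profile the two largest steps sit at roughly 0.45 m (gas/oil) and 1.15 m (oil/water); a real capacitance or pressure profile would of course be noisier and need smoothing first.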
Recommendations for level-determined sampling in wells
NASA Astrophysics Data System (ADS)
Lerner, David N.; Teutsch, Georg
1995-10-01
Level-determined samples of groundwater are increasingly important for hydrogeological studies. The techniques for collecting them range from the use of purpose drilled wells, sometimes with sophisticated dedicated multi-level samplers in them, to a variety of methods used in open wells. Open, often existing, wells are frequently used on cost grounds, but there are risks of obtaining poor and unrepresentative samples. Alternative approaches to level-determined sampling incorporate seven concepts: depth sampling; packer systems; individual wells; dedicated multi-level systems; separation pumping; baffle systems; multi-port sock samplers. These are outlined and evaluated in terms of the environment to be sampled, and the features and performance of the methods. Recommendations are offered to match methods to sampling problems.
Generating multi-double-scroll attractors via nonautonomous approach.
Hong, Qinghui; Xie, Qingguo; Shen, Yi; Wang, Xiaoping
2016-08-01
It is a common phenomenon that multi-scroll attractors are realized by introducing various nonlinear functions with multiple breakpoints in double scroll chaotic systems. In contrast, we present a nonautonomous approach for generating multi-double-scroll attractors (MDSA) without changing the original nonlinear functions. By using the multi-level-logic pulse excitation technique in double scroll chaotic systems, MDSA can be generated. A Chua's circuit, a Jerk circuit, and a modified Lorenz system are given as design examples and the Matlab simulation results are presented. Furthermore, the corresponding realization circuits are designed. The Pspice results are in agreement with numerical simulation results, which verify the availability and feasibility of this method.
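The key idea, driving a double-scroll system with a multi-level-logic pulse rather than altering its nonlinearity, can be sketched on a Chua's circuit. The parameter values, pulse levels, and period below are illustrative, not the tuned values from the paper:

```python
import numpy as np

def f(x, m0=-8/7, m1=-5/7):
    """Chua's piecewise-linear nonlinearity (classic double-scroll
    slopes); it is left unchanged, as in the nonautonomous approach."""
    return m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))

def multi_level_pulse(t, levels=(-0.3, 0.0, 0.3), period=10.0):
    """Square pulse cycling through several DC levels (multi-level
    logic); the level values and period are illustrative only."""
    return levels[int(t // period) % len(levels)]

def simulate(n_steps=20000, dt=1e-3, alpha=9.0, beta=100/7):
    """Simple Euler integration of Chua's circuit with the pulse
    injected into the x-equation as the nonautonomous drive."""
    traj = np.empty((n_steps, 3))
    x, y, z = 0.1, 0.0, 0.0
    for i in range(n_steps):
        u = multi_level_pulse(i * dt)
        x += dt * alpha * (y - x - f(x) + u)
        y += dt * (x - y + z)
        z += dt * (-beta * y)
        traj[i] = (x, y, z)
    return traj

traj = simulate()
```

Whether additional scrolls actually appear depends on tuning the pulse levels against the breakpoints of f(x); the sketch only shows the nonautonomous setup, with the pulse shifting the operating region of the unchanged nonlinearity.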
Kukafka, Rita; Johnson, Stephen B; Linfante, Allison; Allegrante, John P
2003-06-01
Many interventions to improve the success of information technology (IT) implementations are grounded in behavioral science, using theories and models to identify the conditions and determinants of successful use. However, each model in the IT literature has evolved to address specific theoretical problems of particular disciplinary concern, and each model has been tested and has evolved using, in most cases, a more or less restricted set of IT implementation procedures. Functionally, this limits the perspective for taking into account the multiple factors at the individual, group, and organizational levels that influence use behavior. While a rich body of literature has emerged, employing prominent models such as the Technology Adoption Model, Social-Cognitive Theory, and Diffusion of Innovation Theory, the complexity of defining a suitable multi-level intervention has largely been overlooked. A gap exists between the implementation of IT and the integration of theories and models that can be utilized to develop multi-level approaches to identify factors that impede usage behavior. We present a novel framework that is intended to guide the synthesis of more than one theoretical perspective for the purpose of planning multi-level interventions to enhance IT use. This integrative framework is adapted from PRECEDE/PROCEED, a conceptual framework used by health planners in hundreds of published studies to direct interventions that account for the multiple determinants of behavior. Since we claim that the literature on IT use behavior does not currently include a multi-level approach, we undertook a systematic literature analysis to confirm this assertion. Our framework facilitated organizing this literature synthesis, and our analysis was aimed at determining whether the IT implementation approaches in the published literature were characterized by an approach that considered at least two levels of IT usage determinants.
We found that while 61% of studies mentioned or referred to theory, none considered two or more levels. In other words, although the researchers employ behavioral theory, they omit two fundamental propositions: (1) IT usage is influenced by multiple factors and (2) interventions must be multi-dimensional. Our literature synthesis may provide additional insight into the reason for high failure rates associated with underutilized systems, and underscores the need to move beyond the current dominant approach that employs a single model to guide IT implementation plans that aim to address factors associated with IT acceptance and subsequent positive use behavior.
Overweight and obesity in India: policy issues from an exploratory multi-level analysis.
Siddiqui, Md Zakaria; Donato, Ronald
2016-06-01
This article analyses a nationally representative household dataset, the National Family Health Survey (NFHS-3) conducted in 2005-2006, to examine factors influencing the prevalence of overweight/obesity in India. The dataset was disaggregated into four sub-population groups (urban and rural females and males), and multi-level logit regression models were used to estimate the impact of particular covariates on the likelihood of overweight/obesity. The multi-level modelling approach aimed to identify individual and macro-level contextual factors influencing this health outcome. In contrast to most studies on low-income developing countries, the findings reveal that education for females beyond a particular level of educational attainment exhibits a negative relationship with the likelihood of overweight/obesity. This relationship was not observed for males. Muslim females and all Sikh sub-populations have a higher likelihood of overweight/obesity, suggesting the importance of socio-cultural influences. The results also show that the relationship between wealth and the probability of overweight/obesity is stronger for males than females, highlighting the differential impact of increasing socio-economic status by gender. Multi-level analysis reveals that states exerted an independent influence on the likelihood of overweight/obesity beyond individual-level covariates, reflecting the importance of spatially related contextual factors. While this study does not disentangle macro-level 'obesogenic' environmental factors from socio-cultural network influences, the results highlight the need to refrain from adopting a 'one size fits all' policy approach in addressing the overweight/obesity epidemic facing India. Instead, policy implementation requires a more nuanced and targeted approach that incorporates the growing recognition of socio-cultural and spatial contextual factors impacting on healthy behaviours. © The Author 2015.
Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
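The two-level structure used here, individuals (level 1) nested in states (level 2) with state-level contextual effects on the logit scale, can be illustrated with a small simulation. All coefficients and variance values below are invented, not estimates from the NFHS-3 data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-level structure: individuals (level 1) nested in states (level 2)
n_states, n_per_state = 20, 500
state_effect = rng.normal(0.0, 0.8, n_states)       # level-2 intercepts
state = np.repeat(np.arange(n_states), n_per_state)
wealth = rng.normal(0.0, 1.0, n_states * n_per_state)

# Assumed model: logit(p) = -1 + 0.6 * wealth + u_state
logit = -1.0 + 0.6 * wealth + state_effect[state]
p = 1.0 / (1.0 + np.exp(-logit))
overweight = rng.random(p.size) < p

# Between-state spread in raw prevalence reflects the level-2 term,
# which a single-level model would misattribute to individuals
prevalence = np.array([overweight[state == s].mean()
                       for s in range(n_states)])
```

The spread of `prevalence` across states is what a multi-level logit captures with a state-level random intercept; ignoring it would bias the standard errors of the individual-level covariates.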
A Social-Ecological Framework of Theory, Assessment, and Prevention of Suicide
Cramer, Robert J.; Kapusta, Nestor D.
2017-01-01
The juxtaposition of increasing suicide rates with continued calls for suicide prevention efforts begs for new approaches. Grounded in the Centers for Disease Control and Prevention (CDC) framework for tackling health issues, this personal view integrates the relevant suicide risk/protective factor, assessment, and intervention/prevention literatures. Based on these components of suicide risk, we articulate a Social-Ecological Suicide Prevention Model (SESPM) which provides an integration of general and population-specific risk and protective factors. We also use this multi-level perspective to provide a structured approach to understanding current theories and intervention/prevention efforts concerning suicide. Following similar multi-level prevention efforts in the interpersonal violence and Human Immunodeficiency Virus (HIV) domains, we offer recommendations for social-ecologically informed suicide prevention theory, training, research, assessment, and intervention programming. Although the SESPM calls for further empirical testing, it provides a suitable backdrop for the tailoring of current prevention and intervention programs to population-specific needs. Moreover, the multi-level model shows promise to move suicide risk assessment forward (e.g., the development of multi-level suicide risk algorithms or structured professional judgment instruments) and to overcome current limitations in the field. Finally, we articulate a set of characteristics of social-ecologically based suicide prevention programs. These include the need to address the risk and protective factors with the strongest degree of empirical support at each multi-level layer, incorporate a comprehensive program evaluation strategy, and use a variety of prevention techniques across levels of prevention. PMID:29062296
Granovsky, Alexander A
2011-06-07
The distinctive desirable features, both mathematically and physically meaningful, of all partially contracted multi-state multi-reference perturbation theories (MS-MR-PT) are explicitly formulated. The original approach to MS-MR-PT theory, called extended multi-configuration quasi-degenerate perturbation theory (XMCQDPT), having most, if not all, of the desirable properties, is introduced. The new method is applied at the second order of perturbation theory (XMCQDPT2) to the 1¹A′-2¹A′ conical intersection in the allene molecule, the avoided crossing in the LiF molecule, and the 1¹A₁ to 2¹A₁ electronic transition in cis-1,3-butadiene. The new theory has several advantages compared to those of well-established approaches, such as second-order multi-configuration quasi-degenerate perturbation theory and multi-state second-order complete active space perturbation theory. The analysis of the prevalent approaches to MS-MR-PT theory performed within the framework of the XMCQDPT theory unveils the origin of their common inherent problems. We describe the efficient implementation strategy that makes XMCQDPT2 an especially useful general-purpose tool in the high-level modeling of small to large molecular systems. © 2011 American Institute of Physics.
Kia, Seyed Mostafa; Pedregosa, Fabian; Blumenthal, Anna; Passerini, Andrea
2017-06-15
The use of machine learning models to discriminate between patterns of neural activity has become in recent years a standard analysis approach in neuroimaging studies. Whenever these models are linear, the estimated parameters can be visualized in the form of brain maps which can aid in understanding how brain activity in space and time underlies a cognitive function. However, the recovered brain maps often suffer from lack of interpretability, especially in group analysis of multi-subject data. To facilitate the application of brain decoding in group-level analysis, we present an application of multi-task joint feature learning for group-level multivariate pattern recovery in single-trial magnetoencephalography (MEG) decoding. The proposed method allows for recovering sparse yet consistent patterns across different subjects, and therefore enhances the interpretability of the decoding model. Our experimental results demonstrate that the multi-task joint feature learning framework is capable of recovering more meaningful patterns of varying spatio-temporally distributed brain activity across individuals while still maintaining excellent generalization performance. We compare the performance of multi-task joint feature learning in terms of generalization, reproducibility, and quality of pattern recovery against traditional single-subject and pooling approaches on both simulated and real MEG datasets. These results can facilitate the use of brain decoding for the characterization of fine-level distinctive patterns in group-level inference. Considering the importance of group-level analysis, the proposed approach can provide a methodological shift towards more interpretable brain decoding models. Copyright © 2017 Elsevier B.V. All rights reserved.
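Multi-task joint feature learning of this kind typically uses an L2,1 (row-sparsity) penalty so that the same features are selected jointly across subjects. Below is a minimal sketch with proximal gradient descent on a least-squares loss; the penalty form and solver are standard choices for this technique, not necessarily the exact formulation in the paper:

```python
import numpy as np

def l21_prox(W, tau):
    """Row-wise group soft-thresholding: shrinks whole features
    toward zero jointly across tasks (subjects)."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return W * scale

def multi_task_joint_fit(Xs, ys, lam=0.5, lr=None, iters=500):
    """Least-squares multi-task learning with an L2,1 penalty,
    solved by proximal gradient descent; one column of W per task."""
    d, T = Xs[0].shape[1], len(Xs)
    if lr is None:  # 1 / Lipschitz constant of the smooth part
        lr = 1.0 / max(np.linalg.norm(X, 2) ** 2 for X in Xs)
    W = np.zeros((d, T))
    for _ in range(iters):
        G = np.column_stack([X.T @ (X @ W[:, t] - y)
                             for t, (X, y) in enumerate(zip(Xs, ys))])
        W = l21_prox(W - lr * G, lr * lam)
    return W

# Toy "subjects": a shared sparse pattern on features 0 and 1 only
rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.5, 0, 0, 0, 0])
Xs = [rng.normal(size=(80, 6)) for _ in range(3)]
ys = [X @ w_true + 0.1 * rng.normal(size=80) for X in Xs]
W = multi_task_joint_fit(Xs, ys, lam=2.0)
```

Rows of `W` for features absent from the shared pattern are driven jointly toward zero across all three "subjects", while the two informative features are recovered for every task, which is the sparse-yet-consistent behaviour the abstract describes.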
NASA Astrophysics Data System (ADS)
Zhang, Ying; Feng, Yuanming; Wang, Wei; Yang, Chengwen; Wang, Ping
2017-03-01
A novel and versatile “bottom-up” approach is developed to estimate the radiobiological effect of clinical radiotherapy. The model consists of multi-scale Monte Carlo simulations from the organ level down to the cell level. At the cellular level, accumulated damages are computed using a spectrum-based accumulation algorithm and a predefined cellular damage database. The damage repair mechanism is modeled by an expanded reaction-rate two-lesion kinetic model, which was calibrated by replicating a radiobiological experiment. Multi-scale modeling is then performed on a lung cancer patient under conventional fractionated irradiation. The cell killing effects of two representative voxels (the isocenter and a peripheral voxel of the tumor) are computed and compared. At the microscopic level, the nucleus dose and damage yields vary among the nuclei within each voxel. A slightly larger percentage of cDSB yield is observed for the peripheral voxel (55.0%) compared to the isocenter one (52.5%). For the isocenter voxel, the survival fraction increases monotonically as the oxygen level is reduced. Under an extreme anoxic condition (0.001%), the survival fraction is calculated to be 80% and the hypoxia reduction factor reaches a maximum value of 2.24. In conclusion, with biology-related variations, the proposed multi-scale approach is more versatile than existing approaches for evaluating personalized radiobiological effects in radiotherapy.
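For the survival-fraction and hypoxia numbers quoted above, a linear-quadratic (LQ) sketch shows how a hypoxia reduction factor acts as a dose modifier. This is a simplified stand-in for the paper's two-lesion kinetic model, and the alpha/beta values are illustrative:

```python
import numpy as np

def survival_fraction(dose, alpha=0.15, beta=0.05, dose_modifier=1.0):
    """Linear-quadratic survival; dose_modifier < 1 mimics reduced
    damage under hypoxia (illustrative alpha in Gy^-1, beta in Gy^-2)."""
    d = dose_modifier * dose
    return np.exp(-(alpha * d + beta * d ** 2))

dose = 2.0                                  # Gy per fraction
sf_oxic = survival_fraction(dose)
# treating the paper's maximum hypoxia reduction factor (2.24) as a
# dose-modifying factor, an assumption made for this sketch only
sf_hypoxic = survival_fraction(dose, dose_modifier=1.0 / 2.24)
```

As in the abstract, reducing the effective dose under hypoxia raises the survival fraction relative to the well-oxygenated case.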
Using Evaluation Research as a Means for Policy Analysis in a "New" Mission-Oriented Policy Context
ERIC Educational Resources Information Center
Amanatidou, Effie; Cunningham, Paul; Gök, Abdullah; Garefi, Ioanna
2014-01-01
Grand challenges stress the importance of multi-disciplinary research, a multi-actor approach in examining the current state of affairs and exploring possible solutions, multi-level governance and policy coordination across geographical boundaries and policy areas, and a policy environment for enabling change both in science and technology and in…
Device Independent Layout and Style Editing Using Multi-Level Style Sheets
NASA Astrophysics Data System (ADS)
Dees, Walter
This paper describes a layout and styling framework that is based on the multi-level style sheets approach. It shows some of the techniques that can be used to add layout and style information to a UI in a device-independent manner, and how to reuse the layout and style information to create user interfaces for different devices.
Automatic multi-organ segmentation using learning-based segmentation and level set optimization.
Kohlberger, Timo; Sofka, Michal; Zhang, Jingdan; Birkbeck, Neil; Wetzl, Jens; Kaftan, Jens; Declerck, Jérôme; Zhou, S Kevin
2011-01-01
We present a novel generic segmentation system for the fully automatic multi-organ segmentation of CT medical images. We combine the advantages of learning-based approaches on a point-cloud-based shape representation, such as speed, robustness, and point correspondences, with those of PDE-optimization-based level set approaches, such as high accuracy and the straightforward prevention of segment overlaps. In a benchmark on 10-100 annotated datasets for the liver, the lungs, and the kidneys, we show that the proposed system yields segmentation accuracies of 1.17-2.89 mm average surface error. The level set segmentation (which is initialized by the learning-based segmentations) contributes a 20%-40% increase in accuracy.
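The pipeline, a learning-based initialization refined by a level set, can be caricatured in 2D. The sketch below uses a Chan-Vese-style region term without the curvature and overlap-prevention machinery of the actual system; the image, the misplaced initialization, and all constants are synthetic:

```python
import numpy as np

# Synthetic "CT slice": a bright disk (the organ) on a dark background
n = 32
yy, xx = np.indices((n, n))
truth = (yy - 16) ** 2 + (xx - 16) ** 2 < 8 ** 2
rng = np.random.default_rng(0)
image = truth.astype(float) + 0.05 * rng.normal(size=(n, n))

# Stand-in for the learning-based result: a smaller, misplaced disk
init = (yy - 13) ** 2 + (xx - 13) ** 2 < 5 ** 2

# Crude level-set initialization (+/-1 instead of a signed distance)
phi = np.where(init, -1.0, 1.0)

# Region-based (Chan-Vese style) evolution without a curvature term:
# pixels closer to the inside mean are pushed to negative phi
for _ in range(60):
    c1 = image[phi < 0].mean()        # mean intensity inside
    c2 = image[phi >= 0].mean()       # mean intensity outside
    phi -= (image - c2) ** 2 - (image - c1) ** 2

seg = phi < 0

def dice(a, b):
    """Overlap score between two boolean masks."""
    return 2.0 * (a & b).sum() / (a.sum() + b.sum())
```

On this toy image the refined `seg` overlaps the true disk far better than the initialization did, mirroring the accuracy gain the authors report for the level set stage.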
3D Digital Surveying and Modelling of Cave Geometry: Application to Paleolithic Rock Art.
González-Aguilera, Diego; Muñoz-Nieto, Angel; Gómez-Lahoz, Javier; Herrero-Pascual, Jesus; Gutierrez-Alonso, Gabriel
2009-01-01
3D digital surveying and modelling of cave geometry represents a relevant approach for research, management and preservation of our cultural and geological legacy. In this paper, a multi-sensor approach based on a terrestrial laser scanner, a high-resolution digital camera and a total station is presented. Two emblematic caves of Paleolithic human occupation situated in northern Spain, "Las Caldas" and "Peña de Candamo", have been chosen to put this approach into practice. As a result, an integral and multi-scalable 3D model is generated which may allow other scientists, such as pre-historians and geologists, to work on two different levels, integrating different Paleolithic Art datasets: (1) a basic level based on the accurate and metric support provided by the laser scanner; and (2) an advanced level using range- and image-based modelling.
Risk Governance of Multiple Natural Hazards: Centralized versus Decentralized Approach in Europe
NASA Astrophysics Data System (ADS)
Komendantova, Nadejda; Scolobig, Anna; Vinchon, Charlotte
2014-05-01
The multi-risk approach is a relatively new field whose definition includes the need to consider multiple hazards and vulnerabilities in their interdependency (Selva, 2013), and recent multi-hazard disasters, such as the 2011 Tohoku earthquake, tsunami, and nuclear catastrophe, have shown the need for a multi-risk approach in hazard mitigation and management. Our knowledge about multi-risk assessment, including studies from different scientific disciplines and developed assessment tools, is constantly growing (White et al., 2001). However, the link between scientific knowledge, its implementation, and the results in terms of improved governance and decision-making has gained significantly less attention (IRGC, 2005; Kappes et al., 2012), even though interest in risk governance in general has increased significantly in recent years (Verweij and Thompson, 2006). Therefore, the key research question is how risk assessment is implemented and what the potential is for the implementation of a multi-risk approach in different governance systems across Europe. More precisely, how do the characteristics of risk governance, such as the degree of centralization versus decentralization, influence the implementation of a multi-risk approach? The methodology of this research includes comparative case study analysis of top-down and bottom-up interactions in governance in the city of Naples (Italy), where the institutional landscape is marked by significant autonomy of Italian regions in decision-making processes for assessing most natural risks, excluding volcanic risk, and in Guadeloupe, French West Indies, an overseas department of France, where decision making is more centralized, associated with well-established state governance within regions, delegated to the prefect and to decentralised services of central ministries.
The research design included documentary analysis and extensive empirical work involving policy makers, private sector actors and practitioners in risk and emergency management. This work was informed by 36 semi-structured interviews, three workshops with over seventy participants from eleven different countries, feedback from questionnaires and focus group discussions (Scolobig et al., 2013). The results show that both governance systems have their own strengths and weaknesses (Komendantova et al., 2013). Elements of the centralized multi-risk governance system could lead to improvements in interagency communication and the creation of an inter-agency environment, where the different departments at the national level can exchange information, identify the communities that are most exposed to multiple risks and set priorities, while providing consistent information about and responses to multi-risk to the relevant stakeholders at the local level. A decentralised multi-risk governance system by contrast can instead favour the creation of local multi-risk commissions to conduct discussions between experts in meteorological, geological and technological risks and practitioners, to elaborate risk and hazard maps, and to develop local capacities which would include educational and training activities. Both governance systems suffer from common deficiencies, the most important being the frequent lack of capacities at the local level, especially financial, but sometimes also technical and institutional ones, as the responsibilities for disaster risk management are often transferred from the national to local levels without sufficient resources for implementation of programs on risk management (UNISDR, 2013). The difficulty in balancing available resources between short-term and medium-term priorities often complicates the issue. 
Our recommendations are that the implementation of a multi-risk approach can be facilitated through knowledge exchange and dialogue between different disciplinary communities, such as the geological and meteorological ones, and between the natural and social sciences. The implementation of a multi-risk approach can be strengthened through the creation of multi-risk platforms and multi-risk commissions, which can liaise between risk management experts and local communities and unify numerous actions on natural hazard management. However, the multi-risk approach cannot be subsidiary to a single-risk approach; both have to be pursued. References: IRGC. (2011). Concept note: Improving the management of emerging risks: Risks from new technologies, system interactions, and unforeseen or changing circumstances. International Risk Governance Council (IRGC), Geneva. Kappes, M. S., Keiler, M., von Elverfeldt, K., & Glade, T. (2012). Challenges of analyzing multi-hazard risk: A review. Natural Hazards, 64(2), 1925-1958. doi: 10.1007/s11069-012-0294-2. Komendantova, N., Scolobig, A., & Vinchon, C. (2013). Multi-risk approach in centralized and decentralized risk governance systems: Case studies of Naples, Italy and Guadeloupe, France. International Relations and Diplomacy, 1(3), 224-239. Scolobig, A., Vichon, C., Komendantova, N., Bengoubou-Valerius, M., & Patt, A. (2013). Social and institutional barriers to effective multi-hazard and multi-risk decision-making governance. D6.3 MATRIX project. Selva, J. (2013). Long-term multi-risk assessment: statistical treatment of interaction among risks. Natural Hazards, 67(2), 701-722. UNISDR. (2013). Implementing the Hyogo framework for action in Europe: Regional synthesis report 2011-2013. Verweij, M., & Thompson, M. (Eds.). (2006). Clumsy solutions for a complex world: Governance, politics, and plural perceptions. New York: Palgrave Macmillan. White, G., Kates, R., & Burton, I. (2001). 
Knowing better and losing even more: the use of knowledge in hazards management. Environmental Hazards, 3, 81-92.
Ko, Linda K; Rillamas-Sun, Eileen; Bishop, Sonia; Cisneros, Oralia; Holte, Sarah; Thompson, Beti
2018-04-01
Hispanic children are disproportionately overweight and obese compared to their non-Hispanic white counterparts in the US. Community-wide, multi-level interventions have been successful in promoting healthier nutrition, increased physical activity (PA), and weight loss. Using a community-based participatory research (CBPR) approach that engages community members in rural Hispanic communities is a promising way to promote behavior change, and ultimately weight loss, among Hispanic children. Led by a community-academic partnership, the Together We STRIDE (Strategizing Together Relevant Interventions for Diet and Exercise) study aims to test the effectiveness of a community-wide, multi-level intervention to promote healthier diets, increased PA, and weight loss among Hispanic children. Together We STRIDE is a parallel quasi-experimental trial with a goal of recruiting 900 children aged 8-12 years nested within two communities (one intervention and one comparison). Children will be recruited from their respective elementary schools. Components of the 2-year multi-level intervention include comic books (individual level), multi-generational nutrition and PA classes (family level), teacher-led PA breaks and media literacy education (school level), and family nights, a farmer's market, and a community PA event (known as a ciclovia) at the community level. Children from the comparison community will receive two newsletters. Height and weight measures will be collected from children in both communities at three time points (baseline, 6 months, and 18 months). The Together We STRIDE study aims to promote a healthier diet and increased PA to produce healthy weight among Hispanic children. The use of a CBPR approach and the engagement of the community will springboard strategies for the intervention's sustainability. Clinical Trials Registration Number: NCT02982759 (retrospectively registered). Copyright © 2018 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-01-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective,…
Measuring sustainable development using a multi-criteria model: a case study.
Boggia, Antonio; Cortina, Carla
2010-11-01
This paper shows how Multi-criteria Decision Analysis (MCDA) can help in a complex process such as the assessment of the level of sustainability of a certain area. The paper presents the results of a study in which a model for measuring sustainability was implemented to better aid public policy decisions regarding sustainability. In order to assess sustainability in specific areas, a methodological approach based on multi-criteria analysis has been developed. The aim is to rank areas in order to understand the specific technical and/or financial support that they need to develop sustainable growth. The case study presented is an assessment of the level of sustainability in different areas of an Italian region using the MCDA approach. Our results show that MCDA is a suitable approach for sustainability assessment. The results are easy to understand and the evaluation path is clear and transparent, which is what decision makers need to support their decisions. The multi-criteria evaluation model was developed in accordance with the economic theory of sustainable development, so that the final results have a clear meaning in terms of sustainability. Copyright 2010 Elsevier Ltd. All rights reserved.
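The abstract does not name the specific MCDA method, so as an illustration, a weighted-sum model, one of the simplest MCDA aggregations, ranks three hypothetical areas on three sustainability indicators (all numbers invented):

```python
import numpy as np

# Rows: areas; columns: sustainability indicators.
# benefit=True means higher raw values are better for that indicator.
scores = np.array([[0.6, 200.0, 3.1],
                   [0.8, 150.0, 2.4],
                   [0.4, 300.0, 4.0]])
benefit = np.array([True, False, True])
weights = np.array([0.5, 0.2, 0.3])     # must sum to 1

# Min-max normalise each criterion to [0, 1], flipping cost criteria
lo, hi = scores.min(0), scores.max(0)
norm = (scores - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

overall = norm @ weights
ranking = np.argsort(-overall)          # best area first
```

Cost criteria (here, indicator 2) are flipped during normalisation so that a higher normalised score is always better; the ranking then follows directly from the weighted totals, keeping the evaluation path transparent as the abstract emphasises.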
Wu, Dan; Ma, Ting; Ceritoglu, Can; Li, Yue; Chotiyanonta, Jill; Hou, Zhipeng; Hsu, John; Xu, Xin; Brown, Timothy; Miller, Michael I; Mori, Susumu
2016-01-15
Technologies for multi-atlas brain segmentation of T1-weighted MRI images have rapidly progressed in recent years, with highly promising results. This approach, however, relies on a large number of atlases with accurate and consistent structural identifications. Here, we introduce our atlas inventories (n=90), which cover ages 4-82 years with unique hierarchical structural definitions (286 structures at the finest level). This multi-atlas library resource provides the flexibility to choose appropriate atlases for various studies with different age ranges and structure-definition criteria. In this paper, we describe the details of the atlas resources and demonstrate the improved accuracy achievable with a dynamic age-matching approach, in which the atlases that most closely match the subject's age are dynamically selected. The advanced atlas creation strategy, together with atlas pre-selection principles, is expected to support the further development of multi-atlas image segmentation. Copyright © 2015 Elsevier Inc. All rights reserved.
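The dynamic age-matching described above can be sketched in a few lines: pick the k atlases nearest the subject's age, then fuse their propagated labelmaps. Majority voting is used here as a baseline fusion rule, not necessarily the fusion step of the actual pipeline, and the ages and labelmaps are toy values:

```python
import numpy as np

def select_atlases(atlas_ages, subject_age, k=3):
    """Pick the k atlases closest in age to the subject."""
    order = np.argsort(np.abs(np.asarray(atlas_ages) - subject_age))
    return order[:k]

def majority_vote(labelmaps):
    """Per-voxel majority vote over the selected atlas labelmaps."""
    stack = np.stack(labelmaps)
    n_labels = int(stack.max()) + 1
    counts = np.stack([(stack == lab).sum(0) for lab in range(n_labels)])
    return counts.argmax(0)

atlas_ages = [5, 12, 30, 45, 70]                 # toy library ages
chosen = select_atlases(atlas_ages, subject_age=33, k=3)

# Three tiny 2x2 "labelmaps" from the chosen atlases, fused per voxel
fused = majority_vote([np.array([[0, 1], [1, 1]]),
                       np.array([[0, 1], [0, 1]]),
                       np.array([[1, 1], [1, 0]])])
```

For a 33-year-old subject the selection returns the 30-, 45-, and 12-year atlases, exactly the age-proximity behaviour the dynamic matching relies on.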
NASA Astrophysics Data System (ADS)
Roshanian, Jafar; Jodei, Jahangir; Mirshams, Mehran; Ebrahimi, Reza; Mirzaee, Masood
A new automated multi-level-of-fidelity Multi-Disciplinary Design Optimization (MDO) methodology has been developed at the MDO Laboratory of K.N. Toosi University of Technology. This paper explains a new design approach through the formulation of the developed disciplinary modules. A conceptual design for a small, solid-propellant launch vehicle was considered at two levels of fidelity. Low- and medium-level-of-fidelity (LoF) disciplinary codes were developed and linked. Appropriate design and analysis codes were defined according to their effect on the conceptual design process. Simultaneous optimization of the launch vehicle was performed at the discipline level and the system level. Propulsion, aerodynamics, structure and trajectory disciplinary codes were used. To reach the minimum launch weight, the low LoF code first searches the whole design space to achieve the mission requirements. Then the medium LoF code receives the output of the low LoF code and gives a value near the optimum launch weight with more detail and higher fidelity.
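The two-stage search, a cheap low-LoF sweep of the whole design space followed by medium-LoF refinement near its optimum, can be sketched on a one-dimensional toy objective. Both "fidelity" functions below are invented stand-ins, not the launch-vehicle disciplinary codes:

```python
import numpy as np

def high_fidelity(x):
    """Stand-in 'medium LoF' objective, e.g. launch mass vs a design
    variable; more structure, costlier to evaluate in a real code."""
    return (x - 1.3) ** 2 + 0.1 * np.sin(8 * x)

def low_fidelity(x):
    """Cheaper, slightly biased approximation for the global sweep."""
    return (x - 1.2) ** 2

# Stage 1: low-fidelity sweep of the whole design space
xs = np.linspace(-5.0, 5.0, 201)
x0 = xs[np.argmin(low_fidelity(xs))]

# Stage 2: medium-fidelity refinement near the low-fidelity optimum
fine = np.linspace(x0 - 0.5, x0 + 0.5, 401)
x_star = fine[np.argmin(high_fidelity(fine))]
```

The low-fidelity sweep is allowed to be biased; it only has to land the refinement window near the true optimum, where the costlier model takes over, which mirrors the low-to-medium LoF handoff the paper describes.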
Bazyk, Susan; Winne, Rebecca
2013-04-01
Obesity in children and youth is a major public health concern known to have a significant impact on physical and mental health. Although traditional approaches to obesity have emphasized diet and exercise at the individual level, broader attention to the mental health consequences of obesity is crucial. Individuals who are obese live in a world where they are often less accepted, resulting in social exclusion and discrimination. A public health multi-tiered approach to obesity focusing on mental health promotion, prevention, and individualized intervention is presented.
NASA Astrophysics Data System (ADS)
Kuzle, A.
2018-06-01
The important role that metacognition plays as a predictor of student mathematical learning and mathematical problem-solving has been extensively documented, but attention has only recently turned to the primary grades, where more research is needed. The goals of this paper are threefold: (1) to present a metacognitive framework for mathematics problem-solving, (2) to describe a multi-method interview approach developed to study student mathematical metacognition, and (3) to empirically evaluate the utility of the model and the adaptation of the approach in the context of grade 2 and grade 4 mathematics problem-solving. The results are discussed not only with regard to further development of the adapted multi-method interview approach, but also with regard to their theoretical and practical implications.
Kuhlmann, Ellen; Larsen, Christa
2015-12-01
Health workforce needs have moved up on the reform agendas, but policymaking often remains 'piece-meal work' and does not respond to the complexity of health workforce challenges. This article argues for innovation in healthcare governance as a key to greater sustainability of health human resources. The aim is to develop a multi-level approach that helps to identify gaps in governance and improve policy interventions. Pilot research into nursing and medicine in Germany, carried out between 2013 and 2015 using a qualitative methodology, serves to illustrate systems-based governance weaknesses. Three explorative cases address major responses to health workforce shortages, comprising migration/mobility of nurses, reform of nursing education, and gender-sensitive work management of hospital doctors. The findings illustrate a lack of connections between transnational/EU and organizational governance, between national and local levels, occupational and sector governance, and organizations/hospital management and professional development. Consequently, health workforce innovations need a multi-level governance approach to realize their transformative potential and to help close the existing gaps in governance.
Multiframe video coding for improved performance over wireless channels.
Budagavi, M; Gibson, J D
2001-01-01
We propose and evaluate a multi-frame extension to block motion compensation (BMC) coding of videoconferencing-type video signals for wireless channels. The multi-frame BMC (MF-BMC) coder makes use of the redundancy that exists across multiple frames in typical videoconferencing sequences to achieve additional compression over that obtained by using the single-frame BMC (SF-BMC) approach, such as in the base-level H.263 codec. The MF-BMC approach also has an inherent ability to overcome some transmission errors and is thus more robust than the SF-BMC approach. We model the error propagation process in MF-BMC coding as a multiple Markov chain and use Markov chain analysis to infer that the use of multiple frames in motion compensation increases robustness. The Markov chain analysis is also used to devise a simple scheme which randomizes the selection of the frame (amongst the multiple previous frames) used in BMC to achieve additional robustness. The MF-BMC coders proposed are a multi-frame extension of the base-level H.263 coder and are found to be more robust than the base-level H.263 coder when subjected to simulated errors commonly encountered on wireless channels.
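The robustness claim can be illustrated with a toy Monte Carlo in which a decoded frame is usable only if its own data arrives and the frame it was predicted from is usable. The loss probability, window length, and frame count below are arbitrary choices for illustration, not the paper's simulation setup.

```python
import random

def corrupted_fraction(num_frames, loss_prob, num_refs, trials=2000, seed=1):
    """Fraction of simulated runs whose final frame is corrupted.

    num_refs=1 mimics single-frame BMC (always predict from the previous
    frame); num_refs>1 mimics multi-frame BMC with a randomized reference,
    which lets prediction "jump over" a corrupted frame and recover.
    """
    rng = random.Random(seed)
    bad_runs = 0
    for _ in range(trials):
        ok = [True] * num_refs              # initial reference buffer is clean
        for _ in range(num_frames):
            ref_ok = ok[rng.randrange(num_refs)]
            ok.append(ref_ok and rng.random() > loss_prob)
            ok.pop(0)
        bad_runs += not ok[-1]
    return bad_runs / trials

sf = corrupted_fraction(num_frames=30, loss_prob=0.05, num_refs=1)
mf = corrupted_fraction(num_frames=30, loss_prob=0.05, num_refs=3)
```

In this toy model the single-frame coder never recovers from a loss, while the randomized multi-frame coder often does, which is the qualitative effect the Markov chain analysis formalizes.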
Shahamiri, Seyed Reza; Salim, Siti Salwah Binti
2014-09-01
Automatic speech recognition (ASR) can be very helpful for speakers who suffer from dysarthria, a neurological disability that damages the control of motor speech articulators. Although a few attempts have been made to apply ASR technologies to sufferers of dysarthria, previous studies show that such ASR systems have not attained an adequate level of performance. In this study, a dysarthric multi-networks speech recognizer (DM-NSR) model is provided using a realization of the multi-views, multi-learners approach called multi-nets artificial neural networks (ANNs), which tolerates the variability of dysarthric speech. In particular, the DM-NSR model employs several ANNs (as learners) to approximate the likelihood of ASR vocabulary words and to deal with the complexity of dysarthric speech. The proposed DM-NSR approach was presented in both speaker-dependent and speaker-independent paradigms. To highlight the performance of the proposed model over legacy models, multi-views, single-learner DM-NSR models were also provided and their efficiencies were compared in detail. Moreover, a comparison among the prominent dysarthric ASR methods and the proposed one is provided. The results show that the DM-NSR improved the recognition rate by up to 24.67% and reduced the error rate by up to 8.63% over the reference model.
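A minimal sketch of the multi-learner decision rule: one scorer per vocabulary word approximates that word's likelihood, and recognition picks the best-scoring word. Simple diagonal-Gaussian scorers stand in here for the per-word neural networks of the paper, and the two-word vocabulary and synthetic features are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

class WordScorer:
    """Diagonal-Gaussian stand-in for one per-word learner."""
    def fit(self, X):
        self.mu = X.mean(axis=0)
        self.var = X.var(axis=0) + 1e-6
        return self

    def log_likelihood(self, x):
        return float(-0.5 * np.sum(np.log(2 * np.pi * self.var)
                                   + (x - self.mu) ** 2 / self.var))

def recognize(x, scorers):
    # Each learner scores the utterance; the best-scoring word wins.
    return max(scorers, key=lambda w: scorers[w].log_likelihood(x))

# Synthetic two-word vocabulary with well-separated feature clusters.
train = {"yes": rng.normal(0.0, 0.3, size=(40, 5)),
         "no":  rng.normal(2.0, 0.3, size=(40, 5))}
scorers = {w: WordScorer().fit(X) for w, X in train.items()}
```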
Elastic all-optical multi-hop interconnection in data centers with adaptive spectrum allocation
NASA Astrophysics Data System (ADS)
Hong, Yuanyuan; Hong, Xuezhi; Chen, Jiajia; He, Sailing
2017-01-01
In this paper, a novel flex-grid all-optical interconnect scheme that supports transparent multi-hop connections in data centers is proposed. An inter-rack all-optical multi-hop connection is realized with an optical loop employed at flex-grid wavelength selective switches (WSSs) in an intermediate rack rather than by relaying through optical-electric-optical (O-E-O) conversions. Compared with the conventional O-E-O based approach, the proposed all-optical scheme is able to off-load the traffic at intermediate racks, leading to a reduction of the power consumption and cost. The transmission performance of the proposed flex-grid multi-hop all-optical interconnect scheme with various modulation formats, including both coherently detected and directly detected approaches, is investigated by Monte-Carlo simulations. To enhance the spectrum efficiency (SE), number-of-hop adaptive bandwidth allocation is introduced. Numerical results show that the SE can be improved by up to 33.3% at 40 Gbps, and by up to 25% at 100 Gbps. The impact of parameters, such as the targeted bit error rate (BER) level and the insertion loss of components, on the transmission performance of the proposed approach is also explored. The results show that the maximum SE improvement of the adaptive approach over the non-adaptive one is enhanced with the decrease of the targeted BER levels and the component insertion loss.
A Multi-Faceted Approach to Successful Transition for Students with Intellectual Disabilities
ERIC Educational Resources Information Center
Dubberly, Russell G.
2011-01-01
This report summarizes the multi-faceted, dynamic instructional model implemented to increase positive transition outcomes for high school students with intellectual disabilities. This report is based on the programmatic methods implemented within a secondary-level school in an urban setting. This pedagogical model facilitates the use of…
Constrained Multi-Level Algorithm for Trajectory Optimization
NASA Astrophysics Data System (ADS)
Adimurthy, V.; Tandon, S. R.; Jessy, Antony; Kumar, C. Ravi
The emphasis on low cost access to space inspired many recent developments in the methodology of trajectory optimization. Ref.1 uses a spectral patching method for optimization, where global orthogonal polynomials are used to describe the dynamical constraints. A two-tier approach of optimization is used in Ref.2 for a missile mid-course trajectory optimization. A hybrid analytical/numerical approach is described in Ref.3, where an initial analytical vacuum solution is taken and gradually atmospheric effects are introduced. Ref.4 emphasizes the fact that the nonlinear constraints which occur in the initial and middle portions of the trajectory behave very nonlinearly with respect to the variables, making the optimization very difficult to solve with direct and indirect shooting methods. The problem is made more complex when different phases of the trajectory have different objectives of optimization and also have different path constraints. Such problems can be effectively addressed by multi-level optimization. In the multi-level methods reported so far, optimization is first done in identified sub-level problems, where some coordination variables are kept fixed for global iteration. After all the sub-optimizations are completed, a higher-level optimization iteration with all the coordination and main variables is done. This is followed by further subsystem optimizations with new coordination variables. This process is continued until convergence. In this paper we use a multi-level constrained optimization algorithm which avoids the repeated local subsystem optimizations and which also removes the problem of nonlinear sensitivity inherent in the single-step approaches. Fall-zone constraints, structural load constraints and thermal constraints are considered. In this algorithm, there is only a single multi-level sequence of state and multiplier updates in a framework of an augmented Lagrangian.
Han-Tapia multiplier updates are used in view of their special role in diagonalised methods, being the only single update with quadratic convergence. For a single level, the diagonalised multiplier method (DMM) is described in Ref.5. The main advantage of the two-level analogue of the DMM approach is that it avoids the inner-loop optimizations required in the other methods. The scheme also introduces a gradient change measure to reduce the computational time needed to calculate the gradients. It is demonstrated that the new multi-level scheme leads to a robust procedure to handle the sensitivity of the constraints, and the multiple objectives of different trajectory phases. Ref. 1. Fahroo, F. and Ross, M., "A Spectral Patching Method for Direct Trajectory Optimization", The Journal of the Astronautical Sciences, Vol. 48, 2000, pp. 269-286. Ref. 2. Phillips, C.A. and Drake, J.C., "Trajectory Optimization for a Missile Using a Multitier Approach", Journal of Spacecraft and Rockets, Vol. 37, 2000, pp. 663-669. Ref. 3. Gath, P.F. and Calise, A.J., "Optimization of Launch Vehicle Ascent Trajectories with Path Constraints and Coast Arcs", Journal of Guidance, Control, and Dynamics, Vol. 24, 2001, pp. 296-304. Ref. 4. Betts, J.T., "Survey of Numerical Methods for Trajectory Optimization", Journal of Guidance, Control, and Dynamics, Vol. 21, 1998, pp. 193-207. Ref. 5. Adimurthy, V., "Launch Vehicle Trajectory Optimization", Acta Astronautica, Vol. 15, 1987, pp. 845-850.
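The single-update structure (one state step and one multiplier step per iteration, with no inner-loop sub-optimizations) can be sketched on a toy equality-constrained problem. This sketch uses a plain first-order multiplier step rather than the Han-Tapia update, and the objective, constraint, and step sizes are illustrative only.

```python
import numpy as np

# Toy problem: minimize f(x) = x1^2 + x2^2 subject to c(x) = x1 + x2 - 1 = 0.
# The true optimum is x = (0.5, 0.5) with multiplier lambda = -1.
def f_grad(x):
    return 2.0 * x

def c(x):
    return x[0] + x[1] - 1.0

c_grad = np.array([1.0, 1.0])

def diagonalized_updates(x, lam, rho=2.0, step=0.1, iters=300):
    """Alternate a single gradient step on the augmented Lagrangian with a
    single multiplier update -- no repeated subsystem optimizations."""
    for _ in range(iters):
        grad = f_grad(x) + (lam + rho * c(x)) * c_grad
        x = x - step * grad                 # one state update
        lam = lam + rho * c(x)              # one multiplier update
    return x, lam

x_opt, lam_opt = diagonalized_updates(np.array([0.0, 0.0]), 0.0)
```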
Strengthening Indonesia's health workforce through partnerships.
Kurniati, A; Rosskam, E; Afzal, M M; Suryowinoto, T B; Mukti, A G
2015-09-01
Indonesia faces critical challenges pertaining to human resources for health (HRH). These relate to HRH policy, planning, mismatch between production and demand, quality, remuneration, and maldistribution. This paper provides a state-of-the-art review of the existing conditions in Indonesia, innovations to tackle the problems, results of the innovations to date, and a picture of the on-going challenges that have yet to be met. Reversing this crisis-level shortage of HRH requires an inclusive approach to address the underlying challenges. In 2010 the government initiated multi-stakeholder coordination for HRH, using the Country Coordination and Facilitation approach. The process requires committed engagement and coordination of relevant stakeholders to address priority health needs. This manuscript is a formative evaluation of the program using documentary study and analysis. Consistent with Indonesia's decentralized health system, since 2011 local governments also started establishing provincial multi-stakeholder committees and working groups for HRH development. Through this multi-stakeholder approach with high-level government support and leadership, Indonesia was able to carry out HRH planning by engaging 164 stakeholders. Multi-stakeholder coordination has produced positive results in Indonesia by bringing about a number of innovations in HRH development to achieve UHC, fostered partnerships, attracted international attention, and galvanized multi-stakeholder support in improving the HRH situation. This approach also has facilitated mobilizing technical and financial support from domestic and international partners for HRH development. Applying the multi-stakeholder engagement and coordination process in Indonesia has proved instrumental in advancing the country's work to achieve Universal Health Coverage and the Millennium Development Goals by 2015.
Indonesia continues to face an HRH crisis, but the collaborative process provides an opportunity to achieve results. Indonesia's experience indicates that, irrespective of geographical or economic status, countries can benefit from multi-stakeholder coordination and engagement to increase access to health workers, strengthen health systems, and achieve and sustain UHC.
An adaptive multi-level simulation algorithm for stochastic biological systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lester, C., E-mail: lesterc@maths.ox.ac.uk; Giles, M. B.; Baker, R. E.
2015-01-14
Discrete-state, continuous-time Markov models are widely used in the modeling of biochemical reaction networks. Their complexity often precludes analytic solution, and we rely on stochastic simulation algorithms (SSA) to estimate system statistics. The Gillespie algorithm is exact, but computationally costly as it simulates every single reaction. As such, approximate stochastic simulation algorithms such as the tau-leap algorithm are often used. Potentially computationally more efficient, the system statistics generated suffer from significant bias unless tau is relatively small, in which case the computational time can be comparable to that of the Gillespie algorithm. The multi-level method [Anderson and Higham, “Multi-level Monte Carlo for continuous time Markov chains, with applications in biochemical kinetics,” SIAM Multiscale Model. Simul. 10(1), 146–179 (2012)] tackles this problem. A base estimator is computed using many (cheap) sample paths at low accuracy. The bias inherent in this estimator is then reduced using a number of corrections. Each correction term is estimated using a collection of paired sample paths where one path of each pair is generated at a higher accuracy compared to the other (and so more expensive). By sharing random variables between these paired paths, the variance of each correction estimator can be reduced. This renders the multi-level method very efficient as only a relatively small number of paired paths are required to calculate each correction term. In the original multi-level method, each sample path is simulated using the tau-leap algorithm with a fixed value of τ. This approach can result in poor performance when the reaction activity of a system changes substantially over the timescale of interest. By introducing a novel adaptive time-stepping approach where τ is chosen according to the stochastic behaviour of each sample path, we extend the applicability of the multi-level method to such cases.
We demonstrate the efficiency of our method using a number of examples.
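The multi-level estimator can be sketched for a single decay reaction X -> 0 with propensity k*X, coupling each fine-step path to a coarse-step path through shared Poisson variates in the style of the Anderson-Higham construction. This is a fixed-τ illustration, not the adaptive scheme of the paper; the rate, population, and sample counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def tau_leap(x0, k, T, tau):
    """Plain tau-leap path for X -> 0 with propensity k*X."""
    x = x0
    for _ in range(int(round(T / tau))):
        x = max(0, x - rng.poisson(k * x * tau))
    return x

def coupled_pair(x0, k, T, tau_f):
    """One coupled (fine tau_f, coarse 2*tau_f) pair of tau-leap paths.
    The coarse propensity is frozen over each coarse step, and the two
    paths share the Poisson count of the common propensity part."""
    xf = xc = x0
    for _ in range(int(round(T / (2 * tau_f)))):
        ac = k * xc                          # frozen coarse propensity
        for _ in range(2):                   # two fine sub-steps
            af = k * xf
            m = min(af, ac)
            common = rng.poisson(m * tau_f)
            xf = max(0, xf - common - rng.poisson((af - m) * tau_f))
            xc = max(0, xc - common - rng.poisson((ac - m) * tau_f))
    return xf, xc

def mlmc_mean(x0=1000, k=1.0, T=1.0, levels=4, n0=2000, npair=500):
    tau0 = T / 4.0
    # Base estimator: many cheap paths at the coarsest resolution ...
    est = np.mean([tau_leap(x0, k, T, tau0) for _ in range(n0)])
    # ... plus a telescoping sum of coupled correction terms.
    for lvl in range(1, levels + 1):
        tau_f = tau0 / 2 ** lvl
        est += np.mean([xf - xc for xf, xc in
                        [coupled_pair(x0, k, T, tau_f) for _ in range(npair)]])
    return est

estimate = mlmc_mean()
```

Because the paired paths share randomness, each correction term has small variance and needs only a few hundred pairs, which is the source of the method's efficiency.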
Brimblecombe, J; Bailie, R; van den Boogaard, C; Wood, B; Liberato, S C; Ferguson, M; Coveney, J; Jaenke, R; Ritchie, J
2017-12-01
Food insecurity underlies and compounds many of the development issues faced by remote Indigenous communities in Australia. Multi-sector approaches offer promise to improve food security. We assessed the feasibility of a novel multi-sector approach to enhance community food security in remote Indigenous Australia. A longitudinal comparative multi-site case study, the Good Food Systems Good Food for All Project, was conducted (2009-2013) with four Aboriginal communities. Continuous improvement meetings were held in each community. Data from project documents and store sales were used to assess feasibility according to engagement, uptake and sustainability of action, and impact on community diet, as well as to identify conditions facilitating or hindering these. Engagement was established where: the community perceived a need for the approach; where trust was developed between the community and facilitators; where there was community stability; and where flexibility was applied in the timing of meetings. The approach enabled stakeholders in each community to collectively appraise the community food system and plan action. Actions that could be directly implemented within available resources resulted from developing collaborative capacity. Actions requiring advocacy, multi-sectoral involvement, commitment or further resources were less frequently used. Positive shifts in community diet were associated with key areas where actions were implemented. A multi-sector participatory approach seeking continuous improvement engaged committed Aboriginal and non-Aboriginal stakeholders and was shown to have potential to shift community diet. Provision of clear mechanisms to link this approach with higher-level policy and decision-making structures, clarity of roles and responsibilities, and processes to prioritise and communicate actions across sectors should further strengthen capacity for food security improvement.
Integrating this approach, which enables local decision-making, into community governance structures, with adequate resourcing, is imperative.
Confidence level estimation in multi-target classification problems
NASA Astrophysics Data System (ADS)
Chang, Shi; Isaacs, Jason; Fu, Bo; Shin, Jaejeong; Zhu, Pingping; Ferrari, Silvia
2018-04-01
This paper presents an approach for estimating the confidence level in automatic multi-target classification performed by an imaging sensor on an unmanned vehicle. An automatic target recognition algorithm comprising a deep convolutional neural network in series with a support vector machine classifier detects and classifies targets based on the image matrix. The joint posterior probability mass function of target class, features, and classification estimates is learned from labeled data, and recursively updated as additional images become available. Based on the learned joint probability mass function, the approach presented in this paper predicts the expected confidence level of future target classifications, prior to obtaining new images. The proposed approach is tested with a set of simulated sonar image data. The numerical results show that the estimated confidence level provides a close approximation to the actual confidence level value determined a posteriori, i.e., after the new image is obtained by the on-board sensor. Therefore, the expected confidence level function presented in this paper can be used to adaptively plan the path of the unmanned vehicle so as to optimize the expected confidence levels and ensure that all targets are classified with satisfactory confidence after the path is executed.
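The pre-image confidence prediction can be sketched with a toy joint model. All numbers below (the class prior and the conditional distribution of classifier outputs) are invented; in the paper these quantities are learned from labeled data and updated recursively.

```python
import numpy as np

prior = np.array([0.5, 0.3, 0.2])            # P(y): three target classes
likelihood = np.array([[0.8, 0.1, 0.1],      # P(yhat | y): rows = true class,
                       [0.2, 0.7, 0.1],      # columns = classifier output
                       [0.1, 0.2, 0.7]])

joint = prior[:, None] * likelihood          # P(y, yhat)
p_yhat = joint.sum(axis=0)                   # P(yhat): predictive distribution
posterior = joint / p_yhat                   # P(y | yhat): columns sum to 1

# Expected confidence of the *next* classification, before the image exists:
# the max-posterior confidence of each possible output, weighted by how
# likely that output is.
expected_confidence = float((p_yhat * posterior.max(axis=0)).sum())
```

A path planner can evaluate this quantity for candidate sensor views and steer the vehicle toward those with the highest expected confidence.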
Heideklang, René; Shokouhi, Parisa
2016-01-01
This article focuses on the fusion of flaw indications from multi-sensor nondestructive materials testing. Because each testing method makes use of a different physical principle, a multi-method approach has the potential of effectively differentiating actual defect indications from the many false alarms, thus enhancing detection reliability. In this study, we propose a new technique for aggregating scattered two- or three-dimensional sensory data. Using a density-based approach, the proposed method explicitly addresses localization uncertainties such as registration errors. This feature marks one of the major advantages of this approach over pixel-based image fusion techniques. We provide guidelines on how to set all the key parameters and demonstrate the technique’s robustness. Finally, we apply our fusion approach to experimental data and demonstrate its capability to locate small defects by substantially reducing false alarms under conditions where no single-sensor method is adequate.
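The density-based aggregation can be sketched as a kernel sum over all scattered indications, with the kernel bandwidth absorbing the registration uncertainty. The sensor layout, bandwidth, and threshold below are invented for illustration.

```python
import numpy as np

def fused_density(detections, bandwidth=0.3, grid_step=0.1):
    """Sum an isotropic Gaussian kernel around every indication from every
    method; locations supported by several methods pile up into high-density
    peaks, while isolated false alarms stay near single-kernel height."""
    pts = np.vstack(detections)
    xs = np.arange(pts[:, 0].min() - 1.0, pts[:, 0].max() + 1.0, grid_step)
    ys = np.arange(pts[:, 1].min() - 1.0, pts[:, 1].max() + 1.0, grid_step)
    gx, gy = np.meshgrid(xs, ys)
    density = np.zeros_like(gx)
    for px, py in pts:
        density += np.exp(-((gx - px) ** 2 + (gy - py) ** 2)
                          / (2 * bandwidth ** 2))
    return gx, gy, density

# Three methods indicate the same flaw near (0, 0) with small registration
# offsets; one method also reports a lone false alarm at (5, 5).
scans = [np.array([[0.05, -0.03]]),
         np.array([[-0.04, 0.06]]),
         np.array([[0.02, 0.01], [5.0, 5.0]])]
gx, gy, density = fused_density(scans)
peak = density.max()
lone = density[np.unravel_index(np.argmin((gx - 5.0) ** 2 + (gy - 5.0) ** 2),
                                gx.shape)]
```

Thresholding the density between the single-kernel height and the multi-method peak keeps the corroborated defect and discards the lone false alarm.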
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-05
..., multi-level interventions; and community and public health approaches. To improve program design... prevention services and an evidence-based approach are provided for States to use in their SNAP-Ed programming. These definitions provide States with greater flexibility to include environmental approaches and...
A functional language approach in high-speed digital simulation
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Lu, S.-L.
1983-01-01
A functional programming approach for a multi-microprocessor architecture is presented. The language, based on Backus FP, its intermediate form and the translation process are discussed and illustrated with an example. The approach allows performance analysis to be performed at a high level as an aid in program partitioning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Sebastian; Marquetand, Philipp; González, Leticia
2014-08-21
An efficient perturbational treatment of spin-orbit coupling within the framework of high-level multi-reference techniques has been implemented in the most recent version of the COLUMBUS quantum chemistry package, extending the existing fully variational two-component (2c) multi-reference configuration interaction singles and doubles (MRCISD) method. The proposed scheme follows related implementations of quasi-degenerate perturbation theory (QDPT) model space techniques. Our model space is built either from uncontracted, large-scale scalar relativistic MRCISD wavefunctions or based on the scalar-relativistic solutions of the linear-response-theory-based multi-configurational averaged quadratic coupled cluster method (LRT-MRAQCC). The latter approach allows for a consistent, approximately size-consistent and size-extensive treatment of spin-orbit coupling. The approach is described in detail and compared to a number of related techniques. The inherent accuracy of the QDPT approach is validated by comparing cuts of the potential energy surfaces of acrolein and its S, Se, and Te analogues with the corresponding data obtained from matching fully variational spin-orbit MRCISD calculations. The conceptual availability of approximate analytic gradients with respect to geometrical displacements is an attractive feature of the 2c-QDPT-MRCISD and 2c-QDPT-LRT-MRAQCC methods for structure optimization and ab initio molecular dynamics simulations.
A hybrid solution approach for a multi-objective closed-loop logistics network under uncertainty
NASA Astrophysics Data System (ADS)
Mehrbod, Mehrdad; Tu, Nan; Miao, Lixin
2015-06-01
The design of closed-loop logistics (forward and reverse logistics) has attracted growing attention with the stringent pressures of customer expectations, environmental concerns and economic factors. This paper considers a multi-product, multi-period and multi-objective closed-loop logistics network model with regard to facility expansion as a facility location-allocation problem, which more closely approximates real-world conditions. A multi-objective mixed integer nonlinear programming formulation is linearized by defining new variables and adding new constraints to the model. By considering the aforementioned model under uncertainty, this paper develops a hybrid solution approach by combining an interactive fuzzy goal programming approach and robust counterpart optimization based on three well-known robust counterpart optimization formulations. Finally, this paper compares the results of the three formulations using different test scenarios and parameter-sensitive analysis in terms of the quality of the final solution, CPU time, the level of conservatism, the degree of closeness to the ideal solution, the degree of balance involved in developing a compromise solution, and satisfaction degree.
NASA Astrophysics Data System (ADS)
Das, Bankim Chandra; Bhattacharyya, Dipankar; Das, Arpita; Chakrabarti, Shrabana; De, Sankar
2016-12-01
We report here simultaneous experimental observation of Electromagnetically Induced Transparency (EIT) and Electromagnetically Induced Absorption (EIA) in a multi-level V-type system in the D2 transition of 87Rb, i.e., F =2 →F' with a strong pump and a weak probe beam. We studied the probe spectrum by locking the probe beam to the transition F =2 →F'=2 while the pump is scanned from F =2 →F' . EIA is observed for the open transition (F =2 →F'=2 ) whereas EIT is observed in the closed transition (F =2 →F'=3 ). A sub-natural linewidth is observed for the EIA. To simulate the observed spectra theoretically, the Liouville equation for the three-level V-type system is solved analytically with a multi-mode approach for the density-matrix elements. We assumed both the pump and the probe beams can couple the excited states. A multi-mode approach for the coherence terms facilitates the study of all the frequency contributions due to the pump and the probe fields. Since the terms contain higher harmonics of the pump and the probe frequencies, we expressed them in Fourier-transformed forms. To simulate the probe spectrum, we have solved inhomogeneous difference equations for the coherence terms using the Green's function technique and continued fraction theory. The experimental linewidths of the EIT and the EIA are compared with our theoretical model. Our system can be useful in optical switching applications as it can be precisely tuned to render the medium opaque and transparent simultaneously.
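A numerical counterpart of the modeling step can be sketched by solving the steady-state Liouville equation for a three-level V system in the rotating-wave approximation. This single-mode, steady-state sketch omits the multi-mode Fourier treatment and Doppler averaging of the paper; all detunings and Rabi frequencies are in units of the excited-state decay rate and are illustrative.

```python
import numpy as np

def steady_state_v(omega_p, omega_c, delta_p, delta_c,
                   gamma_a=1.0, gamma_b=1.0):
    """Steady-state density matrix of a V system: ground |g>, excited |a>
    (probe-coupled) and |b> (pump-coupled). Solves L(rho) = 0 with the
    trace constraint, using a column-stacked Liouvillian."""
    g, a, b = 0, 1, 2
    H = np.zeros((3, 3), dtype=complex)
    H[a, a] = -delta_p
    H[b, b] = -delta_c
    H[g, a] = H[a, g] = omega_p / 2.0
    H[g, b] = H[b, g] = omega_c / 2.0

    I = np.eye(3)
    # Coherent part: vec(H rho) = kron(I, H) v and vec(rho H) = kron(H.T, I) v.
    L = -1j * (np.kron(I, H) - np.kron(H.T, I))
    # Spontaneous decay a -> g and b -> g (Lindblad dissipators).
    for lvl, rate in [(a, gamma_a), (b, gamma_b)]:
        C = np.zeros((3, 3), dtype=complex)
        C[g, lvl] = np.sqrt(rate)
        CdC = C.conj().T @ C
        L += (np.kron(C.conj(), C)
              - 0.5 * np.kron(I, CdC) - 0.5 * np.kron(CdC.T, I))

    # Replace one redundant equation by Tr(rho) = 1 and solve.
    A, rhs = L.copy(), np.zeros(9, dtype=complex)
    A[0, :] = 0.0
    A[0, [0, 4, 8]] = 1.0
    rhs[0] = 1.0
    return np.linalg.solve(A, rhs).reshape((3, 3), order="F")

# Weak probe, strong resonant pump; Im(rho[g, a]) tracks probe absorption.
rho = steady_state_v(omega_p=0.1, omega_c=2.0, delta_p=0.0, delta_c=0.0)
```

Scanning delta_p with the pump held fixed traces out a probe spectrum of the kind measured in the experiment.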
Sanbonmatsu, David M; Strayer, David L; Medeiros-Ward, Nathan; Watson, Jason M
2013-01-01
The present study examined the relationship between personality and individual differences in multi-tasking ability. Participants enrolled at the University of Utah completed measures of multi-tasking activity, perceived multi-tasking ability, impulsivity, and sensation seeking. In addition, they performed the Operation Span in order to assess their executive control and actual multi-tasking ability. The findings indicate that the persons who are most capable of multi-tasking effectively are not the persons who are most likely to engage in multiple tasks simultaneously. To the contrary, multi-tasking activity as measured by the Media Multitasking Inventory and self-reported cell phone usage while driving were negatively correlated with actual multi-tasking ability. Multi-tasking activity was positively correlated with participants' perceived multi-tasking ability, which was found to be significantly inflated. Participants with a strong approach orientation and a weak avoidance orientation (high levels of impulsivity and sensation seeking) reported greater multi-tasking behavior. Finally, the findings suggest that people often engage in multi-tasking because they are less able to block out distractions and focus on a singular task. Participants with less executive control (low scorers on the Operation Span task and persons high in impulsivity) tended to report higher levels of multi-tasking activity.
Enhancing the Teaching of Introductory Economics with a Team-Based, Multi-Section Competition
ERIC Educational Resources Information Center
Beaudin, Laura; Berdiev, Aziz N.; Kaminaga, Allison Shwachman; Mirmirani, Sam; Tebaldi, Edinaldo
2017-01-01
The authors describe a unique approach to enhancing student learning at the introductory economics level that utilizes a multi-section, team-based competition. The competition is structured to supplement learning throughout the entire introductory course. Student teams are presented with current economic issues, trends, or events, and use economic…
Optimized swimmer tracking system based on a novel multi-related-targets approach
NASA Astrophysics Data System (ADS)
Benarab, D.; Napoléon, T.; Alfalou, A.; Verney, A.; Hellard, P.
2017-02-01
Robust tracking is a crucial step in automatic swimmer evaluation from video sequences. We designed a robust swimmer tracking system using a new multi-related-targets approach. The main idea is to consider the swimmer as a bloc of connected subtargets that advance at the same speed. If one of the subtargets is partially or totally occluded, it can be localized by knowing the position of the others. In this paper, we first introduce the two-dimensional direct linear transformation technique that we used to calibrate the videos. Then, we present the classical tracking approach based on dynamic fusion. Next, we highlight the main contribution of our work, which is the multi-related-targets tracking approach. This approach, the classical head-only approach and the ground truth are then compared, through testing on a database of high-level swimmers in training, national and international competitions (French National Championships, Limoges 2015, and World Championships, Kazan 2015). Tracking percentage and the accuracy of the instantaneous speed are evaluated, and the findings show that our new approach is significantly more accurate than the classical approach.
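The multi-related-targets idea can be sketched as follows: the swimmer is treated as a set of subtargets with roughly fixed offsets from a common reference point, so any occluded subtarget can be placed from the ones still visible. The subtarget names and offsets below are invented for illustration.

```python
import numpy as np

# Made-up rigid geometry: offset of each subtarget from a reference point.
OFFSETS = {"head": np.array([0.0, 0.0]),
           "hip":  np.array([-0.6, 0.0]),
           "feet": np.array([-1.2, 0.0])}

def recover_positions(observed):
    """observed: dict name -> measured (x, y) position. Returns positions
    for all subtargets, predicting occluded ones from the visible ones."""
    # Each visible subtarget votes for the common reference point.
    ref = np.mean([pos - OFFSETS[n] for n, pos in observed.items()], axis=0)
    return {n: observed.get(n, ref + off) for n, off in OFFSETS.items()}

# Head occluded by splash; hip and feet are still tracked.
out = recover_positions({"hip": np.array([9.4, 1.0]),
                         "feet": np.array([8.8, 1.0])})
```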
A multi-resolution approach to electromagnetic modelling
NASA Astrophysics Data System (ADS)
Cherevatova, M.; Egbert, G. D.; Smirnov, M. Yu
2018-07-01
We present a multi-resolution approach for 3-D magnetotelluric forward modelling. Our approach is motivated by the fact that fine-grid resolution is typically required at shallow levels to adequately represent near-surface inhomogeneities, topography and bathymetry, while a much coarser grid may be adequate at depth where the diffusively propagating electromagnetic fields are much smoother. With a conventional structured finite difference grid, the fine discretization required to adequately represent rapid variations near the surface is continued to all depths, resulting in higher computational costs. Increasing the computational efficiency of the forward modelling is especially important for solving regularized inversion problems. We implement a multi-resolution finite difference scheme that allows us to decrease the horizontal grid resolution with depth, as is done with vertical discretization. In our implementation, the multi-resolution grid is represented as a vertical stack of subgrids, with each subgrid being a standard Cartesian tensor product staggered grid. Thus, our approach is similar to the octree discretization previously used for electromagnetic modelling, but simpler in that we allow refinement only with depth. The major difficulty arose in deriving the forward modelling operators on interfaces between adjacent subgrids. We considered three ways of handling the interface layers and suggest a preferable one, which results in accuracy similar to the staggered-grid solution while retaining the symmetry of the coefficient matrix. A comparison between multi-resolution and staggered solvers for various models shows that the multi-resolution approach improves computational efficiency without compromising the accuracy of the solution.
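The payoff of coarsening the horizontal grid with depth is easy to quantify by counting cells in a vertical stack of subgrids versus a uniform staggered grid. The grid sizes below are arbitrary examples, not the models used in the paper.

```python
def mr_cell_count(nx, ny, layers_per_subgrid, coarsening=2):
    """Cells in a stack of subgrids whose horizontal resolution drops by
    `coarsening` in each direction for every deeper subgrid."""
    total, fx, fy = 0, nx, ny
    for nz in layers_per_subgrid:      # topmost (finest) subgrid first
        total += fx * fy * nz
        fx //= coarsening
        fy //= coarsening
    return total

uniform = 256 * 256 * 30               # uniform fine grid, 30 layers
multires = mr_cell_count(256, 256, [10, 10, 10])
```

Even this three-subgrid example cuts the cell count by more than half, and the saving grows with each additional coarsening level.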
Comparison of multi-subject ICA methods for analysis of fMRI data
Erhardt, Erik Barry; Rachakonda, Srinivas; Bedrick, Edward; Allen, Elena; Adali, Tülay; Calhoun, Vince D.
2010-01-01
Spatial independent component analysis (ICA) applied to functional magnetic resonance imaging (fMRI) data identifies functionally connected networks by estimating spatially independent patterns from their linearly mixed fMRI signals. Several multi-subject ICA approaches estimating subject-specific time courses (TCs) and spatial maps (SMs) have been developed; however, there has not yet been a full comparison of the implications of their use. Here, we provide extensive comparisons of four multi-subject ICA approaches in combination with data reduction methods for simulated and fMRI task data. For multi-subject ICA, the data first undergo reduction at the subject and group levels using principal component analysis (PCA). Comparisons of subject-specific, spatial concatenation, and group data mean subject-level reduction strategies using PCA and probabilistic PCA (PPCA) show that computationally intensive PPCA is equivalent to PCA, and that subject-specific and group data mean subject-level PCA are preferred because of well-estimated TCs and SMs. Second, aggregate independent components are estimated using either noise-free ICA or probabilistic ICA (PICA). Third, subject-specific SMs and TCs are estimated using back-reconstruction. We compare several direct group ICA (GICA) back-reconstruction approaches (GICA1-GICA3) and an indirect back-reconstruction approach, spatio-temporal regression (STR, or dual regression). Results show that the earlier group ICA approach (GICA1) approximates STR; however, STR has contradictory assumptions and may show mixed-component artifacts in estimated SMs. Our evidence-based recommendation is to use GICA3, introduced here, with subject-specific PCA and noise-free ICA, providing the most robust and accurate estimated SMs and TCs in addition to offering an intuitive interpretation. PMID:21162045
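The indirect back-reconstruction step (STR, or dual regression) reduces to two least-squares fits, sketched below with NumPy on synthetic data; the array shapes and variable names are illustrative assumptions, not the toolbox's API.

```python
import numpy as np

def spatio_temporal_regression(Y, group_SM):
    """STR/dual regression: regress group spatial maps onto a subject's data
    to get subject time courses, then regress those back to get subject maps.
    Y: (time, voxels); group_SM: (components, voxels)."""
    TC = Y @ np.linalg.pinv(group_SM)    # spatial regression -> (time, comp)
    SM = np.linalg.pinv(TC) @ Y          # temporal regression -> (comp, voxels)
    return TC, SM

# Noise-free synthetic check: the two fits recover the generating factors.
rng = np.random.default_rng(0)
TC_true = rng.normal(size=(100, 3))
SM_true = rng.normal(size=(3, 500))
Y = TC_true @ SM_true
TC, SM = spatio_temporal_regression(Y, SM_true)
print(np.allclose(TC, TC_true), np.allclose(SM, SM_true))   # True True
```

In the noisy, rank-reduced setting the two regressions no longer invert each other exactly, which is one source of the mixed-component artifacts discussed above.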
Interconnecting Multidisciplinary Data Infrastructures: From Federation to Brokering Framework
NASA Astrophysics Data System (ADS)
Nativi, S.
2014-12-01
Standardization and federation activities have played an essential role in advancing interoperability at the disciplinary and cross-disciplinary levels. However, they have proved insufficient to resolve important interoperability challenges, including disciplinary heterogeneity, cross-organization diversity, and cultural differences. Significant international initiatives like GEOSS, IODE, and CEOS demonstrated that a federated system dealing with a global and multi-disciplinary domain turns out to be rather complex, further raising the already high entry barriers for both providers and users. In particular, GEOSS demonstrated that standardization and federation actions must be accompanied and complemented by a brokering approach. Brokering architectures and their implementing technologies can achieve an effective level of interoperability among multi-disciplinary systems, lowering the entry barriers for both data providers and users. This presentation will discuss the brokering philosophy as an approach complementary to standardization and federation for interconnecting existing, heterogeneous infrastructures and systems. The GEOSS experience will be analyzed in particular.
Optimization as a Tool for Consistency Maintenance in Multi-Resolution Simulation
NASA Technical Reports Server (NTRS)
Drewry, Darren T.; Reynolds, Paul F., Jr.; Emanuel, William R.
2006-01-01
The need for new approaches to the consistent simulation of related phenomena at multiple levels of resolution is great. While many fields of application would benefit from a complete and approachable solution to this problem, such solutions have proven extremely difficult. We present a multi-resolution simulation methodology that uses numerical optimization as a tool for maintaining external consistency between models of the same phenomena operating at different levels of temporal and/or spatial resolution. Our approach follows from previous work in the disparate fields of inverse modeling and spacetime constraint-based animation. As a case study, our methodology is applied to two environmental models of forest canopy processes that make overlapping predictions under unique sets of operating assumptions, and which execute at different temporal resolutions. Experimental results are presented and future directions are addressed.
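The external-consistency idea can be sketched as a tiny inverse problem (models and numbers invented for illustration): tune a coarse model's free parameter until its monthly prediction matches the aggregated output of a fine daily model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

days = np.arange(30)
fine_daily = 5.0 + 0.1 * days        # fine model: daily canopy flux
fine_monthly = fine_daily.sum()      # aggregated fine-model output

def coarse_model(rate):
    """Coarse model: a single monthly rate applied over 30 days."""
    return rate * 30.0

def mismatch(rate):
    """Squared inconsistency between the two resolutions."""
    return (coarse_model(rate) - fine_monthly) ** 2

rate = minimize_scalar(mismatch).x   # inverse-modelling step
print(round(rate, 3))                # 6.45, the mean daily fine flux
```

The paper's setting replaces this one-parameter quadratic with full canopy-process models, but the structure is the same: consistency maintenance posed as numerical optimization over the coarse model's parameters.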
Uddin, Shahadat
2016-02-04
A patient-centric care network can be defined as a network among a group of healthcare professionals who provide treatments to common patients. Various multi-level attributes of the members of this network have a substantial influence on its perceived level of performance. In order to assess the impact of different multi-level attributes of patient-centric care networks on healthcare outcomes, this study first captured patient-centric care networks for 85 hospitals using a health insurance claim dataset. From these networks, this study then constructed physician collaboration networks based on the concept of a patient-sharing network among physicians. A multi-level regression model was then developed to explore the impact of different attributes, organised at two levels, on hospitalisation cost and hospital length of stay. In the Level-1 model, the average visit count per physician significantly predicted both hospitalisation cost and hospital length of stay. The number of different physicians significantly predicted only the hospitalisation cost, an effect significantly moderated by the age, gender and comorbidity score of patients. All Level-1 findings showed significant variance across physician collaboration networks with different community structures and densities. These findings could be utilised as a reflective measure by healthcare decision makers. Moreover, healthcare managers could consider them in developing effective healthcare environments.
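A minimal sketch of the two-level structure (synthetic data, not the claims dataset): the Level-1 slope relating cost to physician visits is estimated separately within each network, making the cross-network variance of the effect visible.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_slope(x, y):
    """OLS slope of y on x (with an intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Toy two-level data: patients nested in 5 physician-collaboration networks;
# the Level-1 effect is made to differ by network, mimicking the finding
# that Level-1 effects varied with network structure.
slopes = []
for g in range(5):
    x = rng.normal(size=200)            # average visits per physician
    true_slope = 1.0 + 0.5 * g          # network-specific effect (assumed)
    y = true_slope * x + rng.normal(scale=0.1, size=200)
    slopes.append(fit_slope(x, y))

print(np.round(slopes, 2))              # slopes differ across networks
```

A full multi-level model would pool these group-wise fits through random effects rather than estimating them independently; the sketch only shows why the grouping level matters.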
NASA Astrophysics Data System (ADS)
Wałach, Daniel; Sagan, Joanna; Gicala, Magdalena
2017-10-01
The paper presents an environmental and economic analysis of the material solutions of a multi-level garage. The construction project approach considered a reinforced concrete structure using either ordinary concrete or high-performance concrete (HPC). Using HPC allowed a significant reduction of reinforcing steel, mainly in compression elements (columns), in the construction of the object. The analysis includes elements of the methodology of integrated life cycle design (ILCD). Through a multi-criteria analysis based on established weights for the economic and environmental parameters, three solutions have been evaluated and compared within the material production phase (information modules A1-A3).
Design of supply chain in fuzzy environment
NASA Astrophysics Data System (ADS)
Rao, Kandukuri Narayana; Subbaiah, Kambagowni Venkata; Singh, Ganja Veera Pratap
2013-05-01
Nowadays, customer expectations are increasing and organizations are prone to operate in an uncertain environment. Under this uncertain environment, the ultimate success of the firm depends on its ability to integrate business processes among supply chain partners. Supply chain management emphasizes cross-functional links to improve the competitive strategy of organizations. Companies are now moving from decoupled decision processes towards more integrated design and control of their components to achieve strategic fit. In this paper, a new approach is developed to design a multi-echelon, multi-facility, multi-product supply chain in a fuzzy environment. At the strategic level, a mixed integer programming problem is formulated through fuzzy goal programming, with supply chain cost and volume flexibility as fuzzy goals; these fuzzy goals are aggregated using the minimum operator. At the tactical level, a continuous review policy for controlling raw material inventories in the supplier echelon and finished product inventories in the plant and distribution center echelons is treated as a set of fuzzy goals, and a non-linear programming model is formulated through fuzzy goal programming using the minimum operator. The proposed approach is illustrated with a numerical example.
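The minimum-operator aggregation has a standard max-min linear programming form, sketched here with SciPy on invented goal bounds (not the paper's model): maximise the worst membership level lam subject to each fuzzy goal's membership exceeding lam.

```python
from scipy.optimize import linprog

# Max-min fuzzy goal programming on a toy two-variable problem. Each fuzzy
# goal gets a linear membership function; the minimum operator turns
# "satisfy all goals" into maximising the worst membership level lam.
#
# Goal 1 (cost, minimise):        f1 = x1 + 2*x2, best 0, worst 10
#   mu1 >= lam  <=>  x1 + 2*x2 + 10*lam <= 10
# Goal 2 (flexibility, maximise): f2 = x1 + x2, worst 0, best 6
#   mu2 >= lam  <=>  -x1 - x2 + 6*lam <= 0
c = [0.0, 0.0, -1.0]                      # maximise lam == minimise -lam
A_ub = [[1.0, 2.0, 10.0],
        [-1.0, -1.0, 6.0]]
b_ub = [10.0, 0.0]
bounds = [(0, 5), (0, 5), (0, 1)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(round(lam, 3))                      # overall satisfaction level
```

The same construction extends to the paper's strategic-level model by adding the mixed-integer supply chain constraints alongside the membership constraints.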
NASA Astrophysics Data System (ADS)
Balaji, V.; Benson, Rusty; Wyman, Bruce; Held, Isaac
2016-10-01
Climate models represent a large variety of processes on a variety of timescales and space scales, a canonical example of multi-physics multi-scale modeling. Current hardware trends, such as Graphical Processing Units (GPUs) and Many Integrated Core (MIC) chips, are based on, at best, marginal increases in clock speed, coupled with vast increases in concurrency, particularly at the fine grain. Multi-physics codes face particular challenges in achieving fine-grained concurrency, as different physics and dynamics components have different computational profiles, and universal solutions are hard to come by. We propose here one approach for multi-physics codes. These codes are typically structured as components interacting via software frameworks. The component structure of a typical Earth system model consists of a hierarchical and recursive tree of components, each representing a different climate process or dynamical system. This recursive structure generally encompasses a modest level of concurrency at the highest level (e.g., atmosphere and ocean on different processor sets) with serial organization underneath. We propose to extend concurrency much further by running more and more lower- and higher-level components in parallel with each other. Each component can further be parallelized on the fine grain, potentially offering a major increase in the scalability of Earth system models. We present here first results from this approach, called coarse-grained component concurrency, or CCC. Within the Geophysical Fluid Dynamics Laboratory (GFDL) Flexible Modeling System (FMS), the atmospheric radiative transfer component has been configured to run in parallel with a composite component consisting of every other atmospheric component, including the atmospheric dynamics and all other atmospheric physics components. We will explore the algorithmic challenges involved in such an approach, and present results from such simulations. 
Plans to achieve even greater levels of coarse-grained concurrency by extending this approach within other components, such as the ocean, will be discussed.
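The coarse-grained concurrency idea can be sketched in a few lines (a toy coupler, not FMS code): the radiation component and the composite of the remaining components run in parallel within a time step, and their tendencies are summed afterwards.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def radiation(state):
    """Stand-in for the expensive radiative transfer component."""
    time.sleep(0.05)
    return {"rad_tendency": state["T"] * -0.01}

def rest_of_atmosphere(state):
    """Composite of dynamics and all other physics components."""
    time.sleep(0.05)
    return {"dyn_tendency": state["T"] * 0.02}

def step(state):
    """One coupled step: both components see the same input state
    (lagged coupling), then the coupler combines their tendencies."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        f_rad = pool.submit(radiation, state)
        f_dyn = pool.submit(rest_of_atmosphere, state)
        rad, dyn = f_rad.result(), f_dyn.result()
    state["T"] += rad["rad_tendency"] + dyn["dyn_tendency"]
    return state

print(step({"T": 300.0})["T"])       # 300 - 3 + 6 = 303.0
```

The scientific cost of this pattern is the lagged coupling: the concurrent component works from the previous step's state, which is exactly the algorithmic trade-off the abstract proposes to explore.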
Lakerveld, Jeroen; Brug, Johannes; Bot, Sandra; Teixeira, Pedro J; Rutter, Harry; Woodward, Euan; Samdal, Oddrun; Stockley, Lynn; De Bourdeaudhuij, Ilse; van Assema, Patricia; Robertson, Aileen; Lobstein, Tim; Oppert, Jean-Michel; Adány, Róza; Nijpels, Giel
2012-09-17
The prevalence of overweight and obesity in Europe is high. It is a major cause of the overall rates of many of the main chronic (or non-communicable) diseases in this region and is characterized by an unequal socio-economic distribution within the population. Obesity is largely determined by modifiable lifestyle behaviours such as low physical activity levels, sedentary behaviour and consumption of energy-dense diets. It is increasingly being recognised that effective responses must go beyond interventions that only focus on a specific individual, social or environmental level and instead embrace system-based multi-level intervention approaches that address both the individual and the environment. The EU-funded project "sustainable prevention of obesity through integrated strategies" (SPOTLIGHT) aims to increase and combine knowledge on the wide range of determinants of obesity in a systematic way, and to identify multi-level intervention approaches that are strong in terms of Reach, Efficacy, Adoption, Implementation and Maintenance (RE-AIM). SPOTLIGHT comprises a series of systematic reviews on: individual-level predictors of success in behaviour change obesity interventions; social and physical environmental determinants of obesity; and the RE-AIM of multi-level interventions. An interactive web-atlas of currently running multi-level interventions will be developed, and enhancing and impeding factors for implementation will be described. At the neighbourhood level, these elements will inform the development of methods to assess the obesogenicity of diverse environments, using remote imaging techniques linked to geographic information systems. The validity of these methods will be evaluated using data from surveys of the health and lifestyles of adults residing in the neighbourhoods surveyed. At both the micro- and macro-levels (national and international), the different physical, economic, political and socio-cultural elements will be assessed.
SPOTLIGHT offers the potential to develop approaches that combine an understanding of the obesogenicity of environments in Europe, and thus how they can be improved, with an appreciation of the individual factors that explain why people respond differently to such environments. Its findings will inform governmental authorities and professionals, academics, NGOs and private sector stakeholders engaged in the development and implementation of policies to tackle the obesity epidemic in Europe.
Velpuri, Naga Manohar; Senay, Gabriel B.
2012-01-01
Lake Turkana, the largest desert lake in the world, is fed by ungauged or poorly gauged river systems. To meet the demand for electricity in the East African region, Ethiopia is currently building the Gibe III hydroelectric dam on the Omo River, which supplies more than 80% of the inflows to Lake Turkana. On completion, the Gibe III dam will be the tallest dam in Africa, with a height of 241 m. However, the nature of interactions and potential impacts of regulated inflows to Lake Turkana are not well understood due to its remote location and the unavailability of reliable in-situ datasets. In this study, we used 12 years (1998–2009) of existing multi-source satellite and model-assimilated global weather data with a calibrated, multi-source satellite data-driven water balance model for Lake Turkana that takes into account model-routed runoff, lake/reservoir evapotranspiration, direct rain on the lake/reservoir and releases from the dam to compute lake water levels. The model evaluates the impact of the Gibe III dam using three different approaches (a historical approach, a knowledge-based approach, and a nonparametric bootstrap resampling approach) to generate rainfall-runoff scenarios. All the approaches provided comparable and consistent results. Model results indicated that the hydrological impact of the dam on Lake Turkana would vary with the magnitude and distribution of rainfall after dam commencement. On average, the reservoir would take 8–10 months after commencement to reach a minimum operating level of 201 m depth of water. During the dam-filling period, the lake level would drop by up to 2 m (95% confidence) compared to the lake level modelled without the dam. The lake-level variability caused by regulated inflows after dam commissioning was found to be within the lake's natural variability of 4.8 m.
Moreover, modelling results indicated that the hydrological impact of the Gibe III dam would depend on the initial lake level at the time of dam commencement. Areas along the Lake Turkana shoreline that are vulnerable to fluctuations in lake levels were also identified. This study demonstrates the effectiveness of using existing multi-source satellite data in a basic modeling framework to assess the potential hydrological impact of an upstream dam on a terminal downstream lake. The results obtained from this study could also be used to evaluate alternate dam-filling scenarios and assess the potential impact of the dam on Lake Turkana under different operational strategies.
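Of the three scenario-generation approaches, the nonparametric bootstrap is easy to sketch (all numbers invented; the real model routes runoff and evapotranspiration from satellite data): resample historical inflow years with replacement and read a confidence band off the simulated level changes.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented annual inflow record (km^3/yr) standing in for the 1998-2009 data.
historical_inflow = np.array([18., 22., 15., 25., 20., 17., 23., 19., 21.,
                              16., 24., 14.])

def level_change(inflows, outflow=21.0, area_km2=7000.0):
    """Cumulative terminal-lake level change (m): net inflow / surface area."""
    return np.cumsum(inflows - outflow) / area_km2 * 1000.0   # km -> m

# Bootstrap: resample 5-year inflow sequences with replacement.
draws = np.array([level_change(rng.choice(historical_inflow, size=5))[-1]
                  for _ in range(2000)])
lo, hi = np.percentile(draws, [2.5, 97.5])
print(round(lo, 2), round(hi, 2))       # 95% band on the 5-year level change
```

The study's 2 m (95% confidence) drawdown figure is the analogue of this percentile band, computed with the calibrated water balance model rather than a toy storage equation.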
NASA Astrophysics Data System (ADS)
Indarsih, Indrati, Ch. Rini
2016-02-01
In this paper, we define the variance of fuzzy random variables through alpha levels. We prove a theorem showing that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem with fuzzy random objective function coefficients and solve it by a variance approach. The approach transforms the MOLP with fuzzy random objective function coefficients into an MOLP with fuzzy objective function coefficients. By the weighted method, we obtain a linear programming problem with fuzzy coefficients, which we solve by the simplex method for fuzzy linear programming.
Schlottfeldt, S; Walter, M E M T; Carvalho, A C P L F; Soares, T N; Telles, M P C; Loyola, R D; Diniz-Filho, J A F
2015-06-18
Biodiversity crises have led scientists to develop strategies for achieving conservation goals. The underlying principle of these strategies lies in systematic conservation planning (SCP), in which there are at least two conflicting objectives, making it a good candidate for multi-objective optimization. Although SCP is typically applied at the species level (or hierarchically higher), it can be used at lower hierarchical levels, such as using alleles as basic units for analysis, for conservation genetics. Here, we propose a method of SCP using a multi-objective approach. We used the non-dominated sorting genetic algorithm II (NSGA-II) to identify the smallest set of local populations of Dipteryx alata (baru) (a Brazilian Cerrado species) for conservation, representing the known genetic diversity and using allele frequency information associated with heterozygosity and Hardy-Weinberg equilibrium. We considered three variations of the problem. First, we reproduced a previous experiment, but using a multi-objective approach. We found that the smallest set of populations needed to represent all alleles under study was 7, corroborating the results of the previous study, but with more distinct solutions. In the second and third variations, we performed simultaneous optimization of 4 and 5 objectives, respectively. We found similar but refined results for 7 populations, and a larger portfolio considering intra-specific diversity and persistence, with populations ranging from 8 to 22. This is the first study to apply multi-objective algorithms to an SCP problem using alleles at the population level as basic units for analysis.
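The heart of NSGA-II is the non-dominated sorting step, sketched here on an invented two-objective version of the portfolio problem (number of populations vs. alleles left uncovered); the data are not from the baru study.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one
    (both objectives are minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scores):
    """Candidates not dominated by any other candidate: the first front."""
    return [s for s in scores if not any(dominates(t, s) for t in scores)]

# (n_populations, n_missing_alleles) for a few candidate portfolios
candidates = [(7, 0), (6, 2), (8, 0), (5, 5), (6, 1), (7, 1)]
print(sorted(pareto_front(candidates)))   # [(5, 5), (6, 1), (7, 0)]
```

NSGA-II repeats this sorting within an evolving population, so a run returns the whole trade-off front at once; (7, 0) here plays the role of the study's 7-population full-coverage solution.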
Field Trials of the Multi-Source Approach for Resistivity and Induced Polarization Data Acquisition
NASA Astrophysics Data System (ADS)
LaBrecque, D. J.; Morelli, G.; Fischanger, F.; Lamoureux, P.; Brigham, R.
2013-12-01
Implementing systems of distributed receivers and transmitters for resistivity and induced polarization data is an almost inevitable result of the availability of wireless data communication modules and GPS modules offering precise timing and instrument locations. Such systems have a number of advantages; for example, they can be deployed around obstacles such as rivers, canyons, or mountains, which would be difficult with traditional 'hard-wired' systems. However, deploying a system of identical, small, battery-powered transceivers, each capable of injecting a known current and measuring the induced potential, has an additional and less obvious advantage in that multiple units can inject current simultaneously. The original purpose for using multiple simultaneous current sources (multi-source) was to increase signal levels. In traditional systems, to double the received signal you inject twice the current, which requires you to apply twice the voltage and thus four times the power. Alternatively, one approach to increasing signal levels for large-scale surveys collected using small, battery-powered transceivers is to allow multiple units to transmit in parallel. In theory, using four 400 watt transmitters on separate, parallel dipoles yields roughly the same signal as a single 6400 watt transmitter. Furthermore, implementing the multi-source approach creates the opportunity to apply more complex current flow patterns than simple, parallel dipoles. For a perfect, noise-free system, multi-source transmission adds no new information to a data set that contains a comprehensive set of data collected using single sources. However, for realistic, noisy systems, it appears that multi-source data can substantially impact survey results. In preliminary model studies, the multi-source data produced such startling improvements in subsurface images that even the authors questioned their veracity.
Between December of 2012 and July of 2013, we completed multi-source surveys at five sites with depths of exploration ranging from 150 to 450 m. The sites included shallow geothermal sites near Reno, Nevada, Pomarance, Italy, and Volterra, Italy; a mineral exploration site near Timmins, Quebec; and a landslide investigation near Vajont Dam in northern Italy. These sites provided a series of challenges in survey design and deployment, including some extremely difficult terrain and a broad range of background resistivity and induced polarization values. Despite these challenges, comparison of multi-source results to resistivity and induced polarization data collected with more traditional methods supports the thesis that the multi-source approach is capable of providing substantial improvements in both depth of penetration and resolution over conventional approaches.
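The signal-power argument quoted above follows from signal scaling with current and power with current squared; a one-line check (idealized, ignoring electrode geometry and coupling):

```python
def equivalent_single_power(n_transmitters, power_each):
    """Power a single transmitter would need to match N parallel units:
    N units give N times the current, and power scales as current squared,
    so the single-source equivalent is N^2 * p."""
    return n_transmitters**2 * power_each

print(equivalent_single_power(4, 400))   # 6400 W, matching the text
```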
Multi-level molecular modelling for plasma medicine
NASA Astrophysics Data System (ADS)
Bogaerts, Annemie; Khosravian, Narjes; Van der Paal, Jonas; Verlackt, Christof C. W.; Yusupov, Maksudbek; Kamaraj, Balu; Neyts, Erik C.
2016-02-01
Modelling at the molecular or atomic scale can be very useful for obtaining a better insight into plasma medicine. This paper gives an overview of different atomic/molecular scale modelling approaches that can be used to study the direct interaction of plasma species with biomolecules or the consequences of these interactions for the biomolecules on a somewhat longer time-scale. These approaches include density functional theory (DFT), density functional based tight binding (DFTB), classical reactive and non-reactive molecular dynamics (MD) and united-atom or coarse-grained MD, as well as hybrid quantum mechanics/molecular mechanics (QM/MM) methods. Specific examples will be given for three important types of biomolecules present in human cells, i.e. proteins, DNA and phospholipids found in the cell membrane. The results show that each of these modelling approaches has its specific strengths and limitations, and is particularly useful for certain applications. A multi-level approach is therefore most suitable for obtaining a global picture of the plasma-biomolecule interactions.
Multi-atlas segmentation enables robust multi-contrast MRI spleen segmentation for splenomegaly
NASA Astrophysics Data System (ADS)
Huo, Yuankai; Liu, Jiaqi; Xu, Zhoubing; Harrigan, Robert L.; Assad, Albert; Abramson, Richard G.; Landman, Bennett A.
2017-02-01
Non-invasive spleen volume estimation is essential in detecting splenomegaly. Magnetic resonance imaging (MRI) has been used to facilitate splenomegaly diagnosis in vivo. However, achieving accurate spleen volume estimation from MR images is challenging given the great inter-subject variance of human abdomens and the wide variety of clinical images/modalities. Multi-atlas segmentation has been shown to be a promising approach to handle heterogeneous data and difficult anatomical scenarios. In this paper, we propose to use multi-atlas segmentation frameworks for MRI spleen segmentation for splenomegaly. To the best of our knowledge, this is the first work that integrates multi-atlas segmentation for splenomegaly as seen on MRI. To address the particular concerns of spleen MRI, automated and novel semi-automated atlas selection approaches are introduced. The automated approach iteratively selects a subset of atlases using the selective and iterative method for performance level estimation (SIMPLE) approach. To further control the outliers, semi-automated craniocaudal length based SIMPLE atlas selection (L-SIMPLE) is proposed to introduce a spatial prior to guide the iterative atlas selection. A dataset from a clinical trial containing 55 MRI volumes (28 T1 weighted and 27 T2 weighted) was used to evaluate different methods. Both automated and semi-automated methods achieved median DSC > 0.9. The outliers were alleviated by the L-SIMPLE (≈1 min of manual effort per scan), which achieved 0.9713 Pearson correlation with the manual segmentation. The results demonstrated that multi-atlas segmentation is able to achieve accurate spleen segmentation from multi-contrast splenomegaly MRI scans.
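The SIMPLE selection loop can be caricatured on toy one-dimensional binary masks (a sketch of the idea only; the real method operates on registered 3-D atlases and performance-level estimates): fuse, score each atlas against the fused estimate, drop low scorers, repeat.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    return 2.0 * np.sum(a * b) / (np.sum(a) + np.sum(b))

def simple_select(atlases, threshold=0.7):
    """Iteratively drop atlases whose Dice against the fused (majority-vote)
    estimate falls below the threshold, until the selection is stable."""
    keep = list(range(len(atlases)))
    while True:
        fused = (np.mean([atlases[i] for i in keep], axis=0) >= 0.5).astype(int)
        new_keep = [i for i in keep if dice(atlases[i], fused) >= threshold]
        if new_keep == keep or not new_keep:
            return keep, fused
        keep = new_keep

truth = np.array([0, 1, 1, 1, 1, 0, 0, 0])
good1 = np.array([0, 1, 1, 1, 0, 0, 0, 0])
good2 = np.array([0, 1, 1, 1, 1, 1, 0, 0])
bad   = np.array([1, 0, 0, 0, 0, 1, 1, 0])      # badly registered atlas
keep, fused = simple_select([truth, good1, good2, bad])
print(keep)                                      # [0, 1, 2]: outlier dropped
```

L-SIMPLE adds a spatial prior (here it would amount to biasing the score by craniocaudal extent) so that outliers like `bad` are penalised even when they happen to agree with the fusion.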
Wang, Lin; Qu, Hui; Liu, Shan; Dun, Cai-xia
2013-01-01
As a practical inventory and transportation problem, it is important to synthesize several objectives for the joint replenishment and delivery (JRD) decision. In this paper, a new multi-objective stochastic JRD (MSJRD) of a one-warehouse, n-retailer system, considering the balance of service level and total cost simultaneously, is proposed. The goal is to decide reasonable replenishment intervals, safety stock factors, and travelling routes. Two approaches are designed to handle this complex multi-objective optimization problem: a linear programming (LP) approach converts the multiple objectives to a single objective, while a multi-objective evolutionary algorithm (MOEA) solves the multi-objective problem directly. Three intelligent optimization algorithms, the differential evolution algorithm (DE), a hybrid DE (HDE), and a genetic algorithm (GA), are utilized in the LP-based and MOEA-based approaches. Results of the MSJRD with the LP-based and MOEA-based approaches are compared in a contrastive numerical example. To analyse the nondominated solutions of the MOEA, a metric is also used to measure the distribution of the last-generation solutions. Results show that HDE outperforms DE and GA whether the LP-based or the MOEA-based approach is adopted.
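The LP-based route, which scalarises the objectives before handing them to an evolutionary solver, can be sketched with SciPy's differential evolution on an invented cost/service model (the decision variables T and k stand in for the replenishment interval and safety stock factor; none of the coefficients are the paper's):

```python
import numpy as np
from scipy.optimize import differential_evolution

def total_cost(x):
    """Toy expected cost: ordering + holding + safety stock."""
    T, k = x
    return 100.0 / T + 20.0 * T + 5.0 * k * np.sqrt(T)

def service_penalty(x):
    """Toy service-level penalty, shrinking as safety stock grows."""
    _, k = x
    return 10.0 * np.exp(-k)

def weighted(x, w=0.7):
    """Scalarised objective: the LP-based conversion to a single objective."""
    return w * total_cost(x) + (1 - w) * service_penalty(x)

res = differential_evolution(weighted, bounds=[(0.1, 5.0), (0.0, 4.0)], seed=1)
T_opt, k_opt = res.x
print(round(T_opt, 2), round(k_opt, 2))
```

The MOEA-based route would instead evolve a population against both objectives at once and return a nondominated set, which is what the distribution metric in the abstract evaluates.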
Climate policy: Uncovering ocean-related priorities
NASA Astrophysics Data System (ADS)
Barkemeyer, Ralf
2017-11-01
Given the complexity and multi-faceted nature of policy processes, national-level policy preferences are notoriously difficult to capture. Now, research applying an automated text mining approach helps to shed light on country-level differences and priorities in the context of marine climate issues.
NASA Astrophysics Data System (ADS)
Chen, Hsing-Ta; Ho, Tak-San; Chu, Shih-I.
The generalized Floquet approach is developed to study the memory effect on electron transport phenomena through a periodically driven single quantum dot in an electrode-multi-level dot-electrode nanoscale quantum device. The memory effect is treated using a multi-function Lorentzian spectral density (LSD) model that mimics the spectral density of each electrode in terms of multiple Lorentzian functions. For the symmetric single-function LSD model involving a single-level dot, the underlying single-particle propagator is shown to be related to a 2×2 effective time-dependent Hamiltonian that includes both the periodic external field and the electrode memory effect. By invoking the generalized Van Vleck (GVV) nearly degenerate perturbation theory, an analytical Tien-Gordon-like expression is derived for arbitrary-order multi-photon resonance d.c. tunneling current. Numerically converged simulations and the GVV analytical results are in good agreement, revealing the origin of multi-photon coherent destruction of tunneling and accounting for the suppression of the staircase jumps of d.c. current due to the memory effect. Notably, a novel blockade phenomenon is observed, showing distinctive oscillations in the field-induced current in the large bias voltage limit.
NASA Astrophysics Data System (ADS)
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-11-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.
NASA Astrophysics Data System (ADS)
Farroha, Bassam S.; Farroha, Deborah L.
2011-06-01
The new corporate approach to efficient processing and storage is migrating from in-house service-center services to the newly coined approach of Cloud Computing. This approach advocates thin clients, with services provided by the service provider over time-shared resources. The concept is not new; however, the implementation approach presents a strategic shift in the way organizations provision and manage their IT resources. The requirements on some of the data sets targeted to be run on the cloud vary depending on the data type, originator, user, and confidentiality level. Additionally, the systems that fuse such data would have to deal with classifying the product and clearing the computing resources before allowing a new application to be executed. This indicates that we could end up with a multi-level security system that needs to follow specific rules and can send its output only to suitably protected networks and systems, so as to avoid data spills or contaminated resources. The paper discusses these requirements and their potential impact on the cloud architecture. Additionally, the paper discusses the unexpected advantages of the cloud framework in providing a sophisticated environment for information sharing and data mining.
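The multi-level security rules alluded to above can be illustrated with a minimal sketch (ours, not the paper's) in the classic Bell-LaPadula style: a subject may read data at or below its clearance ("no read up") and write only at or above its level ("no write down"), which is precisely what prevents data spills to less-trusted resources.

```python
# Minimal multi-level security sketch (illustrative; not from the paper).
# Levels and rules follow the Bell-LaPadula model.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    """Simple security property: no read up."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    """Star property: no write down (prevents spilling data to lower levels)."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```

Under these two checks, a "secret" process fusing classified data can never emit its product onto an "unclassified" network, matching the spill-avoidance requirement described in the abstract.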
NASA Technical Reports Server (NTRS)
Skillen, Michael D.; Crossley, William A.
2008-01-01
This report presents an approach for sizing a morphing aircraft based upon a multi-level design optimization approach. For this effort, a morphing wing is one whose planform can make significant shape changes in flight - increasing wing area by 50% or more from the lowest possible area, changing sweep by 30° or more, and/or increasing aspect ratio by as much as 200% from the lowest possible value. The top-level optimization problem seeks to minimize the gross weight of the aircraft by determining a set of "baseline" variables (common aircraft sizing variables) along with a set of "morphing limit" variables (which describe the maximum shape change for a particular morphing strategy). The sub-level optimization problems represent each segment in the morphing aircraft's design mission; here, each sub-level optimizer minimizes the fuel consumed during its mission segment by changing the wing planform within the bounds set by the baseline and morphing limit variables from the top-level problem.
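The two-level structure described above can be sketched with a toy model (all functions, numbers, and the grid-search optimizers are illustrative stand-ins, not the report's actual sizing code): the top level picks the baseline and morphing-limit variables, and a sub-level search picks the best wing area for each mission segment within the bounds those variables allow.

```python
# Toy two-level sizing sketch (hypothetical models, not the report's).

def segment_fuel(area, segment):
    # Toy fuel model: each mission segment has a preferred wing area.
    return (area - segment["best_area"]) ** 2 + segment["base_fuel"]

def best_segment_fuel(a_min, a_max, segment, steps=100):
    # Sub-level optimizer: 1-D search over allowed planform areas.
    areas = [a_min + (a_max - a_min) * i / steps for i in range(steps + 1)]
    return min(segment_fuel(a, segment) for a in areas)

def gross_weight(a_min, morph_frac, segments, struct_penalty=5.0):
    # Top-level objective: structural weight grows with morphing capability,
    # fuel weight comes from the per-segment sub-problems.
    a_max = a_min * (1.0 + morph_frac)
    fuel = sum(best_segment_fuel(a_min, a_max, s) for s in segments)
    return 100.0 + struct_penalty * morph_frac + fuel

segments = [{"best_area": 10.0, "base_fuel": 1.0},
            {"best_area": 14.0, "base_fuel": 2.0}]
candidates = [(a, m) for a in (9.0, 10.0, 11.0) for m in (0.0, 0.2, 0.5)]
best = min(candidates, key=lambda am: gross_weight(am[0], am[1], segments))
```

In this toy setup, paying a small structural penalty for a 50% morphing range lets both segments fly near their preferred areas, so the morphing design beats the fixed-planform one, mirroring the trade-off the report's framework is built to capture.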
Schlüter, Daniela K; Ramis-Conde, Ignacio; Chaplain, Mark A J
2015-02-06
Studying the biophysical interactions between cells is crucial to understanding how normal tissue develops, how it is structured and also when malfunctions occur. Traditional experiments try to infer events at the tissue level after observing the behaviour of and interactions between individual cells. This approach assumes that cells behave in the same biophysical manner in isolated experiments as they do within colonies and tissues. In this paper, we develop a multi-scale multi-compartment mathematical model that accounts for the principal biophysical interactions and adhesion pathways not only at a cell-cell level but also at the level of cell colonies (in contrast to the traditional approach). Our results suggest that adhesion/separation forces between cells may be lower in cell colonies than traditional isolated single-cell experiments infer. As a consequence, isolated single-cell experiments may be insufficient to deduce important biological processes such as single-cell invasion after detachment from a solid tumour. The simulations further show that kinetic rates and cell biophysical characteristics such as pressure-related cell-cycle arrest have a major influence on cell colony patterns and can allow for the development of protrusive cellular structures as seen in invasive cancer cell lines independent of expression levels of pro-invasion molecules.
Jing, Luyang; Wang, Taiyong; Zhao, Ming; Wang, Peng
2017-01-01
A fault diagnosis approach based on multi-sensor data fusion is a promising tool to deal with complicated damage detection problems of mechanical systems. Nevertheless, this approach suffers from two challenges, which are (1) the feature extraction from various types of sensory data and (2) the selection of a suitable fusion level. It is usually difficult to choose an optimal feature or fusion level for a specific fault diagnosis task, and extensive domain expertise and human labor are also highly required during these selections. To address these two challenges, we propose an adaptive multi-sensor data fusion method based on deep convolutional neural networks (DCNN) for fault diagnosis. The proposed method can learn features from raw data and optimize a combination of different fusion levels adaptively to satisfy the requirements of any fault diagnosis task. The proposed method is tested on a planetary gearbox test rig. Handcrafted features, manually selected fusion levels, single sensory data, and two traditional intelligent models, back-propagation neural networks (BPNN) and a support vector machine (SVM), are used as comparisons in the experiment. The results demonstrate that the proposed method is able to detect the conditions of the planetary gearbox effectively, with the best diagnosis accuracy among all comparative methods in the experiment. PMID:28230767
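The "fusion level" choice the abstract refers to can be made concrete with a small sketch (signals, features, and scores below are invented for illustration; the paper's DCNN learns which combination of levels to use, which we do not reproduce here): data-level fusion combines raw signals, feature-level fusion combines per-sensor features, and decision-level fusion combines per-sensor verdicts.

```python
import numpy as np

# Illustrative sketch of three common fusion levels for two sensors
# (hypothetical signals; not the paper's adaptive DCNN).

rng = np.random.default_rng(0)
vib = rng.standard_normal(1024)   # vibration channel (toy)
acu = rng.standard_normal(1024)   # acoustic channel (toy)

def features(x):
    # Toy handcrafted features: RMS and a kurtosis-like ratio.
    return np.array([np.sqrt(np.mean(x ** 2)),
                     np.mean(x ** 4) / np.mean(x ** 2) ** 2])

data_level = np.stack([vib, acu])                               # (2, 1024)
feature_level = np.concatenate([features(vib), features(acu)])  # (4,)
scores = np.array([0.7, 0.9])       # per-sensor fault scores (toy)
decision_level = scores.mean()      # fuse decisions into one verdict
```

A fixed pipeline commits to one of these three shapes up front; the adaptive method's contribution is letting the network weight them per task instead.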
Lawrence, Katherine A; Rapee, Ronald M; Cardamone-Breen, Mairead C; Green, Jacqueline; Jorm, Anthony F
2017-01-01
Depression and anxiety disorders in young people are a global health concern. Various risk and protective factors for these disorders are potentially modifiable by parents, underscoring the important role parents play in reducing the risk and impact of these disorders in their adolescent children. However, cost-effective, evidence-based interventions for parents that can be widely disseminated are lacking. In this paper, we propose a multi-level public health approach involving a Web-based parenting intervention, Partners in Parenting (PIP). We describe the components of the Web-based intervention and how each component was developed. Development of the intervention was guided by principles of the persuasive systems design model to maximize parental engagement and adherence. A consumer-engagement approach was used, including consultation with parents and adolescents about the content and presentation of the intervention. The PIP intervention can be used at varying levels of intensity to tailor to the different needs of parents across the population. Challenges and opportunities for the use of the intervention are discussed. The PIP Web-based intervention was developed to address the dearth of evidence-based resources to support parents in their important role in their adolescents’ mental health. The proposed public health approach utilizes this intervention at varying levels of intensity based on parents’ needs. Evaluation of each separate level of the model is ongoing. Further evaluation of the whole approach is required to assess the utility of the intervention as a public health approach, as well as its broader effects on adolescent functioning and socioeconomic outcomes. PMID:29258974
Sexual networks: measuring sexual selection in structured, polyandrous populations.
McDonald, Grant C; James, Richard; Krause, Jens; Pizzari, Tommaso
2013-03-05
Sexual selection is traditionally measured at the population level, assuming that populations lack structure. However, increasing evidence undermines this approach, indicating that intrasexual competition in natural populations often displays complex patterns of spatial and temporal structure. This complexity is due in part to the degree and mechanisms of polyandry within a population, which can influence the intensity and scale of both pre- and post-copulatory sexual competition. Attempts to measure selection at the local and global scale have been made through multi-level selection approaches. However, definitions of local scale are often based on physical proximity, providing a rather coarse measure of local competition, particularly in polyandrous populations where the local scale of pre- and post-copulatory competition may differ drastically from each other. These limitations can be solved by social network analysis, which allows us to define a unique sexual environment for each member of a population: 'local scale' competition, therefore, becomes an emergent property of a sexual network. Here, we first propose a novel quantitative approach to measure pre- and post-copulatory sexual selection, which integrates multi-level selection with information on local scale competition derived as an emergent property of networks of sexual interactions. We then use simple simulations to illustrate the ways in which polyandry can impact estimates of sexual selection. We show that for intermediate levels of polyandry, the proposed network-based approach provides substantially more accurate measures of sexual selection than the more traditional population-level approach. We argue that the increasing availability of fine-grained behavioural datasets provides exciting new opportunities to develop network approaches to study sexual selection in complex societies.
Multiscale modeling of a low magnetostrictive Fe-27wt%Co-0.5wt%Cr alloy
NASA Astrophysics Data System (ADS)
Savary, M.; Hubert, O.; Helbert, A. L.; Baudin, T.; Batonnet, R.; Waeckerlé, T.
2018-05-01
The present paper deals with the improvement of a multi-scale approach describing the magneto-mechanical coupling of a Fe-27wt%Co-0.5wt%Cr alloy. The magnetostriction behavior is shown to be very different (low magnetostriction vs. high magnetostriction) when this material is subjected to two different final annealing conditions after cold rolling. The numerical data obtained from a multi-scale approach are in accordance with the experimental data corresponding to the high magnetostriction level material. A bi-domain structure hypothesis is employed to explain the low magnetostriction behavior, in accordance with the effect of an applied tensile stress. A modification of the multi-scale approach is proposed to match this result.
Epithelial perturbation by inhaled chlorine: Multi-scale mechanistic modeling in rats and humans
Chlorine is a high-production volume, hazardous air pollutant and irritant gas of interest to homeland security. Thus, scenarios of interest for risk characterization range from acute high-level exposures to lower-level chronic exposures. Risk assessment approaches to estimate ...
Systematic Approach to Food Safety Education on the Farm
ERIC Educational Resources Information Center
Shaw, Angela; Strohbehn, Catherine; Naeve, Linda; Domoto, Paul; Wilson, Lester
2015-01-01
Food safety education from farm to end user is essential in the mitigation of food safety concerns associated with fresh produce. Iowa State University developed a multi-disciplinary three-level sequential program ("Know," "Show," "Go") to provide a holistic approach to food safety education. This program provides…
Pedagogical Transitions among Science Teachers: How Does Context Intersect with Teacher Beliefs?
ERIC Educational Resources Information Center
Hamilton, Miriam
2018-01-01
This paper examines attitudes to pedagogical change, among teachers within a second level science department in Ireland. It explores the beliefs and contextual constraints that mediate diversification from a primarily didactic pedagogical approach towards more student-led pedagogies. Using a multi-method approach incorporating observations of…
ERIC Educational Resources Information Center
Mulford, Bill; Silins, Halia
2011-01-01
Purpose: This study aims to present revised models and a reconceptualisation of successful school principalship for improved student outcomes. Design/methodology/approach: The study's approach is qualitative and quantitative, culminating in model building and multi-level statistical analyses. Findings: Principals who promote both capacity building…
Lough, Emma; Fisher, Marisa H
2016-11-01
The current study took a multi-informant approach to compare parent to self-report ratings of social vulnerability of adults with Williams syndrome (WS). Participants included 102 pairs of adults with WS and their parents. Parents completed the Social Vulnerability Questionnaire and adults with WS completed an adapted version of the questionnaire. Parents consistently reported higher levels of social vulnerability for their son/daughter than the individual with WS reported, with the exception of emotional abuse. The lower ratings of social vulnerability by adults with WS, compared to their parents, offer new information about their insight into their own vulnerability. These findings highlight the importance of teaching self-awareness as a part of a multi-informant approach to interventions designed to target social vulnerability.
A multi-standard approach for GIAO (13)C NMR calculations.
Sarotti, Ariel M; Pellegrinet, Silvina C
2009-10-02
The influence of the reference standard employed in the calculation of (13)C NMR chemical shifts was investigated over a large variety of known organic compounds, using different quantum chemistry methods and basis sets. After detailed analysis of the collected data, we found that methanol and benzene are excellent reference standards for computing NMR shifts of sp(3)- and sp-sp(2)-hybridized carbon atoms, respectively. This multi-standard approach (MSTD) performs better than TMS in terms of accuracy and precision and also displays much lower dependence on the level of theory employed. The use of mPW1PW91/6-31G(d)//mPW1PW91/6-31G(d) level is recommended for accurate (13)C NMR chemical shift prediction at low computational cost.
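The MSTD bookkeeping described above amounts to converting a computed isotropic shielding into a chemical shift against a hybridization-matched reference. In the sketch below, the shielding values are invented placeholders; the experimental reference shifts (~50.4 ppm for methanol, ~128.4 ppm for benzene) are approximate literature values.

```python
# Multi-standard (MSTD) referencing sketch: delta(x) = sigma(ref) - sigma(x)
# + delta_exp(ref), with the reference chosen by carbon hybridization.
# Shielding values (sigma_calc) below are toy numbers, not computed results.

REFS = {
    "sp3": {"sigma_calc": 140.0, "delta_exp": 50.4},   # methanol (toy sigma)
    "sp2": {"sigma_calc": 60.0,  "delta_exp": 128.4},  # benzene  (toy sigma)
}
REFS["sp"] = REFS["sp2"]  # sp carbons are also referenced to benzene

def mstd_shift(sigma_calc: float, hybridization: str) -> float:
    """Chemical shift of a carbon from its computed isotropic shielding."""
    ref = REFS[hybridization]
    return ref["sigma_calc"] - sigma_calc + ref["delta_exp"]
```

The single-standard (TMS) scheme is the same formula with one reference for all carbons; MSTD's gain comes purely from the error cancellation of comparing like with like.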
ERIC Educational Resources Information Center
Uiboleht, Kaire; Karm, Mari; Postareff, Liisa
2016-01-01
Teaching approaches in higher education are at the general level well researched and have identified not only the two broad categories of content-focused and learning-focused approaches to teaching but also consonance and dissonance between the aspects of teaching. Consonance means that theoretically coherent teaching practices are employed, but…
A multi-resolution approach to electromagnetic modeling.
NASA Astrophysics Data System (ADS)
Cherevatova, M.; Egbert, G. D.; Smirnov, M. Yu
2018-04-01
We present a multi-resolution approach for three-dimensional magnetotelluric forward modeling. Our approach is motivated by the fact that fine grid resolution is typically required at shallow levels to adequately represent near-surface inhomogeneities, topography, and bathymetry, while a much coarser grid may be adequate at depth, where the diffusively propagating electromagnetic fields are much smoother. This is especially true for the forward modeling required in regularized inversion, where conductivity variations at depth are generally very smooth. With a conventional structured finite-difference grid, the fine discretization required to adequately represent rapid variations near the surface is continued to all depths, resulting in higher computational costs. Increasing the computational efficiency of the forward modeling is especially important for solving regularized inversion problems. We implement a multi-resolution finite-difference scheme that allows us to decrease the horizontal grid resolution with depth, as is done with vertical discretization. In our implementation, the multi-resolution grid is represented as a vertical stack of sub-grids, with each sub-grid being a standard Cartesian tensor-product staggered grid. Thus, our approach is similar to the octree discretization previously used for electromagnetic modeling, but simpler in that we allow refinement only with depth. The major difficulty arose in deriving the forward modeling operators on interfaces between adjacent sub-grids. We considered three ways of handling the interface layers and suggest a preferable one, which yields accuracy similar to the staggered-grid solution while retaining the symmetry of the coefficient matrix. A comparison between multi-resolution and staggered solvers for various models shows that the multi-resolution approach improves computational efficiency without compromising the accuracy of the solution.
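The payoff of the vertical-stack layout can be illustrated with simple cell counting (grid sizes and the coarsening factor below are hypothetical, not taken from the paper): halving the horizontal resolution at each sub-grid interface shrinks deeper sub-grids by 4x each, so the stack costs far fewer cells than carrying the surface resolution to all depths.

```python
# Toy cell-count comparison for a multi-resolution stack of sub-grids
# (illustrative sizes; assumes nx, ny divisible by factor**level).

def mr_grid(nx, ny, layers_per_subgrid, n_subgrids, factor=2):
    """Return (nx, ny, nz) for each sub-grid, top (finest) to bottom."""
    grids = []
    for level in range(n_subgrids):
        f = factor ** level
        grids.append((nx // f, ny // f, layers_per_subgrid))
    return grids

def cell_count(grids):
    return sum(nx * ny * nz for nx, ny, nz in grids)

mr = mr_grid(128, 128, 10, 3)    # horizontal resolution coarsens with depth
uniform = [(128, 128, 10)] * 3   # equivalent single-resolution grid
```

Here the stack uses 215,040 cells against 491,520 for the uniform grid, i.e. under half the unknowns before any solver work begins; the hard part the paper addresses is the operator stencils at the sub-grid interfaces, which this sketch ignores.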
NASA Astrophysics Data System (ADS)
Rico, Antonio; Noguera, Manuel; Garrido, José Luis; Benghazi, Kawtar; Barjis, Joseph
2016-05-01
Multi-tenant architectures (MTAs) are considered a cornerstone in the success of Software as a Service as a new application distribution formula. Multi-tenancy allows multiple customers (i.e. tenants) to be consolidated into the same operational system. This way, tenants run and share the same application instance as well as its costs, which are significantly reduced. Functional needs vary from one tenant to another: companies from different sectors run different types of applications, or, even when deploying the same functionality, they differ in the extent of its complexity. In any case, MTA leaves one major concern regarding the companies' data, their privacy and security, which requires special attention to the data layer. In this article, we propose an extended data model that enhances traditional MTAs with respect to this concern. This extension - called multi-target - allows MT applications to host, manage and serve multiple functionalities within the same multi-tenant (MT) environment. The practical deployment of this approach will allow SaaS vendors to target multiple markets or address different levels of functional complexity and yet commercialise just one single MT application. The applicability of the approach is demonstrated via a case study of a real multi-tenancy multi-target (MT2) implementation, called Globalgest.
ERIC Educational Resources Information Center
Francis-Thompson, Nyshawana
2017-01-01
This qualitative study examined how Multi-tier System of Supports (MTSS), a systematic approach to providing academic and behavioral supports to students, was implemented and experienced by macro and micro levels of educators in the Bermuda Public School system. I asked three research questions regarding: (a) how MTSS was being implemented in the…
Are hotspots of evolutionary potential adequately protected in southern California?
Vandergast, A.G.; Bohonak, A.J.; Hathaway, S.A.; Boys, J.; Fisher, R.N.
2008-01-01
Reserves are often designed to protect rare habitats, or "typical" exemplars of ecoregions and geomorphic provinces. This approach focuses on current patterns of organismal and ecosystem-level biodiversity, but typically ignores the evolutionary processes that control the gain and loss of biodiversity at these and other levels (e.g., genetic, ecological). In order to include evolutionary processes in conservation planning efforts, their spatial components must first be identified and mapped. We describe a GIS-based approach for explicitly mapping patterns of genetic divergence and diversity for multiple species (a "multi-species genetic landscape"). Using this approach, we analyzed mitochondrial DNA datasets from 21 vertebrate and invertebrate species in southern California to identify areas with common phylogeographic breaks and high intrapopulation diversity. The result is an evolutionary framework for southern California within which patterns of genetic diversity can be analyzed in the context of historical processes, future evolutionary potential and current reserve design. Our multi-species genetic landscapes pinpoint six hotspots where interpopulation genetic divergence is consistently high, five evolutionary hotspots within which genetic connectivity is high, and three hotspots where intrapopulation genetic diversity is high. These 14 hotspots can be grouped into eight geographic areas, of which five largely are unprotected at this time. The multi-species genetic landscape approach may provide an avenue to readily incorporate measures of evolutionary process into GIS-based systematic conservation assessment and land-use planning.
Few, Roger; Lake, Iain; Hunter, Paul R; Tran, Pham Gia; Thien, Vu Trong
2009-12-21
Understanding how risks to human health change as a result of seasonal variations in environmental conditions is likely to become of increasing importance in the context of climatic change, especially in lower-income countries. A multi-disciplinary approach can be a useful tool for improving understanding, particularly in situations where existing data resources are limited but the environmental health implications of seasonal hazards may be high. This short article describes a multi-disciplinary approach combining analysis of changes in levels of environmental contamination, seasonal variations in disease incidence and a social scientific analysis of health behaviour. The methodology was field-tested in a peri-urban environment in the Mekong Delta, Vietnam, where poor households face alternate seasonal extremes in the local environment as the water level in the Delta changes from flood to dry season. Low-income households in the research sites rely on river water for domestic uses, including provision of drinking water, and it is commonly perceived that the seasonal changes alter risk from diarrhoeal diseases and other diseases associated with contamination of water. The discussion focuses on the implementation of the methodology in the field, and draws lessons from the research process that can help in refining and developing the approach for application in other locations where seasonal dynamics of disease risk may have important consequences for public health.
A multi-scale framework to link remotely sensed metrics with socioeconomic data
NASA Astrophysics Data System (ADS)
Watmough, Gary; Svenning, Jens-Christian; Palm, Cheryl; Sullivan, Clare; Danylo, Olha; McCallum, Ian
2017-04-01
There is increasing interest in the use of remotely sensed satellite data for estimating human poverty, as it can bridge data gaps that prevent fine-scale monitoring of development goals across large areas. The ways in which metrics derived from satellite imagery are linked with socioeconomic data are crucial for accurate estimation of poverty. Yet, to date, the approaches in the literature linking satellite metrics with socioeconomic data are poorly characterized. Typically, studies use a GIS construct such as circular buffer zones around a village or household, or an administrative boundary such as a district or census enumeration area. These polygons are then used to extract environmental data from satellite imagery, which are related to the socioeconomic data in statistical analyses. The use of a single polygon to link environmental and socioeconomic data is inappropriate in coupled human-natural systems, as processes operate over multiple scales. Human interactions with the environment occur at multiple levels, from individual (household) access to agricultural plots adjacent to homes, to communal access to common pool resources (CPR) such as forests at the village level. Here, we present a multi-scale framework that explicitly considers how people use the landscape. The framework is presented along with a case study example in Kenya. The multi-scale approach could enhance the modelling of human-environment interactions, which will have important consequences for monitoring the sustainable development goals for human livelihoods and biodiversity conservation.
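The core extraction step of such a framework can be sketched as follows (the raster, location, and buffer radii are invented for illustration; the actual framework defines scales from how households use the landscape, not from fixed pixel radii): the same environmental layer is summarized at several nested scales around a household rather than within one arbitrary polygon.

```python
import numpy as np

# Multi-scale extraction sketch: summarize a raster within nested buffers
# around a household location (toy data and radii; not the Kenya case study).

def buffer_mean(raster, row, col, radius):
    """Mean raster value within a circular buffer centred on (row, col)."""
    rr, cc = np.ogrid[:raster.shape[0], :raster.shape[1]]
    mask = (rr - row) ** 2 + (cc - col) ** 2 <= radius ** 2
    return float(raster[mask].mean())

ndvi = np.linspace(0.0, 1.0, 100 * 100).reshape(100, 100)  # fake greenness
plot_scale = buffer_mean(ndvi, 50, 50, 3)      # household-plot scale
village_scale = buffer_mean(ndvi, 50, 50, 25)  # communal-resource scale
```

Feeding both `plot_scale` and `village_scale` into a statistical model, instead of a single-polygon summary, is what lets household-level and village-level environmental effects be estimated separately.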
Single-mode glass waveguide technology for optical interchip communication on board level
NASA Astrophysics Data System (ADS)
Brusberg, Lars; Neitz, Marcel; Schröder, Henning
2012-01-01
The large bandwidth demand in long-distance telecom networks leads to single-mode fiber interconnects as a result of low dispersion, low loss and dense wavelength multiplexing possibilities. In contrast, multi-mode interconnects are suitable for much shorter lengths, up to 300 meters, and are promising for optical links between racks and on board level. Active optical cables based on multi-mode fiber links are on the market, and research in multi-mode waveguide integration on board level is still going on. Compared to multi-mode, a single-mode waveguide has much more integration potential because its core diameter is around 20% of a multi-mode waveguide's while offering much larger bandwidth. However, light coupling into single-mode waveguides is much more challenging because of tighter coupling tolerances. Together with silicon photonics technology, a board-level single-mode waveguide technology is the straightforward development goal for chip-to-chip optical interconnect integration. Such a hybrid packaging platform providing 3D optical single-mode links bridges the gap between novel photonic integrated circuits and the glass-fiber-based long-distance telecom networks. In the following, we introduce our 3D photonic packaging approach based on thin glass substrates with planar integrated optical single-mode waveguides for fiber-to-chip and chip-to-chip interconnects. This novel packaging approach merges micro-system packaging and glass integrated optics. It consists of a thin glass substrate with planar integrated single-mode waveguide circuits, optical mirrors and lenses, providing an integration platform for photonic IC assembly and optical fiber interconnects. Thin glass is commercially available in panel and wafer formats and exhibits excellent optical and high-frequency properties, which makes it well suited for microsystem packaging. The paper presents recent results in single-mode waveguide technology on wafer level and waveguide characterization. Furthermore, the integration into a hybrid packaging process and design issues are discussed.
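The small single-mode core size mentioned above follows from standard waveguide theory, which can be checked with a back-of-the-envelope calculation (the radii and numerical apertures below are typical textbook values, not measurements from this paper): a step-index waveguide is single-mode when its normalized frequency V = (2πa/λ)·NA stays below ~2.405.

```python
import math

# Single-mode condition sketch for a step-index core (illustrative values).

def v_number(core_radius_um, wavelength_um, numerical_aperture):
    """Normalized frequency V of a step-index waveguide."""
    return 2 * math.pi * core_radius_um / wavelength_um * numerical_aperture

def is_single_mode(core_radius_um, wavelength_um, na):
    # Single-mode when V is below the first zero of the Bessel function J0.
    return v_number(core_radius_um, wavelength_um, na) < 2.405

sm = is_single_mode(4.1, 1.55, 0.12)   # small core at 1550 nm: single-mode
mm = is_single_mode(25.0, 1.55, 0.20)  # typical multi-mode core: not
```

The same small V-number that guarantees a single guided mode is also what shrinks the alignment tolerances, which is exactly the coupling challenge the abstract highlights.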
Push pull microfluidics on a multi-level 3D CD.
Thio, Tzer Hwai Gilbert; Ibrahim, Fatimah; Al-Faqheri, Wisam; Moebius, Jacob; Khalid, Noor Sakinah; Soin, Norhayati; Kahar, Maria Kahar Bador Abdul; Madou, Marc
2013-08-21
A technique known as thermo-pneumatic (TP) pumping is used to pump fluids on a microfluidic compact disc (CD) back towards the CD center, against the centrifugal force that pushes liquids from the center to the perimeter of the disc. Trapped air expands in a TP air chamber during heating, and this creates positive pressure on liquids located in chambers connected to that chamber. While the TP air chamber and connecting channels are easy to fabricate in a one-level CD manufacturing technique, this approach provides only one-way pumping between two chambers, is real-estate hungry and leads to unnecessary heating of liquids in close proximity to the TP chamber. In this paper, we present a novel TP push and pull pumping method which allows for pumping of liquid in any direction between two connected liquid chambers. To ensure that the implementation of TP push and pull pumping also addresses the space and heating challenges, a multi-level 3D CD design is developed, and localized forced convection heating, rather than infra-red (IR) heating, is applied. On a multi-level 3D CD, the TP features are placed on a top level, separate from the rest of the microfluidic processes that are implemented on a lower level. This approach allows for heat shielding of the microfluidic process level and efficient usage of space on the CD for centrifugal handling of liquids. The use of localized forced convection heating, rather than the infra-red (IR) or laser heating of earlier implementations, allows not only for TP pumping of liquids while the CD is spinning but also makes heat insulation for TP pumping and other fluidic functions easier. To aid in future implementations of TP push and pull pumping on a multi-level 3D CD, a study of CD surface heating is also presented. In this contribution, we also demonstrate an advanced application of pull pumping through the implementation of valve-less switch pumping.
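The pumping pressure available from heating the trapped air can be estimated with the ideal gas law (the temperatures and the constant-volume assumption below are ours for illustration, not values from the paper): heating a sealed chamber at roughly constant volume gives P2 = P1·T2/T1, and the overpressure P2 - P1 is what pushes liquid back toward the CD centre.

```python
# Back-of-the-envelope TP pumping pressure (illustrative temperatures;
# assumes an ideally sealed chamber heated at constant volume).

def tp_overpressure_kpa(p1_kpa=101.325, t1_c=25.0, t2_c=65.0):
    """Overpressure (kPa) from isochoric heating of trapped air."""
    t1, t2 = t1_c + 273.15, t2_c + 273.15  # convert to kelvin
    p2 = p1_kpa * t2 / t1                  # ideal gas law at constant volume
    return p2 - p1_kpa

dp = tp_overpressure_kpa()  # roughly 13-14 kPa for a 40 C temperature rise
```

In practice some of this pressure is spent displacing liquid (the volume is not truly constant), so the figure is an upper bound on the driving pressure rather than a prediction.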
ERIC Educational Resources Information Center
Fielding, A.
1995-01-01
Reanalyzes H. Thomas's 1980s data, which used teaching group as the unit of analysis and illuminated some institutional disparities in provision of General Certificate of Education (GCE) A-levels. Uses multilevel analysis to focus on individual students in a hierarchical framework. Among the study institutions, school sixth forms appear less…
Using Nonlinear Programming in International Trade Theory: The Factor-Proportions Model
ERIC Educational Resources Information Center
Gilbert, John
2004-01-01
Students at all levels benefit from a multi-faceted approach to learning abstract material. The most commonly used technique in teaching the pure theory of international trade is a combination of geometry and algebraic derivations. Numerical simulation can provide a valuable third support to these approaches. The author describes a simple…
Multi-Level Alignment Model: Transforming Face-to-Face into E-Instructional Programs
ERIC Educational Resources Information Center
Byers, Celina
2005-01-01
Purpose--To suggest to others in the field an approach equally valid for transforming existing courses into online courses and for creating new online courses. Design/methodology/approach--Using the literature for substantiation, this article discusses the current rapid change within organizations, the role of technology in that change, and the…
Zhou, Yuan; Shi, Tie-Mao; Hu, Yuan-Man; Gao, Chang; Liu, Miao; Song, Lin-Qi
2011-12-01
Based on geographic information system (GIS) technology and a multi-objective location-allocation (LA) model, and considering four relatively independent objective factors (population density level, air pollution level, urban heat island effect level, and urban land use pattern), an optimized location selection for urban parks within the Third Ring of Shenyang was conducted, and the selection results were compared with the spatial distribution of existing parks, with the aim of evaluating the rationality of the spatial distribution of urban green spaces. In the location selection of urban green spaces in the study area, air pollution was the most important factor, and, compared with a single objective factor, the weighted analysis of multi-objective factors provided better-optimized spatial location selection of new urban green spaces. The combination of GIS technology with the LA model offers a new approach for the spatial optimization of urban green spaces.
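The weighted multi-objective evaluation described above can be sketched as a simple weighted-overlay score. The weights and factor scores below are purely hypothetical (the factor names follow the abstract, the numbers are invented), not the study's calibrated values:

```python
# Hypothetical weights and normalized factor scores per candidate site;
# the factor names follow the abstract, the numbers are invented.
weights = {"population": 0.20, "air_pollution": 0.40,
           "heat_island": 0.25, "land_use": 0.15}
candidates = {
    "site_A": {"population": 0.8, "air_pollution": 0.9,
               "heat_island": 0.6, "land_use": 0.5},
    "site_B": {"population": 0.6, "air_pollution": 0.4,
               "heat_island": 0.9, "land_use": 0.7},
}

def suitability(site):
    """Weighted multi-objective suitability score for one candidate."""
    return sum(weights[f] * site[f] for f in weights)

best = max(candidates, key=lambda name: suitability(candidates[name]))
print(best, round(suitability(candidates[best]), 3))  # -> site_A 0.745
```

In a GIS setting the same score would be computed per raster cell or polygon; the higher weight on air pollution here mirrors the abstract's finding that it dominated the selection.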
NASA Astrophysics Data System (ADS)
Hochrainer-Stigler, Stefan; Lorant, Anna
2018-01-01
Disaster risk is increasingly recognized as a major development challenge. Recent calls emphasize the need to proactively engage in disaster risk reduction, as well as to establish new partnerships between private and public sector entities in order to decrease current and future risks. Very often such potential partnerships have to meet different objectives reflecting the priorities of the stakeholders involved. Consequently, potential partnerships need to be assessed on multiple criteria to determine the weakest links and greatest threats in collaboration. This paper takes a supranational multi-sector partnership perspective, and considers possible ways to enhance disaster risk management in the European Union through better coordination between the European Union Solidarity Fund, risk reduction efforts, and insurance mechanisms. Based on flood risk estimates, we employ a risk-layer approach to determine a set of options for new partnerships and test them in a high-level workshop via a novel multi-criteria approach based on cardinal ranking. While transformative changes receive good overall scores, we also find that the incorporation of risk into budget planning is an essential condition for successful partnerships.
NASA Astrophysics Data System (ADS)
Lee, Chang-Chun; Huang, Pei-Chen
2018-05-01
The long-term reliability of multi-stacked coatings under bending or rolling loads is a severe challenge to extending the lifespan of such structures. In addition, the adhesive strength between dissimilar materials is regarded as the major mechanical reliability concern among multi-stacked films. However, the significant scale mismatch, from several nanometers to micrometers, among the multi-stacked coatings causes numerical accuracy and convergence issues in fracture-based simulation approaches. For these reasons, this study proposes FEA-based multi-level submodeling and a multi-point constraint (MPC) technique to overcome the scale-mismatch issue. The results indicate that, with a suitable choice of region, first- and second-order submodeling achieved a small error of 1.27% compared with the experimental result while significantly reducing mesh density and computing time. Moreover, the MPC method adopted in the FEA simulation showed only a 0.54% error when the boundary of the selected local region was placed away from the critical region of concern, in accordance with Saint-Venant's principle. In this investigation, two FEA-based approaches were used to overcome the evident scale-mismatch issue when the adhesive strengths of micro- and nano-scale multi-stacked coatings were taken into account.
Braithwaite, Jeffrey; Westbrook, Johanna; Pawsey, Marjorie; Greenfield, David; Naylor, Justine; Iedema, Rick; Runciman, Bill; Redman, Sally; Jorm, Christine; Robinson, Maureen; Nathan, Sally; Gibberd, Robert
2006-01-01
Background Accreditation has become ubiquitous across the international health care landscape. Award of full accreditation status in health care is viewed, as it is in other sectors, as a valid indicator of high quality organisational performance. However, few studies have empirically demonstrated this assertion. The value of accreditation, therefore, remains uncertain, and this persists as a central legitimacy problem for accreditation providers, policymakers and researchers. The question arises as to how best to research the validity, impact and value of accreditation processes in health care. Most health care organisations participate in some sort of accreditation process and thus it is not possible to study its merits using a randomised controlled strategy. Further, tools and processes for accreditation and organisational performance are multifaceted. Methods/design To understand the relationship between them, a multi-method research approach is required which incorporates both quantitative and qualitative data. The generic nature of accreditation standard development and inspection within different sectors enhances the extent to which the findings of an in-depth study of the accreditation process in one industry can be generalised to other industries. This paper presents a research design which comprises a prospective, multi-method, multi-level, multi-disciplinary approach to assess the validity, impact and value of accreditation. Discussion The accreditation program which assesses over 1,000 health services in Australia is used as an exemplar for testing this design. The paper proposes this design as a framework suitable for application to future international research into accreditation. Our aim is to stimulate debate on the role of accreditation and how to research it. PMID:16968552
NASA Astrophysics Data System (ADS)
Randrianalisoa, Jaona; Haussener, Sophia; Baillis, Dominique; Lipiński, Wojciech
2017-11-01
Radiative heat transfer is analyzed in participating media consisting of long cylindrical fibers with diameters in the limit of geometrical optics. The absorption and scattering coefficients and the scattering phase function of the medium are determined based on the discrete-level medium geometry and the optical properties of individual fibers. The fibers are assumed to be randomly oriented and positioned inside the medium. Two approaches are employed: a volume-averaged two-intensity approach referred to as the multi-RTE approach, and a homogenized single-intensity approach referred to as the single-RTE approach. Both approaches require effective properties, determined using direct Monte Carlo ray tracing techniques. The macroscopic radiative transfer equations (for a single intensity or two volume-averaged intensities) with the corresponding effective properties are solved using Monte Carlo techniques and allow for the determination of the radiative flux distribution as well as the overall transmittance and reflectance of the medium. The results are compared against predictions by direct Monte Carlo simulation on the exact morphology. The effects of fiber volume fraction and optical properties on the effective radiative properties and the overall slab radiative characteristics are investigated. The single-RTE approach gives accurate predictions for high-porosity fibrous media (porosity of about 95%). The multi-RTE approach is recommended for isotropic fibrous media with porosity in the range of 79-95%.
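As a rough illustration of the homogenized single-RTE idea, a minimal Monte Carlo photon-tracking sketch for a homogeneous slab (isotropic scattering, arbitrary optical depth and albedo; a toy analogue, not the authors' code) might look like:

```python
import math
import random

def mc_slab(tau, omega, n_photons=20000, seed=1):
    """Monte Carlo estimate of transmittance and reflectance of a
    homogeneous slab with optical depth tau, single-scattering albedo
    omega, and isotropic scattering; a toy analogue of the homogenized
    single-RTE treatment, not the authors' code."""
    rng = random.Random(seed)
    transmitted = reflected = 0
    for _ in range(n_photons):
        x, mu = 0.0, 1.0    # optical-depth position, direction cosine
        while True:
            # sample a free path in optical-depth units
            x += mu * -math.log(1.0 - rng.random())
            if x >= tau:
                transmitted += 1
                break
            if x <= 0.0:
                reflected += 1
                break
            if rng.random() > omega:          # interaction is an absorption
                break
            mu = 2.0 * rng.random() - 1.0     # isotropic scattering direction
    return transmitted / n_photons, reflected / n_photons

T, R = mc_slab(tau=1.0, omega=0.9)
print(f"T = {T:.3f}, R = {R:.3f}")
```

For omega = 0 (no scattering) the estimate collapses to Beer's law, T ≈ exp(-tau), which gives a quick sanity check on the sampler.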
Improving survey response rates from parents in school-based research using a multi-level approach.
Schilpzand, Elizabeth J; Sciberras, Emma; Efron, Daryl; Anderson, Vicki; Nicholson, Jan M
2015-01-01
While schools can provide a comprehensive sampling frame for community-based studies of children and their families, recruitment is challenging. Multi-level approaches which engage multiple school stakeholders have been recommended, but few studies have documented their effects. This paper compares the impact of a standard versus an enhanced engagement approach on multiple indicators of recruitment: parent response rates, response times, reminders required and sample characteristics. A brief screening questionnaire was distributed to parents and teachers as the first step in recruitment to a longitudinal study, with two cohorts recruited in consecutive years (cohort 1 in 2011, cohort 2 in 2012). For cohort 2, additional engagement strategies included the use of pre-notification postcards, improved study materials, and recruitment progress graphs provided to school staff. Chi-square and t-tests were used to examine cohort differences. Compared to cohort 1, a higher proportion of cohort 2 parents responded to the survey (76% versus 69%; p < 0.001), consented to participate (71% versus 56%; p < 0.001), agreed to teacher participation (90% versus 82%; p < 0.001) and agreed to follow-up contact (91% versus 80%; p < 0.001). Fewer cohort 2 parents required reminders (52% versus 63%; p < 0.001), and cohort 2 parents responded more promptly than cohort 1 parents (mean difference: 19.4 days, 95% CI: 18.0 to 20.9, p < 0.001). These results illustrate the value of investing in a relatively simple multi-level strategy to maximise parent response rates, and potentially reduce recruitment time and costs.
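The cohort comparisons of response proportions rest on a 2x2 chi-square test, which can be sketched as follows; the counts are illustrative round numbers, not the study's actual data:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Illustrative round numbers (not the study's actual counts):
# responders vs non-responders, 1000 parents per cohort.
stat = chi2_2x2(690, 310, 760, 240)  # cohort 1: 69%, cohort 2: 76%
print(f"chi-square = {stat:.2f}")    # well above 3.84, the 5% critical value for 1 df
```

With 1 degree of freedom, any statistic above 3.84 is significant at the 5% level, consistent with the p < 0.001 results reported for much larger differences.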
Integrating Climate Projections into Multi-Level City Planning: A Texas Case Study
NASA Astrophysics Data System (ADS)
Hayhoe, K.; Gelca, R.; Baumer, Z.; Gold, G.
2016-12-01
Climate change impacts on energy and water are a serious concern for many cities across the United States. Regional projections from the National Assessment process, or state-specific efforts as in California and Delaware, are typically used to quantify impacts at the regional scale. However, these are often insufficient to provide information at the scale of decision-making for an individual city. Here, we describe a multi-level approach to developing and integrating usable climate information into planning, using a case study from the City of Austin in Texas, a state where few official climate resources are available. Spearheaded by the Office of Sustainability in collaboration with Austin Water, the first step was to characterize observed trends and future projections of how global climate change might affect Austin's current climate. The City then assembled a team of city experts, consulting engineers, and climate scientists to develop a methodology for assessing impacts on regional hydrology as part of its Integrated Water Resource Plan, Austin's 100-year water supply and demand planning effort. This effort included calculating a range of climate indicators and developing and evaluating a new approach to generating climate inputs, including daily streamflow and evaporation, for existing water availability models. This approach, which brings together a range of public, private, and academic experts to support a stakeholder-initiated planning effort, provides concrete insights into the critical importance of multi-level, long-term engagement for the development and application of actionable climate science at the local to regional scale.
Butun, Ismail; Ra, In-Ho; Sankar, Ravi
2015-01-01
In this work, an intrusion detection system (IDS) framework based on multi-level clustering for hierarchical wireless sensor networks is proposed. The framework employs two types of intrusion detection approaches: (1) “downward-IDS (D-IDS)” to detect the abnormal behavior (intrusion) of the subordinate (member) nodes; and (2) “upward-IDS (U-IDS)” to detect the abnormal behavior of the cluster heads. By using analytical calculations, the optimum parameters for the D-IDS (number of maximum hops) and U-IDS (monitoring group size) of the framework are evaluated and presented. PMID:26593915
Three essays on multi-level optimization models and applications
NASA Astrophysics Data System (ADS)
Rahdar, Mohammad
The general form of a multi-level mathematical programming problem is a set of nested optimization problems, in which each level controls a series of decision variables independently. However, the value of those decision variables may also impact the objective functions of other levels. A two-level model is called a bilevel model and can be considered a Stackelberg game with a leader and a follower. The leader anticipates the response of the follower and optimizes its objective function, and the follower then reacts to the leader's action. The multi-level decision-making model has many real-world applications, such as government decisions, energy policies, market economies, and network design. However, there is a lack of algorithms capable of solving medium- and large-scale problems of this type. The dissertation is devoted to both theoretical research and applications of multi-level mathematical programming models, and consists of three parts, each in paper format. The first part studies the renewable energy portfolio under two major renewable energy policies. The potential competition for biomass in the growth of the renewable energy portfolio in the United States, and other interactions between the two policies over the next twenty years, are investigated. This problem has two main levels of decision makers: the government/policy makers, and biofuel producers/electricity generators/farmers. We focus on the lower-level problem to predict the amount of capacity expansion, fuel production, and power generation. In the second part, we address uncertainty over demand and lead time in a multi-stage mathematical programming problem. We propose a two-stage tri-level optimization model within a rolling-horizon approach to reduce the dimensionality of the multi-stage problem. In the third part of the dissertation, we introduce a new branch-and-bound algorithm to solve bilevel linear programming problems.
The total time is reduced by solving a smaller relaxation problem in each node and decreasing the number of iterations. Computational experiments show that the proposed algorithm is faster than the existing ones.
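A minimal illustration of the bilevel (Stackelberg) structure, using a made-up quadratic game whose follower best response is analytic, might read:

```python
def follower_best_response(x):
    """Lower level: the follower maximizes f(x, y) = -(y - x/2)**2,
    so the best response is y* = x/2 (a made-up analytic example)."""
    return x / 2.0

def leader_payoff(x):
    """Upper level: the leader maximizes F(x) = -(x - 4)**2 - y*,
    anticipating the follower's reaction (Stackelberg structure)."""
    return -(x - 4.0) ** 2 - follower_best_response(x)

# Grid search over the leader's decision; analytically,
# F'(x) = -2(x - 4) - 1/2 = 0 gives x* = 3.75.
best_x = max((k / 100.0 for k in range(0, 801)), key=leader_payoff)
print(best_x)  # -> 3.75
```

Real bilevel solvers replace the analytic response and grid search with KKT reformulations or branch-and-bound, as in the dissertation's third part, but the leader-anticipates-follower logic is the same.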
Kim, Dokyoon; Joung, Je-Gun; Sohn, Kyung-Ah; Shin, Hyunjung; Park, Yu Rang; Ritchie, Marylyn D; Kim, Ju Han
2015-01-01
Objective Cancer can involve gene dysregulation via multiple mechanisms, so no single level of genomic data fully elucidates tumor behavior due to the presence of numerous genomic variations within or between levels in a biological system. We have previously proposed a graph-based integration approach that combines multi-omics data including copy number alteration, methylation, miRNA, and gene expression data for predicting clinical outcome in cancer. However, genomic features likely interact with other genomic features in complex signaling or regulatory networks, since cancer is caused by alterations in pathways or complete processes. Methods Here we propose a new graph-based framework for integrating multi-omics data and genomic knowledge to improve power in predicting clinical outcomes and elucidate interplay between different levels. To highlight the validity of our proposed framework, we used an ovarian cancer dataset from The Cancer Genome Atlas for predicting stage, grade, and survival outcomes. Results Integrating multi-omics data with genomic knowledge to construct pre-defined features resulted in higher performance in clinical outcome prediction and higher stability. For the grade outcome, the model with gene expression data produced an area under the receiver operating characteristic curve (AUC) of 0.7866. However, models of the integration with pathway, Gene Ontology, chromosomal gene set, and motif gene set consistently outperformed the model with genomic data only, attaining AUCs of 0.7873, 0.8433, 0.8254, and 0.8179, respectively. Conclusions Integrating multi-omics data and genomic knowledge to improve understanding of molecular pathogenesis and underlying biology in cancer should improve diagnostic and prognostic indicators and the effectiveness of therapies. PMID:25002459
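The reported AUCs can be computed from predicted scores via the rank-based (Mann-Whitney) formulation; this is a small sketch with toy scores, not the paper's data:

```python
def auc(pos_scores, neg_scores):
    """Empirical AUC via the Mann-Whitney formulation: the fraction of
    (positive, negative) score pairs ranked correctly, ties counting half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy predicted risk scores for cases and controls (not the paper's data)
print(auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.3]))  # -> 0.8888888888888888
```

This O(n*m) pairwise form is fine for illustration; production code would sort once and use ranks for large sample sizes.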
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of the existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches to risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in the identification of critical events, components, and systems that contribute to the overall risk.
Validation of any event or component on the critical path is relatively more important in a risk-informed environment. Significance of multi-hazard risk is also illustrated for uncorrelated hazards of earthquakes and high winds which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
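A toy example of why a Bayesian network helps with correlated hazards (e.g., seismically induced flooding): all probabilities below are hypothetical, chosen only to illustrate the idea, not values from the dissertation.

```python
# All probabilities are hypothetical, purely illustrative of the
# Bayesian-network idea; they are not values from the dissertation.
p_quake = 0.01                  # P(major seismic event during service life)
p_flood_given = {True: 0.30,    # P(flooding | quake): seismically induced flooding
                 False: 0.02}   # P(flooding | no quake)
p_fail = {(True, True): 0.50, (True, False): 0.10,
          (False, True): 0.05, (False, False): 0.001}

def p_failure():
    """Marginal P(failure) by enumerating the joint distribution of the
    two dependent hazard nodes; a fault tree that assumed independence
    between quake and flood would misestimate this."""
    total = 0.0
    for quake in (True, False):
        pq = p_quake if quake else 1.0 - p_quake
        for flood in (True, False):
            pf = p_flood_given[quake] if flood else 1.0 - p_flood_given[quake]
            total += pq * pf * p_fail[(quake, flood)]
    return total

print(f"P(failure) = {p_failure():.5f}")  # -> P(failure) = 0.00416
```

Full enumeration scales poorly, which is why real Bayesian-network tools use factored inference, but for two binary hazard nodes it makes the dependency structure explicit.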
Multi-level gene/MiRNA feature selection using deep belief nets and active learning.
Ibrahim, Rania; Yousri, Noha A; Ismail, Mohamed A; El-Makky, Nagwa M
2014-01-01
Selecting the most discriminative genes/miRNAs has been raised as an important task in bioinformatics, both to enhance disease classifiers and to mitigate the curse of dimensionality. Traditional feature selection methods choose genes/miRNAs based on their individual merits, regardless of how they perform together. Considering group features instead of individual ones provides a better basis for selecting the most informative genes/miRNAs. Recently, deep learning has proven its ability to represent data at multiple levels of abstraction, allowing for better discrimination between different classes. However, the idea of using deep learning for feature selection is not yet widely used in the bioinformatics field. In this paper, a novel multi-level feature selection approach named MLFS is proposed for selecting genes/miRNAs based on expression profiles. The approach is based on both deep and active learning. Moreover, an extension of the technique to miRNAs is presented that considers the biological relation between miRNAs and genes. Experimental results show that the approach outperformed classical feature selection methods in hepatocellular carcinoma (HCC) by 9%, lung cancer by 6% and breast cancer by around 10% in F1-measure. Results also show the improvement in F1-measure of our approach over recent related work in [1] and [2].
Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?
NASA Astrophysics Data System (ADS)
Asadzadeh, M.; Sahraei, S.
2016-12-01
Multi-objective optimization (MO) aids in supporting the decision making process in water resources engineering and design problems. One of the main goals of solving a MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution at each grid-cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions that already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
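The two archiving strategies compared in the abstract can be sketched as follows; note that floor-based epsilon boxes and rounding can disagree for solutions near cell boundaries (the grid sizes and sample points below are arbitrary):

```python
def eps_box(objs, eps):
    """Grid-cell index of a solution under epsilon dominance (minimization)."""
    return tuple(int(v // e) for v, e in zip(objs, eps))

def archive_eps(solutions, eps):
    """Keep at most one solution per epsilon grid cell (first-come wins; a
    full epsilon archive would also compare solutions within a cell)."""
    kept = {}
    for s in solutions:
        kept.setdefault(eps_box(s, eps), s)
    return list(kept.values())

def archive_rounded(solutions, eps):
    """The cheaper alternative: round each objective to the desired
    precision first, then deduplicate on the rounded values."""
    kept = {}
    for s in solutions:
        kept.setdefault(tuple(round(v / e) * e for v, e in zip(s, eps)), s)
    return list(kept.values())

sols = [(0.12, 3.4), (0.14, 3.4), (0.87, 1.1), (0.84, 1.12)]
print(len(archive_eps(sols, (0.1, 0.5))))      # -> 2
print(len(archive_rounded(sols, (0.1, 0.5))))  # -> 3
```

On this sample the floor-based boxes merge (0.87, 1.1) and (0.84, 1.12) into one cell, while rounding splits them across a cell boundary, so the two schemes can archive different sets even at the same nominal precision.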
Clinical Assessment of Risk Management: an INtegrated Approach (CARMINA).
Tricarico, Pierfrancesco; Tardivo, Stefano; Sotgiu, Giovanni; Moretti, Francesca; Poletti, Piera; Fiore, Alberto; Monturano, Massimo; Mura, Ida; Privitera, Gaetano; Brusaferro, Silvio
2016-08-08
Purpose - The European Union recommendations for patient safety call for shared clinical risk management (CRM) safety standards able to guide organizations in CRM implementation. The purpose of this paper is to develop a self-evaluation tool to measure healthcare organization performance on CRM and guide improvements over time. Design/methodology/approach - A multi-step approach was implemented including: a systematic literature review; consensus meetings with an expert panel from eight Italian leader organizations to reach agreement on the first version; field testing to assess the instrument's feasibility and flexibility; and a Delphi strategy with a second expert panel for content validation and development of a balanced scoring system. Findings - The self-assessment tool - Clinical Assessment of Risk Management: an INtegrated Approach - includes seven areas (governance, communication, knowledge and skills, safe environment, care processes, adverse event management, learning from experience) and 52 standards. Each standard is evaluated according to four performance levels: minimum; monitoring; outcomes; and improvement actions. The result is a feasible, flexible and valid instrument to be used throughout different organizations. Practical implications - This tool allows practitioners to assess their CRM activities against minimum levels, monitor performance, benchmark with other institutions and disseminate results to different stakeholders. Originality/value - The multi-step approach allowed us to identify core minimum CRM levels in a field where no consensus had been reached. Most standards may be easily adopted in other countries.
Evolution of neuroarchitecture, multi-level analyses and calibrative reductionism
Berntson, Gary G.; Norman, Greg J.; Hawkley, Louise C.; Cacioppo, John T.
2012-01-01
Evolution has sculpted the incredibly complex human nervous system, among the most complex functions of which extend beyond the individual to an intricate social structure. Although these functions are deterministic, those determinants are legion, heavily interacting and dependent on a specific evolutionary trajectory. That trajectory was directed by the adaptive significance of quasi-random genetic variations, but was also influenced by chance and caprice. With a different evolutionary pathway, the same neural elements could subserve functions distinctly different from what they do in extant human brains. Consequently, the properties of higher level neural networks cannot be derived readily from the properties of the lower level constituent elements, without studying these elements in the aggregate. Thus, a multi-level approach to integrative neuroscience may offer an optimal strategy. Moreover, the process of calibrative reductionism, by which concepts and understandings from one level of organization or analysis can mutually inform and ‘calibrate’ those from other levels (both higher and lower), may represent a viable approach to the application of reductionism in science. This is especially relevant in social neuroscience, where the basic subject matter of interest is defined by interacting organisms across diverse environments. PMID:23386961
NASA Astrophysics Data System (ADS)
Mallick, Rajnish; Ganguli, Ranjan; Seetharama Bhat, M.
2015-09-01
The objective of this study is to determine an optimal trailing-edge flap configuration and flap location that achieve minimum hub vibration levels and flap actuation power simultaneously. An aeroelastic analysis of a soft in-plane four-bladed rotor is performed in conjunction with optimal control. A second-order polynomial response surface based on an orthogonal array (OA) with a 3-level design describes both objectives adequately. Two new orthogonal arrays, called MGB2P-OA and MGB4P-OA, are proposed to generate nonlinear response surfaces with all interaction terms for two and four parameters, respectively. A multi-objective bat algorithm (MOBA) approach is used to obtain the optimal design point for the mutually conflicting objectives. MOBA is a recently developed nature-inspired metaheuristic optimization algorithm based on the echolocation behaviour of bats. It is found that the MOBA-inspired Pareto-optimal trailing-edge flap design reduces vibration levels by 73% and flap actuation power by 27% in comparison with the baseline design.
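A second-order response surface of the kind described can be fitted by ordinary least squares; this sketch uses synthetic design points drawn from a known quadratic (an illustrative stand-in, not the study's OA design or rotor data):

```python
import numpy as np

def fit_quadratic_rs(x, y, z):
    """Ordinary least-squares fit of a second-order response surface
    z ~ c0 + c1*x + c2*y + c3*x^2 + c4*y^2 + c5*x*y; an illustrative
    stand-in for the paper's OA-based surrogate, not its actual model."""
    A = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Synthetic design points drawn from a known quadratic, to check recovery
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 30)
y = rng.uniform(-1.0, 1.0, 30)
z = 1.0 + 2.0 * x - 0.5 * y + 0.3 * x**2 + 0.1 * x * y
c = fit_quadratic_rs(x, y, z)
print(np.round(c, 3))  # recovers [1.0, 2.0, -0.5, 0.3, 0.0, 0.1]
```

With an orthogonal-array design the same least-squares step applies, but the design points are chosen so that all main and interaction effects are estimable with few runs.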
Analytical approach to the multi-state lasing phenomenon in quantum dot lasers
NASA Astrophysics Data System (ADS)
Korenev, V. V.; Savelyev, A. V.; Zhukov, A. E.; Omelchenko, A. V.; Maximov, M. V.
2013-03-01
We introduce an analytical approach to describe the multi-state lasing phenomenon in quantum dot lasers. We show that the key parameter is the hole-to-electron capture rate ratio. If it is lower than a certain critical value, the complete quenching of ground-state lasing takes place at high injection levels. At higher values of the ratio, the model predicts saturation of the ground-state power. This explains the diversity of experimental results and their contradiction to the conventional rate equation model. Recently found enhancement of ground-state lasing in p-doped samples and temperature dependence of the ground-state power are also discussed.
NASA Astrophysics Data System (ADS)
Joyce, Steven; Hartley, Lee; Applegate, David; Hoek, Jaap; Jackson, Peter
2014-09-01
Forsmark in Sweden has been proposed as the site of a geological repository for spent high-level nuclear fuel, to be located at a depth of approximately 470 m in fractured crystalline rock. The safety assessment for the repository has required a multi-disciplinary approach to evaluate the impact of hydrogeological and hydrogeochemical conditions close to the repository and in a wider regional context. Assessing the consequences of potential radionuclide releases requires quantitative site-specific information concerning the details of groundwater flow on the scale of individual waste canister locations (1-10 m) as well as details of groundwater flow and composition on the scale of groundwater pathways between the facility and the surface (500 m to 5 km). The purpose of this article is to provide an illustration of multi-scale modeling techniques and the results obtained when combining aspects of local-scale flows in fractures around a potential contaminant source with regional-scale groundwater flow and transport subject to natural evolution of the system. The approach set out is novel, as it incorporates both different scales of model and different levels of detail, combining discrete fracture network and equivalent continuous porous medium representations of fractured bedrock.
Civilisation on the Couch: Theorising Multi-Levelled Psychoanalytical Arts Practice
ERIC Educational Resources Information Center
Benjamin, Garfield
2014-01-01
This paper combines two psychological approaches to art to theorise a both subjective and cultural methodology for practice-based arts research. The first psychoanalytical approach will follow the work of Deleuze and Guattari's Schizoanalysis, considering the role of the artist in order to assess their work in relation to society from an…
Embedding EfS in Teacher Education through a Multi-Level Systems Approach: Lessons from Queensland
ERIC Educational Resources Information Center
Evans, Neus; Ferreira, Jo-Anne; Davis, Julie; Stevenson, Robert B.
2016-01-01
This article reports on the fourth stage of an evolving study to develop a systems model for embedding education for sustainability (EfS) into preservice teacher education. The fourth stage trialled the extension of the model to a comprehensive state-wide systems approach involving representatives from all eight Queensland teacher education…
Secondary School Socio-Cultural Context Influencing ICT Integration: A Case Study Approach
ERIC Educational Resources Information Center
Divaharan, Shanti; Ping, Lim Cher
2010-01-01
This paper proposes the use of activity theory and multi-level activity systems as a framework for analysing the effectiveness of ICT integration in Singapore secondary school classrooms. Three levels of activity systems are developed to study the effectiveness of ICT integration at the classroom level: the classroom activity system, the department…
Determinants of Academic Achievement of Middle Schoolers in Turkey
ERIC Educational Resources Information Center
Börkan, Bengü; Bakis, Ozan
2016-01-01
The purpose of this study is to discuss student and school factors, including cross-level interactions, that cause inequalities in seventh and eighth grade students' achievement in the Turkish context, using national achievement test scores with a multi-level statistical approach. Our results are in line with most other studies with a similar purpose…
We present a simple approach to estimating ground-level fine particle (PM2.5, particles smaller than 2.5 µm in diameter) concentrations using global atmospheric chemistry models and aerosol optical thickness (AOT) measurements from the Multi-angle Imaging SpectroRadiometer (MISR)...
Negotiating designs of multi-purpose reservoir systems in international basins
NASA Astrophysics Data System (ADS)
Geressu, Robel; Harou, Julien
2016-04-01
Given increasing agricultural and energy demands, coordinated management of multi-reservoir systems could help increase production without further stressing available water resources. However, regional or international disputes about water-use rights pose a challenge to efficient expansion and management of many large reservoir systems. Even when projects are likely to benefit all stakeholders, agreeing on the design, operation, financing, and benefit sharing can be challenging. This is due to the difficulty of considering multiple stakeholder interests in the design of projects and understanding the benefit trade-offs that designs imply. Incommensurate performance metrics, incomplete knowledge of system requirements, lack of objectivity in managing conflict and difficulty in communicating complex issues exacerbate the problem. This work proposes a multi-step hybrid multi-objective optimization and multi-criteria ranking approach for supporting negotiation in water resource systems. The approach uses many-objective optimization to generate alternative efficient designs and reveal the trade-offs between conflicting objectives. This enables informed elicitation of criteria weights for further multi-criteria ranking of alternatives. An ideal design would be ranked as best by all stakeholders. Resource-sharing mechanisms such as power trade and/or cost sharing may help competing stakeholders arrive at designs acceptable to all. Many-objective optimization helps suggest efficient designs (reservoir site, storage size and operating rule) and coordination levels considering the perspectives of multiple stakeholders simultaneously. We apply the proposed approach to a proof-of-concept study of the expansion of the Blue Nile transboundary reservoir system.
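The two-step idea described in this abstract, first generating Pareto-efficient designs and then ranking them under each stakeholder's criteria weights, can be sketched in miniature (illustrative Python only; the design names, objective values and weights are invented, not taken from the study):

```python
# Hypothetical sketch: (1) keep only Pareto-efficient designs,
# (2) rank them under each stakeholder's criteria weights.

def dominates(a, b):
    """True if design a is at least as good as b on every objective
    (higher is better) and strictly better on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(designs):
    """Filter out designs dominated by any other design."""
    return {name: objs for name, objs in designs.items()
            if not any(dominates(other, objs)
                       for oname, other in designs.items() if oname != name)}

def rank(front, weights):
    """Rank efficient designs by a stakeholder's weighted sum."""
    return sorted(front, key=lambda n: -sum(w * v for w, v in zip(weights, front[n])))

# Objectives: (hydropower benefit, irrigation benefit), both to maximize.
designs = {"A": (0.9, 0.2), "B": (0.6, 0.7), "C": (0.5, 0.5), "D": (0.2, 0.9)}
front = pareto_front(designs)           # C is dominated by B and drops out
upstream = rank(front, (0.8, 0.2))      # weights favour hydropower
downstream = rank(front, (0.2, 0.8))    # weights favour irrigation
print(sorted(front), upstream, downstream)
```

Comparing the two rankings makes the benefit trade-off explicit, which is the point of eliciting weights only after the efficient set is known.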
Multi-model comparison highlights consistency in predicted effect of warming on a semi-arid shrub
Renwick, Katherine M.; Curtis, Caroline; Kleinhesselink, Andrew R.; Schlaepfer, Daniel R.; Bradley, Bethany A.; Aldridge, Cameron L.; Poulter, Benjamin; Adler, Peter B.
2018-01-01
A number of modeling approaches have been developed to predict the impacts of climate change on species distributions, performance, and abundance. The stronger the agreement from models that represent different processes and are based on distinct and independent sources of information, the greater the confidence we can have in their predictions. Evaluating the level of confidence is particularly important when predictions are used to guide conservation or restoration decisions. We used a multi-model approach to predict climate change impacts on big sagebrush (Artemisia tridentata), the dominant plant species on roughly 43 million hectares in the western United States and a key resource for many endemic wildlife species. To evaluate the climate sensitivity of A. tridentata, we developed four predictive models, two based on empirically derived spatial and temporal relationships, and two that applied mechanistic approaches to simulate sagebrush recruitment and growth. This approach enabled us to produce an aggregate index of climate change vulnerability and uncertainty based on the level of agreement between models. Despite large differences in model structure, predictions of sagebrush response to climate change were largely consistent. Performance, as measured by change in cover, growth, or recruitment, was predicted to decrease at the warmest sites, but increase throughout the cooler portions of sagebrush's range. A sensitivity analysis indicated that sagebrush performance responds more strongly to changes in temperature than precipitation. Most of the uncertainty in model predictions reflected variation among the ecological models, raising questions about the reliability of forecasts based on a single modeling approach. Our results highlight the value of a multi-model approach in forecasting climate change impacts and uncertainties and should help land managers to maximize the value of conservation investments.
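The notion of basing confidence on the level of agreement between independent models can be illustrated with a toy agreement index (Python sketch; the predicted cover changes are fabricated, and the index is a simplification of the aggregate vulnerability/uncertainty index described above):

```python
# Illustrative multi-model agreement index: the fraction of models that
# agree with the majority on the sign of the predicted change.

def agreement_index(preds):
    """Fraction of models agreeing with the majority sign of change."""
    signs = [1 if p > 0 else -1 for p in preds]
    majority = 1 if sum(signs) >= 0 else -1
    return sum(1 for s in signs if s == majority) / len(signs)

# Predicted change in sagebrush cover (%) at two sites from four models.
warm_site = [-3.1, -2.4, -0.8, -1.9]   # all four models predict decline
cool_site = [+1.2, +0.4, -0.3, +0.9]   # one model disagrees

print(agreement_index(warm_site))  # full agreement -> high confidence
print(agreement_index(cool_site))
```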
Pearl, D L; Louie, M; Chui, L; Doré, K; Grimsrud, K M; Martin, S W; Michel, P; Svenson, L W; McEwen, S A
2009-10-01
Using negative binomial and multi-level Poisson models, the authors determined the statistical significance of agricultural and socio-economic risk factors for rates of reported disease associated with Escherichia coli O157 in census subdivisions (CSDs) in Alberta, Canada, 2000-2002. Variables relating to population stability, aboriginal composition of the CSDs, and the economic relationship between CSDs and urban centres were significant risk factors. The percentage of individuals living in low-income households was not a statistically significant risk factor for rates of disease. The statistical significance of cattle density, recorded at a higher geographical level, depended on the method used to correct for overdispersion, the number of levels included in the multi-level models, and the choice of using all reported cases or only sporadic cases. Our results highlight the importance of local socio-economic risk factors in determining rates of disease associated with E. coli O157, but their relationship with individual risk factors requires further evaluation.
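The area-level rate comparison underlying such Poisson models can be illustrated with a standardized morbidity ratio computed against a population offset (hypothetical Python sketch; the census subdivisions, counts and populations are fabricated):

```python
# Toy illustration of the population "offset" idea in area-level Poisson
# modelling: observed counts are compared with the counts expected from
# each subdivision's population at the overall rate.

csd = {  # census subdivision: (reported cases, population); fabricated
    "rural_A": (12, 4_000),
    "rural_B": (5, 3_000),
    "urban_C": (20, 40_000),
}

total_cases = sum(c for c, _ in csd.values())
total_pop = sum(p for _, p in csd.values())
overall_rate = total_cases / total_pop      # cases per person

smr = {}
for name, (cases, pop) in csd.items():
    expected = overall_rate * pop           # expected count at overall rate
    smr[name] = cases / expected            # standardized morbidity ratio
    print(f"{name}: SMR = {smr[name]:.2f}")
```

An SMR above 1 flags a subdivision reporting more cases than its population alone would predict, which is what the area-level covariates in the models then try to explain.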
Multi-country health surveys: are the analyses misleading?
Masood, Mohd; Reidpath, Daniel D
2014-05-01
The aim of this paper was to review the types of approaches currently utilized in the analysis of multi-country survey data, specifically focusing on design and modeling issues in analyses of significant multi-country surveys published in 2010. A systematic search strategy was used to identify 10 multi-country surveys and the articles published from them in 2010. The surveys were selected to reflect diverse topics and foci, and to provide insight into analytic approaches across research themes. The search identified 159 articles appropriate for full-text review and data extraction. The analyses adopted in the multi-country surveys can be broadly classified as univariate/bivariate analyses and multivariate/multivariable analyses. Multivariate/multivariable analyses may be further divided into design- and model-based analyses. Of the 159 articles reviewed, 129 used model-based analyses and 30 used design-based analyses. Similar patterns could be seen in all the individual surveys. While there is general agreement among survey statisticians that complex surveys are most appropriately analyzed using design-based analyses, most researchers continued to use the more common model-based approaches. Recent developments in design-based multi-level analysis may be one approach to include all the survey design characteristics. This is a relatively new area, however, and further statistical, as well as applied analytic, research is required. An important limitation of this study relates to the selection of the surveys used and the choice of year for the analysis, i.e., 2010 only. There is, however, no strong reason to believe that analytic strategies have changed radically in the past few years, and 2010 provides a credible snapshot of current practice.
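The distinction between model-based and design-based estimates can be made concrete with a minimal weighted-estimation sketch, including the Kish approximation of the design effect from unequal weights (illustrative Python; the respondent values and sampling weights are invented):

```python
# Minimal contrast between an unweighted (model-based style) and a
# weighted (design-based style) prevalence estimate from a complex
# survey, plus the Kish design-effect approximation for unequal weights.

values = [1, 0, 1, 1, 0, 1]                # e.g. has condition yes/no
weights = [1.0, 4.0, 1.0, 1.0, 4.0, 1.0]   # sampling weights

unweighted = sum(values) / len(values)
weighted = sum(w * v for w, v in zip(weights, values)) / sum(weights)

n = len(weights)
deff = n * sum(w * w for w in weights) / sum(weights) ** 2  # Kish deff

print(unweighted, weighted, deff)
```

Here the under-sampled (high-weight) respondents both answered "no", so ignoring the design inflates the prevalence estimate, and the design effect above 1 signals the loss of effective sample size from unequal weighting.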
NASA Astrophysics Data System (ADS)
Torres-Martínez, J. A.; Seddaiu, M.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; González-Aguilera, D.
2015-02-01
The complexity of archaeological sites makes it difficult to obtain an integral model using current geomatic techniques (i.e. aerial and close-range photogrammetry and terrestrial laser scanning) individually, so a multi-sensor approach is proposed as the best solution to provide a 3D reconstruction and visualization of these complex sites. Sensor registration represents a key milestone when automation is required and when aerial and terrestrial datasets must be integrated. To this end, several problems must be solved: coordinate system definition, geo-referencing, co-registration of point clouds, geometric and radiometric homogeneity, etc. Last but not least, safeguarding tangible archaeological heritage and its associated intangible expressions entails a multi-source data approach in which heterogeneous material (historical documents, drawings, archaeological techniques, habits of living, etc.) should be collected and combined with the resulting hybrid 3D models. The proposed multi-data-source and multi-sensor approach is applied to the case study of the "Tolmo de Minateda" archaeological site. A total extension of 9 ha is reconstructed, with an adapted level of detail, using an ultralight aerial platform (paratrike), an unmanned aerial vehicle, a terrestrial laser scanner and terrestrial photogrammetry. In addition, the defensive nature of the site (i.e. the presence of three different defensive walls) together with its considerable stratification (i.e. different archaeological surfaces and constructive typologies) requires that tangible and intangible archaeological heritage expressions be integrated with the hybrid 3D models obtained, so that different experts and heritage stakeholders can analyse, understand and exploit the archaeological site.
NASA Astrophysics Data System (ADS)
Cayirci, Erdal; Rong, Chunming; Huiskamp, Wim; Verkoelen, Cor
Military/civilian education, training and experimentation networks (ETEN) are an important application area for the cloud computing concept. However, major security challenges have to be overcome to realize an ETEN. These challenges can be categorized as security challenges typical of any cloud and multi-level security challenges specific to an ETEN environment. This paper introduces the cloud approach for ETEN and explains its security challenges.
Multi-Scale Computational Models for Electrical Brain Stimulation
Seo, Hyeon; Jun, Sung C.
2017-01-01
Electrical brain stimulation (EBS) is an appealing method to treat neurological disorders. To achieve optimal stimulation effects and a better understanding of the underlying brain mechanisms, neuroscientists have proposed computational modeling studies for a decade. Recently, multi-scale models that combine a volume conductor head model and multi-compartmental models of cortical neurons have been developed to predict stimulation effects on the macroscopic and microscopic levels more precisely. As the need for better computational models continues to increase, we review here recent multi-scale modeling studies, focusing on approaches that coupled a simplified or high-resolution volume conductor head model with multi-compartmental models of cortical neurons, and constructed realistic fiber models using diffusion tensor imaging (DTI). Further implications for achieving better precision in estimating cellular responses are discussed. PMID:29123476
Integrative Data Analysis of Multi-Platform Cancer Data with a Multimodal Deep Learning Approach.
Liang, Muxuan; Li, Zhizhong; Chen, Ting; Zeng, Jianyang
2015-01-01
Identification of cancer subtypes plays an important role in revealing useful insights into disease pathogenesis and advancing personalized therapy. The recent development of high-throughput sequencing technologies has enabled the rapid collection of multi-platform genomic data (e.g., gene expression, miRNA expression, and DNA methylation) for the same set of tumor samples. Although numerous integrative clustering approaches have been developed to analyze cancer data, few of them are particularly designed to exploit both deep intrinsic statistical properties of each input modality and complex cross-modality correlations among multi-platform input data. In this paper, we propose a new machine learning model, called multimodal deep belief network (DBN), to cluster cancer patients from multi-platform observation data. In our integrative clustering framework, relationships among inherent features of each single modality are first encoded into multiple layers of hidden variables, and then a joint latent model is employed to fuse common features derived from multiple input modalities. A practical learning algorithm, called contrastive divergence (CD), is applied to infer the parameters of our multimodal DBN model in an unsupervised manner. Tests on two available cancer datasets show that our integrative data analysis approach can effectively extract a unified representation of latent features to capture both intra- and cross-modality correlations, and identify meaningful disease subtypes from multi-platform cancer data. In addition, our approach can identify key genes and miRNAs that may play distinct roles in the pathogenesis of different cancer subtypes. Among those key miRNAs, we found that the expression level of miR-29a is highly correlated with survival time in ovarian cancer patients. 
These results indicate that our multimodal DBN based data analysis approach may have practical applications in cancer pathogenesis studies and provide useful guidelines for personalized cancer therapy.
A fuzzy MCDM framework based on fuzzy measure and fuzzy integral for agile supplier evaluation
NASA Astrophysics Data System (ADS)
Dursun, Mehtap
2017-06-01
Supply chains need to be agile in order to respond quickly to changes in today's competitive environment. The success of an agile supply chain depends on the firm's ability to select the most appropriate suppliers. This study proposes a multi-criteria decision making technique, based on a multi-level hierarchical structure and fuzzy logic, for the evaluation of agile suppliers. The ideal and anti-ideal solutions are taken into consideration simultaneously in the developed approach. The proposed decision approach enables decision-makers to use linguistic terms, and thus reduces their cognitive burden in the evaluation process. Furthermore, a hierarchy of evaluation criteria and their related sub-criteria is employed in the presented approach in order to conduct a more effective analysis.
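The simultaneous use of ideal and anti-ideal solutions can be sketched with a crisp, TOPSIS-style closeness measure (illustrative Python; crisp scores stand in for the fuzzy linguistic terms and fuzzy-integral aggregation of the actual method, and the supplier names and scores are invented):

```python
# Ranking suppliers by relative closeness to the ideal solution and
# distance from the anti-ideal solution, considered simultaneously.

def closeness(scores, ideal, anti_ideal):
    """Relative closeness in [0, 1]: 1 = ideal, 0 = anti-ideal."""
    d_pos = sum((s - i) ** 2 for s, i in zip(scores, ideal)) ** 0.5
    d_neg = sum((s - a) ** 2 for s, a in zip(scores, anti_ideal)) ** 0.5
    return d_neg / (d_pos + d_neg)

# Criteria (already scaled to [0, 1], higher = better):
# delivery speed, flexibility, quality.
suppliers = {"S1": (0.9, 0.6, 0.8), "S2": (0.5, 0.9, 0.7), "S3": (0.3, 0.4, 0.5)}
ideal = tuple(max(s[i] for s in suppliers.values()) for i in range(3))
anti_ideal = tuple(min(s[i] for s in suppliers.values()) for i in range(3))

ranking = sorted(suppliers, key=lambda n: -closeness(suppliers[n], ideal, anti_ideal))
print(ranking)
```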
FNCS: A Framework for Power System and Communication Networks Co-Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ciraci, Selim; Daily, Jeffrey A.; Fuller, Jason C.
2014-04-13
This paper describes the Fenix framework, which uses a federated approach for integrating power grid and communication network simulators. Compared to existing approaches, Fenix allows co-simulation of both transmission- and distribution-level power grid simulators with the communication network simulator. To reduce the performance overhead of time synchronization, Fenix utilizes optimistic synchronization strategies that make speculative decisions about when the simulators are going to exchange messages. GridLAB-D (a distribution simulator), PowerFlow (a transmission simulator), and ns-3 (a telecommunication simulator) are integrated with the framework and are used to illustrate the enhanced performance provided by speculative multi-threading on a smart grid application. Our speculative multi-threading approach achieved on average a 20% improvement over the existing synchronization methods.
Hybrid Parallelism for Volume Rendering on Large-, Multi-, and Many-Core Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howison, Mark; Bethel, E. Wes; Childs, Hank
2012-01-01
With the computing industry trending towards multi- and many-core processors, we study how a standard visualization algorithm, ray-casting volume rendering, can benefit from a hybrid parallelism approach. Hybrid parallelism provides the best of both worlds: using distributed-memory parallelism across a large number of nodes increases available FLOPs and memory, while exploiting shared-memory parallelism among the cores within each node ensures that each node performs its portion of the larger calculation as efficiently as possible. We demonstrate results from weak and strong scaling studies, at levels of concurrency ranging up to 216,000, and with datasets as large as 12.2 trillion cells. The greatest benefit from hybrid parallelism lies in the communication portion of the algorithm, the dominant cost at higher levels of concurrency. We show that reducing the number of participants with a hybrid approach significantly improves performance.
Robinson, Thomas N.; Matheson, Donna; Desai, Manisha; Wilson, Darrell M.; Weintraub, Dana L.; Haskell, William L.; McClain, Arianna; McClure, Samuel; Banda, Jorge; Sanders, Lee M.; Haydel, K. Farish; Killen, Joel D.
2013-01-01
Objective To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Design Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Participants Seven- to eleven-year-old overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Interventions Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Main Outcome Measure Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. Conclusions The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families.
PMID:24028942
Robinson, Thomas N; Matheson, Donna; Desai, Manisha; Wilson, Darrell M; Weintraub, Dana L; Haskell, William L; McClain, Arianna; McClure, Samuel; Banda, Jorge A; Sanders, Lee M; Haydel, K Farish; Killen, Joel D
2013-11-01
To test the effects of a three-year, community-based, multi-component, multi-level, multi-setting (MMM) approach for treating overweight and obese children. Two-arm, parallel group, randomized controlled trial with measures at baseline, 12, 24, and 36 months after randomization. Seven- to eleven-year-old overweight and obese children (BMI ≥ 85th percentile) and their parents/caregivers recruited from community locations in low-income, primarily Latino neighborhoods in Northern California. Families are randomized to the MMM intervention versus a community health education active-placebo comparison intervention. Interventions last for three years for each participant. The MMM intervention includes a community-based after school team sports program designed specifically for overweight and obese children, a home-based family intervention to reduce screen time, alter the home food/eating environment, and promote self-regulatory skills for eating and activity behavior change, and a primary care behavioral counseling intervention linked to the community and home interventions. The active-placebo comparison intervention includes semi-annual health education home visits, monthly health education newsletters for children and for parents/guardians, and a series of community-based health education events for families. Body mass index trajectory over the three-year study. Secondary outcome measures include waist circumference, triceps skinfold thickness, accelerometer-measured physical activity, 24-hour dietary recalls, screen time and other sedentary behaviors, blood pressure, fasting lipids, glucose, insulin, hemoglobin A1c, C-reactive protein, alanine aminotransferase, and psychosocial measures. The Stanford GOALS trial is testing the efficacy of a novel community-based multi-component, multi-level, multi-setting treatment for childhood overweight and obesity in low-income, Latino families. © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Taubenböck, H.; Wurm, M.; Netzband, M.; Zwenzner, H.; Roth, A.; Rahman, A.; Dech, S.
2011-02-01
Estimating flood risks and managing disasters combines knowledge in climatology, meteorology, hydrology, hydraulic engineering, statistics, planning and geography - thus a complex multi-faceted problem. This study focuses on the capabilities of multi-source remote sensing data to support decision-making before, during and after a flood event. With our focus on urbanized areas, sample methods and applications show multi-scale products from the hazard and vulnerability perspective of the risk framework. From the hazard side, we present capabilities with which to assess flood-prone areas before an expected disaster. Then we map the spatial impact during or after a flood and finally, we analyze damage grades after a flood disaster. From the vulnerability side, we monitor urbanization over time on an urban footprint level, classify urban structures on an individual building level, assess building stability and quantify probably affected people. The results show a large database for sustainable development and for developing mitigation strategies, ad-hoc coordination of relief measures and organizing rehabilitation.
Highly Resolved Intravital Striped-illumination Microscopy of Germinal Centers
Andresen, Volker; Sporbert, Anje
2014-01-01
Monitoring cellular communication by intravital deep-tissue multi-photon microscopy is the key for understanding the fate of immune cells within thick tissue samples and organs in health and disease. By controlling the scanning pattern in multi-photon microscopy and applying appropriate numerical algorithms, we developed a striped-illumination approach, which enabled us to achieve 3-fold better axial resolution and improved signal-to-noise ratio, i.e. contrast, in more than 100 µm tissue depth within highly scattering tissue of lymphoid organs as compared to standard multi-photon microscopy. The acquisition speed as well as photobleaching and photodamage effects were similar to standard photo-multiplier-based technique, whereas the imaging depth was slightly lower due to the use of field detectors. By using the striped-illumination approach, we are able to observe the dynamics of immune complex deposits on secondary follicular dendritic cells – on the level of a few protein molecules in germinal centers. PMID:24748007
Xiao, Jianru; He, Shaohui; Jiao, Jian; Wan, Wei; Xu, Wei; Zhang, Dan; Liu, Weibo; Zhong, Nanzhe; Liu, Tielong; Wei, Haifeng; Yang, Xinghai
2018-03-01
Multi-level reconstruction incorporating the chest wall and ribs is technically demanding after multi-segmental total en bloc spondylectomy (TES) of thoracic spinal tumours. Few surgical techniques have been reported for effective reconstruction. A novel and straightforward technical reconstruction through a posterior-lateral approach is presented to address the extensive chest wall defect and prevent severe respiratory dysfunction after TES. The preliminary outcomes of surgery were reviewed. Multi-level TES was performed for five patients with primary or recurrent thoracic spinal malignancies through a posterior-lateral approach. The involved ribs and chest wall were removed to achieve tumour-free margins. Titanium mesh with allograft bone and a pedicle screw-rod system were then adopted for routine circumferential spinal reconstruction. Titanium rods were modified accordingly to attach to the screw-rod system proximally, and the distal end of the rods was dynamically inserted into the ribs. The mean surgery time was 6.7 hours (range 5-8), with an average blood loss of 3260 ml (range 2300-4500). No severe neurological complications were reported, although three patients (nos. 1, 3, and 5) complained of slight numbness of the chest skin. No severe respiratory complications occurred during the peri-operative period. No implant failure and no local recurrence or distant metastases were observed over an average follow-up of 12.5 months. Single-stage reconstructions incorporating the spine and chest wall are straightforward and easy to perform. The preliminary outcomes of co-reconstruction are promising and favourable. More studies and longer follow-up are required to validate this technique.
ERIC Educational Resources Information Center
Enger, Kathy; Lajimodiere, Denise
2011-01-01
Purpose: The purpose of this paper is to examine the attitudes of students following the completion of an online doctoral level multicultural diversity course at a university in the Midwestern USA based on Banks' transformative approach to learning in an effort to determine if the online environment could successfully intervene to change student…
Improved optical flow motion estimation for digital image stabilization
NASA Astrophysics Data System (ADS)
Lai, Lijun; Xu, Zhiyong; Zhang, Xuyao
2015-11-01
Optical flow is the instantaneous motion vector at each pixel in the image frame at a time instant. The gradient-based approach to optical flow computation does not work well when the inter-frame motion is large. To alleviate this problem, we incorporate the algorithm into a pyramid multi-resolution coarse-to-fine search strategy: a pyramid decomposition yields multi-resolution images; an iterative refinement from the highest (coarsest) level down to the lowest (finest) level yields the inter-frame affine parameters; and subsequent frames are then compensated back to the first frame to obtain a stabilized sequence. The experimental results demonstrate that the proposed method performs well in global motion estimation.
CubeSat mechanical design: creating low mass and durable structures
NASA Astrophysics Data System (ADS)
Fiedler, Gilbert; Straub, Jeremy
2017-05-01
This paper considers the mechanical design of a low-mass, low-cost spacecraft for use in a multi-satellite sensing constellation. For a multi-spacecraft mission, aggregated small mass and cost reductions can have significant impact. One approach to mass reduction is to make cuts into the structure, removing material. Stress analysis is used to determine the level of material reduction possible. Focus areas for this paper include determining areas to make cuts to ensure that a strong shape remains, while considering the comparative cost and skill level of each type of cut. Real-world results for a CubeSat and universally applicable analysis are presented.
Multi-atlas and label fusion approach for patient-specific MRI based skull estimation.
Torrado-Carvajal, Angel; Herraiz, Joaquin L; Hernandez-Tamames, Juan A; San Jose-Estepar, Raul; Eryaman, Yigitcan; Rozenholc, Yves; Adalsteinsson, Elfar; Wald, Lawrence L; Malpica, Norberto
2016-04-01
MRI-based skull segmentation is a useful procedure for many imaging applications. This study describes a methodology for automatic segmentation of the complete skull from a single T1-weighted volume. The skull is estimated using a multi-atlas segmentation approach. Using a whole head computed tomography (CT) scan database, the skull in a new MRI volume is detected by nonrigid image registration of the volume to every CT, and combination of the individual segmentations by label-fusion. We have compared Majority Voting, Simultaneous Truth and Performance Level Estimation (STAPLE), Shape Based Averaging (SBA), and the Selective and Iterative Method for Performance Level Estimation (SIMPLE) algorithms. The pipeline has been evaluated quantitatively using images from the Retrospective Image Registration Evaluation database (reaching an overlap of 72.46 ± 6.99%), a clinical CT-MR dataset (maximum overlap of 78.31 ± 6.97%), and a whole head CT-MRI pair (maximum overlap 78.68%). A qualitative evaluation has also been performed on MRI acquisition of volunteers. It is possible to automatically segment the complete skull from MRI data using a multi-atlas and label fusion approach. This will allow the creation of complete MRI-based tissue models that can be used in electromagnetic dosimetry applications and attenuation correction in PET/MR. © 2015 Wiley Periodicals, Inc.
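Of the label-fusion rules compared in this abstract, Majority Voting is the simplest and can be sketched directly (illustrative Python on 1-D toy masks; STAPLE, SBA and SIMPLE weight or iterate over the atlases rather than voting uniformly, and the data here are invented):

```python
# Simplified majority-voting label fusion over registered atlas
# segmentations, with a Dice overlap check against a reference mask.

def majority_vote(atlases):
    """Per-voxel fusion: label 1 where more than half the atlases say 1."""
    n = len(atlases)
    return [1 if sum(col) * 2 > n else 0 for col in zip(*atlases)]

def dice(a, b):
    """Dice overlap between two binary masks."""
    inter = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    return 2 * inter / (sum(a) + sum(b))

atlases = [                # skull masks propagated from three atlases
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1, 0],
]
fused = majority_vote(atlases)
reference = [0, 1, 1, 1, 0, 0]   # manual segmentation, for evaluation
print(fused, dice(fused, reference))
```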
NASA Astrophysics Data System (ADS)
Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.
2016-03-01
The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and intimately involved with maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using proposed approach compared to that using ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables the subcutaneous and visceral fat measurement, with high correlation to the measurement derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes.
NASA Astrophysics Data System (ADS)
Sweeney, C.; Kort, E. A.; Rella, C.; Conley, S. A.; Karion, A.; Lauvaux, T.; Frankenberg, C.
2015-12-01
Along with a boom in oil and natural gas production in the US, there has been a substantial effort to understand the true environmental impact of these operations on air and water quality, as well as net radiation balance. This multi-institution effort, funded by both governmental and non-governmental agencies, has provided a case study for identification and verification of emissions using a multi-scale, top-down approach. This approach leverages a combination of remote sensing to identify areas that need specific focus and airborne in-situ measurements to quantify both regional and large- to mid-size single-point emitters. Ground-based networks of mobile and stationary measurements provide the bottom tier of measurements from which process-level information can be gathered to better understand the specific sources and temporal distribution of the emitters. The motivation for this type of approach is largely driven by recent work in the Barnett Shale region in Texas as well as the San Juan Basin in New Mexico and Colorado; these studies suggest that relatively few single-point emitters dominate the regional emissions of CH4.
The Inclusive Classroom. Professional's Guide.
ERIC Educational Resources Information Center
Grenot-Scheyer, Marquita; And Others
Inclusive education reflects the changing culture of contemporary schools with emphasis on active learning, authentic assessment practices, applied curriculum, multi-level instructional approaches, and increased attention to diverse student needs and individualization. This guide is intended to help teachers implement inclusive educational…
NASA Astrophysics Data System (ADS)
Prasad, S.; Bruce, L. M.
2007-04-01
There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform feature-level or decision-level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image-based ATR techniques project the high-dimensional reflectance signature onto a lower-dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, they are not necessarily optimal projections. In this paper, we present a divide-and-conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher-order statistical information (using average mutual information) for a bottom-up band grouping. We also propose a confidence-measure-based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels.
The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
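The confidence-weighted decision fusion described above can be sketched as follows; the classifier posteriors and training accuracies are invented illustrations, not the paper's data:

```python
import numpy as np

def fuse(class_probs, train_acc):
    """Confidence-weighted decision fusion.

    class_probs: (n_classifiers, n_classes) per-classifier class posteriors
                 for one test pixel.
    train_acc:   (n_classifiers,) training accuracy of each classifier,
                 used as its fusion weight.
    Returns the index of the winning class.
    """
    w = np.asarray(train_acc, dtype=float)
    w = w / w.sum()                      # normalize weights
    fused = w @ np.asarray(class_probs)  # weighted average of posteriors
    return int(fused.argmax())

# Three subspace classifiers voting over two classes.
probs = [[0.9, 0.1],    # confident classifier favors class 0
         [0.4, 0.6],    # weak classifier leans class 1
         [0.45, 0.55]]  # weak classifier leans class 1
acc = [0.95, 0.55, 0.50]
print(fuse(probs, acc))  # → 0
```

With these numbers, the single high-confidence classifier outweighs the two weak dissenters, which is the intended behavior of accuracy-based weighting.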
NASA Astrophysics Data System (ADS)
Montero, Marc Villa; Barjasteh, Ehsan; Baid, Harsh K.; Godines, Cody; Abdi, Frank; Nikbin, Kamran
A multi-scale micromechanics approach, along with a finite element (FE) model predictive tool, is developed to analyze the low-energy-impact damage footprint and compression-after-impact (CAI) behavior of composite laminates, and is tested and verified against experimental data. Effective fiber and matrix properties were reverse-engineered from lamina properties using an optimization algorithm and used to assess damage at the micro-level during impact and post-impact FE simulations. Progressive failure dynamic analysis (PFDA) was performed for a two-step process simulation. Damage mechanisms at the micro-level were continuously evaluated during the analyses. The contribution of each failure mode was tracked during the simulations, and the damage and delamination footprint size and shape were predicted to understand when, where and why failure occurred during both impact and CAI events. The composite laminate was manufactured by the vacuum infusion of the aero-grade toughened Benzoxazine system into the fabric preform. The delamination footprint was measured using C-scan data from the impacted panels and compared with the predicted values obtained from the proposed multi-scale micromechanics coupled with FE analysis. Furthermore, the residual strength was predicted from the load-displacement curve and compared with the experimental values as well.
Deep Visual Attention Prediction
NASA Astrophysics Data System (ADS)
Wang, Wenguan; Shen, Jianbing
2018-05-01
In this work, we aim to predict human eye fixations in free-viewing scenes based on an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have substantially improved human attention prediction, CNN-based attention models can still be improved by efficiently leveraging multi-scale features. Our visual attention network is proposed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency responses. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. The final saliency prediction is achieved via the cooperation of these global and local predictions. Our model is learned in a deep supervision manner, where supervision is fed directly into multi-level layers, instead of the previous approach of providing supervision only at the output layer and propagating it back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly reduces the redundancy of earlier approaches that learn multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.
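The cooperation of global (coarse-layer) and local (fine-layer) predictions can be illustrated with a minimal fusion sketch; the nearest-neighbour upsampling via `np.kron` and the equal weights are assumptions for illustration, not the network's learned fusion:

```python
import numpy as np

def fuse_saliency(maps, weights):
    """Combine per-layer saliency predictions into one map.

    maps:    list of square saliency maps at different resolutions
             (coarse/global to fine/local), values in [0, 1].
    weights: per-layer fusion weights.
    """
    size = max(m.shape[0] for m in maps)
    out = np.zeros((size, size))
    for m, w in zip(maps, weights):
        scale = size // m.shape[0]
        # Nearest-neighbour upsample to the finest resolution.
        out += w * np.kron(m, np.ones((scale, scale)))
    return out / sum(weights)

coarse = np.array([[0.0, 1.0], [0.0, 0.0]])  # global prediction, 2x2
fine = np.random.rand(4, 4)                  # local prediction, 4x4
fused = fuse_saliency([coarse, fine], [0.5, 0.5])
print(fused.shape)  # → (4, 4)
```

In the actual architecture the per-layer predictions and their combination are learned end-to-end; this sketch only shows the multi-level averaging structure.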
Golden, Sherita Hill; Maruthur, Nisa; Mathioudakis, Nestoras; Spanakis, Elias; Rubin, Daniel; Zilbermint, Mihail; Hill-Briggs, Felicia
2017-07-01
The goal of this review is to describe diabetes within a population health improvement framework and to review the evidence for a diabetes population health continuum of intervention approaches, including diabetes prevention and chronic and acute diabetes management, to improve clinical and economic outcomes. Recent studies have shown that, compared to usual care, lifestyle interventions in prediabetes lower diabetes risk at the population level, and that group-based programs have low incremental cost-effectiveness ratios for health systems. Effective outpatient interventions that improve diabetes control and process outcomes are multi-level, targeting the patient, provider, and healthcare system simultaneously, and integrate community health workers as liaisons between patients and community-based healthcare resources. A multi-faceted approach to diabetes management is also effective in the inpatient setting. Interventions shown to promote safe and effective glycemic control and use of evidence-based glucose management practices include provider reminder and clinical decision support systems, automated computer order entry, provider education, and organizational change. Future studies should examine the cost-effectiveness of multi-faceted outpatient and inpatient diabetes management programs to determine the best financial models for incorporating them into diabetes population health strategies.
Towards Simpler Custom and OpenSearch Services for Voluminous NEWS Merged A-Train Data (Invited)
NASA Astrophysics Data System (ADS)
Hua, H.; Fetzer, E.; Braverman, A. J.; Lewis, S.; Henderson, M. L.; Guillaume, A.; Lee, S.; de La Torre Juarez, M.; Dang, H. T.
2010-12-01
To simplify access to large and complex satellite data sets for climate analysis and model verification, we developed web services that are used to study long-term and global-scale trends in climate, the water and energy cycle, and weather variability. A related NASA Energy and Water Cycle Study (NEWS) task has created a merged NEWS Level 2 dataset from multiple instruments in NASA’s A-Train constellation of satellites. We used these data to enable the creation of climatologies that include correlations between observed temperature, water vapor and cloud properties from the A-Train sensors. Instead of imposing on the user an often rigid and limiting web-based analysis environment, we recognize the need for simple and well-designed services so that users can perform analysis in their own familiar computing environments. Custom on-demand services were developed to improve the accessibility of voluminous multi-sensor data. Services enabling geospatial, geographical, and multi-sensor parameter subsets of the data, as well as a custom time-averaged Level 3 service, will be presented. We will also show how a Level 3Q data reduction approach can be used to help “browse” the voluminous multi-sensor Level 2 data. An OpenSearch capability with full text + space + time search of data products will also be presented as an approach to facilitate interoperability with other data systems. We will present our experiences in improving usability as well as strategies for facilitating interoperability with other data systems.
NASA Technical Reports Server (NTRS)
Baker, G. R.; Fethe, T. P.
1975-01-01
Research in the application of remotely sensed data from LANDSAT or other airborne platforms to the efficient management of a large timber-based forest industry was divided into three phases: (1) establishment of a photo/ground sample correlation, (2) investigation of techniques for multi-spectral digital analysis, and (3) development of a semi-automated multi-level sampling system. To properly verify results, three distinct test areas were selected: (1) Jacksonville Mill Region, Lower Coastal Plain, Flatwoods, (2) Pensacola Mill Region, Middle Coastal Plain, and (3) Mississippi Mill Region, Middle Coastal Plain. The following conclusions were reached: (1) the probability of establishing an information base suitable for management requirements through a photo/ground double sampling procedure, alleviating the ground sampling effort, is encouraging, (2) known classification techniques must be investigated to ascertain the level of precision possible in separating the many densities involved, and (3) the multi-level approach must be related to an information system that is executable and feasible.
Comparison of alternative approaches for analysing multi-level RNA-seq data
Mohorianu, Irina; Bretman, Amanda; Smith, Damian T.; Fowler, Emily K.; Dalmay, Tamas
2017-01-01
RNA sequencing (RNA-seq) is widely used for RNA quantification in the environmental, biological and medical sciences. It enables the description of genome-wide patterns of expression and the identification of regulatory interactions and networks. The aim of RNA-seq data analyses is to achieve rigorous quantification of genes/transcripts to allow a reliable prediction of differential expression (DE), despite variation in levels of noise and inherent biases in sequencing data. This can be especially challenging for datasets in which gene expression differences are subtle, as in the behavioural transcriptomics test dataset from D. melanogaster that we used here. We investigated the power of existing approaches for quality checking mRNA-seq data and explored additional, quantitative quality checks. To accommodate nested, multi-level experimental designs, we incorporated sample layout into our analyses. We employed a subsampling-without-replacement-based normalization and an identification of DE that accounted for the hierarchy and amplitude of effect sizes within samples, then evaluated the resulting differential expression calls in comparison to existing approaches. In a final step to test for broader applicability, we applied our approaches to a published set of H. sapiens mRNA-seq samples. The dataset-tailored methods improved sample comparability and delivered a robust prediction of subtle gene expression changes. The proposed approaches have the potential to improve key steps in the analysis of RNA-seq data by incorporating the structure and characteristics of biological experiments. PMID:28792517
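The subsampling-without-replacement normalization mentioned above can be sketched in its simplest variant, where every library is downsampled to a common depth so that counts are directly comparable; the counts and seed below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def subsample(counts, depth):
    """Normalize one RNA-seq sample by subsampling reads without replacement.

    counts: per-gene read counts for the sample.
    depth:  target library size (must be <= counts.sum()).
    """
    # Expand counts to one entry per read, draw `depth` reads, re-tally.
    reads = np.repeat(np.arange(len(counts)), counts)
    kept = rng.choice(reads, size=depth, replace=False)
    return np.bincount(kept, minlength=len(counts))

sample_a = np.array([500, 300, 200])  # 1000 reads total
sample_b = np.array([250, 150, 100])  # 500 reads total
norm_a = subsample(sample_a, 500)     # both samples now at depth 500
print(norm_a.sum())  # → 500
```

Unlike scaling-factor normalization, subsampling preserves the count nature of the data, which matters for downstream DE statistics.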
MLP: A Parallel Programming Alternative to MPI for New Shared Memory Parallel Systems
NASA Technical Reports Server (NTRS)
Taft, James R.
1999-01-01
Recent developments at the NASA AMES Research Center's NAS Division have demonstrated that the new generation of NUMA-based Symmetric Multi-Processing systems (SMPs), such as the Silicon Graphics Origin 2000, can successfully execute legacy vector-oriented CFD production codes at sustained rates far exceeding processing rates possible on dedicated 16-CPU Cray C90 systems. This high level of performance is achieved via shared-memory-based Multi-Level Parallelism (MLP). This programming approach, developed at NAS and outlined below, is distinct from the message passing paradigm of MPI. It offers parallelism at both the fine- and coarse-grained levels, with communication latencies that are approximately 50-100 times lower than typical MPI implementations on the same platform. Such latency reductions offer the promise of performance scaling to very large CPU counts. The method draws on, but is also distinct from, the newly defined OpenMP specification, which uses compiler directives to support a limited subset of multi-level parallel operations. The NAS MLP method is general, and applicable to a large class of NASA CFD codes.
Arbabi, Vahid; Pouran, Behdad; Weinans, Harrie; Zadpoor, Amir A
2016-09-06
Analytical and numerical methods have been used to extract essential engineering parameters such as elastic modulus, Poisson's ratio, permeability and diffusion coefficient from experimental data in various types of biological tissues. The major limitation associated with analytical techniques is that they are often only applicable to problems with simplified assumptions. Numerical multi-physics methods, on the other hand, enable minimizing the simplified assumptions but require substantial computational expertise, which is not always available. In this paper, we propose a novel approach that combines inverse and forward artificial neural networks (ANNs) which enables fast and accurate estimation of the diffusion coefficient of cartilage without any need for computational modeling. In this approach, an inverse ANN is trained using our multi-zone biphasic-solute finite-bath computational model of diffusion in cartilage to estimate the diffusion coefficient of the various zones of cartilage given the concentration-time curves. Robust estimation of the diffusion coefficients, however, requires introducing certain levels of stochastic variations during the training process. Determining the required level of stochastic variation is performed by coupling the inverse ANN with a forward ANN that receives the diffusion coefficient as input and returns the concentration-time curve as output. Combined together, forward-inverse ANNs enable computationally inexperienced users to obtain accurate and fast estimation of the diffusion coefficients of cartilage zones. The diffusion coefficients estimated using the proposed approach are compared with those determined using direct scanning of the parameter space as the optimization approach. It has been shown that both approaches yield comparable results.
An Approach for Autonomy: A Collaborative Communication Framework for Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Dufrene, Warren Russell, Jr.
2005-01-01
Research done during the last three years has studied the emergent properties of Complex Adaptive Systems (CAS). The deployment of Artificial Intelligence (AI) techniques applied to remote Unmanned Aerial Vehicles has led the author to investigate applications of CAS within the field of Autonomous Multi-Agent Systems. The core objective of current research efforts is focused on the simplicity of Intelligent Agents (IA) and the modeling of these agents within complex systems. This research effort looks at the communication, interaction, and adaptability of multi-agents as applied to complex systems control. The embodiment concept applied to robotics has application possibilities within multi-agent frameworks. A new framework for agent awareness within a virtual 3D world concept is possible, where the vehicle is composed of collaborative agents. This approach has many possibilities for applications to complex systems. This paper describes the development of an approach to apply this virtual framework to the NASA Goddard Space Flight Center (GSFC) tetrahedron structure developed under the Autonomous Nano Technology Swarm (ANTS) program and the Super Miniaturized Addressable Reconfigurable Technology (SMART) architecture program. These projects represent an innovative set of novel concepts deploying adaptable, self-organizing structures composed of many tetrahedrons. This technology is pushing current applied Agents Concepts to new levels of requirements and adaptability.
MUSIC: MUlti-Scale Initial Conditions
NASA Astrophysics Data System (ADS)
Hahn, Oliver; Abel, Tom
2013-11-01
MUSIC generates multi-scale initial conditions with multiple levels of refinements for cosmological ‘zoom-in’ simulations. The code uses an adaptive convolution of Gaussian white noise with a real-space transfer function kernel together with an adaptive multi-grid Poisson solver to generate displacements and velocities following first- (1LPT) or second-order Lagrangian perturbation theory (2LPT). MUSIC achieves rms relative errors of the order of 10^-4 for displacements and velocities in the refinement region and thus improves in terms of errors by about two orders of magnitude over previous approaches. In addition, errors are localized at coarse-fine boundaries and do not suffer from Fourier space-induced interference ringing.
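The core idea of convolving Gaussian white noise with a transfer-function kernel can be illustrated in one dimension with FFT-based convolution. The power-law-style kernel below is a stand-in, not MUSIC's cosmological transfer function, and MUSIC itself performs the convolution adaptively in real space:

```python
import numpy as np

rng = np.random.default_rng(42)

def gaussian_field(n, boxsize, transfer):
    """Generate a 1-D Gaussian random field by convolving white noise
    with a transfer function in Fourier space."""
    noise = rng.standard_normal(n)         # real-space white noise
    k = np.fft.rfftfreq(n, d=boxsize / n)  # wavenumbers of the grid
    field_k = np.fft.rfft(noise) * transfer(k)
    return np.fft.irfft(field_k, n)

# Illustrative decaying kernel (not a physical power spectrum).
delta = gaussian_field(256, boxsize=100.0, transfer=lambda k: np.exp(-k))
print(delta.shape)  # → (256,)
```

The resulting field has the target two-point statistics by construction, since multiplying white noise by the kernel in Fourier space imprints the desired correlations.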
Silva Junqueira, Vinícius; de Azevedo Peixoto, Leonardo; Galvêas Laviola, Bruno; Lopes Bhering, Leonardo; Mendonça, Simone; Agostini Costa, Tania da Silveira; Antoniassi, Rosemar
2016-01-01
The biggest challenge for jatropha breeding is to identify superior genotypes that present high seed yield and seed oil content with reduced toxicity levels. Therefore, the objective of this study was to estimate genetic parameters for three important traits (100-seed weight, seed oil content, and phorbol ester concentration), and to select superior genotypes to be used as progenitors in jatropha breeding. Additionally, the genotypic values and the genetic parameters estimated under the Bayesian multi-trait approach were used to evaluate different selection index scenarios for 179 half-sib families. Three different scenarios and economic weights were considered. It was possible to simultaneously reduce toxicity and increase seed oil content and 100-seed weight by using index selection based on genotypic values estimated by the Bayesian multi-trait approach. Indeed, we identified two families that present these characteristics by evaluating genetic diversity using the Ward clustering method, which suggested nine homogeneous clusters. Future research should integrate Bayesian multi-trait methods with the realized relationship matrix, aiming to build accurate selection index models. PMID:27281340
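A selection index of the kind evaluated above combines genotypic values with economic weights into a single ranking score. A minimal sketch follows, with entirely illustrative genotypic values and weights (not the study's estimates):

```python
import numpy as np

def selection_index(genotypic_values, economic_weights):
    """Smith-Hazel-style index: weighted sum of genotypic values per family."""
    return np.asarray(genotypic_values) @ np.asarray(economic_weights)

# Rows: families; columns: 100-seed weight, oil content, phorbol ester level
# (standardized genotypic values, illustrative only).
g = np.array([[1.2, 0.8, -0.5],
              [0.4, 0.2, -1.1],
              [0.9, 1.0, 0.6]])
# Reward seed weight and oil content, penalize toxicity.
w = np.array([1.0, 1.0, -1.0])
idx = selection_index(g, w)
best = int(idx.argmax())
print(best)  # → 0
```

Changing the economic weights changes the ranking, which is exactly why the study compares several weight scenarios.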
A MultiAir®/MultiFuel Approach to Enhancing Engine System Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reese, Ronald
2015-05-20
FCA US LLC (formerly known as Chrysler Group LLC, and hereinafter “Chrysler”) was awarded an American Recovery and Reinvestment Act (ARRA) funded project by the Department of Energy (DOE) titled “A MultiAir®/MultiFuel Approach to Enhancing Engine System Efficiency” (hereinafter “project”). This award was issued after Chrysler submitted a proposal for Funding Opportunity Announcement DE-FOA-0000079, “Systems Level Technology Development, Integration, and Demonstration for Efficient Class 8 Trucks (SuperTruck) and Advanced Technology Powertrains for Light-Duty Vehicles (ATP-LD).” Chrysler started work on this project on June 01, 2010 and completed testing activities on August 30, 2014. The overall objectives of this project were to: demonstrate a 25% improvement in combined Federal Test Procedure (FTP) City and Highway fuel economy over a 2009 Chrysler minivan; accelerate the development of highly efficient engine and powertrain systems for light-duty vehicles, while meeting future emissions standards; and create and retain jobs in accordance with the American Recovery and Reinvestment Act of 2009.
A multi-objective constraint-based approach for modeling genome-scale microbial ecosystems.
Budinich, Marko; Bourdon, Jérémie; Larhlimi, Abdelhalim; Eveillard, Damien
2017-01-01
Interplay within microbial communities impacts ecosystems on several scales, and elucidation of the consequent effects is a difficult task in ecology. In particular, the integration of genome-scale data within quantitative models of microbial ecosystems remains elusive. This study advocates the use of constraint-based modeling to build predictive models from recent high-resolution -omics datasets. Following recent studies that have demonstrated the accuracy of constraint-based models (CBMs) for simulating single-strain metabolic networks, we sought to study microbial ecosystems as a combination of single-strain metabolic networks that exchange nutrients. This study presents two multi-objective extensions of CBMs for modeling communities: multi-objective flux balance analysis (MO-FBA) and multi-objective flux variability analysis (MO-FVA). Both methods were applied to a hot spring mat model ecosystem. As a result, multiple trade-offs between nutrients and growth rates, as well as thermodynamically favorable relative abundances at the community level, were emphasized. We expect this approach to be used for integrating genomic information in microbial ecosystems. The resulting models will provide insights into behaviors (including diversity) that take place at the ecosystem scale.
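MO-FBA scalarizes multiple growth objectives subject to steady-state mass balance. The toy two-objective network below, solved by brute force rather than by the linear programming a real CBM would use, is only meant to illustrate how sweeping the objective weights traces the trade-off between community members:

```python
import numpy as np

# Toy network: a shared substrate uptake feeds two "growth" objectives.
# Steady-state mass balance: v_up = v_g1 + v_g2, with 0 <= v_up <= 10.
UPTAKE_MAX = 10.0

def mo_fba(w1, w2, grid=101):
    """Weighted-sum scalarization of a two-objective FBA on a toy network,
    solved by brute force over the feasible flux split.
    Returns (score, v_g1, v_g2) for the best split."""
    best = None
    for v_g1 in np.linspace(0.0, UPTAKE_MAX, grid):
        v_g2 = UPTAKE_MAX - v_g1          # mass balance spends all uptake
        score = w1 * v_g1 + w2 * v_g2
        if best is None or score > best[0]:
            best = (score, v_g1, v_g2)
    return best

# Sweep the weights to trace the trade-off (Pareto front) between objectives.
for w in (0.9, 0.5, 0.1):
    print(mo_fba(w, 1.0 - w)[1:])
```

In genome-scale models the feasible set is a high-dimensional polytope defined by the stoichiometric matrix, so each weighted problem is solved by an LP solver instead of enumeration; the weight-sweep logic is the same.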
Multi-level systems modeling and optimization for novel aircraft
NASA Astrophysics Data System (ADS)
Subramanian, Shreyas Vathul
This research combines the disciplines of system-of-systems (SoS) modeling, platform-based design, optimization and evolving design spaces to achieve a novel capability for designing solutions to key aeronautical mission challenges. A central innovation in this approach is the confluence of multi-level modeling (from sub-systems to the aircraft system to aeronautical system-of-systems) in a way that coordinates the appropriate problem formulations at each level and enables parametric search in design libraries for solutions that satisfy level-specific objectives. The work here addresses the topic of SoS optimization and discusses problem formulation, solution strategy, the need for new algorithms that address special features of this problem type, and also demonstrates these concepts using two example application problems - a surveillance UAV swarm problem, and the design of noise-optimal aircraft and approach procedures. This topic is critical since most new capabilities in aeronautics will be provided not just by a single air vehicle, but by aeronautical Systems of Systems (SoS). At the same time, many new aircraft concepts are pressing the boundaries of cyber-physical complexity through the myriad of dynamic and adaptive sub-systems that are rising up the TRL (Technology Readiness Level) scale. This compositional approach is envisioned to be active at three levels: validated sub-systems are integrated to form conceptual aircraft, which are further connected with others to perform a challenging mission capability at the SoS level. While these multiple levels represent layers of physical abstraction, each discipline is associated with tools of varying fidelity forming strata of 'analysis abstraction'.
Further, the design (composition) will be guided by a suitable hierarchical complexity metric formulated for the management of complexity in both the problem (as part of the generative procedure and selection of fidelity level) and the product (i.e., is the mission best achieved via a large collection of interacting simple systems, or a relatively few highly capable, complex air vehicles). The vastly unexplored area of optimization in evolving design spaces will be studied and incorporated into the SoS optimization framework. We envision a framework that resembles a multi-level, multi-fidelity, multi-disciplinary assemblage of optimization problems. The challenge is not simply one of scaling up to a new level (the SoS), but recognizing that the aircraft sub-systems and the integrated vehicle are now intensely cyber-physical, with hardware and software components interacting in complex ways that give rise to new and improved capabilities. The work presented here is a step closer to modeling the information flow that exists in realistic SoS optimization problems between sub-contractors, contractors and the SoS architect.
Multi-scale modelling of elastic moduli of trabecular bone
Hamed, Elham; Jasiuk, Iwona; Yoo, Andrew; Lee, YikHan; Liszka, Tadeusz
2012-01-01
We model trabecular bone as a nanocomposite material with hierarchical structure and predict its elastic properties at different structural scales. The analysis involves a bottom-up multi-scale approach, starting with nanoscale (mineralized collagen fibril) and moving up the scales to sub-microscale (single lamella), microscale (single trabecula) and mesoscale (trabecular bone) levels. Continuum micromechanics methods, composite materials laminate theory and finite-element methods are used in the analysis. Good agreement is found between theoretical and experimental results. PMID:22279160
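A bottom-up homogenization step of such a multi-scale scheme can be illustrated with the classical Voigt and Reuss bounds for a two-phase composite, which bracket the effective modulus at each scale; the stiffness values below are illustrative, not the paper's inputs:

```python
def voigt_reuss(E_mineral, E_collagen, vf_mineral):
    """Voigt (iso-strain, upper) and Reuss (iso-stress, lower) bounds on the
    effective elastic modulus of a two-phase mineral/collagen composite."""
    vf_c = 1.0 - vf_mineral
    E_voigt = vf_mineral * E_mineral + vf_c * E_collagen
    E_reuss = 1.0 / (vf_mineral / E_mineral + vf_c / E_collagen)
    return E_voigt, E_reuss

# Representative stiffnesses in GPa (illustrative values only).
upper, lower = voigt_reuss(E_mineral=100.0, E_collagen=1.5, vf_mineral=0.4)
print(round(upper, 2), round(lower, 2))
```

Micromechanics schemes such as Mori-Tanaka or self-consistent estimates, as used in the paper, give predictions that fall between these two bounds.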
ERIC Educational Resources Information Center
Marans, Robert W.; Edelstein, Jack Y.
2010-01-01
Purpose: The purpose of this paper is to determine the behaviors, attitudes, and levels of understanding among faculty, staff, and students in efforts to design programs aimed at reducing energy use in University of Michigan (UM) buildings. Design/methodology/approach: A multi-method approach is used in five diverse pilot buildings including focus…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, L.H., E-mail: Luhui.Han@tum.de; Hu, X.Y., E-mail: Xiangyu.Hu@tum.de; Adams, N.A., E-mail: Nikolaus.Adams@tum.de
In this paper we present a scale separation approach for multi-scale modeling of free-surface and two-phase flows with complex interface evolution. By performing a stimulus-response operation on the level-set function representing the interface, separation of resolvable and non-resolvable interface scales is achieved efficiently. Uniform positive and negative shifts of the level-set function are used to determine non-resolvable interface structures. Non-resolved interface structures are separated from the resolved ones and can be treated by a mixing model or a Lagrangian-particle model in order to preserve mass. Resolved interface structures are treated by the conservative sharp-interface model. Since the proposed scale separation approach does not rely on topological information, unlike in previous work, it can be implemented in a straightforward fashion into a given level-set-based interface model. A number of two- and three-dimensional numerical tests demonstrate that the proposed method is able to cope with complex interface variations accurately and significantly increases robustness against under-resolved interface structures.
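The shift-based detection of non-resolvable structures can be sketched pointwise in one dimension: cells inside the interface that vanish under a uniform positive shift of the level-set function are flagged. This simplification tags individual cells rather than whole structures, so it illustrates the idea only, not the paper's full stimulus-response operation:

```python
import numpy as np

def non_resolvable(phi, delta):
    """Pointwise sketch of shift-based scale separation.

    phi:   signed-distance level-set samples (negative inside the phase).
    delta: shift magnitude, on the order of the cell size.

    Cells inside the interface (phi < 0) that disappear under a uniform
    positive shift of delta are tagged as non-resolvable structure.
    """
    phi = np.asarray(phi)
    eroded = phi + delta        # positive shift shrinks the inside region
    return (phi < 0) & (eroded >= 0)

# One well-resolved structure (depth 2.0) and one thin one (depth 0.4).
phi = np.array([1.0, -2.0, -2.0, 1.0, -0.4, 1.0])
print(non_resolvable(phi, delta=1.0))
```

Only the shallow structure is flagged; the full method then hands such structures to the mixing or Lagrangian-particle model so that their mass is preserved.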
NASA Astrophysics Data System (ADS)
Yoon, J.; Klassert, C. J. A.; Lachaut, T.; Selby, P. D.; Knox, S.; Gorelick, S.; Rajsekhar, D.; Tilmant, A.; Avisse, N.; Harou, J. J.; Gawel, E.; Klauer, B.; Mustafa, D.; Talozi, S.; Sigel, K.
2015-12-01
Our work focuses on development of a multi-agent, hydroeconomic model for purposes of water policy evaluation in Jordan. The model adopts a modular approach, integrating biophysical modules that simulate natural and engineered phenomena with human modules that represent behavior at multiple levels of decision making. The hydrologic modules are developed using spatially-distributed groundwater and surface water models, which are translated into compact simulators for efficient integration into the multi-agent model. For the groundwater model, we adopt a response matrix method approach in which a 3-dimensional MODFLOW model of a complex regional groundwater system is converted into a linear simulator of groundwater response by pre-processing drawdown results from several hundred numerical simulation runs. Surface water models for each major surface water basin in the country are developed in SWAT and similarly translated into simple rainfall-runoff functions for integration with the multi-agent model. The approach balances physically-based, spatially-explicit representation of hydrologic systems with the efficiency required for integration into a complex multi-agent model that is computationally amenable to robust scenario analysis. For the multi-agent model, we explicitly represent human agency at multiple levels of decision making, with agents representing riparian, management, supplier, and water user groups. The agents' decision making models incorporate both rule-based heuristics as well as economic optimization. The model is programmed in Python using Pynsim, a generalizable, open-source object-oriented code framework for modeling network-based water resource systems. The Jordan model is one of the first applications of Pynsim to a real-world water management case study. 
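The response matrix method described above reduces the groundwater model to a linear superposition of precomputed unit responses. A minimal sketch with an invented 2x2 response matrix (a real one would be pre-processed from hundreds of MODFLOW runs):

```python
import numpy as np

# Precomputed response matrix: drawdown at each observation point per unit
# pumping at each well field (illustrative values and units).
R = np.array([[0.8, 0.1],
              [0.2, 0.6]])   # shape (n_obs, n_wells)

def drawdown(pumping):
    """Linear groundwater simulator: superpose unit-pumping responses."""
    return R @ np.asarray(pumping)

# Drawdown at the two observation points for a given pumping scenario.
print(drawdown([10.0, 5.0]))
```

Because the simulator is a single matrix-vector product, it is cheap enough to embed inside a multi-agent model that evaluates many policy scenarios.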
Preliminary results from a tanker market scenario run through year 2050 are presented in which several salient features of the water system are investigated: competition between urban and private farmer agents, the emergence of a private tanker market, disparities in economic wellbeing to different user groups caused by unique supply conditions, and response of the complex system to various policy interventions.
A closed-loop multi-level model of glucose homeostasis
Uluseker, Cansu; Simoni, Giulia; Dauriz, Marco; Matone, Alice
2018-01-01
Background The pathophysiologic processes underlying the regulation of glucose homeostasis are considerably complex at both cellular and systemic level. A comprehensive and structured specification for the several layers of abstraction of glucose metabolism is often elusive, an issue currently solvable with the hierarchical description provided by multi-level models. In this study we propose a multi-level closed-loop model of whole-body glucose homeostasis, coupled with the molecular specifications of the insulin signaling cascade in adipocytes, under the experimental conditions of normal glucose regulation and type 2 diabetes. Methodology/Principal findings The ordinary differential equations of the model, describing the dynamics of glucose and key regulatory hormones and their reciprocal interactions among gut, liver, muscle and adipose tissue, were designed for being embedded in a modular, hierarchical structure. The closed-loop model structure allowed self-sustained simulations to represent an ideal in silico subject that adjusts its own metabolism to the fasting and feeding states, depending on the hormonal context and invariant to circadian fluctuations. The cellular level of the model provided a seamless dynamic description of the molecular mechanisms downstream the insulin receptor in the adipocytes by accounting for variations in the surrounding metabolic context. Conclusions/Significance The combination of a multi-level and closed-loop modeling approach provided a fair dynamic description of the core determinants of glucose homeostasis at both cellular and systemic scales. This model architecture is intrinsically open to incorporate supplementary layers of specifications describing further individual components influencing glucose metabolism. PMID:29420588
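A closed-loop glucose-insulin feedback of the kind described can be sketched with a two-equation toy model integrated by explicit Euler: glucose above basal stimulates insulin, and insulin above basal accelerates glucose clearance. The parameters, units, and meal forcing are illustrative, not those of the published multi-level model:

```python
import numpy as np

def simulate(hours=8.0, dt=0.01, meal_at=2.0):
    """Toy closed-loop glucose-insulin model (illustrative parameters)."""
    G, I = 5.0, 10.0        # glucose (mmol/L), insulin (arbitrary units)
    G_b, I_b = 5.0, 10.0    # basal set points the loop returns to
    trace = []
    for t in np.arange(0.0, hours, dt):
        # Half-hour meal pulse of glucose influx.
        meal = 4.0 if meal_at <= t < meal_at + 0.5 else 0.0
        dG = meal - 0.5 * (G - G_b) - 0.1 * (I - I_b)  # clearance + insulin action
        dI = 0.5 * (G - G_b) - 0.5 * (I - I_b)         # glucose-driven secretion
        G, I = G + dt * dG, I + dt * dI                # explicit Euler step
        trace.append(G)
    return np.array(trace)

glucose = simulate()
print(round(glucose.max(), 1), round(glucose[-1], 1))
```

The closed loop makes the simulation self-sustaining: after the meal perturbation, glucose rises and then relaxes back toward its basal level without any external schedule, mirroring the self-adjusting behavior the abstract describes.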
ERIC Educational Resources Information Center
Sengupta, Atanu; Pal, Naibedya Prasun
2012-01-01
Primary education is essential for the economic development in any country. Most studies give more emphasis to the final output (such as literacy, enrolment etc.) rather than the delivery of the entire primary education system. In this paper, we study the school level data from an Indian district, collected under the official DISE statistics. We…
Peissig, Peggy L; Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B
2012-01-01
There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries.
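The multi-modal union of evidence sources and the PPV validation can be sketched as follows. The record fields (`billing_codes`, `note_concepts`, `ocr_cataract_mention`) and both helper functions are hypothetical stand-ins for the structured query, NLP, and OCR components of the actual eMERGE algorithm.

```python
# Sketch: fuse three evidence channels into one case flag, then compute
# positive predictive value against a manual chart-review gold standard.

def is_case(record):
    structured = "cataract" in record.get("billing_codes", [])
    nlp = any("cataract" in s for s in record.get("note_concepts", []))
    ocr = record.get("ocr_cataract_mention", False)
    return structured or nlp or ocr    # union of modes boosts case capture

def ppv(records, gold):
    flagged = [r["id"] for r in records if is_case(r)]
    tp = sum(1 for rid in flagged if gold[rid])
    return tp / len(flagged) if flagged else 0.0
```

The union rule mirrors why the multi-modal strategy finds more cases than any single mode; validation against chart review then checks that precision is preserved.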
Algorithms for bilevel optimization
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia; Dennis, J. E., Jr.
1994-01-01
General multilevel nonlinear optimization problems arise in design of complex systems and can be used as a means of regularization for multi-criteria optimization problems. Here, for clarity in displaying our ideas, we restrict ourselves to general bi-level optimization problems, and we present two solution approaches. Both approaches use a trust-region globalization strategy, and they can be easily extended to handle the general multilevel problem. We make no convexity assumptions, but we do assume that the problem has a nondegenerate feasible set. We consider necessary optimality conditions for the bi-level problem formulations and discuss results that can be extended to obtain multilevel optimization formulations with constraints at each level.
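A bi-level problem nests a lower-level optimization inside the upper-level objective. The brute-force grid sketch below illustrates only this problem structure, not the trust-region method of the paper; the toy objectives and function names are assumptions.

```python
def inner_argmin(x, ys):
    # lower level: y*(x) = argmin_y (y - x)^2, i.e. y tracks x
    return min(ys, key=lambda y: (y - x) ** 2)

def solve_bilevel(xs, ys):
    # upper level: minimize F(x, y*(x)) over x, with y pinned to the
    # lower-level response for each candidate x
    def F(x):
        y = inner_argmin(x, ys)
        return x ** 2 + (y - 1.0) ** 2
    return min(xs, key=F)

grid = [i / 100 for i in range(-100, 101)]
x_star = solve_bilevel(grid, grid)
```

With y*(x) = x, the upper objective becomes x^2 + (x - 1)^2, minimized at x = 0.5; the grid search recovers this point.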
Earth Science Data Fusion with Event Building Approach
NASA Technical Reports Server (NTRS)
Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.
2015-01-01
Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing, and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab and integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.
NASA Astrophysics Data System (ADS)
Han, W.; Stammer, D.; Meehl, G. A.; Hu, A.; Sienz, F.
2016-12-01
Sea level varies on decadal and multi-decadal timescales over the Indian Ocean. The variations are not spatially uniform, and can deviate considerably from the global mean sea level rise (SLR) due to various geophysical processes. One of these processes is the change of ocean circulation, which can be partly attributed to natural internal modes of climate variability. Over the Indian Ocean, the most influential climate modes on decadal and multi-decadal timescales are the Interdecadal Pacific Oscillation (IPO) and decadal variability of the Indian Ocean dipole (IOD). Here, we first analyze observational datasets to investigate the impacts of the IPO and IOD on spatial patterns of decadal and interdecadal (hereafter decadal) sea level variability and the multi-decadal trend over the Indian Ocean since the 1950s, using a new statistical approach, the Bayesian Dynamical Linear regression Model (DLM). The Bayesian DLM overcomes the limitation of "time-constant (static)" regression coefficients in the conventional multiple linear regression model by allowing the coefficients to vary with time, thereby measuring the "time-evolving (dynamical)" relationship between climate modes and sea level. For the multi-decadal sea level trend since the 1950s, our results show that climate modes and non-climate modes (the part that cannot be explained by climate modes) have comparable contributions in magnitude but with different spatial patterns, each dominating different regions of the Indian Ocean. For decadal variability, climate modes are the major contributors to sea level variations over most regions of the tropical Indian Ocean. The relative importance of the IPO and decadal variability of the IOD, however, varies spatially. For example, while IOD decadal variability dominates the IPO in the eastern equatorial basin (85E-100E, 5S-5N), the IPO dominates the IOD in causing sea level variations in the tropical southwest Indian Ocean (45E-65E, 12S-2S).
To help decipher the possible contribution of external forcing to the multi-decadal sea level trend and decadal variability, we also analyze the model outputs from NCAR's Community Earth System Model (CESM) Large Ensemble Experiments, and compare the results with our observational analyses.
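The core of a dynamic linear model, a regression coefficient that evolves as a random walk and is updated by Kalman filtering, can be sketched for the scalar case. This is a generic textbook DLM recursion, not the authors' Bayesian implementation, and the noise variances `q` and `r` are assumed values.

```python
def dlm_filter(xs, ys, q=0.01, r=1.0):
    """Time-varying regression coefficient via Kalman filtering:
       y_t = beta_t * x_t + v_t,   beta_t = beta_{t-1} + w_t,
       with Var(w_t) = q and Var(v_t) = r."""
    beta, P = 0.0, 1.0          # coefficient estimate and its variance
    path = []
    for x, y in zip(xs, ys):
        P += q                  # predict: random-walk evolution inflates variance
        S = x * P * x + r       # innovation variance
        K = P * x / S           # Kalman gain
        beta += K * (y - x * beta)   # update with the new observation
        P *= (1 - K * x)
        path.append(beta)
    return path
```

Because `q` lets the coefficient drift, the filtered path can track a relationship between a climate mode and sea level that changes over the record, which a static regression cannot.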
NASA Astrophysics Data System (ADS)
Ravi, Koustuban; Wang, Qian; Ho, Seng-Tiong
2015-08-01
We report a new computational model for simulations of electromagnetic interactions with semiconductor quantum well(s) (SQW) in complex electromagnetic geometries using the finite-difference time-domain method. The presented model is based on an approach of spanning a large number of electron transverse momentum states in each SQW sub-band (multi-band) with a small number of discrete multi-electron states (multi-level, multi-electron). This enables accurate and efficient two-dimensional (2-D) and three-dimensional (3-D) simulations of nanophotonic devices with SQW active media. The model includes the following features: (1) Optically induced interband transitions between various SQW conduction and heavy-hole or light-hole sub-bands are considered. (2) Novel intra sub-band and inter sub-band transition terms are derived to thermalize the electron and hole occupational distributions to the correct Fermi-Dirac distributions. (3) The terms in (2) result in an explicit update scheme which circumvents numerically cumbersome iterative procedures. This significantly augments computational efficiency. (4) Explicit update terms to account for carrier leakage to unconfined states are derived, which thermalize the bulk and SQW populations to a common quasi-equilibrium Fermi-Dirac distribution. (5) Auger recombination and intervalence band absorption are included. The model is validated by comparisons to analytic band-filling calculations, simulations of SQW optical gain spectra, and photonic crystal lasers.
Kim, Woo-Keun; Jung, Jinho
2016-06-01
The integration of biomarker responses ranging from the molecular to the individual level is of great interest for measuring the toxic effects of hazardous chemicals or effluent mixtures on aquatic organisms. This study evaluated the effects of wastewater treatment plant (WWTP) effluents on the freshwater pale chub Zacco platypus by using multi-level biomarker responses at molecular [mRNA expression of catalase (CAT), superoxide dismutase (SOD), glutathione S-transferase (GST), and metallothionein (MT)], biochemical (enzyme activities of CAT, SOD, GST, and concentration of MT), and physiological [condition factor (CF) and liver somatic index (LSI)] levels. The mRNA expression levels of GST and MT in Z. platypus from a site downstream of a WWTP significantly increased by 2.2- and 4.5-fold (p<0.05) when compared with those from an upstream site. However, the enzyme activities of CAT, SOD, and GST in fish from the downstream site significantly decreased by 43%, 98%, and 13%, respectively (p<0.05), except for an increase in MT concentration (41%). In addition, a significant increase in LSI (46%) was observed in Z. platypus from the downstream site (p<0.05). Concentrations of Cu, Zn, Cd, and Pb in the liver of Z. platypus were higher (530%, 353%, 800%, and 2,200%, respectively) in fish from a downstream site than in fish from an upstream location, and several multi-level biomarker responses were significantly correlated with the accumulated metals in Z. platypus (p<0.05). Integrated biomarker responses at molecular, biochemical, and physiological levels (multi-level IBR) were much higher (about 4-fold) at the downstream site than at the upstream site. This study suggests that the multi-level IBR approach is very useful for quantifying in situ adverse effects of WWTP effluents. Copyright © 2016 Elsevier Inc. All rights reserved.
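The integration step can be sketched as a star-plot area index in the style of the widely used Beliaeff and Burgeot IBR. The exact formula the authors applied may differ, and `ibr` with its pre-standardized, non-negative inputs is an assumption of this sketch.

```python
import math

def ibr(scores):
    """Integrated Biomarker Response sketch: standardized biomarker scores
       are placed on equally spaced star-plot axes, and the index is the
       summed area of triangles between consecutive axes."""
    s = [max(v, 0.0) for v in scores]      # negative responses clipped to 0
    n = len(s)
    angle = 2 * math.pi / n                # angle between adjacent axes
    return sum(0.5 * s[i] * s[(i + 1) % n] * math.sin(angle) for i in range(n))
```

A roughly 4-fold difference between downstream and upstream sites, as reported, would correspond to a much larger star-plot area at the downstream site.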
Automatic classification and detection of clinically relevant images for diabetic retinopathy
NASA Astrophysics Data System (ADS)
Xu, Xinyu; Li, Baoxin
2008-03-01
We propose a novel approach to automatic classification of Diabetic Retinopathy (DR) images and retrieval of clinically-relevant DR images from a database. Given a query image, our approach first classifies the image into one of three categories: microaneurysm (MA), neovascularization (NV) and normal, and then it retrieves DR images that are clinically relevant to the query image from an archival image database. In the classification stage, the query DR images are classified by the Multi-class Multiple-Instance Learning (McMIL) approach, where images are viewed as bags, each of which contains a number of instances corresponding to non-overlapping blocks, and each block is characterized by low-level features including color, texture, histogram of edge directions, and shape. McMIL first learns a collection of instance prototypes for each class that maximizes the Diverse Density function using the Expectation-Maximization algorithm. A nonlinear mapping is then defined using the instance prototypes, mapping every bag to a point in a new multi-class bag feature space. Finally, a multi-class Support Vector Machine is trained in the multi-class bag feature space. In the retrieval stage, we retrieve images from the archival database that bear the same label as the query image and that are the top K nearest neighbors of the query image in terms of similarity in the multi-class bag feature space. The classification approach achieves high classification accuracy, and the retrieval of clinically-relevant images not only facilitates utilization of the vast amount of hidden diagnostic knowledge in the database, but also improves the efficiency and accuracy of DR lesion diagnosis and assessment.
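The bag-to-feature-space mapping can be sketched as the best similarity between any instance in a bag and each prototype. Here the prototypes are simply given rather than learned by Diverse Density as in the paper, and `bag_embedding` and `sim` are illustrative names.

```python
import math

def bag_embedding(bag, prototypes):
    """Map a bag (list of block feature vectors) to one point whose k-th
       coordinate is the bag's best similarity to prototype k."""
    def sim(a, b):
        # Gaussian-style similarity: 1.0 at the prototype, decaying with distance
        return math.exp(-sum((x - y) ** 2 for x, y in zip(a, b)))
    return [max(sim(inst, p) for inst in bag) for p in prototypes]
```

Once every bag is a fixed-length vector, a standard multi-class SVM (and nearest-neighbor retrieval) can operate in this bag feature space.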
A Hierarchical Model for Simultaneous Detection and Estimation in Multi-subject fMRI Studies
Degras, David; Lindquist, Martin A.
2014-01-01
In this paper we introduce a new hierarchical model for the simultaneous detection of brain activation and estimation of the shape of the hemodynamic response in multi-subject fMRI studies. The proposed approach circumvents a major stumbling block in standard multi-subject fMRI data analysis, in that it allows the shape of the hemodynamic response function to vary across regions and subjects, while still providing a straightforward way to estimate population-level activation. An efficient estimation algorithm is presented, as is an inferential framework that allows not only for tests of activation, but also for tests of deviations from some canonical shape. The model is validated through simulations and application to a multi-subject fMRI study of thermal pain. PMID:24793829
Accelerated sampling by infinite swapping of path integral molecular dynamics with surface hopping
NASA Astrophysics Data System (ADS)
Lu, Jianfeng; Zhou, Zhennan
2018-02-01
To accelerate the thermal equilibrium sampling of multi-level quantum systems, the infinite swapping limit of a recently proposed multi-level ring polymer representation is investigated. In the infinite swapping limit, the ring polymer evolves according to an averaged Hamiltonian with respect to all possible surface index configurations of the ring polymer, and thus connects the surface hopping approach to mean-field path-integral molecular dynamics. A multiscale integrator for the infinite swapping limit is also proposed to enable efficient sampling based on the limiting dynamics. Numerical results demonstrate a substantial improvement in sampling efficiency for infinite swapping compared with direct simulation of path-integral molecular dynamics with surface hopping.
X-framework: Space system failure analysis framework
NASA Astrophysics Data System (ADS)
Newman, John Steven
Space program and space systems failures result in financial losses in the multi-hundred million dollar range every year. In addition to financial loss, space system failures may also represent the loss of opportunity, loss of critical scientific, commercial and/or national defense capabilities, as well as loss of public confidence. The need exists to improve learning and expand the scope of lessons documented and offered to the space industry project team. One of the barriers to incorporating lessons learned is the way in which space system failures are documented. Multiple classes of space system failure information are identified, ranging from "sound bite" summaries in space insurance compendia, to articles in journals, lengthy data-oriented (what happened) reports, and in some rare cases, reports that treat not only the what, but also the why. In addition, there are periodically published "corporate crisis" reports, typically issued after multiple or highly visible failures, that explore management roles in the failure, often within a politically oriented context. Given the general lack of consistency, it is clear that a good multi-level space system/program failure framework with analytical and predictive capability is needed. This research effort set out to develop such a model. The X-Framework (x-fw) is proposed as an innovative forensic failure analysis approach, providing a multi-level understanding of the space system failure event beginning with the proximate cause, extending to the directly related work or operational processes and upward through successive management layers. The x-fw focus is on capability and control at the process level and examines: (1) management accountability and control, (2) resource and requirement allocation, and (3) planning, analysis, and risk management at each level of management.
The x-fw model provides an innovative failure analysis approach for acquiring a multi-level perspective, direct and indirect causation of failures, and generating better and more consistent reports. Through this approach failures can be more fully understood, existing programs can be evaluated and future failures avoided. The x-fw development involved a review of the historical failure analysis and prevention literature, coupled with examination of numerous failure case studies. Analytical approaches included use of a relational failure "knowledge base" for classification and sorting of x-fw elements and attributes for each case. In addition a novel "management mapping" technique was developed as a means of displaying an integrated snapshot of indirect causes within the management chain. Further research opportunities will extend the depth of knowledge available for many of the component level cases. In addition, the x-fw has the potential to expand the scope of space sector lessons learned, and contribute to knowledge management and organizational learning.
Orion Flight Test 1 Architecture: Observed Benefits of a Model Based Engineering Approach
NASA Technical Reports Server (NTRS)
Simpson, Kimberly A.; Sindiy, Oleg V.; McVittie, Thomas I.
2012-01-01
This paper details how a NASA-led team is using a model-based systems engineering approach to capture, analyze and communicate the end-to-end information system architecture supporting the first unmanned orbital flight of the Orion Multi-Purpose Crew Exploration Vehicle. Along with a brief overview of the approach and its products, the paper focuses on the observed program-level benefits, challenges, and lessons learned, all of which may be applied to improve systems engineering tasks for characteristically similar challenges.
Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.
2014-01-01
Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514
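The building-exposure calculation over a modeled inundation surface can be sketched as below. `depth_at` is a stand-in for sampling the MOST model's maximum water level, and the threshold is invented; note that in the study higher sea levels enter the tsunami simulations themselves, so adding an offset directly to the modeled depth, as here, is a crude simplification.

```python
def exposed_buildings(buildings, depth_at, sea_level_rise=0.0, threshold=0.1):
    """Count buildings where modeled inundation depth (plus an illustrative
       sea-level offset) exceeds a damage-relevant threshold in metres."""
    count = 0
    for b in buildings:
        depth = depth_at(b["x"], b["y"]) + sea_level_rise
        if depth > threshold:
            count += 1
    return count
```

Running the same count under present and raised sea levels reproduces the qualitative result that exposure grows markedly with higher initial sea level conditions.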
Chen, Chunhui; Chen, Chuansheng; Moyzis, Robert; Stern, Hal; He, Qinghua; Li, He; Li, Jin; Zhu, Bi; Dong, Qi
2011-01-01
Traditional behavioral genetic studies (e.g., twin, adoption studies) have shown that human personality has moderate to high heritability, but recent molecular behavioral genetic studies have failed to identify quantitative trait loci (QTL) with consistent effects. The current study adopted a multi-step approach (ANOVA followed by multiple regression and permutation) to assess the cumulative effects of multiple QTLs. Using a system-level (dopamine system) genetic approach, we investigated a personality trait deeply rooted in the nervous system (the Highly Sensitive Personality, HSP). 480 healthy Chinese college students were given the HSP scale and genotyped for 98 representative polymorphisms in all major dopamine neurotransmitter genes. In addition, two environmental factors (stressful life events and parental warmth) that have been implicated for their contributions to personality development were included to investigate their relative contributions as compared to genetic factors. In Step 1, using ANOVA, we identified 10 polymorphisms that made statistically significant contributions to HSP. In Step 2, these polymorphisms' main effects and interactions were assessed using multiple regression. This model accounted for 15% of the variance of HSP (p<0.001). Recent stressful life events accounted for an additional 2% of the variance. Finally, permutation analyses ascertained that the probability of obtaining these findings by chance was very low, p ranging from 0.001 to 0.006. Dividing these loci by the subsystems of dopamine synthesis, degradation/transport, receptor and modulation, we found that the modulation and receptor subsystems made the most significant contribution to HSP. The results of this study demonstrate the utility of a multi-step neuronal system-level approach in assessing genetic contributions to individual differences in human behavior.
It can potentially bridge the gap between the high heritability estimates based on traditional behavioral genetics and the lack of reproducible genetic effects observed currently from molecular genetic studies.
PMID:21765900
Zaritsky, Assaf; Natan, Sari; Horev, Judith; Hecht, Inbal; Wolf, Lior; Ben-Jacob, Eshel; Tsarfaty, Ilan
2011-01-01
Confocal microscopy analysis of fluorescence and morphology is becoming the standard tool in cell biology and molecular imaging. Accurate quantification algorithms are required to enhance the understanding of different biological phenomena. We present a novel approach based on image-segmentation of multi-cellular regions in bright field images demonstrating enhanced quantitative analyses and better understanding of cell motility. We present MultiCellSeg, a segmentation algorithm to separate between multi-cellular and background regions for bright field images, which is based on classification of local patches within an image: a cascade of Support Vector Machines (SVMs) is applied using basic image features. Post processing includes additional classification and graph-cut segmentation to reclassify erroneous regions and refine the segmentation. This approach leads to a parameter-free and robust algorithm. Comparison to an alternative algorithm on wound healing assay images demonstrates its superiority. The proposed approach was used to evaluate common cell migration models such as wound healing and scatter assay. It was applied to quantify the acceleration effect of Hepatocyte growth factor/scatter factor (HGF/SF) on healing rate in a time lapse confocal microscopy wound healing assay and demonstrated that the healing rate is linear in both treated and untreated cells, and that HGF/SF accelerates the healing rate by approximately two-fold. A novel fully automated, accurate, zero-parameters method to classify and score scatter-assay images was developed and demonstrated that multi-cellular texture is an excellent descriptor to measure HGF/SF-induced cell scattering. We show that exploitation of textural information from differential interference contrast (DIC) images on the multi-cellular level can prove beneficial for the analyses of wound healing and scatter assays. 
The proposed approach is generic and can be used alone or alongside traditional fluorescence single-cell processing to perform objective, accurate quantitative analyses for various biological applications. PMID:22096600
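The patch-classification-then-refinement pipeline can be sketched with a toy texture score standing in for the SVM cascade, and a neighbour majority vote standing in for the graph-cut refinement; all names and thresholds here are assumptions of the sketch.

```python
# Patch-level foreground/background sketch in the spirit of MultiCellSeg:
# multi-cellular regions are textured, background is flat.

def patch_labels(image, patch=4, var_thresh=0.01):
    h, w = len(image), len(image[0])
    labels = {}
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            vals = [image[a][b] for a in range(i, min(i + patch, h))
                                for b in range(j, min(j + patch, w))]
            m = sum(vals) / len(vals)
            var = sum((v - m) ** 2 for v in vals) / len(vals)
            labels[(i // patch, j // patch)] = var > var_thresh  # textured = cells
    return labels

def smooth(labels):
    # reclassify each patch by majority vote of its 3x3 neighbourhood
    out = {}
    for (r, c), lab in labels.items():
        neigh = [labels.get((r + dr, c + dc), lab)   # off-grid: keep own label
                 for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
        out[(r, c)] = sum(neigh) > len(neigh) // 2
    return out
```

The two-stage structure (local classification, then contextual reclassification) is the part the sketch preserves; the real algorithm uses local image features, an SVM cascade, and graph-cut segmentation.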
Guo, P; Huang, G H
2009-01-01
In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating uncertainties expressed as multiple uncertainties of intervals and dual probability distributions within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets and their incorporation; thirdly, it facilitates dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, the penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.
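The two-stage-with-recourse idea, commit to a first-stage decision and then pay scenario-dependent penalties when realized waste exceeds it, can be sketched deterministically. The intervals, fuzzy sets, and dual probabilities of the full IFCTIP method are omitted here, and all cost numbers are invented for illustration.

```python
def expected_cost(capacity, build_cost, scenarios, penalty):
    """Two-stage sketch: pay build_cost per unit of first-stage capacity,
       then a recourse penalty for waste exceeding capacity in each
       scenario, weighted by scenario probability."""
    first = build_cost * capacity
    recourse = sum(p * penalty * max(demand - capacity, 0.0)
                   for p, demand in scenarios)
    return first + recourse

def best_capacity(candidates, build_cost, scenarios, penalty):
    # choose the first-stage decision with minimal expected total cost
    return min(candidates,
               key=lambda c: expected_cost(c, build_cost, scenarios, penalty))
```

When the recourse penalty is steep relative to the build cost, the optimal plan over-builds capacity, which is the trade-off the penalty terms of the full model let planners explore.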
NASA Astrophysics Data System (ADS)
Phenglengdi, Butsari
This research evaluates the use of a molecular level visualisation approach in Thai secondary schools. The goal is to obtain insights about the usefulness of this approach, and to examine possible improvements in how the approach might be applied in the future. The research methodology combined qualitative and quantitative approaches. Data were collected in the form of pre- and post-intervention multiple choice questions, open-ended questions, drawing exercises, one-to-one interviews and video recordings of class activity. The research was conducted in two phases, involving a total of 261 students from the 11th Grade in Thailand. The use of VisChem animations in three studies was evaluated in Phase I. Study 1 was a pilot study exploring the benefits of incorporating VisChem animations to portray the molecular level. Study 2 compared test results between students exposed to these animations of molecular level events and those not exposed. Finally, in Study 3, test results were gathered from different types of schools (a rural school, a city school, and a university school). The results showed that students (and teachers) held misconceptions at the molecular level, and that VisChem animations could help students understand chemistry concepts at the molecular level across all three types of schools. While the animation group scored better on the topic of states of water, the non-animation group scored better on the topic of dissolving sodium chloride in water. The molecular level visualisation approach as a learning design was evaluated in Phase II. This approach involved a combination of VisChem animations, pictures, and diagrams together with the seven-step VisChem learning design.
The study involved three classes of students, each with a different treatment: Class A - Traditional approach; Class B - VisChem animations with traditional approach; and Class C - Molecular level visualisation approach. Pre-test and post-test scores were compared across the three classes. The results from the multiple choice and calculation tests showed that the Class C molecular level visualisation group demonstrated a deeper understanding of chemistry concepts than students in Classes A and B. However, students in all classes performed unsatisfactorily on the calculation tests because they had insufficient prior knowledge of stoichiometry to connect with the new material. In the drawing tests the students exposed to the molecular level visualisation approach developed better mental models than the other classes, albeit with some remaining misconceptions. The findings highlight the intersecting roles of the teacher, the student, and modelling in chemistry teaching. A multi-step molecular level visualisation approach that encourages observation, reflection on prior understanding, and multiple opportunities for viewing (and using various visualisation elements) is key to developing a deeper understanding of chemistry. It must be coupled with careful consideration of student prior knowledge, and with adequate guidance from a teacher who understands the topics at a deep level.
NASA Astrophysics Data System (ADS)
Bosse, Stefan
2013-05-01
Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication, with a strong focus on microchip-level implementation to meet the goals of miniaturization and operation in low-power energy environments, a prerequisite for autonomous behaviour. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents decide for themselves which actions to perform, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level: it requires fully equipped computers and communication structures, and the hardware architecture does not reflect the requirements of agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems, together with a synthesis methodology and framework for multi-agent systems implementable entirely at the microchip level with resource- and power-constrained digital logic, supporting Agent-on-Chip (AoC) architectures. The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic.
The agent behaviour, interaction (communication), and mobility features are modelled and specified on a machine-independent abstract programming level using a state-based agent behaviour language (APL). With this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system with the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented at the node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used for example for the design and implementation of the APL compiler.
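As a rough illustration of the tuple-space interaction model described above, here is a minimal Python stand-in (the real system is synthesized to register-transfer logic; the operation names out/rd/take and the tuple layout are our own assumptions, not the paper's API):

```python
# Toy tuple space: agents coordinate through pattern-matched tuples in a
# shared node-level store rather than by direct point-to-point messaging.
class TupleSpace:
    def __init__(self):
        self.tuples = []

    def out(self, t):
        # Publish a tuple into the space.
        self.tuples.append(t)

    def rd(self, pattern):
        # Non-destructive read; None acts as a wildcard field.
        for t in self.tuples:
            if len(t) == len(pattern) and all(
                    p is None or p == v for p, v in zip(pattern, t)):
                return t
        return None

    def take(self, pattern):
        # Destructive read (named "take" since "in" is a Python keyword).
        t = self.rd(pattern)
        if t is not None:
            self.tuples.remove(t)
        return t

ts = TupleSpace()
ts.out(("strain", 7, 0.0042))        # a sensor agent publishes a reading
print(ts.rd(("strain", 7, None)))    # → ('strain', 7, 0.0042)
```

Decoupling producers from consumers this way is what gives the approach its fault tolerance: a reading survives in the store even if the agent that produced it has migrated or failed.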
Wiklund, Urban; Karlsson, Marcus; Ostlund, Nils; Berglin, Lena; Lindecrantz, Kaj; Karlsson, Stefan; Sandsjö, Leif
2007-06-01
Intermittent disturbances are common in ECG signals recorded with smart clothing, mainly because of displacement of the electrodes over the skin. We evaluated a novel adaptive method for spatio-temporal filtering for heartbeat detection in noisy multi-channel ECGs, including short signal interruptions in single channels. Using multi-channel database recordings (12-channel ECGs from 10 healthy subjects), the results showed that multi-channel spatio-temporal filtering outperformed regular independent component analysis. We also recorded seven channels of ECG using a T-shirt with textile electrodes. Ten healthy subjects performed different sequences during a 10-min recording: resting, standing, flexing chest muscles, walking and pushups. Using adaptive multi-channel filtering, the sensitivity and precision were above 97% in nine subjects. Adaptive multi-channel spatio-temporal filtering can be used to detect heartbeats in ECGs with high noise levels. One application is heartbeat detection in noisy ECG recordings obtained by integrated textile electrodes in smart clothing.
Multi-User Hardware Solutions to Combustion Science ISS Research
NASA Technical Reports Server (NTRS)
Otero, Angel M.
2001-01-01
In response to the budget environment and to expand on the International Space Station (ISS) Fluids and Combustion Facility (FCF) Combustion Integrated Rack (CIR) common hardware approach, the NASA Combustion Science Program shifted focus in 1999 from single-investigator PI (Principal Investigator)-specific hardware to multi-user 'mini-facilities'. These mini-facilities would take the CIR common hardware philosophy to the next level. The approach that was developed re-arranged all the investigations in the program into sub-fields of research. Common requirements within these sub-fields were then used to develop a common system complemented by a few PI-specific components. The sub-fields of research selected were droplet combustion, solids and fire safety, and gaseous fuels. From these research areas three mini-facilities have sprung: the Multi-user Droplet Combustion Apparatus (MDCA) for droplet research, the Flow Enclosure for Novel Investigations in Combustion of Solids (FEANICS) for solids and fire safety, and the Multi-user Gaseous Fuels Apparatus (MGFA) for gaseous fuels. These mini-facilities will develop common Chamber Insert Assemblies (CIA) and diagnostics for the respective investigators, complementing the capability provided by CIR. Presently there are four investigators for MDCA, six for FEANICS, and four for MGFA. The goal of these multi-user facilities is to drive the cost per PI down after the initial development investment is made. Each of these mini-facilities will become a fixture of future Combustion Science NASA Research Announcements (NRAs), enabling investigators to propose against an existing capability. Additionally, investigators are given the opportunity to enhance the existing capability to bridge the gap between the capability and their specific science requirements.
This multi-user development approach will enable the Combustion Science Program to drive cost per investigation down while drastically reducing the time required to go from selection to space flight.
Wiltshire, Travis J; Lobato, Emilio J C; McConnell, Daniel S; Fiore, Stephen M
2014-01-01
In this paper we suggest that differing approaches to the science of social cognition mirror the arguments between radical embodied and traditional approaches to cognition. We contrast the use in social cognition of theoretical inference and mental simulation mechanisms with approaches emphasizing a direct perception of others' mental states. We build from a recent integrative framework unifying these divergent perspectives through the use of dual-process theory and supporting social neuroscience research. Our elaboration considers two complementary notions of direct perception: one primarily stemming from ecological psychology and the other from enactive cognition theory. We use this as the foundation from which to offer an account of the informational basis for social information and assert a set of research propositions to further the science of social cognition. In doing so, we point out how perception of the minds of others can be supported in some cases by lawful information, supporting direct perception of social affordances and perhaps, mental states, and in other cases by cues that support indirect perceptual inference. Our goal is to extend accounts of social cognition by integrating advances across disciplines to provide a multi-level and multi-theoretic description that can advance this field and offer a means through which to reconcile radical embodied and traditional approaches to cognitive neuroscience.
Foundations of modelling of nonequilibrium low-temperature plasmas
NASA Astrophysics Data System (ADS)
Alves, L. L.; Bogaerts, A.; Guerra, V.; Turner, M. M.
2018-02-01
This work explains the need for plasma models, introduces arguments for choosing the type of model that better fits the purpose of each study, and presents the basics of the most common nonequilibrium low-temperature plasma models and the information available from each one, along with an extensive list of references for complementary in-depth reading. The paper presents the following models, organised according to the level of multi-dimensional description of the plasma: kinetic models, based on either a statistical particle-in-cell/Monte-Carlo approach or the solution to the Boltzmann equation (in the latter case, special focus is given to the description of the electron kinetics); multi-fluid models, based on the solution to the hydrodynamic equations; global (spatially-average) models, based on the solution to the particle and energy rate-balance equations for the main plasma species, usually including a very complete reaction chemistry; mesoscopic models for plasma-surface interaction, adopting either a deterministic approach or a stochastic dynamical Monte-Carlo approach. For each plasma model, the paper puts forward the physics context, introduces the fundamental equations, presents advantages and limitations, also from a numerical perspective, and illustrates its application with some examples. Whenever pertinent, the interconnection between models is also discussed, in view of multi-scale hybrid approaches.
23 CFR 972.214 - Federal lands congestion management system (CMS).
Code of Federal Regulations, 2011 CFR
2011-04-01
... management strategies; (v) Determine methods to monitor and evaluate the performance of the multi-modal... means the level at which transportation system performance is no longer acceptable due to traffic... improve existing transportation system efficiency. Approaches may include the use of alternate mode...
23 CFR 972.214 - Federal lands congestion management system (CMS).
Code of Federal Regulations, 2010 CFR
2010-04-01
... management strategies; (v) Determine methods to monitor and evaluate the performance of the multi-modal... means the level at which transportation system performance is no longer acceptable due to traffic... improve existing transportation system efficiency. Approaches may include the use of alternate mode...
ERIC Educational Resources Information Center
Galey, Sarah; Youngs, Peter
2014-01-01
Scholars have developed a wide range of theories to explain both stability and change in policy subsystems. In recent years, a burgeoning literature has emerged that focuses on the application of network analysis in policy research, more formally known as Policy Network Analysis (PNA). This approach, while still developing, has great potential as…
Haller, Sven; Lovblad, Karl-Olof; Giannakopoulos, Panteleimon; Van De Ville, Dimitri
2014-05-01
Many diseases are associated with systematic modifications in brain morphometry and function. These alterations may be subtle, in particular at early stages of the disease progress, and thus not evident by visual inspection alone. Group-level statistical comparisons have dominated neuroimaging studies for many years, providing fascinating insights into brain regions involved in various diseases. However, such group-level results do not warrant diagnostic value for individual patients. Recently, pattern recognition approaches have led to a fundamental shift in paradigm, bringing multivariate analysis and predictive results, notably for the early diagnosis of individual patients. We review the state-of-the-art fundamentals of pattern recognition including feature selection, cross-validation and classification techniques, as well as limitations including inter-individual variation in normal brain anatomy and neurocognitive reserve. We conclude with the discussion of future trends including multi-modal pattern recognition, multi-center approaches with data-sharing and cloud-computing.
Diagnostics in the Extendable Integrated Support Environment (EISE)
NASA Technical Reports Server (NTRS)
Brink, James R.; Storey, Paul
1988-01-01
Extendable Integrated Support Environment (EISE) is a real-time computer network consisting of commercially available hardware and software components to support systems-level integration, modification, and enhancement of weapons systems. The EISE approach offers substantial potential savings by eliminating unique support environments in favor of sharing common modules for the support of operational weapon systems. An expert system is being developed to help diagnose faults in this network. This is a multi-level, multi-expert diagnostic system that uses experiential knowledge relating symptoms to faults and also reasons from structural and functional models of the underlying physical system when experiential reasoning is inadequate. The individual expert systems are orchestrated by a supervisory reasoning controller, a meta-level reasoner which plans the sequence of reasoning steps to solve the given specific problem. The overall system, termed the Diagnostic Executive, accesses systems-level performance checks and error reports, and issues remote test procedures to formulate and confirm fault hypotheses.
Axelsson, Robert; Angelstam, Per; Myhrman, Lennart; Sädbom, Stefan; Ivarsson, Milis; Elbakidze, Marine; Andersson, Kenneth; Cupa, Petr; Diry, Christian; Doyon, Frederic; Drotz, Marcus K; Hjorth, Arne; Hermansson, Jan Olof; Kullberg, Thomas; Lickers, F Henry; McTaggart, Johanna; Olsson, Anders; Pautov, Yurij; Svensson, Lennart; Törnblom, Johan
2013-03-01
Implementing policies on sustainable landscapes and rural development necessitates social learning about states and trends of sustainability indicators, norms that define sustainability, and adaptive multi-level governance. We evaluate the extent to which social learning at multiple governance levels for sustainable landscapes occurs in 18 local development initiatives in the network of Sustainable Bergslagen in Sweden. We mapped activities over time and interviewed key actors in the network about social learning. While the activities resulted in exchanges of experience and some local solutions, a major challenge was to secure systematic social learning and make new knowledge explicit at multiple levels. None of the development initiatives used a systematic approach to secure social learning, and sustainability assessments were not made systematically. We discuss how social learning can be improved, and how a learning network of development initiatives could be realized.
Ghumare, Eshwar; Schrooten, Maarten; Vandenberghe, Rik; Dupont, Patrick
2015-08-01
Kalman filter approaches are widely applied to derive time-varying effective connectivity from electroencephalographic (EEG) data. For multi-trial data, a classical Kalman filter (CKF), designed for the estimation of single-trial data, can be implemented by trial-averaging the data or by averaging single-trial estimates. A general linear Kalman filter (GLKF) provides an extension for multi-trial data. In this work, we studied the performance of the different Kalman filtering approaches for different values of signal-to-noise ratio (SNR), number of trials and number of EEG channels. We used a simulated model from which we calculated scalp recordings. From these recordings, we estimated cortical sources. Multivariate autoregressive model parameters and partial directed coherence were calculated for these estimated sources and compared with the ground truth. The results showed an overall superior performance of the GLKF except for low levels of SNR and low numbers of trials.
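For readers unfamiliar with the CKF baseline, a minimal scalar random-walk Kalman filter applied to trial-averaged data can be sketched as follows (toy values throughout; the study's filters estimate multivariate autoregressive parameters, not a scalar level):

```python
# Scalar Kalman filter with a random-walk state model: predict, then blend
# the prediction with each measurement according to the Kalman gain.
def kalman_1d(zs, q=1e-3, r=0.1):
    x, p = 0.0, 1.0                 # state estimate and its variance
    out = []
    for z in zs:
        p = p + q                   # predict: variance grows by process noise q
        k = p / (p + r)             # Kalman gain (r = measurement noise)
        x = x + k * (z - x)         # update with measurement z
        p = (1 - k) * p
        out.append(x)
    return out

trials = [[1.0, 1.2, 0.8], [0.9, 1.1, 1.0]]          # two noisy trials
avg = [sum(col) / len(col) for col in zip(*trials)]  # trial-averaged signal
est = kalman_1d(avg)
print(round(est[-1], 3))                             # → 0.968
```

Trial-averaging before filtering (as here) and averaging per-trial estimates after filtering are the two CKF strategies the abstract contrasts with the GLKF.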
Lloret, Juan; Sancho, Juan; Pu, Minhao; Gasulla, Ivana; Yvind, Kresten; Sales, Salvador; Capmany, José
2011-06-20
A complex-valued multi-tap tunable microwave photonic filter based on a single silicon-on-insulator microring resonator is presented. The degree of tunability of the approach is characterized theoretically and experimentally for two-, three- and four-tap configurations. The constraints of exploiting the optical phase transfer function of a microring resonator to implement complex-valued multi-tap filtering schemes are also reported. The trade-off between the degree of tunability without changing the free spectral range and the number of taps is studied in depth. Different window-based scenarios are evaluated for improving the filter performance in terms of the side-lobe level.
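The role of complex-valued taps can be illustrated independently of the photonic implementation: in a generic N-tap filter H(f) = sum_k a_k exp(-j 2 pi f k T), a phase ramp across the taps shifts the passband without altering the free spectral range 1/T. A small numerical sketch (illustrative only, not the microring device):

```python
import cmath
import math

def response(taps, f, T=1.0):
    # H(f) = sum_k a_k * exp(-j*2*pi*f*k*T) for tap weights a_k and delay T
    return sum(a * cmath.exp(-2j * math.pi * f * k * T)
               for k, a in enumerate(taps))

uniform = [1, 1, 1, 1]                                         # real taps
shifted = [cmath.exp(1j * math.pi * k / 2) for k in range(4)]  # phase ramp

print(round(abs(response(uniform, 0.0)), 6))    # → 4.0 (peak at f = 0)
print(round(abs(response(shifted, 0.25)), 6))   # → 4.0 (same peak, moved to f = 0.25)
```

Because only the tap phases changed, the passband moved while the tap count, and hence the free spectral range, stayed fixed, which is exactly the trade-off space the abstract explores.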
Feng, Yan; Mitchison, Timothy J; Bender, Andreas; Young, Daniel W; Tallarico, John A
2009-07-01
Multi-parameter phenotypic profiling of small molecules provides important insights into their mechanisms of action, as well as a systems-level understanding of biological pathways and their responses to small molecule treatments. It therefore deserves more attention at an early step in the drug discovery pipeline. Here, we summarize the technologies that are currently in use for phenotypic profiling, including mRNA-, protein- and imaging-based multi-parameter profiling, in the drug discovery context. We think that an earlier integration of phenotypic profiling technologies, combined with effective experimental and in silico target identification approaches, can improve success rates of lead selection and optimization in the drug discovery process.
Model selection and assessment for multi-species occupancy models
Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.
2016-01-01
While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.
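The cross-validation idea that performed best in the study can be sketched in miniature: hold out part of the data, fit each candidate model on the rest, and score held-out points by log predictive density. The toy detection data and the two candidate estimators below are our own assumptions, not the MSOMs of the paper:

```python
import math

detections = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]    # hypothetical detection history

def cv_log_score(data, fit, k=5):
    # k-fold cross-validation: sum log predictive density over held-out folds.
    score = 0.0
    for i in range(k):
        test = data[i::k]                      # every k-th point held out
        train = [d for j, d in enumerate(data) if j % k != i]
        p = fit(train)                         # detection probability from fold
        score += sum(math.log(p if d else 1 - p) for d in test)
    return score

mle = lambda train: sum(train) / len(train)                    # plug-in estimate
smoothed = lambda train: (sum(train) + 1) / (len(train) + 2)   # Laplace-smoothed

s_mle = cv_log_score(detections, mle)
s_smooth = cv_log_score(detections, smoothed)
print(round(s_mle, 3), round(s_smooth, 3))     # → -8.653 -7.844
```

Here the smoothed model predicts held-out detections better (higher log score), the same kind of out-of-sample comparison the authors used to rank candidate occupancy models.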
Simeonov, Plamen L
2017-12-01
The goal of this paper is to advance an extensible theory of living systems using an approach to biomathematics and biocomputation that suitably addresses self-organized, self-referential and anticipatory systems with multi-temporal multi-agents. Our first step is to provide foundations for modelling of emergent and evolving dynamic multi-level organic complexes and their sustentative processes in artificial and natural life systems. Main applications are in life sciences, medicine, ecology and astrobiology, as well as robotics, industrial automation, man-machine interface and creative design. Since 2011 over 100 scientists from a number of disciplines have been exploring a substantial set of theoretical frameworks for a comprehensive theory of life known as Integral Biomathics. That effort identified the need for a robust core model of organisms as dynamic wholes, using advanced and adequately computable mathematics. The work described here for that core combines the advantages of a situation and context aware multivalent computational logic for active self-organizing networks, Wandering Logic Intelligence (WLI), and a multi-scale dynamic category theory, Memory Evolutive Systems (MES), hence WLIMES. This is presented to the modeller via a formal augmented reality language as a first step towards practical modelling and simulation of multi-level living systems. Initial work focuses on the design and implementation of this visual language and calculus (VLC) and its graphical user interface. The results will be integrated within the current methodology and practices of theoretical biology and (personalized) medicine to deepen and to enhance the holistic understanding of life.
Kumar, Dushyant; Hariharan, Hari; Faizy, Tobias D; Borchert, Patrick; Siemonsen, Susanne; Fiehler, Jens; Reddy, Ravinder; Sedlacik, Jan
2018-05-12
We present a computationally feasible and iterative multi-voxel spatially regularized algorithm for myelin water fraction (MWF) reconstruction. This method utilizes 3D spatial correlations present in anatomical/pathological tissues and the underlying B1+ (flip angle) inhomogeneity to enhance the noise robustness of the reconstruction, while intrinsically accounting for stimulated echo contributions using T2-distribution data alone. Simulated data and in vivo data acquired using 3D non-selective multi-echo spin echo (3DNS-MESE) were used to compare the reconstruction quality of the proposed approach against those of the popular method by Prasloski et al. and our previously proposed 2D multi-slice spatial regularization approach. We also investigated whether inter-sequence correlations and agreements improved as a result of the proposed approach. MWF quantifications from two sequences, 3DNS-MESE vs 3DNS-gradient and spin echo (3DNS-GRASE), were compared for both reconstruction approaches to assess correlations and agreements between inter-sequence MWF-value pairs. MWF values from whole-brain data of six volunteers and two multiple sclerosis patients are reported as well. In comparison with the competing approaches, the proposed method showed better agreement with simulated truths in regression and Bland-Altman analyses. For 3DNS-MESE data, MWF maps reconstructed using the proposed algorithm provided better depictions of white matter structures in subcortical areas adjoining gray matter, which agreed more closely with corresponding contrasts on T2-weighted images than MWF maps reconstructed with the method by Prasloski et al. We also achieved a higher level of correlation and agreement between inter-sequence (3DNS-MESE vs 3DNS-GRASE) MWF-value pairs.
The proposed algorithm provides more noise-robust fits to T2-decay data and improves MWF quantification in white matter structures, especially in the sub-cortical white matter and major white matter tract regions.
Maruthur, Nisa; Mathioudakis, Nestoras; Spanakis, Elias; Rubin, Daniel; Zilbermint, Mihail; Hill-Briggs, Felicia
2017-01-01
Purpose of Review: The goal of this review is to describe diabetes within a population health improvement framework and to review the evidence for a diabetes population health continuum of intervention approaches, including diabetes prevention and chronic and acute diabetes management, to improve clinical and economic outcomes. Recent Findings: Recent studies have shown that, compared to usual care, lifestyle interventions in prediabetes lower diabetes risk at the population level, and that group-based programs have low incremental cost-effectiveness ratios for health systems. Effective outpatient interventions that improve diabetes control and process outcomes are multi-level, targeting the patient, provider, and healthcare system simultaneously, and integrate community health workers as liaisons between patients and community-based healthcare resources. A multi-faceted approach to diabetes management is also effective in the inpatient setting. Interventions shown to promote safe and effective glycemic control and use of evidence-based glucose management practices include provider reminder and clinical decision support systems, automated computer order entry, provider education, and organizational change. Summary: Future studies should examine the cost-effectiveness of multi-faceted outpatient and inpatient diabetes management programs to determine the best financial models for incorporating them into diabetes population health strategies. PMID:28567711
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan
In this study we developed an efficient Bayesian inversion framework for interpreting marine seismic amplitude versus angle (AVA) and controlled source electromagnetic (CSEM) data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo (MCMC) sampler, which is a hybrid of DiffeRential Evolution Adaptive Metropolis (DREAM) and Adaptive Metropolis (AM) samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and CSEM data. The multi-chain MCMC is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic AVA and CSEM joint inversion provides better estimation of reservoir saturations than the seismic AVA-only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated; reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
NASA Astrophysics Data System (ADS)
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan; Huang, Maoyi; Bao, Jie; Swiler, Laura
2017-12-01
In this study we developed an efficient Bayesian inversion framework for interpreting marine seismic Amplitude Versus Angle and Controlled-Source Electromagnetic data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo sampler, which is a hybrid of DiffeRential Evolution Adaptive Metropolis and Adaptive Metropolis samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and Controlled-Source Electromagnetic data. The multi-chain Markov-chain Monte Carlo is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic Amplitude Versus Angle and Controlled-Source Electromagnetic joint inversion provides better estimation of reservoir saturations than the seismic Amplitude Versus Angle only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated - reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
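A stripped-down stand-in for the multi-chain idea can be sketched with plain random-walk Metropolis chains (rather than the DREAM/AM hybrid of the study) sampling a toy Gaussian target; all settings below are illustrative assumptions:

```python
import math
import random

random.seed(42)

def log_target(x):
    # Standard normal log-density (toy stand-in for the reservoir posterior).
    return -0.5 * x * x

def run_chain(n, x0, step=1.0):
    # Random-walk Metropolis: propose, then accept with probability
    # min(1, target(prop)/target(x)).
    x, samples = x0, []
    for _ in range(n):
        prop = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_target(prop) - log_target(x):
            x = prop
        samples.append(x)
    return samples

# Three independent chains from dispersed starting points, as in
# multi-chain calibration; each chain could run on its own worker.
chains = [run_chain(2000, x0) for x0 in (-3.0, 0.0, 3.0)]
pooled = [s for c in chains for s in c[500:]]   # discard burn-in, then pool
mean = sum(pooled) / len(pooled)
print(round(mean, 2))                           # expected near the true mean 0
```

Because the chains are independent given their starting points, adding chains adds samples at essentially no synchronization cost, which is the near-linear scalability the abstract reports.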
Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.
2014-01-01
Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer's disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer's Disease Neuroimaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer's Disease Research Center (W-ADRC), focusing on individuals labeled as having AD, mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060
Galle, J; Hoffmann, M; Aust, G
2009-01-01
Collective phenomena in multi-cellular assemblies can be approached on different levels of complexity. Here, we discuss a number of mathematical models which consider the dynamics of each individual cell, so-called agent-based or individual-based models (IBMs). As a special feature, these models make it possible to account for intracellular decision processes which are triggered by biomechanical cell-cell or cell-matrix interactions. We discuss their impact on the growth and homeostasis of multi-cellular systems as simulated by lattice-free models. Our results demonstrate that cell polarisation subsequent to cell-cell contact formation can be a source of stability in epithelial monolayers. Stroma contact-dependent regulation of tumour cell proliferation and migration is shown to result in invasion dynamics in accordance with the migrating cancer stem cell hypothesis. However, we demonstrate that different regulation mechanisms can comply equally well with present experimental results. Thus, we suggest a panel of experimental studies for the in-depth validation of the model assumptions.
Gorsevski, Pece V; Donevska, Katerina R; Mitrovski, Cvetko D; Frizado, Joseph P
2012-02-01
This paper presents a GIS-based multi-criteria decision analysis approach for evaluating the suitability for landfill site selection in the Polog Region, Macedonia. The multi-criteria decision framework considers environmental and economic factors which are standardized by fuzzy membership functions and combined by integration of analytical hierarchy process (AHP) and ordered weighted average (OWA) techniques. The AHP is used for the elicitation of attribute weights while the OWA operator function is used to generate a wide range of decision alternatives for addressing uncertainty associated with interaction between multiple criteria. The usefulness of the approach is illustrated by different OWA scenarios that report landfill suitability on a scale between 0 and 1. The OWA scenarios are intended to quantify the level of risk taking (i.e., optimistic, pessimistic, and neutral) and to facilitate a better understanding of patterns that emerge from decision alternatives involved in the decision making process.
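The OWA step can be sketched in a few lines (the scores and weights below are hypothetical, not the paper's criteria): the positional weights apply to the sorted scores, so they encode the decision maker's risk attitude rather than the importance of any particular criterion.

```python
def owa(scores, weights):
    # Ordered weighted averaging: sort criteria scores best-first, then
    # weight by position; the weights model optimism vs pessimism.
    ordered = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(weights, ordered))

site = [0.9, 0.4, 0.7]                         # standardized criteria scores

print(owa(site, [1.0, 0.0, 0.0]))              # → 0.9 (optimistic: the max)
print(owa(site, [0.0, 0.0, 1.0]))              # → 0.4 (pessimistic: the min)
print(round(owa(site, [1/3, 1/3, 1/3]), 3))    # → 0.667 (neutral: the mean)
```

Sweeping the weights between these extremes generates the spectrum of decision alternatives, from risk-seeking to risk-averse, that the OWA scenarios in the paper explore.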
Eklund, Katie; Tanner, Nick; Stoll, Katie; Anway, Leslie
2015-06-01
The purpose of the current investigation was to compare 1,206 gifted and nongifted elementary students on the identification of emotional and behavioral risk (EBR) as rated by teachers and parents using a multigate, multi-informant approach to assessment. The Parent and Teacher Behavioral Assessment System for Children, Second Edition (BASC-2) and the Behavioral and Emotional Screening System were used to assess behavioral functioning as rated by teachers and parents. There were significant differences between the number of gifted and nongifted children demonstrating emotional and behavioral risk, with parents and teachers identifying a higher number of boys and nongifted children as at risk. Among children demonstrating EBR, gifted children demonstrated elevated internalizing behaviors as rated by parents. Gifted students demonstrated higher academic performance regardless of risk level, suggesting higher cognitive abilities may be one of several protective factors that serve to attenuate the development of other social, emotional, or behavioral concerns. Implications for practice and future research needs are discussed. (c) 2015 APA, all rights reserved.
Energy Efficient Image/Video Data Transmission on Commercial Multi-Core Processors
Lee, Sungju; Kim, Heegon; Chung, Yongwha; Park, Daihee
2012-01-01
In transmitting image/video data over Video Sensor Networks (VSNs), energy consumption must be minimized while maintaining high image/video quality. Although image/video compression is well known for its efficiency and usefulness in VSNs, the excessive costs associated with encoding computation and complexity still hinder its adoption for practical use. However, it is anticipated that high-performance handheld multi-core devices will be used as VSN processing nodes in the near future. In this paper, we propose a way to improve the energy efficiency of image and video compression with multi-core processors while maintaining the image/video quality. We improve the compression efficiency at the algorithmic level or derive the optimal parameters for the combination of a machine and compression based on the tradeoff between the energy consumption and the image/video quality. Based on experimental results, we confirm that the proposed approach can improve the energy efficiency of the straightforward approach by a factor of 2∼5 without compromising image/video quality. PMID:23202181
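The energy/quality tradeoff described above amounts to picking, per machine-and-codec combination, the cheapest operating point that still meets a quality floor. A minimal sketch with entirely hypothetical measurements (the paper derives its parameters experimentally):

```python
# Hypothetical (energy in joules, quality in PSNR dB) per compression setting,
# as might be profiled for one multi-core device.
settings = {"fast": (2.0, 34.0), "balanced": (3.5, 38.0), "best": (6.0, 39.0)}

def pick_setting(min_psnr):
    """Lowest-energy setting that still meets the quality floor."""
    ok = {k: v for k, v in settings.items() if v[1] >= min_psnr}
    return min(ok, key=lambda k: ok[k][0])

choice = pick_setting(37.0)
```

The optimal-parameter derivation in the paper works on the same principle, with real profiles per machine/codec pair instead of this toy table.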
Analysis Commons, A Team Approach to Discovery in a Big-Data Environment for Genetic Epidemiology
Brody, Jennifer A.; Morrison, Alanna C.; Bis, Joshua C.; O'Connell, Jeffrey R.; Brown, Michael R.; Huffman, Jennifer E.; Ames, Darren C.; Carroll, Andrew; Conomos, Matthew P.; Gabriel, Stacey; Gibbs, Richard A.; Gogarten, Stephanie M.; Gupta, Namrata; Jaquish, Cashell E.; Johnson, Andrew D.; Lewis, Joshua P.; Liu, Xiaoming; Manning, Alisa K.; Papanicolaou, George J.; Pitsillides, Achilleas N.; Rice, Kenneth M.; Salerno, William; Sitlani, Colleen M.; Smith, Nicholas L.; Heckbert, Susan R.; Laurie, Cathy C.; Mitchell, Braxton D.; Vasan, Ramachandran S.; Rich, Stephen S.; Rotter, Jerome I.; Wilson, James G.; Boerwinkle, Eric; Psaty, Bruce M.; Cupples, L. Adrienne
2017-01-01
The exploding volume of whole-genome sequence (WGS) and multi-omics data requires new approaches for analysis. As one solution, we have created a cloud-based Analysis Commons, which brings together genotype and phenotype data from multiple studies in a setting that is accessible by multiple investigators. This framework addresses many of the challenges of multi-center WGS analyses, including data sharing mechanisms, phenotype harmonization, integrated multi-omics analyses, annotation, and computational flexibility. In this setting, the computational pipeline facilitates a sequence-to-discovery analysis workflow illustrated here by an analysis of plasma fibrinogen levels in 3996 individuals from the National Heart, Lung, and Blood Institute (NHLBI) Trans-Omics for Precision Medicine (TOPMed) WGS program. The Analysis Commons represents a novel model for transforming WGS resources from a massive quantity of phenotypic and genomic data into knowledge of the determinants of health and disease risk in diverse human populations. PMID:29074945
Multi-level Operational C2 Holonic Reference Architecture Modeling for MHQ with MOC
2009-06-01
x), x(k), uj(k)) is defined as the task success probability, based on the asset allocation and task execution activities at the tactical level... on outcomes of asset-task allocation at the tactical level. We employ a semi-Markov decision process (SMDP) approach to decide on missions to be... AGA) graph for addressing the mission monitoring/planning issues related to task sequencing and asset allocation at the OLC-TLC layer (coordination
Krippendorff, Ben-Fillippo; Oyarzún, Diego A; Huisinga, Wilhelm
2012-04-01
Cell-level kinetic models for therapeutically relevant processes increasingly benefit the early stages of drug development. Later stages of the drug development processes, however, rely on pharmacokinetic compartment models while cell-level dynamics are typically neglected. We here present a systematic approach to integrate cell-level kinetic models and pharmacokinetic compartment models. Incorporating target dynamics into pharmacokinetic models is especially useful for the development of therapeutic antibodies because their effect and pharmacokinetics are inherently interdependent. The approach is illustrated by analysing the F(ab)-mediated inhibitory effect of therapeutic antibodies targeting the epidermal growth factor receptor. We build a multi-level model for anti-EGFR antibodies by combining a systems biology model with in vitro determined parameters and a pharmacokinetic model based on in vivo pharmacokinetic data. Using this model, we investigated in silico the impact of biochemical properties of anti-EGFR antibodies on their F(ab)-mediated inhibitory effect. The multi-level model suggests that the F(ab)-mediated inhibitory effect saturates with increasing drug-receptor affinity, thereby limiting the impact of increasing antibody affinity on improving the effect. This indicates that observed differences in the therapeutic effects of high affinity antibodies in the market and in clinical development may result mainly from Fc-mediated indirect mechanisms such as antibody-dependent cell cytotoxicity.
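The saturation effect the multi-level model points to can be illustrated with the standard equilibrium occupancy relation (a textbook simplification, not the paper's full cell-level model): bound fraction = C/(C + Kd), so once free drug concentration far exceeds Kd, further affinity gains buy little extra inhibition. Numbers below are illustrative.

```python
def receptor_occupancy(conc, kd):
    """Equilibrium fraction of receptors bound: conc / (conc + Kd)."""
    return conc / (conc + kd)

# Halving Kd (doubling affinity) at fixed free-drug concentration gives
# ever-smaller occupancy gains once conc >> Kd.
conc = 10.0
gains = [receptor_occupancy(conc, kd / 2) - receptor_occupancy(conc, kd)
         for kd in (4.0, 1.0, 0.25)]
```

Each successive affinity doubling yields a smaller gain, mirroring the model's conclusion that the F(ab)-mediated effect saturates with increasing drug-receptor affinity.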
Ritchie, David W; Kozakov, Dima; Vajda, Sandor
2008-09-01
Predicting how proteins interact at the molecular level is a computationally intensive task. Many protein docking algorithms begin by using fast Fourier transform (FFT) correlation techniques to find putative rigid body docking orientations. Most such approaches use 3D Cartesian grids and are therefore limited to computing three dimensional (3D) translational correlations. However, translational FFTs can speed up the calculation in only three of the six rigid body degrees of freedom, and they cannot easily incorporate prior knowledge about a complex to focus and hence further accelerate the calculation. Furthermore, several groups have developed multi-term interaction potentials and others use multi-copy approaches to simulate protein flexibility, which both add to the computational cost of FFT-based docking algorithms. Hence there is a need to develop more powerful and more versatile FFT docking techniques. This article presents a closed-form 6D spherical polar Fourier correlation expression from which arbitrary multi-dimensional multi-property multi-resolution FFT correlations may be generated. The approach is demonstrated by calculating 1D, 3D and 5D rotational correlations of 3D shape and electrostatic expansions up to polynomial order L=30 on a 2 GB personal computer. As expected, 3D correlations are found to be considerably faster than 1D correlations but, surprisingly, 5D correlations are often slower than 3D correlations. Nonetheless, we show that 5D correlations will be advantageous when calculating multi-term knowledge-based interaction potentials. When docking the 84 complexes of the Protein Docking Benchmark, blind 3D shape plus electrostatic correlations take around 30 minutes on a contemporary personal computer and find acceptable solutions within the top 20 in 16 cases. Applying a simple angular constraint to focus the calculation around the receptor binding site produces acceptable solutions within the top 20 in 28 cases.
Further constraining the search to the ligand binding site gives up to 48 solutions within the top 20, with calculation times of just a few minutes per complex. Hence the approach described provides a practical and fast tool for rigid body protein-protein docking, especially when prior knowledge about one or both binding sites is available.
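The 3D translational correlation that conventional Cartesian-grid docking relies on (the baseline this paper generalises) can be sketched in a few lines via the correlation theorem. The toy grids below each contain a single occupied cell, so the correlation peak lands at the translation that aligns them; real docking scores sums of shape/electrostatic grid products at every offset.

```python
import numpy as np

def translational_correlation(receptor, ligand):
    """All cyclic translational overlaps of two 3-D grids in one pass:
    corr = IFFT( FFT(receptor) * conj(FFT(ligand)) )."""
    return np.real(np.fft.ifftn(np.fft.fftn(receptor) *
                                np.conj(np.fft.fftn(ligand))))

n = 8
receptor = np.zeros((n, n, n)); receptor[2, 3, 1] = 1.0
ligand = np.zeros((n, n, n)); ligand[0, 0, 0] = 1.0
corr = translational_correlation(receptor, ligand)
peak = np.unravel_index(np.argmax(corr), corr.shape)   # best translation
```

This accelerates only the three translational degrees of freedom; the rotational search must still be enumerated, which is exactly the limitation the article's 6D spherical polar Fourier expression addresses.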
A multi-disciplinary approach for the integrated assessment of multiple risks in delta areas.
NASA Astrophysics Data System (ADS)
Sperotto, Anna; Torresan, Silvia; Critto, Andrea; Marcomini, Antonio
2016-04-01
The assessment of climate change related risks is notoriously difficult due to the complex and uncertain combinations of hazardous events that might happen, the multiplicity of physical processes involved, the continuous changes and interactions of environmental and socio-economic systems. One important challenge lies in predicting and modelling cascades of natural and man-made hazard events which can be triggered by climate change, encompassing different spatial and temporal scales. Another regards the potentially difficult integration of environmental, social and economic disciplines in the multi-risk concept. Finally, the effective interaction between scientists and stakeholders is essential to ensure that multi-risk knowledge is translated into efficient adaptation and management strategies. The assessment is even more complex at the scale of deltaic systems which are particularly vulnerable to global environmental changes, due to the fragile equilibrium between the presence of valuable natural ecosystems and relevant economic activities. Improving our capacity to assess the combined effects of multiple hazards (e.g. sea-level rise, storm surges, reduction in sediment load, local subsidence, saltwater intrusion) is therefore essential to identify timely opportunities for adaptation. A holistic multi-risk approach is here proposed to integrate terminology, metrics and methodologies from different research fields (i.e. environmental, social and economic sciences) thus creating shared knowledge areas to advance multi-risk assessment and management in delta regions. A first testing of the approach, including the application of Bayesian network analysis for the assessment of impacts of climate change on key natural systems (e.g. wetlands, protected areas, beaches) and socio-economic activities (e.g. agriculture, tourism), is applied in the Po river delta in Northern Italy.
The approach is based on a bottom-up process involving local stakeholders early in different stages of the multi-risk assessment process (i.e. identification of objectives, collection of data, definition of risk thresholds and indicators). The results of the assessment will allow the development of multi-risk scenarios enabling the evaluation and prioritization of risk management and adaptation options under changing climate conditions.
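The Bayesian network analysis mentioned above can be sketched as a toy discrete network: two hazard nodes feed one impact node, and the marginal risk is obtained by summing over hazard combinations. All node names and probabilities below are invented for illustration and are not values from the Po delta study.

```python
# P(high sea-level rise), P(severe storm surge), and the conditional
# probability of wetland loss given each hazard combination.
P_SLR = {True: 0.3, False: 0.7}
P_SURGE = {True: 0.2, False: 0.8}
P_LOSS = {(True, True): 0.9, (True, False): 0.5,
          (False, True): 0.4, (False, False): 0.1}

def p_wetland_loss():
    """Marginal risk of wetland loss by enumeration over hazard states."""
    return sum(P_SLR[s] * P_SURGE[g] * P_LOSS[(s, g)]
               for s in (True, False) for g in (True, False))

risk = p_wetland_loss()
```

In practice the hazard priors would come from climate scenarios and the conditional tables from data or expert elicitation, which is where the stakeholder-defined thresholds and indicators enter.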
Stakeholder conceptualisation of multi-level HIV and AIDS determinants in a Black epicentre.
Brawner, Bridgette M; Reason, Janaiya L; Hanlon, Kelsey; Guthrie, Barbara; Schensul, Jean J
2017-09-01
HIV has reached epidemic proportions among African Americans in the USA but certain urban contexts appear to experience a disproportionate disease burden. Geographic information systems mapping in Philadelphia indicates increased HIV incidence and prevalence in predominantly Black census tracts, with major differences across adjacent communities. What factors shape these geographic HIV disparities among Black Philadelphians? This descriptive study was designed to refine and validate a conceptual model developed to better understand multi-level determinants of HIV-related risk among Black Philadelphians. We used an expanded ecological approach to elicit reflective perceptions from administrators, direct service providers and community members about individual, social and structural factors that interact to protect against or increase the risk for acquiring HIV within their community. Gender equity, social capital and positive cultural mores (e.g., monogamy, abstinence) were seen as the main protective factors. Historical negative contributory influences of racial residential segregation, poverty and incarceration were among the most salient risk factors. This study was a critical next step toward initiating theory-based, multi-level community-based HIV prevention initiatives.
Addressing non-communicable diseases in the Seychelles: towards a comprehensive plan of action.
Bovet, Pascal; Viswanathan, Bharathi; Shamlaye, Conrad; Romain, Sarah; Gedeon, Jude
2010-06-01
This article reviews the different steps taken during the past 20 years for the prevention and control of non-communicable diseases (NCDs) in the Seychelles. National surveys revealed high levels of several cardiovascular risk factors and prompted an organized response, starting with the creation of an NCD unit in the Ministry of Health. Information campaigns and nationwide activities raised awareness and rallied increasingly broad and high-level support. Significant policy was developed including comprehensive tobacco legislation and a School Nutrition Policy that bans soft drinks in schools. NCD guidelines were developed and specialized 'NCD nurses' were trained to complement doctors in district health centers. Decreasing smoking prevalence is evidence of success, but the rising so-called 'diabesity' epidemic calls for an integrated multi-sector policy to mould an environment conducive to healthy behaviors. Essential components of these efforts include: effective surveillance mechanisms supplemented by focused research; generating broad interest and consensus; mobilizing leadership and commitment at all levels; involving local and international expertise; building on existing efforts; and seeking integrated, multi-disciplinary and multi-sector approaches.
Life-space foam: A medium for motivational and cognitive dynamics
NASA Astrophysics Data System (ADS)
Ivancevic, Vladimir; Aidman, Eugene
2007-08-01
General stochastic dynamics, developed in a framework of Feynman path integrals, have been applied to Lewinian field-theoretic psychodynamics [K. Lewin, Field Theory in Social Science, University of Chicago Press, Chicago, 1951; K. Lewin, Resolving Social Conflicts, and, Field Theory in Social Science, American Psychological Association, Washington, 1997; M. Gold, A Kurt Lewin Reader, the Complete Social Scientist, American Psychological Association, Washington, 1999], resulting in the development of a new concept of life-space foam (LSF) as a natural medium for motivational and cognitive psychodynamics. According to LSF formalisms, the classic Lewinian life space can be macroscopically represented as a smooth manifold with steady force fields and behavioral paths, while at the microscopic level it is more realistically represented as a collection of wildly fluctuating force fields, (loco)motion paths and local geometries (and topologies with holes). A set of least-action principles is used to model the smoothness of global, macro-level LSF paths, fields and geometry. To model the corresponding local, micro-level LSF structures, an adaptive path integral is used, defining a multi-phase and multi-path (multi-field and multi-geometry) transition process from intention to goal-driven action. Application examples of this new approach include (but are not limited to) information processing, motivational fatigue, learning, memory and decision making.
Multi-level discriminative dictionary learning with application to large scale image classification.
Shen, Li; Sun, Gang; Huang, Qingming; Wang, Shuhui; Lin, Zhouchen; Wu, Enhua
2015-10-01
The sparse coding technique has shown flexibility and capability in image representation and analysis. It is a powerful tool in many visual applications. Some recent work has shown that incorporating the properties of task (such as discrimination for classification task) into dictionary learning is effective for improving the accuracy. However, the traditional supervised dictionary learning methods suffer from high computation complexity when dealing with large number of categories, making them less satisfactory in large scale applications. In this paper, we propose a novel multi-level discriminative dictionary learning method and apply it to large scale image classification. Our method takes advantage of hierarchical category correlation to encode multi-level discriminative information. Each internal node of the category hierarchy is associated with a discriminative dictionary and a classification model. The dictionaries at different layers are learnt to capture the information of different scales. Moreover, each node at lower layers also inherits the dictionary of its parent, so that the categories at lower layers can be described with multi-scale information. The learning of dictionaries and associated classification models is jointly conducted by minimizing an overall tree loss. The experimental results on challenging data sets demonstrate that our approach achieves excellent accuracy and competitive computation cost compared with other sparse coding methods for large scale image classification.
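The inheritance structure described above (each child node coding over its own atoms stacked with its parent's) can be sketched as follows. This is a structural illustration only: the dictionaries are random, the code is plain least squares, and the sparsity penalty and joint tree-loss training of the actual method are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 8, 5                                   # feature dim, atoms per node

# Hypothetical two-layer category hierarchy: a root dictionary and one
# child dictionary; the child inherits the parent's atoms.
D_root = rng.normal(size=(d, k))
D_child = rng.normal(size=(d, k))
D_effective = np.hstack([D_root, D_child])    # child codes over both scales

def code(x, D):
    """Least-squares code over dictionary D (sparsity omitted in this sketch)."""
    return np.linalg.lstsq(D, x, rcond=None)[0]

x = rng.normal(size=d)
alpha = code(x, D_effective)                  # multi-scale representation
```

The resulting code mixes coarse (parent) and fine (child) atoms, which is the multi-scale description the paper exploits for lower-layer categories.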
Ortega, Alexander N; Albert, Stephanie L; Sharif, Mienah Z; Langellier, Brent A; Garcia, Rosa Elena; Glik, Deborah C; Brookmeyer, Ron; Chan-Golston, Alec M; Friedlander, Scott; Prelip, Michael L
2015-04-01
Urban food swamps are typically situated in low-income, minority communities and contribute to overweight and obesity. Changing the food landscape in low income and underserved communities is one strategy to combat the negative health consequences associated with the lack of access to healthy food resources and an abundance of unhealthy food venues. In this paper, we describe Proyecto MercadoFRESCO (Fresh Market Project), a corner store intervention project in East Los Angeles and Boyle Heights in California that used a multi-level approach with a broad range of community, business, and academic partners. These are two neighboring, predominantly Latino communities that have high rates of overweight and obesity. Located in these two communities are approximately 150 corner stores. The project used a community-engaged approach to select, recruit, and convert four corner stores, so that they could become healthy community assets in order to improve residents' access to and awareness of fresh and affordable fruits and vegetables in their immediate neighborhoods. We describe the study framework for the multi-level intervention, which includes having multiple stakeholders, expertise in corner store operations, community and youth engagement strategies, and social marketing campaigns. We also describe the evaluation and survey methodology to determine community and patron impact of the intervention. This paper provides a framework useful to a variety of public health stakeholders for implementing a community-engaged corner store conversion, particularly in an urban food swamp.
Albert, Stephanie L.; Sharif, Mienah Z.; Langellier, Brent A.; Garcia, Rosa Elena; Glik, Deborah C.; Brookmeyer, Ron; Chan-Golston, Alec M.; Friedlander, Scott; Prelip, Michael L.
2014-01-01
Urban food swamps are typically situated in low-income, minority communities and contribute to overweight and obesity. Changing the food landscape in low income and underserved communities is one strategy to combat the negative health consequences associated with the lack of access to healthy food resources and an abundance of unhealthy food venues. In this paper, we describe Proyecto MercadoFRESCO (Fresh Market Project), a corner store intervention project in East Los Angeles and Boyle Heights in California that used a multi-level approach with a broad range of community, business, and academic partners. These are two neighboring, predominantly Latino communities that have high rates of overweight and obesity. Located in these two communities are approximately 150 corner stores. The project used a community-engaged approach to select, recruit, and convert four corner stores, so that they could become healthy community assets in order to improve residents’ access to and awareness of fresh and affordable fruits and vegetables in their immediate neighborhoods. We describe the study framework for the multi-level intervention, which includes having multiple stakeholders, expertise in corner store operations, community and youth engagement strategies, and social marketing campaigns. We also describe the evaluation and survey methodology to determine community and patron impact of the intervention. This paper provides a framework useful to a variety of public health stakeholders for implementing a community-engaged corner store conversion, particularly in an urban food swamp. PMID:25209600
Multi-parametric analysis of phagocyte antimicrobial responses using imaging flow cytometry.
Havixbeck, Jeffrey J; Wong, Michael E; More Bayona, Juan A; Barreda, Daniel R
2015-08-01
We feature a multi-parametric approach based on an imaging flow cytometry platform for examining phagocyte antimicrobial responses against the gram-negative bacterium Aeromonas veronii. This pathogen is known to induce strong inflammatory responses across a broad range of animal species, including humans. We examined the contribution of A. veronii to the induction of early phagocyte inflammatory processes in RAW 264.7 murine macrophages in vitro. We found that A. veronii, both in live or heat-killed forms, induced similar levels of macrophage activation based on NF-κB translocation. Although these macrophages maintained high levels of viability following heat-killed or live challenges with A. veronii, we identified inhibition of macrophage proliferation as early as 1h post in vitro challenge. The characterization of phagocytic responses showed a time-dependent increase in phagocytosis upon A. veronii challenge, which was paired with a robust induction of intracellular respiratory burst responses. Interestingly, despite the overall increase in the production of reactive oxygen species (ROS) among RAW 264.7 macrophages, we found a significant reduction in the production of ROS among the macrophage subset that had bound A. veronii. Phagocytic uptake of the pathogen further decreased ROS production levels, even beyond those of unstimulated controls. Overall, this multi-parametric imaging flow cytometry-based approach allowed for segregation of unique phagocyte sub-populations and examination of their downstream antimicrobial responses, and should contribute to improved understanding of phagocyte responses against Aeromonas and other pathogens. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Heremans, Stien; Suykens, Johan A. K.; Van Orshoven, Jos
2016-02-01
To be physically interpretable, sub-pixel land cover fractions or abundances should fulfill two constraints, the Abundance Non-negativity Constraint (ANC) and the Abundance Sum-to-one Constraint (ASC). This paper focuses on the effect of imposing these constraints onto the MultiLayer Perceptron (MLP) for a multi-class sub-pixel land cover classification of a time series of low resolution MODIS-images covering the northern part of Belgium. Two constraining modes were compared, (i) an in-training approach that uses 'softmax' as the transfer function in the MLP's output layer and (ii) a post-training approach that linearly rescales the outputs of the unconstrained MLP. Our results demonstrate that the pixel-level prediction accuracy is markedly increased by the explicit enforcement, both in-training and post-training, of the ANC and the ASC. For aggregations of pixels (municipalities), the constrained perceptrons perform at least as well as their unconstrained counterparts. Although the difference in performance between the in-training and post-training approach is small, we recommend the former for integrating the fractional abundance constraints into MLPs meant for sub-pixel land cover estimation, regardless of the targeted level of spatial aggregation.
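The two constraining modes compared above can be sketched directly. Softmax gives the in-training route: outputs are non-negative (ANC) and sum to one (ASC) by construction. For the post-training route the sketch below clips negatives and renormalises; the paper's linear rescaling may differ in detail, so treat this as one simple way to satisfy both constraints, with made-up output values.

```python
import numpy as np

def softmax(z):
    """In-training constraint: transfer function in the MLP output layer."""
    e = np.exp(z - np.max(z))            # shift for numerical stability
    return e / e.sum()

def post_rescale(raw):
    """Post-training sketch: clip negatives, then renormalise to sum to one."""
    a = np.clip(raw, 0.0, None)
    return a / a.sum()

raw = np.array([1.2, -0.3, 0.6])         # hypothetical unconstrained MLP outputs
fractions_in = softmax(raw)              # ANC + ASC by construction
fractions_post = post_rescale(raw)       # ANC + ASC imposed after the fact
```

Either way, every pixel's land cover fractions become physically interpretable abundances; the paper's finding is that enforcing this explicitly improves pixel-level accuracy.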
NASA Astrophysics Data System (ADS)
Sun, Yuan; Bhattacherjee, Anol
2011-11-01
Information technology (IT) usage within organisations is a multi-level phenomenon that is influenced by individual-level and organisational-level variables. Yet, current theories, such as the unified theory of acceptance and use of technology, describe IT usage as solely an individual-level phenomenon. This article postulates a model of organisational IT usage that integrates salient organisational-level variables such as user training, top management support and technical support within an individual-level model to postulate a multi-level model of IT usage. The multi-level model was then empirically validated using multi-level data collected from 128 end users and 26 managers in 26 firms in China regarding their use of enterprise resource planning systems and analysed using the multi-level structural equation modelling (MSEM) technique. We demonstrate the utility of MSEM analysis of multi-level data relative to the more common structural equation modelling analysis of single-level data and show how single-level data can be aggregated to approximate multi-level analysis when multi-level data collection is not possible. We hope that this article will motivate future scholars to employ multi-level data and multi-level analysis for understanding organisational phenomena that are truly multi-level in nature.
This project investigated an innovative approach for transport of inorganic species under the influence of electric fields. This process, commonly known as electrokinetics, uses low-level direct current (dc) electrical potential difference across a soil mass applied through inert...
ERIC Educational Resources Information Center
Edwards, D. Brent, Jr.
2012-01-01
This article elaborates one approach to conceptualizing and investigating international processes of education policy formation (IPEPF), which are dynamic, multi-level and processual in nature. This contribution is important because, although research is increasingly conducted on phenomena with such characteristics, extended discussions of how…
Introduction--Understanding Education, Fragility and Conflict
ERIC Educational Resources Information Center
Buchert, Lene
2013-01-01
This Introduction discusses approaches to and perspectives on analyzing the complex relationship between education, fragility, and conflict and its underlying causes and dynamics. It argues for the need for contextual and time-bound multi-level analyses of interlinked societal dimensions in order to address the ultimate purposes of education…
Development of Decision Support System for Remote Monitoring of PIP Corn
The EPA is developing a multi-level approach that utilizes satellite and airborne remote sensing to locate and monitor genetically modified corn and pest infestation in the agricultural landscape. The current status of the EPA IRM monitoring program based on remote sensed imager...
Pedersen, Jacob; Bjorner, Jakob Bue
2017-11-15
Work life expectancy (WLE) expresses the expected time a person will remain in the labor market until he or she retires. This paper compares a life table approach to estimating WLE to an approach based on multi-state proportional hazards models. The two methods are used to estimate WLE in Danish members and non-members of an early retirement pensioning (ERP) scheme according to levels of health. In 2008, data on self-rated health (SRH) was collected from 5212 employees 55-65 years of age. Data on previous and subsequent long-term sickness absence, unemployment, returning to work, and disability pension was collected from national registers. WLE was estimated from multi-state life tables and through multi-state models. Results from the multi-state model approach agreed with the life table approach but provided narrower confidence intervals for small groups. The shortest WLE was seen for employees with poor SRH and ERP membership while the longest WLE was seen for those with good SRH and no ERP membership. Employees aged 55-56 years with poor SRH but no ERP membership had shorter WLE than employees with good SRH and ERP membership. Relative WLE reversed for the two groups after age 57. At age 55, employees with poor SRH could be expected to spend approximately 12 months on long-term sick leave and 9-10 months unemployed before they retired - regardless of ERP membership. ERP members with poor SRH could be expected to spend 4.6 years working, while non-members could be expected to spend 7.1 years working. WLE estimated through multi-state models provided an effective way to summarize complex data on labor market affiliation. WLE differed noticeably between members and non-members of the ERP scheme. It has been hypothesized that while ERP membership would prompt some employees to retire earlier than they would have done otherwise, this effect would be partly offset by reduced time spent on long-term sick leave or unemployment. 
Our data showed no indication of such an effect, but this could be due to residual confounding and self-selection of people with poor health into the ERP scheme.
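The multi-state model idea above can be sketched as an absorbing Markov chain: transient labour-market states (work, long-term sick leave, unemployed) with retirement as the absorbing state, where the fundamental matrix gives the expected years in each state. The transition probabilities below are invented for illustration, not Danish register estimates.

```python
import numpy as np

# Hypothetical annual transition probabilities among transient states
# (order: work, long-term sick leave, unemployed); each row's missing
# probability mass is the chance of retiring that year.
Q = np.array([[0.75, 0.08, 0.05],    # retire with prob. 0.12
              [0.45, 0.30, 0.05],    # retire with prob. 0.20
              [0.35, 0.05, 0.40]])   # retire with prob. 0.20

# Fundamental matrix: N[i, j] = expected years spent in state j before
# retirement, starting from state i.
N = np.linalg.inv(np.eye(3) - Q)
wle = N[0, 0]            # work life expectancy, starting in work
sick_years = N[0, 1]     # expected years on long-term sick leave
```

A full multi-state proportional hazards analysis additionally lets the transition intensities depend on age and covariates such as self-rated health and ERP membership; this sketch shows only the expectation step.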
A multi-objective approach to solid waste management.
Galante, Giacomo; Aiello, Giuseppe; Enea, Mario; Panascia, Enrico
2010-01-01
The issue addressed in this paper consists in the localization and dimensioning of transfer stations, which constitute a necessary intermediate level in the logistic chain of the solid waste stream, from municipalities to the incinerator. Contextually, the determination of the number and type of vehicles involved is carried out in an integrated optimization approach. The model considers both initial investment and operative costs related to transportation and transfer stations. Two conflicting objectives are evaluated, the minimization of total cost and the minimization of environmental impact, measured by pollution. The design of the integrated waste management system is hence approached in a multi-objective optimization framework. To determine the best means of compromise, goal programming, weighted sum and fuzzy multi-objective techniques have been employed. The proposed analysis highlights how different attitudes of the decision maker towards the logic and structure of the problem result in the employment of different methodologies and the obtaining of different results. The novel aspect of the paper lies in the proposal of an effective decision support system for operative waste management, rather than a further contribution to the transportation problem. The model was applied to the waste management of optimal territorial ambit (OTA) of Palermo (Italy). 2010 Elsevier Ltd. All rights reserved.
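The weighted sum technique mentioned above can be sketched on a toy instance. The candidate layouts and their (cost, pollution) scores are invented for illustration; the point is only that sweeping the weights between the two objectives traces out different compromise solutions.

```python
# Hypothetical (total cost, pollution) scores for candidate transfer-station
# layouts; both objectives are to be minimised.
candidates = {"A": (100.0, 9.0), "B": (120.0, 5.0), "C": (150.0, 3.0)}

def weighted_sum_choice(w_cost, w_env):
    """Layout minimising a weighted sum of max-normalised objectives."""
    max_cost = max(c for c, _ in candidates.values())
    max_env = max(e for _, e in candidates.values())
    def score(key):
        c, e = candidates[key]
        return w_cost * c / max_cost + w_env * e / max_env
    return min(candidates, key=score)

cheapest = weighted_sum_choice(1.0, 0.0)   # pure cost minimiser
greenest = weighted_sum_choice(0.0, 1.0)   # pure pollution minimiser
```

Goal programming and fuzzy techniques differ in how the compromise is expressed (target levels, membership functions) rather than in this basic scan, which is why the paper frames the choice of method as reflecting the decision maker's attitude.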
A multi-objective approach to solid waste management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galante, Giacomo, E-mail: galante@dtpm.unipa.i; Aiello, Giuseppe; Enea, Mario
2010-08-15
The issue addressed in this paper consists in the localization and dimensioning of transfer stations, which constitute a necessary intermediate level in the logistic chain of the solid waste stream, from municipalities to the incinerator. Contextually, the determination of the number and type of vehicles involved is carried out in an integrated optimization approach. The model considers both initial investment and operative costs related to transportation and transfer stations. Two conflicting objectives are evaluated, the minimization of total cost and the minimization of environmental impact, measured by pollution. The design of the integrated waste management system is hence approached in a multi-objective optimization framework. To determine the best means of compromise, goal programming, weighted sum and fuzzy multi-objective techniques have been employed. The proposed analysis highlights how different attitudes of the decision maker towards the logic and structure of the problem result in the employment of different methodologies and the obtaining of different results. The novel aspect of the paper lies in the proposal of an effective decision support system for operative waste management, rather than a further contribution to the transportation problem. The model was applied to the waste management of optimal territorial ambit (OTA) of Palermo (Italy).
Local variance for multi-scale analysis in geomorphometry.
Drăguţ, Lucian; Eisank, Clemens; Strasser, Thomas
2011-07-15
Increasing availability of high resolution Digital Elevation Models (DEMs) is leading to a paradigm shift regarding scale issues in geomorphometry, prompting new solutions to cope with multi-scale analysis and detection of characteristic scales. We tested the suitability of the local variance (LV) method, originally developed for image analysis, for multi-scale analysis in geomorphometry. The method consists of: 1) up-scaling land-surface parameters derived from a DEM; 2) calculating LV as the average standard deviation (SD) within a 3 × 3 moving window for each scale level; 3) calculating the rate of change of LV (ROC-LV) from one level to another, and 4) plotting values so obtained against scale levels. We interpreted peaks in the ROC-LV graphs as markers of scale levels where cells or segments match types of pattern elements characterized by (relatively) equal degrees of homogeneity. The proposed method has been applied to LiDAR DEMs in two test areas different in terms of roughness: low relief and mountainous, respectively. For each test area, scale levels for slope gradient, plan, and profile curvatures were produced at constant increments with either resampling (cell-based) or image segmentation (object-based). Visual assessment revealed homogeneous areas that convincingly associate into patterns of land-surface parameters well differentiated across scales. We found that the LV method performed better on scale levels generated through segmentation as compared to up-scaling through resampling. The results indicate that coupling multi-scale pattern analysis with delineation of morphometric primitives is possible. This approach could be further used for developing hierarchical classifications of landform elements.
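The four-step LV procedure described above can be sketched in a few lines (a minimal illustration under stated assumptions, not the authors' code: SciPy's `generic_filter` supplies the 3 × 3 moving-window SD, and synthetic grids of increasing roughness stand in for up-scaled land-surface parameters):

```python
import numpy as np
from scipy.ndimage import generic_filter

def local_variance(grid):
    # Step 2: LV = mean of the standard deviation in a 3x3 moving window.
    sd = generic_filter(grid, np.std, size=3, mode="nearest")
    return float(sd.mean())

def roc_lv(lv_values):
    # Step 3: percentage rate of change of LV from one scale level to the next.
    lv = np.asarray(lv_values, dtype=float)
    return 100.0 * (lv[1:] - lv[:-1]) / lv[:-1]

# Toy stand-ins for up-scaled land-surface parameter grids (step 1).
rng = np.random.default_rng(0)
levels = [rng.normal(10.0, s, size=(50, 50)) for s in (1.0, 2.0, 4.0)]
lv = [local_variance(g) for g in levels]
roc = roc_lv(lv)  # step 4: plot roc against scale levels and look for peaks
```

Peaks in the resulting ROC-LV curve would then be read off as candidate characteristic scales.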
Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B
2012-01-01
Objective: There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. Materials and methods: We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. Results: An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. Discussion: A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. Conclusion: We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries. PMID:22319176
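The multi-modal case definition and the PPV validation can be illustrated schematically (a hypothetical OR-combination of the three modalities for illustration only; the actual eMERGE logic and attribute extraction are far richer):

```python
def multimodal_case(structured_hit, nlp_hit, ocr_hit):
    # Hypothetical OR-combination of the three modalities (database query,
    # NLP on free text, OCR on scanned images); the real algorithm is richer.
    return bool(structured_hit or nlp_hit or ocr_hit)

def ppv(flags, truth):
    # Positive predictive value of algorithm flags against chart review.
    tp = sum(1 for f, t in zip(flags, truth) if f and t)
    fp = sum(1 for f, t in zip(flags, truth) if f and not t)
    return tp / (tp + fp)

# Four hypothetical subjects: per-modality hits and chart-review truth.
hits = [(1, 0, 0), (0, 1, 0), (0, 0, 0), (1, 1, 1)]
truth = [True, True, False, True]
flags = [multimodal_case(*h) for h in hits]
```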
Cox, Benjamin L; Mackie, Thomas R; Eliceiri, Kevin W
2015-01-01
Multi-modal imaging approaches of tumor metabolism that provide improved specificity, physiological relevance and spatial resolution would improve diagnosing of tumors and evaluation of tumor progression. Currently, the molecular probe FDG, glucose fluorinated with 18F at the 2-carbon, is the primary metabolic approach for clinical diagnostics with PET imaging. However, PET lacks the resolution necessary to yield intratumoral distributions of deoxyglucose, on the cellular level. Multi-modal imaging could elucidate this problem, but requires the development of new glucose analogs that are better suited for other imaging modalities. Several such analogs have been created and are reviewed here. Also reviewed are several multi-modal imaging studies that have been performed that attempt to shed light on the cellular distribution of glucose analogs within tumors. Some of these studies are performed in vitro, while others are performed in vivo, in an animal model. The results from these studies introduce a visualization gap between the in vitro and in vivo studies that, if solved, could enable the early detection of tumors, the high resolution monitoring of tumors during treatment, and the greater accuracy in assessment of different imaging agents. PMID:25625022
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Xu; Tuo, Rui; Jeff Wu, C. F.
2017-01-31
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. In simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.
NASA Astrophysics Data System (ADS)
Gómez A, Héctor F.; Martínez-Tomás, Rafael; Arias Tapia, Susana A.; Rincón Zamorano, Mariano
2014-04-01
Automatic systems that monitor human behaviour to detect security problems are a challenge today. Previously, our group defined the Horus framework, a modular architecture for the integration of multi-sensor monitoring stages. In this work, the structure and technologies required for the high-level semantic stages of Horus are proposed, and the associated methodological principles are established with the aim of recognising specific behaviours and situations. Our methodology distinguishes three semantic levels of events: low level (tied to sensors), medium level (tied to context), and high level (target behaviours). The ontology for surveillance and ubiquitous computing has been used to integrate ontologies from specific domains and, together with semantic technologies, has facilitated the modelling and implementation of scenes and situations by reusing components. A home context and a supermarket context were modelled following this approach, where three suspicious activities were monitored via different virtual sensors. The experiments demonstrate that our proposals facilitate the rapid prototyping of systems of this kind.
Modeling, Materials, and Metrics: The Three-m Approach to FCS Signature Solutions
2002-05-07
calculations. These multiple levels will be incorporated into the MuSES software. The four levels are described as follows:
* Radiosity - Deterministic ... view-factor-based, all-diffuse solution. Very fast. Independent of user position.
* Directional Reflectivity - Radiosity with directional incident ... target and environment facets (view factor with BRDF). Last ray cast bounce = radiosity solution.
* Multi-bounce path trace - Rays traced from observer
NASA Astrophysics Data System (ADS)
Nourifar, Raheleh; Mahdavi, Iraj; Mahdavi-Amiri, Nezam; Paydar, Mohammad Mahdi
2017-09-01
Decentralized supply chain management is found to be significantly relevant in today's competitive markets. Production and distribution planning is posed as an important optimization problem in supply chain networks. Here, we propose a multi-period decentralized supply chain network model with uncertainty. The imprecision related to uncertain parameters such as demand and price of the final product is represented by stochastic and fuzzy numbers. We provide a mathematical formulation of the problem as a bi-level mixed integer linear programming model. Owing to the problem's complexity, a solution structure is developed that incorporates a novel heuristic algorithm based on the Kth-best algorithm, a fuzzy approach and a chance constraint approach. Ultimately, a numerical example is constructed and worked through to demonstrate the applicability of the optimization model. A sensitivity analysis is also made.
From Synergy to Complexity: The Trend Toward Integrated Value Chain and Landscape Governance.
Ros-Tonen, Mirjam A F; Reed, James; Sunderland, Terry
2018-07-01
This Editorial introduces a special issue that illustrates a trend toward integrated landscape approaches. Whereas two papers echo older "win-win" strategies based on the trade of non-timber forest products, ten papers reflect a shift from a product to landscape perspective. However, they differ from integrated landscape approaches in that they emanate from sectorial approaches driven primarily by aims such as forest restoration, sustainable commodity sourcing, natural resource management, or carbon emission reduction. The potential of such initiatives for integrated landscape governance and achieving landscape-level outcomes has hitherto been largely unaddressed in the literature on integrated landscape approaches. This special issue addresses this gap, with a focus on actor constellations and institutional arrangements emerging in the transition from sectorial to integrated approaches. This editorial discusses the trends arising from the papers, including the need for a commonly shared concern and sense of urgency; inclusive stakeholder engagement; accommodating and coordinating polycentric governance in landscapes beset with institutional fragmentation and jurisdictional mismatches; alignment with locally embedded initiatives and governance structures; and a framework to assess and monitor the performance of integrated multi-stakeholder approaches. We conclude that, despite a growing tendency toward integrated approaches at the landscape level, inherent landscape complexity renders persistent and significant challenges such as balancing multiple objectives, equitable inclusion of all relevant stakeholders, dealing with power and gender asymmetries, adaptive management based on participatory outcome monitoring, and moving beyond existing administrative, jurisdictional, and sectorial silos. Multi-stakeholder platforms and bridging organizations and individuals are seen as key in overcoming such challenges.
López-Carr, David; Davis, Jason; Jankowska, Marta; Grant, Laura; López-Carr, Anna Carla; Clark, Matthew
2013-01-01
The relative role of space and place has long been debated in geography. Yet modeling efforts applied to coupled human-natural systems seemingly favor models assuming continuous spatial relationships. We examine the relative importance of place-based hierarchical versus spatial clustering influences in tropical land use/cover change (LUCC). Guatemala was chosen as our study site given its high rural population growth and deforestation in recent decades. We test predictors of 2009 forest cover and forest cover change from 2001-2009 across Guatemala's 331 municipalities and 22 departments using spatial and multi-level statistical models. Our results indicate the emergence of several socio-economic predictors of LUCC regardless of model choice. Hierarchical model results suggest that significant differences exist at the municipal and departmental levels but largely maintain the magnitude and direction of single-level model coefficient estimates. They are also intervention-relevant since policies tend to be applicable to distinct political units rather than to continuous space. Spatial models complement hierarchical approaches by indicating where and to what magnitude significant negative and positive clustering associations emerge. Appreciating the comparative advantages and limitations of spatial and nested models enhances a holistic approach to geographical analysis of tropical LUCC and human-environment interactions. PMID:24013908
Safety climate and firefighting: Focus group results.
DeJoy, David M; Smith, Todd D; Dyal, Mari-Amanda
2017-09-01
Firefighting is a hazardous occupation and there have been numerous calls for fundamental changes in how fire service organizations approach safety and balance safety with other operational priorities. These calls, however, have yielded little systematic research. As part of a larger project to develop and test a model of safety climate for the fire service, focus groups were used to identify potentially important dimensions of safety climate pertinent to firefighting. Analyses revealed nine overarching themes. Competency/professionalism, physical/psychological readiness, and that positive traits sometimes produce negative consequences were themes at the individual level; cohesion and supervisor leadership/support at the workgroup level; and politics/bureaucracy, resources, leadership, and hiring/promotion at the organizational level. A multi-level perspective seems appropriate for examining safety climate in firefighting. Safety climate in firefighting appears to be multi-dimensional and some dimensions prominent in the general safety climate literature also seem relevant to firefighting. These results also suggest that the fire service may be undergoing transitions encompassing mission, personnel, and its fundamental approach to safety and risk. These results help point the way to the development of safety climate measures specific to firefighting and to interventions for improving safety performance. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
USDA-ARS?s Scientific Manuscript database
The health of the Chesapeake Bay ecosystem has been declining for several decades due to high levels of nutrients and sediments largely tied to agricultural production systems within the Bay watershed. Therefore, monitoring of crop production, agricultural water use and hydrologic connections betwee...
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
A Multi-Level Approach to Caregiver Training Accessibility.
ERIC Educational Resources Information Center
Range, Diana
This brief paper discusses how, over a period of 10 years, a system for providing caregiver training was developed in Texas. The training accessibility system comprises several components: publications, free materials, resource rooms, child development specialists, Title XX training contracts and traveling resource vans. Tips for program continuity under…
Consumer Education Resource Guide, K-12. A Multi-Disciplinary Approach.
ERIC Educational Resources Information Center
Calhoun, Calfrey C.; And Others
The guide suggests methods and resources for planning learning experiences in teaching consumer education to students at the K-12 levels. The major topics and related areas are: (1) financial planning (estimating income, estimating expenses, establishing goals, making decisions, and making the financial plan); (2) buying (importance of planned…
23 CFR 970.214 - Federal lands congestion management system (CMS).
Code of Federal Regulations, 2011 CFR
2011-04-01
... experiencing congestion, the NPS shall develop a separate CMS to cover those facilities. Approaches may include... congestion management strategies; (v) Determine methods to monitor and evaluate the performance of the multi... means the level at which transportation system performance is no longer acceptable due to traffic...
23 CFR 970.214 - Federal lands congestion management system (CMS).
Code of Federal Regulations, 2010 CFR
2010-04-01
... experiencing congestion, the NPS shall develop a separate CMS to cover those facilities. Approaches may include... congestion management strategies; (v) Determine methods to monitor and evaluate the performance of the multi... means the level at which transportation system performance is no longer acceptable due to traffic...
Teaching Ethics to Undergraduates: An Examination of Contextual Approaches
ERIC Educational Resources Information Center
Bush, H. Francis; Gutermuth, Karen; West, Clifford
2009-01-01
Our purpose was to advance the current academic discussion on how to most effectively teach managerial ethics at the undergraduate level. We argued that undergraduate ethics education should be comprehensive, multi-dimensional and woven into the fabric of each student's experience. In particular, we hypothesized that the inclusion of…
Primary Principals' Leadership Styles, School Organizational Health and Workplace Bullying
ERIC Educational Resources Information Center
Cemaloglu, Necati
2011-01-01
Purpose: The purpose of this paper is to determine the relationships between leadership styles of primary school principals and organizational health and bullying. Design/methodology/approach: Two hypotheses were formulated in relation to the research. Three instruments were used--a multi-level questionnaire for measuring leadership, an…
Development of a Multi-experience Approach in Introductory Soil and Vegetation Geography Courses.
ERIC Educational Resources Information Center
Limbird, Arthur
1982-01-01
Describes an introductory college level course in soil and vegetation which uses lecture, audiovisual tutorial, individualized instruction, field trips, films, and games. The course consists of three segments: basic concepts of soils, basic concepts of plants, and soil and vegetation concepts in a spatial context. (KC)
Tschakert, P.; Tappan, G.
2004-01-01
This paper presents the results of a multi-scale investigation of environmental change in the Old Peanut Basin of Senegal throughout the 20th century. Based on historical accounts, ethnographies, aerial photos, satellite images, field and household surveys as well as various participatory research activities with farmers in selected villages, the study attempts to make explicit layered scales of analysis, both temporally and spatially. It shows that, despite some general trends of resource degradation in the Old Peanut Basin, local farming systems have embarked on different pathways of change to adapt to their evolving environment. It also illustrates that high diversity with respect to soil fertility management exists at the farm and household level. Finally, the paper proposes a farmer-oriented approach to carbon sequestration in order to integrate recommended technical options more efficiently into the complex and dynamic livelihoods of smallholders in dryland environments. This approach includes pathway-specific land use and management options at the level of farming systems and, at the level of individual households, a basket of possible practices from which farmers can choose depending on their multiple needs, capacities, and adaptive strategies to cope with risk and uncertainty.
Efficiently Scheduling Multi-core Guest Virtual Machines on Multi-core Hosts in Network Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoginath, Srikanth B; Perumalla, Kalyan S
2011-01-01
Virtual machine (VM)-based simulation is a method used by network simulators to incorporate realistic application behaviors by executing actual VMs as high-fidelity surrogates for simulated end-hosts. A critical requirement in such a method is the simulation time-ordered scheduling and execution of the VMs. Prior approaches such as time dilation are less efficient due to the high degree of multiplexing possible when multiple multi-core VMs are simulated on multi-core host systems. We present a new simulation time-ordered scheduler to efficiently schedule multi-core VMs on multi-core real hosts, with a virtual clock realized on each virtual core. The distinguishing features of our approach are: (1) customizable granularity of the VM scheduling time unit on the simulation time axis, (2) ability to take arbitrary leaps in virtual time by VMs to maximize the utilization of host (real) cores when guest virtual cores idle, and (3) empirically determinable optimality in the tradeoff between total execution (real) time and time-ordering accuracy levels. Experiments show that it is possible to get nearly perfect time-ordered execution, with a slight cost in total run time, relative to optimized non-simulation VM schedulers. Interestingly, with our time-ordered scheduler, it is also possible to reduce the time-ordering error from over 50% with a non-simulation scheduler to less than 1%, at almost the same run-time efficiency as the highly efficient non-simulation VM schedulers.
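The core idea of scheduling virtual cores in simulation time order can be sketched with a min-heap of per-core virtual clocks (a toy illustration, not the authors' hypervisor-level scheduler; the quantum and horizon values are hypothetical):

```python
import heapq

def time_ordered_schedule(vclocks, quantum, horizon):
    # Always run the virtual core that lags furthest behind in virtual
    # time, advancing its clock by one scheduling quantum per turn.
    heap = [(clock, name) for name, clock in sorted(vclocks.items())]
    heapq.heapify(heap)
    order = []
    while heap[0][0] < horizon:
        clock, name = heapq.heappop(heap)
        order.append(name)
        heapq.heappush(heap, (clock + quantum, name))
    return order

# Two guest cores with staggered virtual clocks: vm0 must catch up first.
order = time_ordered_schedule({"vm0": 0.0, "vm1": 3.0}, quantum=2.0, horizon=8.0)
```

Idle cores could similarly leap forward in virtual time by pushing them back with a larger increment, which is feature (2) above.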
NASA Astrophysics Data System (ADS)
Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.
2012-04-01
Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site, based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations.
For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from the parametric uncertainty. To quantify the conceptual uncertainty from a given site, we combine the outputs from the different conceptual models using Bayesian model averaging. The weight for each model is obtained by integrating available data and expert knowledge using Bayesian belief networks. The multi-model approach is applied to a contaminated site. At the site a DNAPL (dense non aqueous phase liquid) spill consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlaying a limestone aquifer. The exact shape and nature of the source is unknown and so is the importance of transport in the fractures. The result of the multi-model approach is a visual representation of the uncertainty of the mass discharge estimates for the site which can be used to support the management options.
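The combination step can be sketched as follows (a schematic illustration only: the model weights here are hypothetical placeholders, whereas the paper derives them from Bayesian belief networks over site data and expert knowledge):

```python
import numpy as np

def bma_mass_discharge(model_samples, weights, seed=42):
    # Pool per-model Monte Carlo samples of mass discharge by drawing each
    # pooled sample from conceptual model i with probability w_i (BMA).
    rng = np.random.default_rng(seed)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    n = min(len(s) for s in model_samples)
    picks = rng.choice(len(model_samples), size=n, p=w)
    pooled = np.array([model_samples[m][i] for i, m in enumerate(picks)])
    return float(pooled.mean()), np.percentile(pooled, [5, 95])

# Two hypothetical conceptual models of the PCE source (kg/yr samples),
# e.g. matrix-diffusion-dominated vs fracture-dominated transport.
models = [np.random.default_rng(1).lognormal(0.0, 0.5, 1000),
          np.random.default_rng(2).lognormal(1.0, 0.8, 1000)]
mean, (p5, p95) = bma_mass_discharge(models, weights=[0.7, 0.3])
```

The pooled 5th-95th percentile band is one way to visualize the combined parametric and conceptual uncertainty for a site.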
NASA Astrophysics Data System (ADS)
Ketcha, M. D.; De Silva, T.; Uneri, A.; Jacobson, M. W.; Goerres, J.; Kleinszig, G.; Vogt, S.; Wolinsky, J.-P.; Siewerdsen, J. H.
2017-06-01
A multi-stage image-based 3D-2D registration method is presented that maps annotations in a 3D image (e.g. point labels annotating individual vertebrae in preoperative CT) to an intraoperative radiograph in which the patient has undergone non-rigid anatomical deformation due to changes in patient positioning or due to the intervention itself. The proposed method (termed msLevelCheck) extends a previous rigid registration solution (LevelCheck) to provide an accurate mapping of vertebral labels in the presence of spinal deformation. The method employs a multi-stage series of rigid 3D-2D registrations performed on sets of automatically determined and increasingly localized sub-images, with the final stage achieving a rigid mapping for each label to yield a locally rigid yet globally deformable solution. The method was evaluated first in a phantom study in which a CT image of the spine was acquired followed by a series of 7 mobile radiographs with increasing degree of deformation applied. Second, the method was validated using a clinical data set of patients exhibiting strong spinal deformation during thoracolumbar spine surgery. Registration accuracy was assessed using projection distance error (PDE) and failure rate (PDE > 20 mm, i.e. label registered outside the vertebra). The msLevelCheck method was able to register all vertebrae accurately for all cases of deformation in the phantom study, improving the maximum PDE of the rigid method from 22.4 mm to 3.9 mm. The clinical study demonstrated the feasibility of the approach in real patient data by accurately registering all vertebral labels in each case, eliminating all instances of failure encountered in the conventional rigid method. The multi-stage approach demonstrated accurate mapping of vertebral labels in the presence of strong spinal deformation. The msLevelCheck method maintains other advantageous aspects of the original LevelCheck method (e.g. compatibility with standard clinical workflow, large capture range, and robustness against mismatch in image content) and extends capability to cases exhibiting strong changes in spinal curvature.
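The PDE metric is straightforward to compute given a projection matrix (a sketch with a toy identity-like projection; the actual geometric calibration belongs to the LevelCheck pipeline and is not shown here):

```python
import numpy as np

def projection_distance_error(P, x3d, u2d, pixel_mm=1.0):
    # PDE: distance (in mm) between the 3D label projected by the 3x4
    # matrix P and the reference 2D annotation on the radiograph.
    xh = np.append(np.asarray(x3d, dtype=float), 1.0)   # homogeneous point
    proj = P @ xh
    uv = proj[:2] / proj[2]                             # perspective divide
    return pixel_mm * float(np.linalg.norm(uv - np.asarray(u2d, dtype=float)))

# Toy identity-like projection: a label at (3, 4, 1) vs annotation at (0, 0).
P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
pde = projection_distance_error(P, [3.0, 4.0, 1.0], [0.0, 0.0])
failed = pde > 20.0  # failure criterion used in the study
```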
NASA Redox system development project status
NASA Technical Reports Server (NTRS)
Nice, A. W.
1981-01-01
NASA-Redox energy storage systems developed for solar power applications and utility load-leveling applications are discussed. The major objective of the project is to establish the technology readiness of Redox energy storage for transfer to industry for product development and commercialization. The approach is to competitively contract to design, build, and test Redox systems progressively from preprototype to prototype multi-kW and megawatt systems, and to conduct supporting technology advancement tasks. The Redox electrode and membrane have been shown to be fully adequate for multi-kW solar-related applications, demonstrating the viability of the Redox system technology for such applications. The status of the NASA Redox Storage System Project is described, along with the goals and objectives of the project elements.
A multi-image approach to CADx of breast cancer with integration into PACS
NASA Astrophysics Data System (ADS)
Elter, Matthias; Wittenberg, Thomas; Schulz-Wendtland, Rüdiger; Deserno, Thomas M.
2009-02-01
While screening mammography is accepted as the most adequate technique for the early detection of breast cancer, its low positive predictive value leads to many breast biopsies performed on benign lesions. Therefore, we have previously developed a knowledge-based system for computer-aided diagnosis (CADx) of mammographic lesions. It supports the radiologist in the discrimination of benign and malignant lesions. So far, our approach operates on the lesion level and employs the paradigm of content-based image retrieval (CBIR). Similar lesions with known diagnosis are retrieved automatically from a library of references. However, radiologists base their diagnostic decisions on additional resources, such as related mammographic projections, other modalities (e.g. ultrasound, MRI), and clinical data. Nonetheless, most CADx systems disregard the relation between the craniocaudal (CC) and mediolateral-oblique (MLO) views of conventional mammography. Therefore, we extend our approach to the full case level: (i) Multi-frame features are developed that jointly describe a lesion in different views of mammography. Taking into account the geometric relation between different images, these features can also be extracted from multi-modal data; (ii) the CADx system architecture is extended appropriately; (iii) the CADx system is integrated into the radiology information system (RIS) and the picture archiving and communication system (PACS). Here, the framework for image retrieval in medical applications (IRMA) is used to support access to the patient's health care record. Of particular interest is the application of the proposed CADx system to digital breast tomosynthesis (DBT), which has the potential to succeed digital mammography as the standard technique for breast cancer screening. The proposed system is a natural extension of CADx approaches that integrate only two modalities. 
However, we are still compiling a sufficiently large database of breast lesions with images from multiple modalities on which to evaluate the benefits of the proposed approach.
Behavior-aware cache hierarchy optimization for low-power multi-core embedded systems
NASA Astrophysics Data System (ADS)
Zhao, Huatao; Luo, Xiao; Zhu, Chen; Watanabe, Takahiro; Zhu, Tianbo
2017-07-01
In modern embedded systems, the increasing number of cores requires efficient cache hierarchies to sustain data throughput, but such hierarchies are constrained by their large size and by interference accesses, which lead to both performance degradation and wasted energy. In this paper, we first propose a behavior-aware cache hierarchy (BACH) which optimally allocates the multi-level cache resources to many cores and greatly improves the efficiency of the cache hierarchy, resulting in low energy consumption. BACH takes full advantage of the explored application behaviors and runtime cache resource demands as the basis for cache allocation, so that the cache hierarchy can be optimally configured to meet the runtime demand. BACH was implemented on the GEM5 simulator. The experimental results show that the energy consumption of a three-level cache hierarchy can be reduced by 5.29% up to 27.94% compared with other key approaches, while the performance of the multi-core system even shows a slight improvement after accounting for hardware overhead.
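The abstract does not give BACH's actual allocation algorithm, but the idea of partitioning shared cache resources according to per-core runtime demand can be sketched with a simple greedy marginal-utility loop. Everything below (the demand vector, the greedy policy, the function name) is an invented illustration, not the paper's method:

```python
# Hypothetical sketch of demand-driven cache-way partitioning: repeatedly give
# the next cache way to the core with the largest remaining unmet demand.
# This is an illustration of behavior-aware allocation, not BACH itself.

def allocate_ways(demands, total_ways):
    """Greedily assign LLC ways to cores by remaining per-way benefit."""
    alloc = [0] * len(demands)
    for _ in range(total_ways):
        gains = [d - a for d, a in zip(demands, alloc)]  # marginal utility
        winner = gains.index(max(gains))
        alloc[winner] += 1
    return alloc

# a cache-hungry core 0 next to two modest cores
print(allocate_ways([8, 3, 1], 8))
```

With equal demands the loop degenerates to an even split, which is what a static (behavior-unaware) partition would always give.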
Optimizing Multi-Product Multi-Constraint Inventory Control Systems with Stochastic Replenishments
NASA Astrophysics Data System (ADS)
Allah Taleizadeh, Ata; Aryanezhad, Mir-Bahador; Niaki, Seyed Taghi Akhavan
Multi-periodic inventory control problems are mainly studied under two assumptions. The first is continuous review, where orders can be placed at any time depending on the inventory level; the other is periodic review, where orders can only be placed at the beginning of each period. In this study, we relax these assumptions and assume that the periodic replenishments are stochastic in nature. Furthermore, we assume that the periods between two replenishments are independent and identically distributed random variables. For the problem at hand, the decision variables are integer-valued, and there are two kinds of constraints, on space and on service level, for each product. We develop a model of the problem in which a combination of back-orders and lost sales is considered for the shortages. We then show that the model is of the integer-nonlinear-programming type, so a search algorithm is required to solve it. We employ a simulated annealing approach and provide a numerical example to demonstrate the applicability of the proposed methodology.
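A minimal simulated annealing loop over integer order quantities conveys the flavor of the search described above. The two-product cost model, the space constraint, the penalty weight, and the cooling schedule below are all invented for this sketch and are not the paper's formulation:

```python
import math
import random

# Illustrative-only simulated annealing over integer order quantities for two
# products sharing warehouse space; all parameters are made up for the sketch.

def cost(q, capacity=60):
    holding = 0.5 * q[0] + 0.8 * q[1]
    shortage = max(0, 40 - q[0]) + max(0, 25 - q[1])  # back-orders/lost sales
    space = 2 * q[0] + 3 * q[1]
    penalty = 100 * max(0, space - capacity)          # infeasibility penalty
    return holding + shortage + penalty

def anneal(seed=0, steps=5000, t0=10.0):
    rng = random.Random(seed)
    q = [0, 0]
    best_q, best_c = list(q), cost(q)
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-9               # linear cooling schedule
        cand = [max(0, x + rng.choice([-1, 0, 1])) for x in q]
        d = cost(cand) - cost(q)
        if d < 0 or rng.random() < math.exp(-d / t):  # Metropolis acceptance
            q = cand
        if cost(q) < best_c:
            best_q, best_c = list(q), cost(q)
    return best_q, best_c

best_q, best_cost = anneal()
print(best_q, best_cost)
```

For this toy cost function the true optimum is 50 (order 30 units of product 1, filling the space budget), which the annealer should approach from any seed.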
Lee, Jae Hoon; Kim, Joon Ha; Oh, Hee-Mock; An, Kwang-Guk
2013-01-01
The objectives of this study were to identify multi-level stressors, from the DNA/biochemical level to the community level, in fish in an urban stream and to develop an integrative health response (IHR) model for ecological health diagnosis. A pristine control site (Sc) and an impacted site (Si) were selected from among seven pre-screened sites studied over seven years. Various chemical analyses indicated that nutrient enrichment (nitrogen, phosphorus) and organic pollution were significantly greater (t > 8.783, p < 0.01) at the Si site than at the Sc site. Single-cell gel electrophoresis (comet assays) of DNA-level impairment indicated significantly (t = 5.678, p < 0.01) greater tail intensity, expressed as % tail-DNA, at the Si site, and genotoxic responses were detected in the downstream reach. Ethoxyresorufin-O-deethylase (EROD) assays, as a physiological bioindicator, were 2.8-fold higher (p < 0.05, NK-test after ANOVA) at the Si site. Tissue analysis using a necropsy-based health assessment index (NHAI) showed distinct internal organ disorders in three tissues (liver, kidney, and gill) at the Si site. Population-level analysis using the sentinel species Zacco platypus showed that the regression coefficient (b) was 3.012 for the Si site and 2.915 for the Sc site, indicating population skewness in the downstream reach. Community-level health was impaired at the Si site based on an index of biological integrity (IBI), and physical habitat modifications were identified by a qualitative habitat evaluation index (QHEI). Overall, the model values for the integrative health response (IHR), developed using the star plot approach, were 3.22 (80.5%) at the Sc site and 0.74 (18.5%) at the Si site, indicating that ecological health impairments were evident in the urban reach.
Our study was based on multi-level approaches spanning levels of biological organization, and the results suggest a pivotal point of linkage between mechanistic understanding and the real ecological consequences of environmental stressors.
Wels, Michael; Carneiro, Gustavo; Aplas, Alexander; Huber, Martin; Hornegger, Joachim; Comaniciu, Dorin
2008-01-01
In this paper we present a fully automated approach to the segmentation of pediatric brain tumors in multi-spectral 3-D magnetic resonance images. It is a top-down segmentation approach based on a Markov random field (MRF) model that combines probabilistic boosting trees (PBT) and lower-level segmentation via graph cuts. The PBT algorithm provides a strong discriminative observation model that classifies tumor appearance while a spatial prior takes into account the pair-wise homogeneity in terms of classification labels and multi-spectral voxel intensities. The discriminative model relies not only on observed local intensities but also on surrounding context for detecting candidate regions for pathology. A mathematically sound formulation for integrating the two approaches into a unified statistical framework is given. The proposed method is applied to the challenging task of detection and delineation of pediatric brain tumors. This segmentation task is characterized by a high non-uniformity of both the pathology and the surrounding non-pathologic brain tissue. A quantitative evaluation illustrates the robustness of the proposed method. Despite dealing with more complicated cases of pediatric brain tumors the results obtained are mostly better than those reported for current state-of-the-art approaches to 3-D MR brain tumor segmentation in adult patients. The entire processing of one multi-spectral data set does not require any user interaction, and takes less time than previously proposed methods.
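The energy that such an MRF formulation trades off, a unary appearance term from the discriminative classifier plus a pairwise homogeneity term on neighbouring labels, can be illustrated with a toy one-dimensional example. The probabilities, the smoothness weight `beta`, and the 1-D neighbourhood are invented here and are far simpler than the paper's actual 3-D multi-spectral model:

```python
import math

# Toy MRF energy: negative log-likelihood of each voxel's label under the
# classifier output, plus a Potts-style penalty for unlike neighbouring labels.

def mrf_energy(labels, tumor_prob, beta=0.5):
    unary = sum(-math.log(p if l == 1 else 1 - p)
                for l, p in zip(labels, tumor_prob))
    pairwise = beta * sum(int(a != b) for a, b in zip(labels, labels[1:]))
    return unary + pairwise

tumor_prob = [0.9, 0.8, 0.3, 0.1]  # classifier output along a 1-D voxel strip
smooth = [1, 1, 0, 0]              # spatially coherent labeling
noisy  = [1, 0, 1, 0]              # alternating labeling
print(mrf_energy(smooth, tumor_prob) < mrf_energy(noisy, tumor_prob))
```

Graph cuts find the global minimum of exactly this kind of energy for binary labels; the toy comparison just shows why the coherent labeling wins.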
NASA Technical Reports Server (NTRS)
Park, Young W.; Montez, Moises N.
1994-01-01
A candidate onboard space navigation filter demonstrated excellent performance (less than 8 m RMS semi-major axis accuracy) in performing orbit determination of a low-Earth-orbit Explorer satellite using single-frequency real GPS data. This performance is significantly better than that predicted by other simulation studies using dual-frequency GPS data. The study results revealed the significance of two new modeling approaches evaluated in the work. One approach introduces a single-frequency ionospheric correction through a pseudo-range and phase-range averaging implementation. The other approach demonstrates a precise axis-dependent characterization of dynamic sample space uncertainty to compute a more accurate Kalman filter gain. Additionally, this navigation filter demonstrates the flexibility to accommodate both perturbational dynamic and observational biases required for multi-flight-phase and inhomogeneous application environments. This paper reviews the potential application of these methods and the filter structure to terrestrial vehicle and positioning applications. Both the single-frequency ionospheric correction method and the axis-dependent state noise modeling approach offer valuable contributions in cost and accuracy improvements for terrestrial GPS receivers. With a modular design approach to either 'plug in' or 'unplug' various force models, this multi-flight-phase navigation filter design structure also provides a versatile GPS navigation software engine for both atmospheric and exo-atmospheric navigation or positioning use, thereby streamlining the flight-phase- or application-dependent software requirements. Thus, a standardized GPS navigation software engine that can reduce the development and maintenance cost of commercial GPS receivers is now possible.
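The pseudo-range and phase-range averaging correction works because first-order ionospheric delay enters the code observation with the opposite sign to the carrier-phase observation, so their half-sum is (to first order) ionosphere-free. The toy numbers below are invented, and the carrier-phase ambiguity term is deliberately ignored for clarity; this is the sign-cancellation idea only, not the paper's filter implementation:

```python
# Half-sum of code and carrier observations cancels first-order ionospheric
# delay: code is delayed by +I (group delay), carrier is advanced by -I.

def iono_free_range(pseudorange, phase_range):
    return 0.5 * (pseudorange + phase_range)

true_range = 20_200_000.0    # metres (hypothetical receiver-satellite geometry)
iono = 4.2                   # metres of first-order ionospheric delay
code = true_range + iono     # pseudo-range observation
carrier = true_range - iono  # phase-range observation (ambiguity ignored)

print(iono_free_range(code, carrier) - true_range)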
User-assisted visual search and tracking across distributed multi-camera networks
NASA Astrophysics Data System (ADS)
Raja, Yogesh; Gong, Shaogang; Xiang, Tao
2011-11-01
Human CCTV operators face several challenges in their task which can lead to missed events, people or associations, including: (a) data overload in large distributed multi-camera environments; (b) short attention span; (c) limited knowledge of what to look for; and (d) lack of access to non-visual contextual intelligence to aid search. Developing a system to aid human operators and alleviate such burdens requires addressing the problem of automatic re-identification of people across disjoint camera views, a matching task made difficult by factors such as lighting, viewpoint and pose changes and for which absolute scoring approaches are not best suited. Accordingly, we describe a distributed multi-camera tracking (MCT) system to visually aid human operators in associating people and objects effectively over multiple disjoint camera views in a large public space. The system comprises three key novel components: (1) relative measures of ranking rather than absolute scoring to learn the best features for matching; (2) multi-camera behaviour profiling as higher-level knowledge to reduce the search space and increase the chance of finding correct matches; and (3) human-assisted data mining to interactively guide search and in the process recover missing detections and discover previously unknown associations. We provide an extensive evaluation of the greater effectiveness of the system as compared to existing approaches on industry-standard i-LIDS multi-camera data.
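The first component, relative ranking rather than absolute scoring, means the system orders gallery candidates by distance to the probe and lets the operator inspect the top of the list instead of thresholding a match score. The feature vectors and the L1 distance below are invented toy data, not the system's learned features:

```python
# Rank gallery identities by distance to a probe feature vector: a relative
# ordering, so no absolute match-score threshold is needed.

def rank_gallery(probe, gallery):
    def l1(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return sorted(gallery, key=lambda item: l1(probe, item[1]))

gallery = [("person_A", [0.90, 0.10, 0.40]),
           ("person_B", [0.20, 0.80, 0.50]),
           ("person_C", [0.85, 0.15, 0.35])]
probe = [0.88, 0.12, 0.38]

ranked = [pid for pid, _ in rank_gallery(probe, gallery)]
print(ranked)
```

The behaviour-profiling component of the system would then prune this gallery before ranking, shrinking the list the operator must review.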
A mixed integer bi-level DEA model for bank branch performance evaluation by Stackelberg approach
NASA Astrophysics Data System (ADS)
Shafiee, Morteza; Lotfi, Farhad Hosseinzadeh; Saleh, Hilda; Ghaderi, Mehdi
2016-03-01
One of the most complicated decision-making problems for managers is the evaluation of bank performance, which involves various criteria. There are many studies on bank efficiency evaluation by network DEA in the literature, but these studies do not focus on multi-level networks. Wu (Eur J Oper Res 207:856-864, 2010) proposed a bi-level structure for cost efficiency for the first time. In this model, multi-level programming and cost efficiency were used, and a nonlinear program was solved. In this paper, we focus on the multi-level structure and propose a bi-level DEA model, which we solve using linear programming. Moreover, we significantly improve the way the optimum solution is reached in comparison with the work of Wu (2010) by converting the NP-hard nonlinear program into a mixed integer linear program. This study uses a bi-level programming data envelopment analysis model that embodies an internal structure with Stackelberg-game relationships to evaluate the performance of a banking chain. The perspective of decentralized decisions is taken in this paper to cope with complex interactions in the banking chain. The results derived from bi-level programming DEA can provide valuable insights and detailed information for managers to help them evaluate the performance of the banking chain as a whole using Stackelberg-game relationships. Finally, this model was applied to an Iranian bank to evaluate cost efficiency.
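The Stackelberg structure underlying any bi-level model can be shown by brute-force enumeration on a tiny game: the leader commits to a choice anticipating that the follower will best-respond to it. The payoff tables below are made-up toy numbers, not bank data, and enumeration stands in for the paper's mixed integer linear program:

```python
# Minimal Stackelberg (leader-follower) solution by enumeration: the leader
# evaluates each action under the follower's best response and keeps the best.

def follower_best_response(leader_choice, follower_payoff):
    options = follower_payoff[leader_choice]
    return max(options, key=options.get)

def stackelberg(leader_payoff, follower_payoff):
    best = None
    for a in leader_payoff:
        b = follower_best_response(a, follower_payoff)
        value = leader_payoff[a][b]
        if best is None or value > best[2]:
            best = (a, b, value)
    return best

leader_payoff = {"low_cost":  {"expand": 5, "hold": 2},
                 "high_cost": {"expand": 4, "hold": 6}}
follower_payoff = {"low_cost":  {"expand": 3, "hold": 1},
                   "high_cost": {"expand": 2, "hold": 4}}

print(stackelberg(leader_payoff, follower_payoff))
```

In the bi-level DEA setting the "actions" are continuous efficiency weights, which is why an MILP reformulation replaces enumeration; the anticipation logic is the same.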
Wiltshire, Travis J.; Lobato, Emilio J. C.; McConnell, Daniel S.; Fiore, Stephen M.
2015-01-01
In this paper we suggest that differing approaches to the science of social cognition mirror the arguments between radical embodied and traditional approaches to cognition. We contrast the use in social cognition of theoretical inference and mental simulation mechanisms with approaches emphasizing a direct perception of others’ mental states. We build from a recent integrative framework unifying these divergent perspectives through the use of dual-process theory and supporting social neuroscience research. Our elaboration considers two complementary notions of direct perception: one primarily stemming from ecological psychology and the other from enactive cognition theory. We use this as the foundation from which to offer an account of the informational basis for social information and assert a set of research propositions to further the science of social cognition. In doing so, we point out how perception of the minds of others can be supported in some cases by lawful information, supporting direct perception of social affordances and perhaps, mental states, and in other cases by cues that support indirect perceptual inference. Our goal is to extend accounts of social cognition by integrating advances across disciplines to provide a multi-level and multi-theoretic description that can advance this field and offer a means through which to reconcile radical embodied and traditional approaches to cognitive neuroscience. PMID:25709572
Ontological approach for safe and effective polypharmacy prescription
Grando, Adela; Farrish, Susan; Boyd, Cynthia; Boxwala, Aziz
2012-01-01
The intake of multiple medications by patients with various medical conditions challenges the delivery of medical care. Initial empirical studies and pilot implementations seem to indicate that generic safe and effective multi-drug prescription principles could be defined and reused to reduce adverse drug events and to support compliance with medical guidelines and drug formularies. Given that ontologies are known to provide well-principled, sharable, setting-independent and machine-interpretable declarative specification frameworks for modeling and reasoning on biomedical problems, we explore here their use in the context of multi-drug prescription. We propose an ontology for modeling drug-related knowledge and a repository of safe and effective generic prescription principles. To test the usability and the level of granularity of the developed ontology-based specification models and heuristics, we implemented a tool that computes the complexity of multi-drug treatments, and a decision aid to check the safety and effectiveness of prescribed multi-drug treatments. PMID:23304299
Abiotic Stress Responses and Microbe-Mediated Mitigation in Plants: The Omics Strategies
Meena, Kamlesh K.; Sorty, Ajay M.; Bitla, Utkarsh M.; Choudhary, Khushboo; Gupta, Priyanka; Pareek, Ashwani; Singh, Dhananjaya P.; Prabha, Ratna; Sahu, Pramod K.; Gupta, Vijai K.; Singh, Harikesh B.; Krishanani, Kishor K.; Minhas, Paramjit S.
2017-01-01
Abiotic stresses are the foremost limiting factors for agricultural productivity. Crop plants must cope with the adverse external pressure created by environmental and edaphic conditions through their intrinsic biological mechanisms, failing which their growth, development, and productivity suffer. Microorganisms, the most natural inhabitants of diverse environments, exhibit enormous metabolic capabilities to mitigate abiotic stresses. Since microbial interactions with plants are an integral part of the living ecosystem, they are believed to be the natural partners that modulate local and systemic mechanisms in plants to offer defense under adverse external conditions. Plant-microbe interactions comprise complex mechanisms within the plant cellular system. Biochemical, molecular and physiological studies are paving the way toward understanding these complex but integrated cellular processes. Under the continuous pressure of increasing climatic alterations, it becomes ever more imperative to define and interpret plant-microbe relationships in terms of protection against abiotic stresses, and at the same time to generate deeper insights into the stress-mitigating mechanisms in crop plants for their translation into higher productivity. Multi-omics approaches comprising genomics, transcriptomics, proteomics, metabolomics and phenomics integrate studies on the interaction of plants with microbes and their external environment and generate multi-layered information that can answer what is happening in real time within the cells. Integration, analysis and interpretation of these big data can lead to outcomes with a significant chance of implementation in the field. This review summarizes abiotic stress responses in plants in terms of biochemical and molecular mechanisms, followed by the microbe-mediated stress-mitigation phenomenon. 
We describe the role of multi-omics approaches in generating multi-pronged information to provide a better understanding of plant-microbe interactions that modulate cellular mechanisms in plants under extreme external conditions and help to mitigate abiotic stresses. Vigilant amalgamation of these high-throughput approaches supports a higher level of knowledge generation about the root-level mechanisms involved in the alleviation of abiotic stress in organisms. PMID:28232845
Methodological flaws introduce strong bias into molecular analysis of microbial populations.
Krakat, N; Anjum, R; Demirel, B; Schröder, P
2017-02-01
In this study, we report how different cell disruption methods, PCR primers and in silico analyses can seriously bias results from microbial population studies, with consequences for the credibility and reproducibility of the findings. Our results emphasize the pitfalls of commonly used experimental methods that can seriously weaken the interpretation of results. Four different cell lysis methods, three commonly used primer pairs and various computer-based analyses were applied to investigate the microbial diversity of a fermentation sample composed of chicken dung. The fault-prone, but still frequently used, amplified rRNA gene restriction analysis was chosen to identify common weaknesses. In contrast to other studies, we focused on the complete analytical process, from cell disruption to in silico analysis, and identified potential error rates. This identified a wide disagreement of results between applied experimental approaches leading to very different community structures depending on the chosen approach. The interpretation of microbial diversity data remains a challenge. In order to accurately investigate the taxonomic diversity and structure of prokaryotic communities, we suggest a multi-level approach combining DNA-based and DNA-independent techniques. The identified weaknesses of commonly used methods to study microbial diversity can be overcome by a multi-level approach, which produces more reliable data about the fate and behaviour of microbial communities of engineered habitats such as biogas plants, so that the best performance can be ensured. © 2016 The Society for Applied Microbiology.
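One way the reported bias shows up downstream is in diversity summaries: the same sample, processed with two different lysis methods, can yield different apparent community structures. The OTU count profiles below are invented, and the Shannon index is used only as one common DNA-based summary statistic:

```python
import math

# Shannon diversity of two hypothetical OTU count profiles obtained from the
# same sample by different cell-lysis methods; the counts are made up to show
# how method choice alone can shift the apparent diversity.

def shannon(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c)

method_a = [50, 30, 15, 5]       # gentle lysis: tough-walled taxa under-counted
method_b = [40, 25, 15, 10, 10]  # harsher lysis: two additional taxa recovered
print(round(shannon(method_a), 3), round(shannon(method_b), 3))
```

A multi-level approach, as the authors suggest, would cross-check such DNA-based indices against DNA-independent evidence before interpreting the difference.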
Gregory, Emma; West, Therese A; Cole, Wesley R; Bailie, Jason M; McCulloch, Karen L; Ettenhofer, Mark L; Cecchini, Amy; Qashu, Felicia M
2017-01-01
The large number of U.S. service members diagnosed with concussion/mild traumatic brain injury each year underscores the necessity for clear and effective clinical guidance for managing concussion. Relevant research continues to emerge supporting a gradual return to pre-injury activity levels without aggravating symptoms; however, available guidance does not provide detailed standards for this return to activity process. To fill this gap, the Defense and Veterans Brain Injury Center released a recommendation for primary care providers detailing a step-wise return to unrestricted activity during the acute phase of concussion. This guidance was developed in collaboration with an interdisciplinary group of clinical, military, and academic subject matter experts using an evidence-based approach. Systematic evaluation of the guidance is critical to ensure positive patient outcomes, to discover barriers to implementation by providers, and to identify ways to improve the recommendation. Here we describe a multi-level, mixed-methods approach to evaluate the recommendation incorporating outcomes from both patients and providers. Procedures were developed to implement the study within complex but ecologically-valid settings at multiple military treatment facilities and operational medical units. Special consideration was given to anticipated challenges such as the frequent movement of military personnel, selection of appropriate design and measures, study implementation at multiple sites, and involvement of multiple service branches (Army, Navy, and Marine Corps). We conclude by emphasizing the need to consider contemporary approaches for evaluating the effectiveness of clinical guidance. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kant, Deepender, E-mail: dkc@ceeri.ernet.in; Joshi, L. M.; Janyani, Vijay
The klystron is a well-known microwave amplifier which uses the kinetic energy of an electron beam for amplification of an RF signal. Conventional single-beam klystrons have some limitations, such as high operating voltage, low efficiency and bulky size at higher power levels, which are very effectively addressed in the Multi-Beam Klystron (MBK), which uses multiple low-perveance electron beams for RF interaction. Each beam propagates along its individual transit path through a resonant cavity structure. Multi-beam klystron cavity design is a critical task due to the asymmetric cavity structure and can be simulated only by a 3D code. The present paper discusses the design of multi-beam RF cavities for klystrons operating at 2856 MHz (S-band) and 5 GHz (C-band), respectively. The design approach uses scaling laws for deriving the electron beam parameters of the multi-beam device from their single-beam counterparts. The scaled beam parameters are then used for finding the design parameters of the multi-beam cavities. The design of the desired multi-beam cavity can be optimized through iterative simulations in CST Microwave Studio.
Quantitative multi-modal NDT data analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heideklang, René; Shokouhi, Parisa
2014-02-18
A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.
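A high-level (decision-level) fusion scheme can be sketched as a per-location vote across modalities: a defect is flagged only where enough sensors agree, which suppresses single-sensor false alarms and thereby raises specificity. The detection maps below are invented toy data, not the Eddy Current/GMR/Thermography measurements of the study:

```python
# Decision-level majority fusion of binary detection maps from three
# modalities; a location is flagged only if >= `threshold` sensors agree.

def majority_fusion(maps, threshold=2):
    return [int(sum(votes) >= threshold) for votes in zip(*maps)]

eddy_current = [1, 0, 1, 0, 1]   # one false alarm at index 4
gmr          = [1, 0, 1, 1, 0]   # one false alarm at index 3
thermography = [1, 0, 1, 0, 0]
ground_truth = [1, 0, 1, 0, 0]

fused = majority_fusion([eddy_current, gmr, thermography])
print(fused)
```

Each single-sensor false alarm is outvoted, while the true defects, seen redundantly by all sensors, survive: the redundancy-quantification point made above.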
Applying a Consumer Behavior Lens to Salt Reduction Initiatives.
Regan, Áine; Kent, Monique Potvin; Raats, Monique M; McConnon, Áine; Wall, Patrick; Dubois, Lise
2017-08-18
Reformulation of food products to reduce salt content has been a central strategy for achieving population level salt reduction. In this paper, we reflect on current reformulation strategies and consider how consumer behavior determines the ultimate success of these strategies. We consider the merits of adopting a 'health by stealth', silent approach to reformulation compared to implementing a communications strategy which draws on labeling initiatives in tandem with reformulation efforts. We end this paper by calling for a multi-actor approach which utilizes co-design, participatory tools to facilitate the involvement of all stakeholders, including, and especially, consumers, in making decisions around how best to achieve population-level salt reduction.
Ansell, Nicola
2014-01-01
Critics of empowerment have highlighted the concept's mutability, focus on individual transformation, one-dimensionality and challenges of operationalisation. Relating these critiques to children's empowerment raises new challenges. Drawing on scholarship on children's subjecthood and exercise of power, alongside empirical research with children affected by AIDS, I argue that empowerment envisaged as individual self-transformation and increased capacity to act independently offers little basis for progressive change. Rather it is essential to adopt a relational approach that recognises the need to transform power relationships at multiple levels. This analysis has implications for our wider understanding of empowerment in the 21st century.
Defense Reform: Supporting the Whole-of-Government Approach in Tomorrow’s Crisis
2017-03-29
government approach to trans-regional, multi-domain, and multi-functional threats. In addition to keeping military and political focus on broader...structure with more subordinate commands and less multi-domain and multi-functional integration, or in this case , vertical integration. Relying on ... APPROACH IN TOMORROW’S CRISIS 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Lt Col John B
Weather and seasonal climate prediction for South America using a multi-model superensemble
NASA Astrophysics Data System (ADS)
Chaves, Rosane R.; Ross, Robert S.; Krishnamurti, T. N.
2005-11-01
This work examines the feasibility of weather and seasonal climate predictions for South America using the multi-model synthetic superensemble approach for climate, and the multi-model conventional superensemble approach for numerical weather prediction, both developed at Florida State University (FSU). The effect on seasonal climate forecasts of the number of models used in the synthetic superensemble is investigated. It is shown that the synthetic superensemble approach for climate and the conventional superensemble approach for numerical weather prediction can reduce the errors over South America in seasonal climate prediction and numerical weather prediction. For climate prediction, a suite of 13 models is used. The forecast lead-time is 1 month for the climate forecasts, which consist of precipitation and surface temperature forecasts. The multi-model ensemble is comprised of four versions of the FSU-Coupled Ocean-Atmosphere Model, seven models from the Development of a European Multi-model Ensemble System for Seasonal to Interannual Prediction (DEMETER), a version of the Community Climate Model (CCM3), and a version of the predictive Ocean Atmosphere Model for Australia (POAMA). The results show that conditions over South America are appropriately simulated by the Florida State University Synthetic Superensemble (FSUSSE) in comparison to observations and that the skill of this approach increases with the use of additional models in the ensemble. When compared to observations, the forecasts are generally better than those from both a single climate model and the multi-model ensemble mean, for the variables tested in this study. For numerical weather prediction, the conventional Florida State University Superensemble (FSUSE) is used to predict the mass and motion fields over South America.
Predictions of mean sea level pressure, 500 hPa geopotential height, and 850 hPa wind are made with a multi-model superensemble comprised of six global models for the period January, February, and December of 2000. The six global models are from the following forecast centers: FSU, Bureau of Meteorology Research Center (BMRC), Japan Meteorological Agency (JMA), National Centers for Environmental Prediction (NCEP), Naval Research Laboratory (NRL), and Recherche en Prevision Numerique (RPN). Predictions of precipitation are made for the period January, February, and December of 2001 with a multi-analysis-multi-model superensemble where, in addition to the six forecast models just mentioned, five additional versions of the FSU model are used in the ensemble, each with a different initialization (analysis) based on different physical initialization procedures. On the basis of observations, the results show that the FSUSE provides the best forecasts of the mass and motion field variables to forecast day 5, when compared to both the models comprising the ensemble and the multi-model ensemble mean during the wet season of December-February over South America. Individual case studies show that the FSUSE provides excellent predictions of rainfall for particular synoptic events to forecast day 3.
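The superensemble's defining step is a regression of member-model forecast anomalies against observed anomalies over a training period, yielding one weight per member. A minimal two-member version with synthetic training data and a closed-form least-squares solve illustrates the idea; the real FSU superensemble uses many members, gridded fields, and a separate forecast phase:

```python
# Two-member superensemble weight training: solve the 2x2 normal equations
# for weights on member anomalies that best fit the observed anomalies.

def superensemble_weights(f1, f2, obs):
    s11 = sum(a * a for a in f1)
    s12 = sum(a * b for a, b in zip(f1, f2))
    s22 = sum(b * b for b in f2)
    r1 = sum(a * o for a, o in zip(f1, obs))
    r2 = sum(b * o for b, o in zip(f2, obs))
    det = s11 * s22 - s12 * s12
    return ((s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det)

# synthetic training anomalies: member 1 is the observation scaled by 2,
# member 2 carries no real signal
obs = [1.0, -2.0, 0.5, 1.5]
f1  = [2.0, -4.0, 1.0, 3.0]
f2  = [0.3, 0.4, -0.2, 0.1]

w1, w2 = superensemble_weights(f1, f2, obs)
print(round(w1, 3), round(w2, 3))
```

Because member 1 is exactly twice the observation, the regression recovers a weight of 0.5 for it and essentially zero for the uninformative member, which is how the superensemble outperforms the plain ensemble mean (weights of 1/N each).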
Schoer, Karl; Wood, Richard; Arto, Iñaki; Weinzettel, Jan
2013-12-17
The mass of material consumed by a population has become a useful proxy for measuring environmental pressure. The "raw material equivalents" (RME) metric of material consumption addresses the issue of including the full supply chain (including imports) when calculating national or product-level material impacts. The RME calculation suffers from limited data availability, however, as quantitative data on production practices along the full supply chain (in different regions) are required. Hence, the RME is currently estimated by three main approaches: (1) assuming domestic technology in foreign economies, (2) utilizing region-specific life-cycle inventories (in a hybrid framework), and (3) utilizing multi-regional input-output (MRIO) analysis to explicitly cover all regions of the supply chain. While the first approach has been shown to give inaccurate results, this paper focuses on the benefits and costs of the latter two approaches. We analyze results from two key (MRIO and hybrid) projects modeling raw material equivalents, adjusting the models in a stepwise manner in order to quantify the effects of individual conceptual elements. We attempt to isolate the MRIO gap, which denotes the quantitative impact of calculating the RME of imports by an MRIO approach instead of the hybrid model, focusing on the RME of EU external trade imports. While the models give quantitatively similar results, differences become more pronounced when tracking more detailed material flows. We assess the advantages and disadvantages of the two approaches and outline ways to further harmonize data and approaches.
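The input-output calculation underlying approach (3) rests on the Leontief inverse: total output x = (I - A)^-1 y for final demand y, with material extraction coefficients applied to x to obtain the embodied extraction. A two-sector numerical sketch with made-up coefficients (a real MRIO table spans many regions and sectors):

```python
import numpy as np

# Minimal two-sector input-output footprint sketch (illustrative numbers).
# A[i, j]: input from sector i needed per unit output of sector j.
A = np.array([[0.10, 0.30],
              [0.20, 0.05]])
# Direct material extraction per unit output of each sector (e.g. tonnes).
m = np.array([2.0, 0.5])
# Final demand of the studied region.
y = np.array([100.0, 50.0])

# Leontief inverse: total output x required to satisfy y, x = (I - A)^-1 y.
L = np.linalg.inv(np.eye(2) - A)
x = L @ y

# Raw material equivalents of final demand: extraction embodied in x.
rme = m @ x
print(x, rme)
```

Note that the embodied extraction m·x exceeds the direct extraction m·y, which is exactly the supply-chain effect the RME metric is designed to capture.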
2010-01-01
Multi-Disciplinary, Multi-Output Sensitivity Analysis (MIMOSA) [table-of-contents fragment: 3.1 Introduction to Research Thrust 1; 3.3 MIMOSA Approach; 3.3.1 Collaborative Consistency of MIMOSA; 3.3.2 Formulation of MIMOSA]
Ren, Huiying; Ray, Jaideep; Hou, Zhangshuan; ...
2017-10-17
In this paper we developed an efficient Bayesian inversion framework for interpreting marine seismic Amplitude Versus Angle and Controlled-Source Electromagnetic data for marine reservoir characterization. The framework uses a multi-chain Markov-chain Monte Carlo sampler, which is a hybrid of DiffeRential Evolution Adaptive Metropolis and Adaptive Metropolis samplers. The inversion framework is tested by estimating reservoir-fluid saturations and porosity based on marine seismic and Controlled-Source Electromagnetic data. The multi-chain Markov-chain Monte Carlo is scalable in terms of the number of chains, and is useful for computationally demanding Bayesian model calibration in scientific and engineering problems. As a demonstration, the approach is used to efficiently and accurately estimate the porosity and saturations in a representative layered synthetic reservoir. The results indicate that the seismic Amplitude Versus Angle and Controlled-Source Electromagnetic joint inversion provides better estimation of reservoir saturations than the seismic Amplitude Versus Angle only inversion, especially for the parameters in deep layers. The performance of the inversion approach for various levels of noise in observational data was evaluated; reasonable estimates can be obtained with noise levels up to 25%. Sampling efficiency due to the use of multiple chains was also checked and was found to have almost linear scalability.
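The near-linear scaling in the number of chains comes from the fact that chains can run largely independently. A toy single-parameter random-walk Metropolis sampler run as several chains, standing in for (and much simpler than) the DREAM/Adaptive Metropolis hybrid the paper uses; the "posterior" is an illustrative Gaussian:

```python
import numpy as np

def log_post(theta):
    # Toy unnormalized log-posterior: Gaussian "porosity" centered at 0.3.
    return -0.5 * ((theta - 0.3) / 0.05) ** 2

def metropolis_chain(seed, n=5000, step=0.05):
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 1)            # random over-dispersed start
    samples = np.empty(n)
    for i in range(n):
        prop = theta + rng.normal(0, step)
        # Standard Metropolis accept/reject on the log scale.
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop
        samples[i] = theta
    return samples[n // 2:]              # discard burn-in

# Multiple chains are embarrassingly parallel; here they run sequentially.
chains = [metropolis_chain(s) for s in range(4)]
pooled = np.concatenate(chains)
print(pooled.mean(), pooled.std())
```

Pooling several independently seeded chains both speeds up sampling and lets between-chain spread diagnose convergence.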
A variational approach to multi-phase motion of gas, liquid and solid based on the level set method
NASA Astrophysics Data System (ADS)
Yokoi, Kensuke
2009-07-01
We propose a simple and robust numerical algorithm to deal with multi-phase motion of gas, liquid and solid based on the level set method [S. Osher, J.A. Sethian, Fronts propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulations, J. Comput. Phys. 79 (1988) 12; M. Sussman, P. Smereka, S. Osher, A level set approach for computing solutions to incompressible two-phase flow, J. Comput. Phys. 114 (1994) 146; J.A. Sethian, Level Set Methods and Fast Marching Methods, Cambridge University Press, 1999; S. Osher, R. Fedkiw, Level Set Methods and Dynamic Implicit Surfaces, Applied Mathematical Sciences, vol. 153, Springer, 2003]. In an Eulerian framework, to simulate interaction between a moving solid object and an interfacial flow, we need to define at least two functions (level set functions) to distinguish three materials. In such simulations, the two functions in general overlap and/or disagree due to numerical errors such as numerical diffusion. In this paper, we resolve the problem using the idea of the active contour model [M. Kass, A. Witkin, D. Terzopoulos, Snakes: active contour models, International Journal of Computer Vision 1 (1988) 321; V. Caselles, R. Kimmel, G. Sapiro, Geodesic active contours, International Journal of Computer Vision 22 (1997) 61; G. Sapiro, Geometric Partial Differential Equations and Image Analysis, Cambridge University Press, 2001; R. Kimmel, Numerical Geometry of Images: Theory, Algorithms, and Applications, Springer-Verlag, 2003] introduced in the field of image processing.
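The transport step at the heart of any level set method moves an interface by advecting the signed-distance function. A minimal 1-D sketch with an illustrative grid and speed (far simpler than the paper's multi-phase solver, and with no reinitialization or correction step):

```python
import numpy as np

# Minimal 1-D level set transport sketch: the interface at phi = 0 is
# advected with speed u using first-order upwind differencing.
n, dx, u, dt = 200, 0.01, 1.0, 0.005    # CFL = u*dt/dx = 0.5
x = np.arange(n) * dx
phi = x - 0.5                           # signed distance; interface at x = 0.5

for _ in range(100):                    # advance to t = 0.5
    dphi = np.diff(phi, prepend=phi[0]) / dx   # backward difference (u > 0)
    phi = phi - u * dt * dphi

# The zero crossing should have moved from x = 0.5 to about x = 1.0.
interface = x[np.argmin(np.abs(phi))]
print(interface)
```

With two such functions tracking three materials, discretization errors of this transport step are what cause the overlap/vacuum problem the paper addresses.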
Digital image archiving: challenges and choices.
Dumery, Barbara
2002-01-01
In the last five years, imaging exam volume has grown rapidly. In addition to increased image acquisition, there is more patient information per study. RIS-PACS integration and information-rich DICOM headers now provide us with more patient information relative to each study. The volume of archived digital images is increasing and will continue to rise at a steeper incline than film-based storage of the past. Many filmless facilities have been caught off guard by this increase, which has been stimulated by many factors. The most significant factor is investment in new digital and DICOM-compliant modalities. A huge volume driver is the increase in images per study from multi-slice technology. Storage requirements also are affected by disaster recovery initiatives and state retention mandates. This burgeoning rate of imaging data volume presents many challenges: cost of ownership, data accessibility, storage media obsolescence, database considerations, physical limitations, reliability and redundancy. There are two basic approaches to archiving: single-tier and multi-tier. Each has benefits. With a single-tier approach, all the data is stored on a single medium that can be accessed very quickly. A redundant copy of the data is then stored on another, less expensive medium, usually removable media. In this approach, the on-line storage is increased incrementally as volume grows. In a multi-tier approach, storage levels are set up based on access speed and cost. In other words, all images are stored at the deepest archiving level, which is also the least expensive. Images are stored on or moved back to the intermediate and on-line levels if they will need to be accessed more quickly. It can be difficult to decide which approach is best for your organization.
The options include RAIDs (redundant arrays of independent disks), direct-attached RAID storage (DAS), network storage using RAIDs (NAS and SAN), removable media such as different types of tape, compact disks (CDs and DVDs) and magneto-optical disks (MODs). As you evaluate the various options for storage, it is important to consider both performance and cost. For most imaging enterprises, a single-tier archiving approach is the best solution. With the cost of hard drives declining, NAS is a very feasible solution today. It is highly reliable, offers immediate access to all exams, and easily scales as imaging volume grows. Best of all, media obsolescence challenges need not be a concern. For back-up storage, removable media can be implemented, with a smaller investment needed as it will only be used for a redundant copy of the data. There is no need to keep it online and available. If further system redundancy is desired, multiple servers should be considered. The multi-tier approach still has its merits for smaller enterprises, but with a detailed long-term cost-of-ownership analysis, NAS will probably still come out on top as the solution of choice for many imaging facilities.
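The multi-tier logic described above (every study lives in the deepest, cheapest level; recently accessed studies are promoted to faster storage) can be sketched with a small least-recently-used cache. This is an assumption-laden toy, not any real PACS or archive API:

```python
# Toy multi-tier archive sketch: every study lands in the deep tier; a
# bounded "on-line" tier caches the most recently accessed studies (LRU).
from collections import OrderedDict

class TieredArchive:
    def __init__(self, online_capacity):
        self.deep = {}                      # study_id -> image data
        self.online = OrderedDict()         # LRU cache of hot studies
        self.capacity = online_capacity

    def store(self, study_id, data):
        self.deep[study_id] = data          # deepest tier holds everything

    def fetch(self, study_id):
        if study_id in self.online:         # fast path: on-line tier
            self.online.move_to_end(study_id)
            return self.online[study_id]
        data = self.deep[study_id]          # slow path: recall from deep tier
        self.online[study_id] = data
        if len(self.online) > self.capacity:
            self.online.popitem(last=False) # demote least recently used
        return data

archive = TieredArchive(online_capacity=2)
for sid in ("ct1", "mr1", "us1"):
    archive.store(sid, f"<images for {sid}>")
archive.fetch("ct1"); archive.fetch("mr1"); archive.fetch("us1")
print(list(archive.online))
```

A single-tier design, by contrast, makes `fetch` uniformly fast and relegates the second copy purely to backup.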
An integrated sampling and analysis approach for improved biodiversity monitoring
DeWan, Amielle A.; Zipkin, Elise
2010-01-01
Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species of conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.
Convoys of care: Theorizing intersections of formal and informal care
Kemp, Candace L.; Ball, Mary M.; Perkins, Molly M.
2013-01-01
Although most care to frail elders is provided informally, much of this care is paired with formal care services. Yet, common approaches to conceptualizing the formal–informal intersection often are static, do not consider self-care, and typically do not account for multi-level influences. In response, we introduce the “convoy of care” model as an alternative way to conceptualize the intersection and to theorize connections between care convoy properties and caregiver and recipient outcomes. The model draws on Kahn and Antonucci's (1980) convoy model of social relations, expanding it to include both formal and informal care providers and also incorporates theoretical and conceptual threads from life course, feminist gerontology, social ecology, and symbolic interactionist perspectives. This article synthesizes theoretical and empirical knowledge and demonstrates the convoy of care model in an increasingly popular long-term care setting, assisted living. We conceptualize care convoys as dynamic, evolving, person- and family-specific, and influenced by a host of multi-level factors. Care convoys have implications for older adults’ quality of care and ability to age in place, for job satisfaction and retention among formal caregivers, and for informal caregiver burden. The model moves beyond existing conceptual work to provide a comprehensive, multi-level, multi-factor framework that can be used to inform future research, including research in other care settings, and to spark further theoretical development. PMID:23273553
Contributions of Youth Engagement to the Development of Social Capital through Community Mapping
ERIC Educational Resources Information Center
Nathaniel, Keith C.; Kinsey, Sharon B.
2013-01-01
The Multi-State North Central Extension Research Activity (NCERA), Contributions of 4-H Participation to the Development of Social Capital, identified a strategy to pilot a research method that incorporates an inquiry-based approach to understanding community level impact of youth programs. This article focuses on how youth engagement educators…
ERIC Educational Resources Information Center
Moss, Julianne; O'Mara, Joanne; McCandless, Trevor
2017-01-01
Internationally, Intercultural Understanding (ICU) is increasingly prevalent in the field of education. The recent evidence base includes a growing academic literature and examples of specified education policy and curricula. In regards to leveraging ICU, research suggests a multi-level and longitudinal approach is needed to ensure effective and…
A Multi-Level Model of Moral Thinking Based on Neuroscience and Moral Psychology
ERIC Educational Resources Information Center
Jeong, Changwoo; Han, Hye Min
2011-01-01
Developments in neurobiology are providing new insights into the biological and physical features of human thinking, and brain-activation imaging methods such as functional magnetic resonance imaging have become the most dominant research techniques to approach the biological part of thinking. With the aid of neurobiology, there also have been…
Toward an Instructional Approach to Developing Interactive Second Language Listening
ERIC Educational Resources Information Center
Yeldham, Michael; Gruba, Paul
2014-01-01
This study details the development of six second language learners in an English listening course that focused on developing their bottom-up listening skills. The research employed longitudinal multi-case studies to chart the development of these lower proficiency-level Taiwanese university learners, and their progress in the course was analysed…
Using TIMSS and PISA Results to Inform Educational Policy: A Study of Russia and Its Neighbours
ERIC Educational Resources Information Center
Carnoy, Martin; Khavenson, Tatiana; Ivanova, Alina
2015-01-01
In this paper, we develop a multi-level comparative approach to analyse Trends in International Mathematics and Science Survey (TIMSS) and Programme of International Student Achievement (PISA) mathematics results for a country, Russia, where the two tests provide contradictory information about students' relative performance. Russian students do…
The Oklahoma's Promise Program: A National Model to Promote College Persistence
ERIC Educational Resources Information Center
Mendoza, Pilar; Mendez, Jesse P.
2013-01-01
Using a multi-method approach involving fixed effects and logistic regressions, this study examined the effect of the Oklahoma's Promise Program on student persistence in relation to the Pell and Stafford federal programs and according to socio-economic characteristics and class level. The Oklahoma's Promise is a hybrid state program that pays…
ERIC Educational Resources Information Center
Dawson, Anna P.; Cargo, Margaret; Stewart, Harold; Chong, Alwin; Daniel, Mark
2013-01-01
Aboriginal Australians, including Aboriginal Health Workers (AHWs), smoke at rates double the non-Aboriginal population. This study utilized concept mapping methodology to identify and prioritize culturally relevant strategies to promote smoking cessation in AHWs. Stakeholder participants included AHWs, other health service employees and tobacco…
Bottom-Up Analysis of Single-Case Research Designs
ERIC Educational Resources Information Center
Parker, Richard I.; Vannest, Kimberly J.
2012-01-01
This paper defines and promotes the qualities of a "bottom-up" approach to single-case research (SCR) data analysis. Although "top-down" models, for example, multi-level or hierarchical linear models, are gaining momentum and have much to offer, interventionists should be cautious about analyses that are not easily understood, are not governed by…
NASA Astrophysics Data System (ADS)
Jin, Biao; Rolle, Massimo
2016-04-01
Organic compounds are produced in vast quantities for industrial and agricultural use, as well as for human and animal healthcare [1]. These chemicals and their metabolites are frequently detected at trace levels in freshwater environments where they undergo degradation via different reaction pathways. Compound specific stable isotope analysis (CSIA) is a valuable tool to identify such degradation pathways in different environmental systems. Recent advances in analytical techniques have promoted the fast development and implementation of multi-element CSIA. However, quantitative frameworks to evaluate multi-element stable isotope data and to incorporate mechanistic information on the degradation processes [2,3] are still lacking. In this study we propose a mechanism-based modeling approach to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and, thereby, provides a mechanistic description of isotope fractionation occurring at different molecular positions. We validate the proposed approach with the concentration and multi-element isotope data of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model precisely captures the dual element isotope trends characteristic of different reaction pathways and their range of variation consistent with observed multi-element (C, N) bulk isotope fractionation. The proposed approach can also be used as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. [1] Schwarzenbach, R.P., Egli, T., Hofstetter, T.B., von Gunten, U., Wehrli, B., 2010. Global Water Pollution and Human Health. Annu. Rev. Environ. Resour. doi:10.1146/annurev-environ-100809-125342. 
[2] Jin, B., Haderlein, S.B., Rolle, M., 2013. Integrated carbon and chlorine isotope modeling: Applications to chlorinated aliphatic hydrocarbons dechlorination. Environ. Sci. Technol. 47, 1443-1451. doi:10.1021/es304053h. [3] Jin, B., Rolle, M., 2014. Mechanistic approach to multi-element isotope modeling of organic contaminant degradation. Chemosphere 95, 131-139. doi:10.1016/j.chemosphere.2013.08.050.
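Dual-element isotope trends like the (C, N) trends discussed above are commonly summarized by the slope of one delta value plotted against the other under Rayleigh fractionation. A hedged numerical sketch with illustrative enrichment factors (not values from the paper):

```python
import numpy as np

# Rayleigh fractionation sketch for a dual-element (C, N) isotope trend.
# Delta values evolve with the remaining fraction f approximately as:
#   delta = delta0 + epsilon * ln(f)   (per mil, for small epsilon)
eps_C, eps_N = -5.0, -10.0      # illustrative enrichment factors (per mil)
d0_C, d0_N = -27.0, 0.0         # illustrative initial signatures

f = np.linspace(1.0, 0.1, 50)   # fraction of compound remaining
dC = d0_C + eps_C * np.log(f)
dN = d0_N + eps_N * np.log(f)

# The dual-element plot (dN vs dC) is linear with slope eps_N/eps_C,
# the pathway-diagnostic quantity used in multi-element CSIA.
Lambda = np.polyfit(dC, dN, 1)[0]
print(Lambda)
```

Different transformation pathways produce different slopes, which is why dual-element plots discriminate mechanisms that single-element data cannot.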
NASA Astrophysics Data System (ADS)
Gill, Stuart P. D.; Knebe, Alexander; Gibson, Brad K.; Flynn, Chris; Ibata, Rodrigo A.; Lewis, Geraint F.
2003-04-01
An adaptive multi-grid approach to simulating the formation of structure from collisionless dark matter is described. MLAPM (Multi-Level Adaptive Particle Mesh) is one of the most efficient serial codes available on the cosmological "market" today. As part of Swinburne University's role in the development of the Square Kilometer Array, we are implementing hydrodynamics, feedback, and radiative transfer within the MLAPM adaptive mesh, in order to simulate baryonic processes relevant to the interstellar and intergalactic media at high redshift. We will outline our progress to date in applying the existing MLAPM to a study of the decay of satellite galaxies within massive host potentials.
Numerical models for fluid-grains interactions: opportunities and limitations
NASA Astrophysics Data System (ADS)
Esteghamatian, Amir; Rahmani, Mona; Wachs, Anthony
2017-06-01
In the framework of a multi-scale approach, we develop numerical models for suspension flows. At the micro-scale level, we perform particle-resolved numerical simulations using a Distributed Lagrange Multiplier/Fictitious Domain approach. At the meso-scale level, we use a two-way Euler/Lagrange approach with a Gaussian filtering kernel to model fluid-solid momentum transfer. At both levels, particles are individually tracked in a Lagrangian way and all inter-particle collisions are computed by a Discrete Element/Soft-sphere method. These numerical models have been extended to handle particles of arbitrary shape (non-spherical, angular and even non-convex) as well as to treat heat and mass transfer. All simulation tools are fully MPI-parallel with standard domain decomposition and run on supercomputers with satisfactory scalability on up to a few thousand cores. The main asset of multi-scale analysis is the ability to extend our comprehension of the dynamics of suspension flows based on the knowledge acquired from the high-fidelity micro-scale simulations and to use that knowledge to improve the meso-scale model. We illustrate how we can benefit from this strategy for a fluidized bed, where we introduce a stochastic drag force model derived from micro-scale simulations to recover the proper level of particle fluctuations. Conversely, we discuss the limitations of such modelling tools, such as their limited ability to capture lubrication forces and boundary layers in highly inertial flows. We suggest ways to overcome these limitations in order to enhance further the capabilities of the numerical models.
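The Discrete Element/Soft-sphere collision step named above computes contact forces from particle overlap with a spring-dashpot model. A minimal 1-D sketch with illustrative stiffness and damping values (real simulations are 3-D and add tangential forces):

```python
import numpy as np

# Soft-sphere (spring-dashpot) normal contact force sketch.
k_n = 1e4      # normal spring stiffness (N/m), illustrative
gamma_n = 5.0  # normal damping coefficient (N s/m), illustrative

def normal_force(x1, x2, v1, v2, r1, r2):
    """Contact force on particle 1 from particle 2 (1-D for brevity)."""
    gap = (r1 + r2) - abs(x2 - x1)       # overlap when positive
    if gap <= 0.0:
        return 0.0                       # no contact, no force
    n = np.sign(x1 - x2)                 # unit normal pointing at particle 1
    v_rel = (v1 - v2) * n                # relative velocity along the normal
    # Elastic repulsion from overlap plus viscous dissipation.
    return (k_n * gap - gamma_n * v_rel) * n

# Two approaching particles in contact: repulsion plus dissipation.
f = normal_force(x1=0.0, x2=0.018, v1=0.1, v2=-0.1, r1=0.01, r2=0.01)
print(f)
```

The damping term is what makes collisions inelastic, which matters for recovering realistic particle fluctuation levels in fluidized beds.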
Conjunctive management of multi-reservoir network system and groundwater system
NASA Astrophysics Data System (ADS)
Mani, A.; Tsai, F. T. C.
2015-12-01
This study develops a successive mixed-integer linear fractional programming (successive MILFP) method to conjunctively manage water resources provided by a multi-reservoir network system and a groundwater system. The conjunctive management objectives are to maximize groundwater withdrawals and maximize reservoir storages while satisfying water demands and raising groundwater level to a target level. The decision variables in the management problem are reservoir releases and spills, network flows and groundwater pumping rates. Using the fractional programming approach, the objective function is defined as a ratio of total groundwater withdrawals to total reservoir storage deficits from the maximum storages. Maximizing this ratio function tends to maximize groundwater use and minimize surface water use. This study introduces a conditional constraint on groundwater head in order to protect aquifers from overpumping: if the current groundwater level is less than a target level, groundwater head at the next time period has to be raised; otherwise, it is allowed to decrease up to a certain extent. This conditional constraint is formulated into a set of mixed binary nonlinear constraints and results in a mixed-integer nonlinear fractional programming (MINLFP) problem. To solve the MINLFP problem, we first use the response matrix approach to linearize groundwater head with respect to pumping rate and reduce the problem to an MILFP problem. Using the Charnes-Cooper transformation, the MILFP is transformed into an equivalent mixed-integer linear programming (MILP) problem. The solution of the MILP is successively updated by updating the response matrix in every iteration. The study uses IBM CPLEX to solve the MILP problem. The methodology is applied to water resources management in northern Louisiana. 
This conjunctive management approach aims to recover the declining groundwater level of the stressed Sparta aquifer by using surface water from a network of four reservoirs as an alternative source of supply.
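The fractional objective at the core of the method is a ratio of linear functions, which the Charnes-Cooper substitution (y = t·x with t = 1/denominator) turns into a linear objective. The toy sketch below, far simpler than the paper's MILFP and with made-up coefficients, just verifies a tiny fractional optimum by grid search, standing in for the MILP machinery:

```python
import numpy as np

# Toy linear-fractional program (illustrative only):
#   maximize (x1 + 2*x2) / (1 + x1 + x2)
#   subject to x1 + x2 <= 4, x1, x2 >= 0.
c = np.array([1.0, 2.0])               # numerator (e.g. withdrawals)
d0, d = 1.0, np.array([1.0, 1.0])      # denominator (e.g. storage deficit)

def ratio(x):
    return (c @ x) / (d0 + d @ x)

# Charnes-Cooper substitutes y = t*x with t = 1/(d0 + d@x), turning the
# ratio into the linear objective c@y under d0*t + d@y = 1 plus the
# scaled original constraints. Here we simply verify the fractional
# optimum by brute-force grid search over the feasible set.
grid = [np.array([a, b])
        for a in np.linspace(0, 4, 201) for b in np.linspace(0, 4, 201)
        if a + b <= 4.0 + 1e-9]
best = max(grid, key=ratio)
print(best, ratio(best))
```

In the full method this linearization is what lets an off-the-shelf MILP solver such as CPLEX handle the ratio objective directly.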
Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments
Kadima, Hubert; Granado, Bertrand
2013-01-01
We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and to present a hybrid PSO algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequency; operating at multiple voltages involves a trade-off between schedule quality and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361
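The DVFS trade-off described above (lower voltage and frequency cut dynamic power at the cost of a longer runtime) follows from the common dynamic-power model P ≈ C·V²·f. A sketch with illustrative numbers, not figures from the paper:

```python
# DVFS energy sketch: dynamic power scales roughly as C * V^2 * f, so
# running a task slower at lower voltage can cut energy despite the
# longer runtime. All parameter values below are illustrative.
def task_energy(cycles, voltage, freq_hz, capacitance=1e-9):
    power = capacitance * voltage**2 * freq_hz   # dynamic power (W)
    runtime = cycles / freq_hz                   # seconds
    return power * runtime, runtime

cycles = 2e9
e_high, t_high = task_energy(cycles, voltage=1.2, freq_hz=2.0e9)
e_low,  t_low  = task_energy(cycles, voltage=0.9, freq_hz=1.0e9)
print(e_high, e_low)   # energy drops at the lower operating point
print(t_high, t_low)   # but the task takes longer: the QoS/energy trade-off
```

Since energy per task reduces to C·V²·cycles under this model, the scheduler's real lever is the voltage level chosen for each task, subject to deadline constraints.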
Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.
Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand
2013-01-01
We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and to present a hybrid PSO algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequency; operating at multiple voltages involves a trade-off between schedule quality and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
NASA Astrophysics Data System (ADS)
Rahman, M. S.; Pota, H. R.; Mahmud, M. A.; Hossain, M. J.
2016-05-01
This paper presents the impact of large penetration of wind power on transient stability through a dynamic evaluation of the critical clearing times (CCTs) using an intelligent agent-based approach. A decentralised multi-agent-based framework is developed, in which agents represent a number of physical device models to form a complex infrastructure for computation and communication. They enable the dynamic flow of information and energy for the interaction between the physical processes and their activities. These agents dynamically adapt to online measurements and use the CCT information for relay coordination to improve the transient stability of power systems. Simulations are carried out on a smart microgrid system for faults at increasing wind power penetration levels, and the improvement in transient stability using the proposed agent-based framework is demonstrated.
Defect study in ZnO related structures—A multi-spectroscopic approach
NASA Astrophysics Data System (ADS)
Ling, C. C.; Cheung, C. K.; Gu, Q. L.; Dai, X. M.; Xu, S. J.; Zhu, C. Y.; Luo, J. M.; Zhu, C. Y.; Tam, K. H.; Djurišić, A. B.; Beling, C. D.; Fung, S.; Lu, L. W.; Brauer, G.; Anwand, W.; Skorupa, W.; Ong, H. C.
2008-10-01
ZnO has attracted a great deal of attention in recent years because of its potential applications for fabricating optoelectronic devices. Using a multi-spectroscopic approach including positron annihilation spectroscopy (PAS), deep level transient spectroscopy (DLTS), photoluminescence (PL) and X-ray photoelectron spectroscopy (XPS), we have studied two phenomena observed in ZnO-related structures: the H2O2-pre-treatment-induced ohmic-to-rectifying conversion of the Au/n-ZnO contact, and p-type doping by nitrogen ion implantation. The aim of the studies was to offer a comprehensive view of how defects influence the electrical and optical properties of the structures. It was also shown that PAS measurement using a monoenergetic positron beam could offer valuable information on vacancy-type defects in the vertical ZnO nanorod array structure.
Scalable software architecture for on-line multi-camera video processing
NASA Astrophysics Data System (ADS)
Camplani, Massimo; Salgado, Luis
2011-03-01
In this paper we present a scalable software architecture for on-line multi-camera video processing that guarantees a good trade-off between computational power, scalability and flexibility. The software system is modular and its main blocks are the Processing Units (PUs) and the Central Unit. The Central Unit works as a supervisor of the running PUs, and each PU manages the acquisition phase and the processing phase. Furthermore, an approach to easily parallelize the desired processing application is presented. In this paper, as a case study, we apply the proposed software architecture to a multi-camera system in order to efficiently manage multiple 2D object detection modules in a real-time scenario. System performance has been evaluated under different load conditions such as number of cameras and image sizes. The results show that the software architecture scales well with the number of cameras and works easily with different image formats while respecting the real-time constraints. Moreover, the parallelization approach can be used to speed up the processing tasks with a low level of overhead.
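The supervisor/worker split described above (a Central Unit dispatching per-camera work to Processing Units) can be sketched with a thread pool. All names and the trivial stand-in "detector" below are illustrative, not the paper's actual components:

```python
# Sketch of the PU / Central Unit split using a thread pool: the Central
# Unit dispatches frames from several cameras to Processing Units, which
# run a stand-in detection step. Names are hypothetical.
from concurrent.futures import ThreadPoolExecutor

def processing_unit(job):
    camera_id, frame = job
    detections = [pix for pix in frame if pix > 200]   # stand-in detector
    return camera_id, len(detections)

# Fake per-camera "frames" of differing sizes (different image formats).
frames = {f"cam{i}": [10, 250, 230, 90, 255][: 3 + i] for i in range(4)}

# Central Unit: supervise the PUs and collect per-camera results.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(processing_unit, frames.items()))
print(results)
```

Because each camera's processing is independent, adding cameras mostly adds parallel work rather than coordination overhead, which is the property behind the reported scalability.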
Systems medicine and integrated care to combat chronic noncommunicable diseases
2011-01-01
We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment, socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems. PMID:21745417
Jordan, Nika; Zakrajšek, Jure; Bohanec, Simona; Roškar, Robert; Grabnar, Iztok
2018-05-01
The aim of the present research is to show that the methodology of Design of Experiments can be applied to stability data evaluation, as they can be seen as multi-factor and multi-level experimental designs. Linear regression analysis is usually an approach for analyzing stability data, but multivariate statistical methods could also be used to assess drug stability during the development phase. Data from a stability study for a pharmaceutical product with hydrochlorothiazide (HCTZ) as an unstable drug substance was used as a case example in this paper. The design space of the stability study was modeled using Umetrics MODDE 10.1 software. We showed that a Partial Least Squares model could be used for a multi-dimensional presentation of all data generated in a stability study and for determination of the relationship among factors that influence drug stability. It might also be used for stability predictions and potentially for the optimization of the extent of stability testing needed to determine shelf life and storage conditions, which would be time and cost-effective for the pharmaceutical industry.
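Treating a stability study as a multi-factor experimental design can be illustrated with an ordinary least-squares fit of assay values against time, temperature and their interaction. This is a hedged sketch with synthetic numbers, not the PLS/MODDE analysis of the paper; the coefficients and data are invented for illustration.

```python
import numpy as np

# Hypothetical stability data: assay of HCTZ (%) after storage at
# combinations of time (months) and temperature (deg C); values are synthetic.
time  = np.array([0, 3, 6, 0, 3, 6, 0, 3, 6], dtype=float)
temp  = np.array([25, 25, 25, 30, 30, 30, 40, 40, 40], dtype=float)
assay = 100.0 - 0.5 * time - 0.05 * time * (temp - 25)

# Design matrix: intercept, main effects, and the time x temperature interaction
X = np.column_stack([np.ones_like(time), time, temp, time * temp])
coef, *_ = np.linalg.lstsq(X, assay, rcond=None)

# The fitted interaction term quantifies how temperature accelerates degradation
print(coef)
```

A multivariate method such as PLS generalizes this idea to many correlated responses at once, which is what makes it attractive for full stability datasets.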
A high-throughput method for GMO multi-detection using a microfluidic dynamic array.
Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J
2014-02-01
The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organism (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly as the number of GMOs potentially present in an individual sample grows. The present work presents the results of an innovative approach to the analysis of genetically modified crops by DNA-based methods: the use of a microfluidic dynamic array as a high-throughput multi-detection system. In order to evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed in the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested gave the expected results. The power of low-level detection was assessed, and elements present at concentrations as low as 0.06% were successfully detected. The approach proposed in this work presents the Fluidigm system as a suitable and promising platform for GMO multi-detection.
Classen, Sherrilene; Lopez, Ellen DS; Winter, Sandra; Awadzi, Kezia D; Ferree, Nita; Garvan, Cynthia W
2007-01-01
The topic of motor vehicle crashes among the elderly is dynamic and multi-faceted, requiring a comprehensive and synergistic approach to intervention planning. This approach must be based on the values of a given population as well as health statistics, and asserted through community, organizational and policy strategies. An integrated summary of the predictors (quantitative research) and views (qualitative research) of older drivers and their stakeholders does not currently exist. This study provided an explicit socio-ecological view explaining the interrelation of possible causative factors, an integrated summary of these causative factors, and empirical guidelines for developing public health interventions to promote older driver safety. Using a mixed methods approach, we were able to compare and integrate main findings from a national crash dataset with the perspectives of stakeholders. We identified: 11 multi-causal factors for safe elderly driving; the importance of environmental factors, previously underrated in the literature, interacting with behavioral and health factors; and the interrelatedness among many socio-ecological factors. For the first time, to our knowledge, we conceptualized the fundamental elements of a multi-causal health promotion plan, with measurable intermediate and long-term outcomes. After completing the detailed plan we will test the effectiveness of this intervention on multiple levels. PMID:18225470
A computational intelligent approach to multi-factor analysis of violent crime information system
NASA Astrophysics Data System (ADS)
Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing
2017-02-01
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between bi-factors has also been extensively studied including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
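The reduct idea at the core of the rough-set component above (a minimal attribute subset that preserves the decision-making power of all attributes) can be sketched on a toy decision table. This illustrates plain rough-set dependency only, not the paper's fuzzy/particle-swarm hybrid; the table values are hypothetical stand-ins for crime-factor indicators.

```python
from itertools import combinations

# Toy decision table: each row = (attribute values), decision.
# The three attributes are hypothetical binary crime-factor indicators.
rows = [
    ((0, 0, 1), 0),
    ((0, 1, 1), 1),
    ((1, 0, 0), 1),
    ((1, 1, 0), 1),
    ((0, 0, 0), 0),
]

def dependency(attr_idx):
    """Fraction of objects whose decision is fully determined by the attributes."""
    blocks = {}
    for vals, dec in rows:
        key = tuple(vals[i] for i in attr_idx)
        blocks.setdefault(key, set()).add(dec)
    consistent = sum(1 for vals, dec in rows
                     if len(blocks[tuple(vals[i] for i in attr_idx)]) == 1)
    return consistent / len(rows)

full = dependency((0, 1, 2))
# A reduct: a minimal subset with the same dependency degree as all attributes
reducts = [c for r in range(1, 4) for c in combinations(range(3), r)
           if dependency(c) == full
           and all(dependency(s) < full for s in combinations(c, len(c) - 1))]
print(reducts)
```

Multiple reducts, when they exist, each yield their own rule set, which is the "multi-knowledge" the article extracts.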
The risk of water scarcity at different levels of global warming
NASA Astrophysics Data System (ADS)
Schewe, Jacob; Sharpe, Simon
2015-04-01
Water scarcity is a threat to human well-being and economic development in many countries today. Future climate change is expected to exacerbate the global water crisis by reducing renewable freshwater resources in different world regions, many of which are already dry. Studies of future water scarcity often focus on most-likely, or highest-confidence, scenarios. However, multi-model projections of water resources reveal large uncertainty ranges, which are due to different types of processes (climate, hydrology, human) and are therefore not easy to reduce. Thus, central estimates or multi-model mean results may be insufficient to inform policy and management. Here we present an alternative, risk-based approach. We use an ensemble of multiple global climate and hydrological models to quantify the likelihood of crossing a given water scarcity threshold under different levels of global warming. This approach makes it possible to assess the risk associated with any particular, pre-defined threshold (or magnitude of change that must be avoided), regardless of whether it lies in the center or in the tails of the uncertainty distribution. We show applications of this method on the country and river basin scale, illustrate the effects of societal processes on the resulting risk estimates, and discuss the further potential of this approach for research and stakeholder dialogue.
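The risk calculation, in essence counting how many ensemble members cross a pre-defined scarcity threshold at each warming level, can be sketched as follows. The ensemble values here are synthetic random draws, not model output; means, spread and the threshold are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: projected change (%) in renewable water resources for
# one basin; one row per climate-hydrology model pair, one column per warming
# level (1, 2, 3 deg C). The means and spread are illustrative only.
ensemble = rng.normal(loc=[-5.0, -12.0, -20.0], scale=8.0, size=(20, 3))

threshold = -15.0   # pre-defined scarcity threshold (% reduction to be avoided)

# Risk = fraction of ensemble members crossing the threshold at each level
risk = (ensemble < threshold).mean(axis=0)
for level, r in zip((1, 2, 3), risk):
    print(f"{level} degC warming: {r:.0%} of members cross the threshold")
```

Because the risk is defined for any threshold, it remains meaningful even when the threshold sits in the tail of the multi-model distribution, where the ensemble mean says little.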
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Hooshyar, Milad
2014-11-01
Reservoir systems with multiple operators can benefit from coordination of operation policies. To maximize the total benefit of these systems, the literature has normally used the social planner's approach. Based on this approach, operation decisions are optimized using a multi-objective optimization model with a compound system objective. While the utility of the system can be increased this way, fair allocation of benefits among the operators remains challenging for the social planner, who has to assign controversial weights to the system's beneficiaries and their objectives. Cooperative game theory provides an alternative framework for fair and efficient allocation of the incremental benefits of cooperation. To determine the fair and efficient utility shares of the beneficiaries, cooperative game theory solution methods consider the gains of each party in the status quo (non-cooperation) as well as what can be gained through the grand coalition (social planner's solution or full cooperation) and partial coalitions. Nevertheless, estimation of the benefits of different coalitions can be challenging in complex multi-beneficiary systems. Reinforcement learning can be used to address this challenge and determine the gains of the beneficiaries for different levels of cooperation, i.e., non-cooperation, partial cooperation, and full cooperation, providing the essential input for allocation based on cooperative game theory. This paper develops a game theory-reinforcement learning (GT-RL) method for determining the optimal operation policies in multi-operator multi-reservoir systems with respect to fairness and efficiency criteria. As a first step to underline the utility of the GT-RL method in solving complex multi-agent multi-reservoir problems without a need for developing compound objectives and weight assignment, the proposed method is applied to a hypothetical three-agent three-reservoir system.
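Given coalition values of the kind the GT-RL method would estimate via reinforcement learning, a standard cooperative game theory solution such as the Shapley value allocates the grand-coalition benefit fairly. A minimal sketch with hypothetical benefit numbers for three operators (the Shapley value is one of several solution methods the cooperative framework could use):

```python
from itertools import combinations
from math import factorial

# Hypothetical coalition values (annual benefit) for three reservoir operators,
# as would be estimated separately for each level of cooperation.
v = {
    (): 0,
    ('A',): 20, ('B',): 30, ('C',): 25,
    ('A', 'B'): 60, ('A', 'C'): 55, ('B', 'C'): 65,
    ('A', 'B', 'C'): 100,
}
players = ('A', 'B', 'C')
n = len(players)

def shapley(player):
    """Average marginal contribution of `player` over all join orders."""
    total = 0.0
    others = [p for p in players if p != player]
    for r in range(n):
        for coal in combinations(others, r):
            s = tuple(sorted(coal))
            with_p = tuple(sorted(coal + (player,)))
            weight = factorial(r) * factorial(n - r - 1) / factorial(n)
            total += weight * (v[with_p] - v[s])
    return total

shares = {p: shapley(p) for p in players}
print(shares)   # the shares sum to the value of the grand coalition
```

The allocation rewards each operator for what it adds to every partial coalition, which is exactly why the partial-coalition benefits must be estimated and not just the status quo and the grand coalition.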
Granovsky, Alexander A
2015-12-21
We present a new, very efficient semi-numerical approach for the computation of state-specific nuclear gradients of a generic state-averaged multi-configuration self consistent field wavefunction. Our approach eliminates the costly coupled-perturbed multi-configuration Hartree-Fock step as well as the associated integral transformation stage. The details of the implementation within the Firefly quantum chemistry package are discussed and several sample applications are given. The new approach is routinely applicable to geometry optimization of molecular systems with 1000+ basis functions using a standalone multi-core workstation.
NASA Astrophysics Data System (ADS)
Hassan, Rania A.
In the design of complex large-scale spacecraft systems that involve a large number of components and subsystems, many specialized state-of-the-art design tools are employed to optimize the performance of various subsystems. However, there is no structured system-level concept-architecting process. Currently, spacecraft design is heavily based on the heritage of the industry. Old spacecraft designs are modified to adapt to new mission requirements, and feasible solutions---rather than optimal ones---are often all that is achieved. During the conceptual phase of the design, the choices available to designers are predominantly discrete variables describing major subsystems' technology options and redundancy levels. The complexity of spacecraft configurations makes the number of the system design variables that need to be traded off in an optimization process prohibitive when manual techniques are used. Such a discrete problem is well suited for solution with a Genetic Algorithm, which is a global search technique that performs optimization-like tasks. This research presents a systems engineering framework that places design requirements at the core of the design activities and transforms the design paradigm for spacecraft systems to a top-down approach rather than the current bottom-up approach. To facilitate decision-making in the early phases of the design process, the population-based search nature of the Genetic Algorithm is exploited to provide computationally inexpensive---compared to the state-of-the-practice---tools for both multi-objective design optimization and design optimization under uncertainty. In terms of computational cost, those tools are nearly on the same order of magnitude as that of standard single-objective deterministic Genetic Algorithm. 
The use of a multi-objective design approach provides system designers with a clear tradeoff optimization surface that allows them to understand the effect of their decisions on all the design objectives under consideration simultaneously. Incorporating uncertainties avoids large safety margins and unnecessary high redundancy levels. The focus on low computational cost for the optimization tools stems from the objective that improving the design of complex systems should not be achieved at the expense of a costly design methodology.
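A genetic algorithm over discrete subsystem options, as described above, can be sketched in a few lines. The fitness function here is a made-up separable score, not a spacecraft performance model; the population size, generation count and mutation rate are arbitrary illustration values.

```python
import random

random.seed(42)

# Hypothetical discrete design space: one technology option per subsystem.
N_SUBSYSTEMS = 6
N_OPTIONS = 4
PERF = [[random.uniform(0, 10) for _ in range(N_OPTIONS)]
        for _ in range(N_SUBSYSTEMS)]   # invented per-option scores

def fitness(design):
    return sum(PERF[s][opt] for s, opt in enumerate(design))

def evolve(pop_size=30, generations=40, p_mut=0.1):
    pop = [[random.randrange(N_OPTIONS) for _ in range(N_SUBSYSTEMS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]                # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_SUBSYSTEMS)  # one-point crossover
            child = a[:cut] + b[cut:]
            for s in range(N_SUBSYSTEMS):            # per-gene mutation
                if random.random() < p_mut:
                    child[s] = random.randrange(N_OPTIONS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
optimal = [max(range(N_OPTIONS), key=lambda o: PERF[s][o])
           for s in range(N_SUBSYSTEMS)]
```

Because the chromosome is just a list of option indices, redundancy levels or any other discrete trade can be appended as extra genes without restructuring the search.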
NASA Astrophysics Data System (ADS)
Zhu, Aichun; Wang, Tian; Snoussi, Hichem
2018-03-01
This paper addresses the problems of graphical-model-based human pose estimation in still images, including the diversity of appearances and confounding background clutter. We present a new architecture for estimating human pose using a Convolutional Neural Network (CNN). Firstly, a Relative Mixture Deformable Model (RMDM) is defined by each pair of connected parts to compute the relative spatial information in the graphical model. Secondly, a Local Multi-Resolution Convolutional Neural Network (LMR-CNN) is proposed to train and learn the multi-scale representation of each body part by combining different levels of part context. Thirdly, an LMR-CNN based hierarchical model is defined to explore the context information of limb parts. Finally, the experimental results demonstrate the effectiveness of the proposed deep learning approach for human pose estimation.
Novel Wireless-Communicating Textiles Made from Multi-Material and Minimally-Invasive Fibers
Gorgutsa, Stepan; Bélanger-Garnier, Victor; Ung, Bora; Viens, Jeff; Gosselin, Benoit; LaRochelle, Sophie; Messaddeq, Younes
2014-01-01
The ability to integrate multiple materials into miniaturized fiber structures enables the realization of novel biomedical textile devices with higher-level functionalities and minimally-invasive attributes. In this work, we present novel textile fabrics integrating unobtrusive multi-material fibers that communicate through 2.4 GHz wireless networks with excellent signal quality. The conductor elements of the textiles are embedded within the fibers themselves, providing electrical and chemical shielding against the environment, while preserving the mechanical and cosmetic properties of the garments. These multi-material fibers combine insulating and conducting materials into a well-defined geometry, and represent a cost-effective and minimally-invasive approach to sensor fabrics and bio-sensing textiles connected in real time to mobile communications infrastructures, suitable for a variety of health and life science applications. PMID:25325335
Supermodeling With A Global Atmospheric Model
NASA Astrophysics Data System (ADS)
Wiegerinck, Wim; Burgers, Willem; Selten, Frank
2013-04-01
In weather and climate prediction studies, the multi-model ensemble mean prediction often has the best prediction skill scores. One possible explanation is that the major part of the model error is random and is averaged out in the ensemble mean. In the standard multi-model ensemble approach, the models are integrated in time independently and the predicted states are combined a posteriori. Recently an alternative ensemble prediction approach has been proposed in which the models exchange information during the simulation and synchronize on a common solution that is closer to the truth than any of the individual model solutions in the standard multi-model ensemble approach, or a weighted average of these. This approach is called the supermodeling approach (SUMO). The potential of the SUMO approach has been demonstrated in the context of simple, low-order, chaotic dynamical systems. The information exchange takes the form of linear nudging terms in the dynamical equations that nudge the solution of each model to the solutions of all other models in the ensemble. With a suitable choice of the connection strengths, the models synchronize on a common solution that is indeed closer to the true system than any of the individual model solutions without nudging. This approach is called connected SUMO. An alternative approach is to integrate a weighted-average model, weighted SUMO. At each time step all models in the ensemble calculate their tendencies, these tendencies are averaged with weights, and the state is integrated one time step into the future with this weighted-average tendency. It was shown that when the connected SUMO synchronizes perfectly, it follows the weighted-average trajectory and both approaches yield the same solution.
In this study we pioneer both approaches in the context of a global, quasi-geostrophic, three-level atmosphere model that is capable of simulating quite realistically the extra-tropical circulation in the Northern Hemisphere winter.
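The weighted SUMO step, averaging the member models' tendencies before integrating, can be illustrated with two "imperfect" Lorenz-63 models whose parameters are perturbed. The weights below are fixed by hand purely for illustration; in SUMO they would be trained, and the abstract's application uses a quasi-geostrophic atmosphere model, not Lorenz-63.

```python
import numpy as np

def lorenz_tendency(state, sigma, rho, beta):
    """Right-hand side of the Lorenz-63 equations (stand-in for a model tendency)."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# An "ensemble" of two imperfect models: same equations, perturbed parameters.
params = [(10.0, 28.0, 8 / 3), (9.0, 30.0, 8 / 3)]
weights = np.array([0.6, 0.4])   # hand-picked here; learned in SUMO

state = np.array([1.0, 1.0, 1.0])
dt = 0.01
for _ in range(1000):            # forward-Euler integration of the weighted SUMO
    tendencies = np.array([lorenz_tendency(state, *p) for p in params])
    state = state + dt * weights @ tendencies
print(state)
```

The key point is that a single trajectory is integrated with the blended tendency, in contrast to the standard ensemble, where each model runs independently and states are averaged afterwards.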
Laser-Measurement-Based Volumetric Accuracy Improvement of Multi-axis Systems
NASA Astrophysics Data System (ADS)
Vladimir, Sokolov; Konstantin, Basalaev
The paper describes a newly developed approach to geometric error compensation of CNC-controlled multi-axis systems based on an optimal error-correction strategy. Multi-axis CNC-controlled systems (machine tools and CMMs) are the basis of the modern engineering industry. Similar design principles of both technological and measurement equipment allow the use of similar approaches to precision management. Approaches based on geometric error compensation are widely used at present. The paper describes a system for compensation of geometric errors of multi-axis equipment based on the new approach. The hardware basis of the developed system is a multi-function laser interferometer. The principles of the system's implementation, results of measurements and simulation of the system's functioning are described. The effectiveness of applying the described principles to multi-axis equipment of different sizes and purposes, for different machining directions and zones within the workspace, is presented. The concept of an optimal correction strategy is introduced and dynamic accuracy control is proposed.
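The core of such laser-based compensation, turning a measured error map into corrected axis commands, can be sketched as a lookup with interpolation. The positions and error values below are invented for illustration and cover a single linear axis only, whereas the paper's system addresses full volumetric errors.

```python
import numpy as np

# Hypothetical laser-interferometer calibration of one linear axis:
# commanded positions (mm) and measured positioning errors (um).
cmd_pos   = np.array([0.0, 100.0, 200.0, 300.0, 400.0, 500.0])
pos_error = np.array([0.0, 2.1, 3.8, 4.9, 5.5, 5.8])   # um

def compensated_target(nominal_mm):
    """Correct a commanded position by interpolating the measured error map."""
    err_um = np.interp(nominal_mm, cmd_pos, pos_error)
    return nominal_mm - err_um * 1e-3   # subtract predicted error (um -> mm)

target = compensated_target(250.0)
print(target)
```

A volumetric scheme extends the same idea to a grid of error vectors over the workspace, with the correction strategy deciding which error components to compensate for a given machining direction and zone.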
High-resolution method for evolving complex interface networks
NASA Astrophysics Data System (ADS)
Pan, Shucheng; Hu, Xiangyu Y.; Adams, Nikolaus A.
2018-04-01
In this paper we describe a high-resolution transport formulation of the regional level-set approach for an improved prediction of the evolution of complex interface networks. The novelty of this method is twofold: (i) construction of local level sets and reconstruction of a global level set, (ii) local transport of the interface network by employing high-order spatial discretization schemes for improved representation of complex topologies. Various numerical test cases of multi-region flow problems, including triple-point advection, single vortex flow, mean curvature flow, normal driven flow, dry foam dynamics and shock-bubble interaction show that the method is accurate and suitable for a wide range of complex interface-network evolutions. Its overall computational cost is comparable to the Semi-Lagrangian regional level-set method while the prediction accuracy is significantly improved. The approach thus offers a viable alternative to previous interface-network level-set methods.
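The two ingredients named above, local level sets per region and reconstruction of a global level set, can be illustrated on a 1-D grid. This is only a schematic sketch of one common reconstruction (half the gap between the two largest local level-set values), not the paper's high-order scheme; the label array is arbitrary.

```python
import numpy as np

labels = np.array([0, 0, 0, 1, 1, 2, 2, 2])   # region id per cell of a 1-D grid
n = len(labels)

def region_interfaces(region):
    """Midpoints of cell faces on the boundary of the given region."""
    return [i + 0.5 for i in range(n - 1)
            if (labels[i] == region) != (labels[i + 1] == region)]

def local_level_set(region):
    """Local signed distance: positive inside the region, negative outside."""
    faces = region_interfaces(region)
    phi = np.empty(n)
    for i in range(n):
        d = min(abs(i - f) for f in faces)   # distance to region boundary
        phi[i] = d if labels[i] == region else -d
    return phi

# Global reconstruction: at each cell, half the gap between the two largest
# local level-set values recovers the distance to the nearest interface.
locs = np.array([local_level_set(r) for r in np.unique(labels)])
top2 = np.sort(locs, axis=0)[-2:]
phi_global = (top2[1] - top2[0]) / 2
print(phi_global)
```

Each region's local level set can be transported independently, and a consistent global field is recovered afterwards, which is what makes the regional formulation suit triple points and foam-like topologies.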
Reclaiming Gender and Power in Sexual Violence Prevention in Adolescence.
Miller, Elizabeth
2018-03-01
The Mentors in Violence Prevention (MVP) model seeks to address the root causes of gender violence using a bystander approach and leadership training to challenge structures of patriarchy. Emerging research on adolescent relationship abuse and sexual violence points to key modifiable targets-transforming gender norms, addressing homophobia, integrating with comprehensive sexuality education, and acknowledging the needs of youth already exposed to violence. A social justice-based bystander approach such as the MVP model should be part of a multi-level approach to sexual violence prevention that addresses gender and power, encourages healthy sexuality conversations, and provides safety and support for survivors.
Uncovering Hidden Layers of Cell Cycle Regulation through Integrative Multi-omic Analysis
Aviner, Ranen; Shenoy, Anjana; Elroy-Stein, Orna; Geiger, Tamar
2015-01-01
Studying the complex relationship between transcription, translation and protein degradation is essential to our understanding of biological processes in health and disease. The limited correlations observed between mRNA and protein abundance suggest pervasive regulation of post-transcriptional steps and support the importance of profiling mRNA levels in parallel to protein synthesis and degradation rates. In this work, we applied an integrative multi-omic approach to study gene expression along the mammalian cell cycle through side-by-side analysis of mRNA, translation and protein levels. Our analysis sheds new light on the significant contribution of both protein synthesis and degradation to the variance in protein expression. Furthermore, we find that translation regulation plays an important role at S-phase, while progression through mitosis is predominantly controlled by changes in either mRNA levels or protein stability. Specific molecular functions are found to be co-regulated and share similar patterns of mRNA, translation and protein expression along the cell cycle. Notably, these include genes and entire pathways not previously implicated in cell cycle progression, demonstrating the potential of this approach to identify novel regulatory mechanisms beyond those revealed by traditional expression profiling. Through this three-level analysis, we characterize different mechanisms of gene expression, discover new cycling gene products and highlight the importance and utility of combining datasets generated using different techniques that monitor distinct steps of gene expression. PMID:26439921
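The limited mRNA-protein correlation that motivates the three-level analysis can be illustrated by correlating synthetic per-gene profiles along a gene-expression cascade. All data below are simulated and the mixing coefficients are arbitrary; this is not the paper's dataset or statistics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic multi-omic profiles: 200 genes x 6 cell-cycle time points.
mrna = rng.normal(size=(200, 6))
translation = 0.7 * mrna + 0.3 * rng.normal(size=(200, 6))    # partly mRNA-driven
protein = 0.4 * translation + 0.6 * rng.normal(size=(200, 6)) # plus degradation noise

def per_gene_corr(a, b):
    """Pearson correlation of two profiles, computed gene by gene."""
    a_c = a - a.mean(axis=1, keepdims=True)
    b_c = b - b.mean(axis=1, keepdims=True)
    return (a_c * b_c).sum(axis=1) / np.sqrt(
        (a_c**2).sum(axis=1) * (b_c**2).sum(axis=1))

r_mt = per_gene_corr(mrna, translation)
r_mp = per_gene_corr(mrna, protein)
# Correlation decays along the cascade, mirroring the limited mRNA-protein
# agreement that the integrative multi-omic analysis sets out from.
print(r_mt.mean(), r_mp.mean())
```

Genes whose protein levels cycle while their mRNA does not would show up here as low mRNA-protein correlation, flagging post-transcriptional control of the kind the study dissects.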
Parallelization and checkpointing of GPU applications through program transformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solano-Quinde, Lizandro Damian
2012-01-01
GPUs have emerged as a powerful tool for accelerating general-purpose applications. The availability of programming languages that make writing general-purpose applications for GPUs tractable has consolidated GPUs as an alternative for accelerating general-purpose applications. Among the areas that have benefited from GPU acceleration are signal and image processing, computational fluid dynamics, quantum chemistry, and, in general, the High Performance Computing (HPC) industry. In order to continue to exploit higher levels of parallelism with GPUs, multi-GPU systems are gaining popularity. In this context, single-GPU applications are parallelized for running on multi-GPU systems. Furthermore, multi-GPU systems help to solve the GPU memory limitation for applications with a large application memory footprint. Parallelizing single-GPU applications has been approached by libraries that distribute the workload at runtime; however, they impose execution overhead and are not portable. On the other hand, on traditional CPU systems, parallelization has been approached through application transformation at pre-compile time, which enhances the application to distribute the workload at application level and does not have the issues of library-based approaches. Hence, a parallelization scheme for GPU systems based on application transformation is needed. Like any computing engine of today, reliability is also a concern in GPUs. GPUs are vulnerable to transient and permanent failures. Current checkpoint/restart techniques are not suitable for systems with GPUs. Checkpointing for GPU systems presents new and interesting challenges, primarily due to the natural differences imposed by the hardware design, the memory subsystem architecture, the massive number of threads, and the limited amount of synchronization among threads. Therefore, a checkpoint/restart technique suitable for GPU systems is needed.
The goal of this work is to exploit higher levels of parallelism and to develop support for application-level fault tolerance in applications using multiple GPUs. Our techniques reduce the burden of enhancing single-GPU applications to support these features. To achieve our goal, this work designs and implements a framework for enhancing a single-GPU OpenCL application through application transformation.
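Application-level checkpoint/restart of the kind this framework targets can be sketched as: periodically persist the host-side copy of the state, and on restart resume from the last checkpoint instead of from step zero. This pure-Python sketch simulates the GPU kernel with a trivial buffer update; the names and the checkpoint interval are hypothetical, and a real scheme would first copy device buffers back to the host.

```python
import os
import pickle
import tempfile

ckpt = os.path.join(tempfile.mkdtemp(), "state.pkl")

def save_checkpoint(step, buffers):
    with open(ckpt, "wb") as f:
        pickle.dump({"step": step, "buffers": buffers}, f)

def load_checkpoint():
    if not os.path.exists(ckpt):
        return 0, [0.0] * 4                    # initial state: four buffers
    with open(ckpt, "rb") as f:
        saved = pickle.load(f)
    return saved["step"], saved["buffers"]

def run(until, fail_at=None):
    step, buffers = load_checkpoint()
    while step < until:
        if step == fail_at:
            raise RuntimeError("simulated GPU failure")
        buffers = [b + 1.0 for b in buffers]   # stand-in for a kernel launch
        step += 1
        if step % 5 == 0:                      # checkpoint interval
            save_checkpoint(step, buffers)
    return step, buffers

try:
    run(until=20, fail_at=12)                  # crash after the step-10 checkpoint
except RuntimeError:
    pass
step, buffers = run(until=20)                  # restart: resumes from step 10
print(step, buffers[0])
```

Inserting such save/load calls automatically, rather than by hand, is what the application-transformation framework is for.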
State of science: occupational slips, trips and falls on the same level.
Chang, Wen-Ruey; Leclercq, Sylvie; Lockhart, Thurmon E; Haslam, Roger
2016-07-01
Occupational slips, trips and falls on the same level (STFL) result in substantial injuries worldwide. This paper summarises the state of science regarding STFL, outlining relevant aspects of epidemiology, biomechanics, psychophysics, tribology, organisational influences and injury prevention. This review reaffirms that STFL remain a major cause of workplace injury and STFL prevention is a complex problem, requiring multi-disciplinary, multi-faceted approaches. Despite progress in recent decades in understanding the mechanisms involved in STFL, especially slipping, research leading to evidence-based prevention practices remains insufficient, given the problem scale. It is concluded that there is a pressing need to develop better fall prevention strategies using systems approaches conceptualising and addressing the factors involved in STFL, with considerations of the full range of factors and their interactions. There is also an urgent need for field trials of various fall prevention strategies to assess the effectiveness of different intervention components and their interactions. Practitioner Summary: Work-related slipping, tripping and falls on the same level are a major source of occupational injury. The causes are broadly understood, although more attention is needed from a systems perspective. Research has shown preventative action to be effective, but further studies are required to understand which aspects are most beneficial.
NASA Astrophysics Data System (ADS)
Newig, Jens; Schulz, Daniel; Jager, Nicolas W.
2016-12-01
This article attempts to shed new light on prevailing puzzles of spatial scales in multi-level, participatory governance as regards the democratic legitimacy and environmental effectiveness of governance systems. We focus on the governance re-scaling by the European Water Framework Directive, which introduced new governance scales (mandated river basin management), demands consultation of citizens and encourages 'active involvement' of stakeholders. This allows us to examine whether and how re-scaling through deliberate governance interventions impacts on democratic legitimacy and effective environmental policy delivery. To guide the enquiry, this article organizes existing, partly contradictory, claims on the relation of scale, democratic legitimacy, and environmental effectiveness into three clusters of mechanisms, integrating insights from multi-level governance, social-ecological systems, and public participation. We empirically examine Water Framework Directive implementation in a comparative case study of multi-level systems in the light of the suggested mechanisms. We compare two planning areas in Germany: North Rhine Westphalia and Lower Saxony. Findings suggest that the Water Framework Directive did have some impact on institutionalizing hydrological scales and participation. Local participation appears generally both more effective and legitimate than on higher levels, pointing to the need for yet more tailored multi-level governance approaches, depending on whether environmental knowledge or advocacy is sought. We find mixed results regarding the potential of participation to bridge spatial 'misfits' between ecological and administrative scales of governance, depending on the historical institutionalization of governance on ecological scales. Polycentricity, finally, appeared somewhat favorable in effectiveness terms with some distinct differences regarding polycentricity in planning vs. polycentricity in implementation.
Multi-Level Building Reconstruction for Automatic Enhancement of High Resolution DSMs
NASA Astrophysics Data System (ADS)
Arefi, H.; Reinartz, P.
2012-07-01
In this article, a multi-level approach is proposed for reconstruction-based improvement of high resolution Digital Surface Models (DSMs). The concept of Levels of Detail (LOD) defined by the CityGML standard has been considered as the basis for abstraction levels of building roof structures. Here, LOD1 and LOD2, which correspond to prismatic and parametric roof shapes, are reconstructed. Besides proposing a new approach for automatic LOD1 and LOD2 generation from high resolution DSMs, the algorithm contains two generalization levels, namely horizontal and vertical. Both generalization levels are applied to the prismatic model of buildings. The horizontal generalization allows controlling the approximation level of building footprints, which is similar to the cartographic generalization concept of urban maps. In vertical generalization, the prismatic model is formed using an individual building height and continues to include all flat structures located at different height levels. The concept of LOD1 generation is based on approximation of the building footprints by rectangular or non-rectangular polygons. For a rectangular building containing one main orientation, a method based on the Minimum Bounding Rectangle (MBR) is employed. In contrast, a Combined Minimum Bounding Rectangle (CMBR) approach is proposed for regularization of non-rectilinear polygons, i.e. buildings without perpendicular edge directions. Both MBR- and CMBR-based approaches are iteratively employed on building segments to reduce the original building footprints to a minimum number of nodes with maximum similarity to the original shapes. A model-driven approach based on the analysis of the 3D points of DSMs in a 2D projection plane is proposed for LOD2 generation. Accordingly, a building block is divided into smaller parts according to the direction and number of existing ridge lines. 
The 3D model is derived for each building part and finally, a complete parametric model is formed by merging all the 3D models of the individual parts and adjusting the nodes after the merging step. In order to provide an enhanced DSM, a surface model is generated for each building by interpolation of the internal points of the generated models. All interpolated models are situated on a Digital Terrain Model (DTM) of the corresponding area to shape the enhanced DSM. The proposed DSM enhancement approach has been tested on a dataset from the Munich central area. The original DSM was created using robust stereo matching of WorldView-2 stereo images. A quantitative assessment of the new DSM, comparing the heights of ridges and eaves, shows a standard deviation of better than 50 cm.
Ahmad, Hassan
2013-01-01
The spread and perpetuation of the HIV/AIDS epidemic in South Africa has hindered the country's social and economic growth after apartheid. This paper documents my experiences while working with the Projects Abroad Human Rights Office and specifically my interactions with the Treatment Action Campaign (TAC), an organization which has taken a multi-dimensional approach in order to educate people about HIV/AIDS and attempt to provide access to medicines for millions of South Africans afflicted with the disease. I discuss how TAC has used both traditional and non-traditional methods of advocacy to combat the epidemic and equate access to health care to a social justice issue by empowering marginalized communities. The paper's dual purpose is to applaud TAC's continuous success in combating HIV/AIDS with such a multi-dimensional approach and illustrate how other organizations can utilize such an approach in order to affect social change. To illustrate TAC's approach, I utilize Lucie White's three dimensions of lawyering and equate TAC to a single cause lawyer, signifying that White's characterization of multi-dimensional activism is not limited to individuals, but can rather be applied at the firm level. White's three dimensions include: (a) advocacy through litigation, (b) advocacy in stimulating progressive change, and (c) advocacy as a pedagogic process. From this analysis, I conclude that TAC's multi-dimensional approach and specifically its inherent practice of White's three dimensions has been the root of its success in educating millions about the virus and advocating for access to medicines for those who have contracted HIV. TAC's innovative advocacy has also mobilized a new generation of South African activists who have helped TAC grow into a vibrant and integral organization within the country's post-apartheid culture. Such an example can serve as a framework for future organizations who wish to tackle other challenges that face the country. PMID:23819672
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global biosurveillance (BSV) community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
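The core of the Multi-Attribute Utility Theory approach described above is an additive utility function that scores each candidate data stream against weighted criteria. A minimal sketch follows; the criteria names, weights, and stream scores are hypothetical illustrations, not values from the paper.

```python
# Minimal MAUT sketch: rank candidate biosurveillance data streams by a
# weighted sum of per-criterion utilities (each utility normalized to [0, 1]).
# All criteria, weights, and scores below are invented for illustration.

def maut_score(alternative, weights):
    """Additive multi-attribute utility of one alternative."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[c] * alternative[c] for c in weights)

# Hypothetical decision criteria and their relative importance
weights = {"timeliness": 0.4, "coverage": 0.3, "specificity": 0.2, "cost": 0.1}

# Hypothetical candidate data streams scored on each criterion
streams = {
    "ED visit records":   {"timeliness": 0.90, "coverage": 0.6, "specificity": 0.70, "cost": 0.5},
    "lab confirmations":  {"timeliness": 0.40, "coverage": 0.5, "specificity": 0.95, "cost": 0.6},
    "news/media reports": {"timeliness": 0.95, "coverage": 0.8, "specificity": 0.30, "cost": 0.9},
}

# Rank streams from highest to lowest overall utility
ranked = sorted(streams, key=lambda s: maut_score(streams[s], weights), reverse=True)
```

In a real elicitation the weights and utility scales would come from structured stakeholder input rather than being fixed by hand.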
Delmotte, Sylvestre; Lopez-Ridaura, Santiago; Barbier, Jean-Marc; Wery, Jacques
2013-11-15
Evaluating the impacts of the development of alternative agricultural systems, such as organic or low-input cropping systems, in the context of an agricultural region requires the use of specific tools and methodologies. They should allow a prospective (using scenarios), multi-scale (taking into account the field, farm and regional level), integrated (notably multicriteria) and participatory assessment, abbreviated PIAAS (for Participatory Integrated Assessment of Agricultural System). In this paper, we compare the possible contribution to PIAAS of three modeling approaches i.e. Bio-Economic Modeling (BEM), Agent-Based Modeling (ABM) and statistical Land-Use/Land Cover Change (LUCC) models. After a presentation of each approach, we analyze their advantages and drawbacks, and identify their possible complementarities for PIAAS. Statistical LUCC modeling is a suitable approach for multi-scale analysis of past changes and can be used to start discussion about the futures with stakeholders. BEM and ABM approaches have complementary features for scenarios assessment at different scales. While ABM has been widely used for participatory assessment, BEM has been rarely used satisfactorily in a participatory manner. On the basis of these results, we propose to combine these three approaches in a framework targeted to PIAAS. Copyright © 2013 Elsevier Ltd. All rights reserved.
Teaching Shakespeare in the Digital Age: The eZoomBook Approach
ERIC Educational Resources Information Center
Evain, Christine; De Marco, Chris
2016-01-01
What collaborative process can teachers offer in order to stimulate their students' reading of and writing on Shakespeare's plays? How can new technologies contribute to facilitating the classroom experience? The eZoomBook (eZB) template was designed for teachers to create and share multi-level digital books called "eZoomBooks" that…
Small Wins: An Initiative to Promote Gender Equity in Higher Education
ERIC Educational Resources Information Center
Johnson, Katherine A.; Warr, Deborah J.; Hegarty, Kelsey; Guillemin, Marilys
2015-01-01
Gender inequity in leadership and management roles within the higher education sector remains a widespread problem. Researchers have suggested that a multi-pronged method is the preferred approach to reach and maintain gender equity over time. A large university faculty undertook an audit to gauge the level of gender equity on the senior…
ERIC Educational Resources Information Center
Betoret, Fernando Domenech
2009-01-01
This study examines the relationship between school resources, teacher self-efficacy, potential multi-level stressors and teacher burnout using structural equation modelling. The causal structure for primary and secondary school teachers was also examined. The sample was composed of 724 primary and secondary Spanish school teachers. The changes…
Education Policy as Normative Discourse and Negotiated Meanings: Engaging the Holocaust in Estonia
ERIC Educational Resources Information Center
Stevick, E. Doyle
2010-01-01
This article uses a socio-cultural approach to analyze the formation and implementation of Estonia's Holocaust Day Policy, a day of both commemoration for victims of the Holocaust and other crimes against humanity, and education about the Holocaust. It investigates both the multi-level development of the policy in light of external pressure (from…
School Security Measures and Extracurricular Participation: An Exploratory Multi-Level Analysis
ERIC Educational Resources Information Center
Mowen, Thomas J.; Manierre, Matthew J.
2017-01-01
Although delinquency in US schools is near historic lows, concern over delinquency in US schools remains a pressing issue among school officials, parents, and policy-makers. Many scholars argue that the current approach to discipline in the United States is highly punitive. While some projects have assessed the effect of punitive security on…
John G. Michopoulos; Tomonari Furukawa; John C. Hermanson; Samuel G. Lambrakos
2011-01-01
The goal of this paper is to propose and demonstrate a multi-level design optimization approach for the coordinated determination of a material constitutive model in synchrony with the design of the experimental procedure needed to acquire the necessary data. The methodology achieves both online (real-time) and offline design of the optimum experiments required for...
A Multi-Level Model of Moral Functioning Revisited
ERIC Educational Resources Information Center
Reed, Don Collins
2009-01-01
The model of moral functioning scaffolded in the 2008 "JME" Special Issue is here revisited in response to three papers criticising that volume. As guest editor of that Special Issue I have formulated the main body of this response, concerning the dynamic systems approach to moral development, the problem of moral relativism and the role of…
The Robust Learning Model (RLM): A Comprehensive Approach to a New Online University
ERIC Educational Resources Information Center
Neumann, Yoram; Neumann, Edith F.
2010-01-01
This paper outlines the components of the Robust Learning Model (RLM) as a conceptual framework for creating a new online university offering numerous degree programs at all degree levels. The RLM is a multi-factorial model based on the basic belief that successful learning outcomes depend on multiple factors employed together in a holistic…
ERIC Educational Resources Information Center
Welch, Chiquitia L.; Roberts-Lewis, Amelia C.; Parker, Sharon
2009-01-01
The rise in female delinquency has resulted in large numbers of girls being incarcerated in Youth Development Centers (YDC). However, there are few gender specific treatment programs for incarcerated female adolescent offenders, particularly for those with a history of substance dependency. In this article, we present a Multi-level Risk Model…
The impact of climate change on surface level ozone is examined through a multi-scale modeling effort that linked global and regional climate models to drive air quality model simulations. Results are quantified in terms of the Relative Response Factor (RRFE), which es...
Does Multi-Level Intervention Enhance Work Process Knowledge?
ERIC Educational Resources Information Center
Leppanen, Anneli; Hopsu, Leila; Klemola, Soili; Kuosma, Eeva
2008-01-01
Purpose: The aim of this study is to find out the impacts of participation in formal training and development of work on the work process knowledge of school kitchen workers. Design/methodology/approach: The article describes a follow-up study on the consequences of intervention. In total, 108 subjects participated both in the interventions and in…
Multi-Level Steering and Institution Building: The European Union's Approach to Research Policy
ERIC Educational Resources Information Center
Young, Mitchell
2012-01-01
Adopting the conception of the university as a primary driver of innovation and economic growth has brought increased pressure for the European Union (EU) to actively steer university-based research policy, despite its being outside of the EU's direct jurisdiction. While the open method of coordination (OMC) was developed for such situations, the…
Barraza, Roberto; Velazquez-Angulo, Gilberto; Flores-Tavizón, Edith; Romero-González, Jaime; Huertas-Cardozo, José Ignacio
2016-04-27
This study examines a pathway for building urban climate change mitigation policies by presenting a multi-dimensional and transdisciplinary approach in which technical, economic, environmental, social, and political dimensions interact. Now, more than ever, the gap between science and policymaking needs to be bridged; this will enable judicious choices to be made regarding energy and climate change mitigation strategies, leading to positive social impacts, in particular for at-risk populations at the local level. Through a case study in Juarez, Chihuahua, Mexico, we propose a multi-dimensional and transdisciplinary approach, with scientists in the role of policy advisers, to improve the role of science in decision-making on mitigation policies at the local level in Mexico.
Kalash, Leen; Val, Cristina; Azuaje, Jhonny; Loza, María I; Svensson, Fredrik; Zoufir, Azedine; Mervin, Lewis; Ladds, Graham; Brea, José; Glen, Robert; Sotelo, Eddy; Bender, Andreas
2017-12-30
Compounds designed to display polypharmacology may have utility in treating complex diseases, where activity at multiple targets is required to produce a clinical effect. In particular, suitable compounds may be useful in treating neurodegenerative diseases by promoting neuronal survival in a synergistic manner via their multi-target activity at the adenosine A1 and A2A receptors (A1R and A2AR) and phosphodiesterase 10A (PDE10A), which modulate intracellular cAMP levels. Hence, in this work we describe a computational method for the design of synthetically feasible ligands that bind to A1 and A2A receptors and inhibit PDE10A, involving a retrosynthetic approach employing in silico target prediction and docking, which may be generally applicable to multi-target compound design at several target classes. This approach has identified 2-aminopyridine-3-carbonitriles as the first multi-target ligands at A1R, A2AR and PDE10A, by showing agreement between the ligand- and structure-based predictions at these targets. The series were synthesized via an efficient one-pot scheme and validated pharmacologically as A1R/A2AR-PDE10A ligands, with IC50 values of 2.4-10.0 μM at PDE10A and Ki values of 34-294 nM at A1R and/or A2AR. Furthermore, selectivity profiling of the synthesized 2-aminopyridine-3-carbonitriles against other subtypes of both protein families showed that the multi-target ligand 8 exhibited a minimum of twofold selectivity over all tested off-targets. In addition, both compounds 8 and 16 exhibited the desired multi-target profile, which could be considered for further functional efficacy assessment, analog modification for the improvement of selectivity towards A1R, A2AR and PDE10A collectively, and evaluation of their potential synergy in modulating cAMP levels.
Integrating Model-Based Transmission Reduction into a multi-tier architecture
NASA Astrophysics Data System (ADS)
Straub, J.
A multi-tier architecture consists of numerous craft as part of the system's orbital, aerial, and surface tiers. Each tier is able to collect progressively greater levels of information. Generally, craft from lower-level tiers are deployed to a target of interest based on its identification by a higher-level craft. While the architecture promotes significant amounts of science being performed in parallel, this may overwhelm the computational and transmission capabilities of higher-tier craft and links (particularly the deep space link back to Earth). Because of this, a new paradigm in in-situ data processing is required. Model-based transmission reduction (MBTR) is such a paradigm. Under MBTR, each node (whether a single spacecraft in orbit of the Earth or another planet or a member of a multi-tier network) is given an a priori model of the phenomenon that it is assigned to study. It performs activities to validate this model. If the model is found to be erroneous, corrective changes are identified, assessed to ensure their significance for being passed on, and prioritized for transmission. A limited amount of verification data is sent with each MBTR assertion message to allow those that might rely on the data to validate the correct operation of the spacecraft and MBTR engine onboard. Integrating MBTR with a multi-tier framework creates an MBTR hierarchy. Higher levels of the MBTR hierarchy task lower levels with data collection and assessment tasks that are required to validate or correct elements of its model. A model of the expected conditions is sent to the lower level craft; which then engages its own MBTR engine to validate or correct the model. This may include tasking a yet lower level of craft to perform activities. 
When the MBTR engine at a given level receives all of its component data (whether directly collected or from delegation), it randomly chooses some to validate (by reprocessing the validation data), performs analysis and sends its own results (validation and/or changes of model elements and supporting validation data) to its upstream node. This constrains data transmission to only significant (either because it includes a change or is validation data critical for assessing overall performance) information and reduces the processing requirements (by not having to process insignificant data) at higher-level nodes. This paper presents a framework for multi-tier MBTR and two demonstration mission concepts: an Earth sensornet and a mission to Mars. These multi-tier MBTR concepts are compared to a traditional mission approach.
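The transmission-reduction logic described in the abstract (send only significant model corrections, plus a small random sample of validation data) can be sketched as follows. The field names, threshold, and sampling fraction are assumptions for illustration, not details from the MBTR paper.

```python
# Hedged sketch of model-based transmission reduction: a node holds an a priori
# model, compares each observation against the model's prediction, and
# transmits only significant corrections plus an occasional validation sample.
# Threshold, sampling fraction, and keys are invented for illustration.
import random

def mbtr_messages(model, observations, threshold=0.1, validation_frac=0.2, rng=None):
    rng = rng or random.Random(0)  # seeded for reproducibility in this sketch
    msgs = []
    for key, obs in observations.items():
        predicted = model.get(key, 0.0)
        if abs(obs - predicted) > threshold:
            # Model was wrong: always transmit the correction
            msgs.append({"key": key, "correction": obs})
        elif rng.random() < validation_frac:
            # Model was right: occasionally transmit proof of correct operation
            msgs.append({"key": key, "validation": obs})
    return msgs

model = {"temp": 20.0, "pressure": 1.0, "dust": 0.3}
obs = {"temp": 20.05, "pressure": 1.4, "dust": 0.31}
msgs = mbtr_messages(model, obs)
# Only "pressure" deviates beyond the threshold, so the downlink carries one
# correction instead of three raw readings.
```

An upstream node would apply the corrections to its own copy of the model and reprocess the sampled validation data to check the sender's MBTR engine.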
Sanz, Yolanda
2017-01-01
Abstract The miniaturized and portable DNA sequencer MinION™ has demonstrated great potential in different analyses such as genome-wide sequencing, pathogen outbreak detection and surveillance, human genome variability, and microbial diversity. In this study, we tested the ability of the MinION™ platform to perform long amplicon sequencing in order to design new approaches to study microbial diversity using a multi-locus approach. After compiling a robust database by parsing and extracting the rrn bacterial region from more than 67,000 complete or draft bacterial genomes, we demonstrated that the data obtained during sequencing of the long amplicon in the MinION™ device using R9 and R9.4 chemistries were sufficient to study two mock microbial communities in a multiplex manner and to almost completely reconstruct the microbial diversity contained in the HM782D and D6305 mock communities. Although nanopore-based sequencing produces reads with lower per-base accuracy compared with other platforms, we presented a novel approach consisting of multi-locus and long amplicon sequencing using the MinION™ MkIb DNA sequencer and R9 and R9.4 chemistries that helps to overcome the main disadvantage of this portable sequencing platform. Furthermore, the nanopore sequencing library, constructed with the latest releases of pore chemistry (R9.4) and sequencing kit (SQK-LSK108), permitted retrieval of a 1D read accuracy sufficient to characterize the microbial species present in each mock community analysed. Improvements in nanopore chemistry, such as minimizing base-calling errors and new library protocols able to produce rapid 1D libraries, will provide more reliable information in the near future. Such data will be useful for more comprehensive and faster specific detection of microbial species and strains in complex ecosystems. PMID:28605506
Hinckson, Erica; Schneider, Margaret; Winter, Sandra J; Stone, Emily; Puhan, Milo; Stathi, Afroditi; Porter, Michelle M; Gardiner, Paul A; Dos Santos, Daniela Lopes; Wolff, Andrea; King, Abby C
2017-09-29
Physical inactivity across the lifespan remains a public health issue for many developed countries. Inactivity has contributed considerably to the pervasiveness of lifestyle diseases. Government, national and local agencies and organizations have been unable to systematically, and in a coordinated way, translate behavioral research into practice that makes a difference at a population level. One approach for mobilizing multi-level efforts to improve the environment for physical activity is to engage in a process of citizen science. Citizen science here is defined as a participatory research approach involving members of the public working closely with research investigators to initiate and advance scientific research projects. However, there are no common measures or protocols to guide citizen science research in the local community setting. We describe overarching categories of constructs that can be considered when designing citizen science projects expected to yield multi-level interventions, and provide an example of the citizen science approach to promoting physical activity. We also recommend potential measures across different levels of impact. Encouraging some consistency in measurement across studies will potentially accelerate the efficiency with which citizen science participatory research provides new insights into and solutions to the behaviorally based public health issues that drive most morbidity and mortality. The measures described in this paper abide by four fundamental principles specifically selected for inclusion in citizen science projects: feasibility, accuracy, propriety, and utility. The choice of measures will take into account the potential resources available for outcome and process evaluation. Our intent is to emphasize the importance for all citizen science participatory projects of following an evidence-based approach and ensuring that they incorporate an appropriate assessment protocol. 
We provided the rationale for and a list of contextual factors along with specific examples of measures to encourage consistency among studies that plan to use a citizen science participatory approach. The potential of this approach to promote health and wellbeing in communities is high and we hope that we have provided the tools needed to optimally promote synergistic gains in knowledge across a range of Citizen Science participatory projects.
A multi-scale spatial approach to address environmental effects of small hydropower development.
McManamay, Ryan A; Samu, Nicole; Kao, Shih-Chieh; Bevelhimer, Mark S; Hetrick, Shelaine C
2015-01-01
Hydropower development continues to grow worldwide in developed and developing countries. While the ecological and physical responses to dam construction have been well documented, translating this information into planning for hydropower development is extremely difficult. Very few studies have conducted environmental assessments to guide site-specific or widespread hydropower development. Herein, we propose a spatial approach for estimating environmental effects of hydropower development at multiple scales, as opposed to individual site-by-site assessments (e.g., environmental impact assessment). Because the complex, process-driven effects of future hydropower development may be uncertain or, at best, limited by available information, we invested considerable effort in describing novel approaches to represent environmental concerns using spatial data and in developing the spatial footprint of hydropower infrastructure. We then use two case studies in the US, one at the scale of the conterminous US and another within two adjoining river basins, to examine how environmental concerns can be identified and related to areas of varying energy capacity. We use combinations of reserve-design planning and multi-metric ranking to visualize tradeoffs among environmental concerns and potential energy capacity. Spatial frameworks, like the one presented, are not meant to replace more in-depth environmental assessments, but to identify information gaps and measure the sustainability of multi-development scenarios to inform policy decisions at the basin or national level. Most importantly, the approach should foster discussions among environmental scientists and stakeholders regarding solutions to optimize energy development and environmental sustainability.
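The multi-metric ranking mentioned above can be sketched as combining per-metric ranks of candidate sites into one environmental-concern score, then pairing that score with energy capacity to expose tradeoffs. The site names, metrics, and values below are invented for illustration; they are not the paper's data.

```python
# Illustrative multi-metric ranking sketch for hypothetical hydropower sites.
# Higher metric values mean greater environmental concern; ranks are combined
# by simple averaging and then paired with capacity to surface tradeoffs.

sites = {
    # hypothetical environmental metrics and potential capacity
    "Site A": {"fish_species_at_risk": 3, "protected_land_km2": 12.0, "capacity_MW": 40},
    "Site B": {"fish_species_at_risk": 1, "protected_land_km2": 2.5,  "capacity_MW": 25},
    "Site C": {"fish_species_at_risk": 5, "protected_land_km2": 0.8,  "capacity_MW": 60},
}

def metric_ranks(sites, metric):
    """Rank sites on one metric (rank 1 = least environmental concern)."""
    ordered = sorted(sites, key=lambda s: sites[s][metric])
    return {s: i + 1 for i, s in enumerate(ordered)}

env_metrics = ["fish_species_at_risk", "protected_land_km2"]
composite = {
    s: sum(metric_ranks(sites, m)[s] for m in env_metrics) / len(env_metrics)
    for s in sites
}

# Sort by composite concern (ascending), breaking ties toward higher capacity
tradeoffs = sorted((composite[s], -sites[s]["capacity_MW"], s) for s in sites)
```

In practice such scores would feed a map-based visualization rather than a sorted list, but the rank-aggregation step is the same.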
2013-01-01
Background: Despite progress in the development of combined antiretroviral therapies (cART), HIV infection remains a significant challenge for human health. Current problems of cART include multi-drug-resistant virus variants, long-term toxicity and enormous treatment costs. Therefore, the identification of novel effective drugs is urgently needed. Methods: We developed a straightforward screening approach for simultaneously evaluating the sensitivity of multiple HIV gag-pol mutants to antiviral drugs in one assay. Our technique is based on multi-colour lentiviral self-inactivating (SIN) LeGO vector technology. Results: We demonstrated the successful use of this approach for screening compounds against up to four HIV gag-pol variants (wild-type and three mutants) simultaneously. Importantly, the technique was adapted to Biosafety Level 1 conditions by utilising ecotropic pseudotypes. This allowed upscaling to a large-scale screening protocol exploited by pharmaceutical companies in a successful proof-of-concept experiment. Conclusions: The technology developed here facilitates fast screening for anti-HIV activity of individual agents from large compound libraries. Although drugs targeting gag-pol variants were used here, our approach permits screening compounds that target several different, key cellular and viral functions of the HIV life-cycle. The modular principle of the method also allows the easy exchange of various mutations in HIV sequences. In conclusion, the methodology presented here provides a valuable new approach for the identification of novel anti-HIV drugs. PMID:23286882
NASA Astrophysics Data System (ADS)
Lanni, Cristiano; Mazzorana, Bruno; Volcan, Claudio; Bertagnolli, Rudi
2015-04-01
Flood hazard is generally assessed by assuming the return period of the rainfall as a proxy for the return period of the discharge and the related hydrograph. Frequently this deterministic view is extended also to the straightforward application of hydrodynamic models. However, the climate (i.e. precipitation), the catchment (i.e. geology, soil and antecedent soil-moisture condition) and the anthropogenic (i.e. drainage system and its regulation) systems interact in a complex way, and the occurrence probability of a flood inundation event can significantly differ from the occurrence probability of the triggering event (i.e. rainfall). In order to reliably determine the spatial patterns of flood intensities and probabilities, the rigorous determination of flood event scenarios is beneficial because it provides a clear, rationale method to recognize and unveil the inherent stochastic behavior of natural processes. Therefore, a multi-scenario approach for hazard assessment should be applied and should consider the possible events taking place in the area potentially subject to flooding (i.e. floodplains). Here, we apply a multi-scenario approach for the assessment of the flood hazard around the Idro lake (Italy). We consider and estimate the probability of occurrence of several scenarios related to the initial (i.e. initial water level in the lake) and boundary (i.e. shape of the hydrograph, downslope drainage, spillway opening operations) conditions characterizing the lake. Finally, we discuss the advantages and issues of the presented methodological procedure compared to traditional (and essentially deterministic) approaches.
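The central point of the multi-scenario approach above is that the probability of inundation is not the probability of the triggering rainfall: it must be marginalized over initial and boundary-condition scenarios. A minimal sketch of that computation follows; the scenario list and all probabilities are invented for illustration, not values from the Idro lake study.

```python
# Sketch of a multi-scenario hazard computation via the law of total
# probability. Scenarios and probabilities are hypothetical illustrations.

# Each entry: (description, P(scenario), P(inundation | scenario, design rainfall))
scenarios = [
    ("low initial lake level, spillway open",    0.50, 0.02),
    ("high initial lake level, spillway open",   0.30, 0.20),
    ("high initial lake level, spillway closed", 0.20, 0.65),
]

# Sanity check: scenario probabilities must partition the event space
assert abs(sum(p for _, p, _ in scenarios) - 1.0) < 1e-9

# P(inundation) = sum_i P(scenario_i) * P(inundation | scenario_i)
p_inundation = sum(p_s * p_cond for _, p_s, p_cond in scenarios)
```

Note how the result can differ sharply from any single scenario's conditional probability, which is why equating the rainfall return period with the inundation return period can misstate the hazard.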
Highly stable families of soliton molecules in fiber-optic systems
NASA Astrophysics Data System (ADS)
Moubissi, A.-B.; Tchofo Dinda, P.; Nse Biyoghe, S.
2018-04-01
We develop an efficient approach to the design of families of single solitons and soliton molecules best suited to a given fiber system. The obtained solitonic entities exhibit very high stability, with a robustness that allows them to propagate over thousands of kilometers and to survive collisions with other solitonic entities. Our approach enables the generation of a large number of solitonic entities, including families of single solitons and two-soliton molecules, which can be sufficiently distinguished by their respective profiles or energy levels, and so are easily identifiable and detectable without ambiguity. We discuss the possible use of such solitonic entities as symbols of a multi-level modulation format in fiber-optic communication systems.
Applying a Consumer Behavior Lens to Salt Reduction Initiatives
Potvin Kent, Monique; Raats, Monique M.; McConnon, Áine; Wall, Patrick; Dubois, Lise
2017-01-01
Reformulation of food products to reduce salt content has been a central strategy for achieving population level salt reduction. In this paper, we reflect on current reformulation strategies and consider how consumer behavior determines the ultimate success of these strategies. We consider the merits of adopting a ‘health by stealth’, silent approach to reformulation compared to implementing a communications strategy which draws on labeling initiatives in tandem with reformulation efforts. We end this paper by calling for a multi-actor approach which utilizes co-design, participatory tools to facilitate the involvement of all stakeholders, including, and especially, consumers, in making decisions around how best to achieve population-level salt reduction. PMID:28820449
Ambient and smartphone sensor assisted ADL recognition in multi-inhabitant smart environments.
Roy, Nirmalya; Misra, Archan; Cook, Diane
2016-02-01
Activity recognition in smart environments is an evolving research problem due to the advancement and proliferation of sensing, monitoring and actuation technologies, which make large-scale, real-world deployment possible. Activities in a smart home are interleaved, complex and volatile, and the number of inhabitants in the environment is also dynamic. A key challenge in designing robust smart home activity recognition approaches is to exploit the users' spatiotemporal behavior and location, leverage the multitude of devices capable of providing different dimensions of information, and fulfill the underpinning needs for scaling the system beyond a single user or a single home environment. In this paper, we propose a hybrid approach for recognizing complex activities of daily living (ADL) that lies between the two extremes of intensive use of body-worn sensors and the use of ambient sensors alone. Our approach harnesses the power of simple ambient sensors (e.g., motion sensors) to provide additional 'hidden' context (e.g., room-level location) of an individual, and then combines this context with smartphone-based sensing of micro-level postural/locomotive states. The major novelty is our focus on multi-inhabitant environments, where we show how spatiotemporal constraints, together with a multitude of data sources, can be used to significantly improve the accuracy and reduce the computational overhead of traditional activity recognition approaches such as coupled hidden Markov models. Experimental results on two separate smart home datasets demonstrate that this approach improves the accuracy of complex ADL classification by over 30% compared to pure smartphone-based solutions.
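As a rough sketch of the hybrid idea, ambient room-level context can prune the set of plausible activities before smartphone-derived scores are compared. The room map, activity names and scores below are hypothetical illustrations, not the paper's actual model (which uses coupled hidden Markov models):

```python
# Hybrid ADL sketch: ambient motion sensors supply room-level location,
# which constrains the activity set before phone-based posture scores
# are applied. All names and scores are hypothetical.
ROOM_ACTIVITIES = {
    "kitchen": {"cooking", "eating"},
    "bedroom": {"sleeping", "dressing"},
    "living_room": {"watching_tv", "eating"},
}

def classify(room, phone_scores):
    """Pick the highest-scoring activity consistent with the room context."""
    allowed = ROOM_ACTIVITIES.get(room, set(phone_scores))
    feasible = {a: s for a, s in phone_scores.items() if a in allowed}
    return max(feasible, key=feasible.get)

scores = {"cooking": 0.4, "sleeping": 0.5, "eating": 0.1}
print(classify("kitchen", scores))  # → cooking (room rules out "sleeping")
```

The spatial constraint both improves accuracy (the globally highest score is rejected as infeasible) and shrinks the hypothesis space the downstream model must evaluate.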
NASA Astrophysics Data System (ADS)
Gallina, Valentina; Torressan, Silvia; Zabeo, Alex; Critto, Andrea; Glade, Thomas; Marcomini, Antonio
2015-04-01
Climate change is expected to pose a wide range of impacts on natural and human systems worldwide, increasing risks from long-term climate trends and disasters triggered by weather extremes. Accordingly, in the future, one region could be potentially affected by interactions, synergies and trade-offs of multiple hazards and impacts. A multi-risk approach is needed to effectively address multiple threats posed by climate change across regions and targets, supporting decision-makers toward a new paradigm of multi-hazard and risk management. Relevant initiatives have already been developed for the assessment of multiple hazards and risks affecting the same area in a defined timeframe by means of quantitative and semi-quantitative approaches. Most of them address the relations among different natural hazards; however, the effect of future climate change is usually not considered. In order to fill this gap, an advanced multi-risk methodology was developed at the Euro-Mediterranean Centre on Climate Change (CMCC) for estimating cumulative impacts related to climate change at the regional (i.e. sub-national) scale. This methodology was implemented into an assessment tool which allows natural systems and human assets at risk from different interacting hazards to be quickly screened and classified. A multi-hazard index is proposed to evaluate the relationships of different climate-related hazards (e.g. sea-level rise, coastal erosion, storm surge) occurring in the same spatial and temporal area, by means of an influence matrix and the disjoint probability function. Future hazard scenarios provided by regional climate models are used as input for this step in order to consider possible effects of future climate change scenarios. Then, the multi-vulnerability of different exposed receptors (e.g. natural systems, beaches, agricultural and urban areas) is estimated through a variety of vulnerability indicators (e.g. 
vegetation cover, sediment budget, % of urbanization), tailored case by case to different sets of natural hazards and elements at risk. Finally, the multi-risk assessment integrates the multi-hazard with the multi-vulnerability index of exposed receptors, providing a relative ranking of areas and targets potentially affected by multiple risks in the considered region. The methodology was applied to the North Adriatic coast (Italy), producing a range of GIS-based multi-hazard, exposure, multi-vulnerability and multi-risk maps that can be used by policy-makers to define risk management and adaptation strategies. Results show that areas affected by higher multi-hazard scores are located close to the coastline, where all the investigated hazards are present. Multi-vulnerability assumes relatively high scores across the whole case study area, showing that beaches, wetlands, protected areas and river mouths are the most sensitive targets. The final estimate of multi-risk for coastal municipalities provides useful information for local public authorities to set future priorities for adaptation and define future plans for shoreline and coastal management in view of climate change.
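A minimal sketch of combining co-occurring hazard probabilities with a disjoint probability function, assuming independence between hazards; the probabilities are illustrative placeholders, not the North Adriatic results:

```python
# Disjoint-probability combination of climate-related hazards at one
# coastal cell (illustrative values, independence assumed).
hazard_probs = {
    "sea_level_rise": 0.20,
    "coastal_erosion": 0.10,
    "storm_surge": 0.05,
}

def multi_hazard_index(probs):
    """P(at least one hazard) = 1 - prod(1 - p_i) under independence."""
    p_none = 1.0
    for p in probs.values():
        p_none *= (1.0 - p)
    return 1.0 - p_none

print(round(multi_hazard_index(hazard_probs), 4))  # → 0.316
```

In the actual methodology an influence matrix additionally weights hazard interactions; the independence assumption here is only the simplest baseline for the disjoint-probability step.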
Upadhyay, Manas V.; Patra, Anirban; Wen, Wei; ...
2018-05-08
In this paper, we propose a multi-scale modeling approach that can simulate the microstructural and mechanical behavior of metal or alloy parts with complex geometries subjected to multi-axial load path changes. The model is used to understand the biaxial load path change behavior of 316L stainless steel cruciform samples. At the macroscale, a finite element approach is used to simulate the cruciform geometry and numerically predict the gauge stresses, which are difficult to obtain analytically. At each material point in the finite element mesh, the anisotropic viscoplastic self-consistent model is used to simulate the role of texture evolution on the mechanical response. At the single crystal level, a dislocation density based hardening law that appropriately captures the role of multi-axial load path changes on slip activity is used. The combined approach is experimentally validated using cruciform samples subjected to uniaxial load and unload followed by different biaxial reloads in the angular range [27°, 90°]. Polycrystalline yield surfaces before and after load path changes are generated using the full-field elasto-viscoplastic fast Fourier transform model to study the influence of the deformation history and reloading direction on the mechanical response, including the Bauschinger effect, of these cruciform samples. Results reveal that the Bauschinger effect is strongly dependent on the first loading direction and strain, intergranular and macroscopic residual stresses after first load, and the reloading angle. The microstructural origins of the mechanical response are discussed.
Development of policies for Natura 2000 sites: a multi-criteria approach to support decision makers.
Cortina, Carla; Boggia, Antonio
2014-08-01
The aim of this study is to present a methodology to support decision makers in the choice of Natura 2000 sites needing an appropriate management plan to ensure sustainable socio-economic development. In order to promote sustainable development in the Natura 2000 sites compatible with nature preservation, conservation measures or management plans are necessary. The main issue is to decide when conservation measures alone can be applied and when the sites need an appropriate management plan. We present a case study for the Italian region of Umbria. The methodology is based on a multi-criteria approach to identify the biodiversity index (BI), and on the development of a human activities index (HAI). By crossing the two indexes for each site on a Cartesian plane, four groups of sites were identified. Each group corresponds to a specific need for an appropriate management plan. Sites in the first group, with a high level of both biodiversity and human activities, have the most urgent need of an appropriate management plan to ensure sustainable development. The proposed methodology and analysis are replicable in other regions or countries by using the data available for each site in the Natura 2000 standard data form. A multi-criteria analysis is especially suitable for supporting decision makers when they deal with a multidimensional decision process. We found the multi-criteria approach particularly sound in this case, due to the concept of biodiversity itself, which is complex and multidimensional, and to the high number of alternatives (Natura 2000 sites) to be assessed. Copyright © 2014 Elsevier Ltd. All rights reserved.
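The two-index quadrant logic can be sketched as follows; the 0.5 thresholds and group labels are hypothetical placeholders, not the study's calibrated values:

```python
# Four-quadrant classification of sites by biodiversity index (BI) and
# human activities index (HAI). Thresholds and labels are illustrative.
BI_THRESHOLD = 0.5
HAI_THRESHOLD = 0.5

def site_group(bi, hai):
    if bi >= BI_THRESHOLD and hai >= HAI_THRESHOLD:
        return "management plan urgently needed"      # high BI, high HAI
    if bi >= BI_THRESHOLD:
        return "conservation measures, monitor future pressure"
    if hai >= HAI_THRESHOLD:
        return "manage activities, lower conservation urgency"
    return "conservation measures sufficient"

print(site_group(0.8, 0.7))  # → management plan urgently needed
```

Plotting each site at (BI, HAI) makes the four groups visually separable, which is the practical appeal of the Cartesian-plane crossing described in the abstract.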
Freer, Phoebe E; Slanetz, Priscilla J; Haas, Jennifer S; Tung, Nadine M; Hughes, Kevin S; Armstrong, Katrina; Semine, A Alan; Troyan, Susan L; Birdwell, Robyn L
2015-09-01
Stemming from breast density notification legislation in Massachusetts effective 2015, we sought to develop a collaborative evidence-based approach to density notification that could be used by practitioners across the state. Our goal was to develop an evidence-based consensus management algorithm to help patients and health care providers follow best practices to implement a coordinated, evidence-based, cost-effective, sustainable practice and to standardize care in recommendations for supplemental screening. We formed the Massachusetts Breast Risk Education and Assessment Task Force (MA-BREAST), a multi-institutional, multi-disciplinary panel of expert radiologists, surgeons, primary care physicians, and oncologists, to develop a collaborative approach to density notification legislation. Using evidence-based data from the Institute for Clinical and Economic Review, the Cochrane review, National Comprehensive Cancer Network guidelines, American Cancer Society recommendations, and American College of Radiology appropriateness criteria, the group collaboratively developed an evidence-based best-practices algorithm. The expert consensus algorithm uses breast density as one element in the risk stratification to determine the need for supplemental screening. Per the expert consensus, women with dense breasts who are otherwise at low risk (<15% lifetime risk) do not routinely require supplemental screening. Women at high risk (>20% lifetime risk) should consider supplemental screening MRI in addition to routine mammography, regardless of breast density. We report the development of the multi-disciplinary collaborative approach to density notification and propose a risk stratification algorithm to assess an individual woman's personal level of risk to determine the need for supplemental screening.
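The thresholds stated in the abstract (<15% and >20% lifetime risk) can be sketched as a decision function. The intermediate 15-20% band is not specified in the abstract, so it is flagged as such rather than invented:

```python
# Sketch of the risk-stratified supplemental-screening logic from the
# abstract. Per the consensus described there, dense breasts alone do not
# trigger supplemental screening at low risk, and high risk triggers MRI
# regardless of density, so the thresholds below are density-independent.
def supplemental_screening(lifetime_risk_pct):
    if lifetime_risk_pct > 20:
        return "consider supplemental MRI plus routine mammography"
    if lifetime_risk_pct < 15:
        return "routine mammography only"
    # 15-20% band: not specified in the abstract
    return "individualized decision (not specified in abstract)"

print(supplemental_screening(25))  # → consider supplemental MRI plus routine mammography
```

This is an illustration of the published thresholds only, not clinical guidance; the full algorithm weighs density alongside other risk factors.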
Crowe, A S; Booty, W G
1995-05-01
A multi-level pesticide assessment methodology has been developed to permit regulatory personnel to undertake a variety of assessments of the potential for pesticides used in agricultural areas to contaminate the groundwater regime, at an increasingly detailed geographical scale of investigation. The multi-level approach accounts for a variety of assessment objectives, the level of detail required in the assessment, restrictions on the availability and accuracy of data, the time available to undertake the assessment, and the expertise of the decision maker. Level 1 (regional scale) is designed to prioritize districts having a potentially high risk of groundwater contamination from the application of a specific pesticide for a particular crop. Level 2 (local scale) is used to identify critical areas for groundwater contamination, at a soil polygon scale, within a district. Level 3 (soil profile scale) allows the user to evaluate specific factors influencing pesticide leaching and persistence, and to determine the extent and timing of leaching, through simulation of the migration of a pesticide within a soil profile. Because of the scale of investigation, the limited amount of data required, and the qualitative nature of the assessment results, the level 1 and level 2 assessments are designed primarily for quick and broad guidance related to management practices. A level 3 assessment is more complex, requires considerably more data and expertise on the part of the user, and hence is designed to verify the potential for contamination identified during a level 1 or 2 assessment. The system combines environmental modelling, geographical information systems, extensive databases, data management systems, expert systems, and pesticide assessment models to form an environmental information system for assessing the potential for pesticides to contaminate groundwater.
A Mixtures-of-Trees Framework for Multi-Label Classification
Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos
2015-01-01
We propose a new probabilistic approach for multi-label classification that aims to represent the class posterior distribution P(Y|X). Our approach uses a mixture of tree-structured Bayesian networks, which can leverage the computational advantages of conditional tree-structured models and the abilities of mixtures to compensate for tree-structured restrictions. We develop algorithms for learning the model from data and for performing multi-label predictions using the learned model. Experiments on multiple datasets demonstrate that our approach outperforms several state-of-the-art multi-label classification methods. PMID:25927011
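In the standard mixtures-of-trees formulation (K components with mixing weights π_k, each component factorizing along its own tree structure), the modeled class posterior takes the form below; the notation is the conventional one and is assumed here rather than quoted from the paper:

```latex
P(Y \mid X) \;=\; \sum_{k=1}^{K} \pi_k \, P_k(Y \mid X),
\qquad
P_k(Y \mid X) \;=\; \prod_{i=1}^{d} P_k\!\left(Y_i \mid Y_{\operatorname{pa}_k(i)}, X\right),
```

where pa_k(i) denotes the parent of label Y_i in the k-th tree. Each tree-structured component admits exact inference in time linear in the number of labels, and the mixture compensates for the restricted dependencies any single tree can express.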
New architectural paradigms for multi-petabyte distributed storage systems
NASA Technical Reports Server (NTRS)
Lee, Richard R.
1994-01-01
In the not too distant future, programs such as NASA's Earth Observing System, NSF/ARPA/NASA's Digital Libraries Initiative and Intelligence Community's (NSA, CIA, NRO, etc.) mass storage system upgrades will all require multi-petabyte (petabyte: 10^15 bytes of bitfile data) (or larger) distributed storage solutions. None of these requirements, as currently defined, will meet their objectives utilizing either today's architectural paradigms or storage solutions. Radically new approaches will be required to not only store and manage veritable 'mountain ranges of data', but to make the cost of ownership affordable, much less practical in today's (and certainly the future's) austere budget environment! Within this paper we will explore new architectural paradigms and project systems performance benefits and dollars per petabyte of information stored. We will discuss essential 'top down' approaches to achieving an overall systems level performance capability sufficient to meet the challenges of these major programs.
Modeling human diseases with induced pluripotent stem cells: from 2D to 3D and beyond.
Liu, Chun; Oikonomopoulos, Angelos; Sayed, Nazish; Wu, Joseph C
2018-03-08
The advent of human induced pluripotent stem cells (iPSCs) presents unprecedented opportunities to model human diseases. Differentiated cells derived from iPSCs in two-dimensional (2D) monolayers have proven to be a relatively simple tool for exploring disease pathogenesis and underlying mechanisms. In this Spotlight article, we discuss the progress and limitations of the current 2D iPSC disease-modeling platform, as well as recent advancements in the development of human iPSC models that mimic in vivo tissues and organs at the three-dimensional (3D) level. Recent bioengineering approaches have begun to combine different 3D organoid types into a single '4D multi-organ system'. We summarize the advantages of this approach and speculate on the future role of 4D multi-organ systems in human disease modeling. © 2018. Published by The Company of Biologists Ltd.
Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S
2016-06-01
We extend dynamic generalized structured component analysis (GSCA) to enhance its data-analytic capability in structural equation modeling of multi-subject time series data. Time series data of multiple subjects are typically hierarchically structured, where time points are nested within subjects who are in turn nested within a group. The proposed approach, named multilevel dynamic GSCA, accommodates the nested structure in time series data. By explicitly taking the nested structure into account, the proposed method allows investigating subject-wise variability of the loadings and path coefficients by looking at the variance estimates of the corresponding random effects, as well as fixed loadings between observed and latent variables and fixed path coefficients between latent variables. We demonstrate the effectiveness of the proposed approach by applying it to multi-subject functional neuroimaging data for brain connectivity analysis, where time series measurements are nested within subjects.
Multi-category micro-milling tool wear monitoring with continuous hidden Markov models
NASA Astrophysics Data System (ADS)
Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon
2009-02-01
In-process monitoring of tool conditions is important in micro-machining due to the high precision requirement and high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous hidden Markov models (HMMs) are adapted for modeling the tool wear process in micro-milling and for estimating the tool wear state from cutting force features. For noise robustness, the HMM outputs are passed through a median filter to suppress spurious state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
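The post-decoding smoothing step can be sketched as a simple median filter over the decoded wear-state sequence; the window size and the state sequence below are illustrative, not the paper's data:

```python
# Median filtering of a noisy discrete wear-state sequence, as a smoothing
# step after HMM decoding (window size and states are illustrative).
def median_filter(states, window=3):
    half = window // 2
    out = []
    for i in range(len(states)):
        lo, hi = max(0, i - half), min(len(states), i + half + 1)
        neighborhood = sorted(states[lo:hi])
        out.append(neighborhood[len(neighborhood) // 2])
    return out

raw = [0, 0, 1, 0, 0, 1, 1, 2, 1, 2, 2]  # spurious single-step jumps
print(median_filter(raw))  # → [0, 0, 0, 0, 0, 1, 1, 1, 2, 2, 2]
```

Isolated one-sample excursions are removed while genuine level shifts (0→1→2, consistent with monotonic wear) survive, which is why median filtering suits state sequences better than linear smoothing.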
Modeling Emergence in Neuroprotective Regulatory Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Haack, Jereme N.; McDermott, Jason E.
2013-01-05
The use of predictive modeling in the analysis of gene expression data can greatly accelerate the pace of scientific discovery in biomedical research by enabling in silico experimentation to test disease triggers and potential drug therapies. Techniques that focus on modeling emergence, such as agent-based modeling and multi-agent simulations, are of particular interest as they support the discovery of pathways that may have never been observed in the past. Thus far, these techniques have been primarily applied at the multi-cellular level, or have focused on signaling and metabolic networks. We present an approach where emergence modeling is extended to regulatory networks and demonstrate its application to the discovery of neuroprotective pathways. An initial evaluation of the approach indicates that emergence modeling provides novel insights for the analysis of regulatory networks that can advance the discovery of acute treatments for stroke and other diseases.
Probabilistic performance-based design for high performance control systems
NASA Astrophysics Data System (ADS)
Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice
2017-04-01
High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for the mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life cycle cost analysis (LCA) approach is proposed to estimate the economic benefit of these systems over the entire life of the structure. The novelty resides in embedding the life cycle cost analysis within performance-based design (PBD) tailored to multi-level wind hazards. This yields a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCAs are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.
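A toy life-cycle cost comparison illustrates why a costlier HPCS can pay off through avoided hazard losses. All dollar figures, the discount rate and the horizon are hypothetical, not values from the Boston case study:

```python
# Life-cycle cost sketch: initial cost plus discounted annual operating
# and expected hazard-loss costs (all figures hypothetical).
def life_cycle_cost(initial, annual_om, expected_annual_loss,
                    years=50, rate=0.03):
    # present-value annuity factor: sum of 1/(1+r)^t for t = 1..years
    disc = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
    return initial + (annual_om + expected_annual_loss) * disc

passive = life_cycle_cost(initial=1.0e6, annual_om=5e3,
                          expected_annual_loss=1.5e5)
hpcs = life_cycle_cost(initial=2.5e6, annual_om=2e4,
                       expected_annual_loss=2e4)
print(hpcs < passive)  # higher up-front cost offset by avoided losses
```

In the probabilistic PBD setting, `expected_annual_loss` would itself come from integrating damage fragilities over the multi-level wind hazard curve rather than being a fixed input.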
Cilfone, Nicholas A.; Kirschner, Denise E.; Linderman, Jennifer J.
2015-01-01
Biologically related processes operate across multiple spatiotemporal scales. For computational modeling methodologies to mimic this biological complexity, individual scale models must be linked in ways that allow for dynamic exchange of information across scales. A powerful methodology is to combine a discrete modeling approach, agent-based models (ABMs), with continuum models to form hybrid models. Hybrid multi-scale ABMs have been used to simulate emergent responses of biological systems. Here, we review two aspects of hybrid multi-scale ABMs: linking individual scale models and efficiently solving the resulting model. We discuss the computational choices associated with aspects of linking individual scale models while simultaneously maintaining model tractability. We demonstrate implementations of existing numerical methods in the context of hybrid multi-scale ABMs. Using an example model describing Mycobacterium tuberculosis infection, we show relative computational speeds of various combinations of numerical methods. Efficient linking and solution of hybrid multi-scale ABMs is key to model portability, modularity, and their use in understanding biological phenomena at a systems level. PMID:26366228
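A minimal operator-splitting sketch of the linking idea: a continuum variable (explicit-Euler ODE step) and a discrete agent count are updated in alternation, exchanging state each step. The rates and the death rule are invented for illustration and are not the M. tuberculosis model:

```python
# Hybrid linking sketch: alternate a continuum ODE update with a discrete
# agent-level rule, exchanging state each step (all rates hypothetical).
agents = 100        # discrete scale: live agent count
chemokine = 1.0     # continuum scale: local concentration
DT = 0.1

for step in range(200):
    # continuum step: dC/dt = production by live agents - first-order decay
    chemokine += DT * (0.01 * agents - 0.5 * chemokine)
    # discrete step: one agent dies while concentration stays high
    if chemokine > 1.5 and agents > 0:
        agents -= 1

print(agents, round(chemokine, 3))
```

The two scales stabilize each other: agent deaths reduce production, which pulls the concentration back under the threshold. Choosing the exchange frequency and step sizes for such couplings is exactly the tractability question the review discusses.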
NASA Astrophysics Data System (ADS)
Ravi, Sathish Kumar; Gawad, Jerzy; Seefeldt, Marc; Van Bael, Albert; Roose, Dirk
2017-10-01
A numerical multi-scale model is being developed to predict the anisotropic macroscopic material response of multi-phase steel. The embedded microstructure is given by a meso-scale Representative Volume Element (RVE), which holds the most relevant features like phase distribution, grain orientation, morphology etc., in sufficient detail to describe the multi-phase behavior of the material. A Finite Element (FE) mesh of the RVE is constructed using statistical information from individual phases such as grain size distribution and ODF. The material response of the RVE is obtained for selected loading/deformation modes through numerical FE simulations in Abaqus. For the elasto-plastic response of the individual grains, single crystal plasticity based plastic potential functions are proposed as Abaqus material definitions. The plastic potential functions are derived using the Facet method for individual phases in the microstructure at the level of single grains. The proposed method is a new modeling framework and the results presented in terms of macroscopic flow curves are based on the building blocks of the approach, while the model would eventually facilitate the construction of an anisotropic yield locus of the underlying multi-phase microstructure derived from a crystal plasticity based framework.
Optimal maintenance policy incorporating system level and unit level for mechanical systems
NASA Astrophysics Data System (ADS)
Duan, Chaoqun; Deng, Chao; Wang, Bingran
2018-04-01
This study develops a multi-level maintenance policy combining the system level and the unit level under soft and hard failure modes. The system undergoes system-level preventive maintenance (SLPM) when the conditional reliability of the entire system exceeds the SLPM threshold, and each unit undergoes a two-level maintenance scheme: one action is initiated when the unit exceeds its own preventive maintenance (PM) threshold, and the other is performed opportunistically whenever any other unit undergoes maintenance. The units experience both periodic inspections and aperiodic inspections triggered by failures of hard-type units. To model practical situations, two types of economic dependence are taken into account: set-up cost dependence and maintenance expertise dependence, arising because the same technology and tools/equipment can be utilised. The optimisation problem is formulated and solved in a semi-Markov decision process framework. The objective is to find the optimal system-level threshold and unit-level thresholds by minimising the long-run expected average cost per unit time. A formula for the mean residual life is derived for the proposed multi-level maintenance policy. The method is illustrated by a real case study of the feed subsystem from a boring machine, and a comparison with other policies demonstrates the effectiveness of our approach.
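The trigger logic of such a combined policy can be sketched as threshold checks with opportunistic grouping; all degradation indices, thresholds and the opportunistic margin below are illustrative, not the boring-machine case study's values:

```python
# Sketch of combined system/unit-level maintenance triggers with
# opportunistic grouping to exploit shared set-up costs (values illustrative).
SYSTEM_THRESHOLD = 0.8
UNIT_THRESHOLDS = {"u1": 0.6, "u2": 0.7, "u3": 0.5}
OPPORTUNISTIC_MARGIN = 0.1  # maintain units within this margin of threshold

def maintenance_actions(system_index, unit_indices):
    actions = []
    if system_index > SYSTEM_THRESHOLD:
        actions.append("system-level PM")
    due = [u for u, x in unit_indices.items() if x > UNIT_THRESHOLDS[u]]
    actions += [f"PM on {u}" for u in due]
    if due:  # opportunistic PM shares the set-up cost of the triggered visit
        near = [u for u, x in unit_indices.items()
                if u not in due and x > UNIT_THRESHOLDS[u] - OPPORTUNISTIC_MARGIN]
        actions += [f"opportunistic PM on {u}" for u in near]
    return actions

print(maintenance_actions(0.5, {"u1": 0.65, "u2": 0.65, "u3": 0.2}))
# → ['PM on u1', 'opportunistic PM on u2']
```

In the paper the thresholds themselves are decision variables optimized in a semi-Markov decision process; this sketch only shows how a given set of thresholds induces actions.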
NASA Astrophysics Data System (ADS)
Müller, Ruben; Schütze, Niels
2014-05-01
Water resources systems with reservoirs are expected to be sensitive to climate change. Assessment studies that analyze the impact of climate change on the performance of reservoirs can be divided into two groups: (1) studies that simulate operation under projected inflows with the current set of operational rules; because the operational rules are not adapted, the future performance of these reservoirs can be underestimated and the impact overestimated; (2) studies that optimize the operational rules for best adaptation of the system to the projected conditions before the assessment of the impact. The latter allows more realistic estimates of future performance, and adaptation strategies based on new operation rules are available if required. Multi-purpose reservoirs serve various, often conflicting functions. If all functions cannot be served simultaneously at a maximum level, an effective compromise between the multiple objectives of reservoir operation has to be found. Yet under climate change the historically preferred compromise may no longer be the most suitable in the future. Therefore a multi-objective based climate change impact assessment approach for multi-purpose multi-reservoir systems is proposed in this study. Projected inflows are provided in a first step using a physically based rainfall-runoff model. In a second step, a time series model is applied to generate long-term inflow time series. Finally, the long-term inflow series are used as driving variables for a simulation-based multi-objective optimization of the reservoir system in order to derive optimal operation rules. As a result, the adapted Pareto-optimal set of diverse best compromise solutions can be presented to the decision maker in order to assist in assessing climate change adaptation measures with respect to the future performance of the multi-purpose reservoir system. The approach is tested on a multi-purpose multi-reservoir system in a mountainous catchment in Germany. 
A climate change assessment is performed for climate change scenarios based on the SRES emission scenarios A1B, B1 and A2 for a set of statistically downscaled meteorological data. The future performance of the multi-purpose multi-reservoir system is quantified and possible intensifications of trade-offs between management goals or reservoir utilizations are shown.
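The Pareto-optimal compromise set mentioned above can be extracted with a simple dominance filter. Here both objectives are minimized (e.g. flood risk and supply deficit); the candidate rule evaluations are illustrative values, not results from the German catchment:

```python
# Pareto filter over candidate operating rules, each evaluated on two
# minimized objectives (values illustrative).
def pareto_front(solutions):
    """Keep solutions not dominated by any other (minimization)."""
    front = []
    for s in solutions:
        dominated = any(
            all(o <= so for o, so in zip(other, s)) and other != s
            for other in solutions
        )
        if not dominated:
            front.append(s)
    return front

candidates = [(0.2, 0.9), (0.4, 0.4), (0.9, 0.1), (0.5, 0.5), (0.8, 0.8)]
print(pareto_front(candidates))  # → [(0.2, 0.9), (0.4, 0.4), (0.9, 0.1)]
```

Presenting the surviving non-dominated rules, rather than a single optimum, is what lets the decision maker re-select the compromise under projected future conditions.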
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Passel, Steven, E-mail: Steven.vanpassel@uhasselt.be; University of Antwerp, Department Bioscience Engineering, Groenenborgerlaan 171, 2020 Antwerp; Meul, Marijke
Sustainability assessment is needed to build sustainable farming systems. A broad range of sustainability concepts, methodologies and applications already exists. They differ in level, focus, orientation, measurement, scale, presentation and intended end-users. In this paper we illustrate that a smart combination of existing methods with different levels of application can make sustainability assessment more profound, and that it can broaden the insights of different end-user groups. An overview of sustainability assessment tools on different levels and for different end-users shows the complementarities and the opportunities of using different methods. In a case-study, a combination of the sustainable value approach (SVA) and MOTIFS is used to perform a sustainability evaluation of farming systems in Flanders. SVA is used to evaluate sustainability at sector level, and is especially useful to support policy makers, while MOTIFS is used to support and guide farmers towards sustainability at farm level. The combined use of the two methods with complementary goals can widen the insights of both farmers and policy makers, without losing the particularities of the different approaches. To stimulate and support further research and applications, we propose guidelines for multilevel and multi-user sustainability assessments. Highlights: We give an overview of sustainability assessment tools for agricultural systems. SVA and MOTIFS are used to evaluate the sustainability of dairy farming in Flanders. Combination of methods with different levels broadens the insights of different end-user groups. We propose guidelines for multilevel and multi-user sustainability assessments.
Multi-platform metabolomics assays for human lung lavage fluids in an air pollution exposure study.
Surowiec, Izabella; Karimpour, Masoumeh; Gouveia-Figueira, Sandra; Wu, Junfang; Unosson, Jon; Bosson, Jenny A; Blomberg, Anders; Pourazar, Jamshid; Sandström, Thomas; Behndig, Annelie F; Trygg, Johan; Nording, Malin L
2016-07-01
Metabolomics protocols are used to comprehensively characterize the metabolite content of biological samples by exploiting cutting-edge analytical platforms, such as gas chromatography (GC) or liquid chromatography (LC) coupled to mass spectrometry (MS) assays, as well as nuclear magnetic resonance (NMR) assays. We have developed novel sample preparation procedures combined with GC-MS, LC-MS, and NMR metabolomics profiling for analyzing bronchial wash (BW) and bronchoalveolar lavage (BAL) fluid from 15 healthy volunteers following exposure to biodiesel exhaust and filtered air. Our aim was to investigate the responsiveness of metabolite profiles in the human lung to air pollution exposure derived from combustion of biofuels, such as rapeseed methyl ester biodiesel, which are increasingly being promoted as alternatives to conventional fossil fuels. Our multi-platform approach enabled us to detect the greatest number of unique metabolites yet reported in BW and BAL fluid (82 in total). All of the metabolomics assays indicated that the metabolite profiles of the BW and BAL fluids differed appreciably, with 46 metabolites showing significantly different levels in the corresponding lung compartments. Furthermore, the GC-MS assay revealed an effect of biodiesel exhaust exposure on the levels of 1-monostearylglycerol, sucrose, inosine, nonanoic acid, and ethanolamine (in BAL) and pentadecanoic acid (in BW), whereas the LC-MS assay indicated a shift in the levels of niacinamide (in BAL). The NMR assay only identified lactic acid (in BW) as being responsive to biodiesel exhaust exposure. Our findings demonstrate that the proposed multi-platform approach is useful for wide metabolomics screening of BW and BAL fluids and can facilitate elucidation of metabolites responsive to biodiesel exhaust exposure. Graphical abstract: illustration of the study workflow.
NMR Nuclear Magnetic Resonance; LC-TOFMS Liquid Chromatography-Time-Of-Flight Mass Spectrometry; GC-MS Gas Chromatography-Mass Spectrometry.
A ℓ2, 1 norm regularized multi-kernel learning for false positive reduction in Lung nodule CAD.
Cao, Peng; Liu, Xiaoli; Zhang, Jian; Li, Wei; Zhao, Dazhe; Huang, Min; Zaiane, Osmar
2017-03-01
The aim of this paper is to describe a novel algorithm for false positive reduction in lung nodule Computer Aided Detection (CAD). We describe a new CT lung CAD method that aims to detect solid nodules. Specifically, we propose a multi-kernel classifier with an ℓ2,1 norm regularizer for heterogeneous feature fusion and selection at the feature-subset level, and design two efficient strategies to optimize the kernel weights in the non-smooth ℓ2,1-regularized multiple kernel learning algorithm. The first optimization algorithm adopts a proximal gradient method for handling the ℓ2,1 norm of the kernel weights and uses an accelerated scheme based on FISTA; the second employs an iterative scheme based on an approximate gradient descent method. The results demonstrate that the FISTA-style accelerated proximal descent method is efficient for the ℓ2,1 norm formulation of multiple kernel learning, with a theoretical guarantee on the convergence rate. Moreover, the experimental results demonstrate the effectiveness of the proposed methods in terms of geometric mean (G-mean) and area under the ROC curve (AUC), significantly outperforming the competing methods. The proposed approach exhibits remarkable advantages in both the heterogeneous feature-subset fusion and classification phases. Compared with feature-level and decision-level fusion strategies, the proposed ℓ2,1 norm multi-kernel learning algorithm is able to accurately fuse the complementary and heterogeneous feature sets, and automatically prunes irrelevant and redundant feature subsets to form a more discriminative feature set, leading to promising classification performance. Moreover, the proposed algorithm consistently outperforms comparable classification approaches in the literature. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
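The key ingredient of such a method, the proximal operator of the ℓ2,1 norm (group-wise soft-thresholding, which zeroes out whole kernel-weight groups), can be sketched as below. This is a generic illustration of the technique, not the paper's implementation; the group values and the regularization strength are hypothetical.

```python
import math

def prox_l21(groups, lam):
    """Proximal operator of lam * sum_g ||w_g||_2 (group-wise soft-thresholding).
    Groups whose l2 norm falls below lam are set entirely to zero."""
    out = []
    for g in groups:
        norm = math.sqrt(sum(x * x for x in g))
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        out.append([scale * x for x in g])
    return out

def fista_extrapolate(w, w_prev, t, t_prev):
    """FISTA-style momentum step on the group weights before the gradient/prox step."""
    return [[wi + (t_prev - 1.0) / t * (wi - pi) for wi, pi in zip(g, gp)]
            for g, gp in zip(w, w_prev)]

# Two hypothetical kernel-weight groups: one strong, one weak (pruned by the prox).
w = prox_l21([[3.0, 4.0], [0.1, 0.1]], lam=1.0)
```

In an accelerated scheme, each iteration would extrapolate with `fista_extrapolate`, take a gradient step on the smooth loss, and then apply `prox_l21`; the pruning of the weak group is what yields the automatic feature-subset selection described in the abstract.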
Mu, Lin
2018-01-01
This work introduces a number of algebraic topology approaches, including multi-component persistent homology, multi-level persistent homology, and electrostatic persistence for the representation, characterization, and description of small molecules and biomolecular complexes. In contrast to the conventional persistent homology, multi-component persistent homology retains critical chemical and biological information during the topological simplification of biomolecular geometric complexity. Multi-level persistent homology enables a tailored topological description of inter- and/or intra-molecular interactions of interest. Electrostatic persistence incorporates partial charge information into topological invariants. These topological methods are paired with Wasserstein distance to characterize similarities between molecules and are further integrated with a variety of machine learning algorithms, including k-nearest neighbors, ensemble of trees, and deep convolutional neural networks, to manifest their descriptive and predictive powers for protein-ligand binding analysis and virtual screening of small molecules. Extensive numerical experiments involving 4,414 protein-ligand complexes from the PDBBind database and 128,374 ligand-target and decoy-target pairs in the DUD database are performed to test respectively the scoring power and the discriminatory power of the proposed topological learning strategies. It is demonstrated that the present topological learning outperforms other existing methods in protein-ligand binding affinity prediction and ligand-decoy discrimination. PMID:29309403
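The pairing of topological descriptors with a Wasserstein distance can be illustrated in a drastically simplified form: comparing the lifetimes (death minus birth) of features in two persistence diagrams via a 1-D Wasserstein distance. Real diagram distances involve optimal matching against the diagonal; this sketch, with hypothetical diagrams, only conveys the idea of a metric on persistence summaries.

```python
def persistence_lifetimes(diagram):
    """Lifetimes (death - birth) of features in a (birth, death) persistence diagram."""
    return sorted(d - b for b, d in diagram)

def wasserstein_1d(u, v):
    """1-Wasserstein distance between two equal-size empirical distributions:
    the mean absolute difference of the sorted samples."""
    assert len(u) == len(v)
    return sum(abs(a - b) for a, b in zip(sorted(u), sorted(v))) / len(u)

# Hypothetical H1 diagrams for two small molecules:
mol_a = [(0.1, 0.9), (0.2, 0.4)]
mol_b = [(0.1, 0.8), (0.3, 0.4)]
d = wasserstein_1d(persistence_lifetimes(mol_a), persistence_lifetimes(mol_b))
```

Such distances between topological summaries are what feed the k-nearest-neighbor and kernel-based learners mentioned in the abstract.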
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graves, Todd L; Hamada, Michael S
2008-01-01
Good estimates of the reliability of a system make use of test data and expert knowledge at all available levels. Furthermore, by integrating all these information sources, one can determine how best to allocate scarce testing resources to reduce uncertainty. Both of these goals are facilitated by modern Bayesian computational methods. We apply these tools to examples that were previously solvable only through the use of ingenious approximations, and use genetic algorithms to guide resource allocation.
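The simplest Bayesian building block for integrating pass/fail test data at one level of a system is the conjugate Beta-Binomial update. The sketch below is a hedged illustration of that idea only; the full methodology combines multiple data levels and expert priors, and the test counts here are hypothetical.

```python
def posterior_reliability(successes, trials, prior_a=1.0, prior_b=1.0):
    """Conjugate Beta-Binomial update for pass/fail test data.
    Returns the posterior mean reliability under a Beta(prior_a, prior_b) prior."""
    a = prior_a + successes
    b = prior_b + (trials - successes)
    return a / (a + b)

# Hypothetical component-level data: 48 passes in 50 trials, uniform Beta(1,1) prior.
r = posterior_reliability(48, 50)
```

Allocating scarce tests then amounts to asking which component's posterior variance shrinks most per additional trial, which is directly computable from the updated Beta parameters.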
Multi-criteria analysis of potential recovery facilities in a reverse supply chain
NASA Astrophysics Data System (ADS)
Nukala, Satish; Gupta, Surendra M.
2005-11-01
The Analytic Hierarchy Process (AHP) has been employed by researchers for solving multi-criteria analysis problems. However, AHP is often criticized for its unbalanced scale of judgments and its failure to precisely handle the inherent uncertainty and vagueness of pair-wise comparisons. To address these drawbacks, in this paper we employ a fuzzy approach to selecting potential recovery facilities in the strategic planning of a reverse supply chain network, one that accounts for the decision maker's level of confidence in the fuzzy assessments and his/her attitude towards risk. A numerical example illustrates the methodology.
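One common way to fuzzify AHP, sketched below under stated assumptions, replaces crisp pair-wise judgments with triangular fuzzy numbers (l, m, u), aggregates rows by fuzzy geometric mean, and defuzzifies by the centroid. This is a generic fuzzy-AHP variant, not necessarily the exact formulation of the paper, and the comparison values are hypothetical.

```python
def fuzzy_weights(matrix):
    """Fuzzy-AHP weight sketch: row-wise geometric mean of triangular fuzzy
    judgments (l, m, u), centroid defuzzification, then normalization."""
    n = len(matrix)
    geo = []
    for row in matrix:
        l = m = u = 1.0
        for (lo, mid, up) in row:
            l *= lo
            m *= mid
            u *= up
        geo.append((l ** (1.0 / n), m ** (1.0 / n), u ** (1.0 / n)))
    crisp = [(l + m + u) / 3.0 for (l, m, u) in geo]  # centroid defuzzification
    total = sum(crisp)
    return [c / total for c in crisp]

# Hypothetical 2-facility comparison: facility 1 judged "about 3 times" preferable.
M = [[(1, 1, 1), (2, 3, 4)],
     [(1 / 4, 1 / 3, 1 / 2), (1, 1, 1)]]
w = fuzzy_weights(M)
```

The resulting normalized weights rank the candidate recovery facilities; confidence level and risk attitude would enter by modifying the (l, m, u) spreads before aggregation.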
NASA Astrophysics Data System (ADS)
Ma, Weiwei; Gong, Cailan; Hu, Yong; Li, Long; Meng, Peng
2015-10-01
Remote sensing technology has been broadly recognized for its convenience and efficiency in mapping vegetation, particularly in high-altitude and inaccessible areas where in-situ observations are lacking. In this study, Landsat Thematic Mapper (TM) images and Chinese environmental mitigation satellite CCD sensor (HJ-1 CCD) images, both at 30 m spatial resolution, were employed for identifying and monitoring vegetation types in an area of western China, the Qinghai Lake Watershed (QHLW). A decision classification tree (DCT) algorithm using multiple characteristics, including seasonal TM/HJ-1 CCD time series data combined with a digital elevation model (DEM) dataset, and a supervised maximum likelihood classification (MLC) algorithm with a single-date TM image were applied to vegetation classification. Accuracy of the two algorithms was assessed using field observation data. Based on the produced vegetation classification maps, the DCT using multi-season data and geomorphologic parameters was superior to the MLC algorithm using a single-date image, improving the overall accuracy by 11.86% at the second class level and significantly reducing the "salt and pepper" noise. The DCT algorithm applied to TM/HJ-1 CCD time series data and geomorphologic parameters appeared to be a valuable and reliable tool for monitoring vegetation at the first class level (5 vegetation classes) and the second class level (8 vegetation subclasses). The DCT algorithm using multiple characteristics may provide a theoretical basis and general approach to automatic extraction of vegetation types from remote sensing imagery over plateau areas.
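A decision classification tree of this kind is, at heart, a cascade of threshold rules on per-pixel features. The sketch below shows the structure only; the thresholds, class names, and feature choices are illustrative assumptions, not those calibrated in the study.

```python
def classify_pixel(ndvi_summer, ndvi_winter, elevation_m):
    """Toy decision-tree rules combining seasonal NDVI with DEM elevation.
    All thresholds and labels are hypothetical, for illustration only."""
    if ndvi_summer < 0.15:
        return "water/bare"
    if elevation_m > 3800:
        return "alpine meadow" if ndvi_summer > 0.4 else "alpine sparse vegetation"
    if ndvi_winter > 0.3:
        return "evergreen shrub"
    return "grassland"

label = classify_pixel(ndvi_summer=0.55, ndvi_winter=0.1, elevation_m=4000)
```

In practice the thresholds would be derived from training samples and spectral statistics; the multi-season inputs are what let the tree separate classes that a single-date image confuses.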
De Jong, Joop T V M
2010-01-01
Political violence, armed conflicts and human rights violations are produced by a variety of political, economic and socio-cultural factors. Conflicts can be analyzed with an interdisciplinary approach to obtain a global understanding of the relative contribution of risk and protective factors. A public health framework was designed to address these risk factors and protective factors. The framework resulted in a matrix that combined primary, secondary and tertiary interventions with their implementation on the levels of the society-at-large, the community, and the family and individual. Subsequently, the risk and protective factors were translated into multi-sectoral, multi-modal and multi-level preventive interventions involving the economy, governance, diplomacy, the military, human rights, agriculture, health, and education. Then the interventions were slotted in their appropriate place in the matrix. The interventions can be applied in an integrative form by international agencies, governments and non-governmental organizations, and molded to meet the requirements of the historic, political-economic and socio-cultural context. The framework maps the complementary fit among the different actors while engaging themselves in preventive, rehabilitative and reconstructive interventions. The framework shows how the economic, diplomatic, political, criminal justice, human rights, military, health and rural development sectors can collaborate to promote peace or prevent the aggravation or continuation of violence. A deeper understanding of the association between risk and protective factors and the developmental pathways of generic, country-specific and culture-specific factors leading to political violence is needed.
Whelan, Jillian; Love, Penny; Romanus, Anne; Pettman, Tahna; Bolton, Kristy; Smith, Erin; Gill, Tim; Coveney, John; Waters, Elizabeth; Allender, Steve
2015-01-01
Abstract Objective: Obesity is the single biggest public health threat to developed and developing economies. In concert with healthy public policy, multi-strategy, multi-level community-based initiatives appear promising in preventing obesity, with several countries trialling this approach. In Australia, multiple levels of government have funded and facilitated a range of community-based obesity prevention initiatives (CBI), heterogeneous in their funding, timing, target audience and structure. This paper aims to present a central repository of CBI operating in Australia during 2013, to facilitate knowledge exchange and shared opportunities for learning, and to guide professional development towards best practice for CBI practitioners. Methods: A comprehensive search of government, non-government and community websites was undertaken to identify CBI in Australia in 2013. This was supplemented with data drawn from available reports, personal communication and key informant interviews. The data was translated into an interactive map for use by preventive health practitioners and other parties. Results: We identified 259 CBI, with the majority (84%) having a dual focus on physical activity and healthy eating. Few initiatives (n=37) adopted a four-pronged multi-strategy approach implementing policy, built environment, social marketing and/or partnership building. Conclusion: This comprehensive overview of Australian CBI has the potential to facilitate engagement and collaboration through knowledge exchange and information sharing amongst CBI practitioners, funders, communities and researchers. Implications: An enhanced understanding of current practice highlights areas of strengths and opportunities for improvement to maximise the impact of obesity prevention initiatives. PMID:25561083
Whelan, Jillian; Love, Penny; Romanus, Anne; Pettman, Tahna; Bolton, Kristy; Smith, Erin; Gill, Tim; Coveney, John; Waters, Elizabeth; Allender, Steve
2015-04-01
Obesity is the single biggest public health threat to developed and developing economies. In concert with healthy public policy, multi-strategy, multi-level community-based initiatives appear promising in preventing obesity, with several countries trialling this approach. In Australia, multiple levels of government have funded and facilitated a range of community-based obesity prevention initiatives (CBI), heterogeneous in their funding, timing, target audience and structure. This paper aims to present a central repository of CBI operating in Australia during 2013, to facilitate knowledge exchange and shared opportunities for learning, and to guide professional development towards best practice for CBI practitioners. A comprehensive search of government, non-government and community websites was undertaken to identify CBI in Australia in 2013. This was supplemented with data drawn from available reports, personal communication and key informant interviews. The data was translated into an interactive map for use by preventive health practitioners and other parties. We identified 259 CBI, with the majority (84%) having a dual focus on physical activity and healthy eating. Few initiatives (n=37) adopted a four-pronged multi-strategy approach implementing policy, built environment, social marketing and/or partnership building. This comprehensive overview of Australian CBI has the potential to facilitate engagement and collaboration through knowledge exchange and information sharing amongst CBI practitioners, funders, communities and researchers. An enhanced understanding of current practice highlights areas of strengths and opportunities for improvement to maximise the impact of obesity prevention initiatives. © 2015 Public Health Association of Australia.
NASA Astrophysics Data System (ADS)
Chiariotti, P.; Martarelli, M.; Revel, G. M.
2017-12-01
A novel non-destructive testing procedure for delamination detection, based on exploiting the simultaneous time and spatial sampling provided by Continuous Scanning Laser Doppler Vibrometry (CSLDV) and the feature extraction capability of multi-level wavelet-based processing, is presented in this paper. The processing procedure consists of a multi-step approach. Once the optimal mother wavelet is selected as the one maximizing the Energy-to-Shannon-Entropy Ratio criterion over the mother wavelet space, a pruning operation is performed to identify the best combination of nodes inside the full binary tree given by Wavelet Packet Decomposition (WPD). The pruning algorithm combines, in a two-step way, a measure of the randomness of the point pattern distribution in the damage map space with an analysis of the energy concentration of the wavelet coefficients in those nodes provided by the first pruning operation. A combination of the point pattern distributions provided by each node of the ensemble node set from the pruning algorithm allows a Damage Reliability Index to be associated with the final damage map. The effectiveness of the whole approach is proven on both simulated and real test cases. A sensitivity analysis of the influence of noise on the CSLDV signal provided to the algorithm is also discussed, showing that the processing developed is robust to measurement noise. The method is promising: damage is well identified on different materials and for different damage-structure combinations.
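The mother-wavelet selection criterion named above, the Energy-to-Shannon-Entropy Ratio, can be sketched generically: compute the coefficient energy and the Shannon entropy of the normalized coefficient energies, and rank candidate wavelets by their ratio. The Haar detail filter below is only a stand-in for the candidate wavelet filters one would actually compare; the signal is hypothetical.

```python
import math

def energy_entropy_ratio(coeffs):
    """Energy-to-Shannon-entropy ratio of a set of wavelet coefficients;
    a higher ratio indicates a better-matched mother wavelet."""
    energy = sum(c * c for c in coeffs)
    p = [c * c / energy for c in coeffs if c != 0]
    entropy = -sum(pi * math.log(pi) for pi in p)
    return energy / entropy if entropy > 0 else float("inf")

def haar_detail(signal):
    """Single-level Haar detail coefficients (a stand-in for the candidate
    wavelet filters compared in practice)."""
    return [(signal[i] - signal[i + 1]) / math.sqrt(2.0)
            for i in range(0, len(signal) - 1, 2)]

# Hypothetical vibration slice with a localized transient:
sig = [1.0, 2.0, 1.0, 5.0, 3.0, 1.0, 1.0, 1.0]
ratio = energy_entropy_ratio(haar_detail(sig))
```

Repeating this over the candidate wavelet set and over WPD nodes gives the scores on which the selection and pruning stages described in the abstract operate.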
NASA Astrophysics Data System (ADS)
Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan
2015-10-01
Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
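The self-organising-map ingredient the model borrows can be sketched minimally: each unit learns a point in input space, and neighbours of the best-matching unit are pulled along with a Gaussian falloff. This 1-D toy version, with hypothetical two-cluster data, only illustrates the latent-variable learning step, not the full PPC-producing network.

```python
import math
import random

def train_som(data, n_units=5, epochs=200, lr=0.2, sigma=0.5, seed=0):
    """Minimal 1-D self-organising map on scalar inputs. Each unit stores a
    prototype; the best-matching unit and its neighbours move toward each sample."""
    rng = random.Random(seed)
    weights = [rng.uniform(min(data), max(data)) for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2.0 * sigma ** 2))
                weights[i] += lr * h * (x - weights[i])
    return sorted(weights)

# Two noisy input clusters near 0 and 10; prototypes should settle around both.
data = [0.0, 0.1, -0.1, 10.0, 9.9, 10.1]
w = train_som(data)
```

In the paper's setting the prototypes additionally encode input noise statistics so that population activity approximates a posterior density; the sketch covers only the topographic learning.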
Robust set-point regulation for ecological models with multiple management goals.
Guiver, Chris; Mueller, Markus; Hodgson, Dave; Townley, Stuart
2016-05-01
Population managers will often have to deal with problems of meeting multiple goals, for example, keeping at specific levels both the total population and population abundances in given stage-classes of a stratified population. In control engineering, such set-point regulation problems are commonly tackled using multi-input, multi-output proportional and integral (PI) feedback controllers. Building on our recent results for population management with single goals, we develop a PI control approach in a context of multi-objective population management. We show that robust set-point regulation is achieved by using a modified PI controller with saturation and anti-windup elements, both described in the paper, and illustrate the theory with examples. Our results apply more generally to linear control systems with positive state variables, including a class of infinite-dimensional systems, and thus have broader appeal.
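A scalar version of PI set-point regulation with saturation and a simple anti-windup element can be sketched as follows. This is a hedged illustration on a hypothetical first-order linear model with positive state (x' = -a*x + u), not the paper's multi-input, multi-output controller; gains and limits are assumed values.

```python
def simulate_pi(setpoint, steps=400, dt=0.1, kp=0.8, ki=0.5,
                u_min=0.0, u_max=2.0):
    """Discrete PI set-point regulation of x' = -a*x + u with control
    saturation and conditional-integration anti-windup."""
    a = 0.5
    x, integral = 0.0, 0.0
    for _ in range(steps):
        e = setpoint - x
        u_raw = kp * e + ki * integral
        u = min(max(u_raw, u_min), u_max)
        # anti-windup: only accumulate the integral while unsaturated
        if u == u_raw:
            integral += e * dt
        x += dt * (-a * x + u)
    return x

x_final = simulate_pi(setpoint=3.0)
```

The integral term drives the steady-state error to zero despite the model parameter `a` being unknown to the controller, which is the robustness property the paper develops for stage-structured population models.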
A 3D Freehand Ultrasound System for Multi-view Reconstructions from Sparse 2D Scanning Planes
2011-01-01
Background A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. Methods We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes. For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Results Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for a calibrated phantom.
In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Conclusions Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views. PMID:21251284
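The Hotelling-transform initialisation mentioned above amounts to centring each point cloud and rotating it onto its principal axes, which gives a coarse alignment before fine feature-based registration. The 2D sketch below illustrates the idea on hypothetical point sets; the actual system operates on 3D segmented surfaces.

```python
import numpy as np

def principal_axes_align(points):
    """Hotelling-transform style normalisation: centre a point cloud and
    project it onto its principal axes (largest-variance axis first).
    Used here as a coarse-registration initialiser."""
    pts = np.asarray(points, dtype=float)
    centred = pts - pts.mean(axis=0)
    cov = np.cov(centred.T)
    _, vecs = np.linalg.eigh(cov)      # eigenvectors, ascending eigenvalues
    return centred @ vecs[:, ::-1]     # reorder so largest variance comes first

# Two hypothetical views of the same elongated cloud, one rotated 90 degrees:
view_a = [(2, 0), (-2, 0), (0, 1), (0, -1)]
view_b = [(0, 2), (0, -2), (1, 0), (-1, 0)]
a = principal_axes_align(view_a)
b = principal_axes_align(view_b)
```

After this step both views share a common centred, axis-aligned frame (up to axis sign), so the subsequent non-linear least squares refinement starts close to the optimum.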
A 3D freehand ultrasound system for multi-view reconstructions from sparse 2D scanning planes.
Yu, Honggang; Pattichis, Marios S; Agurto, Carla; Beth Goens, M
2011-01-20
A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes. For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for a calibrated phantom.
In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views.
Co-Labeling for Multi-View Weakly Labeled Learning.
Xu, Xinxing; Li, Wen; Xu, Dong; Tsang, Ivor W
2016-06-01
It is often expensive and time consuming to collect labeled training samples in many real-world applications. To reduce human effort on annotating training samples, many machine learning techniques (e.g., semi-supervised learning (SSL), multi-instance learning (MIL), etc.) have been studied to exploit weakly labeled training samples. Meanwhile, when the training data is represented with multiple types of features, many multi-view learning methods have shown that classifiers trained on different views can help each other to better utilize the unlabeled training samples for the SSL task. In this paper, we study a new learning problem called multi-view weakly labeled learning, in which we aim to develop a unified approach to learn robust classifiers by effectively utilizing different types of weakly labeled multi-view data from a broad range of tasks including SSL, MIL and relative outlier detection (ROD). We propose an effective approach called co-labeling to solve the multi-view weakly labeled learning problem. Specifically, we model the learning problem on each view as a weakly labeled learning problem, which aims to learn an optimal classifier from a set of pseudo-label vectors generated by using the classifiers trained from other views. Unlike traditional co-training approaches using a single pseudo-label vector for training each classifier, our co-labeling approach explores different strategies to utilize the predictions from different views, biases and iterations for generating the pseudo-label vectors, making our approach more robust for real-world applications. Moreover, to further improve the weakly labeled learning on each view, we also exploit the inherent group structure in the pseudo-label vectors generated from different strategies, which leads to a new multi-layer multiple kernel learning problem. 
Promising results for text-based image retrieval on the NUS-WIDE dataset as well as news classification and text categorization on several real-world multi-view datasets clearly demonstrate that our proposed co-labeling approach achieves state-of-the-art performance for various multi-view weakly labeled learning problems including multi-view SSL, multi-view MIL and multi-view ROD.
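The pseudo-label generation step at the heart of such co-labeling schemes can be sketched in a simplified form: combine per-view classifier predictions on unlabeled samples into a pseudo-label vector, here by majority vote with ties left unlabeled. The paper explores richer strategies (multiple pseudo-label vectors across views, biases and iterations); this sketch with hypothetical predictions shows only the basic mechanism.

```python
def pseudo_labels(view_predictions):
    """Combine per-view predictions into one pseudo-label vector by majority
    vote; ties yield None (the sample stays unlabeled for the next iteration)."""
    labels = []
    for preds in zip(*view_predictions):
        counts = {}
        for p in preds:
            counts[p] = counts.get(p, 0) + 1
        best = max(counts, key=counts.get)
        tied = sum(1 for v in counts.values() if v == counts[best]) > 1
        labels.append(None if tied else best)
    return labels

# Hypothetical predictions from three views on five unlabeled samples:
view1 = [1, 0, 1, 1, 0]
view2 = [1, 1, 1, 0, 0]
view3 = [0, 1, 1, 1, 1]
pl = pseudo_labels([view1, view2, view3])
```

Each view's classifier would then be retrained on the pseudo-labeled pool produced by the other views, iterating until the labels stabilise.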
A Quantum Approach to Multi-Agent Systems (MAS), Organizations, and Control
2003-06-01
Interdependent interactions between individuals are represented approximately as vocal harmonic resonators, to which the growth rate of an organization is then fitted. W.F. Lawless, Paine College, 1235 15th Street, Augusta, GA 30901.
Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Schulz, Stefan
2015-01-01
Chronic diseases such as Type 2 Diabetes Mellitus (T2DM) place a major burden on the global health economy. T2DM care management requires a multi-disciplinary and multi-organizational approach. Because of differences in languages and terminologies, education, experience, skills, etc., such an approach poses a special interoperability challenge. The solution is a flexible, scalable, business-controlled, adaptive, knowledge-based, intelligent system following a systems-oriented, architecture-centric, ontology-based and policy-driven approach. The architecture of real systems is described using the basics and principles of the Generic Component Model (GCM). The functional aspects of the system are represented with the Business Process Modeling Notation (BPMN). The resulting system architecture is presented using GCM graphical notation, class diagrams and BPMN diagrams. The architecture-centric approach considers the compositional nature of the real-world system and its functionalities, guarantees coherence, and provides correct inferences. The level of generality provided in this paper facilitates use-case-specific adaptations of the system. In this way, intelligent, adaptive and interoperable T2DM care systems can be derived from the presented model, as demonstrated in another publication.
Multi-Hazard Interactions in Guatemala
NASA Astrophysics Data System (ADS)
Gill, Joel; Malamud, Bruce D.
2017-04-01
In this paper, we combine physical and social science approaches to develop a multi-scale regional framework for natural hazard interactions in Guatemala. The identification and characterisation of natural hazard interactions is an important input for comprehensive multi-hazard approaches to disaster risk reduction at a regional level. We use five transdisciplinary evidence sources to organise and populate our framework: (i) internationally-accessible literature; (ii) civil protection bulletins; (iii) field observations; (iv) stakeholder interviews (hazard and civil protection professionals); and (v) stakeholder workshop results. These five evidence sources are synthesised to determine an appropriate natural hazard classification scheme for Guatemala (6 hazard groups, 19 hazard types, and 37 hazard sub-types). For a national spatial extent (Guatemala), we construct and populate a "21×21" hazard interaction matrix, identifying 49 possible interactions between 21 hazard types. For a sub-national spatial extent (Southern Highlands, Guatemala), we construct and populate a "33×33" hazard interaction matrix, identifying 112 possible interactions between 33 hazard sub-types. Evidence sources are also used to constrain anthropogenic processes that could trigger natural hazards in Guatemala, and characterise possible networks of natural hazard interactions (cascades). The outcomes of this approach are among the most comprehensive interaction frameworks for national and sub-national spatial scales in the published literature. These can be used to support disaster risk reduction and civil protection professionals in better understanding natural hazards and potential disasters at a regional scale.
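A populated hazard interaction matrix can be treated as an adjacency map over hazard types, from which candidate cascade chains are enumerated by depth-limited traversal. The sketch below uses a hypothetical subset of trigger relations, not the study's actual 21x21 or 33x33 matrices.

```python
def cascades(triggers, start, max_len=3):
    """Enumerate possible hazard cascades (trigger chains) from an interaction
    map, avoiding revisits within a chain and capping chain length."""
    chains = []

    def walk(chain):
        if len(chain) > 1:
            chains.append(list(chain))
        if len(chain) == max_len:
            return
        for nxt in triggers.get(chain[-1], []):
            if nxt not in chain:
                walk(chain + [nxt])

    walk([start])
    return chains

# Hypothetical subset of a hazard interaction matrix as an adjacency map:
triggers = {
    "earthquake": ["landslide", "tsunami"],
    "landslide": ["flood"],
    "storm": ["landslide", "flood"],
}
eq_chains = cascades(triggers, "earthquake")
```

Enumerations of this kind make the interaction matrix actionable for civil protection planning, since each chain is a scenario that multi-hazard assessments can stress-test.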
NASA Astrophysics Data System (ADS)
Salucci, Marco; Tenuti, Lorenza; Nardin, Cristina; Oliveri, Giacomo; Viani, Federico; Rocca, Paolo; Massa, Andrea
2014-05-01
The application of non-destructive testing and evaluation (NDT/NDE) methodologies in civil engineering has attracted growing interest in recent years because of its potential impact in several different scenarios. As a consequence, Ground Penetrating Radar (GPR) technologies have been widely adopted as an instrument for inspecting the structural stability of buildings and for detecting cracks and voids. In this framework, the development and validation of GPR algorithms and methodologies represents one of the most active research areas within the ELEDIA Research Center of the University of Trento. In particular, great effort has been devoted to the development of inversion techniques based on the integration of deterministic and stochastic search algorithms with multi-focusing strategies. These approaches proved effective in mitigating the effects of both the nonlinearity and the ill-posedness of microwave imaging problems, the well-known issues arising in GPR inverse scattering formulations. More specifically, a regularized multi-resolution approach based on the Inexact Newton Method (INM) has recently been applied to subsurface prospecting, showing a remarkable advantage over a single-resolution implementation [1]. Moreover, the use of multi-frequency or frequency-hopping strategies, which exploit the information in GPR data collected in the time domain and transformed into its frequency components, has been proposed as well. In this framework, the effectiveness of multi-resolution multi-frequency techniques has been proven on synthetic data generated with numerical models such as GprMax [2]. The application of inversion algorithms based on Bayesian Compressive Sampling (BCS) [3][4] to GPR is also under investigation, in order to exploit their capability to provide satisfactory reconstructions in the presence of single and multiple sparse scatterers [3][4].
Furthermore, multi-scaling approaches exploiting level-set-based optimization have been developed for the qualitative reconstruction of multiple and disconnected homogeneous scatterers [5]. Finally, the real-time detection and classification of subsurface scatterers has been investigated by means of learning-by-examples (LBE) techniques, such as Support Vector Machines (SVM) [6]. Acknowledgment - This work was partially supported by COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar". References: [1] M. Salucci, D. Sartori, N. Anselmi, A. Randazzo, G. Oliveri, and A. Massa, "Imaging buried objects within the second-order Born approximation through a multiresolution regularized inexact-Newton method," 2013 International Symposium on Electromagnetic Theory (EMTS), Hiroshima, Japan, May 20-24, 2013 (invited). [2] A. Giannopoulos, "Modelling ground penetrating radar by GprMax," Construct. Build. Mater., vol. 19, no. 10, pp. 755-762, 2005. [3] L. Poli, G. Oliveri, P. Rocca, and A. Massa, "Bayesian compressive sensing approaches for the reconstruction of two-dimensional sparse scatterers under TE illumination," IEEE Trans. Geosci. Remote Sensing, vol. 51, no. 5, pp. 2920-2936, May 2013. [4] L. Poli, G. Oliveri, and A. Massa, "Imaging sparse metallic cylinders through a Local Shape Function Bayesian Compressive Sensing approach," Journal of the Optical Society of America A, vol. 30, no. 6, pp. 1261-1272, 2013. [5] M. Benedetti, D. Lesselier, M. Lambert, and A. Massa, "Multiple shapes reconstruction by means of multi-region level sets," IEEE Trans. Geosci. Remote Sensing, vol. 48, no. 5, pp. 2330-2342, May 2010. [6] L. Lizzi, F. Viani, P. Rocca, G. Oliveri, M. Benedetti, and A. Massa, "Three-dimensional real-time localization of subsurface objects - From theory to experimental validation," 2009 IEEE International Geoscience and Remote Sensing Symposium, vol. 2, pp. II-121-II-124, July 12-17, 2009.
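The frequency-hopping strategy mentioned above starts by decomposing a time-domain trace into its frequency components. A minimal sketch using a textbook DFT on a synthetic single-tone trace (the signal, sampling rate, and variable names are assumptions for illustration, not GPR data):

```python
# Transform a synthetic time-domain trace into frequency components with a
# plain textbook DFT; inversion schemes then proceed from low to high bins.
import cmath
import math

def dft(signal):
    """Discrete Fourier transform, O(n^2), for illustration only."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

n = 64                                               # samples in the window
trace = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]  # 8 cycles
spectrum = [abs(c) for c in dft(trace)]
peak = max(range(n // 2), key=lambda k: spectrum[k])
print(peak)  # dominant component at bin 8
```

A real implementation would use an FFT and sweep the inversion over a set of these bins (the "hops") rather than just locating the peak.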
Fast hierarchical knowledge-based approach for human face detection in color images
NASA Astrophysics Data System (ADS)
Jiang, Jun; Gong, Jie; Zhang, Guilin; Hu, Ruolan
2001-09-01
This paper presents a fast hierarchical knowledge-based approach for automatically detecting multi-scale upright faces in still color images. The approach consists of three levels. At the highest level, skin-like regions are determined by a skin model based on the hue and saturation attributes in HSV color space, as well as the red and green attributes in normalized color space. At level 2, a new eye model is devised to select human face candidates in the segmented skin-like regions. An important feature of the eye model is that it is independent of the scale of the human face, so faces at different scales can be found by scanning the image only once, which greatly reduces the computation time of face detection. At level 3, a human face mosaic image model, which is well matched to the physical structure of the human face, is applied to judge whether faces are present in the candidate regions. This model includes edge and gray rules. Experimental results show that the approach is highly robust and fast, with broad application prospects in human-computer interaction, video telephony, and similar settings.
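The level-1 skin test can be sketched as fixed-range checks on hue/saturation (HSV) and normalized red/green, as described above. The threshold values below are illustrative assumptions, not the paper's model parameters.

```python
# Per-pixel skin-likeness test combining HSV hue/saturation ranges with a
# normalized-RGB red-dominance check. All thresholds are assumed values.
import colorsys

def is_skin_pixel(r, g, b):
    """Classify one RGB pixel (0-255 channels) as skin-like or not."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    total = r + g + b
    if total == 0:
        return False
    rn, gn = r / total, g / total                   # normalized red / green
    hue_ok = h <= 50 / 360.0 or h >= 340 / 360.0    # reddish hues only
    sat_ok = 0.10 <= s <= 0.70
    rg_ok = rn > gn and rn > 0.35                   # skin is red-dominant
    return hue_ok and sat_ok and rg_ok

print(is_skin_pixel(220, 170, 140))  # a typical skin tone -> True
print(is_skin_pixel(40, 90, 200))    # saturated blue -> False
```

In practice such thresholds would be tuned on labeled skin samples; connected skin-like pixels then form the candidate regions passed to level 2.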
Adaptive mesh refinement and load balancing based on multi-level block-structured Cartesian mesh
NASA Astrophysics Data System (ADS)
Misaka, Takashi; Sasaki, Daisuke; Obayashi, Shigeru
2017-11-01
We developed a framework for a distributed-memory parallel computer that enables dynamic data management for adaptive mesh refinement and load balancing. We employed the simple data structure of the building cube method (BCM), in which a computational domain is divided into multi-level cubic domains, each containing the same number of grid points, realising a multi-level block-structured Cartesian mesh. Solution-adaptive mesh refinement, which works efficiently with the help of dynamic load balancing, was implemented by dividing cubes according to mesh refinement criteria. The framework was investigated with the Laplace equation in terms of adaptive mesh refinement, load balancing and parallel efficiency. It was then applied to the incompressible Navier-Stokes equations to simulate a turbulent flow around a sphere. We considered wall-adaptive cube refinement, using the non-dimensional wall distance y+ near the sphere as the refinement criterion. The results showed that the load imbalance caused by y+ adaptive mesh refinement was corrected by the present approach. To utilise the BCM framework more effectively, we also tested a cube-wise algorithm-switching scheme in which explicit and implicit time integration schemes are switched depending on the local Courant-Friedrichs-Lewy (CFL) condition in each cube.
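The refine-then-rebalance cycle can be sketched as one octree-style cube split followed by a redistribution of cubes over ranks. This is a toy illustration of the idea under assumed names, not the BCM implementation.

```python
# One refinement/rebalance cycle for a cube-based mesh: flagged cubes split
# into 8 same-sized children, then cubes are dealt round-robin to ranks.

def refine(cubes, needs_refinement):
    """Split each flagged cube into 8 children one level deeper."""
    refined = []
    for cube in cubes:
        if needs_refinement(cube):
            refined += [{"level": cube["level"] + 1} for _ in range(8)]
        else:
            refined.append(cube)
    return refined

def balance(cubes, n_ranks):
    """Assign an equal share of cubes to each rank (round-robin)."""
    return [cubes[r::n_ranks] for r in range(n_ranks)]

mesh = [{"level": 0} for _ in range(4)]
mesh = refine(mesh, lambda c: c["level"] == 0)   # refine all 4 -> 32 cubes
parts = balance(mesh, 4)
print(len(mesh), [len(part) for part in parts])  # 32 [8, 8, 8, 8]
```

A real criterion would flag cubes from solution data (e.g. the y+ wall distance above), and the balancer would also weight cubes by per-cube cost when explicit and implicit schemes are mixed.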
NASA Technical Reports Server (NTRS)
Rosenzweig, Cynthia; Solecki, William D.; Blake, Reginald; Bowman, Malcolm; Faris, Craig; Gornitz, Vivien; Horton, Radley; Jacob, Klaus; LeBlanc, Alice; Leichenko, Robin;
2010-01-01
Current rates of sea level rise and associated coastal flooding in the New York City region appear to be manageable by the stakeholders responsible for communications, energy, transportation, and water infrastructure. Projected future sea level rise and associated flooding, however, especially under rapid ice melt of the Greenland and West Antarctic ice sheets, may be beyond the range of current capacity, because an extreme event could cause flooding and inundation beyond existing planning and preparedness regimes. This paper describes the comprehensive process, approach, and tools developed by the New York City Panel on Climate Change (NPCC) in conjunction with the region's stakeholders who manage its critical infrastructure, much of which lies near the coast. It presents the adaptation approach and the sea-level rise and storm projections related to coastal risks developed through the stakeholder process. Climate change adaptation planning in New York City is characterized by a multi-jurisdictional stakeholder-scientist process, state-of-the-art scientific projections and mapping, and the development of adaptation strategies based on a risk-management approach.
Antunes, Sofia; Esposito, Antonio; Palmisano, Anna; Colantoni, Caterina; Cerutti, Sergio; Rizzo, Giovanna
2016-05-01
Extraction of the cardiac surfaces of interest from multi-detector computed tomographic (MDCT) data is a prerequisite for cardiac analysis, as well as for image-guidance procedures. Most existing methods need manual corrections, which is time-consuming. We present a fully automatic segmentation technique for the extraction of the right ventricle, left ventricular endocardium and epicardium from MDCT images. The method consists of a 3D level set surface evolution approach coupled to a new stopping function based on a multiscale directional second-derivative Gaussian filter, which is able to stop propagation precisely on the real boundary of the structures of interest. We validated the segmentation method on 18 MDCT volumes from healthy and pathologic subjects, using manual segmentation performed by a team of expert radiologists as the gold standard. Segmentation errors were assessed for each structure, resulting in a surface-to-surface mean error below 0.5 mm and a percentage of surface distances with errors less than 1 mm above 80%. Moreover, in comparison to other segmentation approaches proposed in previous work, our method showed improved accuracy (the percentage of surface distances with errors less than 1 mm increased by 8-20% for all structures). The obtained results suggest that our approach is accurate and effective for the segmentation of ventricular cavities and myocardium from MDCT images.
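The validation metrics quoted above (mean surface-to-surface error and the percentage of surface distances under 1 mm) can be sketched directly from per-vertex distances. The distance values below are illustrative, not measurements from the study.

```python
# Compute the two surface-error statistics used for validation from a list
# of per-vertex surface-to-surface distances in millimetres (assumed data).

def surface_error_stats(distances_mm, tol_mm=1.0):
    """Return (mean error, percentage of points with error below tol_mm)."""
    mean_err = sum(distances_mm) / len(distances_mm)
    pct_within = 100.0 * sum(d < tol_mm for d in distances_mm) / len(distances_mm)
    return mean_err, pct_within

dists = [0.2, 0.4, 0.3, 1.5, 0.6, 0.1, 0.8, 2.0, 0.4, 0.5]
mean_err, pct = surface_error_stats(dists)
print(round(mean_err, 2), pct)  # 0.68 80.0
```

On real meshes the per-vertex distances would come from nearest-point queries between the automatic and manual surfaces.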
ERIC Educational Resources Information Center
Kefford, Colin W.
This description of a unit for teaching about the environment at the junior high level is an experimental study. The focus of the program is the integration of several media; films and tapes play a large role in the unit. Students perform a combination of classroom work, field work, and simulated exercises; assessment procedures are described.…
ERIC Educational Resources Information Center
Pazos, Pilar; Micari, Marina; Light, Gregory
2010-01-01
Collaborative learning is being used extensively by educators at all levels. Peer-led team learning is a version of collaborative learning that has shown consistent success in science, technology, engineering and mathematics disciplines. Using a multi-phase research study we describe the development of an observation instrument that can be used to…
ERIC Educational Resources Information Center
Wilhelm, Jennifer; Toland, Michael D.; Cole, Merryn
2017-01-01
Differences were examined between groups of sixth grade students' spatial-scientific development pre/post implementation of an Earth/Space unit. Treatment teachers employed a spatially-integrated Earth/Space curriculum, while control teachers implemented their Business as Usual (BAU) Earth/Space units. A multi-level modeling approach was used in a…
A Cross-Layer Approach to Multi-Hop Networking with Cognitive Radios
2008-11-01
recent investigations. In [2], Behzad and Rubin studied the special case in which the same power level is used at each node and found that the maximum...Sep. 2, 2005. [2] A. Behzad and I. Rubin, "Impact of power control on the performance of ad hoc wireless networks," in Proc. IEEE Infocom, pp. 102-113
A Semi-Parametric Bayesian Mixture Modeling Approach for the Analysis of Judge Mediated Data
ERIC Educational Resources Information Center
Muckle, Timothy Joseph
2010-01-01
Existing methods for the analysis of ordinal-level data arising from judge ratings, such as the Multi-Facet Rasch model (MFRM, or the so-called Facets model), have been widely used in assessment in order to render fair examinee ability estimates in situations where the judges vary in their behavior or severity. However, this model makes certain…
ERIC Educational Resources Information Center
Mark, Katharine M.; Pike, Alison
2017-01-01
We investigated the association between marital quality and child behavior, assessing mother-child relationship quality as a potential mediator. The sample included 78 mothers with two target children (mean ages = 9.82 and 12.05 years, respectively). Mothers reported on their children's behavior as well as their marital quality, while each child…
ERIC Educational Resources Information Center
Lough, Emma; Fisher, Marisa H.
2016-01-01
The current study took a multi-informant approach to compare parent to self-report ratings of social vulnerability of adults with Williams syndrome (WS). Participants included 102 pairs of adults with WS and their parents. Parents completed the "Social Vulnerability Questionnaire" and adults with WS completed an adapted version of the…
ERIC Educational Resources Information Center
Araya, Saba Q.
2013-01-01
As pressure increases to ensure that limited resources are utilized as effectively as possible, funding adequacy remains a priority for all California public schools. The research was conducted through a multi-methods approach of principal interviews, site level resource allocation data, and overall student achievement on state assessments. The…
ERIC Educational Resources Information Center
Goodman, Elizabeth M.; Gill, Fobola M. L.
In 1963, the Washington, D.C. Public School Department began a special demonstration project on the secondary school level, the Webster Girls School Program, to reduce the number of dropouts due to pregnancy and recidivism. An interagency, multidisciplinary plan was devised to provide comprehensive services to the girls. Social case work, academic…
ERIC Educational Resources Information Center
Czaja, Carol F.
An increasing number of children who are both medically fragile and profoundly retarded are living to reach school age due to advances in medical technology. The provisions of Public Law 94-142, the Education for All Handicapped Children Act, bring these children within the domain of public education. A major question concerns what service delivery…
ERIC Educational Resources Information Center
Begland, Robert R.
In reviewing the Army Continuing Education System in 1979, the Assistant Secretary of the Army found a basic skills program based on traditional academic level goals was inadequate to meet the Army's requirement to provide functional, job-related basic skill education. Combining the shrinking manpower pool and projected basic skill deficiencies of…
ERIC Educational Resources Information Center
O'Rourke, James S., IV
This paper argues the importance of preparing business managers for a global marketplace and addresses who is responsible for the training and how to go about it. The establishment of an MBA (Master of Business Administration)-level course in intercultural communication is examined. Areas discussed involve determining what the goals of the course…
ERIC Educational Resources Information Center
Timmer, Susan G.; Ho, Lareina K. L.; Urquiza, Anthony J.; Zebell, Nancy M.; Fernandez y Garcia, Erik; Boys, Deanna
2011-01-01
This study uses a multi-method approach to investigate the effectiveness of Parent-Child Interaction Therapy (PCIT) in reducing children's behavior problems when parents report clinical levels of depressive symptoms. Participants were 132 children, 2-7 years of age, and their biological mothers, who either reported low (N = 78) or clinical levels…
Antecedents and trajectories of achievement goals: a self-determination theory perspective.
Ciani, Keith D; Sheldon, Kennon M; Hilpert, Jonathan C; Easter, Matthew A
2011-06-01
Research has shown that both achievement goal theory and self-determination theory (SDT) are quite useful in explaining student motivation and success in academic contexts. However, little is known about how the two theories relate to each other. The current research used SDT as a framework to understand why students enter classes with particular achievement goal profiles and how those profiles may change over time. One hundred and eighty-four undergraduate preservice teachers in a required domain course agreed to participate in the study. Data were collected at three time points during the semester, and both path modelling and multi-level longitudinal modelling techniques were used. Using path modelling with 169 students, results indicated that students' autonomy and relatedness need satisfaction in life predict their initial self-determined class motivation, which in turn predicts initial mastery-approach and -avoidance goals. Multi-level longitudinal modelling with 108 students found that perceived teacher autonomy support buffered against the general decline in students' mastery-approach goals over the course of the semester. The data provide a promising integration of SDT and achievement goal theory, posing a host of potentially fruitful future research questions regarding goal adoption and trajectories. ©2010 The British Psychological Society.
Zhang, Meiyu; Li, Erfen; Su, Yijuan; Song, Xuqin; Xie, Jingmeng; Zhang, Yingxia; He, Limin
2018-06-01
Seven drugs from different classes, namely fluoroquinolones (enrofloxacin, ciprofloxacin, sarafloxacin), sulfonamides (sulfadimidine, sulfamonomethoxine), and macrolides (tilmicosin, tylosin), were used as test compounds administered orally to chickens; a simple extraction step after cryogenic freezing can allow the effective extraction of multi-class veterinary drug residues from minced chicken muscle by vortex mixing. On the basis of the optimized freeze-thaw approach, a convenient, selective, and reproducible liquid chromatography with tandem mass spectrometry method was developed. At three spiking levels in blank and medicated chicken muscles, average recoveries of the analytes were in the range of 71-106 and 63-119%, respectively. All relative standard deviations were <20%. The limits of quantification of the analytes were 0.2-5.0 ng/g. Regardless of residue levels in the chickens, there were no significant differences (P > 0.05) in the average contents of almost any of the analytes in medicated chickens between this method and specific methods reported in the literature for the determination of individual analytes. Finally, the developed method was successfully extended to the monitoring of residues of 55 common veterinary drugs in food animal muscles. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Multiscale Modeling of Plasmon-Exciton Dynamics of Malachite Green Monolayers on Gold Nanoparticles
NASA Astrophysics Data System (ADS)
Smith, Holden; Karam, Tony; Haber, Louis; Lopata, Kenneth
A multi-scale hybrid quantum/classical approach using classical electrodynamics and a collection of discrete two-level quantum systems is used to investigate the coupling dynamics of malachite green monolayers adsorbed to the surface of a spherical gold nanoparticle (NP). This method utilizes finite difference time domain (FDTD) to describe the plasmonic response of the NP and a two-level quantum description for the molecule via the Maxwell/Liouville equation. The molecular parameters are parameterized using CASPT2 for the energies and transition dipole moments, with the dephasing lifetime fit to experiment. This approach is suited to simulating thousands of molecules on the surface of a plasmonic NP. There is good agreement with experimental extinction measurements, predicting the plasmon and molecule depletions. Additionally, this model captures the polariton peaks overlapped with a Fano-type resonance profile observed in the experimental extinction measurements. This technique shows promise for modeling plasmon/molecule interactions in chemical sensing and light harvesting in multi-chromophore systems. This material is based upon work supported by the National Science Foundation under the NSF EPSCoR Cooperative Agreement No. EPS-1003897 and the Louisiana Board of Regents Research Competitiveness Subprogram under Contract Number LEQSF(2014-17)-RD-A-0.
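The two-level molecular model can be sketched, under strong simplifying assumptions (a single molecule, on resonance, no dephasing, rotating-wave approximation), as Bloch equations integrated with an explicit Euler step. The parameters are illustrative, not the CASPT2-parameterized values used in the paper.

```python
# Resonant two-level system as Bloch equations dv/dt = -Omega*w,
# dw/dt = Omega*v, integrated with forward Euler; excited-state population
# follows the analytic Rabi formula sin^2(Omega*t/2). Illustrative only.
import math

def rabi_population(omega_rabi, t_final, dt=1e-4):
    """Excited-state population of a resonant two-level system (RWA)."""
    v, w = 0.0, -1.0          # Bloch components, ground state has w = -1
    t = 0.0
    while t < t_final:
        dv = -omega_rabi * w * dt
        dw = omega_rabi * v * dt
        v += dv
        w += dw
        t += dt
    return (1.0 + w) / 2.0    # population of the excited state

p = rabi_population(omega_rabi=2.0 * math.pi, t_final=0.25)
# analytic value: sin^2(pi * 0.25) = 0.5
print(round(p, 2))
```

The full method drives thousands of such systems with the local FDTD field and adds a dephasing term; this sketch only shows the coherent core of the molecular dynamics.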
Ray, Dipanjan; Roy, Dipanjan; Sindhu, Brahmdeep; Sharan, Pratap; Banerjee, Arpan
2017-01-01
Contemporary mental health practice primarily centers on the neurobiological and psychological processes at the individual level. However, a more careful consideration of interpersonal and other group-level attributes (e.g., interpersonal relationship, mutual trust/hostility, interdependence, and cooperation) and a better grasp of their pathology can add a crucial dimension to our understanding of mental health problems. A few recent studies have delved into the interpersonal behavioral processes in the context of different psychiatric abnormalities. Neuroimaging can supplement these approaches by providing insight into the neurobiology of interpersonal functioning. Keeping this view in mind, we discuss a recently developed approach in functional neuroimaging that calls for a shift from a focus on neural information contained within brain space to a multi-brain framework exploring the degree of similarity/dissimilarity of neural signals between multiple interacting brains. We hypothesize novel applications of quantitative neuroimaging markers like inter-subject correlation that might be able to evaluate the role of interpersonal attributes affecting an individual or a group. Empirical evidence of the use of these markers in understanding the neurobiology of social interactions is provided to argue for their application in future mental health research.
A conceptual framework for economic optimization of an animal health surveillance portfolio.
Guo, X; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2016-04-01
Decision making on hazard surveillance in livestock product chains is a multi-hazard, multi-stakeholder, and multi-criteria process that includes a variety of decision alternatives. The multi-hazard aspect means that the allocation of the scarce resource for surveillance should be optimized from the point of view of a surveillance portfolio (SP) rather than a single hazard. In this paper, we present a novel conceptual approach for economic optimization of a SP to address the resource allocation problem for a surveillance organization from a theoretical perspective. This approach uses multi-criteria techniques to evaluate the performances of different settings of a SP, taking cost-benefit aspects of surveillance and stakeholders' preferences into account. The credibility of the approach has also been checked for conceptual validity, data needs and operational validity; the application potentials of the approach are also discussed.
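A weighted-sum sketch of the multi-criteria idea above: each candidate portfolio setting is scored against normalized criteria using stakeholder-preference weights, and the best-scoring setting is selected. All names and numbers below are illustrative assumptions, not the paper's model.

```python
# Score surveillance-portfolio settings as a weighted sum of normalized
# criteria (cost-benefit plus stakeholder preferences). Illustrative data.

def portfolio_score(criteria, weights):
    """Weighted-sum score of one portfolio setting (criteria in [0, 1])."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[name] * value for name, value in criteria.items())

weights = {"net_benefit": 0.5, "hazard_coverage": 0.3, "stakeholder_support": 0.2}
settings = {
    "A": {"net_benefit": 0.8, "hazard_coverage": 0.6, "stakeholder_support": 0.4},
    "B": {"net_benefit": 0.6, "hazard_coverage": 0.9, "stakeholder_support": 0.7},
}
best = max(settings, key=lambda s: portfolio_score(settings[s], weights))
print(best)  # B
```

A weighted sum is only one of many multi-criteria aggregation rules; the conceptual framework leaves the choice of technique and of the weight-elicitation process to the surveillance organization.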
Multi-level significance of vulnerability indicators. Case study: Eastern Romania
NASA Astrophysics Data System (ADS)
Stanga, I. C.; Grozavu, A.
2012-04-01
Vulnerability assessment most frequently aims to emphasize the internal fragility of a system compared to a reference standard, to similar systems, or in relation to a given hazard. Internal fragility, whether biophysical or structural, may affect the capacity to predict, prepare for, cope with, or recover from a disaster. Thus, vulnerability is linked to resilience and adaptive capacity. From the local level to the global one, vulnerability factors and corresponding indicators differ, and their significance must be tested and validated in a well-structured conceptual and methodological framework. In this paper, the authors aim to show the real vulnerability of rural settlements in Eastern Romania through a multi-level approach. The research area, Tutova Hills, covers about 3421 sq. km and has more than 200,000 inhabitants in 421 villages characterized by poor accessibility, lack of endowments, subsistence agriculture, high pressure on the natural environment (especially on forest and soil resources), poverty, and an aging population. Factors that could influence the vulnerability of these rural settlements were inventoried and assigned to groups through a cluster analysis: habitat and technical urban facilities, infrastructure, economic, social and demographic indicators, environmental quality, management of emergency situations, etc. The first difficulty was converting qualitative variables into quantitative indicators and standardizing all values to make mathematical and statistical processing of the data possible. Secondly, the great variability of the vulnerability factors, their different measurement units and their wide ranges of variation require different standardization methods in order to obtain values between zero (minimum vulnerability) and one (maximum vulnerability).
Final vulnerability indicators were selected and integrated into a general scheme according to their significance, as determined by appropriate factor analyses: linear and logistic regression, varimax rotation, multiple-criteria decision analysis, weight of evidence, multi-criteria evaluation methods, etc. The approach started at the local level, which allows a functional and structural analysis, and was progressively translated to upper levels and to a spatial analysis. The model shows that changing the level of analysis diminishes the functional significance of some indicators and increases the discretization capacity of others, highlighting the spatial and functional complexity of vulnerability.
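The standardization step described above can be sketched as a min-max rescaling to [0, 1], inverted for indicators where high raw values mean low vulnerability. The indicator names and values are illustrative assumptions.

```python
# Min-max standardization of raw vulnerability indicators to [0, 1],
# where 0 = minimum and 1 = maximum vulnerability. Illustrative data only.

def min_max(values, inverse=False):
    """Rescale to [0, 1]; inverse=True when high raw values mean LOW vulnerability."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1.0 - s for s in scaled] if inverse else scaled

road_distance_km = [2.0, 10.0, 6.0]        # farther from roads -> more vulnerable
medical_staff = [5, 1, 3]                  # more staff -> less vulnerable
v1 = min_max(road_distance_km)             # [0.0, 1.0, 0.5]
v2 = min_max(medical_staff, inverse=True)  # [0.0, 1.0, 0.5]
print(v1, v2)
```

As the abstract notes, min-max is only one option; indicators with skewed distributions or outliers may need rank- or z-score-based standardization instead.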
The multi-facets of sustainable nanotechnology - Lessons from a nanosafety symposium.
George, Saji; Ho, Shirley S; Wong, Esther S P; Tan, Timothy Thatt Yang; Verma, Navin Kumar; Aitken, Robert J; Riediker, Michael; Cummings, Christopher; Yu, Liya; Wang, Zheng Ming; Zink, Daniele; Ng, Zhihan; Loo, Say Chye Joachim; Ng, Kee Woei
2015-05-01
An international symposium on nanosafety was held recently at the Nanyang Technological University in Singapore. Topics covered included understanding nanomaterial properties, the tools and infrastructure required for predicting hazardous outcomes, measuring nanomaterial exposure levels, systems approaches to risk assessment, and the public's perception of nanotechnology. The need for a multidisciplinary approach, across both the natural and social sciences, for developing sustainable nanotechnology solutions was heavily emphasized. This commentary highlights the major issues discussed and the commitment of the nanosafety research community in Singapore to contribute collectively to realising the vision of sustainable nanotechnology.
Fuzzy Evaluating Customer Satisfaction of Jet Fuel Companies
NASA Astrophysics Data System (ADS)
Cheng, Haiying; Fang, Guoyi
Based on the market characteristics of jet fuel companies, the paper proposes an evaluation index system for jet fuel company customer satisfaction along five dimensions: time, business, security, fee and service. A multi-level fuzzy evaluation model combining the analytic hierarchy process (AHP) with fuzzy evaluation is then given. Finally, a case study of customer satisfaction evaluation at one jet fuel company is presented; the evaluation results reflect the perceptions of the company's customers, showing that the fuzzy evaluation model is effective and efficient.
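The model described above can be sketched as a weighted-average fuzzy comprehensive evaluation: AHP-style dimension weights multiply a membership matrix over satisfaction grades, B = W · R. The weights and memberships below are illustrative assumptions, not the case-study data.

```python
# Fuzzy comprehensive evaluation B = W * R with the weighted-average
# operator: W holds AHP-style dimension weights, R holds the membership of
# each dimension in each satisfaction grade. All numbers are assumed.

WEIGHTS = {"time": 0.25, "business": 0.20, "security": 0.20,
           "fee": 0.15, "service": 0.20}          # sum to 1 (assumed AHP result)

# Membership in the grades (poor, fair, good, excellent), per dimension
MEMBERSHIP = {
    "time":     [0.1, 0.2, 0.4, 0.3],
    "business": [0.0, 0.3, 0.5, 0.2],
    "security": [0.0, 0.1, 0.4, 0.5],
    "fee":      [0.2, 0.4, 0.3, 0.1],
    "service":  [0.1, 0.2, 0.5, 0.2],
}

def fuzzy_evaluate(weights, membership):
    """Return the aggregated membership vector B = W * R."""
    grades = len(next(iter(membership.values())))
    return [sum(weights[d] * membership[d][g] for d in weights)
            for g in range(grades)]

b = fuzzy_evaluate(WEIGHTS, MEMBERSHIP)
grade = max(range(len(b)), key=lambda g: b[g])
print([round(x, 3) for x in b], grade)  # overall rating lands on grade 2 ("good")
```

The maximum-membership grade gives the overall satisfaction rating; a multi-level model applies the same aggregation first within each dimension's sub-indices and then across dimensions.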
A Conceptual Design For A Spaceborne 3D Imaging Lidar
NASA Technical Reports Server (NTRS)
Degnan, John J.; Smith, David E. (Technical Monitor)
2002-01-01
First-generation spaceborne altimetric approaches are not well suited to generating the few-meter-level horizontal resolution and decimeter-accuracy vertical (range) resolution on the global scale desired by many in the Earth and planetary science communities. The present paper discusses the major technological impediments to achieving few-meter transverse resolution globally using conventional approaches and offers a feasible conceptual design that utilizes modest-power kHz-rate lasers, array detectors, photon-counting multi-channel timing receivers, and dual-wedge optical scanners with transmitter point-ahead correction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waye, Scot
Power electronics that use high-temperature devices pose a challenge for thermal management. With the devices running at higher temperatures and having a smaller footprint, the heat fluxes increase from previous power electronic designs. This project overview presents an approach to examine and design thermal management strategies through cooling technologies to keep devices within temperature limits, dissipate the heat generated by the devices and protect electrical interconnects and other components for inverter, converter, and charger applications. This analysis, validation, and demonstration intends to take a multi-scale approach over the device, module, and system levels to reduce size, weight, and cost.
Multi-variants synthesis of Petri nets for FPGA devices
NASA Astrophysics Data System (ADS)
Bukowiec, Arkadiusz; Doligalski, Michał
2015-09-01
A new method for the synthesis of application-specific logic controllers for FPGA devices is presented. The control algorithm is specified with a control-interpreted Petri net (PT type), which allows parallel processes to be specified in an easy way. The Petri net is decomposed into state-machine-type subnets, where each subnet represents one parallel process. For this purpose, Petri net coloring algorithms are applied. Two approaches to this decomposition are presented: with doublers of macroplaces or with one global wait place. Next, the subnets are implemented in a two-level logic circuit of the controller. The levels of the logic circuit are obtained as a result of its architectural decomposition: the first-level combinational circuit is responsible for generating the next places, and the second-level decoder is responsible for generating the output symbols. Two variants of such circuits are worked out: with one shared operational memory, or with many flexible distributed memories as the decoder. Variants of Petri net decomposition and structures of logic circuits can be combined without restriction, leading to four variants of multi-variant synthesis.
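The PT-net firing semantics underlying the method can be sketched with a plain token-game rule: a transition fires when every input place holds a token, consuming one from each input and producing one in each output. The fork/join net below is an illustrative two-process example, not a controller from the paper.

```python
# Token-game semantics of a place/transition net: a transition is a pair
# (input places, output places); firing moves one token along each arc.

def fire(marking, transition):
    """Fire one transition if enabled; return the new marking, else None."""
    inputs, outputs = transition
    if not all(marking.get(p, 0) > 0 for p in inputs):
        return None                        # not enabled
    new = dict(marking)
    for p in inputs:
        new[p] -= 1
    for p in outputs:
        new[p] = new.get(p, 0) + 1
    return new

# fork: p0 starts two parallel processes p1 and p2; join: both must finish
fork = (("p0",), ("p1", "p2"))
join = (("p1", "p2"), ("p3",))

m = fire({"p0": 1}, fork)
print(m)        # {'p0': 0, 'p1': 1, 'p2': 1}
m = fire(m, join)
print(m)        # tokens joined into p3
```

Each state-machine subnet produced by the decomposition holds exactly one token at a time, which is what lets it be implemented as a conventional finite state machine in the two-level circuit.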
DReAM: Demand Response Architecture for Multi-level District Heating and Cooling Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Saptarshi; Chandan, Vikas; Arya, Vijay
In this paper, we exploit the inherent hierarchy of heat exchangers in District Heating and Cooling (DHC) networks and propose DReAM, a novel Demand Response (DR) architecture for multi-level DHC networks. DReAM serves to economize system operation while still respecting the comfort requirements of individual consumers. Contrary to many present-day DR schemes that work at consumer-level granularity, DReAM works at a level of hierarchy above buildings, i.e., substations that supply heat to a group of buildings. This improves overall DR scalability and reduces computational complexity. In the first step of the proposed approach, mathematical models of individual substations and their downstream networks are abstracted into appropriately constructed low-complexity structural forms. In the second step, this abstracted information is employed by the utility to perform DR optimization that determines the optimal heat inflow to individual substations rather than buildings, in order to achieve the targeted objectives across the network. We validate the proposed DReAM framework through experimental results under different scenarios on a test network.
An Integrated Systems Genetics and Omics Toolkit to Probe Gene Function.
Li, Hao; Wang, Xu; Rukina, Daria; Huang, Qingyao; Lin, Tao; Sorrentino, Vincenzo; Zhang, Hongbo; Bou Sleiman, Maroun; Arends, Danny; McDaid, Aaron; Luan, Peiling; Ziari, Naveed; Velázquez-Villegas, Laura A; Gariani, Karim; Kutalik, Zoltan; Schoonjans, Kristina; Radcliffe, Richard A; Prins, Pjotr; Morgenthaler, Stephan; Williams, Robert W; Auwerx, Johan
2018-01-24
Identifying genetic and environmental factors that impact complex traits and common diseases is a high biomedical priority. Here, we developed, validated, and implemented a series of multi-layered systems approaches, including (expression-based) phenome-wide association, transcriptome-/proteome-wide association, and (reverse-) mediation analysis, in an open-access web server (systems-genetics.org) to expedite the systems dissection of gene function. We applied these approaches to multi-omics datasets from the BXD mouse genetic reference population, and identified and validated associations between genes and clinical and molecular phenotypes, including previously unreported links between Rpl26 and body weight, and Cpt1a and lipid metabolism. Furthermore, through mediation and reverse-mediation analysis we established regulatory relations between genes, such as the co-regulation of BCKDHA and BCKDHB protein levels, and identified targets of transcription factors E2F6, ZFP277, and ZKSCAN1. Our multifaceted toolkit enabled the identification of gene-gene and gene-phenotype links that are robust and that translate well across populations and species, and can be universally applied to any populations with multi-omics datasets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Joint multi-object registration and segmentation of left and right cardiac ventricles in 4D cine MRI
NASA Astrophysics Data System (ADS)
Ehrhardt, Jan; Kepp, Timo; Schmidt-Richberg, Alexander; Handels, Heinz
2014-03-01
The diagnosis of cardiac function based on cine MRI requires the segmentation of cardiac structures in the images, but the problem of automatic cardiac segmentation is still open, due to the imaging characteristics of cardiac MR images and the anatomical variability of the heart. In this paper, we present a variational framework for joint segmentation and registration of multiple structures of the heart. To enable the simultaneous segmentation and registration of multiple objects, a shape prior term is introduced into a region competition approach for multi-object level set segmentation. The proposed algorithm is applied for simultaneous segmentation of the myocardium as well as the left and right ventricular blood pool in short axis cine MRI images. Two experiments are performed: first, intra-patient 4D segmentation with a given initial segmentation for one time-point in a 4D sequence, and second, a multi-atlas segmentation strategy is applied to unseen patient data. Evaluation of segmentation accuracy is done by overlap coefficients and surface distances. An evaluation based on clinical 4D cine MRI images of 25 patients shows the benefit of the combined approach compared to sole registration and sole segmentation.
Interventions to Support System-level Implementation of Health Promoting Schools: A Scoping Review
McIsaac, Jessie-Lee D.; Hernandez, Kimberley J.; Kirk, Sara F.L.; Curran, Janet A.
2016-01-01
Health promoting schools (HPS) is recognized globally as a multifaceted approach that can support health behaviours. There is increasing clarity around factors that influence HPS at a school level but limited synthesized knowledge on the broader system-level elements that may impact local implementation barriers and support uptake of a HPS approach. This study comprised a scoping review to identify, summarise and disseminate the range of research to support the uptake of a HPS approach across school systems. Two reviewers screened and extracted data according to inclusion/exclusion criteria. Relevant studies were identified using a multi-phased approach including searching electronic bibliographic databases of peer reviewed literature, hand-searching reference lists and article recommendations from experts. In total, 41 articles met the inclusion criteria for the review, representing studies across nine international school systems. Overall, studies described policies that provided high-level direction and resources within school jurisdictions to support implementation of a HPS approach. Various multifaceted organizational and professional interventions were identified, including strategies to enable and restructure school environments through education, training, modelling and incentives. A systematic realist review of the literature may be warranted to identify the types of intervention that work best for whom, in what circumstance to create healthier schools and students. PMID:26861376
Progress in multi-dimensional upwind differencing
NASA Technical Reports Server (NTRS)
Vanleer, Bram
1992-01-01
Multi-dimensional upwind-differencing schemes for the Euler equations are reviewed. On the basis of the first-order upwind scheme for a one-dimensional convection equation, the two approaches to upwind differencing are discussed: the fluctuation approach and the finite-volume approach. The usual extension of the finite-volume method to the multi-dimensional Euler equations is not entirely satisfactory, because the direction of wave propagation is always assumed to be normal to the cell faces. This leads to smearing of shock and shear waves when these are not grid-aligned. Multi-directional methods, in which upwind-biased fluxes are computed in a frame aligned with a dominant wave, overcome this problem, but at the expense of robustness. The same is true for the schemes incorporating a multi-dimensional wave model not based on multi-dimensional data but on an 'educated guess' of what they could be. The fluctuation approach offers the best possibilities for the development of genuinely multi-dimensional upwind schemes. Three building blocks are needed for such schemes: a wave model, a way to achieve conservation, and a compact convection scheme. Recent advances in each of these components are discussed; putting them all together is the present focus of a worldwide research effort. Some numerical results are presented, illustrating the potential of the new multi-dimensional schemes.
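The one-dimensional first-order upwind scheme that both the fluctuation and finite-volume approaches build on reduces to a one-line update. The periodic square-pulse test below is illustrative; with CFL number c = 1 the scheme is exact, while c < 1 produces the smearing discussed above:

```python
# First-order upwind update for u_t + a u_x = 0 with a > 0, periodic domain.
def upwind_step(u, c):
    """One time step; c = a*dt/dx is the CFL number (stable for 0 <= c <= 1)."""
    n = len(u)
    return [u[i] - c * (u[i] - u[(i - 1) % n]) for i in range(n)]

# Advect a square pulse one full period around the ring.
n = 40
u = [1.0 if 10 <= i < 20 else 0.0 for i in range(n)]
exact = list(u)
for _ in range(n):            # n steps at c = 1 shift the pulse n cells
    u = upwind_step(u, 1.0)
print(u == exact)  # True: c = 1 reproduces the exact solution
```

At c = 1 the update collapses to u[i] = u[i-1], pure translation; any smaller c adds numerical diffusion, which is the one-dimensional analogue of the cross-grid smearing described in the abstract.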
Brenzel, Logan; Young, Darwin; Walker, Damian G
2015-05-07
Few detailed facility-based costing studies of routine immunization (RI) programs have been conducted in recent years, with planners, managers and donors relying on older information or data from planning tools. To fill gaps and improve the quality of information, a multi-country study on costing and financing of routine immunization and new vaccines (EPIC) was conducted in Benin, Ghana, Honduras, Moldova, Uganda and Zambia. This paper provides the rationale for the launch of the EPIC study, and outlines the methods used in a Common Approach to facility sampling, data collection, and cost and financial-flow estimation for both the routine program and new vaccine introduction. Costing relied on an ingredients-based approach from a government perspective. Estimation of the incremental economic costs of new vaccine introduction in contexts with excess capacity is highlighted. The use of more disaggregated System of Health Accounts (SHA) coding to evaluate financial flows is presented. The EPIC studies resulted in a sample of 319 primary health care facilities, with 65% of facilities in rural areas. The EPIC studies found wide variation in total and unit costs within each country, as well as between countries. Costs increased with the scale of services and with the socio-economic status of the country. Governments are financing an increasing share of total RI financing. This study provides a wealth of high quality information on total and unit costs and financing for RI, and demonstrates the value of in-depth facility approaches. The paper discusses the lessons learned from using a standardized approach, as well as proposes further areas of methodology development. The paper discusses how results can be used for resource mobilization and allocation, improved efficiency of services at the country level, and to inform policies at the global level. Efforts at routinizing cost analysis to support sustainability efforts would be beneficial. Copyright © 2015 Elsevier Ltd. All rights reserved.
multiUQ: An intrusive uncertainty quantification tool for gas-liquid multiphase flows
NASA Astrophysics Data System (ADS)
Turnquist, Brian; Owkes, Mark
2017-11-01
Uncertainty quantification (UQ) can improve our understanding of the sensitivity of gas-liquid multiphase flows to variability about inflow conditions and fluid properties, creating a valuable tool for engineers. While non-intrusive UQ methods (e.g., Monte Carlo) are simple and robust, the cost associated with these techniques can render them unrealistic. In contrast, intrusive UQ techniques modify the governing equations by replacing deterministic variables with stochastic variables, adding complexity, but making UQ cost effective. Our numerical framework, called multiUQ, introduces an intrusive UQ approach for gas-liquid flows, leveraging a polynomial chaos expansion of the stochastic variables: density, momentum, pressure, viscosity, and surface tension. The gas-liquid interface is captured using a conservative level set approach, including a modified reinitialization equation which is robust and quadrature free. A least-squares method is leveraged to compute the stochastic interface normal and curvature needed in the continuum surface force method for surface tension. The solver is tested by applying uncertainty to one or two variables and verifying results against the Monte Carlo approach. NSF Grant #1511325.
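multiUQ itself is intrusive, but the flavor of a polynomial-chaos treatment can be illustrated non-intrusively: a three-point Gauss-Hermite rule recovers the mean of a quantity driven by a standard-normal input far more cheaply than Monte Carlo sampling. The uncertain surface-tension model below is a made-up example, not the solver's formulation:

```python
import math

# Probabilists' Gauss-Hermite 3-point rule: exact for polynomials up to
# degree 5 against a standard-normal weight. Nodes/weights are standard values.
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def stochastic_mean(f):
    """E[f(xi)] for xi ~ N(0,1), via quadrature instead of Monte Carlo."""
    return sum(w * f(x) for w, x in zip(weights, nodes))

# Hypothetical uncertain surface tension: sigma = sigma0 * exp(0.1 * xi).
sigma0 = 0.072
mean = stochastic_mean(lambda x: sigma0 * math.exp(0.1 * x))
exact = sigma0 * math.exp(0.005)   # analytic lognormal mean, for comparison
print(abs(mean - exact) < 1e-5)  # True
```

Three deterministic solves replace thousands of Monte Carlo samples here; the intrusive route in the paper goes further by carrying such expansions inside the governing equations themselves.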
Ng, Eng-Poh; Goh, Jia-Yi; Ling, Tau Chuan; Mukti, Rino R
2013-03-04
Nanoporous materials such as Mobil composite material number 41 (MCM-41) are attractive for applications such as catalysis, adsorption, supports, and carriers. Green synthesis of MCM-41 is particularly appealing because the chemical reagents are useful and valuable. We report the eco-friendly synthesis of MCM-41 nanoporous materials via a multi-cycle approach that re-uses the non-reacted reagents in the supernatant as mother liquor after separating the solid product. This approach requires only minimal chemical compensation: additional fresh reactants are added to the mother liquor, followed by pH adjustment, after each cycle of synthesis. The solid product of each successive batch was collected and characterized, while the non-reacted reagents in the supernatant can be recovered and re-used to produce a subsequent cycle of MCM-41. The multi-cycle synthesis is demonstrated up to three times in this research. This approach offers a low-cost and eco-friendly synthesis of nanoporous material, since less waste is discarded after the product has been collected and, in addition, product yield can be maintained at a high level.
PMID:23497184
Multi Objective Optimization Using Genetic Algorithm of a Pneumatic Connector
NASA Astrophysics Data System (ADS)
Salaam, HA; Taha, Zahari; Ya, TMYS Tuan
2018-03-01
The concept of sustainability was first introduced by Dr Gro Harlem Brundtland in the 1980s, promoting the need to preserve today's natural environment for the sake of future generations. Based on this concept, John Elkington proposed an approach to measuring sustainability known as the Triple Bottom Line (TBL). Three evaluation criteria are involved in the TBL approach: economics, environmental integrity, and social equity. In the manufacturing industry, manufacturing costs measure the long-term economic sustainability of a company. Environmental integrity is a measure of the impact of manufacturing activities on the environment. Social equity is complicated to evaluate, but when the focus is at the production-floor level, production operator health can be considered. In this paper, the TBL approach is applied to the manufacturing of a pneumatic nipple hose. The evaluation criteria used are manufacturing cost, environmental impact, ergonomic impact, and the energy used for manufacturing. This study involves multi-objective optimization, using a genetic algorithm, of several possible alternatives for the material used in the manufacturing of the pneumatic nipple.
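The non-dominated comparison at the core of multi-objective genetic algorithms can be sketched as follows; the two-objective design points (e.g. cost versus environmental impact, both minimised) are illustrative values, not the paper's data:

```python
# Non-dominated (Pareto) filtering for a minimisation problem -- the
# selection test inside multi-objective genetic algorithms such as NSGA-II.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the points that no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (cost, impact) pairs for candidate materials.
designs = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 3.0), (6.0, 2.0), (7.0, 5.0)]
print(pareto_front(designs))  # [(1.0, 9.0), (2.0, 7.0), (4.0, 3.0), (6.0, 2.0)]
```

A genetic algorithm repeats this test generation after generation, pushing the population toward the trade-off curve instead of a single optimum.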
A Multi-targeted Approach to Suppress Tumor-Promoting Inflammation
Samadi, Abbas K.; Georgakilas, Alexandros G.; Amedei, Amedeo; Amin, Amr; Bishayee, Anupam; Lokeshwar, Bal L.; Grue, Brendan; Panis, Carolina; Boosani, Chandra S.; Poudyal, Deepak; Stafforini, Diana M.; Bhakta, Dipita; Niccolai, Elena; Guha, Gunjan; Rupasinghe, H.P. Vasantha; Fujii, Hiromasa; Honoki, Kanya; Mehta, Kapil; Aquilano, Katia; Lowe, Leroy; Hofseth, Lorne J.; Ricciardiello, Luigi; Ciriolo, Maria Rosa; Singh, Neetu; Whelan, Richard L.; Chaturvedi, Rupesh; Ashraf, S. Salman; Kumara, HMC Shantha; Nowsheen, Somaira; Mohammed, Sulma I.; Helferich, William G.; Yang, Xujuan
2015-01-01
Cancers harbor significant genetic heterogeneity and patterns of relapse following many therapies are due to evolved resistance to treatment. While efforts have been made to combine targeted therapies, significant levels of toxicity have stymied efforts to effectively treat cancer with multi-drug combinations using currently approved therapeutics. We discuss the relationship between tumor-promoting inflammation and cancer as part of a larger effort to develop a broad-spectrum therapeutic approach aimed at a wide range of targets to address this heterogeneity. Specifically, macrophage migration inhibitory factor, cyclooxygenase-2, transcription factor nuclear factor-kappaB, tumor necrosis factor alpha, inducible nitric oxide synthase, protein kinase B, and CXC chemokines are reviewed as important antiinflammatory targets while curcumin, resveratrol, epigallocatechin gallate, genistein, lycopene, and anthocyanins are reviewed as low-cost, low toxicity means by which these targets might all be reached simultaneously. Future translational work will need to assess the resulting synergies of rationally designed antiinflammatory mixtures (employing low-toxicity constituents), and then combine this with similar approaches targeting the most important pathways across the range of cancer hallmark phenotypes. PMID:25951989
NASA Technical Reports Server (NTRS)
Griffin, Brian Joseph; Burken, John J.; Xargay, Enric
2010-01-01
This paper presents an L1 adaptive control augmentation system design for multi-input multi-output nonlinear systems in the presence of unmatched uncertainties which may exhibit significant cross-coupling effects. A piecewise continuous adaptive law is adopted and extended for applicability to multi-input multi-output systems that explicitly compensates for dynamic cross-coupling. In addition, explicit use of high-fidelity actuator models is added to the L1 architecture to reduce uncertainties in the system. The L1 multi-input multi-output adaptive control architecture is applied to the X-29 lateral/directional dynamics and results are evaluated against a similar single-input single-output design approach.
Wilfley, Denise E.; Van Buren, Dorothy J.; Theim, Kelly R.; Stein, Richard I.; Saelens, Brian E.; Ezzet, Farkad; Russian, Angela C.; Perri, Michael G.; Epstein, Leonard H.
2011-01-01
Objective Weight loss outcomes achieved through conventional behavior change interventions are prone to deterioration over time. Basic learning laboratory studies in the area of behavioral extinction and renewal and multi-level models of weight control offer clues as to why newly acquired weight loss skills are prone to relapse. According to these models, current clinic-based interventions may not be of sufficient duration or scope to allow for the practice of new skills across the multiple community contexts necessary to promote sustainable weight loss. Although longer, more intensive interventions with greater reach may hold the key to improving weight loss outcomes, it is difficult to test these assumptions in a time efficient and cost-effective manner. A research design tool that has been increasingly utilized in other fields (e.g., pharmaceuticals) is the use of biosimulation analyses. The present paper describes our research team's use of computer simulation models to assist in designing a study to test a novel, comprehensive socio-environmental treatment approach to weight loss maintenance in children ages 7 to 12 years. Methods Weight outcome data from the weight loss, weight maintenance, and follow-up phases of a recently completed randomized controlled trial (RCT) were used to describe the time course of a proposed, extended multi-level treatment program. Simulations were then conducted to project the expected changes in child percent overweight trajectories in the proposed study. Results A 12.9% decrease in percent overweight at 30 months was estimated based upon the midway point between models of “best-case” and “worst-case” weight maintenance scenarios. Conclusions Preliminary data and further analyses, including biosimulation projections, suggest that our socio-environmental approach to weight loss maintenance treatment is promising and warrants evaluation in a large-scale RCT. 
Biosimulation techniques may have utility in the design of future community-level interventions for the treatment and prevention of childhood overweight. PMID:20107468
Occupancy in community-level studies
MacKenzie, Darryl I.; Nichols, James; Royle, Andy; Pollock, Kenneth H.; Bailey, Larissa L.; Hines, James
2018-01-01
Another type of multi-species study focuses on community-level metrics such as species richness. In this chapter we detail how some of the single-species occupancy models described in earlier chapters have been applied, or extended, for use in such studies while accounting for imperfect detection. We highlight how Bayesian methods using MCMC are particularly useful in such settings for easily calculating relevant community-level summaries based on presence/absence data. These modeling approaches can be used to assess richness at a single point in time, or to investigate changes in the species pool over time.
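The single-season, single-species likelihood that such community-level models build on can be fitted in a few lines. The detection histories and the grid-search fit below are fabricated for illustration, not taken from the chapter:

```python
import math
from itertools import product

# Single-season occupancy model: psi = probability a site is occupied,
# p = probability of detecting the species on one survey, K surveys per site.
def log_likelihood(psi, p, detections, K):
    ll = 0.0
    for d in detections:
        if d > 0:   # detected at least once: the site is certainly occupied
            ll += math.log(psi) + d * math.log(p) + (K - d) * math.log(1 - p)
        else:       # never detected: occupied-but-missed, or truly absent
            ll += math.log(psi * (1 - p) ** K + (1 - psi))
    return ll

K = 4
detections = [2, 0, 1, 3, 0, 0, 2, 1]       # hypothetical per-site counts
grid = [i / 100 for i in range(1, 100)]     # grid search over (0, 1)
psi_hat, p_hat = max(product(grid, grid),
                     key=lambda t: log_likelihood(t[0], t[1], detections, K))
# psi_hat exceeds the naive 5/8 of sites with detections, because p < 1
# implies some occupied sites were simply missed.
print(round(psi_hat, 2), round(p_hat, 2))
```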
Barraza, Roberto; Velazquez-Angulo, Gilberto; Flores-Tavizón, Edith; Romero-González, Jaime; Huertas-Cardozo, José Ignacio
2016-01-01
This study examines a pathway for building urban climate-change mitigation policies by presenting a multi-dimensional and transdisciplinary approach in which technical, economic, environmental, social, and political dimensions interact. Now, more than ever, the gap between science and policymaking needs to be bridged; this will enable judicious choices regarding energy and climate-change mitigation strategies, leading to positive social impacts, in particular for at-risk populations at the local level. Through a case study in Juarez, Chihuahua, Mexico, we propose a multidimensional and transdisciplinary approach, with scientists acting as policy advisers, to improve the role of science in decision-making on mitigation policies at the local level in Mexico. PMID:27128933
NASA Astrophysics Data System (ADS)
Ghafouri, H. R.; Mosharaf-Dehkordi, M.; Afzalan, B.
2017-07-01
A simulation-optimization model is proposed for identifying the characteristics of local immiscible NAPL contaminant sources inside aquifers. This model employs the UTCHEM 9.0 software as its simulator for solving the governing equations associated with multi-phase flow in porous media. As the optimization model, a novel two-level saturation-based Imperialist Competitive Algorithm (ICA) is proposed to estimate the parameters of contaminant sources. The first level consists of three parallel independent ICAs and serves as a pre-conditioner for the second level, which is a single modified ICA. The ICA in the second level is modified by dividing each country into a number of provinces (smaller parts). Like countries in the classical ICA, these provinces are optimized through the assimilation, competition, and revolution steps of the ICA. To increase population diversity, a new approach named the "knock the base" method is proposed. The performance and accuracy of the simulation-optimization model are assessed by solving a set of two- and three-dimensional problems considering the effects of different parameters such as grid size, rock heterogeneity, and the designated monitoring networks. The numerical results indicate that this simulation-optimization model provides accurate results in fewer iterations than a model employing the classical one-level ICA. Highlights: A model is proposed to identify the characteristics of immiscible NAPL contaminant sources. The contaminant is immiscible in water, and multi-phase flow is simulated. The model is a multi-level saturation-based optimization algorithm based on the ICA. Each answer string in the second level is divided into a set of provinces. Each ICA is modified by incorporating a new "knock the base" model.
Negotiating water across levels: A peace and conflict "Toolbox" for water diplomacy
NASA Astrophysics Data System (ADS)
Grech-Madin, Charlotte; Döring, Stefan; Kim, Kyungmee; Swain, Ashok
2018-04-01
As a key policy tool, water diplomacy offers greater political engagement in the cooperative management of shared water. A range of initiatives has been dedicated to this end, almost invariably oriented around the interactions of nation states. Crucially, however, practitioners of water diplomacy also need to address water governance at sub-state levels. As a political, multi-level, and normative field, peace and conflict research offers a pluralism of approaches designed to bring actors together at all levels. Drawing upon this research, this paper offers new focal points for water diplomacy that can enhance its policy effectiveness and enrich its underlying academic current. More specifically, it presents three hitherto undervalued tools for water diplomacy: at the interstate level, uncovering the rich body of political norms that bind states to shared understandings of acceptable practice around water; at the intrastate level, incorporating ethnography of water users' and civil society groups' responses to state-led waterworks projects; and at the communal level, employing disaggregated georeferenced data on water resources in conflict-prone areas. Taken together, these analytical tools provide a multi-faceted political gauge of the dynamics of water diplomacy, and add vital impetus to develop water diplomacy across multiple levels of policy engagement.
Wang, Bao-Zhen; Chen, Zhi
2013-01-01
This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source, multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, the large amount of data required for air quality modeling, including emission sources, air quality monitoring, meteorological data, and spatial location information, is brought into an integrated modeling environment. This allows spatial variation in source distribution and meteorological conditions to be analyzed quantitatively and in greater detail. The developed modeling approach has been used to predict the spatial concentration distributions of four air pollutants (CO, NO(2), SO(2) and PM(2.5)) for the State of California. The modeling results are compared with monitoring data. Good agreement is achieved, demonstrating that the developed modeling approach can deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
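A single point-source term of the kind a multi-source Gaussian model sums can be sketched from the standard Gaussian plume formula with ground reflection. The stack parameters and the fixed dispersion coefficients below are illustrative placeholders, not the paper's calibration:

```python
import math

# Gaussian plume concentration downwind of one point source.
# Q: emission rate [g/s], u: wind speed [m/s], H: effective stack height [m],
# y: crosswind offset [m], z: height [m]; sigma_y, sigma_z: dispersion
# parameters [m], held fixed here for illustration (normally they grow with
# downwind distance and stability class).
def plume(Q, u, y, z, H, sigma_y, sigma_z):
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # ground reflection: add an image source at -H
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

c0 = plume(Q=100.0, u=5.0, y=0.0, z=0.0, H=50.0, sigma_y=60.0, sigma_z=30.0)
# The plume is symmetric about the centreline:
print(plume(100.0, 5.0, 80.0, 0.0, 50.0, 60.0, 30.0) ==
      plume(100.0, 5.0, -80.0, 0.0, 50.0, 60.0, 30.0))  # True
```

A multi-source model evaluates this kernel once per source on each receptor grid cell and sums the contributions, which is what makes the GIS layer of source locations central to the approach.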
Großkinsky, Dominik K; Syaifullah, Syahnada Jaya; Roitsch, Thomas
2018-02-12
The study of senescence in plants is complicated by diverse levels of temporal and spatial dynamics, as well as by the impact of external biotic and abiotic factors and crop plant management. Whereas the molecular mechanisms involved in developmentally regulated leaf senescence are very well understood, in particular in the annual model plant species Arabidopsis, senescence of other organs such as the flower, fruit, and root is much less studied, as is senescence in perennials such as trees. This review addresses the need to integrate multi-omics techniques and physiological phenotyping into holistic phenomics approaches to dissect the complex phenomenon of senescence; this has become feasible through major advances in the establishment of various complementary 'omics' technologies. Such an interdisciplinary approach will also need to consider knowledge from the animal field, in particular in relation to novel regulators such as small non-coding RNAs, epigenetic control, and telomere length. Characterizing phenotypes via the acquisition of high-dimensional datasets within a systems biology approach will allow us to systematically characterize the various programmes governing senescence beyond leaf senescence in Arabidopsis and to elucidate the underlying molecular processes. Such a multi-omics approach is expected to spur the application of results from model plants to agriculture, and their verification, for sustainable and environmentally friendly improvement of crop plant stress resilience and productivity, and to contribute to improvements based on postharvest physiology for the food industry and the benefit of its customers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Bi-level multi-source learning for heterogeneous block-wise missing data.
Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M; Ye, Jieping
2014-11-15
Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer's Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, another half has FDG-PET, and only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified "bi-level" learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. © 2013 Elsevier Inc. All rights reserved.
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
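A toy version of the nested procedure the paper contrasts with, solving the lower-level problem to optimality for every upper-level candidate, can be sketched as follows. The quadratic objectives are illustrative, not from the study:

```python
# Nested ("brute-force") solution of a toy bilevel problem:
#   upper level: minimise F(x, y) = x**2 + y**2 over x,
#   lower level: y must solve  min_y f(x, y) = (y - x)**2,
# whose optimum is y*(x) = x, so the bilevel optimum is x = y = 0.
# Every upper-level evaluation pays for a full lower-level solve -- the
# computational burden evolutionary bilevel methods try to sidestep.
def lower_optimum(x, ys):
    """Solve the lower-level problem for a fixed upper-level decision x."""
    return min(ys, key=lambda y: (y - x) ** 2)

def solve_bilevel(xs, ys):
    best = min(xs, key=lambda x: x**2 + lower_optimum(x, ys) ** 2)
    return best, lower_optimum(best, ys)

grid = [i / 10 for i in range(-20, 21)]
x_star, y_star = solve_bilevel(grid, grid)
print(x_star, y_star)  # 0.0 0.0
```

On a grid of n candidates per level this costs O(n^2) lower-level objective evaluations, which is why replacing the inner solve with an approximation, or co-evolving both levels, matters for realistic problem sizes.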
Automated diagnosis of interstitial lung diseases and emphysema in MDCT imaging
NASA Astrophysics Data System (ADS)
Fetita, Catalin; Chang Chien, Kuang-Che; Brillet, Pierre-Yves; Prêteux, Françoise
2007-09-01
Diffuse lung diseases (DLD) comprise a heterogeneous group of non-neoplastic diseases resulting from damage to the lung parenchyma by varying patterns of inflammation. Characterization and quantification of DLD severity using MDCT, mainly in interstitial lung diseases and emphysema, is an important issue in clinical research for the evaluation of new therapies. This paper develops a 3D automated approach for detection and diagnosis of diffuse lung diseases such as fibrosis/honeycombing, ground glass and emphysema. The proposed methodology combines multi-resolution 3D morphological filtering (exploiting the sup-constrained connection cost operator) and graph-based classification for a full characterization of the parenchymal tissue. The morphological filtering performs a multi-level segmentation of the low- and medium-attenuated lung regions as well as their classification with respect to a granularity criterion (multi-resolution analysis). The original intensity range of the CT data volume is thus reduced in the segmented data to a number of levels equal to the resolution depth used (generally ten levels). The specificity of such morphological filtering is to extract tissue patterns locally contrasting with their neighborhood and of size smaller than the resolution depth, while preserving their original shape. A multi-valued hierarchical graph describing the segmentation result is built up according to the resolution level and the adjacency of the different segmented components. The graph nodes are then enriched with the textural information carried by their associated components. A graph analysis-reorganization based on the node attributes delivers the final classification of the lung parenchyma into normal and ILD/emphysematous regions. It also makes it possible to discriminate between different types, or development stages, among the same class of diseases.
Automatic detection of multi-level acetowhite regions in RGB color images of the uterine cervix
NASA Astrophysics Data System (ADS)
Lange, Holger
2005-04-01
Uterine cervical cancer is the second most common cancer among women worldwide. Colposcopy is a diagnostic method used to detect cancer precursors and cancer of the uterine cervix, whereby a physician (colposcopist) visually inspects the metaplastic epithelium on the cervix for certain distinctly abnormal morphologic features. A contrast agent, a 3-5% acetic acid solution, is used, causing abnormal and metaplastic epithelia to turn white. The colposcopist considers diagnostic features such as the acetowhite, blood vessel structure, and lesion margin to derive a clinical diagnosis. STI Medical Systems is developing a Computer-Aided-Diagnosis (CAD) system for colposcopy -- ColpoCAD, a complex image analysis system that at its core assesses the same visual features as used by colposcopists. The acetowhite feature has been identified as one of the most important individual predictors of lesion severity. Here, we present the details and preliminary results of a multi-level acetowhite region detection algorithm for RGB color images of the cervix, including the detection of the anatomic features: cervix, os and columnar region, which are used for the acetowhite region detection. The RGB images are assumed to be glare free, either obtained by cross-polarized image acquisition or glare removal pre-processing. The basic approach of the algorithm is to extract a feature image from the RGB image that provides a good acetowhite to cervix background ratio, to segment the feature image using novel pixel grouping and multi-stage region-growing algorithms that provide region segmentations with different levels of detail, to extract the acetowhite regions from the region segmentations using a novel region selection algorithm, and finally to extract the multiple acetowhite levels from those regions using multiple thresholds. The performance of the algorithm is demonstrated using human subject data.
Koniotou, Marina; Evans, Bridie Angela; Chatters, Robin; Fothergill, Rachael; Garnsworthy, Christopher; Gaze, Sarah; Halter, Mary; Mason, Suzanne; Peconi, Julie; Porter, Alison; Siriwardena, A Niroshan; Toghill, Alun; Snooks, Helen
2015-07-10
Health services research is expected to involve service users as active partners in the research process, but few examples report how this has been achieved in practice in trials. We implemented a model to involve service users in a multi-centre randomised controlled trial in pre-hospital emergency care. We used the generic Standard Operating Procedure (SOP) from our Clinical Trials Unit (CTU) as the basis for creating a model to fit the context and population of the SAFER 2 trial. In our model, we planned to involve service users at all stages in the trial through decision-making forums at 3 levels: 1) strategic; 2) site (e.g. Wales; London; East Midlands); 3) local. We linked with charities and community groups to recruit people with experience of our study population. We collected notes of meetings alongside other documentary evidence such as attendance records and study documentation to track how we implemented our model. We involved service users at strategic, site and local level. We also added additional strategic level forums (Task and Finish Groups and Writing Days) where we included service users. Service user involvement varied in frequency and type across meetings, research stages and locations but stabilised and increased as the trial progressed. Involving service users in the SAFER 2 trial showed that it is feasible for patients, carers and potential patients who share the demographic characteristics of our study population to collaborate in a multi-centre trial at a level which suited their health, location, skills and expertise. A standard model of involvement can be tailored by adopting a flexible approach to take account of the context and complexities of a multi-site trial. Current Controlled Trials ISRCTN60481756. Registered: 13 March 2009.
Perspective: Reaches of chemical physics in biology.
Gruebele, Martin; Thirumalai, D
2013-09-28
Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry.
Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis
Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as “essential”. The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748
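The additive form of Multi-Attribute Utility Theory described above reduces to a weighted sum of normalized per-criterion utilities. A minimal sketch follows; the criteria, weights, and candidate data streams are invented for illustration and are not the framework's actual inputs.

```python
# Hypothetical criteria and weights (weights sum to 1).
criteria_weights = {"timeliness": 0.4, "coverage": 0.35, "cost": 0.25}

# Normalized single-attribute utilities in [0, 1] for two candidate
# biosurveillance data streams (scores are made up for the example).
streams = {
    "ED_visits":   {"timeliness": 0.9, "coverage": 0.6, "cost": 0.5},
    "lab_reports": {"timeliness": 0.4, "coverage": 0.9, "cost": 0.7},
}

def utility(scores, weights):
    # Additive multi-attribute utility: weighted sum over criteria.
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(streams, key=lambda s: utility(streams[s], criteria_weights),
                reverse=True)
print(ranked)  # → ['ED_visits', 'lab_reports']
```

The value of the formal framework lies less in this arithmetic than in the structured elicitation of the criteria, weights, and single-attribute utility functions from stakeholders.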
Multi-energy CT based on a prior rank, intensity and sparsity model (PRISM).
Gao, Hao; Yu, Hengyong; Osher, Stanley; Wang, Ge
2011-11-01
We propose a compressive sensing approach for multi-energy computed tomography (CT), namely the prior rank, intensity and sparsity model (PRISM). To further compress the multi-energy image for allowing the reconstruction with fewer CT data and less radiation dose, the PRISM models a multi-energy image as the superposition of a low-rank matrix and a sparse matrix (with row dimension in space and column dimension in energy), where the low-rank matrix corresponds to the stationary background over energy that has a low matrix rank, and the sparse matrix represents the rest of distinct spectral features that are often sparse. Distinct from previous methods, the PRISM utilizes the generalized rank, e.g., the matrix rank of tight-frame transform of a multi-energy image, which offers a way to characterize the multi-level and multi-filtered image coherence across the energy spectrum. Besides, the energy-dependent intensity information can be incorporated into the PRISM in terms of the spectral curves for base materials, with which the restoration of the multi-energy image becomes the reconstruction of the energy-independent material composition matrix. In other words, the PRISM utilizes prior knowledge on the generalized rank and sparsity of a multi-energy image, and intensity/spectral characteristics of base materials. Furthermore, we develop an accurate and fast split Bregman method for the PRISM and demonstrate the superior performance of the PRISM relative to several competing methods in simulations.
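The low-rank-plus-sparse split at the heart of PRISM can be illustrated with a bare-bones block-coordinate sketch. This is generic RPCA-style alternating proximal minimization with fixed thresholds, not the paper's generalized-rank model or its split Bregman solver.

```python
import numpy as np

def svt(M, tau):
    # Singular-value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # Entrywise soft thresholding: proximal operator of the l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def lowrank_plus_sparse(X, tau=1.0, lam=0.1, n_iter=200):
    # Block-coordinate descent on 0.5*||X-L-S||_F^2 + tau*||L||_* + lam*||S||_1.
    L, S = np.zeros_like(X), np.zeros_like(X)
    for _ in range(n_iter):
        L = svt(X - S, tau)   # low-rank part: stationary background over energy
        S = soft(X - L, lam)  # sparse part: distinct spectral features
    return L, S
```

In the multi-energy setting, rows of `X` would index space and columns energy bins, so `L` captures the energy-stationary background and `S` the sparse spectral deviations.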
Sea-level change during the last 2500 years in New Jersey, USA
Kemp, Andrew C.; Horton, Benjamin P.; Vane, Christopher H.; Bernhardt, Christopher E.; Corbett, D. Reide; Engelhart, Simon E.; Anisfeld, Shimon C.; Parnell, Andrew C.; Cahill, Niamh
2013-01-01
Relative sea-level changes during the last ∼2500 years in New Jersey, USA were reconstructed to test if late Holocene sea level was stable or included persistent and distinctive phases of variability. Foraminifera and bulk-sediment δ13C values were combined to reconstruct paleomarsh elevation with decimeter precision from sequences of salt-marsh sediment at two sites using a multi-proxy approach. The additional paleoenvironmental information provided by bulk-sediment δ13C values reduced vertical uncertainty in the sea-level reconstruction by about one third of that estimated from foraminifera alone using a transfer function. The history of sediment deposition was constrained by a composite chronology. An age–depth model developed for each core enabled reconstruction of sea level with multi-decadal resolution. Following correction for land-level change (1.4 mm/yr), four successive and sustained (multi-centennial) sea-level trends were objectively identified and quantified (95% confidence interval) using error-in-variables change point analysis to account for age and sea-level uncertainties. From at least 500 BC to 250 AD, sea level fell at 0.11 mm/yr. The second period saw sea level rise at 0.62 mm/yr from 250 AD to 733 AD. Between 733 AD and 1850 AD, sea level fell at 0.12 mm/yr. The reconstructed rate of sea-level rise since ∼1850 AD was 3.1 mm/yr and represents the most rapid period of change for at least 2500 years. This trend began between 1830 AD and 1873 AD. Since this change point, reconstructed sea-level rise is in agreement with regional tide-gauge records and exceeds the global average estimate for the 20th century. These positive and negative departures from background rates demonstrate that the late Holocene sea level was not stable in New Jersey.
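The change point idea can be demonstrated with a much simpler stand-in: a grid search over a single break, fitting one line on each side and keeping the split with the smallest squared error. This ignores the age and sea-level uncertainties the paper's error-in-variables analysis handles; the synthetic record below only mimics the reported ∼1850 AD acceleration.

```python
import numpy as np

def best_changepoint(t, y):
    # Try every admissible split; fit a line on each side; keep the split
    # that minimizes the total sum of squared residuals.
    best_t, best_sse = None, np.inf
    for k in range(3, len(t) - 3):
        sse = 0.0
        for tt, yy in ((t[:k], y[:k]), (t[k:], y[k:])):
            coef = np.polyfit(tt, yy, 1)
            sse += np.sum((np.polyval(coef, tt) - yy) ** 2)
        if sse < best_sse:
            best_t, best_sse = t[k], sse
    return best_t

# Synthetic sea-level record (meters): -0.12 mm/yr before 1850, +3.1 mm/yr after.
t = np.arange(1700, 2000, 10).astype(float)
y = np.where(t < 1850, -0.12 * (t - 1700), -0.12 * 150 + 3.1 * (t - 1850)) / 1000.0
bp = best_changepoint(t, y)
print(bp)
```

The published analysis instead samples change points jointly with the age–depth and elevation uncertainties, which is what lets it attach a 95% interval (1830–1873 AD) to the break rather than a single date.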
Construction of Covariance Functions with Variable Length Fields
NASA Technical Reports Server (NTRS)
Gaspari, Gregory; Cohn, Stephen E.; Guo, Jing; Pawson, Steven
2005-01-01
This article focuses on construction, directly in physical space, of three-dimensional covariance functions parametrized by a tunable length field, and on an application of this theory to reproduce the Quasi-Biennial Oscillation (QBO) in the Goddard Earth Observing System, Version 4 (GEOS-4) data assimilation system. These covariance models are referred to as multi-level or nonseparable, to associate them with the application where a multi-level covariance with a large troposphere-to-stratosphere length field gradient is used to reproduce the QBO from sparse radiosonde observations in the tropical lower stratosphere. The multi-level covariance functions extend well-known single-level covariance functions depending only on a length scale. Generalizations of the first- and third-order autoregressive covariances in three dimensions are given, providing multi-level covariances with zero and three derivatives at zero separation, respectively. Multi-level piecewise rational covariances with two continuous derivatives at zero separation are also provided. Multi-level power-law covariances are constructed with continuous derivatives of all orders. Additional multi-level covariance functions are constructed using the Schur product of single- and multi-level covariance functions. A multi-level power-law covariance used to reproduce the QBO in GEOS-4 is described along with details of the assimilation experiments. The new covariance model is shown to represent the vertical wind shear associated with the QBO much more effectively than in the baseline GEOS-4 system.
Ray Casting of Large Multi-Resolution Volume Datasets
NASA Astrophysics Data System (ADS)
Lux, C.; Fröhlich, B.
2009-04-01
High quality volume visualization through ray casting on graphics processing units (GPU) has become an important approach for many application domains. We present a GPU-based, multi-resolution ray casting technique for the interactive visualization of massive volume data sets commonly found in the oil and gas industry. Large volume data sets are represented as a multi-resolution hierarchy based on an octree data structure. The original volume data is decomposed into small bricks of a fixed size acting as the leaf nodes of the octree. These nodes are the highest resolution of the volume. Coarser resolutions are represented through inner nodes of the hierarchy which are generated by down sampling eight neighboring nodes on a finer level. Due to limited memory resources of current desktop workstations and graphics hardware only a limited working set of bricks can be locally maintained for a frame to be displayed. This working set is chosen to represent the whole volume at different local resolution levels depending on the current viewer position, transfer function and distinct areas of interest. During runtime the working set of bricks is maintained in CPU- and GPU memory and is adaptively updated by asynchronously fetching data from external sources like hard drives or a network. The CPU memory hereby acts as a secondary level cache for these sources from which the GPU representation is updated. Our volume ray casting algorithm is based on a 3D texture-atlas in GPU memory. This texture-atlas contains the complete working set of bricks of the current multi-resolution representation of the volume. This enables the volume ray casting algorithm to access the whole working set of bricks through only a single 3D texture. For traversing rays through the volume, information about the locations and resolution levels of visited bricks are required for correct compositing computations. 
We encode this information into a small 3D index texture which represents the current octree subdivision on its finest level and spatially organizes the bricked data. This approach allows us to render a bricked multi-resolution volume data set utilizing only a single rendering pass with no loss of compositing precision. In contrast, most state-of-the-art volume rendering systems handle the bricked data as individual 3D textures, which are rendered one at a time while the results are composited into a lower precision frame buffer. Furthermore, our method enables us to integrate advanced volume rendering techniques like empty-space skipping, adaptive sampling and preintegrated transfer functions in a very straightforward manner with virtually no extra cost. Our interactive volume ray casting implementation allows high quality visualizations of massive volume data sets of tens of gigabytes in size on standard desktop workstations.
Pereira, Suzanne; Névéol, Aurélie; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J
2008-11-06
To assist with the development of a French online quality-controlled health gateway (CISMeF), an automatic indexing tool assigning MeSH descriptors to medical text in French was created. The French Multi-Terminology Indexer (F-MTI) relies on a multi-terminology approach involving four prominent medical terminologies and the mappings between them. In this paper, we compare lemmatization and stemming as methods to process French medical text for indexing. We also evaluate the multi-terminology approach implemented in F-MTI. The indexing strategies were assessed on a corpus of 18,814 resources indexed manually. There is little difference in indexing performance when lemmatization or stemming is used. However, the multi-terminology approach outperforms indexing relying on a single terminology in terms of recall. F-MTI will soon be used in the CISMeF production environment and in a Health Multi-Terminology Server in French.
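The role of word normalization in this kind of dictionary-based indexing can be sketched in a few lines. The one-entry lexicon and the deliberately crude suffix stripper below are toy stand-ins, not F-MTI's terminologies or a real French stemmer or lemmatizer; both stemming and lemmatization would map the inflected form to the same entry here, which is consistent with the small performance gap the paper reports between them.

```python
def crude_stem(word):
    # Toy normalizer: strip a plural -s only (a real stemmer does far more).
    return word[:-1] if word.endswith("s") else word

# Toy lexicon: French term -> MeSH label (illustrative single entry).
lexicon = {"cardiopathie": "Heart Diseases"}
stemmed_lexicon = {crude_stem(k): v for k, v in lexicon.items()}

text = "cardiopathies congénitales"
hits_exact = [lexicon[w] for w in text.split() if w in lexicon]
hits_stemmed = [stemmed_lexicon[crude_stem(w)] for w in text.split()
                if crude_stem(w) in stemmed_lexicon]
print(hits_exact, hits_stemmed)  # → [] ['Heart Diseases']
```

Without normalization the plural surface form misses the descriptor entirely; with it, the match succeeds, and the multi-terminology gain reported in the abstract comes from applying the same matching against several vocabularies at once.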
On the Integration of Remote Experimentation into Undergraduate Laboratories--Pedagogical Approach
ERIC Educational Resources Information Center
Esche, Sven K.
2005-01-01
This paper presents an Internet-based open approach to laboratory instruction. In this article, the author talks about an open laboratory approach using a multi-user multi-device remote facility. This approach involves both the direct contact with the computer-controlled laboratory setup of interest with the students present in the laboratory…
Neuropharmacology beyond reductionism - A likely prospect.
Margineanu, Doru Georg
2016-03-01
Neuropharmacology had several major past successes, but the last few decades did not witness any leap forward in the drug treatment of brain disorders. Moreover, current drugs used in neurology and psychiatry alleviate the symptoms while hardly curing any cause of disease, basically because the etiology of most neuro-psychic syndromes is but poorly known. This review argues that this largely derives from the unbalanced prevalence in neuroscience of the analytic reductionist approach, focused on the cellular and molecular level, while the understanding of integrated brain activities remains much weaker. The decline of drug discovery output in the last decades, quite obvious in neuropharmacology, coincided with the advent of the single-target-focused search for potent ligands selective for a well-defined protein deemed critical in a given pathology. However, all the widespread neuro-psychic troubles are multi-mechanistic and polygenic, and their complex etiology makes them ill-suited to single-target drug discovery. An evolving approach based on systems biology considers that a disease expresses a disturbance of the network of interactions underlying organismic functions, rather than alteration of single molecular components. Accordingly, systems pharmacology seeks to restore a disturbed network via multi-targeted drugs. This review notes that neuropharmacology in fact already relies on multi-target drugs, a feature that arose simply because those drugs were selected by phenotypic screening in vivo, or emerged from serendipitous clinical observations. The novel systems pharmacology aims, however, to devise ab initio multi-target drugs that will appropriately act on multiple molecular entities. Though this is a task much more complex than the single-target strategy, major informatics resources and computational tools for the systems approach to drug discovery are already in place, and their rapid progress forecasts promising outcomes for neuropharmacology.
Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Freer, Phoebe E.; Slanetz, Priscilla J.; Haas, Jennifer S.; Tung, Nadine M.; Hughes, Kevin S.; Armstrong, Katrina; Semine, A. Alan; Troyan, Susan L.; Birdwell, Robyn L.
2015-01-01
Purpose Stemming from breast density notification legislation in Massachusetts effective 2015, we sought to develop a collaborative evidence-based approach to density notification that could be used by practitioners across the state. Our goal was to develop an evidence-based consensus management algorithm to help patients and health care providers follow best practices to implement a coordinated, evidence-based, cost-effective, sustainable practice and to standardize care in recommendations for supplemental screening. Methods We formed the Massachusetts Breast Risk Education and Assessment Task Force (MA-BREAST), a multi-institutional, multi-disciplinary panel of expert radiologists, surgeons, primary care physicians, and oncologists to develop a collaborative approach to density notification legislation. Using evidence-based data from the Institute for Clinical and Economic Review (ICER), the Cochrane review, National Comprehensive Cancer Network (NCCN) guidelines, American Cancer Society (ACS) recommendations, and American College of Radiology (ACR) appropriateness criteria, the group collaboratively developed an evidence-based best-practices algorithm. Results The expert consensus algorithm uses breast density as one element in the risk stratification to determine the need for supplemental screening. Women with dense breasts who are otherwise at low risk (<15% lifetime risk) do not routinely require supplemental screening per the expert consensus. Women at high risk (>20% lifetime risk) should consider supplemental screening MRI in addition to routine mammography regardless of breast density. Conclusion We report the development of a multi-disciplinary collaborative approach to density notification. We propose a risk stratification algorithm to assess personal level of risk to determine the need for supplemental screening for an individual woman. PMID:26290416
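The risk-stratified branching the abstract describes can be sketched as a small decision function. Only the two thresholds (<15% low, >20% high lifetime risk) and the two stated recommendations come from the abstract; the intermediate branch and the wording are illustrative, and none of this is clinical guidance.

```python
def supplemental_screening(lifetime_risk_pct, dense_breasts):
    # High risk: consider MRI in addition to mammography, regardless of density.
    if lifetime_risk_pct > 20:
        return "routine mammography + consider supplemental MRI"
    # Dense breasts but otherwise low risk: no routine supplemental screening.
    if lifetime_risk_pct < 15 and dense_breasts:
        return "routine mammography only"
    # Intermediate risk: the abstract does not specify; placeholder branch.
    return "individualized discussion (intermediate risk)"

print(supplemental_screening(25, dense_breasts=False))
print(supplemental_screening(10, dense_breasts=True))
```

The point of the consensus algorithm is precisely that density alone does not trigger supplemental screening; it enters only as one input to the risk stratification.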
Zhang, Jingpu; Zhang, Zuping; Wang, Zixiang; Liu, Yuting; Deng, Lei
2018-05-15
Long non-coding RNAs (lncRNAs) are an enormous collection of functional non-coding RNAs. Over the past decades, a large number of novel lncRNA genes have been identified. However, most lncRNAs remain functionally uncharacterized. Computational approaches provide new insight into the potential functional implications of lncRNAs. Considering that each lncRNA may have multiple functions and a function may be further specialized into sub-functions, here we describe NeuraNetL2GO, a computational ontological function prediction approach for lncRNAs using a hierarchical multi-label classification strategy based on multiple neural networks. The neural networks are incrementally trained level by level, each performing the prediction of gene ontology (GO) terms belonging to a given level. In NeuraNetL2GO, we use topological features of the lncRNA similarity network as the input of the neural networks and employ the output results to annotate the lncRNAs. We show that NeuraNetL2GO achieves the best overall performance in maximum F-measure and coverage on the manually annotated lncRNA2GO-55 dataset compared to other state-of-the-art methods. The source code and data are available at http://denglab.org/NeuraNetL2GO/. leideng@csu.edu.cn. Supplementary data are available at Bioinformatics online.
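The level-by-level training scheme can be sketched with one multi-label network per ontology level. Everything below is synthetic: random features stand in for the similarity-network topology, random indicator matrices for the per-level GO annotations, and a small scikit-learn MLP for the paper's networks.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))   # stand-in for lncRNA network topological features

# One binary indicator matrix of GO-term labels per ontology level (synthetic):
# 3 terms at the first level, 5 more specialized terms at the next.
y_per_level = [rng.integers(0, 2, size=(60, k)) for k in (3, 5)]

# Train one network per level, mirroring the level-wise hierarchical strategy.
models = []
for y in y_per_level:
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300, random_state=0)
    clf.fit(X, y)               # multi-label fit via the indicator matrix
    models.append(clf)

# Annotate new lncRNAs level by level.
preds = [m.predict(X[:5]) for m in models]
```

A faithful implementation would additionally enforce hierarchy consistency between levels (a child GO term should not be predicted without its parent), which the per-level models here do not do on their own.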
Bush Encroachment Mapping for Africa - Multi-Scale Analysis with Remote Sensing and GIS
NASA Astrophysics Data System (ADS)
Graw, V. A. M.; Oldenburg, C.; Dubovyk, O.
2015-12-01
Bush encroachment is a global problem that especially affects the savanna ecosystems of Africa. Livestock are directly affected by the shrinking grasslands and inedible invasive species that define the process of bush encroachment. For many small-scale farmers in developing countries, livestock represents a type of insurance in times of crop failure or drought. Beyond that, bush encroachment is also a problem for crop production. Studies on the mapping of bush encroachment so far focus on small scales using high-resolution data and rarely provide information beyond the national level. Therefore a process chain was developed using a multi-scale approach to detect bush encroachment for the whole of Africa. The bush encroachment map is calibrated with ground truth data provided by experts in Southern, Eastern and Western Africa. By up-scaling location-specific information across different levels of remote sensing imagery - 30 m with Landsat images and 250 m with MODIS data - a map is created showing potential and actual areas of bush encroachment on the African continent, thereby providing an innovative approach to mapping bush encroachment at the regional scale. A classification approach links location data based on GPS information from experts to the respective pixels in the remote sensing imagery. Supervised classification is used, with actual bush encroachment observations serving as the training samples for the up-scaling. The classification technique is based on Random Forests and regression trees, a machine learning classification approach. Working on multiple scales and with the help of field data, the approach maps areas affected by bush encroachment on the African continent. This information can help to prevent further grassland decrease and identify those regions where land management strategies are of high importance to sustain livestock keeping and thereby also secure livelihoods in rural areas.
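The supervised up-scaling step, training on labeled field points and then classifying every pixel of a raster, can be sketched as follows. The data are synthetic stand-ins: random values play the role of per-pixel spectral features, and a simple rule generates the "field" labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
# Synthetic stand-in for per-pixel features (e.g. spectral bands/indices).
X_train = rng.normal(size=(200, 6))
# Stand-in for GPS field labels: 1 = bush encroachment, 0 = not.
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)

# Random Forest trained on the labeled field points.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# "Up-scale": classify every pixel of a (here tiny) raster of the same features.
raster = rng.normal(size=(10, 10, 6))
encroachment_map = clf.predict(raster.reshape(-1, 6)).reshape(10, 10)
```

In the real workflow the training features would be sampled from Landsat/MODIS pixels at the expert GPS locations, and the raster would be the continental image stack at 30 m or 250 m resolution.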
Model and Analytic Processes for Export License Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.
2011-09-29
This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level.
An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Nets, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
NASA Astrophysics Data System (ADS)
Garner, G. G.; Keller, K.
2017-12-01
Sea-level rise poses considerable risks to coastal communities, ecosystems, and infrastructure. Decision makers are faced with deeply uncertain sea-level projections when designing a strategy for coastal adaptation. The traditional methods have provided tremendous insight into this decision problem, but are often silent on tradeoffs as well as the effects of tail-area events and of potential future learning. Here we reformulate a simple sea-level rise adaptation model to address these concerns. We show that Direct Policy Search yields improved solution quality, with respect to Pareto-dominance in the objectives, over the traditional approach under uncertain sea-level rise projections and storm surge. Additionally, the new formulation produces high quality solutions with less computational demands than the traditional approach. Our results illustrate the utility of multi-objective adaptive formulations for the example of coastal adaptation, the value of information provided by observations, and point to wider-ranging application in climate change adaptation decision problems.
Multi-objective spatial tools to inform maritime spatial planning in the Adriatic Sea.
Depellegrin, Daniel; Menegon, Stefano; Farella, Giulio; Ghezzo, Michol; Gissi, Elena; Sarretta, Alessandro; Venier, Chiara; Barbanti, Andrea
2017-12-31
This research presents a set of multi-objective spatial tools for sea planning and environmental management in the Adriatic Sea Basin. The tools address four objectives: 1) assessment of cumulative impacts from anthropogenic sea uses on environmental components of marine areas; 2) analysis of sea use conflicts; 3) 3-D hydrodynamic modelling of nutrient dispersion (nitrogen and phosphorus) from riverine sources in the Adriatic Sea Basin and 4) marine ecosystem services capacity assessment from seabed habitats based on an ES matrix approach. Geospatial modelling results were illustrated, analysed and compared at the country level and for three biogeographic subdivisions (Northern, Central and Southern Adriatic Sea). The paper discusses the spatial implications of the model results, their relevance for sea planning and their limitations, and concludes with an outlook on the need for more integrated, multi-functional tool development for sea planning. Copyright © 2017. Published by Elsevier B.V.
Multi-look fusion identification: a paradigm shift from quality to quantity in data samples
NASA Astrophysics Data System (ADS)
Wong, S.
2009-05-01
A multi-look identification method known as score-level fusion is found to be capable of achieving very high identification accuracy, even when low-quality target signatures are used. Analysis using measured ground vehicle radar signatures has shown that a 97% correct identification rate can be achieved using this multi-look fusion method; in contrast, only a 37% accuracy rate is obtained when a single target signature input is used. The results suggest that quantity can be used to replace quality of the target data in improving identification accuracy. With advances in sensor technology, a large number of target signatures of marginal quality can be captured routinely. This quantity-over-quality approach allows maximum exploitation of the available data to improve target identification performance, and it could potentially be developed into a disruptive technology.
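The score-level fusion scheme named above can be sketched in a few lines: per-class confidence scores from each look are averaged before the final decision, so that no single noisy look dominates. The class names and scores below are illustrative stand-ins, not the paper's measured radar data.

```python
# Minimal sketch of score-level fusion across multiple looks at a target.
# Each look yields a dict of class -> confidence; fusion averages per class.

def fuse_and_identify(look_scores):
    """look_scores: list of dicts mapping class -> confidence for one look.
    Returns the class with the highest mean score across all looks."""
    classes = look_scores[0].keys()
    fused = {c: sum(s[c] for s in look_scores) / len(look_scores) for c in classes}
    return max(fused, key=fused.get)

# Three noisy single looks: individually ambiguous, but fusion recovers "tank".
looks = [
    {"tank": 0.40, "truck": 0.35, "apc": 0.25},
    {"tank": 0.30, "truck": 0.45, "apc": 0.25},
    {"tank": 0.50, "truck": 0.20, "apc": 0.30},
]
print(fuse_and_identify(looks))  # prints: tank
```

Note that the second look alone would have misidentified the target as "truck"; averaging across looks is what converts many marginal-quality signatures into one reliable decision.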
Development of Prototype HTS Components for Magnetic Suspension Applications
NASA Technical Reports Server (NTRS)
Haldar, P.; Hoehn, J., Jr.; Selvamanickam, V.; Farrell, R. A.; Balachandran, U.; Iyer, A. N.; Peterson, E.; Salazar, K.
1996-01-01
We have concentrated on developing prototype lengths of bismuth- and thallium-based silver-sheathed superconductors by the powder-in-tube approach to fabricate high temperature superconducting (HTS) components for magnetic suspension applications. Long lengths of mono- and multi-filament tapes are presently being fabricated with critical current densities useful for maglev and many other applications. We have recently demonstrated the prototype manufacture of lengths exceeding 1 km of Bi-2223 multi-filament conductor. Long lengths of thallium-based multi-filament conductor have also been fabricated with practical levels of critical current density and improved field-dependence behavior. Test coils and magnets have been built from these lengths and characterized over a range of temperatures and background fields to determine their performance. Work is in progress to develop, fabricate and test HTS windings suitable for magnetic suspension, levitation and other electric power related applications.
Multi-scale model for the hierarchical architecture of native cellulose hydrogels.
Martínez-Sanz, Marta; Mikkelsen, Deirdre; Flanagan, Bernadine; Gidley, Michael J; Gilbert, Elliot P
2016-08-20
The structure of protiated and deuterated cellulose hydrogels has been investigated using a multi-technique approach combining small-angle scattering with diffraction, spectroscopy and microscopy. A model for the multi-scale structure of native cellulose hydrogels is proposed which highlights the essential role of water at different structural levels characterised by: (i) the existence of cellulose microfibrils containing an impermeable crystalline core surrounded by a partially hydrated paracrystalline shell, (ii) the creation of a strong network of cellulose microfibrils held together by hydrogen bonding to form cellulose ribbons and (iii) the differential behaviour of tightly bound water held within the ribbons compared to bulk solvent. Deuterium labelling provides an effective platform on which to further investigate the role of different plant cell wall polysaccharides in cellulose composite formation through the production of selectively deuterated cellulose composite hydrogels. Copyright © 2016 Elsevier Ltd. All rights reserved.
Public data and open source tools for multi-assay genomic investigation of disease.
Kannan, Lavanya; Ramos, Marcel; Re, Angela; El-Hachem, Nehme; Safikhani, Zhaleh; Gendoo, Deena M A; Davis, Sean; Gomez-Cabrero, David; Castelo, Robert; Hansen, Kasper D; Carey, Vincent J; Morgan, Martin; Culhane, Aedín C; Haibe-Kains, Benjamin; Waldron, Levi
2016-07-01
Molecular interrogation of a biological sample through DNA sequencing, RNA and microRNA profiling, proteomics and other assays has the potential to provide a systems-level approach to predicting treatment response and disease progression, and to developing precision therapies. Large publicly funded projects have generated extensive and freely available multi-assay data resources; however, bioinformatic and statistical methods for the analysis of such experiments are still nascent. We review multi-assay genomic data resources in the areas of clinical oncology, pharmacogenomics and other perturbation experiments, population genomics, regulatory genomics and other areas, as well as tools for data acquisition. Finally, we review bioinformatic tools that are explicitly geared toward integrative genomic data visualization and analysis. This review provides starting points for accessing publicly available data and tools to support development of needed integrative methods. © The Author 2015. Published by Oxford University Press.
Qiu, Wang-Ren; Zheng, Quan-Shu; Sun, Bi-Qian; Xiao, Xuan
2017-03-01
Predicting protein phosphorylation is a challenging problem, particularly when query proteins have multi-label features, meaning that they may be phosphorylated at two or more different types of amino acid. In fact, human proteins are usually phosphorylated at serine, threonine and tyrosine. By introducing the "multi-label learning" approach, a novel predictor has been developed that can deal with systems containing both single- and multi-label phosphorylation proteins. Here we propose a predictor called Multi-iPPseEvo by (1) incorporating protein sequence evolutionary information into the general pseudo amino acid composition (PseAAC) via grey system theory, (2) balancing out the skewed training datasets by an asymmetric bootstrap approach, and (3) constructing an ensemble predictor by fusing an array of individual random forest classifiers through a voting system. Rigorous cross-validations via a set of multi-label metrics indicate that the multi-label phosphorylation predictor is very promising and encouraging. The current approach represents a new strategy for dealing with multi-label biological problems, and the software is freely available for academic use at http://www.jci-bioinfo.cn/Multi-iPPseEvo. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
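The ensemble-voting step described above can be sketched schematically: each base classifier emits a set of predicted phosphorylation types (the multi-label part), and a label is kept when enough classifiers vote for it. The stub classifiers and the 0.5 threshold below are illustrative assumptions; the paper's actual ensemble fuses trained random forests.

```python
# Schematic of multi-label ensemble voting. Each base classifier returns a set
# of labels; a label survives if at least `threshold` of the ensemble votes for it.
# The stub classifiers here stand in for trained random forest members.

LABELS = ("serine", "threonine", "tyrosine")

def vote(classifiers, sample, threshold=0.5):
    counts = {lab: 0 for lab in LABELS}
    for clf in classifiers:
        for lab in clf(sample):
            counts[lab] += 1
    return {lab for lab, c in counts.items() if c / len(classifiers) >= threshold}

# Three stub classifiers: all agree on serine, disagree on the rest.
ensemble = [
    lambda s: {"serine", "threonine"},
    lambda s: {"serine"},
    lambda s: {"serine", "tyrosine"},
]
print(vote(ensemble, sample=None))  # prints: {'serine'}
```

Only serine clears the majority threshold here (3/3 votes versus 1/3 each for the others), showing how voting suppresses labels that individual ensemble members predict idiosyncratically.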
Design of a Two-level Adaptive Multi-Agent System for Malaria Vectors driven by an ontology
Koum, Guillaume; Yekel, Augustin; Ndifon, Bengyella; Etang, Josiane; Simard, Frédéric
2007-01-01
Background The understanding of heterogeneities in disease transmission dynamics, as far as malaria vectors are concerned, is a big challenge. Many studies tackling this problem do not find exact models to explain malaria vector propagation. Methods To address the problem we define an Adaptive Multi-Agent System (AMAS) which is elastic and organized on two levels. This AMAS is a dynamic system whose two levels are linked by an ontology, allowing it to function both as a reduced system and as an extended system. At the primary level, the AMAS comprises organization agents; at the secondary level, it consists of analysis agents. Its entry point, a User Interface Agent, can reproduce itself because it is given a minimum of background knowledge and learns appropriate "behavior" from the user in the presence of ambiguous queries, and from other agents of the AMAS in other situations. Results Some of the outputs of our system are tables and diagrams showing factors such as entomological parameters of malaria transmission, percentages of malaria transmission per malaria vector, and the entomological inoculation rate. Many other parameters can be produced by the system depending on the input data. Conclusion Our approach is an intelligent one which differs from the statistical approaches sometimes used in the field, and it aligns itself with distributed artificial intelligence. In terms of the fight against malaria, our system offers the opportunity to reduce the effort of field staff, who are not obliged to cover the entire territory when conducting surveys. Secondly, the AMAS can determine the presence or absence of malaria vectors even when specific data have not been collected in the geographical area. Unlike with a statistical technique, the projection of our results in the field can sometimes prove more general. PMID:17605778
[Research on the methods for multi-class kernel CSP-based feature extraction].
Wang, Jinjia; Zhang, Lingzhi; Hu, Bei
2012-04-01
To relax the presumption of strictly linear patterns in common spatial patterns (CSP), we studied the kernel CSP (KCSP). A new multi-class KCSP (MKCSP) approach is proposed in this paper, which combines the kernel approach with the multi-class CSP technique. In this approach, we used kernel spatial patterns for each class against all others, and extracted signal components specific to one condition from EEG data sets of multiple conditions. We then performed classification using a logistic linear classifier. The Brain Computer Interface (BCI) Competition III dataset IIIa was used in the experiment. The experiment showed that this approach can decompose raw EEG signals into spatial patterns extracted from multiple classes of single-trial EEG, and can obtain good classification results.
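The "each class against all others" decomposition above is the standard one-vs-rest scheme, sketched here in schematic form: one binary scorer is trained per class, and a trial is assigned to the class whose scorer is most confident. The scorer functions and feature names below are placeholder assumptions standing in for a full KCSP-plus-logistic-classifier pipeline.

```python
# One-vs-rest decision rule used by multi-class CSP-style pipelines, in
# schematic form. Each scorer stands in for a trained binary (class vs rest)
# KCSP + logistic classifier; feature names are illustrative only.

def one_vs_rest_predict(scorers, trial):
    """scorers: dict mapping class -> callable(trial) -> confidence in [0, 1].
    Returns the class whose binary scorer is most confident on this trial."""
    return max(scorers, key=lambda cls: scorers[cls](trial))

# Toy 3-class motor-imagery setup with fake per-class confidence functions.
scorers = {
    "left_hand":  lambda t: t["mu_left"],
    "right_hand": lambda t: t["mu_right"],
    "foot":       lambda t: t["beta_mid"],
}
trial = {"mu_left": 0.2, "mu_right": 0.7, "beta_mid": 0.4}
print(one_vs_rest_predict(scorers, trial))  # prints: right_hand
```

The design choice is that each binary problem stays tractable for the kernel method; the multi-class decision is recovered only at the end, by comparing calibrated confidences.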
A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model
NASA Astrophysics Data System (ADS)
Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi
Trends of globalization and advances in Information Technology (IT) have created opportunities for collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision-making framework for a three-echelon dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi-agent approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrarily, this framework utilizes the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.
Group adaptation, formal darwinism and contextual analysis.
Okasha, S; Paternotte, C
2012-06-01
We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659-671) develop a novel approach to this question, building on Grafen's 'formal Darwinism' project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection-optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams' famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis. © 2012 The Authors. Journal of Evolutionary Biology © 2012 European Society For Evolutionary Biology.
Multi-level trellis coded modulation and multi-stage decoding
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Wu, Jiantian; Lin, Shu
1990-01-01
Several constructions for multi-level trellis codes are presented and many codes with better performance than previously known codes are found. These codes provide a flexible trade-off between coding gain, decoding complexity, and decoding delay. New multi-level trellis coded modulation schemes using generalized set partitioning methods are developed for Quadrature Amplitude Modulation (QAM) and Phase Shift Keying (PSK) signal sets. New rotationally invariant multi-level trellis codes which can be combined with differential encoding to resolve phase ambiguity are presented.
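The generalized set partitioning mentioned above can be demonstrated numerically on an 8-PSK signal set: each partitioning step splits the constellation into subsets whose minimum intra-subset Euclidean distance grows, which is what the multi-level code exploits for coding gain. This is a generic Ungerboeck-style illustration, not a reproduction of the paper's specific constructions.

```python
import math

# Set partitioning of an 8-PSK constellation: successive splits increase the
# minimum Euclidean distance within each subset (the basis of trellis coded
# modulation gains). Unit-energy constellation assumed.

def psk_points(n=8):
    """Unit-circle n-PSK constellation as complex points."""
    return [complex(math.cos(2 * math.pi * k / n),
                    math.sin(2 * math.pi * k / n)) for k in range(n)]

def min_distance(points):
    """Minimum pairwise Euclidean distance within a signal subset."""
    return min(abs(a - b) for i, a in enumerate(points) for b in points[i + 1:])

pts = psk_points()
level0 = pts        # full 8-PSK set
level1 = pts[::2]   # one of two QPSK subsets after the first split
level2 = pts[::4]   # one of four antipodal (BPSK-like) subsets
for subset in (level0, level1, level2):
    print(round(min_distance(subset), 3))
# prints: 0.765, 1.414, 2.0 -- minimum distance grows at each partitioning level
```

Because the later partition levels enjoy larger intra-subset distances, they need less (or no) coding protection, which is exactly the flexible gain/complexity trade-off the abstract describes for multi-level codes with multi-stage decoding.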
ERIC Educational Resources Information Center
DOST, JEANNE
Rising pressures of competition for land in urban areas suggest the need for novel approaches to planning public land use for fostering higher levels of living desirability of the urban environment. Empirical investigations in both economic and non-economic disciplines serve as the basis for a broader concept of the urban school location problem.…
Johnson, B A; Kremer, P J; Swinburn, B A; de Silva-Sanigorski, A M
2012-07-01
The Be Active Eat Well (BAEW) community-based child obesity prevention intervention was successful in modestly reducing unhealthy weight gain in primary school children using a multi-strategy and multi-setting approach. To (1) examine the relationship between changes in obesity-related individual, household and school factors and changes in standardised child body mass index (zBMI), and (2) determine if the BAEW intervention moderated these effects. The longitudinal relationships between changes in individual, household and school variables and changes in zBMI were explored using multilevel modelling, with measurement time (baseline and follow-up) at level 1, individuals (behaviours, n = 1812) at level 2, and households (n = 1318) and schools (n = 18) as higher levels (environments). The effect of the intervention was tested while controlling for child age, gender and maternal education level. This study confirmed that the BAEW intervention lowered child zBMI compared with the comparison group (-0.085 units, P = 0.03). The variation between household environments was found to be a large contributor to the percentage of unexplained change in child zBMI (59%), compared with contributions from the individual (23%) and school (1%) levels. Across both groups, screen time (P = 0.03), sweet drink consumption (P = 0.03) and lack of household rules for television (TV) viewing (P = 0.05) were associated with increased zBMI, whereas there was a non-significant association with the frequency the TV was on during evening meals (P = 0.07). The moderating effect of the intervention was evident only for the relationship between the frequency of TV on during meals and zBMI; however, this effect was modest (P = 0.04). The development of childhood obesity involves multi-factorial and multi-level influences, some of which are amenable to change. Obesity prevention strategies should not only target individual behaviours but also the household environment and family practices.
Although zBMI changes were modest, these findings are encouraging as small reductions can have population level impacts on childhood obesity levels.
Evidencing Learning Outcomes: A Multi-Level, Multi-Dimensional Course Alignment Model
ERIC Educational Resources Information Center
Sridharan, Bhavani; Leitch, Shona; Watty, Kim
2015-01-01
This conceptual framework proposes a multi-level, multi-dimensional course alignment model to implement a contextualised constructive alignment of rubric design that authentically evidences and assesses learning outcomes. By embedding quality control mechanisms at each level for each dimension, this model facilitates the development of an aligned…
Zheng, Chunli; Wang, Jinan; Liu, Jianling; Pei, Mengjie; Huang, Chao; Wang, Yonghua
2014-08-01
The term systems pharmacology describes a field of study that uses computational and experimental approaches to broaden the view of drug actions rooted in molecular interactions and to advance the process of drug discovery. The aim of this work is to highlight the role that systems pharmacology plays across multi-target drug discovery from natural products for cardiovascular diseases (CVDs). Firstly, based on network pharmacology methods, we reconstructed the drug-target and target-target networks to determine the putative protein target set of multi-target drugs for CVD treatment. Secondly, we reintegrated a compound dataset of natural products and then obtained a multi-target compound subset through a virtual-screening process. Thirdly, a drug-likeness evaluation was applied to find ADME-favorable compounds in this subset. Finally, we conducted in vitro experiments to evaluate the reliability of the selected chemicals and targets. We found that four of the five randomly selected natural molecules can effectively act on the target set for CVDs, indicating the soundness of our systems-based method. This strategy may serve as a new model for multi-target drug discovery for complex diseases.
Health, policy and geography: insights from a multi-level modelling approach.
Castelli, Adriana; Jacobs, Rowena; Goddard, Maria; Smith, Peter C
2013-09-01
Improving the health and wellbeing of citizens ranks highly on the agenda of most governments. Policy action to enhance health and wellbeing can be targeted at a range of geographical levels and in England the focus has tended to shift away from the national level to smaller areas, such as communities and neighbourhoods. Our focus is to identify the potential for targeting policy interventions at the most appropriate geographical levels in order to enhance health and wellbeing. The rationale is that where variations in health and wellbeing indicators are larger, there may be greater potential for policy intervention targeted at that geographical level to have an impact on the outcomes of interest, compared with a strategy of targeting policy at those levels where relative variations are smaller. We use a multi-level regression approach to identify the degree of variation that exists in a set of health indicators at each level, taking account of the geographical hierarchical organisation of public sector organisations. We find that for each indicator, the proportion of total residual variance is greatest at smaller geographical areas. We also explore the variations in health indicators within a hierarchical level, but across the geographical areas for which public sector organisations are responsible. We show that it is feasible to identify a sub-set of organisations for which unexplained variation in health indicators is significantly greater relative to their counterparts. We demonstrate that adopting a geographical perspective to analyse the variation in indicators of health at different levels offers a potentially powerful analytical tool to signal where public sector organisations, faced increasingly with many competing demands, should target their policy efforts. This is relevant not only to the English context but also to other countries where responsibilities for health and wellbeing are being devolved to localities and communities. 
Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
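The variance-partitioning logic behind the multi-level regression approach above can be sketched directly: the share of total residual variance attributable to each geographical level (an intraclass-correlation-style quantity) signals where targeted intervention has the most scope. The level names and variance components below are made-up numbers, not estimates from the paper.

```python
# Schematic of multi-level variance partitioning: given estimated residual
# variance components per geographical level, compute each level's share of
# the total. Larger shares suggest greater scope for policy targeting there.

def variance_shares(components):
    """components: dict mapping level name -> estimated residual variance.
    Returns each level's proportion of total residual variance."""
    total = sum(components.values())
    return {level: var / total for level, var in components.items()}

# Hypothetical variance components from a three-level model:
shares = variance_shares({"region": 0.8, "district": 1.2, "small_area": 6.0})
for level, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{level}: {share:.0%}")
# Here small areas carry 75% of residual variance, mirroring the paper's
# finding that variation is greatest at smaller geographical areas.
```

In a real analysis these components would come from a fitted hierarchical (mixed-effects) model rather than being supplied by hand; the share computation itself is the same.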
Integrated Assessment of Coastal Areas
NASA Astrophysics Data System (ADS)
Nicholls, R. J.
2016-12-01
Coastal areas are experiencing change due to a range of natural and human-induced drivers. Of particular concern is climate change, particularly sea-level rise (SLR). In low gradient coastal areas, small changes in water levels can have profound consequences. Hence SLR is rightly considered a major threat. However, to properly diagnose a problem and find sustainable solutions, a systems approach is essential as the impacts of SLR will be modified by the other drivers. This paper will consider these issues from a multi-disciplinary perspective drawing on examples from around the world.