Sample records for "proposed approach involves"

  1. Linear regression crash prediction models: issues and proposed solutions.

    DOT National Transportation Integrated Search

    2010-05-01

    The paper develops a linear regression model approach that can be applied to crash data to predict vehicle crashes. The proposed approach involves novel data aggregation to satisfy linear regression assumptions, namely error structure normality ...
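
    A minimal sketch of the assumption-checking step the abstract describes (the data and aggregation scheme below are hypothetical, not the paper's): fit an ordinary least squares crash-frequency model and test the normality of the error structure.

    ```python
    # Minimal sketch with synthetic data: fit an OLS crash-frequency model
    # on aggregated road-segment data, then test residual normality, the
    # assumption the abstract highlights.
    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 200
    aadt = rng.uniform(1_000, 40_000, n)    # annual average daily traffic
    length = rng.uniform(0.5, 5.0, n)       # segment length (km)
    crashes = 0.0002 * aadt + 2.0 * length + rng.normal(0.0, 3.0, n)

    X = sm.add_constant(np.column_stack([aadt, length]))
    fit = sm.OLS(crashes, X).fit()

    # Shapiro-Wilk test: a small p-value suggests the aggregation has not
    # yet produced approximately normal errors.
    w, p = stats.shapiro(fit.resid)
    print(f"W = {w:.3f}, p = {p:.3f}")
    ```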

  2. Systematic Approach to Calculate the Concentration of Chemical Species in Multi-Equilibrium Problems

    ERIC Educational Resources Information Center

    Baeza-Baeza, Juan Jose; Garcia-Alvarez-Coque, Maria Celia

    2011-01-01

    A general systematic approach is proposed for the numerical calculation of multi-equilibrium problems. The approach involves several steps: (i) the establishment of balances involving the chemical species in solution (e.g., mass balances, charge balance, and stoichiometric balance for the reaction products), (ii) the selection of the unknowns (the…
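
    The balance-based formulation maps directly onto a numerical root-finder. Below is a minimal sketch (an illustrative example, not the authors' code) for a monoprotic weak acid, combining the mass balance, the charge balance, and the equilibrium constants.

    ```python
    # Minimal sketch for a monoprotic weak acid HA (acetic acid, Ka = 1.8e-5)
    # at total concentration C_T = 0.1 M. Unknowns: [HA], [A-], [H+], [OH-].
    # Illustrative example, not taken from the paper.
    import numpy as np
    from scipy.optimize import fsolve

    Ka, Kw, C_T = 1.8e-5, 1.0e-14, 0.1

    def balances(z):
        ha, a, h, oh = z
        return [ha + a - C_T,        # mass balance
                h - a - oh,          # charge balance
                h * a - Ka * ha,     # acid dissociation equilibrium
                h * oh - Kw]         # water autoionization

    ha, a, h, oh = fsolve(balances, [C_T, 1e-3, 1e-3, 1e-11])
    print(f"pH = {-np.log10(h):.2f}")    # about 2.87 for 0.1 M acetic acid
    ```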

  3. "Walk the Talk": Developing Personal Ethical Agency through a Business Partnership Program

    ERIC Educational Resources Information Center

    Matherne, Brett P.; Gove, Steve; Forlani, Victor; Janney, Jay J.

    2006-01-01

    This article proposes a pedagogical approach dedicated to helping students develop personal ethical agency--the ability to make decisions that involve ethical dilemmas consistent with an individual's ethical standards and professional standards of practice. The approach presented involves a tripartite gathering of students, business executives, and…

  4. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  5. The Effects of Adolescent Activities on Delinquency: A Differential Involvement Approach

    ERIC Educational Resources Information Center

    Wong, Siu Kwong

    2005-01-01

    T. Hirschi's (1969, "Causes of Delinquency." University of California Press, Berkeley, CA) control theory proposes that involvement, as an element of the social bond, should reduce delinquency. But, research studies have found that the effect of involvement is rather weak. This study reformulates Hirschi's involvement hypothesis by…

  6. A hybrid clustering approach for multivariate time series - A case study applied to failure analysis in a gas turbine.

    PubMed

    Fontes, Cristiano Hora; Budman, Hector

    2017-11-01

    A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets.
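
    For readers unfamiliar with the metrics being combined, here is a minimal sketch of Krzanowski's PCA similarity factor together with one plausible reading of the average-based Euclidean distance; the AED definition and the data are assumptions, and the fuzzy-clustering combination itself is not reproduced.

    ```python
    # Minimal sketch of the two metrics the hybrid approach combines:
    # Krzanowski's PCA similarity factor (SPCA) and an average-based
    # Euclidean distance (AED) between two multivariate time-series windows.
    import numpy as np

    def spca(X, Y, k=2):
        """Krzanowski's PCA similarity factor on the top-k principal axes."""
        def loadings(Z):
            Zc = Z - Z.mean(axis=0)
            _, _, vt = np.linalg.svd(Zc, full_matrices=False)
            return vt[:k].T                    # p x k orthonormal loadings
        L1, L2 = loadings(X), loadings(Y)
        return float(np.trace(L1.T @ L2 @ L2.T @ L1) / k)   # in [0, 1]

    def aed(X, Y):
        """One reading of AED: distance between variable-wise mean profiles."""
        return float(np.linalg.norm(X.mean(axis=0) - Y.mean(axis=0)))

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 4))                  # one multivariate series
    Y = X + rng.normal(scale=0.1, size=(500, 4))   # a similar operating regime
    print(f"SPCA = {spca(X, Y):.3f}, AED = {aed(X, Y):.3f}")
    ```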

  7. A Rule Based Approach to ISS Interior Volume Control and Layout

    NASA Technical Reports Server (NTRS)

    Peacock, Brian; Maida, Jim; Fitts, David; Dory, Jonathan

    2001-01-01

    Traditional human factors design involves the development of human factors requirements based on a desire to accommodate a certain percentage of the intended user population. As the product is developed human factors evaluation involves comparison between the resulting design and the specifications. Sometimes performance metrics are involved that allow leniency in the design requirements given that the human performance result is satisfactory. Clearly such approaches may work but they give rise to uncertainty and negotiation. An alternative approach is to adopt human factors design rules that articulate a range of each design continuum over which there are varying outcome expectations and interactions with other variables, including time. These rules are based on a consensus of human factors specialists, designers, managers and customers. The International Space Station faces exactly this challenge in interior volume control, which is based on anthropometric, performance and subjective preference criteria. This paper describes the traditional approach and then proposes a rule-based alternative. The proposed rules involve spatial, temporal and importance dimensions. If successful this rule-based concept could be applied to many traditional human factors design variables and could lead to a more effective and efficient contribution of human factors input to the design process.

  8. Optical flow and driver's kinematics analysis for state of alert sensing.

    PubMed

    Jiménez-Pinto, Javier; Torres-Torriti, Miguel

    2013-03-28

    Road accident statistics from different countries show that a significant number of accidents occur due to driver's fatigue and lack of awareness to traffic conditions. In particular, about 60% of the accidents in which long haul truck and bus drivers are involved are attributed to drowsiness and fatigue. It is thus fundamental to improve non-invasive systems for sensing a driver's state of alert. One of the main challenges to correctly resolve the state of alert is measuring the percentage of eyelid closure over time (PERCLOS), despite the driver's head and body movements. In this paper, we propose a technique that involves optical flow and driver's kinematics analysis to improve the robustness of the driver's alert state measurement under pose changes using a single camera with near-infrared illumination. The proposed approach infers and keeps track of the driver's pose in 3D space in order to ensure that eyes can be located correctly, even after periods of partial occlusion, for example, when the driver stares away from the camera. Our experiments show the effectiveness of the approach with a correct eyes detection rate of 99.41%, on average. The results obtained with the proposed approach in an experiment involving fifteen persons under different levels of sleep deprivation also confirm the discriminability of the fatigue levels. In addition to the measurement of fatigue and drowsiness, the pose tracking capability of the proposed approach has potential applications in distraction assessment and alerting of machine operators.
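
    Once per-frame eye openness is available, PERCLOS itself reduces to a windowed average; a minimal sketch with illustrative threshold and window values (not the paper's):

    ```python
    # Minimal sketch of PERCLOS: the fraction of frames in a sliding window
    # in which the eyes are mostly closed. Openness values, threshold and
    # window length are illustrative; the paper's tracking pipeline is not
    # shown here.
    import numpy as np

    def perclos(openness, fps=30, window_s=60, closed_thresh=0.2):
        """openness: per-frame eyelid aperture in [0, 1] (1 = fully open)."""
        win = fps * window_s
        closed = (np.asarray(openness) < closed_thresh).astype(float)
        kernel = np.ones(win) / win
        return np.convolve(closed, kernel, mode="valid")  # PERCLOS per window

    rng = np.random.default_rng(2)
    signal = np.clip(rng.normal(0.8, 0.25, 30 * 300), 0, 1)  # 5 min of frames
    print(f"peak PERCLOS: {perclos(signal).max():.3f}")
    ```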

  9. Quantifying phase synchronization using instances of Hilbert phase slips

    NASA Astrophysics Data System (ADS)

    Govindan, R. B.

    2018-07-01

    We propose to quantify phase synchronization between two signals, x(t) and y(t), by calculating the variance in the Hilbert phase of y(t) at instances of phase slips exhibited by x(t). The proposed approach is tested on numerically simulated coupled chaotic Roessler systems and second-order autoregressive processes. A standard phase synchronization approach, which involves unwrapping the Hilbert phases ϕ1(t) and ϕ2(t) of the two signals and analyzing the variance in |n·ϕ1(t) - m·ϕ2(t)| mod 2π (n and m are integers), was used for comparison. Furthermore, we compare the performance of the proposed and original approaches using uterine electromyogram signals and show that both approaches yield consistent results. The synchronization indexes obtained from the proposed approach and the standard approach agree reasonably well in all of the systems studied in this work. Our results indicate that the proposed approach, unlike the traditional approach, does not require the non-invertible transformations (unwrapping of the phases and reduction mod 2π) and can be used reliably to quantify phase synchrony between two signals.
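
    A minimal sketch of the slip-based index as the abstract describes it; the slip-detection rule and the use of circular variance as the spread measure are assumptions on my part, not the paper's exact procedure:

    ```python
    # Minimal sketch: find instants where the wrapped Hilbert phase of x(t)
    # completes a cycle (a "phase slip") and evaluate the spread of the
    # Hilbert phase of y(t) at those instants.
    import numpy as np
    from scipy.signal import hilbert

    def slip_phase_variance(x, y):
        phi_x = np.angle(hilbert(x))                  # wrapped in (-pi, pi]
        phi_y = np.angle(hilbert(y))
        slips = np.where(np.diff(phi_x) < -np.pi)[0]  # 2*pi wrap-around events
        # Circular variance: 0 = phases perfectly locked, 1 = no locking.
        return 1.0 - np.abs(np.mean(np.exp(1j * phi_y[slips])))

    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 100.0, 10_000)
    x = np.sin(2 * np.pi * t)
    y = np.sin(2 * np.pi * t + 0.5 + 0.1 * rng.normal(size=t.size))
    print(f"circular variance at slips: {slip_phase_variance(x, y):.3f}")
    ```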

  10. [The Helsinki Declaration: relativism and vulnerability].

    PubMed

    Diniz, D; Corrêa, M

    2001-01-01

    The Helsinki Declaration is a crucial ethical landmark for clinical research involving human beings. Since the Declaration was issued, a series of revisions and modifications have been introduced into the original text, but they have not altered its humanist approach or its international force for regulating clinical research. A proposal for an extensive revision of the Declaration's underlying ethical principles has been debated for the past four years. If the proposal is approved, international clinical research involving human beings will be modified, further increasing the vulnerability of certain social groups. This article discusses the historical process involved in passing the Helsinki Declaration and the most recent debate on the new draft. The article analyzes the new text's social implications for underdeveloped countries, arguing for a political approach to the vulnerability concept.

  11. A Sociotechnical Systems Approach To Coastal Marine Spatial Planning

    DTIC Science & Technology

    2016-12-01

    the authors followed the MEAD step of identifying variances and creating a matrix of these variances. Then the authors were able to propose methods ... potential politics involved, and the risks involved in proposing and attempting to start up a new marine aquaculture operation. ... Redesign suggestions: have a coordinating group or person (with knowledge ...

  12. Service quality benchmarking via a novel approach based on fuzzy ELECTRE III and IPA: an empirical case involving the Italian public healthcare context.

    PubMed

    La Fata, Concetta Manuela; Lupo, Toni; Piazza, Tommaso

    2017-11-21

    A novel fuzzy-based approach which combines ELECTRE III along with the Importance-Performance Analysis (IPA) is proposed in the present work to comparatively evaluate the service quality in the public healthcare context. Specifically, ELECTRE III is firstly considered to compare the service performance of examined hospitals in a noncompensatory manner. Afterwards, IPA is employed to support the service quality management to point out improvement needs and their priorities. The proposed approach also incorporates features of the Fuzzy Set Theory so as to address the possible uncertainty, subjectivity and vagueness of involved experts in evaluating the service quality. The model is applied to five major Sicilian public hospitals, and strengths and criticalities of the delivered service are finally highlighted and discussed. Although several approaches combining multi-criteria methods have already been proposed in the literature to evaluate the service performance in the healthcare field, to the best of the authors' knowledge the present work represents the first attempt at comparing service performance of alternatives in a noncompensatory manner in the investigated context.

  13. Accounting for the Benefits of Database Normalization

    ERIC Educational Resources Information Center

    Wang, Ting J.; Du, Hui; Lehmann, Constance M.

    2010-01-01

    This paper proposes a teaching approach to reinforce accounting students' understanding of the concept of database normalization. Unlike the conceptual approach shown in most AIS textbooks, this approach involves calculations and reconciliations with which accounting students are familiar because the methods are frequently used in…

  14. [A governance approach applied to analysing research into unemployed workers in the city of Medellin in Colombia].

    PubMed

    Cardona, Alvaro; Nieto, Emmanuel; Mejía, Luz M

    2010-01-01

    An academic exercise was performed to apply the analytical categories of the governance approach developed by Marc Hufty et al. to understanding social actors' relationships in a research and intervention project that studied socioeconomic conditions and sought to guarantee health insurance continuity for workers who had lost their jobs in the city of Medellin, Colombia, from 2004 to 2007. The process of investigation and intervention was examined as a case study in which the researchers were one of the actors involved. Characterising the stakeholders included: their level of inclusion/involvement in the problem; their power to influence public policy proposals; their perceptions and the characteristics, power and dynamics of their proposals regarding the problem of unemployment and health insurance after job loss; and the characteristics of their interaction with other actors. The results showed that the four analytical dimensions proposed by Hufty (actors, social norms, nodal points and processes) were useful for describing and understanding the interaction of the actors involved in the research and intervention proposal analysed here (i.e. the case study). It was concluded that Hufty's analytical governance framework was useful for understanding how the social subjects interacted, which rules governed their interaction, the most important nodes of interaction, and the progress achieved whilst implementing the intervention proposal.

  15. 7 CFR 4285.70 - Evaluation criteria.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Adequacy, soundness, and appropriateness of the proposed approach to solve the identified problem. (30%) (3) Feasibility and probability of success of project solving the problem. (10%) (4) Qualifications, experience in... proposal demonstrates the following: (1) Focus on a practical solution to a significant problem involving...

  16. Formalized Conflicts Detection Based on the Analysis of Multiple Emails: An Approach Combining Statistics and Ontologies

    NASA Astrophysics Data System (ADS)

    Zakaria, Chahnez; Curé, Olivier; Salzano, Gabriella; Smaïli, Kamel

    In Computer Supported Cooperative Work (CSCW), it is crucial for project leaders to detect conflicting situations as early as possible. Generally, this task is performed manually by studying a set of documents exchanged between team members. In this paper, we propose a full-fledged automatic solution that identifies documents, subjects and actors involved in relational conflicts. Our approach detects conflicts in emails, probably the most popular type of documents in CSCW, but the methods used can handle other text-based documents. These methods rely on the combination of statistical and ontological operations. The proposed solution is decomposed into several steps: (i) we enrich a simple negative emotion ontology with terms occurring in the corpus of emails, (ii) we categorize each conflicting email according to the concepts of this ontology and (iii) we identify emails, subjects and team members involved in conflicting emails using possibilistic description logic and a set of proposed measures. Each of these steps is evaluated and validated on concrete examples. Moreover, this approach's framework is generic and can be easily adapted to domains other than conflicts, e.g. security issues, and extended with operations making use of our proposed set of measures.

  17. A combined NLP-differential evolution algorithm approach for the optimization of looped water distribution systems

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.

    2011-08-01

    This paper proposes a novel optimization approach for the least cost design of looped water distribution systems (WDSs). Three distinct steps are involved in the proposed optimization approach. In the first step, the shortest-distance tree within the looped network is identified using the Dijkstra graph theory algorithm, for which an extension is proposed to find the shortest-distance tree for multisource WDSs. In the second step, a nonlinear programming (NLP) solver is employed to optimize the pipe diameters for the shortest-distance tree (chords of the shortest-distance tree are allocated the minimum allowable pipe sizes). Finally, in the third step, the original looped water network is optimized using a differential evolution (DE) algorithm seeded with diameters in the proximity of the continuous pipe sizes obtained in step two. As such, the proposed optimization approach combines the traditional deterministic optimization technique of NLP with the emerging evolutionary algorithm DE via the proposed network decomposition. The proposed methodology has been tested on four looped WDSs with the number of decision variables ranging from 21 to 454. Results obtained show the proposed approach is able to find optimal solutions with significantly less computational effort than other optimization techniques.
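
    Step one of the decomposition can be reproduced with a standard graph routine; a minimal sketch on a toy five-node network, using scipy's Dijkstra with predecessor output to recover the shortest-distance tree:

    ```python
    # Minimal sketch of step one: extract the shortest-distance tree of a
    # looped network from a source node. The 5-node network is a toy
    # example; edge weights stand in for pipe lengths. The NLP and DE
    # stages of the paper are not reproduced here.
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import dijkstra

    # Symmetric weighted adjacency matrix of a small looped network.
    W = np.array([[0, 4, 0, 0, 8],
                  [4, 0, 3, 0, 2],
                  [0, 3, 0, 5, 0],
                  [0, 0, 5, 0, 6],
                  [8, 2, 0, 6, 0]], dtype=float)

    dist, pred = dijkstra(csr_matrix(W), directed=False,
                          indices=0, return_predecessors=True)
    tree_edges = [(pred[v], v) for v in range(len(dist)) if pred[v] >= 0]
    print(tree_edges)   # edges of the shortest-distance tree rooted at node 0
    # Chords (looped edges not in this tree) would get minimum pipe sizes.
    ```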

  18. You Can Get There from Here: Involving the Community in Children's Aspirations. A Proposal for Tremont Consolidated Grammar School, Tremont, Maine.

    ERIC Educational Resources Information Center

    Lawrence, Barbara Kent

    Tremont Consolidated Grammar School (Maine) has changed little in the past 50 years. The wisdom of Paulo Freire and other educators shows that student empowerment, critical thinking, and parent involvement are necessary for true education, particularly for oppressed people. An ecological approach to education involves the family, the community,…

  19. Sieve estimation in semiparametric modeling of longitudinal data with informative observation times.

    PubMed

    Zhao, Xingqiu; Deng, Shirong; Liu, Li; Liu, Lei

    2014-01-01

    Analyzing irregularly spaced longitudinal data often involves modeling possibly correlated response and observation processes. In this article, we propose a new class of semiparametric mean models that allows for the interaction between the observation history and covariates, leaving patterns of the observation process to be arbitrary. For inference on the regression parameters and the baseline mean function, a spline-based least squares estimation approach is proposed. The consistency, rate of convergence, and asymptotic normality of the proposed estimators are established. Our new approach is different from the usual approaches relying on the model specification of the observation scheme, and it can be easily used for predicting the longitudinal response. Simulation studies demonstrate that the proposed inference procedure performs well and is more robust. The analyses of bladder tumor data and medical cost data are presented to illustrate the proposed method.
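
    The sieve idea can be illustrated with a simple basis expansion; a minimal sketch under the assumption of a polynomial sieve (the paper uses splines) and synthetic data:

    ```python
    # Minimal sketch of the sieve step: approximate an unknown smooth
    # baseline mean function by a finite basis and estimate its
    # coefficients by least squares. A polynomial basis stands in for the
    # paper's splines; data are synthetic.
    import numpy as np

    rng = np.random.default_rng(10)
    t = np.sort(rng.uniform(0, 1, 300))            # observation times
    y = np.sin(2 * np.pi * t) + 0.5 * t + rng.normal(0, 0.2, t.size)

    degree = 5                                     # sieve dimension grows with n
    B = np.vander(t, degree + 1, increasing=True)  # basis evaluated at t
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)   # least-squares estimate
    baseline_hat = B @ coef                        # estimated baseline mean
    print(np.round(coef, 2))
    ```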

  20. Novel image processing approach to detect malaria

    NASA Astrophysics Data System (ADS)

    Mas, David; Ferrer, Belen; Cojoc, Dan; Finaurini, Sara; Mico, Vicente; Garcia, Javier; Zalevsky, Zeev

    2015-09-01

    In this paper we present a novel image processing algorithm providing good preliminary capabilities for in vitro detection of malaria. The proposed concept is based upon analysis of the temporal variation of each pixel. Changes in dark pixels mean that intracellular activity has occurred, indicating the presence of the malaria parasite inside the cell. Preliminary experimental results, involving analysis of red blood cells that were either healthy or infected with malaria parasites, validated the potential benefit of the proposed numerical approach.

  21. Report: Unsupervised identification of malaria parasites using computer vision.

    PubMed

    Khan, Najeed Ahmed; Pervaz, Hassan; Latif, Arsalan; Musharaff, Ayesha

    2017-01-01

    Malaria in humans is a serious and fatal tropical disease. The disease results from Anopheles mosquitoes that are infected by Plasmodium species. The clinical diagnosis of malaria based on the history, symptoms and clinical findings must always be confirmed by laboratory diagnosis. Laboratory diagnosis of malaria involves identification of the malaria parasite or its antigens/products in the blood of the patient. Manual diagnosis of the malaria parasite by pathologists has proven cumbersome. Therefore, there is a need for automatic, efficient and accurate identification of the malaria parasite. In this paper, we propose a computer vision based approach to identify the malaria parasite from light microscopy images. This research deals with the challenges involved in the automatic detection of malaria parasite tissues. Our proposed method is pixel-based. We used K-means clustering (an unsupervised approach) for the segmentation to identify malaria parasite tissues.
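
    The pixel-based segmentation step maps onto a few lines of scikit-learn; a minimal sketch with a synthetic stand-in for a smear image and an arbitrary choice of k:

    ```python
    # Minimal sketch of the unsupervised, pixel-based segmentation step:
    # cluster pixel colour values with K-means and inspect the darkest
    # cluster, where stained parasite tissue would typically fall. The
    # image is synthetic; k=3 is an illustrative choice.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    image = rng.uniform(0, 1, size=(64, 64, 3))     # stand-in for a smear image
    pixels = image.reshape(-1, 3)

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
    labels = km.labels_.reshape(64, 64)

    darkest = np.argmin(km.cluster_centers_.sum(axis=1))
    parasite_mask = labels == darkest               # candidate parasite pixels
    print(f"fraction of flagged pixels: {parasite_mask.mean():.3f}")
    ```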

  22. Combined non-parametric and parametric approach for identification of time-variant systems

    NASA Astrophysics Data System (ADS)

    Dziedziech, Kajetan; Czop, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz

    2018-03-01

    Identification of systems, structures and machines with variable physical parameters is a challenging task especially when time-varying vibration modes are involved. The paper proposes a new combined, two-step - i.e. non-parametric and parametric - modelling approach in order to determine time-varying vibration modes based on input-output measurements. Single-degree-of-freedom (SDOF) vibration modes from multi-degree-of-freedom (MDOF) non-parametric system representation are extracted in the first step with the use of time-frequency wavelet-based filters. The second step involves time-varying parametric representation of extracted modes with the use of recursive linear autoregressive-moving-average with exogenous inputs (ARMAX) models. The combined approach is demonstrated using system identification analysis based on the experimental mass-varying MDOF frame-like structure subjected to random excitation. The results show that the proposed combined method correctly captures the dynamics of the analysed structure, using minimum a priori information on the model.

  23. Towards an Approach for an Accessible and Inclusive Virtual Education Using ESVI-AL Project Results

    ERIC Educational Resources Information Center

    Amado-Salvatierra, Hector R.; Hilera, Jose R.

    2015-01-01

    Purpose: This paper aims to present an approach to achieve accessible and inclusive Virtual Education for all, but especially intended for students with disabilities. This work proposes main steps to take into consideration for stakeholders involved in the educational process related to an inclusive e-Learning. Design/methodology/approach: The…

  24. The Ecosystem of Information Retrieval

    ERIC Educational Resources Information Center

    Rodriguez-Munoz, Jose-Vicente; Martinez-Mendez, Francisco-Javier; Pastor-Sanchez, Juan-Antonio

    2012-01-01

    Introduction: This paper presents an initial proposal for a formal framework that, by studying the metric variables involved in information retrieval, can establish the sequence of events involved and how to perform it. Method: A systematic approach from the equations of Shannon and Weaver to establish the decidability of information retrieval…

  25. Corrected simulations for one-dimensional diffusion processes with naturally occurring boundaries.

    PubMed

    Shafiey, Hassan; Gan, Xinjun; Waxman, David

    2017-11-01

    To simulate a diffusion process, a usual approach is to discretize the time in the associated stochastic differential equation. This is the approach used in the Euler method. In the present work we consider a one-dimensional diffusion process where the terms occurring, within the stochastic differential equation, prevent the process entering a region. The outcome is a naturally occurring boundary (which may be absorbing or reflecting). A complication occurs in a simulation of this situation. The term involving a random variable, within the discretized stochastic differential equation, may take a trajectory across the boundary into a "forbidden region." The naive way of dealing with this problem, which we refer to as the "standard" approach, is simply to reset the trajectory to the boundary, based on the argument that crossing the boundary actually signifies achieving the boundary. In this work we show, within the framework of the Euler method, that such resetting introduces a spurious force into the original diffusion process. This force may have a significant influence on trajectories that come close to a boundary. We propose a corrected numerical scheme, for simulating one-dimensional diffusion processes with naturally occurring boundaries. This involves correcting the standard approach, so that an exact property of the diffusion process is precisely respected. As a consequence, the proposed scheme does not introduce a spurious force into the dynamics. We present numerical test cases, based on exactly soluble one-dimensional problems with one or two boundaries, which suggest that, for a given value of the discrete time step, the proposed scheme leads to substantially more accurate results than the standard approach. Alternatively, the standard approach needs considerably more computation time to obtain a comparable level of accuracy to the proposed scheme, because the standard approach requires a significantly smaller time step.
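
    The "standard" resetting scheme the authors critique is easy to state in code; a minimal sketch for a toy square-root diffusion with a natural boundary at zero (the corrected scheme itself is not reproduced here):

    ```python
    # Minimal sketch of the "standard" scheme the paper critiques: an Euler
    # discretization of a square-root (CIR-type) diffusion whose noise term
    # prevents the exact process from crossing zero; discrete trajectories
    # that do cross are naively reset to the boundary.
    import numpy as np

    rng = np.random.default_rng(4)
    a, b, sigma = 1.0, 0.5, 0.6          # mean reversion, level, volatility
    dt, n_steps = 1e-3, 5_000
    x = np.full(2_000, 0.05)             # ensemble started near the boundary

    for _ in range(n_steps):
        drift = a * (b - x)
        diffusion = sigma * np.sqrt(np.maximum(x, 0.0))
        x = x + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal(x.size)
        x = np.maximum(x, 0.0)           # the naive reset that, per the paper,
                                         # injects a spurious force near zero

    print(f"ensemble mean at t = {n_steps * dt}: {x.mean():.3f}")
    ```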

  26. Exploiting Quantum Resonance to Solve Combinatorial Problems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Fijany, Amir

    2006-01-01

    Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.

  27. ALS Pathogenesis and Therapeutic Approaches: The Role of Mesenchymal Stem Cells and Extracellular Vesicles.

    PubMed

    Bonafede, Roberta; Mariotti, Raffaella

    2017-01-01

    Amyotrophic lateral sclerosis (ALS) is a fatal neurodegenerative disease characterized by progressive muscle paralysis determined by the degeneration of motoneurons in the motor cortex brainstem and spinal cord. The ALS pathogenetic mechanisms are still unclear, despite the wealth of studies demonstrating the involvement of several altered signaling pathways, such as mitochondrial dysfunction, glutamate excitotoxicity, oxidative stress and neuroinflammation. To date, the proposed therapeutic strategies are targeted to one or a few of these alterations, resulting in only a minimal effect on disease course and survival of ALS patients. The involvement of different mechanisms in ALS pathogenesis underlines the need for a therapeutic approach targeted to multiple aspects. Mesenchymal stem cells (MSC) can support motoneurons and surrounding cells, reduce inflammation, stimulate tissue regeneration and release growth factors. On this basis, MSC have been proposed as promising candidates to treat ALS. However, due to the drawbacks of cell therapy, the possible therapeutic use of extracellular vesicles (EVs) released by stem cells is raising increasing interest. The present review summarizes the main pathological mechanisms involved in ALS and the related therapeutic approaches proposed to date, focusing on MSC therapy and their preclinical and clinical applications. Moreover, the nature and characteristics of EVs and their role in recapitulating the effect of stem cells are discussed, elucidating how and why these vesicles could provide novel opportunities for ALS treatment.

  28. Exact Tests for the Rasch Model via Sequential Importance Sampling

    ERIC Educational Resources Information Center

    Chen, Yuguo; Small, Dylan

    2005-01-01

    Rasch proposed an exact conditional inference approach to testing his model but never implemented it because it involves the calculation of a complicated probability. This paper furthers Rasch's approach by (1) providing an efficient Monte Carlo methodology for accurately approximating the required probability and (2) illustrating the usefulness…

  29. New Pedagogical Approaches to Improve Production of Materials in Distance Education.

    ERIC Educational Resources Information Center

    Mena, Marta

    1992-01-01

    Analyzes problems involved in the production of instructional materials for distance education and offers new pedagogical approaches to improve production of materials for distance education. Discusses past, present, and future methods used to design instructional materials, proposes models to aid in the production of instructional materials, and…

  30. A Strategy for Language Assessment of Young Children: A Combination of Two Approaches.

    ERIC Educational Resources Information Center

    Kelly, Donna J.; Rice, Mabel L.

    1986-01-01

    A proposed strategy for language assessment advocates a combination of descriptive and formal assessment measures. This approach involves a parent-clinician interview, parent-child observations, clinician-directed formal and nonformal assessment procedures, and a parent-clinician interpretation. An elaborated sample of language assessment is…

  31. Re-Animating the Mathematical Concept: A Materialist Look at Students Practicing Mathematics with Digital Technology

    ERIC Educational Resources Information Center

    Chorney, Sean

    2017-01-01

    This paper proposes a philosophical approach to the mathematical engagement involving students and a digital tool. This philosophical proposal aligns with other theories of learning that have been implemented in mathematics education but rearticulates some metaphors so as to promote insight and ideas to further support continued investigations…

  32. A new-old approach for shallow landslide analysis and susceptibility zoning in fine-grained weathered soils of southern Italy

    NASA Astrophysics Data System (ADS)

    Cascini, Leonardo; Ciurleo, Mariantonietta; Di Nocera, Silvio; Gullà, Giovanni

    2015-07-01

    Rainfall-induced shallow landslides involve several geo-environmental contexts and different types of soils. In clayey soils, they affect the most superficial layer, which is generally constituted by physically weathered soils characterised by a diffuse pattern of cracks. This type of landslide most commonly occurs in the form of multiple-occurrence landslide phenomena simultaneously involving large areas and thus has several consequences in terms of environmental and economic damage. Indeed, landslide susceptibility zoning is a relevant issue for land use planning and/or design purposes. This study proposes a multi-scale approach to reach this goal. The proposed approach is tested and validated over an area in southern Italy affected by widespread shallow landslides that can be classified as earth slides and earth slide-flows. Specifically, by moving from a small (1:100,000) to a medium scale (1:25,000), with the aid of heuristic and statistical methods, the approach identifies the main factors leading to landslide occurrence and effectively detects the areas potentially affected by these phenomena. Finally, at a larger scale (1:5000), deterministic methods, i.e., physically based models (TRIGRS and TRIGRS-unsaturated), allow quantitative landslide susceptibility assessment, starting from sample areas representative of those that can be affected by shallow landslides. Considering the reliability of the obtained results, the proposed approach seems useful for analysing other case studies in similar geological contexts.

  33. A Gaussian Approximation Approach for Value of Information Analysis.

    PubMed

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
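
    A minimal sketch of the two-step computation as described in the abstract, for a two-strategy decision and a single parameter of interest; the PSA data, the prior effective sample size n0, and the new-study size n are illustrative assumptions:

    ```python
    # Minimal sketch of the GA approach to EVSI: (1) a linear metamodel of
    # incremental net benefit (INB) on the parameter of interest, (2) a
    # Gaussian preposterior for that parameter's mean. All numbers invented.
    import numpy as np

    rng = np.random.default_rng(5)
    K = 10_000
    theta = rng.normal(0.6, 0.1, K)                       # PSA draws
    inb = 5_000 * (theta - 0.55) + rng.normal(0, 300, K)  # incremental net benefit

    # Step 1: linear metamodel of INB on the parameter of interest.
    beta = np.polyfit(theta, inb, 1)                      # slope, intercept

    # Step 2: Gaussian approximation of the preposterior mean of theta for
    # a study of size n, with n0 the prior effective sample size; the
    # preposterior-mean variance shrinks by n / (n + n0).
    n0, n = 25, 100
    shrink = n / (n + n0)
    prepost_mean = theta.mean() + np.sqrt(shrink) * (theta - theta.mean())
    prepost_inb = np.polyval(beta, prepost_mean)

    evsi = np.mean(np.maximum(prepost_inb, 0.0)) - max(np.mean(inb), 0.0)
    print(f"EVSI ~ {evsi:.0f} (monetary units)")
    ```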

  34. Consumer involvement in seafood as family meals in Norway: an application of the expectancy-value approach.

    PubMed

    Olsen, S O

    2001-04-01

    A theoretical model of involvement in consumption of food products was tested in a representative survey of Norwegian households for the particular case of consuming seafood as a common family meal. The empirical study is based on a structural equation approach to test the construct validity of measures and the empirical fit of the theoretical model. Attitudes, negative feelings, social norms and moral obligation proved to be important, reliable and distinct constructs and explained 63% of the variation in seafood involvement. Negative feelings and moral obligation were the most important antecedents of involvement. Both our proposed model and a modified model with seafood involvement as a mediator fit the data well and supported our expectations in a promising way.

  35. Dry deposition models for radionuclides dispersed in air: a new approach for deposition velocity evaluation schema

    NASA Astrophysics Data System (ADS)

    Giardina, M.; Buffa, P.; Cervone, A.; De Rosa, F.; Lombardo, C.; Casamirra, M.

    2017-11-01

    In the framework of a National Research Program funded by the Italian Minister of Economic Development, the Department of Energy, Information Engineering and Mathematical Models (DEIM) of Palermo University and ENEA Research Centre of Bologna, Italy are performing several research activities to study physical models and mathematical approaches aimed at investigating dry deposition mechanisms of radioactive pollutants. On the basis of such studies, a new approach to evaluate the dry deposition velocity for particles is proposed. Comparisons with some literature experimental data show that the proposed dry deposition scheme can capture the main phenomena involved in the dry deposition process successfully.

  36. An Alternative Approach for Nonlinear Latent Variable Models

    ERIC Educational Resources Information Center

    Mooijaart, Ab; Bentler, Peter M.

    2010-01-01

    In the last decades there has been an increasing interest in nonlinear latent variable models. Since the seminal paper of Kenny and Judd, several methods have been proposed for dealing with these kinds of models. This article introduces an alternative approach. The methodology involves fitting some third-order moments in addition to the means and…

  37. Representing Graphical User Interfaces with Sound: A Review of Approaches

    ERIC Educational Resources Information Center

    Ratanasit, Dan; Moore, Melody M.

    2005-01-01

    The inability of computer users who are visually impaired to access graphical user interfaces (GUIs) has led researchers to propose approaches for adapting GUIs to auditory interfaces, with the goal of providing access for visually impaired people. This article outlines the issues involved in nonvisual access to graphical user interfaces, reviews…

  38. Contributions to the Underlying Bivariate Normal Method for Factor Analyzing Ordinal Data

    ERIC Educational Resources Information Center

    Xi, Nuo; Browne, Michael W.

    2014-01-01

    A promising "underlying bivariate normal" approach was proposed by Jöreskog and Moustaki for use in the factor analysis of ordinal data. This was a limited information approach that involved the maximization of a composite likelihood function. Its advantage over full-information maximum likelihood was that very much less computation was…

  39. Ecology of Mind: A Batesonian Systems Thinking Approach to Curriculum Enactment

    ERIC Educational Resources Information Center

    Bloom, Jeffrey W.

    2012-01-01

    This article proposes a Batesonian systems thinking and ecology of mind approach to enacting curriculum. The key ideas for the model include ecology of mind, relationships, systems, systems thinking, pattern thinking, abductive thinking, and context. These ideas provide a basis for a recursive, three-part model involving developing (a) depth of…

  40. Triple Scheme of Learning Support Design for Scientific Discovery Learning Based on Computer Simulation: Experimental Research

    ERIC Educational Resources Information Center

    Zhang, Jianwei; Chen, Qi; Sun, Yanquing; Reid, David J.

    2004-01-01

    Learning support studies involving simulation-based scientific discovery learning have tended to adopt an ad hoc strategies-oriented approach in which the support strategies are typically pre-specified according to learners' difficulties in particular activities. This article proposes a more integrated approach, a triple scheme for learning…

  41. A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation

    NASA Astrophysics Data System (ADS)

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah

    2016-06-01

    Bayesian Network (BN) has been regarded as a successful representation of the inter-relationship of factors affecting human behavior during an emergency. This paper is an extension of earlier work on quantifying the variables involved in a BN model of human behavior during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the experts' burden in providing precise probability values, a new approach to the elicitation technique is required. This study proposes a new fuzzy BN approach for quantifying human behavior during an evacuation. Three major phases of methodology are involved, namely 1) development of a qualitative model representing human factors during an evacuation, 2) quantification of the BN model using fuzzy probability and 3) inference and interpretation of the BN results. A case study of three interdependent human evacuation factors, namely danger assessment ability, information about the threat and stressful conditions, is used to illustrate the application of the proposed method. This approach will serve as an alternative to the conventional probability elicitation technique in understanding human behavior during an evacuation.

  42. Educational Debt Burden: Law School Assistance Programs--A Review of Existing Programs and a Proposed New Approach.

    ERIC Educational Resources Information Center

    Vernon, David H.

    1989-01-01

    The paper reviews and critiques the 13 existing (1987) law school assistance programs and proposes a national repayment-assistance debt-forgiveness program which would involve an income-contingent repayment "tax" coupled with an assurance to creditors of repayment by means of a "guarantee" or "insurance" fund. (DB)

  43. Write Another Poem about Marigold: Meaningful Writing as a Process of Change.

    ERIC Educational Resources Information Center

    Teichmann, Sandra Gail

    1995-01-01

    Considers a process approach toward the goal of meaningful writing which may aid in positive personal change. Outlines recent criticism of contemporary poetry; argues against tradition and practice of craft in writing poetry. Proposes a means of writing centered on a method of inquiry involving elements of self-involvement, curiosity, and risk to…

  44. Forecasting conditional climate-change using a hybrid approach

    USGS Publications Warehouse

    Esfahani, Akbar Akbari; Friedel, Michael J.

    2014-01-01

    A novel approach is proposed to forecast the likelihood of climate-change across spatial landscape gradients. This hybrid approach involves reconstructing past precipitation and temperature using the self-organizing map technique; determining quantile trends in the climate-change variables by quantile regression modeling; and computing conditional forecasts of climate-change variables based on self-similarity in quantile trends using the fractionally differenced auto-regressive integrated moving average technique. The proposed modeling approach is applied to states (Arizona, California, Colorado, Nevada, New Mexico, and Utah) in the southwestern U.S., where conditional forecasts of climate-change variables are evaluated against recent (2012) observations, evaluated at a future time period (2030), and evaluated as future trends (2009–2059). These results have broad economic, political, and social implications because they quantify uncertainty in climate-change forecasts affecting various sectors of society. Another benefit of the proposed hybrid approach is that it can be extended to any spatiotemporal scale, provided self-similarity exists.
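
    The quantile-trend step can be illustrated with statsmodels; a minimal sketch on a synthetic annual temperature series (the SOM reconstruction and ARFIMA forecasting stages are not reproduced):

    ```python
    # Minimal sketch of the quantile-trend step of the hybrid approach:
    # fit linear trends at several quantiles of a reconstructed annual
    # temperature series. The series here is synthetic.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.regression.quantile_regression import QuantReg

    rng = np.random.default_rng(6)
    years = np.arange(1950, 2010)
    temp = 0.02 * (years - 1950) + rng.normal(0, 0.5, years.size)  # warming trend

    X = sm.add_constant(years.astype(float))
    for q in (0.1, 0.5, 0.9):
        fit = QuantReg(temp, X).fit(q=q)
        print(f"q={q}: trend = {fit.params[1]:.4f} deg/yr")
    ```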

  45. Prediction of nocturnal hypoglycemia by an aggregation of previously known prediction approaches: proof of concept for clinical application.

    PubMed

    Tkachenko, Pavlo; Kriukova, Galyna; Aleksandrova, Marharyta; Chertov, Oleg; Renard, Eric; Pereverzyev, Sergei V

    2016-10-01

    Nocturnal hypoglycemia (NH) is common in patients with insulin-treated diabetes. Despite the risk associated with NH, there are only a few methods that aim to predict such events from intermittent blood glucose monitoring data, and none has been validated for clinical use. Here we propose a method of combining several predictors into a new one that performs at the level of the best involved predictor, or even outperforms all individual candidates. The idea of the method is to use a recently developed strategy for aggregating ranking algorithms. The method has been calibrated and tested on data extracted from clinical trials performed in the European FP7-funded project DIAdvisor. We then tested the proposed approach on other datasets to show the portability of the method. This feature allows its simple implementation in the form of a diabetic smartphone app. On the considered datasets the proposed approach exhibits good performance in terms of sensitivity, specificity and predictive values. Moreover, the resulting predictor automatically performs at the level of the best involved method or even outperforms it. We propose a strategy for combining NH predictors that leads to a method exhibiting reliable performance and the potential for everyday use by any patient who performs self-monitoring of blood glucose.
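
    The abstract does not spell out the aggregation rule, so the following minimal sketch shows one classical way to combine predictors so that the combination tracks the best individual one (exponentially weighted averaging), offered purely as an illustration:

    ```python
    # Minimal sketch, as an illustration only: an exponentially weighted
    # average of risk predictors that provably tracks the best individual
    # predictor over time. Not the paper's aggregation strategy.
    import numpy as np

    def aggregate(predictions, outcomes, eta=2.0):
        """predictions: (T, m) risk scores from m predictors in [0, 1];
        outcomes: (T,) observed 0/1 nocturnal-hypoglycemia labels."""
        T, m = predictions.shape
        w = np.ones(m) / m
        combined = np.empty(T)
        for t in range(T):
            combined[t] = w @ predictions[t]          # aggregated risk score
            losses = (predictions[t] - outcomes[t]) ** 2
            w *= np.exp(-eta * losses)                # downweight bad predictors
            w /= w.sum()
        return combined

    rng = np.random.default_rng(7)
    outcomes = rng.integers(0, 2, 200)
    preds = np.column_stack([outcomes * 0.8 + 0.1,    # good predictor
                             rng.uniform(0, 1, 200),  # pure noise
                             1 - outcomes * 0.6])     # misleading predictor
    print(aggregate(preds.clip(0, 1), outcomes)[-5:])
    ```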

  46. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based, Graduate-Level Analytical Chemistry Course

    NASA Astrophysics Data System (ADS)

    Toh, Chee-Seng

    2007-04-01

    A research-focused approach is described for a nonlaboratory-based graduate-level module on analytical chemistry. The approach utilizes commonly practiced activities carried out in active research laboratories, in particular, activities involving logging of ideas and thoughts, journal clubs, proposal writing, classroom participation and discussions, and laboratory tours. This approach was adapted without compromising the course content and results suggest possible adaptation and implementation in other graduate-level courses.

  47. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    PubMed

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins and use it to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
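
    A minimal sketch of the CF framing: proteins as rows of a sparse interactome weight matrix, neighborhood similarity between row vectors, and a CF-style score for an unobserved pair. Plain cosine similarity stands in for the paper's rescaled cosine coefficient, which is not reproduced here:

    ```python
    # Minimal sketch: score an unobserved protein pair (i, j) from the
    # rows of a synthetic sparse interactome weight matrix, weighting the
    # neighbors of i by their (plain) cosine similarity to i.
    import numpy as np
    from scipy.sparse import random as sparse_random

    W = sparse_random(300, 300, density=0.02, random_state=8, format="csr")
    A = (W + W.T).toarray()              # symmetric putative PPI weights

    def cosine(u, v):
        nu, nv = np.linalg.norm(u), np.linalg.norm(v)
        return 0.0 if nu == 0 or nv == 0 else float(u @ v / (nu * nv))

    def cf_score(i, j):
        """CF-style predicted weight for pair (i, j)."""
        others = [k for k in range(A.shape[0]) if k != i]
        sims = np.array([cosine(A[i], A[k]) for k in others])
        cols = np.array([A[k, j] for k in others])
        denom = sims.sum()
        return float(sims @ cols / denom) if denom > 0 else 0.0

    print(cf_score(0, 1))   # predicted interaction weight for an unseen pair
    ```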

  48. A hybrid approach to parameter identification of linear delay differential equations involving multiple delays

    NASA Astrophysics Data System (ADS)

    Marzban, Hamid Reza

    2018-05-01

    In this paper, we are concerned with the parameter identification of linear time-invariant systems containing multiple delays. The approach is based upon a hybrid of block-pulse functions and Legendre polynomials. The convergence of the proposed procedure is established and an upper error bound with respect to the L2-norm associated with the hybrid functions is derived. The problem under consideration is first transformed into a system of algebraic equations. The least squares technique is then employed for identification of the desired parameters. Several multi-delay systems of varying complexity are investigated to evaluate the performance and capability of the proposed approximation method. It is shown that the proposed approach is also applicable to a class of nonlinear multi-delay systems. It is demonstrated that the suggested procedure provides accurate results for the desired parameters.
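
    As a simplified illustration of the end goal, the sketch below identifies the parameters of a linear two-delay system by direct discretized least squares; the paper's hybrid block-pulse/Legendre machinery is not reproduced, and the system is invented:

    ```python
    # Minimal sketch: recover (a, b1, b2) in x'(t) = a*x(t) + b1*x(t-tau1)
    # + b2*x(t-tau2) from sampled data via least squares on finite
    # differences. Delays are assumed known; all values are illustrative.
    import numpy as np

    dt, T = 0.01, 20.0
    tau1, tau2 = 1.0, 2.5                  # known delays (seconds)
    a, b1, b2 = -0.5, 0.3, -0.2            # true parameters to recover
    n = int(T / dt)
    d1, d2 = int(tau1 / dt), int(tau2 / dt)

    x = np.zeros(n)
    x[0] = 1.0
    for k in range(n - 1):                 # Euler simulation of the DDE
        xd1 = x[k - d1] if k >= d1 else 1.0    # constant pre-history
        xd2 = x[k - d2] if k >= d2 else 1.0
        x[k + 1] = x[k] + dt * (a * x[k] + b1 * xd1 + b2 * xd2)

    # Least-squares estimate from finite-difference derivatives.
    k0 = max(d1, d2)
    dx = (x[k0 + 1:] - x[k0:-1]) / dt
    Phi = np.column_stack([x[k0:-1], x[k0 - d1:-1 - d1], x[k0 - d2:-1 - d2]])
    est, *_ = np.linalg.lstsq(Phi, dx, rcond=None)
    print(est)   # should be close to [-0.5, 0.3, -0.2]
    ```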

  49. Asymptotic Standard Errors for Item Response Theory True Score Equating of Polytomous Items

    ERIC Educational Resources Information Center

    Cher Wong, Cheow

    2015-01-01

    Building on previous works by Lord and Ogasawara for dichotomous items, this article proposes an approach to derive the asymptotic standard errors of item response theory true score equating involving polytomous items, for equivalent and nonequivalent groups of examinees. This analytical approach could be used in place of empirical methods like…

  50. [The hygienic evaluation of the mutagenic potential of industrial wastes].

    PubMed

    Zhurkov, V S; Rusakov, N V; Tonkopiĭ, N I; Sycheva, L P; Akhal'tseva, L V; Neiaskina, E V; Pirtakhiia, N V; Malysheva, A G; Rastiannikov, E G

    1998-01-01

    A combination of two approaches to assessing the carcinogenic and mutagenic potentials of industrial waste is proposed. One approach includes determination of the carcinogenic and mutagenic properties of individual chemicals of waste, the other involves biological indication of the cumulative mutagenic activity of waste samples. The mutagenic potential of some waste samples of aircraft industry was determined.

  15. Teaching Simulation and Computer-Aided Separation Optimization in Liquid Chromatography by Means of Illustrative Microsoft Excel Spreadsheets

    ERIC Educational Resources Information Center

    Fasoula, S.; Nikitas, P.; Pappa-Louisi, A.

    2017-01-01

    A series of Microsoft Excel spreadsheets were developed to simulate the process of separation optimization under isocratic and simple gradient conditions. The optimization procedure is performed in a stepwise fashion using simple macros for an automatic application of this approach. The proposed optimization approach involves modeling of the peak…

  16. Writing Abstracts for MLIS Research Proposals Using Worked Examples: An Innovative Approach to Teaching the Elements of Research Design

    ERIC Educational Resources Information Center

    Ondrusek, Anita L.; Thiele, Harold E.; Yang, Changwoo

    2014-01-01

    The authors examined abstracts written by graduate students for their research proposals as a requirement for a course in research methods in a distance learning MLIS program. The students learned under three instructional conditions that involved varying levels of access to worked examples created from abstracts representing research in the LIS…

  17. Link-Based Similarity Measures Using Reachability Vectors

    PubMed Central

    Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin

    2014-01-01

    We present a novel approach for accurately computing link-based similarities among objects by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach, each target object is represented by a vector. The vector has one element for each object in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching the corresponding object from the target object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
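
    A minimal sketch of the two core steps on a simple symmetric link graph: the reachability vector of each object is computed by Random Walk with Restart iterations, and similarity is the cosine of two such vectors. The restart probability and iteration count are illustrative choices, not the paper's settings.

    ```python
    import numpy as np

    def rwr_vector(A, target, restart=0.15, n_iter=100):
        """Reachability vector of one object: stationary probabilities of a
        Random Walk with Restart started (and restarted) at `target`."""
        P = A / A.sum(axis=0, keepdims=True)   # column-stochastic transitions
        e = np.zeros(A.shape[0]); e[target] = 1.0
        r = e.copy()
        for _ in range(n_iter):
            r = (1 - restart) * P @ r + restart * e
        return r

    def link_similarity(A, i, j):
        """Similarity of objects i and j: cosine of their reachability vectors."""
        ri, rj = rwr_vector(A, i), rwr_vector(A, j)
        return float(ri @ rj) / (np.linalg.norm(ri) * np.linalg.norm(rj))

    # Toy link graph (path 0-1-2-3); every node must have at least one link.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(link_similarity(A, 0, 2))   # nearby objects score higher
    print(link_similarity(A, 0, 3))   # than objects at opposite ends
    ```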

  18. Distributed reinforcement learning for adaptive and robust network intrusion response

    NASA Astrophysics Data System (ADS)

    Malialis, Kleanthis; Devlin, Sam; Kudenko, Daniel

    2015-07-01

    Distributed denial of service (DDoS) attacks constitute a rapidly evolving threat in the current Internet. Multiagent Router Throttling is a novel approach to defend against DDoS attacks where multiple reinforcement learning agents are installed on a set of routers and learn to rate-limit or throttle traffic towards a victim server. The focus of this paper is on online learning and scalability. We propose an approach that incorporates task decomposition, team rewards and a form of reward shaping called difference rewards. One of the novel characteristics of the proposed system is that it provides a decentralised coordinated response to the DDoS problem, thus being resilient to DDoS attacks themselves. The proposed system learns remarkably fast, thus being suitable for online learning. Furthermore, its scalability is successfully demonstrated in experiments involving 1000 learning agents. We compare our approach against a baseline and a popular state-of-the-art throttling technique from the network security literature and show that the proposed approach is more effective, adaptive to sophisticated attack rate dynamics and robust to agent failures.
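
    Difference rewards are commonly defined as D_i = G(z) - G(z_-i): the global reward minus the global reward recomputed with agent i's action replaced by a default. The sketch below shows that shaping signal on a toy throttling reward; the reward function and its parameter values are hypothetical, not the paper's router model.

    ```python
    def difference_reward(global_reward, joint_action, agent, default_action):
        """D_i = G(z) - G(z_-i): the global reward minus the global reward
        recomputed with agent i's action replaced by a fixed default.
        `global_reward` maps a full joint action (a dict) to a scalar."""
        counterfactual = dict(joint_action)
        counterfactual[agent] = default_action
        return global_reward(joint_action) - global_reward(counterfactual)

    # Toy throttling example (hypothetical): routers choose drop rates and the
    # global reward penalizes traffic reaching the victim above its capacity.
    def G(actions, load=10.0, capacity=4.0):
        arriving = load
        for rate in actions.values():
            arriving *= (1.0 - rate)
        return -abs(arriving - capacity)

    joint = {"r1": 0.3, "r2": 0.5}
    print(difference_reward(G, joint, "r1", default_action=0.0))  # 0.5
    ```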

  19. Characterization of systemic disease in primary Sjögren's syndrome: EULAR-SS Task Force recommendations for articular, cutaneous, pulmonary and renal involvements.

    PubMed

    Ramos-Casals, Manuel; Brito-Zerón, Pilar; Seror, Raphaèle; Bootsma, Hendrika; Bowman, Simon J; Dörner, Thomas; Gottenberg, Jacques-Eric; Mariette, Xavier; Theander, Elke; Bombardieri, Stefano; De Vita, Salvatore; Mandl, Thomas; Ng, Wan-Fai; Kruize, Aike; Tzioufas, Athanasios; Vitali, Claudio

    2015-12-01

    To reach a European consensus on the definition and characterization of the main organ-specific extraglandular manifestations in primary SS. The EULAR-SS Task Force Group steering committee agreed to approach SS-related systemic involvement according to the EULAR SS Disease Activity Index (ESSDAI) classification and proposed the preparation of four separate manuscripts: articular, cutaneous, pulmonary and renal ESSDAI involvement; muscular, peripheral nervous system, CNS and haematological ESSDAI involvement; organs not included in the ESSDAI classification; and lymphoproliferative disease. Currently available evidence was obtained by a systematic literature review focused on SS-related systemic features. The following information was summarized for articular, cutaneous, pulmonary and renal involvement: a clear, consensual definition of the clinical feature; a brief epidemiological description, including an estimate of the prevalence reported in the main clinical series; and a brief list of the key clinical and diagnostic features that could help physicians clearly identify these features. Unfortunately, we found that the body of evidence relied predominantly on information retrieved from individual cases, and the scientific information provided was heterogeneous. The analysis of types of involvement was biased due to the unbalanced reporting of severe cases over non-severe cases, although the main sources of bias were the heterogeneous definitions of organ involvement (or even the lack of a definition in some studies) and the heterogeneous diagnostic approaches used in studies to investigate involvement of each organ. The proposals included in this article are a first step towards developing an optimal diagnostic approach to systemic involvement in primary SS and may pave the way for further development of evidence-based diagnostic and therapeutic guidelines. © The Author 2015. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Application of machine learning methods for traffic signs recognition

    NASA Astrophysics Data System (ADS)

    Filatov, D. V.; Ignatev, K. V.; Deviatkin, A. V.; Serykh, E. V.

    2018-02-01

    This paper focuses on solving a relevant and pressing safety issue on intercity roads. Two approaches to the problem of traffic sign recognition were considered; both involve neural networks that analyze images obtained from a camera in real time. The first approach is based on sequential image processing. At the initial stage, with the help of color filters and morphological operations (dilation and erosion), the area containing the traffic sign is located in the image; the selected and scaled fragment of the image is then analyzed using a feedforward neural network to determine the meaning of the found traffic sign. Learning of the neural network in this approach is carried out using the backpropagation method. The second approach involves convolutional neural networks at both stages, i.e. when searching for and selecting the area of the image containing the traffic sign, and when determining its meaning. Learning of the neural network in the second approach is carried out using the intersection over union function and a loss function. To train the neural networks and test the proposed algorithms, a series of dash-cam videos shot under various weather and illumination conditions was used. As a result, the proposed approaches to traffic sign recognition were analyzed and compared by key indicators such as the recognition rate percentage and the complexity of the neural networks’ learning process.
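
    The intersection over union criterion mentioned for the second approach scores how well a predicted sign bounding box overlaps the ground truth. A minimal sketch for axis-aligned boxes:

    ```python
    def iou(box_a, box_b):
        """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
        ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
        ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
        inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
        area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0

    # Predicted vs. ground-truth sign locations: IoU near 1 means a tight match.
    print(iou((10, 10, 50, 50), (20, 20, 60, 60)))  # about 0.39
    ```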

  1. Automatic selection of landmarks in T1-weighted head MRI with regression forests for image registration initialization.

    PubMed

    Wang, Jianing; Liu, Yuan; Noble, Jack H; Dawant, Benoit M

    2017-10-01

    Medical image registration establishes a correspondence between images of biological structures, and it is at the core of many applications. Commonly used deformable image registration methods depend on a good preregistration initialization. We develop a learning-based method to automatically find a set of robust landmarks in three-dimensional MR image volumes of the head. These landmarks are then used to compute a thin plate spline-based initialization transformation. The process involves two steps: (1) identifying a set of landmarks that can be reliably localized in the images and (2) selecting among them the subset that leads to a good initial transformation. To validate our method, we use it to initialize five well-established deformable registration algorithms that are subsequently used to register an atlas to MR images of the head. We compare our proposed initialization method with a standard approach that involves estimating an affine transformation with an intensity-based method. We show that for all five registration algorithms the final registration results are statistically better when they are initialized with the method that we propose than when the standard approach is used. The technique that we propose is generic and could be used to initialize nonrigid registration algorithms for other applications.
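
    As a sketch of the landmark-driven initialization, SciPy's RBFInterpolator with a thin plate spline kernel can fit the warp that carries atlas landmarks onto their localized counterparts. The coordinates below are hypothetical, and the code is a generic TPS fit, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Landmarks in the atlas and their automatically localized counterparts
    # in the subject volume (hypothetical coordinates, in mm).
    atlas_pts = np.array([[10., 20., 30.], [40., 25., 33.], [22., 60., 31.],
                          [35., 45., 60.], [15., 50., 55.]])
    subject_pts = atlas_pts + np.array([2.0, -1.5, 3.0]) + \
        0.5 * np.random.default_rng(0).normal(size=atlas_pts.shape)

    # Thin plate spline initialization: one smooth map per output coordinate.
    tps = RBFInterpolator(atlas_pts, subject_pts, kernel="thin_plate_spline")

    # Warp any atlas point into subject space before deformable registration.
    print(tps(np.array([[25.0, 40.0, 40.0]])))
    ```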

  2. A Generic Deep-Learning-Based Approach for Automated Surface Inspection.

    PubMed

    Ren, Ruoxu; Hung, Terence; Tan, Kay Chen

    2018-03-01

    Automated surface inspection (ASI) is a challenging task in industry, as collecting a training dataset is usually costly and related methods are highly dataset-dependent. In this paper, a generic approach that requires little training data for ASI is proposed. First, this approach builds a classifier on the features of image patches, where the features are transferred from a pretrained deep learning network. Next, pixel-wise prediction is obtained by convolving the trained classifier over the input image. Experiments on three public data sets and one industrial data set are carried out. The experiments involve two tasks: 1) image classification and 2) defect segmentation. The results of the proposed algorithm are compared against several of the best benchmarks in the literature. In the classification tasks, the proposed method improves accuracy by 0.66%-25.50%. In the segmentation tasks, the proposed method reduces error escape rates by 6.00%-19.00% in three defect types and improves accuracies by 2.29%-9.86% in all seven defect types. In addition, the proposed method achieves a 0.0% error escape rate in the segmentation task on the industrial data.
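
    A minimal sketch of the small-data recipe, with stand-in arrays in place of real transferred descriptors: train a conventional classifier on patch features assumed to come from a pretrained network, then score patch descriptors to build a pixel-wise defect map. Dimensions, data, and the SVM choice are illustrative.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Stand-in for transferred features: assume each image patch has already
    # been mapped to a 512-d descriptor by a pretrained network.
    n_patches, dim = 200, 512
    feats = rng.normal(size=(n_patches, dim))
    labels = rng.integers(0, 2, size=n_patches)   # 1 = defect, 0 = normal
    feats[labels == 1, :8] += 2.0                 # synthetic separability

    # Small-data regime: train a conventional classifier on the deep features.
    clf = SVC(kernel="linear", probability=True).fit(feats, labels)

    # Pixel-wise prediction: score the descriptor of the patch centred at each
    # pixel (sketched here for one row of hypothetical patch descriptors).
    row_descriptors = rng.normal(size=(16, dim))
    heatmap_row = clf.predict_proba(row_descriptors)[:, 1]
    print(heatmap_row.round(2))
    ```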

  3. A two-stage Monte Carlo approach to the expression of uncertainty with finite sample sizes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowder, Stephen Vernon; Moyer, Robert D.

    2005-05-01

    Proposed supplement I to the GUM outlines a 'propagation of distributions' approach to deriving the distribution of a measurand for any non-linear function and for any set of random inputs. The supplement's proposed Monte Carlo approach assumes that the distributions of the random inputs are known exactly. This implies that the sample sizes are effectively infinite. In this case, the mean of the measurand can be determined precisely using a large number of Monte Carlo simulations. In practice, however, the distributions of the inputs will rarely be known exactly, but must be estimated using possibly small samples. If these approximated distributions are treated as exact, the uncertainty in estimating the mean is not properly taken into account. In this paper, we propose a two-stage Monte Carlo procedure that explicitly takes into account the finite sample sizes used to estimate parameters of the input distributions. We will illustrate the approach with a case study involving the efficiency of a thermistor mount power sensor. The performance of the proposed approach will be compared to the standard GUM approach for finite samples using simple non-linear measurement equations. We will investigate performance in terms of coverage probabilities of derived confidence intervals.
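
    A toy version of the two-stage idea, assuming Gaussian inputs and a parametric bootstrap for the outer stage; this is one plausible way to propagate finite-sample parameter uncertainty, and the paper's exact scheme may differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Small observed samples of two Gaussian inputs (finite sample sizes).
    x1_obs = rng.normal(2.0, 0.1, size=8)
    x2_obs = rng.normal(5.0, 0.2, size=8)

    def measurand(x1, x2):
        return x1 * x2      # toy non-linear measurement equation

    M_outer, M_inner = 200, 2000
    y = []
    for _ in range(M_outer):
        # Stage 1: draw plausible input-distribution parameters by a
        # parametric bootstrap of each small sample (one choice of many).
        b1 = rng.choice(x1_obs, size=x1_obs.size, replace=True)
        b2 = rng.choice(x2_obs, size=x2_obs.size, replace=True)
        mu1, s1 = b1.mean(), b1.std(ddof=1)
        mu2, s2 = b2.mean(), b2.std(ddof=1)
        # Stage 2: standard propagation of distributions given those parameters.
        x1 = rng.normal(mu1, s1, size=M_inner)
        x2 = rng.normal(mu2, s2, size=M_inner)
        y.append(measurand(x1, x2))

    y = np.concatenate(y)
    lo, hi = np.percentile(y, [2.5, 97.5])
    print(f"95% interval for the measurand: [{lo:.3f}, {hi:.3f}]")
    ```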

  4. An Analytics-Based Approach to Managing Cognitive Load by Using Log Data of Learning Management Systems and Footprints of Social Media

    ERIC Educational Resources Information Center

    Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru

    2015-01-01

    Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…

  5. Longitudinal Changes in Behavioral Approach System Sensitivity and Brain Structures Involved in Reward Processing during Adolescence

    ERIC Educational Resources Information Center

    Urosevic, Snezana; Collins, Paul; Muetzel, Ryan; Lim, Kelvin; Luciana, Monica

    2012-01-01

    Adolescence is a period of radical normative changes and increased risk for substance use, mood disorders, and physical injury. Researchers have proposed that increases in reward sensitivity (i.e., sensitivity of the behavioral approach system [BAS]) and/or increases in reactivity to all emotional stimuli (i.e., reward and threat sensitivities)…

  6. A latent discriminative model-based approach for classification of imaginary motor tasks from EEG data.

    PubMed

    Saa, Jaime F Delgado; Çetin, Müjdat

    2012-04-01

    We consider the problem of classification of imaginary motor tasks from electroencephalography (EEG) data for brain-computer interfaces (BCIs) and propose a new approach based on hidden conditional random fields (HCRFs). HCRFs are discriminative graphical models that are attractive for this problem because they (1) exploit the temporal structure of EEG; (2) include latent variables that can be used to model different brain states in the signal; and (3) involve learned statistical models matched to the classification task, avoiding some of the limitations of generative models. Our approach involves spatial filtering of the EEG signals and estimation of power spectra based on autoregressive modeling of temporal segments of the EEG signals. Given this time-frequency representation, we select certain frequency bands that are known to be associated with execution of motor tasks. These selected features constitute the data that are fed to the HCRF, parameters of which are learned from training data. Inference algorithms on the HCRFs are used for the classification of motor tasks. We experimentally compare this approach to the best performing methods in BCI competition IV as well as a number of more recent methods and observe that our proposed method yields better classification accuracy.

  7. Brute-Force Approach for Mass Spectrometry-Based Variant Peptide Identification in Proteogenomics without Personalized Genomic Data

    NASA Astrophysics Data System (ADS)

    Ivanov, Mark V.; Lobas, Anna A.; Levitsky, Lev I.; Moshkovskii, Sergei A.; Gorshkov, Mikhail V.

    2018-02-01

    In a proteogenomic approach based on tandem mass spectrometry analysis of proteolytic peptide mixtures, customized exome or RNA-seq databases are employed for identifying protein sequence variants. However, the problem of variant peptide identification without personalized genomic data is important for a variety of applications. Following the recent proposal by Chick et al. (Nat. Biotechnol. 33, 743-749, 2015) on the feasibility of such a variant peptide search, we evaluated two available approaches based on the previously suggested “open” search and the “brute-force” strategy. To improve the efficiency of these approaches, we propose an algorithm for the exclusion of false variant identifications from the search results, involving analysis of modifications mimicking single amino acid substitutions. Also, we propose a de novo-based scoring scheme for the assessment of identified point mutations. In this scheme, the search engine analyzes y-type fragment ions in MS/MS spectra to confirm the location of the mutation in the variant peptide sequence.

  8. Doppler-shift estimation of flat underwater channel using data-aided least-square approach

    NASA Astrophysics Data System (ADS)

    Pan, Weiqiang; Liu, Ping; Chen, Fangjiong; Ji, Fei; Feng, Jing

    2015-06-01

    In this paper we propose a data-aided Doppler estimation method for underwater acoustic communication. The training sequence is not dedicated; hence it can be designed for Doppler estimation as well as channel equalization. We assume the channel has been equalized and consider only a flat-fading channel. First, the theoretical received sequence is composed from the training symbols. Next, the least-squares principle is applied to build the objective function, which minimizes the error between the composed and the actual received signals. Then an iterative approach is applied to solve the least-squares problem. The proposed approach involves an outer loop and an inner loop, which resolve the channel gain and the Doppler coefficient, respectively. The theoretical performance bound, i.e. the Cramer-Rao lower bound (CRLB) of the estimation, is also derived. Computer simulation results show that the proposed algorithm achieves the CRLB in medium- to high-SNR cases.
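
    A sketch of the data-aided estimator under a flat-channel model r(t) = a*s((1+d)t): for each candidate Doppler coefficient the channel gain has a closed-form least-squares solution, and the best pair minimizes the residual. A grid search stands in for the paper's iterative loops; the chirp training signal and all values are illustrative.

    ```python
    import numpy as np

    def estimate_doppler(r, s, t, deltas):
        """Grid/LS estimation of the Doppler coefficient: for each candidate,
        solve the channel gain in closed form and keep the smallest residual."""
        best = (np.inf, None, None)
        for d in deltas:
            s_d = np.interp((1.0 + d) * t, t, s)   # Doppler-compressed replica
            a = (s_d @ r) / (s_d @ s_d)            # LS gain for this candidate
            err = np.sum((r - a * s_d) ** 2)
            if err < best[0]:
                best = (err, d, a)
        return best[1], best[2]

    # Toy flat channel: gain 0.8, Doppler coefficient 1e-3 on a chirp signal.
    fs, T = 8000.0, 0.5
    t = np.arange(0.0, T, 1.0 / fs)
    s = np.sin(2 * np.pi * (200 * t + 300 * t ** 2))
    r = 0.8 * np.interp(1.001 * t, t, s)

    deltas = np.linspace(-2e-3, 2e-3, 81)
    d_hat, a_hat = estimate_doppler(r, s, t, deltas)
    print(d_hat, a_hat)   # should approach (1e-3, 0.8)
    ```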

  9. A Fast Approach for Stitching of Aerial Images

    NASA Astrophysics Data System (ADS)

    Moussa, A.; El-Sheimy, N.

    2016-06-01

    The last few years have witnessed an increasing volume of aerial image data because of the extensive improvements of Unmanned Aerial Vehicles (UAVs). These newly developed UAVs have led to a wide variety of applications. A fast assessment of the achieved coverage and overlap of the images acquired during a UAV flight mission is of great help in saving the time and cost of subsequent steps. Fast automatic stitching of the acquired images can help to visually assess the achieved coverage and overlap during the flight mission. This paper proposes an automatic image stitching approach that creates a single overview stitched image from the images acquired during a UAV flight mission, along with a coverage image that represents the count of overlaps between the acquired images. The main challenge of such a task is the huge number of images that are typically involved in such scenarios. A short flight mission with an image acquisition rate of one image per second can capture hundreds to thousands of images. The main focus of the proposed approach is to reduce the processing time of the image stitching procedure by exploiting the initial knowledge about the image positions provided by the navigation sensors. The proposed approach also avoids solving for the transformation parameters of all the photos together, to save the long computation time expected if all the parameters were considered simultaneously. After extracting the points of interest of all the involved images using the Scale-Invariant Feature Transform (SIFT) algorithm, the proposed approach uses the images' initial coordinates to build an incremental constrained Delaunay triangulation that represents the neighborhood of each image. This triangulation helps to match only neighboring images and therefore reduces the time-consuming feature matching step. The estimated relative orientation between the matched images is used to find a candidate seed image for the stitching process. The pre-estimated transformation parameters of the images are employed successively, in a growing fashion, to create the stitched image and the coverage image. The proposed approach is implemented and tested on the images acquired through a UAV flight mission, and the achieved results are presented and discussed.
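
    The neighborhood step can be sketched with SciPy's Delaunay triangulation over the navigation-reported image positions; the paper builds an incremental constrained triangulation, while plain Delaunay is used here for brevity. Each triangle edge yields a candidate image pair for SIFT matching. The GPS positions below are hypothetical.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    def neighbor_pairs(image_xy):
        """Pairs of images to match, restricted to Delaunay neighbors of the
        approximate capture positions reported by the navigation sensors."""
        tri = Delaunay(image_xy)
        pairs = set()
        for simplex in tri.simplices:       # each triangle links 3 images
            for i in range(3):
                a, b = simplex[i], simplex[(i + 1) % 3]
                pairs.add((min(a, b), max(a, b)))
        return sorted(pairs)

    # Hypothetical positions (metres) of six images along two flight lines.
    xy = np.array([[0, 0], [30, 2], [60, 1], [5, 40], [35, 42], [62, 39]])
    print(neighbor_pairs(xy))   # only nearby images are matched with SIFT
    ```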

  10. A Big Empty Space

    ERIC Educational Resources Information Center

    Blake, Anthony; Francis, David

    1973-01-01

    Approaches to developing management ability include systematic techniques, mental enlargement, self-analysis, and job-related counseling. A method is proposed to integrate them into a responsive program involving depth understanding, vision of the future, specialization, commitment to change, and self-monitoring control. (MS)

  11. Automatic seizure detection based on the combination of newborn multi-channel EEG and HRV information

    NASA Astrophysics Data System (ADS)

    Mesbah, Mostefa; Balakrishnan, Malarvili; Colditz, Paul B.; Boashash, Boualem

    2012-12-01

    This article proposes a new method for newborn seizure detection that uses information extracted from both multi-channel electroencephalogram (EEG) and a single channel electrocardiogram (ECG). The aim of the study is to assess whether additional information extracted from ECG can improve the performance of seizure detectors based solely on EEG. Two different approaches were used to combine this extracted information. The first approach, known as feature fusion, involves combining features extracted from EEG and heart rate variability (HRV) into a single feature vector prior to feeding it to a classifier. The second approach, called classifier or decision fusion, is achieved by combining the independent decisions of the EEG and the HRV-based classifiers. Tested on recordings obtained from eight newborns with identified EEG seizures, the proposed neonatal seizure detection algorithms achieved 95.20% sensitivity and 88.60% specificity for the feature fusion case and 95.20% sensitivity and 94.30% specificity for the classifier fusion case. These results are considerably better than those involving classifiers using EEG only (80.90%, 86.50%) or HRV only (85.70%, 84.60%).
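
    The two fusion schemes differ only in where the combination happens, as the sketch below illustrates with synthetic stand-in features: feature fusion concatenates EEG and HRV features before one classifier, while classifier fusion averages the probabilities of two independent classifiers. Data, dimensions, and the logistic-regression choice are illustrative, not the paper's detectors.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Hypothetical per-epoch features: 6 EEG and 3 HRV features,
    # with labels 1 = seizure, 0 = non-seizure (synthetic stand-in data).
    n = 400
    y = rng.integers(0, 2, size=n)
    eeg = rng.normal(y[:, None] * 0.8, 1.0, size=(n, 6))
    hrv = rng.normal(y[:, None] * 0.5, 1.0, size=(n, 3))

    # Feature fusion: concatenate both feature vectors, train one classifier.
    clf_fused = LogisticRegression().fit(np.hstack([eeg, hrv]), y)

    # Classifier (decision) fusion: train one classifier per modality and
    # combine their independent probabilities, here by averaging.
    clf_eeg = LogisticRegression().fit(eeg, y)
    clf_hrv = LogisticRegression().fit(hrv, y)
    p = 0.5 * (clf_eeg.predict_proba(eeg)[:, 1] + clf_hrv.predict_proba(hrv)[:, 1])
    fused_decision = (p >= 0.5).astype(int)

    print(clf_fused.score(np.hstack([eeg, hrv]), y))  # feature-fusion accuracy
    print((fused_decision == y).mean())               # decision-fusion accuracy
    ```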

  12. Pythagorean fuzzy analytic hierarchy process to multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Mohd, Wan Rosanisah Wan; Abdullah, Lazim

    2017-11-01

    Numerous approaches have been proposed in the literature to determine criteria weights. The weights of criteria are very significant in the decision-making process. One of the outstanding approaches used to determine criteria weights is the analytic hierarchy process (AHP). This method involves decision makers (DMs) who evaluate the decision by forming pairwise comparisons between criteria and alternatives. In classical AHP, the linguistic variables of the pairwise comparisons are presented in terms of crisp values. However, crisp values are not appropriate for representing the real situation of such problems, because linguistic judgment involves uncertainty. For this reason, AHP has been extended by incorporating Pythagorean fuzzy sets. In addition, no work in the literature has proposed how to determine the weights of criteria using AHP under Pythagorean fuzzy sets. In order to solve MCDM problems, the Pythagorean fuzzy analytic hierarchy process is proposed to determine the weights of the evaluation criteria. Using linguistic variables, pairwise comparisons of the evaluation criteria are made with Pythagorean fuzzy numbers (PFNs). The proposed method is implemented in an evaluation problem in order to demonstrate its applicability. This study shows that the proposed method provides a useful way and a new direction for solving MCDM problems in a Pythagorean fuzzy context.
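
    For orientation, the classical crisp AHP step that the paper extends derives criteria weights from the principal eigenvector of the pairwise comparison matrix and checks consistency; the Pythagorean fuzzy version replaces the crisp judgements with PFNs. The matrix below is illustrative.

    ```python
    import numpy as np

    def ahp_weights(M):
        """Criteria weights from a pairwise comparison matrix: the principal
        eigenvector of M, normalized to sum to one (classical crisp AHP)."""
        vals, vecs = np.linalg.eig(M)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        return w / w.sum()

    # Saaty-style 1-9 judgements for three criteria (illustrative values).
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])
    print(ahp_weights(M))   # roughly [0.65, 0.23, 0.12]

    # Consistency ratio check (RI = 0.58 for n = 3 in Saaty's table).
    lam = np.max(np.linalg.eigvals(M).real)
    CI = (lam - 3) / (3 - 1)
    print(CI / 0.58)   # should be below 0.1 for acceptable consistency
    ```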

  13. An improved input shaping design for an efficient sway control of a nonlinear 3D overhead crane with friction

    NASA Astrophysics Data System (ADS)

    Maghsoudi, Mohammad Javad; Mohamed, Z.; Sudin, S.; Buyamin, S.; Jaafar, H. I.; Ahmad, S. M.

    2017-08-01

    This paper proposes an improved input shaping scheme for efficient sway control of a nonlinear three-dimensional (3D) overhead crane with friction, using the particle swarm optimization (PSO) algorithm. With this approach, a higher payload sway reduction is obtained, as the input shaper is designed based on a complete nonlinear model, in contrast to the analytical input shaping scheme derived using a linear second-order model. Zero Vibration (ZV) and Distributed Zero Vibration (DZV) shapers are designed using both the analytical and PSO approaches for sway control of rail and trolley movements. To test the effectiveness of the proposed approach, MATLAB simulations and experiments on a laboratory 3D overhead crane are performed under various conditions involving different cable lengths and sway frequencies. Their performances are studied based on the maximum residual payload sway and Integrated Absolute Error (IAE) values, which indicate the total payload sway of the crane. In experiments, the superiority of the proposed approach over the analytical design is shown by 30-50% reductions of the IAE values for rail and trolley movements, for both ZV and DZV shapers. In addition, simulation results show higher sway reductions with the proposed approach. It is revealed that the proposed PSO-based input shaping design provides higher payload sway reductions for a 3D overhead crane with friction compared to the commonly designed input shapers.
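
    The analytical ZV shaper that serves as the baseline has a standard closed form: two impulses whose amplitudes depend on the damping ratio and whose spacing is half the damped period (the paper instead tunes the shaper with PSO on the full nonlinear model). A minimal sketch with an illustrative sway mode:

    ```python
    import numpy as np

    def zv_shaper(freq_hz, zeta):
        """Impulse amplitudes and times of a Zero Vibration (ZV) shaper for a
        mode with natural frequency `freq_hz` (Hz) and damping ratio `zeta`."""
        K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta ** 2))
        Td = 1.0 / (freq_hz * np.sqrt(1.0 - zeta ** 2))   # damped period (s)
        amplitudes = np.array([1.0, K]) / (1.0 + K)
        times = np.array([0.0, Td / 2.0])
        return amplitudes, times

    # Sway mode of a crane payload, e.g. 0.5 Hz with light damping.
    A, t = zv_shaper(0.5, 0.01)
    print(A, t)   # convolving these impulses with any command cancels the sway
    ```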

  14. Fatigue-related crashes involving express buses in Malaysia: will the proposed policy of banning the early-hour operation reduce fatigue-related crashes and benefit overall road safety?

    PubMed

    Mohamed, Norlen; Mohd-Yusoff, Mohammad-Fadhli; Othman, Ilhamah; Zulkipli, Zarir-Hafiz; Osman, Mohd Rasid; Voon, Wong Shaw

    2012-03-01

    Fatigue-related crashes have long been the topic of discussion and study worldwide. The relationship between fatigue-related crashes and time of day is well documented. In Malaysia, the possibility of banning express buses from operating during the early hours of the morning has emerged as an important consideration for passenger safety. This paper highlights the findings of an impact assessment study. The study was conducted to determine all possible impacts prior to the government making any decision on the proposed banning. This study is an example of a simple and inexpensive approach that may inform future policy-making processes. The impact assessment comprised two major steps. The first step involved profiling existing operation scenarios and gathering information on crashes involving public express buses and stakeholders' views. The second step involved a qualitative impact assessment analysis using all information gathered during the profiling stage to describe the possible impacts. Based on the assessment, the move to ban early-hour operations could possibly result in further negative impacts on the overall road safety agenda. These negative impacts may occur if fundamental issues, such as driving and working hours and the need for rest and sleep facilities for drivers, are not addressed. In addition, a safer and more accessible public transportation system would be required as an alternative for those who choose to travel at night. The proposed banning of early-hour operations is also not a feasible solution for the sustainability of express bus operations in Malaysia, especially for those operating long journeys. The paper concludes by highlighting the need to design a more holistic approach for preventing fatigue-related crashes involving express buses in Malaysia. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Word-level language modeling for P300 spellers based on discriminative graphical models

    NASA Astrophysics Data System (ADS)

    Delgado Saa, Jaime F.; de Pesters, Adriana; McFarland, Dennis; Çetin, Müjdat

    2015-04-01

    Objective. In this work we propose a probabilistic graphical model framework that uses language priors at the level of words as a mechanism to increase the performance of P300-based spellers. Approach. This paper is concerned with brain-computer interfaces based on P300 spellers. Motivated by P300 spelling scenarios involving communication based on a limited vocabulary, we propose a probabilistic graphical model framework and an associated classification algorithm that uses learned statistical models of language at the level of words. Exploiting such high-level contextual information helps reduce the error rate of the speller. Main results. Our experimental results demonstrate that the proposed approach offers several advantages over existing methods. Most importantly, it increases the classification accuracy while reducing the number of times the letters need to be flashed, increasing the communication rate of the system. Significance. The proposed approach models all the variables in the P300 speller in a unified framework and has the capability to correct errors in previous letters in a word, given the data for the current one. The structure of the model we propose allows the use of efficient inference algorithms, which in turn makes it possible to use this approach in real-time applications.

  16. A FFT-based formulation for efficient mechanical fields computation in isotropic and anisotropic periodic discrete dislocation dynamics

    NASA Astrophysics Data System (ADS)

    Bertin, N.; Upadhyay, M. V.; Pradalier, C.; Capolungo, L.

    2015-09-01

    In this paper, we propose a novel full-field approach based on the fast Fourier transform (FFT) technique to compute mechanical fields in periodic discrete dislocation dynamics (DDD) simulations for anisotropic materials: the DDD-FFT approach. By coupling the FFT-based approach to the discrete continuous model, the present approach benefits from the high computational efficiency of the FFT algorithm, while allowing for a discrete representation of dislocation lines. It is demonstrated that the computational time associated with the new DDD-FFT approach is significantly lower than that of current DDD approaches when large numbers of dislocation segments are involved, for both isotropic and anisotropic elasticity. Furthermore, for fine Fourier grids, the treatment of anisotropic elasticity comes at a computational cost similar to that of an isotropic simulation. Thus, the proposed approach paves the way towards achieving scale transition from DDD to mesoscale plasticity, especially due to the method’s ability to incorporate inhomogeneous elasticity.

  17. Activity Detection and Retrieval for Image and Video Data with Limited Training

    DTIC Science & Technology

    2015-06-10

    applications. Here we propose two techniques for image segmentation. The first involves an automata-based multiple threshold selection scheme, where a mixture of Gaussians is fitted to the... For our second approach to segmentation, we employ a region-based segmentation technique that is capable of handling intensity inhomogeneity...

  18. Topology-changing shape optimization with the genetic algorithm

    NASA Astrophysics Data System (ADS)

    Lamberson, Steven E., Jr.

    The goal is to take a traditional shape optimization problem statement and modify it slightly to allow for prescribed changes in topology. This modification enables greater flexibility in the choice of parameters for the topology optimization problem, while improving the direct physical relevance of the results. It involves changing the optimization problem statement from a nonlinear programming problem into a form of mixed-discrete nonlinear programming problem. The present work demonstrates one possible way of using the Genetic Algorithm (GA) to solve such a problem, including the use of "masking bits" and a new modification to the bit-string affinity (BSA) termination criterion specifically designed for problems with "masking bits." A simple ten-bar truss problem proves the utility of the modified BSA for this type of problem. A more complicated two-dimensional bracket problem is solved using both the proposed approach and a more traditional topology optimization approach (Solid Isotropic Microstructure with Penalization, or SIMP) to enable comparison. The proposed approach is able to solve problems with both local and global constraints, which is something traditional methods cannot do. The proposed approach has a significantly higher computational burden, on the order of 100 times larger than SIMP, although it is able to offset this with parallel computing.
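
    Bit-string affinity measures how far each bit of the population has converged toward unanimity. The sketch below shows the standard BSA plus one plausible reading of the masking-bit modification, in which a member's geometry bits are scored only over the individuals that switch that member on; the work's exact modified formula is not reproduced here, so the masked variant is an assumption.

    ```python
    import numpy as np

    def bit_string_affinity(bits):
        """Standard BSA: mean per-bit agreement of the population, scaled so
        a 50/50 split scores 0 and a unanimous bit scores 1."""
        p = bits.mean(axis=0)             # per-bit frequency of '1'
        return float(np.mean(np.abs(p - 0.5) * 2.0))

    def masked_bsa(mask_bits, geom_bits):
        """Masked variant (an assumed reading, not the exact formula):
        masking bits always count, while a member's geometry bits count
        only over the individuals in which that member is switched on."""
        scores = [bit_string_affinity(mask_bits)]
        for m in range(mask_bits.shape[1]):
            active = mask_bits[:, m] == 1
            if active.sum() > 1:
                scores.append(bit_string_affinity(geom_bits[active, :, m]))
        return float(np.mean(scores))

    # Population of 6 designs, 3 candidate members, 4 geometry bits per member.
    rng = np.random.default_rng(2)
    mask = rng.integers(0, 2, size=(6, 3))
    geom = rng.integers(0, 2, size=(6, 4, 3))
    print(masked_bsa(mask, geom))   # terminate the GA when this nears 1.0
    ```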

  19. A Science Data System Approach for the SMAP Mission

    NASA Technical Reports Server (NTRS)

    Woollard, David; Kwoun, Oh-ig; Bicknell, Tom; West, Richard; Leung, Kon

    2009-01-01

    Though Science Data System (SDS) development has not traditionally been part of the mission concept phase, lessons learned and study of past Earth science missions indicate that SDS functionality can greatly benefit algorithm developers in all mission phases. We have proposed a SDS approach for the SMAP Mission that incorporates early support for an algorithm testbed, allowing scientists to develop codes and seamlessly integrate them into the operational SDS. This approach will greatly reduce both the costs and risks involved in algorithm transitioning and SDS development.

  20. Current Concepts of Bruxism.

    PubMed

    Manfredini, Daniele; Serra-Negra, Junia; Carboncini, Fabio; Lobbezoo, Frank

    Bruxism is a common phenomenon, and emerging evidence suggests that biologic, psychologic, and exogenous factors have greater involvement than morphologic factors in its etiology. Diagnosis should adopt the grading system of possible, probable, and definite. In children, it could be a warning sign of certain psychologic disorders. The proposed mechanism for the bruxism-pain relationship at the individual level is that stress sensitivity and anxious personality traits may be responsible for bruxism activities that may lead to temporomandibular pain, which in turn is modulated by psychosocial factors. A multiple-P (plates, pep talk, psychology, pills) approach involving reversible treatments is recommended, and adult prosthodontic management should be based on a common-sense cautionary approach.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupar, J.; Hasek, M.

    The Sanitary Landfill Corrective Action Plan proposes a two-pronged approach to remediation. The first part of the total remediation strategy is the placement of a RCRA-style closure cap to provide source control of contaminants entering the groundwater. The second part of the proposed remediation package is a phased approach, primarily using an in situ bioremediation system, for groundwater cleanup of the Constituents of Concern (COCs) that exceed their proposed Alternate Concentration Limits (ACLs). The phased approach to groundwater cleanup will involve operation of the in situ bioremediation system, followed by evaluation of the Phase 1 system and, if necessary, additional phased remediation strategies. This document presents pertinent information on operations, well locations, anticipated capture zones, monitoring strategies, observation wells and other information which will allow a decision on the acceptability of the remedial strategy as an interim corrective action prior to permit application approval. The proposed interim phase of the remediation program will position two horizontal bioremediation wells such that their respective zones of influence will intersect the migration path of the highest concentrations of each plume.

  2. A single-index threshold Cox proportional hazard model for identifying a treatment-sensitive subset based on multiple biomarkers.

    PubMed

    He, Ye; Lin, Huazhen; Tu, Dongsheng

    2018-06-04

    In this paper, we introduce a single-index threshold Cox proportional hazard model to select and combine biomarkers to identify patients who may be sensitive to a specific treatment. A penalized smoothed partial likelihood is proposed to estimate the parameters in the model. A simple, efficient, and unified algorithm is presented to maximize this likelihood function. The estimators based on this likelihood function are shown to be consistent and asymptotically normal. Under mild conditions, the proposed estimators also achieve the oracle property. The proposed approach is evaluated through simulation analyses and application to the analysis of data from two clinical trials, one involving patients with locally advanced or metastatic pancreatic cancer and one involving patients with resectable lung cancer. Copyright © 2018 John Wiley & Sons, Ltd.

  3. Modeling Complex Dynamic Interactions of Nonlinear, Aeroelastic, Multistage, and Localization Phenomena in Turbine Engines

    DTIC Science & Technology

    2011-02-25

    fast method of predicting the number of iterations needed for converged results. A new hybrid technique is proposed to predict the convergence history... interchanging between the modes, whereas a smaller veering (or crossing) region shows fast mode switching. Then, the nonlinear vibration response of the... problems of interest involve dynamic (fast) crack propagation, then the nodes selected by the proposed approach at some time instant might not

  4. Achievement Goals, Reasons for Goal Pursuit, and Achievement Goal Complexes as Predictors of Beneficial Outcomes: Is the Influence of Goals Reducible to Reasons?

    ERIC Educational Resources Information Center

    Sommet, Nicolas; Elliot, Andrew J.

    2017-01-01

    In the present research, we proposed a systematic approach to disentangling the shared and unique variance explained by achievement goals, reasons for goal pursuit, and specific goal-reason combinations (i.e., achievement goal complexes). Four studies using this approach (involving nearly 1,800 participants) led to 3 basic sets of findings. First,…

  5. Anorexia nervosa and bulimia nervosa - a psychotherapeutic cognitive-constructivist approach.

    PubMed

    Abreu, Cristiano Nabuco de; Cangelli Filho, Raphael

    2017-06-01

    Of the eating disorders, anorexia nervosa and bulimia nervosa are the ones that most often lead adolescent patients, frequently female and increasingly young, to seek help. This help is provided through a multidisciplinary treatment involving psychiatrists, psychologists and dietitians. Psychotherapy has been shown to be an efficient component of these patients' improvement. The present article aims to present a proposal for psychotherapeutic treatment based on a cognitive-constructivist approach.

  6. Fuzzy set approach to quality function deployment: An investigation

    NASA Technical Reports Server (NTRS)

    Masud, Abu S. M.

    1992-01-01

    The final report of the 1992 NASA/ASEE Summer Faculty Fellowship at the Space Exploration Initiative Office (SEIO) in Langley Research Center is presented. Quality Function Deployment (QFD) is a process focused on facilitating the integration of the customer's voice in the design and development of a product or service. Various inputs, in the form of judgments and evaluations, are required during the QFD analyses, and all of these input variables are traditionally treated as numeric variables. The purpose of the research was to investigate how QFD analyses can be performed when some or all of the input variables are treated as linguistic variables with values expressed as fuzzy numbers. The reason for this consideration is that human judgment, perception, and cognition are often ambiguous and are better represented as fuzzy numbers. Two approaches for using fuzzy sets in QFD have been proposed. In both cases, all the input variables are considered linguistic variables with values indicated as linguistic expressions. These expressions are then converted to fuzzy numbers. The difference between the two approaches lies in how the QFD computations are performed with these fuzzy numbers. In Approach 1, the fuzzy numbers are first converted to their equivalent crisp scores and then the QFD computations are performed using these crisp scores. As a result, the outputs of this approach are crisp numbers, similar to those in traditional QFD. In Approach 2, all the QFD computations are performed with the fuzzy numbers, and the outputs are fuzzy numbers as well. Both approaches have been explained with the help of illustrative examples of QFD application. Approach 2 has also been applied in a QFD application exercise in SEIO, involving a 'mini moon rover' design. The mini moon rover is a proposed tele-operated vehicle that will traverse and perform various tasks, including autonomous operations, on the moon's surface. The output of the moon rover application exercise is a ranking of the rover functions, so that a subset of these functions can be targeted for design improvement. The illustrative examples and the mini rover application exercise confirm that the proposed approaches for using fuzzy sets in QFD are viable. However, further research is needed to study the various issues involved and to verify/validate the proposed methods.
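
    A sketch of the two approaches on a toy relationship rating, assuming triangular fuzzy numbers for the linguistic scale (an illustrative scale, not the report's): Approach 1 defuzzifies each judgment to a crisp centroid score before the QFD arithmetic, while Approach 2 keeps the arithmetic fuzzy, using the fact that a positively weighted sum of triangular fuzzy numbers is again triangular.

    ```python
    import numpy as np

    # Linguistic scale mapped to triangular fuzzy numbers (l, m, u); the
    # terms and values here are illustrative.
    SCALE = {
        "weak":     (0.0, 0.1, 0.3),
        "moderate": (0.3, 0.5, 0.7),
        "strong":   (0.7, 0.9, 1.0),
    }

    def crisp_score(term):
        """Approach 1: defuzzify a triangular fuzzy number by its centroid."""
        l, m, u = SCALE[term]
        return (l + m + u) / 3.0

    def fuzzy_weighted_sum(terms, weights):
        """Approach 2: keep the arithmetic fuzzy; a positively weighted sum of
        triangular fuzzy numbers is again triangular, computed component-wise."""
        tfns = np.array([SCALE[t] for t in terms])      # shape (n, 3)
        w = np.asarray(weights)[:, None]
        return tuple((w * tfns).sum(axis=0))

    # Ratings of one engineering characteristic against three customer
    # needs, with need weights 0.5 / 0.3 / 0.2.
    ratings = ["strong", "moderate", "weak"]
    print(sum(w * crisp_score(t) for w, t in zip([0.5, 0.3, 0.2], ratings)))
    print(fuzzy_weighted_sum(ratings, [0.5, 0.3, 0.2]))  # fuzzy output (l, m, u)
    ```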

  7. A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes

    PubMed Central

    2015-01-01

    Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for Generalized Renewal Processes in general and for Weibull-based Generalized Renewal Processes in particular. Departing from the literature, we present a mixed Generalized Renewal Processes approach involving the Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
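
    The two Kijima virtual-age recursions at the heart of the mixed model are standard: Type I rejuvenates only the latest sojourn, Type II the accumulated age. A minimal sketch (the mixing, moments, and estimation machinery of the paper are not reproduced):

    ```python
    def kijima_virtual_ages(inter_failure_times, q, model="I"):
        """Virtual age after each repair under the Kijima models.
        Type I :  v_n = v_{n-1} + q * x_n   (rejuvenates only the last sojourn)
        Type II:  v_n = q * (v_{n-1} + x_n) (rejuvenates the whole history)
        q = 0 -> as good as new; q = 1 -> as bad as old."""
        v, ages = 0.0, []
        for x in inter_failure_times:
            v = v + q * x if model == "I" else q * (v + x)
            ages.append(v)
        return ages

    x = [100.0, 80.0, 60.0]   # observed times between interventions
    print(kijima_virtual_ages(x, q=0.3, model="I"))   # [30.0, 54.0, 72.0]
    print(kijima_virtual_ages(x, q=0.3, model="II"))  # [30.0, 33.0, 27.9]
    ```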

  8. WHERE IS THE CONSENSUS? A PROPOSED FOUNDATION FOR MOVING ECOSYSTEM SERVICE CONCEPTS INTO PRACTICE

    EPA Science Inventory

    Inconsistency with terms, definitions, and classifications hinders the advancement of the study and application of ecosystem services. A unified approach among the many disciplines that are involved in researching and implementing ecosystem services is imperative to moving conce...

  9. Getting involved : a study of bicycle and pedestrian advisory committees and advocacy organizations.

    DOT National Transportation Integrated Search

    2001-01-01

    Bicycle and pedestrian advocacy groups have approached VDOT with a variety of proposals and requests for support. In deciding how to respond to such inquiries, the VDOT Bicycle and Pedestrian Program staff desires to know how other state DOTs interac...

  10. Image re-sampling detection through a novel interpolation kernel.

    PubMed

    Hilal, Alaa

    2018-06-01

    Image re-sampling, involved in resize and rotation transformations, is an essential building block of typical digital image alterations. Fortunately, traces left by such processes are detectable, proving that the image has undergone a re-sampling transformation. Within this context, we present in this paper two original contributions. First, we propose a new re-sampling interpolation kernel. It depends on five independent parameters that control its amplitude, angular frequency, standard deviation, and duration. We then demonstrate its capacity to imitate the behavior of the interpolation kernels most frequently used in digital image re-sampling applications. Second, the proposed model is used to characterize and detect the correlation coefficients involved in re-sampling transformations. The process involves minimization of an error function using the gradient method. The proposed method is assessed over a large database of 11,000 re-sampled images. Additionally, it is implemented within an algorithm in order to assess images that have undergone complex transformations. The obtained results demonstrate better performance and reduced processing time when compared to a reference method, validating the suitability of the proposed approaches. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. A hierarchical approach for the design improvements of an Organocat biorefinery.

    PubMed

    Abdelaziz, Omar Y; Gadalla, Mamdouh A; El-Halwagi, Mahmoud M; Ashour, Fatma H

    2015-04-01

    Lignocellulosic biomass has emerged as a potentially attractive renewable energy source. Processing technologies of such biomass, particularly its primary separation, still lack economic justification due to intense energy requirements. Establishing an economically viable and energy efficient biorefinery scheme is a significant challenge. In this work, a systematic approach is proposed for improving basic/existing biorefinery designs. This approach is based on enhancing the efficiency of mass and energy utilization through the use of a hierarchical design approach that involves mass and energy integration. The proposed procedure is applied to a novel biorefinery called Organocat to minimize its energy and mass consumption and total annualized cost. An improved heat exchanger network with a minimum energy consumption of 4.5 MJ/kg dry biomass is designed. An optimal recycle network with zero fresh water usage and minimum waste discharge is also constructed, making the process more competitive and economically attractive. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Efficient modeling of vector hysteresis using a novel Hopfield neural network implementation of Stoner–Wohlfarth-like operators

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2012-01-01

    Incorporation of hysteresis models in electromagnetic analysis approaches is indispensable to accurate field computation in complex magnetic media. Throughout those computations, the vector nature and computational efficiency of such models become especially crucial when sophisticated geometries requiring massive sub-region discretization are involved. Recently, an efficient vector Preisach-type hysteresis model constructed from only two scalar models having orthogonally coupled elementary operators has been proposed. This paper presents a novel Hopfield neural network approach for the implementation of Stoner–Wohlfarth-like operators that could lead to a significant enhancement in the computational efficiency of the aforementioned model. Advantages of this approach stem from the non-rectangular nature of these operators, which substantially minimizes the number of operators needed to achieve an accurate vector hysteresis model. Details of the proposed approach, its identification, and experimental testing are presented in the paper. PMID:25685446

  13. Standardized Effect Sizes for Moderated Conditional Fixed Effects with Continuous Moderator Variables

    PubMed Central

    Bodner, Todd E.

    2017-01-01

    Wilkinson and the Task Force on Statistical Inference (1999) recommended that researchers include information on the practical magnitude of effects (e.g., using standardized effect sizes) to distinguish between the statistical and practical significance of research results. To date, however, researchers have not widely incorporated this recommendation into the interpretation and communication of the conditional effects, and differences in conditional effects, underlying statistical interactions that involve a continuous moderator variable where at least one of the involved variables has an arbitrary metric. This article presents a descriptive approach to investigating two-way statistical interactions involving continuous moderator variables, where the conditional effects underlying these interactions are expressed in standardized effect size metrics (i.e., standardized mean differences and semi-partial correlations). This approach permits researchers to evaluate and communicate the practical magnitude of particular conditional effects and differences in conditional effects using conventional and proposed guidelines, respectively, for the standardized effect size, and therefore provides the researcher important supplementary information lacking under current approaches. The utility of this approach is demonstrated with two real data examples, and important assumptions underlying the standardization process are highlighted. PMID:28484404

  14. Improving Psychological Measurement: Does It Make a Difference? A Comment on Nesselroade and Molenaar (2016).

    PubMed

    Maydeu-Olivares, Alberto

    2016-01-01

    Nesselroade and Molenaar advocate the use of an idiographic filter approach. This is a fixed-effects approach, which may limit the number of individuals that can be simultaneously modeled, and it is not clear how to model the presence of subpopulations. Most important, Nesselroade and Molenaar's proposal appears to be best suited for modeling long time series on a few variables for a few individuals. Long time series are not common in psychological applications. Can it be applied to the usual longitudinal data we face? These are characterized by short time series (four to five points in time), hundreds of individuals, and dozens of variables. If so, what do we gain? Applied settings most often involve between-individual decisions. I conjecture that their approach will not outperform common, simpler, methods. However, when intraindividual decisions are involved, their approach may have an edge.

  15. Hierarchical Brain Networks Active in Approach and Avoidance Goal Pursuit

    PubMed Central

    Spielberg, Jeffrey M.; Heller, Wendy; Miller, Gregory A.

    2013-01-01

    Effective approach/avoidance goal pursuit is critical for attaining long-term health and well-being. Research on the neural correlates of key goal-pursuit processes (e.g., motivation) has long been of interest, with lateralization in prefrontal cortex being a particularly fruitful target of investigation. However, this literature has often been limited by a lack of spatial specificity and has not delineated the precise aspects of approach/avoidance motivation involved. Additionally, the relationships among brain regions (i.e., network connectivity) vital to goal-pursuit remain largely unexplored. Specificity in location, process, and network relationship is vital for moving beyond gross characterizations of function and identifying the precise cortical mechanisms involved in motivation. The present paper integrates research using more spatially specific methodologies (e.g., functional magnetic resonance imaging) with the rich psychological literature on approach/avoidance to propose an integrative network model that takes advantage of the strengths of each of these literatures. PMID:23785328

  16. Hierarchical brain networks active in approach and avoidance goal pursuit.

    PubMed

    Spielberg, Jeffrey M; Heller, Wendy; Miller, Gregory A

    2013-01-01

    Effective approach/avoidance goal pursuit is critical for attaining long-term health and well-being. Research on the neural correlates of key goal-pursuit processes (e.g., motivation) has long been of interest, with lateralization in prefrontal cortex being a particularly fruitful target of investigation. However, this literature has often been limited by a lack of spatial specificity and has not delineated the precise aspects of approach/avoidance motivation involved. Additionally, the relationships among brain regions (i.e., network connectivity) vital to goal-pursuit remain largely unexplored. Specificity in location, process, and network relationship is vital for moving beyond gross characterizations of function and identifying the precise cortical mechanisms involved in motivation. The present paper integrates research using more spatially specific methodologies (e.g., functional magnetic resonance imaging) with the rich psychological literature on approach/avoidance to propose an integrative network model that takes advantage of the strengths of each of these literatures.

  17. Applying the ecosystem approach to select priority areas for forest landscape restoration in the Yungas, Northwestern Argentina.

    PubMed

    Ianni, Elena; Geneletti, Davide

    2010-11-01

    This paper proposes a method to select forest restoration priority areas consistently with the key principles of the Ecosystem Approach (EA) and the Forest Landscape Restoration (FLR) framework. The methodology is based on the principles shared by the two approaches: acting at ecosystem scale, involving stakeholders, and evaluating alternatives. It proposes the involvement of social actors who have a stake in forest management through multicriteria analysis sessions aimed at identifying the most suitable forest restoration intervention. The method was applied to a study area in the native forests of Northern Argentina (the Yungas). Stakeholders were asked to identify alternative restoration actions, i.e. potential areas for implementing FLR. Ten alternative fincas (estates derived from the Spanish land tenure system), differing in ownership, management, land use, land tenure, and size, were evaluated. Twenty criteria were selected and classified into four groups: biophysical, social, economic and political. Finca Ledesma was the closest to the economic, social, environmental and political goals, according to the values and views of the actors involved in the decision. This study represented the first attempt to apply EA principles to forest restoration at the landscape scale in the Yungas region. The benefits obtained by the application of the method were twofold: on one hand, researchers and local actors were forced to conceive the Yungas as a complex net of rights rather than as a sum of personal interests. On the other hand, the participatory multicriteria approach provided a structured process for collective decision-making in an area where it had never been implemented.

  18. Detection of multiple damages employing best achievable eigenvectors under Bayesian inference

    NASA Astrophysics Data System (ADS)

    Prajapat, Kanta; Ray-Chaudhuri, Samit

    2018-05-01

    A novel approach is presented in this work to simultaneously localize multiple damaged elements in a structure and estimate the damage severity for each of the damaged elements. For the detection of damaged elements, a best achievable eigenvector-based formulation has been derived. To deal with noisy data, Bayesian inference is employed in the formulation, wherein the likelihood of the Bayesian algorithm is formed on the basis of errors between the best achievable eigenvectors and the measured modes. In this approach, the most probable damage locations are evaluated under Bayesian inference by generating combinations of various possible damaged elements. Once damage locations are identified, damage severities are estimated using Bayesian inference with Markov chain Monte Carlo simulation. The efficiency of the proposed approach has been demonstrated by carrying out a numerical study involving a 12-story shear building. It has been found from this study that damage scenarios involving as little as 10% loss of stiffness in multiple elements are accurately determined (localized and severities quantified) even when modal data contaminated with 2% noise are utilized. Further, this study introduces the term parameter impact (evaluated based on the sensitivity of modal parameters to structural parameters) to decide the suitability of selecting a particular mode if some idea about the damaged elements is available. It has been demonstrated here that the accuracy and efficiency of the Bayesian quantification algorithm increase if damage localization is carried out a priori. An experimental study involving a laboratory-scale shear building and different stiffness modification scenarios shows that the proposed approach is efficient enough to localize the stories with stiffness modification.

  19. Applying the Ecosystem Approach to Select Priority Areas for Forest Landscape Restoration in the Yungas, Northwestern Argentina

    NASA Astrophysics Data System (ADS)

    Ianni, Elena; Geneletti, Davide

    2010-11-01

    This paper proposes a method to select forest restoration priority areas consistent with the key principles of the Ecosystem Approach (EA) and the Forest Landscape Restoration (FLR) framework. The methodology is based on the principles shared by the two approaches: acting at ecosystem scale, involving stakeholders, and evaluating alternatives. It proposes the involvement of social actors who have a stake in forest management through multicriteria analysis sessions aimed at identifying the most suitable forest restoration intervention. The method was applied to a study area in the native forests of Northern Argentina (the Yungas). Stakeholders were asked to identify alternative restoration actions, i.e., potential areas for implementing FLR. Ten alternative fincas (estates derived from the Spanish land tenure system) differing in ownership, management, land use, land tenure, and size were evaluated. Twenty criteria were selected and classified into four groups: biophysical, social, economic and political. Finca Ledesma was the closest to the economic, social, environmental and political goals, according to the values and views of the actors involved in the decision. This study represents the first attempt to apply EA principles to forest restoration at the landscape scale in the Yungas region. The benefits of applying the method were twofold: on the one hand, researchers and local actors were forced to conceive of the Yungas as a complex net of rights rather than as a sum of personal interests; on the other hand, the participatory multicriteria approach provided a structured process for collective decision-making in an area where one had never been implemented.
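
    As a toy illustration of the multicriteria evaluation step, the snippet below ranks alternatives with a simple weighted sum. The alternative names (other than Finca Ledesma), criteria scores, and weights are invented for the example; in practice these would be elicited from stakeholders during the sessions, and the paper's actual multicriteria method may differ.

```python
# Sketch: weighted-sum multicriteria ranking of restoration alternatives.
import numpy as np

alternatives = ["Finca A", "Finca B", "Finca Ledesma"]   # illustrative names
criteria = ["biophysical", "social", "economic", "political"]
scores = np.array([[0.6, 0.4, 0.7, 0.5],      # rows: alternatives
                   [0.5, 0.7, 0.4, 0.6],      # cols: criteria on a 0-1 scale
                   [0.8, 0.7, 0.6, 0.7]])
weights = np.array([0.3, 0.3, 0.2, 0.2])      # stakeholder-agreed weights

ranking = scores @ weights                    # overall score per alternative
for name, s in sorted(zip(alternatives, ranking), key=lambda p: -p[1]):
    print(f"{name}: {s:.2f}")
```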

  20. Moving object detection and tracking in videos through turbulent medium

    NASA Astrophysics Data System (ADS)

    Halder, Kalyan Kumar; Tahtali, Murat; Anavatti, Sreenatha G.

    2016-06-01

    This paper addresses the problem of identifying and tracking moving objects in a video sequence with a time-varying background. This is a fundamental task in many computer vision applications, though a very challenging one, because turbulence causes blurring and spatiotemporal movements of the background images. Our proposed approach involves two major steps. First, a moving-object detection algorithm separates real motion from turbulence-induced motion using a two-level thresholding technique. In the second step, a feature-based generalized regression neural network is applied to track the detected objects through the frames of the video sequence. The proposed approach uses the centroid and area features of the moving objects and creates the reference regions instantly by selecting the objects within a circle. Simulation experiments are carried out on several turbulence-degraded video sequences, and comparisons with an earlier method confirm that the proposed approach provides more effective tracking of the targets.
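
    The sketch below illustrates the two-level thresholding idea: a low threshold captures all motion including turbulence jitter, a high threshold keeps only strong changes, and a simple hysteresis step retains weak pixels only where they touch strong ones. The exact scheme and threshold values in the paper may differ.

```python
# Sketch: separating genuine motion from turbulence-induced jitter with
# two thresholds and one-pixel hysteresis (values are placeholders).
import numpy as np

def detect_moving_objects(prev_frame, frame, t_low=10, t_high=40):
    """Return a binary mask of 'real' motion for a grayscale frame pair."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    weak = diff > t_low      # all motion, including turbulence jitter
    strong = diff > t_high   # only large intensity changes
    pad = np.pad(strong, 1)  # dilate strong mask by one pixel, then intersect
    dilated = (pad[:-2, 1:-1] | pad[2:, 1:-1] | pad[1:-1, :-2]
               | pad[1:-1, 2:] | strong)
    return weak & dilated

rng = np.random.default_rng(1)
f0 = rng.integers(0, 20, (64, 64)).astype(np.uint8)   # noisy background
f1 = f0.copy(); f1[30:38, 30:38] += 80                # a moving object
print(detect_moving_objects(f0, f1).sum(), "motion pixels")
```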

  1. Obesity and public policies: the Brazilian government's definitions and strategies.

    PubMed

    Dias, Patricia Camacho; Henriques, Patrícia; Anjos, Luiz Antonio Dos; Burlandy, Luciene

    2017-07-27

    The study analyzes national strategies for dealing with obesity in Brazil in the framework of the Brazilian Unified National Health System (SUS) and the Food and Nutritional Security System (SISAN). Based on the document analysis method, we examined government documents produced in the last 15 years in the following dimensions: definitions of obesity, proposed actions, and strategies for linkage between sectors. In the SUS, obesity is approached as both a risk factor and a disease, with individual and social/environmental approaches aimed at changing eating practices and physical activity. In the SISAN, obesity is also conceived as a social problem involving food insecurity, and new modes of producing, marketing, and consuming foods are proposed to change eating practices in an integrated way. Proposals in the SUS point to an integrated and intra-sector approach to obesity, while those in SISAN emphasize the problem's inter-sector nature from an expanded perspective that challenges the prevailing sector-based institutional structures.

  2. A variational Bayes spatiotemporal model for electromagnetic brain mapping.

    PubMed

    Nathoo, F S; Babul, A; Moiseev, A; Virji-Babul, N; Beg, M F

    2014-03-01

    In this article, we present a new variational Bayes approach for solving the neuroelectromagnetic inverse problem arising in studies involving electroencephalography (EEG) and magnetoencephalography (MEG). This high-dimensional spatiotemporal estimation problem involves the recovery of time-varying neural activity at a large number of locations within the brain, from electromagnetic signals recorded at a relatively small number of external locations on or near the scalp. Framing this problem within the context of spatial variable selection for an underdetermined functional linear model, we propose a spatial mixture formulation where the profile of electrical activity within the brain is represented through location-specific spike-and-slab priors based on a spatial logistic specification. The prior specification accommodates spatial clustering in brain activation, while also allowing for the inclusion of auxiliary information derived from alternative imaging modalities, such as functional magnetic resonance imaging (fMRI). We develop a variational Bayes approach for computing estimates of neural source activity, and incorporate a nonparametric bootstrap for interval estimation. The proposed methodology is compared with several alternative approaches through simulation studies, and is applied to the analysis of a multimodal neuroimaging study examining the neural response to face perception using EEG, MEG, and fMRI. © 2013, The International Biometric Society.

  3. Critiquing: A Different Approach to Expert Computer Advice in Medicine

    PubMed Central

    Miller, Perry L.

    1984-01-01

    The traditional approach to computer-based advice in medicine has been to design systems which simulate a physician's decision process. This paper describes a different approach to computer advice in medicine: a critiquing approach. A critiquing system first asks how the physician is planning to manage his patient and then critiques that plan, discussing the advantages and disadvantages of the proposed approach, compared to other approaches which might be reasonable or preferred. Several critiquing systems are currently in different stages of implementation. The paper describes these systems and discusses the characteristics which make each domain suitable for critiquing. The critiquing approach may prove especially well-suited in domains where decisions involve a great deal of subjective judgement.

  4. Knowledge Building in Asynchronous Discussion Groups: Going Beyond Quantitative Analysis

    ERIC Educational Resources Information Center

    Schrire, Sarah

    2006-01-01

    This contribution examines the methodological challenges involved in defining the collaborative knowledge-building processes occurring in asynchronous discussion and proposes an approach that could advance understanding of these processes. The written protocols that are available to the analyst provide an exact record of the instructional…

  5. Training Tools for Translators and Interpreters

    ERIC Educational Resources Information Center

    Al-Qinai, Jamal

    2010-01-01

    The present paper reviews the traditional methodologies of translator training and proposes an eclectic multi-componential approach that involves a set of interdisciplinary skills with the ultimate objective of meeting market demand. Courses on translation for specific purposes (TSP) and think-aloud protocols (TAP) along with self-monitoring and…

  6. Epileptic seizure detection in EEG signal using machine learning techniques.

    PubMed

    Jaiswal, Abeg Kumar; Banka, Haider

    2018-03-01

    Epilepsy is a well-known nervous system disorder characterized by seizures. Electroencephalograms (EEGs), which capture brain neural activity, can be used to detect epilepsy. Traditional methods for analyzing an EEG signal for epileptic seizure detection are time-consuming. Recently, several automated seizure detection frameworks using machine learning techniques have been proposed to replace these traditional methods. The two basic steps involved in machine learning are feature extraction and classification. Feature extraction reduces the input pattern space by keeping informative features, and the classifier assigns the appropriate class label. In this paper, we propose two effective approaches involving subpattern-based PCA (SpPCA) and cross-subpattern correlation-based PCA (SubXPCA) with a Support Vector Machine (SVM) for automated seizure detection in EEG signals. Feature extraction was performed using SpPCA and SubXPCA. Both techniques explore the subpattern correlation of EEG signals, which aids the decision-making process. An SVM with a radial basis function kernel is used to classify seizure and non-seizure EEG signals. All experiments were carried out on the benchmark epilepsy EEG dataset, which consists of 500 EEG signals recorded under different scenarios. Seven different experimental cases for classification were conducted, and classification accuracy was evaluated using tenfold cross-validation. The classification results of the proposed approaches are compared with those of several existing techniques from the literature to establish the claim.
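
    A compact sketch of an SpPCA-plus-SVM pipeline of the kind described above is shown below, using scikit-learn on synthetic signals. The number of subpatterns, PCA dimensions, data, and class structure are placeholders, not the paper's settings.

```python
# Sketch: subpattern PCA (SpPCA) features fed to an RBF-kernel SVM.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def sppca_features(X, n_sub=8, n_comp=4):
    """Split each signal into n_sub subpatterns, run PCA on each subpattern
    across samples, and concatenate the projections."""
    subs = np.split(X, n_sub, axis=1)
    return np.hstack([PCA(n_components=n_comp).fit_transform(s) for s in subs])

rng = np.random.default_rng(0)
n, length = 200, 1024
X = rng.standard_normal((n, length))
y = rng.integers(0, 2, n)               # stand-in for seizure / non-seizure
X[y == 1] += 0.5 * np.sin(np.linspace(0, 40 * np.pi, length))  # fake rhythm

F = sppca_features(X)
clf = SVC(kernel="rbf", gamma="scale")  # RBF kernel, as in the paper
print("10-fold CV accuracy:", cross_val_score(clf, F, y, cv=10).mean())
```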

  7. Emerging accounting trends accounting for leases.

    PubMed

    Valletta, Robert; Huggins, Brian

    2010-12-01

    A new model for lease accounting can have a significant impact on hospitals and healthcare organizations. The new approach proposes a "right-of-use" model that involves complex estimates and significant administrative burden. Hospitals and health systems that draw heavily on lease arrangements should start preparing for the new approach now even though guidance and a final rule are not expected until mid-2011. This article highlights a number of considerations from the lessee point of view.

  8. Modeling work zone crash frequency by quantifying measurement errors in work zone length.

    PubMed

    Yang, Hong; Ozbay, Kaan; Ozturk, Ozgur; Yildirimoglu, Mehmet

    2013-06-01

    Work zones are temporary traffic control zones that can potentially cause safety problems. Maintaining safety while implementing necessary changes on roadways is an important challenge that traffic engineers and researchers have to confront. In this study, the risk factors in work zone safety evaluation were identified through the estimation of a crash frequency (CF) model. Measurement errors in the explanatory variables of a CF model can lead to unreliable estimates of certain parameters. Among these, work zone length raises a major concern in this analysis because it may change as the construction schedule progresses, generally without being properly documented. This paper proposes an improved modeling and estimation approach that involves the use of a measurement error (ME) model integrated with the traditional negative binomial (NB) model. The proposed approach was compared with the traditional NB approach. Both models were estimated using a large dataset that consists of 60 work zones in New Jersey. Results showed that the proposed approach outperformed the traditional approach in terms of goodness-of-fit statistics. Moreover, it is shown that the use of the traditional NB approach in this context can lead to overestimation of the effect of work zone length on crash occurrence. Copyright © 2013 Elsevier Ltd. All rights reserved.
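
    For reference, a bare-bones negative binomial crash-frequency model can be fitted with statsmodels as sketched below. The measurement-error extension that is the paper's contribution is not reproduced, and the covariates and data are synthetic placeholders.

```python
# Sketch: plain negative binomial crash-frequency model (no ME correction).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60                                   # e.g., 60 work zones
length = rng.uniform(0.5, 10.0, n)       # work zone length (miles)
aadt = rng.uniform(1e4, 1e5, n)          # traffic volume
mu = np.exp(-6.0 + 0.8 * np.log(length) + 0.7 * np.log(aadt))
crashes = rng.poisson(mu * rng.gamma(2.0, 0.5, n))   # overdispersed counts

X = sm.add_constant(np.column_stack([np.log(length), np.log(aadt)]))
nb = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(nb.params)   # effect of log(length) is the second coefficient
```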

  9. A Grammatical Approach to RNA-RNA Interaction Prediction

    NASA Astrophysics Data System (ADS)

    Kato, Yuki; Akutsu, Tatsuya; Seki, Hiroyuki

    2007-11-01

    Much attention has been paid to pairs of interacting RNA molecules involved in post-transcriptional control of gene expression. Although there have been a few studies on RNA-RNA interaction prediction based on dynamic programming algorithms, no grammar-based approach has been proposed. The purpose of this paper is to provide a new model for RNA-RNA interaction based on multiple context-free grammar (MCFG). We present a polynomial-time parsing algorithm for finding the most likely derivation tree under the stochastic version of MCFG, which is applicable to RNA joint secondary structure prediction including kissing hairpin loops. Elementary tests on RNA-RNA interaction prediction show that the proposed method is comparable to Alkan et al.'s method.

  10. Optimization of an innovative approach involving mechanical activation and acid digestion for the extraction of lithium from lepidolite

    NASA Astrophysics Data System (ADS)

    Vieceli, Nathália; Nogueira, Carlos A.; Pereira, Manuel F. C.; Durão, Fernando O.; Guimarães, Carlos; Margarido, Fernanda

    2018-01-01

    The recovery of lithium from hard rock minerals has received increased attention given the high demand for this element. Therefore, this study optimized an innovative process, which does not require a high-temperature calcination step, for lithium extraction from lepidolite. Mechanical activation and acid digestion were suggested as crucial process parameters, and experimental design and response-surface methodology were applied to model and optimize the proposed lithium extraction process. The promoting effect of amorphization and the formation of lithium sulfate hydrate on lithium extraction yield were assessed. Several factor combinations led to extraction yields that exceeded 90%, indicating that the proposed process is an effective approach for lithium recovery.

  11. Network Reconstruction From High-Dimensional Ordinary Differential Equations.

    PubMed

    Chen, Shizhe; Shojaie, Ali; Witten, Daniela M

    2017-01-01

    We consider the task of learning a dynamical system from high-dimensional time-course data. For instance, we might wish to estimate a gene regulatory network from gene expression data measured at discrete time points. We model the dynamical system nonparametrically as a system of additive ordinary differential equations. Most existing methods for parameter estimation in ordinary differential equations estimate the derivatives from noisy observations. This is known to be challenging and inefficient. We propose a novel approach that does not involve derivative estimation. We show that the proposed method can consistently recover the true network structure even in high dimensions, and we demonstrate empirical improvement over competing approaches. Supplementary materials for this article are available online.

  12. Real-time monitoring of a microbial electrolysis cell using an electrical equivalent circuit model.

    PubMed

    Hussain, S A; Perrier, M; Tartakovsky, B

    2018-04-01

    Efforts in developing microbial electrolysis cells (MECs) have resulted in several novel approaches for wastewater treatment and bioelectrosynthesis. Practical implementation of these approaches necessitates an adequate system for real-time (on-line) monitoring and diagnostics of MEC performance. This study describes a simple MEC equivalent electrical circuit (EEC) model and a parameter estimation procedure that enable such real-time monitoring. The proposed approach involves MEC voltage and current measurements during operation with periodic power supply connection/disconnection (on/off operation), followed by parameter estimation using either a numerical or an analytical solution of the model. The proposed monitoring approach is demonstrated using a membraneless MEC with flow-through porous electrodes. Laboratory tests showed that changes in the influent carbon source concentration and composition significantly affect the MEC total internal resistance and capacitance estimated by the model. The fast response of these EEC model parameters to changes in operating conditions enables the development of a model-based approach for real-time monitoring and fault detection.
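
    The on/off idea can be illustrated with a simple resistor-capacitor relaxation fit: after the supply is disconnected, the voltage of an R-C equivalent circuit decays exponentially with time constant R*C, so both parameters can be recovered by curve fitting. The circuit structure and numbers below are assumptions, not the paper's exact EEC model.

```python
# Sketch: recovering R_int and C from an 'off'-phase voltage relaxation.
import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, v_inf, dv, tau):
    """Voltage during the 'off' phase of a simple R-C equivalent circuit."""
    return v_inf + dv * np.exp(-t / tau)

# Synthetic 'measured' relaxation: tau = R_int * C
R_true, C_true, I_op = 20.0, 1.5, 0.05           # ohm, farad, ampere
t = np.linspace(0, 120, 200)
v = relaxation(t, 0.3, R_true * I_op, R_true * C_true)
v += 0.005 * np.random.default_rng(0).standard_normal(t.size)

(v_inf, dv, tau), _ = curve_fit(relaxation, t, v, p0=(0.2, 0.5, 10.0))
R_est = dv / I_op        # voltage drop across R_int at the operating current
print(f"R_int ~ {R_est:.1f} ohm, C ~ {tau / R_est:.2f} F")
```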

  13. Integration of plug-in hybrid electric vehicles (PHEV) with grid connected residential photovoltaic energy systems

    NASA Astrophysics Data System (ADS)

    Nagarajan, Adarsh; Shireen, Wajiha

    2013-06-01

    This paper proposes an approach for integrating Plug-In Hybrid Electric Vehicles (PHEVs) into an existing residential photovoltaic system to control and optimize the power consumption of the residential load. Control involves determining the source from which the residential load will be served, whereas optimization of power flow reduces the stress on the grid. The system built to achieve this goal is a combination of the existing residential photovoltaic system, the PHEV, a Power Conditioning Unit (PCU), and a controller. The PCU comprises two DC-DC boost converters and an inverter. This paper emphasizes developing the controller logic and its implementation in order to accommodate the flexibility and benefits of the proposed integrated system. The proposed controller logic was simulated using MATLAB SIMULINK and further implemented on a Digital Signal Processor (DSP) microcontroller, TMS320F28035, from Texas Instruments.

  14. Overview of prohibited and permitted plant regulatory listing systems

    USGS Publications Warehouse

    Westbrooks, Randy G.; Tasker, Alan V.

    2011-01-01

    Pest risk analysis is a process that evaluates the risks involved with a proposed species to help determine whether it should be permitted or denied entry into a country, and how the risks could be managed if it is imported. The prohibited listing approach was developed in the late 1800s and early 1900s in response to outbreaks of plant and animal pests such as foot-and-mouth disease of livestock, the Mediterranean fruit fly (Ceratitis capitata Wiedemann), and the gypsy moth (Lymantria dispar L.). Under this approach, selected species of concern are evaluated to determine if they should be regulated for entry. Under the permitted listing approach, first used on a national level in Australia in the 1990s, all species proposed for introduction are assessed to determine if they should be regulated.

  15. Efficient video-equipped fire detection approach for automatic fire alarm systems

    NASA Astrophysics Data System (ADS)

    Kang, Myeongsu; Tung, Truong Xuan; Kim, Jong-Myon

    2013-01-01

    This paper proposes an efficient four-stage approach that automatically detects fire using video capabilities. In the first stage, an approximate median method is used to detect video frame regions involving motion. In the second stage, a fuzzy c-means-based clustering algorithm is employed to extract candidate regions of fire from all of the movement-containing regions. In the third stage, a gray-level co-occurrence matrix is used to extract texture parameters by tracking red-colored objects in the candidate regions. These texture features are subsequently used as inputs to a back-propagation neural network to distinguish between fire and non-fire. Experimental results indicate that the proposed four-stage approach outperforms other fire detection algorithms in terms of consistently increasing the accuracy of fire detection in both indoor and outdoor test videos.
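
    The first stage can be illustrated with the classic approximate-median update, in which the background estimate moves one intensity step toward each new frame and so converges toward the per-pixel median over time. The thresholds and synthetic frames below are illustrative only.

```python
# Sketch: approximate-median background model for the motion-detection stage.
import numpy as np

def update_background(bg, frame):
    """One approximate-median update step (both arrays are float)."""
    bg += np.sign(frame - bg)       # +1 where frame > bg, -1 where below
    return bg

def motion_regions(bg, frame, thresh=25):
    return np.abs(frame - bg) > thresh

rng = np.random.default_rng(0)
bg = np.zeros((48, 48))
for _ in range(100):                 # learn a static scene near value 100
    update_background(bg, 100 + rng.normal(0, 3, bg.shape))
frame = bg.copy(); frame[10:20, 10:20] += 60    # a bright moving blob
print(motion_regions(bg, frame).sum(), "candidate motion pixels")
```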

  16. Investigation of Proprioceptor Stimulation.

    ERIC Educational Resources Information Center

    Caukins, Sivan E.; And Others

    A research proposal to study the effect of multisensory teaching methods in first-grade reading is presented. The focus is on sex differences in learning and in multisensory approaches to teaching. The project will involve 10 experimental and 10 control first-grade classes in several Southern California schools. Both groups will be given IQ,…

  17. Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency

    ERIC Educational Resources Information Center

    Kim, Yong; Chung, Min Gyo

    2008-01-01

    Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…

  18. Teaching Students to Analyze Agency Actions via a NEPA Analysis Approach

    ERIC Educational Resources Information Center

    Whitworth, Paul M.

    2008-01-01

    Future recreation professionals need the ability to analyze the effects of proposed management actions and stakeholder concerns to make good decisions, maintain public support, and comply with state and federal laws. Importantly, when federal funds, lands, permits or licenses are involved, federal law requires consideration of environmental and…

  19. Recommended approaches in the application of toxicogenomics to derive points of departure for chemical risk assessment

    EPA Science Inventory

    ABSTRACT:Only a fraction of chemicals in commerce have been fully assessed for their potential hazards to human health due to difficulties involved in conventional regulatory tests. It has recently been proposed that quantitative transcriptomic data can be used to determine bench...

  20. Psychological Development and Females' Sport Participation: An Interactional Perspective.

    ERIC Educational Resources Information Center

    Weiss, Maureen R.; Glenn, Susan D.

    1992-01-01

    This paper examines the writings of Dorothy Harris and other authors, from a psychosocial perspective of sport involvement. It explores ways self-perceptions, social factors, and cognitive and biological maturity interact to explain females' physical activity participation; it proposes a heuristic model to represent an interactional approach. (SM)

  1. Elective English Program, Grades 9-12:

    ERIC Educational Resources Information Center

    Huntley Project Public Schools, Worden, MT.

    This literature-centered curriculum approach to English, grades 9-12, is proposed as a design to involve students in the learning experience. After an introductory explanation of the program's rationale and general procedures, each unit in the curriculum is outlined briefly; its content, objectives, suggested ability level, and procedures for…

  2. The Role of Nonlinear Pedagogy in Physical Education

    ERIC Educational Resources Information Center

    Chow, Jia Yi; Davids, Keith; Button, Chris; Shuttleworth, Rick; Renshaw, Ian; Araujo, Duarte

    2007-01-01

    In physical education, the Teaching Games for Understanding (TGfU) pedagogical strategy has attracted significant attention from theoreticians and educators for allowing the development of game education through a tactic-to-skill approach involving the use of modified games. However, some have proposed that as an educational framework, it lacks…

  3. Detecting Inappropriate Access to Electronic Health Records Using Collaborative Filtering.

    PubMed

    Menon, Aditya Krishna; Jiang, Xiaoqian; Kim, Jihoon; Vaidya, Jaideep; Ohno-Machado, Lucila

    2014-04-01

    Many healthcare facilities enforce security on their electronic health records (EHRs) through a corrective mechanism: some staff nominally have almost unrestricted access to the records, but there is a strict ex post facto audit process for inappropriate accesses, i.e., accesses that violate the facility's security and privacy policies. This process is inefficient, as each suspicious access has to be reviewed by a security expert, and is purely retrospective, as it occurs after damage may have been incurred. This motivates automated approaches based on machine learning using historical data. Previous attempts at such a system have successfully applied supervised learning models to this end, such as SVMs and logistic regression. While providing benefits over manual auditing, these approaches ignore the identity of the users and patients involved in a record access. Therefore, they cannot exploit the fact that a patient whose record was previously involved in a violation has an increased risk of being involved in a future violation. Motivated by this, in this paper, we propose a collaborative filtering inspired approach to predicting inappropriate accesses. Our solution integrates both explicit and latent features for staff and patients, the latter acting as a personalized "finger-print" based on historical access patterns. The proposed method, when applied to real EHR access data from two tertiary hospitals and a file-access dataset from Amazon, shows not only significantly improved performance compared to existing methods, but also provides insights as to what indicates an inappropriate access.
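
    A latent-factor model in the spirit of this approach can be sketched as below: factorize the staff-by-patient access matrix and flag accesses whose predicted affinity is low. The dimensions, learning rate, regularization, and anomaly rule are assumptions for the sketch, not the paper's exact solution.

```python
# Sketch: latent factors for staff and patients learned from access history.
import numpy as np

rng = np.random.default_rng(0)
n_staff, n_patients, k = 50, 80, 5
A = (rng.random((n_staff, n_patients)) < 0.05).astype(float)  # access history

U = 0.1 * rng.standard_normal((n_staff, k))
V = 0.1 * rng.standard_normal((n_patients, k))
for _ in range(200):                  # gradient descent on ||A - U V'||^2
    E = A - U @ V.T
    U += 0.01 * (E @ V - 0.1 * U)     # 0.1 = L2 regularization weight
    V += 0.01 * (E.T @ U - 0.1 * V)

def access_score(staff, patient):
    """Higher means more consistent with historical access patterns."""
    return float(U[staff] @ V[patient])

# A new access with a very low score would be queued for expert review.
print("score:", access_score(3, 12))
```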

  4. Detecting Inappropriate Access to Electronic Health Records Using Collaborative Filtering

    PubMed Central

    Menon, Aditya Krishna; Jiang, Xiaoqian; Kim, Jihoon; Vaidya, Jaideep; Ohno-Machado, Lucila

    2013-01-01

    Many healthcare facilities enforce security on their electronic health records (EHRs) through a corrective mechanism: some staff nominally have almost unrestricted access to the records, but there is a strict ex post facto audit process for inappropriate accesses, i.e., accesses that violate the facility’s security and privacy policies. This process is inefficient, as each suspicious access has to be reviewed by a security expert, and is purely retrospective, as it occurs after damage may have been incurred. This motivates automated approaches based on machine learning using historical data. Previous attempts at such a system have successfully applied supervised learning models to this end, such as SVMs and logistic regression. While providing benefits over manual auditing, these approaches ignore the identity of the users and patients involved in a record access. Therefore, they cannot exploit the fact that a patient whose record was previously involved in a violation has an increased risk of being involved in a future violation. Motivated by this, in this paper, we propose a collaborative filtering inspired approach to predicting inappropriate accesses. Our solution integrates both explicit and latent features for staff and patients, the latter acting as a personalized “finger-print” based on historical access patterns. The proposed method, when applied to real EHR access data from two tertiary hospitals and a file-access dataset from Amazon, shows not only significantly improved performance compared to existing methods, but also provides insights as to what indicates an inappropriate access. PMID:24683293

  5. Constructing statistically unbiased cortical surface templates using feature-space covariance

    NASA Astrophysics Data System (ADS)

    Parvathaneni, Prasanna; Lyu, Ilwoo; Huo, Yuankai; Blaber, Justin; Hainline, Allison E.; Kang, Hakmook; Woodward, Neil D.; Landman, Bennett A.

    2018-03-01

    The choice of surface template plays an important role in cross-sectional subject analyses involving cortical brain surfaces because there is a tendency toward registration bias given variations in inter-individual and inter-group sulcal and gyral patterns. In order to account for this bias and spatial smoothing, we propose a feature-based unbiased average template surface. In contrast to prior approaches, we factor in the sample population covariance and assign weights based on feature information to minimize the influence of covariance in the sampled population. The mean surface is computed by applying the weights obtained from an inverse covariance matrix, which guarantees that multiple representations from similar groups (e.g., sharing imaging, demographic, or diagnosis information) are down-weighted to yield an unbiased mean in feature space. Results are validated in two different applications. For evaluation, the proposed unbiased weighted surface mean is compared with unweighted means both qualitatively and quantitatively (mean squared error and absolute relative distance of both means from a baseline). In the first application, we validated the stability of the proposed optimal mean on a scan-rescan reproducibility dataset by incrementally adding duplicate subjects. In the second application, we used clinical research data to evaluate the difference between the weighted and unweighted means when different numbers of subjects were included in control versus schizophrenia groups. In both cases, the proposed method achieved greater stability, indicating a reduced impact of sampling bias. The weighted mean is built based on covariance information in feature space as opposed to spatial location, making this a generic approach applicable to any feature of interest.

  6. On the use of haplotype phylogeny to detect disease susceptibility loci

    PubMed Central

    Bardel, Claire; Danjean, Vincent; Hugot, Jean-Pierre; Darlu, Pierre; Génin, Emmanuelle

    2005-01-01

    Background The cladistic approach proposed by Templeton has been presented as promising for the study of the genetic factors involved in common diseases. This approach allows the joint study of multiple markers within a gene by considering haplotypes and grouping them in nested clades. The idea is to search for clades with an excess of cases compared to the whole sample and to identify the mutations defining these clades as potential candidate disease susceptibility sites. However, the performance of this approach for studying the genetic factors involved in complex diseases has never been assessed. Results In this paper, we propose a new method to perform such a cladistic analysis and we estimate its power through simulations. We show that under models where susceptibility to the disease is caused by a single genetic variant, the cladistic test is neither markedly more powerful at detecting an association nor markedly more efficient at localizing the susceptibility site than individual SNP testing. However, when two interacting sites are responsible for the disease, the cladistic analysis greatly improves the probability of finding the two susceptibility sites. The impact of linkage disequilibrium and of the tree characteristics on the efficiency of the cladistic analysis is also discussed. An application to a real data set concerning the CARD15 gene and Crohn's disease shows that the method can successfully identify the three variant sites involved in disease susceptibility. Conclusion The use of phylogenies to group haplotypes is especially interesting for pinpointing the sites that are likely to be involved in disease susceptibility among the different markers identified within a gene. PMID:15904492

  7. Efficient dual approach to distance metric learning.

    PubMed

    Shen, Chunhua; Kim, Junae; Liu, Fayao; Wang, Lei; van den Hengel, Anton

    2014-02-01

    Distance metric learning is of fundamental interest in machine learning because the employed distance metric can significantly affect the performance of many learning methods. Quadratic Mahalanobis metric learning is a popular approach to the problem, but typically requires solving a semidefinite programming (SDP) problem, which is computationally expensive. The worst-case complexity of solving an SDP problem involving a matrix variable of size D×D with O(D) linear constraints is about O(D^6.5) using interior-point methods, where D is the dimension of the input data. Thus, interior-point methods can practically solve only problems with fewer than a few thousand variables. Because the number of variables is D(D+1)/2, this limits the problems that can practically be solved to around a few hundred dimensions. The complexity of the popular quadratic Mahalanobis metric learning approach thus limits the size of problem to which metric learning can be applied. Here, we propose a significantly more efficient and scalable approach to the metric learning problem based on the Lagrange dual formulation of the problem. The proposed formulation is much simpler to implement, and therefore allows much larger Mahalanobis metric learning problems to be solved. The time complexity of the proposed method is roughly O(D^3), which is significantly lower than that of the SDP approach. Experiments on a variety of data sets demonstrate that the proposed method achieves an accuracy comparable with the state of the art, but is applicable to significantly larger problems. We also show that the proposed method can be applied to solve more general Frobenius norm regularized SDP problems approximately.

  8. Extension of a Kinetic-Theory Approach for Computing Chemical-Reaction Rates to Reactions with Charged Particles

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Lewis, Mark J.

    2010-01-01

    Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties (i.e., no macroscopic reaction rate information) are extended to include reactions involving charged particles and electronic energy levels. The proposed extensions include ionization reactions, exothermic associative ionization reactions, endothermic and exothermic charge exchange reactions, and other exchange reactions involving ionized species. The extensions are shown to agree favorably with the measured Arrhenius rates for near-equilibrium conditions.

  9. Successful management of vaginismus: An eclectic approach

    PubMed Central

    Harish, Thippeswamy; Muliyala, KrishnaPrasad; Murthy, Pratima

    2011-01-01

    Vaginismus is defined as recurrent or persistent involuntary spasm of the musculature of the outer third of the vagina, which interferes with coitus and causes distress and interpersonal difficulty. In this report, we describe the successful treatment of vaginismus in a 25-year-old lady based on a model proposed by Keith Hawton. The eclectic approach involved education, graded insertion of fingers, Kegel's exercises and usage of local anesthesia with vaginal containment along with the prescription of Escitalopram. PMID:21772650

  10. Successful management of vaginismus: An eclectic approach.

    PubMed

    Harish, Thippeswamy; Muliyala, Krishnaprasad; Murthy, Pratima

    2011-04-01

    Vaginismus is defined as recurrent or persistent involuntary spasm of the musculature of the outer third of the vagina, which interferes with coitus and causes distress and interpersonal difficulty. In this report, we describe the successful treatment of vaginismus in a 25-year-old lady based on a model proposed by Keith Hawton. The eclectic approach involved education, graded insertion of fingers, Kegel's exercises and usage of local anesthesia with vaginal containment along with the prescription of Escitalopram.

  11. Real-time probabilistic covariance tracking with efficient model update.

    PubMed

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile at a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but within a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with incremental covariance tensor learning (ICTL). To address appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
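
    For background, the snippet below computes a covariance region descriptor from simple per-pixel features and compares two descriptors with the log-Euclidean distance, one common Riemannian metric for covariance matrices. The feature choice is illustrative, and the ICTL update itself is not reproduced here.

```python
# Sketch: covariance region descriptor + log-Euclidean distance.
import numpy as np

def covariance_descriptor(region):
    """Covariance of per-pixel features [x, y, intensity, |dI/dx|, |dI/dy|]."""
    h, w = region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(region.astype(float))
    feats = np.stack([xs.ravel(), ys.ravel(), region.ravel(),
                      np.abs(gx).ravel(), np.abs(gy).ravel()])
    return np.cov(feats) + 1e-6 * np.eye(5)    # regularize for the matrix log

def log_euclidean_distance(C1, C2):
    def logm(C):                               # matrix log via eigendecomposition
        w, V = np.linalg.eigh(C)
        return (V * np.log(w)) @ V.T
    return np.linalg.norm(logm(C1) - logm(C2), "fro")

rng = np.random.default_rng(0)
a, b = rng.random((32, 32)), rng.random((32, 32))
print(log_euclidean_distance(covariance_descriptor(a), covariance_descriptor(b)))
```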

  12. Our current approach to root cause analysis: is it contributing to our failure to improve patient safety?

    PubMed

    Kellogg, Kathryn M; Hettinger, Zach; Shah, Manish; Wears, Robert L; Sellers, Craig R; Squires, Melissa; Fairbanks, Rollin J

    2017-05-01

    Despite over a decade of efforts to reduce the adverse event rate in healthcare, the rate has remained relatively unchanged. Root cause analysis (RCA) is a process used by hospitals in an attempt to reduce adverse event rates; however, the outputs of this process have not been well studied in healthcare. This study aimed to examine the types of solutions proposed in RCAs over an 8-year period at a major academic medical institution. All state-reportable adverse events were gathered, and those for which an RCA was performed were analysed. A consensus rating process was used to determine a severity rating for each case. A qualitative approach was used to categorise the types of solutions proposed by the RCA team in each case and descriptive statistics were calculated. 302 RCAs were reviewed. The most common event types involved a procedure complication, followed by cardiopulmonary arrest, neurological deficit and retained foreign body. In 106 RCAs, solutions were proposed. A large proportion (38.7%) of RCAs with solutions proposed involved a patient death. Of the 731 proposed solutions, the most common solution types were training (20%), process change (19.6%) and policy reinforcement (15.2%). We found that multiple event types were repeated in the study period, despite repeated RCAs. This study found that the most commonly proposed solutions were weaker actions, which were less likely to decrease event recurrence. These findings support recent attempts to improve the RCA process and to develop guidance for the creation of effective and sustainable solutions to be used by RCA teams. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  13. Discontinuity Detection in the Shield Metal Arc Welding Process

    PubMed Central

    Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros

    2017-01-01

    This work proposes a new methodology for the detection of discontinuities in the weld bead in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors, a microphone and a piezoelectric sensor, that acquire acoustic emissions generated during welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify, with high accuracy, the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn-through discontinuities. Experimental results illustrate the system’s high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make the use of this system feasible in industrial environments; this approach achieved 96.6% overall accuracy. Given the simplicity of the equipment involved, the system can be applied in the metal transformation industries. PMID:28489045

  14. Discontinuity Detection in the Shield Metal Arc Welding Process.

    PubMed

    Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros

    2017-05-10

    This work proposes a new methodology for the detection of discontinuities in the weld bead in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors, a microphone and a piezoelectric sensor, that acquire acoustic emissions generated during welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify, with high accuracy, the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn-through discontinuities. Experimental results illustrate the system's high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make the use of this system feasible in industrial environments; this approach achieved 96.6% overall accuracy. Given the simplicity of the equipment involved, the system can be applied in the metal transformation industries.

  15. The Ritz - Sublaminate Generalized Unified Formulation approach for piezoelectric composite plates

    NASA Astrophysics Data System (ADS)

    D'Ottavio, Michele; Dozio, Lorenzo; Vescovini, Riccardo; Polit, Olivier

    2018-01-01

    This paper extends the variable kinematics plate modeling approach called Sublaminate Generalized Unified Formulation (SGUF) to composite plates including piezoelectric plies. Two-dimensional plate equations are obtained upon defining a priori the through-thickness distribution of the displacement field and electric potential. According to SGUF, independent approximations can be adopted for the four components of these generalized displacements: an Equivalent Single Layer (ESL) or Layer-Wise (LW) description over an arbitrary group of plies constituting the composite plate (the sublaminate), and the polynomial order employed in each sublaminate. The solution of the two-dimensional equations is sought in weak form by means of a Ritz method. In this work, boundary functions are used in conjunction with the domain approximation expressed by an orthogonal basis spanned by Legendre polynomials. The proposed computational tool is capable of representing electroded surfaces with equipotentiality conditions. Free-vibration problems as well as static problems involving actuator and sensor configurations are addressed. Two case studies are presented, which demonstrate the high accuracy of the proposed Ritz-SGUF approach. A model assessment shows the extent to which the SGUF approach allows a reduction in the number of unknowns with a controlled impact on the accuracy of the results.

  16. Whole abdominal wall segmentation using augmented active shape models (AASM) with multi-atlas label fusion and level set

    NASA Astrophysics Data System (ADS)

    Xu, Zhoubing; Baucom, Rebeccah B.; Abramson, Richard G.; Poulose, Benjamin K.; Landman, Bennett A.

    2016-03-01

    The abdominal wall is an important structure differentiating subcutaneous and visceral compartments and is intimately involved in maintaining abdominal structure. Segmentation of the whole abdominal wall on routinely acquired computed tomography (CT) scans remains challenging due to variations and complexities of the wall and surrounding tissues. In this study, we propose a slice-wise augmented active shape model (AASM) approach to robustly segment both the outer and inner surfaces of the abdominal wall. Multi-atlas label fusion (MALF) and level set (LS) techniques are integrated into the traditional ASM framework. The AASM approach globally optimizes the landmark updates in the presence of complicated underlying local anatomical contexts. The proposed approach was validated on 184 axial slices of 20 CT scans. The Hausdorff distance against the manual segmentation was significantly reduced using the proposed approach compared with ASM, MALF, and LS individually. Our segmentation of the whole abdominal wall enables subcutaneous and visceral fat measurement, with high correlation to measurements derived from manual segmentation. This study presents the first generic algorithm that combines ASM, MALF, and LS, and demonstrates practical application for automatically capturing visceral and subcutaneous fat volumes.

  17. Advice and care for patients who die by voluntarily stopping eating and drinking is not assisted suicide.

    PubMed

    McGee, Andrew; Miller, Franklin G

    2017-12-27

    A competent patient has the right to refuse foods and fluids even if the patient will die. The exercise of this right, known as voluntarily stopping eating and drinking (VSED), is sometimes proposed as an alternative to physician-assisted suicide. However, there is ethical and legal uncertainty about physician involvement in VSED. Are physicians who advise patients of this option, or who make patients comfortable while they undertake VSED, assisting suicide? This paper attempts to resolve this ethical and legal uncertainty. The standard approach has been to determine whether VSED itself is suicide. Those who claim that VSED is suicide invariably claim that physician involvement in VSED amounts to assisting suicide; those who claim that VSED is not suicide claim that physician involvement does not amount to assisting suicide. We reject this standard approach. We instead argue that, even if VSED is classified as a kind of suicide, physician involvement in VSED is not a form of assisted suicide. Physician involvement in VSED therefore does not fall within legal provisions that prohibit assisting suicide.

  18. Detecting representative data and generating synthetic samples to improve learning accuracy with imbalanced data sets.

    PubMed

    Li, Der-Chiang; Hu, Susan C; Lin, Liang-Sian; Yeh, Chun-Wu

    2017-01-01

    It is difficult for learning models to achieve high classification performance with imbalanced data sets: when one of the classes is much larger than the others, most machine learning and data mining classifiers are overly influenced by the larger classes and ignore the smaller ones. As a result, classification algorithms often have poor learning performance due to slow convergence in the smaller classes. To balance such data sets, this paper presents a strategy that involves reducing the size of the majority class and generating synthetic samples for the minority class. In the reducing operation, we use the box-and-whisker plot approach to exclude outliers and the Mega-Trend-Diffusion method to find representative data from the majority class. To generate the synthetic samples, we propose a counterintuitive hypothesis to find the distributed shape of the minority data, and then produce samples according to this distribution. Four real datasets were used to examine the performance of the proposed approach. We used paired t-tests to compare the Accuracy, G-mean, and F-measure scores of the proposed data pre-processing method merged with the D3C method (PPDP+D3C) against those of one-sided selection (OSS), the well-known SMOTEBoost (SB) method, normal distribution-based oversampling (NDO), and the PPDP method alone. The results indicate that the classification performance of the proposed approach is better than that of the above-mentioned methods.
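
    The two pre-processing ideas can be sketched as below: trim majority-class outliers with the box-and-whisker (IQR) rule, then synthesize minority samples from an estimated distribution. The per-feature normal fit stands in for the paper's Mega-Trend-Diffusion estimate and is only an assumption.

```python
# Sketch: IQR-based majority trimming + synthetic minority generation.
import numpy as np

def iqr_filter(X):
    """Keep majority rows whose every feature lies inside the whiskers."""
    q1, q3 = np.percentile(X, [25, 75], axis=0)
    lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
    return X[np.all((X >= lo) & (X <= hi), axis=1)]

def synthesize(X, n_new, rng):
    """Draw synthetic minority samples from a per-feature normal fit."""
    return rng.normal(X.mean(0), X.std(0), size=(n_new, X.shape[1]))

rng = np.random.default_rng(0)
majority = rng.normal(0, 1, (500, 3)); majority[:5] += 8   # a few outliers
minority = rng.normal(2, 0.5, (20, 3))

maj_clean = iqr_filter(majority)
min_aug = np.vstack([minority, synthesize(minority, 100, rng)])
print(len(maj_clean), "majority kept;", len(min_aug), "minority after augmentation")
```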

  19. New Approach For Prediction Groundwater Depletion

    NASA Astrophysics Data System (ADS)

    Moustafa, Mahmoud

    2017-01-01

    Current approaches to quantifying groundwater depletion involve water balance and satellite gravity. However, the water balance technique includes uncertain estimation of parameters such as evapotranspiration and runoff, while the satellite method consumes time and effort. The work reported in this paper proposes using failure theory in a novel way to predict depletion of the groundwater saturated thickness. An important issue in the failure theory proposed is to determine the failure point (the depletion case). The proposed technique uses depth of water, as the net result of recharge/discharge processes in the aquifer, to calculate the remaining saturated thickness resulting from the pumping rates applied in an area and thereby evaluate groundwater depletion. A two-parameter Weibull function and Bayesian analysis were used to model and analyze data collected from 1962 to 2009. The proposed methodology was tested in a nonrenewable aquifer with no recharge; consequently, the continuous decline in water depth was the main criterion used to estimate depletion. The value of the proposed approach is in predicting the probable effect of the currently applied pumping rates on the saturated thickness, based on the remaining saturated-thickness data. Its limitation is that it assumes the applied management practices remain constant during the prediction period. The study predicted an 80% probability that the saturated aquifer would be depleted after 300 years. Lifetime (failure) theory can thus offer a simple alternative way to predict the depletion of the remaining saturated thickness without time-consuming processes or sophisticated software.
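
    A hedged sketch of the lifetime-analysis idea follows: fit a Weibull model to depletion-time data and read off the probability of depletion by year 300. The synthetic lifetimes and the two-parameter fit below are placeholders for the paper's observed water-depth record.

```python
# Sketch: Weibull lifetime fit and probability of depletion by a horizon year.
import numpy as np
from scipy.stats import weibull_min

# Stand-in lifetimes (years until saturated thickness is exhausted)
lifetimes = weibull_min.rvs(c=2.5, scale=250.0, size=47, random_state=0)

shape, loc, scale = weibull_min.fit(lifetimes, floc=0)   # 2-parameter fit
p_300 = weibull_min.cdf(300.0, shape, loc=loc, scale=scale)
print(f"P(depleted by year 300) ~ {p_300:.0%}")
```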

  20. Optimum Tolerance Design Using Component-Amount and Mixture-Amount Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Ozler, Cenk; Sehirlioglu, Ali Kemal

    2013-08-01

    One type of tolerance design problem involves optimizing component and assembly tolerances to minimize the total cost (sum of manufacturing cost and quality loss). Previous literature recommended using traditional response surface (RS) designs and models to solve this type of tolerance design problem. In this article, component-amount (CA) and mixture-amount (MA) approaches are proposed as more appropriate for solving this type of tolerance design problem. The advantages of the CA and MA approaches over the RS approach are discussed. Reasons for choosing between the CA and MA approaches are also discussed. The CA and MA approaches (experimental design, response modeling, and optimization) are illustrated using real examples.

  1. Improving human activity recognition and its application in early stroke diagnosis.

    PubMed

    Villar, José R; González, Silvia; Sedano, Javier; Chira, Camelia; Trejo-Gabriel-Galan, Jose M

    2015-06-01

    The development of efficient stroke-detection methods is of significant importance in today's society due to the effects and impact of stroke on health and economy worldwide. This study focuses on Human Activity Recognition (HAR), a key component in developing an early stroke-diagnosis tool. An overview is given of the proposed global approach, which is able to discriminate normal resting from stroke-related paralysis. The main contributions include an extension of the Genetic Fuzzy Finite State Machine (GFFSM) method and a new hybrid feature selection (FS) algorithm involving Principal Component Analysis (PCA) and a voting scheme that puts the cross-validation results together. Experimental results show that the proposed approach is a well-performing HAR tool that can be successfully embedded in devices.

  2. An Approach to Help Departments Meet the New ABET Process Safety Requirements

    ERIC Educational Resources Information Center

    Vaughen, Bruce K.

    2012-01-01

    The proposed program criteria changes by the Accreditation Board for Engineering and Technology, Inc. (ABET), for chemical, biochemical, biomolecular, and similarly named programs includes a fundamental awareness expectation of the hazards involved in chemical processing for a graduating chemical engineer. As of July 2010, these four new words…

  3. Validity of the Learning Portfolio: Analysis of a Portfolio Proposal for the University

    ERIC Educational Resources Information Center

    Gregori-Giralt, Eva; Menéndez-Varela, José Luis

    2015-01-01

    Validity is a central issue in portfolio-based assessment. This empirical study used a quantitative approach to analyse the validity of the inferences drawn from a disciplinary course work portfolio assessment comprising profession-specific and learning competencies. The study also examined the problems involved in the development of the…

  4. Leadership in a Performative Context: A Framework for Decision-Making

    ERIC Educational Resources Information Center

    Chitpin, Stephanie; Jones, Ken

    2015-01-01

    This paper examines a model of decision-making within the context of current and emerging regimes of accountability being proposed and implemented for school systems in a number of jurisdictions. These approaches to accountability typically involve the use of various measurable student learning outcomes as well as other measures of performance to…

  5. First-Year Engineering Students' Portrayal of Engineering in a Proposed Museum Exhibit for Middle School Students

    ERIC Educational Resources Information Center

    Mena, Irene B.; Diefes-Dux, Heidi A.

    2012-01-01

    Students' perceptions of engineering have been documented through studies involving interviews, surveys, and word associations that take a direct approach to asking students about various aspects of their understanding of engineering. Research on perceptions of engineering rarely focuses on how students would portray engineering to others.…

  6. A small-scale land-sparing approach to conserving biological diversity in tropical agricultural landscapes

    Treesearch

    Richard B. Chandler; David I. King; Raul Raudales; Richard Trubey; Carlin Chandler; Víctor Julio Arce Chávez

    2013-01-01

    Two contrasting strategies have been proposed for conserving biological diversity while meeting the increasing demand for agricultural products: land sparing and land sharing production systems. Land sparing involves increasing yield to reduce the amount of land needed for agriculture, whereas land-sharing agricultural practices incorporate elements of native...

  7. 24 CFR 248.213 - Plan of action.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... of the effect of the proposed changes on existing tenants. (6) In the case of a plan of action involving incentives, an appraisal using the residential income approach; (7) In the case of a plan of... summary, the local HUD field office, and the on-site office for the project, or if one is not available...

  8. 24 CFR 248.213 - Plan of action.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... of the effect of the proposed changes on existing tenants. (6) In the case of a plan of action involving incentives, an appraisal using the residential income approach; (7) In the case of a plan of... summary, the local HUD field office, and the on-site office for the project, or if one is not available...

  9. 24 CFR 248.213 - Plan of action.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... of the effect of the proposed changes on existing tenants. (6) In the case of a plan of action involving incentives, an appraisal using the residential income approach; (7) In the case of a plan of... summary, the local HUD field office, and the on-site office for the project, or if one is not available...

  10. 24 CFR 248.213 - Plan of action.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... of the effect of the proposed changes on existing tenants. (6) In the case of a plan of action involving incentives, an appraisal using the residential income approach; (7) In the case of a plan of... summary, the local HUD field office, and the on-site office for the project, or if one is not available...

  11. 24 CFR 248.213 - Plan of action.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... of the effect of the proposed changes on existing tenants. (6) In the case of a plan of action involving incentives, an appraisal using the residential income approach; (7) In the case of a plan of... summary, the local HUD field office, and the on-site office for the project, or if one is not available...

  12. Division in a Binary Representation for Complex Numbers

    ERIC Educational Resources Information Center

    Blest, David C.; Jamil, Tariq

    2003-01-01

    Computer operations involving complex numbers, essential in such applications as Fourier transforms or image processing, are normally performed in a "divide-and-conquer" approach dealing separately with real and imaginary parts. A number of proposals have treated complex numbers as a single unit but all have foundered on the problem of the…

  13. Research Knowledge Transfer through Business-Driven Student Assignment

    ERIC Educational Resources Information Center

    Sas, Corina

    2009-01-01

    Purpose: The purpose of this paper is to present a knowledge transfer method that capitalizes on both research and teaching dimensions of academic work. It also aims to propose a framework for evaluating the impact of such a method on the involved stakeholders. Design/methodology/approach: The case study outlines and evaluates the six-stage…

  14. Exploring the Partnership between Line Managers and HRM in Greece

    ERIC Educational Resources Information Center

    Papalexandris, Nancy; Panayotopoulou, Leda

    2005-01-01

    Purpose: This article seeks to discuss the role that line managers take up concerning human resource management issues among Greek firms and to propose ways for enhancing the synergistic relationship between human resource (HR) and line managers. Design/methodology/approach: It presents the trends of line management involvement in Greek firms,…

  15. 76 FR 52034 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ... a period of time greater than one day because mathematical compounding prevents the Funds from... not actively managed by traditional methods, which typically involve effecting changes in the... mathematical approach to determine the type, quantity, and mix of investment positions that it believes should...

  16. Systems medicine and integrated care to combat chronic noncommunicable diseases

    PubMed Central

    2011-01-01

    We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment, socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems. PMID:21745417

  17. A Multicriteria Decision Making Approach for Estimating the Number of Clusters in a Data Set

    PubMed Central

    Peng, Yi; Zhang, Yong; Kou, Gang; Shi, Yong

    2012-01-01

    Determining the number of clusters in a data set is an essential yet difficult step in cluster analysis. Since this task involves more than one criterion, it can be modeled as a multiple criteria decision making (MCDM) problem. This paper proposes an MCDM-based approach to estimating the number of clusters for a given data set. In this approach, MCDM methods treat different numbers of clusters as alternatives and the outputs of a clustering algorithm on validity measures as criteria. The proposed method is examined in an experimental study using three MCDM methods, the well-known k-means clustering algorithm, ten relative measures, and fifteen public-domain UCI machine learning data sets. The results show that MCDM methods work fairly well in estimating the number of clusters in the data and outperform the ten relative measures considered in the study. PMID:22870181
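
    A minimal sketch of the alternatives-versus-criteria framing, with a plain equal-weight ranking standing in for the three MCDM methods the study actually examines (the data set, the two validity indices, and the weighting are illustrative assumptions):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.datasets import make_blobs
        from sklearn.metrics import silhouette_score, calinski_harabasz_score

        X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
        ks = list(range(2, 8))                     # candidate cluster counts = alternatives
        scores = []
        for k in ks:
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
            scores.append([silhouette_score(X, labels),      # criteria = validity
                           calinski_harabasz_score(X, labels)])
        M = np.array(scores)
        M = (M - M.min(axis=0)) / (M.max(axis=0) - M.min(axis=0))  # normalize criteria
        best_k = ks[int(np.argmax(M.mean(axis=1)))]                # equal-weight ranking
        print("estimated number of clusters:", best_k)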

  18. Generation algorithm of craniofacial structure contour in cephalometric images

    NASA Astrophysics Data System (ADS)

    Mondal, Tanmoy; Jain, Ashish; Sardana, H. K.

    2010-02-01

    Anatomical structure tracing on cephalograms is a significant step in cephalometric analysis. Computerized cephalometric analysis involves both manual and automatic approaches; the manual approach is limited in accuracy and repeatability. In this paper we develop and test a novel method for automatic localization of craniofacial structure based on edges detected in the region of interest. Based on the grey-scale features of the different regions of the cephalometric images, an algorithm for obtaining the tissue contour is put forward. Using edge detection with a specific threshold, an improved bidirectional contour-tracing approach is proposed: after interactive selection of the starting edge pixels, the tracking process repeatedly searches for an edge pixel in the neighborhood of the previously found edge pixel to segment the image, and the craniofacial structures are then obtained. The effectiveness of the algorithm is demonstrated by preliminary experimental results obtained with the proposed method.

  19. Semi-automated knowledge discovery: identifying and profiling human trafficking

    NASA Astrophysics Data System (ADS)

    Poelmans, Jonas; Elzinga, Paul; Ignatov, Dmitry I.; Kuznetsov, Sergei O.

    2012-11-01

    We propose an iterative and human-centred knowledge discovery methodology based on formal concept analysis. The proposed approach recognizes the important role of the domain expert in mining real-world enterprise applications and makes use of specific domain knowledge, including human intelligence and domain-specific constraints. Our approach was empirically validated at the Amsterdam-Amstelland police to identify suspects and victims of human trafficking in 266,157 suspicious activity reports. Based on guidelines of the Attorney Generals of the Netherlands, we first defined multiple early warning indicators that were used to index the police reports. Using concept lattices, we revealed numerous unknown human trafficking and loverboy suspects. In-depth investigation by the police confirmed their involvement in illegal activities, resulting in actual arrests. Our human-centred approach was embedded into operational policing practice and is now successfully used on a daily basis to cope with the vastly growing amount of unstructured information.

  20. Fast Legendre moment computation for template matching

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2017-05-01

    Normalized cross correlation (NCC)-based template matching is insensitive to intensity changes and has many applications in image processing, object detection, video tracking and pattern recognition. However, normalized cross correlation is computationally expensive since it involves both correlation computation and normalization. In this paper, we propose a Legendre moment approach for fast normalized cross correlation and show that the computational cost of the proposed approach is independent of the template mask size, making it significantly faster than traditional mask-size-dependent approaches, especially for large mask templates. Legendre polynomials have been widely used in solving the Laplace equation in electrodynamics in spherical coordinate systems and the Schrödinger equation in quantum mechanics. In this paper, we extend Legendre polynomials from physics to the computer vision and pattern recognition fields, and demonstrate that they can significantly reduce the computational cost of NCC-based template matching.
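
    For orientation, the standard definition of NCC at offset (u, v), whose brute-force evaluation costs on the order of the template size per offset (the dependence the moment-based formulation removes); f is the image, t the template, and the bars denote means over the current window:

        \mathrm{NCC}(u,v) =
          \frac{\sum_{x,y} \bigl[f(x,y) - \bar f_{u,v}\bigr]\,\bigl[t(x-u,\,y-v) - \bar t\bigr]}
               {\sqrt{\sum_{x,y} \bigl[f(x,y) - \bar f_{u,v}\bigr]^{2}\;
                      \sum_{x,y} \bigl[t(x-u,\,y-v) - \bar t\bigr]^{2}}}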

  1. Interprofessional communication and medical error: a reframing of research questions and approaches.

    PubMed

    Varpio, Lara; Hall, Pippa; Lingard, Lorelei; Schryer, Catherine F

    2008-10-01

    Progress toward understanding the links between interprofessional communication and issues of medical error has been slow. Recent research proposes that this delay may result from overlooking the complexities involved in interprofessional care. Medical education initiatives in this domain tend to simplify the complexities of team membership fluidity, rotation, and use of communication tools. A new theoretically informed research approach is required to take into account these complexities. To generate such an approach, we review two theories from the social sciences: Activity Theory and Knotworking. Using these perspectives, we propose that research into interprofessional communication and medical error can develop better understandings of (1) how and why medical errors are generated and (2) how and why gaps in team defenses occur. Such complexities will have to be investigated if students and practicing clinicians are to be adequately prepared to work safely in interprofessional teams.

  2. An Exact Formula for Calculating Inverse Radial Lens Distortions

    PubMed Central

    Drap, Pierre; Lefèvre, Julien

    2016-01-01

    This article presents a new approach to calculating the inverse of radial distortions. The method presented here provides a model of inverse radial distortion (currently modeled by a polynomial expression) as another polynomial expression whose new coefficients are functions of the original ones. After describing the state of the art, the proposed method is developed. It is based on a formal calculus involving a power series used to deduce a recursive formula for the new coefficients. We present several implementations of this method and describe the experiments conducted to assess the validity of the new approach. Such a non-iterative approach, using another polynomial expression that can be deduced from the first one, can be interesting in terms of performance, reuse of existing software, or bridging between existing software tools that do not consider distortion from the same point of view. PMID:27258288
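
    To illustrate the kind of relation involved (first terms only, obtained by standard series reversion; the paper's recursive formula generates all orders): if the forward distortion is modeled as

        r_d = r_u \left( 1 + k_1 r_u^{2} + k_2 r_u^{4} + \dots \right),

    then the inverse has the same polynomial form with coefficients expressed in terms of the original ones:

        r_u = r_d \left( 1 + b_1 r_d^{2} + b_2 r_d^{4} + \dots \right),
        \qquad b_1 = -k_1, \quad b_2 = 3k_1^{2} - k_2 .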

  3. A sequential factorial analysis approach to characterize the effects of uncertainties for supporting air quality management

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Veawab, A.

    2013-03-01

    This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.

  4. A review of active learning approaches to experimental design for uncovering biological networks

    PubMed Central

    2017-01-01

    Various types of biological knowledge describe networks of interactions among elementary entities. For example, transcriptional regulatory networks consist of interactions among proteins and genes. Current knowledge about the exact structure of such networks is highly incomplete, and laboratory experiments that manipulate the entities involved are conducted to test hypotheses about these networks. In recent years, various automated approaches to experiment selection have been proposed. Many of these approaches can be characterized as active machine learning algorithms. Active learning is an iterative process in which a model is learned from data, hypotheses are generated from the model to propose informative experiments, and the experiments yield new data that is used to update the model. This review describes the various models, experiment selection strategies, validation techniques, and successful applications described in the literature; highlights common themes and notable distinctions among methods; and identifies likely directions of future research and open problems in the area. PMID:28570593
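
    A generic sketch of the model-hypothesis-experiment loop described above, using uncertainty sampling on synthetic data (a template only; the reviewed methods differ in their models, selection strategies, and validation):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=500, random_state=0)
        labeled = list(range(10))                          # initial observations
        pool = [i for i in range(len(X)) if i not in labeled]
        model = LogisticRegression(max_iter=1000)
        for _ in range(20):                                # budget of 20 "experiments"
            model.fit(X[labeled], y[labeled])              # learn model from data
            probs = model.predict_proba(X[pool])[:, 1]     # model's predictions on pool
            query = pool[int(np.argmin(np.abs(probs - 0.5)))]  # most uncertain point
            labeled.append(query)                          # run experiment, observe label
            pool.remove(query)
        print("final accuracy:", model.score(X, y))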

  5. Inelastic cotunneling with energy-dependent contact transmission

    NASA Astrophysics Data System (ADS)

    Blok, S.; Agundez Mojarro, R. R.; Maduro, L. A.; Blaauboer, M.; Van Der Molen, S. J.

    2017-03-01

    We investigate inelastic cotunneling in a model system where the charging island is connected to the leads through molecules with energy-dependent transmission functions. To study this problem, we propose two different approaches. The first is a pragmatic approach that assumes Lorentzian-like transmission functions that determine the transmission probability to the island. Using this model, we calculate current versus voltage (IV) curves for increasing resonance level positions of the molecule. We find that shifting the resonance energy of the molecule away from the Fermi energy of the contacts leads to a decreased current at low bias, but as bias increases, this difference decreases and eventually inverts. This is markedly different from IV behavior outside the cotunneling regime. The second approach involves multiple cotunneling, where the molecules themselves are also considered to be in the Coulomb blockade regime. We find here that when E_c ≫ eV, k_B T, the IV behavior approaches the original cotunneling behavior proposed by Averin and Nazarov [Phys. Rev. Lett. 65, 2446-2449 (1990)].
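
    One common parameterization of the single-level Lorentzian transmission of the kind assumed in the first approach (the paper's exact form may differ; here ε₀ is the molecular resonance energy relative to the contact Fermi level and Γ the level broadening):

        \mathcal{T}(E) = \frac{\Gamma^{2}}{(E - \varepsilon_0)^{2} + \Gamma^{2}}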

  6. Integration of Genome-Scale Modeling and Transcript Profiling Reveals Metabolic Pathways Underlying Light and Temperature Acclimation in Arabidopsis

    PubMed Central

    Töpfer, Nadine; Caldana, Camila; Grimbs, Sergio; Willmitzer, Lothar; Fernie, Alisdair R.; Nikoloski, Zoran

    2013-01-01

    Understanding metabolic acclimation of plants to challenging environmental conditions is essential for dissecting the role of metabolic pathways in growth and survival. As stresses involve simultaneous physiological alterations across all levels of cellular organization, a comprehensive characterization of the role of metabolic pathways in acclimation necessitates integration of genome-scale models with high-throughput data. Here, we present an integrative optimization-based approach, which, by coupling a plant metabolic network model and transcriptomics data, can predict the metabolic pathways affected in a single, carefully controlled experiment. Moreover, we propose three optimization-based indices that characterize different aspects of metabolic pathway behavior in the context of the entire metabolic network. We demonstrate that the proposed approach and indices facilitate quantitative comparisons and characterization of the plant metabolic response under eight different light and/or temperature conditions. The predictions of the metabolic functions involved in metabolic acclimation of Arabidopsis thaliana to the changing conditions are in line with experimental evidence and result in a hypothesis about the role of homocysteine-to-Cys interconversion and Asn biosynthesis. The approach can also be used to reveal the role of particular metabolic pathways in other scenarios, while taking into consideration the entirety of characterized plant metabolism. PMID:23613196

  7. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jun, K. S.; Chung, E.-S.

    2015-04-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with data fuzzification to quantify spatial flood vulnerability across multiple criteria. In general, the GDM method is an effective tool for formulating a compromise solution involving various decision makers, since stakeholders may have different perspectives on flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-based method can be used to obtain a nearly ideal solution according to all established criteria. By combining the GDM method with the fuzzy VIKOR method, the approach can effectively propose compromise decisions. The spatial flood vulnerability of the southern Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with that obtained using general MCDM methods, such as fuzzy TOPSIS, and classical GDM methods (i.e., Borda, Condorcet, and Copeland). As a result, the proposed fuzzy GDM approach can reduce the uncertainty in data confidence and weight-derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
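
    A minimal sketch of the crisp VIKOR core on which the fuzzy variant builds (the fuzzy method adds fuzzified ratings and group aggregation on top of this ranking logic; the decision matrix, weights, and benefit-type criteria here are illustrative assumptions):

        import numpy as np

        def vikor(F, w, v=0.5):
            """F: alternatives x criteria matrix, w: criteria weights; lower Q ranks first."""
            f_best, f_worst = F.max(axis=0), F.min(axis=0)
            d = w * (f_best - F) / (f_best - f_worst)    # weighted normalized gaps
            S, R = d.sum(axis=1), d.max(axis=1)          # group utility / individual regret
            Q = (v * (S - S.min()) / (S.max() - S.min())
                 + (1 - v) * (R - R.min()) / (R.max() - R.min()))
            return Q

        F = np.array([[0.7, 0.4, 0.9],    # hypothetical scores for three
                      [0.3, 0.8, 0.5],    # sub-basins on three criteria
                      [0.6, 0.6, 0.2]])
        print(vikor(F, w=np.array([0.5, 0.3, 0.2])))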

  8. An imprecise probability approach for squeal instability analysis based on evidence theory

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-01-01

    An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. All these uncertain parameters are usually involved with imprecise data such as incomplete information and conflict information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to the squeal problems without too many investigated parameters. It can be considered as a potential method for squeal instability analysis, which will act as the first step to reduce squeal noise of uncertain brakes with imprecise information.
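
    A minimal sketch of the two evidence-theory measures used above to evaluate system instability, computed from a basic probability assignment (BPA) over subsets of a toy frame of discernment {a, b, c}; the mass values are illustrative, not from the brake model:

        def belief(bpa, A):
            """Bel(A): total mass of focal elements contained in A."""
            return sum(m for B, m in bpa.items() if set(B) <= set(A))

        def plausibility(bpa, A):
            """Pl(A): total mass of focal elements intersecting A."""
            return sum(m for B, m in bpa.items() if set(B) & set(A))

        bpa = {("a",): 0.4, ("b",): 0.2, ("a", "b"): 0.3, ("a", "b", "c"): 0.1}
        print(belief(bpa, ("a", "b")))      # 0.9
        print(plausibility(bpa, ("a",)))    # 0.8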

  9. Uses of duckweed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hillman, W.S.; Culley, D.D. Jr.

    1978-07-01

    Among the various approaches to improving present technologies for waste-water treatment, several involve the use of plants, which can remove pollutants and provide materials useful as animal feeds or energy sources. Various aquatic plants are being proposed in such approaches, and the duckweeds in particular, an essentially unique group of higher aquatic plants, might be especially advantageous in such systems. Although this article focuses on only this one group of plants, it can nevertheless provide an introduction to issues that are both scientifically challenging and existentially inescapable.

  10. Get Your Requirements Straight: Storyboarding Revisited

    NASA Astrophysics Data System (ADS)

    Haesen, Mieke; Luyten, Kris; Coninx, Karin

    Current user-centred software engineering (UCSE) approaches provide many techniques to combine know-how available in multidisciplinary teams. Although the involvement of various disciplines is beneficial for the user experience of the future application, the transition from a user needs analysis to a structured interaction analysis and UI design is not always straightforward. We propose storyboards, enriched by metadata, to specify functional and non-functional requirements. Accompanying tool support should facilitate the creation and use of storyboards. We used a meta-storyboard for the verification of storyboarding approaches.

  11. A global CT to US registration of the lumbar spine

    NASA Astrophysics Data System (ADS)

    Nagpal, Simrin; Hacihaliloglu, Ilker; Ungi, Tamas; Rasoulian, Abtin; Osborn, Jill; Lessoway, Victoria A.; Rohling, Robert N.; Borschneck, Daniel P.; Abolmaesumi, Purang; Mousavi, Parvin

    2014-03-01

    During percutaneous lumbar spine needle interventions, alignment of the preoperative computed tomography (CT) with intraoperative ultrasound (US) can augment anatomical visualization for the clinician. We propose an approach to rigidly align CT and US data of the lumbar spine. The approach involves an intensity-based volume registration step, followed by a surface segmentation and a point-based registration of the entire lumbar spine volume. A clinical feasibility study resulted in mean registration error of approximately 3 mm between CT and US data.
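
    A sketch of the final point-based step under the assumption of known point correspondences (the classical least-squares rigid alignment via SVD, i.e. the Kabsch/Procrustes solution; the intensity-based initialization and surface segmentation steps of the paper are not shown):

        import numpy as np

        def rigid_register(P, Q):
            """Find R, t minimizing ||R @ P + t - Q|| for corresponding 3xN point sets."""
            cP, cQ = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
            H = (P - cP) @ (Q - cQ).T                  # 3x3 cross-covariance
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T                         # proper rotation (det = +1)
            t = cQ - R @ cP
            return R, t

        # Demo with a known rotation about z and a known translation.
        theta = np.deg2rad(30)
        R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                           [np.sin(theta),  np.cos(theta), 0],
                           [0, 0, 1]])
        P = np.random.default_rng(0).standard_normal((3, 50))
        Q = R_true @ P + np.array([[1.0], [2.0], [3.0]])
        R, t = rigid_register(P, Q)
        print(np.allclose(R, R_true), t.ravel())       # True [1. 2. 3.]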

  12. Flexible Web services integration: a novel personalised social approach

    NASA Astrophysics Data System (ADS)

    Metrouh, Abdelmalek; Mokhati, Farid

    2018-05-01

    Dynamic composition or integration remains one of the key objectives of Web services technology. This paper aims to propose an innovative approach to dynamic Web services composition based on functional and non-functional attributes and individual preferences. In this approach, social networks of Web services are used to maintain interactions between Web services in order to select and compose Web services that are more tightly related to the user's preferences. We use the concept of a Web services community in a social network of Web services to considerably reduce their search space. These communities are created by the direct involvement of Web services providers.

  13. Ontology-based automatic generation of computerized cognitive exercises.

    PubMed

    Leonardi, Giorgio; Panzarasa, Silvia; Quaglini, Silvana

    2011-01-01

    Computer-based approaches can add great value to traditional paper-based approaches for cognitive rehabilitation. Managing a large set of stimuli and using multimedia features improves the patient's involvement and allows stimuli to be reused and recombined to create new exercises, whose difficulty level should be adapted to the patient's performance. This work proposes an ontological organization of the stimuli to support the automatic generation of new exercises, tailored to the patient's preferences and skills, and its integration into a commercial cognitive rehabilitation tool. The possibilities offered by this approach are presented with the help of real examples.

  14. Pedagogical applications of cognitive research on musical improvisation

    PubMed Central

    Biasutti, Michele

    2015-01-01

    This paper presents a model for the implementation of educational activities involving musical improvisation that is based on a review of the literature on the psychology of music. Psychology of music is a complex field of research in which quantitative and qualitative methods have been employed involving participants ranging from novices to expert performers. The cognitive research has been analyzed to propose a pedagogical approach to the development of processes rather than products that focus on an expert’s use of improvisation. The intention is to delineate a reflective approach that goes beyond the mere instruction of some current practices of teaching improvisation in jazz pedagogy. The review highlights that improvisation is a complex, multidimensional act that involves creative and performance behaviors in real-time in addition to processes such as sensory and perceptual encoding, motor control, performance monitoring, and memory storage and recall. Educational applications for the following processes are outlined: anticipation, use of repertoire, emotive communication, feedback, and flow. These characteristics are discussed in relation to the design of a pedagogical approach to musical improvisation based on reflection and metacognition development. PMID:26029147

  15. Pedagogical applications of cognitive research on musical improvisation.

    PubMed

    Biasutti, Michele

    2015-01-01

    This paper presents a model for the implementation of educational activities involving musical improvisation that is based on a review of the literature on the psychology of music. Psychology of music is a complex field of research in which quantitative and qualitative methods have been employed involving participants ranging from novices to expert performers. The cognitive research has been analyzed to propose a pedagogical approach to the development of processes rather than products that focus on an expert's use of improvisation. The intention is to delineate a reflective approach that goes beyond the mere instruction of some current practices of teaching improvisation in jazz pedagogy. The review highlights that improvisation is a complex, multidimensional act that involves creative and performance behaviors in real-time in addition to processes such as sensory and perceptual encoding, motor control, performance monitoring, and memory storage and recall. Educational applications for the following processes are outlined: anticipation, use of repertoire, emotive communication, feedback, and flow. These characteristics are discussed in relation to the design of a pedagogical approach to musical improvisation based on reflection and metacognition development.

  16. Control Synthesis of Discrete-Time T-S Fuzzy Systems: Reducing the Conservatism Whilst Alleviating the Computational Burden.

    PubMed

    Xie, Xiangpeng; Yue, Dong; Zhang, Huaguang; Peng, Chen

    2017-09-01

    The augmented multi-indexed matrix approach is a powerful tool for reducing the conservatism of control synthesis for discrete-time Takagi-Sugeno fuzzy systems. However, its computational burden is sometimes too heavy as a tradeoff. Reducing the conservatism while alleviating the computational burden is an ideal but very challenging problem. This paper works toward an efficient way of achieving a satisfactory answer. Different from the augmented multi-indexed matrix approach in the literature, we aim to design a more efficient slack variable approach under a general framework of homogeneous matrix polynomials. Thanks to the introduction of a new extended representation for homogeneous matrix polynomials, related matrices with the same coefficient are collected together into one set, and thus the redundant terms of the augmented multi-indexed matrix approach can be removed, i.e., the computational burden can be alleviated. More importantly, because more useful information is involved in the control design, the conservatism of the proposed approach is also less than that of the augmented multi-indexed matrix approach. Finally, numerical experiments are given to show the effectiveness of the proposed approach.

  17. Technology and education: First approach for measuring temperature with Arduino

    NASA Astrophysics Data System (ADS)

    Carrillo, Alejandro

    2017-04-01

    This poster session presents some ideas and approaches for understanding the concepts of thermal equilibrium, temperature and heat, in order to build a harmonious and responsible man-nature relationship, emphasizing the interaction between science and technology without neglecting the relationship between the environment and society: an approach to sustainability. We propose the development of practices that involve the use of modern, easily accessible and low-cost technology to measure temperature. We believe that the Arduino microcontroller and some temperature sensors can open the doors of innovation to carry out such practices. In this work we present some results of simple practices presented to a population of students between 16 and 17 years of age. The practices in this proposal are: the zeroth law of thermodynamics and the concept of temperature, the calibration of thermometers, and the measurement of temperature during heating and cooling of three different substances under the same physical conditions. Finally, each student is asked to make an application that involves measuring temperature and other physical parameters. Some suggestions are: determine the temperature at which we eat certain foods, measure the temperature difference between different rooms of a house, identify housing constructions that favour optimal conditions, measure the temperature of different regions, measure temperature through different colour filters, relate solar activity and UV, and propose applications for understanding current problems such as global warming. It is concluded that the Arduino practices and electrical sensors broaden the cultural horizon of the students while awakening their interest in understanding their operation, basic physics, and its applications from a modern perspective.
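
    A hypothetical sketch of the host-side logging such a practice might use, assuming the Arduino sketch prints one temperature reading in degrees Celsius per line over USB serial (the port name and baud rate are assumptions; requires the pyserial package):

        import time
        import serial  # pip install pyserial

        with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as port:
            for _ in range(10):                     # ten samples, roughly one per second
                line = port.readline().decode("ascii", errors="ignore").strip()
                if line:
                    print(f"{time.strftime('%H:%M:%S')}  {float(line):.1f} C")
                time.sleep(1)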

  18. One-Channel Surface Electromyography Decomposition for Muscle Force Estimation.

    PubMed

    Sun, Wentao; Zhu, Jinying; Jiang, Yinlai; Yokoi, Hiroshi; Huang, Qiang

    2018-01-01

    Estimating muscle force by surface electromyography (sEMG) is a non-invasive and flexible way to diagnose biomechanical diseases and control assistive devices such as prosthetic hands. To estimate muscle force using sEMG, a supervised method is commonly adopted. This requires simultaneous recording of sEMG signals and muscle force measured by additional devices to tune the variables involved. However, recording the muscle force of the lost limb of an amputee is challenging, and the supervised method has limitations in this regard. Although the unsupervised method does not require muscle force recording, it suffers from low accuracy due to a lack of reference data. To achieve accurate and easy estimation of muscle force by the unsupervised method, we propose a decomposition of one-channel sEMG signals into constituent motor unit action potentials (MUAPs) in two steps: (1) learning an orthogonal basis of sEMG signals through reconstruction independent component analysis; (2) extracting spike-like MUAPs from the basis vectors. Nine healthy subjects were recruited to evaluate the accuracy of the proposed approach in estimating muscle force of the biceps brachii. The results demonstrated that the proposed approach based on decomposed MUAPs explains more than 80% of the muscle force variability recorded at an arbitrary force level, while the conventional amplitude-based approach explains only 62.3% of this variability. With the proposed approach, we were also able to achieve grip force control of a prosthetic hand, which is one of the most important clinical applications of the unsupervised method. Experiments on two trans-radial amputees indicated that the proposed approach improves the performance of the prosthetic hand in grasping everyday objects.
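
    A sketch of step (1), learning a basis from windowed one-channel sEMG. FastICA is used here as an accessible stand-in for the paper's reconstruction ICA, and the signal is simulated rather than recorded, so this only illustrates the decomposition pipeline:

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        emg = rng.standard_normal(20000)                  # placeholder sEMG trace
        win = 64
        segments = emg[: len(emg) // win * win].reshape(-1, win)  # windowed signal
        ica = FastICA(n_components=16, random_state=0, max_iter=1000)
        sources = ica.fit_transform(segments)             # activations per window
        basis = ica.mixing_                               # columns ~ candidate MUAP shapes
        print(basis.shape)                                # (64, 16)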

  19. A De-centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper, we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is decentralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  20. A De-Centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is de-centralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  1. A proposed approach to the application of nonlinear irreversible thermodynamics to fracture in composite materials

    NASA Technical Reports Server (NTRS)

    Lindenmeyer, P. H.

    1983-01-01

    The fracture criterion upon which most fracture mechanics is based involves an energy balance that is not appropriate for the fracture mechanics of viscoelastic materials such as polymer matrix composites. A more appropriate criterion, based upon nonequilibrium thermodynamics and involving a power balance rather than an energy balance, is proposed. This criterion is based upon a reformulation of the second law of thermodynamics which focuses attention on the total Legendre transform of energy expressed as a functional over time and space. This excess energy functional can be shown to be equivalent to the Rice J integral if the only irreversible process is the propagation of a single crack completely through the thickness of the specimen and if the crack propagation is assumed to be independent of time. For the more general case of more than one crack in a viscoelastic medium, integration over both time and space is required. Two experimentally measurable parameters are proposed which should permit the evaluation of this more general fracture criterion.
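
    For reference, the standard path-independent form of the Rice J integral to which the excess energy functional reduces in the single-crack, time-independent case (Γ a contour around the crack tip, W the strain energy density, T_i the traction vector, u_i the displacement):

        J = \int_{\Gamma} \left( W \, dy \;-\; T_i \, \frac{\partial u_i}{\partial x} \, ds \right)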

  2. Proposal for an integrated evaluation model for the study of whole systems health care in cancer.

    PubMed

    Jonas, Wayne B; Beckner, William; Coulter, Ian

    2006-12-01

    For more than 200 years, biomedicine has approached the treatment of disease by studying disease processes (pathogenesis), inferring causal connections and developing specific approaches for therapeutically interfering with those processes. This pathogenic approach has been highly successful in acute and traumatic disease but less successful in chronic disease, primarily because of the complex, multi-factorial nature of most chronic disease, which does not allow for simple causal inference or for simple therapeutic interventions. This article suggests that chronic disease is best approached by enhancing healing processes (salutogenesis) as a whole system. Because of the nature of complex systems in chronic disease, an evaluation model based on integrative medicine is felt to be more appropriate than a disease model. The authors propose and describe an integrated model for the evaluation of healing (IMEH) that collects multilevel "thick case" observational data in assessing complex practices for chronic disease. If successful, this approach could become a blueprint for studying healing capacity in whole medical systems, including complementary medicine, traditional medicine, and conventional primary care. In addition, streamlining data collection and applying rapid informatics management might allow for such data to be used in guiding clinical practice. The IMEH involves collection, integration, and potentially feedback of relevant variables in the following areas: (1) sociocultural, (2) psychological and behavioral, (3) clinical (diagnosis based), and (4) biological. Evaluation and integration of these components would involve specialized research teams that feed their data into a single data management and information analysis center. These data can then be subjected to descriptive and pathway analysis providing "bench and bedside" information.

  3. Environmental Assessment Addressing FTFA07-1174, Repair Approach Lighting System at the North End of Runway 01/19 at Eglin AFB, Florida

    DTIC Science & Technology

    2013-05-01

    Materials (EA § 3.3, pages 3-10 to 3-12): Construction and demolition activities associated with the Proposed Action will involve the use of hazardous... impacts will only last during those activities and will not be cumulatively significant. Considering the use of management actions to minimize the... potential for adverse effects on listed species, implementation of the Proposed Action is not anticipated to have significant cumulative effects

  4. A building block for hardware belief networks.

    PubMed

    Behin-Aein, Behtash; Diep, Vinh; Datta, Supriyo

    2016-07-21

    Belief networks represent a powerful approach to problems involving probabilistic inference, but much of the work in this area is software-based, utilizing standard deterministic hardware built on the transistor, which provides the gain and directionality needed to interconnect billions of devices into useful networks. This paper proposes a transistor-like device that could provide an analogous building block for probabilistic networks. We present two proof-of-concept examples of belief networks, one reciprocal and one non-reciprocal, implemented using the proposed device, which is simulated using experimentally benchmarked models.
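
    A toy two-node belief network (Rain -> WetGrass) evaluated in software by direct enumeration, illustrating the kind of probabilistic inference such hardware building blocks would embody (the probabilities are invented for the example):

        p_rain = 0.2
        p_wet_given = {True: 0.9, False: 0.1}            # P(WetGrass | Rain)

        # P(Rain | WetGrass) via Bayes' rule over the two cases of Rain.
        p_wet = p_rain * p_wet_given[True] + (1 - p_rain) * p_wet_given[False]
        posterior = p_rain * p_wet_given[True] / p_wet
        print(f"P(Rain | WetGrass) = {posterior:.3f}")   # 0.18 / 0.26 = 0.692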

  5. A novel iterative scheme and its application to differential equations.

    PubMed

    Khan, Yasir; Naeem, F; Šmarda, Zdeněk

    2014-01-01

    The purpose of this paper is to employ an alternative approach to reconstruct the standard variational iteration algorithm II proposed by He, including the Lagrange multiplier, and to give a simpler formulation of the Adomian decomposition and modified Adomian decomposition methods in terms of the newly proposed variational iteration method-II (VIM). Through careful investigation of the earlier variational iteration algorithm and the Adomian decomposition method, we find unnecessary calculations of the Lagrange multiplier and repeated calculations in each iteration, respectively. Several examples are given to verify the reliability and efficiency of the method.
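
    For orientation, the standard correction functional of He's variational iteration method for an equation Lu + Nu = g(t), where λ is the Lagrange multiplier identified variationally and the tilde marks a restricted variation (shown here as background; the paper's reconstruction modifies how λ is obtained):

        u_{n+1}(t) = u_n(t) + \int_0^{t} \lambda(s)\,
            \bigl[ L u_n(s) + N \tilde{u}_n(s) - g(s) \bigr] \, ds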

  6. A Research-Inspired and Computer-Guided Clinical Interview for Mathematics Assessment: Introduction, Reliability and Validity

    ERIC Educational Resources Information Center

    Ginsburg, Herbert P.; Lee, Young-Sun; Pappas, Sandra

    2016-01-01

    Formative assessment involves the gathering of information that can guide the teaching of individual or groups of children. This approach requires a sound understanding of children's thinking and learning, as well as an effective method for gaining the information. We propose that formative assessment should employ a version of clinical…

  7. Relations between Informational Sources, Self-Efficacy and Academic Achievement: A Developmental Approach

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2012-01-01

    As a cognitive-motivational construct, self-efficacy has been researched extensively and has involved two important lines of inquiries, namely the impact of sources of information on self-efficacy and the predictive effect of self-efficacy on learning outcomes. We proposed and tested the relations between the four major sources of information…

  8. Voice and Dialogue in Teaching Reading/Writing to Qatari Students

    ERIC Educational Resources Information Center

    Golkowska, Krystyna U.

    2013-01-01

    This paper describes an attempt to improve the reading comprehension and writing skills of students coming from an oral culture. The proposed approach involves using voice and dialogue--understood literally and metaphorically--as a tool in teaching students how to engage texts and write with a reader in mind. The author discusses a pilot study…

  9. The Long Term Effectiveness of Intensive Stuttering Therapy: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Irani, Farzan; Gabel, Rodney; Daniels, Derek; Hughes, Stephanie

    2012-01-01

    Purpose: The purpose of this study was to gain a deeper understanding of client perceptions of an intensive stuttering therapy program that utilizes a multi-faceted approach to therapy. The study also proposed to gain a deeper understanding about the process involved in long-term maintenance of meaningful changes made in therapy. Methods: The…

  10. Testing Mediation in Structural Equation Modeling: The Effectiveness of the Test of Joint Significance

    ERIC Educational Resources Information Center

    Leth-Steensen, Craig; Gallitto, Elena

    2016-01-01

    A large number of approaches have been proposed for estimating and testing the significance of indirect effects in mediation models. In this study, four sets of Monte Carlo simulations involving full latent variable structural equation models were run in order to contrast the effectiveness of the currently popular bias-corrected bootstrapping…
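
    A sketch of the resampling idea behind the bootstrap tests compared in the study: a plain percentile bootstrap of the indirect effect a*b on simulated single-indicator data (the study itself uses bias-corrected bootstrapping within full latent variable models, so this is a simplification):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        X = rng.standard_normal(n)
        M = 0.5 * X + rng.standard_normal(n)                   # true a-path = 0.5
        Y = 0.4 * M + rng.standard_normal(n)                   # true b-path = 0.4

        def indirect(idx):
            a = np.polyfit(X[idx], M[idx], 1)[0]               # slope of M on X
            D = np.column_stack([M[idx], X[idx], np.ones(len(idx))])
            b = np.linalg.lstsq(D, Y[idx], rcond=None)[0][0]   # slope of Y on M, given X
            return a * b

        boots = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"95% percentile CI for a*b: [{lo:.3f}, {hi:.3f}]")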

  11. Computational Scientific Inquiry with Virtual Worlds and Agent-Based Models: New Ways of Doing Science to Learn Science

    ERIC Educational Resources Information Center

    Jacobson, Michael J.; Taylor, Charlotte E.; Richards, Deborah

    2016-01-01

    In this paper, we propose computational scientific inquiry (CSI) as an innovative model for learning important scientific knowledge and new practices for "doing" science. This approach involves the use of a "game-like" virtual world for students to experience virtual biological fieldwork in conjunction with using an agent-based…

  12. The Unified Classification System (UCS): improving our understanding of periprosthetic fractures.

    PubMed

    Duncan, C P; Haddad, F S

    2014-06-01

    Periprosthetic fractures are an increasingly common complication following joint replacement. The principles which underpin their evaluation and treatment are common across the musculoskeletal system. The Unified Classification System proposes a rational approach to treatment, regardless of the bone that is broken or the joint involved. ©2014 The British Editorial Society of Bone & Joint Surgery.

  13. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods.

    PubMed

    Suleimanov, Yury V; Green, William H

    2015-09-08

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using double- and single-ended transition-state optimization algorithms in cooperation--the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several single-molecule systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect, without any human intervention, not only "known" reaction pathways, manually detected in previous studies, but also new, previously "unknown" reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  14. Testing the TPF Interferometry Approach before Launch

    NASA Technical Reports Server (NTRS)

    Serabyn, Eugene; Mennesson, Bertrand

    2006-01-01

    One way to directly detect nearby extra-solar planets is via their thermal infrared emission, and with this goal in mind, both NASA and ESA are investigating cryogenic infrared interferometers. Common to both agencies' approaches to faint off-axis source detection near bright stars is the use of a rotating nulling interferometer, such as the Terrestrial Planet Finder interferometer (TPF-I), or Darwin. In this approach, the central star is nulled, while the emission from off-axis sources is transmitted and modulated by the rotation of the off-axis fringes. Because of the high contrasts involved, and the novelty of the measurement technique, it is essential to gain experience with this technique before launch. Here we describe a simple ground-based experiment that can test the essential aspects of the TPF signal measurement and image reconstruction approaches by generating a rotating interferometric baseline within the pupil of a large singleaperture telescope. This approach can mimic potential space-based interferometric configurations, and allow the extraction of signals from off-axis sources using the same algorithms proposed for the space-based missions. This approach should thus allow for testing of the applicability of proposed signal extraction algorithms for the detection of single and multiple near-neighbor companions...

  15. Standards for gene therapy clinical trials based on pro-active risk assessment in a London NHS Teaching Hospital Trust.

    PubMed

    Bamford, K B; Wood, S; Shaw, R J

    2005-02-01

    Conducting gene therapy clinical trials with genetically modified organisms as the vectors presents unique safety and infection control issues. The area is governed by a range of legislation and guidelines, some unique to this field, as well as those pertinent to any area of clinical work. The relevant regulations covering gene therapy using genetically modified vectors are reviewed and illustrated with the approach taken by a large teaching hospital NHS Trust. Key elements were Trust-wide communication and involvement of staff in a pro-active approach to risk management, with specific emphasis on staff training and engagement, waste management, audit and record keeping. This process has led to the development of proposed standards for clinical trials involving genetically modified micro-organisms.

  16. Monte Carlo simulations of mixtures involving ketones and aldehydes by a direct bubble pressure calculation.

    PubMed

    Ferrando, Nicolas; Lachet, Véronique; Boutin, Anne

    2010-07-08

    Ketone and aldehyde molecules are involved in a large variety of industrial applications. Because they mainly occur mixed with other compounds, the prediction of phase equilibrium of mixtures involving these classes of molecules is of primary interest, particularly for designing and optimizing separation processes. The main goal of this work is to propose a transferable force field for ketones and aldehydes that allows accurate molecular simulations not only of pure compounds but also of complex mixtures. The proposed force field is based on the anisotropic united-atoms AUA4 potential developed for hydrocarbons, and it introduces only one new atom, the carbonyl oxygen. The Lennard-Jones parameters of this oxygen atom have been adjusted on saturated thermodynamic properties of both acetone and acetaldehyde. To simulate mixtures, Monte Carlo simulations are carried out in a specific pseudoensemble which allows a direct calculation of the bubble pressure. For the polar mixtures involved in this study, we show that this approach is an interesting alternative to classical calculations in the isothermal-isobaric Gibbs ensemble. The pressure-composition diagrams of polar + polar and polar + nonpolar binary mixtures are well reproduced. Mutual solubilities as well as azeotrope locations, if present, are accurately predicted without any empirical binary interaction parameters or readjustment. Such a result highlights the transferability of the proposed force field, which is an essential feature toward the simulation of complex oxygenated mixtures of industrial interest.
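
    For reference, the standard 12-6 Lennard-Jones form whose (ε, σ) parameters were fitted for the new carbonyl oxygen (in the full force field this dispersion-repulsion term is combined with electrostatic contributions):

        u_{\mathrm{LJ}}(r) = 4\varepsilon \left[
            \left( \frac{\sigma}{r} \right)^{12} - \left( \frac{\sigma}{r} \right)^{6}
        \right]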

  17. A Modified Dynamic Evolving Neural-Fuzzy Approach to Modeling Customer Satisfaction for Affective Design

    PubMed Central

    Kwong, C. K.; Fung, K. Y.; Jiang, Huimin; Chan, K. Y.

    2013-01-01

    Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has recently been attempted to model customer satisfaction for affective design, and it has proved effective in dealing with the fuzziness and non-linearity of the modeling as well as in generating explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for modeling problems which involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above-mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and fuzzy c-means clustering-based ANFIS model in terms of their modeling accuracy and computational effort. PMID:24385884

  18. A modified dynamic evolving neural-fuzzy approach to modeling customer satisfaction for affective design.

    PubMed

    Kwong, C K; Fung, K Y; Jiang, Huimin; Chan, K Y; Siu, Kin Wai Michael

    2013-01-01

    Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has recently been attempted to model customer satisfaction for affective design, and it has proved effective in dealing with the fuzziness and non-linearity of the modeling as well as in generating explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for modeling problems which involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above-mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and fuzzy c-means clustering-based ANFIS model in terms of their modeling accuracy and computational effort.

  19. Gene Function Hypotheses for the Campylobacter jejuni Glycome Generated by a Logic-Based Approach

    PubMed Central

    Sternberg, Michael J.E.; Tamaddoni-Nezhad, Alireza; Lesk, Victor I.; Kay, Emily; Hitchen, Paul G.; Cootes, Adrian; van Alphen, Lieke B.; Lamoureux, Marc P.; Jarrell, Harold C.; Rawlings, Christopher J.; Soo, Evelyn C.; Szymanski, Christine M.; Dell, Anne; Wren, Brendan W.; Muggleton, Stephen H.

    2013-01-01

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning—the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. PMID:23103756

  20. Gene function hypotheses for the Campylobacter jejuni glycome generated by a logic-based approach.

    PubMed

    Sternberg, Michael J E; Tamaddoni-Nezhad, Alireza; Lesk, Victor I; Kay, Emily; Hitchen, Paul G; Cootes, Adrian; van Alphen, Lieke B; Lamoureux, Marc P; Jarrell, Harold C; Rawlings, Christopher J; Soo, Evelyn C; Szymanski, Christine M; Dell, Anne; Wren, Brendan W; Muggleton, Stephen H

    2013-01-09

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning: the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. A simple method for assessing occupational exposure via the one-way random effects model.

    PubMed

    Krishnamoorthy, K; Mathew, Thomas; Peng, Jie

    2016-11-01

    A one-way random effects model is postulated for the log-transformed shift-long personal exposure measurements, where the random effect in the model represents an effect due to the worker. Simple closed-form confidence intervals are proposed for the relevant parameters of interest using the method of variance estimates recovery (MOVER). The performance of the confidence bounds is evaluated and compared with those based on the generalized confidence interval approach. Comparison studies indicate that the proposed MOVER confidence bounds are better than the generalized confidence bounds for the overall mean exposure and an upper percentile of the exposure distribution. The proposed methods are illustrated using a few examples involving industrial hygiene data.
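
    As an illustration of the MOVER recovery step described above, the Python sketch below computes a confidence interval for the mean of a lognormal exposure distribution from i.i.d. log-scale measurements. This simplified setting drops the worker random effect of the paper's model; the function name and the i.i.d. assumption are illustrative only.

        import numpy as np
        from scipy import stats

        def mover_ci_lognormal_mean(x, alpha=0.05):
            # Log-scale parameter of interest: theta = mu + sigma^2 / 2,
            # so exp(theta) is the mean exposure.
            y = np.log(np.asarray(x, dtype=float))
            n = len(y)
            ybar, s2 = y.mean(), y.var(ddof=1)
            # Separate CIs for mu (t-based) and sigma^2/2 (chi-square based).
            tq = stats.t.ppf(1 - alpha / 2, n - 1)
            l1, u1 = ybar - tq * np.sqrt(s2 / n), ybar + tq * np.sqrt(s2 / n)
            l2 = (n - 1) * s2 / (2 * stats.chi2.ppf(1 - alpha / 2, n - 1))
            u2 = (n - 1) * s2 / (2 * stats.chi2.ppf(alpha / 2, n - 1))
            # MOVER: recover variance estimates from the individual limits.
            theta = ybar + s2 / 2
            lower = theta - np.sqrt((ybar - l1) ** 2 + (s2 / 2 - l2) ** 2)
            upper = theta + np.sqrt((u1 - ybar) ** 2 + (u2 - s2 / 2) ** 2)
            return np.exp(lower), np.exp(upper)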

  2. Information Pre-Processing using Domain Meta-Ontology and Rule Learning System

    NASA Astrophysics Data System (ADS)

    Ranganathan, Girish R.; Biletskiy, Yevgen

    Around the globe, extraordinary numbers of documents are being created by Enterprises and by users outside these Enterprises. The documents created in the Enterprises constitute the main focus of the present chapter. These documents are used for a great deal of machine processing. When these documents are used for machine processing, a lack of semantics in the information they contain may cause misinterpretation, thereby inhibiting the productiveness of computer-assisted analytical work. Hence, it would be profitable for Enterprises to use well-defined domain ontologies that serve as rich sources of semantics for the information in the documents. These domain ontologies can be created manually, semi-automatically or fully automatically. The focus of this chapter is to propose an intermediate solution that enables relatively easy creation of these domain ontologies. The process of extracting and capturing domain ontologies from these voluminous documents requires extensive involvement of domain experts and the application of ontology learning methods that are substantially labor intensive; therefore, intermediate solutions that assist in capturing domain ontologies must be developed. This chapter proposes such a solution: building a meta-ontology as a rapid approach to conceptualizing a domain of interest from a huge number of source documents, with the meta-ontology serving as an intermediate information source for the main domain ontology. This meta-ontology can be populated with ontological concepts, attributes and relations from documents, and then refined to form a better domain ontology, either through automatic ontology learning methods or some other relevant ontology building approach.

  3. A Three-Dimensional Approach and Open Source Structure for the Design and Experimentation of Teaching-Learning Sequences: The case of friction

    NASA Astrophysics Data System (ADS)

    Besson, Ugo; Borghi, Lidia; De Ambrosis, Anna; Mascheretti, Paolo

    2010-07-01

    We have developed a teaching-learning sequence (TLS) on friction based on a preliminary study involving three dimensions: an analysis of didactic research on the topic, an overview of usual approaches, and a critical analysis of the subject, considered also in its historical development. We found that the usual presentations mostly do not take into account the complexity of friction as it emerges from scientific research, may reinforce some inaccurate student conceptions, and favour a limited vision of friction phenomena. The TLS we propose begins by considering a wide range of friction phenomena to favour an initial motivation and a broader view of the topic and then develops a path of interrelated observations, experiments, and theoretical aspects. It proposes the use of structural models, involving visual representations and stimulating intuition, aimed at helping students build mental models of friction mechanisms. To facilitate reproducibility in school contexts, the sequence is designed as an open source structure, with a core of contents, conceptual correlations and methodological choices, and a cloud of elements that can be re-designed by teachers. The sequence has been tested in teacher education and in upper secondary school, and has shown positive results in overcoming student difficulties and stimulating richer reasoning based on the structural models we suggested. The proposed path has modified the teachers' view of the topic, producing a motivation to change their traditional presentations. The open structure of the sequence has facilitated its implementation by teachers in school in coherence with the rationale of the proposal.

  4. Comparison of penalty functions on a penalty approach to mixed-integer optimization

    NASA Astrophysics Data System (ADS)

    Francisco, Rogério B.; Costa, M. Fernanda P.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.

    2016-06-01

    In this paper, we present a comparative study involving several penalty functions that can be used in a penalty approach for globally solving bound mixed-integer nonlinear programming (bMINLP) problems. The penalty approach relies on a continuous reformulation of the bMINLP problem by adding a particular penalty term to the objective function. A penalty function based on the `erf' function is proposed. The continuous nonlinear optimization problems are sequentially solved by the population-based firefly algorithm. Preliminary numerical experiments are carried out in order to analyze the quality of the produced solutions, when compared with other penalty functions available in the literature.
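
    To fix ideas, a penalty reformulation replaces the integrality constraints by a continuous term added to the objective. The sketch below uses the plain distance-to-nearest-integer penalty as a stand-in; the paper studies several such functions, including a smooth `erf'-based one whose exact form is not reproduced here.

        import numpy as np

        def integrality_penalty(x, int_idx):
            # Distance of each integer-constrained variable to its nearest
            # integer; zero exactly when the point is integer-feasible.
            xi = np.asarray(x, dtype=float)[int_idx]
            return np.sum(np.abs(xi - np.round(xi)))

        def penalized_objective(f, x, int_idx, mu):
            # Continuous reformulation: minimize f(x) + mu * penalty(x),
            # with mu driven upward across outer iterations.
            return f(x) + mu * integrality_penalty(x, int_idx)

        val = penalized_objective(lambda x: (x[0] - 1.3) ** 2 + x[1] ** 2,
                                  x=[1.3, 0.2], int_idx=[0], mu=10.0)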

  5. Evolution of Autonomous Self-Righting Behaviors for Articulated Nanorovers

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward

    1999-01-01

    Miniature rovers with articulated mobility mechanisms are being developed for planetary surface exploration on Mars and small solar system bodies. These vehicles are designed to be capable of autonomous recovery from overturning during surface operations. This paper describes a computational means of developing motion behaviors that achieve the autonomous recovery function. It proposes a control software design approach aimed at reducing the effort involved in developing self-righting behaviors. The approach is based on the integration of evolutionary computing with a dynamics simulation environment for evolving and evaluating motion behaviors. The automated behavior design approach is outlined and its underlying genetic programming infrastructure is described.

  6. Head Motion Modeling for Human Behavior Analysis in Dyadic Interaction

    PubMed Central

    Xiao, Bo; Georgiou, Panayiotis; Baucom, Brian; Narayanan, Shrikanth S.

    2015-01-01

    This paper presents a computational study of head motion in human interaction, notably of its role in conveying interlocutors’ behavioral characteristics. Head motion is physically complex and carries rich information; current modeling approaches based on visual signals, however, are still limited in their ability to adequately capture these important properties. Guided by the methodology of kinesics, we propose a data driven approach to identify typical head motion patterns. The approach follows the steps of first segmenting motion events, then parametrically representing the motion by linear predictive features, and finally generalizing the motion types using Gaussian mixture models. The proposed approach is experimentally validated using video recordings of communication sessions from real couples involved in a couples therapy study. In particular we use the head motion model to classify binarized expert judgments of the interactants’ specific behavioral characteristics where entrainment in head motion is hypothesized to play a role: Acceptance, Blame, Positive, and Negative behavior. We achieve accuracies in the range of 60% to 70% for the various experimental settings and conditions. In addition, we describe a measure of motion similarity between the interaction partners based on the proposed model. We show that the relative change of head motion similarity during the interaction significantly correlates with the expert judgments of the interactants’ behavioral characteristics. These findings demonstrate the effectiveness of the proposed head motion model, and underscore the promise of analyzing human behavioral characteristics through signal processing methods. PMID:26557047
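
    A minimal sketch of the segment-parameterize-cluster pipeline described above, assuming head-motion segments are one-dimensional signals; the LPC order, number of mixture components, and placeholder data are illustrative choices, not the paper's settings.

        import numpy as np
        from scipy.linalg import solve_toeplitz
        from sklearn.mixture import GaussianMixture

        def lpc_features(segment, order=4):
            # Autocorrelation-method linear predictive coefficients:
            # a compact parametric description of one motion segment.
            r = np.correlate(segment, segment, mode="full")[len(segment) - 1:]
            return solve_toeplitz((r[:order], r[:order]), r[1:order + 1])

        # Cluster per-segment features into typical head-motion patterns.
        segments = [np.random.randn(100) for _ in range(50)]  # placeholder data
        feats = np.vstack([lpc_features(s) for s in segments])
        gmm = GaussianMixture(n_components=4, random_state=0).fit(feats)
        pattern_labels = gmm.predict(feats)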

  7. Optimal neighborhood indexing for protein similarity search.

    PubMed

    Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu

    2008-12-16

    Similarity inference, one of the main bioinformatics tasks, has to face an exponential growth of the biological data. A classical approach used to cope with this data flow involves heuristics with large seed indexes. In order to speed up this technique, the index can be enhanced by storing additional information to limit the number of random memory accesses. However, this improvement leads to a larger index that may become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction and a longer neighborhood leads to a reduction of 35% of the memory involved in the process, without sacrificing the quality of results or the computational time. Second, our approach led us to develop a new kind of substitution score matrices and their associated e-value parameters. In contrast to usual matrices, these matrices are rectangular since they compare amino acid groups from different alphabets. We describe the method used for computing those matrices and we provide some typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. We propose a practical index size reduction of the neighborhood data, that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction.
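
    As a sketch of the alphabet-reduction idea, the snippet below maps the 20 amino acids to a smaller alphabet before indexing k-mer neighborhoods; the particular seven-group partition is a common illustrative one and not necessarily the grouping selected in the paper.

        from collections import defaultdict

        # Illustrative partition of the 20 amino acids into 7 groups.
        GROUPS = ["AGST", "C", "DENQ", "FWY", "HKR", "ILMV", "P"]
        REDUCE = {aa: i for i, grp in enumerate(GROUPS) for aa in grp}

        def index_neighborhoods(seq, k=3):
            # Index k-mers keyed by their reduced-alphabet word; fewer
            # distinct keys means a smaller index.
            index = defaultdict(list)
            for i in range(len(seq) - k + 1):
                word = tuple(REDUCE[a] for a in seq[i:i + k])
                index[word].append(i)
            return index

        positions = index_neighborhoods("MKVLAAGST")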

  8. Optimal neighborhood indexing for protein similarity search

    PubMed Central

    Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu

    2008-01-01

    Background Similarity inference, one of the main bioinformatics tasks, has to face an exponential growth of the biological data. A classical approach used to cope with this data flow involves heuristics with large seed indexes. In order to speed up this technique, the index can be enhanced by storing additional information to limit the number of random memory accesses. However, this improvement leads to a larger index that may become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. Results The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction and a longer neighborhood leads to a reduction of 35% of the memory involved in the process, without sacrificing the quality of results or the computational time. Second, our approach led us to develop a new kind of substitution score matrices and their associated e-value parameters. In contrast to usual matrices, these matrices are rectangular since they compare amino acid groups from different alphabets. We describe the method used for computing those matrices and we provide some typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. Conclusion We propose a practical index size reduction of the neighborhood data, that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction. PMID:19087280

  9. A nonparametric smoothing method for assessing GEE models with longitudinal binary data.

    PubMed

    Lin, Kuo-Chin; Chen, Yi-Ju; Shyr, Yu

    2008-09-30

    Studies involving longitudinal binary responses are widely applied in health and biomedical sciences research and frequently analyzed by the generalized estimating equations (GEE) method. This article proposes an alternative goodness-of-fit test based on the nonparametric smoothing approach for assessing the adequacy of GEE fitted models, which can be regarded as an extension of the goodness-of-fit test of le Cessie and van Houwelingen (Biometrics 1991; 47:1267-1282). The expectation and approximate variance of the proposed test statistic are derived. The asymptotic distribution of the proposed test statistic in terms of a scaled chi-squared distribution and the power performance of the proposed test are discussed by simulation studies. The testing procedure is demonstrated with two real data sets. Copyright (c) 2008 John Wiley & Sons, Ltd.

  10. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  11. Mobilizing cross-sector community partnerships to address the needs of criminal justice-involved older adults: a framework for action.

    PubMed

    Metzger, Lia; Ahalt, Cyrus; Kushel, Margot; Riker, Alissa; Williams, Brie

    2017-09-11

    Purpose The rapidly increasing number of older adults cycling through local criminal justice systems (jails, probation, and parole) suggests a need for greater collaboration among a diverse group of local stakeholders including professionals from healthcare delivery, public health, and criminal justice and directly affected individuals, their families, and advocates. The purpose of this paper is to develop a framework that local communities can use to understand and begin to address the needs of criminal justice-involved older adults. Design/methodology/approach The framework comprised five steps: soliciting input from community stakeholders to identify pressing challenges facing criminal justice-involved older adults; conducting needs assessments of criminal justice-involved older adults and the professionals working with them; implementing quick-response interventions based on the needs assessments; sharing findings with community stakeholders and generating public feedback; and engaging an interdisciplinary group to develop an action plan to optimize services. Findings A five-step framework for creating an interdisciplinary community response is an effective approach to action planning and broad stakeholder engagement on behalf of older adults cycling through the criminal justice system. Originality/value This study proposes the Criminal Justice Involved Older Adults in Need of Treatment Initiative Framework for establishing an interdisciplinary community response to the growing population of medically and socially vulnerable criminal justice-involved older adults.

  12. Dynamic Involvement of Real World Objects in the IoT: A Consensus-Based Cooperation Approach

    PubMed Central

    Pilloni, Virginia; Atzori, Luigi; Mallus, Matteo

    2017-01-01

    A significant role in the Internet of Things (IoT) will be taken by mobile and low-cost unstable devices, which autonomously self-organize and introduce highly dynamic and heterogeneous scenarios for the deployment of distributed applications. This entails the devices to cooperate to dynamically find the suitable combination of their involvement so as to improve the system reliability while following the changes in their status. Focusing on the above scenario, we propose a distributed algorithm for resources allocation that is run by devices that can perform the same task required by the applications, allowing for a flexible and dynamic binding of the requested services with the physical IoT devices. It is based on a consensus approach, which maximizes the lifetime of groups of nodes involved and ensures the fulfillment of the requested Quality of Information (QoI) requirements. Experiments have been conducted with real devices, showing an improvement of device lifetime of more than 20%, with respect to a uniform distribution of tasks. PMID:28257030

  13. Psycho-social processes in dealing with legal innovation in the community: insights from biodiversity conservation.

    PubMed

    Castro, Paula; Mouro, Carla

    2011-06-01

    Mitigation measures for tackling the consequences of a changing climate will involve efforts of various types including the conservation of affected ecosystems. For this, communities throughout the world will be called on to change habits of land and water use. Many of these changes will emerge from the multilevel governance tools now commonly used for environmental protection. In this article, some tenets of a social psychology of legal innovation are proposed for approaching the psycho-social processes involved in how individuals, groups and communities respond to multilevel governance. Next, how this approach can improve our understanding of community-based conservation driven by legal innovation is highlighted. For this, the macro and micro level processes involved in the implementation of the European Natura 2000 Network of Protected Sites are examined. Finally, some insights gained from this example of multilevel governance through legal innovation will be enumerated as a contribution for future policy making aimed at dealing with climate change consequences.

  14. Dynamic Involvement of Real World Objects in the IoT: A Consensus-Based Cooperation Approach.

    PubMed

    Pilloni, Virginia; Atzori, Luigi; Mallus, Matteo

    2017-03-01

    A significant role in the Internet of Things (IoT) will be taken by mobile and low-cost unstable devices, which autonomously self-organize and introduce highly dynamic and heterogeneous scenarios for the deployment of distributed applications. This entails the devices to cooperate to dynamically find the suitable combination of their involvement so as to improve the system reliability while following the changes in their status. Focusing on the above scenario, we propose a distributed algorithm for resources allocation that is run by devices that can perform the same task required by the applications, allowing for a flexible and dynamic binding of the requested services with the physical IoT devices. It is based on a consensus approach, which maximizes the lifetime of groups of nodes involved and ensures the fulfillment of the requested Quality of Information (QoI) requirements. Experiments have been conducted with real devices, showing an improvement of device lifetime of more than 20%, with respect to a uniform distribution of tasks.

  15. Interprofessional education about patient decision support in specialty care.

    PubMed

    Politi, Mary C; Pieterse, Arwen H; Truant, Tracy; Borkhoff, Cornelia; Jha, Vikram; Kuhl, Laura; Nicolai, Jennifer; Goss, Claudia

    2011-11-01

    Specialty care involves services provided by health professionals who focus on treating diseases affecting one body system. In contrast to primary care - aimed at providing continuous, comprehensive care - specialty care often involves intermittent episodes of care focused around specific medical conditions. In addition, it typically includes multiple providers who have unique areas of expertise that are important in supporting patients' care. Interprofessional care involves multiple professionals from different disciplines collaborating to provide an integrated approach to patient care. For patients to experience continuity of care across interprofessional providers, providers need to communicate and maintain a shared sense of responsibility to their patients. In this article, we describe challenges inherent in providing interprofessional patient decision support in specialty care. We propose ways for providers to engage in interprofessional decision support and discuss promising approaches to teaching interprofessional decision support to specialty care providers. Additional evaluation and empirical research are required before further recommendations can be made about education for interprofessional decision support in specialty care.

  16. Sampling-based real-time motion planning under state uncertainty for autonomous micro-aerial vehicles in GPS-denied environments.

    PubMed

    Li, Dachuan; Li, Qing; Cheng, Nong; Song, Jingyan

    2014-11-18

    This paper presents a real-time motion planning approach for autonomous vehicles with complex dynamics and state uncertainty. The approach is motivated by the motion planning problem for autonomous vehicles navigating in GPS-denied dynamic environments, which involves non-linear and/or non-holonomic vehicle dynamics, incomplete state estimates, and constraints imposed by uncertain and cluttered environments. To address the above motion planning problem, we propose an extension of the closed-loop rapid belief trees, the closed-loop random belief trees (CL-RBT), which incorporates predictions of the position estimation uncertainty, using a factored form of the covariance provided by the Kalman filter-based estimator. The proposed motion planner operates by incrementally constructing a tree of dynamically feasible trajectories using the closed-loop prediction, while selecting candidate paths with low uncertainty using efficient covariance update and propagation. The algorithm can operate in real-time, continuously providing the controller with feasible paths for execution, enabling the vehicle to account for dynamic and uncertain environments. Simulation results demonstrate that the proposed approach can generate feasible trajectories that reduce the state estimation uncertainty, while handling complex vehicle dynamics and environment constraints.
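
    The sketch below illustrates the covariance-prediction idea behind selecting low-uncertainty candidate paths: the Kalman covariance recursion is rolled forward along a hypothetical trajectory and the resulting uncertainty is scored. The paper's factored covariance form is replaced here by the plain recursion for clarity; all names are illustrative.

        import numpy as np

        def predicted_uncertainty(P0, A, Q, C, R, steps):
            # Roll the Kalman covariance recursion forward along a candidate
            # path (no actual measurements needed; only covariances evolve).
            P = P0.copy()
            I = np.eye(P.shape[0])
            for _ in range(steps):
                P = A @ P @ A.T + Q                 # process prediction
                S = C @ P @ C.T + R                 # innovation covariance
                K = P @ C.T @ np.linalg.inv(S)      # Kalman gain
                P = (I - K @ C) @ P                 # measurement update
            return np.trace(P)  # scalar score: lower means less uncertainty

        # The planner would keep the candidate trajectory minimizing this score.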

  17. Data Collection for Mobile Group Consumption: An Asynchronous Distributed Approach.

    PubMed

    Zhu, Weiping; Chen, Weiran; Hu, Zhejie; Li, Zuoyou; Liang, Yue; Chen, Jiaojiao

    2016-04-06

    Mobile group consumption refers to consumption by a group of people, such as a couple, a family, colleagues and friends, based on mobile communications. It differs from consumption only involving individuals, because of the complex relations among group members. Existing data collection systems for mobile group consumption are centralized, which has the disadvantages of being a performance bottleneck, having single-point failure and increasing business and security risks. Moreover, these data collection systems are based on a synchronized clock, which is often unrealistic because of hardware constraints, privacy concerns or synchronization cost. In this paper, we propose the first asynchronous distributed approach to collecting data generated by mobile group consumption. We formally built a system model thereof based on asynchronous distributed communication. We then designed a simulation system for the model for which we propose a three-layer solution framework. After that, we describe how to detect the causality relation of two/three gathering events that happened in the system based on the collected data. Various definitions of causality relations based on asynchronous distributed communication are supported. Extensive simulation results show that the proposed approach is effective for data collection relating to mobile group consumption.
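
    One standard way to decide causality between events collected asynchronously is the happened-before relation on vector clocks, sketched below; the paper supports several causality definitions, of which this is only one possible instance.

        def happened_before(vc_a, vc_b):
            # True if event a causally precedes event b, given their vector
            # clocks (equal-length tuples, one counter per participating device).
            return all(x <= y for x, y in zip(vc_a, vc_b)) and vc_a != vc_b

        def concurrent(vc_a, vc_b):
            # Neither precedes the other: the gathering events are independent.
            return not happened_before(vc_a, vc_b) and not happened_before(vc_b, vc_a)

        assert happened_before((1, 0), (2, 1))
        assert concurrent((1, 0), (0, 1))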

  18. Sampling-Based Real-Time Motion Planning under State Uncertainty for Autonomous Micro-Aerial Vehicles in GPS-Denied Environments

    PubMed Central

    Li, Dachuan; Li, Qing; Cheng, Nong; Song, Jingyan

    2014-01-01

    This paper presents a real-time motion planning approach for autonomous vehicles with complex dynamics and state uncertainty. The approach is motivated by the motion planning problem for autonomous vehicles navigating in GPS-denied dynamic environments, which involves non-linear and/or non-holonomic vehicle dynamics, incomplete state estimates, and constraints imposed by uncertain and cluttered environments. To address the above motion planning problem, we propose an extension of the closed-loop rapid belief trees, the closed-loop random belief trees (CL-RBT), which incorporates predictions of the position estimation uncertainty, using a factored form of the covariance provided by the Kalman filter-based estimator. The proposed motion planner operates by incrementally constructing a tree of dynamically feasible trajectories using the closed-loop prediction, while selecting candidate paths with low uncertainty using efficient covariance update and propagation. The algorithm can operate in real-time, continuously providing the controller with feasible paths for execution, enabling the vehicle to account for dynamic and uncertain environments. Simulation results demonstrate that the proposed approach can generate feasible trajectories that reduce the state estimation uncertainty, while handling complex vehicle dynamics and environment constraints. PMID:25412217

  19. Data Collection for Mobile Group Consumption: An Asynchronous Distributed Approach †

    PubMed Central

    Zhu, Weiping; Chen, Weiran; Hu, Zhejie; Li, Zuoyou; Liang, Yue; Chen, Jiaojiao

    2016-01-01

    Mobile group consumption refers to consumption by a group of people, such as a couple, a family, colleagues and friends, based on mobile communications. It differs from consumption only involving individuals, because of the complex relations among group members. Existing data collection systems for mobile group consumption are centralized, which has the disadvantages of being a performance bottleneck, having single-point failure and increasing business and security risks. Moreover, these data collection systems are based on a synchronized clock, which is often unrealistic because of hardware constraints, privacy concerns or synchronization cost. In this paper, we propose the first asynchronous distributed approach to collecting data generated by mobile group consumption. We formally built a system model thereof based on asynchronous distributed communication. We then designed a simulation system for the model for which we propose a three-layer solution framework. After that, we describe how to detect the causality relation of two/three gathering events that happened in the system based on the collected data. Various definitions of causality relations based on asynchronous distributed communication are supported. Extensive simulation results show that the proposed approach is effective for data collection relating to mobile group consumption. PMID:27058544

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumway, R.H.; McQuarrie, A.D.

    Robust statistical approaches to the problem of discriminating between regional earthquakes and explosions are developed. We compare linear discriminant analysis using descriptive features like amplitude and spectral ratios with signal discrimination techniques using the original signal waveforms and spectral approximations to the log likelihood function. Robust information theoretic techniques are proposed and all methods are applied to 8 earthquakes and 8 mining explosions in Scandinavia and to an event from Novaya Zemlya of unknown origin. It is noted that signal discrimination approaches based on discrimination information and Renyi entropy perform better in the test sample than conventional methods based on spectral ratios involving the P and S phases. Two techniques for identifying the ripple-firing pattern for typical mining explosions are proposed and shown to work well on simulated data and on several Scandinavian earthquakes and explosions. We use both cepstral analysis in the frequency domain and a time domain method based on the autocorrelation and partial autocorrelation functions. The proposed approach strips off underlying smooth spectral and seasonal spectral components corresponding to the echo pattern induced by two simple ripple-fired models. For two mining explosions, a pattern is identified whereas for two earthquakes, no pattern is evident.
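
    The frequency-domain technique mentioned above rests on the real cepstrum, where a periodic echo pattern such as ripple firing appears as a peak at the quefrency equal to the shot delay. A minimal sketch with synthetic data:

        import numpy as np

        def real_cepstrum(x):
            # ifft(log|fft|): an echo with delay d produces a cepstral peak
            # near quefrency d (in samples).
            spectrum = np.fft.rfft(np.asarray(x, dtype=float))
            return np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))

        # Synthetic check: a signal plus a delayed copy of itself.
        rng = np.random.default_rng(0)
        s = rng.standard_normal(1024)
        s[32:] += 0.8 * s[:-32]                        # echo, 32-sample delay
        peak = np.argmax(real_cepstrum(s)[8:256]) + 8  # ignore low quefrencies
        # `peak` should be close to 32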

  1. Calculating complete and exact Pareto front for multiobjective optimization: a new deterministic approach for discrete problems.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel

    2013-06-01

    Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
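
    The dominance test that the method relies on is easy to state; the sketch below keeps the non-dominated points of a candidate set (minimization in every objective). The paper's contribution, not shown here, is the k-best construction that guarantees this set is the complete exact front.

        def dominates(p, q):
            # p dominates q if p is no worse everywhere and better somewhere.
            return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

        def pareto_front(points):
            return [p for p in points
                    if not any(dominates(q, p) for q in points)]

        front = pareto_front([(1, 5), (2, 2), (4, 1), (3, 3)])
        # -> [(1, 5), (2, 2), (4, 1)]; (3, 3) is dominated by (2, 2)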

  2. Is Peer Review an Appropriate Form of Assessment in a MOOC? Student Participation and Performance in Formative Peer Review

    ERIC Educational Resources Information Center

    Meek, Sarah E. M.; Blakemore, Louise; Marks, Leah

    2017-01-01

    Many aspects of higher education must be reconceptualised for massive open online courses (MOOCs). Formative and summative assessment of qualitative work in particular requires novel approaches to cope with the numbers involved. Peer review has been proposed as one solution, and has been widely adopted by major MOOC providers, but there is…

  3. “Puerto Rico: gateway to landscape” from an ecological perspective

    Treesearch

    Grizelle Gonzalez

    2015-01-01

    The exhibit Puerto Rico: Gateway to Landscape proposes to explore various ways in which citizens approach the landscape, or construct it – inside and outside the city – and considers city planning, the creation of parks and natural reserves, and their interpretation. From a perspective of citizen involvement, this thematic scaffolding related to landscape and the...

  4. Report of the White House Conference on Youth.

    ERIC Educational Resources Information Center

    White House Conference on Youth, Washington, DC.

    The proposals reported here evolved out of the conference held in Estes Park, Colorado April 18-22, 1971 to find new approaches to ten major issues, and new ways for youth between the ages of 14 and 24 to become more involved in the decision-making processes of the social and political institutions of the United States. 918 youth delegates were…

  5. When to Intervene in Selective Mutism: The Multimodal Treatment of a Case of Persistent Selective Mutism.

    ERIC Educational Resources Information Center

    Powell, Shawn; Dalley, Mahlono

    1995-01-01

    An identification and treatment model differentiating transient mutism from persistent selective mutism is proposed. The case study of a six-year-old girl is presented, who was treated with a multimodal approach combining behavioral techniques with play therapy and family involvement. At posttreatment and follow-up, she was talking in a manner…

  6. Designing an Accompanying Ecosystem to Foster Entrepreneurship among Agronomic and Forestry Engineering Students. Opinion and Commitment of University Lecturers

    ERIC Educational Resources Information Center

    Ortiz-Medina, L.; Fernández-Ahumada, E.; Lara-Vélez, P.; Taguas, E. V.; Gallardo-Cobos, R.; del Campillo, M. C.; Guerrero-Ginel, J. E.

    2016-01-01

    In the Higher School of Agronomic and Forestry Engineering of the University of Cordoba, a collective project conceived as an 'ecosystem to support and accompany entrepreneurs' has been proposed. The approach aims to spread and consolidate the entrepreneurial spirit and to respond to the demands of possible stakeholders involved in the whole…

  7. Implementing Bilingual Programs Is Everybody's Business. Focus: Occasional Papers in Bilingual Education 11.

    ERIC Educational Resources Information Center

    Griego-Jones, Toni

    A discussion of the role of bilingual education programs focuses on their function as a district-wide or school-wide reform effort, rather than as a discrete program within a larger system. It is proposed that this approach requires changes in the traditional roles of school personnel and thoughtful attention to how to involve all participants.…

  8. Langevin Equation on Fractal Curves

    NASA Astrophysics Data System (ADS)

    Satin, Seema; Gangal, A. D.

    2016-07-01

    We analyze random motion of a particle on a fractal curve, using the Langevin approach. This involves defining a new velocity in terms of the mass of the fractal curve, as defined in recent work. The geometry of the fractal curve plays an important role in this analysis. A Langevin equation with a particular model of noise is proposed and solved using techniques of the Fα-Calculus.
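
    For orientation, the classical (non-fractal) Langevin dynamics dv = -gamma v dt + sigma dW can be integrated with the Euler-Maruyama scheme sketched below. The paper's construction replaces the ordinary derivatives and integrals with their Fα-calculus counterparts on the fractal curve, which is not reproduced here.

        import numpy as np

        def euler_maruyama_langevin(gamma, sigma, v0, dt, steps, seed=0):
            # dv = -gamma * v * dt + sigma * dW  (ordinary Langevin equation)
            rng = np.random.default_rng(seed)
            v = np.empty(steps + 1)
            v[0] = v0
            for k in range(steps):
                dW = rng.normal(0.0, np.sqrt(dt))
                v[k + 1] = v[k] - gamma * v[k] * dt + sigma * dW
            return v

        path = euler_maruyama_langevin(gamma=1.0, sigma=0.5, v0=0.0,
                                       dt=1e-3, steps=5000)
        # Long-run variance approaches sigma^2 / (2 * gamma) = 0.125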

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Pavel V., E-mail: pvm@ispms.tsc.ru

    An evolutionary approach to earthquake development is proposed. A medium under loading is treated as a multiscale nonlinear dynamic system. Its failure involves a number of stages typical of any dynamic system: dynamic chaos, self-organized criticality, and global stability loss in the final stage of its evolution. In the latter stage, the system evolves in a blow-up mode accompanied by catastrophic superfast movements of the elements of this geomedium.

  10. The Hunger Games: Salmonella, Anorexia, and NLRP3.

    PubMed

    O'Neill, Luke A J

    2017-02-07

    Rao and colleagues (2017) reveal how Salmonella limits anorexia in mice, protecting them and promoting the spread of infection. The mechanism involves inhibition of the NLRP3 inflammasome limiting vagal nerve stimulation by IL-1β, which in turn promotes appetite. A possible new therapeutic approach for treating anorexia in multiple diseases is proposed. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Recursive inversion of externally defined linear systems

    NASA Technical Reports Server (NTRS)

    Bach, Ralph E., Jr.; Baram, Yoram

    1988-01-01

    The approximate inversion of an internally unknown linear system, given by its impulse response sequence, by an inverse system having a finite impulse response, is considered. The recursive least squares procedure is shown to have an exact initialization, based on the triangular Toeplitz structure of the matrix involved. The proposed approach also suggests solutions to the problems of system identification and compensation.
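
    A compact sketch of the inversion problem: given an impulse response h, find a length-m FIR inverse g so that their convolution approximates a unit impulse. The convolution matrix is lower-triangular Toeplitz, the structure the paper exploits for its exact recursive initialization; here the system is simply solved in batch for illustration.

        import numpy as np
        from scipy.linalg import toeplitz

        def fir_inverse(h, m):
            h = np.asarray(h, dtype=float)
            n = len(h) + m - 1                          # full convolution length
            H = toeplitz(np.r_[h, np.zeros(m - 1)],     # first column
                         np.r_[h[0], np.zeros(m - 1)])  # first row
            d = np.zeros(n)
            d[0] = 1.0                                  # desired: unit impulse
            g, *_ = np.linalg.lstsq(H, d, rcond=None)
            return g

        g = fir_inverse([1.0, 0.5], m=16)   # inverse of (1 + 0.5 z^-1)
        # g approximates (-0.5)^k, the exact IIR inverse, truncated to 16 taps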

  12. The Principal Challenge: Leading and Managing Schools in an Era of Accountability. The Jossey-Bass Education Series.

    ERIC Educational Resources Information Center

    Tucker, Marc S., Ed.; Codding, Judy B., Ed.

    These papers examine causes of the crisis in school leadership, offering an innovative proposal for a new kind of institution that will train school principals to be turn-around artists. The approach involves close collaboration between the new institution and entire school districts, combining face-to-face instruction with Web-based delivery. Ten…

  13. Chemomics-based marker compounds mining and mimetic processing for exploring chemical mechanisms in traditional processing of herbal medicines, a continuous study on Rehmanniae Radix.

    PubMed

    Zhou, Li; Xu, Jin-Di; Zhou, Shan-Shan; Shen, Hong; Mao, Qian; Kong, Ming; Zou, Ye-Ting; Xu, Ya-Yun; Xu, Jun; Li, Song-Lin

    2017-12-29

    Exploring processing chemistry, in particular the chemical transformation mechanisms involved, is a key step to elucidate the scientific basis in traditional processing of herbal medicines. Previously, taking Rehmanniae Radix (RR) as a case study, the holistic chemome (secondary metabolome and glycome) difference between raw and processed RR was revealed by integrating hyphenated chromatographic techniques-based targeted glycomics and untargeted metabolomics. Nevertheless, the complex chemical transformation mechanisms underpinning the holistic chemome variation in RR processing remain to be extensively clarified. As a continuous study, here a novel strategy by combining chemomics-based marker compounds mining and mimetic processing is proposed for further exploring the chemical mechanisms involved in herbal processing. First, the differential marker compounds between raw and processed herbs were rapidly discovered by untargeted chemomics-based mining approach through multivariate statistical analysis of the chemome data obtained by integrated metabolomics and glycomics analysis. Second, the marker compounds were mimetically processed under the simulated physicochemical conditions as in the herb processing, and the final reaction products were chemically characterized by targeted chemomics-based mining approach. Third, the main chemical transformation mechanisms involved were clarified by linking up the original marker compounds and their mimetic processing products. Using this strategy, a set of differential marker compounds including saccharides, glycosides and furfurals in raw and processed RR was rapidly found, and the major chemical mechanisms involved in RR processing were elucidated as stepwise transformations of saccharides (polysaccharides, oligosaccharides and monosaccharides) and glycosides (iridoid glycosides and phenethylalcohol glycosides) into furfurals (glycosylated/non-glycosylated hydroxymethylfurfurals) by deglycosylation and/or dehydration. The research deliverables indicated that the proposed strategy could advance the understanding of RR processing chemistry, and therefore may be considered a promising approach for delving into the scientific basis in traditional processing of herbal medicines. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders in time series data. The proposed approach consists of a two-step process. First, we obtain an efficient multi-user detection method, employing the recently introduced complexity minimization approach as a generalization of standard ICA. Second, we identify an unsupervised learning neural network architecture based on Kohonen's Self-Organizing Map for potential functional clustering. These two steps working together adaptively provide a pseudo-real time novelty detection attribute to supplement the current intrusion detection statistical methodology.

  15. Neurobehavioral and self-awareness changes after traumatic brain injury: Towards new multidimensional approaches.

    PubMed

    Arnould, A; Dromer, E; Rochat, L; Van der Linden, M; Azouvi, P

    2016-02-01

    Neurobehavioral and self-awareness changes are frequently observed following traumatic brain injury (TBI). These disturbances have been related to negative consequences on functional outcomes, caregiver distress and social reintegration, representing therefore a challenge for clinical research. Some studies have recently been conducted to specifically explore apathetic and impulsive manifestations, as well as self-awareness impairments in patients with TBI. These findings underlined the heterogeneity of clinical manifestations for each behavioral disturbance and the diversity of psychological processes involved. In this context, new multidimensional approaches taking into account the various processes at play have been proposed to better understand and apprehend the complexity and dynamic nature of these problematic behaviors. In addition, the involvement of social and environmental factors as well as premorbid personality traits have increasingly been addressed. These new multidimensional frameworks have the potential to ensure targeted and effective rehabilitation by allowing a better identification and therefore consideration of the various mechanisms involved in the onset of problematic behaviors. In this context, the main objective of this position paper was to demonstrate the interest of multidimensional approaches in the understanding and rehabilitation of problematic behaviors in patients with TBI. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  16. Estimation of Attitude and External Acceleration Using Inertial Sensor Measurement During Various Dynamic Conditions

    PubMed Central

    Lee, Jung Keun; Park, Edward J.; Robinovitch, Stephen N.

    2012-01-01

    This paper proposes a Kalman filter-based attitude (i.e., roll and pitch) estimation algorithm using an inertial sensor composed of a triaxial accelerometer and a triaxial gyroscope. In particular, the proposed algorithm has been developed for accurate attitude estimation during dynamic conditions, in which external acceleration is present. Although external acceleration is the main source of the attitude estimation error and despite the need for its accurate estimation in many applications, this problem that can be critical for the attitude estimation has not been addressed explicitly in the literature. Accordingly, this paper addresses the combined estimation problem of the attitude and external acceleration. Experimental tests were conducted to verify the performance of the proposed algorithm in various dynamic condition settings and to provide further insight into the variations in the estimation accuracy. Furthermore, two different approaches for dealing with the estimation problem during dynamic conditions were compared, i.e., threshold-based switching approach versus acceleration model-based approach. Based on an external acceleration model, the proposed algorithm was capable of estimating accurate attitudes and external accelerations for short accelerated periods, showing its high effectiveness during short-term fast dynamic conditions. Contrariwise, when the testing condition involved prolonged high external accelerations, the proposed algorithm exhibited gradually increasing errors. However, as soon as the condition returned to static or quasi-static conditions, the algorithm was able to stabilize the estimation error, regaining its high estimation accuracy. PMID:22977288
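
    The two approaches compared in the paper can be contrasted with a small sketch: the accelerometer yields roll and pitch directly only when external acceleration is negligible, and the threshold-based variant simply gates the measurement update on that condition. The constants and names here are illustrative, not the paper's.

        import numpy as np

        def accel_to_attitude(a):
            # Roll/pitch from a triaxial accelerometer reading a = (ax, ay, az),
            # valid when the sensor measures gravity alone (quasi-static case).
            ax, ay, az = a
            roll = np.arctan2(ay, az)
            pitch = np.arctan2(-ax, np.hypot(ay, az))
            return roll, pitch

        def accept_accel_update(a, g=9.81, tol=0.5):
            # Threshold-based switching: trust the accelerometer only when the
            # measured specific force magnitude is close to gravity.
            return abs(np.linalg.norm(a) - g) < tol

        # The model-based alternative instead augments the Kalman state with
        # the external acceleration and estimates it explicitly.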

  17. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    PubMed Central

    Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361

  18. Scalable and responsive event processing in the cloud

    PubMed Central

    Suresh, Visalakshmi; Ezhilchelvan, Paul; Watson, Paul

    2013-01-01

    Event processing involves continuous evaluation of queries over streams of events. Response-time optimization is traditionally done over a fixed set of nodes and/or by using metrics measured at query-operator levels. Cloud computing makes it easy to acquire and release computing nodes as required. Leveraging this flexibility, we propose a novel, queueing-theory-based approach for meeting specified response-time targets against fluctuating event arrival rates by drawing only the necessary amount of computing resources from a cloud platform. In the proposed approach, the entire processing engine of a distinct query is modelled as an atomic unit for predicting response times. Several such units hosted on a single node are modelled as a multiple class M/G/1 system. These aspects eliminate intrusive, low-level performance measurements at run-time, and also offer portability and scalability. Using model-based predictions, cloud resources are efficiently used to meet response-time targets. The efficacy of the approach is demonstrated through cloud-based experiments. PMID:23230164
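
    The response-time prediction at the heart of this approach can be illustrated with the single-class M/G/1 mean response time (Pollaczek-Khinchine); the paper's model generalizes this to multiple query classes per node. A sketch with illustrative numbers:

        def mg1_mean_response(lam, es, es2):
            # E[T] = E[S] + lam * E[S^2] / (2 * (1 - rho)), rho = lam * E[S],
            # for Poisson arrivals at rate lam and service time S.
            rho = lam * es
            if rho >= 1.0:
                raise ValueError("queue is unstable (utilization >= 1)")
            return es + lam * es2 / (2.0 * (1.0 - rho))

        # Example: 50 events/s, mean service 10 ms, E[S^2] = 2 * E[S]^2
        # (exponential service) gives E[T] = 20 ms. If the prediction exceeds
        # the response-time target, the engine would draw another cloud node.
        t = mg1_mean_response(lam=50.0, es=0.010, es2=2 * 0.010 ** 2)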

  19. Parallel consensual neural networks.

    PubMed

    Benediktsson, J A; Sveinsson, J R; Ersoy, O K; Swain, P H

    1997-01-01

    A new type of a neural-network architecture, the parallel consensual neural network (PCNN), is introduced and applied in classification/data fusion of multisource remote sensing and geographic data. The PCNN architecture is based on statistical consensus theory and involves using stage neural networks with transformed input data. The input data are transformed several times and the different transformed data are used as if they were independent inputs. The independent inputs are first classified using the stage neural networks. The output responses from the stage networks are then weighted and combined to make a consensual decision. In this paper, optimization methods are used in order to weight the outputs from the stage networks. Two approaches are proposed to compute the data transforms for the PCNN, one for binary data and another for analog data. The analog approach uses wavelet packets. The experimental results obtained with the proposed approach show that the PCNN outperforms both a conjugate-gradient backpropagation neural network and conventional statistical methods in terms of overall classification accuracy of test data.

  20. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    PubMed

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
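
    The energy/performance compromise that DVFS trades on follows from the classic CMOS dynamic power model, P ≈ C_eff V² f: a task slowed down by a lower voltage/frequency pair can still consume less energy. A sketch with illustrative operating points (the coefficients are placeholders, not measured values):

        def dynamic_energy(c_eff, volt, freq, cycles):
            # P = C_eff * V^2 * f ; runtime = cycles / f ; E = P * runtime,
            # which reduces to E = C_eff * V^2 * cycles (frequency cancels).
            runtime = cycles / freq
            power = c_eff * volt ** 2 * freq
            return power * runtime, runtime

        # One task of 1e9 cycles at two operating points:
        fast = dynamic_energy(c_eff=1e-9, volt=1.2, freq=2.0e9, cycles=1e9)
        slow = dynamic_energy(c_eff=1e-9, volt=0.9, freq=1.2e9, cycles=1e9)
        # `slow` uses ~44% less energy but takes ~67% longer: the compromise
        # between schedule quality and energy that the scheduler navigates.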

  1. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves various partners and different IT systems. REST receives increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then, based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri-net. Thereafter, XML-net, a high-level Petri-net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Therefore, unified resource representation and RESTful service descriptions are proposed for cross-system integration in a more effective way. A case study is given to illustrate the approach, and its desirable features are discussed.

  2. On the development of OpenFOAM solvers based on explicit and implicit high-order Runge-Kutta schemes for incompressible flows with heat transfer

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Valerio; Binci, Lorenzo; Montelpare, Sergio; Ricci, Renato

    2018-01-01

    Open-source CFD codes provide suitable environments for implementing and testing low-dissipative algorithms typically used to simulate turbulence. In this research work we developed CFD solvers for incompressible flows based on high-order explicit and diagonally implicit Runge-Kutta (RK) schemes for time integration. In particular, an iterated PISO-like procedure based on Rhie-Chow correction was used to handle pressure-velocity coupling within each implicit RK stage. For the explicit approach, a projected scheme was used to avoid the "checker-board" effect. The above-mentioned approaches were also extended to flow problems involving heat transfer. It is worth noting that the numerical technology available in the OpenFOAM library was used for space discretization. In this work, we additionally explore the reliability and effectiveness of the proposed implementations by computing several unsteady flow benchmarks; we also show that the numerical diffusion due to the time integration approach is completely canceled using the solution techniques proposed here.
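
    To make the time-integration side concrete, here is a minimal diagonally implicit RK step for a generic ODE system y' = f(t, y), using the 2-stage, L-stable SDIRK scheme with gamma = 1 - 1/sqrt(2). In the solvers described above each implicit stage would instead be resolved by the iterated PISO-like pressure-velocity procedure; for linear f the single linearized solve per stage shown here is exact.

        import numpy as np

        GAMMA = 1.0 - 1.0 / np.sqrt(2.0)   # SDIRK2 (Alexander), L-stable

        def sdirk2_step(f, jac, t, y, dt):
            I = np.eye(len(y))
            # Stage 1: k1 = f(t + g*dt, y + g*dt*k1), linearized about y.
            J1 = jac(t + GAMMA * dt, y)
            k1 = np.linalg.solve(I - GAMMA * dt * J1, f(t + GAMMA * dt, y))
            # Stage 2: k2 = f(t + dt, y + dt*((1-g)*k1 + g*k2)), linearized.
            y2 = y + (1.0 - GAMMA) * dt * k1
            J2 = jac(t + dt, y2)
            k2 = np.linalg.solve(I - GAMMA * dt * J2, f(t + dt, y2))
            return y + dt * ((1.0 - GAMMA) * k1 + GAMMA * k2)

        # Example: a stiff linear system y' = A y.
        A = np.array([[-100.0, 1.0], [0.0, -1.0]])
        y1 = sdirk2_step(lambda t, y: A @ y, lambda t, y: A,
                         0.0, np.ones(2), 0.1)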

  3. Power calculation for comparing diagnostic accuracies in a multi-reader, multi-test design.

    PubMed

    Kim, Eunhee; Zhang, Zheng; Wang, Youdan; Zeng, Donglin

    2014-12-01

    Receiver operating characteristic (ROC) analysis is widely used to evaluate the performance of diagnostic tests with continuous or ordinal responses. A popular study design for assessing the accuracy of diagnostic tests involves multiple readers interpreting multiple diagnostic test results, called the multi-reader, multi-test design. Although several different approaches to analyzing data from this design exist, few methods have discussed the sample size and power issues. In this article, we develop a power formula to compare the correlated areas under the ROC curves (AUC) in a multi-reader, multi-test design. We present a nonparametric approach to estimate and compare the correlated AUCs by extending DeLong et al.'s (1988, Biometrics 44, 837-845) approach. A power formula is derived based on the asymptotic distribution of the nonparametric AUCs. Simulation studies are conducted to demonstrate the performance of the proposed power formula and an example is provided to illustrate the proposed procedure. © 2014, The International Biometric Society.
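
    The shape of such a power formula follows from the asymptotic normality of the nonparametric AUC estimates. A hedged sketch of the two-sided version is below, where var_diff stands for the variance of the correlated AUC difference, which the paper derives from the DeLong-type covariance structure of the multi-reader, multi-test design.

        from scipy.stats import norm

        def auc_power(delta, var_diff, alpha=0.05):
            # Approximate two-sided power to detect an AUC difference `delta`,
            # given Var(AUC1_hat - AUC2_hat) accounting for correlation.
            z = norm.ppf(1 - alpha / 2)
            se = var_diff ** 0.5
            return norm.cdf(delta / se - z) + norm.cdf(-delta / se - z)

        power = auc_power(delta=0.05, var_diff=0.0003)   # about 0.82
        # Since var_diff shrinks with sample size, inverting this relation
        # over n yields the required number of cases for a target power.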

  4. Nicotine replacement therapy decision based on fuzzy multi-criteria analysis

    NASA Astrophysics Data System (ADS)

    Tarmudi, Zamali; Matmali, Norfazillah; Abdullah, Mohd Lazim

    2017-08-01

    It has been observed that Nicotine Replacement Therapy (NRT) is one of the alternatives to control and reduce smoking addiction among smokers. Since the decision to choose the best NRT alternative involves uncertainty, ambiguity factors and diverse input datasets, this paper proposes a fuzzy multi-criteria analysis (FMA) to overcome these issues. It focuses on how the fuzzy approach can unify the diversity of datasets in the NRT decision-making problem. The analysis employed the advantage of the cost-benefit criterion to unify the mixture of dataset inputs. The performance matrix was utilised to derive the performance scores. An empirical example of the NRT decision-making problem is employed to illustrate the proposed approach. Based on the calculations, this analytical approach was found to be highly beneficial in terms of usability, and it was applicable and efficient in dealing with the mixture of input datasets. Hence, the decision-making process can easily be used by experts and patients who are interested in joining the therapy/cessation program.
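
    A sketch of the unification step for mixed benefit/cost criteria rated as triangular fuzzy numbers (a, b, c): this linear-scale normalization is one common choice, shown only to illustrate how a performance matrix with heterogeneous inputs can be brought onto [0, 1]; it is not necessarily the exact normalization used in the paper.

        def normalize_tfn(tfn, scale, benefit=True):
            # Benefit criteria: larger is better -> divide by the column max.
            # Cost criteria: smaller is better -> invert, flipping (a, b, c).
            a, b, c = tfn
            if benefit:
                return (a / scale, b / scale, c / scale)
            return (1.0 - c / scale, 1.0 - b / scale, 1.0 - a / scale)

        def defuzzify(tfn):
            # Centroid of a triangular fuzzy number, for the final ranking.
            a, b, c = tfn
            return (a + b + c) / 3.0

        score = defuzzify(normalize_tfn((2.0, 3.0, 4.0), scale=5.0,
                                        benefit=False))   # -> 0.4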

  5. Multiphase mean curvature flows with high mobility contrasts: A phase-field approach, with applications to nanowires

    NASA Astrophysics Data System (ADS)

    Bretin, Elie; Danescu, Alexandre; Penuelas, José; Masnou, Simon

    2018-07-01

    The structure of many multiphase systems is governed by an energy that penalizes the area of interfaces between phases weighted by surface tension coefficients. However, interface evolution laws depend also on interface mobility coefficients. Having in mind some applications where highly contrasted or even degenerate mobilities are involved, for which classical phase field models are inapplicable, we propose a new effective phase field approach to approximate multiphase mean curvature flows with mobilities. The key aspect of our model is to incorporate the mobilities not in the phase field energy (which is conventionally the case) but in the metric which determines the gradient flow. We show the consistency of such an approach by a formal analysis of the sharp interface limit. We also propose an efficient numerical scheme which allows us to illustrate the advantages of the model on various examples, as the wetting of droplets on solid surfaces or the simulation of nanowires growth generated by the so-called vapor-liquid-solid method.

  6. Integration of QFD, AHP, and LPP methods in supplier development problems under uncertainty

    NASA Astrophysics Data System (ADS)

    Shad, Zahra; Roghanian, Emad; Mojibian, Fatemeh

    2014-04-01

    Quality function deployment (QFD) is a customer-driven approach, widely used in new product development to maximize customer satisfaction. Previous research used the linear physical programming (LPP) procedure to optimize QFD; however, QFD problems involve uncertainty, or fuzziness, which must be taken into account for a more realistic study. In this paper, a set of fuzzy data is used to address linguistic values parameterized by triangular fuzzy numbers. An integrated approach including the analytic hierarchy process (AHP), QFD, and LPP is proposed to maximize overall customer satisfaction under uncertain conditions and is applied to the supplier development problem. The fuzzy AHP approach is adopted as a powerful method to obtain the relationships between the customer requirements and engineering characteristics (ECs) and to construct the house of quality in the QFD method. LPP is used to obtain the optimal achievement levels of the ECs and subsequently the customer satisfaction level under different degrees of uncertainty. The effectiveness of the proposed method is illustrated by an example.

  7. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements demand understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends on the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  8. A general approach for sample size calculation for the three-arm 'gold standard' non-inferiority design.

    PubMed

    Stucke, Kathrin; Kieser, Meinhard

    2012-12-10

    In the three-arm 'gold standard' non-inferiority design, an experimental treatment, an active reference, and a placebo are compared. This design is becoming increasingly popular, and it is, whenever feasible, recommended for use by regulatory guidelines. We provide a general method to calculate the required sample size for clinical trials performed in this design. As special cases, the situations of continuous, binary, and Poisson distributed outcomes are explored. Taking into account the correlation structure of the involved test statistics, the proposed approach leads to considerable savings in sample size as compared with application of ad hoc methods for all three scale levels. Furthermore, optimal sample size allocation ratios are determined that result in markedly smaller total sample sizes as compared with equal assignment. As optimal allocation makes the active treatment groups larger than the placebo group, implementation of the proposed approach is also desirable from an ethical viewpoint. Copyright © 2012 John Wiley & Sons, Ltd.
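
    For orientation, the sketch below computes the textbook two-sample non-inferiority sample size under a normal approximation; it is not the authors' three-arm method, which additionally exploits the correlation structure of the test statistics and therefore yields smaller totals. All inputs are illustrative.

```python
from math import ceil
from scipy.stats import norm

def n_noninferiority(sigma, delta, margin, alpha=0.025, power=0.8, k=1.0):
    """Per-group sample sizes for a two-sample non-inferiority comparison
    of means (normal approximation; standard textbook formula, not the
    paper's three-arm method).
    sigma: common SD; delta: true difference (experimental - reference);
    margin: non-inferiority margin (> 0); k: allocation ratio n_ref/n_exp."""
    z = norm.ppf(1 - alpha) + norm.ppf(power)
    n_exp = (z * sigma) ** 2 * (1 + 1 / k) / (delta + margin) ** 2
    return ceil(n_exp), ceil(k * n_exp)

# Equal efficacy (delta = 0), margin 0.3 SDs, 1:1 allocation -> (175, 175).
print(n_noninferiority(sigma=1.0, delta=0.0, margin=0.3))
```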

  9. Cross-modal learning to rank via latent joint representation.

    PubMed

    Wu, Fei; Jiang, Xinyang; Li, Xi; Tang, Siliang; Lu, Weiming; Zhang, Zhongfei; Zhuang, Yueting

    2015-05-01

    Cross-modal ranking is a research topic that is imperative to many applications involving multimodal data. Discovering a joint representation for multimodal data and learning a ranking function are essential to boost cross-media retrieval (i.e., image-query-text or text-query-image). In this paper, we propose an approach to discover the latent joint representation of pairs of multimodal data (e.g., pairs of an image query and a text document) via a conditional random field and structural learning in a listwise ranking manner. We call this approach cross-modal learning to rank via latent joint representation (CML²R). In CML²R, the correlations between multimodal data are captured in terms of their shared hidden variables (e.g., topics), and a hidden-topic-driven discriminative ranking function is learned in a listwise manner. The experiments show that the proposed approach achieves good performance in cross-media retrieval and has the capability to learn discriminative representations of multimodal data.

  10. A model for solving the prescribed burn planning problem.

    PubMed

    Rachmawati, Ramya; Ozlen, Melih; Reinke, Karin J; Hearne, John W

    2015-01-01

    The increasing frequency of destructive wildfires, with a consequent loss of life and property, has led fire and land management agencies to initiate extensive fuel management programs. This involves long-term planning of fuel reduction activities such as prescribed burning or mechanical clearing. In this paper, we propose a mixed integer programming (MIP) model that determines when and where fuel reduction activities should take place. The model takes into account multiple vegetation types in the landscape and their tolerance to the frequency of fire events, and keeps track of the age of each vegetation class in each treatment unit. The objective is to minimise fuel load over the planning horizon. The complexity of scheduling fuel reduction activities has led to the introduction of sophisticated mathematical optimisation methods. While these approaches can provide optimal solutions, they can be computationally expensive, particularly for fuel management planning that extends across the landscape and spans long planning horizons. This raises the question of how much better the solutions of exact modelling approaches are than those of simpler heuristic approaches. To answer this question, the proposed model is run using an exact MIP (with a commercial MIP solver) and two heuristic approaches that decompose the problem into multiple single-period sub-problems. The first heuristic approach solves each single-period sub-problem as a Knapsack Problem (KP) using an exact MIP approach. The second heuristic approach solves the single-period sub-problem using a greedy heuristic. The three methods are compared in terms of model tractability, computational time and objective values. The model was tested using randomised data from 711 treatment units in the Barwon-Otway district of Victoria, Australia. Solutions for the exact MIP could be obtained only for planning horizons of up to 15 years using a standard implementation of CPLEX. Both heuristic approaches can solve significantly larger problems, involving 100-year or even longer planning horizons. Furthermore, there are no substantial differences in the solutions produced by the three approaches. It is concluded that, for practical purposes, a heuristic method is to be preferred to the exact MIP approach.
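
    A minimal sketch of the second heuristic's single-period greedy idea follows; the unit data, area budget and minimum treatment interval are invented for illustration, and the MIP's vegetation-class tracking is omitted.

```python
# Each year, treat the units with the highest fuel load until an area
# budget is exhausted, respecting a minimum interval between treatments.
def greedy_schedule(units, horizon, budget_area, min_interval):
    """units: list of dicts with 'id', 'area', 'load', 'growth', 'last_treated'."""
    plan = []
    for year in range(horizon):
        eligible = [u for u in units
                    if year - u["last_treated"] >= min_interval]
        eligible.sort(key=lambda u: u["load"], reverse=True)
        area_left, treated = budget_area, []
        for u in eligible:
            if u["area"] <= area_left:       # greedily fill the area budget
                area_left -= u["area"]
                u["load"], u["last_treated"] = 0.0, year
                treated.append(u["id"])
        for u in units:
            u["load"] += u["growth"]         # fuel regrows everywhere each year
        plan.append((year, treated))
    return plan

units = [{"id": i, "area": 10 + i, "load": 5.0 * i, "growth": 1.0,
          "last_treated": -10} for i in range(6)]
print(greedy_schedule(units, horizon=5, budget_area=30, min_interval=3))
```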

  11. Novel harmonic regularization approach for variable selection in Cox's proportional hazards model.

    PubMed

    Chu, Ge-Jin; Liang, Yong; Wang, Jia-Xuan

    2014-01-01

    Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in the Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, including the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods.

  12. An experimental approach towards the development of an in vitro cortical-thalamic co-culture model.

    PubMed

    Kanagasabapathi, Thirukumaran T; Massobrio, Paolo; Tedesco, Mariateresa; Martinoia, Sergio; Wadman, Wytse J; Decré, Michel M J

    2011-01-01

    In this paper, we propose an experimental approach to develop an in vitro dissociated cortical-thalamic co-culture model using a dual-compartment neurofluidic device. The device has two compartments separated by microchannels 10 μm wide and 3 μm high. The microchannels provide physical isolation of the neurons, allowing only neurites to grow between the compartments. A long-term viable co-culture was maintained in the compartmented device, neurite growth through the microchannels was verified using immunofluorescence staining, and electrophysiological recordings from the co-culture system were investigated. Preliminary analysis of spontaneous activity from the co-culture shows a firing pattern distinctly different from those associated with cultures of the individual cell types, and further analysis is proposed for a deeper understanding of the dynamics involved in the network connectivity of such a co-culture system.

  13. Promoting health equity to prevent crime.

    PubMed

    Jackson, Dylan B; Vaughn, Michael G

    2018-08-01

    Traditionally, research activities aimed at diminishing health inequalities and preventing crime have been conducted in isolation, with relatively little cross-fertilization. We argue that moving forward, transdisciplinary collaborations that employ a life-course perspective constitute a productive approach to minimizing both health disparities and early delinquent involvement. Specifically, we propose a multidimensional framework that integrates findings on health disparities and crime across the early life-course and emphasizes the role of racial and socioeconomic disparities in health. Developing the empirical nexus between health disparities research and criminological research through this multidimensional framework could fruitfully direct and organize research that contributes to reductions in health inequalities and the prevention of crime during the early life course. We also propose that this unified approach can ultimately enhance public safety policies and attenuate the collateral consequences of incarceration. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. A Novel Approach with Time-Splitting Spectral Technique for the Coupled Schrödinger-Boussinesq Equations Involving Riesz Fractional Derivative

    NASA Astrophysics Data System (ADS)

    Saha Ray, S.

    2017-09-01

    In the present paper, the Riesz fractional coupled Schrödinger-Boussinesq (S-B) equations have been solved by the time-splitting Fourier spectral (TSFS) method. The proposed technique is utilized for discretizing the Schrödinger-like equation, and a pseudospectral discretization is employed for the Boussinesq-like equation. Apart from that, an implicit finite difference approach has also been proposed to compare the results with the solutions obtained from the time-splitting technique. Furthermore, the time-splitting method is proved to be unconditionally stable. The error norms along with the graphical solutions are also presented. Supported by NBHM, Mumbai, under Department of Atomic Energy, Government of India vide Grant No. 2/48(7)/2015/NBHM (R.P.)/R&D II/11403
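
    The sketch below shows the core time-splitting Fourier spectral idea on a single 1D Schrödinger-type equation with a Riesz fractional Laplacian, whose Fourier symbol is |k|^α; the coupled S-B system and the implicit finite difference comparison are not reproduced, and all parameters are illustrative.

```python
import numpy as np

# Strang splitting for  i u_t = 0.5 * (-Delta)^(alpha/2) u + |u|^2 u :
# the nonlinear sub-step is an exact phase rotation (|u| is unchanged),
# the fractional-Laplacian sub-step is exact in Fourier space.
N, L, dt, alpha, steps = 256, 40.0, 1e-3, 1.6, 1000
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
u = np.exp(-x**2).astype(complex)               # smooth initial profile

lin = np.exp(-0.5j * dt * np.abs(k) ** alpha)   # full linear sub-step
for _ in range(steps):
    u *= np.exp(-0.5j * dt * np.abs(u) ** 2)    # half nonlinear sub-step
    u = np.fft.ifft(lin * np.fft.fft(u))        # spectral linear sub-step
    u *= np.exp(-0.5j * dt * np.abs(u) ** 2)    # half nonlinear sub-step

# Each sub-step is unitary, so the discrete mass is conserved exactly.
print("mass:", (np.abs(u) ** 2).sum() * (L / N))
```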

  15. Eighty phenomena about the self: representation, evaluation, regulation, and change

    PubMed Central

    Thagard, Paul; Wood, Joanne V.

    2015-01-01

    We propose a new approach for examining self-related aspects and phenomena. The approach includes (1) a taxonomy and (2) an emphasis on multiple levels of mechanisms. The taxonomy categorizes approximately eighty self-related phenomena according to three primary functions involving the self: representing, effecting, and changing. The representing self encompasses the ways in which people depict themselves, either to themselves or to others (e.g., self-concepts, self-presentation). The effecting self concerns ways in which people facilitate or limit their own traits and behaviors (e.g., self-enhancement, self-regulation). The changing self is less time-limited than the effecting self; it concerns phenomena that involve lasting alterations in how people represent and control themselves (e.g., self-expansion, self-development). Each self-related phenomenon within these three categories may be examined at four levels of interacting mechanisms (social, individual, neural, and molecular). We illustrate our approach by focusing on seven self-related phenomena. PMID:25870574

  16. [The cinema as a device for teaching complexity in mental health].

    PubMed

    Delego, Adriana; Carroll, Hugo

    2013-01-01

    This article proposes the use of cinema as an instrument for a complex approach to teaching clinical psychiatry and psychopathology in the field of mental health. To this end, it proposes a different outlook that seeks to approach not only the conceptual structures of psychopathology but also the complexity involved, choosing cinema as a powerful means of "empathic recreation". Based on previous work by several authors, the theoretical framework supporting this modality is presented in its philosophical, cognitive and pedagogical-didactic aspects, together with its consequences for teaching in mental health. This task also implies addressing the historical evolution of the representation of subjectivity in fiction. In this way, a new perspective is presented to those working in the field of mental health, as subjects involved in continuous learning processes. This perspective emphasizes the interactions underlying psychic problems.

  17. Demonstration of pelvic anatomy by modified midline transection that maintains intact internal pelvic organs.

    PubMed

    Steinke, Hanno; Saito, Toshiyuki; Herrmann, Gudrun; Miyaki, Takayoshi; Hammer, Niels; Sandrock, Mara; Itoh, Masahiro; Spanel-Borowski, Katharina

    2010-01-01

    Gross dissection for demonstrating anatomy of the human pelvis has traditionally involved one of two approaches, each with advantages and disadvantages. Classic hemisection in the median plane through the pelvic ring transects the visceral organs but maintains two symmetric pelvic halves. An alternative paramedial transection compromises one side of the bony pelvis but leaves the internal organs intact. The authors propose a modified technique that combines advantages of both classical dissections. This novel approach involves dividing the pubic symphysis and sacrum in the median plane after shifting all internal organs to one side. The hemipelvis without internal organs is immediately available for further dissection of the lower limb. The hemipelvis with intact internal organs is ideal for showing the complex spatial relationships of the pelvic organs and vessels relative to the intact pelvic floor.

  18. A new approach for beam hardening correction based on the local spectrum distributions

    NASA Astrophysics Data System (ADS)

    Rasoulpour, Naser; Kamali-Asl, Alireza; Hemmati, Hamidreza

    2015-09-01

    The energy dependence of material absorption and the polychromatic nature of x-ray beams in Computed Tomography (CT) cause a phenomenon called "beam hardening". The purpose of this study is to provide a novel approach for Beam Hardening (BH) correction. This approach is based on the linear attenuation coefficients of Local Spectrum Distributions (LSDs) at various depths of a phantom. The proposed method includes two steps. Firstly, the hardened spectra at various depths of the phantom (the LSDs) are estimated based on the Expectation Maximization (EM) algorithm for an arbitrary thickness interval of known materials in the phantom. The performance of the LSD estimation technique is evaluated by applying random Gaussian noise to the transmission data. Then, the linear attenuation coefficients with regard to the mean energies of the LSDs are obtained. Secondly, a correction function based on the calculated attenuation coefficients is derived in order to correct the polychromatic raw data. Since a correction function is used to convert the polychromatic data to monochromatic data, the effect of BH in the proposed reconstruction is reduced in comparison with polychromatic reconstruction. The proposed approach has been assessed on phantoms involving no more than two materials, but the correction function has been extended for use in phantoms constructed from more than two materials. The relative mean energy difference in the LSD estimations based on noise-free transmission data was less than 1.5%, and it remains acceptable when random Gaussian noise is applied to the transmission data. The cupping artifact in the proposed reconstruction method is effectively reduced, and the proposed reconstruction profile is more uniform than the polychromatic reconstruction profile.

  19. Revolution or evolution: the challenges of conceptualizing patient and public involvement in a consumerist world

    PubMed Central

    Tritter, Jonathan Q.

    2009-01-01

    Abstract Background  Changing the relationship between citizens and the state is at the heart of current policy reforms. Across England and the developed world, from Oslo to Ontario, Newcastle to Newquay, giving the public a more direct say in shaping the organization and delivery of healthcare services is central to the current health reform agenda. Realigning public services around those they serve, based on evidence from service users' experiences, and designed with and by the people rather than simply on their behalf, is challenging the dominance of managerialism, marketization and bureaucratic expertise. Despite this attention there is limited conceptual and theoretical work to underpin policy and practice. Objective  This article proposes a conceptual framework for patient and public involvement (PPI) and goes on to explore the different justifications for involvement and the implications of a rights‐based rather than a regulatory approach. These issues are highlighted through exploring the particular evolution of English health policy in relation to PPI on the one hand and patient choice on the other, before turning to similar patterns apparent in the United States and more broadly. Conclusions  A framework for conceptualizing PPI is presented that differentiates between the different types and aims of involvement and their potential impact. Approaches to involvement are different in those countries that adopt a rights‐based rather than a regulatory approach. I conclude with a discussion of the tension and interaction apparent in the globalization of both involvement and patient choice in both policy and practice. PMID:19754691

  20. The economics of project analysis: Optimal investment criteria and methods of study

    NASA Technical Reports Server (NTRS)

    Scriven, M. C.

    1979-01-01

    Insight is provided toward the development of an optimal program for the investment analysis of project proposals offering commercial potential, and of their components. This involves a critique of economic investment criteria viewed in relation to the requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to the investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.

  1. Robust Lane Sensing and Departure Warning under Shadows and Occlusions

    PubMed Central

    Tapia-Espinoza, Rodolfo; Torres-Torriti, Miguel

    2013-01-01

    A prerequisite for any system that enhances drivers' awareness of road conditions and threatening situations is the correct sensing of the road geometry and the vehicle's relative pose with respect to the lane despite shadows and occlusions. In this paper we propose an approach for lane segmentation and tracking that is robust to varying shadows and occlusions. The approach involves color-based clustering, the use of MSAC for outlier removal and curvature estimation, and also the tracking of lane boundaries. Lane boundaries are modeled as planar curves residing in 3D-space using an inverse perspective mapping, instead of the traditional tracking of lanes in the image space, i.e., the segmented lane boundary points are 3D points in a coordinate frame fixed to the vehicle that have a depth component and belong to a plane tangent to the vehicle's wheels, rather than 2D points in the image space without depth information. The measurement noise and disturbances due to vehicle vibrations are reduced using an extended Kalman filter that involves a 6-DOF motion model for the vehicle, as well as measurements about the road's banking and slope angles. Additional contributions of the paper include: (i) the comparison of textural features obtained from a bank of Gabor filters and from a GMRF model; and (ii) the experimental validation of the quadratic and cubic approximations to the clothoid model for the lane boundaries. The results show that the proposed approach performs better than the traditional gradient-based approach under different levels of difficulty caused by shadows and occlusions. PMID:23478598

  2. A Human-Autonomy Teaming Approach for a Flight-Following Task

    NASA Technical Reports Server (NTRS)

    Brandt, Summer L.; Lachter, Joel; Russell, Ricky; Shively, R. Jay

    2017-01-01

    Human involvement with increasingly autonomous systems must adjust to allow for a more dynamic relationship involving cooperation and teamwork. As part of an ongoing project to develop a framework for human autonomy teaming (HAT) in aviation, a study was conducted to evaluate proposed tenets of HAT. Participants performed a flight-following task at a ground station both with and without HAT features enabled. Overall, participants preferred the ground station with HAT features enabled over the station without the HAT features. Participants reported that the HAT displays and automation were preferred for keeping up with operationally important issues. Additionally, participants reported that the HAT displays and automation provided enough situation awareness to complete the task, reduced the necessary workload and were efficient. Overall, there was general agreement that HAT features supported teaming with the automation. These results will be used to refine and expand our proposed framework for human-autonomy teaming.

  3. The politics of participation in watershed modeling.

    PubMed

    Korfmacher, K S

    2001-02-01

    While researchers and decision-makers increasingly recognize the importance of public participation in environmental decision-making, there is less agreement about how to involve the public. One of the most controversial issues is how to involve citizens in producing scientific information. Although this question is relevant to many areas of environmental policy, it has come to the fore in watershed management. Increasingly, the public is becoming involved in the sophisticated computer modeling efforts that have been developed to inform watershed management decisions. These models typically have been treated as technical inputs to the policy process. However, model-building itself involves numerous assumptions, judgments, and decisions that are relevant to the public. This paper examines the politics of public involvement in watershed modeling efforts and proposes five guidelines for good practice for such efforts. Using these guidelines, I analyze four cases in which different approaches to public involvement in the modeling process have been attempted and make recommendations for future efforts to involve communities in watershed modeling. Copyright 2001 Springer-Verlag

  4. A New Automatic Method of Urban Areas Mapping in East Asia from LANDSAT Data

    NASA Astrophysics Data System (ADS)

    XU, R.; Jia, G.

    2012-12-01

    Cities, as places where human activities are concentrated, account for a small percentage of global land cover but are frequently cited as the chief causes of, and solutions to, climate, biogeochemistry, and hydrology processes at local, regional, and global scales. Accompanying uncontrolled economic growth, urban sprawl has been attributed to the accelerating integration of East Asia into the world economy and has involved dramatic changes in urban form and land use. To understand the impact of urban extent on biogeophysical processes, reliable mapping of built-up areas is particularly essential in East Asian cities, which are characterised by smaller patches, greater fragmentation, and a lower fraction of natural land cover within the urban landscape than cities in the West. Segmentation of urban land from other land-cover types using remote sensing imagery can be done by standard classification processes as well as by a logic-rule calculation based on spectral indices and their derivations. Efforts to establish such a logic rule with no threshold, enabling automatic mapping, are highly worthwhile. Existing automatic methods are reviewed, and then a proposed approach is introduced, including the calculation of the new index and the improved logic rule. Following this, the existing automatic methods and the proposed approach are compared in a common context. Afterwards, the proposed approach is tested separately in large-, medium-, and small-scale cities in East Asia selected from different LANDSAT images. The results are promising, as the approach can efficiently segment urban areas, even in more complex eastern cities. Key words: urban extraction; automatic method; logic rule; LANDSAT images; East Asia. [Figure: the proposed approach applied to the extraction of urban built-up areas in Guangzhou, China.]

  5. Results of community deliberation about social impacts of ecological restoration: comparing public input of self-selected versus actively engaged community members.

    PubMed

    Harris, Charles C; Nielsen, Erik A; Becker, Dennis R; Blahna, Dale J; McLaughlin, William J

    2012-08-01

    Participatory processes for obtaining residents' input about community impacts of proposed environmental management actions have long raised concerns about who participates in public involvement efforts and whose interests they represent. This study explored methods of broad-based involvement and the role of deliberation in social impact assessment. Interactive community forums were conducted in 27 communities to solicit public input on proposed alternatives for recovering wild salmon in the Pacific Northwest US. Individuals identified by fellow residents as most active and involved in community affairs ("AE residents") were invited to participate in deliberations about likely social impacts of proposed engineering and ecological actions such as dam removal. Judgments of these AE participants about community impacts were compared with the judgments of residents motivated to attend a forum out of personal interest, who were designated as self-selected ("SS") participants. While the magnitude of impacts rated by SS participants across all communities differed significantly from AE participants' ratings, in-depth analysis of results from two community case studies found that both AE and SS participants identified a large and diverse set of unique impacts, as well as many of the same kinds of impacts. Thus, inclusion of both kinds of residents resulted in a greater range of impacts for consideration in the environmental impact study. The case study results also found that the extent to which similar kinds of impacts are specified by AE and SS group members can differ by type of community. Study results caution against simplistic conclusions drawn from this approach to community-wide public participation. Nonetheless, the results affirm that deliberative methods for community-based impact assessment involving both AE and SS residents can provide a more complete picture of perceived impacts of proposed restoration activities.

  6. Results of Community Deliberation About Social Impacts of Ecological Restoration: Comparing Public Input of Self-Selected Versus Actively Engaged Community Members

    NASA Astrophysics Data System (ADS)

    Harris, Charles C.; Nielsen, Erik A.; Becker, Dennis R.; Blahna, Dale J.; McLaughlin, William J.

    2012-08-01

    Participatory processes for obtaining residents' input about community impacts of proposed environmental management actions have long raised concerns about who participates in public involvement efforts and whose interests they represent. This study explored methods of broad-based involvement and the role of deliberation in social impact assessment. Interactive community forums were conducted in 27 communities to solicit public input on proposed alternatives for recovering wild salmon in the Pacific Northwest US. Individuals identified by fellow residents as most active and involved in community affairs ("AE residents") were invited to participate in deliberations about likely social impacts of proposed engineering and ecological actions such as dam removal. Judgments of these AE participants about community impacts were compared with the judgments of residents motivated to attend a forum out of personal interest, who were designated as self-selected ("SS") participants. While the magnitude of impacts rated by SS participants across all communities differed significantly from AE participants' ratings, in-depth analysis of results from two community case studies found that both AE and SS participants identified a large and diverse set of unique impacts, as well as many of the same kinds of impacts. Thus, inclusion of both kinds of residents resulted in a greater range of impacts for consideration in the environmental impact study. The case study results also found that the extent to which similar kinds of impacts are specified by AE and SS group members can differ by type of community. Study results caution against simplistic conclusions drawn from this approach to community-wide public participation. Nonetheless, the results affirm that deliberative methods for community-based impact assessment involving both AE and SS residents can provide a more complete picture of perceived impacts of proposed restoration activities.

  7. Downscaling Land Surface Temperature in Complex Regions by Using Multiple Scale Factors with Adaptive Thresholds

    PubMed Central

    Yang, Yingbao; Li, Xiaolong; Pan, Xin; Zhang, Yong; Cao, Chen

    2017-01-01

    Many downscaling algorithms have been proposed to address the issue of the coarse resolution of land surface temperature (LST) derived from available satellite-borne sensors. However, few studies have focused on improving LST downscaling in urban areas with several mixed surface types. In this study, LST was downscaled by a multiple linear regression model between LST and multiple scale factors in mixed areas with three or four surface types. The correlation coefficients (CCs) between LST and the scale factors were used to assess the importance of the scale factors within a moving window, and CC thresholds determined which factors participated in the fitting of the regression equation. The proposed downscaling approach, which involves an adaptive selection of the scale factors, was evaluated using LST derived from four Landsat 8 thermal imageries of Nanjing City in different seasons. Results of the visual and quantitative analyses show that the proposed approach achieves relatively satisfactory downscaling results on 11 August, with a coefficient of determination of 0.87 and a root-mean-square error of 1.13 °C. Relative to other approaches, our approach shows similar accuracy and availability in all seasons. The best (worst) availability occurred in regions of vegetation (water). Thus, the approach is an efficient and reliable LST downscaling method. Future tasks include reliable LST downscaling in challenging regions and the application of our model at middle and low spatial resolutions. PMID:28368301
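
    A single-window sketch of the adaptive factor selection and regression step is shown below; the scale factors, the CC threshold and the 3x resolution ratio are assumptions for illustration, not the values used in the paper.

```python
import numpy as np

# Keep only scale factors whose |CC| with coarse LST passes a threshold,
# regress at the coarse scale, predict at the fine scale, and redistribute
# the coarse residual.
def downscale_lst(lst_c, factors_c, factors_f, cc_min=0.3, ratio=3):
    y = lst_c.ravel()
    keep = [i for i, f in enumerate(factors_c)
            if abs(np.corrcoef(y, f.ravel())[0, 1]) >= cc_min]
    A_c = np.column_stack([factors_c[i].ravel() for i in keep]
                          + [np.ones_like(y)])
    coef, *_ = np.linalg.lstsq(A_c, y, rcond=None)
    A_f = np.column_stack([factors_f[i].ravel() for i in keep]
                          + [np.ones(factors_f[0].size)])
    pred = (A_f @ coef).reshape(factors_f[0].shape)
    resid = (y - A_c @ coef).reshape(lst_c.shape)
    return pred + np.kron(resid, np.ones((ratio, ratio)))  # residual spread

rng = np.random.default_rng(0)
ndvi_f, albedo_f = rng.random((30, 30)), rng.random((30, 30))
ndvi_c = ndvi_f.reshape(10, 3, 10, 3).mean(axis=(1, 3))
albedo_c = albedo_f.reshape(10, 3, 10, 3).mean(axis=(1, 3))
lst_c = 300 + 5 * ndvi_c - 8 * albedo_c + 0.1 * rng.standard_normal((10, 10))
print(downscale_lst(lst_c, [ndvi_c, albedo_c], [ndvi_f, albedo_f]).shape)
```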

  8. Spectral-Spatial Classification of Hyperspectral Images Using Hierarchical Optimization

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.

    2011-01-01

    A new spectral-spatial method for hyperspectral data classification is proposed. For a given hyperspectral image, probabilistic pixelwise classification is first applied. Then, a hierarchical stepwise optimization algorithm is performed by iteratively merging the neighboring regions with the smallest Dissimilarity Criterion (DC) and recomputing class labels for the new regions. The DC is computed by comparing the region mean vectors, class labels and the number of pixels in the two regions under consideration. The algorithm converges when all pixels have been involved in the region-merging procedure. Experimental results are presented for two remote sensing hyperspectral images acquired by the AVIRIS and ROSIS sensors. The proposed approach improves classification accuracies and provides maps with more homogeneous regions, when compared to previously proposed classification techniques.

  9. An element search ant colony technique for solving virtual machine placement problem

    NASA Astrophysics Data System (ADS)

    Srija, J.; Rani John, Rose; Kanaga, Grace Mary, Dr.

    2017-09-01

    The data centres in the cloud environment play a key role in providing infrastructure for ubiquitous, pervasive and mobile computing. These computing techniques try to utilize the available resources in order to provide services. Hence, maintaining resource utilization without wasted power consumption has become a challenging task for researchers. In this paper, we propose a direct-guidance ant colony system for the effective mapping of virtual machines to physical machines with maximal resource utilization and minimal power consumption. The proposed algorithm has been compared with an existing ant colony approach to the virtual machine placement problem, and it proves to provide better results than the existing technique.

  10. An efficient variable projection formulation for separable nonlinear least squares problems.

    PubMed

    Gan, Min; Li, Han-Xiong

    2014-05-01

    We consider in this paper a class of nonlinear least squares problems in which the model can be represented as a linear combination of nonlinear functions. The variable projection algorithm projects the linear parameters out of the problem, leaving a nonlinear least squares problem involving only the nonlinear parameters. To implement the variable projection algorithm more efficiently, we propose a new variable projection functional based on matrix decomposition. The advantage of the proposed formulation is that the size of the decomposed matrix may be much smaller than those of previous formulations. The Levenberg-Marquardt algorithm using the finite difference method is then applied to minimize the new criterion. Numerical results show that the proposed approach achieves a significant reduction in computing time.
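
    The classic variable projection idea can be sketched as follows for a two-exponential model: an inner linear solve eliminates the linear coefficients, so the outer Levenberg-Marquardt search (with finite-difference derivatives) runs only over the nonlinear parameters. This is the standard VARPRO scheme, not the paper's decomposition-based functional.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data:  y ~ a1*exp(-b1 t) + a2*exp(-b2 t) + noise.
t = np.linspace(0, 4, 200)
rng = np.random.default_rng(1)
y = (2.0 * np.exp(-1.3 * t) + 0.5 * np.exp(-0.2 * t)
     + 0.01 * rng.standard_normal(t.size))

def projected_residual(b):
    Phi = np.exp(-np.outer(t, b))                # basis matrix Phi(b)
    a, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear params projected out
    return Phi @ a - y

# Levenberg-Marquardt over the nonlinear parameters only (finite-diff Jacobian).
sol = least_squares(projected_residual, x0=[1.0, 0.1], method="lm")
Phi = np.exp(-np.outer(t, sol.x))
a, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("nonlinear:", sol.x, "linear:", a)
```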

  11. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2016-12-01

    fibrotic deposits. Evaluation of the efficacy of the proposed approach was achieved by biochemical assays of collagen content and composition, then by...the amount of cross-links in collagen deposits, by histological assays of involved tissues, and by biomechanical evaluation of the flexion contracture...batches collected from each bioreactor run were evaluated by analyzing the binding affinity of the purified antibody to procollagen I standard that

  12. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2016-12-01

    received the therapeutic antibody to minimize the formation of excessive fibrotic deposits. Evaluation of the efficacy of the proposed approach was...involved tissues, and by biomechanical evaluation of the flexion contracture. Appropriate controls were also included [5,6]. Note, that this is a...of the combined pool of the ACA batches collected from each bioreactor run were evaluated by analyzing the binding affinity of the purified antibody

  13. Multilateral Agencies and Their Policy Proposals for Education: Are They Contributing to Reduce the Knowledge Gap in the World?

    ERIC Educational Resources Information Center

    de Siqueira, Angela C.

    Since the 1960s, the World Bank has been involved in educational policy around the world. Applying a human capital theory/manpower forecasting approach, the World Bank has focused on the infrastructure, that is, buildings and equipment, in vocational and higher education. At the same time, the power and influence of UNICEF and UNESCO, the main…

  14. Recursive inversion of externally defined linear systems by FIR filters

    NASA Technical Reports Server (NTRS)

    Bach, Ralph E., Jr.; Baram, Yoram

    1989-01-01

    The approximate inversion of an internally unknown linear system, given by its impulse response sequence, by an inverse system having a finite impulse response, is considered. The recursive least-squares procedure is shown to have an exact initialization, based on the triangular Toeplitz structure of the matrix involved. The proposed approach also suggests solutions to the problem of system identification and compensation.
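
    A direct (non-recursive) least-squares version of the FIR inversion problem can be sketched as follows; the impulse response and filter length are illustrative. The lower-triangular Toeplitz convolution matrix built here is exactly the structure the paper's recursive procedure exploits for its exact initialization.

```python
import numpy as np
from scipy.linalg import toeplitz

# Find the FIR filter g of length m that makes h * g closest (in least
# squares) to a unit impulse, given the known impulse response h.
h = np.array([1.0, 0.6, 0.25, 0.1])      # known impulse response
m = 16                                    # FIR inverse length
n = len(h) + m - 1                        # length of the convolution h * g

# Lower-triangular Toeplitz convolution matrix: H @ g == conv(h, g).
H = toeplitz(np.r_[h, np.zeros(n - len(h))], np.zeros(m))
e = np.zeros(n); e[0] = 1.0               # target: unit impulse
g, *_ = np.linalg.lstsq(H, e, rcond=None)
print("residual norm:", np.linalg.norm(H @ g - e))
```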

  15. 3D modelling of squeeze flow of unidirectional and fabric composite inserts

    NASA Astrophysics Data System (ADS)

    Ghnatios, Chady; Abisset-Chavanne, Emmanuelle; Chinesta, Francisco; Keunings, Roland

    2016-10-01

    The enhanced design flexibility provided to the thermo-forming of thermoplastic materials arises from the use of both continuous and discontinuous thermoplastic prepregs. Discontinuous prepregs are patches used to locally strengthen the part. In this paper, we propose a new modelling approach for suspensions involving composite patches that uses theoretical concepts related to discontinuous fibres suspensions, transversally isotropic fluids and extended dumbbell models.

  16. A Global Perspective on Early Childhood Care and Education: A Proposed Model. Action Research in Family and Early Childhood. UNESCO Education Sector Monograph.

    ERIC Educational Resources Information Center

    Lillemyr, Ole Fredrik; Fagerli, Oddvar; Sobstad, Frode

    This monograph describes an alternative model for early childhood care and education involving a complex and integrated system that allows for more collaboration among early childhood care and education activities. The model, with its emphasis on values in all educational practices, is intended to promote a more global and total approach to…

  17. Predictability and Coupled Dynamics of MJO During DYNAMO

    DTIC Science & Technology

    2013-09-30

    Predictability and Coupled Dynamics of MJO During DYNAMO ... DYNAMO time period. APPROACH We are working as a team to study MJO dynamics and predictability using several models as team members of the ONR DRI...associated with the DYNAMO experiment. This is a fundamentally collaborative proposal that involves close collaboration with Dr. Hyodae Seo of the

  18. A versatile approach to the study of the transient response of a submerged thin shell

    NASA Astrophysics Data System (ADS)

    Leblond, C.; Sigrist, J.-F.

    2010-01-01

    The transient response of a submerged two-dimensional thin shell subjected to weak acoustical or mechanical excitations is addressed in this paper. The proposed approach is first presented in detail: it is based on a Laplace transform in time, an in vacuo eigenvector expansion with time-dependent coefficients for the structural dynamics, and a boundary-integral formulation for the fluid. The projection of the fluid pressure on the in vacuo eigenvectors leads to a fully coupled system involving the modal time-dependent displacement coefficients, which are the problem unknowns. They are simply determined by matrix inversion in the Laplace domain. Application of the method to the response of a two-dimensional immersed shell to a weak acoustical excitation is then presented: the proposed test case corresponds to the design of immersed structures subjected to underwater explosions, which is of paramount importance in naval shipbuilding. A comparison of a numerical calculation based on the proposed approach with an analytical solution is presented; the versatility of the method is also highlighted by referring to "classical" FEM/FEM or FEM/BEM simulations. As a conspicuous feature of the method, the calculation of the fluid response functions for a given geometry has to be performed only once, allowing various simulations for different material properties of the structure as well as for various excitations on the structure. This versatile approach can therefore be efficiently and extensively used for design purposes.

  19. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jun, K. S.; Cung, E. S.

    2014-09-01

    This study proposes an improved group decision-making (GDM) framework that combines the VIKOR method with fuzzified data to quantify spatial flood vulnerability using multi-criteria evaluation indicators. In general, the GDM method is an effective tool for formulating a compromise solution involving various decision makers, since different stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-ranking method can be used to obtain a nearly ideal solution according to all established criteria. Triangular fuzzy numbers are used to consider the uncertainty of the weights and the crisp data of the proxy variables. The approach can effectively propose compromise decisions by combining the GDM method and the fuzzy VIKOR method. The spatial flood vulnerability of the south Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with the results from general MCDM methods, such as fuzzy TOPSIS, and from classical GDM methods, such as those developed by Borda, Condorcet, and Copeland. The evaluated priorities were significantly dependent on the employed decision-making method. The proposed fuzzy GDM approach can reduce the uncertainty in the data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
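
    For reference, a minimal crisp VIKOR ranking step is sketched below; the paper additionally fuzzifies the weights and data with triangular fuzzy numbers and embeds the ranking in a group decision-making framework. The matrix and weights here are invented.

```python
import numpy as np

# Rows = sub-regions, columns = flood vulnerability indicators, all
# treated as benefit-type for simplicity (illustrative values only).
X = np.array([[0.7, 0.4, 0.9],
              [0.5, 0.8, 0.6],
              [0.9, 0.3, 0.4]])
w = np.array([0.5, 0.3, 0.2])    # assumed criteria weights
v = 0.5                          # weight of the "group utility" strategy

f_best, f_worst = X.max(axis=0), X.min(axis=0)
D = w * (f_best - X) / (f_best - f_worst)  # weighted normalised distances
S, R = D.sum(axis=1), D.max(axis=1)        # group utility / individual regret
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))
print("compromise ranking (best first):", np.argsort(Q))
```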

  20. A new approach for continuous estimation of baseflow using discrete water quality data: Method description and comparison with baseflow estimates from two existing approaches

    USGS Publications Warehouse

    Miller, Matthew P.; Johnson, Henry M.; Susong, David D.; Wolock, David M.

    2015-01-01

    Understanding how watershed characteristics and climate influence the baseflow component of stream discharge is a topic of interest to both the scientific and water management communities. Therefore, the development of baseflow estimation methods is a topic of active research. Previous studies have demonstrated that graphical hydrograph separation (GHS) and conductivity mass balance (CMB) methods can be applied to stream discharge data to estimate daily baseflow. While CMB is generally considered to be a more objective approach than GHS, its application across broad spatial scales is limited by a lack of high frequency specific conductance (SC) data. We propose a new method that uses discrete SC data, which are widely available, to estimate baseflow at a daily time step using the CMB method. The proposed approach involves the development of regression models that relate discrete SC concentrations to stream discharge and time. Regression-derived CMB baseflow estimates were more similar to baseflow estimates obtained using a CMB approach with measured high frequency SC data than were the GHS baseflow estimates at twelve snowmelt dominated streams and rivers. There was a near perfect fit between the regression-derived and measured CMB baseflow estimates at sites where the regression models were able to accurately predict daily SC concentrations. We propose that the regression-derived approach could be applied to estimate baseflow at large numbers of sites, thereby enabling future investigations of watershed and climatic characteristics that influence the baseflow component of stream discharge across large spatial scales.
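
    The two-step regression/CMB idea can be sketched as follows with synthetic data; the regression form (log-discharge plus a seasonal harmonic) and the end-member choices are assumptions for illustration, not the authors' fitted models.

```python
import numpy as np

# Step 1: regress discrete SC samples on discharge and time.
# Step 2: apply CMB daily:  BF = Q * (SC - SC_RO) / (SC_BF - SC_RO).
rng = np.random.default_rng(2)
days = np.arange(365)
q_daily = 10 + 5 * np.sin(np.linspace(0, 6 * np.pi, 365)) + rng.random(365)

# Discrete SC samples (e.g. monthly grab samples); SC falls as Q rises.
idx = np.arange(0, 365, 30)
sc_obs = 250 - 40 * np.log(q_daily[idx]) + 5 * np.sin(2 * np.pi * idx / 365)

A = np.column_stack([np.log(q_daily[idx]), np.sin(2 * np.pi * idx / 365),
                     np.cos(2 * np.pi * idx / 365), np.ones(idx.size)])
coef, *_ = np.linalg.lstsq(A, sc_obs, rcond=None)

A_all = np.column_stack([np.log(q_daily), np.sin(2 * np.pi * days / 365),
                         np.cos(2 * np.pi * days / 365), np.ones(365)])
sc_daily = A_all @ coef
sc_bf, sc_ro = sc_daily.max(), sc_daily.min()   # assumed end-members
baseflow = q_daily * (sc_daily - sc_ro) / (sc_bf - sc_ro)
print("mean baseflow fraction:", (baseflow / q_daily).mean().round(2))
```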

  1. View-Invariant Gait Recognition Through Genetic Template Segmentation

    NASA Astrophysics Data System (ADS)

    Isaac, Ebenezer R. H. P.; Elias, Susan; Rajagopalan, Srinivasan; Easwarakumar, K. S.

    2017-08-01

    The template-based model-free approach provides by far the most successful solution to the gait recognition problem in the literature. Recent work discusses how isolating the head and leg portions of the template increases the performance of a gait recognition system, making it robust against covariates like clothing and carrying conditions. However, most methods involve a manual definition of the boundaries. The method we propose, genetic template segmentation (GTS), employs a genetic algorithm to automate the boundary selection process. This method was tested on the GEI, GEnI and AEI templates. GEI exhibits the best results when segmented with our approach. Experimental results show that our approach significantly outperforms the existing implementations of view-invariant gait recognition.

  2. A time-parallel approach to strong-constraint four-dimensional variational data assimilation

    NASA Astrophysics Data System (ADS)

    Rao, Vishwas; Sandu, Adrian

    2016-05-01

    A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple sub-intervals, which allows parallelization of the cost function and gradient computations. The solutions to the continuity equations across interval boundaries are added as constraints. The augmented Lagrangian approach leads to a different formulation of the variational data assimilation problem than weakly constrained 4D-Var. A combination of serial and parallel 4D-Vars to increase performance is also explored. The methodology is illustrated on data assimilation problems involving the Lorenz-96 and shallow water models.

  3. Who regulates ethics in the virtual world?

    PubMed

    Sharma, Seemu; Lomash, Hitashi; Bawa, Seema

    2015-02-01

    This paper attempts to give an insight into emerging ethical issues due to the increased usage of the Internet in our lives. We discuss three main theoretical approaches relating to the ethics involved in the information technology (IT) era: first, the use of IT as a tool; second, the use of social constructivist methods; and third, the approach of phenomenologists. Certain aspects of ethics and IT have been discussed based on a phenomenological approach and moral development. Further, ethical issues related to social networking sites are discussed. A plausible way to make the virtual world ethically responsive is collective responsibility which proposes that society has the power to influence but not control behavior in the virtual world.

  4. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

    We consider the image fusion problem involving remotely sensed data and introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistical interpolation method called cokriging as a new approach to image fusion.

  5. [A country doctor: a proposal of an ethical approach on the 125th anniversary of Franz Kafka's birth].

    PubMed

    Alvarez-Díaz, Jorge Alberto

    2008-01-01

    Within the framework of the 125th anniversary of the birth of Franz Kafka, we discuss his work as a patient affected by tuberculosis. This essay outlines a review of Kafka as a writer and explains the meaning of the term "Kafkaesque". We put forward a commentary on the ethics expressed in the short story entitled A country doctor. An interpretation of Kafka must involve the notion of responsibility, a theological concept that is then followed by the legal context. Finally, Kafka embraces an ethical approach expressed in his work.

  6. Real-time control systems: feedback, scheduling and robustness

    NASA Astrophysics Data System (ADS)

    Simon, Daniel; Seuret, Alexandre; Sename, Olivier

    2017-08-01

    The efficient control of real-time distributed systems, where continuous components are governed through digital devices and communication networks, requires a careful examination, within co-design approaches, of the constraints arising from the different domains involved. Thanks to the robustness of feedback control, both new control methodologies and slackened real-time scheduling schemes are proposed that go beyond the frontiers of these traditionally separated fields. A methodology to design robust aperiodic controllers is provided, in which the sampling interval is considered a control variable of the system. Promising experimental results are provided to show the feasibility and robustness of the approach.

  7. Prediction of Patient-Controlled Analgesic Consumption: A Multimodel Regression Tree Approach.

    PubMed

    Hu, Yuh-Jyh; Ku, Tien-Hsiung; Yang, Yu-Hung; Shen, Jia-Ying

    2018-01-01

    Several factors contribute to individual variability in postoperative pain, therefore, individuals consume postoperative analgesics at different rates. Although many statistical studies have analyzed postoperative pain and analgesic consumption, most have identified only the correlation and have not subjected the statistical model to further tests in order to evaluate its predictive accuracy. In this study involving 3052 patients, a multistrategy computational approach was developed for analgesic consumption prediction. This approach uses data on patient-controlled analgesia demand behavior over time and combines clustering, classification, and regression to mitigate the limitations of current statistical models. Cross-validation results indicated that the proposed approach significantly outperforms various existing regression methods. Moreover, a comparison between the predictions by anesthesiologists and medical specialists and those of the computational approach for an independent test data set of 60 patients further evidenced the superiority of the computational approach in predicting analgesic consumption because it produced markedly lower root mean squared errors.
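
    One way to combine the strategies the abstract names is sketched below: cluster patients on their early demand behavior, then fit a per-cluster regression tree for consumption. The data shapes and features are synthetic, and the paper's actual multistrategy pipeline (including its classification stage) is richer.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)
early_demand = rng.poisson(3, size=(300, 6)).astype(float)  # hourly PCA presses
clinical = rng.random((300, 4))                             # e.g. age, weight
X = np.hstack([early_demand, clinical])
y = early_demand.sum(axis=1) * 2.5 + 10 * clinical[:, 0] + rng.random(300)

# Cluster on early demand behaviour, then one regression tree per cluster.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(early_demand)
trees = {c: DecisionTreeRegressor(max_depth=4).fit(X[km.labels_ == c],
                                                   y[km.labels_ == c])
         for c in range(3)}

# New patient: assign a cluster from the demand pattern, then predict.
x_new = X[:1]
c = km.predict(x_new[:, :6])[0]
print("predicted consumption:", trees[c].predict(x_new)[0].round(1))
```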

  8. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications, such as risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper, a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical facility for better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling, such as seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes the best use of recent advancements in computer software and hardware, and is well structured for implementation using conventional GIS tools.

  9. Low power femtosecond tip-based nanofabrication with advanced control

    NASA Astrophysics Data System (ADS)

    Liu, Jiangbo; Guo, Zhixiong; Zou, Qingze

    2018-02-01

    In this paper, we propose an approach that enables the use of a low-power femtosecond laser in tip-based nanofabrication (TBN) without thermal damage. One major challenge in laser-assisted TBN is maintaining precise control of the tip-surface positioning throughout the fabrication process. An advanced iterative learning control technique is exploited to overcome this challenge and achieve high-quality patterning of arbitrary shapes on a metal surface. The experimental results are analyzed to understand the ablation mechanism involved. Specifically, the near-field radiation enhancement is examined via the surface-enhanced Raman scattering effect, revealing near-field-enhanced plasma-mediated ablation. Moreover, a silicon nitride tip is utilized to alleviate adverse thermal damage. Experimental results, including line patterns fabricated at different writing speeds and an "R" pattern, are presented. The fabrication quality with regard to line width, depth, and uniformity is characterized to demonstrate the efficacy of the proposed approach.

  10. Novel Harmonic Regularization Approach for Variable Selection in Cox's Proportional Hazards Model

    PubMed Central

    Chu, Ge-Jin; Liang, Yong; Wang, Jia-Xuan

    2014-01-01

    Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in the Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, including the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods. PMID:25506389

  11. Medical Device Integrated Vital Signs Monitoring Application with Real-Time Clinical Decision Support.

    PubMed

    Moqeem, Aasia; Baig, Mirza; Gholamhosseini, Hamid; Mirza, Farhaan; Lindén, Maria

    2018-01-01

    This research involves the design and development of a novel Android smartphone application for real-time vital signs monitoring and decision support. The proposed application integrates market-available, wireless, Bluetooth-connected medical devices for collecting vital signs. The data collected by the app include heart rate, oxygen saturation and electrocardiograph (ECG) signals. The collated data are streamed and displayed on the smartphone in real time. The application was designed by adopting the six-screens (6S) mobile development framework, following a user-centred approach that treats clinicians as the primary users. Clinical engagement, consultation, feedback and the usability of the application in everyday practice were considered critical from the initial phase of the design and development. Furthermore, the proposed application is capable of delivering rich clinical decision support in real time using the integrated medical device data.

  12. A Relevance Vector Machine-Based Approach with Application to Oil Sand Pump Prognostics

    PubMed Central

    Hu, Jinfei; Tse, Peter W.

    2013-01-01

    Oil sand pumps are widely used in the mining industry for the delivery of mixtures of abrasive solids and liquids. Because they operate under highly adverse conditions, these pumps usually experience significant wear. Consequently, equipment owners are quite often forced to invest substantially in system maintenance to avoid unscheduled downtime. In this study, an approach combining relevance vector machines (RVMs) with a sum of two exponential functions was developed to predict the remaining useful life (RUL) of field pump impellers. To handle field vibration data, a novel feature extracting process was proposed to arrive at a feature varying with the development of damage in the pump impellers. A case study involving two field datasets demonstrated the effectiveness of the developed method. Compared with standalone exponential fitting, the proposed RVM-based model was much better able to predict the remaining useful life of pump impellers. PMID:24051527
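
    To make the degradation-extrapolation idea concrete, here is a minimal sketch that fits a sum of two exponentials to a hypothetical vibration-derived health indicator and extrapolates to an assumed failure threshold; the RVM regression stage and the authors' feature extraction are not reproduced, and all numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_exp(t, a, b, c, d):
        """Sum-of-two-exponentials degradation model: y = a*exp(b*t) + c*exp(d*t)."""
        return a * np.exp(b * t) + c * np.exp(d * t)

    # Hypothetical health-indicator series extracted from field vibration data.
    t = np.linspace(0.0, 100.0, 60)                # operating hours (illustrative)
    y = 0.5 * np.exp(0.01 * t) + 0.1 * np.exp(0.03 * t)
    y += np.random.default_rng(0).normal(0.0, 0.02, t.size)   # measurement noise

    # Fit the degradation model; initial guesses keep the two exponents separated.
    popt, _ = curve_fit(two_exp, t, y, p0=(0.5, 0.01, 0.1, 0.02), maxfev=20000)

    # Extrapolate until the indicator crosses an assumed failure threshold.
    threshold = 5.0                                 # assumed failure level
    t_future = np.arange(t[-1], t[-1] + 500.0, 0.5)
    crossing = t_future[two_exp(t_future, *popt) >= threshold]
    rul = crossing[0] - t[-1] if crossing.size else np.inf
    print(f"Estimated RUL: {rul:.1f} hours")
    ```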

  13. CNN based approach for activity recognition using a wrist-worn accelerometer.

    PubMed

    Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R

    2017-07-01

    In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, feature engineering has dominated conventional methods, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework, which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data are collected from four different subjects using a single wrist-worn accelerometer sensor. The proposed model is validated under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8%, as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machines.
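
    As an illustration of the kind of model described, the following is a minimal 1-D convolutional classifier in PyTorch, assuming tri-axial accelerometer windows of 128 samples and three movement classes; the layer sizes are assumptions, not the authors' architecture.

    ```python
    import torch
    import torch.nn as nn

    class ActivityCNN(nn.Module):
        """Small 1-D CNN for accelerometer windows (3 channels x 128 samples)."""
        def __init__(self, n_classes: int = 3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 32, 64), nn.ReLU(),   # 32 channels x 32 time steps
                nn.Linear(64, n_classes),
            )

        def forward(self, x):            # x: (batch, 3, 128)
            return self.classifier(self.features(x))

    model = ActivityCNN()
    windows = torch.randn(8, 3, 128)     # a batch of hypothetical sensor windows
    logits = model(windows)              # (8, 3) class scores
    ```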

  14. A systematic approach for finding the objective function and active constraints for dynamic flux balance analysis.

    PubMed

    Nikdel, Ali; Braatz, Richard D; Budman, Hector M

    2018-05-01

    Dynamic flux balance analysis (DFBA) has become an instrumental modeling tool for describing the dynamic behavior of bioprocesses. DFBA involves the maximization of a biologically meaningful objective subject to kinetic constraints on the rate of consumption/production of metabolites. In this paper, we propose a systematic data-based approach for finding both the biological objective function and a minimum set of active constraints necessary for matching the model predictions to the experimental data. The proposed algorithm accounts for the errors in the experiments and eliminates the need for ad hoc choices of objective function and constraints as done in previous studies. The method is illustrated for two cases: (1) for in silico (simulated) data generated by a mathematical model for Escherichia coli and (2) for actual experimental data collected from the batch fermentation of Bordetella pertussis (whooping cough).
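
    At each DFBA time step, a flux balance LP of this general form is solved. The sketch below uses scipy with a toy two-metabolite network; the stoichiometry, objective and flux bounds are illustrative assumptions, and the paper's contribution (searching over candidate objectives and active constraints against data) would wrap such a solver.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix S (metabolites x reactions); S v = 0 at steady state.
    S = np.array([[1.0, -1.0,  0.0, -1.0],
                  [0.0,  1.0, -1.0,  0.0]])

    # Candidate objective: maximize the third flux v3 ("biomass" here);
    # linprog minimizes, so the coefficient is negated.
    c = np.array([0.0, 0.0, -1.0, 0.0])

    # Kinetic/uptake constraints expressed as flux bounds (the "active constraints").
    bounds = [(0.0, 10.0), (0.0, 10.0), (0.0, 10.0), (0.0, 5.0)]

    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x)
    ```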

  15. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  16. A relevance vector machine-based approach with application to oil sand pump prognostics.

    PubMed

    Hu, Jinfei; Tse, Peter W

    2013-09-18

    Oil sand pumps are widely used in the mining industry for the delivery of mixtures of abrasive solids and liquids. Because they operate under highly adverse conditions, these pumps usually experience significant wear. Consequently, equipment owners are quite often forced to invest substantially in system maintenance to avoid unscheduled downtime. In this study, an approach combining relevance vector machines (RVMs) with a sum of two exponential functions was developed to predict the remaining useful life (RUL) of field pump impellers. To handle field vibration data, a novel feature extracting process was proposed to arrive at a feature varying with the development of damage in the pump impellers. A case study involving two field datasets demonstrated the effectiveness of the developed method. Compared with standalone exponential fitting, the proposed RVM-based model was much better able to predict the remaining useful life of pump impellers.

  17. Market-Based Coordination of Thermostatically Controlled Loads—Part I: A Mechanism Design Formulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sen; Zhang, Wei; Lian, Jianming

    This paper focuses on the coordination of a population of Thermostatically Controlled Loads (TCLs) with unknown parameters to achieve group objectives. The problem involves designing the bidding and market clearing strategy to motivate self-interested users to realize efficient energy allocation subject to a peak power constraint. Using the mechanism design approach, we propose a market-based coordination framework, which can effectively incorporate heterogeneous load dynamics, systematically deal with user preferences, account for the unknown load model parameters, and enable the real-world implementation with limited communication resources. This paper is divided into two parts. Part I presents a mathematical formulation of the problem and develops a coordination framework using the mechanism design approach. Part II presents a learning scheme to account for the unknown load model parameters, and evaluates the proposed framework through realistic simulations.

  18. Composing problem solvers for simulation experimentation: a case study on steady state estimation.

    PubMed

    Leye, Stefan; Ewald, Roland; Uhrmacher, Adelinde M

    2014-01-01

    Simulation experiments involve various sub-tasks, e.g., parameter optimization, simulation execution, or output data analysis. Many algorithms can be applied to such tasks, but their performance depends on the given problem. Steady state estimation in systems biology is a typical example for this: several estimators have been proposed, each with its own (dis-)advantages. Experimenters, therefore, must choose from the available options, even though they may not be aware of the consequences. To support those users, we propose a general scheme to aggregate such algorithms to so-called synthetic problem solvers, which exploit algorithm differences to improve overall performance. Our approach subsumes various aggregation mechanisms, supports automatic configuration from training data (e.g., via ensemble learning or portfolio selection), and extends the plugin system of the open source modeling and simulation framework James II. We show the benefits of our approach by applying it to steady state estimation for cell-biological models.
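
    One simple aggregation mechanism in the spirit of this proposal is a performance-based portfolio that reuses the winner of the most similar training problem. The sketch below assumes hypothetical per-problem training errors and a single problem feature; the James II plugin machinery and ensemble-learning variants are not shown.

    ```python
    import numpy as np

    # Hypothetical training data: rows = training problems, columns = candidate
    # steady-state estimators; entries = observed error of estimator on problem.
    train_errors = np.array([[0.10, 0.40, 0.25],
                             [0.50, 0.15, 0.30],
                             [0.12, 0.35, 0.20]])
    train_features = np.array([[0.9], [5.0], [1.1]])   # e.g., noise level per problem

    def select_solver(problem_feature, features, errors):
        """1-nearest-neighbour portfolio: reuse the best estimator of the most
        similar training problem (one aggregation mechanism among many)."""
        nearest = np.argmin(np.abs(features - problem_feature).sum(axis=1))
        return int(np.argmin(errors[nearest]))

    solver_id = select_solver(np.array([1.0]), train_features, train_errors)
    print("chosen estimator index:", solver_id)
    ```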

  19. Super-resolution algorithm based on sparse representation and wavelet preprocessing for remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Ren, Ruizhi; Gu, Lingjia; Fu, Haoyang; Sun, Chenglin

    2017-04-01

    An effective super-resolution (SR) algorithm is proposed for actual spectral remote sensing images based on sparse representation and wavelet preprocessing. The proposed SR algorithm mainly consists of dictionary training and image reconstruction. Wavelet preprocessing is used to establish four subbands, i.e., low frequency, horizontal, vertical, and diagonal high frequency, for an input image. As compared to the traditional approaches involving the direct training of image patches, the proposed approach focuses on the training of features derived from these four subbands. The proposed algorithm is verified using different spectral remote sensing images, e.g., moderate-resolution imaging spectroradiometer (MODIS) images with different bands, and the latest Chinese Jilin-1 satellite images with high spatial resolution. According to the visual experimental results obtained from the MODIS remote sensing data, the SR images using the proposed SR algorithm are superior to those using a conventional bicubic interpolation algorithm or traditional SR algorithms without preprocessing. Fusion algorithms, e.g., standard intensity-hue-saturation, principal component analysis, wavelet transform, and the proposed SR algorithms are utilized to merge the multispectral and panchromatic images acquired by the Jilin-1 satellite. The effectiveness of the proposed SR algorithm is assessed by parameters such as peak signal-to-noise ratio, structural similarity index, correlation coefficient, root-mean-square error, relative dimensionless global error in synthesis, relative average spectral error, spectral angle mapper, and the quality index Q4, and its performance is better than that of the standard image fusion algorithms.
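
    The wavelet preprocessing stage can be sketched as a single-level 2-D DWT that yields the four subbands used for feature training; the Haar wavelet and patch size below are assumptions, and the dictionary training itself is omitted.

    ```python
    import numpy as np
    import pywt

    # Hypothetical low-resolution input patch from a remote sensing band.
    img = np.random.default_rng(1).random((64, 64))

    # Single-level 2-D DWT: low-frequency approximation plus horizontal,
    # vertical and diagonal high-frequency detail subbands.
    cA, (cH, cV, cD) = pywt.dwt2(img, "haar")

    # Stack the four subbands as a feature tensor for dictionary training.
    features = np.stack([cA, cH, cV, cD])       # shape (4, 32, 32)
    print(features.shape)
    ```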

  20. Optic Nerve Lymphoma. Report of Two Cases and Review of the Literature

    PubMed Central

    Kim, Jennifer L.; Mendoza, Pia; Rashid, Alia; Hayek, Brent; Grossniklaus, Hans E.

    2014-01-01

    Lymphoma may involve the optic nerve as isolated optic nerve lymphoma or in association with CNS or systemic lymphoma. We present two biopsy-proven non-Hodgkin lymphomas of the optic nerve and compare our findings with previously reported cases. We discuss the mechanism of metastasis, classification of optic nerve involvement, clinical features, radiologic findings, optic nerve biopsy indications and techniques, histologic features, and treatments. We propose a classification system of optic nerve lymphoma: isolated optic nerve involvement, optic nerve involvement with CNS disease, optic nerve involvement with systemic disease, and optic nerve involvement with primary intraocular lymphoma. Although it is an uncommon cause of infiltrative optic neuropathy, optic nerve metastasis should be considered in patients with a history of lymphoma. The recommended approach to a patient with presumed optic nerve lymphoma includes neuroimaging, and cerebrospinal fluid evaluation as part of the initial work-up, then judicious use of optic nerve biopsy, depending on the clinical situation. PMID:25595061

  1. A semi-analytical refrigeration cycle modelling approach for a heat pump hot water heater

    NASA Astrophysics Data System (ADS)

    Panaras, G.; Mathioulakis, E.; Belessiotis, V.

    2018-04-01

    The use of heat pump systems in applications such as the production of hot water or space heating makes the modelling of the underlying processes important, both for evaluating the performance of existing systems and for design purposes. The proposed semi-analytical model offers the opportunity to estimate the performance of a heat pump system producing hot water without using detailed geometrical data or any performance data. This is important because, for many commercial systems, the type and characteristics of the subcomponents involved can hardly be determined, which prevents the implementation of more analytical approaches or the exploitation of the manufacturers' catalogue performance data. The analysis addresses the issues related to developing the models of the subcomponents involved in the studied system. Issues not discussed thoroughly in the existing literature, such as the refrigerant mass inventory when an accumulator is present, are examined effectively.

  2. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model.

    PubMed

    Musekiwa, Alfred; Manda, Samuel O M; Mwambi, Henry G; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes, contrasting different covariance structures for the dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect sizes and present a practical example involving a meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results.

  3. Metasynthesis and bricolage: an artistic exercise of creating a collage of meaning.

    PubMed

    Kinn, Liv Grethe; Holgersen, Helge; Ekeland, Tor-Johan; Davidson, Larry

    2013-09-01

    During the past decades, new approaches to synthesizing qualitative data have been developed. However, this methodology continues to face significant philosophical and practical challenges. By reviewing the literature on this topic, our overall aim in this article is to explore the systematic and creative research processes involved in the act of metasynthesizing. By investigating synthesizing processes borrowed from two studies, we discuss matters of transparency and transferability in relation to how multiple qualitative studies are interpreted and transformed into one narrative. We propose concepts such as bricolage, metaphor, playfulness, and abduction as ideas that might enhance understanding of the importance of combinations of scientific and artistic approaches to the way the synthesizer "puzzles together" an interpretive account of qualitative studies. This study can benefit researchers by increasing their awareness of the artistic processes involved in qualitative analysis and metasynthesis to expand the domain and methods of their fields.

  4. Fractional-order in a macroeconomic dynamic model

    NASA Astrophysics Data System (ADS)

    David, S. A.; Quintino, D. D.; Soliani, J.

    2013-10-01

    In this paper, we applied the Riemann-Liouville approach to carry out numerical simulations of a set of equations that represent a fractional-order macroeconomic dynamic model. It is a generalization of a dynamic model recently reported in the literature. The aforementioned equations have been simulated for several cases involving integer and non-integer order analysis, with different values of the fractional order. Time histories and phase diagrams have been plotted to visualize the effect of the fractional-order approach. The new contribution of this work arises from the fact that the macroeconomic dynamic model proposed here involves the public sector deficit equation, which renders the model more realistic and complete than those encountered in the literature. The results reveal that the fractional-order macroeconomic model can exhibit reasonable behavior for macroeconomic systems and might offer greater insights towards the understanding of these complex dynamic systems.
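
    A common numerical route for such simulations is the Grunwald-Letnikov discretization of the fractional derivative. The sketch below solves a generic fractional-order system assuming zero initial history; the two-variable system is illustrative and is not the paper's macroeconomic model.

    ```python
    import numpy as np

    def gl_weights(alpha, n):
        """Grunwald-Letnikov binomial weights w_j = (-1)^j C(alpha, j)."""
        w = np.empty(n + 1)
        w[0] = 1.0
        for j in range(1, n + 1):
            w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
        return w

    def solve_fractional(f, x0, alpha, h, steps):
        """Explicit GL scheme for D^alpha x = f(x), zero initial history assumed."""
        w = gl_weights(alpha, steps)
        x = np.zeros((steps + 1, len(x0)))
        x[0] = x0
        for n in range(1, steps + 1):
            memory = sum(w[j] * x[n - j] for j in range(1, n + 1))
            x[n] = h**alpha * f(x[n - 1]) - memory
        return x

    # Illustrative two-variable dynamic system (not the paper's model).
    f = lambda x: np.array([x[1], -0.5 * x[0] - 0.2 * x[1]])
    traj = solve_fractional(f, np.array([1.0, 0.0]), alpha=0.9, h=0.01, steps=2000)
    ```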

  5. Multi-site precipitation downscaling using a stochastic weather generator

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Chen, Hua; Guo, Shenglian

    2018-03-01

    Statistical downscaling is an efficient way to solve the spatiotemporal mismatch between climate model outputs and the data requirements of hydrological models. However, the most commonly-used downscaling method only produces climate change scenarios for a specific site or watershed average, which is unable to drive distributed hydrological models to study the spatial variability of climate change impacts. By coupling a single-site downscaling method and a multi-site weather generator, this study proposes a multi-site downscaling approach for hydrological climate change impact studies. Multi-site downscaling is done in two stages. The first stage involves spatially downscaling climate model-simulated monthly precipitation from grid scale to a specific site using a quantile mapping method, and the second stage involves the temporal disaggregating of monthly precipitation to daily values by adjusting the parameters of a multi-site weather generator. The inter-station correlation is specifically considered using a distribution-free approach along with an iterative algorithm. The performance of the downscaling approach is illustrated using a 10-station watershed as an example. The precipitation time series derived from the National Centers for Environmental Prediction (NCEP) reanalysis dataset is used as the climate model simulation. The precipitation time series of each station is divided into the 30 odd-numbered years for calibration and the 29 even-numbered years for validation. Several metrics, including the frequencies of wet and dry spells and statistics of the daily, monthly and annual precipitation, are used as criteria to evaluate the multi-site downscaling approach. The results show that the frequencies of wet and dry spells are well reproduced for all stations. In addition, the multi-site downscaling approach performs well with respect to reproducing precipitation statistics, especially at monthly and annual timescales. The remaining biases mainly result from the non-stationarity of NCEP precipitation. Overall, the proposed approach is efficient for generating multi-site climate change scenarios that can be used to investigate the spatial variability of climate change impacts on hydrology.
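
    The first (spatial downscaling) stage rests on quantile mapping; a minimal empirical version is sketched below on synthetic gamma-distributed monthly precipitation. The weather-generator disaggregation and the inter-station correlation treatment are not shown.

    ```python
    import numpy as np

    def quantile_map(model_hist, obs_hist, model_future):
        """Empirical quantile mapping: replace each model value by the observed
        value occupying the same quantile over the calibration period."""
        quantiles = np.linspace(0.01, 0.99, 99)
        model_q = np.quantile(model_hist, quantiles)
        obs_q = np.quantile(obs_hist, quantiles)
        # Interpolate the transfer function at the future model values.
        return np.interp(model_future, model_q, obs_q)

    rng = np.random.default_rng(2)
    obs = rng.gamma(2.0, 40.0, 360)        # observed monthly precipitation (mm)
    sim = rng.gamma(2.0, 30.0, 360)        # biased climate-model simulation
    corrected = quantile_map(sim, obs, sim)
    print(round(obs.mean(), 1), round(corrected.mean(), 1))   # means now comparable
    ```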

  6. Beam-hardening correction by a surface fitting and phase classification by a least square support vector machine approach for tomography images of geological samples

    NASA Astrophysics Data System (ADS)

    Khan, F.; Enzmann, F.; Kersten, M.

    2015-12-01

    In X-ray computed microtomography (μXCT), image processing is the most important operation prior to image analysis. Such processing mainly involves artefact reduction and image segmentation. We propose a new two-stage post-reconstruction procedure for an image of a geological rock core obtained by polychromatic cone-beam μXCT technology. In the first stage, the beam-hardening (BH) is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. The final BH-corrected image is extracted from the residual data, or the difference between the surface elevation values and the original grey-scale values. For the second stage, we propose using a least square support vector machine (a non-linear classifier algorithm) to segment the BH-corrected data as a pixel-based multi-classification task. A combination of the two approaches was used to classify a complex multi-mineral rock sample. The Matlab code for this approach is provided in the Appendix. A minor drawback is that the proposed segmentation algorithm may become computationally demanding in the case of a high dimensional training data set.
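
    The first stage can be approximated by an ordinary least-squares quadratic surface fit followed by subtraction of the fitted surface, as sketched below; the authors' offset-minimizing variant and the LS-SVM segmentation stage are not reproduced.

    ```python
    import numpy as np

    def remove_beam_hardening(slice_img):
        """Fit a quadratic surface to a reconstructed slice and return the
        residual image, removing the smooth beam-hardening (cupping) offset."""
        ny, nx = slice_img.shape
        y, x = np.mgrid[0:ny, 0:nx]
        x, y, z = x.ravel().astype(float), y.ravel().astype(float), slice_img.ravel()
        # Design matrix for z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
        A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        surface = (A @ coeffs).reshape(ny, nx)
        return slice_img - surface

    corrected = remove_beam_hardening(np.random.default_rng(3).random((128, 128)))
    ```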

  7. Simulating coupled dynamics of a rigid-flexible multibody system and compressible fluid

    NASA Astrophysics Data System (ADS)

    Hu, Wei; Tian, Qiang; Hu, HaiYan

    2018-04-01

    As a subsequent work of previous studies of authors, a new parallel computation approach is proposed to simulate the coupled dynamics of a rigid-flexible multibody system and compressible fluid. In this approach, the smoothed particle hydrodynamics (SPH) method is used to model the compressible fluid, the natural coordinate formulation (NCF) and absolute nodal coordinate formulation (ANCF) are used to model the rigid and flexible bodies, respectively. In order to model the compressible fluid properly and efficiently via SPH method, three measures are taken as follows. The first is to use the Riemann solver to cope with the fluid compressibility, the second is to define virtual particles of SPH to model the dynamic interaction between the fluid and the multibody system, and the third is to impose the boundary conditions of periodical inflow and outflow to reduce the number of SPH particles involved in the computation process. Afterwards, a parallel computation strategy is proposed based on the graphics processing unit (GPU) to detect the neighboring SPH particles and to solve the dynamic equations of SPH particles in order to improve the computation efficiency. Meanwhile, the generalized-alpha algorithm is used to solve the dynamic equations of the multibody system. Finally, four case studies are given to validate the proposed parallel computation approach.

  8. Damage severity estimation from the global stiffness decrease

    NASA Astrophysics Data System (ADS)

    Nitescu, C.; Gillich, G. R.; Abdel Wahab, M.; Manescu, T.; Korka, Z. I.

    2017-05-01

    In current damage detection methods, localization and severity estimation can be treated separately. The severity is commonly estimated using a fracture mechanics approach, with the main disadvantage of involving empirically deduced relations. In this paper, a damage severity estimator based on the global stiffness reduction is proposed. This feature is computed from the deflections of the intact and damaged beam, respectively. The effect of damage is largest when it is located where the bending moment achieves its maxima. If the damage is positioned elsewhere on the beam, its effect becomes lower, because the stress is produced by a diminished bending moment. It is shown that the global stiffness reduction produced by a crack is the same for all beams with a similar cross-section, regardless of the boundary conditions. Two mathematical relations are derived: one indicating the severity and another indicating the effect of removing the damage from the beam. Measurements on damaged beams with different boundary conditions and cross-sections are carried out, and the location and severity are found using the proposed relations. These comparisons prove that the proposed approach can be used to accurately compute the severity estimator.

  9. Second-order sliding mode control with experimental application.

    PubMed

    Eker, Ilyas

    2010-07-01

    In this article, a second-order sliding mode control (2-SMC) is proposed for second-order uncertain plants using an equivalent control approach to improve the performance of control systems. A Proportional + Integral + Derivative (PID) sliding surface is used for the sliding mode. The sliding mode control law is derived using the direct Lyapunov stability approach, and asymptotic stability is proved theoretically. The performance of the closed-loop system is analysed through an experimental application to an electromechanical plant to show the feasibility and effectiveness of the proposed second-order sliding mode control and the factors involved in the design. The second-order plant parameters are experimentally determined using input-output measured data. The results of the experimental application are presented to make a quantitative comparison with the traditional (first-order) sliding mode control (SMC) and PID control. It is demonstrated that the proposed 2-SMC system improves the performance of the closed-loop system, with better tracking specifications in the case of external disturbances, better behavior of the output, and faster convergence of the sliding surface while maintaining stability. © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
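
    For orientation, a minimal first-order sliding mode controller with a PID sliding surface (the baseline that the proposed 2-SMC improves upon) can be simulated as follows; the plant parameters, gains and boundary layer are illustrative assumptions.

    ```python
    import numpy as np

    # Second-order plant: x_dd = -a1*x_d - a0*x + b*u  (illustrative parameters)
    a0, a1, b = 2.0, 1.0, 1.0
    Kp, Ki, Kd = 4.0, 1.0, 1.0          # PID sliding-surface gains
    K, phi = 5.0, 0.05                   # switching gain and boundary-layer width

    dt, T = 1e-3, 5.0
    x = np.array([0.0, 0.0])             # [position, velocity]
    integ_e, ref = 0.0, 1.0              # integral of error; step reference

    for _ in range(int(T / dt)):
        e = ref - x[0]
        e_dot = -x[1]                    # reference is constant
        integ_e += e * dt
        s = Kp * e + Ki * integ_e + Kd * e_dot       # PID sliding surface
        u = K * np.clip(s / phi, -1.0, 1.0)          # boundary-layer switching law
        x_dd = -a1 * x[1] - a0 * x[0] + b * u
        x = x + dt * np.array([x[1], x_dd])          # explicit Euler step

    print(f"final position: {x[0]:.3f}")             # approaches the reference
    ```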

  10. Based on interval type-2 fuzzy-neural network direct adaptive sliding mode control for SISO nonlinear systems

    NASA Astrophysics Data System (ADS)

    Lin, Tsung-Chih

    2010-12-01

    In this paper, a novel direct adaptive interval type-2 fuzzy-neural tracking controller equipped with sliding mode and a Lyapunov synthesis approach is proposed to handle training data corrupted by noise or rule uncertainties for SISO nonlinear systems involving external disturbances. By employing adaptive fuzzy-neural control theory, update laws are derived for approximating the uncertain nonlinear dynamical system. In the meantime, the sliding mode control method and the Lyapunov stability criterion are incorporated into the adaptive fuzzy-neural control scheme such that the derived controller is robust with respect to unmodeled dynamics, external disturbances and approximation errors. In comparison with conventional methods, the advocated approach not only guarantees closed-loop stability, but the output tracking error of the overall system also converges to zero asymptotically without prior knowledge of the upper bound of the lumped uncertainty. Furthermore, the chattering effect of the control input is substantially reduced by the proposed technique. Finally, a simulation example is given to illustrate the performance of the proposed method.

  11. Fault diagnosis of sensor networked structures with multiple faults using a virtual beam based approach

    NASA Astrophysics Data System (ADS)

    Wang, H.; Jing, X. J.

    2017-07-01

    This paper presents a virtual beam based approach suitable for conducting diagnosis of multiple faults in complex structures with limited prior knowledge of the faults involved. The "virtual beam", a recently-proposed concept for fault detection in complex structures, is applied, which consists of a chain of sensors representing a vibration energy transmission path embedded in the complex structure. Statistical tests and adaptive threshold are particularly adopted for fault detection due to limited prior knowledge of normal operational conditions and fault conditions. To isolate the multiple faults within a specific structure or substructure of a more complex one, a 'biased running' strategy is developed and embedded within the bacterial-based optimization method to construct effective virtual beams and thus to improve the accuracy of localization. The proposed method is easy and efficient to implement for multiple fault localization with limited prior knowledge of normal conditions and faults. With extensive experimental results, it is validated that the proposed method can localize both single fault and multiple faults more effectively than the classical trust index subtract on negative add on positive (TI-SNAP) method.

  12. Rapid customization system for 3D-printed splint using programmable modeling technique - a practical approach.

    PubMed

    Li, Jianyou; Tanaka, Hiroya

    2018-01-01

    Traditional splinting processes are skill-dependent and irreversible, and patient satisfaction levels during rehabilitation are invariably lowered by the heavy structure and poor ventilation of splints. To overcome these drawbacks, the use of 3D-printing technology has been proposed in recent years, and public awareness of it has increased. However, the application of 3D-printing technologies is limited by the low CAD proficiency of clinicians as well as unforeseen scan flaws within anatomic models. A programmable modeling tool has been employed to develop a semi-automatic design system for generating a printable splint model. The modeling process was divided into five stages, and the detailed steps involved in the construction of the proposed system, as well as the automatic thickness calculation, the lattice structure, and the assembly method, are thoroughly described. The proposed approach allows clinicians to verify the state of the splint model at every stage, thereby facilitating adjustment of the input content and/or other parameters to help solve possible modeling issues. A finite element analysis simulation was performed to evaluate the structural strength of the generated models. A fit investigation with fabricated splints on volunteers was conducted to assess the wearing experience. Manual modeling steps involved in complex splint designs have been programmed into the proposed automatic system. Clinicians define the splinting region by drawing two curves, thereby obtaining the final model within minutes. The proposed system is capable of automatically patching up minor flaws within the limb model as well as calculating the thickness and lattice density of various splints. Large splints can be divided into three parts for simultaneous multiple printing. This study highlights the advantages, limitations, and possible strategies concerning the application of programmable modeling tools in clinical processes, thereby helping clinicians with lower CAD proficiency become adept with the splint design process and improving the overall design efficiency of 3D-printed splints.

  13. A fully coupled method for massively parallel simulation of hydraulically driven fractures in 3-dimensions: FULLY COUPLED PARALLEL SIMULATION OF HYDRAULIC FRACTURES IN 3-D

    DOE PAGES

    Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.; ...

    2016-09-18

    This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.

  14. Simple Backdoors on RSA Modulus by Using RSA Vulnerability

    NASA Astrophysics Data System (ADS)

    Sun, Hung-Min; Wu, Mu-En; Yang, Cheng-Ta

    This investigation proposes two methods for embedding backdoors in the RSA modulus N=pq rather than in the public exponent e. This strategy not only permits manufacturers to embed backdoors in an RSA system, but also allows users to choose any desired public exponent, such as e=2^16+1, to ensure efficient encryption. This work utilizes the lattice attack and the exhaustive attack to embed backdoors in the two proposed methods, called RSASBLT and RSASBES, respectively. Both approaches involve straightforward steps, making their running time roughly the same as normal RSA key-generation time, implying that no one can detect the backdoor by observing timing differences.

  15. A fully coupled method for massively parallel simulation of hydraulically driven fractures in 3-dimensions: FULLY COUPLED PARALLEL SIMULATION OF HYDRAULIC FRACTURES IN 3-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.

    This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.

  16. Advanced FL Learners Explaining Their Writing Choices: Epistemic Attitude as an Indicator of Problem-Solving and Strategic Knowledge in the On-Line Revision Process

    ERIC Educational Resources Information Center

    Mutta, Maarit; Johansson, Marjut

    2018-01-01

    Verbal protocols are usually used to study cognitive processes involved in various activities, as it is argued that they could make implicit processes of thinking visible and thus reportable. Here, it is proposed that verbalisations can also be approached from another angle, namely as a discourse that contains linguistic markers of writers'…

  17. Development of an electron paramagnetic resonance methodology for studying the photo-generation of reactive species in semiconductor nano-particle assembled films

    NASA Astrophysics Data System (ADS)

    Twardoch, Marek; Messai, Youcef; Vileno, Bertrand; Hoarau, Yannick; Mekki, Djamel E.; Felix, Olivier; Turek, Philippe; Weiss, Jean; Decher, Gero; Martel, David

    2018-06-01

    An experimental approach involving electron paramagnetic resonance is proposed for studying photo-generated reactive species in semiconductor nano-particle-based films deposited on the internal wall of glass capillaries. This methodology is applied here to nano-TiO2 and allows a semi-quantitative analysis of the kinetic evolutions of radical production using a spin scavenger probe.

  18. DIRProt: a computational approach for discriminating insecticide resistant proteins from non-resistant proteins.

    PubMed

    Meher, Prabina Kumar; Sahu, Tanmaya Kumar; Banchariya, Anjali; Rao, Atmakuri Ramakrishna

    2017-03-24

    Insecticide resistance is a major challenge for insect pest control programs in the fields of crop protection and human and animal health. Resistance to different insecticides is conferred by proteins encoded by certain classes of insect genes. To date, no computational tool has been available to distinguish insecticide resistant proteins from non-resistant proteins. The development of such a tool will be helpful in predicting insecticide resistant proteins, which can be targeted for developing appropriate insecticides. Five different feature sets, viz., amino acid composition (AAC), di-peptide composition (DPC), pseudo amino acid composition (PAAC), composition-transition-distribution (CTD) and auto-correlation function (ACF), were used to map the protein sequences into numeric feature vectors. The encoded numeric vectors were then used as input to a support vector machine (SVM) for classification of insecticide resistant and non-resistant proteins. Higher accuracies were obtained with the RBF kernel than with other kernels. Further, accuracies were observed to be higher for the DPC feature set than for the others. The proposed approach achieved an overall accuracy of >90% in discriminating resistant from non-resistant proteins. Further, the two classes of resistant proteins, i.e., detoxification-based and target-based, were discriminated from non-resistant proteins with >95% accuracy. Besides, >95% accuracy was also observed for discrimination of proteins involved in detoxification- and target-based resistance mechanisms. The proposed approach not only outperformed the Blastp, PSI-Blast and Delta-Blast algorithms, but also achieved >92% accuracy when assessed using an independent dataset of 75 insecticide resistant proteins. This paper presents the first computational approach for discriminating insecticide resistant proteins from non-resistant proteins. Based on the proposed approach, an online prediction server, DIRProt, has also been developed for computational prediction of insecticide resistant proteins, which is accessible at http://cabgrid.res.in:8080/dirprot/ . The proposed approach is believed to supplement the efforts needed to develop dynamic insecticides in the wet lab by targeting insecticide resistant proteins.
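
    The AAC-plus-SVM core of such a pipeline can be sketched as follows; the toy sequences and labels are hypothetical, and the DPC, PAAC, CTD and ACF feature sets are omitted.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def aac_features(seq):
        """Amino acid composition (AAC): 20 normalized residue frequencies."""
        seq = seq.upper()
        return np.array([seq.count(a) for a in AMINO_ACIDS], dtype=float) / max(len(seq), 1)

    # Hypothetical training sequences (1 = resistant, 0 = non-resistant).
    seqs = ["MKTAYIAKQR", "GGGSSLLAVV", "MKKLLPTAAA", "VVLLAAGGSS"]
    labels = [1, 0, 1, 0]

    X = np.vstack([aac_features(s) for s in seqs])
    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)
    print(clf.predict(aac_features("MKTAYQRAAK").reshape(1, -1)))
    ```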

  19. Improved pedagogy for linear differential equations by reconsidering how we measure the size of solutions

    NASA Astrophysics Data System (ADS)

    Tisdell, Christopher C.

    2017-11-01

    For over 50 years, the learning and teaching of a priori bounds on solutions to linear differential equations has involved a Euclidean approach to measuring the size of a solution. While the Euclidean approach to a priori bounds on solutions is somewhat manageable in the learning and teaching of proofs involving second-order linear problems with constant coefficients, we believe it is not pedagogically optimal. Moreover, the Euclidean method becomes pedagogically unwieldy in proofs involving higher-order cases. The purpose of this work is to propose a simpler pedagogical approach to establishing a priori bounds on solutions by considering a different way of measuring the size of a solution to linear problems, which we refer to as the Uber size. The Uber form enables a simplification of the pedagogy found in the literature, and the ideas are accessible to learners who have an understanding of the Fundamental Theorem of Calculus and the exponential function, both usually seen in a first course in calculus. We believe that this work will be of mathematical and pedagogical interest to those who are learning and teaching in the area of differential equations or in any of the numerous disciplines where linear differential equations are used.

  20. Contactless and pose invariant biometric identification using hand surface.

    PubMed

    Kanhangad, Vivek; Kumar, Ajay; Zhang, David

    2011-05-01

    This paper presents a novel approach for hand matching that achieves significantly improved performance even in the presence of large hand pose variations. The proposed method utilizes a 3-D digitizer to simultaneously acquire intensity and range images of the user's hand presented to the system in an arbitrary pose. The approach involves determination of the orientation of the hand in 3-D space followed by pose normalization of the acquired 3-D and 2-D hand images. Multimodal (2-D as well as 3-D) palmprint and hand geometry features, which are simultaneously extracted from the user's pose normalized textured 3-D hand, are used for matching. Individual matching scores are then combined using a new dynamic fusion strategy. Our experimental results on the database of 114 subjects with significant pose variations yielded encouraging results. Consistent (across various hand features considered) performance improvement achieved with the pose correction demonstrates the usefulness of the proposed approach for hand based biometric systems with unconstrained and contact-free imaging. The experimental results also suggest that the dynamic fusion approach employed in this work helps to achieve performance improvement of 60% (in terms of EER) over the case when matching scores are combined using the weighted sum rule.

  1. The Development of Animal Behavior: From Lorenz to Neural Nets

    NASA Astrophysics Data System (ADS)

    Bolhuis, Johan J.

    In the study of behavioral development both causal and functional approaches have been used, and they often overlap. The concept of ontogenetic adaptations suggests that each developmental phase involves unique adaptations to the environment of the developing animal. The functional concept of optimal outbreeding has led to further experimental evidence and theoretical models concerning the role of sexual imprinting in the evolutionary process of sexual selection. From a causal perspective it has been proposed that behavioral ontogeny involves the development of various kinds of perceptual, motor, and central mechanisms and the formation of connections among them. This framework has been tested for a number of complex behavior systems such as hunger and dustbathing. Imprinting is often seen as a model system for behavioral development in general. Recent advances in imprinting research have been the result of an interdisciplinary effort involving ethology, neuroscience, and experimental psychology, with a continual interplay between these approaches. The imprinting results are consistent with Lorenz' early intuitive suggestions and are also reflected in the architecture of recent neural net models.

  2. Micro-simulation of vehicle conflicts involving right-turn vehicles at signalized intersections based on cellular automata.

    PubMed

    Chai, C; Wong, Y D

    2014-02-01

    At intersections, vehicles coming from different directions conflict with each other. Improper geometric design and signal settings at a signalized intersection increase the occurrence of conflicts between road users and reduce the safety level. This study established a cellular automata (CA) model to simulate vehicular interactions involving right-turn vehicles (similar to left-turn vehicles in the US). Through various simulation scenarios for four case cross-intersections, the relationships between conflict occurrences involving right-turn vehicles and both traffic volume and right-turn movement control strategies are analyzed. The impacts of traffic volume, permissive right-turn versus red-amber-green (RAG) arrow control, shared straight-through and right-turn lanes, as well as signal settings, are estimated from the simulation results. The simulation model is found to provide a reasonable assessment of conflicts through comparison with an existing simulation approach and observed accidents. Through the proposed approach, prediction models for the occurrence and severity of vehicle conflicts can be developed for various geometric layouts and traffic control strategies. Copyright © 2013 Elsevier Ltd. All rights reserved.
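
    As a building block, a single-lane cellular automaton update of the Nagel-Schreckenberg type is sketched below; the paper's intersection geometry, signal phasing and conflict-detection logic are not reproduced.

    ```python
    import numpy as np

    def nasch_step(pos, vel, v_max, p_slow, road_len, rng):
        """One Nagel-Schreckenberg cellular-automaton update (accelerate,
        brake to the gap, random slowdown, move) on a periodic single lane."""
        order = np.argsort(pos)                          # sort cars along the road
        pos, vel = pos[order], vel[order]
        gaps = (np.roll(pos, -1) - pos - 1) % road_len   # empty cells to car ahead
        vel = np.minimum(vel + 1, v_max)                 # accelerate
        vel = np.minimum(vel, gaps)                      # brake to avoid collision
        slow = rng.random(vel.size) < p_slow
        vel[slow] = np.maximum(vel[slow] - 1, 0)         # random slowdown
        return (pos + vel) % road_len, vel

    rng = np.random.default_rng(4)
    pos = rng.choice(100, size=20, replace=False)        # 20 cars on 100 cells
    vel = np.zeros(20, dtype=int)
    for _ in range(50):
        pos, vel = nasch_step(pos, vel, v_max=5, p_slow=0.3, road_len=100, rng=rng)
    ```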

  3. Distributed parameter modeling of repeated truss structures

    NASA Technical Reports Server (NTRS)

    Wang, Han-Ching

    1994-01-01

    A new approach to find homogeneous models for beam-like repeated flexible structures is proposed which conceptually involves two steps. The first step involves the approximation of 3-D non-homogeneous model by a 1-D periodic beam model. The structure is modeled as a 3-D non-homogeneous continuum. The displacement field is approximated by Taylor series expansion. Then, the cross sectional mass and stiffness matrices are obtained by energy equivalence using their additive properties. Due to the repeated nature of the flexible bodies, the mass, and stiffness matrices are also periodic. This procedure is systematic and requires less dynamics detail. The first step involves the homogenization from a 1-D periodic beam model to a 1-D homogeneous beam model. The periodic beam model is homogenized into an equivalent homogeneous beam model using the additive property of compliance along the generic axis. The major departure from previous approaches in literature is using compliance instead of stiffness in homogenization. An obvious justification is that the stiffness is additive at each cross section but not along the generic axis. The homogenized model preserves many properties of the original periodic model.

  4. Urban Stormwater Governance: The Need for a Paradigm Shift.

    PubMed

    Dhakal, Krishna P; Chevalier, Lizette R

    2016-05-01

    Traditional urban stormwater management involves rapid removal of stormwater through centralized conveyance systems of curb-gutter-pipe networks. This results in many adverse impacts on the environment, including hydrological disruption, groundwater depletion, downstream flooding, receiving-water quality degradation, channel erosion, and stream ecosystem damage. In order to mitigate these adverse impacts, urban stormwater managers are increasingly using green infrastructure that promotes on-site infiltration, restores hydrological functions of the landscape, and reduces surface runoff. Existing stormwater governance, however, is centralized and structured to support the conventional systems. This governance approach is not suited to the emerging distributed management approach, which involves multiple stakeholders including parcel owners, government agencies, and non-governmental organizations. This incongruence between technology and governance calls for a paradigm shift from centralized and technocratic governance to distributed and participatory governance. This paper evaluates how five US cities have been adjusting their governance to address the discord. Finally, the paper proposes an alternative governance model, which provides a mechanism to involve stakeholders and implement distributed green infrastructure under an integrative framework.

  5. Phylogenetic analysis of genes involved in mycosporine-like amino acid biosynthesis in symbiotic dinoflagellates.

    PubMed

    Rosic, Nedeljka N

    2012-04-01

    Mycosporine-like amino acids (MAAs) are multifunctional secondary metabolites involved in photoprotection in many marine organisms. As well as having broad ultraviolet (UV) absorption spectra (310-362 nm), these biological sunscreens are also involved in the prevention of oxidative stress. More than 20 different MAAs have been discovered so far, characterized by distinctive chemical structures and a broad ecological distribution. Additionally, UV-screening MAA metabolites have been investigated and used in biotechnology and cosmetics. The biosynthesis of MAAs has been suggested to occur via either the shikimate or pentose phosphate pathways. Despite their wide distribution in marine and freshwater species and also the commercial application in cosmetic products, there are still a number of uncertainties regarding the genetic, biochemical, and evolutionary origin of MAAs. Here, using a transcriptome-mining approach, we identify the gene counterparts from the shikimate or pentose phosphate pathway involved in MAA biosynthesis within the sequences of the reef-building coral symbiotic dinoflagellates (genus Symbiodinium). We also report the highly similar sequences of genes from the proposed MAA biosynthetic pathway involved in the metabolism of 4-deoxygadusol (direct MAA precursor) in various Symbiodinium strains confirming their algal origin and conserved nature. Finally, we reveal the separate identity of two O-methyltransferase genes, possibly involved in MAA biosynthesis, as well as nonribosomal peptide synthetase and adenosine triphosphate grasp homologs in symbiotic dinoflagellates. This study provides a biochemical and phylogenetic overview of the genes from the proposed MAA biosynthetic pathway with a focus on coral endosymbionts.

  6. [Health education: adjusting to parents' expectations. Results of a quantitative and qualitative survey in Morbihan].

    PubMed

    Bourhis, Cathy; Tual, Florence

    2013-01-01

    Health education among children and adolescents tends to be more effective if the objectives are shared, supported and promoted by parents. Professionals and policy-makers are therefore keen to promote the active involvement of parents. However, they face the same challenge: how to get parents involved. To address this issue, we need to examine parents' concerns and expectations directly. Professionals will need to adapt the proposed responses to the identified needs. This approach is a basic methodological and ethical principle in health education and requires the ability to change perceptions and practices while taking into account public expectations.

  7. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    PubMed

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). The tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to two concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
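
    A single-phase sketch of the random sampling idea is given below, assuming destructive sampling with several subjects per time point; the paper's algorithm adds a second phase not shown here, and all concentrations are illustrative.

    ```python
    import numpy as np

    def auc(t, c):
        """Trapezoidal area under the concentration-time curve."""
        return float(np.sum((c[1:] + c[:-1]) * np.diff(t) / 2.0))

    def ratio_by_random_sampling(times, tissue_by_t, plasma_by_t, n_rep=2000, seed=0):
        """Draw one tissue and one plasma observation per time point, form
        pseudo-profiles, and take the ratio of trapezoidal AUCs; repeating
        this yields an uncertainty estimate for the tissue-to-plasma ratio."""
        rng = np.random.default_rng(seed)
        ratios = np.empty(n_rep)
        for r in range(n_rep):
            c_tis = np.array([rng.choice(obs) for obs in tissue_by_t])
            c_pla = np.array([rng.choice(obs) for obs in plasma_by_t])
            ratios[r] = auc(times, c_tis) / auc(times, c_pla)
        return ratios.mean(), np.percentile(ratios, [2.5, 97.5])

    times = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # sampling times (h)
    tissue_by_t = [np.array([3.1, 2.8, 3.4]), np.array([4.0, 4.4]),
                   np.array([3.6, 3.2]), np.array([2.1, 2.5]), np.array([0.9, 1.1])]
    plasma_by_t = [np.array([1.0, 1.2]), np.array([1.5, 1.4]),
                   np.array([1.1, 1.0]), np.array([0.7, 0.6]), np.array([0.3, 0.2])]
    print(ratio_by_random_sampling(times, tissue_by_t, plasma_by_t))
    ```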

  8. Finite strain transient creep of D16T alloy: identification and validation employing heterogeneous tests

    NASA Astrophysics Data System (ADS)

    Shutov, A. V.; Larichkin, A. Yu

    2017-10-01

    A cyclic creep damage model, previously proposed by the authors, is modified for a better description of the transient creep of D16T alloy observed in the finite strain range under rapidly changing stresses. The new model encompasses the concept of kinematic hardening, which allows us to account for the creep-induced anisotropy. The model kinematics is based on the nested multiplicative split of the deformation gradient, proposed by Lion. The damage evolution is accounted for by the classical Kachanov-Rabotnov approach. The material parameters are identified using experimental data on cyclic torsion of thick-walled samples with different holding times between load reversals. For the validation of the proposed material model, an additional experiment is analyzed. Although this additional test is not involved in the identification procedure, the proposed cyclic creep damage model describes it accurately.

  9. Attention to Intentions—How to Stimulate Strong Intentions to Change

    NASA Astrophysics Data System (ADS)

    Dam, M.; Janssen, F. J. J. M.; van Driel, J. H.

    2017-04-01

    The implementation of educational reforms requires behavioral changes from the teachers involved. Theories on successful behavioral change prescribe the following conditions: teachers need to possess the necessary knowledge and skills, form strong positive intentions to perform the new behavior, and have a supporting environment for change. However, existing approaches to teacher professional development in the context of educational reforms are predominantly aimed at the development of knowledge and skills and at creating a supporting environment, but lack attention to teachers' intentions to change. In the study described in this article, we performed "motivating-for-educational-change" interviews (MECI) and explored the influence on teachers' intentions to change in the direction of the proposed national biology education reform, that is, the introduction of a context-based curriculum. The MECI comprised two tools: building on earlier successful experiences and using lesson segments to rearrange instructional approaches. We explored the influence of the MECI technique on the strength and specificity of participating teachers' intentions. When conducting the MECI, many participants expressed that they now realized how they had already implemented aspects of the reform in their regular instructional approaches. Furthermore, all the participants formulated stronger and more specific intentions to change their regular instructional approach towards that of the proposed reform while taking their regular instructional approach as a starting point.

  10. Learning to perceive in the sensorimotor approach: Piaget’s theory of equilibration interpreted dynamically

    PubMed Central

    Di Paolo, Ezequiel Alejandro; Barandiaran, Xabier E.; Beaton, Michael; Buhrmann, Thomas

    2014-01-01

    Learning to perceive is faced with a classical paradox: if understanding is required for perception, how can we learn to perceive something new, something we do not yet understand? According to the sensorimotor approach, perception involves mastery of regular sensorimotor co-variations that depend on the agent and the environment, also known as the “laws” of sensorimotor contingencies (SMCs). In this sense, perception involves enacting relevant sensorimotor skills in each situation. It is important for this proposal that such skills can be learned and refined with experience and yet up to this date, the sensorimotor approach has had no explicit theory of perceptual learning. The situation is made more complex if we acknowledge the open-ended nature of human learning. In this paper we propose Piaget’s theory of equilibration as a potential candidate to fulfill this role. This theory highlights the importance of intrinsic sensorimotor norms, in terms of the closure of sensorimotor schemes. It also explains how the equilibration of a sensorimotor organization faced with novelty or breakdowns proceeds by re-shaping pre-existing structures in coupling with dynamical regularities of the world. This way learning to perceive is guided by the equilibration of emerging forms of skillful coping with the world. We demonstrate the compatibility between Piaget’s theory and the sensorimotor approach by providing a dynamical formalization of equilibration to give an explicit micro-genetic account of sensorimotor learning and, by extension, of how we learn to perceive. This allows us to draw important lessons in the form of general principles for open-ended sensorimotor learning, including the need for an intrinsic normative evaluation by the agent itself. We also explore implications of our micro-genetic account at the personal level. PMID:25126065

  11. Learning to perceive in the sensorimotor approach: Piaget's theory of equilibration interpreted dynamically.

    PubMed

    Di Paolo, Ezequiel Alejandro; Barandiaran, Xabier E; Beaton, Michael; Buhrmann, Thomas

    2014-01-01

    Learning to perceive is faced with a classical paradox: if understanding is required for perception, how can we learn to perceive something new, something we do not yet understand? According to the sensorimotor approach, perception involves mastery of regular sensorimotor co-variations that depend on the agent and the environment, also known as the "laws" of sensorimotor contingencies (SMCs). In this sense, perception involves enacting relevant sensorimotor skills in each situation. It is important for this proposal that such skills can be learned and refined with experience and yet up to this date, the sensorimotor approach has had no explicit theory of perceptual learning. The situation is made more complex if we acknowledge the open-ended nature of human learning. In this paper we propose Piaget's theory of equilibration as a potential candidate to fulfill this role. This theory highlights the importance of intrinsic sensorimotor norms, in terms of the closure of sensorimotor schemes. It also explains how the equilibration of a sensorimotor organization faced with novelty or breakdowns proceeds by re-shaping pre-existing structures in coupling with dynamical regularities of the world. This way learning to perceive is guided by the equilibration of emerging forms of skillful coping with the world. We demonstrate the compatibility between Piaget's theory and the sensorimotor approach by providing a dynamical formalization of equilibration to give an explicit micro-genetic account of sensorimotor learning and, by extension, of how we learn to perceive. This allows us to draw important lessons in the form of general principles for open-ended sensorimotor learning, including the need for an intrinsic normative evaluation by the agent itself. We also explore implications of our micro-genetic account at the personal level.

  12. One lens optical correlation: application to face recognition.

    PubMed

    Jridi, Maher; Napoléon, Thibault; Alfalou, Ayman

    2018-03-20

    Despite its extensive use, the traditional 4f Vander Lugt Correlator optical setup can be further simplified. We propose a lightweight correlation scheme where the decision is taken in the Fourier plane. For this purpose, the Fourier plane is adapted and used as a decision plane. Then, the offline phase and the decision metric are re-examined in order to keep a reasonable recognition rate. The benefits of the proposed approach are numerous: (1) it overcomes the constraints related to the use of a second lens; (2) the optical correlation setup is simplified; (3) the multiplication with the correlation filter can be done digitally, which offers a higher adaptability according to the application. Moreover, the digital counterpart of the correlation scheme is lightened since with the proposed scheme we get rid of the inverse Fourier transform (IFT) calculation (i.e., decision directly in the Fourier domain without resorting to IFT). To assess the performance of the proposed approach, an insight into digital hardware resources saving is provided. The proposed method involves nearly 100 times fewer arithmetic operators. Moreover, from experimental results in the context of face verification-based correlation, we demonstrate that the proposed scheme provides comparable or better accuracy than the traditional method. One interesting feature of the proposed scheme is that it could greatly outperform the traditional scheme for face identification application in terms of sensitivity to face orientation. The proposed method is found to be digital/optical implementation-friendly, which facilitates its integration on a very broad range of scenarios.
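
    As a rough illustration of deciding in the Fourier plane rather than the correlation plane, the sketch below scores a probe image against a reference without any inverse FFT. The normalized spectral inner product is an illustrative stand-in for the authors' adapted decision metric, which is not reproduced here.

```python
# Minimal sketch (NumPy assumed): Fourier-plane correlation decision.
import numpy as np

def fourier_plane_score(image, reference):
    """Normalized matched-filter similarity computed entirely in the Fourier plane."""
    F_img = np.fft.fft2(image)
    F_ref = np.fft.fft2(reference)
    # The product spectrum is the Fourier transform of the correlation plane;
    # by Parseval's relation its normalized inner product equals the zero-lag
    # correlation, so no inverse FFT is needed to reach a decision.
    inner = np.vdot(F_ref, F_img)
    return np.abs(inner) / (np.linalg.norm(F_img) * np.linalg.norm(F_ref))

rng = np.random.default_rng(0)
ref = rng.random((64, 64))
genuine = ref + 0.05 * rng.standard_normal((64, 64))  # same face, mild noise
impostor = rng.random((64, 64))                        # different face
print(fourier_plane_score(genuine, ref), fourier_plane_score(impostor, ref))
```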

  13. Sound Power Estimation for Beam and Plate Structures Using Polyvinylidene Fluoride Films as Sensors

    PubMed Central

    Mao, Qibo; Zhong, Haibing

    2017-01-01

    The theory for calculation and/or measurement of sound power based on the classical velocity-based radiation mode (V-mode) approach is well established for planar structures. However, current V-mode theory is limited in scope in that it can only be applied with conventional motion sensors (i.e., accelerometers). In this study, in order to estimate the sound power of vibrating beam and plate structures using polyvinylidene fluoride (PVDF) films as sensors, a PVDF-based radiation mode (C-mode) concept is introduced to determine the sound power radiated from the output signals of the PVDF films on the vibrating structure. The proposed method is a hybrid of vibration measurement and numerical calculation of C-modes. The proposed C-mode approach has the following advantages: (1) compared to conventional motion sensors, PVDF films are lightweight, flexible, and low-cost; (2) there is no need for special measuring environments, since the proposed method does not require measurement of sound fields; (3) in the low-frequency range (typically with dimensionless frequency kl < 4), the radiation efficiencies of the C-modes fall off very rapidly with increasing mode order and the shapes of the C-modes remain almost unchanged, which means that the computational load can be significantly reduced because only the first few dominant C-modes are involved. Numerical simulations and experimental investigations were carried out to verify the accuracy and efficiency of the proposed method. PMID:28509870
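
    The radiation-mode machinery underlying both V- and C-modes can be sketched with the classical elementary-radiator resistance matrix for a baffled plate. The snippet below is a simplified stand-in (surface velocities instead of PVDF outputs, constant factors such as the 1/2 for peak vs. RMS amplitudes omitted) showing how truncating to the first few dominant modes approximates the radiated power:

```python
import numpy as np

rho, c, f = 1.21, 343.0, 200.0                   # air density, speed of sound, Hz
k = 2 * np.pi * f / c
nx = ny = 8
dx = 0.05                                        # 8 x 8 grid of elementary radiators
xs, ys = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dx)
pts = np.column_stack([xs.ravel(), ys.ravel()])
r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
S = dx * dx
R = (rho * c * k**2 * S**2 / (4 * np.pi)) * np.sinc(k * r / np.pi)  # sin(kr)/(kr)

lam, Q = np.linalg.eigh(R)                       # radiation modes and efficiencies
lam, Q = lam[::-1], Q[:, ::-1]                   # sort by decreasing efficiency
v = np.random.default_rng(0).standard_normal(nx * ny)   # stand-in surface velocities
W_full = v @ R @ v                               # full-rank radiated power (arb. units)
W_6 = sum(lam[m] * (Q[:, m] @ v) ** 2 for m in range(6))  # first 6 modes only
print(f"full: {W_full:.4e}  6-mode approximation: {W_6:.4e}")
```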

  14. Non-Cartesian MRI Reconstruction With Automatic Regularization Via Monte-Carlo SURE

    PubMed Central

    Weller, Daniel S.; Nielsen, Jon-Fredrik; Fessler, Jeffrey A.

    2013-01-01

    Magnetic resonance image (MRI) reconstruction from undersampled k-space data requires regularization to reduce noise and aliasing artifacts. Proper application of regularization however requires appropriate selection of associated regularization parameters. In this work, we develop a data-driven regularization parameter adjustment scheme that minimizes an estimate (based on the principle of Stein’s unbiased risk estimate—SURE) of a suitable weighted squared-error measure in k-space. To compute this SURE-type estimate, we propose a Monte-Carlo scheme that extends our previous approach to inverse problems (e.g., MRI reconstruction) involving complex-valued images. Our approach depends only on the output of a given reconstruction algorithm and does not require knowledge of its internal workings, so it is capable of tackling a wide variety of reconstruction algorithms and nonquadratic regularizers including total variation and those based on the ℓ1-norm. Experiments with simulated and real MR data indicate that the proposed approach is capable of providing near mean squared-error (MSE) optimal regularization parameters for single-coil undersampled non-Cartesian MRI reconstruction. PMID:23591478
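
    The core black-box ingredient is the Monte-Carlo divergence estimate. The sketch below shows it for a real-valued denoiser under i.i.d. Gaussian noise, a simplification of the paper's weighted k-space measure and complex-valued setting:

```python
import numpy as np

def mc_divergence(f, y, eps=1e-4, seed=0):
    """Monte-Carlo estimate of the divergence of a black-box reconstruction f."""
    b = np.random.default_rng(seed).standard_normal(y.shape)  # random probe vector
    return float(np.vdot(b, f(y + eps * b) - f(y)).real) / eps

def sure(f, y, sigma2):
    """SURE for i.i.d. Gaussian noise: ||f(y)-y||^2 - N*sigma2 + 2*sigma2*div."""
    return np.sum(np.abs(f(y) - y) ** 2) - y.size * sigma2 \
           + 2 * sigma2 * mc_divergence(f, y)

# Toy usage: choose a soft-threshold level for denoising without the truth.
rng = np.random.default_rng(1)
truth = np.outer(np.hanning(32), np.hanning(32)) * 10
y = truth + rng.normal(0, 2.0, truth.shape)
soft = lambda lam: (lambda z: np.sign(z) * np.maximum(np.abs(z) - lam, 0))
best = min([0.5, 1.0, 2.0, 4.0], key=lambda lam: sure(soft(lam), y, sigma2=4.0))
print("SURE-selected threshold:", best)
```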

  15. SChloro: directing Viridiplantae proteins to six chloroplastic sub-compartments.

    PubMed

    Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Casadio, Rita

    2017-02-01

    Chloroplasts are organelles found in plants and involved in several important cell processes. Similarly to other compartments in the cell, chloroplasts have an internal structure comprising several sub-compartments, where different proteins are targeted to perform their functions. Given the relation between protein function and localization, the availability of effective computational tools to predict protein sub-organelle localizations is crucial for large-scale functional studies. In this paper we present SChloro, a novel machine-learning approach to predict protein sub-chloroplastic localization, based on targeting-signal detection and membrane protein information. The proposed approach performs multi-label predictions discriminating six chloroplastic sub-compartments that include inner membrane, outer membrane, stroma, thylakoid lumen, plastoglobule and thylakoid membrane. In comparative benchmarks, the proposed method outperforms current state-of-the-art methods in both single- and multi-compartment predictions, with an overall multi-label accuracy of 74%. The results demonstrate the relevance of the approach, which is eligible as a good candidate for integration into more general large-scale annotation pipelines of protein subcellular localization. The method is available as a web server at http://schloro.biocomp.unibo.it (contact: gigi@biocomp.unibo.it).

  16. Development of a Fiber-Optics Microspatially Offset Raman Spectroscopy Sensor for Probing Layered Materials.

    PubMed

    Vandenabeele, Peter; Conti, Claudia; Rousaki, Anastasia; Moens, Luc; Realini, Marco; Matousek, Pavel

    2017-09-05

    Microspatially offset Raman spectroscopy (micro-SORS) has been proposed as a valuable approach to sample molecular information from layers that are covered by a turbid (nontransparent) layer. However, when large magnifications are involved, the approach is not straightforward: with external beam delivery, spatial constraints exist in positioning the laser beam and the objective lens, while with internal beam delivery the maximum achievable spatial offset is restricted. To overcome these limitations, we propose here a prototype of a new micro-SORS sensor, which uses bare glass fibers to transfer the laser radiation to the sample and to collect the Raman signal from a spatially offset zone for the Raman spectrometer. The concept also renders itself amenable to remote delivery and to the miniaturization of the probe head, which could be beneficial for special applications, e.g., where access to sample areas is restricted. The basic applicability of this approach was demonstrated by studying several layered structure systems. Apart from proving the feasibility of the technique, practical aspects of the use of the prototype sensor are also discussed.

  17. Longitudinal Study-Based Dementia Prediction for Public Health

    PubMed Central

    Kim, HeeChel; Chun, Hong-Woo; Kim, Seonho; Coh, Byoung-Youl; Kwon, Oh-Jin; Moon, Yeong-Ho

    2017-01-01

    The issue of public health in Korea has attracted significant attention given the aging of the country's population, which has created many types of social problems. The approach proposed in this article aims to address dementia, one of the most significant symptoms of aging and a public health care issue in Korea. The Korean National Health Insurance Service Senior Cohort Database contains personal medical data of every citizen in Korea. There are many differences in medical history patterns between individuals with dementia and normal controls. The approach used in this study involved examination of personal medical history features from personal disease history, sociodemographic data, and personal health examinations to develop a prediction model. The prediction model used a support-vector machine learning technique to perform a 10-fold cross-validation analysis. The experimental results demonstrated promising performance (80.9% F-measure). The proposed approach supported the significant influence of personal medical history features during an optimal observation period. It is anticipated that a biomedical "big data"-based disease prediction model may assist in diagnosing any disease more accurately. PMID:28867810
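
    A minimal sketch of the evaluation protocol described above, using synthetic stand-in data (the cohort features themselves are not public): an SVM scored with F-measure under 10-fold cross-validation.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Imbalanced two-class problem standing in for dementia vs. control records.
X, y = make_classification(n_samples=500, n_features=40, weights=[0.7, 0.3],
                           random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, scoring="f1",
                         cv=StratifiedKFold(n_splits=10, shuffle=True,
                                            random_state=0))
print(f"mean F-measure over 10 folds: {scores.mean():.3f}")
```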

  18. What carries a mediation process? Configural analysis of mediation.

    PubMed

    von Eye, Alexander; Mun, Eun Young; Mair, Patrick

    2009-09-01

    Mediation is a process that links a predictor and a criterion via a mediator variable. Mediation can be full or partial. This well-established definition operates at the level of variables even if they are categorical. In this article, two new approaches to the analysis of mediation are proposed. Both of these approaches focus on the analysis of categorical variables. The first involves mediation analysis at the level of configurations instead of variables. Thus, mediation can be incorporated into the arsenal of methods of analysis for person-oriented research. Second, it is proposed that Configural Frequency Analysis (CFA) can be used for both exploration and confirmation of mediation relationships among categorical variables. The implications of using CFA are, first, that mediation hypotheses can be tested at the level of individual configurations instead of variables. Second, this approach leaves the door open for different types of mediation processes to exist within the same data set. Using a data example, it is illustrated that aggregate-level analysis can overlook mediation processes that operate at the level of individual configurations.
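
    A toy sketch of first-order CFA with an independence base model, illustrating how configurations of a binary predictor, mediator, and criterion can surface as types or antitypes (the paper's mediation-specific base models are more elaborate than plain independence):

```python
import numpy as np
from itertools import product
from scipy.stats import binomtest

rng = np.random.default_rng(0)
n = 500
X = rng.integers(0, 2, n)                       # predictor
M = np.clip(X + (rng.random(n) < 0.2), 0, 1)    # mediator driven by X
Y = np.clip(M + (rng.random(n) < 0.2), 0, 1)    # criterion driven by M
data = np.column_stack([X, M, Y])

p1 = data.mean(axis=0)                          # marginal P(var = 1)
for cell in product([0, 1], repeat=3):
    obs = int(np.all(data == cell, axis=1).sum())
    p_exp = np.prod([p if v else 1 - p for v, p in zip(cell, p1)])
    pval = binomtest(obs, n, p_exp).pvalue      # observed vs. independence
    label = "type" if obs > n * p_exp else "antitype"
    if pval < 0.05:
        print(f"configuration {cell}: observed {obs}, "
              f"expected {n * p_exp:.1f} -> {label}")
```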

  19. Nonlinear mechanics of non-rigid origami: an efficient computational approach

    NASA Astrophysics Data System (ADS)

    Liu, K.; Paulino, G. H.

    2017-10-01

    Origami-inspired designs possess attractive applications to science and engineering (e.g. deployable, self-assembling, adaptable systems). The special geometric arrangement of panels and creases gives rise to unique mechanical properties of origami, such as reconfigurability, making origami designs well suited for tunable structures. Although often being ignored, origami structures exhibit additional soft modes beyond rigid folding due to the flexibility of thin sheets that further influence their behaviour. Actual behaviour of origami structures usually involves significant geometric nonlinearity, which amplifies the influence of additional soft modes. To investigate the nonlinear mechanics of origami structures with deformable panels, we present a structural engineering approach for simulating the nonlinear response of non-rigid origami structures. In this paper, we propose a fully nonlinear, displacement-based implicit formulation for performing static/quasi-static analyses of non-rigid origami structures based on `bar-and-hinge' models. The formulation itself leads to an efficient and robust numerical implementation. Agreement between real models and numerical simulations demonstrates the ability of the proposed approach to capture key features of origami behaviour.
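
    A schematic of the incremental-iterative Newton-Raphson driver that such a displacement-based implicit formulation relies on. The element-level internal force and tangent stiffness of the bar-and-hinge model are abstracted behind callbacks, and a one-degree-of-freedom hardening spring stands in for a real assembly:

```python
import numpy as np

def solve_static(f_int, K_t, f_ext, u0, n_inc=10, tol=1e-10, max_iter=50):
    """Incremental Newton-Raphson for R(u) = f_int(u) - lam * f_ext = 0."""
    u = u0.copy()
    for inc in range(1, n_inc + 1):
        lam = inc / n_inc                        # load factor for this increment
        for _ in range(max_iter):
            R = f_int(u) - lam * f_ext           # residual (out-of-balance force)
            if np.linalg.norm(R) < tol:
                break
            u -= np.linalg.solve(K_t(u), R)      # Newton correction
    return u

# Toy 1-DOF stand-in for a bar-and-hinge assembly: a hardening spring.
f_int = lambda u: np.array([u[0] + u[0] ** 3])
K_t = lambda u: np.array([[1.0 + 3.0 * u[0] ** 2]])
u = solve_static(f_int, K_t, f_ext=np.array([2.0]), u0=np.zeros(1))
print("equilibrium displacement:", u[0])         # root of u + u^3 = 2, i.e. 1.0
```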

  20. Nonlinear mechanics of non-rigid origami: an efficient computational approach.

    PubMed

    Liu, K; Paulino, G H

    2017-10-01

    Origami-inspired designs possess attractive applications to science and engineering (e.g. deployable, self-assembling, adaptable systems). The special geometric arrangement of panels and creases gives rise to unique mechanical properties of origami, such as reconfigurability, making origami designs well suited for tunable structures. Although often being ignored, origami structures exhibit additional soft modes beyond rigid folding due to the flexibility of thin sheets that further influence their behaviour. Actual behaviour of origami structures usually involves significant geometric nonlinearity, which amplifies the influence of additional soft modes. To investigate the nonlinear mechanics of origami structures with deformable panels, we present a structural engineering approach for simulating the nonlinear response of non-rigid origami structures. In this paper, we propose a fully nonlinear, displacement-based implicit formulation for performing static/quasi-static analyses of non-rigid origami structures based on 'bar-and-hinge' models. The formulation itself leads to an efficient and robust numerical implementation. Agreement between real models and numerical simulations demonstrates the ability of the proposed approach to capture key features of origami behaviour.

  1. A statistical approach to optimizing concrete mixture design.

    PubMed

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.
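
    A sketch of the workflow with placeholder strength values (the paper's measured responses are not reproduced here): build the 3³ full-factorial design, then fit a quadratic response-surface model for compressive strength.

```python
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

wc_levels = [0.38, 0.43, 0.48]          # water/cementitious ratio
cm_levels = [350, 375, 400]             # cementitious content, kg/m^3
fa_levels = [0.35, 0.40, 0.45]          # fine/total aggregate ratio
design = np.array(list(itertools.product(wc_levels, cm_levels, fa_levels)))  # 27 runs

rng = np.random.default_rng(0)          # synthetic response with noise
strength = (90 - 80 * design[:, 0] + 0.05 * design[:, 1] + 10 * design[:, 2]
            + rng.normal(0, 1.0, len(design)))

quad = PolynomialFeatures(degree=2, include_bias=False)  # quadratic RSM terms
model = LinearRegression().fit(quad.fit_transform(design), strength)
print("R^2 on design points:", model.score(quad.transform(design), strength))
```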

  2. A Statistical Approach to Optimizing Concrete Mixture Design

    PubMed Central

    Alghamdi, Saeid A.

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3³). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m³), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options. PMID:24688405

  3. A novel approach on accelerated ageing towards reliability optimization of high concentration photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Tsanakas, John A.; Jaffre, Damien; Sicre, Mathieu; Elouamari, Rachid; Vossier, Alexis; de Salins, Jean-Edouard; Bechou, Laurent; Levrier, Bruno; Perona, Arnaud; Dollet, Alain

    2014-09-01

    This paper presents a preliminary study of a novel approach proposed for highly accelerated ageing and reliability optimization of high concentrating photovoltaic (HCPV) cells and assemblies. The intended approach aims to overcome several limitations of current accelerated ageing tests (AAT) adopted to date, proposing an alternative experimental set-up for performing faster and more realistic thermal cycles, under real sun, without the involvement of an environmental chamber. The study also includes specific characterization techniques, applied before and after each AAT sequence, which respectively provide the initial and final diagnosis of the condition of the tested sample. The acquired data from these diagnostic/characterization methods are then used as indices to determine both quantitatively and qualitatively the severity of degradation and, thus, the ageing level for each tested HCPV assembly or cell sample. The ultimate goal of such "initial diagnosis - AAT - final diagnosis" sequences is to provide the basis for future work on the reliability analysis of the main degradation mechanisms and confident prediction of failure propagation in HCPV cells, by means of acceleration factor (AF) and mean-time-to-failure (MTTF) estimations.

  4. A Sub-filter Scale Noise Equation for Hybrid LES Simulations

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.

    2006-01-01

    Hybrid LES/subscale modeling approaches have an important advantage over current noise prediction methods in that they only involve modeling of the relatively universal subscale motion and not the configuration-dependent larger-scale turbulence. Previous hybrid approaches use approximate statistical techniques or extrapolation methods to obtain the requisite information about the sub-filter scale motion. An alternative approach would be to adopt the modeling techniques used in the current noise prediction methods and determine the unknown stresses from experimental data. The present paper derives an equation for predicting the subscale sound from information that can be obtained with currently available experimental procedures. The resulting prediction method would then be intermediate between the current noise prediction codes and previously proposed hybrid techniques.

  5. Iterative approach of dual regression with a sparse prior enhances the performance of independent component analysis for group functional magnetic resonance imaging (fMRI) data.

    PubMed

    Kim, Yong-Hwan; Kim, Junghoe; Lee, Jong-Hwan

    2012-12-01

    This study proposes an iterative dual-regression (DR) approach with sparse prior regularization to better estimate an individual's neuronal activation using the results of an independent component analysis (ICA) method applied to a temporally concatenated group of functional magnetic resonance imaging (fMRI) data (i.e., Tc-GICA method). An ordinary DR approach estimates the spatial patterns (SPs) of neuronal activation and corresponding time courses (TCs) specific to each individual's fMRI data with two steps involving least-squares (LS) solutions. Our proposed approach employs iterative LS solutions to refine both the individual SPs and TCs with an additional a priori assumption of sparseness in the SPs (i.e., minimally overlapping SPs) based on L1-norm minimization. To quantitatively evaluate the performance of this approach, semi-artificial fMRI data were created from resting-state fMRI data with the following considerations: (1) an artificially designed spatial layout of neuronal activation patterns with varying overlap sizes across subjects and (2) a BOLD time series (TS) with variable parameters such as onset time, duration, and maximum BOLD levels. To systematically control the spatial layout variability of neuronal activation patterns across the "subjects" (n=12), the degree of spatial overlap across all subjects was varied from a minimum of 1 voxel (i.e., 0.5-voxel cubic radius) to a maximum of 81 voxels (i.e., 2.5-voxel radius) across the task-related SPs with a size of 100 voxels for both the block-based and event-related task paradigms. In addition, several levels of maximum percentage BOLD intensity (i.e., 0.5, 1.0, 2.0, and 3.0%) were used for each degree of spatial overlap size. From the results, the estimated individual SPs of neuronal activation obtained from the proposed iterative DR approach with a sparse prior showed an enhanced true positive rate and reduced false positive rate compared to the ordinary DR approach. The estimated TCs of the task-related SPs from our proposed approach showed greater temporal correlation coefficients with a reference hemodynamic response function than those of the ordinary DR approach. Moreover, the efficacy of the proposed DR approach was also successfully demonstrated by the results of real fMRI data acquired from left-/right-hand clenching tasks in both block-based and event-related task paradigms. Copyright © 2012 Elsevier Inc. All rights reserved.
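
    The alternating structure of the proposal can be sketched as follows; a soft-thresholding step stands in for the paper's exact L1-norm minimization of the spatial maps:

```python
import numpy as np

def soft(x, lam):
    """Soft-thresholding operator, the proximal map of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def iterative_dual_regression(Y, S_group, lam=0.1, n_iter=20):
    """Y: time x voxels data; S_group: components x voxels group maps."""
    S = S_group.copy()
    for _ in range(n_iter):
        # Step 1: regress data on current spatial maps -> subject time courses.
        T = Y @ np.linalg.pinv(S)                 # time x components
        # Step 2: regress data on time courses -> subject spatial maps, then
        # promote minimally overlapping maps by shrinking small weights.
        S = soft(np.linalg.pinv(T) @ Y, lam)      # components x voxels
    return T, S

rng = np.random.default_rng(0)
S_true = np.zeros((2, 100)); S_true[0, :30] = 1; S_true[1, 25:60] = 1  # overlap
T_true = rng.standard_normal((120, 2))
Y = T_true @ S_true + 0.1 * rng.standard_normal((120, 100))
T_hat, S_hat = iterative_dual_regression(S_group=S_true + 0.2 *
                                         rng.standard_normal(S_true.shape), Y=Y)
print("recovered map sparsity:", (S_hat == 0).mean())
```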

  6. Involving Users to Improve the Collaborative Logical Framework

    PubMed Central

    2014-01-01

    In order to support collaboration in web-based learning, there is a need for an intelligent support that facilitates its management during the design, development, and analysis of the collaborative learning experience and supports both students and instructors. At the aDeNu research group we have proposed the Collaborative Logical Framework (CLF) to create effective scenarios that support learning through interaction, exploration, discussion, and collaborative knowledge construction. This approach draws on artificial intelligence techniques to support and foster effective student involvement in collaboration. At the same time, the instructors' workload is reduced as some of their tasks—especially those related to the monitoring of the students' behavior—are automated. After introducing the CLF approach, in this paper, we present two formative evaluations with users carried out to improve the design of this collaborative tool and thus enrich the personalized support provided. In the first one, we analyze, following the layered evaluation approach, the results of an observational study with 56 participants. In the second one, we tested the infrastructure to gather emotional data when carrying out another observational study with 17 participants. PMID:24592196

  7. Proving Correctness for Pointer Programs in a Verifying Compiler

    NASA Technical Reports Server (NTRS)

    Kulczycki, Gregory; Singh, Amrinder

    2008-01-01

    This research describes a component-based approach to proving the correctness of programs involving pointer behavior. The approach supports modular reasoning and is designed to be used within the larger context of a verifying compiler. The approach consists of two parts. When a system component requires the direct manipulation of pointer operations in its implementation, we implement it using a built-in component specifically designed to capture the functional and performance behavior of pointers. When a system component requires pointer behavior via a linked data structure, we ensure that the complexities of the pointer operations are encapsulated within the data structure and are hidden from the client component. In this way, programs that rely on pointers can be verified modularly, without requiring special rules for pointers. The ultimate objective of a verifying compiler is to prove, with as little human intervention as possible, that proposed program code is correct with respect to a full behavioral specification. Full verification for software is especially important for an agency like NASA that is routinely involved in the development of mission critical systems.

  8. Comparison of two correlated ROC curves at a given specificity or sensitivity level

    PubMed Central

    Bantis, Leonidas E.; Feng, Ziding

    2017-01-01

    The receiver operating characteristic (ROC) curve is the most popular statistical tool for evaluating the discriminatory capability of a given continuous biomarker. The need to compare two correlated ROC curves arises when individuals are measured with two biomarkers, which induces paired and thus correlated measurements. Many researchers have focused on comparing two correlated ROC curves in terms of the area under the curve (AUC), which summarizes the overall performance of the marker. However, particular values of specificity may be of interest. We focus on comparing two correlated ROC curves at a given specificity level. We propose parametric approaches, transformations to normality, and nonparametric kernel-based approaches. Our methods can be straightforwardly extended for inference in terms of ROC⁻¹(t). This is of particular interest for comparing the accuracy of two correlated biomarkers at a given sensitivity level. Extensions also involve inference for the AUC and accommodating covariates. We evaluate the robustness of our techniques through simulations, compare to other known approaches and present a real data application involving prostate cancer screening. PMID:27324068
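
    A simple nonparametric stand-in for the comparison described above: estimate each paired marker's sensitivity at a fixed 90% specificity and bootstrap the paired difference (the paper's parametric and kernel-based estimators are more refined).

```python
import numpy as np

def sens_at_spec(healthy, diseased, spec=0.90):
    """Sensitivity at the threshold placing `spec` of controls below it."""
    threshold = np.quantile(healthy, spec)
    return np.mean(diseased > threshold)

rng = np.random.default_rng(0)
n0, n1 = 200, 150
h = rng.standard_normal((n0, 2))                           # controls, two markers
d = rng.standard_normal((n1, 2)) + np.array([1.2, 0.9])    # cases (marker 1 better)

diff = sens_at_spec(h[:, 0], d[:, 0]) - sens_at_spec(h[:, 1], d[:, 1])
boot = [sens_at_spec(h[i0, 0], d[i1, 0]) - sens_at_spec(h[i0, 1], d[i1, 1])
        for i0, i1 in ((rng.integers(0, n0, n0), rng.integers(0, n1, n1))
                       for _ in range(1000))]               # paired resampling
print(f"diff = {diff:.3f}, 95% CI = {np.quantile(boot, [0.025, 0.975])}")
```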

  9. Multi Objective Optimization Using Genetic Algorithm of a Pneumatic Connector

    NASA Astrophysics Data System (ADS)

    Salaam, HA; Taha, Zahari; Ya, TMYS Tuan

    2018-03-01

    The concept of sustainability was first introduced by Dr Gro Harlem Brundtland in the 1980s, promoting the need to preserve today's natural environment for the sake of future generations. Based on this concept, John Elkington proposed an approach to measuring sustainability known as the Triple Bottom Line (TBL). There are three evaluation criteria involved in the TBL approach, namely economics, environmental integrity and social equity. In the manufacturing industry, manufacturing costs measure the economic sustainability of a company in the long term. Environmental integrity is a measure of the impact of manufacturing activities on the environment. Social equity is complicated to evaluate, but when the focus is at the production floor level, the production operator's health can be considered. In this paper, the TBL approach is applied to the manufacturing of a pneumatic nipple hose. The evaluation criteria used are manufacturing costs, environmental impact, ergonomic impact and the energy used for manufacturing. This study involves multi-objective optimization, using a genetic algorithm, of several possible alternatives for the material used in the manufacture of the pneumatic nipple.
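
    The selection step of any multi-objective GA reduces to non-dominated (Pareto) filtering over the four criteria; the sketch below uses placeholder objective models, not the paper's process data, and a random population standing in for GA-evolved candidates:

```python
import numpy as np

rng = np.random.default_rng(0)
candidates = rng.random((200, 3))                   # encoded process parameters

def objectives(x):
    """Placeholder models for the four TBL criteria, all to be minimized."""
    cost = 10 + 5 * x[:, 0] + 2 * x[:, 1]           # manufacturing cost
    env = 3 * (1 - x[:, 0]) + x[:, 2]               # environmental impact
    ergo = 2 * x[:, 1] + (1 - x[:, 2])              # ergonomic load
    energy = 4 * x[:, 0] * x[:, 2] + 1              # manufacturing energy
    return np.column_stack([cost, env, ergo, energy])

F = objectives(candidates)
# Candidate i is dominated if some j is no worse on all criteria and strictly
# better on at least one; the survivors form the Pareto front.
dominated = np.array([np.any(np.all(F <= F[i], axis=1) &
                             np.any(F < F[i], axis=1)) for i in range(len(F))])
pareto = candidates[~dominated]
print(f"{len(pareto)} non-dominated designs out of {len(candidates)}")
```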

  10. Digital phased array beamforming using single-bit delta-sigma conversion with non-uniform oversampling.

    PubMed

    Kozak, M; Karaman, M

    2001-07-01

    Digital beamforming based on oversampled delta-sigma (ΔΣ) analog-to-digital (A/D) conversion can reduce the overall cost, size, and power consumption of phased array front-end processing. The signal resampling involved in dynamic ΔΣ beamforming, however, disrupts synchronization between the modulators and demodulator, causing significant degradation in the signal-to-noise ratio. As a solution to this, we have explored a new digital beamforming approach based on non-uniform oversampling ΔΣ A/D conversion. Using this approach, the echo signals received by the transducer array are sampled at time instants determined by the beamforming timing and then digitized by single-bit ΔΣ A/D conversion prior to the coherent beam summation. The timing information involves a non-uniform sampling scheme employing different clocks at each array channel. The ΔΣ-coded beamsums obtained by adding the delayed 1-bit coded RF echo signals are then processed through a decimation filter to produce the final beamforming outputs. The performance and validity of the proposed beamforming approach are assessed by means of emulations using experimental raw RF data.
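
    A toy single-tone emulation of the signal chain; first-order modulators and integer-sample delays stand in for the paper's non-uniform per-channel clocks:

```python
import numpy as np

def delta_sigma_1bit(x):
    """First-order delta-sigma modulator producing a +/-1 bit stream."""
    acc, bits = 0.0, np.empty_like(x)
    for i, s in enumerate(x):
        acc += s                                 # integrator
        bits[i] = 1.0 if acc >= 0 else -1.0      # 1-bit quantizer
        acc -= bits[i]                           # error feedback
    return bits

osr, n = 32, 4096                                # oversampling ratio, samples
t = np.arange(n)
tone = 0.4 * np.sin(2 * np.pi * t / (osr * 16))  # in-band test tone
delays = [0, 3, 7, 11]                           # beamforming delays (samples)

# Each channel sees a delayed copy of the wavefront; the beamformer applies
# the compensating delay to each 1-bit stream and sums them coherently.
beamsum = sum(np.roll(delta_sigma_1bit(np.roll(tone, d)), -d) for d in delays)
out = np.convolve(beamsum, np.ones(osr) / osr, mode="same")[::osr]  # decimate
print("beamformed RMS (roughly 4 x a single channel):", np.sqrt(np.mean(out**2)))
```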

  11. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt differential-equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10³ cells and 1.2×10⁶ molecules. The model produces cell migration patterns that are comparable to laboratory observations.
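
    The hybrid idea can be sketched in a few lines: agents for the cells, a diffusing continuous field for the chemoattractant (placeholder parameters throughout, not the paper's chemotaxis model):

```python
import numpy as np

rng = np.random.default_rng(0)
field = np.zeros((100, 100)); field[50, 50] = 1000.0   # attractant source (quantity)
cells = rng.uniform(0, 99, size=(200, 2))              # agent positions (row, col)

for step in range(100):
    # Continuous part: explicit finite-difference diffusion of the field.
    lap = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
           np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field)
    field += 0.2 * lap
    field[50, 50] = 1000.0                             # sustained source
    # Agent part: each cell climbs the local gradient with random motility.
    gy, gx = np.gradient(field)
    ij = np.clip(cells.astype(int), 0, 99)
    drift = np.stack([gy[ij[:, 0], ij[:, 1]], gx[ij[:, 0], ij[:, 1]]], axis=1)
    cells = np.clip(cells + 0.5 * np.sign(drift) +
                    rng.normal(0, 0.3, cells.shape), 0, 99)

print("mean distance to source:", np.linalg.norm(cells - 50, axis=1).mean())
```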

  12. Agent-Centric Approach for Cybersecurity Decision-Support with Partial Observability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tipireddy, Ramakrishna; Chatterjee, Samrat; Paulson, Patrick R.

    Generating automated cyber resilience policies for real-world settings is a challenging research problem that must account for uncertainties in system state over time and dynamics between attackers and defenders. In addition to understanding attacker and defender motives and tools, and identifying "relevant" system and attack data, it is also critical to develop rigorous mathematical formulations representing the defender's decision-support problem under uncertainty. Game-theoretic approaches involving cyber resource allocation optimization with Markov decision processes (MDP) have been previously proposed in the literature. Moreover, advancements in reinforcement learning approaches have motivated the development of partially observable stochastic games (POSGs) in various multi-agent problem domains with partial information. Recent advances in cyber-system state space modeling have also generated interest in the potential applicability of POSGs for cybersecurity. However, as is the case in strategic card games such as poker, research challenges in using game-theoretic approaches for practical cyber defense applications include: 1) solving for equilibrium and designing efficient algorithms for large-scale, general problems; 2) establishing mathematical guarantees that equilibrium exists; 3) handling the possible existence of multiple equilibria; and 4) exploitation of opponent weaknesses. Inspired by advances in solving strategic card games, while acknowledging practical challenges associated with the use of game-theoretic approaches in cyber settings, this paper proposes an agent-centric approach for cybersecurity decision-support with partial system state observability.

  13. Recommended approaches in the application of ...

    EPA Pesticide Factsheets

    ABSTRACT: Only a fraction of chemicals in commerce have been fully assessed for their potential hazards to human health, due to the difficulties involved in conventional regulatory tests. It has recently been proposed that quantitative transcriptomic data can be used to determine a benchmark dose (BMD) and estimate a point of departure (POD). Several studies have shown that transcriptional PODs correlate with PODs derived from analysis of pathological changes, but there is no consensus on how the genes used to derive a transcriptional POD should be selected. Because of the very large number of unrelated genes in gene expression data, the process of selecting subsets of informative genes is a major challenge. We used published microarray data from studies on rats exposed orally to multiple doses of six chemicals for 5, 14, 28, and 90 days. We evaluated eight different approaches to selecting genes for POD derivation and compared them to three previously proposed approaches. Transcriptional BMDs derived using these 11 approaches were compared with PODs derived from apical data that might be used in a human health risk assessment. We found that transcriptional benchmark dose values for all 11 approaches were remarkably aligned with different apical PODs, while a subset of between 3 and 8 of the approaches met standard statistical criteria across the 5-, 14-, 28-, and 90-day time points and thus qualify as effective estimates of apical PODs. Our r

  14. Feature weight estimation for gene selection: a local hyperlinear learning approach

    PubMed Central

    2014-01-01

    Background Modeling high-dimensional data involving thousands of variables is particularly important for gene expression profiling experiments; nevertheless, it remains a challenging task. One of the challenges is to implement an effective method for selecting a small set of relevant genes buried in high-dimensional irrelevant noise. RELIEF is a popular and widely used approach for feature selection owing to its low computational cost and high accuracy. However, RELIEF-based methods suffer from instability, especially in the presence of noisy and/or high-dimensional outliers. Results We propose an innovative feature weighting algorithm, called LHR, to select informative genes from highly noisy data. LHR is based on RELIEF for feature weighting using classical margin maximization. The key idea of LHR is to estimate the feature weights through local approximation rather than global measurement, which is typically used in existing methods. The weights obtained by our method are very robust to degradation of noisy features, even those with vast dimensions. To demonstrate the performance of our method, extensive experiments involving classification tests have been carried out on both synthetic and real microarray benchmark datasets by combining the proposed technique with standard classifiers, including the support vector machine (SVM), k-nearest neighbor (KNN), hyperplane k-nearest neighbor (HKNN), linear discriminant analysis (LDA) and naive Bayes (NB). Conclusion Experiments on both synthetic and real-world datasets demonstrate the superior performance of the proposed feature selection method combined with supervised learning in three aspects: 1) high classification accuracy, 2) excellent robustness to noise and 3) good stability across various classification algorithms. PMID:24625071
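
    For orientation, the classical RELIEF update that LHR refines is shown below; LHR replaces the nearest hit/miss samples with points on local hyperplanes, which is not reproduced here:

```python
import numpy as np

def relief_weights(X, y):
    """Classical RELIEF: reward features separating nearest misses,
    penalize features differing from nearest hits."""
    n, d = X.shape
    w = np.zeros(d)
    for i in range(n):
        dist = np.linalg.norm(X - X[i], axis=1)
        dist[i] = np.inf                                     # exclude self
        hit = np.argmin(np.where(y == y[i], dist, np.inf))   # nearest same class
        miss = np.argmin(np.where(y != y[i], dist, np.inf))  # nearest other class
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # only the first two features informative
w = relief_weights(X, y)
print("top-ranked features:", np.argsort(w)[::-1][:5])
```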

  15. Cardiorenal Syndrome in Western Countries: Epidemiology, Diagnosis and Management Approaches.

    PubMed

    Ronco, Claudio; Di Lullo, Luca

    2017-01-01

    It is well established that a large number of hospitalized patients present various degrees of heart and kidney dysfunction; primary disease of the heart or kidney often involves dysfunction or injury to the other. Based on the above-cited organ cross-talk, the term cardiorenal syndrome (CRS) was proposed. Although CRS originally referred to the abrupt worsening of kidney function following heart injury, it is now clearly established that it can also describe the negative effects of impaired renal function on the heart and circulation. The historical lack of a clear syndrome definition and the complexity of the diseases contributed to a waste of precious time, especially concerning diagnosis and therapeutic strategies. The effective classification of CRS proposed in a Consensus Conference by the Acute Dialysis Quality Group essentially divides CRS into two main groups, cardiorenal and renocardiac CRS, on the basis of the primum movens of disease (cardiac or renal); both cardiorenal and renocardiac CRS are then divided into acute and chronic according to disease onset. Type 5 CRS integrates all cardiorenal involvement induced by systemic disease. Prevalence and incidence data show a widespread increase of CRS, also due to an increasing incidence of acute and chronic cardiovascular disease, such as acute decompensated heart failure, arterial hypertension and valvular heart disease. Patients with chronic kidney disease present various degrees of cardiovascular involvement, especially due to chronic inflammatory status, volume and pressure overload and secondary hyperparathyroidism leading to a higher incidence of calcific heart disease. The following review will focus on the main aspects (epidemiology, risk factors, diagnostic tools and protocols, therapeutic approaches) of CRS in Western countries (Europe and the United States).

  16. Infant Joint Attention, Neural Networks and Social Cognition

    PubMed Central

    Mundy, Peter; Jarrold, William

    2010-01-01

    Neural network models of attention can provide a unifying approach to the study of human cognitive and emotional development (Posner & Rothbart, 2007). In this paper we argue that a neural-network approach to the infant development of joint attention can inform our understanding of the nature of human social learning, symbolic thought processes and social cognition. At its most basic, joint attention involves the capacity to coordinate one's own visual attention with that of another person. We propose that joint attention development involves increments in the capacity to engage in simultaneous or parallel processing of information about one's own attention and the attention of other people. Infant practice with joint attention is both a consequence and an organizer of the development of a distributed and integrated brain network involving frontal and parietal cortical systems. This executive distributed network first serves to regulate the capacity of infants to respond to and direct the overt behavior of other people in order to share experience with others through the social coordination of visual attention. In this paper we describe this parallel and distributed neural network model of joint attention development and discuss two hypotheses that stem from this model. One is that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life. We also propose that with development joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations. As this occurs, the executive joint attention network makes vital contributions to the development of human symbolic thinking and social cognition. PMID:20884172

  17. Involving seldom-heard groups in a PPI process to inform the design of a proposed trial on the use of probiotics to prevent preterm birth: a case study.

    PubMed

    Rayment, Juliet; Lanlehin, Rosemary; McCourt, Christine; Husain, Shahid M

    2017-01-01

    When designing clinical trials it is important to involve members of the public, who can provide a view on what may encourage or prevent people participating and on what matters to them. This is known as Public and Patient Involvement (PPI). People from minority ethnic groups are often less likely to take part in clinical trials, but it is important to ensure they are able to participate fully so that health research and its findings are relevant to a wide population. We are preparing to conduct a randomised controlled trial (RCT) to test whether taking probiotic capsules can play a role in preventing preterm birth. Women from some minority ethnic groups, for example women from West Africa, and those who are from low-income groups are more likely to suffer preterm births. Preterm birth can lead to extra costs to health services and psychosocial costs for families. In this article we describe how we engaged women in discussion about the design of the planned trial, and how we aim to use our findings to ensure the trial is workable and beneficial to women, as well as to further engage service users in the future development of the trial. Four socially and ethnically diverse groups of women in East London took part in discussions about the trial and contributed their ideas and concerns. These discussions have helped to inform and improve the design of a small practice or 'pilot' trial to test the recruitment in a 'real life' setting, as well as encourage further PPI involvement for the future full-scale trial. Background Patient and public involvement (PPI) is an important tool in approaching research challenges. However, involvement of socially and ethnically diverse populations remains limited and practitioners need effective methods of involving a broad section of the population in planning and designing research. Methods In preparation for the development of a pilot randomised controlled trial (RCT) on the use of probiotics to prevent preterm birth, we conducted a public consultation exercise in a socially disadvantaged and ethnically diverse community. The consultation aimed to meet and engage local service users in considering the acceptability of the proposed protocol, and to encourage their participation in future and ongoing patient and public involvement activities. Four discussion groups were held in the community with mothers of young children within the proposed trial region, using an inclusive approach that incorporated a modified version of the Nominal Group Technique (NGT). Bringing the consultation to the community supported the involvement of often seldom-heard participants, such as those from minority ethnic groups. Results The women involved expressed a number of concerns about the proposed protocol, including adherence to the probiotic supplement regimen and randomisation. The proposal for the RCT in itself was perceived as confirmation that probiotic supplements had potentially beneficial effects, but also that they had potentially harmful side-effects. The complexity of the women's responses provided greater insights into the challenges of even quite simple trial designs and enabled the research team to take these concerns into account while planning the pilot trial. Conclusions The use of the NGT method allowed for a consultation of a population traditionally less likely to participate in medical research. A carefully facilitated PPI exercise can allow members to express unanticipated concerns that may not have been elicited by a survey method. Findings from such exercises can be utilised to improve clinical trial design, provide insight into the feasibility of trials, and enable engagement of often excluded population groups.

  18. Efficient least angle regression for identification of linear-in-the-parameters models

    PubMed Central

    Beach, Thomas H.; Rezgui, Yacine

    2017-01-01

    Least angle regression, as a promising model selection method, differentiates itself from conventional stepwise and stagewise methods in that it is neither too greedy nor too slow. It is closely related to L1-norm optimization, which has the advantage of low prediction variance, sacrificing part of the model bias property in order to enhance model generalization capability. In this paper, we propose an efficient least angle regression algorithm for model selection for a large class of linear-in-the-parameters models, with the purpose of accelerating the model selection process. The entire algorithm works completely in a recursive manner, where the correlations between model terms and residuals, the evolving directions and other pertinent variables are derived explicitly and updated successively at every subset selection step. The model coefficients are only computed when the algorithm finishes. The direct involvement of matrix inversions is thereby avoided. A detailed computational complexity analysis indicates that the proposed algorithm possesses significant computational efficiency, compared with the original approach where the well-known efficient Cholesky decomposition is involved in solving least angle regression. Three artificial and real-world examples are employed to demonstrate the effectiveness, efficiency and numerical stability of the proposed algorithm. PMID:28293140
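
    The paper's recursive formulation is not reproduced here, but the baseline it accelerates is available off the shelf; this shows least angle regression selecting the terms of a linear-in-the-parameters model:

```python
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))           # candidate model terms
beta = np.zeros(30); beta[[2, 7, 15]] = [1.5, -2.0, 0.8]
y = X @ beta + 0.1 * rng.standard_normal(200)

# LARS adds the term most correlated with the residual at each step.
model = Lars(n_nonzero_coefs=3).fit(X, y)
print("selected terms:", np.flatnonzero(model.coef_))   # expect [2, 7, 15]
```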

  19. Estimation and tracking of AP-diameter of the inferior vena cava in ultrasound images using a novel active circle algorithm.

    PubMed

    Karami, Ebrahim; Shehata, Mohamed S; Smith, Andrew

    2018-05-04

    Medical research suggests that the anterior-posterior (AP) diameter of the inferior vena cava (IVC) and its associated temporal variation, as imaged by bedside ultrasound, are useful in guiding fluid resuscitation of the critically ill patient. Unfortunately, indistinct edges and gaps in vessel walls are frequently present, which impede accurate estimation of the IVC AP-diameter for both human operators and segmentation algorithms. The majority of research involving use of the IVC to guide fluid resuscitation involves manual measurement of the maximum and minimum AP-diameter as it varies over time. This effort proposes using a time-varying circle fitted inside the typically ellipsoidal IVC as an efficient, consistent and novel approach to tracking and approximating the AP-diameter even in the context of poor image quality. In this active-circle algorithm, a novel evolution functional is proposed and shown to be a useful tool for ultrasound image processing. The proposed algorithm is compared with expert manual measurement and state-of-the-art relevant algorithms. It is shown that the algorithm outperforms other techniques and performs very close to manual measurement. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Device and circuit analysis of a sub 20 nm double gate MOSFET with gate stack using a look-up-table-based approach

    NASA Astrophysics Data System (ADS)

    Chakraborty, S.; Dasgupta, A.; Das, R.; Kar, M.; Kundu, A.; Sarkar, C. K.

    2017-12-01

    In this paper, we explore the possibility of mapping devices designed in a TCAD environment to their modeled versions developed in the Cadence Virtuoso environment using a look-up table (LUT) approach. Circuit simulation of newly designed devices in a TCAD environment is a very slow and tedious process involving complex scripting. Hence, the LUT-based modeling approach has been proposed as a faster and easier alternative in the Cadence environment. The LUTs are prepared by extracting data from the device characteristics obtained from device simulation in TCAD. A comparative study is shown between the TCAD simulation and the LUT-based alternative to showcase the accuracy of the modeled devices. Finally, the look-up-table approach is used to evaluate the performance of circuits implemented using a 14 nm nMOSFET.
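
    A minimal sketch of the LUT flow, with an analytic placeholder standing in for exported TCAD sweeps: tabulate drain current over (Vgs, Vds), then interpolate the table as a circuit simulator would.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

vgs = np.linspace(0.0, 0.8, 17)                      # gate sweep from TCAD
vds = np.linspace(0.0, 0.8, 17)                      # drain sweep from TCAD
VG, VD = np.meshgrid(vgs, vds, indexing="ij")
# Placeholder I_D surface (square-law with saturation), not real TCAD data.
ids_table = 1e-3 * np.maximum(VG - 0.3, 0) ** 2 * np.tanh(5 * VD)

ids_lut = RegularGridInterpolator((vgs, vds), ids_table, method="linear")
print("I_D at Vgs=0.55 V, Vds=0.40 V:", ids_lut([[0.55, 0.40]])[0])
```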

  1. MIT-Skywalker: On the use of a markerless system.

    PubMed

    Goncalves, Rogerio S; Hamilton, Taya; Krebs, Hermano I

    2017-07-01

    This paper describes our efforts to employ the Microsoft Kinect as a low-cost vision control system for the MIT-Skywalker, a robotic gait rehabilitation device. The Kinect enables an alternative markerless solution to control the MIT-Skywalker and allows a more user-friendly set-up. A study involving eight healthy subjects and two stroke survivors using the MIT-Skywalker device demonstrates the advantages and challenges of this newly proposed approach.

  2. Advancing Detached-Eddy Simulation

    DTIC Science & Technology

    2007-01-01

    fluxes leads to an improvement in the stability of the solution. This matrix is solved iteratively using a symmetric Gauss-Seidel procedure. Newton's sub...model (TLM) is a zonal approach, proposed by Balaras and Benocci (5) and Balaras et al. (4). The method involved the solution of filtered Navier...LES mesh. The method was subsequently used by Cabot (6) and Diurno et al. (7) to obtain the solution of the flow over a backward-facing step and by

  3. Silicon-on-Sapphire Waveguides for Widely Tunable Coherent Mid-IR Sources

    DTIC Science & Technology

    2013-09-01

    BACKGROUND: The mid-infrared (IR) range between 3 μm...leveraging existing sources in telecom and short-wave infrared (SWIR) bands. It has been demonstrated using silicon waveguides on silicon-on-silicon...reported [3]. The approach proposed under this project involves the four-wave mixing of a pump at a SWIR wavelength around 2 μm and signals in the near

  4. Synthesis of a mesoporous single crystal Ga2O3 nanoplate with improved photoluminescence and high sensitivity in detecting CO.

    PubMed

    Yan, Shicheng; Wan, Lijuan; Li, Zhaosheng; Zhou, Yong; Zou, Zhigang

    2010-09-14

    A new approach is proposed to synthesize a mesoporous single crystal Ga₂O₃ nanoplate by heating a single crystal nanoplate of GaOOH, which involves an ion exchange between KGaO₂ and CH₃COOH at room temperature for the formation of GaOOH and pseudomorphic and topotactic phase transformation from GaOOH to Ga₂O₃.

  5. Improved result on stability analysis of discrete stochastic neural networks with time delay

    NASA Astrophysics Data System (ADS)

    Wu, Zhengguang; Su, Hongye; Chu, Jian; Zhou, Wuneng

    2009-04-01

    This Letter investigates the problem of exponential stability for discrete stochastic time-delay neural networks. By defining a novel Lyapunov functional, an improved delay-dependent exponential stability criterion is established in terms of the linear matrix inequality (LMI) approach. Meanwhile, the computational complexity of the newly established stability condition is reduced because fewer variables are involved. A numerical example is given to illustrate the effectiveness and the benefits of the proposed method.
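
    For readers unfamiliar with the LMI machinery, the feasibility check below certifies stability of a toy discrete-time system via the basic Lyapunov inequality AᵀPA − P ≺ 0; the paper's delay-dependent stochastic criterion involves a far richer LMI. It assumes cvxpy with an SDP-capable solver is installed.

```python
import cvxpy as cp
import numpy as np

A = np.array([[0.8, 0.2],
              [-0.1, 0.7]])                 # nominal discrete-time dynamics
P = cp.Variable((2, 2), symmetric=True)    # Lyapunov matrix to be found
eps = 1e-6
constraints = [P >> eps * np.eye(2),                       # P positive definite
               A.T @ P @ A - P << -eps * np.eye(2)]        # strict decrease
prob = cp.Problem(cp.Minimize(0), constraints)             # pure feasibility
prob.solve()
print("LMI feasible (exponentially stable):", prob.status == cp.OPTIMAL)
```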

  6. Feasibility Study of Shoreline Protection and Lake Level Regulation for Lake Ontario. Reconnaissance Report. Volume I. Main Report.

    DTIC Science & Technology

    1981-11-01

    ...agencies to assess and document the effect of proposed actions on the environment in an Environmental Impact Statement (EIS). In compliance with this...these being National Economic Development (NED) and Environmental Quality (EQ). It also specifies the range of impacts that must be assessed, and

  7. Localization of synchronous cortical neural sources.

    PubMed

    Zerouali, Younes; Herry, Christophe L; Jemel, Boutheina; Lina, Jean-Marc

    2013-03-01

    Neural synchronization is a key mechanism in a wide variety of brain functions, such as cognition, perception, and memory. The high temporal resolution achieved by EEG recordings allows the study of the dynamical properties of synchronous patterns of activity at a very fine temporal scale, but with very low spatial resolution. Spatial resolution can be improved by retrieving the neural sources of the EEG signal, thus solving the so-called inverse problem. Although many methods have been proposed to solve the inverse problem and localize brain activity, few of them target synchronous brain regions. In this paper, we propose a novel algorithm aimed at localizing specifically synchronous brain regions and reconstructing the time course of their activity. Using multivariate wavelet ridge analysis, we extract signals capturing the synchronous events buried in the EEG and then solve the inverse problem on these signals. Using simulated data, we compare the source reconstruction accuracy achieved by our method to a standard source reconstruction approach. We show that the proposed method performs better across a wide range of noise levels and source configurations. In addition, we applied our method to a real dataset and successfully identified cortical areas involved in the functional network underlying visual face perception. We conclude that the proposed approach allows an accurate localization of synchronous brain regions and a robust estimation of their activity.

  8. A new approach of watermarking technique by means multichannel wavelet functions

    NASA Astrophysics Data System (ADS)

    Agreste, Santa; Puccio, Luigia

    2012-12-01

    Digital piracy involving images, music, movies, books, and so on is a legal problem for which no solution has yet been found. It is therefore crucial to create and develop methods and numerical algorithms to solve the copyright problem. In this paper we focus attention on a new approach to watermarking applied to digital color images. Our aim is to describe the implemented watermarking algorithm, based on multichannel wavelet functions with multiplicity r = 3, called MCWM 1.0. We report extensive experiments and some important numerical results showing the robustness of the proposed algorithm to geometrical attacks.
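
    A generic single-channel wavelet watermarking sketch; standard libraries do not provide multichannel wavelets with multiplicity r = 3, so an ordinary Haar DWT via PyWavelets stands in for MCWM's transform:

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.random((128, 128))                       # host image (placeholder)
mark = rng.choice([-1.0, 1.0], size=(64, 64))        # binary watermark

# Embed: add a scaled watermark to one detail sub-band, then reconstruct.
LL, (LH, HL, HH) = pywt.dwt2(image, "haar")
alpha = 0.02
watermarked = pywt.idwt2((LL, (LH + alpha * mark, HL, HH)), "haar")

# Detect: re-decompose and correlate the detail band with the known mark.
_, (LH_w, _, _) = pywt.dwt2(watermarked, "haar")
score = np.sum((LH_w - LH) * mark) / (alpha * mark.size)
print("detection score (close to 1 when the mark is present):", score)
```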

  9. Computerized decision support system for mass identification in breast using digital mammogram: a study on GA-based neuro-fuzzy approaches.

    PubMed

    Das, Arpita; Bhattacharya, Mahua

    2011-01-01

    In the present work, the authors have developed a treatment planning system implementing genetic-based neuro-fuzzy approaches for accurate analysis of the shape and margin of tumor masses appearing in the breast in digital mammograms. It is obvious that a complicated structure invites the problem of over-learning and misclassification. In the proposed methodology, a genetic algorithm (GA) has been used to search for effective input feature vectors, combined with an adaptive neuro-fuzzy model for the final classification of different boundaries of tumor masses. The study involves 200 digitized mammograms from the MIAS and other databases and has shown an 86% correct classification rate.

  10. Refined approach for quantification of in vivo ischemia-reperfusion injury in the mouse heart

    PubMed Central

    Medway, Debra J.; Schulz-Menger, Jeanette; Schneider, Jurgen E.; Neubauer, Stefan; Lygate, Craig A.

    2009-01-01

    Cardiac ischemia-reperfusion experiments in the mouse are important in vivo models of human disease. Infarct size is a particularly important scientific readout as virtually all cardiocirculatory pathways are affected by it. Therefore, such measurements must be exact and valid. The histological analysis, however, remains technically challenging, and the resulting quality is often unsatisfactory. For this report we have scrutinized each step involved in standard double-staining histology. We have tested published approaches and challenged their practicality. As a result, we propose an improved and streamlined protocol, which consistently yields high-quality histology, thereby minimizing experimental noise and group sizes. PMID:19820193

  11. Cognitive neuroscience of obsessive-compulsive disorder.

    PubMed

    Stern, Emily R; Taylor, Stephan F

    2014-09-01

    Cognitive neuroscience investigates neural responses to cognitive and emotional probes, an approach that has yielded critical insights into the neurobiological mechanisms of psychiatric disorders. This article reviews some of the major findings from neuroimaging studies using a cognitive neuroscience approach to investigate obsessive-compulsive disorder (OCD). It evaluates the consistency of results and interprets findings within the context of OCD symptoms, and proposes a model of OCD involving inflexibility of internally focused cognition. Although further research is needed, this body of work probing cognitive-emotional processes in OCD has already shed considerable light on the underlying mechanisms of the disorder. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Control of Flexible Systems in the Presence of Failures

    NASA Technical Reports Server (NTRS)

    Magahami, Peiman G.; Cox, David E.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    Control of flexible systems under degradation or failure of sensors/actuators is considered. A Linear Matrix Inequality framework is used to synthesize H∞-based controllers, which provide good disturbance rejection while being capable of tolerating real parameter uncertainties in the system model, as well as potential degradation or failure of the control system hardware. In this approach, a one-at-a-time failure scenario is considered, wherein no more than one sensor or actuator is allowed to fail at any given time. A numerical example involving control synthesis for a two-dimensional flexible system is presented to demonstrate the feasibility of the proposed approach.

  13. Robust Fault Detection and Isolation for Stochastic Systems

    NASA Technical Reports Server (NTRS)

    George, Jemin; Gregory, Irene M.

    2010-01-01

    This paper outlines the formulation of a robust fault detection and isolation scheme that can precisely detect and isolate simultaneous actuator and sensor faults for uncertain linear stochastic systems. The proposed scheme, based on a discontinuous robust observer approach, is able to distinguish between model uncertainties and actuator failures and therefore eliminates the problem of false alarms. Since the proposed approach involves precise reconstruction of sensor faults, it can also be used for sensor fault identification and the reconstruction of true outputs from faulty sensor outputs. Simulation results presented here validate the effectiveness of the robust fault detection and isolation system.
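
    As a simplified stand-in for the discontinuous robust observer, the sketch below runs a plain Luenberger observer and flags an injected actuator fault when the output residual exceeds a threshold; all matrices, noise levels and thresholds are illustrative.

```python
# Residual-based fault detection sketch with a Luenberger observer
# (a linear simplification of the paper's discontinuous robust observer).
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
L = np.array([[2.0], [3.0]])          # observer gain (assumed design)
dt, T, threshold = 0.01, 1000, 0.05
x = np.zeros((2, 1)); xh = np.zeros((2, 1))

for k in range(T):
    u = np.array([[np.sin(0.01 * k)]])
    fault = 0.5 if k > 600 else 0.0   # actuator fault injected mid-run
    x = x + dt * (A @ x + B @ (u + fault))
    y = C @ x + 0.001 * np.random.randn(1, 1)      # noisy measurement
    xh = xh + dt * (A @ xh + B @ u + L @ (y - C @ xh))
    r = abs((y - C @ xh).item())      # output residual
    if r > threshold and k % 100 == 0:
        print(f"step {k}: residual {r:.3f} -> fault flagged")
```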

  14. [Obesity psychological treatment: beyond cognitive and behavioral therapy].

    PubMed

    Volery, M; Bonnemain, A; Latino, A; Ourrad, N; Perroud, A

    2015-03-25

    The psychological assessment of patients with obesity aims to identify the factors that maintain excess weight, such as eating disorders or anxio-depressive disorders. Psychotherapy supports better weight management. Cognitive-behavioral therapy has proven effective in the treatment of obesity. New psychotherapeutic approaches are being explored. Hypnosis and mindfulness are proposed for the management of emotions and stress. An approach targeting body image disorder decreases body dissatisfaction. When post-traumatic stress disorder is involved, EMDR (Eye Movement Desensitization and Reprocessing) outperforms other types of therapy. Family therapy is indicated when the family circle is affected. Psychological difficulties should receive specific care.

  15. AIDS and human sexuality.

    PubMed

    Smith, L L; Lathrop, L M

    1993-01-01

    The sexual behaviours placing an individual at risk for HIV infection are those also placing him/her at risk for gonorrhoea, syphilis, hepatitis B, chlamydia and unplanned pregnancy. This article proposes that approaches to HIV prevention must be included within a broad context of human sexuality. Addressing disease prevention without considering people's relationships and their social, behavioural and emotional needs is futile. Compartmentalization, denial of risk by various populations, and societal barriers are all factors to be overcome in the fight against HIV transmission. Specific strategies involved in a comprehensive approach are outlined under the categories of predisposing, enabling and reinforcing factors contributing to healthy sexual behaviour.

  16. A new approach in psychotherapy: ACT (acceptance and commitment therapy).

    PubMed

    McHugh, Louise

    2011-09-01

    Acceptance and commitment therapy (ACT) focuses on enhancing psychological flexibility in the service of achieving core life values. One thing that distinguishes ACT from other psychotherapies is its grounding in empirical behavioural science. The results of the latter suggest that the capacity for human language can produce seriously negative psychological effects under certain circumstances. ACT is a therapeutic approach in which the negative effects of human language are undermined so as to support flexible values based living. ACT therapeutic work involves six key processes proposed under the "hexaflex" model. ACT has received considerable empirical support at a number of different levels of analysis.

  17. Drench effects of media portrayal of fatal virus disease on health locus of control beliefs.

    PubMed

    Bahk, C M

    2001-01-01

    Drawing on the notion of the drench hypothesis proposed by Greenberg (1988), the author proposes a preliminary theoretical framework to explain "drenching" effects of dramatic media. Three drench variables (perceived realism, role identification, and media involvement) were identified and tested regarding their role in mediating the impact of virus disease portrayals on health locus-of-control belief orientations. Participants in the experimental condition watched the movie Outbreak (a portrayal of an outbreak of a deadly virus disease). Perceived realism, role identification, and media involvement were measured concerning the movie depiction of the virus disease. The findings indicate that the dramatized portrayal significantly weakened the viewers' beliefs in self-controllability over health and strengthened their beliefs in chance outcomes of health. Beliefs in provider control over health were affected by the viewers' perception of realism regarding the movie portrayals. Effects of role identification were different between male and female viewers. The results are discussed in relation to drench analysis as a theoretical approach to media effects.

  18. Nucleophosmin integrates within the nucleolus via multi-modal interactions with proteins displaying R-rich linear motifs and rRNA.

    PubMed

    Mitrea, Diana M; Cika, Jaclyn A; Guy, Clifford S; Ban, David; Banerjee, Priya R; Stanley, Christopher B; Nourse, Amanda; Deniz, Ashok A; Kriwacki, Richard W

    2016-02-02

    The nucleolus is a membrane-less organelle formed through liquid-liquid phase separation of its components from the surrounding nucleoplasm. Here, we show that nucleophosmin (NPM1) integrates within the nucleolus via a multi-modal mechanism involving multivalent interactions with proteins containing arginine-rich linear motifs (R-motifs) and ribosomal RNA (rRNA). Importantly, these R-motifs are found in canonical nucleolar localization signals. Based on a novel combination of biophysical approaches, we propose a model for the molecular organization within liquid-like droplets formed by the N-terminal domain of NPM1 and R-motif peptides, thus providing insights into the structural organization of the nucleolus. We identify multivalency of acidic tracts and folded nucleic acid binding domains, mediated by N-terminal domain oligomerization, as structural features required for phase separation of NPM1 with other nucleolar components in vitro and for localization within mammalian nucleoli. We propose that one mechanism of nucleolar localization involves phase separation of proteins within the nucleolus.

  19. What is a new drug worth? An innovative model for performance-based pricing.

    PubMed

    Dranitsaris, G; Dorward, K; Owens, R C; Schipper, H

    2015-05-01

    This article focuses on a novel method to derive prices for new pharmaceuticals by making price a function of drug performance. We briefly review current models for determining price for a new product and discuss alternatives that have historically been favoured by various funding bodies. The progressive approach to drug pricing, proposed herein, may better address the views and concerns of multiple stakeholders in a developed healthcare system by acknowledging and incorporating input from disparate parties via comprehensive and successive negotiation stages. In proposing a valid construct for performance-based pricing, the following model seeks to achieve several crucial objectives: earlier and wider access to new treatments; improved transparency in drug pricing; multi-stakeholder involvement through phased pricing negotiations; recognition of innovative product performance and latent changes in value; an earlier and more predictable return for developers without sacrificing total return on investment (ROI); more involved and informed risk sharing by the end-user. © 2014 John Wiley & Sons Ltd.

  20. A fuzzy stochastic framework for managing hydro-environmental and socio-economic interactions under uncertainty

    NASA Astrophysics Data System (ADS)

    Subagadis, Yohannes Hagos; Schütze, Niels; Grundmann, Jens

    2014-05-01

    The growing interconnectedness of hydro-environmental and socio-economic systems poses profound challenges for water management decision making. In this contribution, we present a fuzzy stochastic approach to solve a set of decision making problems which involve hydrologically, environmentally, and socio-economically motivated criteria subject to uncertainty and ambiguity. The proposed methodological framework combines objective and subjective criteria in a decision making procedure for obtaining an acceptable ranking of water resources management alternatives under different types of uncertainty (subjective/objective) and heterogeneous information (quantitative/qualitative) simultaneously. The first step of the proposed approach involves evaluating the performance of alternatives with respect to different types of criteria. The ratings of alternatives with respect to objective and subjective criteria are evaluated by simulation-based optimization and fuzzy linguistic quantifiers, respectively. Subjective and objective uncertainties related to the input information are handled by linking fuzziness and randomness together. Fuzzy decision making captures the linguistic uncertainty, and a Monte Carlo simulation process is used to propagate the stochastic uncertainty. Within this framework, the overall performance of each alternative is calculated using an Ordered Weighted Averaging (OWA) aggregation operator accounting for decision makers' experience and opinions. Finally, ranking is achieved by conducting pair-wise comparison of management alternatives. This is done on the basis of the risk defined by the probability of obtaining an acceptable ranking and the mean difference in total performance for the pair of management alternatives. The proposed methodology is tested in a real-world hydrosystem to find effective and robust intervention strategies for the management of a coastal aquifer system affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. The results show that the approach provides useful support for robust decision making and is sensitive to the decision makers' degree of optimism.
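
    The aggregation step can be illustrated directly. The sketch below builds OWA weights from a linguistic quantifier Q(r) = r**alpha (alpha < 1 leans optimistic) and aggregates per-criterion scores for one alternative; the quantifier choice and scores are illustrative, not those of the study.

```python
# Minimal Ordered Weighted Averaging (OWA) sketch.
import numpy as np

def owa(scores, alpha=0.5):
    s = np.sort(scores)[::-1]                 # order values, best first
    n = len(s)
    q = (np.arange(1, n + 1) / n) ** alpha    # quantifier Q(r) = r**alpha
    w = q - np.concatenate(([0.0], q[:-1]))   # w_i = Q(i/n) - Q((i-1)/n)
    return float(w @ s)

# overall performance of one management alternative on four criteria
print(owa(np.array([0.9, 0.4, 0.7, 0.6])))
```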

  1. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.

    PubMed

    Das, Rahul Deb; Winter, Stephan

    2016-11-23

    Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries as well as for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited for real-time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. In order to address all these challenges, a novel state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that it can work in real-time, near-real-time and offline modes and in different environmental conditions with their varying quality of sensor traces. The results show the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation.
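
    A minimal sketch of the atomic-segment idea, assuming speed is the only sensed quantity: fixed windows are classified into states and consecutive windows of the same state are merged into segments. The state rules and thresholds are invented for illustration, not the paper's exact procedure.

```python
# Bottom-up segmentation sketch: atomic windows -> states -> merged segments.
import numpy as np

def atomic_states(speed, win=10, walk_max=2.0, veh_min=6.0):
    states = []
    for i in range(0, len(speed) - win + 1, win):
        v = np.median(speed[i:i + win])     # robust per-window speed
        states.append("walk" if v < walk_max
                      else "vehicle" if v > veh_min else "other")
    return states

def merge(states):
    segs, start = [], 0
    for i in range(1, len(states) + 1):
        if i == len(states) or states[i] != states[start]:
            segs.append((start, i, states[start]))   # [start, end) in windows
            start = i
    return segs

speed = np.concatenate([np.full(50, 1.2), np.full(80, 9.0), np.full(30, 1.0)])
print(merge(atomic_states(speed)))   # walk -> vehicle -> walk segments
```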

  3. Fusion of shallow and deep features for classification of high-resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Gao, Lang; Tian, Tian; Sun, Xiao; Li, Hang

    2018-02-01

    Effective spectral and spatial pixel description plays a significant role in the classification of high resolution remote sensing images. Current approaches to pixel-based feature extraction are of two main kinds: one includes the widely used principal component analysis (PCA) and gray level co-occurrence matrix (GLCM) as representatives of shallow spectral and shape features, and the other refers to deep learning-based methods, which employ deep neural networks and have greatly improved classification accuracy. However, the traditional features are insufficient to depict the complex distributions of high resolution images, while the deep features demand plenty of samples to train the network, otherwise overfitting easily occurs if only limited samples are involved in the training. In view of the above, we propose a GLCM-based convolutional neural network (CNN) approach to extract features and implement classification for high resolution remote sensing images. The employment of GLCM is able to represent the original images and eliminate redundant information and undesired noise. Meanwhile, taking shallow features as the input of the deep network contributes to better guidance and interpretability. In consideration of the limited amount of samples, strategies such as L2 regularization and dropout are used to prevent overfitting. The fine-tuning strategy is also used in our study to reduce training time and further enhance the generalization performance of the network. Experiments with popular data sets such as the PaviaU data validate that our proposed method leads to a performance improvement compared to the individual approaches involved.
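
    The shallow-feature stage can be sketched with scikit-image, whose GLCM helpers are named graycomatrix/graycoprops from version 0.19 onward (greycomatrix/greycoprops in older releases); the random patch stands in for a remote sensing tile, and the CNN stage is omitted.

```python
# GLCM texture features per image patch with scikit-image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(1)
patch = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)   # stand-in patch

glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                    levels=64, symmetric=True, normed=True)
feats = np.hstack([graycoprops(glcm, p).ravel()
                   for p in ("contrast", "homogeneity", "energy", "correlation")])
print(feats.shape)   # one shallow feature vector per patch, ready for the CNN
```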

  4. Improving semi-automated segmentation by integrating learning with active sampling

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Okada, Kazunori; Brown, Matthew

    2012-02-01

    Interactive segmentation algorithms such as GrowCut usually require quite a few user interactions to perform well, and have poor repeatability. In this study, we developed a novel technique to boost the performance of the interactive segmentation method GrowCut involving: 1) a novel "focused sampling" approach for supervised learning, as opposed to conventional random sampling; 2) boosting GrowCut using the machine learned results. We applied the proposed technique to the glioblastoma multiforme (GBM) brain tumor segmentation, and evaluated on a dataset of ten cases from a multiple center pharmaceutical drug trial. The results showed that the proposed system has the potential to reduce user interaction while maintaining similar segmentation accuracy.
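
    One possible reading of "focused sampling" is drawing training pixels from a band around an initial rough segmentation rather than uniformly at random; the sketch below builds such a band with binary morphology. The band width and sampling rule are assumptions, not the paper's exact procedure.

```python
# Focused-sampling sketch: training pixels concentrated near the boundary
# of an initial rough mask (scipy morphology; parameters are illustrative).
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

mask = np.zeros((64, 64), bool)
mask[20:44, 20:44] = True                     # initial rough tumor mask

band = binary_dilation(mask, iterations=3) & ~binary_erosion(mask, iterations=3)
ys, xs = np.nonzero(band)                     # candidate pixels near the boundary
rng = np.random.default_rng(0)
pick = rng.choice(len(ys), size=200, replace=False)
samples = list(zip(ys[pick], xs[pick]))       # pixels to label for the learner
print(len(samples), "focused training samples")
```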

  5. In-vivo determination of chewing patterns using FBG and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Pegorini, Vinicius; Zen Karam, Leandro; Rocha Pitta, Christiano S.; Ribeiro, Richardson; Simioni Assmann, Tangriani; Cardozo da Silva, Jean Carlos; Bertotti, Fábio L.; Kalinowski, Hypolito J.; Cardoso, Rafael

    2015-09-01

    This paper reports on pattern classification of the chewing process in ruminants. We propose a simplified signal processing scheme for optical fiber Bragg grating (FBG) sensors based on machine learning techniques. The FBG sensors measure the biomechanical forces during jaw movements, and an artificial neural network is responsible for the classification of the associated chewing pattern. In this study, three patterns associated with dietary supplement, hay and ryegrass were considered. Additionally, two other important events for ingestive behavior studies were monitored: rumination and idle periods. Experimental results show that the proposed approach for pattern classification is capable of differentiating the materials involved in the chewing process with a small classification error.
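
    A compact sketch of the classification stage, with synthetic three-dimensional features standing in for the FBG-derived force measurements and sklearn's MLPClassifier standing in for the paper's network; class labels follow the five monitored events.

```python
# Neural-network classification sketch for chewing-pattern features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
classes = ["supplement", "hay", "ryegrass", "rumination", "idle"]
# synthetic feature clusters, one per event class
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(40, 3)) for i in range(5)])
y = np.repeat(np.arange(5), 40)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(Xtr, ytr)
print("held-out accuracy:", net.score(Xte, yte))
```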

  6. Efficient fuzzy Bayesian inference algorithms for incorporating expert knowledge in parameter estimation

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad

    2016-05-01

    Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', which is the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert provided information, (2) it allows uncertainty and imprecision to be modeled distinctly, and (3) it presents a framework for fusing expert provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle in employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes prohibitively large and computationally infeasible. In this paper, a novel approach for accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf. An expert elicitation methodology is developed and applied to the real-world test case in order to provide a road map for the use of fuzzy Bayesian inference in groundwater modeling applications.
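
    The screening idea can be sketched as follows: a cheap Gaussian-process surrogate of the model misfit filters Monte Carlo candidates, and only promising candidates trigger the expensive simulation (here a placeholder function). The misfit, thresholds and design size are illustrative, and the fuzzy layers of the actual algorithm are omitted.

```python
# Surrogate-screened sampling sketch: the GP approximation decides which
# candidates are worth a full (expensive) model run.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_misfit(theta):                 # placeholder for a long simulation
    return (theta - 1.3) ** 2

rng = np.random.default_rng(0)
design = rng.uniform(0, 3, 12)               # small design of experiments
gp = GaussianProcessRegressor().fit(design[:, None],
                                    [expensive_misfit(t) for t in design])

kept, full_runs = [], 0
for theta in rng.uniform(0, 3, 500):         # candidate parameter draws
    mu, sd = gp.predict([[theta]], return_std=True)
    if mu[0] - 2 * sd[0] < 0.5:              # plausibly good -> verify fully
        full_runs += 1
        if expensive_misfit(theta) < 0.5:
            kept.append(theta)

print(f"{full_runs} full runs instead of 500; kept {len(kept)} samples")
```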

  7. Dynamic deformation image de-blurring and image processing for digital imaging correlation measurement

    NASA Astrophysics Data System (ADS)

    Guo, X.; Li, Y.; Suo, T.; Liu, H.; Zhang, C.

    2017-11-01

    This paper proposes a method for de-blurring images captured during the dynamic deformation of materials. De-blurring is achieved with a dynamics-based approach that estimates the Point Spread Function (PSF) over the camera exposure window. The deconvolution process, which involves iterative matrix calculations over pixels, is then performed on the GPU to decrease the time cost. Compared with the Gauss method and the Lucy-Richardson method, it achieves the best image restoration results. The proposed method has been evaluated using the Hopkinson bar loading system. In comparison to the blurry input, the proposed method successfully restores the image. It is also demonstrated in image processing applications that the de-blurring method can improve the accuracy and stability of digital imaging correlation measurements.
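
    For reference, the Lucy-Richardson baseline mentioned above is available in scikit-image; the sketch assumes a known horizontal motion-blur PSF (the paper instead estimates the PSF from the recorded motion) and uses the num_iter argument of skimage >= 0.19.

```python
# Richardson-Lucy deconvolution baseline via scikit-image.
import numpy as np
from scipy.signal import convolve2d
from skimage.restoration import richardson_lucy

rng = np.random.default_rng(0)
img = rng.random((64, 64))
psf = np.zeros((5, 5)); psf[2, :] = 1.0; psf /= psf.sum()   # horizontal blur

blurred = convolve2d(img, psf, mode="same", boundary="symm")
restored = richardson_lucy(blurred, psf, num_iter=30)
print(restored.shape)
```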

  8. A new IRT-based standard setting method: application to eCat-listening.

    PubMed

    García, Pablo Eduardo; Abad, Francisco José; Olea, Julio; Aguado, David

    2013-01-01

    Criterion-referenced interpretations of tests are highly necessary, but they usually involve the difficult task of establishing cut scores. Contrasting with other Item Response Theory (IRT)-based standard setting methods, a non-judgmental approach is proposed in this study, in which Item Characteristic Curve (ICC) transformations lead to the final cut scores. eCat-Listening, a computerized adaptive test for the evaluation of English listening, was administered to 1,576 participants, and the proposed standard setting method was applied to classify them into the performance standards of the Common European Framework of Reference for Languages (CEFR). The results showed a classification closely related to relevant external measures of the English language domain, according to the CEFR. It is concluded that the proposed method is a practical and valid standard setting alternative for IRT-based test interpretation.
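
    The ICC-to-cut-score idea can be illustrated with a two-parameter logistic ICC: fixing a mastery probability on a reference item and inverting the curve yields a theta cut score. The 2PL form and parameter values here are illustrative, not the method's exact transformation.

```python
# 2PL item characteristic curve and its inversion into a theta cut score.
import numpy as np

def icc(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))   # 2PL ICC

def theta_cut(p_star, a, b):
    return b + np.log(p_star / (1.0 - p_star)) / a  # invert the ICC

a, b = 1.2, 0.3           # illustrative discrimination and difficulty
print("theta cut for P=0.7:", theta_cut(0.7, a, b))
print("check:", icc(theta_cut(0.7, a, b), a, b))
```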

  9. A simulated annealing approach for redesigning a warehouse network problem

    NASA Astrophysics Data System (ADS)

    Khairuddin, Rozieana; Marlizawati Zainuddin, Zaitul; Jiun, Gan Jia

    2017-09-01

    Nowadays, several companies consider downsizing their distribution networks through consolidation or phase-out of some of their current warehousing facilities, driven by increasing competition, mounting cost pressure and the pursuit of economies of scale. Consequently, changes in the economic situation after a certain period of time require an adjustment of the network model in order to obtain the optimal cost under the current economic conditions. This paper aimed to develop a mixed-integer linear programming model for a two-echelon warehouse network redesign problem with a capacitated plant and uncapacitated warehouses. The main contribution of this study is the consideration of capacity constraints for existing warehouses. A simulated annealing algorithm is proposed to solve the proposed model. The numerical results showed that the proposed model and solution method are practical.
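
    A minimal sketch of the metaheuristic, assuming a stylized single-echelon cost (fixed costs of open sites plus nearest-open-site service cost); capacities and the two-echelon structure of the actual model are omitted, and all data are random.

```python
# Simulated annealing over which warehouses to keep open.
import numpy as np

rng = np.random.default_rng(0)
W, Cst = 6, 40
wh = rng.random((W, 2)); cust = rng.random((Cst, 2))
fixed = rng.uniform(5, 10, W)
dist = np.linalg.norm(cust[:, None, :] - wh[None, :, :], axis=2)

def cost(open_mask):
    if not open_mask.any():
        return np.inf
    return fixed[open_mask].sum() + dist[:, open_mask].min(axis=1).sum()

state = np.ones(W, bool)
best, best_cost, T = state.copy(), cost(state), 5.0
for _ in range(2000):
    cand = state.copy()
    cand[rng.integers(W)] ^= True                 # flip one open/close decision
    d = cost(cand) - cost(state)
    if d < 0 or rng.random() < np.exp(-d / T):    # Metropolis acceptance
        state = cand
        if cost(state) < best_cost:
            best, best_cost = state.copy(), cost(state)
    T *= 0.999                                    # cooling schedule
print("open warehouses:", np.flatnonzero(best), "cost:", round(best_cost, 2))
```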

  10. Semi-empirical quantum evaluation of peptide - MHC class II binding

    NASA Astrophysics Data System (ADS)

    González, Ronald; Suárez, Carlos F.; Bohórquez, Hugo J.; Patarroyo, Manuel A.; Patarroyo, Manuel E.

    2017-01-01

    Peptide presentation by the major histocompatibility complex (MHC) is a key process for triggering a specific immune response. Studying peptide-MHC (pMHC) binding from a structural-based approach has potential for reducing the costs of investigation into vaccine development. This study involved using two semi-empirical quantum chemistry methods (PM7 and FMO-DFTB) for computing the binding energies of peptides bonded to HLA-DR1 and HLA-DR2. We found that key stabilising water molecules involved in the peptide binding mechanism were required for finding high correlation with IC50 experimental values. Our proposal is computationally non-intensive, and is a reliable alternative for studying pMHC binding interactions.

  11. Endovascular Treatment of a Symptomatic Thoracoabdominal Aortic Aneurysm by Chimney and Periscope Techniques for Total Visceral and Renal Artery Revascularization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cariati, Maurizio, E-mail: cariati.maurizio@sancarlo.mi.it; Mingazzini, Pietro; Dallatana, Raffaello

    2013-05-02

    Conventional endovascular therapy of thoracoabdominal aortic aneurysms involving the visceral and renal arteries is limited by the absence of a landing zone for the aortic endograft. Solutions have been proposed to overcome the problem of the missing landing zone; however, most of them are not feasible in urgent and high-risk patients. We describe a case that was successfully treated by a total endovascular technique with a two-by-two chimney-and-periscope approach in a patient with an acute symptomatic type IV thoracoabdominal aortic aneurysm with supra-anastomotic aneurysm formation involving the renal and visceral arteries and a pseudoaneurysmal sac localized in the left iliopsoas muscle.

  12. Promoting tissue regeneration by modulating the immune system.

    PubMed

    Julier, Ziad; Park, Anthony J; Briquez, Priscilla S; Martino, Mikaël M

    2017-04-15

    The immune system plays a central role in tissue repair and regeneration. Indeed, the immune response to tissue injury is crucial in determining the speed and the outcome of the healing process, including the extent of scarring and the restoration of organ function. Therefore, controlling immune components via biomaterials and drug delivery systems is becoming an attractive approach in regenerative medicine, since therapies based on stem cells and growth factors have not yet proven to be broadly effective in the clinic. To integrate the immune system into regenerative strategies, one of the first challenges is to understand the precise functions of the different immune components during the tissue healing process. While remarkable progress has been made, the immune mechanisms involved are still elusive, and there is indication for both negative and positive roles depending on the tissue type or organ and life stage. It is well recognized that the innate immune response comprising danger signals, neutrophils and macrophages modulates tissue healing. In addition, it is becoming evident that the adaptive immune response, in particular T cell subset activities, plays a critical role. In this review, we first present an overview of the basic immune mechanisms involved in tissue repair and regeneration. Then, we highlight various approaches based on biomaterials and drug delivery systems that aim at modulating these mechanisms to limit fibrosis and promote regeneration. We propose that the next generation of regenerative therapies may evolve from typical biomaterial-, stem cell-, or growth factor-centric approaches to an immune-centric approach. Most regenerative strategies have not yet proven to be safe or reasonably efficient in the clinic. In addition to stem cells and growth factors, the immune system plays a crucial role in the tissue healing process. Here, we propose that controlling the immune-mediated mechanisms of tissue repair and regeneration may support existing regenerative strategies or could be an alternative to using stem cells and growth factors. In the first part of this review, we highlight key immune mechanisms involved in the tissue healing process and mark them as potential targets for designing regenerative strategies. In the second part, we discuss various approaches using biomaterials and drug delivery systems that aim at modulating the components of the immune system to promote tissue regeneration. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  13. Executive Functions and the Improvement of Thinking Abilities: The Intervention in Reading Comprehension

    PubMed Central

    García-Madruga, Juan A.; Gómez-Veiga, Isabel; Vila, José Ó.

    2016-01-01

    In this paper, we propose a preliminary theory of executive functions that addresses in a specific way their relationship with working memory (WM) and higher-level cognition. It includes: (a) four core on-line WM executive functions that are involved in every novel and complex cognitive task; (b) two higher-order off-line executive functions, planning and revision, that are required for the most complex intellectual abilities; and (c) emotional control, which is involved in any complex, novel and difficult task. The main assumption is that efficiency on thinking abilities may be improved by specific instruction or training on the executive functions necessary for solving the novel and complex tasks involved in these abilities. Evidence for the impact of our training proposal on WM's executive functions involved in higher-level cognitive abilities comes from three studies applying an adaptive program designed to improve reading comprehension in primary school students by boosting the core WM executive functions involved in it: focusing on relevant information, switching (or shifting) between representations or tasks, connecting incoming information from text with long-term representations, updating of the semantic representation of the text in WM, and inhibition of irrelevant information. The results are consistent with the assumption that cognitive enhancements from the training intervention may have affected not only a specific but also a more domain-general mechanism involved in various executive functions. We discuss some methodological issues in the studies of effects of WM training on reading comprehension. The perspectives and limitations of our approach are finally discussed. PMID:26869961

  14. Measuring glomerular number from kidney MRI images

    NASA Astrophysics Data System (ADS)

    Thiagarajan, Jayaraman J.; Natesan Ramamurthy, Karthikeyan; Kanberoglu, Berkay; Frakes, David; Bennett, Kevin; Spanias, Andreas

    2016-03-01

    Measuring the glomerular number in the entire, intact kidney using non-destructive techniques is of immense importance in studying several renal and systemic diseases. Commonly used approaches either require destruction of the entire kidney or perform extrapolation from measurements obtained from a few isolated sections. A recent magnetic resonance imaging (MRI) method, based on the injection of a contrast agent (cationic ferritin), has been used to effectively identify glomerular regions in the kidney. In this work, we propose a robust, accurate, and low-complexity method for estimating the number of glomeruli from such kidney MRI images. The proposed technique has a training phase and a low-complexity testing phase. In the training phase, organ segmentation is performed on a few expert-marked training images, and glomerular and non-glomerular image patches are extracted. Using non-local sparse coding to compute similarity and dissimilarity graphs between the patches, the subspace in which the glomerular regions can be discriminated from the rest are estimated. For novel test images, the image patches extracted after pre-processing are embedded using the discriminative subspace projections. The testing phase is of low computational complexity since it involves only matrix multiplications, clustering, and simple morphological operations. Preliminary results with MRI data obtained from five kidneys of rats show that the proposed non-invasive, low-complexity approach performs comparably to conventional approaches such as acid maceration and stereology.
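
    The low-complexity counting stage can be sketched directly: given a binary map of pixels classified as glomerular (synthetic here), connected-component labeling with a size filter yields the count; the patch embedding and clustering stages are omitted.

```python
# Counting sketch: connected components of a glomerular pixel map.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(2)
mask = np.zeros((128, 128), bool)
for _ in range(25):                         # plant synthetic glomerular blobs
    r, c = rng.integers(5, 123, 2)
    mask[r - 2:r + 2, c - 2:c + 2] = True

labels, n = label(mask)
sizes = np.bincount(labels.ravel())[1:]     # component sizes, background removed
count = int((sizes >= 4).sum())             # discard tiny spurious components
print("estimated glomerular count:", count)
```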

  15. Identification of material properties of orthotropic composite plate using experimental frequency response function data

    NASA Astrophysics Data System (ADS)

    Tam, Jun Hui; Ong, Zhi Chao; Ismail, Zubaidah; Ang, Bee Chin; Khoo, Shin Yee

    2018-05-01

    The demand for composite materials is increasing due to their great superiority in material properties, e.g., light weight, high strength and high corrosion resistance. As a result, the invention of composite materials with diverse properties is becoming prevalent, which in turn drives the development of material identification methods for composite materials. Conventional identification methods are destructive, time-consuming and costly. Therefore, an accurate identification approach is proposed to circumvent these drawbacks, involving the use of a Frequency Response Function (FRF) error function defined by the correlation discrepancy between experimental and Finite-Element generated FRFs. A square E-glass epoxy composite plate is investigated under several different configurations of boundary conditions. Notably, the experimental FRFs are used as the correlation reference, such that, during computation, the predicted FRFs are continuously updated with reference to the experimental FRFs until a solution is achieved. The final identified elastic properties, namely the in-plane elastic moduli, Ex and Ey, the in-plane shear modulus, Gxy, and the major Poisson's ratio, vxy, of the composite plate are subsequently compared to the benchmark parameters as well as to those obtained using a modal-based approach. Compared with the modal-based approach, the proposed method is found to yield relatively better results. This can be explained by the direct employment of raw data in the proposed method, which avoids errors that might be incurred during the stage of modal extraction.
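
    The updating loop can be illustrated on a single-degree-of-freedom surrogate: stiffness and damping are fitted so the analytic FRF matches a noisy "measured" FRF in the least-squares sense. The 1-DOF model is a stand-in for the plate's finite-element model, and all values are illustrative.

```python
# FRF-based model updating sketch on a 1-DOF system,
# H(w) = 1 / (k - m*w^2 + i*c*w).
import numpy as np
from scipy.optimize import least_squares

m, k_true, c_true = 1.0, 400.0, 2.0
w = np.linspace(1, 60, 200)

def frf(k, c):
    return 1.0 / (k - m * w**2 + 1j * c * w)

noise = 1 + 0.01 * np.random.default_rng(0).normal(size=w.size)
measured = frf(k_true, c_true) * noise      # "experimental" reference FRF

def residual(p):
    h = frf(*p)
    return np.concatenate([(h - measured).real, (h - measured).imag])

sol = least_squares(residual, x0=[300.0, 1.0])
print("identified k, c:", sol.x)
```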

  16. Topology optimization under stochastic stiffness

    NASA Astrophysics Data System (ADS)

    Asadpoure, Alireza

    Topology optimization is a systematic computational tool for optimizing the layout of materials within a domain for engineering design problems. It allows variation of structural boundaries and connectivities. This freedom in the design space often enables discovery of new, high performance designs. However, solutions obtained by performing the optimization in a deterministic setting may be impractical or suboptimal when considering real-world engineering conditions with inherent variabilities including (for example) variabilities in fabrication processes and operating conditions. The aim of this work is to provide a computational methodology for topology optimization in the presence of uncertainties associated with structural stiffness, such as uncertain material properties and/or structural geometry. Existing methods for topology optimization under deterministic conditions are first reviewed. Modifications are then proposed to improve the numerical performance of the so-called Heaviside Projection Method (HPM) in continuum domains. Next, two approaches, perturbation and Polynomial Chaos Expansion (PCE), are proposed to account for uncertainties in the optimization procedure. These approaches are intrusive, allowing tight and efficient coupling of the uncertainty quantification with the optimization sensitivity analysis. The work herein develops a robust topology optimization framework aimed at reducing the sensitivity of optimized solutions to uncertainties. The perturbation-based approach combines deterministic topology optimization with a perturbation method for the quantification of uncertainties. The use of perturbation transforms the problem of topology optimization under uncertainty to an augmented deterministic topology optimization problem. The PCE approach combines the spectral stochastic approach for the representation and propagation of uncertainties with an existing deterministic topology optimization technique. The resulting compact representations for the response quantities allow for efficient and accurate calculation of sensitivities of response statistics with respect to the design variables. The proposed methods are shown to be successful at generating robust optimal topologies. Examples from topology optimization in continuum and discrete domains (truss structures) under uncertainty are presented. It is also shown that proposed methods lead to significant computational savings when compared to Monte Carlo-based optimization which involve multiple formations and inversions of the global stiffness matrix and that results obtained from the proposed method are in excellent agreement with those obtained from a Monte Carlo-based optimization algorithm.
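
    The PCE building block can be sketched in one dimension: a response of a standard normal input is projected onto probabilists' Hermite polynomials by Gauss-Hermite quadrature, and the mean and variance follow from the coefficients. The response function is illustrative, and the intrusive coupling to optimization sensitivities is not shown.

```python
# 1-D polynomial chaos expansion sketch with probabilists' Hermite basis.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def response(xi):          # stand-in for compliance vs. an uncertain stiffness
    return 1.0 / (1.0 + 0.3 * xi + 0.05 * xi**2)

order, nq = 4, 12
x, w = He.hermegauss(nq)   # nodes/weights for weight exp(-x^2 / 2)
w = w / sqrt(2 * pi)       # normalize to the standard normal density

# c_k = E[response * He_k] / k!, using E[He_j He_k] = k! delta_jk
coef = [(w * response(x) * He.hermeval(x, [0] * k + [1])).sum() / factorial(k)
        for k in range(order + 1)]
mean = coef[0]
var = sum(c**2 * factorial(k) for k, c in enumerate(coef) if k > 0)
print(f"mean={mean:.4f}  std={np.sqrt(var):.4f}")
```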

  17. 45 CFR 2102.10 - Timing, scope and content of submissions for proposed projects involving land, buildings, or...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... proposed projects involving land, buildings, or other structures. 2102.10 Section 2102.10 Public Welfare... for proposed projects involving land, buildings, or other structures. (a) A party proposing a project... historical information about the building or other structure to be altered or razed; (ii) The identity of the...

  18. Inferior heel pain in soccer players: a retrospective study with a proposal for guidelines of treatment

    PubMed Central

    Saggini, Raoul; Migliorini, Maurizio; Carmignano, Simona Maria; Ancona, Emilio; Russo, Chiara; Bellomo, Rosa Grazia

    2018-01-01

    Background The cause of heel pain among soccer players is multifactorial and is related to repetitive microtrauma due to impact forces involving technical moves, but also the playground, the exercise mode, the recovery time, the climatic conditions and the footwear used. Aim To investigate the aetiology of plantar heel pain of soccer players with the objective of proposing an example of guidelines for treatment. Methods We investigated the prevalence and characteristics of inferior heel pain of 1473 professional, semiprofessional and amateur players. All evaluated subjects were submitted to a specific rehabilitation protocol that involved advanced physical therapies and viscoelastic insoles depending on the aetiology of pain. Results Clinical and instrumental examinations revealed that 960 of 1473 athletes had inferior heel pain. These patients were divided into seven groups based on aetiology: sural nerve compression, abductor digiti minimi compression, atrophy and inflammation of the fat pad, plantar fasciitis, stress injury of the heel spur, stress fracture of the heel bone and heel spur. The proposed rehabilitation treatment aims for a reduction of pain and an early return to sports, with excellent results. Conclusions According to what was observed in the present study, related also to the specific treatment of inferior heel pain, and considering the technological progress achieved in recent years, we can now propose an integrated therapeutic approach to treatment of heel pain, properly differentiated according to specific aetiology. PMID:29527319

  20. Social power and approach-related neural activity.

    PubMed

    Boksem, Maarten A S; Smolders, Ruud; De Cremer, David

    2012-06-01

    It has been argued that power activates a general tendency to approach whereas powerlessness activates a tendency to inhibit. The assumption is that elevated power involves reward-rich environments, freedom and, as a consequence, triggers an approach-related motivational orientation and attention to rewards. In contrast, reduced power is associated with increased threat, punishment and social constraint and thereby activates inhibition-related motivation. Moreover, approach motivation has been found to be associated with increased relative left-sided frontal brain activity, while withdrawal motivation has been associated with increased right sided activations. We measured EEG activity while subjects engaged in a task priming either high or low social power. Results show that high social power is indeed associated with greater left-frontal brain activity compared to low social power, providing the first neural evidence for the theory that high power is associated with approach-related motivation. We propose a framework accounting for differences in both approach motivation and goal-directed behaviour associated with different levels of power.

  1. Recursive Bayesian recurrent neural networks for time-series modeling.

    PubMed

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.
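
    As a simplified linear analogue of the recursive machinery, the sketch below performs a Kalman-style sequential update of model weights and their covariance with a fixed noise hyperparameter; the paper's full method applies a second-order Levenberg-Marquardt version of this bookkeeping to recurrent-network weights.

```python
# Recursive Bayesian weight update (linear simplification).
import numpy as np

rng = np.random.default_rng(0)
d = 3
w_true = np.array([0.5, -1.0, 2.0])
w = np.zeros(d)
P = np.eye(d) * 10.0          # prior weight covariance
sigma2 = 0.01                 # observation-noise hyperparameter

for _ in range(200):
    x = rng.normal(size=d)
    y = x @ w_true + rng.normal(scale=np.sqrt(sigma2))
    S = x @ P @ x + sigma2            # innovation variance
    K = P @ x / S                     # gain
    w = w + K * (y - x @ w)           # weight update
    P = P - np.outer(K, x @ P)        # covariance update

print("estimated weights:", np.round(w, 3))
```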

  2. Linking definitions, mechanisms, and modeling of drought-induced tree death.

    PubMed

    Anderegg, William R L; Berry, Joseph A; Field, Christopher B

    2012-12-01

    Tree death from drought and heat stress is a critical and uncertain component in forest ecosystem responses to a changing climate. Recent research has illuminated how tree mortality is a complex cascade of changes involving interconnected plant systems over multiple timescales. Explicit consideration of the definitions, dynamics, and temporal and biological scales of tree mortality research can guide experimental and modeling approaches. In this review, we draw on the medical literature concerning human death to propose a water resource-based approach to tree mortality that considers the tree as a complex organism with a distinct growth strategy. This approach provides insight into mortality mechanisms at the tree and landscape scales and presents promising avenues into modeling tree death from drought and temperature stress. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. New advances in the statistical parton distributions approach

    NASA Astrophysics Data System (ADS)

    Soffer, Jacques; Bourrely, Claude

    2016-03-01

    The quantum statistical parton distributions approach proposed more than one decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Several serious challenges remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p¯p and pp collisions up to LHC energies, using recent data and forthcoming experimental results. Presented by J. Soffer at POETIC 2015.

  4. Antepartum fetal heart rate feature extraction and classification using empirical mode decomposition and support vector machine

    PubMed Central

    2011-01-01

    Background Cardiotocography (CTG) is the most widely used tool for fetal surveillance. The visual analysis of fetal heart rate (FHR) traces largely depends on the expertise and experience of the clinician involved. Several approaches have been proposed for the effective interpretation of FHR. In this paper, a new approach for FHR feature extraction based on empirical mode decomposition (EMD) is proposed, which was used along with support vector machine (SVM) for the classification of FHR recordings as 'normal' or 'at risk'. Methods The FHR were recorded from 15 subjects at a sampling rate of 4 Hz and a dataset consisting of 90 randomly selected records of 20 minutes duration was formed from these. All records were labelled as 'normal' or 'at risk' by two experienced obstetricians. A training set was formed by 60 records, the remaining 30 left as the testing set. The standard deviations of the EMD components are input as features to a support vector machine (SVM) to classify FHR samples. Results For the training set, a five-fold cross validation test resulted in an accuracy of 86% whereas the overall geometric mean of sensitivity and specificity was 94.8%. The Kappa value for the training set was .923. Application of the proposed method to the testing set (30 records) resulted in a geometric mean of 81.5%. The Kappa value for the testing set was .684. Conclusions Based on the overall performance of the system it can be stated that the proposed methodology is a promising new approach for the feature extraction and classification of FHR signals. PMID:21244712
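
    The feature pipeline can be sketched with the third-party PyEMD package (distributed on PyPI as EMD-signal): each signal is decomposed into IMFs and the per-IMF standard deviations feed an SVM, mirroring the paper's feature choice. The traces below are synthetic sinusoid-plus-noise stand-ins for FHR recordings, and the split/labels are invented.

```python
# EMD feature extraction + SVM classification sketch (assumes PyEMD).
import numpy as np
from PyEMD import EMD
from sklearn.svm import SVC

rng = np.random.default_rng(0)
t = np.linspace(0.0, 60.0, 240)            # 4 Hz, one minute (paper uses 20 min)

def features(sig, n_imf=4):
    imfs = EMD()(sig)                      # empirical mode decomposition
    sd = imfs.std(axis=1)                  # standard deviation of each IMF
    return np.pad(sd, (0, max(0, n_imf - len(sd))))[:n_imf]

X, y = [], []
for label, (a, f) in [(0, (5.0, 0.05)), (1, (1.0, 0.01))]:   # normal / at risk
    for _ in range(20):
        sig = 140 + a * np.sin(2 * np.pi * f * t) + rng.normal(0, 2, t.size)
        X.append(features(sig)); y.append(label)

clf = SVC().fit(np.array(X), y)
print("training accuracy:", clf.score(np.array(X), y))
```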

  5. Optimal Space Station solar array gimbal angle determination via radial basis function neural networks

    NASA Technical Reports Server (NTRS)

    Clancy, Daniel J.; Oezguener, Uemit; Graham, Ronald E.

    1994-01-01

    The potential for excessive plume impingement loads on Space Station Freedom solar arrays, caused by jet firings from an approaching Space Shuttle, is addressed. An artificial neural network is designed to determine commanded solar array beta gimbal angle for minimum plume loads. The commanded angle would be determined dynamically. The network design proposed involves radial basis functions as activation functions. Design, development, and simulation of this network design are discussed.
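
    A minimal RBF-network sketch: Gaussian basis functions over an assumed two-dimensional approach state, with linear output weights fitted by least squares to a synthetic target angle; the true training targets would come from plume-load analysis.

```python
# Radial basis function network regression sketch.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, (200, 2))              # e.g., relative approach state
target = np.sin(np.pi * x[:, 0]) * x[:, 1]    # stand-in optimal beta angle

centers = rng.uniform(-1, 1, (25, 2))         # RBF centers
width = 0.4

def design(pts):
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width**2))       # Gaussian activations

W, *_ = np.linalg.lstsq(design(x), target, rcond=None)
test = rng.uniform(-1, 1, (5, 2))
print("commanded angles:", design(test) @ W)
```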

  6. A hybrid model for combining case-control and cohort studies in systematic reviews of diagnostic tests

    PubMed Central

    Chen, Yong; Liu, Yulun; Ning, Jing; Cormier, Janice; Chu, Haitao

    2014-01-01

    Systematic reviews of diagnostic tests often involve a mixture of case-control and cohort studies. The standard methods for evaluating diagnostic accuracy only focus on sensitivity and specificity and ignore the information on disease prevalence contained in cohort studies. Consequently, such methods cannot provide estimates of measures related to disease prevalence, such as population averaged or overall positive and negative predictive values, which reflect the clinical utility of a diagnostic test. In this paper, we propose a hybrid approach that jointly models the disease prevalence along with the diagnostic test sensitivity and specificity in cohort studies, and the sensitivity and specificity in case-control studies. In order to overcome the potential computational difficulties in the standard full likelihood inference of the proposed hybrid model, we propose an alternative inference procedure based on the composite likelihood. Such composite likelihood based inference does not suffer computational problems and maintains high relative efficiency. In addition, it is more robust to model mis-specifications compared to the standard full likelihood inference. We apply our approach to a review of the performance of contemporary diagnostic imaging modalities for detecting metastases in patients with melanoma. PMID:25897179
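
    A fixed-effect caricature of the hybrid likelihood: cohort studies inform sensitivity, specificity and prevalence, case-control studies only the first two, and an independence-type composite log-likelihood is maximized on the logit scale. Random effects and the robust variance of the actual method are omitted, and all counts are invented.

```python
# Composite-likelihood sketch for mixed cohort / case-control reviews.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import binom

# cohort: (true pos, diseased n, true neg, healthy n, diseased, total)
cohorts = [(45, 50, 90, 100, 50, 150), (28, 30, 55, 60, 30, 90)]
# case-control: (true pos, cases, true neg, controls)
case_controls = [(80, 90, 70, 80), (33, 40, 95, 110)]

def neg_cl(p):
    se, sp, pr = expit(p)
    ll = 0.0
    for tp, nd, tn, nh, d, n in cohorts:
        ll += (binom.logpmf(tp, nd, se) + binom.logpmf(tn, nh, sp)
               + binom.logpmf(d, n, pr))
    for tp, nd, tn, nh in case_controls:
        ll += binom.logpmf(tp, nd, se) + binom.logpmf(tn, nh, sp)
    return -ll

res = minimize(neg_cl, np.zeros(3))
se, sp, pr = expit(res.x)
ppv = se * pr / (se * pr + (1 - sp) * (1 - pr))   # now estimable
print(f"Se={se:.3f} Sp={sp:.3f} prevalence={pr:.3f} PPV={ppv:.3f}")
```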

  7. Neural network for control of rearrangeable Clos networks.

    PubMed

    Park, Y K; Cherkassky, V

    1994-09-01

    Rapid evolution in the field of communication networks requires high speed switching technologies. This involves a high degree of parallelism in switching control and routing performed at the hardware level. The multistage crossbar networks have always been attractive to switch designers. In this paper a neural network approach to controlling a three-stage Clos network in real time is proposed. This controller provides optimal routing of communication traffic requests on a call-by-call basis by rearranging existing connections, with a minimum length of rearrangement sequence so that a new blocked call request can be accommodated. The proposed neural network controller uses Paull's rearrangement algorithm, along with the special (least used) switch selection rule in order to minimize the length of rearrangement sequences. The functional behavior of our model is verified by simulations and it is shown that the convergence time required for finding an optimal solution is constant, regardless of the switching network size. The performance is evaluated for random traffic with various traffic loads. Simulation results show that applying the least used switch selection rule increases the efficiency in switch rearrangements, reducing the network convergence time. The implementation aspects are also discussed to show the feasibility of the proposed approach.
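
    A simplified sketch of the routing condition the controller enforces: a new call (i, j) needs a middle switch that is free on both its input-stage and output-stage links, chosen here by the least-used rule named in the abstract; the rearrangement of existing calls per Paull's algorithm is not implemented.

```python
# Three-stage Clos routing sketch with the least-used middle-switch rule.
import numpy as np

n_in, n_mid, n_out = 4, 4, 4
in_busy = np.zeros((n_in, n_mid), bool)    # input switch i -> middle m in use
out_busy = np.zeros((n_mid, n_out), bool)  # middle m -> output switch j in use
usage = np.zeros(n_mid, int)

def connect(i, j):
    free = ~in_busy[i] & ~out_busy[:, j]
    if not free.any():
        return None                        # blocked: rearrangement needed
    m = min(np.flatnonzero(free), key=lambda k: usage[k])   # least-used rule
    in_busy[i, m] = out_busy[m, j] = True
    usage[m] += 1
    return m

for (i, j) in [(0, 1), (0, 2), (1, 1), (2, 3)]:
    print(f"call {i}->{j} via middle switch", connect(i, j))
```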

  8. GMOtrack: generator of cost-effective GMO testing strategies.

    PubMed

    Novak, Petra Kralj; Gruden, Kristina; Morisset, Dany; Lavrac, Nada; Stebih, Dejan; Rotter, Ana; Zel, Jana

    2009-01-01

    Commercialization of numerous genetically modified organisms (GMOs) has already been approved worldwide, and several additional GMOs are in the approval process. Many countries have adopted legislation to deal with GMO-related issues such as food safety, environmental concerns, and consumers' right of choice, making GMO traceability a necessity. The growing extent of GMO testing makes it important to study optimal GMO detection and identification strategies. This paper formally defines the problem of routine laboratory-level GMO tracking as a cost optimization problem, thus proposing a shift from "the same strategy for all samples" to "sample-centered GMO testing strategies." An algorithm (GMOtrack) for finding optimal two-phase (screening-identification) testing strategies is proposed. The advantages of cost optimization with increasing GMO presence on the market are demonstrated, showing that optimization approaches to analytic GMO traceability can result in major cost reductions. The optimal testing strategies are laboratory-dependent, as the costs depend on prior probabilities of local GMO presence, which are exemplified on food and feed samples. The proposed GMOtrack approach, publicly available under the terms of the General Public License, can be extended to other domains where complex testing is involved, such as safety and quality assurance in the food supply chain.
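
    A stylized version of the two-phase cost optimization: enumerate screening-element subsets and pick the one minimizing expected cost, where GMOs not covered by the screen must always be identified and covered ones only when present. The costs, priors and coverage table are invented, and the actual GMOtrack cost model is richer.

```python
# Two-phase (screening -> identification) testing-cost sketch.
from itertools import combinations

gmos = {"A": 0.30, "B": 0.15, "C": 0.05}           # prior presence probabilities
covers = {"35S": {"A", "B"}, "NOS": {"B", "C"}}    # screening elements
screen_cost, id_cost = 1.0, 3.0

def expected_cost(screen_set):
    cost = screen_cost * len(screen_set)
    flagged = set().union(*(covers[s] for s in screen_set)) if screen_set else set()
    uncovered = set(gmos) - flagged
    # uncovered GMOs always need identification; covered ones only when present
    cost += id_cost * (len(uncovered) + sum(gmos[g] for g in flagged))
    return cost

options = [set(c) for r in range(len(covers) + 1)
           for c in combinations(covers, r)]
best = min(options, key=expected_cost)
print("best screening set:", best, "expected cost:", round(expected_cost(best), 2))
```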

  9. PRIMAL: Page Rank-Based Indoor Mapping and Localization Using Gene-Sequenced Unlabeled WLAN Received Signal Strength

    PubMed Central

    Zhou, Mu; Zhang, Qiao; Xu, Kunjie; Tian, Zengshan; Wang, Yanmeng; He, Wei

    2015-01-01

    Due to the wide deployment of wireless local area networks (WLAN), received signal strength (RSS)-based indoor WLAN localization has attracted considerable attention in both academia and industry. In this paper, we propose a novel page rank-based indoor mapping and localization (PRIMAL) approach that uses gene-sequenced unlabeled WLAN RSS for simultaneous localization and mapping (SLAM). Specifically, first of all, based on the observation of the motion patterns of the people in the target environment, we use the Allen logic to construct the mobility graph to characterize the connectivity among different areas of interest. Second, the concept of gene sequencing is utilized to assemble the sporadically-collected RSS sequences into a signal graph based on the transition relations among different RSS sequences. Third, we apply the graph drawing approach to exhibit both the mobility graph and signal graph in a more readable manner. Finally, the page rank (PR) algorithm is proposed to construct the mapping from the signal graph into the mobility graph. The experimental results show that the proposed approach achieves satisfactory localization accuracy while avoiding the intensive time and labor cost involved in the conventional location fingerprinting-based indoor WLAN localization. PMID:26404274
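
    The graph-matching details aside, the page rank computation itself is standard power iteration, sketched below on a small illustrative mobility graph (the adjacency matrix is invented).

```python
# Power-iteration PageRank sketch over a toy mobility graph.
import numpy as np

A = np.array([[0, 1, 1, 0],     # area connectivity (mobility graph)
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)
P = A / A.sum(axis=1, keepdims=True)      # row-stochastic transition matrix

d, n = 0.85, len(A)
r = np.full(n, 1.0 / n)
for _ in range(100):
    r = (1 - d) / n + d * (P.T @ r)       # PageRank iteration
print("page ranks:", np.round(r, 3))
```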

  10. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kersaudy, Pierric, E-mail: pierric.kersaudy@orange.com; Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux; ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée

    2015-04-01

    In numerical dosimetry, the recent advances in high performance computing led to a strong reduction of the required computational time to assess the specific absorption rate (SAR) characterizing the human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can request several hours. As a consequence, the influence of uncertain input parameters on the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here to perform such an analysis is surrogate modeling. This paper proposes a novel approach to build such a surrogate model from a design of experiments. Considering a sparse representation of the polynomial chaos expansions using least-angle regression as a selection algorithm to retain the most influential polynomials, this paper proposes to use the selected polynomials as regression functions for the universal Kriging model. The leave-one-out cross validation is used to select the optimal number of polynomials in the deterministic part of the Kriging model. The proposed approach, called LARS-Kriging-PC modeling, is applied to three benchmark examples and then to a full-scale metamodeling problem involving the exposure of a numerical fetus model to a femtocell device. The performances of the LARS-Kriging-PC are compared to an ordinary Kriging model and to a classical sparse polynomial chaos expansion. The LARS-Kriging-PC appears to have better performances than the two other approaches. A significant accuracy improvement is observed compared to the ordinary Kriging or to the sparse polynomial chaos depending on the studied case. This approach seems to be an optimal solution between the two other classical approaches. A global sensitivity analysis is finally performed on the LARS-Kriging-PC model of the fetus exposure problem.
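
    An approximate sketch of the construction: LARS selects sparse polynomial regressors, a linear fit on them plays the Kriging trend, and a Gaussian process models the residuals. This trend-plus-GP-residual decomposition is a common stand-in for true universal Kriging, which sklearn does not expose directly; data and degrees are illustrative.

```python
# LARS-selected polynomial trend + Gaussian-process residual sketch.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LarsCV
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (40, 2))                   # design of experiments
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.05 * rng.normal(size=40)

poly = PolynomialFeatures(degree=3, include_bias=False)
Phi = poly.fit_transform(X)
trend = LarsCV(cv=5).fit(Phi, y)                  # sparse regressor selection
gp = GaussianProcessRegressor(alpha=1e-4).fit(X, y - trend.predict(Phi))

Xt = rng.uniform(-1, 1, (3, 2))
print(trend.predict(poly.transform(Xt)) + gp.predict(Xt))   # surrogate prediction
```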

  11. A Novel Approach for Adaptive Signal Processing

    NASA Technical Reports Server (NTRS)

    Chen, Ya-Chin; Juang, Jer-Nan

    1998-01-01

    Adaptive linear predictors have been used extensively in practice in a wide variety of forms. In the main, their theoretical development is based upon the assumption of stationarity of the signals involved, particularly with respect to the second order statistics. On this basis, the well-known normal equations can be formulated. If high-order statistical stationarity is assumed, then the equivalent normal equations involve high-order signal moments. In either case, the cross moments (second or higher) are needed. This renders the adaptive prediction procedure non-blind. A novel procedure for blind adaptive prediction has been proposed, and considerable implementation progress has been made over the past year. The approach is based upon a suitable interpretation of blind equalization methods that satisfy the constant modulus property and offers significant deviations from the standard prediction methods. These blind adaptive algorithms are derived by formulating Lagrange equivalents from mechanisms of constrained optimization. In this report, other new update algorithms are derived from the fundamental concepts of advanced system identification to carry out the proposed blind adaptive prediction. The results of the work can be extended to a number of control-related problems, such as disturbance identification. The basic principles are outlined in this report and differences from other existing methods are discussed. The implemented applications are in speech processing, such as coding and synthesis. Simulations are included to verify the novel modelling method.
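
    The constant-modulus idea can be sketched with the classical CMA(2,2) update, where the tap weights adapt from the output modulus error alone, with no reference signal; this is the textbook algorithm, not the report's Lagrange-based variants, and the channel and step size are illustrative.

```python
# Constant modulus algorithm (CMA 2,2) sketch on a real-valued toy signal.
import numpy as np

rng = np.random.default_rng(0)
N, taps, mu, R2 = 5000, 5, 1e-3, 1.0
s = np.sign(rng.standard_normal(N))        # unit-modulus source
x = np.convolve(s, [1.0, 0.4, -0.2])[:N]   # unknown channel coloring
w = np.zeros(taps); w[0] = 1.0

for n in range(taps, N):
    u = x[n - taps:n][::-1]                # regressor
    y = w @ u
    w -= mu * (y * y - R2) * y * u         # gradient of E[(y^2 - R2)^2]

out2 = [(w @ x[n - taps:n][::-1]) ** 2 for n in range(taps, N)]
print("output modulus spread:", np.std(out2))
```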

  12. Robust moving mesh algorithms for hybrid stretched meshes: Application to moving boundaries problems

    NASA Astrophysics Data System (ADS)

    Landry, Jonathan; Soulaïmani, Azzeddine; Luke, Edward; Ben Haj Ali, Amine

    2016-12-01

    A robust Mesh-Mover Algorithm (MMA) approach is designed to adapt meshes for moving-boundary problems. A new methodology is developed from the best combination of well-known algorithms in order to preserve the quality of initial meshes. In most situations, MMAs distribute mesh deformation while preserving good mesh quality. However, invalid meshes are generated when the motion is complex and/or involves multiple bodies. After studying a few MMA limitations, we propose the following approach: use the Inverse Distance Weighting (IDW) function to produce the displacement field, then apply the Geometric Element Transformation Method (GETMe) smoothing algorithms to improve the resulting mesh quality, and use an untangler to recover inverted (negative-volume) elements. The proposed approach has proven efficient at adapting meshes for various realistic aerodynamic motions: a symmetric wing that has undergone large tip bending and twisting, and the high-lift components of a swept wing moved to different flight stages. Finally, the fluid flow problem has been solved on the moved meshes, producing results close to experimental ones. However, for situations where moving boundaries are too close to each other, further improvements are needed or other approaches, such as an overset grid method, should be taken.
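
    The IDW step of such a pipeline can be sketched in a few lines (2-D toy case; the weighting power and node sets are assumptions, and the GETMe smoothing and untangling stages are not reproduced):

    ```python
    import numpy as np

    def idw_displace(interior, boundary, boundary_disp, p=3.0, eps=1e-12):
        # pairwise distances: shape (n_interior, n_boundary)
        d = np.linalg.norm(interior[:, None, :] - boundary[None, :, :], axis=2)
        w = 1.0 / (d**p + eps)
        w /= w.sum(axis=1, keepdims=True)        # normalise weights per interior node
        return w @ boundary_disp                 # interpolated displacement field

    boundary = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    boundary_disp = np.array([[0.0, 0.0], [0.0, 0.0], [0.1, 0.2], [0.0, 0.0]])
    interior = np.array([[0.5, 0.5], [0.8, 0.9]])
    print(idw_displace(interior, boundary, boundary_disp))
    ```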

  13. Weighted similarity-based clustering of chemical structures and bioactivity data in early drug discovery.

    PubMed

    Perualila-Tan, Nolen Joy; Shkedy, Ziv; Talloen, Willem; Göhlmann, Hinrich W H; Moerbeke, Marijke Van; Kasim, Adetayo

    2016-08-01

    The modern process of discovering candidate molecules in the early drug discovery phase includes a wide range of approaches to extract vital information from the intersection of biology and chemistry. A typical strategy in compound selection involves compound clustering based on chemical similarity to obtain representative, chemically diverse compounds (not incorporating potency information). In this paper, we propose an integrative clustering approach that makes use of both biological (compound efficacy) and chemical (structural features) data sources for the purpose of discovering a subset of compounds with aligned structural and biological properties. The datasets are integrated at the similarity level by assigning complementary weights to produce a weighted similarity matrix, which serves as a generic input to any clustering algorithm. This new analysis workflow is a semi-supervised method since, after the determination of clusters, a secondary analysis is performed to find differentially expressed genes associated with the derived integrated cluster(s), further explaining the compound-induced biological effects inside the cell. In this paper, datasets from two drug development oncology projects are used to illustrate the usefulness of the weighted similarity-based clustering approach for integrating multi-source high-dimensional information to aid drug discovery. Compounds that are structurally and biologically similar to the reference compounds are discovered using this proposed integrative approach.
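
    A minimal sketch of the similarity-level integration (toy fingerprints and bioactivity data; the weight w and the linkage method are assumptions, not the paper's settings):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(2)
    fingerprints = rng.integers(0, 2, (12, 64))   # toy binary structure fingerprints
    bioactivity = rng.standard_normal((12, 30))   # toy compound-by-gene efficacy data

    def cosine_sim(M):
        X = M / np.linalg.norm(M, axis=1, keepdims=True)
        return X @ X.T

    w = 0.5                                       # complementary weights
    S = w * cosine_sim(fingerprints.astype(float)) + (1 - w) * cosine_sim(bioactivity)
    D = squareform(1 - S, checks=False)           # similarity -> condensed distance
    labels = fcluster(linkage(D, method="average"), t=3, criterion="maxclust")
    print(labels)
    ```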

  14. [Visual rehabilitation of patients with large post-traumatic defects of the anterior eye segment through iris-lens diaphragm implantation].

    PubMed

    Khodzhaev, N S; Sobolev, N P; Mushkova, I A; Izmaylova, S B; Karimova, A N

    The diversity of methodological approaches and the lack of pathogenetically reasoned tactics for patients with combined ocular injuries became the basis for the development and systematization of surgical rehabilitation stages for patients in whom post-traumatic cataract is combined with post-traumatic aniridia and corneal scarring. Purpose: to construct a visual rehabilitation approach for patients with post-traumatic defects of the anterior eye segment following optical-reconstructive surgery that involved implantation of an iris-lens diaphragm (ILD). We analyzed 80 reconstructive cases with ILD implantation in patients with post-traumatic aniridia and corneal damage; these patients constituted the first study group (Group 1). We also investigated 58 eyes with residual ametropia and stable visual function 1 year after ILD implantation, before and after laser keratorefractive surgery; these patients were assigned to the second study group (Group 2). The proposed rehabilitation approach for patients after anterior segment injuries achieves high clinical and functional results and reduces the risk of intra- and postoperative complications. The proposed approach for patients after optical-reconstructive surgery with iris-lens diaphragm implantation followed by keratorefractive surgery is an effective method of visual rehabilitation for post-traumatic defects of the anterior eye segment.

  15. Community participation in superfund practice and policy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gossett, L.B.

    1995-12-01

    Superfund has several statutory and regulatory provisions that provide vehicles for community involvement at Superfund sites, including community relations plans, information repositories, public comment periods, and technical assistance grants to community organizations. There has been considerable debate about the effectiveness of these programs. The community participation requirements of the Superfund process are in a state of transition. The proposed Superfund Reform Act of 1994 contained additional community participation provisions. EPA appears to be incorporating some of these proposed changes and improvements learned from prior experiences into its current community relations practices. This study examines the status of community relations in Superfund and the effectiveness of the community information and public participation programs in meeting legislative objectives. In addition to addressing current requirements and practices, the study looks at proposals to amend the community participation provisions as well as alternative approaches used by the EPA, potentially responsible parties, and citizens to address or resolve community concerns. While the focus will be on the overall program, a few brief selected case studies, representing a diversity of experiences, will be included. The resulting paper will discuss successes and shortcomings of community involvement in Superfund. It will address the sometimes competing goals of the various players in the Superfund process, bringing in not only the community perspective, but also concerns for decreased complexity and cost and increased efficiency. The conclusion will evaluate alternatives to improve procedures for community involvement in the Superfund program. Superfund reform, public and stakeholder involvement, and dispute resolution are addressed in this study. These are prominent, contemporary issues as the nation seeks to constructively solve its environmental problems.

  16. Life Support Catalyst Regeneration Using Ionic Liquids and In Situ Resources

    NASA Technical Reports Server (NTRS)

    Abney, Morgan B.; Karr, Laurel J.; Paley, Mark S.; Donovan, David N.; Kramer, Teresa J.

    2016-01-01

    Oxygen recovery from metabolic carbon dioxide is an enabling capability for long-duration manned space flight. Complete recovery of oxygen (100%) involves the production of solid carbon. Catalytic approaches for this purpose, such as Bosch technology, have been limited in trade analyses due in part to the mass penalty for high catalyst resupply caused by carbon fouling of the iron or nickel catalyst. In an effort to mitigate this challenge, several technology approaches have been proposed. These approaches have included methods to prolong the life of the catalysts by increasing the total carbon mass loading per mass catalyst, methods for simplified catalyst introduction and removal to limit the resupply container mass, methods of using in situ resources, and methods to regenerate catalyst material. Research and development into these methods is ongoing, but only use of in situ resources and/or complete regeneration of catalyst material has the potential to entirely eliminate the need for resupply. The use of ionic liquids provides an opportunity to combine these methods in a technology approach designed to eliminate the need for resupply of oxygen recovery catalyst. Here we describe the results of an initial feasibility study using ionic liquids and in situ resources for life support catalyst regeneration, we discuss the key challenges with the approach, and we propose future efforts to advance the technology.

  17. Understanding Information Flow Interaction along Separable Causal Paths in Environmental Signals

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Kumar, P.

    2017-12-01

    Multivariate environmental signals reflect the outcome of complex inter-dependencies, such as those in ecohydrologic systems. Transfer entropy and information partitioning approaches have been used to characterize such dependencies. However, these approaches capture net information flow occurring through a multitude of pathways involved in the interaction, and as a result mask our ability to discern the causal interaction within a subsystem of interest through specific pathways. We build on the momentary information transfer along causal paths proposed by Runge [2015] to develop a framework for quantifying information decomposition along separable causal paths. Momentary information transfer along causal paths captures the amount of information flow between any two variables lagged at two specific points in time. Our approach expands this concept to characterize the causal interaction in terms of synergistic, unique, and redundant information flow through separable causal paths. Multivariate analysis using this novel approach yields a more precise understanding of causality and feedback. We illustrate our approach with synthetic and observed time series data. We believe the proposed framework helps better delineate the internal structure of complex systems in geoscience, where huge volumes of observational data exist, and it will also help the modeling community by providing a new way to look at the complexity of real and modeled systems. Runge, Jakob. "Quantifying information transfer and mediation along causal pathways in complex systems." Physical Review E 92.6 (2015): 062829.
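
    For orientation, a plain discrete transfer-entropy estimate — the net measure that the authors move beyond — can be computed from histogram counts (the binning and the toy coupled series are assumptions; Runge's momentary information transfer itself is not reproduced here):

    ```python
    import numpy as np

    def transfer_entropy(x, y, bins=2):
        # symbolise both series by quantile binning
        xs = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
        ys = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
        yp, yn, xp = ys[:-1], ys[1:], xs[:-1]    # y past, y next, x past
        te = 0.0
        for a in range(bins):
            for b in range(bins):
                for c in range(bins):
                    p_abc = np.mean((yn == a) & (yp == b) & (xp == c))
                    p_bc = np.mean((yp == b) & (xp == c))
                    p_ab = np.mean((yn == a) & (yp == b))
                    p_b = np.mean(yp == b)
                    if min(p_abc, p_bc, p_ab, p_b) > 0:
                        # TE(X->Y) = sum p(yn,yp,xp) log[ p(yn|yp,xp) / p(yn|yp) ]
                        te += p_abc * np.log2(p_abc * p_b / (p_bc * p_ab))
        return te

    rng = np.random.default_rng(3)
    x = rng.standard_normal(5000)
    y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)   # y driven by lagged x
    print(transfer_entropy(x, y), transfer_entropy(y, x)) # expect TE(x->y) >> TE(y->x)
    ```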

  18. Quantitative measurement of binary liquid distributions using multiple-tracer x-ray fluorescence and radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halls, Benjamin R.; Meyer, Terrence R.; Kastengren, Alan L.

    2015-01-01

    The complex geometry and large index-of-refraction gradients that occur near the point of impingement of binary liquid jets present a challenging environment for optical interrogation. A simultaneous quadruple-tracer x-ray fluorescence and line-of-sight radiography technique is proposed as a means of distinguishing and quantifying individual liquid component distributions prior to, during, and after jet impact. Two different pairs of fluorescence tracers are seeded into each liquid stream to maximize their attenuation ratio for reabsorption correction and differentiation of the two fluids during mixing. This approach for instantaneous correction of x-ray fluorescence reabsorption is compared with a more time-intensive approach of using stereographic reconstruction of x-ray attenuation along multiple lines of sight. The proposed methodology addresses the need for a quantitative measurement technique capable of interrogating optically complex, near-field liquid distributions in many mixing systems of practical interest involving two or more liquid streams.

  19. Parkinson's disease as a system-level disorder.

    PubMed

    Caligiore, Daniele; Helmich, Rick C; Hallett, Mark; Moustafa, Ahmed A; Timmermann, Lars; Toni, Ivan; Baldassarre, Gianluca

    2016-01-01

    Traditionally, the basal ganglia have been considered the main brain region implicated in Parkinson's disease. This single-area perspective gives a restricted clinical picture and limits therapeutic approaches because it ignores the influence of altered interactions between the basal ganglia and other cerebral components on Parkinsonian symptoms. In particular, the basal ganglia work closely in concert with the cortex and cerebellum to support motor and cognitive functions. This article proposes a theoretical framework for understanding Parkinson's disease as caused by dysfunction of the entire basal ganglia-cortex-cerebellum system rather than by the basal ganglia in isolation. In particular, building on recent evidence, we propose that the three key symptoms of tremor, freezing, and impairments in action sequencing may be explained by considering partially overlapping neural circuits including basal ganglia, cortical and cerebellar areas. Studying the involvement of this system in Parkinson's disease is a crucial step for devising innovative therapeutic approaches that target the system rather than only the basal ganglia. Possible future therapies based on this different view of the disease are discussed.

  20. Changing Internal Representations of Self and Other: Philosophical Tools for the Attachment-Informed Psychotherapy with Perpetrators and Victims of Violence

    PubMed Central

    Pârvan, Alexandra

    2016-01-01

    Attachment research shows that the formation of unconscious, insecure representations of the self, the other, and self-other relations is linked to the perpetration and receipt of violence. Attachment-focused therapy aims to change these internal schemata into more secure, adaptive representations through therapeutic work addressed to the senses, emotions, and behavior. The paper proposes a new approach to altering the self and other representations in offenders and victims: it involves intellectual reflection on self, will, action and responsibility informed by Augustine's views, facilitated by actual relational experience, and translated into a distinct self-soothing strategy. The reflective-experiential approach can complement existing methods of working with violent or traumatized individuals both within and outside an attachment theory framework. It consists in (1) identifying that a non-reflective non-distinction between self and behavior supports damaging self- and other-representations and interactions, and (2) proposing ways for clients to comprehend and consciously operate with the distinction between self and action. PMID:28936108

  1. Hardware Implementation of a MIMO Decoder Using Matrix Factorization Based Channel Estimation

    NASA Astrophysics Data System (ADS)

    Islam, Mohammad Tariqul; Numan, Mostafa Wasiuddin; Misran, Norbahiah; Ali, Mohd Alauddin Mohd; Singh, Mandeep

    2011-05-01

    This paper presents an efficient hardware realization of a multiple-input multiple-output (MIMO) wireless communication decoder that utilizes the available resources by adopting the technique of parallelism. The hardware is designed and implemented on a Xilinx Virtex™-4 XC4VLX60 field-programmable gate array (FPGA) device in a modular approach which simplifies and eases hardware updates and facilitates independent testing of the various modules. The decoder involves a proficient channel estimation module that employs matrix factorization on least squares (LS) estimation to reduce a full-rank matrix into a simpler form in order to eliminate matrix inversion. This results in performance improvement and complexity reduction of the MIMO system. Performance evaluation of the proposed method is validated through MATLAB simulations, which indicate a 2 dB improvement in terms of SNR compared to LS estimation. Moreover, a complexity comparison is performed in terms of mathematical operations, which shows that the proposed approach appreciably outperforms LS estimation at a lower complexity and represents a good solution for channel estimation.
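
    The inversion-free flavour of LS channel estimation can be illustrated with a QR-based solve (a sketch only; the 2x2 system, pilot length and noise level are assumptions, and the paper's exact factorization scheme may differ):

    ```python
    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(4)
    nt, nr, L = 2, 2, 16                     # tx antennas, rx antennas, pilot length
    P = (rng.integers(0, 2, (nt, L)) * 2 - 1).astype(complex)   # known pilot matrix
    H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
    Y = H @ P + 0.05 * (rng.standard_normal((nr, L)) + 1j * rng.standard_normal((nr, L)))

    # LS solves min ||Y - H P||_F. Instead of H = Y P^H (P P^H)^{-1} with an explicit
    # inverse, factor P^H = Q R and back-substitute on the triangular system.
    Q, R = np.linalg.qr(P.conj().T)          # Q: L x nt, R: nt x nt upper triangular
    H_hat = solve_triangular(R, Q.conj().T @ Y.conj().T).conj().T

    print("max estimation error:", np.abs(H - H_hat).max())
    ```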

  2. Construction of nested maximin designs based on successive local enumeration and modified novel global harmony search algorithm

    NASA Astrophysics Data System (ADS)

    Yi, Jin; Li, Xinyu; Xiao, Mi; Xu, Junnan; Zhang, Lin

    2017-01-01

    Engineering design often involves different types of simulation, which incurs expensive computational costs. Variable-fidelity approximation-based design optimization approaches can achieve effective simulation and efficient optimization of the design space using approximation models with different levels of fidelity, and they have been widely used in different fields. As the foundation of variable-fidelity approximation models, the selection of sample points, called nested designs, is essential. In this article a novel nested maximin Latin hypercube design is constructed based on successive local enumeration and a modified novel global harmony search algorithm. In the proposed nested designs, successive local enumeration is employed to select sample points for a low-fidelity model, whereas the modified novel global harmony search algorithm is employed to select sample points for a high-fidelity model. A comparative study with multiple criteria and an engineering application are employed to verify the efficiency of the proposed nested designs approach.
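
    The maximin criterion at the heart of such designs can be illustrated by a naive random search over Latin hypercubes (illustrative only; the successive-local-enumeration and harmony-search constructions in the paper are far more sophisticated):

    ```python
    import numpy as np

    def lhs(n, d, rng):
        # one random Latin hypercube: each column permutes the strata midpoints
        return (np.array([rng.permutation(n) for _ in range(d)]).T + 0.5) / n

    def min_dist(X):
        # smallest pairwise separation: the quantity a maximin design maximizes
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        return D[np.triu_indices(len(X), 1)].min()

    rng = np.random.default_rng(5)
    best = max((lhs(10, 2, rng) for _ in range(200)), key=min_dist)
    print("maximin separation:", round(min_dist(best), 3))
    ```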

  3. Energy-based operator splitting approach for the time discretization of coupled systems of partial and ordinary differential equations for fluid flows: The Stokes case

    NASA Astrophysics Data System (ADS)

    Carichino, Lucia; Guidoboni, Giovanna; Szopos, Marcela

    2018-07-01

    The goal of this work is to develop a novel splitting approach for the numerical solution of multiscale problems involving the coupling between Stokes equations and ODE systems, as often encountered in blood flow modeling applications. The proposed algorithm relies on a semi-discretization in time via operator splitting, whose design is guided by the rationale of ensuring that the physical energy balance is maintained at the discrete level. As a result, unconditional stability with respect to the time step choice is ensured by the implicit treatment of interface conditions within the Stokes substeps, whereas the coupling between Stokes and ODE substeps is enforced via appropriate initial conditions for each substep. Notably, unconditional stability is attained without the need to subiterate between Stokes and ODE substeps. Stability and convergence properties of the proposed algorithm are tested on three specific examples for which analytical solutions are derived.

  4. Effect of Shock Waves on Dielectric Properties of KDP Crystal

    NASA Astrophysics Data System (ADS)

    Sivakumar, A.; Suresh, S.; Pradeep, J. Anto; Balachandar, S.; Martin Britto Dhas, S. A.

    2018-05-01

    An alternative non-destructive approach is proposed and demonstrated for modifying the electrical properties of crystals using shock waves. The method alters the dielectric properties of a potassium dihydrogen phosphate (KDP) crystal by loading shock waves generated by a table-top shock tube. The experiment involves launching the shock waves perpendicular to the (100) plane of the crystal using a pressure-driven table-top shock tube with Mach number 1.9. The dielectric constant, dielectric loss, permittivity, impedance, AC conductivity, DC conductivity and capacitance as functions of frequency from 1 Hz to 1 MHz are reported for both pre- and post-shock-loaded conditions of the KDP crystal. The experimental results reveal that the dielectric constant of the KDP crystal is sensitive to the shock waves, decreasing from 158 to 147 for the shock-loaded sample. The advantage of the proposed approach is that it offers an alternative to the conventional doping process for tailoring the dielectric properties of this type of crystal.

  5. Forensic identification of resampling operators: A semi non-intrusive approach.

    PubMed

    Cao, Gang; Zhao, Yao; Ni, Rongrong

    2012-03-10

    Recently, several new resampling operators have been proposed that successfully invalidate the existing resampling detectors. However, the reliability of such anti-forensic techniques has not been examined and needs to be investigated. In this paper, we focus on the forensic identification of digital image resampling operators, including both the traditional type and the anti-forensic type that hides the traces of traditional resampling. Various resampling algorithms, including geometric distortion (GD)-based, dual-path-based and postprocessing-based operators, are investigated. The identification is achieved in a semi non-intrusive manner, assuming the resampling software can be accessed. Given an input pattern of monotone signal, the polarity aberration of the GD-based resampled signal's first derivative is analyzed theoretically and measured by an effective feature metric. Dual-path-based and postprocessing-based resampling can also be identified by feeding proper test patterns. Experimental results on various parameter settings demonstrate the effectiveness of the proposed approach. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
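
    A classic periodicity cue exploited by this family of detectors can be demonstrated in a few lines (linear upsampling of a noise signal; this is the generic resampling trace, not the paper's semi non-intrusive procedure):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    sig = rng.standard_normal(400)
    t = np.linspace(0, 399, 600)                 # upsample by a factor of 1.5
    res = np.interp(t, np.arange(400), sig)      # linear-interpolation resampler

    # interpolated samples have periodically reduced second-difference energy,
    # so the squared second derivative shows a strong periodic component
    d2 = np.diff(res, n=2)
    power = np.abs(np.fft.rfft(d2**2 - (d2**2).mean()))
    print("dominant periodic bin:", power[1:].argmax() + 1)   # a sharp nonzero peak
    ```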

  6. A multi-state trajectory method for non-adiabatic dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tao, Guohua, E-mail: taogh@pkusz.edu.cn

    2016-03-07

    A multi-state trajectory approach is proposed to describe nuclear-electron coupled dynamics in nonadiabatic simulations. In this approach, each electronic state is associated with an individual trajectory, among which electronic transitions occur. The set of these individual trajectories constitutes a multi-state trajectory, and nuclear dynamics is described by one of these individual trajectories while the system is in the corresponding state. The total nuclear-electron coupled dynamics is obtained from the ensemble average of the multi-state trajectories. A variety of benchmark systems, such as the spin-boson system, have been tested and the results generated using the quasi-classical version of the method show reasonably good agreement with exact quantum calculations. Featuring a clear multi-state picture, high efficiency, and excellent numerical stability, the proposed method may have advantages when implemented for realistic complex molecular systems, and it could be straightforwardly applied to general nonadiabatic dynamics involving multiple states.

  7. Estimating False Positive Contamination in Crater Annotations from Citizen Science Data

    NASA Astrophysics Data System (ADS)

    Tar, P. D.; Bugiolacchi, R.; Thacker, N. A.; Gilmour, J. D.

    2017-01-01

    Web-based citizen science often involves the classification of image features by large numbers of minimally trained volunteers, such as the identification of lunar impact craters under the Moon Zoo project. Whilst such approaches facilitate the analysis of large image data sets, the inexperience of users and ambiguity in image content can lead to contamination from false positive identifications. We give an approach, using Linear Poisson Models and image template matching, that can quantify levels of false positive contamination in citizen science Moon Zoo crater annotations. Linear Poisson Models are a form of machine learning which supports predictive error modelling and goodness-of-fit testing, unlike most alternative machine learning methods. The proposed supervised learning system can reduce the variability in crater counts whilst providing predictive error assessments of the estimated quantities of remaining true versus false annotations. In an area of research influenced by human subjectivity, the proposed method provides a level of objectivity through the utilisation of image evidence, guided by candidate crater identifications.

  8. Quantitative measurement of binary liquid distributions using multiple-tracer x-ray fluorescence and radiography

    DOE PAGES

    Halls, Benjamin R.; Meyer, Terrence R.; Kastengren, Alan L.

    2015-01-23

    The complex geometry and large index-of-refraction gradients that occur near the point of impingement of binary liquid jets present a challenging environment for optical interrogation. A simultaneous quadruple-tracer x-ray fluorescence and line-of-sight radiography technique is proposed as a means of distinguishing and quantifying individual liquid component distributions prior to, during, and after jet impact. Two different pairs of fluorescence tracers are seeded into each liquid stream to maximize their attenuation ratio for reabsorption correction and differentiation of the two fluids during mixing. This approach for instantaneous correction of x-ray fluorescence reabsorption is compared with a more time-intensive approach of using stereographic reconstruction of x-ray attenuation along multiple lines of sight. The proposed methodology addresses the need for a quantitative measurement technique capable of interrogating optically complex, near-field liquid distributions in many mixing systems of practical interest involving two or more liquid streams.

  9. The Essential Elements of a Risk Governance Framework for Current and Future Nanotechnologies.

    PubMed

    Stone, Vicki; Führ, Martin; Feindt, Peter H; Bouwmeester, Hans; Linkov, Igor; Sabella, Stefania; Murphy, Finbarr; Bizer, Kilian; Tran, Lang; Ågerstrand, Marlene; Fito, Carlos; Andersen, Torben; Anderson, Diana; Bergamaschi, Enrico; Cherrie, John W; Cowan, Sue; Dalemcourt, Jean-Francois; Faure, Michael; Gabbert, Silke; Gajewicz, Agnieszka; Fernandes, Teresa F; Hristozov, Danail; Johnston, Helinor J; Lansdown, Terry C; Linder, Stefan; Marvin, Hans J P; Mullins, Martin; Purnhagen, Kai; Puzyn, Tomasz; Sanchez Jimenez, Araceli; Scott-Fordsmand, Janeck J; Streftaris, George; van Tongeren, Martie; Voelcker, Nicolas H; Voyiatzis, George; Yannopoulos, Spyros N; Poortvliet, P Marijn

    2017-12-14

    Societies worldwide are investing considerable resources into the safe development and use of nanomaterials. Although each of these protective efforts is crucial for governing the risks of nanomaterials, they are insufficient in isolation. What is missing is a more integrative governance approach that goes beyond legislation. Development of this approach must be evidence based and involve key stakeholders to ensure acceptance by end users. The challenge is to develop a framework that coordinates the variety of actors involved in nanotechnology and civil society to facilitate consideration of the complex issues that occur in this rapidly evolving research and development area. Here, we propose three sets of essential elements required to generate an effective risk governance framework for nanomaterials. (1) Advanced tools to facilitate risk-based decision making, including an assessment of the needs of users regarding risk assessment, mitigation, and transfer. (2) An integrated model of predicted human behavior and decision making concerning nanomaterial risks. (3) Legal and other (nano-specific and general) regulatory requirements to ensure compliance and to stimulate proactive approaches to safety. The implementation of such an approach should facilitate and motivate good practice for the various stakeholders to allow the safe and sustainable future development of nanotechnology. © 2017 Society for Risk Analysis.

  10. A short-term operating room surgery scheduling problem integrating multiple nurses roster constraints.

    PubMed

    Xiang, Wei; Yin, Jiao; Lim, Gino

    2015-02-01

    Operating room (OR) surgery scheduling determines each surgery's operation start time and assigns the required resources to each surgery over a schedule period, considering several constraints related to a complete surgery flow and the multiple resources involved. This task plays a decisive role in providing timely treatments for patients while balancing hospital resource utilization. The originality of the present study is to integrate the surgery scheduling problem with real-life nurse roster constraints such as role, specialty, qualification and availability. This article proposes a mathematical model and an ant colony optimization (ACO) approach to efficiently solve such surgery scheduling problems. A modified ACO algorithm with a two-level ant graph model is developed to solve such combinatorial optimization problems because of their computational complexity. The outer ant graph represents surgeries, while the inner graph is a dynamic resource graph. Three types of pheromones, i.e. sequence-related, surgery-related, and resource-related pheromone, fitting the two-level model are defined. The iteration-best and feasible update strategy and local pheromone update rules are adopted to emphasize information related to good solutions in terms of makespan, as well as the balanced utilization of resources. The performance of the proposed ACO algorithm is then evaluated using test cases from (1) published literature data with complete nurse roster constraints, and (2) real data collected from a hospital in China. The scheduling results using the proposed ACO approach are compared with the test cases from both the literature and real-life hospital scheduling. Comparison with the literature shows that the proposed ACO approach achieves (1) a 1.5-h reduction in end time; (2) a reduction in the variation of resources' working time, i.e. 25% for ORs, 50% for nurses in shift 1 and 86% for nurses in shift 2; (3) a 0.25-h reduction in individual maximum overtime (OT); and (4) a 42% reduction in the total OT of nurses. Comparison with the real 10-workday hospital schedule further shows the advantage of the ACO approach in several measurements. Instead of assigning all surgeries by a surgeon to only one OR and the same nurses, as in the traditional manual approach in hospitals, ACO realizes a more balanced surgery arrangement by assigning the surgeries to different ORs and nurses. It eventually shortens the end time within a confidence interval of [7.4%, 24.6%] at the 95% confidence level. The ACO approach proposed in this paper efficiently solves the surgery scheduling problem with daily nurse rosters while providing a shortened end time and relatively balanced resource allocations. It also supports the advantage of integrating surgery scheduling with nurse scheduling and the efficiency of systematic optimization considering a complete three-stage surgery flow and the resources involved. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Utilizing the Structure and Content Information for XML Document Clustering

    NASA Astrophysics Data System (ADS)

    Tran, Tien; Kutty, Sangeetha; Nayak, Richi

    This paper reports on the experiments and results of a clustering approach used in the INEX 2008 document mining challenge. The clustering approach utilizes both the structure and content information of the Wikipedia XML document collection. A latent semantic kernel (LSK) is used to measure the semantic similarity between XML documents based on their content features. The construction of a latent semantic kernel involves computing the singular value decomposition (SVD). On a large feature-space matrix, the computation of the SVD is very expensive in terms of time and memory requirements. Thus, in this clustering approach, the dimension of the document space of the term-document matrix is reduced before performing the SVD. The document-space reduction is based on the common structural information of the Wikipedia XML document collection. The proposed clustering approach has been shown to be effective on the Wikipedia collection in the INEX 2008 document mining challenge.
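
    A minimal latent-semantic-kernel computation looks roughly as follows (toy term-document counts; the latent dimension k is an assumption, and the structure-based dimension reduction the paper applies beforehand is not reproduced):

    ```python
    import numpy as np

    A = np.array([[2, 0, 1, 0],              # toy term-document count matrix
                  [1, 1, 0, 0],
                  [0, 3, 0, 1],
                  [0, 1, 2, 2]], dtype=float)

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = 2                                    # truncated latent dimension
    docs = (np.diag(s[:k]) @ Vt[:k]).T       # documents in the latent space

    docs /= np.linalg.norm(docs, axis=1, keepdims=True)
    print(np.round(docs @ docs.T, 2))        # latent semantic similarity matrix
    ```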

  12. TREATING HEMOGLOBINOPATHIES USING GENE CORRECTION APPROACHES: PROMISES AND CHALLENGES

    PubMed Central

    Cottle, Renee N.; Lee, Ciaran M.; Bao, Gang

    2016-01-01

    Hemoglobinopathies are genetic disorders caused by aberrant hemoglobin expression or structure changes, resulting in severe mortality and health disparities worldwide. Sickle cell disease (SCD) and β-thalassemia, the most common forms of hemoglobinopathies, are typically treated using transfusions and pharmacological agents. Allogeneic hematopoietic stem cell transplantation is the only curative therapy, but has limited clinical applicability. Although gene therapy approaches have been proposed based on the insertion and forced expression of wild-type or anti-sickling β-globin variants, safety concerns may impede their clinical application. A novel curative approach is nuclease-based gene correction, which involves the application of precision genome editing tools to correct the disease-causing mutation. This review describes the development and potential application of gene therapy and precision genome editing approaches for treating SCD and β-thalassemia. The opportunities and challenges in advancing a curative therapy for hemoglobinopathies are also discussed. PMID:27314256

  13. Practical Strategies for Collaboration across Discipline-Based Education Research and the Learning Sciences

    PubMed Central

    Peffer, Melanie; Renken, Maggie

    2016-01-01

    Rather than pursue questions related to learning in biology from separate camps, recent calls highlight the necessity of interdisciplinary research agendas. Interdisciplinary collaborations allow for a complicated and expanded approach to questions about learning within specific science domains, such as biology. Despite its benefits, interdisciplinary work inevitably involves challenges. Some such challenges originate from differences in theoretical and methodological approaches across lines of work. Thus, efforts to develop successful interdisciplinary research programs raise important considerations regarding methodologies for studying biology learning, strategies for approaching collaborations, and the training of early-career scientists. Our goal here is to describe two fields important to understanding learning in biology, discipline-based education research and the learning sciences. We discuss differences between each discipline's approach to biology education research and the benefits and challenges associated with incorporating these perspectives in a single research program. We then propose strategies for building productive interdisciplinary collaboration. PMID:27881446

  14. "Why not stoichiometry" versus "stoichiometry--why not?" Part I: General context.

    PubMed

    Michałowska-Kaczmarczyk, Anna Maria; Asuero, Agustin G; Michałowski, Tadeusz

    2015-01-01

    The elementary concepts involved in stoichiometry are considered from different viewpoints. Some examples of approximate calculations made according to the stoichiometric scheme are indicated, and the correct resolution of the problems involved is presented. The principles of balancing chemical equations, based on their apparent similarities with algebraic equations, are criticized. The review concerns some peculiarities inherent in chemical reaction notation and its use (and abuse) in stoichiometric calculations that provide inconsistent results for various reasons. This "conventional" approach to stoichiometry is put in context with the generalized approach to electrolytic systems (GATES) established by Michałowski. The article contains a number of proposals that could potentially be taken into account and included in the next edition of the Orange Book. The notation of ions used in this article is deliberately not in accordance with current IUPAC requirements. This article is intended to be provocative, in the hope that critical debate around the important topics treated will be generated and creatively expanded in the scientific community.

  15. Arterial Mechanical Motion Estimation Based on a Semi-Rigid Body Deformation Approach

    PubMed Central

    Guzman, Pablo; Hamarneh, Ghassan; Ros, Rafael; Ros, Eduardo

    2014-01-01

    Arterial motion estimation in ultrasound (US) sequences is a hard task due to noise and discontinuities in the signal derived from US artifacts. Characterizing the mechanical properties of the artery is a promising novel imaging technique for diagnosing various cardiovascular pathologies and a new way of obtaining relevant clinical information, such as determining the absence of the dicrotic peak and estimating the Augmentation Index (AIx), the arterial pressure or the arterial stiffness. One advantage of US imaging is the non-invasive nature of the technique, unlike invasive techniques such as intravascular ultrasound (IVUS) or angiography, plus the relatively low cost of US units. In this paper, we propose a semi-rigid deformable method based on soft-body dynamics, realized by a hybrid motion approach based on cross-correlation and optical flow methods, to quantify the elasticity of the artery. We evaluate and compare the different techniques (for instance, optical flow methods) on which our approach is based. The goal of this comparative study is to identify the best model to use and the impact of the accuracy of these different stages on the proposed method. To this end, an exhaustive assessment has been conducted in order to decide which model is the most appropriate for registering the variation of the arterial diameter over time. Our experiments involved a total of 1620 evaluations within nine simulated sequences of 84 frames each and the estimation of four error metrics. We conclude that our proposed approach obtains approximately 2.5 times higher accuracy than conventional state-of-the-art techniques. PMID:24871987
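
    The cross-correlation stage of such a hybrid tracker can be sketched for a single 1-D scan line (synthetic profile and shift; the soft-body and optical-flow components are not reproduced):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    profile = np.convolve(rng.standard_normal(300), np.ones(9) / 9, mode="same")
    shift_true = 4
    frame0 = profile[50:250]
    frame1 = profile[50 - shift_true:250 - shift_true]   # wall moved by 4 samples

    # the correlation peak location gives the displacement between frames
    xc = np.correlate(frame1 - frame1.mean(), frame0 - frame0.mean(), mode="full")
    lag = xc.argmax() - (len(frame0) - 1)
    print("estimated shift:", lag)                        # expect 4
    ```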

  16. A proposed approach for the assessment of chemicals in indirect potable reuse schemes.

    PubMed

    Rodriguez, Clemencia; Weinstein, Philip; Cook, Angus; Devine, Brian; Van Buynder, Paul

    2007-10-01

    The city of Perth in Western Australia is facing a future of compromised water supplies. In recent years, this urban region has been experiencing rapid population growth coupled with a drying climate, which has exacerbated water shortages. As part of the government strategy to secure water sustainability and to address an agenda focused on all elements of the water cycle, a target of 20% reuse of treated wastewater by 2012 was established. This includes a feasibility review of managed aquifer recharge for indirect potable reuse. A characterization of contaminants in wastewater after treatment and an assessment of the health implications are necessary to reassure both regulators and the public. To date, the commonly used approach involves a comparison of measured contaminant concentrations with established drinking-water standards or other toxicological guidelines for the protection of human health. However, guidelines and standards have not been established for many contaminants in recycled water (unregulated chemicals). This article presents a three-tiered approach for the preliminary health risk assessment of chemicals in order to determine key contaminants that need to be monitored and managed. The proposed benchmark values for the calculation of risk quotients are health based, systematically defined, scientifically defensible, easy to apply, and clear to interpret. The proposed methodology is based on the derivation of health-based levels for unregulated contaminants with toxicity information and a "threshold of toxicological concern" for unregulated contaminants without toxicity data. The application of this approach will help policymakers set guidelines regarding unregulated chemicals in recycled water.
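
    The screening arithmetic behind a risk quotient is simple enough to show directly (all concentrations and benchmark values below are invented for illustration):

    ```python
    # risk quotient RQ = measured concentration / health-based benchmark;
    # RQ >= 1 flags a chemical for monitoring and management
    measured_ng_per_L = {"chemical_A": 120.0, "chemical_B": 3.0}
    benchmark_ng_per_L = {"chemical_A": 100.0, "chemical_B": 400.0}

    for chem, conc in measured_ng_per_L.items():
        rq = conc / benchmark_ng_per_L[chem]
        status = "prioritise for monitoring" if rq >= 1 else "low concern"
        print(f"{chem}: RQ = {rq:.2f} -> {status}")
    ```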

  17. A dynamic programming approach for the alignment of signal peaks in multiple gas chromatography-mass spectrometry experiments.

    PubMed

    Robinson, Mark D; De Souza, David P; Keen, Woon Wai; Saunders, Eleanor C; McConville, Malcolm J; Speed, Terence P; Likić, Vladimir A

    2007-10-29

    Gas chromatography-mass spectrometry (GC-MS) is a robust platform for the profiling of certain classes of small molecules in biological samples. When multiple samples are profiled, including replicates of the same sample and/or different sample states, one needs to account for retention time drifts between experiments. This can be achieved either by the alignment of chromatographic profiles prior to peak detection, or by matching signal peaks after they have been extracted from chromatogram data matrices. Automated retention time correction is particularly important in non-targeted profiling studies. A new approach for matching signal peaks based on dynamic programming is presented. The proposed approach relies on both peak retention times and mass spectra. The alignment of more than two peak lists involves three steps: (1) all possible pairs of peak lists are aligned, and the similarity of each pair of peak lists is estimated; (2) a guide tree is built based on the similarity between the peak lists; (3) peak lists are progressively aligned, starting with the two most similar peak lists and following the guide tree until all peak lists are exhausted. When two or more experiments are performed on different sample states, each consisting of multiple replicates, peak lists within each set of replicate experiments are aligned first (within-state alignment), and subsequently the resulting alignments are aligned themselves (between-state alignment). When more than two sets of replicate experiments are present, the between-state alignment also employs the guide tree. We demonstrate the usefulness of this approach on GC-MS metabolic profiling experiments acquired on wild-type and mutant Leishmania mexicana parasites. We propose a progressive method to match signal peaks across multiple GC-MS experiments based on dynamic programming. A sensitive peak similarity function is proposed to balance peak retention time and peak mass spectra similarities. This approach can produce the optimal alignment between an arbitrary number of peak lists, and models within-state and between-state peak alignment explicitly. The accuracy of the proposed method was close to the accuracy of manually-curated peak matching, which required tens of man-hours for the analyzed data sets. The proposed approach may offer significant advantages for processing of high-throughput metabolomics data, especially when large numbers of experimental replicates and multiple sample states are analyzed.
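
    A pairwise version of the described alignment can be written as a Needleman-Wunsch-style dynamic program (the similarity weighting, retention-time scale, and gap penalty are assumptions; the paper extends this pairwise step progressively along the guide tree):

    ```python
    import numpy as np

    def peak_sim(p, q, rt_scale=10.0, w=0.5):
        # blend retention-time proximity with spectral cosine similarity
        rt = np.exp(-abs(p["rt"] - q["rt"]) / rt_scale)
        spec = p["spec"] @ q["spec"] / (np.linalg.norm(p["spec"]) * np.linalg.norm(q["spec"]))
        return w * rt + (1 - w) * spec

    def align(A, B, gap=0.2):
        n, m = len(A), len(B)
        F = np.zeros((n + 1, m + 1))
        F[:, 0] = -gap * np.arange(n + 1)
        F[0, :] = -gap * np.arange(m + 1)
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                F[i, j] = max(F[i-1, j-1] + peak_sim(A[i-1], B[j-1]),
                              F[i-1, j] - gap, F[i, j-1] - gap)
        pairs, i, j = [], n, m                 # traceback of matched peak pairs
        while i > 0 and j > 0:
            if np.isclose(F[i, j], F[i-1, j-1] + peak_sim(A[i-1], B[j-1])):
                pairs.append((i-1, j-1)); i, j = i-1, j-1
            elif np.isclose(F[i, j], F[i-1, j] - gap):
                i -= 1
            else:
                j -= 1
        return pairs[::-1]

    rng = np.random.default_rng(8)
    specs = rng.standard_normal((3, 20))
    A = [{"rt": 100 + 30*k, "spec": specs[k]} for k in range(3)]
    B = [{"rt": 103 + 30*k, "spec": specs[k] + 0.05*rng.standard_normal(20)} for k in range(3)]
    print(align(A, B))                         # expect [(0, 0), (1, 1), (2, 2)]
    ```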

  18. Simple approach in understanding interzeolite transformations using ring building units

    NASA Astrophysics Data System (ADS)

    Suhendar, D.; Buchari; Mukti, R. R.; Ismunandar

    2018-04-01

    Recently, two general approaches have been used to understand interzeolite transformations: a thermodynamic one, represented by framework density (FD), and a kinetic one, represented by structural building units. Two types of structural building units are composite building units (CBUs) and secondary building units (SBUs). This study aims to examine these approaches using interzeolite transformation data available in the literature and to propose a possible alternative. In a number of cases of zeolite transformation, the FD and CBU approaches are not suitable. The FD approach fails in cases involving parent zeolites that have moderate or high FDs, while the CBU approach fails because of the unavailability of CBUs in parent zeolites compared with the CBUs in their transformation products. The SBU approach is the most likely to fit because SBUs are units with the basic form of ring structures, closer to the state and shape of the oligomeric fragments present in zeolite synthesis or dissolution. Thus, a new approach can be considered for understanding interzeolite transformations, namely the ring building unit (RBU) approach. The advantage of the RBU approach is that RBUs can be easily derived from all framework types, whereas in the SBU approach there are several types of frameworks that cannot be expressed in SBU form.

  19. Linear SFM: A hierarchical approach to solving structure-from-motion problems by decoupling the linear and nonlinear components

    NASA Astrophysics Data System (ADS)

    Zhao, Liang; Huang, Shoudong; Dissanayake, Gamini

    2018-07-01

    This paper presents a novel hierarchical approach to solving structure-from-motion (SFM) problems. The algorithm begins with small local reconstructions based on nonlinear bundle adjustment (BA). These are then joined in a hierarchical manner using a strategy that requires solving a linear least squares optimization problem followed by a nonlinear transform. The algorithm can handle ordered monocular and stereo image sequences. Two stereo images or three monocular images are adequate for building each initial reconstruction. The bulk of the computation involves solving a linear least squares problem and, therefore, the proposed algorithm avoids three major issues associated with most of the nonlinear optimization algorithms currently used for SFM: the need for a reasonably accurate initial estimate, the need for iterations, and the possibility of being trapped in a local minimum. Also, by summarizing all the original observations into the small local reconstructions with associated information matrices, the proposed Linear SFM manages to preserve all the information contained in the observations. The paper also demonstrates that the proposed problem formulation results in a sparse structure that leads to an efficient numerical implementation. The experimental results using publicly available datasets show that the proposed algorithm yields solutions that are very close to those obtained using a global BA starting with an accurate initial estimate. The C/C++ source code of the proposed algorithm is publicly available at https://github.com/LiangZhaoPKUImperial/LinearSFM.

  20. Gender trouble: The World Health Organization, the International Statistical Classification of Diseases and Related Health Problems (ICD)-11 and the trans kids.

    PubMed

    Winter, Sam

    2017-10-01

    The World Health Organization (WHO) is revising its diagnostic manual, the International Statistical Classification of Diseases and Related Health Problems (ICD). At the time of writing, and based on recommendations from its ICD Working Group on Sexual Disorders and Sexual Health, WHO is proposing a new ICD chapter titled Conditions Related to Sexual Health, and that the gender incongruence diagnoses (replacements for the gender identity disorder diagnoses used in ICD-10) should be placed in that chapter. WHO is proposing that there should be a Gender incongruence of childhood (GIC) diagnosis for children below the age of puberty. This last proposal has come under fire. Trans community groups, as well as many healthcare professionals and others working for transgender health and wellbeing, have criticised the proposal on the grounds that the pathologisation of gender diversity at such a young age is inappropriate, unnecessary, harmful and inconsistent with WHO's approach in regard to other aspects of development in childhood and youth. Counter proposals have been offered that do not pathologise gender diversity and instead make use of Z codes to frame and document any contacts that young gender diverse children may have with health services. The author draws on his involvement in the ICD revision process, both as a member of the aforementioned WHO Working Group and as one of its critics, to put the case against the GIC proposal, and to recommend an alternative approach for ICD in addressing the needs of gender diverse children.

  1. An LMI approach for the Integral Sliding Mode and H∞ State Feedback Control Problem

    NASA Astrophysics Data System (ADS)

    Bezzaoucha, Souad; Henry, David

    2015-11-01

    This paper deals with the state feedback control problem for linear uncertain systems subject to both matched and unmatched perturbations. The proposed control law is based on the Integral Sliding Mode Control (ISMC) approach to tackle matched perturbations, as well as the H∞ paradigm for robustness against unmatched perturbations. The proposed method parallels the work presented in [1], which addressed the same problem and proposed a solution involving an Algebraic Riccati Equation (ARE)-based formulation. The contribution of this paper is the establishment of a Linear Matrix Inequality (LMI)-based solution, which offers the possibility to consider other types of constraints such as 𝓓-stability constraints (pole assignment-like constraints). The proposed methodology is applied to a pilot three-tank system and experimental results illustrate its feasibility. Note that real experiments with SMC have rarely been reported in the past, owing to the highly energetic behaviour of the control signal. It is important to outline that the paper does not aim at proposing an LMI formulation of an ARE. This has been done since 1971 [2] and is further discussed in [3], where the link between AREs and ARIs (algebraic Riccati inequalities) is established for the H∞ control problem. The main contribution of this paper is to establish an adequate LMI-based methodology (changes of matrix variables) so that the ARE corresponding to the particular structure of the mixed ISMC/H∞ design proposed in [1] can be re-formulated within the LMI paradigm.
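
    The flavour of an LMI-based design can be shown with a small feasibility problem in CVXPY: a Lyapunov inequality with a decay-rate (𝓓-stability-like) constraint. The system matrix and rate below are invented, and this is not the paper's full ISMC/H∞ synthesis:

    ```python
    import numpy as np
    import cvxpy as cp

    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])               # a stable test system (poles -1, -2)
    alpha = 0.5                                # required decay rate

    P = cp.Variable((2, 2), symmetric=True)
    constraints = [P >> np.eye(2),             # P positive definite (scaled)
                   A.T @ P + P @ A + 2 * alpha * P << -1e-6 * np.eye(2)]
    prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
    prob.solve()
    print(prob.status, np.round(P.value, 3))
    ```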

  2. Infant joint attention, neural networks and social cognition.

    PubMed

    Mundy, Peter; Jarrold, William

    2010-01-01

    Neural network models of attention can provide a unifying approach to the study of human cognitive and emotional development (Posner & Rothbart, 2007). In this paper we argue that a neural network approach to the infant development of joint attention can inform our understanding of the nature of human social learning, symbolic thought process and social cognition. At its most basic, joint attention involves the capacity to coordinate one's own visual attention with that of another person. We propose that joint attention development involves increments in the capacity to engage in simultaneous or parallel processing of information about one's own attention and the attention of other people. Infant practice with joint attention is both a consequence and an organizer of the development of a distributed and integrated brain network involving frontal and parietal cortical systems. This executive distributed network first serves to regulate the capacity of infants to respond to and direct the overt behavior of other people in order to share experience with others through the social coordination of visual attention. In this paper we describe this parallel and distributed neural network model of joint attention development and discuss two hypotheses that stem from this model. One is that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life. We also propose that with development, joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations. As this occurs the executive joint attention network makes vital contributions to the development of human symbolic thinking and social cognition. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. Prediction-Correction Algorithms for Time-Varying Constrained Optimization

    DOE PAGES

    Simonetto, Andrea; Dall'Anese, Emiliano

    2017-07-26

    This article develops online algorithms to track solutions of time-varying constrained optimization problems. Particularly, resembling workhorse Kalman filtering-based approaches for dynamical systems, the proposed methods involve prediction-correction steps to provably track the trajectory of the optimal solutions of time-varying convex problems. The merits of existing prediction-correction methods have been shown for unconstrained problems and for setups where computing the inverse of the Hessian of the cost function is computationally affordable. This paper addresses the limitations of existing methods by tackling constrained problems and by designing first-order prediction steps that rely on the Hessian of the cost function (and do not require the computation of its inverse). In addition, the proposed methods are shown to improve the convergence speed of existing prediction-correction methods when applied to unconstrained problems. Numerical simulations corroborate the analytical results and showcase the performance and benefits of the proposed algorithms. A realistic application of the proposed method to real-time control of energy resources is presented.
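
    A one-dimensional sketch of a prediction-correction tracker: for f_t(x) = 0.5 (x - sin t)^2 the generic prediction step x ← x - H⁻¹ ∇ₜₓf · dt reduces to x ← x + cos(t) · dt, since H = 1 and ∇ₜₓf = -cos(t). The step sizes and target trajectory are assumptions, not the article's experiments:

    ```python
    import numpy as np

    dt, gamma = 0.1, 0.5
    r = lambda t: np.sin(t)                    # moving optimum x*(t) = sin(t)
    x, errs = 0.0, []
    for k in range(200):
        t = k * dt
        x = x + dt * np.cos(t)                 # prediction: follow the problem drift
        x = x - gamma * (x - r(t + dt))        # correction: gradient step on f_{t+dt}
        errs.append(abs(x - r(t + dt)))
    print("mean tracking error:", np.round(np.mean(errs[20:]), 4))
    ```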

  4. [Cooperative learning for improving healthy housing conditions in Bogota: a case study].

    PubMed

    Torres-Parra, Camilo A; García-Ubaque, Juan C; García-Ubaque, César A

    2014-01-01

    This was a community-based effort at constructing an educational proposal oriented towards self-empowerment, aimed at improving the target population's sanitary, housing and living conditions through cooperative learning. A constructivist approach was adopted, based on a programme called "Habitat community manager". The project involved working with fifteen families living in the Mochuelo Bajo barrio in Ciudad Bolívar in Bogotá, Colombia, identifying the sanitary aspects most relevant to improving their homes and proposing a methodology and organisation for an educational proposal. Twenty-one epidemiological indicators related to poor housing were identified, which formed the basis for defining specific problems and establishing a methodology for designing the educational proposal. The course which emerged from the cooperative learning experience was designed to promote the community's skills and health education, aimed at improving households' living conditions and ensuring a healthy environment which would allow them to develop an immediate habitat ensuring their own welfare and dignity.

  6. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions that are computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short burn-in period during which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning. This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
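
    The probability-map feedback can be sketched abstractly (the MPS simulator and the flow-data acceptance test are replaced by stand-ins here; only the averaging-and-guidance mechanism is illustrated):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    accepted = []                              # burn-in: unconditional samples that pass the test
    for _ in range(20):
        sample = rng.random((10, 10)) < 0.3    # stand-in for an MPS facies realisation
        if sample.mean() > 0.25:               # stand-in for the flow-mismatch acceptance test
            accepted.append(sample)

    prob_map = np.mean(accepted, axis=0)       # facies probability per grid block
    guided = rng.random((10, 10)) < prob_map   # later proposals biased by the map
    print(np.round(prob_map[:3, :3], 2))
    ```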

  7. Telerobotic Surgery: An Intelligent Systems Approach to Mitigate the Adverse Effects of Communication Delay. Chapter 4

    NASA Technical Reports Server (NTRS)

    Cardullo, Frank M.; Lewis, Harold W., III; Panfilov, Peter B.

    2007-01-01

    An extremely innovative approach is presented: the surgeon operates through a real-time simulator enhanced with an intelligent controller component to improve the safety and efficiency of a remotely conducted operation. The use of a simulator enables the surgeon to operate in a virtual environment free from the impediments of telecommunication delay. The simulator functions as a predictor, and periodically the simulator state is corrected with truth data. Three major research areas must be explored in order to achieve the objectives: simulator as predictor, image processing, and intelligent control. Each is equally necessary for the success of the project, and each involves a significant intelligent component. These are diverse, interdisciplinary areas of investigation, thereby requiring a highly coordinated effort by all members of the team to ensure an integrated system. The following is a brief discussion of those areas. Simulator as a predictor: The delays encountered in remote robotic surgery will be greater than any encountered in human-machine systems analysis, with the possible exception of remote operations in space. Therefore, novel compensation techniques will be developed, including the real-time simulator, which is at the heart of our approach. The simulator will present real-time, stereoscopic images and artificial haptic stimuli to the surgeon. Image processing: Because of the delay and the possibility of insufficient bandwidth, a high level of novel image processing is necessary. This image processing will include several innovative aspects, including image interpretation, video-to-graphical conversion, texture extraction, geometric processing, image compression, and image generation at the surgeon station. Intelligent control: Since the approach we propose is, in a sense, predictor based (albeit a very sophisticated predictor), a controller that not only optimizes the end-effector trajectory but also avoids error is essential. We propose to investigate two different approaches to the controller design. One approach employs an optimal controller based on modern control theory; the other involves soft computing techniques, i.e., fuzzy logic, neural networks, genetic algorithms, and hybrids of these.

  8. A multi-objective approach to solid waste management.

    PubMed

    Galante, Giacomo; Aiello, Giuseppe; Enea, Mario; Panascia, Enrico

    2010-01-01

    The issue addressed in this paper is the localization and dimensioning of transfer stations, which constitute a necessary intermediate level in the logistic chain of the solid waste stream from municipalities to the incinerator. Contextually, the determination of the number and type of vehicles involved is carried out in an integrated optimization approach. The model considers both the initial investment and the operating costs related to transportation and transfer stations. Two conflicting objectives are evaluated: the minimization of total cost and the minimization of environmental impact, measured by pollution. The design of the integrated waste management system is hence approached in a multi-objective optimization framework. To determine the best compromise, goal programming, weighted sum and fuzzy multi-objective techniques have been employed. The proposed analysis highlights how different attitudes of the decision maker towards the logic and structure of the problem result in different methodologies being employed and different results being obtained. The novel aspect of the paper lies in the proposal of an effective decision support system for operative waste management, rather than a further contribution to the transportation problem. The model was applied to the waste management of the optimal territorial ambit (OTA) of Palermo (Italy). 2010 Elsevier Ltd. All rights reserved.

  9. A multi-objective approach to solid waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galante, Giacomo, E-mail: galante@dtpm.unipa.i; Aiello, Giuseppe; Enea, Mario

    2010-08-15

    The issue addressed in this paper is the localization and dimensioning of transfer stations, which constitute a necessary intermediate level in the logistic chain of the solid waste stream from municipalities to the incinerator. Contextually, the determination of the number and type of vehicles involved is carried out in an integrated optimization approach. The model considers both the initial investment and the operating costs related to transportation and transfer stations. Two conflicting objectives are evaluated: the minimization of total cost and the minimization of environmental impact, measured by pollution. The design of the integrated waste management system is hence approached in a multi-objective optimization framework. To determine the best compromise, goal programming, weighted sum and fuzzy multi-objective techniques have been employed. The proposed analysis highlights how different attitudes of the decision maker towards the logic and structure of the problem result in different methodologies being employed and different results being obtained. The novel aspect of the paper lies in the proposal of an effective decision support system for operative waste management, rather than a further contribution to the transportation problem. The model was applied to the waste management of the optimal territorial ambit (OTA) of Palermo (Italy).

  10. Validating the simulation of large-scale parallel applications using statistical characteristics

    DOE PAGES

    Zhang, Deli; Wilke, Jeremiah; Hendry, Gilbert; ...

    2016-03-01

    Simulation is a widely adopted method to analyze and predict the performance of large-scale parallel applications. Validating the hardware model is highly important for complex simulations with a large number of parameters. Common practice involves calculating the percent error between the projected and the real execution time of a benchmark program. However, in a high-dimensional parameter space, this coarse-grained approach often suffers from parameter insensitivity, which may not be known a priori. Moreover, the traditional approach cannot be applied to the validation of software models, such as application skeletons used in online simulations. In this work, we present a methodology and a toolset for validating both hardware and software models by quantitatively comparing fine-grained statistical characteristics obtained from execution traces. Although statistical information has been used in tasks like performance optimization, this is the first attempt to apply it to simulation validation. Lastly, our experimental results show that the proposed evaluation approach offers significant improvement in fidelity when compared to evaluation using total execution time, and the proposed metrics serve as reliable criteria that progress toward automating the simulation tuning process.
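
    The following sketch illustrates the core idea with synthetic numbers: two simulated traces can match the real total execution time equally well while only one reproduces the fine-grained timing distribution, which a two-sample Kolmogorov-Smirnov statistic exposes. The gamma distributions are invented stand-ins for per-event timings extracted from execution traces.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical per-event timings: the "real" run and two candidate
# simulations whose total times are nearly identical by construction.
real = rng.gamma(shape=2.0, scale=1.0, size=5000)
sim_good = rng.gamma(shape=2.0, scale=1.0, size=5000)
sim_bad = rng.gamma(shape=8.0, scale=0.25, size=5000)  # same mean, wrong shape

for name, sim in [("good", sim_good), ("bad", sim_bad)]:
    total_err = abs(sim.sum() - real.sum()) / real.sum()   # coarse metric
    ks = stats.ks_2samp(real, sim).statistic               # fine-grained metric
    print(f"{name}: total-time error {total_err:.1%}, KS statistic {ks:.3f}")
```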

  11. Insights into Global Health Practice from the Agile Software Development Movement

    PubMed Central

    Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter

    2016-01-01

    Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of ‘agile global health’ and reflect on the limitations, trade-offs, and implications of this approach. PMID:27134081

  12. Solving a layout design problem by analytic hierarchy process (AHP) and data envelopment analysis (DEA) approach

    NASA Astrophysics Data System (ADS)

    Tuzkaya, Umut R.; Eser, Arzum; Argon, Goner

    2004-02-01

    Today, growing amounts of waste due to the fast consumption rate of products have started an irreversible process of environmental pollution and damage. A considerable part of this waste is packaging material. With the realization of this fact, various waste policies have taken important steps. Here we consider a firm for which waste aluminum constitutes the majority of raw materials. In order to achieve a profitable recycling process, the plant layout should be well designed. In this study, we propose a two-step approach involving the Analytic Hierarchy Process (AHP) and Data Envelopment Analysis (DEA) to solve facility layout design problems. A case example is considered to demonstrate the results achieved.
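
    A minimal sketch of the AHP half of such a two-step approach: criterion weights are derived as the principal eigenvector of a pairwise comparison matrix, together with Saaty's consistency ratio. The 3x3 matrix and the criteria are invented; in the paper's setting, weights like these would feed the DEA evaluation of layout alternatives.

```python
import numpy as np

# Pairwise comparison matrix (hypothetical): flow distance vs. adjacency
# vs. cost, on Saaty's 1-9 scale, with reciprocal entries below the diagonal.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalized priority weights

# Consistency ratio CR = CI / RI, with Saaty's random index RI = 0.58 for n = 3.
ci = (eigvals[k].real - 3) / (3 - 1)
print("weights:", np.round(w, 3), " consistency ratio:", round(ci / 0.58, 3))
```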

  13. Loop Variables in String Theory

    NASA Astrophysics Data System (ADS)

    Sathiapalan, B.

    The loop variable approach is a proposal for a gauge-invariant generalization of the sigma-model renormalization group method of obtaining equations of motion in string theory. The basic guiding principle is space-time gauge invariance rather than world sheet properties. In essence it is a version of Wilson's exact renormalization group equation for the world sheet theory. It involves all the massive modes and is defined with a finite world sheet cutoff, which allows one to go off the mass-shell. On shell the tree amplitudes of string theory are reproduced. The equations are gauge-invariant off shell also. This paper is a self-contained discussion of the loop variable approach as well as its connection with the Wilsonian RG.

  14. Basic research in evolution and ecology enhances forensics.

    PubMed

    Tomberlin, Jeffery K; Benbow, M Eric; Tarone, Aaron M; Mohr, Rachel M

    2011-02-01

    In 2009, the National Research Council recommended that the forensic sciences strengthen their grounding in basic empirical research to mitigate against criticism and improve accuracy and reliability. For DNA-based identification, this goal was achieved under the guidance of the population genetics community. This effort resulted in DNA analysis becoming the 'gold standard' of the forensic sciences. Elsewhere, we proposed a framework for streamlining research in decomposition ecology, which promotes quantitative approaches to collecting and applying data to forensic investigations involving decomposing human remains. To extend the ecological aspects of this approach, this review focuses on forensic entomology, although the framework can be extended to other areas of decomposition. Published by Elsevier Ltd.

  15. AGM: A DSL for mobile cloud computing based on directed graph

    NASA Astrophysics Data System (ADS)

    Tanković, Nikola; Grbac, Tihana Galinac

    2016-06-01

    This paper summarizes a novel approach for consuming a domain specific language (DSL) by transforming it to a directed graph representation persisted by a graph database. Using such a specialized database enables advanced navigation through the stored model, exposing only relevant subsets of meta-data to the different services and components involved. We applied this approach in a mobile cloud computing system and used it to model several mobile applications in the retail, supply chain management and merchandising domains. These applications are distributed in a Software-as-a-Service (SaaS) fashion and used by thousands of customers in Croatia. We report on lessons learned and propose further research on this topic.
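
    The graph-navigation idea can be sketched with an in-memory directed graph; a real deployment would use a graph database, and the model elements below are invented for illustration.

```python
import networkx as nx

# A hypothetical mobile-app DSL model persisted as a directed graph:
# nodes are model elements, edges carry the relationship kind.
g = nx.DiGraph()
g.add_edge("App", "ScreenList", kind="contains")
g.add_edge("App", "ScreenDetail", kind="contains")
g.add_edge("ScreenList", "ProductQuery", kind="binds")
g.add_edge("ScreenDetail", "ProductQuery", kind="binds")
g.add_edge("ProductQuery", "ProductTable", kind="reads")

# A rendering service only needs what is reachable from its screen,
# i.e. a relevant subset of the meta-data rather than the whole model.
relevant = nx.descendants(g, "ScreenList") | {"ScreenList"}
print(sorted(relevant))
```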

  16. Robustness of Flexible Systems With Component-Level Uncertainties

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    2000-01-01

    Robustness of flexible systems in the presence of model uncertainties at the component level is considered. Specifically, an approach for formulating robustness of flexible systems in the presence of frequency and damping uncertainties at the component level is presented. The synthesis of the components is based on a modification of a controls-based algorithm for component mode synthesis. The formulation deals first with robustness of synthesized flexible systems. It is then extended to deal with global (non-synthesized) dynamic models with component-level uncertainties by projecting uncertainties from the component level to the system level. A numerical example involving a two-dimensional simulated docking problem is worked out to demonstrate the feasibility of the proposed approach.

  17. Insights into Global Health Practice from the Agile Software Development Movement.

    PubMed

    Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter

    2016-01-01

    Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of 'agile global health' and reflect on the limitations, trade-offs, and implications of this approach.

  18. An approach to optimal semi-active control of vibration energy harvesting based on MEMS

    NASA Astrophysics Data System (ADS)

    Rojas, Rafael A.; Carcaterra, Antonio

    2018-07-01

    In this paper the energy harvesting problem involving typical MEMS technology is reduced to an optimal control problem, where the objective function is the absorption of the maximum amount of energy in a given time interval from a vibrating environment. The interest here is to identify a physical upper bound for this energy storage. The mathematical tool is an optimal control technique known as Krotov's method, which has not yet been applied to engineering problems except in quantum dynamics. This approach leads to new maximum bounds on energy harvesting performance. Novel MEMS-based device control configurations for vibration energy harvesting are proposed, with particular emphasis on piezoelectric, electromagnetic and capacitive circuits.

  19. Comparison of two correlated ROC curves at a given specificity or sensitivity level.

    PubMed

    Bantis, Leonidas E; Feng, Ziding

    2016-10-30

    The receiver operating characteristic (ROC) curve is the most popular statistical tool for evaluating the discriminatory capability of a given continuous biomarker. The need to compare two correlated ROC curves arises when individuals are measured with two biomarkers, which induces paired and thus correlated measurements. Many researchers have focused on comparing two correlated ROC curves in terms of the area under the curve (AUC), which summarizes the overall performance of the marker. However, particular values of specificity may be of interest. We focus on comparing two correlated ROC curves at a given specificity level. We propose parametric approaches, transformations to normality, and nonparametric kernel-based approaches. Our methods can be straightforwardly extended for inference in terms of ROC⁻¹(t). This is of particular interest for comparing the accuracy of two correlated biomarkers at a given sensitivity level. Extensions also involve inference for the AUC and accommodating covariates. We evaluate the robustness of our techniques through simulations, compare them with other known approaches, and present a real-data application involving prostate cancer screening. Copyright © 2016 John Wiley & Sons, Ltd.
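
    As a simple illustration of the comparison, the sketch below estimates each marker's sensitivity at 90% specificity under a binormal model and builds a paired-bootstrap confidence interval for the difference, resampling subjects so that the correlation between markers is preserved. The data are synthetic, and the binormal-plus-bootstrap recipe is one simple instance of the parametric approaches the paper studies, not the authors' exact estimator.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Paired measurements of two biomarkers on the same subjects (synthetic).
n_h, n_d = 200, 200
healthy = rng.multivariate_normal([0, 0], [[1, .5], [.5, 1]], n_h)
diseased = rng.multivariate_normal([1.5, 1.0], [[1, .5], [.5, 1]], n_d)

def sens_at_spec(h, d, spec=0.90):
    """Binormal sensitivity of one marker at a fixed specificity."""
    c = h.mean() + h.std(ddof=1) * norm.ppf(spec)     # decision threshold
    return norm.cdf((d.mean() - c) / d.std(ddof=1))

obs = sens_at_spec(healthy[:, 0], diseased[:, 0]) - \
      sens_at_spec(healthy[:, 1], diseased[:, 1])

# Paired bootstrap: resampling subjects keeps the marker correlation intact.
boot = []
for _ in range(2000):
    ih = rng.integers(0, n_h, n_h)
    jd = rng.integers(0, n_d, n_d)
    boot.append(sens_at_spec(healthy[ih, 0], diseased[jd, 0])
                - sens_at_spec(healthy[ih, 1], diseased[jd, 1]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"difference {obs:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```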

  20. Qualitative evaluation: A critical and interpretative complementary approach to improve health programs and services

    PubMed Central

    Tayabas, Luz María Tejada; León, Teresita Castillo; Espino, Joel Monarrez

    2014-01-01

    This short essay aims at commenting on the origin, development, rationale, and main characteristics of qualitative evaluation (QE), emphasizing the value of this methodological tool to evaluate health programs and services. During the past decades, different approaches have come to light proposing complementary alternatives to appraise the performance of public health programs, mainly focusing on the implementation process involved rather than on measuring the impact of such actions. QE is an alternative tool that can be used to illustrate and understand the process faced when executing health programs. It can also lead to useful suggestions to modify its implementation from the stakeholders’ perspectives, as it uses a qualitative approach that considers participants as reflective subjects, generators of meanings. This implies that beneficiaries become involved in an active manner in the evaluated phenomena with the aim of improving the health programs or services that they receive. With this work we want to encourage evaluators in the field of public health to consider the use of QE as a complementary tool for program evaluation to be able to identify areas of opportunity to improve programs’ implementation processes from the perspective of intended beneficiaries. PMID:25152220

  1. Transient modeling/analysis of hyperbolic heat conduction problems employing mixed implicit-explicit alpha method

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; D'Costa, Joseph F.

    1991-01-01

    This paper describes the evaluation of mixed implicit-explicit finite element formulations for hyperbolic heat conduction problems involving non-Fourier effects. In particular, mixed implicit-explicit formulations employing the alpha method proposed by Hughes et al. (1987, 1990) are described for the numerical simulation of hyperbolic heat conduction models, which involve time-dependent relaxation effects. Existing analytical approaches for the modeling/analysis of such models involve complex mathematical formulations for obtaining closed-form solutions, while in certain numerical formulations the difficulties include severe oscillatory solution behavior (which often disguises the true response) in the vicinity of the thermal disturbances, which propagate with finite velocities. In view of these factors, the alpha method is evaluated to assess the control of the amount of numerical dissipation for predicting the transient propagating thermal disturbances. Numerical test models are presented, and pertinent conclusions are drawn for the mixed-time integration simulation of hyperbolic heat conduction models involving non-Fourier effects.

  2. Dynamic SPECT reconstruction from few projections: a sparsity enforced matrix factorization approach

    NASA Astrophysics Data System (ADS)

    Ding, Qiaoqiao; Zan, Yunlong; Huang, Qiu; Zhang, Xiaoqun

    2015-02-01

    The reconstruction of dynamic images from few projection data is a challenging problem, especially when noise is present and when the dynamic images vary rapidly. In this paper, we propose a variational model, sparsity enforced matrix factorization (SEMF), based on low-rank matrix factorization of unknown images and enforced sparsity constraints for representing both coefficients and bases. The proposed model is solved via an alternating iterative scheme, for which each subproblem is convex and is handled with the efficient alternating direction method of multipliers (ADMM). The convergence of the overall alternating scheme for the nonconvex problem relies upon the Kurdyka-Łojasiewicz property, recently studied by Attouch et al (2010 Math. Oper. Res. 35 438) and Attouch et al (2013 Math. Program. 137 91). Finally, our proof-of-concept simulation on 2D dynamic images shows the advantage of the proposed method compared to conventional methods.
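
    A stripped-down cousin of such a model can be written in a few lines: alternating proximal-gradient steps on a low-rank factorization X ≈ AB with an L1 (soft-thresholding) penalty on both factors. This toy uses plain proximal steps rather than the paper's ADMM subproblem solvers, and the synthetic matrix stands in for a dynamic image sequence.

```python
import numpy as np

rng = np.random.default_rng(3)

def soft(x, lam):                       # soft-thresholding: the L1 proximal map
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Synthetic "dynamic image" matrix: 400 pixels x 60 time frames, rank 2 + noise.
U = np.abs(rng.normal(size=(400, 2)))
V = np.abs(rng.normal(size=(2, 60)))
X = U @ V + 0.05 * rng.normal(size=(400, 60))

r, lam = 2, 0.01
A = rng.normal(size=(400, r))
B = rng.normal(size=(r, 60))
for it in range(200):
    # Proximal gradient step on A with B fixed (step = 1/Lipschitz), then on B.
    A = soft(A - ((A @ B - X) @ B.T) / (np.linalg.norm(B, 2) ** 2 + 1e-12), lam)
    B = soft(B - (A.T @ (A @ B - X)) / (np.linalg.norm(A, 2) ** 2 + 1e-12), lam)

print("relative fit error:", np.linalg.norm(A @ B - X) / np.linalg.norm(X))
```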

  3. Tracking by Identification Using Computer Vision and Radio

    PubMed Central

    Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez

    2013-01-01

    We present a novel system for detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds: excellent computer-vision-based localization and strong identity information provided by the radio system. It is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for the evaluation of systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time successfully preventing the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485

  4. Singular spectrum decomposition of Bouligand-Minkowski fractal descriptors: an application to the classification of texture images

    NASA Astrophysics Data System (ADS)

    Florindo, João Batista

    2018-04-01

    This work proposes the use of Singular Spectrum Analysis (SSA) for the classification of texture images, more specifically, to enhance the performance of the Bouligand-Minkowski fractal descriptors in this task. Fractal descriptors are known to be a powerful approach for modeling and, in particular, identifying complex patterns in natural images. Nevertheless, the multiscale analysis involved in those descriptors makes them highly correlated. Although other attempts to address this point were proposed in the literature, none of them investigated the relation between the fractal correlation and the well-established analyses employed for time series, of which SSA is one of the most powerful techniques. The proposed method was employed for the classification of benchmark texture images, and the results were compared with other state-of-the-art classifiers, confirming the potential of this analysis in image classification.
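
    For readers unfamiliar with SSA, the following self-contained sketch shows the basic pipeline: embed the series into a trajectory matrix, truncate its SVD, and reconstruct by diagonal (Hankel) averaging. It is applied here to a noisy sine for illustration rather than to fractal descriptor vectors.

```python
import numpy as np

def ssa_components(x, L, k):
    """Basic Singular Spectrum Analysis: embed, decompose, reconstruct.
    Returns the series rebuilt from the k leading singular triplets."""
    N = len(x)
    K = N - L + 1
    traj = np.column_stack([x[i:i + L] for i in range(K)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    approx = (U[:, :k] * s[:k]) @ Vt[:k]                    # rank-k truncation
    # Diagonal (Hankel) averaging maps the matrix back to a series.
    rec = np.zeros(N)
    cnt = np.zeros(N)
    for j in range(K):
        rec[j:j + L] += approx[:, j]
        cnt[j:j + L] += 1
    return rec / cnt

t = np.linspace(0, 6 * np.pi, 300)
noisy = np.sin(t) + 0.4 * np.random.default_rng(4).normal(size=t.size)
smooth = ssa_components(noisy, L=60, k=2)   # two triplets capture the sine
print("residual std:", np.round((noisy - smooth).std(), 3))
```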

  5. Simple and practical approach for computing the ray Hessian matrix in geometrical optics.

    PubMed

    Lin, Psang Dain

    2018-02-01

    A method is proposed for simplifying the computation of the ray Hessian matrix in geometrical optics by replacing the angular variables in the system variable vector with their equivalent cosine and sine functions. The variable vector of a boundary surface is similarly defined in such a way as to exclude any angular variables. It is shown that the proposed formulations reduce the computation time of the Hessian matrix by around 10 times compared to the previous method reported by the current group in Advanced Geometrical Optics (2016). Notably, the method proposed in this study involves only polynomial differentiation, i.e., trigonometric function calls are not required. As a consequence, the computation complexity is significantly reduced. Five illustrative examples are given. The first three examples show that the proposed method is applicable to the determination of the Hessian matrix for any pose matrix, irrespective of the order in which the rotation and translation motions are specified. The last two examples demonstrate the use of the proposed Hessian matrix in determining the axial and lateral chromatic aberrations of a typical optical system.

  6. Circular Regression in a Dual-Phase Lock-In Amplifier for Coherent Detection of Weak Signal

    PubMed Central

    Wang, Gaoxuan; Reboul, Serge; Fertein, Eric

    2017-01-01

    Lock-in amplification (LIA) is an effective approach for the recovery of weak signals buried in noise. Determination of the input signal amplitude in a classical dual-phase LIA is based on incoherent detection, which leads to a biased estimate at low signal-to-noise ratio. This article presents, for the first time to our knowledge, a new LIA architecture involving phase estimation with a linear-circular regression for coherent detection. The proposed phase delay estimate, between the input signal and a reference, is defined as the maximum likelihood of a set of observations distributed according to a von Mises distribution. In our implementation this maximum is obtained with a Newton-Raphson algorithm. We show that the proposed LIA architecture provides an unbiased estimate of the input signal amplitude. Simulations with synthetic data demonstrate that the classical LIA estimates are biased for an input SNR lower than −20 dB, while the proposed LIA is able to accurately recover the weak signal amplitude. The novel approach is applied to an optical sensor for accurate measurement of NO2 concentrations at the sub-ppbv level in the atmosphere. Side-by-side intercomparison measurements with a commercial LIA (SR830, Stanford Research Inc., Sunnyvale, CA, USA) demonstrate that the proposed LIA has identical performance in terms of measurement accuracy and precision but with a simplified hardware architecture. PMID:29135951
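
    For context, the classical dual-phase architecture that the paper improves on can be simulated in a few lines: multiply the noisy input by quadrature references, low-pass by averaging, and form the incoherent amplitude estimate. At the very low SNR chosen below the amplitude comes out biased high, which is precisely the defect the proposed coherent detection addresses. All signal parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

fs, f0, T = 10_000.0, 137.0, 2.0           # sample rate, reference freq, duration
t = np.arange(0.0, T, 1.0 / fs)
A_true, phi_true = 0.01, 0.7               # weak signal, deep below the noise
x = A_true * np.sin(2 * np.pi * f0 * t + phi_true) + rng.normal(0.0, 1.0, t.size)

# Dual-phase demodulation: multiply by quadrature references, average (low-pass).
I = 2 * np.mean(x * np.sin(2 * np.pi * f0 * t))   # in-phase: A*cos(phi) + noise
Q = 2 * np.mean(x * np.cos(2 * np.pi * f0 * t))   # quadrature: A*sin(phi) + noise

amp = np.hypot(I, Q)        # incoherent estimate; biased high at very low SNR,
phase = np.arctan2(Q, I)    # which is what motivates the coherent approach
print(f"estimated A = {amp:.4f} (true {A_true}), phase = {phase:.2f} rad")
```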

  7. The exponentiated Hencky energy: anisotropic extension and case studies

    NASA Astrophysics Data System (ADS)

    Schröder, Jörg; von Hoegen, Markus; Neff, Patrizio

    2017-10-01

    In this paper we propose an anisotropic extension of the isotropic exponentiated Hencky energy, based on logarithmic strain invariants. Unlike other elastic formulations, the isotropic exponentiated Hencky elastic energy has been derived solely on differential geometric grounds, involving the geodesic distance of the deformation gradient F to the group of rotations. We formally extend this approach towards anisotropy by defining additional anisotropic logarithmic strain invariants with the help of suitable structural tensors and consider our findings for selected case studies.

  8. Sinker tectonics - An approach to the surface of Miranda

    NASA Technical Reports Server (NTRS)

    Janes, D. M.; Melosh, H. J.

    1988-01-01

    Two of the proposed explanations for the coronae seen on Miranda involve mantle convection driven by density anomalies. In the sinker model, the coronae result from late-accreting large silicate bodies slowly sinking through an icy mantle toward the body's center; in the riser model, they result from a compositionally produced, low-density, rising diapir. The present study determines the surface stresses induced by such density anomalies and the expected surface expressions. The results are in good agreement with the predictions of the sinker model.

  9. Synthesis of Methylenebicyclo[3.2.1]octanol by a Sm(II)-Induced 1,2-Rearrangement Reaction with Ring Expansion of Methylenebicyclo[4.2.0]octanone.

    PubMed

    Takatori, Kazuhiko; Ota, Shoya; Tendo, Kenta; Matsunaga, Kazuma; Nagasawa, Kokoro; Watanabe, Shinya; Kishida, Atsushi; Kogen, Hiroshi; Nagaoka, Hiroto

    2017-07-21

    Direct conversion of methylenebicyclo[4.2.0]octanone to methylenebicyclo[3.2.1]octanol by a Sm(II)-induced 1,2-rearrangement with ring expansion of the methylenecyclobutane is described. Three conditions were optimized to allow the adaptation of this approach to various substrates. A rearrangement mechanism is proposed involving the generation of a ketyl radical and cyclopentanation by ketyl-olefin cyclization, followed by radical fragmentation and subsequent protonation.

  10. A Robust Geometric Model for Argument Classification

    NASA Astrophysics Data System (ADS)

    Giannone, Cristina; Croce, Danilo; Basili, Roberto; de Cao, Diego

    Argument classification is the task of assigning semantic roles to syntactic structures in natural language sentences. Supervised learning techniques for frame semantics have been recently shown to benefit from rich sets of syntactic features. However argument classification is also highly dependent on the semantics of the involved lexicals. Empirical studies have shown that domain dependence of lexical information causes large performance drops in outside domain tests. In this paper a distributional approach is proposed to improve the robustness of the learning model against out-of-domain lexical phenomena.

  11. E-Services quality assessment framework for collaborative networks

    NASA Astrophysics Data System (ADS)

    Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian

    2015-08-01

    In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is the need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for quality assessment of e-Service composition for CNs which comprises of a quality model for e-Service evaluation and guidelines for quality of e-Service composition process. We implemented a prototype considering a simplified telemedicine use case which involves a CN in e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.

  12. Least-squares deconvolution of evoked potentials and sequence optimization for multiple stimuli under low-jitter conditions.

    PubMed

    Bardy, Fabrice; Dillon, Harvey; Van Dun, Bram

    2014-04-01

    Rapid presentation of stimuli in an evoked response paradigm can lead to overlap of multiple responses and consequently to difficulties in interpreting waveform morphology. This paper presents a deconvolution method allowing overlapping responses to be disentangled. The deconvolution technique uses a least-squares error approach. A methodology is proposed to optimize the stimulus sequence associated with the deconvolution technique under low-jitter conditions; it controls the condition number of the matrices involved in recovering the responses. Simulations were performed using the proposed deconvolution technique. Multiple overlapping responses can be recovered perfectly in noiseless conditions. In the presence of noise, the amount of error introduced by the technique can be controlled a priori by the condition number of the matrix associated with the stimulus sequence used. The simulation results indicate the need for a minimum amount of jitter, as well as a sufficient number of overlap combinations, to obtain optimum results. An aperiodic model is recommended to improve reconstruction. We propose a deconvolution technique allowing multiple overlapping responses to be extracted, together with a method of choosing the stimulus sequence optimal for response recovery. This technique may allow audiologists, psychologists, and electrophysiologists to optimize their experimental designs involving rapidly presented stimuli, and to recover overlapping evoked responses. Copyright © 2013 International Federation of Clinical Neurophysiology. All rights reserved.
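
    The least-squares recovery step can be sketched directly: stack a shifted identity block into the mixing matrix M for each stimulus onset, solve y = Mr in the least-squares sense, and inspect the condition number of M as the a priori measure of error amplification. Onset jitter, response shape, and noise level below are all invented.

```python
import numpy as np

rng = np.random.default_rng(6)

r_len, n_stim = 60, 80                             # response samples, stimuli
onsets = np.cumsum(rng.integers(20, 35, n_stim))   # jittered onset asynchronies
n_tot = onsets[-1] + r_len

# Unknown single-trial response (half-sine with decay) and the mixing matrix M:
# each stimulus contributes one shifted copy of the response to the recording.
true_r = np.sin(np.linspace(0, np.pi, r_len)) * np.exp(-np.linspace(0, 3, r_len))
M = np.zeros((n_tot, r_len))
for o in onsets:
    M[o:o + r_len, :] += np.eye(r_len)
y = M @ true_r + 0.5 * rng.normal(size=n_tot)      # overlapped, noisy recording

r_hat, *_ = np.linalg.lstsq(M, y, rcond=None)      # least-squares deconvolution
cond = np.linalg.cond(M)                           # a priori error amplification
err = np.linalg.norm(r_hat - true_r) / np.linalg.norm(true_r)
print(f"condition number {cond:.1f}, relative recovery error {err:.3f}")
```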

  13. On the Modeling of Shells in Multibody Dynamics

    NASA Technical Reports Server (NTRS)

    Bauchau, Olivier A.; Choi, Jou-Young; Bottasso, Carlo L.

    2000-01-01

    Energy preserving/decaying schemes are presented for the simulation of nonlinear multibody systems involving shell components. The proposed schemes are designed to meet four specific requirements: unconditional nonlinear stability of the scheme, a rigorous treatment of both geometric and material nonlinearities, exact satisfaction of the constraints, and the presence of high-frequency numerical dissipation. The kinematic nonlinearities associated with arbitrarily large displacements and rotations of shells are treated in a rigorous manner, and the material nonlinearities can be handled when the constitutive laws stem from the existence of a strain energy density function. The efficiency and robustness of the proposed approach are illustrated with specific numerical examples that also demonstrate the need for integration schemes possessing high-frequency numerical dissipation.

  14. Improved Collaborative Filtering Algorithm via Information Transformation

    NASA Astrophysics Data System (ADS)

    Liu, Jian-Guo; Wang, Bing-Hong; Guo, Qiang

    In this paper, we propose a spreading activation approach for collaborative filtering (SA-CF). By using the opinion spreading process, the similarity between any users can be obtained. The algorithm has remarkably higher accuracy than the standard collaborative filtering using the Pearson correlation. Furthermore, we introduce a free parameter β to regulate the contributions of objects to user-user correlations. The numerical results indicate that decreasing the influence of popular objects can further improve the algorithmic accuracy and personality. We argue that a better algorithm should simultaneously require less computation and generate higher accuracy. Accordingly, we further propose an algorithm involving only the top-N similar neighbors for each target user, which has both less computational complexity and higher algorithmic accuracy.
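
    The top-N neighbor idea in the final algorithm can be illustrated with a small user-based predictor, using cosine similarity in place of the paper's spreading-activation similarity; the rating matrix is random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy user-item rating matrix (0 = unrated), standing in for real data.
R = rng.integers(1, 6, size=(50, 40)).astype(float)
R[rng.random(R.shape) < 0.6] = 0.0

def predict(R, user, item, top_n=5):
    """Predict a rating from the top-N most similar users who rated the item."""
    cand = np.where(R[:, item] > 0)[0]
    cand = cand[cand != user]
    if cand.size == 0:
        return R[R > 0].mean()               # global fallback
    sims = np.array([R[user] @ R[c] /
                     (np.linalg.norm(R[user]) * np.linalg.norm(R[c]) + 1e-12)
                     for c in cand])          # cosine similarity
    order = np.argsort(sims)[-top_n:]         # keep only the top-N neighbors
    w, keep = sims[order], cand[order]
    return float(R[keep, item] @ w / (w.sum() + 1e-12))

print("predicted rating:", round(predict(R, user=0, item=3), 2))
```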

  15. Nonlinear robust control of hypersonic aircrafts with interactions between flight dynamics and propulsion systems.

    PubMed

    Li, Zhaoying; Zhou, Wenjie; Liu, Hao

    2016-09-01

    This paper addresses the nonlinear robust tracking controller design problem for hypersonic vehicles. This problem is challenging due to strong coupling between the aerodynamics and the propulsion system, and the uncertainties involved in the vehicle dynamics including parametric uncertainties, unmodeled model uncertainties, and external disturbances. By utilizing the feedback linearization technique, a linear tracking error system is established with prescribed references. For the linear model, a robust controller is proposed based on the signal compensation theory to guarantee that the tracking error dynamics is robustly stable. Numerical simulation results are given to show the advantages of the proposed nonlinear robust control method, compared to the robust loop-shaping control approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Robust H∞ tracking control of boiler-turbine systems.

    PubMed

    Wu, J; Nguang, S K; Shen, J; Liu, G; Li, Y G

    2010-07-01

    In this paper, the problem of designing a fuzzy H∞ state feedback tracking control for a boiler-turbine is solved. First, the Takagi-Sugeno fuzzy model is used to model a boiler-turbine system. Next, based on the Takagi-Sugeno fuzzy model, sufficient conditions for the existence of a fuzzy H∞ nonlinear state feedback tracking control are derived in terms of linear matrix inequalities. The advantage of the proposed tracking control design is that it involves neither a feedback linearization technique nor a complicated adaptive scheme. An industrial boiler-turbine system is used to illustrate the effectiveness of the proposed design as compared with a linearized approach. 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  17. The colloquial approach: An active learning technique

    NASA Astrophysics Data System (ADS)

    Arce, Pedro

    1994-09-01

    This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.

  18. Basal ganglia circuit loops, dopamine and motivation: A review and enquiry

    PubMed Central

    Ikemoto, Satoshi; Yang, Chen; Tan, Aaron

    2015-01-01

    Dopamine neurons located in the midbrain play a role in motivation that regulates approach behavior (approach motivation). In addition, activation and inactivation of dopamine neurons regulate mood and induce reward and aversion, respectively. Accumulating evidence suggests that such motivational role of dopamine neurons is not limited to those located in the ventral tegmental area, but also in the substantia nigra. The present paper reviews previous rodent work concerning dopamine’s role in approach motivation and the connectivity of dopamine neurons, and proposes two working models: One concerns the relationship between extracellular dopamine concentration and approach motivation. High, moderate and low concentrations of extracellular dopamine induce euphoric, seeking and aversive states, respectively. The other concerns circuit loops involving the cerebral cortex, basal ganglia, thalamus, epithalamus, and midbrain through which dopaminergic activity alters approach motivation. These models should help to generate hypothesis-driven research and provide insights for understanding altered states associated with drugs of abuse and affective disorders. PMID:25907747

  19. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  20. Description and Nomenclature of Neisseria meningitidis Capsule Locus

    PubMed Central

    Claus, Heike; Jiang, Ying; Bennett, Julia S.; Bratcher, Holly B.; Jolley, Keith A.; Corton, Craig; Care, Rory; Poolman, Jan T.; Zollinger, Wendell D.; Frasch, Carl E.; Stephens, David S.; Feavers, Ian; Frosch, Matthias; Parkhill, Julian; Vogel, Ulrich; Quail, Michael A.; Bentley, Stephen D.; Maiden, Martin C.J.

    2013-01-01

    Pathogenic Neisseria meningitidis isolates contain a polysaccharide capsule that is the main virulence determinant for this bacterium. Thirteen capsular polysaccharides have been described, and nuclear magnetic resonance spectroscopy has enabled determination of the structure of capsular polysaccharides responsible for serogroup specificity. Molecular mechanisms involved in N. meningitidis capsule biosynthesis have also been identified, and genes involved in this process and in cell surface translocation are clustered at a single chromosomal locus termed cps. The use of multiple names for some of the genes involved in capsule synthesis, combined with the need for rapid diagnosis of serogroups commonly associated with invasive meningococcal disease, prompted a requirement for a consistent approach to the nomenclature of capsule genes. In this report, a comprehensive description of all N. meningitidis serogroups is provided, along with a proposed nomenclature, which was presented at the 2012 XVIIIth International Pathogenic Neisseria Conference. PMID:23628376

  1. Peer Interventions to Promote Health: Conceptual Considerations

    PubMed Central

    Simoni, Jane M.; Franks, Julie C.; Lehavot, Keren; Yard, Samantha S.

    2013-01-01

    Peers have intervened to promote health since ancient times, yet few attempts have been made to describe theoretically their role and their interventions. After a brief overview of the history and variety of peer-based health interventions, a 4-part definition of peer interveners is presented here with a consideration of the dimensions of their involvement in health promotion. Then, a 2-step process is proposed as a means of conceptualizing peer interventions to promote health. Step 1 involves establishing a theoretical framework for the intervention’s main focus (i.e., education, social support, social norms, self-efficacy, and patient advocacy), and Step 2 involves identifying a theory that justifies the use of peers and might explain their impact. As examples, the following might be referred to: theoretical perspectives from the mutual support group and self-help literature, social cognitive and social learning theories, the social support literature, social comparison theory, social network approaches, and empowerment models. PMID:21729015

  2. Training set expansion: an approach to improving the reconstruction of biological networks from limited and uneven reliable interactions

    PubMed Central

    Yip, Kevin Y.; Gerstein, Mark

    2009-01-01

    Motivation: An important problem in systems biology is reconstructing complete networks of interactions between biological objects by extrapolating from a few known interactions as examples. While there are many computational techniques proposed for this network reconstruction task, their accuracy is consistently limited by the small number of high-confidence examples, and the uneven distribution of these examples across the potential interaction space, with some objects having many known interactions and others few. Results: To address this issue, we propose two computational methods based on the concept of training set expansion. They work particularly effectively in conjunction with kernel approaches, which are a popular class of approaches for fusing together many disparate types of features. Both our methods are based on semi-supervised learning and involve augmenting the limited number of gold-standard training instances with carefully chosen and highly confident auxiliary examples. The first method, prediction propagation, propagates highly confident predictions of one local model to another as the auxiliary examples, thus learning from information-rich regions of the training network to help predict the information-poor regions. The second method, kernel initialization, takes the most similar and most dissimilar objects of each object in a global kernel as the auxiliary examples. Using several sets of experimentally verified protein–protein interactions from yeast, we show that training set expansion gives a measurable performance gain over a number of representative, state-of-the-art network reconstruction methods, and it can correctly identify some interactions that are ranked low by other methods due to the lack of training examples of the involved proteins. Contact: mark.gerstein@yale.edu Availability: The datasets and additional materials can be found at http://networks.gersteinlab.org/tse. PMID:19015141
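
    The flavor of prediction propagation can be conveyed with a generic self-training loop: fit on the small gold standard, then repeatedly absorb the model's most confident predictions on unlabeled examples as auxiliary training instances. This sketch uses a plain logistic regression on synthetic data rather than the paper's local kernel models on interaction networks.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for an interaction dataset: 30 gold-standard examples,
# the rest unlabeled from the model's point of view.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
labeled = np.arange(30)
unlabeled = np.arange(30, 600)

Xl, yl = X[labeled], y[labeled]
clf = LogisticRegression(max_iter=1000).fit(Xl, yl)
for _ in range(5):                          # training set expansion rounds
    proba = clf.predict_proba(X[unlabeled])
    conf = proba.max(axis=1)
    pick = unlabeled[conf > 0.95]           # highly confident auxiliary examples
    if pick.size == 0:
        break
    Xl = np.vstack([Xl, X[pick]])
    yl = np.concatenate([yl, clf.predict(X[pick])])
    unlabeled = np.setdiff1d(unlabeled, pick)
    clf = LogisticRegression(max_iter=1000).fit(Xl, yl)

# Score against the true labels that were never shown to the model.
print("accuracy on held-back labels:", round(clf.score(X[30:], y[30:]), 3))
```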

  3. A new measure for gene expression biclustering based on non-parametric correlation.

    PubMed

    Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja

    2013-12-01

    Biclustering, an emerging technique for analyzing DNA microarray data, searches for subsets of genes and conditions that are coherently expressed. These subgroups provide clues about the main biological processes. Different approaches to this problem have been proposed. Most use the mean squared residue as a quality measure, but this measure cannot detect relevant and interesting patterns such as shifting or scaling patterns. Furthermore, recent papers show that there exist coherence patterns involved in different kinds of cancer and tumors, such as inverse relationships between genes, which it also fails to capture. The proposed measure, called Spearman's biclustering measure (SBM), estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed using an evolutionary technique called estimation of distribution algorithms, which uses SBM as its fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process involved quality indexes, a set of reference bicluster patterns including new patterns, and a set of statistical tests. Performance was also examined on real microarrays, comparing against algorithmic approaches such as Bimax, CC, OPSM, Plaid and xMotifs. SBM offers several advantages, such as the ability to recognize more complex coherence patterns (shifting, scaling and inversion) and the capability to selectively marginalize genes and conditions depending on statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
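
    A measure in the same spirit, though not the authors' exact SBM formula, can be computed as the average absolute pairwise Spearman correlation among the genes of a bicluster, which rewards shifting, scaling, and inverted patterns alike:

```python
import numpy as np
from scipy.stats import spearmanr

def sbm_like(bic):
    """Average absolute pairwise Spearman correlation among the genes (rows)
    of a bicluster; |rho| near 1 rewards shifting, scaling and inverted
    patterns alike, which mean-squared-residue-based measures miss."""
    rho, _ = spearmanr(bic, axis=1)           # each row is one gene's profile
    rho = np.atleast_2d(rho)
    iu = np.triu_indices(bic.shape[0], k=1)
    return np.abs(rho[iu]).mean()

base = np.arange(8, dtype=float)
coherent = np.vstack([base + 2, 3 * base, -base + 10])  # shift, scale, inversion
noise = np.random.default_rng(8).normal(size=(3, 8))
print("coherent:", round(sbm_like(coherent), 2), "noise:", round(sbm_like(noise), 2))
```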

  4. Planum Sphenoidale and Tuberculum Sellae Meningiomas: Operative Nuances of a Modern Surgical Technique with Outcome and Proposal of a New Classification System.

    PubMed

    Mortazavi, Martin M; Brito da Silva, Harley; Ferreira, Manuel; Barber, Jason K; Pridgeon, James S; Sekhar, Laligam N

    2016-02-01

    The resection of planum sphenoidale and tuberculum sellae meningiomas is challenging. A universally accepted classification system predicting surgical risk and outcome is still lacking. We report a modern surgical technique specific for planum sphenoidale and tuberculum sellae meningiomas with associated outcome. A new classification system that can guide the surgical approach and may predict surgical risk is proposed. We conducted a retrospective review of the patients who, between 2005 and March 2015, underwent a craniotomy or endoscopic surgery for the resection of meningiomas involving the suprasellar region. Operative nuances of a modified frontotemporal craniotomy and orbital osteotomy technique for meningioma removal and reconstruction are described. Twenty-seven patients were found to have tumors arising mainly from the planum sphenoidale or the tuberculum sellae; 25 underwent frontotemporal craniotomy and tumor removal with orbital osteotomy and bilateral optic canal decompression, and 2 patients underwent endonasal transsphenoidal resection. The most common presenting symptom was visual disturbance (77%). Vision improved in 90% of those who presented with visual decline, and there was no permanent visual deterioration. Cerebrospinal fluid leak occurred in one of the 25 cranial cases (4%) and in 1 of 2 transsphenoidal cases (50%), and in both cases it resolved with treatment. There was no surgical mortality. An orbitotomy and early decompression of the involved optic canal are important for achieving gross total resection, maximizing visual improvement, and avoiding recurrence. The visual outcomes were excellent. A new classification system that can allow the comparison of different series and approaches and indicate cases that are more suitable for an endoscopic transsphenoidal approach is presented. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Illusion of control: the role of personal involvement.

    PubMed

    Yarritu, Ion; Matute, Helena; Vadillo, Miguel A

    2014-01-01

    The illusion of control consists of overestimating the influence that our behavior exerts over uncontrollable outcomes. Available evidence suggests that an important factor in the development of this illusion is the personal involvement of participants who are trying to obtain the outcome. The dominant view assumes that this is due to social motivations and self-esteem protection. We propose that this may be due to a bias in contingency detection which occurs when the probability of the action (i.e., of the potential cause) is high. Indeed, personal involvement might often have been confounded with the probability of acting, as participants who are more involved tend to act more frequently than those for whom the outcome is irrelevant and who therefore become mere observers. We tested these two variables separately. In two experiments, the outcome was always uncontrollable and we used a yoked design in which the participants of one condition were actively involved in obtaining it and the participants in the other condition observed the adventitious cause-effect pairs. The results support the latter approach: those acting more often to obtain the outcome developed stronger illusions, and so did their yoked counterparts.

  6. An experimental approach to the fundamental principles of hemodynamics.

    PubMed

    Pontiga, Francisco; Gaytán, Susana P

    2005-09-01

    An experimental model has been developed to give students hands-on experience with the fundamental laws of hemodynamics. The proposed experimental setup is of simple construction but permits precise measurement of the physical variables involved in the experiments. The model consists of a series of experiments in which different basic phenomena are quantitatively investigated, such as the pressure drop in a long straight vessel and in an obstructed vessel, the transition from laminar to turbulent flow, the association of vessels in vascular networks, or the generation of a critical stenosis. Through these experiments, students acquire a direct appreciation of the importance of the parameters involved in the relationship between pressure and flow rate, thus facilitating the comprehension of more complex problems in hemodynamics.
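
    Worked numbers for the straight-vessel experiment make the laws concrete: the Hagen-Poiseuille pressure drop dP = 8*mu*L*Q/(pi*r^4) and the Reynolds number Re = rho*v*d/mu that signals the laminar-turbulent transition. The dimensions below are plausible bench values, not the authors' apparatus.

```python
import numpy as np

mu = 1.0e-3          # Pa*s, water-like dynamic viscosity
rho = 1000.0         # kg/m^3, fluid density
L = 0.50             # m, tube length
r = 2.0e-3           # m, tube radius
Q = 2.0e-6           # m^3/s, volumetric flow (2 mL/s)

dP = 8 * mu * L * Q / (np.pi * r ** 4)   # Hagen-Poiseuille (laminar) drop
v = Q / (np.pi * r ** 2)                 # mean velocity
Re = rho * v * (2 * r) / mu              # Reynolds number; ~2300 = transition
print(f"dP = {dP:.1f} Pa, Re = {Re:.0f} ({'laminar' if Re < 2300 else 'turbulent'})")
```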

  7. Performance Optimizing Adaptive Control with Time-Varying Reference Model Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Hashemi, Kelley E.

    2017-01-01

    This paper presents a new adaptive control approach that involves a performance optimization objective. The control synthesis involves the design of a performance optimizing adaptive controller from a subset of control inputs. The resulting effect of the performance optimizing adaptive controller is to modify the initial reference model into a time-varying reference model which satisfies the performance optimization requirement obtained from an optimal control problem. The time-varying reference model modification is accomplished by the real-time solutions of the time-varying Riccati and Sylvester equations coupled with the least-squares parameter estimation of the sensitivities of the performance metric. The effectiveness of the proposed method is demonstrated by an application of maneuver load alleviation control for a flexible aircraft.

  8. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
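
    For contrast with the broader framework, the conventional cross-product procedure the paper moves beyond looks like this in practice (synthetic data; the moderation shows up as the coefficient on the x*m term):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)

# Simulate a response whose x-effect depends on the moderator m.
n = 300
x = rng.normal(size=n)
m = rng.normal(size=n)
y = 1.0 + 0.5 * x + 0.2 * m + 0.4 * x * m + rng.normal(size=n)

# Regress y on x, m and the cross-product, then test the interaction term.
X = sm.add_constant(np.column_stack([x, m, x * m]))
fit = sm.OLS(y, X).fit()
print(fit.params.round(2))                # const, x, m, x*m
print("interaction p-value:", round(fit.pvalues[3], 4))
```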

  9. Cost reduction in space operations - Structuring a planetary program to minimize the annual funding requirement as opposed to minimizing the program runout cost

    NASA Technical Reports Server (NTRS)

    Herman, D. H.; Niehoff, J. C.; Spadoni, D. J.

    1980-01-01

    An approach is proposed for structuring a planetary mission set wherein the peak annual funding is minimized to meet the annual budget constraint. One aspect of the approach is to have a transportation capability that can launch a mission in any planetary opportunity; such capability can be provided by solar electric propulsion. Another cost reduction technique is to structure a mission set in a time-sequenced fashion that could utilize essentially the same spacecraft for the implementation of several missions. A third technique would be to fulfill a scientific objective in several sequential missions rather than attempt to accomplish all of the objectives with one mission. The application of the approach is illustrated by an example involving the Solar Orbiter Dual Probe mission.

  10. SNMP-SI: A Network Management Tool Based on Slow Intelligence System Approach

    NASA Astrophysics Data System (ADS)

    Colace, Francesco; de Santo, Massimo; Ferrandino, Salvatore

    The last decade has witnessed an intense spread of computer networks, further accelerated by the introduction of wireless networks. This growth has been accompanied by a significant increase in network management problems. Especially in small companies, where no personnel are assigned to these tasks, the management of such networks is often complex, and malfunctions can have significant impacts on their businesses. A possible solution is the adoption of the Simple Network Management Protocol (SNMP), a standard protocol used to exchange network management information and part of the Transmission Control Protocol/Internet Protocol (TCP/IP) suite. SNMP provides a tool for network administrators to manage network performance, find and solve network problems, and plan for network growth. SNMP has a big disadvantage, however: its simple design means that the information it deals with is neither detailed nor well organized enough to meet expanding modern networking requirements. Over the past years, much effort has been devoted to addressing the shortcomings of SNMP, and new frameworks have been developed; a promising approach involves the use of ontologies. This is the starting point of this paper, where a novel approach to network management based on Slow Intelligence System methodologies and ontology-based techniques is proposed. A Slow Intelligence System is a general-purpose system characterized by its ability to improve performance over time through a process involving enumeration, propagation, adaptation, elimination and concentration. The proposed approach aims to develop a system able to acquire, according to the SNMP standard, information from the various hosts in the managed networks and apply solutions in order to solve problems. To check the feasibility of this model, first experimental results in a real scenario are shown.

  11. Exploiting the potential of unlabeled endoscopic video data with self-supervised learning.

    PubMed

    Ross, Tobias; Zimmerer, David; Vemuri, Anant; Isensee, Fabian; Wiesenfarth, Manuel; Bodenstedt, Sebastian; Both, Fabian; Kessler, Philip; Wagner, Martin; Müller, Beat; Kenngott, Hannes; Speidel, Stefanie; Kopp-Schneider, Annette; Maier-Hein, Klaus; Maier-Hein, Lena

    2018-06-01

    Surgical data science is a new research field that aims to observe all aspects of the patient treatment process in order to provide the right assistance at the right time. Due to the breakthrough successes of deep learning-based solutions for automatic image annotation, the availability of reference annotations for algorithm training is becoming a major bottleneck in the field. The purpose of this paper was to investigate the concept of self-supervised learning to address this issue. Our approach is guided by the hypothesis that unlabeled video data can be used to learn a representation of the target domain that boosts the performance of state-of-the-art machine learning algorithms when used for pre-training. Core of the method is an auxiliary task based on raw endoscopic video data of the target domain that is used to initialize the convolutional neural network (CNN) for the target task. In this paper, we propose the re-colorization of medical images with a conditional generative adversarial network (cGAN)-based architecture as auxiliary task. A variant of the method involves a second pre-training step based on labeled data for the target task from a related domain. We validate both variants using medical instrument segmentation as target task. The proposed approach can be used to radically reduce the manual annotation effort involved in training CNNs. Compared to the baseline approach of generating annotated data from scratch, our method decreases exploratively the number of labeled images by up to 75% without sacrificing performance. Our method also outperforms alternative methods for CNN pre-training, such as pre-training on publicly available non-medical (COCO) or medical data (MICCAI EndoVis2017 challenge) using the target task (in this instance: segmentation). As it makes efficient use of available (non-)public and (un-)labeled data, the approach has the potential to become a valuable tool for CNN (pre-)training.
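
    The re-colorization pretext task can be sketched as follows (a minimal PyTorch sketch that uses a plain L1 reconstruction loss in place of the paper's full cGAN objective; the toy network and tensor shapes are illustrative assumptions):

        import torch
        import torch.nn as nn

        # toy "generator": maps a grayscale frame back to 3-channel colour
        generator = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )
        opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
        l1 = nn.L1Loss()

        def pretrain_step(rgb_batch):
            """One self-supervised step: re-colorize a decolorized frame."""
            gray = rgb_batch.mean(dim=1, keepdim=True)  # label-free input
            loss = l1(generator(gray), rgb_batch)       # target = original colours
            opt.zero_grad(); loss.backward(); opt.step()
            return loss.item()

        # random tensors stand in for unlabeled endoscopic video frames
        frames = torch.rand(8, 3, 64, 64)
        print(pretrain_step(frames))
        # afterwards, the learned weights would initialize the target-task CNN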

  12. Nucleophosmin integrates within the nucleolus via multi-modal interactions with proteins displaying R-rich linear motifs and rRNA

    PubMed Central

    Mitrea, Diana M; Cika, Jaclyn A; Guy, Clifford S; Ban, David; Banerjee, Priya R; Stanley, Christopher B; Nourse, Amanda; Deniz, Ashok A; Kriwacki, Richard W

    2016-01-01

    The nucleolus is a membrane-less organelle formed through liquid-liquid phase separation of its components from the surrounding nucleoplasm. Here, we show that nucleophosmin (NPM1) integrates within the nucleolus via a multi-modal mechanism involving multivalent interactions with proteins containing arginine-rich linear motifs (R-motifs) and ribosomal RNA (rRNA). Importantly, these R-motifs are found in canonical nucleolar localization signals. Based on a novel combination of biophysical approaches, we propose a model for the molecular organization within liquid-like droplets formed by the N-terminal domain of NPM1 and R-motif peptides, thus providing insights into the structural organization of the nucleolus. We identify multivalency of acidic tracts and folded nucleic acid binding domains, mediated by N-terminal domain oligomerization, as structural features required for phase separation of NPM1 with other nucleolar components in vitro and for localization within mammalian nucleoli. We propose that one mechanism of nucleolar localization involves phase separation of proteins within the nucleolus. DOI: http://dx.doi.org/10.7554/eLife.13571.001 PMID:26836305

  13. Integrating surveillance data on water-related diseases and drinking-water quality; action-research in a Brazilian municipality.

    PubMed

    Queiroz, Ana Carolina Lanza; Cardoso, Laís Santos de Magalhães; Heller, Léo; Cairncross, Sandy

    2015-12-01

    The Brazilian Ministry of Health proposed a research study involving the municipal professional staff who conduct epidemiological and water quality surveillance, to facilitate the integration of the data they collect. It aimed to improve intersectoral collaboration and health promotion activities in the municipalities, especially regarding drinking-water quality. We conducted a study using the action-research approach. In its evaluation phase, a technique we called 'the tree analogy' was applied to identify both possibilities and challenges related to the proposed interlinkage. Results showed that integration of the two data collection systems cannot be attained without prior institutional adjustments. This suggests the need to resolve issues that go beyond the selection and interrelation of indicators and the compatibility of software, extending to political, administrative and personal matters. The evaluation process led those involved to re-think their practice by sharing experiences from everyday work and formulating constructive criticisms; this inevitably unleashes a process of empowerment. From this perspective, we have certainly gathered some fruit from the Tree, but not necessarily the most visible.

  14. The influence of liquid/vapor phase change onto the Nusselt number

    NASA Astrophysics Data System (ADS)

    Popescu, Elena-Roxana; Colin, Catherine; Tanguy, Sebastien

    2017-11-01

    In spite of its significant interest in various fields, there is currently very little information on how an external flow modifies the evaporation or condensation of a liquid surface. Although most applications involve turbulent flows, even the simpler configuration in which a laminar superheated or subcooled vapor flow shears a saturated liquid interface has never been solved. Based on a numerical approach, we propose to characterize the interaction between a laminar boundary layer of a superheated or subcooled vapor flow and a static liquid pool at saturation temperature. By performing a full set of simulations sweeping the parameter space, correlations are proposed for the first time for the Nusselt number as a function of the dimensionless numbers that characterize both vaporization and condensation. As expected, the Nusselt number decreases or increases in configurations involving vaporization or condensation, respectively. More unexpected is the behaviour of the friction of the vapor flow on the liquid pool, which we find to be only weakly affected by the phase change, despite the important variation of the local flow structure due to evaporation or condensation.
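
    For reference, the Nusselt number for which such correlations are proposed is the standard dimensionless heat-transfer coefficient (a textbook definition, not the paper's specific correlation):

        \mathrm{Nu} = \frac{h\,L}{k}, \qquad
        h = \frac{q''}{T_\infty - T_{\mathrm{sat}}},

    where $h$ is the convective heat transfer coefficient, $L$ a characteristic length, $k$ the thermal conductivity of the vapor, $q''$ the interfacial heat flux, $T_\infty$ the free-stream vapor temperature, and $T_{\mathrm{sat}}$ the saturation temperature.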

  15. Wearable ultrasonic guiding device with white cane for the visually impaired: A preliminary verisimilitude experiment.

    PubMed

    Cheng, Po-Hsun

    2016-01-01

    Several assistive technologies are available to help visually impaired individuals avoid obstructions while walking. Unfortunately, white canes and medical walkers are unable to detect obstacles on the road or react to encumbrances located above the waist. In this study, I adopted a cyber-physical system approach in developing a cap-connected device to compensate for the detection gaps of conventional aids for the visually impaired. I developed a verisimilar experimental route, involving the participation of seven individuals with visual impairment, that included straight sections, left turns, right turns, curves, and suspended objects. My aim was to collect the information required for the practical use of the device. My findings, though based on a small number of subjects, demonstrate the feasibility of the proposed guiding device in alerting walkers to the presence of certain kinds of obstacles, and thus show promise for future work and research with the proposed device. They also provide a valuable reference for the further improvement of these devices and for the design of experiments involving the visually impaired.
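
    The core ranging arithmetic behind such ultrasonic devices is simple time-of-flight conversion (a generic sketch; the function names and the alert threshold are hypothetical stand-ins, not taken from the study):

        SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

        def echo_to_distance(round_trip_s: float) -> float:
            """One-way distance (m) from an ultrasonic echo's round-trip time."""
            return SPEED_OF_SOUND * round_trip_s / 2.0

        def obstacle_alert(round_trip_s: float, threshold_m: float = 1.5) -> bool:
            """Hypothetical policy: alert when an object is closer than threshold."""
            return echo_to_distance(round_trip_s) < threshold_m

        print(echo_to_distance(0.0058))  # ~0.99 m
        print(obstacle_alert(0.0058))    # True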

  16. Nucleophosmin integrates within the nucleolus via multi-modal interactions with proteins displaying R-rich linear motifs and rRNA

    DOE PAGES

    Mitrea, Diana M.; Cika, Jaclyn A.; Guy, Clifford S.; ...

    2016-02-02

    In this study, the nucleolus is a membrane-less organelle formed through liquid-liquid phase separation of its components from the surrounding nucleoplasm. Here, we show that nucleophosmin (NPM1) integrates within the nucleolus via a multi-modal mechanism involving multivalent interactions with proteins containing arginine-rich linear motifs (R-motifs) and ribosomal RNA (rRNA). Importantly, these R-motifs are found in canonical nucleolar localization signals. Based on a novel combination of biophysical approaches, we propose a model for the molecular organization within liquid-like droplets formed by the N-terminal domain of NPM1 and R-motif peptides, thus providing insights into the structural organization of the nucleolus. We identify multivalency of acidic tracts and folded nucleic acid binding domains, mediated by N-terminal domain oligomerization, as structural features required for phase separation of NPM1 with other nucleolar components in vitro and for localization within mammalian nucleoli. We propose that one mechanism of nucleolar localization involves phase separation of proteins within the nucleolus.

  17. Success and Failure of Parliamentary Motions: A Social Dilemma Approach.

    PubMed

    Popping, Roel; Wittek, Rafael

    2015-01-01

    Parliamentary motions are a vital and frequently used element of political control in democratic regimes. Despite their high incidence and potential impact on the political fate of a government and its policies, we know relatively little about the conditions under which parliamentary motions are likely to be accepted or rejected. Current collective decision-making models use a voting power framework in which the power and influence of the involved parties are the main predictors. We propose an alternative, social dilemma approach, according to which a motion's likelihood of being accepted depends on the severity of the social dilemma underlying the decision issue. Actor- and dilemma-centered hypotheses are developed and tested with data from a stratified random sample of 822 motions voted upon in the Dutch Parliament between September 2009 and February 2011. The social dilemma structure of each motion is extracted through content coding, applying a cognitive mapping technique developed by Anthony, Heckathorn and Maser. Logistic regression analyses are in line with both actor-centered and social dilemma-centered approaches, though the latter show stronger effect sizes. Motions have a lower chance of being accepted if voting potential is low, if the proposer is not from the voting party, and if the problem underlying the motion reflects a prisoner's dilemma or a pure competition game rather than a coordination game. The number of proposing parties and a battle-of-the-sexes structure do not significantly affect the outcome.
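
    The reported analysis reduces to a standard logistic regression of motion acceptance on actor- and dilemma-level predictors; a minimal sketch with simulated stand-in data (variable names and coefficients are hypothetical, only the sample size echoes the study) looks like this:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 822  # matches the study's sample size; the data here are simulated
        voting_potential = rng.uniform(0, 1, n)
        proposer_in_voting_party = rng.integers(0, 2, n)
        prisoners_dilemma = rng.integers(0, 2, n)  # vs. coordination-game baseline

        logit = (-0.5 + 1.2 * voting_potential
                 + 0.8 * proposer_in_voting_party
                 - 0.9 * prisoners_dilemma)
        accepted = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = sm.add_constant(np.column_stack([voting_potential,
                                             proposer_in_voting_party,
                                             prisoners_dilemma]))
        print(sm.Logit(accepted.astype(float), X).fit(disp=0).summary())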

  18. Success and Failure of Parliamentary Motions: A Social Dilemma Approach

    PubMed Central

    Popping, Roel; Wittek, Rafael

    2015-01-01

    Parliamentary motions are a vital and frequently used element of political control in democratic regimes. Despite their high incidence and potential impact on the political fate of a government and its policies, we know relatively little about the conditions under which parliamentary motions are likely to be accepted or rejected. Current collective decision-making models use a voting power framework in which the power and influence of the involved parties are the main predictors. We propose an alternative, social dilemma approach, according to which a motion’s likelihood of being accepted depends on the severity of the social dilemma underlying the decision issue. Actor- and dilemma-centered hypotheses are developed and tested with data from a stratified random sample of 822 motions voted upon in the Dutch Parliament between September 2009 and February 2011. The social dilemma structure of each motion is extracted through content coding, applying a cognitive mapping technique developed by Anthony, Heckathorn and Maser. Logistic regression analyses are in line with both actor-centered and social dilemma-centered approaches, though the latter show stronger effect sizes. Motions have a lower chance of being accepted if voting potential is low, if the proposer is not from the voting party, and if the problem underlying the motion reflects a prisoner’s dilemma or a pure competition game rather than a coordination game. The number of proposing parties and a battle-of-the-sexes structure do not significantly affect the outcome. PMID:26317869

  19. Towards sensible toxicity testing for nanomaterials: proposal for the specification of test design

    NASA Astrophysics Data System (ADS)

    Potthoff, Annegret; Weil, Mirco; Meißner, Tobias; Kühnel, Dana

    2015-12-01

    During the last decade, nanomaterials (NM) were extensively tested for potential harmful effects on humans and environmental organisms. A sound hazard assessment, however, has so far been hampered by uncertainties and the low comparability of test results. The reason for the low comparability is high variation in (1) the type of NM tested with regard to raw material, size and shape, and (2) the procedures before and during toxicity testing. This calls for tailored, nanomaterial-specific protocols. Here, a structured approach is proposed, intended to lead to test protocols tailored not only to specific types of nanomaterials but also to the respective test systems used for toxicity testing. There are existing standards on single procedures involving nanomaterials; however, not all relevant procedures are covered by standards. Hence, our approach offers a detailed way of weighing several plausible alternatives, e.g. for sample preparation, in order to decide on the procedure most meaningful for a specific nanomaterial and toxicity test. A framework of several decision trees (DT) and flow charts to support the testing of NM is proposed as a basis for further refinement and in-depth elaboration. DT and flow charts were drafted for (1) general procedure - physicochemical characterisation, (2) choice of test media, (3) decision on test scenario and application of NM to liquid media, (4) application of NM to the gas phase, (5) application of NM to soil and sediments, (6) dose metrics, (S1) definition of a nanomaterial, and (S2) dissolution. The applicability of the proposed approach was surveyed using experimental data retrieved from studies on nanoscale CuO. This survey demonstrated the DT and flow charts to be a convenient tool for systematically deciding upon test procedures and processes, and hence an important step towards the harmonisation of NM testing.
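
    To make the decision-tree idea concrete, a fragment of such a tree can be encoded as plain branching logic (a hypothetical fragment for the choice of sample-preparation route, not a transcription of the authors' DTs):

        def choose_dispersion_procedure(soluble: bool, agglomerates: bool) -> str:
            """Hypothetical DT fragment: pick a preparation route for aquatic tests."""
            if soluble:
                # dissolution dominates; the dissolved fraction is what matters
                return "test filtrate / dissolved species"
            if agglomerates:
                return "apply standardized dispersion (e.g. defined sonication energy)"
            return "dose as-received stock suspension"

        print(choose_dispersion_procedure(soluble=False, agglomerates=True))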

  20. Towards sensible toxicity testing for nanomaterials: proposal for the specification of test design.

    PubMed

    Potthoff, Annegret; Weil, Mirco; Meißner, Tobias; Kühnel, Dana

    2015-12-01

    During the last decade, nanomaterials (NM) were extensively tested for potential harmful effects on humans and environmental organisms. A sound hazard assessment, however, has so far been hampered by uncertainties and the low comparability of test results. The reason for the low comparability is high variation in (1) the type of NM tested with regard to raw material, size and shape, and (2) the procedures before and during toxicity testing. This calls for tailored, nanomaterial-specific protocols. Here, a structured approach is proposed, intended to lead to test protocols tailored not only to specific types of nanomaterials but also to the respective test systems used for toxicity testing. There are existing standards on single procedures involving nanomaterials; however, not all relevant procedures are covered by standards. Hence, our approach offers a detailed way of weighing several plausible alternatives, e.g. for sample preparation, in order to decide on the procedure most meaningful for a specific nanomaterial and toxicity test. A framework of several decision trees (DT) and flow charts to support the testing of NM is proposed as a basis for further refinement and in-depth elaboration. DT and flow charts were drafted for (1) general procedure - physicochemical characterisation, (2) choice of test media, (3) decision on test scenario and application of NM to liquid media, (4) application of NM to the gas phase, (5) application of NM to soil and sediments, (6) dose metrics, (S1) definition of a nanomaterial, and (S2) dissolution. The applicability of the proposed approach was surveyed using experimental data retrieved from studies on nanoscale CuO. This survey demonstrated the DT and flow charts to be a convenient tool for systematically deciding upon test procedures and processes, and hence an important step towards the harmonisation of NM testing.
