Sample records for original approach based

  1. Creative Disruption: A Task-Based Approach to Engaging With Original Works of Art

    ERIC Educational Resources Information Center

    Walker, Keith; Smith, Liz

    2004-01-01

    This paper examines the value of a task-based approach to engaging with original works of art and focuses in particular upon the experiences of a group of PGCE Art and Design trainees when they visited an exhibition entitled, Air Guitar: Art Reconsidering Rock Music, to carry out given tasks. The extent to which a task-based approach might…

  2. Administrator Preparation: Looking Backwards and Forwards

    ERIC Educational Resources Information Center

    Bridges, Edwin

    2012-01-01

    Purpose: The purpose of this paper was to conduct a critical analysis of the origins and implementation of problem-based learning in educational administration as a window into the limitations of this approach and more generally administrator preparation. Design/methodology/approach: The author reviewed the published work of the originator from…

  3. Development of a Compound Optimization Approach Based on Imperialist Competitive Algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Qimei; Yang, Zhihong; Wang, Yong

    In this paper, an improved approach is developed for the imperialist competitive algorithm to achieve greater performance. The Nelder-Mead simplex method is applied to execute alternately with the original procedures of the algorithm. The approach is tested on twelve widely used benchmark functions and is also compared with other related studies. It is shown that the proposed approach has a faster convergence rate, better search ability, and higher stability than the original algorithm and other related methods.
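
    As a hedged illustration only: the alternation scheme this record describes, a stochastic population step interleaved with Nelder-Mead refinement, can be sketched in a few lines of Python. This is not the authors' algorithm; the benchmark, step sizes, and schedule are invented.

      # Minimal sketch: alternate a stochastic population step (a stand-in
      # for the imperialist competitive algorithm) with Nelder-Mead
      # refinement of the incumbent best point.
      import numpy as np
      from scipy.optimize import minimize

      def rosenbrock(x):
          return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2))

      rng = np.random.default_rng(0)
      dim, n_pop = 5, 30
      pop = rng.uniform(-2.0, 2.0, size=(n_pop, dim))

      for _ in range(20):
          # Stochastic step: pull all candidates toward the current best point.
          best = pop[np.argmin([rosenbrock(p) for p in pop])].copy()
          pop += 0.5 * (best - pop) + rng.normal(0.0, 0.1, size=pop.shape)
          # Deterministic step: polish the best candidate with Nelder-Mead.
          res = minimize(rosenbrock, best, method="Nelder-Mead",
                         options={"maxiter": 50})
          pop[0] = res.x  # re-inject the refined solution into the population

      print("best value:", min(rosenbrock(p) for p in pop))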

  4. An IR-Based Approach Utilizing Query Expansion for Plagiarism Detection in MEDLINE.

    PubMed

    Nawab, Rao Muhammad Adeel; Stevenson, Mark; Clough, Paul

    2017-01-01

    The identification of duplicated and plagiarized passages of text has become an increasingly active area of research. In this paper, we investigate methods for plagiarism detection that aim to identify potential sources of plagiarism from MEDLINE, particularly when the original text has been modified through the replacement of words or phrases. A scalable approach based on Information Retrieval is used to perform candidate document selection, i.e. the identification of a subset of potential source documents from MEDLINE given a suspicious text. Query expansion is performed using the UMLS Metathesaurus to deal with situations in which original documents are obfuscated. Various approaches to Word Sense Disambiguation are investigated to deal with cases where there are multiple Concept Unique Identifiers (CUIs) for a given term. Results using the proposed IR-based approach outperform a state-of-the-art baseline based on Kullback-Leibler Distance.
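
    A hedged sketch of the candidate document selection step described here, with a toy synonym table standing in for the UMLS Metathesaurus lookup; the documents, query, and synonyms are invented.

      # Sketch of candidate source retrieval with query expansion. The tiny
      # synonym table is a stand-in for UMLS Metathesaurus concept lookup.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      documents = [
          "myocardial infarction is damage to heart muscle from ischemia",
          "influenza vaccination reduces hospital admissions in the elderly",
          "aspirin therapy after myocardial infarction lowers mortality",
      ]
      synonyms = {"heart attack": "myocardial infarction"}  # hypothetical table

      def expand(query):
          extra = [canon for term, canon in synonyms.items() if term in query]
          return query + " " + " ".join(extra)

      suspicious = "aspirin given after a heart attack reduces death rates"
      vectorizer = TfidfVectorizer(stop_words="english")
      doc_matrix = vectorizer.fit_transform(documents)
      query_vec = vectorizer.transform([expand(suspicious)])
      scores = cosine_similarity(query_vec, doc_matrix).ravel()
      # Rank documents; the top-k become candidates for detailed comparison.
      for idx in scores.argsort()[::-1]:
          print(f"{scores[idx]:.3f}  {documents[idx]}")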

  5. Comparing the Origins and Ideologies of the Independent Living Movement and Community Based Rehabilitation.

    ERIC Educational Resources Information Center

    Lysack, Catherine; Kaufert, Joseph

    1994-01-01

    This paper explores the origins, differences, and similarities of community-based rehabilitation, which developed in southern countries, and independent living, which developed in northern countries, for persons with disabilities. Although both approaches share a broad definition of rehabilitation and values emphasizing community and consumer…

  6. Novel approaches to estimating the turbulent kinetic energy dissipation rate from low- and moderate-resolution velocity fluctuation time series

    NASA Astrophysics Data System (ADS)

    Wacławczyk, Marta; Ma, Yong-Feng; Kopeć, Jacek M.; Malinowski, Szymon P.

    2017-11-01

    In this paper we propose two approaches to estimating the turbulent kinetic energy (TKE) dissipation rate, based on the zero-crossing method of Sreenivasan et al. (1983). The original formulation requires a fine resolution of the measured signal, down to the smallest dissipative scales. However, due to finite sampling frequency, as well as measurement errors, velocity time series obtained from airborne experiments are characterized by the presence of effective spectral cutoffs. In contrast to the original formulation, the new approaches are suitable for use with signals originating from airborne experiments. Their suitability is tested using measurement data obtained during the Physics of Stratocumulus Top (POST) airborne research campaign as well as synthetic turbulence data, and they prove useful and complementary to existing methods. We show that the number-of-crossings-based approaches respond differently to errors due to finite sampling and finite averaging than the classical power spectral method. Hence, their application to short signals and small sampling frequencies is particularly interesting, as it can increase the robustness of turbulent kinetic energy dissipation rate retrieval.
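
    A minimal sketch of the classical zero-crossing idea underlying these methods: count crossings of the fluctuation signal, form the Liepmann scale, identify it with the Taylor microscale (an assumption valid only for well-resolved signals), and apply the isotropic relation epsilon = 15 nu u'^2 / lambda^2. The synthetic signal is a stand-in; the paper's corrections for spectral cutoffs are not reproduced.

      # Sketch of the classical zero-crossing estimate of the TKE
      # dissipation rate from a velocity fluctuation time series.
      import numpy as np

      rng = np.random.default_rng(1)
      fs = 1000.0                    # sampling frequency [Hz]
      U = 10.0                       # mean advection speed [m/s] (Taylor hypothesis)
      nu = 1.5e-5                    # kinematic viscosity of air [m^2/s]
      u = rng.normal(size=200_000)   # stand-in velocity fluctuations [m/s]

      crossings = np.count_nonzero(np.signbit(u[:-1]) != np.signbit(u[1:]))
      length = len(u) / fs * U                 # record length in meters
      N_L = crossings / length                 # zero crossings per unit length
      liepmann = 1.0 / (np.pi * N_L)           # Liepmann scale
      lam = liepmann                           # identified with Taylor microscale
      eps = 15.0 * nu * np.var(u) / lam**2     # isotropic estimate [m^2/s^3]
      print(f"epsilon ~ {eps:.3e} m^2/s^3")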

  7. Sources, Developments and Directions of Task-Based Language Teaching

    ERIC Educational Resources Information Center

    Bygate, Martin

    2016-01-01

    This paper provides an outline of the origins, the current shape and the potential directions of task-based language teaching (TBLT) as an approach to language pedagogy. It first offers a brief description of TBLT and considers its origins within language teaching methodology and second language acquisition. It then summarises the current position…

  8. Interactive visual exploration and analysis of origin-destination data

    NASA Astrophysics Data System (ADS)

    Ding, Linfang; Meng, Liqiu; Yang, Jian; Krisp, Jukka M.

    2018-05-01

    In this paper, we propose a visual analytics approach for the exploration of spatiotemporal interaction patterns of massive origin-destination data. Firstly, we visually query the movement database for data at certain time windows. Secondly, we conduct interactive clustering to allow the users to select input variables/features (e.g., origins, destinations, distance, and duration) and to adjust clustering parameters (e.g. distance threshold). The agglomerative hierarchical clustering method is applied for the multivariate clustering of the origin-destination data. Thirdly, we design a parallel coordinates plot for visualizing the precomputed clusters and for further exploration of interesting clusters. Finally, we propose a gradient line rendering technique to show the spatial and directional distribution of origin-destination clusters on a map view. We implement the visual analytics approach in a web-based interactive environment and apply it to real-world floating car data from Shanghai. The experiment results show the origin/destination hotspots and their spatial interaction patterns. They also demonstrate the effectiveness of our proposed approach.
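
    The clustering step described above can be sketched as follows; the coordinates, feature scaling, and distance threshold are illustrative assumptions, not the paper's settings.

      # Sketch of the multivariate clustering step: agglomerative hierarchical
      # clustering of origin-destination records on (origin, destination,
      # distance, duration) feature vectors with a distance threshold.
      import numpy as np
      from sklearn.cluster import AgglomerativeClustering
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(2)
      n = 300
      origin = rng.normal([121.4, 31.2], 0.05, size=(n, 2))   # lon/lat near Shanghai
      dest = origin + rng.normal(0.1, 0.02, size=(n, 2))
      dist = np.linalg.norm(dest - origin, axis=1, keepdims=True)
      duration = dist * rng.uniform(800, 1200, size=(n, 1))   # synthetic travel times

      features = StandardScaler().fit_transform(
          np.hstack([origin, dest, dist, duration]))
      labels = AgglomerativeClustering(
          n_clusters=None, distance_threshold=3.0, linkage="ward").fit_predict(features)
      print("clusters found:", len(set(labels)))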

  9. Microwave vision for robots

    NASA Technical Reports Server (NTRS)

    Lewandowski, Leon; Struckman, Keith

    1994-01-01

    Microwave Vision (MV), a concept originally developed in 1985, could play a significant role in the solution to robotic vision problems. Originally our Microwave Vision concept was based on a pattern matching approach employing computer based stored replica correlation processing. Artificial Neural Network (ANN) processor technology offers an attractive alternative to the correlation processing approach, namely the ability to learn and to adapt to changing environments. This paper describes the Microwave Vision concept, some initial ANN-MV experiments, and the design of an ANN-MV system that has led to a second patent disclosure in the robotic vision field.

  10. Quantifying the origins of life on a planetary scale.

    PubMed

    Scharf, Caleb; Cronin, Leroy

    2016-07-19

    A simple, heuristic formula with parallels to the Drake Equation is introduced to help focus discussion on open questions for the origins of life in a planetary context. This approach indicates a number of areas where quantitative progress can be made on parameter estimation for determining origins of life probabilities, based on constraints from Bayesian approaches. We discuss a variety of "microscale" factors and their role in determining "macroscale" abiogenesis probabilities on suitable planets. We also propose that impact ejecta exchange between planets with parallel chemistries and chemical evolution could in principle amplify the development of molecular complexity and abiogenesis probabilities. This amplification could be very significant, and both bias our conclusions about abiogenesis probabilities based on the Earth and provide a major source of variance in the probability of life arising in planetary systems. We use our heuristic formula to suggest a number of observational routes for improving constraints on origins of life probabilities.
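
    The record does not reproduce the paper's equation. Purely as a hedged illustration of the Drake-style factorization it describes, an expected number of abiogenesis events might be written as:

      % Illustrative Drake-style factorization; the symbols are assumptions,
      % not the paper's published notation.
      \[
        \langle N_{\mathrm{life}}(t)\rangle \;=\; N_b \cdot f_c \cdot P_a \cdot t ,
      \]
      % where $N_b$ is the number of potentially habitable settings, $f_c$ the
      % fraction with suitable chemical conditions, and $P_a$ the probability
      % of abiogenesis per setting per unit time.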

  11. Quantifying the origins of life on a planetary scale

    NASA Astrophysics Data System (ADS)

    Scharf, Caleb; Cronin, Leroy

    2016-07-01

    A simple, heuristic formula with parallels to the Drake Equation is introduced to help focus discussion on open questions for the origins of life in a planetary context. This approach indicates a number of areas where quantitative progress can be made on parameter estimation for determining origins of life probabilities, based on constraints from Bayesian approaches. We discuss a variety of “microscale” factors and their role in determining “macroscale” abiogenesis probabilities on suitable planets. We also propose that impact ejecta exchange between planets with parallel chemistries and chemical evolution could in principle amplify the development of molecular complexity and abiogenesis probabilities. This amplification could be very significant, and both bias our conclusions about abiogenesis probabilities based on the Earth and provide a major source of variance in the probability of life arising in planetary systems. We use our heuristic formula to suggest a number of observational routes for improving constraints on origins of life probabilities.

  12. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted to the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.
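
    For contrast with the PG method, the classical alarm-based bookkeeping underlying such assessments (hit rate versus fraction of time under alarm, i.e. one point on a Molchan diagram) can be sketched; all numbers below are synthetic.

      # Sketch of classical alarm-based assessment: the fraction of target
      # earthquakes falling inside declared alarms (hit rate) versus the
      # fraction of time occupied by alarms (a Molchan-diagram point).
      import numpy as np

      rng = np.random.default_rng(3)
      days = 3650
      alarm = rng.random(days) < 0.2                     # synthetic alarms (20% of days)
      quakes = rng.choice(days, size=15, replace=False)  # synthetic event days

      hit_rate = np.count_nonzero(alarm[quakes]) / len(quakes)
      tau = alarm.mean()                                 # fraction of time under alarm
      print(f"tau = {tau:.2f}, miss rate = {1.0 - hit_rate:.2f}")
      # A skillful predictor lies below the diagonal miss_rate = 1 - tau on
      # the Molchan diagram; a random one scatters around it.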

  13. A knowledge-based approach to improving and homogenizing intensity modulated radiation therapy planning quality among treatment centers: an example application to prostate cancer planning.

    PubMed

    Good, David; Lo, Joseph; Lee, W Robert; Wu, Q Jackie; Yin, Fang-Fang; Das, Shiva K

    2013-09-01

    Intensity modulated radiation therapy (IMRT) treatment planning can have wide variation among different treatment centers. We propose a system to leverage the IMRT planning experience of larger institutions to automatically create high-quality plans for outside clinics. We explore feasibility by generating plans for patient datasets from an outside institution by adapting plans from our institution. A knowledge database was created from 132 IMRT treatment plans for prostate cancer at our institution. The outside institution, a community hospital, provided the datasets for 55 prostate cancer cases, including their original treatment plans. For each "query" case from the outside institution, a similar "match" case was identified in the knowledge database, and the match case's plan parameters were then adapted and optimized to the query case by use of a semiautomated approach that required no expert planning knowledge. The plans generated with this knowledge-based approach were compared with the original treatment plans at several dose cutpoints. Compared with the original plan, the knowledge-based plan had a significantly more homogeneous dose to the planning target volume and a significantly lower maximum dose. The volumes of the rectum, bladder, and femoral heads above all cutpoints were nominally lower for the knowledge-based plan; the reductions were significantly lower for the rectum. In 40% of cases, the knowledge-based plan had overall superior (lower) dose-volume histograms for rectum and bladder; in 54% of cases, the comparison was equivocal; in 6% of cases, the knowledge-based plan was inferior for both bladder and rectum. Knowledge-based planning was superior or equivalent to the original plan in 95% of cases. The knowledge-based approach shows promise for homogenizing plan quality by transferring planning expertise from more experienced to less experienced institutions.

  14. Application of LogitBoost Classifier for Traceability Using SNP Chip Data

    PubMed Central

    Kang, Hyunsung; Cho, Seoae; Kim, Heebal; Seo, Kang-Seok

    2015-01-01

    Consumer attention to food safety has increased rapidly due to animal-related diseases; therefore, it is important to identify their places of origin (POO) for safety purposes. However, only a few studies have addressed this issue and focused on machine learning-based approaches. In the present study, classification analyses were performed using a customized SNP chip for POO prediction. To accomplish this, 4,122 pigs originating from 104 farms were genotyped using the SNP chip. Several factors were considered to establish the best prediction model based on these data. We also assessed the applicability of the suggested model using a kinship coefficient-filtering approach. Our results showed that the LogitBoost-based prediction model outperformed other classifiers in terms of classification performance under most conditions. Specifically, a greater level of accuracy was observed when a higher kinship-based cutoff was employed. These results demonstrated the applicability of a machine learning-based approach using SNP chip data for practical traceability. PMID:26436917

  15. Application of LogitBoost Classifier for Traceability Using SNP Chip Data.

    PubMed

    Kim, Kwondo; Seo, Minseok; Kang, Hyunsung; Cho, Seoae; Kim, Heebal; Seo, Kang-Seok

    2015-01-01

    Consumer attention to food safety has increased rapidly due to animal-related diseases; therefore, it is important to identify their places of origin (POO) for safety purposes. However, only a few studies have addressed this issue and focused on machine learning-based approaches. In the present study, classification analyses were performed using a customized SNP chip for POO prediction. To accomplish this, 4,122 pigs originating from 104 farms were genotyped using the SNP chip. Several factors were considered to establish the best prediction model based on these data. We also assessed the applicability of the suggested model using a kinship coefficient-filtering approach. Our results showed that the LogitBoost-based prediction model outperformed other classifiers in terms of classification performance under most conditions. Specifically, a greater level of accuracy was observed when a higher kinship-based cutoff was employed. These results demonstrated the applicability of a machine learning-based approach using SNP chip data for practical traceability.
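
    A hedged sketch of the classification setup: scikit-learn ships no LogitBoost, so GradientBoostingClassifier (a related log-loss boosting method) serves as a stand-in, and the SNP genotypes are simulated.

      # Sketch of SNP-based farm-of-origin classification with a boosting
      # classifier; genotypes are coded 0/1/2 with farm-specific allele
      # frequencies. Purely illustrative.
      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(4)
      n_animals, n_snps, n_farms = 600, 200, 4
      farm = rng.integers(n_farms, size=n_animals)
      base = rng.uniform(0.1, 0.9, size=(n_farms, n_snps))  # allele frequencies
      X = rng.binomial(2, base[farm])                       # 0/1/2 genotypes

      X_tr, X_te, y_tr, y_te = train_test_split(X, farm, random_state=0)
      clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))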

  16. Simple and fast multiplex PCR method for detection of species origin in meat products.

    PubMed

    Izadpanah, Mehrnaz; Mohebali, Nazanin; Elyasi Gorji, Zahra; Farzaneh, Parvaneh; Vakhshiteh, Faezeh; Shahzadeh Fazeli, Seyed Abolhassan

    2018-02-01

    Identification of animal species is one of the major concerns in food regulatory control and quality assurance systems. Different approaches have been used for species identification in feedstuffs of animal origin. This study aimed to develop a multiplex PCR approach to detect the origin of meat and meat products. Specific primers were designed based on the conserved region of the mitochondrial Cytochrome C Oxidase subunit I (COX1) gene. This method could successfully distinguish the origin of pig, camel, sheep, donkey, goat, cow, and chicken in one single reaction. Since the PCR products derived from each species have unique molecular weights, the amplified products could be identified by electrophoresis and analyzed based on their size. Due to the synchronized amplification of segments within a single PCR reaction, multiplex PCR is considered to be a simple, fast, and inexpensive technique that can be applied for the identification of meat products in the food industry. This technique is now regarded as a practical method for identifying species origin, and it could be further applied to the identification of animal feedstuffs.
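
    The size-based band calling described above reduces to a lookup with a tolerance; the amplicon lengths below are hypothetical placeholders, not the assay's published sizes.

      # Sketch of species calling from multiplex PCR band sizes: each
      # species' COX1 amplicon is assumed to have a distinct length.
      EXPECTED_BP = {"pig": 130, "camel": 180, "sheep": 230,
                     "donkey": 280, "goat": 330, "cow": 380, "chicken": 430}
      TOLERANCE = 10  # bp, allowing for gel sizing error

      def call_species(observed_bands):
          calls = []
          for band in observed_bands:
              for species, size in EXPECTED_BP.items():
                  if abs(band - size) <= TOLERANCE:
                      calls.append(species)
          return calls or ["unidentified"]

      print(call_species([128, 382]))  # -> ['pig', 'cow']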

  17. Dominant and Emerging Approaches in the Study of Higher Education Policy Change

    ERIC Educational Resources Information Center

    Saarinen, Taina; Ursin, Jani

    2012-01-01

    The purpose of the article is to analyse recent literature on higher education policy change. Based on the review, three different approaches are distinguished: structural, actor and agency. In the structural approach the dynamic of policy change originates in well-established structures. The actor approach focuses on either individual or…

  18. Total synthesis of feglymycin based on a linear/convergent hybrid approach using micro-flow amide bond formation

    NASA Astrophysics Data System (ADS)

    Fuse, Shinichiro; Mifune, Yuto; Nakamura, Hiroyuki; Tanaka, Hiroshi

    2016-11-01

    Feglymycin is a naturally occurring anti-HIV and antimicrobial 13-mer peptide that includes highly racemizable 3,5-dihydroxyphenylglycines (Dpgs). Here we describe the total synthesis of feglymycin based on a linear/convergent hybrid approach. Our originally developed micro-flow amide bond formation enabled linear elongation of a peptide chain containing highly racemizable amino acids, which was previously considered impossible. Our approach will enable the practical preparation of biologically active oligopeptides that contain highly racemizable amino acids, which are attractive drug candidates.

  19. Designing Social Media for Informal Learning and Knowledge Maturing in the Digital Workplace

    ERIC Educational Resources Information Center

    Ravenscroft, A.; Schmidt, A.; Cook, J.; Bradley, C.

    2012-01-01

    This paper presents an original approach to designing social media that support informal learning in the digital workplace. It adapts design-based research to take into account the embeddedness of interactions within digitally mediated work-based contexts. The approach is demonstrated through the design, implementation, and evaluation of software…

  20. Responder analysis without dichotomization.

    PubMed

    Zhang, Zhiwei; Chu, Jianxiong; Rahardja, Dewi; Zhang, Hui; Tang, Li

    2016-01-01

    In clinical trials, it is common practice to categorize subjects as responders and non-responders on the basis of one or more clinical measurements under pre-specified rules. Such a responder analysis is often criticized for the loss of information in dichotomizing one or more continuous or ordinal variables. It is worth noting that a responder analysis can be performed without dichotomization, because the proportion of responders for each treatment can be derived from a model for the original clinical variables (used to define a responder) and estimated by substituting maximum likelihood estimators of model parameters. This model-based approach can be considerably more efficient and more effective for dealing with missing data than the usual approach based on dichotomization. For parameter estimation, the model-based approach generally requires correct specification of the model for the original variables. However, under the sharp null hypothesis, the model-based approach remains unbiased for estimating the treatment difference even if the model is misspecified. We elaborate on these points and illustrate them with a series of simulation studies mimicking a study of Parkinson's disease, which involves longitudinal continuous data in the definition of a responder.
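
    A minimal sketch of the contrast drawn in this record, assuming a normal model for the clinical variable; the data and responder cutoff are invented.

      # Model-based responder estimate: fit the original variable by maximum
      # likelihood and derive P(Y > c), instead of counting dichotomized values.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      y = rng.normal(loc=1.0, scale=2.0, size=80)   # synthetic change scores
      c = 2.0                                       # pre-specified responder cutoff

      # Usual approach: dichotomize, then take the sample proportion.
      p_dichot = np.mean(y > c)

      # Model-based approach: plug MLEs (sample mean, MLE sd) into the model.
      mu_hat, sd_hat = y.mean(), y.std(ddof=0)
      p_model = 1.0 - stats.norm.cdf(c, loc=mu_hat, scale=sd_hat)
      print(f"dichotomized: {p_dichot:.3f}, model-based: {p_model:.3f}")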

  1. Least-Squares Support Vector Machine Approach to Viral Replication Origin Prediction

    PubMed Central

    Cruz-Cano, Raul; Chew, David S.H.; Kwok-Pui, Choi; Ming-Ying, Leung

    2010-01-01

    Replication of their DNA genomes is a central step in the reproduction of many viruses. Procedures to find replication origins, which are initiation sites of the DNA replication process, are therefore of great importance for controlling the growth and spread of such viruses. Existing computational methods for viral replication origin prediction have mostly been tested within the family of herpesviruses. This paper proposes a new approach by least-squares support vector machines (LS-SVMs) and tests its performance not only on the herpes family but also on a collection of caudoviruses coming from three viral families under the order of caudovirales. The LS-SVM approach provides sensitivities and positive predictive values superior or comparable to those given by the previous methods. When suitably combined with previous methods, the LS-SVM approach further improves the prediction accuracy for the herpesvirus replication origins. Furthermore, by recursive feature elimination, the LS-SVM has also helped find the most significant features of the data sets. The results suggest that the LS-SVMs will be a highly useful addition to the set of computational tools for viral replication origin prediction and illustrate the value of optimization-based computing techniques in biomedical applications. PMID:20729987

  2. Least-Squares Support Vector Machine Approach to Viral Replication Origin Prediction.

    PubMed

    Cruz-Cano, Raul; Chew, David S H; Kwok-Pui, Choi; Ming-Ying, Leung

    2010-06-01

    Replication of their DNA genomes is a central step in the reproduction of many viruses. Procedures to find replication origins, which are initiation sites of the DNA replication process, are therefore of great importance for controlling the growth and spread of such viruses. Existing computational methods for viral replication origin prediction have mostly been tested within the family of herpesviruses. This paper proposes a new approach by least-squares support vector machines (LS-SVMs) and tests its performance not only on the herpes family but also on a collection of caudoviruses coming from three viral families under the order of caudovirales. The LS-SVM approach provides sensitivities and positive predictive values superior or comparable to those given by the previous methods. When suitably combined with previous methods, the LS-SVM approach further improves the prediction accuracy for the herpesvirus replication origins. Furthermore, by recursive feature elimination, the LS-SVM has also helped find the most significant features of the data sets. The results suggest that the LS-SVMs will be a highly useful addition to the set of computational tools for viral replication origin prediction and illustrate the value of optimization-based computing techniques in biomedical applications.
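
    In the LS-SVM formulation (Suykens), training reduces to a single linear system; a minimal sketch on synthetic features, not the paper's sequence-derived ones.

      # Minimal LS-SVM binary classifier: solve
      #   [ 0   y^T        ] [b]     [0]
      #   [ y   Omega + I/C] [alpha] [1]
      # with Omega_ij = y_i y_j K(x_i, x_j); predict sign(K @ (alpha*y) + b).
      import numpy as np

      def rbf(A, B, gamma=0.5):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      rng = np.random.default_rng(6)
      X = np.vstack([rng.normal(-1, 1, (40, 5)), rng.normal(1, 1, (40, 5))])
      y = np.hstack([-np.ones(40), np.ones(40)])

      C = 10.0                                   # regularization parameter
      n = len(y)
      A = np.zeros((n + 1, n + 1))
      A[0, 1:], A[1:, 0] = y, y
      A[1:, 1:] = np.outer(y, y) * rbf(X, X) + np.eye(n) / C
      sol = np.linalg.solve(A, np.hstack([0.0, np.ones(n)]))
      b, alpha = sol[0], sol[1:]

      f = rbf(X, X) @ (alpha * y) + b            # decision values on training set
      print("training accuracy:", np.mean(np.sign(f) == y))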

  3. Quantifying the origins of life on a planetary scale

    PubMed Central

    Scharf, Caleb; Cronin, Leroy

    2016-01-01

    A simple, heuristic formula with parallels to the Drake Equation is introduced to help focus discussion on open questions for the origins of life in a planetary context. This approach indicates a number of areas where quantitative progress can be made on parameter estimation for determining origins of life probabilities, based on constraints from Bayesian approaches. We discuss a variety of “microscale” factors and their role in determining “macroscale” abiogenesis probabilities on suitable planets. We also propose that impact ejecta exchange between planets with parallel chemistries and chemical evolution could in principle amplify the development of molecular complexity and abiogenesis probabilities. This amplification could be very significant, and both bias our conclusions about abiogenesis probabilities based on the Earth and provide a major source of variance in the probability of life arising in planetary systems. We use our heuristic formula to suggest a number of observational routes for improving constraints on origins of life probabilities. PMID:27382156

  4. Characterization of an array of honeys of different types and botanical origins through fluorescence emission based on LEDs.

    PubMed

    Lastra-Mejías, Miguel; Torreblanca-Zanca, Albertina; Aroca-Santos, Regina; Cancilla, John C; Izquierdo, Jesús G; Torrecilla, José S

    2018-08-01

    A set of 10 honeys comprising a diverse range of botanical origins has been successfully characterized through fluorescence spectroscopy using inexpensive light-emitting diodes (LEDs) as light sources. Each LED-honey combination tested was shown to produce a unique emission spectrum, which enables the authentication of every honey and allows it to be correctly labeled with its botanical origin. Furthermore, the analysis was backed by a mathematical treatment based on partial least squares models, which led to a correct classification rate of over 95% for each type of honey. Finally, the same approach was followed to analyze rice syrup, a common honey adulterant that is challenging to identify when mixed with honey. A LED-dependent and unique fluorescence spectrum was found for the syrup, which presumably qualifies this approach for the design of uncomplicated, fast, and cost-effective quality control and adulteration assessment tools for different types of honey.
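
    A hedged sketch of the partial least squares classification step (PLS-DA against one-hot labels); the "spectra" below are synthetic Gaussian peaks, not LED-induced fluorescence data.

      # Sketch of PLS-DA: fit partial least squares to one-hot class labels
      # and predict the class with the largest response.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(7)
      n_per, n_wave, n_types = 30, 120, 3
      wav = np.linspace(0, 1, n_wave)
      X, y = [], []
      for t in range(n_types):
          peak = 0.3 + 0.2 * t                   # class-specific emission peak
          X.append(np.exp(-((wav - peak) ** 2) / 0.01)
                   + 0.05 * rng.normal(size=(n_per, n_wave)))
          y += [t] * n_per
      X = np.vstack(X)
      Y = np.eye(n_types)[y]                     # one-hot class matrix

      pls = PLSRegression(n_components=5).fit(X, Y)
      pred = pls.predict(X).argmax(axis=1)
      print("training classification rate:", np.mean(pred == np.array(y)))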

  5. Endoscopic Modified Medial Maxillectomy for Resection of an Inverted Papilloma Originating from the Entire Circumference of the Maxillary Sinus

    PubMed Central

    Wada, Kota; Ishigaki, Takashi; Ida, Yutaro; Yamada, Yuki; Hosono, Sachiko; Edamatsu, Hideo

    2015-01-01

    For treatment of a sinonasal inverted papilloma (IP), it is essential to have a definite diagnosis, to identify its origin by computed tomography (CT) and magnetic resonance imaging (MRI), and to select the appropriate surgical approach based on the staging system proposed by Krouse. Recently, a new surgical approach named endoscopic modified medial maxillectomy (EMMM) was proposed. This approach can preserve the inferior turbinate and nasolacrimal duct. We successfully treated sinonasal IP with EMMM in a 71-year-old female patient. In this patient, the sinonasal IP originated from the entire circumference of the maxillary sinus. EMMM is not a difficult procedure and provides good visibility of the operative field. Lacrimation and empty nose syndrome do not occur postoperatively as the nasolacrimal duct and inferior turbinate are preserved. EMMM is considered to be a very favorable approach for treatment of sinonasal IP. PMID:26146581

  6. Review: Authentication and traceability of foods from animal origin by polymerase chain reaction-based capillary electrophoresis.

    PubMed

    Rodríguez-Ramírez, Roberto; González-Córdova, Aarón F; Vallejo-Cordoba, Belinda

    2011-01-31

    This work presents an overview of the applicability of PCR-based capillary electrophoresis (CE) in the authentication and traceability of foods of animal origin. Analytical approaches for authenticating and tracing meat and meat products and fish and seafood products are discussed. Particular emphasis is given to the usefulness of genotyping in food tracing by using CE-based genetic analyzers.

  7. DNA replication origins—where do we begin?

    PubMed Central

    Prioleau, Marie-Noëlle; MacAlpine, David M.

    2016-01-01

    For more than three decades, investigators have sought to identify the precise locations where DNA replication initiates in mammalian genomes. The development of molecular and biochemical approaches to identify start sites of DNA replication (origins) based on the presence of defining and characteristic replication intermediates at specific loci led to the identification of only a handful of mammalian replication origins. The limited number of identified origins prevented a comprehensive and exhaustive search for conserved genomic features that were capable of specifying origins of DNA replication. More recently, the adaptation of origin-mapping assays to genome-wide approaches has led to the identification of tens of thousands of replication origins throughout mammalian genomes, providing an unprecedented opportunity to identify both genetic and epigenetic features that define and regulate their distribution and utilization. Here we summarize recent advances in our understanding of how primary sequence, chromatin environment, and nuclear architecture contribute to the dynamic selection and activation of replication origins across diverse cell types and developmental stages. PMID:27542827

  8. Comparison of Deck- and Trial-Based Approaches to Advantageous Decision Making on the Iowa Gambling Task

    ERIC Educational Resources Information Center

    Visagan, Ravindran; Xiang, Ally; Lamar, Melissa

    2012-01-01

    We compared the original deck-based model of advantageous decision making assessed with the Iowa Gambling Task (IGT) with a trial-based approach across behavioral and physiological outcomes in 33 younger adults (15 men, 18 women; 22.2 ± 3.7 years of age). One administration of the IGT with simultaneous measurement of skin conductance…

  9. Heads-Up Display with Virtual Precision Approach Path Indicator as Implemented in a Real-Time Piloted Lifting-Body Simulation

    NASA Technical Reports Server (NTRS)

    Neuhaus, Jason R.

    2018-01-01

    This document describes the heads-up display (HUD) used in a piloted lifting-body entry, approach and landing simulation developed for the simulator facilities of the Simulation Development and Analysis Branch (SDAB) at NASA Langley Research Center. The HUD symbology originated with the piloted simulation evaluations of the HL-20 lifting body concept conducted in 1989 at NASA Langley. The original symbology was roughly based on Shuttle HUD symbology, as interpreted by Langley researchers. This document focuses on the addition of the precision approach path indicator (PAPI) lights to the HUD overlay.
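
    A minimal sketch of standard four-light PAPI logic of the kind such a virtual indicator displays; the light angles below are conventional approximations for a nominal 3-degree glidepath, not the simulator's actual values.

      # Sketch of four-light PAPI logic: each light shows white when the
      # approach angle exceeds its setting, red otherwise.
      LIGHT_ANGLES = [2.5, 2.8, 3.2, 3.5]  # degrees; assumed settings

      def papi(approach_angle_deg):
          """Return the light pattern seen by the pilot, left to right."""
          return ["W" if approach_angle_deg > a else "R" for a in LIGHT_ANGLES]

      for angle in (2.2, 2.9, 3.0, 3.3, 3.8):
          print(f"{angle:.1f} deg -> {''.join(papi(angle))}")
      # 3.0 deg -> WWRR: two white, two red means on glidepath.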

  10. A Problem-Based Approach to Elastic Wave Propagation: The Role of Constraints

    ERIC Educational Resources Information Center

    Fazio, Claudio; Guastella, Ivan; Tarantino, Giovanni

    2009-01-01

    A problem-based approach to the teaching of mechanical wave propagation, focused on observation and measurement of wave properties in solids and on modelling of these properties, is presented. In particular, some experimental results, originally aimed at measuring the propagation speed of sound waves in metallic rods, are used in order to deepen…

  11. A Proposal to Plan and Develop a Sample Set of Drill and Testing Materials, Based on Audio and Visual Environmental and Situational Stimuli, Aimed at Training and Testing in the Creation of Original Utterances by Foreign Language Students at the Secondary and College Levels.

    ERIC Educational Resources Information Center

    Obrecht, Dean H.

    This report contrasts the results of a rigidly specified, pattern-oriented approach to learning Spanish with an approach that emphasizes the origination of sentences by the learner in direct response to stimuli. Pretesting and posttesting statistics are presented and conclusions are discussed. The experimental method, which required the student to…

  12. DNA replication origins-where do we begin?

    PubMed

    Prioleau, Marie-Noëlle; MacAlpine, David M

    2016-08-01

    For more than three decades, investigators have sought to identify the precise locations where DNA replication initiates in mammalian genomes. The development of molecular and biochemical approaches to identify start sites of DNA replication (origins) based on the presence of defining and characteristic replication intermediates at specific loci led to the identification of only a handful of mammalian replication origins. The limited number of identified origins prevented a comprehensive and exhaustive search for conserved genomic features that were capable of specifying origins of DNA replication. More recently, the adaptation of origin-mapping assays to genome-wide approaches has led to the identification of tens of thousands of replication origins throughout mammalian genomes, providing an unprecedented opportunity to identify both genetic and epigenetic features that define and regulate their distribution and utilization. Here we summarize recent advances in our understanding of how primary sequence, chromatin environment, and nuclear architecture contribute to the dynamic selection and activation of replication origins across diverse cell types and developmental stages.

  13. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    PubMed

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper we propose, in the 2-D case, an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Several 2-D extensions of the EMD method have recently been proposed, but despite some effort they perform poorly and are very time consuming. In this paper, therefore, an extension of the PDE-based approach to 2-D space is described in detail and applied to both signal and image decomposition. The results obtained confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, and its effectiveness encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.

  14. A nearest neighbour approach by genetic distance to the assignment of individual trees to geographic origin.

    PubMed

    Degen, Bernd; Blanc-Jolivet, Céline; Stierand, Katrin; Gillet, Elizabeth

    2017-03-01

    During the past decade, the use of DNA for forensic applications has been extensively implemented for plant and animal species, as well as in humans. Tracing back the geographical origin of an individual usually requires genetic assignment analysis. These approaches are based on reference samples that are grouped into populations or other aggregates and intend to identify the most likely group of origin. Often this grouping has no biological justification but rather a historical or political one, such as "country of origin". In this paper, we present a new nearest neighbour approach to individual assignment or classification within a given, but potentially imperfect, grouping of reference samples. This method, which is based on the genetic distance between individuals, performs better in many cases than commonly used methods. We demonstrate the operation of our assignment method using two data sets. One set is simulated for a large number of trees distributed in a 120 km by 120 km landscape with individual genotypes at 150 SNPs, and the other comprises experimental data of 1221 individuals of the African tropical tree species Entandrophragma cylindricum (Sapelli) genotyped at 61 SNPs. Judging by the level of correct self-assignment, our approach outperformed the commonly used frequency and Bayesian approaches by 15% for the simulated data set and by 5-7% for the Sapelli data set. Our new approach is less sensitive to overlapping sources of genetic differentiation, such as genetic differences among closely related species, phylogeographic lineages and isolation by distance, and thus operates better even for suboptimal grouping of individuals.
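
    The nearest-neighbour assignment idea reduces to a genetic distance computation plus an argmin; a sketch with simulated 0/1/2 genotypes (the allele frequencies and group structure are invented, not the paper's data).

      # Sketch of nearest-neighbour assignment: Manhattan distance between
      # SNP genotype vectors, query assigned to its closest reference's group.
      import numpy as np

      rng = np.random.default_rng(8)
      n_ref, n_snp = 200, 61
      group = rng.integers(2, size=n_ref)            # two reference regions
      freq = np.where(group[:, None] == 0, 0.3, 0.7) # region-specific allele freq
      ref = rng.binomial(2, np.broadcast_to(freq, (n_ref, n_snp)))

      query = rng.binomial(2, 0.7, size=n_snp)       # individual from region 1

      dist = np.abs(ref - query).sum(axis=1)         # genetic distance to references
      print("assigned group:", group[np.argmin(dist)])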

  15. Research program for a search of the origin of Darwinian evolution. Research program for a vesicle-based model of the origin of Darwinian evolution on prebiotic early Earth

    NASA Astrophysics Data System (ADS)

    Tessera, Marc

    2017-03-01

    The search for the origin of 'life' is made even more complicated by differing definitions of the subject matter, although a general consensus is that an appropriate definition should center on Darwinian evolution (Cleland and Chyba 2002). Within a physical approach defined as level-4 evolution (Tessera and Hoelzer 2013), one mechanism can be described showing that only three conditions are required for natural selection to apply to populations of different system lineages. This approach leads to a vesicle-based model with the necessary properties. Such a model, of course, has to be tested. Thus, after a brief presentation of the model, an experimental program is proposed that implements the different steps able to show whether this new direction of research in the field is valid and workable.

  16. Research program for a search of the origin of Darwinian evolution : Research program for a vesicle-based model of the origin of Darwinian evolution on prebiotic early Earth.

    PubMed

    Tessera, Marc

    2017-03-01

    The search for the origin of 'life' is made even more complicated by differing definitions of the subject matter, although a general consensus is that an appropriate definition should center on Darwinian evolution (Cleland and Chyba 2002). Within a physical approach defined as level-4 evolution (Tessera and Hoelzer 2013), one mechanism can be described showing that only three conditions are required for natural selection to apply to populations of different system lineages. This approach leads to a vesicle-based model with the necessary properties. Such a model, of course, has to be tested. Thus, after a brief presentation of the model, an experimental program is proposed that implements the different steps able to show whether this new direction of research in the field is valid and workable.

  17. A retrieval-based approach to eliminating hindsight bias.

    PubMed

    Van Boekel, Martin; Varma, Keisha; Varma, Sashank

    2017-03-01

    Individuals exhibit hindsight bias when they are unable to recall their original responses to novel questions after correct answers are provided to them. Prior studies have eliminated hindsight bias by modifying the conditions under which original judgments or correct answers are encoded. Here, we explored whether hindsight bias can be eliminated by manipulating the conditions that hold at retrieval. Our retrieval-based approach predicts that if the conditions at retrieval enable sufficient discrimination of memory representations of original judgments from memory representations of correct answers, then hindsight bias will be reduced or eliminated. Experiment 1 used the standard memory design to replicate the hindsight bias effect in middle-school students. Experiments 2 and 3 modified the retrieval phase of this design, instructing participants beforehand that they would be recalling both their original judgments and the correct answers. As predicted, this enabled participants to form compound retrieval cues that discriminated original judgment traces from correct answer traces, and eliminated hindsight bias. Experiment 4 found that when participants were not instructed beforehand that they would be making both recalls, they did not form discriminating retrieval cues, and hindsight bias returned. These experiments delineate the retrieval conditions that produce, and fail to produce, hindsight bias.

  18. EXTENDING AQUATIC CLASSIFICATION TO THE LANDSCAPE SCALE HYDROLOGY-BASED STRATEGIES

    EPA Science Inventory

    Aquatic classification of single water bodies (lakes, wetlands, estuaries) is often based on geologic origin, while stream classification has relied on multiple factors related to landform, geomorphology, and soils. We have developed an approach to aquatic classification based o...

  19. Adaptive distance metric learning for diffusion tensor image segmentation.

    PubMed

    Kong, Youyong; Wang, Defeng; Shi, Lin; Hui, Steve C N; Chu, Winnie C W

    2014-01-01

    High quality segmentation of diffusion tensor images (DTI) is of key interest in biomedical research and clinical application. In previous studies, most efforts have been made to construct predefined metrics for different DTI segmentation tasks. These methods require adequate prior knowledge and tuning parameters. To overcome these disadvantages, we proposed to automatically learn an adaptive distance metric by a graph based semi-supervised learning model for DTI segmentation. An original discriminative distance vector was first formulated by combining both geometry and orientation distances derived from diffusion tensors. The kernel metric over the original distance and labels of all voxels were then simultaneously optimized in a graph based semi-supervised learning approach. Finally, the optimization task was efficiently solved with an iterative gradient descent method to achieve the optimal solution. With our approach, an adaptive distance metric could be available for each specific segmentation task. Experiments on synthetic and real brain DTI datasets were performed to demonstrate the effectiveness and robustness of the proposed distance metric learning approach. The performance of our approach was compared with three classical metrics in the graph based semi-supervised learning framework.

  20. Adaptive Distance Metric Learning for Diffusion Tensor Image Segmentation

    PubMed Central

    Kong, Youyong; Wang, Defeng; Shi, Lin; Hui, Steve C. N.; Chu, Winnie C. W.

    2014-01-01

    High quality segmentation of diffusion tensor images (DTI) is of key interest in biomedical research and clinical application. In previous studies, most efforts have been made to construct predefined metrics for different DTI segmentation tasks. These methods require adequate prior knowledge and tuning parameters. To overcome these disadvantages, we proposed to automatically learn an adaptive distance metric by a graph based semi-supervised learning model for DTI segmentation. An original discriminative distance vector was first formulated by combining both geometry and orientation distances derived from diffusion tensors. The kernel metric over the original distance and labels of all voxels were then simultaneously optimized in a graph based semi-supervised learning approach. Finally, the optimization task was efficiently solved with an iterative gradient descent method to achieve the optimal solution. With our approach, an adaptive distance metric could be available for each specific segmentation task. Experiments on synthetic and real brain DTI datasets were performed to demonstrate the effectiveness and robustness of the proposed distance metric learning approach. The performance of our approach was compared with three classical metrics in the graph based semi-supervised learning framework. PMID:24651858
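
    A hedged sketch of a combined geometry-plus-orientation tensor distance of the general kind described above; the paper's exact formulation and weights are not reproduced.

      # Sketch of a combined tensor distance: a geometry term (log-Euclidean
      # distance between tensors) plus an orientation term (angle between
      # principal eigenvectors). Weights are illustrative.
      import numpy as np

      def spd_log(D):
          """Matrix logarithm of a symmetric positive-definite tensor."""
          w, V = np.linalg.eigh(D)
          return V @ np.diag(np.log(w)) @ V.T

      def tensor_distance(D1, D2, w_geo=1.0, w_ori=1.0):
          geo = np.linalg.norm(spd_log(D1) - spd_log(D2))    # log-Euclidean term
          v1 = np.linalg.eigh(D1)[1][:, -1]                  # principal directions
          v2 = np.linalg.eigh(D2)[1][:, -1]
          ori = np.arccos(np.clip(abs(v1 @ v2), 0.0, 1.0))   # sign-invariant angle
          return w_geo * geo + w_ori * ori

      D1 = np.diag([1.7e-3, 3e-4, 3e-4])                     # prolate tensor along x
      R = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
      D2 = R @ D1 @ R.T                                      # same shape, rotated 90 deg
      print("distance:", tensor_distance(D1, D2))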

  1. Plants and Chemistry: A Teaching Course Based on the Chemistry of Substances of Plant Origin

    NASA Astrophysics Data System (ADS)

    Andreoli, Katia; Calascibetta, Franco; Campanella, Luigi; Favero, Gabriele; Occhionero, Francesca

    2002-08-01

    Over the past few years, we developed an approach to the teaching of chemistry that draws out the links between theory and the real world. The principles, concepts, and experimental procedures of chemistry were illustrated through an original approach based on useful substances obtained from plants. The starting point was substances that have always been obtained from trees and vegetables. The approach was implemented during many refresher courses for secondary school teachers of chemistry. The courses were divided into sections, each called "Plants and ...", dedicated to colors, odors, tastes, medicines and drugs, fibers, soaps, and alcoholic beverages. Each section consisted of a theoretical lesson followed by a laboratory session.

  2. The eukaryotic cell originated in the integration and redistribution of hyperstructures from communities of prokaryotic cells based on molecular complementarity.

    PubMed

    Norris, Vic; Root-Bernstein, Robert

    2009-06-04

    In the "ecosystems-first" approach to the origins of life, networks of non-covalent assemblies of molecules (composomes), rather than individual protocells, evolved under the constraints of molecular complementarity. Composomes evolved into the hyperstructures of modern bacteria. We extend the ecosystems-first approach to explain the origin of eukaryotic cells through the integration of mixed populations of bacteria. We suggest that mutualism and symbiosis resulted in cellular mergers entailing the loss of redundant hyperstructures, the uncoupling of transcription and translation, and the emergence of introns and multiple chromosomes. Molecular complementarity also facilitated integration of bacterial hyperstructures to perform cytoskeletal and movement functions.

  3. Increasing Critical Thinking in Web-Based Graduate Management Courses

    ERIC Educational Resources Information Center

    Condon, Conna; Valverde, Raul

    2014-01-01

    A common approach for demonstrating learning in online classrooms is through submittal of research essays of a discussion topic followed by classroom participation. Issues arose at an online campus of a university regarding the originality and quality of critical thinking in the original submittals. Achievement of new course objectives oriented to…

  4. Learning Agreements and Socially Responsible Approaches to Professional and Human Resource Development in the United Kingdom

    ERIC Educational Resources Information Center

    Wallis, Emma

    2008-01-01

    This article draws upon original qualitative data to present an initial assessment of the significance of learning agreements for the development of socially responsible approaches to professional and human resource development within the workplace. The article suggests that the adoption of a partnership-based approach to learning is more…

  5. Approximate Dynamic Programming: Combining Regional and Local State Following Approximations.

    PubMed

    Deptula, Patryk; Rosenfeld, Joel A; Kamalapurkar, Rushikesh; Dixon, Warren E

    2018-06-01

    An infinite-horizon optimal regulation problem for a control-affine deterministic system is solved online using a local state following (StaF) kernel and a regional model-based reinforcement learning (R-MBRL) method to approximate the value function. Unlike traditional methods such as R-MBRL that aim to approximate the value function over a large compact set, the StaF kernel approach aims to approximate the value function in a local neighborhood of the state that travels within a compact set. In this paper, the value function is approximated using a state-dependent convex combination of the StaF-based and the R-MBRL-based approximations. As the state enters a neighborhood containing the origin, the value function transitions from being approximated by the StaF approach to the R-MBRL approach. Semiglobal uniformly ultimately bounded (SGUUB) convergence of the system states to the origin is established using a Lyapunov-based analysis. Simulation results are provided for two-, three-, six-, and ten-state dynamical systems to demonstrate the scalability and performance of the developed method.
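
    The state-dependent convex combination described here can be sketched as follows; the blending function, its radius, and the stand-in value functions are illustrative assumptions, not the paper's construction.

      # Sketch of a state-dependent convex blend of two value-function
      # approximations: near the origin the regional (R-MBRL) estimate
      # dominates; far away the local (StaF) estimate does.
      import numpy as np

      def blend_weight(x, r=1.0):
          """1 far from the origin (use StaF), 0 near it (use R-MBRL)."""
          return 1.0 - np.exp(-np.dot(x, x) / r**2)

      def V_hat(x, V_staf, V_rmbrl):
          lam = blend_weight(x)
          return lam * V_staf(x) + (1.0 - lam) * V_rmbrl(x)

      V_staf = lambda x: 1.1 * np.dot(x, x)    # stand-in local approximation
      V_rmbrl = lambda x: 1.0 * np.dot(x, x)   # stand-in regional approximation
      for x in (np.array([0.1, 0.0]), np.array([3.0, 0.0])):
          print(x, V_hat(x, V_staf, V_rmbrl))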

  6. Assigning breed origin to alleles in crossbred animals.

    PubMed

    Vandenplas, Jérémie; Calus, Mario P L; Sevillano, Claudia A; Windig, Jack J; Bastiaansen, John W M

    2016-08-22

    For some species, animal production systems are based on the use of crossbreeding to take advantage of the increased performance of crossbred compared to purebred animals. Effects of single nucleotide polymorphisms (SNPs) may differ between purebred and crossbred animals for several reasons: (1) differences in linkage disequilibrium between SNP alleles and a quantitative trait locus; (2) differences in genetic backgrounds (e.g., dominance and epistatic interactions); and (3) differences in environmental conditions, which result in genotype-by-environment interactions. Thus, SNP effects may be breed-specific, which has led to the development of genomic evaluations for crossbred performance that take such effects into account. However, to estimate breed-specific effects, it is necessary to know the breed origin of alleles in crossbred animals. Therefore, our aim was to develop an approach for assigning breed origin to alleles of crossbred animals (termed BOA) without information on pedigree and to study its accuracy by considering various factors, including the distance between breeds. The BOA approach consists of: (1) phasing genotypes of purebred and crossbred animals; (2) assigning breed origin to phased haplotypes; and (3) assigning breed origin to alleles of crossbred animals based on a library of assigned haplotypes, the breed composition of crossbred animals, and their SNP genotypes. The accuracy of allele assignments was determined for simulated datasets that include crosses between closely related, distantly related and unrelated breeds. Across these scenarios, the percentage of alleles of a crossbred animal that were correctly assigned to their breed origin was greater than 90%, and increased with increasing distance between breeds, while the percentage of incorrectly assigned alleles was always less than 2%. For the remaining alleles, i.e. 0 to 10% of all alleles of a crossbred animal, breed origin could not be assigned. The BOA approach accurately assigns breed origin to alleles of crossbred animals, even if their pedigree is not recorded.

  7. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification-based approach, and compare it to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  8. Semistochastic approach to many electron systems

    NASA Astrophysics Data System (ADS)

    Grossjean, M. K.; Grossjean, M. F.; Schulten, K.; Tavan, P.

    1992-08-01

    A Pariser-Parr-Pople (PPP) Hamiltonian of an 8π-electron system of the molecule octatetraene, represented in a configuration-interaction (CI) basis, is analyzed with respect to the statistical properties of its matrix elements. Based on this analysis, we develop an effective Hamiltonian which represents virtual excitations by a Gaussian orthogonal ensemble (GOE). We also examine numerical approaches which replace the original Hamiltonian by a semistochastically generated CI matrix, in which the matrix elements of high-energy excitations are chosen randomly according to distributions reflecting the statistics of the original CI matrix.

  9. Stratification of co-evolving genomic groups using ranked phylogenetic profiles

    PubMed Central

    Freilich, Shiri; Goldovsky, Leon; Gottlieb, Assaf; Blanc, Eric; Tsoka, Sophia; Ouzounis, Christos A

    2009-01-01

    Background: Previous methods of detecting the taxonomic origins of arbitrary sequence collections, with a significant impact on genome analysis and in particular metagenomics, have primarily focused on compositional features of genomes. The evolutionary patterns of phylogenetic distribution of genes or proteins, represented by phylogenetic profiles, provide an alternative approach for the detection of taxonomic origins, but typically suffer from low accuracy. Herein, we present rank-BLAST, a novel approach for the assignment of protein sequences into genomic groups of the same taxonomic origin, based on the ranking order of phylogenetic profiles of target genes or proteins across the reference database.

    Results: The rank-BLAST approach is validated by computing the phylogenetic profiles of all sequences for five distinct microbial species of varying degrees of phylogenetic proximity, against a reference database of 243 fully sequenced genomes. The approach, a combination of sequence searches, statistical estimation, and clustering, analyses the degree of sequence divergence between sets of protein sequences and allows the classification of protein sequences according to the species of origin with high accuracy, allowing taxonomic classification of 64% of the proteins studied. In most cases, a main cluster is detected, representing the corresponding species. Secondary, functionally distinct and species-specific clusters exhibit different patterns of phylogenetic distribution, thus flagging gene groups of interest. Detailed analyses of such cases are provided as examples.

    Conclusion: Our results indicate that the rank-BLAST approach can capture the taxonomic origins of sequence collections in an accurate and efficient manner. The approach can be useful both for the analysis of genome evolution and the detection of species groups in metagenomics samples. PMID:19860884

  10. Benevolent Paradox: Integrating Community-Based Empowerment and Transdisciplinary Research Approaches into Traditional Frameworks to Increase Funding and Long-Term Sustainability of Chicano-Community Research Programs

    ERIC Educational Resources Information Center

    de la Torre, Adela

    2014-01-01

    Niños Sanos, Familia Sana (NSFS) is a 5-year multi-intervention study aimed at preventing childhood obesity among Mexican-origin children in rural California. Using a transdisciplinary approach and community-based participatory research (CBPR) methodology, NSFS's development included a diversely trained team working in collaboration with community…

  11. Trends in Research on Project-Based Science and Technology Teaching and Learning at K-12 Levels: A Systematic Review

    ERIC Educational Resources Information Center

    Hasni, Abdelkrim; Bousadra, Fatima; Belletête, Vincent; Benabdallah, Ahmed; Nicole, Marie-Claude; Dumais, Nancy

    2016-01-01

    Project-based teaching is nothing new; it originates from the work of authors like Dewey and Kilpatrick. Recent decades have seen renewed interest in this approach. In many countries, it is currently considered to be an innovative approach to science and technology (S&T) teaching. In this article, we present a systematic review of what recent…

  12. Arts-Based Research: Trojan Horses and Shibboleths. The Liabilities of a Hybrid Research Approach. "What Hath Eisner Wrought?"

    ERIC Educational Resources Information Center

    Pariser, David

    2009-01-01

    The term "arts-based research" has been debated for some time now. In an article strongly in favor of this approach Bean (2007) identifies three species: "Research on the arts (italics in the original) (art history, visual and cultural studies, media studies etc.)...Research for the arts, refers to research into applied techniques, materials and…

  13. Interdisciplinary collaboration in gerontology and geriatrics in Latin America: conceptual approaches and health care teams.

    PubMed

    Gomez, Fernando; Curcio, Carmen Lucia

    2013-01-01

    The underlying rationale for interdisciplinary collaboration in geriatrics and gerontology is the complexity of elderly care. The most important characteristic of interdisciplinary health care teams for older people in Latin America is their subjective-basis framework. In other regions, teams are organized according to a theoretical knowledge basis with well-justified priorities, functions, and long-term goals; in Latin America, teams are arranged according to subjective interests in solving their problems. Three distinct approaches to interdisciplinary collaboration in gerontology are proposed. The first, the "logical-rational approach," is grounded in the scientific rationalism of European origin; its core is to identify the significance of knowledge. The second, the "logical-instrumental approach," is grounded in pragmatism and is more associated with a North American tradition; its core is enhancing the skills and competences of each participant. The third, the "logical-subjective approach," has a Latin American origin; its core is taking into account the internal and emotional dimensions of the team. These conceptual frameworks, based in geographical contexts, permit establishing the differences and shared characteristics of interdisciplinary collaboration in geriatrics and gerontology in the search for operational answers to the "complex problems" of older adults.

  14. Tracing common origins of Genomic Islands in prokaryotes based on genome signature analyses.

    PubMed

    van Passel, Mark Wj

    2011-09-01

    Horizontal gene transfer constitutes a powerful and innovative force in evolution, but often little is known about the actual origins of transferred genes. Sequence alignments are generally of limited use in tracking the original donor, since only a small fraction of the total genetic diversity is thought to have been uncovered. Alternatively, approaches based on similarities in genome-specific relative oligonucleotide frequencies do not require alignments. Even though the exact origins of horizontally transferred genes may still not be established using these compositional analyses, compositionally very similar regions are likely to have had a common origin. Such analyses have shown that up to a third of large acquired gene clusters that reside in the same genome are compositionally very similar, indicative of a shared origin. This brings us closer to uncovering the original donors of horizontally transferred genes, and could help in elucidating possible regulatory interactions between previously unlinked sequences.

  15. The Atomic Origin of the Reflection Law

    ERIC Educational Resources Information Center

    Prytz, Kjell

    2016-01-01

    It will be demonstrated how the reflection law may be derived on an atomic basis using the plane wave approximation together with Huygens' principle. The model utilized is based on the electric dipole character of matter originating from its molecular constituents. This approach is not new but has, since it was first introduced by Ewald and Oseen…

  16. Perspectives on the Origins of Life in Science Textbooks from a Christian Publisher: Implications for Teaching Science

    ERIC Educational Resources Information Center

    Santos Baptista, Geilsa Costa; da Silva Santos, Rodrigo; Cobern, William W.

    2016-01-01

    This paper presents the results of research regarding approaches to the origin of life featured in science textbooks produced by an Evangelical publisher. The research nature was qualitative with document analysis and an interpretive framework based on Epistemological Pluralism. Overall, the results indicate that there are four perspectives on the…

  17. Introduction to "The Behavior-Analytic Origins of Constraint-Induced Movement Therapy: An Example of Behavioral Neurorehabilitation"

    ERIC Educational Resources Information Center

    Schaal, David W.

    2012-01-01

    This article presents an introduction to "The Behavior-Analytic Origins of Constraint-Induced Movement Therapy: An Example of Behavioral Neurorehabilitation," by Edward Taub and his colleagues (Taub, 2012). Based on extensive experimentation with animal models of peripheral nerve injury, Taub and colleagues have created an approach to overcoming…

  18. A Case Study of Pharmaceutical Pricing in China: Setting the Price for Off-Patent Originators.

    PubMed

    Hu, Shanlian; Zhang, Yabing; He, Jiangjiang; Du, Lixia; Xu, Mingfei; Xie, Chunyan; Peng, Ying; Wang, Linan

    2015-08-01

    This article aims to define a value-based approach to pricing and reimbursement for off-patent originators, using multiple criteria decision analysis (MCDA) centered on a systematic analysis of current pricing and reimbursement policies in China. A drug price policy review was combined with a quantitative analysis of China's drug purchasing database. Policy preferences were identified through an MCDA performed by interviewing well-known academic experts and industry stakeholders. The study findings indicate that the current Chinese price policy includes cost-based pricing and the establishment of maximum retail prices and premiums for off-patent originators, whereas reference pricing may be adopted in the future. The literature review revealed significant differences in the dissolution profiles of originators and generics; therefore, dissolution profiles need to be improved. Market data analysis showed that the overall price ratio of generics to off-patent originators was around 0.54-0.59 in 2002-2011, a 40% price difference on average. Ten differentiating value attributes were identified, and MCDA was applied to test the impact of three pricing policy scenarios. On the condition that quality consistency regulations and controls are implemented, a reduction in the price gap between high-quality off-patent products (including originators and generics) seemed to be the preferred policy. Patents of many drugs will expire within the next 10 years; thus, pricing will be an issue of importance for off-patent originators and generic alternatives.

  19. Representation of photon limited data in emission tomography using origin ensembles

    NASA Astrophysics Data System (ADS)

    Sitek, A.

    2008-06-01

    Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained using tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of the origins of detected events in three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable (e.g. grid-based) image is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a simulated 10,000,000-event acquisition ranged from 0.1 to 34.8%, depending on object size and sampling density. The method was also applied to experimental data, and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to the representation and reconstruction of data obtained by photon-limited emission tomography measurements.
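
    The sketch below is a deliberately loose one-dimensional illustration of the origin-ensemble idea, not the paper's actual 3N-dimensional PDF or its list-mode PET geometry: each detected event carries a candidate origin, and single-event Metropolis moves are accepted according to a density-based ensemble probability that rewards clustering of origins.

    import numpy as np

    rng = np.random.default_rng(2)
    n_events, n_bins = 2000, 50
    origins = rng.uniform(0.0, 1.0, n_events)   # initial origin guesses on [0, 1)

    def log_prob(x):
        # Density-based ensemble score: higher when events share bins.
        counts, _ = np.histogram(x, bins=n_bins, range=(0.0, 1.0))
        n = counts[counts > 0]
        return float(np.sum(n * np.log(n)))

    lp = log_prob(origins)
    for _ in range(20000):
        i = rng.integers(n_events)
        old = origins[i]
        origins[i] = rng.uniform(0.0, 1.0)      # propose a new origin for one event
        lp_new = log_prob(origins)
        if np.log(rng.random()) >= lp_new - lp:
            origins[i] = old                    # reject the move
        else:
            lp = lp_new

    image, _ = np.histogram(origins, bins=n_bins, range=(0.0, 1.0))
    print(image)  # an OE image would average many such states (ensemble expectation)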

  20. Mapping yeast origins of replication via single-stranded DNA detection.

    PubMed

    Feng, Wenyi; Raghuraman, M K; Brewer, Bonita J

    2007-02-01

    Studies in the yeast Saccharomyces cerevisiae have provided a framework for understanding how eukaryotic cells replicate their chromosomal DNA to ensure faithful transmission of genetic information to their daughter cells. In particular, S. cerevisiae is the first eukaryote to have its origins of replication mapped on a genomic scale, by three independent groups using three different microarray-based approaches. Here we describe a new technique of origin mapping via detection of single-stranded DNA in yeast. This method not only identified the majority of previously discovered origins, but also detected new ones. We have also shown that this technique can identify origins in Schizosaccharomyces pombe, illustrating the utility of this method for origin mapping in other eukaryotes.

  1. A novel critical infrastructure resilience assessment approach using dynamic Bayesian networks

    NASA Astrophysics Data System (ADS)

    Cai, Baoping; Xie, Min; Liu, Yonghong; Liu, Yiliu; Ji, Renjie; Feng, Qiang

    2017-10-01

    The word resilience originates from the Latin word "resiliere", which means to "bounce back". The concept has been used in various fields, such as ecology, economics, psychology, and sociology, with different definitions. In the field of critical infrastructure, although some resilience metrics have been proposed, they differ substantially from one another, being determined by the performance of the particular objects under evaluation. Here we bridge the gap by developing a universal critical infrastructure resilience metric from the perspective of reliability engineering. A dynamic Bayesian networks-based assessment approach is proposed to calculate the resilience value. A series, parallel and voting system is used to demonstrate the application of the developed resilience metric and assessment approach.

  2. Linear decomposition approach for a class of nonconvex programming problems.

    PubMed

    Shen, Peiping; Wang, Chunfeng

    2017-01-01

    This paper presents a linear decomposition approach for a class of nonconvex programming problems by dividing the input space into polynomially many grids. It shows that, under certain assumptions, the original problem can be transformed and decomposed into a polynomial number of equivalent linear programming subproblems. By solving a series of linear programming subproblems corresponding to those grid points, we can obtain a near-optimal solution of the original problem. Compared to existing results in the literature, the proposed algorithm does not require the assumptions of quasi-concavity and differentiability of the objective function, and it offers an interesting approach to solving the problem with reduced running time.
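
    As a caricature of the grid idea only (a toy two-dimensional problem; the paper's actual construction and its equivalence guarantees are not reproduced): partition the box into cells, linearize the nonconvex objective at each cell centre, solve one linear program per cell, and keep the best candidate.

    import numpy as np
    from scipy.optimize import linprog

    f = lambda x: np.sin(3 * x[0]) + 0.5 * x[1] ** 2         # nonconvex objective
    grad = lambda x: np.array([3 * np.cos(3 * x[0]), x[1]])  # its gradient

    best, best_val = None, np.inf
    edges = np.linspace(0.0, 2.0, 5)                         # 4x4 grid on [0, 2]^2
    for i in range(4):
        for j in range(4):
            lo = np.array([edges[i], edges[j]])
            hi = np.array([edges[i + 1], edges[j + 1]])
            c = grad((lo + hi) / 2)                          # linearized objective
            res = linprog(c, bounds=list(zip(lo, hi)))       # LP subproblem on the cell
            if res.success and f(res.x) < best_val:
                best, best_val = res.x, f(res.x)

    print(best, best_val)  # near-optimal; finer grids tighten the approximation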

  3. Low-Dimensional Nanostructures and a Semiclassical Approach for Teaching Feynman's Sum-over-Paths Quantum Theory

    ERIC Educational Resources Information Center

    Onorato, P.

    2011-01-01

    An introduction to quantum mechanics based on the sum-over-paths (SOP) method originated by Richard P. Feynman and developed by E. F. Taylor and coworkers is presented. The Einstein-Brillouin-Keller (EBK) semiclassical quantization rules are obtained following the SOP approach for bounded systems, and a general approach to the calculation of…

  4. Dynamic Emulation Modelling (DEMo) of large physically-based environmental models

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Castelletti, A.

    2012-12-01

    In environmental modelling, large, spatially-distributed, physically-based models are widely adopted to describe the dynamics of physical, social and economic processes. Such an accurate process characterization comes, however, at a price: the computational requirements of these models are considerably high and prevent their use in any problem requiring hundreds or thousands of model runs to be satisfactorily solved. Typical examples include optimal planning and management, data assimilation, inverse modelling and sensitivity analysis. An effective approach to overcome this limitation is to perform a top-down reduction of the physically-based model by identifying a simplified, computationally efficient emulator, constructed from and then used in place of the original model in highly resource-demanding tasks. The underlying idea is that not all the process details in the original model are equally important and relevant to the dynamics of the outputs of interest for the type of problem considered. Emulation modelling has been successfully applied in many environmental applications; however, most of the literature considers non-dynamic emulators (e.g. metamodels, response surfaces and surrogate models), where the original dynamical model is reduced to a static map between input and the output of interest. In this study we focus on Dynamic Emulation Modelling (DEMo), a methodological approach that preserves the dynamic nature of the original physically-based model, with consequent advantages in a wide variety of problem areas. In particular, we propose a new data-driven DEMo approach that combines the many advantages of data-driven modelling in representing complex, non-linear relationships, but preserves the state-space representation typical of process-based models, which is both particularly effective in some applications (e.g. optimal management and data assimilation) and facilitates the ex-post physical interpretation of the emulator structure, thus enhancing the credibility of the model to stakeholders and decision-makers. Numerical results from the application of the approach to the reduction of 3D coupled hydrodynamic-ecological models in several real world case studies, including Marina Reservoir (Singapore) and Googong Reservoir (Australia), are illustrated.

  5. Mapping copy number variation by population-scale genome sequencing.

    PubMed

    Mills, Ryan E; Walter, Klaudia; Stewart, Chip; Handsaker, Robert E; Chen, Ken; Alkan, Can; Abyzov, Alexej; Yoon, Seungtai Chris; Ye, Kai; Cheetham, R Keira; Chinwalla, Asif; Conrad, Donald F; Fu, Yutao; Grubert, Fabian; Hajirasouliha, Iman; Hormozdiari, Fereydoun; Iakoucheva, Lilia M; Iqbal, Zamin; Kang, Shuli; Kidd, Jeffrey M; Konkel, Miriam K; Korn, Joshua; Khurana, Ekta; Kural, Deniz; Lam, Hugo Y K; Leng, Jing; Li, Ruiqiang; Li, Yingrui; Lin, Chang-Yun; Luo, Ruibang; Mu, Xinmeng Jasmine; Nemesh, James; Peckham, Heather E; Rausch, Tobias; Scally, Aylwyn; Shi, Xinghua; Stromberg, Michael P; Stütz, Adrian M; Urban, Alexander Eckehart; Walker, Jerilyn A; Wu, Jiantao; Zhang, Yujun; Zhang, Zhengdong D; Batzer, Mark A; Ding, Li; Marth, Gabor T; McVean, Gil; Sebat, Jonathan; Snyder, Michael; Wang, Jun; Ye, Kenny; Eichler, Evan E; Gerstein, Mark B; Hurles, Matthew E; Lee, Charles; McCarroll, Steven A; Korbel, Jan O

    2011-02-03

    Genomic structural variants (SVs) are abundant in humans, differing from other forms of variation in extent, origin and functional impact. Despite progress in SV characterization, the nucleotide resolution architecture of most SVs remains unknown. We constructed a map of unbalanced SVs (that is, copy number variants) based on whole genome DNA sequencing data from 185 human genomes, integrating evidence from complementary SV discovery approaches with extensive experimental validations. Our map encompassed 22,025 deletions and 6,000 additional SVs, including insertions and tandem duplications. Most SVs (53%) were mapped to nucleotide resolution, which facilitated analysing their origin and functional impact. We examined numerous whole and partial gene deletions with a genotyping approach and observed a depletion of gene disruptions amongst high frequency deletions. Furthermore, we observed differences in the size spectra of SVs originating from distinct formation mechanisms, and constructed a map of SV hotspots formed by common mechanisms. Our analytical framework and SV map serve as a resource for sequencing-based association studies.

  6. New Ideas and Approaches

    ERIC Educational Resources Information Center

    Lukov, V. A.

    2014-01-01

    The article examines theories of youth that have been proposed in the past few years by Russian scientists, and presents the author's original version of a theory of youth that is based on the thesaurus methodological approach. It addresses the ways in which biosocial characteristics may be reflected in new theories of youth.

  7. David Dorfman's "Here": A Community-Building Approach in Dance Education

    ERIC Educational Resources Information Center

    Parrish, Mila

    2009-01-01

    Over a three-month period, Arcadia High School and Arizona State University formed a community partnership with the help of New York modern dance choreographer David Dorfman to create an original dance titled "Here." Dorfman's community-building approach is based on personal reflection, collaboration, critical thinking, problem solving,…

  8. Prediction of Radial Vibration in Switched Reluctance Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, CJ; Fahimi, B

    2013-12-01

    Origins of vibration in switched reluctance machines (SRMs) are investigated. Accordingly, an input-output model based on the mechanical impulse response of the SRM is developed. The proposed model is derived using an experimental approach. Using the proposed approach, vibration of the stator frame is captured and experimentally verified.

  9. An adaptive Gaussian process-based method for efficient Bayesian experimental design in groundwater contaminant source identification problems: ADAPTIVE GAUSSIAN PROCESS-BASED INVERSION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiangjiang; Li, Weixuan; Zeng, Lingzao

    Surrogate models are commonly used in Bayesian approaches such as Markov Chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimations of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or implementing MCMC in a two-stage manner. Since the two-stage MCMC requires extra original model evaluations, the computational cost is still high. If the information of measurement is incorporated, a locally accurate approximation of the original model can be adaptively constructed with low computational cost. Based on this idea, we propose a Gaussian process (GP) surrogate-based Bayesian experimental design and parameter estimation approach for groundwater contaminant source identification problems. A major advantage of the GP surrogate is that it provides a convenient estimation of the approximation error, which can be incorporated in the Bayesian formula to avoid over-confident estimation of the posterior distribution. The proposed approach is tested with a numerical case study. Without sacrificing the estimation accuracy, the new approach achieves about a 200-fold speed-up compared to our previous work using two-stage MCMC.
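
    A hedged sketch of the central trick follows: the GP's predictive variance is added to the observation-error variance inside a Metropolis sampler, so surrogate uncertainty widens rather than biases the posterior. The one-parameter forward model, flat prior and tuning constants are invented for illustration; the adaptive refinement and experimental design parts of the work are not reproduced.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(3)
    model = lambda s: np.sin(s) + 0.1 * s**2           # stand-in "expensive" model
    X = rng.uniform(-3, 3, 25).reshape(-1, 1)          # design runs of the model
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
    gp.fit(X, model(X.ravel()))

    y_obs, sigma_obs = model(1.2), 0.05                # synthetic observation
    def log_post(s):
        mu, sd = gp.predict(np.array([[s]]), return_std=True)
        var = sigma_obs**2 + sd[0]**2                  # GP error folded into the likelihood
        return -0.5 * (y_obs - mu[0])**2 / var - 0.5 * np.log(var)

    chain, s = [], 0.0
    for _ in range(5000):                              # random-walk Metropolis
        s_new = s + 0.3 * rng.standard_normal()
        if np.log(rng.random()) < log_post(s_new) - log_post(s):
            s = s_new
        chain.append(s)
    print(np.mean(chain[1000:]), np.std(chain[1000:]))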

  10. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    NASA Technical Reports Server (NTRS)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  11. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  12. High-resolution mapping, characterization, and optimization of autonomously replicating sequences in yeast

    PubMed Central

    Liachko, Ivan; Youngblood, Rachel A.; Keich, Uri; Dunham, Maitreya J.

    2013-01-01

    DNA replication origins are necessary for the duplication of genomes. In addition, plasmid-based expression systems require DNA replication origins to maintain plasmids efficiently. The yeast autonomously replicating sequence (ARS) assay has been a valuable tool in dissecting replication origin structure and function. However, the dearth of information on origins in diverse yeasts limits the availability of efficient replication origin modules to only a handful of species and restricts our understanding of origin function and evolution. To enable rapid study of origins, we have developed a sequencing-based suite of methods for comprehensively mapping and characterizing ARSs within a yeast genome. Our approach finely maps genomic inserts capable of supporting plasmid replication and uses massively parallel deep mutational scanning to define molecular determinants of ARS function with single-nucleotide resolution. In addition to providing unprecedented detail into origin structure, our data have allowed us to design short, synthetic DNA sequences that retain maximal ARS function. These methods can be readily applied to understand and modulate ARS function in diverse systems. PMID:23241746

  13. The Use of Original Sources and Its Potential Relation to the Recruitment Problem

    ERIC Educational Resources Information Center

    Jankvist, Uffe Thomas

    2014-01-01

    Based on a study about using original sources with Danish upper secondary students, the paper addresses the potential outcome of such an approach in regard to the so-called recruitment problem to the mathematical sciences. 24 students were exposed to questionnaire questions and 16 of these to follow-up interviews, which form the basis for both a…

  14. Prediction of Protein Structure by Template-Based Modeling Combined with the UNRES Force Field.

    PubMed

    Krupa, Paweł; Mozolewska, Magdalena A; Joo, Keehyoung; Lee, Jooyoung; Czaplewski, Cezary; Liwo, Adam

    2015-06-22

    A new approach to the prediction of protein structures that uses distance and backbone virtual-bond dihedral angle restraints derived from template-based models and simulations with the united residue (UNRES) force field is proposed. The approach combines the accuracy and reliability of template-based methods for the segments of the target sequence with high similarity to those having known structures with the ability of UNRES to pack the domains correctly. Multiplexed replica-exchange molecular dynamics with restraints derived from template-based models of a given target, in which each restraint is weighted according to the accuracy of the prediction of the corresponding section of the molecule, is used to search the conformational space, and the weighted histogram analysis method and cluster analysis are applied to determine the families of the most probable conformations, from which candidate predictions are selected. To test the capability of the method to recover template-based models from restraints, five single-domain proteins with structures that have been well-predicted by template-based methods were used; it was found that the resulting structures were of the same quality as the best of the original models. To assess whether the new approach can improve template-based predictions with incorrectly predicted domain packing, four such targets were selected from the CASP10 targets; for three of them the new approach resulted in significantly better predictions compared with the original template-based models. The new approach can be used to predict the structures of proteins for which good templates can be found for sections of the sequence or an overall good template can be found for the entire sequence but the prediction quality is remarkably weaker in putative domain-linker regions.

  15. Progressive transmission of secured images with authentication using decompositions into monovariate functions

    NASA Astrophysics Data System (ADS)

    Leni, Pierre-Emmanuel; Fougerolle, Yohan D.; Truchetet, Frédéric

    2014-05-01

    We propose an approach for the progressive transmission of an image authenticated by an overlapping subimage that can later be removed to restore the original image. Our approach differs from most removable visible watermarking approaches because the mark is not introduced directly in the two-dimensional image space; it is instead applied to an equivalent monovariate representation of the image. Precisely, the approach builds on our progressive transmission scheme based on a modified Kolmogorov spline network, and therefore inherits its advantages: resilience to packet losses during transmission and support for heterogeneous display environments. The marked image can be accessed at any intermediate resolution, and a key is needed to remove the mark and fully recover the original image without loss. Moreover, the key can be different for every resolution, and the images can be globally restored in case of packet losses during transmission. Our contributions lie in the decomposition of a mark (an overlapping image) and an image into monovariate functions following the Kolmogorov superposition theorem, and in the combination of these monovariate functions to provide a removable visible "watermarking" of images with the ability to restore the original image using a key.

  16. Embedding methods for the steady Euler equations

    NASA Technical Reports Server (NTRS)

    Chang, S. H.; Johnson, G. M.

    1983-01-01

    An approach to the numerical solution of the steady Euler equations is to embed the first-order Euler system in a second-order system and then to recapture the original solution by imposing additional boundary conditions. Initial development of this approach and computational experimentation with it were previously based on heuristic physical reasoning. This has led to the construction of a relaxation procedure for the solution of two-dimensional steady flow problems. The theoretical justification for the embedding approach is addressed. It is proven that, with the appropriate choice of embedding operator and additional boundary conditions, the solution to the embedded system is exactly the one to the original Euler equations. Hence, solving the embedded version of the Euler equations will not produce extraneous solutions.

  17. Color preservation for tone reproduction and image enhancement

    NASA Astrophysics Data System (ADS)

    Hsin, Chengho; Lee, Zong Wei; Lee, Zheng Zhan; Shin, Shaw-Jyh

    2014-01-01

    Applications based on luminance processing often face the problem of recovering the original chrominance in the output color image. A common approach to reconstructing a color image from the luminance output is to preserve the original hue and saturation. However, this approach often produces an overly colorful image, which is undesirable. We develop a color preservation method that not only retains the ratios of the input tri-chromatic values but also adjusts the output chroma in an appropriate way. Linearizing the output luminance is the key idea behind this method. In addition, a lightness difference metric and a colorfulness difference metric are proposed to evaluate the performance of color preservation methods. The results show that the proposed method performs consistently better than existing approaches.

  18. Using a Blended Approach to Facilitate Postgraduate Supervision

    ERIC Educational Resources Information Center

    de Beer, Marie; Mason, Roger B.

    2009-01-01

    This paper explores the feasibility of using a blended approach to postgraduate research-degree supervision. Such a model could reduce research supervisors' workloads and improve the quality and success of Masters and Doctoral students' research output. The paper presents a case study that is based on a framework that was originally designed for…

  19. Advances in the application of amino acid nitrogen isotopic analysis in ecological and biogeochemical studies

    USDA-ARS?s Scientific Manuscript database

    Compound-specific isotopic analysis of amino acids (CSIA-AA) has emerged in the last decade as a powerful approach for tracing the origins and fate of nitrogen in ecological and biogeochemical studies. This approach is based on the empirical knowledge that source AAs (i.e., phenylalanine), fractiona...

  20. Local electric dipole moments: A generalized approach.

    PubMed

    Groß, Lynn; Herrmann, Carmen

    2016-09-30

    We present an approach for calculating local electric dipole moments for fragments of molecular or supramolecular systems. This is important for understanding chemical gating and solvent effects in nanoelectronics, atomic force microscopy, and intensities in infrared spectroscopy. Owing to the nonzero partial charge of most fragments, "naively" defined local dipole moments are origin-dependent. Inspired by previous work based on Bader's atoms-in-molecules (AIM) partitioning, we derive a definition of fragment dipole moments which achieves origin-independence by relying on internal reference points. Instead of bond critical points (BCPs) as in existing approaches, we use as few reference points as possible, which are located between the fragment and the remainder(s) of the system and may be chosen based on chemical intuition. This allows our approach to be used with AIM implementations that circumvent the calculation of critical points for reasons of computational efficiency, for cases where no BCPs are found due to large interfragment distances, and with local partitioning schemes other than AIM which do not provide BCPs. It is applicable to both covalently and noncovalently bound systems. © 2016 Wiley Periodicals, Inc.
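
    A toy numerical check of the origin-dependence problem and its fix (point charges rather than an AIM charge-density partitioning; charges and geometry are invented): a fragment with nonzero net charge has an origin-dependent "naive" dipole, but referencing it to an internal point that translates with the fragment makes it invariant.

    import numpy as np

    q = np.array([0.3, -0.1])                 # fragment partial charges; sum != 0
    r = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0]])           # atomic positions (angstrom)

    # Naive dipole mu = sum_i q_i r_i changes when the whole system is shifted.
    naive = lambda pos: (q[:, None] * pos).sum(axis=0)
    print(naive(r), naive(r + 5.0))           # differs: origin-dependent

    # Referencing to an internal point carried along with the fragment fixes it.
    ref = r.mean(axis=0)                      # internal reference point
    local = lambda pos, p0: (q[:, None] * (pos - p0)).sum(axis=0)
    print(local(r, ref), local(r + 5.0, ref + 5.0))   # identical: origin-independent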

  1. Computational Systems Biology in Cancer: Modeling Methods and Applications

    PubMed Central

    Materi, Wayne; Wishart, David S.

    2007-01-01

    In recent years it has become clear that carcinogenesis is a complex process, both at the molecular and cellular levels. Understanding the origins, growth and spread of cancer therefore requires an integrated or system-wide approach. Computational systems biology is an emerging sub-discipline in systems biology that utilizes the wealth of data from genomic, proteomic and metabolomic studies to build computer simulations of intra- and intercellular processes. Several useful descriptive and predictive models of the origin, growth and spread of cancers have been developed in an effort to better understand the disease and potential therapeutic approaches. In this review we describe and assess the practical and theoretical underpinnings of commonly-used modeling approaches, including ordinary and partial differential equations, petri nets, cellular automata, agent-based models and hybrid systems. A number of computer-based formalisms have been implemented to improve the accessibility of the various approaches to researchers whose primary interest lies outside of model development. We discuss several of these and describe how they have led to novel insights into tumor genesis, growth, apoptosis, vascularization and therapy. PMID:19936081

  2. Effectiveness evaluation of objective and subjective weighting methods for aquifer vulnerability assessment in urban context

    NASA Astrophysics Data System (ADS)

    Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet

    2016-10-01

    Groundwater vulnerability assessment has been an accepted practice to identify zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach assigns the relative importance of features/sub-features through subjective weighting/rating values. However, variability of features at a smaller scale is not reflected in this subjective vulnerability assessment process. In contrast to the subjective approach, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system, although experts' opinion is not directly considered in them. Thus the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods - the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC) and single-parameter sensitivity analysis (SA-DRASTIC) - were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of the subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested in an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. This methodology can be applied with or without suitable modification. These evaluations establish the potential applicability of the methodology for general vulnerability assessment in urban contexts.
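
    For the E-DRASTIC variant, the entropy-weighting step itself is standard and can be sketched as follows (the feature matrix is synthetic; the actual DRASTIC features, rating scales and study data are not reproduced): features whose values vary more across cells carry more information and receive larger weights.

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.random((100, 7)) + 0.01           # 100 cells x 7 DRASTIC-like features

    P = X / X.sum(axis=0)                      # share of each cell, per feature
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])   # entropy, scaled to [0, 1]
    w = (1.0 - E) / (1.0 - E).sum()            # low-entropy features weigh more
    print(np.round(w, 3))                      # objective weights replacing the fixed ones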

  3. A Review of DIMPACK Version 1.0: Conditional Covariance-Based Test Dimensionality Analysis Package

    ERIC Educational Resources Information Center

    Deng, Nina; Han, Kyung T.; Hambleton, Ronald K.

    2013-01-01

    DIMPACK Version 1.0 for assessing test dimensionality based on a nonparametric conditional covariance approach is reviewed. This software was originally distributed by Assessment Systems Corporation and now can be freely accessed online. The software consists of Windows-based interfaces of three components: DIMTEST, DETECT, and CCPROX/HAC, which…

  4. Ten Things Students Need To Know About the Origins of Israel and Palestine. Footnotes. Volume 13, Number 4

    ERIC Educational Resources Information Center

    Luxenberg, Alan H.

    2008-01-01

    This essay is based on the author's book, "The Palestine Mandate and the Creation of Israel." As the sixtieth anniversary of Israel's independence in May 2008 approaches, that country remains a focal point of world attention. Israel's origins do much to explain why the Arab-Israeli conflict has been so hard to resolve, but also provide a…

  5. The Galapagos.

    ERIC Educational Resources Information Center

    Schiller, Nancy A.; Herreid, Clyde F.

    2000-01-01

    Uses a problem-based teaching approach to teach about the geological origins of the Galapagos Islands, colonization, species formation, and threats to biodiversity. Discusses finches, tortoises, and sea cucumbers and provides instructions for student discussions. (YDS)

  6. The ARM Best Estimate Station-based Surface (ARMBESTNS) Data set

    DOE Data Explorer

    Qi, Tang; Xie, Shaocheng

    2015-08-06

    The ARM Best Estimate Station-based Surface (ARMBESTNS) data set merges together key surface measurements from the Southern Great Plains (SGP) sites. It is a twin data product of the ARM Best Estimate 2-dimensional Gridded Surface (ARMBE2DGRID) data set. Unlike the 2DGRID data set, the STNS data are reported at the original site locations and show the original information, except for the interpolation over time. Therefore, users have the flexibility to process the data with the approach more suitable for their applications.

  7. An Automatic Approach for Analyzing Treatment Effectiveness Based on Medication Hierarchy - The Myocardial Infarction Case Study.

    PubMed

    Li, Yingxue; Hu, Yiying; Yang, Jingang; Li, Xiang; Liu, Haifeng; Xie, Guotong; Xu, Meilin; Hu, Jingyi; Yang, Yuejin

    2017-01-01

    Treatment effectiveness plays a fundamental role in patient therapies. In most observational studies, researchers design an analysis pipeline for a specific treatment based on the study cohort. To evaluate other treatments in the data set, much repeated and multifarious work, including cohort construction and statistical analysis, needs to be done. In addition, as treatments often have an intrinsic hierarchical relationship, many rational comparable treatment pairs can be derived from the original cohort data set as new treatment variables, beyond the original single-treatment variable. In this paper, we propose an automatic treatment effectiveness analysis approach to solve this problem. With our approach, clinicians can assess the effect of treatments not only more conveniently but also more thoroughly and comprehensively. We applied this method to a real-world case of estimating drug effectiveness on the Chinese Acute Myocardial Infarction (CAMI) data set, and some meaningful results were obtained for potential improvement of patient treatments.

  8. Probabilistic divergence time estimation without branch lengths: dating the origins of dinosaurs, avian flight and crown birds.

    PubMed

    Lloyd, G T; Bapst, D W; Friedman, M; Davis, K E

    2016-11-01

    Branch lengths - measured in character changes - are an essential requirement of clock-based divergence estimation, regardless of whether the fossil calibrations used represent nodes or tips. However, a separate set of divergence time approaches is typically used to date palaeontological trees, which may lack such branch lengths. Among these methods, sophisticated probabilistic approaches have recently emerged, in contrast with simpler algorithms relying on minimum node ages. Here, using a novel phylogenetic hypothesis for Mesozoic dinosaurs, we apply two such approaches to estimate divergence times for: (i) Dinosauria, (ii) Avialae (the earliest birds) and (iii) Neornithes (crown birds). We find: (i) the plausibility of a Permian origin for dinosaurs to be dependent on whether Nyasasaurus is the oldest dinosaur, (ii) a Middle to Late Jurassic origin of avian flight regardless of whether Archaeopteryx or Aurornis is considered the first bird and (iii) a Late Cretaceous origin for Neornithes that is broadly congruent with other node- and tip-dating estimates. Demonstrating the feasibility of probabilistic time-scaling further opens up divergence estimation to the rich histories of extinct biodiversity in the fossil record, even in the absence of detailed character data. © 2016 The Authors.

  9. Chemical space visualization: transforming multidimensional chemical spaces into similarity-based molecular networks.

    PubMed

    de la Vega de León, Antonio; Bajorath, Jürgen

    2016-09-01

    The concept of chemical space is of fundamental relevance for medicinal chemistry and chemical informatics. Multidimensional chemical space representations are coordinate-based. Chemical space networks (CSNs) have been introduced as a coordinate-free representation. A computational approach is presented for the transformation of multidimensional chemical space into CSNs. The design of transformation CSNs (TRANS-CSNs) is based upon a similarity function that directly reflects distance relationships in original multidimensional space. TRANS-CSNs provide an immediate visualization of coordinate-based chemical space and do not require the use of dimensionality reduction techniques. At low network density, TRANS-CSNs are readily interpretable and make it possible to evaluate structure-activity relationship information originating from multidimensional chemical space.
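
    A minimal sketch of CSN construction is shown below, with random bit vectors and a Tanimoto threshold standing in for the paper's similarity function derived from the original multidimensional space; the 0.35 threshold is an arbitrary choice made to keep network density low, as the record recommends.

    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(5)
    fp = rng.random((30, 128)) < 0.2           # 30 compounds, 128-bit fingerprints

    G = nx.Graph()
    G.add_nodes_from(range(len(fp)))
    for i in range(len(fp)):
        for j in range(i + 1, len(fp)):
            inter = np.logical_and(fp[i], fp[j]).sum()
            union = np.logical_or(fp[i], fp[j]).sum()
            if union and inter / union >= 0.35:    # edge only above the similarity cut
                G.add_edge(i, j, weight=inter / union)

    print(G.number_of_edges(), nx.density(G))      # tune the threshold for low density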

  10. [The organizational benefits of the Kaizen approach at the Centre Hospitalier Universitaire de Sherbrooke (CHUS)].

    PubMed

    Comtois, Jonathan; Paris, Yvon; Poder, Thomas G; Chaussé, Sylvain

    2013-01-01

    The purpose of this study was to calculate the cost savings associated with using the kaizen approach in our hospital. Originally developed in Japan, the kaizen approach, based on the idea of continuous improvement, has considerable support in North America, including in the Quebec health care system. This study assessed the first fifteen kaizen projects at the CHUS. Based on an economic evaluation, we showed that using the kaizen approach can result in substantial cost savings. The success of the kaizen approach requires compliance with specific prerequisites. The future of the approach will depend on our ability to comply with these prerequisites. More specifically, such compliance will determine whether the approach is merely a passing fad or a strategy for improving our management style to promote greater efficiency.

  11. Pilot-Scale Laboratory Instruction for Chemical Engineering: The Specific Case of the Pilot-Unit Leading Group

    ERIC Educational Resources Information Center

    Billet, Anne-Marie; Camy, Severine; Coufort-Saudejaud, Carole

    2010-01-01

    This paper presents an original approach for Chemical Engineering laboratory teaching that is currently applied at INP-ENSIACET (France). This approach, referred to as "pilot-unit leading group," is based on a partial management of the laboratories by the students themselves, who become temporarily in charge of one specific laboratory. In…

  12. A novel approach based on preference-based index for interval bilevel linear programming problem.

    PubMed

    Ren, Aihong; Wang, Yuping; Xue, Xingsi

    2017-01-01

    This paper proposes a new methodology for solving the interval bilevel linear programming problem, in which all coefficients of both objective functions and constraints are considered as interval numbers. In order to preserve as much of the uncertainty of the original constraint region as possible, the original problem is first converted into an interval bilevel programming problem with interval coefficients in both objective functions only, through the normal variation of interval numbers and chance-constrained programming. With the consideration of different preferences of different decision makers, the concept of the preference level that the interval objective function is preferred to a target interval is defined based on the preference-based index. Then a preference-based deterministic bilevel programming problem is constructed in terms of the preference level and the order relation [Formula: see text]. Furthermore, the concept of a preference δ-optimal solution is given. Subsequently, the constructed deterministic nonlinear bilevel problem is solved with the help of an estimation of distribution algorithm. Finally, several numerical examples are provided to demonstrate the effectiveness of the proposed approach.

  13. Mindfulness-Based Cognitive Behaviour Therapy with Emotionally Disturbed Adolescents Affected by HIV/AIDS

    ERIC Educational Resources Information Center

    Sinha, Uday K.; Kumar, Deepak

    2010-01-01

    Mindfulness-based approaches have been shown to be useful in a variety of physical and mental health conditions including chronic pain, cancer, psoriasis, eating disorders, anxiety and depression. Mindfulness-based CBT finds its origins in Eastern Buddhist meditation, which began many centuries ago. Recent studies on CBT with mindfulness have shown…

  14. A Constraint-Based Approach to Acquisition of Word-Final Consonant Clusters in Turkish Children

    ERIC Educational Resources Information Center

    Gokgoz-Kurt, Burcu

    2017-01-01

    The current study provides a constraint-based analysis of L1 word-final consonant cluster acquisition in Turkish child language, based on the data originally presented by Topbas and Kopkalli-Yavuz (2008). The present analysis was done using [?]+obstruent consonant cluster acquisition. A comparison of Gradual Learning Algorithm (GLA) under…

  15. Team-Based Learning in the Humanities Classroom: "Women's Environmental Writing" as a Case Study

    ERIC Educational Resources Information Center

    Harde, Roxanne

    2015-01-01

    This essay presents the adaptation of Team-Based Learning (TBL) for a course that uses ecofeminist approaches to environmental literature. Developed originally for use in professional programs, TBL's cornerstones are permanent learning teams, preparation, application, and timely assessment (Michaelsen, Knight, & Fink, 2002). I wanted my…

  16. EVALUATION OF HOST SPECIFIC PCR-BASED METHODS FOR THE IDENTIFICATION OF FECAL POLLUTION

    EPA Science Inventory

    Microbial Source Tracking (MST) is an approach to determine the origin of fecal pollution impacting a body of water. MST is based on the assumption that, given the appropriate method and indicator, the source of microbial pollution can be identified. One of the key elements of...

  17. Non-Deterministic Modelling of Food-Web Dynamics

    PubMed Central

    Planque, Benjamin; Lindstrøm, Ulf; Subbey, Sam

    2014-01-01

    A novel approach to model food-web dynamics, based on a combination of chance (randomness) and necessity (system constraints), was presented by Mullon et al. in 2009. Based on simulations for the Benguela ecosystem, they concluded that observed patterns of ecosystem variability may simply result from basic structural constraints within which the ecosystem functions. To date, and despite the importance of these conclusions, this work has received little attention. The objective of the present paper is to replicate this original model and evaluate the conclusions that were derived from its simulations. For this purpose, we revisit the equations and input parameters that form the structure of the original model and implement a comparable simulation model. We restate the model principles and provide a detailed account of the model structure, equations, and parameters. Our model can reproduce several ecosystem dynamic patterns: pseudo-cycles, variation and volatility, diet, stock-recruitment relationships, and correlations between species biomass series. The original conclusions are supported to a large extent by the current replication of the model. Model parameterisation and computational aspects remain difficult and these need to be investigated further. Hopefully, the present contribution will make this approach available to a larger research community and will promote the use of non-deterministic-network-dynamics models as ‘null models of food-webs’ as originally advocated. PMID:25299245

  18. Adaptive correlation filter-based video stabilization without accumulative global motion estimation

    NASA Astrophysics Data System (ADS)

    Koh, Eunjin; Lee, Chanyong; Jeong, Dong Gil

    2014-12-01

    We present a digital video stabilization approach that provides both robustness and efficiency for practical applications. In this approach, we adopt a stabilization model that efficiently maintains spatio-temporal information from past input frames and can track the original stabilization position. Because of this stabilization model, the proposed method does not need accumulative global motion estimation and can recover the original position even if interframe motion estimation fails. It can also intelligently overcome damaged or interrupted video sequences. Moreover, because it is simple and well suited to parallel schemes, we readily implemented it on a commercial field-programmable gate array and a graphics processing unit board with compute unified device architecture. Experimental results show that the proposed approach is both fast and robust.
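
    The authors' adaptive correlation filter is not reproduced here, but the generic building block of correlation-based interframe motion estimation can be sketched with plain FFT phase correlation; the frame size and synthetic shift below are illustrative.

    import numpy as np

    def phase_correlate(a, b):
        """Integer (dy, dx) translation taking frame a to frame b."""
        F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)     # cross-power spectrum
        corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > a.shape[0] // 2: dy -= a.shape[0]        # wrap to signed shifts
        if dx > a.shape[1] // 2: dx -= a.shape[1]
        return dy, dx

    rng = np.random.default_rng(6)
    frame = rng.random((64, 64))
    shifted = np.roll(frame, (3, -5), axis=(0, 1))       # simulated camera shake
    print(phase_correlate(frame, shifted))               # expect (3, -5)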

  19. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1987-01-01

    Performance analysis was begun on the Ada implementations. The goal is to supply the system designer with tools that will allow a rational decision to be made, early in the design cycle, about whether a particular implementation can support a given application. Primary activities were: analysis of the original approach to recovery in distributed Ada programs using the Advanced Transport Operating System (ATOPS) example; review and assessment of the original approach, which was found to be capable of improvement; preparation and presentation of a paper at the 1987 Washington DC Ada Symposium; development of a refined approach to recovery that is presently being applied to the ATOPS example; and design and development of a performance assessment scheme for Ada programs based on a flexible user-driven benchmarking system.

  20. Computed Tomography Image Origin Identification Based on Original Sensor Pattern Noise and 3-D Image Reconstruction Algorithm Footprints.

    PubMed

    Duan, Yuping; Bouslimi, Dalel; Yang, Guanyu; Shu, Huazhong; Coatrieux, Gouenou

    2017-07-01

    In this paper, we focus on the "blind" identification of the computed tomography (CT) scanner that has produced a CT image. To do so, we propose a set of noise features derived from the image acquisition chain which can be used as a CT-scanner footprint. Basically, we propose two approaches. The first aims at identifying a CT scanner based on an original sensor pattern noise (OSPN) that is intrinsic to its X-ray detectors. The second identifies an acquisition system based on the way this noise is modified by its three-dimensional (3-D) image reconstruction algorithm. As these reconstruction algorithms are manufacturer-dependent and kept secret, our features are used as input to train a support vector machine (SVM) based classifier to discriminate acquisition systems. Experiments conducted on images from 15 different CT-scanner models of 4 distinct manufacturers demonstrate that our system identifies the origin of a CT image with a detection rate of at least 94% and that it achieves better performance than the sensor pattern noise (SPN) based strategy proposed for general public camera devices.
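
    A hedged sketch of the first (OSPN-style) idea only, on synthetic data: a denoising residual approximates a device's fixed noise pattern, and per-image residuals train an SVM to separate two simulated devices. The Gaussian filter, image sizes and two-class setup are assumptions for illustration, not the paper's feature set.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    patterns = {d: 0.02 * rng.standard_normal((32, 32)) for d in (0, 1)}  # per-device noise

    def residual(img):
        return img - gaussian_filter(img, sigma=1.5)     # noise = image - denoised(image)

    X, y = [], []
    for d, spn in patterns.items():
        for _ in range(40):
            scene = gaussian_filter(rng.random((32, 32)), sigma=3.0)      # smooth content
            img = scene + spn + 0.01 * rng.standard_normal((32, 32))
            X.append(residual(img).ravel())
            y.append(d)

    idx = rng.permutation(len(X))                        # shuffle before the split
    X, y = np.asarray(X)[idx], np.asarray(y)[idx]
    clf = SVC(kernel="linear").fit(X[:60], y[:60])
    print("held-out accuracy:", clf.score(X[60:], y[60:]))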

  1. A graph decomposition-based approach for water distribution network optimization

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.; Deuerlein, Jochen W.

    2013-04-01

    A novel optimization approach for water distribution network design is proposed in this paper. Using graph theory algorithms, a full water network is first decomposed into different subnetworks based on the connectivity of the network's components. The original whole network is simplified to a directed augmented tree, in which the subnetworks are substituted by augmented nodes and directed links are created to connect them. Differential evolution (DE) is then employed to optimize each subnetwork based on the sequence specified by the assigned directed links in the augmented tree. Rather than optimizing the original network as a whole, the subnetworks are sequentially optimized by the DE algorithm. A solution choice table is established for each subnetwork (except for the subnetwork that includes a supply node) and the optimal solution of the original whole network is finally obtained by use of the solution choice tables. Furthermore, a preconditioning algorithm is applied to the subnetworks to produce an approximately optimal solution for the original whole network. This solution specifies promising regions for the final optimization algorithm to further optimize the subnetworks. Five water network case studies are used to demonstrate the effectiveness of the proposed optimization method. A standard DE algorithm (SDE) and a genetic algorithm (GA) are applied to each case study without network decomposition to enable a comparison with the proposed method. The results show that the proposed method consistently outperforms the SDE and GA (both with tuned parameters) in terms of both the solution quality and efficiency.
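
    The decomposition step alone can be sketched with standard graph algorithms (a toy layout, not one of the paper's case-study networks; the differential evolution stage and the solution-choice tables are omitted): bridge pipes split the network into subnetworks that can then be optimized in sequence, ordered away from the source.

    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("S", 1), (1, 2), (2, 3), (3, 1),   # looped subnetwork at source
                      (3, 4),                              # bridge pipe
                      (4, 5), (5, 6), (6, 4)])             # downstream subnetwork

    bridges = list(nx.bridges(G))
    H = G.copy()
    H.remove_edges_from(bridges)                           # cut the bridges
    subnets = [sorted(c, key=str) for c in nx.connected_components(H)]
    print("bridges:", bridges)
    print("subnetworks:", subnets)   # optimize each, sequenced away from source "S"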

  2. Towards a Universal Approach Based on Omics Technologies for the Quality Control of Food

    PubMed Central

    Ferri, Emanuele; Airoldi, Cristina; Ciaramelli, Carlotta; Palmioli, Alessandro; Bruni, Ilaria

    2015-01-01

    In recent decades, food science has greatly developed, turning from the consideration of food as a mere source of energy to a growing awareness of its importance for health, particularly in reducing the risk of diseases. Such a vision led to increasing attention towards the origin and quality of raw materials as well as their derived food products. Continuous advances in molecular biology have allowed the development of efficient and universal omics tools to unequivocally identify the origin of food items and their traceability. In this review, we considered the application of a genomics approach known as DNA barcoding in characterizing the composition of foodstuffs and its traceability along the food supply chain. Moreover, metabolomics analytical strategies based on Nuclear Magnetic Resonance (NMR) and Mass Spectroscopy (MS) are discussed, as they also work well in evaluating food quality. The combination of both approaches allows us to define a sort of molecular labelling of food that is easily understandable by the operators involved in the food sector: producers, distributors, and consumers. Current technologies based on digital information systems such as web platforms and smartphone apps can facilitate the adoption of such molecular labelling. PMID:26783518

  3. Towards a Universal Approach Based on Omics Technologies for the Quality Control of Food.

    PubMed

    Ferri, Emanuele; Galimberti, Andrea; Casiraghi, Maurizio; Airoldi, Cristina; Ciaramelli, Carlotta; Palmioli, Alessandro; Mezzasalma, Valerio; Bruni, Ilaria; Labra, Massimo

    2015-01-01

    In recent decades, food science has greatly developed, turning from the consideration of food as a mere source of energy to a growing awareness of its importance for health, particularly in reducing the risk of diseases. Such a vision led to increasing attention towards the origin and quality of raw materials as well as their derived food products. Continuous advances in molecular biology have allowed the development of efficient and universal omics tools to unequivocally identify the origin of food items and their traceability. In this review, we considered the application of a genomics approach known as DNA barcoding in characterizing the composition of foodstuffs and its traceability along the food supply chain. Moreover, metabolomics analytical strategies based on Nuclear Magnetic Resonance (NMR) and Mass Spectroscopy (MS) are discussed, as they also work well in evaluating food quality. The combination of both approaches allows us to define a sort of molecular labelling of food that is easily understandable by the operators involved in the food sector: producers, distributors, and consumers. Current technologies based on digital information systems such as web platforms and smartphone apps can facilitate the adoption of such molecular labelling.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumway, R.H.; McQuarrie, A.D.

    Robust statistical approaches to the problem of discriminating between regional earthquakes and explosions are developed. We compare linear discriminant analysis using descriptive features like amplitude and spectral ratios with signal discrimination techniques using the original signal waveforms and spectral approximations to the log likelihood function. Robust information theoretic techniques are proposed and all methods are applied to 8 earthquakes and 8 mining explosions in Scandinavia and to an event from Novaya Zemlya of unknown origin. It is noted that signal discrimination approaches based on discrimination information and Renyi entropy perform better in the test sample than conventional methods based on spectral ratios involving the P and S phases. Two techniques for identifying the ripple-firing pattern for typical mining explosions are proposed and shown to work well on simulated data and on several Scandinavian earthquakes and explosions. We use both cepstral analysis in the frequency domain and a time domain method based on the autocorrelation and partial autocorrelation functions. The proposed approach strips off underlying smooth spectral and seasonal spectral components corresponding to the echo pattern induced by two simple ripple-fired models. For two mining explosions, a pattern is identified, whereas for two earthquakes, no pattern is evident.

  5. Brief communication: Using averaged soil moisture estimates to improve the performances of a regional-scale landslide early warning system

    NASA Astrophysics Data System (ADS)

    Segoni, Samuele; Rosi, Ascanio; Lagomarsino, Daniela; Fanti, Riccardo; Casagli, Nicola

    2018-03-01

    We communicate the results of a preliminary investigation aimed at improving a state-of-the-art RSLEWS (regional-scale landslide early warning system) based on rainfall thresholds by integrating mean soil moisture values averaged over the territorial units of the system. We tested two approaches. The simplest can easily be applied to improve other RSLEWS: it is based on a soil moisture threshold value below which rainfall thresholds are not used, because landslides are not expected to occur. The second approach modifies the original RSLEWS more deeply: thresholds based on antecedent rainfall accumulated over long periods are replaced with soil moisture thresholds. A back analysis demonstrated that both approaches consistently reduced false alarms, while the second approach reduced missed alarms as well.

  6. Cockcroft-Gault revisited: New de-liver-ance on recommendations for use in cirrhosis.

    PubMed

    Scappaticci, Gianni B; Regal, Randolph E

    2017-01-28

    The Cockcroft-Gault (CG) equation has become perhaps the most popular practical approach for estimating renal function among health care professionals. Despite its widespread use, clinicians often overlook not only the limitations of the original serum creatinine (SCr) based equation, but also may not appreciate the validity of the many variations used to compensate for these limitations. For cirrhotic patients in particular, the underlying pathophysiology of the disease contributes to a falsely low SCr, so that the CG equation overestimates renal function in this population. We reviewed the original CG trial from 1976 along with data surrounding clinician-specific alterations to the CG equation that followed through time. These alterations included different formulas for body weight in obese patients and the "rounding up" approach in patients with low SCr. Additionally, we described the pathophysiology and hemodynamic changes that occur in cirrhosis, and reviewed several studies that attempted to estimate renal function in this population. The evidence we reviewed regarding the most accurate manipulation of the original CG equation to estimate creatinine clearance (CrCl) was inconclusive. Unfortunately, the homogeneity of the patient population in the original CG trial limited its external validity. Elimination of body weight in the CG equation actually produced the estimate closest to the measured CrCl. Furthermore, "rounding up" of SCr values often underestimated CrCl. This approach could lead to suboptimal dosing of drug therapies in patients with low SCr. In cirrhotic patients, utilization of SCr based methods overestimated true renal function by about 50% in the literature we reviewed.
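
    For reference, the 1976 Cockcroft-Gault estimate discussed above has the familiar form below, with the commonly applied 0.85 adjustment for female patients (an adjustment proposed, not derived, in the original study):

    ```latex
    \mathrm{CrCl}\ \left(\tfrac{\text{mL}}{\text{min}}\right)
      = \frac{(140 - \text{age}) \times \text{weight (kg)}}{72 \times \mathrm{SCr}\ (\text{mg/dL})}
        \times \begin{cases} 0.85 & \text{if female} \\ 1 & \text{otherwise} \end{cases}
    ```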

  7. From Astrochemistry to prebiotic chemistry? An hypothetical approach toward Astrobiology

    NASA Astrophysics Data System (ADS)

    Le Sergeant d'Hendecourt, L.; Danger, G.

    2012-12-01

    We present in this paper a general perspective on the evolution of molecular complexity, as observed from an astrophysicist's point of view, and its possible relation to the problem of the origin of life on Earth. Based on the cosmic abundances of the elements and the molecular composition of terrestrial life, we propose that life cannot really be based on other elements. We discuss where the necessary molecular complexity is built up in astrophysical environments, namely within inter/circumstellar solid-state materials known as "grains". Considerations based on non-directed laboratory experiments, which must be further extended into the prebiotic domain, lead to the following hypothesis: even if the chemistry at the origin of life is a rather universal and deterministic phenomenon once molecular complexity is in place, the chemical evolution that generated the first prebiotic reactions involving autoreplication must be treated with a systemic approach, because of the strong contingency imposed by the complex local environment(s) and associated processes in which these chemical systems evolved.

  8. Introducing Systems Approaches

    NASA Astrophysics Data System (ADS)

    Reynolds, Martin; Holwell, Sue

    Systems Approaches to Managing Change brings together five systems approaches to managing complex issues, each with a proven track record of over 25 years. The five approaches are: System Dynamics (SD), originally developed in the late 1950s by Jay Forrester; the Viable Systems Model (VSM), originally developed in the late 1960s by Stafford Beer; Strategic Options Development and Analysis (SODA, with cognitive mapping), originally developed in the 1970s by Colin Eden; Soft Systems Methodology (SSM), originally developed in the 1970s by Peter Checkland; and Critical Systems Heuristics (CSH), originally developed in the late 1970s by Werner Ulrich.

  9. Towards breaking the spatial resolution barriers: An optical flow and super-resolution approach for sea ice motion estimation

    NASA Astrophysics Data System (ADS)

    Petrou, Zisis I.; Xian, Yang; Tian, YingLi

    2018-04-01

    Estimation of sea ice motion at fine scales is important for a number of regional and local applications, including modeling of sea ice distribution, ocean-atmosphere and climate dynamics, and safe navigation and sea operations. In this study, we propose an optical flow and super-resolution approach to accurately estimate motion from remote sensing images at a higher spatial resolution than the original data. First, an external example-learning-based super-resolution method is applied to the original images to generate higher-resolution versions. Then, an optical flow approach is applied to the higher-resolution images, identifying sparse correspondences and interpolating them to extract a dense motion vector field with continuous values and subpixel accuracy. Our proposed approach is successfully evaluated on passive microwave, optical, and Synthetic Aperture Radar data, proving appropriate for multi-sensor applications and different spatial resolutions. The approach estimates motion with accuracy similar to or higher than that obtained on the original data, while increasing the spatial resolution by up to eight times. In addition, the adopted optical flow component outperforms a state-of-the-art pattern matching method. Overall, the proposed approach yields accurate motion vectors with unprecedented spatial resolutions of up to 1.5 km for passive microwave data covering the entire Arctic and 20 m for radar data, and proves promising for numerous scientific and operational applications.
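
    As a rough illustration of the upsample-then-track pipeline (not the authors' implementation), the sketch below substitutes bicubic interpolation for the example-learning super-resolution step and OpenCV's Farneback flow for the sparse-to-dense matching; the input imagery is synthetic.

    ```python
    import cv2
    import numpy as np

    rng = np.random.default_rng(0)
    img0 = (rng.random((128, 128)) * 255).astype(np.uint8)  # stand-in scene at t0
    img1 = np.roll(img0, shift=(2, 3), axis=(0, 1))         # same scene drifted by (2, 3)

    scale = 4  # target resolution increase
    up0 = cv2.resize(img0, None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)
    up1 = cv2.resize(img1, None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)

    # Dense flow field at the upsampled resolution.
    flow = cv2.calcOpticalFlowFarneback(up0, up1, None, pyr_scale=0.5, levels=4,
                                        winsize=21, iterations=3, poly_n=7,
                                        poly_sigma=1.5, flags=0)

    # Median drift, rescaled back to pixels of the original grid.
    print("median drift (original pixels):",
          np.median(np.linalg.norm(flow, axis=2)) / scale)
    ```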

  10. Global Interior Robot Localisation by a Colour Content Image Retrieval System

    NASA Astrophysics Data System (ADS)

    Chaari, A.; Lelandais, S.; Montagne, C.; Ahmed, M. Ben

    2007-12-01

    We propose a new global localisation approach to determine a coarse position of a mobile robot in structured indoor space using colour-based image retrieval techniques. We use an original colour quantisation method based on the baker's transformation to extract a two-dimensional colour palette that combines spatial and vicinity-related information with the colourimetric content of the original image. We devise several retrieval approaches leading to a specific similarity measure that integrates the spatial organisation of colours in the palette. The baker's transformation provides a quantisation of the image into a space where colours that are nearby in the original space are also nearby in the output space, thereby providing dimensionality reduction and invariance to minor changes in the image, while the similarity measure provides partial invariance to translation, small changes in viewpoint, and scale factor. In addition to this study, we developed a hierarchical search module based on the logical classification of images by room. This hierarchical module reduces the indoor search space and improves our system's performance. Results are then compared with those obtained using colour histograms with several similarity measures. In this paper, we focus on colour-based features to describe indoor images; a finalised system must obviously integrate other types of signature such as shape and texture.

  11. A Behavioural Approach to Understanding Semi-Subsistence Farmers' Technology Adoption Decisions: The Case of Improved Paddy-Prawn System in Indonesia

    ERIC Educational Resources Information Center

    Sambodo, Leonardo A. A. T.; Nuthall, Peter L.

    2010-01-01

    Purpose: This study traced the origins of subsistence farmers' technology adoption attitudes and extracted the critical elements in their decision-making systems. Design/Methodology/Approach: The analysis was structured using a model based on the Theory of Planned Behaviour (TPB). The role of a "bargaining process" was particularly…

  12. A New Approach to Automated Labeling of Internal Features of Hardwood Logs Using CT Images

    Treesearch

    Daniel L. Schmoldt; Pei Li; A. Lynn Abbott

    1996-01-01

    The feasibility of automatically identifying internal features of hardwood logs using CT imagery has been established previously. Features of primary interest are bark, knots, voids, decay, and clear wood. Our previous approach filtered the original CT images, applied histogram segmentation, grew volumes to extract 3-D regions, and applied a rule base, with Dempster-...

  13. Situational Approaches to Direct Practice: Origin, Decline, and Re-Emergence

    ERIC Educational Resources Information Center

    Murdach, Allison D.

    2007-01-01

    During the 1890s and the first three decades of the 20th century, social work in the United States developed a community-based direct practice approach to family assistance and social reform. The basis for this method was a situational view of social life that emphasized the use of interpersonal and transactional methods to achieve social and…

  14. Simple Shared Motifs (SSM) in conserved region of promoters: a new approach to identify co-regulation patterns.

    PubMed

    Gruel, Jérémy; LeBorgne, Michel; LeMeur, Nolwenn; Théret, Nathalie

    2011-09-12

    Regulation of gene expression plays a pivotal role in cellular functions. However, understanding the dynamics of transcription remains a challenging task. A host of computational approaches have been developed to identify regulatory motifs, mainly based on the recognition of DNA sequences for transcription factor binding sites. Recent integration of additional data from genomic analyses or phylogenetic footprinting has significantly improved these methods. Here, we propose a different approach based on the compilation of Simple Shared Motifs (SSM), groups of sequences defined by their length and similarity and present in conserved sequences of gene promoters. We developed an original algorithm to search for and count SSM in pairs of genes; an exceptionally high number of SSM is taken to indicate a common regulatory pattern. The SSM approach is applied to a sample set of genes and validated using functional gene-set enrichment analyses. We demonstrate that the SSM approach selects genes that are over-represented in specific biological categories (Ontology and Pathways) and are enriched in co-expressed genes. Finally, we show that genes co-expressed in the same tissue or involved in the same biological pathway have increased SSM values. Using unbiased clustering of genes, Simple Shared Motifs analysis constitutes an original contribution towards a clearer definition of expression networks.
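
    A toy illustration of the SSM idea, counting k-mers shared by two promoter fragments; the paper's definition also allows inexact similarity and operates on conserved promoter regions, so exact k-mer sharing is a deliberate simplification, and the sequences are hypothetical.

    ```python
    def kmers(seq, k):
        """All length-k substrings of a sequence."""
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def shared_motifs(prom_a, prom_b, k=8):
        """Exact k-mers present in both promoters (a simplified SSM count)."""
        return kmers(prom_a, k) & kmers(prom_b, k)

    # Hypothetical promoter fragments sharing a planted motif.
    pa = "ATGCGTACGTTAGCGGATCCGTACGTTAGC"
    pb = "TTACGTTAGCGGATCAGTACGTTAGCCCAT"
    common = shared_motifs(pa, pb)
    print(len(common), "shared 8-mers:", sorted(common))
    ```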

  15. Simple Shared Motifs (SSM) in conserved region of promoters: a new approach to identify co-regulation patterns

    PubMed Central

    2011-01-01

    Background Regulation of gene expression plays a pivotal role in cellular functions. However, understanding the dynamics of transcription remains a challenging task. A host of computational approaches have been developed to identify regulatory motifs, mainly based on the recognition of DNA sequences for transcription factor binding sites. Recent integration of additional data from genomic analyses or phylogenetic footprinting has significantly improved these methods. Results Here, we propose a different approach based on the compilation of Simple Shared Motifs (SSM), groups of sequences defined by their length and similarity and present in conserved sequences of gene promoters. We developed an original algorithm to search for and count SSM in pairs of genes; an exceptionally high number of SSM is taken to indicate a common regulatory pattern. The SSM approach is applied to a sample set of genes and validated using functional gene-set enrichment analyses. We demonstrate that the SSM approach selects genes that are over-represented in specific biological categories (Ontology and Pathways) and are enriched in co-expressed genes. Finally, we show that genes co-expressed in the same tissue or involved in the same biological pathway have increased SSM values. Conclusions Using unbiased clustering of genes, Simple Shared Motifs analysis constitutes an original contribution towards a clearer definition of expression networks. PMID:21910886

  16. Enacting Key Skills-Based Curricula in Secondary Education: Lessons from a Technology-Mediated, Group-Based Learning Initiative

    ERIC Educational Resources Information Center

    Johnston, Keith; Conneely, Claire; Murchan, Damian; Tangney, Brendan

    2015-01-01

    Bridge21 is an innovative approach to learning for secondary education that was originally conceptualised as part of a social outreach intervention in the authors' third-level institution whereby participants attended workshops at a dedicated learning space on campus focusing on a particular model of technology-mediated group-based learning. This…

  17. Load Balancing in Distributed Web Caching: A Novel Clustering Approach

    NASA Astrophysics Data System (ADS)

    Tiwari, R.; Kumar, K.; Khan, G.

    2010-11-01

    The World Wide Web suffers from scaling and reliability problems due to overloaded and congested proxy servers. Caching at local proxy servers helps, but cannot satisfy more than a third to half of requests; the remaining requests are still sent to the remote origin servers. In this paper we develop an algorithm for a Distributed Web Cache that incorporates cooperation among the proxy servers of one cluster. The algorithm combines Distributed Web Cache concepts with a static hierarchy of geographically based clusters of level-one proxy servers and a dynamic proxy-selection mechanism used when a cluster becomes congested. Congestion and scalability problems are handled by the clustering concept used in our approach. This results in a higher cache hit ratio and lower latency for requested pages. The algorithm also guarantees data consistency between the origin server objects and the proxy cache objects.
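
    A minimal sketch of cluster-based request routing in this spirit (not the paper's exact algorithm): a URL hashes to one proxy inside the client's geographic cluster, and the request spills over to a neighbouring cluster when that proxy is congested. The topology, names, and threshold are illustrative assumptions.

    ```python
    import hashlib

    CLUSTERS = {"eu": ["proxy-eu-1", "proxy-eu-2", "proxy-eu-3"],
                "us": ["proxy-us-1", "proxy-us-2"]}          # hypothetical topology
    NEIGHBOUR = {"eu": "us", "us": "eu"}
    load = {p: 0.0 for ps in CLUSTERS.values() for p in ps}  # fraction of capacity

    def pick_proxy(url, cluster, max_load=0.9):
        """Hash-partitioned proxy choice with one-hop spillover on congestion."""
        proxies = CLUSTERS[cluster]
        h = int(hashlib.md5(url.encode()).hexdigest(), 16)
        proxy = proxies[h % len(proxies)]
        if load[proxy] >= max_load:    # congested: fall back to neighbour cluster
            neigh = CLUSTERS[NEIGHBOUR[cluster]]
            proxy = neigh[h % len(neigh)]
        return proxy

    print(pick_proxy("http://example.com/a.html", "eu"))
    ```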

  18. Managing School-Based Curriculum Innovations: A Hong Kong Case Study

    ERIC Educational Resources Information Center

    Law, Edmond H. F.; Wan, Sally W. Y.; Galton, Maurice; Lee, John C. K.

    2010-01-01

    This study was originally designed to explore the impact of a distributed approach to developing curriculum leadership among schoolteachers. Previous papers have focused on reporting evidence of teacher learning in the process of engaging teachers in various types of curriculum decision-making in an innovation project based on interview data. This…

  19. The origin of the western constellations (II). (Italian Title: L'origine delle costellazioni occidentali (II parte))

    NASA Astrophysics Data System (ADS)

    Vanin, G.

    2012-06-01

    In this article the author reviews the major contributions that have been published on the origin of the Western constellations. He subjects these contributions to strict criticism, based both on the most recent historiographic findings and on the use of modern software. This approach deprives of any foundation the ideas proposed by Maunder, Ovenden, Roy, and Gurshtein, which are still considered reliable by many nonspecialists and amateurs, while appropriately emphasizing the novelty and great value, albeit with some reservations, of the studies recently undertaken on the subject by Bradley Schaefer.

  20. The origin of the western constellations (I). (Italian Title: L'origine delle costellazioni occidentali (I parte))

    NASA Astrophysics Data System (ADS)

    Vanin, G.

    2012-04-01

    In this article the author reviews the major contributions that have been published on the origin of the Western constellations. He subjects these contributions to strict criticism, based both on the most recent historiographic findings and on the use of modern software. This approach deprives of any foundation the ideas proposed by Maunder, Ovenden, Roy, and Gurshtein, which are still considered reliable by many nonspecialists and amateurs, while appropriately emphasizing the novelty and great value, albeit with some reservations, of the studies recently undertaken on the subject by Bradley Schaefer.

  1. A Logic-Based Psychotherapy Approach to Treating Patients Which Focuses on Faultless Logical Functioning: A Case Study Method

    PubMed Central

    Almeida, Fernando; Moreira, Diana

    2017-01-01

    Many clinical patients present to mental health clinics with depressive symptoms, anxiety, psychosomatic complaints, and sleeping problems. These symptoms may originate from marital problems, conflictual interpersonal relationships, problems in securing work, and housing issues, among many others. Such issues may interfere with the patients' ability to maintain faultless logical reasoning (FLR) and faultless logical functioning (FLF). FLR implies correctly assessing premises, rules, and conclusions; FLF implies assessing not only FLR but also the circumstances, life experience, personality, and events that validate a conclusion. Almost always, the symptomatology is accompanied by intense emotional changes. Clinical experience shows that a logic-based psychotherapy (LBP) approach is rarely practiced, and that therapists resort to psychopharmacotherapy or other psychotherapeutic approaches that are not focused on logical reasoning and, especially, logical functioning. Because of this, patients do not learn to overcome their reasoning and functioning errors. The aim of this work was to investigate how LBP works to improve patients' ability to think and function in a faultless logical way. For this purpose, we describe case studies of the treatment of three patients. With this psychotherapeutic approach, patients gain knowledge that can then be applied not only to the issues that led them to the consultation, but also to other problems they have experienced, thus creating a learning experience and helping to prevent such patients from becoming involved in similar problematic situations. This highlights that LBP is a way of treating symptoms that interfere at some level with daily functioning. This psychotherapeutic approach is relevant for improving patients' quality of life, and it fills a gap in the literature by describing original case analyses. PMID:29312088

  2. Coupled electromagnetic-thermodynamic simulations of microwave heating problems using the FDTD algorithm.

    PubMed

    Kopyt, Paweł; Celuch, Małgorzata

    2007-01-01

    A practical implementation of a hybrid simulation system capable of modeling coupled electromagnetic-thermodynamic problems typical of microwave heating is described. The paper presents two approaches to modeling such problems. Both are based on an FDTD-based commercial electromagnetic solver coupled to an external thermodynamic analysis tool required for calculating heat diffusion. The first approach utilizes a simple FDTD-based thermal solver, while in the second it is replaced by a universal commercial CFD solver. The accuracy of the two modeling systems is verified against the original experimental data as well as measurement results available in the literature.
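
    A toy stand-in for the coupling loop described above: an EM solve would supply a dissipated-power density, which drives an explicit finite-difference heat update, and the warmed field would in turn change the material properties seen by the EM solver. Periodic boundaries (np.roll) and all constants are for brevity only; neither solver below resembles the commercial tools used in the paper.

    ```python
    import numpy as np

    nx, ny, dt, alpha = 50, 50, 0.1, 0.05
    T = np.full((nx, ny), 20.0)              # temperature field (deg C)
    power = np.zeros((nx, ny))
    power[20:30, 20:30] = 5.0                # assumed microwave power deposition

    for step in range(100):
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4.0 * T)
        T += dt * (alpha * lap + power)      # diffusion plus EM heating source
        # In the hybrid system the EM solver is re-run periodically as material
        # parameters drift with temperature.

    print("peak temperature after 100 steps:", round(float(T.max()), 2))
    ```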

  3. New cell engineering approaches for cartilage regenerative medicine.

    PubMed

    Cucchiarini, Magali

    2017-01-01

    Injured articular cartilage has an inadequate capacity to reproduce the original structure and functions of this highly specialized tissue. As most of the currently available options also do not lead to the restoration of the original hyaline cartilage, novel treatments are critically needed to address this global problem in the clinic. Gene therapy combined with tissue engineering approaches offers effective tools capable of enhancing cartilage repair experimentally, especially those based on the controlled delivery of the highly effective, clinically adapted recombinant adeno-associated viral (rAAV) vectors. This work presents an overview of the most recent evidence showing the benefits of using rAAV vectors and biocompatible materials for the elaboration of adapted treatments against cartilage injuries.

  4. Parallel Implementation of the Discontinuous Galerkin Method

    NASA Technical Reports Server (NTRS)

    Baggag, Abdalkader; Atkins, Harold; Keyes, David

    1999-01-01

    This paper describes a parallel implementation of the discontinuous Galerkin method. Discontinuous Galerkin is a spatially compact method that retains its accuracy and robustness on non-smooth unstructured grids and is well suited for time-dependent simulations. Several parallelization approaches are studied and evaluated. The most natural and symmetric of the approaches has been implemented in an object-oriented code used to simulate aeroacoustic scattering. The parallel implementation is MPI-based and has been tested on various parallel platforms such as the SGI Origin, IBM SP2, and clusters of SGI and Sun workstations. The scalability results presented for the SGI Origin show slightly superlinear speedup on a fixed-size problem due to cache effects.

  5. Modified multiblock partial least squares path modeling algorithm with backpropagation neural networks approach

    NASA Astrophysics Data System (ADS)

    Yuniarto, Budi; Kurniawan, Robert

    2017-03-01

    PLS Path Modeling (PLS-PM) differs from covariance-based SEM in that it uses a variance- or component-based approach; PLS-PM is therefore also known as component-based SEM. Multiblock Partial Least Squares (MBPLS) is a PLS regression method that can be used in PLS Path Modeling, where it is known as Multiblock PLS Path Modeling (MBPLS-PM). This method uses an iterative procedure in its algorithm. This research aims to modify MBPLS-PM with a Back Propagation Neural Network approach. The result is that the MBPLS-PM algorithm can be modified using the Back Propagation Neural Network approach to replace the iterative process in the backward and forward steps used to obtain the matrix t and the matrix u in the algorithm. With this modification, the model parameters obtained do not differ significantly from those obtained by the original MBPLS-PM algorithm.

  6. Estimating Origin-Destination Matrices Using AN Efficient Moth Flame-Based Spatial Clustering Approach

    NASA Astrophysics Data System (ADS)

    Heidari, A. A.; Moayedi, A.; Abbaspour, R. Ali

    2017-09-01

    Automated fare collection (AFC) systems are regarded as valuable resources for public transport planners. In this paper, AFC data are utilized to analyze and extract mobility patterns in a public transportation system. For this purpose, the smart card data are fed into a proposed metaheuristic-based aggregation model and then converted to an O-D matrix between stops, since the size of stop-level O-D matrices makes it difficult to reproduce the measured passenger flows precisely. The proposed strategy is applied to a case study from Haaglanden, Netherlands. In this research, the moth-flame optimizer (MFO) is utilized and evaluated for the first time as a new metaheuristic algorithm (MA) for estimating transit origin-destination matrices. The MFO is a novel, efficient swarm-based MA inspired by the celestial navigation of moths in nature. To investigate the capabilities of the proposed MFO-based approach, it is compared to methods that utilize the K-means algorithm, the gray wolf optimization algorithm (GWO), and a genetic algorithm (GA). The sum of the intra-cluster distances and the computational time of operations are considered as the evaluation criteria to assess the efficacy of the optimizers. The optimality of the solutions of the different algorithms is measured in detail, and travelers' behavior is analyzed to achieve a smooth and optimized transport system. The results reveal that the proposed MFO-based aggregation strategy outperforms the other evaluated approaches in terms of convergence tendency and optimality of the results, and that it can be utilized as an efficient approach to estimating transit O-D matrices.
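
    To make the aggregation step concrete, the sketch below clusters stop coordinates into zones and accumulates tap-in/tap-out pairs into a zone-level O-D matrix. K-means stands in here for the metaheuristic clusterers (MFO, GWO, GA) compared in the paper, and all data are synthetic.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    stops = rng.uniform(0, 10, size=(200, 2))     # stop coordinates (toy)
    trips = rng.integers(0, 200, size=(5000, 2))  # (origin_stop, dest_stop) ids

    k = 12
    zones = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(stops)

    od = np.zeros((k, k), dtype=int)
    for o, d in trips:
        od[zones[o], zones[d]] += 1               # aggregate trips to zone level
    print("zone-level O-D matrix:\n", od)
    ```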

  7. Problem of lunar mascons: An alternative approach

    NASA Astrophysics Data System (ADS)

    Barenbaum, A. A.; Shpekin, M. I.

    2018-01-01

    The origin of lunar mascons is discussed on the basis of results from the orbital exploration of the Moon by the Gravity Recovery and Interior Laboratory and the Lunar Reconnaissance Orbiter missions. The discussion is framed in terms of the Galactocentric paradigm, which links processes in the Solar System and on its planets with galactic influences. The article describes a new approach to the interpretation of the crater data that takes into account the quasi-periodic bombardments of the Moon by galactic comets. We present a preliminary evaluation of the age of the mascons, as well as of craters and maria on the Moon, based on this approach.

  8. Gray-world-assumption-based illuminant color estimation using color gamuts with high and low chroma

    NASA Astrophysics Data System (ADS)

    Kawamura, Harumi; Yonemura, Shunichi; Ohya, Jun; Kojima, Akira

    2013-02-01

    A new approach is proposed for estimating illuminant colors from color images under an unknown scene illuminant. The approach is based on a combination of a gray-world-assumption-based illuminant color estimation method and a method using color gamuts. The former method, which we had previously proposed, improved on the original method that hypothesizes that the average of all the object colors in a scene is achromatic. Since the original method estimates scene illuminant colors by calculating the average of all the image pixel values, its estimations are incorrect when certain image colors are dominant. Our previous method improves on it by choosing several colors on the basis of an opponent-color property, namely that the average color of opponent colors is achromatic, instead of using all colors. However, it cannot estimate illuminant colors when there are only a few image colors or when the image colors are unevenly distributed in local areas of the color space. The approach we propose in this paper combines our previous method and one using high chroma and low chroma gamuts, which makes it possible to find colors that satisfy the gray world assumption. High chroma gamuts are used for adding appropriate colors to the original image and low chroma gamuts are used for narrowing down illuminant color possibilities. Experimental results obtained using actual images show that even if the image colors are localized in a certain area of the color space, the illuminant colors are accurately estimated, with a smaller average estimation error than that of the conventional method.
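
    For orientation, here is the baseline gray-world estimate that the methods above refine: under the assumption that the average scene color is achromatic, the per-channel image means reveal the illuminant. The opponent-color selection and chroma-gamut refinements of the paper are not reproduced in this sketch.

    ```python
    import numpy as np

    def gray_world_illuminant(img):
        """img: float RGB array in [0, 1], shape (H, W, 3)."""
        means = img.reshape(-1, 3).mean(axis=0)
        return means / np.linalg.norm(means)   # illuminant color direction

    def correct(img):
        est = gray_world_illuminant(img)
        gain = est.mean() / est                # von Kries-style per-channel gains
        return np.clip(img * gain, 0.0, 1.0)

    img = np.random.rand(64, 64, 3) * np.array([1.0, 0.8, 0.6])  # reddish cast
    print("estimated illuminant:", gray_world_illuminant(img))
    ```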

  9. Toward Exposing Timing-Based Probing Attacks in Web Applications †

    PubMed Central

    Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai

    2017-01-01

    Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users’ browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach. PMID:28245610

  10. Toward Exposing Timing-Based Probing Attacks in Web Applications.

    PubMed

    Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai

    2017-02-25

    Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users' browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach.

  11. First-Principles Approach to Model Electrochemical Reactions: Understanding the Fundamental Mechanisms behind Mg Corrosion

    NASA Astrophysics Data System (ADS)

    Surendralal, Sudarsan; Todorova, Mira; Finnis, Michael W.; Neugebauer, Jörg

    2018-06-01

    Combining concepts of semiconductor physics and corrosion science, we develop a novel approach that allows us to perform ab initio calculations under controlled potentiostat conditions for electrochemical systems. The proposed approach can be straightforwardly applied in standard density functional theory codes. To demonstrate the performance and the opportunities opened by this approach, we study the chemical reactions that take place during initial corrosion at the water-Mg interface under anodic polarization. Based on this insight, we derive an atomistic model that explains the origin of the anodic hydrogen evolution.

  12. Comparison of algebraic and analytical approaches to the formulation of the statistical model-based reconstruction problem for X-ray computed tomography.

    PubMed

    Cierniak, Robert; Lorent, Anna

    2016-09-01

    The main aim of this paper is to investigate the conditioning-related properties of our originally formulated statistical model-based iterative approach to the problem of image reconstruction from projections, and in this manner to demonstrate the superiority of this approach over those recently used by other authors. The reconstruction algorithm based on this conception uses maximum likelihood estimation with an objective adjusted to the probability distribution of the measured signals obtained from an X-ray computed tomography system with parallel beam geometry. The analysis and experimental results presented here show that our analytical approach outperforms the referential algebraic methodology, which is explored widely in the literature and exploited in various commercial implementations.

  13. Cross-Battery Assessment Pattern of Strengths and Weaknesses Approach to the Identification of Specific Learning Disorders: Evidence-Based Practice or Pseudoscience?

    ERIC Educational Resources Information Center

    Kranzler, John H.; Floyd, Randy G.; Benson, Nicholas; Zaboski, Brian; Thibodaux, Lia

    2016-01-01

    In this rejoinder, the authors describe the aim of the original study as an effort to conduct a critical test of an important postulate underlying the Cross-Battery Assessment PSW approach (XBA PSW; Kranzler, Floyd, Benson, Zaboski, & Thibodaux, this issue). The authors used classification agreement analysis to examine the concordance between…

  14. Analysis of creative mathematic thinking ability in problem based learning model based on self-regulation learning

    NASA Astrophysics Data System (ADS)

    Munahefi, D. N.; Waluya, S. B.; Rochmad

    2018-03-01

    The purpose of this research was to identify the effectiveness of the Problem Based Learning (PBL) model based on Self-Regulated Learning (SRL) for mathematical creative thinking ability, and to analyze the mathematical creative thinking of high school students in solving mathematical problems. The population of this study was students of grade X at SMA N 3 Klaten. The research method was sequential explanatory. In the quantitative stage, a simple random sampling technique was used: two classes were selected at random, with the experimental class taught using the PBL model based on SRL and the control class taught using an expository model. Sampling at the qualitative stage used a non-probability technique in which three students each were selected from the high, medium, and low academic levels. The PBL model with the SRL approach was effective for students' mathematical creative thinking ability. Students of low academic level achieved the aspects of fluency and flexibility; students of medium academic level achieved the fluency and flexibility aspects well, but their originality was not yet well structured; and students of high academic level could reach the aspect of originality.

  15. Comparative Analysis Study of Open Source GIS in Malaysia

    NASA Astrophysics Data System (ADS)

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Khuizham Abd Halim, Mohd

    2014-06-01

    Open source might appear to be a major prospective change, capable of delivering in various industries and offering a competitive means of development in developing countries. The leading purpose of this research study is to discover the degree of adoption of Open Source Software (OSS) connected with Geographic Information System (GIS) applications within Malaysia, where uptake has been limited by inadequate awareness of open source concepts or by technical deficiencies in open source tools. This research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate the awareness and acceptance levels based on comparative feedback regarding OSS and commercial GIS. The survey was conducted among three groups of candidates: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed during the analysis to provide a measurable and descriptive basis for the final result. The second stage involved an interview session with a major organization that operates an open source web GIS, the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The purpose of this preliminary study was to understand the viewpoints of different groups of people on open source, and how insufficient awareness of open source concepts and possibilities may significantly affect the level of adoption of open source solutions.

  16. Identification of milk origin and process-induced changes in milk by stable isotope ratio mass spectrometry.

    PubMed

    Scampicchio, Matteo; Mimmo, Tanja; Capici, Calogero; Huck, Christian; Innocente, Nadia; Drusch, Stephan; Cesco, Stefano

    2012-11-14

    Stable isotope values were used to develop a new analytical approach enabling the simultaneous identification of milk samples processed with different heating regimens or originating from different geographical areas. The samples consisted of raw, pasteurized (HTST), and ultrapasteurized (UHT) milk of different Italian origins. The approach consisted of the analysis of the δ¹³C and δ¹⁵N isotope ratios of the milk samples and their fractions (fat, casein, and whey). The main finding of this work is that, because heat processing affects the composition of the milk fractions, changes in δ¹³C and δ¹⁵N were also observed. These changes were used as markers to develop pattern recognition maps based on principal component analysis and supervised classification models, such as linear discriminant analysis (LDA), multivariate regression (MLR), principal component regression (PCR), and partial least squares (PLS). The results provide proof of concept that isotope ratio mass spectrometry can discriminate simultaneously between milk samples according to their geographical origin and type of processing.
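
    A minimal sketch of the chemometric step under stated assumptions: classify milk samples from the δ¹³C and δ¹⁵N values of their three fractions using LDA. The six-feature layout and all numeric values below are synthetic, for illustration only.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(5)
    # Columns: d13C/d15N of fat, casein, whey (synthetic class means).
    raw = rng.normal([-27, 5, -26, 6, -25, 6], 0.3, size=(20, 6))
    uht = rng.normal([-26, 6, -25, 7, -24, 7], 0.3, size=(20, 6))
    X = np.vstack([raw, uht])
    y = ["raw"] * 20 + ["UHT"] * 20

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("training accuracy:", lda.score(X, y))
    ```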

  17. Adjacent-Categories Mokken Models for Rater-Mediated Assessments

    PubMed Central

    Wind, Stefanie A.

    2016-01-01

    Molenaar extended Mokken’s original probabilistic-nonparametric scaling models for use with polytomous data. These polytomous extensions of Mokken’s original scaling procedure have facilitated the use of Mokken scale analysis as an approach to exploring fundamental measurement properties across a variety of domains in which polytomous ratings are used, including rater-mediated educational assessments. Because their underlying item step response functions (i.e., category response functions) are defined using cumulative probabilities, polytomous Mokken models can be classified as cumulative models based on the classifications of polytomous item response theory models proposed by several scholars. In order to permit a closer conceptual alignment with educational performance assessments, this study presents an adjacent-categories variation on the polytomous monotone homogeneity and double monotonicity models. Data from a large-scale rater-mediated writing assessment are used to illustrate the adjacent-categories approach, and results are compared with the original formulations. Major findings suggest that the adjacent-categories models provide additional diagnostic information related to individual raters’ use of rating scale categories that is not observed under the original formulation. Implications are discussed in terms of methods for evaluating rating quality. PMID:29795916

  18. Knowledge-based IMRT treatment planning for prostate cancer.

    PubMed

    Chanyavanich, Vorakarn; Das, Shiva K; Lee, William R; Lo, Joseph Y

    2011-05-01

    To demonstrate the feasibility of using a knowledge base of prior treatment plans to generate new prostate intensity modulated radiation therapy (IMRT) plans. Each new case is matched against others in the knowledge base; once the best match is identified, that clinically approved plan is used to generate the new plan. A database of 100 prostate IMRT treatment plans was assembled into an information-theoretic system. An algorithm based on mutual information was implemented to identify similar patient cases by matching 2D beam's eye view projections of contours. Ten randomly selected query cases were each matched with the most similar case from the database of prior clinically approved plans. Treatment parameters from the matched case were used to develop new treatment plans. Differences in the dose-volume histograms between the new and the original treatment plans were analyzed. On average, the new knowledge-based plan achieves planning target volume coverage comparable to the original plan, to within 2% as evaluated for D98, D95, and D1. Similarly, the doses to the rectum and bladder are also comparable to those of the original plan. For the rectum, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are 1.8% +/- 8.5%, -2.5% +/- 13.9%, and -13.9% +/- 23.6%, respectively. For the bladder, the corresponding values are -5.9% +/- 10.8%, -12.2% +/- 14.6%, and -24.9% +/- 21.2%, respectively. A negative percentage difference indicates that the new plan has greater dose sparing than the original plan. The authors demonstrate a knowledge-based approach that uses prior clinically approved treatment plans to generate clinically acceptable treatment plans of high quality. This semiautomated approach has the potential to improve the efficiency of the treatment planning process while ensuring that high-quality plans are developed.
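
    A sketch of mutual information as a similarity score between two beam's eye view contour masks, the quantity used above to rank prior plans against a query case. The binning and the mask generation are illustrative assumptions, not the authors' pipeline.

    ```python
    import numpy as np

    def mutual_information(a, b, bins=2):
        """a, b: equally sized 2D arrays (e.g., binary contour projections)."""
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0                           # avoid log(0) terms
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    query = np.zeros((64, 64)); query[20:44, 18:40] = 1  # hypothetical target mask
    prior = np.zeros((64, 64)); prior[22:46, 20:42] = 1  # candidate from database
    print("MI score:", mutual_information(query, prior))
    ```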

  19. A Support Vector Machine based method to distinguish proteobacterial proteins from eukaryotic plant proteins

    PubMed Central

    2012-01-01

    Background Members of the phylum Proteobacteria are most prominent among bacteria causing plant diseases that result in a diminution of the quantity and quality of food produced by agriculture. To ameliorate these losses, there is a need to identify infections in early stages. Recent developments in next generation nucleic acid sequencing and mass spectrometry open the door to screening plants by the sequences of their macromolecules. Such an approach requires the ability to recognize the organismal origin of unknown DNA or peptide fragments. There are many ways to approach this problem but none have emerged as the best protocol. Here we attempt a systematic way to determine organismal origins of peptides by using a machine learning algorithm. The algorithm that we implement is a Support Vector Machine (SVM). Result The amino acid compositions of proteobacterial proteins were found to be different from those of plant proteins. We developed an SVM model based on amino acid and dipeptide compositions to distinguish between a proteobacterial protein and a plant protein. The amino acid composition (AAC) based SVM model had an accuracy of 92.44% with 0.85 Matthews correlation coefficient (MCC) while the dipeptide composition (DC) based SVM model had a maximum accuracy of 94.67% and 0.89 MCC. We also developed SVM models based on a hybrid approach (AAC and DC), which gave a maximum accuracy 94.86% and a 0.90 MCC. The models were tested on unseen or untrained datasets to assess their validity. Conclusion The results indicate that the SVM based on the AAC and DC hybrid approach can be used to distinguish proteobacterial from plant protein sequences. PMID:23046503
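
    A minimal sketch of the amino-acid-composition (AAC) feature plus SVM idea described above: each protein becomes a 20-vector of residue frequencies fed to a classifier. The sequences, labels, and hyperparameters below are toy placeholders, not the paper's training data or tuned model.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    AA = "ACDEFGHIKLMNPQRSTVWY"

    def aac(seq):
        """Amino acid composition: relative frequency of each residue."""
        seq = seq.upper()
        return np.array([seq.count(a) for a in AA], dtype=float) / max(len(seq), 1)

    train_seqs = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",    # "proteobacterial" (toy)
                  "MASSMLSSATMVASPAQATMVAPFNGLKSSAAFP"]   # "plant" (toy)
    labels = [1, 0]

    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    clf.fit(np.vstack([aac(s) for s in train_seqs]), labels)
    print(clf.predict(aac("MKKLLPTAAAGLLLLAAQPAMA").reshape(1, -1)))
    ```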

  20. Solution-focused premarital counseling: helping couples build a vision for their marriage.

    PubMed

    Murray, Christine E; Murray, Thomas L

    2004-07-01

    This article outlines a solution-focused approach to premarital counseling. Solution-focused premarital counseling is a strength-based approach that focuses on a couple's resources to develop a shared vision for the marriage. Background information about premarital counseling and solution-focused therapy provide the framework for the development of intervention strategies that are grounded in the solution-focused approach. Solution-oriented interventions include solution-oriented questions, providing feedback, and the Couple's Resource Map, an original intervention that is described in this article.

  1. A Process Approach to Community-Based Education: The People's Free University of Saskatchewan

    ERIC Educational Resources Information Center

    Woodhouse, Howard

    2005-01-01

    On the basis of insights provided by Whitehead and John Cobb, I show how the People's Free University of Saskatchewan (PFU) is a working model of free, open, community-based education that embodies several characteristics of Whitehead's philosophy of education. Formed in opposition to the growing commercialization at the original "people?s…

  2. "What's so Terrible about Swallowing an Apple Seed?" Problem-Based Learning in Kindergarten

    ERIC Educational Resources Information Center

    Zhang, Meilan; Parker, Joyce; Eberhardt, Jan; Passalacqua, Susan

    2011-01-01

    Problem-Based Learning (PBL), an instructional approach originated in medical education, has gained increasing attention in K-12 science education because of its emphasis on self-directed learning and real-world problem-solving. Yet few studies have examined how PBL can be adapted for kindergarten. In this study, we examined how a veteran…

  3. Ecological risk assessment of agricultural soils for the definition of soil screening values: A comparison between substance-based and matrix-based approaches.

    PubMed

    Pivato, Alberto; Lavagnolo, Maria Cristina; Manachini, Barbara; Vanin, Stefano; Raga, Roberto; Beggio, Giovanni

    2017-04-01

    The Italian legislation on contaminated soils does not include Ecological Risk Assessment (ERA), and this deficiency has important consequences for the sustainable management of agricultural soils. The present research compares the results of two ERA procedures applied to agricultural soils: (i) one based on the "substance-based" approach and (ii) one based on the "matrix-based" approach. In the former, the soil screening values (SVs) for individual substances were derived according to institutional foreign guidelines. In the latter, the SVs characterizing the whole matrix were derived by the authors through original experimental activity. The results indicate that the "matrix-based" approach can be efficiently implemented in the Italian legislation for the ERA of agricultural soils. Compared to the institutionalized "substance-based" approach, this method (i) is comparable in economic terms and in testing time, (ii) is site-specific and assesses the real effect of the investigated soil on a battery of bioassays, (iii) accounts for phenomena that may radically modify the exposure of organisms to the totality of contaminants, and (iv) can be considered sufficiently conservative.

  4. Translating person-centered care into practice: A comparative analysis of motivational interviewing, illness-integration support, and guided self-determination.

    PubMed

    Zoffmann, Vibeke; Hörnsten, Åsa; Storbækken, Solveig; Graue, Marit; Rasmussen, Bodil; Wahl, Astrid; Kirkevold, Marit

    2016-03-01

    Person-centred care [PCC] can engage people in living well with a chronic condition. However, translating PCC into practice is challenging. We aimed to compare the translational potentials of three approaches: motivational interviewing [MI], illness integration support [IIS] and guided self-determination [GSD]. The comparative analysis included eight components: (1) philosophical origin; (2) development in the original clinical setting; (3) theoretical underpinnings; (4) overarching goal and supportive processes; (5) general principles, strategies or tools for engaging people; (6) health care professionals' background and training; (7) fidelity assessment; and (8) reported effects. Although all approaches promote autonomous motivation, they differ in other ways. Their original settings explain why IIS and GSD strive for life-illness integration, whereas MI focuses on managing ambivalence. IIS and GSD were based on grounded theories, whereas MI was developed intuitively. All of the approaches apply processes and strategies to advance professionals' communication skills and engagement; GSD also includes context-specific reflection sheets. All offer training programs; MI and GSD include fidelity tools. Each approach has a primary application: MI, when ambivalence threatens positive change; IIS, when integrating newly diagnosed chronic conditions; and GSD, when problem solving is difficult or deadlocked. Professionals must critically consider the context in their choice of approach.

  5. Parallelization of an Object-Oriented Unstructured Aeroacoustics Solver

    NASA Technical Reports Server (NTRS)

    Baggag, Abdelkader; Atkins, Harold; Oezturan, Can; Keyes, David

    1999-01-01

    A computational aeroacoustics code based on the discontinuous Galerkin method is ported to several parallel platforms using MPI. The discontinuous Galerkin method is a compact high-order method that retains its accuracy and robustness on non-smooth unstructured meshes. In its semi-discrete form, the discontinuous Galerkin method can be combined with explicit time marching methods, making it well suited to time-accurate computations. The compact nature of the discontinuous Galerkin method also makes it well suited for distributed memory parallel platforms. The original serial code was written using an object-oriented approach and was previously optimized for cache-based machines. The port to parallel platforms was achieved simply by treating partition boundaries as a type of boundary condition. Code modifications were minimal because boundary conditions were abstractions in the original program. Scalability results are presented for the SGI Origin, IBM SP2, and clusters of SGI and Sun workstations. Slightly superlinear speedup is achieved on a fixed-size problem on the Origin, due to cache effects.

  6. The Search for Hydrologic Signatures: The Effect of Data Transformations on Bayesian Model Calibration

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Vrugt, J. A.

    2011-12-01

    In the past few years, several contributions have appeared in the hydrologic literature that introduce and analyze the benefits of using a signature-based approach to watershed analysis. This signature-based approach abandons the standard single-criterion model-data fitting paradigm in favor of a diagnostic approach that better extracts the information contained in the available data. Despite the prospects of this new viewpoint, rather ad hoc criteria have hitherto been proposed to improve watershed modeling. Here, we aim to provide a proper mathematical foundation for signature-based analysis. We analyze the information content of different data transformations by analyzing their convergence speed in Markov Chain Monte Carlo (MCMC) simulation using the generalized likelihood function of Schoups and Vrugt (2010). We compare the information content of the original discharge data against simple square root and Box-Cox transformations of the streamflow data. We benchmark these results against wavelet and flow duration curve transformations that temporally disaggregate the discharge data. Our results conclusively demonstrate that wavelet transformations and flow duration curves significantly reduce the information content of the streamflow data and consequently unnecessarily increase the uncertainty of the HYMOD model parameters. Hydrologic signatures thus need to be found in the original data, without temporal disaggregation.
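
    For illustration, the two simple transformations compared in the study can be applied to a synthetic discharge series as below; in a calibration setting the likelihood would then be evaluated on the transformed flows. The synthetic series and the summary statistic are assumptions for the sketch.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    q = rng.lognormal(mean=1.0, sigma=0.8, size=365)  # synthetic streamflow (m^3/s)

    q_sqrt = np.sqrt(q)
    q_bc, lam = stats.boxcox(q)                       # lambda fitted by max likelihood
    print(f"fitted Box-Cox lambda: {lam:.3f}")
    print("skewness original/sqrt/box-cox:",
          [round(float(stats.skew(x)), 2) for x in (q, q_sqrt, q_bc)])
    ```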

  7. Influence of Geographical Origin and Flour Type on Diversity of Lactic Acid Bacteria in Traditional Belgian Sourdoughs

    PubMed Central

    Scheirlinck, Ilse; Van der Meulen, Roel; Van Schoor, Ann; Vancanneyt, Marc; De Vuyst, Luc; Vandamme, Peter; Huys, Geert

    2007-01-01

    A culture-based approach was used to investigate the diversity of lactic acid bacteria (LAB) in Belgian traditional sourdoughs and to assess the influence of flour type, bakery environment, geographical origin, and technological characteristics on the taxonomic composition of these LAB communities. For this purpose, a total of 714 LAB from 21 sourdoughs sampled at 11 artisan bakeries throughout Belgium were subjected to a polyphasic identification approach. The microbial composition of the traditional sourdoughs was characterized by bacteriological culture in combination with genotypic identification methods, including repetitive element sequence-based PCR fingerprinting and phenylalanyl-tRNA synthase (pheS) gene sequence analysis. LAB from Belgian sourdoughs belonged to the genera Lactobacillus, Pediococcus, Leuconostoc, Weissella, and Enterococcus, with the heterofermentative species Lactobacillus paralimentarius, Lactobacillus sanfranciscensis, Lactobacillus plantarum, and Lactobacillus pontis as the most frequently isolated taxa. Statistical analysis of the identification data indicated that the microbial composition of the sourdoughs is mainly affected by the bakery environment rather than the flour type (wheat, rye, spelt, or a mixture of these) used. In conclusion, the polyphasic approach, based on rapid genotypic screening and high-resolution, sequence-dependent identification, proved to be a powerful tool for studying the LAB diversity in traditional fermented foods such as sourdough. PMID:17675431

  8. Influence of geographical origin and flour type on diversity of lactic acid bacteria in traditional Belgian sourdoughs.

    PubMed

    Scheirlinck, Ilse; Van der Meulen, Roel; Van Schoor, Ann; Vancanneyt, Marc; De Vuyst, Luc; Vandamme, Peter; Huys, Geert

    2007-10-01

    A culture-based approach was used to investigate the diversity of lactic acid bacteria (LAB) in Belgian traditional sourdoughs and to assess the influence of flour type, bakery environment, geographical origin, and technological characteristics on the taxonomic composition of these LAB communities. For this purpose, a total of 714 LAB from 21 sourdoughs sampled at 11 artisan bakeries throughout Belgium were subjected to a polyphasic identification approach. The microbial composition of the traditional sourdoughs was characterized by bacteriological culture in combination with genotypic identification methods, including repetitive element sequence-based PCR fingerprinting and phenylalanyl-tRNA synthase (pheS) gene sequence analysis. LAB from Belgian sourdoughs belonged to the genera Lactobacillus, Pediococcus, Leuconostoc, Weissella, and Enterococcus, with the heterofermentative species Lactobacillus paralimentarius, Lactobacillus sanfranciscensis, Lactobacillus plantarum, and Lactobacillus pontis as the most frequently isolated taxa. Statistical analysis of the identification data indicated that the microbial composition of the sourdoughs is mainly affected by the bakery environment rather than the flour type (wheat, rye, spelt, or a mixture of these) used. In conclusion, the polyphasic approach, based on rapid genotypic screening and high-resolution, sequence-dependent identification, proved to be a powerful tool for studying the LAB diversity in traditional fermented foods such as sourdough.

  9. Dependence and risk assessment for oil prices and exchange rate portfolios: A wavelet based approach

    NASA Astrophysics Data System (ADS)

    Aloui, Chaker; Jammazi, Rania

    2015-10-01

    In this article, we propose a wavelet-based approach to accommodate the stylized facts and complex structure of financial data caused by frequent and abrupt market changes and noise. Specifically, we show how the combination of both continuous and discrete wavelet transforms with traditional financial models helps improve portfolio market risk assessment. In the empirical stage, three wavelet-based models (wavelet-EGARCH with dynamic conditional correlations, wavelet-copula, and wavelet-extreme value) are considered and applied to crude oil price and US dollar exchange rate data. Our findings show that the wavelet-based approach provides an effective and powerful tool for detecting extreme moments and improving the accuracy of VaR and Expected Shortfall estimates of oil-exchange rate portfolios after noise is removed from the original data.
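
    A minimal sketch of the general "denoise, then measure risk" idea, assuming PyWavelets and NumPy are available. It applies a discrete wavelet transform with soft thresholding to a simulated return series and then computes historical VaR and Expected Shortfall; the paper's actual wavelet-EGARCH-DCC, wavelet-copula, and wavelet-EVT models are far richer than this.

```python
# Hedged sketch: wavelet denoising of a return series followed by historical
# VaR / Expected Shortfall estimation. Data and parameters are invented.
import numpy as np
import pywt

rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_normal(2048) + 0.002 * rng.standard_normal(2048)

# Discrete wavelet decomposition and soft-thresholding of detail coefficients.
coeffs = pywt.wavedec(returns, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale estimate
thresh = sigma * np.sqrt(2 * np.log(len(returns)))    # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                 for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[: len(returns)]

def var_es(x, alpha=0.05):
    """Historical Value-at-Risk and Expected Shortfall at level alpha."""
    var = -np.quantile(x, alpha)
    es = -x[x <= -var].mean()
    return var, es

print("raw      VaR/ES:", var_es(returns))
print("denoised VaR/ES:", var_es(denoised))
```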

  10. Holistic and sustainable health improvement: the contribution of the settings-based approach to health promotion.

    PubMed

    Dooris, Mark

    2009-01-01

    Highlighting the need for holistic and sustainable health improvement, this paper starts by reviewing the origins, history and conceptualization of the settings approach to health promotion. It then takes stock of current practice both internationally and nationally, noting its continuing importance worldwide and its inconsistent profile and utilization across the four UK countries. It goes on to explore the applicability and future development of settings-based health promotion in relation to three key issues: inequalities and inclusion; place-shaping; and systems-based responses to complex problems. Concluding that the settings approach remains highly relevant to 21st-century public health, the paper calls on the new "Royal" to provide much-needed leadership, thereby placing settings-based health promotion firmly on the national agenda across the whole of the UK.

  11. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1988-01-01

    The use and implementation of Ada were investigated in distributed environments in which reliability is the primary concern. In particular, the focus was on the possibility that a distributed system may be programmed entirely in Ada, so that the individual tasks of the system are unconcerned with the processors on which they are executed, and that failures may occur in the software and underlying hardware. A secondary interest is in the performance of Ada systems and how that performance can be gauged reliably. Primary activities included: analysis of the original approach to recovery in distributed Ada programs using the Advanced Transport Operating System (ATOPS) example; review and assessment of the original approach, which was found to be capable of improvement; development of a refined approach to recovery that was applied to the ATOPS example; and design and development of a performance assessment scheme for Ada programs based on a flexible user-driven benchmarking system.

  12. On iterative processes in the Krylov-Sonneveld subspaces

    NASA Astrophysics Data System (ADS)

    Ilin, Valery P.

    2016-10-01

    The iterative Induced Dimension Reduction (IDR) methods are considered for solving large systems of linear algebraic equations (SLAEs) with nonsingular nonsymmetric matrices. These approaches have been investigated by many authors and are sometimes characterized as an alternative to the classical Krylov-type processes. The key ingredient of the IDR algorithms is the construction of embedded Sonneveld subspaces, which have decreasing dimensions and use orthogonalization with respect to some fixed subspace. Other, independent approaches to analyzing and optimizing the iterations are based on augmented and modified Krylov subspaces, using aggregation and deflation procedures that introduce various low-rank approximations of the original matrices. The goal of this paper is to show that the IDR methods in Sonneveld subspaces provide an original interpretation of the modified algorithms in Krylov subspaces. In particular, such a description is given for the multi-preconditioned semi-conjugate direction methods, which are relevant for parallel algebraic domain decomposition approaches.
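
    A small sketch of the problem setting, assuming SciPy is available. IDR(s) itself is not shipped with SciPy, so BiCGSTAB, which is closely related to IDR(1), stands in here purely to illustrate solving a nonsymmetric SLAE with Krylov-type iterations; this is not the paper's multi-preconditioned semi-conjugate direction method.

```python
# Hedged sketch: Krylov-type iterative solution of a nonsymmetric SLAE.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab, gmres

n = 1000
# Nonsymmetric, diagonally dominant tridiagonal test matrix.
A = diags([-1.2, 2.5, -0.8], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x1, info1 = bicgstab(A, b)          # related to IDR(1)
x2, info2 = gmres(A, b, restart=50)

print("bicgstab residual: %.2e (info=%d)" % (np.linalg.norm(b - A @ x1), info1))
print("gmres    residual: %.2e (info=%d)" % (np.linalg.norm(b - A @ x2), info2))
```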

  13. Sorted Index Numbers for Privacy Preserving Face Recognition

    NASA Astrophysics Data System (ADS)

    Wang, Yongjin; Hatzinakos, Dimitrios

    2009-12-01

    This paper presents a novel approach for changeable and privacy preserving face recognition. We first introduce a new method of biometric matching using the sorted index numbers (SINs) of feature vectors. Since it is impossible to recover any of the exact values of the original features, the transformation from original features to the SIN vectors is noninvertible. To address the irrevocable nature of biometric signals whilst obtaining stronger privacy protection, a random projection-based method is employed in conjunction with the SIN approach to generate changeable and privacy preserving biometric templates. The effectiveness of the proposed method is demonstrated on a large generic data set, which contains images from several well-known face databases. Extensive experimentation shows that the proposed solution may improve the recognition accuracy.
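
    A hedged toy of the sorted-index-number (SIN) idea using NumPy: a feature vector is randomly projected and represented only by the argsort of the projected coefficients, so exact feature values cannot be recovered, and swapping the projection matrix re-issues a new (changeable) template. The matching score below is an illustrative Spearman-style agreement, not the paper's matcher.

```python
# Hedged sketch of SIN templates with a random projection; data is simulated.
import numpy as np

def sin_template(features, projection):
    """Project, then keep only the ordering of the projected coefficients."""
    return np.argsort(projection @ features)

def match_score(t1, t2):
    """Toy similarity: Spearman-like agreement between two SIN templates."""
    n = len(t1)
    r1, r2 = np.empty(n), np.empty(n)
    r1[t1], r2[t2] = np.arange(n), np.arange(n)
    return 1.0 - 6 * np.sum((r1 - r2) ** 2) / (n * (n ** 2 - 1))

rng = np.random.default_rng(42)
P = rng.standard_normal((64, 128))       # user-specific random projection
face_a = rng.standard_normal(128)
face_a_noisy = face_a + 0.05 * rng.standard_normal(128)   # same subject
face_b = rng.standard_normal(128)                         # different subject

t = sin_template(face_a, P)
print("genuine :", match_score(t, sin_template(face_a_noisy, P)))
print("impostor:", match_score(t, sin_template(face_b, P)))
```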

  14. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    PubMed

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
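
    A short sketch of the taxonomy's two axes (with/without replacement; whole sample vs subset) on simulated data: the bootstrap resamples with replacement at size n, the jackknife leaves one observation out, and the randomization test permutes group labels.

```python
# Hedged sketch illustrating the three resampling schemes on invented data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.5, 1.0, size=30)
y = rng.normal(0.0, 1.0, size=30)

# Bootstrap: with replacement, full sample size n.
boot_means = [rng.choice(x, size=len(x), replace=True).mean()
              for _ in range(2000)]
print("bootstrap SE:", np.std(boot_means))

# Jackknife: without replacement, subsets of size n-1 (leave-one-out).
jack = np.array([np.delete(x, i).mean() for i in range(len(x))])
print("jackknife SE:",
      np.sqrt((len(x) - 1) * np.mean((jack - jack.mean()) ** 2)))

# Randomization test: permute group labels over the pooled sample.
obs = x.mean() - y.mean()
pooled = np.concatenate([x, y])
count = 0
for _ in range(5000):
    perm = rng.permutation(pooled)
    count += abs(perm[:30].mean() - perm[30:].mean()) >= abs(obs)
print("randomization p-value:", count / 5000)
```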

  15. Use of Traffic Displays for General Aviation Approach Spacing: A Human Factors Study

    DTIC Science & Technology

    2007-12-01

    engine rated pilots participated. Eight flew approaches in a twin-engine Piper Aztec originating in Sanford, ME, and eight flew approaches in the same...flew approaches in a twin-engine Piper Aztec originating in Sanford, ME, and eight flew approaches in the same aircraft originating in Atlantic City... Aztec . The plane was equipped with a horizontal Situation Indicator (hSI). The Garmin International MX-20™ multifunction traffic display or “Basic

  16. Multi-Level Building Reconstruction for Automatic Enhancement of High Resolution Dsms

    NASA Astrophysics Data System (ADS)

    Arefi, H.; Reinartz, P.

    2012-07-01

    In this article a multi-level approach is proposed for the reconstruction-based improvement of high resolution Digital Surface Models (DSMs). The concept of Levels of Detail (LOD) defined by the CityGML standard is taken as the basis for the abstraction levels of building roof structures. Here, LOD1 and LOD2, which correspond to prismatic and parametric roof shapes, are reconstructed. Besides proposing a new approach for automatic LOD1 and LOD2 generation from high resolution DSMs, the algorithm contains two generalization levels, namely horizontal and vertical. Both generalization levels are applied to the prismatic model of buildings. The horizontal generalization allows controlling the approximation level of building footprints, similar to the cartographic generalization concept for urban maps. In vertical generalization, the prismatic model is formed using an individual building height and continues to include all flat structures located at different height levels. The concept of LOD1 generation is based on the approximation of building footprints by rectangular or non-rectangular polygons. For a rectangular building containing one main orientation, a method based on the Minimum Bounding Rectangle (MBR) is employed. In contrast, a Combined Minimum Bounding Rectangle (CMBR) approach is proposed for the regularization of non-rectilinear polygons, i.e., buildings without perpendicular edge directions. Both MBR- and CMBR-based approaches are iteratively applied to building segments to reduce the original building footprints to a minimum number of nodes with maximum similarity to the original shapes. A model-driven approach based on the analysis of the 3D points of DSMs in a 2D projection plane is proposed for LOD2 generation. Accordingly, a building block is divided into smaller parts according to the direction and number of existing ridge lines. The 3D model is derived for each building part and, finally, a complete parametric model is formed by merging the 3D models of the individual parts and adjusting the nodes after the merging step. In order to provide an enhanced DSM, a surface model is generated for each building by interpolation of the internal points of the generated models. All interpolated models are placed on a Digital Terrain Model (DTM) of the corresponding area to shape the enhanced DSM. The proposed DSM enhancement approach has been tested on a dataset from the central area of Munich. The original DSM was created using robust stereo matching of WorldView-2 stereo images. A quantitative assessment of the new DSM, comparing the heights of ridges and eaves, shows a standard deviation of better than 50 cm.
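
    A minimal sketch of the MBR step only, assuming Shapely is installed: a noisy footprint polygon is approximated by its minimum rotated bounding rectangle. The paper's CMBR regularization for non-rectilinear buildings and the LOD2 roof modelling are not reproduced here, and the footprint coordinates are invented.

```python
# Hedged sketch: minimum bounding rectangle of a hypothetical footprint.
from shapely.geometry import Polygon

# Hypothetical noisy footprint extracted from a DSM building segment.
footprint = Polygon([(0, 0), (10.2, 0.3), (10.0, 5.1), (6.1, 5.0),
                     (6.0, 8.2), (0.2, 8.0)])

mbr = footprint.minimum_rotated_rectangle
print("footprint area:", footprint.area)
print("MBR area      :", mbr.area)
print("MBR nodes     :", list(mbr.exterior.coords))
```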

  17. Molecular recognition of the environment and mechanisms of the origin of species in quantum-like modeling of evolution.

    PubMed

    Melkikh, Alexey V; Khrennikov, Andrei

    2017-11-01

    A review of the mechanisms of speciation is performed. The mechanisms of the evolution of species, taking into account feedback from the state of the environment and mechanisms of the emergence of complexity, are considered. It is shown that these mechanisms, at the molecular level, cannot work steadily in terms of classical mechanics. Quantum mechanisms of changes in the genome, based on a long-range interaction potential between biologically important molecules, are proposed as one possible explanation. Different variants of interactions between the organism and the environment, based on molecular recognition and leading to the origin of new species, are considered. Experiments to verify the model are proposed. This biophysical study is complemented by a general operational model based on quantum information theory. We briefly present the basics of the quantum-like approach to modeling bio-informational processes and illustrate it with a quantum-like model of epigenetic evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. An approach to the origin of self-replicating system. I - Intermolecular interactions

    NASA Technical Reports Server (NTRS)

    Macelroy, R. D.; Coeckelenbergh, Y.; Rein, R.

    1978-01-01

    The present paper deals with the characteristics and potentialities of a recently developed computer-based molecular modeling system. Some characteristics of current coding systems are examined and are extrapolated to the apparent requirements of primitive prebiological coding systems.

  19. Usefulness of Mendelian Randomization in Observational Epidemiology

    PubMed Central

    Bochud, Murielle; Rousson, Valentin

    2010-01-01

    Mendelian randomization refers to the random allocation of alleles at the time of gamete formation. In observational epidemiology, this refers to the use of genetic variants to estimate a causal effect between a modifiable risk factor and an outcome of interest. In this review, we recall the principles of a “Mendelian randomization” approach in observational epidemiology, which is based on the technique of instrumental variables; we provide simulations and an example based on real data to demonstrate its implications; we present the results of a systematic search on original articles having used this approach; and we discuss some limitations of this approach in view of what has been found so far. PMID:20616999
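
    A hedged sketch of the instrumental-variable core of Mendelian randomization on simulated data: a genetic variant G instruments the exposure X against an unobserved confounder U, and the Wald ratio recovers the causal effect on Y that naive regression overstates. Real analyses add assumption checks this omits.

```python
# Hedged sketch: Wald ratio IV estimator vs naive OLS on simulated data.
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
g = rng.binomial(2, 0.3, n)                  # genotype: 0/1/2 risk alleles
u = rng.standard_normal(n)                   # unobserved confounder
x = 0.4 * g + u + rng.standard_normal(n)     # exposure
y = 0.25 * x + u + rng.standard_normal(n)    # true causal effect = 0.25

def slope(a, b):
    c = np.cov(a, b)
    return c[0, 1] / c[0, 0]

print("naive OLS estimate :", round(slope(x, y), 3))                # biased up
print("Wald ratio estimate:", round(slope(g, y) / slope(g, x), 3))  # ~0.25
```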

  20. Impact of migration origin on individual protection strategies against sexual transmission of HIV in Paris metropolitan area, SIRS cohort study, France.

    PubMed

    Kesteman, Thomas; Lapostolle, Annabelle; Costagliola, Dominique; Massari, Véronique; Chauvin, Pierre

    2015-08-20

    The impact of migration and country or region of origin on sexual behaviours and on prevention of the sexual transmission of HIV has been scarcely studied in France. The objective of this study was to evaluate whether and how individual attitudes towards prevention of HIV infection differ according to country or region of origin in the Paris area, France. A total of 3006 individuals were interviewed in the Paris metropolitan area in 2010. Outcome variables were (i) the intention of individuals to protect themselves against HIV, and (ii) the adoption of a condom-based approach for protection against HIV. To explore factors associated with these outcomes, we constructed multivariate logistic regression models, first taking into account only demographic variables (including country of origin), then successively adding socioeconomic variables and variables related to sexual behaviour and HIV perception and prevention behaviour. French and foreign people with origins in Sub-Saharan Africa declared more intention to protect themselves than French people with French parents (in foreign men, aOR = 3.43 [1.66-7.13]; in foreign women, aOR = 2.94 [1.65-5.23]), but did not declare more recourse to a condom-based approach for protection against HIV (in foreign men, aOR = 1.38 [0.38-4.93]; in foreign women, aOR = 0.93 [0.40-2.18]). Conversely, foreign women and French women of foreign origin, especially from the Maghreb (Northern Africa), reported less intention of protection than French women with French parents. These results underline the importance of taking the culture and origins of target populations into consideration when designing information, education and communication about HIV and sexually transmitted diseases. They also draw attention to fractions of the general population that could escape prevention messages.
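
    A hedged sketch of how adjusted odds ratios (aOR) like those quoted above are typically obtained: a multivariate logistic regression with coefficients exponentiated, here fitted with statsmodels on simulated data. All variable names and effect sizes are invented for illustration.

```python
# Hedged sketch: adjusted odds ratio from a logistic regression (simulated).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 3006                                   # mirrors the survey size
origin_ssa = rng.binomial(1, 0.15, n)      # hypothetical origin indicator
age = rng.normal(40, 12, n)
logit = -1.0 + 1.2 * origin_ssa - 0.01 * (age - 40)
protect = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([origin_ssa, age]))
fit = sm.Logit(protect, X).fit(disp=False)
print("aOR (origin):", np.exp(fit.params[1]))        # ~exp(1.2) ~= 3.3
low, high = fit.conf_int()[1]
print("95% CI      :", np.exp(low), np.exp(high))
```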

  1. GeoDeepDive: Towards a Machine Reading-Ready Digital Library and Information Integration Resource

    NASA Astrophysics Data System (ADS)

    Husson, J. M.; Peters, S. E.; Livny, M.; Ross, I.

    2015-12-01

    Recent developments in machine reading and learning approaches to text and data mining hold considerable promise for accelerating the pace and quality of literature-based data synthesis, but these advances have outpaced even basic levels of access to the published literature. For many geoscience domains, particularly those based on physical samples and field-based descriptions, this limitation is significant. Here we describe a general infrastructure to support published literature-based machine reading and learning approaches to information integration and knowledge base creation. This infrastructure supports rate-controlled automated fetching of original documents, along with full bibliographic citation metadata, from remote servers, the secure storage of original documents, and the utilization of considerable high-throughput computing resources for the pre-processing of these documents by optical character recognition, natural language parsing, and other document annotation and parsing software tools. New tools and versions of existing tools can be automatically deployed against original documents when they are made available. The products of these tools (text/XML files) are managed by MongoDB and are available for use in data extraction applications. Basic search and discovery functionality is provided by ElasticSearch, which is used to identify documents of potential relevance to a given data extraction task. Relevant files derived from the original documents are then combined into basic starting points for application building; these starting points are kept up-to-date as new relevant documents are incorporated into the digital library. Currently, our digital library contains more than 360K documents supplied by Elsevier and the USGS, and we are actively seeking additional content providers. By focusing on building a dependable infrastructure to support the retrieval, storage, and pre-processing of published content, we are establishing a foundation for complex, and continually improving, information integration and data extraction applications. We have developed one such application, which we present as an example, and invite new collaborations to develop other such applications.

  2. US forests are showing increased rates of decline in response to a changing climate

    Treesearch

    Warren B. Cohen; Zhiqiang Yang; David M. Bell; Stephen V. Stehman

    2015-01-01

    How vulnerable are US forests to a changing climate? We answer this question using Landsat time series data and a unique interpretation approach, TimeSync, a plot-based Landsat visualization and data collection tool. Original analyses were based on a stratified two-stage cluster sample design that included interpretation of 3858 forested plots. From these data, we...

  3. A Model-Based Method for Content Validation of Automatically Generated Test Items

    ERIC Educational Resources Information Center

    Zhang, Xinxin; Gierl, Mark

    2016-01-01

    The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…

  4. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography.

    PubMed

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-09-01

    Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.

  5. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography

    PubMed Central

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-01-01

    Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037
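
    A hedged toy of the wavelet idea behind MMA (the two records above), assuming PyWavelets and SciPy: decompose a sharp "anatomical" image and a blurred, noisy "functional" image, then inject a fraction of the anatomical detail coefficients into the functional decomposition. The paper's 3D local-correlation model is far more careful than this global 2D mixing.

```python
# Hedged sketch: incorporating anatomical wavelet details into a blurred image.
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(5)
anat = np.zeros((128, 128)); anat[32:96, 32:96] = 1.0       # sharp structure
pet = gaussian_filter(anat, sigma=4) + 0.02 * rng.standard_normal((128, 128))

c_anat = pywt.wavedec2(anat, "db2", level=3)
c_pet = pywt.wavedec2(pet, "db2", level=3)

# Keep the PET approximation band; mix anatomical details in with weight alpha.
alpha = 0.7
mixed = [c_pet[0]]
for d_pet, d_anat in zip(c_pet[1:], c_anat[1:]):
    mixed.append(tuple(dp + alpha * da for dp, da in zip(d_pet, d_anat)))

corrected = pywt.waverec2(mixed, "db2")[:128, :128]
print("max gradient before: %.3f, after: %.3f" %
      (np.abs(np.gradient(pet)[0]).max(), np.abs(np.gradient(corrected)[0]).max()))
```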

  6. Determining origin in a migratory marine vertebrate: a novel method to integrate stable isotopes and satellite tracking

    USGS Publications Warehouse

    Vander Zanden, Hannah B.; Tucker, Anton D.; Hart, Kristen M.; Lamont, Margaret M.; Fujisaki, Ikuko; Addison, David S.; Mansfield, Katherine L.; Phillips, Katrina F.; Wunder, Michael B.; Bowen, Gabriel J.; Pajuelo, Mariela; Bolten, Alan B.; Bjorndal, Karen A.

    2015-01-01

    Stable isotope analysis is a useful tool to track animal movements in both terrestrial and marine environments. These intrinsic markers are assimilated through the diet and may exhibit spatial gradients as a result of biogeochemical processes at the base of the food web. In the marine environment, maps to predict the spatial distribution of stable isotopes are limited, and thus determining geographic origin has been reliant upon integrating satellite telemetry and stable isotope data. Migratory sea turtles regularly move between foraging and reproductive areas. Whereas most nesting populations can be easily accessed and regularly monitored, little is known about the demographic trends in foraging populations. The purpose of the present study was to examine migration patterns of loggerhead nesting aggregations in the Gulf of Mexico (GoM), where sea turtles have been historically understudied. Two methods of geographic assignment using stable isotope values in known-origin samples from satellite telemetry were compared: 1) a nominal approach through discriminant analysis and 2) a novel continuous-surface approach using bivariate carbon and nitrogen isoscapes (isotopic landscapes) developed for this study. Tissue samples for stable isotope analysis were obtained from 60 satellite-tracked individuals at five nesting beaches within the GoM. Both methodological approaches for assignment resulted in high accuracy of foraging area determination, though each has advantages and disadvantages. The nominal approach is more appropriate when defined boundaries are necessary, but up to 42% of the individuals could not be considered in this approach. All individuals can be included in the continuous-surface approach, and individual results can be aggregated to identify geographic hotspots of foraging area use, though the accuracy rate was lower than nominal assignment. The methodological validation provides a foundation for future sea turtle studies in the region to inexpensively determine geographic origin for large numbers of untracked individuals. Regular monitoring of sea turtle nesting aggregations with stable isotope sampling can be used to fill critical data gaps regarding habitat use and migration patterns. Probabilistic assignment to origin with isoscapes has not been previously used in the marine environment, but the methods presented here could also be applied to other migratory marine species.

  7. Determining origin in a migratory marine vertebrate: a novel method to integrate stable isotopes and satellite tracking.

    PubMed

    Zanden, Hannah B Vander; Tucker, Anton D; Hart, Kristen M; Lamont, Margaret M; Fujisaki, Ikuko; Addison, David; Mansfield, Katherine L; Phillips, Katrina F; Wunder, Michael B; Bowen, Gabriel J; Pajuelo, Mariela; Bolten, Alan B; Bjorndal, Karen A

    2015-03-01

    Stable isotope analysis is a useful tool to track animal movements in both terrestrial and marine environments. These intrinsic markers are assimilated through the diet and may exhibit spatial gradients as a result of biogeochemical processes at the base of the food web. In the marine environment, maps to predict the spatial distribution of stable isotopes are limited, and thus determining geographic origin has been reliant upon integrating satellite telemetry and stable isotope data. Migratory sea turtles regularly move between foraging and reproductive areas. Whereas most nesting populations can be easily accessed and regularly monitored, little is known about the demographic trends in foraging populations. The purpose of the present study was to examine migration patterns of loggerhead nesting aggregations in the Gulf of Mexico (GoM), where sea turtles have been historically understudied. Two methods of geographic assignment using stable isotope values in known-origin samples from satellite telemetry were compared: (1) a nominal approach through discriminant analysis and (2) a novel continuous-surface approach using bivariate carbon and nitrogen isoscapes (isotopic landscapes) developed for this study. Tissue samples for stable isotope analysis were obtained from 60 satellite-tracked individuals at five nesting beaches within the GoM. Both methodological approaches for assignment resulted in high accuracy of foraging area determination, though each has advantages and disadvantages. The nominal approach is more appropriate when defined boundaries are necessary, but up to 42% of the individuals could not be considered in this approach. All individuals can be included in the continuous-surface approach, and individual results can be aggregated to identify geographic hotspots of foraging area use, though the accuracy rate was lower than nominal assignment. The methodological validation provides a foundation for future sea turtle studies in the region to inexpensively determine geographic origin for large numbers of untracked individuals. Regular monitoring of sea turtle nesting aggregations with stable isotope sampling can be used to fill critical data gaps regarding habitat use and migration patterns. Probabilistic assignment to origin with isoscapes has not been previously used in the marine environment, but the methods presented here could also be applied to other migratory marine species.
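
    A hedged sketch of the continuous-surface assignment idea from the two records above: score each cell of a bivariate (d13C, d15N) isoscape by the likelihood of an individual's tissue values under a bivariate normal. The isoscape surfaces, covariance, and sample values below are all invented; it relies on the symmetry pdf(x; mu, cov) = pdf(mu; x, cov) to evaluate one sample against many cell means.

```python
# Hedged sketch: probabilistic assignment to origin on a synthetic isoscape.
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical 50x50 isoscapes over the Gulf of Mexico (per-cell means).
lat = np.linspace(24, 31, 50); lon = np.linspace(-98, -80, 50)
c_iso = np.add.outer(-18 + 0.2 * (lat - 24), 0.05 * (lon + 98))
n_iso = np.add.outer(6 + 0.3 * (lat - 24), -0.02 * (lon + 98))

cov = np.array([[0.8, 0.1], [0.1, 0.6]])   # assumed isotopic covariance
sample = np.array([-16.5, 7.4])            # one turtle's tissue values

mu = np.stack([c_iso, n_iso], axis=-1).reshape(-1, 2)
like = multivariate_normal(mean=sample, cov=cov).pdf(mu).reshape(50, 50)
post = like / like.sum()                   # posterior surface, flat prior
i, j = np.unravel_index(np.argmax(post), post.shape)
print("most likely origin: lat %.2f, lon %.2f" % (lat[i], lon[j]))
```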

  8. Constructive Approaches for Understanding the Origin of Self-Replication and Evolution.

    PubMed

    Ichihashi, Norikazu; Yomo, Tetsuya

    2016-07-13

    The mystery of the origin of life can be divided into two parts. The first part is the origin of biomolecules: under what physicochemical conditions did biomolecules such as amino acids, nucleotides, and their polymers arise? The second part of the mystery is the origin of life-specific functions such as the replication of genetic information, the reproduction of cellular structures, metabolism, and evolution. These functions require the coordination of many different kinds of biological molecules. A direct strategy to approach the second part of the mystery is the constructive approach, in which life-specific functions are recreated in a test tube from specific biological molecules. Using this approach, we are able to employ design principles to reproduce life-specific functions, and the knowledge gained through the reproduction process provides clues as to their origins. In this mini-review, we introduce recent insights gained using this approach, and propose important future directions for advancing our understanding of the origins of life.

  9. Unique haplotypes of cacao trees as revealed by trnH-psbA chloroplast DNA

    PubMed Central

    Gutiérrez-López, Nidia; Ovando-Medina, Isidro; Salvador-Figueroa, Miguel; Molina-Freaner, Francisco; Avendaño-Arrazate, Carlos H.

    2016-01-01

    Cacao trees have been cultivated in Mesoamerica for at least 4,000 years. In this study, we analyzed sequence variation in the chloroplast DNA trnH-psbA intergenic spacer from 28 cacao trees from different farms in the Soconusco region in southern Mexico. Genetic relationships were established by two analysis approaches based on geographic origin (five populations) and genetic origin (based on a previous study). We identified six polymorphic sites, including five insertion/deletion (indels) types and one transversion. The overall nucleotide diversity was low for both approaches (geographic = 0.0032 and genetic = 0.0038). Conversely, we obtained moderate to high haplotype diversity (0.66 and 0.80) with 10 and 12 haplotypes, respectively. The common haplotype (H1) for both networks included cacao trees from all geographic locations (geographic approach) and four genetic groups (genetic approach). This common haplotype (ancient) derived a set of intermediate haplotypes and singletons interconnected by one or two mutational steps, which suggested directional selection and event purification from the expansion of narrow populations. Cacao trees from Soconusco region were grouped into one cluster without any evidence of subclustering based on AMOVA (FST = 0) and SAMOVA (FST = 0.04393) results. One population (Mazatán) showed a high haplotype frequency; thus, this population could be considered an important reservoir of genetic material. The indels located in the trnH-psbA intergenic spacer of cacao trees could be useful as markers for the development of DNA barcoding. PMID:27076998
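
    A short sketch of the two diversity statistics reported above, computed on made-up haplotype counts and toy sequences rather than the study's data: haplotype diversity via Nei's estimator and nucleotide diversity as average pairwise differences per site.

```python
# Hedged sketch: haplotype and nucleotide diversity on invented data.
import numpy as np
from itertools import combinations

# Haplotype diversity: h = n/(n-1) * (1 - sum(p_i^2)).
counts = np.array([12, 6, 4, 2, 2, 1, 1])     # individuals per haplotype
n = counts.sum()
p = counts / n
h = n / (n - 1) * (1 - np.sum(p ** 2))
print("haplotype diversity:", round(float(h), 3))

# Nucleotide diversity: mean pairwise differences per site.
seqs = ["ACGTAC", "ACGTAC", "ACGAAC", "ACGTGC"]
L = len(seqs[0])
pi = np.mean([sum(a != b for a, b in zip(s1, s2)) / L
              for s1, s2 in combinations(seqs, 2)])
print("nucleotide diversity:", round(float(pi), 4))
```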

  10. An Extended Spectral-Spatial Classification Approach for Hyperspectral Data

    NASA Astrophysics Data System (ADS)

    Akbari, D.

    2017-11-01

    In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of hyperspectral data: (1) unsupervised feature extraction methods, including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction methods, including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); and (3) a genetic algorithm (GA). The extracted spectral features are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both the SVM and the watershed segmentation algorithm. To evaluate the proposed approach, it is tested on the Pavia University hyperspectral data. Experimental results show that the proposed approach using GA achieves an overall accuracy approximately 8% higher than the original MSF-based algorithm.
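
    A hedged sketch of the spectral half of such a pipeline, assuming scikit-learn: PCA dimension reduction followed by an SVM pixel classifier on simulated "hyperspectral" data. The marker-based minimum spanning forest (spatial) stage is omitted, and the data is synthetic rather than the Pavia scene.

```python
# Hedged sketch: PCA + SVM spectral classification on simulated pixels.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_pix, n_bands, n_classes = 3000, 103, 5      # 103 bands, as in Pavia data
centers = rng.standard_normal((n_classes, n_bands))
labels = rng.integers(0, n_classes, n_pix)
cube = centers[labels] + 0.8 * rng.standard_normal((n_pix, n_bands))

X = PCA(n_components=10).fit_transform(cube)
Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.5, random_state=0)
svm = SVC(kernel="rbf", C=10).fit(Xtr, ytr)
print("spectral-only overall accuracy:", svm.score(Xte, yte))
```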

  11. Landmark-based elastic registration using approximating thin-plate splines.

    PubMed

    Rohr, K; Stiehl, H S; Sprengel, R; Buzug, T M; Weese, J; Kuhn, M H

    2001-06-01

    We consider elastic image registration based on a set of corresponding anatomical point landmarks and approximating thin-plate splines. This approach is an extension of the original interpolating thin-plate spline approach and allows landmark localization errors to be taken into account. The extension is important for clinical applications since landmark extraction is always prone to error. Our approach is based on a minimizing functional and can cope with isotropic as well as anisotropic landmark errors. In particular, in the latter case it is possible to include different types of landmarks, e.g., unique point landmarks as well as arbitrary edge points. Also, the scheme is general with respect to the image dimension and the order of smoothness of the underlying functional. Optimal affine transformations as well as interpolating thin-plate splines are special cases of this scheme. To localize landmarks we use a semi-automatic approach based on three-dimensional (3-D) differential operators. Experimental results are presented for two-dimensional as well as 3-D tomographic images of the human brain.
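
    A minimal sketch of interpolating vs approximating thin-plate splines via SciPy's RBFInterpolator (available in SciPy 1.7+), where the `smoothing` parameter plays the role of the landmark-error tolerance. The landmarks are simulated, and the paper's anisotropic error handling is not shown.

```python
# Hedged sketch: interpolating (smoothing=0) vs approximating TPS.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(9)
src = rng.uniform(0, 100, (20, 2))                      # source landmarks
true_shift = np.array([3.0, -2.0])
dst = src + true_shift + rng.normal(0, 1.5, (20, 2))    # noisy targets

interp = RBFInterpolator(src, dst, kernel="thin_plate_spline", smoothing=0.0)
approx = RBFInterpolator(src, dst, kernel="thin_plate_spline", smoothing=10.0)

probe = np.array([[50.0, 50.0]])
print("interpolating TPS:", interp(probe))   # bends to hit every landmark
print("approximating TPS:", approx(probe))   # closer to the smooth shift
```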

  12. Image edge detection based tool condition monitoring with morphological component analysis.

    PubMed

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in monitored images. Image edge detection is a fundamental tool for obtaining image features. The approach extracts the tool edge with morphological component analysis. Through the decomposition of the original tool wear image, the approach reduces the influence of texture and noise on edge measurement. Based on sparse representation of the target image and edge detection, the approach can accurately extract a continuous and complete tool wear edge contour, and is convenient for characterizing tool conditions. Compared to established algorithms in the literature, this approach improves the integrity and connectivity of edges, and the results show that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
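
    A hedged toy of edge extraction on a synthetic "tool wear" image using a morphological gradient (dilation minus erosion) from SciPy. The paper's morphological component analysis separates texture from cartoon first; that step is crudely replaced here by Gaussian smoothing, and the image is invented.

```python
# Hedged sketch: morphological-gradient edge extraction on synthetic data.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(4)
img = np.zeros((64, 64)); img[20:44, 16:50] = 1.0            # worn region
img += 0.15 * rng.standard_normal(img.shape)                 # texture + noise

smooth = ndimage.gaussian_filter(img, sigma=1.5)             # crude "cartoon"
grad = (ndimage.grey_dilation(smooth, size=(3, 3)) -
        ndimage.grey_erosion(smooth, size=(3, 3)))           # morph. gradient
edges = grad > 0.25                                          # contour pixels
print("edge pixels found:", int(edges.sum()))
```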

  13. Emerging Approaches to Counseling Intervention: Dialectical Behavior Therapy

    ERIC Educational Resources Information Center

    Neacsiu, Andrada D.; Ward-Ciesielski, Erin F.; Linehan, Marsha M.

    2012-01-01

    Dialectical Behavior Therapy (DBT) is a comprehensive, multimodal cognitive behavioral treatment originally developed for individuals who met criteria for borderline personality disorder (BPD) who displayed suicidal tendencies. DBT is based on behavioral theory but also includes principles of acceptance, mindfulness, and validation. Since its…

  14. Determination of fermentable sugars in beer wort by gold nanoparticles@polydopamine: A layer-by-layer approach for Localized Surface Plasmon Resonance measurements at fixed wavelength.

    PubMed

    Scarano, S; Pascale, E; Palladino, P; Fratini, E; Minunni, M

    2018-06-01

    Polydopamine decorated in-situ with Localized Surface Plasmon Resonance (LSPR)-active gold nanoparticles (AuNPs) may extend the applicability of nanoplasmonic materials to original and innovative applications in several fields. Here we report the modification of disposable UV-Vis polystyrene cuvettes with AuNPs@PDA for refractive-index LSPR-based measurements. An original layer-by-layer deposition method of PDA followed by AuNPs growth is developed, showing a linear correlation between PDA thickness and optical properties. In particular, a modulation from wavelength sensitivity toward absorbance sensitivity is obtained, allowing measurements at a fixed wavelength (578 nm). As an applicative example of the photonic cuvettes, the measurement of fermentable sugars in beer wort is reported. The analytical performance of our approach was directly compared to a reference portable refractometer, displaying excellent results in terms of precise estimation of sugars in beer wort (expressed in degrees Brix), reproducibility, and sensitivity. The approach may be extended to other materials of interest in LSPR-based optical sensors, e.g., optical fibers. Copyright © 2018 Elsevier B.V. All rights reserved.
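
    A tiny sketch of what a fixed-wavelength readout reduces to analytically: a linear calibration of absorbance at 578 nm against degrees Brix, then inversion for an unknown. All numbers below are invented, not the paper's calibration data.

```python
# Hedged sketch: linear fixed-wavelength calibration (invented values).
import numpy as np

brix = np.array([0.0, 5.0, 10.0, 15.0, 20.0])         # calibration standards
a578 = np.array([0.412, 0.438, 0.465, 0.489, 0.517])  # absorbance at 578 nm

slope, intercept = np.polyfit(brix, a578, 1)          # least-squares line
unknown_abs = 0.472                                   # hypothetical sample
print("estimated Brix:", (unknown_abs - intercept) / slope)
```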

  15. Effective progression of nuclear magnetic resonance-detected fragment hits.

    PubMed

    Eaton, Hugh L; Wyss, Daniel F

    2011-01-01

    Fragment-based drug discovery (FBDD) has become increasingly popular over the last decade as an alternate lead generation tool to HTS approaches. Several compounds have now progressed into the clinic which originated from a fragment-based approach, demonstrating the utility of this emerging field. While fragment hit identification has become much more routine and may involve different screening approaches, the efficient progression of fragment hits into quality lead series may still present a major bottleneck for the broadly successful application of FBDD. In our laboratory, we have extensive experience in fragment-based NMR screening (SbN) and the subsequent iterative progression of fragment hits using structure-assisted chemistry. To maximize impact, we have applied this approach strategically to early- and high-priority targets, and those struggling for leads. Its application has yielded a clinical candidate for BACE1 and lead series in about one third of the SbN/FBDD projects. In this chapter, we will give an overview of our strategy and focus our discussion on NMR-based FBDD approaches. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Laparoscopic management of gastric gastrointestinal stromal tumors

    PubMed Central

    Correa-Cote, Juan; Morales-Uribe, Carlos; Sanabria, Alvaro

    2014-01-01

    Gastrointestinal stromal tumors (GISTs) are the most frequent gastrointestinal tumors of mesodermal origin. Gastric GISTs represent approximately 70% of all gastrointestinal GISTs. The only curative option is surgical resection. Many surgical groups have shown good results with the laparoscopic approach. There have not been any randomized controlled trials comparing the open vs laparoscopic approach, and all recommendations have been based on observational studies. The experience obtained from gastric laparoscopic surgery during recent decades and the development of specific devices have allowed the treatment of most gastric GISTs through the laparoscopic approach. PMID:25031788

  17. Laparoscopic management of gastric gastrointestinal stromal tumors.

    PubMed

    Correa-Cote, Juan; Morales-Uribe, Carlos; Sanabria, Alvaro

    2014-07-16

    Gastrointestinal stromal tumors (GISTs) are the most frequent gastrointestinal tumors of mesodermal origin. Gastric GISTs represent approximately 70% of all gastrointestinal GISTs. The only curative option is surgical resection. Many surgical groups have shown good results with the laparoscopic approach. There have not been any randomized controlled trials comparing the open vs laparoscopic approach, and all recommendations have been based on observational studies. The experience obtained from gastric laparoscopic surgery during recent decades and the development of specific devices have allowed the treatment of most gastric GISTs through the laparoscopic approach.

  18. Escaping the healthcare leadership cul-de-sac.

    PubMed

    Edmonstone, John Duncan

    2017-02-06

    Purpose This paper aims to propose that healthcare is dominated by a managerialist ideology, powerfully shaped by business schools and embodied in the Masters in Business Administration. It suggests that there may be unconscious collusion between universities, healthcare employers and student leaders and managers. Design/methodology/approach Based on a review of relevant literature, the paper examines critiques of managerialism generally and explores the assumptions behind leadership development. It draws upon work which suggests that leading in healthcare organisations is fundamentally different and proposes that leadership development should be more practice-based. Findings The way forward for higher education institutions is to include work- or practice-based approaches alongside academic approaches. Practical implications The paper suggests that there is a challenge for higher education institutions to adopt and integrate practice-based development methods into their programme designs. Originality/value The paper provides a challenge to the future role of higher education institutions in developing leadership in healthcare.

  19. From Signature-Based Towards Behaviour-Based Anomaly Detection (Extended Abstract)

    DTIC Science & Technology

    2010-11-01

    data acquisition can serve as sensors. The de-facto standard for IP flow monitoring is the NetFlow format. Although NetFlow was originally developed by Cisco...packets with some common properties that pass through a network device. These collected flows are exported to an external device, the NetFlow ...Thanks to the network-based approach using NetFlow data, the detection algorithm is host independent and highly scalable. Deep Packet Inspection
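
    A hedged toy of flow-based (behaviour-based) anomaly detection using only the Python standard library: aggregate NetFlow-like records per source host and flag hosts whose flow counts deviate strongly from the population. The records are invented and real detectors are far richer.

```python
# Hedged sketch: z-score flagging of hosts by flow count (invented records).
from collections import Counter
import statistics

# Invented NetFlow-like records: (src, dst, dst_port, bytes).
flows = [(f"10.0.0.{h}", "10.0.0.100", 80, 1000)
         for h in range(1, 21) for _ in range(3)]             # normal traffic
flows += [("10.0.0.66", f"10.0.1.{i}", 22, 60) for i in range(200)]  # scanner

per_host = Counter(src for src, *_ in flows)
mean = statistics.mean(per_host.values())
sd = statistics.pstdev(per_host.values())
for host, count in per_host.items():
    if sd and (count - mean) / sd > 2:                        # crude rule
        print("anomalous host:", host, "flows:", count)
```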

  20. Signal enhancement based on complex curvelet transform and complementary ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Dong, Lieqian; Wang, Deying; Zhang, Yimeng; Zhou, Datong

    2017-09-01

    Signal enhancement is a necessary step in seismic data processing. In this paper we utilize the complementary ensemble empirical mode decomposition (CEEMD) and complex curvelet transform (CCT) methods to separate signal from random noise and thereby improve the signal-to-noise (S/N) ratio. Firstly, the original noisy data is decomposed into a series of intrinsic mode function (IMF) profiles with the aid of CEEMD. Then the noisy IMFs are transformed into the CCT domain. By choosing different thresholds, based on the noise level of each IMF profile, the noise in the original data can be suppressed. Finally, we illustrate the effectiveness of the approach on simulated and field datasets.
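
    A hedged sketch of the decomposition stage only, assuming the third-party PyEMD package (pip name "EMD-signal") is installed: CEEMDAN, a close cousin of the paper's CEEMD, decomposes a toy signal into IMFs, and the noisiest high-frequency IMFs are simply discarded. The complex curvelet thresholding stage has no standard Python package and is simplified away here.

```python
# Hedged sketch: CEEMDAN decomposition + crude IMF selection (toy data).
import numpy as np
from PyEMD import CEEMDAN   # third-party package, pip name "EMD-signal"

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 12 * t) * np.exp(-2 * t)    # toy seismic wavelet
noisy = clean + 0.4 * rng.standard_normal(t.size)

imfs = CEEMDAN()(noisy)           # rows = IMFs, fastest oscillations first
denoised = imfs[2:].sum(axis=0)   # drop the two noisiest (highest-freq) IMFs

def snr(s):
    return 10 * np.log10(np.sum(clean ** 2) / np.sum((s - clean) ** 2))

print("S/N before: %.1f dB, after: %.1f dB" % (snr(noisy), snr(denoised)))
```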

  1. Novel approach identifies SNPs in SLC2A10 and KCNK9 with evidence for parent-of-origin effect on body mass index.

    PubMed

    Hoggart, Clive J; Venturini, Giulia; Mangino, Massimo; Gomez, Felicia; Ascari, Giulia; Zhao, Jing Hua; Teumer, Alexander; Winkler, Thomas W; Tšernikova, Natalia; Luan, Jian'an; Mihailov, Evelin; Ehret, Georg B; Zhang, Weihua; Lamparter, David; Esko, Tõnu; Macé, Aurelien; Rüeger, Sina; Bochud, Pierre-Yves; Barcella, Matteo; Dauvilliers, Yves; Benyamin, Beben; Evans, David M; Hayward, Caroline; Lopez, Mary F; Franke, Lude; Russo, Alessia; Heid, Iris M; Salvi, Erika; Vendantam, Sailaja; Arking, Dan E; Boerwinkle, Eric; Chambers, John C; Fiorito, Giovanni; Grallert, Harald; Guarrera, Simonetta; Homuth, Georg; Huffman, Jennifer E; Porteous, David; Moradpour, Darius; Iranzo, Alex; Hebebrand, Johannes; Kemp, John P; Lammers, Gert J; Aubert, Vincent; Heim, Markus H; Martin, Nicholas G; Montgomery, Grant W; Peraita-Adrados, Rosa; Santamaria, Joan; Negro, Francesco; Schmidt, Carsten O; Scott, Robert A; Spector, Tim D; Strauch, Konstantin; Völzke, Henry; Wareham, Nicholas J; Yuan, Wei; Bell, Jordana T; Chakravarti, Aravinda; Kooner, Jaspal S; Peters, Annette; Matullo, Giuseppe; Wallaschofski, Henri; Whitfield, John B; Paccaud, Fred; Vollenweider, Peter; Bergmann, Sven; Beckmann, Jacques S; Tafti, Mehdi; Hastie, Nicholas D; Cusi, Daniele; Bochud, Murielle; Frayling, Timothy M; Metspalu, Andres; Jarvelin, Marjo-Riitta; Scherag, André; Smith, George Davey; Borecki, Ingrid B; Rousson, Valentin; Hirschhorn, Joel N; Rivolta, Carlo; Loos, Ruth J F; Kutalik, Zoltán

    2014-07-01

    The phenotypic effect of some single nucleotide polymorphisms (SNPs) depends on their parental origin. We present a novel approach to detect parent-of-origin effects (POEs) in genome-wide genotype data of unrelated individuals. The method exploits increased phenotypic variance in the heterozygous genotype group relative to the homozygous groups. We applied the method to >56,000 unrelated individuals to search for POEs influencing body mass index (BMI). Six lead SNPs were carried forward for replication in five family-based studies (of ∼4,000 trios). Two SNPs replicated: the paternal rs2471083-C allele (located near the imprinted KCNK9 gene) and the paternal rs3091869-T allele (located near the SLC2A10 gene) increased BMI equally (beta = 0.11 (SD), P<0.0027) compared to the respective maternal alleles. Real-time PCR experiments of lymphoblastoid cell lines from the CEPH families showed that expression of both genes was dependent on parental origin of the SNPs alleles (P<0.01). Our scheme opens new opportunities to exploit GWAS data of unrelated individuals to identify POEs and demonstrates that they play an important role in adult obesity.

  2. Novel Approach Identifies SNPs in SLC2A10 and KCNK9 with Evidence for Parent-of-Origin Effect on Body Mass Index

    PubMed Central

    Hoggart, Clive J.; Venturini, Giulia; Mangino, Massimo; Gomez, Felicia; Ascari, Giulia; Zhao, Jing Hua; Teumer, Alexander; Winkler, Thomas W.; Tšernikova, Natalia; Luan, Jian'an; Mihailov, Evelin; Ehret, Georg B.; Zhang, Weihua; Lamparter, David; Esko, Tõnu; Macé, Aurelien; Rüeger, Sina; Bochud, Pierre-Yves; Barcella, Matteo; Dauvilliers, Yves; Benyamin, Beben; Evans, David M.; Hayward, Caroline; Lopez, Mary F.; Franke, Lude; Russo, Alessia; Heid, Iris M.; Salvi, Erika; Vendantam, Sailaja; Arking, Dan E.; Boerwinkle, Eric; Chambers, John C.; Fiorito, Giovanni; Grallert, Harald; Guarrera, Simonetta; Homuth, Georg; Huffman, Jennifer E.; Porteous, David; Moradpour, Darius; Iranzo, Alex; Hebebrand, Johannes; Kemp, John P.; Lammers, Gert J.; Aubert, Vincent; Heim, Markus H.; Martin, Nicholas G.; Montgomery, Grant W.; Peraita-Adrados, Rosa; Santamaria, Joan; Negro, Francesco; Schmidt, Carsten O.; Scott, Robert A.; Spector, Tim D.; Strauch, Konstantin; Völzke, Henry; Wareham, Nicholas J.; Yuan, Wei; Bell, Jordana T.; Chakravarti, Aravinda; Kooner, Jaspal S.; Peters, Annette; Matullo, Giuseppe; Wallaschofski, Henri; Whitfield, John B.; Paccaud, Fred; Vollenweider, Peter; Bergmann, Sven; Beckmann, Jacques S.; Tafti, Mehdi; Hastie, Nicholas D.; Cusi, Daniele; Bochud, Murielle; Frayling, Timothy M.; Metspalu, Andres; Jarvelin, Marjo-Riitta; Scherag, André; Smith, George Davey; Borecki, Ingrid B.; Rousson, Valentin; Hirschhorn, Joel N.; Rivolta, Carlo; Loos, Ruth J. F.; Kutalik, Zoltán

    2014-01-01

    The phenotypic effect of some single nucleotide polymorphisms (SNPs) depends on their parental origin. We present a novel approach to detect parent-of-origin effects (POEs) in genome-wide genotype data of unrelated individuals. The method exploits increased phenotypic variance in the heterozygous genotype group relative to the homozygous groups. We applied the method to >56,000 unrelated individuals to search for POEs influencing body mass index (BMI). Six lead SNPs were carried forward for replication in five family-based studies (of ∼4,000 trios). Two SNPs replicated: the paternal rs2471083-C allele (located near the imprinted KCNK9 gene) and the paternal rs3091869-T allele (located near the SLC2A10 gene) increased BMI equally (beta = 0.11 (SD), P<0.0027) compared to the respective maternal alleles. Real-time PCR experiments of lymphoblastoid cell lines from the CEPH families showed that expression of both genes was dependent on parental origin of the SNPs alleles (P<0.01). Our scheme opens new opportunities to exploit GWAS data of unrelated individuals to identify POEs and demonstrates that they play an important role in adult obesity. PMID:25078964
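
    A hedged sketch of the detection principle from the two records above: a parent-of-origin effect makes the phenotype bimodal in heterozygotes (the effect depends on which parent transmitted the allele), inflating their variance relative to homozygotes, which a Levene-type test can pick up. The simulation uses a deliberately exaggerated effect and is not the authors' pipeline.

```python
# Hedged sketch: variance-heterogeneity signal of a POE (simulated data).
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(11)
n = 20_000
genotype = rng.binomial(2, 0.4, n)       # 0, 1 or 2 copies of the allele
paternal = rng.binomial(1, 0.5, n)       # was the allele father-transmitted?
effect = 1.0                             # exaggerated POE for illustration

# Heterozygotes get the effect only when their allele is paternal;
# homozygotes always carry one paternal copy, so their effect is fixed.
bmi = rng.standard_normal(n) + np.where(genotype == 1,
                                        effect * paternal,
                                        effect * (genotype == 2))

groups = [bmi[genotype == g] for g in (0, 1, 2)]
print("variances by genotype:", [round(float(np.var(g)), 3) for g in groups])
stat, p = levene(groups[1], np.concatenate([groups[0], groups[2]]))
print("Levene test, het vs hom: stat=%.1f, p=%.2e" % (stat, p))
```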

  3. Multidisciplinary approach for the study of an Egyptian coffin (late 22nd/early 25th dynasty): combining imaging and spectroscopic techniques.

    PubMed

    Bracci, S; Caruso, O; Galeotti, M; Iannaccone, R; Magrini, D; Picchi, D; Pinna, D; Porcinai, S

    2015-06-15

    This paper demonstrates that a well-designed methodology based on both non-invasive and micro-invasive techniques in a two-step approach is a powerful tool to characterize the materials and stratigraphies of an Egyptian coffin which was restored several times. This coffin, belonging to a certain Mesiset, is now located at the Museo Civico Archeologico of Bologna (inventory number MCABo EG 1963). Scholars attributed it to the late 22nd/early 25th dynasty by stylistic comparison. The first step of the diagnostic approach applied imaging techniques to the whole surface in order to select measurement spots and to unveil both original and restored areas. Images and close microscopic examination of the polychrome surface allowed representative areas to be selected for in situ investigation by portable spectroscopic techniques: X-ray Fluorescence (XRF), Fiber Optic Reflectance Spectroscopy (FORS), and Fourier Transform Infrared spectroscopy (FTIR). After analysis of the results from the first step, a very few selected samples were taken to clarify the stratigraphy of the polychrome layers. The first step, based on the combination of imaging and spectroscopic techniques in a totally non-invasive modality, is quite unique in the literature on Egyptian coffins and enabled us to reveal many differences in the ground layer's composition and to identify a remarkable number of pigments in the original and restored areas. This work also offered a chance to check the limitations of the non-invasive approach applied to a complex case, namely the correct localization of different materials in the stratigraphy and the identification of binding media. Indeed, to dissolve any remaining doubts on superimposed layers belonging to different interventions, it was necessary to sample a few micro-fragments in selected areas and analyze them prepared as cross-sections. The original ground layer is made of calcite, while the restored areas show the presence of either a mixture of calcite and silicates or a gypsum ground, overlapped by lead white. The original pigments were identified as orpiment, cinnabar and red clay, Egyptian blue, and green copper-based pigments. Some other pigments, such as lead white, Naples yellow, cerulean blue and azurite, were only found in the restored areas. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Where Is Health Care Headed?

    PubMed Central

    Bland, Jeffrey

    2016-01-01

    Looking at the trends, developments, and discoveries points us toward the future, but it is only when we consider these in the context of our understanding about the origins of disease that we can truly gain a clearer view of where health care is headed. This is the view that moves us from a focus on the diagnosis and treatment of a disease to an understanding of the origin of the alteration in function in the individual. This change in both perspective and understanding of the origin of disease is what will lead us to a systems approach to health care that delivers personalized and precision care that is based on the inherent rehabilitative power that resides within the genome. PMID:27547161

  5. Synthesis of recurrent neural networks for dynamical system simulation.

    PubMed

    Trischler, Adam P; D'Eleuterio, Gabriele M T

    2016-08-01

    We review several of the most widely used techniques for training recurrent neural networks to approximate dynamical systems, then describe a novel algorithm for this task. The algorithm is based on an earlier theoretical result that guarantees the quality of the network approximation. We show that a feedforward neural network can be trained on the vector-field representation of a given dynamical system using backpropagation, then recast it as a recurrent network that replicates the original system's dynamics. After detailing this algorithm and its relation to earlier approaches, we present numerical examples that demonstrate its capabilities. One of the distinguishing features of our approach is that both the original dynamical systems and the recurrent networks that simulate them operate in continuous time. Copyright © 2016 Elsevier Ltd. All rights reserved.
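
    A hedged sketch of the two phases described above, using scikit-learn rather than the paper's construction: fit a network to a system's vector field (here a damped pendulum, chosen for illustration), then run it in closed loop as a continuous-time simulator via Euler stepping. The paper builds a true recurrent network with approximation guarantees; an MLPRegressor plus Euler rollout merely illustrates the idea.

```python
# Hedged sketch: learn a vector field, then simulate by closed-loop rollout.
import numpy as np
from sklearn.neural_network import MLPRegressor

def f(x):                                   # damped pendulum vector field
    return np.stack([x[..., 1],
                     -np.sin(x[..., 0]) - 0.2 * x[..., 1]], axis=-1)

rng = np.random.default_rng(8)
X = rng.uniform(-3, 3, (5000, 2))           # sampled states
net = MLPRegressor((64, 64), max_iter=2000, tol=1e-6,
                   random_state=0).fit(X, f(X))

x_true = x_net = np.array([2.0, 0.0]); dt = 0.01
for _ in range(500):                        # Euler integration, both systems
    x_true = x_true + dt * f(x_true)
    x_net = x_net + dt * net.predict(x_net[None])[0]
print("true state:", x_true, " network state:", x_net)
```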

  6. The Digital Reconstruction of Petrarch's "Fragmenta"

    ERIC Educational Resources Information Center

    Magni, Isabella

    2017-01-01

    The working principles of my dissertation originate with my collaboration with Storey and Walsh's Petrarchive, an innovative digital edition based on a new "rich-text" approach to Petrarch's "Rerum vulgarium fragmenta," maintaining both the textual-material and the digital aspects of my experience. The "Fragmenta" is…

  7. Male Midlife Depression: Multidimensional Contributing Factors and Renewed Practice Approaches

    ERIC Educational Resources Information Center

    Grove, Debbie L.

    2012-01-01

    Based on original doctoral research conducted with midlife women and men who completed counselling for depression, this article presents research findings of male participant perspectives and experiences in managing midlife depression and the role of counselling. Hermeneutic inquiry using conversational semistructured interviews generated multiple…

  8. New approaches in agent-based modeling of complex financial systems

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2017-12-01

    Agent-based modeling is a powerful simulation technique to understand the collective behavior and microscopic interaction in complex financial systems. Recently, the concept for determining the key parameters of agent-based models from empirical data instead of setting them artificially was suggested. We first review several agent-based models and the new approaches to determine the key model parameters from historical market data. Based on the agents' behaviors with heterogeneous personal preferences and interactions, these models are successful in explaining the microscopic origination of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.
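
    A hedged, minimal agent-based market sketch in NumPy: agents either imitate the majority (herding) or choose randomly, and the return is the change in aggregate order imbalance. All parameters are invented; the reviewed models calibrate such parameters from historical and internet-query data instead of setting them by hand.

```python
# Hedged sketch: toy herding market model with invented parameters.
import numpy as np

rng = np.random.default_rng(13)
n_agents, steps, herd_p = 500, 2000, 0.7
state = rng.choice([-1, 1], n_agents)          # -1 = sell, +1 = buy
returns = np.empty(steps)

for t in range(steps):
    prev = state.sum()
    majority = 1 if prev >= 0 else -1
    herd = rng.random(n_agents) < herd_p       # herding agents copy majority
    state = np.where(herd, majority, rng.choice([-1, 1], n_agents))
    returns[t] = (state.sum() - prev) / n_agents

mean = returns.mean()
kurt = np.mean((returns - mean) ** 4) / returns.var() ** 2
print("return std: %.4f, sample kurtosis: %.2f" % (returns.std(), kurt))
```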

  9. E-Learning Personalization Using Triple-Factor Approach in Standard-Based Education

    NASA Astrophysics Data System (ADS)

    Laksitowening, K. A.; Santoso, H. B.; Hasibuan, Z. A.

    2017-01-01

    E-Learning can be a tool for monitoring the learning process and progress towards a targeted competency. Process and progress can differ from one learner to another, since every learner may have a different learning type. The learning type itself can be identified by taking into account learning style, motivation, and knowledge ability. This study explores personalization of the learning type based on the Triple-Factor Approach. Considering that the factors in the Triple-Factor Approach are dynamic, the personalization system needs to accommodate the changes that may occur. Motivated by this issue, this study proposes personalization that dynamically guides learner progression through the stages of their learning process. The personalization is implemented in the form of interventions that trigger learners to access learning contents and discussion forums more often, as well as improve their level of knowledge ability based on their state of learning type.

  10. A BPF-FBP tandem algorithm for image reconstruction in reverse helical cone-beam CT

    PubMed Central

    Cho, Seungryong; Xia, Dan; Pellizzari, Charles A.; Pan, Xiaochuan

    2010-01-01

    Purpose: Reverse helical cone-beam computed tomography (CBCT) is a scanning configuration for potential applications in image-guided radiation therapy in which an accurate anatomic image of the patient is needed for image-guidance procedures. The authors previously developed an algorithm for image reconstruction from nontruncated data of an object that is completely within the reverse helix. The purpose of this work is to develop an image reconstruction approach for reverse helical CBCT of a long object that extends out of the reverse helix and therefore constitutes data truncation. Methods: The proposed approach comprises two reconstruction steps. In the first step, a chord-based backprojection-filtration (BPF) algorithm reconstructs a volumetric image of an object from the original cone-beam data. Because there exists a chordless region in the middle of the reverse helix, the image obtained in the first step contains an unreconstructed central-gap region. In the second step, the gap region is reconstructed by use of a Pack–Noo-formula-based filtered-backprojection (FBP) algorithm from the modified cone-beam data obtained by subtracting from the original cone-beam data the reprojection of the image reconstructed in the first step. Results: The authors have performed numerical studies to validate the proposed approach in image reconstruction from reverse helical cone-beam data. The results confirm that the proposed approach can reconstruct accurate images of a long object without suffering from data-truncation artifacts or cone-angle artifacts. Conclusions: The authors developed and validated a BPF-FBP tandem algorithm to reconstruct images of a long object from reverse helical cone-beam data. The chord-based BPF algorithm was utilized for converting the long-object problem into a short-object problem. The proposed approach is applicable to other scanning configurations such as reduced circular sinusoidal trajectories. PMID:20175463

  11. A BPF-FBP tandem algorithm for image reconstruction in reverse helical cone-beam CT.

    PubMed

    Cho, Seungryong; Xia, Dan; Pellizzari, Charles A; Pan, Xiaochuan

    2010-01-01

    Reverse helical cone-beam computed tomography (CBCT) is a scanning configuration for potential applications in image-guided radiation therapy in which an accurate anatomic image of the patient is needed for image-guidance procedures. The authors previously developed an algorithm for image reconstruction from nontruncated data of an object that is completely within the reverse helix. The purpose of this work is to develop an image reconstruction approach for reverse helical CBCT of a long object that extends out of the reverse helix and therefore constitutes data truncation. The proposed approach comprises two reconstruction steps. In the first step, a chord-based backprojection-filtration (BPF) algorithm reconstructs a volumetric image of an object from the original cone-beam data. Because there exists a chordless region in the middle of the reverse helix, the image obtained in the first step contains an unreconstructed central-gap region. In the second step, the gap region is reconstructed by use of a Pack-Noo-formula-based filtered-backprojection (FBP) algorithm from the modified cone-beam data obtained by subtracting from the original cone-beam data the reprojection of the image reconstructed in the first step. The authors have performed numerical studies to validate the proposed approach in image reconstruction from reverse helical cone-beam data. The results confirm that the proposed approach can reconstruct accurate images of a long object without suffering from data-truncation artifacts or cone-angle artifacts. The authors developed and validated a BPF-FBP tandem algorithm to reconstruct images of a long object from reverse helical cone-beam data. The chord-based BPF algorithm was utilized for converting the long-object problem into a short-object problem. The proposed approach is applicable to other scanning configurations such as reduced circular sinusoidal trajectories.
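
    The two-step data flow shared by this record and the preceding one can be summarized in a structural sketch; bpf_reconstruct, reproject and fbp_reconstruct_gap are placeholders for the chord-based BPF step, the cone-beam forward projector and the Pack-Noo FBP step, whose internals the abstracts do not spell out.

      import numpy as np

      def bpf_fbp_tandem(cone_beam_data, geometry,
                         bpf_reconstruct, reproject, fbp_reconstruct_gap):
          """Two-step reverse-helix reconstruction (data flow only)."""
          # Step 1: chord-based BPF on the original data; the chordless
          # central region of the reverse helix remains unreconstructed.
          image_chords, gap_mask = bpf_reconstruct(cone_beam_data, geometry)

          # Step 2: subtract the reprojection of the step-1 image from the
          # data, then reconstruct the central gap with an FBP step.
          residual = cone_beam_data - reproject(image_chords, geometry)
          image_gap = fbp_reconstruct_gap(residual, geometry, gap_mask)

          # Merge the two partial volumes into the final image.
          return np.where(gap_mask, image_gap, image_chords)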

  12. Comment on 'Parametrization of Stillinger-Weber potential based on a valence force field model: application to single-layer MoS2 and black phosphorus'.

    PubMed

    Midtvedt, Daniel; Croy, Alexander

    2016-06-10

    We compare the simplified valence-force model for single-layer black phosphorus with the original model and recent ab initio results. Using an analytic approach and numerical calculations, we find that the simplified model yields Young's moduli that are smaller than those of the original model and almost a factor of two smaller than the ab initio results. Moreover, the Poisson ratios are an order of magnitude smaller than values found in the literature.

  13. Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider

    NASA Astrophysics Data System (ADS)

    Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.

    2010-03-01

    In this paper, a critical modification to an affordable polymer-based split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and the microinjection molding process. Specifically, the original and the modified design were numerically simulated using the commercial finite element method software ANSYS CFX to assist the redesign of the micromixers. The simulation results showed that both designs are capable of performing mixing, while the modified design has much improved performance. Mixing experiments with two different fluids, carried out using both the original and the modified mixers, again showed significantly improved mixing uniformity for the latter. The measured mixing coefficient was 0.11 for the original design and 0.065 for the improved design. The developed manufacturing process, based on ultraprecision machining and microinjection molding for device fabrication, has the advantages of high dimensional precision, low cost and manufacturing flexibility.

  14. Investigating human geographic origins using dual-isotope (87Sr/86Sr, δ18O) assignment approaches.

    PubMed

    Laffoon, Jason E; Sonnemann, Till F; Shafie, Termeh; Hofman, Corinne L; Brandes, Ulrik; Davies, Gareth R

    2017-01-01

    Substantial progress in the application of multiple isotope analyses has greatly improved the ability to identify nonlocal individuals amongst archaeological populations over the past decades. More recently the development of large scale models of spatial isotopic variation (isoscapes) has contributed to improved geographic assignments of human and animal origins. Persistent challenges remain, however, in the accurate identification of individual geographic origins from skeletal isotope data in studies of human (and animal) migration and provenance. In an attempt to develop and test more standardized and quantitative approaches to geographic assignment of individual origins using isotopic data two methods, combining 87Sr/86Sr and δ18O isoscapes, are examined for the Circum-Caribbean region: 1) an Interval approach using a defined range of fixed isotopic variation per location; and 2) a Likelihood assignment approach using univariate and bivariate probability density functions. These two methods are tested with enamel isotope data from a modern sample of known origin from Caracas, Venezuela and further explored with two archaeological samples of unknown origin recovered from Cuba and Trinidad. The results emphasize both the potential and limitation of the different approaches. Validation tests on the known origin sample exclude most areas of the Circum-Caribbean region and correctly highlight Caracas as a possible place of origin with both approaches. The positive validation results clearly demonstrate the overall efficacy of a dual-isotope approach to geoprovenance. The accuracy and precision of geographic assignments may be further improved by better understanding of the relationships between environmental and biological isotope variation; continued development and refinement of relevant isoscapes; and the eventual incorporation of a broader array of isotope proxy data.
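
    The likelihood assignment step can be sketched as follows, assuming gridded isoscapes that provide a mean and a standard deviation per cell and treating the two isotope systems as independent (a simplification of the bivariate densities the paper also uses); all grid values below are synthetic.

      import numpy as np
      from scipy.stats import norm

      def assignment_surface(sr_meas, o18_meas, sr_mean, sr_sd, o18_mean, o18_sd):
          """Per-cell likelihood of origin from two isoscapes (2D arrays)."""
          like_sr = norm.pdf(sr_meas, loc=sr_mean, scale=sr_sd)
          like_o18 = norm.pdf(o18_meas, loc=o18_mean, scale=o18_sd)
          joint = like_sr * like_o18          # independence assumption
          return joint / joint.sum()          # normalized assignment surface

      # Hypothetical grid standing in for Circum-Caribbean isoscapes.
      rng = np.random.default_rng(2)
      ny, nx = 20, 30
      sr_mean = rng.uniform(0.704, 0.712, (ny, nx))
      sr_sd = np.full((ny, nx), 0.0005)
      o18_mean = rng.uniform(-6.0, -2.0, (ny, nx))
      o18_sd = np.full((ny, nx), 0.4)

      surface = assignment_surface(0.7090, -4.1, sr_mean, sr_sd, o18_mean, o18_sd)
      print("most likely cell:", np.unravel_index(surface.argmax(), surface.shape))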

  15. Deterministic representation of chaos with application to turbulence

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1987-01-01

    Chaotic motions of nonlinear dynamical systems are decomposed into mean components and fluctuations. The approach is based upon the concept that the fluctuations driven by the instability of the original (unperturbed) motion grow until a new stable state is approached. The Reynolds-type equations written for continuous as well as for finite-degrees-of-freedom dynamical systems are closed by using this stabilization principle. The theory is applied to conservative systems, to strange attractors and to turbulent motions.

  16. Real-Time Wing-Vortex and Pressure Distribution Estimation on Wings Via Displacements and Strains in Unsteady and Transitional Flight Conditions

    DTIC Science & Technology

    2016-09-07

    approach in co-simulation with fluid-dynamics solvers is used. An original variational formulation is developed for the inverse problem of...by the inverse solution meshing. The same approach is used to map the structural and fluid interface kinematics and loads during the fluid-structure...co-simulation. The inverse analysis is verified by reconstructing the deformed solution obtained with a corresponding direct formulation, based on

  17. Factor selection for service quality evaluation: a hospital case study.

    PubMed

    Ameryoun, Ahmad; Najafi, Seyedvahid; Nejati-Zarnaqi, Bayram; Khalilifar, Seyed Omid; Ajam, Mahdi; Ansarimoghadam, Ahmad

    2017-02-13

    Purpose The purpose of this paper is to develop a systematic approach to predicting the influence of the service quality dimensions on overall service quality, using a novel analysis based on data envelopment and SERVQUAL. Design/methodology/approach To assess hospital service quality in Tehran, the expectations and perceptions of those who received the services were evaluated using SERVQUAL. The hospital service quality dimensions were found by exploratory factor analysis (EFA). To compare customer expectation and perception, a perceived service quality index (PSQI) was measured using a new method based on common weights. A novel sensitivity approach was used to test each service quality factor's impact on the PSQI. Findings A new service quality dimension named "trust in services" was found using EFA, which is not an original SERVQUAL factor. The approach was applied to assess the hospital's service quality. Since the PSQI value was 0.76, improvements are needed to meet customer expectations. The results showed the order in which the factors affect the PSQI: "trust in services" has the strongest influence, followed by "tangibles," "assurance," "empathy," and "responsiveness," respectively. Practical implications This work gives managers insight into service quality by following a systematic method, i.e., measuring perceived service quality from the customer viewpoint and the service factors' impact on customer perception. Originality/value The procedure helps managers to select the service quality dimensions that need improvement and to predict their effects on customer perception.

  18. Rekindling the Flame: Principals Combating Teacher Burnout.

    ERIC Educational Resources Information Center

    Brock, Barbara L.; Grady, Marilyn L.

    This book offers a research-based, practical approach to recognizing, managing, and preventing teacher burnout. It provides a description of the origins and symptoms of burnout and a personality profile of teachers who are most susceptible to burnout. Organizational issues and administrative roles that contribute to burnout are identified, along…

  19. Not beyond Our Reach: Collaboration in Special Collection Libraries

    ERIC Educational Resources Information Center

    Dekydtspotter, Lori; Williams, Cherry

    2014-01-01

    Based on a three-year collaboration with elementary school instructors, this paper discusses a creative approach to introducing younger students to the historical aspects and unique structure of the medieval book as a physical object. Through incremental activities, students learn to contextualize primary sources in both original and digital…

  20. Scientific and Humanistic Evaluations of Follow Through.

    ERIC Educational Resources Information Center

    House, Ernest R.

    The thesis of this paper is that the humanistic mode of inquiry is underemployed in evaluation studies and the future evaluation of Follow Through could profitably use humanistic approaches. The original Follow Through evaluation was based on the assumption that the world consists of a single system explainable by appropriate methods; the…

  1. Kansas Working Papers in Linguistics. Volume 19.

    ERIC Educational Resources Information Center

    Roby, Linda M., Ed.

    1994-01-01

    This collection of papers presents the latest original research by the linguistics departments of the University of Kansas, as well as contributors from other institutions. The papers in Number 1 are: (1) "Xhosa Nominal Tonology: A Domain-Based Approach" (Mbulelo Jokweni); (2) "On the…

  2. An Integrated Approach to ESL Teaching.

    ERIC Educational Resources Information Center

    De Luca, Rosemary J.

    A University of Waikato (New Zealand) course in English for academic purposes is described. The credit course was originally designed for native English-speaking students to address their academic writing needs. However, based on the idea that the writing tasks of native speakers and non-native speakers are similar and that their writing…

  3. A Writing-Intensive, Methods-Based Laboratory Course for Undergraduates

    ERIC Educational Resources Information Center

    Colabroy, Keri L.

    2011-01-01

    Engaging undergraduate students in designing and executing original research should not only be accompanied by technique training but also intentional instruction in the critical analysis and writing of scientific literature. The course described here takes a rigorous approach to scientific reading and writing using primary literature as the model…

  4. The effect of environmental factors on the implementation of the Mechanistic-empirical pavement design guide (MEPDG).

    DOT National Transportation Integrated Search

    2011-07-01

    Current pavement design based on the AASHTO Design Guide uses an empirical approach derived from the results of the AASHO Road Test conducted in 1958. To address some of the limitations of the original design guide, AASHTO developed a new guide: Mechanistic ...

  5. Who Makes the Most? Measuring the "Urban Environmental Virtuosity"

    ERIC Educational Resources Information Center

    Romano, Oriana; Ercolano, Salvatore

    2013-01-01

    This paper advances a composite indicator called the urban environmental virtuosity index (UEVI), in order to measure the efforts made by public local bodies in applying an ecosystem approach to urban management. The UEVI employs the less-exploited process-based selection criteria to represent the original concept of virtuosity, providing a…

  6. A Comparison of Two Approaches to Beta-Flexible Clustering.

    ERIC Educational Resources Information Center

    Belbin, Lee; And Others

    1992-01-01

    A method for hierarchical agglomerative polythetic (multivariate) clustering, based on unweighted pair group using arithmetic averages (UPGMA) is compared with the original beta-flexible technique, a weighted average method. Reasons the flexible UPGMA strategy is recommended are discussed, focusing on the ability to recover cluster structure over…

  7. Administrative Narcissism and the Tyranny of Isolation: Its Decline and Fall, 1954-1984.

    ERIC Educational Resources Information Center

    Walker, W. G.

    1984-01-01

    An invited perspective article reflects on developments related to the professorship of educational administration in the United States. The originally Americocentric approach is now in decline as leaders seek to learn from abroad, widen theory bases, and observe new modes of administrator preparation. (Author/MLF)

  8. Espousing Democratic Leadership Practices: A Study of Values in Action

    ERIC Educational Resources Information Center

    Devereaux, Lorraine

    2003-01-01

    This article examines principals' espoused values and their values in action. It provides a reanalysis of previously collected data through a values lens. The original research study was an international quantitative and qualitative investigation of principals' leadership approaches that was based in 15 schools. This particular excerpt of the…

  9. Database of ground-based anemometer measurements of wake vortices at Kennedy Airport

    DOT National Transportation Integrated Search

    1997-07-01

    A 700-foot array of horizontal and vertical single-axis anemometers was installed at New York's Kennedy Airport on 30-foot poles under the approach to Runway 31R. Although the original purpose for the anemometers was to track the lateral position of ...

  10. Thoughts on Education and Innovation

    ERIC Educational Resources Information Center

    Whitehead, Diane P.

    2008-01-01

    The word "innovate" can be traced back to the 1400s, where it originated from the Middle French word "innovacyon" meaning "renewal" or "new way of doing things." Typically, innovation is considered an activity of technology, engineering, and other specialized, scientifically based fields that employ approaches and strategies to spark connectivity,…

  11. A vigorous approach to customer service.

    PubMed

    Pollock, E K

    1993-01-01

    PPG Industries, Inc. is the world's largest supplier of automotive original coatings. Its business-to-business customers require individualized service based on specific requirements. The company has solidified these relationships by establishing satellite supply facilities, applying the quality process to problem solving, and providing a variety of outlets for customer feedback.

  12. Limiting the Limits on Domains: A Commentary on Fowler and Heteronomy.

    ERIC Educational Resources Information Center

    Turiel, Elliot; Smetana, Judith G.

    1998-01-01

    Defends domain theory approach to children's moral development based on limitations of Piaget's original theory. Argues that Fowler's characterization of domain theory research omits important features and studies. Maintains that distinctions between morality and convention cannot be reduced to differences in perceptible harm and punishment; it is…

  13. Inferring the mode of origin of polyploid species from next-generation sequence data.

    PubMed

    Roux, Camille; Pannell, John R

    2015-03-01

    Many eukaryote organisms are polyploid. However, despite their importance, evolutionary inference of polyploid origins and modes of inheritance has been limited by a need for analyses of allele segregation at multiple loci using crosses. The increasing availability of sequence data for nonmodel species now allows the application of established approaches for the analysis of genomic data in polyploids. Here, we ask whether approximate Bayesian computation (ABC), applied to realistic traditional and next-generation sequence data, allows correct inference of the evolutionary and demographic history of polyploids. Using simulations, we evaluate the robustness of evolutionary inference by ABC for tetraploid species as a function of the number of individuals and loci sampled, and the presence or absence of an outgroup. We find that ABC adequately retrieves the recent evolutionary history of polyploid species on the basis of both old and new sequencing technologies. The application of ABC to sequence data from diploid and polyploid species of the plant genus Capsella confirms its utility. Our analysis strongly supports an allopolyploid origin of C. bursa-pastoris about 80 000 years ago. This conclusion runs contrary to previous findings based on the same data set but using an alternative approach and is in agreement with recent findings based on whole-genome sequencing. Our results indicate that ABC is a promising and powerful method for revealing the evolution of polyploid species, without the need to attribute alleles to a homeologous chromosome pair. The approach can readily be extended to more complex scenarios involving higher ploidy levels. © 2015 John Wiley & Sons Ltd.
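
    The rejection-ABC scheme underlying such inference reduces to a short loop; the toy simulator below stands in for coalescent simulation of polyploid sequence data, and the prior, tolerance and summary statistics are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(3)

      def simulate(theta):
          """Toy stand-in for a coalescent simulator; returns summary stats."""
          data = rng.normal(theta, 1.0, size=100)
          return np.array([data.mean(), data.std()])

      observed = np.array([0.8, 1.05])   # summary statistics of the real data
      tolerance = 0.1
      accepted = []

      for _ in range(100_000):
          theta = rng.uniform(-5, 5)     # draw a candidate from the prior
          if np.linalg.norm(simulate(theta) - observed) < tolerance:
              accepted.append(theta)     # keep parameters that reproduce the data

      post = np.array(accepted)
      print(f"posterior mean {post.mean():.3f} from {post.size} accepted draws")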

  14. SU-F-E-20: A Mathematical Model of Linac Jaw Calibration Integrated with Collimator Walkout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Y; Corns, R; Huang, V

    2016-06-15

    Purpose: Accurate jaw calibration is possible, but it does not necessarily achieve good junctions because of collimator rotation walkout. We developed a mathematical model seeking to pick an origin for calibration that minimizes the collimator walkout effect. Methods: We use radiopaque markers aligned with the crosshair on the EPID to determine the collimator walkout at collimator angles 0°, 90° and 270°. We can accurately calibrate jaws to any arbitrary origin near the radiation field centre. While the absolute position of an origin moves with the collimator walkout, its location relative to the crosshair is invariant. We studied two approaches to selecting an optimal origin. One approach seeks to bring all three origin locations (0°–90°–270°) as close together as possible by minimizing the perimeter of the triangle formed by these points. The other approach focuses on the gap for 0°–90° junctions. Results: Our perimeter cost function has two variables and nonlinear behaviour. Generally, it does not have a zero-perimeter solution, which would correspond to perfect jaw matches. The zero solution can be achieved only if the collimator rotates about a single fixed axis. In the second approach, we can always get perfect 0°–0° and 0°–90° junctions, because we ignore the 0°–270° situation. For our TrueBeams, both techniques for selecting an origin improved junction dose inhomogeneities to less than ±6%. Conclusion: Our model considers general jaw matching with collimator rotations and proposes two potential solutions. One solution optimizes the junction gaps by considering all three collimator angles, while the other considers only 0°–90°. The first solution will not give perfect matching, but can be clinically acceptable with a minimized collimator walkout effect, while the second can achieve perfect junctions at the expense of the 0°–270° junctions. Different clinics might choose between these two methods based on their clinical practices.
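
    The perimeter criterion can be cast as a small numerical optimisation: model the origin's position at each collimator angle as a rotation of the candidate origin plus that angle's measured walkout translation, and minimise the perimeter of the triangle formed by the three positions. The walkout numbers below are invented, and the transformation model is a simplification of the authors' measurement-based one.

      import numpy as np
      from scipy.optimize import minimize

      # Measured walkout translations (mm) at collimator 0, 90 and 270 degrees
      # (illustrative values, not clinical measurements).
      walkout = {0: np.array([0.0, 0.0]),
                 90: np.array([0.4, -0.2]),
                 270: np.array([-0.3, 0.5])}

      def rot(deg):
          a = np.deg2rad(deg)
          return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

      def perimeter(origin):
          """Perimeter of the triangle traced by a candidate origin."""
          pts = [rot(ang) @ origin + shift for ang, shift in walkout.items()]
          return sum(np.linalg.norm(pts[i] - pts[(i + 1) % 3]) for i in range(3))

      best = minimize(perimeter, x0=np.zeros(2), method="Nelder-Mead")
      print("optimal origin (mm):", best.x, " perimeter (mm):", best.fun)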

  15. Gas Chromatography Data Classification Based on Complex Coefficients of an Autoregressive Model

    DOE PAGES

    Zhao, Weixiang; Morgan, Joshua T.; Davis, Cristina E.

    2008-01-01

    This paper introduces autoregressive (AR) modeling as a novel method to classify outputs from gas chromatography (GC). The inverse Fourier transformation was applied to the original sensor data, and an AR model was then applied to the transformed data to generate complex AR model coefficients. This series of coefficients effectively contains a compressed version of all of the information in the original GC signal output. We applied this method to chromatograms resulting from proliferating bacteria species grown in culture. Three types of neural networks were used to classify the AR coefficients: a backward propagating neural network (BPNN), a radial basis function-principal component analysis (RBF-PCA) approach, and a radial basis function-partial least squares regression (RBF-PLSR) approach. This exploratory study demonstrates the feasibility of using complex root coefficient patterns to distinguish various classes of experimental data, such as those from the different bacteria species. This approach also proved to be robust and potentially useful for freeing us from time alignment of GC signals.
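
    A sketch of the feature-extraction chain (inverse FFT of the chromatogram, then a complex AR fit by least squares, with the coefficients serving as the compressed representation); the AR order and the synthetic signal are placeholders.

      import numpy as np

      def ar_features(chromatogram, order=8):
          """Complex AR coefficients of the inverse-FFT'd GC signal."""
          z = np.fft.ifft(chromatogram)          # complex-valued series
          # Least-squares fit of z[n] = sum_k a_k * z[n-k], k = 1..order.
          A = np.array([z[i - order:i][::-1] for i in range(order, len(z))])
          b = z[order:]
          coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
          return coeffs                          # input features for a classifier

      rng = np.random.default_rng(4)
      t = np.linspace(0, 1, 512)
      signal = (np.exp(-((t - 0.3) / 0.02) ** 2)         # two synthetic peaks
                + 0.5 * np.exp(-((t - 0.6) / 0.03) ** 2)
                + 0.01 * rng.normal(size=t.size))
      print(ar_features(signal))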

  16. Engaging Mexican Origin Families in a School-Based Preventive Intervention

    PubMed Central

    Mauricio, Anne M.; Gonzales, Nancy A.; Millsap, Roger E.; Meza, Connie M.; Dumka, Larry E.; Germán, Miguelina; Genalo, M. Toni

    2009-01-01

    This study describes a culturally sensitive approach to engage Mexican origin families in a school-based, family-focused preventive intervention trial. The approach was evaluated via assessing study enrollment and intervention program participation, as well as examining predictors of engagement at each stage. Incorporating traditional cultural values into all aspects of engagement resulted in participation rates higher than reported rates of minority-focused trials not emphasizing cultural sensitivity. Family preferred language (English or Spanish) or acculturation status predicted engagement at all levels, with less acculturated families participating at higher rates. Spanish-language families with less acculturated adolescents participated at higher rates than Spanish-language families with more acculturated adolescents. Other findings included two-way interactions between family language and the target child’s familism values, family single- vs. dual-parent status, and number of hours the primary parent worked in predicting intervention participation. Editors’ Strategic Implications: The authors present a promising approach—which requires replication—to engaging and retaining Mexican American families in a school-based prevention program. The research also highlights the importance of considering acculturation status when implementing and studying culturally tailored aspects of prevention models. PMID:18004659

  17. Defeaturing CAD models using a geometry-based size field and facet-based reduction operators.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quadros, William Roshan; Owen, Steven James

    2010-04-01

    We propose a method to automatically defeature a CAD model by detecting irrelevant features using a geometry-based size field, and a method to remove the irrelevant features via facet-based operations on a discrete representation. A discrete B-Rep model is first created by obtaining a faceted representation of the CAD entities. The candidate facet entities are then marked for reduction by using a geometry-based size field. This is accomplished by estimating local mesh sizes based on geometric criteria. If the field value at a facet entity falls below a user-specified threshold, that entity is identified as an irrelevant feature and is marked for reduction. The reduction of marked facet entities is primarily performed using an edge collapse operator. Care is taken to retain a valid geometry and topology of the discrete model throughout the procedure. The original model is not altered, as the defeaturing is performed on a separate discrete model. Associativity between the entities of the discrete model and those of the original CAD model is maintained in order to decode the attributes and boundary conditions applied on the original CAD entities onto the mesh via the entities of the discrete model. Example models are presented to illustrate the effectiveness of the proposed approach.

  18. Assessing the use of multiple sources in student essays.

    PubMed

    Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly

    2012-09-01

    The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
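
    The LSA comparison can be reproduced compactly with scikit-learn (TF-IDF, truncated SVD, then cosine similarity between student and source sentences); the sentences below are invented for illustration, and a real deployment would fit the semantic space on a much larger corpus.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      source_sentences = [
          "Wetlands filter pollutants from water before it reaches rivers.",
          "Levees confine rivers and can increase flood height downstream.",
      ]
      student_sentences = [
          "Building levees can make floods worse further down the river.",
          "My favorite subject is history.",
      ]

      docs = source_sentences + student_sentences
      X = TfidfVectorizer().fit_transform(docs)
      S = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

      # sims[i, j]: similarity of student sentence i to source sentence j;
      # a value above some threshold counts the source content as present.
      sims = cosine_similarity(S[len(source_sentences):], S[:len(source_sentences)])
      print(sims.round(2))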

  19. Reconciled Rat and Human Metabolic Networks for Comparative Toxicogenomics and Biomarker Predictions

    DTIC Science & Technology

    2017-02-08

    compared with the original human GPR rules (Supplementary Fig. 3). The consensus-based approach for filtering orthology annotations was designed to...predictions in response to 76 drugs. We validate comparative predictions for xanthine derivatives with new experimental data and literature-based evidence

  20. Cells of Origin of Epithelial Ovarian Cancers

    DTIC Science & Technology

    2015-09-01

    cells in oral squamous cell carcinomas by a novel pathway-based lineage tracing approach in a murine model. Specific aims: 1. Determine...Lineage tracing and clonal analysis of oral cancer initiating cells. The goal of this project is to study cancer stem cells/cancer initiating...whether oral cancer cells genetically marked based on their activities for stem cell-related pathways exhibit cancer stem cell properties in vivo by

  1. Applied Concept:The Military Information Operations Function within a Comprehensive and Effects-Based Approach

    DTIC Science & Technology

    2009-04-03

    This concept paper is based on the Multinational Information Operations...The original document contains color images.

  2. Cognitive techniques and language: A return to behavioral origins.

    PubMed

    Froján Parga, María X; Núñez de Prado Gordillo, Miguel; de Pascual Verdú, Ricardo

    2017-08-01

    The main purpose of this study is to offer an alternative explanatory account of the functioning of cognitive techniques that is based on the principles of associative learning and highlights their verbal nature. The traditional accounts are questioned and analyzed in the light of the situation of psychology in the 1970s. Conceptual analysis is employed to revise the concepts of language, cognition and behavior. Several operant- and Pavlovian-based approaches to these phenomena are presented, while particular emphasis is given to Mowrer's (1954) approach and Ryle's (1949) and Wittgenstein's (1953) philosophical contributions to the field. Several logical problems are found in regard to the theoretical foundations of cognitive techniques. A combination of both operant and Pavlovian paradigms based on the above-mentioned approaches is offered as an alternative explanatory account of cognitive techniques. This new approach could overcome the conceptual fragilities of the cognitive standpoint and its dependence upon constructs of dubious logical and scientific validity.

  3. Three hybridization models based on local search scheme for job shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Balbi Fraga, Tatiana

    2015-05-01

    This work presents three different hybridization models based on the general schema of local search heuristics, named Hybrid Successive Application, Hybrid Neighborhood, and Hybrid Improved Neighborhood. Although similar approaches have already been presented in the literature in other contexts, in this work these models are applied to analyze the solution of the job shop scheduling problem with the heuristics Taboo Search and Particle Swarm Optimization. In addition, we investigate some aspects that must be considered in order to achieve better solutions than those obtained by the original heuristics. The results demonstrate that the algorithms derived from these three hybrid models are more robust than the original algorithms and able to find better results than those found by the single Taboo Search.
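
    The Hybrid Successive Application schema (the two heuristics applied alternately, each restarting from the other's incumbent solution) can be sketched generically; the toy continuous objective and the two simple searches below merely stand in for Taboo Search and Particle Swarm Optimization on the job shop problem.

      import numpy as np

      rng = np.random.default_rng(5)

      def objective(x):                      # toy surrogate for a makespan
          return np.sum(x**2) + 5.0 * np.sin(x).sum()

      def heuristic_a(x, iters=200):         # stand-in for Taboo Search
          best = x.copy()
          for _ in range(iters):
              cand = best + rng.normal(0, 0.1, size=x.size)
              if objective(cand) < objective(best):
                  best = cand
          return best

      def heuristic_b(x, n_particles=20):    # stand-in for Particle Swarm Opt.
          swarm = x + rng.normal(0, 0.5, size=(n_particles, x.size))
          best = min(swarm, key=objective)
          return best if objective(best) < objective(x) else x

      def hybrid_successive(x0, rounds=10):
          x = x0
          for _ in range(rounds):            # each heuristic polishes the
              x = heuristic_a(x)             # incumbent left by the other
              x = heuristic_b(x)
          return x

      print(objective(hybrid_successive(rng.uniform(-3, 3, size=4))))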

  4. Parameter Uncertainty Analysis Using Monte Carlo Simulations for a Regional-Scale Groundwater Model

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Pohlmann, K.

    2016-12-01

    Regional-scale grid-based groundwater models for flow and transport often contain multiple types of parameters that can intensify the challenge of parameter uncertainty analysis. We propose a Monte Carlo approach to systematically quantify the influence of various types of model parameters on groundwater flux and contaminant travel times. The Monte Carlo simulations were conducted based on the steady-state conversion of the original transient model, which was then combined with the PEST sensitivity analysis tool SENSAN and particle tracking software MODPATH. Results identified hydrogeologic units whose hydraulic conductivity can significantly affect groundwater flux, and thirteen out of 173 model parameters that can cause large variation in travel times for contaminant particles originating from given source zones.
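
    Schematically, the analysis reduces to the loop below; the forward model is a placeholder for the steady-state flow solution plus MODPATH particle tracking, and the parameter ranges are invented.

      import numpy as np

      rng = np.random.default_rng(6)
      n_runs = 1000

      def travel_time(log10_K, porosity, gradient):
          """Placeholder forward model: flow solve + particle tracking."""
          velocity = (10.0 ** log10_K) * gradient / porosity   # seepage velocity
          return 5000.0 / velocity                             # 5 km flow path

      times = np.empty(n_runs)
      for i in range(n_runs):
          # Sample each uncertain parameter from its assumed prior range.
          log10_K = rng.uniform(-6, -3)        # hydraulic conductivity (m/s)
          porosity = rng.uniform(0.05, 0.3)
          gradient = rng.uniform(0.001, 0.01)  # hydraulic gradient
          times[i] = travel_time(log10_K, porosity, gradient)

      print("travel time, 5th-95th percentile (s):", np.percentile(times, [5, 95]))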

  5. Feedback linearization of singularly perturbed systems based on canonical similarity transformations

    NASA Astrophysics Data System (ADS)

    Kabanov, A. A.

    2018-05-01

    This paper discusses the problem of feedback linearization of a singularly perturbed system in state-dependent coefficient form. The result is based on the introduction of a canonical similarity transformation. The transformation matrix is constructed from separate blocks for the fast and slow parts of the original singularly perturbed system. The transformed singularly perturbed system has a linear canonical form that significantly simplifies the control design problem. The proposed similarity transformation makes it possible to linearize the system without introducing a virtual output (as the normal-form method requires), and the transition from the phase coordinates of the transformed system to the state variables of the original system is simpler. The application of the proposed approach is illustrated through an example.

  6. Feasibility of a feedback control of atomic self-organization in an optical cavity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanov, D. A., E-mail: ivanov-den@yandex.ru; Ivanova, T. Yu.

    Many interesting nonlinear effects are based on the strong interaction of the motional degrees of freedom of atoms with an optical cavity field. Among them is the spatial self-organization of atoms in a pattern where the atoms group in either odd or even sites of the cavity-induced optical potential. An experimental observation of this effect can be simplified by using, along with the original cavity-induced feedback, an additional electronic feedback based on the detection of light leaking out of the cavity and the control of the optical potential for the atoms. Following our previous study, we show that this approach is more efficient from the laser power perspective than the original scheme without the electronic feedback.

  7. Medical image security using modified chaos-based cryptography approach

    NASA Astrophysics Data System (ADS)

    Talib Gatta, Methaq; Al-latief, Shahad Thamear Abd

    2018-05-01

    The progressive development of telecommunication and networking technologies has led to the increased popularity of telemedicine, which involves the storage and transfer of medical images and related information, so security concerns have emerged. This paper presents a method to provide security for medical images, since they play a major role in healthcare organizations. The main idea in this work is based on a chaotic sequence, used to provide an efficient encryption method that allows reconstructing the original image from the encrypted image with high quality and minimum distortion of its content, and that does not affect treatment and diagnosis. Experimental results prove the efficiency of the proposed method using statistical measures and the robust correlation between the original image and the decrypted image.
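
    A generic logistic-map keystream cipher of the kind the abstract describes is sketched below (a textbook chaos-based scheme, not the authors' specific modification); because encryption is an XOR with the chaotic keystream, applying the same operation again recovers the original image exactly.

      import numpy as np

      def logistic_keystream(n_bytes, x0=0.3141592, r=3.99):
          """Byte keystream from the logistic map x <- r*x*(1-x)."""
          x, out = x0, np.empty(n_bytes, dtype=np.uint8)
          for i in range(n_bytes):
              x = r * x * (1.0 - x)
              out[i] = int(x * 256) % 256
          return out

      def crypt(image, key=0.3141592):
          """Encrypts (or, applied twice, decrypts) a uint8 image by XOR."""
          flat = image.reshape(-1)
          ks = logistic_keystream(flat.size, x0=key)
          return (flat ^ ks).reshape(image.shape)

      img = np.random.default_rng(7).integers(0, 256, (64, 64), dtype=np.uint8)
      enc = crypt(img)
      dec = crypt(enc)
      assert np.array_equal(dec, img)          # lossless recovery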

  8. Fuzzy model-based servo and model following control for nonlinear systems.

    PubMed

    Ohtake, Hiroshi; Tanaka, Kazuo; Wang, Hua O

    2009-12-01

    This correspondence presents servo and nonlinear model following controls for a class of nonlinear systems using the Takagi-Sugeno fuzzy model-based control approach. First, the construction method of the augmented fuzzy system for continuous-time nonlinear systems is proposed by differentiating the original nonlinear system. Second, the dynamic fuzzy servo controller and the dynamic fuzzy model following controller, which can make outputs of the nonlinear system converge to target points and to outputs of the reference system, respectively, are introduced. Finally, the servo and model following controller design conditions are given in terms of linear matrix inequalities. Design examples illustrate the utility of this approach.

  9. A fusion approach for coarse-to-fine target recognition

    NASA Astrophysics Data System (ADS)

    Folkesson, Martin; Grönwall, Christina; Jungert, Erland

    2006-04-01

    A fusion approach in a query-based information system is presented. The system is designed for querying multimedia databases and is here applied to target recognition using heterogeneous data sources. The recognition process is coarse-to-fine, with an initial attribute estimation step and a following matching step. Several sensor types and algorithms are involved in each of these two steps. The matching results are observed to be independent of the origin of the estimation results. This allows data to be distributed between algorithms in an intermediate fusion step, without risk of data incest, and increases the overall chance of recognising the target. An implementation of the system is described.

  10. Decomposability and scalability in space-based observatory scheduling

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Smith, Stephen F.

    1992-01-01

    In this paper, we discuss issues of problem and model decomposition within the HSTS scheduling framework. HSTS was developed and originally applied in the context of the Hubble Space Telescope (HST) scheduling problem, motivated by the limitations of the current solution and, more generally, the insufficiency of classical planning and scheduling approaches in this problem context. We first summarize the salient architectural characteristics of HSTS and their relationship to previous scheduling and AI planning research. Then, we describe some key problem decomposition techniques supported by HSTS and underlying our integrated planning and scheduling approach, and we discuss the leverage they provide in solving space-based observatory scheduling problems.

  11. Can Family-Based Treatment of Anorexia Nervosa Be Manualized?

    PubMed Central

    Lock, James; Le Grange, Daniel

    2001-01-01

    The authors report on the development of a manual for treating adolescents with anorexia nervosa modeled on a family-based intervention originating at the Maudsley Hospital in London. The manual provides the first detailed account of a clinical approach shown to be consistently efficacious in randomized clinical trials for this disorder. Manualized family therapy appears to be acceptable to therapists, patients, and families. Preliminary outcomes are comparable to what would be expected in clinically supervised sessions. These results suggest that through the use of this manual a valuable treatment approach can now be tested more broadly in controlled and uncontrolled settings. PMID:11696652

  12. A new phase-correlation-based iris matching for degraded images.

    PubMed

    Krichen, Emine; Garcia-Salicetti, Sonia; Dorizzi, Bernadette

    2009-08-01

    In this paper, we present a new phase-correlation-based iris matching approach in order to deal with degradations in iris images due to unconstrained acquisition procedures. Our matching system is a fusion of global and local Gabor phase-correlation schemes. The main originality of our local approach is that we do not only consider the correlation peak amplitudes but also their locations in different regions of the images. Results on several degraded databases, namely, the CASIA-BIOSECURE and Iris Challenge Evaluation 2005 databases, show the improvement of our method compared to two available reference systems, Masek and Open Source for Iris (OSRIS), in verification mode.
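
    The core phase-correlation operation, returning both the peak amplitude and its location as the local scheme requires, can be sketched as follows; image sizes and contents are arbitrary.

      import numpy as np

      def phase_correlation(a, b):
          """Correlation peak amplitude and its (dy, dx) location."""
          cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
          cross /= np.abs(cross) + 1e-12       # keep phase, drop magnitude
          surface = np.fft.ifft2(cross).real   # correlation surface
          peak = np.unravel_index(surface.argmax(), surface.shape)
          return surface.max(), peak

      rng = np.random.default_rng(8)
      iris_a = rng.random((64, 256))           # unwrapped iris texture
      iris_b = np.roll(iris_a, 5, axis=1)      # same texture, rotated iris
      amp, loc = phase_correlation(iris_a, iris_b)
      print(amp, loc)    # amplitude near 1; peak location encodes the shift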

  13. Non-fragile multivariable PID controller design via system augmentation

    NASA Astrophysics Data System (ADS)

    Liu, Jinrong; Lam, James; Shen, Mouquan; Shu, Zhan

    2017-07-01

    In this paper, the issue of designing non-fragile H∞ multivariable proportional-integral-derivative (PID) controllers with derivative filters is investigated. In order to obtain the controller gains, the original system is associated with an extended system such that the PID controller design can be formulated as a static output-feedback control problem. By taking the system augmentation approach, the conditions with slack matrices for solving the non-fragile H∞ multivariable PID controller gains are established. Based on the results, linear matrix inequality (LMI)-based iterative algorithms are provided to compute the controller gains. Simulations are conducted to verify the effectiveness of the proposed approaches.

  14. Hydrodynamic cavitation: from theory towards a new experimental approach

    NASA Astrophysics Data System (ADS)

    Lucia, Umberto; Gervino, Gianpiero

    2009-09-01

    Hydrodynamic cavitation is analysed by a global thermodynamics principle following an approach based on the maximum irreversible entropy variation that has already given promising results for open systems and has been successfully applied in specific engineering problems. In this paper we present a new phenomenological method to evaluate the conditions inducing cavitation. We think this method could be useful in the design of turbo-machineries and related technologies: it represents both an original physical approach to cavitation and an economical saving in planning because the theoretical analysis could allow engineers to reduce the experimental tests and the costs of the design process.

  15. Research on inverse, hybrid and optimization problems in engineering sciences with emphasis on turbomachine aerodynamics: Review of Chinese advances

    NASA Technical Reports Server (NTRS)

    Liu, Gao-Lian

    1991-01-01

    Advances in inverse design and optimization theory in engineering fields in China are presented. Two original approaches, the image-space approach and the variational approach, are discussed in terms of turbomachine aerodynamic inverse design. Other areas of research in turbomachine aerodynamic inverse design include the improved mean-streamline (stream surface) method and optimization theory based on optimal control. Among the additional engineering fields discussed are the following: the inverse problem of heat conduction, free-surface flow, variational cogeneration of optimal grid and flow field, and optimal meshing theory of gears.

  16. High-throughput authentication of edible oils with benchtop Ultrafast 2D NMR.

    PubMed

    Gouilleux, B; Marchand, J; Charrier, B; Remaud, G S; Giraudeau, P

    2018-04-01

    We report the use of an Ultrafast 2D NMR approach applied on a benchtop NMR system (43 MHz) for the authentication of edible oils. Our results demonstrate that a profiling strategy based on fast 2D NMR spectra recorded in 2.4 min is more efficient than standard 1D experiments at classifying oils from different botanical origins, since the 1D spectra of the same samples suffer from strong peak overlap. Six edible oils with different botanical origins (olive, hazelnut, sesame, rapeseed, corn and sunflower) have been clearly discriminated by PCA analysis. Furthermore, we show how this approach combined with a PLS model can detect adulteration processes such as the addition of hazelnut oil into olive oil, a common fraud in the food industry. Copyright © 2017 Elsevier Ltd. All rights reserved.
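
    The chemometric pipeline (each 2D spectrum flattened to a vector, PCA for unsupervised separation of botanical origins, a PLS model for the adulterant fraction) can be sketched with scikit-learn; the "spectra" below are synthetic stand-ins.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(9)
      base_olive = rng.random(5000)            # flattened reference profiles
      base_hazelnut = rng.random(5000)
      fractions = rng.uniform(0, 0.5, 30)      # hazelnut fraction per blend
      X = np.array([(1 - f) * base_olive + f * base_hazelnut
                    + 0.01 * rng.normal(size=5000) for f in fractions])

      scores = PCA(n_components=2).fit_transform(X)   # unsupervised oil map
      pls = PLSRegression(n_components=2).fit(X, fractions)
      print("predicted adulteration:", pls.predict(X[:3]).ravel().round(3))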

  17. Using fuzzy rule-based knowledge model for optimum plating conditions search

    NASA Astrophysics Data System (ADS)

    Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.

    2018-03-01

    The paper discusses existing approaches to modeling the plating process with the aim of decreasing the nonuniformity of the coating thickness over the plated surface. However, these approaches do not take into account the experience, knowledge, and intuition of the decision-makers when searching for the optimal conditions of the electroplating process. An original approach to the search for optimal conditions for applying electroplating coatings, which uses a rule-based knowledge model and allows one to reduce the unevenness of the product's thickness distribution, is proposed. The block diagrams of a conventional control system for a galvanic process, as well as of a system based on the production model of knowledge, are considered. It is shown that the fuzzy production model of knowledge in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.

  18. Modeling of Pedestrian Flows Using Hybrid Models of Euler Equations and Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Bärwolff, Günter; Slawig, Thomas; Schwandt, Hartmut

    2007-09-01

    In recent years, various systems have been developed for controlling, planning and predicting the traffic of persons and vehicles, in particular under security aspects. Going beyond pure counting and statistical models, approaches based on well-known concepts originally developed in very different research areas, namely continuum mechanics and computer science, were found to be very adequate and accurate. In the present paper, we outline a continuum mechanical approach for the description of pedestrian flow.

  19. Practical Strategies for Collaboration across Discipline-Based Education Research and the Learning Sciences

    PubMed Central

    Peffer, Melanie; Renken, Maggie

    2016-01-01

    Rather than pursue questions related to learning in biology from separate camps, recent calls highlight the necessity of interdisciplinary research agendas. Interdisciplinary collaborations allow for a complicated and expanded approach to questions about learning within specific science domains, such as biology. Despite its benefits, interdisciplinary work inevitably involves challenges. Some such challenges originate from differences in theoretical and methodological approaches across lines of work. Thus, aims at developing successful interdisciplinary research programs raise important considerations regarding methodologies for studying biology learning, strategies for approaching collaborations, and training of early-career scientists. Our goal here is to describe two fields important to understanding learning in biology, discipline-based education research and the learning sciences. We discuss differences between each discipline’s approach to biology education research and the benefits and challenges associated with incorporating these perspectives in a single research program. We then propose strategies for building productive interdisciplinary collaboration. PMID:27881446

  20. A 150-year conundrum: cranial robusticity and its bearing on the origin of aboriginal australians.

    PubMed

    Curnoe, Darren

    2011-01-20

    The origin of Aboriginal Australians has been a central question of palaeoanthropology since its inception during the 19th Century. Moreover, the idea that Australians could trace their ancestry to a non-modern Pleistocene population such as Homo erectus in Southeast Asia has existed for more than 100 years, being explicitly linked to cranial robusticity. It is argued here that in order to resolve this issue a new program of research should be embraced, one aiming to test the full range of alternative explanations for robust morphology. Recent developments in the morphological sciences, especially relating to the ontogeny of the cranium, indicate that character atomisation, an approach underpinning phylogenetic reconstruction, is fraught with difficulties. This leads to the conclusion that phylogenetic-based explanations for robusticity should be reconsidered and a more parsimonious approach to explaining Aboriginal Australian origins taken, one that takes proper account of the complex processes involved in the growth of the human cranium rather than simply assuming natural selection to explain every subtle variation seen in past populations. In doing so, the null hypothesis that robusticity might result from phenotypic plasticity alone cannot be rejected, a position at odds with both reticulate and deep-time continuity models of Australian origins.

  1. A 150-Year Conundrum: Cranial Robusticity and Its Bearing on the Origin of Aboriginal Australians

    PubMed Central

    Curnoe, Darren

    2011-01-01

    The origin of Aboriginal Australians has been a central question of palaeoanthropology since its inception during the 19th Century. Moreover, the idea that Australians could trace their ancestry to a non-modern Pleistocene population such as Homo erectus in Southeast Asia has existed for more than 100 years, being explicitly linked to cranial robusticity. It is argued here that in order to resolve this issue a new program of research should be embraced, one aiming to test the full range of alternative explanations for robust morphology. Recent developments in the morphological sciences, especially relating to the ontogeny of the cranium, indicate that character atomisation, an approach underpinning phylogenetic reconstruction, is fraught with difficulties. This leads to the conclusion that phylogenetic-based explanations for robusticity should be reconsidered and a more parsimonious approach to explaining Aboriginal Australian origins taken, one that takes proper account of the complex processes involved in the growth of the human cranium rather than simply assuming natural selection to explain every subtle variation seen in past populations. In doing so, the null hypothesis that robusticity might result from phenotypic plasticity alone cannot be rejected, a position at odds with both reticulate and deep-time continuity models of Australian origins. PMID:21350636

  2. The New School Management by Wandering around

    ERIC Educational Resources Information Center

    Streshly, William A.; Gray, Susan Penny; Frase, Larry E.

    2012-01-01

    The topic of management by wandering around is not new, but the authors' approach is fresh and timely. This current rendition based on the original work by Frase and Hetzel gives new and seasoned administrators smart, practical advice about how to "wander around" with purpose and develop a more interactive leadership style. This text cites more…

  3. Anomalous change detection in imagery

    DOEpatents

    Theiler, James P [Los Alamos, NM; Perkins, Simon J [Santa Fe, NM

    2011-05-31

    A distribution-based anomaly detection platform is described that identifies a non-flat background that is specified in terms of the distribution of the data. A resampling approach is also disclosed employing scrambled resampling of the original data with one class specified by the data and the other by the explicit distribution, and solving using binary classification.

  4. An Elementary Approach to Teaching Wind Power

    ERIC Educational Resources Information Center

    Love, Tyler S.; Strimel, Greg

    2013-01-01

    Exposing students to the application of math and science through a design-based activity can make them more technologically literate and teach integration between the STEM disciplines at an early age. This article discusses an activity that originated as a portion of a green residential house project conducted by the authors with their high school…

  5. The Socratic Dialogue in Asynchronous Online Discussions: Is Constructivism Redundant?

    ERIC Educational Resources Information Center

    Kingsley, Paul

    2011-01-01

    Purpose: This paper aims to examine Socratic dialogue in asynchronous online discussions in relation to constructivism. The links between theory and practice in teaching are to be discussed whilst tracing the origins of Socratic dialogue and recent trends and use of seminar in research based institutions. Design/methodology/approach: Many online…

  6. A Computational Model of Learners Achievement Emotions Using Control-Value Theory

    ERIC Educational Resources Information Center

    Muñoz, Karla; Noguez, Julieta; Neri, Luis; Mc Kevitt, Paul; Lunney, Tom

    2016-01-01

    Game-based Learning (GBL) environments make instruction flexible and interactive. Positive experiences depend on personalization. Student modelling has focused on affect. Three methods are used: (1) recognizing the physiological effects of emotion, (2) reasoning about emotion from its origin and (3) an approach combining 1 and 2. These have proven…

  7. Building Successful Therapeutics into a Problem-Based Medical Curriculum in Africa

    ERIC Educational Resources Information Center

    Harries, C. S.; Mbali, C.; Botha, J.

    2006-01-01

    Irrational prescribing originates in undergraduate therapeutics education, where prescribing skills have been overlooked. P-drug, a rational prescribing approach, has been developed in response to poor prescribing. In 2004, the first cohort of PBL final year students at Nelson R. Mandela School of Medicine reported feeling unprepared to prescribe…

  8. Effective Approaches to Enhancing the Social Dimension of Higher Education

    ERIC Educational Resources Information Center

    Tupan-Wenno, Mary; Camilleri, Anthony Fisher; Fröhlich, Melanie; King, Sadie

    2016-01-01

    Despite all intentions in the course of the Bologna Process and decades of investment into improving the social dimension, results in many national and international studies show that inequity remains stubbornly persistent, and that inequity based on socio-economic status, parental education, gender, country-of-origin, rural background and more…

  9. Singing the Spaces: Artful Approaches to Navigating the Emotional Landscape in Environmental Education

    ERIC Educational Resources Information Center

    Burkhart, Jocelyn

    2016-01-01

    This paper briefly explores the gap in the environmental education literature on emotions, and then offers a rationale and potential directions for engaging the emotions more fully, through the arts. Using autoethnographic and arts-based methods, and including original songs and invitational reflective questions to open spaces for further inquiry…

  10. The Origins and Underpinning Principles of E-Scape

    ERIC Educational Resources Information Center

    Kimbell, Richard

    2012-01-01

    In this article I describe the context within which we developed project e-scape and the early work that laid the foundations of the project. E-scape (e-solutions for creative assessment in portfolio environments) is centred on two innovations. The first concerns a web-based approach to portfolio building; allowing learners to build their…

  11. Cell-based approaches for screening and prioritization of chemicals that may cause developmental neurotoxicity

    EPA Science Inventory

    The National Academies report on Toxicity Testing in the 21st Century envisioned the use of in vitro toxicity tests using cells of human origin to predict the ability of chemicals to cause toxicity in vivo. Successful implementation of this strategy will ultimately result in fast...

  12. Oxytonergic circuitry sustains and enables creative cognition in humans

    PubMed Central

    Baas, Matthijs; Roskes, Marieke; Sligte, Daniel J.; Ebstein, Richard P.; Chew, Soo Hong; Tong, Terry; Jiang, Yushi; Mayseless, Naama; Shamay-Tsoory, Simone G.

    2014-01-01

    Creativity enables humans to adapt flexibly to changing circumstances, to manage complex social relations and to survive and prosper through social, technological and medical innovations. In humans, chronic, trait-based as well as temporary, state-based approach orientation has been linked to increased capacity for divergent rather than convergent thinking, to more global and holistic processing styles and to more original ideation and creative problem solving. Here, we link creative cognition to oxytocin, a hypothalamic neuropeptide known to up-regulate approach orientation in both animals and humans. Study 1 (N = 492) showed that plasma oxytocin predicts novelty-seeking temperament. Study 2 (N = 110) revealed that genotype differences in a polymorphism in the oxytocin receptor gene rs1042778 predicted creative ideation, with GG/GT-carriers being more original than TT-carriers. Using double-blind placebo-controlled between-subjects designs, Studies 3–6 (N = 191) finally showed that intranasal oxytocin (vs matching placebo) reduced analytical reasoning, and increased holistic processing, divergent thinking and creative performance. We conclude that the oxytonergic circuitry sustains and enables the day-to-day creativity humans need for survival and prosperity and discuss implications. PMID:23863476

  14. The application of complex network time series analysis in turbulent heated jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.

    2014-06-15

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series recorded close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase-space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
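
    To make the network-construction step concrete, here is a minimal sketch of the natural visibility algorithm (an O(n^2) reference implementation on synthetic data, not the authors' code; networkx then supplies topological measures of the kind listed above):

    ```python
    import numpy as np
    import networkx as nx

    def visibility_graph(y):
        """Natural visibility: nodes are time indices; i and j are linked if the
        straight line between (i, y[i]) and (j, y[j]) clears every sample between."""
        n = len(y)
        g = nx.Graph()
        g.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                       for k in range(i + 1, j)):
                    g.add_edge(i, j)
        return g

    series = np.random.default_rng(0).normal(size=300)  # stand-in for a temperature record
    g = visibility_graph(series)
    print(f"mean degree: {2 * g.number_of_edges() / g.number_of_nodes():.2f}, "
          f"clustering: {nx.average_clustering(g):.3f}")
    ```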

  15. [African community empowerment approach to diagnosis in Health Care in two countries: Guinea Conakry and Congo Brazzaville].

    PubMed

    Vieira, Gildas; Courtois, Robert; Rusch, Emmanuel

    2017-01-01

    After immigration to France, populations from Sub-Saharan Africa have often maintained their traditional lifestyles, which is why housing policies have promoted their clustering in priority neighborhoods. Discussing health promotion requires investigating health policies in their countries of origin. To this end, we (i) organized brainstorming sessions with a group of 16 people resident in France who were involved in a process of strengthening the empowerment of community health programs, in order to understand the incentives and obstacles to health care in their countries of origin. We also (ii) collected literature data before undertaking several trips to Guinea and Congo, in order to compare the literature with conditions in these countries. The findings on health promotion in these countries allowed the identification of measures to be put in place, among them facilitating access to community health programs, building on successful experiences, with the prospect of transferring them to France for migrants. These measures are based on the involvement of institutional actors and of the populations themselves in educational approaches to health behavior change. A "territorial" diagnosis emphasizes the influence of the health environment in the country of origin on subsequent behaviours, and highlights solutions that can promote the harmonization of African community health in France.

  16. A Mobility Management Using Follow-Me Cloud-Cloudlet in Fog-Computing-Based RANs for Smart Cities.

    PubMed

    Chen, Yuh-Shyan; Tsai, Yi-Ting

    2018-02-06

    Mobility management supporting location tracking and location-based services (LBS) is an important issue for smart cities, providing the means for the smooth transportation of people and goods. Mobility also contributes to innovation in both public and private transportation infrastructures. With the assistance of edge/fog computing, this paper presents a new mobility management scheme using the proposed follow-me cloud-cloudlet (FMCL) approach in fog-computing-based radio access networks (Fog-RANs) for smart cities. The proposed follow-me cloud-cloudlet approach is an integration strategy of follow-me cloud (FMC) and follow-me edge (FME, also called cloudlet). A user equipment (UE) receives the data transmitted from the original cloud through the original edge cloud before the handover operation. After the handover operation, the UE searches for a new cloud, called the migrated cloud, and a new edge cloud near the UE, called the migrated edge cloud; the remaining data are migrated from the original cloud to the migrated cloud and received through the new edge cloud. Existing FMC results do not provide VM migration between cloudlets for reducing the transmission latency, and existing FME results do not provide service migration between data centers for reducing the transmission latency. Our proposed FMCL approach can simultaneously support VM migration between cloudlets and service migration between data centers to significantly reduce the transmission latency. The proposed mobility management using the FMCL approach aims to reduce the total transmission time when some data packets are pre-scheduled and pre-stored in the cache of the cloudlet as the UE switches from the previous Fog-RAN to the serving Fog-RAN. To illustrate the performance achieved, mathematical analysis and simulation results are examined in terms of the total transmission time, the throughput, the probability of packet loss, and the number of control messages.
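
    The latency argument can be illustrated with back-of-envelope arithmetic (an illustrative toy model with invented numbers, not the paper's analysis): pre-storing part of the data in the serving cloudlet moves that share onto the faster edge path.

    ```python
    # Compare total delivery time with and without pre-scheduled cloudlet caching.
    data_mb = 100.0        # total data still owed to the UE at handover
    bw_cloud = 20.0        # MB/s on the UE <-> remote-cloud path (assumed)
    bw_edge = 80.0         # MB/s on the UE <-> edge-cloudlet path (assumed)
    cached = 0.4           # fraction pre-stored in the serving cloudlet's cache

    t_plain = data_mb / bw_cloud
    t_fmcl = cached * data_mb / bw_edge + (1 - cached) * data_mb / bw_cloud
    print(f"no pre-fetch: {t_plain:.1f} s, with pre-fetch: {t_fmcl:.1f} s")
    ```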

  18. An approach for maximizing the smallest eigenfrequency of structure vibration based on piecewise constant level set method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhengfang; Chen, Weifeng

    2018-05-01

    Maximization of the smallest eigenfrequency of the linearized elasticity system with an area constraint is investigated. The elasticity system is extended into a large background domain, but the void is vacuum and not filled with ersatz material. The piecewise constant level set (PCLS) method is applied to represent two regions, the original material region and the void region. A quadratic PCLS function is proposed to represent the characteristic function. Consequently, the functional derivative of the smallest eigenfrequency with respect to the PCLS function takes nonzero values in the original material region and zero in the void region. A penalty gradient algorithm is proposed, which initializes the whole background domain with the original material and decreases the area of the original material region until the area constraint is satisfied. 2D and 3D numerical examples are presented, illustrating the validity of the proposed algorithm.
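
    The inner computation at each gradient step is a generalized eigenproblem K u = λ M u; a minimal sketch (toy 1D stiffness and mass matrices standing in for the finite-element assembly, not the paper's discretization):

    ```python
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import eigsh

    n = 200
    K = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n)).tocsc()  # stiffness
    M = diags([1.0 / n], offsets=[0], shape=(n, n)).tocsc()                 # lumped mass

    # Smallest generalized eigenvalue via shift-invert about zero
    lam, _ = eigsh(K, k=1, M=M, sigma=0.0, which="LM")
    omega_min = np.sqrt(lam[0])  # smallest eigenfrequency
    print(f"smallest eigenfrequency: {omega_min:.4f}")
    ```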

  19. A motion-constraint logic for moving-base simulators based on variable filter parameters

    NASA Technical Reports Server (NTRS)

    Miller, G. K., Jr.

    1974-01-01

    A motion-constraint logic for moving-base simulators has been developed that is a modification to the linear second-order filters generally employed in conventional constraints. In the modified constraint logic, the filter parameters are not constant but vary with the instantaneous motion-base position to increase the constraint as the system approaches the positional limits. With the modified constraint logic, accelerations larger than originally expected are limited while conventional linear filters would result in automatic shutdown of the motion base. In addition, the modified washout logic has frequency-response characteristics that are an improvement over conventional linear filters with braking for low-frequency pilot inputs. During simulated landing approaches of an externally blown flap short take-off and landing (STOL) transport using decoupled longitudinal controls, the pilots were unable to detect much difference between the modified constraint logic and the logic based on linear filters with braking.
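
    The idea of stiffening the filter near the stroke limit can be sketched in a few lines (an assumption-laden toy simulation, not the NASA constraint logic itself):

    ```python
    import numpy as np

    def washout_step(x, v, a_cmd, dt, zeta=0.7, omega0=1.0, x_max=1.0, k=4.0):
        """One Euler step of x'' = a_cmd - 2*zeta*omega*x' - omega^2*x, where the
        natural frequency omega grows as the platform nears its position limit."""
        omega = omega0 * (1.0 + k * (x / x_max) ** 2)
        a = a_cmd - 2.0 * zeta * omega * v - omega ** 2 * x
        return x + v * dt, v + a * dt

    x, v, dt = 0.0, 0.0, 0.01
    for t in np.arange(0.0, 10.0, dt):
        a_cmd = 0.5 if t < 2.0 else 0.0   # sustained acceleration command, then release
        x, v = washout_step(x, v, a_cmd, dt)
    print(f"final platform excursion: {x:.3f}")
    ```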

  20. Sensor fusion display evaluation using information integration models in enhanced/synthetic vision applications

    NASA Technical Reports Server (NTRS)

    Foyle, David C.

    1993-01-01

    Based on existing integration models in the psychological literature, an evaluation framework is developed to assess sensor fusion displays as might be implemented in an enhanced/synthetic vision system. The proposed evaluation framework for evaluating the operator's ability to use such systems is a normative approach: The pilot's performance with the sensor fusion image is compared to models' predictions based on the pilot's performance when viewing the original component sensor images prior to fusion. This allows for the determination as to when a sensor fusion system leads to: poorer performance than one of the original sensor displays, clearly an undesirable system in which the fused sensor system causes some distortion or interference; better performance than with either single sensor system alone, but at a sub-optimal level compared to model predictions; optimal performance compared to model predictions; or, super-optimal performance, which may occur if the operator were able to use some highly diagnostic 'emergent features' in the sensor fusion display, which were unavailable in the original sensor displays.
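
    As a hedged illustration of the normative comparison, take probability summation as the integration model (one plausible model among several; all numbers invented):

    ```python
    def probability_summation(p1, p2):
        """Predicted detection rate if the two component images contribute independently."""
        return 1.0 - (1.0 - p1) * (1.0 - p2)

    p_a, p_b = 0.70, 0.60          # performance with each single-sensor display
    p_fused = 0.93                 # measured performance with the fused display

    p_model = probability_summation(p_a, p_b)
    if p_fused < max(p_a, p_b):
        verdict = "poorer than a single sensor: fusion distorts or interferes"
    elif p_fused < p_model - 0.02:
        verdict = "better than either sensor, but sub-optimal vs. the model"
    elif p_fused <= p_model + 0.02:
        verdict = "optimal: matches the model prediction"
    else:
        verdict = "super-optimal: emergent features likely exploited"
    print(f"model {p_model:.2f}, observed {p_fused:.2f} -> {verdict}")
    ```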

  1. PGI chicory (Cichorium intybus L.) traceability by means of HRMAS-NMR spectroscopy: a preliminary study.

    PubMed

    Ritota, Mena; Casciani, Lorena; Valentini, Massimiliano

    2013-05-01

    Analytical traceability of PGI and PDO foods (Protected Geographical Indication and Protected Designation of Origin, respectively) is one of the most challenging tasks of current applied research. Here we propose a metabolomic approach based on the combination of (1)H high-resolution magic angle spinning nuclear magnetic resonance (HRMAS-NMR) spectroscopy with multivariate analysis, i.e. PLS-DA, as a reliable tool for the traceability of Italian PGI chicories (Cichorium intybus L.), i.e. Radicchio Rosso di Treviso and Radicchio Variegato di Castelfranco, also known as red and red-spotted, respectively. The metabolic profile was obtained by means of HRMAS-NMR, and multivariate data analysis allowed us to build statistical models capable of providing clear discrimination between the two varieties and classification according to geographical origin. Based on Variable Importance in Projection values, molecular markers for classifying the different types of red chicories analysed were identified, accounting for both the cultivar and the place of origin. © 2012 Society of Chemical Industry.
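
    A minimal PLS-DA sketch with Variable Importance in Projection (VIP) scores, on synthetic data (the VIP formula below is the standard one, not code from the paper):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 25))          # 40 spectra x 25 binned NMR signals
    y = np.repeat([0.0, 1.0], 20)          # two chicory classes, dummy-coded
    X[y == 1, :3] += 1.0                   # class difference in the first 3 variables

    pls = PLSRegression(n_components=2).fit(X, y)

    def vip_scores(pls):
        t, w, q = pls.x_scores_, pls.x_weights_, pls.y_loadings_
        p = w.shape[0]
        ssy = np.sum(t ** 2, axis=0) * np.sum(q ** 2, axis=0)  # Y-variance per component
        wn = w / np.linalg.norm(w, axis=0)
        return np.sqrt(p * (wn ** 2 @ ssy) / ssy.sum())

    print("candidate markers (VIP > 1):", np.where(vip_scores(pls) > 1.0)[0])
    ```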

  2. Learning SVM in Kreĭn Spaces.

    PubMed

    Loosli, Gaelle; Canu, Stephane; Ong, Cheng Soon

    2016-06-01

    This paper presents a theoretical foundation for an SVM solver in Kreĭn spaces. Until now, all methods have been based either on matrix correction, or on non-convex minimization, or on feature-space embedding. Here we justify and evaluate a solution that uses the original (indefinite) similarity measure, in the original Kreĭn space. This solution is the result of a stabilization procedure. We establish the correspondence between the stabilization problem (which has to be solved) and a classical SVM based on minimization (which is easy to solve). We provide simple equations to go from one to the other (in both directions). This link between the stabilization and minimization problems is the key to obtaining a solution in the original Kreĭn space. Using KSVM, one can solve SVM with usually troublesome kernels (large negative eigenvalues or large numbers of negative eigenvalues). Experiments show that our algorithm KSVM outperforms all previously proposed approaches for dealing with indefinite matrices in SVM-like kernel methods.
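
    A rough sketch of a flip-then-transform recipe for indefinite kernels (a toy stand-in under the assumption that the stabilized solution is recovered by applying U S U^T to the flipped-problem solution; not the authors' solver):

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 5))
    y = np.sign(X[:, 0] + 0.3 * rng.normal(size=60))

    def indefinite_kernel(A, B):
        """An indefinite similarity: difference of two Gaussian kernels."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2) - 0.5 * np.exp(-d2 / 4.0)

    K = indefinite_kernel(X, X)
    w, U = np.linalg.eigh(K)               # K = U diag(w) U^T, w may be negative
    S = np.sign(w)
    K_flip = (U * np.abs(w)) @ U.T         # PSD surrogate ("flipped" kernel)

    svc = SVC(kernel="precomputed", C=1.0).fit(K_flip, y)
    alpha = np.zeros(len(y))
    alpha[svc.support_] = svc.dual_coef_[0]
    alpha_krein = U @ (S * (U.T @ alpha))  # pull the solution back to the Krein space
    decision = K @ alpha_krein + svc.intercept_[0]
    print("train accuracy:", np.mean(np.sign(decision) == y))
    ```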

  3. Semantics-Based Composition of Integrated Cardiomyocyte Models Motivated by Real-World Use Cases.

    PubMed

    Neal, Maxwell L; Carlson, Brian E; Thompson, Christopher T; James, Ryan C; Kim, Karam G; Tran, Kenneth; Crampin, Edmund J; Cook, Daniel L; Gennari, John H

    2015-01-01

    Semantics-based model composition is an approach for generating complex biosimulation models from existing components that relies on capturing the biological meaning of model elements in a machine-readable fashion. This approach allows the user to work at the biological rather than computational level of abstraction and helps minimize the amount of manual effort required for model composition. To support this compositional approach, we have developed the SemGen software, and here report on SemGen's semantics-based merging capabilities using real-world modeling use cases. We successfully reproduced a large, manually-encoded, multi-model merge: the "Pandit-Hinch-Niederer" (PHN) cardiomyocyte excitation-contraction model, previously developed using CellML. We describe our approach for annotating the three component models used in the PHN composition and for merging them at the biological level of abstraction within SemGen. We demonstrate that we were able to reproduce the original PHN model results in a semi-automated, semantics-based fashion and also rapidly generate a second, novel cardiomyocyte model composed using an alternative, independently-developed tension generation component. We discuss the time-saving features of our compositional approach in the context of these merging exercises, the limitations we encountered, and potential solutions for enhancing the approach.

  5. Learner-Centered Inquiry in Undergraduate Biology: Positive Relationships with Long-Term Student Achievement

    PubMed Central

    Ebert-May, Diane

    2010-01-01

    We determined short- and long-term correlates of a revised introductory biology curriculum on understanding of biology as a process of inquiry and learning of content. In the original curriculum students completed two traditional lecture-based introductory courses. In the revised curriculum students completed two new learner-centered, inquiry-based courses. The new courses differed significantly from those of the original curriculum through emphases on critical thinking, collaborative work, and/or inquiry-based activities. Assessments were administered to compare student understanding of the process of biological science and content knowledge in the two curricula. More seniors who completed the revised curriculum had high-level profiles on the Views About Science Survey for Biology compared with seniors who completed the original curriculum. Also as seniors, students who completed the revised curriculum scored higher on the standardized Biology Field Test. Our results showed that an intense inquiry-based learner-centered learning experience early in the biology curriculum was associated with long-term improvements in learning. We propose that students learned to learn science in the new courses which, in turn, influenced their learning in subsequent courses. Studies that determine causal effects of learner-centered inquiry-based approaches, rather than correlative relationships, are needed to test our proposed explanation. PMID:21123693

  6. Alert management for home healthcare based on home automation analysis.

    PubMed

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    Rising healthcare costs for elderly and disabled people can be controlled by offering people autonomy at home by means of information technology. In this paper, we present an original and sensorless alert management solution which performs multimedia and home automation service discrimination and extracts highly regular home activities to serve as virtual sensors for alert management. Results on simulated data, based on a real context, allow us to evaluate our approach before application to real data.

  7. Systems-based decomposition schemes for the approximate solution of multi-term fractional differential equations

    NASA Astrophysics Data System (ADS)

    Ford, Neville J.; Connolly, Joseph A.

    2009-07-01

    We give a comparison of the efficiency of three alternative decomposition schemes for the approximate solution of multi-term fractional differential equations using the Caputo form of the fractional derivative. The schemes we compare are based on conversion of the original problem into a system of equations. We review alternative approaches and consider how the most appropriate numerical scheme may be chosen to solve a particular equation.
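
    As a hedged illustration of the conversion step, a standard commensurate-order reduction (one scheme of this family, not necessarily the exact variants compared in the paper):

    ```latex
    % Rewrite the multi-term Caputo equation
    %   D^{1.5} y + a\, D^{0.5} y + b\, y = f(t)
    % as a system of order-0.5 equations via x_1 = y,\; x_2 = D^{0.5} y,\; x_3 = D^{1} y:
    \[
      D^{0.5} x_1 = x_2, \qquad
      D^{0.5} x_2 = x_3, \qquad
      D^{0.5} x_3 = f(t) - a\, x_2 - b\, x_1 .
    \]
    ```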

  8. Introduction to fragment-based drug discovery.

    PubMed

    Erlanson, Daniel A

    2012-01-01

    Fragment-based drug discovery (FBDD) has emerged in the past decade as a powerful tool for discovering drug leads. The approach first identifies starting points: very small molecules (fragments) that are about half the size of typical drugs. These fragments are then expanded or linked together to generate drug leads. Although the origins of the technique date back some 30 years, it was only in the mid-1990s that experimental techniques became sufficiently sensitive and rapid for the concept to become practical. Since that time, the field has exploded: FBDD has played a role in the discovery of at least 18 drugs that have entered the clinic, and practitioners of FBDD can be found throughout the world in both academia and industry. Literally dozens of reviews have been published on various aspects of FBDD or on the field as a whole, as have three books (Jahnke and Erlanson, Fragment-based approaches in drug discovery, 2006; Zartler and Shapiro, Fragment-based drug discovery: a practical approach, 2008; Kuo, Fragment based drug design: tools, practical approaches, and examples, 2011). However, this chapter will assume that the reader is approaching the field with little prior knowledge. It will introduce some of the key concepts, set the stage for the chapters to follow, and demonstrate how X-ray crystallography plays a central role in fragment identification and advancement.

  9. VizieR Online Data Catalog: Kepler TTVs. IX. The full long-cadence data set (Holczer+, 2016)

    NASA Astrophysics Data System (ADS)

    Holczer, T.; Mazeh, T.; Nachmani, G.; Jontof-Hutter, D.; Ford, E. B.; Fabrycky, D.; Ragozzine, D.; Kane, M.; Steffen, J. H.

    2016-10-01

    The Kepler mission in its original mode of operation has been terminated after 17 quarters (May 2009-Apr 2013), and we do not expect any additional Kepler TTVs for the KOIs identified during the original mission. Thus, here we analyze the whole data set of the mission and derive a complete catalog of the transit timings. Following the approach of Mazeh et al. (2013, J/ApJS/208/16), we present here an analysis of 2599 KOIs (from the NASA exoplanet archive), based on all 17 quarters of the Kepler data. (7 data files).
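
    The core of a transit-timing catalog is the O-C (observed minus calculated) series: fit a linear ephemeris to mid-transit times and inspect the residuals. A minimal sketch with synthetic numbers (not data from the catalog):

    ```python
    import numpy as np

    n = np.arange(60)                              # transit index
    period, t0 = 10.3, 130.0                       # days, illustrative ephemeris
    ttv = 0.01 * np.sin(2 * np.pi * n / 25.0)      # injected sinusoidal TTV signal
    noise = 0.002 * np.random.default_rng(2).normal(size=n.size)
    t_obs = t0 + period * n + ttv + noise

    slope, intercept = np.polyfit(n, t_obs, 1)     # best-fit linear ephemeris
    o_minus_c = t_obs - (intercept + slope * n)    # timing residuals in days
    print(f"P = {slope:.4f} d, TTV rms = {o_minus_c.std() * 24 * 60:.2f} min")
    ```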

  10. Biomarkers and isotopic fingerprinting to track sediment origin and connectivity at Baldegg Lake (Switzerland)

    NASA Astrophysics Data System (ADS)

    Lavrieux, Marlène; Meusburger, Katrin; Birkholz, Axel; Alewell, Christine

    2017-04-01

    Slope destabilization and associated sediment transfer are among the major causes of aquatic ecosystem impairment and surface water quality degradation. Through land use and agricultural practices, human activities modify the soil erosion risk and catchment connectivity, becoming a key factor of sediment dynamics. Hence, restoration and management plans for water bodies can only be efficient if the sediment sources, and the proportions attributable to different land uses and agricultural practices, are identified. Several sediment fingerprinting methods, based on geochemical (elemental composition), color, magnetic or isotopic (137Cs) sediment properties, are currently in use. However, these tools are not suitable for land-use-based fingerprinting. New organic geochemical approaches have been developed to discriminate source-soil contributions under different land uses: the compound-specific stable isotope (CSSI) technique, based on the variability of the isotopic signature of biomarkers (here, fatty acid δ13C) among plant species, and the analysis of highly specific (i.e. source-family- or even source-species-specific) biomarker assemblages, whose use has until now been mainly restricted to palaeoenvironmental reconstructions but which also offers promising prospects for tracing present-day sediment origin. The approach was applied to reconstruct the spatio-temporal variability of the main sediment sources of Baldegg Lake (Lucerne Canton, Switzerland), which suffers from substantial eutrophication despite several restoration attempts during the last 40 years. The sediment-supplying areas and the exported volumes were identified using the CSSI technique and highly specific biomarkers, coupled to a sediment connectivity model. The variability of sediment origin was characterized through the analysis of suspended river sediments sampled at high-flow conditions (short term) and of a lake sediment core covering the last 130 years (long term). The results show the utility of biomarkers and CSSI to track organic sources in contrasting land-use settings. Combined with other fingerprinting methods, this approach could in the future become a decision support tool for catchment management.
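
    Once each source has a tracer signature, attributing a mixed sediment sample reduces to a constrained unmixing step; a hedged sketch with invented numbers (real CSSI studies use more tracers and propagate uncertainty):

    ```python
    import numpy as np
    from scipy.optimize import lsq_linear

    # Rows: two fatty-acid d13C tracers; columns: arable, pasture, forest sources
    A = np.array([[-28.0, -31.5, -34.0],
                  [-26.5, -30.0, -33.0]])
    b = np.array([-30.25, -28.85])          # signature of the mixed sediment

    # Append the sum-to-one constraint as a heavily weighted extra equation
    w = 1e3
    A_aug = np.vstack([A, w * np.ones(3)])
    b_aug = np.append(b, w * 1.0)
    res = lsq_linear(A_aug, b_aug, bounds=(0.0, 1.0))
    print("estimated source fractions:", np.round(res.x, 3))
    ```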

  11. Fluorescence triggering: A general strategy for enumerating and phenotyping extracellular vesicles by flow cytometry.

    PubMed

    Arraud, Nicolas; Gounou, Céline; Turpin, Delphine; Brisson, Alain R

    2016-02-01

    Plasma contains cell-derived extracellular vesicles (EVs) which participate in various physiopathological processes and have potential biomedical applications. Despite intense research activity, knowledge on EVs is limited, mainly due to the difficulty of isolating and characterizing sub-micrometer particles like EVs. We have recently reported that a simple flow cytometry (FCM) approach based on triggering the detection on a fluorescence signal enabled the detection of 50× more Annexin-A5 binding EVs (Anx5+ EVs) in plasma than the conventional FCM approach based on light scattering triggering. Here, we present the application of the fluorescence triggering approach to the enumeration and phenotyping of EVs from platelet-free plasma (PFP), focusing on CD41+ and CD235a+ EVs, as well as their sub-populations which bind or do not bind Anx5. Higher EV concentrations were detected by fluorescence triggering as compared to light scattering triggering, namely 40× for Anx5+ EVs, 75× for CD41+ EVs, and 15× for CD235a+ EVs. We found that about 30% of Anx5+ EVs were of platelet origin, while only 3% of them were of erythrocyte origin. In addition, a majority of EVs of platelet and erythrocyte origin do not expose phosphatidylserine (PS), in contrast to the classical theory of EV formation. Furthermore, the same PFP samples were analyzed fresh and after freeze-thawing, showing that freeze-thawing induces an increase of about 35% in the amount of Anx5+ EVs, while the other EV phenotypes remain unchanged. The method of EV detection and phenotyping by fluorescence triggering is simple, sensitive and reliable. We foresee that its application to EV studies will improve our understanding of the formation mechanisms and functions of EVs in health and disease and help the development of EV-based biomarkers. © 2015 International Society for Advancement of Cytometry.

  12. A Firefly Algorithm-based Approach for Pseudo-Relevance Feedback: Application to Medical Database.

    PubMed

    Khennak, Ilyes; Drias, Habiba

    2016-11-01

    The difficulty of disambiguating the sense of the incomplete and imprecise keywords that are extensively used in search queries has caused search systems to fail to retrieve the desired information. One of the most powerful and promising methods to overcome this shortcoming and improve the performance of search engines is Query Expansion, whereby the user's original query is augmented with new keywords that best characterize the user's information needs and produce a more useful query. In this paper, a new Firefly Algorithm-based approach is proposed to enhance the retrieval effectiveness of query expansion while maintaining low computational complexity. In contrast to the existing literature, the proposed approach uses a Firefly Algorithm to find the best expanded query among a set of expanded query candidates. Moreover, this new approach allows the length of the expanded query to be determined empirically. Experimental results on MEDLINE, the online medical information database, show that our proposed approach is more effective and efficient compared to the state of the art.
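
    The core firefly update is easy to state in code; a generic sketch on a continuous toy fitness (in the paper, the fitness would score candidate expanded queries against the retrieval objective):

    ```python
    import numpy as np

    def firefly_minimize(fitness, dim=5, n=15, iters=100,
                         alpha=0.2, beta0=1.0, gamma=1.0, seed=3):
        rng = np.random.default_rng(seed)
        x = rng.uniform(-2, 2, size=(n, dim))       # firefly positions
        f = np.apply_along_axis(fitness, 1, x)
        for _ in range(iters):
            for i in range(n):
                for j in range(n):
                    if f[j] < f[i]:                  # j is brighter (better)
                        r2 = np.sum((x[i] - x[j]) ** 2)
                        beta = beta0 * np.exp(-gamma * r2)
                        x[i] += beta * (x[j] - x[i]) + alpha * rng.normal(size=dim)
                        f[i] = fitness(x[i])
            alpha *= 0.98                            # cool the random walk
        best = int(np.argmin(f))
        return x[best], f[best]

    xb, fb = firefly_minimize(lambda v: np.sum(v ** 2))
    print(f"best fitness: {fb:.4f}")
    ```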

  13. Selecting Populations for Non-Analogous Climate Conditions Using Universal Response Functions: The Case of Douglas-Fir in Central Europe.

    PubMed

    Chakraborty, Debojyoti; Wang, Tongli; Andre, Konrad; Konnert, Monika; Lexer, Manfred J; Matulla, Christoph; Schueler, Silvio

    2015-01-01

    Identifying populations within tree species potentially adapted to future climatic conditions is an important requirement for reforestation and assisted migration programmes. Such populations can be identified either by empirical response functions based on correlations of quantitative traits with climate variables or by climate envelope models that compare the climate of seed sources and potential growing areas. In the present study, we analyzed the intraspecific variation in the climate growth response of Douglas-fir planted within the non-analogous climate conditions of Central and continental Europe. With data from 50 common garden trials, we developed Universal Response Functions (URFs) for tree height and mean basal area and compared the growth performance of the selected best-performing populations with that of populations identified through a climate envelope approach. Climate variables of the trial location were found to be stronger predictors of growth performance than climate variables of the population origin. Although the precipitation regime of the population sources varied strongly, none of the precipitation-related climate variables of population origin was found to be significant within the models. Overall, the URFs explained more than 88% of the variation in growth performance. Populations identified by the URF models originate from the western Cascades and coastal areas of Washington and Oregon and show significantly higher growth performance than populations identified by the climate envelope approach under both current and climate change scenarios. The URFs predict decreasing growth performance at low and middle elevations of the case study area, but increasing growth performance on high-elevation sites. Our analysis suggests that population recommendations based on empirical approaches should be preferred, and that population selections by climate envelope models that do not consider climatic constraints on growth performance should be carefully appraised before transferring populations to planting locations with novel or dissimilar climates.

  14. Clustergrammer, a web-based heatmap visualization and analysis tool for high-dimensional biological data

    PubMed Central

    Fernandez, Nicolas F.; Gundersen, Gregory W.; Rahman, Adeeb; Grimes, Mark L.; Rikova, Klarisa; Hornbeck, Peter; Ma’ayan, Avi

    2017-01-01

    Most tools developed to visualize hierarchically clustered heatmaps generate static images. Clustergrammer is a web-based visualization tool with interactive features such as: zooming, panning, filtering, reordering, sharing, performing enrichment analysis, and providing dynamic gene annotations. Clustergrammer can be used to generate shareable interactive visualizations by uploading a data table to a web-site, or by embedding Clustergrammer in Jupyter Notebooks. The Clustergrammer core libraries can also be used as a toolkit by developers to generate visualizations within their own applications. Clustergrammer is demonstrated using gene expression data from the cancer cell line encyclopedia (CCLE), original post-translational modification data collected from lung cancer cells lines by a mass spectrometry approach, and original cytometry by time of flight (CyTOF) single-cell proteomics data from blood. Clustergrammer enables producing interactive web based visualizations for the analysis of diverse biological data. PMID:28994825

  15. Comparison of Sasang Constitutional Medicine, Traditional Chinese Medicine and Ayurveda

    PubMed Central

    Kim, Jong Yeol; Pham, Duong Duc; Koh, Byung Hee

    2011-01-01

    Sasang constitutional medicine (SCM), traditional Chinese medicine (TCM) and Ayurveda are three different forms of Asian traditional medicine. Although these traditions share a lot in common as holistic medicines, the different philosophical foundations found in each confer distinguishing attributes and unique qualities. SCM is based on a constitution-based approach, and is in this way relatively more similar to the Ayurvedic tradition than to the TCM, although many of the basic SCM theories were originally derived from TCM, a syndrome-based medicine. SCM and TCM use the same botanical materials that are distributed mainly in the East Asian region, but the basic principles of usage and the underlying rationale are completely different from each other. Meanwhile, the principles of the Ayurvedic use of botanical resources are very similar to those seen in SCM, but the medicinal herbs used in Ayurveda generally originate from the West Asian region which displays a different spectrum of flora. PMID:21949669

  16. Vaginal microbial flora analysis by next generation sequencing and microarrays; can microbes indicate vaginal origin in a forensic context?

    PubMed

    Benschop, Corina C G; Quaak, Frederike C A; Boon, Mathilde E; Sijen, Titia; Kuiper, Irene

    2012-03-01

    Forensic analysis of biological traces generally encompasses the investigation of both the person who contributed to the trace and the body site(s) from which the trace originates. For instance, for sexual assault cases, it can be beneficial to distinguish vaginal samples from skin or saliva samples. In this study, we explored the use of microbial flora to indicate vaginal origin. First, we explored the vaginal microbiome for a large set of clinical vaginal samples (n = 240) by next generation sequencing (n = 338,184 sequence reads) and found 1,619 different sequences. Next, we selected 389 candidate probes targeting genera or species and designed a microarray, with which we analysed a diverse set of samples; 43 DNA extracts from vaginal samples and 25 DNA extracts from samples from other body sites, including sites in close proximity of or in contact with the vagina. Finally, we used the microarray results and next generation sequencing dataset to assess the potential for a future approach that uses microbial markers to indicate vaginal origin. Since no candidate genera/species were found to positively identify all vaginal DNA extracts on their own, while excluding all non-vaginal DNA extracts, we deduce that a reliable statement about the cellular origin of a biological trace should be based on the detection of multiple species within various genera. Microarray analysis of a sample will then render a microbial flora pattern that is probably best analysed in a probabilistic approach.

  17. Digitized locksmith forensics: automated detection and segmentation of toolmarks on highly structured surfaces

    NASA Astrophysics Data System (ADS)

    Clausing, Eric; Vielhauer, Claus

    2014-02-01

    Locksmith forensics is an important area in crime scene forensics. Due to new optical, contactless, nanometer-range sensing technology, such traces can be captured, digitized and analyzed more easily, allowing a complete digital forensic investigation. In this paper we present a significantly improved approach for the detection and segmentation of toolmarks on surfaces of locking cylinder components (using the example of the locking cylinder component 'key pin') acquired by a 3D confocal laser scanning microscope. This improved approach is based on our prior work [1], which used a block-based classification approach with textural features and achieved a solid detection rate of 75-85% for toolmarks originating from illegal opening methods. Here we improve, expand and fuse this prior approach with additional features from acquired surface topography data, color data and an image processing approach using adapted Gabor filters. In particular, we are able to raise the detection and segmentation rates above 90% on our test set of 20 key pins with approximately 700 single toolmark traces from four different opening methods. We can provide a precise pixel-based segmentation, as opposed to the rather imprecise segmentation of our prior block-based approach, and as the use of the two additional data types (color and especially topography) requires specific pre-processing, we furthermore propose an adequate approach for this purpose.
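
    A small sketch of the Gabor-filter stage (illustrative parameters on a stock image; the fusion with topography and color data described above is not reproduced):

    ```python
    import numpy as np
    from skimage.data import camera          # stand-in; a real surface scan would be loaded
    from skimage.filters import gabor

    image = camera().astype(float) / 255.0
    responses = []
    for theta in np.linspace(0, np.pi, 6, endpoint=False):   # orientation bank
        real, imag = gabor(image, frequency=0.15, theta=theta)
        responses.append(np.hypot(real, imag))               # magnitude response

    energy = np.max(responses, axis=0)                 # strongest orientation response
    mask = energy > energy.mean() + 2 * energy.std()   # crude mark/no-mark mask
    print(f"flagged {mask.mean():.1%} of pixels as potential toolmark regions")
    ```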

  18. An under-designed RC frame: Seismic assessment through displacement based approach and possible refurbishment with FRP strips and RC jacketing

    NASA Astrophysics Data System (ADS)

    Valente, Marco; Milani, Gabriele

    2017-07-01

    Many existing reinforced concrete buildings in Southern Europe were built (and hence designed) before the introduction of displacement based design in national seismic codes. They are obviously highly vulnerable to seismic actions. In such a situation, simplified methodologies for the seismic assessment and retrofitting of existing structures are required. In this study, a displacement based procedure using non-linear static analyses is applied to a four-story existing RC frame. The aim is to obtain an estimation of its overall structural inadequacy as well as the effectiveness of a specific retrofitting intervention by means of GFRP laminates and RC jacketing. Accurate numerical models are developed within a displacement based approach to reproduce the seismic response of the RC frame in the original configuration and after strengthening.

  19. Realizing a terrestrial reference frame using the Global Positioning System

    NASA Astrophysics Data System (ADS)

    Haines, Bruce J.; Bar-Sever, Yoaz E.; Bertiger, Willy I.; Desai, Shailen D.; Harvey, Nate; Sibois, Aurore E.; Weiss, Jan P.

    2015-08-01

    We describe a terrestrial reference frame (TRF) realization based on Global Positioning System (GPS) data alone. Our approach rests on a highly dynamic, long-arc (9 day) estimation strategy and on GPS satellite antenna calibrations derived from Gravity Recovery and Climate Experiment and TOPEX/Poseidon low Earth orbit receiver GPS data. Based on nearly 17 years of data (1997-2013), our solution for scale rate agrees with International Terrestrial Reference Frame (ITRF)2008 to 0.03 ppb yr-1, and our solution for 3-D origin rate agrees with ITRF2008 to 0.4 mm yr-1. Absolute scale differs by 1.1 ppb (7 mm at the Earth's surface) and 3-D origin by 8 mm. These differences lie within estimated error levels for the contemporary TRF.

  20. Life Origination and Development Hydrate theory (LOH-Theory): new approaches to the problems of the optimal nutrition and life prolongation

    NASA Astrophysics Data System (ADS)

    Kadyshevich, E. A.; Ostrovskii, V. E.

    2014-04-01

    Life Origination Hydrate Theory (LOH-Theory) and Mitosis and Replication Hydrate Theory (MRH-Theory), both grounded in the notion of honeycomb gas-hydrate structure formation/destruction as the physicochemical phenomenon governing DNA origination and replication, allow new approaches to the problems of optimal nutrition and life prolongation.

  1. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    PubMed

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To establish validity, content experts provided input on the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better in the modified version, whereas others appeared to be more reliable in the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and for indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bottacini, Eugenio; Orlando, Elena; Moskalenko, Igor

    X-ray spectral lines at unforeseen energies are important because they can shed light on the extreme physical conditions of the environment around the supermassive black holes of active galactic nuclei (AGNs). Mrk 876 displays such a line at a rest-frame energy of 4.80 (+0.05/−0.04) keV. A possible interpretation of its origin can be found in the hotspot scenario, in which the primary radiation from a flare in the hot corona of an AGN illuminates a limited portion of the accretion disk, which emits by fluorescence. In this context, the line can represent an extremely gravitationally redshifted Fe line originating on the accretion disk below six gravitational radii from a rotating supermassive black hole. The correct estimate of the line significance requires a dedicated approach. Based on an existing rigorous approach, we have performed extensive Monte Carlo simulations. We determine that the line is a real feature at a ∼99% confidence level.

  3. Assessing power of large river fish monitoring programs to detect population changes: the Missouri River sturgeon example

    USGS Publications Warehouse

    Wildhaber, M.L.; Holan, S.H.; Bryan, J.L.; Gladish, D.W.; Ellersieck, M.

    2011-01-01

    In 2003, the US Army Corps of Engineers initiated the Pallid Sturgeon Population Assessment Program (PSPAP) to monitor pallid sturgeon and the fish community of the Missouri River. The power analysis of the PSPAP presented here was conducted to guide sampling design and effort decisions. The PSPAP sampling design has a nested structure with multiple gear subsamples within a river bend. Power analyses were based on a normal linear mixed model, using a mixed cell means approach, with variance estimates from the original data. It was found that, at current effort levels, at least 20 years for pallid sturgeon and 10 years for shovelnose sturgeon are needed to detect a 5% annual decline. Modified bootstrap simulations suggest the power estimates from the original data are conservative due to excessive zero fish counts. In general, the approach presented is applicable to a wide array of animal monitoring programs.
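
    The flavor of such a power analysis can be reproduced with a much-simplified simulation (single-level lognormal catches rather than the paper's nested mixed model; all numbers illustrative):

    ```python
    import numpy as np
    from scipy import stats

    def power(years, decline=0.05, sd=0.6, n_sites=30, n_sims=2000, seed=4):
        rng = np.random.default_rng(seed)
        t = np.arange(years)
        mu = np.log(100) + t * np.log(1 - decline)   # log-catch declining 5%/yr
        hits = 0
        for _ in range(n_sims):
            y = rng.normal(mu, sd, size=(n_sites, years)).mean(axis=0)
            res = stats.linregress(t, y)
            hits += (res.slope < 0) and (res.pvalue < 0.05)
        return hits / n_sims

    for yrs in (5, 10, 20):
        print(f"{yrs:2d} years: power = {power(yrs):.2f}")
    ```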

  4. Equivalency principle for magnetoelectroelastic multiferroics with arbitrary microstructure: The phase field approach

    NASA Astrophysics Data System (ADS)

    Ni, Yong; He, Linghui; Khachaturyan, Armen G.

    2010-07-01

    A phase field method is proposed to determine the equilibrium fields of a magnetoelectroelastic multiferroic with arbitrarily distributed constitutive constants under applied loadings. This method is based on a generalized Eshelby equivalency principle, in which the elastic strain, electrostatic, and magnetostatic fields at equilibrium in the original heterogeneous system are exactly the same as those in an equivalent homogeneous magnetoelectroelastic coupled or uncoupled system with properly chosen distributed effective eigenstrain, polarization, and magnetization fields. Finding these effective fields fully solves the equilibrium elasticity, electrostatics, and magnetostatics in the original heterogeneous multiferroic. The paper formulates a variational principle proving that the effective fields are minimizers of an appropriate closed-form energy functional. The proposed phase field approach produces the energy-minimizing effective fields (and thus solves the general multiferroic problem) as the result of an artificial relaxation process described by the Ginzburg-Landau-Khalatnikov kinetic equations.

  5. Model-based adaptive 3D sonar reconstruction in reverberating environments.

    PubMed

    Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le

    2015-10-01

    In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side scan sonar arrays in complex and highly reverberating environments like shallow water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter, based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction of arrival trajectories of multiple echoes impinging the array. Echo tracking is perceived as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness of fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction.

  6. A skeleton family generator via physics-based deformable models.

    PubMed

    Krinidis, Stelios; Chatzis, Vassilios

    2009-01-01

    This paper presents a novel approach for object skeleton family extraction. The introduced technique utilizes a 2-D physics-based deformable model that parameterizes the object's shape. The deformation equations are solved by exploiting modal analysis and, depending on the model's physical characteristics, a different skeleton is produced each time, generating in this way a family of skeletons. The theoretical properties and the experiments presented demonstrate that the obtained skeletons match hand-labeled skeletons provided by human subjects, even in the presence of significant noise, shape variations, cuts and tears, and have the same topology as the original skeletons. In particular, the proposed approach produces no spurious branches without the need for any known skeleton pruning method.

  7. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin’ it REAL curriculum

    PubMed Central

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' it REAL (kiR) substance use prevention curriculum. Each of the ten 40–45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g. lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify a typology of delivery quality. Five profiles were identified: a holistic approach, an attentive teacher-oriented approach, an enthusiastic lecture approach, an engaged interactive learning approach and a skill-practice-only approach. This study provides a descriptive typology of delivery quality during implementation of a school-based substance use prevention intervention. PMID:25274721
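
    Latent profile analysis is commonly approximated with a Gaussian mixture over the indicator variables; a toy sketch with BIC-based selection of the number of profiles (synthetic ratings, not the study's data):

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(6)
    # 31 "teachers" x 4 indicators (e.g. attentiveness, enthusiasm, lecture, discussion)
    ratings = np.vstack([rng.normal(m, 0.4, size=(11, 4))
                         for m in ([4, 4, 2, 4], [3, 2, 4, 1], [2, 3, 3, 3])])[:31]

    models = {k: GaussianMixture(k, random_state=0).fit(ratings) for k in range(1, 6)}
    best_k = min(models, key=lambda k: models[k].bic(ratings))
    print(f"BIC favors {best_k} profiles; sizes:",
          np.bincount(models[best_k].predict(ratings)))
    ```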

  8. Observability of nonlinear dynamics: normalized results and a time-series approach.

    PubMed

    Aguirre, Luis A; Bastos, Saulo B; Alves, Marcela A; Letellier, Christophe

    2008-03-01

    This paper investigates the observability of nonlinear dynamical systems. Two difficulties associated with previous studies are dealt with. First, a normalized degree of observability is defined. This permits the comparison of different systems, which was not generally possible before. Second, a time-series approach is proposed, based on omnidirectional nonlinear correlation functions, to rank a set of time series of a system in terms of their potential use for reconstructing the original dynamics without requiring knowledge of the system equations. The two approaches proposed in this paper and a former method were applied to five benchmark systems, and an overall agreement of over 92% was found.

  9. [Investigation of fast filter of ECG signals with lifting wavelet and smooth filter].

    PubMed

    Li, Xuefei; Mao, Yuxing; He, Wei; Yang, Fan; Zhou, Liang

    2008-02-01

    The lifting wavelet transform is used to decompose the original ECG signals into low-frequency approximation signals and high-frequency detail signals, based on their frequency characteristics. Parts of the detail signals are discarded according to the frequency characteristics. To avoid distortion of the QRS complexes, the approximation signals are filtered by an adaptive smoothing filter with a proper threshold value. Through the inverse lifting wavelet transform, the retained approximation signals are reconstructed, and the three primary kinds of noise are suppressed effectively. In addition, the method is fast, and there is no time delay between input and output.
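
    A hedged sketch of the same filtering idea, using a standard DWT from PyWavelets as a stand-in for the lifting implementation: drop the finest detail bands, lightly smooth the approximation band, and reconstruct.

    ```python
    import numpy as np
    import pywt

    fs = 360
    t = np.arange(0, 2, 1 / fs)
    ecg = np.sin(2 * np.pi * 1.2 * t) + 0.2 * np.random.default_rng(5).normal(size=t.size)

    coeffs = pywt.wavedec(ecg, "db4", level=4)   # [cA4, cD4, cD3, cD2, cD1]
    coeffs[-1] = np.zeros_like(coeffs[-1])       # discard the two finest detail bands
    coeffs[-2] = np.zeros_like(coeffs[-2])
    kernel = np.ones(3) / 3.0                    # light moving-average smoothing of cA4
    coeffs[0] = np.convolve(coeffs[0], kernel, mode="same")
    denoised = pywt.waverec(coeffs, "db4")[: ecg.size]
    print(f"residual rms: {np.sqrt(np.mean((denoised - ecg) ** 2)):.3f}")
    ```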

  10. [Maxillary swing approach in the management of tumors in the central and lateral cranial base].

    PubMed

    Liao, Hua; Hua, Qing-quan; Wu, Zhan-yuan

    2006-04-01

    A retrospective review of seventeen patients who were operated on through the maxillary swing approach was carried out to assess the efficacy of this approach in the management of tumors of the central and lateral cranial base. From May 2000 to January 2005, 17 patients with primary or recurrent neoplasms involving the central or lateral cranial base underwent surgical resection via the maxillary swing approach. Ten patients were male and seven were female; the age range was 7 to 58 years, with a mean age of 42.6 years. Eight patients had tumors originally involving the lateral cranial base, and nine patients had tumors originating from the central cranial base. The pathology spectrum was very wide: five patients suffered from chordoma, two had rhabdomyosarcoma, two had squamous cell carcinoma, one had malignant fibrous histiocytoma, one had malignant melanoma, one had esthesioneuroblastoma, one had invasive hypophysoma, two had schwannoma, one had pleomorphic adenoma, and one had angiofibroma. Three patients had received previous surgery, two patients had had previous radiation therapy, and nine patients received postoperative radiotherapy. Sixteen of the seventeen patients had oncologically complete resection; one had near-total resection. This group of patients was followed up for 10 to 60 months, with a median follow-up time of 28 months. Two patients died 14 and 26 months after surgery, respectively, as a result of local recurrence and metastasis. One patient defaulted on follow-up 12 months after the operation, and the other 14 patients were alive at the time of analysis. Of the 12 malignant tumors, the 1- and 2-year survival rates were 91.67% and 72.92%, respectively. The facial wounds of all patients healed primarily, and there was no necrosis of the maxilla, damage to the temporal branch of the facial nerve, lower-lid ectropion, or facial deformity. Epiphora and facial hypoesthesia were detected in all patients. Four patients (23.5%) developed palatal fistula, ten patients developed serous otitis media (58.8%), and four patients developed a certain degree of trismus (23.5%). Cerebrospinal fluid leak occurred in two patients; they subsequently healed with conservative management. The maxillary swing approach is a proven method for access to the central and lateral skull base with good exposure and acceptable morbidity. Complications and sequelae associated with this approach include facial scarring, transection of the infraorbital nerve, impaired lacrimal drainage, eustachian tube dysfunction and serous otitis, palatal fistula, trismus, etc. Some procedures should be performed to reduce the incidence and severity of complications in the maxillary swing approach.

  11. A Cultural-Historical Approach to Learning and Teaching: New Perspectives on Advancing Development.

    ERIC Educational Resources Information Center

    Portes, Pedro R., Ed.

    1993-01-01

    This special issue is devoted to the cultural-historical school of thought about mental development based on the work of Lev Vygotsky. The research of Vygotsky addressed the sociocultural basis of higher-level cognitive functions, and ascribed an influential role to human speech and other mediational tools in originating changes in cognition and…

  12. Far infrared all-sky survey

    NASA Technical Reports Server (NTRS)

    Richards, Paul L.

    1991-01-01

    An all-sky survey at submillimeter waves is examined. Far-infrared all-sky surveys were performed using high-throughput bolometric detectors from a one-meter balloon telescope. Based on the large body of experience obtained with the original all-sky survey telescope, a number of radically different approaches were implemented. Continued balloon measurements of the spectrum of the cosmic microwave background were performed.

  13. Child Development Laboratory Schools as Generators of Knowledge in Early Childhood Education: New Models and Approaches

    ERIC Educational Resources Information Center

    McBride, Brent A.; Groves, Melissa; Barbour, Nancy; Horm, Diane; Stremmel, Andrew; Lash, Martha; Bersani, Carol; Ratekin, Cynthia; Moran, James; Elicker, James; Toussaint, Susan

    2012-01-01

    Research Findings: University-based child development laboratory programs have a long and rich history of supporting teaching, research, and outreach activities in the child development/early childhood education fields. Although these programs were originally developed in order to conduct research on children and families to inform policy and…

  14. Quantitative comparison of the absorption spectra of the gas mixtures in analogy to the criterion of Pearson

    NASA Astrophysics Data System (ADS)

    Kistenev, Yu. V.; Kuzmin, D. A.; Sandykova, E. A.; Shapovalov, A. V.

    2015-11-01

    An approach to reducing the space of absorption spectra, based on an original criterion for profile analysis of the spectra, is proposed. The criterion derives from Pearson's well-known chi-square test. The introduced criterion makes it possible to quantify the differences between spectral curves.
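    A minimal sketch of a chi-square-type profile comparison between two spectra sampled on a common wavenumber grid; the unit-area normalization, acceptance threshold, and synthetic spectra are illustrative assumptions, not the authors' exact criterion.

```python
import numpy as np

def pearson_profile_distance(s1, s2, eps=1e-12):
    # Normalize each spectrum to unit area so only the profile shape matters.
    p = s1 / (s1.sum() + eps)
    q = s2 / (s2.sum() + eps)
    # Chi-square-type discrepancy between the two normalized profiles.
    return np.sum((p - q) ** 2 / (p + q + eps))

rng = np.random.default_rng(0)
grid = np.linspace(900, 1100, 400)               # cm^-1, synthetic grid
pure = np.exp(-((grid - 1000) / 15) ** 2)        # toy absorption profile
mixture = pure + 0.2 * rng.random(grid.size)     # same gas plus interference
print(pearson_profile_distance(pure, mixture))   # small value -> similar profiles
```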

  15. Guanosine radical reactivity explored by pulse radiolysis coupled with transient electrochemistry.

    PubMed

    Latus, A; Alam, M S; Mostafavi, M; Marignier, J-L; Maisonhaute, E

    2015-06-04

    We follow the reactivity of a guanosine radical created by a radiolytic electron pulse both by spectroscopic and electrochemical methods. This original approach allows us to demonstrate that there is a competition between oxidation and reduction of these intermediates, an important result to further analyse the degradation or repair pathways of DNA bases.

  16. On the Constitutive Response Characterization for Composite Materials Via Data-Driven Design Optimization

    Treesearch

    John G. Michopoulos; John G. Hermanson; Athanasios Iliopoulos; Samuel Lambrakos; Tomonari Furukawa

    2011-01-01

    In the present paper we focus on demonstrating the use of design optimization for the constitutive characterization of anisotropic material systems such as polymer matrix composites, with or without damage. All approaches are based on the availability of experimental data originating from mechatronic material testing systems that can expose specimens to...

  17. An HBCU-Based Educational Approach for Black College Student Success: Toward a Framework with Implications for All Institutions

    ERIC Educational Resources Information Center

    Arroyo, Andrew T.; Gasman, Marybeth

    2014-01-01

    This conceptual study builds an institution-focused, non-Eurocentric, theoretical framework of black college student success. Specifically, the study synthesizes the relevant empirical research on the contributions historically black colleges and universities (HBCUs) have made for black student success, leading to an original model that all…

  18. Communicative Language Teaching Today. Portfolio Series #13

    ERIC Educational Resources Information Center

    Richards, Jack C.

    2005-01-01

    This booklet examines the methodology known as Communicative Language Teaching or CLT and explores the assumptions it is based on, its origins and evolution since it was first proposed in the 1970s, and how it has influenced approaches to language teaching today. It serves to review what has been learned from CLT and what its relevance is today. A…

  19. Building confidence and credibility into CAD with belief decision trees

    NASA Astrophysics Data System (ADS)

    Affenit, Rachael N.; Barns, Erik R.; Furst, Jacob D.; Rasin, Alexander; Raicu, Daniela S.

    2017-03-01

    Creating classifiers for computer-aided diagnosis in the absence of ground truth is a challenging problem. Using experts' opinions as reference truth is difficult because the variability in the experts' interpretations introduces uncertainty in the labeled diagnostic data. This uncertainty translates into noise, which can significantly affect the performance of any classifier on test data. To address this problem, we propose a new label set weighting approach to combine the experts' interpretations and their variability, as well as a selective iterative classification (SIC) approach based on conformal prediction. Using the NIH/NCI Lung Image Database Consortium (LIDC) dataset, in which four radiologists interpreted the lung nodule characteristics, including the degree of malignancy, we illustrate the benefits of the proposed approach. Our results show that the proposed weighted 2-label approach significantly outperforms the original 5-label and unweighted 2-label classification approaches in accuracy, by 39.9% and 7.6%, respectively. We also found that the weighted 2-label models produce higher skewness values, by 1.05 and 0.61 for non-SIC and SIC respectively, on the root mean square error (RMSE) distributions. When each approach was combined with selective iterative classification, this further improved the accuracy of the weighted 2-label classification by 7.5% over the original, and improved the skewness of the 5-label and unweighted 2-label approaches by 0.22 and 0.44, respectively.
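    A minimal sketch of one way to turn four radiologists' 5-point malignancy ratings into weighted 2-class labels; the rating-to-class mapping (1-2 benign, 4-5 malignant) and the agreement-based weighting are illustrative assumptions, not the paper's exact scheme, and the function name `weighted_two_label` is hypothetical.

```python
import numpy as np

def weighted_two_label(ratings):
    """ratings: array of shape (n_nodules, n_readers) with values 1..5."""
    ratings = np.asarray(ratings, dtype=float)
    votes_malignant = (ratings >= 4).mean(axis=1)   # fraction voting malignant
    votes_benign = (ratings <= 2).mean(axis=1)      # fraction voting benign
    label = (votes_malignant > votes_benign).astype(int)
    # Weight = degree of inter-reader agreement; disagreement lowers a
    # nodule's influence on the classifier's training loss.
    weight = np.abs(votes_malignant - votes_benign)
    return label, weight

labels, weights = weighted_two_label([[5, 4, 4, 3],   # near-consensus malignant
                                      [1, 2, 4, 5]])  # split opinions -> low weight
print(labels, weights)
```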

  20. Development of an Aeroelastic Code Based on an Euler/Navier-Stokes Aerodynamic Solver

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.; Srivastava, Rakesh; Keith, Theo G., Jr.; Stefko, George L.; Janus, Mark J.

    1996-01-01

    This paper describes the development of an aeroelastic code (TURBO-AE) based on an Euler/Navier-Stokes unsteady aerodynamic analysis. A brief review of the relevant research in the area of propulsion aeroelasticity is presented. The paper briefly describes the original Euler/Navier-Stokes code (TURBO) and then details the development of the aeroelastic extensions. The aeroelastic formulation is described. The modeling of the dynamics of the blade using a modal approach is detailed, along with the grid deformation approach used to model the elastic deformation of the blade. The work-per-cycle approach used to evaluate aeroelastic stability is described. Representative results used to verify the code are presented. The paper concludes with an evaluation of the development thus far, and some plans for further development and validation of the TURBO-AE code.

  1. Intranasal approach for manipulating the depressor septi nasi.

    PubMed

    Oh, Sang-Ha; Choi, Sangmun; Kim, Dong Woon; Jeong, Jae Yong

    2012-03-01

    A hyperactivated depressor septi nasi is an important factor contributing to nasal tip drooping. Although many studies have examined this, its treatment remains controversial. This study presents a surgical intervention based on an anatomic study. Ten fresh cadavers with large noses were used for the anatomic study. Between April 2008 and September 2010, 20 patients underwent surgical intervention for a hyperactivated depressor septi nasi. In all of the cadaver dissections, the depressor septi nasi was present, although it was difficult to identify the muscle clearly in 6 of the cadavers. We found that the depressor septi nasi in the other 4 cadavers consisted of 3 fascicles. The medial fascicles were divided into superficial and deep fibers. Both the deep and superficial fibers inserted into the dermocartilaginous ligament near the nasal tip. The superficial fibers, after interdigitating with the orbicularis oris, originated from the alveolar bone. The deep fibers originated at the anterior nasal spine. The intermediate fascicles inserted into the footplates of the medial crus and the caudal septum. After interdigitating with the medial fascicles and orbicularis oris, they also originated from the alveolar bone. The drooping nasal tips were improved in all cases using an intranasal approach to manipulate the depressor septi nasi. No specific complications were seen. Surgical intervention for a hyperactivated depressor septi nasi using an intranasal approach was a useful method for correcting a drooping nasal tip.

  2. [An anti-Taylor approach: the invention of a method for the cogovernance of health care institutions in order to produce freedom and compromise].

    PubMed

    Campos, G W

    1998-01-01

    This paper describes a new health care management method. A triangular confrontation system was constructed, based on a theoretical review, empirical facts observed from health services, and the researcher's knowledge, jointly analyzed. This new management model was termed 'health-team-focused collegiate management', entailing several original organizational concepts: production unity, matrix-based reference team, collegiate management system, cogovernance, and product/production interface.

  3. The Handbook of the Evolving Research of Transformative Learning: Based on the Learning Activities Survey (10th Anniversary Edition). Adult Education Special Topics--Theory, Research and Practice in LifeLong Learning

    ERIC Educational Resources Information Center

    King, Kathleen P., Ed.

    2009-01-01

    This handbook is a much expanded version of the original Learning Activities Survey published by Dr. Kathleen P. King of Fordham University in 1998. Based on her groundbreaking research in this field, where she used a mixed-methodology research approach to study transformative learning, the book will provide a model of research, firsthand…

  4. High-Performance THz Emitters Based on Ferromagnetic/Nonmagnetic Heterostructures.

    PubMed

    Wu, Yang; Elyasi, Mehrdad; Qiu, Xuepeng; Chen, Mengji; Liu, Yang; Ke, Lin; Yang, Hyunsoo

    2017-01-01

    A low-cost, intense, broadband, noise-resistant, magnetic-field-controllable, flexible, and low-power-driven THz emitter based on thin nonmagnetic/ferromagnetic metallic heterostructures is demonstrated. The THz emission originates from the inverse spin Hall effect. The proposed devices are not only promising for a wide range of THz equipment, but also offer an alternative approach to characterizing the spin-orbit interaction in nonmagnetic/ferromagnetic bilayers. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Knowledge-Based Vision Techniques for the Autonomous Land Vehicle Program

    DTIC Science & Technology

    1991-10-01

    Knowledge System. The CKS is an object-oriented knowledge database that was originally designed to serve as the central information manager for a... "Representation Space: An Approach to the Integration of Visual Information," Proc. of DARPA Image Understanding Workshop, Palo Alto, CA, pp. 263-272, May 1989... Strat, "Information Management in a Sensor-Based Autonomous System," Proc. DARPA Image Understanding Workshop, University of Southern CA, Vol. 1, pp...

  6. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods are described for designing protograph-based bit-interleaved coded modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two-stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
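    A minimal sketch of the second lifting stage: each entry of an intermediate protograph base matrix is replaced by a circulant (cyclically shifted identity) block to build the full parity-check matrix. The toy base matrix, shift values, and lifting factor are illustrative assumptions.

```python
import numpy as np

def circulant_lift(base, shifts, Z):
    """Replace each base-matrix entry with a ZxZ block: a cyclically shifted
    identity where base[i, j] == 1, and an all-zero block elsewhere."""
    m, n = base.shape
    H = np.zeros((m * Z, n * Z), dtype=np.uint8)
    I = np.eye(Z, dtype=np.uint8)
    for i in range(m):
        for j in range(n):
            if base[i, j]:
                H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] = np.roll(I, shifts[i, j], axis=1)
    return H

base = np.array([[1, 1, 1, 0],
                 [0, 1, 1, 1]])          # toy intermediate protograph
shifts = np.array([[0, 1, 2, 0],
                   [0, 3, 1, 2]])        # circulant shift per edge
H = circulant_lift(base, shifts, Z=8)    # 16 x 32 parity-check matrix
print(H.shape, int(H.sum()))             # sparse, structured LDPC matrix
```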

  7. Towards a street-level pollen concentration and exposure forecast

    NASA Astrophysics Data System (ADS)

    van der Molen, Michiel; Krol, Maarten; van Vliet, Arnold; Heuvelink, Gerard

    2015-04-01

    Atmospheric pollen is an increasing source of nuisance for people in industrialised countries and is associated with significant costs of medication and sick leave. Citizen pollen warnings are often based on emission maps derived from local temperature-sum approaches or on long-range atmospheric model approaches. In practice, locally observed pollen may originate both from local sources (plants in streets and gardens) and from long-range transport. We argue that making this distinction is relevant because the diurnal and spatial variation in pollen concentrations is much larger for pollen from local sources than for pollen from long-range transport, due to boundary layer processes. This may have an important impact on the exposure of citizens to pollen and on mitigation strategies. However, little is known about the partitioning of pollen into local and long-range origin categories. Our objective is to study how the concentrations of pollen from different sources vary temporally and spatially, and how the source region influences exposure and mitigation strategies. We built a Hay Fever Forecast system (HFF) based on WRF-chem, Allergieradar.nl, and geo-statistical downscaling techniques. HFF distinguishes between local sources (individual trees) and regional sources (based on tree distribution maps). We show first results on how the diurnal variation of pollen concentrations depends on source proximity. Ultimately, we will compare the model with local pollen counts, patient nuisance scores and medicine use.

  8. A multi-scale tensor voting approach for small retinal vessel segmentation in high resolution fundus images.

    PubMed

    Christodoulidis, Argyrios; Hurtut, Thomas; Tahar, Houssem Ben; Cheriet, Farida

    2016-09-01

    Segmenting the retinal vessels from fundus images is a prerequisite for many CAD systems for the automatic detection of diabetic retinopathy lesions. So far, research efforts have concentrated mainly on the accurate localization of the large to medium diameter vessels. However, failure to detect the smallest vessels at the segmentation step can lead to false positive lesion detection counts in a subsequent lesion analysis stage. In this study, a new hybrid method for the segmentation of the smallest vessels is proposed. Line detection and perceptual organization techniques are combined in a multi-scale scheme. Small vessels are reconstructed from the perceptual-based approach via tracking and pixel painting. The segmentation was validated in a high resolution fundus image database including healthy and diabetic subjects using pixel-based as well as perceptual-based measures. The proposed method achieves 85.06% sensitivity rate, while the original multi-scale line detection method achieves 81.06% sensitivity rate for the corresponding images (p<0.05). The improvement in the sensitivity rate for the database is 6.47% when only the smallest vessels are considered (p<0.05). For the perceptual-based measure, the proposed method improves the detection of the vasculature by 7.8% against the original multi-scale line detection method (p<0.05). Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Hardware in the Loop Performance Assessment of LIDAR-Based Spacecraft Pose Determination

    PubMed Central

    Fasano, Giancarmine; Grassi, Michele

    2017-01-01

    In this paper an original, easy to reproduce, semi-analytic calibration approach is developed for hardware-in-the-loop performance assessment of pose determination algorithms processing point cloud data, collected by imaging a non-cooperative target with LIDARs. The laboratory setup includes a scanning LIDAR, a monocular camera, a scaled-replica of a satellite-like target, and a set of calibration tools. The point clouds are processed by uncooperative model-based algorithms to estimate the target relative position and attitude with respect to the LIDAR. Target images, acquired by a monocular camera operated simultaneously with the LIDAR, are processed applying standard solutions to the Perspective-n-Points problem to get high-accuracy pose estimates which can be used as a benchmark to evaluate the accuracy attained by the LIDAR-based techniques. To this aim, a precise knowledge of the extrinsic relative calibration between the camera and the LIDAR is essential, and it is obtained by implementing an original calibration approach which does not need ad-hoc homologous targets (e.g., retro-reflectors) easily recognizable by the two sensors. The pose determination techniques investigated by this work are of interest to space applications involving close-proximity maneuvers between non-cooperative platforms, e.g., on-orbit servicing and active debris removal. PMID:28946651

  10. Hardware in the Loop Performance Assessment of LIDAR-Based Spacecraft Pose Determination.

    PubMed

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2017-09-24

    In this paper an original, easy to reproduce, semi-analytic calibration approach is developed for hardware-in-the-loop performance assessment of pose determination algorithms processing point cloud data, collected by imaging a non-cooperative target with LIDARs. The laboratory setup includes a scanning LIDAR, a monocular camera, a scaled-replica of a satellite-like target, and a set of calibration tools. The point clouds are processed by uncooperative model-based algorithms to estimate the target relative position and attitude with respect to the LIDAR. Target images, acquired by a monocular camera operated simultaneously with the LIDAR, are processed applying standard solutions to the Perspective-n-Points problem to get high-accuracy pose estimates which can be used as a benchmark to evaluate the accuracy attained by the LIDAR-based techniques. To this aim, a precise knowledge of the extrinsic relative calibration between the camera and the LIDAR is essential, and it is obtained by implementing an original calibration approach which does not need ad-hoc homologous targets (e.g., retro-reflectors) easily recognizable by the two sensors. The pose determination techniques investigated by this work are of interest to space applications involving close-proximity maneuvers between non-cooperative platforms, e.g., on-orbit servicing and active debris removal.

  11. Towards the automated identification of Chrysomya blow flies from wing images.

    PubMed

    Macleod, N; Hall, M J R; Wardhana, A H

    2018-04-15

    The Old World screwworm fly (OWSF), Chrysomya bezziana (Diptera: Calliphoridae), is an important agent of traumatic myiasis and, as such, a major human and animal health problem. In the implementation of OWSF control operations, it is important to determine the geographical origins of such disease-causing species in order to establish whether they derive from endemic or invading populations. Gross morphological and molecular studies have demonstrated the existence of two distinct lineages of this species, one African and the other Asian. Wing morphometry is known to be of substantial assistance in identifying the geographical origin of individuals because it provides diagnostic markers that complement molecular diagnostics. However, placement of the landmarks used in traditional geometric morphometric analysis can be time-consuming and subject to error caused by operator subjectivity. Here we report results of an image-based approach to geometric morphometric analysis for delivering wing-based identifications. Our results indicate that this approach can produce identifications that are practically indistinguishable from more traditional landmark-based results. In addition, we demonstrate that the direct analysis of digital wing images can be used to discriminate between three Chrysomya species of veterinary and forensic importance and between C. bezziana genders. © 2018 The Trustees of the Natural History Museum, London. Medical and Veterinary Entomology © 2018 Royal Entomological Society.

  12. High-dynamic range imaging techniques based on both color-separation algorithms used in conventional graphic arts and the human visual perception modeling

    NASA Astrophysics Data System (ADS)

    Lo, Mei-Chun; Hsieh, Tsung-Hsien; Perng, Ruey-Kuen; Chen, Jiong-Qiao

    2010-01-01

    The aim of this research is to derive an illuminant-independent type of HDR imaging module which can optimally reconstruct, multispectrally, every color concerned in high-dynamic-range original images for preferable cross-media color reproduction applications. Each module, based on either a broadband or a multispectral approach, incorporates models of perceptual HDR tone mapping and device characterization. In this study, an xvYCC-format HDR digital camera was used to capture HDR scene images for testing. A tone-mapping module was derived based on a multiscale representation of the human visual system, using equations similar to the Michaelis-Menten photoreceptor adaptation equation. Additionally, an adaptive bilateral type of gamut-mapping algorithm, using a previously derived multiple-converging-points approach, was incorporated with or without adaptive unsharp masking (USM) to carry out the optimization of HDR image rendering. An LCD with the standard color space of Adobe RGB (D65) was used as a soft-proofing platform to display and represent the HDR original RGB images, and also to evaluate both the rendition quality and the prediction performance of the derived modules. Also, another LCD with the standard color space of sRGB was used to test the gamut-mapping algorithms integrated with the derived tone-mapping module.
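    A minimal sketch of a photoreceptor-style tone-mapping operator of the Michaelis-Menten (Naka-Rushton) form mentioned above; the exponent and the global (rather than multiscale) adaptation level are illustrative assumptions, not the derived module's parameters.

```python
import numpy as np

def photoreceptor_tonemap(luminance, n=0.73):
    # Semi-saturation set from the scene's adaptation level, estimated here
    # as the geometric mean of the luminance map.
    sigma = np.exp(np.mean(np.log(luminance + 1e-6)))
    # Michaelis-Menten-type compression: R = L^n / (L^n + sigma^n).
    return luminance**n / (luminance**n + sigma**n)

rng = np.random.default_rng(0)
hdr = rng.lognormal(mean=0.0, sigma=2.0, size=(4, 4))   # toy HDR luminance
print(photoreceptor_tonemap(hdr).round(3))              # compressed into (0, 1)
```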

  13. A naive Bayes algorithm for tissue origin diagnosis (TOD-Bayes) of synchronous multifocal tumors in the hepatobiliary and pancreatic system.

    PubMed

    Jiang, Weiqin; Shen, Yifei; Ding, Yongfeng; Ye, Chuyu; Zheng, Yi; Zhao, Peng; Liu, Lulu; Tong, Zhou; Zhou, Linfu; Sun, Shuo; Zhang, Xingchen; Teng, Lisong; Timko, Michael P; Fan, Longjiang; Fang, Weijia

    2018-01-15

    Synchronous multifocal tumors are common in the hepatobiliary and pancreatic system, but because of similarities in their histological features, oncologists have difficulty in identifying their precise tissue clonal origin through routine histopathological methods. To address this problem and assist in more precise diagnosis, we developed a computational approach for tissue origin diagnosis based on the naive Bayes algorithm (TOD-Bayes) using ubiquitous RNA-Seq data. Massive tissue-specific RNA-Seq data sets were first obtained from The Cancer Genome Atlas (TCGA), and ∼1,000 feature genes were used to train and validate the TOD-Bayes algorithm. The accuracy of the model was >95% based on tenfold cross validation with the TCGA data. A total of 18 clinical cancer samples (including six negative controls) with definitive tissue origin were subsequently used for external validation, and 17 of the 18 samples were classified correctly in our study (94.4%). Furthermore, we included as case studies seven tumor samples, taken from two individuals who suffered from synchronous multifocal tumors across tissues, for whom efforts to make a definitive primary cancer diagnosis by traditional diagnostic methods had failed. Using our TOD-Bayes analysis, the two clinical test cases were successfully diagnosed as pancreatic cancer (PC) and cholangiocarcinoma (CC), respectively, in agreement with their clinical outcomes. Based on our findings, we believe that the TOD-Bayes algorithm is a powerful novel methodology to accurately identify the tissue origin of synchronous multifocal tumors of unknown primary cancers using RNA-Seq data, and an important step toward more precision-based medicine in cancer diagnosis and treatment. © 2017 UICC.
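    A minimal sketch of naive-Bayes tissue-of-origin classification from expression features, assuming scikit-learn; the synthetic data, the Gaussian variant, and the feature count are illustrative stand-ins for the TCGA-trained TOD-Bayes model.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_genes = 50, 100
tissues = ["liver", "bile_duct", "pancreas"]
# Each tissue gets its own mean expression signature plus noise.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_genes))
               for i in range(len(tissues))])
y = np.repeat(tissues, n_per_class)

clf = GaussianNB()
print(cross_val_score(clf, X, y, cv=10).mean())   # tenfold cross-validation
clf.fit(X, y)
print(clf.predict(X[:1]))                         # tissue call for one sample
```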

  14. Moderated, Water-Based, Condensational Particle Growth in a Laminar Flow

    PubMed Central

    Hering, Susanne V.; Spielman, Steven R.; Lewis, Gregory S.

    2014-01-01

    Presented is a new approach for laminar-flow water condensation that produces saturations above 1.5 while maintaining temperatures below 30°C in the majority of the flow and providing an exiting dew point below 15°C. With the original laminar-flow water condensation method, particle activation and growth occur in a region with warm, wetted walls throughout, which has the side effect of heating the flow. The “moderated” approach presented here replaces this warm region with two sections: a short, warm, wet-walled “initiator”, followed by a cool-walled “moderator”. The initiator provides the water vapor that creates the supersaturation, while the moderator provides the time for particle growth. The combined length of the initiator and moderator sections is the same as that of the original, warm-walled growth section. Model results show that this new approach reduces the added heat and water vapor while achieving the same peak supersaturation and similar droplet growth. Experimental measurements confirm the trends predicted by the modeling. PMID:24839342

  15. Phenotypic screening in cancer drug discovery - past, present and future.

    PubMed

    Moffat, John G; Rudolph, Joachim; Bailey, David

    2014-08-01

    There has been a resurgence of interest in the use of phenotypic screens in drug discovery as an alternative to target-focused approaches. Given that oncology is currently the most active therapeutic area, and also one in which target-focused approaches have been particularly prominent in the past two decades, we investigated the contribution of phenotypic assays to oncology drug discovery by analysing the origins of all new small-molecule cancer drugs approved by the US Food and Drug Administration (FDA) over the past 15 years and those currently in clinical development. Although the majority of these drugs originated from target-based discovery, we identified a significant number whose discovery depended on phenotypic screening approaches. We postulate that the contribution of phenotypic screening to cancer drug discovery has been hampered by a reliance on 'classical' nonspecific drug effects such as cytotoxicity and mitotic arrest, exacerbated by a paucity of mechanistically defined cellular models for therapeutically translatable cancer phenotypes. However, technical and biological advances that enable such mechanistically informed phenotypic models have the potential to empower phenotypic drug discovery in oncology.

  16. A bat algorithm with mutation for UCAV path planning.

    PubMed

    Wang, Gaige; Guo, Lihong; Duan, Hong; Liu, Luo; Wang, Heqi

    2012-01-01

    Path planning for an uninhabited combat air vehicle (UCAV) is a complicated high-dimensional optimization problem, which mainly centers on optimizing the flight route considering different kinds of constraints under complicated battlefield environments. The original bat algorithm (BA) is used to solve the UCAV path planning problem. Furthermore, a new bat algorithm with mutation (BAM) is proposed, in which a mutation operation between bats is applied during the updating of new solutions. The UCAV can then find a safe path by connecting the chosen coordinate nodes while avoiding the threat areas and minimizing fuel cost. This new approach accelerates the global convergence speed while preserving the strong robustness of the basic BA. The realization procedures for the original BA and the improved metaheuristic BAM are also presented. To prove the performance of the proposed metaheuristic method, BAM is compared with BA and other population-based optimization methods, such as ACO, BBO, DE, ES, GA, PBIL, PSO, and SGA. The experiments show that the proposed approach is more effective and feasible for UCAV path planning than the other models.
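    A minimal sketch of a bat algorithm with an added between-bat mutation step, run on a generic continuous objective standing in for the UCAV path cost; all parameter values, the mutation operator, and the test function are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def bam(obj, dim=10, n_bats=30, iters=200, fmin=0.0, fmax=2.0,
        loudness=0.9, pulse_rate=0.5, pm=0.1, lb=-5.0, ub=5.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n_bats, dim))
    v = np.zeros((n_bats, dim))
    fit = np.array([obj(b) for b in x])
    best_i = int(fit.argmin())
    best, best_fit = x[best_i].copy(), fit[best_i]

    for _ in range(iters):
        for i in range(n_bats):
            # Standard BA update: frequency, velocity, position.
            f = fmin + (fmax - fmin) * rng.random()
            v[i] += (x[i] - best) * f
            cand = np.clip(x[i] + v[i], lb, ub)
            # Occasional local random walk around the current best bat.
            if rng.random() > pulse_rate:
                cand = np.clip(best + 0.01 * rng.normal(size=dim), lb, ub)
            # Mutation between bats: perturb with the difference of two
            # randomly chosen bats (the BAM modification).
            if rng.random() < pm:
                a, b = rng.choice(n_bats, size=2, replace=False)
                cand = np.clip(cand + 0.5 * (x[a] - x[b]), lb, ub)
            fc = obj(cand)
            # Accept improving solutions with probability tied to loudness.
            if fc < fit[i] and rng.random() < loudness:
                x[i], fit[i] = cand, fc
                if fc < best_fit:
                    best, best_fit = cand.copy(), fc
    return best, best_fit

sphere = lambda z: float(np.sum(z * z))   # stand-in for the UCAV path cost
print(bam(sphere)[1])                     # near-zero after optimization
```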

  17. Developmental Origins of Health and Disease: Environmental Exposures

    PubMed Central

    Swanson, James M.; Entringer, Sonja; Buss, Claudia; Wadhwa, Pathik D.

    2010-01-01

    The developmental origins of health and disease (DOHaD) approach has evolved over the past 20 years, and the current hypothesis proposes that fetal adaptations to intrauterine and maternal conditions during development shape structure and function of organs. Here we present a review of some environmental exposures that may trigger fetal maladaptations in these processes, including three examples: exposures to tobacco smoke, antidepressant medication, and folic acid deficits in the food supply. We provide a selected review of current research on the effects of each of these exposures on fetal development and birth outcomes, and use the DOHaD approach to suggest how these exposures may alter long-term outcomes. In the interpretation of this literature, we review the evidence of gene–environment interactions based on evaluation of biological pathways and evidence that some exposures to the fetus may be moderated by maternal and fetal genotypes. Finally, we use the design of the National Children’s Study (now in progress) to propose how the DOHaD approach could be used to address questions that have emerged in this area that are relevant to reproductive medicine and subsequent health outcomes. PMID:19711249

  18. Mapping groundwater contamination risk of multiple aquifers using multi-model ensemble of machine learning algorithms.

    PubMed

    Barzegar, Rahim; Moghaddam, Asghar Asghari; Deo, Ravinesh; Fijani, Elham; Tziritis, Evangelos

    2018-04-15

    Constructing accurate and reliable groundwater risk maps provides scientifically prudent and strategic measures for the protection and management of groundwater. The objectives of this paper are to design and validate machine learning-based risk maps using ensemble-based modelling with an integrative approach. We employ extreme learning machines (ELM), multivariate adaptive regression splines (MARS), M5 Tree and support vector regression (SVR), applied to multiple aquifer systems (e.g. unconfined, semi-confined and confined) in the Marand plain, North West Iran, to encapsulate the merits of the individual learning algorithms in a final committee-based ANN model. The DRASTIC Vulnerability Index (VI) ranged from 56.7 to 128.1, categorized with no-risk, low and moderate vulnerability thresholds. The correlation coefficient (r) and Willmott's Index (d) between NO3 concentrations and VI were 0.64 and 0.314, respectively. To introduce improvements over the original DRASTIC method, the vulnerability indices were adjusted by NO3 concentrations, termed the groundwater contamination risk (GCR). Seven DRASTIC parameters served as inputs and GCR values as outputs of the individual machine learning models feeding the fully optimized committee-based ANN predictive model. The correlation indicators demonstrated that the ELM and SVR models outperformed the MARS and M5 Tree models, by virtue of larger d and r values. Subsequently, the r and d metrics for the committee-based ANN multi-model in the testing phase were 0.8889 and 0.7913, respectively, revealing the superiority of the integrated (or ensemble) machine learning models when compared with the original DRASTIC approach. The newly designed multi-model ensemble-based approach can be considered a pragmatic step for mapping groundwater contamination risks of multiple aquifer systems with multi-model techniques, yielding the high accuracy of the committee-based ANN model. Copyright © 2017 Elsevier B.V. All rights reserved.
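    A minimal sketch of the committee idea: individual regressors are trained on DRASTIC-style inputs, and a small neural network combines their predictions. scikit-learn's SVR and a decision tree stand in for the paper's SVR/ELM/MARS/M5 members, and the synthetic data is an illustrative assumption.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((300, 7))                              # seven DRASTIC parameters
y = X @ rng.random(7) + 0.1 * rng.normal(size=300)    # toy contamination risk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
members = [SVR(), DecisionTreeRegressor(max_depth=5, random_state=0)]
for m in members:
    m.fit(X_tr, y_tr)

# Stack member predictions as inputs to the committee network.
P_tr = np.column_stack([m.predict(X_tr) for m in members])
P_te = np.column_stack([m.predict(X_te) for m in members])
committee = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
committee.fit(P_tr, y_tr)
print(np.corrcoef(committee.predict(P_te), y_te)[0, 1])   # r on the test split
```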

  19. Pulpal perio relations: Interdisciplinary diagnostic approach - I

    PubMed Central

    Nirola, Ashutosh; Grover, Sunanda; Sharma, Ajay; Kaur, Damanjeet

    2011-01-01

    Lesions of pulpal and periodontal origin may perpetuate from either the infections of dental pulp or periodontium or alveolar bone. This review focuses on interdisciplinary diagnostic approach towards lesions of periodontal or endodontic origin. PMID:21772729

  20. An improved method for measuring the magnetic inhomogeneity shift in hydrogen masers

    NASA Technical Reports Server (NTRS)

    Reinhardt, V. S.; Peters, H. E.

    1975-01-01

    The reported method makes it possible to conduct all maser frequency measurements under conditions of low magnetic field intensity for which the hydrogen maser is most stable. Aspects concerning the origin of the magnetic inhomogeneity shift are examined and the available approaches for measuring this shift are considered, taking into account certain drawbacks of currently used methods. An approach free of these drawbacks can be based on the measurement of changes in a parameter representing the difference between the number of atoms in the involved states.

  1. Rescheduling with iterative repair

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Davis, Eugene; Daun, Brian; Deale, Michael

    1992-01-01

    This paper presents a new approach to rescheduling called constraint-based iterative repair. This approach gives our system the ability to satisfy domain constraints, address optimization concerns, minimize perturbation to the original schedule, and produce modified schedules quickly. The system begins with an initial, flawed schedule and then iteratively repairs constraint violations until a conflict-free schedule is produced. In an empirical demonstration, we vary the importance of minimizing perturbation and report how fast the system is able to resolve conflicts in a given time bound. These experiments were performed within the domain of Space Shuttle ground processing.
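    A minimal sketch of constraint-based iterative repair on a toy one-resource schedule: tasks may not overlap, and each repair moves the task that has so far been perturbed least from its original start time. The task data and repair rule are illustrative assumptions, not the Space Shuttle system's constraint model.

```python
tasks = {"A": 0, "B": 1, "C": 2, "D": 2}    # task -> start slot (flawed schedule)
DURATION = 2                                 # every task occupies 2 slots

def conflicts(sched):
    """Pairs of tasks that overlap on the single shared resource."""
    items = list(sched.items())
    return [(t1, t2) for i, (t1, s1) in enumerate(items)
            for t2, s2 in items[i + 1:] if abs(s1 - s2) < DURATION]

original = dict(tasks)
for _ in range(100):                         # bounded repair loop
    cs = conflicts(tasks)
    if not cs:
        break                                # conflict-free schedule reached
    t1, t2 = cs[0]
    # Repair rule: move whichever task has been perturbed least so far,
    # shifting it one slot away from its conflict partner (slots may go
    # negative in this toy; a real scheduler would respect a horizon).
    mover, anchor = ((t1, t2) if abs(tasks[t1] - original[t1])
                     <= abs(tasks[t2] - original[t2]) else (t2, t1))
    tasks[mover] += 1 if tasks[mover] >= tasks[anchor] else -1

print(tasks, "conflicts:", conflicts(tasks),
      "perturbation:", sum(abs(tasks[t] - original[t]) for t in tasks))
```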

  2. Considerations on non equilibrium thermodynamics of interactions

    NASA Astrophysics Data System (ADS)

    Lucia, Umberto

    2016-04-01

    Nature can be considered the "first" engineer! For scientists and engineers, the dynamics and evolution of complex systems are not easy to predict. A fundamental approach to studying complex systems is thermodynamics. But the result has been the origin of too many schools of thermodynamics, with a consequent difficulty of communication between thermodynamicists and other scientists and, also, among themselves. The solution is to obtain a unified approach based on the fundamentals of physics. Here we suggest a possible unification of the schools of thermodynamics, starting from two fundamental concepts of physics: interaction and flows.

  3. Interactively Open Autonomy Unifies Two Approaches to Function

    NASA Astrophysics Data System (ADS)

    Collier, John

    2004-08-01

    Functionality is essential to any form of anticipation beyond simple directedness at an end. In the literature on function in biology, there are two distinct approaches. One, the etiological view, places the origin of function in selection, while the other, the organizational view, individuates function by organizational role. Both approaches have well-known advantages and disadvantages. I propose a reconciliation of the two approaches, based on an interactivist approach to the individuation and stability of organisms. The approach was suggested by Kant in the Critique of Judgment but, since it requires, on his account, the identification of a new form of causation, it has not been accessible by analytical techniques. I proceed by constructing the required concept to fit certain design requirements. This construction builds on concepts introduced in my previous four talks to these meetings.

  4. A BPF-FBP tandem algorithm for image reconstruction in reverse helical cone-beam CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, Seungryong; Xia, Dan; Pellizzari, Charles A.

    2010-01-15

    Purpose: Reverse helical cone-beam computed tomography (CBCT) is a scanning configuration for potential applications in image-guided radiation therapy in which an accurate anatomic image of the patient is needed for image-guidance procedures. The authors previously developed an algorithm for image reconstruction from nontruncated data of an object that is completely within the reverse helix. The purpose of this work is to develop an image reconstruction approach for reverse helical CBCT of a long object that extends out of the reverse helix and therefore constitutes data truncation. Methods: The proposed approach comprises two reconstruction steps. In the first step, a chord-based backprojection-filtration (BPF) algorithm reconstructs a volumetric image of an object from the original cone-beam data. Because there exists a chordless region in the middle of the reverse helix, the image obtained in the first step contains an unreconstructed central-gap region. In the second step, the gap region is reconstructed by use of a Pack-Noo-formula-based filtered backprojection (FBP) algorithm from the modified cone-beam data obtained by subtracting from the original cone-beam data the reprojection of the image reconstructed in the first step. Results: The authors have performed numerical studies to validate the proposed approach in image reconstruction from reverse helical cone-beam data. The results confirm that the proposed approach can reconstruct accurate images of a long object without suffering from data-truncation artifacts or cone-angle artifacts. Conclusions: The authors developed and validated a BPF-FBP tandem algorithm to reconstruct images of a long object from reverse helical cone-beam data. The chord-based BPF algorithm was utilized to convert the long-object problem into a short-object problem. The proposed approach is applicable to other scanning configurations such as reduced circular sinusoidal trajectories.

  5. Putting knowledge to work: a new approach.

    PubMed

    Evans, Karen; Guile, David; Harris, Judy; Allan, Helen

    2010-04-01

    Approaches to the longstanding challenges of 'integrating' subject-based and work-based knowledge have typically focused on questions of how learning can be 'transferred' from one setting to another, relating the assumed 'abstract' nature of theory to the assumed 'real' nature of practice. This is often seen as a single movement as encapsulated in the term 'from theory to practice'. The authors have developed a fresh approach that concentrates on different forms of knowledge and the ways in which these are contextualised and 're-contextualised' in movements between different sites of learning in colleges and workplaces. While the research has been carried out in a range of professional fields outside nursing, the arguments put forward by the authors are relevant to continuing debates within nursing around the theory-practice gap. The aim has been to explore how the subject-based and work-based aspects of a curriculum or learning programme can articulate with one another more effectively. The potential of the 're-contextualisation' approach for nurse education is outlined, with a view to further research. The original research was sponsored by the London Chamber of Commerce and Industry Commercial Education Trust and the Economic and Social Research Council Teaching and Learning Research Programme.

  6. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    PubMed Central

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues have largely been addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient for offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
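    A minimal sketch of the zero-watermarking idea: a watermark is derived from intrinsic features of the cover text and bound to the owner's key, so nothing is embedded in (and nothing changes) the text itself. The feature choice and the HMAC binding are illustrative assumptions, not the paper's scheme.

```python
import hashlib
import hmac

def zero_watermark(text: str, owner_key: bytes) -> str:
    # Intrinsic features: word lengths and first letters act as the
    # text's structural fingerprint; the text is never modified.
    features = "".join(f"{len(w)}{w[0]}" for w in text.split())
    return hmac.new(owner_key, features.encode(), hashlib.sha256).hexdigest()

key = b"owner-secret"
doc = "The quick brown fox jumps over the lazy dog"
registered = zero_watermark(doc, key)          # lodged with a trusted authority

tampered = doc.replace("lazy", "sleepy")
print(zero_watermark(doc, key) == registered)       # True: authentic
print(zero_watermark(tampered, key) == registered)  # False: tampering detected
```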

  7. Probabilistic prediction of barrier-island response to hurricanes

    USGS Publications Warehouse

    Plant, Nathaniel G.; Stockdon, Hilary F.

    2012-01-01

    Prediction of barrier-island response to hurricane attack is important for assessing the vulnerability of communities, infrastructure, habitat, and recreational assets to the impacts of storm surge, waves, and erosion. We have demonstrated that a conceptual model intended to make qualitative predictions of the type of beach response to storms (e.g., beach erosion, dune erosion, dune overwash, inundation) can be reformulated in a Bayesian network to make quantitative predictions of the morphologic response. In an application of this approach at Santa Rosa Island, FL, predicted dune-crest elevation changes in response to Hurricane Ivan explained about 20% to 30% of the observed variance. An extended Bayesian network based on the original conceptual model, which included dune elevations, storm surge, and swash, but with the addition of beach and dune widths as input variables, showed improved skill compared to the original model, explaining 70% of dune elevation change variance and about 60% of dune and shoreline position change variance. This probabilistic approach accurately represented prediction uncertainty (measured with the log likelihood ratio), and it outperformed the baseline prediction (i.e., the prior distribution based on the observations). Finally, sensitivity studies demonstrated that degrading the resolution of the Bayesian network or removing data from the calibration process reduced the skill of the predictions by 30% to 40%. The reduction in skill did not change conclusions regarding the relative importance of the input variables, and the extended model's skill always outperformed the original model.

  8. The narrow endemic Norwegian peat moss Sphagnum troendelagicum originated before the last glacial maximum

    PubMed Central

    Stenøien, H K; Shaw, A J; Stengrundet, K; Flatberg, K I

    2011-01-01

    It is commonly found that individual hybrid, polyploid species originate recurrently and that many polyploid species originated relatively recently. It has been previously hypothesized that the extremely rare allopolyploid peat moss Sphagnum troendelagicum has originated multiple times, possibly after the last glacial maximum in Scandinavia. This conclusion was based on low linkage disequilibrium in anonymous genetic markers within natural populations, in which sexual reproduction has never been observed. Here we employ microsatellite markers and chloroplast DNA (cpDNA)-encoded trnG sequence data to test hypotheses concerning the origin and evolution of this species. We find that S. tenellum is the maternal progenitor and S. balticum is the paternal progenitor of S. troendelagicum. Using various Bayesian approaches, we estimate that S. troendelagicum originated before the Holocene but not before c. 80 000 years ago (median expected time since speciation 40 000 years before present). The observed lack of complete linkage disequilibrium in the genome of this species suggests cryptic sexual reproduction and recombination. Several lines of evidence suggest multiple origins for S. troendelagicum, but a single origin is supported by approximate Bayesian computation analyses. We hypothesize that S. troendelagicum originated in a peat-dominated refugium before last glacial maximum, and subsequently immigrated to central Norway by means of spore flow during the last thousands of years. PMID:20717162

  9. From the Binet-Simon to the Wechsler-Bellevue: tracing the history of intelligence testing.

    PubMed

    Boake, Corwin

    2002-05-01

    The history of David Wechsler's intelligence scales is reviewed by tracing the origins of the subtests in the 1939 Wechsler-Bellevue Intelligence Scale. The subtests originated from tests developed between 1880 and World War I and were based on approaches to mental testing including anthropometrics, association psychology, the Binet-Simon scales, language-free performance testing of immigrants and school children, and group testing of military recruits. Wechsler's subtest selection can be understood partly from his clinical experiences during World War I. The structure of the Wechsler-Bellevue Scale, which introduced major innovations in intelligence testing, has remained almost unchanged through later revisions.

  10. Multichannel blind deconvolution of spatially misaligned images.

    PubMed

    Sroubek, Filip; Flusser, Jan

    2005-07-01

    Existing multichannel blind restoration techniques assume perfect spatial alignment of channels, correct estimation of blur size, and are prone to noise. We developed an alternating minimization scheme based on a maximum a posteriori estimation with a priori distribution of blurs derived from the multichannel framework and a priori distribution of original images defined by the variational integral. This stochastic approach enables us to recover the blurs and the original image from channels severely corrupted by noise. We observe that the exact knowledge of the blur size is not necessary, and we prove that translation misregistration up to a certain extent can be automatically removed in the restoration process.

  11. Land cover classification of Landsat 8 satellite data based on Fuzzy Logic approach

    NASA Astrophysics Data System (ADS)

    Taufik, Afirah; Sakinah Syed Ahmad, Sharifah

    2016-06-01

    The aim of this paper is to propose a method to classify the land cover of a satellite image based on a fuzzy rule-based system approach. The study uses bands of Landsat 8 and derived indices, such as the Normalized Difference Water Index (NDWI), Normalized Difference Built-up Index (NDBI) and Normalized Difference Vegetation Index (NDVI), as input to the fuzzy inference system. The three selected indices represent our three main classes, called water, built-up land, and vegetation. The combination of the original multispectral bands and the selected indices provides more information about the image. The parameter selection for the fuzzy memberships is performed using a supervised method known as ANFIS (adaptive neuro-fuzzy inference system) training. The fuzzy system is tested on the classification of a land cover image covering the Klang Valley area. The results showed that the fuzzy system approach is effective and can be explored and implemented for other areas of Landsat data.
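    A minimal sketch of index-based fuzzy classification into water, built-up land, and vegetation; the triangular membership breakpoints and the winner-take-all rule base are illustrative assumptions, not the ANFIS-trained parameters of the study.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def classify(green, red, nir, swir):
    ndwi = (green - nir) / (green + nir + 1e-12)
    ndvi = (nir - red) / (nir + red + 1e-12)
    ndbi = (swir - nir) / (swir + nir + 1e-12)
    memberships = np.stack([
        tri(ndwi, 0.0, 0.6, 1.0),    # rule: high NDWI -> water
        tri(ndbi, 0.0, 0.5, 1.0),    # rule: high NDBI -> built-up land
        tri(ndvi, 0.2, 0.7, 1.0),    # rule: high NDVI -> vegetation
    ])
    classes = np.array(["water", "built-up", "vegetation"])
    return classes[np.argmax(memberships, axis=0)]   # winner-take-all rule

# One toy pixel per class: (green, red, nir, swir) reflectances.
pixels = np.array([[0.30, 0.20, 0.05, 0.03],    # water
                   [0.18, 0.20, 0.22, 0.30],    # built-up
                   [0.10, 0.08, 0.50, 0.20]])   # vegetation
print(classify(*pixels.T))
```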

  12. Local Geometry and Evolutionary Conservation of Protein Surfaces Reveal the Multiple Recognition Patches in Protein-Protein Interactions

    PubMed Central

    Laine, Elodie; Carbone, Alessandra

    2015-01-01

    Protein-protein interactions (PPIs) are essential to all biological processes and they represent increasingly important therapeutic targets. Here, we present a new method for accurately predicting protein-protein interfaces, understanding their properties, origins and binding to multiple partners. Contrary to machine learning approaches, our method combines in a rational and very straightforward way three sequence- and structure-based descriptors of protein residues: evolutionary conservation, physico-chemical properties and local geometry. The implemented strategy yields very precise predictions for a wide range of protein-protein interfaces and discriminates them from small-molecule binding sites. Beyond its predictive power, the approach permits to dissect interaction surfaces and unravel their complexity. We show how the analysis of the predicted patches can foster new strategies for PPIs modulation and interaction surface redesign. The approach is implemented in JET2, an automated tool based on the Joint Evolutionary Trees (JET) method for sequence-based protein interface prediction. JET2 is freely available at www.lcqb.upmc.fr/JET2. PMID:26690684

  13. The Use of EST Expression Matrixes for the Quality Control of Gene Expression Data

    PubMed Central

    Milnthorpe, Andrew T.; Soloviev, Mikhail

    2012-01-01

    EST expression profiling provides an attractive tool for studying differential gene expression, but cDNA libraries' origins and EST data quality are not always known or reported. Libraries may originate from pooled or mixed tissues; EST clustering, EST counts, library annotations and analysis algorithms may contain errors. Traditional data analysis methods, including research into tissue-specific gene expression, assume EST counts to be correct and libraries to be correctly annotated, which is not always the case. Therefore, a method capable of assessing the quality of expression data based on that data alone would be invaluable for assessing the quality of EST data and determining their suitability for mRNA expression analysis. Here we report an approach to the selection of a small generic subset of 244 UniGene clusters suitable for identifying the tissue of origin of EST libraries and for quality control of the expression data using EST expression information alone. We created a small expression matrix of UniGene IDs using two rounds of selection followed by two rounds of optimisation. Our selection procedures differ from traditional approaches to finding “tissue-specific” genes, and our matrix yields consistently high positive correlation values for libraries with confirmed tissues of origin and can be applied to tissue typing and quality control of libraries as small as just a few hundred total ESTs. Furthermore, we can pick up tissue correlations between related tissues, e.g. brain and peripheral nervous tissue, or heart and muscle tissues, and identify tissue origins for a few libraries of uncharacterised tissue identity. It was possible to confirm tissue identity for some libraries which had been derived from cancer tissues or had been normalised. Tissue matching is affected strongly by cancer progression or library normalisation, and our approach may potentially be applied to elucidating the stage of normalisation in normalised libraries or to cancer staging. PMID:22412959

  14. Species richness in soil bacterial communities: a proposed approach to overcome sample size bias.

    PubMed

    Youssef, Noha H; Elshahed, Mostafa S

    2008-09-01

    Estimates of species richness based on 16S rRNA gene clone libraries are increasingly utilized to gauge the level of bacterial diversity within various ecosystems. However, previous studies have indicated that regardless of the approach utilized, the species richness estimates obtained are dependent on the size of the analyzed clone libraries. Here we propose an approach to overcome sample size bias in species richness estimates in complex microbial communities. Parametric (maximum likelihood-based and rarefaction curve-based) and non-parametric approaches were used to estimate species richness in a library of 13,001 near full-length 16S rRNA clones derived from soil, as well as in multiple subsets of the original library. The species richness estimates obtained increased with the increase in library size. To obtain a sample size-unbiased estimate of species richness, we calculated the theoretical clone library sizes required to encounter the estimated species richness at various clone library sizes, used curve fitting to determine the theoretical clone library size required to encounter the "true" species richness, and subsequently determined the corresponding sample size-unbiased species richness value. Using this approach, sample size-unbiased estimates of 17,230, 15,571, and 33,912 were obtained for the ML-based, rarefaction curve-based, and ACE-1 estimators, respectively, compared to bias-uncorrected values of 15,009, 11,913, and 20,909.
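    A minimal sketch of the extrapolation idea: estimate richness at several library sizes, fit a saturating curve, and read off the asymptote as the sample-size-unbiased value. The two-parameter saturation model and the intermediate richness estimates are illustrative assumptions, not the paper's exact fit (only the full-library ML value of 15,009 is taken from the abstract).

```python
import numpy as np
from scipy.optimize import curve_fit

def saturation(n, s_max, k):
    # Michaelis-Menten-type curve: the richness estimate approaches s_max
    # as the clone library size n grows without bound.
    return s_max * n / (k + n)

library_sizes = np.array([1000, 2000, 4000, 8000, 13001], dtype=float)
richness_est = np.array([6200, 9100, 12100, 14200, 15009], dtype=float)

(s_max, k), _ = curve_fit(saturation, library_sizes, richness_est,
                          p0=(20000, 5000))
print(f"sample-size-unbiased richness ~ {s_max:.0f}")   # fitted asymptote
```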

  15. Comparison of Alternate and Original Items on the Montreal Cognitive Assessment.

    PubMed

    Lebedeva, Elena; Huang, Mei; Koski, Lisa

    2016-03-01

    The Montreal Cognitive Assessment (MoCA) is a screening tool for mild cognitive impairment (MCI) in elderly individuals. We hypothesized that measurement error when using the new alternate MoCA versions to monitor change over time could be related to the use of items that are not of comparable difficulty to their corresponding originals of similar content. The objective of this study was to compare the difficulty of the alternate MoCA items to the original ones. Five selected items from alternate versions of the MoCA were included with items from the original MoCA administered adaptively to geriatric outpatients (N = 78). Rasch analysis was used to estimate the difficulty level of the items. None of the five items from the alternate versions matched the difficulty level of their corresponding original items. This study demonstrates the potential benefits of a Rasch analysis-based approach for selecting items during the process of development of parallel forms. The results suggest that better match of the items from different MoCA forms by their difficulty would result in higher sensitivity to changes in cognitive function over time.

  16. Approximation-based common principal component for feature extraction in multi-class brain-computer interfaces.

    PubMed

    Hoang, Tuan; Tran, Dat; Huang, Xu

    2013-01-01

    Common Spatial Pattern (CSP) is a state-of-the-art method for feature extraction in Brain-Computer Interface (BCI) systems. However, it is designed for 2-class BCI classification problems. Current extensions of this method to multiple classes, based on subspace union and covariance matrix similarity, do not provide high performance. This paper presents a new approach to solving multi-class BCI classification problems by forming a subspace assembled from the original subspaces; the proposed method for this approach is called Approximation-based Common Principal Component (ACPC). We perform experiments on Dataset 2a from BCI Competition IV to evaluate the proposed method. This dataset was designed for motor imagery classification with 4 classes. Preliminary experiments show that the proposed ACPC feature extraction method, when combined with Support Vector Machines, outperforms CSP-based feature extraction methods on the experimental dataset.

  17. Diagnosis of metastatic neoplasms: a clinicopathologic and morphologic approach.

    PubMed

    Marchevsky, Alberto M; Gupta, Ruta; Balzer, Bonnie

    2010-02-01

    The diagnosis of the site of origin of metastatic neoplasms often poses a challenge to practicing pathologists. A variety of immunohistochemical and molecular tests have been proposed for the identification of tumor site of origin, but these methods are no substitute for careful attention to the pathologic features of tumors and their correlation with imaging findings and other clinical data. The current trend in anatomic pathology is to overly rely on immunohistochemical and molecular tests to identify the site of origin of metastatic neoplasms, but this "shotgun approach" is often costly and can result in contradictory and even erroneous conclusions about the site of origin of a metastatic neoplasm. To describe the use of a systematic approach to the evaluation of metastatic neoplasms. Literature review and personal experience. A systematic approach can frequently help to narrow down differential diagnoses for a patient to a few likely tumor sites of origin that can be confirmed or excluded with the use of selected immunohistochemistry and/or molecular tests. This approach involves the qualitative evaluation of the "pretest and posttest probabilities" of various diagnoses before the immunohistochemical and molecular tests are ordered. Pretest probabilities are qualitatively estimated for each individual by taking into consideration the patient's age, sex, clinical history, imaging findings, and location of the metastases. This estimate is further narrowed by qualitatively evaluating, through careful observation of a variety of gross pathology and histopathologic features, the posttest probabilities of the most likely tumor sites of origin. Multiple examples of the use of this systematic approach for the evaluation of metastatic lesions are discussed.

  18. The effects of vent location, event scale and time forecasts on pyroclastic density current hazard maps at Campi Flegrei caldera (Italy)

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Bisson, Marina; Esposti Ongaro, Tomaso; Flandoli, Franco; Isaia, Roberto; Rosi, Mauro; Vitale, Stefano

    2017-09-01

    This study presents a new method for producing long-term hazard maps for pyroclastic density currents (PDC) originating at Campi Flegrei caldera. The method is based on a doubly stochastic approach and combines the uncertainty assessments on the spatial location of the volcanic vent, the size of the flow and the expected time of such an event. The results are obtained by using a Monte Carlo approach and adopting a simplified invasion model based on the box model integral approximation. Temporal assessments are modelled through a Cox-type process including self-excitement effects, based on the eruptive record of the last 15 kyr. Mean and percentile maps of PDC invasion probability are produced, exploring their sensitivity to some sources of uncertainty and to the effects of the dependence between PDC scales and the caldera sector where they originated. Conditional maps representative of PDC originating inside limited zones of the caldera, or of PDC with a limited range of scales are also produced. Finally, the effect of assuming different time windows for the hazard estimates is explored, also including the potential occurrence of a sequence of multiple events. Assuming that the last eruption of Monte Nuovo (A.D. 1538) marked the beginning of a new epoch of activity similar to the previous ones, results of the statistical analysis indicate a mean probability of PDC invasion above 5% in the next 50 years on almost the entire caldera (with a probability peak of 25% in the central part of the caldera). In contrast, probability values reduce by a factor of about 3 if the entire eruptive record is considered over the last 15 kyr, i.e. including both eruptive epochs and quiescent periods.
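
    The doubly stochastic Monte Carlo structure, stripped to its core, looks like the sketch below: vent location and flow scale are both sampled, and each simulated flow marks the cells it reaches. The circular-runout stand-in for the box model, the grid, and all distribution parameters are illustrative assumptions, not the paper's calibrated inputs.

```python
# Hedged Monte Carlo sketch of a doubly stochastic invasion-probability map.
import numpy as np

rng = np.random.default_rng(1)
nx = ny = 100                                    # hazard-map grid (arbitrary units)
xs, ys = np.meshgrid(np.arange(nx), np.arange(ny))
hits = np.zeros((ny, nx))

n_sim = 5000
for _ in range(n_sim):
    vx, vy = rng.normal(50, 10, size=2)          # uncertain vent location
    radius = rng.lognormal(mean=2.5, sigma=0.5)  # uncertain PDC scale -> runout
    hits += ((xs - vx) ** 2 + (ys - vy) ** 2) <= radius ** 2

invasion_probability = hits / n_sim              # mean map; percentile maps would
                                                 # require batching the simulations
```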

  19. Histogram equalization with Bayesian estimation for noise robust speech recognition.

    PubMed

    Suh, Youngjoo; Kim, Hoirin

    2018-02-01

    The histogram equalization approach is an efficient feature normalization technique for noise robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, a class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to the overfitting problem when test data are insufficient. To address this issue, the proposed histogram equalization technique employs the Bayesian estimation method in the test cumulative distribution function estimation. A previous study on the Aurora-4 task reported that the proposed approach provided substantial performance gains in speech recognition systems based on Gaussian mixture model-hidden Markov model acoustic modeling. In this work, the proposed approach was examined in speech recognition systems with the deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach, where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
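
    As background, histogram equalization as a feature normalizer amounts to mapping each feature dimension through its empirical CDF onto a reference distribution. A minimal sketch follows; the Bayesian CDF estimation proposed in the paper is not reproduced, and the skewed toy feature is hypothetical.

```python
# Minimal histogram-equalization feature normalizer: empirical CDF -> Gaussian.
import numpy as np
from scipy.stats import norm

def heq_to_gaussian(x):
    """x: 1-D array of one feature dimension over time."""
    ranks = np.argsort(np.argsort(x))
    cdf = (ranks + 0.5) / len(x)      # empirical CDF, kept strictly inside (0, 1)
    return norm.ppf(cdf)              # reference distribution: standard normal

x = np.random.gamma(2.0, size=1000)   # noisy, skewed feature trajectory (toy)
x_eq = heq_to_gaussian(x)             # approximately N(0, 1) after mapping
```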

  20. Linearly Adjustable International Portfolios

    NASA Astrophysics Data System (ADS)

    Fonseca, R. J.; Kuhn, D.; Rustem, B.

    2010-09-01

    We present an approach to multi-stage international portfolio optimization based on the imposition of a linear structure on the recourse decisions. Multiperiod decision problems are traditionally formulated as stochastic programs. Scenario-tree-based solutions, however, can become intractable as the number of stages increases. By restricting the space of decision policies to linear rules, we obtain a conservative tractable approximation to the original problem. Local asset prices and foreign exchange rates are modelled separately, which allows for a direct measure of their impact on the final portfolio value.

  1. Remote sensing fusion based on guided image filtering

    NASA Astrophysics Data System (ADS)

    Zhao, Wenfei; Dai, Qinling; Wang, Leiguang

    2015-12-01

    In this paper, we propose a novel remote sensing fusion approach based on guided image filtering. The fused images preserve the spectral features of the original multispectral (MS) images well while enhancing spatial detail. Four quality assessment indexes are also introduced to evaluate the fusion effect in comparison with other fusion methods. Experiments were carried out on Gaofen-2, QuickBird, WorldView-2 and Landsat-8 images, and the results show excellent performance of the proposed method.
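
    Since guided image filtering is the core operation here, a compact reference implementation of the guided filter (He et al.) may help. The window radius, epsilon, and the pansharpening-style usage note are illustrative assumptions rather than the paper's exact pipeline.

```python
# Compact guided filter: local linear model q = a * I + b in each window.
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=8, eps=1e-3):
    """I: guidance image, p: input image, both float arrays in [0, 1]."""
    box = lambda img: uniform_filter(img, size=2 * r + 1)
    mean_I, mean_p = box(I), box(p)
    cov_Ip = box(I * p) - mean_I * mean_p
    var_I = box(I * I) - mean_I ** 2
    a = cov_Ip / (var_I + eps)            # local linear coefficients
    b = mean_p - a * mean_I
    return box(a) * I + box(b)            # q = mean(a) * I + mean(b)

# Plausible fusion use (assumption): filter each upsampled MS band with the PAN
# image as guidance, then inject the PAN detail (PAN minus its filtered version).
```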

  2. Near-Body Grid Adaption for Overset Grids

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; Pulliam, Thomas H.

    2016-01-01

    A solution adaption capability for curvilinear near-body grids has been implemented in the OVERFLOW overset grid computational fluid dynamics code. The approach follows closely that used for the Cartesian off-body grids, but inserts refined grids in the computational space of original near-body grids. Refined curvilinear grids are generated using parametric cubic interpolation, with one-sided biasing based on curvature and stretching ratio of the original grid. Sensor functions, grid marking, and solution interpolation tasks are implemented in the same fashion as for off-body grids. A goal-oriented procedure, based on largest error first, is included for controlling growth rate and maximum size of the adapted grid system. The adaption process is almost entirely parallelized using MPI, resulting in a capability suitable for viscous, moving body simulations. Two- and three-dimensional examples are presented.

  3. Nanoparticle surface characterization and clustering through concentration-dependent surface adsorption modeling.

    PubMed

    Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E

    2014-09-23

    Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.

  4. Advanced, phase-locked, 100 kW, 1.3 GHz magnetron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Read, Michael; Ives, R. Lawrence; Bui, Thuc

    Calabazas Creek Research, Inc., in collaboration with Fermilab and Communications & Power Industries, LLC, is developing a phase-locked, 100 kW peak, 10 kW average power magnetron-based RF system for driving accelerators. Here, phase locking will be achieved using an approach originating at Fermilab that includes control of both amplitude and phase on a fast time scale.

  5. 78 FR 42992 - Self-Regulatory Organizations; Chicago Mercantile Exchange Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... utilizations are approaching their limits. In the future, we will provide firms with access to a separate view... 19b-4(f)(1) \\4\\ thereunder, so that the proposal was effective upon filing with the Commission. The... house origins. The utilization of this limit will be based on the same margin methodology that CME...

  6. Yet To Make the Grade: New Zealand National Government's Early Education Policy.

    ERIC Educational Resources Information Center

    Farquhar, Sarah-Eve J.

    The early education policy of the National Party government in New Zealand is based on the Parents As First Teachers (PAFT) approach and the New Parents as Teachers project, which originated in the United States in Missouri. It is difficult to assess this project, which is not well-documented. The PAFT policy that parents should be home with…

  7. In Pursuit of Professionalism in the Field of Chemistry Education in China: The Story of Zhixin Liu

    ERIC Educational Resources Information Center

    Wei, Bing

    2012-01-01

    In China, science educators as a professional group were originally referred to as academic staff responsible for teaching the subject-based science teaching methods course at the related science departments at teachers' universities. In this study, a biographic method was used to approach the professional life of Zhixin Liu, who was a senior…

  8. Advanced, phase-locked, 100 kW, 1.3 GHz magnetron

    DOE PAGES

    Read, Michael; Ives, R. Lawrence; Bui, Thuc; ...

    2017-03-06

    Calabazas Creek Research, Inc., in collaboration with Fermilab and Communications & Power Industries, LLC, is developing a phase-locked, 100 kW peak, 10 kW average power magnetron-based RF system for driving accelerators. Here, phase locking will be achieved using an approach originating at Fermilab that includes control of both amplitude and phase on a fast time scale.

  9. Actin Immobilization on Chitin for Purifying Myosin II: A Laboratory Exercise That Integrates Concepts of Molecular Cell Biology and Protein Chemistry

    ERIC Educational Resources Information Center

    de Souza, Marcelle Gomes; Grossi, Andre Luiz; Pereira, Elisangela Lima Bastos; da Cruz, Carolina Oliveira; Mendes, Fernanda Machado; Cameron, Luiz Claudio; Paiva, Carmen Lucia Antao

    2008-01-01

    This article presents our experience on teaching biochemical sciences through an innovative approach that integrates concepts of molecular cell biology and protein chemistry. This original laboratory exercise is based on the preparation of an affinity chromatography column containing F-actin molecules immobilized on chitin particles for purifying…

  10. Synthesis of Di- and Trisubstituted Azulenes Using a Danheiser Annulation as the Key Step: An Advanced Organic Laboratory Experiment

    ERIC Educational Resources Information Center

    Thomas, Rebecca M.; Shea, Kevin M.

    2013-01-01

    This three-week advanced-level organic experiment provides students with an inquiry-based approach focused on learning traditional skills such as primary literature interpretation, reaction design, flash column chromatography, and NMR analysis. Additionally, students address higher-order concepts such as the origin of azulene's blue color,…

  11. Teaching Composition Skills with Weekly Multiple Choice Tests in Lieu of Theme Writing. Final Report.

    ERIC Educational Resources Information Center

    Scannell, Dale P.; Haugh, Oscar M.

    The purpose of the study was to compare the effectiveness with which composition skills could be taught by the traditional theme-assignment approach and by an experimental method using weekly multiple-choice composition tests in lieu of theme writing. The weekly tests were based on original but typical first-draft compositions and covered problems…

  12. Overcoming the Crisis in Curriculum Theory: A Knowledge-Based Approach

    ERIC Educational Resources Information Center

    Young, Michael

    2013-01-01

    This paper begins by identifying what it sees as the current crisis in curriculum theory. Following a brief history of the field, it argues that recent developments have led to it losing its object--what is taught and learned in school--and its distinctive role in the educational sciences. Arising from this brief account of the origins and nature…

  13. Linear dispersion relation for the mirror instability in context of the gyrokinetic theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porazik, Peter; Johnson, Jay R.

    2013-10-15

    The linear dispersion relation for the mirror instability is discussed in context of the gyrokinetic theory. The objective is to provide a coherent view of different kinetic approaches used to derive the dispersion relation. The method based on gyrocenter phase space transformations is adopted in order to display the origin and ordering of various terms.

  14. Coordinating the Complexity of Tools, Tasks, and Users: On Theory-Based Approaches to Authoring Tool Usability

    ERIC Educational Resources Information Center

    Murray, Tom

    2016-01-01

    Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…

  15. A digital signal processing-based bioinformatics approach to identifying the origins of HIV-1 non B subtypes infecting US Army personnel serving abroad.

    PubMed

    Nwankwo, Norbert

    2013-06-01

    Two HIV-1 non-B isolates, 98US_MSC5007 and 98US_MSC5016, identified amongst US Army personnel serving abroad, are known to have originated from other nations. Notwithstanding, they are categorized as American strains, because their countries of origin are unknown; American isolates are basically of the B subtype. 98US_MSC5007 belongs to the Circulating Recombinant Form CRF02_AG, while 98US_MSC5016 is of the C clade; both sub-groups are recognized to have originated from the African and Asian continents. It has become necessary to properly determine the countries of origin of microbes and viruses, because diversity and cross-subtyping have been found to complicate the design and development of vaccines and therapeutic interventions. The aim of this study, therefore, is to identify the countries of origin of the two American isolates found amongst US Army personnel serving abroad. A Digital Signal Processing-based bioinformatics technique called the Informational Spectrum Method (ISM) has been engaged. ISM entails translating the amino acid sequence of a protein into a numerical sequence (signal) by means of one biological parameter (an amino acid scale). The signals are then processed using the Discrete Fourier Transform (DFT) in order to uncover the embedded biological information and present it as Informational Spectra (IS). The Spectral Position of Maximum Binding Interaction (SPMBI) is used. Several approaches, including phylogeny, have previously been employed to determine the evolutionary trends of organisms and viruses; SPMBI has previously been used to re-establish the resemblance and common origin of humans and chimpanzees, and evolutionary roadmaps in the influenza and HIV viruses. The results showed that 98US_MSC5007 shares a resemblance and origin with a Nigerian isolate (92NG083), and 98US_MSC5016 with the Zairian isolates (ELI, MAL, and Z2/CDC-34). These results appear to demonstrate that the American soldiers harboring these strains may have been infected by isolates from Nigeria and Zaire, respectively: 98US_MSC5007 and the Nigerian isolate share an SPMBI at position 44, while 98US_MSC5016, with an SPMBI at position 148, may have come from Zaire as it has a similar SPMBI to the Zairian isolates at position 150. SPMBI is a demonstration of bio-functionality arising from maximum affinity of proteins from different sources to a common protein. To help validate the findings, the experiment was repeated using an ISM-based phylogenetic technique; the outcome appears not to be in complete accord with the results obtained in this study. It is therefore recommended that the countries in which these US Army personnel were deployed be identified and, where the findings and the locations of the personnel appropriately correlate, that this novel procedure be engaged to identify the nations of origin of all other such HIV isolates across all clades and nations.
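
    The ISM pipeline described here (scale encoding, DFT, peak position) is mechanical enough to sketch. The EIIP values below are the commonly cited electron-ion interaction potential scale and should be verified against the original source; the code is a hedged illustration, not the author's implementation.

```python
# Hedged ISM sketch: encode a protein with the EIIP scale, take the DFT, and
# read off the dominant spectral position (SPMBI-style peak).
import numpy as np

EIIP = {'A': 0.0373, 'R': 0.0959, 'N': 0.0036, 'D': 0.1263, 'C': 0.0829,
        'Q': 0.0761, 'E': 0.0058, 'G': 0.0050, 'H': 0.0242, 'I': 0.0000,
        'L': 0.0000, 'K': 0.0371, 'M': 0.0823, 'F': 0.0946, 'P': 0.0198,
        'S': 0.0829, 'T': 0.0941, 'W': 0.0548, 'Y': 0.0516, 'V': 0.0057}
# Values as commonly quoted for the EIIP scale; verify before serious use.

def informational_spectrum(sequence):
    signal = np.array([EIIP[aa] for aa in sequence])
    signal = signal - signal.mean()          # remove the DC component
    return np.abs(np.fft.rfft(signal)) ** 2

def peak_position(sequence):
    """Index of the dominant spectral component (zero-frequency bin skipped)."""
    s = informational_spectrum(sequence)
    return int(np.argmax(s[1:]) + 1)
```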

  16. A weakly-compressible Cartesian grid approach for hydrodynamic flows

    NASA Astrophysics Data System (ADS)

    Bigay, P.; Oger, G.; Guilcher, P.-M.; Le Touzé, D.

    2017-11-01

    The present article aims at proposing an original strategy to solve hydrodynamic flows. In the introduction, the motivations for this strategy are developed. It aims at modeling viscous and turbulent flows including complex moving geometries, while avoiding meshing constraints. The proposed approach relies on a weakly-compressible formulation of the Navier-Stokes equations. Unlike most hydrodynamic CFD (Computational Fluid Dynamics) solvers usually based on implicit incompressible formulations, a fully-explicit temporal scheme is used. A purely Cartesian grid is adopted for numerical accuracy and algorithmic simplicity purposes. This characteristic allows an easy use of Adaptive Mesh Refinement (AMR) methods embedded within a massively parallel framework. Geometries are automatically immersed within the Cartesian grid with an AMR compatible treatment. The method proposed uses an Immersed Boundary Method (IBM) adapted to the weakly-compressible formalism and imposed smoothly through a regularization function, which stands as another original feature of this work. All these features have been implemented within an in-house solver based on this WCCH (Weakly-Compressible Cartesian Hydrodynamic) method, which meets the above requirements whilst allowing the use of high-order (> 3) spatial schemes rarely used in existing hydrodynamic solvers. The details of this WCCH method are presented and validated in this article.

  17. Concise Review: Criteria for Chamber‐Specific Categorization of Human Cardiac Myocytes Derived from Pluripotent Stem Cells

    PubMed Central

    Kane, Christopher

    2017-01-01

    Human pluripotent stem cell‐derived cardiomyocytes (PSC‐CMs) have great potential application in almost all areas of cardiovascular research. A current major goal of the field is to build on the past success of differentiation strategies to produce CMs with the properties of those originating from the different chambers of the adult human heart. With no anatomical origin or developmental pathway to draw on, the question of how to judge the success of such approaches and assess the chamber specificity of PSC‐CMs has become increasingly important; commonly used methods have substantial limitations and are based on limited evidence to form such an assessment. In this article, we discuss the need for chamber‐specific PSC‐CMs in a number of areas as well as current approaches used to assess these cells on their likeness to those from different chambers of the heart. Furthermore, describing in detail the structural and functional features that distinguish the different chamber‐specific human adult cardiac myocytes, we propose an evidence‐based tool to aid investigators in the phenotypic characterization of differentiated PSC‐CMs. Stem Cells 2017;35:1881–1897 PMID:28577296

  18. Oxytonergic circuitry sustains and enables creative cognition in humans.

    PubMed

    De Dreu, Carsten K W; Baas, Matthijs; Roskes, Marieke; Sligte, Daniel J; Ebstein, Richard P; Chew, Soo Hong; Tong, Terry; Jiang, Yushi; Mayseless, Naama; Shamay-Tsoory, Simone G

    2014-08-01

    Creativity enables humans to adapt flexibly to changing circumstances, to manage complex social relations and to survive and prosper through social, technological and medical innovations. In humans, chronic, trait-based as well as temporary, state-based approach orientation has been linked to increased capacity for divergent rather than convergent thinking, to more global and holistic processing styles and to more original ideation and creative problem solving. Here, we link creative cognition to oxytocin, a hypothalamic neuropeptide known to up-regulate approach orientation in both animals and humans. Study 1 (N = 492) showed that plasma oxytocin predicts novelty-seeking temperament. Study 2 (N = 110) revealed that genotype differences in a polymorphism in the oxytocin receptor gene rs1042778 predicted creative ideation, with GG/GT-carriers being more original than TT-carriers. Using double-blind placebo-controlled between-subjects designs, Studies 3-6 (N = 191) finally showed that intranasal oxytocin (vs matching placebo) reduced analytical reasoning, and increased holistic processing, divergent thinking and creative performance. We conclude that the oxytonergic circuitry sustains and enables the day-to-day creativity humans need for survival and prosperity and discuss implications. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  19. Continuity-based model interfacing for plant-wide simulation: a general approach.

    PubMed

    Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A

    2006-08-01

    In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework to construct model interfaces for models of wastewater systems, taking into account conservation principles. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues arising when coupling models in which pH is considered as a state variable are pointed out.

  20. Lanczos algorithm with matrix product states for dynamical correlation functions

    NASA Astrophysics Data System (ADS)

    Dargel, P. E.; Wöllert, A.; Honecker, A.; McCulloch, I. P.; Schollwöck, U.; Pruschke, T.

    2012-05-01

    The density-matrix renormalization group (DMRG) algorithm can be adapted to the calculation of dynamical correlation functions in various ways which all represent compromises between computational efficiency and physical accuracy. In this paper we reconsider the oldest approach based on a suitable Lanczos-generated approximate basis and implement it using matrix product states (MPS) for the representation of the basis states. The direct use of matrix product states combined with an ex post reorthogonalization method allows us to avoid several shortcomings of the original approach, namely the multitargeting and the approximate representation of the Hamiltonian inherent in earlier Lanczos-method implementations in the DMRG framework, and to deal with the ghost problem of Lanczos methods, leading to a much better convergence of the spectral weights and poles. We present results for the dynamic spin structure factor of the spin-1/2 antiferromagnetic Heisenberg chain. A comparison to Bethe ansatz results in the thermodynamic limit reveals that the MPS-based Lanczos approach is much more accurate than earlier approaches at minor additional numerical cost.
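
    For orientation, the plain dense-matrix Lanczos recursion with explicit reorthogonalization, which the MPS implementation reworks, is sketched below; it is a generic textbook version under synthetic assumptions, not the paper's MPS code.

```python
# Bare-bones Lanczos tridiagonalization with full reorthogonalization.
import numpy as np

def lanczos(H, v0, k):
    """H: symmetric matrix; returns tridiagonal coefficients and basis vectors."""
    n = len(v0)
    V = np.zeros((k, n))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    v = v0 / np.linalg.norm(v0)
    for j in range(k):
        V[j] = v
        w = H @ v
        alpha[j] = v @ w
        w -= alpha[j] * v + (beta[j - 1] * V[j - 1] if j > 0 else 0)
        w -= V[:j + 1].T @ (V[:j + 1] @ w)   # explicit reorthogonalization
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            v = w / beta[j]
    return alpha, beta, V

# Spectral functions then follow from diagonalizing the tridiagonal matrix
# diag(alpha) + offdiag(beta), with weights from the first components.
```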

  1. Systems Pharmacology-Based Approach of Connecting Disease Genes in Genome-Wide Association Studies with Traditional Chinese Medicine.

    PubMed

    Kim, Jihye; Yoo, Minjae; Shin, Jimin; Kim, Hyunmin; Kang, Jaewoo; Tan, Aik Choon

    2018-01-01

    Traditional Chinese medicine (TCM), which originated in ancient China, has been practiced for thousands of years to treat various symptoms and diseases. However, the molecular mechanisms of TCM in treating these diseases remain unknown. In this study, we employ a systems pharmacology-based approach for connecting GWAS diseases with TCM for potential drug repurposing and repositioning. We studied 102 TCM components and their target genes by analyzing microarray gene expression experiments. We constructed disease-gene networks from 2558 GWAS studies. We applied a systems pharmacology approach to prioritize disease-target genes. Using this bioinformatics approach, we analyzed 14,713 GWAS disease-TCM-target gene pairs and identified 115 disease-gene pairs with q value < 0.2. We validated several of these GWAS disease-TCM-target gene pairs with literature evidence, demonstrating that this computational approach could reveal novel indications for TCM. We also developed the TCM-Disease web application to facilitate traditional Chinese medicine drug repurposing efforts. Systems pharmacology is a promising approach for connecting GWAS diseases with TCM for potential drug repurposing and repositioning. The computational approaches described in this study could easily be extended to other disease-gene network analyses.

  2. Systems Pharmacology-Based Approach of Connecting Disease Genes in Genome-Wide Association Studies with Traditional Chinese Medicine

    PubMed Central

    Kim, Jihye; Yoo, Minjae; Shin, Jimin; Kim, Hyunmin; Kang, Jaewoo

    2018-01-01

    Traditional Chinese medicine (TCM), which originated in ancient China, has been practiced for thousands of years to treat various symptoms and diseases. However, the molecular mechanisms of TCM in treating these diseases remain unknown. In this study, we employ a systems pharmacology-based approach for connecting GWAS diseases with TCM for potential drug repurposing and repositioning. We studied 102 TCM components and their target genes by analyzing microarray gene expression experiments. We constructed disease-gene networks from 2558 GWAS studies. We applied a systems pharmacology approach to prioritize disease-target genes. Using this bioinformatics approach, we analyzed 14,713 GWAS disease-TCM-target gene pairs and identified 115 disease-gene pairs with q value < 0.2. We validated several of these GWAS disease-TCM-target gene pairs with literature evidence, demonstrating that this computational approach could reveal novel indications for TCM. We also developed the TCM-Disease web application to facilitate traditional Chinese medicine drug repurposing efforts. Systems pharmacology is a promising approach for connecting GWAS diseases with TCM for potential drug repurposing and repositioning. The computational approaches described in this study could easily be extended to other disease-gene network analyses. PMID:29765977

  3. On NUFFT-based gridding for non-Cartesian MRI

    NASA Astrophysics Data System (ADS)

    Fessler, Jeffrey A.

    2007-10-01

    For MRI with non-Cartesian sampling, the conventional approach to reconstructing images is to use the gridding method with a Kaiser-Bessel (KB) interpolation kernel. Recently, Sha et al. [L. Sha, H. Guo, A.W. Song, An improved gridding method for spiral MRI using nonuniform fast Fourier transform, J. Magn. Reson. 162(2) (2003) 250-258] proposed an alternative method based on a nonuniform FFT (NUFFT) with least-squares (LS) design of the interpolation coefficients. They described this LS_NUFFT method as shift variant and reported that it yielded smaller reconstruction approximation errors than the conventional shift-invariant KB approach. This paper analyzes the LS_NUFFT approach in detail. We show that when one accounts for a certain linear phase factor, the core of the LS_NUFFT interpolator is in fact real and shift invariant. Furthermore, we find that the KB approach yields smaller errors than the original LS_NUFFT approach. We show that optimizing certain scaling factors can lead to a somewhat improved LS_NUFFT approach, but the high computation cost seems to outweigh the modest reduction in reconstruction error. We conclude that the standard KB approach, with appropriate parameters as described in the literature, remains the practical method of choice for gridding reconstruction in MRI.
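
    As background to the interpolation-kernel discussion, here is the spreading step of conventional 1-D gridding with a Kaiser-Bessel kernel. The kernel width and beta are illustrative choices, and deapodization and density compensation, which a complete reconstruction needs, are deliberately omitted.

```python
# Spreading nonuniform k-space samples onto a uniform grid with a KB kernel.
import numpy as np
from scipy.special import i0

def kb_kernel(u, width=4.0, beta=13.9):
    """Kaiser-Bessel interpolation kernel on |u| <= width / 2 (grid units)."""
    mask = np.abs(u) <= width / 2
    arg = np.sqrt(np.maximum(0.0, 1.0 - (2.0 * u / width) ** 2))
    return np.where(mask, i0(beta * arg) / i0(beta), 0.0)

def grid_1d(k_locs, data, n_grid, width=4.0):
    """k_locs in grid units (0 <= k < n_grid); returns the gridded signal."""
    grid = np.zeros(n_grid, dtype=complex)
    half = int(np.ceil(width / 2))
    for k, d in zip(k_locs, data):
        base = int(np.floor(k))
        for j in range(base - half, base + half + 1):
            grid[j % n_grid] += d * kb_kernel(k - j, width)
    return grid   # FFT, deapodization and density compensation would follow
```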

  4. Micro-Power Sources Enabling Robotic Outpost Based Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    West, W. C.; Whitacre, J. F.; Ratnakumar, B. V.; Brandon, E. J.; Studor, G. F.

    2001-01-01

    Robotic outpost based exploration represents a fundamental shift in mission design from conventional, single spacecraft missions towards a distributed risk approach with many miniaturized semi-autonomous robots and sensors. This approach can facilitate wide-area sampling and exploration, and may consist of a web of orbiters, landers, or penetrators. To meet the mass and volume constraints of deep space missions such as the Europa Ocean Science Station, the distributed units must be fully miniaturized to fully leverage the wide-area exploration approach. However, there is presently a dearth of available options for powering these miniaturized sensors and robots. This group is currently examining miniaturized, solid-state batteries as candidates for applications requiring micro-power sources of low power, mass, and volume. These applications may include powering microsensors, battery-backing rad-hard CMOS memory, and providing momentary chip back-up power. Additional information is contained in the original extended abstract.

  5. Chaos control of the brushless direct current motor using adaptive dynamic surface control based on neural network with the minimum weights.

    PubMed

    Luo, Shaohua; Wu, Songli; Gao, Ruizhen

    2015-07-01

    This paper investigates chaos control for the brushless DC motor (BLDCM) system by an adaptive dynamic surface approach based on a neural network with minimum weights. The BLDCM system contains parameter perturbation, chaotic behavior, and uncertainty. With the help of a radial basis function (RBF) neural network to approximate the unknown nonlinear functions, the adaptive law is established to overcome uncertainty of the control gain. By introducing the RBF neural network and adaptive technology into the dynamic surface control design, a robust chaos control scheme is developed. It is proved that the proposed control approach can guarantee that all signals in the closed-loop system are globally uniformly bounded, and the tracking error converges to a small neighborhood of the origin. Simulation results are provided to show that the proposed approach works well in suppressing chaos and parameter perturbation.
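
    The building block here is the RBF network's ability to approximate an unknown nonlinearity. The minimal sketch below fits a small Gaussian RBF network by least squares to make that concrete; the adaptive weight law and the dynamic surface controller of the paper are not reproduced, and all numbers are illustrative.

```python
# Gaussian RBF network approximating an "unknown" nonlinear function.
import numpy as np

def rbf_design(x, centers, sigma=0.5):
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * sigma ** 2))

x = np.linspace(-2, 2, 200)
f = np.sin(3 * x) + 0.3 * x ** 2              # stand-in for the unknown dynamics
centers = np.linspace(-2, 2, 9)               # few nodes: the minimum-weights idea
Phi = rbf_design(x, centers)
w, *_ = np.linalg.lstsq(Phi, f, rcond=None)   # weights W such that Phi @ W ~ f
print(np.max(np.abs(Phi @ w - f)))            # small residual approximation error
```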

  6. The potential of prison-based democratic therapeutic communities.

    PubMed

    Bennett, Jamie; Shuker, Richard

    2017-03-13

    Purpose: The purpose of this paper is to describe the work of HMP Grendon, the only prison in the UK to operate entirely as a series of democratic therapeutic communities, and to summarise the research on its effectiveness. Design/methodology/approach: The paper is both descriptive, providing an overview of the work of a prison-based therapeutic community, and offers a literature review regarding evidence of effectiveness. Findings: The work of HMP Grendon has a wide range of positive benefits including reduced levels of disruption in prison, reduced self-harm, improved well-being, an environment that is experienced as more humane and reduced levels of reoffending. Originality/value: The work of HMP Grendon offers a well established and evidenced approach to managing men who have committed serious violent and sexually violent offences. It also promotes and embodies a progressive approach to managing prisons rooted in the welfare tradition.

  7. Chaos control of the brushless direct current motor using adaptive dynamic surface control based on neural network with the minimum weights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Shaohua; Department of Mechanical Engineering, Chongqing Aerospace Polytechnic, Chongqing, 400021; Wu, Songli

    2015-07-15

    This paper investigates chaos control for the brushless DC motor (BLDCM) system by an adaptive dynamic surface approach based on a neural network with minimum weights. The BLDCM system contains parameter perturbation, chaotic behavior, and uncertainty. With the help of a radial basis function (RBF) neural network to approximate the unknown nonlinear functions, the adaptive law is established to overcome uncertainty of the control gain. By introducing the RBF neural network and adaptive technology into the dynamic surface control design, a robust chaos control scheme is developed. It is proved that the proposed control approach can guarantee that all signals in the closed-loop system are globally uniformly bounded, and the tracking error converges to a small neighborhood of the origin. Simulation results are provided to show that the proposed approach works well in suppressing chaos and parameter perturbation.

  8. Transaction based approach

    NASA Astrophysics Data System (ADS)

    Hunka, Frantisek; Matula, Jiri

    2017-07-01

    A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, for whom the notion of an agent or actor role is usually used. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology and has its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource, or the consumption or production of economic resources. This results from the essence of economic events and their connection to economic resources.

  9. H-RANSAC: A Hybrid Point Cloud Segmentation Combining 2D and 3D Data

    NASA Astrophysics Data System (ADS)

    Adam, A.; Chatzilari, E.; Nikolopoulos, S.; Kompatsiaris, I.

    2018-05-01

    In this paper, we present a novel 3D segmentation approach operating on point clouds generated from overlapping images. The aim of the proposed hybrid approach is to effectively segment co-planar objects by leveraging the structural information originating from the 3D point cloud and the visual information from the 2D images, without resorting to learning-based procedures. More specifically, the proposed hybrid approach, H-RANSAC, is an extension of the well-known RANSAC plane-fitting algorithm, incorporating an additional consistency criterion based on the results of 2D segmentation. Our expectation that the integration of 2D data into 3D segmentation will achieve more accurate results is validated experimentally in the domain of 3D city models. Results show that H-RANSAC can successfully delineate building components like main facades and windows, and provide more accurate segmentation results compared to the typical RANSAC plane-fitting algorithm.
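
    The base algorithm being extended is plain RANSAC plane fitting, sketched below; the additional 2-D segmentation consistency criterion that makes it H-RANSAC is indicated only as a comment, since its exact scoring is specific to the paper.

```python
# Plain RANSAC plane fitting on a point cloud.
import numpy as np

def ransac_plane(pts, n_iter=500, tol=0.05, rng=np.random.default_rng(0)):
    """pts: (N, 3) array; returns (normal, d, inlier mask) of the best plane."""
    best = (None, None, np.zeros(len(pts), dtype=bool))
    for _ in range(n_iter):
        p0, p1, p2 = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-12:
            continue                        # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        d = -n @ p0
        inliers = np.abs(pts @ n + d) < tol
        # H-RANSAC (hedged): additionally require candidate inliers to be
        # consistent with the 2-D segmentation labels of their source pixels.
        if inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best
```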

  10. Towards an evolutionary theory of the origin of life based on kinetics and thermodynamics.

    PubMed

    Pascal, Robert; Pross, Addy; Sutherland, John D

    2013-11-06

    A sudden transition in a system from an inanimate state to the living state-defined on the basis of present day living organisms-would constitute a highly unlikely event hardly predictable from physical laws. From this uncontroversial idea, a self-consistent representation of the origin of life process is built up, which is based on the possibility of a series of intermediate stages. This approach requires a particular kind of stability for these stages-dynamic kinetic stability (DKS)-which is not usually observed in regular chemistry, and which is reflected in the persistence of entities capable of self-reproduction. The necessary connection of this kinetic behaviour with far-from-equilibrium thermodynamic conditions is emphasized and this leads to an evolutionary view for the origin of life in which multiplying entities must be associated with the dissipation of free energy. Any kind of entity involved in this process has to pay the energetic cost of irreversibility, but, by doing so, the contingent emergence of new functions is made feasible. The consequences of these views on the studies of processes by which life can emerge are inferred.

  11. Building a model of the blue cone pigment based on the wild type rhodopsin structure with QM/MM methods.

    PubMed

    Frähmcke, Jan S; Wanko, Marius; Elstner, Marcus

    2012-03-15

    Understanding the mechanism of color tuning of the retinal chromophore by its host protein became one of the key issues in the research on rhodopsins. While early mutation studies addressed its genetic origin, recent studies advanced to investigate its structural origin, based on X-ray crystallographic structures. For the human cone pigments, no crystal structures have been produced, and homology models were employed to elucidate the origin of their blue-shifted absorption. In this theoretical study, we take a different route to establish a structural model for human blue. Starting from the well-resolved structure of bovine rhodopsin, we derive multiple mutant models by stepwise mutation and equilibration using molecular dynamics simulations in a hybrid quantum mechanics/molecular mechanics framework. Our 30-fold mutant reproduces the experimental UV-vis absorption shift of 0.45 eV and provides new insights about both structural and genetic factors that affect the excitation energy. Electrostatic effects of individual amino acids and collaborative structural effects are analyzed using semiempirical (OM2/MRCI) and ab initio (SORCI) multireference approaches. © 2012 American Chemical Society

  12. Sensory and chemical profiles of Finnish honeys of different botanical origins and consumer preferences.

    PubMed

    Kortesniemi, Maaria; Rosenvald, Sirli; Laaksonen, Oskar; Vanag, Anita; Ollikka, Tarja; Vene, Kristel; Yang, Baoru

    2018-04-25

    The sensory-chemical profiles of Finnish honeys (labeled as buckwheat, cloudberry-bog, lingonberry, sweet clover, willowherb and multifloral honeys) were investigated using a multi-analytical approach. The sensory test (untrained panel, n = 62) was based on scaling and check-all-that-apply (CATA) methods accompanied with questions on preference and usage of honey. The results were correlated with corresponding profiles of odor-active compounds, determined using gas chromatography coupled with mass spectrometry/olfactometry (GC-MS/O). Botanical origins and chemical compositions including sugars were evaluated using NMR spectroscopy. A total of 73 odor-active compounds were listed based on GC-O. Sweet and mild honeys with familiar sensory properties were preferred by the panelists (PCA, R2X(1) = 0.7) while buckwheat and cloudberry-bog honeys with strong odor, flavor and color were regarded as unfamiliar and unpleasant. The data will give the honey industry novel information on honey properties in relation to the botanical origin, and consumer preference. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Algorithms for the automatic generation of 2-D structured multi-block grids

    NASA Technical Reports Server (NTRS)

    Schoenfeld, Thilo; Weinerfelt, Per; Jenssen, Carl B.

    1995-01-01

    Two different approaches to the fully automatic generation of structured multi-block grids in two dimensions are presented. The work aims to simplify the user interactivity necessary for the definition of a multi-block grid topology. The first approach is based on an advancing front method commonly used for the generation of unstructured grids. The original algorithm has been modified toward the generation of large quadrilateral elements. The second method is based on the divide-and-conquer paradigm, with the global domain recursively partitioned into sub-domains. For either method, each of the resulting blocks is then meshed using transfinite interpolation and elliptic smoothing (see the sketch below). The applicability of these methods to practical problems is demonstrated for typical geometries of fluid dynamics.
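
    Transfinite interpolation, the meshing step named above, has a closed form worth showing: the bilinearly blended Coons patch built from the four boundary curves of a block. The sketch below is a generic NumPy version under assumed array shapes, with elliptic smoothing left as a separate pass.

```python
# Transfinite interpolation (bilinearly blended Coons patch) for one block.
import numpy as np

def tfi(bottom, top, left, right):
    """bottom/top: (n, 2) boundary curves; left/right: (m, 2); corners must match."""
    n, m = len(bottom), len(left)
    xi = np.linspace(0, 1, n)[None, :, None]    # shape (1, n, 1)
    eta = np.linspace(0, 1, m)[:, None, None]   # shape (m, 1, 1)
    grid = ((1 - eta) * bottom[None, :, :] + eta * top[None, :, :]
            + (1 - xi) * left[:, None, :] + xi * right[:, None, :]
            - ((1 - xi) * (1 - eta) * bottom[0] + xi * (1 - eta) * bottom[-1]
               + (1 - xi) * eta * top[0] + xi * eta * top[-1]))
    return grid                                  # (m, n, 2) array of grid points
```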

  14. SPR Biosensors in Direct Molecular Fishing: Implications for Protein Interactomics.

    PubMed

    Florinskaya, Anna; Ershov, Pavel; Mezentsev, Yuri; Kaluzhskiy, Leonid; Yablokov, Evgeniy; Medvedev, Alexei; Ivanov, Alexis

    2018-05-18

    We have developed an original experimental approach based on the use of surface plasmon resonance (SPR) biosensors, applicable for investigation of potential partners involved in protein–protein interactions (PPI) as well as protein–peptide or protein–small-molecule interactions. It is based on combining a SPR biosensor, size exclusion chromatography (SEC), mass spectrometric identification of proteins (LC-MS/MS) and direct molecular fishing employing principles of affinity chromatography for isolation of potential partner proteins from the total lysate of biological samples using immobilized target proteins (or small non-peptide compounds) as ligands. Applicability of this approach has been demonstrated within the frame of the Human Proteome Project (HPP) and PPI regulation by a small non-peptide biologically active compound, isatin.

  15. Optimal cross-sectional sampling for river modelling with bridges: An information theory-based method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it; Alfonso, L.

    2016-06-08

    The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.

  16. A comparative UPLC-Q/TOF-MS-based metabolomics approach for distinguishing Zingiber officinale Roscoe of two geographical origins.

    PubMed

    Mais, Enos; Alolga, Raphael N; Wang, Shi-Lei; Linus, Loveth O; Yin, Xiaojin; Qi, Lian-Wen

    2018-02-01

    Ginger, the rhizome of Zingiber officinale Roscoe, is a popular spice used in the food, beverage and confectionary industries. In this study, we report an untargeted UPLC-Q/TOF-MS-based metabolomics approach for comprehensively discriminating between ginger from two geographical locations, Ghana in West Africa and China. Forty batches of fresh ginger from both countries were discriminated using principal component analysis and orthogonal partial least squares discrimination analysis. Sixteen differential metabolites were identified between the gingers from the two geographical locations, six of which were identified as the marker compounds responsible for the discrimination. Our study highlights the essence and predictive power of metabolomics in detecting minute differences in same varieties of plants/plant samples based on the levels and composition of their metabolites. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Ontology-based knowledge representation for resolution of semantic heterogeneity in GIS

    NASA Astrophysics Data System (ADS)

    Liu, Ying; Xiao, Han; Wang, Limin; Han, Jialing

    2017-07-01

    Lack of semantic interoperability in geographical information systems has been identified as the main obstacle to data sharing and database integration. A new method must be found to overcome the problems of semantic heterogeneity. Ontologies are considered one approach to supporting geographic information sharing. This paper presents an ontology-driven integration approach to help detect and possibly resolve semantic conflicts. Its originality is that each data source participating in the integration process contains an ontology that defines the meaning of its own data. This approach ensures the automation of the integration through regulation of the semantic integration algorithm. Finally, land classification in field GIS is described as an example.

  18. Multi-model approach to characterize human handwriting motion.

    PubMed

    Chihi, I; Abdelkrim, A; Benrejeb, M

    2016-02-01

    This paper deals with characterization and modelling of human handwriting motion from two forearm muscle activity signals, called electromyography signals (EMG). In this work, an experimental approach was used to record the coordinates of a pen tip moving on the (x, y) plane and EMG signals during the handwriting act. The main purpose is to design a new mathematical model which characterizes this biological process. Based on a multi-model approach, this system was originally developed to generate letters and geometric forms written by different writers. A Recursive Least Squares algorithm is used to estimate the parameters of each sub-model of the multi-model basis. Simulations show good agreement between predicted results and the recorded data.
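
    Recursive least squares, the estimator named in the abstract, is compact enough to show in full. This is the standard textbook update with a forgetting factor, under assumed regressor and output shapes; the paper's handwriting sub-models themselves are not reproduced.

```python
# Standard recursive least squares with a forgetting factor.
import numpy as np

class RLS:
    def __init__(self, n_params, lam=0.99, delta=1e3):
        self.w = np.zeros(n_params)        # parameter estimate
        self.P = delta * np.eye(n_params)  # "covariance" of the estimate
        self.lam = lam                     # forgetting factor

    def update(self, phi, y):
        """phi: regressor vector; y: measured output (e.g. a pen-tip coordinate)."""
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)             # gain vector
        self.w += k * (y - phi @ self.w)               # innovation correction
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.w
```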

  19. Local Table Condensation in Rough Set Approach for Jumping Emerging Pattern Induction

    NASA Astrophysics Data System (ADS)

    Terlecki, Pawel; Walczak, Krzysztof

    This paper extends the rough set approach to JEP induction based on the notion of a condensed decision table. The original transaction database is transformed into a relational form and patterns are induced by means of local reducts. The transformation employs an item aggregation obtained by coloring a graph that reflects conflicts among items. For efficiency reasons we propose to perform this preprocessing locally, i.e. at the transaction level, to achieve a higher dimensionality gain. A special maintenance strategy is also used to avoid graph rebuilds. Both the global and the local approaches have been tested and discussed for dense and synthetically generated sparse datasets.

  20. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    NASA Astrophysics Data System (ADS)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages present a special deviation from the original language; Malay Tweet language is currently widely used by Twitter users, especially in the Malay archipelago. It is therefore important to build a normalization system that can translate Malay Tweet language into standard Malay. Most research in natural language processing has focused on normalizing English Twitter messages, while few studies have addressed the normalization of Malay Tweets. This paper proposes an approach to normalizing Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages, such as colloquial language, novel words and interjections, into standard Malay. The research will use a language model and an N-gram model.

  1. A fuzzy logic approach to modeling a vehicle crash test

    NASA Astrophysics Data System (ADS)

    Pawlus, Witold; Karimi, Hamid Reza; Robbersmyr, Kjell G.

    2013-03-01

    This paper presents an application of the fuzzy approach to vehicle crash modeling. A typical vehicle-to-pole collision is described and the kinematics of a car involved in this type of crash event is thoroughly characterized. The basics of fuzzy set theory and modeling principles based on the fuzzy logic approach are presented. In particular, exceptional attention is paid to explaining the methodology for creating a fuzzy model of a vehicle collision. Furthermore, the simulation results are presented and compared to the original vehicle's kinematics. It is concluded which factors influence the accuracy of the fuzzy model's output and how they can be adjusted to improve the model's fidelity.

  2. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    PubMed

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach, our proposed approach). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
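
    A hedged sketch of the resampling idea — building pseudo-profiles from one sample per subject per time point, then collecting the AUC-ratio distribution — is given below. The 2-phase convergence logic of the proposed algorithm is not reproduced, and the data structures and numbers are hypothetical.

```python
# Pseudo-profile resampling of a tissue-to-plasma AUC ratio from sparse data.
import numpy as np

def trapz(y, x):
    """Trapezoidal AUC for concentrations y at times x."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    return float(np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2))

def auc_ratio_samples(times, tissue, plasma, n_boot=2000,
                      rng=np.random.default_rng(0)):
    """tissue, plasma: dicts mapping time -> array of single-subject values."""
    ratios = np.empty(n_boot)
    for b in range(n_boot):
        t_prof = [rng.choice(tissue[t]) for t in times]   # one subject per time
        p_prof = [rng.choice(plasma[t]) for t in times]
        ratios[b] = trapz(t_prof, times) / trapz(p_prof, times)
    return ratios   # report e.g. the median and a percentile interval
```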

  3. Image encryption based on fractal-structured phase mask in fractional Fourier transform domain

    NASA Astrophysics Data System (ADS)

    Zhao, Meng-Dan; Gao, Xu-Zhen; Pan, Yue; Zhang, Guan-Lin; Tu, Chenghou; Li, Yongnan; Wang, Hui-Tian

    2018-04-01

    We present an optical encryption approach based on the combination of a fractal Fresnel lens (FFL) and the fractional Fourier transform (FrFT). Our encryption approach is in fact a four-fold encryption scheme, including the random phase encoding produced by the Gerchberg–Saxton algorithm, a FFL, and two FrFTs. A FFL is composed of a Sierpinski carpet fractal plate and a Fresnel zone plate. In our approach, security is enhanced by the larger key space, and the use of a FFL overcomes the optical-axis alignment problem of the optical system. Only with perfectly matched parameters of the FFL and the FrFT can the plaintext be recovered well. We also present an image encryption algorithm in which two original images can be recovered from the ciphertext by the FrFT with two different phase distribution keys, each obtained by performing 100 iterations between the respective plaintext and the ciphertext. We test the sensitivity of our approach to various parameters such as the wavelength of light, the focal length of the FFL, and the fractional orders of the FrFT. Our approach can resist various attacks.

  4. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    PubMed

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approach in the literature, which uses only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.

  5. An integrated bioanalytical method development and validation approach: case studies.

    PubMed

    Xue, Y-J; Melo, Brian; Vallejo, Martha; Zhao, Yuwen; Tang, Lina; Chen, Yuan-Shek; Keller, Karin M

    2012-10-01

    We proposed an integrated bioanalytical method development and validation approach: (1) method screening based on the analyte's physicochemical properties and metabolism information to determine the most appropriate extraction/analysis conditions; (2) preliminary stability evaluation using both quality control and incurred samples to establish sample collection, storage and processing conditions; (3) mock validation to examine method accuracy and precision and incurred sample reproducibility; and (4) method validation to confirm the results obtained during method development. This integrated approach was applied to the determination of compound I in rat plasma and compound II in rat and dog plasma. The effectiveness of the approach was demonstrated by the superior quality of three method validations: (1) a zero run failure rate; (2) >93% of quality control results within 10% of nominal values; and (3) 99% of incurred samples within 9.2% of the original values. In addition, rat and dog plasma methods for compound II were successfully applied to analyze more than 900 plasma samples obtained from Investigational New Drug (IND) toxicology studies in rats and dogs with near perfect results: (1) a zero run failure rate; (2) excellent accuracy and precision for standards and quality controls; and (3) 98% of incurred samples within 15% of the original values. Copyright © 2011 John Wiley & Sons, Ltd.

  6. Development and evaluation of consensus-based sediment effect concentrations for polychlorinated biphenyls

    USGS Publications Warehouse

    MacDonald, Donald D.; Dipinto, Lisa M.; Field, Jay; Ingersoll, Christopher G.; Long, Edward R.; Swartz, Richard C.

    2000-01-01

    Sediment-quality guidelines (SQGs) have been published for polychlorinated biphenyls (PCBs) using both empirical and theoretical approaches. Empirically based guidelines have been developed using the screening-level concentration, effects range, effects level, and apparent effects threshold approaches. Theoretically based guidelines have been developed using the equilibrium-partitioning approach. Empirically-based guidelines were classified into three general categories, in accordance with their original narrative intents, and used to develop three consensus-based sediment effect concentrations (SECs) for total PCBs (tPCBs), including a threshold effect concentration, a midrange effect concentration, and an extreme effect concentration. Consensus-based SECs were derived because they estimate the central tendency of the published SQGs and, thus, reconcile the guidance values that have been derived using various approaches. Initially, consensus-based SECs for tPCBs were developed separately for freshwater sediments and for marine and estuarine sediments. Because the respective SECs were statistically similar, the underlying SQGs were subsequently merged and used to formulate more generally applicable SECs. The three consensus-based SECs were then evaluated for reliability using matching sediment chemistry and toxicity data from field studies, dose-response data from spiked-sediment toxicity tests, and SQGs derived from the equilibrium-partitioning approach. The results of this evaluation demonstrated that the consensus-based SECs can accurately predict both the presence and absence of toxicity in field-collected sediments. Importantly, the incidence of toxicity increases incrementally with increasing concentrations of tPCBs. Moreover, the consensus-based SECs are comparable to the chronic toxicity thresholds that have been estimated from dose-response data and equilibrium-partitioning models. Therefore, consensus-based SECs provide a unifying synthesis of existing SQGs, reflect causal rather than correlative effects, and accurately predict sediment toxicity in PCB-contaminated sediments.

  7. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis

    PubMed Central

    Awan, Imtiaz; Aziz, Wajid; Habib, Nazneen; Alowibdi, Jalal S.; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

    Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected by aging and disease. Entropy-based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and of the underlying stimuli that are responsible for anomalous behavior. Single-scale traditional entropy measures have yielded contradictory results about the dynamics of real-world time series data of healthy and pathological subjects. Recently the multiscale entropy (MSE) algorithm was introduced for precise description of the complexity of biological signals, and it has been used in numerous fields since its inception. The original MSE quantified the complexity of coarse-grained time series using sample entropy. The original MSE may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scaling factor τ; for long signals, however, MSE works well. To overcome this drawback of the original MSE, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which, instead of sample entropy, the symbolic entropy measure NCSE is used as the entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results of the study indicate that MNCSE values are more stable and reliable than original MSE values. The results show that MNCSE-based features lead to higher classification accuracies in comparison with MSE-based features. PMID:29771977
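
    The coarse-graining step shared by MSE and MNCSE is easy to sketch. In the minimal sketch below, the entropy estimator plugged in at each scale is a simple histogram-based Shannon entropy standing in for the authors' NCSE, which is where the variants differ; note how the coarse-grained series shrinks as the scale factor grows, the source of plain MSE's unreliability on short records:

    ```python
    import numpy as np

    def coarse_grain(x, tau):
        # Average non-overlapping windows of length tau; the output length
        # len(x) // tau shrinks as tau grows.
        n = (len(x) // tau) * tau
        return np.asarray(x[:n], dtype=float).reshape(-1, tau).mean(axis=1)

    def symbolic_entropy(y, n_bins=6):
        # Stand-in estimator: quantize into equal-width bins and compute
        # the Shannon entropy of the symbol histogram (not the authors' NCSE).
        counts = np.histogram(y, bins=np.linspace(y.min(), y.max(), n_bins + 1))[0]
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    def multiscale_entropy(x, max_tau=10):
        return [symbolic_entropy(coarse_grain(x, t)) for t in range(1, max_tau + 1)]

    rng = np.random.default_rng(0)
    print(multiscale_entropy(rng.normal(size=3000)))  # white-noise example
    ```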

  8. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis.

    PubMed

    Awan, Imtiaz; Aziz, Wajid; Shah, Imran Hussain; Habib, Nazneen; Alowibdi, Jalal S; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

    Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected by aging and disease. Entropy-based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and of the underlying stimuli that are responsible for anomalous behavior. Single-scale traditional entropy measures have yielded contradictory results about the dynamics of real-world time series data of healthy and pathological subjects. Recently the multiscale entropy (MSE) algorithm was introduced for precise description of the complexity of biological signals, and it has been used in numerous fields since its inception. The original MSE quantified the complexity of coarse-grained time series using sample entropy. The original MSE may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scaling factor τ; for long signals, however, MSE works well. To overcome this drawback of the original MSE, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which, instead of sample entropy, the symbolic entropy measure NCSE is used as the entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results of the study indicate that MNCSE values are more stable and reliable than original MSE values. The results show that MNCSE-based features lead to higher classification accuracies in comparison with MSE-based features.

  9. Gauge-origin dependence in electronic g-tensor calculations

    NASA Astrophysics Data System (ADS)

    Glasbrenner, Michael; Vogler, Sigurd; Ochsenfeld, Christian

    2018-06-01

    We present a benchmark study on the gauge-origin dependence of the electronic g-tensor using data from unrestricted density functional theory calculations with the spin-orbit mean field ansatz. Our data suggest, in accordance with previous studies, that g-tensor calculations employing a common gauge-origin are sufficiently accurate for small molecules; however, for extended molecules, the introduced errors can become relevant and significantly exceed the basis set error. Using calculations with the spin-orbit mean field ansatz and gauge-including atomic orbitals as a reference, we furthermore show that the accuracy and reliability of common gauge-origin approaches in larger molecules depend strongly on the locality of the spin density distribution. We propose a new pragmatic ansatz for choosing the gauge-origin which takes the spin density distribution into account and gives reasonably accurate values for molecules with a single localized spin center. For more general cases like molecules with several spatially distant spin centers, common gauge-origin approaches are shown to be insufficient for consistently achieving high accuracy. Therefore, the computation of g-tensors using distributed gauge-origin methods like gauge-including atomic orbitals is considered the ideal approach and is recommended for larger molecular systems.

  10. SU-E-J-108: Solving the Chinese Postman Problem for Effective Contour Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, J; Zhang, L; Balter, P

    2015-06-15

    Purpose: To develop a practical approach for accurate contour deformation when deformable image registration (DIR) is used for atlas-based segmentation or contour propagation in image-guided radiotherapy. Methods: A contour deformation approach was developed on the basis of 3D mesh operations. The 2D contours, represented by a series of points in each slice, were first converted to a 3D triangular mesh, which was deformed by the deformation vectors resulting from DIR. A set of parallel 2D planes then cut through the deformed 3D mesh, generating unordered points and line segments, which had to be reorganized into a set of 2D contour points. It was realized that the reorganization problem was equivalent to solving the Chinese Postman Problem (CPP) by traversing a graph built from the unordered points with the least cost. Alternatively, deformation could be applied to a binary mask converted from the original contours. The deformed binary mask was then converted back into contours at the CT slice locations. We performed a qualitative comparison to validate the mesh-based approach against the image-based approach. Results: The DIR could considerably change the 3D mesh, producing complicated 2D contour representations after deformation. CPP was able to effectively reorganize the points in the 2D planes no matter how complicated the 2D contours were. The mesh-based approach did not require post-processing of the contour, thus accurately showing the actual deformation in the DIR. The mesh-based approach could keep some fine details and resulted in smoother contours than the image-based approach did, especially for the lung structure. The image-based approach appeared to over-process contours and suffered from image resolution limits. The mesh-based approach was integrated into in-house DIR software for use in routine clinic and research. Conclusion: We developed a practical approach for accurate contour deformation. The efficiency of this approach was demonstrated in both clinical and research applications. This work was partially supported by Cancer Prevention & Research Institute of Texas (CPRIT) RP110562.
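
    The reorganization step can be sketched compactly. This is a minimal sketch under simplifying assumptions: for the common case where the plane/mesh intersection yields simple closed loops (every vertex of even degree), a greedy Eulerian-style walk over the segment graph suffices; the CPP machinery described in the abstract generalizes this to least-cost traversals of arbitrary segment graphs. The input format is hypothetical:

    ```python
    from collections import defaultdict

    def segments_to_contours(segments):
        # segments: list of ((x1, y1), (x2, y2)) tuples from cutting the
        # deformed mesh with one 2D plane. Build an undirected multigraph.
        adj = defaultdict(list)
        for idx, (a, b) in enumerate(segments):
            adj[a].append((b, idx))
            adj[b].append((a, idx))
        used = [False] * len(segments)
        contours = []
        for start in adj:
            if all(used[idx] for _, idx in adj[start]):
                continue
            # Greedy walk along unused edges until the loop closes.
            path, node = [start], start
            while True:
                for nxt, idx in adj[node]:
                    if not used[idx]:
                        used[idx] = True
                        node = nxt
                        path.append(node)
                        break
                else:
                    break  # no unused edge left at this vertex (open chain)
                if node == start:
                    break  # loop closed
            if len(path) > 1:
                contours.append(path)
        return contours

    demo = [((0, 0), (1, 0)), ((1, 0), (1, 1)), ((1, 1), (0, 0))]
    print(segments_to_contours(demo))
    ```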

  11. Nanomaterials, and Occupational Health and Safety—A Literature Review About Control Banding and a Semi-Quantitative Method Proposed for Hazard Assessment.

    NASA Astrophysics Data System (ADS)

    Dimou, Kaotar; Emond, Claude

    2017-06-01

    In recent decades, the control banding (CB) approach has been recognised as a hazard assessment methodology of growing importance in the occupational safety, health and hygiene (OSHH) field. According to the American Industrial Hygiene Association, this approach originates from the pharmaceutical industry in the United Kingdom. The aim of the CB approach is to protect the more than 90% (or approximately 2.7 billion) of the world's workers who do not have access to OSHH professionals and traditional quantitative risk assessment methods. In other words, CB is a qualitative or semi-quantitative tool designed to prevent occupational accidents by controlling worker exposures to potentially hazardous chemicals in the absence of comprehensive toxicological and exposure data. These criteria correspond very precisely to the development and production of engineered nanomaterials (ENMs). Considering the significant lack of scientific knowledge about the work-related health risks posed by ENMs, CB is, in general, appropriate for these issues. CB can currently be adapted to the specificities of ENMs; hundreds of nanotechnology products containing ENMs are already on the market. In this context, this qualitative or semi-quantitative approach appears to be relevant for characterising and quantifying the degree of physico-chemical and biological reactivity of ENMs, leading towards better control of human health effects and the safe handling of ENMs in workplaces. A greater understanding of the CB approach is important for further managing the risks related to handling hazardous substances, such as ENMs, that lack established occupational exposure limits. In recent years, this topic has garnered much interest, including discussions in many technical papers. Several CB models have been developed, and many countries have created their own nano-specific CB instruments. The aims of this research were to perform a literature review of CB, to classify the main approaches developed worldwide, and then to suggest an original methodology based on the characterisation of the hazard. For this research, our team conducted a systematic literature review covering the past 20 years. This approach is important in understanding the conceptual basis for CB and the model's overall effectiveness. These considerations lead to the proposal of an original hazard assessment method based on physico-chemical and biological characteristics. Such a method should help the entire industry better understand the ability of the CB approach to limit workers' exposure, while identifying the strengths and weaknesses of the approach. Developing this practical method will help provide relevant recommendations to workers who handle hazardous chemicals such as ENMs and to the general population.

  12. An efficient hybrid approach for multiobjective optimization of water distribution systems

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.

    2014-05-01

    An efficient hybrid approach for the design of water distribution systems (WDSs) with multiple objectives is described in this paper. The objectives are the minimization of the network cost and maximization of the network resilience. A self-adaptive multiobjective differential evolution (SAMODE) algorithm has been developed, in which control parameters are automatically adapted by means of evolution instead of the presetting of fine-tuned parameter values. In the proposed method, a graph algorithm is first used to decompose a looped WDS into a shortest-distance tree (T) or forest, and chords (Ω). The original two-objective optimization problem is then approximated by a series of single-objective optimization problems of the T to be solved by nonlinear programming (NLP), thereby providing an approximate Pareto optimal front for the original whole network. Finally, the solutions at the approximate front are used to seed the SAMODE algorithm to find an improved front for the original entire network. The proposed approach is compared with two other conventional full-search optimization methods (the SAMODE algorithm and the NSGA-II) that seed the initial population with purely random solutions based on three case studies: a benchmark network and two real-world networks with multiple demand loading cases. Results show that (i) the proposed NLP-SAMODE method consistently generates better-quality Pareto fronts than the full-search methods with significantly improved efficiency; and (ii) the proposed SAMODE algorithm (no parameter tuning) exhibits better performance than the NSGA-II with calibrated parameter values in efficiently offering optimal fronts.
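
    The self-adaptation idea, evolving control parameters along with the individuals instead of hand-tuning them, can be sketched compactly. The sketch below follows the well-known jDE self-adaptation rule rather than the authors' exact SAMODE update, and uses a placeholder single-objective function instead of network cost and resilience:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sphere(x):  # placeholder objective; SAMODE evaluates cost/resilience
        return float(np.sum(x * x))

    dim, pop_size, gens = 10, 30, 200
    pop = rng.uniform(-5, 5, (pop_size, dim))
    F = np.full(pop_size, 0.5)   # per-individual mutation factors
    CR = np.full(pop_size, 0.9)  # per-individual crossover rates
    fit = np.array([sphere(x) for x in pop])

    for _ in range(gens):
        for i in range(pop_size):
            # jDE-style self-adaptation: occasionally resample the parameters,
            # keeping them only if the trial vector survives selection.
            Fi = 0.1 + 0.9 * rng.random() if rng.random() < 0.1 else F[i]
            CRi = rng.random() if rng.random() < 0.1 else CR[i]
            # Classic rand/1 mutation (for brevity, i itself is not excluded).
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = a + Fi * (b - c)
            cross = rng.random(dim) < CRi
            cross[rng.integers(dim)] = True  # at least one gene crosses over
            trial = np.where(cross, mutant, pop[i])
            f_trial = sphere(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i], F[i], CR[i] = trial, f_trial, Fi, CRi

    print("best:", fit.min())
    ```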

  13. The complex evolutionary dynamics of ancient and recent polyploidy in Leucaena (Leguminosae; Mimosoideae).

    PubMed

    Govindarajulu, Rajanikanth; Hughes, Colin E; Alexander, Patrick J; Bailey, C Donovan

    2011-12-01

    The evolutionary history of Leucaena has been impacted by polyploidy, hybridization, and divergent allopatric species diversification, suggesting that this is an ideal group to investigate the evolutionary tempo of polyploidy and the complexities of reticulation and divergence in plant diversification. Parsimony- and ML-based phylogenetic approaches were applied to 105 accessions sequenced for six sequence characterized amplified region-based nuclear encoded loci, nrDNA ITS, and four cpDNA regions. Hypotheses for the origin of tetraploid species were inferred using results derived from a novel species tree and established gene tree methods and from data on genome sizes and geographic distributions. The combination of comprehensively sampled multilocus DNA sequence data sets and a novel methodology provides strong resolution and support for the origins of all five tetraploid species. A minimum of four allopolyploidization events is required to explain the origins of these species. The origin(s) of one tetraploid pair (L. involucrata/L. pallida) can be equally explained by two unique allopolyploidizations or a single event followed by divergent speciation. Alongside other recent findings, a comprehensive picture of the complex evolutionary dynamics of polyploidy in Leucaena is emerging that includes paleotetraploidization, diploidization of the last common ancestor of Leucaena, allopatric divergence among diploids, and recent allopolyploid origins for tetraploid species likely associated with human translocation of seed. These results provide insights into the role of divergence and reticulation in a well-characterized angiosperm lineage and into the traits of diploid parents and derived tetraploids (particularly self-compatibility and year-round flowering) favoring the formation and establishment of novel tetraploid combinations.

  14. New controller for high voltage converter modulator at spallation neutron source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wezensky, Mark W; Brown, David L; Lee, Sung-Woo

    2017-01-01

    The Spallation Neutron Source (SNS) has developed a new control system for the High Voltage Converter Modulator (HVCM) at the SNS to replace the original control system, which is approaching obsolescence. The original system was based on controllers for similar high voltage systems that were already in use [1]. The new controller, based on the National Instruments PXI/FlexRIO Field Programmable Gate Array (FPGA) platform, offers enhancements such as modular construction, flexibility and non-proprietary software. The new controller also provides new capabilities such as various methods for modulator pulse flattening, waveform capture, and first fault detection. This paper will discuss the design of the system, including the human machine interface, based on lessons learned at the SNS and other projects. It will also discuss performance and other issues related to its operation in an accelerator facility which requires high availability. To date, 73% of the operational HVCMs have been upgraded with the new controller, and the remainder are scheduled for completion by mid-2017.

  15. Human ear detection in the thermal infrared spectrum

    NASA Astrophysics Data System (ADS)

    Abaza, Ayman; Bourlai, Thirimachos

    2012-06-01

    In this paper the problem of human ear detection in the thermal infrared (IR) spectrum is studied in order to illustrate the advantages and limitations of the most important steps of ear-based biometrics that can operate in day and night time environments. The main contributions of this work are two-fold: First, a dual-band database is assembled that consists of visible and thermal profile face images. The thermal data was collected using a high definition middle-wave infrared (3-5 microns) camera that is capable of acquiring thermal imprints of human skin. Second, a fully automated, thermal imaging based ear detection method is developed for real-time segmentation of human ears in either day or night time environments. The proposed method is based on Haar features forming a cascaded AdaBoost classifier (our modified version of the original Viola-Jones approach [1], which was designed to be applied mainly to visible-band images). The main advantage of the proposed method, applied on our profile face image data set collected in the thermal band, is that it is designed to reduce the learning time required by the original Viola-Jones method from several weeks to several hours. Unlike other approaches reported in the literature, which have been tested but not designed to operate in the thermal band, our method yields a high detection accuracy that reaches ~91.5%. Further analysis on our data set showed that: (a) photometric normalization techniques do not directly improve ear detection performance; however, when using a certain photometric normalization technique (CLAHE) on falsely detected images, the detection rate improved by ~4%; (b) the high detection accuracy of our method did not degrade when we lowered the original spatial resolution of the thermal ear images. For example, even when using one third of the original spatial resolution (i.e. ~20% of the original computational time) of the thermal profile face images, the high ear detection accuracy of our method remained unaffected. This also sped up the detection time of an ear image from 265 to 17 milliseconds per image. To the best of our knowledge this is the first time that the problem of human ear detection in the thermal band has been investigated in the open literature.
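
    The detection pipeline the paper describes (a cascaded AdaBoost classifier over Haar features, with CLAHE as optional photometric normalization) maps directly onto standard OpenCV calls. A minimal sketch: the cascade file `ear_cascade.xml` is hypothetical, since OpenCV ships no ear detector and the authors trained their own on thermal profile-face images, and the detection parameters are illustrative:

    ```python
    import cv2

    # Hypothetical, separately trained cascade (not bundled with OpenCV).
    cascade = cv2.CascadeClassifier("ear_cascade.xml")

    img = cv2.imread("thermal_profile_face.png", cv2.IMREAD_GRAYSCALE)

    # CLAHE photometric normalization; the paper found it helped mainly on
    # images the unnormalized detector got wrong.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    norm = clahe.apply(img)

    ears = cascade.detectMultiScale(norm, scaleFactor=1.1, minNeighbors=5,
                                    minSize=(24, 24))
    for (x, y, w, h) in ears:
        cv2.rectangle(norm, (x, y), (x + w, y + h), 255, 2)
    cv2.imwrite("detected.png", norm)
    ```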

  16. A commentary on the status of the behavioral approach in the healthcare marketplace.

    PubMed

    Moss, G R

    1993-12-01

    Clinically applied behavioral technology (e.g., integrated, systems-based hospital programs) and specific behavior therapies (e.g., systematic desensitization) have a long record of documented and powerful efficacy yet have failed to penetrate successfully the healthcare marketplace and to receive adequate public recognition. Many behavioral techniques are utilized widely without acknowledgement of their true origins. The current position of the behavioral approach in the healthcare marketplace is examined, and factors making for resistance to its acceptance are identified. Recommendations are offered for the more effective promotion of behavioral methods and services.

  17. Flattening maps for the visualization of multibranched vessels.

    PubMed

    Zhu, Lei; Haker, Steven; Tannenbaum, Allen

    2005-02-01

    In this paper, we present two novel algorithms which produce flattened visualizations of branched physiological surfaces, such as vessels. The first approach is a conformal mapping algorithm based on the minimization of two Dirichlet functionals. From a triangulated representation of vessel surfaces, we show how the algorithm can be implemented using a finite element technique. The second method is an algorithm which adjusts the conformal mapping to produce a flattened representation of the original surface while preserving areas. This approach employs the theory of optimal mass transport. Furthermore, a new way of extracting center lines for vessel fly-throughs is provided.
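
    As a rough illustration of the first algorithm's structure, the sketch below flattens a disk-topology triangulated patch by solving two discrete Laplace problems with the boundary pinned to the unit circle. For brevity it uses uniform (Tutte) edge weights; minimizing the Dirichlet functionals as in the paper corresponds to cotangent weights instead, and the area-preserving adjustment via optimal mass transport is not shown:

    ```python
    import numpy as np
    from scipy.sparse import lil_matrix, csr_matrix
    from scipy.sparse.linalg import spsolve

    def flatten_disk(vertices, triangles, boundary_loop):
        # Map a disk-topology patch to the unit disk: pin the ordered
        # boundary loop to the circle, then solve Laplace's equation for
        # each planar coordinate of the interior vertices.
        n = len(vertices)
        t = np.linspace(0, 2 * np.pi, len(boundary_loop), endpoint=False)
        uv = np.zeros((n, 2))
        uv[boundary_loop] = np.column_stack([np.cos(t), np.sin(t)])

        # Uniform-weight graph Laplacian (cotangent weights would give the
        # discrete Dirichlet-energy minimizer).
        L = lil_matrix((n, n))
        for tri in triangles:
            for i in range(3):
                a, b = tri[i], tri[(i + 1) % 3]
                L[a, b] = L[b, a] = -1.0
        L.setdiag(-np.asarray(L.sum(axis=1)).ravel())

        interior = np.setdiff1d(np.arange(n), boundary_loop)
        L = csr_matrix(L)
        A = L[interior][:, interior]
        B = L[interior][:, boundary_loop]
        for k in range(2):
            uv[interior, k] = spsolve(A, -B @ uv[boundary_loop, k])
        return uv
    ```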

  18. Flattening Maps for the Visualization of Multibranched Vessels

    PubMed Central

    Zhu, Lei; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    In this paper, we present two novel algorithms which produce flattened visualizations of branched physiological surfaces, such as vessels. The first approach is a conformal mapping algorithm based on the minimization of two Dirichlet functionals. From a triangulated representation of vessel surfaces, we show how the algorithm can be implemented using a finite element technique. The second method is an algorithm which adjusts the conformal mapping to produce a flattened representation of the original surface while preserving areas. This approach employs the theory of optimal mass transport. Furthermore, a new way of extracting center lines for vessel fly-throughs is provided. PMID:15707245

  19. Iterative outlier removal: A method for identifying outliers in laboratory recalibration studies

    PubMed Central

    Parrinello, Christina M.; Grams, Morgan E.; Sang, Yingying; Couper, David; Wruck, Lisa M.; Li, Danni; Eckfeldt, John H.; Selvin, Elizabeth; Coresh, Josef

    2016-01-01

    Background: Extreme values that arise for any reason, including through non-laboratory measurement procedure-related processes (inadequate mixing, evaporation, mislabeling), lead to outliers and inflate errors in recalibration studies. We present an approach termed iterative outlier removal (IOR) for identifying such outliers. Methods: We previously identified substantial laboratory drift in uric acid measurements in the Atherosclerosis Risk in Communities (ARIC) Study over time. Serum uric acid was originally measured in 1990–92 on a Coulter DACOS instrument using an uricase-based measurement procedure. To recalibrate previously measured concentrations to a newer enzymatic colorimetric measurement procedure, uric acid was re-measured in 200 participants from stored plasma in 2011–13 on a Beckman Olympus 480 autoanalyzer. To conduct IOR, we excluded data points >3 standard deviations (SDs) from the mean difference. We continued this process using the resulting data until no outliers remained. Results: IOR detected more outliers and yielded greater precision in simulation. The original mean difference (SD) in uric acid was 1.25 (0.62) mg/dL. After four iterations, 9 outliers were excluded, and the mean difference (SD) was 1.23 (0.45) mg/dL. Conducting only one round of outlier removal (the standard approach) would have excluded 4 outliers (mean difference [SD] = 1.22 [0.51] mg/dL). Applying the recalibration (derived from Deming regression) from each approach to the original measurements, the prevalence of hyperuricemia (>7 mg/dL) was 28.5% before IOR and 8.5% after IOR. Conclusion: IOR is a useful method for removal of extreme outliers irrelevant to recalibrating laboratory measurements, and identifies more extraneous outliers than the standard approach. PMID:27197675
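
    The IOR rule is simple enough to state in a few lines. A minimal sketch mirroring the paper's >3 SD criterion; the simulated differences echo the study's 200-sample recalibration setting but are synthetic:

    ```python
    import numpy as np

    def iterative_outlier_removal(diffs, n_sd=3.0):
        # Repeatedly drop points more than n_sd SDs from the mean of the
        # remaining differences until no such points are left.
        d = np.asarray(diffs, dtype=float)
        keep = np.ones(d.size, dtype=bool)
        while True:
            m, s = d[keep].mean(), d[keep].std(ddof=1)
            flag = keep & (np.abs(d - m) > n_sd * s)
            if not flag.any():
                return keep
            keep &= ~flag

    rng = np.random.default_rng(42)
    diffs = np.concatenate([rng.normal(1.25, 0.45, 191),
                            rng.normal(1.25, 0.45, 9) + rng.choice([-4, 4], 9)])
    keep = iterative_outlier_removal(diffs)
    print(f"excluded {np.count_nonzero(~keep)} of {diffs.size}; "
          f"mean diff {diffs[keep].mean():.2f} (SD {diffs[keep].std(ddof=1):.2f})")
    ```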

  20. Developmental Therapy- Developmental Teaching: An Outreach Project for Young Children with Social-Emotional-Behavioral Disabilities (October 1, 1997-September 30, 2000). Final Performance Report.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Coll. of Family and Consumer Sciences.

    This outreach project is based on the validated Developmental Therapy-Developmental Teaching model originally designed for young children with severe emotional/behavioral problems and their families. It is an approach that emphasizes the teaching skills that foster a child's social-emotional-behavioral competence. The model has proven effective in…

  1. A risk analysis approach for using discriminant functions to manage logging-related landslides on granitic terrain

    Treesearch

    Raymond M. Rice; Norman H. Pillsbury; Kurt W. Schmidt

    1985-01-01

    Abstract - A linear discriminant function, developed to predict debris avalanches after clearcut logging on a granitic batholith in northwestern California, was tested on data from two batholiths. The equation was inaccurate in predicting slope stability on one of them. A new equation based on slope, crown cover, and distance from a stream (retained from the original...

  2. The Expansion and Integration of the Loanwords in the Togo Remnant Languages: An Approach Based on the Akebu Language.

    ERIC Educational Resources Information Center

    Koffi, Phil Yao

    A study suggests that the nature of linguistic borrowing in a group of 14 African languages termed Togo remnant languages--Basila, Lelemie (Buem), Aogba, Adele, Likpe, Santrokofi, Akpafu-Lolobi, Avatime, Nyangbo-Tafi, Bowili, Aklo, Kposo, Kebu, Animere--is similar to that of the Akebu language. Analysis focuses on the origins and itineraries of…

  3. Complete Genome Sequence of a Putative Densovirus of the Asian Citrus Psyllid, Diaphorina citri.

    PubMed

    Nigg, Jared C; Nouri, Shahideh; Falk, Bryce W

    2016-07-28

    Here, we report the complete genome sequence of a putative densovirus of the Asian citrus psyllid, Diaphorina citri. Diaphorina citri densovirus (DcDNV) was originally identified through metagenomics, and here we obtained the complete nucleotide sequence using PCR-based approaches. Phylogenetic analysis places DcDNV between viruses of the Ambidensovirus and Iteradensovirus genera. Copyright © 2016 Nigg et al.

  4. The numerical modelling of MHD astrophysical flows with chemistry

    NASA Astrophysics Data System (ADS)

    Kulikov, I.; Chernykh, I.; Protasov, V.

    2017-10-01

    A new code for the numerical simulation of magnetohydrodynamical astrophysical flows including chemical reactions is presented in this paper. At the heart of the code is a new, original low-dissipation numerical method based on a combination of an operator-splitting approach and the piecewise-parabolic method on a local stencil. The chemodynamics of hydrogen during the turbulent formation of molecular clouds is modeled.

  5. An evaluation of fossil tip-dating versus node-age calibrations in tetraodontiform fishes (Teleostei: Percomorphaceae).

    PubMed

    Arcila, Dahiana; Alexander Pyron, R; Tyler, James C; Ortí, Guillermo; Betancur-R, Ricardo

    2015-01-01

    Time-calibrated phylogenies based on molecular data provide a framework for comparative studies. Calibration methods to combine fossil information with molecular phylogenies are, however, under active development, often generating disagreement about the best way to incorporate paleontological data into these analyses. This study provides an empirical comparison of the most widely used approach based on node-dating priors for relaxed clocks implemented in the programs BEAST and MrBayes, with two recently proposed improvements: one using a new fossilized birth-death process model for node dating (implemented in the program DPPDiv), and the other using a total-evidence or tip-dating method (implemented in MrBayes and BEAST). These methods are applied herein to tetraodontiform fishes, a diverse group of living and extinct taxa that features one of the most extensive fossil records among teleosts. Previous estimates of time-calibrated phylogenies of tetraodontiforms using node-dating methods reported disparate estimates for their age of origin, ranging from the late Jurassic to the early Paleocene (ca. 150-59Ma). We analyzed a comprehensive dataset with 16 loci and 210 morphological characters, including 131 taxa (95 extant and 36 fossil species) representing all families of fossil and extant tetraodontiforms, under different molecular clock calibration approaches. Results from node-dating methods produced consistently younger ages than the tip-dating approaches. The older ages inferred by tip dating imply an unlikely early-late Jurassic (ca. 185-119Ma) origin for this order and the existence of extended ghost lineages in their fossil record. Node-based methods, by contrast, produce time estimates that are more consistent with the stratigraphic record, suggesting a late Cretaceous (ca. 86-96Ma) origin. We show that the precision of clade age estimates using tip dating increases with the number of fossils analyzed and with the proximity of fossil taxa to the node under assessment. This study suggests that current implementations of tip dating may overestimate ages of divergence in calibrated phylogenies. It also provides a comprehensive phylogenetic framework for tetraodontiform systematics and future comparative studies. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Improving care and wellness in bipolar disorder: origins, evolution and future directions of a collaborative knowledge exchange network

    PubMed Central

    2012-01-01

    The Collaborative RESearch team to study psychosocial factors in bipolar disorder (CREST.BD) is a multidisciplinary, cross-sectoral network dedicated to both fundamental research and knowledge exchange on bipolar disorder (BD). The core mission of the network is to advance the science and understanding of psychological and social issues associated with BD, improve the care and wellness of people living with BD, and strengthen services and supports for these individuals. CREST.BD bridges traditional and newer research approaches, particularly embracing community-based participatory research (CBPR) methods. Membership of CREST is broad, including academic researchers, people with BD, their family members and supports, and a variety of health care providers. Here, we describe the origins, evolution, approach to planning and evaluation and future vision for our network within the landscape of CBPR and integrated knowledge translation (KT), and explore the keys and challenges to success we have encountered working within this framework. PMID:22963889

  7. Identifying and overcoming the interface originating c-axis instability in highly Sc enhanced AlN for piezoelectric micro-electromechanical systems

    NASA Astrophysics Data System (ADS)

    Fichtner, Simon; Wolff, Niklas; Krishnamurthy, Gnanavel; Petraru, Adrian; Bohse, Sascha; Lofink, Fabian; Chemnitz, Steffen; Kohlstedt, Hermann; Kienle, Lorenz; Wagner, Bernhard

    2017-07-01

    Enhancing the piezoelectric activity of AlN by partially substituting Al with Sc to form Al1-xScxN is a promising approach to improve the performance of piezoelectric micro-electromechanical systems. Here, we present evidence of an instability in the morphology of Al1-xScxN, which originates at, or close to, the substrate/Al1-xScxN interface and becomes more pronounced as the Sc content is increased. Based on transmission electron microscopy, piezoresponse force microscopy, X-ray diffraction, and SEM analysis, we identify this instability as the incipient formation of (100)-oriented grains. Approaches to successfully reestablish exclusive c-axis orientation up to x = 0.43 are revealed, with electrode pre-treatment and cathode-substrate distance found to exert significant influence. This allows us to present the first measurements of the transversal thin film piezoelectric coefficient e31,f and dielectric loss tangent tan δ beyond x = 0.3.

  8. Cerebrovascular diseases at the C. Mondino National Institute of Neurology: from Ottorino Rossi to the present day

    PubMed Central

    Micieli, Giuseppe; Martignoni, Emilia; Sandrini, Giorgio; Bono, Giorgio; Nappi, Giuseppe

    Summary This paper traces the development of research and healthcare models in the field of cerebrovascular disorders at the C. Mondino National Institute of Neurology in Pavia, Italy. It starts with a description of the original experiences of Ottorino Rossi and his thesis on atherosclerosis which date back to the beginning of the last century; it then illustrates the connections between his seminal essay and the future directions followed by research in this institute, through to the development of one of the first stroke units in Italy. In this context, we examine a large range of scientific approaches, many related to cerebrovascular diseases (such as headaches) and autonomic disorders, and some of their biological and physiological markers. The originality of an approach also based on tools of advanced technology, including information technology, is emphasised, as is the importance of passion and perseverance in the pursuit of extraordinary results in what is an extremely complex and difficult field. PMID:21729590

  9. Combination of watermarking and joint watermarking-decryption for reliability control and traceability of medical images.

    PubMed

    Bouslimi, D; Coatrieux, G; Cozic, M; Roux, Ch

    2014-01-01

    In this paper, we propose a novel crypto-watermarking system for the purpose of verifying the reliability of images and tracing them, i.e. identifying the person at the origin of an illegal distribution. This system couples a common watermarking method, based on Quantization Index Modulation (QIM), with a joint watermarking-decryption (JWD) approach. At the emitter side, it allows the insertion of a watermark as a proof of reliability of the image before sending it encrypted; at reception, another watermark, a proof of traceability, is embedded during the decryption process. The scheme we propose makes such a combination of watermarking approaches interoperable, taking into account the risk of interference between the embedded watermarks and allowing access to both the reliability and traceability proofs. Experimental results confirm the efficiency of our system and demonstrate that it can be used to identify the physician at the origin of a disclosure even if the image has been modified.
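
    The QIM primitive underlying the common watermarking layer is easy to state: each bit selects one of two interleaved quantization lattices. A minimal sketch; the step size is illustrative, and real systems embed in transform or prediction-error domains rather than raw sample values:

    ```python
    import numpy as np

    DELTA = 8.0  # quantization step: larger = more robust, more distortion

    def qim_embed(values, bits):
        # Embed one bit per host value: bit b shifts the lattice by b*DELTA/2.
        v = np.asarray(values, dtype=float)
        b = np.asarray(bits, dtype=float)
        return DELTA * np.round((v - b * DELTA / 2) / DELTA) + b * DELTA / 2

    def qim_extract(values):
        # Decode by finding which of the two lattices each value is closer to.
        v = np.asarray(values, dtype=float)
        d0 = np.abs(v - DELTA * np.round(v / DELTA))
        d1 = np.abs(v - (DELTA * np.round((v - DELTA / 2) / DELTA) + DELTA / 2))
        return (d1 < d0).astype(int)

    host = np.array([100.3, 42.7, 88.1, 13.9])
    bits = np.array([1, 0, 1, 1])
    marked = qim_embed(host, bits)
    assert np.array_equal(qim_extract(marked), bits)
    ```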

  10. Lightning Scaling Laws Revisited

    NASA Technical Reports Server (NTRS)

    Boccippio, D. J.; Arnold, James E. (Technical Monitor)

    2000-01-01

    Scaling laws relating storm electrical generator power (and hence lightning flash rate) to charge transport velocity and storm geometry were originally posed by Vonnegut (1963). These laws were later simplified to yield simple parameterizations for lightning based upon cloud top height, with separate parameterizations derived over land and ocean. It is demonstrated that the most recent ocean parameterization: (1) yields predictions of storm updraft velocity which appear inconsistent with observation, and (2) is formally inconsistent with the theory from which it purports to derive. Revised formulations consistent with Vonnegut's original framework are presented. These demonstrate that Vonnegut's theory is, to first order, consistent with observation. The implications of assuming that flash rate is set by the electrical generator power, rather than the electrical generator current, are examined. The two approaches yield significantly different predictions about the dependence of charge transfer per flash on storm dimensions, which should be empirically testable. The two approaches also differ significantly in their explanation of regional variability in lightning observations.
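
    For reference, the land/ocean cloud-top-height parameterizations referred to here are usually quoted in the Price and Rind form below (flash rate F in flashes per minute, cloud-top height H in km); the coefficients are reproduced from the commonly cited fits and should be checked against the original papers:

    ```latex
    F_{\mathrm{land}} = 3.44 \times 10^{-5}\, H^{4.9},
    \qquad
    F_{\mathrm{ocean}} = 6.4 \times 10^{-4}\, H^{1.73}
    ```

    The very different exponents are exactly the kind of land/ocean asymmetry the abstract argues is theoretically inconsistent for the oceanic case.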

  11. Magnetism as indirect tool for carbon content assessment in nickel nanoparticles

    NASA Astrophysics Data System (ADS)

    Oumellal, Y.; Magnin, Y.; Martínez de Yuso, A.; Aguiar Hualde, J. M.; Amara, H.; Paul-Boncour, V.; Matei Ghimbeu, C.; Malouche, A.; Bichara, C.; Pellenq, R.; Zlotea, C.

    2017-12-01

    We report a combined experimental and theoretical study to ascertain carbon solubility in nickel nanoparticles embedded in a carbon matrix via the one-pot method. This original approach is based on the experimental characterization of the magnetic properties of Ni at room temperature and on Monte Carlo simulations used to calculate the magnetization as a function of C content in Ni nanoparticles. Other commonly used experimental methods fail to accurately determine the chemical composition of these types of nanoparticles. We could thus assess the C content within the Ni nanoparticles, which decreases from 8 to around 4 at.% with increasing synthesis temperature. This behavior could be related to the catalytic transformation of dissolved C in the Ni particles into graphite layers surrounding the particles at high temperature. The proposed approach is original and easy to implement experimentally since only magnetization measurements at room temperature are needed. Moreover, it can be extended to other types of magnetic nanoparticles that dissolve carbon.

  12. Mean Flow Augmented Acoustics in Rocket Systems

    NASA Technical Reports Server (NTRS)

    Fischbach, Sean

    2014-01-01

    Combustion instability in solid rocket motors and liquid engines has long been a subject of concern. Many rockets display violent fluctuations in pressure, velocity, and temperature originating from the complex interactions between the combustion process and gas dynamics. Recent advances in energy-based modeling of combustion instabilities require accurate determination of acoustic frequencies and mode shapes. Of particular interest are the acoustic mean flow interactions within the converging section of a rocket nozzle, where gradients of pressure, density, and velocity become large. The expulsion of unsteady energy through the nozzle of a rocket is identified as the predominant source of acoustic damping for most rocket systems. Recently, an approach to address nozzle damping with mean flow effects was implemented by French [1]. This new approach extends the work originated by Sigman and Zinn [2] by solving the acoustic velocity potential equation (AVPE) formulated by perturbing the Euler equations [3]. The present study aims to implement the French model within the COMSOL Multiphysics framework and to analyze one of the author's presented test cases.

  13. Morphological and molecular phylogenetic context of the angiosperms: contrasting the 'top-down' and 'bottom-up' approaches used to infer the likely characteristics of the first flowers.

    PubMed

    Bateman, Richard M; Hilton, Jason; Rudall, Paula J

    2006-01-01

    Recent attempts to address the long-debated 'origin' of the angiosperms depend on a phylogenetic framework derived from a matrix of taxa versus characters; most assume that empirical rigour is proportional to the size of the matrix. Sequence-based genotypic approaches increase the number of characters (nucleotides and indels) in the matrix but are confined to the highly restricted spectrum of extant species, whereas morphology-based approaches increase the number of phylogenetically informative taxa (including fossils) at the expense of accessing only a restricted spectrum of phenotypic characters. The two approaches are currently delivering strongly contrasting hypotheses of relationship. Most molecular studies indicate that all extant gymnosperms form a natural group, suggesting surprisingly early divergence of the lineage that led to angiosperms, whereas morphology-only phylogenies indicate that a succession of (mostly extinct) gymnosperms preceded a later angiosperm origin. Causes of this conflict include: (i) the vast phenotypic and genotypic lacuna, largely reflecting pre-Cenozoic extinctions, that separates early-divergent living angiosperms from their closest relatives among the living gymnosperms; (ii) profound uncertainty regarding which (a) extant and (b) extinct angiosperms are most closely related to gymnosperms; and (iii) profound uncertainty regarding which (a) extant and (b) extinct gymnosperms are most closely related to angiosperms, and thus best serve as 'outgroups' dictating the perceived evolutionary polarity of character transitions among the early-divergent angiosperms. These factors still permit a remarkable range of contrasting, yet credible, hypotheses regarding the order of acquisition of the many phenotypic characters, reproductive and vegetative, that distinguish 'classic' angiospermy from 'classic' gymnospermy. The flower remains ill-defined and its mode (or modes) of origin remains hotly disputed; some definitions and hypotheses of evolutionary relationships preclude a role for the flower in delimiting the angiosperms. We advocate maintenance of parallel, reciprocally illuminating programmes of morphological and molecular phylogeny reconstruction, respectively supported by homology testing through additional taxa (especially fossils) and evolutionary-developmental genetic studies that explore genes potentially responsible for major phenotypic transitions.

  14. Molecular phylogeny supports the paraphyletic nature of the genus Trogoderma (Coleoptera: Dermestidae) collected in the Australasian ecozone.

    PubMed

    Castalanelli, M A; Baker, A M; Munyard, K A; Grimm, M; Groth, D M

    2012-02-01

    To date, a molecular phylogenetic approach has not been used to investigate the evolutionary structure of Trogoderma and closely related genera. Using two mitochondrial genes, Cytochrome Oxidase I and Cytochrome B, and the nuclear gene 18S, the reported polyphyletic positioning of Trogoderma was examined. Paraphyly in Trogoderma was observed, with one Australian Trogoderma species reconciled as sister to all Dermestidae and the Anthrenocerus genus deeply nested within the Australian Trogoderma clade. In addition, the time to most recent common ancestor for a number of Dermestidae was calculated. Based on these estimates, the origin of the Dermestidae exceeds 175 million years, placing the origins of this family in Pangaea.

  15. Consistency assessment with global and bridging development strategies in emerging markets.

    PubMed

    Li, Gang; Chen, Josh; Quan, Hui; Shentu, Yue

    2013-11-01

    A global trial strategy with the participation of all major regions, including countries from emerging markets, clearly increases new drug development efficiency. Nevertheless, there are circumstances in which some countries in emerging markets cannot join the original global trial. To evaluate the extrapolability of the original trial results to a new country, a bridging trial has to be conducted in that country. In this paper, we first evaluate the efficiency loss of the bridging trial strategy compared to that of the global trial strategy as a function of between-study variability, from a consistency assessment perspective. The evidence provided should encourage countries in emerging markets to make a greater effort to participate in the original global trial. We then discuss the sample size required for a desired assurance probability of consistency assessment based on various approaches for both the global and bridging trial strategies. Examples are presented for numerical demonstration and comparison. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. A novel alignment-free method for detection of lateral genetic transfer based on TF-IDF.

    PubMed

    Cong, Yingnan; Chan, Yao-Ban; Ragan, Mark A

    2016-07-25

    Lateral genetic transfer (LGT) plays an important role in the evolution of microbes. Existing computational methods for detecting genomic regions of putative lateral origin scale poorly to large data. Here, we propose a novel method based on TF-IDF (Term Frequency-Inverse Document Frequency) statistics to detect not only regions of lateral origin, but also their origin and direction of transfer, in sets of hierarchically structured nucleotide or protein sequences. This approach is based on the frequency distributions of k-mers in the sequences. If a set of contiguous k-mers appears sufficiently more frequently in another phyletic group than in its own, we infer that they have been transferred from the first group to the second. We performed rigorous tests of TF-IDF using simulated and empirical datasets. With the simulated data, we tested our method under different parameter settings for sequence length, substitution rate between and within groups and post-LGT, deletion rate, length of transferred region and k size, and found that we can detect LGT events with high precision and recall. Our method performs better than an established method, ALFY, which has high recall but low precision. Our method is efficient, with runtime increasing approximately linearly with sequence length.
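
    The core scoring idea can be sketched in a few lines. This is a toy sketch only: each phyletic group is treated as a document, and k-mers are scored high when frequent in one group but present in few others; the authors' method further scans for contiguous runs of such k-mers to delimit transferred regions and infer direction, which is omitted here. The data and k value are illustrative:

    ```python
    import math
    from collections import Counter

    def kmers(seq, k):
        return [seq[i:i + k] for i in range(len(seq) - k + 1)]

    def tfidf_by_group(groups, k=8):
        # groups: dict of group name -> list of sequences. Returns per-group
        # TF-IDF scores: high = frequent in this group, rare across groups.
        tf = {g: Counter(km for s in seqs for km in kmers(s, k))
              for g, seqs in groups.items()}
        n_groups = len(groups)
        df = Counter(km for c in tf.values() for km in c)  # group frequency
        return {g: {km: cnt / sum(c.values()) * math.log(n_groups / df[km])
                    for km, cnt in c.items()}
                for g, c in tf.items()}

    demo = {"A": ["ACGTACGTGGCCAATT"], "B": ["TTGGCCAACGTACGTA"]}
    print(tfidf_by_group(demo, k=4)["A"])
    ```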

  17. Evolutionary origin and early biogeography of otophysan fishes (Ostariophysi: Teleostei).

    PubMed

    Chen, Wei-Jen; Lavoué, Sébastien; Mayden, Richard L

    2013-08-01

    The biogeography of the mega-diverse, freshwater, and globally distributed Otophysi has received considerable attention. This attraction largely stems from assumptions as to their ancient origin, the clade being almost exclusively freshwater, and their suitability as to explanations of trans-oceanic distributions. Despite multiple hypotheses explaining present-day distributions, problems remain, precluding more parsimonious explanations. Underlying previous hypotheses are alternative phylogenies for Otophysi, uncertainties as to temporal diversification and assumptions integral to various explanations. We reexamine the origin and early diversification of this clade based on a comprehensive time-calibrated, molecular-based phylogenetic analysis and event-based approaches for ancestral range inference of lineages. Our results do not corroborate current phylogenetic classifications of otophysans. We demonstrate Siluriformes are never sister to Gymnotiformes and Characiformes are most likely nonmonophyletic. Divergence time estimates specify a split between Cypriniformes and Characiphysi with the fragmentation of Pangea. The early diversification of characiphysans either predated, or was contemporary with, the separation of Africa and South America, and involved a combination of within- and between-continental divergence events for these lineages. The intercontinental diversification of siluroids and characoids postdated major intercontinental tectonic fragmentations (<90 Mya). Post-tectonic drift dispersal events are hypothesized to account for their current distribution patterns. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.

  18. A Bat Algorithm with Mutation for UCAV Path Planning

    PubMed Central

    Wang, Gaige; Guo, Lihong; Duan, Hong; Liu, Luo; Wang, Heqi

    2012-01-01

    Path planning for an uninhabited combat air vehicle (UCAV) is a complicated high-dimensional optimization problem, which mainly centers on optimizing the flight route subject to different kinds of constraints in complicated battlefield environments. The original bat algorithm (BA) is used to solve the UCAV path planning problem. Furthermore, a new bat algorithm with mutation (BAM) is proposed to solve the UCAV path planning problem, in which a mutation operation between bats is applied during the updating of new solutions. The UCAV can then find a safe path by connecting chosen coordinate nodes while avoiding threat areas and minimizing fuel cost. This new approach accelerates the global convergence speed while preserving the strong robustness of the basic BA. The realization procedure for the original BA and for the improved metaheuristic BAM is also presented. To prove the performance of the proposed metaheuristic method, BAM is compared with BA and other population-based optimization methods, such as ACO, BBO, DE, ES, GA, PBIL, PSO, and SGA. The experiments show that the proposed approach is more effective and feasible in UCAV path planning than the other models. PMID:23365518
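
    The core BA update is compact enough to sketch. The version below is simplified: it omits the loudness and pulse-emission-rate dynamics of the full BA, the mutation step is a stand-in for BAM's operator rather than the authors' exact scheme, and the cost function is a placeholder for the route threat/fuel objective:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def cost(x):  # placeholder for the UCAV route threat/fuel cost
        return float(np.sum(x * x))

    n, dim, iters = 25, 2, 300
    f_lo, f_hi = 0.0, 2.0               # pulse frequency range
    x = rng.uniform(-10, 10, (n, dim))  # bat positions (candidate routes)
    v = np.zeros((n, dim))
    fit = np.array([cost(b) for b in x])
    best = x[fit.argmin()].copy()

    for _ in range(iters):
        for i in range(n):
            f = f_lo + (f_hi - f_lo) * rng.random()  # random pulse frequency
            v[i] += (x[i] - best) * f                # pull toward global best
            cand = x[i] + v[i]
            # Mutation step (BAM's addition, here a DE-like stand-in):
            # occasionally mix in a randomly chosen other bat.
            if rng.random() < 0.1:
                cand = cand + 0.5 * (x[rng.integers(n)] - cand)
            c = cost(cand)
            if c <= fit[i]:
                x[i], fit[i] = cand, c
                if c < cost(best):
                    best = cand.copy()

    print("best cost:", cost(best))
    ```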

  19. Automatic video summarization driven by a spatio-temporal attention model

    NASA Astrophysics Data System (ADS)

    Barland, R.; Saadane, A.

    2008-02-01

    According to the literature, automatic video summarization techniques can be classified into two categories based on the nature of the output: "video skims", which are generated using portions of the original video, and "key-frame sets", which correspond to images, selected from the original video, having significant semantic content. The difference between these two categories is reduced when we consider automatic procedures. Most published approaches are based on the image signal and use pixel characterization, histogram techniques, or image decomposition by blocks. However, few of them integrate properties of the Human Visual System (HVS). In this paper, we propose to extract key-frames for video summarization by studying the variations of salient information between two consecutive frames. For each frame, a saliency map is produced simulating human visual attention by a bottom-up (signal-dependent) approach. This approach includes three parallel channels for processing three early visual features: intensity, color and temporal contrasts. For each channel, the variations of the salient information between two consecutive frames are computed. These outputs are then combined to produce the global saliency variation, which determines the key-frames. Psychophysical experiments have been defined and conducted to analyze the relevance of the proposed key-frame extraction algorithm.
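
    The selection stage can be sketched independently of the saliency model itself. In the minimal sketch below, the hypothetical `saliency_maps` input stands in for the combined output of the three bottom-up channels, and the peak-picking threshold is an assumption, not the authors' rule:

    ```python
    import numpy as np

    def select_keyframes(saliency_maps, threshold=None):
        # saliency_maps: array (n_frames, H, W) of per-frame saliency.
        s = np.asarray(saliency_maps, dtype=float)
        # Global saliency variation between consecutive frames.
        var = np.abs(np.diff(s, axis=0)).mean(axis=(1, 2))
        if threshold is None:
            threshold = var.mean() + var.std()
        # Key frames at local maxima of the variation signal above threshold.
        return [i + 1 for i in range(1, len(var) - 1)
                if var[i] > threshold
                and var[i] >= var[i - 1] and var[i] >= var[i + 1]]

    frames = np.random.default_rng(3).random((60, 32, 32))
    print(select_keyframes(frames))
    ```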

  20. PSEA-Quant: a protein set enrichment analysis on label-free and label-based protein quantification data.

    PubMed

    Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2014-12-05

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.

  1. PSEA-Quant: A Protein Set Enrichment Analysis on Label-Free and Label-Based Protein Quantification Data

    PubMed Central

    2015-01-01

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights. PMID:25177766

  2. Human health risk assessment: models for predicting the effective exposure duration of on-site receptors exposed to contaminated groundwater.

    PubMed

    Baciocchi, Renato; Berardi, Simona; Verginelli, Iason

    2010-09-15

    Clean-up of contaminated sites is usually based on a risk-based approach for the definition of remediation goals, relying on the well-known ASTM-RBCA standard procedure. In this procedure, migration of contaminants is described through simple analytical models, and the source contaminant concentration is assumed to be constant throughout the entire exposure period, i.e. 25-30 years. The latter assumption may often prove over-protective of human health, leading to unrealistically low remediation goals. The aim of this work is to propose an alternative model taking source depletion into account, while keeping the original simplicity and analytical form of the ASTM-RBCA approach. The results obtained by the application of this model are compared with those provided by the traditional ASTM-RBCA approach, by a model based on the source depletion algorithm of the RBCA ToolKit software, and by a numerical model, allowing its feasibility for inclusion in risk analysis procedures to be assessed. The results discussed in this work are limited to on-site exposure to contaminated water by ingestion, but the proposed approach can be extended to other exposure pathways. Copyright 2010 Elsevier B.V. All rights reserved.
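
    A generic way to relax the constant-source assumption, shown here only as an illustrative sketch and not as the paper's exact algorithm, is a first-order source depletion term:

    ```latex
    C(t) = C_0\, e^{-k t},
    \qquad
    \bar{C}(\mathrm{ED}) = \frac{C_0}{k\,\mathrm{ED}}\left(1 - e^{-k\,\mathrm{ED}}\right)
    ```

    Here k is a depletion rate constant (for instance, the ratio of the mass flux leaving the source to the mass initially in place), and the time-averaged concentration over the exposure duration ED replaces the constant C_0 of the standard ASTM-RBCA calculation, yielding less conservative (higher) remediation goals.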

  3. Evolution of the ART approach: highlights and achievements

    PubMed Central

    FRENCKEN, Jo E.

    2009-01-01

    ABSTRACT Atraumatic Restorative Treatment (ART) was initiated in the mid-eighties in Tanzania in response to an inappropriately functioning community oral health programme that was based on western health care models and western technology. The approach has evolved to its present standing as an effective minimal intervention approach mainly because the originators anticipated the great potential of ART to alleviate inequality in oral health care, and because they recognised the need to carry out research to investigate its effectiveness and applicability. Twenty-five years later, ART was accepted by the World Health Organisation (1994) and the FDI World Dental Federation (2002). It is included in textbooks on cariology, restorative dentistry and minimal intervention dentistry. It is being systematically introduced into public oral health service systems in a number of low- and middle income countries. Private practitioners use it. Many publications related to aspects of ART have been published and many more will follow. To achieve quality results with ART one has to attend well-conducted and sufficiently long training courses, preferably in combination with other caries preventive strategies. ART should, therefore, not be considered in isolation and must be part of an evidence-based approach to oral health with a strong foundation based on prevention. PMID:21499660

  4. Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework

    NASA Astrophysics Data System (ADS)

    Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.

    2016-03-01

    A social multi-criteria evaluation framework for solving a real-world problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic-label assessment, able to address uncertainty and deal with different levels of precision. The method rests on qualitative reasoning, an artificial intelligence technique, for assessing and ranking multi-attribute alternatives described by linguistic labels. It is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors under high levels of complexity and uncertainty. The method is compared with an existing approach that has previously been applied to the wind farm location problem. This approach, an outranking method, is based on Condorcet's original method. The results obtained by both approaches are analysed, and their performance in the selection of the wind farm location and their aggregation procedures are compared. Although the results show that both methods lead to similar rankings of the alternatives, the study highlights both their advantages and drawbacks.

  5. An emergentist perspective on the origin of number sense

    PubMed Central

    2018-01-01

    The finding that human infants and many other animal species are sensitive to numerical quantity has been widely interpreted as evidence for evolved, biologically determined numerical capacities across unrelated species, thereby supporting a ‘nativist’ stance on the origin of number sense. Here, we tackle this issue within the ‘emergentist’ perspective provided by artificial neural network models, and we build on computer simulations to discuss two different approaches to think about the innateness of number sense. The first, illustrated by artificial life simulations, shows that numerical abilities can be supported by domain-specific representations emerging from evolutionary pressure. The second assumes that numerical representations need not be genetically pre-determined but can emerge from the interplay between innate architectural constraints and domain-general learning mechanisms, instantiated in deep learning simulations. We show that deep neural networks endowed with basic visuospatial processing exhibit a remarkable performance in numerosity discrimination before any experience-dependent learning, whereas unsupervised sensory experience with visual sets leads to subsequent improvement of number acuity and reduces the influence of continuous visual cues. The emergent neuronal code for numbers in the model includes both numerosity-sensitive (summation coding) and numerosity-selective response profiles, closely mirroring those found in monkey intraparietal neurons. We conclude that a form of innatism based on architectural and learning biases is a fruitful approach to understanding the origin and development of number sense. This article is part of a discussion meeting issue ‘The origins of numerical abilities’. PMID:29292348

  6. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    PubMed

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has since governed the transfers of release monographs from R&D sites to Manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced in our practices, the comparative testing one, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. This is conducted with the aim of controlling the most important consumer's risks involved, at two levels, in analytical decisions in the frame of transfer studies: the risk, for the receiving laboratory, of taking poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for a better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards an increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
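
    For readers unfamiliar with the statistical core of the comparative testing strategy, here is a minimal two one-sided tests (TOST) sketch in Python comparing receiving- and sending-laboratory assay means; the data, the +/-2% acceptance margin, and the pooled degrees of freedom are simplifying assumptions, not the paper's criteria.

      import numpy as np
      from scipy import stats

      # Hypothetical assay results (% of label claim); margin and df are
      # simplifying assumptions, not the paper's acceptance criteria.
      rng = np.random.default_rng(0)
      sending = rng.normal(100.0, 1.0, 12)       # R&D (sending) laboratory
      receiving = rng.normal(100.4, 1.0, 12)     # manufacturing (receiving) laboratory

      margin = 2.0
      diff = receiving.mean() - sending.mean()
      se = np.sqrt(sending.var(ddof=1) / 12 + receiving.var(ddof=1) / 12)
      df = 22                                    # rough pooled df for this sketch

      # Two one-sided tests: both must reject to conclude equivalence.
      p_lower = stats.t.sf((diff + margin) / se, df)   # H0: diff <= -margin
      p_upper = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin
      p_tost = max(p_lower, p_upper)
      print(f"diff = {diff:.2f}, TOST p = {p_tost:.4g}, equivalent at 5%: {p_tost < 0.05}")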

  7. Recreational Stream Crossing Effects on Sediment Delivery and Macroinvertebrates in Southwestern Virginia, USA

    NASA Astrophysics Data System (ADS)

    Kidd, Kathryn R.; Aust, W. Michael; Copenheaver, Carolyn A.

    2014-09-01

    Trail-based recreation has increased over recent decades, raising the environmental management issue of soil erosion that originates from unsurfaced, recreational trail systems. Trail-based soil erosion that occurs near stream crossings represents a non-point source of pollution to streams. We modeled soil erosion rates along multiple-use (hiking, mountain biking, and horseback riding) recreational trails that approach culvert and ford stream crossings as potential sources of sediment input and evaluated whether recreational stream crossings were impacting water quality based on downstream changes in macroinvertebrate-based indices within the Poverty Creek Trail System of the George Washington and Jefferson National Forest in southwestern Virginia, USA. Modeled soil erosion rates for non-motorized recreational approaches were lower than published estimates for an off-road vehicle approach, bare horse trails, and bare forest operational skid trail and road approaches, but were 13 times greater than estimated rates for undisturbed forests and 2.4 times greater than for a 2-year-old clearcut in this region. Estimated soil erosion rates were similar to rates for skid trails and horse trails where best management practices (BMPs) had been implemented. Downstream changes in macroinvertebrate-based indices indicated water quality was lower downstream from crossings than in upstream reference reaches. Our modeled soil erosion rates illustrate that recreational stream crossing approaches have the potential to deliver sediment into adjacent streams, particularly where BMPs are not being implemented or where approaches are not properly managed, and as a result can negatively impact water quality below stream crossings.

  8. A Spatiotemporal Indexing Approach for Efficient Processing of Big Array-Based Climate Data with MapReduce

    NASA Technical Reports Server (NTRS)

    Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei

    2016-01-01

    Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in their original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as to balance the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection application deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
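
    The Python sketch below illustrates the general idea of such an index: mapping logical (time, lat, lon) cells of an array stored in its native flat layout to byte ranges, so queries read only what they need. The grid sizes are hypothetical, and this is not the paper's Hadoop implementation.

      # Illustrative only (not the paper's MERRA/Hadoop implementation): a
      # 3-D array (time, lat, lon) stored row-major in one flat binary file.
      # The "index" maps grid cells to byte ranges so a spatiotemporal query
      # reads only the needed segments of the native file.
      NT, NLAT, NLON, ITEM = 24, 361, 576, 4         # hypothetical float32 grid

      def byte_ranges(t, lat0, lat1, lon0, lon1):
          """Byte ranges covering cells [t, lat0:lat1, lon0:lon1]."""
          out = []
          for lat in range(lat0, lat1):
              start = ((t * NLAT + lat) * NLON + lon0) * ITEM
              out.append((start, (lon1 - lon0) * ITEM))  # (offset, length)
          return out

      # A partitioner would hand each MapReduce worker the ranges stored on
      # its node, achieving data locality without reformatting the file.
      print(byte_ranges(t=3, lat0=100, lat1=102, lon0=200, lon1=210))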

  9. "Shape function + memory mechanism"-based hysteresis modeling of magnetorheological fluid actuators

    NASA Astrophysics Data System (ADS)

    Qian, Li-Jun; Chen, Peng; Cai, Fei-Long; Bai, Xian-Xu

    2018-03-01

    A hysteresis model based on "shape function + memory mechanism" is presented, and its feasibility is verified by modeling the hysteresis behavior of a magnetorheological (MR) damper. A hysteresis phenomenon in a resistor-capacitor (RC) circuit is first presented and analyzed. In the hysteresis model, the "memory mechanism" originating from the charging and discharging processes of the RC circuit is constructed by adopting a virtual displacement variable and updating laws for the reference points. The "shape function" is derived and generalized from analytical solutions of the simple semi-linear Duhem model. In this approach, the memory mechanism reveals the essence of the specific Duhem model, and the general shape function provides a direct and clear means to fit the hysteresis loop. Within the structure of a "Restructured phenomenological model", the original hysteresis operator, i.e., the Bouc-Wen operator, is replaced with the new hysteresis operator. Comparison with the Bouc-Wen operator-based model demonstrates that the new hysteresis operator-based model achieves high computational efficiency with comparable accuracy.
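
    As a minimal stand-in for the RC-circuit motivation (not the paper's operator), the Python sketch below integrates a first-order lag under periodic driving; sampling the output at the same input value on the rising and falling branches exposes the history dependence that the hysteresis loop encodes.

      import numpy as np

      # Minimal stand-in for the paper's RC-circuit motivation: a first-order
      # lag driven periodically. Plotting v_out against v_in traces a loop
      # whose "memory" is the capacitor's charging/discharging history.
      tau, dt = 0.5, 1e-3
      t = np.arange(0.0, 4.0, dt)
      v_in = np.sin(2 * np.pi * t)

      v_out = np.zeros_like(t)
      for k in range(1, len(t)):                     # explicit Euler integration
          v_out[k] = v_out[k-1] + dt * (v_in[k-1] - v_out[k-1]) / tau

      # Same input value, different outputs on rising vs. falling branches:
      i_rise, i_fall = 2000, 2500                    # t = 2.0 s and 2.5 s, v_in ~ 0
      print("v_out at v_in=0:", round(v_out[i_rise], 3), "vs", round(v_out[i_fall], 3))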

  10. Technical Note: Simple, scalable, and sensitive protocol for retrieving Bacillus anthracis (and other live bacteria) from heroin.

    PubMed

    Grass, Gregor; Ahrens, Bjoern; Schleenbecker, Uwe; Dobrzykowski, Linda; Wagner, Matthias; Krüger, Christian; Wölfel, Roman

    2016-02-01

    We describe a culture-based method suitable for isolating Bacillus anthracis and other live bacteria from heroin. This protocol was developed as a consequence of the bioforensic need to retrieve bacteria from batches of the drug associated with cases of injectional anthrax among heroin-consumers in Europe. This uncommon manifestation of infection with the notorious pathogen B. anthracis has resulted in 26 deaths between the years 2000 and 2013. Thus far, no live disease agent has been isolated from heroin during forensic investigations surrounding these incidents. Because of the conjectured very small number of disease-causing endospores in the contaminated drug, it is likely that too few target sequences are available for molecular genetic analysis. Therefore, a direct culture-based approach was chosen here. Endospores of attenuated B. anthracis artificially spiked into heroin were successfully retrieved at 84-98% recovery rates using a wash solution consisting of 0.5% Tween 20 in water. Using this approach, 82 samples of un-cut heroin originating from the German Federal Criminal Police Office's heroin analysis program seized during the period between 2000 and 2014 were tested and found to be surprisingly poor in retrievable bacteria. Notably, while no B. anthracis was isolated from the drug batches, other bacteria were successfully cultured. The resulting methodical protocol is therefore suitable for analyzing un-cut heroin, which can be anticipated to comprise the original microbiota from the drug's source without interference from contaminations introduced by cutting. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Drug delivery into the cochlear apex: Improved control to sequentially affect finely spaced regions along the entire length of the cochlear spiral.

    PubMed

    Lichtenhan, J T; Hartsock, J; Dornhoffer, J R; Donovan, K M; Salt, A N

    2016-11-01

    Administering pharmaceuticals to the scala tympani of the inner ear is a common approach to study cochlear physiology and mechanics. We present here a novel method for in vivo drug delivery in a controlled manner to sealed ears. Injections of ototoxic solutions were applied from a pipette sealed into a fenestra in the cochlear apex, progressively driving solutions along the length of scala tympani toward the cochlear aqueduct at the base. Drugs can be delivered rapidly or slowly. In this report we focus on slow delivery, in which the injection rate is automatically adjusted to account for the varying cross-sectional area of the scala tympani, thereby driving the solution front at a uniform rate. Objective measurements originating from finely spaced, low- to high-characteristic cochlear frequency places were sequentially affected. Comparison with existing method(s): Controlled administration of pharmaceuticals into the cochlear apex overcomes a number of serious limitations of previously established methods, such as cochlear perfusions with an injection pipette in the cochlear base: the drug concentration achieved is more precisely controlled, drug concentrations remain in scala tympani and are not rapidly washed out by cerebrospinal fluid flow, and the entire length of the cochlear spiral can be treated quickly or slowly with time. Controlled administration of solutions into the cochlear apex can be a powerful approach to sequentially affect objective measurements originating from finely spaced cochlear regions and allows, for the first time, the spatial origin of CAPs to be objectively defined. Copyright © 2016 Elsevier B.V. All rights reserved.
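
    The rate-adjustment idea reduces to Q(t) = A(x(t))·v for a front advancing at constant speed v; the Python sketch below uses a made-up, linearly tapering area profile rather than real scala tympani anatomy.

      import numpy as np

      # Hypothetical, linearly tapering area profile A(x); real values would
      # come from published scala tympani measurements.
      x = np.linspace(0.0, 20.0, 201)        # distance from the apex, mm
      A = 0.3 + 0.08 * x                     # cross-sectional area, mm^2 (invented)

      v = 0.5                                # desired front speed, mm/min
      t = x / v                              # time at which the front reaches x
      Q = A * v                              # required flow rate, mm^3/min = uL/min

      # Pump schedule: at time t[i], inject at rate Q[i] so the front advances
      # uniformly despite the widening duct.
      for ti, qi in list(zip(t, Q))[::50]:
          print(f"t = {ti:5.1f} min  ->  Q = {qi:.3f} uL/min")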

  12. Drug delivery into the cochlear apex: Improved control to sequentially affect finely spaced regions along the entire length of the cochlear spiral

    PubMed Central

    Lichtenhan, JT; Hartsock, J; Dornhoffer, JR; Donovan, KM; Salt, AN

    2016-01-01

    Background Administering pharmaceuticals to the scala tympani of the inner ear is a common approach to study cochlear physiology and mechanics. We present here a novel method for in vivo drug delivery in a controlled manner to sealed ears. New method Injections of ototoxic solutions were applied from a pipette sealed into a fenestra in the cochlear apex, progressively driving solutions along the length of scala tympani toward the cochlear aqueduct at the base. Drugs can be delivered rapidly or slowly. In this report we focus on slow delivery, in which the injection rate is automatically adjusted to account for the varying cross-sectional area of the scala tympani, thereby driving the solution front at a uniform rate. Results Objective measurements originating from finely spaced, low- to high-characteristic cochlear frequency places were sequentially affected. Comparison with existing method(s): Controlled administration of pharmaceuticals into the cochlear apex overcomes a number of serious limitations of previously established methods, such as cochlear perfusions with an injection pipette in the cochlear base: the drug concentration achieved is more precisely controlled, drug concentrations remain in scala tympani and are not rapidly washed out by cerebrospinal fluid flow, and the entire length of the cochlear spiral can be treated quickly or slowly with time. Conclusions Controlled administration of solutions into the cochlear apex can be a powerful approach to sequentially affect objective measurements originating from finely spaced cochlear regions and allows, for the first time, the spatial origin of CAPs to be objectively defined. PMID:27506463

  13. Selecting Populations for Non-Analogous Climate Conditions Using Universal Response Functions: The Case of Douglas-Fir in Central Europe

    PubMed Central

    Chakraborty, Debojyoti; Wang, Tongli; Andre, Konrad; Konnert, Monika; Lexer, Manfred J.; Matulla, Christoph; Schueler, Silvio

    2015-01-01

    Identifying populations within tree species potentially adapted to future climatic conditions is an important requirement for reforestation and assisted migration programmes. Such populations can be identified either by empirical response functions based on correlations of quantitative traits with climate variables or by climate envelope models that compare the climate of seed sources and potential growing areas. In the present study, we analyzed the intraspecific variation in climate growth response of Douglas-fir planted within the non-analogous climate conditions of Central and continental Europe. With data from 50 common garden trials, we developed Universal Response Functions (URF) for tree height and mean basal area and compared the growth performance of the selected best-performing populations with that of populations identified through a climate envelope approach. Climate variables of the trial location were found to be stronger predictors of growth performance than climate variables of the population origin. Although the precipitation regime of the population sources varied strongly, none of the precipitation-related climate variables of population origin was found to be significant within the models. Overall, the URFs explained more than 88% of variation in growth performance. Populations identified by the URF models originate from the western Cascades and coastal areas of Washington and Oregon and show significantly higher growth performance than populations identified by the climate envelope approach under both current and climate change scenarios. The URFs predict decreasing growth performance at low and middle elevations of the case study area, but increasing growth performance on high-elevation sites. Our analysis suggests that population recommendations based on empirical approaches should be preferred, and population selections by climate envelope models that do not consider climatic constraints on growth performance should be carefully appraised before transferring populations to planting locations with novel or dissimilar climate. PMID:26288363

  14. AveBoost2: Boosting for Noisy Data

    NASA Technical Reports Server (NTRS)

    Oza, Nikunj C.

    2004-01-01

    AdaBoost is a well-known ensemble learning algorithm that constructs its constituent or base models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence. The idea is to make the next base model's errors uncorrelated with those of the previous model. In previous work, we developed an algorithm, AveBoost, that constructed distributions orthogonal to the mistake vectors of all the previous models, and then averaged them to create the next base model's distribution. Our experiments demonstrated the superior accuracy of our approach. In this paper, we slightly revise our algorithm to allow us to obtain non-trivial theoretical results: bounds on the training error and generalization error (the difference between training and test error). Our averaging process has a regularizing effect which, as expected, leads to a worse training error bound for our algorithm than for AdaBoost but a superior generalization error bound. For this paper, we experimented with the data used in our earlier work, both as originally supplied and with added label noise, in which a small fraction of the data has its original label changed. Noisy data are notoriously difficult for AdaBoost to learn. Our algorithm's performance improvement over AdaBoost is even greater on the noisy data than on the original data.
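
    A hedged Python sketch of the averaging idea follows; it keeps AdaBoost's reweighting step but trains each stump on a running average of the distributions, which is a simplification of the published orthogonalize-then-average update.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      # Sketch of the averaging idea only; the published AveBoost update
      # orthogonalizes against all previous mistake vectors, which this
      # simplification replaces with a running average of distributions.
      def aveboost_sketch(X, y, rounds=10):
          n = len(y)
          d_avg = np.full(n, 1.0 / n)        # running average of distributions
          models, alphas = [], []
          for t in range(1, rounds + 1):
              h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=d_avg)
              miss = h.predict(X) != y
              err = d_avg[miss].sum()
              if err == 0 or err >= 0.5:
                  break
              alpha = 0.5 * np.log((1 - err) / err)
              d = d_avg * np.exp(alpha * np.where(miss, 1.0, -1.0))   # AdaBoost step
              d /= d.sum()
              d_avg = (t * d_avg + d) / (t + 1)      # averaging regularizes the update
              models.append(h)
              alphas.append(alpha)
          return models, alphas

      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 5))
      y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)
      models, alphas = aveboost_sketch(X, y)
      print(len(models), "stumps, alphas:", np.round(alphas, 2))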

  15. A Multilocus Molecular Phylogeny of the Parrots (Psittaciformes): Support for a Gondwanan Origin during the Cretaceous

    PubMed Central

    Schirtzinger, Erin E.; Matsumoto, Tania; Eberhard, Jessica R.; Graves, Gary R.; Sanchez, Juan J.; Capelli, Sara; Müller, Heinrich; Scharpegge, Julia; Chambers, Geoffrey K.; Fleischer, Robert C.

    2008-01-01

    The question of when modern birds (Neornithes) first diversified has generated much debate among avian systematists. Fossil evidence generally supports a Tertiary diversification, whereas estimates based on molecular dating favor an earlier diversification in the Cretaceous period. In this study, we used an alternate approach, the inference of historical biogeographic patterns, to test the hypothesis that the initial radiation of the Order Psittaciformes (the parrots and cockatoos) originated on the Gondwana supercontinent during the Cretaceous. We utilized broad taxonomic sampling (representatives of 69 of the 82 extant genera and 8 outgroup taxa) and multilocus molecular character sampling (3,941 bp from mitochondrial DNA (mtDNA) genes cytochrome oxidase I and NADH dehydrogenase 2 and nuclear introns of rhodopsin intron 1, tropomyosin alpha-subunit intron 5, and transforming growth factor ß-2) to generate phylogenetic hypotheses for the Psittaciformes. Analyses of the combined character partitions using maximum parsimony, maximum likelihood, and Bayesian criteria produced well-resolved and topologically similar trees in which the New Zealand taxa Strigops and Nestor (Psittacidae) were sister to all other psittaciforms and the cockatoo clade (Cacatuidae) was sister to a clade containing all remaining parrots (Psittacidae). Within this large clade of Psittacidae, some traditionally recognized tribes and subfamilies were monophyletic (e.g., Arini, Psittacini, and Loriinae), whereas several others were polyphyletic (e.g., Cyclopsittacini, Platycercini, Psittaculini, and Psittacinae). Ancestral area reconstructions using our Bayesian phylogenetic hypothesis and current distributions of genera supported the hypothesis of an Australasian origin for the Psittaciformes. Separate analyses of the timing of parrot diversification constructed with both Bayesian relaxed-clock and penalized likelihood approaches showed better agreement between geologic and diversification events in the chronograms based on a Cretaceous dating of the basal split within parrots than the chronograms based on a Tertiary dating of this split, although these data are more equivocal. Taken together, our results support a Cretaceous origin of Psittaciformes in Gondwana after the separation of Africa and the India/Madagascar block with subsequent diversification through both vicariance and dispersal. These well-resolved molecular phylogenies will be of value for comparative studies of behavior, ecology, and life history in parrots. PMID:18653733

  16. Locality-preserving sparse representation-based classification in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Gao, Lianru; Yu, Haoyang; Zhang, Bing; Li, Qingting

    2016-10-01

    This paper proposes to combine locality-preserving projections (LPP) and sparse representation (SR) for hyperspectral image classification. The LPP is first used to reduce the dimensionality of all the training and testing data by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold, where the high-dimensional data lies. Then, SR codes the projected testing pixels as sparse linear combinations of all the training samples to classify the testing pixels by evaluating which class leads to the minimum approximation error. The integration of LPP and SR represents an innovative contribution to the literature. The proposed approach, called locality-preserving SR-based classification, addresses the imbalance between high dimensionality of hyperspectral data and the limited number of training samples. Experimental results on three real hyperspectral data sets demonstrate that the proposed approach outperforms the original counterpart, i.e., SR-based classification.
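
    The following Python sketch reconstructs the two ingredients under simplifying assumptions (heat-kernel k-NN affinities for LPP, a Lasso solver for the sparse codes); it is illustrative, not the authors' exact pipeline.

      import numpy as np
      from scipy.linalg import eigh
      from sklearn.linear_model import Lasso
      from sklearn.neighbors import kneighbors_graph

      def lpp(X, dim=10, k=7, heat=1.0):
          """Locality-preserving projection of X (n_samples, n_features)."""
          W = kneighbors_graph(X, k, mode='distance').toarray()
          W = np.exp(-W ** 2 / heat) * (W > 0)       # heat-kernel affinities
          W = np.maximum(W, W.T)                     # symmetrize the graph
          D = np.diag(W.sum(axis=1))
          L = D - W                                  # graph Laplacian
          A, B = X.T @ L @ X, X.T @ D @ X
          B += 1e-6 * np.eye(B.shape[0])             # regularize for stability
          _, vecs = eigh(A, B)                       # generalized eigenproblem
          return vecs[:, :dim]                       # smallest-eigenvalue directions

      def src_predict(P, X_train, y_train, x_test, alpha=0.01):
          """Sparse-representation classification in the projected space."""
          D_, q = X_train @ P, x_test @ P
          code = Lasso(alpha=alpha, max_iter=5000).fit(D_.T, q).coef_
          residuals = {c: np.linalg.norm(q - D_.T @ np.where(y_train == c, code, 0.0))
                       for c in np.unique(y_train)}
          return min(residuals, key=residuals.get)   # class with minimum residual

      # Toy "hyperspectral" data: 3 classes, 200 bands, 30 pixels per class.
      rng = np.random.default_rng(0)
      means = 2.0 * rng.normal(size=(3, 200))
      X = rng.normal(size=(90, 200)) + np.repeat(np.eye(3), 30, axis=0) @ means
      y = np.repeat([0, 1, 2], 30)
      P = lpp(X, dim=10)
      print("predicted class:", src_predict(P, X, y, X[5]), "true:", y[5])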

  17. Dynamics of functional failures and recovery in complex road networks

    NASA Astrophysics Data System (ADS)

    Zhan, Xianyuan; Ukkusuri, Satish V.; Rao, P. Suresh C.

    2017-11-01

    We propose a new framework for modeling the evolution of functional failures and recoveries in complex networks, with traffic congestion on road networks as the case study. Unlike conventional approaches, we transform the evolution of functional states into an equivalent dynamic structural process: dual-vertex splitting and coalescing embedded within the original network structure. The proposed model successfully explains traffic congestion and recovery patterns at the city scale based on high-resolution data from two megacities. Numerical analysis shows that certain network structural attributes can amplify or suppress cascading functional failures. Our approach represents a new general framework to model functional failures and recoveries in flow-based networks and allows understanding of the interplay between structure and function for flow-induced failure propagation and recovery.

  18. Developing Evidence for Football (Soccer) Reminiscence Interventions Within Long-term Care: A Co-operative Approach Applied in Scotland and Spain.

    PubMed

    Coll-Planas, Laura; Watchman, Karen; Doménech, Sara; McGillivray, David; O'Donnell, Hugh; Tolson, Debbie

    2017-04-01

    Loneliness is a common experience within long-term care and, to promote well-being and quality of life among people with dementia, it is important to draw upon a repertoire of strategies that provide social stimulation, companionship, and enjoyment. This paper describes and reflects on a program of co-operative social participatory research that sought to introduce football-focused (ie, soccer-based) reminiscence based in 4 community settings within Spain and Scotland. Findings are reported and inform an original conceptual model that supports the introduction of sustainable approaches to the development of football-focused reminiscence with and for people with dementia. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  19. Hyper sausage neuron: Recognition of transgenic sugar-beet based on terahertz spectroscopy

    NASA Astrophysics Data System (ADS)

    Liu, Jianjun; Li, Zhi; Hu, Fangrong; Chen, Tao; Du, Yong; Xin, Haitao

    2015-01-01

    This paper presents a novel approach for identification of terahertz (THz) spectra of genetically modified organisms (GMOs) based on the Hyper Sausage Neuron (HSN), and THz transmittance spectra of some typical transgenic sugar-beet samples are investigated to demonstrate its feasibility. Principal component analysis (PCA) is applied to extract features of the spectrum data, and instead of the original spectrum data, the feature signals are fed into the HSN pattern recognition, a new multiple weights neural network (MWNN). The experimental result shows that the HSN model not only can correctly classify different types of transgenic sugar-beets, but can also reject non-similar samples of the same type. The proposed approach provides a new effective method for detection and identification of GMOs by using THz spectroscopy.
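
    Since the HSN is not available in standard libraries, the Python sketch below reproduces only the PCA front end, with an off-the-shelf SVM standing in for the HSN stage; the spectra and labels are synthetic placeholders.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      # Synthetic placeholders: rows are THz transmittance spectra, labels are
      # sugar-beet types. An off-the-shelf SVM stands in for the HSN stage.
      rng = np.random.default_rng(0)
      spectra = rng.normal(size=(120, 500))          # 120 samples x 500 spectral bins
      labels = rng.integers(0, 3, size=120)          # 3 hypothetical classes

      model = make_pipeline(PCA(n_components=10), SVC(kernel='rbf'))
      model.fit(spectra[:100], labels[:100])
      print("held-out predictions:", model.predict(spectra[100:110]))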

  20. Identification of Transgenic Organisms Based on Terahertz Spectroscopy and Hyper Sausage Neuron

    NASA Astrophysics Data System (ADS)

    Liu, J.; Li, Zh.; Hu, F.; Chen, T.; Du, Y.; Xin, H.

    2015-03-01

    This paper presents a novel approach for identification of terahertz (THz) spectra of genetically modified organisms (GMOs) based on hyper sausage neuron (HSN), and THz transmittance spectra of some typical transgenic sugar-beet samples are investigated to demonstrate its feasibility. Principal component analysis (PCA) is applied to extract features of the spectrum data, and instead of the original spectrum data, the feature signals are fed into the HSN pattern recognition, a new multiple weights neural network (MWNN). The experimental result shows that the HSN model not only can correctly classify different types of transgenic sugar-beets, but also can reject nonsimilar samples of the same type. The proposed approach provides a new effective method for detection and identification of genetically modified organisms by using THz spectroscopy.

  1. A computational analysis of lower bounds for the economic lot sizing problem in remanufacturing with separate setups

    NASA Astrophysics Data System (ADS)

    Aishah Syed Ali, Sharifah

    2017-09-01

    This paper considers the economic lot sizing problem in remanufacturing with separate setups (ELSRs), where remanufactured and new products are produced on dedicated production lines. Since this problem is NP-hard in general, leading to computationally inefficient, low-quality solutions, we present (a) a multicommodity formulation and (b) a strengthened formulation based on a priori addition of valid inequalities in the space of the original variables, which are then compared with the Wagner-Whitin based formulation available in the literature. Computational experiments on a large number of test data sets are performed to evaluate the different approaches. The numerical results show that our strengthened formulation outperforms all the other tested approaches in terms of linear relaxation bounds. Finally, we conclude with future research directions.
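
    As background on the benchmark formulation, the classic Wagner-Whitin dynamic program for single-item uncapacitated lot sizing is sketched below in Python; the demands and costs are invented, and the coupled remanufacturing line of the ELSRs is beyond this sketch.

      # Classic Wagner-Whitin dynamic program for single-item uncapacitated
      # lot sizing, O(T^2). Demands and costs are invented; the coupled
      # remanufacturing line of the ELSRs is beyond this sketch.
      def wagner_whitin(demand, setup_cost, h):
          T = len(demand)
          best = [0.0] + [float('inf')] * T      # best[t]: min cost for periods < t
          last = [0] * (T + 1)
          for t in range(1, T + 1):
              for s in range(t):                 # last setup in period s covers s..t-1
                  c = best[s] + setup_cost + h * sum((k - s) * demand[k]
                                                     for k in range(s, t))
                  if c < best[t]:
                      best[t], last[t] = c, s
          plan, t = [], T                        # recover the production plan
          while t > 0:
              plan.append((last[t], sum(demand[last[t]:t])))
              t = last[t]
          return best[T], plan[::-1]

      cost, plan = wagner_whitin([20, 50, 10, 50, 50, 10], setup_cost=100, h=1.0)
      print("cost:", cost, "plan (period, lot):", plan)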

  2. Doping porous silicon with erbium: pores filling as a method to limit the Er-clustering effects and increasing its light emission.

    PubMed

    Mula, Guido; Printemps, Tony; Licitra, Christophe; Sogne, Elisa; D'Acapito, Francesco; Gambacorti, Narciso; Sestu, Nicola; Saba, Michele; Pinna, Elisa; Chiriu, Daniele; Ricci, Pier Carlo; Casu, Alberto; Quochi, Francesco; Mura, Andrea; Bongiovanni, Giovanni; Falqui, Andrea

    2017-07-20

    Er clustering plays a major role in hindering sufficient optical gain in Er-doped Si materials. For porous Si, the long-standing failure to govern the clustering has been attributed to insufficient knowledge of the several concomitant and complex processes occurring during electrochemical Er-doping. We propose here an alternative road to solve the issue: instead of looking for an equilibrium between Er content and light emission using 1-2% Er, we propose to significantly increase the electrochemical doping level so as to fill the pores of porous silicon with luminescent Er-rich material. To better understand the intricate and superposed phenomena of this process, we exploit an original approach based on needle electron tomography, EXAFS and photoluminescence. Needle electron tomography surprisingly shows a heterogeneous distribution of Er content in the thin silicon pores that could not previously be revealed by the sole use of scanning electron microscopy compositional mapping. Besides, while showing that pore filling leads to enhanced photoluminescence emission, we demonstrate that the latter originates from both erbium oxide and silicate. These results give a much deeper understanding of the photoluminescence origin down to the nanoscale and could lead to novel approaches focused on noteworthy enhancement of Er-related photoluminescence in porous silicon.

  3. Investigating the timing of origin and evolutionary processes shaping regional species diversity: Insights from simulated data and neotropical butterfly diversification rates.

    PubMed

    Matos-Maraví, Pável

    2016-07-01

    Different diversification scenarios have been proposed to explain the origin of extant biodiversity. However, most existing meta-analyses of time-calibrated phylogenies rely on approaches that do not quantitatively test alternative diversification processes. Here, I highlight the shortcomings of using species divergence ranks, a method widely used in meta-analyses. Divergence ranks consist of assigning cladogenetic events to certain periods of time, typically to either Pleistocene or pre-Pleistocene ages. This approach has been claimed to shed light on the origin of most extant species and on the timing and dynamics of diversification in any biogeographical region. However, interpretations drawn from such a method often confound two fundamental questions in macroevolutionary studies: tempo (timing of evolutionary rate shifts) and mode ("how" and "why" of speciation). By using simulated phylogenies under four diversification scenarios (constant rate, diversity dependence, high extinction, and high speciation rates in the Pleistocene), I showed that interpretations based on species divergence ranks might have been seriously misleading. Future meta-analyses of dated phylogenies need to be aware of the impacts of incomplete taxonomic sampling, tree topology, and divergence time uncertainties, and they may benefit from including quantitative tests of alternative diversification models that acknowledge extinction and diversity dependence. © 2016 The Author(s).

  4. Overcoming the winner's curse: estimating penetrance parameters from case-control data.

    PubMed

    Zollner, Sebastian; Pritchard, Jonathan K

    2007-04-01

    Genomewide association studies are now a widely used approach in the search for loci that affect complex traits. After detection of significant association, estimates of penetrance and allele-frequency parameters for the associated variant indicate the importance of that variant and facilitate the planning of replication studies. However, when these estimates are based on the original data used to detect the variant, the results are affected by an ascertainment bias known as the "winner's curse." The actual genetic effect is typically smaller than its estimate. This overestimation of the genetic effect may cause replication studies to fail because the necessary sample size is underestimated. Here, we present an approach that corrects for the ascertainment bias and generates an estimate of the frequency of a variant and its penetrance parameters. The method produces a point estimate and confidence region for the parameter estimates. We study the performance of this method using simulated data sets and show that it is possible to greatly reduce the bias in the parameter estimates, even when the original association study had low power. The uncertainty of the estimate decreases with increasing sample size, independent of the power of the original test for association. Finally, we show that application of the method to case-control data can improve the design of replication studies considerably.
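
    The conditioning idea can be shown on a one-parameter reduction: a z-statistic reported only when it crosses a significance threshold, with the effect re-estimated by maximizing the truncated-normal likelihood. The Python sketch below illustrates this; the paper's method applies the same logic to the full case-control penetrance model.

      import numpy as np
      from scipy import stats, optimize

      # One-parameter reduction of the ascertainment-bias idea: a z-statistic
      # is only reported when |z| > c, so the likelihood must be conditioned
      # on crossing the threshold.
      def corrected_effect(z_obs, c):
          def neg_cond_loglik(mu):
              p_cross = stats.norm.sf(c - mu) + stats.norm.cdf(-c - mu)
              return -(stats.norm.logpdf(z_obs - mu) - np.log(p_cross))
          return optimize.minimize_scalar(neg_cond_loglik,
                                          bounds=(-10, 10), method='bounded').x

      z_obs = 5.8
      c = stats.norm.ppf(1 - 2.5e-8)        # two-sided genome-wide threshold
      print("naive estimate:", z_obs, "corrected:", round(corrected_effect(z_obs, c), 2))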

  5. Addressing the "Replication Crisis": Using Original Studies to Design Replication Studies with Appropriate Statistical Power.

    PubMed

    Anderson, Samantha F; Maxwell, Scott E

    2017-01-01

    Psychology is undergoing a replication crisis. The discussion surrounding this crisis has centered on mistrust of previous findings. Researchers planning replication studies often use the original study sample effect size as the basis for sample size planning. However, this strategy ignores uncertainty and publication bias in estimated effect sizes, resulting in overly optimistic calculations. A psychologist who intends to obtain power of .80 in the replication study, and performs calculations accordingly, may have an actual power lower than .80. We performed simulations to reveal the magnitude of the difference between actual and intended power based on common sample size planning strategies and assessed the performance of methods that aim to correct for effect size uncertainty and/or bias. Our results imply that even if original studies reflect actual phenomena and were conducted in the absence of questionable research practices, popular approaches to designing replication studies may result in a low success rate, especially if the original study is underpowered. Methods correcting for bias and/or uncertainty generally had higher actual power, but were not a panacea for an underpowered original study. Thus, it becomes imperative that 1) original studies are adequately powered and 2) replication studies are designed with methods that are more likely to yield the intended level of power.
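
    A small Monte Carlo in Python makes the point concrete: replications planned for 80% power from the published (significant) effect estimates achieve much less at the true effect; all values are illustrative, not from the paper.

      import numpy as np
      from scipy import stats

      # Monte Carlo sketch: plan each replication for 80% power from the
      # ORIGINAL study's published (significant) effect estimate, then compute
      # the power actually achieved at the true effect. Values are invented.
      rng = np.random.default_rng(42)
      true_d, n_orig = 0.3, 50                       # true effect, n per group
      z_a, z_b = stats.norm.ppf(0.975), stats.norm.ppf(0.80)

      se_d = np.sqrt(2 / n_orig)                     # rough SE of Cohen's d
      d_obs = rng.normal(true_d, se_d, 100_000)
      d_pub = np.abs(d_obs[np.abs(d_obs) / se_d > z_a])   # only p < .05 gets published

      n_plan = 2 * ((z_a + z_b) / d_pub) ** 2        # per-group n aimed at 80% power
      actual = stats.norm.cdf(true_d * np.sqrt(n_plan / 2) - z_a)
      print("mean actual power:", actual.mean().round(3), "(intended 0.80)")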

  6. Towards a Universal Biology: Is the Origin and Evolution of Life Predictable?

    NASA Technical Reports Server (NTRS)

    Rothschild, Lynn J.

    2017-01-01

    The origin and evolution of life seem an unpredictable oddity, based on the quirks of contingency. Celebrated by the late Stephen Jay Gould in several books, "evolution by contingency" has all the adventure of a thriller, but lacks the predictive power of the physical sciences. Not necessarily so, replied Simon Conway Morris, for convergence reassures us that certain evolutionary responses are replicable. The outcome of this debate is critical to Astrobiology. How can we understand where we came from on Earth without prophecy? Further, we cannot design a rational strategy for the search for life elsewhere - or understand what the future will hold for life on Earth and beyond - without extrapolating from pre-biotic chemistry and evolution. There are several indirect approaches to understanding, and thus describing, what life must be. These include philosophical approaches to defining life (is there even a satisfactory definition of life?), using what we know of physics, chemistry and life to imagine alternate scenarios, using different approaches that life takes as pseudoreplicates (e.g., ribosomal vs non-ribosomal protein synthesis), and experimental approaches to understand the art of the possible. Given that (1) life is a process based on physical components rather than simply an object, (2) life is likely based on organic carbon and needs a solvent for chemistry, most likely water, and (3) looking for convergence in terrestrial evolution reveals certain tendencies, if not quite "laws", we gain real predictive power. Biological history must obey the laws of physics and chemistry, the principles of natural selection, the constraints of an evolutionary past, genetics, and developmental biology. This amalgam creates a surprising amount of predictive power in the broad outline. Critical are the apparent prevalence of organic chemistry and the uniformity across the universe of the laws of chemistry and physics. Instructive is the widespread occurrence of convergent or parallel evolution, which suggests that under certain conditions similar solutions are arrived at independently.

  7. Applications of Vygotsky's Sociocultural Approach for Teachers' Professional Development

    ERIC Educational Resources Information Center

    Shabani, Karim

    2016-01-01

    This paper outlines an approach to teachers' professional development (PD) that originates in Vygotsky's sociocultural theory (SCT), arguing that what Vygotsky claimed about students' learning in the school setting is applicable to the teachers and that the developmental theories of Vygotsky resting on the notions of social origin of mental…

  8. The Supraorbital Keyhole Craniotomy through an Eyebrow Incision: Its Origins and Evolution

    PubMed Central

    Ormond, D. Ryan; Hadjipanayis, Costas G.

    2013-01-01

    In the modern era of neurosurgery, the use of the operative microscope, rigid rod-lens endoscope, and neuronavigation has helped to overcome some of the previous limitations of surgery due to poor lighting and anatomic localization available to the surgeon. Over the last thirty years, the supraorbital craniotomy and subfrontal approach through an eyebrow incision have been developed and refined to play a legitimate role in the armamentarium of the modern skull base neurosurgeon. With careful patient selection, the supraorbital “keyhole” approach offers a less invasive but still efficacious approach to a number of lesions along the subfrontal corridor. Well over 1000 cases have been reported in the literature utilizing this approach establishing its safety and efficacy. This paper discusses the nuances of this approach, including the benefits and limitations of its use described through our technique, review of the literature, and case illustration. PMID:23936644

  9. Comparison of Alternate and Original Items on the Montreal Cognitive Assessment

    PubMed Central

    Lebedeva, Elena; Huang, Mei; Koski, Lisa

    2016-01-01

    Background The Montreal Cognitive Assessment (MoCA) is a screening tool for mild cognitive impairment (MCI) in elderly individuals. We hypothesized that measurement error when using the new alternate MoCA versions to monitor change over time could be related to the use of items that are not of comparable difficulty to their corresponding originals of similar content. The objective of this study was to compare the difficulty of the alternate MoCA items to the original ones. Methods Five selected items from alternate versions of the MoCA were included with items from the original MoCA administered adaptively to geriatric outpatients (N = 78). Rasch analysis was used to estimate the difficulty level of the items. Results None of the five items from the alternate versions matched the difficulty level of their corresponding original items. Conclusions This study demonstrates the potential benefits of a Rasch analysis-based approach for selecting items during the process of development of parallel forms. The results suggest that better match of the items from different MoCA forms by their difficulty would result in higher sensitivity to changes in cognitive function over time. PMID:27076861

  10. An item-response theory approach to safety climate measurement: The Liberty Mutual Safety Climate Short Scales.

    PubMed

    Huang, Yueng-Hsiang; Lee, Jin; Chen, Zhuo; Perry, MacKenna; Cheung, Janelle H; Wang, Mo

    2017-06-01

    Zohar and Luria's (2005) safety climate (SC) scale, measuring organization- and group-level SC each with 16 items, is widely used in research and practice. To improve the utility of the SC scale, we shortened the original full-length SC scales. Item response theory (IRT) analysis was conducted using a sample of 29,179 frontline workers from various industries. Based on graded response models, we shortened the original scales in two ways: (1) selecting items with above-average discriminating ability (i.e. offering more than 6.25% of the original total scale information), resulting in 8-item organization-level and 11-item group-level SC scales; and (2) selecting the most informative items that together retain at least 30% of original scale information, resulting in 4-item organization-level and 4-item group-level SC scales. All four shortened scales had acceptable reliability (≥0.89) and high correlations (≥0.95) with the original scale scores. The shortened scales will be valuable for academic research and practical survey implementation in improving occupational safety. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
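
    The two selection rules are easy to state in code. The Python sketch below applies them to made-up item information values; the paper's values come from graded response models fit to the 16-item scales.

      import numpy as np

      # Hypothetical item information values (the paper derives these from
      # graded response models fit to the 16-item Zohar-Luria scales).
      info = np.array([1.8, 0.9, 2.4, 1.1, 0.7, 2.0, 1.5, 0.8,
                       2.2, 1.0, 0.6, 1.9, 1.3, 0.5, 2.1, 1.2])

      # Rule 1: keep items carrying an above-average share (> 1/16 = 6.25%).
      rule1 = np.where(info / info.sum() > 1 / 16)[0]

      # Rule 2: keep the most informative items until >= 30% is retained.
      order = np.argsort(info)[::-1]
      cum = np.cumsum(info[order]) / info.sum()
      rule2 = order[: np.searchsorted(cum, 0.30) + 1]

      print("rule 1 keeps items:", sorted(rule1.tolist()))
      print("rule 2 keeps items:", sorted(rule2.tolist()),
            "retaining %.1f%% of information" % (100 * info[rule2].sum() / info.sum()))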

  11. Kernel PLS-SVC for Linear and Nonlinear Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Trejo, Leonard J.; Matthews, Bryan

    2003-01-01

    A new methodology for discrimination is proposed, based on kernel orthonormalized partial least squares (PLS) dimensionality reduction of the original data space followed by support vector machines for classification. The close connection of orthonormalized PLS with Fisher's approach to linear discrimination, or equivalently with canonical correlation analysis, is described. This motivates preferring orthonormalized PLS over principal component analysis. Good behavior of the proposed method is demonstrated on 13 different benchmark data sets and on the real-world problem of classifying finger-movement periods versus non-movement periods based on electroencephalogram signals.
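
    scikit-learn does not ship kernel orthonormalized PLS, so the Python sketch below uses plain linear PLS as a stand-in front end to an SVM on synthetic data; it shows only the project-then-classify structure.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.svm import SVC

      # Linear stand-in for kernel orthonormalized PLS + SVC, on synthetic
      # data: project onto a few PLS score directions, then classify.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 30))
      y = (X[:, :3].sum(axis=1) > 0).astype(int)
      X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]

      pls = PLSRegression(n_components=5).fit(X_tr, y_tr)   # supervised reduction
      clf = SVC(kernel='rbf').fit(pls.transform(X_tr), y_tr)
      print("held-out accuracy:", clf.score(pls.transform(X_te), y_te))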

  12. [Classification and organization technologies in public health].

    PubMed

    Filatov, V B; Zhiliaeva, E P; Kal'fa, Iu I

    2000-01-01

    The authors discuss the impact and main characteristics of organization technologies in public health and the processes of their development and evaluation. They offer an original definition of the notion "organization technologies" with approaches to their classification. A system of logical bases is offered, which can be used for classification. These bases include the level of organization maturity and stage of development of organization technology, its destination to a certain level of management, type of influence and concentration of trend, mechanism of effect, functional group, and methods of development.

  13. A neural-network-based exponential H∞ synchronisation for chaotic secure communication via improved genetic algorithm

    NASA Astrophysics Data System (ADS)

    Hsiao, Feng-Hsiag

    2016-10-01

    In this study, a novel approach via improved genetic algorithm (IGA)-based fuzzy observer is proposed to realise exponential optimal H∞ synchronisation and secure communication in multiple time-delay chaotic (MTDC) systems. First, an original message is inserted into the MTDC system. Then, a neural-network (NN) model is employed to approximate the MTDC system. Next, a linear differential inclusion (LDI) state-space representation is established for the dynamics of the NN model. Based on this LDI state-space representation, this study proposes a delay-dependent exponential stability criterion derived in terms of Lyapunov's direct method, thus ensuring that the trajectories of the slave system approach those of the master system. Subsequently, the stability condition of this criterion is reformulated into a linear matrix inequality (LMI). Due to GA's random global optimisation search capabilities, the lower and upper bounds of the search space can be set so that the GA will seek better fuzzy observer feedback gains, accelerating feedback gain-based synchronisation via the LMI-based approach. IGA, which exhibits better performance than traditional GA, is used to synthesise a fuzzy observer to not only realise the exponential synchronisation, but also achieve optimal H∞ performance by minimizing the disturbance attenuation level and recovering the transmitted message. Finally, a numerical example with simulations is given in order to demonstrate the effectiveness of our approach.

  14. The need for comprehensive vulnerability approaches to mirror the multiplicity in mountain hazard risk

    NASA Astrophysics Data System (ADS)

    Keiler, Margreth; Fuchs, Sven

    2014-05-01

    The concept of vulnerability rests on multiple disciplinary theories underpinning either a technical or a social origin of the concept and resulting in a range of paradigms for vulnerability quantification. Taking a natural scientific approach, we argue that a large number of studies have focused either on damage-loss functions for individual mountain hazards or on semi-quantitative indicator-based approaches for multiple hazards (hazard chains). However, efforts to reduce susceptibility to hazards and to create disaster-resilient communities require intersections among these approaches, as well as among theories originating in natural and social sciences, since human activity cannot be seen independently from the environmental setting. Acknowledging different roots of disciplinary paradigms in risk management, issues determining structural, economic, institutional and social vulnerability have to be more comprehensively addressed in the future with respect to mountain hazards in Europe and beyond. It is argued that structural vulnerability as originator results in considerable economic vulnerability, generated by the institutional settings of dealing with natural hazards and shaped by the overall societal framework. If vulnerability and its counterpart, resilience, are analysed and evaluated by using such a comprehensive approach, a better understanding of the vulnerability-influencing parameters could be achieved, taking into account the interdependencies and interactions between the disciplinary foci. As a result, three key issues should be addressed in future research: (1) Vulnerability requires a new perspective on the relationship between society and environment: not as a duality, but more as a mutually constitutive relationship (including methods for assessment). (2) There is a need for concepts of vulnerability that emphasise the dynamics of temporal and spatial scales, particularly with respect to Global Change processes in mountain regions. (3) Loss and damage is part of a process in which interactions of climate change with societal processes shape and transform human societies. They are part of the human-environment interaction that needs assessment and adaptation.

  15. A qualitative study of the determinants of dieting and non-dieting approaches in overweight/obese Australian adults

    PubMed Central

    2012-01-01

    Background Dieting has historically been the main behavioural treatment paradigm for overweight/obesity, although a non-dieting paradigm has more recently emerged based on the criticisms of the original dieting approach. There is a dearth of research contrasting why these approaches are adopted. To address this, we conducted a qualitative investigation into the determinants of dieting and non-dieting approaches based on the perspectives and experiences of overweight/obese Australian adults. Methods Grounded theory was used inductively to generate a model of themes contrasting the determinants of dieting and non-dieting approaches based on the perspectives of 21 overweight/obese adults. Data was collected using semi-structured interviews to elicit in-depth individual experiences and perspectives. Results Several categories emerged which distinguished between the adoption of a dieting or non-dieting approach. These categories included the focus of each approach (weight/image or lifestyle/health behaviours); internal or external attributions about dieting failure; attitudes towards established diets, and personal autonomy. Personal autonomy was also influenced by another category; the perceived knowledge and self-efficacy about each approach, with adults more likely to choose an approach they knew more about and were confident in implementing. The time perspective of change (short or long-term) and the perceived identity of the person (fat/dieter or healthy person) also emerged as determinants of dieting or non-dieting approaches respectively. Conclusions The model of determinants elicited from this study assists in understanding why dieting and non-dieting approaches are adopted, from the perspectives and experiences of overweight/obese adults. Understanding this decision-making process can assist clinicians and public health researchers to design and tailor dieting and non-dieting interventions to population subgroups that have preferences and characteristics suitable for each approach. PMID:23249115

  16. A Computationally Efficient Method for Polyphonic Pitch Estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Ruohua; Reiss, Joshua D.; Mattavelli, Marco; Zoia, Giorgio

    2009-12-01

    This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI) as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimation is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. This spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Then the incorrect estimations are removed according to spectral irregularity and knowledge of the harmonic structures of notes played on commonly used musical instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and the results demonstrate the high performance and computational efficiency of the approach.
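
    The first stage can be imitated without the RTFI: the Python sketch below builds a pitch energy spectrum by harmonic grouping over an FFT and peak-picks it. Spurious sub-harmonic candidates in its output are exactly what the paper's second stage is designed to remove.

      import numpy as np
      from scipy.signal import find_peaks

      # FFT stand-in for the RTFI front end: sum energy at integer harmonics
      # of each candidate f0 to form a pitch energy spectrum, then peak-pick.
      fs = 16000
      t = np.arange(0, 0.5, 1 / fs)
      x = np.sin(2 * np.pi * 220 * t) + 0.7 * np.sin(2 * np.pi * 277.2 * t)
      for h in range(2, 5):                          # add a few harmonics
          x += (0.3 / h) * (np.sin(2 * np.pi * 220 * h * t)
                            + np.sin(2 * np.pi * 277.2 * h * t))

      spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
      freqs = np.fft.rfftfreq(len(x), 1 / fs)

      f0_grid = np.arange(80.0, 1000.0, 1.0)
      pitch_energy = np.array([sum(np.interp(h * f0, freqs, spec)
                                   for h in range(1, 6)) for f0 in f0_grid])

      peaks, _ = find_peaks(pitch_energy, height=0.3 * pitch_energy.max())
      # Output typically includes sub-harmonic candidates (e.g. near 110 Hz);
      # the paper's second stage removes such spurious estimates.
      print("preliminary pitch candidates (Hz):", f0_grid[peaks])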

  17. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.
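
    A toy Python version of the construction follows, with a discrete candidate set standing in for continuous trajectories and two invented feature functions; the weights are fit by matching model feature expectations to observed means, which is the defining property of the maximum entropy solution.

      import numpy as np

      # Toy maximum-entropy model over a DISCRETE candidate set standing in
      # for continuous aircraft tracks. The two feature functions and the
      # observed feature means are invented for illustration.
      rng = np.random.default_rng(0)
      cands = rng.normal(size=(500, 10))     # 500 candidate trajectories, 10 steps

      def features(traj):
          # hypothetical features: mean step change and mean level of the track
          return np.array([np.abs(np.diff(traj)).mean(), traj.mean()])

      F = np.apply_along_axis(features, 1, cands)    # (500, 2) feature matrix
      f_obs = np.array([0.9, 0.2])                   # "observed" feature means

      w = np.zeros(2)
      for _ in range(500):                           # gradient ascent on log-likelihood
          p = np.exp(F @ w); p /= p.sum()            # maxent distribution over candidates
          w += 0.5 * (f_obs - p @ F)                 # gradient: observed - expected features

      p = np.exp(F @ w); p /= p.sum()
      sample = cands[rng.choice(len(cands), p=p)]    # draw one trajectory from the ensemble
      print("fitted weights:", w.round(3), "model feature means:", (p @ F).round(3))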

  18. Dimensions of family therapy.

    PubMed

    Madanes, C; Haley, J

    1977-08-01

    This article is a description of different approaches to therapy with a family orientation. There are general categories of family therapy which had their origins in individual therapy, such as the approaches based upon psychodynamic theory, those derived from experiential procedures, and the behavioral approaches. There are also family therapies which have not developed from individual therapy, such as the extended family system approach and the communication school of family therapy. The different therapy approaches are described within a set of dimensions which characterize most therapy. Such dimensions include whether the past or present is emphasized, whether the therapist uses interpretation or directives, whether the approach is in terms of growth or specific problems, whether hierarchy is a concern, and whether the unit is an individual, two people, three people, or a wider network. Illustrations of the different family therapy approaches are given in terms of the kinds of information that would interest the therapist of each school and the kinds of actions he or she would take to bring about change.

  19. Identifying significant gene‐environment interactions using a combination of screening testing and hierarchical false discovery rate control

    PubMed Central

    Shen, Li; Saykin, Andrew J.; Williams, Scott M.; Moore, Jason H.

    2016-01-01

    ABSTRACT Although gene‐environment (G×E) interactions play an important role in many biological systems, detecting these interactions within genome‐wide data can be challenging due to the loss in statistical power incurred by multiple hypothesis correction. To address the challenge of poor power and the limitations of existing multistage methods, we recently developed a screening‐testing approach for G×E interaction detection that combines elastic net penalized regression with joint estimation to support a single omnibus test for the presence of G×E interactions. In our original work on this technique, however, we did not assess type I error control or power and evaluated the method using just a single, small bladder cancer data set. In this paper, we extend the original method in two important directions and provide a more rigorous performance evaluation. First, we introduce a hierarchical false discovery rate approach to formally assess the significance of individual G×E interactions. Second, to support the analysis of truly genome‐wide data sets, we incorporate a score statistic‐based prescreening step to reduce the number of single nucleotide polymorphisms prior to fitting the first stage penalized regression model. To assess the statistical properties of our method, we compare the type I error rate and statistical power of our approach with competing techniques using both simple simulation designs as well as designs based on real disease architectures. Finally, we demonstrate the ability of our approach to identify biologically plausible SNP‐education interactions relative to Alzheimer's disease status using genome‐wide association study data from the Alzheimer's Disease Neuroimaging Initiative (ADNI). PMID:27578615

  20. To Control False Positives in Gene-Gene Interaction Analysis: Two Novel Conditional Entropy-Based Approaches

    PubMed Central

    Lin, Meihua; Li, Haoli; Zhao, Xiaolei; Qin, Jiheng

    2013-01-01

    Genome-wide analysis of gene-gene interactions has been recognized as a powerful avenue to identify the missing genetic components that cannot be detected by current single-point association analysis. Recently, several model-free methods (e.g. the commonly used information-based metrics and several logistic regression-based metrics) were developed for detecting non-linear dependence between genetic loci, but they are potentially at risk of inflated false positive error, in particular when the main effects at one or both loci are salient. In this study, we proposed two conditional entropy-based metrics to address this limitation. Extensive simulations demonstrated that the two proposed metrics, provided the disease is rare, could maintain a consistently correct false positive rate. In the scenarios for a common disease, our proposed metrics achieved better or comparable control of false positive error, compared to four previously proposed model-free metrics. In terms of power, our methods outperformed several competing metrics in a range of common disease models. Furthermore, in real data analyses, both metrics succeeded in detecting interactions and were competitive with the originally reported results or the logistic regression approaches. In conclusion, the proposed conditional entropy-based metrics are promising alternatives to current model-based approaches for detecting genuine epistatic effects. PMID:24339984
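
    The generic information-theoretic idea (though not the paper's exact corrected metrics) can be sketched in a few lines of Python: compare H(Y|G1,G2) against the single-locus conditional entropies on simulated genotypes with a purely epistatic effect.

      import numpy as np

      # Generic conditional-entropy screen (not the paper's exact corrected
      # metrics): a purely epistatic signal lowers H(Y|G1,G2) far below the
      # single-locus conditional entropies.
      rng = np.random.default_rng(0)
      n = 2000
      g1, g2 = rng.integers(0, 3, n), rng.integers(0, 3, n)    # genotype codes 0/1/2
      y = ((g1 == 2) & (g2 == 2) & (rng.random(n) < 0.8)).astype(int)

      def entropy(counts):
          p = counts / counts.sum()
          p = p[p > 0]
          return -(p * np.log2(p)).sum()

      def cond_entropy(y, g):
          """H(y | g) for a 1-D array of discrete codes g."""
          return sum((g == v).mean() * entropy(np.bincount(y[g == v]))
                     for v in np.unique(g))

      print("H(Y|G1)    =", round(cond_entropy(y, g1), 4))
      print("H(Y|G2)    =", round(cond_entropy(y, g2), 4))
      print("H(Y|G1,G2) =", round(cond_entropy(y, g1 * 3 + g2), 4))   # joint coding 0..8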

  1. An Ensemble Approach for Drug Side Effect Prediction

    PubMed Central

    Jahid, Md Jamiul; Ruan, Jianhua

    2014-01-01

    In silico prediction of drug side-effects in the early stages of drug development is becoming increasingly popular nowadays, as it not only shortens drug design time but also reduces drug development costs. In this article we propose an ensemble approach to predict the side-effects of drug molecules based on their chemical structure. Our idea originates from the observation that similar drugs have similar side-effects. Based on this observation we design an ensemble approach that combines the results from different classification models, where each model is generated by a different set of similar drugs. We applied our approach to 1385 side-effects in the SIDER database for 888 drugs. Results show that our approach outperformed previously published approaches and standard classifiers. Furthermore, we applied our method to a number of uncharacterized drug molecules in the DrugBank database and predicted their side-effect profiles for future use. Results from various sources confirm that our method is able to predict the side-effects of uncharacterized drugs and, more importantly, to predict rare side-effects which are often ignored by other approaches. The method described in this article can be useful for predicting side-effects at an early stage of drug design, reducing experimental cost and time. PMID:25327524
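
    The core observation -- similar drugs have similar side-effects -- can be sketched as similarity-weighted voting over fingerprint neighbours, as below. This is an illustration of the underlying idea only, not the authors' ensemble of per-drug classification models; fingerprints are assumed to be precomputed binary vectors.

      import numpy as np

      def tanimoto(f1, f2):
          """Tanimoto similarity between two binary fingerprint vectors."""
          union = np.sum(f1 | f2)
          return np.sum(f1 & f2) / union if union else 0.0

      def predict_side_effects(query_fp, train_fps, train_labels, k=10):
          """train_labels: (n_drugs, n_side_effects) binary matrix."""
          sims = np.array([tanimoto(query_fp, fp) for fp in train_fps])
          top = np.argsort(sims)[-k:]                 # k most similar drugs
          w = sims[top][:, None]
          return (w * train_labels[top]).sum(axis=0) / max(w.sum(), 1e-9)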

  2. 3-D ultrasound volume reconstruction using the direct frame interpolation method.

    PubMed

    Scheipers, Ulrich; Koptenko, Sergei; Remlinger, Rachel; Falco, Tony; Lachaine, Martin

    2010-11-01

    A new method for 3-D ultrasound volume reconstruction using tracked freehand 3-D ultrasound is proposed. The method is based on solving the forward volume reconstruction problem using direct interpolation of high-resolution ultrasound B-mode image frames. A series of ultrasound B-mode image frames (an image series) is acquired using the freehand scanning technique and position sensing via optical tracking equipment. The proposed algorithm creates additional intermediate image frames by directly interpolating between two or more adjacent image frames of the original image series. The target volume is filled using the original frames in combination with the additionally constructed frames. Compared with conventional volume reconstruction methods, no additional filling of empty voxels or holes within the volume is required, because the whole extent of the volume is defined by the arrangement of the original and the additionally constructed B-mode image frames. The proposed direct frame interpolation (DFI) method was tested on two different data sets acquired while scanning the head and neck region of different patients. The first data set consisted of eight B-mode 2-D frame sets acquired under optimal laboratory conditions. The second data set consisted of 73 image series acquired during a clinical study. Sample volumes were reconstructed for all 81 image series using the proposed DFI method with four different interpolation orders, as well as with the pixel nearest-neighbor method using three different interpolation neighborhoods. In addition, volumes based on a reduced number of image frames were reconstructed for comparison of the different methods' accuracy and robustness in reconstructing image data that lies between the original image frames. The DFI method is based on a forward approach making use of a priori information about the position and shape of the B-mode image frames (e.g., masking information) to optimize the reconstruction procedure and to reduce computation times and memory requirements. The method is straightforward, independent of additional input or parameters, and uses the high-resolution B-mode image frames instead of usually lower-resolution voxel information for interpolation. The DFI method can be considered as a valuable alternative to conventional 3-D ultrasound reconstruction methods based on pixel or voxel nearest-neighbor approaches, offering better quality and competitive reconstruction time.
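
    The central interpolation step can be reduced to a few lines, as in the toy sketch below: intermediate frames are synthesized by blending the pixel intensities of two adjacent tracked frames while interpolating their positions. The actual method handles full tracked poses and frame masking; a translation-only pose and linear blending are simplifying assumptions here.

      import numpy as np

      def interpolated_frames(img0, img1, pos0, pos1, n_mid):
          """Yield (frame, position) pairs: the two originals plus n_mid in-betweens."""
          for t in np.linspace(0.0, 1.0, n_mid + 2):
              yield (1 - t) * img0 + t * img1, (1 - t) * pos0 + t * pos1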

  3. Less noise, more hacking: how to deploy principles from MIT's hacking medicine to accelerate health care.

    PubMed

    DePasse, Jacqueline W; Carroll, Ryan; Ippolito, Andrea; Yost, Allison; Santorino, Data; Chu, Zen; Olson, Kristian R

    2014-07-01

    Medical technology offers enormous potential for scalable medicine--improving the quality of and access to health care while simultaneously reducing cost. However, current medical device innovation within companies often only offers incremental advances on existing products, or originates from engineers with limited knowledge of the clinical complexities. We describe how the Hacking Medicine Initiative, based at the Massachusetts Institute of Technology, has developed an innovative "healthcare hackathon" approach, bringing diverse teams together to rapidly validate clinical needs and develop solutions. Hackathons are based on three core principles: emphasis on a problem-based approach, cross-pollination of disciplines, and "pivoting" on, or rapidly iterating on, ideas. Hackathons also offer enormous potential for innovation in global health by focusing on local needs and resources as well as addressing feasibility and cultural contextualization. Although relatively new, the success of this approach is clear, as evidenced by the development of successful startup companies, pioneering product design, and the incorporation of creative people from outside traditional life science backgrounds who are working with clinicians and other scientists to create transformative innovation in health care.

  4. What Feynman Could Not yet Use: The Generalised Hong-Ou-Mandel Experiment to Improve the QED Explanation of the Pauli Exclusion Principle

    ERIC Educational Resources Information Center

    Malgieri, Massimiliano; Tenni, Antonio; Onorato, Pasquale; De Ambrosis, Anna

    2016-01-01

    In this paper we present a reasoning line for introducing the Pauli exclusion principle in the context of an introductory course on quantum theory based on the sum over paths approach. We start from the argument originally introduced by Feynman in "QED: The Strange Theory of Light and Matter" and improve it by discussing with students…

  5. The Heritage Park model: A partnership approach to park expansion in poor rural areas

    Treesearch

    Charles Ndabeni; Maretha Shroyer; Willie Boonzaaier; Gabriel Mokgoko; Sam Mochine

    2007-01-01

    The initiative to create a conservation corridor-the Heritage Park-linking the existing 62,000 ha (153,205 acre) Madikwe Game Reserve with the 49,000 ha (121,082 acre) Pilanesberg National Park, to form a 275,000 ha (679,540 acre) nature-based tourism anchor project and primary economic catalyst for a poor rural region, originated in 1999. An innovative park expansion...

  6. Old Wine in New Bottles: A Response to Claims That Teaching Games for Understanding Was Not Developed as A Theoretically Based Pedagogical Framework

    ERIC Educational Resources Information Center

    Harvey, Stephen; Pill, Shane; Almond, Len

    2018-01-01

    Background: Teaching games for understanding (TGfU) has stimulated so much attention, research and debate since the 1980s that it is easy for its origins to become refracted and misunderstood. For example, in a recent edition of the "Physical Education and Sport Pedagogy" journal there was a paper arguing a constraints-led approach (CLA)…

  7. In Search for the Open Educator: Proposal of a Definition and a Framework to Increase Openness Adoption among University Educators

    ERIC Educational Resources Information Center

    Nascimbeni, Fabio; Burgos, Daniel

    2016-01-01

    The paper explores the change process that university teachers need to go through in order to become fluent with Open Education approaches. Based on a literature review and a set of interviews with a number of leading experts in the field of Open Educational Resources and Open Education, the paper puts forward an original definition of Open…

  8. LEOPACK The integrated services communications system based on LEO satellites

    NASA Astrophysics Data System (ADS)

    Negoda, A.; Bunin, S.; Bushuev, E.; Dranovsky, V.

    LEOPACK is yet another LEO satellite project which provides global integrated services for 'business' communications. It utilizes packet rather than circuit switching in both terrestrial and satellite chains, as well as a cellular approach to frequency use. Original multiple access protocols and decentralized network control make it possible to organize regionally or logically independent and worldwide networks. A relatively small number of satellites (28) provides virtually global network coverage.

  9. Astronomy and Disabled: Implementation of new technologies to communicate science to new audiences

    NASA Astrophysics Data System (ADS)

    García, Beatriz; Ortiz Gil, Amelia; Proust, Dominique

    2015-08-01

    Commission 46 proposed in 2012 the creation of an interdisciplinary WG in which astronomers work together with technicians, educators and disability specialists to develop new teaching and learning strategies devoted to generating high-impact resources for disabled populations, which are usually distant from astronomy. Successful initiatives designed to research best practices in using new technologies to communicate science to these special audiences include the creation of models and applications, and the implementation of a database of didactic approaches and tools. Among the achievements of this proposal are original developments in: design of electronics, design of original software, scripts and music for planetarium shows, design of models and their associated explanatory scripts, printed material in Braille and 3D, filming associated with sign language, interviews and document compilation, and the recent project on the Sign Language Universal Encyclopedic Dictionary, based on the proposal by Proust (2009), which promotes the dissemination of a unified sign language for the deaf worldwide, associated with astronomical terms. We present, on behalf of the WG, some of the achievements, developments and success stories of recent applications of this new approach to science for all, aimed at the new "public of science," along with new challenges.

  10. Importance of public participation in decision-making process in healthcare system illustrated with an example of the development of American and Polish scope of health benefit basket.

    PubMed

    Kolasa, Katarzyna; Hermanowski, Tomasz; Borek, Ewa

    2013-01-01

    The process of the development of a health benefit basket may serve as a good example of a decision-making process in the healthcare system which is based on public participation. A comparative analysis of the development and implementation of the health benefit basket in Poland and the USA was performed. On the basis of a literature review, the following questions were studied: What is the origin of health benefit basket development in the USA and Poland? What was the role of public opinion in determining the range of the health benefit basket in both countries? What criteria were employed to determine the range of the health benefit basket in both countries? What conclusions can be drawn for Poland from the USA experience of determining the range of the health benefit basket? Irrespective of the similarities in the origin of health benefit basket development, the two countries approached this issue differently. In the USA, an approach based on social dialogue and the patient's perspective was selected, while in Poland the perspective of the public payer predominated. Transparency of principles and social dialogue constitute the fundamental elements of an effective process of health benefit basket development and implementation, a modification that is both required and generally unpopular.

  11. Improved resistivity imaging of groundwater solute plumes using POD-based inversion

    NASA Astrophysics Data System (ADS)

    Oware, E. K.; Moysey, S. M.; Khan, T.

    2012-12-01

    We propose a new approach for enforcing physics-based regularization in electrical resistivity imaging (ERI) problems. The approach utilizes a basis-constrained inversion where an optimal set of basis vectors is extracted from training data by Proper Orthogonal Decomposition (POD). The key aspect of the approach is that Monte Carlo simulation of flow and transport is used to generate a training dataset, thereby intrinsically capturing the physics of the underlying flow and transport models in a non-parametric form. POD allows for these training data to be projected onto a subspace of the original domain, resulting in the extraction of a basis for the inversion that captures characteristics of the groundwater flow and transport system, while simultaneously allowing for dimensionality reduction of the original problem in the projected space. We use two different synthetic transport scenarios in heterogeneous media to illustrate how the POD-based inversion compares with standard Tikhonov and coupled inversion. The first scenario had a single source zone leading to a unimodal solute plume (synthetic #1), whereas the second scenario had two source zones that produced a bimodal plume (synthetic #2). For both coupled inversion and the POD approach, the conceptual flow and transport model used considered only a single source zone for both scenarios. Results were compared based on multiple metrics (concentration root-mean-square error (RMSE), peak concentration, and total solute mass). In addition, results for POD inversion based on three different data densities (120, 300, and 560 data points) and varying numbers of selected basis images (100, 300, and 500) were compared. For synthetic #1, we found that all three methods provided qualitatively reasonable reproduction of the true plume. Quantitatively, the POD inversion performed best overall for each metric considered. Moreover, since synthetic #1 was consistent with the conceptual transport model, a small number of basis vectors (100) contained enough a priori information to constrain the inversion. Increasing the amount of data or number of selected basis images did not translate into significant improvement in imaging results. For synthetic #2, the RMSE and error in total mass were lowest for the POD inversion. However, the peak concentration was significantly overestimated by the POD approach. Regardless, the POD-based inversion was the only technique that could capture the bimodality of the plume in the reconstructed image, thus providing critical information that could be used to reconceptualize the transport problem. We also found that, in the case of synthetic #2, increasing the number of resistivity measurements and the number of selected basis vectors allowed for significant improvements in the reconstructed images.
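
    The POD step itself is compact, as the sketch below shows: Monte Carlo transport snapshots are stacked as columns, the SVD is taken, and the leading left singular vectors become the inversion basis, so any candidate model is written as the mean plus the basis times a short coefficient vector. This is a generic POD construction consistent with the abstract, not the authors' code.

      import numpy as np

      def pod_basis(snapshots, n_basis):
          """snapshots: (n_cells, n_train) matrix of simulated plumes."""
          mean = snapshots.mean(axis=1, keepdims=True)
          U, _, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
          return mean.ravel(), U[:, :n_basis]         # model ~= mean + U @ c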

  12. Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact

    NASA Astrophysics Data System (ADS)

    Abadjiev, Valentin; Kawasaki, Haruhisa

    2014-09-01

    Computer-aided design has advanced through the development of different types of software for scientific research in the field of gearing theory, as well as through adequate scientific support of gear drive manufacture. Computer programs based on mathematical models resulting from this research are presented here. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears - Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill). The developed basic computer products are software based on original mathematical models. They rely on two mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs are worked out on the basis of the described mathematical models, and the relations between them are shown. The application of these approaches to the synthesis of the discussed gear drives is illustrated.

  13. Assigning site of origin in metastatic neuroendocrine neoplasms: a clinically significant application of diagnostic immunohistochemistry.

    PubMed

    Bellizzi, Andrew M

    2013-09-01

    The neuroendocrine epithelial neoplasms (NENs) include well-differentiated neuroendocrine tumors (WDNETs) and poorly differentiated neuroendocrine carcinomas (PDNECs). Whereas PDNECs are highly lethal, with localized Merkel cell carcinoma somewhat of an exception, WDNETs exhibit a range of "indolent" biologic potentials-from benign to widely metastatic and eventually fatal. Within each of these 2 groups there is substantial morphologic overlap. In the metastatic setting, the site of origin of a WDNET has significant prognostic and therapeutic implications. In the skin, Merkel cell carcinoma must be distinguished from spread of a visceral PDNEC. This review intends to prove the thesis that determining the site of origin of a NEN is clinically vital and that diagnostic immunohistochemistry is well suited to the task. It will begin by reviewing current World Health Organization terminology for the NENs, as well as an embryologic and histologic pattern-based classification. It will present population-based data on the relative frequency and biology of WDNETs arising at various anatomic sites, including the frequency of metastases of unknown primary, and comment on limitations of contemporary imaging techniques, as a means of defining the scope of the problem. It will go on to discuss the therapeutic significance of site of origin. The heart of this review is a synthesis of data compiled from >100 manuscripts on the expression of individual markers in WDNETs and PDNECs, as regards site of origin. These include proteins that are considered "key markers" and others that are either useful "secondary markers," potentially very useful markers that need to be further vetted, or ones that are widely applied despite a lack of efficacy. It will conclude with my approach to the metastatic NEN of unknown origin.

  14. Constraint-based integration of planning and scheduling for space-based observatory management

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Smith, Steven F.

    1994-01-01

    Progress toward the development of effective, practical solutions to space-based observatory scheduling problems within the HSTS scheduling framework is reported. HSTS was developed and originally applied in the context of the Hubble Space Telescope (HST) short-term observation scheduling problem. The work was motivated by the limitations of the current solution and, more generally, by the insufficiency of classical planning and scheduling approaches in this problem context. HSTS has subsequently been used to develop improved heuristic solution techniques in related scheduling domains and is currently being applied to develop a scheduling tool for the upcoming Submillimeter Wave Astronomy Satellite (SWAS) mission. The salient architectural characteristics of HSTS and their relationship to previous scheduling and AI planning research are summarized. Then, some key problem decomposition techniques underlying the integrated planning and scheduling approach to the HST problem are described; research results indicate that these techniques provide leverage in solving space-based observatory scheduling problems. Finally, more recently developed constraint-posting scheduling procedures and the current SWAS application focus are summarized.

  15. How Does the Context of Reception Matter? The Role of Residential Enclaves in Maternal Smoking During Pregnancy Among Mexican-Origin Mothers.

    PubMed

    Noah, Aggie J; Landale, Nancy S; Sparks, Corey S

    2015-08-01

    This study investigated whether and how different patterns of group exposure within residential contexts (i.e., living in a Mexican immigrant enclave, a Mexican ethnic enclave, a pan-Hispanic enclave, or a non-Hispanic white neighborhood) are associated with smoking during pregnancy among Mexican-origin mothers. Using a hierarchical linear modeling approach, we found that Mexican-origin mothers' residential contexts are important for understanding their smoking during pregnancy. Residence in an ethnic enclave is associated with decreased odds of smoking during pregnancy, while residence in a non-Hispanic white neighborhood is associated with increased odds of smoking during pregnancy, above and beyond the mothers' individual characteristics. The magnitude of the associations between residence in an ethnic enclave and smoking during pregnancy is similar across the different types of ethnic enclaves examined. The important roles of inter- and intra-group exposures suggest that, in order to help Mexican-origin women, policy makers should more carefully design place-based programs and interventions that target geographic areas and the specific types of residential contexts in which women are at greater risk.

  16. Evaluation of tomotherapy MVCT image enhancement program for tumor volume delineation

    PubMed Central

    Martin, Spencer; Rodrigues, George; Chen, Quan; Pavamani, Simon; Read, Nancy; Ahmad, Belal; Hammond, J. Alex; Venkatesan, Varagur; Renaud, James

    2011-01-01

    The aims of this study were to investigate the variability between physicians in delineation of head and neck tumors on original tomotherapy megavoltage CT (MVCT) studies and corresponding software enhanced MVCT images, and to establish an optimal approach for evaluation of image improvement. Five physicians contoured the gross tumor volume (GTV) for three head and neck cancer patients on 34 original and enhanced MVCT studies. Variation between original and enhanced MVCT studies was quantified by DICE coefficient and the coefficient of variance. Based on volume of agreement between physicians, higher correlation in terms of average DICE coefficients was observed in GTV delineation for enhanced MVCT for patients 1, 2, and 3 by 15%, 3%, and 7%, respectively, while delineation variance among physicians was reduced using enhanced MVCT for 12 of 17 weekly image studies. Enhanced MVCT provides advantages in reduction of variance among physicians in delineation of the GTV. Agreement on contouring by the same physician on both original and enhanced MVCT was equally high. PACS numbers: 87.57.N‐, 87.57.np, 87.57.nt
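
    For reference, the agreement metric used above is the Dice coefficient between two binary contour masks; a value of 1 indicates identical delineations.

      import numpy as np

      def dice(mask_a, mask_b):
          """Dice overlap of two boolean masks (1 = voxel inside the GTV)."""
          inter = np.logical_and(mask_a, mask_b).sum()
          return 2.0 * inter / (mask_a.sum() + mask_b.sum())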

  17. Functional genomics platform for pooled screening and mammalian genetic interaction maps

    PubMed Central

    Kampmann, Martin; Bassik, Michael C.; Weissman, Jonathan S.

    2014-01-01

    Systematic genetic interaction maps in microorganisms are powerful tools for identifying functional relationships between genes and defining the function of uncharacterized genes. We have recently implemented this strategy in mammalian cells as a two-stage approach. First, genes of interest are robustly identified in a pooled genome-wide screen using complex shRNA libraries. Second, phenotypes for all pairwise combinations of hit genes are measured in a double-shRNA screen and used to construct a genetic interaction map. Our protocol allows for rapid pooled screening under various conditions without a requirement for robotics, in contrast to arrayed approaches. Each stage of the protocol can be implemented in ~2 weeks, with additional time for analysis and generation of reagents. We discuss considerations for screen design, and present complete experimental procedures as well as a full computational analysis suite for identification of hits in pooled screens and generation of genetic interaction maps. While the protocols outlined here were developed for our original shRNA-based approach, they can be applied more generally, including to CRISPR-based approaches. PMID:24992097

  18. A Dereplication and Bioguided Discovery Approach to Reveal New Compounds from a Marine-Derived Fungus Stilbella fimetaria

    PubMed Central

    Kildgaard, Sara; Subko, Karolina; Phillips, Emma; Goidts, Violaine; de la Cruz, Mercedes; Díaz, Caridad; Gotfredsen, Charlotte H.; Frisvad, Jens C.; Nielsen, Kristian F.; Larsen, Thomas O.

    2017-01-01

    A marine-derived Stilbella fimetaria fungal strain was screened for new bioactive compounds based on two different approaches: (i) a bio-guided approach using cytotoxicity and antimicrobial bioassays; and (ii) a dereplication-based approach using liquid chromatography with both diode array detection and high resolution mass spectrometry. This led to the discovery of several bioactive compound families with different biosynthetic origins, including pimarane-type diterpenoids and hybrid polyketide-nonribosomal peptide-derived compounds. Prefractionation before bioassay screening proved to be a great aid in the dereplication process, since separate fractions displaying different bioactivities allowed a quick tentative identification of known antimicrobial compounds and of potential new analogues. A new pimarane-type diterpene, myrocin F, was discovered in trace amounts and displayed cytotoxicity towards various cancer cell lines. Further media optimization led to increased production followed by the purification and bioactivity screening of several new and known pimarane-type diterpenoids. A known broad-spectrum antifungal compound, ilicicolin H, was purified along with two new analogues, hydroxyl-ilicicolin H and ilicicolin I, and their antifungal activity was evaluated. PMID:28805711

  19. A modelling approach to assessing the timescale uncertainties in proxy series with chronological errors

    NASA Astrophysics Data System (ADS)

    Divine, D. V.; Godtliebsen, F.; Rue, H.

    2012-01-01

    The paper proposes an approach to assessment of timescale errors in proxy-based series with chronological uncertainties. The method relies on approximation of the physical process(es) forming a proxy archive by a random Gamma process. Parameters of the process are partly data-driven and partly determined from prior assumptions. For a particular case of a linear accumulation model and absolutely dated tie points an analytical solution is found suggesting the Beta-distributed probability density on age estimates along the length of a proxy archive. In a general situation of uncertainties in the ages of the tie points the proposed method employs MCMC simulations of age-depth profiles yielding empirical confidence intervals on the constructed piecewise linear best guess timescale. It is suggested that the approach can be further extended to a more general case of a time-varying expected accumulation between the tie points. The approach is illustrated by using two ice and two lake/marine sediment cores representing the typical examples of paleoproxy archives with age models based on tie points of mixed origin.
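
    A schematic Monte Carlo rendering of the idea is sketched below: accumulation between two absolutely dated tie points is drawn as random Gamma increments, each simulated path is rescaled so that it honours the tie-point ages, and quantiles across paths give empirical confidence bands on the timescale. The shape parameter and the assumption of fixed tie-point ages are illustrative simplifications.

      import numpy as np

      def simulate_age_paths(n_depth, t0, t1, n_sim=1000, shape=2.0, rng=None):
          """Return the 2.5/50/97.5 percentile age at each of n_depth depths."""
          rng = rng or np.random.default_rng()
          inc = rng.gamma(shape, size=(n_sim, n_depth))   # random accumulation
          cum = np.cumsum(inc, axis=1)
          cum = cum / cum[:, -1:]                # rescale so the deepest sample hits t1
          ages = t0 + (t1 - t0) * cum            # ages between the two tie points
          return np.percentile(ages, [2.5, 50, 97.5], axis=0)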

  20. An interactive approach based on a discrete differential evolution algorithm for a class of integer bilevel programming problems

    NASA Astrophysics Data System (ADS)

    Li, Hong; Zhang, Li; Jiao, Yong-Chang

    2016-07-01

    This paper presents an interactive approach based on a discrete differential evolution algorithm to solve a class of integer bilevel programming problems, in which integer decision variables are controlled by an upper-level decision maker and real-valued (continuous) decision variables are controlled by a lower-level decision maker. Using the Karush-Kuhn-Tucker optimality conditions in the lower-level programming, the original discrete bilevel formulation can be converted into a discrete single-level nonlinear programming problem with complementarity constraints, and then a smoothing technique is applied to deal with the complementarity constraints. Finally, a discrete single-level nonlinear programming problem is obtained and solved by an interactive approach. In each iteration, for each given upper-level discrete variable, a system of nonlinear equations including the lower-level variables and Lagrange multipliers is solved first, and then a discrete nonlinear programming problem with only inequality constraints is handled by using a discrete differential evolution algorithm. Simulation results show the effectiveness of the proposed approach.
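
    The flavour of the upper-level search can be conveyed with one generation of a discrete differential evolution step (rand/1/bin mutation with rounding and clipping), as sketched below; f stands for the objective of the single-level problem obtained after the KKT/smoothing reformulation and is assumed given.

      import numpy as np

      def de_generation(pop, f, lo, hi, F=0.8, CR=0.9, rng=None):
          """pop: (n, d) integer population; returns the next generation."""
          rng = rng or np.random.default_rng()
          n, d = pop.shape
          new = pop.copy()
          for i in range(n):
              a, b, c = pop[rng.choice(n, 3, replace=False)]
              mutant = np.clip(np.rint(a + F * (b - c)), lo, hi).astype(int)
              mask = rng.random(d) < CR
              mask[rng.integers(d)] = True              # force one mutated gene
              trial = np.where(mask, mutant, pop[i])
              if f(trial) < f(pop[i]):                  # greedy selection
                  new[i] = trial
          return new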

  1. Proposed evaluation framework for assessing operator performance with multisensor displays

    NASA Technical Reports Server (NTRS)

    Foyle, David C.

    1992-01-01

    Despite aggressive work on the development of sensor fusion algorithms and techniques, no formal evaluation procedures have been proposed. Based on existing integration models in the literature, an evaluation framework is developed to assess an operator's ability to use multisensor, or sensor fusion, displays. The proposed framework for evaluating the operator's ability to use such systems is normative: the operator's performance with the sensor fusion display is compared to the models' predictions based on the operator's performance when viewing the original sensor displays prior to fusion. This allows one to determine when a sensor fusion system leads to: 1) poorer performance than with one of the original sensor displays (clearly an undesirable system, in which the fused sensor system causes some distortion or interference); 2) better performance than with either single sensor system alone, but at a sub-optimal level (compared to the model predictions); 3) optimal performance (compared to model predictions); or 4) super-optimal performance, which may occur if the operator is able to use some highly diagnostic 'emergent features' in the sensor fusion display that were unavailable in the original sensor displays. An experiment demonstrating the usefulness of the proposed evaluation framework is discussed.
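
    The four outcome classes can be expressed directly, as in the hypothetical helper below, given measured accuracies with each single-sensor display, with the fused display, and the integration model's prediction for the fused display; the tolerance eps is an added assumption, not part of the framework.

      def classify_fusion(p_s1, p_s2, p_fused, p_model, eps=0.01):
          """Classify fused-display performance against the normative model."""
          if p_fused <= max(p_s1, p_s2) + eps:
              return "interference"     # no better than a single-sensor display
          if p_fused < p_model - eps:
              return "sub-optimal"      # better than either, below prediction
          if p_fused <= p_model + eps:
              return "optimal"          # matches the model prediction
          return "super-optimal"        # emergent features being exploited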

  2. A New Chronology for the Bronze Age of Northeastern Thailand and Its Implications for Southeast Asian Prehistory.

    PubMed

    Higham, Charles F W; Douka, Katerina; Higham, Thomas F G

    2015-01-01

    There are two models for the origins and timing of the Bronze Age in Southeast Asia. The first centres on the sites of Ban Chiang and Non Nok Tha in Northeast Thailand. It places the first evidence for bronze technology in about 2000 B.C., and identifies the origin by means of direct contact with specialists of the Seima Turbino metallurgical tradition of Central Eurasia. The second is based on the site of Ban Non Wat, 280 km southwest of Ban Chiang, where extensive radiocarbon dating places the transition into the Bronze Age in the 11th century B.C. with likely origins in a southward expansion of technological expertise rooted in the early states of the Yellow and Yangtze valleys, China. We have redated Ban Chiang and Non Nok Tha, as well as the sites of Ban Na Di and Ban Lum Khao, and here present 105 radiocarbon determinations that strongly support the latter model. The statistical analysis of the results using a Bayesian approach allows us to examine the data at a regional level, elucidate the timing of arrival of copper base technology in Southeast Asia and consider its social impact.

  4. SU-F-J-186: Enabling Adaptive IMPT with CBCT-Based Dose Recalculation for H&N and Prostate Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurz, C; LMU Munich, Munich; Park, Y

    2016-06-15

    Purpose: To enable adaptive intensity modulated proton therapy for sites sensitive to inter-fractional changes on the basis of accurate CBCT-based proton dose calculations. To this aim two CBCT intensity correction methods are considered: planning CT (pCT) to CBCT DIR and projection correction based on pCT DIR prior. Methods: 3 H&N and 3 prostate cancer patients with CBCT images and corresponding projections were used in this study, in addition to pCT and re-planning CT (rpCT) images (H&N only). A virtual CT (vCT) was generated by pCT to CBCT DIR. In a second approach, the vCT was used as prior for scatter correction of the CBCT projections to yield a CBCTcor image. BEV 2D range maps of SFUD IMPT plans were compared. For the prostate cases, the geometric accuracy of the vCT was also evaluated by contour comparison to physician delineation of the CBCTcor and original CBCT. Results: SFUD dose calculations on vCT and CBCTcor were found to be within 3mm for 97% to 99% of 2D range maps. Median range differences compared to rpCT were below 0.5mm. Analysis showed that the DIR-based vCT approach exhibits inaccuracies in the pelvic region due to the very low soft-tissue contrast in the CBCT. The CBCTcor approach yielded results closer to the original CBCT in terms of DICE coefficients than the vCT (median 0.91 vs 0.81) for targets and OARs. In general, the CBCTcor approach was less affected by inaccuracies of the DIR used during the generation of the vCT prior. Conclusion: Both techniques yield 3D CBCT images with intensities equivalent to diagnostic CT and appear suitable for IMPT dose calculation for most sites. For H&N cases, no considerable differences between the two techniques were found, while improved results of the CBCTcor were observed for pelvic cases due to the reduced sensitivity to registration inaccuracies. Deutsche Forschungsgemeinschaft (MAP); Bundesministerium fur Bildung und Forschung (01IB13001)

  5. Compact Integration of a GSM-19 Magnetic Sensor with High-Precision Positioning using VRS GNSS Technology

    PubMed Central

    Martín, Angel; Padín, Jorge; Anquela, Ana Belén; Sánchez, Juán; Belda, Santiago

    2009-01-01

    Magnetic data consists of a sequence of collected points with spatial coordinates and magnetic information. The spatial location of these points needs to be as exact as possible in order to develop a precise interpretation of magnetic anomalies. GPS is a valuable tool for accomplishing this objective, especially if the RTK approach is used. In this paper the VRS (Virtual Reference Station) technique is introduced as a new approach for real-time positioning of magnetic sensors. The main advantages of the VRS approach are, firstly, that only a single GPS receiver is needed (no base station is necessary), reducing field work and equipment costs. Secondly, VRS can operate at distances of 50–70 km from the reference stations without degrading accuracy. A compact integration of a GSM-19 magnetometer sensor with a geodetic GPS antenna is presented; this integration does not diminish the operational flexibility of the original magnetometer and can work with the VRS approach. The coupled devices were tested in marshlands around Gandia (a city located approximately 100 km south of Valencia, Spain) that are thought to be the site of a Roman cemetery. The results obtained show adequate geometry and high-precision positioning for the structures to be studied (a comparison with the original low-precision GPS of the magnetometer is presented). Finally, the results of the magnetic survey are of great interest for archaeological purposes. PMID:22574055

  6. Automated bond order assignment as an optimization problem.

    PubMed

    Dehof, Anna Katharina; Rurainski, Alexander; Bui, Quang Bao Anh; Böcker, Sebastian; Lenhof, Hans-Peter; Hildebrandt, Andreas

    2011-03-01

    Numerous applications in Computational Biology process molecular structures and hence strongly rely not only on correct atomic coordinates but also on correct bond order information. For proteins and nucleic acids, bond orders can be easily deduced, but this does not hold for other types of molecules like ligands. For ligands, bond order information is not always provided in molecular databases and thus a variety of approaches tackling this problem have been developed. In this work, we extend an ansatz proposed by Wang et al. that assigns connectivity-based penalty scores and tries to heuristically approximate its optimum. We present three efficient and exact solvers for the problem, replacing the heuristic approximation scheme of the original approach: an A* approach, an ILP approach, and a fixed-parameter tractable (FPT) approach. We implemented and evaluated the original implementation and our A*, ILP and FPT formulations on the MMFF94 validation suite and the KEGG Drug database. We show the benefit of computing exact solutions of the penalty minimization problem and the additional gain when computing all optimal (or even suboptimal) solutions. We close with a detailed comparison of our methods. The A* and ILP solutions are integrated into the open-source C++ LGPL library BALL and the molecular visualization and modelling tool BALLView and can be downloaded from our homepage www.ball-project.org. The FPT implementation can be downloaded from http://bio.informatik.uni-jena.de/software/.
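
    To make the objective concrete, the brute-force reference below enumerates all bond order assignments and returns the one minimizing the total atomic penalty; the exact solvers in the paper (A*, ILP, FPT) find the same optimum without exhaustive enumeration, which is exponential in the number of bonds. The penalty(atom, valence) table is assumed supplied, in the spirit of the Wang et al. scores.

      import itertools

      def assign_bond_orders(bonds, atoms, penalty, max_order=3):
          """bonds: list of (atom_i, atom_j) pairs; returns (cost, orders)."""
          best = (float("inf"), None)
          for orders in itertools.product(range(1, max_order + 1),
                                          repeat=len(bonds)):
              valence = {a: 0 for a in atoms}
              for (i, j), bo in zip(bonds, orders):   # sum bond orders per atom
                  valence[i] += bo
                  valence[j] += bo
              cost = sum(penalty(a, v) for a, v in valence.items())
              if cost < best[0]:
                  best = (cost, orders)
          return best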

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiangjiang; Li, Weixuan; Lin, Guang

    In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
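
    Schematically, the two-stage estimator can look like the sketch below: a cheap surrogate is evaluated at every sample, and only samples whose surrogate response falls in a band around the failure threshold are re-run with the expensive model. surrogate() and model() are assumed given, and the band width is illustrative.

      import numpy as np

      def two_stage_failure_prob(samples, surrogate, model, thresh, band=0.1):
          g = np.array([surrogate(x) for x in samples])
          fail = g > thresh                            # surrogate verdict
          near = np.abs(g - thresh) < band * abs(thresh)
          for i in np.flatnonzero(near):               # correct near the boundary
              fail[i] = model(samples[i]) > thresh
          return fail.mean()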

  8. Urban pavement surface temperature. Comparison of numerical and statistical approach

    NASA Astrophysics Data System (ADS)

    Marchetti, Mario; Khalifa, Abderrahmen; Bues, Michel; Bouilloud, Ludovic; Martin, Eric; Chancibaut, Katia

    2015-04-01

    The forecast of pavement surface temperature is very specific in the context of urban winter maintenance, where it is used to manage snow plowing and salting of roads. Such forecasts mainly rely on numerical models based on a description of the energy balance between the atmosphere, the buildings and the pavement, with a canyon configuration. Nevertheless, there is a specific need for a physical description and numerical implementation of traffic in the energy flux balance. Traffic was originally considered as a constant. Many changes were made to a numerical model to describe as accurately as possible the traffic effects on this urban energy balance, such as tire friction, the pavement-air exchange coefficient, and the net infrared flux balance. Experiments based on infrared thermography and radiometry were then conducted to quantify the effect of traffic on urban pavement surfaces. Based on meteorological data, the corresponding pavement temperature forecasts were calculated and compared with field measurements. Results indicated good agreement with the forecasts from the numerical model based on this energy balance approach. A complementary forecast approach based on principal component analysis (PCA) and partial least-squares regression (PLS) was also developed, with data from thermal mapping using infrared radiometry. The forecast of pavement surface temperature from air temperature was obtained in the specific case of the urban configuration, with traffic taken into account in the measurements used for the statistical analysis. A comparison between results from the numerical model based on the energy balance and from PCA/PLS was then conducted, indicating the advantages and limits of each approach.
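
    A minimal version of the statistical branch (using scikit-learn's PLS regression in place of the authors' PCA/PLS pipeline, with invented stand-in data) might look like this:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      X = rng.random((200, 5))        # per-segment predictors (air temp, traffic, ...)
      y = 5 + 10 * X[:, 0] + rng.normal(size=200)      # surface temperature (degC)
      pls = PLSRegression(n_components=3).fit(X, y)
      y_hat = pls.predict(X[:5])      # forecasts for the first five segments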

  9. ALI (Autonomous Lunar Investigator): Revolutionary Approach to Exploring the Moon with Addressable Reconfigurable Technology

    NASA Technical Reports Server (NTRS)

    Clark, P. E.; Curtis, S. A.; Rilee, M. L.; Floyd, S. R.

    2005-01-01

    Addressable Reconfigurable Technology (ART) based structures: Mission Concepts based on Addressable Reconfigurable Technology (ART), originally studied for future ANTS (Autonomous Nanotechnology Swarm) Space Architectures, are now being developed as rovers for nearer term use in lunar and planetary surface exploration. The architecture is based on the reconfigurable tetrahedron as a building block. Tetrahedra are combined to form space-filling networks, shaped for the required function. Basic structural components are highly modular, addressable arrays of robust nodes (tetrahedral apices) from which highly reconfigurable struts (tetrahedral edges), acting as supports or tethers, are efficiently reversibly deployed/stowed, transforming and reshaping the structures as required.

  10. Photorefractivity of triphenylamine polymers

    NASA Astrophysics Data System (ADS)

    Tsujimura, S.; Kinashi, K.; Sakai, W.; Tsutsumi, N.

    2012-10-01

    We present here the enhanced photorefractive performance and dynamic holographic image of poly(4-diphenylamino)styrene (PDAS)-based photorefractive polymeric composites (PPCs). PDAS and FDCST were synthesized as a photoconductive polymer and a nonlinear optical (NLO) dye, respectively. PPC films including PDAS, TPA (or ECZ), FDCST, and PCBM were investigated. The photorefractive quantities of the PDAS-based PPCs were measured by a degenerate four-wave mixing (DFWM) technique. Additionally, the dynamic holographic images were recorded through an appropriate PDAS-based PPC. Those dynamic holographic images clearly duplicate the original motion with high-speed quality. The present approach provides a promising candidate for the future application of dynamic holographic displays.

  11. Model based manipulator control

    NASA Technical Reports Server (NTRS)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1989-01-01

    The feasibility of using model based control (MBC) for robotic manipulators was investigated. A double inverted pendulum system was constructed as the experimental system for a general study of dynamically stable manipulation. The original interest in dynamically stable systems was driven by the objective of high vertical reach (balancing), and the planning of inertially favorable trajectories for force and payload demands. The model-based control approach is described and the results of experimental tests are summarized. Results directly demonstrate that MBC can provide stable control at all speeds of operation and support operations requiring dynamic stability such as balancing. The application of MBC to systems with flexible links is also discussed.
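
    The paper does not spell out its control law, but model-based manipulator control is conventionally exemplified by the computed-torque scheme sketched below, where the model terms M (inertia), C (Coriolis/centrifugal) and G (gravity) are supplied by the manipulator model and the gains are illustrative.

      import numpy as np

      def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, G,
                          Kp=100.0, Kd=20.0):
          """tau = M(q)(qdd_des + Kd*ed + Kp*e) + C(q, qd) + G(q)."""
          e, ed = q_des - q, qd_des - qd
          return M(q) @ (qdd_des + Kd * ed + Kp * e) + C(q, qd) + G(q)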

  12. High-Speed Photorefractive Response Capability in Triphenylamine Polymer-Based Composites

    NASA Astrophysics Data System (ADS)

    Tsujimura, Sho; Kinashi, Kenji; Sakai, Wataru; Tsutsumi, Naoto

    2012-06-01

    We present here the poly(4-diphenylamino)styrene (PDAS)-based photorefractive composites with a high-speed response time. PDAS was synthesized as a photoconductive polymer and photorefractive polymeric composite (PPC) films by using triphenylamine (TPA) (or ethylcarbazole, ECZ), 4-homopiperidino-2-fluorobenzylidene malononitrile (FDCST), and [6,6]-phenyl-C61-butyric acid methyl ester (PCBM) were investigated. The photorefractive quantities of the PDAS-based PPCs were determined by a degenerate four-wave mixing (DFWM) technique. Additionally, the holographic images were recorded through an appropriate PDAS-based PPC. Those holographic images clearly reconstruct the original motion with high-speed quality. The present approach provides a promising candidate for the future application of dynamic holographic displays.

  13. Spatio-Temporal Mining of PolSAR Satellite Image Time Series

    NASA Astrophysics Data System (ADS)

    Julea, A.; Meger, N.; Trouve, E.; Bolon, Ph.; Rigotti, C.; Fallourd, R.; Nicolas, J.-M.; Vasile, G.; Gay, M.; Harant, O.; Ferro-Famil, L.

    2010-12-01

    This paper presents an original data mining approach for describing Satellite Image Time Series (SITS) spatially and temporally. It relies on pixel-based evolution and sub-evolution extraction. These evolutions, namely the frequent grouped sequential patterns, are required to cover a minimum surface and to affect pixels that are sufficiently connected. These spatial constraints are actively used to face large data volumes and to select evolutions making sense for end-users. In this paper, a specific application to fully polarimetric SAR image time series is presented. Preliminary experiments performed on a RADARSAT-2 SITS covering the Chamonix Mont-Blanc test-site are used to illustrate the proposed approach.

  14. A convenient alignment approach for x-ray imaging experiments based on laser positioning devices

    PubMed Central

    Zhang, Da; Donovan, Molly; Wu, Xizeng; Liu, Hong

    2008-01-01

    This study presents a two-laser alignment approach for facilitating the precise alignment of various imaging and measuring components with respect to the x-ray beam. The first laser constantly pointed to the output window of the source, in a direction parallel to the path along which the components are placed. The second laser beam, originating from the opposite direction, was calibrated to coincide with the first laser beam. Thus, a visible indicator of the direction of the incident x-ray beam was established, and the various components could then be aligned conveniently and accurately with its help. PMID:19070224

  15. Rescheduling with iterative repair

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Davis, Eugene; Daun, Brian; Deale, Michael

    1992-01-01

    This paper presents a new approach to rescheduling called constraint-based iterative repair. This approach gives our system the ability to satisfy domain constraints, address optimization concerns, minimize perturbation to the original schedule, produce modified schedules quickly, and exhibit 'anytime' behavior. The system begins with an initial, flawed schedule and then iteratively repairs constraint violations until a conflict-free schedule is produced. In an empirical demonstration, we vary the importance of minimizing perturbation and report how fast the system is able to resolve conflicts in a given time bound. We also show the anytime characteristics of the system. These experiments were performed within the domain of Space Shuttle ground processing.
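
    The skeleton of such a repair loop, and the source of its 'anytime' character, is only a few lines long; the helpers below (violations, repair_move) are placeholders, not the Space Shuttle system's internals.

      import time

      def iterative_repair(schedule, violations, repair_move, time_budget=1.0):
          """Repair constraint violations until conflict-free or out of time."""
          deadline = time.monotonic() + time_budget
          while time.monotonic() < deadline:
              conflicts = violations(schedule)
              if not conflicts:
                  break                                # conflict-free schedule
              schedule = repair_move(schedule, conflicts[0])
          return schedule                              # best-so-far ("anytime")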

  16. Adjustable lossless image compression based on a natural splitting of an image into drawing, shading, and fine-grained components

    NASA Technical Reports Server (NTRS)

    Novik, Dmitry A.; Tilton, James C.

    1993-01-01

    The compression, or efficient coding, of single band or multispectral still images is becoming an increasingly important topic. While lossy compression approaches can produce reconstructions that are visually close to the original, many scientific and engineering applications require exact (lossless) reconstructions. However, the most popular and efficient lossless compression techniques do not fully exploit the two-dimensional structural links existing in the image data. We describe here a general approach to lossless data compression that effectively exploits two-dimensional structural links of any length. After describing in detail two main variants on this scheme, we discuss experimental results.
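
    The flavour of two-dimensional structural decorrelation can be shown with a standard example (the MED predictor from JPEG-LS, given here as a generic illustration rather than the authors' scheme): each pixel is predicted from its west, north and north-west neighbours and only the residual is entropy-coded, losslessly and invertibly.

      import numpy as np

      def med_residuals(img):
          """Median edge detector (MED) prediction residuals of a 2-D image."""
          img = img.astype(np.int32)
          res = img.copy()                  # first row/column kept verbatim
          for r in range(1, img.shape[0]):
              for c in range(1, img.shape[1]):
                  a, b, d = img[r, c - 1], img[r - 1, c], img[r - 1, c - 1]
                  if d >= max(a, b):
                      pred = min(a, b)
                  elif d <= min(a, b):
                      pred = max(a, b)
                  else:
                      pred = a + b - d
                  res[r, c] = img[r, c] - pred
          return res                        # a decoder inverts the same recursion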

  17. Adaptive Load-Balancing Algorithms using Symmetric Broadcast Networks

    NASA Technical Reports Server (NTRS)

    Das, Sajal K.; Harvey, Daniel J.; Biswas, Rupak; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In a distributed computing environment, it is important to ensure that the processor workloads are adequately balanced. Among numerous load-balancing algorithms, a unique approach due to Das and Prasad defines a symmetric broadcast network (SBN) that provides a robust communication pattern among the processors in a topology-independent manner. In this paper, we propose and analyze three efficient SBN-based dynamic load-balancing algorithms, and implement them on an SGI Origin2000. A thorough experimental study with Poisson-distributed synthetic loads demonstrates that our algorithms are effective in balancing system load. By optimizing completion time and idle time, the proposed algorithms are shown to compare favorably with several existing approaches.

  18. A cloud-based data network approach for translational cancer research.

    PubMed

    Xing, Wei; Tsoumakos, Dimitrios; Ghanem, Moustafa

    2015-01-01

    We develop a new model and associated technology for constructing and managing self-organizing data to support translational cancer research studies. We employ a semantic content network approach to address the challenges of managing cancer research data. Such data are heterogeneous, large, decentralized, growing and continually being updated. Moreover, the data originate from different information sources that may be partially overlapping, creating redundancies as well as contradictions and inconsistencies. Building on the elasticity of cloud computing, we deploy the cancer data networks on top of the CELAR Cloud platform to enable more effective processing and analysis of big cancer data.

  19. Coulomb explosion: a novel approach to separate single-walled carbon nanotubes from their bundle.

    PubMed

    Liu, Guangtong; Zhao, Yuanchun; Zheng, Kaihong; Liu, Zheng; Ma, Wenjun; Ren, Yan; Xie, Sishen; Sun, Lianfeng

    2009-01-01

    A novel approach based on Coulomb explosion has been developed to separate single-walled carbon nanotubes (SWNTs) from their bundle. With this technique, we can readily separate a bundle of SWNTs into smaller bundles with uniform diameter, as well as some individual SWNTs. The separated SWNTs have a typical length of several microns and form a nanotree at one end of the original bundle. More importantly, this separation procedure involves no surfactant and consists of a single physical processing step. The separation method offers great convenience for subsequent individual-SWNT or multiterminal SWNT device fabrication and studies of their physical properties.

  20. Production of recombinant allergens in plants.

    PubMed

    Schmidt, Georg; Gadermaier, Gabriele; Pertl, Heidi; Siegert, Marc; Oksman-Caldentey, Kirsi-Marja; Ritala, Anneli; Himly, Martin; Obermeyer, Gerhard; Ferreira, Fatima

    2008-10-01

    A large percentage of allergenic proteins are of plant origin. Hence, plant-based expression systems are considered ideal for the recombinant production of certain allergens. First attempts to establish production of plant-derived allergens in plants focused on transient expression in Nicotiana benthamiana infected with recombinant viral vectors. Accordingly, allergens from birch and mugwort pollen, as well as from apple have been expressed in plants. Production of house dust mite allergens has been achieved by Agrobacterium-mediated transformation of tobacco plants. Beside the use of plants as production systems, other approaches have focused on the development of edible vaccines expressing allergens or epitopes thereof, which bypasses the need of allergen purification. The potential of this approach has been convincingly demonstrated for transgenic rice seeds expressing seven dominant human T cell epitopes derived from Japanese cedar pollen allergens. Parallel to efforts in developing recombinant-based diagnostic and therapeutic reagents, different gene-silencing approaches have been used to decrease the expression of allergenic proteins in allergen sources. In this way hypoallergenic ryegrass, soybean, rice, apple, and tomato were developed.
