Sample records for target reliability based

  1. Geo-Referenced Dynamic Pushbroom Stereo Mosaics for 3D and Moving Target Extraction - A New Geometric Approach

    DTIC Science & Technology

    2009-12-01

    facilitating reliable stereo matching, occlusion handling, accurate 3D reconstruction and robust moving target detection. We use the fact that all the... a moving platform, we will have to naturally and effectively handle obvious motion parallax and object occlusions in order to be able to detect... Based on the above two

  2. Method of Testing and Predicting Failures of Electronic Mechanical Systems

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, Frances A.

    1996-01-01

    A method employing a knowledge base of human expertise, comprising a reliability model analysis implemented for diagnostic routines, is disclosed. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise information that is used to build automatic diagnostic routines and which provides a knowledge base that can be used to solve other artificial intelligence problems.

  3. Study on evaluation of construction reliability for engineering project based on fuzzy language operator

    NASA Astrophysics Data System (ADS)

    Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping

    2018-03-01

    System reliability theory has been a research hotspot in management science and systems engineering in recent years, and construction reliability is useful for quantitative evaluation of the project management level. A definition of construction reliability is given according to reliability theory and the target system of engineering project management. Based on fuzzy mathematics theory and the language operator, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through operation of the language operator, which provides the corresponding method and parameters for the evaluation of construction reliability. The method is shown to be scientific and reasonable for construction conditions and a useful attempt at theory and method research on engineering project system reliability.
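    A hedged sketch of the kind of fuzzy partition the abstract describes: the reliability value space [0, 1] divided into seven overlapping fuzzy subsets with triangular membership functions. The linguistic labels, peak positions, and the evaluated value below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: partition the construction-reliability value space
# [0, 1] into seven overlapping fuzzy subsets via triangular membership
# functions. Labels and peak positions are illustrative, not from the paper.

def triangular(x, a, b, c):
    """Triangular membership: rises from a to peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

LABELS = ["very low", "low", "fairly low", "medium",
          "fairly high", "high", "very high"]
CENTERS = [i / 6 for i in range(7)]   # seven evenly spaced peaks on [0, 1]

def memberships(x):
    """Degree of membership of reliability value x in each fuzzy subset."""
    step = 1 / 6
    return {label: triangular(x, c - step, c, c + step)
            for label, c in zip(LABELS, CENTERS)}

def evaluate(x):
    """Assign the linguistic label with the highest membership degree."""
    m = memberships(x)
    return max(m, key=m.get)

print(evaluate(0.85))  # "high"
```

    With equally spaced triangular sets like these, the memberships of any interior value sum to one, so each reliability value distributes its full weight over at most two neighbouring linguistic labels.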

  4. Weighted integration of short-term memory and sensory signals in the oculomotor system.

    PubMed

    Deravet, Nicolas; Blohm, Gunnar; Orban de Xivry, Jean-Jacques; Lefèvre, Philippe

    2018-05-01

    Oculomotor behaviors integrate sensory and prior information to overcome sensory-motor delays and noise. After much debate about this process, reliability-based integration has recently been proposed and several models of smooth pursuit now include recurrent Bayesian integration or Kalman filtering. However, there is a lack of behavioral evidence in humans supporting these theoretical predictions. Here, we independently manipulated the reliability of visual and prior information in a smooth pursuit task. Our results show that both smooth pursuit eye velocity and catch-up saccade amplitude were modulated by visual and prior information reliability. We interpret these findings as the continuous reliability-based integration of a short-term memory of target motion with visual information, which supports the modeling work. Furthermore, we suggest that saccadic and pursuit systems share this short-term memory. We propose that this short-term memory of target motion is quickly built and continuously updated, and constitutes a general building block present in all sensorimotor systems.
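    The recurrent Bayesian integration mentioned above can be illustrated, under simple Gaussian assumptions, by a one-dimensional Kalman-style update. This is a generic sketch, not the authors' pursuit model; the velocities and variances are invented.

```python
# Illustrative sketch (not the authors' model): reliability-weighted
# integration of a remembered target velocity (the prior) with a noisy
# visual measurement, in the style of a 1-D Kalman filter update.

def kalman_update(prior_mean, prior_var, measurement, meas_var):
    """Combine prior and measurement, each weighted by its reliability
    (inverse variance); return the posterior mean and variance."""
    k = prior_var / (prior_var + meas_var)      # Kalman gain
    post_mean = prior_mean + k * (measurement - prior_mean)
    post_var = (1 - k) * prior_var
    return post_mean, post_var

# Prior: remembered target velocity of 10 deg/s, fairly uncertain (var 4).
# A reliable visual measurement (var 1) of 14 deg/s pulls the estimate
# most of the way toward the measurement, and uncertainty shrinks.
mean, var = kalman_update(10.0, 4.0, 14.0, 1.0)
print(round(mean, 2), round(var, 2))  # 13.2 0.8
```

    Degrading visual reliability (raising `meas_var`) shifts the weighting back toward the stored prior, which is the qualitative pattern the behavioral results support.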

  5. Behavior and neural basis of near-optimal visual search

    PubMed Central

    Ma, Wei Ji; Navalpakkam, Vidhya; Beck, Jeffrey M; van den Berg, Ronald; Pouget, Alexandre

    2013-01-01

    The ability to search efficiently for a target in a cluttered environment is one of the most remarkable functions of the nervous system. This task is difficult under natural circumstances, as the reliability of sensory information can vary greatly across space and time and is typically a priori unknown to the observer. In contrast, visual-search experiments commonly use stimuli of equal and known reliability. In a target detection task, we randomly assigned high or low reliability to each item on a trial-by-trial basis. An optimal observer would weight the observations by their trial-to-trial reliability and combine them using a specific nonlinear integration rule. We found that humans were near-optimal, regardless of whether distractors were homogeneous or heterogeneous and whether reliability was manipulated through contrast or shape. We present a neural-network implementation of near-optimal visual search based on probabilistic population coding. The network matched human performance. PMID:21552276
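    The "specific nonlinear integration rule" of the optimal observer can be sketched as follows, assuming Gaussian measurement noise: per-item log-likelihood ratios, each scaled by that item's own reliability, are combined through a log-mean-exponential rather than a simple sum. The stimulus values and noise levels here are illustrative, not from the experiments.

```python
# Hedged sketch of an optimal observer for "is a target present among N
# items?": local log-likelihood ratios d_i (each using that item's own
# noise level sigma_i) are combined nonlinearly as
#   d = log( (1/N) * sum_i exp(d_i) ).

import math

def local_llr(x, target, distractor, sigma):
    """Log-likelihood ratio that measurement x of one item comes from the
    target rather than a distractor; sigma encodes that item's reliability."""
    return ((x - distractor) ** 2 - (x - target) ** 2) / (2 * sigma ** 2)

def global_llr(measurements, sigmas, target=1.0, distractor=0.0):
    """Optimal combination across items: any one item may be the target."""
    d_is = [local_llr(x, target, distractor, s)
            for x, s in zip(measurements, sigmas)]
    return math.log(sum(math.exp(d) for d in d_is) / len(d_is))

# One high-reliability item near the target value dominates the decision,
# even though the low-reliability items sit near the distractor value.
d = global_llr([0.9, 0.1, 0.2, 0.0], sigmas=[0.1, 0.5, 0.5, 0.5])
print(d > 0)  # True: report "target present"
```

    The key property is that each item's evidence is automatically weighted by its trial-to-trial reliability before the nonlinear pooling, which is what a fixed-weight heuristic cannot reproduce.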

  6. Advancing methods for reliably assessing motivational interviewing fidelity using the Motivational Interviewing Skills Code

    PubMed Central

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.

    2014-01-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192

  7. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    PubMed

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Reliability Assessment of a Robust Design Under Uncertainty for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J. -W.; Newman, Perry A.

    2003-01-01

    The paper presents reliability assessment results for the robust designs under uncertainty of a 3-D flexible wing previously reported by the authors. Reliability assessments (additional optimization problems) of the active constraints at the various probabilistic robust design points are obtained and compared with the constraint values or target constraint probabilities specified in the robust design. In addition, reliability-based sensitivity derivatives with respect to design variable mean values are also obtained and shown to agree with finite difference values. These derivatives allow one to perform reliability-based design without having to obtain second-order sensitivity derivatives. However, an inner-loop optimization problem must be solved for each active constraint to find the most probable point on that constraint failure surface.

  9. Evaluation of the reliability of maize reference assays for GMO quantification.

    PubMed

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly along the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding history. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with a poor PCR performance observed for this assay along the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. 
Finally, although currently not forming a part of official quantification methods, zein and SSIIb assays are found to be highly reliable in terms of nucleotide stability and PCR performance and are proposed as good alternative targets for a reference assay for maize.

  10. Radar-based collision avoidance for unmanned surface vehicles

    NASA Astrophysics Data System (ADS)

    Zhuang, Jia-yuan; Zhang, Lei; Zhao, Shi-qi; Cao, Jian; Wang, Bo; Sun, Han-bing

    2016-12-01

    Unmanned surface vehicles (USVs) have become a focus of research because of their extensive applications. To ensure safety and reliability and to perform complex tasks autonomously, USVs must possess accurate perception of the environment and effective collision avoidance capabilities. Achieving these requires investigation into real-time marine radar target detection and autonomous collision avoidance technologies, aimed at solving the problems of noise jamming, uneven brightness, target loss, and blind areas in marine radar images. These technologies must also satisfy the real-time and reliability requirements imposed by the high navigation speeds of USVs. Therefore, this study developed an embedded collision avoidance system based on marine radar, investigated a highly real-time target detection method comprising an adaptive smoothing algorithm and a robust segmentation algorithm, developed a stable and reliable dynamic local environment model to ensure the safety of USV navigation, and constructed a collision avoidance algorithm based on the velocity obstacle (V-obstacle) method that adjusts the USV's heading and speed in real time. Sea trial results in multi-obstacle avoidance demonstrate the effectiveness and efficiency of the proposed avoidance system and verify its adaptability and relative stability when a USV sails in a real, complex marine environment. The obtained results will improve the intelligence level of USVs and help guarantee the safety of independent USV sailing.

  11. Compact Surface Plasmon Resonance Biosensor for Fieldwork Environmental Detection

    NASA Astrophysics Data System (ADS)

    Boyd, Margrethe; Drake, Madison; Stipe, Kristian; Serban, Monica; Turner, Ivana; Thomas, Aaron; Macaluso, David

    2017-04-01

    The ability to accurately and reliably detect biomolecular targets is important in innumerable applications, including the identification of food-borne parasites, viral pathogens in human tissue, and environmental pollutants. While detection methods do exist, they are typically slow, expensive, and restricted to laboratory use. The method of surface plasmon resonance based biosensing offers a unique opportunity to characterize molecular targets while avoiding these constraints. By incorporating a plasmon-supporting gold film within a prism/laser optical system, it is possible to reliably detect and quantify the presence of specific biomolecules of interest in real time. This detection is accomplished by observing shifts in plasmon formation energies corresponding to optical absorption due to changes in index of refraction near the gold-prism interface caused by the binding of target molecules. A compact, inexpensive, battery-powered surface plasmon resonance biosensor based on this method is being developed at the University of Montana to detect waterborne pollutants in field-based environmental research.

  12. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    NASA Astrophysics Data System (ADS)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information for assessing the impacts of climate change at regional and global scales. Statistical downscaling methods are commonly applied to prepare climate model data for applications such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data to regional modelling systems. However, inconsistencies among these climate products, for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products with respect to station elevation, discretized into several classes. According to the rank of the climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. 
    A web-based system was developed to allow users to easily select the most reliable reference climate data at each target point based on the elevation of the grid cell. By constructing the best combination of reference data for the study domain, statistically downscaled climate projections could be made significantly more accurate and reliable.

  13. Reliabilities of mental rotation tasks: limits to the assessment of individual differences.

    PubMed

    Hirschfeld, Gerrit; Thielsch, Meinald T; Zernikow, Boris

    2013-01-01

    Mental rotation tasks with objects and body parts as targets are widely used in cognitive neuropsychology. Even though these tasks are well established to study between-groups differences, the reliability on an individual level is largely unknown. We present a systematic study on the internal consistency and test-retest reliability of individual differences in mental rotation tasks comparing different target types and orders of presentations. In total n = 99 participants (n = 63 for the retest) completed the mental rotation tasks with hands, feet, faces, and cars as targets. Different target types were presented in either randomly mixed blocks or blocks of homogeneous targets. Across all target types, the consistency (split-half reliability) and stability (test-retest reliability) were good or acceptable both for intercepts and slopes. At the level of individual targets, only intercepts showed acceptable reliabilities. Blocked presentations resulted in significantly faster and numerically more consistent and stable responses. Mental rotation tasks, especially in blocked variants, can be used to reliably assess individual differences in global processing speed. However, the assessment of the theoretically important slope parameter for individual targets requires further adaptations to mental rotation tests.

  14. HomPPI: a class of sequence homology based protein-protein interface prediction methods

    PubMed Central

    2011-01-01

    Background Although homology-based methods are among the most widely used methods for predicting the structure and function of proteins, the question as to whether interface sequence conservation can be effectively exploited in predicting protein-protein interfaces has been a subject of debate. Results We studied more than 300,000 pair-wise alignments of protein sequences from structurally characterized protein complexes, including both obligate and transient complexes. We identified sequence similarity criteria required for accurate homology-based inference of interface residues in a query protein sequence. Based on these analyses, we developed HomPPI, a class of sequence homology-based methods for predicting protein-protein interface residues. We present two variants of HomPPI: (i) NPS-HomPPI (Non partner-specific HomPPI), which can be used to predict interface residues of a query protein in the absence of knowledge of the interaction partner; and (ii) PS-HomPPI (Partner-specific HomPPI), which can be used to predict the interface residues of a query protein with a specific target protein. Our experiments on a benchmark dataset of obligate homodimeric complexes show that NPS-HomPPI can reliably predict protein-protein interface residues in a given protein, with an average correlation coefficient (CC) of 0.76, sensitivity of 0.83, and specificity of 0.78, when sequence homologs of the query protein can be reliably identified. NPS-HomPPI also reliably predicts the interface residues of intrinsically disordered proteins. Our experiments suggest that NPS-HomPPI is competitive with several state-of-the-art interface prediction servers including those that exploit the structure of the query proteins. 
The partner-specific classifier, PS-HomPPI can, on a large dataset of transient complexes, predict the interface residues of a query protein with a specific target, with a CC of 0.65, sensitivity of 0.69, and specificity of 0.70, when homologs of both the query and the target can be reliably identified. The HomPPI web server is available at http://homppi.cs.iastate.edu/. Conclusions Sequence homology-based methods offer a class of computationally efficient and reliable approaches for predicting the protein-protein interface residues that participate in either obligate or transient interactions. For query proteins involved in transient interactions, the reliability of interface residue prediction can be improved by exploiting knowledge of putative interaction partners. PMID:21682895
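    For reference, the reported performance measures can be computed from a residue-level confusion matrix as below. The correlation coefficient is taken here to be the Matthews correlation coefficient, a common choice for this task though an assumption on our part, and the tiny label vectors are invented for illustration.

```python
# Sketch: computing correlation coefficient (Matthews), sensitivity, and
# specificity from per-residue interface predictions. Label vectors are
# made-up toy data, not from the HomPPI benchmarks.

import math

def interface_metrics(actual, predicted):
    """actual/predicted: per-residue flags, 1 = interface residue."""
    tp = sum(a and p for a, p in zip(actual, predicted))
    tn = sum(not a and not p for a, p in zip(actual, predicted))
    fp = sum(not a and p for a, p in zip(actual, predicted))
    fn = sum(a and not p for a, p in zip(actual, predicted))
    sens = tp / (tp + fn)              # fraction of true interface found
    spec = tn / (tn + fp)              # fraction of non-interface kept
    cc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return cc, sens, spec

actual    = [1, 1, 1, 0, 0, 0, 0, 1]
predicted = [1, 1, 0, 0, 0, 1, 0, 1]
cc, sens, spec = interface_metrics(actual, predicted)
print(round(cc, 2), round(sens, 2), round(spec, 2))  # 0.5 0.75 0.75
```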

  15. Neural Networks Based Approach to Enhance Space Hardware Reliability

    NASA Technical Reports Server (NTRS)

    Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.

    2011-01-01

    This paper demonstrates the use of Neural Networks as a device modeling tool to increase the reliability analysis accuracy of circuits targeted for space applications. The paper tackles a number of case studies of relevance to the design of Flight hardware. The results show that the proposed technique generates more accurate models than the ones regularly used to model circuits.

  16. Research on vehicle detection based on background feature analysis in SAR images

    NASA Astrophysics Data System (ADS)

    Zhang, Bochuan; Tang, Bo; Zhang, Cong; Hu, Ruiguang; Yun, Hongquan; Xiao, Liping

    2017-10-01

    To detect ground vehicles in low-resolution SAR images, a method is proposed that first determines the region containing the vehicles and then detects the targets within that specific region. The experimental results show that this method not only reduces the target detection area but also reduces the influence of terrain clutter on detection, which greatly improves the reliability of target detection.

  17. Delay Analysis of Car-to-Car Reliable Data Delivery Strategies Based on Data Mulling with Network Coding

    NASA Astrophysics Data System (ADS)

    Park, Joon-Sang; Lee, Uichin; Oh, Soon Young; Gerla, Mario; Lun, Desmond Siumen; Ro, Won Woo; Park, Joonseok

    Vehicular ad hoc networks (VANETs) aim to enhance vehicle navigation safety by providing an early warning system: any risk of an accident is communicated through wireless communication between vehicles. For the warning system to work, it is crucial that safety messages be reliably delivered to the target vehicles in a timely manner; reliable and timely data dissemination is therefore the key building block of VANETs. A data mulling technique combined with three strategies, network coding, erasure coding, and repetition coding, is proposed for this reliable and timely data dissemination service. In particular, vehicles travelling in the opposite direction on a highway are exploited as data mules, mobile nodes that physically deliver data to destinations, to overcome the intermittent network connectivity caused by sparse vehicle traffic. Using analytic models, we show that in such a highway data mulling scenario the network coding based strategy outperforms the erasure coding and repetition based strategies.
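    The advantage of the network coding strategy can be illustrated with a toy linear-coding example over GF(2): the destination can reconstruct the k source packets from any k linearly independent XOR combinations collected from passing mules, rather than needing each specific packet as under repetition. This is a sketch of the general technique, not the paper's analytic model; packets are modeled as small integers.

```python
# Toy network-coding decoder for data mulling: each mule carries an XOR
# combination of the k source packets (coefficients over GF(2)); the
# destination decodes once it holds k linearly independent combinations.

def decode(coded, k):
    """Gaussian elimination over GF(2).
    coded: list of (coefficient-vector, XORed payload) pairs.
    Returns the k source packets, or None if the rank is still below k."""
    basis = [None] * k                 # basis[c]: row with leading coeff c
    for coeffs, payload in coded:
        coeffs = coeffs[:]
        for c in range(k):
            if not coeffs[c]:
                continue
            if basis[c] is None:
                basis[c] = (coeffs, payload)
                break
            bc, bp = basis[c]          # reduce against the existing row
            coeffs = [x ^ y for x, y in zip(coeffs, bc)]
            payload ^= bp
    if any(b is None for b in basis):
        return None                    # too few independent combinations
    out = [None] * k                   # back-substitution
    for c in range(k - 1, -1, -1):
        coeffs, payload = basis[c]
        for j in range(c + 1, k):
            if coeffs[j]:
                payload ^= out[j]
        out[c] = payload
    return out

p = [0xDEAD, 0xBEEF, 0xF00D]           # three source packets
# Three coded packets picked up from passing mules; any independent set works.
coded = [([1, 1, 1], p[0] ^ p[1] ^ p[2]),
         ([1, 1, 0], p[0] ^ p[1]),
         ([0, 1, 0], p[1])]
print(decode(coded, 3) == p)           # True
```

    With repetition coding, by contrast, a mule duplicating a packet the destination already holds contributes nothing, which is one intuition behind the analytic result that network coding wins in the sparse-traffic regime.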

  18. Cue reliability and a landmark stability heuristic determine relative weighting between egocentric and allocentric visual information in memory-guided reach.

    PubMed

    Byrne, Patrick A; Crawford, J Douglas

    2010-06-01

    It is not known how egocentric visual information (location of a target relative to the self) and allocentric visual information (location of a target relative to external landmarks) are integrated to form reach plans. Based on behavioral data from rodents and humans we hypothesized that the degree of stability in visual landmarks would influence the relative weighting. Furthermore, based on numerous cue-combination studies we hypothesized that the reach system would act like a maximum-likelihood estimator (MLE), where the reliability of both cues determines their relative weighting. To predict how these factors might interact we developed an MLE model that weighs egocentric and allocentric information based on their respective reliabilities, and also on an additional stability heuristic. We tested the predictions of this model in 10 human subjects by manipulating landmark stability and reliability (via variable-amplitude vibration of the landmarks and variable-amplitude gaze shifts) in three reach-to-touch tasks: an egocentric control (reaching without landmarks), an allocentric control (reaching relative to landmarks), and a cue-conflict task (involving a subtle landmark "shift" during the memory interval). Variability from all three experiments was used to derive parameters for the MLE model, which was then used to simulate egocentric-allocentric weighting in the cue-conflict experiment. As predicted by the model, landmark vibration, despite its lack of influence on pointing variability (and thus allocentric reliability) in the control experiment, had a strong influence on egocentric-allocentric weighting. A reduced model without the stability heuristic was unable to reproduce this effect. These results suggest that heuristics for extrinsic cue stability are at least as important as reliability for determining cue weighting in memory-guided reaching.
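    For two Gaussian cues, the MLE portion of such a model reduces to a reliability-weighted average. The sketch below shows only that core rule, with invented variances, and omits the paper's additional stability-heuristic term.

```python
# Minimal MLE cue-combination sketch: egocentric and allocentric estimates
# of target location are averaged with weights proportional to their
# reliabilities (inverse variances). Variances are illustrative values.

def mle_combine(ego, ego_var, allo, allo_var):
    """Reliability-weighted average of two position estimates (in cm)."""
    w_ego = (1 / ego_var) / (1 / ego_var + 1 / allo_var)
    return w_ego * ego + (1 - w_ego) * allo

# Cue conflict: egocentric memory says 10 cm; shifted landmarks imply 12 cm.
# With equally reliable cues the reach lands halfway between the two.
print(mle_combine(10.0, 1.0, 12.0, 1.0))         # 11.0
# Degrading allocentric reliability shifts weighting toward the ego cue.
print(mle_combine(10.0, 1.0, 12.0, 4.0) < 11.0)  # True
```

    The paper's point is that this reliability term alone is insufficient: an extra multiplicative down-weighting of unstable (vibrating) landmarks is needed to reproduce the observed behavior.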

  19. Risk-Informed Mean Recurrence Intervals for Updated Wind Maps in ASCE 7-16.

    PubMed

    McAllister, Therese P; Wang, Naiyu; Ellingwood, Bruce R

    2018-05-01

    ASCE 7 is moving toward adopting load requirements that are consistent with risk-informed design goals characteristic of performance-based engineering (PBE). ASCE 7-10 provided wind maps that correspond to return periods of 300, 700, and 1,700 years for Risk Categories I, II, and combined III/IV, respectively. The risk targets for Risk Category III and IV buildings and other structures (designated as essential facilities) are different in PBE. The reliability analyses reported in this paper were conducted using updated wind load data to (1) confirm that the return periods already in ASCE 7-10 were also appropriate for risk-informed PBE, and (2) determine a new risk-based return period for Risk Category IV. The use of data for the wind directionality factor, Kd, which has become available from recent wind tunnel tests, revealed that reliabilities associated with wind load combinations for Risk Category II structures are, in fact, consistent with the reliabilities associated with the ASCE 7 gravity load combinations. This paper shows that the new wind maps in ASCE 7-16, which are based on return periods of 300, 700, 1,700, and 3,000 years for Risk Categories I, II, III, and IV, respectively, achieve the reliability targets in Section 1.3.1.3 of ASCE 7-16 for nonhurricane wind loads.
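    As a back-of-the-envelope check on what these return periods imply: a T-year return period corresponds to an annual exceedance probability of 1/T, and a probability of 1 - (1 - 1/T)^n of at least one exceedance in an n-year period. The 50-year reference period below is a common convention, not a value taken from the paper.

```python
# Sketch: annual exceedance probability 1/T for a T-year return period,
# and the chance of at least one exceedance over an n-year service life.

def exceedance_prob(T, n=50):
    """Probability the T-year wind speed is exceeded within n years."""
    return 1 - (1 - 1 / T) ** n

for category, T in [("I", 300), ("II", 700), ("III", 1700), ("IV", 3000)]:
    print(f"Risk Category {category}: T = {T} yr, "
          f"P(exceedance in 50 yr) = {exceedance_prob(T):.3f}")
```

    Lengthening the return period from 700 to 3,000 years cuts the 50-year exceedance probability by roughly a factor of four, which is how the maps tighten reliability for essential facilities without changing the load-combination format.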

  20. Search by photo methodology for signature properties assessment by human observers

    NASA Astrophysics Data System (ADS)

    Selj, Gorm K.; Heinrich, Daniela H.

    2015-05-01

    Reliable, low-cost, and simple methods for assessment of signature properties for military purposes are very important. In this paper we present such an approach, which uses human observers in a search-by-photo assessment of the signature properties of generic test targets. The method was carried out by logging a large number of detection times of targets recorded against relevant terrain backgrounds. The detection times were harvested by having human observers search for targets in scene images displayed on a high-definition PC screen. All targets were identically located in each "search image", allowing relative comparisons (and not just rank ordering) of targets. To avoid biased detections, each observer searched for only one target per scene. Statistical analyses were carried out on the detection-time data: analysis of variance was chosen if the detection-time distributions associated with all targets satisfied normality, and non-parametric tests, such as Wilcoxon's rank test, otherwise. The new methodology allows assessment of signature properties in a reproducible, rapid, and reliable setting. Such assessments are very complex, as they must sort out what is of relevance in a signature test without losing information of value. We believe that choosing detection time as the primary variable for a comparison of signature properties allows a careful and necessary inspection of observer data, as the variable is continuous rather than discrete. Our method thus stands in opposition to approaches based on detections made during stepwise reductions in distance to target, or on probability of detection.

  1. INFLUENCES OF RESPONSE RATE AND DISTRIBUTION ON THE CALCULATION OF INTEROBSERVER RELIABILITY SCORES

    PubMed Central

    Rolider, Natalie U.; Iwata, Brian A.; Bullock, Christopher E.

    2012-01-01

    We examined the effects of several variations in response rate on the calculation of total, interval, exact-agreement, and proportional reliability indices. Trained observers recorded computer-generated data that appeared on a computer screen. In Study 1, target responses occurred at low, moderate, and high rates during separate sessions so that reliability results based on the four calculations could be compared across a range of values. Total reliability was uniformly high, interval reliability was spuriously high for high-rate responding, proportional reliability was somewhat lower for high-rate responding, and exact-agreement reliability was the lowest of the measures, especially for high-rate responding. In Study 2, we examined the separate effects of response rate per se, bursting, and end-of-interval responding. Response rate and bursting had little effect on reliability scores; however, the distribution of some responses at the end of intervals decreased interval reliability somewhat, proportional reliability noticeably, and exact-agreement reliability markedly. PMID:23322930
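    The four indices compared in Study 1 can be sketched from two observers' per-interval response counts as follows. The formulas are the standard textbook forms; the study's exact computational details may differ, and the count data below are invented.

```python
# Hedged sketch of four interobserver reliability indices computed from two
# observers' response counts per observation interval (toy data).

def total_reliability(a, b):
    """Smaller session total divided by larger session total."""
    return min(sum(a), sum(b)) / max(sum(a), sum(b))

def interval_reliability(a, b):
    """Proportion of intervals where both scored occurrence (or not) alike."""
    agree = sum((x > 0) == (y > 0) for x, y in zip(a, b))
    return agree / len(a)

def exact_agreement(a, b):
    """Proportion of intervals with identical counts."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def proportional_reliability(a, b):
    """Mean per-interval ratio of smaller to larger count (1 if both zero)."""
    ratios = [1.0 if max(x, y) == 0 else min(x, y) / max(x, y)
              for x, y in zip(a, b)]
    return sum(ratios) / len(ratios)

obs1 = [3, 0, 2, 5, 1, 0]
obs2 = [2, 0, 2, 6, 0, 0]
print(round(total_reliability(obs1, obs2), 2))         # 0.91
print(round(proportional_reliability(obs1, obs2), 2))  # 0.75
print(round(exact_agreement(obs1, obs2), 2))           # 0.5
```

    Even this toy example shows the ordering the study reports: total reliability is the most lenient and exact agreement the most stringent, with proportional reliability in between.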

  2. Solar thermal technology evaluation, fiscal year 1982. Volume 2: Technical

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The technology base of solar thermal energy is investigated. The materials, components, subsystems, and processes capable of meeting specific energy cost targets are emphasized, as are system efficiency and reliability.

  3. Comparison of three microarray probe annotation pipelines: differences in strategies and their effect on downstream analysis

    PubMed Central

    Neerincx, Pieter BT; Casel, Pierrot; Prickett, Dennis; Nie, Haisheng; Watson, Michael; Leunissen, Jack AM; Groenen, Martien AM; Klopp, Christophe

    2009-01-01

    Background Reliable annotation linking oligonucleotide probes to target genes is essential for functional biological analysis of microarray experiments. We used the IMAD, OligoRAP and sigReannot pipelines to update the annotation for the ARK-Genomics Chicken 20 K array as part of a joint EADGENE/SABRE workshop. In this manuscript we compare their annotation strategies and results. Furthermore, we analyse the effect of differences in updated annotation on functional analysis for an experiment involving Eimeria infected chickens and finally we propose guidelines for optimal annotation strategies. Results IMAD, OligoRAP and sigReannot update both annotation and estimated target specificity. The 3 pipelines can assign oligos to target specificity categories although with varying degrees of resolution. Target specificity is judged based on the amount and type of oligo versus target-gene alignments (hits), which are determined by filter thresholds that users can adjust based on their experimental conditions. Linking oligos to annotation on the other hand is based on rigid rules, which differ between pipelines. For 52.7% of the oligos from a subset selected for in-depth comparison all pipelines linked to one or more Ensembl genes with consensus on 44.0%. In 31.0% of the cases none of the pipelines could assign an Ensembl gene to an oligo and for the remaining 16.3% the coverage differed between pipelines. Differences in updated annotation were mainly due to different thresholds for hybridisation potential filtering of oligo versus target-gene alignments and different policies for expanding annotation using indirect links. The differences in updated annotation packages had a significant effect on GO term enrichment analysis with consensus on only 67.2% of the enriched terms. 
Conclusion In addition to flexible thresholds to determine target specificity, annotation tools should provide metadata describing the relationships between oligos and the annotation assigned to them. These relationships can then be used to judge the varying degrees of reliability allowing users to fine-tune the balance between reliability and coverage. This is important as it can have a significant effect on functional microarray analysis as exemplified by the lack of consensus on almost one third of the terms found with GO term enrichment analysis based on updated IMAD, OligoRAP or sigReannot annotation. PMID:19615109
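    The consensus bookkeeping reported above can be sketched as a set comparison over per-pipeline gene assignments. Pipeline names and gene IDs below are placeholders, not actual IMAD/OligoRAP/sigReannot output:

```python
# Sketch of pipeline-consensus bookkeeping: for each oligo, compare the
# Ensembl gene sets assigned by several (hypothetical) annotation pipelines
# and bin the oligo into consensus categories.

def consensus_categories(assignments):
    """assignments: {oligo_id: {pipeline_name: set_of_gene_ids}}"""
    stats = {"all_agree": 0, "all_assigned": 0, "none_assigned": 0, "partial": 0}
    for oligo, per_pipeline in assignments.items():
        gene_sets = list(per_pipeline.values())
        if all(not g for g in gene_sets):
            stats["none_assigned"] += 1       # no pipeline annotated this oligo
        elif all(g for g in gene_sets):
            stats["all_assigned"] += 1        # every pipeline annotated it
            if all(g == gene_sets[0] for g in gene_sets):
                stats["all_agree"] += 1       # ...and they agree on the genes
        else:
            stats["partial"] += 1             # coverage differs between pipelines
    return stats

example = {
    "oligo1": {"A": {"ENSG1"}, "B": {"ENSG1"}, "C": {"ENSG1"}},  # full consensus
    "oligo2": {"A": {"ENSG2"}, "B": {"ENSG3"}, "C": {"ENSG2"}},  # assigned, no consensus
    "oligo3": {"A": set(), "B": set(), "C": set()},              # unannotated everywhere
    "oligo4": {"A": {"ENSG4"}, "B": set(), "C": {"ENSG4"}},      # coverage differs
}
print(consensus_categories(example))
```

    The paper's 44.0% / 31.0% / 16.3% figures correspond to the `all_agree`, `none_assigned` and `partial` fractions of such a tally.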

  4. Comparison of three microarray probe annotation pipelines: differences in strategies and their effect on downstream analysis.

    PubMed

    Neerincx, Pieter Bt; Casel, Pierrot; Prickett, Dennis; Nie, Haisheng; Watson, Michael; Leunissen, Jack Am; Groenen, Martien Am; Klopp, Christophe

    2009-07-16

    Reliable annotation linking oligonucleotide probes to target genes is essential for functional biological analysis of microarray experiments. We used the IMAD, OligoRAP and sigReannot pipelines to update the annotation for the ARK-Genomics Chicken 20 K array as part of a joint EADGENE/SABRE workshop. In this manuscript we compare their annotation strategies and results. Furthermore, we analyse the effect of differences in updated annotation on functional analysis for an experiment involving Eimeria-infected chickens, and finally we propose guidelines for optimal annotation strategies. IMAD, OligoRAP and sigReannot update both annotation and estimated target specificity. The three pipelines can assign oligos to target specificity categories, although with varying degrees of resolution. Target specificity is judged based on the amount and type of oligo versus target-gene alignments (hits), which are determined by filter thresholds that users can adjust based on their experimental conditions. Linking oligos to annotation, on the other hand, is based on rigid rules, which differ between pipelines. For 52.7% of the oligos from a subset selected for in-depth comparison, all pipelines linked to one or more Ensembl genes, with consensus on 44.0%. In 31.0% of the cases none of the pipelines could assign an Ensembl gene to an oligo, and for the remaining 16.3% the coverage differed between pipelines. Differences in updated annotation were mainly due to different thresholds for hybridisation potential filtering of oligo versus target-gene alignments and different policies for expanding annotation using indirect links. The differences in updated annotation packages had a significant effect on GO term enrichment analysis, with consensus on only 67.2% of the enriched terms. In addition to flexible thresholds to determine target specificity, annotation tools should provide metadata describing the relationships between oligos and the annotation assigned to them.
These relationships can then be used to judge the varying degrees of reliability, allowing users to fine-tune the balance between reliability and coverage. This is important as it can have a significant effect on functional microarray analysis, as exemplified by the lack of consensus on almost one third of the terms found with GO term enrichment analysis based on updated IMAD, OligoRAP or sigReannot annotation.

  5. Enhanced Reliability and Accuracy for Field Deployable Bioforensic Detection and Discrimination of Xylella fastidiosa subsp. pauca, Causal Agent of Citrus Variegated Chlorosis Using Razor Ex Technology and TaqMan Quantitative PCR

    PubMed Central

    Ouyang, Ping; Arif, Mohammad; Fletcher, Jacqueline; Melcher, Ulrich; Ochoa Corona, Francisco Manuel

    2013-01-01

    A reliable, accurate and rapid multigene-based assay combining real-time quantitative PCR (qPCR) and a Razor Ex BioDetection System (Razor Ex) was validated for detection of Xylella fastidiosa subsp. pauca (Xfp, a xylem-limited bacterium that causes citrus variegated chlorosis [CVC]). CVC, which is exotic to the United States, has spread through South and Central America and could significantly impact U.S. citrus if it arrives. A method for early, accurate and sensitive detection of Xfp in plant tissues is needed by plant health officials for inspection of products from quarantined locations, and by extension specialists for detection, identification and management of disease outbreaks and reservoir hosts. Two sets of specific PCR primers and probes, targeting the Xfp genes for fimbrillin and the periplasmic iron-binding protein, were designed. A third pair of primers targeting the conserved cobalamin synthesis protein gene was designed to detect all possible X. fastidiosa (Xf) strains. All three primer sets detected as little as 1 fg of plasmid DNA carrying X. fastidiosa target sequences, and genomic DNA of Xfp at as little as 1-10 fg. The use of Razor Ex facilitates a rapid (about 30 min) in-field assay capability for detection of all Xf strains, and for specific detection of Xfp. Combined use of three primer sets targeting different genes increased the assay accuracy and broadened the range of detection. To our knowledge, this is the first report of a field-deployable rapid and reliable bioforensic detection and discrimination method for a bacterial phytopathogen based on multigene targets. PMID:24312333

  6. Enhanced reliability and accuracy for field deployable bioforensic detection and discrimination of Xylella fastidiosa subsp. pauca, causal agent of citrus variegated chlorosis using razor ex technology and TaqMan quantitative PCR.

    PubMed

    Ouyang, Ping; Arif, Mohammad; Fletcher, Jacqueline; Melcher, Ulrich; Ochoa Corona, Francisco Manuel

    2013-01-01

    A reliable, accurate and rapid multigene-based assay combining real-time quantitative PCR (qPCR) and a Razor Ex BioDetection System (Razor Ex) was validated for detection of Xylella fastidiosa subsp. pauca (Xfp, a xylem-limited bacterium that causes citrus variegated chlorosis [CVC]). CVC, which is exotic to the United States, has spread through South and Central America and could significantly impact U.S. citrus if it arrives. A method for early, accurate and sensitive detection of Xfp in plant tissues is needed by plant health officials for inspection of products from quarantined locations, and by extension specialists for detection, identification and management of disease outbreaks and reservoir hosts. Two sets of specific PCR primers and probes, targeting the Xfp genes for fimbrillin and the periplasmic iron-binding protein, were designed. A third pair of primers targeting the conserved cobalamin synthesis protein gene was designed to detect all possible X. fastidiosa (Xf) strains. All three primer sets detected as little as 1 fg of plasmid DNA carrying X. fastidiosa target sequences, and genomic DNA of Xfp at as little as 1-10 fg. The use of Razor Ex facilitates a rapid (about 30 min) in-field assay capability for detection of all Xf strains, and for specific detection of Xfp. Combined use of three primer sets targeting different genes increased the assay accuracy and broadened the range of detection. To our knowledge, this is the first report of a field-deployable rapid and reliable bioforensic detection and discrimination method for a bacterial phytopathogen based on multigene targets.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamarina, A.; Bernard, D.; Dos Santos, N.

    This paper describes the method to define relevant targeted integral measurements that allow the improvement of nuclear data evaluations and the determination of corresponding reliable covariances. ²³⁵U and ⁵⁶Fe examples are pointed out for the improvement of JEFF3 data. Utilizations of these covariances are shown for sensitivity and representativity studies, uncertainty calculations, and transposition of experimental results to industrial applications. S/U studies are more and more used in reactor physics and safety-criticality. However, the reliability of study results relies strongly on the relevancy of the nuclear data (ND) covariances. Our method derives the real uncertainty associated with each evaluation from calibration on targeted integral measurements. These realistic covariance matrices allow reliable JEFF3.1.1 calculation of prior uncertainty due to nuclear data, as well as uncertainty reduction based on representative integral experiments, in challenging design calculations such as GEN3 and RJH reactors.
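    The uncertainty calculations mentioned above conventionally propagate nuclear-data covariances through the first-order "sandwich" rule: the relative variance of an integral response R with sensitivity vector S to data with relative covariance matrix M is SᵀMS. A toy illustration with made-up numbers:

```python
# "Sandwich" uncertainty propagation: var(R)/R^2 = S^T M S, where S holds
# relative sensitivities dR/dsigma and M is the relative covariance matrix
# of the nuclear data. All values here are illustrative.
import numpy as np

S = np.array([0.8, -0.3])                     # relative sensitivities of R
M = np.array([[4.0e-4, 1.0e-4],
              [1.0e-4, 9.0e-4]])              # relative covariance of the data
var = S @ M @ S                               # relative variance of R
print(np.sqrt(var))                           # relative 1-sigma uncertainty of R
```

    Reducing the off-diagonal and diagonal terms of M with representative integral experiments is exactly what shrinks this prior uncertainty.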

  8. Regional reliability of quantitative signal targeting with alternating radiofrequency (STAR) labeling of arterial regions (QUASAR).

    PubMed

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times each, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially in deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability in different feeding arteries. Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. © 2014 The Authors. Journal of Neuroimaging published by the American Society of Neuroimaging.
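    As a rough sketch of the voxel-wise reliability maps described above (not the authors' code), one can compute a coefficient-of-variation map across repeated CBF scans of the same subject; the arrays here are synthetic:

```python
# Voxel-wise coefficient of variation across repeated perfusion scans:
# low CoV means the measurement is reproducible at that voxel.
import numpy as np

def cov_map(scans):
    """scans: array (n_scans, *voxel_dims) of CBF values; returns CoV per voxel."""
    scans = np.asarray(scans, dtype=float)
    mean = scans.mean(axis=0)
    std = scans.std(axis=0, ddof=1)            # sample SD across repeats
    return np.divide(std, mean, out=np.zeros_like(mean), where=mean != 0)

rng = np.random.default_rng(0)
base = np.full((4, 4), 40.0)                   # nominal gray-matter CBF "image"
scans = base + rng.normal(0, 2, (4, 4, 4))     # four repeated noisy scans
print(cov_map(scans).mean())                   # low mean CoV -> high reliability
```

    The same repeated-scan data also feed the voxel-wise ICC maps; CoV captures within-subject spread, while ICC additionally relates it to between-subject variance.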

  9. Regional Reliability of Quantitative Signal Targeting with Alternating Radiofrequency (STAR) Labeling of Arterial Regions (QUASAR)

    PubMed Central

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    BACKGROUND AND PURPOSE Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. METHODS Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times each, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. RESULTS The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially in deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability in different feeding arteries. CONCLUSIONS Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. PMID:25370338

  10. Co-detection: ultra-reliable nanoparticle-based electrical detection of biomolecules in the presence of large background interference.

    PubMed

    Liu, Yang; Gu, Ming; Alocilja, Evangelyn C; Chakrabartty, Shantanu

    2010-11-15

    An ultra-reliable technique for detecting trace quantities of biomolecules is reported. The technique, called "co-detection," exploits the non-linear redundancy amongst synthetically patterned biomolecular logic circuits for deciphering the presence or absence of target biomolecules in a sample. In this paper, we verify the "co-detection" principle on gold-nanoparticle-based conductimetric soft-logic circuits which use a silver-enhancement technique for signal amplification. Using co-detection, we have been able to demonstrate a great improvement in the reliability of detecting mouse IgG at concentration levels that are 10⁵-fold lower than the concentration of rabbit IgG, which serves as background interference. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. A Bioinformatic Pipeline for Monitoring of the Mutational Stability of Viral Drug Targets with Deep-Sequencing Technology.

    PubMed

    Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai

    2017-11-23

    The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing is able to provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short, deep sequencing reads corresponding to selected highly variable viral sequences that are drug target(s). The suggested bioinformatic pipeline combines available programs with ad hoc scripts based on an original algorithm for finding conserved targets in deep sequencing data. We also present the statistical criteria for the threshold of reliable mutation detection and for the assessment of variations between corresponding data sets. These criteria are robust against possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available to download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
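    The conserved-target search such a pipeline performs can be sketched as per-position mutation tallies with a threshold set above the expected sequencing-error rate. This is an illustrative reimplementation, not the VirMut code:

```python
# Tally per-position mutation frequencies in reads covering a target site
# and flag positions whose mutant fraction exceeds a threshold chosen well
# above the expected per-base sequencing-error rate.
from collections import Counter

def variable_positions(reference, reads, error_rate=0.01, min_factor=5):
    """reference: target sequence; reads: aligned, same-length read strings."""
    threshold = min_factor * error_rate      # e.g. 5x the per-base error rate
    flagged = []
    for i, ref_base in enumerate(reference):
        counts = Counter(read[i] for read in reads)
        mutant = sum(n for base, n in counts.items() if base != ref_base)
        if mutant / len(reads) > threshold:
            flagged.append(i)                # position too variable for an siRNA
    return flagged

ref = "ACGTACGT"
reads = ["ACGTACGT"] * 90 + ["ACGAACGT"] * 10   # 10% mutant at position 3
print(variable_positions(ref, reads))            # -> [3]
```

    A target site with no flagged positions would count as conserved under these (assumed) thresholds.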

  12. GalaxyTBM: template-based modeling by building a reliable core and refining unreliable local regions.

    PubMed

    Ko, Junsu; Park, Hahnbeom; Seok, Chaok

    2012-08-10

    Protein structures can be reliably predicted by template-based modeling (TBM) when experimental structures of homologous proteins are available. However, it is challenging to obtain structures more accurate than the single best templates by either combining information from multiple templates or by modeling regions that vary among templates or are not covered by any templates. We introduce GalaxyTBM, a new TBM method in which the more reliable core region is modeled first from multiple templates and less reliable, variable local regions, such as loops or termini, are then detected and re-modeled by an ab initio method. This TBM method is based on "Seok-server," which was tested in CASP9 and assessed to be amongst the top TBM servers. The accuracy of the initial core modeling is enhanced by focusing on more conserved regions in the multiple-template selection and multiple sequence alignment stages. Additional improvement is achieved by ab initio modeling of up to 3 unreliable local regions in the fixed framework of the core structure. Overall, GalaxyTBM reproduced the performance of Seok-server, with GalaxyTBM and Seok-server resulting in average GDT-TS of 68.1 and 68.4, respectively, when tested on 68 single-domain CASP9 TBM targets. For application to multi-domain proteins, GalaxyTBM must be combined with domain-splitting methods. Application of GalaxyTBM to CASP9 targets demonstrates that accurate protein structure prediction is possible by use of a multiple-template-based approach, and ab initio modeling of variable regions can further enhance the model quality.
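    The GDT-TS figures quoted above average, over four distance cutoffs, the fraction of residues whose model positions fall within the cutoff of the native structure after optimal superposition. A simplified sketch that assumes the superposition has already been done (real GDT-TS searches over superpositions):

```python
# Simplified GDT-TS: mean over the 1, 2, 4, 8 Angstrom cutoffs of the
# fraction of residues within that distance of the native structure.

def gdt_ts(distances):
    """distances: per-residue model-to-native distances (Angstroms),
    assumed to come from an already-optimal superposition."""
    n = len(distances)
    fractions = [sum(d <= cut for d in distances) / n for cut in (1, 2, 4, 8)]
    return 100 * sum(fractions) / 4

print(gdt_ts([0.5, 1.5, 3.0, 9.0]))  # -> 56.25
```

    On this 0-100 scale, the averages of 68.1 and 68.4 reported for GalaxyTBM and Seok-server indicate closely matched model quality.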

  13. Creating High Reliability in Health Care Organizations

    PubMed Central

    Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth

    2006-01-01

    Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981

  14. An N-targeting real-time PCR strategy for the accurate detection of spring viremia of carp virus.

    PubMed

    Shao, Ling; Xiao, Yu; He, Zhengkan; Gao, Longying

    2016-03-01

    Spring viremia of carp virus (SVCV) is a highly pathogenic agent of several economically important Cyprinidae fish species. Currently, there are no effective vaccines or drugs for this virus, and prevention of the disease mostly relies on prompt diagnosis. Previously, nested RT-PCR and RT-qPCR detection methods based on the glycoprotein gene G have been developed. However, the high genetic diversity of the G gene seriously limits the reliability of those methods. Compared with the G gene, phylogenetic analyses indicate that the nucleoprotein gene N is more conserved. Furthermore, studies in other members of the family Rhabdoviridae reveal that their gene transcription levels follow the order N>P>M>G>L, indicating that an N-gene-based RT-PCR should have higher sensitivity. Therefore, two pairs of primers and two corresponding probes targeting the conserved regions of the N gene were designed. RT-qPCR assays demonstrated that all primers and probes could detect phylogenetically distant isolates specifically and efficiently. Moreover, in artificially infected fish, the detected copy numbers of the N gene were much higher than those of the G gene in all tissues, and both the N and G gene copy numbers were highest in the kidney and spleen. Testing in 1100 farm-raised fish also showed that the N-targeting strategy was more reliable than the G-targeting methods. The method developed in this study provides a reliable tool for the rapid diagnosis of SVCV. Copyright © 2015 Elsevier B.V. All rights reserved.
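    The copy-number comparisons above rely on the standard qPCR standard-curve conversion from quantification cycle (Cq) to template copies; the slope and intercept below are illustrative values, not coefficients from this study:

```python
# Absolute quantification from a qPCR standard curve:
# copies = 10 ** ((Cq - intercept) / slope).

def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    """A slope near -3.32 corresponds to ~100% amplification efficiency."""
    return 10 ** ((cq - intercept) / slope)

print(round(copies_from_cq(38.0)))   # -> 1 copy at the curve intercept
print(round(copies_from_cq(31.36)))  # -> 100: ~3.32 cycles earlier per 10x template
```

    Earlier Cq values for the N-gene assay than for the G-gene assay in the same tissue translate, through this curve, into the higher N transcript copy numbers reported.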

  15. SU-E-J-24: Image-Guidance Using Cone-Beam CT for Stereotactic Body Radiotherapy (SBRT) of Lung Cancer Patients: Bony Alignment or Soft Tissue Alignment?

    PubMed

    Wang, L; Turaka, A; Meyer, J; Spoka, D; Jin, L; Fan, J; Ma, C

    2012-06-01

    To assess the reliability of soft tissue alignment by comparing pre- and post-treatment cone-beam CT (CBCT) for image guidance in stereotactic body radiotherapy (SBRT) of lung cancers. Our lung SBRT procedures require that all patients undergo a 4D CT scan in order to obtain patient-specific target motion information through reconstructed 4D data using the maximum-intensity projection (MIP) algorithm. The internal target volume (ITV) was outlined directly from the MIP images and a 3-5 mm margin expansion was then applied to the ITV to create the PTV. Conformal treatment planning was performed on the helical images, to which the MIP images were fused. Prior to each treatment, CBCT was used for image guidance by comparison with the simulation CT and for patient relocalization based on the bony anatomy. Any displacement of the patient bony structure would be considered a setup error and would be corrected by couch shifts. Theoretically, as the PTV definition included target internal motion, no further shifts other than setup corrections should be made. However, it is our practice to have treating physicians further check target localization within the PTV. Whenever the shifts based on the soft-tissue alignment (that is, target alignment) exceeded a certain value (e.g. 5 mm), a post-treatment CBCT was carried out to ensure that the tissue alignment is reliable by comparing pre- and post-treatment CBCT. Pre- and post-CBCT have been performed for 7 patients so far who had shifts beyond 5 mm despite bony alignment. For all patients, post CBCT confirmed that the visualized target position was kept in the same position as before treatment after adjusting for soft-tissue alignment. For the patient population studied, it is shown that soft-tissue alignment is necessary and reliable in lung SBRT for individual cases. © 2012 American Association of Physicists in Medicine.

  16. THE DYNAMIC LEAP AND BALANCE TEST (DLBT): A TEST-RETEST RELIABILITY STUDY

    PubMed Central

    Newman, Thomas M.; Smith, Brent I.; John Miller, Sayers

    2017-01-01

    Background There is a need for new clinical assessment tools to test dynamic balance during typical functional movements. Common methods for assessing dynamic balance, such as the Star Excursion Balance Test, which requires controlled movement of body segments over an unchanged base of support, may not be an adequate measure for testing typical functional movements that involve controlled movement of body segments along with a change in base of support. Purpose/hypothesis The purpose of this study was to determine the reliability of the Dynamic Leap and Balance Test (DLBT) by assessing its test-retest reliability. It was hypothesized that there would be no statistically significant differences between testing days in time taken to complete the test. Study Design Reliability study Methods Thirty healthy college-aged individuals participated in this study. Participants performed a series of leaps in a prescribed sequence, unique to the DLBT test. Time required by the participants to complete the 20-leap task was the dependent variable. Subjects leaped back and forth from peripheral to central targets, alternating weight bearing from one leg to the other. Participants landed on the central target with the tested limb and were required to stabilize for two seconds before leaping to the next target. Stability was based upon qualitative measures similar to the Balance Error Scoring System. Each assessment comprised three trials and was performed on two days separated by at least six days. Results Two-way mixed ANOVA was used to analyze the differences in time to complete the sequence between the three-trial averages of the two testing sessions. The Intraclass Correlation Coefficient (ICC3,1) was used to establish between-session test-retest reliability of the test trial averages. Significance was set a priori at p ≤ 0.05. No significant differences (p > 0.05) were detected between the two testing sessions.
The ICC was 0.93 with a 95% confidence interval from 0.84 to 0.96. Conclusion This test is a cost-effective, easy to administer and clinically relevant novel measure for assessing dynamic balance that has excellent test-retest reliability. Clinical relevance As a new measure of dynamic balance, the DLBT has the potential to be a cost-effective, challenging and functional tool for clinicians. Level of Evidence 2b PMID:28900556
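    The between-session ICC(3,1) quoted above comes from a two-way mixed ANOVA. A minimal sketch following the standard Shrout-Fleiss formulation, with synthetic scores:

```python
# ICC(3,1), two-way mixed, consistency: (BMS - EMS) / (BMS + (k-1)*EMS),
# where BMS is the between-subjects mean square and EMS the residual mean
# square from a subjects x sessions ANOVA.
import numpy as np

def icc_3_1(data):
    """data: array (n_subjects, k_sessions) of scores."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)                                  # per subject
    col_means = data.mean(axis=0)                                  # per session
    bms = k * ((row_means - grand) ** 2).sum() / (n - 1)           # between-subjects MS
    resid = data - row_means[:, None] - col_means[None, :] + grand
    ems = (resid ** 2).sum() / ((n - 1) * (k - 1))                 # residual (error) MS
    return (bms - ems) / (bms + (k - 1) * ems)

# four subjects, two sessions; times are consistent across sessions
scores = np.array([[10.1, 10.3], [12.0, 11.8], [9.5, 9.6], [13.2, 13.0]])
print(round(icc_3_1(scores), 3))   # close to 1 for consistent sessions
```

    An ICC of 0.93, as reported for the DLBT, falls in the range conventionally described as excellent test-retest reliability.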

  17. Robust Targeting for the Smartphone Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Carter, Christopher

    2017-01-01

    The Smartphone Video Guidance Sensor (SVGS) is a miniature, self-contained autonomous rendezvous and docking sensor developed using a commercial off-the-shelf Android-based smartphone. It aims to provide a miniaturized solution for rendezvous and docking, enabling small satellites to conduct proximity operations and formation flying while minimizing interference with a primary payload. Previously, the sensor was limited by a slow (2 Hz) refresh rate and its use of retro-reflectors, both of which contributed to a limited operating environment. To advance the technology readiness level, a modified approach was developed, combining a multi-colored LED target with a focused target-detection algorithm. Alone, the use of an LED system was determined to be much more reliable, though slower, than the retro-reflector system. The focused target-detection system was developed in response to this problem to mitigate the speed reduction of using color. However, it also improved the reliability. In combination these two methods have been demonstrated to dramatically increase sensor speed and allow the sensor to select the target even with significant noise interfering with the sensor, providing millimeter level accuracy at a range of two meters with a 1U target.

  18. Robust Targeting for the Smartphone Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Carter, C.

    2017-01-01

    The Smartphone Video Guidance Sensor (SVGS) is a miniature, self-contained autonomous rendezvous and docking sensor developed using a commercial off-the-shelf Android-based smartphone. It aims to provide a miniaturized solution for rendezvous and docking, enabling small satellites to conduct proximity operations and formation flying while minimizing interference with a primary payload. Previously, the sensor was limited by a slow (2 Hz) refresh rate and its use of retro-reflectors, both of which contributed to a limited operating environment. To advance the technology readiness level, a modified approach was developed, combining a multi-colored LED target with a focused target-detection algorithm. Alone, the use of an LED system was determined to be much more reliable, though slower, than the retro-reflector system. The focused target-detection system was developed in response to this problem to mitigate the speed reduction of using color. However, it also improved the reliability. In combination these two methods have been demonstrated to dramatically increase sensor speed and allow the sensor to select the target even with significant noise interfering with the sensor, providing millimeter level precision at a range of two meters with a 1U target.
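    A hedged sketch of the color-keyed target detection idea described above (not the SVGS implementation): mask pixels near each expected LED color, take per-color centroids, and declare a lock only when every expected color is present, which is what lets a sensor of this kind reject monochromatic noise:

```python
# Color-keyed target detection: a target is accepted only when all of its
# expected LED colors are found in the frame, each within a tolerance.

def detect_target(pixels, expected_colors, tol=30):
    """pixels: {(x, y): (r, g, b)}; returns {color_name: centroid} or None."""
    centroids = {}
    for name, (er, eg, eb) in expected_colors.items():
        hits = [(x, y) for (x, y), (r, g, b) in pixels.items()
                if abs(r - er) <= tol and abs(g - eg) <= tol and abs(b - eb) <= tol]
        if not hits:
            return None                       # a missing LED color -> no lock
        centroids[name] = (sum(x for x, _ in hits) / len(hits),
                           sum(y for _, y in hits) / len(hits))
    return centroids

leds = {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255)}
frame = {(10, 10): (250, 5, 5), (40, 10): (5, 250, 5), (25, 40): (5, 5, 250),
         (0, 0): (200, 200, 200)}             # white glare should be ignored
print(detect_target(frame, leds))
```

    The relative geometry of the per-color centroids is then what a pose-estimation stage would use to solve for range and attitude.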

  19. Reliability based design of the primary structure of oil tankers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, G.; Dogliani, M.; Guedes Soares, C.

    1996-12-31

    The present paper describes the reliability analysis carried out for two oil tankers having comparable dimensions but different designs. The scope of the analysis was to derive indications on the value of the reliability index obtained for existing, typical and well designed oil tankers, as well as to apply the tentative rule checking formulation developed within the CEC-funded SHIPREL Project. The checking formula was adopted to redesign the midships section of one of the considered ships, upgrading her in order to meet the target failure probability considered in the rule development process. The resulting structure, in view of an upgrading of the steel grade in the central part of the deck, led to a convenient reliability level. The results of the analysis clearly showed that a large scatter exists presently in the design safety levels of ships, even when the Classification Societies' unified requirements are satisfied. A reliability based approach for the calibration of the rules for the global strength of ships is therefore proposed, in order to assist designers and Classification Societies in the process of producing ships which are more optimized with respect to ensured safety levels. Based on the work reported in the paper, the feasibility and usefulness of a reliability based approach in the development of ship longitudinal strength requirements has been demonstrated.
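    Reliability indices and target failure probabilities of the kind discussed above can be illustrated in the simplest first-order case of a linear limit state g = R - S with independent normal resistance R and load S. This is a textbook sketch, not the SHIPREL formulation:

```python
# First-order reliability for g = R - S with independent normal R and S:
# beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), and the failure
# probability is Phi(-beta). All numbers below are illustrative.
import math

def beta_index(mu_r, sigma_r, mu_s, sigma_s):
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    return 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta)

beta = beta_index(mu_r=100.0, sigma_r=10.0, mu_s=60.0, sigma_s=12.0)
print(round(beta, 2), failure_probability(beta))  # index and Pf
```

    Rule calibration of the sort proposed in the paper amounts to adjusting design requirements until the computed beta meets a chosen target value across the fleet.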

  20. A Target-Less Vision-Based Displacement Sensor Based on Image Convex Hull Optimization for Measuring the Dynamic Response of Building Structures.

    PubMed

    Choi, Insub; Kim, JunHee; Kim, Donghyun

    2016-12-08

    Existing vision-based displacement sensors (VDSs) extract displacement data through changes in the movement of a target that is identified within the image using natural or artificial structure markers. A target-less vision-based displacement sensor (hereafter called "TVDS") is proposed. It can extract displacement data without targets, which then serve as feature points in the image of the structure. The TVDS can extract and track the feature points without the target in the image through image convex hull optimization, which adjusts the threshold values so that every image frame yields the same convex hull, whose center serves as the feature point. In addition, the pixel coordinates of the feature point can be converted to physical coordinates through a scaling factor map calculated based on the distance, angle, and focal length between the camera and target. The accuracy of the proposed scaling factor map was verified through an experiment in which the diameter of a circular marker was estimated. A white-noise excitation test was conducted, and the reliability of the displacement data obtained from the TVDS was analyzed by comparing the displacement data of the structure measured with a laser displacement sensor (LDS). The dynamic characteristics of the structure, such as the mode shape and natural frequency, were extracted using the obtained displacement data, and were compared with the numerical analysis results. The TVDS yielded highly reliable displacement data and highly accurate dynamic characteristics, such as the natural frequency and mode shape of the structure. As the proposed TVDS can easily extract the displacement data even without artificial or natural markers, it has the advantage of extracting displacement data from any portion of the structure in the image.
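    A hedged sketch of the TVDS idea described above: threshold the image, take the convex hull of the foreground pixels, use the hull centroid as the feature point, and convert pixel motion to physical units with a scaling factor. The monotone-chain hull and the single scaling constant below are illustrative simplifications of the paper's optimization and scaling factor map:

```python
# Track a thresholded blob by the centroid of its convex hull, then convert
# the pixel-space shift to millimeters with an assumed scaling factor.

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices of (x, y) points."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def feature_point(image, threshold):
    """image: 2D list of intensities; returns the hull centroid (row, col)."""
    fg = [(r, c) for r, row in enumerate(image)
          for c, v in enumerate(row) if v > threshold]
    hull = convex_hull(fg)
    return (sum(p[0] for p in hull) / len(hull),
            sum(p[1] for p in hull) / len(hull))

def blank(n=50):
    return [[0.0] * n for _ in range(n)]

img_a, img_b = blank(), blank()
for r in range(10, 20):
    for c in range(10, 20):
        img_a[r][c] = 1.0
        img_b[r + 3][c] = 1.0                  # same blob shifted 3 pixels down
mm_per_pixel = 0.8                             # assumed scaling factor
pa, pb = feature_point(img_a, 0.5), feature_point(img_b, 0.5)
print(((pb[0] - pa[0]) * mm_per_pixel, (pb[1] - pa[1]) * mm_per_pixel))
```

    In the paper the single `mm_per_pixel` constant is replaced by a per-pixel map derived from camera distance, angle, and focal length.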

  1. NEPP DDR Device Reliability FY13 Report

    NASA Technical Reports Server (NTRS)

    Guertin, Steven M.; Armbar, Mehran

    2014-01-01

    This document reports the status of the NEPP Double Data Rate (DDR) Device Reliability effort for FY2013. The task targeted general reliability of > 100 DDR2 devices from Hynix, Samsung, and Micron. Detailed characterization of some devices when stressed by several data storage patterns was studied, targeting ability of the data cells to store the different data patterns without refresh, highlighting the weakest bits. Keywords: DDR2, reliability, data retention, temperature stress, test system evaluation, general reliability, IDD measurements, electronic parts, parts testing, microcircuits
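    The pattern-retention characterization described above can be caricatured in software. This is an illustrative simulation, not the NEPP test harness: write a pattern to every cell, model an unrefreshed interval in which stored ones may leak away, then read back and record failing cells per pattern:

```python
# Simulated data-pattern retention test for a DRAM-like memory: each set
# bit may decay to 0 during an unrefreshed interval; cells whose read-back
# no longer matches the written pattern are the "weak" cells.
import random

PATTERNS = {"all_ones": 0xFF, "all_zeros": 0x00, "checkerboard": 0x55}

def retention_test(n_cells, pattern, decay_prob, rng):
    """Return indices of cells whose read-back differs from the pattern."""
    failures = []
    for cell in range(n_cells):
        read = pattern
        for bit in range(8):
            if pattern >> bit & 1 and rng.random() < decay_prob:
                read &= ~(1 << bit)           # a stored 1 leaked away
        if read != pattern:
            failures.append(cell)
    return failures

rng = random.Random(42)
for name, pattern in PATTERNS.items():
    weak = retention_test(1024, pattern, decay_prob=0.001, rng=rng)
    print(name, len(weak))                    # failure count per pattern
```

    As in the report, pattern dependence falls out naturally: an all-zeros pattern cannot fail under this decay model, while patterns with more stored ones expose more weak bits.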

  2. Fiber-based Coherent Lidar for Target Ranging, Velocimetry, and Atmospheric Wind Sensing

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin; Pierrottet, Diego

    2006-01-01

    By employing a combination of optical heterodyne and linear frequency modulation techniques and utilizing state-of-the-art fiber optic technologies, highly efficient, compact and reliable lidar suitable for operation in a space environment is being developed.

  3. New distributed radar technology based on UAV or UGV application

    NASA Astrophysics Data System (ADS)

    Molchanov, Pavlo A.; Contarino, Vincent M.

    2013-05-01

    Regular micro and nano radars cannot provide reliable tracking of low-altitude, low-profile aerial targets in urban and mountain areas because of reflections and re-reflections from buildings and terrain. They become visible and vulnerable to guided missiles if positioned on a tower or blimp. Doppler radar cannot distinguish moving cars from small low-altitude aerial targets in an urban area. A new concept of pocket-size distributed radar technology based on the application of UAVs (Unmanned Air Vehicles) and UGVs (Unmanned Ground Vehicles) is proposed for tracking low-altitude, low-profile aerial targets at short and medium distances for the protection of stadiums, camps, and military facilities in urban or mountain areas.

  4. A Camera-Based Target Detection and Positioning UAV System for Search and Rescue (SAR) Purposes

    PubMed Central

    Sun, Jingxuan; Li, Boyang; Jiang, Yifan; Wen, Chih-yung

    2016-01-01

    Wilderness search and rescue entails performing a wide range of work in complex environments and large regions. Given the concerns inherent in large regions due to limited rescue distribution, unmanned aerial vehicle (UAV)-based frameworks are a promising platform for providing aerial imaging. In recent years, technological advances in areas such as micro-technology, sensors and navigation have influenced the various applications of UAVs. In this study, an all-in-one camera-based target detection and positioning system is developed and integrated into a fully autonomous fixed-wing UAV. The system presented in this paper is capable of on-board, real-time target identification, post-target identification and location and aerial image collection for further mapping applications. Its performance is examined using several simulated search and rescue missions, and the test results demonstrate its reliability and efficiency. PMID:27792156

  5. A Camera-Based Target Detection and Positioning UAV System for Search and Rescue (SAR) Purposes.

    PubMed

    Sun, Jingxuan; Li, Boyang; Jiang, Yifan; Wen, Chih-Yung

    2016-10-25

    Wilderness search and rescue entails performing a wide range of work in complex environments and large regions. Given the concerns inherent in large regions due to limited rescue distribution, unmanned aerial vehicle (UAV)-based frameworks are a promising platform for providing aerial imaging. In recent years, technological advances in areas such as micro-technology, sensors and navigation have influenced the various applications of UAVs. In this study, an all-in-one camera-based target detection and positioning system is developed and integrated into a fully autonomous fixed-wing UAV. The system presented in this paper is capable of on-board, real-time target identification, post-target identification and location and aerial image collection for further mapping applications. Its performance is examined using several simulated search and rescue missions, and the test results demonstrate its reliability and efficiency.

  6. TBI Assessment of Readiness Using a Gait Evaluation Test (TARGET): Development of a Portable mTBI Screening Device

    DTIC Science & Technology

    2016-05-01

    Acute Concussion Evaluation (MACE), smartphone, TARGET, military, civilian, validity, reliability. What was accomplished under these goals? For this... Issues with acceleration profile saturation in the smartphone app were observed in the data (see Major Task 3 below for explanation) and were... their potential impact. The goal of this project is to provide a portable, objective assessment of balance using an Android-based smartphone app that

  7. Gaussian mixture models-based ship target recognition algorithm in remote sensing infrared images

    NASA Astrophysics Data System (ADS)

    Yao, Shoukui; Qin, Xiaojuan

    2018-02-01

    Since the resolution of remote sensing infrared images is low, the features of ship targets become unstable. The issue of how to recognize ships with fuzzy features is an open problem. In this paper, we propose a novel ship target recognition algorithm based on Gaussian mixture models (GMMs). The proposed algorithm has two main steps. In the first step, the Hu moments of the ship target images are calculated, and the GMMs are trained on the moment features of the ships. In the second step, the moment feature of each ship image is assigned to the trained GMMs for recognition. Because of the scale, rotation, and translation invariance of Hu moments and the powerful feature-space description ability of GMMs, the GMMs-based ship target recognition algorithm can recognize ships reliably. Experimental results on a large set of simulated images show that our approach is effective in distinguishing different ship types and obtains satisfactory ship recognition performance.
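    As a concrete illustration of the first step, the translation-invariant first Hu moment can be computed from a binary image in a few lines of pure Python (a sketch under assumed names; the paper uses the full set of Hu invariants plus trained GMMs, which are omitted here):

```python
def raw_moment(img, p, q):
    """Sum of x^p * y^q * intensity over a 2D list-of-lists image."""
    return sum((x ** p) * (y ** q) * v
               for y, row in enumerate(img) for x, v in enumerate(row))

def hu_phi1(img):
    """First Hu invariant phi1 = eta20 + eta02 (invariant to translation)."""
    m00 = raw_moment(img, 0, 0)
    xbar = raw_moment(img, 1, 0) / m00
    ybar = raw_moment(img, 0, 1) / m00

    def mu(p, q):  # central moment about the centroid
        return sum(((x - xbar) ** p) * ((y - ybar) ** q) * v
                   for y, row in enumerate(img) for x, v in enumerate(row))

    # normalized central moments: eta_pq = mu_pq / m00^(1 + (p+q)/2)
    return mu(2, 0) / m00 ** 2 + mu(0, 2) / m00 ** 2
```

    A 2x2 square placed at two different image positions yields the same phi1, which is why such moments remain usable when ship position in the frame varies.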

  8. A Novel Sensor Based on a Single-Pixel Microwave Radiometer for Warm Object Counting: Concept Validation and IoT Perspectives

    PubMed Central

    Alimenti, Federico; Bonafoni, Stefania; Roselli, Luca

    2017-01-01

    Controlled measurements by a low-cost single-pixel microwave radiometer operating at 12.65 GHz were carried out to assess the detection and counting capability for targets warmer than the surroundings. The adopted reference test targets were pre-warmed water and oil; and a hand, both naked and wearing a glove. The results showed the reliability of microwave radiometry for counting operations under controlled conditions, and its effectiveness at detecting even warm targets masked by unheated dielectric layers. An electromagnetic model describing the scenario sensed by the radiometer antenna is proposed, and comparison with the experimental observations shows a good agreement. The measurements prove that reliable counting is enabled by an antenna temperature increment, for each target sample added, of around 1 K. Starting from this value, an analysis of the antenna filling factor was performed to provide an instrument useful for evaluating real applicability in many practical situations. This study also allows the direct people counting problem to be addressed, providing preliminary operational indications, reference numbers and experimental validation. PMID:28613264
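    The counting principle reported above (roughly 1 K of antenna temperature per added target, governed by the antenna filling factor) can be sketched with a simple linear beam-mixing model (hypothetical function names and example temperatures; not the authors' electromagnetic model):

```python
def antenna_temperature(t_bg, t_target, fill_per_target, n_targets):
    """Linear mixing model: each target fills a fraction of the antenna beam."""
    f = fill_per_target * n_targets
    return f * t_target + (1 - f) * t_bg

def count_targets(t_measured, t_bg, delta_per_target=1.0):
    """Invert the ~1 K-per-target antenna temperature increment."""
    return round((t_measured - t_bg) / delta_per_target)
```

    With a 290 K background, 310 K targets, and a 5% per-target filling factor, each target adds 1 K, matching the increment reported in the abstract.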

  9. Methodology Series Module 9: Designing Questionnaires and Clinical Record Forms - Part II.

    PubMed

    Setia, Maninder Singh

    2017-01-01

    This article is a continuation of the previous module on designing questionnaires and clinical record form in which we have discussed some basic points about designing the questionnaire and clinical record forms. In this section, we will discuss the reliability and validity of questionnaires. The different types of validity are face validity, content validity, criterion validity, and construct validity. The different types of reliability are test-retest reliability, inter-rater reliability, and intra-rater reliability. Some of these parameters are assessed by subject area experts. However, statistical tests should be used for evaluation of other parameters. Once the questionnaire has been designed, the researcher should pilot test the questionnaire. The items in the questionnaire should be changed based on the feedback from the pilot study participants and the researcher's experience. After the basic structure of the questionnaire has been finalized, the researcher should assess the validity and reliability of the questionnaire or the scale. If an existing standard questionnaire is translated in the local language, the researcher should assess the reliability and validity of the translated questionnaire, and these values should be presented in the manuscript. The decision to use a self- or interviewer-administered, paper- or computer-based questionnaire depends on the nature of the questions, literacy levels of the target population, and resources.
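    One of the statistical checks the module refers to, internal-consistency reliability, is commonly computed as Cronbach's alpha; a minimal sketch (the function names and toy data are illustrative, and the module itself does not prescribe a specific formula):

```python
def variance(xs):
    """Sample variance (n-1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, aligned by respondent.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))
```

    Values above roughly 0.7-0.8 are conventionally read as acceptable internal consistency for a scale.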

  10. Methodology Series Module 9: Designing Questionnaires and Clinical Record Forms – Part II

    PubMed Central

    Setia, Maninder Singh

    2017-01-01

    This article is a continuation of the previous module on designing questionnaires and clinical record form in which we have discussed some basic points about designing the questionnaire and clinical record forms. In this section, we will discuss the reliability and validity of questionnaires. The different types of validity are face validity, content validity, criterion validity, and construct validity. The different types of reliability are test-retest reliability, inter-rater reliability, and intra-rater reliability. Some of these parameters are assessed by subject area experts. However, statistical tests should be used for evaluation of other parameters. Once the questionnaire has been designed, the researcher should pilot test the questionnaire. The items in the questionnaire should be changed based on the feedback from the pilot study participants and the researcher's experience. After the basic structure of the questionnaire has been finalized, the researcher should assess the validity and reliability of the questionnaire or the scale. If an existing standard questionnaire is translated in the local language, the researcher should assess the reliability and validity of the translated questionnaire, and these values should be presented in the manuscript. The decision to use a self- or interviewer-administered, paper- or computer-based questionnaire depends on the nature of the questions, literacy levels of the target population, and resources. PMID:28584367

  11. An Algorithm Based Wavelet Entropy for Shadowing Effect of Human Detection Using Ultra-Wideband Bio-Radar

    PubMed Central

    Liu, Miao; Zhang, Yang; Liang, Fulai; Qi, Fugui; Lv, Hao; Wang, Jianqi; Zhang, Yang

    2017-01-01

    Ultra-wide band (UWB) radar for short-range human target detection is widely used to find and locate survivors in some rescue missions after a disaster. The results of the application of bistatic UWB radar for detecting multi-stationary human targets have shown that human targets close to the radar antennas are very often visible, while those farther from radar antennas are detected with less reliability. In this paper, on account of the significant difference of frequency content between the echo signal of the human target and that of noise in the shadowing region, an algorithm based on wavelet entropy is proposed to detect multiple targets. Our findings indicate that the entropy value of human targets was much lower than that of noise. Compared with the method of adaptive filtering and the energy spectrum, wavelet entropy can accurately detect the person farther from the radar antennas, and it can be employed as a useful tool in detecting multiple targets by bistatic UWB radar. PMID:28973988
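    The wavelet entropy statistic at the core of the method can be sketched with a Haar decomposition in pure Python (a simplified stand-in for the authors' implementation; the wavelet choice, window length, and level count are assumptions):

```python
import math

def haar_step(sig):
    """One Haar analysis step: orthonormal pairwise averages and differences."""
    n = len(sig) // 2
    approx = [(sig[2 * i] + sig[2 * i + 1]) / math.sqrt(2) for i in range(n)]
    detail = [(sig[2 * i] - sig[2 * i + 1]) / math.sqrt(2) for i in range(n)]
    return approx, detail

def wavelet_entropy(sig, levels=3):
    """Shannon entropy of the relative wavelet energy across decomposition levels.
    Low entropy: energy concentrated at few scales (e.g. a respiration echo);
    high entropy: energy spread across scales (noise)."""
    energies, approx = [], list(sig)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        energies.append(sum(d * d for d in detail))
    energies.append(sum(a * a for a in approx))
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)
```

    A constant window concentrates all energy in the final approximation (entropy 0), while a signal with energy split across scales scores higher, mirroring the human-versus-noise separation the abstract describes.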

  12. An Algorithm Based Wavelet Entropy for Shadowing Effect of Human Detection Using Ultra-Wideband Bio-Radar.

    PubMed

    Xue, Huijun; Liu, Miao; Zhang, Yang; Liang, Fulai; Qi, Fugui; Chen, Fuming; Lv, Hao; Wang, Jianqi; Zhang, Yang

    2017-09-30

    Ultra-wide band (UWB) radar for short-range human target detection is widely used to find and locate survivors in some rescue missions after a disaster. The results of the application of bistatic UWB radar for detecting multi-stationary human targets have shown that human targets close to the radar antennas are very often visible, while those farther from radar antennas are detected with less reliability. In this paper, on account of the significant difference of frequency content between the echo signal of the human target and that of noise in the shadowing region, an algorithm based on wavelet entropy is proposed to detect multiple targets. Our findings indicate that the entropy value of human targets was much lower than that of noise. Compared with the method of adaptive filtering and the energy spectrum, wavelet entropy can accurately detect the person farther from the radar antennas, and it can be employed as a useful tool in detecting multiple targets by bistatic UWB radar.

  13. Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire

    NASA Astrophysics Data System (ADS)

    Gladyshev, Pavel; Almansoori, Afrah

    RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.

  14. Employing machine learning for reliable miRNA target identification in plants

    PubMed Central

    2011-01-01

    Background: miRNAs are ~21-nucleotide-long small noncoding RNA molecules, formed endogenously in most eukaryotes, which mainly control their target genes post-transcriptionally by interacting with and silencing them. While many tools have been developed for animal miRNA target systems, plant miRNA target identification has seen limited development. Most tools have centered on exact complementarity matching; very few consider other factors, such as multiple target sites and the role of flanking regions. Result: In the present work, a Support Vector Regression (SVR) approach has been implemented for plant miRNA target identification, utilizing position-specific dinucleotide density variation information around the target sites to yield highly reliable results. It has been named p-TAREF (plant-Target Refiner). p-TAREF was rigorously compared with other prediction tools for plants and was found to perform better in several respects. Further, p-TAREF was run over the experimentally validated miRNA targets from species such as Arabidopsis, Medicago, rice, and tomato, and detected them accurately, suggesting the broad usability of p-TAREF across plant species. Using p-TAREF, target identification was performed for the complete rice transcriptome, supported by expression and degradome-based data. miR156 was found to be an important component of the rice regulatory system, where control of genes associated with growth and transcription appeared predominant. The entire methodology has been implemented in a multi-threaded parallel architecture in Java to enable fast processing for the web-server as well as the standalone version; this also allows it to run in concurrent mode even on a simple desktop computer. The web-server version also provides a facility to gather experimental support for predictions through on-the-spot expression data analysis.
Conclusion: A machine learning multivariate feature tool has been implemented, in parallel and locally installable form, for plant miRNA target identification. Its performance was assessed and compared through comprehensive testing and benchmarking, suggesting reliable performance and broad usability for transcriptome-wide plant miRNA target identification. PMID:22206472
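    The position-specific dinucleotide density features that feed the SVR can be sketched as follows (the flank width, window layout, and function names are illustrative assumptions, not p-TAREF's exact feature encoding):

```python
DINUCS = [a + b for a in "ACGU" for b in "ACGU"]

def dinucleotide_density(seq):
    """Fraction of each of the 16 RNA dinucleotides in a sequence window."""
    counts = {d: 0 for d in DINUCS}
    for i in range(len(seq) - 1):
        pair = seq[i:i + 2]
        if pair in counts:
            counts[pair] += 1
    total = max(len(seq) - 1, 1)
    return [counts[d] / total for d in DINUCS]

def site_features(transcript, start, end, flank=17):
    """48-dim vector: densities for the upstream flank, the site, and the
    downstream flank, concatenated for a regression model."""
    up = transcript[max(start - flank, 0):start]
    site = transcript[start:end]
    down = transcript[end:end + flank]
    return (dinucleotide_density(up) + dinucleotide_density(site)
            + dinucleotide_density(down))
```

    Such vectors, one per candidate site, would be the inputs to an SVR trained on validated target and non-target examples.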

  15. Employing machine learning for reliable miRNA target identification in plants.

    PubMed

    Jha, Ashwani; Shankar, Ravi

    2011-12-29

    miRNAs are ~21-nucleotide-long small noncoding RNA molecules, formed endogenously in most eukaryotes, which mainly control their target genes post-transcriptionally by interacting with and silencing them. While many tools have been developed for animal miRNA target systems, plant miRNA target identification has seen limited development. Most tools have centered on exact complementarity matching; very few consider other factors, such as multiple target sites and the role of flanking regions. In the present work, a Support Vector Regression (SVR) approach has been implemented for plant miRNA target identification, utilizing position-specific dinucleotide density variation information around the target sites to yield highly reliable results. It has been named p-TAREF (plant-Target Refiner). p-TAREF was rigorously compared with other prediction tools for plants and was found to perform better in several respects. Further, p-TAREF was run over the experimentally validated miRNA targets from species such as Arabidopsis, Medicago, rice, and tomato, and detected them accurately, suggesting the broad usability of p-TAREF across plant species. Using p-TAREF, target identification was performed for the complete rice transcriptome, supported by expression and degradome-based data. miR156 was found to be an important component of the rice regulatory system, where control of genes associated with growth and transcription appeared predominant. The entire methodology has been implemented in a multi-threaded parallel architecture in Java to enable fast processing for the web-server as well as the standalone version; this also allows it to run in concurrent mode even on a simple desktop computer. The web-server version also provides a facility to gather experimental support for predictions through on-the-spot expression data analysis.
A machine learning multivariate feature tool has been implemented, in parallel and locally installable form, for plant miRNA target identification. Its performance was assessed and compared through comprehensive testing and benchmarking, suggesting reliable performance and broad usability for transcriptome-wide plant miRNA target identification.

  16. Vision-Based Target Finding and Inspection of a Ground Target Using a Multirotor UAV System.

    PubMed

    Hinas, Ajmal; Roberts, Jonathan M; Gonzalez, Felipe

    2017-12-17

    In this paper, a system that uses an algorithm for target detection and navigation and a multirotor Unmanned Aerial Vehicle (UAV) for finding a ground target and inspecting it closely is presented. The system can also be used for accurate and safe delivery of payloads or spot spraying applications in site-specific crop management. A downward-looking camera attached to a multirotor is used to find the target on the ground. The UAV descends to the target and hovers above the target for a few seconds to inspect the target. A high-level decision algorithm based on an OODA (observe, orient, decide, and act) loop was developed as a solution to address the problem. Navigation of the UAV was achieved by continuously sending local position messages to the autopilot via Mavros. The proposed system performed hovering above the target in three different stages: locate, descend, and hover. The system was tested in multiple trials, in simulations and outdoor tests, from heights of 10 m to 40 m. Results show that the system is highly reliable and robust to sensor errors, drift, and external disturbance.
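    The locate/descend/hover staging described above can be sketched as a small state machine in the spirit of an OODA loop (the class name, thresholds, and transitions are illustrative assumptions; the real system issues local position messages via Mavros):

```python
class TargetInspector:
    """Toy locate -> descend -> hover state machine: observe (camera input),
    orient (target visibility, altitude), decide (transition), act (new state)."""

    def __init__(self, hover_alt=2.0):
        self.state = "locate"
        self.hover_alt = hover_alt       # assumed hover height in meters

    def step(self, target_visible, altitude):
        if self.state == "locate" and target_visible:
            self.state = "descend"
        elif self.state == "descend":
            if not target_visible:
                self.state = "locate"    # target lost during descent: re-acquire
            elif altitude <= self.hover_alt:
                self.state = "hover"
        return self.state
```

    Returning to "locate" on loss of sight is one simple way to obtain the robustness to drift and disturbance the abstract reports.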

  17. A New Tool for Nutrition App Quality Evaluation (AQEL): Development, Validation, and Reliability Testing

    PubMed Central

    Huang, Wenhao; Chapman-Novakofski, Karen M

    2017-01-01

    Background: The extensive availability and increasing use of mobile apps for nutrition-based health interventions makes evaluation of the quality of these apps crucial for integration of apps into nutritional counseling. Objective: The goal of this research was the development, validation, and reliability testing of the app quality evaluation (AQEL) tool, an instrument for evaluating apps’ educational quality and technical functionality. Methods: Items for evaluating app quality were adapted from website evaluations, with additional items added to evaluate the specific characteristics of apps, resulting in 79 initial items. Expert panels of nutrition and technology professionals and app users reviewed items for face and content validation. After recommended revisions, nutrition experts completed a second AQEL review to ensure clarity. On the basis of 150 sets of responses using the revised AQEL, principal component analysis was completed, reducing AQEL into 5 factors that underwent reliability testing, including internal consistency, split-half reliability, test-retest reliability, and interrater reliability (IRR). Two additional modifiable constructs for evaluating apps based on the age and needs of the target audience as selected by the evaluator were also tested for construct reliability. IRR testing using intraclass correlations (ICC) with all 7 constructs was conducted, with 15 dietitians evaluating one app. Results: Development and validation resulted in the 51-item AQEL. These were reduced to 25 items in 5 factors after principal component analysis, plus 9 modifiable items in two constructs that were not included in principal component analysis. Internal consistency and split-half reliability of the following constructs derived from principal components analysis was good (Cronbach alpha >.80, Spearman-Brown coefficient >.80): behavior change potential, support of knowledge acquisition, app function, and skill development.
Split-half reliability for app purpose was .65. Test-retest reliability showed no significant change over time (P>.05) for all but skill development (P=.001). Construct reliability was good for items assessing age appropriateness of apps for children, teens, and a general audience. In addition, construct reliability was acceptable for assessing app appropriateness for various target audiences (Cronbach alpha >.70). For the 5 main factors, ICC (1,k) was >.80, with a P value of <.05. When 15 nutrition professionals evaluated one app, ICC (2,15) was .98, with a P value of <.001 for all 7 constructs when the modifiable items were specified for adults seeking weight loss support. Conclusions: Our preliminary effort shows that AQEL is a valid, reliable instrument for evaluating nutrition apps’ qualities for clinical interventions by nutrition clinicians, educators, and researchers. Further efforts in validating AQEL in various contexts are needed. PMID:29079554
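    Of the reliability statistics reported, split-half reliability with the Spearman-Brown correction is simple to sketch in pure Python (the odd-even split and toy data are assumptions; the authors do not publish their computation code):

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def split_half_reliability(item_scores):
    """item_scores: per-respondent lists of item scores.
    Odd-even split, then Spearman-Brown correction 2r / (1 + r)."""
    odd = [sum(resp[::2]) for resp in item_scores]
    even = [sum(resp[1::2]) for resp in item_scores]
    r = pearson(odd, even)
    return 2 * r / (1 + r)
```

    The correction compensates for each half containing only half the items, which is why the corrected value exceeds the raw half-to-half correlation.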

  18. Biotechnological uses of RNAi in plants: risk assessment considerations.

    PubMed

    Casacuberta, Josep M; Devos, Yann; du Jardin, Patrick; Ramon, Matthew; Vaucheret, Hervé; Nogué, Fabien

    2015-03-01

    RNAi offers opportunities to generate new traits in genetically modified (GM) plants. Instead of expressing novel proteins, RNAi-based GM plants reduce target gene expression. Silencing of off-target genes may trigger unintended effects, and identifying these genes would facilitate risk assessment. However, using bioinformatics alone is not reliable, due to the lack of genomic data and insufficient knowledge of mechanisms governing mRNA-small (s)RNA interactions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Target recognition and scene interpretation in image/video understanding systems based on network-symbolic models

    NASA Astrophysics Data System (ADS)

    Kuvich, Gary

    2004-08-01

    Vision is only a part of a system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, which is an interpretation of visual information in terms of these knowledge models. These mechanisms provide reliable recognition even if the object is occluded or cannot be recognized as a whole. It is hard to split the entire system apart, and reliable solutions to target recognition problems are possible only within the solution of the more generic image understanding problem. The brain reduces informational and computational complexity using implicit symbolic coding of features, hierarchical compression, and selective processing of visual information. A biologically inspired Network-Symbolic representation, in which both systematic structural/logical methods and neural/statistical methods are parts of a single mechanism, is the most feasible basis for such models. It converts visual information into relational Network-Symbolic structures, avoiding artificial precise computations of 3-dimensional models. Network-Symbolic transformations derive abstract structures, which allows for invariant recognition of an object as an exemplar of a class. Active vision helps create consistent models. Attention, separation of figure from ground, and perceptual grouping are special kinds of network-symbolic transformations. Such image/video understanding systems will recognize targets reliably.

  20. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified at high sensitivity reliably in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest, proteotypic peptides, are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  1. Reliability and Validity of the Dyadic Observed Communication Scale (DOCS).

    PubMed

    Hadley, Wendy; Stewart, Angela; Hunter, Heather L; Affleck, Katelyn; Donenberg, Geri; Diclemente, Ralph; Brown, Larry K

    2013-02-01

    We evaluated the reliability and validity of the Dyadic Observed Communication Scale (DOCS) coding scheme, which was developed to capture a range of communication components between parents and adolescents. Adolescents and their caregivers were recruited from mental health facilities for participation in a large, multi-site family-based HIV prevention intervention study. Seventy-one dyads were randomly selected from the larger study sample and coded using the DOCS at baseline. Preliminary validity and reliability of the DOCS was examined using various methods, such as comparing results to self-report measures and examining interrater reliability. Results suggest that the DOCS is a reliable and valid measure of observed communication among parent-adolescent dyads that captures both verbal and nonverbal communication behaviors that are typical intervention targets. The DOCS is a viable coding scheme for use by researchers and clinicians examining parent-adolescent communication. Coders can be trained to reliably capture individual and dyadic components of communication for parents and adolescents and this complex information can be obtained relatively quickly.

  2. Effects of Targeted Professional Development on Teachers' Specific Praise Rates

    ERIC Educational Resources Information Center

    Simonsen, Brandi; Freeman, Jennifer; Dooley, Kathryn; Maddock, Eleanor; Kern, Laura; Myers, Diane

    2017-01-01

    Classroom management continues to be a concern for educators, administrators, and policymakers. Although evidence-based classroom management practices exist, teachers often receive insufficient training and support to implement these practices successfully. Schools need reliable and efficient ways to support teachers' classroom management. This…

  3. The research and application of visual saliency and adaptive support vector machine in target tracking field.

    PubMed

    Chen, Yuantao; Xu, Weihong; Kuang, Fangjun; Gao, Shangbing

    2013-01-01

    Efficient target tracking algorithms have become a focus of intelligent-robot research. The main challenge in target tracking for a mobile robot is environmental uncertainty: target states are difficult to estimate, and illumination changes, target shape changes, complex backgrounds, occlusion, and other factors all affect tracking robustness. To further improve tracking accuracy and reliability, we present a novel target tracking algorithm that uses visual saliency and an adaptive support vector machine (ASVM). The algorithm is based on the mixed saliency of image features, including color, brightness, and motion. These visual saliency features are combined to express the target's saliency. Numerous experiments demonstrate the effectiveness and timeliness of the proposed target tracking algorithm in video sequences where the target objects undergo large changes in pose, scale, and illumination.
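    A crude center-surround contrast map illustrates the kind of saliency cue such an algorithm combines (a toy single-channel stand-in; the paper fuses color, brightness, and motion saliency, which this sketch does not attempt):

```python
def saliency_map(gray, radius=1):
    """Center-surround contrast: |pixel - local neighborhood mean| per pixel,
    for a grayscale image given as a 2D list of numbers."""
    h, w = len(gray), len(gray[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [gray[j][i]
                    for j in range(max(y - radius, 0), min(y + radius + 1, h))
                    for i in range(max(x - radius, 0), min(x + radius + 1, w))]
            out[y][x] = abs(gray[y][x] - sum(vals) / len(vals))
    return out
```

    A pixel that stands out from its neighborhood gets a high score; uniform regions score near zero, so the map highlights candidate target regions for the classifier.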

  4. Molecular Dynamics Simulations and Kinetic Measurements to Estimate and Predict Protein-Ligand Residence Times.

    PubMed

    Mollica, Luca; Theret, Isabelle; Antoine, Mathias; Perron-Sierra, Françoise; Charton, Yves; Fourquez, Jean-Marie; Wierzbicki, Michel; Boutin, Jean A; Ferry, Gilles; Decherchi, Sergio; Bottegoni, Giovanni; Ducrot, Pierre; Cavalli, Andrea

    2016-08-11

    Ligand-target residence time is emerging as a key drug discovery parameter because it can reliably predict drug efficacy in vivo. Experimental approaches to binding and unbinding kinetics are nowadays available, but we still lack reliable computational tools for predicting kinetics and residence time. Most attempts have been based on brute-force molecular dynamics (MD) simulations, which are CPU-demanding and not yet particularly accurate. We recently reported a new scaled-MD-based protocol, which showed potential for residence time prediction in drug discovery. Here, we further challenged our procedure's predictive ability by applying our methodology to a series of glucokinase activators that could be useful for treating type 2 diabetes mellitus. We combined scaled MD with experimental kinetics measurements and X-ray crystallography, promptly checking the protocol's reliability by directly comparing computational predictions and experimental measures. The good agreement highlights the potential of our scaled-MD-based approach as an innovative method for computationally estimating and predicting drug residence times.

  5. System-level multi-target drug discovery from natural products with applications to cardiovascular diseases.

    PubMed

    Zheng, Chunli; Wang, Jinan; Liu, Jianling; Pei, Mengjie; Huang, Chao; Wang, Yonghua

    2014-08-01

    The term systems pharmacology describes a field of study that uses computational and experimental approaches to broaden the view of drug actions rooted in molecular interactions and advance the process of drug discovery. The aim of this work is to highlight the role that systems pharmacology plays in multi-target drug discovery from natural products for cardiovascular diseases (CVDs). Firstly, based on network pharmacology methods, we reconstructed the drug-target and target-target networks to determine the putative protein target set of multi-target drugs for CVDs treatment. Secondly, we integrated a compound dataset of natural products and then obtained a multi-target compound subset through a virtual-screening process. Thirdly, a drug-likeness evaluation was applied to find the ADME-favorable compounds in this subset. Finally, we conducted in vitro experiments to evaluate the reliability of the selected chemicals and targets. We found that four of the five randomly selected natural molecules can effectively act on the target set for CVDs, indicating the soundness of our systems-based method. This strategy may serve as a new model for multi-target drug discovery for complex diseases.

  6. Fusion-based multi-target tracking and localization for intelligent surveillance systems

    NASA Astrophysics Data System (ADS)

    Rababaah, Haroun; Shirkhodaie, Amir

    2008-04-01

    In this paper, we present two approaches to visual target tracking and localization in complex urban environments: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for the associated tracking of multiple targets of interest. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian Mixture Model. Foreground segmentation, on the other hand, was achieved by the Connected Components Analysis (CCA) technique. The tracking of individual targets was estimated by fusing two sources of information: the centroid with spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image-pixel-to-world-coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs. In coarse estimation, an approximate neighborhood of the target position is estimated by the nearest four-neighbor method; in fine estimation, Euclidean interpolation is used to localize the position within the estimated four neighbors. Both techniques were tested and showed reliable results for the tracking and localization of targets of interest in complex urban environments.
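
    The RGB-histogram association step described above can be sketched as follows. Histogram intersection is used here as the similarity score, which is one common choice; the paper does not specify its exact comparison function:

```python
import numpy as np

def rgb_histogram(patch, bins=8):
    """Concatenated per-channel histogram, normalized to sum to 1."""
    h = np.concatenate([
        np.histogram(patch[..., c], bins=bins, range=(0, 256))[0]
        for c in range(3)
    ]).astype(float)
    return h / h.sum()

def association_matrix(track_patches, detection_patches):
    """Histogram-intersection similarity for every track/detection pair.

    Each track can then be associated with its highest-scoring detection,
    e.g. by taking the argmax along each row.
    """
    return np.array([
        [np.minimum(rgb_histogram(t), rgb_histogram(d)).sum()
         for d in detection_patches]
        for t in track_patches
    ])
```

    A score of 1.0 means identical color distributions; combining this with the centroid/spatial-gating cue gives the fused association described in the abstract.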

  7. Assessing the Eating Behaviors of Low-Income, Urban Adolescents

    ERIC Educational Resources Information Center

    Fahlman, Mariane; McCaughtry, Nate; Martin, Jeffrey; Garn, Alex C.; Shen, Bo

    2012-01-01

    Background: There is a need for instruments that can accurately determine the effectiveness of nutrition interventions targeting low-income, inner-city adolescents. Purpose: To examine the development of a valid and reliable eating behavior scale (EBS) for use in school-based nutrition interventions in urban, inner-city communities dominated by…

  8. A Novel Hybrid Mental Spelling Application Based on Eye Tracking and SSVEP-Based BCI

    PubMed Central

    Stawicki, Piotr; Gembler, Felix; Rezeika, Aya; Volosyak, Ivan

    2017-01-01

    Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs), as well as eye-tracking devices, provide a pathway for re-establishing communication for people with severe disabilities. We fused these control techniques into a novel eye-tracking/SSVEP hybrid system, which utilizes eye tracking for initial rough selection and SSVEP technology for fine target activation. Based on our previous studies, only four stimuli were used for the SSVEP aspect, granting sufficient control for most BCI users. As eye-tracking data is not used for the activation of letters, false positives due to inappropriate dwell times are avoided. This novel approach combines the high speed of eye-tracking systems and the high classification accuracies of low-target SSVEP-based BCIs, leading to an optimal combination of both methods. We evaluated the accuracy and speed of the proposed hybrid system with a 30-target spelling application implementing all three control approaches (pure eye tracking, SSVEP, and the hybrid system) with 32 participants. Although the highest information transfer rates (ITRs) were achieved with pure eye tracking, a considerable number of subjects were not able to gain sufficient control over the stand-alone eye-tracking device or the pure SSVEP system (78.13% and 75% of the participants reached reliable control, respectively). In this respect, the proposed hybrid was the most universal (over 90% of users achieved reliable control) and outperformed the pure SSVEP system in terms of speed and user friendliness. The presented hybrid system might offer communication to a wider range of users in comparison to the standard techniques. PMID:28379187
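
    The ITRs mentioned above are conventionally computed with the Wolpaw formula; a minimal sketch (the example values are illustrative, not figures from the study):

```python
import math

def itr_bits_per_min(n_targets, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate (ITR) in bits per minute."""
    n, p = n_targets, accuracy
    if p <= 1.0 / n:
        return 0.0  # at or below chance level, no information is transferred
    bits = math.log2(n)
    if p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / seconds_per_selection

# e.g. a 30-target speller at 90% accuracy, one selection every 5 seconds:
rate = itr_bits_per_min(30, 0.9, 5.0)
```

    Note that ITR rewards both accuracy and speed, which is why the fast eye-tracking mode can top the ranking even when fewer users gain reliable control with it.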

  9. Joint Calibration of 3d Laser Scanner and Digital Camera Based on Dlt Algorithm

    NASA Astrophysics Data System (ADS)

    Gao, X.; Li, M.; Xing, L.; Liu, Y.

    2018-04-01

    We designed a calibration target that can be scanned by a 3D laser scanner while being photographed by a digital camera, yielding a point cloud and photographs of the same target. A method to jointly calibrate the 3D laser scanner and digital camera based on the Direct Linear Transformation (DLT) algorithm is proposed. The method adds a camera distortion model to the traditional DLT algorithm; after repeated iteration, it solves for the interior and exterior orientation elements of the camera and completes the joint calibration of the 3D laser scanner and digital camera. The results prove that this method is reliable.
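
    The core DLT step can be sketched as a linear least-squares solve for the 11 classical DLT coefficients from 3D-2D point correspondences; the distortion model and the iteration that the paper adds on top are omitted here:

```python
import numpy as np

def dlt_calibrate(world_pts, image_pts):
    """Solve the 11 DLT coefficients L1..L11 from >= 6 non-coplanar 3D-2D pairs.

    Each correspondence (X, Y, Z) -> (u, v) contributes two linear equations;
    the stacked system A @ L = b is solved in the least-squares sense.
    """
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]); b.append(u)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]); b.append(v)
    L, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return L  # 11-vector of DLT coefficients

def project(L, X, Y, Z):
    """Map a world point to image coordinates with the DLT coefficients."""
    den = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    return ((L[0] * X + L[1] * Y + L[2] * Z + L[3]) / den,
            (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / den)
```

    In the joint-calibration setting, the world points come from the scanner's point cloud of the shared target and the image points from the photographs of the same target.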

  10. Validation and reliability of the sex estimation of the human os coxae using freely available DSP2 software for bioarchaeology and forensic anthropology.

    PubMed

    Brůžek, Jaroslav; Santos, Frédéric; Dutailly, Bruno; Murail, Pascal; Cunha, Eugenia

    2017-10-01

    A new tool for skeletal sex estimation based on measurements of the human os coxae is presented using skeletons from a metapopulation of identified adult individuals from twelve independent population samples. For reliable sex estimation, a posterior probability greater than 0.95 was considered to be the classification threshold: below this value, estimates are considered indeterminate. By providing free software, we aim to develop an even more disseminated method for sex estimation. Ten metric variables collected from 2,040 ossa coxa of adult subjects of known sex were recorded between 1986 and 2002 (reference sample). To test both the validity and reliability, a target sample consisting of two series of adult ossa coxa of known sex (n = 623) was used. The DSP2 software (Diagnose Sexuelle Probabiliste v2) is based on Linear Discriminant Analysis, and the posterior probabilities are calculated using an R script. For the reference sample, any combination of four dimensions provides a correct sex estimate in at least 99% of cases. The percentage of individuals for whom sex can be estimated depends on the number of dimensions; for all ten variables it is higher than 90%. Those results are confirmed in the target sample. Our posterior probability threshold of 0.95 for sex estimate corresponds to the traditional sectioning point used in osteological studies. DSP2 software is replacing the former version that should not be used anymore. DSP2 is a robust and reliable technique for sexing adult os coxae, and is also user friendly. © 2017 Wiley Periodicals, Inc.
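
    The decision rule described above (report a sex only when the LDA posterior exceeds 0.95, otherwise indeterminate) can be sketched with a plain linear discriminant analysis; the synthetic two-variable data in the test stand in for the ten coxal measurements, and this is not the DSP2 code itself:

```python
import numpy as np

def lda_fit(X, y):
    """Fit LDA: class means, pooled within-class covariance, priors."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    # Pooled covariance (the equal-covariance assumption behind LDA).
    cov = sum(np.cov(X[y == c].T, bias=False) * (np.sum(y == c) - 1)
              for c in classes) / (len(y) - len(classes))
    priors = {c: np.mean(y == c) for c in classes}
    return classes, means, np.linalg.inv(cov), priors

def lda_posteriors(model, x):
    classes, means, icov, priors = model
    # Linear discriminant scores; a softmax over them yields the posteriors
    # (the quadratic term in x cancels under equal covariances).
    scores = np.array([x @ icov @ means[c] - 0.5 * means[c] @ icov @ means[c]
                       + np.log(priors[c]) for c in classes])
    e = np.exp(scores - scores.max())
    return classes, e / e.sum()

def estimate_sex(model, x, threshold=0.95):
    """DSP2-style convention: report a sex only above the posterior threshold."""
    classes, post = lda_posteriors(model, x)
    return classes[post.argmax()] if post.max() >= threshold else "indeterminate"
```

    Specimens near the class boundary get a posterior near 0.5 and are flagged indeterminate rather than misclassified, which is what keeps the reported accuracy above 99%.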

  11. New support vector machine-based method for microRNA target prediction.

    PubMed

    Li, L; Gao, Q; Mao, X; Cao, Y

    2014-06-09

    MicroRNA (miRNA) plays important roles in cell differentiation, proliferation, growth, mobility, and apoptosis. An accurate list of precise target genes is necessary in order to fully understand the importance of miRNAs in animal development and disease. Several computational methods have been proposed for miRNA target-gene identification. However, these methods still have limitations with respect to their sensitivity and accuracy. Thus, we developed a new miRNA target-prediction method based on the support vector machine (SVM) model. The model supplies information of two binding sites (primary and secondary) for a radial basis function kernel as a similarity measure for SVM features. The information is categorized based on structural, thermodynamic, and sequence conservation. Using high-confidence datasets selected from public miRNA target databases, we obtained a human miRNA target SVM classifier model with high performance and provided an efficient tool for human miRNA target gene identification. Experiments have shown that our method is a reliable tool for miRNA target-gene prediction, and a successful application of an SVM classifier. Compared with other methods, the method proposed here improves the sensitivity and accuracy of miRNA prediction. Its performance can be further improved by providing more training examples.
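
    The similarity measure at the heart of the classifier, the RBF kernel over candidate-site feature vectors, can be sketched as follows. The feature vectors here are hypothetical placeholders for the structural, thermodynamic, and conservation features:

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    """K(x, z) = exp(-gamma * ||x - z||^2), the SVM similarity measure."""
    d = np.asarray(x, float) - np.asarray(z, float)
    return float(np.exp(-gamma * (d @ d)))

def gram_matrix(features, gamma=0.5):
    """Kernel (Gram) matrix the SVM trains on: one row/column per site."""
    return np.array([[rbf_kernel(a, b, gamma) for b in features]
                     for a in features])
```

    An SVM trained on this matrix separates true target sites from negatives; candidate sites whose features resemble confirmed sites receive kernel values near 1 and fall on the positive side of the margin.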

  12. Target-decoy Based False Discovery Rate Estimation for Large-scale Metabolite Identification.

    PubMed

    Wang, Xusheng; Jones, Drew R; Shaw, Timothy I; Cho, Ji-Hoon; Wang, Yuanyuan; Tan, Haiyan; Xie, Boer; Zhou, Suiping; Li, Yuxin; Peng, Junmin

    2018-05-23

    Metabolite identification is a crucial step in mass spectrometry (MS)-based metabolomics. However, it is still challenging to assess the confidence of assigned metabolites. In this study, we report a novel method for estimating false discovery rate (FDR) of metabolite assignment with a target-decoy strategy, in which the decoys are generated through violating the octet rule of chemistry by adding small odd numbers of hydrogen atoms. The target-decoy strategy was integrated into JUMPm, an automated metabolite identification pipeline for large-scale MS analysis, and was also evaluated with two other metabolomics tools, mzMatch and mzMine 2. The reliability of FDR calculation was examined by false datasets, which were simulated by altering MS1 or MS2 spectra. Finally, we used the JUMPm pipeline coupled with the target-decoy strategy to process unlabeled and stable-isotope labeled metabolomic datasets. The results demonstrate that the target-decoy strategy is a simple and effective method for evaluating the confidence of high-throughput metabolite identification.
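
    The basic target-decoy FDR computation described above can be sketched as follows; the scoring scheme is a generic placeholder, not JUMPm's:

```python
def fdr_at_threshold(hits, threshold):
    """Target-decoy FDR: decoy hits / target hits at or above a score cutoff.

    `hits` is a list of (score, is_decoy) pairs. Decoys here would come from
    chemically impossible formulas (octet-rule violations via odd numbers of
    extra hydrogens), so each decoy passing the cutoff estimates one false
    target identification.
    """
    targets = sum(1 for s, d in hits if s >= threshold and not d)
    decoys = sum(1 for s, d in hits if s >= threshold and d)
    return decoys / targets if targets else 0.0
```

    Sweeping the threshold and keeping the lowest cutoff with FDR below, say, 1% gives the reported confidence-filtered identification list.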

  13. A New Tool for Nutrition App Quality Evaluation (AQEL): Development, Validation, and Reliability Testing.

    PubMed

    DiFilippo, Kristen Nicole; Huang, Wenhao; Chapman-Novakofski, Karen M

    2017-10-27

    The extensive availability and increasing use of mobile apps for nutrition-based health interventions makes evaluation of the quality of these apps crucial for the integration of apps into nutritional counseling. The goal of this research was the development, validation, and reliability testing of the app quality evaluation (AQEL) tool, an instrument for evaluating apps' educational quality and technical functionality. Items for evaluating app quality were adapted from website evaluations, with additional items added to evaluate the specific characteristics of apps, resulting in 79 initial items. Expert panels of nutrition and technology professionals and app users reviewed items for face and content validation. After recommended revisions, nutrition experts completed a second AQEL review to ensure clarity. On the basis of 150 sets of responses using the revised AQEL, principal component analysis was completed, reducing AQEL to 5 factors that underwent reliability testing, including internal consistency, split-half reliability, test-retest reliability, and interrater reliability (IRR). Two additional modifiable constructs for evaluating apps based on the age and needs of the target audience as selected by the evaluator were also tested for construct reliability. IRR testing using intraclass correlations (ICC) with all 7 constructs was conducted, with 15 dietitians evaluating one app. Development and validation resulted in the 51-item AQEL. These were reduced to 25 items in 5 factors after principal component analysis, plus 9 modifiable items in two constructs that were not included in principal component analysis. Internal consistency and split-half reliability of the following constructs derived from principal component analysis were good (Cronbach alpha >.80, Spearman-Brown coefficient >.80): behavior change potential, support of knowledge acquisition, app function, and skill development. App-purpose split-half reliability was .65. Test-retest reliability showed no significant change over time (P>.05) for all constructs but skill development (P=.001). Construct reliability was good for items assessing the age appropriateness of apps for children, teens, and a general audience. In addition, construct reliability was acceptable for assessing app appropriateness for various target audiences (Cronbach alpha >.70). For the 5 main factors, ICC (1,k) was >.80, with a P value of <.05. When 15 nutrition professionals evaluated one app, ICC (2,15) was .98, with a P value of <.001 for all 7 constructs when the modifiable items were specified for adults seeking weight loss support. Our preliminary effort shows that AQEL is a valid, reliable instrument for evaluating nutrition apps' qualities for clinical interventions by nutrition clinicians, educators, and researchers. Further efforts to validate AQEL in various contexts are needed. ©Kristen Nicole DiFilippo, Wenhao Huang, Karen M. Chapman-Novakofski. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 27.10.2017.
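
    Two of the reliability statistics reported above come from standard formulas; a minimal sketch of Cronbach's alpha and the Spearman-Brown correction (the data layout is an assumption for illustration, not AQEL's own format):

```python
import numpy as np

def cronbach_alpha(items):
    """Internal consistency; `items` is a (n_respondents, n_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def spearman_brown(r_half):
    """Full-test reliability predicted from the split-half correlation."""
    return 2 * r_half / (1 + r_half)
```

    Alpha approaches 1 as items covary strongly relative to their individual variances; the Spearman-Brown step corrects the half-test correlation upward to estimate the reliability of the full-length scale.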

  14. Perceptions of Problem Behavior in Adolescents' Families: Perceiver, Target, and Family Effects

    ERIC Educational Resources Information Center

    Manders, Willeke A.; Janssens, Jan M. A. M.; Cook, William L.; Oud, Johan H. L.; De Bruyn, Eric E. J.; Scholte, Ron H. J.

    2009-01-01

    Considerable research has focused on the reliability and validity of informant reports of family behavior, especially maternal reports of adolescent problem behavior. None of these studies, however, has based their orientation on a theoretical model of interpersonal perception. In this study we used the social relations model (SRM) to examine…

  15. The Cause of Category-Based Distortions in Spatial Memory: A Distribution Analysis

    ERIC Educational Resources Information Center

    Sampaio, Cristina; Wang, Ranxiao Frances

    2017-01-01

    Recall of remembered locations reliably reflects a compromise between a target's true position and its region's prototypical position. The effect is quite robust, and a standard interpretation for these data is that the metric and categorical codings blend in a Bayesian combinatory fashion. However, there has been no direct experimental evidence…

  16. Stochastic analysis of motor-control stability, polymer based force sensing, and optical stimulation as a preventive measure for falls

    NASA Astrophysics Data System (ADS)

    Landrock, Clinton K.

    Falls are the leading cause of all external injuries. Outcomes of falls include the leading cause of traumatic brain injury and bone fractures, and high direct medical costs in the billions of dollars. This work focused on developing three areas of enabling component technology to be used in postural control monitoring tools targeting the mitigation of falls. The first was an analysis tool based on stochastic fractal analysis to reliably measure levels of motor control. The second focus was on thin film wearable pressure sensors capable of relaying data for the first tool. The third was new thin film advanced optics for improving phototherapy devices targeting postural control disorders. Two populations, athletes and elderly, were studied against control groups. The results of these studies clearly show that monitoring postural stability in at-risk groups can be achieved reliably, and an integrated wearable system can be envisioned for both monitoring and treatment purposes. Keywords: electro-active polymer, ionic polymer-metal composite, postural control, motor control, fall prevention, sports medicine, fractal analysis, physiological signals, wearable sensors, phototherapy, photobiomodulation, nano-optics.

  17. Measuring homework completion in behavioral activation.

    PubMed

    Busch, Andrew M; Uebelacker, Lisa A; Kalibatseva, Zornitsa; Miller, Ivan W

    2010-07-01

    The aim of this study was to develop and validate an observer-based coding system for the characterization and completion of homework assignments during Behavioral Activation (BA). Existing measures of homework completion are generally unsophisticated, and there is no current measure of homework completion designed to capture the particularities of BA. The tested scale sought to capture the type of assignment, realm of functioning targeted, extent of completion, and assignment difficulty. Homework assignments were drawn from 12 (mean age = 48, 83% female) clients in two trials of a 10-session BA manual targeting treatment-resistant depression in primary care. The two coders demonstrated acceptable or better reliability on most codes, and unreliable codes were dropped from the proposed scale. In addition, correlations between homework completion and outcome were strong, providing some support for construct validity. Ultimately, this line of research aims to develop a user-friendly, reliable measure of BA homework completion that can be completed by a therapist during session.

  18. High Available COTS Based Computer for Space

    NASA Astrophysics Data System (ADS)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

    The availability and reliability factors of a system are central requirements of its target application. From a simple fuel-injection system in a car to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under its boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures that fulfill the availability and reliability requirements as well as the required increase in data processing power. Alongside these increased quality requirements, clear customer needs are simplification, the use of COTS components to decrease costs, and interface compatibility with currently used system standards. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems has not always been possible because of the obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.

  19. SDSS-IV eBOSS emission-line galaxy pilot survey

    DOE PAGES

    Comparat, J.; Delubac, T.; Jouvel, S.; ...

    2016-08-09

    The Sloan Digital Sky Survey IV extended Baryonic Oscillation Spectroscopic Survey (SDSS-IV/eBOSS) will observe 195,000 emission-line galaxies (ELGs) to measure the Baryonic Acoustic Oscillation standard ruler (BAO) at redshift 0.9. To test different ELG selection algorithms, 9,000 spectra were observed with the SDSS spectrograph as a pilot survey based on data from several imaging surveys. First, using visual inspection and redshift quality flags, we show that the automated spectroscopic redshifts assigned by the pipeline meet the quality requirements for a reliable BAO measurement. We also show the correlations between sky emission, signal-to-noise ratio in the emission lines, and redshift error. Then we provide a detailed description of each target selection algorithm we tested and compare them with the requirements of the eBOSS experiment. As a result, we provide reliable redshift distributions for the different target selection schemes we tested. Lastly, we determine which target selection algorithms are best suited to be applied to DECam photometry because they fulfill the eBOSS survey efficiency requirements.

  20. Probing Reliability of Transport Phenomena Based Heat Transfer and Fluid Flow Analysis in Autogeneous Fusion Welding Process

    NASA Astrophysics Data System (ADS)

    Bag, S.; de, A.

    2010-09-01

    The transport phenomena based heat transfer and fluid flow calculations in weld pool require a number of input parameters. Arc efficiency, effective thermal conductivity, and viscosity in weld pool are some of these parameters, values of which are rarely known and difficult to assign a priori based on the scientific principles alone. The present work reports a bi-directional three-dimensional (3-D) heat transfer and fluid flow model, which is integrated with a real number based genetic algorithm. The bi-directional feature of the integrated model allows the identification of the values of a required set of uncertain model input parameters and, next, the design of process parameters to achieve a target weld pool dimension. The computed values are validated with measured results in linear gas-tungsten-arc (GTA) weld samples. Furthermore, a novel methodology to estimate the overall reliability of the computed solutions is also presented.
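
    The real-number genetic algorithm that drives the bi-directional model can be sketched generically as follows; the forward model in the usage line is a hypothetical placeholder for the 3-D heat transfer and fluid flow computation, not the actual weld-pool solver:

```python
import random

def real_ga(fitness, bounds, pop_size=30, gens=60, seed=1):
    """Minimal real-coded GA: elitism, blend crossover, Gaussian mutation.

    `bounds` lists (lo, hi) per parameter; `fitness` is minimized.
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness)[: pop_size // 2]  # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            # Blend crossover, then a small Gaussian mutation clipped to bounds.
            child = [x + rng.uniform(-0.5, 1.5) * (y - x) for x, y in zip(a, b)]
            child = [min(max(v + rng.gauss(0, 0.01 * (hi - lo)), lo), hi)
                     for v, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Placeholder forward model: squared mismatch between a stand-in computed
# weld-pool width p[0] * p[1] and a target width of 6.0.
best = real_ga(lambda p: (p[0] * p[1] - 6.0) ** 2, [(0.0, 5.0), (0.0, 5.0)])
```

    In the paper's setting, each fitness evaluation would be a full 3-D forward simulation, and the optimized parameters are the uncertain inputs (arc efficiency, effective conductivity, effective viscosity).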

  1. State-of-the-Art Review on Physiologically Based Pharmacokinetic Modeling in Pediatric Drug Development.

    PubMed

    Yellepeddi, Venkata; Rower, Joseph; Liu, Xiaoxi; Kumar, Shaun; Rashid, Jahidur; Sherwin, Catherine M T

    2018-05-18

    Physiologically based pharmacokinetic modeling and simulation is an important tool for predicting the pharmacokinetics, pharmacodynamics, and safety of drugs in pediatrics. Physiologically based pharmacokinetic modeling is applied in pediatric drug development for first-time-in-pediatric dose selection, simulation-based trial design, correlation with target organ toxicities, risk assessment by investigating possible drug-drug interactions, real-time assessment of pharmacokinetic-safety relationships, and assessment of non-systemic biodistribution targets. This review summarizes the details of a physiologically based pharmacokinetic modeling approach in pediatric drug research, emphasizing reports on pediatric physiologically based pharmacokinetic models of individual drugs. We also compare and contrast the strategies employed by various researchers in pediatric physiologically based pharmacokinetic modeling and provide a comprehensive overview of physiologically based pharmacokinetic modeling strategies and approaches in pediatrics. We discuss the impact of physiologically based pharmacokinetic models on regulatory reviews and product labels in the field of pediatric pharmacotherapy. Additionally, we examine in detail the current limitations and future directions of physiologically based pharmacokinetic modeling in pediatrics with regard to the ability to predict plasma concentrations and pharmacokinetic parameters. Despite the skepticism and concern in the pediatric community about the reliability of physiologically based pharmacokinetic models, there is substantial evidence that pediatric physiologically based pharmacokinetic models have been used successfully to predict differences in pharmacokinetics between adults and children for several drugs. 
It is obvious that the use of physiologically based pharmacokinetic modeling to support various stages of pediatric drug development is highly attractive and will rapidly increase, provided the robustness and reliability of these techniques are well established.

  2. Biomimetic nanochannels based biosensor for ultrasensitive and label-free detection of nucleic acids.

    PubMed

    Sun, Zhongyue; Liao, Tangbin; Zhang, Yulin; Shu, Jing; Zhang, Hong; Zhang, Guo-Jun

    2016-12-15

    A very simple sensing device based on biomimetic nanochannels has been developed for label-free, ultrasensitive and highly sequence-specific detection of DNA. Probe DNA was modified on the inner wall of the nanochannel surface by layer-by-layer (LBL) assembly. After probe DNA immobilization, DNA detection was realized by monitoring the rectified ion current when hybridization occurred. Due to three dimensional (3D) nanoscale environment of the nanochannel, this special geometry dramatically increased the surface area of the nanochannel for immobilization of probe molecules on the inner-surface and enlarged contact area between probes and target-molecules. Thus, the unique sensor reached a reliable detection limit of 10 fM for target DNA. In addition, this DNA sensor could discriminate complementary DNA (c-DNA) from non-complementary DNA (nc-DNA), two-base mismatched DNA (2bm-DNA) and one-base mismatched DNA (1bm-DNA) with high specificity. Moreover, the nanochannel-based biosensor was also able to detect target DNA even in an interfering environment and serum samples. This approach will provide a novel biosensing platform for detection and discrimination of disease-related molecular targets and unknown sequence DNA. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Feature reliability determines specificity and transfer of perceptual learning in orientation search.

    PubMed

    Yashar, Amit; Denison, Rachel N

    2017-12-01

    Training can modify the visual system to produce a substantial improvement on perceptual tasks and therefore has applications for treating visual deficits. Visual perceptual learning (VPL) is often specific to the trained feature, which gives insight into processes underlying brain plasticity, but limits VPL's effectiveness in rehabilitation. Under what circumstances VPL transfers to untrained stimuli is poorly understood. Here we report a qualitatively new phenomenon: intrinsic variation in the representation of features determines the transfer of VPL. Orientations around cardinal are represented more reliably than orientations around oblique in V1, which has been linked to behavioral consequences such as visual search asymmetries. We studied VPL for visual search of near-cardinal or oblique targets among distractors of the other orientation while controlling for other display and task attributes, including task precision, task difficulty, and stimulus exposure. Learning was the same in all training conditions; however, transfer depended on the orientation of the target, with full transfer of learning from near-cardinal to oblique targets but not the reverse. To evaluate the idea that representational reliability was the key difference between the orientations in determining VPL transfer, we created a model that combined orientation-dependent reliability, improvement of reliability with learning, and an optimal search strategy. Modeling suggested that not only search asymmetries but also the asymmetric transfer of VPL depended on preexisting differences between the reliability of near-cardinal and oblique representations. Transfer asymmetries in model behavior also depended on having different learning rates for targets and distractors, such that greater learning for low-reliability distractors facilitated transfer. 
These findings suggest that training on sensory features with intrinsically low reliability may maximize the generalizability of learning in complex visual environments.

  4. Feature reliability determines specificity and transfer of perceptual learning in orientation search

    PubMed Central

    2017-01-01

    Training can modify the visual system to produce a substantial improvement on perceptual tasks and therefore has applications for treating visual deficits. Visual perceptual learning (VPL) is often specific to the trained feature, which gives insight into processes underlying brain plasticity, but limits VPL’s effectiveness in rehabilitation. Under what circumstances VPL transfers to untrained stimuli is poorly understood. Here we report a qualitatively new phenomenon: intrinsic variation in the representation of features determines the transfer of VPL. Orientations around cardinal are represented more reliably than orientations around oblique in V1, which has been linked to behavioral consequences such as visual search asymmetries. We studied VPL for visual search of near-cardinal or oblique targets among distractors of the other orientation while controlling for other display and task attributes, including task precision, task difficulty, and stimulus exposure. Learning was the same in all training conditions; however, transfer depended on the orientation of the target, with full transfer of learning from near-cardinal to oblique targets but not the reverse. To evaluate the idea that representational reliability was the key difference between the orientations in determining VPL transfer, we created a model that combined orientation-dependent reliability, improvement of reliability with learning, and an optimal search strategy. Modeling suggested that not only search asymmetries but also the asymmetric transfer of VPL depended on preexisting differences between the reliability of near-cardinal and oblique representations. Transfer asymmetries in model behavior also depended on having different learning rates for targets and distractors, such that greater learning for low-reliability distractors facilitated transfer. 
These findings suggest that training on sensory features with intrinsically low reliability may maximize the generalizability of learning in complex visual environments. PMID:29240813

  5. Development of Species-Specific SCAR Markers, Based on a SCoT Analysis, to Authenticate Physalis (Solanaceae) Species

    PubMed Central

    Feng, Shangguo; Zhu, Yujia; Yu, Chenliang; Jiao, Kaili; Jiang, Mengying; Lu, Jiangjie; Shen, Chenjia; Ying, Qicai; Wang, Huizhong

    2018-01-01

    Physalis is an important genus in the Solanaceae family. It includes many species of significant medicinal, edible, and ornamental value. However, many Physalis species are easily confused because of their similar morphological traits, which hinders the utilization and protection of Physalis resources. It is therefore necessary to create fast, sensitive, and reliable methods for Physalis species authentication. To that end, in this study, species-specific sequence-characterized amplified region (SCAR) markers were developed for accurate identification of the closely related Physalis species P. angulata, P. minima, P. pubescens, and P. alkekengi var. franchetii, based on a simple and novel marker system, the start codon targeted (SCoT) marker. A total of 34 selected SCoT primers yielded 289 reliable SCoT loci, of which 265 were polymorphic. Four species-specific SCoT fragments (SCoT3-1404, SCoT3-1589, SCoT5-550, and SCoT36-520) from Physalis species were successfully identified, cloned, and sequenced. Based on these selected specific DNA fragments, four SCAR primer pairs were developed and named ST3KZ, ST3MSJ, ST5SJ, and ST36XSJ. PCR analysis with each of these primer pairs clearly demonstrated a specific amplified band in all samples of the target Physalis species, while no amplification was observed in the other Physalis species. Therefore, the species-specific SCAR primer pairs developed in this study can serve as powerful tools for rapidly, effectively, and reliably identifying and differentiating Physalis species.

  6. Decomposition-based transfer distance metric learning for image classification.

    PubMed

    Luo, Yong; Liu, Tongliang; Tao, Dacheng; Xu, Chao

    2014-09-01

    Distance metric learning (DML) is a critical factor for image analysis and pattern recognition. To learn a robust distance metric for a target task, we need abundant side information (i.e., similarity/dissimilarity pairwise constraints over the labeled data), which is usually unavailable in practice due to the high labeling cost. This paper considers the transfer learning setting, exploiting the large quantity of side information from certain related, but different, source tasks to help with target metric learning (which has only a little side information). State-of-the-art metric learning algorithms usually fail in this setting because the data distributions of the source and target tasks are often quite different. We address this problem by assuming that the target distance metric lies in the space spanned by the eigenvectors of the source metrics (or other randomly generated bases). The target metric is represented as a combination of the base metrics, which are computed using the decomposed components of the source metrics (or simply a set of random bases); we call the proposed method decomposition-based transfer DML (DTDML). In particular, DTDML learns a sparse combination of the base metrics to construct the target metric by forcing the target metric to be close to an integration of the source metrics. The main advantage of the proposed method over existing transfer metric learning approaches is that we directly learn the base metric coefficients instead of the target metric itself, so far fewer variables need to be learned. We therefore obtain more reliable solutions given the limited side information, and the optimization tends to be faster. Experiments on popular handwritten image (digit, letter) classification tasks and a challenging natural image annotation task demonstrate the effectiveness of the proposed method.
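
The base-metric construction described above can be illustrated with a minimal sketch (invented dimensions and data, not the authors' implementation): rank-1 base metrics are built from the eigenvectors of source metrics, and their combination coefficients are fit to a target metric. For brevity the sketch uses ridge-regularized least squares where the paper learns a sparse combination.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5  # feature dimension (toy value)

def random_metric(rng, d):
    """Random symmetric PSD matrix, standing in for a source-task metric."""
    A = rng.normal(size=(d, d))
    return A @ A.T

source_metrics = [random_metric(rng, d) for _ in range(3)]

# Base metrics: rank-1 outer products of each source metric's eigenvectors.
bases = []
for M in source_metrics:
    _, V = np.linalg.eigh(M)
    for i in range(d):
        u = V[:, [i]]
        bases.append(u @ u.T)

# In practice the target metric is known only through a few pairwise
# constraints; here we fit the combination coefficients directly to a
# synthetic target metric by ridge-regularized least squares.
target = random_metric(rng, d)
X = np.stack([B.ravel() for B in bases], axis=1)          # (d*d, n_bases)
theta = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]),
                        X.T @ target.ravel())
learned = sum(t * B for t, B in zip(theta, bases))
rel_err = np.linalg.norm(learned - target) / np.linalg.norm(target)
```

Because the learned metric is a weighted sum of symmetric rank-1 bases, it stays symmetric by construction, and only the (few) coefficients are optimized.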

  7. Robust pedestrian detection and tracking from a moving vehicle

    NASA Astrophysics Data System (ADS)

    Tuong, Nguyen Xuan; Müller, Thomas; Knoll, Alois

    2011-01-01

    In this paper, we address the problem of multi-person detection, tracking, and distance estimation in a complex scenario using multiple cameras. Specifically, we are interested in a vision system that supports the driver in avoiding unwanted collisions with pedestrians. We propose an approach that uses Histograms of Oriented Gradients (HOG) to detect pedestrians in static images and a particle filter as a robust tracking technique to follow targets from frame to frame. Because the depth map requires expensive computation, we extract depth information for targets using the Direct Linear Transformation (DLT) to reconstruct the 3D coordinates of corresponding points found by running Speeded Up Robust Features (SURF) on two input images. Using the particle filter, the proposed tracker can efficiently handle target occlusions in a simple background environment. However, to achieve reliable performance in complex scenarios with frequent target occlusions and complex cluttered backgrounds, results from the detection module are integrated to provide feedback and recover the tracker from tracking failures caused by the complexity of the environment and the variability of the target appearance model. The proposed approach is evaluated on different data sets, both in a simple background scenario and in a cluttered background environment. The results show that, by integrating the detector and tracker, reliable and stable performance is possible even if occlusion occurs frequently in a highly complex environment. A vision-based collision avoidance system for an intelligent car can, as a result, be achieved.
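
The predict–weight–resample cycle of such a particle filter can be sketched in one dimension (a toy example with invented motion and noise parameters, not the paper's HOG/SURF pipeline):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated 1-D target track and noisy per-frame detections.
T = 50
true_x = np.cumsum(np.full(T, 1.0)) + rng.normal(0, 0.1, T)
detections = true_x + rng.normal(0, 1.0, T)

N = 500
particles = rng.normal(detections[0], 2.0, N)   # init around first detection
estimates = []

for z in detections:
    particles = particles + 1.0 + rng.normal(0, 0.5, N)  # predict (const. velocity)
    w = np.exp(-0.5 * (z - particles) ** 2)              # Gaussian likelihood
    w /= w.sum()
    estimates.append(float(np.dot(w, particles)))        # posterior-mean estimate
    particles = particles[rng.choice(N, N, p=w)]         # multinomial resampling

rmse = float(np.sqrt(np.mean((np.asarray(estimates) - true_x) ** 2)))
```

In the paper's setting the likelihood would come from the appearance model rather than a scalar Gaussian, and detector output would re-seed the particles after tracking failures.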

  8. The use of head/eye-centered, hand-centered and allocentric representations for visually guided hand movements and perceptual judgments.

    PubMed

    Thaler, Lore; Todd, James T

    2009-04-01

    Two experiments are reported that were designed to measure the accuracy and reliability of both visually guided hand movements (Exp. 1) and perceptual matching judgments (Exp. 2). The specific procedure for informing subjects of the required response on each trial was manipulated so that some tasks could only be performed using an allocentric representation of the visual target; others could be performed using either an allocentric or hand-centered representation; still others could be performed based on an allocentric, hand-centered or head/eye-centered representation. Both head/eye and hand centered representations are egocentric because they specify visual coordinates with respect to the subject. The results reveal that accuracy and reliability of both motor and perceptual responses are highest when subjects direct their response towards a visible target location, which allows them to rely on a representation of the target in head/eye-centered coordinates. Systematic changes in averages and standard deviations of responses are observed when subjects cannot direct their response towards a visible target location, but have to represent target distance and direction in either hand-centered or allocentric visual coordinates instead. Subjects' motor and perceptual performance agree quantitatively well. These results strongly suggest that subjects process head/eye-centered representations differently from hand-centered or allocentric representations, but that they process visual information for motor actions and perceptual judgments together.

  9. Evidence-Based Assessment of Compulsive Skin Picking, Chronic Tic Disorders and Trichotillomania in Children

    ERIC Educational Resources Information Center

    McGuire, Joseph F.; Kugler, Brittany B.; Park, Jennifer M.; Horng, Betty; Lewin, Adam B.; Murphy, Tanya K.; Storch, Eric A.

    2012-01-01

    Body-focused repetitive behavior (BFRB) is an umbrella term for debilitating, repetitive behaviors that target one or more body regions. Despite regularly occurring in youth, there has been limited investigation of BFRBs in pediatric populations. One reason for this may be that there are few reliable and valid assessments available to evaluate the…

  10. Network-based drug discovery by integrating systems biology and computational technologies

    PubMed Central

    Leung, Elaine L.; Cao, Zhi-Wei; Jiang, Zhi-Hong; Zhou, Hua

    2013-01-01

    Network-based intervention has become a trend in treating systemic diseases, but it relies on regimen optimization and valid multi-target actions of the drugs. The complex multi-component nature of medicinal herbs may serve as a valuable resource for network-based multi-target drug discovery, owing to its potential treatment effects through synergy. Recently, multiple robust systems biology platforms have proven powerful for uncovering molecular mechanisms and the connections between drugs and the dynamic networks they target. However, methods for optimizing drug combinations remain insufficient, owing to the lack of tighter integration across multiple '-omics' databases. Newly developed algorithm- or network-based computational models can tightly integrate '-omics' databases and optimize combinational regimens in drug development, which encourages the use of medicinal herbs for developing a new wave of network-based multi-target drugs. Nevertheless, challenges to further integration of medicinal herb databases with multiple systems biology platforms for multi-target drug optimization remain: the uncertain reliability of individual data sets, and the width, depth, and degree of standardization of herbal medicine. Standardizing the methodology and terminology of multiple systems biology and herbal databases would facilitate this integration, as would expanding publicly accessible databases and increasing the number of studies applying systems biology platforms to herbal medicine. Further integration across various '-omics' platforms and computational tools would accelerate the development of network-based drug discovery and network medicine. PMID:22877768

  11. GEPSI: A Gene Expression Profile Similarity-Based Identification Method of Bioactive Components in Traditional Chinese Medicine Formula.

    PubMed

    Zhang, Baixia; He, Shuaibing; Lv, Chenyang; Zhang, Yanling; Wang, Yun

    2018-01-01

    The identification of bioactive components in traditional Chinese medicine (TCM) is an important part of TCM material foundation research. Recently, molecular docking technology has been used extensively for the identification of TCM bioactive components. However, the target proteins used in molecular docking may not be the actual TCM targets, and as a result bioactive components are likely to be omitted or incorrectly identified. To address this problem, this study proposes the GEPSI method, which identifies the target proteins of TCM based on the similarity of gene expression profiles. The similarity between the gene expression profiles affected by a TCM and by small-molecule drugs was calculated; the pharmacological action of the TCM may be similar to that of small-molecule drugs with high similarity scores. Indeed, the target proteins of these small-molecule drugs can be considered TCM targets. Thus, we identified the bioactive components of a TCM by molecular docking and verified the reliability of this method through a literature investigation. Using the target proteins that the TCM actually affects as targets, the identification of bioactive components was more accurate. This study provides a fast and effective method for the identification of TCM bioactive components.
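
The core similarity computation behind this kind of profile matching can be sketched as follows (a simplification with invented data; connectivity-map-style methods often use rank-based statistics rather than the plain Pearson correlation used here):

```python
import numpy as np

def profile_similarity(sig_a, sig_b):
    """Pearson correlation between two differential-expression signatures
    (vectors of per-gene log fold changes)."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    return float(np.mean(a * b))

rng = np.random.default_rng(0)
tcm_signature = rng.normal(size=200)          # hypothetical TCM profile

# Hypothetical drug signatures: one mimics the TCM, one is unrelated.
drug_like = tcm_signature + rng.normal(0, 0.3, 200)
drug_unrelated = rng.normal(size=200)

scores = {"drug_like": profile_similarity(tcm_signature, drug_like),
          "drug_unrelated": profile_similarity(tcm_signature, drug_unrelated)}
```

Drugs ranked highest by such a score would then contribute their known target proteins as candidate TCM targets for the docking step.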

  12. GEPSI: A Gene Expression Profile Similarity-Based Identification Method of Bioactive Components in Traditional Chinese Medicine Formula

    PubMed Central

    Zhang, Baixia; He, Shuaibing; Lv, Chenyang; Zhang, Yanling

    2018-01-01

    The identification of bioactive components in traditional Chinese medicine (TCM) is an important part of TCM material foundation research. Recently, molecular docking technology has been used extensively for the identification of TCM bioactive components. However, the target proteins used in molecular docking may not be the actual TCM targets, and as a result bioactive components are likely to be omitted or incorrectly identified. To address this problem, this study proposes the GEPSI method, which identifies the target proteins of TCM based on the similarity of gene expression profiles. The similarity between the gene expression profiles affected by a TCM and by small-molecule drugs was calculated; the pharmacological action of the TCM may be similar to that of small-molecule drugs with high similarity scores. Indeed, the target proteins of these small-molecule drugs can be considered TCM targets. Thus, we identified the bioactive components of a TCM by molecular docking and verified the reliability of this method through a literature investigation. Using the target proteins that the TCM actually affects as targets, the identification of bioactive components was more accurate. This study provides a fast and effective method for the identification of TCM bioactive components. PMID:29692857

  13. Performance of the WeNMR CS-Rosetta3 web server in CASD-NMR.

    PubMed

    van der Schot, Gijs; Bonvin, Alexandre M J J

    2015-08-01

    We present here the performance of the WeNMR CS-Rosetta3 web server in CASD-NMR, the critical assessment of automated structure determination by NMR. The CS-Rosetta server uses only chemical shifts for structure prediction, in combination, when available, with a post-scoring procedure based on unassigned NOE lists (Huang et al. in J Am Chem Soc 127:1665-1674, 2005b, doi: 10.1021/ja047109h). We compare the original submissions using a previous version of the server based on Rosetta version 2.6 with recalculated targets using the new R3FP fragment picker for fragment selection and implementing a new annotation of prediction reliability (van der Schot et al. in J Biomol NMR 57:27-35, 2013, doi: 10.1007/s10858-013-9762-6), both implemented in the CS-Rosetta3 WeNMR server. In this second round of CASD-NMR, the WeNMR CS-Rosetta server has demonstrated a much better performance than in the first round since only converged targets were submitted. Further, recalculation of all CASD-NMR targets using the new version of the server demonstrates that our new annotation of prediction quality is giving reliable results. Predictions annotated as weak are often found to provide useful models, but only for a fraction of the sequence, and should therefore only be used with caution.

  14. Object-based implicit learning in visual search: perceptual segmentation constrains contextual cueing.

    PubMed

    Conci, Markus; Müller, Hermann J; von Mühlenen, Adrian

    2013-07-09

    In visual search, detection of a target is faster when it is presented within a spatial layout of repeatedly encountered nontarget items, indicating that contextual invariances can guide selective attention (contextual cueing; Chun & Jiang, 1998). However, perceptual regularities may interfere with contextual learning; for instance, no contextual facilitation occurs when four nontargets form a square-shaped grouping, even though the square location predicts the target location (Conci & von Mühlenen, 2009). Here, we further investigated potential causes for this interference effect: we show that contextual cueing can reliably occur for targets located within the region of a segmented object, but not for targets presented outside of the object's boundaries. Four experiments demonstrate an object-based facilitation in contextual cueing, with a modulation of context-based learning by relatively subtle grouping cues including closure, symmetry, and spatial regularity. Moreover, the lack of contextual cueing for targets located outside the segmented region was due to an absence of (latent) learning of contextual layouts, rather than to an attentional bias towards the grouped region. Taken together, these results indicate that perceptual segmentation provides a basic structure within which contextual scene regularities are acquired. This in turn argues that contextual learning is constrained by object-based selection.

  15. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

    DOE PAGES

    Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; ...

    2015-11-10

    Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baselines and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.
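
A persistence baseline of the kind used here can be sketched with toy data (all values invented; real baselines also use numerical weather prediction and radiative transfer models): tomorrow's forecast is simply today's observation, and a candidate model's skill is measured against the baseline error.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24)
clear_sky = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None) * 100  # toy kW curve

# Two consecutive days with different (random) cloud attenuation.
day1 = clear_sky * (1 - 0.2 * rng.random(24))
day2 = clear_sky * (1 - 0.2 * rng.random(24))

# Persistence baseline: forecast tomorrow's output as today's observation.
persistence_forecast = day1
mae_baseline = float(np.mean(np.abs(persistence_forecast - day2)))

def forecast_skill(mae_model, mae_baseline):
    """Positive skill means the model beats the baseline."""
    return 1.0 - mae_model / mae_baseline

# A hypothetical improved model halving the baseline error has skill 0.5.
skill = forecast_skill(0.5 * mae_baseline, mae_baseline)
```

The reserve-based target values described in the paper would then translate a required reserve reduction back into a required skill (or MAE) level.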

  16. Multitarget mixture reduction algorithm with incorporated target existence recursions

    NASA Astrophysics Data System (ADS)

    Ristic, Branko; Arulampalam, Sanjeev

    2000-07-01

    The paper derives a deferred logic data association algorithm based on the mixture reduction (MR) approach originally due to Salmond [SPIE vol. 1305, 1990]. The novelty of the proposed algorithm is that it provides recursive formulae for both data association and target existence (confidence) estimation, thus allowing automatic track initiation and termination. The track initiation performance of the proposed filter is investigated by computer simulations. It is observed that at moderately high levels of clutter density the proposed filter initiates tracks more reliably than the corresponding PDA filter. An extension of the proposed filter to the multi-target case is also presented. In addition, the paper compares the track maintenance performance of the MR algorithm with an MHT implementation.
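
Mixture reduction of this kind repeatedly merges similar components while preserving the overall mixture moments; in one dimension the moment-matched merge of two weighted Gaussians looks like this (a generic sketch, not the paper's full algorithm):

```python
def merge_pair(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two weighted 1-D Gaussian components
    (weight, mean, variance) into a single component."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    # Merged variance includes the spread between the two component means.
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

# Merging preserves total weight and the mixture mean and variance.
w, m, v = merge_pair(0.6, 0.0, 1.0, 0.4, 2.0, 0.5)
```

Repeating this merge on the closest pair of association hypotheses keeps the hypothesis count bounded, which is what makes the recursive data-association and existence updates tractable.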

  17. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    PubMed Central

    Fan, Wei; Li, Rong; Li, Sifan; Ping, Wenli; Li, Shujun; Naumova, Alexandra; Peelen, Tamara; Yuan, Zheng; Zhang, Dabing

    2016-01-01

    Reliable methods are needed to detect the presence of tobacco components in tobacco products in order to control smuggling and to classify tariff and excise in the tobacco industry, thereby curbing illegal tobacco trade. In this study, two sensitive and specific DNA-based methods, a quantitative real-time PCR (qPCR) assay and a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of the uridine 5′-monophosphate synthase (UMPS) gene, and their specificities and sensitivities were determined with various plant materials. Both the qPCR and LAMP methods were reliable and accurate in the rapid detection of tobacco components in various practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for application in tobacco identification, particularly in special cases where the morphology or chemical composition of the tobacco has been disrupted. Therefore, combining both methods would facilitate not only tobacco smuggling control, but also tariff classification and excise assessment. PMID:27635142

  18. Evaluating the effect of database inflation in proteogenomic search on sensitive and reliable peptide identification.

    PubMed

    Li, Honglan; Joh, Yoon Sung; Kim, Hyunwoo; Paek, Eunok; Lee, Sang-Won; Hwang, Kyu-Baek

    2016-12-22

    Proteogenomics is a promising approach for various tasks ranging from gene annotation to cancer research. Databases for proteogenomic searches are often constructed by adding peptide sequences inferred from genomic or transcriptomic evidence to reference protein sequences. Such inflation of databases has potential of identifying novel peptides. However, it also raises concerns on sensitive and reliable peptide identification. Spurious peptides included in target databases may result in underestimated false discovery rate (FDR). On the other hand, inflation of decoy databases could decrease the sensitivity of peptide identification due to the increased number of high-scoring random hits. Although several studies have addressed these issues, widely applicable guidelines for sensitive and reliable proteogenomic search have hardly been available. To systematically evaluate the effect of database inflation in proteogenomic searches, we constructed a variety of real and simulated proteogenomic databases for yeast and human tandem mass spectrometry (MS/MS) data, respectively. Against these databases, we tested two popular database search tools with various approaches to search result validation: the target-decoy search strategy (with and without a refined scoring-metric) and a mixture model-based method. The effect of separate filtering of known and novel peptides was also examined. The results from real and simulated proteogenomic searches confirmed that separate filtering increases the sensitivity and reliability in proteogenomic search. However, no one method consistently identified the largest (or the smallest) number of novel peptides from real proteogenomic searches. We propose to use a set of search result validation methods with separate filtering, for sensitive and reliable identification of peptides in proteogenomic search.
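
The separate-filtering idea can be sketched with a toy target–decoy FDR procedure (scores invented; real pipelines derive q-values from dedicated post-processors): known and novel peptide–spectrum matches are filtered at the FDR threshold independently rather than in one pooled list.

```python
def targets_at_fdr(psms, alpha):
    """psms: list of (score, is_decoy). Return the number of target PSMs
    accepted at the largest score cutoff whose decoy/target ratio <= alpha."""
    decoys = targets = accepted = 0
    for score, is_decoy in sorted(psms, key=lambda p: -p[0]):
        decoys += is_decoy
        targets += not is_decoy
        if decoys / max(targets, 1) <= alpha:
            accepted = targets
    return accepted

# Known peptides tend to score high; the few novel peptides score lower,
# so pooling them lets known-peptide decoys mask the novel hits.
known = [(10, False), (9, False), (8, False), (7, False), (6, False), (5, True)]
novel = [(4, False), (3, True), (2, False)]

combined = targets_at_fdr(known + novel, alpha=0.1)
separate = targets_at_fdr(known, alpha=0.1) + targets_at_fdr(novel, alpha=0.1)
```

In this toy example the pooled filter accepts only the known peptides, while separate filtering also recovers a novel peptide at the same nominal FDR, which is the effect the study evaluates.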

  19. Particle Filtering with Region-based Matching for Tracking of Partially Occluded and Scaled Targets*

    PubMed Central

    Nakhmani, Arie; Tannenbaum, Allen

    2012-01-01

    Visual tracking of arbitrary targets in clutter is important for a wide range of military and civilian applications. We propose a general framework for the tracking of scaled and partially occluded targets, which do not necessarily have prominent features. The algorithm proposed in the present paper utilizes a modified normalized cross-correlation as the likelihood for a particle filter. The algorithm divides the template, selected by the user in the first video frame, into numerous patches. The matching process of these patches by particle filtering allows one to handle the target’s occlusions and scaling. Experimental results with fixed rectangular templates show that the method is reliable for videos with nonstationary, noisy, and cluttered background, and provides accurate trajectories in cases of target translation, scaling, and occlusion. PMID:22506088
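
The modified normalized cross-correlation at the heart of the likelihood can be sketched as plain NCC template matching (toy synthetic frame; the paper applies this per patch inside a particle filter):

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equally sized patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

rng = np.random.default_rng(1)
frame = rng.random((60, 60))
template = frame[20:30, 25:35].copy()   # "user-selected" target region

# Exhaustive scan: score every window; the best score marks the target.
best, best_pos = -1.0, None
H, W = template.shape
for r in range(frame.shape[0] - H + 1):
    for c in range(frame.shape[1] - W + 1):
        s = ncc(frame[r:r + H, c:c + W], template)
        if s > best:
            best, best_pos = s, (r, c)
```

In the tracker, each particle evaluates this score only at its hypothesized location for each template patch, so occluded patches simply contribute low likelihoods instead of corrupting the whole match.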

  20. Custom oligonucleotide array-based CGH: a reliable diagnostic tool for detection of exonic copy-number changes in multiple targeted genes

    PubMed Central

    Vasson, Aurélie; Leroux, Céline; Orhant, Lucie; Boimard, Mathieu; Toussaint, Aurélie; Leroy, Chrystel; Commere, Virginie; Ghiotti, Tiffany; Deburgrave, Nathalie; Saillour, Yoann; Atlan, Isabelle; Fouveaut, Corinne; Beldjord, Cherif; Valleix, Sophie; Leturcq, France; Dodé, Catherine; Bienvenu, Thierry; Chelly, Jamel; Cossée, Mireille

    2013-01-01

    The frequency of disease-related large rearrangements (referred to as copy-number mutations, CNMs) varies among genes, and the search for these mutations has an important place in diagnostic strategies. In recent years, the CGH method using custom-designed high-density oligonucleotide-based arrays has become a powerful tool for the detection of alterations at the exon level, and it provides flexibility through the possibility of modeling chips. The aim of our study was to test a custom-designed oligonucleotide CGH array in a diagnostic laboratory setting that analyzes several genes involved in various genetic diseases, and to compare it with conventional strategies. To this end, we designed a 12-plex CGH array (135k; 135,000 probes/subarray) (Roche NimbleGen) with exonic and intronic oligonucleotide probes covering 26 genes routinely analyzed in the laboratory. We tested control samples with known CNMs and patients for whom the genetic causes underlying their disorders were unknown. The contribution of this technique is undeniable: it proved reproducible, reliable, and sensitive enough to detect heterozygous single-exon deletions or duplications, complex rearrangements, and somatic mosaicism. In addition, it improves the reliability of CNM detection and allows determination of boundaries precisely enough to direct targeted sequencing of breakpoints. All of these points, together with the possibility of simultaneous analysis of several genes and 'homemade' scalability, make it a valuable tool as a new diagnostic approach for CNMs. PMID:23340513
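
The expected exon-level signal can be illustrated with log2 test/reference ratios (toy values and thresholds, not the study's calling pipeline): a heterozygous single-exon deletion gives a ratio near log2(1/2) = -1, and a heterozygous duplication a ratio near log2(3/2) ≈ 0.58.

```python
import numpy as np

# Per-exon log2(test/reference) ratios for a toy 6-exon gene: exon index 2
# carries a heterozygous deletion, exon index 4 a heterozygous duplication.
ratios = np.array([0.02, -0.05, -1.0, 0.03, 0.58, 0.01])

def call_cnm(log2r, del_thr=-0.5, dup_thr=0.4):
    """Threshold-based exon-level copy-number calls (thresholds invented)."""
    calls = []
    for i, r in enumerate(log2r):
        if r < del_thr:
            calls.append((i, "del"))
        elif r > dup_thr:
            calls.append((i, "dup"))
    return calls

calls = call_cnm(ratios)
```

Production analyses additionally smooth across adjacent probes and model noise per array, which is what makes single-exon events and mosaicism callable reliably.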

  1. Vehicle security encryption based on unlicensed encryption

    NASA Astrophysics Data System (ADS)

    Huang, Haomin; Song, Jing; Xu, Zhijia; Ding, Xiaoke; Deng, Wei

    2018-03-01

    Current vehicle keys are easily destroyed or damaged, so the use of an elliptic-curve encryption algorithm is proposed to improve the reliability of the vehicle security system. Based on elliptic-curve encryption rules, the chip's framework and hardware structure were designed, and the chip's calculation process was then simulated and analyzed in software. The simulation achieved the expected target. Finally, some issues in the data calculation concerning the chip's storage control and other modules are pointed out.
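
The elliptic-curve arithmetic underlying such a scheme can be sketched on a toy curve (a textbook-sized example, y² = x³ + 2x + 2 over GF(17), far too small for real security; the paper's actual curve and chip design are not reproduced here):

```python
P_MOD, A = 17, 2            # toy prime field and curve parameter a
G = (5, 1)                  # generator on y^2 = x^3 + 2x + 2 (mod 17)
O = None                    # point at infinity

def inv(x):
    return pow(x, P_MOD - 2, P_MOD)   # modular inverse via Fermat's little theorem

def add(P, Q):
    """Elliptic-curve point addition."""
    if P is O: return Q
    if Q is O: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % P_MOD == 0:
        return O                       # P + (-P) = infinity
    if P == Q:
        lam = (3 * P[0] ** 2 + A) * inv(2 * P[1]) % P_MOD
    else:
        lam = (Q[1] - P[1]) * inv(Q[0] - P[0]) % P_MOD
    x = (lam ** 2 - P[0] - Q[0]) % P_MOD
    y = (lam * (P[0] - x) - P[1]) % P_MOD
    return (x, y)

def mul(k, P):
    """Double-and-add scalar multiplication k*P."""
    R = O
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

# Diffie-Hellman-style agreement: both sides derive the same shared point,
# which is the property a key/immobilizer handshake would build on.
a_priv, b_priv = 3, 7
shared_a = mul(a_priv, mul(b_priv, G))
shared_b = mul(b_priv, mul(a_priv, G))
```

A hardware implementation maps the same double-and-add loop onto field-arithmetic modules, which is where the storage-control issues mentioned in the abstract arise.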

  2. A quantitative framework for the forward design of synthetic miRNA circuits.

    PubMed

    Bloom, Ryan J; Winkler, Sally M; Smolke, Christina D

    2014-11-01

    Synthetic genetic circuits incorporating regulatory components based on RNA interference (RNAi) have been used in a variety of systems. A comprehensive understanding of the parameters that determine the relationship between microRNA (miRNA) and target expression levels is lacking. We describe a quantitative framework supporting the forward engineering of gene circuits that incorporate RNAi-based regulatory components in mammalian cells. We developed a model that captures the quantitative relationship between miRNA and target gene expression levels as a function of parameters, including mRNA half-life and miRNA target-site number. We extended the model to synthetic circuits that incorporate protein-responsive miRNA switches and designed an optimized miRNA-based protein concentration detector circuit that noninvasively measures small changes in the nuclear concentration of β-catenin owing to induction of the Wnt signaling pathway. Our results highlight the importance of methods for guiding the quantitative design of genetic circuits to achieve robust, reliable and predictable behaviors in mammalian cells.
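
The qualitative shape of such a miRNA–target relationship can be sketched with a minimal steady-state model (our own toy formulation, not the paper's fitted model): production α, intrinsic decay δ = ln2/half-life, and miRNA-mediated degradation scaling with target-site number.

```python
import math

def target_steady_state(alpha, half_life, k, mirna, n_sites=1):
    """Steady state of dm/dt = alpha - delta*m - n_sites*k*mirna*m."""
    delta = math.log(2) / half_life
    return alpha / (delta + n_sites * k * mirna)

# More miRNA or more target sites gives stronger knockdown (toy parameters).
low  = target_steady_state(alpha=10, half_life=2.0, k=0.1, mirna=5.0, n_sites=4)
high = target_steady_state(alpha=10, half_life=2.0, k=0.1, mirna=5.0, n_sites=1)
off  = target_steady_state(alpha=10, half_life=2.0, k=0.1, mirna=0.0)
```

The model shows the two parameters the abstract highlights, mRNA half-life (through δ) and target-site number, directly setting the depth of repression.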

  3. Highly sensitive detection of target molecules using a new fluorescence-based bead assay

    NASA Astrophysics Data System (ADS)

    Scheffler, Silvia; Strauß, Denis; Sauer, Markus

    2007-07-01

    Development of immunoassays with improved sensitivity, specificity, and reliability is of major interest in modern bioanalytical research. We describe the development of a new immunomagnetic fluorescence detection (IM-FD) assay based on specific antigen/antibody interactions and on accumulation of the fluorescence signal on superparamagnetic PE beads, in combination with the use of extrinsic fluorescent labels. IM-FD can be easily modified by varying the order of coatings and the assay conditions. Depending on the target molecule, antibodies (ABs), entire proteins, or small protein epitopes can be used as capture molecules. The presence of target molecules is detected by fluorescence microscopy using fluorescently labeled secondary or detection antibodies. Here, we demonstrate the potential of the new assay by detecting two tumor markers, IGF-I and p53 antibodies, in the clinically relevant concentration range. Our data show that the fluorescence-based bead assay exhibits a large dynamic range and a high sensitivity down to the subpicomolar level.

  4. Thermal Management and Reliability of Automotive Power Electronics and Electric Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narumanchi, Sreekant V; Bennion, Kevin S; Cousineau, Justine E

    Low-cost, high-performance thermal management technologies are helping meet aggressive power density, specific power, cost, and reliability targets for power electronics and electric machines. The National Renewable Energy Laboratory is working closely with numerous industry and research partners to help influence the development of components that meet aggressive performance and cost targets, through the development and characterization of cooling technologies and through thermal characterization and improvement of passive stack materials and interfaces. Thermomechanical reliability and lifetime estimation models are important enablers for industry in cost- and time-effective design.

  5. Rational Design of an Ultrasensitive Quorum-Sensing Switch.

    PubMed

    Zeng, Weiqian; Du, Pei; Lou, Qiuli; Wu, Lili; Zhang, Haoqian M; Lou, Chunbo; Wang, Hongli; Ouyang, Qi

    2017-08-18

    One of the purposes of synthetic biology is to develop rational methods that accelerate the design of genetic circuits, saving time and effort spent on experiments and providing reliably predictable circuit performance. We applied a reverse engineering approach to design an ultrasensitive transcriptional quorum-sensing switch. We want to explore how systems biology can guide synthetic biology in the choice of specific DNA sequences and their regulatory relations to achieve a targeted function. The workflow comprises network enumeration that achieves the target function robustly, experimental restriction of the obtained candidate networks, global parameter optimization via mathematical analysis, selection and engineering of parts based on these calculations, and finally, circuit construction based on the principles of standardization and modularization. The performance of realized quorum-sensing switches was in good qualitative agreement with the computational predictions. This study provides practical principles for the rational design of genetic circuits with targeted functions.
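
Ultrasensitivity of the sort targeted here is commonly quantified with a Hill function; the sketch below (generic, not the authors' specific circuit model) shows how the input range needed to traverse the response shrinks as the Hill coefficient n grows:

```python
def hill(x, K, n):
    """Fraction of maximal output at inducer level x (Hill kinetics)."""
    return x ** n / (K ** n + x ** n)

K = 1.0

def dynamic_range(n):
    """Fold change in input spanning 10% to 90% of the response: 81**(1/n)."""
    return 81.0 ** (1.0 / n)

graded = dynamic_range(1)       # an 81-fold input change is needed
switchlike = dynamic_range(4)   # a ~3-fold input change suffices
```

A switch-like (ultrasensitive) response thus corresponds to a large effective n, which the paper's design workflow achieves through the choice of network topology and parts.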

  6. System engineering of complex optical systems for mission assurance and affordability

    NASA Astrophysics Data System (ADS)

    Ahmad, Anees

    2017-08-01

    Affordability and reliability are equally important as the performance and development time for many optical systems for military, space and commercial applications. These characteristics are even more important for the systems meant for space and military applications where total lifecycle costs must be affordable. Most customers are looking for high performance optical systems that are not only affordable but are designed with "no doubt" mission assurance, reliability and maintainability in mind. Both US military and commercial customers are now demanding an optimum balance between performance, reliability and affordability. Therefore, it is important to employ a disciplined systems design approach for meeting the performance, cost and schedule targets while keeping affordability and reliability in mind. The US Missile Defense Agency (MDA) now requires all of their systems to be engineered, tested and produced according to the Mission Assurance Provisions (MAP). These provisions or requirements are meant to ensure complex and expensive military systems are designed, integrated, tested and produced with the reliability and total lifecycle costs in mind. This paper describes a system design approach based on the MAP document for developing sophisticated optical systems that are not only cost-effective but also deliver superior and reliable performance during their intended missions.

  7. A Methodology for Studying the Relationship between Comprehension and Second Language Development in a Comprehension-Based ESL Program.

    ERIC Educational Resources Information Center

    Paribakht, T. Sima; Wesche, Marjorie Bingham

    A study investigated the role of comprehension of meaningful language input in young adults' second language learning, focusing on: (1) what kinds of measurement instruments and procedures can be used in tracking student gains in specific aspects of target language proficiency; (2) development of a reliable self-report scale capturing different…

  8. Predicting Vandalism in a General Youth Sample via the HEW Youth Development Model's Community Program Impact Scales, Age, and Sex.

    ERIC Educational Resources Information Center

    Truckenmiller, James L.

    The former HEW National Strategy for Youth Development model was a community-based planning and procedural tool to enhance and to prevent delinquency through a process of youth needs assessments, needs targeted programs, and program impact evaluation. The program's 12 Impact Scales have been found to have acceptable reliabilities, substantial…

  9. Pinch aperture proprioception: reliability and feasibility study

    PubMed Central

    Yahya, Abdalghani; von Behren, Timothy; Levine, Shira; dos Santos, Marcio

    2018-01-01

    [Purpose] To establish the reliability and feasibility of a novel pinch aperture device to measure proprioceptive joint position sense. [Subjects and Methods] Reliability of the pinch aperture device was assessed in 21 healthy subjects. Following familiarization with a 15° target position of the index finger and thumb, subjects performed 5 trials in which they attempted to actively reproduce the target position without visual feedback. This procedure was repeated at a testing session on a separate date, and the between-session intraclass correlation coefficient (ICC) was calculated. In addition, extensor tendon vibration was applied to 19 healthy subjects, and paired t-tests were conducted to compare performance under vibration and no-vibration conditions. Pinch aperture proprioception was also assessed in two individuals with known diabetic neuropathy. [Results] The pinch aperture device demonstrated excellent reliability in healthy subjects (ICC 0.88, 95% confidence interval 0.70–0.95). Tendon vibration disrupted pinch aperture proprioception, causing subjects to undershoot the target position (18.1 ± 2.6° vs. 14.8 ± 0.76°, p < 0.001). This tendency to undershoot the target position was also noted in individuals with diabetic neuropathy. [Conclusion] This study describes a reliable, feasible, and functional means of measuring finger proprioception. Further research should investigate the assessment and implications of pinch aperture proprioception in neurological and orthopedic populations. PMID:29765192
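
    Between-session agreement in studies like this one is typically quantified with a two-way random-effects intraclass correlation. As a minimal sketch (not the authors' code; the matched-position data below are invented for illustration), ICC(2,1) can be computed directly from an n-subjects × k-sessions table:

```python
def icc_2_1(data):
    """Two-way random-effects, absolute-agreement ICC(2,1).

    data: list of rows (subjects), each a list of k session scores.
    """
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]

    # Mean squares from the two-way ANOVA decomposition.
    msr = k * sum((rm - grand) ** 2 for rm in row_means) / (n - 1)
    msc = n * sum((cm - grand) ** 2 for cm in col_means) / (k - 1)
    sse = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical matched-position errors (degrees), 5 subjects x 2 sessions.
scores = [[14.2, 14.8], [15.1, 15.3], [13.9, 14.1], [16.0, 15.6], [14.7, 14.9]]
print(round(icc_2_1(scores), 2))  # → 0.87
```

    Identical scores across sessions drive the estimate to 1; session-to-session noise pulls it down, which is what the reported ICC of 0.88 summarizes.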

  10. A High-Speed Target-Free Vision-Based Sensor for Bus Rapid Transit Viaduct Vibration Measurements Using CMT and ORB Algorithms.

    PubMed

    Hu, Qijun; He, Songsheng; Wang, Shilong; Liu, Yugang; Zhang, Zutao; He, Leping; Wang, Fubin; Cai, Qijie; Shi, Rendan; Yang, Yuan

    2017-06-06

    Bus Rapid Transit (BRT) has become an increasing concern for public transportation in modern cities. Traditional contact sensing techniques for health monitoring of BRT viaducts cannot overcome the deficiency that the normal free flow of traffic is blocked during measurement. Advances in computer vision technology provide a new line of thought for solving this problem. In this study, a high-speed target-free vision-based sensor is proposed to measure the vibration of structures without interrupting traffic. An improved keypoint matching algorithm based on the consensus-based matching and tracking (CMT) object tracking algorithm is adopted and further developed together with the oriented FAST and rotated BRIEF (ORB) keypoint detection algorithm for practicable and effective object tracking. Moreover, by synthesizing the existing scaling factor calculation methods, more rational approaches to reducing errors are implemented. The performance of the vision-based sensor is evaluated through a series of laboratory tests. Experimental tests with different target types, frequencies, amplitudes and motion patterns are conducted. The performance of the method is satisfactory, which indicates that the vision sensor can extract accurate structural vibration signals by tracking either artificial or natural targets. Field tests further demonstrate that the vision sensor is both practicable and reliable.
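
    The scaling factor mentioned above is what turns tracked pixel motion into physical displacement. A minimal sketch of that conversion, with invented numbers and a known reference dimension in the image as the assumed calibration source:

```python
def scale_factor_mm_per_px(known_length_mm, measured_length_px):
    """Pixel-to-millimetre scale from a reference dimension in the image,
    e.g. a girder depth of known size spanning a measured pixel count."""
    return known_length_mm / measured_length_px

def displacement_mm(pixel_track, factor):
    """Convert a tracked pixel coordinate series to physical displacement
    relative to the first frame."""
    origin = pixel_track[0]
    return [(p - origin) * factor for p in pixel_track]

factor = scale_factor_mm_per_px(300.0, 150.0)   # 300 mm feature spans 150 px
print(displacement_mm([200.0, 201.5, 199.0], factor))  # → [0.0, 3.0, -2.0]
```

    Errors in the measured reference length propagate linearly into every displacement sample, which is why the paper compares several scaling-factor calculation methods.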

  11. A High-Speed Target-Free Vision-Based Sensor for Bus Rapid Transit Viaduct Vibration Measurements Using CMT and ORB Algorithms

    PubMed Central

    Hu, Qijun; He, Songsheng; Wang, Shilong; Liu, Yugang; Zhang, Zutao; He, Leping; Wang, Fubin; Cai, Qijie; Shi, Rendan; Yang, Yuan

    2017-01-01

    Bus Rapid Transit (BRT) has become an increasing concern for public transportation in modern cities. Traditional contact sensing techniques for health monitoring of BRT viaducts cannot overcome the deficiency that the normal free flow of traffic is blocked during measurement. Advances in computer vision technology provide a new line of thought for solving this problem. In this study, a high-speed target-free vision-based sensor is proposed to measure the vibration of structures without interrupting traffic. An improved keypoint matching algorithm based on the consensus-based matching and tracking (CMT) object tracking algorithm is adopted and further developed together with the oriented FAST and rotated BRIEF (ORB) keypoint detection algorithm for practicable and effective object tracking. Moreover, by synthesizing the existing scaling factor calculation methods, more rational approaches to reducing errors are implemented. The performance of the vision-based sensor is evaluated through a series of laboratory tests. Experimental tests with different target types, frequencies, amplitudes and motion patterns are conducted. The performance of the method is satisfactory, which indicates that the vision sensor can extract accurate structural vibration signals by tracking either artificial or natural targets. Field tests further demonstrate that the vision sensor is both practicable and reliable. PMID:28587275

  12. Doppler Feature Based Classification of Wind Profiler Data

    NASA Astrophysics Data System (ADS)

    Sinha, Swati; Chandrasekhar Sarma, T. V.; Lourde. R, Mary

    2017-01-01

    Wind Profilers (WP) are coherent pulsed Doppler radars operating in the UHF and VHF bands. They are used for vertical profiling of wind velocity and direction. This information is very useful for weather modeling, the study of climatic patterns, and weather prediction. Observations at different heights and different wind velocities are possible by changing the operating parameters of the WP. A set of Doppler power spectra is the standard form of WP data. Wind velocity, direction and wind velocity turbulence at different heights can be derived from it. Modern wind profilers operate for long durations and generate approximately 4 megabytes of data per hour. The radar data stream contains Doppler power spectra from different radar configurations with echoes from different atmospheric targets. In order to facilitate systematic study, this data needs to be segregated according to the type of target. A reliable automated target classification technique is required for this job. Classical techniques of radar target identification use pattern matching and minimization of measures such as mean squared error or Euclidean distance. These techniques are not effective for the classification of WP echoes, as these targets do not have well-defined signatures in the Doppler power spectrum. This paper presents an effective target classification technique based on range-Doppler features.
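
    The wind velocity and turbulence quantities mentioned above are conventionally derived as low-order moments of the Doppler power spectrum; features of this kind also underpin range-Doppler classification. A minimal sketch with an invented spectrum:

```python
def spectral_moments(velocities, powers):
    """Zeroth moment (total power), first moment (mean Doppler velocity)
    and second moment (spectral width) of a Doppler power spectrum."""
    p0 = sum(powers)
    mean_v = sum(v * p for v, p in zip(velocities, powers)) / p0
    var_v = sum(p * (v - mean_v) ** 2 for v, p in zip(velocities, powers)) / p0
    return p0, mean_v, var_v ** 0.5

# Hypothetical single-range-gate spectrum with a peak near +2 m/s.
vels = [-4.0, -2.0, 0.0, 2.0, 4.0]
pows = [0.1, 0.2, 1.0, 4.0, 0.7]
p0, v_mean, width = spectral_moments(vels, pows)
```

    The mean velocity lands near the spectral peak and the width grows with turbulence; a broad or multimodal spectrum, as from birds or rain, is exactly what defeats simple pattern-matching classifiers.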

  13. Promoting the Quality of Health Research-based News: Introduction of a Tool

    PubMed Central

    Ashoorkhani, Mahnaz; Majdzadeh, Reza; Nedjat, Saharnaz; Gholami, Jaleh

    2017-01-01

    Introduction: While disseminating health research findings to the public, it is very important to present appropriate and accurate information to give the target audience a correct understanding of the subject matter. The objective of this study was to design and psychometrically evaluate a checklist for health journalists to help them prepare news of appropriate accuracy and authenticity. Methods: The study consisted of two phases, checklist design and psychometrics. Literature review and expert opinion were used to extract the items of the checklist in the first phase. In the second phase, to assess content and face validity, the judgment of 38 persons (epidemiologists with a tool production history, editors-in-chief, and health journalists) was used to check the items’ understandability, nonambiguity, relevancy, and clarity. Reliability was assessed by the test–retest method using intraclass correlation coefficient (ICC) indices in the two phases. Cronbach's alpha was used to assess the internal consistency of the checklist. Results: Based on the participants’ opinions, the items were reduced from 20 to 14 in number. The items were categorized into the following three domains: (a) items assessing the source of news and its validity, (b) items addressing the presentation of complete and accurate information on research findings, and (c) items which if adhered to lead to the target audiences’ better understanding. The checklist was approved for content and face validity. The reliability of the checklist was assessed in the last stage; the ICC was 1 for 12 items and above 0.8 for the other two. Internal consistency (Cronbach's alpha) was 0.98. Discussion and Conclusions: The resultant indices of the study indicate that the checklist has appropriate validity and reliability. Hence, it can be used by health journalists to develop health research-based news. PMID:29184638
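
    The internal consistency figure reported above (Cronbach's alpha = 0.98) follows a standard formula: alpha = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch with invented ratings (not the study's data):

```python
def cronbach_alpha(scores):
    """scores: list of respondent rows, each a list of k item scores.
    Uses population variances; the ratio is unchanged as long as the
    same variance definition is used throughout."""
    k = len(scores[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings from 4 respondents on a 3-item subset of a checklist.
ratings = [[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 3, 2]]
print(round(cronbach_alpha(ratings), 2))  # → 0.96
```

    When items move together across respondents, total-score variance dominates the summed item variances and alpha approaches 1, as in the checklist's reported 0.98.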

  14. Reliability of Instruments Measuring At-Risk and Problem Gambling Among Young Individuals: A Systematic Review Covering Years 2009-2015.

    PubMed

    Edgren, Robert; Castrén, Sari; Mäkelä, Marjukka; Pörtfors, Pia; Alho, Hannu; Salonen, Anne H

    2016-06-01

    This review aims to clarify which instruments measuring at-risk and problem gambling (ARPG) among youth are reliable and valid in light of reported estimates of internal consistency, classification accuracy, and psychometric properties. A systematic search was conducted in PubMed, Medline, and PsycInfo covering the years 2009-2015. In total, 50 original research articles fulfilled the inclusion criteria: target age under 29 years, using an instrument designed for youth, and reporting a reliability estimate. Articles were evaluated with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Reliability estimates were reported for five ARPG instruments. Most studies (66%) evaluated the South Oaks Gambling Screen Revised for Adolescents. The Gambling Addictive Behavior Scale for Adolescents was the only novel instrument. In general, the evaluation of instrument reliability was superficial. Despite its rare use, the Canadian Adolescent Gambling Inventory (CAGI) had a strong theoretical and methodological base. The Gambling Addictive Behavior Scale for Adolescents and the CAGI were the only instruments originally developed for youth. All studies, except the CAGI study, were population based. ARPG instruments for youth have not been rigorously evaluated yet. Further research is needed especially concerning instruments designed for clinical use. Copyright © 2016 The Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  15. A Power Conditioning Stage Based on Analog-Circuit MPPT Control and a Superbuck Converter for Thermoelectric Generators in Spacecraft Power Systems

    NASA Astrophysics Data System (ADS)

    Sun, Kai; Wu, Hongfei; Cai, Yan; Xing, Yan

    2014-06-01

    A thermoelectric generator (TEG) is a very important kind of power supply for spacecraft, especially for deep-space missions, due to its long lifetime and high reliability. To develop a practical TEG power supply for spacecraft, a power conditioning stage is indispensable, being employed to convert the varying output voltage of the TEG modules to a definite voltage for feeding batteries or loads. To enhance the system reliability, a power conditioning stage based on analog-circuit maximum-power-point tracking (MPPT) control and a superbuck converter is proposed in this paper. The input of this power conditioning stage is connected to the output of the TEG modules, and the output of this stage is connected to the battery and loads. The superbuck converter is employed as the main circuit, featuring low input current ripples and high conversion efficiency. Since for spacecraft power systems reliable operation is the key target for control circuits, a reset-set flip-flop-based analog circuit is used as the basic control circuit to implement MPPT, being much simpler than digital control circuits and offering higher reliability. Experiments have verified the feasibility and effectiveness of the proposed power conditioning stage. The results show the advantages of the proposed stage, such as maximum utilization of TEG power, small input ripples, and good stability.
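
    The paper implements MPPT with an analog reset-set flip-flop circuit; purely as an illustration of the MPPT principle itself, a digital perturb-and-observe loop against a linear TEG model (V = V_oc − I·R_int, maximum power at V = V_oc/2) behaves as follows. All values are invented:

```python
def teg_power(v, v_oc=24.0, r_int=2.0):
    """Power drawn from a TEG modelled as an ideal source V_oc behind an
    internal resistance R_int; load current I = (V_oc - V) / R_int."""
    i = (v_oc - v) / r_int
    return v * i

def perturb_and_observe(v0=5.0, step=0.1, iters=500):
    """Digital perturb-and-observe MPPT sketch: keep stepping the
    operating voltage in the same direction while power rises,
    reverse direction otherwise."""
    v, direction = v0, 1
    p_prev = teg_power(v)
    for _ in range(iters):
        v += direction * step
        p = teg_power(v)
        if p < p_prev:
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()
print(round(v_mpp, 1))  # settles and oscillates around V_oc / 2 = 12.0
```

    The steady small oscillation around the maximum power point is inherent to perturb-and-observe; the paper's analog flip-flop realizes the same climb-and-reverse decision without a digital controller, trading flexibility for reliability.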

  16. NEW FRONTIERS IN DRUGGABILITY

    PubMed Central

    Kozakov, Dima; Hall, David R.; Napoleon, Raeanne L.; Yueh, Christine; Whitty, Adrian; Vajda, Sandor

    2016-01-01

    A powerful early approach to evaluating the druggability of proteins involved determining the hit rate in NMR-based screening of a library of small compounds. Here we show that a computational analog of this method, based on mapping proteins using small molecules as probes, can reliably reproduce druggability results from NMR-based screening, and can provide a more meaningful assessment in cases where the two approaches disagree. We apply the method to a large set of proteins. The results show that, because the method is based on the biophysics of binding rather than on empirical parameterization, meaningful information can be gained about classes of proteins and classes of compounds beyond those resembling validated targets and conventionally druglike ligands. In particular, the method identifies targets that, while not druggable by druglike compounds, may become druggable using compound classes such as macrocycles or other large molecules beyond the rule-of-five limit. PMID:26230724

  17. COTS-Based Fault Tolerance in Deep Space: Qualitative and Quantitative Analyses of a Bus Network Architecture

    NASA Technical Reports Server (NTRS)

    Tai, Ann T.; Chau, Savio N.; Alkalai, Leon

    2000-01-01

    Using COTS products, standards and intellectual properties (IPs) for all the system and component interfaces is a crucial step toward significant reduction of both system cost and development cost as the COTS interfaces enable other COTS products and IPs to be readily accommodated by the target system architecture. With respect to the long-term survivable systems for deep-space missions, the major challenge for us is, under stringent power and mass constraints, to achieve ultra-high reliability of the system comprising COTS products and standards that are not developed for mission-critical applications. The spirit of our solution is to exploit the pertinent standard features of a COTS product to circumvent its shortcomings, though these standard features may not be originally designed for highly reliable systems. In this paper, we discuss our experiences and findings on the design of an IEEE 1394 compliant fault-tolerant COTS-based bus architecture. We first derive and qualitatively analyze a "stack-tree topology" that not only complies with IEEE 1394 but also enables the implementation of a fault-tolerant bus architecture without node redundancy. We then present a quantitative evaluation that demonstrates significant reliability improvement from the COTS-based fault tolerance.

  18. SKYWARD: the next generation airborne infrared search and track

    NASA Astrophysics Data System (ADS)

    Fortunato, L.; Colombi, G.; Ondini, A.; Quaranta, C.; Giunti, C.; Sozzi, B.; Balzarotti, G.

    2016-05-01

    Infrared Search and Track systems are an essential element of modern and future combat aircraft. Passive automatic search, detection and tracking functions are key points for silent operations or jammed tactical scenarios. SKYWARD represents the latest evolution of IRST technology in which high quality electro-optical components, advanced algorithms, efficient hardware and software solutions are harmonically integrated to provide high-end affordable performance. Additionally, the reduction of critical opto-mechanical elements optimises weight and volume and increases the overall reliability. Multiple operative modes dedicated to different situations are available; many options can be selected among multiple or single target tracking, for surveillance or engagement, and imaging, for landing or navigation aid, assuring maximum system flexibility. The high quality 2D-IR sensor is exploited by multiple parallel processing chains, based on linear and non-linear techniques, to extract possible targets from the background, in different conditions, with false alarm rate control. A widely tested track processor manages a large number of candidate targets simultaneously and allows discriminating real targets from noise whilst operating with low target-to-background contrasts. The capability of providing reliable passive range estimation is an additional qualifying element of the system. Particular care has been dedicated to the detector non-uniformities, a possible limiting factor for distant target detection, as well as to the design of the electro-optics for a harsh airborne environment. The system can be configured for LWIR or MWIR waveband according to the customer operational requirements. An embedded data recorder saves all the necessary images and data for mission debriefing, particularly useful during in-flight system integration and tuning.

  19. Going DEEP: guidelines for building simulation-based team assessments.

    PubMed

    Grand, James A; Pearce, Marina; Rench, Tara A; Chao, Georgia T; Fernandez, Rosemarie; Kozlowski, Steve W J

    2013-05-01

    Whether for team training, research or evaluation, making effective use of simulation-based technologies requires robust, reliable and accurate assessment tools. Extant literature on simulation-based assessment practices has primarily focused on scenario and instructional design; however, relatively little direct guidance has been provided regarding the challenging decisions and fundamental principles related to assessment development and implementation. The objective of this manuscript is to introduce a generalisable assessment framework supplemented by specific guidance on how to construct and ensure valid and reliable simulation-based team assessment tools. The recommendations reflect best practices in assessment and are designed to empower healthcare educators, professionals and researchers with the knowledge to design and employ valid and reliable simulation-based team assessments. Information and actionable recommendations associated with creating assessments of team processes (non-technical 'teamwork' activities) and performance (demonstration of technical proficiency) are presented which provide direct guidance on how to Distinguish the underlying competencies one aims to assess, Elaborate the measures used to capture team member behaviours during simulation activities, Establish the content validity of these measures and Proceduralise the measurement tools in a way that is systematically aligned with the goals of the simulation activity while maintaining methodological rigour (DEEP). The DEEP framework targets fundamental principles and critical activities that are important for effective assessment, and should benefit healthcare educators, professionals and researchers seeking to design or enhance any simulation-based assessment effort.

  20. [Quality indicators in the storage and dispensing process in a Hospital Pharmacy].

    PubMed

    Rabuñal-Álvarez, M T; Calvin-Lamas, M; Feal-Cortizas, B; Martínez-López, L M; Pedreira-Vázquez, I; Martín-Herranz, M I

    2014-01-01

    To establish indicators for the evaluation of the quality of the storage and dispensing processes related to semiautomatic vertical (SAVCS) and horizontal (SAHCS) carousel systems. Descriptive observational study conducted between January and December 2012. Quality indicators were defined, a target value was established for each, and the obtained value was calculated for 2012. Five quality indicators in the process of storage and dispensing of drugs were defined and calculated: indicator 1, error filling unidose trolleys: target (<1.67%), obtained (1.03%); indicator 2, filling accuracy of unidose trolleys using an SAVCS: target (<15%), obtained (11.5%); indicator 3, reliability of drug inventory in the process of drug entries using an SAHCS: target (<15%), obtained (6.53%); indicator 4, reliability of drug inventory in the picking process of orders replacing stock of clinical units using an SAHCS: target (<10%), obtained (1.97%); indicator 5, accuracy of the picking process of drug orders using an SAHCS: target (<10%), obtained (10.41%). Establishing indicators has allowed the quality, in terms of safety, precision and reliability, of semiautomatic systems for storage and dispensing of drugs to be assessed. Copyright © 2014 SECA. Published by Elsevier España. All rights reserved.

  1. Assessing Reliability of Cold Spray Sputter Targets in Photovoltaic Manufacturing

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar; Vlcek, Johannes; Bheemreddy, Venkata; Juliano, Daniel

    2017-10-01

    Cold spray has been used to manufacture more than 800 Cu-In-Ga (CIG) sputter targets for deposition of high-efficiency photovoltaic thin films. It is a preferred technique since it enables high deposit purity and transfer of non-equilibrium alloy states to the target material. In this work, an integrated approach to reliability assessment of such targets with deposit weight in excess of 50 lb. is undertaken, involving thermal-mechanical characterization of the material in as-deposited condition, characterization of the interface adhesion on a cylindrical substrate in as-deposited condition, and developing means to assess target integrity under thermal-mechanical loads during the physical vapor deposition (PVD) sputtering process. Mechanical characterization of cold spray deposited CIG alloy is accomplished through indentation testing and an adaptation of the Brazilian disk test. A custom lever test was developed to characterize adhesion along the cylindrical interface between the CIG deposit and the substrate, overcoming limitations of current standards. A cohesive zone model for crack initiation and propagation at the deposit interface is developed and validated using the lever test and later used to simulate potential catastrophic target failure in the PVD process. It is shown that this approach enables reliability assessment of sputter targets and improves robustness.

  2. Development and psychometric testing of a trans-professional evidence-based practice profile questionnaire.

    PubMed

    McEvoy, Maureen Patricia; Williams, Marie T; Olds, Timothy Stephen

    2010-01-01

    Previous survey tools operationalising knowledge, attitudes or beliefs about evidence-based practice (EBP) have shortcomings in content, psychometric properties and target audience. This study developed and psychometrically assessed a self-report trans-professional questionnaire to describe an EBP profile. Sixty-six items were collated from existing EBP questionnaires and administered to 526 academics and students from health and non-health backgrounds. Principal component factor analysis revealed the presence of five factors (Relevance, Terminology, Confidence, Practice and Sympathy). Following expert panel review and pilot testing, the 58-item final questionnaire was disseminated to 105 subjects on two occasions. Test-retest and internal reliability were quantified using intra-class correlation coefficients (ICCs) and Cronbach's alpha; convergent validity was assessed against a commonly used EBP questionnaire using Pearson's correlation coefficient; and discriminative validity was assessed via analysis of variance (ANOVA) based on exposure to EBP training. The final questionnaire demonstrated acceptable internal consistency (Cronbach's alpha 0.96), test-retest reliability (ICCs range 0.77-0.94) and convergent validity (Practice 0.66, Confidence 0.80 and Sympathy 0.54). Three factors (Relevance, Terminology and Confidence) distinguished EBP exposure groups (ANOVA p < 0.001-0.004). The evidence-based practice profile (EBP(2)) questionnaire is a reliable instrument with the ability to discriminate for three factors, between respondents with differing EBP exposures.

  3. Statistical Modeling of Single Target Cell Encapsulation

    PubMed Central

    Moon, SangJun; Ceyhan, Elvan; Gurkan, Umut Atakan; Demirci, Utkan

    2011-01-01

    High-throughput drop-on-demand separation and encapsulation of individual target cells from heterogeneous mixtures of multiple cell types is an emerging method in biotechnology with broad applications in tissue engineering and regenerative medicine, genomics, and cryobiology. However, cell encapsulation in droplets is a random process that is hard to control. Statistical models can provide an understanding of the underlying processes and estimation of the relevant parameters, and enable reliable and repeatable control over the encapsulation of cells in droplets during the isolation process with a high confidence level. We have modeled and experimentally verified a microdroplet-based cell encapsulation process for various combinations of cell loading and target cell concentrations. Here, we explain theoretically and validate experimentally a model to isolate and pattern single target cells from heterogeneous mixtures without using complex peripheral systems. PMID:21814548
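
    The paper's full statistical model is not reproduced here, but the core of droplet encapsulation statistics is commonly Poisson: if droplets capture cells with mean loading λ and a fraction f of cells are targets, the number of target cells per droplet follows Poisson(λf), and the single-target yield peaks at λf = 1. A minimal sketch under that assumption:

```python
import math

def p_exactly_k(lam, k):
    """Poisson probability of k cells in a droplet given mean loading lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def p_single_target(cell_loading, target_fraction):
    """Probability a droplet holds exactly one target cell, assuming
    target cells per droplet ~ Poisson(cell_loading * target_fraction)."""
    return p_exactly_k(cell_loading * target_fraction, 1)

# Single-target yield is maximal when the mean target occupancy equals 1:
print(round(p_single_target(10.0, 0.1), 3))  # → 0.368 (= 1/e)
```

    The 1/e ceiling is why encapsulation parameters (cell loading, dilution) must be tuned jointly against the target cell concentration rather than chosen independently.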

  4. Test-retest reliability of evoked BOLD signals from a cognitive-emotive fMRI test battery.

    PubMed

    Plichta, Michael M; Schwarz, Adam J; Grimm, Oliver; Morgen, Katrin; Mier, Daniela; Haddad, Leila; Gerdes, Antje B M; Sauer, Carina; Tost, Heike; Esslinger, Christine; Colman, Peter; Wilson, Frederick; Kirsch, Peter; Meyer-Lindenberg, Andreas

    2012-04-15

    Even more than in cognitive research applications, moving fMRI to the clinic and the drug development process requires the generation of stable and reliable signal changes. The performance characteristics of the fMRI paradigm constrain experimental power and may require different study designs (e.g., crossover vs. parallel groups), yet fMRI reliability characteristics can be strongly dependent on the nature of the fMRI task. The present study investigated both within-subject and group-level reliability of a combined three-task fMRI battery targeting three systems of wide applicability in clinical and cognitive neuroscience: an emotional (face matching), a motivational (monetary reward anticipation) and a cognitive (n-back working memory) task. A group of 25 young, healthy volunteers were scanned twice on a 3T MRI scanner with a mean test-retest interval of 14.6 days. FMRI reliability was quantified using the intraclass correlation coefficient (ICC) applied at three different levels ranging from a global to a localized and fine spatial scale: (1) reliability of group-level activation maps over the whole brain and within targeted regions of interest (ROIs); (2) within-subject reliability of ROI-mean amplitudes and (3) within-subject reliability of individual voxels in the target ROIs. Results showed robust evoked activation of all three tasks in their respective target regions (emotional task=amygdala; motivational task=ventral striatum; cognitive task=right dorsolateral prefrontal cortex and parietal cortices) with high effect sizes (ES) of ROI-mean summary values (ES=1.11-1.44 for the faces task, 0.96-1.43 for the reward task, 0.83-2.58 for the n-back task). Reliability of group level activation was excellent for all three tasks with ICCs of 0.89-0.98 at the whole brain level and 0.66-0.97 within target ROIs. 
    Within-subject reliability of ROI-mean amplitudes across sessions was fair to good for the reward task (ICCs = 0.56-0.62) and, dependent on the particular ROI, also fair to good for the n-back task (ICCs = 0.44-0.57) but lower for the faces task (ICCs = -0.02 to 0.16). In conclusion, all three tasks are well suited to between-subject designs, including imaging genetics. When specific recommendations are followed, the n-back and reward task are also suited for within-subject designs, including pharmaco-fMRI. The present study provides task-specific fMRI reliability performance measures that will inform the optimal use, powering and design of fMRI studies using comparable tasks. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. A new method of small target detection based on neural network

    NASA Astrophysics Data System (ADS)

    Hu, Jing; Hu, Yongli; Lu, Xinxin

    2018-02-01

    The detection and tracking of moving dim targets in infrared imagery has been a research hotspot for many years. The target in each frame occupies only a few pixels, without any shape or structure information. Moreover, an infrared small target is often submerged in a complicated background with a low signal-to-clutter ratio, making detection very difficult. Different backgrounds exhibit different statistical properties, which makes target detection extremely complex. If the threshold segmentation is not reasonable, more noise points may survive in the final detection, which is unfavorable for recovering the trajectory of the target. Single-frame detection alone may fail to isolate the desired target and causes a high false alarm rate. We believe that combining spatial detection of suspicious targets in each frame with temporal association for target tracking increases the reliability of tracking dim targets. The detection of dim targets is divided into two parts. In the first part, we adopt a bilateral filtering method for background suppression; after threshold segmentation, the suspicious targets in each frame are extracted. We then use an LSTM (long short-term memory) neural network to predict the target coordinates in the next frame. This is a new method based on the movement characteristics of the target in an image sequence, able to respond to changes in the relationship between past and future values of the coordinate series. Simulation results demonstrate that the proposed algorithm can effectively predict the trajectory of a moving small target and works efficiently and robustly with a low false alarm rate.
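
    The paper's predictor is an LSTM; as a framework-free stand-in that illustrates the same temporal-association step, a constant-velocity prediction with a gating radius can link per-frame detections into a track. Names and thresholds below are invented:

```python
def predict_next(track):
    """Constant-velocity prediction from the last two track points
    (a simple stand-in for the paper's learned LSTM predictor)."""
    (x1, y1), (x2, y2) = track[-2], track[-1]
    return (2 * x2 - x1, 2 * y2 - y1)

def associate(track, candidates, gate=2.0):
    """Keep the candidate detection closest to the prediction,
    rejecting everything outside the gating radius as clutter."""
    px, py = predict_next(track)
    best = min(candidates, key=lambda c: (c[0] - px) ** 2 + (c[1] - py) ** 2)
    if (best[0] - px) ** 2 + (best[1] - py) ** 2 <= gate ** 2:
        return best
    return None

track = [(10.0, 5.0), (11.0, 5.5)]        # target drifting right and up
candidates = [(12.1, 6.1), (30.0, 2.0)]   # one plausible point, one clutter
print(associate(track, candidates))        # → (12.1, 6.1)
```

    A learned predictor earns its keep when the motion is not well modeled by constant velocity; the gating step, which suppresses residual noise points from segmentation, works the same way in either case.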

  6. Identification of human microRNA targets from isolated argonaute protein complexes.

    PubMed

    Beitzinger, Michaela; Peters, Lasse; Zhu, Jia Yun; Kremmer, Elisabeth; Meister, Gunter

    2007-06-01

    MicroRNAs (miRNAs) constitute a class of small non-coding RNAs that regulate gene expression on the level of translation and/or mRNA stability. Mammalian miRNAs associate with members of the Argonaute (Ago) protein family and bind to partially complementary sequences in the 3' untranslated region (UTR) of specific target mRNAs. Computer algorithms based on factors such as free binding energy or sequence conservation have been used to predict miRNA target mRNAs. Based on such predictions, up to one third of all mammalian mRNAs appear to be under miRNA regulation. However, due to the low degree of complementarity between the miRNA and its target, such computer programs are often imprecise and therefore not very reliable. Here we report the first biochemical approach to identifying miRNA targets from human cells. Using highly specific monoclonal antibodies against members of the Ago protein family, we co-immunoprecipitate Ago-bound mRNAs and identify them by cloning. Interestingly, most of the identified targets are also predicted by different computer programs. Moreover, we randomly analyzed six different target candidates and were able to experimentally validate five as miRNA targets. Our data clearly indicate that miRNA targets can be experimentally identified from Ago complexes, providing a new tool to directly analyze miRNA function.

  7. Targeted cellular ablation based on the morphology of malignant cells

    NASA Astrophysics Data System (ADS)

    Ivey, Jill W.; Latouche, Eduardo L.; Sano, Michael B.; Rossmeisl, John H.; Davalos, Rafael V.; Verbridge, Scott S.

    2015-11-01

    Treatment of glioblastoma multiforme (GBM) is especially challenging due to a shortage of methods to preferentially target diffuse infiltrative cells, and therapy-resistant glioma stem cell populations. Here we report a physical treatment method based on electrical disruption of cells, whose action depends strongly on cellular morphology. Interestingly, numerical modeling suggests that while outer lipid bilayer disruption induced by long pulses (~100 μs) is enhanced for larger cells, short pulses (~1 μs) preferentially result in high fields within the cell interior, which scale in magnitude with nucleus size. Because enlarged nuclei represent a reliable indicator of malignancy, this suggested a means of preferentially targeting malignant cells. While we demonstrate killing of both normal and malignant cells using pulsed electric fields (PEFs) to treat spontaneous canine GBM, we proposed that properly tuned PEFs might provide targeted ablation based on nuclear size. Using 3D hydrogel models of normal and malignant brain tissues, which permit high-resolution interrogation during treatment testing, we confirmed that PEFs could be tuned to preferentially kill cancerous cells. Finally, we estimated the nuclear envelope electric potential disruption needed for cell death from PEFs. Our results may be useful in safely targeting the therapy-resistant cell niches that cause recurrence of GBM tumors.
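
    The long-pulse size dependence described above is consistent with the steady-state Schwan approximation, in which the induced transmembrane potential of a spherical cell scales linearly with its radius (ΔV ≈ 1.5·E·r·cosθ). A minimal sketch with illustrative field strength and radii (not the study's values):

```python
def induced_tm_potential(e_field_v_per_m, radius_m, cos_theta=1.0):
    """Steady-state Schwan approximation for the induced transmembrane
    potential of a spherical cell in a uniform electric field."""
    return 1.5 * e_field_v_per_m * radius_m * cos_theta

# At the same applied field, a cell twice as large sees twice the membrane
# voltage, which is why long-pulse electroporation favors larger cells.
small = induced_tm_potential(1e5, 5e-6)   # 5 um radius  → 0.75 V at the pole
large = induced_tm_potential(1e5, 10e-6)  # 10 um radius → 1.5 V at the pole
```

    Short pulses escape this scaling because the membrane does not fully charge, letting the field penetrate the interior, where the nucleus-size dependence exploited by the authors takes over.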

  8. Targeted cellular ablation based on the morphology of malignant cells

    PubMed Central

    Ivey, Jill W.; Latouche, Eduardo L.; Sano, Michael B.; Rossmeisl, John H.; Davalos, Rafael V.; Verbridge, Scott S.

    2015-01-01

    Treatment of glioblastoma multiforme (GBM) is especially challenging due to a shortage of methods to preferentially target diffuse infiltrative cells and therapy-resistant glioma stem cell populations. Here we report a physical treatment method based on electrical disruption of cells, whose action depends strongly on cellular morphology. Interestingly, numerical modeling suggests that while outer lipid bilayer disruption induced by long pulses (~100 μs) is enhanced for larger cells, short pulses (~1 μs) preferentially result in high fields within the cell interior, which scale in magnitude with nucleus size. Because enlarged nuclei represent a reliable indicator of malignancy, this suggested a means of preferentially targeting malignant cells. While we demonstrate killing of both normal and malignant cells using pulsed electric fields (PEFs) to treat spontaneous canine GBM, we propose that properly tuned PEFs might provide targeted ablation based on nuclear size. Using 3D hydrogel models of normal and malignant brain tissues, which permit high-resolution interrogation during treatment testing, we confirmed that PEFs could be tuned to preferentially kill cancerous cells. Finally, we estimated the nuclear envelope electric potential disruption needed for cell death from PEFs. Our results may be useful in safely targeting the therapy-resistant cell niches that cause recurrence of GBM tumors. PMID:26596248

  9. Link-state-estimation-based transmission power control in wireless body area networks.

    PubMed

    Kim, Seungku; Eom, Doo-Seop

    2014-07-01

    This paper presents a novel transmission power control protocol to extend the lifetime of sensor nodes and to increase link reliability in wireless body area networks (WBANs). We first experimentally investigate the properties of the link states using the received signal strength indicator (RSSI). We then propose a practical transmission power control protocol based on both short- and long-term link-state estimations. The short- and long-term link-state estimations enable the transceiver to adapt the transmission power level and the target RSSI threshold range, respectively, to simultaneously satisfy the requirements of energy efficiency and link reliability. Finally, the performance of the proposed protocol is experimentally evaluated in two experimental scenarios (body posture change and dynamic body motion) and compared with typical WBAN transmission power control protocols: a real-time reactive scheme and a dynamic postural position inference mechanism. From the experimental results, it is found that the proposed protocol increases the lifetime of the sensor nodes by a maximum of 9.86% and enhances link reliability by reducing packet loss by a maximum of 3.02%.
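    The control logic described above, short-term RSSI smoothing driving power-level adaptation toward a target threshold window, can be sketched as follows. This is a generic illustration, not the authors' protocol; the EWMA smoothing factor, power-level range, and threshold values are assumptions.

```python
def ewma_rssi(prev_est, sample, alpha=0.2):
    """Short-term link-state estimate: exponentially weighted moving
    average of RSSI samples (alpha = 0.2 is an assumed value)."""
    return (1 - alpha) * prev_est + alpha * sample

def next_power_level(level, rssi_est, target_lo, target_hi,
                     min_level=0, max_level=31):
    """One control step: raise the power level when the estimated RSSI
    falls below the target window, lower it when comfortably above,
    otherwise hold the current level."""
    if rssi_est < target_lo:
        return min(level + 1, max_level)
    if rssi_est > target_hi:
        return max(level - 1, min_level)
    return level
```

    In a long-term loop, the window [target_lo, target_hi] itself would be re-estimated from link-state history, which is the second half of the proposed scheme.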

  10. Feature extraction algorithm for space targets based on fractal theory

    NASA Astrophysics Data System (ADS)

    Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin

    2007-11-01

    In order to offer the potential of extending satellite lifetimes and reducing launch and operating costs, on-orbit satellite servicing, including repairs, upgrades and refueling, is becoming much more frequent. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking-reliability requirements of image tracking for a space surveillance system. Machine vision has been applied to research on the relative pose of spacecraft, and feature extraction is the basis of relative pose estimation. In this paper, a fractal-geometry-based edge extraction algorithm is presented that can be used in a machine vision system to determine and track the relative pose of an observed satellite during proximity operations. The method computes a fractal-dimension map of the gray-level image using the Differential Box-Counting (DBC) approach of fractal theory to suppress noise. After this, consecutive edges are detected using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target but also keeps the inner details. Meanwhile, edge extraction is performed only in the moving area, which greatly reduces computation. Simulation results compare edge detection using the presented method with other detection methods, and indicate that the presented algorithm is a valid approach to the relative-pose problem for spacecraft.
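    The Differential Box-Counting step can be sketched as below. This is a generic DBC fractal-dimension estimator for an 8-bit image, not the authors' implementation; the box sizes and the use of a global slope fit are assumptions.

```python
import numpy as np

def dbc_fractal_dimension(img, box_sizes=(2, 4, 8, 16), gray_levels=256):
    """Estimate the fractal dimension of a gray-level image with
    Differential Box-Counting (DBC).

    The image is viewed as a surface (x, y, intensity). At scale s the
    plane is tiled with s x s cells; intensity is divided into boxes of
    height h = s * G / M, and each cell contributes the number of boxes
    needed to span its min..max intensity. The slope of log N(s) versus
    log(1/s) estimates the dimension (2 for a flat image, approaching 3
    for very rough ones).
    """
    img = np.asarray(img, dtype=float)
    M = min(img.shape)
    log_inv_s, log_n = [], []
    for s in box_sizes:
        h = s * gray_levels / M            # box height at this scale
        n_boxes = 0
        for i in range(0, img.shape[0] - s + 1, s):
            for j in range(0, img.shape[1] - s + 1, s):
                cell = img[i:i + s, j:j + s]
                # boxes covering this cell's intensity range
                n_boxes += int(np.ceil(cell.max() / h)
                               - np.ceil(cell.min() / h)) + 1
        log_inv_s.append(np.log(1.0 / s))
        log_n.append(np.log(n_boxes))
    slope, _ = np.polyfit(log_inv_s, log_n, 1)
    return float(slope)
```

    In an edge-extraction pipeline of the kind described, the same counting would be applied to local windows to build a fractal-dimension map, with morphological operations run on the thresholded map.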

  11. Unreliability as a Threat to Understanding Psychopathology: The Cautionary Tale of Attentional Bias

    PubMed Central

    Rodebaugh, Thomas L.; Scullin, Rachel B.; Langer, Julia K.; Dixon, David J.; Huppert, Jonathan D.; Bernstein, Amit; Zvielli, Ariel; Lenze, Eric J.

    2016-01-01

    The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically-oriented measures can only be certain if such measurements are reliable. Two pillars of NIMH’s portfolio – the Research Domain Criteria (RDoC) initiative for psychopathology and the target engagement initiative in clinical trials – cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally-used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time. We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. PMID:27322741

  12. Development of a new Rasch-based scoring algorithm for the National Eye Institute Visual Functioning Questionnaire to improve its interpretability.

    PubMed

    Petrillo, Jennifer; Bressler, Neil M; Lamoureux, Ecosse; Ferreira, Alberto; Cano, Stefan

    2017-08-14

    The NEI VFQ-25 has undergone psychometric evaluation in patients with varying ocular conditions and the general population. However, important limitations which may affect the interpretation of clinical trial results have been previously identified, such as concerns with reliability and validity. The purpose of this study was to evaluate the National Eye Institute Visual Functioning Questionnaire (NEI VFQ-25) and make recommendations for a revised scoring structure, with a view to improving its psychometric performance and interpretability. Rasch Measurement Theory analyses were conducted in two stages using pooled baseline NEI VFQ-25 data for 2487 participants with retinal diseases enrolled in six clinical trials. In stage 1, we examined: scale-to-sample targeting; thresholds for item response options; item fit statistics; stability; local dependence; and reliability. In stage 2, a post-hoc revision of the scoring structure (VFQ-28R) was created and psychometrically re-evaluated. In stage 1, we found that the NEI VFQ-25 was mis-targeted to the sample, and had disordered response thresholds (15/25 items) and mis-fitting items (8/25 items). However, items appeared to be stable (differential item functioning for three items), have minimal item dependency (one pair of items) and good reliability (person-separation index, 0.93). In stage 2, the modified Rasch-scored NEI VFQ-28-R was assessed. It comprised two broad domains: Activity Limitation (19 items) and Socio-Emotional Functioning (nine items). The NEI VFQ-28-R demonstrated improved performance with fewer disordered response thresholds (no items), less item misfit (three items) and improved population targeting (reduced ceiling effect) compared with the NEI VFQ-25. 
Compared with the original version, the proposed NEI VFQ-28-R, with Rasch-based scoring and a two-domain structure, appears to offer improved psychometric performance and interpretability of the vision-related quality of life scale for the population analysed.

  13. A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools

    DTIC Science & Technology

    1991-04-01

    designs in terms of their contributions toward forced mission termination and vehicle or function loss . Includes the ability to treat failure modes of...ABSTRACT: Inputs: MTBFs, MTTRs, support equipment costs, equipment weights and costs, available targets, military occupational specialty skill level and...US Army CECOM NAME: SPARECOST ABSTRACT: Calculates expected number of failures and performs spares holding optimization based on cost, weight , or

  14. A rainwater harvesting system reliability model based on nonparametric stochastic rainfall generator

    NASA Astrophysics Data System (ADS)

    Basinger, Matt; Montalto, Franco; Lall, Upmanu

    2010-10-01

    The reliability with which harvested rainwater can be used for flushing toilets, irrigating gardens, and topping off air-conditioning units serving multifamily residential buildings in New York City is assessed using a new rainwater harvesting (RWH) system reliability model. Although demonstrated with a specific case study, the model is portable because it is based on a nonparametric rainfall generation procedure utilizing a bootstrapped Markov chain. Precipitation occurrence is simulated using transition probabilities derived for each day of the year based on the historical probability of wet and dry day state changes. Precipitation amounts are selected from a matrix of historical values within a moving 15-day window centered on the target day. RWH system reliability is determined for user-specified catchment area and tank volume ranges using precipitation ensembles generated with the described stochastic procedure. The reliability with which NYC backyard gardens can be irrigated and air-conditioning units supplied with water harvested from local roofs exceeds 80% and 90%, respectively, for the entire range of catchment areas and tank volumes considered in the analysis. For RWH systems installed on the most commonly occurring rooftop catchment areas found in NYC (51-75 m²), toilet-flushing demand can be met with 7-40% reliability, with the lower end of the range representing buildings with high-flow toilets and no storage elements, and the upper end representing buildings that feature low-flow fixtures and storage tanks of up to 5 m³. When the reliability curves developed are used to size RWH systems to flush the low-flow toilets of all multifamily buildings found in a typical residential neighborhood in the Bronx, rooftop runoff inputs to the sewer system are reduced by approximately 28% over an average rainfall year, and potable water demand is reduced by approximately 53%.
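    The two ingredients described above, a wet/dry Markov chain with bootstrapped depths and a daily storage mass balance, can be sketched minimally as follows. Constant transition probabilities and a fixed pool of wet-day depths stand in for the paper's day-of-year statistics and moving 15-day window; the runoff coefficient is an assumption.

```python
import numpy as np

def simulate_daily_rainfall(rng, p_wet_given_dry, p_wet_given_wet,
                            wet_day_depths_mm, n_days):
    """Two-state Markov chain rainfall generator (sketch): occurrence
    follows wet/dry transition probabilities; wet-day depths are
    bootstrapped (resampled) from a pool of historical values."""
    rain = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p
        if wet:
            rain[t] = rng.choice(wet_day_depths_mm)
    return rain

def rwh_reliability(rain_mm, roof_area_m2, tank_m3, daily_demand_m3,
                    runoff_coeff=0.9):
    """Fraction of days on which stored rooftop runoff meets demand,
    from a simple daily mass balance with a capped tank."""
    storage, days_met = 0.0, 0
    for r in rain_mm:
        inflow = runoff_coeff * roof_area_m2 * r / 1000.0  # mm on m2 -> m3
        storage = min(storage + inflow, tank_m3)
        if storage >= daily_demand_m3:
            storage -= daily_demand_m3
            days_met += 1
    return days_met / len(rain_mm)
```

    Reliability curves of the kind used in the study would come from evaluating rwh_reliability over an ensemble of generated rainfall traces for each catchment-area/tank-volume pair.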

  15. Scene Configuration and Object Reliability Affect the Use of Allocentric Information for Memory-Guided Reaching

    PubMed Central

    Klinghammer, Mathias; Blohm, Gunnar; Fiehler, Katja

    2017-01-01

    Previous research has shown that egocentric and allocentric information is used for coding target locations for memory-guided reaching movements. In particular, task-relevance determines the use of objects as allocentric cues. Here, we investigated the influence of scene configuration and object reliability as a function of task-relevance on allocentric coding for memory-guided reaching. For that purpose, we presented participants with images of a naturalistic breakfast scene with five objects on a table and six objects in the background. Six of these objects served as potential reach-targets (= task-relevant objects). Participants explored the scene and after a short delay, a test scene appeared with one of the task-relevant objects missing, indicating the location of the reach target. After the test scene vanished, participants performed a memory-guided reaching movement toward the target location. Besides removing one object from the test scene, we also shifted the remaining task-relevant and/or task-irrelevant objects left- or rightwards either coherently in the same direction or incoherently in opposite directions. By varying object coherence, we manipulated the reliability of task-relevant and task-irrelevant objects in the scene. In order to examine the influence of scene configuration (distributed vs. grouped arrangement of task-relevant objects) on allocentric coding, we compared the present data with our previously published data set (Klinghammer et al., 2015). We found that reaching errors systematically deviated in the direction of object shifts, but only when the objects were task-relevant and their reliability was high. However, this effect was substantially reduced when task-relevant objects were distributed across the scene leading to a larger target-cue distance compared to a grouped configuration.
No deviations of reach endpoints were observed in conditions with shifts of only task-irrelevant objects or with low object reliability irrespective of task-relevancy. Moreover, when solely task-relevant objects were shifted incoherently, the variability of reaching endpoints increased compared to coherent shifts of task-relevant objects. Our results suggest that the use of allocentric information for coding targets for memory-guided reaching depends on the scene configuration, in particular the average distance of the reach target to task-relevant objects, and the reliability of task-relevant allocentric information. PMID:28450826

  16. Scene Configuration and Object Reliability Affect the Use of Allocentric Information for Memory-Guided Reaching.

    PubMed

    Klinghammer, Mathias; Blohm, Gunnar; Fiehler, Katja

    2017-01-01

    Previous research has shown that egocentric and allocentric information is used for coding target locations for memory-guided reaching movements. In particular, task-relevance determines the use of objects as allocentric cues. Here, we investigated the influence of scene configuration and object reliability as a function of task-relevance on allocentric coding for memory-guided reaching. For that purpose, we presented participants with images of a naturalistic breakfast scene with five objects on a table and six objects in the background. Six of these objects served as potential reach-targets (= task-relevant objects). Participants explored the scene and after a short delay, a test scene appeared with one of the task-relevant objects missing, indicating the location of the reach target. After the test scene vanished, participants performed a memory-guided reaching movement toward the target location. Besides removing one object from the test scene, we also shifted the remaining task-relevant and/or task-irrelevant objects left- or rightwards either coherently in the same direction or incoherently in opposite directions. By varying object coherence, we manipulated the reliability of task-relevant and task-irrelevant objects in the scene. In order to examine the influence of scene configuration (distributed vs. grouped arrangement of task-relevant objects) on allocentric coding, we compared the present data with our previously published data set (Klinghammer et al., 2015). We found that reaching errors systematically deviated in the direction of object shifts, but only when the objects were task-relevant and their reliability was high. However, this effect was substantially reduced when task-relevant objects were distributed across the scene leading to a larger target-cue distance compared to a grouped configuration.
No deviations of reach endpoints were observed in conditions with shifts of only task-irrelevant objects or with low object reliability irrespective of task-relevancy. Moreover, when solely task-relevant objects were shifted incoherently, the variability of reaching endpoints increased compared to coherent shifts of task-relevant objects. Our results suggest that the use of allocentric information for coding targets for memory-guided reaching depends on the scene configuration, in particular the average distance of the reach target to task-relevant objects, and the reliability of task-relevant allocentric information.

  17. Interventions for Age-Related Macular Degeneration: Are Practice Guidelines Based on Systematic Reviews?

    PubMed

    Lindsley, Kristina; Li, Tianjing; Ssemanda, Elizabeth; Virgili, Gianni; Dickersin, Kay

    2016-04-01

    Are existing systematic reviews of interventions for age-related macular degeneration incorporated into clinical practice guidelines? High-quality systematic reviews should be used to underpin evidence-based clinical practice guidelines and clinical care. We examined the reliability of systematic reviews of interventions for age-related macular degeneration (AMD) and described the main findings of reliable reviews in relation to clinical practice guidelines. Eligible publications were systematic reviews of the effectiveness of treatment interventions for AMD. We searched a database of systematic reviews in eyes and vision without language or date restrictions; the database was up to date as of May 6, 2014. Two authors independently screened records for eligibility and abstracted and assessed the characteristics and methods of each review. We classified reviews as reliable when they reported eligibility criteria, comprehensive searches, methodologic quality of included studies, appropriate statistical methods for meta-analysis, and conclusions based on results. We mapped treatment recommendations from the American Academy of Ophthalmology (AAO) Preferred Practice Patterns (PPPs) for AMD to systematic reviews and citations of reliable systematic reviews to support each treatment recommendation. Of 1570 systematic reviews in our database, 47 met inclusion criteria; most targeted neovascular AMD and investigated anti-vascular endothelial growth factor (VEGF) interventions, dietary supplements, or photodynamic therapy. We classified 33 (70%) reviews as reliable. The quality of reporting varied, with criteria for reliable reporting met more often by Cochrane reviews and reviews whose authors disclosed conflicts of interest. Anti-VEGF agents and photodynamic therapy were the only interventions identified as effective by reliable reviews. 
Of 35 treatment recommendations extracted from the PPPs, 15 could have been supported with reliable systematic reviews; however, only 1 recommendation cited a reliable intervention systematic review. No reliable systematic review was identified for 20 treatment recommendations, highlighting areas of evidence gaps. For AMD, reliable systematic reviews exist for many treatment recommendations in the AAO PPPs and should be cited to support these recommendations. We also identified areas where no high-level evidence exists. Mapping clinical practice guidelines to existing systematic reviews is one way to highlight areas where evidence generation or evidence synthesis is either available or needed. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  18. Advances in developing rapid, reliable and portable detection systems for alcohol.

    PubMed

    Thungon, Phurpa Dema; Kakoti, Ankana; Ngashangva, Lightson; Goswami, Pranab

    2017-11-15

    Development of portable, reliable, sensitive, simple, and inexpensive detection systems for alcohol has been a persistent demand not only in the traditional brewing, pharmaceutical, food and clinical industries but also in the rapidly growing alcohol-based fuel industries. Highly sensitive, selective, and reliable alcohol detection is currently achievable mainly through sophisticated instrument-based analyses confined mostly to state-of-the-art analytical laboratory facilities. With the growing demand for rapid and reliable alcohol detection systems, an all-round attempt has been made over the past decade encompassing various disciplines from basic and engineering sciences. Of late, research toward developing small-scale portable alcohol detection systems has been accelerated by the advent of emerging miniaturization techniques, advanced materials, and sensing platforms such as lab-on-chip, lab-on-CD, and lab-on-paper. With these new interdisciplinary approaches, along with support from the parallel growth of knowledge on rapid detection systems being pursued for various targets, progress on translating proof-of-concepts into commercially viable and environmentally friendly portable alcohol detection systems is gaining pace. Here, we summarize the progress made over the years on alcohol detection systems, with a focus on recent advancement towards developing portable, simple and efficient alcohol sensors. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Accelerator mass spectrometer with ion selection in high-voltage terminal

    NASA Astrophysics Data System (ADS)

    Rastigeev, S. A.; Goncharov, A. D.; Klyuev, V. F.; Konstantinov, E. S.; Kutnyakova, L. A.; Parkhomchuk, V. V.; Petrozhitskii, A. V.; Frolov, A. R.

    2016-12-01

    The folded electrostatic tandem accelerator with ion selection in a high-voltage terminal is the basis of accelerator mass spectrometry (AMS) at the BINP. Additional features of the BINP AMS are a stripper based on magnesium vapor, which avoids vacuum deterioration, and a time-of-flight telescope with thin films for reliable ion identification. The acceleration complex demonstrates reliable operation at 1 MV with a 50 Hz counting rate of ¹⁴C³⁺ radiocarbon ions for modern samples (¹⁴C/¹²C ≈ 1.2 × 10⁻¹²). The current state of the AMS is reviewed, and experimental results of radiocarbon concentration measurements in test samples are presented.

  20. Test-retest reliability and stability of N400 effects in a word-pair semantic priming paradigm.

    PubMed

    Kiang, Michael; Patriciu, Iulia; Roy, Carolyn; Christensen, Bruce K; Zipursky, Robert B

    2013-04-01

    Elicited by any meaningful stimulus, the N400 event-related potential (ERP) component is reduced when the stimulus is related to a preceding one. This N400 semantic priming effect has been used to probe abnormal semantic relationship processing in clinical disorders, and suggested as a possible biomarker for treatment studies. Validating N400 semantic priming effects as a clinical biomarker requires characterizing their test-retest reliability. We assessed test-retest reliability of N400 semantic priming in 16 healthy adults who viewed the same related and unrelated prime-target word pairs in two sessions one week apart. As expected, N400 amplitudes were smaller for related versus unrelated targets across sessions. N400 priming effects (amplitude differences between unrelated and related targets) were highly correlated across sessions (r=0.85, P<0.0001), but smaller in the second session due to larger N400s to related targets. N400 priming effects have high reliability over a one-week interval. They may decrease with repeat testing, possibly because of motivational changes. Use of N400 priming effects in treatment studies should account for possible magnitude decreases with repeat testing. Further research is needed to delineate N400 priming effects' test-retest reliability and stability in different age and clinical groups, and with different stimulus types. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  1. A Ligand-observed Mass Spectrometry Approach Integrated into the Fragment Based Lead Discovery Pipeline

    PubMed Central

    Chen, Xin; Qin, Shanshan; Chen, Shuai; Li, Jinlong; Li, Lixin; Wang, Zhongling; Wang, Quan; Lin, Jianping; Yang, Cheng; Shui, Wenqing

    2015-01-01

    In fragment-based lead discovery (FBLD), a cascade combining multiple orthogonal technologies is required for reliable detection and characterization of fragment binding to the target. Given the limitations of the mainstream screening techniques, we presented a ligand-observed mass spectrometry approach to expand the toolkits and increase the flexibility of building a FBLD pipeline especially for tough targets. In this study, this approach was integrated into a FBLD program targeting the HCV RNA polymerase NS5B. Our ligand-observed mass spectrometry analysis resulted in the discovery of 10 hits from a 384-member fragment library through two independent screens of complex cocktails and a follow-up validation assay. Moreover, this MS-based approach enabled quantitative measurement of weak binding affinities of fragments which was in general consistent with SPR analysis. Five out of the ten hits were then successfully translated to X-ray structures of fragment-bound complexes to lay a foundation for structure-based inhibitor design. With distinctive strengths in terms of high capacity and speed, minimal method development, easy sample preparation, low material consumption and quantitative capability, this MS-based assay is anticipated to be a valuable addition to the repertoire of current fragment screening techniques. PMID:25666181

  2. Coverage-based constraints for IMRT optimization

    NASA Astrophysics Data System (ADS)

    Mescher, H.; Ulrich, S.; Bangert, M.

    2017-09-01

    Radiation therapy treatment planning requires an incorporation of uncertainties in order to guarantee an adequate irradiation of the tumor volumes. In current clinical practice, uncertainties are accounted for implicitly with an expansion of the target volume according to generic margin recipes. Alternatively, it is possible to account for uncertainties by explicit minimization of objectives that describe worst-case treatment scenarios, the expectation value of the treatment or the coverage probability of the target volumes during treatment planning. In this note we show that approaches relying on objectives to induce a specific coverage of the clinical target volumes are inevitably sensitive to variation of the relative weighting of the objectives. To address this issue, we introduce coverage-based constraints for intensity-modulated radiation therapy (IMRT) treatment planning. Our implementation follows the concept of coverage-optimized planning that considers explicit error scenarios to calculate and optimize patient-specific probabilities q(d̂, v̂) of covering a specific target volume fraction v̂ with a certain dose d̂. Using a constraint-based reformulation of coverage-based objectives, we eliminate the trade-off between coverage and competing objectives during treatment planning. In-depth convergence tests including 324 treatment plan optimizations demonstrate the reliability of coverage-based constraints for varying levels of probability, dose and volume. General clinical applicability of coverage-based constraints is demonstrated for two cases. 
A sensitivity analysis regarding penalty variations within this planning study, based on IMRT treatment planning using (1) coverage-based constraints, (2) coverage-based objectives, (3) probabilistic optimization, (4) robust optimization and (5) conventional margins, illustrates the potential benefit of coverage-based constraints, which do not require tedious adjustment of target volume objectives.
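    The coverage probability q(d̂, v̂) over explicit error scenarios can be estimated with a short sketch like the following; the array layout (precomputed per-voxel target doses for each scenario) and the function name are assumptions for illustration.

```python
import numpy as np

def coverage_probability(target_dose_scenarios, d_hat, v_hat):
    """Estimate q(d_hat, v_hat): the fraction of error scenarios in
    which at least a fraction v_hat of target voxels receives a dose
    of at least d_hat.

    target_dose_scenarios: (n_scenarios, n_voxels) array of doses
    delivered to the target voxels under each explicit error scenario.
    """
    doses = np.asarray(target_dose_scenarios, dtype=float)
    covered_fraction = (doses >= d_hat).mean(axis=1)   # per scenario
    return float((covered_fraction >= v_hat).mean())
```

    A coverage-based constraint would then require, e.g., coverage_probability(...) ≥ 0.95 during optimization, instead of penalizing shortfalls through a weighted objective.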

  3. Enhancing SERS by Means of Supramolecular Charge Transfer

    NASA Technical Reports Server (NTRS)

    Wong, Eric; Flood, Amar; Morales, Alfredo

    2009-01-01

    In a proposed method of sensing small quantities of molecules of interest, surface-enhanced Raman scattering (SERS) spectroscopy would be further enhanced by means of intermolecular or supramolecular charge transfer. There is a very large potential market for sensors based on this method for rapid detection of chemical and biological hazards. In SERS, the Raman signals (vibrational spectra) of target molecules become enhanced by factors of the order of 10⁸ when those molecules are in the vicinities of nanostructured substrate surfaces that have been engineered to have plasmon resonances that enhance local electric fields. SERS, as reported in several prior NASA Tech Briefs articles and elsewhere, has remained a research tool and has not yet been developed into a practical technique for sensing of target molecules: this is because the short range (5 to 20 nm) of the field enhancement necessitates engineering of receptor molecules to attract target molecules to the nanostructured substrate surfaces and to enable reliable identification of the target molecules in the presence of interferants. Intermolecular charge-transfer complexes have been used in fluorescence-, photoluminescence-, and electrochemistry-based techniques for sensing target molecules, but, until now, have not been considered for use in SERS-based sensing. The basic idea of the proposed method is to engineer receptor molecules that would be attached to nanostructured SERS substrates and that would interact with the target molecules to form receptor-target supramolecular charge-transfer complexes wherein the charge transfer could be photoexcited.

  4. Identifying factors associated with fast food consumption among adolescents in Beijing China using a theory-based approach.

    PubMed

    Ma, R; Castellanos, D C; Bachman, J

    2016-07-01

    China is in the midst of the nutrition transition with increasing rates of obesity and dietary changes. One contributor is the increase in fast food chains within the country. The purpose of this study was to develop a theory-based instrument that explores influencing factors of fast food consumption in adolescents residing in Beijing, China. Cross-sectional study. Value expectancy and theory of planned behaviour were utilised to explore influencing factors of fast food consumption in the target population. Participants were 201 Chinese adolescents between the ages of 12 and 18. Cronbach's alpha correlation coefficients were used to examine internal reliability of the theory-based questionnaire. Bivariate correlations and a MANOVA were utilised to determine the relationship between theory-based constructs, body mass index (BMI)-for-age and fast food intake frequency, as well as to determine differences in theory-based scores among fast food consumption frequency groupings. The theory-based questionnaire showed good reliability. Furthermore, there was a significant difference in the theory-based subcategory scores between fast food frequency groups. A significant positive correlation was observed between times per week fast food was consumed and each theory-based subscale score. Using BMI-for-age of 176 participants, 81% were normal weight and 19% were considered overweight or obese. Results showed consumption of fast food to be on average 1.50 ± 1.33 times per week. The relationship between BMI-for-age and times per week fast food was consumed was not significant. As the nutrition transition continues and fast food chains expand, it is important to explore factors affecting fast food consumption in China. Interventions targeting influencing factors can be developed to encourage healthy dietary choices in the midst of this transition. Copyright © 2016. Published by Elsevier Ltd.
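    The internal-reliability statistic used here, Cronbach's alpha, follows a standard formula; this generic sketch is not tied to the study's data.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal-consistency reliability.

    item_scores: (n_respondents, n_items) array of questionnaire
    responses. alpha = k/(k-1) * (1 - sum(item variances) / variance
    of the summed scale), where k is the number of items.
    """
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)        # per-item variance
    total_var = x.sum(axis=1).var(ddof=1)    # variance of scale totals
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)
```

    Perfectly consistent items (identical responses across items) give alpha = 1; values around 0.7 or higher are conventionally read as acceptable internal reliability.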

  5. Influenza A Subtyping

    PubMed Central

    Kaul, Karen L.; Mangold, Kathy A.; Du, Hongyan; Pesavento, Kristen M.; Nawrocki, John; Nowak, Jan A.

    2010-01-01

    Influenza virus subtyping has emerged as a critical tool in the diagnosis of influenza. Antiviral resistance is present in the majority of seasonal H1N1 influenza A infections, with association of viral strain type and antiviral resistance. Influenza A virus subtypes can be reliably distinguished by examining conserved sequences in the matrix protein gene. We describe our experience with an assay for influenza A subtyping based on matrix gene sequences. Viral RNA was prepared from nasopharyngeal swab samples, and real-time RT-PCR detection of influenza A and B was performed using a laboratory developed analyte-specific reagent-based assay that targets a conserved region of the influenza A matrix protein gene. FluA-positive samples were analyzed using a second RT-PCR assay targeting the matrix protein gene to distinguish seasonal influenza subtypes based on differential melting of fluorescence resonance energy transfer probes. The novel H1N1 influenza strain responsible for the 2009 pandemic showed a melting profile distinct from that of seasonal H1N1 or H3N2 and compatible with the predicted melting temperature based on the published novel H1N1 matrix gene sequence. Validation by comparison with the Centers for Disease Control and Prevention real-time RT-PCR for swine influenza A (novel H1N1) test showed this assay to be both rapid and reliable (>99% sensitive and specific) in the identification of the novel H1N1 influenza A virus strain. PMID:20595627

  6. The Reliability of Environmental Measures of the College Alcohol Environment.

    ERIC Educational Resources Information Center

    Clapp, John D.; Whitney, Mike; Shillington, Audrey M.

    2002-01-01

    Assesses the inter-rater reliability of two environmental scanning tools designed to identify alcohol-related advertisements targeting college students. Inter-rater reliability for these forms varied across different rating categories and ranged from poor to excellent. Suggestions for future research are addressed. (Contains 26 references and 6…

  7. Contextual cueing in multiconjunction visual search is dependent on color- and configuration-based intertrial contingencies.

    PubMed

    Geyer, Thomas; Shi, Zhuanghua; Müller, Hermann J

    2010-06-01

    Three experiments examined memory-based guidance of visual search using a modified version of the contextual-cueing paradigm (Jiang & Chun, 2001). The target, if present, was a conjunction of color and orientation, with target (and distractor) features randomly varying across trials (multiconjunction search). Under these conditions, reaction times (RTs) were faster when all items in the display appeared at predictive ("old") relative to nonpredictive ("new") locations. However, this RT benefit was smaller compared to when only one set of items, namely that sharing the target's color (but not that in the alternative color) appeared in predictive arrangement. In all conditions, contextual cueing was reliable on both target-present and -absent trials and enhanced if a predictive display was preceded by a predictive (though differently arranged) display, rather than a nonpredictive display. These results suggest that (1) contextual cueing is confined to color subsets of items, that (2) retrieving contextual associations for one color subset of items can be impeded by associations formed within the alternative subset ("contextual interference"), and (3) that contextual cueing is modulated by intertrial priming.

  8. Design of a sensitive aptasensor based on magnetic microbeads-assisted strand displacement amplification and target recycling.

    PubMed

    Li, Ying; Ji, Xiaoting; Song, Weiling; Guo, Yingshu

    2013-04-03

    A cross-circular amplification system for sensitive detection of adenosine triphosphate (ATP) in cancer cells was developed based on aptamer-target interaction, magnetic microbead (MB)-assisted strand displacement amplification, and target recycling. Here we describe a new recognition probe possessing two parts, the ATP aptamer and an extension part. The recognition probe was first immobilized on the surface of MBs and hybridized with its complementary sequence to form a duplex. When combined with ATP, the probe changed its conformation, revealing the extension part in single-stranded form, which further served as a toehold for subsequent target recycling. The released complementary sequence of the probe acted as the catalyst of the MB-assisted strand displacement reaction. Incorporated with target recycling, a large amount of biotin-tagged MB complexes were formed to stimulate the generation of a chemiluminescence (CL) signal in the presence of luminol and H2O2 via streptavidin-HRP, reaching a detection limit for ATP as low as 6.1×10⁻¹⁰ M. Moreover, sample assays of ATP in Ramos Burkitt's lymphoma B cells were performed, which confirmed the reliability and practicality of the protocol. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Infrared vehicle recognition using unsupervised feature learning based on K-feature

    NASA Astrophysics Data System (ADS)

    Lin, Jin; Tan, Yihua; Xia, Haijiao; Tian, Jinwen

    2018-02-01

    Subject to the complex battlefield environment, it is difficult to establish a complete knowledge base in practical applications of vehicle recognition algorithms. Infrared vehicle recognition remains difficult and challenging, and plays an important role in remote sensing. In this paper we propose a new unsupervised feature learning method based on K-feature to recognize vehicles in infrared images. First, a saliency-based target detection algorithm is applied to the initial image. Then, unsupervised feature learning based on K-feature, generated by a K-means clustering algorithm that learns a visual dictionary from a large number of unlabeled samples, is used to suppress false alarms and improve accuracy. Finally, the vehicle recognition image is produced by post-processing. Extensive experiments demonstrate that the proposed method achieves satisfactory recognition effectiveness and robustness for vehicle recognition in infrared images under complex backgrounds, and also improves reliability.
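
The K-feature idea described above, learning a visual dictionary by K-means from unlabeled samples and encoding images against it, can be sketched minimally. The toy K-means loop and histogram encoder below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def learn_dictionary(patches: np.ndarray, k: int, iters: int = 20, seed: int = 0):
    """Toy K-means: learn k 'visual words' from flattened unlabeled patches."""
    rng = np.random.default_rng(seed)
    centers = patches[rng.choice(len(patches), k, replace=False)].astype(float)
    for _ in range(iters):
        dists = ((patches[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            if np.any(labels == j):          # skip empty clusters
                centers[j] = patches[labels == j].mean(0)
    return centers

def encode(patches: np.ndarray, centers: np.ndarray) -> np.ndarray:
    """Encode a patch set as a normalized histogram of nearest visual words."""
    dists = ((patches[:, None, :] - centers[None]) ** 2).sum(-1)
    hist = np.bincount(dists.argmin(1), minlength=len(centers)).astype(float)
    return hist / hist.sum()
```

In the actual pipeline the histogram (or nearest-word assignment) would feed a classifier that separates vehicle regions from false alarms.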

  10. High resolution melting curve analysis targeting the HBB gene mutational hot-spot offers a reliable screening approach for all common as well as most of the rare beta-globin gene mutations in Bangladesh.

    PubMed

    Islam, Md Tarikul; Sarkar, Suprovath Kumar; Sultana, Nusrat; Begum, Mst Noorjahan; Bhuyan, Golam Sarower; Talukder, Shezote; Muraduzzaman, A K M; Alauddin, Md; Islam, Mohammad Sazzadul; Biswas, Pritha Promita; Biswas, Aparna; Qadri, Syeda Kashfi; Shirin, Tahmina; Banu, Bilquis; Sadya, Salma; Hussain, Manzoor; Sarwardi, Golam; Khan, Waqar Ahmed; Mannan, Mohammad Abdul; Shekhar, Hossain Uddin; Chowdhury, Emran Kabir; Sajib, Abu Ashfaqur; Akhteruzzaman, Sharif; Qadri, Syed Saleheen; Qadri, Firdausi; Mannoor, Kaiissar

    2018-01-02

    Bangladesh lies in the global thalassemia belt, which has a defined mutational hot-spot in the beta-globin gene. The high carrier frequencies of beta-thalassemia trait and hemoglobin E-trait in Bangladesh necessitate a reliable DNA-based carrier screening approach that could supplement the use of hematological and electrophoretic indices to overcome the barriers of carrier screening. With this view in mind, the study aimed to establish a high resolution melting (HRM) curve-based rapid and reliable mutation screening method targeting the mutational hot-spot of South Asian and Southeast Asian countries that encompasses exon-1 (c.1 - c.92), intron-1 (c.92 + 1 - c.92 + 130) and a portion of exon-2 (c.93 - c.217) of the HBB gene which harbors more than 95% of mutant alleles responsible for beta-thalassemia in Bangladesh. Our HRM approach could successfully differentiate ten beta-globin gene mutations, namely c.79G > A, c.92 + 5G > C, c.126_129delCTTT, c.27_28insG, c.46delT, c.47G > A, c.92G > C, c.92 + 130G > C, c.126delC and c.135delC in heterozygous states from the wild type alleles, implying the significance of the approach for carrier screening as the first three of these mutations account for ~85% of total mutant alleles in Bangladesh. Moreover, different combinations of compound heterozygous mutations were found to generate melt curves that were distinct from the wild type alleles and from one another. Based on the findings, sixteen reference samples were run in parallel to 41 unknown specimens to perform direct genotyping of the beta-thalassemia specimens using HRM. The HRM-based genotyping of the unknown specimens showed 100% consistency with the sequencing result. Targeting the mutational hot-spot, the HRM approach could be successfully applied for screening of beta-thalassemia carriers in Bangladesh as well as in other countries of South Asia and Southeast Asia. 
The approach could be a useful supplement to hematological and electrophoretic indices in order to avoid false positive and false negative results.
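
The core of HRM-based genotyping, assigning an unknown specimen to the reference melt curve it most resembles, can be illustrated with a nearest-curve classifier over synthetic sigmoid melt profiles. The curve shapes and Tm values are invented for illustration; only the `c.79G>A` label is reused from the abstract:

```python
import numpy as np

temps = np.linspace(70.0, 95.0, 120)  # acquisition temperatures, degrees C

def melt_curve(tm: float, width: float = 1.0) -> np.ndarray:
    """Synthetic sigmoidal fraction-of-duplex melt profile centred at tm."""
    return 1.0 / (1.0 + np.exp((temps - tm) / width))

def genotype_by_melt(curve: np.ndarray, references: dict) -> str:
    """Assign the genotype whose reference melt curve is nearest (Euclidean)."""
    names = list(references)
    dists = [np.linalg.norm(curve - references[n]) for n in names]
    return names[int(np.argmin(dists))]

# Hypothetical reference curves; Tm values are illustrative, not measured
refs = {"wild_type": melt_curve(84.0), "c.79G>A": melt_curve(82.5)}
```

Real HRM software normalizes fluorescence and works on difference plots, but the nearest-reference logic is the same in spirit.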

  11. A multiplex primer design algorithm for target amplification of continuous genomic regions.

    PubMed

    Ozturk, Ahmet Rasit; Can, Tolga

    2017-06-19

    Targeted Next Generation Sequencing (NGS) assays are cost-efficient and reliable alternatives to Sanger sequencing. For sequencing a very large set of genes, the target enrichment approach is suitable. However, for smaller genomic regions, the target amplification method is more efficient than both the target enrichment method and Sanger sequencing. The major difficulty of the target amplification method is the preparation of amplicons in terms of the required time, equipment, and labor. Multiplex PCR (MPCR) is a good solution to these problems. We propose a novel method to design MPCR primers for a continuous genomic region, following the best practices of clinically reliable PCR design processes. In an experimental setup with 48 different combinations of factors, we have shown that multiple parameters can affect finding the first feasible solution. Increasing the length of the initial primer candidate selection sequence gives better results, whereas waiting longer for the first feasible solution does not have a significant impact. We generated MPCR primer designs for the whole HBB gene, the MEFV coding regions, and human exons between 2000 and 2100 bp in length. Our benchmarking experiments show that the proposed MPCR approach is able to produce reliable NGS assay primers for a given sequence in a reasonable amount of time.
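
Primer design pipelines like the one described rely on melting-temperature estimates for candidate primers. A rough first-pass screen is the classic Wallace rule (2 °C per A/T, 4 °C per G/C), shown here as a hedged sketch; production designs use nearest-neighbour thermodynamic models instead:

```python
def wallace_tm(primer: str) -> float:
    """Wallace rule Tm estimate: 2 C per A/T plus 4 C per G/C.
    A rough screen for short primers; not a thermodynamic model."""
    p = primer.upper()
    return 2.0 * (p.count("A") + p.count("T")) + 4.0 * (p.count("G") + p.count("C"))
```

A multiplex design step might, for instance, reject candidate pairs whose Wallace estimates differ by more than a few degrees before running a finer model.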

  12. 150 μA ¹⁸F⁻ target and beam port upgrade for the IBA 18/9 cyclotron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stokely, M. H.; Peeples, J. L.; Poorman, M. C.

    2012-12-19

    A high power (~3 kW) target platform has been developed for the IBA 18/9 cyclotron. New designs for the airlock, collimator and target subsystems have been fabricated and deployed. The primary project goal is reliable commercial production of ¹⁸F⁻ at 150 μA or greater, while secondary goals include improving serviceability and extending service intervals relative to OEM systems. Reliable operation in a production environment has been observed at beam currents up to 140 μA. Challenges include ion source lifetime and localized peaking in the beam intensity distribution.

  13. A high-throughput multiplex method adapted for GMO detection.

    PubMed

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon-specific endogenous reference genes; from GMO screening targets, construct-specific targets, and event-specific targets; and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  14. Design and Evaluation of Fusion Approach for Combining Brain and Gaze Inputs for Target Selection

    PubMed Central

    Évain, Andéol; Argelaguet, Ferran; Casiez, Géry; Roussel, Nicolas; Lécuyer, Anatole

    2016-01-01

    Gaze-based interfaces and Brain-Computer Interfaces (BCIs) allow for hands-free human–computer interaction. In this paper, we investigate the combination of gaze and BCIs. We propose a novel selection technique for 2D target acquisition based on input fusion. This new approach combines the probabilistic models for each input in order to better estimate the intent of the user. We evaluated its performance against existing gaze and brain–computer interaction techniques. Twelve participants took part in our study, in which they had to search for and select 2D targets with each of the evaluated techniques. Our fusion-based hybrid interaction technique was found to be more reliable than the previous gaze-and-BCI hybrid interaction techniques for 10 of 12 participants, while being 29% faster on average. However, similarly to what has been observed in hybrid gaze-and-speech interaction, the gaze-only interaction technique still provides the best performance. Our results should encourage the use of input fusion, as opposed to sequential interaction, in order to design better hybrid interfaces. PMID:27774048
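
The input-fusion idea, combining probabilistic estimates from gaze and BCI to estimate user intent, can be sketched as a normalized product of per-target probabilities. This naive Bayes-style combination is an assumption for illustration; the paper's exact probabilistic models may differ:

```python
import numpy as np

def fuse(p_gaze, p_bci) -> np.ndarray:
    """Fuse two per-target probability estimates by normalized elementwise
    product (independence assumed between the two input modalities)."""
    p = np.asarray(p_gaze, dtype=float) * np.asarray(p_bci, dtype=float)
    return p / p.sum()
```

With three candidate targets, a confident gaze estimate sharpens a weaker BCI estimate: `fuse([0.7, 0.2, 0.1], [0.5, 0.4, 0.1])` concentrates most of the mass on the first target.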

  15. A trunk ranging system based on binocular stereo vision

    NASA Astrophysics Data System (ADS)

    Zhao, Xixuan; Kan, Jiangming

    2017-07-01

    Trunk ranging is an essential function for autonomous forestry robots. Traditional trunk ranging systems based on personal computers are not convenient in practical applications. This paper examines the implementation of a trunk ranging system based on binocular stereo vision theory on TI's DaVinci DM37x platform. The system is smaller and more reliable than one implemented on a personal computer. It calculates three-dimensional information from the images acquired by the binocular cameras, producing the targeting and ranging results. The experimental results show that the measurement error is small and the system design is feasible for autonomous forestry robots.
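
Under the rectified pinhole model, the ranging step in a binocular system reduces to the classic relation Z = f·B/d (focal length in pixels, baseline in metres, disparity in pixels). A minimal sketch:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Rectified pinhole binocular model: depth Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (target in front of cameras)")
    return focal_px * baseline_m / disparity_px
```

For example, with an assumed 700 px focal length and 0.12 m baseline, a trunk producing 14 px of disparity lies 6 m away; halving the disparity doubles the range, which is why distant targets are the hardest to range precisely.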

  16. Evidence base and future research directions in the management of low back pain.

    PubMed

    Abbott, Allan

    2016-03-18

    Low back pain (LBP) is a prevalent and costly condition. Awareness of valid and reliable patient history taking, physical examination and clinical testing is important for diagnostic accuracy. Stratified care which targets treatment to patient subgroups based on key characteristics is reliant upon accurate diagnostics. Models of stratified care that can potentially improve treatment effects include prognostic risk profiling for persistent LBP, likely response to specific treatment based on clinical prediction models or suspected underlying causal mechanisms. The focus of this editorial is to highlight current research status and future directions for LBP diagnostics and stratified care.

  17. Reliability and criterion validity of measurements using a smart phone-based measurement tool for the transverse rotation angle of the pelvis during single-leg lifting.

    PubMed

    Jung, Sung-Hoon; Kwon, Oh-Yun; Jeon, In-Cheol; Hwang, Ui-Jae; Weon, Jong-Hyuck

    2018-01-01

    The purposes of this study were to determine the intra-rater test-retest reliability of a smart phone-based measurement tool (SBMT) and a three-dimensional (3D) motion analysis system for measuring the transverse rotation angle of the pelvis during single-leg lifting (SLL) and the criterion validity of the transverse rotation angle of the pelvis measurement using SBMT compared with a 3D motion analysis system (3DMAS). Seventeen healthy volunteers performed SLL with their dominant leg without bending the knee until they reached a target placed 20 cm above the table. This study used a 3DMAS, considered the gold standard, to measure the transverse rotation angle of the pelvis to assess the criterion validity of the SBMT measurement. Intra-rater test-retest reliability was determined using the SBMT and 3DMAS using intra-class correlation coefficient (ICC) [3,1] values. The criterion validity of the SBMT was assessed with ICC [3,1] values. Both the 3DMAS (ICC = 0.77) and SBMT (ICC = 0.83) showed excellent intra-rater test-retest reliability in the measurement of the transverse rotation angle of the pelvis during SLL in a supine position. Moreover, the SBMT showed an excellent correlation with the 3DMAS (ICC = 0.99). Measurement of the transverse rotation angle of the pelvis using the SBMT showed excellent reliability and criterion validity compared with the 3DMAS.
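
The ICC [3,1] statistic reported above (two-way mixed model, single measure, consistency) can be computed from a subjects-by-sessions rating matrix via the standard mean-squares formula; the sketch below assumes complete data:

```python
import numpy as np

def icc_3_1(ratings: np.ndarray) -> float:
    """ICC [3,1]: two-way mixed model, single measure, consistency.
    ratings has shape (n_subjects, k_sessions_or_raters)."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()  # subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()  # sessions
    ss_err = ss_total - ss_rows - ss_cols                      # residual
    msr = ss_rows / (n - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)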

  18. Research on target information optics communications transmission characteristic and performance in multi-screens testing system

    NASA Astrophysics Data System (ADS)

    Li, Hanshan

    2016-04-01

    To enhance the stability and reliability of multi-screens testing systems, this paper studies the transmission link properties and long-distance performance of multi-screens target optical information. A discrete multi-tone modulation transmission model is set up based on the geometric model of a laser multi-screens testing system and visible light communication principles. The electro-optic and photoelectric conversion functions of the sender and receiver in the target optical information communication system are analyzed; the transmission performance and transfer function of the generalized visible-light communication channel are studied; a spatial light intensity distribution model and distribution function for the optical communication transmission link are established; and an SNR model of the communication system is derived. Calculation and experimental analysis show that the transmission error rate increases with the transmission rate at a given channel modulation depth; when an appropriate transmission rate is selected, the bit error rate reaches 0.01.

  19. Real-time reliability measure-driven multi-hypothesis tracking using 2D and 3D features

    NASA Astrophysics Data System (ADS)

    Zúñiga, Marcos D.; Brémond, François; Thonnat, Monique

    2011-12-01

    We propose a new multi-target tracking approach, which is able to reliably track multiple objects even with poor segmentation results due to noisy environments. The approach takes advantage of a new dual object model combining 2D and 3D features through reliability measures. In order to obtain these 3D features, a new classifier associates with each moving region an object class label (e.g. person, vehicle), a parallelepiped model, and visual reliability measures of its attributes. These reliability measures allow the contributions of noisy, erroneous, or false data to be properly weighted in order to better maintain the integrity of the object dynamics model. A new multi-target tracking algorithm then uses these object descriptions to generate tracking hypotheses about the objects moving in the scene. This tracking approach is able to manage many-to-many visual target correspondences. To achieve this, the algorithm takes advantage of 3D models for merging dissociated visual evidence (moving regions) potentially corresponding to the same real object, according to previously obtained information. The tracking approach has been validated using publicly accessible video surveillance benchmarks. It runs in real time, and its results are competitive with other tracking algorithms, with minimal (or no) reconfiguration effort between different videos.

  20. Robust Control Design for Systems With Probabilistic Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in the time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis, allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.
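
The reliability-based objective described here, the probability of violating a design requirement, is commonly estimated by Monte Carlo sampling over the probabilistic uncertainty. A generic sketch follows; the constraint and sampler in the example are illustrative, not the paper's:

```python
import numpy as np

def violation_probability(constraint, sampler, n=20000, seed=0):
    """Monte Carlo estimate of P[g(theta) > 0], the probability that a
    design requirement g(theta) <= 0 is violated under the uncertainty model."""
    rng = np.random.default_rng(seed)
    theta = sampler(rng, n)
    return float(np.mean(constraint(theta) > 0.0))

# Illustrative requirement: g(theta) = theta - 1.64485 with theta ~ N(0, 1),
# whose exact violation probability is about 0.05.
p_fail = violation_probability(lambda th: th - 1.64485,
                               lambda rng, n: rng.standard_normal(n))
```

In a synthesis loop, an outer optimizer would adjust compensator parameters to drive this estimated probability down, trading it off against the robustness metric.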

  1. Improvement in Visual Target Tracking for a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Madison, Richard

    2006-01-01

    In an improvement of the visual-target-tracking software used aboard a mobile robot (rover) of the type used to explore the Martian surface, an affine-matching algorithm has been replaced by a combination of a normalized- cross-correlation (NCC) algorithm and a template-image-magnification algorithm. Although neither NCC nor template-image magnification is new, the use of both of them to increase the degree of reliability with which features can be matched is new. In operation, a template image of a target is obtained from a previous rover position, then the magnification of the template image is based on the estimated change in the target distance from the previous rover position to the current rover position (see figure). For this purpose, the target distance at the previous rover position is determined by stereoscopy, while the target distance at the current rover position is calculated from an estimate of the current pose of the rover. The template image is then magnified by an amount corresponding to the estimated target distance to obtain a best template image to match with the image acquired at the current rover position.
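
Normalized cross-correlation and template magnification, the two ingredients named above, can be sketched as follows; the nearest-neighbour magnification is a simplified stand-in for whatever resampling the flight software actually uses:

```python
import numpy as np

def ncc(patch: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between two equal-size image patches."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def magnify(template: np.ndarray, scale: float) -> np.ndarray:
    """Nearest-neighbour magnification of a template by a scale factor,
    e.g. (previous target distance) / (current target distance)."""
    h, w = template.shape
    ys = np.clip((np.arange(int(h * scale)) / scale).astype(int), 0, h - 1)
    xs = np.clip((np.arange(int(w * scale)) / scale).astype(int), 0, w - 1)
    return template[np.ix_(ys, xs)]
```

The NCC score ranges from -1 to 1 and is invariant to affine brightness changes, which is why it matches more reliably than raw correlation when the template is rescaled.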

  2. Prior Knowledge Guides Speech Segregation in Human Auditory Cortex.

    PubMed

    Wang, Yuanye; Zhang, Jianfeng; Zou, Jiajie; Luo, Huan; Ding, Nai

    2018-05-18

    Segregating concurrent sound streams is a computationally challenging task that requires integrating bottom-up acoustic cues (e.g. pitch) and top-down prior knowledge about sound streams. In a multi-talker environment, the brain can segregate different speakers in about 100 ms in auditory cortex. Here, we used magnetoencephalographic (MEG) recordings to investigate the temporal and spatial signature of how the brain utilizes prior knowledge to segregate 2 speech streams from the same speaker, which can hardly be separated based on bottom-up acoustic cues. In a primed condition, the participants know the target speech stream in advance, while in an unprimed condition no such prior knowledge is available. Neural encoding of each speech stream is characterized by the MEG responses tracking the speech envelope. We demonstrate that an effect in bilateral superior temporal gyrus and superior temporal sulcus is much stronger in the primed condition than in the unprimed condition. Priming effects are observed at about 100 ms latency and last more than 600 ms. Interestingly, prior knowledge about the target stream facilitates speech segregation by mainly suppressing the neural tracking of the non-target speech stream. In sum, prior knowledge leads to reliable speech segregation in auditory cortex, even in the absence of reliable bottom-up speech segregation cues.

  3. The assessment of fidelity in a motor speech-treatment approach

    PubMed Central

    Hayden, Deborah; Namasivayam, Aravind Kumar; Ward, Roslyn

    2015-01-01

    Objective To demonstrate the application of the constructs of treatment fidelity for research and clinical practice for motor speech disorders, using the Prompts for Restructuring Oral Muscular Phonetic Targets (PROMPT) Fidelity Measure (PFM). Treatment fidelity refers to a set of procedures used to monitor and improve the validity and reliability of behavioral intervention. While the concept of treatment fidelity has been emphasized in medical and allied health sciences, documentation of procedures for the systematic evaluation of treatment fidelity in Speech-Language Pathology is sparse. Methods The development and iterative process to improve the PFM, is discussed. Further, the PFM is evaluated against recommended measurement strategies documented in the literature. This includes evaluating the appropriateness of goals and objectives; and the training of speech–language pathologists, using direct and indirect procedures. Three expert raters scored the PFM to examine inter-rater reliability. Results Three raters, blinded to each other's scores, completed fidelity ratings on three separate occasions. Inter-rater reliability, using Krippendorff's Alpha, was >80% for the PFM on the final scoring occasion. This indicates strong inter-rater reliability. Conclusion The development of fidelity measures for the training of service providers and treatment delivery is important in specialized treatment approaches where certain ‘active ingredients’ (e.g. specific treatment targets and therapeutic techniques) must be present in order for treatment to be effective. The PFM reflects evidence-based practice by integrating treatment delivery and clinical skill as a single quantifiable metric. PFM enables researchers and clinicians to objectively measure treatment outcomes within the PROMPT approach. PMID:26213623

  4. Robust Visual Tracking Revisited: From Correlation Filter to Template Matching.

    PubMed

    Liu, Fanghui; Gong, Chen; Huang, Xiaolin; Zhou, Tao; Yang, Jie; Tao, Dacheng

    2018-06-01

    In this paper, we propose a novel matching-based tracker by investigating the relationship between template matching and the recently popular correlation filter based trackers (CFTs). Compared to the correlation operation in CFTs, a sophisticated similarity metric termed mutual buddies similarity is proposed to exploit the relationship of multiple reciprocal nearest neighbors for target matching. By doing so, our tracker obtains powerful discriminative ability in distinguishing target from background, as demonstrated by both empirical and theoretical analyses. Besides, instead of utilizing a single template with the improper updating scheme of CFTs, we design a novel online template updating strategy named memory, which aims to select a certain amount of representative and reliable tracking results in history to construct the current stable and expressive template set. This scheme helps the proposed tracker comprehensively model target appearance variations and recall stable historical results. Both qualitative and quantitative evaluations on two benchmarks suggest that the proposed tracking method performs favorably against some recently developed CFTs and other competitive trackers.

  5. Trust and reliance on an automated combat identification system.

    PubMed

    Wang, Lu; Jamieson, Greg A; Hollands, Justin G

    2009-06-01

    We examined the effects of aid reliability and reliability disclosure on human trust in and reliance on a combat identification (CID) aid. We tested whether trust acts as a mediating factor between belief in and reliance on a CID aid. Individual CID systems have been developed to reduce friendly fire incidents. However, these systems cannot positively identify a target that does not have a working transponder. Therefore, when the feedback is "unknown", the target could be hostile, neutral, or friendly. Soldiers have difficulty relying on this type of imperfect automation appropriately. In manual and aided conditions, 24 participants completed a simulated CID task. The reliability of the aid varied within participants, half of whom were told the aid reliability level. We used the difference in response bias values across conditions to measure automation reliance. Response bias varied more appropriately with the aid reliability level when it was disclosed than when not. Trust in aid feedback correlated with belief in aid reliability and reliance on aid feedback; however, belief was not correlated with reliance. To engender appropriate reliance on CID systems, users should be made aware of system reliability. The findings can be applied to the design of information displays for individual CID systems and soldier training.
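
The response-bias measure used in this study is the signal-detection statistic c, computed from hit and false-alarm rates; a minimal sketch:

```python
from statistics import NormalDist

def response_bias(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection response bias c = -(z(hits) + z(false alarms)) / 2.
    c > 0 indicates a conservative bias (fewer 'target' responses overall)."""
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(fa_rate)) / 2.0
```

Symmetric performance (e.g. 80% hits, 20% false alarms) gives c = 0; appropriate reliance on the aid would show c shifting with the disclosed reliability level, which is how the study's comparison across conditions works.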

  6. Effectiveness Evaluation Method of Anti-Radiation Missile against Active Decoy

    NASA Astrophysics Data System (ADS)

    Tang, Junyao; Cao, Fei; Li, Sijia

    2017-06-01

    In the problem of anti-radiation missiles (ARMs) countering active decoys, whether the ARM can effectively kill the target radiation source and decoy is an important index for evaluating the operational effectiveness of the missile. To address this problem, this paper proposes a method to evaluate the effectiveness of an ARM against active decoys. Based on a calculation of the ARM's ability to resist decoys, the paper proposes an evaluation of decoy resistance based on hits to key components of the target radar. The method is scientific and reliable.

  7. Stereotactic ultrasound for target volume definition in a patient with prostate cancer and bilateral total hip replacement.

    PubMed

    Boda-Heggemann, Judit; Haneder, Stefan; Ehmann, Michael; Sihono, Dwi Seno Kuncoro; Wertz, Hansjörg; Mai, Sabine; Kegel, Stefan; Heitmann, Sigrun; von Swietochowski, Sandra; Lohr, Frank; Wenz, Frederik

    2015-01-01

    Target-volume definition for prostate cancer in patients with bilateral metal total hip replacements (THRs) is a challenge because of metal artifacts in the planning computed tomography (CT) scans. Magnetic resonance imaging (MRI) can be used for matching and prostate delineation; however, at a spatial and temporal distance from the planning CT, identical rectal and vesical filling is difficult to achieve. In addition, MRI may also be impaired by metal artifacts, even resulting in spatial image distortion. Here, we present a method to define prostate target volumes based on ultrasound images acquired during CT simulation and online-matched to the CT data set directly at the planning CT. A 78-year-old patient with cT2cNxM0 prostate cancer with bilateral metal THRs was referred to external beam radiation therapy. T2-weighted MRI was performed on the day of the planning CT with preparation according to a protocol for reproducible bladder and rectal filling. The planning CT was obtained with the immediate acquisition of a 3-dimensional ultrasound data set with a dedicated stereotactic ultrasound system for online intermodality image matching referenced to the isocenter by ceiling-mounted infrared cameras. MRI (offline) and ultrasound images (online) were thus both matched to the CT images for planning. Daily image guided radiation therapy (IGRT) was performed with transabdominal ultrasound and compared with cone beam CT. Because of variations in bladder and rectal filling and metal-induced image distortion in MRI, soft-tissue-based matching of the MRI to CT was not sufficient for unequivocal prostate target definition. Ultrasound-based images could be matched, and prostate, seminal vesicles, and target volumes were reliably defined. Daily IGRT could be successfully completed with transabdominal ultrasound with good accordance between cone beam CT and ultrasound. 
For prostate cancer patients with bilateral THRs causing artifacts in planning CTs, ultrasound referenced to the isocenter of the CT simulator and acquired with intermodal online coregistration directly at the planning CT is a fast and easy method to reliably delineate the prostate and target volumes and for daily IGRT. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  8. Targets Mask U-Net for Wind Turbines Detection in Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Han, M.; Wang, H.; Wang, G.; Liu, Y.

    2018-04-01

    To detect wind turbines precisely and quickly in very high resolution remote sensing images (VHRRSI), we propose the target mask U-Net. This convolutional neural network (CNN), carefully designed as a wide-field detector, models the assignment of pixels to wind turbines and their context information. The shadow, which serves as the target's context information in this study, is treated as part of each wind turbine instance. We trained the target mask U-Net on a training dataset composed of down-sampled image blocks and instance mask blocks. Post-processing steps were integrated to eliminate spurious spots and produce bounding boxes for wind turbine instances. The evaluation metrics confirm the reliability and effectiveness of our method: its average F1-score reaches 0.97. A comparison of detection accuracy and time consumption with a weakly supervised CNN-based target detection method illustrates the superiority of our approach.
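
    The reported F1-score combines detection precision and recall in a single number; a minimal sketch of the metric (the detection counts below are hypothetical, not the paper's):

```python
def f1_score(true_positives: int, false_positives: int, false_negatives: int) -> float:
    """F1 is the harmonic mean of precision and recall over matched detections."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 97 matched turbines, 3 false alarms, 3 missed turbines.
score = f1_score(97, 3, 3)
```

    With these illustrative counts, precision and recall are both 0.97, giving an F1-score of 0.97, matching the level reported in the abstract.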

  9. Individual differences in learning correlate with modulation of brain activity induced by transcranial direct current stimulation

    PubMed Central

    Falcone, Brian; Wada, Atsushi; Parasuraman, Raja

    2018-01-01

    Transcranial direct current stimulation (tDCS) has been shown to enhance cognitive performance on a variety of tasks. It is hypothesized that tDCS enhances performance by affecting task related cortical excitability changes in networks underlying or connected to the site of stimulation facilitating long term potentiation. However, many recent studies have called into question the reliability and efficacy of tDCS to induce modulatory changes in brain activity. In this study, our goal is to investigate the individual differences in tDCS induced modulatory effects on brain activity related to the degree of enhancement in performance, providing insight into this lack of reliability. In accomplishing this goal, we used functional magnetic resonance imaging (fMRI) concurrently with tDCS stimulation (1 mA, 30 minutes duration) using a visual search task simulating real world conditions. The experiment consisted of three fMRI sessions: pre-training (no performance feedback), training (performance feedback which included response accuracy and target location and either real tDCS or sham stimulation given), and post-training (no performance feedback). The right posterior parietal cortex was selected as the site of anodal tDCS based on its known role in visual search and spatial attention processing. Our results identified a region in the right precentral gyrus, known to be involved with visual spatial attention and orienting, that showed tDCS induced task related changes in cortical excitability that were associated with individual differences in improved performance. This same region showed greater activity during the training session for target feedback of incorrect (target-error feedback) over correct trials for the tDCS stim over sham group indicating greater attention to target features during training feedback when trials were incorrect. 
These results give important insight into the nature of the neural excitability induced by tDCS as it relates to individual differences in improved performance, shedding some light on the apparent lack of reliability found in tDCS research. PMID:29782510

  10. Fine reservoir structure modeling based upon 3D visualized stratigraphic correlation between horizontal wells: methodology and its application

    NASA Astrophysics Data System (ADS)

    Chenghua, Ou; Chaochun, Li; Siyuan, Huang; Sheng, James J.; Yuan, Xu

    2017-12-01

    As the platform-based horizontal well production mode has been widely applied in petroleum industry, building a reliable fine reservoir structure model by using horizontal well stratigraphic correlation has become very important. Horizontal wells usually extend between the upper and bottom boundaries of the target formation, with limited penetration points. Using these limited penetration points to conduct well deviation correction means the formation depth information obtained is not accurate, which makes it hard to build a fine structure model. In order to solve this problem, a method of fine reservoir structure modeling, based on 3D visualized stratigraphic correlation among horizontal wells, is proposed. This method can increase the accuracy when estimating the depth of the penetration points, and can also effectively predict the top and bottom interfaces in the horizontal penetrating section. Moreover, this method will greatly increase not only the number of points of depth data available, but also the accuracy of these data, which achieves the goal of building a reliable fine reservoir structure model by using the stratigraphic correlation among horizontal wells. Using this method, four 3D fine structure layer models have been successfully built of a specimen shale gas field with platform-based horizontal well production mode. The shale gas field is located to the east of Sichuan Basin, China; the successful application of the method has proven its feasibility and reliability.

  11. Boundary conditions for the influence of unfamiliar non-target primes in unconscious evaluative priming: The moderating role of attentional task sets.

    PubMed

    Kiefer, Markus; Sim, Eun-Jim; Wentura, Dirk

    2015-09-01

    Evaluative priming by masked emotional stimuli that are not consciously perceived has been taken as evidence that affective stimulus evaluation can also occur unconsciously. However, as masked priming effects were small and frequently observed only for familiar primes that were also presented as visible targets in an evaluative decision task, priming was thought to reflect primarily response activation based on acquired S-R associations rather than evaluative semantic stimulus analysis. The present study therefore assessed, across three experiments, boundary conditions for the emergence of masked evaluative priming effects with unfamiliar primes in an evaluative decision task and investigated the role of the frequency of target repetition on priming with pictorial and verbal stimuli. While familiar primes elicited robust priming effects in all conditions, priming effects by unfamiliar primes were reliably obtained for low-repetition (pictures) or unrepeated targets (words), but not for targets repeated at a high frequency. This suggests that unfamiliar masked stimuli only elicit evaluative priming effects when the task set associated with the visible target involves evaluative semantic analysis and is not based on S-R triggered responding, as for high-repetition targets. The present results therefore converge with the growing body of evidence demonstrating attentional control influences on unconscious processing. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. The (un)reliability of item-level semantic priming effects.

    PubMed

    Heyman, Tom; Bruninx, Anke; Hutchison, Keith A; Storms, Gert

    2018-04-05

    Many researchers have tried to predict semantic priming effects using a myriad of variables (e.g., prime-target associative strength or co-occurrence frequency). The idea is that relatedness varies across prime-target pairs, which should be reflected in the size of the priming effect (e.g., cat should prime dog more than animal does). However, it is only insightful to predict item-level priming effects if they can be measured reliably. Thus, in the present study we examined the split-half and test-retest reliabilities of item-level priming effects under conditions that should discourage the use of strategies. The resulting priming effects proved extremely unreliable, and reanalyses of three published priming datasets revealed similar cases of low reliability. These results imply that previous attempts to predict semantic priming were unlikely to be successful. However, one study with an unusually large sample size yielded more favorable reliability estimates, suggesting that big data, in terms of items and participants, should be the future for semantic priming research.
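
    The split-half reliability examined in this study can be illustrated as follows: compute each item's priming effect separately in two halves of the participant sample, correlate the two sets of item-level effects, and step the correlation up with the Spearman-Brown correction. A minimal sketch; the effect values below are hypothetical:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equally long lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

def spearman_brown(r):
    """Step a split-half correlation up to an estimate for the full test."""
    return 2 * r / (1 + r)

# Hypothetical item-level priming effects (ms) computed from two
# random halves of the participant sample.
half_a = [12.0, 30.0, 8.0, 22.0, 15.0]
half_b = [10.0, 28.0, 11.0, 19.0, 14.0]
estimate = spearman_brown(pearson(half_a, half_b))
```

    The study's finding is that, with realistic trial counts, the correlation between halves is far lower than in this toy example, so the corrected estimate stays poor.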

  13. Target virus log10 reduction values determined for two reclaimed wastewater irrigation scenarios in Japan based on tolerable annual disease burden.

    PubMed

    Ito, Toshihiro; Kitajima, Masaaki; Kato, Tsuyoshi; Ishii, Satoshi; Segawa, Takahiro; Okabe, Satoshi; Sano, Daisuke

    2017-11-15

    Multiple barriers are widely employed for managing microbial risks in water reuse, in which different types of wastewater treatment units (biological treatment, disinfection, etc.) and health protection measures (use of personal protective gear, vegetable washing, etc.) are combined to achieve a performance target value of log10 reduction (LR) of viruses. The LR virus target value needs to be calculated based on the data obtained from monitoring the viruses of concern and the water reuse scheme in the context of the countries/regions where water reuse is implemented. In this study, we calculated the virus LR target values under two exposure scenarios for reclaimed wastewater irrigation in Japan, using the concentrations of indigenous viruses in untreated wastewater and a defined tolerable annual disease burden (10^-4 or 10^-6 disability-adjusted life years per person per year (DALY pppy)). Three genogroups of norovirus (norovirus genogroup I (NoV GI), genogroup II (NoV GII), and genogroup IV (NoV GIV)) in untreated wastewater were quantified as model viruses using reverse transcription-microfluidic quantitative PCR, and only NoV GII was present at quantifiable concentrations. The probabilistic distribution of NoV GII concentration in untreated wastewater was then estimated from its concentration dataset and used to calculate the LR target values of NoV GII for wastewater treatment. Assuming accidental ingestion of reclaimed wastewater by Japanese farmers, the NoV GII LR target values corresponding to the tolerable annual disease burden of 10^-6 DALY pppy were 3.2, 4.4, and 5.7 at the 95, 99, and 99.9 percentiles, respectively. These percentile values, defined as "reliability," represent the cumulative probability that the NoV GII concentration in untreated wastewater falls below the level corresponding to the tolerable annual disease burden after wastewater reclamation. An approximately 1-log10 difference in LR target values was observed between 10^-4 and 10^-6 DALY pppy. The LR target values were influenced mostly by changes in the logarithmic standard deviation (SD) of the NoV GII concentration in untreated wastewater and in the reliability values, which highlights the importance of accurately determining the probabilistic distribution of reference virus concentrations in source water for water reuse. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
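
    The percentile-based LR target described above can be sketched as the log10 gap between a chosen percentile of the source-water concentration distribution and the tolerable post-treatment level. A simplified illustration assuming a lognormal concentration distribution; all parameter values are illustrative assumptions, not the study's figures:

```python
from statistics import NormalDist

def lr_target(mean_log10, sd_log10, percentile, tolerable_log10):
    """Log10 reduction needed so that the chosen percentile ("reliability")
    of the source-water virus concentration meets the tolerable level."""
    source_log10 = NormalDist(mean_log10, sd_log10).inv_cdf(percentile)
    return max(source_log10 - tolerable_log10, 0.0)

# Hypothetical parameters: virus at mean 4.0 log10 copies/L (SD 1.0) in
# untreated wastewater, tolerable post-treatment level 0.5 log10 copies/L.
targets = {p: lr_target(4.0, 1.0, p, 0.5) for p in (0.95, 0.99, 0.999)}
```

    As in the study, requiring a higher reliability percentile raises the LR target, and the spread between percentiles grows with the SD of the log10 concentration.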

  14. Development Status of Low-Shock Payload Separation Mechanism for H-IIA Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Terashima, Keita; Kamita, Toru; Horie, Youichi; Kobayashi, Masakazu; Onikura, Hiroki

    2013-09-01

    This paper presents the design, analysis, and test results of the low-shock payload separation mechanism for the H-IIA launch vehicle. The mechanism is based on a simple and reliable four-bar linkage, which releases the Marman clamp band tension at a lower speed than the current system. The adequacy of the low-shock principle was evaluated through simulations and fundamental tests. We then established a reliability design model of the mechanism, whose adequacy was evaluated by elemental tests. Finally, we conducted system separation tests using the payload adapter with the mechanism assembled, confirming that the actual separation shock level satisfied our target.

  15. Fully convolutional network with cluster for semantic segmentation

    NASA Astrophysics Data System (ADS)

    Ma, Xiao; Chen, Zhongbi; Zhang, Jianlin

    2018-04-01

    At present, image semantic segmentation is an active research topic in computer vision and artificial intelligence. In particular, extensive research on deep neural networks for image recognition has greatly promoted the development of semantic segmentation. This paper puts forward a method based on a fully convolutional network combined with the k-means clustering algorithm. The clustering algorithm, which uses the image's low-level features and initializes the cluster centers from a super-pixel segmentation, is proposed to correct the points with low reliability, which are likely to be misclassified, using the points with high reliability within each clustering region. This method refines the segmentation of the target contour and improves the accuracy of the image segmentation.
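
    As a rough illustration of the clustering step (not the authors' implementation), plain k-means on scalar low-level features alternates an assignment step and a center-update step; the pixel values below are hypothetical:

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Plain k-means on scalar features: alternate assignment and update."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    labels = [0] * len(values)
    for _ in range(iters):
        # Assignment step: attach each value to its nearest center.
        labels = [min(range(k), key=lambda c: abs(v - centers[c])) for v in values]
        # Update step: move each center to the mean of its members.
        for c in range(k):
            members = [v for v, lab in zip(values, labels) if lab == c]
            if members:
                centers[c] = sum(members) / len(members)
    return centers, labels

# Hypothetical low-level features: dark background vs. bright target pixels.
pixels = [0.05, 0.1, 0.08, 0.9, 0.85, 0.95, 0.12, 0.88]
centers, labels = kmeans_1d(pixels, k=2)
```

    On well-separated features like these, the two clusters recover the background/target split regardless of initialization; the paper's super-pixel initialization would replace the random sampling used here.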

  16. Simultaneous LC-MS/MS determination of 40 legal and illegal psychoactive drugs in breast and bovine milk.

    PubMed

    López-García, Ester; Mastroianni, Nicola; Postigo, Cristina; Valcárcel, Yolanda; González-Alonso, Silvia; Barceló, Damia; López de Alda, Miren

    2018-04-15

    This work presents a fast, sensitive and reliable multi-residue methodology based on fat and protein precipitation and liquid chromatography-tandem mass spectrometry for the determination of common legal and illegal psychoactive drugs, and major metabolites, in breast milk. One-fourth of the 40 target analytes is investigated for the first time in this biological matrix. The method was validated in breast milk and also in various types of bovine milk, as tranquilizers are occasionally administered to food-producing animals. Absolute recoveries were satisfactory for 75% of the target analytes. The use of isotopically labeled compounds assisted in correcting analyte losses due to ionization suppression matrix effects (higher in whole milk than in the other investigated milk matrices) and ensured the reliability of the results. Average method limits of quantification ranged between 0.4 and 6.8 ng/mL. Application of the developed method showed the presence of caffeine in breast milk samples (12-179 ng/mL). Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Genetic tool development underpins recent advances in thermophilic whole‐cell biocatalysts

    PubMed Central

    Taylor, M. P.; van Zyl, L.; Tuffin, I. M.; Leak, D. J.; Cowan, D. A.

    2011-01-01

    Summary: The environmental value of sustainably producing bioproducts from biomass is now widely appreciated, with a primary target being the economic production of fuels such as bioethanol from lignocellulose. The application of thermophilic prokaryotes is a rapidly developing niche in this field, driven by their known catabolic versatility with lignocellulose‐derived carbohydrates. Fundamental to the success of this work has been the development of reliable genetic and molecular systems. These technical tools are now available to assist in the development of other (hyper)thermophilic strains with diverse phenotypes such as hemicellulolytic and cellulolytic properties, branched chain alcohol production and other ‘valuable bioproduct’ synthetic capabilities. Here we present an insight into the historical limitations, recent developments and current status of a number of genetic systems for thermophiles. We also highlight the value of reliable genetic methods for increasing our knowledge of thermophile physiology. We argue that the development of robust genetic systems is paramount in the evolution of future thermophile-based bioprocesses and make suggestions for future approaches and genetic targets that will facilitate this process. PMID:21310009

  18. Large-Scale Off-Target Identification Using Fast and Accurate Dual Regularized One-Class Collaborative Filtering and Its Application to Drug Repurposing.

    PubMed

    Lim, Hansaim; Poleksic, Aleksandar; Yao, Yuan; Tong, Hanghang; He, Di; Zhuang, Luke; Meng, Patrick; Xie, Lei

    2016-10-01

    Target-based screening is one of the major approaches in drug discovery. Besides the intended target, unexpected drug off-target interactions often occur, and many of them have not been recognized and characterized. The off-target interactions can be responsible for either therapeutic or side effects. Thus, identifying the genome-wide off-targets of lead compounds or existing drugs will be critical for designing effective and safe drugs, and providing new opportunities for drug repurposing. Although many computational methods have been developed to predict drug-target interactions, they are either less accurate than the one that we are proposing here or computationally too intensive, thereby limiting their capability for large-scale off-target identification. In addition, the performances of most machine learning based algorithms have been mainly evaluated to predict off-target interactions in the same gene family for hundreds of chemicals. It is not clear how these algorithms perform in terms of detecting off-targets across gene families on a proteome scale. Here, we present a fast and accurate off-target prediction method, REMAP, which is based on a dual regularized one-class collaborative filtering algorithm, to explore continuous chemical space, protein space, and their interactome on a large scale. When tested in a reliable, extensive, and cross-gene family benchmark, REMAP outperforms the state-of-the-art methods. Furthermore, REMAP is highly scalable. It can screen a dataset of 200,000 chemicals against 20,000 proteins within 2 hours. Using the reconstructed genome-wide target profile as the fingerprint of a chemical compound, we predicted that seven FDA-approved drugs can be repurposed as novel anti-cancer therapies. The anti-cancer activity of six of them is supported by experimental evidence.
Thus, REMAP is a valuable addition to the existing in silico toolbox for drug target identification, drug repurposing, phenotypic screening, and side effect prediction. The software and benchmark are available at https://github.com/hansaimlim/REMAP.

  19. Large-Scale Off-Target Identification Using Fast and Accurate Dual Regularized One-Class Collaborative Filtering and Its Application to Drug Repurposing

    PubMed Central

    Poleksic, Aleksandar; Yao, Yuan; Tong, Hanghang; Meng, Patrick; Xie, Lei

    2016-01-01

    Target-based screening is one of the major approaches in drug discovery. Besides the intended target, unexpected drug off-target interactions often occur, and many of them have not been recognized and characterized. The off-target interactions can be responsible for either therapeutic or side effects. Thus, identifying the genome-wide off-targets of lead compounds or existing drugs will be critical for designing effective and safe drugs, and providing new opportunities for drug repurposing. Although many computational methods have been developed to predict drug-target interactions, they are either less accurate than the one that we are proposing here or computationally too intensive, thereby limiting their capability for large-scale off-target identification. In addition, the performances of most machine learning based algorithms have been mainly evaluated to predict off-target interactions in the same gene family for hundreds of chemicals. It is not clear how these algorithms perform in terms of detecting off-targets across gene families on a proteome scale. Here, we present a fast and accurate off-target prediction method, REMAP, which is based on a dual regularized one-class collaborative filtering algorithm, to explore continuous chemical space, protein space, and their interactome on a large scale. When tested in a reliable, extensive, and cross-gene family benchmark, REMAP outperforms the state-of-the-art methods. Furthermore, REMAP is highly scalable. It can screen a dataset of 200,000 chemicals against 20,000 proteins within 2 hours. Using the reconstructed genome-wide target profile as the fingerprint of a chemical compound, we predicted that seven FDA-approved drugs can be repurposed as novel anti-cancer therapies. The anti-cancer activity of six of them is supported by experimental evidence.
Thus, REMAP is a valuable addition to the existing in silico toolbox for drug target identification, drug repurposing, phenotypic screening, and side effect prediction. The software and benchmark are available at https://github.com/hansaimlim/REMAP. PMID:27716836

  20. The influence of incubation time, sample preparation and exposure to oxygen on the quality of the MALDI-TOF MS spectrum of anaerobic bacteria.

    PubMed

    Veloo, A C M; Elgersma, P E; Friedrich, A W; Nagy, E; van Winkelhoff, A J

    2014-12-01

    With matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS), bacteria can be identified quickly and reliably. This is especially true of anaerobic bacteria. Because growth rate and oxygen sensitivity differ among anaerobic bacteria, we aimed to study the influence of incubation time, exposure to oxygen and sample preparation on the quality of the spectrum using the Bruker system. Reproducibility and inter-examiner variability were also determined. Twenty-six anaerobic species, representing 17 genera, were selected based on gram-stain characteristics, growth rate and colony morphology. Inter-examiner variation showed that experience in the preparation of the targets can be a significant variable. The influence of incubation time was determined between 24 and 96 h of incubation. Reliable species identification was obtained after 48 h of incubation for gram-negative anaerobes and after 72 h for gram-positive anaerobes. Exposure of the cultures to oxygen did not influence the results of the MALDI-TOF MS identifications of any tested gram-positive species. Fusobacterium necrophorum and Prevotella intermedia could not be identified after >24 h and 48 h of exposure to oxygen, respectively. Other tested gram-negative bacteria could be identified after 48 h of exposure to oxygen. Most of the tested species could be identified using the direct spotting method. Bifidobacterium longum and Finegoldia magna needed on-target extraction with 70% formic acid to obtain reliable species identification, and Peptoniphilus ivorii required a full extraction. Spectrum quality was influenced by the amount of bacteria spotted on the target, the homogeneity of the smear and the experience of the examiner. © 2014 The Authors Clinical Microbiology and Infection © 2014 European Society of Clinical Microbiology and Infectious Diseases.

  1. Definition of DLPFC and M1 according to anatomical landmarks for navigated brain stimulation: inter-rater reliability, accuracy, and influence of gender and age.

    PubMed

    Mylius, V; Ayache, S S; Ahdab, R; Farhat, W H; Zouari, H G; Belke, M; Brugières, P; Wehrmann, E; Krakow, K; Timmesfeld, N; Schmidt, S; Oertel, W H; Knake, S; Lefaucheur, J P

    2013-09-01

    The optimization of the targeting of a defined cortical region is a challenge in the current practice of transcranial magnetic stimulation (TMS). The dorsolateral prefrontal cortex (DLPFC) and the primary motor cortex (M1) are among the most usual TMS targets, particularly in its "therapeutic" application. This study describes a practical algorithm to determine the anatomical location of the DLPFC and M1 using a three-dimensional (3D) brain reconstruction provided by a TMS-dedicated navigation system from individual magnetic resonance imaging (MRI) data. The coordinates of the right and left DLPFC and M1 were determined in 50 normal brains (100 hemispheres) by five different investigators using a standardized procedure. Inter-rater reliability was good, with 95% limits of agreement ranging between 7 and 16 mm for the different coordinates. As expressed in the Talairach space and compared with anatomical or imaging data from the literature, the coordinates of the DLPFC defined by our algorithm corresponded to the junction between BA9 and BA46, while M1 coordinates corresponded to the posterior border of hand representation. Finally, we found an influence of gender and possibly of age on some coordinates on both rostrocaudal and dorsoventral axes. Our algorithm requires only short training and can be used to provide reliable targeting of the DLPFC and M1 across various TMS investigators. This method, based on an image-guided navigation system using individual MRI data, should be helpful to a variety of TMS studies, especially to standardize the procedure of stimulation in multicenter "therapeutic" studies. Copyright © 2013 Elsevier Inc. All rights reserved.
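
    The 95% limits of agreement used above to summarize inter-rater reliability are conventionally computed as the mean inter-rater difference ± 1.96 SD of the differences (Bland-Altman). A small sketch on hypothetical coordinates, not the study's data:

```python
from statistics import mean, stdev

def limits_of_agreement(rater_a, rater_b):
    """Bland-Altman 95% limits: mean difference +/- 1.96 SD of differences."""
    diffs = [a - b for a, b in zip(rater_a, rater_b)]
    d, s = mean(diffs), stdev(diffs)
    return d - 1.96 * s, d + 1.96 * s

# Hypothetical x-coordinates (mm) of one cortical target from two raters.
rater_a = [38.0, 41.0, 36.5, 40.0, 39.0]
rater_b = [37.0, 43.0, 35.0, 41.5, 38.0]
low, high = limits_of_agreement(rater_a, rater_b)
```

    The width of the (low, high) interval is what the abstract reports as ranging between 7 and 16 mm across coordinates.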

  2. Development and validation of a vision-specific quality-of-life questionnaire for Timor-Leste.

    PubMed

    du Toit, Rènée; Palagyi, Anna; Ramke, Jacqueline; Brian, Garry; Lamoureux, Ecosse L

    2008-10-01

    To develop and determine the reliability and validity of a vision-specific quality-of-life instrument (TL-VSQOL) designed to assess the impact of distance and near vision impairment in adults living in Timor-Leste. A vision-specific quality-of-life questionnaire was developed, piloted, and administered to 704 Timorese aged ≥40 years during a population-based eye health rapid assessment. Rasch analysis was performed on the data of 457 participants with presenting near vision worse than N8 (78.5%) and/or distance vision worse than 6/18 (69.8%). Unidimensionality, item fit to the model, response category performance, differential item functioning, and targeting of items to participants were assessed. Initially, the questionnaire lacked fit to the Rasch model. Removal of two items concerning emotional well-being resulted in a fit of the data (overall item-trait interaction: χ²(df) = 81(51); mean (SD) person and item fit residual values: -0.30 (1.02) and -0.32 (1.46)), and good targeting of person ability and item difficulty was evident. Poorer distance and near visual acuities were significantly associated with worse quality-of-life scores (P < 0.001). Person separation reliability was substantial (0.93), indicating that the instrument can discriminate between groups with normal and impaired vision. All 17 items were free of differential item functioning, and there was no evidence of multidimensionality. This 17-item TL-VSQOL has high reliability, construct and criterion validity, and effective targeting. It can effectively assess the impact on quality of life of adult Timorese with distance and near vision impairment. The TL-VSQOL could be adapted for use in other low-resource settings.

  3. Recovering actives in multi-antitarget and target design of analogs of the myosin II inhibitor blebbistatin

    NASA Astrophysics Data System (ADS)

    Roman, Bart I.; Guedes, Rita C.; Stevens, Christian V.; García-Sosa, Alfonso T.

    2018-05-01

    In multitarget drug design, it is critical to identify active and inactive compounds against a variety of targets and antitargets. Multitarget strategies thus test the limits of available technology, be that in screening large databases of compounds versus a large number of targets, or in using in silico methods for understanding and reliably predicting these pharmacological outcomes. In this paper, we have evaluated the potential of several in silico approaches to predict the target, antitarget and physicochemical profile of (S)-blebbistatin, the best-known myosin II ATPase inhibitor, and a series of analogs thereof. Standard and augmented structure-based design techniques could not recover the observed activity profiles. A ligand-based method using molecular fingerprints was, however, able to select actives for myosin II inhibition. Using further ligand- and structure-based methods, we also evaluated toxicity through androgen receptor binding, affinity for an array of antitargets and the ADME profile (including assay-interfering compounds) of the series. In conclusion, in the search for (S)-blebbistatin analogs, the dissimilarity distance of molecular fingerprints to known actives and the computed antitarget and physicochemical profile of the molecules can be used for compound design for molecules with potential as tools for modulating myosin II and motility-related diseases.

  4. Reliability of Laterality Effects in a Dichotic Listening Task with Words and Syllables

    ERIC Educational Resources Information Center

    Russell, Nancy L.; Voyer, Daniel

    2004-01-01

    Large and reliable laterality effects have been found using a dichotic target detection task in a recent experiment using word stimuli pronounced with an emotional component. The present study tested the hypothesis that the magnitude and reliability of the laterality effects would increase with the removal of the emotional component and variations…

  5. Fast photoacoustic imaging system based on 320-element linear transducer array.

    PubMed

    Yin, Bangzheng; Xing, Da; Wang, Yi; Zeng, Yaguang; Tan, Yi; Chen, Qun

    2004-04-07

    A fast photoacoustic (PA) imaging system, based on a 320-transducer linear array, was developed and tested on a tissue phantom. To reconstruct a test tomographic image, 64 time-domain PA signals were acquired from a tissue phantom with embedded light-absorbing targets. Signal acquisition was accomplished using 11 phase-controlled sub-arrays, each consisting of four transducers. The results show that the system can rapidly map the optical absorption of a tissue phantom and effectively detect the embedded light-absorbing target. By utilizing the multi-element linear transducer array and a phase-controlled imaging algorithm, PA tomography can be acquired more efficiently than with other existing technologies and algorithms. The methodology and equipment thus provide a rapid and reliable approach to PA imaging with potential applications in noninvasive imaging and clinical diagnosis.

  6. Reliability of questionnaires to assess the healthy eating and activity environment of a child's home and school.

    PubMed

    Wilson, Annabelle; Magarey, Anthea; Mastersson, Nadia

    2013-01-01

    Childhood overweight and obesity are a growing concern globally, and environments, including the home and school, can contribute to this epidemic. This paper assesses the reliability of two questionnaires (parent and teacher) used in the evaluation of a community-based childhood obesity prevention intervention, the eat well be active (ewba) Community Programs. Parents and teachers were recruited from two primary schools and they completed the same questionnaire twice in 2008 and 2009. Data from both questionnaires were classified into outcomes relevant to healthy eating and activity, and target outcomes, based on the goals of the ewba Community Programs, were identified. Fourteen and 12 outcomes were developed from the parent and teacher questionnaires, respectively. Sixty parents and 28 teachers participated in the reliability study. Intraclass correlation coefficients for outcomes ranged from 0.37 to 0.92 (parent) (P < 0.05) and from 0.42 to 0.86 (teacher) (P < 0.05). Internal consistency, measured by Cronbach's alpha, of teacher scores ranged from 0.11 to 0.91 and 0.13 to 0.78 for scores from the parent questionnaire. The parent and teacher questionnaires are moderately reliable tools for simultaneously assessing child intakes, environments, attitudes, and knowledge associated with healthy eating and physical activity in the home and school and may be useful for evaluation of similar programs.
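
    Cronbach's alpha, the internal-consistency measure reported above, can be computed from the per-item score variances and the variance of each respondent's total score. A minimal sketch; the scores below are hypothetical, not the ewba data:

```python
from statistics import variance

def cronbach_alpha(items):
    """items: one list of scores per item, respondents in the same order.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    sum_item_var = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# Hypothetical scores from five parents on a three-item scale.
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
alpha = cronbach_alpha(items)
```

    Values in roughly the 0.7-0.9 range are usually read as acceptable internal consistency, which is why the 0.11-0.13 lower bounds reported above flag weak subscales.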

  7. Accuracy-based proficiency testing for testosterone measurements with immunoassays and liquid chromatography-mass spectrometry.

    PubMed

    Cao, Zhimin Tim; Botelho, Julianne Cook; Rej, Robert; Vesper, Hubert

    2017-06-01

    Accurate testosterone measurements are needed to correctly diagnose and treat patients. Proficiency testing (PT) programs using modified specimens can be limited by matrix effects and by the use of evaluation targets not defined by a reference measurement procedure (RMP). Accuracy-based PT can overcome such limitations; however, there is a lack of information on accuracy-based PT and the feasibility of its implementation for evaluating testosterone measurements. Unaltered, single-donor human serum specimens from 2 male and 2 female adult donors were analyzed for testosterone by 142 NYSDH-certified clinical laboratories using 16 immunoassays and LC-MS/MS methods. Testosterone target values were determined using an RMP. The testosterone target concentrations for the 4 specimens were 15.5, 30.0, 402 and 498 ng/dl. The biases ranged from -17.8% to 73.1%, 3.1% to 21.3%, -24.8% to 8.6%, and -22.1% to 6.8% for the 4 specimens, respectively. Using a total error target of ±25.1%, calculated from the minimum allowable bias and imprecision, 73% of participating laboratories had ≥3 of the 4 results within these limits. The variability in total testosterone measurements can affect clinical decisions. Accuracy-based PT can significantly contribute to improving testosterone testing by providing reliable accuracy data to laboratories, assay manufacturers, and standardization programs.
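
    The bias evaluation described above reduces to a simple comparison against the RMP-assigned target. A sketch using the abstract's target concentrations and hypothetical single-lab results (not study data), with the ±25.1% total-error limit:

```python
def percent_bias(measured: float, target: float) -> float:
    """Signed percent bias of a result relative to the RMP-assigned target."""
    return 100.0 * (measured - target) / target

def within_total_error(measured: float, target: float,
                       limit_pct: float = 25.1) -> bool:
    """True if the result falls inside the symmetric total-error limit."""
    return abs(percent_bias(measured, target)) <= limit_pct

# RMP target values (ng/dl) from the abstract; the measured values below
# are hypothetical results from a single lab, not study data
targets = [15.5, 30.0, 402.0, 498.0]
measured = [18.1, 33.2, 371.0, 362.0]

flags = [within_total_error(m, t) for m, t in zip(measured, targets)]
print(flags)  # the last result (about -27.3% bias) fails the limit
```

    Under the abstract's criterion, a lab with these four results (three passes) would still meet the "≥3 of 4 within limits" threshold.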

  8. Interventions for Age-Related Macular Degeneration: Are Practice Guidelines Based on Systematic Reviews?

    PubMed Central

    Lindsley, Kristina; Li, Tianjing; Ssemanda, Elizabeth; Virgili, Gianni; Dickersin, Kay

    2016-01-01

    Topic: Are existing systematic reviews of interventions for age-related macular degeneration incorporated into clinical practice guidelines? Clinical relevance: High-quality systematic reviews should be used to underpin evidence-based clinical practice guidelines and clinical care. We examined the reliability of systematic reviews of interventions for age-related macular degeneration (AMD) and described the main findings of reliable reviews in relation to clinical practice guidelines. Methods: Eligible publications were systematic reviews of the effectiveness of treatment interventions for AMD. We searched a database of systematic reviews in eyes and vision, applying no language or date restrictions; the database is up-to-date as of May 6, 2014. Two authors independently screened records for eligibility and abstracted and assessed the characteristics and methods of each review. We classified reviews as “reliable” when they reported eligibility criteria, comprehensive searches, appraisal of the methodological quality of included studies, appropriate statistical methods for meta-analysis, and conclusions based on results. We mapped treatment recommendations from the American Academy of Ophthalmology Preferred Practice Patterns (AAO PPP) for AMD to the identified systematic reviews and assessed whether any reliable systematic review was cited, or could have been cited, to support each treatment recommendation. Results: Of 1,570 systematic reviews in our database, 47 met our inclusion criteria. Most of the systematic reviews targeted neovascular AMD and investigated anti-vascular endothelial growth factor (anti-VEGF) interventions, dietary supplements, or photodynamic therapy. We classified over two-thirds (33/47) of the reports as reliable. The quality of reporting varied, with criteria for reliable reporting met more often by Cochrane reviews and by reviews whose authors disclosed conflicts of interest. Although most systematic reviews were reliable, anti-VEGF agents and photodynamic therapy were the only interventions identified as effective by reliable reviews. Of 35 treatment recommendations extracted from the AAO PPP, 15 could have been supported with reliable systematic reviews; however, only one recommendation had an accompanying intervention systematic review citation, which we assessed as a reliable systematic review. No reliable systematic review was identified for 20 treatment recommendations, highlighting evidence gaps. Conclusions: For AMD, reliable systematic reviews exist for many treatment recommendations in the AAO PPP and should be used to support these recommendations. We also identified areas where no high-level evidence exists. Mapping clinical practice guidelines to existing systematic reviews is one way to highlight areas where evidence generation or evidence synthesis is available or needed. PMID:26804762

  9. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models.

    PubMed

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health. Analyzing the DTI profile of a drug helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug modes of action. It is therefore highly important to predict the DTI profiles of drugs reliably and rapidly on a genome-wide scale. Here, we develop the TargetNet server, which makes real-time DTI predictions based only on molecular structure, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct the prediction models. Ensemble learning over these fingerprints is also provided to improve prediction ability. When a user submits a molecule, the server predicts its activity across 623 human proteins using the established high-quality SAR models, generating a DTI profile that can be used as a feature vector of the chemical for a wide range of applications. The 623 SAR models, one per protein, were strictly evaluated and validated by several model validation strategies, yielding AUC scores of 75-100%. We applied the generated DTI profiles to successfully predict potential targets, toxicity classes, drug-drug interactions, and drug modes of action, demonstrating the wide applicability of DTI profiling. The TargetNet web server is built on the Django framework in Python and is freely accessible at http://targetnet.scbdd.com.
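
    As an illustration of the modeling approach named above (not the actual TargetNet models, fingerprints, or data), a minimal Bernoulli Naïve Bayes classifier over binary fingerprint vectors might look like:

```python
import numpy as np

class BernoulliNB:
    """Minimal Bernoulli Naive Bayes over binary fingerprint vectors."""

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes_ = np.unique(y)
        self.log_prior_ = np.log([np.mean(y == c) for c in self.classes_])
        # Laplace-smoothed per-bit "on" probabilities for each class
        self.theta_ = np.array([(X[y == c].sum(0) + 1) / (np.sum(y == c) + 2)
                                for c in self.classes_])
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        log_lik = (X @ np.log(self.theta_).T
                   + (1 - X) @ np.log(1 - self.theta_).T)
        return self.classes_[np.argmax(log_lik + self.log_prior_, axis=1)]

# Toy 8-bit fingerprints (hypothetical); label 1 = active against a target
X = [[1, 1, 0, 0, 1, 0, 0, 1],
     [1, 0, 1, 0, 1, 0, 0, 1],
     [1, 1, 1, 0, 1, 0, 1, 1],
     [0, 0, 0, 1, 0, 1, 1, 0],
     [0, 1, 0, 1, 0, 1, 0, 0],
     [0, 0, 0, 1, 0, 1, 1, 0]]
y = [1, 1, 1, 0, 0, 0]

model = BernoulliNB().fit(X, y)
print(model.predict([[1, 1, 0, 0, 1, 0, 0, 1],
                     [0, 0, 0, 1, 0, 1, 1, 0]]))  # → [1 0]
```

    A per-protein model of this kind, trained on much longer real fingerprints, yields one activity call per target; concatenating the 623 calls gives the profile vector described in the abstract.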

  10. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models

    NASA Astrophysics Data System (ADS)

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health. Analyzing the DTI profile of a drug helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug modes of action. It is therefore highly important to predict the DTI profiles of drugs reliably and rapidly on a genome-wide scale. Here, we develop the TargetNet server, which makes real-time DTI predictions based only on molecular structure, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct the prediction models. Ensemble learning over these fingerprints is also provided to improve prediction ability. When a user submits a molecule, the server predicts its activity across 623 human proteins using the established high-quality SAR models, generating a DTI profile that can be used as a feature vector of the chemical for a wide range of applications. The 623 SAR models, one per protein, were strictly evaluated and validated by several model validation strategies, yielding AUC scores of 75-100%. We applied the generated DTI profiles to successfully predict potential targets, toxicity classes, drug-drug interactions, and drug modes of action, demonstrating the wide applicability of DTI profiling. The TargetNet web server is built on the Django framework in Python and is freely accessible at http://targetnet.scbdd.com.

  11. Designing polymers with sugar-based advantages for bioactive delivery applications.

    PubMed

    Zhang, Yingyue; Chan, Jennifer W; Moretti, Alysha; Uhrich, Kathryn E

    2015-12-10

    Sugar-based polymers have been extensively explored as a means to increase the biocompatibility and biodegradability of drug delivery systems. Here, we review the use of sugar-based polymers for drug delivery applications, with a particular focus on the utility of the sugar component(s) for drug targeting and stimuli-responsive systems. Specifically, numerous synthetic methods have been developed to reliably modify naturally occurring polysaccharides, conjugate sugar moieties to synthetic polymer scaffolds to generate glycopolymers, and utilize sugars as multifunctional building blocks to develop sugar-linked polymers. The design of sugar-based polymer systems has tremendous implications for the physiological and biological properties imparted by the saccharide units, properties that are unique relative to synthetic polymers. These features include the ability of glycopolymers to preferentially target various cell types and tissues through receptor interactions, to exhibit bioadhesion for prolonged residence time, and to be rapidly recognized and internalized by cancer cells. Also discussed are the distinct stimuli-sensitive properties of saccharide-modified polymers for mediating drug release under desired conditions. Saccharide-based systems with inherent pH- and temperature-sensitive properties, as well as enzyme-cleavable polysaccharides for targeted bioactive delivery, are covered. Overall, this work emphasizes the inherent benefits of sugar-containing polymer systems for bioactive delivery.

  12. Estimating the impact on health of poor reliability of drinking water interventions in developing countries.

    PubMed

    Hunter, Paul R; Zmirou-Navier, Denis; Hartemann, Philippe

    2009-04-01

    Recent evidence suggests that many improved drinking water supplies suffer from poor reliability. This study investigates what impact poor reliability may have on achieving health improvement targets. A Quantitative Microbiological Risk Assessment was conducted of the impact of interruptions in water supplies that force people to revert to drinking raw water. Data from the literature were used to construct models for three waterborne pathogens common in Africa: Rotavirus, Cryptosporidium and Enterotoxigenic E. coli. Risk of infection by the target pathogens is substantially greater on days when people revert to raw water consumption. Over the course of a few days of raw water consumption, almost all of the annual health benefits attributed to consumption of water from an improved supply will be lost. Furthermore, the risk of illness on days spent drinking raw water will fall disproportionately on very young children, who have the highest risk of death following infection. Agencies responsible for implementing improved drinking water provision will not make meaningful contributions to public health targets if their systems are subject to poor reliability. Funders of water quality interventions in developing countries should put more effort into auditing whether interventions are sustainable and whether the health benefits are being achieved.
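
    A risk assessment of this kind can be sketched with an exponential dose-response model; the pathogen parameter r and the daily dose values below are hypothetical placeholders, not the study's figures:

```python
import math

def daily_infection_risk(dose: float, r: float) -> float:
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_improved: float, p_raw: float, raw_days: int) -> float:
    """Annual infection risk when `raw_days` days revert to raw water."""
    p_no_infection = ((1.0 - p_improved) ** (365 - raw_days)
                      * (1.0 - p_raw) ** raw_days)
    return 1.0 - p_no_infection

# Hypothetical dose-response parameter and daily ingested doses
r = 0.004                                         # per-organism infectivity
p_improved = daily_infection_risk(dose=0.1, r=r)  # improved supply, low dose
p_raw = daily_infection_risk(dose=50.0, r=r)      # raw water, high dose

print(round(annual_risk(p_improved, p_raw, raw_days=0), 3))
print(round(annual_risk(p_improved, p_raw, raw_days=7), 3))
```

    Because the daily non-infection probability is exp(-r · dose), even a handful of high-dose raw-water days can dominate the annual risk, which mirrors the abstract's central point about unreliable supplies.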

  13. Highly Enhanced Electromechanical Stability of Large-Area Graphene with Increased Interfacial Adhesion Energy by Electrothermal-Direct Transfer for Transparent Electrodes.

    PubMed

    Kim, Jangheon; Kim, Gi Gyu; Kim, Soohyun; Jung, Wonsuk

    2016-09-07

    Graphene, a two-dimensional sheet of carbon atoms in a hexagonal lattice, has been extensively investigated for research and industrial applications as a promising material with outstanding electrical, mechanical, and chemical properties. To fabricate graphene-based devices, transferring graphene to the target substrate with a clean and minimally defective surface is the first step. However, graphene transfer technologies still require improvement in terms of uniform transfer over a clean, unfolded, and untorn area, the number of defects, and the electromechanical reliability of the transferred graphene. More specifically, uniform transfer over a large area is a key challenge when graphene is repeatedly transferred onto pretransferred layers, because the adhesion energy between graphene layers is too low to ensure uniform transfer, even though uniform multilayers of graphene have exhibited enhanced electrical and optical properties. In this work, we developed a new electrothermal-direct (ETD) transfer method for large-area, high-quality monolayer graphene with fewer defects and no folding or tearing at the surface. The method delivers uniform multilayer transfer of graphene by repetitive monolayer transfer steps, based on high adhesion energy between the graphene layers and the target substrate. To investigate the greatly enhanced electromechanical stability, we conducted mechanical elastic bending experiments and reliability tests in a highly humid environment. ETD-transferred graphene is expected to replace commercial transparent electrodes in devices such as touch panels requiring outstanding electromechanical stability.

  14. Drug Target Prediction and Repositioning Using an Integrated Network-Based Approach

    PubMed Central

    Emig, Dorothea; Ivliev, Alexander; Pustovalova, Olga; Lancashire, Lee; Bureeva, Svetlana; Nikolsky, Yuri; Bessarabova, Marina

    2013-01-01

    The discovery of novel drug targets is a significant challenge in drug development. Although the human genome comprises approximately 30,000 genes, proteins encoded by fewer than 400 are used as drug targets in the treatment of diseases. Therefore, novel drug targets are extremely valuable as the source of first-in-class drugs. On the other hand, many of the currently known drug targets are functionally pleiotropic and involved in multiple pathologies. Several of them are exploited for treating multiple diseases, which highlights the need for methods to reliably reposition drug targets to new indications. Network-based methods have been successfully applied to prioritize novel disease-associated genes. In recent years, several such algorithms have been developed, some focusing on local network properties only and others taking the complete network topology into account. Common to all approaches is the understanding that novel disease-associated candidates lie in close overall proximity to known disease genes. However, the relevance of these methods to the prediction of novel drug targets has not yet been assessed. Here, we present a network-based approach for the prediction of drug targets for a given disease. The method allows both the repositioning of drug targets known for other diseases to the given disease and the prediction of unexploited drug targets not yet used to treat any disease. Our approach takes as input a disease gene expression signature and a high-quality interaction network and outputs a prioritized list of drug targets. We demonstrate the high performance of our method and highlight the usefulness of the predictions in three case studies. We present novel drug targets for scleroderma and different types of cancer, together with their underlying biological processes. Furthermore, we demonstrate the ability of our method to identify non-suspected repositioning candidates, using diabetes type 1 as an example. PMID:23593264
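
    The core idea, that candidate targets lie in close network proximity to known disease genes, can be sketched with a breadth-first-search distance score on a toy interaction network; the network and gene names below are hypothetical, and real methods use far richer scoring:

```python
from collections import deque

def shortest_path_lengths(graph: dict, source: str) -> dict:
    """BFS shortest-path lengths from `source` in an unweighted network."""
    dist, queue = {source: 0}, deque([source])
    while queue:
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

def proximity_score(graph: dict, candidate: str, disease_genes) -> float:
    """Mean shortest-path distance to the disease genes (lower = closer)."""
    dist = shortest_path_lengths(graph, candidate)
    return sum(dist[g] for g in disease_genes) / len(disease_genes)

# Hypothetical toy interaction network and disease gene set
ppi = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D"},
       "D": {"B", "C", "E"}, "E": {"D", "F"}, "F": {"E"}}
disease = ["A", "B"]

ranked = sorted(["C", "E", "F"], key=lambda g: proximity_score(ppi, g, disease))
print(ranked)  # candidates ordered from closest to farthest
```

    Ranking candidates by such a proximity score is the simplest instance of the "local network property" methods the abstract contrasts with whole-topology approaches.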

  15. Development of the Exam of GeoloGy Standards, EGGS, to Measure Students' Conceptual Understanding of Geology Concepts

    NASA Astrophysics Data System (ADS)

    Guffey, S. K.; Slater, T. F.; Slater, S. J.

    2017-12-01

    Discipline-based geoscience education researchers have considerable need for criterion-referenced, easy-to-administer, easy-to-score conceptual diagnostic surveys for undergraduates taking introductory science survey courses, so that faculty can better monitor the learning impacts of various interactive teaching approaches. To support ongoing discipline-based science education research to improve teaching and learning across the geosciences, this study establishes the reliability and validity of a 28-item, multiple-choice, pre- and post-instruction Exam of GeoloGy Standards, hereafter simply called EGGS. The content knowledge EGGS addresses is based on 11 consensus concepts derived from a systematic, thematic analysis of the overlapping ideas presented in national science education reform documents, including the Next Generation Science Standards, the AAAS Benchmarks for Science Literacy, the Earth Science Literacy Principles, and the NRC National Science Education Standards. Using community-agreed best practices for creating, field-testing, and iteratively revising modern multiple-choice test items with classical item analysis techniques, EGGS emphasizes natural student language over technical scientific vocabulary, leverages illustrations over students' reading ability, specifically targets student misconceptions identified in the scholarly literature, and covers the range of topics most geology educators expect general education students to know at the end of their formal science learning experiences. The current version of EGGS is judged to be valid and reliable with college-level introductory science survey students based on both standard quantitative and qualitative measures, including extensive clinical interviews with targeted students and systematic expert review.
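
    Classical item analysis of the kind mentioned above typically reports item difficulty and item discrimination. A sketch on a hypothetical 0/1 score matrix (not EGGS data):

```python
import numpy as np

def item_difficulty(responses: np.ndarray) -> np.ndarray:
    """Proportion of students answering each item correctly (0/1 matrix)."""
    return responses.mean(axis=0)

def point_biserial(responses: np.ndarray) -> np.ndarray:
    """Corrected item-total correlation: each item vs. the remaining items."""
    totals = responses.sum(axis=1)
    out = []
    for j in range(responses.shape[1]):
        rest = totals - responses[:, j]  # total score excluding this item
        out.append(np.corrcoef(responses[:, j], rest)[0, 1])
    return np.array(out)

# Toy 0/1 score matrix: 6 students x 3 items (hypothetical)
scores = np.array([[1, 1, 0],
                   [1, 0, 1],
                   [1, 1, 1],
                   [0, 0, 0],
                   [1, 1, 1],
                   [0, 0, 1]])
print(item_difficulty(scores))
print(point_biserial(scores))
```

    Items with mid-range difficulty and positive discrimination survive revision rounds; items near 0 or 1 difficulty, or with low discrimination, are the usual candidates for rewriting.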

  16. An integrated miRNA functional screening and target validation method for organ morphogenesis.

    PubMed

    Rebustini, Ivan T; Vlahos, Maryann; Packer, Trevor; Kukuruzinska, Maria A; Maas, Richard L

    2016-03-16

    The relative ease of identifying microRNAs and their increasing recognition as important regulators of organogenesis motivate the development of methods to efficiently assess microRNA function during organ morphogenesis. In this context, embryonic organ explants provide a reliable and reproducible system that recapitulates some of the important early morphogenetic processes during organ development. Here we present a method to target microRNA function in explanted mouse embryonic organs. Our method combines the use of peptide-based nanoparticles to transfect specific microRNA inhibitors or activators into embryonic organ explants, with a microRNA pulldown assay that allows direct identification of microRNA targets. This method provides effective assessment of microRNA function during organ morphogenesis, allows prioritization of multiple microRNAs in parallel for subsequent genetic approaches, and can be applied to a variety of embryonic organs.

  17. Immunohistochemical Analysis in the Rat Central Nervous System and Peripheral Lymph Node Tissue Sections.

    PubMed

    Adzemovic, Milena Z; Zeitelhofer, Manuel; Leisser, Marianne; Köck, Ulricke; Kury, Angela; Olsson, Tomas

    2016-11-14

    Immunohistochemistry (IHC) provides highly specific, reliable and attractive protein visualization. Correct performance and interpretation of an IHC-based multicolor labeling is challenging, especially when utilized for assessing interrelations between target proteins in the tissue with a high fat content such as the central nervous system (CNS). Our protocol represents a refinement of the standard immunolabeling technique particularly adjusted for detection of both structural and soluble proteins in the rat CNS and peripheral lymph nodes (LN) affected by neuroinflammation. Nonetheless, with or without further modifications, our protocol could likely be used for detection of other related protein targets, even in other organs and species than here presented.

  18. Fabrication of boron sputter targets

    DOEpatents

    Makowiecki, Daniel M.; McKernan, Mark A.

    1995-01-01

    A process for fabricating high-density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high-density boron monolith by hot isostatic compaction of high-purity (99.9%) boron powder, machining the boron monolith to the final dimensions, and brazing the finished boron piece to a matching boron carbide (B4C) piece by placing aluminum foil therebetween and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolith by vacuum deposition. Also, a titanium-based vacuum braze alloy can be used in place of the aluminum foil.

  19. Reliability and validity of the neurorehabilitation experience questionnaire for inpatients.

    PubMed

    Kneebone, Ian I; Hull, Samantha L; McGurk, Rhona; Cropley, Mark

    2012-09-01

    Patient-centered measures of the inpatient neurorehabilitation experience are needed to assess services. The objective of this study was to develop a valid and reliable Neurorehabilitation Experience Questionnaire (NREQ) to assess whether neurorehabilitation inpatients experience service elements important to them. Based on the themes established in prior qualitative research, adopting questions from established inventories and using a literature review, a draft version of the NREQ was generated. Focus groups and interviews were conducted with 9 patients and 26 staff from neurological rehabilitation units to establish face validity. Then, 70 patients were recruited to complete the NREQ to ascertain reliability (internal and test-retest) and concurrent validity. On the basis of the face validity testing, several modifications were made to the draft version of the NREQ. Subsequently, internal reliability (time 1 α = .76, time 2 α = .80), test-retest reliability (r = 0.70), and concurrent validity (r = 0.32 and r = 0.56) were established for the revised version. Whereas responses were associated with positive mood (r = 0.30), they appeared not to be influenced by negative mood, age, education, length of stay, sex, functional independence, or whether a participant had previously been a patient on a unit. Preliminary validation of the NREQ suggests promise for use with its target population.

  20. Targeted Peptide Measurements in Biology and Medicine: Best Practices for Mass Spectrometry-based Assay Development Using a Fit-for-Purpose Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carr, Steven A.; Abbateillo, Susan E.; Ackermann, Bradley L.

    2014-01-14

    Adoption of targeted mass spectrometry (MS) approaches such as multiple reaction monitoring (MRM) to study biological and biomedical questions is well underway in the proteomics community. Successful application depends on the ability to generate reliable assays that uniquely and confidently identify target peptides in a sample. Unfortunately, there is a wide range of criteria being applied to say that an assay has been successfully developed. There is no consensus on what criteria are acceptable and little understanding of the impact of variable criteria on the quality of the results generated. Publications describing targeted MS assays for peptides frequently do not contain sufficient information for readers to establish confidence that the tests work as intended or to be able to apply the tests described in their own labs. Guidance must be developed so that targeted MS assays with established performance can be made widely distributed and applied by many labs worldwide. To begin to address the problems and their solutions, a workshop was held at the National Institutes of Health with representatives from the multiple communities developing and employing targeted MS assays. Participants discussed the analytical goals of their experiments and the experimental evidence needed to establish that the assays they develop work as intended and are achieving the required levels of performance. Using this “fit-for-purpose” approach, the group defined three tiers of assays distinguished by their performance and extent of analytical characterization. Computational and statistical tools useful for the analysis of targeted MS results were described. Participants also detailed the information that authors need to provide in their manuscripts to enable reviewers and readers to clearly understand what procedures were performed and to evaluate the reliability of the peptide or protein quantification measurements reported. This paper presents a summary of the meeting and recommendations. Molecular & Cellular Proteomics 13: 10.1074/mcp.M113.036095, 907–917, 2014.

  1. Targeted peptide measurements in biology and medicine: best practices for mass spectrometry-based assay development using a fit-for-purpose approach.

    PubMed

    Carr, Steven A; Abbatiello, Susan E; Ackermann, Bradley L; Borchers, Christoph; Domon, Bruno; Deutsch, Eric W; Grant, Russell P; Hoofnagle, Andrew N; Hüttenhain, Ruth; Koomen, John M; Liebler, Daniel C; Liu, Tao; MacLean, Brendan; Mani, D R; Mansfield, Elizabeth; Neubert, Hendrik; Paulovich, Amanda G; Reiter, Lukas; Vitek, Olga; Aebersold, Ruedi; Anderson, Leigh; Bethem, Robert; Blonder, Josip; Boja, Emily; Botelho, Julianne; Boyne, Michael; Bradshaw, Ralph A; Burlingame, Alma L; Chan, Daniel; Keshishian, Hasmik; Kuhn, Eric; Kinsinger, Christopher; Lee, Jerry S H; Lee, Sang-Won; Moritz, Robert; Oses-Prieto, Juan; Rifai, Nader; Ritchie, James; Rodriguez, Henry; Srinivas, Pothur R; Townsend, R Reid; Van Eyk, Jennifer; Whiteley, Gordon; Wiita, Arun; Weintraub, Susan

    2014-03-01

    Adoption of targeted mass spectrometry (MS) approaches such as multiple reaction monitoring (MRM) to study biological and biomedical questions is well underway in the proteomics community. Successful application depends on the ability to generate reliable assays that uniquely and confidently identify target peptides in a sample. Unfortunately, there is a wide range of criteria being applied to say that an assay has been successfully developed. There is no consensus on what criteria are acceptable and little understanding of the impact of variable criteria on the quality of the results generated. Publications describing targeted MS assays for peptides frequently do not contain sufficient information for readers to establish confidence that the tests work as intended or to be able to apply the tests described in their own labs. Guidance must be developed so that targeted MS assays with established performance can be made widely distributed and applied by many labs worldwide. To begin to address the problems and their solutions, a workshop was held at the National Institutes of Health with representatives from the multiple communities developing and employing targeted MS assays. Participants discussed the analytical goals of their experiments and the experimental evidence needed to establish that the assays they develop work as intended and are achieving the required levels of performance. Using this "fit-for-purpose" approach, the group defined three tiers of assays distinguished by their performance and extent of analytical characterization. Computational and statistical tools useful for the analysis of targeted MS results were described. Participants also detailed the information that authors need to provide in their manuscripts to enable reviewers and readers to clearly understand what procedures were performed and to evaluate the reliability of the peptide or protein quantification measurements reported. 
This paper presents a summary of the meeting and recommendations.

  2. Targeted Peptide Measurements in Biology and Medicine: Best Practices for Mass Spectrometry-based Assay Development Using a Fit-for-Purpose Approach*

    PubMed Central

    Carr, Steven A.; Abbatiello, Susan E.; Ackermann, Bradley L.; Borchers, Christoph; Domon, Bruno; Deutsch, Eric W.; Grant, Russell P.; Hoofnagle, Andrew N.; Hüttenhain, Ruth; Koomen, John M.; Liebler, Daniel C.; Liu, Tao; MacLean, Brendan; Mani, DR; Mansfield, Elizabeth; Neubert, Hendrik; Paulovich, Amanda G.; Reiter, Lukas; Vitek, Olga; Aebersold, Ruedi; Anderson, Leigh; Bethem, Robert; Blonder, Josip; Boja, Emily; Botelho, Julianne; Boyne, Michael; Bradshaw, Ralph A.; Burlingame, Alma L.; Chan, Daniel; Keshishian, Hasmik; Kuhn, Eric; Kinsinger, Christopher; Lee, Jerry S.H.; Lee, Sang-Won; Moritz, Robert; Oses-Prieto, Juan; Rifai, Nader; Ritchie, James; Rodriguez, Henry; Srinivas, Pothur R.; Townsend, R. Reid; Van Eyk, Jennifer; Whiteley, Gordon; Wiita, Arun; Weintraub, Susan

    2014-01-01

    Adoption of targeted mass spectrometry (MS) approaches such as multiple reaction monitoring (MRM) to study biological and biomedical questions is well underway in the proteomics community. Successful application depends on the ability to generate reliable assays that uniquely and confidently identify target peptides in a sample. Unfortunately, there is a wide range of criteria being applied to say that an assay has been successfully developed. There is no consensus on what criteria are acceptable and little understanding of the impact of variable criteria on the quality of the results generated. Publications describing targeted MS assays for peptides frequently do not contain sufficient information for readers to establish confidence that the tests work as intended or to be able to apply the tests described in their own labs. Guidance must be developed so that targeted MS assays with established performance can be made widely distributed and applied by many labs worldwide. To begin to address the problems and their solutions, a workshop was held at the National Institutes of Health with representatives from the multiple communities developing and employing targeted MS assays. Participants discussed the analytical goals of their experiments and the experimental evidence needed to establish that the assays they develop work as intended and are achieving the required levels of performance. Using this “fit-for-purpose” approach, the group defined three tiers of assays distinguished by their performance and extent of analytical characterization. Computational and statistical tools useful for the analysis of targeted MS results were described. Participants also detailed the information that authors need to provide in their manuscripts to enable reviewers and readers to clearly understand what procedures were performed and to evaluate the reliability of the peptide or protein quantification measurements reported. 
This paper presents a summary of the meeting and recommendations. PMID:24443746

  3. A lightweight neighbor-info-based routing protocol for no-base-station taxi-call system.

    PubMed

    Zhu, Xudong; Wang, Jinhang; Chen, Yunchao

    2014-01-01

    Because of rapid topology changes and short connection durations, VANETs suffer from unstable routing and poor wireless signal quality. This paper proposes a lightweight neighbor-info-based routing protocol, LNIB, for a taxi-call system that operates without base stations, applicable to urban taxis. LNIB maintains and predicts neighbor information dynamically, thereby finding a reliable path between the source and the target. This paper describes the protocol in detail and evaluates its performance through simulation under different node densities and speeds. The evaluation shows that LNIB outperforms AODV, a classic protocol, in the taxi-call scenario.

  4. Cost of unreliability method to estimate loss of revenue based on unreliability data: Case study of Printing Company

    NASA Astrophysics Data System (ADS)

    Alhilman, Judi

    2017-12-01

    In the production line of a printing office, the reliability of the printing machine plays a very important role: if the machine fails, it can disrupt the production target and the company will suffer a large financial loss. One method to calculate the financial loss caused by machine failure is the Cost of Unreliability (COUR) method, which works from machine downtime and the costs associated with unreliability data. Based on the COUR calculation, the total cost due to printing-machine unreliability during active repair time and downtime is 1,003,747.00.
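
As a hedged illustration of the COUR idea, the sketch below splits unreliability cost into production loss during downtime and the cost of active repair. The function name and all rate/hour figures are hypothetical placeholders, not values from the study.

```python
# Minimal sketch of a Cost of Unreliability (COUR) style calculation.
# All names and numbers are illustrative placeholders, not the paper's data.

def cost_of_unreliability(downtime_hours, repair_hours,
                          lost_revenue_per_hour, repair_cost_per_hour):
    """Total cost = production loss during downtime + cost of active repair."""
    production_loss = downtime_hours * lost_revenue_per_hour
    repair_cost = repair_hours * repair_cost_per_hour
    return production_loss + repair_cost

total = cost_of_unreliability(downtime_hours=120, repair_hours=40,
                              lost_revenue_per_hour=500.0,
                              repair_cost_per_hour=75.0)
print(total)  # 120*500 + 40*75 = 63000.0
```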

  5. Evidence base and future research directions in the management of low back pain

    PubMed Central

    Abbott, Allan

    2016-01-01

    Low back pain (LBP) is a prevalent and costly condition. Awareness of valid and reliable patient history taking, physical examination and clinical testing is important for diagnostic accuracy. Stratified care, which targets treatment to patient subgroups based on key characteristics, relies upon accurate diagnostics. Models of stratified care that can potentially improve treatment effects include prognostic risk profiling for persistent LBP, likely response to specific treatment based on clinical prediction models, or suspected underlying causal mechanisms. The focus of this editorial is to highlight current research status and future directions for LBP diagnostics and stratified care. PMID:27004162

  6. Quantitative Detection of Small Molecule/DNA Complexes Employing a Force-Based and Label-Free DNA-Microarray

    PubMed Central

    Ho, Dominik; Dose, Christian; Albrecht, Christian H.; Severin, Philip; Falter, Katja; Dervan, Peter B.; Gaub, Hermann E.

    2009-01-01

    Force-based ligand detection is a promising method to characterize molecular complexes label-free under physiological conditions. Because conventional implementations of this technique, e.g., based on atomic force microscopy or optical traps, are low-throughput and require extremely sensitive and sophisticated equipment, this approach has to date found only limited application. We present a low-cost, chip-based assay, which combines high-throughput force-based detection of dsDNA·ligand interactions with the ease of fluorescence detection. Within the comparative unbinding force assay, many duplicates of a target DNA duplex are each probed against a defined reference DNA duplex. The fractions of broken target and reference DNA duplexes are determined via fluorescence. With this assay, we investigated the DNA binding behavior of artificial pyrrole-imidazole polyamides. These small compounds can be programmed to target specific dsDNA sequences and distinguish between D- and L-DNA. We found that titration with polyamides specific for a binding motif, which is present in the target DNA duplex and not in the reference DNA duplex, reliably resulted in a shift toward larger fractions of broken reference bonds. From the concentration dependence, nanomolar to picomolar dissociation constants of dsDNA·ligand complexes were determined, agreeing well with prior quantitative DNase footprinting experiments. This finding corroborates that the forced unbinding of dsDNA in the presence of a ligand is a nonequilibrium process that produces a snapshot of the equilibrium distribution between dsDNA and dsDNA·ligand complexes. PMID:19486688

  7. The number of measurements needed to obtain high reliability for traits related to enzymatic activities and photosynthetic compounds in soybean plants infected with Phakopsora pachyrhizi.

    PubMed

    Oliveira, Tássia Boeno de; Azevedo Peixoto, Leonardo de; Teodoro, Paulo Eduardo; Alvarenga, Amauri Alves de; Bhering, Leonardo Lopes; Campo, Clara Beatriz Hoffmann

    2018-01-01

    Asian rust affects the physiology of soybean plants and causes losses in yield. Repeatability coefficients may help breeders to know how many measurements are needed to obtain a suitable reliability for a target trait. Therefore, the objectives of this study were to determine the repeatability coefficients of 14 traits in soybean plants inoculated with Phakopsora pachyrhizi and to establish the minimum number of measurements needed to predict the breeding value with high accuracy. Experiments were performed in a 3 × 2 factorial arrangement with three treatments and two inoculations in a randomized block design. Repeatability coefficients, coefficients of determination and the number of measurements needed to obtain a certain reliability were estimated using ANOVA, principal component analysis based on the covariance matrix and the correlation matrix, structural analysis and a mixed model. It was observed that the principal component analysis based on the covariance matrix outperformed the other methods for almost all traits. Significant differences were observed for all traits except internal CO2 concentration for the treatment effects. For the measurement effects, all traits were significantly different. In addition, significant differences were found for all Treatment × Measurement interaction traits except coumestrol, chitinase and chlorophyll content. Six measurements were suitable to obtain a coefficient of determination higher than 0.7 for all traits based on principal component analysis. The information obtained from this research will help breeders and physiologists determine exactly how many measurements are needed to evaluate each trait in soybean plants infected by P. pachyrhizi with a desirable reliability.
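
The link between a trait's repeatability coefficient and the number of measurements needed can be sketched with the standard formula n = R(1 − r) / (r(1 − R)) from the repeatability literature, where r is the repeatability coefficient and R the desired coefficient of determination. The repeatability value 0.30 below is an assumed example, not one of the study's estimates.

```python
import math

def measurements_needed(r, R):
    """Measurements needed for the mean to reach determination R,
    given a single-measurement repeatability r: n = R(1-r) / (r(1-R))."""
    return math.ceil(R * (1 - r) / (r * (1 - R)))

def determination(r, n):
    """Coefficient of determination achieved by the mean of n measurements."""
    return n * r / (1 + (n - 1) * r)

# Example: a trait with assumed repeatability 0.30 and target R = 0.70.
n = measurements_needed(0.30, 0.70)
print(n, round(determination(0.30, n), 3))  # 6 0.72
```

Note that six measurements suffice to push the determination just past 0.7 in this assumed case, which mirrors the order of magnitude reported in the abstract.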

  9. Label-free optical biosensors based on aptamer-functionalized porous silicon scaffolds.

    PubMed

    Urmann, Katharina; Walter, Johanna-Gabriela; Scheper, Thomas; Segal, Ester

    2015-02-03

    A proof-of-concept for a label-free and reagentless optical biosensing platform based on nanostructured porous silicon (PSi) and aptamers is presented in this work. Aptamers are oligonucleotides (single-stranded DNA or RNA) that can bind their targets with high affinity and specificity, making them excellent recognition elements for biosensor design. Here we describe the fabrication and characterization of aptamer-conjugated PSi biosensors, where a previously characterized his-tag binding aptamer (6H7) is used as a model system. Exposure of the aptamer-functionalized PSi to the target proteins as well as to complex fluids (i.e., bacterial lysates containing target proteins) results in robust and well-defined changes in the PSi optical interference spectrum, ascribed to specific aptamer-protein binding events occurring within the nanoscale pores, monitored in real time. The biosensors show exceptional stability and can be easily regenerated by a short rinsing step for multiple biosensing analyses. This proof-of-concept study demonstrates the possibility of designing highly stable and specific label-free optical PSi biosensors, employing aptamers as capture probes, holding immense potential for application in detection of a broad range of targets, in a simple yet reliable manner.

  10. PsRobot: a web-based plant small RNA meta-analysis toolbox.

    PubMed

    Wu, Hua-Jun; Ma, Ying-Ke; Chen, Tong; Wang, Meng; Wang, Xiu-Jie

    2012-07-01

    Small RNAs (smRNAs) in plants, mainly microRNAs and small interfering RNAs, play important roles in both transcriptional and post-transcriptional gene regulation. The broad application of high-throughput sequencing technology has made the routine generation of bulk smRNA sequences in laboratories possible, and has significantly increased the need for batch analysis tools. PsRobot is an easy-to-use web-based tool dedicated to the identification of smRNAs with stem-loop shaped precursors (such as microRNAs and short hairpin RNAs) and their target genes/transcripts. It performs fast analysis to identify smRNAs with stem-loop shaped precursors among batch input data and predicts their targets using a modified Smith-Waterman algorithm. PsRobot integrates the expression data of smRNAs in major plant smRNA biogenesis gene mutants and smRNA-associated protein complexes to give clues to smRNA generation and functional processes. Besides improved specificity, the reliability of smRNA target prediction results can also be evaluated using mRNA cleavage (degradome) data. The cross-species conservation status and the multiplicity of smRNA target sites are also provided. PsRobot is freely accessible at http://omicslab.genetics.ac.cn/psRobot/.
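
For readers unfamiliar with the alignment step, the sketch below is a plain textbook Smith-Waterman local alignment score, not psRobot's modified variant (whose details are not given in the abstract); the sequences and scoring parameters are illustrative only.

```python
def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
    """Textbook Smith-Waterman: best local alignment score between a and b.
    Cells are clamped at zero so an alignment can start/end anywhere."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# A miRNA-like query against a transcript fragment (made-up sequences);
# the query occurs exactly in the target, so the score is 8 matches * 2 = 16.
print(smith_waterman_score("UGGAGUGU", "CAUGGAGUGUAAC"))  # 16
```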

  11. Sensorimotor Mismapping in Poor-pitch Singing.

    PubMed

    He, Hao; Zhang, Wei-Dong

    2017-09-01

    This study proposes that there are two types of sensorimotor mismapping in poor-pitch singing: erroneous mapping and no mapping. We created operational definitions for the two types of mismapping based on the precision of pitch-matching and predicted that in the two types of mismapping, phonation differs in terms of accuracy and the dependence on the articulation consistency between the target and the intended vocal action. The study aimed to test this hypothesis by examining the reliability and criterion-related validity of the operational definitions. A within-subject design was used in this study. Thirty-two participants identified as poor-pitch singers were instructed to vocally imitate pure tones and to imitate their own vocal recordings with the same articulation as self-targets and with different articulation from self-targets. Definitions of the types of mismapping were demonstrated to be reliable with the split-half approach and to have good criterion-related validity with findings that pitch-matching with no mapping was less accurate and more dependent on the articulation consistency between the target and the intended vocal action than pitch-matching with erroneous mapping was. Furthermore, the precision of pitch-matching was positively associated with its accuracy and its dependence on articulation consistency when mismapping was analyzed on a continuum. Additionally, the data indicated that the self-imitation advantage was a function of articulation consistency. Types of sensorimotor mismapping lead to pitch-matching that differs in accuracy and its dependence on the articulation consistency between the target and the intended vocal action. Additionally, articulation consistency produces the self-advantage. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  12. Reliable detection of Bacillus anthracis, Francisella tularensis and Yersinia pestis by using multiplex qPCR including internal controls for nucleic acid extraction and amplification.

    PubMed

    Janse, Ingmar; Hamidjaja, Raditijo A; Bok, Jasper M; van Rotterdam, Bart J

    2010-12-08

    Several pathogens could seriously affect public health if not recognized in a timely manner. To reduce the impact of such highly pathogenic micro-organisms, rapid and accurate diagnostic tools are needed for their detection in various samples, including environmental samples. Multiplex real-time PCRs were designed for rapid and reliable detection of three major pathogens that have the potential to cause high morbidity and mortality in humans: B. anthracis, F. tularensis and Y. pestis. The developed assays detect three pathogen-specific targets, including at least one chromosomal target, and one target from B. thuringiensis which is used as an internal control for nucleic acid extraction from refractory spores as well as successful DNA amplification. Validation of the PCRs showed a high analytical sensitivity, specificity and coverage of diverse pathogen strains. The multiplex qPCR assays that were developed allow the rapid detection of three pathogen-specific targets simultaneously, without compromising sensitivity. The application of B. thuringiensis spores as internal controls further reduces false negative results. This ensures highly reliable detection, while template consumption and laboratory effort are kept at a minimum.

  14. Incidental and context-responsive activation of structure- and function-based action features during object identification

    PubMed Central

    Lee, Chia-lin; Middleton, Erica; Mirman, Daniel; Kalénine, Solène; Buxbaum, Laurel J.

    2012-01-01

    Previous studies suggest that action representations are activated during object processing, even when task-irrelevant. In addition, there is evidence that lexical-semantic context may affect such activation during object processing. Finally, prior work from our laboratory and others indicates that function-based (“use”) and structure-based (“move”) action subtypes may differ in their activation characteristics. Most studies assessing such effects, however, have required manual object-relevant motor responses, thereby plausibly influencing the activation of action representations. The present work utilizes eyetracking and a Visual World Paradigm task without object-relevant actions to assess the time course of activation of action representations, as well as their responsiveness to lexical-semantic context. In two experiments, participants heard a target word and selected its referent from an array of four objects. Gaze fixations on non-target objects signal activation of features shared between targets and non-targets. The experiments assessed activation of structure-based (Experiment 1) or function-based (Experiment 2) distractors, using neutral sentences (“S/he saw the …”) or sentences with a relevant action verb (Experiment 1: “S/he picked up the……”; Experiment 2: “S/he used the….”). We observed task-irrelevant activations of action information in both experiments. In neutral contexts, structure-based activation was relatively faster-rising but more transient than function-based activation. Additionally, action verb contexts reliably modified patterns of activation in both Experiments. These data provide fine-grained information about the dynamics of activation of function-based and structure-based actions in neutral and action-relevant contexts, in support of the “Two Action System” model of object and action processing (e.g., Buxbaum & Kalénine, 2010). PMID:22390294

  15. Quantifying the Relationships among Drug Classes

    PubMed Central

    Hert, Jérôme; Keiser, Michael J.; Irwin, John J.; Oprea, Tudor I.; Shoichet, Brian K.

    2009-01-01

    The similarity of drug targets is typically measured using sequence or structural information. Here, we consider chemo-centric approaches that measure target similarity on the basis of their ligands, asking how chemoinformatics similarities differ from those derived bioinformatically, how stable the ligand networks are to changes in chemoinformatics metrics, and which network is the most reliable for prediction of pharmacology. We calculated the similarities between hundreds of drug targets and their ligands and mapped the relationship between them in a formal network. Bioinformatics networks were based on the BLAST similarity between sequences, while chemoinformatics networks were based on the ligand-set similarities calculated with either the Similarity Ensemble Approach (SEA) or a method derived from Bayesian statistics. By multiple criteria, bioinformatics and chemoinformatics networks differed substantially, and only occasionally did a high sequence similarity correspond to a high ligand-set similarity. In contrast, the chemoinformatics networks were stable to the method used to calculate the ligand-set similarities and to the chemical representation of the ligands. Also, the chemoinformatics networks were more natural and more organized, by network theory, than their bioinformatics counterparts: ligand-based networks were found to be small-world and broad-scale. PMID:18335977
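
The ligand-set comparisons described above can be illustrated with a toy Tanimoto-based similarity. This is a simplified stand-in, not the SEA method itself (SEA additionally corrects for the similarity expected by chance between random compound sets); fingerprints and set contents are made up.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) coefficient between two fingerprint bit sets."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def ligand_set_similarity(set_a, set_b):
    """Naive ligand-set similarity: mean of the best Tanimoto score each
    ligand in set_a achieves against set_b (chance correction omitted)."""
    return sum(max(tanimoto(fa, fb) for fb in set_b) for fa in set_a) / len(set_a)

# Toy fingerprints represented as sets of "on" bit indices:
ligands_a = [{1, 2, 3}, {2, 3, 4}]
ligands_b = [{1, 2, 3}, {7, 8, 9}]
print(ligand_set_similarity(ligands_a, ligands_b))  # (1.0 + 0.5) / 2 = 0.75
```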

  16. Surface-modified CMOS IC electrochemical sensor array targeting single chromaffin cells for highly parallel amperometry measurements.

    PubMed

    Huang, Meng; Delacruz, Joannalyn B; Ruelas, John C; Rathore, Shailendra S; Lindau, Manfred

    2018-01-01

    Amperometry is a powerful method to record quantal release events from chromaffin cells and is widely used to assess how specific drugs modify quantal size, kinetics of release, and early fusion pore properties. Surface-modified CMOS-based electrochemical sensor arrays allow simultaneous recordings from multiple cells. A reliable, low-cost technique is presented here for efficient targeting of single cells specifically to the electrode sites. An SU-8 microwell structure is patterned on the chip surface to provide insulation for the circuitry as well as cell trapping at the electrode sites. A shifted electrode design is also incorporated to increase the flexibility of the dimension and shape of the microwells. The sensitivity of the electrodes is validated by a dopamine injection experiment. Microwells with dimensions slightly larger than the cells to be trapped ensure excellent single-cell targeting efficiency, increasing the reliability and efficiency for on-chip single-cell amperometry measurements. The surface-modified device was validated with parallel recordings of live chromaffin cells trapped in the microwells. Rapid amperometric spikes with no diffusional broadening were observed, indicating that the trapped and recorded cells were in very close contact with the electrodes. The live cell recording confirms in a single experiment that spike parameters vary significantly from cell to cell but the large number of cells recorded simultaneously provides the statistical significance.

  17. Chapter 15: Disease Gene Prioritization

    PubMed Central

    Bromberg, Yana

    2013-01-01

    Disease-causing aberrations in the normal function of a gene define that gene as a disease gene. Proving a causal link between a gene and a disease experimentally is expensive and time-consuming. Comprehensive prioritization of candidate genes prior to experimental testing drastically reduces the associated costs. Computational gene prioritization is based on various pieces of correlative evidence that associate each gene with the given disease and suggest possible causal links. A fair amount of this evidence comes from high-throughput experimentation. Thus, well-developed methods are necessary to reliably deal with the quantity of information at hand. Existing gene prioritization techniques already significantly improve the outcomes of targeted experimental studies. Faster and more reliable techniques that account for novel data types are necessary for the development of new diagnostics, treatments, and cure for many diseases. PMID:23633938

  18. Special methods for aerodynamic-moment calculations from parachute FSI modeling

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Boswell, Cody; Tsutsui, Yuki; Montel, Kenneth

    2015-06-01

    The space-time fluid-structure interaction (STFSI) methods for 3D parachute modeling are now at a level where they can bring reliable, practical analysis to some of the most complex parachute systems, such as spacecraft parachutes. The methods include the Deforming-Spatial-Domain/Stabilized ST method as the core computational technology, and a good number of special FSI methods targeting parachutes. Evaluating the stability characteristics of a parachute based on how the aerodynamic moment varies as a function of the angle of attack is one of the practical analyses that reliable parachute FSI modeling can deliver. We describe the special FSI methods we developed for this specific purpose and present the aerodynamic-moment data obtained from FSI modeling of NASA Orion spacecraft parachutes and Japan Aerospace Exploration Agency (JAXA) subscale parachutes.

  19. Note: The design of thin gap chamber simulation signal source based on field programmable gate array.

    PubMed

    Hu, Kun; Lu, Houbing; Wang, Xu; Li, Feng; Liang, Futian; Jin, Ge

    2015-01-01

    The Thin Gap Chamber (TGC) is an important part of the ATLAS detector at the LHC. Targeting the characteristics of the TGC detector's output signal, we designed a simulation signal source. The core of the design is based on a field programmable gate array that randomly outputs 256 channels of simulated signals. The signals are generated by a true random number generator whose source of randomness is the timing jitter in ring oscillators. The experimental results show that the random numbers are uniformly distributed, and the whole system has high reliability.
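
The histogram-uniformity claim for such a generator can be checked with a Pearson chi-square statistic over binned output values. The sketch below uses synthetic byte streams, not actual TGC source output; the bin count is an arbitrary choice.

```python
def chi_square_uniformity(samples, bins=16, max_value=256):
    """Pearson chi-square statistic for uniformity of integer samples in
    [0, max_value), binned into `bins` equal-width bins. Values near the
    degrees of freedom (bins - 1) indicate a flat histogram."""
    counts = [0] * bins
    width = max_value // bins
    for s in samples:
        counts[s // width] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

uniform = list(range(256)) * 10   # perfectly flat histogram
biased = [0] * 2560               # everything in one bin
print(chi_square_uniformity(uniform), chi_square_uniformity(biased))  # 0.0 38400.0
```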

  1. Backlighting Direct-Drive Cryogenic DT Implosions on OMEGA

    NASA Astrophysics Data System (ADS)

    Stoeckl, C.

    2016-10-01

    X-ray backlighting has been frequently used to measure the in-flight characteristics of an imploding shell in both direct- and indirect-drive inertial confinement fusion implosions. These measurements provide unique insight into the early time and stagnation stages of an implosion and guide the modeling efforts to improve the target designs. Backlighting a layered DT implosion on OMEGA is a particular challenge because the opacity of the DT shell is low, the shell velocity is high, the size and wall thickness of the shell is small, and the self-emission from the hot core at the onset of burn is exceedingly bright. A framing-camera-based crystal imaging system with a Si Heα backlighter at 1.865keV driven by 10-ps short pulses from OMEGA EP was developed to meet these radiography challenges. A fast target inserter was developed to accurately place the Si backlighter foil at a distance of 5 mm to the implosion target following the removal of the cryogenic shroud and an ultra-stable triggering system was implemented to reliably trigger the framing camera coincident with the arrival of the OMEGA EP pulse. This talk will report on a series of implosions in which the DT shell is imaged for a range of convergence ratios and in-flight aspect ratios. The images acquired have been analyzed for low-mode shape variations, the DT shell thickness, the level of ablator mixing into the DT fuel (even 0.1% of carbon mix can be reliably inferred), the areal density of the DT shell, and the impact of the support stalk. The measured implosion performance will be compared with hydrodynamic simulations that include imprint (up to mode 200), cross-beam energy transfer, nonlocal thermal transport, and initial low-mode perturbations such as power imbalance and target misalignment. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  2. Mining protein interactomes to improve their reliability and support the advancement of network medicine.

    PubMed

    Alanis-Lobato, Gregorio

    2015-01-01

    High-throughput detection of protein interactions has had a major impact on our understanding of the intricate molecular machinery underlying the living cell, and has permitted the construction of very large protein interactomes. The protein networks that are currently available are incomplete, and a significant percentage of their interactions are false positives. Fortunately, the structural properties observed in good quality social or technological networks are also present in biological systems. This has encouraged the development of tools to improve the reliability of protein networks and predict new interactions based merely on the topological characteristics of their components. Since diseases are rarely caused by the malfunction of a single protein, having a more complete and reliable interactome is crucial in order to identify groups of inter-related proteins involved in disease etiology. These system components can then be targeted with minimal collateral damage. In this article, a number of important network mining tools are reviewed, together with resources from which reliable protein interactomes can be constructed. In addition to the review, a few representative examples of how molecular and clinical data can be integrated to deepen our understanding of pathogenesis are discussed.

  3. The Revised Child Anxiety and Depression Scale 25-Parent Version: Scale Development and Validation in a School-Based and Clinical Sample.

    PubMed

    Ebesutani, Chad; Korathu-Larson, Priya; Nakamura, Brad J; Higa-McMillan, Charmaine; Chorpita, Bruce

    2017-09-01

    To help facilitate the dissemination and implementation of evidence-based assessment practices, we examined the psychometric properties of the shortened 25-item version of the Revised Child Anxiety and Depression Scale-parent report (RCADS-25-P), which was based on the same items as the previously published shortened 25-item child version. We used two independent samples of youth, a school sample (N = 967, Grades 3-12) and a clinical sample (N = 433; ages 6-18 years), to examine the factor structure, reliability, and validity of the RCADS-25-P scale scores. Results revealed that the two-factor structure (i.e., depression and broad anxiety factor) fit the data well in both the school and clinical samples. All reliability estimates, including test-retest indices, exceeded the benchmark for good reliability. In the school sample, the RCADS-25-P scale scores converged significantly with related criterion measures and diverged with nonrelated criterion measures. In the clinical sample, the RCADS-25-P scale scores successfully discriminated between those with and without target problem diagnoses. In both samples, child-parent agreement indices were in the expected ranges. Normative data were also reported. The RCADS-25-P thus demonstrated robust psychometric properties across both a school and clinical sample as an effective brief screening instrument to assess for depression and anxiety in children and adolescents.

  4. Training balance with opto-kinetic stimuli in the home: a randomized controlled feasibility study in people with pure cerebellar disease.

    PubMed

    Bunn, Lisa M; Marsden, Jonathan F; Giunti, Paola; Day, Brian L

    2015-02-01

    To investigate the feasibility of a randomized controlled trial of a home-based balance intervention for people with cerebellar ataxia. A randomized controlled trial design. Intervention and assessment took place in the home environment. A total of 12 people with spinocerebellar ataxia type 6 were randomized into a therapy or control group. Both groups received identical assessments at baseline, four and eight weeks. Therapy group participants undertook balance exercises in front of optokinetic stimuli during weeks 4-8, while control group participants received no intervention. Test-retest reliability was analysed from outcome measures collected twice at baseline and four weeks later. Feasibility issues were evaluated using daily diaries and end trial exit interviews. The home-based training intervention with opto-kinetic stimuli was feasible for people with pure ataxia, with one drop-out. Test-retest reliability is strong (intraclass correlation coefficient >0.7) for selected outcome measures evaluating balance at impairment and activity levels. Some measures reveal trends towards improvement for those in the therapy group. Sample size estimations indicate that Bal-SARA scores could detect a clinically significant change of 0.8 points in this functional balance score if 80 people per group were analysed in future trials. Home-based targeted training of functional balance for people with pure cerebellar ataxia is feasible and the outcome measures employed are reliable. © The Author(s) 2014.

  5. Reliability of environmental sampling culture results using the negative binomial intraclass correlation coefficient.

    PubMed

    Aly, Sharif S; Zhao, Jianyang; Li, Ben; Jiang, Jiming

    2014-01-01

    The Intraclass Correlation Coefficient (ICC) is commonly used to estimate the similarity between quantitative measures obtained from different sources. Overdispersed data are traditionally transformed so that a linear mixed model (LMM)-based ICC can be estimated. A common transformation used is the natural logarithm. The reliability of environmental sampling of fecal slurry on freestall pens has been estimated for Mycobacterium avium subsp. paratuberculosis using natural logarithm-transformed culture results. Recently, the negative binomial ICC was defined based on a generalized linear mixed model for negative binomial distributed data. The current study reports on the negative binomial ICC estimate, which includes fixed effects, using culture results of environmental samples. Simulations using a wide variety of inputs and negative binomial distribution parameters (r; p) showed better performance of the new negative binomial ICC compared to the ICC based on the LMM, even when the negative binomial data were logarithm- or square-root-transformed. A second comparison that targeted a wider range of ICC values showed that the mean of the estimated ICC closely approximated the true ICC.
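
As background to the comparison above, a minimal sketch of the traditional route the abstract contrasts against: a one-way variance-components ICC computed from ANOVA mean squares on log-transformed counts. This is an illustrative simplification, not the paper's negative binomial GLMM estimator, and the pen counts below are invented.

```python
import numpy as np

def icc_oneway(groups):
    """One-way ICC = between-group variance / total variance (ANOVA estimates).

    groups: balanced list of per-group replicate counts.
    """
    k = len(groups)                       # number of groups (e.g. pens)
    n = len(groups[0])                    # replicates per group
    data = np.log1p(np.asarray(groups, dtype=float))   # tame overdispersion
    grand = data.mean()
    group_means = data.mean(axis=1)
    msb = n * ((group_means - grand) ** 2).sum() / (k - 1)            # between MS
    msw = ((data - group_means[:, None]) ** 2).sum() / (k * (n - 1))  # within MS
    var_between = max((msb - msw) / n, 0.0)
    return var_between / (var_between + msw)

# replicate environmental samples from three hypothetical pens
pens = [[120, 95, 150], [4, 2, 7], [800, 950, 700]]
print(round(icc_oneway(pens), 3))
```

With between-pen differences this large relative to the replicate scatter, the estimate lands close to 1, which is the intuition the negative binomial ICC preserves without requiring the transformation.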

  6. Effect of instructive visual stimuli on neurofeedback training for motor imagery-based brain-computer interface.

    PubMed

    Kondo, Toshiyuki; Saeki, Midori; Hayashi, Yoshikatsu; Nakayashiki, Kosei; Takata, Yohei

    2015-10-01

    Event-related desynchronization (ERD) of the electroencephalogram (EEG) from the motor cortex is associated with execution, observation, and mental imagery of motor tasks. Generation of ERD by motor imagery (MI) has been widely used for brain-computer interfaces (BCIs) linked to neuroprosthetics and other motor assistance devices. Control of MI-based BCIs can be acquired by neurofeedback training to reliably induce MI-associated ERD. To develop more effective training conditions, we investigated the effect of static and dynamic visual representations of target movements (a picture of forearms or a video clip of hand grasping movements) during the BCI neurofeedback training. After 4 consecutive training days, the group that performed MI while viewing the video showed significant improvement in generating MI-associated ERD compared with the group that viewed the static image. This result suggests that passively observing the target movement during MI would improve the associated mental imagery and enhance MI-based BCIs skills. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Template-based protein structure modeling using the RaptorX web server.

    PubMed

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2012-07-19

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world.

  8. Template-based protein structure modeling using the RaptorX web server

    PubMed Central

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2016-01-01

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world. PMID:22814390

  9. PAT: predictor for structured units and its application for the optimization of target molecules for the generation of synthetic antibodies.

    PubMed

    Jeon, Jouhyun; Arnold, Roland; Singh, Fateh; Teyra, Joan; Braun, Tatjana; Kim, Philip M

    2016-04-01

    The identification of structured units in a protein sequence is an important first step for most biochemical studies. Importantly for this study, the identification of stable structured regions is a crucial first step in generating novel synthetic antibodies. While many approaches to find domains or predict structured regions exist, important limitations remain, such as the optimization of domain boundaries and the lack of identification of non-domain structured units. Moreover, no integrated tool exists to find and optimize structural domains within protein sequences. Here, we describe a new tool, PAT ( http://www.kimlab.org/software/pat ) that can efficiently identify both domains (with optimized boundaries) and non-domain putative structured units. PAT automatically analyzes various structural properties, evaluates the folding stability, and reports possible structural domains in a given protein sequence. To evaluate PAT's reliability, we applied it to identify antibody target molecules based on the notion that soluble and well-defined protein secondary and tertiary structures are appropriate target molecules for synthetic antibodies. PAT is an efficient and sensitive tool to identify structured units. A performance analysis shows that PAT can characterize structurally well-defined regions in a given sequence and outperforms other efforts to define reliable boundaries of domains. Specifically, PAT successfully identifies experimentally confirmed target molecules for antibody generation. PAT also offers the pre-calculated results of 20,210 human proteins to accelerate common queries. PAT can therefore help to investigate large-scale structured domains and improve the success rate for synthetic antibody generation.

  10. Fluorescent "on-off-on" switching sensor based on CdTe quantum dots coupled with multiwalled carbon nanotubes@graphene oxide nanoribbons for simultaneous monitoring of dual foreign DNAs in transgenic soybean.

    PubMed

    Li, Yaqi; Sun, Li; Qian, Jing; Long, Lingliang; Li, Henan; Liu, Qian; Cai, Jianrong; Wang, Kun

    2017-06-15

    With the increasing concern over potential health and environmental risks, it is essential to develop reliable methods for transgenic soybean detection. Herein, a simple, sensitive and selective assay was constructed based on homogeneous fluorescence resonance energy transfer (FRET) between CdTe quantum dots (QDs) and multiwalled carbon nanotubes@graphene oxide nanoribbons (MWCNTs@GONRs) to form a fluorescent "on-off-on" switch for simultaneous monitoring of the dual target DNAs of the promoter cauliflower mosaic virus 35s (P35s) and the terminator nopaline synthase (TNOS) from transgenic soybean. The capture DNAs were immobilized with corresponding QDs to obtain strong fluorescent signals (turning on). The strong π-π stacking interaction between single-stranded DNA (ssDNA) probes and MWCNTs@GONRs led to minimal background fluorescence due to the FRET process (turning off). The targets P35s and TNOS were recognized by dual fluorescent probes to form double-stranded DNA (dsDNA) through specific hybridization between the target DNAs and ssDNA probes. The dsDNA were then released from the surface of the MWCNTs@GONRs, which led the dual fluorescent probes to regenerate strong fluorescent emissions (turning on). Therefore, the proposed homogeneous assay can detect P35s and TNOS simultaneously by monitoring the relevant fluorescent emissions. Moreover, this assay can distinguish complementary and mismatched nucleic acid sequences with high sensitivity. The constructed approach has the potential to be a tool for routine detection of genetically modified organisms with the merits of feasibility and reliability. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Three-year outcomes of root canal treatment: Mining an insurance database.

    PubMed

    Raedel, Michael; Hartmann, Andrea; Bohm, Steffen; Walter, Michael H

    2015-04-01

    There is doubt whether success rates of root canal treatments reported from clinical trials are achievable outside of standardized study populations. The aim of this study was to analyse the outcome of a large number of root canal treatments conducted in general practice. The data was collected from the digital database of a major German national health insurance company. All teeth with complete treatment data were included. Only patients who had been insurance members for the whole 3-year period from 2010 to 2012 were eligible. Kaplan-Meier survival analyses were conducted based on completed root canal treatments. Target events were re-interventions: (1) retreatment of the root canal treatment, (2) apical root resection (apicoectomy) and (3) extraction. The influences of vitality status and root numbers on survival were tested with the log-rank test. A total of 556,067 root canal treatments were included. The cumulative overall survival rate for all target events combined was 84.3% for 3 years. The survival rate for nonvital teeth (82.6%) was significantly lower than for vital teeth (85.6%; p<0.001). The survival rate for single-rooted teeth (83.4%) was significantly lower than for multi-rooted teeth (85.5%; p<0.001). The most frequent target event was extraction, followed by apical root resection and retreatment. Based on these 3-year outcomes, root canal treatment is considered a reliable treatment in routine practice under the conditions of the German national health insurance system. Root canal treatment can be considered a reliable treatment option suitable to salvage most of the affected teeth. This statement applies to treatments that in the vast majority of cases were delivered by general practitioners under the terms and conditions of a nationwide health insurance system. Copyright © 2015 Elsevier Ltd. All rights reserved.
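
The Kaplan-Meier estimator behind these survival rates can be sketched in a few lines. The toy times and censoring flags below are invented for illustration, not the insurance records; an event is any re-intervention, and teeth still event-free at last follow-up are censored.

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] for right-censored data (event=1, censored=0).

    At each distinct event time t, the survival curve is multiplied by
    (1 - d/n), where d = events at t and n = subjects still at risk.
    """
    s, points = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)
        if d:
            s *= 1.0 - d / n
            points.append((t, s))
    return points

# months to re-intervention (1) or censoring (0) for ten hypothetical teeth
times  = [3, 6, 6, 12, 18, 24, 30, 36, 36, 36]
events = [1, 1, 0, 1, 0, 1, 0, 0, 0, 0]
print(kaplan_meier(times, events))
```

The last value of S(t) is the cumulative survival over the observation window, the analogue of the 84.3% figure reported above.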

  12. The effects of intermittent illumination on a visual inspection task.

    PubMed

    Kennedy, A; Brysbaert, M; Murray, W S

    1998-02-01

    Two experiments are described in which eye movements were monitored as subjects performed a simple target-spotting task under conditions of intermittent illumination produced by varying the display-screen frame rate on a computer VDU. In Experiment 1, subjects executed a saccade from a fixation point to a target which appeared randomly at a fixed eccentricity of 14 character positions to the left or right. Saccade latency did not differ reliably as a function of screen refresh rate, but average saccade extent at 70 Hz and 110 Hz was reliably shorter than at 90 Hz and 100 Hz. Experiment 2 examined the same task using a range of target eccentricities (7, 14, and 28 character positions to the left and right) and across a wider range of screen refresh rates. The results confirmed the curvilinear relationship obtained in Experiment 1, with average saccade extent reliably shorter at refresh rates of 50 Hz and 125 Hz than at 75 Hz and 100 Hz. While the effect was greater for remote targets, analyses of the proportional target error failed to show a reliable interaction between target eccentricity and display refresh rate. In contrast to Experiment 1, there was a pronounced effect of refresh rate on saccade latency (corrected for time to write the screen frame), with shorter latencies at higher refresh rates. It may be concluded that pulsation at frequencies above fusion disrupts saccade control. However, the curvilinear functional relationship between screen refresh rate and saccade extent obtained in these studies differs from previously reported effects of intermittent illumination on the average size of "entry saccades" (the first saccade to enter a given word) in a task involving word identification (Kennedy & Murray, 1993a, 1996). This conflict of data may arise in part because within-word adjustments in viewing position, which are typical of normal reading, influence measures of average saccade extent.

  13. Prediction of Protein Structure by Template-Based Modeling Combined with the UNRES Force Field.

    PubMed

    Krupa, Paweł; Mozolewska, Magdalena A; Joo, Keehyoung; Lee, Jooyoung; Czaplewski, Cezary; Liwo, Adam

    2015-06-22

    A new approach to the prediction of protein structures that uses distance and backbone virtual-bond dihedral angle restraints derived from template-based models and simulations with the united residue (UNRES) force field is proposed. The approach combines the accuracy and reliability of template-based methods for the segments of the target sequence with high similarity to those having known structures with the ability of UNRES to pack the domains correctly. Multiplexed replica-exchange molecular dynamics with restraints derived from template-based models of a given target, in which each restraint is weighted according to the accuracy of the prediction of the corresponding section of the molecule, is used to search the conformational space, and the weighted histogram analysis method and cluster analysis are applied to determine the families of the most probable conformations, from which candidate predictions are selected. To test the capability of the method to recover template-based models from restraints, five single-domain proteins with structures that have been well-predicted by template-based methods were used; it was found that the resulting structures were of the same quality as the best of the original models. To assess whether the new approach can improve template-based predictions with incorrectly predicted domain packing, four such targets were selected from the CASP10 targets; for three of them the new approach resulted in significantly better predictions compared with the original template-based models. The new approach can be used to predict the structures of proteins for which good templates can be found for sections of the sequence or an overall good template can be found for the entire sequence but the prediction quality is remarkably weaker in putative domain-linker regions.

  14. Development of an ESI-LC-MS-based assay for kinetic evaluation of Mycobacterium tuberculosis shikimate kinase activity and inhibition.

    PubMed

    Simithy, Johayra; Gill, Gobind; Wang, Yu; Goodwin, Douglas C; Calderón, Angela I

    2015-02-17

    A simple and reliable liquid chromatography-mass spectrometry (LC-MS) assay has been developed and validated for the kinetic characterization and evaluation of inhibitors of shikimate kinase from Mycobacterium tuberculosis (MtSK), a potential target for the development of novel antitubercular drugs. This assay is based on the direct determination of the reaction product shikimate-3-phosphate (S3P) using electrospray ionization (ESI) and a quadrupole time-of-flight (Q-TOF) detector. A comparative analysis of the kinetic parameters of MtSK obtained by the LC-MS assay with those obtained by a conventional UV assay was performed. Kinetic parameters determined by LC-MS were in excellent agreement with those obtained from the UV assay, demonstrating the accuracy and reliability of this method. The validated assay was successfully applied to the kinetic characterization of a known inhibitor of shikimate kinase; inhibition constants and the mode of inhibition were accurately delineated with LC-MS.

  15. Node Self-Deployment Algorithm Based on Pigeon Swarm Optimization for Underwater Wireless Sensor Networks

    PubMed Central

    Yu, Shanen; Xu, Yiming; Jiang, Peng; Wu, Feng; Xu, Huan

    2017-01-01

    At present, free-to-move node self-deployment algorithms aim at event coverage and cannot improve network coverage under the premise of considering network connectivity, network reliability and network deployment energy consumption. Thus, this study proposes a pigeon-based self-deployment algorithm (PSA) for underwater wireless sensor networks to overcome the limitations of these existing algorithms. In PSA, the sink node first finds its one-hop nodes and maximizes the network coverage in its one-hop region. The one-hop nodes subsequently divide the network into layers and cluster in each layer. Each cluster head node constructs a connected path to the sink node to guarantee network connectivity. Finally, the cluster head node regards the ratio of the movement distance of the node to the change in the coverage redundancy ratio as the target function and employs pigeon swarm optimization to determine the positions of the nodes. Simulation results show that PSA improves both network connectivity and network reliability, decreases network deployment energy consumption, and increases network coverage. PMID:28338615
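
The target function described above, movement distance per unit change in coverage redundancy, can be sketched as follows. The pairwise-overlap redundancy proxy and the toy 3-D node positions are illustrative assumptions, not the published PSA formulation.

```python
import math

def coverage_redundancy(nodes, sensing_r):
    """Crude redundancy proxy: fraction of node pairs whose sensing
    spheres (radius sensing_r) overlap."""
    n = len(nodes)
    if n < 2:
        return 0.0
    overlaps = sum(
        1
        for i in range(n) for j in range(i + 1, n)
        if math.dist(nodes[i], nodes[j]) < 2 * sensing_r
    )
    return overlaps / (n * (n - 1) / 2)

def move_cost(old_pos, new_pos, old_nodes, new_nodes, sensing_r):
    """Movement distance per unit of redundancy reduced (lower is better);
    infinite cost if the move does not reduce redundancy at all."""
    dist = math.dist(old_pos, new_pos)
    gain = (coverage_redundancy(old_nodes, sensing_r)
            - coverage_redundancy(new_nodes, sensing_r))
    return dist / gain if gain > 0 else float("inf")

# three hypothetical nodes, one candidate move spreading them out
old = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
new = [(0, 0, 0), (5, 0, 0), (0, 5, 0)]
print(move_cost(old[1], new[1], old, new, sensing_r=1.5))
```

Pigeon swarm optimization would then search candidate positions to minimize this cost, trading deployment energy (distance moved) against coverage gained.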

  16. Signal Amplification Technologies for the Detection of Nucleic Acids: from Cell-Free Analysis to Live-Cell Imaging.

    PubMed

    Fozooni, Tahereh; Ravan, Hadi; Sasan, Hosseinali

    2017-12-01

    Due to their unique properties, such as programmability, ligand-binding capability, and flexibility, nucleic acids can serve as analytes and/or recognition elements for biosensing. To improve the sensitivity of nucleic acid-based biosensing and hence the detection of a few copies of target molecule, different modern amplification methodologies, namely target-and-signal-based amplification strategies, have already been developed. These recent signal amplification technologies, which are capable of amplifying the signal intensity without changing the targets' copy number, have resulted in fast, reliable, and sensitive methods for nucleic acid detection. Working in cell-free settings, researchers have been able to optimize a variety of complex and quantitative methods suitable for deploying in live-cell conditions. In this study, a comprehensive review of the signal amplification technologies for the detection of nucleic acids is provided. We classify the signal amplification methodologies into enzymatic and non-enzymatic strategies with a primary focus on the methods that enable us to shift away from in vitro detecting to in vivo imaging. Finally, the future challenges and limitations of detection for cellular conditions are discussed.

  17. A Comprehensive Strategy to Construct In-house Database for Accurate and Batch Identification of Small Molecular Metabolites.

    PubMed

    Zhao, Xinjie; Zeng, Zhongda; Chen, Aiming; Lu, Xin; Zhao, Chunxia; Hu, Chunxiu; Zhou, Lina; Liu, Xinyu; Wang, Xiaolin; Hou, Xiaoli; Ye, Yaorui; Xu, Guowang

    2018-05-29

    Identification of the metabolites is an essential step in metabolomics studies to interpret the regulatory mechanisms of pathological and physiological processes. However, it remains challenging in LC-MSn-based studies because of the complexity of mass spectrometry, the chemical diversity of metabolites, and the deficiency of standards databases. In this work, a comprehensive strategy is developed for accurate and batch metabolite identification in non-targeted metabolomics studies. First, a well-defined procedure was applied to generate reliable and standard LC-MS2 data, including tR, MS1 and MS2 information, following a standard operating procedure (SOP). An in-house database including about 2000 metabolites was constructed and used to identify the metabolites in non-targeted metabolic profiling by retention time calibration using internal standards, precursor ion alignment and ion fusion, auto-MS2 information extraction and selection, and database batch searching and scoring. As an application example, a pooled serum sample was analyzed to demonstrate the strategy; 202 metabolites were identified in the positive ion mode. This shows that our strategy is useful for LC-MSn-based non-targeted metabolomics studies.

  18. Target recognition based on the moment functions of radar signatures

    NASA Astrophysics Data System (ADS)

    Kim, Kyung-Tae; Kim, Hyo-Tae

    2002-03-01

    In this paper, we present the results of target recognition research based on the moment functions of various radar signatures, such as time-frequency signatures, range profiles, and scattering centers. The proposed approach utilizes geometrical moments or central moments of the obtained radar signatures. In particular, we derived exact and closed-form expressions for the geometrical moments of the adaptive Gaussian representation (AGR), which is one of the adaptive joint time-frequency techniques, and also computed the central moments of range profiles and one-dimensional (1-D) scattering centers on a target, which are obtained by various super-resolution techniques. The obtained moment functions are further processed to provide small-dimensional and redundancy-free feature vectors, and classified via a neural network approach or a Bayes classifier. The performance of the proposed technique is demonstrated using a simulated radar cross section (RCS) data set, or a measured RCS data set of various scaled aircraft models, obtained at the Pohang University of Science and Technology (POSTECH) compact range facility. Results show that the techniques in this paper can not only provide reliable classification accuracy, but also save computational resources.
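
The central moments of a range profile mentioned above can be sketched as a compact feature vector. Treating the normalized profile as a distribution over range bins makes the features translation-invariant in amplitude scaling; the exact normalization used in the paper may differ, and the profile below is invented.

```python
import numpy as np

def central_moments(profile, orders=(2, 3, 4)):
    """Central moments of a 1-D range profile, treated as a discrete
    distribution over range-bin index."""
    p = np.asarray(profile, dtype=float)
    p = p / p.sum()                        # normalize to unit mass
    x = np.arange(len(p))
    mean = (x * p).sum()                   # first raw moment (centroid)
    return np.array([((x - mean) ** k * p).sum() for k in orders])

# toy range profile with a single symmetric scatterer response
profile = [0.1, 0.2, 1.0, 3.0, 1.0, 0.2, 0.1]
print(central_moments(profile))
```

For this symmetric profile the odd (third-order) moment vanishes, which is the kind of shape information a Bayes or neural network classifier can exploit.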

  19. In situ mutation detection and visualization of intratumor heterogeneity for cancer research and diagnostics

    PubMed Central

    Grundberg, Ida; Kiflemariam, Sara; Mignardi, Marco; Imgenberg-Kreuz, Juliana; Edlund, Karolina; Micke, Patrick; Sundström, Magnus; Sjöblom, Tobias

    2013-01-01

    Current assays for somatic mutation analysis are based on extracts from tissue sections that often contain morphologically heterogeneous neoplastic regions with variable contents of genetically normal stromal and inflammatory cells, obscuring the results of the assays. We have developed an RNA-based in situ mutation assay that targets oncogenic mutations in a multiplex fashion that resolves the heterogeneity of the tissue sample. Activating oncogenic mutations are targets for a new generation of cancer drugs. For anti-EGFR therapy prediction, we demonstrate reliable in situ detection of KRAS mutations in codon 12 and 13 in colon and lung cancers in three different types of routinely processed tissue materials. High-throughput screening of KRAS mutation status was successfully performed on a tissue microarray. Moreover, we show how the patterns of expressed mutated and wild-type alleles can be studied in situ in tumors with complex combinations of mutated EGFR, KRAS and TP53. This in situ method holds great promise as a tool to investigate the role of somatic mutations during tumor progression and for prediction of response to targeted therapy. PMID:24280411

  20. A Landmark Extraction Method Associated with Geometric Features and Location Distribution

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Li, J.; Wang, Y.; Xiao, Y.; Liu, P.; Zhang, S.

    2018-04-01

    Landmarks play an important role in spatial cognition and spatial knowledge organization. Significance-measuring models are the main method of landmark extraction, but they struggle to account for the spatial distribution pattern of landmarks because landmark significance is defined in one-dimensional space. In this paper, starting from the geometric features of ground objects, an extraction method based on target height, target gap and field of view is proposed. Based on the influence regions of a Voronoi diagram, a description of target gap is established as a geometric representation of the distribution of adjacent targets. Then, a segmentation process of the visual domain of Voronoi k-order adjacency is given to set up target views under multiple viewpoints; finally, landmarks are identified through three kinds of weighted geometric features. Comparative experiments show that this method largely agrees with the results of the traditional significance-measuring model, which verifies its effectiveness and reliability, and it reduces the complexity of the landmark extraction process without losing the reference value of landmarks.
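
The final weighted-feature step can be sketched as a min-max normalization and weighted sum over the three geometric features named above. The equal weights and the candidate values are illustrative assumptions, not the published weighting.

```python
def landmark_scores(features, weights=(1 / 3, 1 / 3, 1 / 3)):
    """features: list of (height, gap, field_of_view) tuples, one per
    candidate object.  Returns one combined score per candidate."""
    cols = list(zip(*features))
    norm = []
    for col in cols:                       # min-max normalize each feature
        lo, hi = min(col), max(col)
        norm.append([(v - lo) / (hi - lo) if hi > lo else 0.0 for v in col])
    return [
        sum(w * norm[f][i] for f, w in enumerate(weights))
        for i in range(len(features))
    ]

# hypothetical candidates: (height in m, gap measure, field-of-view fraction)
candidates = [(30.0, 12.0, 0.8), (10.0, 3.0, 0.2), (50.0, 20.0, 0.9)]
print(landmark_scores(candidates))   # highest score -> strongest landmark
```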

  1. Targeted gene insertion for molecular medicine.

    PubMed

    Voigt, Katrin; Izsvák, Zsuzsanna; Ivics, Zoltán

    2008-11-01

    Genomic insertion of a functional gene together with suitable transcriptional regulatory elements is often required for long-term therapeutical benefit in gene therapy for several genetic diseases. A variety of integrating vectors for gene delivery exist. Some of them exhibit random genomic integration, whereas others have integration preferences based on attributes of the targeted site, such as primary DNA sequence and physical structure of the DNA, or through tethering to certain DNA sequences by host-encoded cellular factors. Uncontrolled genomic insertion bears the risk of the transgene being silenced due to chromosomal position effects, and can lead to genotoxic effects due to mutagenesis of cellular genes. None of the vector systems currently used in either preclinical experiments or clinical trials displays sufficient preferences for target DNA sequences that would ensure appropriate and reliable expression of the transgene and simultaneously prevent hazardous side effects. We review in this paper the advantages and disadvantages of both viral and non-viral gene delivery technologies, discuss mechanisms of target site selection of integrating genetic elements (viruses and transposons), and suggest distinct molecular strategies for targeted gene delivery.

  2. Multiple reaction monitoring-ion pair finder: a systematic approach to transform nontargeted mode to pseudotargeted mode for metabolomics study based on liquid chromatography-mass spectrometry.

    PubMed

    Luo, Ping; Dai, Weidong; Yin, Peiyuan; Zeng, Zhongda; Kong, Hongwei; Zhou, Lina; Wang, Xiaolin; Chen, Shili; Lu, Xin; Xu, Guowang

    2015-01-01

    Pseudotargeted metabolic profiling is a novel strategy combining the advantages of both targeted and untargeted methods. The strategy obtains metabolites and their product ions from quadrupole time-of-flight (Q-TOF) MS by information-dependent acquisition (IDA) and then picks targeted ion pairs and measures them on a triple-quadrupole MS by multiple reaction monitoring (MRM). The picking of ion pairs from thousands of candidates is the most time-consuming step of the pseudotargeted strategy. Herein, a systematic and automated approach and software (MRM-Ion Pair Finder) were developed to acquire characteristic MRM ion pairs by precursor ion alignment, MS(2) spectrum extraction and reduction, characteristic product ion selection, and ion fusion. To test the reliability of the approach, a mixture of 15 metabolite standards was first analyzed; the representative ion pairs were correctly picked out. Then, pooled serum samples were further studied, and the results were confirmed by manual selection. Finally, a comparison with a commercial peak alignment software was performed, and good characteristic ion coverage of metabolites was obtained. As a proof of concept, the proposed approach was applied to a metabolomics study of liver cancer; 854 metabolite ion pairs were defined in the positive ion mode from serum. Our approach provides a high-throughput, reliable method for acquiring MRM ion pairs for pseudotargeted metabolomics with improved metabolite coverage and facilitates more reliable biomarker discovery.
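
The first step named above, precursor ion alignment across runs, can be sketched as grouping m/z values within a ppm tolerance. The 10 ppm tolerance, the greedy single-pass grouping, and the m/z values are illustrative assumptions; the published MRM-Ion Pair Finder workflow involves further steps (MS2 reduction, product-ion selection, ion fusion).

```python
def align_precursors(mz_values, ppm_tol=10.0):
    """Greedily group sorted m/z values whose consecutive gap is within
    ppm_tol, and return one centroid m/z per group."""
    groups, current = [], []
    for mz in sorted(mz_values):
        if current and (mz - current[-1]) / current[-1] * 1e6 > ppm_tol:
            groups.append(current)
            current = []
        current.append(mz)
    if current:
        groups.append(current)
    return [sum(g) / len(g) for g in groups]   # group centroids

# the same two hypothetical precursors observed across several runs
runs = [180.0634, 180.0636, 180.0635, 255.2329, 255.2331]
print(align_precursors(runs))
```

Each centroid would then serve as the precursor (Q1) m/z of a candidate MRM ion pair, to be matched with a characteristic product ion.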

  3. Reliable motion detection of small targets in video with low signal-to-clutter ratios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, S.A.; Naylor, R.B.

    1995-07-01

    Studies show that vigilance decreases rapidly after several minutes when human operators are required to search live video for infrequent intrusion detections. Therefore, there is a need for systems which can automatically detect targets in live video and reserve the operator's attention for assessment only. Thus far, automated systems have not simultaneously provided adequate detection sensitivity, false alarm suppression, and ease of setup when used in external, unconstrained environments. This unsatisfactory performance can be exacerbated by poor video imagery with low contrast, high noise, dynamic clutter, image misregistration, and/or the presence of small, slow, or erratically moving targets. This paper describes a highly adaptive video motion detection and tracking algorithm which has been developed as part of Sandia's Advanced Exterior Sensor (AES) program. The AES is a wide-area detection and assessment system for use in unconstrained exterior security applications. The AES detection and tracking algorithm provides good performance under stressing data and environmental conditions. Features of the algorithm include: reliable detection with negligible false alarm rate of variable velocity targets having low signal-to-clutter ratios; reliable tracking of targets that exhibit motion that is non-inertial, i.e., varies in direction and velocity; automatic adaptation to both infrared and visible imagery with variable quality; and suppression of false alarms caused by sensor flaws and/or cutouts.

  4. Concordance of IHC, FISH and RT-PCR for EML4-ALK rearrangements.

    PubMed

    Teixidó, Cristina; Karachaliou, Niki; Peg, Vicente; Gimenez-Capitan, Ana; Rosell, Rafael

    2014-04-01

    The echinoderm microtubule-associated protein-like 4 anaplastic lymphoma kinase (EML4-ALK) has emerged as the second most important driver oncogene in lung cancer and the first targetable fusion oncokinase to be identified in 4-6% of lung adenocarcinomas. Crizotinib, along with a diagnostic test-the Vysis ALK Break Apart fluorescence in situ hybridization (FISH) Probe Kit-is approved for the treatment of ALK positive advanced non-small cell lung cancer (NSCLC). However, the success of a targeted drug is critically dependent on a sensitive and specific screening assay to detect the molecular drug target. In our experience, reverse transcription polymerase chain reaction (RT-PCR)-based detection of EML4-ALK is a more sensitive and reliable approach compared to FISH and immunohistochemistry (IHC). Although ALK FISH is clinically validated, the assay can be technically challenging and other diagnostic modalities, including IHC and RT-PCR should be further explored.

  5. Delineation of geochemical anomalies based on stream sediment data utilizing fractal modeling and staged factor analysis

    NASA Astrophysics Data System (ADS)

    Afzal, Peyman; Mirzaei, Misagh; Yousefi, Mahyar; Adib, Ahmad; Khalajmasoumi, Masoumeh; Zarifi, Afshar Zia; Foster, Patrick; Yasrebi, Amir Bijan

    2016-07-01

    Recognition of significant geochemical signatures and separation of geochemical anomalies from background are critical issues in interpretation of stream sediment data to define exploration targets. In this paper, we used staged factor analysis in conjunction with the concentration-number (C-N) fractal model to generate exploration targets for prospecting Cr and Fe mineralization in Balvard area, SE Iran. The results show coexistence of derived multi-element geochemical signatures of the deposit-type sought and ultramafic-mafic rocks in the NE and northern parts of the study area indicating significant chromite and iron ore prospects. In this regard, application of staged factor analysis and fractal modeling resulted in recognition of significant multi-element signatures that have a high spatial association with host lithological units of the deposit-type sought, and therefore, the generated targets are reliable for further prospecting of the deposit in the study area.
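    The concentration-number (C-N) fractal model referenced above relates each concentration c to the number of samples with concentration at or above c; under the fractal model N(>=c) is proportional to c**(-D), so the log-log plot is piecewise linear and the breakpoints between fitted segments act as thresholds separating background from anomaly. A minimal sketch of the relation and the fractal-dimension fit over one straight segment (illustrative, not the authors' code):

```python
import numpy as np

def concentration_number(values):
    """Return sorted concentrations c and the count n of samples >= c.

    Assumes distinct values; n is the cumulative count from above,
    the quantity plotted on the C-N log-log diagram.
    """
    c = np.sort(np.asarray(values, dtype=float))
    n = len(c) - np.arange(len(c))   # number of samples >= each sorted value
    return c, n

def fractal_dimension(c, n):
    """Magnitude of the log-log slope over one straight C-N segment."""
    slope, _ = np.polyfit(np.log10(c), np.log10(n), 1)
    return -slope
```

    In practice the plot is split into several segments and a separate slope is fitted to each; the concentrations at which the fitted lines intersect are used as the anomaly thresholds.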

  6. Characterization of image heterogeneity using 2D Minkowski functionals increases the sensitivity of detection of a targeted MRI contrast agent.

    PubMed

    Canuto, Holly C; McLachlan, Charles; Kettunen, Mikko I; Velic, Marko; Krishnan, Anant S; Neves, Andre' A; de Backer, Maaike; Hu, D-E; Hobson, Michael P; Brindle, Kevin M

    2009-05-01

    A targeted Gd(3+)-based contrast agent has been developed that detects tumor cell death by binding to the phosphatidylserine (PS) exposed on the plasma membrane of dying cells. Although this agent has been used to detect tumor cell death in vivo, the differences in signal intensity between treated and untreated tumors were relatively small. As cell death is often spatially heterogeneous within tumors, we investigated whether an image analysis technique that parameterizes heterogeneity could be used to increase the sensitivity of detection of this targeted contrast agent. Two-dimensional (2D) Minkowski functionals (MFs) provided an automated and reliable method for parameterization of image heterogeneity, which does not require prior assumptions about the number of regions or features in the image, and were shown to increase the sensitivity of detection of the contrast agent as compared to simple signal intensity analysis. (c) 2009 Wiley-Liss, Inc.
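    The three 2D Minkowski functionals (area, perimeter, and Euler characteristic) of a thresholded image can be computed directly from pixel counts over the image's cell complex. The sketch below is a minimal illustration for binary images assuming 4-connected foreground, not the authors' implementation; the paper applies such functionals to thresholded MR images.

```python
import numpy as np

def minkowski_2d(img):
    """2D Minkowski functionals of a binary image:
    area (V0), perimeter (V1) and Euler characteristic (V2)."""
    img = np.asarray(img, dtype=bool)
    area = int(img.sum())
    padded = np.pad(img, 1, constant_values=False)
    # Perimeter: count foreground/background transitions along both axes
    # (padding makes the image border count as background).
    perimeter = int((padded[1:, :] != padded[:-1, :]).sum()
                    + (padded[:, 1:] != padded[:, :-1]).sum())
    # Euler characteristic chi = V - E + F over the pixel cell complex:
    # a vertex/edge is present if any pixel touching it is foreground.
    vertices = int((padded[:-1, :-1] | padded[:-1, 1:]
                    | padded[1:, :-1] | padded[1:, 1:]).sum())
    edges = int((padded[:-1, 1:-1] | padded[1:, 1:-1]).sum()
                + (padded[1:-1, :-1] | padded[1:-1, 1:]).sum())
    euler = vertices - edges + area
    return area, perimeter, euler
```

    Computed over a range of thresholds, these three numbers summarize image heterogeneity without any assumption about how many regions or features the image contains, which is the property exploited in the abstract.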

  7. Universal, colorimetric microRNA detection strategy based on target-catalyzed toehold-mediated strand displacement reaction

    NASA Astrophysics Data System (ADS)

    Park, Yeonkyung; Lee, Chang Yeol; Kang, Shinyoung; Kim, Hansol; Park, Ki Soo; Park, Hyun Gyu

    2018-02-01

    In this work, we developed a novel, label-free, and enzyme-free strategy for the colorimetric detection of microRNA (miRNA), which relies on a target-catalyzed toehold-mediated strand displacement (TMSD) reaction. The system employs a detection probe that specifically binds to the target miRNA and sequentially releases a catalyst strand (CS) intended to trigger the subsequent TMSD reaction. Thus, the presence of target miRNA releases the CS that mediates the formation of an active G-quadruplex DNAzyme which is initially caged and inactivated by a blocker strand. In addition, a fuel strand that is supplemented for the recycling of the CS promotes another TMSD reaction, consequently generating a large number of active G-quadruplex DNAzymes. As a result, a distinct colorimetric signal is produced by the ABTS oxidation promoted by the peroxidase mimicking activity of the released G-quadruplex DNAzymes. Based on this novel strategy, we successfully detected miR-141, a promising biomarker for human prostate cancer, with high selectivity. The diagnostic capability of this system was also demonstrated by reliably determining target miR-141 in human serum, showing its great potential towards real clinical applications. Importantly, the proposed approach is composed of separate target recognition and signal transduction modules. Thus, it could be extended to analyze different target miRNAs by simply redesigning the detection probe while keeping the same signal transduction module as a universal signal amplification unit, which was successfully demonstrated by analyzing another target miRNA, let-7d.

  8. Universal, colorimetric microRNA detection strategy based on target-catalyzed toehold-mediated strand displacement reaction.

    PubMed

    Park, Yeonkyung; Lee, Chang Yeol; Kang, Shinyoung; Kim, Hansol; Park, Ki Soo; Park, Hyun Gyu

    2018-02-23

    In this work, we developed a novel, label-free, and enzyme-free strategy for the colorimetric detection of microRNA (miRNA), which relies on a target-catalyzed toehold-mediated strand displacement (TMSD) reaction. The system employs a detection probe that specifically binds to the target miRNA and sequentially releases a catalyst strand (CS) intended to trigger the subsequent TMSD reaction. Thus, the presence of target miRNA releases the CS that mediates the formation of an active G-quadruplex DNAzyme which is initially caged and inactivated by a blocker strand. In addition, a fuel strand that is supplemented for the recycling of the CS promotes another TMSD reaction, consequently generating a large number of active G-quadruplex DNAzymes. As a result, a distinct colorimetric signal is produced by the ABTS oxidation promoted by the peroxidase mimicking activity of the released G-quadruplex DNAzymes. Based on this novel strategy, we successfully detected miR-141, a promising biomarker for human prostate cancer, with high selectivity. The diagnostic capability of this system was also demonstrated by reliably determining target miR-141 in human serum, showing its great potential towards real clinical applications. Importantly, the proposed approach is composed of separate target recognition and signal transduction modules. Thus, it could be extended to analyze different target miRNAs by simply redesigning the detection probe while keeping the same signal transduction module as a universal signal amplification unit, which was successfully demonstrated by analyzing another target miRNA, let-7d.

  9. Research on major antitumor active components in Zi-Cao-Cheng-Qi decoction based on hollow fiber cell fishing with high performance liquid chromatography.

    PubMed

    Li, Miaomiao; Hu, Shuang; Chen, Xuan; Wang, Runqin; Bai, Xiaohong

    2018-02-05

    Hollow fiber cell fishing (HFCF) based on hepatoma HepG-2 cells, human renal tubular ACHN cells or human cervical carcinoma HeLa cells, coupled with high-performance liquid chromatography (HPLC), was developed and employed to study the major active components of Zi-Cao-Cheng-Qi decoction both in vitro and in vivo. The research showed that active components such as hesperidin, magnolol, honokiol, shikonin, emodin and β,β'-dimethylacrylshikonin were screened out by HFCF based on the cancer cells in vitro; furthermore, they can be absorbed into the blood and reach the target organ, and some of the active components can be fished by the cells and maintain effective concentrations. Before applying HFCF with HPLC, the cell growth state, cell survival rate, positive effect on screening results, binding between active centers on the fiber and target components, and repeatability of retention times and relative peak areas of the target analytes were analysed and investigated. In short, HFCF with HPLC is a simple, inexpensive, effective, and reliable method for studying active components of traditional Chinese medicine (TCM) and its formulae both in vitro and in vivo, preliminarily elucidating the TCM characteristics of multiple components and multiple targets, and laying a foundation for expounding the antitumor efficacy material basis of TCM. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. A liquid chromatography-tandem mass spectrometry-based targeted proteomics assay for monitoring P-glycoprotein levels in human breast tissue.

    PubMed

    Yang, Ting; Chen, Fei; Xu, Feifei; Wang, Fengliang; Xu, Qingqing; Chen, Yun

    2014-09-25

    P-glycoprotein (P-gp) can efflux drugs from cancer cells, and its overexpression is commonly associated with multi-drug resistance (MDR). Thus, accurate quantification of P-gp would help predict the response to chemotherapy and the prognosis of breast cancer patients. An advanced liquid chromatography-tandem mass spectrometry (LC/MS/MS)-based targeted proteomics assay was developed and validated for monitoring P-gp levels in breast tissue. The tryptic peptide 368IIDNKPSIDSYSK380 was selected as a surrogate analyte for quantification, and immuno-depleted tissue extract was used as a surrogate matrix. Matched pairs of breast tissue samples from 60 patients suspected of having drug resistance were subjected to analysis, and the levels of P-gp were quantified. Using data from normal tissue, we suggested a P-gp reference interval. The experimental values of tumor tissue samples were compared with those obtained from Western blotting and immunohistochemistry (IHC). The results indicated that the targeted proteomics approach was comparable to IHC but provided a lower limit of quantification (LOQ) and afforded more reliable results at low concentrations than the other two methods. LC/MS/MS-based targeted proteomics may allow P-gp in breast tissue to be quantified more accurately. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Pharmacology and Clinical Drug Candidates in Redox Medicine

    PubMed Central

    Casas, Ana I.; Maghzal, Ghassan J.; Seredenina, Tamara; Kaludercic, Nina; Robledinos-Anton, Natalia; Di Lisa, Fabio; Stocker, Roland; Ghezzi, Pietro; Jaquet, Vincent; Cuadrado, Antonio

    2015-01-01

    Abstract Significance: Oxidative stress is suggested to be a disease mechanism common to a wide range of disorders affecting human health. However, so far, the pharmacotherapeutic exploitation of this, for example, based on chemical scavenging of pro-oxidant molecules, has been unsuccessful. Recent Advances: An alternative emerging approach is to target the enzymatic sources of disease-relevant oxidative stress. Several such enzymes and isoforms have been identified and linked to different pathologies. For some targets, the respective pharmacology is quite advanced, that is, up to late-stage clinical development or even on the market; for others, drugs are already in clinical use, although not for indications based on oxidative stress, and repurposing seems to be a viable option. Critical Issues: For all other targets, reliable preclinical validation and druggability are key factors for any translation into the clinic. At this stage, specific pharmacological agents with optimal pharmacokinetic profiles are still lacking. Moreover, these enzymes also serve largely unknown physiological functions and their inhibition may lead to unwanted side effects. Future Directions: The current promising data based on new targets, drugs, and drug repurposing are mainly a result of academic efforts. With the availability of optimized compounds and coordinated efforts from academia and industry scientists, unambiguous validation and translation into proof-of-principle studies seem achievable in the very near future, possibly leading towards a new era of redox medicine. Antioxid. Redox Signal. 23, 1113–1129. PMID:26415051

  12. Reliability of image-free navigation to monitor lower-limb alignment.

    PubMed

    Pearle, Andrew D; Goleski, Patrick; Musahl, Volker; Kendoff, Daniel

    2009-02-01

    Proper alignment of the mechanical axis of the lower limb is the principal goal of a high tibial osteotomy. A well-accepted and relevant technical specification is the coronal plane lower-limb alignment. Target values for coronal plane alignment after high tibial osteotomy include 2 degrees of overcorrection, while tolerances for this specification have been established as 2 degrees to 4 degrees. However, the role of axial plane and sagittal plane realignment after high tibial osteotomy is poorly understood; consequently, targets and tolerance for this technical specification remain undefined. This article reviews the literature concerning the reliability and precision of navigation in monitoring the clinically relevant specification of lower-limb alignment in high tibial osteotomy. We conclude that image-free navigation registration may be clinically useful for intraoperative monitoring of the coronal plane only. Only fair and poor results for the axial and sagittal planes can be obtained by image-free navigation systems. In the future, combined image-based data, such as those from radiographs, magnetic resonance imaging, and gait analysis, may be used to help to improve the accuracy and reproducibility of quantitative intraoperative monitoring of lower-limb alignment.

  13. Generating a Dynamic Synthetic Population – Using an Age-Structured Two-Sex Model for Household Dynamics

    PubMed Central

    Namazi-Rad, Mohammad-Reza; Mokhtarian, Payam; Perez, Pascal

    2014-01-01

    Generating a reliable computer-simulated synthetic population is necessary for knowledge processing and decision-making analysis in agent-based systems in order to measure, interpret and describe each target area and the human activity patterns within it. In this paper, both synthetic reconstruction (SR) and combinatorial optimisation (CO) techniques are discussed for generating a reliable synthetic population for a certain geographic region (in Australia) using aggregated- and disaggregated-level information available for such an area. A CO algorithm using the quadratic function of population estimators is presented in this paper in order to generate a synthetic population while considering a two-fold nested structure for the individuals and households within the target areas. The baseline population in this study is generated from the confidentialised unit record files (CURFs) and 2006 Australian census tables. The dynamics of the created population is then projected over five years using a dynamic micro-simulation model for individual- and household-level demographic transitions. This projection is then compared with the 2011 Australian census. A prediction interval is provided for the population estimates obtained by the bootstrapping method, by which the variability structure of a predictor can be replicated in a bootstrap distribution. PMID:24733522
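    The combinatorial optimisation step described above can be caricatured as hill climbing over household selections under a quadratic fit statistic: pick households from the survey sample, and accept swaps that do not worsen the squared difference between the synthetic totals and the census marginals. The household attributes, estimator, and acceptance rule below are illustrative stand-ins for the paper's quadratic-function estimator and two-fold nested structure.

```python
import random

def quadratic_error(chosen, target):
    """Sum of squared differences between synthetic and target totals."""
    totals = {k: 0 for k in target}
    for household in chosen:
        for k, v in household.items():
            totals[k] += v
    return sum((totals[k] - target[k]) ** 2 for k in target)

def fit_population(sample, target, n_households, n_iter=5000, seed=0):
    """Hill-climbing CO sketch: swap in random sample households,
    keeping any swap that does not increase the quadratic error."""
    rng = random.Random(seed)
    chosen = [rng.choice(sample) for _ in range(n_households)]
    err = quadratic_error(chosen, target)
    for _ in range(n_iter):
        candidate = chosen[:]
        candidate[rng.randrange(n_households)] = rng.choice(sample)
        cand_err = quadratic_error(candidate, target)
        if cand_err <= err:          # accept non-worsening swaps
            chosen, err = candidate, cand_err
    return chosen, err
```

    Accepting lateral (equal-error) swaps lets the search drift across plateaus, a common refinement of plain hill climbing in CO-based population synthesis.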

  14. Image Tracking for the High Similarity Drug Tablets Based on Light Intensity Reflective Energy and Artificial Neural Network

    PubMed Central

    Liang, Zhongwei; Zhou, Liang; Liu, Xiaochu; Wang, Xiaogang

    2014-01-01

    Tablet image tracking strongly influences the efficiency and reliability of high-speed drug mass production, yet it has also become a difficult problem and a focus of production monitoring in recent years, owing to the highly similar shapes and random positions of the objects to be searched for. To track randomly distributed tablets accurately, a calibrated surface of light intensity reflective energy can be established using a surface-fitting approach and transitional vector determination, describing the shape topology and topography details of the target tablet. On this basis, the mathematical properties of these surfaces are derived, and an artificial neural network (ANN) is employed to classify the moving target tablets by recognizing their different surface properties, so that the instantaneous coordinates of the drug tablets in one image frame can be determined. By repeating the same pattern recognition on the next image frame, the real-time movements of the tablet templates are tracked in sequence. This paper provides reliable references and new research ideas for real-time object tracking in drug production practice. PMID:25143781

  15. Genetic tool development underpins recent advances in thermophilic whole-cell biocatalysts.

    PubMed

    Taylor, M P; van Zyl, L; Tuffin, I M; Leak, D J; Cowan, D A

    2011-07-01

    The environmental value of sustainably producing bioproducts from biomass is now widely appreciated, with a primary target being the economic production of fuels such as bioethanol from lignocellulose. The application of thermophilic prokaryotes is a rapidly developing niche in this field, driven by their known catabolic versatility with lignocellulose-derived carbohydrates. Fundamental to the success of this work has been the development of reliable genetic and molecular systems. These technical tools are now available to assist in the development of other (hyper)thermophilic strains with diverse phenotypes such as hemicellulolytic and cellulolytic properties, branched chain alcohol production and other 'valuable bioproduct' synthetic capabilities. Here we present an insight into the historical limitations, recent developments and current status of a number of genetic systems for thermophiles. We also highlight the value of reliable genetic methods for increasing our knowledge of thermophile physiology. We argue that the development of robust genetic systems is paramount in the evolution of future thermophile-based bioprocesses and make suggestions for future approaches and genetic targets that will facilitate this process. © 2011 The Authors. Journal compilation © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.

  16. Application of a Real-Time, Calculable Limiting Form of the Renyi Entropy for Molecular Imaging of Tumors

    PubMed Central

    Marsh, J. N.; Wallace, K. D.; McCarthy, J. E.; Wickerhauser, M. V.; Maurizi, B. N.; Lanza, G. M.; Wickline, S. A.; Hughes, M. S.

    2011-01-01

    Previously, we reported new methods for ultrasound signal characterization using entropy, Hf; a generalized entropy, the Renyi entropy, If(r); and a limiting form of Renyi entropy suitable for real-time calculation, If,∞. All of these quantities demonstrated significantly more sensitivity to subtle changes in scattering architecture than energy-based methods in certain settings. In this study, the real-time calculable limit of the Renyi entropy, If,∞, is applied for the imaging of angiogenic murine neovasculature in a breast cancer xenograft using a targeted contrast agent. It is shown that this approach may be used to detect reliably the accumulation of targeted nanoparticles at five minutes post-injection in this in vivo model. PMID:20679020
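    For a discrete distribution p, the Renyi entropy of order r is I(r) = log(sum_i p_i^r) / (1 - r); as r tends to infinity it converges to the min-entropy -log(max_i p_i), which depends only on the largest probability and is therefore cheap enough to compute in real time. The sketch below illustrates the limit for a discrete p; the paper applies these quantities to densities of ultrasound signal samples, so the exact estimator differs.

```python
import numpy as np

def renyi_entropy(p, r):
    """Renyi entropy of order r (r > 0, r != 1) of a discrete
    distribution p, in nats."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** r)) / (1.0 - r)

def min_entropy(p):
    """Limiting form as r -> infinity: -log of the largest probability."""
    return -np.log(np.max(np.asarray(p, dtype=float)))
```

    As r grows, the sum is dominated by the largest p_i, which is why the limiting form needs only a maximum rather than a full sum.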

  17. Fabrication of boron sputter targets

    DOEpatents

    Makowiecki, D.M.; McKernan, M.A.

    1995-02-28

    A process is disclosed for fabricating high density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high density boron monolithe by hot isostatically compacting high purity (99.9%) boron powder, machining the boron monolithe into the final dimensions, and brazing the finished boron piece to a matching boron carbide (B{sub 4}C) piece, by placing aluminum foil there between and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolithe by vacuum deposition. Also, a titanium based vacuum braze alloy can be used in place of the aluminum foil. 7 figs.

  18. Automation in visual inspection tasks: X-ray luggage screening supported by a system of direct, indirect or adaptable cueing with low and high system reliability.

    PubMed

    Chavaillaz, Alain; Schwaninger, Adrian; Michel, Stefan; Sauer, Juergen

    2018-05-25

    The present study evaluated three automation modes for improving performance in an X-ray luggage screening task. A total of 140 participants were asked to detect the presence of prohibited items in X-ray images of cabin luggage. Twenty participants conducted the task without automatic support (control group), whereas the others worked with either indirect cues (the system indicated target presence without specifying its location), direct cues (the system pointed out the exact target location), or adaptable automation (participants could freely choose between no cue, direct cues, and indirect cues). Furthermore, the reliability of the automatic support was manipulated (low vs. high). The results showed a clear advantage for direct cues in detection performance and response time. No benefits were observed for adaptable automation. Finally, high automation reliability led to better performance and higher operator trust. The findings overall confirm that automatic support systems for luggage screening should be designed to provide direct, highly reliable cues.

  19. Reliability and validity of a treatment fidelity assessment for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.

    PubMed

    Seng, Elizabeth K; Lovejoy, Travis I

    2013-12-01

    This study psychometrically evaluates the Motivational Interviewing Treatment Integrity Code (MITI) as a measure of fidelity to motivational interviewing to reduce sexual risk behaviors in people living with HIV/AIDS. Seventy-four sessions from a pilot randomized controlled trial of motivational interviewing to reduce sexual risk behaviors in people living with HIV were coded with the MITI. Participants reported sexual behavior at baseline, 3 months, and 6 months. Regarding reliability, excellent inter-rater reliability was achieved for measures of behavior frequency across the 12 sessions coded by both coders; the global scales demonstrated poor intraclass correlations but adequate percent agreement. Regarding validity, principal components analyses indicated that a two-factor model accounted for an adequate amount of variance in the data. These factors were associated with decreases in sexual risk behaviors after treatment. The MITI is a reliable and valid measure of treatment fidelity for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.
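    Percent agreement of the kind reported for the global scales can be computed as the fraction of sessions on which the two coders' scores fall within a tolerance of each other. The following is a hypothetical illustration of that statistic, not the MITI manual's exact definition.

```python
def percent_agreement(ratings_a, ratings_b, tolerance=1):
    """Fraction of double-coded sessions on which two coders' scale
    scores agree within `tolerance` points (illustrative sketch)."""
    pairs = list(zip(ratings_a, ratings_b))
    return sum(abs(a - b) <= tolerance for a, b in pairs) / len(pairs)
```

    Unlike the intraclass correlation, this statistic stays interpretable when raters use a restricted range of the scale, which is one reason it is often reported alongside low ICCs.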

  20. Specific detection of the cleavage activity of mycobacterial enzymes using a quantum dot based DNA nanosensor

    NASA Astrophysics Data System (ADS)

    Jepsen, Morten Leth; Harmsen, Charlotte; Godbole, Adwait Anand; Nagaraja, Valakunja; Knudsen, Birgitta R.; Ho, Yi-Ping

    2015-12-01

    We present a quantum dot based DNA nanosensor specifically targeting the cleavage step in the reaction cycle of the essential DNA-modifying enzyme, mycobacterial topoisomerase I. The design takes advantage of the unique photophysical properties of quantum dots to generate visible fluorescence recovery upon specific cleavage by mycobacterial topoisomerase I. This report, for the first time, demonstrates the possibility of quantifying the cleavage activity of the mycobacterial enzyme without pre-processing sample purification or post-processing signal amplification. The cleavage-induced signal response has also proven reliable in biological matrices, such as whole cell extracts prepared from Escherichia coli and human Caco-2 cells. It is expected that the assay may contribute to the clinical diagnostics of bacterial diseases, as well as the evaluation of treatment outcomes. Electronic supplementary information (ESI) available: characterization of the QD-based DNA nanosensor. See DOI: 10.1039/c5nr06326d

  1. Near-Millimeter Wave Technology Base Study: Volume I. Propagation and Target/Background Characteristics

    DTIC Science & Technology

    1979-11-01

    ...diameter test cell used for laser propagation measurements is available and has been designed for circulating aerosols (path length: 84 m to 2.0 km)...comparison measurements at 36 and 110 GHz along a 4-km path, with rain rate measured near the receiver end, found an attenuation ratio...Tipping-bucket gauges are reliable, but become increasingly inaccurate at high rain rates. Flow gauges...the direct field measurement

  2. Double ErrP Detection for Automatic Error Correction in an ERP-Based BCI Speller.

    PubMed

    Cruz, Aniana; Pires, Gabriel; Nunes, Urbano J

    2018-01-01

    Brain-computer interface (BCI) is a useful device for people with severe motor disabilities. However, due to its low speed and low reliability, BCI still has very limited application in daily real-world tasks. This paper proposes a P300-based BCI speller combined with double error-related potential (ErrP) detection to automatically correct erroneous decisions. This novel approach introduces a second error detection step to infer whether a wrong automatic correction also elicits a second ErrP. Thus, two single-trial responses, instead of one, contribute to the final selection, improving the reliability of error detection. Moreover, to increase error detection, the evoked potential detected as target by the P300 classifier is combined with the evoked error potential at the feature level. Discriminable error and positive potentials (responses to correct feedback) were clearly identified. The proposed approach was tested on nine healthy participants and one tetraplegic participant. The online average accuracies for the first and second ErrPs were 88.4% and 84.8%, respectively. With automatic correction, we achieved an improvement of around 5%, reaching 89.9% spelling accuracy at an effective rate of 2.92 symbols/min. The proposed approach shows that double ErrP detection can improve the reliability and speed of BCI systems.

  3. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.

    PubMed

    Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon

    2017-04-24

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.
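    A standard way to handle the varying-amplitude loading discussed above is Palmgren-Miner damage accumulation over blocks of cycles, with an S-N curve giving the allowable cycles at each stress range. The sketch below is a generic illustration of that idea, not the system-reliability formulation developed in the paper; the S-N constants C and m are illustrative values.

```python
def miner_damage(cycle_blocks, C=1e12, m=3.0):
    """Palmgren-Miner damage sum for a varying-amplitude load history.

    Each block is (stress_range, n_cycles). The S-N curve
    N(S) = C / S**m gives the allowable cycles at stress range S;
    fatigue failure is predicted when the damage sum reaches 1.
    """
    damage = 0.0
    for stress_range, n_cycles in cycle_blocks:
        n_allowable = C / stress_range ** m
        damage += n_cycles / n_allowable
    return damage
```

    In a probabilistic treatment like the paper's, C, m, and the applied cycle counts become random variables, and the failure event {damage >= 1} is evaluated with system reliability methods rather than deterministically.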

  4. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair

    PubMed Central

    Lee, Young-Joo; Kim, Robin E.; Suh, Wonho; Park, Kiwon

    2017-01-01

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed. PMID:28441768

  5. Predicting Incursion of Plant Invaders into Kruger National Park, South Africa: The Interplay of General Drivers and Species-Specific Factors

    PubMed Central

    Jarošík, Vojtěch; Pyšek, Petr; Foxcroft, Llewellyn C.; Richardson, David M.; Rouget, Mathieu; MacFadyen, Sandra

    2011-01-01

    Background: Overcoming boundaries is crucial for incursion of alien plant species and their successful naturalization and invasion within protected areas. Previous work showed that in Kruger National Park, South Africa, this process can be quantified and that factors determining the incursion of invasive species can be identified and predicted confidently. Here we explore the similarity between determinants of incursions identified by the general model based on a multispecies assemblage, and those identified by species-specific models. We analyzed the presence and absence of six invasive plant species in 1.0×1.5 km segments along the border of the park as a function of environmental characteristics from outside and inside the KNP boundary, using two data-mining techniques: classification trees and random forests. Principal Findings: The occurrence of Ageratum houstonianum, Chromolaena odorata, Xanthium strumarium, Argemone ochroleuca, Opuntia stricta and Lantana camara can be reliably predicted based on landscape characteristics identified by the general multispecies model, namely water runoff from surrounding watersheds and road density in a 10 km radius. The presence of main rivers and species-specific combinations of vegetation types are reliable predictors from inside the park. Conclusions: The predictors from the outside and inside of the park are complementary, and are approximately equally reliable for explaining the presence/absence of current invaders; those from the inside are, however, more reliable for predicting future invasions. Landscape characteristics determined as crucial predictors from outside the KNP serve as guidelines for management to enact proactive interventions to manipulate landscape features near the KNP to prevent further incursions. Predictors from the inside the KNP can be used reliably to identify high-risk areas to improve the cost-effectiveness of management, to locate invasive plants and target them for eradication. PMID:22194893
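    The presence/absence modelling described above can be imitated in miniature with bagged one-split decision trees (stumps): each stump is fit to a bootstrap resample and the forest predicts by majority vote. This is a stripped-down stand-in for the random forests the authors used, with synthetic data in place of the KNP predictors.

```python
import numpy as np

def fit_stump(X, y):
    """Best single-feature threshold split, searched over three
    candidate thresholds per feature and both label orientations."""
    best, best_acc = (0, 0.5, 0, 1), -1.0
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            for left, right in ((0, 1), (1, 0)):
                pred = np.where(X[:, j] <= t, left, right)
                acc = float((pred == y).mean())
                if acc > best_acc:
                    best_acc, best = acc, (j, float(t), left, right)
    return best

def fit_forest(X, y, n_trees=25, seed=0):
    """Bag stumps over bootstrap resamples of the training data."""
    rng = np.random.default_rng(seed)
    stumps = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), len(y))   # bootstrap resample
        stumps.append(fit_stump(X[idx], y[idx]))
    return stumps

def forest_predict(X, stumps):
    """Majority vote of the stumps' presence/absence predictions."""
    votes = np.mean([np.where(X[:, j] <= t, left, right)
                     for j, t, left, right in stumps], axis=0)
    return (votes >= 0.5).astype(int)
```

    A real analysis would use full decision trees with many candidate splits (as in the paper's classification trees and random forests), but the bootstrap-plus-vote structure is the same.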

  6. Predicting incursion of plant invaders into Kruger National Park, South Africa: the interplay of general drivers and species-specific factors.

    PubMed

    Jarošík, Vojtěch; Pyšek, Petr; Foxcroft, Llewellyn C; Richardson, David M; Rouget, Mathieu; MacFadyen, Sandra

    2011-01-01

    Overcoming boundaries is crucial for incursion of alien plant species and their successful naturalization and invasion within protected areas. Previous work showed that in Kruger National Park, South Africa, this process can be quantified and that factors determining the incursion of invasive species can be identified and predicted confidently. Here we explore the similarity between determinants of incursions identified by the general model based on a multispecies assemblage, and those identified by species-specific models. We analyzed the presence and absence of six invasive plant species in 1.0×1.5 km segments along the border of the park as a function of environmental characteristics from outside and inside the KNP boundary, using two data-mining techniques: classification trees and random forests. The occurrence of Ageratum houstonianum, Chromolaena odorata, Xanthium strumarium, Argemone ochroleuca, Opuntia stricta and Lantana camara can be reliably predicted based on landscape characteristics identified by the general multispecies model, namely water runoff from surrounding watersheds and road density in a 10 km radius. The presence of main rivers and species-specific combinations of vegetation types are reliable predictors from inside the park. The predictors from the outside and inside of the park are complementary, and are approximately equally reliable for explaining the presence/absence of current invaders; those from the inside are, however, more reliable for predicting future invasions. Landscape characteristics determined as crucial predictors from outside the KNP serve as guidelines for management to enact proactive interventions to manipulate landscape features near the KNP to prevent further incursions. Predictors from inside the KNP can be used reliably to identify high-risk areas to improve the cost-effectiveness of management, to locate invasive plants and target them for eradication.
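The classification-tree technique named above can be illustrated with a minimal, self-contained sketch: a single Gini-impurity split on one predictor separating presence from absence records. The "road density" values and presence labels below are illustrative assumptions, not the study's data.

```python
# Minimal sketch of a classification-tree split: find the single threshold on
# a hypothetical "road density" predictor that best separates presence (1)
# from absence (0) of an invader, by minimizing weighted Gini impurity.

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(values, labels):
    """Return (threshold, weighted_gini) of the best binary split."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical border segments: road density in a 10 km radius vs. presence.
road_density = [0.2, 0.5, 1.1, 1.4, 2.3, 2.8, 3.0, 3.6]
presence = [0, 0, 0, 1, 1, 1, 1, 1]

threshold, score = best_split(road_density, presence)
print(threshold, round(score, 3))
```

A full classification tree applies this split search recursively, and a random forest averages many such trees fitted to bootstrap samples of the segments.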

  7. A mass spectrometry-based multiplex SNP genotyping by utilizing allele-specific ligation and strand displacement amplification.

    PubMed

    Park, Jung Hun; Jang, Hyowon; Jung, Yun Kyung; Jung, Ye Lim; Shin, Inkyung; Cho, Dae-Yeon; Park, Hyun Gyu

    2017-05-15

    We herein describe a new mass spectrometry-based method for multiplex SNP genotyping by utilizing allele-specific ligation and a strand displacement amplification (SDA) reaction. In this method, allele-specific ligation is first performed to discriminate base sequence variations at the SNP site within the PCR-amplified target DNA. The primary ligation probe is extended with a universal primer annealing site, while the secondary ligation probe carries an overhang with a nicking enzyme recognition site and a complementary mass-marker sequence. The ligation probe pairs are joined by DNA ligase only at the specific allele in the target DNA, and the resulting ligated product serves as a template to promote the SDA reaction using a universal primer. This process isothermally amplifies short DNA fragments, called mass markers, to be analyzed by mass spectrometry. By varying the sizes of the mass markers, we demonstrated the multiplex SNP genotyping capability of this method, reliably identifying several BRCA mutations with mass spectrometry.

  8. Mito-magneto: A Tool for Nanoparticle Mediated Mitochondria Isolation†

    PubMed Central

    Banik, Bhabatosh; Askins, Brett W.; Dhar, Shanta

    2016-01-01

    The field of intracellular organelle targeting using nanoparticles (NPs) is growing rapidly. Thus, the area of nanotechnology-enabled targeting of the mitochondrion, the cellular powerhouse, for diseases characterized by mitochondrial dysfunction, such as cancer, diseases of the central nervous system, and cardiovascular diseases, is also expanding at a rapid pace. Optimizing an NP's ability to target the mitochondria requires quantification of the particles in this subcellular organelle and hence isolation of mitochondria from cells. The conventional gradient centrifugation used in currently available methods may not be appropriate for isolating NP-containing mitochondria, as these particles undergo Brownian motion under centrifugal forces, yielding irreproducible results. Only one centrifugation-free mitochondria isolation method exists; however, it requires immunoprecipitation. Thus, a reliable centrifugation- and immunoprecipitation-free method is urgently needed to support the growing field of nanotechnology-based mitochondria targeting. Here, we report a mitochondria-targeted magnetic NP, Mito-magneto, that avoids centrifugation and immunoprecipitation in the isolation of functional, respiration-active, pure mitochondria, which can be used to analyze and quantify the mitochondria-targeting properties of various NPs, providing an important tool for the growing field of “mitochondrial nanomedicine”. PMID:27735003

  9. Investigating effects of communications modulation technique on targeting performance

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Eusebio, Gerald; Huling, Edward

    2006-05-01

    One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they need reliable data. To facilitate reliable computational intelligence, we explore the communication modulation tradeoffs affecting information distribution and accumulation. In this analysis, we examine the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. We simulate a Link 16 with a simple bandpass frequency shift keying (FSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy tradeoffs are assessed for their effects on targeting performance.
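The bit-error-rate versus SNR tradeoff driving this kind of analysis can be illustrated with the textbook result for coherent BPSK over an AWGN channel, BER = 0.5·erfc(√(Eb/N0)). This is a generic sketch of that relationship, not the paper's Link 16 simulation.

```python
# Theoretical BPSK bit error rate over an AWGN channel as a function of
# Eb/N0 in dB: BER = 0.5 * erfc(sqrt(Eb/N0)). Higher SNR -> fewer bit errors,
# which is the mechanism by which modulation choice affects targeting data.
import math

def bpsk_ber(ebn0_db):
    """Theoretical coherent-BPSK bit error rate for a given Eb/N0 in dB."""
    ebn0 = 10 ** (ebn0_db / 10)  # convert dB to a linear ratio
    return 0.5 * math.erfc(math.sqrt(ebn0))

for snr_db in (0, 4, 8):
    print(snr_db, f"{bpsk_ber(snr_db):.2e}")
```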

  10. Stereo-vision-based cooperative-vehicle positioning using OCC and neural networks

    NASA Astrophysics Data System (ADS)

    Ifthekhar, Md. Shareef; Saha, Nirzhar; Jang, Yeong Min

    2015-10-01

    Vehicle positioning has been the subject of extensive research regarding driving safety measures and assistance as well as autonomous navigation. The most common positioning technique used in automotive positioning is the global positioning system (GPS). However, GPS is not reliably accurate because of signal blockage caused by high-rise buildings. In addition, GPS is error prone when a vehicle is inside a tunnel. Moreover, GPS and other radio-frequency-based approaches cannot provide orientation information or the position of neighboring vehicles. In this study, we propose a cooperative-vehicle positioning (CVP) technique by using the newly developed optical camera communications (OCC). The OCC technique utilizes image sensors and cameras to receive and decode light-modulated information from light-emitting diodes (LEDs). A vehicle equipped with an OCC transceiver can receive positioning and other information such as speed, lane change, driver's condition, etc., through optical wireless links of neighboring vehicles. Thus, the position of a target vehicle that is too far away to establish an OCC link can be determined by a computer-vision-based technique combined with the cooperation of neighboring vehicles. In addition, we have devised a back-propagation (BP) neural-network learning method for positioning and range estimation for CVP. The proposed neural-network-based technique can estimate target vehicle position from only two image points of target vehicles using stereo vision. For this, we use rear LEDs on target vehicles as image points. We show from simulation results that our neural-network-based method achieves better accuracy than that of the computer-vision method.
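The stereo-vision baseline that the BP network is compared against can be illustrated with the standard rectified pinhole-camera relation Z = fB/d. The focal length, baseline, and pixel coordinates below are illustrative assumptions, not the paper's calibration.

```python
# Classic stereo range estimation: for a rectified camera pair with focal
# length f (pixels) and baseline B (metres), a point seen with disparity
# d (pixels) between the two images lies at depth Z = f * B / d.

def depth_from_disparity(f_px, baseline_m, d_px):
    """Depth (metres) of a point from its stereo disparity (rectified pair)."""
    if d_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / d_px

# A rear LED imaged at x = 412 px in the left view and x = 400 px in the
# right view gives a disparity of 12 px.
print(depth_from_disparity(f_px=800, baseline_m=0.5, d_px=412 - 400))
```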

  11. Development and validation of a toddler silhouette scale.

    PubMed

    Hager, Erin R; McGill, Adrienne E; Black, Maureen M

    2010-02-01

    The purpose of this study is to develop and validate a toddler silhouette scale. A seven-point scale was developed by an artist based on photographs of 15 toddlers (6 males, 9 females) varying in race/ethnicity and body size, and a list of phenotypic descriptions of toddlers of varying body sizes. Content validity, age-appropriateness, and gender and race/ethnicity neutrality were assessed among 180 pediatric health professionals and 129 parents of toddlers. Inter- and intrarater reliability and concurrent validity were assessed by having 138 pediatric health professionals match the silhouettes with photographs of toddlers. Assessments of content validity revealed that most health professionals (74.6%) and parents of toddlers (63.6%) ordered all seven silhouettes correctly, and interobserver agreement for weight status classification was high (kappa = 0.710, r = 0.827, P < 0.001). Most respondents reported that the scale represented toddlers aged 12-36 months (89%) and was gender (68.5%) and race/ethnicity (77.3%) neutral. The inter-rater reliability, based on matching silhouettes with photographs, was 0.787 (Cronbach's alpha) and the intrarater reliability was 0.855 (P < 0.001). The concurrent validity, based on the correlation between silhouette choice and the weight-for-length percentile of each toddler's photograph, was 0.633 (P < 0.001). In conclusion, a valid and reliable toddler silhouette scale that may be used for male or female toddlers, aged 12-36 months, of varying race/ethnicity was developed and evaluated. This scale may be used clinically or in research settings to assess parents' perception of and satisfaction with their toddler's body size. Interventions can be targeted toward parents who have inaccurate perceptions of or are dissatisfied with their toddler's body size.
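The interobserver agreement reported above (kappa = 0.710) is Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch of the statistic for two raters, using made-up weight-status classifications:

```python
# Cohen's kappa for two raters: (observed agreement - chance agreement)
# divided by (1 - chance agreement).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of category labels."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical classifications of six toddlers by two raters.
a = ["under", "normal", "normal", "over", "over", "normal"]
b = ["under", "normal", "over", "over", "over", "normal"]
print(round(cohens_kappa(a, b), 3))
```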

  12. Indexing strategic retrieval of colour information with event-related potentials.

    PubMed

    Wilding, E L; Fraser, C S; Herron, J E

    2005-09-01

    Event-related potentials (ERPs) were acquired during two experiments in order to determine boundary conditions for when recollection of colour information can be controlled strategically. In initial encoding phases, participants saw an equal number of words presented in red or green. In subsequent retrieval phases, all words were shown in white. Participants were asked to endorse old words that had been shown at encoding in one colour (targets), and to reject new test words as well as old words shown in the alternate colour (non-targets). Study and test lists were longer in Experiment 1, and as a result, the accuracy of memory judgments was superior in Experiment 2. The left-parietal ERP old/new effect, the electrophysiological signature of recollection, was reliable for targets in both experiments, and reliable for non-targets in Experiment 1 only. These findings are consistent with the view that participants were able to restrict recollection to targets in Experiment 2, while recollecting information about targets as well as non-targets in Experiment 1. The fact that this selective strategy was implemented in Experiment 2 despite the close correspondence between the kinds of information associated with targets and non-targets indicates that participants were able to exert considerable control over the conditions under which recollection of task-relevant information occurred.

  13. Measurement of sedentary behaviour in population health surveys: a review and recommendations

    PubMed Central

    LeBlanc, Allana G.; Colley, Rachel C.; Saunders, Travis J.

    2017-01-01

    Background The purpose of this review was to determine the most valid and reliable questions for targeting key modes of sedentary behaviour (SB) in a broad range of national and international health surveillance surveys. This was done by reviewing the SB modules currently used in population health surveys, as well as examining SB questionnaires that have performed well in psychometric testing. Methods Health surveillance surveys were identified via scoping review and contact with experts in the field. Previous systematic reviews provided psychometric information on pediatric questionnaires. A comprehensive search of four bibliographic databases was used to identify studies reporting psychometric information for adult questionnaires. Only surveys/studies published/used in English or French were included. Results The review identified a total of 16 pediatric and 18 adult national/international surveys assessing SB, few of which have undergone psychometric testing. Fourteen pediatric and 35 adult questionnaires with psychometric information were included. While reliability was generally good to excellent for questions targeting key modes of SB, validity was poor to moderate, and reported much less frequently. The most valid and reliable questions targeting specific modes of SB were combined to create a single questionnaire targeting key modes of SB. Discussion Our results highlight the importance of including SB questions in survey modules that are adaptable, able to assess various modes of SB, and that exhibit adequate reliability and validity. Future research could investigate the psychometric properties of the module we have proposed in this paper, as well as other questionnaires currently used in national and international population health surveys. PMID:29250468

  14. Measurement of sedentary behaviour in population health surveys: a review and recommendations.

    PubMed

    Prince, Stephanie A; LeBlanc, Allana G; Colley, Rachel C; Saunders, Travis J

    2017-01-01

    The purpose of this review was to determine the most valid and reliable questions for targeting key modes of sedentary behaviour (SB) in a broad range of national and international health surveillance surveys. This was done by reviewing the SB modules currently used in population health surveys, as well as examining SB questionnaires that have performed well in psychometric testing. Health surveillance surveys were identified via scoping review and contact with experts in the field. Previous systematic reviews provided psychometric information on pediatric questionnaires. A comprehensive search of four bibliographic databases was used to identify studies reporting psychometric information for adult questionnaires. Only surveys/studies published/used in English or French were included. The review identified a total of 16 pediatric and 18 adult national/international surveys assessing SB, few of which have undergone psychometric testing. Fourteen pediatric and 35 adult questionnaires with psychometric information were included. While reliability was generally good to excellent for questions targeting key modes of SB, validity was poor to moderate, and reported much less frequently. The most valid and reliable questions targeting specific modes of SB were combined to create a single questionnaire targeting key modes of SB. Our results highlight the importance of including SB questions in survey modules that are adaptable, able to assess various modes of SB, and that exhibit adequate reliability and validity. Future research could investigate the psychometric properties of the module we have proposed in this paper, as well as other questionnaires currently used in national and international population health surveys.

  15. Assessing cortical excitability in migraine: reliability of magnetic suppression of perceptual accuracy technique over time.

    PubMed

    Custers, Anouk; Mulleners, Wim M; Chronicle, Edward P

    2005-10-01

    To examine the test-retest reliability of magnetic suppression of perceptual accuracy (MSPA) prior to its use as a marker of cortical excitability in a trial of migraine prophylactic agents. MSPA is a relatively novel avenue of research in headache, providing an opportunity to study cortical responsiveness objectively and noninvasively. However, little is known about the reliability of magnetic stimulation protocols such as MSPA in longitudinal research designs. We tested 10 healthy headache-free volunteers who had no family history of migraine. In 54 trials, they were briefly presented with different three-letter combinations, flashed on a computer screen for 24 ms (the target). After a brief interval, each target was followed by a single magnetic pulse through a 90-mm circular coil centered 7 cm above the inion in the midline. The interval between target and magnetic pulse was systematically varied. Volunteers were asked to report as many letters as they could identify. After 2 weeks, all volunteers were retested using identical methods. MSPA performance is expressed as a profile of response accuracy (i.e., percentage of correctly identified letters) across target-pulse intervals. Profiles at the first test were characteristic of normal headache-free subjects. Analysis of variance revealed no significant difference in profiles between test and retest (F = 2.05; P = .136): the retest profiles are almost coincident with the test profiles. MSPA is a safe and objective measure of cortical excitability that is reliable over time. MSPA therefore shows excellent promise as a biological marker of cortical response in trials of migraine prophylactics.

  16. Spatial working memory for locations specified by vision and audition: testing the amodality hypothesis.

    PubMed

    Loomis, Jack M; Klatzky, Roberta L; McHugh, Brendan; Giudice, Nicholas A

    2012-08-01

    Spatial working memory can maintain representations from vision, hearing, and touch, representations referred to here as spatial images. The present experiment addressed whether spatial images from vision and hearing that are simultaneously present within working memory retain modality-specific tags or are amodal. Observers were presented with short sequences of targets varying in angular direction, with the targets in a given sequence being all auditory, all visual, or a sequential mixture of the two. On two thirds of the trials, one of the locations was repeated, and observers had to respond as quickly as possible when detecting this repetition. Ancillary detection and localization tasks confirmed that the visual and auditory targets were perceptually comparable. Response latencies in the working memory task showed small but reliable costs in performance on trials involving a sequential mixture of auditory and visual targets, as compared with trials of pure vision or pure audition. These deficits were statistically reliable only for trials on which the modalities of the matching location switched from the penultimate to the final target in the sequence, indicating a switching cost. The switching cost for the pair in immediate succession means that the spatial images representing the target locations retain features of the visual or auditory representations from which they were derived. However, there was no reliable evidence of a performance cost for mixed modalities in the matching pair when the second of the two did not immediately follow the first, suggesting that more enduring spatial images in working memory may be amodal.

  17. CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool

    PubMed Central

    del Sol Keyer, Maria; Wittbrodt, Joachim; Mateo, Juan L.

    2015-01-01

    Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5’ end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, an efficient and reliable in silico selection and evaluation of the targeting site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation, non-homologous end-joining as well as homology directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high quality target sites. PMID:25909470
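The first step of any Cas9 target-site predictor of this kind is enumerating candidate sites: every 20-nt protospacer followed by an NGG PAM (the motif recognized by SpCas9). A minimal sketch, forward strand only; a full tool such as CCTop also scans the reverse complement and scores off-targets.

```python
# Enumerate candidate sgRNA target sites on the forward strand of a DNA
# sequence: a 20-nt protospacer immediately followed by an NGG PAM.
# The lookahead regex allows overlapping candidate windows.
import re

def find_sgrna_sites(seq):
    """Return (start, protospacer, PAM) for each 20-nt + NGG site found."""
    sites = []
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq.upper()):
        sites.append((m.start(), m.group(1), m.group(2)))
    return sites

toy = "TTGACCTTGACCTTGACCTTAAGGTT"  # made-up 26-nt query sequence
for start, proto, pam in find_sgrna_sites(toy):
    print(start, proto, pam)
```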

  18. Molecular imaging of human tumor cells that naturally overexpress type 2 cannabinoid receptors using a quinolone-based near-infrared fluorescent probe

    NASA Astrophysics Data System (ADS)

    Wu, Zhiyuan; Shao, Pin; Zhang, Shaojuan; Ling, Xiaoxi; Bai, Mingfeng

    2014-07-01

    Cannabinoid CB2 receptors (CB2R) hold promise as therapeutic targets for treating diverse diseases, such as cancers, neurodegenerative diseases, pain, inflammation, osteoporosis, psychiatric disorders, addiction, and immune disorders. However, the fundamental role of CB2R in the regulation of diseases remains unclear, largely due to a lack of reliable imaging tools for the receptors. The goal of this study was to develop a CB2R-targeted molecular imaging probe and evaluate the specificity of the probe using human tumor cells that naturally overexpress CB2R. To synthesize the CB2R-targeted probe (NIR760-Q), a conjugable CB2R ligand based on the quinolone structure was first prepared, followed by bioconjugation with a near-infrared (NIR) fluorescent dye, NIR760. In vitro fluorescence imaging and competitive binding studies showed higher uptake of NIR760-Q than free NIR760 dye in Jurkat human acute T-lymphoblastic leukemia cells. In addition, the high uptake of NIR760-Q was significantly inhibited by the blocking agent, 4-quinolone-3-carboxamide, indicating specific binding of NIR760-Q to the target receptors. These results indicate that NIR760-Q has potential in diagnostic imaging of CB2R-positive cancers and in elucidating the role of CB2R in the regulation of disease progression.

  19. Airborne Infrared and Visible Image Fusion Combined with Region Segmentation

    PubMed Central

    Zuo, Yujia; Liu, Jinghong; Bai, Guanbing; Wang, Xuan; Sun, Mingchao

    2017-01-01

    This paper proposes an infrared (IR) and visible image fusion method that introduces region segmentation into the dual-tree complex wavelet transform (DTCWT) domain. This method should effectively improve both the target indication and scene spectrum features of fusion images, and the target identification and tracking reliability of the fusion system, on an airborne photoelectric platform. The method involves segmenting the regions in an IR image by significance, identifying the target region and the background region, and then fusing the low-frequency components in the DTCWT domain according to the region segmentation result. For high-frequency components, region weights are assigned according to the information richness of region details, fusion is conducted based on both weights and adaptive phases, and a shrinkage function is then introduced to suppress noise. Finally, the fused low-frequency and high-frequency components are reconstructed to obtain the fusion image. The experimental results show that the proposed method can fully extract complementary information from the source images to obtain a fusion image with good target indication and rich information on scene details. They also show a fusion result superior to existing popular fusion methods, based on either subjective or objective evaluation. With good stability and high fusion accuracy, this method can meet the fusion requirements of IR-visible image fusion systems. PMID:28505137
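The two fusion rules described above can be sketched on toy subband arrays. This is a generic weighted-average/choose-max sketch applied to plain arrays rather than true DTCWT coefficients, and it does not reproduce the paper's exact weights, adaptive phases, or shrinkage function.

```python
# Toy versions of the two common wavelet-domain fusion rules: low-frequency
# coefficients are blended with region-dependent weights (the segmented
# target region favours the IR image), and high-frequency detail
# coefficients are fused by keeping the larger-magnitude coefficient.
import numpy as np

def fuse_low(ir_low, vis_low, target_mask, w_target=0.8):
    """Weighted average; inside the segmented target region favour IR."""
    w = np.where(target_mask, w_target, 0.5)
    return w * ir_low + (1 - w) * vis_low

def fuse_high(ir_high, vis_high):
    """Choose-max-absolute rule for detail coefficients."""
    return np.where(np.abs(ir_high) >= np.abs(vis_high), ir_high, vis_high)

ir_low = np.array([[10.0, 2.0], [4.0, 4.0]])
vis_low = np.array([[2.0, 2.0], [8.0, 0.0]])
mask = np.array([[True, False], [False, False]])  # target pixel at (0, 0)
print(fuse_low(ir_low, vis_low, mask))
print(fuse_high(np.array([3.0, -5.0]), np.array([-4.0, 1.0])))
```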

  20. Airborne Infrared and Visible Image Fusion Combined with Region Segmentation.

    PubMed

    Zuo, Yujia; Liu, Jinghong; Bai, Guanbing; Wang, Xuan; Sun, Mingchao

    2017-05-15

    This paper proposes an infrared (IR) and visible image fusion method that introduces region segmentation into the dual-tree complex wavelet transform (DTCWT) domain. This method should effectively improve both the target indication and scene spectrum features of fusion images, and the target identification and tracking reliability of the fusion system, on an airborne photoelectric platform. The method involves segmenting the regions in an IR image by significance, identifying the target region and the background region, and then fusing the low-frequency components in the DTCWT domain according to the region segmentation result. For high-frequency components, region weights are assigned according to the information richness of region details, fusion is conducted based on both weights and adaptive phases, and a shrinkage function is then introduced to suppress noise. Finally, the fused low-frequency and high-frequency components are reconstructed to obtain the fusion image. The experimental results show that the proposed method can fully extract complementary information from the source images to obtain a fusion image with good target indication and rich information on scene details. They also show a fusion result superior to existing popular fusion methods, based on either subjective or objective evaluation. With good stability and high fusion accuracy, this method can meet the fusion requirements of IR-visible image fusion systems.

  1. Unequal error control scheme for dimmable visible light communication systems

    NASA Astrophysics Data System (ADS)

    Deng, Keyan; Yuan, Lei; Wan, Yi; Li, Huaan

    2017-01-01

    Visible light communication (VLC), which has the advantages of a very large bandwidth, high security, and freedom from license-related restrictions and electromagnetic interference, has attracted much interest. Because a VLC system simultaneously performs illumination and communication functions, dimming control, efficiency, and reliable transmission are significant and challenging issues for such systems. In this paper, we propose a novel unequal error control (UEC) scheme in which expanding window fountain (EWF) codes in an on-off keying (OOK)-based VLC system are used to support different dimming target values. To evaluate the performance of the scheme for various dimming target values, we apply it to H.264 scalable video coding bitstreams in a VLC system. The results of simulations performed using additive white Gaussian noise (AWGN) at different signal-to-noise ratios (SNRs) are used to compare the performance of the proposed scheme for various dimming target values. It is found that the proposed UEC scheme enables earlier base-layer recovery than the equal error control (EEC) scheme for different dimming target values, and therefore affords robust transmission for scalable video multicast over optical wireless channels. This is because of the unequal error protection (UEP) and unequal recovery time (URT) of the EWF code in the proposed scheme.

  2. A Canopy Density Model for Planar Orchard Target Detection Based on Ultrasonic Sensors

    PubMed Central

    Li, Hanzhe; Zhai, Changyuan; Weckler, Paul; Wang, Ning; Yang, Shuo; Zhang, Bo

    2016-01-01

    Orchard target-oriented variable rate spraying is an effective method to reduce pesticide drift and excessive residues. To accomplish this task, the orchard targets’ characteristic information is needed to control liquid flow rate and airflow rate. One of the most important characteristics is the canopy density. In order to establish the canopy density model for a planar orchard target, which is indispensable for canopy density calculation, a target density detection testing system was developed based on an ultrasonic sensor. A time-domain energy analysis method was employed to analyze the ultrasonic signal. Orthogonal regression central composite experiments were designed and conducted using man-made canopies of known density with three or four layers of leaves. Two model equations were obtained, of which the model for the canopies with four layers was found to be the most reliable. A verification test was conducted with different numbers of layers at the same density values and detecting distances. The test results showed that the relative errors between model density values and actual values for five, four, three and two layers of leaves were acceptable, with maximum relative errors of 17.68%, 25.64%, 21.33% and 29.92%, respectively. This also suggested that the four-layer model equation had good applicability to different layer counts, improving for layer counts adjacent to four. PMID:28029132
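The time-domain energy analysis mentioned above reduces each ultrasonic echo to the energy of the samples in a window, E = sum of x[n] squared, with denser canopies returning more echo energy. A minimal sketch with a made-up sampled echo:

```python
# Time-domain energy of a window of a sampled ultrasonic echo signal:
# E = sum(x[n]^2) over the window. The echo samples below are illustrative.

def window_energy(samples, start, length):
    """Energy of a window [start, start + length) of the sampled signal."""
    return sum(x * x for x in samples[start:start + length])

echo = [0.0, 0.1, 0.4, 0.9, 0.5, 0.2, 0.1, 0.0]  # made-up digitized echo
print(window_energy(echo, 2, 4))  # energy of the main echo lobe
```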

  3. Multi-component identification and target cell-based screening of potential bioactive compounds in toad venom by UPLC coupled with high-resolution LTQ-Orbitrap MS and high-sensitivity Qtrap MS.

    PubMed

    Ren, Wei; Han, Lingyu; Luo, Mengyi; Bian, Baolin; Guan, Ming; Yang, Hui; Han, Chao; Li, Na; Li, Tuo; Li, Shilei; Zhang, Yangyang; Zhao, Zhenwen; Zhao, Haiyu

    2018-04-28

    Traditional Chinese medicines (TCMs) are undoubtedly treasured natural resources for discovering effective medicines to treat and prevent various diseases. However, it is still extremely difficult to screen for bioactive compounds because of the tremendous number of constituents in TCMs. In this work, the chemical composition of toad venom was comprehensively analyzed using ultra-high performance liquid chromatography (UPLC) coupled with high-resolution LTQ-Orbitrap mass spectrometry, and 93 compounds were detected. Among them, 17 constituents were confirmed with standard substances and 8 constituents were detected in toad venom for the first time. A compound database of toad venom containing the fullest set of compounds was further constructed using UPLC coupled with high-sensitivity Qtrap MS. A target cell-based approach for screening potential bioactive compounds from toad venom was then developed by analyzing the target cell extracts. The reliability of this method was validated with negative and positive controls. In total, 17 components in toad venom were found to interact with the target cancer cells. In vitro pharmacological trials were then performed to confirm the anti-cancer activity of four of them. The results showed that the six bufogenins and seven bufotoxins detected in our research represent a promising resource for exploring bufogenin/bufotoxin-based anticancer agents with low cardiotoxic effects. The target cell-based screening method, coupled with the toad venom compound database constructed by high-sensitivity UPLC-Qtrap-MS, provides a new strategy to rapidly screen and identify potential bioactive constituents present at low content in natural products, which is beneficial for drug discovery from other TCMs.

  4. The reliability and internal consistency of one-shot and flicker change detection for measuring individual differences in visual working memory capacity.

    PubMed

    Pailian, Hrag; Halberda, Justin

    2015-04-01

    We investigated the psychometric properties of the one-shot change detection task for estimating visual working memory (VWM) storage capacity, and also introduced and tested an alternative flicker change detection task for estimating these limits. In three experiments, we found that the one-shot whole-display task returns estimates of VWM storage capacity (K) that are unreliable across set sizes, suggesting that the whole-display task measures different things at different set sizes. In two additional experiments, we found that the one-shot single-probe variant shows improvements in the reliability and consistency of K estimates. In another experiment, we found that a one-shot whole-display-with-click task (requiring target localization) also showed improvements in reliability and consistency. The latter results suggest that the one-shot task can return reliable and consistent estimates of VWM storage capacity (K), and they highlight the possibility that the requirement to localize the changed target is what engenders this enhancement. Through a final series of four experiments, we introduced and tested an alternative flicker change detection method that also requires the observer to localize the changing target and that generates, from response times, an estimate of VWM storage capacity (K). We found that estimates of K from the flicker task correlated with estimates from the traditional one-shot task and also had high reliability and consistency. We highlight the flicker method's ability to estimate executive functions as well as VWM storage capacity, and discuss the potential for measuring multiple abilities with the one-shot and flicker tasks.
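The K estimates discussed above are conventionally computed from hit (H) and false-alarm (FA) rates: Cowan's formula K = N(H - FA) for single-probe displays, and Pashler's formula K = N(H - FA)/(1 - FA) for whole-display change detection. A minimal sketch of both:

```python
# Standard capacity estimators for change detection at set size N, given a
# hit rate H and a false-alarm rate FA.

def cowan_k(n, hits, false_alarms):
    """Cowan's K = N * (H - FA), for single-probe displays."""
    return n * (hits - false_alarms)

def pashler_k(n, hits, false_alarms):
    """Pashler's K = N * (H - FA) / (1 - FA), for whole-display tasks."""
    return n * (hits - false_alarms) / (1 - false_alarms)

# Example: set size 4, 75% hits, 25% false alarms.
print(cowan_k(4, 0.75, 0.25))    # 2.0
print(pashler_k(4, 0.75, 0.25))  # ~2.67
```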

  5. Selective pressures for accurate altruism targeting: evidence from digital evolution for difficult-to-test aspects of inclusive fitness theory.

    PubMed

    Clune, Jeff; Goldsby, Heather J; Ofria, Charles; Pennock, Robert T

    2011-03-07

    Inclusive fitness theory predicts that natural selection will favour altruist genes that are more accurate in targeting altruism only to copies of themselves. In this paper, we provide evidence from digital evolution in support of this prediction by competing multiple altruist-targeting mechanisms that vary in their accuracy in determining whether a potential target for altruism carries a copy of the altruist gene. We compete altruism-targeting mechanisms based on (i) kinship (kin targeting), (ii) genetic similarity at a level greater than that expected of kin (similarity targeting), and (iii) perfect knowledge of the presence of an altruist gene (green beard targeting). Natural selection always favoured the most accurate targeting mechanism available. Our investigations also revealed that evolution did not increase the altruism level when all green beard altruists used the same phenotypic marker. The green beard altruism levels stably increased only when mutations that changed the altruism level also changed the marker (e.g. beard colour), such that beard colour reliably indicated the altruism level. For kin- and similarity-targeting mechanisms, we found that evolution was able to stably adjust altruism levels. Our results confirm that natural selection favours altruist genes that are increasingly accurate in targeting altruism to only their copies. Our work also emphasizes that the concept of targeting accuracy must include both the presence of an altruist gene and the level of altruism it produces.

  6. A novel multiparametric flow cytometry-based cytotoxicity assay simultaneously immunophenotypes effector cells: Comparisons to a 4 h 51Cr-release assay

    PubMed Central

    Kim, GG; Donnenberg, VS; Donnenberg, AD; Gooding, W; Whiteside, TL

    2007-01-01

    Natural killer (NK) cell- or T cell-mediated cytotoxicity traditionally is measured in 4-16 h 51Cr-release assays (CRA). A new four-color flow cytometry-based cytotoxicity assay (FCC) was developed to simultaneously measure NK cell cytotoxicity and NK cell phenotype (CD3−CD16+CD56+). Target cells, K562 or Daudi, were labeled with Cell Tracker Orange (CTO) prior to the addition of effector cells. Following co-incubation, 7-amino-actinomycin D (7-AAD) was added to measure death of target cells. The phenotype of effectors, viability of targets, the formation of tumor-effector cell conjugates and absolute numbers of all cells were measured based on light scatter (FSC/SSC), double discrimination of the fluorescence peak integral and height, and fluorescence intensity. Kinetic studies (0.5 h and 1-4 h) at different effector to target (E:T) cell ratios (50, 25, 12, and 6) confirmed that the 3 h incubation was optimal. The FCC assay is more sensitive than the CRA, has a coefficient of variation (CV) of 8-13% and reliably measures NK cell- or lymphokine-activated killer (LAK) cell-mediated killing of target cells in normal controls and subjects with cancer. The FCC assay can be used to study a range of phenotypic attributes, in addition to lytic activity of various subsets of effector cells, without radioactive tracers and thus is relatively inexpensive. The FCC assay has the potential to provide information about molecular interactions underlying target cell lysis and thus to become a major tool for studies of disease pathogenesis as well as development of novel immune therapies. PMID:17617419
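
The record reports dead-target fractions via 7-AAD; a common way to turn such fractions into percent specific lysis, analogous to the spontaneous-release correction of the 51Cr assay, is sketched below (the formula choice and the numbers are assumptions, not taken from the paper):

```python
def specific_lysis(dead_frac_test, dead_frac_spontaneous):
    """Percent specific lysis corrected for spontaneous target-cell
    death, by analogy with the 51Cr-release correction:
    100 * (test - spontaneous) / (1 - spontaneous)."""
    return 100.0 * (dead_frac_test - dead_frac_spontaneous) \
        / (1.0 - dead_frac_spontaneous)

# hypothetical: 46% 7-AAD+ targets with effectors, 10% without
print(round(specific_lysis(0.46, 0.10), 1))  # 40.0
```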

  8. Photonics

    NASA Astrophysics Data System (ADS)

    Roh, Won B.

    Computational systems based on photonic technologies are projected to offer order-of-magnitude improvements in processing speed, owing to their intrinsic architectural parallelism and ultrahigh switching speeds; these architectures also minimize connectors, thereby enhancing reliability, and preclude EMP vulnerability. The use of optoelectronic ICs would also extend weapons capabilities in such areas as automated target recognition, system-state monitoring, and detection avoidance. Fiber-optic technologies have an information-carrying capacity fully five orders of magnitude greater than that of copper-wire-based systems; energy loss in transmission is two orders of magnitude lower, and error rates are one order of magnitude lower. Attention is being given to ZrF glasses for optical fibers with unprecedentedly low scattering levels.

  9. A closer look at diagnosis in clinical dental practice: part 1. Reliability, validity, specificity and sensitivity of diagnostic procedures.

    PubMed

    Pretty, Iain A; Maupomé, Gerardo

    2004-04-01

    Dentists are involved in diagnosing disease in every aspect of their clinical practice. A range of tests, systems, guides and equipment, which can be generally referred to as diagnostic procedures, are available to aid in diagnostic decision making. In this era of evidence-based dentistry, and given the increasing demand for diagnostic accuracy and properly targeted health care, it is important to assess the value of such diagnostic procedures. Doing so allows dentists to give appropriate weight to the information these procedures supply, to purchase new equipment if it proves more reliable than existing equipment, or even to discard a commonly used procedure if it is shown to be unreliable. This article, the first in a 6-part series, defines several concepts used to express the usefulness of diagnostic procedures, including reliability and validity, and describes some of their operating characteristics (statistical measures of performance), in particular specificity and sensitivity. Subsequent articles in the series will discuss the value of diagnostic procedures used in daily dental practice and will compare today's most innovative procedures with established methods.
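
Sensitivity and specificity, the operating characteristics named in this series, follow directly from a 2×2 confusion matrix; a minimal sketch with hypothetical caries-detection counts:

```python
def sensitivity(tp, fn):
    """Proportion of diseased cases the procedure correctly flags."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of healthy cases the procedure correctly clears."""
    return tn / (tn + fp)

# hypothetical counts: true/false positives and negatives
tp, fp, fn, tn = 40, 5, 10, 45
print(sensitivity(tp, fn))  # 0.8
print(specificity(tn, fp))  # 0.9
```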

  10. Reliability demonstration test for load-sharing systems with exponential and Weibull components

    PubMed Central

    Hu, Qingpei; Yu, Dan; Xie, Min

    2017-01-01

    Conducting a Reliability Demonstration Test (RDT) is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures are also investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn’t yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF) of the total system as the demonstration target. We represent the MTTF as a summation of mean time between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics. PMID:29284030
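
The paper's MTTF decomposition, a summation of mean times between successive component failures, can be sketched for the exponential case, where each inter-failure stage is exponential with a stage-specific total rate (the rates below are illustrative assumptions, not the paper's):

```python
def mttf_load_sharing(stage_rates):
    """MTTF of a load-sharing system as the sum of mean times between
    successive component failures. stage_rates[i] is the total system
    failure rate while i components have already failed; each stage is
    assumed exponential, so its mean duration is 1/rate."""
    return sum(1.0 / r for r in stage_rates)

# 2-component system: total rate 2*lam while both work; after one
# failure the survivor carries extra load at rate lam_shared > lam.
lam, lam_shared = 0.01, 0.025
print(mttf_load_sharing([2 * lam, lam_shared]))  # 1/0.02 + 1/0.025 = 50 + 40
```

A demonstration test then compares an estimate of this MTTF against the pre-specified threshold.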

  11. Reliability demonstration test for load-sharing systems with exponential and Weibull components.

    PubMed

    Xu, Jianyu; Hu, Qingpei; Yu, Dan; Xie, Min

    2017-01-01

    Conducting a Reliability Demonstration Test (RDT) is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures are also investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn't yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF) of the total system as the demonstration target. We represent the MTTF as a summation of mean time between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics.

  12. Assessing spatial uncertainty in reservoir characterization for carbon sequestration planning using public well-log data: A case study

    USGS Publications Warehouse

    Venteris, E.R.; Carter, K.M.

    2009-01-01

    Mapping and characterization of potential geologic reservoirs are key components in planning carbon dioxide (CO2) injection projects. The geometry of target and confining layers is vital to ensure that the injected CO2 remains in a supercritical state and is confined to the target layer. Also, maps of injection volume (porosity) are necessary to estimate sequestration capacity at undrilled locations. Our study uses publicly filed geophysical logs and geostatistical modeling methods to investigate the reliability of spatial prediction for oil and gas plays in the Medina Group (sandstone and shale facies) in northwestern Pennsylvania. Specifically, the modeling focused on two targets: the Grimsby Formation and Whirlpool Sandstone. For each layer, thousands of data points were available to model structure and thickness but only hundreds were available to support volumetric modeling because of the rarity of density-porosity logs in the public records. Geostatistical analysis based on this data resulted in accurate structure models, less accurate isopach models, and inconsistent models of pore volume. Of the two layers studied, only the Whirlpool Sandstone data provided for a useful spatial model of pore volume. Where reliable models for spatial prediction are absent, the best predictor available for unsampled locations is the mean value of the data, and potential sequestration sites should be planned as close as possible to existing wells with volumetric data. © 2009. The American Association of Petroleum Geologists/Division of Environmental Geosciences. All rights reserved.

  13. Comparative cath-lab assessment of coronary stenosis by radiology technician, junior and senior interventional cardiologist in patients treated with coronary angioplasty.

    PubMed

    Brunetti, Natale Daniele; Delli Carri, Felice; Ruggiero, Maria Assunta; Cuculo, Andrea; Ruggiero, Antonio; Ziccardi, Luigi; De Gennaro, Luisa; Di Biase, Matteo

    2014-03-01

    Exact quantification of plaque extension during coronary angioplasty (PCI) usually falls to the interventional cardiologist (IC). Quantitative coronary stenosis assessment (QCA) may possibly be committed to the radiology technician (RT), who usually supports the cath-lab nurse and IC during PCI. We therefore sought to investigate the reliability of QCA performed by RT in comparison with IC. Forty-four consecutive patients with acute coronary syndrome underwent PCI; target coronary vessel size beneath the target coronary lesion (S) and target coronary lesion length (L) were assessed by the RT, a junior IC (JIC), and a senior IC (SIC) and then compared. The SIC evaluation, which determined the final stent selection for coronary stenting, was considered the reference benchmark. RT performance with QCA support in assessing target vessel size and target lesion length did not differ significantly from the SIC (r = 0.46, p < 0.01; r = 0.64, p < 0.001, respectively) or the JIC (r = 0.79, r = 0.75, p < 0.001, respectively). JIC performance was significantly better than RT performance in assessing target vessel size (p < 0.05), but not in assessing target lesion length. RT may reliably assess the target lesion by using adequate QCA software in the cath-lab in case of PCI; RT performance does not differ from SIC.

  14. Rapid determination of caffeoylquinic acid derivatives in Cynara scolymus L. by ultra-fast liquid chromatography/tandem mass spectrometry based on a fused core C18 column.

    PubMed

    Shen, Qing; Dai, Zhiyuan; Lu, Yanbin

    2010-10-01

    An ultra-fast high-performance LC-ESI-MS/MS method was developed for the analysis and quantification of caffeoylquinic acid derivatives, including chlorogenic acid, 1,3-di-O-caffeoylquinic acid (cynarin) and 1,5-di-O-caffeoylquinic acid, in artichoke (Cynara scolymus L.) heads and leaves. The rapid separation (less than 4 min) was achieved on a Halo fused core C18-silica column (50 mm × 2.1 mm id, 2.7 μm). The target compounds were detected and quantified by a triple-quadrupole mass spectrometer in multiple-reaction monitoring mode. The calibration function is linear from 0.06 to 2800 ng/mL for chlorogenic acid, from 0.3 to 3000 ng/mL for cynarin, and from 0.24 to 4800 ng/mL for 1,5-di-O-caffeoylquinic acid. The average recoveries ranged from 92.1 to 113.2% with RSDs ≤6.5%. Moreover, four batches of artichoke head and leaf extracts were analyzed using the established method. The results indicated that the Halo fused core column provided much faster separations and higher sample throughput without sacrificing column ruggedness and reliability, and the triple-quadrupole MS provided extraordinarily low LOQs for most of the target analytes. Compared with conventional quantitative approaches, the established method was fast, sensitive and reliable for the determination of caffeoylquinic acid derivatives in artichoke.
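
Quantification in such MRM methods rests on an ordinary least-squares calibration line and spike-recovery checks; a generic sketch (the calibration points below are hypothetical, not the paper's data):

```python
def fit_line(x, y):
    """Ordinary least-squares calibration line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def recovery_percent(measured, spiked):
    """Spike recovery: measured amount as a percentage of added amount."""
    return 100.0 * measured / spiked

# hypothetical calibration points (ng/mL vs. peak area)
conc = [0.06, 1.0, 10.0, 100.0, 1000.0, 2800.0]
area = [0.12, 2.0, 20.0, 200.0, 2000.0, 5600.0]
a, b = fit_line(conc, area)
print(round(a, 3), round(b, 3))  # slope 2.0, intercept 0.0
```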

  15. Reliability of a computer-based system for measuring visual performance skills.

    PubMed

    Erickson, Graham B; Citek, Karl; Cove, Michelle; Wilczek, Jennifer; Linster, Carolyn; Bjarnason, Brendon; Langemo, Nathan

    2011-09-01

    Athletes have demonstrated better visual abilities than nonathletes. A vision assessment for an athlete should include methods to evaluate the quality of visual performance skills in the most appropriate, accurate, and repeatable manner. This study determines the reliability of the visual performance measures assessed with a computer-based system, known as the Nike Sensory Station. One hundred twenty-five subjects (56 men, 69 women), age 18 to 30, completed Phase I of the study. Subjects attended 2 sessions, separated by at least 1 week, in which identical protocols were followed. Subjects completed the following assessments: Visual Clarity, Contrast Sensitivity, Depth Perception, Near-Far Quickness, Target Capture, Perception Span, Eye-Hand Coordination, Go/No Go, and Reaction Time. An additional 36 subjects (20 men, 16 women), age 22 to 35, completed Phase II of the study involving modifications to the equipment, instructions, and protocols from Phase I. Results show no significant change in performance over time on assessments of Visual Clarity, Contrast Sensitivity, Depth Perception, Target Capture, Perception Span, and Reaction Time. Performance did improve over time for Near-Far Quickness, Eye-Hand Coordination, and Go/No Go. The results of this study show that many of the Nike Sensory Station assessments show repeatability and no learning effect over time. The measures that did improve across sessions show an expected learning effect caused by the motor response characteristics being measured. Copyright © 2011 American Optometric Association. Published by Elsevier Inc. All rights reserved.

  16. The Mechanisms for Within-Host Influenza Virus Control Affect Model-Based Assessment and Prediction of Antiviral Treatment

    PubMed Central

    Cao, Pengxing

    2017-01-01

    Models of within-host influenza viral dynamics have contributed to an improved understanding of viral dynamics and antiviral effects over the past decade. Existing models can be classified into two broad types based on the mechanism of viral control: models utilising target cell depletion to limit the progress of infection and models which rely on timely activation of innate and adaptive immune responses to control the infection. In this paper, we compare how two exemplar models based on these different mechanisms behave and investigate how the mechanistic difference affects the assessment and prediction of antiviral treatment. We find that the assumed mechanism for viral control strongly influences the predicted outcomes of treatment. Furthermore, we observe that for the target cell-limited model the assumed drug efficacy strongly influences the predicted treatment outcomes. The area under the viral load curve is identified as the most reliable predictor of drug efficacy, and is robust to model selection. Moreover, with support from previous clinical studies, we suggest that the target cell-limited model is more suitable for modelling in vitro assays or infection in some immunocompromised/immunosuppressed patients while the immune response model is preferred for predicting the infection/antiviral effect in immunocompetent animals/patients. PMID:28933757

  17. Reliability and validity of a school recess physical activity recall in Spanish youth.

    PubMed

    Martínez-Gómez, David; Calabro, M Andres; Welk, Gregory J; Marcos, Ascension; Veiga, Oscar L

    2010-05-01

    Recess is a frequent target in school-based physical activity (PA) promotion research, but there are challenges in assessing PA during this time period. The purpose of this study was to evaluate the reliability and validity of a recess PA recall (RPAR) instrument designed to assess total PA and time spent in moderate to vigorous PA (MVPA) during recess. One hundred twenty-five 7th- and 8th-grade students (59 females), age 12-14 years, participated in the study. Activity levels were objectively monitored on Mondays using different activity monitors (Yamax Digiwalker, Biotrainer and ActiGraph). On Tuesdays, 2 RPAR self-reports were administered within 1 h. Test-retest reliability showed ICC = 0.87 and 0.88 for total PA and time spent in MVPA, respectively. The RPAR was correlated against the Yamax (r = .35), Biotrainer (r = .40 and .54) and ActiGraph (r = .42) for total PA during recess, and against the ActiGraph (r = .54) for time spent in MVPA during recess. The mean difference between the RPAR and ActiGraph for time spent in MVPA during recess was not significant (2.15 +/- 3.67 min, p = .313). The RPAR showed adequate reliability and reasonable validity for assessing PA during school recess in youth.
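
Test-retest reliability of the kind reported (ICC = 0.87-0.88) is often computed as a two-way consistency ICC; a sketch assuming the ICC(3,1) form with two sessions (the record does not specify which ICC variant was used, and the data below are invented):

```python
def icc_consistency(session1, session2):
    """Two-way mixed, consistency ICC(3,1) for a test-retest design with
    k = 2 sessions: (MSB - MSE) / (MSB + (k-1)*MSE)."""
    n, k = len(session1), 2
    grand = (sum(session1) + sum(session2)) / (n * k)
    subj_means = [(a + b) / 2 for a, b in zip(session1, session2)]
    sess_means = [sum(session1) / n, sum(session2) / n]
    ss_total = sum((x - grand) ** 2 for x in session1 + session2)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_sess = n * sum((m - grand) ** 2 for m in sess_means)
    ss_err = ss_total - ss_subj - ss_sess
    msb = ss_subj / (n - 1)               # between-subjects mean square
    mse = ss_err / ((n - 1) * (k - 1))    # residual mean square
    return (msb - mse) / (msb + (k - 1) * mse)

# invented recess MVPA minutes for 5 students, two sessions
s1 = [10, 14, 22, 30, 18]
s2 = [11, 15, 20, 31, 17]
print(round(icc_consistency(s1, s2), 2))  # 0.98
```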

  18. Assessing adherence to the evidence base in the management of poststroke dysphagia.

    PubMed

    Burton, Christopher; Pennington, Lindsay; Roddam, Hazel; Russell, Ian; Russell, Daphne; Krawczyk, Karen; Smith, Hilary A

    2006-01-01

    To evaluate the reliability and responsiveness to change of an audit tool to assess adherence to evidence of effectiveness in the speech and language therapy (SLT) management of poststroke dysphagia. The tool was used to review SLT practice as part of a randomized study of different education strategies. Medical records were audited before and after delivery of the trial intervention. Seventeen SLT departments in the north-west of England participated in the study. The assessment tool was used to assess the medical records of 753 patients before and 717 patients after delivery of the trial intervention across the 17 departments. A target of 10 records per department per month was sought, using systematic sampling with a random start. Inter- and intra-rater reliability were explored, together with the tool's internal consistency and responsiveness to change. The assessment tool had high face validity, although internal consistency was low (ra = 0.37). Composite scores on the tool were however responsive to differences between SLT departments. Both inter- and intra-rater reliability ranged from 'substantial' to 'near perfect' across all items. The audit tool has high face validity and measurement reliability. The use of a composite adherence score should, however, proceed with caution as internal consistency is low.

  19. The ventriloquist in periphery: impact of eccentricity-related reliability on audio-visual localization.

    PubMed

    Charbonneau, Geneviève; Véronneau, Marie; Boudrias-Fournier, Colin; Lepore, Franco; Collignon, Olivier

    2013-10-28

    The relative reliability of separate sensory estimates influences the way they are merged into a unified percept. We investigated how eccentricity-related changes in reliability of auditory and visual stimuli influence their integration across the entire frontal space. First, we surprisingly found that despite a strong decrease in auditory and visual unisensory localization abilities in periphery, the redundancy gain resulting from the congruent presentation of audio-visual targets was not affected by stimuli eccentricity. This result therefore contrasts with the common prediction that a reduction in sensory reliability necessarily induces an enhanced integrative gain. Second, we demonstrate that the visual capture of sounds observed with spatially incongruent audio-visual targets (ventriloquist effect) steadily decreases with eccentricity, paralleling a lowering of the relative reliability of unimodal visual over unimodal auditory stimuli in periphery. Moreover, at all eccentricities, the ventriloquist effect positively correlated with a weighted combination of the spatial resolution obtained in unisensory conditions. These findings support and extend the view that the localization of audio-visual stimuli relies on an optimal combination of auditory and visual information according to their respective spatial reliability. All together, these results evidence that the external spatial coordinates of multisensory events relative to an observer's body (e.g., eyes' or head's position) influence how this information is merged, and therefore determine the perceptual outcome.
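
The "optimal combination according to respective spatial reliability" referred to here is usually formalized as maximum-likelihood cue integration, with each cue weighted by its inverse variance; a minimal sketch (the locations and sigmas are illustrative):

```python
def mle_combine(loc_v, sigma_v, loc_a, sigma_a):
    """Maximum-likelihood audio-visual integration: each cue is weighted
    by its reliability (inverse variance). Standard-model sketch; the
    paper's exact weighting scheme may differ."""
    w_v = (1 / sigma_v ** 2) / (1 / sigma_v ** 2 + 1 / sigma_a ** 2)
    loc = w_v * loc_v + (1 - w_v) * loc_a
    sigma = (sigma_v ** -2 + sigma_a ** -2) ** -0.5
    return loc, sigma

# In central vision sigma_v is small, so vision dominates: a sound 10
# degrees away is "captured" almost entirely (strong ventriloquism).
loc, sigma = mle_combine(0.0, 1.0, 10.0, 3.0)
print(round(loc, 1))  # 1.0
```

In periphery, sigma_v grows relative to sigma_a, the visual weight shrinks, and the predicted capture weakens, matching the eccentricity effect reported above.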

  20. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    PubMed

    Davidich, Maria; Köster, Gerta

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim to construct such a realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.
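
The benchmark model is a cellular automaton; as a toy illustration of the idea (far simpler than the calibrated 2-D model in the paper, and with invented parameters), a 1-D lattice where each pedestrian hops toward a target cell with a probability standing in for the calibrated free-flow velocity:

```python
import random

def step(positions, target, p_move):
    """One update of a minimal 1-D pedestrian cellular automaton:
    each agent moves one cell toward the target with probability
    p_move, provided the destination cell is free. Agents closest to
    the target move first (a simple update-order convention)."""
    occupied = set(positions)
    new_positions = []
    for x in sorted(positions, key=lambda pos: abs(pos - target)):
        nxt = x + (1 if target > x else -1 if target < x else 0)
        if nxt not in occupied and random.random() < p_move:
            occupied.discard(x)
            occupied.add(nxt)
            new_positions.append(nxt)
        else:
            new_positions.append(x)
    return new_positions

random.seed(0)
peds = [0, 1, 5]          # invented starting cells
for _ in range(10):
    peds = step(peds, target=8, p_move=0.8)
print(peds)
```

Calibration in the paper's sense would mean fitting quantities like p_move (free-flow velocity) and the source-target distribution to measured station data.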

  1. Application of a real-time, calculable limiting form of the Renyi entropy for molecular imaging of tumors.

    PubMed

    Marsh, Jon N; Wallace, Kirk D; McCarthy, John E; Wickerhauser, Mladen V; Maurizi, Brian N; Lanza, Gregory M; Wickline, Samuel A; Hughes, Michael S

    2010-08-01

    Previously, we reported new methods for ultrasound signal characterization using entropy, H_f; a generalized entropy, the Renyi entropy, I_f(r); and a limiting form of the Renyi entropy suitable for real-time calculation, I_f,∞. All of these quantities demonstrated significantly more sensitivity to subtle changes in scattering architecture than energy-based methods in certain settings. In this study, the real-time calculable limit of the Renyi entropy, I_f,∞, is applied to the imaging of angiogenic murine neovasculature in a breast cancer xenograft using a targeted contrast agent. It is shown that this approach may be used to reliably detect the accumulation of targeted nanoparticles at five minutes post-injection in this in vivo model.
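
For a discrete distribution the Renyi entropy and its r → ∞ limit take the standard forms below; note the paper's I_f,∞ is a density-based analogue computed from ultrasound signals, so this is only an illustration of the limiting behavior:

```python
import math

def renyi_entropy(p, r):
    """Renyi entropy of order r (r != 1) for a discrete distribution p:
    (1/(1-r)) * log(sum_i p_i^r)."""
    return math.log(sum(pi ** r for pi in p)) / (1.0 - r)

def renyi_limit(p):
    """r -> infinity limit (min-entropy): -log(max_i p_i)."""
    return -math.log(max(p))

p = [0.5, 0.25, 0.125, 0.125]
print(round(renyi_entropy(p, 50), 3))  # already close to the limit
print(round(renyi_limit(p), 3))        # -log(0.5) ~ 0.693
```

The appeal of the limiting form is that it depends only on the distribution's maximum, which is far cheaper to evaluate in real time than the full order-r sum.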

  2. The fragmentation of 510 MeV/nucleon iron-56 in polyethylene. II. Comparisons between data and a model

    NASA Technical Reports Server (NTRS)

    Zeitlin, C.; Heilbronn, L.; Miller, J.; Schimmerling, W.; Townsend, L. W.; Tripathi, R. K.; Wilson, J. W.

    1996-01-01

    The results of a Monte Carlo model for calculating fragment fluences and LET spectra are compared to data taken with 600 MeV/nucleon iron ions incident on an accelerator beamline configured for irradiation of biological samples, with no target and with 2, 5 and 8 cm of polyethylene. The model uses a multi-generation nuclear fragmentation code, coupled with a formulation of ionization energy loss based on the Bethe-Bloch equation. In the region where the data are reliable and the experimental acceptance is well understood, many of the features of the experimental spectra are well replicated by the model. To obtain good agreement with the experimental data, the model must allow for at least two generations of fragment production in the target.
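
The ionization energy loss component can be sketched with a simplified Bethe-Bloch formula (omitting the density-effect and shell corrections and the full T_max treatment used in complete implementations; the material parameters below are approximate assumptions):

```python
import math

K = 0.307075        # MeV mol^-1 cm^2, standard Bethe-Bloch constant
ME_C2 = 0.510999    # electron rest energy, MeV

def bethe_bloch(z, beta, z_over_a, mean_excitation_ev):
    """Simplified mass stopping power (MeV cm^2/g) for a projectile of
    charge z at velocity beta in a medium with given Z/A and mean
    excitation energy I. Illustrative only."""
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    i_mev = mean_excitation_ev * 1e-6
    log_arg = 2.0 * ME_C2 * beta ** 2 * gamma ** 2 / i_mev
    return (K * z ** 2 * z_over_a / beta ** 2
            * (math.log(log_arg) - beta ** 2))

# 56Fe (z = 26) at roughly 600 MeV/nucleon (beta ~ 0.79) in polyethylene
# (Z/A ~ 0.57, I ~ 57 eV); all parameter values are approximate.
print(round(bethe_bloch(26, 0.79, 0.5703, 57.4), 1))
```

In a fragmentation transport model, a term of this kind is evaluated for the primary and for every fragment species to build the LET spectrum.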

  3. MultiMiTar: a novel multi objective optimization based miRNA-target prediction method.

    PubMed

    Mitra, Ramkrishna; Bandyopadhyay, Sanghamitra

    2011-01-01

    Machine learning based miRNA-target prediction algorithms often fail to obtain a balanced prediction accuracy in terms of both sensitivity and specificity due to the lack of a gold standard of negative examples, miRNA-targeting site context specific relevant features and an efficient feature selection process. Moreover, all the sequence, structure and machine learning based algorithms are unable to distribute the true positive predictions preferentially at the top of the ranked list; hence the algorithms become unreliable for biologists. In addition, these algorithms fail to obtain a considerable combination of precision and recall for the target transcripts that are translationally repressed at the protein level. In this article, we introduce an efficient miRNA-target prediction system, MultiMiTar, a Support Vector Machine (SVM) based classifier integrated with a multiobjective metaheuristic based feature selection technique. The robust performance of the proposed method is mainly the result of using high quality negative examples and the selection of biologically relevant miRNA-targeting site context specific features. The features are selected by using a novel feature selection technique, AMOSA-SVM, that integrates the multi objective optimization technique Archived Multi-Objective Simulated Annealing (AMOSA) and SVM. MultiMiTar is found to achieve a much higher Matthews correlation coefficient (MCC) of 0.583 and average class-wise accuracy (ACA) of 0.8 compared to the other target prediction methods for a completely independent test data set. The MCC and ACA values of these algorithms range from -0.269 to 0.155 and 0.321 to 0.582, respectively. Moreover, it shows a more balanced result in terms of precision and sensitivity (recall) for the translationally repressed data set as compared to all the other existing methods. An important aspect is that the true positive predictions are distributed preferentially at the top of the ranked list, which makes MultiMiTar reliable for biologists. MultiMiTar is now available as an online tool at www.isical.ac.in/~bioinfo_miu/multimitar.htm. MultiMiTar software can be downloaded from www.isical.ac.in/~bioinfo_miu/multimitar-download.htm.
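
The Matthews correlation coefficient reported above is computed from the binary confusion matrix; a minimal sketch with hypothetical counts:

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient for binary classification:
    (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)),
    defined as 0 when the denominator vanishes."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# hypothetical confusion-matrix counts
print(round(mcc(tp=70, tn=80, fp=20, fn=30), 3))  # 0.503
```

Unlike raw accuracy, MCC penalizes imbalance between false positives and false negatives, which is why it is a common summary for target prediction benchmarks.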

  4. Object Extraction in Cluttered Environments via a P300-Based IFCE

    PubMed Central

    He, Huidong; Xian, Bin; Zeng, Ming; Zhou, Huihui; Niu, Linwei; Chen, Genshe

    2017-01-01

    One of the fundamental issues for robot navigation is to extract an object of interest from an image. The biggest challenges for extracting objects of interest are how to use a machine to model the objects in which a human is interested and extract them quickly and reliably under varying illumination conditions. This article develops a novel method for segmenting an object of interest in a cluttered environment by combining a P300-based brain computer interface (BCI) and an improved fuzzy color extractor (IFCE). The induced P300 potential identifies the corresponding region of interest and obtains the target of interest for the IFCE. The classification results not only represent the human mind but also deliver the associated seed pixel and fuzzy parameters to extract the specific objects in which the human is interested. Then, the IFCE is used to extract the corresponding objects. The results show that the IFCE delivers better performance than the BP network or the traditional FCE. The use of a P300-based IFCE provides a reliable solution for assisting a computer in identifying an object of interest within images taken under varying illumination intensities. PMID:28740505

  5. Assessing the Conditional Reliability of State Assessments

    ERIC Educational Resources Information Center

    May, Henry; Cole, Russell; Haimson, Josh; Perez-Johnson, Irma

    2010-01-01

    The purpose of this study is to provide empirical benchmarks of the conditional reliabilities of state tests for samples of the student population defined by ability level. Given that many educational interventions are targeted for samples of low performing students, schools, or districts, the primary goal of this research is to determine how…

  6. Applications of Human Performance Reliability Evaluation Concepts and Demonstration Guidelines

    DTIC Science & Technology

    1977-03-15

    ship stops dead in the water and the AN/SQS-26 operator recommends a new heading (000°). At T + 14 minutes, the target ship begins a hard turn to...Various Simulated Conditions 82 9 Human Reliability for Each Simulated Operator (Baseline Run) 83 10 Human and Equipment Availability under

  7. MTTE: an innovative strategy for the evaluation of targeted/exome enrichment efficiency

    PubMed Central

    Klonowska, Katarzyna; Handschuh, Luiza; Swiercz, Aleksandra; Figlerowicz, Marek; Kozlowski, Piotr

    2016-01-01

    Although currently available strategies for the preparation of exome-enriched libraries are well established, a final validation of the libraries in terms of exome enrichment efficiency prior to the sequencing step is of considerable importance. Here, we present a strategy for the evaluation of exome enrichment, the Multipoint Test for Targeted-enrichment Efficiency (MTTE), a PCR-based approach utilizing multiplex ligation-dependent probe amplification with capillary electrophoresis separation. We used MTTE for the analysis of subsequent steps of the Illumina TruSeq Exome Enrichment procedure. The calculated values of enrichment-associated parameters (i.e., relative enrichment, relative clearance, overall clearance, and fold enrichment) and the comparison of MTTE results with the actual enrichment revealed the high reliability of our assay. Additionally, the MTTE assay enabled the determination of the sequence-associated features that may confer bias in the enrichment of different targets. Importantly, MTTE is a low-cost method that can be easily adapted to the region of interest for a particular project. Thus, the MTTE strategy is attractive for post-capture validation in a variety of targeted/exome enrichment NGS projects. PMID:27572310
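
    The fold-enrichment parameter mentioned above is conventionally a ratio of on-target fractions before and after capture; a generic sketch of that standard definition (not the authors' exact formula):

```python
def on_target_fraction(on_target_reads, total_reads):
    # Fraction of all reads (or probes) that fall on the targeted regions
    return on_target_reads / total_reads

def fold_enrichment(on_before, total_before, on_after, total_after):
    # Fold enrichment: on-target fraction after capture divided by
    # the on-target fraction before capture.
    return on_target_fraction(on_after, total_after) / on_target_fraction(on_before, total_before)
```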

  9. Efficient elimination of nonstoichiometric enzyme inhibitors from HTS hit lists.

    PubMed

    Habig, Michael; Blechschmidt, Anke; Dressler, Sigmar; Hess, Barbara; Patel, Viral; Billich, Andreas; Ostermeier, Christian; Beer, David; Klumpp, Martin

    2009-07-01

    High-throughput screening often identifies not only specific, stoichiometrically binding inhibitors but also undesired compounds that nonspecifically interfere with the targeted activity by nonstoichiometrically binding, unfolding, and/or inactivating proteins. In this study, the effect of such unwanted inhibitors on several different enzyme targets was assessed based on screening results for over a million compounds. In particular, the shift in potency on variation of enzyme concentration was used as a means to identify nonstoichiometric inhibitors among the screening hits. These potency shifts depended on both compound structure and target enzyme. The approach was confirmed by statistical analysis of thousands of dose-response curves, which showed that the potency of competitive and therefore clearly stoichiometric inhibitors was not affected by increasing enzyme concentration. Light-scattering measurements of thermal protein unfolding further verified that compounds that stabilize protein structure by stoichiometric binding show the same potency irrespective of enzyme concentration. In summary, measuring inhibitor IC50 values at different enzyme concentrations is a simple, cost-effective, and reliable method to identify and eliminate compounds that inhibit a specific target enzyme via nonstoichiometric mechanisms.
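
    The core of this filter is a simple ratio test on IC50 values measured at two enzyme concentrations; a sketch in Python (the three-fold flag threshold is an illustrative assumption, not a value from the paper):

```python
def potency_shift(ic50_low_e, ic50_high_e, flag_ratio=3.0):
    # Compare apparent IC50 at low vs. high enzyme concentration.
    # Stoichiometric, competitive inhibitors keep a roughly constant IC50,
    # while nonstoichiometric inhibitors (aggregators, protein unfolders)
    # lose apparent potency as enzyme concentration rises.
    shift = ic50_high_e / ic50_low_e
    return shift, shift >= flag_ratio

def filter_hits(hits, flag_ratio=3.0):
    # Keep only hits whose potency does not shift with enzyme concentration.
    # Each hit is (name, ic50_at_low_enzyme, ic50_at_high_enzyme).
    return [name for name, lo, hi in hits if not potency_shift(lo, hi, flag_ratio)[1]]
```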

  10. Microbubble Enzyme-Linked Immunosorbent Assay for the Detection of Targeted Microbubbles in in Vitro Static Binding Assays.

    PubMed

    Wischhusen, Jennifer; Padilla, Frederic

    2017-07-01

    Targeted microbubbles (MBs) are ultrasound contrast agents that are functionalized with a ligand for ultrasound molecular imaging of endothelial markers. Novel targeted MBs are characterized in vitro by incubation in protein-coated wells, followed by binding quantification by microscopy or ultrasound imaging. Both methods provide operator-dependent results: Between 3 and 20 fields of view from a heterogeneous sample are typically selected for analysis by microscopy, and in ultrasound imaging, different acoustic settings affect signal intensities. This study proposes a new method to reproducibly quantify MB binding based on enzyme-linked immunosorbent assay (ELISA), in which bound MBs are revealed with an enzyme-linked antibody. MB-ELISA was adapted to in vitro static binding assays, incubating the MBs in inverted position or by agitation, and compared with microscopy. The specificity and sensitivity of MB-ELISA enable the reliable quantification of MB binding in a rapid, high-throughput and whole-well analysis, facilitating the characterization of new targeted contrast agents. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  11. Social Cognition as Reinforcement Learning: Feedback Modulates Emotion Inference.

    PubMed

    Zaki, Jamil; Kallman, Seth; Wimmer, G Elliott; Ochsner, Kevin; Shohamy, Daphna

    2016-09-01

    Neuroscientific studies of social cognition typically employ paradigms in which perceivers draw single-shot inferences about the internal states of strangers. Real-world social inference involves quite different parameters: People often encounter and learn about particular social targets (e.g., friends) over time and receive feedback about whether their inferences are correct or incorrect. Here, we examined this process and, more broadly, the intersection between social cognition and reinforcement learning. Perceivers were scanned using fMRI while repeatedly encountering three social targets who produced conflicting visual and verbal emotional cues. Perceivers guessed how targets felt and received feedback about whether they had guessed correctly. Visual cues reliably predicted one target's emotion, verbal cues predicted a second target's emotion, and neither reliably predicted the third target's emotion. Perceivers successfully used this information to update their judgments over time. Furthermore, trial-by-trial learning signals, estimated using two reinforcement learning models, tracked activity in ventral striatum and ventromedial pFC, structures associated with reinforcement learning, and regions associated with updating social impressions, including TPJ. These data suggest that learning about others' emotions, like other forms of feedback learning, relies on domain-general reinforcement mechanisms as well as domain-specific social information processing.
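
    The trial-by-trial learning signals referred to above come from delta-rule models; a minimal sketch of such an update (the learning rate and initial expectation are illustrative assumptions, not parameters from the study):

```python
def delta_rule(feedback, alpha=0.3, v0=0.5):
    # Expectation moves toward each feedback outcome by a fraction
    # (alpha, the learning rate) of the prediction error; the per-trial
    # prediction errors are the model's learning signals.
    v, prediction_errors = v0, []
    for r in feedback:
        pe = r - v                  # trial-by-trial learning signal
        prediction_errors.append(pe)
        v += alpha * pe
    return v, prediction_errors
```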

  12. Development of a machine vision system for automated structural assembly

    NASA Technical Reports Server (NTRS)

    Sydow, P. Daniel; Cooper, Eric G.

    1992-01-01

    Research is being conducted at the LaRC to develop a telerobotic assembly system designed to construct large space truss structures. This research program was initiated within the past several years, and a ground-based test-bed was developed to evaluate and expand the state of the art. Test-bed operations currently use predetermined ('taught') points for truss structural assembly. Total dependence on the use of taught points for joint receptacle capture and strut installation is neither robust nor reliable enough for space operations. Therefore, a machine vision sensor guidance system is being developed to locate and guide the robot to a passive target mounted on the truss joint receptacle. The vision system hardware includes a miniature video camera, passive targets mounted on the joint receptacles, target illumination hardware, and an image processing system. Discrimination of the target from background clutter is accomplished through standard digital processing techniques. Once the target is identified, a pose estimation algorithm is invoked to determine the location, in three-dimensional space, of the target relative to the robot's end-effector. Preliminary test results of the vision system in the Automated Structural Assembly Laboratory with a range of lighting and background conditions indicate that it is fully capable of successfully identifying joint receptacle targets throughout the required operational range. Controlled optical bench test results indicate that the system can also provide the pose estimation accuracy needed to define the target position.

  13. Automated batch fiducial-less tilt-series alignment in Appion using Protomo

    PubMed Central

    Noble, Alex J.; Stagg, Scott M.

    2015-01-01

    The field of electron tomography has benefited greatly from manual and semi-automated approaches to marker-based tilt-series alignment that have allowed for the structural determination of multitudes of in situ cellular structures as well as macromolecular structures of individual protein complexes. The emergence of complementary metal-oxide semiconductor detectors capable of detecting individual electrons has enabled the collection of low dose, high contrast images, opening the door for reliable correlation-based tilt-series alignment. Here we present a set of automated, correlation-based tilt-series alignment, contrast transfer function (CTF) correction, and reconstruction workflows for use in conjunction with the Appion/Leginon package that are primarily targeted at automating structure determination with cryogenic electron microscopy. PMID:26455557

  14. Development and Evaluation of a Parallel Reaction Monitoring Strategy for Large-Scale Targeted Metabolomics Quantification.

    PubMed

    Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin

    2016-04-19

    Recent advances in mass spectrometers which have yielded higher resolution and faster scanning speeds have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled, parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performances of PRM and MS1-based assays in Q-Exactive were compared, and the MRM assay in QTRAP 6500 was also compared. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification and also showed greater flexibility in postacquisition assay refinement than the MRM assay in QTRAP 6500. We present a workflow for convenient PRM data processing using Skyline software which is free of charge. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for evaluation of large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics study.

  15. A neural-network-based approach to the double traveling salesman problem.

    PubMed

    Plebe, Alessio; Anile, Angelo Marcello

    2002-02-01

    The double traveling salesman problem is a variation of the basic traveling salesman problem where targets can be reached by two salespersons operating in parallel. The real problem addressed by this work concerns the optimization of the harvest sequence for the two independent arms of a fruit-harvesting robot. This application poses further constraints, like a collision-avoidance function. The proposed solution is based on a self-organizing map structure, initialized with as many artificial neurons as the number of targets to be reached. One of the key components of the process is the combination of competitive relaxation with a mechanism for deleting and creating artificial neurons. Moreover, in the competitive relaxation process, information about the trajectory connecting the neurons is combined with the distance of neurons from the target. This strategy prevents tangles in the trajectory and collisions between the two tours. Results of tests indicate that the proposed approach is efficient and reliable for harvest sequence planning. Moreover, the enhancements added to the pure self-organizing map concept are of wider importance, as proved by a traveling salesman problem version of the program, simplified from the double version for comparison.
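
    As a hedged, simplified sketch of the competitive-relaxation idea described above (single salesman, no neuron creation/deletion or collision handling, all parameters illustrative), a self-organizing ring for a tour can be written as:

```python
import math
import random

def som_tour(cities, n_iter=3000, lr=0.8, seed=0):
    # Self-organizing ring: one neuron per target; on each iteration the
    # neuron closest to a randomly chosen city wins (competitive step) and
    # is relaxed toward it together with its ring neighbors.
    rng = random.Random(seed)
    n = len(cities)
    neurons = [list(c) for c in cities]
    rng.shuffle(neurons)
    for t in range(n_iter):
        cx, cy = cities[rng.randrange(n)]
        w = min(range(n), key=lambda i: (neurons[i][0] - cx) ** 2 + (neurons[i][1] - cy) ** 2)
        decay = math.exp(-5.0 * t / n_iter)     # shrink learning over time
        radius = max(1.0, n / 4 * decay)
        for d in range(-int(radius), int(radius) + 1):
            i = (w + d) % n                     # ring (closed-tour) topology
            g = math.exp(-(d * d) / (2 * radius * radius))
            neurons[i][0] += lr * decay * g * (cx - neurons[i][0])
            neurons[i][1] += lr * decay * g * (cy - neurons[i][1])
    # read the tour off the ring: order cities by their winning neuron index
    def winner(c):
        return min(range(n), key=lambda i: (neurons[i][0] - c[0]) ** 2 + (neurons[i][1] - c[1]) ** 2)
    return sorted(range(n), key=lambda k: winner(cities[k]))
```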

  16. Transposed-letter priming of prelexical orthographic representations.

    PubMed

    Kinoshita, Sachiko; Norris, Dennis

    2009-01-01

    A prime generated by transposing two internal letters (e.g., jugde) produces strong priming of the original word (judge). In lexical decision, this transposed-letter (TL) priming effect is generally weak or absent for nonword targets; thus, it is unclear whether the origin of this effect is lexical or prelexical. The authors describe the Bayesian Reader theory of masked priming (D. Norris & S. Kinoshita, 2008), which explains why nonwords do not show priming in lexical decision but why they do in the cross-case same-different task. This analysis is followed by 3 experiments that show that priming in this task is not based on low-level perceptual similarity between the prime and target, or on phonology, to make the case that priming is based on prelexical orthographic representation. The authors then use this task to demonstrate equivalent TL priming effects for nonwords and words. The results are interpreted as the first reliable evidence based on the masked priming procedure that letter position is not coded absolutely within the prelexical, orthographic representation. The implications of the results for current letter position coding schemes are discussed.
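
    For concreteness, a transposed-letter prime of the kind used in these experiments can be generated by swapping two adjacent internal letters (a trivial helper for illustration, not from the authors' materials):

```python
def tl_prime(word, i=None):
    # Swap two adjacent internal letters; the default index picks the
    # middle pair, reproducing the classic judge -> jugde example.
    if i is None:
        i = (len(word) - 1) // 2
    chars = list(word)
    chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)
```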

  17. A multi-model approach to nucleic acid-based drug development.

    PubMed

    Gautherot, Isabelle; Sodoyer, Régis

    2004-01-01

    With the advent of functional genomics and the shift of interest towards sequence-based therapeutics, the past decades have witnessed intense research efforts on nucleic acid-mediated gene regulation technologies. Today, RNA interference is emerging as a groundbreaking discovery, holding promise for development of genetic modulators of unprecedented potency. Twenty-five years after the discovery of antisense RNA and ribozymes, gene control therapeutics are still facing developmental difficulties, with only one US FDA-approved antisense drug currently available in the clinic. Limited predictability of target site selection models is recognized as one major stumbling block that is shared by all of the so-called complementary technologies, slowing the progress towards a commercial product. Currently employed in vitro systems for target site selection include RNAse H-based mapping, antisense oligonucleotide microarrays, and functional screening approaches using libraries of catalysts with randomized target-binding arms to identify optimal ribozyme/DNAzyme cleavage sites. Individually, each strategy has its drawbacks from a drug development perspective. Utilization of message-modulating sequences as therapeutic agents requires that their action on a given target transcript meets criteria of potency and selectivity in the natural physiological environment. In addition to sequence-dependent characteristics, other factors will influence annealing reactions and duplex stability, as well as nucleic acid-mediated catalysis. Parallel consideration of physiological selection systems thus appears essential for screening for nucleic acid compounds proposed for therapeutic applications. Cellular message-targeting studies face issues relating to efficient nucleic acid delivery and appropriate analysis of response. 
For reliability and simplicity, prokaryotic systems can provide a rapid and cost-effective means of studying message targeting under pseudo-cellular conditions, but such approaches also have limitations. To streamline nucleic acid drug discovery, we propose a multi-model strategy integrating high-throughput-adapted bacterial screening, followed by reporter-based and/or natural cellular models and potentially also in vitro assays for characterization of the most promising candidate sequences, before final in vivo testing.

  18. Exome sequencing of a multigenerational human pedigree.

    PubMed

    Hedges, Dale J; Burges, Dan; Powell, Eric; Almonte, Cherylyn; Huang, Jia; Young, Stuart; Boese, Benjamin; Schmidt, Mike; Pericak-Vance, Margaret A; Martin, Eden; Zhang, Xinmin; Harkins, Timothy T; Züchner, Stephan

    2009-12-14

    Over the next few years, the efficient use of next-generation sequencing (NGS) in human genetics research will depend heavily upon effective mechanisms for the selective enrichment of genomic regions of interest. Recently, comprehensive exome capture arrays have become available for targeting approximately 33 Mb or approximately 180,000 coding exons across the human genome. Selective genomic enrichment of the human exome offers an attractive option for new experimental designs aiming to quickly identify potential disease-associated genetic variants, especially in family-based studies. We have evaluated a 2.1 M feature human exome capture array on eight individuals from a three-generation family pedigree. We were able to cover up to 98% of the targeted bases at a long-read sequence read depth of ≥3, 86% at a read depth of ≥10, and over 50% of all targets were covered with ≥20 reads. We identified up to 14,284 SNPs and small indels per individual exome, with up to 1,679 of these representing putative novel polymorphisms. Applying the conservative genotype calling approach HCDiff, the average rate of detection of a variant allele based on Illumina 1 M BeadChip genotypes was 95.2% at ≥10x sequence depth. Further, we propose an advantageous genotype calling strategy for low-covered targets that empirically determines cut-off thresholds at a given coverage depth based on existing genotype data. Application of this method was able to detect >99% of SNPs covered ≥8x. Our results offer guidance for "real-world" applications in human genetics and provide further evidence that microarray-based exome capture is an efficient and reliable method to enrich for chromosomal regions of interest in next-generation sequencing experiments.
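
    The coverage summaries quoted above reduce to fractions of targeted bases at or above a read-depth cutoff; a minimal sketch:

```python
def coverage_fractions(depths, cutoffs=(3, 10, 20)):
    # Fraction of targeted bases covered at or above each read-depth cutoff,
    # mirroring the per-threshold coverage figures reported above.
    n = len(depths)
    return {c: sum(d >= c for d in depths) / n for c in cutoffs}
```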

  19. Intraoperative aberrometry-based aphakia refraction in patients with cataract: status and options.

    PubMed

    Huelle, Jan O; Druchkiv, Vasyl; Habib, Nabil E; Richard, Gisbert; Katz, Toam; Linke, Stephan J

    2017-02-01

    To explore the application of intraoperative wavefront aberrometry (IWA) for aphakia-based biometry using three existing formulae derived from autorefractive retinoscopy and introducing new improved formulae. In 74 patients undergoing cataract surgery, three repeated measurements of aphakic spherical equivalent (SE) were taken. All measurements were objectively graded for their quality and evaluated with the 'limits of agreement' approach. ORs were calculated and analysis of variance was applied. The intraocular lens (IOL) power that would have given the target refraction was back-calculated from manifest refraction at 3 months postoperatively. Regression analysis was performed to generate two aphakic SE-based formulae for predicting this IOL. The accuracy of the formulae was determined by comparing them to conventional biometry and published aphakia formulae. In 32 eyes, three consecutive aphakic measurements were successful. Objective parameters of IWA map quality significantly impacted measurement variability (p<0.05). The limits of agreement of repeated aphakic SE readings were +0.66 dioptre (D) and -0.69 D. Intraoperative biometry by our formula resulted in 25% and 53% of all cases ±0.50D and ±1.00 D within SE target, respectively. A second formula that took axial length (AL) into account resulted in improved ratios of 41% and 70%, respectively. A reliable application of IWA to calculate IOL power during routine cataract surgery may not be feasible given the high rate of measurement failures and the large variations of the readings. To enable reliable IOL calculation from IWA, measurement precision must be improved and aphakic IOL formulae need to be fine-tuned. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
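
    The abstract does not state the regression formulae themselves; purely to illustrate how an aphakic-SE-based IOL formula is derived, a one-predictor least-squares fit looks like this (the data and recovered coefficients are hypothetical):

```python
def fit_linear(xs, ys):
    # Ordinary least squares for y = a + b*x, e.g. predicting the
    # back-calculated IOL power (y) from the aphakic spherical
    # equivalent (x); the paper's second formula adds axial length.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b
```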

  20. Pioneering topological methods for network-based drug-target prediction by exploiting a brain-network self-organization theory.

    PubMed

    Durán, Claudio; Daminelli, Simone; Thomas, Josephine M; Haupt, V Joachim; Schroeder, Michael; Cannistraci, Carlo Vittorio

    2017-04-26

    The bipartite network representation of the drug-target interactions (DTIs) in a biosystem enhances understanding of the drugs' multifaceted action modes, suggests therapeutic switching for approved drugs and unveils possible side effects. As experimental testing of DTIs is costly and time-consuming, computational predictors are of great aid. Here, for the first time, state-of-the-art DTI supervised predictors custom-made in network biology were compared, using standard and innovative validation frameworks, with unsupervised purely topological models designed for general-purpose link prediction in bipartite networks. Surprisingly, our results show that the bipartite topology alone, if adequately exploited by means of the recently proposed local-community-paradigm (LCP) theory (initially detected in brain-network topological self-organization and afterwards generalized to any complex network), is able to suggest highly reliable predictions, with performance comparable to the state-of-the-art supervised methods that exploit additional (non-topological, for instance biochemical) DTI knowledge. Furthermore, a detailed analysis of the novel predictions revealed that each class of methods prioritizes distinct true interactions; hence, combining methodologies based on diverse principles represents a promising strategy to improve drug-target discovery. To conclude, this study promotes the power of bio-inspired computing, demonstrating that simple unsupervised rules inspired by principles of topological self-organization and adaptiveness arising during learning in living intelligent systems (like the brain) can efficiently equal the performance of complicated algorithms based on advanced, supervised and knowledge-based engineering. © The Author 2017. Published by Oxford University Press.
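
    One common formulation of the local-community-paradigm idea is a CAR-style score, shown here in a simplified monopartite form (the paper itself works on bipartite DTI networks, where the community is built differently): a candidate link is rewarded when its common neighbors are themselves interlinked.

```python
def car_style_score(adj, u, v):
    # Score the candidate link (u, v): each common neighbor counts once,
    # plus half an edge for every link it has inside the common-neighbor
    # set itself (the "local community links" of LCP theory).
    cn = adj[u] & adj[v]
    return sum(1 + len(adj[z] & cn) / 2 for z in cn)
```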

  1. Behavioral Assessment of Hearing in 2 to 4 Year-old Children: A Two-interval, Observer-based Procedure Using Conditioned Play-based Responses.

    PubMed

    Bonino, Angela Yarnell; Leibold, Lori J

    2017-01-23

    Collecting reliable behavioral data from toddlers and preschoolers is challenging. As a result, there are significant gaps in our understanding of human auditory development for these age groups. This paper describes an observer-based procedure for measuring hearing sensitivity with a two-interval, two-alternative forced-choice paradigm. Young children are trained to perform a play-based, motor response (e.g., putting a block in a bucket) whenever they hear a target signal. An experimenter observes the child's behavior and makes a judgment about whether the signal was presented during the first or second observation interval; the experimenter is blinded to the true signal interval, so this judgment is based solely on the child's behavior. These procedures were used to test 2 to 4 year-olds (n = 33) with no known hearing problems. The signal was a 1,000 Hz warble tone presented in quiet, and the signal level was adjusted to estimate a threshold corresponding to 71%-correct detection. A valid threshold was obtained for 82% of children. These results indicate that the two-interval procedure is both feasible and reliable for use with toddlers and preschoolers. The two-interval, observer-based procedure described in this paper is a powerful tool for evaluating hearing in young children because it guards against response bias on the part of the experimenter.
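
    The 71%-correct point tracked here is close to the convergence point (70.7%) of a two-down, one-up adaptive rule; a sketch of such a staircase (step size, trial count, and threshold estimator are illustrative choices, not the authors' protocol):

```python
def staircase_2down_1up(start_level, step, n_trials, respond):
    # Two-down/one-up rule: the signal level drops after two consecutive
    # correct interval judgments and rises after each error, converging
    # on the ~70.7%-correct point of the psychometric function.
    level, correct_run, last_dir = start_level, 0, 0
    reversals = []
    for _ in range(n_trials):
        if respond(level):          # correct judgment of the signal interval
            correct_run += 1
            if correct_run == 2:    # two in a row -> make the task harder
                correct_run = 0
                if last_dir == +1:
                    reversals.append(level)
                level -= step
                last_dir = -1
        else:                       # error -> make the task easier
            correct_run = 0
            if last_dir == -1:
                reversals.append(level)
            level += step
            last_dir = +1
    # threshold estimate: mean level at the reversal points
    return sum(reversals) / len(reversals) if reversals else level
```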

  2. Gauging the gaps in student problem-solving skills: assessment of individual and group use of problem-solving strategies using online discussions.

    PubMed

    Anderson, William L; Mitchell, Steven M; Osgood, Marcy P

    2008-01-01

    For the past 3 yr, faculty at the University of New Mexico, Department of Biochemistry and Molecular Biology have been using interactive online Problem-Based Learning (PBL) case discussions in our large-enrollment classes. We have developed an illustrative tracking method to monitor student use of problem-solving strategies to provide targeted help to groups and to individual students. This method of assessing performance has a high interrater reliability, and senior students, with training, can serve as reliable graders. We have been able to measure improvements in many students' problem-solving strategies, but, not unexpectedly, there is a population of students who consistently apply the same failing strategy when there is no faculty intervention. This new methodology provides an effective tool to direct faculty to constructively intercede in this area of student development.

  3. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing

    PubMed Central

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-01-01

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system. PMID:29462855

  4. A combination of spin diffusion methods for the determination of protein-ligand complex structural ensembles.

    PubMed

    Pilger, Jens; Mazur, Adam; Monecke, Peter; Schreuder, Herman; Elshorst, Bettina; Bartoschek, Stefan; Langer, Thomas; Schiffer, Alexander; Krimm, Isabelle; Wegstroth, Melanie; Lee, Donghan; Hessler, Gerhard; Wendt, K-Ulrich; Becker, Stefan; Griesinger, Christian

    2015-05-26

    Structure-based drug design (SBDD) is a powerful and widely used approach to optimize affinity of drug candidates. With the recently introduced INPHARMA method, the binding mode of small molecules to their protein target can be characterized even if no spectroscopic information about the protein is known. Here, we show that the combination of the spin-diffusion-based NMR methods INPHARMA, trNOE, and STD results in an accurate scoring function for docking modes and therefore determination of protein-ligand complex structures. Applications are shown on the model system protein kinase A and the drug targets glycogen phosphorylase and soluble epoxide hydrolase (sEH). Multiplexing of several ligands improves the reliability of the scoring function further. The new score allows in the case of sEH detecting two binding modes of the ligand in its binding site, which was corroborated by X-ray analysis. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Thermo-compressive transfer printing for facile alignment and robust device integration of nanowires.

    PubMed

    Lee, Won Seok; Won, Sejeong; Park, Jeunghee; Lee, Jihye; Park, Inkyu

    2012-06-07

    Controlled alignment and mechanically robust bonding between nanowires (NWs) and electrodes are essential requirements for reliable operation of functional NW-based electronic devices. In this work, we developed a novel process for the alignment and bonding between NWs and metal electrodes by using thermo-compressive transfer printing. In this process, bottom-up synthesized NWs were aligned in parallel by shear loading onto the intermediate substrate and then finally transferred onto the target substrate with low melting temperature metal electrodes. In particular, multi-layer (e.g. Cr/Au/In/Au and Cr/Cu/In/Au) metal electrodes are softened at low temperatures (below 100 °C) and facilitate submergence of aligned NWs into the surface of electrodes at a moderate pressure (∼5 bar). By using this thermo-compressive transfer printing process, robust electrical and mechanical contact between NWs and metal electrodes can be realized. This method is believed to be very useful for the large-area fabrication of NW-based electrical devices with improved mechanical robustness, electrical contact resistance, and reliability.

  7. Strategies to improve electrode positioning and safety in cochlear implants.

    PubMed

    Rebscher, S J; Heilmann, M; Bruszewski, W; Talbot, N H; Snyder, R L; Merzenich, M M

    1999-03-01

    An injection-molded internal supporting rib has been produced to control the flexibility of silicone rubber encapsulated electrodes designed to electrically stimulate the auditory nerve in human subjects with severe to profound hearing loss. The rib molding dies, and molds for silicone rubber encapsulation of the electrode, were designed and machined using AutoCad and MasterCam software packages in a PC environment. After molding, the prototype plastic ribs were iteratively modified based on observations of the performance of the rib/silicone composite insert in a clear plastic model of the human scala tympani cavity. The rib-based electrodes were reliably inserted farther into these models, required less insertion force and were positioned closer to the target auditory neural elements than currently available cochlear implant electrodes. With further design improvements the injection-molded rib may also function to accurately support metal stimulating contacts and wire leads during assembly to significantly increase the manufacturing efficiency of these devices. This method to reliably control the mechanical properties of miniature implantable devices with multiple electrical leads may be valuable in other areas of biomedical device design.

  8. A vacuum-sealed compact x-ray tube based on focused carbon nanotube field-emission electrons

    NASA Astrophysics Data System (ADS)

    Jeong, Jin-Woo; Kim, Jae-Woo; Kang, Jun-Tae; Choi, Sungyoul; Ahn, Seungjoon; Song, Yoon-Ho

    2013-03-01

    We report on a fully vacuum-sealed compact x-ray tube based on focused carbon nanotube (CNT) field-emission electrons for various radiography applications. A specially designed two-step brazing process enabled us to achieve a vacuum level good enough for stable and reliable operation of the x-ray tube without any active vacuum pump. The focusing electrodes integrated in the field-emission electron gun effectively focused electron beams from the CNT emitters onto the anode target, giving a small focal spot of around 0.3 mm at a large current of above 50 mA. Active current control through the cathode electrode of the x-ray tube enabled fast digital modulation of the x-ray dose with a low control voltage of below 5 V. The fabricated compact x-ray tube showed stable and reliable operation, maintaining a vacuum level below 5 × 10⁻⁶ Torr and demonstrating the feasibility of field-emission x-ray tubes as stand-alone devices without an active pumping system.

  9. Self-Adaptive Strategy Based on Fuzzy Control Systems for Improving Performance in Wireless Sensors Networks.

    PubMed

    Hernández Díaz, Vicente; Martínez, José-Fernán; Lucas Martínez, Néstor; del Toro, Raúl M

    2015-09-18

    Coping with the new challenges that societies face nowadays involves providing smarter daily systems. To achieve this, technology has to evolve and leverage automatic interactions between physical systems, with less human intervention. Technological paradigms like the Internet of Things (IoT) and Cyber-Physical Systems (CPS) provide reference models, architectures, approaches and tools intended to support cross-domain solutions. Thus, CPS-based solutions will be applied in different application domains, like e-Health, Smart Grid, Smart Transportation and so on, to assure the expected response from a complex system that relies on the smooth interaction and cooperation of diverse networked physical systems. Wireless Sensor Networks (WSN) are a well-known wireless technology that forms part of larger CPS. A WSN monitors a physical system or object (e.g., the environmental conditions of a cargo container) and relays data to the targeted processing element. Communication reliability, as well as restrained energy consumption, are expected features of a WSN. This paper shows the results obtained in a real WSN deployment, based on SunSPOT nodes, which carries out a fuzzy-control strategy to improve energy consumption while keeping communication reliability and computational resource usage within boundaries.
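The abstract does not give the deployed rule base. As a hedged illustration of the general technique only, the sketch below implements a tiny Mamdani-style fuzzy controller with invented membership functions, rules, and output values: it stretches a node's sampling period when battery level or link quality is low (saving energy and retransmissions) and shortens it when both are high:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def sampling_period(battery, link_quality):
    """Illustrative rule base (not the paper's), inputs in [0, 1]:
       low battery OR poor link  -> long period (save energy/retries),
       high battery AND good link -> short period (fresh data)."""
    low_b  = tri(battery, -0.01, 0.0, 0.6)
    high_b = tri(battery,  0.4, 1.0, 1.01)
    poor_l = tri(link_quality, -0.01, 0.0, 0.6)
    good_l = tri(link_quality,  0.4, 1.0, 1.01)
    w_long  = max(low_b, poor_l)          # fuzzy OR
    w_short = min(high_b, good_l)         # fuzzy AND
    LONG, SHORT = 60.0, 5.0               # singleton outputs, seconds
    if w_long + w_short == 0.0:
        return (LONG + SHORT) / 2.0
    # Weighted-average defuzzification.
    return (w_long * LONG + w_short * SHORT) / (w_long + w_short)
```

A real deployment would add more inputs (queue length, duty cycle) and rules, but the activate-rules-then-defuzzify structure is the same.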

  10. Self-Adaptive Strategy Based on Fuzzy Control Systems for Improving Performance in Wireless Sensors Networks

    PubMed Central

    Hernández Díaz, Vicente; Martínez, José-Fernán; Lucas Martínez, Néstor; del Toro, Raúl M.

    2015-01-01

    Coping with the new challenges that societies face nowadays involves providing smarter daily systems. To achieve this, technology has to evolve and leverage automatic interactions between physical systems, with less human intervention. Technological paradigms like the Internet of Things (IoT) and Cyber-Physical Systems (CPS) provide reference models, architectures, approaches and tools intended to support cross-domain solutions. Thus, CPS-based solutions will be applied in different application domains, like e-Health, Smart Grid, Smart Transportation and so on, to assure the expected response from a complex system that relies on the smooth interaction and cooperation of diverse networked physical systems. Wireless Sensor Networks (WSN) are a well-known wireless technology that forms part of larger CPS. A WSN monitors a physical system or object (e.g., the environmental conditions of a cargo container) and relays data to the targeted processing element. Communication reliability, as well as restrained energy consumption, are expected features of a WSN. This paper shows the results obtained in a real WSN deployment, based on SunSPOT nodes, which carries out a fuzzy-control strategy to improve energy consumption while keeping communication reliability and computational resource usage within boundaries. PMID:26393612

  11. Effects of imperfect automation on decision making in a simulated command and control task.

    PubMed

    Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja

    2007-02-01

    Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Under the higher automation reliability condition, unreliable automation imposed a greater cost on decision-making accuracy for the three forms of decision automation than for information automation. At low automation reliability, however, there was a performance cost for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.

  12. A hyperspectral image optimizing method based on sub-pixel MTF analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yun; Li, Kai; Wang, Jinqiang; Zhu, Yajie

    2015-04-01

    Hyperspectral imaging collects tens or hundreds of images continuously divided across the electromagnetic spectrum so that details at different wavelengths can be represented. A popular hyperspectral imaging method uses a tunable optical band-pass filter placed in front of the focal plane to acquire images at different wavelengths. In order to alleviate the influence of chromatic aberration in some segments of a hyperspectral series, this paper presents a hyperspectral optimizing method that uses the sub-pixel MTF to evaluate image blurring. The method extracts the edge feature in the target window by means of the line spread function (LSF) to calculate a reliable position for the edge; the evaluation grid in each line is then interpolated from the real pixel values based on their positions relative to the optimal edge, and the sub-pixel MTF is used to analyze the image in the frequency domain, increasing the dimension of the MTF calculation. The sub-pixel MTF evaluation is reliable, since no image rotation or pixel-value estimation is needed and no artificial information is introduced. Theoretical analysis shows that the proposed method is reliable and efficient when evaluating common real-scene images with edges of small tilt angle. It also provides a direction for subsequent hyperspectral image-blurring evaluation and real-time focal-plane adjustment in related imaging systems.
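The abstract's chain of edge feature → LSF → frequency-domain MTF can be sketched in its simplest one-dimensional form. The toy below (Python; the window choice and synthetic edges are assumptions, and it omits the paper's sub-pixel grid interpolation) differentiates an edge-spread function into an LSF and takes the normalised FFT magnitude as the MTF, so a blurred edge shows faster roll-off than a sharp one:

```python
import numpy as np

def mtf_from_edge(esf):
    """MTF from a 1-D edge-spread function: differentiate to get the
    line-spread function, window it, then normalise the FFT magnitude."""
    lsf = np.diff(esf)
    lsf = lsf * np.hanning(lsf.size)      # suppress spectral leakage
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                   # unity at zero frequency

# Synthetic edges: an ideal step versus a blurred (tanh) step.
x = np.linspace(-1.0, 1.0, 201)
sharp   = (x > 0).astype(float)
blurred = 0.5 * (1.0 + np.tanh(x / 0.1))
m_sharp = mtf_from_edge(sharp)
m_blur  = mtf_from_edge(blurred)
```

For the ideal step the LSF is an impulse, so its MTF stays near 1 at all frequencies, while the blurred edge's MTF falls off; comparing the two curves is the essence of edge-based blur evaluation.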

  13. Research on photodiode detector-based spatial transient light detection and processing system

    NASA Astrophysics Data System (ADS)

    Liu, Meiying; Wang, Hu; Liu, Yang; Zhao, Hui; Nan, Meng

    2016-10-01

    In order to realize real-time identification and processing of spatial transient light signals, the features and energy of the captured target light signal are first described and quantitatively calculated. Considering that a transient light signal occurs randomly, has a short duration, and has an evident beginning and end, a photodiode-detector-based spatial transient light detection and processing system is proposed and designed in this paper. The system has a large field of view and realizes non-imaging energy detection of random, transient, and weak point targets against the complex background of the space environment. Extracting a weak signal from a strong background is difficult; considering that the background signal changes slowly while the target signal changes quickly, a filter is adopted for background subtraction. Variable-speed sampling is realized by sampling data points at gradually increasing intervals, which resolves two difficulties: the real-time processing of a large amount of data, and the power consumption required to store it. Test results with a self-made simulated signal demonstrate the effectiveness of the design scheme, and the practical system operates reliably. Detection and processing of the target signal under a strong-sunlight background was realized. The results indicate that the system can detect the characteristic waveform of the target signal in real time and monitor the system's working parameters. The prototype design could be used in a variety of engineering applications.
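The two ideas in the abstract — subtracting a slowly varying background to expose a fast transient, and thinning stored samples at a gradually increasing interval — can both be sketched briefly. The window length, threshold factor, and growth rate below are invented illustration values, not the system's parameters:

```python
import numpy as np

def detect_transients(signal, win=51, k=5.0):
    """Flag samples rising sharply above a slowly varying background.
    Background = edge-padded moving average over `win` samples; a sample
    is a candidate transient if it exceeds background by k * noise sigma."""
    half = win // 2
    padded = np.pad(signal, half, mode='edge')
    background = np.convolve(padded, np.ones(win) / win, mode='valid')
    residual = signal - background
    sigma = np.median(np.abs(residual)) / 0.6745 + 1e-12  # robust noise est.
    return np.nonzero(residual > k * sigma)[0]

def thinning_indices(n, base=1, growth=1.5):
    """Indices sampled with a gradually increasing interval, so data is
    stored ever more sparsely (cuts storage volume and hence power)."""
    idx, step, i = [], float(base), 0
    while i < n:
        idx.append(i)
        i += int(round(step))
        step *= growth
    return idx

# Slowly rippling background with one strong spike at index 250.
sig = 1.0 + 0.01 * np.sin(np.arange(500.0))
sig[250] += 1.0
hits = detect_transients(sig)
```

The edge padding avoids spurious detections at the record boundaries that a plain `mode='same'` convolution would produce.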

  14. Listeners Experience Linguistic Masking Release in Noise-Vocoded Speech-in-Speech Recognition.

    PubMed

    Viswanathan, Navin; Kokkinakis, Kostas; Williams, Brittany T

    2018-02-15

    The purpose of this study was to evaluate whether listeners with normal hearing perceiving noise-vocoded speech-in-speech demonstrate better intelligibility of target speech when the background speech was mismatched in language (linguistic release from masking [LRM]) and/or location (spatial release from masking [SRM]) relative to the target. We also assessed whether the spectral resolution of the noise-vocoded stimuli affected the presence of LRM and SRM under these conditions. In Experiment 1, a mixed factorial design was used to simultaneously manipulate the masker language (within-subject, English vs. Dutch), the simulated masker location (within-subject, right, center, left), and the spectral resolution (between-subjects, 6 vs. 12 channels) of noise-vocoded target-masker combinations presented at +25 dB signal-to-noise ratio (SNR). In Experiment 2, the study was repeated using a spectral resolution of 12 channels at +15 dB SNR. In both experiments, listeners' intelligibility of noise-vocoded targets was better when the background masker was Dutch, demonstrating reliable LRM in all conditions. The pattern of results in Experiment 1 was not reliably different across the 6- and 12-channel noise-vocoded speech. Finally, a reliable spatial benefit (SRM) was detected only in the more challenging SNR condition (Experiment 2). The current study is the first to report a clear LRM benefit in noise-vocoded speech-in-speech recognition. Our results indicate that this benefit is available even under spectrally degraded conditions and that it may augment the benefit due to spatial separation of target speech and competing backgrounds.
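Noise vocoding itself is a standard signal-processing manipulation: split speech into frequency bands, extract each band's slow amplitude envelope, and use it to modulate band-limited noise, discarding fine spectral structure. The sketch below is a minimal FFT-based version (the band edges, envelope cutoff, and zero-phase FFT masking are simplifying assumptions; research stimuli are normally built with proper filter banks):

```python
import numpy as np

def bandpass_fft(x, fs, f1, f2):
    """Zero-phase band-pass via FFT masking (illustrative, not production)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1.0 / fs)
    X[(f < f1) | (f > f2)] = 0.0
    return np.fft.irfft(X, n=x.size)

def noise_vocode(speech, fs, n_channels=6, lo=100.0, hi=7000.0):
    """Minimal noise vocoder: log-spaced bands, 30 Hz envelopes,
    envelope-modulated band-limited noise summed across channels."""
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(speech.size)
    edges = np.geomspace(lo, hi, n_channels + 1)
    out = np.zeros_like(speech)
    for f1, f2 in zip(edges[:-1], edges[1:]):
        band = bandpass_fft(speech, fs, f1, f2)
        env = bandpass_fft(np.abs(band), fs, 0.0, 30.0)   # slow envelope
        out += np.clip(env, 0.0, None) * bandpass_fft(noise, fs, f1, f2)
    return out

# Demo: vocode 0.2 s of a 1 kHz tone at 16 kHz sampling.
fs = 16000
t = np.arange(int(0.2 * fs)) / fs
tone = np.sin(2 * np.pi * 1000.0 * t)
vocoded = noise_vocode(tone, fs)
```

The `n_channels` parameter is exactly the spectral-resolution manipulation the study varies (6 vs. 12 channels): fewer channels preserve less spectral detail.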

  15. Figure-ground asymmetries in the Implicit Association Test (IAT).

    PubMed

    Rothermund, K; Wentura, D

    2001-01-01

    Based on the assumption that binary classification tasks are often processed asymmetrically (figure-ground asymmetries), two experiments showed that association alone cannot account for effects observed in the Implicit Association Test (IAT). Experiment 1 (N = 16) replicated a standard version of the IAT effect using old vs. young names as target categories and good and bad words as attribute categories. However, reliable compatibility effects were also found for a modified version of the task in which neutral words vs. nonwords instead of good vs. bad words were used as attribute categories. In Experiment 2 (N = 8), a reversed IAT effect was observed after the figure-ground asymmetry in the target dimension had been inverted by a previous go/nogo detection task in which participants searched for exemplars of the category "young." The experiments support the hypothesis that figure-ground asymmetries produce compatibility effects in the IAT and suggest that IAT effects do not rely exclusively on evaluative associations between the target and attribute categories.

  16. A portable molecular-sieve-based CO2 sampling system for radiocarbon measurements

    NASA Astrophysics Data System (ADS)

    Palonen, V.

    2015-12-01

    We have developed a field-capable sampling system for the collection of CO2 samples for radiocarbon-concentration measurements. Most target systems in environmental research are limited in volume and CO2 concentration, making conventional flask sampling hard or impossible for radiocarbon studies. The present system captures the CO2 selectively to cartridges containing 13X molecular sieve material. The sampling does not introduce significant under-pressures or significant losses of moisture to the target system, making it suitable for most environmental targets. The system also incorporates a significantly larger sieve container for the removal of CO2 from chambers prior to the CO2 build-up phase and sampling. In addition, both the CO2 and H2O content of the sample gas are measured continuously. This enables in situ estimation of the amount of collected CO2 and the determination of CO2 flux to a chamber. The portable sampling system is described in detail and tests for the reliability of the method are presented.
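The chamber-flux determination mentioned at the end reduces to fitting the slope of the CO2 concentration record and converting ppm/s to a molar rate via the ideal gas law. A minimal sketch (the chamber volume, pressure, temperature, and build-up rate below are hypothetical numbers, and a linear build-up is assumed):

```python
import numpy as np

R = 8.314  # gas constant, J / (mol K)

def chamber_co2_flux(t_s, co2_ppm, volume_m3,
                     pressure_pa=101325.0, temp_k=293.15):
    """CO2 flux into a closed chamber, in umol/s, from the slope of the
    concentration record (ideal gas, linear build-up assumed)."""
    slope_ppm_per_s = np.polyfit(t_s, co2_ppm, 1)[0]
    n_air_mol = pressure_pa * volume_m3 / (R * temp_k)
    return slope_ppm_per_s * n_air_mol   # ppm/s * mol of air = umol/s

# Hypothetical record: 50 L chamber, CO2 rising 0.05 ppm/s over 10 min.
t = np.arange(0.0, 600.0, 10.0)
ppm = 400.0 + 0.05 * t
flux = chamber_co2_flux(t, ppm, 0.05)
```

The same slope, multiplied by the collection time, also gives an in situ estimate of the amount of CO2 available for sieve capture.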

  17. A portable molecular-sieve-based CO2 sampling system for radiocarbon measurements.

    PubMed

    Palonen, V

    2015-12-01

    We have developed a field-capable sampling system for the collection of CO2 samples for radiocarbon-concentration measurements. Most target systems in environmental research are limited in volume and CO2 concentration, making conventional flask sampling hard or impossible for radiocarbon studies. The present system captures the CO2 selectively to cartridges containing 13X molecular sieve material. The sampling does not introduce significant under-pressures or significant losses of moisture to the target system, making it suitable for most environmental targets. The system also incorporates a significantly larger sieve container for the removal of CO2 from chambers prior to the CO2 build-up phase and sampling. In addition, both the CO2 and H2O content of the sample gas are measured continuously. This enables in situ estimation of the amount of collected CO2 and the determination of CO2 flux to a chamber. The portable sampling system is described in detail and tests for the reliability of the method are presented.

  18. The control system of the polarized internal target of ANKE at COSY

    NASA Astrophysics Data System (ADS)

    Kleines, H.; Sarkadi, J.; Zwoll, K.; Engels, R.; Grigoryev, K.; Mikirtychyants, M.; Nekipelov, M.; Rathmann, F.; Seyfarth, H.; Kravtsov, P.; Vasilyev, A.

    2006-05-01

    The polarized internal target for the ANKE experiment at the Cooler Synchrotron COSY of the Forschungszentrum Jülich utilizes a polarized atomic beam source to feed a storage cell with polarized hydrogen or deuterium atoms. The nuclear polarization is measured with a Lamb-shift polarimeter. For common control of the two systems, industrial equipment was selected, providing reliable long-term support and remote control of the target as well as measurement and optimization of its operating parameters. The interlock system has been implemented on the basis of the SIEMENS SIMATIC S7-300 family of programmable logic controllers. In order to unify the interfacing to the control computer, all front-end equipment is connected via the PROFIBUS DP fieldbus. The process-control software was implemented using the Windows-based WinCC toolkit from SIEMENS. The variety of components to be controlled and the logical structure of the control and interlock system are described. Finally, a number of applications derived from the present development to other new installations are briefly mentioned.

  19. Limited generalization with varied, as compared to specific, practice in short-term motor learning.

    PubMed

    Willey, Chéla R; Liu, Zili

    2018-01-01

    The schema theory of learning predicts that varied training in motor learning should give rise to better transfer than specific training. For example, throwing beanbags during practice to targets 5 and 9ft away should better generalize to targets 7 and 11ft away, as compared to only throwing to a target 7ft away. In this study, we tested this prediction in a throwing task, when the pretest, practice, and posttest were all completed within an hour. Participants in the varied group practiced throwing at 5 and 9ft targets, while participants in the specific group practiced throwing at 7ft only. All participants reliably reduced errors from pretest to posttest. The varied group never outperformed the specific group at the 7ft target (the trained target for the specific group). They did not reliably outperform the specific group at 11ft, either. The numerically better performance at 11ft by the varied group was due, as it turned out in a subsequent experiment, to the fact that 11ft was closer to 9ft (one of the two training targets for the varied group) than to 7ft (the training target for the specific group). We conclude that varied training played a very limited role in short-term motor learning. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Comparative cath-lab assessment of coronary stenosis by radiology technician, junior and senior interventional cardiologist in patients treated with coronary angioplasty

    PubMed Central

    Delli Carri, Felice; Ruggiero, Maria Assunta; Cuculo, Andrea; Ruggiero, Antonio; Ziccardi, Luigi; De Gennaro, Luisa; Di Biase, Matteo

    2014-01-01

    Background Exact quantification of plaque extension during coronary angioplasty (PCI) usually falls to the interventional cardiologist (IC). Quantitative coronary stenosis assessment (QCA) could possibly be delegated to the radiology technician (RT), who usually supports the cath-lab nurse and the IC during PCI. We therefore sought to investigate the reliability of QCA performed by the RT in comparison with the IC. Methods Forty-four consecutive patients with acute coronary syndrome underwent PCI; the target coronary vessel size beneath the target coronary lesion (S) and the target coronary lesion length (L) were assessed by the RT, a junior IC (JIC), and a senior IC (SIC) and then compared. The SIC evaluation, which determined the final stent selection for coronary stenting, was considered the reference benchmark. Results RT performance with QCA support in assessing target vessel size and target lesion length did not differ significantly from the SIC (r = 0.46, p < 0.01; r = 0.64, p < 0.001, respectively) or the JIC (r = 0.79, r = 0.75, p < 0.001, respectively). JIC performance was significantly better than the RT in assessing target vessel size (p < 0.05), but not when assessing target lesion length. Conclusions The RT may reliably assess the target lesion by using adequate QCA software in the cath-lab during PCI; RT performance does not differ from the SIC. PMID:24672672
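The rater comparisons above rest on Pearson correlations between measurement series. For readers unfamiliar with the statistic, a minimal sketch (the lesion-length readings below are hypothetical illustration data, not the study's):

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation between two raters' measurement series."""
    a = np.asarray(a, float) - np.mean(a)
    b = np.asarray(b, float) - np.mean(b)
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# Hypothetical lesion-length readings (mm) from two raters.
senior = [12.0, 18.0, 9.0, 22.0, 15.0]
junior = [13.0, 17.5, 10.0, 21.0, 16.0]
r = pearson_r(senior, junior)
```

Note that correlation captures linear association, not absolute agreement: a rater with a constant bias can still correlate perfectly with the benchmark, which is why agreement-oriented statistics are often reported alongside r.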

  1. Revised Household-Based Microplanning in Polio Supplemental Immunization Activities in Kano State, Nigeria. 2013-2014.

    PubMed

    Gali, Emmanuel; Mkanda, Pascal; Banda, Richard; Korir, Charles; Bawa, Samuel; Warigon, Charity; Abdullahi, Suleiman; Abba, Bashir; Isiaka, Ayodeji; Yahualashet, Yared G; Touray, Kebba; Chevez, Ana; Tegegne, Sisay G; Nsubuga, Peter; Etsano, Andrew; Shuaib, Faisal; Vaz, Rui G

    2016-05-01

    Remarkable progress had been made since the launch of the Global Polio Eradication Initiative in 1988. However endemic wild poliovirus transmission in Nigeria, Pakistan, and Afghanistan remains an issue of international concern. Poor microplanning has been identified as a major contributor to the high numbers of chronically missed children. We assessed the contribution of the revised household-based microplanning process implemented in Kano State from September 2013 to April 2014 to the outcomes of subsequent polio supplemental immunization activities using preselected planning and outcome indicators. There was a 38% increase in the number of settlements enumerated, a 30% reduction in the number of target households, and a 54% reduction in target children. The reported number of children vaccinated and the doses of oral polio vaccine used during subsequent polio supplemental immunization activities showed a decline. Postvaccination lot quality assurance sampling and chronically missed settlement reports also showed a progressive reduction in the number of children and settlements missed. We observed improvement in Kano State's performance based on the selected postcampaign performance evaluation indicators and reliability of baseline demographic estimates after the revised household-based microplanning exercise. © 2016 World Health Organization; licensee Oxford Journals.

  2. Revised Household-Based Microplanning in Polio Supplemental Immunization Activities in Kano State, Nigeria. 2013–2014

    PubMed Central

    Gali, Emmanuel; Mkanda, Pascal; Banda, Richard; Korir, Charles; Bawa, Samuel; Warigon, Charity; Abdullahi, Suleiman; Abba, Bashir; Isiaka, Ayodeji; Yahualashet, Yared G.; Touray, Kebba; Chevez, Ana; Tegegne, Sisay G.; Nsubuga, Peter; Etsano, Andrew; Shuaib, Faisal; Vaz, Rui G.

    2016-01-01

    Background. Remarkable progress had been made since the launch of the Global Polio Eradication Initiative in 1988. However endemic wild poliovirus transmission in Nigeria, Pakistan, and Afghanistan remains an issue of international concern. Poor microplanning has been identified as a major contributor to the high numbers of chronically missed children. Methods. We assessed the contribution of the revised household-based microplanning process implemented in Kano State from September 2013 to April 2014 to the outcomes of subsequent polio supplemental immunization activities using preselected planning and outcome indicators. Results. There was a 38% increase in the number of settlements enumerated, a 30% reduction in the number of target households, and a 54% reduction in target children. The reported number of children vaccinated and the doses of oral polio vaccine used during subsequent polio supplemental immunization activities showed a decline. Postvaccination lot quality assurance sampling and chronically missed settlement reports also showed a progressive reduction in the number of children and settlements missed. Conclusions. We observed improvement in Kano State's performance based on the selected postcampaign performance evaluation indicators and reliability of baseline demographic estimates after the revised household-based microplanning exercise. PMID:26908755

  3. The kinematics of upper extremity reaching: a reliability study on people with and without shoulder impingement syndrome

    PubMed Central

    2010-01-01

    Background Tasks chosen to evaluate motor performance should reflect the movement deficits characteristic of the target population and present an appropriate challenge for the patients who would be evaluated. A reaching task that evaluates impairment characteristics of people with shoulder impingement syndrome (SIS) was developed to evaluate the motor performance of this population. The objectives of this study were to characterize the reproducibility of this reaching task in people with and without SIS and to evaluate the impact of the number of trials on reproducibility. Methods Thirty subjects with SIS and twenty healthy subjects participated in the first measurement session to evaluate intrasession reliability. Ten healthy subjects were retested within 2 to 7 days to assess intersession reliability. At each measurement session, upper extremity kinematic patterns were evaluated during a reaching task. Ten trials were recorded. Thereafter, the upper extremity position at the end of reaching and total joint excursion that occurred during reaching were calculated. Intraclass correlation coefficient (ICC) and minimal detectable change (MDC) were used to estimate intra and intersession reliability. Results Intrasession reliability for total joint excursion was good to very good when based on the first two trials (ICC = 0.77–0.92). As for end-reach position, intrasession reliability was very good when using either the first two, first five or last five trials (ICC > 0.82). Globally, MDC were smaller for the last five trials. Intersession reliability of total joint excursion and position at the end of reaching was good to very good when using the mean of the first two or five trials (ICC = 0.69–0.82). For most joints, MDC were smaller when using all ten trials. Conclusions The reaching task proposed to evaluate the upper limb motor performance was found reliable in people with and without SIS. 
Furthermore, the minimal difference necessary to infer a meaningful change in motor performance was determined, indicating that relatively small changes in task performance can be interpreted as a change in motor performance. PMID:20331889
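The ICC and MDC statistics used in this study have standard definitions. The sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single measure) from a subjects × raters matrix, plus an MDC at 95% confidence; the SEM here is simplified to use the overall score SD, and the demo matrices are invented, so treat it as an illustration of the formulas rather than the study's analysis:

```python
import numpy as np

def icc_2_1(Y):
    """ICC(2,1) for a (subjects x raters) score matrix Y."""
    n, k = Y.shape
    grand = Y.mean()
    row_m, col_m = Y.mean(axis=1), Y.mean(axis=0)
    msr = k * ((row_m - grand) ** 2).sum() / (n - 1)   # between subjects
    msc = n * ((col_m - grand) ** 2).sum() / (k - 1)   # between raters
    sse = ((Y - row_m[:, None] - col_m[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                    # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def mdc95(Y):
    """Minimal detectable change at 95% confidence:
    MDC95 = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC)."""
    sem = Y.std(ddof=1) * np.sqrt(max(1.0 - icc_2_1(Y), 0.0))
    return 1.96 * np.sqrt(2.0) * sem

# Two raters, three subjects: identical scores versus a 1-point bias.
perfect = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
biased  = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]])
```

Because ICC(2,1) measures absolute agreement, the constant rater bias in `biased` lowers the coefficient even though the rank ordering of subjects is preserved.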

  4. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
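The unilevel and decoupled formulations themselves are not given in the abstract; the toy sketch below shows only the nested idea they improve upon: an outer search over a design variable, with an inner reliability analysis (here plain Monte Carlo) checked against a target failure probability on each evaluation. The limit state, distribution, and all numbers are invented; real RBDO would use FORM/SORM or surrogates, and a full optimizer rather than bisection:

```python
import numpy as np

def failure_prob(d, n=200_000, seed=0):
    """Monte Carlo estimate of P[g < 0] for the toy limit state
    g = capacity - load = 50*d - L, with L ~ N(100, 15)."""
    rng = np.random.default_rng(seed)
    load = rng.normal(100.0, 15.0, n)
    return float(np.mean(50.0 * d - load < 0.0))

def rbdo_min_size(p_target=1e-3, lo=1.0, hi=5.0, iters=40):
    """Smallest design variable d meeting the reliability target.
    Cost grows with d, so the optimum sits on the reliability constraint;
    bisection works because failure_prob is monotone decreasing in d."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if failure_prob(mid) > p_target:
            lo = mid            # constraint violated: need a bigger d
        else:
            hi = mid            # feasible: try to shrink d
    return hi

d_opt = rbdo_min_size()
```

The computational pain point the abstract addresses is visible even here: every outer iteration pays for a full inner reliability analysis, which is exactly what unilevel and decoupled reformulations avoid.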

  5. Influences of Response Rate and Distribution on the Calculation of Interobserver Reliability Scores

    ERIC Educational Resources Information Center

    Rolider, Natalie U.; Iwata, Brian A.; Bullock, Christopher E.

    2012-01-01

    We examined the effects of several variations in response rate on the calculation of total, interval, exact-agreement, and proportional reliability indices. Trained observers recorded computer-generated data that appeared on a computer screen. In Study 1, target responses occurred at low, moderate, and high rates during separate sessions so that…
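The four indices named in the abstract have standard formulas in the behavioural-observation literature: total-count agreement, interval (occurrence) agreement, exact count-per-interval agreement, and proportional (block-by-block) agreement. A minimal sketch using those standard formulas on hypothetical per-interval counts:

```python
def ioa_indices(obs1, obs2):
    """Interobserver agreement from two observers' per-interval response
    counts; all four indices are returned as percentages (0-100)."""
    pairs = list(zip(obs1, obs2))
    n = len(pairs)
    # Total: ratio of the smaller to the larger session total.
    total = 100.0 * min(sum(obs1), sum(obs2)) / (max(sum(obs1), sum(obs2)) or 1)
    # Interval: proportion of intervals agreeing on occurrence/nonoccurrence.
    interval = 100.0 * sum((a > 0) == (b > 0) for a, b in pairs) / n
    # Exact agreement: proportion of intervals with identical counts.
    exact = 100.0 * sum(a == b for a, b in pairs) / n
    # Proportional: mean per-interval smaller/larger ratio.
    proportional = 100.0 * sum(
        1.0 if a == b == 0 else min(a, b) / max(a, b) for a, b in pairs) / n
    return {'total': total, 'interval': interval,
            'exact': exact, 'proportional': proportional}

# Hypothetical counts for four observation intervals.
scores = ioa_indices([2, 0, 1, 3], [1, 0, 1, 4])
```

The example shows the study's point in miniature: the same pair of records yields 100% total agreement but only 50% exact agreement, so the choice of index matters at different response rates.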

  6. Molecular basis for specificity in the druggable kinome: sequence-based analysis.

    PubMed

    Chen, Jianping; Zhang, Xi; Fernández, Ariel

    2007-03-01

    Rational design of kinase inhibitors remains a challenge partly because there is no clear delineation of the molecular features that direct the pharmacological impact towards clinically relevant targets. Standard factors governing ligand affinity, such as potential for intermolecular hydrophobic interactions or for intermolecular hydrogen bonding do not provide good markers to assess cross reactivity. Thus, a core question in the informatics of drug design is what type of molecular similarity among targets promotes promiscuity and what type of molecular difference governs specificity. This work answers the question for a sizable screened sample of the human pharmacokinome including targets with unreported structure. We show that drug design aimed at promoting pairwise interactions between ligand and kinase target actually fosters promiscuity because of the high conservation of the partner groups on or around the ATP-binding site of the kinase. Alternatively, we focus on a structural marker that may be reliably determined from sequence and measures dehydration propensities mostly localized on the loopy regions of kinases. Based on this marker, we construct a sequence-based kinase classifier that enables the accurate prediction of pharmacological differences. Our indicator is a microenvironmental descriptor that quantifies the propensity for water exclusion around preformed polar pairs. The results suggest that targeting polar dehydration patterns heralds a new generation of drugs that enable a tighter control of specificity than designs aimed at promoting ligand-kinase pairwise interactions. The predictor of polar hot spots for dehydration propensity, or solvent-accessible hydrogen bonds in soluble proteins, named YAPView, may be freely downloaded from the University of Chicago website http://protlib.uchicago.edu/dloads.html. Supplementary data are available at Bioinformatics online.

  7. An infrastructure for accurate characterization of single-event transients in digital circuits.

    PubMed

    Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael

    2013-11-01

    We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure.

  8. An infrastructure for accurate characterization of single-event transients in digital circuits☆

    PubMed Central

    Savulimedu Veeravalli, Varadan; Polzer, Thomas; Schmid, Ulrich; Steininger, Andreas; Hofbauer, Michael; Schweiger, Kurt; Dietrich, Horst; Schneider-Hornstein, Kerstin; Zimmermann, Horst; Voss, Kay-Obbe; Merk, Bruno; Hajek, Michael

    2013-01-01

    We present the architecture and a detailed pre-fabrication analysis of a digital measurement ASIC facilitating long-term irradiation experiments of basic asynchronous circuits, which also demonstrates the suitability of the general approach for obtaining accurate radiation failure models developed in our FATAL project. Our ASIC design combines radiation targets like Muller C-elements and elastic pipelines as well as standard combinational gates and flip-flops with an elaborate on-chip measurement infrastructure. Major architectural challenges result from the fact that the latter must operate reliably under the same radiation conditions the target circuits are exposed to, without wasting precious die area for a rad-hard design. A measurement architecture based on multiple non-rad-hard counters is used, which we show to be resilient against double faults, as well as many triple and even higher-multiplicity faults. The design evaluation is done by means of comprehensive fault injection experiments, which are based on detailed Spice models of the target circuits in conjunction with a standard double-exponential current injection model for single-event transients (SET). To be as accurate as possible, the parameters of this current model have been aligned with results obtained from 3D device simulation models, which have in turn been validated and calibrated using micro-beam radiation experiments at the GSI in Darmstadt, Germany. For the latter, target circuits instrumented with high-speed sense amplifiers have been used for analog SET recording. Together with a probabilistic analysis of the sustainable particle flow rates, based on a detailed area analysis and experimental cross-section data, we can conclude that the proposed architecture will indeed sustain significant target hit rates, without exceeding the resilience bound of the measurement infrastructure. PMID:24748694
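
The double-exponential current injection model named in the two records above has a simple closed form, I(t) = Q/(τα − τβ)·(exp(−t/τα) − exp(−t/τβ)), whose integral recovers the collected charge Q. The sketch below uses hypothetical parameter values (the records' actual parameters were aligned with 3D device simulations and micro-beam data, which are not reproduced here):

```python
import math

def set_current(t, q_coll, tau_alpha, tau_beta):
    """Double-exponential single-event-transient current pulse (A).

    I(t) = Q/(tau_a - tau_b) * (exp(-t/tau_a) - exp(-t/tau_b)), where Q is
    the collected charge, tau_a the charge-collection time constant and
    tau_b the track-establishment time constant.
    """
    if t < 0:
        return 0.0
    return q_coll / (tau_alpha - tau_beta) * (
        math.exp(-t / tau_alpha) - math.exp(-t / tau_beta))

def collected_charge(i_func, t_end, steps=100_000):
    """Midpoint-rule integration of the pulse, to check charge balance."""
    dt = t_end / steps
    return sum(i_func(dt * (k + 0.5)) for k in range(steps)) * dt

# Hypothetical values: 100 fC collected charge, tau_alpha = 200 ps,
# tau_beta = 50 ps; sample the peak over the first nanosecond.
Q, TA, TB = 100e-15, 200e-12, 50e-12
i_peak = max(set_current(k * 1e-12, Q, TA, TB) for k in range(1000))
q_back = collected_charge(lambda t: set_current(t, Q, TA, TB), 5e-9)
```

Integrating the pulse numerically returns the injected charge to well under 1%, which is the basic sanity check before such a source is used for Spice-level fault injection.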

  9. PROCOS: computational analysis of protein-protein complexes.

    PubMed

    Fink, Florian; Hochrein, Jochen; Wolowski, Vincent; Merkl, Rainer; Gronwald, Wolfram

    2011-09-01

    One of the main challenges in protein-protein docking is a meaningful evaluation of the many putative solutions. Here we present a program (PROCOS) that calculates a probability-like measure to be native for a given complex. In contrast to scores often used for analyzing complex structures, the calculated probabilities offer the advantage of providing a fixed range of expected values. This will allow, in principle, the comparison of models corresponding to different targets that were solved with the same algorithm. Judgments are based on distributions of properties derived from a large database of native and false complexes. For complex analysis PROCOS uses these property distributions of native and false complexes together with a support vector machine (SVM). PROCOS was compared to the established scoring schemes of ZRANK and DFIRE. Employing a set of experimentally solved native complexes, high probability values above 50% were obtained for 90% of these structures. Next, the performance of PROCOS was tested on the 40 binary targets of the Dockground decoy set, on 14 targets of the RosettaDock decoy set and on 9 targets that participated in the CAPRI scoring evaluation. Again the advantage of using a probability-based scoring system becomes apparent and a reasonable number of near native complexes was found within the top ranked complexes. In conclusion, a novel fully automated method is presented that allows the reliable evaluation of protein-protein complexes. Copyright © 2011 Wiley Periodicals, Inc.

  10. Mapping intended spinal site of care from the upright to prone position: an interexaminer reliability study.

    PubMed

    Cooperstein, Robert; Young, Morgan

    2014-01-01

    Upright examination procedures like radiology, thermography, manual muscle testing, and spinal motion palpation may lead to spinal interventions with the patient prone. The reliability and accuracy of mapping upright examination findings to the prone position is unknown. This study had 2 primary goals: (1) investigate how erroneous spine-scapular landmark associations may lead to errors in treating and charting spine levels; and (2) study the interexaminer reliability of a novel method for mapping upright spinal sites to the prone position. Experiment 1 was a thought experiment exploring the consequences of depending on the erroneous landmark association of the inferior scapular tip with the T7 spinous process upright and T6 spinous process prone (relatively recent studies suggest these levels are T8 and T9, respectively). This allowed deduction of targeting and charting errors. In experiment 2, 10 examiners (2 experienced, 8 novice) used an index finger to maintain contact with a mid-thoracic spinous process as each of 2 participants slowly moved from the upright to the prone position. Interexaminer reliability was assessed by computing Intraclass Correlation Coefficient, standard error of the mean, root mean squared error, and the absolute value of the mean difference for each examiner from the 10-examiner mean for each of the 2 participants. The thought experiment suggested that using the (inaccurate) scapular tip landmark rule would result in a 3-level targeting and charting error when radiological findings are mapped to the prone position. Physical upright exam procedures like motion palpation would result in a 2-level targeting error for intervention, and a 3-level error for charting. The reliability experiment showed examiners accurately maintained contact with the same thoracic spinous process as the participant went from upright to prone, ICC(2,1) = 0.83.
As manual therapists, the authors have emphasized how targeting errors may impact upon manual care of the spine. Practitioners in other fields that need to accurately locate spinal levels, such as acupuncture and anesthesiology, would also be expected to draw important conclusions from these findings.

  11. Mapping intended spinal site of care from the upright to prone position: an interexaminer reliability study

    PubMed Central

    2014-01-01

    Background Upright examination procedures like radiology, thermography, manual muscle testing, and spinal motion palpation may lead to spinal interventions with the patient prone. The reliability and accuracy of mapping upright examination findings to the prone position is unknown. This study had 2 primary goals: (1) investigate how erroneous spine-scapular landmark associations may lead to errors in treating and charting spine levels; and (2) study the interexaminer reliability of a novel method for mapping upright spinal sites to the prone position. Methods Experiment 1 was a thought experiment exploring the consequences of depending on the erroneous landmark association of the inferior scapular tip with the T7 spinous process upright and T6 spinous process prone (relatively recent studies suggest these levels are T8 and T9, respectively). This allowed deduction of targeting and charting errors. In experiment 2, 10 examiners (2 experienced, 8 novice) used an index finger to maintain contact with a mid-thoracic spinous process as each of 2 participants slowly moved from the upright to the prone position. Interexaminer reliability was assessed by computing Intraclass Correlation Coefficient, standard error of the mean, root mean squared error, and the absolute value of the mean difference for each examiner from the 10-examiner mean for each of the 2 participants. Results The thought experiment suggested that using the (inaccurate) scapular tip landmark rule would result in a 3-level targeting and charting error when radiological findings are mapped to the prone position. Physical upright exam procedures like motion palpation would result in a 2-level targeting error for intervention, and a 3-level error for charting. The reliability experiment showed examiners accurately maintained contact with the same thoracic spinous process as the participant went from upright to prone, ICC(2,1) = 0.83.
Conclusions As manual therapists, the authors have emphasized how targeting errors may impact upon manual care of the spine. Practitioners in other fields that need to accurately locate spinal levels, such as acupuncture and anesthesiology, would also be expected to draw important conclusions from these findings. PMID:24904747
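
The agreement statistic reported in the two records above, ICC(2,1), is the Shrout-Fleiss two-way random-effects, absolute-agreement, single-rater intraclass correlation. A minimal pure-Python sketch (the ratings matrix below is illustrative, not the study's data):

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: one row per target (e.g., participant/trial), one column per
    examiner. Computed from the classical ANOVA mean squares.
    """
    n, k = len(ratings), len(ratings[0])
    grand = sum(map(sum, ratings)) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ssr = k * sum((m - grand) ** 2 for m in row_means)   # between targets
    ssc = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    sst = sum((x - grand) ** 2 for r in ratings for x in r)
    sse = sst - ssr - ssc                                # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfect agreement across examiners yields 1.0; examiner noise relative to between-target spread pulls the value down toward the 0.83 range reported above.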

  12. Docking-based classification models for exploratory toxicology ...

    EPA Pesticide Factsheets

    Background: Exploratory toxicology is a new emerging research area whose ultimate mission is that of protecting human health and environment from risks posed by chemicals. In this regard, the ethical and practical limitation of animal testing has encouraged the promotion of computational methods for the fast screening of huge collections of chemicals available on the market. Results: We derived 24 reliable docking-based classification models able to predict the estrogenic potential of a large collection of chemicals having high quality experimental data, kindly provided by the U.S. Environmental Protection Agency (EPA). The predictive power of our docking-based models was supported by values of AUC, EF1% (EFmax = 7.1), -LR (at SE = 0.75) and +LR (at SE = 0.25) ranging from 0.63 to 0.72, from 2.5 to 6.2, from 0.35 to 0.67 and from 2.05 to 9.84, respectively. In addition, external predictions were successfully made on some representative known estrogenic chemicals. Conclusion: We show how structure-based methods, widely applied to drug discovery programs, can be adapted to meet the conditions of the regulatory context. Importantly, these methods enable one to employ the physicochemical information contained in the X-ray solved biological target and to screen structurally-unrelated chemicals.
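
The EF1% figure quoted above is the standard early-enrichment factor for virtual screening: how much the top 1% of the ranked list is enriched in actives relative to random selection. A hedged sketch (it assumes a higher docking score means "predicted active"; the EPA dataset's score conventions may differ):

```python
def enrichment_factor(scores, labels, fraction=0.01):
    """Enrichment factor at the top `fraction` of a score-ranked library.

    scores: docking scores (higher = predicted more active, by assumption);
    labels: 1 for experimentally active, 0 for inactive.
    EF = (hit rate in the top fraction) / (hit rate in the whole library).
    """
    n = len(scores)
    n_top = max(1, int(round(n * fraction)))
    ranked = sorted(zip(scores, labels), key=lambda p: p[0], reverse=True)
    hits_top = sum(lab for _, lab in ranked[:n_top])
    total_hits = sum(labels)
    return (hits_top / n_top) / (total_hits / n)
```

With 100 actives in a 1000-compound library, a model that ranks every active on top reaches the theoretical maximum EF1% of 10, which is why the record reports EFmax = 7.1 alongside the observed 2.5-6.2 range.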

  13. 4D dose simulation in volumetric arc therapy: Accuracy and affecting parameters

    PubMed Central

    Werner, René

    2017-01-01

    Radiotherapy of lung and liver lesions has changed from normofractionated 3D-CRT to stereotactic treatment in a single or few fractions, often employing volumetric arc therapy (VMAT)-based techniques. Potential unintended interference of respiratory target motion and dynamically changing beam parameters during VMAT dose delivery motivates establishing 4D quality assurance (4D QA) procedures to assess the appropriateness of generated VMAT treatment plans when taking into account patient-specific motion characteristics. Current approaches are motion phantom-based 4D QA and image-based 4D VMAT dose simulation. Whereas phantom-based 4D QA is usually restricted to a small number of measurements, the computational approaches allow simulating many motion scenarios. However, 4D VMAT dose simulation depends on various input parameters that influence the estimated doses and limit simulation reliability. Thus, aiming at routine use of simulation-based 4D VMAT QA, the impact of these parameters as well as the overall accuracy of 4D VMAT dose simulation has to be studied in detail, which is the topic of the present work. Specifically, we introduce the principles of 4D VMAT dose simulation, identify influencing parameters and assess their impact on 4D dose simulation accuracy by comparing simulated motion-affected dose distributions to corresponding dosimetric motion phantom measurements. Exploiting an ITV-based treatment planning approach, VMAT treatment plans were generated for a motion phantom and different motion scenarios (sinusoidal motion of different period/direction; regular/irregular motion). 4D VMAT dose simulation results and dose measurements were compared by local 3% / 3 mm γ-evaluation, with the measured dose distributions serving as ground truth. 
Overall γ-passing rates of simulations and dynamic measurements ranged from 97% to 100% (mean across all motion scenarios: 98% ± 1%); corresponding values for comparison of repeat measurements on different days were between 98% and 100%. Parameters of major influence on 4D VMAT dose simulation accuracy were the degree of temporal discretization of the dose delivery process (the higher, the better) and correct alignment of the assumed breathing phases at the beginning of the dose measurements and simulations. Given the high γ-passing rates between simulated motion-affected doses and dynamic measurements, we consider the simulations to provide a reliable basis for assessment of VMAT motion effects that, in the sense of 4D QA of VMAT treatment plans, allows verification of target coverage in hypofractionated VMAT-based radiotherapy of moving targets. Remaining differences between measurements and simulations motivate, however, further detailed studies. PMID:28231337

  14. 4D dose simulation in volumetric arc therapy: Accuracy and affecting parameters.

    PubMed

    Sothmann, Thilo; Gauer, Tobias; Werner, René

    2017-01-01

    Radiotherapy of lung and liver lesions has changed from normofractionated 3D-CRT to stereotactic treatment in a single or few fractions, often employing volumetric arc therapy (VMAT)-based techniques. Potential unintended interference of respiratory target motion and dynamically changing beam parameters during VMAT dose delivery motivates establishing 4D quality assurance (4D QA) procedures to assess the appropriateness of generated VMAT treatment plans when taking into account patient-specific motion characteristics. Current approaches are motion phantom-based 4D QA and image-based 4D VMAT dose simulation. Whereas phantom-based 4D QA is usually restricted to a small number of measurements, the computational approaches allow simulating many motion scenarios. However, 4D VMAT dose simulation depends on various input parameters that influence the estimated doses and limit simulation reliability. Thus, aiming at routine use of simulation-based 4D VMAT QA, the impact of these parameters as well as the overall accuracy of 4D VMAT dose simulation has to be studied in detail, which is the topic of the present work. Specifically, we introduce the principles of 4D VMAT dose simulation, identify influencing parameters and assess their impact on 4D dose simulation accuracy by comparing simulated motion-affected dose distributions to corresponding dosimetric motion phantom measurements. Exploiting an ITV-based treatment planning approach, VMAT treatment plans were generated for a motion phantom and different motion scenarios (sinusoidal motion of different period/direction; regular/irregular motion). 4D VMAT dose simulation results and dose measurements were compared by local 3% / 3 mm γ-evaluation, with the measured dose distributions serving as ground truth. 
Overall γ-passing rates of simulations and dynamic measurements ranged from 97% to 100% (mean across all motion scenarios: 98% ± 1%); corresponding values for comparison of repeat measurements on different days were between 98% and 100%. Parameters of major influence on 4D VMAT dose simulation accuracy were the degree of temporal discretization of the dose delivery process (the higher, the better) and correct alignment of the assumed breathing phases at the beginning of the dose measurements and simulations. Given the high γ-passing rates between simulated motion-affected doses and dynamic measurements, we consider the simulations to provide a reliable basis for assessment of VMAT motion effects that, in the sense of 4D QA of VMAT treatment plans, allows verification of target coverage in hypofractionated VMAT-based radiotherapy of moving targets. Remaining differences between measurements and simulations motivate, however, further detailed studies.
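
The γ-evaluation used in the two records above combines a dose-difference criterion (3%) with a distance-to-agreement criterion (3 mm): a point passes if some nearby reference point agrees within the combined tolerance ellipse. The sketch below is a simplified 1-D, globally normalized version (the records use local normalization on 3-D dose grids, so this only illustrates the metric's structure):

```python
import math

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose,
                dose_crit=0.03, dist_crit=3.0):
    """1-D gamma for one evaluated point: the minimum combined
    dose-difference / distance metric over the reference distribution.
    Dose differences are normalized to the reference maximum (global)."""
    d_max = max(ref_dose)
    best = float('inf')
    for rp, rd in zip(ref_pos, ref_dose):
        g = math.sqrt(((eval_dose - rd) / (dose_crit * d_max)) ** 2
                      + ((eval_pos - rp) / dist_crit) ** 2)
        best = min(best, g)
    return best

def passing_rate(ref_pos, ref_dose, eval_pos, eval_dose):
    """Percentage of evaluated points with gamma <= 1."""
    gammas = [gamma_index(ref_pos, ref_dose, p, d)
              for p, d in zip(eval_pos, eval_dose)]
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

Identical distributions pass at 100%; a gross dose error drops the rate well below the 97-100% range reported above.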

  15. Utilization of wireless structural health monitoring as decision making tools for a condition and reliability-based assessment of railroad bridges

    NASA Astrophysics Data System (ADS)

    Flanigan, Katherine A.; Johnson, Nephi R.; Hou, Rui; Ettouney, Mohammed; Lynch, Jerome P.

    2017-04-01

    The ability to quantitatively assess the condition of railroad bridges facilitates objective evaluation of their robustness in the face of hazard events. Of particular importance is the need to assess the condition of railroad bridges in networks that are exposed to multiple hazards. Data collected from structural health monitoring (SHM) can be used to better maintain a structure by prompting preventative (rather than reactive) maintenance strategies and supplying quantitative information to aid in recovery. To that end, a wireless monitoring system is validated and installed on the Harahan Bridge, a hundred-year-old long-span railroad truss bridge that crosses the Mississippi River near Memphis, TN. This bridge is exposed to multiple hazards including scour, vehicle/barge impact, seismic activity, and aging. The instrumented sensing system targets non-redundant structural components and areas of the truss and floor system that bridge managers are most concerned about based on previous inspections and structural analysis. This paper details the monitoring system and the analytical method for the assessment of bridge condition based on automated data-driven analyses. Two primary objectives of monitoring system performance are discussed: 1) monitoring fatigue accumulation in critical tensile truss elements; and 2) monitoring the reliability index values associated with sub-system limit states of these members. Moreover, since the reliability index is a scalar indicator of the safety of components, quantifiable condition assessment can be used as an objective metric so that bridge owners can devise informed damage mitigation strategies and optimize resource management at the single-bridge or network level.
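
In its simplest first-order second-moment form, the reliability index mentioned above reduces, for a linear limit state g = R − S with independent normal resistance R and load effect S, to β = (μR − μS)/√(σR² + σS²). A minimal sketch (real bridge limit states are far more elaborate than this two-variable case):

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order second-moment reliability index for g = R - S,
    with R (resistance) and S (load effect) independent and normal."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    """P(g < 0) = Phi(-beta) for the normal case."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))
```

Because β is a scalar, it can be tracked per component from monitoring data and compared against target values, which is how the record's owners-level decision making is framed.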

  16. Creating diversified response profiles from a single quenchometric sensor element by using phase-resolved luminescence.

    PubMed

    Tehan, Elizabeth C; Bukowski, Rachel M; Chodavarapu, Vamsy P; Titus, Albert H; Cartwright, Alexander N; Bright, Frank V

    2015-01-05

    We report a new strategy for generating a continuum of response profiles from a single luminescence-based sensor element by using phase-resolved detection. This strategy yields reliable responses that depend in a predictable manner on changes in the luminescent reporter lifetime in the presence of the target analyte, the excitation modulation frequency, and the detector (lock-in amplifier) phase angle. In the traditional steady-state mode, the sensor that we evaluate exhibits a linear, positive-going response to changes in the target analyte concentration. Under phase-resolved conditions the analyte-dependent response profiles: (i) can become highly non-linear; (ii) yield negative-going responses; (iii) can be biphasic; and (iv) can exhibit super sensitivity (e.g., sensitivities up to 300-fold greater in comparison to steady-state conditions).
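
The mechanism above can be sketched with standard frequency-domain luminescence relations: a single-exponential luminophore modulated at angular frequency ω has demodulation m = 1/√(1 + (ωτ)²) and phase lag φ = arctan(ωτ), and a lock-in detector set to phase θ reports S ∝ m·cos(θ − φ). With dynamic (Stern-Volmer) quenching shortening τ, different detector phases yield response profiles of different shape and even sign. All parameter values below are illustrative, not taken from the record:

```python
import math

def lockin_response(tau, omega, theta, amplitude=1.0):
    """In-phase lock-in output for a single-exponential luminophore:
    modulation depth m = 1/sqrt(1 + (omega*tau)^2), phase lag
    phi = atan(omega*tau), projected onto detector phase theta."""
    m = 1.0 / math.sqrt(1.0 + (omega * tau) ** 2)
    phi = math.atan(omega * tau)
    return amplitude * m * math.cos(theta - phi)

def quenched_lifetime(tau0, ksv, q):
    """Stern-Volmer dynamic quenching: tau0/tau = 1 + Ksv*[Q]."""
    return tau0 / (1.0 + ksv * q)

# Illustrative values: 5 us unquenched lifetime, Ksv = 10, 100 kHz modulation.
TAU0, KSV, OMEGA = 5e-6, 10.0, 2.0 * math.pi * 1e5
in_phase = [lockin_response(quenched_lifetime(TAU0, KSV, q), OMEGA, 0.0)
            for q in (0.0, 1.0)]
quadrature = [lockin_response(quenched_lifetime(TAU0, KSV, q), OMEGA,
                              math.pi / 2) for q in (0.0, 1.0)]
```

At θ = 0 the output grows as the quencher shortens τ (positive-going), while at θ = π/2 it shrinks (negative-going), which is the sign diversity the record exploits from a single sensor element.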

  17. Design of Potent and Druglike Nonphenolic Inhibitors for Catechol O-Methyltransferase Derived from a Fragment Screening Approach Targeting the S-Adenosyl-l-methionine Pocket.

    PubMed

    Lerner, Christian; Jakob-Roetne, Roland; Buettelmann, Bernd; Ehler, Andreas; Rudolph, Markus; Rodríguez Sarmiento, Rosa María

    2016-11-23

    A fragment screening approach designed to target specifically the S-adenosyl-l-methionine pocket of catechol O-methyltransferase allowed the identification of structurally related fragments of high ligand efficiency and with activity in the described orthogonal assays. Guided by a reliable enzymatic assay together with X-ray crystallography, a series of fragment modifications revealed an SAR, and after several expansions potent lead compounds could be obtained. For the first time, nonphenolic, small, low-nanomolar SAM-competitive COMT inhibitors are reported. These compounds represent a novel series of potent COMT inhibitors that might be further optimized into new drugs useful for the treatment of Parkinson's disease, as adjuncts in levodopa-based therapy, or for the treatment of schizophrenia.

  18. Self-Fulfilling Prophecies as a Link between Men’s Facial Width-to-Height Ratio and Behavior

    PubMed Central

    Haselhuhn, Michael P.; Wong, Elaine M.; Ormiston, Margaret E.

    2013-01-01

    The facial width-to-height ratio (fWHR) has been identified as a reliable predictor of men’s behavior, with researchers focusing on evolutionary selection pressures as the underlying mechanism explaining these relationships. In this paper, we complement this approach and examine how social processes also determine the extent to which men’s fWHR serves as a behavioral cue. Specifically, we propose that observers’ treatment of target men based on the targets’ fWHR subsequently affects behavior, leading the targets to behave in ways that are consistent with the observers’ expectations (i.e., a self-fulfilling prophecy). Results from four studies demonstrate that individuals behave more selfishly when interacting with men with greater fWHRs, and this selfish behavior, in turn, elicits selfish behavior in others. PMID:24015226

  19. The Enzyme Function Initiative†

    PubMed Central

    Gerlt, John A.; Allen, Karen N.; Almo, Steven C.; Armstrong, Richard N.; Babbitt, Patricia C.; Cronan, John E.; Dunaway-Mariano, Debra; Imker, Heidi J.; Jacobson, Matthew P.; Minor, Wladek; Poulter, C. Dale; Raushel, Frank M.; Sali, Andrej; Shoichet, Brian K.; Sweedler, Jonathan V.

    2011-01-01

    The Enzyme Function Initiative (EFI) was recently established to address the challenge of assigning reliable functions to enzymes discovered in bacterial genome projects; in this Current Topic we review the structure and operations of the EFI. The EFI includes the Superfamily/Genome, Protein, Structure, Computation, and Data/Dissemination Cores that provide the infrastructure for reliably predicting the in vitro functions of unknown enzymes. The initial targets for functional assignment are selected from five functionally diverse superfamilies (amidohydrolase, enolase, glutathione transferase, haloalkanoic acid dehalogenase, and isoprenoid synthase), with five superfamily-specific Bridging Projects experimentally testing the predicted in vitro enzymatic activities. The EFI also includes the Microbiology Core that evaluates the in vivo context of in vitro enzymatic functions and confirms the functional predictions of the EFI. The deliverables of the EFI to the scientific community include: 1) development of a large-scale, multidisciplinary sequence/structure-based strategy for functional assignment of unknown enzymes discovered in genome projects (target selection, protein production, structure determination, computation, experimental enzymology, microbiology, and structure-based annotation); 2) dissemination of the strategy to the community via publications, collaborations, workshops, and symposia; 3) computational and bioinformatic tools for using the strategy; 4) provision of experimental protocols and/or reagents for enzyme production and characterization; and 5) dissemination of data via the EFI’s website, enzymefunction.org. The realization of multidisciplinary strategies for functional assignment will begin to define the full metabolic diversity that exists in nature and will impact basic biochemical and evolutionary understanding, as well as a wide range of applications of central importance to industrial, medicinal and pharmaceutical efforts. 
PMID:21999478

  20. The Enzyme Function Initiative.

    PubMed

    Gerlt, John A; Allen, Karen N; Almo, Steven C; Armstrong, Richard N; Babbitt, Patricia C; Cronan, John E; Dunaway-Mariano, Debra; Imker, Heidi J; Jacobson, Matthew P; Minor, Wladek; Poulter, C Dale; Raushel, Frank M; Sali, Andrej; Shoichet, Brian K; Sweedler, Jonathan V

    2011-11-22

    The Enzyme Function Initiative (EFI) was recently established to address the challenge of assigning reliable functions to enzymes discovered in bacterial genome projects; in this Current Topic, we review the structure and operations of the EFI. The EFI includes the Superfamily/Genome, Protein, Structure, Computation, and Data/Dissemination Cores that provide the infrastructure for reliably predicting the in vitro functions of unknown enzymes. The initial targets for functional assignment are selected from five functionally diverse superfamilies (amidohydrolase, enolase, glutathione transferase, haloalkanoic acid dehalogenase, and isoprenoid synthase), with five superfamily specific Bridging Projects experimentally testing the predicted in vitro enzymatic activities. The EFI also includes the Microbiology Core that evaluates the in vivo context of in vitro enzymatic functions and confirms the functional predictions of the EFI. The deliverables of the EFI to the scientific community include (1) development of a large-scale, multidisciplinary sequence/structure-based strategy for functional assignment of unknown enzymes discovered in genome projects (target selection, protein production, structure determination, computation, experimental enzymology, microbiology, and structure-based annotation), (2) dissemination of the strategy to the community via publications, collaborations, workshops, and symposia, (3) computational and bioinformatic tools for using the strategy, (4) provision of experimental protocols and/or reagents for enzyme production and characterization, and (5) dissemination of data via the EFI's Website, http://enzymefunction.org. 
The realization of multidisciplinary strategies for functional assignment will begin to define the full metabolic diversity that exists in nature and will impact basic biochemical and evolutionary understanding, as well as a wide range of applications of central importance to industrial, medicinal, and pharmaceutical efforts. © 2011 American Chemical Society

  1. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    Ilies, Maria; Iuga, Cristina Adela; Loghin, Felicia; Dhople, Vishnu Mukund; Hammer, Elke

    2017-01-01

    Background and aims Proteome-based biomarker studies are targeting proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry-based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10^5. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793
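
The stability screen described above (median CV 0.36; a highly stable subset of 63 proteins) can be sketched as a simple coefficient-of-variation filter over per-sample absolute concentrations. The protein names and the cutoff below are illustrative only, with the cutoff borrowed from the record's reported median CV:

```python
import statistics

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean of the per-sample values."""
    return statistics.stdev(values) / statistics.mean(values)

def stable_proteins(quant, cv_cutoff=0.36):
    """Keep proteins whose across-sample CV falls below the cutoff.

    quant: {protein: [absolute concentration per sample]} -- here
    hypothetical values standing in for the 242 quantified proteins.
    """
    return {p: v for p, v in quant.items()
            if coefficient_of_variation(v) < cv_cutoff}
```

A protein with nearly constant concentrations across samples survives the filter; one that swings by an order of magnitude (e.g., an acute-phase protein) does not.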

  2. The operations of quantum logic gates with pure and mixed initial states.

    PubMed

    Chen, Jun-Liang; Li, Che-Ming; Hwang, Chi-Chuan; Ho, Yi-Hui

    2011-04-07

    The implementations of quantum logic gates realized by the rovibrational states of a ¹²C¹⁶O molecule in the X¹Σ⁺ electronic ground state are investigated. Optimal laser fields are obtained by using the modified multi-target optimal control theory (MTOCT), which combines the maxima of the cost functional and the fidelity for state and quantum process. The projection operator technique together with modified MTOCT is used to get optimal laser fields. If the initial states of the quantum gate are pure states, the states at the target time approach the ideal target states well. However, if the initial states are mixed states, the target states do not approach the ideal ones well. The process fidelity is introduced to investigate the reliability of the quantum gate operation driven by the optimal laser field. We found that the quantum gates operate reliably whether the initial states are pure or mixed.
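
When the ideal target is a pure state |ψ⟩, the fidelity of the achieved (possibly mixed) state ρ has the compact form F = ⟨ψ|ρ|ψ⟩, which equals 1 only when ρ coincides with the target. This is plain state fidelity; the record's process fidelity additionally averages over gate inputs, which is not reproduced here. A minimal sketch:

```python
def fidelity_pure_target(psi, rho):
    """Fidelity F = <psi|rho|psi> of an achieved density matrix rho
    against a pure ideal target state psi (lists of complex numbers)."""
    dim = len(psi)
    # Apply rho to |psi>, then take the inner product with <psi|.
    rho_psi = [sum(rho[i][j] * psi[j] for j in range(dim))
               for i in range(dim)]
    return sum(psi[i].conjugate() * rho_psi[i] for i in range(dim)).real
```

For a qubit targeting |0⟩, a perfectly prepared pure state gives F = 1, while the maximally mixed state gives F = 0.5, mirroring the pure-versus-mixed distinction discussed above.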

  3. Placenta-specific1 (PLAC1) is a potential target for antibody-drug conjugate-based prostate cancer immunotherapy.

    PubMed

    Nejadmoghaddam, Mohammad-Reza; Zarnani, Amir-Hassan; Ghahremanzadeh, Ramin; Ghods, Roya; Mahmoudian, Jafar; Yousefi, Maryam; Nazari, Mahboobeh; Ghahremani, Mohammad Hossein; Abolhasani, Maryam; Anissian, Ali; Mahmoudi, Morteza; Dinarvand, Rassoul

    2017-10-17

    Our recent findings strongly support the idea of PLAC1 being a potential immunotherapeutic target in prostate cancer (PCa). Here, we have generated and evaluated an anti-placenta-specific1 (PLAC1)-based antibody-drug conjugate (ADC) for targeted immunotherapy of PCa. Prostate cancer cells express considerable levels of PLAC1. The anti-PLAC1 clone, 2H12C12, showed high reactivity with recombinant PLAC1 and selectively recognized PLAC1 in prostate cancer cells but not in LS180 cells, the negative control. PLAC1 binding induced rapid internalization of the antibody within a few minutes, which reached about 50% after 15 min and was almost complete within an hour. After SN38 conjugation to the antibody, a drug-antibody ratio (DAR) of about 5.5 was achieved without apparent negative effect on antibody affinity for the cell-surface antigen. The ADC retained intrinsic antibody activity and showed enhanced and selective cytotoxicity, with an IC50 of 62 nM, about 15-fold lower than that of the free drug. Anti-PLAC1-ADC induced apoptosis in human primary prostate cancer cells and prostate cell lines. No apparent cytotoxic effect was observed in in vivo animal safety experiments. Our newly developed anti-PLAC1-based ADC might pave the way for a reliable, efficient, and novel immunotherapeutic modality for patients with PCa.

  4. Evaluating web-based cognitive-affective remediation in recent trauma survivors: study rationale and protocol.

    PubMed

    Fine, Naomi B; Achituv, Michal; Etkin, Amit; Merin, Ofer; Shalev, Arieh Y

    2018-01-01

    Background: The immediate aftermath of traumatic events is a period of enhanced neural plasticity, following which some survivors remain with post-traumatic stress disorder (PTSD) whereas others recover. Evidence points to impairments in emotional reactivity, emotion regulation, and broader executive functions as critical contributors to PTSD. Emerging evidence further suggests that the neural mechanisms underlying these functions remain plastic in adulthood and that targeted retraining of these systems may enhance their efficiency and could reduce the likelihood of developing PTSD. Administering targeted neurocognitive training shortly after trauma exposure is a daunting challenge, and this work describes a study design addressing that challenge. The study evaluated the direct effects of cognitive remediation training on neurocognitive mechanisms that hypothetically underlie PTSD, and the indirect effect of this intervention on emerging PTSD symptoms. Method: We describe the study rationale, design, and methodological choices involving: (a) participant enrolment; (b) implementation and management of a daily self-administered, web-based intervention; (c) reliable, timely screening and assessment of eligible survivors; and (d) definition of control conditions and outcome measures. We outline the rationale for choices made regarding the study sample, timing of the intervention, measurements, monitoring of participants' adherence, and ways to harmonize and maintain interviewers' fidelity and to mitigate burnout from repeated contacts with recently traumatized survivors. Conclusion: Early web-based interventions targeting causative mechanisms of PTSD can be informed by the model presented in this paper.

  5. Enhanced detection and visualization of anomalies in spectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.; Messinger, David W.

    2009-05-01

    Anomaly detection algorithms applied to hyperspectral imagery can reliably distinguish man-made objects from a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but it has an inherently higher false alarm rate. Standard anomaly detection algorithms measure the deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) instead creates a fully non-parametric, graph-theory-based topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images, using the entire canonical target set for generation of ROC curves. TAD is compared against several statistics-based detectors, including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes, simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring the anomalies with principal-components projections using statistics computed from the anomalies. This yields a very useful colorization in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify the anomalies of highest interest.
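    The statistics-based detectors referenced above score each pixel by the Mahalanobis distance of its spectrum from the estimated background statistics. A minimal global-RX sketch (illustrative only; the paper's local RX, subspace RX, and TAD are more involved) might look like:

    ```python
    import numpy as np

    def rx_anomaly_scores(cube):
        """Global RX detector: Mahalanobis distance of each pixel
        spectrum from the scene-wide background mean and covariance."""
        h, w, bands = cube.shape
        pixels = cube.reshape(-1, bands).astype(float)
        mu = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False)
        cov_inv = np.linalg.pinv(cov)   # pseudo-inverse guards against a singular covariance
        centered = pixels - mu
        # Quadratic form (x - mu)^T C^-1 (x - mu) for every pixel at once
        scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
        return scores.reshape(h, w)

    # Synthetic scene: flat background with one bright "anomaly" pixel
    cube = np.random.default_rng(0).normal(0.2, 0.01, size=(32, 32, 10))
    cube[16, 16, :] += 1.0
    scores = rx_anomaly_scores(cube)
    ```

    The anomalous pixel receives by far the highest score, while ordinary background pixels score near the chi-squared mean for the number of bands.
    
    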

  6. PESSTO: survey description and products from the first data release by the Public ESO Spectroscopic Survey of Transient Objects

    NASA Astrophysics Data System (ADS)

    Smartt, S. J.; Valenti, S.; Fraser, M.; Inserra, C.; Young, D. R.; Sullivan, M.; Pastorello, A.; Benetti, S.; Gal-Yam, A.; Knapic, C.; Molinaro, M.; Smareglia, R.; Smith, K. W.; Taubenberger, S.; Yaron, O.; Anderson, J. P.; Ashall, C.; Balland, C.; Baltay, C.; Barbarino, C.; Bauer, F. E.; Baumont, S.; Bersier, D.; Blagorodnova, N.; Bongard, S.; Botticella, M. T.; Bufano, F.; Bulla, M.; Cappellaro, E.; Campbell, H.; Cellier-Holzem, F.; Chen, T.-W.; Childress, M. J.; Clocchiatti, A.; Contreras, C.; Dall'Ora, M.; Danziger, J.; de Jaeger, T.; De Cia, A.; Della Valle, M.; Dennefeld, M.; Elias-Rosa, N.; Elman, N.; Feindt, U.; Fleury, M.; Gall, E.; Gonzalez-Gaitan, S.; Galbany, L.; Morales Garoffolo, A.; Greggio, L.; Guillou, L. L.; Hachinger, S.; Hadjiyska, E.; Hage, P. E.; Hillebrandt, W.; Hodgkin, S.; Hsiao, E. Y.; James, P. A.; Jerkstrand, A.; Kangas, T.; Kankare, E.; Kotak, R.; Kromer, M.; Kuncarayakti, H.; Leloudas, G.; Lundqvist, P.; Lyman, J. D.; Hook, I. M.; Maguire, K.; Manulis, I.; Margheim, S. J.; Mattila, S.; Maund, J. R.; Mazzali, P. A.; McCrum, M.; McKinnon, R.; Moreno-Raya, M. E.; Nicholl, M.; Nugent, P.; Pain, R.; Pignata, G.; Phillips, M. M.; Polshaw, J.; Pumo, M. L.; Rabinowitz, D.; Reilly, E.; Romero-Cañizales, C.; Scalzo, R.; Schmidt, B.; Schulze, S.; Sim, S.; Sollerman, J.; Taddia, F.; Tartaglia, L.; Terreran, G.; Tomasella, L.; Turatto, M.; Walker, E.; Walton, N. A.; Wyrzykowski, L.; Yuan, F.; Zampieri, L.

    2015-07-01

    Context. The Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO) began as a public spectroscopic survey in April 2012. PESSTO classifies transients from publicly available sources and wide-field surveys, and selects science targets for detailed spectroscopic and photometric follow-up. PESSTO runs for nine months of the year, January-April and August-December inclusive, and typically has allocations of 10 nights per month. Aims: We describe the data reduction strategy and data products that are publicly available through the ESO archive as the Spectroscopic Survey data release 1 (SSDR1). Methods: PESSTO uses the New Technology Telescope with the instruments EFOSC2 and SOFI to provide optical and NIR spectroscopy and imaging. We target supernovae and optical transients brighter than 20.5 mag for classification. Science targets are selected for follow-up based on the PESSTO science goal of extending knowledge of the extremes of the supernova population. We use standard EFOSC2 set-ups providing spectra with resolutions of 13-18 Å between 3345-9995 Å. A subset of the brighter science targets are selected for SOFI spectroscopy with the blue and red grisms (0.935-2.53 μm, resolutions 23-33 Å) and imaging with broadband JHKs filters. Results: This first data release (SSDR1) contains flux-calibrated spectra from the first year (April 2012-2013). A total of 221 confirmed supernovae were classified, and we released calibrated optical spectra and classifications publicly within 24 h of the data being taken (via WISeREP). The data in SSDR1 supersede those previously released spectra: they have more reliable and quantifiable flux calibrations, are corrected for telluric absorption, and are made available in standard ESO Phase 3 formats. 
We estimate the absolute accuracy of the flux calibrations for EFOSC2 across the whole survey in SSDR1 to be typically ~15%, although a number of spectra will have less reliable absolute flux calibration because of weather and slit losses. Acquisition images for each spectrum are available which, in principle, can allow the user to refine the absolute flux calibration. The standard NIR reduction process does not produce high accuracy absolute spectrophotometry but synthetic photometry with accompanying JHKs imaging can improve this. Whenever possible, reduced SOFI images are provided to allow this. Conclusions: Future data releases will focus on improving the automated flux calibration of the data products. The rapid turnaround between discovery and classification and access to reliable pipeline processed data products has allowed early science papers in the first few months of the survey. Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere, Chile, as part of programme 188.D-3003 (PESSTO). http://www.pessto.org

  7. Defining the wheat gluten peptide fingerprint via a discovery and targeted proteomics approach.

    PubMed

    Martínez-Esteso, María José; Nørgaard, Jørgen; Brohée, Marcel; Haraszi, Reka; Maquet, Alain; O'Connor, Gavin

    2016-09-16

    Accurate, reliable and sensitive detection methods for gluten are required to support current EU regulations. The enforcement of legislative levels requires that measurement results are comparable over time and between methods. This is not a trivial task for gluten, which comprises a large number of protein targets. This paper describes a strategy for defining a set of specific analytical targets for wheat gluten. A comprehensive proteomic approach was applied by fractionating wheat gluten using RP-HPLC (reversed-phase high-performance liquid chromatography) followed by a multi-enzymatic digestion (LysC, trypsin and chymotrypsin) with subsequent mass spectrometric analysis. This approach identified 434 peptide sequences from gluten. Peptides were grouped based on two criteria: being unique to a single gluten protein sequence; and containing known immunogenic and toxic sequences in the context of coeliac disease. An LC-MS/MS method based on selected reaction monitoring (SRM) was developed on a triple quadrupole mass spectrometer for the specific detection of the target peptides. The SRM-based screening approach was applied to gluten-containing cereals (wheat, rye, barley and oats) and non-gluten-containing flours (corn, soy and rice). A unique set of wheat gluten marker peptides was identified, and these are proposed as wheat-specific markers. The measurement of gluten in processed food products in support of regulatory limits is performed routinely. Mass spectrometry is emerging as a viable alternative to ELISA-based methods. Here we outline a set of peptide markers that are representative of gluten and consider the end user's needs in protecting those with coeliac disease. The approach taken has been applied to wheat but can easily be extended to include other species, potentially enabling the MS quantification of different gluten-containing species from the identified markers. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
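    The two grouping criteria above (a peptide unique to a single gluten protein sequence, and containment of a known immunogenic core) reduce to simple set logic. A toy sketch with hypothetical peptide-to-protein mappings (the sequences and accessions below are made up for illustration, not taken from the study):

    ```python
    # Hypothetical toy data: peptide -> set of parent protein accessions.
    peptide_hits = {
        "QQQPFPQQ": {"gliadin_alpha"},                  # unique to one protein
        "LQLQPFPQPQLPY": {"gliadin_alpha"},             # unique, immunogenic core
        "VSFQQP": {"gliadin_alpha", "gliadin_gamma"},   # shared -> rejected
    }
    immunogenic_cores = {"LQLQPFPQPQLPY"}

    def select_markers(hits, cores):
        """Keep peptides mapping to exactly one protein; flag those that
        additionally match a known immunogenic/toxic sequence."""
        markers = {}
        for pep, proteins in hits.items():
            if len(proteins) == 1:
                markers[pep] = {"protein": next(iter(proteins)),
                                "immunogenic": pep in cores}
        return markers

    markers = select_markers(peptide_hits, immunogenic_cores)
    ```

    Shared peptides are excluded as non-specific, leaving candidate markers annotated with their coeliac relevance for downstream SRM assay design.
    
    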

  8. Development of a questionnaire for assessing factors predicting blood donation among university students: a pilot study.

    PubMed

    Jalalian, Mehrdad; Latiff, Latiffah; Hassan, Syed Tajuddin Syed; Hanachi, Parichehr; Othman, Mohamed

    2010-05-01

    University students are a target group for blood donor programs. To develop a blood donation culture among university students, it is important to identify the factors that predict their intent to donate blood. This study attempted to develop a valid and reliable measurement tool for assessing variables in a blood donation behavior model based on the Theory of Planned Behavior (TPB), a commonly used theoretical foundation for social psychology studies. We carried out an elicitation study, in which we determined the commonly held behavioral and normative beliefs about blood donation. We used the results of the elicitation study, together with a standard format for creating questionnaire items for all constructs of the TPB model, to prepare the first draft of the measurement tool. After piloting the questionnaire, we prepared the final draft to be used in our main study. Examination of internal consistency using Cronbach's alpha coefficient and item-total statistics indicated that the constructs "Intention" and "Self-efficacy" had the highest reliability. Removing one item from each of the constructs "Attitude," "Subjective norm," "Self-efficacy," or "Behavioral beliefs" could considerably increase the reliability of the measurement tool; however, such a step is debatable, especially for the variables "Attitude" and "Subjective norm." We therefore retain all the items of our first-draft questionnaire in our main study to keep it a reliable measurement tool.

  9. The adolescent child health and illness profile. A population-based measure of health.

    PubMed

    Starfield, B; Riley, A W; Green, B F; Ensminger, M E; Ryan, S A; Kelleher, K; Kim-Harris, S; Johnston, D; Vogel, K

    1995-05-01

    This study was designed to test the reliability and validity of an instrument to assess adolescent health status. Reliability and validity were examined by administration to adolescents (ages 11-17 years) in eight schools in two urban areas, one area in Appalachia, and one area in the rural South. Integrity of the domains and subdomains and construct validity were tested in all areas. Test/retest stability, criterion validity, and convergent and discriminant validity were tested in the two urban areas. Iterative testing has resulted in the final form of the CHIP-AE (Child Health and Illness Profile-Adolescent Edition) having 6 domains with 20 subdomains. The domains are Discomfort, Disorders, Satisfaction with Health, Achievement (of age-appropriate social roles), Risks, and Resilience. Tested aspects of reliability and validity have achieved acceptable levels for all retained subdomains. The CHIP-AE in its current form is suitable for assessing the health status of populations and subpopulations of adolescents. Evidence from test-retest stability analyses suggests that the CHIP-AE also can be used to assess changes occurring over time or in response to health services interventions targeted at groups of adolescents.

  10. A dynamic Thurstonian item response theory of motive expression in the picture story exercise: solving the internal consistency paradox of the PSE.

    PubMed

    Lang, Jonas W B

    2014-07-01

    The measurement of implicit or unconscious motives using the picture story exercise (PSE) has long been a target of debate in the psychological literature. Most debates have centered on the apparent paradox that PSE measures of implicit motives typically show low internal consistency reliability on common indices like Cronbach's alpha but nevertheless predict behavioral outcomes. I describe a dynamic Thurstonian item response theory (IRT) model that builds on dynamic system theories of motivation, theorizing on the PSE response process, and recent advancements in Thurstonian IRT modeling of choice data. To assess the models' capability to explain the internal consistency paradox, I first fitted the model to archival data (Gurin, Veroff, & Feld, 1957) and then simulated data based on bias-corrected model estimates from the real data. Simulation results revealed that the average squared correlation reliability for the motives in the Thurstonian IRT model was .74 and that Cronbach's alpha values were similar to the real data (<.35). These findings suggest that PSE motive measures have long been reliable and increase the scientific value of extant evidence from motivational research using PSE motive measures. (c) 2014 APA, all rights reserved.

  11. The influence of target erosion grade in the optoelectronic properties of AZO coatings growth by magnetron sputtering

    NASA Astrophysics Data System (ADS)

    Zubizarreta, C.; G-Berasategui, E.; Ciarsolo, I.; Barriga, J.; Gaspar, D.; Martins, R.; Fortunato, E.

    2016-09-01

    Aluminum-doped zinc oxide (AZO) transparent conductive coatings have emerged as a promising substitute for tin-doped indium oxide (ITO) electrodes in optoelectronic applications such as photovoltaics and light-emitting diodes (LEDs). Besides its high transmission in the visible spectral region and low resistivity, AZO presents a main advantage over other candidates such as graphene, carbon nanotubes or silver nanowires: it can be deposited using the technology already implemented industrially to manufacture ITO layers, magnetron sputtering (MS), a productive, reliable and green manufacturing technique. However, to guarantee the robustness, reproducibility and reliability of the process, some issues remain to be addressed, such as the effect and control of the target state. In this paper a thorough study of the influence of the target erosion grade on the developed coatings has been performed. AZO films have been deposited from a ceramic target by RF MS. The structure, optical transmittance and electrical properties of the produced coatings have been analyzed as a function of the target erosion grade. No noticeable differences were found in either the optoelectronic properties or the structure of the coatings, indicating that RF MS is a stable and consistent process throughout the whole life of the target.

  12. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    NASA Astrophysics Data System (ADS)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Here, improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space are presented, together with a demo application, and progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in the advanced biomedical engineering, biometric, intelligent computing, target recognition, content image retrieval, and data mining technological areas. An ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" for reliably computing optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape-invariant characteristics. Any sophisticated application needing effective, robust capture and parameterization of object geometric/colour invariant attributes for reliable automated object learning and discrimination can therefore benefit substantially from the GEOGINE progressive automated model generation computational kernel. 
Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, 3) Arbitrary Model Precision for robust object description and identification.

  13. A robust sub-pixel edge detection method of infrared image based on tremor-based retinal receptive field model

    NASA Astrophysics Data System (ADS)

    Gao, Kun; Yang, Hu; Chen, Xiaomei; Ni, Guoqiang

    2008-03-01

    Because of the complex thermal objects in an infrared image, the prevalent image edge detection operators are often suited only to certain scenes and sometimes extract overly wide edges. From a biological point of view, image edge detection operators work reliably when assuming a convolution-based receptive field architecture. A DoG (Difference-of-Gaussians) model filter based on the ON-center retinal ganglion cell receptive field architecture, with artificial eye tremors introduced, is proposed for image contour detection. To handle the blurred edges of an infrared image, orthogonal polynomial interpolation and sub-pixel edge detection within the rough-edge pixel neighborhood are then adopted to locate the rough edges at the sub-pixel level. Numerical simulations show that this method can locate the target edge accurately and robustly.
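    The ON-center DoG receptive field described above is a narrow excitatory Gaussian minus a wide inhibitory one. A minimal sketch of that filtering stage (the sigmas and the synthetic frame are assumptions; the tremor and sub-pixel interpolation stages of the paper are omitted):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def dog_response(image, sigma_center=1.0, sigma_surround=3.0):
        """Difference-of-Gaussians approximation of an ON-center receptive
        field: narrow excitatory center minus wide inhibitory surround."""
        center = gaussian_filter(image.astype(float), sigma_center)
        surround = gaussian_filter(image.astype(float), sigma_surround)
        return center - surround

    # Synthetic "infrared" frame: a warm square on a cool background.
    frame = np.zeros((64, 64))
    frame[24:40, 24:40] = 1.0
    response = dog_response(frame)
    ```

    The response is near zero inside uniform regions and peaks around the square's boundary, which is what makes the subsequent rough-edge localization possible.
    
    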

  14. The GMOseek matrix: a decision support tool for optimizing the detection of genetically modified plants.

    PubMed

    Block, Annette; Debode, Frédéric; Grohmann, Lutz; Hulin, Julie; Taverniers, Isabel; Kluga, Linda; Barbau-Piednoir, Elodie; Broeders, Sylvia; Huber, Ingrid; Van den Bulcke, Marc; Heinze, Petra; Berben, Gilbert; Busch, Ulrich; Roosens, Nancy; Janssen, Eric; Žel, Jana; Gruden, Kristina; Morisset, Dany

    2013-08-22

    Since their first commercialization, the diversity of taxa and the genetic composition of transgene sequences in genetically modified plants (GMOs) are constantly increasing. To date, the detection of GMOs and derived products is commonly performed by PCR-based methods targeting specific DNA sequences introduced into the host genome. Information available regarding the GMOs' molecular characterization is dispersed and not appropriately organized. For this reason, GMO testing is very challenging and requires more complex screening strategies and decision making schemes, demanding in return the use of efficient bioinformatics tools relying on reliable information. The GMOseek matrix was built as a comprehensive, online open-access tabulated database which provides a reliable, comprehensive and user-friendly overview of 328 GMO events and 247 different genetic elements (status: 18/07/2013). The GMOseek matrix is aiming to facilitate GMO detection from plant origin at different phases of the analysis. It assists in selecting the targets for a screening analysis, interpreting the screening results, checking the occurrence of a screening element in a group of selected GMOs, identifying gaps in the available pool of GMO detection methods, and designing a decision tree. The GMOseek matrix is an independent database with effective functionalities in a format facilitating transferability to other platforms. Data were collected from all available sources and experimentally tested where detection methods and certified reference materials (CRMs) were available. The GMOseek matrix is currently a unique and very valuable tool with reliable information on GMOs from plant origin and their present genetic elements that enables further development of appropriate strategies for GMO detection. It is flexible enough to be further updated with new information and integrated in different applications and platforms.

  15. Fully Passive Wireless Acquisition of Neuropotentials

    NASA Astrophysics Data System (ADS)

    Schwerdt, Helen N.

    The ability to monitor electrophysiological signals from the sentient brain is requisite to deciphering its enormously complex workings and initiating remedial solutions for the vast number of neurologically based disorders. Despite immense advances in instruments for recording signals from the brain, translating such neurorecording instrumentation to real clinical domains places heavy demands on safety and reliability, neither of which is entirely met by presently existing implantable recording solutions. In an attempt to lower these barriers, alternative wireless radar-backscattering techniques are proposed that reduce the technical burden of the implant chip to entirely passive neurorecording processes, operating without formal integrated power sources or powering schemes and without any active circuitry. These radar-like wireless backscattering mechanisms are used to realize fully passive neurorecording operations of an implantable microsystem. The fully passive device potentially offers inherent advantages over current wireless implantable and wired recording systems: negligible heat dissipation, reducing the risk of brain tissue damage, and minimal circuitry, for long-term reliability as a chronic implant. Fully passive neurorecording is realized via the intrinsic nonlinear mixing properties of the varactor diode. The mixing and recording operations are activated by wirelessly interrogating the fully passive device with a microwave carrier signal. This fundamental carrier signal, acquired by the implant antenna, mixes through the varactor diode with the internal targeted neuropotential brain signals to produce higher-frequency harmonics containing the targeted neuropotential signals. These harmonics are backscattered wirelessly to the external interrogator, which retrieves and recovers the original neuropotential brain signal. 
The passive approach removes the need for internal power sources and may alleviate heat trauma and reliability issues that limit practical implementation of existing implantable neurorecorders.
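    The frequency translation underlying this scheme can be sketched with a minimal second-order nonlinearity (an illustrative square-law sketch; the actual device characteristic and the specific harmonics used are those described in the source work):

    ```latex
    % A square-law term in the varactor characteristic, i(v) \propto v^2,
    % applied to the sum of the carrier and the neural signal
    v(t) = A\cos(2\pi f_c t) + a\cos(2\pi f_n t), \qquad a \ll A,
    % produces cross-products at the sum and difference frequencies:
    v^2(t) \supset A a \left[ \cos\!\big(2\pi (f_c + f_n) t\big)
                            + \cos\!\big(2\pi (f_c - f_n) t\big) \right].
    % The microvolt-level neuropotential at f_n is thus translated to
    % sidebands offset from the microwave carrier f_c, which the external
    % interrogator can demodulate from the backscattered signal.
    ```

    The implant itself contributes only the antenna and the diode; all signal generation and demodulation happen at the external interrogator, which is what permits the battery-free, circuitry-free operation described above.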

  16. Validating the European Health Literacy Survey Questionnaire in people with type 2 diabetes: Latent trait analyses applying multidimensional Rasch modelling and confirmatory factor analysis.

    PubMed

    Finbråten, Hanne Søberg; Pettersen, Kjell Sverre; Wilde-Larsson, Bodil; Nordström, Gun; Trollvik, Anne; Guttersrud, Øystein

    2017-11-01

    To validate the European Health Literacy Survey Questionnaire (HLS-EU-Q47) in people with type 2 diabetes mellitus. The HLS-EU-Q47 latent variable is outlined in a framework with four cognitive domains integrated in three health domains, implying 12 theoretically defined subscales. Valid and reliable health literacy measures are crucial to effectively adapt health communication and education to individuals and groups of patients. Cross-sectional study applying confirmatory latent trait analyses. Using a paper-and-pencil self-administered approach, 388 adults responded in March 2015. The data were analysed using the Rasch methodology and confirmatory factor analysis. Response violation (response dependency) and trait violation (multidimensionality) of local independence were identified. Fitting the "multidimensional random coefficients multinomial logit" model, 1-, 3- and 12-dimensional Rasch models were applied and compared. Poor model fit and differential item functioning were present in some items, and several subscales suffered from poor targeting and low reliability. Despite multidimensional data, we did not observe any unordered response categories. Interpreting the domains as distinct but related latent dimensions, the data fit a 12-dimensional Rasch model and a 12-factor confirmatory factor model best. Therefore, the analyses did not support the estimation of one overall "health literacy score." To support the plausibility of claims based on the HLS-EU score(s), we suggest: removing the health care aspect to reduce the magnitude of multidimensionality; rejecting redundant items to avoid response dependency; adding "harder" items and applying a six-point rating scale to improve subscale targeting and reliability; and revising items to improve model fit and avoid bias owing to person factors. © 2017 John Wiley & Sons Ltd.

  17. The GMOseek matrix: a decision support tool for optimizing the detection of genetically modified plants

    PubMed Central

    2013-01-01

    Background Since their first commercialization, the diversity of taxa and the genetic composition of transgene sequences in genetically modified plants (GMOs) are constantly increasing. To date, the detection of GMOs and derived products is commonly performed by PCR-based methods targeting specific DNA sequences introduced into the host genome. Information available regarding the GMOs’ molecular characterization is dispersed and not appropriately organized. For this reason, GMO testing is very challenging and requires more complex screening strategies and decision making schemes, demanding in return the use of efficient bioinformatics tools relying on reliable information. Description The GMOseek matrix was built as a comprehensive, online open-access tabulated database which provides a reliable, comprehensive and user-friendly overview of 328 GMO events and 247 different genetic elements (status: 18/07/2013). The GMOseek matrix is aiming to facilitate GMO detection from plant origin at different phases of the analysis. It assists in selecting the targets for a screening analysis, interpreting the screening results, checking the occurrence of a screening element in a group of selected GMOs, identifying gaps in the available pool of GMO detection methods, and designing a decision tree. The GMOseek matrix is an independent database with effective functionalities in a format facilitating transferability to other platforms. Data were collected from all available sources and experimentally tested where detection methods and certified reference materials (CRMs) were available. Conclusions The GMOseek matrix is currently a unique and very valuable tool with reliable information on GMOs from plant origin and their present genetic elements that enables further development of appropriate strategies for GMO detection. It is flexible enough to be further updated with new information and integrated in different applications and platforms. PMID:23965170

  18. Automated batch fiducial-less tilt-series alignment in Appion using Protomo.

    PubMed

    Noble, Alex J; Stagg, Scott M

    2015-11-01

    The field of electron tomography has benefited greatly from manual and semi-automated approaches to marker-based tilt-series alignment that have allowed for the structural determination of multitudes of in situ cellular structures as well as macromolecular structures of individual protein complexes. The emergence of complementary metal-oxide semiconductor detectors capable of detecting individual electrons has enabled the collection of low dose, high contrast images, opening the door for reliable correlation-based tilt-series alignment. Here we present a set of automated, correlation-based tilt-series alignment, contrast transfer function (CTF) correction, and reconstruction workflows for use in conjunction with the Appion/Leginon package that are primarily targeted at automating structure determination with cryogenic electron microscopy. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.

    PubMed

    Nicklas, Janice A; Buel, Eric

    2005-09-01

    The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR-based method offers the possibility of a faster and more quantitative assay. Alu sequences are primate-specific and are present in many copies in the human genome, making them an excellent target, or marker, for human DNA. This paper describes the development of a real-time Alu-sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on time and automated quantitation, as well as a large dynamic range (128 ng/microL to 0.5 pg/microL).
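    Quantitation in a real-time PCR assay rests on the linear relation between the threshold cycle (Ct) and the log of the starting DNA amount, fitted from a serial dilution of a standard. A sketch of the standard-curve arithmetic, with hypothetical calibration values chosen for illustration (not the paper's data):

    ```python
    import math

    # Hypothetical calibration points: (starting DNA in ng/uL, observed Ct).
    standards = [(128.0, 14.1), (12.8, 17.5), (1.28, 20.9), (0.128, 24.3)]

    # Least-squares fit of Ct = slope * log10(quantity) + intercept.
    xs = [math.log10(q) for q, _ in standards]
    ys = [ct for _, ct in standards]
    n = len(standards)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean

    def quantify(ct):
        """Invert the standard curve to estimate starting DNA (ng/uL)."""
        return 10 ** ((ct - intercept) / slope)

    # Amplification efficiency: ~1.0 means perfect doubling each cycle.
    efficiency = 10 ** (-1 / slope) - 1
    ```

    An unknown sample's Ct is then converted to a concentration with `quantify`, which is what replaces the visual read-out of the slot blot with an automated number.
    
    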

  20. Motion-compensated optical coherence tomography using envelope-based surface detection and Kalman-based prediction

    NASA Astrophysics Data System (ADS)

    Irsch, Kristina; Lee, Soohyun; Bose, Sanjukta N.; Kang, Jin U.

    2018-02-01

    We present an optical coherence tomography (OCT) imaging system that effectively compensates unwanted axial motion with micron-scale accuracy. The OCT system is based on a swept-source (SS) engine (1060-nm center wavelength, 100-nm full-width sweeping bandwidth, and 100-kHz repetition rate), with axial and lateral resolutions of about 4.5 and 8.5 microns, respectively. The SS-OCT system incorporates a distance sensing method utilizing an envelope-based surface detection algorithm. The algorithm locates the target surface from the B-scans, taking into account not just the first or highest peak but the entire signature of sequential A-scans. Subsequently, a Kalman filter is applied as a predictor to make up for system latencies, before sending the calculated position information to control a linear motor, adjusting and maintaining a fixed system-target distance. To test system performance, the motion-correction algorithm was compared to earlier, more basic peak-based surface detection methods and to performing no motion compensation. Results demonstrate increased robustness and reproducibility with the novel technique, particularly noticeable in multilayered tissues. Implementing such motion compensation into clinical OCT systems may thus improve the reliability of objective and quantitative information that can be extracted from OCT measurements.
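
    The prediction step can be illustrated with a textbook one-dimensional Kalman filter: smooth noisy surface-depth measurements and predict the position one frame ahead to compensate actuation latency. The state model, noise covariances, and frame interval below are assumptions for the sketch, not the authors' parameters.

```python
import numpy as np

# Constant-velocity Kalman filter sketch for one-step-ahead prediction
# of a surface position (illustrative parameters throughout).

dt = 0.01                                  # frame interval, s (assumed)
F = np.array([[1, dt], [0, 1]])            # state transition (pos, vel)
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = np.diag([1e-4, 1e-2])                  # process noise (assumed)
R = np.array([[0.0025]])                   # measurement noise var (assumed)

x = np.zeros((2, 1))                       # initial state estimate
P = np.eye(2)                              # initial covariance

def kalman_step(z):
    """One predict/update cycle; returns the one-step-ahead prediction."""
    global x, P
    x, P = F @ x, F @ P @ F.T + Q                      # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain
    x = x + K @ (np.array([[z]]) - H @ x)              # update
    P = (np.eye(2) - K @ H) @ P
    return float((H @ F @ x)[0, 0])                    # position, next frame

# Track a noisy sinusoidal "axial motion" trace.
rng = np.random.default_rng(0)
true = 1.0 + 0.5 * np.sin(2 * np.pi * 0.3 * np.arange(200) * dt)
preds = [kalman_step(z) for z in true + rng.normal(0, 0.05, 200)]
```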

  1. Trunk postural adjustments: Medium-term reliability and correlation with changes of clinical outcomes following an 8-week lumbar stabilization exercise program.

    PubMed

    Boucher, Jean-Alexandre; Preuss, Richard; Henry, Sharon M; Nugent, Marilee; Larivière, Christian

    2018-04-22

    Low back pain (LBP) has been previously associated with delayed anticipatory postural adjustments (APAs) determined by trunk muscle activation. Lumbar stabilization exercise programs (LSEP) for patients with LBP may restore the trunk neuromuscular control of the lumbar spine and normalize APAs. This exploratory study aimed at testing the reliability of EMG- and kinematics-based postural adjustment measures over an 8-week interval, assessing their sensitivity to LBP status and treatment, and examining their relationship with clinical outcomes. Muscle activation of 10 trunk muscles, using surface electromyography (EMG), and lumbar angular kinematics were recorded during a rapid arm-raising/lowering task. Patients with LBP were tested before and after an 8-week LSEP. Healthy controls receiving no treatment were assessed over the same interval to determine the reliability of the measures and act as a control group at baseline. Muscle activation onsets and reactive range of motion, range of velocities and accelerations were assessed for between-group differences at baseline and pre- to post-treatment effects within patients with LBP using t-tests. Correlations between these dependent variables and the change of clinical outcomes (pain, disability) over treatment were also explored. Kinematics-based measures showed comparable reliability to EMG-based measures. Between-group differences were found in lumbar lateral flexion ROM at baseline (patients < controls). In the patients with LBP, lateral flexion velocity and acceleration significantly increased following the LSEP. Correlational analyses revealed that lumbar angular kinematics were more sensitive to changes in pain intensity following the LSEP compared to EMG measures. These findings are interpreted from the perspective of guarding behaviors and lumbar stability hypotheses.
Future clinical trials are needed to target patients with and without delayed APAs at baseline and to explore the sensitivity of different outcome measures related to APAs. Different tasks more challenging to postural stability may need to be explored to more effectively reveal APA dysfunction. Copyright © 2018. Published by Elsevier Ltd.

  2. Simultaneous quantification of protein phosphorylation sites using liquid chromatography-tandem mass spectrometry-based targeted proteomics: a linear algebra approach for isobaric phosphopeptides.

    PubMed

    Xu, Feifei; Yang, Ting; Sheng, Yuan; Zhong, Ting; Yang, Mi; Chen, Yun

    2014-12-05

    As one of the most studied post-translational modifications (PTM), protein phosphorylation plays an essential role in almost all cellular processes. Current methods are able to predict and determine thousands of phosphorylation sites, whereas stoichiometric quantification of these sites is still challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics is emerging as a promising technique for site-specific quantification of protein phosphorylation using proteolytic peptides as surrogates of proteins. However, several issues may limit its application, one of which relates to the phosphopeptides with different phosphorylation sites and the same mass (i.e., isobaric phosphopeptides). While employment of site-specific product ions allows for these isobaric phosphopeptides to be distinguished and quantified, site-specific product ions are often absent or weak in tandem mass spectra. In this study, linear algebra algorithms were employed as an add-on to targeted proteomics to retrieve information on individual phosphopeptides from their common spectra. To achieve this simultaneous quantification, a LC-MS/MS-based targeted proteomics assay was first developed and validated for each phosphopeptide. Given the slope and intercept of calibration curves of phosphopeptides in each transition, linear algebraic equations were developed. Using a series of mock mixtures prepared with varying concentrations of each phosphopeptide, the reliability of the approach to quantify isobaric phosphopeptides containing multiple phosphorylation sites (≥ 2) was discussed. Finally, we applied this approach to determine the phosphorylation stoichiometry of heat shock protein 27 (HSP27) at Ser78 and Ser82 in breast cancer cells and tissue samples.
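
    The core idea, expressed here with invented calibration numbers, is an (over)determined linear system: each transition's observed signal is modeled as the sum of each co-isolated phosphopeptide's calibration response, so subtracting the intercepts and solving in the least-squares sense recovers the individual concentrations.

```python
import numpy as np

# Hedged sketch of the linear-algebra approach for isobaric
# phosphopeptides. Slopes/intercepts are invented for illustration; in
# practice they come from per-peptide calibration curves per transition.

slopes = np.array([[0.80, 0.20],     # transition 1: pSer78, pSer82
                   [0.15, 0.90],     # transition 2
                   [0.50, 0.45]])    # transition 3 (overdetermined)
intercepts = np.array([0.02, 0.01, 0.03])

true_conc = np.array([2.0, 5.0])     # fmol, to build a synthetic "measurement"
observed = slopes @ true_conc + intercepts

# Subtract intercepts and solve the linear system by least squares.
conc, *_ = np.linalg.lstsq(slopes, observed - intercepts, rcond=None)
print(conc)
```

    With noisy real spectra the system is solved in the same way; the extra transition rows make the estimate more robust than an exactly determined system.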

  3. A novel vehicle tracking algorithm based on mean shift and active contour model in complex environment

    NASA Astrophysics Data System (ADS)

    Cai, Lei; Wang, Lin; Li, Bo; Zhang, Libao; Lv, Wen

    2017-06-01

    Vehicle tracking technology is currently one of the most active research topics in machine vision. It is an important part of intelligent transportation systems. However, in theory and technology, it still faces many challenges, including real-time performance and robustness. In video surveillance, targets need to be detected in real time and their positions calculated accurately in order to judge their motion. The contents of video sequence images and the target motion are complex, so the objects cannot be expressed by a unified mathematical model. Object tracking is defined as locating the moving target of interest in each frame of a piece of video. Current tracking technology can achieve reliable results in simple environments for targets with easily identified characteristics. However, in more complex environments, it is easy to lose the target because of the mismatch between the target appearance and its dynamic model. Moreover, the target usually has a complex shape, but traditional target tracking algorithms usually represent the tracking results by simple geometric shapes such as rectangles or circles, so they cannot provide accurate information for subsequent upper-level applications. This paper combines a traditional object-tracking technique, the Mean-Shift algorithm, with an image segmentation algorithm, the Active-Contour model, to obtain the outlines of objects during tracking and automatically handle topology changes. Meanwhile, the outline information is used to aid the tracking algorithm and improve its performance.
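
    A minimal sketch of the Mean-Shift half of the combination (illustrative, not the paper's implementation): the window center is repeatedly moved to the weighted centroid of a target-likelihood map, for example a histogram back-projection, until it settles on the local mode.

```python
import numpy as np

# Mean-shift localization over a per-pixel weight map (sketch).

def mean_shift(weights, cx, cy, half=10, iters=30):
    """Shift a (2*half+1)^2 window to its weighted centroid until
    the integer center stops moving; returns the final (cx, cy)."""
    h, w = weights.shape
    for _ in range(iters):
        x0, x1 = max(cx - half, 0), min(cx + half + 1, w)
        y0, y1 = max(cy - half, 0), min(cy + half + 1, h)
        win = weights[y0:y1, x0:x1]
        total = win.sum()
        if total == 0:
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        nx = int((xs * win).sum() / total + 0.5)   # rounded centroid
        ny = int((ys * win).sum() / total + 0.5)
        if (nx, ny) == (cx, cy):
            break
        cx, cy = nx, ny
    return cx, cy

# Synthetic weight map with a blob at (70, 40); start the search at (50, 50).
yy, xx = np.mgrid[0:100, 0:100]
wmap = np.exp(-((xx - 70) ** 2 + (yy - 40) ** 2) / (2 * 8 ** 2))
print(mean_shift(wmap, 50, 50))
```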

  4. Performance evaluation of non-targeted peak-based cross-sample analysis for comprehensive two-dimensional gas chromatography-mass spectrometry data and application to processed hazelnut profiling.

    PubMed

    Kiefl, Johannes; Cordero, Chiara; Nicolotti, Luca; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo

    2012-06-22

    The continuous interest in non-targeted profiling induced the development of tools for automated cross-sample analysis. Such tools were found to be selective or not comprehensive thus delivering a biased view on the qualitative/quantitative peak distribution across 2D sample chromatograms. Therefore, the performance of non-targeted approaches needs to be critically evaluated. This study focused on the development of a validation procedure for non-targeted, peak-based, GC×GC-MS data profiling. The procedure introduced performance parameters such as specificity, precision, accuracy, and uncertainty for a profiling method known as Comprehensive Template Matching. The performance was assessed by applying a three-week validation protocol based on CITAC/EURACHEM guidelines. Optimized ¹D and ²D retention times search windows, MS match factor threshold, detection threshold, and template threshold were evolved from two training sets by a semi-automated learning process. The effectiveness of proposed settings to consistently match 2D peak patterns was established by evaluating the rate of mismatched peaks and was expressed in terms of results accuracy. The study utilized 23 different 2D peak patterns providing the chemical fingerprints of raw and roasted hazelnuts (Corylus avellana L.) from different geographical origins, of diverse varieties and different roasting degrees. The validation results show that non-targeted peak-based profiling can be reliable with error rates lower than 10% independent of the degree of analytical variance. The optimized Comprehensive Template Matching procedure was employed to study hazelnut roasting profiles and in particular to find marker compounds strongly dependent on the thermal treatment, and to establish the correlation of potential marker compounds to geographical origin and variety/cultivar and finally to reveal the characteristic release of aroma active compounds. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. Identification of Target Complaints by Computer Interview: Evaluation of the Computerized Assessment System for Psychotherapy Evaluation and Research.

    ERIC Educational Resources Information Center

    Farrell, Albert D.; And Others

    1987-01-01

    Evaluated computer interview to standardize collection of target complaints. Adult outpatients (N=103) completed computer interview, unstructured intake interview, Symptoms Checklist-90, and Minnesota Multiphasic Personality Inventory. Results provided support for the computer interview in regard to reliability and validity though there was low…

  6. Validation of Heavy Ion Transport Capabilities in PHITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronningen, Reginald M.

    The performance of the Monte Carlo code system PHITS is validated for heavy ion transport capabilities by performing simulations and comparing results against experimental data from heavy ion reactions of benchmark quality. These data are from measurements of secondary neutron production cross sections in reactions of Xe at 400 MeV/u with lithium and lead targets, measurements of neutrons outside of thick concrete and iron shields, and measurements of isotope yields produced in the fragmentation of a 140 MeV/u 48Ca beam on a beryllium target and on a tantalum target. A practical example that tests magnetic field capabilities is shown for a simulated 48Ca beam at 500 MeV/u striking a lithium target to produce the rare isotope 44Si, with ion transport through a fragmentation-reaction magnetic pre-separator. The results of this study show that PHITS performs reliably for the simulation of radiation fields that is necessary for designing safe, reliable and cost effective future high-powered heavy-ion accelerators in rare isotope beam facilities.

  7. Examining the robustness of automated aural classification of active sonar echoes.

    PubMed

    Murphy, Stefan M; Hines, Paul C

    2014-02-01

    Active sonar systems are used to detect underwater man-made objects of interest (targets) that are too quiet to be reliably detected with passive sonar. Performance of active sonar can be degraded by false alarms caused by echoes returned from geological seabed structures (clutter) in shallow regions. To reduce false alarms, a method of distinguishing target echoes from clutter echoes is required. Research has demonstrated that perceptual-based signal features similar to those employed in the human auditory system can be used to automatically discriminate between target and clutter echoes, thereby reducing the number of false alarms and improving sonar performance. An active sonar experiment on the Malta Plateau in the Mediterranean Sea was conducted during the Clutter07 sea trial and repeated during the Clutter09 sea trial. The dataset consists of more than 95,000 pulse-compressed echoes returned from two targets and many geological clutter objects. These echoes were processed using an automatic classifier that quantifies the timbre of each echo using a number of perceptual signal features. Using echoes from 2007, the aural classifier was trained to establish a boundary between targets and clutter in the feature space. Temporal robustness was then investigated by testing the classifier on echoes from the 2009 experiment.

  8. New glycoproteomics software, GlycoPep Evaluator, generates decoy glycopeptides de novo and enables accurate false discovery rate analysis for small data sets.

    PubMed

    Zhu, Zhikai; Su, Xiaomeng; Go, Eden P; Desaire, Heather

    2014-09-16

    Glycoproteins are biologically significant large molecules that participate in numerous cellular activities. In order to obtain site-specific protein glycosylation information, intact glycopeptides, with the glycan attached to the peptide sequence, are characterized by tandem mass spectrometry (MS/MS) methods such as collision-induced dissociation (CID) and electron transfer dissociation (ETD). While several automated tools are emerging, there is no consensus in the field about the best way to determine the reliability of the tools and/or provide the false discovery rate (FDR). A common approach to calculate FDRs for glycopeptide analysis, adopted from the target-decoy strategy in proteomics, employs a decoy database that is created based on the target protein sequence database. Nonetheless, this approach is not optimal in measuring the confidence of N-linked glycopeptide matches, because the glycopeptide data set is considerably smaller compared to that of peptides, and the requirement of a consensus sequence for N-glycosylation further limits the number of possible decoy glycopeptides tested in a database search. To address the need to accurately determine FDRs for automated glycopeptide assignments, we developed GlycoPep Evaluator (GPE), a tool that helps to measure FDRs in identifying glycopeptides without using a decoy database. GPE generates decoy glycopeptides de novo for every target glycopeptide, in a 1:20 target-to-decoy ratio. The decoys, along with target glycopeptides, are scored against the ETD data, from which FDRs can be calculated accurately, based on the number of decoy matches and the ratio of the number of targets to decoys, for small data sets. GPE is freely accessible for download and can work with any search engine that interprets ETD data of N-linked glycopeptides. The software is provided at https://desairegroup.ku.edu/research.
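
    The decoy-based FDR arithmetic this strategy relies on reduces to scaling the decoy hit count by the decoy-to-target ratio before dividing by the target hit count. A sketch with invented counts:

```python
# FDR estimation with decoys generated at 20 decoys per target (sketch;
# the counts below are invented for illustration).

def estimated_fdr(n_target_hits, n_decoy_hits, decoy_to_target=20):
    """FDR ~ (decoy hits / decoy-to-target ratio) / target hits."""
    if n_target_hits == 0:
        return 0.0
    return (n_decoy_hits / decoy_to_target) / n_target_hits

# 50 target glycopeptides and 20 decoys pass a score threshold:
# 20/20 = 1 expected false hit among 50 targets.
print(estimated_fdr(50, 20))
```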

  9. Relative sensitivity of conventional and real-time PCR assays for detection of SFG Rickettsia in blood and tissue samples from laboratory animals.

    PubMed

    Zemtsova, Galina E; Montgomery, Merrill; Levin, Michael L

    2015-01-01

    Studies on the natural transmission cycles of zoonotic pathogens and the reservoir competence of vertebrate hosts require methods for reliable diagnosis of infection in wild and laboratory animals. Several PCR-based applications have been developed for detection of infections caused by Spotted Fever group Rickettsia spp. in a variety of animal tissues. These assays are being widely used by researchers, but they differ in their sensitivity and reliability. We compared the sensitivity of five previously published conventional PCR assays and one SYBR green-based real-time PCR assay for the detection of rickettsial DNA in blood and tissue samples from Rickettsia-infected laboratory animals (n = 87). The real-time PCR, which detected rickettsial DNA in 37.9% of samples, was the most sensitive. The next best were the semi-nested ompA assay and rpoB conventional PCR, which detected 18.4% and 14.9% of samples as positive, respectively. Conventional assays targeting the ompB, gltA and hrtA genes were the least sensitive. Therefore, we recommend the SYBR green-based real-time PCR as a tool for the detection of rickettsial DNA in animal samples due to its higher sensitivity when compared to more traditional assays.

  10. Relative Sensitivity of Conventional and Real-Time PCR Assays for Detection of SFG Rickettsia in Blood and Tissue Samples from Laboratory Animals

    PubMed Central

    Zemtsova, Galina E.; Montgomery, Merrill; Levin, Michael L.

    2015-01-01

    Studies on the natural transmission cycles of zoonotic pathogens and the reservoir competence of vertebrate hosts require methods for reliable diagnosis of infection in wild and laboratory animals. Several PCR-based applications have been developed for detection of infections caused by Spotted Fever group Rickettsia spp. in a variety of animal tissues. These assays are being widely used by researchers, but they differ in their sensitivity and reliability. We compared the sensitivity of five previously published conventional PCR assays and one SYBR green-based real-time PCR assay for the detection of rickettsial DNA in blood and tissue samples from Rickettsia-infected laboratory animals (n = 87). The real-time PCR, which detected rickettsial DNA in 37.9% of samples, was the most sensitive. The next best were the semi-nested ompA assay and rpoB conventional PCR, which detected 18.4% and 14.9% of samples as positive, respectively. Conventional assays targeting the ompB, gltA and hrtA genes were the least sensitive. Therefore, we recommend the SYBR green-based real-time PCR as a tool for the detection of rickettsial DNA in animal samples due to its higher sensitivity when compared to more traditional assays. PMID:25607846

  11. Deterministic and reliability based optimization of integrated thermal protection system composite panel using adaptive sampling techniques

    NASA Astrophysics Data System (ADS)

    Ravishankar, Bharani

    Conventional space vehicles have thermal protection systems (TPS) that provide protection to an underlying structure that carries the flight loads. In an attempt to save weight, there is interest in an integrated TPS (ITPS) that combines the structural function and the TPS function. This has weight-saving potential, but complicates the design of the ITPS, which now has both thermal and structural failure modes. The main objective of this dissertation was to optimally design the ITPS subjected to thermal and mechanical loads through deterministic and reliability-based optimization. The optimization of the ITPS structure requires computationally expensive finite element analyses of a 3D ITPS (solid) model. To reduce the computational expense involved in the structural analysis, a finite element based homogenization method was employed, homogenizing the 3D ITPS model to a 2D orthotropic plate. However, it was found that homogenization was applicable only for panels that are much larger than the characteristic dimensions of the repeating unit cell in the ITPS panel. Hence a single unit cell was used for the optimization process to reduce the computational cost. Deterministic and probabilistic optimization of the ITPS panel required evaluation of failure constraints at various design points. This further demands computationally expensive finite element analyses, which were replaced by efficient, low-fidelity surrogate models. In an optimization process, it is important to represent the constraints accurately to find the optimum design. Instead of building global surrogate models using a large number of designs, the computational resources were directed toward target regions near constraint boundaries for accurate representation of the constraints using adaptive sampling strategies. Efficient Global Reliability Analysis (EGRA) facilitates sequential sampling of design points around the region of interest in the design space. 
EGRA was applied to the response surface construction of the failure constraints in the deterministic and reliability-based optimization of the ITPS panel. It was shown that using adaptive sampling, the number of designs required to find the optimum was reduced drastically, while improving accuracy. The system reliability of the ITPS was estimated using a Monte Carlo simulation (MCS) based method. The separable Monte Carlo method was employed, which allowed separable sampling of the random variables to predict the probability of failure accurately. The reliability analysis considered uncertainties in the geometry, material properties, and loading conditions of the panel, and error in finite element modeling. These uncertainties further increased the computational cost of the MCS techniques, which was also reduced by employing surrogate models. In order to estimate the error in the probability-of-failure estimate, the bootstrapping method was applied. This research work thus demonstrates optimization of the ITPS composite panel with multiple failure modes and a large number of uncertainties using adaptive sampling techniques.
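
    The Monte Carlo portion of such a reliability analysis can be sketched on a toy limit state; the distributions below are illustrative stand-ins, not the ITPS model, and the bootstrap attaches an error bar to the probability-of-failure estimate as described above.

```python
import numpy as np

# Monte Carlo probability of failure on a toy limit state
# g = capacity - demand, plus a bootstrap standard error (sketch).

rng = np.random.default_rng(42)
n = 100_000
capacity = rng.normal(10.0, 1.0, n)       # e.g. allowable stress (assumed)
demand   = rng.normal(6.5, 1.2, n)        # e.g. applied stress (assumed)
failed = (capacity - demand) < 0          # failure when g < 0

pf = failed.mean()                        # probability-of-failure estimate

# Bootstrap the failure indicator to estimate the standard error of pf.
boots = [rng.choice(failed, n, replace=True).mean() for _ in range(200)]
se = np.std(boots)
print(f"pf ~ {pf:.4f} +/- {se:.4f}")
```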

  12. The Yale-Brown Obsessive Compulsive Scale: A Reliability Generalization Meta-Analysis.

    PubMed

    López-Pina, José Antonio; Sánchez-Meca, Julio; López-López, José Antonio; Marín-Martínez, Fulgencio; Núñez-Núñez, Rosa Maria; Rosa-Alcázar, Ana I; Gómez-Conesa, Antonia; Ferrer-Requena, Josefa

    2015-10-01

    The Yale-Brown Obsessive Compulsive Scale (Y-BOCS) is the most frequently applied test to assess obsessive compulsive symptoms. We conducted a reliability generalization meta-analysis on the Y-BOCS to estimate the average reliability, examine the variability among the reliability estimates, search for moderators, and propose a predictive model that researchers and clinicians can use to estimate the expected reliability of the Y-BOCS. We included studies where the Y-BOCS was applied to a sample of adults and reliability estimate was reported. Out of the 11,490 references located, 144 studies met the selection criteria. For the total scale, the mean reliability was 0.866 for coefficients alpha, 0.848 for test-retest correlations, and 0.922 for intraclass correlations. The moderator analyses led to a predictive model where the standard deviation of the total test and the target population (clinical vs. nonclinical) explained 38.6% of the total variability among coefficients alpha. Finally, clinical implications of the results are discussed. © The Author(s) 2014.

  13. A novel approach to simulate chest wall micro-motion for bio-radar life detection purpose

    NASA Astrophysics Data System (ADS)

    An, Qiang; Li, Zhao; Liang, Fulai; Chen, Fuming; Wang, Jianqi

    2016-10-01

    Volunteers are often recruited to serve as the detection targets during the research process of bio-radar life detection technology, in which the experiment results are highly susceptible to the physical status of different individuals (shape, posture, etc.). In order to objectively evaluate the radar system performance and life detection algorithms, a standard detection target is urgently needed. The paper first proposed a parameter quantitatively controllable system to simulate the chest wall micro-motion caused mainly by breathing and heart beating. Then, the paper continued to analyze the material and size selection of the scattering body mounted on the simulation system from the perspective of back scattering energy. The computational electromagnetic method was employed to determine the exact scattering body. Finally, on-site experiments were carried out to verify the reliability of the simulation platform utilizing an IR UWB bioradar. Experimental result shows that the proposed system can simulate a real human target from three aspects: respiration frequency, amplitude and body surface scattering energy. Thus, it can be utilized as a substitute for a human target in radar based non-contact life detection research in various scenarios.
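
    The displacement model such a simulator reproduces is, at its simplest, a superposition of respiration and heartbeat components. The frequencies and amplitudes below are typical physiological values assumed for illustration, not the paper's controller parameters.

```python
import numpy as np

# Synthetic chest-wall micro-motion: respiration plus heartbeat (sketch).

fs = 100.0                                  # sample rate, Hz (assumed)
t = np.arange(0, 30, 1 / fs)                # 30 s record

resp_f, resp_amp = 0.3, 4.0e-3              # ~18 breaths/min, ~4 mm
heart_f, heart_amp = 1.2, 0.3e-3            # ~72 bpm, ~0.3 mm

x = resp_amp * np.sin(2 * np.pi * resp_f * t) \
    + heart_amp * np.sin(2 * np.pi * heart_f * t)

# The dominant spectral peak should sit at the respiration frequency.
spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), 1 / fs)
print(freqs[spec.argmax()])
```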

  14. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    NASA Astrophysics Data System (ADS)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with some development and optimization, higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis of more than duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  15. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection.

    PubMed

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-14

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with some development and optimization, higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis of more than duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
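
    Behind any droplet digital PCR quantification, including these multiplex assays, sits standard Poisson partitioning statistics: the mean number of target copies per droplet follows from the fraction of negative droplets. A sketch of that arithmetic (the droplet volume is a nominal assumption, not a value from the paper):

```python
import math

# Poisson-based ddPCR quantification (standard ddPCR math, sketch).

def copies_per_ul(positive, total, droplet_vol_nl=0.85):
    """Estimate target concentration from droplet counts.

    droplet_vol_nl: nominal droplet volume in nL (treated here as an
    assumption; the instrument vendor specifies the actual value).
    """
    neg_fraction = (total - positive) / total
    lam = -math.log(neg_fraction)            # mean copies per droplet
    return lam / (droplet_vol_nl * 1e-3)     # copies per microliter

# Example: 4,000 positive droplets out of 18,000 accepted.
print(copies_per_ul(4000, 18000))
```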

  16. A computational framework for modeling targets as complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  17. Next generation diagnostic molecular pathology: critical appraisal of quality assurance in Europe.

    PubMed

    Dubbink, Hendrikus J; Deans, Zandra C; Tops, Bastiaan B J; van Kemenade, Folkert J; Koljenović, S; van Krieken, Han J M; Blokx, Willeke A M; Dinjens, Winand N M; Groenen, Patricia J T A

    2014-06-01

    Tumor evaluation in pathology is more and more based on a combination of traditional histopathology and molecular analysis. Due to the rapid development of new cancer treatments that specifically target aberrant proteins present in tumor cells, treatment decisions are increasingly based on the molecular features of the tumor. Not only the number of patients eligible for targeted precision medicine, but also the number of molecular targets per patient and tumor type is rising. Diagnostic molecular pathology, the discipline that determines the molecular aberrations present in tumors for diagnostic, prognostic or predictive purposes, is faced with true challenges. The laboratories have to meet the need of comprehensive molecular testing using only limited amount of tumor tissue, mostly fixed in formalin and embedded in paraffin (FFPE), in short turnaround time. Choices must be made for analytical methods that provide accurate, reliable and cost-effective results. Validation of the test procedures and results is essential. In addition, participation and good performance in internal (IQA) and external quality assurance (EQA) schemes is mandatory. In this review, we critically evaluate the validation procedure for comprehensive molecular tests as well as the organization of quality assurance and assessment of competence of diagnostic molecular pathology laboratories within Europe. Copyright © 2014 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  18. A computational theory for the classification of natural biosonar targets based on a spike code.

    PubMed

    Müller, Rolf

    2003-08-01

    A computational theory for the classification of natural biosonar targets is developed based on the properties of an example stimulus ensemble. An extensive set of echoes (84 800) from four different foliages was transcribed into a spike code using a parsimonious model (linear filtering, half-wave rectification, thresholding). The spike code is assumed to consist of time differences (interspike intervals) between threshold crossings. Among the elementary interspike intervals flanked by exceedances of adjacent thresholds, a few intervals triggered by disjoint half-cycles of the carrier oscillation stand out in terms of resolvability, visibility across resolution scales and a simple stochastic structure (uncorrelatedness). They are therefore argued to be a stochastic analogue to edges in vision. A three-dimensional feature vector representing these interspike intervals sustained a reliable target classification performance (0.06% classification error) in a sequential probability ratio test, which models sequential processing of echo trains by biological sonar systems. The dimensions of the representation are the first moments of duration and amplitude location of these interspike intervals as well as their number. All three quantities are readily reconciled with known principles of neural signal representation, since they correspond to the centre of gravity of excitation on a neural map and the total amount of excitation.
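
    The sequential probability ratio test used here to model echo-train processing can be sketched with Wald's classic thresholds: per-echo log-likelihood ratios are accumulated until one of two error-rate-derived bounds is crossed. The one-dimensional Gaussian class models below are invented for illustration, not the paper's three-dimensional feature representation.

```python
import math
import random

# Wald sequential probability ratio test on a stream of scalar
# features (sketch with assumed class parameters).

def sprt(observations, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.001, beta=0.001):
    """Return ('H0'/'H1'/'undecided', number of echoes consumed)."""
    upper = math.log((1 - beta) / alpha)       # accept H1 above this
    lower = math.log(beta / (1 - alpha))       # accept H0 below this
    llr = 0.0
    for n, x in enumerate(observations, 1):
        # log of N(x; mu1, sigma) / N(x; mu0, sigma)
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n

random.seed(1)
echoes = (random.gauss(1.0, 1.0) for _ in range(500))   # stream from class H1
print(sprt(echoes))
```

    With well-separated classes the test typically terminates after a handful of echoes, which is the appeal of sequential processing for biological and engineered sonar alike.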

  19. Identification of the Geographic Origin of Parmigiano Reggiano (P.D.O.) Cheeses Deploying Non-Targeted Mass Spectrometry and Chemometrics.

    PubMed

    Popping, Bert; De Dominicis, Emiliano; Dante, Mario; Nocetti, Marco

    2017-02-16

    Parmigiano Reggiano is an Italian product with a protected designation of origin (P.D.O.). It is an aged hard cheese made from raw milk. P.D.O. products are protected by European regulations. Approximately 3 million wheels are produced each year, and the product commands a substantial premium price due to its quality and its typicity, well known around the world. Because demand exceeds production, fraudulent products can be found on the market. The rate of fraud is estimated at between 20% and 40%, the higher figure predominantly in the grated form. We have developed a non-targeted method based on Liquid Chromatography-High Resolution Mass Spectrometry (LC-HRMS) that allows the discrimination of Parmigiano Reggiano from non-authentic products made with milk of different geographical origins, or products for which other aspects of the production process do not comply with the rules laid down in the production specifications for Parmigiano Reggiano. Based on a database created with authentic samples provided by the Consortium of Parmigiano Reggiano Cheese, a reliable classification model was built. The overall classification capabilities of this non-targeted method were verified on 32 grated cheese samples. The classification was 87.5% accurate.

  20. Key enzymes and proteins of crop insects as candidate for RNAi based gene silencing

    PubMed Central

    Kola, Vijaya Sudhakara Rao; Renuka, P.; Madhav, Maganti Sheshu; Mangrauthia, Satendra K.

    2015-01-01

    RNA interference (RNAi) is a mechanism of homology-dependent gene silencing present in plants and animals. It operates through 21–24 nucleotide small RNAs which are processed by a set of core enzymatic machinery that involves Dicer and Argonaute proteins. In the recent past, the technology has been widely applied to the control of plant pathogens and insects through suppression of key genes/proteins of the infecting organisms. The genes encoding key enzymes/proteins with great potential for developing effective insect control by the RNAi approach are acetylcholinesterase, cytochrome P450 enzymes, aminopeptidase N, allatostatin, allatotropin, tryptophan oxygenase, arginine kinase, vacuolar ATPase, chitin synthase, glutathione-S-transferase, catalase, trehalose phosphate synthase, vitellogenin, hydroxy-3-methylglutaryl coenzyme A reductase, and hormone receptor genes. Various studies have demonstrated that RNAi is a reliable molecular tool which offers great promise for meeting the challenges imposed by crop insects, given careful selection of key enzymes/proteins. Utilization of the RNAi tool to target some of these key proteins of crop insects through various approaches is described here. The major challenges of RNAi-based insect control, such as identifying potential targets, delivering the silencing trigger, off-target effects, and the complexity of insect biology, are illustrated, and the efforts required to address these challenges are also discussed. PMID:25954206

  1. Discriminating between camouflaged targets by their time of detection by a human-based observer assessment method

    NASA Astrophysics Data System (ADS)

    Selj, G. K.; Søderblom, M.

    2015-10-01

    Detection of a camouflaged object in natural scenery requires the target to be distinguishable from its local background. The development of any new camouflage pattern therefore has to rely on a well-founded test methodology, correlated with the final purpose of the pattern, as well as an evaluation procedure containing the optimal criteria for i) discriminating between the targets and ii) producing a final ranking of the targets. In this study we present results from a recent camouflage assessment trial in which human observers were used in a search-by-photo methodology to assess generic test camouflage patterns. We conducted a study to investigate possible improvements in camouflage patterns for battle dress uniforms. The aim was a comparative study of potential generic patterns intended for use in arid areas (sparsely vegetated, semi-desert). We developed a test methodology intended to be simple, reliable and realistic with respect to the operational benefit of camouflage. We therefore chose to conduct a human-based observer trial founded on imagery of realistic targets in natural backgrounds. Inspired by a recent and similar trial in the UK, we developed new, purpose-built software for conducting the observer trial. Our preferred assessment methodology, the observer trial, was based on target recordings in 12 different but operationally relevant scenes, collected in a dry and sparsely vegetated area (Rhodes). The scenes were chosen with the intention of spanning as broadly as possible. The targets were human-shaped mannequins and were situated identically in each scene to allow a relative comparison of camouflage effectiveness per scene. Tests of significance among the targets' performances were carried out with non-parametric methods, as the corresponding time-of-detection distributions were, overall, found to be difficult to parameterize. 
    From the trial, containing 12 different scenes from sparsely vegetated areas, we collected detection-time distributions for 6 generic targets through visual search by 148 observers. We found that the targets performed differently within a single scene, as shown by their corresponding time-of-detection distributions. Furthermore, we obtained an overall ranking across all 12 scenes by computing a weighted sum over the scenes, intended to retain as much of the vital information on the targets' signature effectiveness as possible. Our results show that it was possible to measure the targets' performance relative to one another even when summing over all scenes. We also compared the ranking based on our preferred criterion (detection time) with a secondary one (probability of detection) to assess the sensitivity of the final ranking to the test set-up and evaluation criterion. We found our observer-based approach well suited with regard to its ability to discriminate between similar targets and to assign numeric values to the observed differences in performance. We believe our approach will be well suited as a tool whenever different aspects of camouflage are to be evaluated and understood further.
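
    A minimal sketch of the kind of non-parametric comparison used for detection-time distributions (a two-sample Mann-Whitney U test with a normal approximation; the trial's exact statistical procedure is not specified here, and the detection times below are invented):

```python
import math

def mann_whitney_u(x, y):
    """Rank-sum comparison of two detection-time samples. Returns the U
    statistic for the first sample and a two-sided p-value from the normal
    approximation (adequate for moderate n; no tie-variance correction)."""
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = {}
    pos = 0
    while pos < len(combined):                     # average ranks over ties
        end = pos
        while end + 1 < len(combined) and combined[end + 1][0] == combined[pos][0]:
            end += 1
        avg_rank = (pos + end) / 2 + 1
        for k in range(pos, end + 1):
            ranks[combined[k][1]] = avg_rank
        pos = end + 1
    n1, n2 = len(x), len(y)
    r1 = sum(ranks[i] for i in range(n1))          # rank sum of first sample
    u1 = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u1, p

# hypothetical detection times (seconds) for two camouflage patterns
times_a = [4.1, 5.3, 6.0, 7.2, 8.8, 9.1, 10.4, 12.0, 13.5, 15.2]
times_b = [2.0, 2.8, 3.1, 3.9, 4.5, 5.0, 5.8, 6.3, 7.0, 7.7]
u, p = mann_whitney_u(times_a, times_b)
```

    Here the pattern with the longer detection times would be judged significantly harder to find; real analyses would also handle censored trials in which the target was never detected.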

  2. Visual attention to features by associative learning.

    PubMed

    Gozli, Davood G; Moskowitz, Joshua B; Pratt, Jay

    2014-11-01

    Expecting a particular stimulus can facilitate processing of that stimulus over others, but what is the fate of other stimuli that are known to co-occur with the expected stimulus? This study examined the impact of learned association on feature-based attention. The findings show that the effectiveness of an uninformative color transient in orienting attention can be changed by learned associations between colors and the expected target shape. In an initial acquisition phase, participants learned two distinct stimulus-response-outcome sequences, where stimuli were defined by shape ('S' vs. 'H'), responses were localized key-presses (left vs. right), and outcomes were colors (red vs. green). Next, in a test phase, while expecting a target shape (80% probable), participants showed reliable attentional orienting to the color transient associated with the target shape, and no attentional orienting to the color associated with the alternative target shape. This bias seemed to be driven by the learned association between shapes and colors, and was not modulated by the response. In addition, the bias seemed to depend on observing target-color conjunctions, since encountering the two features disjunctively (without spatiotemporal overlap) did not replicate the findings. We conclude that associative learning, likely mediated by mechanisms underlying visual object representation, can extend the impact of goal-driven attention to features associated with a target stimulus. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Benchmarking passive seismic methods of estimating the depth of velocity interfaces down to ~300 m

    NASA Astrophysics Data System (ADS)

    Czarnota, Karol; Gorbatov, Alexei

    2016-04-01

    In shallow passive seismology it is generally accepted that the spatial autocorrelation (SPAC) method is more robust than the horizontal-over-vertical spectral ratio (HVSR) method at resolving the depth to shear-wave velocity (Vs) interfaces. Here we present results of a field test of these two methods over ten drill sites in western Victoria, Australia. The target interface is the base of Cenozoic unconsolidated to semi-consolidated clastic and/or carbonate sediments of the Murray Basin, which overlie Paleozoic crystalline rocks. Depths of this interface intersected in drill holes are between ~27 m and ~300 m. Seismometers were deployed in a three-arm spiral array, with a radius of 250 m, consisting of 13 Trillium Compact 120 s broadband instruments. Data were acquired at each site for 7-21 hours. The Vs architecture beneath each site was determined through nonlinear inversion of HVSR and SPAC data using the neighbourhood algorithm, implemented in the geopsy modelling package (Wathelet, 2005, GRL v35). The HVSR technique yielded depth estimates of the target interface (Vs > 1000 m/s) generally within ±20% error. Successful estimates were even obtained at a site with an inverted velocity profile, where Quaternary basalts overlie Neogene sediments which in turn overlie the target basement. Half of the SPAC estimates showed significantly higher errors than were obtained using HVSR. Joint inversion provided the most reliable estimates but was unstable at three sites. We attribute the surprising success of HVSR over SPAC to a low content of transient signals within the seismic record caused by low levels of anthropogenic noise at the benchmark sites. At a few sites SPAC waveform curves showed clear overtones suggesting that more reliable SPAC estimates may be obtained utilizing a multi-modal inversion. 
    Nevertheless, our study indicates that reliable basin thickness estimates in the Australian conditions tested can be obtained utilizing HVSR data from a single seismometer, without a priori knowledge of the shear-wave velocity of the basin material, thereby negating the need to deploy cumbersome arrays.
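
    At its core, the HVSR measurement is a ratio of horizontal to vertical amplitude spectra of the ambient-noise record. A toy, stdlib-only sketch on synthetic signals (a naive DFT, nothing like the geopsy implementation; the resonance bin and amplitudes are invented):

```python
import cmath
import math

def amplitude_spectrum(signal):
    """Naive DFT amplitude spectrum (fine for a short demo record;
    real processing would use windowed, averaged FFTs)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

def hvsr(north, east, vertical):
    """H/V curve using the geometric mean of the horizontal components."""
    sn, se, sv = (amplitude_spectrum(s) for s in (north, east, vertical))
    return [math.sqrt(a * b) / c if c > 1e-9 else 0.0
            for a, b, c in zip(sn, se, sv)]

# synthetic 128-sample record: horizontals resonate at bin 8,
# vertical carries that bin at one-tenth the amplitude plus another tone
n = 128
north = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
east = [math.sin(2 * math.pi * 8 * t / n + 0.5) for t in range(n)]
vertical = [0.1 * math.sin(2 * math.pi * 8 * t / n)
            + 0.1 * math.sin(2 * math.pi * 3 * t / n) for t in range(n)]
curve = hvsr(north, east, vertical)
peak_bin = max(range(1, len(curve)), key=lambda k: curve[k])
```

    The frequency of the H/V peak is what the inversion then maps, via an assumed Vs profile, to the depth of the impedance contrast.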

  4. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.
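
    Physics-based probabilistic engineering analysis of the kind mentioned is often illustrated with a stress-strength interference model; a hedged Monte Carlo sketch (the distributions and every parameter below are invented for illustration, not NASA data):

```python
import math
import random

def stress_strength_reliability(n_trials=100_000, seed=1):
    """Monte Carlo stress-strength interference: a part survives a trial
    when its sampled strength exceeds the sampled applied stress."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        strength = rng.lognormvariate(math.log(100.0), 0.08)  # e.g. material strength, ksi
        stress = rng.normalvariate(80.0, 6.0)                 # e.g. applied stress, ksi
        if stress >= strength:
            failures += 1
    return 1.0 - failures / n_trials

reliability = stress_strength_reliability()
```

    The appeal of the physics-based approach is that the failure probability comes from modeled loads and material scatter rather than from sparse field-failure counts.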

  5. Upgrade possibilities for continuous wave rf electron guns based on room-temperature very high frequency technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sannibale, F.; Filippetto, D.; Johnson, M.

    The past decade was characterized by an increasing scientific demand for extending towards higher repetition rates (MHz class and beyond) the performance of already operating lower repetition rate accelerator-based instruments such as x-ray free electron lasers (FELs) and ultrafast electron diffraction (UED) and microscopy (UEM) instruments. Such a need stimulated a worldwide spread of a vibrant R & D activity targeting the development of high-brightness electron sources capable of operating at these challenging rates. Among the different technologies pursued, rf guns based on room-temperature structures resonating in the very high frequency (VHF) range (30-300 MHz) and operating in continuous wave successfully demonstrated in the past few years the targeted brightness and reliability. Nonetheless, recently proposed upgrades for x-ray FELs and the always brightness-frontier applications such as UED and UEM are now requiring a further step forward in terms of beam brightness in electron sources. Here, we present a few possible upgrade paths that would allow one to extend, in a relatively simple and cost-effective way, the performance of the present VHF technology to the required new goals.

  6. Upgrade possibilities for continuous wave rf electron guns based on room-temperature very high frequency technology

    DOE PAGES

    Sannibale, F.; Filippetto, D.; Johnson, M.; ...

    2017-11-27

    The past decade was characterized by an increasing scientific demand for extending towards higher repetition rates (MHz class and beyond) the performance of already operating lower repetition rate accelerator-based instruments such as x-ray free electron lasers (FELs) and ultrafast electron diffraction (UED) and microscopy (UEM) instruments. Such a need stimulated a worldwide spread of a vibrant R & D activity targeting the development of high-brightness electron sources capable of operating at these challenging rates. Among the different technologies pursued, rf guns based on room-temperature structures resonating in the very high frequency (VHF) range (30-300 MHz) and operating in continuous wave successfully demonstrated in the past few years the targeted brightness and reliability. Nonetheless, recently proposed upgrades for x-ray FELs and the always brightness-frontier applications such as UED and UEM are now requiring a further step forward in terms of beam brightness in electron sources. Here, we present a few possible upgrade paths that would allow one to extend, in a relatively simple and cost-effective way, the performance of the present VHF technology to the required new goals.

  7. Comparison of liver fibrosis blood tests developed for HCV with new specific tests in HIV/HCV co-infection.

    PubMed

    Calès, Paul; Halfon, Philippe; Batisse, Dominique; Carrat, Fabrice; Perré, Philippe; Penaranda, Guillaume; Guyader, Dominique; d'Alteroche, Louis; Fouchard-Hubert, Isabelle; Michelet, Christian; Veillon, Pascal; Lambert, Jérôme; Weiss, Laurence; Salmon, Dominique; Cacoub, Patrice

    2010-08-01

    We compared 5 non-specific and 2 specific blood tests for liver fibrosis in HCV/HIV co-infection. Four hundred and sixty-seven patients were included into derivation (n=183) or validation (n=284) populations. Within these populations, the diagnostic target, significant fibrosis (Metavir F ≥ 2), was found in 66% and 72% of the patients, respectively. Two new fibrosis tests, FibroMeter HICV and HICV test, were constructed in the derivation population. Unadjusted AUROCs in the derivation population were: APRI: 0.716, Fib-4: 0.722, Fibrotest: 0.778, Hepascore: 0.779, FibroMeter: 0.783, HICV test: 0.822, FibroMeter HICV: 0.828. AUROCs adjusted on classification and distribution of fibrosis stages in a reference population showed similar values in both populations. FibroMeter, FibroMeter HICV and HICV test had the highest correct classification rates in F0/1 and F3/4 (which account for high predictive values): 77-79% vs. 70-72% in the other tests (p=0.002). Reliable individual diagnosis based on predictive values ≥ 90% distinguished three test categories: poorly reliable: Fib-4 (2.4% of patients), APRI (8.9%); moderately reliable: Fibrotest (25.4%), FibroMeter (26.6%), Hepascore (30.2%); acceptably reliable: HICV test (40.2%), FibroMeter HICV (45.6%) (p<10⁻³ between tests). FibroMeter HICV classified all patients into four reliable diagnosis intervals (≤F1, F1±1, ≥F1, ≥F2) with an overall accuracy of 93% vs. 79% (p<10⁻³) for a binary diagnosis of significant fibrosis. Tests designed for HCV infections are less effective in HIV/HCV infections. A specific test, like FibroMeter HICV, was the most interesting test in terms of diagnostic accuracy, correct classification profile, and reliable diagnosis. With reliable diagnosis intervals, liver biopsy can therefore be avoided in all patients. Copyright 2010 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.
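
    The AUROC values compared above can be computed directly from test scores as the probability that a randomly chosen patient with significant fibrosis scores higher than one without (the Mann-Whitney interpretation of the area). A sketch with invented scores:

```python
def auroc(scores_pos, scores_neg):
    """Probability that a randomly chosen positive (e.g. Metavir F >= 2)
    patient scores higher than a negative one; ties count as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical blood-test scores with / without significant fibrosis
fibrosis = [0.82, 0.74, 0.91, 0.66, 0.58, 0.88]
no_fibrosis = [0.31, 0.45, 0.52, 0.60, 0.28]
area = auroc(fibrosis, no_fibrosis)
```

    An AUROC of 0.5 corresponds to a useless test; the 0.72-0.83 range reported above reflects the discrimination of the fibrosis panels on real cohorts.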

  8. Establishing Reliability and Validity of the Criterion Referenced Exam of GeoloGy Standards EGGS

    NASA Astrophysics Data System (ADS)

    Guffey, S. K.; Slater, S. J.; Slater, T. F.; Schleigh, S.; Burrows, A. C.

    2016-12-01

    Discipline-based geoscience education researchers have considerable need for a criterion-referenced, easy-to-administer and -score conceptual diagnostic survey for undergraduates taking introductory science survey courses, so that faculty can better monitor the learning impacts of various interactive teaching approaches. To support ongoing education research across the geosciences, we continue to work rigorously and systematically to establish the reliability and validity of the recently released Exam of GeoloGy Standards, EGGS. In educational testing, reliability refers to the consistency or stability of test scores, whereas validity refers to the accuracy of the inferences or interpretations one makes from test scores. Several types of reliability measures are being applied in the iterative refinement of the EGGS survey, including test-retest, alternate-form, split-half, internal-consistency, and interrater reliability. EGGS rates strongly on most measures of reliability. Cronbach's alpha, for example, provides a quantitative index of the extent to which students answer items consistently throughout the test, based on inter-item correlations. Traditional item-analysis methods, including item difficulty and item discrimination, further quantify the degree to which a particular item reliably assesses students. Validity, on the other hand, is perhaps best described by the word accuracy. For example, content validity is the extent to which a measurement reflects the specific intended content domain, as judged by people who are experts either in testing that particular content area or in the content itself. 
    Perhaps more importantly, face validity is a judgment of how well an instrument reflects the science "at face value": the extent to which a test appears to measure the targeted scientific domain as viewed by laypersons, examinees, test users, the public, and other invested stakeholders.
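
    The internal-consistency index mentioned above, Cronbach's alpha, can be computed from a student-by-item score matrix; a small sketch with invented 0/1 responses:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from per-student rows of item scores:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    def variance(xs):                      # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(item_scores[0])                # number of items
    items = [[row[i] for row in item_scores] for i in range(k)]
    totals = [sum(row) for row in item_scores]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

# hypothetical 0/1 responses of 6 students on a 4-item subscale
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 0, 1, 0],
    [1, 1, 1, 1],
]
alpha = cronbach_alpha(responses)
```

    Values around 0.7 or higher are conventionally read as acceptable internal consistency for research instruments, though the threshold depends on the test's purpose.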

  9. Real-time target tracking and locating system for UAV

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Tang, Linbo; Fu, Huiquan; Li, Maowen

    2017-07-01

    In order to achieve real-time target tracking and locating for UAVs, a reliable processing system is built on an embedded platform. Firstly, the video image is acquired in real time by the electro-optical system on the UAV. When the target information is known, the KCF tracking algorithm is adopted to track the target. Then the servo is controlled to rotate with the target; when the target is in the center of the image, the laser-ranging module is activated to obtain the distance between the UAV and the target. Finally, the UAV flight parameters obtained from the BeiDou navigation system are combined with a target-location algorithm to calculate the geodetic coordinates of the target. The results show that the system tracks and locates targets stably in real time.
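
    The final geolocation step can be sketched with a flat-earth approximation, assuming the UAV's geodetic position, the gimbal azimuth and depression angles, and the laser range are known (the paper's actual algorithm is not given, and all values below are invented):

```python
import math

EARTH_R = 6378137.0  # WGS-84 equatorial radius, metres

def locate_target(uav_lat, uav_lon, uav_alt, azimuth_deg, depression_deg, slant_range):
    """Project a laser-ranged line of sight onto the ground and return
    target latitude, longitude, and altitude (small-offset flat-earth model)."""
    az = math.radians(azimuth_deg)
    dep = math.radians(depression_deg)
    horiz = slant_range * math.cos(dep)        # horizontal ground distance
    north = horiz * math.cos(az)
    east = horiz * math.sin(az)
    dlat = math.degrees(north / EARTH_R)
    dlon = math.degrees(east / (EARTH_R * math.cos(math.radians(uav_lat))))
    tgt_alt = uav_alt - slant_range * math.sin(dep)
    return uav_lat + dlat, uav_lon + dlon, tgt_alt

# UAV at 500 m altitude, looking 45 deg azimuth, 30 deg down, 1 km laser range
lat, lon, alt = locate_target(39.9, 116.4, 500.0, 45.0, 30.0, 1000.0)
```

    A production system would work in an Earth-centered frame and fold in gimbal mounting offsets and attitude (roll/pitch/yaw) from the navigation solution.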

  10. "Unreliability as a threat to understanding psychopathology: The cautionary tale of attentional bias": Correction to Rodebaugh et al. (2016).

    PubMed

    2016-10-01

    Reports an error in "Unreliability as a threat to understanding psychopathology: The cautionary tale of attentional bias" by Thomas L. Rodebaugh, Rachel B. Scullin, Julia K. Langer, David J. Dixon, Jonathan D. Huppert, Amit Bernstein, Ariel Zvielli and Eric J. Lenze (Journal of Abnormal Psychology, 2016[Aug], Vol 125[6], 840-851). There was an error in the Author Note concerning the support of the MacBrain Face Stimulus Set. The correct statement is provided. (The following abstract of the original article appeared in record 2016-30117-001.) The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically oriented measures can only be certain if such measurements are reliable. Two pillars of the National Institute of Mental Health's portfolio, the Research Domain Criteria (RDoC) initiative for psychopathology and the target-engagement initiative in clinical trials, cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time. 
We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. Study on Temperature Control System Based on SG3525

    NASA Astrophysics Data System (ADS)

    Cheng, Cong; Zhu, Yifeng; Wu, Junfeng

    2017-12-01

    In this paper, a dry-bath approach is used: the microfluidic chip is heated directly by a heating plate, and the liquid sample in the chip is heated through thermal conduction so that it is maintained at the target temperature. In order to improve the reliability of the whole machine, a temperature control system based on the SG3525 is designed. The SG3525 is the core of the system; the PWM wave it generates drives a power transistor to heat the heating plate. A bridge circuit built around a thermistor, together with PID regulation, ensures that the temperature is controlled at 37 °C with an accuracy of ±0.2 °C and a fluctuation of ±0.1 °C.
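
    The regulation loop can be sketched as a discrete PID controller acting on a first-order thermal model of the heating plate (the actual SG3525 loop is analog hardware; every plant constant and gain below is invented for illustration):

```python
def simulate_pid(setpoint=37.0, steps=600, dt=0.1, kp=8.0, ki=0.5, kd=1.0):
    """Discrete PID loop driving a toy first-order heating-plate model."""
    temp = 20.0                            # start at ambient, degC
    integral = 0.0
    prev_err = setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        deriv = (err - prev_err) / dt
        prev_err = err
        raw = kp * err + ki * integral + kd * deriv
        power = max(0.0, min(50.0, raw))   # heater limited to 0..50 W
        if power == raw:                   # simple anti-windup: freeze integral when clamped
            integral += err * dt
        # plant: heating raises temperature, losses pull it back to ambient
        temp += dt * (0.05 * power - 0.02 * (temp - 20.0))
    return temp

final_temp = simulate_pid()
```

    The integral term supplies the steady heater power needed to balance losses at 37 °C, which is what holds the reported ±0.2 °C accuracy in this toy model.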

  12. The technology roadmap for plant/crop-based renewable resources 2020

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLaren, J.

    1999-02-22

    The long-term well-being of the nation and maintenance of a sustainable leadership position in agriculture, forestry, and manufacturing, clearly depend on current and near-term support of multidisciplinary research for the development of a reliable renewable resource base. This document sets a roadmap and priorities for that research. America needs leadership that will continue to recognize, support, and move rapidly to meet the need to expand the use of sustainable renewable resources. This roadmap has highlighted potential ways for progress and has identified goals in specific components of the system. Achieving success with these goals will provide the opportunity to hit the vision target of a fivefold increase in renewable resource use by 2020.

  13. The Technology Roadmap for Plant/Crop-Based Renewable Resources 2020

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1999-02-01

    The long-term well-being of the nation and maintenance of a sustainable leadership position in agriculture, forestry, and manufacturing, clearly depend on current and near-term support of multidisciplinary research for the development of a reliable renewable resource base. This document sets a roadmap and priorities for that research. America needs leadership that will continue to recognize, support, and move rapidly to meet the need to expand the use of sustainable renewable resources. This roadmap has highlighted potential ways for progress and has identified goals in specific components of the system. Achieving success with these goals will provide the opportunity to hit the vision target of a fivefold increase in renewable resource use by 2020.

  14. Transgenic Mosquitoes - Fact or Fiction?

    PubMed

    Wilke, André B B; Beier, John C; Benelli, Giovanni

    2018-06-01

    Technologies for controlling mosquito vectors based on genetic manipulation and the release of genetically modified mosquitoes (GMMs) are gaining ground. However, concrete epidemiological evidence of their effectiveness, sustainability, and impact on the environment and nontarget species is lacking; no reliable ecological evidence on the potential interactions among GMMs, target populations, and other mosquito species populations exists; and no GMM technology has yet been approved by the WHO Vector Control Advisory Group. Our opinion is that, although GMMs may be considered a promising control tool, more studies are needed to assess their true effectiveness, risks, and benefits. Overall, several lines of evidence must be provided before GMM-based control strategies can be used under the integrated vector management framework. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. DHS S&T First Responders Group and NATO Counter UAS Proposal Interest Response.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salton, Jonathan R.

    The capability, speed, size, and widespread availability of small unmanned aerial systems (sUAS) make them a serious security concern. The enabling technologies for sUAS are rapidly evolving and so too are the threats they pose to national security. Potential threat vehicles have a small cross-section and are difficult to reliably detect using purely ground-based systems (e.g. radar or electro-optical) and challenging to target using conventional anti-aircraft defenses. Ground-based sensors are static and suffer from interference with the earth, vegetation and other man-made structures, which obscure objects at low altitudes. Because of these challenges, sUAS pose a unique and rapidly evolving threat to national security.

  16. Mechanical Design of Downhole Tractor Based on Two-Way Self-locking Mechanism

    NASA Astrophysics Data System (ADS)

    Fang, Delei; Shang, Jianzhong; Luo, Zirong; Wu, Guoheng; Liu, Yiying

    2018-03-01

    Based on horizontal-well tractor technology, a downhole tractor that realizes a two-way self-locking function was developed. To meet the needs of horizontal-well logging for small size, high traction, and high reliability, the tractor adopts a unique heart-shaped cam as the locking mechanism. The motion principle of the telescopic downhole tractor, the design of the mechanical structure, and the locking principle of the locking mechanism are all analyzed. The mathematical expressions for traction are obtained by mechanical analysis of the parallel support rod in the locking mechanism. The force analysis and contour design of the heart-shaped cam are performed, laying the foundation for the development of a tractor prototype.

  17. Nail-like targets for laser plasma interaction experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasley, J; Wei, M; Shipton, E

    2007-12-18

    The interaction of ultra-high power picosecond laser pulses with solid targets is of interest both for benchmarking the results of hybrid particle-in-cell (PIC) codes and also for applications to re-entrant cone guided fast ignition. We describe the construction of novel targets in which copper/titanium wires are formed into 'nail-like' objects by a process of melting and micromachining, so that energy can be reliably coupled to a 24 µm diameter wire. An extreme-ultraviolet image of the interaction of the Titan laser with such a target is shown.

  18. Liquid Hydrogen Target Experience at SLAC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisend, J.G.; Boyce, R.; Candia, A.

    2005-08-29

    Liquid hydrogen targets have played a vital role in the physics program at SLAC for the past 40 years. These targets have ranged from small "beer can" targets to the 1.5 m long E158 target that was capable of absorbing up to 800 W without any significant density changes. Successful use of these targets has required the development of thin wall designs, liquid hydrogen pumps, remote positioning and alignment systems, safety systems, control and data acquisition systems, cryogenic cooling circuits and heat exchangers. Detailed operating procedures have been created to ensure safety and operational reliability. This paper surveys the evolution of liquid hydrogen targets at SLAC and discusses advances in several of the enabling technologies that made these targets possible.

  19. Molding compound trends in a denser packaging world: Qualification tests and reliability concerns

    NASA Astrophysics Data System (ADS)

    Nguyen, L. T.; Lo, R. H. Y.; Chen, A. S.; Belani, J. G.

    1993-12-01

    Molding compound development has traditionally been driven by the memory market, with subsequent applications filtering down to other IC technologies such as logic, analog, and ASIC. However, this strategy has changed lately with the introduction of thin packages such as the PQFP and TSOP. Rather than targeting a compound at a family of ICs such as DRAM or SRAM, compound development efforts are now focused on specific classes of packages. The configurations of these thin packages impose new functional requirements that need to be revisited to provide an optimized combination of properties. The evolution of qualification tests mirrors the advances in epoxy and compounding technologies. From the first standard novolac-based epoxies of the 1970s to the latest third-generation ultra-low-stress materials, longer test times in increasingly harsh environments have been achieved. This paper benchmarks the current reliability tests used by the electronics industry, examines those tests that affect and are affected by the molding compounds, discusses the relevance of accelerated testing, and addresses the major reliability issues facing current molding compound development efforts. Six compound-related reliability concerns were selected: moldability, package stresses, package cracking, halogen-induced intermetallic growth at bond pads, moisture-induced corrosion, and interfacial delamination. Causes of each failure type are surveyed and remedies are recommended. Accelerated tests are designed to apply, to a limited quantity of devices, bias or environmental stresses beyond the usual ratings, in order to intensify failure mechanisms that would occur under normal operating conditions. The observed behavior is then extrapolated from the lot to the entire population. The emphasis is on compressing the time necessary to obtain reliability data. This approach has two main drawbacks. With increasingly complex devices, even accelerated tests are expensive. 
    And with new technologies, it becomes difficult to ascertain that the applied stress 1) induces the failure phenomena linked with usual field conditions, and 2) does not create any new ones. Technology evolution and reliability testing are interdependent. Devices get larger, with increasingly smaller features and more complex geometries. Molding compounds have evolved considerably over the past decade to provide ultra-low stress levels and moldability for thin packages.

  20. Variability in Non-Target Terrestrial Plant Studies Should Inform Endpoint Selection.

    PubMed

    Staveley, J P; Green, J W; Nusz, J; Edwards, D; Henry, K; Kern, M; Deines, A M; Brain, R; Glenn, B; Ehresman, N; Kung, T; Ralston-Hooper, K; Kee, F; McMaster, S

    2018-05-04

    Inherent variability in Non-Target Terrestrial Plant (NTTP) testing of pesticides creates challenges for using and interpreting these data for risk assessment. Standardized NTTP testing protocols were initially designed to calculate the application rate causing a 25% effect (ER25, used in the U.S.) or a 50% effect (ER50, used in Europe) for various measures based on the observed dose-response. More recently, the requirement to generate a no-observed-effect rate (NOER), or, in the absence of a NOER, the rate causing a 5% effect (ER05), has raised questions about the inherent variability in, and statistical detectability of, these tests. Statistically significant differences observed between test and control groups may be a product of this inherent variability and may not represent biological relevance. Attempting to derive an ER05 and the associated risk assessment conclusions drawn from these values can overestimate risk. To address these concerns, we evaluated historical data from approximately 100 seedling emergence and vegetative vigor guideline studies on pesticides to assess the variability of control results across studies for each plant species, examined potential causes for the variation in control results, and defined the minimum percent effect that can be reliably detected. The results indicate that with current test design and implementation, the ER05 cannot be reliably estimated. This article is protected by copyright. All rights reserved.

  1. Detection and typing of low-risk human papillomavirus genotypes HPV 6, HPV 11, HPV 42, HPV 43 and HPV 44 by polymerase chain reaction and restriction fragment length polymorphism.

    PubMed

    Maver, Polona J; Poljak, Mario; Seme, Katja; Kocjan, Bostjan J

    2010-10-01

    A novel PCR-restriction fragment length polymorphism assay (PCR-RFLP) was developed for sensitive detection and reliable differentiation of five low-risk human papillomavirus (lr-HPV) genotypes: HPV 6, HPV 11, HPV 42, HPV 43 and HPV 44, as well as differentiation of prototypic and non-prototypic HPV 6 genomic variants. The assay is based on the amplification of a 320-bp fragment of the HPV E1 gene and subsequent analysis of PCR products with BsaJI and HinFI. Testing on plasmid standards showed that PCR-RFLP enabled simple and reliable identification and differentiation of the five targeted lr-HPV genotypes and could reproducibly detect as few as 10 copies of viral genome equivalents per PCR. The PCR-RFLP showed almost complete agreement with previously obtained genotyping results on 42 HPV-DNA negative samples and 223 HPV-DNA positive samples (45 HPV 6, 34 HPV 11, 35 HPV 42, 10 HPV 43, 24 HPV 44 positive samples and 75 samples containing 28 non-targeted HPV genotypes). The novel assay is simple and robust, does not require any sophisticated equipment and can be of great value for epidemiological studies, particularly in settings in which financial resources are limited. Copyright (c) 2010 Elsevier B.V. All rights reserved.
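
    The genotype call in such an assay reduces to matching observed restriction-fragment lengths against per-genotype reference patterns. Below is a minimal sketch of that matching step; the fragment sizes are hypothetical placeholders, not the assay's published BsaJI/HinFI patterns.

```python
# Sketch of RFLP genotype calling: match observed restriction-fragment
# lengths against per-genotype reference patterns. Fragment sizes here
# are hypothetical placeholders, not the assay's published patterns.
REFERENCE_PATTERNS = {
    "HPV 6":  (180, 140),
    "HPV 11": (200, 120),
    "HPV 42": (260, 60),
    "HPV 43": (230, 90),
    "HPV 44": (320,),       # e.g. an uncut 320-bp amplicon
}

def call_genotype(observed, tolerance=5):
    """Return the genotype whose pattern matches the observed fragments."""
    obs = sorted(observed, reverse=True)
    for genotype, pattern in REFERENCE_PATTERNS.items():
        ref = sorted(pattern, reverse=True)
        if len(ref) == len(obs) and all(
            abs(a - b) <= tolerance for a, b in zip(ref, obs)
        ):
            return genotype
    return "untyped"

print(call_genotype([139, 181]))  # -> HPV 6
print(call_genotype([320]))       # -> HPV 44
```

    In practice a gel or capillary trace yields the observed sizes with some measurement error, which is why a size tolerance is applied per fragment.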

  2. Fiber-optic microsphere-based arrays for multiplexed biological warfare agent detection.

    PubMed

    Song, Linan; Ahn, Soohyoun; Walt, David R

    2006-02-15

    We report a multiplexed high-density DNA array capable of rapid, sensitive, and reliable identification of potential biological warfare agents. An optical fiber bundle containing 6000 individual 3.1-µm-diameter fibers was chemically etched to yield microwells and used as the substrate for the array. Eighteen different 50-mer single-stranded DNA probes were covalently attached to 3.1-µm microspheres. Probe sequences were designed for Bacillus anthracis, Yersinia pestis, Francisella tularensis, Brucella melitensis, Clostridium botulinum, Vaccinia virus, and one biological warfare agent (BWA) simulant, Bacillus thuringiensis kurstaki. The microspheres were distributed into the microwells to form a randomized multiplexed high-density DNA array. A detection limit of 10 fM in a 50-µL sample volume was achieved within 30 min of hybridization for B. anthracis, Y. pestis, Vaccinia virus, and B. thuringiensis kurstaki. We used both specific responses of probes upon hybridization to complementary targets as well as response patterns of the multiplexed array to identify BWAs with high accuracy. We demonstrated the application of this multiplexed high-density DNA array for parallel identification of target BWAs in spiked sewage samples after PCR amplification. The array's miniaturized feature size, fabrication flexibility, reusability, and high reproducibility may enable this array platform to be integrated into a highly sensitive, specific, and reliable portable instrument for in situ BWA detection.

  3. Applying fault tree analysis to the prevention of wrong-site surgery.

    PubMed

    Abecassis, Zachary A; McElroy, Lisa M; Patel, Ronak M; Khorzad, Rebeca; Carroll, Charles; Mehrotra, Sanjay

    2015-01-01

    Wrong-site surgery (WSS) is a rare event that occurs to hundreds of patients each year. Despite national implementation of the Universal Protocol over the past decade, development of effective interventions remains a challenge. We performed a systematic review of the literature reporting root causes of WSS and used the results to perform a fault tree analysis to assess the reliability of the system in preventing WSS and identifying high-priority targets for interventions aimed at reducing WSS. Process components where a single error could result in WSS were labeled with OR gates; process aspects reinforced by verification were labeled with AND gates. The overall redundancy of the system was evaluated based on prevalence of AND gates and OR gates. In total, 37 studies described risk factors for WSS. The fault tree contains 35 faults, most of which fall into five main categories. Despite the Universal Protocol mandating patient verification, surgical site signing, and a brief time-out, a large proportion of the process relies on human transcription and verification. Fault tree analysis provides a standardized perspective of errors or faults within the system of surgical scheduling and site confirmation. It can be adapted by institutions or specialties to lead to more targeted interventions to increase redundancy and reliability within the preoperative process. Copyright © 2015 Elsevier Inc. All rights reserved.
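
    The AND/OR gate logic described above is straightforward to compute: assuming independent basic events, an OR gate fires if any input fault occurs, while an AND gate fires only if all of them do. A minimal sketch with invented probabilities (not figures from the study):

```python
from math import prod

# Minimal fault-tree gate evaluator (illustrative; the gate structure
# and probabilities below are hypothetical, not taken from the study).
def or_gate(probs):
    # Output event occurs if ANY input fault occurs (independent events).
    return 1.0 - prod(1.0 - p for p in probs)

def and_gate(probs):
    # Output event occurs only if ALL input faults occur.
    return prod(probs)

# Example: site verification reinforced by a redundant check (AND gate)
# combined with a single-point transcription error (OR gate input).
p_transcription_error = 0.01
p_verify_miss = 0.05
p_signing_miss = 0.05

# Redundant verification branch: both checks must fail.
p_redundant = and_gate([p_verify_miss, p_signing_miss])  # 0.0025
# Top event: either the redundant branch fails or the single point does.
p_top = or_gate([p_redundant, p_transcription_error])
print(round(p_top, 6))  # 0.012475
```

    The example illustrates why the prevalence of AND gates matters in the fault tree: the redundant branch contributes far less to the top-event probability than the single-point OR input.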

  4. Multi-agent Negotiation Mechanisms for Statistical Target Classification in Wireless Multimedia Sensor Networks

    PubMed Central

    Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng

    2007-01-01

    The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
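
    The auction phase of such a negotiation can be sketched as a simple single-item allocation: each agent bids based on its classification confidence and its residual energy, and the task goes to the highest bidder. The bid rule and values below are illustrative assumptions, not the paper's actual mechanism.

```python
# Illustrative auction-based task allocation among sensor agents.
# Bid = confidence weighted by remaining energy (a hypothetical rule;
# the paper's negotiation mechanism is more elaborate).
def bid(agent):
    return agent["confidence"] * agent["energy"]

def allocate_task(agents):
    """Award the classification task to the highest-bidding agent."""
    return max(agents, key=bid)["id"]

agents = [
    {"id": "node-1", "confidence": 0.80, "energy": 0.50},
    {"id": "node-2", "confidence": 0.70, "energy": 0.90},
    {"id": "node-3", "confidence": 0.95, "energy": 0.30},
]
print(allocate_task(agents))  # node-2 (bid 0.63 beats 0.40 and 0.285)
```

    A committee-decision phase would then combine the individual classifications of the winning agents, e.g. by weighted voting.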

  5. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with average diameter and average length of 15 inches. Structural reliability is based on axial buckling strength of the cylinder. Both Monte Carlo simulation and First Order Reliability Method are considered for reliability analysis with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure and optimization results are included in this report.
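
    The link between a Monte Carlo failure-probability estimate and the reliability index can be sketched as follows. The limit state, distributions, and parameter values here are invented for illustration; the report's response-surface model for buckling strength is not reproduced.

```python
import random
from statistics import NormalDist

# Monte Carlo estimate of failure probability for a strength-vs-load
# limit state, plus the corresponding reliability index. All numbers
# are hypothetical stand-ins for the report's buckling model.
random.seed(42)
nd = NormalDist()

def limit_state(strength, load):
    return strength - load  # failure when g < 0

n = 200_000
failures = 0
for _ in range(n):
    strength = random.gauss(100.0, 8.0)   # buckling strength (assumed)
    load = random.gauss(70.0, 10.0)       # applied axial load (assumed)
    if limit_state(strength, load) < 0:
        failures += 1

pf = failures / n
beta = -nd.inv_cdf(pf)  # reliability index from failure probability
print(f"Pf ~ {pf:.4f}, beta ~ {beta:.2f}")
```

    For this linear Gaussian limit state the exact answer is available in closed form (beta = 30 / sqrt(8^2 + 10^2) ~ 2.34), which is the kind of check a First Order Reliability Method provides without sampling.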

  6. An enhanced MMW and SMMW/THz imaging system performance prediction and analysis tool for concealed weapon detection and pilotage obstacle avoidance

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Jacobs, Eddie L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.

    2015-10-01

    The U.S. Army Research Laboratory (ARL) has continued to develop and enhance a millimeter-wave (MMW) and submillimeter-wave (SMMW)/terahertz (THz)-band imaging system performance prediction and analysis tool for both the detection and identification of concealed weaponry, and for pilotage obstacle avoidance. The details of the MATLAB-based model, which accounts for the effects of all critical sensor and display components, for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). Further development of this tool that includes a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures was reported on at the 2011 SPIE Europe Security and Defence Symposium (Prague). This paper provides a comprehensive review of a newly enhanced MMW and SMMW/THz imaging system analysis and design tool that now includes an improved noise sub-model for more accurate and reliable performance predictions, the capability to account for post-capture image contrast enhancement, and the capability to account for concealment material backscatter with active-illumination-based systems. Present plans for additional expansion of the model's predictive capabilities are also outlined.

  7. Assessing child and adolescent pragmatic language competencies: toward evidence-based assessments.

    PubMed

    Russell, Robert L; Grizzle, Kenneth L

    2008-06-01

    Using language appropriately and effectively in social contexts requires pragmatic language competencies (PLCs). Increasingly, deficits in PLCs are linked to child and adolescent disorders, including autism spectrum, externalizing, and internalizing disorders. As the role of PLCs expands in diagnosis and treatment of developmental psychopathology, psychologists and educators will need to appraise and select clinical and research PLC instruments for use in assessments and/or studies. To assist in this appraisal, 24 PLC instruments, containing 1,082 items, are assessed by addressing four questions: (1) Can PLC domains targeted by assessment items be reliably identified?, (2) What are the core PLC domains that emerge across the 24 instruments?, (3) Do PLC questionnaires and tests assess similar PLC domains?, and (4) Do the instruments achieve content, structural, diagnostic, and ecological validity? Results indicate that test and questionnaire items can be reliably categorized into PLC domains, that PLC domains featured in questionnaires and tests significantly differ, and that PLC instruments need empirical confirmation of their dimensional structure, content validity across all developmental age bands, and ecological validity. Progress in building a better evidence base for PLC assessments should be a priority in future research.

  8. Online Hierarchical Sparse Representation of Multifeature for Robust Object Tracking

    PubMed Central

    Qu, Shiru

    2016-01-01

    Object tracking based on sparse representation has given promising tracking results in recent years. However, trackers under the sparse representation framework tend to overemphasize the sparse representation and ignore the correlation of visual information. In addition, sparse coding methods encode each local region independently and ignore the spatial neighborhood information of the image. In this paper, we propose a robust tracking algorithm. Firstly, multiple complementary features are used to describe the object appearance; the appearance model of the tracked target is modeled by instantaneous and stable appearance features simultaneously. A two-stage sparse coding method, which takes the spatial neighborhood information of the image patch and the computational burden into consideration, is used to compute the reconstructed object appearance. Then, the reliability of each tracker is measured by the tracking likelihood function of the transient and reconstructed appearance models. Finally, the most reliable tracker is obtained within a well-established particle filter framework; the training set and the template library are incrementally updated based on the current tracking results. Experimental results on different challenging video sequences show that the proposed algorithm performs well, with superior tracking accuracy and robustness. PMID:27630710

  9. Temporary threshold shift after impulse-noise during video game play: laboratory data.

    PubMed

    Spankovich, C; Griffiths, S K; Lobariñas, E; Morgenstein, K E; de la Calle, S; Ledon, V; Guercio, D; Le Prell, C G

    2014-03-01

    Prevention of temporary threshold shift (TTS) after laboratory-based exposure to pure-tones, broadband noise, and narrowband noise signals has been achieved, but prevention of TTS under these experimental conditions may not accurately reflect protection against hearing loss following impulse noise. This study used a controlled laboratory-based TTS paradigm that incorporated impulsive stimuli into the exposure protocol; development of this model could provide a novel platform for assessing proposed therapeutics. Participants played a video game that delivered gunfire-like sound through headphones as part of a target practice game. Effects were measured using audiometric threshold evaluations and distortion product otoacoustic emissions (DPOAEs). The sound level and number of impulses presented were sequentially increased throughout the study. Participants were normal-hearing students at the University of Florida who provided written informed consent prior to participation. TTS was not reliably induced by any of the exposure conditions assessed here. However, there was significant individual variability, and a subset of subjects showed TTS under some exposure conditions. A subset of participants demonstrated reliable threshold shifts under some conditions. Additional experiments are needed to better understand and optimize stimulus parameters that influence TTS after simulated impulse noise.

  10. Temporary threshold shift after impulse-noise during video game play: Laboratory data

    PubMed Central

    Spankovich, C.; Griffiths, S. K.; Lobariñas, E.; Morgenstein, K.E.; de la Calle, S.; Ledon, V.; Guercio, D.; Le Prell, C.G.

    2015-01-01

    Objective Prevention of temporary threshold shift (TTS) after laboratory-based exposure to pure-tones, broadband noise, and narrow band noise signals has been achieved, but prevention of TTS under these experimental conditions may not accurately reflect protection against hearing loss following impulse noise. This study used a controlled laboratory-based TTS paradigm that incorporated impulsive stimuli into the exposure protocol; development of this model could provide a novel platform for assessing proposed therapeutics. Design Participants played a video game that delivered gunfire-like sound through headphones as part of a target practice game. Effects were measured using audiometric threshold evaluations and distortion product otoacoustic emissions (DPOAEs). The sound level and number of impulses presented were sequentially increased throughout the study. Study sample Participants were normal-hearing students at the University of Florida who provided written informed consent prior to participation. Results TTS was not reliably induced by any of the exposure conditions assessed here. However, there was significant individual variability, and a subset of subjects showed TTS under some exposure conditions. Conclusions A subset of participants demonstrated reliable threshold shifts under some conditions. Additional experiments are needed to better understand and optimize stimulus parameters that influence TTS after simulated impulse noise. PMID:24564694

  11. Two decades of ART: improving on success through further research

    PubMed Central

    HOLMGREN, Christopher J.; FIGUEREDO, Márcia Cançado

    2009-01-01

    ABSTRACT Since the introduction of the Atraumatic Restorative Treatment (ART) approach over twenty years ago, more than 190 research publications have appeared. The last research agenda defining research priorities for ART was published in 1999. The objective of the present work was to review existing research in the context of future research priorities for ART. Material and Methods: An internet survey was conducted amongst those who had published on ART or were known to be working on the ART approach, to solicit their views as to areas of future ART research. Three broad categories were defined, namely: 1. Basic and laboratory research; 2. Clinical research, and, 3. Community, Public Health, Health Services Research. Results: A 31% response rate was achieved. The study identified a number of new areas of research as well as areas where additional research is required. These are expressed as recommendations for future ART research. Conclusions: The ART approach is based on a robust, reliable and ever-growing evidence base concerning its clinical applications which indicates that it is a reliable and quality treatment approach. In common with all other oral health care procedures, targeted applied research is required to improve the oral health care offered. PMID:21499666

  12. Quality Assessment of Medical Apps that Target Medication-Related Problems.

    PubMed

    Loy, John Shiguang; Ali, Eskinder Eshetu; Yap, Kevin Yi-Lwern

    2016-10-01

    The advent of smartphones has enabled a plethora of medical apps for disease management. As of 2012, there were 40,000 health care-related mobile apps available in the market. Since most of these medical apps do not go through any stringent quality assessment, there is a risk of consumers being misinformed or misled by unreliable information. In this regard, apps that target medication-related problems (MRPs) are no exception. There is little information on what constitutes quality in apps that target MRPs and how good the existing apps are. To develop a quality assessment tool for evaluating apps that target MRPs and assess the quality of such apps available in the major mobile app stores (iTunes and Google Play). The top 100 free and paid apps in the medical categories of the iTunes and Google Play stores (a total of 400 apps) were screened for inclusion in the final analysis. English-language apps that targeted MRPs were downloaded on test devices to evaluate their quality. Apps intended for clinicians, patients, or both were eligible for evaluation. The quality assessment tool consisted of 4 sections (appropriateness, reliability, usability, privacy), which determined the overall quality of the apps. Apps that fulfilled the inclusion criteria were classified based on the presence of any 1 or more of the 5 features considered important for apps targeting MRPs (monitoring, interaction checker, dose calculator, medication information, medication record). Descriptive statistics and Mann-Whitney tests were used for analysis. Final analysis was based on 59 apps that fulfilled the study inclusion criteria. Apps with interaction checker (66.9%) and monitoring features (54.8%) had the highest and lowest overall quality scores, respectively. Paid apps generally scored higher for usability than free apps (P = 0.006) but lower for privacy (P = 0.003). Half of the interaction checker apps were unable to detect interactions with herbal medications. Blood pressure and heart rate monitoring apps had the highest overall quality scores (67.7%), while apps that monitored visual, hearing, and temperature changes scored the lowest (35.5%). A quality assessment tool for evaluating medical apps targeting MRPs has been developed. Clinicians can use this tool to guide their assessments of medical apps that are appropriate for use in the health care setting. Although potentially useful apps were identified, many apps were found to have deficiencies in quality, among which were poor reliability scores for most of the apps. Continued assessment of the quality of apps targeting MRPs is recommended to ensure their usefulness for clinicians and patients. No outside funding supported this study. The authors have no conflicts of interest directly related to this study. Study concept and design were contributed by Loy and Yap. Loy collected the data and took the lead in data interpretation, along with Ali and Yap. The manuscript was primarily written by Loy, along with Yap, and revised primarily by Ali, along with Yap.
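
    Aggregating section scores into an overall app quality score can be sketched as a weighted average over the tool's four sections. The weights and scores below are hypothetical; the published tool's weighting scheme is not reproduced here.

```python
# Hypothetical aggregation of the four assessment sections into an
# overall percentage score (equal weights assumed for illustration).
SECTIONS = ("appropriateness", "reliability", "usability", "privacy")

def overall_quality(scores, weights=None):
    """Weighted average of section scores (0-1), returned as a percentage."""
    weights = weights or {s: 1.0 for s in SECTIONS}
    total_w = sum(weights[s] for s in SECTIONS)
    return 100.0 * sum(scores[s] * weights[s] for s in SECTIONS) / total_w

app = {"appropriateness": 0.8, "reliability": 0.5,
       "usability": 0.7, "privacy": 0.6}
print(f"{overall_quality(app):.1f}%")  # 65.0%
```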

  13. In situ click chemistry: a powerful means for lead discovery.

    PubMed

    Sharpless, K Barry; Manetsch, Roman

    2006-11-01

    Combinatorial chemistry and parallel synthesis are important and regularly applied tools for lead identification and optimisation, although they are often accompanied by challenges related to the efficiency of library synthesis and the purity of the compound library. In the last decade, novel means of lead discovery approaches have been investigated where the biological target is actively involved in the synthesis of its own inhibitory compound. These fragment-based approaches, also termed target-guided synthesis (TGS), show great promise in lead discovery applications by combining the synthesis and screening of libraries of low molecular weight compounds in a single step. Of all the TGS methods, the kinetically controlled variant is the least well known, but it has the potential to emerge as a reliable lead discovery method. The kinetically controlled TGS approach, termed in situ click chemistry, is discussed in this article.

  14. The Study of Intelligent Vehicle Navigation Path Based on Behavior Coordination of Particle Swarm.

    PubMed

    Han, Gaining; Fu, Weiping; Wang, Wen

    2016-01-01

    In the behavior dynamics model, behavior competition leads to a shock (oscillation) problem in the intelligent vehicle's navigation path, because the time-variant target-seeking behavior and the obstacle-avoidance behavior occur simultaneously. Considering the safety and real-time requirements of the intelligent vehicle, the particle swarm optimization (PSO) algorithm is proposed to solve these problems by optimizing the weight coefficients of the heading angle and the path velocity. Firstly, according to the behavior dynamics model, the fitness function is defined in terms of the intelligent vehicle's driving characteristics, the distance between the vehicle and the obstacle, and the distance between the vehicle and the target. Secondly, the behavior coordination parameters that minimize the fitness function are obtained by the particle swarm optimization algorithm. Finally, the simulation results show that the optimization method and its fitness function reduce perturbations of the planned path and improve the vehicle's real-time performance and reliability.
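
    The core of the approach, a particle swarm searching for the weight coefficients that minimize a fitness function, can be sketched as follows. The quadratic surrogate fitness and the PSO hyperparameters are illustrative stand-ins for the paper's actual formulation.

```python
import random

# Minimal particle swarm optimizer for two behavior-coordination
# weights (heading-angle and path-velocity coefficients). The fitness
# below is a hypothetical surrogate with a minimum at w = (0.6, 0.4);
# the paper's fitness combines obstacle distance, target distance,
# and driving characteristics.
random.seed(0)

def fitness(w):
    return (w[0] - 0.6) ** 2 + (w[1] - 0.4) ** 2

n_particles, n_iters = 20, 100
pos = [[random.uniform(0, 1), random.uniform(0, 1)] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]                 # each particle's best position
gbest = min(pbest, key=fitness)[:]          # swarm's best position

for _ in range(n_iters):
    for i in range(n_particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            # Standard velocity update: inertia + cognitive + social terms.
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if fitness(pos[i]) < fitness(pbest[i]):
            pbest[i] = pos[i][:]
            if fitness(pbest[i]) < fitness(gbest):
                gbest = pbest[i][:]

print(gbest)  # converges near [0.6, 0.4]
```

    In the paper's setting the same loop would run per planning step, with the fitness re-evaluated against the current obstacle and target distances.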

  15. Long Term, Operational Monitoring Of Enhanced Oil Recovery In Harsh Environments With INSAR

    NASA Astrophysics Data System (ADS)

    Sato, S.; Henschel, M. D.

    2012-01-01

    Since 2004, MDA GSI has provided ground deformation measurements for an oil field in northern Alberta, Canada using InSAR technology. During this period, the monitoring has reliably shown the slow rise of the oil field due to enhanced oil recovery operations. The InSAR monitoring solution is essentially based on the observation of point and point-like targets in the field. Ground conditions in the area are almost continuously changing (in their reflectivity characteristics), making it difficult to observe coherent patterns from the ground. The extended duration of the oil operations has allowed us to continue InSAR monitoring and transition from RADARSAT-1 to RADARSAT-2. RADARSAT-2's enhanced resolution capability has provided more targets of opportunity, as identified by a differential coherence method. This poster provides an overview of the long-term monitoring of the oil field in northern Alberta, Canada.

  16. Translating pharmacodynamic biomarkers from bench to bedside: analytical validation and fit-for-purpose studies to qualify multiplex immunofluorescent assays for use on clinical core biopsy specimens.

    PubMed

    Marrero, Allison; Lawrence, Scott; Wilsker, Deborah; Voth, Andrea Regier; Kinders, Robert J

    2016-08-01

    Multiplex pharmacodynamic (PD) assays have the potential to increase sensitivity of biomarker-based reporting for new targeted agents, as well as revealing significantly more information about target and pathway activation than single-biomarker PD assays. Stringent methodology is required to ensure reliable and reproducible results. Common to all PD assays is the importance of reagent validation, assay and instrument calibration, and the determination of suitable response calibrators; however, multiplex assays, particularly those performed on paraffin specimens from tissue blocks, bring format-specific challenges adding a layer of complexity to assay development. We discuss existing multiplex approaches and the development of a multiplex immunofluorescence assay measuring DNA damage and DNA repair enzymes in response to anti-cancer therapeutics and describe how our novel method addresses known issues. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Identification of a New Isoindole-2-yl Scaffold as a Qo and Qi Dual Inhibitor of Cytochrome bc1 Complex: Virtual Screening, Synthesis, and Biochemical Assay.

    PubMed

    Azizian, Homa; Bagherzadeh, Kowsar; Shahbazi, Sophia; Sharifi, Niusha; Amanlou, Massoud

    2017-09-18

    Respiratory chain ubiquinol-cytochrome (cyt) c oxidoreductase (cyt bc1 or complex III) has been demonstrated to be a promising target for numerous antibiotic and fungicide applications. In this study, a virtual screening of the NCI diversity database was carried out in order to find novel Qo/Qi cyt bc1 complex inhibitors. Structure-based virtual screening and molecular docking methodology were employed to screen compounds for inhibitory activity against the cyt bc1 complex, after an extensive reliability validation protocol using a cross-docking method and identification of the best scoring functions. Subsequently, the application of a rational filtering procedure over the target database resulted in the elucidation of a novel class of potent cyt bc1 complex inhibitors with binding energies and biological activities comparable to those of the standard inhibitor, antimycin.

  18. The Study of Intelligent Vehicle Navigation Path Based on Behavior Coordination of Particle Swarm

    PubMed Central

    Han, Gaining; Fu, Weiping; Wang, Wen

    2016-01-01

    In the behavior dynamics model, behavior competition leads to a shock (oscillation) problem in the intelligent vehicle's navigation path, because the time-variant target-seeking behavior and the obstacle-avoidance behavior occur simultaneously. Considering the safety and real-time requirements of the intelligent vehicle, the particle swarm optimization (PSO) algorithm is proposed to solve these problems by optimizing the weight coefficients of the heading angle and the path velocity. Firstly, according to the behavior dynamics model, the fitness function is defined in terms of the intelligent vehicle's driving characteristics, the distance between the vehicle and the obstacle, and the distance between the vehicle and the target. Secondly, the behavior coordination parameters that minimize the fitness function are obtained by the particle swarm optimization algorithm. Finally, the simulation results show that the optimization method and its fitness function reduce perturbations of the planned path and improve the vehicle's real-time performance and reliability. PMID:26880881

  19. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    PubMed Central

    Han, Joan C.; Elsea, Sarah H.; Pena, Heloísa B.; Pena, Sérgio Danilo Junho

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification using a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations. PMID:24288428
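
    The underlying copy-number call compares the fluorescent signal of a target amplicon with that of a co-amplified internal control in the same reaction. A minimal sketch follows; the ratio cutoffs are assumptions for illustration, not the published thresholds.

```python
# Illustrative copy-number call from quantitative fluorescent PCR:
# compare target-amplicon signal to an internal control amplicon
# amplified in the same reaction. Cutoffs are assumed values.
def call_copy_number(target_signal, control_signal,
                     del_cutoff=0.75, dup_cutoff=1.25):
    ratio = target_signal / control_signal
    if ratio < del_cutoff:
        return "deletion", ratio      # ~0.5 expected for one lost copy
    if ratio > dup_cutoff:
        return "duplication", ratio   # ~1.5 expected for one extra copy
    return "normal", ratio

print(call_copy_number(480.0, 1000.0))   # ratio 0.48 -> deletion
print(call_copy_number(1010.0, 990.0))   # ratio ~1.02 -> normal
```

    The internal control in the same tube is what makes the ratio robust to run-to-run variation in overall amplification efficiency.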

  20. Gold Nano Popcorn Attached SWCNT Hybrid Nanomaterial for Targeted Diagnosis and Photothermal Therapy of Human Breast Cancer Cells

    PubMed Central

    Beqa, Lule; Fan, Zhen; Singh, Anant Kumar; Senapati, Dulal; Ray, Paresh Chandra

    2011-01-01

    Breast cancer presents the greatest challenge in health care in today's world. The key to ultimately successful treatment of breast cancer is an early and accurate diagnosis. Current breast cancer treatments are often associated with severe side effects. Driven by this need, we report the design of a novel hybrid nanomaterial using gold nano popcorn-attached single-wall carbon nanotubes for targeted diagnosis and selective photothermal treatment. Targeted sensing of SK-BR-3 human breast cancer cells has been performed at the 10 cancer cells/mL level, using surface-enhanced Raman scattering of the single-wall carbon nanotube's D and G bands. Our data show that the S6 aptamer-attached hybrid nanomaterial-based SERS assay is highly sensitive to the targeted human breast cancer SK-BR-3 cell line and is able to distinguish it from the non-targeted MDA-MB breast cancer cell line and the HaCaT normal skin cell line. Our results also show that 10 minutes of photothermal therapy with a 1.5 W/cm2, 785 nm laser is enough to kill cancer cells very effectively using the S6 aptamer-attached hybrid nanomaterial. Possible mechanisms for targeted sensing and the operating principle for highly efficient photothermal therapy are discussed. The experimental results reported here open up a new possibility of using aptamer-modified hybrid nanomaterials for rapid, reliable diagnosis and targeted therapy of cancer cell lines. PMID:21842867

  1. Gold nano-popcorn attached SWCNT hybrid nanomaterial for targeted diagnosis and photothermal therapy of human breast cancer cells.

    PubMed

    Beqa, Lule; Fan, Zhen; Singh, Anant Kumar; Senapati, Dulal; Ray, Paresh Chandra

    2011-09-01

    Breast cancer presents the greatest challenge in health care in today's world. The key to ultimately successful treatment of breast cancer is an early and accurate diagnosis. Current breast cancer treatments are often associated with severe side effects. Driven by this need, we report the design of a novel hybrid nanomaterial using gold nano popcorn-attached single-wall carbon nanotubes for targeted diagnosis and selective photothermal treatment. Targeted sensing of SK-BR-3 human breast cancer cells has been performed at the 10 cancer cells/mL level, using surface-enhanced Raman scattering of the single-wall carbon nanotube's D and G bands. Our data show that the S6 aptamer-attached hybrid nanomaterial-based SERS assay is highly sensitive to the targeted human breast cancer SK-BR-3 cell line and is able to distinguish it from the non-targeted MDA-MB breast cancer cell line and the HaCaT normal skin cell line. Our results also show that 10 min of photothermal therapy with a 1.5 W/cm(2), 785 nm laser is enough to kill cancer cells very effectively using the S6 aptamer-attached hybrid nanomaterial. Possible mechanisms for targeted sensing and the operating principle for highly efficient photothermal therapy are discussed. The experimental results reported here open up a new possibility of using aptamer-modified hybrid nanomaterials for rapid, reliable diagnosis and targeted therapy of cancer cell lines.

  2. You look familiar, but I don’t care: Lure rejection in hybrid visual and memory search is not based on familiarity

    PubMed Central

    Wolfe, Jeremy M.; Boettcher, Sage E. P.; Josephs, Emilie L.; Cunningham, Corbin A.; Drew, Trafton

    2015-01-01

    In “hybrid” search tasks, observers hold multiple possible targets in memory while searching for those targets amongst distractor items in visual displays. Wolfe (2012) found that, if the target set is held constant over a block of trials, RTs in such tasks were a linear function of the number of items in the visual display and a linear function of the log of the number of items held in memory. However, in such tasks, the targets can become far more familiar than the distractors. Does this “familiarity” – operationalized here as the frequency and recency with which an item has appeared – influence performance in hybrid tasks? In Experiment 1, we compared searches where distractors appeared with the same frequency as the targets to searches where all distractors were novel. Distractor familiarity did not have any reliable effect on search. In Experiment 2, most distractors were novel but some critical distractors were as common as the targets while others were 4× more common. Familiar distractors did not produce false alarm errors, though they did slightly increase response times (RTs). In Experiment 3, observers successfully searched for the new, unfamiliar item among distractors that, in many cases, had been seen only once before. We conclude that when the memory set is held constant for many trials, item familiarity alone does not cause observers to mistakenly confuse targets with distractors. PMID:26191615
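    The RT pattern reported by Wolfe (2012), linear in visual set size and logarithmic in memory set size, can be expressed as a small model. The coefficients below are illustrative placeholders, not fitted values from the study.

```python
import math

def predicted_rt(n_display, mem_set, a=400.0, b=40.0, c=50.0):
    """Hypothetical hybrid-search RT (ms): intercept a, slope b per
    visual item, and slope c per doubling of the memory set.
    Coefficients are invented for illustration."""
    return a + b * n_display + c * math.log2(mem_set)

# Doubling the memory set adds a constant c ms, regardless of its size,
# while each extra display item adds a constant b ms.
delta = predicted_rt(8, 16) - predicted_rt(8, 8)
```

    This is why memory-set costs stay modest even for large target sets: going from 8 to 16 memorized targets costs the same as going from 1 to 2.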

  3. Sequencing Needs for Viral Diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, S N; Lam, M; Mulakken, N J

    2004-01-26

    We built a system to guide decisions regarding the amount of genomic sequencing required to develop diagnostic DNA signatures, which are short sequences that are sufficient to uniquely identify a viral species. We used our existing DNA diagnostic signature prediction pipeline, which selects regions of a target species genome that are conserved among strains of the target (for reliability, to prevent false negatives) and unique relative to other species (for specificity, to avoid false positives). We performed simulations, based on existing sequence data, to assess the number of genome sequences of a target species and of close phylogenetic relatives ("near neighbors") that are required to predict diagnostic signature regions that are conserved among strains of the target species and unique relative to other bacterial and viral species. For DNA viruses such as variola (smallpox), three target genomes provide sufficient guidance for selecting species-wide signatures. Three near neighbor genomes are critical for species specificity. In contrast, most RNA viruses require four target genomes and no near neighbor genomes, since lack of conservation among strains is more limiting than uniqueness. SARS and Ebola Zaire are exceptional, as additional target genomes currently do not improve predictions, but near neighbor sequences are urgently needed. Our results also indicate that double stranded DNA viruses are more conserved among strains than are RNA viruses, since in most cases there was at least one conserved signature candidate for the DNA viruses and zero conserved signature candidates for the RNA viruses.
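    The conservation/uniqueness filter at the heart of such a signature pipeline can be sketched as set operations over k-mers: keep subsequences present in every target-strain genome and absent from every near-neighbor genome. This toy version ignores mismatch tolerance, primer design constraints, and the pipeline's actual data structures.

```python
def signature_kmers(target_strains, near_neighbors, k=20):
    """Candidate signature k-mers: present in every target strain
    (conserved, to avoid false negatives) and absent from all near
    neighbors (unique, to avoid false positives)."""
    def kmers(seq):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}
    conserved = set.intersection(*(kmers(s) for s in target_strains))
    background = (set.union(*(kmers(s) for s in near_neighbors))
                  if near_neighbors else set())
    return conserved - background

# Toy genomes with k=4: only "TACG" is shared by both target strains
# and missing from the near neighbor.
sig = signature_kmers(["ACGTACGTAA", "ACGTACGTCC"], ["ACGTAC"], k=4)
```

    Adding more target strains can only shrink the conserved set, and adding more near neighbors can only shrink the final result, which is the trade-off the simulations in this record quantify.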

  4. A novel automotive headlight system based on digital micro-mirror devices and diffractive optical elements

    NASA Astrophysics Data System (ADS)

    Su, Ping; Song, Yuming; Ma, Jianshe

    2018-01-01

    The DMD (Digital Micro-mirror Device) has the advantages of a high refresh rate and high diffraction efficiency, which make it an ideal carrier for multiple illumination modes. DOEs (Diffractive Optical Elements) have the advantages of a high degree of design freedom, light weight, easy replication, and low cost, and can be used to reduce the weight, complexity, and cost of an optical system. A novel automotive headlamp system using a DMD as the light-distribution element and a DOE as the light-field modulation device is proposed in this paper. The pure-phase DOE is obtained by the GS algorithm using the Rayleigh-Sommerfeld diffraction integral model. Based on the standard automotive headlamp light intensity distribution in the target plane, the amplitude distribution on the DMD is obtained by numerical simulation, and the grayscale diagram loaded on the DMD can be obtained accordingly. Finally, the simulated light intensity distribution in the target plane is proportional to the national standard distribution, which verifies the validity of the novel system. The illumination system proposed in this paper provides a reliable hardware platform for intelligent headlamps.

  5. Engineered Aptamers to Probe Molecular Interactions on the Cell Surface

    PubMed Central

    Batool, Sana; Bhandari, Sanam; George, Shanell; Okeoma, Precious; Van, Nabeela; Zümrüt, Hazan E.; Mallikaratchy, Prabodhika

    2017-01-01

    Significant progress has been made in understanding the nature of molecular interactions on the cell membrane. To decipher such interactions, molecular scaffolds can be engineered as a tool to modulate these events as they occur on the cell membrane. To guarantee reliability, scaffolds that function as modulators of cell membrane events must be coupled to a targeting moiety with superior chemical versatility. In this regard, nucleic acid aptamers are a suitable class of targeting moieties. Aptamers are inherently chemical in nature, allowing extensive site-specific chemical modification to engineer sensing molecules. Aptamers can be easily selected using a simple laboratory-based in vitro evolution method, enabling the design and development of aptamer-based functional molecular scaffolds against a wide range of cell surface molecules. This article reviews the application of aptamers as monitors and modulators of molecular interactions on the mammalian cell surface with the aim of increasing our understanding of cell-surface receptor response to external stimuli. The information gained from these types of studies could eventually prove useful in engineering improved medical diagnostics and therapeutics. PMID:28850067

  6. Standoff detection of explosives and chemical agents using broadly tuned external-cavity quantum cascade lasers (EC-QCLs)

    NASA Astrophysics Data System (ADS)

    Takeuchi, Eric B.; Rayner, Timothy; Weida, Miles; Crivello, Salvatore; Day, Timothy

    2007-10-01

    Civilian soft targets such as transportation systems are being targeted by terrorists using IEDs and suicide bombers. Having the capability to remotely detect explosives, precursors and other chemicals would enable these assets to be protected with minimal interruption of the flow of commerce. Mid-IR laser technology offers the potential to detect explosives and other chemicals in real-time and from a safe standoff distance. While many of these agents possess "fingerprint" signatures in the mid-IR (i.e. in the 3-20 micron regime), their effective interrogation by a practical, field-deployable system has been limited by size, complexity, reliability and cost constraints of the base laser technology. Daylight Solutions has addressed these shortcomings by developing compact, portable, broadly tunable mid-IR laser sources based upon external-cavity quantum cascade technology. This technology is now being applied by Daylight in system level architectures for standoff and remote detection of explosives, precursors and chemical agents. Several of these architectures and predicted levels of performance will be presented.

  7. Technology test results from an intelligent, free-flying robot for crew and equipment retrieval in space

    NASA Technical Reports Server (NTRS)

    Erickson, J.; Goode, R.; Grimm, K.; Hess, C.; Norsworthy, R.; Anderson, G.; Merkel, L.; Phinney, D.

    1992-01-01

    The ground-based demonstrations of Extra Vehicular Activity (EVA) Retriever, a voice-supervised, intelligent, free-flying robot, are designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the Space Station. The EVA Retriever software is required to autonomously plan and execute a target rendezvous, grapple, and return to base while avoiding stationary and moving obstacles, with subsequent object handover. The software architecture incorporates a hierarchical decomposition of the control system that is horizontally partitioned into five major functional subsystems: sensing, perception, world model, reasoning, and acting. The design provides for supervised autonomy as the primary mode of operation. It is intended to be an evolutionary system, improving in capability over time as it earns crew trust through reliable and safe operation. This paper gives an overview of the hardware, a focus on software, and a summary of results achieved recently from both computer simulations and air bearing floor demonstrations. Limitations of the technology used are evaluated. Plans for the next phase, during which moving targets and obstacles drive real-time behavior requirements, are discussed.

  8. Technology test results from an intelligent, free-flying robot for crew and equipment retrieval in space

    NASA Astrophysics Data System (ADS)

    Erickson, Jon D.; Goode, R.; Grimm, K. A.; Hess, Clifford W.; Norsworthy, Robert S.; Anderson, Greg D.; Merkel, L.; Phinney, Dale E.

    1992-03-01

    The ground-based demonstrations of Extra Vehicular Activity (EVA) Retriever, a voice-supervised, intelligent, free-flying robot, are designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the space station. The EVA Retriever software is required to autonomously plan and execute a target rendezvous, grapple, and return to base while avoiding stationary and moving obstacles, with subsequent object handover. The software architecture incorporates a hierarchical decomposition of the control system that is horizontally partitioned into five major functional subsystems: sensing, perception, world model, reasoning, and acting. The design provides for supervised autonomy as the primary mode of operation. It is intended to be an evolutionary system, improving in capability over time as it earns crew trust through reliable and safe operation. This paper gives an overview of the hardware, a focus on software, and a summary of results achieved recently from both computer simulations and air bearing floor demonstrations. Limitations of the technology used are evaluated. Plans for the next phase, during which moving targets and obstacles drive real-time behavior requirements, are discussed.

  9. Control of the interaction strength of photonic molecules by nanometer precise 3D fabrication.

    PubMed

    Rawlings, Colin D; Zientek, Michal; Spieser, Martin; Urbonas, Darius; Stöferle, Thilo; Mahrt, Rainer F; Lisunova, Yuliya; Brugger, Juergen; Duerig, Urs; Knoll, Armin W

    2017-11-28

    Applications for high-resolution 3D profiles, so-called grayscale lithography, exist in diverse fields such as optics, nanofluidics and tribology. All of them require the fabrication of patterns with reliable absolute patterning depth independent of the substrate location and target materials. Here we present a complete patterning and pattern-transfer solution based on thermal scanning probe lithography (t-SPL) and dry etching. We demonstrate the fabrication of 3D profiles in silicon and silicon oxide with nanometer-scale accuracy of absolute depth levels. An accuracy of less than 1 nm standard deviation in t-SPL is achieved by providing an accurate physical model of the writing process to a model-based implementation of a closed-loop lithography process. For transferring the pattern to a target substrate we optimized the etch process and demonstrate linear amplification of grayscale patterns into silicon and silicon oxide with amplification ratios of ∼6 and ∼1, respectively. The performance of the entire process is demonstrated by manufacturing photonic molecules of desired interaction strength. Excellent agreement between fabricated and simulated structures has been achieved.

  10. Designing Microblog Direct Messages to Engage Social Media Users With Suicide Ideation: Interview and Survey Study on Weibo

    PubMed Central

    Tan, Ziying; Liu, Xingyun; Liu, Xiaoqian; Cheng, Qijin

    2017-01-01

    Background: While Web-based interventions can be efficacious, engaging a target population's attention remains challenging. We argue that strategies to draw such a population's attention should be tailored to meet its needs. Increasing user engagement in online suicide intervention development requires feedback from this group, since people with suicide ideation often refrain from seeking treatment. Objective: The goal of this study was to solicit feedback on the acceptability of the content of messaging from social media users with suicide ideation. To overcome the common concern of lack of engagement in online interventions and to ensure effective learning from the message, this research employs a customized design of both the content and the length of the message. Methods: In study 1, 17 participants suffering from suicide ideation were recruited. The first group (n=8) conversed with a professional suicide intervention doctor about its attitudes toward, and suggestions for, a direct message intervention. To ensure the reliability and consistency of the results, an identical interview was conducted with the second group (n=9). Based on the collected data, questionnaires about this intervention were formed. Study 2 recruited 4222 microblog users with suicide ideation via the Internet. Results: The two group interviews in study 1 yielded largely similar results; the small differences may relate to the 2 groups' varied perceptions of direct message design. Most participants reported that they would be most drawn to an intervention where they knew that the account was reliable. Of the 4222 microblog users, 725 returned completed questionnaires; 78.62% (570/725) of participants were not opposed to online suicide intervention, and they valued the link to extra suicide intervention information as long as the account appeared trustworthy. Their attitudes toward the intervention and the account were similar to those from study 1, and 3 important elements were found pertaining to the direct message: reliability of the account name, brevity of the message, and inclusion of the phone numbers of psychological intervention centers and psychological assessment details. Conclusions: This paper proposed strategies for engaging target populations in online suicide interventions. PMID:29233805

  11. Object acquisition and tracking for space-based surveillance

    NASA Astrophysics Data System (ADS)

    1991-11-01

    This report presents the results of research carried out by Space Computer Corporation under the U.S. government's Small Business Innovation Research (SBIR) Program. The work was sponsored by the Strategic Defense Initiative Organization and managed by the Office of Naval Research under Contracts N00014-87-C-0801 (Phase 1) and N00014-89-C-0015 (Phase 2). The basic purpose of this research was to develop and demonstrate a new approach to the detection of, and initiation of track on, moving targets using data from a passive infrared or visual sensor. This approach differs in very significant ways from the traditional approach of dividing the required processing into time dependent, object dependent, and data dependent processing stages. In that approach individual targets are first detected in individual image frames, and the detections are then assembled into tracks. That requires that the signal to noise ratio in each image frame be sufficient for fairly reliable target detection. In contrast, our approach bases detection of targets on multiple image frames, and, accordingly, requires a smaller signal to noise ratio. It is sometimes referred to as track before detect, and can lead to a significant reduction in total system cost. For example, it can allow greater detection range for a single sensor, or it can allow the use of smaller sensor optics. Both the traditional and track before detect approaches are applicable to systems using scanning sensors, as well as those which use staring sensors.
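    The track-before-detect idea, basing detection on multiple frames rather than one, can be illustrated with a toy 1-D example: averaging N frames shrinks the noise standard deviation by roughly sqrt(N), so a target too faint to threshold reliably in any single frame stands out in the average. All numbers below are illustrative.

```python
import random

def detect(frames, threshold):
    """Average the frames cell-by-cell, then threshold the average.
    With N frames, noise std shrinks by ~sqrt(N) while the target's
    mean amplitude is unchanged."""
    n = len(frames[0])
    avg = [sum(f[i] for f in frames) / len(frames) for i in range(n)]
    return [i for i, v in enumerate(avg) if v > threshold]

random.seed(1)
signal_cell, amplitude, noise_sigma = 7, 1.0, 1.0
# 100 frames of 16 cells: unit-amplitude target buried in unit-sigma
# noise (SNR of 1, far too low for single-frame thresholding).
frames = [[amplitude * (i == signal_cell) + random.gauss(0, noise_sigma)
           for i in range(16)] for _ in range(100)]
hits = detect(frames, threshold=0.5)
```

    After averaging 100 frames the noise std is about 0.1, so a 0.5 threshold sits roughly 5 sigma above the noise floor and 5 sigma below the target mean.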

  12. Object acquisition and tracking for space-based surveillance. Final report, Dec 88-May 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-11-27

    This report presents the results of research carried out by Space Computer Corporation under the U.S. government's Small Business Innovation Research (SBIR) Program. The work was sponsored by the Strategic Defense Initiative Organization and managed by the Office of Naval Research under Contracts N00014-87-C-0801 (Phase I) and N00014-89-C-0015 (Phase II). The basic purpose of this research was to develop and demonstrate a new approach to the detection of, and initiation of track on, moving targets using data from a passive infrared or visual sensor. This approach differs in very significant ways from the traditional approach of dividing the required processing into time-dependent, object-dependent, and data-dependent processing stages. In that approach individual targets are first detected in individual image frames, and the detections are then assembled into tracks. That requires that the signal to noise ratio in each image frame be sufficient for fairly reliable target detection. In contrast, our approach bases detection of targets on multiple image frames, and, accordingly, requires a smaller signal to noise ratio. It is sometimes referred to as track before detect, and can lead to a significant reduction in total system cost. For example, it can allow greater detection range for a single sensor, or it can allow the use of smaller sensor optics. Both the traditional and track before detect approaches are applicable to systems using scanning sensors, as well as those which use staring sensors.

  13. Point Cloud Based Relative Pose Estimation of a Satellite in Close Range

    PubMed Central

    Liu, Lujiang; Zhao, Gaopeng; Bo, Yuming

    2016-01-01

    Determination of the relative pose of satellites is essential in space rendezvous operations and on-orbit servicing missions. The key problems are the adoption of a suitable sensor on board the chaser and efficient techniques for pose estimation. This paper aims to estimate the pose of a target satellite in close range on the basis of its known model, using point cloud data generated by a flash LIDAR sensor. A novel model-based pose estimation method is proposed; it includes a fast and reliable initial pose acquisition method based on global optimal searching, processing the dense point cloud data directly, and a pose tracking method based on the Iterative Closest Point algorithm. A simulation system is also presented in order to evaluate the performance of the sensor and generate simulated sensor point cloud data. It also provides the ground-truth pose of the test target so that the pose estimation error can be quantified. To investigate the effectiveness of the proposed approach and the achievable pose accuracy, numerical simulation experiments were performed; the results demonstrate the algorithm's capability of operating on point clouds directly and under large pose variations. A field test was also conducted, and the results show that the proposed method is effective. PMID:27271633
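    The pose-tracking stage rests on iterating nearest-neighbour correspondence and alignment. The sketch below is a heavily simplified, translation-only 2-D variant of ICP; the actual method also estimates rotation (typically via an SVD step) and operates on dense 3-D LIDAR point clouds.

```python
def icp_translation(source, target, iters=10):
    """Estimate a pure 2-D translation aligning `source` to `target`
    by alternating nearest-neighbour matching with centroid-offset
    updates: the core loop structure of ICP, minus rotation."""
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        dx_sum = dy_sum = 0.0
        for (sx, sy) in source:
            px, py = sx + tx, sy + ty          # apply current estimate
            nx, ny = min(target,               # nearest target point
                         key=lambda q: (q[0] - px) ** 2 + (q[1] - py) ** 2)
            dx_sum += nx - px
            dy_sum += ny - py
        tx += dx_sum / len(source)             # move toward matches
        ty += dy_sum / len(source)
    return tx, ty

# Toy clouds: dst is src shifted by (2, 3).
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [(2.0, 3.0), (3.0, 3.0), (2.0, 4.0)]
shift = icp_translation(src, dst)
```

    As in full ICP, a poor initial estimate can lock onto wrong correspondences, which is why the record pairs this tracking loop with a global initial-acquisition step.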

  14. Development of a DNA Microarray-Based Assay for the Detection of Sugar Beet Root Rot Pathogens.

    PubMed

    Liebe, Sebastian; Christ, Daniela S; Ehricht, Ralf; Varrelmann, Mark

    2016-01-01

    Sugar beet root rot diseases that occur during the cropping season or in storage are accompanied by high yield losses and a severe reduction in processing quality. The vast diversity of microorganism species involved in rot development requires molecular tools allowing simultaneous identification of many different targets. Therefore, a new microarray technology (ArrayTube) was applied in this study to improve the diagnosis of sugar beet root rot diseases. Based on three marker genes (internal transcribed spacer, translation elongation factor 1 alpha, and 16S ribosomal DNA), 42 well-performing probes enabled the identification of prevalent field pathogens (e.g., Aphanomyces cochlioides), storage pathogens (e.g., Botrytis cinerea), and ubiquitous spoilage fungi (e.g., Penicillium expansum). All probes were tested for specificity against pure cultures from 73 microorganism species, as well as for in planta detection of their target species using inoculated sugar beet tissue. Microarray-based identification of root rot pathogens in diseased field beets was successfully confirmed by classical detection methods. The high discriminatory potential was proven by Fusarium species differentiation based on a single nucleotide polymorphism. The results demonstrate that the ArrayTube constitutes an innovative tool allowing rapid and reliable detection of plant pathogens, particularly when multiple microorganism species are present.

  15. Air-condition Control System of Weaving Workshop Based on LabVIEW

    NASA Astrophysics Data System (ADS)

    Song, Jian

    An air-conditioning measurement and control system based on LabVIEW is proposed to effectively control the environmental targets in a weaving workshop. The system is built on virtual instrument technology, using the LabVIEW development platform from NI. It is composed of an upper PC, central control nodes based on the CC2530, sensor nodes, sensor modules, and executive devices. A fuzzy control algorithm is employed to achieve accurate control of temperature and humidity. A user-friendly man-machine interface is designed with virtual instrument technology at the core of the software. Experiments show that the measurement and control system runs stably and reliably and meets the functional requirements for controlling the weaving workshop.

  16. Virtual Sensor Web Architecture

    NASA Astrophysics Data System (ADS)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs: intelligent, integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models, with event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.

  17. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent emission intensity reduction (with respect to 2005 levels) along with large-scale renewable energy targets (100 GW solar, 60 GW wind, and 10 GW biomass energy by 2022) in the INDCs submitted under the Paris agreement. But large-scale integration of renewable energy is a complex process that faces a number of problems, such as capital intensity, matching intermittent generation to load with limited storage capacity, and reliability. This study therefore attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and analyze the implications for power sector operations. The study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios: a base case (no RE addition), an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass), and a low-RE scenario (50 GW solar, 30 GW wind), created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights into the trade-offs and investment decisions involved in achieving mitigation targets. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.

  18. A systems biology approach for miRNA-mRNA expression patterns analysis in non-small cell lung cancer.

    PubMed

    Najafi, Ali; Tavallaei, Mahmood; Hosseini, Sayed Mostafa

    2016-01-01

    Non-small cell lung cancer (NSCLC) is a prevalent and heterogeneous subtype of lung cancer, accounting for 85 percent of patients. MicroRNAs (miRNAs), a class of small endogenous non-coding RNAs, participate in the post-transcriptional regulation of gene expression. Deregulation of miRNA expression has therefore added further layers of complexity to the molecular etiology and pathogenesis of different diseases and malignancies. Although a considerable number of studies have been carried out to illuminate this complexity in NSCLC, they have remained less effective due to the lack of a holistic and integrative systems biology approach that considers the natural elaborations of miRNA function and can reliably nominate the signaling pathways and therapeutic target genes most affected by deregulated miRNAs in a particular pathological condition. Herein, we utilized a holistic systems biology approach, based on appropriate re-analyses of microarray datasets followed by reliable data filtering, to analyze the integrative and combinatorial deregulated miRNA-mRNA interaction network in NSCLC, aiming to ascertain miRNA-dysregulated signaling pathways and potential therapeutic miRNAs and mRNAs that represent a lion's share of the various aspects of NSCLC pathogenesis. Our systems biology approach nominated 1) important deregulated miRNAs in NSCLC compared with normal tissue, 2) significant and confident deregulated mRNAs anti-correlatively targeted by deregulated miRNAs in NSCLC, and 3) dysregulated signaling pathways associated with deregulated miRNA-mRNA interactions in NSCLC. These results suggest possible mechanisms of function of deregulated miRNAs and mRNAs in NSCLC that could be used as potential therapeutic targets.

  19. Direct costs of unintended pregnancy in the Russian federation.

    PubMed

    Lowin, Julia; Jarrett, James; Dimova, Maria; Ignateva, Victoria; Omelyanovsky, Vitaly; Filonenko, Anna

    2015-02-01

    In 2010, almost every third pregnancy in Russia was terminated, indicating that unintended pregnancy (UP) is a public health problem. The aim of this study was to estimate the direct cost of UP to the healthcare system in Russia and the proportion attributable to using unreliable contraception. A cost model was built, adopting a generic payer perspective with a 1-year time horizon. The analysis cohort was defined as women of childbearing age between 18 and 44 years actively seeking to avoid pregnancy. Model inputs were derived from published sources or government statistics with a 2012 cost base. To estimate the number of UPs attributable to unreliable methods, the model combined annual typical-use failure rates and age-adjusted utilization for each contraceptive method. Published survey data were used to adjust the total cost of UP by the number of UPs that were mistimed rather than unwanted. Scenario analyses considered alternate allocations of methods to the reliable and unreliable categories and an estimate of the burden of UP in the target subgroup of women aged 18-29 years. The model estimated 1,646,799 UPs in the analysis cohort (women aged 18-44 years) with an associated annual cost of US$783 million. The model estimated 1,019,371 UPs in the target group of 18-29 years, of which 88 % were attributable to unreliable contraception. The total cost of UPs in the target group was estimated at approximately US$498 million, of which US$441 million could be considered attributable to the use of unreliable methods. The cost of UP attributable to the use of unreliable contraception in Russia is substantial. Policies encouraging the use of reliable contraceptive methods could reduce the burden of UP.
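    The core arithmetic of such a cost model, summing users times typical-use failure rate per contraceptive method and multiplying by a unit cost, can be sketched as follows. All user counts, failure rates, and the per-UP cost are invented for illustration and are not the study's inputs.

```python
def unintended_pregnancies(cohort):
    """Expected annual UPs: for each contraceptive method, number of
    users times the typical-use annual failure rate."""
    return sum(users * failure_rate
               for users, failure_rate in cohort.values())

# Hypothetical cohort: method -> (users, typical-use failure rate/year)
cohort = {
    "condom":     (1_000_000, 0.13),
    "pill":       (  800_000, 0.07),
    "withdrawal": (  500_000, 0.20),
}
ups = unintended_pregnancies(cohort)
cost = ups * 475.0  # assumed average direct cost per UP, US$
```

    Reclassifying a method between the "reliable" and "unreliable" buckets, as in the study's scenario analyses, changes only which terms of this sum are attributed to unreliable contraception, not the total.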

  20. Differentiation of Toxocara canis and Toxocara cati based on PCR-RFLP analyses of rDNA-ITS and mitochondrial cox1 and nad1 regions.

    PubMed

    Mikaeili, Fattaneh; Mathis, Alexander; Deplazes, Peter; Mirhendi, Hossein; Barazesh, Afshin; Ebrahimi, Sepideh; Kia, Eshrat Beigom

    2017-09-26

    The definitive genetic identification of Toxocara species is currently based on PCR/sequencing. The objectives of the present study were to design and conduct an in silico polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) method for identification of Toxocara species. In silico analyses using the DNASIS and NEBcutter software were performed with rDNA internal transcribed spacer (ITS), and mitochondrial cox1 and nad1, sequences obtained in our previous studies, along with relevant sequences deposited in GenBank. Consequently, RFLP profiles were designed, and all isolates of T. canis and T. cati collected from dogs and cats in different geographical areas of Iran were investigated with the RFLP method using some of the identified suitable enzymes. The in silico analyses predicted that, on the cox1 gene, only the MboII enzyme is appropriate for PCR-RFLP to reliably distinguish the two species. No suitable enzyme for PCR-RFLP on the nad1 gene was identified that yields the same pattern for all isolates of a species. DNASIS software showed that there are 241 suitable restriction enzymes for the differentiation of T. canis from T. cati based on ITS sequences. The RsaI, MvaI and SalI enzymes were selected to evaluate the reliability of the in silico PCR-RFLP. The sizes of restriction fragments obtained by PCR-RFLP of all samples consistently matched the expected RFLP patterns. The ITS sequences are usually conserved, and the PCR-RFLP approach targeting the ITS sequence is recommended for the molecular differentiation of Toxocara species; it can provide a reliable tool for identification purposes, particularly at the larval and egg stages.
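
    The in silico RFLP idea above can be illustrated with a toy digest: locate an enzyme's recognition site in an amplicon and predict the resulting fragment sizes, so that species with and without the site give different banding patterns. The sequences below are invented; only the RsaI recognition site (GT^AC, cutting between T and A) reflects the real enzyme:

```python
# Minimal in silico restriction digest: find every occurrence of the
# recognition site and report predicted fragment lengths.
def digest(seq, site, cut_offset):
    """Return predicted fragment lengths after cutting at every site."""
    cuts, start = [], 0
    while (i := seq.find(site, start)) != -1:
        cuts.append(i + cut_offset)  # cut position within the site
        start = i + 1
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:])]

# RsaI recognizes GTAC and cuts between T and A (offset 2 in the site).
amplicon_a = "AAAGTACCCCCCGTACAAA"   # two sites -> three fragments
amplicon_b = "AAACCCCCCCCCCCCCAAA"   # no site  -> one uncut fragment
print(digest(amplicon_a, "GTAC", 2))  # [5, 9, 5]
print(digest(amplicon_b, "GTAC", 2))  # [19]
```

Two isolates yielding `[5, 9, 5]` versus `[19]` would be trivially distinguishable on a gel, which is the logic the study applies with real ITS amplicons.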

  1. Miss-distance indicator for tank main guns

    NASA Astrophysics Data System (ADS)

    Bornstein, Jonathan A.; Hillis, David B.

    1996-06-01

    Tank main gun systems must possess extremely high levels of accuracy to perform successfully in battle. Under some circumstances, the first round fired in an engagement may miss the intended target, and it becomes necessary to rapidly correct fire. A breadboard automatic miss-distance indicator system was previously developed to assist in this process. The system, which would be mounted on a 'wingman' tank, consists of a charge-coupled device (CCD) camera and computer-based image-processing system, coupled with a separate infrared sensor to detect muzzle flash. For the system to be successfully employed with current-generation tanks, it must be reliable, be relatively low cost, and respond rapidly while maintaining current firing rates. Recently, the original indicator system was developed further in an effort to achieve these goals. Efforts have focused primarily upon enhanced image-processing algorithms, both to improve system reliability and to reduce processing requirements. Intelligent application of newly refined trajectory models has permitted examination of reduced areas of interest and enhanced rejection of false alarms, significantly improving system performance.

  2. The Japanese Criminal Thinking Inventory: Development, Reliability, and Initial Validation of a New Scale for Assessing Criminal Thinking in a Japanese Offender Population.

    PubMed

    Kishi, Kaori; Takeda, Fumi; Nagata, Yuko; Suzuki, Junko; Monma, Takafumi; Asanuma, Tohru

    2015-11-01

    Using a sample of 116 Japanese men who had been placed under parole/probationary supervision or released from prison, the present study examined standardization, reliability, and validation of the Japanese Criminal Thinking Inventory (JCTI) that was based on the short form of the Psychological Inventory of Criminal Thinking Styles (PICTS), a self-rating instrument designed to evaluate cognitive patterns specific to criminal conduct. An exploratory factor analysis revealed that four dimensions adequately captured the structure of the JCTI, and the resultant 17-item JCTI demonstrated high internal consistency. Compared with the Japanese version of the Buss-Perry Aggression Questionnaire (BAQ), the JCTI showed a favorable pattern of criterion-related validity. Prior criminal environment and drug abuse as the most recent offense also significantly correlated with the JCTI total score. Overall, the JCTI possesses an important implication for offender rehabilitation as it identifies relevant cognitive targets and assesses offender progress. © The Author(s) 2014.
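
    The "high internal consistency" reported above is conventionally quantified with Cronbach's alpha. A generic sketch on invented item scores (not JCTI data):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# Item scores below are invented, purely to exercise the formula.
def cronbach_alpha(items):
    """items: list of per-item score lists, all over the same respondents."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Three invented items scored by four respondents:
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5], [1, 3, 3, 5]])
# items that rise and fall together give alpha close to 1
```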

  3. Supramolecular Affinity Chromatography for Methylation-Targeted Proteomics.

    PubMed

    Garnett, Graham A E; Starke, Melissa J; Shaurya, Alok; Li, Janessa; Hof, Fraser

    2016-04-05

    Proteome-wide studies of post-translationally methylated species using mass spectrometry are complicated by high sample diversity, competition for ionization among peptides, and mass redundancies. Antibody-based enrichment has powered methylation proteomics until now, but the reliability, pan-specificity, polyclonal nature, and stability of the available pan-specific antibodies are problematic and do not provide a standard, reliable platform for investigators. We have invented an anionic supramolecular host that can form host-guest complexes selectively with methyllysine-containing peptides and used it to create a methyllysine-affinity column. The column resolves peptides on the basis of methylation, a feat impossible with a comparable commercial cation-exchange column. A proteolyzed nuclear extract was separated on the methyl-affinity column prior to standard proteomics analysis. This experiment demonstrates that such chemical methyl-affinity columns are capable of enriching and improving the analysis of methyllysine residues from complex protein mixtures. We discuss the importance of this advance in the context of biomolecule-driven enrichment methods.

  4. Modern methodology of designing target reliability into rotating mechanical components

    NASA Technical Reports Server (NTRS)

    Kececioglu, D. B.; Chester, L. B.

    1973-01-01

    Experimentally determined distributional cycles-to-failure versus maximum alternating nominal strength (S-N) diagrams, and distributional mean nominal strength versus maximum alternating nominal strength (Goodman) diagrams, are presented. These distributional S-N and Goodman diagrams are for AISI 4340 steel, R sub c 35/40 hardness, round, cylindrical specimens 0.735 in. in diameter and 6 in. long, with a circumferential groove of 0.145 in. radius for a theoretical stress concentration of 1.42 and of 0.034 in. radius for a stress concentration of 2.34. The specimens were subjected to reversed bending and steady torque in three specially built complex-fatigue research machines. Based on these results, the effects of superimposing steady torque on reversed bending on the distributional S-N and Goodman diagrams and on service life are established, as well as the effect of various stress concentrations. In addition, a computer program for determining the three-parameter Weibull distribution representing the cycles-to-failure data and two methods for calculating the reliability of components subjected to cumulative fatigue loads are given.
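
    The three-parameter Weibull model named above gives the probability of surviving N cycles as R(N) = exp(-((N - gamma)/eta)^beta) for N > gamma, where gamma is the location (minimum-life) parameter, eta the scale, and beta the shape. A minimal sketch with illustrative parameter values, not the paper's fitted ones:

```python
# Three-parameter Weibull reliability for cycles-to-failure data.
# gamma: minimum life, eta: characteristic life, beta: shape (all invented).
import math

def weibull_reliability(n_cycles, gamma, eta, beta):
    """R(N): probability a component survives N cycles."""
    if n_cycles <= gamma:
        return 1.0  # no failures below the minimum life
    return math.exp(-(((n_cycles - gamma) / eta) ** beta))

r = weibull_reliability(50_000, gamma=10_000, eta=100_000, beta=1.5)
# below the location parameter the model predicts certain survival:
r_early = weibull_reliability(5_000, gamma=10_000, eta=100_000, beta=1.5)  # 1.0
```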

  5. The 3600 hp split-torque helicopter transmission

    NASA Technical Reports Server (NTRS)

    White, G.

    1985-01-01

    Final design details of a helicopter transmission powered by twin GE T700 engines, each rated at 1800 hp, are presented. It is demonstrated that, in comparison with conventional helicopter transmission arrangements, the split-torque design offers: a weight reduction of 15%; a reduction in drive train losses of 9%; and improved reliability resulting from redundant drive paths between the two engines and the main shaft. The transmission fits within the NASA LeRC 3000 hp Test Stand and accepts the existing positions for engine inputs, main shaft, connecting drive shafts, and the cradle attachment points. One necessary change to the test stand involved gear trains of different ratio in the tail drive gearbox. The design allows for progressive uprating of the twin-engine input power from 3600 to 4500 hp. In this way the test transmission will provide a base for several years of analytical, research, and component development effort targeted at improving the performance and reliability of helicopter transmissions.

  6. Statistical sensor fusion analysis of near-IR polarimetric and thermal imagery for the detection of minelike targets

    NASA Astrophysics Data System (ADS)

    Weisenseel, Robert A.; Karl, William C.; Castanon, David A.; DiMarzio, Charles A.

    1999-02-01

    We present an analysis of statistical model based data-level fusion for near-IR polarimetric and thermal data, particularly for the detection of mines and mine-like targets. Typical detection-level data fusion methods, approaches that fuse detections from individual sensors rather than fusing at the level of the raw data, do not account rationally for the relative reliability of different sensors, nor the redundancy often inherent in multiple sensors. Representative examples of such detection-level techniques include logical AND/OR operations on detections from individual sensors and majority vote methods. In this work, we exploit a statistical data model for the detection of mines and mine-like targets to compare and fuse multiple sensor channels. Our purpose is to quantify the amount of knowledge that each polarimetric or thermal channel supplies to the detection process. With this information, we can make reasonable decisions about the usefulness of each channel. We can use this information to improve the detection process, or we can use it to reduce the number of required channels.
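
    The contrast drawn above, detection-level voting versus statistical data-level fusion that rationally weights channel reliability, can be sketched with a simple Gaussian channel model. All means, noise levels, and observations are invented; this is not the paper's actual statistical model:

```python
# Data-level fusion as a summed log-likelihood ratio (LLR): each channel's
# evidence is weighted by its noise level, so a reliable channel can
# outvote two noisy ones, which a majority vote cannot express.
import math

def llr_fusion(observations, mu1, mu0, sigma):
    """Sum per-channel Gaussian LLRs for "target" (mu1) vs "background" (mu0).
    Channels with small sigma (more reliable) contribute more weight."""
    total = 0.0
    for x, m1, m0, s in zip(observations, mu1, mu0, sigma):
        total += (-((x - m1) ** 2) + ((x - m0) ** 2)) / (2 * s * s)
    return total

obs   = [0.9, 0.4, 0.2]   # raw channel outputs (invented)
mu1   = [1.0, 1.0, 1.0]   # channel means under "mine present"
mu0   = [0.0, 0.0, 0.0]   # channel means under "background"
sigma = [0.1, 1.0, 1.0]   # channel 1 is far more reliable than 2 and 3

votes = sum(x > 0.5 for x in obs)       # majority vote: 1 of 3 -> "no mine"
llr = llr_fusion(obs, mu1, mu0, sigma)  # large positive LLR -> "mine"
```

The vote discards channel 1's strong, reliable evidence, while the LLR keeps it; that is the sense in which detection-level fusion "does not account rationally for the relative reliability of different sensors."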

  7. A targeted change-detection procedure by combining change vector analysis and post-classification approach

    NASA Astrophysics Data System (ADS)

    Ye, Su; Chen, Dongmei; Yu, Jie

    2016-04-01

    In remote sensing, conventional supervised change-detection methods usually require effective training data for multiple change types. This paper introduces a more flexible and efficient procedure that seeks to identify only the changes that users are interested in, hereafter referred to as "targeted change detection". Based on the one-class classifier "Support Vector Domain Description" (SVDD), a novel algorithm named "Three-layer SVDD Fusion" (TLSF) is developed specially for targeted change detection. The proposed algorithm combines one-class classification generated from change vector maps, as well as before- and after-change images, in order to obtain a more reliable detection result. In addition, this paper introduces a detailed workflow for implementing this algorithm. This workflow has been applied to two case studies with different practical monitoring objectives: urban expansion and forest fire assessment. The experimental results of these two case studies show that the overall accuracy of our proposed algorithm is superior (Kappa statistics are 86.3% and 87.8% for Cases 1 and 2, respectively) compared to applying SVDD to change vector analysis and post-classification comparison.
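
    The change-vector layer of the workflow above can be sketched as a per-pixel spectral difference magnitude thresholded into a change mask; the TLSF algorithm then feeds such maps into its one-class classifiers. Band values and the threshold below are illustrative:

```python
# Change vector analysis (CVA) sketch: the change magnitude of a pixel is
# the Euclidean norm of its spectral difference vector between dates.
import math

def change_magnitude(before, after):
    """Euclidean norm of the spectral difference vector, per pixel."""
    return [math.dist(b, a) for b, a in zip(before, after)]

# Invented (red, NIR) reflectances for three pixels at two dates:
before = [(0.30, 0.40), (0.10, 0.20), (0.50, 0.55)]
after  = [(0.32, 0.41), (0.60, 0.70), (0.52, 0.54)]

mags = change_magnitude(before, after)
mask = [m > 0.2 for m in mags]  # only the middle pixel changed markedly
```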

  8. A low density microarray method for the identification of human papillomavirus type 18 variants.

    PubMed

    Meza-Menchaca, Thuluz; Williams, John; Rodríguez-Estrada, Rocío B; García-Bravo, Aracely; Ramos-Ligonio, Ángel; López-Monteon, Aracely; Zepeda, Rossana C

    2013-09-26

    We describe a novel microarray-based method for the screening of oncogenic human papillomavirus 18 (HPV-18) molecular variants. Because sequencing methodology may underestimate samples containing more than one variant, we designed a specific and sensitive stacking DNA hybridization assay. This technology can be used to discriminate between the three possible phylogenetic branches of HPV-18. Probes were attached covalently on glass slides and hybridized with single-stranded DNA targets. Prior to hybridization with the probes, the target strands were pre-annealed with three auxiliary contiguous oligonucleotides flanking the target sequences. Screening of HPV-18 positive cell lines and cervical samples was used to evaluate the performance of this HPV DNA microarray. Our results demonstrate that the HPV-18 variants hybridized specifically to their probes, with no detection of nonspecific signals. Specific probes successfully revealed detectable point mutations in these variants. The present DNA oligoarray system can be used as a reliable, sensitive and specific method for HPV-18 variant screening. Furthermore, this simple assay allows the use of inexpensive equipment, making it accessible in resource-poor settings.

  9. A Low Density Microarray Method for the Identification of Human Papillomavirus Type 18 Variants

    PubMed Central

    Meza-Menchaca, Thuluz; Williams, John; Rodríguez-Estrada, Rocío B.; García-Bravo, Aracely; Ramos-Ligonio, Ángel; López-Monteon, Aracely; Zepeda, Rossana C.

    2013-01-01

    We describe a novel microarray-based method for the screening of oncogenic human papillomavirus 18 (HPV-18) molecular variants. Because sequencing methodology may underestimate samples containing more than one variant, we designed a specific and sensitive stacking DNA hybridization assay. This technology can be used to discriminate between the three possible phylogenetic branches of HPV-18. Probes were attached covalently on glass slides and hybridized with single-stranded DNA targets. Prior to hybridization with the probes, the target strands were pre-annealed with three auxiliary contiguous oligonucleotides flanking the target sequences. Screening of HPV-18 positive cell lines and cervical samples was used to evaluate the performance of this HPV DNA microarray. Our results demonstrate that the HPV-18 variants hybridized specifically to their probes, with no detection of nonspecific signals. Specific probes successfully revealed detectable point mutations in these variants. The present DNA oligoarray system can be used as a reliable, sensitive and specific method for HPV-18 variant screening. Furthermore, this simple assay allows the use of inexpensive equipment, making it accessible in resource-poor settings. PMID:24077317

  10. A portable molecular-sieve-based CO{sub 2} sampling system for radiocarbon measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palonen, V., E-mail: vesa.palonen@helsinki.fi

    We have developed a field-capable sampling system for the collection of CO{sub 2} samples for radiocarbon-concentration measurements. Most target systems in environmental research are limited in volume and CO{sub 2} concentration, making conventional flask sampling hard or impossible for radiocarbon studies. The present system captures the CO{sub 2} selectively to cartridges containing 13X molecular sieve material. The sampling does not introduce significant under-pressures or significant losses of moisture to the target system, making it suitable for most environmental targets. The system also incorporates a significantly larger sieve container for the removal of CO{sub 2} from chambers prior to the CO{sub 2} build-up phase and sampling. In addition, both the CO{sub 2} and H{sub 2}O content of the sample gas are measured continuously. This enables in situ estimation of the amount of collected CO{sub 2} and the determination of CO{sub 2} flux to a chamber. The portable sampling system is described in detail and tests for the reliability of the method are presented.
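
    The in situ estimate mentioned above (the amount of collected CO2 inferred from continuously measured CO2 content) amounts to integrating concentration times gas flow over the sampling period. A rough sketch; the flow rate, molar volume, and readings below are invented and are not the instrument's actual parameters:

```python
# Estimate collected CO2 by trapezoidal integration of mole fraction x flow.
# All sample values are hypothetical; molar volume ~24 L/mol is assumed for
# roughly ambient lab conditions.
def collected_co2_mol(times_s, co2_ppm, flow_lpm, molar_volume_l=24.0):
    """Return moles of CO2 routed to the sieve over the sampling period."""
    total_l = 0.0  # liters of pure CO2
    for (t0, c0), (t1, c1) in zip(zip(times_s, co2_ppm),
                                  zip(times_s[1:], co2_ppm[1:])):
        mean_frac = (c0 + c1) / 2 * 1e-6          # ppm -> mole fraction
        total_l += mean_frac * flow_lpm * (t1 - t0) / 60.0
    return total_l / molar_volume_l

# 10 minutes at 1 L/min of gas holding a steady 400 ppm CO2:
mol = collected_co2_mol(times_s=[0, 600], co2_ppm=[400, 400], flow_lpm=1.0)
```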

  11. HLA-targeted flow cytometric sorting of blood cells allows separation of pure and viable microchimeric cell populations.

    PubMed

    Drabbels, Jos J M; van de Keur, Carin; Kemps, Berit M; Mulder, Arend; Scherjon, Sicco A; Claas, Frans H J; Eikmans, Michael

    2011-11-10

    Microchimerism is defined by the presence of low levels of nonhost cells in a person. We developed a reliable method for separating viable microchimeric cells from the host environment. For flow cytometric cell sorting, HLA antigens were targeted with human monoclonal HLA antibodies (mAbs). Optimal separation of microchimeric cells (present at a proportion as low as 0.01% in artificial mixtures) was obtained with 2 different HLA mAbs, one targeting the chimeric cells and the other the background cells. To verify purity of separated cell populations, flow-sorted fractions of 1000 cells were processed for DNA analysis by HLA-allele-specific and Y-chromosome-directed real-time quantitative PCR assays. After sorting, PCR signals of chimeric DNA markers in the positive fractions were significantly enhanced compared with those in the presort samples, and they were similar to those in 100% chimeric control samples. Next, we demonstrate applicability of HLA-targeted FACS sorting after pregnancy by separating chimeric maternal cells from child umbilical cord mononuclear cells. Targeting allelic differences with anti-HLA mAbs with FACS sorting allows maximal enrichment of viable microchimeric cells from a background cell population. The current methodology enables reliable microchimeric cell detection and separation in clinical specimens.

  12. Planck 2015 results. XXVI. The Second Planck Catalogue of Compact Sources

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Argüeso, F.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Battaner, E.; Beichman, C.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Böhringer, H.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.-R.; Chiang, H. C.; Christensen, P. R.; Clemens, M.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. 
F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Negrello, M.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanghera, H. S.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Stolyarov, V.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tornikoski, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Walter, B.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-09-01

    The Second Planck Catalogue of Compact Sources is a list of discrete objects detected in single-frequency maps from the full duration of the Planck mission and supersedes previous versions. It consists of compact sources, both Galactic and extragalactic, detected over the entire sky. Compact sources detected in the lower frequency channels are assigned to the PCCS2, while at higher frequencies they are assigned to one of two subcatalogues, the PCCS2 or PCCS2E, depending on their location on the sky. The first of these (PCCS2) covers most of the sky and allows the user to produce subsamples at higher reliabilities than the target 80% integral reliability of the catalogue. The second (PCCS2E) contains sources detected in sky regions where the diffuse emission makes it difficult to quantify the reliability of the detections. Both the PCCS2 and PCCS2E include polarization measurements, in the form of polarized flux densities, or upper limits, and orientation angles for all seven polarization-sensitive Planck channels. The improved data-processing of the full-mission maps and their reduced noise levels allow us to increase the number of objects in the catalogue, improving its completeness for the target 80% reliability as compared with the previous versions, the PCCS and the Early Release Compact Source Catalogue (ERCSC).

  13. A reliability and mass perspective of SP-100 Stirling cycle lunar-base powerplant designs

    NASA Technical Reports Server (NTRS)

    Bloomfield, Harvey S.

    1991-01-01

    The purpose was to obtain reliability and mass perspectives on the selection of space power system conceptual designs based on SP-100 reactor and Stirling cycle power-generation subsystems. The approach taken was to: (1) develop a criterion for an acceptable overall reliability risk as a function of the expected range of emerging-technology subsystem unit reliabilities; (2) conduct reliability and mass analyses for a diverse matrix of 800-kWe lunar-base design configurations employing single and multiple powerplants with both full and partial subsystem redundancy combinations; and (3) derive reliability and mass perspectives on the selection of conceptual design configurations that meet an acceptable reliability criterion with the minimum system mass increase relative to the reference powerplant design. The developed perspectives provided valuable insight into the considerations required to identify and characterize high-reliability, low-mass lunar-base powerplant conceptual designs.
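
    The redundancy arithmetic underlying such analyses is the probability that at least k of n independent units survive. A standard sketch; the per-unit reliability and the configuration below are illustrative, not values from the study:

```python
# Reliability of a k-of-n redundant configuration: the system works if at
# least k of n independent units work, each with reliability r.
from math import comb

def k_of_n_reliability(k, n, r):
    """P(at least k of n units work), units independent, reliability r each."""
    return sum(comb(n, j) * r**j * (1 - r) ** (n - j) for j in range(k, n + 1))

single = k_of_n_reliability(1, 1, 0.90)     # one full-size unit: 0.90
redundant = k_of_n_reliability(3, 4, 0.90)  # 4 partial units, any 3 suffice
# partial redundancy raises system reliability from 0.90 to about 0.948,
# at the cost of the extra unit's mass -- the trade the analysis quantifies
```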

  14. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation of and procurement processes for simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas variations limited to cost and customer service, or to cost and technical stability, were less sensitive (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process with a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
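
    A weighted-factor index like the SVI can be sketched as a normalized weighted sum of factor scores per simulator. The factor names echo the top-ranked ones above, but the weights, scores, and scoring scale are invented, not the published tool:

```python
# Hypothetical value index: score each simulator on weighted factors and
# rank candidates by the normalized weighted sum.
def value_index(scores, weights):
    """Weighted sum of factor scores, normalized by total weight."""
    total_w = sum(weights.values())
    return sum(scores[f] * w for f, w in weights.items()) / total_w

# Invented weights (higher = more important) and 1-5 scores:
weights = {"technical stability": 5, "customer service": 4, "cost": 3}
candidates = {
    "A": {"technical stability": 4, "customer service": 5, "cost": 2},
    "B": {"technical stability": 3, "customer service": 3, "cost": 5},
}

ranked = sorted(candidates,
                key=lambda s: value_index(candidates[s], weights),
                reverse=True)
# simulator A wins: it is strong on the heavily weighted factors even
# though B is much cheaper
```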

  15. Carotid Plaque Morphological Classification Compared With Biomechanical Cap Stress: Implications for a Magnetic Resonance Imaging-Based Assessment.

    PubMed

    Gijsen, Frank J H; Nieuwstadt, Harm A; Wentzel, Jolanda J; Verhagen, Hence J M; van der Lugt, Aad; van der Steen, Antonius F W

    2015-08-01

    Two approaches to targeting plaque vulnerability, a histopathologic classification scheme and a biomechanical analysis, were compared, and the implications for noninvasive risk stratification of carotid plaques using magnetic resonance imaging were assessed. Seventy-five histological plaque cross sections were obtained from carotid endarterectomy specimens from 34 patients (>70% stenosis) and subjected to both a Virmani histopathologic classification (thin fibrous cap atheroma with <0.2-mm cap thickness, presumed vulnerable) and a peak cap stress computation (<140 kPa: presumed stable; >300 kPa: presumed vulnerable). To demonstrate the implications for noninvasive plaque assessment, numeric simulations of a typical carotid magnetic resonance imaging protocol were performed (0.62×0.62 mm(2) in-plane acquired voxel size) and used to obtain the magnetic resonance imaging-based peak cap stress. Peak cap stress was generally associated with histological classification. However, only 16 of 25 plaque cross sections could be labeled as high-risk (peak cap stress >300 kPa and classified as a thin fibrous cap atheroma), and 28 of 50 plaque cross sections could be labeled as low-risk (peak cap stress <140 kPa and not a thin fibrous cap atheroma), leading to κ=0.39. Thirty-one plaques (41%) were classified discordantly by the two approaches. Because of the limited magnetic resonance imaging voxel size relative to cap thickness, only a noninvasive identification of a group of low-risk, thick-cap plaques was reliable. Instead of trying to target only vulnerable plaques, a more reliable noninvasive identification of a select group of stable plaques with a thick cap and low stress might be a more fruitful approach to start reducing surgical interventions on carotid plaques. © 2015 American Heart Association, Inc.
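
    The κ=0.39 above is Cohen's kappa, the chance-corrected agreement between the two classifications. A generic sketch on an invented 2×2 contingency table (not the study's actual counts, which involve three stress bins, so this does not reproduce 0.39):

```python
# Cohen's kappa from a square contingency table:
# kappa = (observed agreement - chance agreement) / (1 - chance agreement).
def cohens_kappa(table):
    """table[i][j]: count labeled i by classifier 1 and j by classifier 2."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n  # observed agreement
    pe = sum(  # expected agreement under independent marginals
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (po - pe) / (1 - pe)

# Invented counts for 75 cross sections (rows: histology, cols: stress):
kappa = cohens_kappa([[16, 9], [22, 28]])
```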

  16. New approaches for the reliable in vitro assessment of binding affinity based on high-resolution real-time data acquisition of radioligand-receptor binding kinetics.

    PubMed

    Zeilinger, Markus; Pichler, Florian; Nics, Lukas; Wadsak, Wolfgang; Spreitzer, Helmut; Hacker, Marcus; Mitterhauser, Markus

    2017-12-01

    Resolving the kinetic mechanisms of biomolecular interactions has become increasingly important in early-phase drug development. Since traditional in vitro methods are dose-dependent assessments, binding kinetics is usually overlooked. The present study aimed at the establishment of two novel experimental approaches for the assessment of the binding affinity of both radiolabelled and non-labelled compounds targeting the A3R, based on high-resolution real-time data acquisition of radioligand-receptor binding kinetics. A novel time-resolved competition assay was developed and applied to determine the Ki of eight different A3R antagonists, using CHO-K1 cells stably expressing the hA3R. In addition, a new kinetic real-time cell-binding approach was established to quantify the rate constants kon and koff, as well as the dedicated Kd, of the A3R agonist [125I]-AB-MECA. Furthermore, lipophilicity measurements were conducted to control for influences of the physicochemical properties of the compounds used. Two novel real-time cell-binding approaches were successfully developed and established. Both experimental procedures were found to visualize the kinetic binding characteristics with high spatial and temporal resolution, resulting in reliable affinity values in good agreement with values previously reported with traditional methods. Taking into account the lipophilicity of the A3R antagonists, no influence on the experimental performance or the resulting affinity was observed. Both kinetic binding approaches comprise tracer administration and subsequent binding to living cells expressing the dedicated target protein. The experiments therefore better resemble true in vivo physiological conditions and provide important markers of cellular feedback and biological response.
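
    The kinetic quantities above are linked by Kd = koff/kon, and in a real-time association experiment the observed mono-exponential rate is kobs = kon·[L] + koff. A minimal sketch with invented rate constants, not the study's measured values:

```python
# Relating kinetic rate constants to equilibrium affinity.
def kd_from_rates(kon, koff):
    """Equilibrium dissociation constant Kd = koff / kon."""
    return koff / kon

def kobs(kon, koff, ligand_conc):
    """Observed association rate at a given free-ligand concentration."""
    return kon * ligand_conc + koff

kon = 1.0e6    # 1/(M*s), hypothetical association rate constant
koff = 1.0e-2  # 1/s, hypothetical dissociation rate constant
kd = kd_from_rates(kon, koff)  # 1e-8 M, i.e. a 10 nM binder
rate = kobs(kon, koff, 1e-8)   # observed rate with ligand at Kd
```

This is why kinetic (real-time) data yield the same Kd as equilibrium titrations when the simple one-site model holds, while additionally resolving how fast the complex forms and decays.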

  17. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the data obtained, along with convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed and the potential for high accuracy and automation based on advanced image-processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capture process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from two to four machine-vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of the 3D measurements. A set of algorithms for detecting, identifying, and tracking similar targets, as well as for marker-less object motion capture, has been developed and tested. The results of the algorithms' evaluation show high robustness and high reliability for various motion analysis tasks in technical and biomechanical applications.

  18. TP53, PIK3CA, FBXW7 and KRAS Mutations in Esophageal Cancer Identified by Targeted Sequencing.

    PubMed

    Zheng, Huili; Wang, Yan; Tang, Chuanning; Jones, Lindsey; Ye, Hua; Zhang, Guangchun; Cao, Weihai; Li, Jingwen; Liu, Lifeng; Liu, Zhencong; Zhang, Chao; Lou, Feng; Liu, Zhiyuan; Li, Yangyang; Shi, Zhenfen; Zhang, Jingbo; Zhang, Dandan; Sun, Hong; Dong, Haichao; Dong, Zhishou; Guo, Baishuai; Yan, H E; Lu, Qingyu; Huang, Xue; Chen, Si-Yi

    2016-01-01

    Esophageal cancer (EC) is a common malignancy with significant morbidity and mortality. As individual cancers exhibit unique mutation patterns, identifying and characterizing gene mutations in EC that may serve as biomarkers might help predict patient outcome and guide treatment. Traditionally, personalized cancer DNA sequencing was impractical and expensive. Recent technological advancements have made targeted DNA sequencing more cost- and time-effective with reliable results. This technology may be useful for clinicians to direct patient treatment. The Ion PGM and AmpliSeq Cancer Panel were used to identify mutations at 737 hotspot loci of 45 cancer-related genes in 64 EC samples from Chinese patients. Frequent mutations were found in TP53 and less frequent mutations in PIK3CA, FBXW7 and KRAS. These results demonstrate that targeted sequencing can reliably identify mutations in individual tumors, making this technology a possibility for clinical use. Copyright© 2016, International Institute of Anticancer Research (Dr. John G. Delinasios), All rights reserved.

  19. Radar waveform requirements for reliable detection of an aircraft-launched missile

    NASA Astrophysics Data System (ADS)

    Blair, W. Dale; Brandt-Pearce, Maite

    1996-06-01

    When tracking a manned aircraft with a phased-array radar, detecting a missile launch (i.e., a target split) is particularly important because the missile can have a very small radar cross section (RCS) and drop below the horizon of the radar shortly after launch. Reliable detection of the launch is made difficult because the RCS of the missile is very small compared to that of the manned aircraft, and the radar typically revisits a manned aircraft only every few seconds. Furthermore, any measurements of the aircraft and missile taken shortly after the launch will be merged until the two targets are resolved in range, frequency, or space. In this paper, detection of the launched missile is addressed through the detection of the presence of target multiplicity with the in-phase and quadrature monopulse measurements. The probability of detecting the launch using monopulse processing is studied with regard to the tracking signal-to-noise ratio and the number of pulses in the radar waveform.

  20. IDLN-MSP: Idiolocal normalization of real-time methylation-specific PCR for genetic imbalanced DNA specimens.

    PubMed

    Santourlidis, Simeon; Ghanjati, Foued; Beermann, Agnes; Hermanns, Thomas; Poyet, Cédric

    2016-02-01

    Sensitive, accurate, and reliable measurements of tumor cell-specific DNA methylation changes are of fundamental importance in cancer diagnosis, prognosis, and monitoring. Real-time methylation-specific PCR (MSP) using intercalating dyes is an established method of choice for this purpose. Here we present a simple but crucial adaptation of this widely applied method that overcomes a major obstacle: genetic abnormalities in the DNA samples, such as aneuploidy or copy number variations, that could result in inaccurate results due to improper normalization if the copy numbers of the target and reference sequences are not the same. In our idiolocal normalization (IDLN) method, the locus for the normalizing, methylation-independent reference amplification is chosen close to the locus of the methylation-dependent target amplification. This ensures that the copy numbers of both the target and reference sequences will be identical in most cases if they are close enough to each other, resulting in accurate normalization and reliable comparative measurements of DNA methylation in clinical samples when using real-time MSP.
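
    The normalization idea can be illustrated with a short sketch: under the standard 2^-ΔCt model, choosing the reference amplicon at a locus adjacent to the target means both amplicons share any copy-number change, which then cancels in the ratio. The function name, the 2^-ΔCt model and the numbers below are illustrative assumptions, not taken from the paper.

```python
def relative_methylation(ct_target, ct_reference):
    """Methylation signal of the target amplicon normalized to a
    methylation-independent reference amplicon, assuming an
    amplification efficiency of 2 (standard 2^-dCt model)."""
    return 2.0 ** -(ct_target - ct_reference)

# A copy-number gain shifts both Ct values by the same amount when the
# reference locus is idiolocal (close to the target), so the normalized
# value is unchanged; a distant reference locus gives no such guarantee.
diploid = relative_methylation(30.0, 28.0)
gained_region = relative_methylation(29.42, 27.42)  # both loci amplified
```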

  1. Comparison of Reliability Measures under Factor Analysis and Item Response Theory

    ERIC Educational Resources Information Center

    Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng

    2012-01-01

    Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…

  2. Inter-examiner classification reliability of Mechanical Diagnosis and Therapy for extremity problems - Systematic review.

    PubMed

    Takasaki, Hiroshi; Okuyama, Kousuke; Rosedale, Richard

    2017-02-01

    Mechanical Diagnosis and Therapy (MDT) is used in the treatment of extremity problems. Classifying clinical problems is one method of providing effective treatment to a target population. Classification reliability is a key factor to determine the precise clinical problem and to direct an appropriate intervention. To explore inter-examiner reliability of the MDT classification for extremity problems in three reliability designs: 1) vignette reliability using surveys with patient vignettes, 2) concurrent reliability, where multiple assessors decide a classification by observing someone's assessment, 3) successive reliability, where multiple assessors independently assess the same patient at different times. Systematic review with data synthesis in a quantitative format. Agreement of MDT subgroups was examined using the Kappa value, with the operational definition of acceptable reliability set at ≥ 0.6. The level of evidence was determined considering the methodological quality of the studies. Six studies were included and all studies met the criteria for high quality. Kappa values for the vignette reliability design (five studies) were ≥ 0.7. There was data from two cohorts in one study for the concurrent reliability design and the Kappa values ranged from 0.45 to 1.0. Kappa values for the successive reliability design (data from three cohorts in one study) were < 0.6. The current review found strong evidence of acceptable inter-examiner reliability of MDT classification for extremity problems in the vignette reliability design, limited evidence of acceptable reliability in the concurrent reliability design and unacceptable reliability in the successive reliability design. Copyright © 2017 Elsevier Ltd. All rights reserved.
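
    Since the review's agreement threshold is stated as Kappa ≥ 0.6, a minimal sketch of Cohen's kappa for two examiners may be useful; the classification data here are invented for illustration.

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters classifying the same cases:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1.0 - p_e)

# Two examiners classifying four hypothetical extremity presentations:
kappa = cohens_kappa(
    ["derangement", "derangement", "dysfunction", "dysfunction"],
    ["derangement", "derangement", "dysfunction", "derangement"])
```

With 3/4 observed agreement and 1/2 chance agreement, this example falls below the review's 0.6 acceptability cut-off.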

  3. The MicroRNA Interaction Network of Lipid Diseases

    PubMed Central

    Kandhro, Abdul H.; Shoombuatong, Watshara; Nantasenamat, Chanin; Prachayasittikul, Virapong; Nuchnoi, Pornlada

    2017-01-01

    Background: Dyslipidemia is one of the major forms of lipid disorder, characterized by increased triglyceride (TG), increased low-density lipoprotein-cholesterol (LDL-C), and decreased high-density lipoprotein-cholesterol (HDL-C) levels in blood. Recently, microRNAs (miRNAs) have been reported to be involved in various biological processes, with potential use as biomarkers and in the diagnosis of various diseases. Computational approaches, including text mining, have recently been used to analyze abstracts from public databases to observe the relationships/associations between biological molecules, miRNAs, and disease phenotypes. Materials and Methods: In the present study, the significance of text-mined pair associations (miRNA-lipid disease) was estimated by one-sided Fisher's exact test. The top 20 significant miRNA-disease associations were visualized in Cytoscape. The CyTargetLinker plug-in on Cytoscape was used to extend the network and predict new miRNA target genes. The Biological Networks Gene Ontology (BiNGO) plug-in on Cytoscape was used to retrieve gene ontology (GO) annotations for the targeted genes. Results: We retrieved 227 miRNA-lipid disease associations involving 148 miRNAs. Analysis of the top 20 significant miRNAs with CyTargetLinker provided defined, predicted and validated gene targets; further analysis of the targeted genes with BiNGO showed that they were significantly associated with lipid, cholesterol, apolipoprotein, and fatty acid GO terms. Conclusion: We are the first to provide a reliable miRNA-lipid disease association network based on text mining. This could help future experimental studies that aim to validate predicted gene targets. PMID:29018475
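
    The significance test used above can be sketched from first principles: a one-sided (enrichment) Fisher's exact test over a 2x2 co-occurrence table of abstracts, i.e. the upper tail of the hypergeometric distribution. The counts below are invented for illustration.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]],
    e.g. a = abstracts mentioning both the miRNA and the disease.
    Returns P(X >= a) under the hypergeometric null distribution."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        # math.comb returns 0 when the lower index exceeds the upper one
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p
```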

  4. DMirNet: Inferring direct microRNA-mRNA association networks.

    PubMed

    Lee, Minsu; Lee, HyungJune

    2016-12-05

    MicroRNAs (miRNAs) play important regulatory roles in a wide range of biological processes by inducing target mRNA degradation or translational repression. Based on the correlation between the expression profiles of a miRNA and its target mRNA, various computational methods have previously been proposed to identify miRNA-mRNA association networks from matched miRNA and mRNA expression profiles. However, three major issues remain to be resolved in conventional computational approaches for inferring miRNA-mRNA association networks from expression profiles. 1) Correlations inferred from observed expression profiles with conventional correlation-based methods include numerous erroneous links or over-estimated edge weights, due to transitive information flow among direct associations. 2) Because of the high-dimension-low-sample-size nature of microarray datasets, it is difficult to obtain an accurate and reliable estimate of the empirical correlations between all pairs of expression profiles. 3) Because previously proposed computational methods usually suffer from varying performance across different datasets, a more reliable model that guarantees optimal or suboptimal performance across datasets is highly needed. In this paper, we present DMirNet, a new framework for identifying direct miRNA-mRNA association networks. To tackle the aforementioned issues, DMirNet incorporates 1) three direct correlation estimation methods (namely Corpcor, SPACE, and network deconvolution) to infer direct miRNA-mRNA association networks, 2) bootstrapping to fully utilize insufficient training expression profiles, and 3) a rank-based ensemble aggregation to build a reliable and robust model across different datasets. Our empirical experiments on three datasets demonstrate the combinatorial effects of the necessary components of DMirNet.
Additional performance comparison experiments show that DMirNet outperforms the state-of-the-art Ensemble-based model [1], which had shown the best performance across the same three datasets, by a factor of up to 1.29. Further, we identify 43 putative novel multi-cancer-related miRNA-mRNA association relationships from the inferred top 1000 direct miRNA-mRNA association network. We believe that DMirNet is a promising method for identifying novel direct miRNA-mRNA relations and for elucidating direct miRNA-mRNA association networks. Since DMirNet infers direct relationships from the observed data, it can contribute to reconstructing various direct regulatory pathways, including, but not limited to, direct miRNA-mRNA association networks.
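
    The rank-based aggregation step can be sketched generically: each estimation method ranks candidate miRNA-mRNA edges by its own confidence score, and edges are re-scored by their mean rank across methods, which avoids mixing incomparable score scales. This is a generic sketch of rank aggregation, not DMirNet's exact rule; edge names and scores are invented.

```python
def rank_aggregate(edge_scores_per_method):
    """edge_scores_per_method: one {edge: score} dict per estimation
    method (e.g. Corpcor, SPACE, network deconvolution). Returns each
    edge's mean rank across methods; rank 1 = strongest edge."""
    n_methods = len(edge_scores_per_method)
    mean_rank = {}
    for scores in edge_scores_per_method:
        ordered = sorted(scores, key=scores.get, reverse=True)
        for rank, edge in enumerate(ordered, start=1):
            mean_rank[edge] = mean_rank.get(edge, 0.0) + rank / n_methods
    return mean_rank

# Two methods with different score scales but the same edge ordering:
agg = rank_aggregate([{"miR-a/geneX": 0.9, "miR-b/geneY": 0.1},
                      {"miR-a/geneX": 12.0, "miR-b/geneY": 3.0}])
```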

  5. A Cellular High-Throughput Screening Approach for Therapeutic trans-Cleaving Ribozymes and RNAi against Arbitrary mRNA Disease Targets

    PubMed Central

    Yau, Edwin H.; Butler, Mark C.; Sullivan, Jack M.

    2016-01-01

    Major bottlenecks in the development of therapeutic post-transcriptional gene silencing (PTGS) agents (e.g. ribozymes, RNA interference, antisense) include the challenge of mapping the rare accessible regions of the mRNA target that are open for annealing and cleavage, testing and optimization of agents in human cells to identify lead agents, testing for cellular toxicity, and preclinical evaluation in appropriate animal models of disease. Methods for rapid and reliable cellular testing of PTGS agents are needed to identify potent lead candidates for optimization. Our goal was to develop a means of rapidly assessing many RNA agents to identify a lead candidate for a given disease-associated mRNA. We developed a rapid human cell-based screening platform to test the efficacy of hammerhead ribozyme (hhRz) or RNA interference (RNAi) constructs, using a model retinal degeneration target, human rod opsin (RHO) mRNA. The focus is on RNA drug discovery for diverse retinal degeneration targets. To validate the approach, candidate hhRzs were tested against NUH↓ cleavage sites (N=G,C,A,U; H=C,A,U) within the target mRNA of secreted alkaline phosphatase (SEAP), a model gene expression reporter, based upon in silico predictions of mRNA accessibility. HhRzs were embedded in a larger stable adenoviral VAI RNA scaffold for high cellular expression, cytoplasmic trafficking, and stability. Most hhRz expression plasmids exerted statistically significant knockdown of extracellular SEAP enzyme activity when assayed by a fluorescence enzyme assay intended for high-throughput screening (HTS). Kinetics of PTGS knockdown of cellular targets is measurable in live cells with the SEAP reporter. The validated SEAP HTS platform was transposed to identify lead PTGS agents against a model hereditary retinal degeneration target, RHO mRNA. Two approaches were used to physically fuse the model retinal gene target mRNA to the SEAP reporter mRNA.
The most expedient way to evaluate a large set of potential VAI-hhRz expression plasmids against diverse NUH↓ cleavage sites uses cultured human HEK293S cells stably expressing a dicistronic Target-IRES-SEAP target fusion mRNA. Broad utility of this rational RNA drug discovery approach is feasible for any ophthalmological disease-relevant mRNA targets and any disease mRNA targets in general. The approach will permit rank ordering of PTGS agents based on potency to identify a lead therapeutic compound for further optimization. PMID:27233447

  6. Detection of Orthopoxvirus DNA by Real-Time PCR and Identification of Variola Virus DNA by Melting Analysis

    PubMed Central

    Nitsche, Andreas; Ellerbrok, Heinz; Pauli, Georg

    2004-01-01

    Although variola virus was eradicated by the World Health Organization vaccination program in the 1970s, the diagnosis of smallpox infection has attracted great interest in the context of a possible deliberate release of variola virus in bioterrorist attacks. Obviously, fast and reliable diagnostic tools are required to detect variola virus and to distinguish it from orthopoxviruses that have identical morphological characteristics, including vaccinia virus. The advent of real-time PCR for the clinical diagnosis of viral infections has facilitated the detection of minute amounts of viral nucleic acids in a fast, safe, and precise manner, including the option to quantify and to genotype the target reliably. In this study a complete set of four hybridization probe-based real-time PCR assays for the specific detection of orthopoxvirus DNA is presented. Melting analysis following PCR enables the identification of variola virus by the PCR product's characteristic melting temperature, permitting the discrimination of variola virus from other orthopoxviruses. In addition, an assay for the specific amplification of variola virus DNA is presented. All assays can be performed simultaneously in the same cycler, and results of a PCR run are obtained in less than 1 h. The application of more than one assay for the same organism significantly contributes to the diagnostic reliability, reducing the risk of false-negative results due to unknown sequence variations. In conclusion, the assays presented will improve the speed and reliability of orthopoxvirus diagnostics and variola virus identification. PMID:15004077

  7. Development and Preliminary Validation of a Comprehensive Questionnaire to Assess Women’s Knowledge and Perception of the Current Weight Gain Guidelines during Pregnancy

    PubMed Central

    Ockenden, Holly; Gunnell, Katie; Giles, Audrey; Nerenberg, Kara; Goldfield, Gary; Manyanga, Taru; Adamo, Kristi

    2016-01-01

    The aim of this study was to develop and validate an electronic questionnaire, the Electronic Maternal Health Survey (EMat Health Survey), related to women’s knowledge and perceptions of the current gestational weight gain guidelines (GWG), as well as pregnancy-related health behaviours. Constructs addressed within the questionnaire include self-efficacy, locus of control, perceived barriers, and facilitators of physical activity and diet, outcome expectations, social environment and health practices. Content validity was examined using an expert panel (n = 7) and pilot testing items in a small sample (n = 5) of pregnant women and recent mothers (target population). Test re-test reliability was assessed among a sample (n = 71) of the target population. Reliability scores were calculated for all constructs (r and intra-class correlation coefficients (ICC)), those with a score of >0.5 were considered acceptable. The content validity of the questionnaire reflects the degree to which all relevant components of excessive GWG risk in women are included. Strong test-retest reliability was found in the current study, indicating that responses to the questionnaire were reliable in this population. The EMat Health Survey adds to the growing body of literature on maternal health and gestational weight gain by providing the first comprehensive questionnaire that can be self-administered and remotely accessed. The questionnaire can be completed in 15–25 min and collects useful data on various social determinants of health and GWG as well as associated health behaviours. This online tool may assist researchers by providing them with a platform to collect useful information in developing and tailoring interventions to better support women in achieving recommended weight gain targets in pregnancy. PMID:27916921

  8. Development of "one-pot" method for multi-class compounds in porcine formula feed by multi-function impurity adsorption cleaning followed by ultra-performance liquid chromatography-tandem mass spectrometry detection.

    PubMed

    Wang, Peilong; Wang, Xiao; Zhang, Wei; Su, Xiaoou

    2014-02-01

    A novel and efficient method for determining multi-class compounds, including β-agonists, sedatives, nitroimidazoles and aflatoxins, in porcine formula feed, based on a fast "one-pot" extraction/multi-function impurity adsorption (MFIA) clean-up procedure, has been developed. 23 target analytes belonging to four different compound classes could be determined simultaneously in a single run. Conditions for the "one-pot" extraction were studied in detail. Under the optimized conditions, the multi-class compounds in porcine formula feed samples were extracted and purified in one step with methanol containing ammonia and the absorbents. The extracts were purified in one pot using multiple types of absorbent based on MFIA, with multi-walled carbon nanotubes employed to improve clean-up efficiency. A Shield BEH C18 column was used to separate the 23 target analytes, followed by tandem mass spectrometry (MS/MS) detection using an electrospray ionization source in positive mode. Recovery studies were done at three fortification levels. Overall average recoveries of target compounds in porcine formula feed at each level were >51.6% based on matrix-fortified calibration, with coefficients of variation from 2.7% to 13.2% (n=6). The limit of detection (LOD) of these compounds in the porcine formula feed sample matrix was <5.0 μg/kg. This method was successfully applied to the screening and confirmation of target drugs in >30 porcine formula feed samples. It was demonstrated that the integration of the MFIA protocol with MS/MS detection can serve as a valuable strategy for rapid screening and reliable confirmatory analysis of multi-class compounds in real samples. Copyright © 2013 Elsevier B.V. All rights reserved.
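
    The recovery and coefficient-of-variation figures quoted above follow standard definitions; a minimal sketch with invented replicate data (not values from the study):

```python
from statistics import mean, stdev

def recovery_percent(measured, spiked_level):
    """Mean measured concentration as a percentage of the fortification level."""
    return 100.0 * mean(measured) / spiked_level

def coefficient_of_variation(measured):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * stdev(measured) / mean(measured)

# Six invented replicate results (ug/kg) at a 10 ug/kg fortification level:
replicates = [8.9, 9.4, 9.1, 9.6, 9.0, 9.2]
rec = recovery_percent(replicates, 10.0)   # mean 9.2 -> 92% recovery
cv = coefficient_of_variation(replicates)
```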

  9. Overcoming non-specific binding to measure the active concentration and kinetics of serum anti-HLA antibodies by surface plasmon resonance.

    PubMed

    Visentin, Jonathan; Couzi, Lionel; Dromer, Claire; Neau-Cransac, Martine; Guidicelli, Gwendaline; Veniard, Vincent; Coniat, Karine Nubret-le; Merville, Pierre; Di Primo, Carmelo; Taupin, Jean-Luc

    2018-06-07

    Human leukocyte antigen (HLA) donor-specific antibodies are key serum biomarkers for assessing the outcome of transplanted patients. Measuring their active concentration, i.e. the fraction that really interacts with donor HLA, and their affinity could help decipher their pathogenicity. Surface plasmon resonance (SPR) is recognized as the gold standard for measuring binding kinetics, but also active concentrations without calibration curves. SPR-based biosensors often suffer from non-specific binding (NSB) occurring with the sensor chip surface and the immobilized targets, especially for complex media such as human serum. In this work we show that several serum treatments, such as dialysis or IgG purification, reduce NSB but insufficiently for SPR applications. We then demonstrate that the NSB contribution to the SPR signal can be eliminated to determine precisely and reliably the active concentration and the affinity of anti-HLA antibodies from patients' sera. This was achieved even at concentrations close to the limit of quantification of the method, in the 0.5-1 nM range. The robustness of the assay was demonstrated by using a wide range of artificially generated NSB and by varying the density of the targets captured onto the surface. The assay is of general interest and can be used with molecules generating strong NSB, provided a non-cognate molecule structurally close to the target can be captured on the same flow cell in a different binding cycle. Compared with current fluorescence-based methods, which are semi-quantitative, we expect this SPR-based assay to help better understand anti-HLA antibody pathogenicity and improve organ recipients' management. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. A new approach to the characterization of subtle errors in everyday action: implications for mild cognitive impairment.

    PubMed

    Seligman, Sarah C; Giovannetti, Tania; Sestito, John; Libon, David J

    2014-01-01

    Mild functional difficulties have been associated with early cognitive decline in older adults and increased risk for conversion to dementia in mild cognitive impairment, but our understanding of this decline has been limited by a dearth of objective methods. This study evaluated the reliability and validity of a new system to code subtle errors on an established performance-based measure of everyday action and described preliminary findings within the context of a theoretical model of action disruption. In this study, 45 older adults completed the Naturalistic Action Test (NAT) and neuropsychological measures. NAT performance was coded for overt errors, and subtle action difficulties were scored using a novel coding system. An inter-rater reliability coefficient was calculated. Validity of the coding system was assessed using a repeated-measures ANOVA with NAT task (simple versus complex) and error type (overt versus subtle) as within-group factors. Correlation/regression analyses were conducted among overt NAT errors, subtle NAT errors, and neuropsychological variables. The coding of subtle action errors was reliable and valid, and episodic memory breakdown predicted subtle action disruption. Results suggest that the NAT can be useful in objectively assessing subtle functional decline. Treatments targeting episodic memory may be most effective in addressing early functional impairment in older age.

  11. Development and validation of the Survey of Organizational Research Climate (SORC).

    PubMed

    Martinson, Brian C; Thrush, Carol R; Crain, A Lauren

    2013-09-01

    Development and targeting efforts by academic organizations to effectively promote research integrity can be enhanced if they are able to collect reliable data to benchmark baseline conditions, to assess areas needing improvement, and to subsequently assess the impact of specific initiatives. To date, no standardized and validated tool has existed to serve this need. A web- and mail-based survey was administered in the second half of 2009 to 2,837 randomly selected biomedical and social science faculty and postdoctoral fellows at 40 academic health centers in top-tier research universities in the United States. Measures included the Survey of Organizational Research Climate (SORC) as well as measures of perceptions of organizational justice. Exploratory and confirmatory factor analyses yielded seven subscales of organizational research climate, all of which demonstrated acceptable internal consistency (Cronbach's α ranging from 0.81 to 0.87) and adequate test-retest reliability (Pearson r ranging from 0.72 to 0.83). A broad range of correlations between the seven subscales and five measures of organizational justice (unadjusted regression coefficients ranging from 0.13 to 0.95) document both construct and discriminant validity of the instrument. The SORC demonstrates good internal (alpha) and external reliability (test-retest) as well as both construct and discriminant validity.
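
    The internal-consistency figures reported above (Cronbach's α of 0.81 to 0.87) follow the standard formula, which can be sketched in a few lines; the item scores below are invented for illustration.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency. `items` is a list of
    per-item score vectors over the same respondents:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = len(items)
    n = len(items[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(pvar(it) for it in items) / pvar(totals))

# Three respondents answering two perfectly consistent items gives the
# maximum alpha of 1; real subscales fall below this.
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
```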

  12. Expert system for UNIX system reliability and availability enhancement

    NASA Astrophysics Data System (ADS)

    Xu, Catherine Q.

    1993-02-01

    Highly reliable and available systems are critical to the airline industry. However, most off-the-shelf computer operating systems and hardware do not have built-in fault-tolerant mechanisms; the UNIX workstation is one example. In this research effort, we have developed a rule-based Expert System (ES) to monitor, command, and control a UNIX workstation system with hot-standby redundancy. The ES on each workstation acts as an on-line system administrator to diagnose, report, correct, and prevent certain types of hardware and software failures. If a primary station is approaching failure, the ES coordinates the switch-over to a hot-standby secondary workstation. The goal is to discover and solve certain fatal problems early enough to prevent complete system failure from occurring and therefore to enhance system reliability and availability. Test results show that the ES can diagnose all targeted faulty scenarios and take desired actions in a consistent manner regardless of the sequence of the faults. The ES can perform designated system administration tasks about ten times faster than an experienced human operator. Compared with a single workstation system, the downtime of our hot-standby redundancy system is predicted to be reduced by more than 50 percent by using the ES to command and control the system.
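
    The downtime claim can be framed with a simple steady-state availability model. The sketch below assumes independent failures, ideal fail-over, and illustrative MTBF/MTTR numbers, none of which come from the paper; it is an upper bound on the benefit, consistent with the more conservative ">50 percent" figure reported above.

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of one unit: uptime fraction
    MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def hot_standby_availability(a_single):
    """Two independent units with ideal fail-over: the system is down
    only while both units are down simultaneously."""
    return 1.0 - (1.0 - a_single) ** 2

a1 = availability(990.0, 10.0)        # single workstation, approx. 0.99
a2 = hot_standby_availability(a1)     # redundant pair, approx. 0.9999
downtime_reduction = 1.0 - (1.0 - a2) / (1.0 - a1)   # fraction of downtime saved
```

Real fail-over detection and switch-over are imperfect, so an observed reduction will sit between this idealized bound and zero.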

  13. A prototype for unsupervised analysis of tissue microarrays for cancer research and diagnostics.

    PubMed

    Chen, Wenjin; Reiss, Michael; Foran, David J

    2004-06-01

    The tissue microarray (TMA) technique enables researchers to extract small cylinders of tissue from histological sections and arrange them in a matrix configuration on a recipient paraffin block such that hundreds can be analyzed simultaneously. TMA offers several advantages over traditional specimen preparation by maximizing limited tissue resources and providing a highly efficient means for visualizing molecular targets. By enabling researchers to reliably determine the protein expression profile for specific types of cancer, it may be possible to elucidate the mechanism by which healthy tissues are transformed into malignancies. Currently, the primary methods used to evaluate arrays involve the interactive review of TMA samples while they are viewed under a microscope, subjectively evaluated, and scored by a technician. This process is extremely slow, tedious, and prone to error. In order to facilitate large-scale, multi-institutional studies, a more automated and reliable means for analyzing TMAs is needed. We report here a web-based prototype which features automated imaging, registration, and distributed archiving of TMAs in multiuser network environments. The system utilizes a principal color decomposition approach to identify and characterize the predominant staining signatures of specimens in color space. This strategy was shown to be reliable for detecting and quantifying the immunohistochemical expression levels for TMAs.

  14. Expert System for UNIX System Reliability and Availability Enhancement

    NASA Technical Reports Server (NTRS)

    Xu, Catherine Q.

    1993-01-01

    Highly reliable and available systems are critical to the airline industry. However, most off-the-shelf computer operating systems and hardware do not have built-in fault-tolerant mechanisms; the UNIX workstation is one example. In this research effort, we have developed a rule-based Expert System (ES) to monitor, command, and control a UNIX workstation system with hot-standby redundancy. The ES on each workstation acts as an on-line system administrator to diagnose, report, correct, and prevent certain types of hardware and software failures. If a primary station is approaching failure, the ES coordinates the switch-over to a hot-standby secondary workstation. The goal is to discover and solve certain fatal problems early enough to prevent complete system failure from occurring and therefore to enhance system reliability and availability. Test results show that the ES can diagnose all targeted faulty scenarios and take desired actions in a consistent manner regardless of the sequence of the faults. The ES can perform designated system administration tasks about ten times faster than an experienced human operator. Compared with a single workstation system, the downtime of our hot-standby redundancy system is predicted to be reduced by more than 50 percent by using the ES to command and control the system.

  15. Development and Validation of the Survey of Organizational Research Climate (SORC)

    PubMed Central

    Martinson, Brian C.; Thrush, Carol R.; Crain, A. Lauren

    2012-01-01

    Background Development and targeting efforts by academic organizations to effectively promote research integrity can be enhanced if they are able to collect reliable data to benchmark baseline conditions, to assess areas needing improvement, and to subsequently assess the impact of specific initiatives. To date, no standardized and validated tool has existed to serve this need. Methods A web- and mail-based survey was administered in the second half of 2009 to 2,837 randomly selected biomedical and social science faculty and postdoctoral fellows at 40 academic health centers in top-tier research universities in the United States. Measures included the Survey of Organizational Research Climate (SORC) as well as measures of perceptions of organizational justice. Results Exploratory and confirmatory factor analyses yielded seven subscales of organizational research climate, all of which demonstrated acceptable internal consistency (Cronbach’s α ranging from 0.81 to 0.87) and adequate test-retest reliability (Pearson r ranging from 0.72 to 0.83). A broad range of correlations between the seven subscales and five measures of organizational justice (unadjusted regression coefficients ranging from .13 to .95) document both construct and discriminant validity of the instrument. Conclusions The SORC demonstrates good internal (alpha) and external reliability (test-retest) as well as both construct and discriminant validity. PMID:23096775

  16. Identification of FVIII gene mutations in patients with hemophilia A using new combinatorial sequencing by hybridization

    PubMed Central

    Chetta, M.; Drmanac, A.; Santacroce, R.; Grandone, E.; Surrey, S.; Fortina, P.; Margaglione, M.

    2008-01-01

    BACKGROUND: Standard methods of mutation detection are time consuming in Hemophilia A (HA), making them impractical for some analyses, such as prenatal diagnosis. OBJECTIVES: To evaluate the feasibility of combinatorial sequencing-by-hybridization (cSBH) as an alternative, reliable tool for mutation detection in the FVIII gene. PATIENTS/METHODS: We applied a new cSBH method that uses two different colors for the detection of multiple point mutations in the FVIII gene. The 26 exons of the FVIII gene were analyzed in 7 newly diagnosed Italian patients and in 19 previously characterized individuals with FVIII deficiency. RESULTS: The data show that, when solution-phase TAMRA- and QUASAR-labeled 5-mer oligonucleotide sets mixed with unlabeled target PCR templates are co-hybridized in the presence of DNA ligase to universal 6-mer oligonucleotide probe-based arrays, a number of mutations can be successfully detected. The technique was also reliable in identifying a mutant FVIII allele in an obligate heterozygote. A novel missense mutation (Leu1843Thr) in exon 16 and three novel neutral polymorphisms are presented, along with an updated protocol for 2-color cSBH. CONCLUSIONS: cSBH is a reliable tool for mutation detection in the FVIII gene and may represent a complementary method for the genetic screening of HA patients. PMID:20300295

  17. Designing a reliable leak bio-detection system for natural gas pipelines.

    PubMed

    Batzias, F A; Siontorou, C G; Spanidis, P-M P

    2011-02-15

    Monitoring of natural gas (NG) pipelines is an important task for economical and safe operation, loss prevention, and environmental protection. Timely and reliable leak detection in gas pipelines therefore plays a key role in the overall integrity management of the pipeline system. Owing to the various limitations of the currently available techniques and the size of the surveillance area that needs to be covered, research on new detector systems is still thriving. Biosensors are widely considered a niche technology in the environmental market, since they afford the desired detection capabilities at low cost, provided they have been properly designed/developed and rationally placed, networked, and maintained with the aid of operational research techniques. This paper addresses NG leakage surveillance through a robust cooperative/synergistic scheme between biosensors and conventional detector systems; the network is validated in situ and optimized in order to provide reliable information at the required granularity level. The proposed scheme is substantiated through a knowledge-based approach and relies on Fuzzy Multicriteria Analysis (FMCA) for selecting the biosensor design that best suits both the target analyte and the operational micro-environment. The approach is illustrated through the design of leak surveying over a pipeline network in Greece. Copyright © 2010 Elsevier B.V. All rights reserved.
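    The core of a multicriteria selection like the FMCA step above can be illustrated with a simple weighted aggregation of per-criterion membership grades in [0, 1]. The criteria, weights, and candidate designs below are hypothetical placeholders, not values from the paper:

    ```python
    # Illustrative weighted-scoring sketch of fuzzy multicriteria selection.
    # Criteria, weights, and design grades are hypothetical.

    CRITERIA_WEIGHTS = {"sensitivity": 0.4, "cost": 0.3, "robustness": 0.3}

    designs = {
        "enzyme-based": {"sensitivity": 0.9, "cost": 0.5, "robustness": 0.6},
        "whole-cell":   {"sensitivity": 0.6, "cost": 0.8, "robustness": 0.8},
    }

    def fuzzy_score(grades):
        """Aggregate per-criterion membership grades by their weights."""
        return sum(CRITERIA_WEIGHTS[c] * g for c, g in grades.items())

    best = max(designs, key=lambda d: fuzzy_score(designs[d]))
    print(best, round(fuzzy_score(designs[best]), 2))  # → whole-cell 0.72
    ```

    A full FMCA would use fuzzy numbers and linguistic terms rather than crisp grades, but the ranking principle, scoring each candidate against weighted criteria and selecting the maximum, is the same.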

  18. TH-B-204-01: Real-Time Tracking with Implanted Markers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Q.

    Implanted markers as target surrogates have been widely used for treatment verification, as they provide safe and reliable monitoring of inter- and intra-fractional target motion. The rapid advancement of technology requires a critical review of, and recommendations for, the use of implanted surrogates in the current field. The symposium, which also reports an update of AAPM TG 199 - Implanted Target Surrogates for Radiation Treatment Verification, will focus on all clinical aspects of using implanted target surrogates for treatment verification and on related issues. A wide variety of commercially available markers will first be reviewed, including radiopaque markers, MRI-compatible markers, non-migrating coils, surgical clips, and electromagnetic transponders; the pros and cons of each kind will be discussed. The clinical applications of implanted surrogates will be presented by anatomical site. For the lung, we will discuss gated treatments and 2D and 3D real-time fiducial tracking techniques. For the prostate, we will focus on 2D-3D matching, 3D-3D matching, and electromagnetic transponder based localization techniques. For the liver, we will review techniques for patients under gating, shallow-breathing, or free-breathing conditions. We will also review techniques for treating challenging breast cancers, where deformation may occur. Finally, we will summarize potential issues related to the use of implanted target surrogates, with TG 199 recommendations. A review of fiducial migration and fiducial-derived target rotation in different disease sites will be provided. The issue of target deformation, especially near the diaphragm, and related suggestions will also be presented and discussed. Learning Objectives: knowledge of a wide variety of markers; knowledge of their application for different disease sites; understanding of issues related to these applications. Disclosures: Z. Wang: research funding support from Brainlab AG; Q. Xu: consultant for Accuray planning service.
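    The basic fiducial-tracking computation behind marker-based verification can be sketched as estimating the target's rigid translation from the centroid shift of its implanted markers. The coordinates below are hypothetical 3D positions in millimetres:

    ```python
    # Estimate a target's translation from planned vs. observed fiducial
    # positions via centroid shift. Coordinates are hypothetical, in mm.

    def centroid(points):
        """Mean position of a set of 3D points."""
        n = len(points)
        return tuple(sum(p[i] for p in points) / n for i in range(3))

    def target_shift(planned, observed):
        """Translation (dx, dy, dz) from planned to observed marker set."""
        cp, co = centroid(planned), centroid(observed)
        return tuple(co[i] - cp[i] for i in range(3))

    planned  = [(0.0, 0.0, 0.0), (6.0, 0.0, 0.0), (0.0, 6.0, 0.0)]
    observed = [(1.0, 2.0, 0.5), (7.0, 2.0, 0.5), (1.0, 8.0, 0.5)]
    print(target_shift(planned, observed))  # → (1.0, 2.0, 0.5)
    ```

    Clinical 2D-3D and 3D-3D matching additionally solves for rotation (and flags migration or deformation when the marker constellation itself changes shape), but the translational correction above is the first-order quantity used for couch shifts.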

  19. TH-B-204-03: TG-199: Implanted Markers for Radiation Treatment Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Z.

    (Symposium abstract identical to record 18, TH-B-204-01, above.)

  20. TH-B-204-02: Application of Implanted Markers in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, S.

    (Symposium abstract identical to record 18, TH-B-204-01, above.)

Top