Sample records for target reliability based

  1. Employing machine learning for reliable miRNA target identification in plants.

    PubMed

    Jha, Ashwani; Shankar, Ravi

    2011-12-29

    miRNAs are ~21-nucleotide-long small noncoding RNA molecules, formed endogenously in most eukaryotes, which mainly control their target genes post-transcriptionally by interacting with and silencing them. While many tools have been developed for animal miRNA target systems, plant miRNA target identification has seen limited development, and most existing tools are centered on exact complementarity matching; very few consider other factors such as multiple target sites and the role of flanking regions. In the present work, a Support Vector Regression (SVR) approach has been implemented for plant miRNA target identification, utilizing position-specific dinucleotide density variation information around the target sites to yield highly reliable results. It has been named p-TAREF (plant-Target Refiner). p-TAREF was rigorously compared with other plant target prediction tools and found to perform better in several respects. Further, p-TAREF was run over experimentally validated miRNA targets from species such as Arabidopsis, Medicago, rice, and tomato, and detected them accurately, suggesting broad usability of p-TAREF across plant species. Using p-TAREF, target identification was performed for the complete rice transcriptome, supported by expression- and degradome-based data. miR156 was found to be an important component of the rice regulatory system, where control of genes associated with growth and transcription appeared predominant. The entire methodology has been implemented in a multi-threaded parallel architecture in Java to enable fast processing in both the web-server and standalone versions; this also allows it to run in concurrent mode even on a simple desktop computer. The web-server version additionally provides a facility to gather experimental support for predictions through on-the-spot expression data analysis. A machine learning multivariate feature tool has been implemented in parallel and
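
    The abstract sketches the core recipe: encode each candidate site by position-specific dinucleotide densities and score it with SVR. Below is a minimal sketch of that kind of pipeline using scikit-learn; the window length, the four-bin feature layout, and the toy training scores are illustrative assumptions, not the published p-TAREF design.

    ```python
    # Sketch: SVR scoring of candidate plant miRNA target sites from
    # position-specific dinucleotide densities, in the spirit of p-TAREF.
    # The 4-bin feature layout and toy training scores are assumptions.
    from itertools import product

    import numpy as np
    from sklearn.svm import SVR

    DINUCS = ["".join(p) for p in product("ACGU", repeat=2)]  # 16 dinucleotides

    def dinuc_density(seq: str, n_bins: int = 4) -> np.ndarray:
        """Split the site plus flanks into bins; dinucleotide fractions per bin."""
        feats = []
        for idx in np.array_split(np.arange(len(seq) - 1), n_bins):
            counts = dict.fromkeys(DINUCS, 0)
            for i in idx:
                counts[seq[i:i + 2]] += 1
            feats.extend(counts[d] / max(len(idx), 1) for d in DINUCS)
        return np.asarray(feats)  # n_bins * 16 features

    # Toy training data: (sequence around a candidate site, interaction score).
    train = [("AUGCUAGCUAGCUAGGCUAGCUAGGAUCGAUC", 0.9),
             ("GGGGCCCCGGGGCCCCGGGGCCCCGGGGCCCC", 0.1),
             ("AUAUAUAUGCGCGCGCAUAUAUAUGCGCGCGC", 0.4)]
    X = np.vstack([dinuc_density(s) for s, _ in train])
    y = np.array([score for _, score in train])

    model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)
    print(model.predict(dinuc_density("AUGCUAGCUAGGUAGGCUAGCUAGGAUCGAUC")[None, :]))
    ```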

  2. Employing machine learning for reliable miRNA target identification in plants

    PubMed Central

    2011-01-01

    Background miRNAs are ~21-nucleotide-long small noncoding RNA molecules, formed endogenously in most eukaryotes, which mainly control their target genes post-transcriptionally by interacting with and silencing them. While many tools have been developed for animal miRNA target systems, plant miRNA target identification has seen limited development, and most existing tools are centered on exact complementarity matching; very few consider other factors such as multiple target sites and the role of flanking regions. Result In the present work, a Support Vector Regression (SVR) approach has been implemented for plant miRNA target identification, utilizing position-specific dinucleotide density variation information around the target sites to yield highly reliable results. It has been named p-TAREF (plant-Target Refiner). p-TAREF was rigorously compared with other plant target prediction tools and found to perform better in several respects. Further, p-TAREF was run over experimentally validated miRNA targets from species such as Arabidopsis, Medicago, rice, and tomato, and detected them accurately, suggesting broad usability of p-TAREF across plant species. Using p-TAREF, target identification was performed for the complete rice transcriptome, supported by expression- and degradome-based data. miR156 was found to be an important component of the rice regulatory system, where control of genes associated with growth and transcription appeared predominant. The entire methodology has been implemented in a multi-threaded parallel architecture in Java to enable fast processing in both the web-server and standalone versions; this also allows it to run in concurrent mode even on a simple desktop computer. The web-server version additionally provides a facility to gather experimental support for predictions through on-the-spot expression data analysis. Conclusion A machine learning multivariate feature tool has been

  3. Assessing Reliability of Cold Spray Sputter Targets in Photovoltaic Manufacturing

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar; Vlcek, Johannes; Bheemreddy, Venkata; Juliano, Daniel

    2017-10-01

    Cold spray has been used to manufacture more than 800 Cu-In-Ga (CIG) sputter targets for deposition of high-efficiency photovoltaic thin films. It is a preferred technique since it enables high deposit purity and transfer of non-equilibrium alloy states to the target material. In this work, an integrated approach to reliability assessment of such targets, with deposit weights in excess of 50 lb., is undertaken, involving thermal-mechanical characterization of the material in the as-deposited condition, characterization of the interface adhesion on a cylindrical substrate in the as-deposited condition, and development of means to assess target integrity under thermal-mechanical loads during the physical vapor deposition (PVD) sputtering process. Mechanical characterization of the cold-spray-deposited CIG alloy is accomplished through indentation testing and an adaptation of the Brazilian disk test. A custom lever test was developed to characterize adhesion along the cylindrical interface between the CIG deposit and the substrate, overcoming limitations of current standards. A cohesive zone model for crack initiation and propagation at the deposit interface is developed and validated using the lever test, and later used to simulate potential catastrophic target failure in the PVD process. It is shown that this approach enables reliability assessment of sputter targets and improves robustness.

  4. Reliability based design of the primary structure of oil tankers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, G.; Dogliani, M.; Guedes Soares, C.

    1996-12-31

    The present paper describes the reliability analysis carried out for two oil tanker ships having comparable dimensions but different designs. The scope of the analysis was to derive indications of the value of the reliability index obtained for existing, typical, well-designed oil tankers, as well as to apply the tentative rule-checking formulation developed within the CEC-funded SHIPREL Project. The checking formula was adopted to redesign the midships section of one of the considered ships, upgrading her in order to meet the target failure probability considered in the rule development process. The resulting structure, in view of an upgrade of the steel grade in the central part of the deck, led to a suitable reliability level. The results of the analysis clearly showed that a large scatter presently exists in the design safety levels of ships, even when the Classification Societies' unified requirements are satisfied. A reliability-based approach for the calibration of the rules for the global strength of ships is therefore proposed, in order to assist designers and Classification Societies in producing ships which are more optimized with respect to ensured safety levels. Based on the work reported in the paper, the feasibility and usefulness of a reliability-based approach in the development of ship longitudinal strength requirements have been demonstrated.

  5. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation-based design plays an important role in designing almost any kind of automotive, aerospace, and consumer product under these competitive conditions. Single-discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability-based design optimization in a simulation-based design environment. The original contributions of this research are a novel, efficient, and robust unilevel methodology for reliability-based design optimization, an innovative decoupled reliability-based design optimization methodology, the application of homotopy techniques to the unilevel methodology, and a new framework for reliability-based design optimization under epistemic uncertainty. The unilevel methodology is shown to be mathematically equivalent to the traditional nested formulation, and numerical test problems show that it can reduce computational cost by at least 50% compared with the nested approach. The decoupled methodology is an approximate technique for obtaining consistent reliable designs at lower computational expense; test problems show that it is computationally efficient compared with the nested approach. A framework for performing reliability-based design optimization under epistemic uncertainty is also developed
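
    The nested formulation this dissertation improves upon is easy to picture: an outer loop searches over the design variables while an inner loop evaluates the failure probability of each candidate design. A minimal sketch on a toy limit state follows; the g = capacity - load model, the distributions, and the 1e-2 target failure probability are illustrative assumptions, not the dissertation's test problems.

    ```python
    # Sketch: the classical nested ("double-loop") RBDO setup, an outer
    # design search with an inner Monte Carlo reliability loop. The limit
    # state g = capacity - load, the distributions, and the 1e-2 target
    # failure probability are illustrative assumptions.
    import numpy as np
    from scipy.optimize import brentq

    rng = np.random.default_rng(0)
    LOAD = rng.normal(100.0, 15.0, size=200_000)   # random demand (fixed sample)
    EPS = rng.normal(0.0, 0.05, size=200_000)      # capacity scatter
    P_TARGET = 1e-2

    def failure_prob(d: float) -> float:
        """Inner loop: crude Monte Carlo estimate of P(capacity < load)."""
        capacity = 40.0 * d * (1.0 + EPS)
        return float(np.mean(capacity < LOAD))

    # Outer loop: weight is proportional to d, so the minimum-weight design
    # that satisfies the reliability constraint sits where Pf(d) == P_TARGET.
    d_opt = brentq(lambda d: failure_prob(d) - P_TARGET, 1.0, 10.0, xtol=1e-3)
    print(f"optimal d = {d_opt:.3f}, Pf = {failure_prob(d_opt):.4f}")
    ```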

  6. Study on evaluation of construction reliability for engineering project based on fuzzy language operator

    NASA Astrophysics Data System (ADS)

    Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping

    2018-03-01

    System reliability theory has been a research hotspot of management science and systems engineering in recent years, and construction reliability is useful for the quantitative evaluation of project management level. According to reliability theory and the target system of engineering project management, a definition of construction reliability is given. Based on fuzzy mathematics theory and language operators, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through the operation of language operators, which provides the basis of method and parameters for the evaluation of construction reliability. This method is shown to be scientific and reasonable for construction conditions and a useful attempt at theory and method research on engineering project system reliability.
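
    To make the "seven fuzzy subsets" construction concrete, the sketch below partitions a [0, 1] reliability value space into seven overlapping subsets with linguistic labels and reports the membership degrees for a given construction reliability value. The triangular membership shape and these particular labels are assumptions; the paper derives its own functions from language operators.

    ```python
    # Sketch: seven fuzzy subsets over a construction-reliability value space
    # [0, 1], with linguistic labels. Triangular membership functions and
    # these particular labels are assumptions; the paper derives its own
    # membership functions from language operators.
    import numpy as np

    LABELS = ["very low", "low", "fairly low", "medium",
              "fairly high", "high", "very high"]
    CENTERS = np.linspace(0.0, 1.0, 7)        # peaks of the 7 triangles
    WIDTH = CENTERS[1] - CENTERS[0]

    def memberships(r: float) -> dict:
        """Degree of membership of reliability value r in each fuzzy subset."""
        mu = np.clip(1.0 - np.abs(r - CENTERS) / WIDTH, 0.0, 1.0)
        return dict(zip(LABELS, mu.round(3)))

    print(memberships(0.72))   # mostly "fairly high", partly "high"
    ```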

  7. Regional Reliability of Quantitative Signal Targeting with Alternating Radiofrequency (STAR) Labeling of Arterial Regions (QUASAR)

    PubMed Central

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    BACKGROUND AND PURPOSE Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. METHODS Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times each, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. RESULTS The mean CBF over whole gray matter was 37.74, with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94, with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability across different feeding arteries. CONCLUSIONS Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. PMID:25370338

  8. Regional reliability of quantitative signal targeting with alternating radiofrequency (STAR) labeling of arterial regions (QUASAR).

    PubMed

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times each, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. The mean CBF over whole gray matter was 37.74, with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94, with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability across different feeding arteries. Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter.
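
    The ICC figures quoted in both records (.70 for gray matter, .30 for white matter) come from a two-way random-effects intraclass correlation. A minimal sketch of the usual ICC(2,1) computation on a synthetic subjects-by-scans matrix is given below; the toy data and effect sizes are assumptions, not the study's measurements.

    ```python
    # Sketch: two-way random-effects intraclass correlation, ICC(2,1), the
    # usual test-retest statistic behind values like the 0.70 (gray matter)
    # and 0.30 (white matter) quoted above. The toy subjects-by-scans data
    # matrix and its effect sizes are assumptions.
    import numpy as np

    def icc_2_1(Y: np.ndarray) -> float:
        n, k = Y.shape                       # subjects x repeated measurements
        grand = Y.mean()
        ms_rows = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)
        ms_cols = n * np.sum((Y.mean(axis=0) - grand) ** 2) / (k - 1)
        resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0) + grand
        ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    rng = np.random.default_rng(1)
    subject_effect = rng.normal(37.7, 6.0, size=(10, 1))  # per-subject rCBF level
    scans = subject_effect + rng.normal(0.0, 3.0, size=(10, 4))
    print(f"ICC(2,1) = {icc_2_1(scans):.2f}")
    ```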

  9. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with average diameter and average length of 15 inches. Structural reliability is based on axial buckling strength of the cylinder. Both Monte Carlo simulation and First Order Reliability Method are considered for reliability analysis with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure and optimization results are included in this report.

  10. Custom oligonucleotide array-based CGH: a reliable diagnostic tool for detection of exonic copy-number changes in multiple targeted genes

    PubMed Central

    Vasson, Aurélie; Leroux, Céline; Orhant, Lucie; Boimard, Mathieu; Toussaint, Aurélie; Leroy, Chrystel; Commere, Virginie; Ghiotti, Tiffany; Deburgrave, Nathalie; Saillour, Yoann; Atlan, Isabelle; Fouveaut, Corinne; Beldjord, Cherif; Valleix, Sophie; Leturcq, France; Dodé, Catherine; Bienvenu, Thierry; Chelly, Jamel; Cossée, Mireille

    2013-01-01

    The frequency of disease-related large rearrangements (referred to as copy-number mutations, CNMs) varies among genes, and the search for these mutations has an important place in diagnostic strategies. In recent years, the CGH method using custom-designed high-density oligonucleotide-based arrays has provided a powerful tool for detecting alterations at the level of exons, with flexibility afforded by the possibility of modeling chips. The aim of our study was to test a custom-designed oligonucleotide CGH array in a diagnostic laboratory setting that analyzes several genes involved in various genetic diseases, and to compare it with conventional strategies. To this end, we designed a 12-plex CGH array (135k; 135,000 probes/subarray) (Roche NimbleGen) with exonic and intronic oligonucleotide probes covering 26 genes routinely analyzed in the laboratory. We tested control samples with known CNMs and patients for whom the genetic causes underlying their disorders were unknown. The contribution of this technique is undeniable: it proved reproducible, reliable, and sensitive enough to detect heterozygous single-exon deletions or duplications, complex rearrangements, and somatic mosaicism. In addition, it improves the reliability of CNM detection and allows boundaries to be determined precisely enough to direct targeted sequencing of breakpoints. All of these points, together with the possibility of simultaneous analysis of several genes and 'homemade' scalability, make it a valuable tool as a new diagnostic approach for CNMs. PMID:23340513

  11. Vision-Based Target Finding and Inspection of a Ground Target Using a Multirotor UAV System.

    PubMed

    Hinas, Ajmal; Roberts, Jonathan M; Gonzalez, Felipe

    2017-12-17

    In this paper, a system that uses an algorithm for target detection and navigation and a multirotor Unmanned Aerial Vehicle (UAV) for finding a ground target and inspecting it closely is presented. The system can also be used for accurate and safe delivery of payloads or spot spraying applications in site-specific crop management. A downward-looking camera attached to a multirotor is used to find the target on the ground. The UAV descends to the target and hovers above the target for a few seconds to inspect the target. A high-level decision algorithm based on an OODA (observe, orient, decide, and act) loop was developed as a solution to address the problem. Navigation of the UAV was achieved by continuously sending local position messages to the autopilot via Mavros. The proposed system performed hovering above the target in three different stages: locate, descend, and hover. The system was tested in multiple trials, in simulations and outdoor tests, from heights of 10 m to 40 m. Results show that the system is highly reliable and robust to sensor errors, drift, and external disturbance.

  12. Reliable motion detection of small targets in video with low signal-to-clutter ratios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, S.A.; Naylor, R.B.

    1995-07-01

    Studies show that vigilance decreases rapidly after several minutes when human operators are required to search live video for infrequent intrusion detections. Therefore, there is a need for systems which can automatically detect targets in live video and reserve the operator's attention for assessment only. Thus far, automated systems have not simultaneously provided adequate detection sensitivity, false alarm suppression, and ease of setup when used in external, unconstrained environments. This unsatisfactory performance can be exacerbated by poor video imagery with low contrast, high noise, dynamic clutter, image misregistration, and/or the presence of small, slow, or erratically moving targets. This paper describes a highly adaptive video motion detection and tracking algorithm which has been developed as part of Sandia's Advanced Exterior Sensor (AES) program. The AES is a wide-area detection and assessment system for use in unconstrained exterior security applications. The AES detection and tracking algorithm provides good performance under stressing data and environmental conditions. Features of the algorithm include: reliable detection, with a negligible false alarm rate, of variable-velocity targets having low signal-to-clutter ratios; reliable tracking of targets that exhibit non-inertial motion, i.e., motion that varies in direction and velocity; automatic adaptation to both infrared and visible imagery of variable quality; and suppression of false alarms caused by sensor flaws and/or cutouts.

  13. CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool

    PubMed Central

    del Sol Keyer, Maria; Wittbrodt, Joachim; Mateo, Juan L.

    2015-01-01

    Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5' end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, efficient and reliable in silico selection and evaluation of the targeting site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation and non-homologous end-joining as well as homology-directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high-quality target sites. PMID:25909470
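
    The central computation in a tool of this kind is the genome-wide enumeration of candidate sites that match the 20-nt protospacer up to a few mismatches and carry an NGG PAM. The sketch below reduces that to a naive forward-strand scan; off-target scoring, ranking, and reverse-complement handling are omitted, and the toy genome string is an assumption, not CCTop's algorithm.

    ```python
    # Sketch: naive forward-strand enumeration of candidate CRISPR/Cas9 sites:
    # a 20-nt protospacer with at most max_mm mismatches, followed by an NGG
    # PAM. Scoring, ranking, and the reverse strand are omitted; the toy
    # genome string is an assumption.
    def find_sites(guide: str, genome: str, max_mm: int = 3):
        hits, g_len = [], len(guide)            # guide is typically 20 nt
        for i in range(len(genome) - g_len - 2):
            site = genome[i:i + g_len]
            pam = genome[i + g_len:i + g_len + 3]
            if pam[1:] != "GG":                 # require an NGG PAM
                continue
            mm = sum(a != b for a, b in zip(guide, site))
            if mm <= max_mm:
                hits.append((i, site, pam, mm))
        return sorted(hits, key=lambda h: h[3])  # fewest mismatches first

    guide = "ACGTACGGATCGATCGTACG"
    genome = "TT" + guide + "TGG" + "AA" + "ACGTACGGATCGATCGTACC" + "AGG"
    for pos, site, pam, mm in find_sites(guide, genome):
        print(pos, site, pam, mm)               # exact hit, then 1-mismatch hit
    ```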

  14. Reliability modeling of fault-tolerant computer based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1987-01-01

    Digital fault-tolerant computer-based systems have become commonplace in military and commercial avionics. These systems hold the promise of increased availability, reliability, and maintainability over conventional analog-based systems through the application of replicated digital computers arranged in fault-tolerant configurations. Three tightly coupled factors of paramount importance, ultimately determining the viability of these systems, are reliability, safety, and profitability. Reliability, the major driver, affects virtually every aspect of design, packaging, and field operations, and eventually produces profit for commercial applications or increased national security. However, the use of digital computer systems makes the task of producing credible reliability assessments a formidable one for the reliability engineer. The root of the problem lies in the digital computer's unique adaptability to changing requirements, its computational power, and its ability to test itself efficiently. Addressed here are the nuances of modeling the reliability of systems with large state sizes, in the Markov sense, which result from replicated redundant hardware, and the modeling of factors which can reduce reliability without concomitant depletion of hardware. Advanced fault-handling models are described, and methods of acquiring and measuring parameters for these models are delineated.

  15. Fusion-based multi-target tracking and localization for intelligent surveillance systems

    NASA Astrophysics Data System (ADS)

    Rababaah, Haroun; Shirkhodaie, Amir

    2008-04-01

    In this paper, we present two approaches addressing visual target tracking and localization in complex urban environments: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for associated tracking of multiple targets of interest. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian mixture model. Foreground segmentation, on the other hand, was achieved by the connected components analysis (CCA) technique. The track of each individual target was estimated by fusing two sources of information: the centroid with spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image-pixel-to-world-coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs: in coarse estimation, an approximate neighborhood of the target position is estimated based on a nearest 4-neighbor method, and in fine estimation, we use Euclidean interpolation to localize the position within the estimated four neighbors. Both techniques were tested and showed reliable results for tracking and localization of targets of interest in complex urban environments.

  16. A rainwater harvesting system reliability model based on nonparametric stochastic rainfall generator

    NASA Astrophysics Data System (ADS)

    Basinger, Matt; Montalto, Franco; Lall, Upmanu

    2010-10-01

    The reliability with which harvested rainwater can be used for flushing toilets, irrigating gardens, and topping off air-conditioners serving multifamily residential buildings in New York City is assessed using a new rainwater harvesting (RWH) system reliability model. Although demonstrated with a specific case study, the model is portable because it is based on a nonparametric rainfall generation procedure utilizing a bootstrapped Markov chain. Precipitation occurrence is simulated using transition probabilities derived for each day of the year from the historical probability of wet- and dry-day state changes. Precipitation amounts are selected from a matrix of historical values within a moving 15-day window centered on the target day. RWH system reliability is determined for user-specified catchment area and tank volume ranges using precipitation ensembles generated by the described stochastic procedure. The reliability with which NYC backyard gardens can be irrigated and air-conditioning units supplied with water harvested from local roofs exceeds 80% and 90%, respectively, for the entire range of catchment areas and tank volumes considered in the analysis. For RWH systems installed on the most commonly occurring rooftop catchment areas found in NYC (51-75 m²), toilet flushing demand can be met with 7-40% reliability, with the lower end of the range representing buildings with high-flow toilets and no storage elements, and the upper end representing buildings that feature low-flow fixtures and storage tanks of up to 5 m³. When the reliability curves developed here are used to size RWH systems to flush the low-flow toilets of all multifamily buildings in a typical residential neighborhood in the Bronx, rooftop runoff inputs to the sewer system are reduced by approximately 28% over an average rainfall year, and potable water demand is reduced by approximately 53%.
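
    The two model components, a bootstrapped Markov-chain rainfall generator and a daily tank mass balance, can be sketched compactly. Below, transition probabilities are estimated from a stand-in historical record and wet-day depths are bootstrapped from the whole record rather than a day-of-year 15-day window; the catchment, tank, demand, and runoff-coefficient values are toy assumptions.

    ```python
    # Sketch: bootstrapped Markov-chain rainfall generator feeding a daily
    # tank mass balance. Wet-day depths are resampled from the whole record
    # instead of a day-of-year 15-day window, and the stand-in "historical"
    # record, catchment, tank, demand, and runoff coefficient are assumptions.
    import numpy as np

    rng = np.random.default_rng(7)
    hist = rng.gamma(0.4, 8.0, size=3650) * (rng.random(3650) < 0.3)  # mm/day
    wet = hist > 0.0
    p_ww = wet[1:][wet[:-1]].mean()        # P(wet today | wet yesterday)
    p_wd = wet[1:][~wet[:-1]].mean()       # P(wet today | dry yesterday)
    wet_depths = hist[wet]                 # bootstrap pool of wet-day depths

    def simulate_rain(days: int) -> np.ndarray:
        rain, is_wet = np.zeros(days), False
        for t in range(days):
            is_wet = rng.random() < (p_ww if is_wet else p_wd)
            rain[t] = rng.choice(wet_depths) if is_wet else 0.0
        return rain

    def rwh_reliability(area_m2=60.0, tank_m3=3.0, demand_m3=0.15, days=3650):
        store, met = 0.0, 0
        for r in simulate_rain(days):
            store = min(store + 0.9 * area_m2 * r / 1000.0, tank_m3)  # 0.9 runoff
            drawn = min(store, demand_m3)
            store -= drawn
            met += drawn >= demand_m3
        return met / days

    print(f"toilet-flushing reliability: {rwh_reliability():.0%}")
    ```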

  17. New support vector machine-based method for microRNA target prediction.

    PubMed

    Li, L; Gao, Q; Mao, X; Cao, Y

    2014-06-09

    MicroRNA (miRNA) plays important roles in cell differentiation, proliferation, growth, mobility, and apoptosis. An accurate list of precise target genes is necessary in order to fully understand the importance of miRNAs in animal development and disease. Several computational methods have been proposed for miRNA target-gene identification; however, these methods still have limitations with respect to their sensitivity and accuracy. Thus, we developed a new miRNA target-prediction method based on the support vector machine (SVM) model. The model supplies information on two binding sites (primary and secondary) to a radial basis function kernel as a similarity measure for SVM features. The information is categorized into structural, thermodynamic, and sequence-conservation features. Using high-confidence datasets selected from public miRNA target databases, we obtained a human miRNA target SVM classifier model with high performance, providing an efficient tool for human miRNA target-gene identification. Experiments have shown that our method is a reliable tool for miRNA target-gene prediction and a successful application of an SVM classifier. Compared with other methods, the method proposed here improves the sensitivity and accuracy of miRNA target prediction, and its performance can be further improved by providing more training examples.

  18. GalaxyTBM: template-based modeling by building a reliable core and refining unreliable local regions.

    PubMed

    Ko, Junsu; Park, Hahnbeom; Seok, Chaok

    2012-08-10

    Protein structures can be reliably predicted by template-based modeling (TBM) when experimental structures of homologous proteins are available. However, it is challenging to obtain structures more accurate than the single best templates by either combining information from multiple templates or by modeling regions that vary among templates or are not covered by any templates. We introduce GalaxyTBM, a new TBM method in which the more reliable core region is modeled first from multiple templates and less reliable, variable local regions, such as loops or termini, are then detected and re-modeled by an ab initio method. This TBM method is based on "Seok-server," which was tested in CASP9 and assessed to be amongst the top TBM servers. The accuracy of the initial core modeling is enhanced by focusing on more conserved regions in the multiple-template selection and multiple sequence alignment stages. Additional improvement is achieved by ab initio modeling of up to 3 unreliable local regions in the fixed framework of the core structure. Overall, GalaxyTBM reproduced the performance of Seok-server, with GalaxyTBM and Seok-server resulting in average GDT-TS of 68.1 and 68.4, respectively, when tested on 68 single-domain CASP9 TBM targets. For application to multi-domain proteins, GalaxyTBM must be combined with domain-splitting methods. Application of GalaxyTBM to CASP9 targets demonstrates that accurate protein structure prediction is possible by use of a multiple-template-based approach, and ab initio modeling of variable regions can further enhance the model quality.

  19. Reliability of digital reactor protection system based on extenics.

    PubMed

    Zhao, Jing; He, Ya-Nan; Gu, Peng-Fei; Chen, Wei-Hua; Gao, Feng

    2016-01-01

    After the Fukushima nuclear accident, the safety of nuclear power plants (NPPs) has drawn widespread concern. The reliability of the reactor protection system (RPS) is directly related to the safety of NPPs; however, it is difficult to accurately evaluate the reliability of a digital RPS. Methods based on probability estimation carry uncertainties, cannot reflect the reliability status of the RPS dynamically, and do not support maintenance and troubleshooting. In this paper, a quantitative reliability analysis method based on extenics is proposed for the digital (safety-critical) RPS, by which the relationship between the reliability and response time of the RPS is constructed. As an example, the reliability of the RPS for a CPR1000 NPP is modeled and analyzed by the proposed method. The results show that the proposed method is capable of estimating RPS reliability effectively and provides support for maintenance and troubleshooting of digital RPS systems.

  20. Complementary Reliability-Based Decodings of Binary Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1997-01-01

    This correspondence presents a hybrid reliability-based decoding algorithm which combines the reprocessing method based on the most reliable basis and a generalized Chase-type algebraic decoder based on the least reliable positions. It is shown that reprocessing with a simple additional algebraic decoding effort achieves significant coding gain. For long codes, the order of reprocessing required to achieve asymptotic optimum error performance is reduced by approximately 1/3. This significantly reduces the computational complexity, especially for long codes. Also, a more efficient criterion for stopping the decoding process is derived based on the knowledge of the algebraic decoding solution.
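
    The least-reliable-positions half of this hybrid is easy to caricature in code: take hard decisions from the soft inputs, flip the few least reliable bits in all combinations, run an algebraic decoder on each test pattern, and keep the candidate with the smallest correlation discrepancy. In the sketch below a Hamming(7,4) syndrome decoder stands in for the paper's algebraic decoder, and the two-position Chase-2 setting is an assumption; the most-reliable-basis reprocessing half is not shown.

    ```python
    # Sketch: a Chase-type decoder built on the least reliable positions, with
    # a Hamming(7,4) syndrome decoder standing in for the algebraic decoder.
    # The 2-position Chase-2 setting is an assumption; the most-reliable-basis
    # reprocessing half of the paper's hybrid is not shown.
    import itertools

    import numpy as np

    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])    # Hamming(7,4) parity-check matrix

    def hamming_decode(bits: np.ndarray) -> np.ndarray:
        """Hard-decision syndrome decoding: corrects any single-bit error."""
        syndrome = H @ bits % 2
        pos = syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2]  # error position
        out = bits.copy()
        if pos:
            out[pos - 1] ^= 1
        return out

    def chase2_decode(llr: np.ndarray) -> np.ndarray:
        hard = (llr < 0).astype(int)             # hard decisions from soft input
        lrp = np.argsort(np.abs(llr))[:2]        # 2 least reliable positions
        best, best_metric = None, np.inf
        for flips in itertools.product([0, 1], repeat=len(lrp)):
            test = hard.copy()
            test[lrp] ^= np.array(flips)
            cand = hamming_decode(test)
            metric = np.sum(np.abs(llr)[cand != hard])  # correlation discrepancy
            if metric < best_metric:
                best, best_metric = cand, metric
        return best

    llr = np.array([2.1, -0.3, 1.7, 0.2, -2.5, 1.1, -1.9])  # noisy soft input
    print(chase2_decode(llr))
    ```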

  21. Delay Analysis of Car-to-Car Reliable Data Delivery Strategies Based on Data Mulling with Network Coding

    NASA Astrophysics Data System (ADS)

    Park, Joon-Sang; Lee, Uichin; Oh, Soon Young; Gerla, Mario; Lun, Desmond Siumen; Ro, Won Woo; Park, Joonseok

    Vehicular ad hoc networks (VANETs) aim to enhance vehicle navigation safety by providing an early warning system: any chance of an accident is communicated through wireless communication between vehicles. For the warning system to work, it is crucial that safety messages be reliably delivered to the target vehicles in a timely manner; reliable and timely data dissemination is thus the key building block of a VANET. A data mulling technique combined with three strategies (network coding, erasure coding, and repetition coding) is proposed for this reliable and timely data dissemination service. In particular, vehicles traveling in the opposite direction on a highway are exploited as data mules, mobile nodes that physically deliver data to destinations, to overcome the intermittent network connectivity caused by sparse vehicle traffic. Using analytic models, we show that in such a highway data-mulling scenario the network-coding-based strategy outperforms the erasure-coding-based and repetition-based strategies.

  22. Targeted cellular ablation based on the morphology of malignant cells

    NASA Astrophysics Data System (ADS)

    Ivey, Jill W.; Latouche, Eduardo L.; Sano, Michael B.; Rossmeisl, John H.; Davalos, Rafael V.; Verbridge, Scott S.

    2015-11-01

    Treatment of glioblastoma multiforme (GBM) is especially challenging due to a shortage of methods to preferentially target diffuse infiltrative cells, and therapy-resistant glioma stem cell populations. Here we report a physical treatment method based on electrical disruption of cells, whose action depends strongly on cellular morphology. Interestingly, numerical modeling suggests that while outer lipid bilayer disruption induced by long pulses (~100 μs) is enhanced for larger cells, short pulses (~1 μs) preferentially result in high fields within the cell interior, which scale in magnitude with nucleus size. Because enlarged nuclei represent a reliable indicator of malignancy, this suggested a means of preferentially targeting malignant cells. While we demonstrate killing of both normal and malignant cells using pulsed electric fields (PEFs) to treat spontaneous canine GBM, we proposed that properly tuned PEFs might provide targeted ablation based on nuclear size. Using 3D hydrogel models of normal and malignant brain tissues, which permit high-resolution interrogation during treatment testing, we confirmed that PEFs could be tuned to preferentially kill cancerous cells. Finally, we estimated the nuclear envelope electric potential disruption needed for cell death from PEFs. Our results may be useful in safely targeting the therapy-resistant cell niches that cause recurrence of GBM tumors.

  23. Targeted cellular ablation based on the morphology of malignant cells

    PubMed Central

    Ivey, Jill W.; Latouche, Eduardo L.; Sano, Michael B.; Rossmeisl, John H.; Davalos, Rafael V.; Verbridge, Scott S.

    2015-01-01

    Treatment of glioblastoma multiforme (GBM) is especially challenging due to a shortage of methods to preferentially target diffuse infiltrative cells, and therapy-resistant glioma stem cell populations. Here we report a physical treatment method based on electrical disruption of cells, whose action depends strongly on cellular morphology. Interestingly, numerical modeling suggests that while outer lipid bilayer disruption induced by long pulses (~100 μs) is enhanced for larger cells, short pulses (~1 μs) preferentially result in high fields within the cell interior, which scale in magnitude with nucleus size. Because enlarged nuclei represent a reliable indicator of malignancy, this suggested a means of preferentially targeting malignant cells. While we demonstrate killing of both normal and malignant cells using pulsed electric fields (PEFs) to treat spontaneous canine GBM, we proposed that properly tuned PEFs might provide targeted ablation based on nuclear size. Using 3D hydrogel models of normal and malignant brain tissues, which permit high-resolution interrogation during treatment testing, we confirmed that PEFs could be tuned to preferentially kill cancerous cells. Finally, we estimated the nuclear envelope electric potential disruption needed for cell death from PEFs. Our results may be useful in safely targeting the therapy-resistant cell niches that cause recurrence of GBM tumors. PMID:26596248

  24. Gaussian mixture models-based ship target recognition algorithm in remote sensing infrared images

    NASA Astrophysics Data System (ADS)

    Yao, Shoukui; Qin, Xiaojuan

    2018-02-01

    Since the resolution of remote sensing infrared images is low, the features of ship targets become unstable, and how to recognize ships with such fuzzy features remains an open problem. In this paper, we propose a novel ship target recognition algorithm based on Gaussian mixture models (GMMs). The proposed algorithm has two main steps. In the first step, the Hu moments of the ship target images are calculated and GMMs are trained on the moment features of the ships. In the second step, the moment feature of each ship image is assigned to the trained GMMs for recognition. Because of the scale, rotation, and translation invariance of Hu moments and the powerful feature-space description ability of GMMs, the GMM-based ship target recognition algorithm can recognize ships reliably. Experimental results on a large simulated image set show that our approach is effective in distinguishing different ship types and obtains satisfactory ship recognition performance.
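
    The two steps the abstract names map directly onto a few library calls: Hu moments as scale/rotation/translation-invariant features, one GMM per ship type, and classification by maximum log-likelihood. The sketch below uses OpenCV and scikit-learn with synthetic silhouettes standing in for infrared image chips; the image sizes and the 2-component mixtures are assumptions.

    ```python
    # Sketch: Hu-moment features plus one Gaussian mixture model per ship
    # type, classified by maximum log-likelihood. Synthetic silhouettes stand
    # in for infrared image chips; sizes and 2-component mixtures are
    # assumptions. Requires opencv-python and scikit-learn.
    import cv2
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    def hu_features(img: np.ndarray) -> np.ndarray:
        hu = cv2.HuMoments(cv2.moments(img)).ravel()
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # usual log scaling

    def fake_ship(width: int) -> np.ndarray:
        """Toy binary silhouette with noise, standing in for a real IR chip."""
        img = np.zeros((32, 32), np.float32)
        img[14:18, 16 - width:16 + width] = 1.0
        return img + rng.normal(0, 0.05, img.shape).astype(np.float32)

    classes = {"frigate": 6, "tanker": 12}      # class name -> silhouette width
    models = {}
    for name, width in classes.items():
        X = np.vstack([hu_features(fake_ship(width)) for _ in range(50)])
        models[name] = GaussianMixture(n_components=2, random_state=0).fit(X)

    probe = hu_features(fake_ship(12))[None, :]
    print(max(models, key=lambda m: models[m].score_samples(probe)[0]))  # tanker
    ```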

  25. Neural Networks Based Approach to Enhance Space Hardware Reliability

    NASA Technical Reports Server (NTRS)

    Zebulum, Ricardo S.; Thakoor, Anilkumar; Lu, Thomas; Franco, Lauro; Lin, Tsung Han; McClure, S. S.

    2011-01-01

    This paper demonstrates the use of neural networks as a device modeling tool to increase the accuracy of reliability analysis for circuits targeted at space applications. The paper tackles a number of case studies of relevance to the design of flight hardware. The results show that the proposed technique generates more accurate models than those regularly used to model circuits.

  26. Enhancing emotional-based target prediction

    NASA Astrophysics Data System (ADS)

    Gosnell, Michael; Woodley, Robert

    2008-04-01

    This work extends existing agent-based target movement prediction to include key ideas of behavioral inertia, steady states, and catastrophic change from existing psychological, sociological, and mathematical work. Existing target prediction work inherently assumes a single steady state for target behavior, and attempts to classify behavior based on a single emotional state set. The enhanced, emotional-based target prediction maintains up to three distinct steady states, or typical behaviors, based on a target's operating conditions and observed behaviors. Each steady state has an associated behavioral inertia, similar to the standard deviation of behaviors within that state. The enhanced prediction framework also allows steady state transitions through catastrophic change and individual steady states could be used in an offline analysis with additional modeling efforts to better predict anticipated target reactions.

  27. Advanced reliability modeling of fault-tolerant computer-based systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1982-01-01

    Two methodologies for the reliability assessment of fault tolerant digital computer based systems are discussed. The computer-aided reliability estimation 3 (CARE 3) and gate logic software simulation (GLOSS) are assessment technologies that were developed to mitigate a serious weakness in the design and evaluation process of ultrareliable digital systems. The weak link is based on the unavailability of a sufficiently powerful modeling technique for comparing the stochastic attributes of one system against others. Some of the more interesting attributes are reliability, system survival, safety, and mission success.

  28. Reliability evaluation of microgrid considering incentive-based demand response

    NASA Astrophysics Data System (ADS)

    Huang, Ting-Cheng; Zhang, Yong-Jun

    2017-07-01

    Incentive-based demand response (IBDR) can guide customers to adjust their electricity consumption behaviour and actively curtail load. Meanwhile, distributed generation (DG) and energy storage systems (ESS) can provide time for the implementation of IBDR. This paper focuses on the reliability evaluation of a microgrid considering IBDR. First, the mechanism of IBDR and its impact on power supply reliability are analysed. Second, an IBDR dispatch model considering the customer's comprehensive assessment and a customer response model are developed. Third, a reliability evaluation method considering IBDR, based on Monte Carlo simulation, is proposed. Finally, the validity of the above models and method is studied through numerical tests on the modified RBTS Bus6 test system. Simulation results demonstrate that IBDR can improve the reliability of a microgrid.

  29. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  30. Developing safety performance functions incorporating reliability-based risk measures.

    PubMed

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions.
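
    The reliability machinery here is standard FORM: transform the random variables to standard normal space, find the most probable failure point with the HL-RF iteration, and report beta and P(nc) = Phi(-beta). The sketch below does this for the named limit state, available minus required stopping sight distance; the available sight distance and the distribution parameters are illustrative assumptions, not the paper's calibrated values.

    ```python
    # Sketch: FORM via the HL-RF iteration for the horizontal-curve limit
    # state named above, g = available minus required stopping sight distance.
    # The available sight distance and the distribution parameters are
    # illustrative assumptions.
    import numpy as np
    from scipy.stats import norm

    MU = np.array([100.0, 2.5, 0.34])  # speed (km/h), reaction time (s), friction
    SIG = np.array([10.0, 0.3, 0.05])
    ASD = 220.0                        # available sight distance (m), assumed

    def g(x: np.ndarray) -> float:
        v, t, f = x
        demand = 0.278 * v * t + v ** 2 / (254.0 * f)  # standard SSD formula
        return ASD - demand                            # g < 0: non-compliance

    def grad_g_u(u: np.ndarray, h: float = 1e-6):
        """Value and numerical gradient of g in standard-normal space."""
        base = g(MU + SIG * u)
        grad = np.array([(g(MU + SIG * (u + h * e)) - base) / h
                         for e in np.eye(3)])
        return base, grad

    u = np.zeros(3)
    for _ in range(50):                # HL-RF fixed-point iteration
        gu, grad = grad_g_u(u)
        u_new = (grad @ u - gu) * grad / (grad @ grad)
        if np.linalg.norm(u_new - u) < 1e-8:
            break
        u = u_new

    beta = np.linalg.norm(u)
    print(f"beta = {beta:.3f}, P(nc) = {norm.cdf(-beta):.4f}")
    ```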

  31. Improving the reliability of inverter-based welding machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiedermayer, M.

    1997-02-01

    Although inverter-based welding power sources have been available since the late 1980s, many people hesitated to purchase them because of reliability issues. Unfortunately, their hesitancy had a basis, until now. Recent improvements give some inverters a reliability level that approaches that of traditional, transformer-based industrial welding machines, which have a failure rate of about 1%. Acceptance of inverter-based welding machines is important because, for many welding applications, they provide capabilities that solid-state, transformer-based machines cannot deliver. These advantages include enhanced pulsed gas metal arc welding (GMAW-P), lightweight portability, an ultrastable arc, and energy efficiency, all while producing highly aesthetic weld beads and delivering multiprocess capabilities.

  32. Drug Target Prediction and Repositioning Using an Integrated Network-Based Approach

    PubMed Central

    Emig, Dorothea; Ivliev, Alexander; Pustovalova, Olga; Lancashire, Lee; Bureeva, Svetlana; Nikolsky, Yuri; Bessarabova, Marina

    2013-01-01

    The discovery of novel drug targets is a significant challenge in drug development. Although the human genome comprises approximately 30,000 genes, proteins encoded by fewer than 400 are used as drug targets in the treatment of diseases. Therefore, novel drug targets are extremely valuable as the source for first-in-class drugs. On the other hand, many of the currently known drug targets are functionally pleiotropic and involved in multiple pathologies. Several of them are exploited for treating multiple diseases, which highlights the need for methods to reliably reposition drug targets to new indications. Network-based methods have been successfully applied to prioritize novel disease-associated genes. In recent years, several such algorithms have been developed, some focusing on local network properties only, and others taking the complete network topology into account. Common to all approaches is the understanding that novel disease-associated candidates are in close overall proximity to known disease genes. However, the relevance of these methods to the prediction of novel drug targets has not yet been assessed. Here, we present a network-based approach for the prediction of drug targets for a given disease. The method allows both repositioning drug targets known for other diseases to the given disease and the prediction of unexploited drug targets which are not used for treatment of any disease. Our approach takes as input a disease gene expression signature and a high-quality interaction network and outputs a prioritized list of drug targets. We demonstrate the high performance of our method and highlight the usefulness of the predictions in three case studies. We present novel drug targets for scleroderma and different types of cancer with their underlying biological processes. Furthermore, we demonstrate the ability of our method to identify non-suspected repositioning candidates using diabetes type 1 as an example. PMID:23593264

  33. Target-decoy Based False Discovery Rate Estimation for Large-scale Metabolite Identification.

    PubMed

    Wang, Xusheng; Jones, Drew R; Shaw, Timothy I; Cho, Ji-Hoon; Wang, Yuanyuan; Tan, Haiyan; Xie, Boer; Zhou, Suiping; Li, Yuxin; Peng, Junmin

    2018-05-23

    Metabolite identification is a crucial step in mass spectrometry (MS)-based metabolomics. However, it is still challenging to assess the confidence of assigned metabolites. In this study, we report a novel method for estimating the false discovery rate (FDR) of metabolite assignment with a target-decoy strategy, in which the decoys are generated by violating the octet rule of chemistry through the addition of small odd numbers of hydrogen atoms. The target-decoy strategy was integrated into JUMPm, an automated metabolite identification pipeline for large-scale MS analysis, and was also evaluated with two other metabolomics tools, mzMatch and mzMine 2. The reliability of the FDR calculation was examined using false datasets simulated by altering MS1 or MS2 spectra. Finally, we used the JUMPm pipeline coupled with the target-decoy strategy to process unlabeled and stable-isotope-labeled metabolomic datasets. The results demonstrate that the target-decoy strategy is a simple and effective method for evaluating the confidence of high-throughput metabolite identification.
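
    The counting step at the heart of any target-decoy scheme is small enough to show in full: at each score threshold, the FDR is estimated as the number of decoy matches passing divided by the number of target matches passing. The scores and the 50/50 target-decoy split in the sketch below are toy assumptions; decoy generation itself is chemistry-specific and not reproduced here.

    ```python
    # Sketch: target-decoy FDR estimation. At each score threshold, FDR is
    # estimated as decoys passing / targets passing. The score distributions
    # and the 50/50 target-decoy split are toy assumptions.
    import numpy as np

    def fdr_curve(scores: np.ndarray, is_decoy: np.ndarray):
        order = np.argsort(-scores)                # best score first
        decoys = np.cumsum(is_decoy[order])
        targets = np.cumsum(~is_decoy[order])
        return scores[order], decoys / np.maximum(targets, 1)

    rng = np.random.default_rng(3)
    scores = np.concatenate([rng.normal(3, 1, 400),     # target matches
                             rng.normal(0, 1, 400)])    # decoy matches
    is_decoy = np.concatenate([np.zeros(400, bool), np.ones(400, bool)])

    thr, fdr = fdr_curve(scores, is_decoy)
    idx = np.argmax(fdr > 0.01)                    # first point above 1% FDR
    print(f"score cutoff for ~1% FDR: {thr[max(idx - 1, 0)]:.2f}")
    ```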

  34. A Target-Less Vision-Based Displacement Sensor Based on Image Convex Hull Optimization for Measuring the Dynamic Response of Building Structures.

    PubMed

    Choi, Insub; Kim, JunHee; Kim, Donghyun

    2016-12-08

    Existing vision-based displacement sensors (VDSs) extract displacement data through changes in the movement of a target that is identified within the image using natural or artificial structure markers. Here, a target-less vision-based displacement sensor (hereafter called "TVDS") is proposed that can extract displacement data without targets, using feature points in the image of the structure instead. The TVDS extracts and tracks these feature points through image convex hull optimization, which adjusts and optimizes the threshold values so that every image frame yields the same convex hull, whose center serves as the feature point. In addition, the pixel coordinates of the feature point can be converted to physical coordinates through a scaling factor map calculated from the distance, angle, and focal length between the camera and target. The accuracy of the proposed scaling factor map was verified through an experiment in which the diameter of a circular marker was estimated. A white-noise excitation test was conducted, and the reliability of the displacement data obtained from the TVDS was analyzed by comparison with displacement data of the structure measured with a laser displacement sensor (LDS). The TVDS yielded highly reliable displacement data and highly accurate dynamic characteristics of the structure, such as the natural frequency and mode shape, which were extracted from the obtained displacement data and compared with numerical analysis results. As the proposed TVDS can easily extract displacement data even without artificial or natural markers, it has the advantage of extracting displacement data from any portion of the structure in the image.

  35. A new method of small target detection based on neural network

    NASA Astrophysics Data System (ADS)

    Hu, Jing; Hu, Yongli; Lu, Xinxin

    2018-02-01

    The detection and tracking of moving dim targets in infrared images has been a research hotspot for many years. The target in each frame occupies only a few pixels, without any shape or structure information. Moreover, small infrared targets are often submerged in complicated backgrounds with low signal-to-clutter ratios, making detection very difficult, and different backgrounds exhibit different statistical properties, which makes detection extremely complex. If the threshold segmentation is not reasonable, more noise points may appear in the final detection, which hampers recovery of the target's trajectory; single-frame detection may fail to find the desired target and cause a high false alarm rate. We believe that combining spatial detection of suspicious targets in each frame with temporal association for target tracking will increase the reliability of tracking dim targets. The detection of dim targets is divided into two parts. In the first part, we adopt a bilateral filtering method for background suppression; after threshold segmentation, the suspicious targets in each frame are extracted. We then use an LSTM (long short-term memory) neural network to predict the target coordinates in the next frame. This is a new method based on the movement characteristics of the target in image sequences, which can respond to changes in the relationship between past and future values. Simulation results demonstrate that the proposed algorithm can effectively predict the trajectory of a moving small target and works efficiently and robustly with a low false alarm rate.
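
    The temporal half of this scheme is a small sequence model: feed the last few track positions to an LSTM and regress the next (x, y), so that detections far from the prediction can be discounted. A minimal PyTorch sketch follows; the network sizes, window length, and synthetic trajectory are assumptions, not the paper's configuration.

    ```python
    # Sketch: an LSTM trained to predict the target's next (x, y) image
    # coordinates from the preceding track. Network sizes and the synthetic
    # trajectory are assumptions. Requires PyTorch.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    class TrackPredictor(nn.Module):
        def __init__(self, hidden: int = 32):
            super().__init__()
            self.lstm = nn.LSTM(input_size=2, hidden_size=hidden,
                                batch_first=True)
            self.head = nn.Linear(hidden, 2)

        def forward(self, seq):                  # seq: (batch, steps, 2)
            out, _ = self.lstm(seq)
            return self.head(out[:, -1])         # next (x, y) coordinate

    # Synthetic near-linear track with jitter, cut into 8-step training windows.
    t = torch.linspace(0.0, 4.0, 200)
    track = torch.stack([0.5 * t, 0.3 * t + 0.2], dim=1) \
        + 0.01 * torch.randn(200, 2)
    X = torch.stack([track[i:i + 8] for i in range(192)])
    y = torch.stack([track[i + 8] for i in range(192)])

    model = TrackPredictor()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()

    print(model(track[-8:][None]))               # predicted next position
    ```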

  36. Reliability-based structural optimization: A proposed analytical-experimental study

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Nikolaidis, Efstratios

    1993-01-01

    An analytical and experimental study for assessing the potential of reliability-based structural optimization is proposed and described. In the study, competing designs obtained by deterministic and reliability-based optimization are compared. The experimental portion of the study is practical because the structure selected is a modular, actively and passively controlled truss that consists of many identical members, and because the competing designs are compared in terms of their dynamic performance and are not destroyed if failure occurs. The analytical portion of this study is illustrated on a 10-bar truss example. In the illustrative example, it is shown that reliability-based optimization can yield a design that is superior to an alternative design obtained by deterministic optimization. These analytical results provide motivation for the proposed study, which is underway.

  37. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
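
    As a concrete instance of the NHPP-based SRM family discussed here, the sketch below fits the exponential (Goel-Okumoto) model, whose mean value function is m(t) = a(1 - e^(-bt)), to synthetic fault-detection times by maximum likelihood; the paper's proposal amounts to replacing the exponential cdf with the equilibrium distribution of the fault-detection time. The data and starting values are assumptions.

    ```python
    # Sketch: maximum-likelihood fit of the exponential (Goel-Okumoto) NHPP
    # model with mean value function m(t) = a * (1 - exp(-b t)); the paper's
    # proposal replaces this cdf with the equilibrium distribution of the
    # fault-detection time. Fault times and starting values are assumptions.
    import numpy as np
    from scipy.optimize import minimize

    fault_times = np.sort(np.random.default_rng(4).uniform(0, 80, 60) ** 1.2)
    T = fault_times.max() + 5.0               # end of the observation window

    def neg_loglik(params: np.ndarray) -> float:
        a, b = np.exp(params)                 # keep a, b positive
        m_T = a * (1.0 - np.exp(-b * T))      # expected faults detected by T
        intensity = a * b * np.exp(-b * fault_times)
        return m_T - np.sum(np.log(intensity))

    fit = minimize(neg_loglik, x0=np.log([80.0, 0.02]), method="Nelder-Mead")
    a_hat, b_hat = np.exp(fit.x)
    remaining = a_hat * np.exp(-b_hat * T)    # estimated faults still in the code
    print(f"a = {a_hat:.1f}, b = {b_hat:.4f}, remaining = {remaining:.1f}")
    ```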

  38. Co-detection: ultra-reliable nanoparticle-based electrical detection of biomolecules in the presence of large background interference.

    PubMed

    Liu, Yang; Gu, Ming; Alocilja, Evangelyn C; Chakrabartty, Shantanu

    2010-11-15

    An ultra-reliable technique for detecting trace quantities of biomolecules is reported. The technique called "co-detection" exploits the non-linear redundancy amongst synthetically patterned biomolecular logic circuits for deciphering the presence or absence of target biomolecules in a sample. In this paper, we verify the "co-detection" principle on gold-nanoparticle-based conductimetric soft-logic circuits which use a silver-enhancement technique for signal amplification. Using co-detection, we have been able to demonstrate a great improvement in the reliability of detecting mouse IgG at concentration levels that are 10^5 lower than the concentration of rabbit IgG which serves as background interference. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. A Camera-Based Target Detection and Positioning UAV System for Search and Rescue (SAR) Purposes

    PubMed Central

    Sun, Jingxuan; Li, Boyang; Jiang, Yifan; Wen, Chih-yung

    2016-01-01

    Wilderness search and rescue entails performing a wide range of work in complex environments and large regions. Given the challenge of covering large regions with limited rescue resources, unmanned aerial vehicle (UAV)-based frameworks are a promising platform for providing aerial imaging. In recent years, technological advances in areas such as micro-technology, sensors and navigation have influenced the various applications of UAVs. In this study, an all-in-one camera-based target detection and positioning system is developed and integrated into a fully autonomous fixed-wing UAV. The system presented in this paper is capable of on-board, real-time target identification, post-target identification and location, and aerial image collection for further mapping applications. Its performance is examined using several simulated search and rescue missions, and the test results demonstrate its reliability and efficiency. PMID:27792156

  20. A Camera-Based Target Detection and Positioning UAV System for Search and Rescue (SAR) Purposes.

    PubMed

    Sun, Jingxuan; Li, Boyang; Jiang, Yifan; Wen, Chih-Yung

    2016-10-25

    Wilderness search and rescue entails performing a wide range of work in complex environments and large regions. Given the challenge of covering large regions with limited rescue resources, unmanned aerial vehicle (UAV)-based frameworks are a promising platform for providing aerial imaging. In recent years, technological advances in areas such as micro-technology, sensors and navigation have influenced the various applications of UAVs. In this study, an all-in-one camera-based target detection and positioning system is developed and integrated into a fully autonomous fixed-wing UAV. The system presented in this paper is capable of on-board, real-time target identification, post-target identification and location, and aerial image collection for further mapping applications. Its performance is examined using several simulated search and rescue missions, and the test results demonstrate its reliability and efficiency.

  1. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with redesign rules to simulate alternative future test and redesign outcomes to form a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign provides a company an opportunity to balance development costs versus performance by simultaneously selecting the initial design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require search in multiple candidate regions of design space, expending most of the computation needed to define multiple alternate designs. Thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space to identify optima. The efficiency of this method
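
    A minimal Monte Carlo sketch of the first trade-off, under assumed normal distributions for computational and experimental errors and a hypothetical redesign rule; every number below is an illustrative stand-in, not the dissertation's model.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      design_margin = 0.10                      # nominal margin from the model
      e_calc = rng.normal(0.0, 0.05, n)         # computational (model) error
      e_test = rng.normal(0.0, 0.01, n)         # experimental (measurement) error

      true_margin = design_margin - e_calc      # margin the structure really has
      measured = true_margin + e_test           # what the single future test shows
      redesign = measured < 0.05                # redesign rule on the test outcome
      # after redesign, the design is shifted so the measured margin reaches 0.10
      corrected = true_margin + (0.10 - measured)
      final = np.where(redesign, corrected, true_margin)

      print("P(redesign)       =", redesign.mean())
      print("P(failure) before =", (true_margin < 0).mean())
      print("P(failure) after  =", (final < 0).mean())

    Sweeping the redesign threshold trades the probability of paying for a redesign against the final probability of failure, which is the cost-versus-reliability balance the methodology quantifies.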

  2. Reliability based fatigue design and maintenance procedures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1977-01-01

    A stochastic model has been developed to describe the fatigue process by assuming a varying hazard rate. This stochastic model can be used to obtain the desired probability of a crack of a certain length at a given location after a certain number of cycles or a certain time. Quantitative estimation of the developed model was also discussed. Application of the model to develop a procedure for reliability-based, cost-effective, fail-safe structural design is presented. This design procedure includes the reliability improvement due to inspection and repair. Methods of obtaining optimum inspection and maintenance schemes are treated.
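
    As a sketch of the underlying arithmetic, a varying hazard rate h(t) gives reliability R(t) = exp(-∫0^t h(u) du); a Weibull hazard is a common concrete choice. The parameters below are illustrative, not the paper's.

      import numpy as np
      from scipy.integrate import quad

      beta, eta = 1.8, 5000.0                   # illustrative Weibull parameters

      def hazard(t):
          """Increasing hazard rate (beta > 1 models wear-out/fatigue)."""
          return (beta / eta) * (t / eta) ** (beta - 1.0)

      def reliability(t):
          H, _ = quad(hazard, 0.0, t)           # cumulative hazard
          return np.exp(-H)

      # probability of no crack of critical length by 1000 and 3000 cycles
      print(reliability(1000.0), reliability(3000.0))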

  3. Creating High Reliability in Health Care Organizations

    PubMed Central

    Pronovost, Peter J; Berenholtz, Sean M; Goeschel, Christine A; Needham, Dale M; Sexton, J Bryan; Thompson, David A; Lubomski, Lisa H; Marsteller, Jill A; Makary, Martin A; Hunt, Elizabeth

    2006-01-01

    Objective The objective of this paper was to present a comprehensive approach to help health care organizations reliably deliver effective interventions. Context Reliability in healthcare translates into using valid rate-based measures. Yet high reliability organizations have proven that the context in which care is delivered, called organizational culture, also has important influences on patient safety. Model for Improvement Our model to improve reliability, which also includes interventions to improve culture, focuses on valid rate-based measures. This model includes (1) identifying evidence-based interventions that improve the outcome, (2) selecting interventions with the most impact on outcomes and converting to behaviors, (3) developing measures to evaluate reliability, (4) measuring baseline performance, and (5) ensuring patients receive the evidence-based interventions. The comprehensive unit-based safety program (CUSP) is used to improve culture and guide organizations in learning from mistakes that are important, but cannot be measured as rates. Conclusions We present how this model was used in over 100 intensive care units in Michigan to improve culture and eliminate catheter-related blood stream infections—both were accomplished. Our model differs from existing models in that it incorporates efforts to improve a vital component for system redesign—culture, it targets 3 important groups—senior leaders, team leaders, and front line staff, and facilitates change management—engage, educate, execute, and evaluate for planned interventions. PMID:16898981

  4. Feature extraction algorithm for space targets based on fractal theory

    NASA Astrophysics Data System (ADS)

    Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin

    2007-11-01

    In order to extend the life of satellites and reduce launch and operating costs, satellite servicing, including on-orbit repair, upgrading, and refueling of spacecraft, will become much more frequent. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking-reliability requirements of image tracking for space surveillance. Machine vision has been applied to research on the relative pose of spacecraft, for which feature extraction is the basis. In this paper, a fractal-geometry-based edge extraction algorithm is presented that can be used in a machine vision system for determining and tracking the relative pose of an observed satellite during proximity operations. The method computes a fractal-dimension map of the gray-level image using the Differential Box-Counting (DBC) approach of fractal theory to restrain noise. After this, we detect consecutive edges using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target but also keeps the inner details. Meanwhile, edge extraction is performed only in the moving area, greatly reducing computation. Simulation results compare edge detection using the presented method with other detection methods. The results indicate that the presented algorithm is a valid method for solving the relative-pose problem for spacecraft.
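
    A compact sketch of the Differential Box-Counting estimate for a square gray-level image, following the standard DBC formulation (grid size s, box height scaled to the gray-level range, fractal dimension as the slope of log N versus log(1/s)); details may differ from the paper's implementation.

      import numpy as np

      def dbc_fractal_dimension(img, sizes=(2, 4, 8, 16)):
          """Differential Box-Counting fractal dimension of a square gray image."""
          M = img.shape[0]
          G = 256.0                                 # gray-level range
          log_r, log_N = [], []
          for s in sizes:
              h = s * G / M                         # box height for grid size s
              n_blocks = M // s
              N = 0
              for i in range(n_blocks):
                  for j in range(n_blocks):
                      block = img[i*s:(i+1)*s, j*s:(j+1)*s]
                      # boxes needed to span min..max gray levels in this block
                      N += int(np.ceil((block.max() + 1) / h) - np.floor(block.min() / h))
              log_r.append(np.log(1.0 / s))
              log_N.append(np.log(N))
          slope, _ = np.polyfit(log_r, log_N, 1)    # FD is the fitted slope
          return slope

      img = np.random.randint(0, 256, (64, 64))     # toy patch
      print(dbc_fractal_dimension(img))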

  5. Auditory color constancy: calibration to reliable spectral properties across nonspeech context and targets.

    PubMed

    Stilp, Christian E; Alexander, Joshua M; Kiefte, Michael; Kluender, Keith R

    2010-02-01

    Brief experience with reliable spectral characteristics of a listening context can markedly alter perception of subsequent speech sounds, and parallels have been drawn between auditory compensation for listening context and visual color constancy. In order to better evaluate such an analogy, the generality of acoustic context effects for sounds with spectral-temporal compositions distinct from speech was investigated. Listeners identified nonspeech sounds (extensively edited samples produced by a French horn and a tenor saxophone) following either resynthesized speech or a short passage of music. Preceding contexts were "colored" by spectral envelope difference filters, which were created to emphasize differences between French horn and saxophone spectra. Listeners were more likely to report hearing a saxophone when the stimulus followed a context filtered to emphasize spectral characteristics of the French horn, and vice versa. Despite clear changes in apparent acoustic source, the auditory system calibrated to relatively predictable spectral characteristics of filtered context, differentially affecting perception of subsequent target nonspeech sounds. This calibration to listening context and relative indifference to acoustic sources operates much like visual color constancy, for which reliable properties of the spectrum of illumination are factored out of perception of color.

  6. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis, linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
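
    As a worked illustration of the linear-combination result, with hypothetical numbers: suppose two mutually exclusive failure modes initiate failure with conditional probabilities p_1 = 0.7 and p_2 = 0.3, and carry expected losses of 10,000 and 50,000 respectively. Then

      E[L | failure] = p_1 E[L_1] + p_2 E[L_2] = 0.7 × 10,000 + 0.3 × 50,000 = 22,000,

    so a design change that shifts probability away from the costly mode can reduce the expected losses even when the overall failure probability, and hence the conventional reliability, is unchanged.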

  7. Multimodality Imaging with Silica-Based Targeted Nanoparticle Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason S. Lewis

    2012-04-09

    Objectives: To synthesize and characterize a C-Dot silica-based nanoparticle containing 'clickable' groups for the subsequent attachment of targeting moieties (e.g., peptides) and multiple contrast agents (e.g., radionuclides with high specific activity) [1,2]. These new constructs will be tested in suitable tumor models in vitro and in vivo to ensure maintenance of target-specificity and high specific activity. Methods: Cy5 dye molecules are cross-linked to a silica precursor which is reacted to form a dye-rich core particle. This core is then encapsulated in a layer of pure silica to create the core-shell C-Dot (Figure 1) [2]. A 'click' chemistry approach has been used to functionalize the silica shell with radionuclides conferring high contrast and specific activity (e.g. 64Cu and 89Zr) and peptides for tumor targeting (e.g. cRGD and octreotate) [3]. Based on the selective Diels-Alder reaction between tetrazine and norbornene, the reaction is bioorthogonal, high-yielding, rapid, and water-compatible. This radiolabeling approach has already been employed successfully with both short peptides (e.g. octreotate) and antibodies (e.g. trastuzumab) as model systems for the ultimate labeling of the nanoparticles [1]. Results: PEGylated C-Dots with a Cy5 core and labeled with tetrazine have been synthesized (d = 55 nm, zeta potential = -3 mV) reliably and reproducibly and have been shown to be stable under physiological conditions for up to 1 month. Characterization of the nanoparticles revealed that the immobilized Cy5 dye within the C-Dots exhibited fluorescence intensities over twice that of the fluorophore alone. The nanoparticles were successfully radiolabeled with Cu-64. Efforts toward the conjugation of targeting peptides (e.g. cRGD) are underway. In vitro stability, specificity, and uptake studies as well as in vivo imaging and biodistribution investigations will be presented. Conclusions: C-Dot silica-based nanoparticles offer a robust, versatile, and

  8. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.

  9. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    NASA Astrophysics Data System (ADS)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents the reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem with parametric uncertainties in entry dynamics for a Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, the modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method contributes to the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and to the efficient approximation of the trajectory solution. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve the computational efficiency. The cycle including SO, reliability assessment and constraint update is repeated in the RBSO until the reliability requirements of constraint satisfaction are satisfied. Finally, the RBSO is compared with the traditional DO and the traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and the efficiency of the proposed method.

  10. Feature-based RNN target recognition

    NASA Astrophysics Data System (ADS)

    Bakircioglu, Hakan; Gelenbe, Erol

    1998-09-01

    Detection and recognition of target signatures in sensory data obtained by synthetic aperture radar (SAR), forward-looking infrared, or laser radar have received considerable attention in the literature. In this paper, we propose a feature-based target classification methodology to detect and classify targets in cluttered SAR images. It makes use of selective signature data from the sensory data, together with a set of trained networks based on the Random Neural Network (RNN) model (Gelenbe 89, 90, 91, 93), which is trained to act as a matched filter. We propose and investigate radial features of target shapes that are invariant to rotation, translation, and scale, to characterize target and clutter signatures. These features are then used to train a set of learning RNNs which can detect targets within clutter with high accuracy, and classify the targets or man-made objects from natural clutter. Experimental data from SAR imagery is used to illustrate and validate the proposed method, and to calculate Receiver Operating Characteristics which illustrate the performance of the proposed algorithm.
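
    A minimal sketch of radial shape features with the stated invariances, assuming a binary target mask: translation is removed by centering at the centroid, scale by normalizing radial distances, and rotation by taking Fourier magnitudes of a circular radius signature. The construction is illustrative and need not match the paper's exact features.

      import numpy as np

      def radial_features(mask, n_bins=32, n_coeffs=8):
          """Rotation-, translation-, and scale-invariant radial features."""
          ys, xs = np.nonzero(mask)
          cx, cy = xs.mean(), ys.mean()             # centroid: translation inv.
          r = np.hypot(xs - cx, ys - cy)
          theta = np.arctan2(ys - cy, xs - cx)
          r = r / (r.mean() + 1e-9)                 # normalize: scale invariance
          # mean radius per angular bin forms a circular signature
          bins = ((theta + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
          sig = np.bincount(bins, weights=r, minlength=n_bins) / \
                np.maximum(np.bincount(bins, minlength=n_bins), 1)
          # Fourier magnitudes of the signature: rotation invariance
          return np.abs(np.fft.rfft(sig))[:n_coeffs]

      mask = np.zeros((64, 64), bool); mask[20:40, 25:45] = True   # toy target
      print(radial_features(mask))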

  11. Off-target model based OPC

    NASA Astrophysics Data System (ADS)

    Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III

    2005-11-01

    Model-based Optical Proximity Correction has become an indispensable tool for achieving wafer pattern to design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction into a more process-robust configuration. The study will first examine and validate the process of generating an off-target model, then examine the quality of the off-target model. Once the off-target model is proven, it will be used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.

  12. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    Traditional reliability evaluation of machine center components overlooks failure propagation, so the component reliability model exhibits deviation and the evaluation result is low. To rectify these problems, a new reliability evaluation method based on cascading failure analysis and assessment of the failure influenced degree is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influenced degrees of the system components are assessed by the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, and it shows the following: 1) The reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) The difference between the comprehensive and inherent reliability of a system component presents a positive correlation with its failure influenced degree, which provides a theoretical basis for reliability allocation of the machine center system.
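
    A minimal sketch of scoring failure influenced degree with PageRank over a cascading-failure digraph, using power iteration on the row-normalized adjacency matrix; the 4-component graph and the damping factor are hypothetical.

      import numpy as np

      # adjacency of a hypothetical cascading-failure digraph:
      # A[i, j] = 1 if failure of component i propagates to component j
      A = np.array([[0, 1, 1, 0],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1],
                    [1, 0, 0, 0]], float)

      def pagerank(A, d=0.85, iters=100):
          n = A.shape[0]
          out = A.sum(axis=1, keepdims=True)
          # row-normalize; components with no outgoing edges get uniform rows
          P = np.divide(A, out, out=np.ones_like(A) / n, where=out > 0)
          r = np.ones(n) / n
          for _ in range(iters):
              r = (1 - d) / n + d * (P.T @ r)       # power iteration
          return r

      print(pagerank(A))    # higher score = larger failure influenced degree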

  13. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  14. Highly sensitive detection of target molecules using a new fluorescence-based bead assay

    NASA Astrophysics Data System (ADS)

    Scheffler, Silvia; Strauß, Denis; Sauer, Markus

    2007-07-01

    Development of immunoassays with improved sensitivity, specificity and reliability is of major interest in modern bioanalytical research. We describe the development of a new immunomagnetic fluorescence detection (IM-FD) assay based on specific antigen/antibody interactions and on accumulation of the fluorescence signal on superparamagnetic PE beads in combination with the use of extrinsic fluorescent labels. IM-FD can be easily modified by varying the order of coatings and assay conditions. Depending on the target molecule, antibodies (ABs), entire proteins, or small protein epitopes can be used as capture molecules. The presence of target molecules is detected by fluorescence microscopy using fluorescently labeled secondary or detection antibodies. Here, we demonstrate the potential of the new assay by detecting the two tumor markers IGF-I and p53 antibodies in the clinically relevant concentration range. Our data show that the fluorescence-based bead assay exhibits a large dynamic range and a high sensitivity down to the subpicomolar level.

  15. NEPP DDR Device Reliability FY13 Report

    NASA Technical Reports Server (NTRS)

    Guertin, Steven M.; Armbar, Mehran

    2014-01-01

    This document reports the status of the NEPP Double Data Rate (DDR) Device Reliability effort for FY2013. The task targeted the general reliability of more than 100 DDR2 devices from Hynix, Samsung, and Micron. Detailed characterization of some devices when stressed by several data storage patterns was studied, targeting the ability of the data cells to store the different data patterns without refresh and highlighting the weakest bits. Keywords: DDR2, reliability, data retention, temperature stress, test system evaluation, general reliability, IDD measurements, electronic parts, parts testing, microcircuits.

  16. Multi-mode reliability-based design of horizontal curves.

    PubMed

    Essa, Mohamed; Sayed, Tarek; Hussein, Mohamed

    2016-08-01

    Recently, reliability analysis has been advocated as an effective approach to account for uncertainty in the geometric design process and to evaluate the risk associated with a particular design. In this approach, a risk measure (e.g. probability of noncompliance) is calculated to represent the probability that a specific design would not meet standard requirements. The majority of previous applications of reliability analysis in geometric design focused on evaluating the probability of noncompliance for only one mode of noncompliance such as insufficient sight distance. However, in many design situations, more than one mode of noncompliance may be present (e.g. insufficient sight distance and vehicle skidding at horizontal curves). In these situations, utilizing a multi-mode reliability approach that considers more than one failure (noncompliance) mode is required. The main objective of this paper is to demonstrate the application of multi-mode (system) reliability analysis to the design of horizontal curves. The process is demonstrated by a case study of Sea-to-Sky Highway located between Vancouver and Whistler, in southern British Columbia, Canada. Two noncompliance modes were considered: insufficient sight distance and vehicle skidding. The results show the importance of accounting for several noncompliance modes in the reliability model. The system reliability concept could be used in future studies to calibrate the design of various design elements in order to achieve consistent safety levels based on all possible modes of noncompliance. Copyright © 2016 Elsevier Ltd. All rights reserved.
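
    A minimal Monte Carlo sketch of the system (multi-mode) probability of noncompliance for a horizontal curve, treating noncompliance as the union of the two modes; the stopping-sight-distance and side-friction expressions are textbook forms, and every distribution and parameter below is a hypothetical stand-in rather than the case-study values.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000
      # hypothetical random inputs
      speed   = rng.normal(90, 8, n)        # operating speed, km/h
      f_side  = rng.normal(0.30, 0.04, n)   # available side friction
      s_avail = rng.normal(160, 15, n)      # available sight distance, m

      s_req = 0.278 * speed * 2.5 + speed**2 / (254 * 0.35)  # required SSD, m
      f_dem = speed**2 / (127 * 300) - 0.06   # friction demand, 300 m curve, e=0.06

      mode_sight = s_avail < s_req            # mode 1: insufficient sight distance
      mode_skid  = f_side < f_dem             # mode 2: vehicle skidding
      p_system   = (mode_sight | mode_skid).mean()   # union of the two modes

      print("P(noncompliance, mode 1) =", mode_sight.mean())
      print("P(noncompliance, mode 2) =", mode_skid.mean())
      print("P(noncompliance, system) =", p_system)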

  17. Target attribute-based false alarm rejection in small infrared target detection

    NASA Astrophysics Data System (ADS)

    Kim, Sungho

    2012-11-01

    Infrared search and track is an important research area in military applications. Although there has been much work on small infrared target detection methods, they are difficult to apply in the field due to the high false alarm rate caused by clutter. This paper presents a novel target attribute extraction and machine learning-based target discrimination method. Eight kinds of target features are extracted and analyzed statistically. Learning-based classifiers such as SVM and AdaBoost are developed and compared with conventional classifiers on real infrared images. In addition, the generalization capability is inspected for various infrared clutters.

  18. Evaluation of the reliability of maize reference assays for GMO quantification.

    PubMed

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly along the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding history. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with a poor PCR performance observed for this assay along the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming a part of official quantification methods, zein and SSIIb

  19. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    PubMed

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs, even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTI programs vary with respect to diagnostic suggestions? It was found that diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  20. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.

  1. A reliability and mass perspective of SP-100 Stirling cycle lunar-base powerplant designs

    NASA Technical Reports Server (NTRS)

    Bloomfield, Harvey S.

    1991-01-01

    The purpose was to obtain reliability and mass perspectives on the selection of space power system conceptual designs based on SP-100 reactor and Stirling cycle power-generation subsystems. The approach taken was to: (1) develop a criterion for an acceptable overall reliability risk as a function of the expected range of emerging technology subsystem unit reliabilities; (2) conduct reliability and mass analyses for a diverse matrix of 800-kWe lunar-base design configurations employing single and multiple powerplants with both full and partial subsystem redundancy combinations; and (3) derive reliability and mass perspectives on the selection of conceptual design configurations that meet an acceptable reliability criterion with the minimum system mass increase relative to the reference powerplant design. The developed perspectives provided valuable insight into the considerations required to identify and characterize high-reliability and low-mass lunar-base powerplant conceptual designs.
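
    A small sketch of the reliability arithmetic behind such redundancy comparisons: the probability that at least k of n independent, identical units work, from the binomial formula. The unit reliability and counts below are illustrative, not the study's values.

      from math import comb

      def k_of_n_reliability(k, n, p):
          """P(at least k of n independent units with reliability p work)."""
          return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

      p_unit = 0.95                              # illustrative unit reliability
      print(k_of_n_reliability(4, 4, p_unit))    # no redundancy: all 4 must work
      print(k_of_n_reliability(4, 5, p_unit))    # one spare unit
      print(k_of_n_reliability(4, 6, p_unit))    # two spares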

  2. Reliable bonding using indium-based solders

    NASA Astrophysics Data System (ADS)

    Cheong, Jongpil; Goyal, Abhijat; Tadigadapa, Srinivas; Rahn, Christopher

    2004-01-01

    Low temperature bonding techniques with high bond strengths and reliability are required for the fabrication and packaging of MEMS devices. Indium and indium-tin based bonding processes are explored for the fabrication of a flextensional MEMS actuator, which requires the integration of lead zirconate titanate (PZT) substrate with a silicon micromachined structure at low temperatures. The developed technique can be used either for wafer or chip level bonding. The lithographic steps used for the patterning and delineation of the seed layer limit the resolution of this technique. Using this technique, reliable bonds were achieved at a temperature of 200°C. The bonds yielded an average tensile strength of 5.41 MPa and 7.38 MPa for samples using indium and indium-tin alloy solders as the intermediate bonding layers respectively. The bonds (with line width of 100 microns) showed hermetic sealing capability of better than 10^-11 mbar-l/s when tested using a commercial helium leak tester.

  3. Reliable bonding using indium-based solders

    NASA Astrophysics Data System (ADS)

    Cheong, Jongpil; Goyal, Abhijat; Tadigadapa, Srinivas; Rahn, Christopher

    2003-12-01

    Low temperature bonding techniques with high bond strengths and reliability are required for the fabrication and packaging of MEMS devices. Indium and indium-tin based bonding processes are explored for the fabrication of a flextensional MEMS actuator, which requires the integration of lead zirconate titanate (PZT) substrate with a silicon micromachined structure at low temperatures. The developed technique can be used either for wafer or chip level bonding. The lithographic steps used for the patterning and delineation of the seed layer limit the resolution of this technique. Using this technique, reliable bonds were achieved at a temperature of 200°C. The bonds yielded an average tensile strength of 5.41 MPa and 7.38 MPa for samples using indium and indium-tin alloy solders as the intermediate bonding layers respectively. The bonds (with line width of 100 microns) showed hermetic sealing capability of better than 10^-11 mbar-l/s when tested using a commercial helium leak tester.

  4. MOST: most-similar ligand based approach to target prediction.

    PubMed

    Huang, Tao; Mi, Hong; Lin, Cheng-Yuan; Zhao, Ling; Zhong, Linda L D; Liu, Feng-Bin; Zhang, Ge; Lu, Ai-Ping; Bian, Zhao-Xiang

    2017-03-11

    Many computational approaches have been used for target prediction, including machine learning, reverse docking, bioactivity spectra analysis, and chemical similarity searching. Recent studies have suggested that chemical similarity searching may be driven by the most-similar ligand. However, the extent of bioactivity of most-similar ligands has been oversimplified or even neglected in these studies, and this has impaired the prediction power. Here we propose the MOst-Similar ligand-based Target inference approach, namely MOST, which uses fingerprint similarity and explicit bioactivity of the most-similar ligands to predict targets of the query compound. Performance of MOST was evaluated by using combinations of different fingerprint schemes, machine learning methods, and bioactivity representations. In sevenfold cross-validation with a benchmark Ki dataset from CHEMBL release 19 containing 61,937 bioactivity data of 173 human targets, MOST achieved high average prediction accuracy (0.95 for pKi ≥ 5, and 0.87 for pKi ≥ 6). The Morgan fingerprint was shown to be slightly better than FP2. Logistic Regression and Random Forest methods performed better than Naïve Bayes. In a temporal validation, the Ki dataset from CHEMBL19 was used to train models and predict the bioactivity of newly deposited ligands in CHEMBL20. MOST also performed well with high accuracy (0.90 for pKi ≥ 5, and 0.76 for pKi ≥ 6) when Logistic Regression and the Morgan fingerprint were employed. Furthermore, the p values associated with explicit bioactivity were found to be a robust index for removing false positive predictions; implicit bioactivity did not offer this capability. Finally, p values generated with Logistic Regression, the Morgan fingerprint and explicit activity were integrated with a false discovery rate (FDR) control procedure to reduce false positives in the multiple-target prediction scenario, and the success of this strategy was demonstrated with a case of fluanisone
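
    A minimal sketch of the most-similar-ligand step, assuming the RDKit library for Morgan fingerprints and Tanimoto similarity; the SMILES strings are toy examples, and the coupling to explicit bioactivity and p values that MOST performs afterwards is only indicated in a comment.

      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem

      query = Chem.MolFromSmiles("CCOc1ccccc1")            # toy query compound
      library = {"lig1": "CCOc1ccccc1C", "lig2": "c1ccncc1", "lig3": "CCO"}

      def fp(mol):
          return AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=2048)

      q = fp(query)
      sims = {name: DataStructs.TanimotoSimilarity(q, fp(Chem.MolFromSmiles(s)))
              for name, s in library.items()}
      most_similar = max(sims, key=sims.get)
      # MOST would then transfer the known target(s) of the most-similar ligand,
      # weighted by its explicit bioactivity (e.g., pKi), to the query compound.
      print(most_similar, sims[most_similar])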

  5. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
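
    A minimal sketch of an MPP search using the classic HL-RF iteration in standard normal space, with the FORM estimate Pf ≈ Φ(-β); the limit-state function is a toy example, and this is the generic algorithm rather than the paper's specific optimization search.

      import numpy as np
      from scipy.stats import norm

      def g(u):                        # toy limit state in standard normal space
          return 3.0 - u[0] - 0.5 * u[1] ** 2

      def grad_g(u):
          return np.array([-1.0, -u[1]])

      u = np.zeros(2)                  # HL-RF iteration toward the MPP
      for _ in range(50):
          gr = grad_g(u)
          u = ((gr @ u - g(u)) / (gr @ gr)) * gr
      beta = np.linalg.norm(u)         # reliability index = distance to origin
      print("MPP:", u, "beta:", beta, "Pf (FORM):", norm.cdf(-beta))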

  6. Assessing segment- and corridor-based travel-time reliability on urban freeways : final report.

    DOT National Transportation Integrated Search

    2016-09-01

    Travel time and its reliability are intuitive performance measures for freeway traffic operations. The objective of this project was to quantify segment-based and corridor-based travel time reliability measures on urban freeways. To achieve this obje...

  7. Reliability and Validity of the Evidence-Based Practice Confidence (EPIC) Scale

    ERIC Educational Resources Information Center

    Salbach, Nancy M.; Jaglal, Susan B.; Williams, Jack I.

    2013-01-01

    Introduction: The reliability, minimal detectable change (MDC), and construct validity of the evidence-based practice confidence (EPIC) scale were evaluated among physical therapists (PTs) in clinical practice. Methods: A longitudinal mail survey was conducted. Internal consistency and test-retest reliability were estimated using Cronbach's alpha…

  8. Characterizing the reliability of a bioMEMS-based cantilever sensor

    NASA Astrophysics Data System (ADS)

    Bhalerao, Kaustubh D.

    2004-12-01

    The cantilever-based BioMEMS sensor represents one instance from many competing ideas of biosensor technology based on Micro Electro Mechanical Systems. The advancement of BioMEMS from laboratory-scale experiments to applications in the field will require standardization of their components and manufacturing procedures as well as frameworks to evaluate their performance. Reliability, the likelihood with which a system performs its intended task, is a compact mathematical description of its performance. The mathematical and statistical foundation of systems-reliability has been applied to the cantilever-based BioMEMS sensor. The sensor is designed to detect one aspect of human ovarian cancer, namely the over-expression of the folate receptor surface protein (FR-alpha). Even as the application chosen is clinically motivated, the objective of this study was to demonstrate the underlying systems-based methodology used to design, develop and evaluate the sensor. The framework development can be readily extended to other BioMEMS-based devices for disease detection and will have an impact in the rapidly growing $30 bn industry. The Unified Modeling Language (UML) is a systems-based framework for design and development of object-oriented information systems which has potential application for use in systems designed to interact with biological environments. The UML has been used to abstract and describe the application of the biosensor, to identify key components of the biosensor, and the technology needed to link them together in a coherent manner. The use of the framework is also demonstrated in computation of system reliability from first principles as a function of the structure and materials of the biosensor. The outcomes of applying the systems-based framework to the study are the following: (1) Characterizing the cantilever-based MEMS device for disease (cell) detection. (2) Development of a novel chemical interface between the analyte and the sensor that provides a

  9. Complementary Approaches to Existing Target Based Drug Discovery for Identifying Novel Drug Targets.

    PubMed

    Vasaikar, Suhas; Bhatia, Pooja; Bhatia, Partap G; Chu Yaiw, Koon

    2016-11-21

    In the past decade, the relationship between the number of emerging New Molecular Entities and the quantum of R&D investment has not been favorable. There may be numerous reasons, but a few studies stress the introduction of the target-based drug discovery approach as one of the factors. Although a number of drugs have been developed with an emphasis on a single protein target, identification of a valid target is complex. The approach focuses on a single in vitro target, which overlooks the complexity of the cell and makes the validation of drug targets uncertain. Thus, it is imperative to search for alternatives rather than looking only at the success stories of target-based drug discovery. It would be beneficial if drugs were developed to target multiple components. New approaches like reverse engineering and translational research need to take into account both system- and target-based approaches. This review evaluates the strengths and limitations of known drug discovery approaches and proposes alternative approaches for increasing the efficiency of treatment development.

  10. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
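
    A minimal sketch of the symmetric rank-one (SR1) update named above: given a step s and the corresponding gradient change y, the Hessian approximation B is corrected so that it maps s to y, avoiding explicit Hessian computation. The quadratic test function is illustrative.

      import numpy as np

      def sr1_update(B, s, y, eps=1e-8):
          """Symmetric rank-one update of the Hessian approximation B."""
          r = y - B @ s
          denom = r @ s
          if abs(denom) > eps * np.linalg.norm(r) * np.linalg.norm(s):
              B = B + np.outer(r, r) / denom      # rank-one correction
          return B                                # skip update if denom ~ 0

      # toy quadratic with known Hessian H, to check the update direction
      H = np.array([[4.0, 1.0], [1.0, 3.0]])
      grad = lambda x: H @ x
      B = np.eye(2)
      x0, x1 = np.array([1.0, 0.0]), np.array([0.3, 0.4])
      B = sr1_update(B, x1 - x0, grad(x1) - grad(x0))
      print(B)    # moves toward the true Hessian along the step direction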

  11. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.

  12. Infrared small target tracking based on SOPC

    NASA Astrophysics Data System (ADS)

    Hu, Taotao; Fan, Xiang; Zhang, Yu-Jin; Cheng, Zheng-dong; Zhu, Bin

    2011-01-01

    The paper presents a low-cost FPGA-based solution for a real-time infrared small target tracking system. A specialized architecture is presented, based on a soft RISC processor capable of running a kernel-based mean shift tracking algorithm. The mean shift tracking algorithm is realized in a NIOS II soft core using SOPC (System on a Programmable Chip) technology. Though the mean shift algorithm is widely used for target tracking, the original algorithm cannot be applied directly to infrared small target tracking, because how the target is described determines whether it can be tracked. As an infrared small target has only intensity information, an improved mean shift algorithm is presented in this paper: imitating the representation of a color image, spatial and temporal components are introduced to describe the target, forming a pseudo-color image. In order to improve processing speed, parallel and pipeline techniques are adopted. Two RAMs store images separately using ping-pong technology, and a FLASH stores bulk temporary data. The experimental results show that infrared small targets are tracked stably against complicated backgrounds.
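
    A minimal sketch of a mean-shift step specialized to intensity-only targets: the window is repeatedly moved to its intensity-weighted centroid until it converges. The window size, tolerance, and toy frame are illustrative; the paper's pseudo-color construction and FPGA mapping are not reproduced here.

      import numpy as np

      def mean_shift(frame, center, half=8, iters=20, tol=0.5):
          """Shift a window to its intensity-weighted centroid until converged.
          No bounds checking, for brevity."""
          cx, cy = center
          for _ in range(iters):
              x0, x1 = int(cx) - half, int(cx) + half + 1
              y0, y1 = int(cy) - half, int(cy) + half + 1
              win = frame[y0:y1, x0:x1].astype(float)
              w = win.sum()
              if w <= 0:
                  break                      # nothing to track in the window
              ys, xs = np.mgrid[y0:y1, x0:x1]
              nx, ny = (xs * win).sum() / w, (ys * win).sum() / w
              if np.hypot(nx - cx, ny - cy) < tol:
                  break
              cx, cy = nx, ny
          return cx, cy

      frame = np.zeros((64, 64)); frame[30:33, 40:43] = 255.0   # toy dim target
      print(mean_shift(frame, (36, 28)))   # converges near the target at (41, 31)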

  13. HPPD: ligand- and target-based virtual screening on a herbicide target.

    PubMed

    López-Ramos, Miriam; Perruccio, Francesca

    2010-05-24

    Hydroxyphenylpyruvate dioxygenase (HPPD) has proven to be a very successful target for the development of herbicides with bleaching properties, and today HPPD inhibitors are well established in the agrochemical market. Syngenta has a long history of HPPD-inhibitor research, and HPPD was chosen as a case study for the validation of diverse ligand- and target-based virtual screening approaches to identify compounds with inhibitory properties. Two-dimensional extended connectivity fingerprints, three-dimensional shape-based tools (ROCS, EON, and Phase-shape) and a pharmacophore approach (Phase) were used as ligand-based methods; Glide and Gold were used as target-based. Both the virtual screening utility and the scaffold-hopping ability of the screening tools were assessed. Particular emphasis was put on the specific pitfalls to take into account for the design of a virtual screening campaign in an agrochemical context, as compared to a pharmaceutical environment.

  14. Target recognition based on the moment functions of radar signatures

    NASA Astrophysics Data System (ADS)

    Kim, Kyung-Tae; Kim, Hyo-Tae

    2002-03-01

    In this paper, we present the results of target recognition research based on the moment functions of various radar signatures, such as time-frequency signatures, range profiles, and scattering centers. The proposed approach utilizes geometrical moments or central moments of the obtained radar signatures. In particular, we derived exact and closed form expressions of the geometrical moments of the adaptive Gaussian representation (AGR), which is one of the adaptive joint time-frequency techniques, and also computed the central moments of range profiles and one-dimensional (1-D) scattering centers on a target, which are obtained by various super-resolution techniques. The obtained moment functions are further processed to provide small dimensional and redundancy-free feature vectors, and classified via a neural network approach or a Bayes classifier. The performances of the proposed technique are demonstrated using a simulated radar cross section (RCS) data set, or a measured RCS data set of various scaled aircraft models, obtained at the Pohang University of Science and Technology (POSTECH) compact range facility. Results show that the techniques in this paper can not only provide reliable classification accuracy, but also save computational resources.
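
    A small sketch of moment features from a 1-D radar signature: normalized central moments of a range profile treated as a distribution, giving a compact, redundancy-free feature vector. The synthetic profile and the chosen orders are illustrative.

      import numpy as np

      def central_moments(profile, orders=(2, 3, 4)):
          """Normalized central moments of a 1-D range profile as features."""
          p = profile / profile.sum()           # treat profile as a distribution
          x = np.arange(len(p))
          mean = (x * p).sum()
          sigma = np.sqrt((((x - mean) ** 2) * p).sum())
          # dividing by sigma^k makes the features scale-comparable
          return np.array([(((x - mean) / sigma) ** k * p).sum() for k in orders])

      profile = np.exp(-0.5 * ((np.arange(128) - 50) / 6.0) ** 2) + \
                0.4 * np.exp(-0.5 * ((np.arange(128) - 80) / 3.0) ** 2)
      print(central_moments(profile))   # ~[1, skewness, kurtosis]-style features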

  15. Advancing methods for reliably assessing motivational interviewing fidelity using the Motivational Interviewing Skills Code

    PubMed Central

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.

    2014-01-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192

  16. A method for detecting small targets based on cumulative weighted value of target properties

    NASA Astrophysics Data System (ADS)

    Jin, Xing; Sun, Gang; Wang, Wei-hua; Liu, Fang; Chen, Zeng-ping

    2015-03-01

    Laser detection based on the "cat's eye effect" has become a hot research topic because of its active nature, compared to the passivity of acoustic and infrared detection. Target detection is one of the core technologies in such a system. This paper puts forward a method for detecting small targets based on the cumulative weighted value of target properties. First, we compute a frame difference on the images and then apply morphological processing. Second, we segment the images and screen the candidate targets to find locations of interest. Finally, by comparing across a number of frames, we locate the target. In an experiment on 394 real frames, the results show that the method can detect small targets efficiently.

  17. Reliabilities of mental rotation tasks: limits to the assessment of individual differences.

    PubMed

    Hirschfeld, Gerrit; Thielsch, Meinald T; Zernikow, Boris

    2013-01-01

    Mental rotation tasks with objects and body parts as targets are widely used in cognitive neuropsychology. Even though these tasks are well established to study between-groups differences, the reliability on an individual level is largely unknown. We present a systematic study on the internal consistency and test-retest reliability of individual differences in mental rotation tasks comparing different target types and orders of presentations. In total n = 99 participants (n = 63 for the retest) completed the mental rotation tasks with hands, feet, faces, and cars as targets. Different target types were presented in either randomly mixed blocks or blocks of homogeneous targets. Across all target types, the consistency (split-half reliability) and stability (test-retest reliabilities) were good or acceptable both for intercepts and slopes. At the level of individual targets, only intercepts showed acceptable reliabilities. Blocked presentations resulted in significantly faster and numerically more consistent and stable responses. Mental rotation tasks, especially in blocked variants, can be used to reliably assess individual differences in global processing speed. However, the assessment of the theoretically important slope parameter for individual targets requires further adaptations to mental rotation tests.
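
    A minimal sketch of the two reliability estimates used, on simulated response-time data: split-half consistency over odd/even trials with the Spearman-Brown correction, and test-retest stability as the correlation between session scores. All distributions are illustrative.

      import numpy as np

      rng = np.random.default_rng(7)
      true_speed = rng.normal(800, 120, size=60)           # latent per-person RT
      trials = true_speed[:, None] + rng.normal(0, 150, (60, 40))  # 40 trials each

      odd, even = trials[:, 1::2].mean(1), trials[:, 0::2].mean(1)
      r_half = np.corrcoef(odd, even)[0, 1]
      split_half = 2 * r_half / (1 + r_half)               # Spearman-Brown

      retest = true_speed + rng.normal(0, 60, 60)          # simulated session 2
      test_retest = np.corrcoef(trials.mean(1), retest)[0, 1]
      print(f"split-half: {split_half:.2f}  test-retest: {test_retest:.2f}")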

  18. How reliable are ligand-centric methods for Target Fishing?

    NASA Astrophysics Data System (ADS)

    Peon, Antonio; Dang, Cuong; Ballester, Pedro

    2016-04-01

    Computational methods for Target Fishing (TF), also known as Target Prediction or Polypharmacology Prediction, can be used to discover new targets for small-molecule drugs. This may result in repositioning the drug in a new indication or improving our current understanding of its efficacy and side effects. While there is a substantial body of research on TF methods, there is still a need to improve their validation, which is often limited to a small part of the available targets and not easily interpretable by the user. Here we discuss how target-centric TF methods are inherently limited by the number of targets that they can possibly predict (this number is by construction much larger in ligand-centric techniques). We also propose a new benchmark to validate TF methods, which is particularly suited to analyse how predictive performance varies with the query molecule. On average over approved drugs, we estimate that only five predicted targets will have to be tested to find two true targets with submicromolar potency (a strong variability in performance is, however, observed). In addition, we find that an approved drug currently has an average of eight known targets, which reinforces the notion that polypharmacology is a common and strong event. Furthermore, with the assistance of a control group of randomly-selected molecules, we show that the targets of approved drugs are generally harder to predict.

  19. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    49 Transportation 4 (2010-10-01). General Principles of Reliability-Based ... STANDARDS, Pt. 238, App. E. Appendix E to Part 238: General Principles of Reliability-Based Maintenance ... maintenance programs are based on the following general principles. A failure is an unsatisfactory condition...

  20. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    PubMed

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing the reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates based on session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements of the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates, in a hybrid fashion, elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.

  2. Object-based target templates guide attention during visual search.

    PubMed

    Berggren, Nick; Eimer, Martin

    2018-05-03

    During visual search, attention is believed to be controlled in a strictly feature-based fashion, without any guidance by object-based target representations. To challenge this received view, we measured electrophysiological markers of attentional selection (N2pc component) and working memory (sustained posterior contralateral negativity; SPCN) in search tasks where two possible targets were defined by feature conjunctions (e.g., blue circles and green squares). Critically, some search displays also contained nontargets with two target features (incorrect conjunction objects, e.g., blue squares). Because feature-based guidance cannot distinguish these objects from targets, any selective bias for targets will reflect object-based attentional control. In Experiment 1, where search displays always contained only one object with target-matching features, targets and incorrect conjunction objects elicited identical N2pc and SPCN components, demonstrating that attentional guidance was entirely feature-based. In Experiment 2, where targets and incorrect conjunction objects could appear in the same display, clear evidence for object-based attentional control was found. The target N2pc became larger than the N2pc to incorrect conjunction objects from 250 ms poststimulus, and only targets elicited SPCN components. This demonstrates that after an initial feature-based guidance phase, object-based templates are activated when they are required to distinguish target and nontarget objects. These templates modulate visual processing and control access to working memory, and their activation may coincide with the start of feature integration processes. Results also suggest that while multiple feature templates can be activated concurrently, only a single object-based target template can guide attention at any given time. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Reliability of a computer-based system for measuring visual performance skills.

    PubMed

    Erickson, Graham B; Citek, Karl; Cove, Michelle; Wilczek, Jennifer; Linster, Carolyn; Bjarnason, Brendon; Langemo, Nathan

    2011-09-01

    Athletes have demonstrated better visual abilities than nonathletes. A vision assessment for an athlete should include methods to evaluate the quality of visual performance skills in the most appropriate, accurate, and repeatable manner. This study determines the reliability of the visual performance measures assessed with a computer-based system, known as the Nike Sensory Station. One hundred twenty-five subjects (56 men, 69 women), age 18 to 30, completed Phase I of the study. Subjects attended 2 sessions, separated by at least 1 week, in which identical protocols were followed. Subjects completed the following assessments: Visual Clarity, Contrast Sensitivity, Depth Perception, Near-Far Quickness, Target Capture, Perception Span, Eye-Hand Coordination, Go/No Go, and Reaction Time. An additional 36 subjects (20 men, 16 women), age 22 to 35, completed Phase II of the study involving modifications to the equipment, instructions, and protocols from Phase I. Results show no significant change in performance over time on assessments of Visual Clarity, Contrast Sensitivity, Depth Perception, Target Capture, Perception Span, and Reaction Time. Performance did improve over time for Near-Far Quickness, Eye-Hand Coordination, and Go/No Go. The results of this study show that many of the Nike Sensory Station assessments show repeatability and no learning effect over time. The measures that did improve across sessions show an expected learning effect caused by the motor response characteristics being measured. Copyright © 2011 American Optometric Association. Published by Elsevier Inc. All rights reserved.

  4. Multisite Reliability of MR-Based Functional Connectivity

    PubMed Central

    Noble, Stephanie; Scheinost, Dustin; Finn, Emily S.; Shen, Xilin; Papademetris, Xenophon; McEwen, Sarah C.; Bearden, Carrie E.; Addington, Jean; Goodyear, Bradley; Cadenhead, Kristin S.; Mirzakhanian, Heline; Cornblatt, Barbara A.; Olvet, Doreen M.; Mathalon, Daniel H.; McGlashan, Thomas H.; Perkins, Diana O.; Belger, Aysenil; Seidman, Larry J.; Thermenos, Heidi; Tsuang, Ming T.; van Erp, Theo G.M.; Walker, Elaine F.; Hamann, Stephan; Woods, Scott W.; Cannon, Tyrone D.; Constable, R. Todd

    2016-01-01

    Recent years have witnessed an increasing number of multisite MRI functional connectivity (fcMRI) studies. While multisite studies are an efficient way to speed up data collection and increase sample sizes, especially for rare clinical populations, any effects of site or MRI scanner could ultimately limit power and weaken results. Little data exists on the stability of functional connectivity measurements across sites and sessions. In this study, we assess the influence of site and session on resting state functional connectivity measurements in a healthy cohort of traveling subjects (8 subjects scanned twice at each of 8 sites) scanned as part of the North American Prodrome Longitudinal Study (NAPLS). Reliability was investigated in three types of connectivity analyses: (1) seed-based connectivity with posterior cingulate cortex (PCC), right motor cortex (RMC), and left thalamus (LT) as seeds; (2) the intrinsic connectivity distribution (ICD), a voxel-wise connectivity measure; and (3) matrix connectivity, a whole-brain, atlas-based approach assessing connectivity between nodes. Contributions to variability in connectivity due to subject, site, and day-of-scan were quantified and used to assess between-session (test-retest) reliability in accordance with Generalizability Theory. Overall, no major site, scanner manufacturer, or day-of-scan effects were found for the univariate connectivity analyses; instead, subject effects dominated relative to the other measured factors. However, summaries of voxel-wise connectivity were found to be sensitive to site and scanner manufacturer effects. For all connectivity measures, although subject variance was three times the site variance, the residual represented 60–80% of the variance, indicating that connectivity differed greatly from scan to scan independent of any of the measured factors (i.e., subject, site, and day-of-scan). Thus, for a single 5 min scan, reliability across connectivity measures was poor (ICC=0.07–0
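
    Test-retest reliability in traveling-subjects designs like this one is usually summarized by an intraclass correlation: the share of total variance attributable to stable subject differences rather than to site, day, or residual scan-to-scan noise. A minimal Python sketch of the simplest one-way form, ICC(1,1), on a subjects-by-sessions matrix (a simplification of the full crossed subject x site x day decomposition used under Generalizability Theory):

        import numpy as np

        def icc_oneway(data):
            """ICC(1,1) for a subjects x sessions matrix (one-way random effects)."""
            n, k = data.shape
            grand = data.mean()
            ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)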

  5. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    NASA Astrophysics Data System (ADS)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information for assessing the impacts of climate change at regional and global scales. Statistical downscaling methods are applied to prepare climate model data for applications such as hydrologic and ecologic modelling at the watershed scale. As the reliability and the spatial and temporal resolution of statistically downscaled climate data mainly depend on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for the key climate variables that are the main inputs to regional modelling systems. However, inconsistencies in these climate products, for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e., thin-plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing them with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products with respect to the elevations of stations, discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system

  6. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent, and computationally efficient method for reliability analysis in engineering finite element modeling practice.
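
    Monte Carlo simulation, used here as the reference solution, is simple to state: sample the random inputs, evaluate the limit state g(x) (failure when g < 0), and take the failure fraction. A minimal Python sketch under assumed independent normal inputs (illustrative, not the paper's model):

        import numpy as np

        def mc_failure_probability(g, means, stds, n=10**6, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.normal(means, stds, size=(n, len(means)))
            return float((g(x) < 0).mean())  # fraction of failed samples

        # Example limit state: strength R minus stress S, both normal.
        pf = mc_failure_probability(lambda x: x[:, 0] - x[:, 1],
                                    means=[3.0, 1.0], stds=[0.3, 0.4])

    The weakness motivating moment-based methods is visible here: resolving a small failure probability needs on the order of 100/pf samples, each of which may be an expensive finite element run.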

  7. Reliability Assessment of a Robust Design Under Uncertainty for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J. -W.; Newman, Perry A.

    2003-01-01

    The paper presents reliability assessment results for the robust designs under uncertainty of a 3-D flexible wing previously reported by the authors. Reliability assessments (additional optimization problems) of the active constraints at the various probabilistic robust design points are obtained and compared with the constraint values or target constraint probabilities specified in the robust design. In addition, reliability-based sensitivity derivatives with respect to design variable mean values are also obtained and shown to agree with finite difference values. These derivatives allow one to perform reliability-based design without having to obtain second-order sensitivity derivatives. However, an inner-loop optimization problem must be solved for each active constraint to find the most probable point on that constraint failure surface.

  8. Validity and reliability of Internet-based physiotherapy assessment for musculoskeletal disorders: a systematic review.

    PubMed

    Mani, Suresh; Sharma, Shobha; Omar, Baharudin; Paungmali, Aatit; Joseph, Leonard

    2017-04-01

    Purpose The purpose of this review is to systematically explore and summarise the validity and reliability of telerehabilitation (TR)-based physiotherapy assessment for musculoskeletal disorders. Method A comprehensive systematic literature review was conducted using a number of electronic databases: PubMed, EMBASE, PsycINFO, Cochrane Library and CINAHL, for studies published between January 2000 and May 2015. Studies that examined the validity and the inter- and intra-rater reliability of TR-based physiotherapy assessment for musculoskeletal conditions were included. Two independent reviewers used the Quality Appraisal Tool for studies of diagnostic Reliability (QAREL) and the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool to assess the methodological quality of the reliability and validity studies, respectively. Results A total of 898 hits were retrieved, of which 11 articles meeting the inclusion criteria were reviewed. Nine studies explored concurrent validity together with inter- and intra-rater reliability, while two studies examined only concurrent validity. The reviewed studies were of moderate to good methodological quality. Physiotherapy assessments such as pain, swelling, range of motion, muscle strength, balance, gait and functional assessment demonstrated good concurrent validity. However, the reported concurrent validity of lumbar spine posture, special orthopaedic tests, neurodynamic tests and scar assessments ranged from low to moderate. Conclusion TR-based physiotherapy assessment was technically feasible with overall good concurrent validity and excellent reliability, except for lumbar spine posture, orthopaedic special tests, neurodynamic tests and scar assessment.

  9. Target scattering characteristics for OAM-based radar

    NASA Astrophysics Data System (ADS)

    Liu, Kang; Gao, Yue; Li, Xiang; Cheng, Yongqiang

    2018-02-01

    The target scattering characteristics are crucial for radar systems. However, very little study has been conducted for the recently developed orbital angular momentum (OAM) based radar system. To illustrate the role of the OAM-based radar cross section (ORCS), the conventional radar equation is modified by taking the characteristics of OAM waves into account. Subsequently, the ORCS is defined in analogy to the classical radar cross section (RCS). The unique features of the incident OAM-carrying field are analyzed. The scattered field is derived, and analytical expressions of the ORCS for metal plate and cylinder targets are obtained. Furthermore, the ORCS and RCS are compared to illustrate the influences of OAM mode number, target size and signal frequency on the ORCS. Analytical studies demonstrate that the mirror-reflection phenomenon disappears and the peak values of the ORCS lie in non-specular directions. Finally, the ORCS features are summarized to show its advantages in radar target detection. This work can provide theoretical guidance for the design of OAM-based radar as well as for target detection and identification applications.

  10. Analyzing Reliability and Performance Trade-Offs of HLS-Based Designs in SRAM-Based FPGAs Under Soft Errors

    NASA Astrophysics Data System (ADS)

    Tambara, Lucas Antunes; Tonfat, Jorge; Santos, André; Kastensmidt, Fernanda Lima; Medina, Nilberto H.; Added, Nemitala; Aguiar, Vitor A. P.; Aguirre, Fernando; Silveira, Marcilei A. G.

    2017-02-01

    The increasing system complexity of FPGA-based hardware designs and the shortening of time-to-market have motivated the adoption of new design methodologies focused on addressing the current need for high-performance circuits. High-Level Synthesis (HLS) tools can generate Register Transfer Level (RTL) designs from high-level software programming languages. These tools have evolved significantly in recent years, providing optimized RTL designs that can serve the needs of safety-critical applications requiring both high performance and high reliability. However, a reliability evaluation of HLS-based designs under soft errors had not yet been presented. In this work, the trade-offs of different HLS-based designs in terms of reliability, resource utilization, and performance are investigated by analyzing their behavior under soft errors and comparing them to a standard processor-based implementation in an SRAM-based FPGA. Results obtained from fault injection campaigns and radiation experiments show that it is possible to increase the performance of a processor-based system up to 5,000 times by changing its architecture, with a small impact on the cross section (an increase of up to 8 times), while still increasing the Mean Workload Between Failures (MWBF) of the system.
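
    As a back-of-the-envelope reading of those two figures (our gloss, assuming failures per completed workload scale as cross section x time per workload):

        MWBF_new / MWBF_old  ~  speedup / cross-section growth  =  5000 / 8  ~  625

    so the faster design completes roughly 625 times more work between failures even though it is individually more sensitive to upsets.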

  11. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    DOE PAGES

    Guthrie, Michael A.

    2013-01-01

    A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and the environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
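
    For a linear limit state with independent normal variables, the Hasofer-Lind index has a closed form: it is the distance from the origin to the failure surface in standard-normal space. A small Python sketch of that textbook special case (not the energy-based formulation itself):

        import numpy as np

        def hasofer_lind_linear(a, b, means, stds):
            """beta for g(x) = a.x + b with independent normal x (failure: g < 0)."""
            a, means, stds = map(np.asarray, (a, means, stds))
            return (a @ means + b) / np.linalg.norm(a * stds)

        # Example: g = R - S with R ~ N(3, 0.3^2), S ~ N(1, 0.4^2)  ->  beta = 4.0.
        beta = hasofer_lind_linear([1.0, -1.0], 0.0, [3.0, 1.0], [0.3, 0.4])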

  12. Composite Stress Rupture: A New Reliability Model Based on Strength Decay

    NASA Technical Reports Server (NTRS)

    Reeder, James R.

    2012-01-01

    A model is proposed to estimate reliability for stress rupture of composite overwrap pressure vessels (COPVs) and similar composite structures. This new reliability model is generated by assuming a strength degradation (or decay) over time. The model suggests that most of the strength decay occurs late in life. The strength decay model is shown to predict a response similar to that predicted by a traditional reliability model for stress rupture based on tests at a single stress level. In addition, the model predicts that even though there is strength decay due to proof loading, a significant overall increase in reliability is gained by eliminating any weak vessels, which would fail early. The model predicts that there should be significant periods of safe life following proof loading, because time is required for the strength to decay from the proof stress level to the subsequent loading level. Suggestions for testing the strength decay reliability model have been made. If the strength decay reliability model predictions are shown through testing to be accurate, COPVs may be designed to carry a higher level of stress than is currently allowed, which will enable the production of lighter structures.

  13. Probing Reliability of Transport Phenomena Based Heat Transfer and Fluid Flow Analysis in Autogeneous Fusion Welding Process

    NASA Astrophysics Data System (ADS)

    Bag, S.; de, A.

    2010-09-01

    The transport phenomena based heat transfer and fluid flow calculations in a weld pool require a number of input parameters. Arc efficiency, effective thermal conductivity, and viscosity in the weld pool are some of these parameters, values of which are rarely known and difficult to assign a priori based on scientific principles alone. The present work reports a bi-directional three-dimensional (3-D) heat transfer and fluid flow model, which is integrated with a real-number-based genetic algorithm. The bi-directional feature of the integrated model allows, first, the identification of the values of a required set of uncertain model input parameters and, next, the design of process parameters to achieve a target weld pool dimension. The computed values are validated against measured results in linear gas-tungsten-arc (GTA) weld samples. Furthermore, a novel methodology to estimate the overall reliability of the computed solutions is also presented.

  14. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, boundary conditions, etc. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are reviewed.

  15. The effect of Web-based Braden Scale training on the reliability of Braden subscale ratings.

    PubMed

    Magnan, Morris A; Maklebust, JoAnn

    2009-01-01

    The primary purpose of this study was to evaluate the effect of Web-based Braden Scale training on the reliability of Braden Scale subscale ratings made by nurses working in acute care hospitals. A secondary purpose was to describe the distribution of reliable Braden subscale ratings before and after Web-based Braden Scale training. Secondary analysis of data from a recently completed quasi-experimental, pretest-posttest, interrater reliability study. A convenience sample of RNs working at 3 Michigan medical centers voluntarily participated in the study. RN participants included nurses who used the Braden Scale regularly at their place of employment ("regular users") as well as nurses who did not use the Braden Scale at their place of employment ("new users"). Using a pretest-posttest, quasi-experimental design, pretest interrater reliability data were collected to identify the percentage of nurses making reliable Braden subscale assessments. Nurses then completed a Web-based Braden Scale training module after which posttest interrater reliability data were collected. The reliability of nurses' Braden subscale ratings was determined by examining the level of agreement/disagreement between ratings made by an RN and an "expert" rating the same patient. In total, 381 RN-to-expert dyads were available for analysis. During both the pretest and posttest periods, the percentage of reliable subscale ratings was highest for the activity subscale, lowest for the moisture subscale, and second lowest for the nutrition subscale. With Web-based Braden Scale training, the percentage of reliable Braden subscale ratings made by new users increased for all 6 subscales with statistically significant improvements in the percentage of reliable assessments made on 3 subscales: sensory-perception, moisture, and mobility. Training had virtually no effect on the percentage of reliable subscale ratings made by regular users of the Braden Scale. With Web-based Braden Scale training the

  16. INFLUENCES OF RESPONSE RATE AND DISTRIBUTION ON THE CALCULATION OF INTEROBSERVER RELIABILITY SCORES

    PubMed Central

    Rolider, Natalie U.; Iwata, Brian A.; Bullock, Christopher E.

    2012-01-01

    We examined the effects of several variations in response rate on the calculation of total, interval, exact-agreement, and proportional reliability indices. Trained observers recorded computer-generated data that appeared on a computer screen. In Study 1, target responses occurred at low, moderate, and high rates during separate sessions so that reliability results based on the four calculations could be compared across a range of values. Total reliability was uniformly high, interval reliability was spuriously high for high-rate responding, proportional reliability was somewhat lower for high-rate responding, and exact-agreement reliability was the lowest of the measures, especially for high-rate responding. In Study 2, we examined the separate effects of response rate per se, bursting, and end-of-interval responding. Response rate and bursting had little effect on reliability scores; however, the distribution of some responses at the end of intervals decreased interval reliability somewhat, proportional reliability noticeably, and exact-agreement reliability markedly. PMID:23322930
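
    The four indices compared in the study can be stated concretely for two observers' per-interval response counts; the Python sketch below uses common textbook definitions (total: ratio of the smaller to the larger session total; interval: agreement on whether any response occurred; exact agreement: identical counts; proportional: smaller count over larger count per interval). Exact definitions vary across sources, so treat this as an illustration rather than the study's code.

        import numpy as np

        def reliability_indices(obs1, obs2):
            """obs1, obs2: response counts per interval for two observers."""
            obs1, obs2 = np.asarray(obs1, float), np.asarray(obs2, float)
            s1, s2 = obs1.sum(), obs2.sum()
            total = 1.0 if max(s1, s2) == 0 else min(s1, s2) / max(s1, s2)
            interval = np.mean((obs1 > 0) == (obs2 > 0))
            exact = np.mean(obs1 == obs2)
            lo, hi = np.minimum(obs1, obs2), np.maximum(obs1, obs2)
            proportional = np.mean(np.where(hi > 0, lo / np.where(hi == 0, 1, hi), 1.0))
            return total, interval, exact, proportional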

  17. Automatic Target Recognition Based on Cross-Plot

    PubMed Central

    Wong, Kelvin Kian Loong; Abbott, Derek

    2011-01-01

    Automatic target recognition that relies on rapid feature extraction from real-time, photo-realistic imagery enables efficient identification of target patterns. To achieve this objective, Cross-plots of binary patterns are explored as potential signatures for the observed target, capturing the crucial spatial features at high speed using minimal computational resources. Target recognition was implemented based on the proposed pattern recognition concept and tested rigorously for its precision and recall performance. We conclude that Cross-plotting is able to produce a digital fingerprint of a target that correlates efficiently and effectively with the signatures of patterns whose identities are held in a target repository. PMID:21980508

  18. Pinch aperture proprioception: reliability and feasibility study

    PubMed Central

    Yahya, Abdalghani; von Behren, Timothy; Levine, Shira; dos Santos, Marcio

    2018-01-01

    [Purpose] To establish the reliability and feasibility of a novel pinch aperture device to measure proprioceptive joint position sense. [Subjects and Methods] Reliability of the pinch aperture device was assessed in 21 healthy subjects. Following familiarization with a 15° target position of the index finger and thumb, subjects performed 5 trials in which they attempted to actively reproduce the target position without visual feedback. This procedure was repeated at a testing session on a separate date, and the between-session intraclass correlation coefficient (ICC) was calculated. In addition, extensor tendon vibration was applied to 19 healthy subjects, and paired t-tests were conducted to compare performance under vibration and no-vibration conditions. Pinch aperture proprioception was also assessed in two individuals with known diabetic neuropathy. [Results] The pinch aperture device demonstrated excellent reliability in healthy subjects (ICC 0.88, 95% confidence interval 0.70–0.95). Tendon vibration disrupted pinch aperture proprioception, causing subjects to undershoot the target position (18.1 ± 2.6° vs. 14.8 ± 0.76°, p<0.001). This tendency to undershoot the target position was also noted in individuals with diabetic neuropathy. [Conclusion] This study describes a reliable, feasible, and functional means of measuring finger proprioception. Further research should investigate the assessment and implications of pinch aperture proprioception in neurological and orthopedic populations. PMID:29765192

  19. Reliability and validity of a treatment fidelity assessment for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.

    PubMed

    Seng, Elizabeth K; Lovejoy, Travis I

    2013-12-01

    This study psychometrically evaluates the Motivational Interviewing Treatment Integrity Code (MITI) for assessing fidelity to motivational interviewing to reduce sexual risk behaviors in people living with HIV/AIDS. 74 sessions from a pilot randomized controlled trial of motivational interviewing to reduce sexual risk behaviors in people living with HIV were coded with the MITI. Participants reported sexual behavior at baseline, 3 months, and 6 months. Regarding reliability, excellent inter-rater reliability was achieved for measures of behavior frequency across the 12 sessions coded by both coders; global scales demonstrated poor intraclass correlations but adequate percent agreement. Regarding validity, principal components analyses indicated that a two-factor model accounted for an adequate amount of variance in the data. These factors were associated with decreases in sexual risk behaviors after treatment. The MITI is a reliable and valid measure of treatment fidelity for motivational interviewing targeting sexual risk behaviors in people living with HIV/AIDS.

  20. MultiMiTar: a novel multi objective optimization based miRNA-target prediction method.

    PubMed

    Mitra, Ramkrishna; Bandyopadhyay, Sanghamitra

    2011-01-01

    Machine learning based miRNA-target prediction algorithms often fail to obtain a balanced prediction accuracy in terms of both sensitivity and specificity, owing to the lack of a gold standard of negative examples, of miRNA-targeting-site context-specific relevant features, and of an efficient feature selection process. Moreover, the sequence-, structure- and machine-learning-based algorithms are all unable to distribute the true positive predictions preferentially at the top of the ranked list; hence the algorithms become unreliable for biologists. In addition, these algorithms fail to obtain a considerable combination of precision and recall for target transcripts that are translationally repressed at the protein level. In this article, we introduce an efficient miRNA-target prediction system, MultiMiTar, a Support Vector Machine (SVM) based classifier integrated with a multiobjective metaheuristic based feature selection technique. The robust performance of the proposed method is mainly the result of using high-quality negative examples and the selection of biologically relevant miRNA-targeting-site context-specific features. The features are selected by using a novel feature selection technique, AMOSA-SVM, that integrates the multi-objective optimization technique Archived Multi-Objective Simulated Annealing (AMOSA) and SVM. MultiMiTar is found to achieve a much higher Matthews correlation coefficient (MCC) of 0.583 and average class-wise accuracy (ACA) of 0.8 compared to the other target prediction methods on a completely independent test data set. The obtained MCC and ACA values of these algorithms range from -0.269 to 0.155 and 0.321 to 0.582, respectively. Moreover, it shows a more balanced result in terms of precision and sensitivity (recall) for the translationally repressed data set as compared to all the other existing methods. An important aspect is that the true positive predictions are distributed preferentially at the top of the ranked list that makes Multi
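
    For reference, the Matthews correlation coefficient quoted above is a standard function of the binary confusion matrix; a minimal Python version (illustrative counts, not the paper's data):

        import math

        def mcc(tp, tn, fp, fn):
            denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
            return (tp * tn - fp * fn) / denom if denom else 0.0

        # Example: 80 TP, 70 TN, 20 FP, 30 FN  ->  MCC ~ 0.50.
        print(round(mcc(80, 70, 20, 30), 2))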

  1. Reliability, Compliance, and Security in Web-Based Course Assessments

    ERIC Educational Resources Information Center

    Bonham, Scott

    2008-01-01

    Pre- and postcourse assessment has become a very important tool for education research in physics and other areas. The web offers an attractive alternative to in-class paper administration, but concerns about web-based administration include reliability due to changes in medium, student compliance rates, and test security, both question leakage…

  2. Inter-rater reliability of an observation-based ergonomics assessment checklist for office workers.

    PubMed

    Pereira, Michelle Jessica; Straker, Leon Melville; Comans, Tracy Anne; Johnston, Venerina

    2016-12-01

    To establish the inter-rater reliability of an observation-based ergonomics assessment checklist for computer workers. A 37-item (38-item if a laptop was part of the workstation) comprehensive observational ergonomics assessment checklist, comparable to government guidelines and up to date with empirical evidence, was developed. Two trained practitioners assessed full-time office workers performing their usual computer-based work and evaluated the suitability of the workstations used. The practitioners assessed each participant consecutively. The order of assessors was randomised, and the second assessor was blinded to the findings of the first. Unadjusted kappa coefficients between the raters were obtained for the overall checklist and for subsections formed from question items relevant to specific workstation equipment. Twenty-seven office workers were recruited. Inter-rater agreement between the two trained practitioners was moderate to good for all except one checklist component. Practitioner Summary: This reliable ergonomics assessment checklist for computer workers was designed using accessible government guidelines and supplemented with up-to-date evidence. Employers in Queensland (Australia) can fulfil legislative requirements by using this reliable checklist to identify and subsequently address potential risk factors for work-related injury and so provide a safe working environment.
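
    Item-level agreement of this kind is typically quantified with a kappa coefficient, which corrects raw percent agreement for chance. A minimal Python version of Cohen's kappa on made-up pass/fail ratings for ten checklist items (illustrative only):

        import numpy as np

        def cohens_kappa(r1, r2):
            r1, r2 = np.asarray(r1), np.asarray(r2)
            po = np.mean(r1 == r2)  # observed agreement
            cats = np.union1d(r1, r2)
            pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)  # chance agreement
            return (po - pe) / (1 - pe)

        # Two practitioners rating 10 items as pass (1) / fail (0): kappa ~ 0.47.
        print(cohens_kappa([1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
                           [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]))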

  3. A Measure for the Reliability of a Rating Scale Based on Longitudinal Clinical Trial Data

    ERIC Educational Resources Information Center

    Laenen, Annouschka; Alonso, Ariel; Molenberghs, Geert

    2007-01-01

    A new measure for the reliability of a rating scale is introduced, based on the classical definition of reliability as the ratio of the true-score variance to the total variance. Clinical trial data can be employed to estimate the reliability of the scale in use whenever repeated measurements are taken. The reliability is estimated from the…
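
    In symbols, the classical definition referred to here (our gloss) is R = sigma^2_true / (sigma^2_true + sigma^2_error): with repeated measurements per patient, the two variance components can be estimated, for example from a mixed-effects model of the longitudinal trial data, and their ratio gives the reliability of the scale.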

  4. Reliability Based Design for a Raked Wing Tip of an Airframe

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2011-01-01

    A reliability-based optimization methodology has been developed to design the raked wing tip of the Boeing 767-400 extended-range airliner, made of composite and metallic materials. The design is formulated for an accepted level of risk or reliability, and the design variables, weight, and constraints become functions of reliability. Uncertainties in the load, strength, and material properties, as well as in the design variables, were modeled as random parameters with specified distributions such as normal, Weibull, or Gumbel functions. The objective function and constraints, or failure modes, became derived functions of the risk level. Solution of the problem produced the optimum design, with weight, variables, and constraints as functions of the risk level. Optimum weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to a 50 percent probability of success, or one failure in two samples. Under some assumptions, this design would be quite close to the deterministic optimum solution. The weight increased when reliability exceeded 50 percent and decreased when reliability was compromised. A design can be selected depending on the level of risk acceptable in a given situation. The optimization process achieved up to a 20-percent reduction in weight over the traditional design.

  5. Limit states and reliability-based pipeline design. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zimmerman, T.J.E.; Chen, Q.; Pandey, M.D.

    1997-06-01

    This report provides the results of a study to develop limit states design (LSD) procedures for pipelines. Limit states design, also known as load and resistance factor design (LRFD), provides a unified approach to dealing with all relevant failure mode combinations of concern. It explicitly accounts for the uncertainties that naturally occur in the determination of the loads which act on a pipeline and in the resistance of the pipe to failure. The load and resistance factors used are based on reliability considerations; however, the designer is not faced with carrying out probabilistic calculations. This work is done during development and periodic updating of the LSD document. This report provides background information concerning limit states and reliability-based design (Section 2), gives the limit states design procedures that were developed (Section 3), and provides results of the reliability analyses that were undertaken in order to partially calibrate the LSD method (Section 4). An appendix contains LSD design examples in order to demonstrate use of the method. Section 3, Limit States Design, has been written in the format of a recommended practice. It has been structured so that, in future, it can easily be converted to a limit states design code format. Throughout the report, figures and tables are given at the end of each section, with the exception of Section 3, where, to facilitate understanding of the LSD method, they have been included with the text.

  6. Real-time stereo matching using orthogonal reliability-based dynamic programming.

    PubMed

    Gong, Minglun; Yang, Yee-Hong

    2007-03-01

    A novel algorithm is presented in this paper for estimating reliable stereo matches in real time. Based on the dynamic programming-based technique we previously proposed, the new algorithm can generate semi-dense disparity maps using as few as two dynamic programming passes. The iterative best path tracing process used in traditional dynamic programming is replaced by a local minimum searching process, making the algorithm suitable for parallel execution. Most computations are implemented on programmable graphics hardware, which improves the processing speed and makes real-time estimation possible. The experiments on the four new Middlebury stereo datasets show that, on an ATI Radeon X800 card, the presented algorithm can produce reliable matches for approximately 60%–80% of pixels at a rate of approximately 10–20 frames per second. If needed, the algorithm can be configured for generating full-density disparity maps.

  7. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters of other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
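
    A continuous-time Markov reliability model of the kind used in the first stage can be evaluated with a matrix exponential. A toy Python sketch with three states (nominal, degraded, failed) and illustrative transition rates, not values from the paper:

        import numpy as np
        from scipy.linalg import expm

        # Generator matrix Q (transitions per hour): nominal -> degraded -> failed.
        Q = np.array([[-1e-3, 1e-3, 0.0],
                      [0.0, -5e-3, 5e-3],
                      [0.0, 0.0, 0.0]])

        p0 = np.array([1.0, 0.0, 0.0])   # start in the nominal state
        t = 100.0                        # mission time in hours
        pt = p0 @ expm(Q * t)            # state probabilities at time t
        reliability = 1.0 - pt[2]        # probability of not having failed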

  8. Reliability-based evaluation of bridge components for consistent safety margins.

    DOT National Transportation Integrated Search

    2010-10-01

    The Load and Resistance Factor Design (LRFD) approach is based on the concept of structural reliability. The approach is more rational than former design approaches such as Load Factor Design or Allowable Stress Design. The LRFD Specification fo...

  9. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
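
    The core of importance sampling for reliability is to draw samples from a density centered near the failure region and reweight each by the ratio of the true density to the sampling density. A minimal, non-adaptive Python sketch of that idea (the adaptive domain-growing part of AIS is omitted):

        import numpy as np
        from scipy import stats

        def is_failure_probability(g, mu, sigma, mu_is, sigma_is, n=10**5, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.normal(mu_is, sigma_is, size=n)   # sample near the failure region
            w = stats.norm.pdf(x, mu, sigma) / stats.norm.pdf(x, mu_is, sigma_is)
            return float(np.mean((g(x) < 0) * w))     # reweighted failure indicator

        # Example: g(x) = 4 - x with x ~ N(0, 1); exact Pf = 1 - Phi(4) ~ 3.2e-5.
        pf = is_failure_probability(lambda x: 4 - x, 0.0, 1.0, 4.0, 1.0)

    With the same sample budget, crude Monte Carlo would see only a handful of failures here, while the shifted sampler resolves the tail probability with far lower variance.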

  10. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    PubMed Central

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, in situations where the evidence highly conflicts, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability, are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
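
    For context, Dempster's rule combines two sensors' basic probability assignments by multiplying the masses of intersecting hypothesis sets and renormalizing by the non-conflicting mass. A minimal Python sketch on two fault hypotheses (the masses are illustrative, and this shows the classical rule, not the paper's weighted modification):

        from itertools import product

        def dempster_combine(m1, m2):
            """m1, m2: dicts mapping frozenset hypotheses to mass."""
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
            k = 1.0 - conflict  # normalization; breaks down as conflict approaches 1
            return {h: w / k for h, w in combined.items()}

        F1, F2 = frozenset({"F1"}), frozenset({"F2"})
        print(dempster_combine({F1: 0.8, F2: 0.2}, {F1: 0.6, F2: 0.4}))
        # -> {F1: ~0.857, F2: ~0.143}; highly conflicting reports are what the
        #    paper's reliability weighting is designed to handle more gracefully.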

  11. Advances in silica based nanoparticles for targeted cancer therapy.

    PubMed

    Yang, Yannan; Yu, Chengzhong

    2016-02-01

    Targeted delivery of anticancer drugs specifically to the tumor site without damaging normal tissues has been the dream of scientists fighting cancer for decades. Recent breakthroughs in nanotechnology-based medicines have provided a possible tool to solve this puzzle. Among the diverse nanomaterials under development and extensive study, silica-based nanoparticles with vast advantages have attracted great attention. In this review, we concentrate on recent progress using silica-based nanoparticles, particularly mesoporous silica nanoparticles (MSNs), for targeted drug delivery applications. First, we discuss the passive targeting capability of silica-based nanoparticles in relation to their physiochemical properties. Then, we focus on recent advances in active targeting strategies involving tumor cell targeting, vascular targeting, nuclear targeting and multistage targeting, followed by an introduction to the magnetic field directed targeting approach. We conclude with our personal perspectives on the remaining challenges and possible future directions. Chemotherapy has been one of the mainstays of cancer treatment. Advances in nanotechnology have allowed the development of novel carrier systems for the delivery of anticancer drugs. Mesoporous silica has shown great promise in this respect. In this review article, the authors provide a comprehensive overview of the use of this nanoparticle in both passive and active targeting in the field of oncology. The advantages of this particle are further discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Combine Flash-Based FPGA TID and Long-Term Retention Reliabilities Through VT Shift

    NASA Astrophysics Data System (ADS)

    Wang, Jih-Jong; Rezzak, Nadia; Dsilva, Durwyn; Xue, Fengliang; Samiee, Salim; Singaraju, Pavan; Jia, James; Nguyen, Victor; Hawley, Frank; Hamdy, Esmat

    2016-08-01

    Reliability test results for data retention and total ionizing dose (TID) in a 65 nm Flash-based field programmable gate array (FPGA) are presented. A long-chain inverter design is recommended for reliability evaluation because it is the worst-case design for both effects. Based on preliminary test data, both issues are unified and modeled by one natural decay equation. The relative contributions of the TID-induced threshold-voltage shift and the retention mechanisms are evaluated by analyzing the test data.

  13. Is School-Based Height and Weight Screening of Elementary Students Private and Reliable?

    ERIC Educational Resources Information Center

    Stoddard, Sarah A.; Kubik, Martha Y.; Skay, Carol

    2008-01-01

    The Institute of Medicine recommends school-based body mass index (BMI) screening as an obesity prevention strategy. While school nurses have provided height/weight screening for years, little has been published describing measurement reliability or process. This study evaluated the reliability of height/weight measures collected by school nurses…

  14. Bridge reliability assessment based on the PDF of long-term monitored extreme strains

    NASA Astrophysics Data System (ADS)

    Jiao, Meiju; Sun, Limin

    2011-04-01

    Structural health monitoring (SHM) systems can provide valuable information for the evaluation of bridge performance. With the development and implementation of SHM technology in recent years, the mining and use of monitoring data have received increasing attention and interest in civil engineering. Based on the principles of probability and statistics, a reliability approach provides a rational basis for analyzing the randomness in loads and their effects on structures. A novel approach that combines SHM systems with reliability methods to evaluate the reliability of a cable-stayed bridge instrumented with an SHM system is presented in this paper. In this study, the reliability of the steel girder of the cable-stayed bridge is expressed directly as a failure probability instead of the commonly used reliability index. Under the assumption that the probability distribution of the resistance is independent of the structural responses, a formulation of the failure probability is deduced. Then, as the main factor in this formulation, the probability density function (PDF) of the strain at sensor locations is evaluated from the monitoring data and verified. An application of the proposed approach to the Donghai Bridge follows. In the case study, four years of monitoring data collected since the SHM system began operation were processed, and the reliability assessment results are discussed. Finally, the sensitivity and accuracy of the novel approach are discussed in comparison with FORM.
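
    For independent resistance R and monitored load effect S, the formulation described reduces to Pf = P(R < S) = integral of F_R(s) f_S(s) ds. A hedged Python sketch with synthetic strains standing in for the monitored extremes and an assumed normal resistance (all numbers illustrative):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        strains = rng.gumbel(300, 25, size=1000)  # stand-in for monitored extreme strains
        f_S = stats.gaussian_kde(strains)         # PDF of the load effect from data

        R = stats.norm(450, 40)                   # assumed resistance distribution
        s = np.linspace(strains.min(), strains.max() + 200, 2000)
        pf = float(np.sum(R.cdf(s) * f_S(s)) * (s[1] - s[0]))  # Pf = int F_R(s) f_S(s) ds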

  15. Reliable contact fabrication on nanostructured Bi2Te3-based thermoelectric materials.

    PubMed

    Feng, Shien-Ping; Chang, Ya-Huei; Yang, Jian; Poudel, Bed; Yu, Bo; Ren, Zhifeng; Chen, Gang

    2013-05-14

    A cost-effective and reliable Ni-Au contact on nanostructured Bi2Te3-based alloys for a solar thermoelectric generator (STEG) is reported. The use of MPS SAMs creates strong covalent binding and a greater number of evenly distributed nucleation sites for electroplating contact electrodes on nanostructured thermoelectric materials. A reliable high-performance flat-panel STEG can be obtained using this new method.

  16. Design of a sensitive aptasensor based on magnetic microbeads-assisted strand displacement amplification and target recycling.

    PubMed

    Li, Ying; Ji, Xiaoting; Song, Weiling; Guo, Yingshu

    2013-04-03

    A cross-circular amplification system for the sensitive detection of adenosine triphosphate (ATP) in cancer cells was developed based on aptamer-target interaction, magnetic microbead (MB)-assisted strand displacement amplification, and target recycling. Here we describe a new recognition probe possessing two parts, the ATP aptamer and an extension part. The recognition probe was first immobilized on the surface of the MBs and hybridized with its complementary sequence to form a duplex. When combined with ATP, the probe changed its conformation, revealing the extension part in single-stranded form, which further served as a toehold for subsequent target recycling. The released complementary sequence of the probe acted as the catalyst of the MB-assisted strand displacement reaction. Incorporating target recycling, a large number of biotin-tagged MB complexes were formed to stimulate the generation of a chemiluminescence (CL) signal in the presence of luminol and H2O2 together with streptavidin-HRP, reaching a detection limit for ATP as low as 6.1×10⁻¹⁰ M. Moreover, sample assays of ATP in Ramos Burkitt's lymphoma B cells were performed, which confirmed the reliability and practicality of the protocol. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Deterministic and reliability based optimization of integrated thermal protection system composite panel using adaptive sampling techniques

    NASA Astrophysics Data System (ADS)

    Ravishankar, Bharani

    Conventional space vehicles have thermal protection systems (TPS) that provide protection to an underlying structure that carries the flight loads. In an attempt to save weight, there is interest in an integrated TPS (ITPS) that combines the structural function and the TPS function. This has weight-saving potential but complicates the design of the ITPS, which now has both thermal and structural failure modes. The main objective of this dissertation was to optimally design the ITPS subjected to thermal and mechanical loads through deterministic and reliability-based optimization. The optimization of the ITPS structure requires computationally expensive finite element analyses of the 3D ITPS (solid) model. To reduce the computational expense involved in the structural analysis, a finite element based homogenization method was employed, homogenizing the 3D ITPS model to a 2D orthotropic plate. However, it was found that homogenization was applicable only for panels that are much larger than the characteristic dimensions of the repeating unit cell in the ITPS panel. Hence, a single unit cell was used for the optimization process to reduce the computational cost. Deterministic and probabilistic optimization of the ITPS panel required evaluation of failure constraints at various design points, which further demands computationally expensive finite element analyses; these were replaced by efficient, low-fidelity surrogate models. In an optimization process, it is important to represent the constraints accurately in order to find the optimum design. Instead of building global surrogate models from a large number of designs, the computational resources were directed towards target regions near constraint boundaries for accurate representation of the constraints using adaptive sampling strategies. Efficient Global Reliability Analysis (EGRA) facilitates sequential sampling of design points around the region of interest in the design space. EGRA was applied to the response surface construction of

  18. Reliability of 3D laser-based anthropometry and comparison with classical anthropometry.

    PubMed

    Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Broda, Anja; Scholz, Markus

    2016-05-26

    Anthropometric quantities are widely used in epidemiologic research as possible confounders, risk factors, or outcomes. 3D laser-based body scans (BS) allow evaluation of dozens of quantities in short time with minimal physical contact between observers and probands. The aim of this study was to compare BS with classical manual anthropometric (CA) assessments with respect to feasibility, reliability, and validity. We performed a study on 108 individuals with multiple measurements of BS and CA to estimate intra- and inter-rater reliabilities for both. We suggested BS equivalents of CA measurements and determined validity of BS considering CA the gold standard. Throughout the study, the overall concordance correlation coefficient (OCCC) was chosen as indicator of agreement. BS was slightly more time consuming but better accepted than CA. For CA, OCCCs for intra- and inter-rater reliability were greater than 0.8 for all nine quantities studied. For BS, 9 of 154 quantities showed reliabilities below 0.7. BS proxies for CA measurements showed good agreement (minimum OCCC > 0.77) after offset correction. Thigh length showed higher reliability in BS while upper arm length showed higher reliability in CA. Except for these issues, reliabilities of CA measurements and their BS equivalents were comparable.

  19. Test-retest reliability of sensor-based sit-to-stand measures in young and older adults.

    PubMed

    Regterschot, G Ruben H; Zhang, Wei; Baldus, Heribert; Stevens, Martin; Zijlstra, Wiebren

    2014-01-01

    This study investigated test-retest reliability of sensor-based sit-to-stand (STS) peak power and other STS measures in young and older adults. In addition, test-retest reliability of the sensor method was compared to test-retest reliability of the Timed Up and Go Test (TUGT) and Five-Times-Sit-to-Stand Test (FTSST) in older adults. Ten healthy young female adults (20-23 years) and 31 older adults (21 females; 73-94 years) participated in two assessment sessions separated by 3-8 days. Vertical peak power was assessed during three (young adults) and five (older adults) normal and fast STS trials with a hybrid motion sensor worn on the hip. Older adults also performed the FTSST and TUGT. The average sensor-based STS peak power of the normal STS trials and the average sensor-based STS peak power of the fast STS trials showed excellent test-retest reliability in young adults (intra-class correlation (ICC)≥0.90; zero in 95% confidence interval of mean difference between test and retest (95%CI of D); standard error of measurement (SEM)≤6.7% of mean peak power) and older adults (ICC≥0.91; zero in 95%CI of D; SEM≤9.9%). Test-retest reliability of sensor-based STS peak power and TUGT (ICC=0.98; zero in 95%CI of D; SEM=8.5%) was comparable in older adults, test-retest reliability of the FTSST was lower (ICC=0.73; zero outside 95%CI of D; SEM=14.4%). Sensor-based STS peak power demonstrated excellent test-retest reliability and may therefore be useful for clinical assessment of functional status and fall risk. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Reliability model generator

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
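
    The aggregation idea can be illustrated with a toy sketch, assuming each low-level model reduces to a single component reliability and the architecture description reduces to series/parallel composition; real generators handle much richer models (Markov chains, repair, coverage), so this is only the skeleton of the idea.

        from functools import reduce

        def series(*r):    # system works only if every component works
            return reduce(lambda a, b: a * b, r)

        def parallel(*r):  # system fails only if all redundant components fail
            return 1 - reduce(lambda a, b: a * b, (1 - x for x in r))

        # Architecture description: two redundant processors in series with one bus.
        r_cpu, r_bus = 0.95, 0.99
        print(f"system reliability = {series(parallel(r_cpu, r_cpu), r_bus):.4f}")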

  1. Recent Trends in Nanotechnology-Based Drugs and Formulations for Targeted Therapeutic Delivery.

    PubMed

    Iqbal, Hafiz M N; Rodriguez, Angel M V; Khandia, Rekha; Munjal, Ashok; Dhama, Kuldeep

    2017-01-01

    In the recent past, a wide spectrum of nanotechnology-based drugs, drug-loaded devices, and delivery systems has been engineered and investigated with great interest. The key objective is to improve patient quality of life safely, by avoiding or limiting drug abuse and the severe adverse effects of some traditional therapies. Various methodological approaches, including in vitro, in vivo, and ex vivo techniques, have been exploited so far. Among them, nanoparticle-based therapeutic agents are of supreme interest for enhanced, efficient delivery in the modern biomedical sector. The development of novel, effective, and highly reliable therapeutic drug delivery systems (DDS) for multipurpose applications is a core demand in tackling many human diseases. In this context, several advanced nanotechnology-based DDS have been engineered with novel characteristics for biomedical, pharmaceutical, and cosmeceutical applications, including but not limited to enhanced bioactivity, bioavailability, drug efficacy, and targeted delivery, with improved therapeutic safety, thereby overcoming the demerits of traditional drug formulations and designs. This review focuses on recent trends and advances in nanotechnology-based drugs and formulations designed for targeted therapeutic delivery. Information from recent patents covering various nanotechnology-based approaches for several applications is also reviewed, summarized, and illustrated diagrammatically for better understanding. Drug-loaded nanoparticles are versatile candidates with multifunctional characteristics for potential applications in the biomedical and tissue engineering sectors. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  2. Drug Discovery for Neglected Diseases: Molecular Target-Based and Phenotypic Approaches

    PubMed Central

    2013-01-01

    Drug discovery for neglected tropical diseases is carried out using both target-based and phenotypic approaches. In this paper, target-based approaches are discussed, with a particular focus on human African trypanosomiasis. Target-based drug discovery can be successful, but careful selection of targets is required. There are still very few fully validated drug targets in neglected diseases, and there is a high attrition rate in target-based drug discovery for these diseases. Phenotypic screening is a powerful method in both neglected and non-neglected diseases and has been very successfully used. Identification of molecular targets from phenotypic approaches can be a way to identify potential new drug targets. PMID:24015767

  3. Reliability of a smartphone-based goniometer for knee joint goniometry.

    PubMed

    Ferriero, Giorgio; Vercelli, Stefano; Sartorio, Francesco; Muñoz Lasa, Susana; Ilieva, Elena; Brigatti, Elisa; Ruella, Carolina; Foti, Calogero

    2013-06-01

    The aim of this study was to assess the reliability of a smartphone application developed for photographic-based goniometry, DrGoniometer (DrG), by comparing its measurement of the knee joint angle with that made by a universal goniometer (UG). Joint goniometry is a common mode of clinical assessment used in many disciplines, particularly in rehabilitation. One validated method is photographic-based goniometry, but the procedure is usually complex: the image has to be downloaded from the camera to a computer and then edited using dedicated software. This disadvantage may be overcome by the new generation of mobile phones (smartphones), which have computer-like functionality and an integrated digital camera. This validation study was carried out under two different controlled conditions: (i) with the participant to be measured held in a fixed position and (ii) with a battery of pictures to assess. In the first part, four raters performed repeated measurements with DrG and UG at different knee joint angles. Then, 10 other raters measured the knee at different flexion angles, ranging from 20° to 145°, on a battery of 35 pictures taken in a clinical setting. The results showed that inter-rater and intra-rater correlations were always greater than 0.958. Agreement with the UG showed widths of 18.2° [95% limits of agreement (LoA) = -7.5°/+10.7°] and 14.1° (LoA = -6.6°/+7.5°). In conclusion, DrG appears to be a reliable method for measuring knee joint angle. This mHealth application can be an alternative or additional method of goniometry, easier to use than other photographic-based goniometric assessments. Further studies are required to assess its reliability for the measurement of other joints.

  4. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    NASA Astrophysics Data System (ADS)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Because of the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation-coefficient optimization method is applied to enhance the Weibull distribution model so that the reliability estimate is more accurate, greatly improving the precision of the mixed-distribution reliability model. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
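
    The model itself is compact: a sketch of a two-population mixed Weibull reliability function R(t) = w1*exp(-(t/eta1)^beta1) + w2*exp(-(t/eta2)^beta2), with invented weights and parameters standing in for fitted values.

        import numpy as np

        def mixed_weibull_reliability(t, weights, betas, etas):
            """Reliability of a mixture of two-parameter Weibull populations."""
            t = np.asarray(t, float)
            return sum(w * np.exp(-(t / eta) ** beta)
                       for w, beta, eta in zip(weights, betas, etas))

        # Illustrative failure modes: early wear-in (beta < 1) plus wear-out (beta > 1).
        t = np.array([100., 500., 1000.])   # operating hours
        print(np.round(mixed_weibull_reliability(
            t, weights=[0.3, 0.7], betas=[0.8, 3.2], etas=[400., 1200.]), 3))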

  5. Geo-Referenced Dynamic Pushbroom Stereo Mosaics for 3D and Moving Target Extraction - A New Geometric Approach

    DTIC Science & Technology

    2009-12-01

    facilitating reliable stereo matching, occlusion handling, accurate 3D reconstruction and robust moving target detection. We use the fact that all the... a moving platform, we will have to naturally and effectively handle obvious motion parallax and object occlusions in order to be able to detect... Based on the above two

  6. Universal, colorimetric microRNA detection strategy based on target-catalyzed toehold-mediated strand displacement reaction

    NASA Astrophysics Data System (ADS)

    Park, Yeonkyung; Lee, Chang Yeol; Kang, Shinyoung; Kim, Hansol; Park, Ki Soo; Park, Hyun Gyu

    2018-02-01

    In this work, we developed a novel, label-free, and enzyme-free strategy for the colorimetric detection of microRNA (miRNA), which relies on a target-catalyzed toehold-mediated strand displacement (TMSD) reaction. The system employs a detection probe that specifically binds to the target miRNA and sequentially releases a catalyst strand (CS) intended to trigger the subsequent TMSD reaction. Thus, the presence of target miRNA releases the CS that mediates the formation of an active G-quadruplex DNAzyme which is initially caged and inactivated by a blocker strand. In addition, a fuel strand that is supplemented for the recycling of the CS promotes another TMSD reaction, consequently generating a large number of active G-quadruplex DNAzymes. As a result, a distinct colorimetric signal is produced by the ABTS oxidation promoted by the peroxidase mimicking activity of the released G-quadruplex DNAzymes. Based on this novel strategy, we successfully detected miR-141, a promising biomarker for human prostate cancer, with high selectivity. The diagnostic capability of this system was also demonstrated by reliably determining target miR-141 in human serum, showing its great potential towards real clinical applications. Importantly, the proposed approach is composed of separate target recognition and signal transduction modules. Thus, it could be extended to analyze different target miRNAs by simply redesigning the detection probe while keeping the same signal transduction module as a universal signal amplification unit, which was successfully demonstrated by analyzing another target miRNA, let-7d.

  7. Universal, colorimetric microRNA detection strategy based on target-catalyzed toehold-mediated strand displacement reaction.

    PubMed

    Park, Yeonkyung; Lee, Chang Yeol; Kang, Shinyoung; Kim, Hansol; Park, Ki Soo; Park, Hyun Gyu

    2018-02-23

    In this work, we developed a novel, label-free, and enzyme-free strategy for the colorimetric detection of microRNA (miRNA), which relies on a target-catalyzed toehold-mediated strand displacement (TMSD) reaction. The system employs a detection probe that specifically binds to the target miRNA and sequentially releases a catalyst strand (CS) intended to trigger the subsequent TMSD reaction. Thus, the presence of target miRNA releases the CS that mediates the formation of an active G-quadruplex DNAzyme which is initially caged and inactivated by a blocker strand. In addition, a fuel strand that is supplemented for the recycling of the CS promotes another TMSD reaction, consequently generating a large number of active G-quadruplex DNAzymes. As a result, a distinct colorimetric signal is produced by the ABTS oxidation promoted by the peroxidase mimicking activity of the released G-quadruplex DNAzymes. Based on this novel strategy, we successfully detected miR-141, a promising biomarker for human prostate cancer, with high selectivity. The diagnostic capability of this system was also demonstrated by reliably determining target miR-141 in human serum, showing its great potential towards real clinical applications. Importantly, the proposed approach is composed of separate target recognition and signal transduction modules. Thus, it could be extended to analyze different target miRNAs by simply redesigning the detection probe while keeping the same signal transduction module as a universal signal amplification unit, which was successfully demonstrated by analyzing another target miRNA, let-7d.

  8. Robust Targeting for the Smartphone Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Carter, Christopher

    2017-01-01

    The Smartphone Video Guidance Sensor (SVGS) is a miniature, self-contained autonomous rendezvous and docking sensor developed using a commercial off the shelf Android-based smartphone. It aims to provide a miniaturized solution for rendezvous and docking, enabling small satellites to conduct proximity operations and formation flying while minimizing interference with a primary payload. Previously, the sensor was limited by a slow (2 Hz) refresh rate and its use of retro-reflectors, both of which contributed to a limited operating environment. To advance the technology readiness level, a modified approach was developed, combining a multi-colored LED target with a focused target-detection algorithm. Alone, the use of an LED system was determined to be much more reliable, though slower, than the retro-reflector system. The focused target-detection system was developed in response to this problem to mitigate the speed reduction of using color. However, it also improved the reliability. In combination these two methods have been demonstrated to dramatically increase sensor speed and allow the sensor to select the target even with significant noise interfering with the sensor, providing millimeter level accuracy at a range of two meters with a 1U target.

  9. Robust Targeting for the Smartphone Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Carter, C.

    2017-01-01

    The Smartphone Video Guidance Sensor (SVGS) is a miniature, self-contained autonomous rendezvous and docking sensor developed using a commercial off the shelf Android-based smartphone. It aims to provide a miniaturized solution for rendezvous and docking, enabling small satellites to conduct proximity operations and formation flying while minimizing interference with a primary payload. Previously, the sensor was limited by a slow (2 Hz) refresh rate and its use of retro-reflectors, both of which contributed to a limited operating environment. To advance the technology readiness level, a modified approach was developed, combining a multi-colored LED target with a focused target-detection algorithm. Alone, the use of an LED system was determined to be much more reliable, though slower, than the retro-reflector system. The focused target-detection system was developed in response to this problem to mitigate the speed reduction of using color. However, it also improved the reliability. In combination these two methods have been demonstrated to dramatically increase sensor speed and allow the sensor to select the target even with significant noise interfering with the sensor, providing millimeter level precision at a range of two meters with a 1U target.

  10. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models.

    PubMed

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health fields. Analyzing the DTI profiling of drugs helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug modes of action. Therefore, it is of high importance to predict the DTI profiling of drugs reliably and rapidly on a genome-wide scale. Here, we develop the TargetNet server, which can make real-time DTI predictions based only on molecular structures, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct the prediction models. Ensemble learning from these fingerprints was also provided to improve the prediction ability. When the user submits a molecule, the server will predict the activity of the molecule across 623 human proteins using the established high-quality SAR models, generating a DTI profile that can be used as a feature vector of chemicals for wide applications. The 623 SAR models, one per human protein, were strictly evaluated and validated by several model validation strategies, resulting in AUC scores of 75-100%. We applied the generated DTI profiling to successfully predict potential targets, toxicity classification, drug-drug interactions, and drug mode of action, sufficiently demonstrating the wide application value of DTI profiling. The TargetNet web server is designed based on the Django framework in Python and is freely accessible at http://targetnet.scbdd.com.

  11. TargetNet: a web service for predicting potential drug-target interaction profiling via multi-target SAR models

    NASA Astrophysics Data System (ADS)

    Yao, Zhi-Jiang; Dong, Jie; Che, Yu-Jing; Zhu, Min-Feng; Wen, Ming; Wang, Ning-Ning; Wang, Shan; Lu, Ai-Ping; Cao, Dong-Sheng

    2016-05-01

    Drug-target interactions (DTIs) are central to current drug discovery processes and public health fields. Analyzing the DTI profiling of drugs helps to infer drug indications, adverse drug reactions, drug-drug interactions, and drug modes of action. Therefore, it is of high importance to predict the DTI profiling of drugs reliably and rapidly on a genome-wide scale. Here, we develop the TargetNet server, which can make real-time DTI predictions based only on molecular structures, following the spirit of multi-target SAR methodology. Naïve Bayes models together with various molecular fingerprints were employed to construct the prediction models. Ensemble learning from these fingerprints was also provided to improve the prediction ability. When the user submits a molecule, the server will predict the activity of the molecule across 623 human proteins using the established high-quality SAR models, generating a DTI profile that can be used as a feature vector of chemicals for wide applications. The 623 SAR models, one per human protein, were strictly evaluated and validated by several model validation strategies, resulting in AUC scores of 75-100%. We applied the generated DTI profiling to successfully predict potential targets, toxicity classification, drug-drug interactions, and drug mode of action, sufficiently demonstrating the wide application value of DTI profiling. The TargetNet web server is designed based on the Django framework in Python and is freely accessible at http://targetnet.scbdd.com.
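
    The modeling core described above can be sketched in a few lines, assuming precomputed binary molecular fingerprints; random bits stand in for real fingerprints here, and one Bernoulli naïve Bayes model per protein mirrors the one-model-per-target design.

        import numpy as np
        from sklearn.naive_bayes import BernoulliNB

        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(200, 1024))    # stand-in binary fingerprints
        y = rng.integers(0, 2, size=200)            # active/inactive vs. one target

        model = BernoulliNB().fit(X, y)             # one such model per protein target
        query = rng.integers(0, 2, size=(1, 1024))  # fingerprint of a submitted molecule
        print("P(active) =", model.predict_proba(query)[0, 1])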

  12. Reliability analysis of the solar array based on Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Jianing, Wu; Shaoze, Yan

    2011-07-01

    The solar array is an important device used on spacecraft; it influences the quality of in-orbit operation of the spacecraft and even the launch. This paper analyzes the reliability of the mechanical system and identifies the most vital subsystem of the solar array. A fault tree analysis (FTA) model is established according to the operating process of the mechanical system, based on the DFH-3 satellite; the logical expression of the top event is obtained by Boolean algebra, and the reliability of the solar array is calculated. The conclusion shows that the hinges are the most vital links between the solar arrays. By analyzing the structural importance (SI) of the hinge's FTA model, some fatal causes, including faults of the seal, insufficient torque of the locking spring, temperature in space, and friction force, can be identified. Damage is the initial stage of a fault, so limiting damage is significant for preventing faults. Furthermore, recommendations for improving reliability through damage limitation are discussed, which can be used in redesigning the solar array and in reliability growth planning.
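
    Once the Boolean reduction yields minimal cut sets, the top-event probability follows directly; a sketch with hypothetical basic-event probabilities and cut sets (the real DFH-3 tree is far larger), using the common rare-event approximation.

        import numpy as np

        # Hypothetical basic-event failure probabilities for one hinge.
        p = {"seal": 1e-3, "spring_torque": 5e-4, "thermal": 2e-4, "friction": 8e-4}

        # Minimal cut sets after Boolean reduction of the tree (illustrative).
        cut_sets = [("seal",), ("spring_torque", "friction"), ("thermal", "friction")]

        # Rare-event (first-order) approximation of the top-event probability.
        p_top = sum(np.prod([p[e] for e in cs]) for cs in cut_sets)
        print(f"P(top event) ~ {p_top:.2e}")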

  13. Reliability and Validity of the Dyadic Observed Communication Scale (DOCS).

    PubMed

    Hadley, Wendy; Stewart, Angela; Hunter, Heather L; Affleck, Katelyn; Donenberg, Geri; Diclemente, Ralph; Brown, Larry K

    2013-02-01

    We evaluated the reliability and validity of the Dyadic Observed Communication Scale (DOCS) coding scheme, which was developed to capture a range of communication components between parents and adolescents. Adolescents and their caregivers were recruited from mental health facilities for participation in a large, multi-site, family-based HIV prevention intervention study. Seventy-one dyads were randomly selected from the larger study sample and coded using the DOCS at baseline. Preliminary validity and reliability of the DOCS were examined using various methods, such as comparison with self-report measures and examination of interrater reliability. Results suggest that the DOCS is a reliable and valid measure of observed communication among parent-adolescent dyads that captures both verbal and nonverbal communication behaviors that are typical intervention targets. The DOCS is a viable coding scheme for use by researchers and clinicians examining parent-adolescent communication. Coders can be trained to reliably capture individual and dyadic components of communication for parents and adolescents, and this complex information can be obtained relatively quickly.

  14. 75 FR 16098 - Reliable Power LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER10-881-000] Reliable Power LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... of Reliable Power, LLC's application for market-based rate authority, with an accompanying rate...

  15. Reliability-based optimization of an active vibration controller using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Saraygord Afshari, Sajad; Pourtakdoust, Seid H.

    2017-04-01

    Many modern industrial systems, such as aircraft, rotating turbines, and satellite booms, cannot perform their desired tasks accurately if their structural vibrations are not controlled properly. Structural health monitoring and online reliability calculation are emerging means of handling system uncertainties. As stochastic forcing is unavoidable in most engineering systems, it often needs to be taken into account in the control design process. In this research, smart-material technology is utilized for structural health monitoring and control in order to keep the system within a reliable performance range. In this regard, a reliability-based cost function is used both for controller gain optimization and for sensor placement. The proposed scheme is implemented and verified for a wing section. Comparison of the frequency responses demonstrates the potential applicability of the presented technique.

  16. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    PubMed

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-10-01

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.

  17. A Canopy Density Model for Planar Orchard Target Detection Based on Ultrasonic Sensors

    PubMed Central

    Li, Hanzhe; Zhai, Changyuan; Weckler, Paul; Wang, Ning; Yang, Shuo; Zhang, Bo

    2016-01-01

    Orchard target-oriented variable-rate spraying is an effective method to reduce pesticide drift and excessive residues. To accomplish this task, characteristic information about the orchard targets is needed to control the liquid flow rate and airflow rate. One of the most important characteristics is the canopy density. In order to establish a canopy density model for a planar orchard target, which is indispensable for canopy density calculation, a target density detection testing system was developed based on an ultrasonic sensor. A time-domain energy analysis method was employed to analyze the ultrasonic signal. Orthogonal regression central composite experiments were designed and conducted using man-made canopies of known density with three or four layers of leaves. Two model equations were obtained, of which the model for canopies with four layers was found to be the more reliable. A verification test was conducted with different numbers of layers at the same density values and detecting distances. The test results showed that the relative errors between model density values and actual values for five, four, three, and two layers of leaves were acceptable, with maximum relative errors of 17.68%, 25.64%, 21.33%, and 29.92%, respectively. This also suggested that the four-layer model equation was applicable to different numbers of layers, with applicability increasing for adjacent layer counts. PMID:28029132

  18. Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire

    NASA Astrophysics Data System (ADS)

    Gladyshev, Pavel; Almansoori, Afrah

    RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.

  19. Space-based infrared sensors of space target imaging effect analysis

    NASA Astrophysics Data System (ADS)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

    The target identification problem is one of the core problems of ballistic missile defense systems, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a point-source imaging model of ballistic targets above the atmosphere for space-based infrared sensors; it then simulates the infrared imaging of such targets from two aspects, the space-based sensor's camera parameters and the target's characteristics, analyzing the imaging effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target.

  20. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: what new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods? Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science with many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure times and other project data to provide earlier and more accurate estimates of system reliability.

  1. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    PubMed Central

    Fan, Wei; Li, Rong; Li, Sifan; Ping, Wenli; Li, Shujun; Naumova, Alexandra; Peelen, Tamara; Yuan, Zheng; Zhang, Dabing

    2016-01-01

    Reliable methods are needed to detect the presence of tobacco components in tobacco products, both to effectively control smuggling and to classify tariff and excise duties in the tobacco industry in order to curb illegal tobacco trade. In this study, two sensitive and specific DNA-based methods, a quantitative real-time PCR (qPCR) assay and a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of the uridine 5′-monophosphate synthase (UMPS) gene, and their specificities and sensitivities were determined with various plant materials. Both the qPCR and LAMP methods were reliable and accurate in the rapid detection of tobacco components in various practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for application in tobacco identification, particularly in special cases where the morphology or chemical composition of the tobacco has been disrupted. Therefore, combining both methods would facilitate not only the control of tobacco smuggling but also tariff classification and excise collection. PMID:27635142

  2. Discriminating between camouflaged targets by their time of detection by a human-based observer assessment method

    NASA Astrophysics Data System (ADS)

    Selj, G. K.; Søderblom, M.

    2015-10-01

    Detection of a camouflaged object in natural scenery requires the target to be distinguishable from its local background. The development of any new camouflage pattern therefore has to rely on a well-founded test methodology, correlated with the final purpose of the pattern, as well as an evaluation procedure containing optimal criteria for (i) discriminating between the targets and (ii) producing a final ranking of the targets. In this study we present results from a recent camouflage assessment trial in which human observers were used in a search-by-photo methodology to assess generic test camouflage patterns. We conducted a study to investigate possible improvements in camouflage patterns for battle dress uniforms. The aim was a comparative study of potential generic patterns intended for use in arid areas (sparsely vegetated, semi-desert). We developed a test methodology intended to be simple, reliable, and realistic with respect to the operational benefit of camouflage. We therefore chose to conduct a human-based observer trial founded on imagery of realistic targets in natural backgrounds. Inspired by a recent, similar trial in the UK, we developed new, purpose-built software to conduct the observer trial. Our preferred assessment methodology, the observer trial, was based on target recordings in 12 different but operationally relevant scenes collected in a dry and sparsely vegetated area (Rhodes). The scenes were chosen with the intention of spanning as broadly as possible. The targets were human-shaped mannequins situated identically in each scene to allow a relative comparison of camouflage effectiveness per scene. Tests of significance among the targets' performance were carried out with non-parametric tests, as the corresponding time-of-detection distributions were overall found difficult to parameterize. From the trial, containing 12 different scenes from
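
    The non-parametric comparison step is standard; a minimal sketch using scipy's Mann-Whitney U test on invented times-to-detection for two candidate patterns in one scene (longer times mean better camouflage).

        from scipy.stats import mannwhitneyu

        pattern_a = [4.2, 7.9, 12.5, 5.1, 9.8, 15.0, 6.3]   # invented detection times (s)
        pattern_b = [2.1, 3.5, 4.0, 6.2, 2.8, 5.5, 3.9]

        stat, pval = mannwhitneyu(pattern_a, pattern_b, alternative="two-sided")
        print(f"U = {stat}, p = {pval:.3f}")   # small p: the patterns differ reliably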

  3. The weakest t-norm based intuitionistic fuzzy fault-tree analysis to evaluate system reliability.

    PubMed

    Kumar, Mohit; Yadav, Shiv Prasad

    2012-07-01

    In this paper, a new approach to intuitionistic fuzzy fault-tree analysis is proposed to evaluate system reliability and to find the most critical system component affecting it. A weakest-t-norm-based intuitionistic fuzzy fault-tree analysis is presented to calculate the fault intervals of system components by integrating experts' knowledge and experience, expressed as possibilities of failure of the bottom events. It applies fault-tree analysis, α-cuts of intuitionistic fuzzy sets, and Tω (weakest t-norm) based arithmetic operations on triangular intuitionistic fuzzy sets to obtain the fault interval and reliability interval of the system. This paper also modifies Tanaka et al.'s fuzzy fault-tree definition. For numerical verification, a malfunction of the weapon system "automatic gun" is presented as an example. The result of the proposed method is compared with existing reliability analysis approaches. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
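
    The distinctive arithmetic can be sketched for ordinary triangular numbers (left, mode, right): under the weakest t-norm Tω, the spreads of a sum combine by max rather than by addition, which keeps propagated fault intervals from inflating through the tree. The rare-event OR-gate reading and all numbers below are illustrative assumptions, not the paper's worked example.

        def tw_add(a, b):
            """Weakest-t-norm addition of triangular numbers (left, mode, right)."""
            m = a[1] + b[1]
            left = max(a[1] - a[0], b[1] - b[0])    # spreads combine by max, not sum
            right = max(a[2] - a[1], b[2] - b[1])
            return (m - left, m, m + right)

        # Illustrative triangular fault possibilities of two basic events.
        q1 = (0.002, 0.004, 0.007)
        q2 = (0.001, 0.003, 0.004)
        print(tw_add(q1, q2))   # rare-event OR gate -> (0.005, 0.007, 0.010)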

  4. Interpretive Reliability of Six Computer-Based Test Interpretation Programs for the Minnesota Multiphasic Personality Inventory-2.

    PubMed

    Deskovitz, Mark A; Weed, Nathan C; McLaughlan, Joseph K; Williams, John E

    2016-04-01

    The reliability of six Minnesota Multiphasic Personality Inventory-Second edition (MMPI-2) computer-based test interpretation (CBTI) programs was evaluated across a set of 20 commonly appearing MMPI-2 profile codetypes in clinical settings. Evaluation of CBTI reliability comprised examination of (a) interrater reliability, the degree to which raters arrive at similar inferences based on the same CBTI profile and (b) interprogram reliability, the level of agreement across different CBTI systems. Profile inferences drawn by four raters were operationalized using q-sort methodology. Results revealed no significant differences overall with regard to interrater and interprogram reliability. Some specific CBTI/profile combinations (e.g., the CBTI by Automated Assessment Associates on a within normal limits profile) and specific profiles (e.g., the 4/9 profile displayed greater interprogram reliability than the 2/4 profile) were interpreted with variable consensus (α range = .21-.95). In practice, users should consider that certain MMPI-2 profiles are interpreted more or less consensually and that some CBTIs show variable reliability depending on the profile. © The Author(s) 2015.

  5. Modern methodology of designing target reliability into rotating mechanical components

    NASA Technical Reports Server (NTRS)

    Kececioglu, D. B.; Chester, L. B.

    1973-01-01

    Experimentally determined distributional cycles-to-failure versus maximum alternating nominal strength (S-N) diagrams, and distributional mean nominal strength versus maximum alternating nominal strength (Goodman) diagrams, are presented. These distributional S-N and Goodman diagrams are for AISI 4340 steel of Rockwell C 35/40 hardness: round, cylindrical specimens 0.735 in. in diameter and 6 in. long, with a circumferential groove of 0.145 in. radius for a theoretical stress concentration of 1.42 and of 0.034 in. radius for a stress concentration of 2.34. The specimens were subjected to reversed bending and steady torque in three specially built complex-fatigue research machines. Based on these results, the effects of superimposing steady torque on reversed bending, and of the various stress concentrations, on the distributional S-N and Goodman diagrams and on service life are established. In addition, a computer program for determining the three-parameter Weibull distribution representing the cycles-to-failure data, and two methods for calculating the reliability of components subjected to cumulative fatigue loads, are given.
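
    Fitting the three-parameter Weibull distribution named here is routine today; a hedged sketch using scipy's weibull_min, whose shape/location/scale parameterization corresponds to the Weibull slope beta, minimum life gamma, and characteristic life eta (the cycles-to-failure values are invented).

        import numpy as np
        from scipy.stats import weibull_min

        cycles = np.array([1.8e5, 2.3e5, 2.9e5, 3.4e5, 4.1e5,
                           4.8e5, 5.5e5, 6.7e5, 8.2e5, 9.6e5])   # invented failures

        beta, gamma, eta = weibull_min.fit(cycles)   # 3-parameter MLE
        print(f"beta = {beta:.2f}, gamma = {gamma:.3g}, eta = {eta:.3g}")

        # Reliability at N cycles: R(N) = exp(-((N - gamma) / eta)**beta) for N > gamma.
        print(f"R(3e5 cycles) = {weibull_min.sf(3.0e5, beta, gamma, eta):.3f}")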

  6. Deep-Learning-Based Drug-Target Interaction Prediction.

    PubMed

    Wen, Ming; Zhang, Zhimin; Niu, Shaoyu; Sha, Haozhi; Yang, Ruihan; Yun, Yonghuan; Lu, Hongmei

    2017-04-07

    Identifying interactions between known drugs and targets is a major challenge in drug repositioning. In silico prediction of drug-target interaction (DTI) can speed up the expensive and time-consuming experimental work by providing the most potent DTIs. In silico prediction of DTI can also provide insights into potential drug-drug interactions and promote the exploration of drug side effects. Traditionally, the performance of DTI prediction depends heavily on the descriptors used to represent the drugs and the target proteins. In this paper, to accurately predict new DTIs between approved drugs and targets without separating the targets into different classes, we developed a deep-learning-based algorithmic framework named DeepDTIs. It first abstracts representations from raw input descriptors using unsupervised pretraining and then applies known label pairs of interaction to build a classification model. Compared with other methods, DeepDTIs is found to match or outperform other state-of-the-art methods. DeepDTIs can further be used to predict whether a new drug targets some existing targets or whether a new target interacts with some existing drugs.
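
    A hedged sketch of the idea at its simplest, assuming a drug is represented by a fingerprint vector and a protein by a sequence-descriptor vector; random features stand in for both, and a small scikit-learn network replaces the paper's pretrained deep architecture.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        drug_fp = rng.random((300, 256))     # stand-in drug fingerprints
        prot_desc = rng.random((300, 128))   # stand-in protein descriptors
        X = np.hstack([drug_fp, prot_desc])  # one row per drug-target pair
        y = rng.integers(0, 2, size=300)     # 1 = known interaction, 0 = none

        clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300).fit(X, y)
        print("P(interaction) for first pair:", clf.predict_proba(X[:1])[0, 1])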

  7. A High-Speed Target-Free Vision-Based Sensor for Bus Rapid Transit Viaduct Vibration Measurements Using CMT and ORB Algorithms.

    PubMed

    Hu, Qijun; He, Songsheng; Wang, Shilong; Liu, Yugang; Zhang, Zutao; He, Leping; Wang, Fubin; Cai, Qijie; Shi, Rendan; Yang, Yuan

    2017-06-06

    Bus Rapid Transit (BRT) has become an increasingly important mode of public transportation in modern cities. Traditional contact-sensing techniques for health monitoring of BRT viaducts cannot overcome the deficiency that the normal free flow of traffic is blocked. Advances in computer vision technology provide a new line of thought for solving this problem. In this study, a high-speed target-free vision-based sensor is proposed to measure the vibration of structures without interrupting traffic. An improved keypoint-matching algorithm based on the consensus-based matching and tracking (CMT) object tracking algorithm is adopted and further developed together with the Oriented FAST and Rotated BRIEF (ORB) keypoint detection algorithm for practicable and effective tracking of objects. Moreover, by synthesizing the existing scaling-factor calculation methods, more rational approaches to reducing errors are implemented. The performance of the vision-based sensor is evaluated through a series of laboratory tests. Experimental tests with different target types, frequencies, amplitudes and motion patterns are conducted. The performance of the method is satisfactory, which indicates that the vision sensor can extract accurate structural vibration signals by tracking either artificial or natural targets. Field tests further demonstrate that the vision sensor is both practicable and reliable.

  8. A High-Speed Target-Free Vision-Based Sensor for Bus Rapid Transit Viaduct Vibration Measurements Using CMT and ORB Algorithms

    PubMed Central

    Hu, Qijun; He, Songsheng; Wang, Shilong; Liu, Yugang; Zhang, Zutao; He, Leping; Wang, Fubin; Cai, Qijie; Shi, Rendan; Yang, Yuan

    2017-01-01

    Bus Rapid Transit (BRT) has become an increasingly important mode of public transportation in modern cities. Traditional contact-sensing techniques for health monitoring of BRT viaducts cannot overcome the deficiency that the normal free flow of traffic is blocked. Advances in computer vision technology provide a new line of thought for solving this problem. In this study, a high-speed target-free vision-based sensor is proposed to measure the vibration of structures without interrupting traffic. An improved keypoint-matching algorithm based on the consensus-based matching and tracking (CMT) object tracking algorithm is adopted and further developed together with the Oriented FAST and Rotated BRIEF (ORB) keypoint detection algorithm for practicable and effective tracking of objects. Moreover, by synthesizing the existing scaling-factor calculation methods, more rational approaches to reducing errors are implemented. The performance of the vision-based sensor is evaluated through a series of laboratory tests. Experimental tests with different target types, frequencies, amplitudes and motion patterns are conducted. The performance of the method is satisfactory, which indicates that the vision sensor can extract accurate structural vibration signals by tracking either artificial or natural targets. Field tests further demonstrate that the vision sensor is both practicable and reliable. PMID:28587275
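
    The keypoint machinery named in these abstracts is available in OpenCV, so the detection-and-matching step can be sketched directly; the frame file names are placeholders, and the CMT-style consensus voting around the matches is omitted.

        import cv2

        img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # placeholder frames
        img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

        orb = cv2.ORB_create(nfeatures=500)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        # Hamming-distance brute-force matching suits ORB's binary descriptors.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        dx = kp2[matches[0].trainIdx].pt[0] - kp1[matches[0].queryIdx].pt[0]
        dy = kp2[matches[0].trainIdx].pt[1] - kp1[matches[0].queryIdx].pt[1]
        print(f"{len(matches)} matches; best-match displacement: ({dx:.1f}, {dy:.1f}) px")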

  9. Reliability Analysis of Sealing Structure of Electromechanical System Based on Kriging Model

    NASA Astrophysics Data System (ADS)

    Zhang, F.; Wang, Y. M.; Chen, R. W.; Deng, W. W.; Gao, Y.

    2018-05-01

    The sealing performance of an aircraft electromechanical system has a great influence on flight safety, so the reliability of its typical seal structures is analyzed by researchers. In this paper, we take a reciprocating seal structure as the research object for studying structural reliability. Based on finite element numerical simulation, the contact stress between the rubber sealing ring and the cylinder wall is calculated, the relationship between the contact stress and the pressure of the hydraulic medium is established, and the friction forces under different working conditions are compared. Through co-simulation, an adaptive Kriging model obtained with the expected feasibility function (EFF) learning mechanism is used to describe the failure probability of the seal ring and thereby evaluate the reliability of the sealing structure. This article proposes a new numerical approach to the reliability analysis of sealing structures and also provides a theoretical basis for their optimal design.

  10. Exploring Polypharmacology Using a ROCS-Based Target Fishing Approach

    DTIC Science & Technology

    2012-01-01

    target representatives. Target profiles were then generated for a given query molecule by computing maximal shape/chemistry overlap between the query... molecule and the drug sets assigned to each protein target. The overlap was computed using the program ROCS (Rapid Overlay of Chemical Structures). We... approaches in off-target prediction has been reviewed. Many structure-based target fishing (SBTF) approaches, such as INVDOCK and Target Fishing Dock

  11. Probabilistic confidence for decisions based on uncertain reliability estimates

    NASA Astrophysics Data System (ADS)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.

  12. Optical simulation of flying targets using physically based renderer

    NASA Astrophysics Data System (ADS)

    Cheng, Ye; Zheng, Quan; Peng, Junkai; Lv, Pin; Zheng, Changwen

    2018-02-01

    The simulation of aerial flying targets is widely needed in many fields. This paper proposes a physically based method for the optical simulation of flying targets. In the first step, three-dimensional target models are built and the motion speed and direction are defined. Next, the material of the target's outward appearance is simulated. Then the illumination conditions are defined. Once all definitions are given, the settings are encoded in a description file. Finally, simulated results are generated by Monte Carlo ray tracing in a physically based renderer. Experiments show that this method is able to simulate materials, lighting and motion blur for flying targets, and that it can generate convincing, high-quality simulation results.

  13. Tracking target objects orbiting earth using satellite-based telescopes

    DOEpatents

    De Vries, Willem H; Olivier, Scot S; Pertica, Alexander J

    2014-10-14

    A system for tracking objects that are in earth orbit via a constellation or network of satellites having imaging devices is provided. An object tracking system includes a ground controller and, for each satellite in the constellation, an onboard controller. The ground controller receives ephemeris information for a target object and directs that ephemeris information be transmitted to the satellites. Each onboard controller receives ephemeris information for a target object, collects images of the target object based on the expected location of the target object at an expected time, identifies actual locations of the target object from the collected images, and identifies a next expected location at a next expected time based on the identified actual locations of the target object. The onboard controller processes the collected image to identify the actual location of the target object and transmits the actual location information to the ground controller.

  14. Target Abundance-Based Fitness Screening (TAFiS) Facilitates Rapid Identification of Target-Specific and Physiologically Active Chemical Probes

    PubMed Central

    Butts, Arielle; DeJarnette, Christian; Peters, Tracy L.; Parker, Josie E.; Kerns, Morgan E.; Eberle, Karen E.; Kelly, Steve L.

    2017-01-01

    Traditional approaches to drug discovery are frustratingly inefficient and have several key limitations that severely constrain our capacity to rapidly identify and develop novel experimental therapeutics. To address this, we have devised a second-generation target-based whole-cell screening assay based on the principles of competitive fitness, which can rapidly identify target-specific and physiologically active compounds. Briefly, strains expressing high, intermediate, and low levels of a preselected target protein are constructed, tagged with spectrally distinct fluorescent proteins (FPs), and pooled. The pooled strains are then grown in the presence of various small molecules, and the relative growth of each strain within the mixed culture is compared by measuring the intensity of the corresponding FP tags. Chemical-induced population shifts indicate that the bioactivity of a small molecule is dependent upon the target protein's abundance and thus establish a specific functional interaction. Here, we describe the molecular tools required to apply this technique in the prevalent human fungal pathogen Candida albicans and validate the approach using two well-characterized drug targets, lanosterol demethylase and dihydrofolate reductase. However, our approach, which we have termed target abundance-based fitness screening (TAFiS), should be applicable to a wide array of molecular targets and in essentially any genetically tractable microbe. IMPORTANCE Conventional drug screening typically employs either target-based or cell-based approaches. The first group relies on biochemical assays to detect modulators of a purified target. However, hits frequently lack drug-like characteristics such as membrane permeability and target specificity. Cell-based screens identify compounds that induce a desired phenotype, but the target is unknown, which severely restricts further development and optimization. To address these issues, we have developed a second

  15. Effect of clinically discriminating, evidence-based checklist items on the reliability of scores from an Internal Medicine residency OSCE.

    PubMed

    Daniels, Vijay J; Bordage, Georges; Gierl, Mark J; Yudkowsky, Rachel

    2014-10-01

    Objective structured clinical examinations (OSCEs) are used worldwide for summative examinations but often lack acceptable reliability. Research has shown that reliability of scores increases if OSCE checklists for medical students include only clinically relevant items. Also, checklists are often missing evidence-based items that high-achieving learners are more likely to use. The purpose of this study was to determine if limiting checklist items to clinically discriminating items and/or adding missing evidence-based items improved score reliability in an Internal Medicine residency OSCE. Six internists reviewed the traditional checklists of four OSCE stations classifying items as clinically discriminating or non-discriminating. Two independent reviewers augmented checklists with missing evidence-based items. We used generalizability theory to calculate overall reliability of faculty observer checklist scores from 45 first and second-year residents and predict how many 10-item stations would be required to reach a Phi coefficient of 0.8. Removing clinically non-discriminating items from the traditional checklist did not affect the number of stations (15) required to reach a Phi of 0.8 with 10 items. Focusing the checklist on only evidence-based clinically discriminating items increased test score reliability, needing 11 stations instead of 15 to reach 0.8; adding missing evidence-based clinically discriminating items to the traditional checklist modestly improved reliability (needing 14 instead of 15 stations). Checklists composed of evidence-based clinically discriminating items improved the reliability of checklist scores and reduced the number of stations needed for acceptable reliability. Educators should give preference to evidence-based items over non-evidence-based items when developing OSCE checklists.
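
    The projection behind "how many stations reach Phi = 0.8" is a generalizability-theory decision study; a sketch assuming known (here invented) variance components, with Phi(n) = var_person / (var_person + var_error / n).

        import math

        var_person, var_error = 0.15, 0.85   # invented variance components

        def phi(n_stations):
            """Projected dependability of an n-station OSCE (D-study)."""
            return var_person / (var_person + var_error / n_stations)

        # Smallest n with phi(n) >= 0.8: n >= (0.8 / 0.2) * var_error / var_person.
        n = math.ceil(4 * var_error / var_person)
        print(f"stations needed: {n}, phi = {phi(n):.3f}")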

  16. Image based detection and targeting of therapy resistance in pancreatic adenocarcinoma

    PubMed Central

    Jaquish, Dawn V.; Park, Frederick D.; Ito, Takahiro; Bajaj, Jeevisha; Koechlein, Claire S.; Zimdahl, Bryan; Yano, Masato; Kopp, Janel; Kritzik, Marcie; Sicklick, Jason; Sander, Maike; Grandgenett, Paul M.; Hollingsworth, Michael A.; Shibata, Shinsuke; Pizzo, Donald; Valasek, Mark; Sasik, Roman; Scadeng, Miriam; Okano, Hideyuki; Kim, Youngsoo; MacLeod, A. Robert

    2016-01-01

    Pancreatic intraepithelial neoplasia (PanIN) is a premalignant lesion that can progress to pancreatic ductal adenocarcinoma, a highly lethal malignancy marked by its late stage at clinical presentation and profound drug resistance. The genomic alterations that commonly occur in pancreatic cancer include activation of KRAS and inactivation of p53 and SMAD4. To date, however, it has been challenging to target these pathways therapeutically; thus the search for other key mediators of pancreatic cancer growth remains an important endeavor. Here we show that the stem cell determinant Musashi (Msi) is a critical element of pancreatic cancer progression in both genetic models and patient-derived xenografts. Specifically, we developed Msi reporter mice that allowed image-based tracking of stem cell signals within cancers, revealing that Msi expression rises as PanIN progresses to adenocarcinoma, and that Msi-expressing cells are key drivers of pancreatic cancer: they preferentially harbor the capacity to propagate adenocarcinoma, are enriched in circulating tumor cells, and are markedly drug resistant. This population could be effectively targeted by deletion of either Msi1 or Msi2, which led to a striking defect in PanIN progression to adenocarcinoma and an improvement in overall survival. Msi inhibition also blocked the growth of primary patient-derived tumors, suggesting that this signal is required for human disease. To define the translational potential of this work we developed antisense oligonucleotides against Msi; these showed reliable tumor penetration, uptake and target inhibition, and effectively blocked pancreatic cancer growth. Collectively, these studies highlight Msi reporters as a unique tool to identify therapy resistance, and define Msi signaling as a central regulator of pancreatic cancer. PMID:27281208

  17. Can target-to-pons ratio be used as a reliable method for the analysis of [11C]PIB brain scans?

    PubMed

    Edison, P; Hinz, R; Ramlackhansingh, A; Thomas, J; Gelosa, G; Archer, H A; Turkheimer, F E; Brooks, D J

    2012-04-15

    [(11)C]PIB is the most widely used PET imaging marker for amyloid in dementia studies. In the majority of studies the cerebellum has been used as a reference region. However, cerebellar amyloid may be present in genetic Alzheimer's disease (AD), cerebral amyloid angiopathy, and prion diseases. Therefore, we investigated whether the pons could be used as an alternative reference region for the analysis of [(11)C]PIB binding in AD. The aims of the study were to: 1) evaluate the pons as a reference region using an arterial plasma input function and Logan graphical analysis of binding; 2) assess the power of target-to-pons ratios to discriminate controls from AD subjects; 3) determine the test-retest reliability in AD subjects; and 4) demonstrate the application of the target-to-pons ratio in subjects with elevated cerebellar [(11)C]PIB binding. Twelve sporadic AD subjects aged 65 ± 4.5 years with a mean MMSE of 21.4 ± 4 and 10 age-matched control subjects had [(11)C]PIB PET with arterial blood sampling. Three additional subjects (two pre-symptomatic presenilin-1 mutation carriers and one with probable familial AD) were also studied. Object maps were created by segmenting individual MRIs, spatially transforming the gray matter images into standard stereotaxic MNI space, and then superimposing a probabilistic atlas. Cortical [(11)C]PIB binding was assessed with a region of interest (ROI) analysis. Parametric maps of the volume of distribution (V(T)) were generated with Logan analysis. Additionally, parametric maps of the 60-90 min target-to-cerebellar ratio (RATIO(CER)) and the 60-90 min target-to-pons ratio (RATIO(PONS)) were computed. All three approaches were able to differentiate AD from controls (p < 0.0001, nonparametric Wilcoxon rank sum test) in the target regions, with RATIO(CER) and RATIO(PONS) differences higher than V(T) with use of an arterial input function. All methods had good reproducibility (intraclass correlation coefficient > 0.83); RATIO(CER) performed best closely
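
    Once ROI time-activity curves exist, the ratio method itself is a one-liner; a sketch computing the 60-90 min target-to-pons ratio from invented frame data.

        import numpy as np

        frame_mid = np.array([62.5, 67.5, 72.5, 77.5, 82.5, 87.5])  # min post-injection
        cortex = np.array([2.41, 2.38, 2.35, 2.30, 2.27, 2.24])     # target ROI (kBq/mL)
        pons = np.array([1.05, 1.03, 1.02, 1.00, 0.99, 0.98])       # reference ROI

        w = (frame_mid >= 60) & (frame_mid <= 90)                   # 60-90 min window
        print(f"target-to-pons ratio = {cortex[w].mean() / pons[w].mean():.2f}")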

  18. Acceptability of Service Targets for ICT-Based Healthcare.

    PubMed

    Jeon, Eun Min; Seo, Hwa Jeong

    2016-10-01

    In order to adopt and activate telemedicine, it is necessary to survey how medical staff, who are the providers of medical services, and consumers, who are the service targets, perceive information and communication technology (ICT)-based healthcare services. This study surveyed the awareness and acceptability of ICT-based healthcare among service targets, specifically workers and students living in the Seoul and Gyeonggi regions who are consumers of healthcare services. To determine the correlations among awareness of ICT-based healthcare, the need for self-management, and acceptability, this study conducted a correlation analysis and a simple regression analysis. According to the responses on the need for ICT-based healthcare services by item, blood pressure (n = 279, 94.3%) and glucose (n = 277, 93.6%) were the most requested physiological signal monitoring areas. Among the six measurement factors affecting ICT-based healthcare service acceptability, age, health concerns, and effect expectation had the most significant effects. As effect expectation increased, acceptability became 4.38 times higher (p < 0.05). This study identified a positive awareness of ICT-based healthcare services among service targets. The fact that acceptability is higher among people who have a family disease history or greater health concerns may lead to more active participation by service targets. This study also confirmed that a policy to motivate active participation of those in their 40s (who had high prevalence rates) was needed.

  19. Category-based attentional guidance can operate in parallel for multiple target objects.

    PubMed

    Jenkins, Michael; Grubert, Anna; Eimer, Martin

    2018-05-01

    The question of whether the control of attention during visual search is always feature-based or can also be based on the category of objects remains unresolved. Here, we employed the N2pc component as an on-line marker for target selection processes to compare the efficiency of feature-based and category-based attentional guidance. Two successive displays containing pairs of real-world objects (line drawings of kitchen or clothing items) were separated by a 10 ms SOA. In Experiment 1, target objects were defined by their category. In Experiment 2, one specific visual object served as the target (exemplar-based search). On different trials, targets appeared either in one or in both displays, and participants had to report the number of targets (one or two). Target N2pc components were larger and emerged earlier during exemplar-based search than during category-based search, demonstrating the superior efficiency of feature-based attentional guidance. On trials where target objects appeared in both displays, both targets elicited N2pc components that overlapped in time, suggesting that attention was allocated in parallel to these target objects. Critically, this was the case not only in the exemplar-based task, but also when targets were defined by their category. These results demonstrate that attention can be guided by object categories, and that this type of category-based attentional control can operate concurrently for multiple target objects. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Intercepting a moving target: On-line or model-based control?

    PubMed

    Zhao, Huaiyong; Warren, William H

    2017-05-01

    When walking to intercept a moving target, people take an interception path that appears to anticipate the target's trajectory. According to the constant bearing strategy, the observer holds the bearing direction of the target constant based on current visual information, consistent with on-line control. Alternatively, the interception path might be based on an internal model of the target's motion, known as model-based control. To investigate these two accounts, participants walked to intercept a moving target in a virtual environment. We degraded the target's visibility by blurring the target to varying degrees in the midst of a trial, in order to influence its perceived speed and position. Reduced levels of visibility progressively impaired interception accuracy and precision; total occlusion impaired performance most and yielded nonadaptive heading adjustments. Thus, performance strongly depended on current visual information and deteriorated qualitatively when it was withdrawn. The results imply that locomotor interception is normally guided by current information rather than an internal model of target motion, consistent with on-line control.

  1. Reliability optimization design of the gear modification coefficient based on the meshing stiffness

    NASA Astrophysics Data System (ADS)

    Wang, Qianqian; Wang, Hui

    2018-04-01

    Since the time-varying meshing stiffness of a gear system is the key factor affecting gear vibration, it is important to design the meshing stiffness to reduce vibration. Based on the effect of the gear modification coefficient on the meshing stiffness, and considering random parameters, the reliability optimization design of the gear modification coefficient is investigated. The dimension reduction and point estimation method is used to estimate the moments of the limit state function, and the reliability is obtained by the fourth-moment method. The comparison of the dynamic amplitude results before and after optimization indicates that the research is useful for reducing vibration and noise and improving reliability.
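
    A minimal sketch of the moment-based reliability idea described above, assuming independent normal random parameters and a toy limit-state function (both hypothetical). Moments of g(X) are estimated with a tensor-product Gauss-Hermite point grid rather than the paper's dimension-reduction scheme, and a simple second-moment index is reported; the fourth-moment method would refine beta using the skewness and kurtosis that the sketch also computes:

    ```python
    import numpy as np
    from itertools import product
    from math import erf, sqrt
    from numpy.polynomial.hermite_e import hermegauss

    def moments_point_estimate(g, mu, sigma, n_pts=7):
        """First four moments of g(X), X_i ~ N(mu_i, sigma_i^2) independent,
        via a tensor-product Gauss-Hermite point-estimate grid (small dims)."""
        z, w = hermegauss(n_pts)          # probabilists' nodes and weights
        w = w / w.sum()                   # normalize weights to probabilities
        vals, probs = [], []
        for idx in product(range(n_pts), repeat=len(mu)):
            x = np.array([m + s * z[i] for m, s, i in zip(mu, sigma, idx)])
            vals.append(g(x))
            probs.append(np.prod([w[i] for i in idx]))
        vals, probs = np.array(vals), np.array(probs)
        mean = np.sum(probs * vals)
        c = vals - mean
        var = np.sum(probs * c ** 2)
        skew = np.sum(probs * c ** 3) / var ** 1.5
        kurt = np.sum(probs * c ** 4) / var ** 2
        return mean, np.sqrt(var), skew, kurt

    # Hypothetical limit state, g > 0 means safe (allowable minus actual load)
    g = lambda x: 350.0 - x[0] * x[1]
    mean, std, skew, kurt = moments_point_estimate(g, mu=[10.0, 30.0], sigma=[0.5, 2.0])
    beta = mean / std                        # second-moment reliability index
    pf = 0.5 * (1 - erf(beta / sqrt(2)))     # Pf ~ Phi(-beta)
    print(beta, pf, skew, kurt)
    ```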

  2. Solution NMR Spectroscopy in Target-Based Drug Discovery.

    PubMed

    Li, Yan; Kang, Congbao

    2017-08-23

    Solution NMR spectroscopy is a powerful tool to study protein structures and dynamics under physiological conditions. This technique is particularly useful in target-based drug discovery projects as it provides protein-ligand binding information in solution. Accumulated studies have shown that NMR will play increasingly important roles in multiple steps of the drug discovery process. In a fragment-based drug discovery process, ligand-observed and protein-observed NMR spectroscopy can be applied to screen fragments with low binding affinities. The screened fragments can be further optimized into drug-like molecules. In combination with other biophysical techniques, NMR can guide structure-based drug discovery. In this review, we describe the possible roles of NMR spectroscopy in drug discovery. We also illustrate the challenges encountered in the drug discovery process. We include several examples demonstrating the roles of NMR in target-based drug discovery, such as hit identification, ranking ligand binding affinities, and mapping the ligand binding site. We also speculate on possible roles of NMR in target engagement, based on recent progress in in-cell NMR spectroscopy.

  3. A comparison of manual anthropometric measurements with Kinect-based scanned measurements in terms of precision and reliability.

    PubMed

    Bragança, Sara; Arezes, Pedro; Carvalho, Miguel; Ashdown, Susan P; Castellucci, Ignacio; Leão, Celina

    2018-01-01

    Collecting anthropometric data for real-life applications demands a high degree of precision and reliability, so it is important to test new equipment that will be used for data collection. The objective was to compare two anthropometric data-gathering techniques - manual methods and a Kinect-based 3D body scanner - to understand which of them gives more precise and reliable results. The data were collected using a measuring tape and a Kinect-based 3D body scanner and were evaluated in terms of precision by considering the regular and relative Technical Error of Measurement, and in terms of reliability by using the Intraclass Correlation Coefficient, Reliability Coefficient, Standard Error of Measurement and Coefficient of Variation. The results obtained showed that both methods presented better results for reliability than for precision. Both methods showed relatively good results for these two variables; however, manual methods had better results for some body measurements. Despite being considered sufficiently precise and reliable for certain applications (e.g. the apparel industry), the 3D scanner tested showed, for almost every anthropometric measurement, a different result than the manual technique. Many companies design their products based on data obtained from 3D scanners; hence, understanding the precision and reliability of the equipment used is essential to obtain feasible results.
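
    As a concrete illustration of the precision statistic used above, the following sketch computes the absolute and relative Technical Error of Measurement from two repeated measurement series; the values are hypothetical, not the study's data:

    ```python
    import numpy as np

    def tem(m1, m2):
        """Absolute Technical Error of Measurement for two repeated series:
        TEM = sqrt(sum(d_i^2) / (2n)), d_i the within-pair differences."""
        d = np.asarray(m1, float) - np.asarray(m2, float)
        return np.sqrt(np.sum(d ** 2) / (2 * len(d)))

    def relative_tem(m1, m2):
        """%TEM: TEM expressed as a percentage of the overall mean."""
        grand = np.mean(np.concatenate([np.asarray(m1, float), np.asarray(m2, float)]))
        return 100.0 * tem(m1, m2) / grand

    # Hypothetical repeated waist-circumference measurements (cm)
    first_pass = [72.1, 80.4, 68.9, 91.2]
    second_pass = [72.5, 79.8, 69.3, 90.6]
    print(tem(first_pass, second_pass), relative_tem(first_pass, second_pass))
    ```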

  4. THE DYNAMIC LEAP AND BALANCE TEST (DLBT): A TEST-RETEST RELIABILITY STUDY

    PubMed Central

    Newman, Thomas M.; Smith, Brent I.; John Miller, Sayers

    2017-01-01

    Background: There is a need for new clinical assessment tools to test dynamic balance during typical functional movements. Common methods for assessing dynamic balance, such as the Star Excursion Balance Test, which requires controlled movement of body segments over an unchanged base of support, may not adequately measure typical functional movements that involve controlled movement of body segments along with a change in base of support. Purpose/Hypothesis: The purpose of this study was to determine the reliability of the Dynamic Leap and Balance Test (DLBT) by assessing its test-retest reliability. It was hypothesized that there would be no statistically significant differences between testing days in the time taken to complete the test. Study Design: Reliability study. Methods: Thirty healthy college-aged individuals participated in this study. Participants performed a series of leaps in a prescribed sequence, unique to the DLBT test. The time required by the participants to complete the 20-leap task was the dependent variable. Subjects leaped back and forth from peripheral to central targets, alternating weight bearing from one leg to the other. Participants landed on the central target with the tested limb and were required to stabilize for two seconds before leaping to the next target. Stability was based upon qualitative measures similar to the Balance Error Scoring System. Each assessment comprised three trials and was performed on two days separated by at least six days. Results: A two-way mixed ANOVA was used to analyze the differences in time to complete the sequence between the three-trial averages of the two testing sessions. The Intraclass Correlation Coefficient (ICC3,1) was used to establish between-session test-retest reliability of the test trial averages. Significance was set a priori at p ≤ 0.05. No significant differences (p > 0.05) were detected between the two testing sessions. The ICC was 0.93 with a 95% confidence interval from
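
    For readers who want to reproduce the reliability statistic, this is a minimal sketch of ICC(3,1) computed from a subjects-by-sessions table (Shrout and Fleiss two-way mixed model, consistency); the example values are hypothetical, not the study's data:

    ```python
    import numpy as np

    def icc_3_1(data):
        """ICC(3,1): two-way mixed, single measurement, consistency.
        data: n_subjects x k_sessions array of scores."""
        data = np.asarray(data, float)
        n, k = data.shape
        grand = data.mean()
        ms_subj = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        resid = (data - data.mean(axis=1, keepdims=True)
                      - data.mean(axis=0, keepdims=True) + grand)
        ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
        return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

    # Hypothetical three-trial average times (s) for 4 subjects on 2 days
    times = [[31.2, 30.8], [28.5, 29.0], [35.1, 34.6], [26.9, 27.3]]
    print(icc_3_1(times))
    ```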

  5. Web-Based Assessment of Mental Well-Being in Early Adolescence: A Reliability Study.

    PubMed

    Hamann, Christoph; Schultze-Lutter, Frauke; Tarokh, Leila

    2016-06-15

    The ever-increasing use of the Internet among adolescents represents an emerging opportunity for researchers to gain access to larger samples, which can be queried over several years longitudinally. Among adolescents, young adolescents (ages 11 to 13 years) are of particular interest to clinicians as this is a transitional stage, during which depressive and anxiety symptoms often emerge. However, it remains unclear whether these youngest adolescents can accurately answer questions about their mental well-being using a Web-based platform. The aim of the study was to examine the accuracy of responses obtained from Web-based questionnaires by comparing Web-based with paper-and-pencil versions of depression and anxiety questionnaires. The primary outcome was the score on the depression and anxiety questionnaires under two conditions: (1) paper-and-pencil and (2) Web-based versions. Twenty-eight adolescents (aged 11-13 years, mean age 12.78 years and SD 0.78; 18 females, 64%) were randomly assigned to complete either the paper-and-pencil or the Web-based questionnaire first. Intraclass correlation coefficients (ICCs) were calculated to measure intrarater reliability. Intraclass correlation coefficients were calculated separately for depression (Children's Depression Inventory, CDI) and anxiety (Spence Children's Anxiety Scale, SCAS) questionnaires. On average, it took participants 17 minutes (SD 6) to answer 116 questions online. Intraclass correlation coefficient analysis revealed high intrarater reliability when comparing Web-based with paper-and-pencil responses for both the CDI (ICC=.88; P<.001) and the SCAS (ICC=.95; P<.001). According to published criteria, both of these values are in the "almost perfect" category, indicating the highest degree of reliability. The results of the study show excellent reliability of Web-based assessment in 11- to 13-year-old children as compared with the standard paper-and-pencil assessment. Furthermore, we found that Web-based

  6. Reliability of smartphone-based teleradiology for evaluating thoracolumbar spine fractures.

    PubMed

    Stahl, Ido; Dreyfuss, Daniel; Ofir, Dror; Merom, Lior; Raichel, Michael; Hous, Nir; Norman, Doron; Haddad, Elias

    2017-02-01

    Timely interpretation of computed tomography (CT) scans is of paramount importance in diagnosing and managing spinal column fractures, which can be devastating. Out-of-hospital, on-call spine surgeons are often asked to evaluate CT scans of patients who have sustained trauma to the thoracolumbar spine to make a diagnosis and to determine the appropriate course of urgent treatment. Capturing radiographic scans and video clips from computer screens and sending them as instant messages have become common means of communication between physicians, aiding in triage and transfer decision-making in orthopedic and neurosurgical emergencies. The present study aimed to compare the reliability of interpreting CT scans viewed by orthopedic surgeons in two ways for diagnosing, classifying, and treatment planning for thoracolumbar spine fractures: (1) captured as video clips from a standard workstation-based picture archiving and communication system (PACS) and sent via a smartphone-based instant messaging application for viewing on a smartphone; and (2) viewed directly on a PACS. Study design: reliability and agreement study. Patient sample: 30 adults with thoracolumbar spine fractures who had been consecutively admitted to the Division of Orthopedic Surgery of a Level I trauma center during 2014. Outcome measure: intraobserver agreement. Methods: CT scans were captured by use of an iPhone 6 smartphone from a computer screen displaying PACS. Then, by use of the WhatsApp instant messaging application, video clips of the scans were sent to the personal smartphones of five spine surgeons. These evaluators were asked to diagnose, classify, and determine the course of treatment for each case. Evaluation of the cases was repeated 4 weeks later, this time using the standard method of workstation-based PACS. Intraobserver agreement was interpreted based on the value of Cohen's kappa statistic. The study did not receive any outside funding. Intraobserver agreement for determining fracture level was near perfect (κ=0.94). Intraobserver
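
    A small sketch of the agreement statistic used in this record: Cohen's kappa for one rater's paired categorical calls under the two viewing conditions. The example labels are hypothetical:

    ```python
    import numpy as np

    def cohens_kappa(r1, r2):
        """Cohen's kappa for two equal-length categorical rating series:
        kappa = (p_observed - p_expected) / (1 - p_expected)."""
        r1, r2 = np.asarray(r1), np.asarray(r2)
        cats = np.union1d(r1, r2)
        po = np.mean(r1 == r2)                       # observed agreement
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c) # chance agreement from
                 for c in cats)                      # marginal frequencies
        return (po - pe) / (1 - pe)

    # Hypothetical fracture-level calls: smartphone clip vs. workstation PACS
    smartphone = ["L1", "L2", "T12", "L1", "T11"]
    pacs       = ["L1", "L2", "T12", "L2", "T11"]
    print(cohens_kappa(smartphone, pacs))
    ```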

  7. Test-retest reliability of computer-based video analysis of general movements in healthy term-born infants.

    PubMed

    Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars

    2015-10-01

    A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice on the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion (Qmean, QSD) and the variability of the spatial center of motion of the infant (CSD). In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p<0.05). This study showed high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Perceptual attraction in tool use: evidence for a reliability-based weighting mechanism.

    PubMed

    Debats, Nienke B; Ernst, Marc O; Heuer, Herbert

    2017-04-01

    Humans are well able to operate tools whereby their hand movement is linked, via a kinematic transformation, to a spatially distant object moving in a separate plane of motion. An everyday example is controlling a cursor on a computer monitor. Despite these separate reference frames, the perceived positions of the hand and the object were found to be biased toward each other. We propose that this perceptual attraction is based on the principles by which the brain integrates redundant sensory information of single objects or events, known as optimal multisensory integration. That is, (1) sensory information about the hand and the tool are weighted according to their relative reliability (i.e., inverse variances), and (2) the unisensory reliabilities sum up in the integrated estimate. We assessed whether perceptual attraction is consistent with optimal multisensory integration model predictions. We used a cursor-control tool-use task in which we manipulated the relative reliability of the unisensory hand and cursor position estimates. The perceptual biases shifted according to these relative reliabilities, with an additional bias due to contextual factors that were present in experiment 1 but not in experiment 2. The biased position judgments' variances were, however, systematically larger than the predicted optimal variances. Our findings suggest that the perceptual attraction in tool use results from a reliability-based weighting mechanism similar to optimal multisensory integration, but that certain boundary conditions for optimality might not be satisfied. NEW & NOTEWORTHY Kinematic tool use is associated with a perceptual attraction between the spatially separated hand and the effective part of the tool. We provide a formal account for this phenomenon, thereby showing that the process behind it is similar to optimal integration of sensory information relating to single objects. Copyright © 2017 the American Physiological Society.
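
    The weighting scheme in points (1) and (2) can be made concrete with a short sketch of inverse-variance (reliability-based) integration of two position estimates; the numbers are hypothetical:

    ```python
    def integrate(x_hand, var_hand, x_cursor, var_cursor):
        """Reliability-weighted (inverse-variance) integration of two position
        estimates; returns the fused estimate and its predicted variance."""
        w_hand = (1 / var_hand) / (1 / var_hand + 1 / var_cursor)
        w_cursor = 1 - w_hand
        x_hat = w_hand * x_hand + w_cursor * x_cursor
        var_hat = 1 / (1 / var_hand + 1 / var_cursor)  # never exceeds either input
        return x_hat, var_hat

    # Hypothetical: noisy proprioceptive hand estimate, sharper visual cursor
    print(integrate(x_hand=10.0, var_hand=4.0, x_cursor=12.0, var_cursor=1.0))
    # -> (11.6, 0.8): the fused estimate is pulled toward the more reliable cue
    ```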

  9. A saliency-based approach to detection of infrared target

    NASA Astrophysics Data System (ADS)

    Chen, Yanfei; Sang, Nong; Dan, Zhiping

    2013-10-01

    Automatic target detection in infrared images is an active research field in national defense technology. We propose a new saliency-based infrared target detection model in this paper, based on the fact that the human focus of attention is directed towards the relevant target to interpret the most promising information. For a given image, the convolution of the image's log amplitude spectrum with a low-pass Gaussian kernel of an appropriate scale is equivalent to an image saliency detector in the frequency domain. At the same time, extracted orientation and shape features are combined into a saliency map in the spatial domain. Our proposed model decides salient targets based on a final saliency map, which is generated by integrating the saliency maps in the frequency and spatial domains. Finally, the size of each salient target is obtained by maximizing the entropy of the final saliency map. Experimental results show that the proposed model can highlight both small and large salient regions in infrared images, as well as inhibit repeated distractors in cluttered images. In addition, its detection efficiency is significantly improved.
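
    A hedged sketch of the frequency-domain half of such a model: smooth the log-amplitude spectrum with a low-pass Gaussian, keep the phase, and invert, in the spirit of the closely related spectral-residual saliency detector (the paper's exact formulation and scales may differ; the sigma values here are placeholders):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def spectral_saliency(img, sigma_spec=3.0, sigma_post=8.0):
        """Frequency-domain saliency: compare the log-amplitude spectrum with
        a low-pass-smoothed copy, keep the phase, and invert the transform."""
        F = np.fft.fft2(np.asarray(img, float))
        log_amp, phase = np.log1p(np.abs(F)), np.angle(F)
        residual = log_amp - gaussian_filter(log_amp, sigma_spec)
        sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
        return gaussian_filter(sal, sigma_post)   # smoothed saliency map
    ```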

  10. Polysaccharide-based micro/nanocarriers for oral colon-targeted drug delivery.

    PubMed

    Zhang, Lin; Sang, Yuan; Feng, Jing; Li, Zhaoming; Zhao, Aili

    2016-08-01

    Oral colon-targeted drug delivery has attracted many researchers because of its distinct advantages of increasing the bioavailability of the drug at the target site and reducing side effects. Polysaccharides that are precisely activated by the physiological environment of the colon hold particular promise for colon targeting. Considerable research efforts have been directed towards developing polysaccharide-based micro/nanocarriers. Types of polysaccharides for colon targeting and in vitro/in vivo assessments of polysaccharide-based carriers for oral colon-targeted drug delivery are summarised. Polysaccharide-based microspheres have gained increased importance not just for the delivery of drugs for the treatment of local diseases associated with the colon (colon cancer, inflammatory bowel disease (IBD), amoebiasis and irritable bowel syndrome (IBS)), but also for their potential for the delivery of anti-rheumatoid arthritis and anti-chronic stable angina drugs. In addition, polysaccharide-based micro/nanocarriers such as microbeads, microcapsules, microparticles, nanoparticles, nanogels and nanospheres are introduced in this review.

  11. Target recognition and scene interpretation in image/video understanding systems based on network-symbolic models

    NASA Astrophysics Data System (ADS)

    Kuvich, Gary

    2004-08-01

    Vision is only a part of a system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, which is an interpretation of visual information in terms of these knowledge models. These mechanisms provide reliable recognition when an object is occluded or cannot be recognized as a whole. It is hard to split the entire system apart, and reliable solutions to target recognition problems are possible only within the solution of the more generic image understanding problem. The brain reduces informational and computational complexity by using implicit symbolic coding of features, hierarchical compression, and selective processing of visual information. A biologically inspired Network-Symbolic representation, where both systematic structural/logical methods and neural/statistical methods are parts of a single mechanism, is the most feasible basis for such models. It converts visual information into relational Network-Symbolic structures, avoiding artificial precise computations of 3-dimensional models. Network-Symbolic transformations derive abstract structures, which allows for invariant recognition of an object as an exemplar of a class. Active vision helps create consistent models. Attention, separation of figure from ground, and perceptual grouping are special kinds of network-symbolic transformations. Such image/video understanding systems will recognize targets reliably.

  12. Fragment-Based Phenotypic Lead Discovery: Cell-Based Assay to Target Leishmaniasis.

    PubMed

    Ayotte, Yann; Bilodeau, François; Descoteaux, Albert; LaPlante, Steven R

    2018-05-02

    A rapid and practical approach for the discovery of new chemical matter for targeting pathogens and diseases is described. Fragment-based phenotypic lead discovery (FPLD) combines aspects of traditional fragment-based lead discovery (FBLD), which involves the screening of small-molecule fragment libraries to target specific proteins, with phenotypic lead discovery (PLD), which typically involves the screening of drug-like compounds in cell-based assays. To enable FPLD, a diverse library of fragments was first designed, assembled, and curated. This library of soluble, low-molecular-weight compounds was then pooled to expedite screening. Axenic cultures of Leishmania promastigotes were screened, and single hits were then tested for leishmanicidal activity against intracellular amastigote forms in infected murine bone-marrow-derived macrophages without evidence of toxicity toward mammalian cells. These studies demonstrate that FPLD can be a rapid and effective means to discover hits that can serve as leads for further medicinal chemistry purposes or as tool compounds for identifying known or novel targets. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrodes and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
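
    The abstract does not spell out the acceleration function, but a form commonly used for MLCC life testing is the Prokopowicz-Vaskas voltage-temperature model; the sketch below assumes that form, with placeholder values for the voltage exponent n and activation energy Ea, neither of which comes from the paper:

    ```python
    from math import exp

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def acceleration_factor(v_use, t_use_k, v_test, t_test_k, n=3.0, ea=1.2):
        """Prokopowicz-Vaskas acceleration factor between test and use:
        AF = (V_test/V_use)^n * exp((Ea/k) * (1/T_use - 1/T_test)), T in kelvin.
        n and ea are illustrative placeholders, not values from the paper."""
        return (v_test / v_use) ** n * exp((ea / K_B) * (1 / t_use_k - 1 / t_test_k))

    # e.g., life test at twice rated voltage and 125 C vs. use at 85 C
    print(acceleration_factor(v_use=50, t_use_k=358.15, v_test=100, t_test_k=398.15))
    ```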

  14. Dynamic decision-making for reliability and maintenance analysis of manufacturing systems based on failure effects

    NASA Astrophysics Data System (ADS)

    Zhang, Ding; Zhang, Yingjie

    2017-09-01

    A framework for reliability and maintenance analysis of job shop manufacturing systems is proposed in this paper. An efficient preventive maintenance (PM) policy based on failure effects analysis (FEA) is proposed. Subsequently, reliability evaluation and component importance measurement based on FEA are performed under the PM policy. A job shop manufacturing system is used to validate the reliability evaluation and the dynamic maintenance policy. The obtained results are compared with existing methods and the effectiveness is validated. Issues that are often only vaguely understood during manufacturing system reliability analysis, such as network modelling, vulnerability identification, evaluation criteria for repairable systems, and the PM policy, are elaborated. This framework can help with reliability optimisation and the rational allocation of maintenance resources in job shop manufacturing systems.

  15. Efficient Network Coding-Based Loss Recovery for Reliable Multicast in Wireless Networks

    NASA Astrophysics Data System (ADS)

    Chi, Kaikai; Jiang, Xiaohong; Ye, Baoliu; Horiguchi, Susumu

    Recently, network coding has been applied to the loss recovery of reliable multicast in wireless networks [19], where multiple lost packets are XOR-ed together as one packet and forwarded via a single retransmission, resulting in a significant reduction of bandwidth consumption. In this paper, we first prove that maximizing the number of lost packets for XOR-ing, which is the key part of the available network coding-based reliable multicast schemes, is actually an NP-complete problem. To address this complexity, we then propose an efficient heuristic algorithm for finding an approximately optimal solution to this optimization problem. Furthermore, we show that the packet coding principle of maximizing the number of lost packets for XOR-ing sometimes cannot fully exploit the potential coding opportunities, and we then propose new heuristic-based schemes with a new coding principle. Simulation results demonstrate that the heuristic-based schemes have very low computational complexity and can achieve almost the same transmission efficiency as the current coding-based high-complexity schemes. Furthermore, the heuristic-based schemes with the new coding principle not only have very low complexity, but also slightly outperform the current high-complexity ones.
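
    The XOR retransmission idea, plus a simple greedy stand-in for a packet-selection heuristic (not the paper's algorithm), can be sketched as follows. The decodability rule is that a receiver can recover the coded packet only if it misses at most one packet in the XOR-ed set; all names are illustrative:

    ```python
    from functools import reduce

    def xor_packets(packets):
        """Byte-wise XOR of equal-length packets into one coded packet."""
        return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

    def greedy_code_set(losses):
        """losses: receiver -> set of lost packet ids. Greedily pick packets so
        that every receiver misses at most one packet in the set; one XOR
        retransmission of the set is then decodable by all those receivers."""
        chosen, blocked = [], set()   # blocked: receivers already missing one
        for pkt in sorted({p for s in losses.values() for p in s}):
            missing = {r for r, s in losses.items() if pkt in s}
            if missing & blocked:
                continue              # would leave a receiver missing two packets
            chosen.append(pkt)
            blocked |= missing
        return chosen

    # Three receivers, each lost a different packet: one coded packet serves all
    print(greedy_code_set({"r1": {1}, "r2": {2}, "r3": {3}}))  # -> [1, 2, 3]

    # A receiver that lost exactly one of the XORed packets recovers it by
    # XORing the coded packet with the packets it already holds.
    p1, p2, p3 = b"AAAA", b"BBBB", b"CCCC"
    coded = xor_packets([p1, p2, p3])
    assert xor_packets([coded, p1, p3]) == p2
    ```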

  16. Analysis of A Drug Target-based Classification System using Molecular Descriptors.

    PubMed

    Lu, Jing; Zhang, Pin; Bi, Yi; Luo, Xiaomin

    2016-01-01

    Drug-target interaction is an important topic in drug discovery and drug repositioning. The KEGG database offers drug annotation and classification using a target-based classification system. In this study, we investigated five target-based classes: (I) G protein-coupled receptors; (II) nuclear receptors; (III) ion channels; (IV) enzymes; (V) pathogens, using molecular descriptors to represent each drug compound. Two popular feature selection methods, maximum relevance minimum redundancy and incremental feature selection, were adopted to extract the important descriptors. Meanwhile, an optimal prediction model based on the nearest-neighbor algorithm was constructed, which achieved the best result in identifying drug target-based classes. Finally, some key descriptors were discussed to uncover their important roles in the identification of drug-target classes.

  17. Interrater reliability of a Pilates movement-based classification system.

    PubMed

    Yu, Kwan Kenny; Tulloch, Evelyn; Hendrick, Paul

    2015-01-01

    The aim was to determine the interrater reliability for identification of a specific movement pattern using a Pilates classification system. Videos of 5 subjects performing specific movement tasks were sent to raters trained in the DMA-CP classification system. Ninety-six raters completed the survey. Interrater reliability for the detection of a directional bias was excellent (Pi = 0.92, K(free) = 0.89). Interrater reliability for classifying an individual into a specific subgroup was moderate (Pi = 0.64, K(free) = 0.55); however, raters who had completed levels 1-4 of the DMA-CP training and reported using the assessment daily demonstrated excellent reliability (Pi = 0.89 and K(free) = 0.87). The reliability of the classification system demonstrated almost perfect agreement in determining the existence of a specific movement pattern and classifying into a subgroup for experienced raters. There was a trend for greater reliability associated with increased levels of training and experience of the raters. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker

    PubMed Central

    Naqvi, Rizwan Ali; Arsalan, Muhammad; Park, Kang Ryoung

    2017-01-01

    Gaze-based interaction (GBI) techniques have been a popular subject of research in the last few decades. Among other applications, GBI can be used by persons with disabilities to perform everyday tasks, can serve as a game interface, and can play a pivotal role in the human-computer interface (HCI) field. While gaze tracking systems have shown high accuracy in GBI, detecting a user’s gaze for target selection is a challenging problem that needs to be considered while using a gaze detection system. Past research has used eye blinks for this purpose as well as dwell-time-based methods, but these techniques are either inconvenient for the user or require a long time for target selection. Therefore, in this paper, we propose a method for fuzzy system-based target selection for near-infrared (NIR) camera-based gaze trackers. The results of experiments, together with tests of usability and on-screen keyboard use, show that the proposed method is better than previous methods. PMID:28420114

  19. The (un)reliability of item-level semantic priming effects.

    PubMed

    Heyman, Tom; Bruninx, Anke; Hutchison, Keith A; Storms, Gert

    2018-04-05

    Many researchers have tried to predict semantic priming effects using a myriad of variables (e.g., prime-target associative strength or co-occurrence frequency). The idea is that relatedness varies across prime-target pairs, which should be reflected in the size of the priming effect (e.g., cat should prime dog more than animal does). However, it is only insightful to predict item-level priming effects if they can be measured reliably. Thus, in the present study we examined the split-half and test-retest reliabilities of item-level priming effects under conditions that should discourage the use of strategies. The resulting priming effects proved extremely unreliable, and reanalyses of three published priming datasets revealed similar cases of low reliability. These results imply that previous attempts to predict semantic priming were unlikely to be successful. However, one study with an unusually large sample size yielded more favorable reliability estimates, suggesting that big data, in terms of items and participants, should be the future for semantic priming research.
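
    A minimal sketch of the split-half procedure with Spearman-Brown correction, assuming an items-by-observations matrix of per-trial priming effects (the data layout is hypothetical, not the study's):

    ```python
    import numpy as np

    def split_half_reliability(effects, n_splits=1000, seed=None):
        """Permutation-based split-half reliability of item-level effects.
        effects: items x observations array (e.g., per-subject priming effects
        per item). Random halves of the observations are averaged per item and
        correlated; Spearman-Brown corrects for halving the data."""
        rng = np.random.default_rng(seed)
        effects = np.asarray(effects, float)
        n_items, n_obs = effects.shape
        rs = []
        for _ in range(n_splits):
            perm = rng.permutation(n_obs)
            h1 = effects[:, perm[: n_obs // 2]].mean(axis=1)
            h2 = effects[:, perm[n_obs // 2:]].mean(axis=1)
            r = np.corrcoef(h1, h2)[0, 1]
            rs.append(2 * r / (1 + r))       # Spearman-Brown correction
        return float(np.mean(rs))
    ```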

  20. Statistical Bayesian method for reliability evaluation based on ADT data

    NASA Astrophysics Data System (ADS)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict a product's reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are used to analyze degradation data, the latter being the more popular. However, limitations remain, such as an imprecise solution process and imprecise estimation of the degradation rate, which may affect the accuracy of the acceleration model and the extrapolated values. Moreover, the existing Bayesian solution to this problem loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen. Second, the initial values of the degradation model and the parameters of the prior and posterior distributions under each stress level are calculated, with updating and iteration of the estimated values. Third, the lifetime and reliability values are estimated on the basis of the estimated parameters. Finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is effective and accurate in estimating the lifetime and reliability of a product.
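
    A hedged sketch of the Wiener-process backbone mentioned above: fit drift and diffusion from one degradation path, then evaluate reliability from the inverse-Gaussian first-passage distribution over a failure threshold D. All numbers are illustrative, and the paper's Bayesian updating across stress levels is not reproduced here:

    ```python
    import numpy as np
    from scipy.stats import norm

    def fit_wiener(times, path):
        """MLE of drift and diffusion for X(t) = mu*t + sigma*B(t) from
        increments of one observed degradation path."""
        dt, dx = np.diff(times), np.diff(path)
        mu = dx.sum() / dt.sum()
        sigma = np.sqrt(np.mean((dx - mu * dt) ** 2 / dt))
        return mu, sigma

    def reliability(t, mu, sigma, D):
        """R(t): probability that the first passage over threshold D exceeds t
        (the failure time of a Wiener process is inverse-Gaussian distributed).
        For very large 2*mu*D/sigma^2 the second term needs log-space care."""
        s = sigma * np.sqrt(t)
        return (norm.cdf((D - mu * t) / s)
                - np.exp(2 * mu * D / sigma ** 2) * norm.cdf(-(D + mu * t) / s))

    # Simulate one path under accelerated stress, fit it, extrapolate R(500 h)
    rng = np.random.default_rng(0)
    t = np.linspace(0, 100, 201)
    b = np.cumsum(np.r_[0, rng.normal(0, np.sqrt(np.diff(t)))])
    x = 0.05 * t + 0.2 * b                      # drift 0.05, diffusion 0.2
    mu_hat, sigma_hat = fit_wiener(t, x)
    print(mu_hat, sigma_hat, reliability(500.0, mu_hat, sigma_hat, D=40.0))
    ```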

  1. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrodes and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of the MLCCs' reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed in this paper. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.

  2. Component-based target recognition inspired by human vision

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Agyepong, Kwabena

    2009-05-01

    In contrast with machine vision, humans can recognize an object against a complex background with great flexibility. For example, given the task of finding and circling all cars (with no further information) in a picture, you may build a virtual image in your mind from the task (or target) description before looking at the picture. Specifically, the virtual car image may be composed of key components such as the driver cabin and wheels. In this paper, we propose a component-based target recognition method that simulates the human recognition process. The component templates (equivalent to the virtual image in mind) of the target (a car) are manually decomposed from the target feature image. Meanwhile, the edges of the test image are extracted using a difference of Gaussians (DOG) model that simulates the spatiotemporal response of the visual process. A phase correlation matching algorithm is then applied to match the templates with the test edge image. If all key component templates are matched within the examined object, the object is recognized as the target. Besides recognition accuracy, we also investigate whether this method works with partial targets (half cars). In our experiments, several natural pictures taken on streets were used to test the proposed method. The preliminary results show that the component-based recognition method is very promising.
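
    The template-matching step can be illustrated with a bare-bones phase correlation sketch; the padding and peak handling are simplified, and a real matcher would window the spectra and score the peak:

    ```python
    import numpy as np

    def phase_correlation(image, template):
        """Locate a (smaller) template in an image via phase correlation.
        The template is zero-padded to the image size; the peak of the
        inverse transform of the normalized cross-power spectrum gives the
        template's (row, col) offset in the image."""
        padded = np.zeros_like(image, dtype=float)
        padded[: template.shape[0], : template.shape[1]] = template
        F1, F2 = np.fft.fft2(image.astype(float)), np.fft.fft2(padded)
        cross = F1 * np.conj(F2)
        cross /= np.maximum(np.abs(cross), 1e-12)   # keep phase only
        corr = np.fft.ifft2(cross).real
        return np.unravel_index(np.argmax(corr), corr.shape)
    ```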

  3. A framework for conducting mechanistic based reliability assessments of components operating in complex systems

    NASA Astrophysics Data System (ADS)

    Wallace, Jon Michael

    2003-10-01

    Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most distinctive. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank

  4. A Reliability and Validity of an Instrument to Evaluate the School-Based Assessment System: A Pilot Study

    ERIC Educational Resources Information Center

    Ghazali, Nor Hasnida Md

    2016-01-01

    A valid, reliable and practical instrument is needed to evaluate the implementation of the school-based assessment (SBA) system. The aim of this study is to develop and assess the validity and reliability of an instrument to measure the perception of teachers towards the SBA implementation in schools. The instrument is developed based on a…

  5. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    NASA Astrophysics Data System (ADS)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique to micro-electromechanical systems, considering their technological uncertainties. Nowadays, micro-devices are widely applied, especially in the automotive industry, taking advantage of combining the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the development of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices based on the theory of reliability-based robust design optimization. This takes into consideration the performance of a micro-device and its reliability, assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example of optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by a sensitivity analysis to establish the design and uncertainty domains. Genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function while simultaneously satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced with a conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.

  6. A Fast MEANSHIFT Algorithm-Based Target Tracking System

    PubMed Central

    Sun, Jian

    2012-01-01

    Tracking moving targets in complex scenes using an active video camera is a challenging task. Tracking accuracy and efficiency are two key yet generally incompatible aspects of a Target Tracking System (TTS). A compromise scheme is studied in this paper. A fast mean-shift-based target tracking scheme is designed and realized, which is robust to partial occlusion and changes in object appearance. The physical simulation shows that the image signal processing speed is >50 frames/s. PMID:22969397

  7. A liquid chromatography-tandem mass spectrometry-based targeted proteomics assay for monitoring P-glycoprotein levels in human breast tissue.

    PubMed

    Yang, Ting; Chen, Fei; Xu, Feifei; Wang, Fengliang; Xu, Qingqing; Chen, Yun

    2014-09-25

    P-glycoprotein (P-gp) can efflux drugs from cancer cells, and its overexpression is commonly associated with multi-drug resistance (MDR). Thus, accurate quantification of P-gp would help predict the response to chemotherapy and the prognosis of breast cancer patients. An advanced liquid chromatography-tandem mass spectrometry (LC/MS/MS)-based targeted proteomics assay was developed and validated for monitoring P-gp levels in breast tissue. The tryptic peptide IIDNKPSIDSYSK (residues 368-380) was selected as a surrogate analyte for quantification, and immuno-depleted tissue extract was used as a surrogate matrix. Matched pairs of breast tissue samples from 60 patients who were suspected to have drug resistance were subjected to analysis. The levels of P-gp were quantified. Using data from normal tissue, we suggested a P-gp reference interval. The experimental values of tumor tissue samples were compared with those obtained from Western blotting and immunohistochemistry (IHC). The result indicated that the targeted proteomics approach was comparable to IHC but provided a lower limit of quantification (LOQ) and could afford more reliable results at low concentrations than the other two methods. LC/MS/MS-based targeted proteomics may allow the quantification of P-gp in breast tissue in a more accurate manner. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Texture and haptic cues in slant discrimination: reliability-based cue weighting without statistically optimal cue combination

    NASA Astrophysics Data System (ADS)

    Rosas, Pedro; Wagemans, Johan; Ernst, Marc O.; Wichmann, Felix A.

    2005-05-01

    A number of models of depth-cue combination suggest that the final depth percept results from a weighted average of independent depth estimates based on the different cues available. The weight of each cue in such an average is thought to depend on the reliability of each cue. In principle, such a depth estimation could be statistically optimal in the sense of producing the minimum-variance unbiased estimator that can be constructed from the available information. Here we test such models by using visual and haptic depth information. Different texture types produce differences in slant-discrimination performance, thus providing a means for testing a reliability-sensitive cue-combination model with texture as one of the cues to slant. Our results show that the weights for the cues were generally sensitive to their reliability but fell short of statistically optimal combination: we find reliability-based reweighting but not statistically optimal cue combination.

  9. Reliability and Validity of Assessing User Satisfaction With Web-Based Health Interventions.

    PubMed

    Boß, Leif; Lehr, Dirk; Reis, Dorota; Vis, Christiaan; Riper, Heleen; Berking, Matthias; Ebert, David Daniel

    2016-08-31

    The perspective of users should be taken into account in the evaluation of Web-based health interventions. Assessing the users' satisfaction with the intervention they receive could enhance the evidence for the intervention effects. Thus, there is a need for valid and reliable measures to assess satisfaction with Web-based health interventions. The objective of this study was to analyze the reliability, factorial structure, and construct validity of the Client Satisfaction Questionnaire adapted to Internet-based interventions (CSQ-I). The psychometric quality of the CSQ-I was analyzed in user samples from 2 separate randomized controlled trials evaluating Web-based health interventions, one from a depression prevention intervention (sample 1, N=174) and the other from a stress management intervention (sample 2, N=111). First, the underlying measurement model of the CSQ-I was analyzed to determine the internal consistency. The factorial structure of the scale and the measurement invariance across groups were tested by multigroup confirmatory factor analyses. Additionally, the construct validity of the scale was examined by comparing satisfaction scores with the primary clinical outcome. Multigroup confirmatory factor analyses of the scale yielded a one-factorial structure with a good fit (root-mean-square error of approximation =.09, comparative fit index =.96, standardized root-mean-square residual =.05) that showed partial strong invariance across the 2 samples. The scale showed very good reliability, indicated by McDonald omegas of .95 in sample 1 and .93 in sample 2. Significant correlations with change in depressive symptoms (r=-.35, P<.001) and perceived stress (r=-.48, P<.001) demonstrated the construct validity of the scale. The proven internal consistency, factorial structure, and construct validity of the CSQ-I indicate a good overall psychometric quality of the measure to assess the user's general satisfaction with Web-based interventions for depression and

  10. Particle Filtering with Region-based Matching for Tracking of Partially Occluded and Scaled Targets*

    PubMed Central

    Nakhmani, Arie; Tannenbaum, Allen

    2012-01-01

    Visual tracking of arbitrary targets in clutter is important for a wide range of military and civilian applications. We propose a general framework for the tracking of scaled and partially occluded targets, which do not necessarily have prominent features. The algorithm proposed in the present paper utilizes a modified normalized cross-correlation as the likelihood for a particle filter. The algorithm divides the template, selected by the user in the first video frame, into numerous patches. The matching process of these patches by particle filtering allows one to handle the target’s occlusions and scaling. Experimental results with fixed rectangular templates show that the method is reliable for videos with nonstationary, noisy, and cluttered background, and provides accurate trajectories in cases of target translation, scaling, and occlusion. PMID:22506088

  11. Small numbers, disclosure risk, security, and reliability issues in Web-based data query systems.

    PubMed

    Rudolph, Barbara A; Shah, Gulzar H; Love, Denise

    2006-01-01

    This article describes the process for developing consensus guidelines and tools for releasing public health data via the Web and highlights approaches leading agencies have taken to balance disclosure risk with public dissemination of reliable health statistics. An agency's choice of statistical methods for improving the reliability of released data for Web-based query systems is based upon a number of factors, including query system design (dynamic analysis vs preaggregated data and tables), population size, cell size, data use, and how data will be supplied to users. The article also describes those efforts that are necessary to reduce the risk of disclosure of an individual's protected health information.

  12. Drug-target interaction prediction from PSSM based evolutionary information.

    PubMed

    Mousavian, Zaynab; Khakabimamaghani, Sahand; Kavousi, Kaveh; Masoudi-Nejad, Ali

    2016-01-01

    The labor-intensive and expensive experimental process of drug-target interaction prediction has motivated many researchers to focus on in silico prediction, which provides helpful information to support experimental interaction data. Therefore, they have proposed several computational approaches for discovering new drug-target interactions. Several learning-based methods have been developed, which can be categorized into two main groups: similarity-based and feature-based. In this paper, we first use the bi-gram features extracted from the Position Specific Scoring Matrix (PSSM) of proteins in predicting drug-target interactions. Our results demonstrate the high-confidence prediction ability of the Bigram-PSSM model in terms of several performance indicators, specifically for enzymes and ion channels. Moreover, we investigate the impact of the negative selection strategy on the performance of the prediction, which is not widely taken into account in other relevant studies. This is important, as the number of non-interacting drug-target pairs is usually extremely large in comparison with the number of interacting ones in existing drug-target interaction data. An interesting observation is that different levels of performance reduction were attained for the four datasets when we changed the sampling method from random sampling to balanced sampling. Copyright © 2015 Elsevier Inc. All rights reserved.
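
    The bi-gram PSSM features can be written in a few lines; this sketch assumes the PSSM rows have already been rescaled to probabilities (an L x 20 matrix), giving the standard 400-dimensional bigram vector:

    ```python
    import numpy as np

    def pssm_bigram_features(pssm):
        """pssm: L x 20 position-specific scoring matrix (e.g., from PSI-BLAST,
        rescaled to probabilities). Returns the 400-d bi-gram feature vector
        F[m, n] = sum_i pssm[i, m] * pssm[i + 1, n]."""
        P = np.asarray(pssm, float)
        return (P[:-1].T @ P[1:]).ravel()   # 20 x 20 outer-sum -> 400 features
    ```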

  13. Accuracy and Reliability of Eye-Based vs Quadrant-Based Diagnosis of Plus Disease in Retinopathy of Prematurity.

    PubMed

    Kim, Sang Jin; Campbell, J Peter; Kalpathy-Cramer, Jayashree; Ostmo, Susan; Jonas, Karyn E; Choi, Dongseok; Chan, R V Paul; Chiang, Michael F

    2018-06-01

    Presence of plus disease in retinopathy of prematurity is the most critical element in identifying treatment-requiring disease. However, there is significant variability in plus disease diagnosis. In particular, plus disease has been defined as 2 or more quadrants of vascular abnormality, and it is not clear whether it is more reliably and accurately diagnosed by eye-based assessment of overall retinal appearance or by quadrant-based assessment combining grades of 4 individual quadrants. To compare eye-based vs quadrant-based diagnosis of plus disease and to provide insight for ophthalmologists about the diagnostic process. In this multicenter cohort study, we developed a database of 197 wide-angle retinal images from 141 preterm infants from neonatal intensive care units at 9 academic institutions (enrolled from July 2011 to December 2016). Each image was assigned a reference standard diagnosis based on consensus image-based and clinical diagnosis. Data analysis was performed from February 2017 to September 2017. Six graders independently diagnosed each of the 4 quadrants (cropped images) of the 197 eyes (quadrant-based diagnosis) as well as the entire image (eye-based diagnosis). Images were displayed individually, in random order. Quadrant-based diagnosis of plus disease was made when 2 or more quadrants were diagnosed as indicating plus disease by combining grades of individual quadrants post hoc. Intragrader and intergrader reliability (absolute agreement and κ statistic) and accuracy compared with the reference standard diagnosis. Of the 141 included preterm infants, 65 (46.1%) were female and 116 (82.3%) white, and the mean (SD) gestational age was 27.0 (2.6) weeks. There was variable agreement between eye-based and quadrant-based diagnosis among the 6 graders (Cohen κ range, 0.32-0.75). Four graders showed underdiagnosis of plus disease with quadrant-based diagnosis compared with eye-based diagnosis (by McNemar test). Intergrader agreement of quadrant-based

  14. Selective targeting of melanoma by PEG-masked protein-based multifunctional nanoparticles

    PubMed Central

    Vannucci, Luca; Falvo, Elisabetta; Fornara, Manuela; Di Micco, Patrizio; Benada, Oldrich; Krizan, Jiri; Svoboda, Jan; Hulikova-Capkova, Katarina; Morea, Veronica; Boffi, Alberto; Ceci, Pierpaolo

    2012-01-01

    Background: Nanoparticle-based systems are promising for the development of imaging and therapeutic agents. The main advantage of nanoparticles over traditional systems lies in the possibility of loading multiple functionalities onto a single molecule, which are useful for therapeutic and/or diagnostic purposes. These functionalities include targeting moieties which are able to recognize receptors overexpressed by specific cells and tissues. However, targeted delivery of nanoparticles requires an accurate system design. We present here a rationally designed, genetically engineered, and chemically modified protein-based nanoplatform for cell/tissue-specific targeting. Methods: Our nanoparticle constructs were based on the heavy chain of the human protein ferritin (HFt), a highly symmetrical assembly of 24 subunits enclosing a hollow cavity. HFt-based nanoparticles were produced using both genetic engineering and chemical functionalization methods to impart several functionalities, i.e., the α-melanocyte-stimulating hormone peptide as a melanoma-targeting moiety, stabilizing and HFt-masking polyethylene glycol molecules, rhodamine fluorophores, and magnetic resonance imaging agents. The constructs produced were extensively characterized by a number of physicochemical techniques, and assayed for selective melanoma-targeting in vitro and in vivo. Results: Our HFt-based nanoparticle constructs functionalized with the α-melanocyte-stimulating hormone peptide moiety and polyethylene glycol molecules were specifically taken up by melanoma cells but not by other cancer cell types in vitro. Moreover, experiments in melanoma-bearing mice indicate that these constructs have an excellent tumor-targeting profile and a long circulation time in vivo. Conclusion: By masking human HFt with polyethylene glycol and targeting it with an α-melanocyte-stimulating hormone peptide, we developed an HFt-based melanoma-targeting nanoplatform for application in melanoma diagnosis and treatment.

  15. Benchmark data sets for structure-based computational target prediction.

    PubMed

    Schomburg, Karen T; Rarey, Matthias

    2014-08-25

    Structure-based computational target prediction methods identify potential targets for a bioactive compound. Methods based on protein-ligand docking so far face many challenges, the greatest of which is probably the ranking of true targets in a large data set of protein structures. Currently, no standard data sets for evaluation exist, rendering comparison and demonstration of methodological improvements cumbersome. Therefore, we propose two data sets and evaluation strategies for a meaningful evaluation of new target prediction methods: a small data set consisting of three target classes for detailed proof-of-concept and selectivity studies, and a large data set consisting of 7992 protein structures and 72 drug-like ligands allowing statistical evaluation with performance metrics on a drug-like chemical space. Both data sets are built from openly available resources, and any information needed to perform the described experiments is reported. We describe the composition of the data sets, the setup of screening experiments, and the evaluation strategy. Performance metrics capable of measuring early recognition of enrichment, such as AUC, BEDROC, and NSLR, are proposed. We apply a sequence-based target prediction method to the large data set to analyze its content of nontrivial evaluation cases. The proposed data sets are used for method evaluation of our new inverse screening method iRAISE. The small data set reveals the method's capability and limitations in selectively distinguishing between rather similar protein structures. The large data set simulates real target identification scenarios. iRAISE achieved excellent or good enrichment in 55% of cases with a median AUC of 0.67, yielded RMSDs below 2.0 Å for 74% of cases, and predicted the first true target in the top 2% of the protein data set of about 8000 structures in 59 of 72 cases.
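
    Of the proposed metrics, AUC is the easiest to reproduce; a minimal rank-based (Mann-Whitney) sketch with hypothetical scores:

    ```python
    import numpy as np

    def rank_auc(pos_scores, neg_scores):
        """AUC as the Mann-Whitney statistic: the probability that a randomly
        drawn true target scores higher than a randomly drawn decoy (ties 1/2)."""
        pos = np.asarray(pos_scores, float)
        neg = np.asarray(neg_scores, float)
        diff = pos[:, None] - neg[None, :]
        return float(((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size)

    # Hypothetical inverse-screening scores: higher = more target-like
    print(rank_auc([0.9, 0.7, 0.65], [0.6, 0.4, 0.7, 0.2]))
    ```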

  16. Sarma-based key-group method for rock slope reliability analyses

    NASA Astrophysics Data System (ADS)

    Yarahmadi Bafghi, A. R.; Verdel, T.

    2005-08-01

    The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for purposes of stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsable blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We will discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to take account of the uncertainty due to physical and mechanical data (density, cohesion and angle of friction). We will then show how such reliability analyses can be introduced into the SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.

  17. Targets of opportunity : community based alcohol programs

    DOT National Transportation Integrated Search

    1988-04-01

    Targets of Opportunity (TOP) were comprehensive community-based programs addressing the drinking and driving concerns within a particular community. The program incorporated six elements: 1) general deterrence - public information, education and enf...

  18. Infrared dim target detection based on visual attention

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Lv, Guofang; Xu, Lizhong

    2012-11-01

    Accurate and fast detection of infrared (IR) dim targets is of great importance for infrared precision guidance, early warning, video surveillance, etc. Based on human visual attention mechanisms, an automatic detection algorithm for infrared dim targets is presented. After analyzing the characteristics of infrared dim target images, the method first designs Difference of Gaussians (DoG) filters to compute the saliency map. Then the salient regions in which potential targets may exist are extracted by searching the saliency map with a control mechanism of winner-take-all (WTA) competition and inhibition-of-return (IOR). Finally, these regions are screened against the characteristics of dim IR targets, so that true targets are detected and spurious objects are rejected. Experiments were performed on real-life IR images, and the results show that the proposed method achieves satisfactory detection effectiveness and robustness. It also has high detection efficiency and can be used for real-time detection.
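
    A minimal sketch of the two stages described above follows; the exact filter scales, region radius, and number of candidate regions used in the paper are not given, so the values here are assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def dog_saliency(img, sigma_small=1.0, sigma_large=3.0):
            # Difference-of-Gaussians response; keeps bright-on-dark residues.
            img = img.astype(float)
            dog = gaussian_filter(img, sigma_small) - gaussian_filter(img, sigma_large)
            return np.clip(dog, 0.0, None)

        def wta_ior(saliency, n_regions=5, radius=9):
            # Winner-take-all peak picking with inhibition-of-return.
            s = saliency.copy()
            peaks = []
            for _ in range(n_regions):
                y, x = np.unravel_index(np.argmax(s), s.shape)
                peaks.append((y, x, float(s[y, x])))
                s[max(0, y - radius):y + radius + 1,
                  max(0, x - radius):x + radius + 1] = 0.0  # inhibit region
            return peaks

    Each returned peak would then be screened with dim-target criteria (size, local contrast) to reject clutter, as the abstract describes.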

  19. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined by mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. The solution to the stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, corresponding to unity reliability (p = 1). Weight can be reduced to a small value for the most failure-prone design, with a reliability that approaches zero (p = 0). Reliability can be chosen differently for different components of an airframe structure. For example, the landing gear can be designed for very high reliability, whereas the requirement can be relaxed somewhat for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code as the deterministic analysis tool, (2) the fast probabilistic integrator (the FPI module of the NESSUS software) as the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards as the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.

  20. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based on reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, both causes and effects, are represented as nodes in a Bayesian network. The approach thus transforms the risk analysis results from a failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, the nodes most critical to system reliability can be identified. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a risk-minimization plan can then be determined to reduce these influences and improve system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform, and quality assurance actions were defined to reduce the risk and improve product quality.
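
    To make the idea concrete, here is a toy two-cause Bayesian network evaluated by brute-force enumeration. The nodes, priors, and conditional probability table are invented and stand in for the FMEA-derived network in the paper (which was built in tools such as GeNIe and AgenaRisk).

        import itertools

        p_c, p_f = 0.02, 0.05                  # priors: contamination, filling error
        p_b = {(0, 0): 0.001, (0, 1): 0.30,    # P(batch failure | C, F), assumed CPT
               (1, 0): 0.40,  (1, 1): 0.90}

        def p_failure(c_fixed=None, f_fixed=None):
            # Marginal (or cause-conditioned) failure probability by enumeration.
            total = 0.0
            for c, f in itertools.product((0, 1), repeat=2):
                if c_fixed is not None and c != c_fixed:
                    continue
                if f_fixed is not None and f != f_fixed:
                    continue
                pc = 1.0 if c_fixed is not None else (p_c if c else 1 - p_c)
                pf = 1.0 if f_fixed is not None else (p_f if f else 1 - p_f)
                total += pc * pf * p_b[(c, f)]
            return total

        print("baseline risk   :", round(p_failure(), 4))
        print("criticality of C:", round(p_failure(c_fixed=1) - p_failure(), 4))
        print("criticality of F:", round(p_failure(f_fixed=1) - p_failure(), 4))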

  1. Research on target tracking algorithm based on spatio-temporal context

    NASA Astrophysics Data System (ADS)

    Li, Baiping; Xu, Sanmei; Kang, Hongjuan

    2017-07-01

    In this paper, a novel target tracking algorithm based on spatio-temporal context is proposed. During tracking, camera shake or occlusion may cause tracking failure; the proposed algorithm addresses this problem effectively. The spatio-temporal context algorithm serves as the core tracker. The target region in the first frame is selected manually with the mouse, and the spatio-temporal context algorithm then tracks the target through the frame sequence. During this process, a similarity measure based on a perceptual hash is used to judge the tracking results; if tracking has failed, the initial value of a Mean Shift algorithm is reset for subsequent target tracking. Experimental results show that the proposed algorithm achieves real-time and stable tracking under camera shake or target occlusion.
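
    The similarity check can be sketched with a simple average hash; the paper does not specify which perceptual hash variant or threshold it uses, so both are assumptions here.

        import numpy as np
        import cv2

        def ahash(patch, size=8):
            # Average hash: downsample, then threshold against the mean.
            g = cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY) if patch.ndim == 3 else patch
            g = cv2.resize(g, (size, size), interpolation=cv2.INTER_AREA)
            return (g > g.mean()).flatten()

        def hash_similarity(h1, h2):
            # 1 minus the normalised Hamming distance between two hashes.
            return 1.0 - np.count_nonzero(h1 != h2) / h1.size

        # Inside the tracking loop (illustrative): if the tracked patch drifts,
        # i.e. hash_similarity(ahash(template), ahash(current)) < 0.75,
        # re-initialise via Mean Shift as the abstract describes.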

  2. Web-based training and interrater reliability testing for scoring the Hamilton Depression Rating Scale.

    PubMed

    Rosen, Jules; Mulsant, Benoit H; Marino, Patricia; Groening, Christopher; Young, Robert C; Fox, Debra

    2008-10-30

    Despite the importance of establishing shared scoring conventions and assessing interrater reliability in clinical trials in psychiatry, these elements are often overlooked. Obstacles to rater training and reliability testing include the logistic difficulties of providing live training sessions or of mailing videotapes of patients to multiple sites and collecting the data for analysis. To address some of these obstacles, a web-based interactive video system was developed. It uses actors of diverse ages, genders, and races to train raters to score the Hamilton Depression Rating Scale and to assess interrater reliability. The system was tested with a group of experienced and novice raters within a single site. It was subsequently used to train the raters of a federally funded multi-center clinical trial on scoring conventions and to test their interrater reliability. The advantages and limitations of using interactive video technology to improve the quality of clinical trials are discussed.

  3. Target matching based on multi-view tracking

    NASA Astrophysics Data System (ADS)

    Liu, Yahui; Zhou, Changsheng

    2011-01-01

    A feature matching method based on Maximally Stable Extremal Regions (MSER) and the Scale Invariant Feature Transform (SIFT) is proposed to solve the problem of matching the same target across multiple cameras. The target foreground is extracted by applying frame differencing twice, and a bounding box regarded as the target region is computed. Extremal regions are obtained with MSER; after being fitted to elliptical regions, they are normalized to unit circles and represented with SIFT descriptors. Initial matches are accepted when the ratio of the closest to the second-closest descriptor distance is below a threshold, and outlier points are eliminated with RANSAC. Experimental results indicate that the method reduces computational complexity effectively and is also robust to affine transformation, rotation, scale, and illumination changes.
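
    The matching pipeline can be approximated in a few lines of OpenCV. Note that this sketch skips the MSER ellipse fitting and normalization step and applies SIFT directly; the ratio and RANSAC thresholds are assumptions.

        import numpy as np
        import cv2

        def match_views(gray_a, gray_b, ratio=0.75):
            sift = cv2.SIFT_create()
            kp_a, des_a = sift.detectAndCompute(gray_a, None)
            kp_b, des_b = sift.detectAndCompute(gray_b, None)
            # Lowe ratio test: best match must clearly beat the second best.
            good = [m for m, n in cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_a, des_b, k=2)
                    if m.distance < ratio * n.distance]
            src = np.float32([kp_a[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
            dst = np.float32([kp_b[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
            # RANSAC homography estimation discards the remaining outliers.
            _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            return [m for m, ok in zip(good, mask.ravel()) if ok]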

  4. Acceptability of Service Targets for ICT-Based Healthcare

    PubMed Central

    Jeon, Eun Min

    2016-01-01

    Objectives In order to adopt and activate telemedicine, it is necessary to survey how medical staff, who are providers of medical services, and consumers, who are the service targets, perceive information and communication technology (ICT)-based healthcare services. Methods This study surveyed the awareness and acceptability of ICT-based healthcare among service targets, specifically workers and students living in the Seoul and Gyeonggi regions who are consumers of healthcare services. To determine the correlations among awareness of ICT-based healthcare, the need for self-management, and acceptability, this study conducted a correlation analysis and a simple regression analysis. Results In the responses on the need for ICT-based healthcare services by item, blood pressure (n = 279, 94.3%) and glucose (n = 277, 93.6%) were the most requested physiological signals for monitoring. Among the six measured factors affecting ICT-based healthcare service acceptability, age, health concerns, and effect expectation had the most significant effects; as effect expectation increased, acceptability became 4.38 times higher (p < 0.05). Conclusions This study identified a positive awareness of ICT-based healthcare services among service targets. The fact that acceptability is higher among people who have a family disease history or greater health concerns may lead to more active participation by service targets. This study also confirmed that a policy to motivate active participation of those in their 40s (who had high prevalence rates) is needed. PMID:27895966

  5. Literature-based condition-specific miRNA-mRNA target prediction.

    PubMed

    Oh, Minsik; Rhee, Sungmin; Moon, Ji Hwan; Chae, Heejoon; Lee, Sunwon; Kang, Jaewoo; Kim, Sun

    2017-01-01

    miRNAs are small non-coding RNAs that regulate gene expression by binding to the 3'-UTRs of genes. Many recent studies have reported that miRNAs play important biological roles by regulating specific mRNAs or genes. Many sequence-based target prediction algorithms have been developed to predict miRNA targets, but these methods are not designed for condition-specific target prediction and produce many false positives; thus, expression-based target prediction algorithms have been developed for condition-specific predictions. A typical strategy for utilizing expression data is to leverage the negative regulatory roles of miRNAs on genes. To control false positives, a stringent cutoff value is typically set, but then these methods tend to reject many true target relationships, i.e., produce false negatives. To overcome these limitations, additional information should be utilized, and the literature is probably the best resource available. Recent literature mining systems compile millions of articles with experiments designed for specific biological questions, and they provide functions to search for specific information. To utilize the literature information, we used a literature mining system, BEST, which automatically extracts information from the literature in PubMed and allows the user to search the literature with any English words. By integrating omics data analysis methods and BEST, we developed Context-MMIA, a miRNA-mRNA target prediction method that combines expression data analysis results with literature information extracted for the user-specified context. In a pathway enrichment analysis using genes included in the top 200 miRNA targets, Context-MMIA outperformed the four existing target prediction methods that we tested. In another test of whether prediction methods can reproduce experimentally validated target relationships, Context-MMIA again outperformed the four existing methods. In summary
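
    The core idea of combining expression anti-correlation with literature support can be illustrated as below; the scoring function, weights, and saturation constant are our illustrative assumptions, not the actual Context-MMIA formula.

        import numpy as np

        def context_score(mirna_expr, mrna_expr, lit_hits, w_expr=0.5, w_lit=0.5):
            # Reward negative miRNA-mRNA correlation (negative regulation) ...
            r = np.corrcoef(mirna_expr, mrna_expr)[0, 1]
            expr_score = max(0.0, -r)
            # ... plus saturating literature evidence (e.g. sentences from BEST).
            lit_score = min(np.log1p(lit_hits) / np.log1p(100), 1.0)
            return w_expr * expr_score + w_lit * lit_score

        mirna = np.array([5.1, 4.2, 6.3, 7.0, 3.9])
        mrna  = np.array([2.0, 2.9, 1.4, 1.1, 3.2])
        print(round(context_score(mirna, mrna, lit_hits=12), 3))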

  6. Validity and reliability assessment of a peer evaluation method in team-based learning classes.

    PubMed

    Yoon, Hyun Bae; Park, Wan Beom; Myung, Sun-Jung; Moon, Sang Hui; Park, Jun-Bean

    2018-03-01

    Team-based learning (TBL) is increasingly employed in medical education because of its potential to promote active group learning. In TBL, learners are usually asked to assess the contributions of peers within their group to ensure accountability. The purpose of this study is to assess the validity and reliability of a peer evaluation instrument that was used in TBL classes in a single medical school. A total of 141 students were divided into 18 groups in 11 TBL classes. The students were asked to evaluate their peers in the group based on evaluation criteria that were provided to them. We analyzed the comments that were written for the highest and lowest achievers to assess the validity of the peer evaluation instrument. The reliability of the instrument was assessed by examining the agreement among peer ratings within each group of students via intraclass correlation coefficient (ICC) analysis. Most of the students provided reasonable and understandable comments for the high and low achievers within their group, and most of those comments were compatible with the evaluation criteria. The average ICC of each group ranged from 0.390 to 0.863, and the overall average was 0.659. There was no significant difference in inter-rater reliability according to the number of members in the group or the timing of the evaluation within the course. The peer evaluation instrument that was used in the TBL classes was valid and reliable. Providing evaluation criteria and rules seemed to improve the validity and reliability of the instrument.
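
    For reference, the agreement statistic used here can be computed from a ratings matrix in a few lines of numpy. The sketch below implements the one-way random-effects ICC(1,1); the study does not state which ICC form it used, so treat this variant and the scores as assumptions.

        import numpy as np

        def icc_oneway(ratings):
            # One-way random-effects ICC(1,1): rows = ratees, columns = raters.
            r = np.asarray(ratings, float)
            n, k = r.shape
            msb = k * np.sum((r.mean(axis=1) - r.mean()) ** 2) / (n - 1)
            msw = np.sum((r - r.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
            return (msb - msw) / (msb + (k - 1) * msw)

        # Hypothetical peer ratings: 4 students, each scored by 5 group peers.
        scores = [[8, 7, 8, 9, 8],
                  [6, 5, 6, 6, 7],
                  [9, 9, 8, 9, 9],
                  [5, 6, 5, 4, 5]]
        print(round(icc_oneway(scores), 3))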

  7. Target Fishing for Chemical Compounds using Target-Ligand Activity data and Ranking based Methods

    PubMed Central

    Wale, Nikil; Karypis, George

    2009-01-01

    In recent years, the development of computational techniques that identify all the likely targets of a given chemical compound, a problem termed Target Fishing, has been an active area of research. Identification of the likely targets of a chemical compound helps in understanding problems such as toxicity, lack of efficacy in humans, and poor physical properties associated with that compound in the early stages of drug discovery. In this paper we present a set of techniques whose goal is to rank or prioritize targets in the context of a given chemical compound, such that most targets the compound may show activity against appear high in the ranked list. These methods are based on our extensions to the SVM and Ranking Perceptron algorithms for this problem. Our extensive experimental study shows that the methods developed in this work outperform previous approaches by 2% to 60% under different evaluation criteria. PMID:19764745
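
    As a flavor of the ranking approach (a bare-bones sketch, not the authors' exact extensions), a ranking perceptron for target fishing keeps one weight vector per target and promotes any true target that fails to outrank a non-target:

        import numpy as np

        def train_ranking_perceptron(X, pos_sets, n_targets, epochs=10, lr=1.0):
            # X: compound descriptors; pos_sets[i]: set of true targets of compound i.
            W = np.zeros((n_targets, X.shape[1]))
            for _ in range(epochs):
                for x, positives in zip(X, pos_sets):
                    for p in positives:
                        for q in set(range(n_targets)) - positives:
                            if W[p] @ x <= W[q] @ x:   # ranking violation
                                W[p] += lr * x         # promote the true target
                                W[q] -= lr * x         # demote the competitor
            return W

        # Ranked targets for a new compound x: np.argsort(-(W @ x))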

  8. Arc-based smoothing of ion beam intensity on targets

    DOE PAGES

    Friedman, Alex

    2012-06-20

    Manipulating a set of ion beams upstream of a target makes it possible to arrange a smoother deposition pattern, so as to achieve more uniform illumination of the target. A uniform energy deposition pattern is important for applications including ion-beam-driven high energy density physics and heavy-ion beam-driven inertial fusion energy ("heavy-ion fusion"). Here, we consider an approach to such smoothing that is based on rapidly "wobbling" each of the beams back and forth along a short arc-shaped path, via oscillating fields applied upstream of the final pulse compression. In this technique, uniformity is achieved in the time-averaged sense; this is sufficient provided the beam oscillation timescale is short relative to the hydrodynamic timescale of the target implosion. This work builds on two earlier concepts: elliptical beams applied to a distributed-radiator target [D. A. Callahan and M. Tabak, Phys. Plasmas 7, 2083 (2000)] and beams that are wobbled so as to trace a number of full rotations around a circular or elliptical path [R. C. Arnold et al., Nucl. Instrum. Methods 199, 557 (1982)]. Here, we describe the arc-based smoothing approach and compare it to results obtainable using an elliptical-beam prescription. In particular, we assess the potential of these approaches for minimizing azimuthal asymmetry in the case of a ring of beams arranged on a cone. We find that, for small numbers of beams on the ring, the arc-based smoothing approach offers superior uniformity. In contrast with the full-rotation approach, arc-based smoothing remains usable when the geometry precludes wobbling the beams around a full circle, e.g., for the X-target [E. Henestroza, B. G. Logan, and L. J. Perkins, Phys. Plasmas 18, 032702 (2011)] and some classes of distributed-radiator targets.

  9. Feature reliability determines specificity and transfer of perceptual learning in orientation search.

    PubMed

    Yashar, Amit; Denison, Rachel N

    2017-12-01

    Training can modify the visual system to produce a substantial improvement on perceptual tasks and therefore has applications for treating visual deficits. Visual perceptual learning (VPL) is often specific to the trained feature, which gives insight into processes underlying brain plasticity, but limits VPL's effectiveness in rehabilitation. Under what circumstances VPL transfers to untrained stimuli is poorly understood. Here we report a qualitatively new phenomenon: intrinsic variation in the representation of features determines the transfer of VPL. Orientations around cardinal are represented more reliably than orientations around oblique in V1, which has been linked to behavioral consequences such as visual search asymmetries. We studied VPL for visual search of near-cardinal or oblique targets among distractors of the other orientation while controlling for other display and task attributes, including task precision, task difficulty, and stimulus exposure. Learning was the same in all training conditions; however, transfer depended on the orientation of the target, with full transfer of learning from near-cardinal to oblique targets but not the reverse. To evaluate the idea that representational reliability was the key difference between the orientations in determining VPL transfer, we created a model that combined orientation-dependent reliability, improvement of reliability with learning, and an optimal search strategy. Modeling suggested that not only search asymmetries but also the asymmetric transfer of VPL depended on preexisting differences between the reliability of near-cardinal and oblique representations. Transfer asymmetries in model behavior also depended on having different learning rates for targets and distractors, such that greater learning for low-reliability distractors facilitated transfer. These findings suggest that training on sensory features with intrinsically low reliability may maximize the generalizability of learning in complex

  10. Feature reliability determines specificity and transfer of perceptual learning in orientation search

    PubMed Central

    2017-01-01

    Training can modify the visual system to produce a substantial improvement on perceptual tasks and therefore has applications for treating visual deficits. Visual perceptual learning (VPL) is often specific to the trained feature, which gives insight into processes underlying brain plasticity, but limits VPL’s effectiveness in rehabilitation. Under what circumstances VPL transfers to untrained stimuli is poorly understood. Here we report a qualitatively new phenomenon: intrinsic variation in the representation of features determines the transfer of VPL. Orientations around cardinal are represented more reliably than orientations around oblique in V1, which has been linked to behavioral consequences such as visual search asymmetries. We studied VPL for visual search of near-cardinal or oblique targets among distractors of the other orientation while controlling for other display and task attributes, including task precision, task difficulty, and stimulus exposure. Learning was the same in all training conditions; however, transfer depended on the orientation of the target, with full transfer of learning from near-cardinal to oblique targets but not the reverse. To evaluate the idea that representational reliability was the key difference between the orientations in determining VPL transfer, we created a model that combined orientation-dependent reliability, improvement of reliability with learning, and an optimal search strategy. Modeling suggested that not only search asymmetries but also the asymmetric transfer of VPL depended on preexisting differences between the reliability of near-cardinal and oblique representations. Transfer asymmetries in model behavior also depended on having different learning rates for targets and distractors, such that greater learning for low-reliability distractors facilitated transfer. These findings suggest that training on sensory features with intrinsically low reliability may maximize the generalizability of learning in complex

  11. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    PubMed

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors aimed to assess the reliability of soft tissue model-based implant surgical guides and reported that accuracy was evaluated using software.1 We found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, likelihood ratio positive (true positive/false negative) and likelihood ratio negative (false positive/true negative), as well as the odds ratio (true results/false results, preferably more than 50), are among the measures used to evaluate the validity (accuracy) of a single test compared to a gold standard.2-4 It is not clear to which of the above-mentioned validity estimates the reported twenty-two accurate sites (46.81%) relate. Reliability (repeatability or reproducibility) is often assessed with statistical tests such as Pearson's r, least squares, and the paired t-test, all of which are common mistakes in reliability analysis.5 Briefly, for quantitative variables the intraclass correlation coefficient (ICC) should be used, and for qualitative variables weighted kappa, with the caution that kappa has its own limitations. Regarding reliability or agreement, it is worth knowing that simple kappa considers only the concordant cells, whereas the discordant cells should also be taken into account to reach a correct estimate of agreement (weighted kappa).2-4 As a take-home message, for reliability and validity analysis, appropriate tests should be

  12. Infrared small target detection based on Danger Theory

    NASA Astrophysics Data System (ADS)

    Lan, Jinhui; Yang, Xiao

    2009-11-01

    To address the problem that traditional methods cannot detect small objects whose local SNR is less than 2 in IR images, a Danger Theory-based model for infrared small target detection is presented in this paper. First, by analogy with immunology, definitions are given for terms such as danger signal, antigen, APC, and antibody, and the matching rule between antigen and antibody is improved. Prior to training the detection model and detecting targets, the IR images are processed with an adaptive smoothing filter to reduce stochastic noise. During training, deletion, generation, crossover, and mutation rules are established, based on a large number of experiments, to achieve fast convergence and obtain good antibodies. The Danger Theory-based model is built after the training process, and this model can detect targets whose local SNR is as low as 1.5.
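
    The local SNR criterion that defines "dim" here is straightforward to compute; in this sketch the window sizes are assumptions, and the candidate pixel is assumed to lie away from the image border.

        import numpy as np

        def local_snr(img, y, x, target_half=1, bg_half=4):
            # (mean of target window - mean of background ring) / background std.
            t, b = target_half, bg_half
            bg = img[y - b:y + b + 1, x - b:x + b + 1].astype(float).copy()
            bg[b - t:b + t + 1, b - t:b + t + 1] = np.nan  # mask target window
            target_mean = img[y - t:y + t + 1, x - t:x + t + 1].mean()
            return (target_mean - np.nanmean(bg)) / (np.nanstd(bg) + 1e-9)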

  13. Reliability of a retrospective decade-based life-course alcohol consumption questionnaire administered in later life.

    PubMed

    Bell, Steven; Britton, Annie

    2015-10-01

    Retrospective measures of alcohol intake are becoming increasingly popular; however, the reliability of such measures remains uncertain. This study assessed the reliability of a retrospective decade-based life-course alcohol consumption questionnaire, based on the standardized Alcohol Use Disorder Identification Test-Consumption (AUDIT-C) administered in older age in a well-characterized cohort study. A retrospective alcohol life-grid was administered to 5980 participants (72% male, mean age 70 years) in the Whitehall II study covering frequency of drinking, number of drinks in a typical drinking day and frequency of consuming six or more drinks in a single drinking occasion in the teens (16-19 years) through to the 80s. A subsample of 385 individuals completed a repeat survey to determine test-retest reliability. Retrospective measures were also compared with prospectively ascertained information and used to predict objectively measured systolic blood pressure to test their predictive validity. Across all decades of life, test-retest reliability was generally good (κ range = 0.62-0.78 for frequency, 0.55-0.62 for usual number of drinks and 0.57-0.65 for frequency of consuming six or more drinks in a single occasion). The concordance between prospective and retrospective measures was consistently moderate to high. The life-grid method performed better than a single question in identifying life-time abstainers. Retrospective measures were also related to systolic blood pressure in the manner anticipated. A retrospective decade-based AUDIT-C grid administered in older age provides a relatively reliable measure of alcohol consumption across the life-course. © 2015 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  14. Pioneering topological methods for network-based drug-target prediction by exploiting a brain-network self-organization theory.

    PubMed

    Durán, Claudio; Daminelli, Simone; Thomas, Josephine M; Haupt, V Joachim; Schroeder, Michael; Cannistraci, Carlo Vittorio

    2017-04-26

    The bipartite network representation of the drug-target interactions (DTIs) in a biosystem enhances understanding of drugs' multifaceted action modes, suggests therapeutic switching for approved drugs, and unveils possible side effects. As experimental testing of DTIs is costly and time-consuming, computational predictors are of great aid. Here, for the first time, state-of-the-art DTI supervised predictors custom-made in network biology were compared, using standard and innovative validation frameworks, with unsupervised purely topological models designed for general-purpose link prediction in bipartite networks. Surprisingly, our results show that bipartite topology alone, if adequately exploited by means of the recently proposed local-community-paradigm (LCP) theory (initially detected in brain-network topological self-organization and afterwards generalized to any complex network), is able to suggest highly reliable predictions, with performance comparable to the state-of-the-art supervised methods that exploit additional (non-topological, for instance biochemical) DTI knowledge. Furthermore, a detailed analysis of the novel predictions revealed that each class of methods prioritizes distinct true interactions; hence, combining methodologies based on diverse principles represents a promising strategy to improve drug-target discovery. To conclude, this study promotes the power of bio-inspired computing, demonstrating that simple unsupervised rules inspired by principles of topological self-organization and adaptiveness arising during learning in living intelligent systems (like the brain) can equal the performance of complicated algorithms based on advanced, supervised and knowledge-based engineering. © The Author 2017. Published by Oxford University Press.

  15. Design for a Crane Metallic Structure Based on Imperialist Competitive Algorithm and Inverse Reliability Strategy

    NASA Astrophysics Data System (ADS)

    Fan, Xiao-Ning; Zhi, Bo

    2017-07-01

    Uncertainties in parameters such as materials, loading, and geometry are inevitable in designing metallic structures for cranes. When considering these uncertainty factors, reliability-based design optimization (RBDO) offers a more reasonable design approach. However, existing RBDO methods for crane metallic structures suffer from low convergence speed and high computational cost. A unilevel RBDO method is developed, combining a discrete imperialist competitive algorithm with an inverse reliability strategy based on the performance measure approach. Application of the imperialist competitive algorithm at the optimization level significantly improves the convergence speed of this RBDO method. At the reliability analysis level, the inverse reliability strategy is used to determine the feasibility of each probabilistic constraint at each design point by calculating its α-percentile performance, thereby avoiding the convergence failure, calculation error, and disproportionate computational effort encountered with conventional moment and simulation methods. Application of the RBDO method to an actual crane structure shows that the developed method realizes a design with the best tradeoff between economy and safety, at about one-third of the convergence time and computational cost of the existing method. This paper provides a scientific and effective approach for the design of metallic structures of cranes.
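
    The inverse reliability (performance measure) step can be sketched with the classic advanced-mean-value recursion, which searches for the minimum of the limit state on the sphere of radius β_t in standard normal space; the linear example limit state below is invented for illustration.

        import numpy as np

        def pma_performance(g, beta_t, n_dim, iters=50, eps=1e-6):
            # Advanced-mean-value iteration for the alpha-percentile performance.
            u = np.zeros(n_dim)
            for _ in range(iters):
                g0 = g(u)
                grad = np.array([(g(u + eps * np.eye(n_dim)[i]) - g0) / eps
                                 for i in range(n_dim)])
                u_new = -beta_t * grad / (np.linalg.norm(grad) + 1e-12)
                if np.linalg.norm(u_new - u) < 1e-8:
                    return g(u_new)
                u = u_new
            return g(u)  # >= 0 means the probabilistic constraint is satisfied

        g = lambda u: 3.0 - u[0] - 0.5 * u[1]  # toy limit state in u-space
        print(round(pma_performance(g, beta_t=2.0, n_dim=2), 3))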

  16. A New Tool for Nutrition App Quality Evaluation (AQEL): Development, Validation, and Reliability Testing

    PubMed Central

    Huang, Wenhao; Chapman-Novakofski, Karen M

    2017-01-01

    Background The extensive availability and increasing use of mobile apps for nutrition-based health interventions make evaluation of the quality of these apps crucial for the integration of apps into nutritional counseling. Objective The goal of this research was the development, validation, and reliability testing of the app quality evaluation (AQEL) tool, an instrument for evaluating apps' educational quality and technical functionality. Methods Items for evaluating app quality were adapted from website evaluations, with additional items added to evaluate the specific characteristics of apps, resulting in 79 initial items. Expert panels of nutrition and technology professionals and app users reviewed the items for face and content validation. After the recommended revisions, nutrition experts completed a second AQEL review to ensure clarity. On the basis of 150 sets of responses using the revised AQEL, a principal component analysis was completed, reducing the AQEL to 5 factors that underwent reliability testing, including internal consistency, split-half reliability, test-retest reliability, and interrater reliability (IRR). Two additional modifiable constructs for evaluating apps based on the age and needs of the target audience, as selected by the evaluator, were also tested for construct reliability. IRR testing using intraclass correlations (ICC) with all 7 constructs was conducted, with 15 dietitians evaluating one app. Results Development and validation resulted in the 51-item AQEL. These were reduced to 25 items in 5 factors after principal component analysis, plus 9 modifiable items in two constructs that were not included in the principal component analysis. Internal consistency and split-half reliability of the constructs derived from the principal component analysis were good (Cronbach alpha >.80, Spearman-Brown coefficient >.80): behavior change potential, support of knowledge acquisition, app function, and skill development. App purpose split half-reliability was
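
    The internal-consistency statistics named above are easy to reproduce on an item-response matrix. A small sketch with the usual formulas (not AQEL's data):

        import numpy as np

        def cronbach_alpha(items):
            # rows = respondents, columns = items of one construct.
            X = np.asarray(items, float)
            k = X.shape[1]
            return k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum()
                                  / X.sum(axis=1).var(ddof=1))

        def spearman_brown(r_half):
            # Step up a half-test correlation to full-length split-half reliability.
            return 2 * r_half / (1 + r_half)

        print(round(spearman_brown(0.7), 2))  # -> 0.82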

  17. Psychometric Properties of Performance-based Measurements of Functional Capacity: Test-Retest Reliability, Practice Effects, and Potential Sensitivity to Change

    PubMed Central

    Leifker, Feea R.; Patterson, Thomas L.; Bowie, Christopher R.; Mausbach, Brent T.; Harvey, Philip D.

    2010-01-01

    Performance-based measures of the ability to perform social and everyday living skills are being used more widely to assess functional capacity in people with serious mental illnesses such as schizophrenia and bipolar disorder. Since they are also being used as outcome measures in pharmacological and cognitive remediation studies aimed at cognitive impairments in schizophrenia, understanding their measurement properties and potential sensitivity to change is important. In this study, the test-retest reliability, practice effects, and reliable change indices of two performance-based functional capacity measures, the UCSD Performance-Based Skills Assessment (UPSA) and the Social Skills Performance Assessment (SSPA), were examined over several retest intervals in two samples of people with schizophrenia (n's = 238 and 116) and a healthy comparison sample (n = 109). These psychometric properties were compared to those of a neuropsychological (NP) assessment battery. Test-retest reliabilities of the long form of the UPSA ranged from r = .63 to r = .80 over follow-up periods of up to 36 months in people with schizophrenia, while brief UPSA reliabilities ranged from r = .66 to r = .81. Test-retest reliability of the NP performance scores ranged from r = .77 to r = .79. Test-retest reliabilities of the UPSA were lower in healthy controls, while NP performance was slightly more reliable. SSPA test-retest reliability was lower. Practice effect sizes ranged from .05 to .16 for the UPSA and .07 to .19 for the NP assessment in patients, with healthy controls showing larger practice effects. Reliable change intervals were consistent across the NP and both functional capacity measures, indicating equal potential for the detection of change. These performance-based measures of functional capacity appear to have similar potential to be sensitive to change compared with NP performance in people with schizophrenia. PMID:20399613
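
    A reliable change index of the Jacobson-Truax type underlies such "reliable change intervals"; a minimal sketch with invented numbers:

        import numpy as np

        def reliable_change_index(x1, x2, sd_baseline, r_testretest):
            # Change score divided by the standard error of the difference.
            se_diff = sd_baseline * np.sqrt(2.0) * np.sqrt(1.0 - r_testretest)
            return (x2 - x1) / se_diff

        # Hypothetical: a UPSA-like score moving from 78 to 85, SD = 12, r = .80.
        print(round(reliable_change_index(78, 85, 12, 0.80), 2))
        # |RCI| > 1.96 suggests change beyond measurement error.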

  18. Composite Reliability of a Workplace-Based Assessment Toolbox for Postgraduate Medical Education

    ERIC Educational Resources Information Center

    Moonen-van Loon, J. M. W.; Overeem, K.; Donkers, H. H. L. M.; van der Vleuten, C. P. M.; Driessen, E. W.

    2013-01-01

    In recent years, postgraduate assessment programmes around the world have embraced workplace-based assessment (WBA) and its related tools. Despite their widespread use, results of studies on the validity and reliability of these tools have been variable. Although in many countries decisions about residents' continuation of training and…

  19. A reliability assessment of constrained spherical deconvolution-based diffusion-weighted magnetic resonance imaging in individuals with chronic stroke.

    PubMed

    Snow, Nicholas J; Peters, Sue; Borich, Michael R; Shirzad, Navid; Auriat, Angela M; Hayward, Kathryn S; Boyd, Lara A

    2016-01-15

    Diffusion-weighted magnetic resonance imaging (DW-MRI) is commonly used to assess white matter properties after stroke. Novel work is utilizing constrained spherical deconvolution (CSD) to estimate complex intra-voxel fiber architecture unaccounted for by tensor-based fiber tractography. However, the reliability of CSD-based tractography has not been established in people with chronic stroke; the objective of this study was therefore to establish the reliability of CSD-based DW-MRI in chronic stroke. High-resolution DW-MRI was performed in ten adults with chronic stroke during two separate sessions. Deterministic region-of-interest-based fiber tractography using CSD was performed by two raters. Mean fractional anisotropy (FA), apparent diffusion coefficient (ADC), tract number, and tract volume were extracted from reconstructed fiber pathways in the corticospinal tract (CST) and superior longitudinal fasciculus (SLF); callosal fiber pathways connecting the primary motor cortices were also evaluated. Inter-rater and test-retest reliability were determined by intra-class correlation coefficients (ICCs). ICCs revealed excellent reliability for FA and ADC in the ipsilesional (0.86-1.00; p<0.05) and contralesional hemispheres (0.94-1.00; p<0.0001) for CST and SLF fibers, and excellent reliability for all metrics in callosal fibers (0.85-1.00; p<0.05). ICCs ranged from poor to excellent for tract number and tract volume in the ipsilesional (-0.11 to 0.92; p≤0.57) and contralesional hemispheres (-0.27 to 0.93; p≤0.64) for CST and SLF fibers. Like other select DW-MRI approaches, CSD-based tractography is a reliable approach for evaluating FA and ADC in major white matter pathways in chronic stroke. Future work should address the reproducibility and utility of CSD-based metrics of tract number and tract volume. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Reliability based impact localization in composite panels using Bayesian updating and the Kalman filter

    NASA Astrophysics Data System (ADS)

    Morse, Llewellyn; Sharif Khodaei, Zahra; Aliabadi, M. H.

    2018-01-01

    In this work, a reliability-based impact detection strategy for a sensorized composite structure is proposed. Impacts are localized using Artificial Neural Networks (ANNs), with guided waves recorded due to impacts used as inputs. To account for variability in the recorded data under operational conditions, Bayesian updating and Kalman filter techniques are applied to improve the reliability of the detection algorithm. The possibility of one or more faulty sensors is considered, and a decision fusion algorithm based on sub-networks of sensors is proposed to improve the applicability of the methodology to real structures. A strategy for reliably categorizing impacts into high-energy impacts likely to cause damage to the structure (true impacts) and low-energy non-damaging impacts (false impacts) is also proposed to reduce the false alarm rate; it employs classification ANNs with different features extracted from the captured signals as inputs. The proposed methodologies are validated by experimental results on a quasi-isotropic composite coupon impacted with a range of impact energies.
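
    A single Kalman measurement update, of the kind that could fuse successive ANN location estimates, looks as follows; the state, noise levels, and direct-observation model are assumptions for illustration, not the paper's configuration.

        import numpy as np

        def kalman_update(x, P, z, H, R):
            # Standard linear Kalman measurement update.
            S = H @ P @ H.T + R               # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
            x_new = x + K @ (z - H @ x)
            P_new = (np.eye(x.size) - K @ H) @ P
            return x_new, P_new

        # Hypothetical 2-D impact coordinates observed directly with noise.
        x, P = np.array([100.0, 50.0]), np.eye(2) * 25.0
        x, P = kalman_update(x, P, z=np.array([104.0, 47.0]),
                             H=np.eye(2), R=np.eye(2) * 9.0)
        print(x)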

  1. The Reliability of Environmental Measures of the College Alcohol Environment.

    ERIC Educational Resources Information Center

    Clapp, John D.; Whitney, Mike; Shillington, Audrey M.

    2002-01-01

    Assesses the inter-rater reliability of two environmental scanning tools designed to identify alcohol-related advertisements targeting college students. Inter-rater reliability for these forms varied across different rating categories and ranged from poor to excellent. Suggestions for future research are addressed. (Contains 26 references and 6…

  2. Development of a PCR-based assay for rapid and reliable identification of pathogenic Fusaria.

    PubMed

    Mishra, Prashant K; Fox, Roland T V; Culham, Alastair

    2003-01-28

    Identification of Fusarium species has always been difficult due to confusing phenotypic classification systems. We have developed a fluorescence-based polymerase chain reaction assay that allows rapid and reliable identification of five toxigenic and pathogenic Fusarium species: Fusarium avenaceum, F. culmorum, F. equiseti, F. oxysporum, and F. sambucinum. The method is based on the PCR amplification of species-specific DNA fragments using fluorescent oligonucleotide primers, designed from sequence divergence within the internal transcribed spacer region of nuclear ribosomal DNA. Besides providing an accurate, reliable, and quick diagnosis of these Fusaria, this method also reduces the potential for exposure to carcinogenic chemicals, as it substitutes fluorescent dyes for ethidium bromide. Apart from its multidisciplinary importance and usefulness, it also obviates the need for gel electrophoresis.

  3. Reliability-Based Life Assessment of Stirling Convertor Heater Head

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Halford, Gary R.; Korovaichuk, Igor

    2004-01-01

    Onboard radioisotope power systems being developed and planned for NASA's deep-space missions require reliable design lifetimes of up to 14 yr. The structurally critical heater head of the high-efficiency Stirling power convertor has undergone extensive computational analysis of operating temperatures, stresses, and creep resistance of the thin-walled Inconel 718 material. A preliminary assessment of the effect of uncertainties in the material behavior was also performed. Creep failure resistance of the thin-walled heater head can vary due to small deviations in the manufactured thickness and uncertainties in operating temperature and pressure. Durability prediction and reliability of the heater head are affected by these deviations from nominal design conditions, so it is important to include their effects in predicting the probability of survival of the heater head under mission loads. Furthermore, the heater head may experience rare, short-duration temperature excursions, which would affect the creep strain rate and, therefore, the life; this paper addresses the effect of such rare incidences on reliability. In addition, the sensitivities of the variables affecting reliability are quantified, and guidelines to improve reliability are outlined. Heater head reliability is being quantified with data from NASA Glenn Research Center's accelerated benchmark testing program.
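
    To illustrate how such uncertainties propagate into a survival probability, the Monte Carlo fragment below draws thickness and temperature samples and pushes them through a creep-life relation. Every distribution and coefficient in it is an invented placeholder, not the paper's Inconel 718 model.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        # Invented placeholder model: life falls with wall temperature and
        # with the stress implied by a thinner wall.
        thickness = rng.normal(1.00, 0.02, n)        # wall thickness, mm
        temp = rng.normal(650.0, 5.0, n)             # wall temperature, deg C
        stress = 40.0 / thickness                    # crude wall-stress relation, MPa
        log_life_h = (30.0 - 0.025 * temp - 0.05 * stress
                      + rng.normal(0.0, 0.3, n))     # scatter in creep life
        life_yr = np.exp(log_life_h) / 8760.0        # hours -> years
        print("P(life >= 14 yr) ~", round(float((life_yr >= 14.0).mean()), 3))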

  4. Beyond the Sparsity-Based Target Detector: A Hybrid Sparsity and Statistics Based Detector for Hyperspectral Images.

    PubMed

    Du, Bo; Zhang, Yuxiang; Zhang, Liangpei; Tao, Dacheng

    2016-08-18

    Hyperspectral images provide great potential for target detection; however, they also introduce new challenges, so hyperspectral target detection should be treated as a new problem and modeled differently. Many classical detectors have been proposed based on the linear mixing model and the sparsity model. However, the former cannot deal well with spectral variability given limited endmembers, and the latter usually treats target detection as a simple classification problem and pays little attention to the low target probability. Can we find an efficient way to utilize both the high-dimensional features behind hyperspectral images and the limited target information to extract small targets? This paper proposes a novel sparsity-based detector named the hybrid sparsity and statistics detector (HSSD) for target detection in hyperspectral imagery, which can effectively deal with these two problems. The proposed algorithm designs a hypothesis-specific dictionary based on the prior hypotheses for the test pixel, which avoids an imbalanced number of training samples for a class-specific dictionary. Then, a purification process is applied to the background training samples in order to construct an effective competition between the two hypotheses. Next, a sparse-representation-based binary hypothesis model with additive Gaussian noise is proposed to represent the image. Finally, a generalized likelihood ratio test is performed to obtain a more robust detection decision than reconstruction-residual-based detection methods. Extensive experimental results with three hyperspectral datasets confirm that the proposed HSSD algorithm clearly outperforms the state-of-the-art target detectors.
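
    A stripped-down stand-in for the binary-hypothesis test conveys the structure: ordinary least squares replaces the sparse coding used in the paper, and the dictionaries are simply matrices of sample spectra.

        import numpy as np

        def residual_ratio_detector(x, D_bg, D_tgt):
            # Reconstruct the test pixel under H0 (background dictionary only)
            # and H1 (background + target dictionary); compare residuals.
            def residual(D):
                c, *_ = np.linalg.lstsq(D, x, rcond=None)
                return np.linalg.norm(x - D @ c)
            return residual(D_bg) / (residual(np.hstack([D_bg, D_tgt])) + 1e-12)

        # A large ratio means the target atoms explain the pixel much better,
        # i.e. evidence for H1 (target present).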

  5. A reliable transmission protocol for ZigBee-based wireless patient monitoring.

    PubMed

    Chen, Shyr-Kuen; Kao, Tsair; Chan, Chia-Tai; Huang, Chih-Ning; Chiang, Chih-Yen; Lai, Chin-Yu; Tung, Tse-Hua; Wang, Pi-Chung

    2012-01-01

    Patient monitoring systems are gaining importance as the fast-growing global elderly population increases the demand for care. These systems use wireless technologies to transmit vital signs for medical evaluation. In a multihop ZigBee network, existing systems usually use broadcast or multicast schemes to increase the reliability of signal transmission; however, both schemes lead to significantly higher network traffic and end-to-end transmission delay. In this paper, we present a reliable transmission protocol based on anycast routing for wireless patient monitoring. Our scheme automatically selects the closest data receiver in an anycast group as the destination, reducing transmission latency as well as control overhead. The new protocol also shortens the latency of path recovery by initiating route recovery from the intermediate routers of the original path. On the basis of this reliable transmission scheme, we implement a ZigBee device for fall monitoring, which integrates fall detection, indoor positioning, and ECG monitoring. When the triaxial accelerometer of the device detects a fall, the current position of the patient is transmitted to an emergency center through the ZigBee network, and 4-s ECG signals are also transmitted to clarify the situation of the fallen patient. Our transmission scheme ensures the successful delivery of these critical messages. The experimental results show that our scheme is fast and reliable. We also demonstrate that our devices can seamlessly integrate with the next-generation wireless wide area network technology, Worldwide Interoperability for Microwave Access (WiMAX), to achieve real-time patient monitoring.

  6. Camera-based measurement of respiratory rates is reliable.

    PubMed

    Becker, Christoph; Achermann, Stefan; Rocque, Mukul; Kirenko, Ihor; Schlack, Andreas; Dreher-Hummel, Thomas; Zumbrunn, Thomas; Bingisser, Roland; Nickel, Christian H

    2017-06-01

    Respiratory rate (RR) is one of the most important vital signs used to detect whether a patient is in critical condition. It is part of many risk scores, and its measurement is essential for the triage of patients in emergency departments; yet it is often not recorded because manual measurement is cumbersome and time-consuming. We therefore evaluated the accuracy of camera-based measurement as an alternative to the current practice of manual counting. We monitored the RR of healthy male volunteers with a camera-based prototype application and simultaneously by manual counting and by capnography, which was considered the gold standard; the four assessors were mutually blinded. We simulated normoventilation, hypoventilation, and hyperventilation, as well as deep, normal, and superficial breathing depths, to reflect potential clinical settings. The volunteers were assessed while undressed, wearing a T-shirt, or wearing a winter coat. In total, 20 volunteers were included. Camera-based measurements of RR and capnography were in close agreement across all clothing styles and respiratory patterns (Pearson's correlation coefficient, r = 0.90-1.00), except for one scenario in which the volunteer breathed slowly while dressed in a winter coat (r = 0.84). In the winter-coat scenarios, the camera-based prototype application was superior to human counters. In this pilot study, camera-based measurements delivered accurate and reliable results. Future studies need to show that camera-based measurement is a dependable alternative for measuring RR in clinical settings as well.

  7. CardioGuard: A Brassiere-Based Reliable ECG Monitoring Sensor System for Supporting Daily Smartphone Healthcare Applications

    PubMed Central

    Kwon, Sungjun; Kim, Jeehoon; Kang, Seungwoo; Lee, Youngki; Baek, Hyunjae

    2014-01-01

    We propose CardioGuard, a brassiere-based reliable electrocardiogram (ECG) monitoring sensor system for supporting daily smartphone healthcare applications. It is designed to satisfy two key requirements for user-unobtrusive daily ECG monitoring: reliability of ECG sensing and usability of the sensor. The system was validated through extensive evaluations. The results showed that the CardioGuard sensor reliably measured the ECG during 12 representative daily activities involving diverse movement levels; 89.53% of QRS peaks were detected on average. A questionnaire-based user study with 15 participants showed that the CardioGuard sensor was comfortable and unobtrusive. Additionally, a signal-to-noise ratio test and a washing durability test were conducted to demonstrate the high-quality sensing of the proposed sensor and its physical durability in practical use, respectively. PMID:25405527

  8. A Bayesian-Based EDA Tool for Nano-circuits Reliability Calculations

    NASA Astrophysics Data System (ADS)

    Ibrahim, Walid; Beiu, Valeriu

    As the sizes of (nano-)devices are aggressively scaled deep into the nanometer range, the design and manufacture of future (nano-)circuits will become extremely complex, will inevitably introduce more defects, and their functioning will be adversely affected by transient faults. Therefore, accurately calculating the reliability of future designs will become a very important concern for (nano-)circuit designers as they investigate several design alternatives to optimize the trade-offs between the conflicting metrics of area-power-energy-delay and reliability. This paper introduces a novel generic technique for the accurate calculation of the reliability of future nano-circuits. Our aim is to provide both educational and research institutions (as well as the semiconductor industry at a later stage) with an accurate and easy-to-use tool for closely comparing the reliability of different design alternatives, and for easily selecting the design that best fits a set of given (design) constraints. Moreover, the reliability model generated by the tool should empower designers with the unique opportunity to understand the influence individual gates have on the design's overall reliability and to identify the (few) gates that impact the design's reliability most significantly.
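
    The flavor of such a calculation can be shown on a three-gate NAND circuit in which each gate output flips independently with a small probability. The circuit and the flip probability are invented, and brute-force enumeration over fault patterns stands in for the tool's Bayesian model.

        import itertools

        EPS = 0.01  # per-gate output-flip probability (assumed)

        def nand(a, b):
            return 1 - (a & b)

        def circuit(x, faults):
            # Two-level NAND circuit; each fault bit flips one gate's output.
            g1 = nand(x[0], x[1]) ^ faults[0]
            g2 = nand(x[2], x[3]) ^ faults[1]
            return nand(g1, g2) ^ faults[2]

        def output_reliability(x):
            # P(correct output) by enumerating all gate-fault patterns.
            ref, p_ok = circuit(x, (0, 0, 0)), 0.0
            for faults in itertools.product((0, 1), repeat=3):
                p = 1.0
                for f in faults:
                    p *= EPS if f else 1.0 - EPS
                if circuit(x, faults) == ref:
                    p_ok += p
            return p_ok

        print(round(output_reliability((1, 0, 1, 1)), 5))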

  9. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies, in the form of global and sequential local response surface (RS) techniques, for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS), whereas the local technique uses multiple first-order RS models, each applied to a small subregion of the FDS. Alternative methods for the calculation of the unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, the thickness and orientation angle of each ply, cylinder diameter and length, and the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of the ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method, with the reliability index treated as the design constraint. In addition to a probabilistic sensitivity analysis of the reliability index, results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy, with the sequential local RS technique having considerably better computational efficiency.
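
    The global second-order RS model referred to above is an ordinary least-squares fit over a full quadratic basis. A compact sketch on an invented two-variable "buckling load" function:

        import numpy as np

        def fit_quadratic_rs(X, y):
            # y ~ b0 + sum_i b_i x_i + sum_{i<=j} b_ij x_i x_j, by least squares.
            n, d = X.shape
            cols = [np.ones(n)] + [X[:, i] for i in range(d)]
            cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
            A = np.column_stack(cols)
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            return beta

        rng = np.random.default_rng(1)
        X = rng.uniform(-1.0, 1.0, (30, 2))
        y = 5 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1] \
            + rng.normal(0, 0.05, 30)
        print(np.round(fit_quadratic_rs(X, y), 2))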

  10. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need in order to undertake a reliable mission. These objectives have been structured to lead mission planning through the construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it also holds the opportunity to enable new directions as NASA moves forward in tackling the challenges of space exploration.

  11. Reliability and Validity of Assessing User Satisfaction With Web-Based Health Interventions

    PubMed Central

    Lehr, Dirk; Reis, Dorota; Vis, Christiaan; Riper, Heleen; Berking, Matthias; Ebert, David Daniel

    2016-01-01

    Background The perspective of users should be taken into account in the evaluation of Web-based health interventions. Assessing the users’ satisfaction with the intervention they receive could enhance the evidence for the intervention effects. Thus, there is a need for valid and reliable measures to assess satisfaction with Web-based health interventions. Objective The objective of this study was to analyze the reliability, factorial structure, and construct validity of the Client Satisfaction Questionnaire adapted to Internet-based interventions (CSQ-I). Methods The psychometric quality of the CSQ-I was analyzed in user samples from 2 separate randomized controlled trials evaluating Web-based health interventions, one from a depression prevention intervention (sample 1, N=174) and the other from a stress management intervention (sample 2, N=111). At first, the underlying measurement model of the CSQ-I was analyzed to determine the internal consistency. The factorial structure of the scale and the measurement invariance across groups were tested by multigroup confirmatory factor analyses. Additionally, the construct validity of the scale was examined by comparing satisfaction scores with the primary clinical outcome. Results Multigroup confirmatory analyses on the scale yielded a one-factorial structure with a good fit (root-mean-square error of approximation =.09, comparative fit index =.96, standardized root-mean-square residual =.05) that showed partial strong invariance across the 2 samples. The scale showed very good reliability, indicated by McDonald omegas of .95 in sample 1 and .93 in sample 2. Significant correlations with change in depressive symptoms (r=−.35, P<.001) and perceived stress (r=−.48, P<.001) demonstrated the construct validity of the scale. Conclusions The proven internal consistency, factorial structure, and construct validity of the CSQ-I indicate a good overall psychometric quality of the measure to assess the user’s general

  12. Thermal Management and Reliability of Automotive Power Electronics and Electric Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narumanchi, Sreekant V; Bennion, Kevin S; Cousineau, Justine E

    Low-cost, high-performance thermal management technologies are helping meet aggressive power density, specific power, cost, and reliability targets for power electronics and electric machines. The National Renewable Energy Laboratory is working closely with numerous industry and research partners to help influence the development of components that meet aggressive performance and cost targets, through the development and characterization of cooling technologies and through thermal characterization and improvement of passive stack materials and interfaces. Thermomechanical reliability and lifetime estimation models are important enablers for industry in cost- and time-effective design.

  13. Therapeutic Targeting of Siglecs using Antibody- and Glycan-based Approaches

    PubMed Central

    Angata, Takashi; Nycholat, Corwin M.; Macauley, Matthew S.

    2015-01-01

    The sialic acid-binding immunoglobulin-like lectins (Siglecs) are a family of immunomodulatory receptors whose functions are regulated by their glycan ligands. Siglecs are attractive therapeutic targets because of their cell-type specific expression pattern, endocytic properties, high expression on certain lymphomas/leukemias, and ability to modulate receptor signaling. Siglec-targeting approaches with therapeutic potential encompass antibody- and glycan-based strategies. Several antibody-based therapies are in clinical trials and continue to be developed for the treatment of lymphoma/leukemia and autoimmune disease, while the therapeutic potential of glycan-based strategies for cargo-delivery and immunomodulation is a promising new approach. Here, we review these strategies with special emphasis on emerging approaches and disease areas that may benefit from targeting the Siglec family. PMID:26435210

  14. Generalizability Theory Reliability of Written Expression Curriculum-Based Measurement in Universal Screening

    ERIC Educational Resources Information Center

    Keller-Margulis, Milena A.; Mercer, Sterett H.; Thomas, Erin L.

    2016-01-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African…

  15. Ligand cluster-based protein network and ePlatton, a multi-target ligand finder.

    PubMed

    Du, Yu; Shi, Tieliu

    2016-01-01

    Small molecules are information carriers that make cells aware of external changes and couple internal metabolic and signalling pathway systems with each other. In some specific physiological states, natural or artificial molecules are used to interact with selected biological targets to activate or inhibit their functions to achieve an expected biological and physiological output. Millions of years of evolution have optimized biological processes and pathways, and now the endocrine and immune systems cannot work properly without some key small molecules. Over thousands of years, the human race has managed to find many medicines against diseases through trial-and-error experience. In recent decades, with the deepening understanding of life and the progress of molecular biology, researchers have spared no effort to design molecules targeting one or two key enzymes and receptors related to corresponding diseases. But recent studies in pharmacogenomics have shown that polypharmacology may be necessary for the effects of drugs, which challenges the paradigm 'one drug, one target, one disease'. Nowadays, cheminformatics and structural biology can help us reasonably take advantage of polypharmacology to design next-generation promiscuous drugs and drug combination therapies. In this work, 234,591 protein-ligand interactions were extracted from ChEMBL. By 2D structure similarity, 13,769 ligand clusters emerged from 156,151 distinct ligands recognized by 1477 proteins. Ligand cluster- and sequence-based protein networks (LCBN, SBN) were constructed, compared, and analysed. For assisting compound design, exploring polypharmacology, and finding possible drug combinations, we integrated pathways, diseases, drug adverse reactions, and the relationships of targets and ligand clusters into the web platform ePlatton, which is available at http://www.megabionet.org/eplatton. Although there were some disagreements between the LCBN and SBN, communities in both networks were largely the same
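
    The record describes clustering 156,151 ligands by 2D structure similarity. A minimal sketch of one common approach, greedy leader clustering over Tanimoto similarity of fingerprint bit sets, is shown below; the fingerprints and threshold are hypothetical, and ePlatton's exact procedure may differ (Python).

        def tanimoto(a, b):
            """Tanimoto similarity between two fingerprint bit sets."""
            return len(a & b) / len(a | b)

        def leader_cluster(fingerprints, threshold=0.85):
            """Each ligand joins the first cluster whose leader it matches
            above the threshold; otherwise it founds a new cluster."""
            leaders, clusters = [], []
            for idx, fp in enumerate(fingerprints):
                for c, lead in enumerate(leaders):
                    if tanimoto(fp, lead) >= threshold:
                        clusters[c].append(idx)
                        break
                else:
                    leaders.append(fp)
                    clusters.append([idx])
            return clusters

        # Hypothetical fingerprints given as sets of "on" bit positions.
        fps = [{1, 2, 3, 4}, {1, 2, 3, 5}, {7, 8, 9}, {7, 8, 9, 10}]
        print(leader_cluster(fps, threshold=0.6))  # -> [[0, 1], [2, 3]]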

  16. Log-polar mapping-based scale space tracking with adaptive target response

    NASA Astrophysics Data System (ADS)

    Li, Dongdong; Wen, Gongjian; Kuai, Yangliu; Zhang, Ximing

    2017-05-01

    Correlation filter-based tracking has exhibited impressive robustness and accuracy in recent years. Standard correlation filter-based trackers are restricted to translation estimation and equipped with a fixed target response. These trackers produce inferior performance when confronted with significant scale variation or appearance change. We propose a log-polar mapping-based scale space tracker with an adaptive target response. This tracker transforms the scale variation of the target in Cartesian space into a shift along the logarithmic axis in log-polar space. A one-dimensional scale correlation filter is learned online to estimate the shift along the logarithmic axis. With the log-polar representation, scale estimation is achieved accurately without a multiresolution pyramid. To achieve an adaptive target response, the variance of the Gaussian function is computed from the response map and updated online with a learning rate parameter. Our log-polar mapping-based scale correlation filter and adaptive target response can be combined with any correlation filter-based tracker. In addition, the scale correlation filter can be extended to a two-dimensional correlation filter to achieve joint estimation of scale variation and in-plane rotation. Experiments performed on the OTB50 benchmark demonstrate that our tracker achieves superior performance against state-of-the-art trackers.
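
    The key property the tracker exploits is that isotropic scaling in Cartesian coordinates becomes a pure shift along the radial axis of a log-polar map. A minimal nearest-neighbour log-polar resampler, written to make that property explicit, is sketched below; the grid sizes and test image are arbitrary choices, not the paper's settings (Python).

        import numpy as np

        def log_polar(img, n_rho=64, n_theta=64):
            """Nearest-neighbour log-polar resampling about the image centre.
            Scaling the content by s shifts rows by log(s) along the rho axis."""
            h, w = img.shape
            cx, cy = w / 2.0, h / 2.0
            max_r = min(cx, cy)
            rhos = np.exp(np.linspace(0.0, np.log(max_r), n_rho))  # log-spaced radii
            thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
            out = np.zeros((n_rho, n_theta))
            for i, r in enumerate(rhos):
                for j, t in enumerate(thetas):
                    x, y = int(cx + r * np.cos(t)), int(cy + r * np.sin(t))
                    if 0 <= x < w and 0 <= y < h:
                        out[i, j] = img[y, x]
            return out

        img = np.zeros((64, 64))
        img[24:40, 24:40] = 1.0       # a square "target" centred in the frame
        lp = log_polar(img)           # enlarging the square shifts the map rows

    A one-dimensional correlation filter run along the rho axis of such a map therefore recovers scale as a shift, which is the core of the proposed scale estimator.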

  17. Reliability and validity of the neurorehabilitation experience questionnaire for inpatients.

    PubMed

    Kneebone, Ian I; Hull, Samantha L; McGurk, Rhona; Cropley, Mark

    2012-09-01

    Patient-centered measures of the inpatient neurorehabilitation experience are needed to assess services. The objective of this study was to develop a valid and reliable Neurorehabilitation Experience Questionnaire (NREQ) to assess whether neurorehabilitation inpatients experience service elements important to them. Based on themes established in prior qualitative research, adopting questions from established inventories, and using a literature review, a draft version of the NREQ was generated. Focus groups and interviews were conducted with 9 patients and 26 staff from neurological rehabilitation units to establish face validity. Then, 70 patients were recruited to complete the NREQ to ascertain reliability (internal and test-retest) and concurrent validity. On the basis of the face validity testing, several modifications were made to the draft version of the NREQ. Subsequently, internal reliability (time 1 α = .76, time 2 α = .80), test-retest reliability (r = 0.70), and concurrent validity (r = 0.32 and r = 0.56) were established for the revised version. Whereas responses were associated with positive mood (r = 0.30), they appeared not to be influenced by negative mood, age, education, length of stay, sex, functional independence, or whether a participant had previously been a patient on a unit. Preliminary validation of the NREQ suggests promise for use with its target population.

  18. Reliability Modeling Development and Its Applications for Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.

  19. Statistical Modeling of Single Target Cell Encapsulation

    PubMed Central

    Moon, SangJun; Ceyhan, Elvan; Gurkan, Umut Atakan; Demirci, Utkan

    2011-01-01

    High-throughput drop-on-demand systems for the separation and encapsulation of individual target cells from heterogeneous mixtures of multiple cell types are an emerging method in biotechnology with broad applications in tissue engineering and regenerative medicine, genomics, and cryobiology. However, cell encapsulation in droplets is a random process that is hard to control. Statistical models can provide an understanding of the underlying processes, allow estimation of the relevant parameters, and enable reliable and repeatable control over the encapsulation of cells in droplets during the isolation process with a high confidence level. We have modeled and experimentally verified a microdroplet-based cell encapsulation process for various combinations of cell loading and target cell concentrations. Here, we explain theoretically and validate experimentally a model to isolate and pattern single target cells from heterogeneous mixtures without using complex peripheral systems. PMID:21814548
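
    Random encapsulation of this kind is classically modeled with Poisson statistics. The sketch below, under the assumption of independent Poisson loading with a fixed target-cell fraction (all numbers hypothetical, not the paper's fitted values), computes the probability that a droplet contains exactly one target cell and no other cells (Python).

        from math import exp, factorial

        def poisson_pmf(k, lam):
            return lam ** k * exp(-lam) / factorial(k)

        cells_per_droplet = 0.3     # mean total cell loading per droplet
        target_fraction = 0.1       # fraction of target cells in the mixture

        lam_t = cells_per_droplet * target_fraction         # target cells/droplet
        lam_o = cells_per_droplet - lam_t                   # non-target cells/droplet
        p = poisson_pmf(1, lam_t) * poisson_pmf(0, lam_o)   # one target, nothing else
        print(f"P(exactly one target cell, no others) = {p:.4f}")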

  20. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
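
    The patented method's contribution is an efficient cut-set search; for orientation only, the brute-force sketch below enumerates minimal link-failure cut sets of a small undirected network and applies the rare-event approximation to bound unreliability. The example network and failure probability are hypothetical, and this exhaustive search scales exponentially, which is exactly the cost the patented algorithm avoids (Python).

        from itertools import combinations

        def connected(nodes, edges):
            """Depth-first connectivity test on an undirected edge list."""
            adj = {n: set() for n in nodes}
            for u, v in edges:
                adj[u].add(v)
                adj[v].add(u)
            seen, stack = {nodes[0]}, [nodes[0]]
            while stack:
                for nb in adj[stack.pop()]:
                    if nb not in seen:
                        seen.add(nb)
                        stack.append(nb)
            return len(seen) == len(nodes)

        def minimal_cut_sets(nodes, edges):
            """Enumerate link-failure combinations, keeping only minimal cuts."""
            cuts = []
            for k in range(1, len(edges) + 1):
                for combo in combinations(edges, k):
                    if any(set(c) <= set(combo) for c in cuts):
                        continue        # a subset already disconnects the net
                    if not connected(nodes, [e for e in edges if e not in combo]):
                        cuts.append(combo)
            return cuts

        nodes = ["A", "B", "C", "D"]
        edges = [("A", "B"), ("A", "C"), ("B", "C"), ("B", "D"), ("C", "D")]
        cuts = minimal_cut_sets(nodes, edges)
        q = 0.01                                  # per-link failure probability
        print(sum(q ** len(c) for c in cuts))     # rare-event unreliability bound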

  1. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Quality of software not only is vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancy, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to provide a representation of the number and types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
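
    The abstract does not give the model's form. As one concrete illustration of a reliability-based stop-testing criterion of this kind, the sketch below uses the classical Goel-Okumoto reliability-growth model, which may differ from the model developed in the paper; the fitted parameters are hypothetical (Python).

        import math

        def go_mean_failures(t, a, b):
            """Goel-Okumoto NHPP: expected cumulative failures by test time t."""
            return a * (1.0 - math.exp(-b * t))

        # a = expected total faults, b = per-fault detection rate (hypothetical).
        a, b = 120.0, 0.015
        target_intensity = 0.1   # stop once intensity a*b*exp(-b*t) drops to this

        t_stop = math.log(a * b / target_intensity) / b
        residual_faults = a - go_mean_failures(t_stop, a, b)
        print(f"stop testing at t = {t_stop:.0f}; "
              f"expected residual faults = {residual_faults:.1f}")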

  2. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  3. Nanoparticle-based targeted therapeutics in head-and-neck cancer.

    PubMed

    Wu, Ting-Ting; Zhou, Shui-Hong

    2015-01-01

    Head-and-neck cancer is a major form of cancer worldwide. Treatment consists of surgery, radiation therapy, and chemotherapy, but these have not resulted in improved survival rates over the past few decades. Versatile nanoparticles, with selective tumor targeting, are considered to have the potential to improve these poor outcomes. Application of nanoparticle-based targeted therapeutics has extended into many areas, including gene silencing, chemotherapeutic drug delivery, radiosensitization, and photothermal therapy, and has shown much promise. In this review, we discuss recent advances in the field of nanoparticle-mediated targeted therapeutics for head-and-neck cancer, with an emphasis on the description of targeting points, including future perspectives.

  4. Targeted gene insertion for molecular medicine.

    PubMed

    Voigt, Katrin; Izsvák, Zsuzsanna; Ivics, Zoltán

    2008-11-01

    Genomic insertion of a functional gene together with suitable transcriptional regulatory elements is often required for long-term therapeutic benefit in gene therapy for several genetic diseases. A variety of integrating vectors for gene delivery exist. Some of them exhibit random genomic integration, whereas others have integration preferences based on attributes of the targeted site, such as the primary DNA sequence and physical structure of the DNA, or through tethering to certain DNA sequences by host-encoded cellular factors. Uncontrolled genomic insertion bears the risk of the transgene being silenced due to chromosomal position effects, and can lead to genotoxic effects due to mutagenesis of cellular genes. None of the vector systems currently used in either preclinical experiments or clinical trials displays sufficient preference for target DNA sequences that would ensure appropriate and reliable expression of the transgene and simultaneously prevent hazardous side effects. We review in this paper the advantages and disadvantages of both viral and non-viral gene delivery technologies, discuss mechanisms of target site selection of integrating genetic elements (viruses and transposons), and suggest distinct molecular strategies for targeted gene delivery.

  5. Shock reliability analysis and improvement of MEMS electret-based vibration energy harvesters

    NASA Astrophysics Data System (ADS)

    Renaud, M.; Fujita, T.; Goedbloed, M.; de Nooijer, C.; van Schaijk, R.

    2015-10-01

    Vibration energy harvesters can serve as a replacement solution to batteries for powering tire pressure monitoring systems (TPMS). Autonomous wireless TPMS powered by microelectromechanical system (MEMS) electret-based vibration energy harvester have been demonstrated. The mechanical reliability of the MEMS harvester still has to be assessed in order to bring the harvester to the requirements of the consumer market. It should survive the mechanical shocks occurring in the tire environment. A testing procedure to quantify the shock resilience of harvesters is described in this article. Our first generation of harvesters has a shock resilience of 400 g, which is far from being sufficient for the targeted application. In order to improve this aspect, the first important aspect is to understand the failure mechanism. Failure is found to occur in the form of fracture of the device’s springs. It results from impacts between the anchors of the springs when the harvester undergoes a shock. The shock resilience of the harvesters can be improved by redirecting these impacts to nonvital parts of the device. With this philosophy in mind, we design three types of shock absorbing structures and test their effect on the shock resilience of our MEMS harvesters. The solution leading to the best results consists of rigid silicon stoppers covered by a layer of Parylene. The shock resilience of the harvesters is brought above 2500 g. Results in the same range are also obtained with flexible silicon bumpers, which are simpler to manufacture.

  6. The Turkish Version of Web-Based Learning Platform Evaluation Scale: Reliability and Validity Study

    ERIC Educational Resources Information Center

    Dag, Funda

    2016-01-01

    The purpose of this study is to determine the language equivalence and the validity and reliability of the Turkish version of the "Web-Based Learning Platform Evaluation Scale" ("Web Tabanli Ögrenme Ortami Degerlendirme Ölçegi" [WTÖODÖ]) used in the selection and evaluation of web-based learning environments. Within this scope,…

  7. Advances in developing rapid, reliable and portable detection systems for alcohol.

    PubMed

    Thungon, Phurpa Dema; Kakoti, Ankana; Ngashangva, Lightson; Goswami, Pranab

    2017-11-15

    Development of portable, reliable, sensitive, simple, and inexpensive detection systems for alcohol has been a long-standing demand not only in the traditional brewing, pharmaceutical, food, and clinical industries but also in the rapidly growing alcohol-based fuel industry. Highly sensitive, selective, and reliable alcohol detection is currently achievable typically only through sophisticated instrument-based analyses confined mostly to state-of-the-art analytical laboratory facilities. With the growing demand for rapid and reliable alcohol detection systems, an all-round attempt has been made over the past decade encompassing various disciplines from the basic and engineering sciences. Of late, research toward developing small-scale portable alcohol detection systems has accelerated with the advent of emerging miniaturization techniques, advanced materials, and sensing platforms such as lab-on-chip, lab-on-CD, and lab-on-paper. With these new interdisciplinary approaches, along with support from the parallel growth of knowledge on rapid detection systems being pursued for various targets, progress in translating proof-of-concepts into commercially viable and environmentally friendly portable alcohol detection systems is gaining pace. Here, we summarize the progress made over the years on alcohol detection systems, with a focus on recent advances toward developing portable, simple, and efficient alcohol sensors.

  8. Dim target detection method based on salient graph fusion

    NASA Astrophysics Data System (ADS)

    Hu, Ruo-lan; Shen, Yi-yan; Jiang, Jun

    2018-02-01

    Dim target detection is a key problem in the digital image processing field. With the development of multi-spectrum imaging sensors, fusing information from different spectral images has become a recognized way to improve dim target detection performance. In this paper, a dim target detection method based on salient graph fusion is proposed. In the method, multi-directional Gabor filters and multi-scale contrast filters are combined to construct salient graphs from the digital image. A maximum-salience fusion strategy is then designed to fuse the salient graphs from the different spectral images, and a top-hat filter is used to detect dim targets in the fused salient graph. Experimental results show that the proposed method improves the probability of target detection and reduces the probability of false alarm on cluttered background images.
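
    A minimal sketch of the two final stages, maximum-salience fusion across spectral bands followed by top-hat filtering with an adaptive threshold, is given below; the structuring-element size and threshold factor are hypothetical, and the Gabor/contrast salient-graph construction is omitted (Python).

        import numpy as np
        from scipy.ndimage import white_tophat

        def fuse_and_detect(salient_maps, tophat_size=9, k=4.0):
            """Fuse per-band salient graphs by pixelwise maximum, keep small
            bright structures with a white top-hat, then threshold adaptively."""
            fused = np.maximum.reduce([m.astype(float) for m in salient_maps])
            residual = white_tophat(fused, size=tophat_size)
            threshold = residual.mean() + k * residual.std()
            return residual > threshold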

  9. A computational theory for the classification of natural biosonar targets based on a spike code.

    PubMed

    Müller, Rolf

    2003-08-01

    A computational theory for the classification of natural biosonar targets is developed based on the properties of an example stimulus ensemble. An extensive set of echoes (84 800) from four different foliages was transcribed into a spike code using a parsimonious model (linear filtering, half-wave rectification, thresholding). The spike code is assumed to consist of time differences (interspike intervals) between threshold crossings. Among the elementary interspike intervals flanked by exceedances of adjacent thresholds, a few intervals triggered by disjoint half-cycles of the carrier oscillation stand out in terms of resolvability, visibility across resolution scales and a simple stochastic structure (uncorrelatedness). They are therefore argued to be a stochastic analogue to edges in vision. A three-dimensional feature vector representing these interspike intervals sustained a reliable target classification performance (0.06% classification error) in a sequential probability ratio test, which models sequential processing of echo trains by biological sonar systems. The dimensions of the representation are the first moments of duration and amplitude location of these interspike intervals as well as their number. All three quantities are readily reconciled with known principles of neural signal representation, since they correspond to the centre of gravity of excitation on a neural map and the total amount of excitation.
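
    The classifier is a sequential probability ratio test over echo trains. For reference, a minimal implementation of Wald's SPRT is sketched below; the log-likelihood ratios would come from the three-dimensional feature model described above, and the error rates are illustrative (Python).

        import math

        def sprt(llr_stream, alpha=0.01, beta=0.01):
            """Wald's SPRT: accumulate per-echo log-likelihood ratios until a
            decision boundary is crossed; alpha/beta are target error rates."""
            upper = math.log((1.0 - beta) / alpha)     # decide for H1
            lower = math.log(beta / (1.0 - alpha))     # decide for H0
            s, n = 0.0, 0
            for n, llr in enumerate(llr_stream, start=1):
                s += llr
                if s >= upper:
                    return "H1", n
                if s <= lower:
                    return "H0", n
            return "undecided", n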

  10. Precise Target Geolocation and Tracking Based on UAV Video Imagery

    NASA Astrophysics Data System (ADS)

    Hosseinpoor, H. R.; Samadzadegan, F.; Dadrasjavan, F.

    2016-06-01

    There is an increasingly large number of applications for Unmanned Aerial Vehicles (UAVs), ranging from monitoring and mapping to target geolocation. However, most commercial UAVs are equipped with low-cost navigation sensors such as a C/A code GPS and a low-cost IMU on board, allowing a positioning accuracy of 5 to 10 meters. This accuracy is insufficient for applications that require high-precision data at the cm level. This paper presents a precise process for geolocation of ground targets based on thermal video imagery acquired by a small UAV equipped with RTK GPS. The geolocation data are filtered using an extended Kalman filter, which provides a smoothed estimate of target location and target velocity. The accurate geolocation of targets during image acquisition is conducted via traditional photogrammetric bundle adjustment equations using accurate exterior orientation parameters obtained from the on-board IMU and RTK GPS sensors, Kalman filtering, and interior orientation parameters of the thermal camera from a pre-flight laboratory calibration process. Compared with ordinary code-based GPS, the results of this study indicate that RTK observations with the proposed method improve target geolocation accuracy by more than a factor of 10.
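
    The paper filters geolocation fixes with an extended Kalman filter that estimates target position and velocity. As a simplified stand-in, the sketch below runs a linear constant-velocity Kalman filter over 2-D position fixes; the time step and noise levels are hypothetical (Python).

        import numpy as np

        def cv_kalman(zs, dt=0.1, q=0.5, r=2.0):
            """Constant-velocity Kalman filter over 2-D position fixes zs (N x 2).
            State is [x, y, vx, vy]; returns the smoothed state history."""
            F = np.eye(4)
            F[0, 2] = F[1, 3] = dt
            H = np.zeros((2, 4))
            H[0, 0] = H[1, 1] = 1.0
            Q, R = q * np.eye(4), r * np.eye(2)
            x = np.array([zs[0][0], zs[0][1], 0.0, 0.0])
            P = np.eye(4) * 10.0
            history = []
            for z in zs:
                x, P = F @ x, F @ P @ F.T + Q                  # predict
                K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
                x = x + K @ (np.asarray(z) - H @ x)            # update
                P = (np.eye(4) - K @ H) @ P
                history.append(x.copy())
            return np.array(history)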

  11. 42 CFR 419.30 - Base expenditure target for calendar year 1999.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Base expenditure target for calendar year 1999. 419.30 Section 419.30 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Outpatient Services § 419.30 Base expenditure target for calendar year 1999. (a) CMS estimates the aggregate...

  12. 42 CFR 419.30 - Base expenditure target for calendar year 1999.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Base expenditure target for calendar year 1999. 419.30 Section 419.30 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Services § 419.30 Base expenditure target for calendar year 1999. (a) CMS estimates the aggregate amount...

  13. 42 CFR 419.30 - Base expenditure target for calendar year 1999.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Base expenditure target for calendar year 1999. 419.30 Section 419.30 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Outpatient Services § 419.30 Base expenditure target for calendar year 1999. (a) CMS estimates the aggregate...

  14. 42 CFR 419.30 - Base expenditure target for calendar year 1999.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Base expenditure target for calendar year 1999. 419.30 Section 419.30 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... Services § 419.30 Base expenditure target for calendar year 1999. (a) CMS estimates the aggregate amount...

  15. Improved GGIW-PHD filter for maneuvering non-ellipsoidal extended targets or group targets tracking based on sub-random matrices.

    PubMed

    Liang, Zhibing; Liu, Fuxian; Gao, Jiale

    2018-01-01

    For non-ellipsoidal extended targets and group targets tracking (NETT and NGTT), using an ellipsoid to approximate the target extension may not be accurate enough because of the lack of shape and orientation information. In consideration of this, we model a non-ellipsoidal extended target or target group as a combination of multiple ellipsoidal sub-objects, each represented by a random matrix. Based on these models, an improved gamma Gaussian inverse Wishart probability hypothesis density (GGIW-PHD) filter is proposed to estimate the measurement rates, kinematic states, and extension states of the sub-objects for each extended target or target group. For maneuvering NETT and NGTT, a multi-model (MM) approach based GGIW-PHD (MM-GGIW-PHD) filter is proposed. The common and the individual dynamics of the sub-objects belonging to the same extended target or target group are described by means of the combination between the overall maneuver model and the sub-object models. For the merging of updating components, an improved merging criterion and a new merging method are derived. A specific implementation of prediction partition with pseudo-likelihood method is presented. Two scenarios for non-maneuvering and maneuvering NETT and NGTT are simulated. The results demonstrate the effectiveness of the proposed algorithms.

  16. Improved GGIW-PHD filter for maneuvering non-ellipsoidal extended targets or group targets tracking based on sub-random matrices

    PubMed Central

    Liu, Fuxian; Gao, Jiale

    2018-01-01

    For non-ellipsoidal extended targets and group targets tracking (NETT and NGTT), using an ellipsoid to approximate the target extension may not be accurate enough because of the lack of shape and orientation information. In consideration of this, we model a non-ellipsoidal extended target or target group as a combination of multiple ellipsoidal sub-objects, each represented by a random matrix. Based on these models, an improved gamma Gaussian inverse Wishart probability hypothesis density (GGIW-PHD) filter is proposed to estimate the measurement rates, kinematic states, and extension states of the sub-objects for each extended target or target group. For maneuvering NETT and NGTT, a multi-model (MM) approach based GGIW-PHD (MM-GGIW-PHD) filter is proposed. The common and the individual dynamics of the sub-objects belonging to the same extended target or target group are described by means of the combination between the overall maneuver model and the sub-object models. For the merging of updating components, an improved merging criterion and a new merging method are derived. A specific implementation of prediction partition with pseudo-likelihood method is presented. Two scenarios for non-maneuvering and maneuvering NETT and NGTT are simulated. The results demonstrate the effectiveness of the proposed algorithms. PMID:29444144

  17. A New Tool for Nutrition App Quality Evaluation (AQEL): Development, Validation, and Reliability Testing.

    PubMed

    DiFilippo, Kristen Nicole; Huang, Wenhao; Chapman-Novakofski, Karen M

    2017-10-27

    The extensive availability and increasing use of mobile apps for nutrition-based health interventions makes evaluation of the quality of these apps crucial for integration of apps into nutritional counseling. The goal of this research was the development, validation, and reliability testing of the app quality evaluation (AQEL) tool, an instrument for evaluating apps' educational quality and technical functionality. Items for evaluating app quality were adapted from website evaluations, with additional items added to evaluate the specific characteristics of apps, resulting in 79 initial items. Expert panels of nutrition and technology professionals and app users reviewed items for face and content validation. After recommended revisions, nutrition experts completed a second AQEL review to ensure clarity. On the basis of 150 sets of responses using the revised AQEL, principal component analysis was completed, reducing AQEL into 5 factors that underwent reliability testing, including internal consistency, split-half reliability, test-retest reliability, and interrater reliability (IRR). Two additional modifiable constructs for evaluating apps based on the age and needs of the target audience as selected by the evaluator were also tested for construct reliability. IRR testing using intraclass correlations (ICC) with all 7 constructs was conducted, with 15 dietitians evaluating one app. Development and validation resulted in the 51-item AQEL. These were reduced to 25 items in 5 factors after principal component analysis, plus 9 modifiable items in two constructs that were not included in principal component analysis. Internal consistency and split-half reliability of the following constructs derived from principal components analysis was good (Cronbach alpha >.80, Spearman-Brown coefficient >.80): behavior change potential, support of knowledge acquisition, app function, and skill development. App purpose split half-reliability was .65. Test-retest reliability showed no

  18. Micro-channel-based high specific power lithium target

    NASA Astrophysics Data System (ADS)

    Mastinu, P.; Martín-Hernández, G.; Praena, J.; Gramegna, F.; Prete, G.; Agostini, P.; Aiello, A.; Phoenix, B.

    2016-11-01

    A micro-channel-based heat sink has been produced and tested. The device has been developed to be used as a lithium target for the LENOS (Legnaro Neutron Source) facility and for the production of radioisotopes. Nevertheless, applications of such a device span many areas: cooling of electronic devices, diode laser arrays, automotive applications, etc. The target has been tested using a proton beam of 2.8 MeV energy, delivering total power shots from 100 W to 1500 W with beam spots varying from 5 mm² to 19 mm². Since the target has been designed to be used with a thin deposit of lithium, and since lithium is a low-melting-point material, we have measured that, for such an application, a specific power of about 3 kW/cm² can be delivered to the target while keeping the maximum surface temperature below 150 °C.

  19. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
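
    The central effect exploited here is that passing a proof test truncates the low tail of the strength distribution, so the in-service failure probability conditioned on passing is lower than the unconditional one. The Monte Carlo sketch below illustrates this with hypothetical normal strength and load distributions; it is an illustration of the effect, not the paper's optimization procedure (Python).

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000
        strength = rng.normal(100.0, 10.0, n)   # component strength (arbitrary units)
        service = rng.normal(60.0, 15.0, n)     # in-service load
        proof_load = 85.0                       # every component is proof tested

        passed = strength >= proof_load         # only survivors enter service
        p_fail_uncond = (strength < service).mean()
        p_fail_given_pass = (strength[passed] < service[passed]).mean()
        print(f"pass rate {passed.mean():.3f}, "
              f"Pf unconditional {p_fail_uncond:.2e}, "
              f"Pf given pass {p_fail_given_pass:.2e}")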

  20. Cue reliability and a landmark stability heuristic determine relative weighting between egocentric and allocentric visual information in memory-guided reach.

    PubMed

    Byrne, Patrick A; Crawford, J Douglas

    2010-06-01

    It is not known how egocentric visual information (location of a target relative to the self) and allocentric visual information (location of a target relative to external landmarks) are integrated to form reach plans. Based on behavioral data from rodents and humans we hypothesized that the degree of stability in visual landmarks would influence the relative weighting. Furthermore, based on numerous cue-combination studies we hypothesized that the reach system would act like a maximum-likelihood estimator (MLE), where the reliability of both cues determines their relative weighting. To predict how these factors might interact we developed an MLE model that weighs egocentric and allocentric information based on their respective reliabilities, and also on an additional stability heuristic. We tested the predictions of this model in 10 human subjects by manipulating landmark stability and reliability (via variable amplitude vibration of the landmarks and variable amplitude gaze shifts) in three reach-to-touch tasks: an egocentric control (reaching without landmarks), an allocentric control (reaching relative to landmarks), and a cue-conflict task (involving a subtle landmark "shift" during the memory interval). Variability from all three experiments was used to derive parameters for the MLE model, which was then used to simulate egocentric-allocentric weighting in the cue-conflict experiment. As predicted by the model, landmark vibration--despite its lack of influence on pointing variability (and thus allocentric reliability) in the control experiment--had a strong influence on egocentric-allocentric weighting. A reduced model without the stability heuristic was unable to reproduce this effect. These results suggest heuristics for extrinsic cue stability are at least as important as reliability for determining cue weighting in memory-guided reaching.
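
    The core of the MLE model is inverse-variance weighting of the two cues, with the stability heuristic acting as an extra discount on the allocentric cue. A minimal sketch under those assumptions is shown below; the stability factor is a hypothetical scalar in [0, 1], not the paper's fitted parameterization (Python).

        def mle_combine(x_ego, var_ego, x_allo, var_allo, stability=1.0):
            """Inverse-variance (MLE) cue combination. `stability` discounts
            the allocentric reliability when landmarks appear unstable."""
            r_allo = stability / var_allo
            r_ego = 1.0 / var_ego
            w_allo = r_allo / (r_allo + r_ego)
            x_hat = w_allo * x_allo + (1.0 - w_allo) * x_ego
            return x_hat, w_allo

        # Vibrating landmarks: same cue variances, but stability halved.
        print(mle_combine(0.0, 4.0, 2.0, 4.0, stability=1.0))  # w_allo = 0.50
        print(mle_combine(0.0, 4.0, 2.0, 4.0, stability=0.5))  # w_allo ~ 0.33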

  1. Design of nuclease-based target recycling signal amplification in aptasensors.

    PubMed

    Yan, Mengmeng; Bai, Wenhui; Zhu, Chao; Huang, Yafei; Yan, Jiao; Chen, Ailiang

    2016-03-15

    Compared with conventional antibody-based immunoassay methods, aptasensors based on nucleic acid aptamers have made at least two significant breakthroughs. One is that aptamers are more easily used for developing simple and rapid homogeneous detection methods that work by "sample in, signal out" without multi-step washing. The other is that aptamers are more easily employed for developing highly sensitive detection methods using various nucleic acid-based signal amplification approaches. As many substances playing regulatory roles in physiology or pathology exist at extremely low concentrations, and many chemical contaminants occur in trace amounts in food or the environment, aptasensors with signal amplification contribute greatly to the detection of such targets. Among the signal amplification approaches in highly sensitive aptasensors, nuclease-based target recycling signal amplification has recently become a research focus because it offers easy design, simple operation, and rapid reaction, and can readily be developed into homogeneous assays. In this review, we summarize recent advances in the development of various nuclease-based target recycling signal amplification strategies, with the aim of providing a general guide for the design of aptamer-based ultrasensitive biosensing assays.

  2. The B-747 flight control system maintenance and reliability data base for cost effectiveness tradeoff studies

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Primary and automatic flight controls are combined for a total flight control reliability and maintenance cost data base using information from two previous reports and additional cost data gathered from a major airline. A comparison of the current B-747 flight control system effects on reliability and operating cost with that of a B-747 designed for an active control wing load alleviation system is provided.

  3. Texture orientation-based algorithm for detecting infrared maritime targets.

    PubMed

    Wang, Bin; Dong, Lili; Zhao, Ming; Wu, Houde; Xu, Wenhai

    2015-05-20

    Infrared maritime target detection is a key technology for maritime target searching systems. However, in infrared maritime images (IMIs) taken under complicated sea conditions, background clutter such as ocean waves, clouds, or sea fog usually has high intensity that can easily overwhelm the brightness of real targets, which is difficult for traditional target detection algorithms to deal with. To mitigate this problem, this paper proposes a novel target detection algorithm based on texture orientation. The algorithm first extracts suspected targets by analyzing the inter-subband correlation between the horizontal and vertical wavelet subbands of the original IMI on the first scale. Then self-adaptive wavelet threshold denoising and local singularity analysis of the original IMI are combined to further remove false alarms. Experiments show that, compared with traditional algorithms, this algorithm suppresses background clutter much better and achieves better single-frame detection of infrared maritime targets. In addition, to further guarantee accurate target extraction, a pipeline-filtering algorithm is adopted to eliminate residual false alarms. The high practical value and applicability of the proposed strategy are backed strongly by experimental data acquired under different environmental conditions.

  4. Tertiary structure-based analysis of microRNA–target interactions

    PubMed Central

    Gan, Hin Hark; Gunsalus, Kristin C.

    2013-01-01

    Current computational analysis of microRNA interactions is based largely on primary and secondary structure analysis. Computationally efficient tertiary structure-based methods are needed to enable more realistic modeling of the molecular interactions underlying miRNA-mediated translational repression. We incorporate algorithms for predicting duplex RNA structures, ionic strength effects, duplex entropy and free energy, and docking of duplex–Argonaute protein complexes into a pipeline to model and predict miRNA–target duplex binding energies. To ensure modeling accuracy and computational efficiency, we use an all-atom description of RNA and a continuum description of ionic interactions using the Poisson–Boltzmann equation. Our method predicts the conformations of two constructs of Caenorhabditis elegans let-7 miRNA–target duplexes to an accuracy of ∼3.8 Å root mean square distance of their NMR structures. We also show that the computed duplex formation enthalpies, entropies, and free energies for eight miRNA–target duplexes agree with titration calorimetry data. Analysis of duplex–Argonaute docking shows that structural distortions arising from single-base-pair mismatches in the seed region influence the activity of the complex by destabilizing both duplex hybridization and its association with Argonaute. Collectively, these results demonstrate that tertiary structure-based modeling of miRNA interactions can reveal structural mechanisms not accessible with current secondary structure-based methods. PMID:23417009

  5. The reliability and validity of the Complex Task Performance Assessment: A performance-based assessment of executive function.

    PubMed

    Wolf, Timothy J; Dahl, Abigail; Auen, Colleen; Doherty, Meghan

    2017-07-01

    The objective of this study was to evaluate the inter-rater reliability, test-retest reliability, concurrent validity, and discriminant validity of the Complex Task Performance Assessment (CTPA): an ecologically valid performance-based assessment of executive function. Community control participants (n = 20) and individuals with mild stroke (n = 14) participated in this study. All participants completed the CTPA and a battery of cognitive assessments at initial testing. The control participants completed the CTPA at two different times one week apart. The intra-class correlation coefficient (ICC) for inter-rater reliability for the total score on the CTPA was .991. The ICCs for all of the sub-scores of the CTPA were also high (.889-.977). The CTPA total score was significantly correlated with Condition 4 of the DKEFS Color-Word Interference Test (r = -.425) and the Wechsler Test of Adult Reading (r = -.493). Finally, there were significant differences between control subjects and individuals with mild stroke on the total score of the CTPA (p = .007) and all sub-scores except interpretation failures and total items incorrect. These results are also consistent with other current executive function performance-based assessments and indicate that the CTPA is a reliable and valid performance-based measure of executive function.

  6. Placenta-specific1 (PLAC1) is a potential target for antibody-drug conjugate-based prostate cancer immunotherapy.

    PubMed

    Nejadmoghaddam, Mohammad-Reza; Zarnani, Amir-Hassan; Ghahremanzadeh, Ramin; Ghods, Roya; Mahmoudian, Jafar; Yousefi, Maryam; Nazari, Mahboobeh; Ghahremani, Mohammad Hossein; Abolhasani, Maryam; Anissian, Ali; Mahmoudi, Morteza; Dinarvand, Rassoul

    2017-10-17

    Our recent findings strongly support the idea of PLAC1 as a potential immunotherapeutic target in prostate cancer (PCa). Here, we have generated and evaluated an anti-placenta-specific1 (PLAC1)-based antibody drug conjugate (ADC) for targeted immunotherapy of PCa. Prostate cancer cells express considerable levels of PLAC1. The anti-PLAC1 clone, 2H12C12, showed high reactivity with recombinant PLAC1 and selectively recognized PLAC1 in prostate cancer cells but not in LS180 cells, the negative control. PLAC1 binding induced rapid internalization of the antibody within a few minutes, which reached about 50% after 15 min and was almost complete within an hour. After SN38 conjugation to the antibody, a drug-antibody ratio (DAR) of about 5.5 was achieved without apparent negative effect on antibody affinity for the cell surface antigen. The ADC retained intrinsic antibody activity and showed enhanced and selective cytotoxicity, with an IC50 of 62 nM, about 15-fold lower compared with the free drug. Anti-PLAC1-ADC induced apoptosis in human primary prostate cancer cells and prostate cell lines. No apparent cytotoxic effect was observed in in vivo animal safety experiments. Our newly developed anti-PLAC1-based ADCs might pave the way for a reliable, efficient, and novel immunotherapeutic modality for patients with PCa.

  7. Sparsity based target detection for compressive spectral imagery

    NASA Astrophysics Data System (ADS)

    Boada, David Alberto; Arguello Fuentes, Henry

    2016-09-01

    Hyperspectral imagery provides significant information about the spectral characteristics of objects and materials present in a scene. It enables object and feature detection, classification, or identification based on the acquired spectral characteristics. However, it relies on sophisticated acquisition and data processing systems able to acquire, process, store, and transmit hundreds or thousands of image bands from a given area of interest, which demands enormous computational resources in terms of storage, computation, and I/O throughput. Specialized optical architectures have been developed for the compressed acquisition of spectral images using a reduced set of coded measurements, in contrast to traditional architectures that need a complete set of measurements of the data cube, thereby easing the storage and acquisition limitations. Despite this improvement, if any processing is desired, the image must first be reconstructed by an inverse algorithm, which is also an expensive task. In this paper, a sparsity-based algorithm for target detection in compressed spectral images is presented. Specifically, the target detection model adapts a sparsity-based target detector to work in the compressive domain, modifying the sparse representation basis in the compressive sensing problem by means of over-complete training dictionaries and a wavelet basis representation. Simulations show that the presented method can achieve even better detection results than state-of-the-art methods.
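
    A minimal sketch of residual-based sparse detection is given below: a spectral pixel is reconstructed from a target sub-dictionary and a background sub-dictionary, and the smaller residual decides the label. Ordinary least squares stands in for a true sparse solver, and in the compressive domain both dictionaries and the pixel would first be projected by the coded-aperture sensing matrix; all of that is a simplification of the paper's method (Python).

        import numpy as np

        def residual(D, y):
            """Norm of the reconstruction residual of y against dictionary D."""
            coef, *_ = np.linalg.lstsq(D, y, rcond=None)
            return np.linalg.norm(y - D @ coef)

        def is_target(pixel, D_target, D_background):
            """Label a spectral pixel as target when the target sub-dictionary
            reconstructs it with a smaller residual than the background one."""
            return residual(D_target, pixel) < residual(D_background, pixel)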

  8. Validity and Reliability of the Turkish Version of Needs Based Biopsychosocial Distress Instrument for Cancer Patients (CANDI)

    PubMed Central

    Beyhun, Nazim Ercument; Can, Gamze; Tiryaki, Ahmet; Karakullukcu, Serdar; Bulut, Bekir; Yesilbas, Sehbal; Kavgaci, Halil; Topbas, Murat

    2016-01-01

    Background Needs based biopsychosocial distress instrument for cancer patients (CANDI) is a scale based on needs arising due to the effects of cancer. Objectives The aim of this research was to determine the reliability and validity of the CANDI scale in the Turkish language. Patients and Methods The study was performed with the participation of 172 cancer patients aged 18 and over. Factor analysis (principal components analysis) was used to assess construct validity. Criterion validities were tested by computing Spearman correlation between CANDI and hospital anxiety depression scale (HADS), and brief symptom inventory (BSI) (convergent validity) and quality of life scales (FACT-G) (divergent validity). Test-retest reliabilities and internal consistencies were measured with intraclass correlation (ICC) and Cronbach-α. Results A three-factor solution (emotional, physical and social) was found with factor analysis. Internal reliability (α = 0.94) and test-retest reliability (ICC = 0.87) were significantly high. Correlations between CANDI and HADS (rs = 0.67), and BSI (rs = 0.69) and FACT-G (rs = -0.76) were moderate and significant in the expected direction. Conclusions CANDI is a valid and reliable scale in cancer patients with a three-factor structure (emotional, physical and social) in the Turkish language. PMID:27621931

  9. Reliability and validity of play-based assessments of motor and cognitive skills for infants and young children: a systematic review.

    PubMed

    O'Grady, Michael G; Dusing, Stacey C

    2015-01-01

    Play is vital for development. Infants and children learn through play. Traditional standardized developmental tests measure whether a child performs individual skills within controlled environments. Play-based assessments can measure skill performance during natural, child-driven play. The purpose of this study was to systematically review reliability, validity, and responsiveness of all play-based assessments that quantify motor and cognitive skills in children from birth to 36 months of age. Studies were identified from a literature search using PubMed, ERIC, CINAHL, and PsycINFO databases and the reference lists of included papers. Included studies investigated reliability, validity, or responsiveness of play-based assessments that measured motor and cognitive skills for children to 36 months of age. Two reviewers independently screened 40 studies for eligibility and inclusion. The reviewers independently extracted reliability, validity, and responsiveness data. They examined measurement properties and methodological quality of the included studies. Four current play-based assessment tools were identified in 8 included studies. Each play-based assessment tool measured motor and cognitive skills in a different way during play. Interrater reliability correlations ranged from .86 to .98 for motor development and from .23 to .90 for cognitive development. Test-retest reliability correlations ranged from .88 to .95 for motor development and from .45 to .91 for cognitive development. Structural validity correlations ranged from .62 to .90 for motor development and from .42 to .93 for cognitive development. One study assessed responsiveness to change in motor development. Most studies had small and poorly described samples. Lack of transparency in data management and statistical analysis was common. Play-based assessments have potential to be reliable and valid tools to assess cognitive and motor skills, but higher-quality research is needed. Psychometric properties

  10. Pharmacophore based design of some multi-targeted compounds targeted against pathways of diabetic complications.

    PubMed

    Chadha, Navriti; Silakari, Om

    2017-09-01

    Diabetic complications constitute a complex metabolic disorder that develops primarily due to prolonged hyperglycemia in the body. The complexity of the disease state, as well as the unifying pathophysiology discussed in the literature, suggests that multi-targeted agents with multiple complementary biological activities may offer more promising therapy for intervention in the disease than single-target drugs. In the present study, novel thiazolidine-2,4-dione analogues were designed as multi-targeted agents directed against the molecular pathways involved in diabetic complications, using knowledge-based as well as in-silico approaches such as pharmacophore mapping and molecular docking. The hit molecules were duly synthesized, and biochemical evaluation of these molecules against aldose reductase (ALR2), protein kinase Cβ (PKCβ), and poly (ADP-ribose) polymerase 1 (PARP-1) led to the identification of compound 2, which showed good potency against the PARP-1 and ALR2 enzymes. These positive results support the development of a low-cost multi-targeted agent with putative roles in diabetic complications.

  11. The effect of Web-based Braden Scale training on the reliability and precision of Braden Scale pressure ulcer risk assessments.

    PubMed

    Magnan, Morris A; Maklebust, Joann

    2008-01-01

    To evaluate the effect of Web-based Braden Scale training on the reliability and precision of pressure ulcer risk assessments made by registered nurses (RNs) working in acute care settings. Pretest-posttest, 2-group, quasi-experimental design. Five hundred Braden Scale risk assessments were made on 102 acute care patients deemed to be at various levels of risk for pressure ulceration. Assessments were made by RNs working in acute care hospitals at 3 different medical centers where the Braden Scale was in regular daily use (2 medical centers) or new to the setting (1 medical center). The Braden Scale for Predicting Pressure Sore Risk was used to guide pressure ulcer risk assessments. A Web-based version of the Detroit Medical Center Braden Scale Computerized Training Module was used to teach nurses correct use of the Braden Scale and selection of risk-based pressure ulcer prevention interventions. In the aggregate, RNs generated reliable Braden Scale pressure ulcer risk assessments 65% of the time after training. The effect of Web-based Braden Scale training on the reliability and precision of assessments varied according to familiarity with the scale. With training, new users of the scale made reliable assessments 84% of the time and significantly improved the precision of their assessments. The reliability and precision of Braden Scale risk assessments made by its regular users were unaffected by training. Technology-assisted Braden Scale training improved both the reliability and precision of risk assessments made by new users of the scale, but had virtually no effect on the reliability or precision of risk assessments made by regular users of the instrument. Further research is needed to determine the best approaches for improving the reliability and precision of Braden Scale assessments made by its regular users.

  12. Reliability of Interaural Time Difference-Based Localization Training in Elderly Individuals with Speech-in-Noise Perception Disorder.

    PubMed

    Delphi, Maryam; Lotfi, M-Yones; Moossavi, Abdollah; Bakhshi, Enayatollah; Banimostafa, Maryam

    2017-09-01

    Previous studies have shown that interaural-time-difference (ITD) training can improve localization ability. Surprisingly little is, however, known about localization training vis-à-vis speech perception in noise based on interaural time difference in the envelope (ITD ENV). We sought to investigate the reliability of an ITD ENV-based training program in speech-in-noise perception among elderly individuals with normal hearing and speech-in-noise disorder. The present interventional study was performed during 2016. Sixteen elderly men between 55 and 65 years of age with the clinical diagnosis of normal hearing up to 2000 Hz and speech-in-noise perception disorder participated in this study. The localization training program was based on changes in ITD ENV. In order to evaluate the reliability of the training program, we performed speech-in-noise tests before the training program, immediately afterward, and then at 2 months' follow-up. The reliability of the training program was analyzed using the Friedman test and the SPSS software. Significant statistical differences were shown in the mean scores of speech-in-noise perception between the 3 time points (P=0.001). The results also indicated no difference in the mean scores of speech-in-noise perception between the 2 time points of immediately after the training program and 2 months' follow-up (P=0.212). The present study showed the reliability of an ITD ENV-based localization training in elderly individuals with speech-in-noise perception disorder.

  13. A DBN based anomaly targets detector for HSI

    NASA Astrophysics Data System (ADS)

    Ma, Ning; Wang, Shaojun; Yu, Jinxiang; Peng, Yu

    2017-10-01

    Traditional Mahalanobis distance-based anomaly target detectors assume that hyperspectral images (HSIs) conform to a Gaussian distribution and perform poorly when this assumption does not hold. To solve this problem, a deep learning based detector, the Deep Belief Network (DBN) anomaly detector (DBN-AD), is proposed to fit the unknown distribution of HSI data by energy modeling; the reconstruction errors of this encode-decode processing are used to discriminate anomaly targets. Experiments are implemented on real and synthesized HSI datasets collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). Compared with classic anomaly detectors, the proposed method shows better performance, with an Area Under the ROC Curve (AUC) about 0.17 higher than that of the Reed-Xiaoli detector (RXD) and Kernel-RXD (K-RXD).
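
    Training a full DBN is out of scope here, but the encode-decode idea can be shown with a linear stand-in: a PCA background model whose reconstruction error serves as the anomaly score, in the same spirit as DBN-AD. The component count below is hypothetical (Python).

        import numpy as np

        def reconstruction_scores(X, n_components=5):
            """X is (n_pixels, n_bands). A PCA projection stands in for the
            trained network: pixels the low-dimensional background model cannot
            reconstruct well receive high anomaly scores."""
            mu = X.mean(axis=0)
            Xc = X - mu
            _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
            W = Vt[:n_components]           # principal axes as a linear "encoder"
            X_hat = (Xc @ W.T) @ W + mu     # decode back to the spectral space
            return np.linalg.norm(X - X_hat, axis=1)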

  14. Silicon nanowires reliability and robustness investigation using AFM-based techniques

    NASA Astrophysics Data System (ADS)

    Bieniek, Tomasz; Janczyk, Grzegorz; Janus, Paweł; Grabiec, Piotr; Nieprzecki, Marek; Wielgoszewski, Grzegorz; Moczała, Magdalena; Gotszalk, Teodor; Buitrago, Elizabeth; Badia, Montserrat F.; Ionescu, Adrian M.

    2013-07-01

    Silicon nanowires (SiNWs) have undergone intensive research for their application in novel integrated systems such as field effect transistor (FET) biosensors and mass sensing resonators profiting from large surface-to-volume ratios (nano dimensions). Such devices have been shown to have the potential for outstanding performances in terms of high sensitivity, selectivity through surface modification and unprecedented structural characteristics. This paper presents the results of mechanical characterization done for various types of suspended SiNWs arranged in a 3D array. The characterization has been performed using techniques based on atomic force microscopy (AFM). This investigation is a necessary prerequisite for the reliable and robust design of any biosensing system. This paper also describes the applied investigation methodology and reports measurement results aggregated during series of AFM-based tests.

  15. Reliability and criterion validity of measurements using a smart phone-based measurement tool for the transverse rotation angle of the pelvis during single-leg lifting.

    PubMed

    Jung, Sung-Hoon; Kwon, Oh-Yun; Jeon, In-Cheol; Hwang, Ui-Jae; Weon, Jong-Hyuck

    2018-01-01

    The purposes of this study were to determine the intra-rater test-retest reliability of a smart phone-based measurement tool (SBMT) and a three-dimensional (3D) motion analysis system for measuring the transverse rotation angle of the pelvis during single-leg lifting (SLL) and the criterion validity of the transverse rotation angle of the pelvis measurement using SBMT compared with a 3D motion analysis system (3DMAS). Seventeen healthy volunteers performed SLL with their dominant leg without bending the knee until they reached a target placed 20 cm above the table. This study used a 3DMAS, considered the gold standard, to measure the transverse rotation angle of the pelvis to assess the criterion validity of the SBMT measurement. Intra-rater test-retest reliability was determined using the SBMT and 3DMAS using intra-class correlation coefficient (ICC) [3,1] values. The criterion validity of the SBMT was assessed with ICC [3,1] values. Both the 3DMAS (ICC = 0.77) and SBMT (ICC = 0.83) showed excellent intra-rater test-retest reliability in the measurement of the transverse rotation angle of the pelvis during SLL in a supine position. Moreover, the SBMT showed an excellent correlation with the 3DMAS (ICC = 0.99). Measurement of the transverse rotation angle of the pelvis using the SBMT showed excellent reliability and criterion validity compared with the 3DMAS.

  16. Reliability Based Geometric Design of Horizontal Circular Curves

    NASA Astrophysics Data System (ADS)

    Rajbongshi, Pabitra; Kalita, Kuldeep

    2018-06-01

    Geometric design of a horizontal circular curve primarily involves the radius of the curve and the stopping sight distance at the curve section. The minimum radius is decided based on the lateral thrust exerted on vehicles, and the minimum stopping sight distance is provided to maintain safety in the longitudinal direction of travel. The sight distance available at a site can be regulated by changing the radius and the middle ordinate at the curve section. Both radius and sight distance depend on design speed. Vehicle speed at any road section is a variable parameter, and therefore the 98th percentile speed is normally taken as the design speed. This work presents a probabilistic approach for evaluating stopping sight distance that considers the variability of all its input parameters. It is observed that the 98th percentile sight distance value is much lower than the sight distance corresponding to the 98th percentile speed. The distribution of the sight distance parameter is also studied and found to follow a lognormal distribution. Finally, reliability-based design charts are presented for both plain and hill regions, considering the effect of lateral thrust.
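
    The paper's central observation can be reproduced qualitatively with a short Monte Carlo experiment; the speed, reaction-time, and friction distributions below, and the conservative deterministic design values, are illustrative assumptions rather than the study's calibrated inputs.

```python
# Hedged sketch: 98th-percentile stopping sight distance (SSD) from a Monte
# Carlo simulation vs. the deterministic SSD at the 98th-percentile speed.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
v = rng.normal(80, 10, n) / 3.6        # operating speed, km/h -> m/s
t = rng.normal(1.5, 0.4, n)            # actual perception-reaction time, s
f = rng.normal(0.40, 0.05, n)          # longitudinal friction coefficient
g = 9.81

ssd = v * t + v ** 2 / (2 * g * f)     # braking on level grade

# Deterministic design value: 98th-percentile speed with conservative
# design parameters (t = 2.5 s, f = 0.35).
v98 = np.percentile(v, 98)
ssd_design = v98 * 2.5 + v98 ** 2 / (2 * g * 0.35)

print(f"98th-percentile SSD   : {np.percentile(ssd, 98):.1f} m")
print(f"SSD at 98th-pct speed : {ssd_design:.1f} m")   # noticeably larger
```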

  17. Utilization of wireless structural health monitoring as decision making tools for a condition and reliability-based assessment of railroad bridges

    NASA Astrophysics Data System (ADS)

    Flanigan, Katherine A.; Johnson, Nephi R.; Hou, Rui; Ettouney, Mohammed; Lynch, Jerome P.

    2017-04-01

    The ability to quantitatively assess the condition of railroad bridges facilitates objective evaluation of their robustness in the face of hazard events. Of particular importance is the need to assess the condition of railroad bridges in networks that are exposed to multiple hazards. Data collected from structural health monitoring (SHM) can be used to better maintain a structure by prompting preventative (rather than reactive) maintenance strategies and supplying quantitative information to aid in recovery. To that end, a wireless monitoring system is validated and installed on the Harahan Bridge, a hundred-year-old long-span railroad truss bridge that crosses the Mississippi River near Memphis, TN. This bridge is exposed to multiple hazards including scour, vehicle/barge impact, seismic activity, and aging. The instrumented sensing system targets non-redundant structural components and areas of the truss and floor system that bridge managers are most concerned about based on previous inspections and structural analysis. This paper details the monitoring system and the analytical method for the assessment of bridge condition based on automated data-driven analyses. Two primary objectives of monitoring the system performance are discussed: 1) monitoring fatigue accumulation in critical tensile truss elements; and 2) monitoring the reliability index values associated with sub-system limit states of these members. Moreover, since the reliability index is a scalar indicator of the safety of components, quantifiable condition assessment can be used as an objective metric so that bridge owners can devise informed damage mitigation strategies and optimize resource management at the single-bridge or network level.
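
    As a minimal illustration of the scalar safety indicator mentioned above, the sketch below computes a first-order reliability index for a linear limit state g = R - S with independent normal capacity and demand; the numbers are invented, not Harahan Bridge statistics.

```python
# Hedged sketch: component reliability index beta = (mu_R - mu_S) /
# sqrt(sigma_R^2 + sigma_S^2); monitored strain data would supply (mu_S, sigma_S).
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

beta = reliability_index(mu_r=420.0, sigma_r=35.0, mu_s=250.0, sigma_s=35.0)
print(f"beta = {beta:.2f}")   # larger beta -> safer component
```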

  18. Moving Beyond Motive-based categories of Targeted Violence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weine, Stevan; Cohen, John; Brannegan, David

    Today’s categories for responding to targeted violence are motive-based and tend to drive policies, practices, training, media coverage, and research. These categories are based on the assumption that there are significant differences between ideological and non-ideological actors and between domestic and international actors. We question the reliance on these categories and offer an alternative way to frame the response to multiple forms of targeted violence. We propose adopting a community-based multidisciplinary approach to assess risk and provide interventions that are focused on the pre-criminal space. We describe four capabilities that should be implemented locally by establishing and maintaining multidisciplinary response teams that combine community and law-enforcement components: (1) community members are educated, making them better able to identify and report patterns associated with elevated risk for violence; (2) community-based professionals are trained to assess the risks for violent behavior posed by individuals; (3) community-based professionals learn to implement strategies that directly intervene in causal factors for those individuals who are at elevated risk; and (4) community-based professionals learn to monitor and assess an individual’s risk for violent behaviors on an ongoing basis. Community-based multidisciplinary response teams have the potential to identify and help persons in the pre-criminal space and to reduce barriers that have traditionally impeded community/law-enforcement collaboration.

  19. Target detection method by airborne and spaceborne images fusion based on past images

    NASA Astrophysics Data System (ADS)

    Chen, Shanjing; Kang, Qing; Wang, Zhenggang; Shen, ZhiQiang; Pu, Huan; Han, Hao; Gu, Zhongzheng

    2017-11-01

    To address the low utilization of past remote sensing data of the target area and the inaccurate recognition of camouflaged targets in existing remote sensing target detection methods, a target detection method based on the fusion of airborne and spaceborne images with past images is proposed in this paper. A past spaceborne remote sensing image of the target area is taken as the background. The airborne and spaceborne remote sensing data are fused and target features are extracted through airborne and spaceborne image registration, target change feature extraction, background noise suppression, and artificial target feature extraction based on the real-time airborne optical remote sensing image. Finally, a support vector machine is used to detect and recognize the target on the fused feature data. The experimental results establish that the proposed method combines the target-area change features of airborne and spaceborne remote sensing images with the target detection algorithm and obtains good detection and recognition results on both camouflaged and non-camouflaged targets.
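
    The final classification step can be sketched generically; the two-dimensional "change" and "appearance" features below are toy stand-ins for the registration- and change-derived features the abstract describes.

```python
# Hedged sketch: SVM detection on fused change + appearance features.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
background = rng.normal([0.2, 0.3], 0.1, size=(200, 2))  # [change, appearance]
targets = rng.normal([0.8, 0.7], 0.1, size=(200, 2))
X = np.vstack([background, targets])
y = np.array([0] * 200 + [1] * 200)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict([[0.75, 0.65], [0.25, 0.35]]))   # -> [1 0]
```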

  20. Using Model Replication to Improve the Reliability of Agent-Based Models

    NASA Astrophysics Data System (ADS)

    Zhong, Wei; Kim, Yushim

    The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. Illustrating the replication, in NetLogo and by a different author, of an ABM of fraudulent behavior in a public service delivery system originally developed in the Java-based MASON toolkit, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, replication helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  1. The development and reliability of a simple field based screening tool to assess core stability in athletes.

    PubMed

    O'Connor, S; McCaffrey, N; Whyte, E; Moran, K

    2016-07-01

    To adapt the trunk stability test to facilitate further sub-classification of higher levels of core stability in athletes for use as a screening tool, and to establish the inter-tester and intra-tester reliability of this adapted core stability test. Reliability study. Collegiate athletic therapy facilities. Fifteen physically active male subjects (age 19.46 ± 0.63 years), free from any orthopaedic or neurological disorders, were recruited from a convenience sample of collegiate students. Intraclass correlation coefficients (ICC) and 95% Confidence Intervals (CI) were computed to establish inter-tester and intra-tester reliability. Excellent ICC values were observed in the adapted core stability test for inter-tester reliability (0.97), along with good to excellent intra-tester reliability (0.73-0.90). While the 95% CIs were narrow for inter-tester reliability, the 95% CIs of Testers A and C were widely distributed compared with Tester B's. The adapted core stability test developed in this study is a quick and simple field-based test to administer that can further subdivide athletes with high levels of core stability. The test demonstrated high inter-tester and intra-tester reliability. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. An Improved Compressive Sensing and Received Signal Strength-Based Target Localization Algorithm with Unknown Target Population for Wireless Local Area Networks.

    PubMed

    Yan, Jun; Yu, Kegen; Chen, Ruizhi; Chen, Liang

    2017-05-30

    In this paper, a two-phase compressive sensing (CS) and received signal strength (RSS)-based target localization approach is proposed to improve position accuracy by dealing with the unknown target population and the effect of grid dimensions on position error. In the coarse localization phase, by formulating target localization as a sparse signal recovery problem, grids with recovery vector components greater than a threshold are chosen as the candidate target grids. In the fine localization phase, by partitioning each candidate grid, the target position in a grid is iteratively refined by using the minimum residual error rule and the least-squares technique. When all the candidate target grids are iteratively partitioned and the measurement matrix is updated, the recovery vector is re-estimated. Threshold-based detection is employed again to determine the target grids and hence the target population. As a consequence, both the target population and the position estimation accuracy can be significantly improved. Simulation results demonstrate that the proposed approach achieves the best accuracy among all the algorithms compared.
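
    The coarse phase can be sketched with a generic sparse solver; orthogonal matching pursuit below stands in for the paper's CS recovery step, and the grid, path-loss model, and detection threshold are illustrative assumptions.

```python
# Hedged sketch: coarse grid localization as sparse recovery from RSS data.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)
grid = np.array([(i, j) for i in range(10) for j in range(10)], float)  # 100 cells
sensors = rng.uniform(0, 9, size=(30, 2))

# Measurement matrix: RSS contribution of a unit target in each grid cell,
# with a simple inverse-square path-loss model.
d = np.linalg.norm(sensors[:, None, :] - grid[None, :, :], axis=2) + 0.5
A = 1.0 / d ** 2

true = np.zeros(100)
true[[23, 77]] = 1.0                       # two targets, count unknown a priori
y = A @ true + rng.normal(0, 0.01, 30)

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(A, y)
candidates = np.flatnonzero(omp.coef_ > 0.3 * omp.coef_.max())  # threshold step
print("candidate target cells:", candidates)
```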

  3. Target identification of small molecules based on chemical biology approaches.

    PubMed

    Futamura, Yushi; Muroi, Makoto; Osada, Hiroyuki

    2013-05-01

    Recently, a phenotypic approach (screens that assess the effects of compounds on cells, tissues, or whole organisms) has been reconsidered and reintroduced as a complementary strategy to the target-based approach for drug discovery. Although the finding of novel bioactive compounds from large chemical libraries has become routine, the identification of their molecular targets is still a time-consuming and difficult process, making this step rate-limiting in drug development. In the last decade, we and other researchers have amassed a large amount of phenotypic data through progress in omics research and advances in instrumentation. Accordingly, profiling methodologies that make expert use of these datasets have emerged to identify and validate the specific molecular targets of drug candidates, attaining some progress in current drug discovery (e.g., eribulin). In the case of a compound that shows an unprecedented phenotype, likely by inhibiting a first-in-class target, however, such phenotypic profiling is not applicable. Under those circumstances, a photo-crosslinking affinity approach should be beneficial. In this review, we describe and summarize recent progress in both affinity-based (direct) and phenotypic profiling (indirect) approaches for chemical biology target identification.

  4. Radiance and atmosphere propagation-based method for the target range estimation

    NASA Astrophysics Data System (ADS)

    Cho, Hoonkyung; Chun, Joohwan

    2012-06-01

    Target range estimation in modern combat systems is traditionally based on radar and active sonar. However, the performance of such active sensor devices is degraded tremendously by jamming signals from the enemy. This paper proposes a simple range estimation method between the target and the sensor. Passive IR sensors measure the infrared (IR) radiance emitted by objects at different wavelengths, and this method shows robustness against electromagnetic jamming. The target radiance measured at the IR sensor in each wavelength depends on the emissive properties of the target material and is attenuated by various factors, in particular the distance between the sensor and the target and the atmospheric environment. MODTRAN is a tool that models the atmospheric propagation of electromagnetic radiation. Based on the MODTRAN results and the measured radiance, the target range is estimated. To statistically analyze the performance of the proposed method, we use maximum likelihood estimation (MLE) and evaluate the Cramer-Rao Lower Bound (CRLB) via the probability density function of the measured radiance. We also compare the CRLB with the variance of the ML estimate using Monte Carlo simulation.
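
    A toy version of the estimator can be written down once the attenuation model is fixed; the exponential transmittance below is a crude stand-in for MODTRAN, and all constants are illustrative.

```python
# Hedged sketch: ML range estimation from radiance under exponential
# attenuation, with the single-wavelength CRLB for comparison.
import numpy as np

L0, alpha, sigma = 10.0, 0.3, 0.05       # source radiance, extinction /km, noise
model = lambda r: L0 * np.exp(-alpha * r)

rng = np.random.default_rng(4)
r_true = 5.0
meas = model(r_true) + rng.normal(0, sigma, 100)   # repeated radiance samples

# Under Gaussian noise the MLE is least squares; search a range grid.
grid = np.linspace(0.1, 20, 2000)
mse = ((meas[:, None] - model(grid)[None, :]) ** 2).mean(axis=0)
r_hat = grid[np.argmin(mse)]

# CRLB for N independent samples: sigma^2 / (N * (dL/dr)^2).
dLdr = -alpha * model(r_true)
crlb = sigma ** 2 / (len(meas) * dLdr ** 2)
print(f"r_hat = {r_hat:.2f} km, CRLB std = {np.sqrt(crlb) * 1e3:.1f} m")
```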

  5. Adaptive target binarization method based on a dual-camera system

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Zhang, Ping; Xu, Jiangtao; Gao, Zhiyuan; Gao, Jing

    2018-01-01

    An adaptive target binarization method based on a dual-camera system that contains two dynamic vision sensors is proposed. First, a denoising preprocessing procedure is introduced to remove the noise events generated by the sensors. Then, the complete edge of the target is retrieved and represented by events using an event mosaicking method. Third, the region of the target is confirmed by an event-to-event matching method. Finally, a postprocessing procedure of morphological opening and closing operations is adopted to remove the artifacts caused by event-to-event mismatching. The proposed binarization method has been extensively tested on numerous degraded images with nonuniform illumination, low contrast, noise, or light spots, and compared with other well-known binarization methods. The experimental results, based on visual and misclassification error criteria, show that the proposed method performs well and is more robust in the binarization of degraded images.
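
    The post-processing stage is standard morphology; a minimal OpenCV rendering is shown below, with a toy binary image and an assumed 3x3 kernel.

```python
# Hedged sketch: morphological open/close post-processing of a binary mask.
import cv2
import numpy as np

binary = np.zeros((60, 60), np.uint8)
binary[20:40, 20:40] = 255      # target region
binary[5, 5] = 255              # speckle artifact to remove
binary[30, 25] = 0              # small hole inside the target

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)    # removes speckle
closed = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)   # fills the hole
print(closed[5, 5], closed[30, 25])   # -> 0 255
```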

  6. A Reliability Model for Ni-BaTiO3-Based (BME) Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with base-metal electrodes (BMEs) for potential NASA space project applications requires an in-depth understanding of their reliability. The reliability of an MLCC is defined as the ability of the dielectric material to retain its insulating properties under stated environmental and operational conditions for a specified period of time t. In this presentation, a general mathematical expression of a reliability model for a BME MLCC is developed and discussed. The reliability model consists of three parts: (1) a statistical distribution that describes the individual variation of properties in a test group of samples (Weibull, lognormal, normal, etc.); (2) an acceleration function that describes how a capacitor's reliability responds to external stresses such as applied voltage and temperature (all units in the test group should follow the same acceleration function if they share the same failure mode, independent of individual units); and (3) the effect and contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size S. In general, a two-parameter Weibull statistical distribution model is used to describe a BME capacitor's reliability as a function of time. The acceleration function that relates a capacitor's reliability to external stresses depends on the failure mode. Two failure modes have been identified in BME MLCCs: catastrophic and slow degradation. A catastrophic failure is characterized by a time-accelerating increase in leakage current that is mainly due to existing processing defects (voids, cracks, delamination, etc.), i.e., extrinsic defects. A slow degradation failure is characterized by a near-linear increase in leakage current against the stress time; this is caused by the electromigration of oxygen vacancies (intrinsic defects). The
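
    The first two ingredients combine naturally in code; the sketch below pairs a two-parameter Weibull with a Prokopowicz-Vaskas style voltage/temperature acceleration, where the exponent, activation energy, and test-condition scale parameter are illustrative assumptions rather than fitted BME values.

```python
# Hedged sketch: Weibull reliability with voltage/temperature acceleration.
import math

def acceleration(v_use, t_use, v_test, t_test, n=3.0, ea=1.1):
    """Time-acceleration factor from test to use conditions (T in kelvin)."""
    k = 8.617e-5   # Boltzmann constant, eV/K
    return (v_test / v_use) ** n * math.exp(ea / k * (1 / t_use - 1 / t_test))

def weibull_reliability(t, eta, beta):
    """R(t) = exp(-(t/eta)^beta)."""
    return math.exp(-((t / eta) ** beta))

af = acceleration(v_use=5.0, t_use=318.0, v_test=15.0, t_test=398.0)
eta_use = 2_000.0 * af   # scale parameter from test conditions, scaled to use
print(f"AF = {af:.0f}, R(10 years) = {weibull_reliability(87_600, eta_use, 1.8):.6f}")
```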

  7. NMR-based platform for fragment-based lead discovery used in screening BRD4-targeted compounds

    PubMed Central

    Yu, Jun-lan; Chen, Tian-tian; Zhou, Chen; Lian, Fu-lin; Tang, Xu-long; Wen, Yi; Shen, Jing-kang; Xu, Ye-chun; Xiong, Bing; Zhang, Nai-xia

    2016-01-01

    Aim: Fragment-based lead discovery (FBLD) is a complementary approach in drug research and development. In this study, we established an NMR-based FBLD platform that was used to screen novel scaffolds targeting human bromodomain of BRD4, and investigated the binding interactions between hit compounds and the target protein. Methods: 1D NMR techniques were primarily used to generate the fragment library and to screen compounds. The inhibitory activity of hits on the first bromodomain of BRD4 [BRD4(I)] was examined using fluorescence anisotropy binding assay. 2D NMR and X-ray crystallography were applied to characterize the binding interactions between hit compounds and the target protein. Results: An NMR-based fragment library containing 539 compounds was established, which were clustered into 56 groups (8–10 compounds in each group). Eight hits with new scaffolds were found to inhibit BRD4(I). Four out of the 8 hits (compounds 1, 2, 8 and 9) had IC50 values of 100–260 μmol/L, demonstrating their potential for further BRD4-targeted hit-to-lead optimization. Analysis of the binding interactions revealed that compounds 1 and 2 shared a common quinazolin core structure and bound to BRD4(I) in a non-acetylated lysine mimetic mode. Conclusion: An NMR-based platform for FBLD was established and used in discovery of BRD4-targeted compounds. Four potential hit-to-lead optimization candidates have been found, two of them bound to BRD4(I) in a non-acetylated lysine mimetic mode, being selective BRD4(I) inhibitors. PMID:27238211

  8. Modeling and validation of photometric characteristics of space targets oriented to space-based observation.

    PubMed

    Wang, Hongyuan; Zhang, Wei; Dong, Aotuo

    2012-11-10

    A modeling and validation method for the photometric characteristics of space targets was presented in order to track and identify different satellites effectively. The background radiation characteristics models of the target were built based on blackbody radiation theory. The geometric characteristics of the target were illustrated by surface equations based on its body coordinate system. The material characteristics of the target surface were described by a bidirectional reflectance distribution function (BRDF) model, which accounts for the Gaussian statistics of the surface and microscale self-shadowing and is obtained by measurement and modeling in advance. The surfaces of the target contributing to the observation system were determined by coordinate transformation according to the relative positions of the space-based target, the background radiation sources, and the observation platform. A mathematical model of the photometric characteristics of the space target was then built by summing the reflection components of all the surfaces. Photometric characteristics simulation of the space-based target was achieved according to its given geometrical dimensions, physical parameters, and orbital parameters. Experimental validation was performed on a scale model of the satellite. The calculated results fit well with the measured results, which indicates that the modeling method for the photometric characteristics of space targets is correct.
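
    The "sum over contributing surfaces" step admits a compact sketch; a Lambertian BRDF below stands in for the paper's measured Gaussian-statistics BRDF, and the facet geometry and albedos are invented.

```python
# Hedged sketch: photometric signal as a sum of per-facet reflection terms.
import numpy as np

def facet_signal(normal, area, rho, sun_dir, view_dir, irradiance=1361.0):
    """Reflection toward the viewer from one Lambertian facet (arbitrary units)."""
    cos_i = max(np.dot(normal, sun_dir), 0.0)    # illumination angle factor
    cos_v = max(np.dot(normal, view_dir), 0.0)   # viewing angle factor
    return irradiance * rho / np.pi * area * cos_i * cos_v

# Toy box-shaped satellite: (facet normal, area in m^2, albedo).
facets = [
    (np.array([1.0, 0.0, 0.0]), 2.0, 0.3),
    (np.array([0.0, 1.0, 0.0]), 2.0, 0.3),
    (np.array([0.0, 0.0, 1.0]), 6.0, 0.6),       # solar-panel face
]
sun = np.array([0.6, 0.48, 0.64])                # unit vector toward the sun
view = np.array([0.0, 0.6, 0.8])                 # unit vector toward the sensor

total = sum(facet_signal(n, a, r, sun, view) for n, a, r in facets)
print(f"photometric signal: {total:.1f} (arbitrary units)")
```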

  9. Reliability of Pain Measurements Using Computerized Cuff Algometry: A DoloCuff Reliability and Agreement Study.

    PubMed

    Kvistgaard Olsen, Jack; Fener, Dilay Kesgin; Waehrens, Eva Elisabet; Wulf Christensen, Anton; Jespersen, Anders; Danneskiold-Samsøe, Bente; Bartels, Else Marie

    2017-07-01

    Computerized pneumatic cuff pressure algometry (CPA) using the DoloCuff is a new method for pain assessment. Intra- and inter-rater reliabilities have not yet been established. Our aim was to examine the inter- and intrarater reliabilities of DoloCuff measures in healthy subjects. Twenty healthy subjects (ages 20 to 29 years) were assessed three times at 24-hour intervals by two trained raters. Inter-rater reliability was established based on the first and second assessments, whereas intrarater reliability was based on the second and third assessments. Subjects were randomized 1:1 to first assessment at either rater 1 or rater 2. The variables of interest were pressure pain threshold (PT), pressure pain tolerance (PTol), and temporal summation index (TSI). Reliability was estimated by a two-way mixed intraclass correlation coefficient (ICC) absolute agreement analysis. Reliability was considered excellent if ICC > 0.75, fair to good if 0.4 < ICC < 0.75, and poor if ICC < 0.4. Bias and random errors between raters and assessments were evaluated using 95% confidence interval (CI) and Bland-Altman plots. Inter-rater reliability for PT, PTol, and TSI was 0.88 (95% CI: 0.69 to 0.95), 0.86 (95% CI: 0.65 to 0.95), and 0.81 (95% CI: 0.42 to 0.94), respectively. The intrarater reliability for PT, PTol, and TSI was 0.81 (95% CI: 0.53 to 0.92), 0.89 (95% CI: 0.74 to 0.96), and 0.75 (95% CI: 0.28 to 0.91), respectively. Inter-rater reliability was excellent for PT, PTol, and TSI. Similarly, the intrarater reliability for PT and PTol was excellent, while borderline excellent/good for TSI. Therefore, the DoloCuff can be used to obtain reliable measures of pressure pain parameters in healthy subjects. © 2016 World Institute of Pain.

  10. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models

    PubMed Central

    2018-01-01

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. The design of a self-tuning methodology and its implementation are presented in this paper, to maximize the reliability of a LiDAR sensor network for obstacle detection in ‘Internet of Things’ (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected in order to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors, and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving-assistance user scenario connecting a five-sensor LiDAR network is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrated that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds. PMID:29748521

  11. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models.

    PubMed

    Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E

    2018-05-10

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. The design of a self-tuning methodology and its implementation are presented in this paper, to maximize the reliability of a LiDAR sensor network for obstacle detection in 'Internet of Things' (IoT) mobility scenarios. The Webots Automobile 3D simulation tool for emulating sensor interaction in complex driving environments is selected in order to achieve that objective. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, k-nearest neighbors, and linear regression models. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving-assistance user scenario connecting a five-sensor LiDAR network is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrated that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds.
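
    The sensor-count decision reduces, in the simplest reading, to a one-state Q-learning (bandit) problem; the reward shape and cost per sensor below are invented stand-ins for the simulated driving environment.

```python
# Hedged sketch: tabular Q-learning over "number of active LiDAR sensors".
import numpy as np

rng = np.random.default_rng(5)
n_actions = 5                 # use 1..5 sensors
q = np.zeros(n_actions)
alpha, eps = 0.1, 0.2

def reward(n_sensors):
    detect_p = 1 - 0.5 ** n_sensors     # more sensors -> higher reliability
    detected = rng.random() < detect_p
    return (1.0 if detected else -1.0) - 0.08 * n_sensors   # cost per sensor

for _ in range(5000):
    a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(q))
    q[a] += alpha * (reward(a + 1) - q[a])   # single-state (bandit) update

print("learned sensor count:", int(np.argmax(q)) + 1)
```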

  12. Assessing the Reliability of Curriculum-Based Measurement: An Application of Latent Growth Modeling

    ERIC Educational Resources Information Center

    Yeo, Seungsoo; Kim, Dong-Il; Branum-Martin, Lee; Wayman, Miya Miura; Espin, Christine A.

    2012-01-01

    The purpose of this study was to demonstrate the use of Latent Growth Modeling (LGM) as a method for estimating the reliability of Curriculum-Based Measurement (CBM) progress-monitoring data. The LGM approach permits the error associated with each measure to differ at each time point, thus providing an alternative method for examining the…

  13. Target virus log10 reduction values determined for two reclaimed wastewater irrigation scenarios in Japan based on tolerable annual disease burden.

    PubMed

    Ito, Toshihiro; Kitajima, Masaaki; Kato, Tsuyoshi; Ishii, Satoshi; Segawa, Takahiro; Okabe, Satoshi; Sano, Daisuke

    2017-11-15

    Multiple barriers are widely employed for managing microbial risks in water reuse, in which different types of wastewater treatment units (biological treatment, disinfection, etc.) and health protection measures (use of personal protective gear, vegetable washing, etc.) are combined to achieve a performance target value of log10 reduction (LR) of viruses. The LR target value needs to be calculated based on data obtained from monitoring the viruses of concern and on the water reuse scheme in the countries/regions where water reuse is implemented. In this study, we calculated virus LR target values under two exposure scenarios for reclaimed wastewater irrigation in Japan, using the concentrations of indigenous viruses in untreated wastewater and a defined tolerable annual disease burden (10^-4 or 10^-6 disability-adjusted life years per person per year (DALY pppy)). Three genogroups of norovirus (genogroup I (NoV GI), genogroup II (NoV GII), and genogroup IV (NoV GIV)) in untreated wastewater were quantified as model viruses using reverse transcription-microfluidic quantitative PCR, and only NoV GII was present at quantifiable concentrations. The probabilistic distribution of the NoV GII concentration in untreated wastewater was then estimated from its concentration dataset and used to calculate the LR target values of NoV GII for wastewater treatment. Assuming accidental ingestion of reclaimed wastewater by Japanese farmers, the NoV GII LR target values corresponding to the tolerable annual disease burden of 10^-6 DALY pppy were 3.2, 4.4, and 5.7 at the 95th, 99th, and 99.9th percentiles, respectively. These percentile values, defined as "reliability," represent the cumulative probability that the NoV GII concentration distribution in untreated wastewater falls below the corresponding tolerable annual disease burden after wastewater reclamation. An approximate 1-log10 difference in LR target values was observed between 10^-4 and 10^-6 DALY pppy
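
    The percentile-to-LR mapping can be sketched once a concentration distribution and a tolerable post-treatment concentration are fixed; both numbers below are illustrative assumptions, not the study's fitted NoV GII parameters.

```python
# Hedged sketch: LR target at a given "reliability" percentile from a
# lognormal raw-wastewater virus concentration distribution.
import numpy as np
from scipy import stats

mu, sigma = np.log(1e5), 1.2   # assumed lognormal parameters, copies/L
c_tol = 2.0                    # tolerable concentration after treatment,
                               # assumed derived from the 1e-6 DALY pppy target

for p in (0.95, 0.99, 0.999):
    c_raw = stats.lognorm.ppf(p, s=sigma, scale=np.exp(mu))
    lr = np.log10(c_raw / c_tol)
    print(f"reliability {p:.1%}: LR target = {lr:.1f} log10")
```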

  14. Adaptive block online learning target tracking based on super pixel segmentation

    NASA Astrophysics Data System (ADS)

    Cheng, Yue; Li, Jianzeng

    2018-04-01

    Video target tracking technology has made great progress through the unremitting exploration of predecessors, but many problems remain unsolved. This paper proposes a new target tracking algorithm based on image segmentation technology. First, we segment the selected region using the simple linear iterative clustering (SLIC) algorithm; after that, we partition the area into blocks with an improved density-based spatial clustering of applications with noise (DBSCAN) clustering algorithm. Each sub-block independently trains a classifier and is tracked; the algorithm then ignores sub-blocks whose tracking failed and reintegrates the remaining sub-blocks into the tracking box to complete the target tracking. The experimental results show that, compared with current mainstream algorithms, our algorithm works effectively under occlusion interference, rotation change, scale change, and many other problems in target tracking.
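
    The segmentation-and-grouping front end maps directly onto common library calls; the sketch pairs scikit-image's SLIC with scikit-learn's DBSCAN on superpixel centroids, with a random frame and clustering parameters chosen arbitrarily.

```python
# Hedged sketch: SLIC superpixels grouped into sub-blocks with DBSCAN.
import numpy as np
from skimage.segmentation import slic
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(6)
frame = rng.random((120, 160, 3))         # stand-in for a video frame

labels = slic(frame, n_segments=100, compactness=10.0)

# One centroid per superpixel; a real tracker would restrict this to the
# selected target region.
ys, xs = np.mgrid[0:120, 0:160]
centroids = np.array([
    (ys[labels == k].mean(), xs[labels == k].mean()) for k in np.unique(labels)
])

blocks = DBSCAN(eps=15.0, min_samples=2).fit_predict(centroids)
print("sub-blocks found:", len(set(blocks) - {-1}))
```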

  15. Classification of underwater target echoes based on auditory perception characteristics

    NASA Astrophysics Data System (ADS)

    Li, Xiukun; Meng, Xiangxia; Liu, Hang; Liu, Mingye

    2014-06-01

    In underwater target detection, bottom reverberation shares some properties with the target echo, which greatly degrades detection performance. It is therefore essential to study the difference between the target echo and reverberation. In this paper, exploiting the unique advantage of human listening ability in distinguishing objects, the Gammatone filter is taken as the auditory model. In addition, time-frequency perception features and auditory spectral features are extracted to separate active sonar target echoes from bottom reverberation. The features of the experimental data show good concentration within the same class and large differences between classes, which shows that this method can effectively distinguish between target echo and reverberation.
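
    For readers who want the filterbank made concrete, the sketch below builds a fourth-order gammatone impulse response and takes sub-band energies as a crude auditory-spectrum feature; the center frequencies and bandwidth rule are standard textbook choices, not the paper's exact settings.

```python
# Hedged sketch: gammatone filterbank energies as auditory features.
import numpy as np

fs = 16_000
t = np.arange(0, 0.025, 1 / fs)           # 25 ms impulse response

def gammatone_ir(fc):
    erb = 24.7 * (4.37 * fc / 1000 + 1)   # Glasberg & Moore ERB rule
    g = t ** 3 * np.exp(-2 * np.pi * 1.019 * erb * t) * np.cos(2 * np.pi * fc * t)
    return g / np.abs(g).sum()

rng = np.random.default_rng(7)
echo = rng.normal(size=4000)              # stand-in for a sonar return

centers = np.geomspace(100, 4000, 16)
features = [np.sum(np.convolve(echo, gammatone_ir(fc)) ** 2) for fc in centers]
print(np.round(np.log10(features), 2))    # log sub-band energies
```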

  16. Fluorogenic reaction-based prodrug conjugates as targeted cancer theranostics.

    PubMed

    Lee, Min Hee; Sharma, Amit; Chang, Min Jung; Lee, Jinju; Son, Subin; Sessler, Jonathan L; Kang, Chulhun; Kim, Jong Seung

    2018-01-02

    Theranostic systems are receiving ever-increasing attention due to their potential therapeutic utility, imaging enhancement capability, and promise for advancing the field of personalized medicine, particularly as it relates to the diagnosis, staging, and treatment of cancer. In this Tutorial Review, we provide an introduction to the concepts of theranostic drug delivery effected via use of conjugates that are able to target cancer cells selectively, provide cytotoxic chemotherapeutics, and produce readily monitored imaging signals in vitro and in vivo. The underlying design concepts, requiring the synthesis of conjugates composed of imaging reporters, masked chemotherapeutic drugs, cleavable linkers, and cancer targeting ligands, are discussed. Particular emphasis is placed on highlighting the potential benefits of fluorogenic reaction-based targeted systems that are activated for both imaging and therapy either by cellular entities present at relatively elevated levels in tumour environments (e.g., thiols, reactive oxygen species, and enzymes) or by physiological characteristics of cancer (e.g., hypoxia and acidic pH). Also discussed are systems activated by an external stimulus, such as light. The work summarized in this Tutorial Review will help define the role fluorogenic reaction-based, cancer-targeting theranostics may have in advancing drug discovery efforts, as well as improving our understanding of cellular uptake and drug release mechanisms.

  17. Physics-based process modeling, reliability prediction, and design guidelines for flip-chip devices

    NASA Astrophysics Data System (ADS)

    Michaelides, Stylianos

    -down devices without the underfill, based on the thorough understanding of the failure modes. Also, practical design guidelines for material, geometry and process parameters for reliable flip-chip devices have been developed.

  18. Reliability and validity of procedure-based assessments in otolaryngology training.

    PubMed

    Awad, Zaid; Hayden, Lindsay; Robson, Andrew K; Muthuswamy, Keerthini; Tolley, Neil S

    2015-06-01

    To investigate the reliability and construct validity of procedure-based assessment (PBA) in assessing performance and progress in otolaryngology training. Retrospective database analysis using a national electronic database. We analyzed PBAs of otolaryngology trainees in North London from core trainees (CTs) to specialty trainees (STs). The tool contains six multi-item domains: consent, planning, preparation, exposure/closure, technique, and postoperative care, rated as "satisfactory" or "development required," in addition to an overall performance rating (pS) of 1 to 4. The individual domain score, overall calculated score (cS), and number of "development-required" items were calculated for each PBA. Receiver operating characteristic analysis helped determine sensitivity and specificity. A total of 3,152 PBAs from 46 otolaryngology trainees were analyzed. PBA reliability was high (Cronbach's α 0.899), and sensitivity approached 99%. cS correlated positively with pS and level in training (rs: +0.681 and +0.324, respectively). STs had higher cS and pS than CTs (93% ± 0.6 and 3.2 ± 0.03 vs. 71% ± 3.1 and 2.3 ± 0.08, respectively; P < .001). cS and pS increased from CT1 to ST8, showing construct validity (rs: +0.348 and +0.354, respectively; P < .001). The technical skill domain had the highest utilization (98% of PBAs) and was the best predictor of cS and pS (rs: +0.96 and +0.66, respectively). PBA is reliable and valid for assessing otolaryngology trainees' performance and progress at all levels. It is highly sensitive in identifying competent trainees. The tool is used in a formative and feedback capacity. The technical domain is the best predictor and should be given close attention. NA. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  19. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced, with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty, and risk. In this paper, the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient, called the hazardous risk coefficient, covers anticipated hazards that may occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance, and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritization in ranking of critical items using the developed mathematical model for risk assessment shall be useful in optimizing financial losses and the timing of maintenance actions.
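
    The "random number simulation" step can be pictured with a toy example; the item list, coefficient values, and uniform controlling ranges below are fabricated purely to show the sum-and-rank mechanics.

```python
# Hedged sketch: one distinctive value per risk coefficient by simulation,
# then summation and ranking of critical items.
import numpy as np

rng = np.random.default_rng(8)
# (likelihood, abstract, hazardous) coefficient centers per critical item.
items = {"bearing": (0.5, 0.7, 0.4), "gearbox": (0.6, 0.5, 0.8), "seal": (0.3, 0.4, 0.2)}

def distinctive(center, spread=0.1, n=10_000):
    """Mean of simulated draws within the coefficient's controlling range."""
    return rng.uniform(center - spread, center + spread, n).mean()

final = {name: sum(distinctive(c) for c in coeffs) for name, coeffs in items.items()}
for name, score in sorted(final.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} final risk coefficient = {score:.2f}")
```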

  20. Neighborhood-based tobacco advertising targeting adolescents.

    PubMed

    Ammerman, S D; Nolden, M

    1995-06-01

    Adolescent tobacco use remains a serious problem, and adolescents may be particularly receptive to the glamorous images tobacco companies use in advertisements. A relatively new form of neighborhood-based outdoor advertising, the illuminated bus-stop-shelter billboard, was studied to determine tobacco companies' use of this medium. We hypothesized that in 2 distinct San Francisco, California, neighborhoods, 1 predominantly white and the other mostly Latino, we would find a predominance of tobacco advertising on these billboards in both neighborhoods, that tobacco advertisements would be more prevalent in the minority Latino neighborhood, and that tobacco advertising would target adolescents in both neighborhoods. Each bus-stop-shelter billboard advertisement in the study areas from April 1992 to March 1993 was recorded. The type and frequency of products advertised and qualitative content of tobacco advertisements were analyzed. Adolescents' possible exposure to these advertisements was noted. Our main outcome measures were the percentage of tobacco advertising, possible adolescent exposure to this advertising, and themes of the tobacco advertisements. About 10% of all bus-stop-shelter billboard advertisements in each area promoted tobacco use. Possible exposures to these advertisements were greater in the Latino neighborhood because of a greater adolescent population. Qualitative analyses of tobacco advertisements suggested that adolescents are the primary targets. We urge physicians and educators to explicitly address this form of tobacco advertising, and we urge a ban on neighborhood-based tobacco advertising.

  1. Assessing I-Grid(TM) web-based monitoring for power quality and reliability benchmarking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Divan, Deepak; Brumsickle, William; Eto, Joseph

    2003-04-30

    This paper presents preliminary findings from DOE's pilot program. The results show how a web-based monitoring system can form the basis for aggregation of data and for correlation and benchmarking across broad geographical lines. A longer report describes additional findings from the pilot, including impacts of power quality and reliability on customers' operations [Divan, Brumsickle, Eto 2003].

  2. Autonomous, Decentralized Grid Architecture: Prosumer-Based Distributed Autonomous Cyber-Physical Architecture for Ultra-Reliable Green Electricity Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2012-01-11

    GENI Project: Georgia Tech is developing a decentralized, autonomous, internet-like control architecture and control software system for the electric power grid. Georgia Tech’s new architecture is based on the emerging concept of electricity prosumers—economically motivated actors that can produce, consume, or store electricity. Under Georgia Tech’s architecture, all of the actors in an energy system are empowered to offer associated energy services based on their capabilities. The actors achieve their sustainability, efficiency, reliability, and economic objectives, while contributing to system-wide reliability and efficiency goals. This is in marked contrast to the current one-way, centralized control paradigm.

  3. A critique of the molecular target-based drug discovery paradigm based on principles of metabolic control: advantages of pathway-based discovery.

    PubMed

    Hellerstein, Marc K

    2008-01-01

    Contemporary drug discovery and development (DDD) is dominated by a molecular target-based paradigm. Molecular targets that are potentially important in disease are physically characterized; chemical entities that interact with these targets are identified by ex vivo high-throughput screening assays, and optimized lead compounds enter testing as drugs. Contrary to highly publicized claims, the ascendance of this approach has in fact resulted in the lowest rate of new drug approvals in a generation. The primary explanation for low rates of new drugs is attrition, or the failure of candidates identified by molecular target-based methods to advance successfully through the DDD process. In this essay, I advance the thesis that this failure was predictable, based on modern principles of metabolic control that have emerged and been applied most forcefully in the field of metabolic engineering. These principles, such as the robustness of flux distributions, address connectivity relationships in complex metabolic networks and make it unlikely a priori that modulating most molecular targets will have predictable, beneficial functional outcomes. These same principles also suggest, however, that unexpected therapeutic actions will be common for agents that have any effect (i.e., that complexity can be exploited therapeutically). A potential operational solution (pathway-based DDD), based on observability rather than predictability, is described, focusing on emergent properties of key metabolic pathways in vivo. Recent examples of pathway-based DDD are described. In summary, the molecular target-based DDD paradigm is built on a naïve and misleading model of biologic control and is not heuristically adequate for advancing the mission of modern therapeutics. New approaches that take account of and are built on principles described by metabolic engineers are needed for the next generation of DDD.

  4. Research on Aircraft Target Detection Algorithm Based on Improved Radial Gradient Transformation

    NASA Astrophysics Data System (ADS)

    Zhao, Z. M.; Gao, X. M.; Jiang, D. N.; Zhang, Y. Q.

    2018-04-01

    Aiming at the problem that targets may appear in different orientations in unmanned aerial vehicle (UAV) images, this paper studies target detection algorithms based on rotation-invariant features and proposes a RIFF (Rotation-Invariant Fast Features) method, accelerated by a look-up table and polar coordinates, for aircraft target detection. Experiments show that the detection performance of this method is essentially equal to that of the original RIFF, while its computational efficiency is greatly improved.

  5. Polarimetric subspace target detector for SAR data based on the Huynen dihedral model

    NASA Astrophysics Data System (ADS)

    Larson, Victor J.; Novak, Leslie M.

    1995-06-01

    Two new polarimetric subspace target detectors are developed based on a dihedral signal model for bright peaks within a spatially extended target signature. The first is a coherent dihedral target detector based on the exact Huynen model for a dihedral. The second is a noncoherent dihedral target detector based on the Huynen model with an extra unknown phase term. Expressions for these polarimetric subspace target detectors are developed for both additive Gaussian clutter and more general additive spherically invariant random vector clutter including the K-distribution. For the case of Gaussian clutter with unknown clutter parameters, constant false alarm rate implementations of these polarimetric subspace target detectors are developed. The performance of these dihedral detectors is demonstrated with real millimeter-wave fully polarimetric SAR data. The coherent dihedral detector which is developed with a more accurate description of a dihedral offers no performance advantage over the noncoherent dihedral detector which is computationally more attractive. The dihedral detectors do a better job of separating a set of tactical military targets from natural clutter compared to a detector that assumes no knowledge about the polarimetric structure of the target signal.
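
    A generic matched-subspace statistic conveys the flavor of such detectors; the fixed steering vector below is only an idealized dihedral response (HH = -VV), not the full Huynen parameterization, and the clutter covariance is invented.

```python
# Hedged sketch: polarimetric detection statistic in Gaussian clutter.
import numpy as np

rng = np.random.default_rng(9)
# Clutter covariance for (HH, HV, VV) scattering vectors, assumed known.
sigma = np.array([[1.0, 0.0, 0.4],
                  [0.0, 0.2, 0.0],
                  [0.4, 0.0, 1.0]])
s = np.array([1.0, 0.0, -1.0])        # idealized dihedral: HH = -VV
si = np.linalg.inv(sigma)

def stat(x):
    """GLRT-style statistic: energy of x along the signal direction."""
    return np.abs(s @ si @ x) ** 2 / (s @ si @ s)

clutter = rng.multivariate_normal(np.zeros(3), sigma)
target = 3.0 * s + clutter
print(f"clutter: {stat(clutter):.2f}  target: {stat(target):.2f}")
```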

  6. Target-object integration, attention distribution, and object orientation interactively modulate object-based selection.

    PubMed

    Al-Janabi, Shahd; Greenberg, Adam S

    2016-10-01

    The representational basis of attentional selection can be object-based. Various studies have suggested, however, that object-based selection is less robust than spatial selection across experimental paradigms. We sought to examine the manner by which the following factors might explain this variation: Target-Object Integration (targets 'on' vs. part 'of' an object), Attention Distribution (narrow vs. wide), and Object Orientation (horizontal vs. vertical). In Experiment 1, participants discriminated between two targets presented 'on' an object in one session, or presented as a change 'of' an object in another session. There was no spatial cue-thus, attention was initially focused widely-and the objects were horizontal or vertical. We found evidence of object-based selection only when targets constituted a change 'of' an object. Additionally, object orientation modulated the sign of object-based selection: We observed a same-object advantage for horizontal objects, but a same-object cost for vertical objects. In Experiment 2, an informative cue preceded a single target presented 'on' an object or as a change 'of' an object (thus, attention was initially focused narrowly). Unlike in Experiment 1, we found evidence of object-based selection independent of target-object integration. We again found that the sign of selection was modulated by the objects' orientation. This result may reflect a meridian effect, which emerged due to anisotropies in the cortical representations when attention is oriented endogenously. Experiment 3 revealed that object orientation did not modulate object-based selection when attention was oriented exogenously. Our findings suggest that target-object integration, attention distribution, and object orientation modulate object-based selection, but only in combination.

  7. Reliability techniques in the petroleum industry

    NASA Technical Reports Server (NTRS)

    Williams, H. L.

    1971-01-01

    Quantitative reliability evaluation methods used in the Apollo Spacecraft Program are translated into petroleum industry requirements with emphasis on offsetting reliability demonstration costs and limited production runs. Described are the qualitative disciplines applicable, the definitions and criteria that accompany the disciplines, and the generic application of these disciplines to the chemical industry. The disciplines are then translated into proposed definitions and criteria for the industry, into a base-line reliability plan that includes these disciplines, and into application notes to aid in adapting the base-line plan to a specific operation.

  8. Targeted therapy according to next generation sequencing-based panel sequencing.

    PubMed

    Saito, Motonobu; Momma, Tomoyuki; Kono, Koji

    2018-04-17

    Targeted therapy against actionable gene mutations shows a significantly higher response rate as well as longer survival compared with conventional chemotherapy, and it has become a standard therapy for many cancers. Recent progress in next-generation sequencing (NGS) has enabled the identification of huge numbers of genetic aberrations. Based on sequencing results, patients are recommended to undergo targeted therapy or immunotherapy. In cases where there are no approved drugs available for the genetic mutations detected in a patient, it is recommended to facilitate registration for clinical trials. For that purpose, NGS-based sequencing panels that can simultaneously target multiple genes in a single investigation have been used in daily clinical practice. To date, various types of sequencing panels have been developed to investigate genetic aberrations involving tumor somatic genome variants (gain-of-function or loss-of-function mutations, high-level copy number alterations, and gene fusions) through comprehensive bioinformatics. Because sequencing panels are efficient and cost-effective, they are quickly being adopted outside the lab, in hospitals and clinics, in order to identify personalized targeted therapy for individual cancer patients.

  9. WEAMR-a weighted energy aware multipath reliable routing mechanism for hotline-based WSNs.

    PubMed

    Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung

    2013-05-13

    Reliable source-to-sink communication is the most important requirement for an efficient routing protocol, especially in military, healthcare, and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy-aware multipath routing protocol that utilizes hotline-assisted routing to meet such requirements for mission-critical applications. The protocol reduces the average number of hops from source to destination and provides unmatched reliability compared with well-known reactive ad hoc protocols, i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on a weighted cost calculation and intelligently selects the best possible paths for data transmission. The path cost calculation considers the end-to-end number of hops, the latency, and the minimum energy node value in the path. In case of path failure, path recalculation is done efficiently with minimum latency and control packet overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and a higher packet delivery success ratio compared with AODV and AOMDV. The use of multipath routing also increases the overall lifetime of the WSN by using optimum-energy available paths between sender and receiver.
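
    The weighted cost rule is easy to make concrete; the weights and the inverse-energy term below are illustrative assumptions about how the three factors might be combined.

```python
# Hedged sketch: weighted path cost over hops, latency, and bottleneck energy.
def path_cost(hops, latency_ms, min_energy_j, w=(0.4, 0.3, 0.3)):
    """Lower is better; energy enters inversely so weak paths cost more."""
    return w[0] * hops + w[1] * latency_ms + w[2] / max(min_energy_j, 1e-6)

paths = {
    "A": dict(hops=4, latency_ms=12.0, min_energy_j=0.9),
    "B": dict(hops=3, latency_ms=18.0, min_energy_j=0.2),  # weak bottleneck node
}
best = min(paths, key=lambda p: path_cost(**paths[p]))
print("selected path:", best)   # A wins despite the extra hop
```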

  10. Weighted Optimization-Based Distributed Kalman Filter for Nonlinear Target Tracking in Collaborative Sensor Networks.

    PubMed

    Chen, Jie; Li, Jiahong; Yang, Shuanghua; Deng, Fang

    2017-11-01

    The identification of nonlinearity and coupling is crucial in the nonlinear target tracking problem in collaborative sensor networks. According to the adaptive Kalman filtering (KF) method, the nonlinearity and coupling can be regarded as model noise covariance and estimated by minimizing the innovation or residual errors of the states. However, the method requires a large time window of data to achieve reliable covariance measurement, making it impractical for rapidly changing nonlinear systems. To deal with this problem, a weighted optimization-based distributed KF algorithm (WODKF) is proposed in this paper. The algorithm enlarges the data size of each sensor with the measurements and state estimates received from its connected sensors instead of the time window. A new cost function is set as the weighted sum of the bias and oscillation of the state to estimate the "best" estimate of the model noise covariance. The bias and oscillation of the state of each sensor are estimated by polynomial fitting a time window of state estimates and measurements of the sensor and its neighbors, weighted by the measurement noise covariance. The best estimate of the model noise covariance is computed by minimizing the weighted cost function using the exhaustive method. A sensor selection method is incorporated into the algorithm to decrease the computational load of the filter and increase the scalability of the sensor network. The existence, suboptimality, and stability analysis of the algorithm are given. The local probability data association method is used in the proposed algorithm for the multitarget tracking case. The algorithm is demonstrated in simulations on tracking examples for a random signal, one nonlinear target, and four nonlinear targets. Results show the feasibility and superiority of WODKF against other filtering algorithms for a large class of systems.
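
    The bias and oscillation ingredients of the cost admit a compact illustration; the window length, polynomial order, equal weights, and injected offset are assumptions standing in for WODKF's measurement-noise-weighted fit.

```python
# Hedged sketch: bias and oscillation of a sensor's state estimates relative
# to a polynomial trend fitted to neighbor-provided data over a time window.
import numpy as np

rng = np.random.default_rng(10)
t = np.arange(20)
truth = 0.05 * t ** 2                               # slowly varying true state
neighbors = truth + rng.normal(0, 0.2, t.size)      # neighbor measurements
own = truth + 0.4 + rng.normal(0, 0.3, t.size)      # own estimates, offset bias

trend = np.polyval(np.polyfit(t, neighbors, deg=2), t)
bias = np.mean(own - trend)                         # systematic offset
oscillation = np.std(own - trend)                   # residual jitter
cost = 0.5 * abs(bias) + 0.5 * oscillation          # weighted sum, per abstract
print(f"bias={bias:.2f} oscillation={oscillation:.2f} cost={cost:.2f}")
```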

  11. Is Teacher Assessment Reliable or Valid for High School Students under a Web-Based Portfolio Environment?

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Wu, Bing-Hong

    2012-01-01

    This study explored the reliability and validity of teacher assessment under a Web-based portfolio assessment environment (or Web-based teacher portfolio assessment). Participants were 72 eleventh graders taking the "Computer Application" course. The students performed portfolio creation, inspection, and self- and peer-assessment using the Web-based…

  12. Assessing the Conditional Reliability of State Assessments

    ERIC Educational Resources Information Center

    May, Henry; Cole, Russell; Haimson, Josh; Perez-Johnson, Irma

    2010-01-01

    The purpose of this study is to provide empirical benchmarks of the conditional reliabilities of state tests for samples of the student population defined by ability level. Given that many educational interventions are targeted for samples of low performing students, schools, or districts, the primary goal of this research is to determine how…

  13. Comprehensive proficiency-based inanimate training for robotic surgery: reliability, feasibility, and educational benefit.

    PubMed

    Arain, Nabeel A; Dulan, Genevieve; Hogg, Deborah C; Rege, Robert V; Powers, Cathryn E; Tesfay, Seifu T; Hynan, Linda S; Scott, Daniel J

    2012-10-01

    We previously developed a comprehensive proficiency-based robotic training curriculum demonstrating construct, content, and face validity. This study aimed to assess the reliability, feasibility, and educational benefit associated with curricular implementation. Over an 11-month period, 55 residents, fellows, and faculty (robotic novices) from general surgery, urology, and gynecology were enrolled in a 2-month curriculum: online didactics, a half-day hands-on tutorial, and self-practice using nine inanimate exercises. Each trainee completed a questionnaire and performed a single proctored repetition of each task before (pretest) and after (post-test) training. Tasks were scored for time and errors using modified FLS metrics. For inter-rater reliability (IRR), three trainees were scored by two raters and analyzed using intraclass correlation coefficients (ICC). Data from eight experts were analyzed using ICC and Cronbach's α to determine test-retest reliability and internal consistency, respectively. Educational benefit was assessed by comparing baseline (pretest) and final (post-test) trainee performance; comparisons used the Wilcoxon signed-rank test. Of the 55 trainees who pretested, 53 (96%) completed all curricular components in 9-17 h and reached proficiency after completing an average of 72 ± 28 repetitions over 5 ± 1 h. Trainees indicated minimal prior robotic experience and "poor comfort" with robotic skills at baseline (1.8 ± 0.9) compared with final testing (3.1 ± 0.8, p < 0.001). IRR data for the composite score revealed an ICC of 0.96 (p < 0.001). Test-retest reliability was 0.91 (p < 0.001) and internal consistency was 0.81. Performance improved significantly after training for all nine tasks and according to composite scores (548 ± 176 vs. 914 ± 81, p < 0.001), demonstrating educational benefit. This curriculum is associated with high reliability measures, demonstrated feasibility for a large cohort of trainees, and yielded significant educational

  14. Continuous improvement of medical test reliability using reference methods and matrix-corrected target values in proficiency testing schemes: application to glucose assay.

    PubMed

    Delatour, Vincent; Lalere, Beatrice; Saint-Albin, Karène; Peignaux, Maryline; Hattchouel, Jean-Marc; Dumont, Gilles; De Graeve, Jacques; Vaslin-Reimann, Sophie; Gillery, Philippe

    2012-11-20

    The reliability of biological tests is a major patient-care and public-health issue with high economic stakes. Reference methods, as well as regular external quality assessment schemes (EQAS), are needed to monitor the analytical performance of field methods. However, the commutability of control materials is a major concern when assessing method accuracy. To overcome material non-commutability, we investigated the possibility of using lyophilized serum samples together with a limited number of frozen serum samples to assign matrix-corrected target values, taking the example of glucose assays. The trueness of current glucose assays was first measured against a primary reference method using frozen human sera. Methods using hexokinase and glucose oxidase with spectroreflectometric detection proved very accurate, with bias ranging between -2.2% and +2.3%. The bias of methods using glucose oxidase with spectrophotometric detection was +4.5%. The matrix-related bias of the lyophilized materials was then determined and ranged from +2.5% to -14.4%. Matrix-corrected target values were assigned and used to assess the trueness of 22 sub-peer groups. We demonstrated that matrix-corrected target values can be a valuable tool for assessing field method accuracy in large-scale surveys where commutable materials are not available in sufficient amounts at acceptable cost. Copyright © 2012 Elsevier B.V. All rights reserved.
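
    One plausible form of the correction is sketched below; the direction and magnitude of the adjustment, and the glucose values used, are illustrative assumptions rather than the scheme's actual assignment procedure.

```python
# Hedged sketch: removing an estimated matrix-related bias from a lyophilized
# EQA sample's value to assign a matrix-corrected target.
def matrix_corrected_target(lyo_value, matrix_bias_pct):
    """Undo the matrix-related bias (percent) estimated against frozen sera."""
    return lyo_value / (1 + matrix_bias_pct / 100)

field_on_lyo = 5.60    # mmol/L glucose measured on the lyophilized material
matrix_bias = -14.4    # % matrix-related bias (the worst case cited above)

print(f"matrix-corrected target = "
      f"{matrix_corrected_target(field_on_lyo, matrix_bias):.2f} mmol/L")
```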

  15. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection

    PubMed Central

    Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun

    2016-01-01

    Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate due to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated

  16. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection.

    PubMed

    Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun

    2016-07-19

    Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate due to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated
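
    Decision-level fusion with boosting-driven feature selection, in the spirit of (but not identical to) the method above, can be sketched with scikit-learn. The feature matrices here are random placeholders standing in for SAR and IR descriptors of registered candidate regions; with depth-1 base learners, each boosting round commits to a single feature, so training doubles as cross-sensor feature selection:

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.tree import DecisionTreeClassifier

        # Placeholder features for registered candidate regions; y = 1 marks a
        # true target. Real descriptors would come from the SAR and IR chains.
        rng = np.random.default_rng(0)
        X = np.hstack([rng.normal(size=(500, 8)),    # SAR-derived features (stand-in)
                       rng.normal(size=(500, 6))])   # IR-derived features (stand-in)
        y = rng.integers(0, 2, size=500)

        # ('estimator=' requires scikit-learn >= 1.2; older releases use 'base_estimator='.)
        fusion = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),
                                    n_estimators=50)
        fusion.fit(X, y)
        print(fusion.feature_importances_)           # relative weight of SAR vs. IR features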

  17. Reproducibility, Reliability, and Validity of Fuchsin-Based Beads for the Evaluation of Masticatory Performance.

    PubMed

    Sánchez-Ayala, Alfonso; Farias-Neto, Arcelino; Vilanova, Larissa Soares Reis; Costa, Marina Abrantes; Paiva, Ana Clara Soares; Carreiro, Adriana da Fonte Porto; Mestriner-Junior, Wilson

    2016-08-01

    Rehabilitation of masticatory function is inherent to prosthodontics; however, despite the various techniques for evaluating oral comminution, the methodological suitability of these has not been completely studied. The aim of this study was to determine the reproducibility, reliability, and validity of a test food based on fuchsin beads for masticatory function assessment. Masticatory performance was evaluated in 20 dentate subjects (mean age, 23.3 years) using two kinds of test foods and methods: fuchsin beads and ultraviolet-visible spectrophotometry, and silicone cubes and multiple sieving as gold standard. Three examiners conducted five masticatory performance trials with each test food. Reproducibility of the results from both test foods was separately assessed using the intraclass correlation coefficient (ICC). Reliability and validity of fuchsin bead data were measured by comparing the average mean of absolute differences and the measurement means, respectively, regarding silicone cube data using the paired Student's t-test (α = 0.05). Intraexaminer and interexaminer ICC for the fuchsin bead values were 0.65 and 0.76 (p < 0.001), respectively; those for the silicone cube values were 0.93 and 0.91 (p < 0.001), respectively. Reliability revealed intraexaminer (p < 0.001) and interexaminer (p < 0.05) differences between the average means of absolute differences of each test food. Validity also showed differences between the measurement means of each test food (p < 0.001). Intra- and interexaminer reproducibility of the test food based on fuchsin beads for evaluation of masticatory performance were good and excellent, respectively; however, the reliability and validity were low, because fuchsin beads do not measure the grinding capacity of masticatory function as silicone cubes do; instead, this test food describes the crushing potential of teeth. Thus, the two kinds of test foods evaluate different properties of masticatory capacity, confirming fuchsin

  18. Multitarget mixture reduction algorithm with incorporated target existence recursions

    NASA Astrophysics Data System (ADS)

    Ristic, Branko; Arulampalam, Sanjeev

    2000-07-01

    The paper derives a deferred logic data association algorithm based on the mixture reduction approach originally due to Salmond [SPIE vol.1305, 1990]. The novelty of the proposed algorithm is that it provides recursive formulae for both data association and target existence (confidence) estimation, thus allowing automatic track initiation and termination. The track initiation performance of the proposed filter is investigated by computer simulations. It is observed that at moderately high levels of clutter density the proposed filter initiates tracks more reliably than its corresponding PDA filter. An extension of the proposed filter to the multi-target case is also presented. In addition, the paper compares the track maintenance performance of the MR algorithm with an MHT implementation.
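
    An existence-probability recursion of the kind the paper derives can be sketched as an IPDA-style prediction/update pair. This is a generic illustration under simplified assumptions (single target, scalar likelihood ratio), not Salmond's or the authors' exact formulae:

        def predict_existence(p_exist, p_survive=0.98):
            """Markov prediction: the target persists with probability p_survive."""
            return p_survive * p_exist

        def update_existence(p_pred, p_detect, likelihood_ratio):
            """Bayes update of existence given this scan's detection evidence.
            likelihood_ratio compares the gated measurements under 'target exists'
            vs. 'clutter only'; pass 0.0 when nothing falls inside the gate."""
            delta = p_detect * (1.0 - likelihood_ratio)
            return (1.0 - delta) * p_pred / (1.0 - delta * p_pred)

        # Confirm or terminate a track by thresholding the existence probability:
        p = 0.5
        for lr in (2.0, 0.0, 3.5, 4.0):    # per-scan likelihood ratios (illustrative)
            p = update_existence(predict_existence(p), p_detect=0.9, likelihood_ratio=lr)
            print(round(p, 3))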

  19. System-level multi-target drug discovery from natural products with applications to cardiovascular diseases.

    PubMed

    Zheng, Chunli; Wang, Jinan; Liu, Jianling; Pei, Mengjie; Huang, Chao; Wang, Yonghua

    2014-08-01

    The term systems pharmacology describes a field of study that uses computational and experimental approaches to broaden the view of drug actions rooted in molecular interactions and advance the process of drug discovery. The aim of this work is to highlight the role that systems pharmacology plays in multi-target drug discovery from natural products for cardiovascular diseases (CVDs). Firstly, based on network pharmacology methods, we reconstructed the drug-target and target-target networks to determine the putative protein target set of multi-target drugs for CVDs treatment. Secondly, we reintegrated a compound dataset of natural products and then obtained a multi-target compounds subset by virtual-screening process. Thirdly, a drug-likeness evaluation was applied to find the ADME-favorable compounds in this subset. Finally, we conducted in vitro experiments to evaluate the reliability of the selected chemicals and targets. We found that four of the five randomly selected natural molecules can effectively act on the target set for CVDs, indicating the soundness of our systems-based method. This strategy may serve as a new model for multi-target drug discovery of complex diseases.

  20. Prediction of Drug-Target Interactions and Drug Repositioning via Network-Based Inference

    PubMed Central

    Jiang, Jing; Lu, Weiqiang; Li, Weihua; Liu, Guixia; Zhou, Weixing; Huang, Jin; Tang, Yun

    2012-01-01

    Drug-target interaction (DTI) is the basis of drug discovery and design. It is time-consuming and costly to determine DTI experimentally. Hence, it is necessary to develop computational methods for the prediction of potential DTI. Based on complex network theory, three supervised inference methods were developed here to predict DTI and used for drug repositioning, namely drug-based similarity inference (DBSI), target-based similarity inference (TBSI) and network-based inference (NBI). Among them, NBI performed best on four benchmark data sets. Then a drug-target network was created with NBI based on 12,483 FDA-approved and experimental drug-target binary links, and some new DTIs were further predicted. In vitro assays confirmed that five old drugs, namely montelukast, diclofenac, simvastatin, ketoconazole, and itraconazole, showed polypharmacological features on estrogen receptors or dipeptidyl peptidase-IV with half maximal inhibitory or effective concentrations ranging from 0.2 to 10 µM. Moreover, simvastatin and ketoconazole showed potent antiproliferative activities on human MDA-MB-231 breast cancer cell line in MTT assays. The results indicated that these methods could be powerful tools in prediction of DTIs and drug repositioning. PMID:22589709
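
    The core of network-based inference is a two-step resource-spreading pass over the bipartite drug-target graph. A minimal sketch under one common formulation (toy adjacency matrix; not the authors' code):

        import numpy as np

        # A[i, j] = 1 if drug i is known to bind target j (toy adjacency matrix).
        A = np.array([[1, 1, 0, 0],
                      [0, 1, 1, 0],
                      [0, 0, 1, 1]], dtype=float)

        k_drug   = A.sum(axis=1, keepdims=True)    # drug degrees
        k_target = A.sum(axis=0, keepdims=True)    # target degrees

        # Two-step mass diffusion: each drug spreads unit resource to its known
        # targets, and targets spread back to drugs; W[i, l] is the share drug i
        # receives from drug l's neighborhood.
        W = (A / k_target) @ (A / k_drug).T        # drugs x drugs transfer matrix
        scores = W @ A                             # predicted drug-target scores

        new_links = (scores > 0) & (A == 0)        # candidate repositioning pairs
        print(np.round(scores, 2), new_links, sep="\n")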

  1. Optimization Based Efficiencies in First Order Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Peck, Jeffrey A.; Mahadevan, Sankaran

    2003-01-01

    This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank one updating technique. In problems that use commercial code as a black box, the gradient calculations are usually done using a finite difference approach, which becomes very expensive for large system models. The proposed method replaces the finite difference gradient calculations in a standard first order reliability method (FORM) with Broyden's Quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results compared to standard FORM results. It is found that BFORM typically requires fewer function evaluations than FORM to converge to the same answer.
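
    Broyden's rank-one update, specialized to the gradient of a scalar limit-state function, enforces the secant condition while perturbing the previous estimate as little as possible. A hedged sketch (the toy function and step are illustrative, not from the paper):

        import numpy as np

        def broyden_gradient_update(g, dx, df):
            """Rank-one (secant) update of an approximate gradient vector.

            g  : current gradient estimate of the limit state function
            dx : step in the design variables, x_new - x_old
            df : observed change in function value, f(x_new) - f(x_old)
            """
            g = np.asarray(g, dtype=float)
            dx = np.asarray(dx, dtype=float)
            return g + ((df - g @ dx) / (dx @ dx)) * dx

        # Toy check on f(x) = x0**2 + 3*x1 around x = (1, 2):
        g = np.array([2.0, 3.0])                   # exact gradient at x
        dx = np.array([0.1, -0.05])
        df = (1.1**2 + 3 * 1.95) - (1.0**2 + 3 * 2.0)
        print(broyden_gradient_update(g, dx, df))  # close to [2.2, 3], the gradient at x + dx

    Each FORM iteration thus costs one function evaluation instead of one finite-difference stencil, which is where the reported savings come from.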

  2. Reliability-based optimization of maintenance scheduling of mechanical components under fatigue

    PubMed Central

    Beaurepaire, P.; Valdebenito, M.A.; Schuëller, G.I.; Jensen, H.A.

    2012-01-01

    This study presents the optimization of the maintenance scheduling of mechanical components under fatigue loading. The cracks of damaged structures may be detected during non-destructive inspection and subsequently repaired. Fatigue crack initiation and growth show inherent variability, as does the outcome of inspection activities. The problem is addressed under the framework of reliability-based optimization. The initiation and propagation of fatigue cracks are efficiently modeled using cohesive zone elements. The applicability of the method is demonstrated by a numerical example, which involves a plate with two holes subject to alternating stress. PMID:23564979

  3. Effect of Clinically Discriminating, Evidence-Based Checklist Items on the Reliability of Scores from an Internal Medicine Residency OSCE

    ERIC Educational Resources Information Center

    Daniels, Vijay J.; Bordage, Georges; Gierl, Mark J.; Yudkowsky, Rachel

    2014-01-01

    Objective structured clinical examinations (OSCEs) are used worldwide for summative examinations but often lack acceptable reliability. Research has shown that reliability of scores increases if OSCE checklists for medical students include only clinically relevant items. Also, checklists are often missing evidence-based items that high-achieving…

  4. Test-retest reliability of evoked BOLD signals from a cognitive-emotive fMRI test battery.

    PubMed

    Plichta, Michael M; Schwarz, Adam J; Grimm, Oliver; Morgen, Katrin; Mier, Daniela; Haddad, Leila; Gerdes, Antje B M; Sauer, Carina; Tost, Heike; Esslinger, Christine; Colman, Peter; Wilson, Frederick; Kirsch, Peter; Meyer-Lindenberg, Andreas

    2012-04-15

    Even more than in cognitive research applications, moving fMRI to the clinic and the drug development process requires the generation of stable and reliable signal changes. The performance characteristics of the fMRI paradigm constrain experimental power and may require different study designs (e.g., crossover vs. parallel groups), yet fMRI reliability characteristics can be strongly dependent on the nature of the fMRI task. The present study investigated both within-subject and group-level reliability of a combined three-task fMRI battery targeting three systems of wide applicability in clinical and cognitive neuroscience: an emotional (face matching), a motivational (monetary reward anticipation) and a cognitive (n-back working memory) task. A group of 25 young, healthy volunteers were scanned twice on a 3T MRI scanner with a mean test-retest interval of 14.6 days. FMRI reliability was quantified using the intraclass correlation coefficient (ICC) applied at three different levels ranging from a global to a localized and fine spatial scale: (1) reliability of group-level activation maps over the whole brain and within targeted regions of interest (ROIs); (2) within-subject reliability of ROI-mean amplitudes and (3) within-subject reliability of individual voxels in the target ROIs. Results showed robust evoked activation of all three tasks in their respective target regions (emotional task=amygdala; motivational task=ventral striatum; cognitive task=right dorsolateral prefrontal cortex and parietal cortices) with high effect sizes (ES) of ROI-mean summary values (ES=1.11-1.44 for the faces task, 0.96-1.43 for the reward task, 0.83-2.58 for the n-back task). Reliability of group level activation was excellent for all three tasks with ICCs of 0.89-0.98 at the whole brain level and 0.66-0.97 within target ROIs. Within-subject reliability of ROI-mean amplitudes across sessions was fair to good for the reward task (ICCs=0.56-0.62) and, dependent on the particular ROI

  5. A Bioinformatic Pipeline for Monitoring of the Mutational Stability of Viral Drug Targets with Deep-Sequencing Technology.

    PubMed

    Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai

    2017-11-23

    The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing is able to provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short, deep sequencing reads corresponding to selected highly variable viral sequences that are drug target(s). The suggested bioinformatic pipeline combines the available programs and the ad hoc scripts based on an original algorithm of the search for the conserved targets in the deep sequencing data. We also present the statistical criteria for the threshold of reliable mutation detection and for the assessment of variations between corresponding data sets. These criteria are robust against the possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available to download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
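
    A threshold criterion for reliable mutation detection can be sketched as a binomial tail test against an error-only null model. This is a generic illustration of the idea, not the pipeline's published statistics; the error rate and significance level are placeholders:

        from scipy.stats import binom

        def min_reliable_count(depth, error_rate=0.001, alpha=1e-6):
            """Smallest mutant-read count k at this depth such that observing
            >= k reads is improbable (P <= alpha) under sequencing error alone."""
            k = int(binom.ppf(1.0 - alpha, depth, error_rate)) + 1
            assert binom.sf(k - 1, depth, error_rate) <= alpha   # P(X >= k) under the null
            return k

        for depth in (1_000, 10_000, 100_000):
            print(depth, min_reliable_count(depth))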

  6. High resolution melting curve analysis targeting the HBB gene mutational hot-spot offers a reliable screening approach for all common as well as most of the rare beta-globin gene mutations in Bangladesh.

    PubMed

    Islam, Md Tarikul; Sarkar, Suprovath Kumar; Sultana, Nusrat; Begum, Mst Noorjahan; Bhuyan, Golam Sarower; Talukder, Shezote; Muraduzzaman, A K M; Alauddin, Md; Islam, Mohammad Sazzadul; Biswas, Pritha Promita; Biswas, Aparna; Qadri, Syeda Kashfi; Shirin, Tahmina; Banu, Bilquis; Sadya, Salma; Hussain, Manzoor; Sarwardi, Golam; Khan, Waqar Ahmed; Mannan, Mohammad Abdul; Shekhar, Hossain Uddin; Chowdhury, Emran Kabir; Sajib, Abu Ashfaqur; Akhteruzzaman, Sharif; Qadri, Syed Saleheen; Qadri, Firdausi; Mannoor, Kaiissar

    2018-01-02

    Bangladesh lies in the global thalassemia belt, which has a defined mutational hot-spot in the beta-globin gene. The high carrier frequencies of beta-thalassemia trait and hemoglobin E-trait in Bangladesh necessitate a reliable DNA-based carrier screening approach that could supplement the use of hematological and electrophoretic indices to overcome the barriers of carrier screening. With this view in mind, the study aimed to establish a high resolution melting (HRM) curve-based rapid and reliable mutation screening method targeting the mutational hot-spot of South Asian and Southeast Asian countries that encompasses exon-1 (c.1 - c.92), intron-1 (c.92 + 1 - c.92 + 130) and a portion of exon-2 (c.93 - c.217) of the HBB gene which harbors more than 95% of mutant alleles responsible for beta-thalassemia in Bangladesh. Our HRM approach could successfully differentiate ten beta-globin gene mutations, namely c.79G > A, c.92 + 5G > C, c.126_129delCTTT, c.27_28insG, c.46delT, c.47G > A, c.92G > C, c.92 + 130G > C, c.126delC and c.135delC in heterozygous states from the wild type alleles, implying the significance of the approach for carrier screening as the first three of these mutations account for ~85% of total mutant alleles in Bangladesh. Moreover, different combinations of compound heterozygous mutations were found to generate melt curves that were distinct from the wild type alleles and from one another. Based on the findings, sixteen reference samples were run in parallel to 41 unknown specimens to perform direct genotyping of the beta-thalassemia specimens using HRM. The HRM-based genotyping of the unknown specimens showed 100% consistency with the sequencing result. Targeting the mutational hot-spot, the HRM approach could be successfully applied for screening of beta-thalassemia carriers in Bangladesh as well as in other countries of South Asia and Southeast Asia. The approach could be a useful supplement to hematological and

  7. Architecture-Based Reliability Analysis of Web Services

    ERIC Educational Resources Information Center

    Rahmani, Cobra Mariam

    2012-01-01

    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  8. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    PubMed Central

    Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-01-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the absorbing-set results are computed from the differential equations and verified. Through forward inference, the reliability value of the control unit is determined under the different repair modes. Finally, weak nodes are noted in the control unit. PMID:29765629

  9. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    NASA Astrophysics Data System (ADS)

    Li, Zhiqiang; Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-04-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the absorbing-set results are computed from the differential equations and verified. Through forward inference, the reliability value of the control unit is determined under the different repair modes. Finally, weak nodes are noted in the control unit.
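
    The Markov part of such a model reduces to a generator matrix with an absorbing failure state; reliability is the probability of not yet being absorbed. A minimal continuous-time sketch with illustrative rates (the paper's DBN adds repair and CBM on top of this):

        import numpy as np
        from scipy.linalg import expm

        # Three-state element: 0 = good, 1 = degraded, 2 = failed (absorbing).
        # Q[i, j] is the transition rate i -> j; each row sums to zero.
        lam01, lam12, lam02 = 0.02, 0.05, 0.005      # illustrative rates (per hour)
        Q = np.array([[-(lam01 + lam02), lam01,  lam02],
                      [0.0,             -lam12,  lam12],
                      [0.0,              0.0,    0.0  ]])

        p0 = np.array([1.0, 0.0, 0.0])               # element starts in the good state
        for t in (10, 100, 500):
            p_t = p0 @ expm(Q * t)                   # state distribution at time t
            print(f"t={t:4d} h  reliability={1.0 - p_t[2]:.4f}")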

  10. How to Characterize the Reliability of Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, David (Donhang)

    2015-01-01

    The reliability of an MLCC device is the product of a time-dependent part and a time-independent part: 1) Time-dependent part is a statistical distribution; 2) Time-independent part is the reliability at t=0, the initial reliability. Initial reliability depends only on how a BME MLCC is designed and processed. Similar to the way the minimum dielectric thickness ensured the long-term reliability of a PME MLCC, the initial reliability also ensures the long-term reliability of a BME MLCC. This presentation shows new discoveries regarding commonalities and differences between PME and BME capacitor technologies.

  11. How to Characterize the Reliability of Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2015-01-01

    The reliability of an MLCC device is the product of a time-dependent part and a time-independent part: 1) Time-dependent part is a statistical distribution; 2) Time-independent part is the reliability at t=0, the initial reliability. Initial reliability depends only on how a BME MLCC is designed and processed. Similar to the way the minimum dielectric thickness ensured the long-term reliability of a PME MLCC, the initial reliability also ensures the long-term reliability of a BME MLCC. This presentation shows new discoveries regarding commonalities and differences between PME and BME capacitor technologies.
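
    The stated decomposition, R(t) = (initial reliability) × (time-dependent part), is easy to express directly. A tiny sketch assuming a Weibull wear-out term, which is a common but here unconfirmed choice; all parameter values are placeholders:

        import math

        def mlcc_reliability(t, r_initial, eta, beta):
            """Reliability as (time-independent) x (time-dependent) parts.

            r_initial : reliability at t = 0, set by design and processing
            eta, beta : Weibull scale/shape for the wear-out part (assumed model)
            """
            return r_initial * math.exp(-(t / eta) ** beta)

        print(mlcc_reliability(t=10_000, r_initial=0.999, eta=1e6, beta=1.2))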

  12. Probabilistic and structural reliability analysis of laminated composite structures based on the IPACS code

    NASA Technical Reports Server (NTRS)

    Sobel, Larry; Buttitta, Claudio; Suarez, James

    1993-01-01

    Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched, 1M6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.

  13. Neighborhood-based tobacco advertising targeting adolescents.

    PubMed Central

    Ammerman, S D; Nolden, M

    1995-01-01

    Adolescent tobacco use remains a serious problem, and adolescents may be particularly receptive to the glamorous images tobacco companies use in advertisements. A relatively new form of neighborhood-based outdoor advertising, the illuminated bus-stop-shelter billboard, was studied to determine tobacco companies' use of this medium. We hypothesized that in 2 distinct San Francisco, California, neighborhoods, 1 predominantly white and the other mostly Latino, we would find a predominance of tobacco advertising on these billboards in both neighborhoods, that tobacco advertisements would be more prevalent in the minority Latino neighborhood, and that tobacco advertising would target adolescents in both neighborhoods. Each bus-stop-shelter billboard advertisement in the study areas from April 1992 to March 1993 was recorded. The type and frequency of products advertised and qualitative content of tobacco advertisements were analyzed. Adolescents' possible exposure to these advertisements was noted. Our main outcome measures were the percentage of tobacco advertising, possible adolescent exposure to this advertising, and themes of the tobacco advertisements. About 10% of all bus-stop-shelter billboard advertisements in each area promoted tobacco use. Possible exposures to these advertisements were greater in the Latino neighborhood because of a greater adolescent population. Qualitative analyses of tobacco advertisements suggested that adolescents are the primary targets. We urge physicians and educators to explicitly address this form of tobacco advertising, and we urge a ban on neighborhood-based tobacco advertising. PMID:7618311

  14. AAVSO Target Tool: A Web-Based Service for Tracking Variable Star Observations (Abstract)

    NASA Astrophysics Data System (ADS)

    Burger, D.; Stassun, K. G.; Barnes, C.; Kafka, S.; Beck, S.; Li, K.

    2018-06-01

    (Abstract only) The AAVSO Target Tool is a web-based interface for bringing stars in need of observation to the attention of AAVSO's network of amateur and professional astronomers. The site currently tracks over 700 targets of interest, collecting data from them on a regular basis from AAVSO's servers and sorting them based on priority. While the target tool does not require a login, users can obtain visibility times for each target by signing up and entering a telescope location. Other key features of the site include filtering by AAVSO observing section, sorting by different variable types, formatting the data for printing, and exporting the data to a CSV file. The AAVSO Target Tool builds upon seven years of experience developing web applications for astronomical data analysis, most notably on Filtergraph (Burger, D., et al. 2013, Astronomical Data Analysis Software and Systems XXII, Astronomical Society of the Pacific, San Francisco, 399), and is built using the web2py web framework based on the python programming language. The target tool is available at http://filtergraph.com/aavso.

  15. Infrared Ship Target Segmentation Based on Spatial Information Improved FCM.

    PubMed

    Bai, Xiangzhi; Chen, Zhiguo; Zhang, Yu; Liu, Zhaoying; Lu, Yi

    2016-12-01

    Segmentation of infrared (IR) ship images is always a challenging task, because of the intensity inhomogeneity and noise. The fuzzy C-means (FCM) clustering is a classical method widely used in image segmentation. However, it has some shortcomings, like not considering the spatial information or being sensitive to noise. In this paper, an improved FCM method based on the spatial information is proposed for IR ship target segmentation. The improvements include two parts: 1) adding the nonlocal spatial information based on the ship target and 2) using the spatial shape information of the contour of the ship target to refine the local spatial constraint by Markov random field. In addition, the results of K-means are used to initialize the improved FCM method. Experimental results show that the improved method is effective and performs better than the existing methods, including the existing FCM methods, for segmentation of the IR ship images.
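
    The baseline the improved method builds on is plain fuzzy C-means. A minimal sketch on a 1-D intensity vector, with the spatial and nonlocal terms omitted; the data are synthetic and the parameter names conventional, not the paper's:

        import numpy as np

        def fcm(pixels, n_clusters=2, m=2.0, n_iter=50, seed=0):
            """Plain fuzzy C-means on a 1-D intensity vector (no spatial term).
            Returns (memberships, centroids)."""
            rng = np.random.default_rng(seed)
            u = rng.dirichlet(np.ones(n_clusters), size=pixels.size)   # memberships
            for _ in range(n_iter):
                w = u ** m
                c = (w.T @ pixels) / w.sum(axis=0)                     # centroid update
                d = np.abs(pixels[:, None] - c[None, :]) + 1e-12       # distances
                u = 1.0 / d ** (2.0 / (m - 1.0))                       # membership update
                u /= u.sum(axis=1, keepdims=True)                      # normalize per pixel
            return u, c

        # Toy IR image: dim background plus a small bright "target" population.
        img = np.concatenate([np.random.default_rng(1).normal(0.2, 0.05, 900),
                              np.random.default_rng(2).normal(0.8, 0.05, 100)])
        u, c = fcm(img)
        print(np.round(c, 2))          # centroids near 0.2 (background) and 0.8 (target)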

  16. Test-retest reliability and responsiveness of the Barthel Index-based Supplementary Scales in patients with stroke.

    PubMed

    Lee, Ya-Chen; Yu, Wan-Hui; Hsueh, I-Ping; Chen, Sheng-Shiung; Hsieh, Ching-Lin

    2017-10-01

    A lack of evidence on the test-retest reliability and responsiveness limits the utility of the BI-based Supplementary Scales (BI-SS) in both clinical and research settings. To examine the test-retest reliability and responsiveness of the BI-based Supplementary Scales (BI-SS) in patients with stroke. A repeated-assessments design (1 week apart) was used to examine the test-retest reliability of the BI-SS. For the responsiveness study, the participants were assessed with the BI-SS and BI (treated as an external criterion) at admission to and discharge from rehabilitation wards. Seven outpatient rehabilitation units and one inpatient rehabilitation unit. Outpatients with chronic stroke. Eighty-four outpatients with chronic stroke participated in the test-retest reliability study. Fifty-seven inpatients completed baseline and follow-up assessments in the responsiveness study. For the test-retest reliability study, the values of the intra-class correlation coefficient and the overall percentage of minimal detectable change for the Ability Scale and Self-perceived Difficulty Scale were 0.97, 12.8%, and 0.78, 35.8%, respectively. For the responsiveness study, the standardized effect size and standardized response mean (representing internal responsiveness) of the Ability Scale and Self-perceived Difficulty Scale were 1.17 and 1.56, and 0.78 and 0.89, respectively. Regarding external responsiveness, the change in score of the Ability Scale had significant and moderate association with that of the BI (r=0.61, P<0.001). The change in score of the Self-perceived Difficulty Scale had non-significant and weak association with that of the BI (r=0.23, P=0.080). The Ability Scale of the BI-SS has satisfactory test-retest reliability and sufficient responsiveness for patients with stroke. However, the Self-perceived Difficulty Scale of the BI-SS has substantial random measurement error and insufficient external responsiveness, which may affect its utility in clinical settings. The
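
    Minimal-detectable-change percentages like the 12.8% and 35.8% above are conventionally derived from the ICC and the baseline standard deviation. A sketch with illustrative numbers, not the study's raw data:

        import math

        def sem_and_mdc(sd_baseline, icc, z=1.96):
            """Standard error of measurement and minimal detectable change.

            SEM   = SD * sqrt(1 - ICC)
            MDC95 = z * sqrt(2) * SEM   (difference of two measurements, 95% confidence)
            """
            sem = sd_baseline * math.sqrt(1.0 - icc)
            mdc = z * math.sqrt(2.0) * sem
            return sem, mdc

        sem, mdc = sem_and_mdc(sd_baseline=8.0, icc=0.97)   # illustrative values
        print(f"SEM={sem:.2f} points, MDC95={mdc:.2f} points")
        # Dividing MDC95 by the scale's score range yields an MDC% like those reported.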

  17. Workplace-based assessment of communication skills: A pilot project addressing feasibility, acceptance and reliability

    PubMed Central

    Weyers, Simone; Jemi, Iman; Karger, André; Raski, Bianca; Rotthoff, Thomas; Pentzek, Michael; Mortsiefer, Achim

    2016-01-01

    Background: Imparting communication skills has been given great importance in medical curricula. In addition to standardized assessments, students should communicate with real patients in actual clinical situations during workplace-based assessments and receive structured feedback on their performance. The aim of this project was to pilot a formative testing method for workplace-based assessment. Our investigation centered in particular on whether or not physicians view the method as feasible and how high acceptance is among students. In addition, we assessed the reliability of the method. Method: As part of the project, 16 students held two consultations each with chronically ill patients at the medical practice where they were completing GP training. These consultations were video-recorded. The trained mentoring physician rated the student’s performance and provided feedback immediately following the consultations using the Berlin Global Rating scale (BGR). Two impartial, trained raters also evaluated the videos using BGR. For qualitative and quantitative analysis, information on how physicians and students viewed feasibility and their levels of acceptance was collected in written form in a partially standardized manner. To test for reliability, the test-retest reliability was calculated for both of the overall evaluations given by each rater. The inter-rater reliability was determined for the three evaluations of each individual consultation. Results: The formative assessment method was rated positively by both physicians and students. It is relatively easy to integrate into daily routines. Its significant value lies in the personal, structured and recurring feedback. The two overall scores for each patient consultation given by the two impartial raters correlate moderately. The degree of uniformity among the three raters in respect to the individual consultations is low. Discussion: Within the scope of this pilot project, only a small sample of physicians and

  18. Workplace-based assessment of communication skills: A pilot project addressing feasibility, acceptance and reliability.

    PubMed

    Weyers, Simone; Jemi, Iman; Karger, André; Raski, Bianca; Rotthoff, Thomas; Pentzek, Michael; Mortsiefer, Achim

    2016-01-01

    Background: Imparting communication skills has been given great importance in medical curricula. In addition to standardized assessments, students should communicate with real patients in actual clinical situations during workplace-based assessments and receive structured feedback on their performance. The aim of this project was to pilot a formative testing method for workplace-based assessment. Our investigation centered in particular on whether or not physicians view the method as feasible and how high acceptance is among students. In addition, we assessed the reliability of the method. Method: As part of the project, 16 students held two consultations each with chronically ill patients at the medical practice where they were completing GP training. These consultations were video-recorded. The trained mentoring physician rated the student's performance and provided feedback immediately following the consultations using the Berlin Global Rating scale (BGR). Two impartial, trained raters also evaluated the videos using BGR. For qualitative and quantitative analysis, information on how physicians and students viewed feasibility and their levels of acceptance was collected in written form in a partially standardized manner. To test for reliability, the test-retest reliability was calculated for both of the overall evaluations given by each rater. The inter-rater reliability was determined for the three evaluations of each individual consultation. Results: The formative assessment method was rated positively by both physicians and students. It is relatively easy to integrate into daily routines. Its significant value lies in the personal, structured and recurring feedback. The two overall scores for each patient consultation given by the two impartial raters correlate moderately. The degree of uniformity among the three raters in respect to the individual consultations is low. Discussion: Within the scope of this pilot project, only a small sample of physicians and

  19. Managing Reliability in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of value added to the organization's business objectives.

  20. Evaluating the effect of database inflation in proteogenomic search on sensitive and reliable peptide identification.

    PubMed

    Li, Honglan; Joh, Yoon Sung; Kim, Hyunwoo; Paek, Eunok; Lee, Sang-Won; Hwang, Kyu-Baek

    2016-12-22

    Proteogenomics is a promising approach for various tasks ranging from gene annotation to cancer research. Databases for proteogenomic searches are often constructed by adding peptide sequences inferred from genomic or transcriptomic evidence to reference protein sequences. Such inflation of databases has the potential to identify novel peptides. However, it also raises concerns about sensitive and reliable peptide identification. Spurious peptides included in target databases may result in underestimated false discovery rate (FDR). On the other hand, inflation of decoy databases could decrease the sensitivity of peptide identification due to the increased number of high-scoring random hits. Although several studies have addressed these issues, widely applicable guidelines for sensitive and reliable proteogenomic search have hardly been available. To systematically evaluate the effect of database inflation in proteogenomic searches, we constructed a variety of real and simulated proteogenomic databases for yeast and human tandem mass spectrometry (MS/MS) data, respectively. Against these databases, we tested two popular database search tools with various approaches to search result validation: the target-decoy search strategy (with and without a refined scoring-metric) and a mixture model-based method. The effect of separate filtering of known and novel peptides was also examined. The results from real and simulated proteogenomic searches confirmed that separate filtering increases the sensitivity and reliability in proteogenomic search. However, no one method consistently identified the largest (or the smallest) number of novel peptides from real proteogenomic searches. We propose to use a set of search result validation methods with separate filtering, for sensitive and reliable identification of peptides in proteogenomic search.
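
    Separate filtering is straightforward to express with the standard target-decoy estimator applied independently to the known and novel partitions. A hedged sketch, where each PSM is a (score, is_decoy, is_novel) triple and the FDR estimate is the simple #decoys/#targets ratio:

        def td_fdr_accept(psms, alpha=0.01):
            """Simple target-decoy filter: psms is a list of (score, is_decoy).
            Keeps target PSMs above the lowest score cutoff whose running FDR
            estimate (#decoys / #targets) stays within alpha."""
            n_t = n_d = 0
            cutoff = None
            for score, is_decoy in sorted(psms, key=lambda p: -p[0]):
                n_d += is_decoy
                n_t += not is_decoy
                if n_t and n_d / n_t <= alpha:
                    cutoff = score
            if cutoff is None:
                return []
            return [(s, d) for s, d in psms if not d and s >= cutoff]

        def separate_filtering(psms, alpha=0.01):
            """Filter known and novel peptides against their own decoys, so
            spurious novel sequences cannot hide behind well-scoring known ones."""
            known = [(s, d) for s, d, novel in psms if not novel]
            novel = [(s, d) for s, d, novel in psms if novel]
            return td_fdr_accept(known, alpha), td_fdr_accept(novel, alpha)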

  1. Estimating the Reliability of a Soyuz Spacecraft Mission

    NASA Technical Reports Server (NTRS)

    Lutomski, Michael G.; Farnham, Steven J., II; Grant, Warren C.

    2010-01-01

    Once the US Space Shuttle retires in 2010, the Russian Soyuz Launcher and Soyuz Spacecraft will comprise the only means for crew transportation to and from the International Space Station (ISS). The U.S. Government and NASA have contracted for crew transportation services to the ISS with Russia. The resulting implications for the US space program, including issues such as astronaut safety, must be carefully considered. Are the astronauts and cosmonauts safer on the Soyuz than the Space Shuttle system? Is the Soyuz launch system more robust than the Space Shuttle? Is it safer to continue to fly the 30 year old Shuttle fleet for crew transportation and cargo resupply than the Soyuz? Should we extend the life of the Shuttle Program? How does the development of the Orion/Ares crew transportation system affect these decisions? The Soyuz launcher has been in operation for over 40 years. There have been only two loss of life incidents and two loss of mission incidents. Given that the most recent incident took place in 1983, how do we determine current reliability of the system? Do failures of unmanned Soyuz rockets impact the reliability of the currently operational man-rated launcher? Does the Soyuz exhibit characteristics that demonstrate reliability growth, and how would that be reflected in future estimates of success? NASA's next manned rocket and spacecraft development project is currently underway. Though the project's ultimate goal is to return to the Moon and then to Mars, the launch vehicle and spacecraft's first mission will be for crew transportation to and from the ISS. The reliability targets are currently several times higher than the Shuttle and possibly even the Soyuz. Can these targets be compared to the reliability of the Soyuz to determine whether they are realistic and achievable? To help answer these questions, this paper will explore how to estimate the reliability of the Soyuz Launcher/Spacecraft system, compare it to the Space Shuttle, and its
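
    One standard way to frame such an estimate is a Beta-binomial model over the launch record. The counts below are placeholders, not an authoritative Soyuz tally, and this simple model ignores the reliability-growth question the paper raises:

        from scipy.stats import beta

        # Illustrative record: n launches with k failures (placeholder numbers).
        n, k = 1000, 20
        a, b = 1 + (n - k), 1 + k          # posterior with a uniform Beta(1,1) prior

        posterior = beta(a, b)
        print(f"posterior mean reliability: {posterior.mean():.4f}")
        lo, hi = posterior.ppf([0.025, 0.975])
        print(f"95% credible interval: [{lo:.4f}, {hi:.4f}]")

    A growth-aware estimate would weight recent launches more heavily or model the failure rate as time-varying, which is precisely the modeling choice the abstract flags.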

  2. Literature-based discovery of diabetes- and ROS-related targets

    PubMed Central

    2010-01-01

    Background Reactive oxygen species (ROS) are known mediators of cellular damage in multiple diseases including diabetic complications. Despite its importance, no comprehensive database is currently available for the genes associated with ROS. Methods We present ROS- and diabetes-related targets (genes/proteins) collected from the biomedical literature through a text mining technology. A web-based literature mining tool, SciMiner, was applied to 1,154 biomedical papers indexed with diabetes and ROS by PubMed to identify relevant targets. Over-represented targets in the ROS-diabetes literature were obtained through comparisons against randomly selected literature. The expression levels of nine genes, selected from the top ranked ROS-diabetes set, were measured in the dorsal root ganglia (DRG) of diabetic and non-diabetic DBA/2J mice in order to evaluate the biological relevance of literature-derived targets in the pathogenesis of diabetic neuropathy. Results SciMiner identified 1,026 ROS- and diabetes-related targets from the 1,154 biomedical papers (http://jdrf.neurology.med.umich.edu/ROSDiabetes/). Fifty-three targets were significantly over-represented in the ROS-diabetes literature compared to randomly selected literature. These over-represented targets included well-known members of the oxidative stress response including catalase, the NADPH oxidase family, and the superoxide dismutase family of proteins. Eight of the nine selected genes exhibited significant differential expression between diabetic and non-diabetic mice. For six genes, the direction of expression change in diabetes paralleled enhanced oxidative stress in the DRG. Conclusions Literature mining compiled ROS-diabetes related targets from the biomedical literature and led us to evaluate the biological relevance of selected targets in the pathogenesis of diabetic neuropathy. PMID:20979611

  3. Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates

    ERIC Educational Resources Information Center

    Raju, Nambury S.; Oshima, T.C.

    2005-01-01

    Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
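
    The classical Spearman-Brown case, to which one of the proposed IRT-based formulas is said to reduce, is a one-liner:

        def spearman_brown(rho, k):
            """Predicted reliability when a test is lengthened k-fold
            (k = 2 doubles the test; k = 0.5 halves it)."""
            return k * rho / (1.0 + (k - 1.0) * rho)

        print(spearman_brown(0.70, 2))   # doubling a rho = 0.70 test -> ~0.82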

  4. Target discrimination method for SAR images based on semisupervised co-training

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Du, Lan; Dai, Hui

    2018-01-01

    Synthetic aperture radar (SAR) target discrimination is usually performed in a supervised manner. However, supervised methods for SAR target discrimination may need lots of labeled training samples, whose acquirement is costly, time consuming, and sometimes impossible. This paper proposes an SAR target discrimination method based on semisupervised co-training, which utilizes a limited number of labeled samples and an abundant number of unlabeled samples. First, Lincoln features, widely used in SAR target discrimination, are extracted from the training samples and partitioned into two sets according to their physical meanings. Second, two support vector machine classifiers are iteratively co-trained with the extracted two feature sets based on the co-training algorithm. Finally, the trained classifiers are exploited to classify the test data. The experimental results on real SAR images data not only validate the effectiveness of the proposed method compared with the traditional supervised methods, but also demonstrate the superiority of co-training over self-training, which only uses one feature set.
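
    The co-training loop itself is compact. A minimal sketch with two SVM views sharing one label pool (scikit-learn); the initial labels must cover both classes, and all parameters are illustrative rather than the paper's settings:

        import numpy as np
        from sklearn.svm import SVC

        def co_train(X1, X2, y, n_rounds=10, n_add=5):
            """Minimal co-training sketch: X1/X2 are two feature views of the
            same samples; y holds {0, 1} labels and -1 for unlabeled samples."""
            y = y.copy()
            for _ in range(n_rounds):
                for X in (X1, X2):                       # alternate the two views
                    labeled = np.flatnonzero(y != -1)
                    unlabeled = np.flatnonzero(y == -1)
                    if unlabeled.size == 0:
                        return y
                    clf = SVC(probability=True).fit(X[labeled], y[labeled])
                    conf = clf.predict_proba(X[unlabeled]).max(axis=1)
                    # promote the most confident pseudo-labels to the shared pool
                    top = unlabeled[np.argsort(-conf)[:n_add]]
                    y[top] = clf.predict(X[top])
            return y

    Because each view's classifier labels samples the other view will also train on, errors in one feature set can be corrected by the other, which is the advantage over single-view self-training noted above.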

  5. Body surface assessment with 3D laser-based anthropometry: reliability, validation, and improvement of empirical surface formulae.

    PubMed

    Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Scholz, Markus

    2017-02-01

    Body surface area is a physiological quantity relevant for many medical applications. In clinical practice, it is determined by empirical formulae. 3D laser-based anthropometry provides an easy and effective way to measure body surface area but is not ubiquitously available. We used data from laser-based anthropometry from a population-based study to assess the validity of published and commonly used empirical formulae. We performed a large population-based study on adults collecting classical anthropometric measurements and 3D body surface assessments (N = 1435). We determined the reliability of the 3D body surface assessment and the validity of 18 different empirical formulae proposed in the literature. The performance of these formulae is studied in subsets of sex and BMI. Finally, improvements of parameter settings of formulae and adjustments for sex and BMI were considered. 3D body surface measurements show excellent intra- and inter-rater reliability of 0.998 (the overall concordance correlation coefficient, OCCC, was used as the measure of agreement). Empirical formulae of Fujimoto and Watanabe, Shuter and Aslani, and Sendroy and Cecchini performed best, with excellent concordance of OCCC > 0.949 even in subgroups of sex and BMI. Re-parametrization of formulae and adjustment for sex and BMI slightly improved results. In adults, 3D laser-based body surface assessment is a reliable alternative to estimation by empirical formulae. However, there are empirical formulae showing excellent results even in subgroups of sex and BMI, with only little room for improvement.
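
    The empirical formulae in question are power laws in weight and height. The oldest of the family, Du Bois and Du Bois (1916), illustrates the form; the re-parametrization described above amounts to refitting the three constants against the 3D-scan data:

        def bsa_du_bois(weight_kg, height_cm):
            """Du Bois & Du Bois (1916): body surface area in m^2."""
            return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

        print(f"{bsa_du_bois(70, 175):.3f} m^2")   # ~1.85 m^2 for a 70 kg, 175 cm adult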

  6. Liquid Hydrogen Target Experience at SLAC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisend, J.G.; Boyce, R.; Candia, A.

    2005-08-29

    Liquid hydrogen targets have played a vital role in the physics program at SLAC for the past 40 years. These targets have ranged from small ''beer can'' targets to the 1.5 m long E158 target that was capable of absorbing up to 800 W without any significant density changes. Successful use of these targets has required the development of thin wall designs, liquid hydrogen pumps, remote positioning and alignment systems, safety systems, control and data acquisition systems, cryogenic cooling circuits and heat exchangers. Detailed operating procedures have been created to ensure safety and operational reliability. This paper surveys the evolution of liquid hydrogen targets at SLAC and discusses advances in several of the enabling technologies that made these targets possible.

  7. Atlas-based identification of targets for functional radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stancanello, Joseph; Romanelli, Pantaleo; Modugno, Nicola

    2006-06-15

    Functional disorders of the brain, such as Parkinson's disease, dystonia, epilepsy, and neuropathic pain, may exhibit poor response to medical therapy. In such cases, surgical intervention may become necessary. Modern surgical approaches to such disorders include radio-frequency lesioning and deep brain stimulation (DBS). The subthalamic nucleus (STN) is one of the most useful stereotactic targets available: STN DBS is known to induce substantial improvement in patients with end-stage Parkinson's disease. Other targets include the Globus Pallidus pars interna (GPi) for dystonia and Parkinson's disease, and the centromedian nucleus of the thalamus (CMN) for neuropathic pain. Radiosurgery is an attractive noninvasive alternative to treat some functional brain disorders. The main technical limitation to radiosurgery is that the target can be selected only on the basis of magnetic resonance anatomy without electrophysiological confirmation. The aim of this work is to provide a method for the correct atlas-based identification of the target to be used in functional neurosurgery treatment planning. The coordinates of STN, CMN, and GPi were identified in the Talairach and Tournoux atlas and transformed to the corresponding regions of the Montreal Neurological Institute (MNI) electronic atlas. Binary masks describing the target nuclei were created. The MNI electronic atlas was deformed onto the patient magnetic resonance imaging-T1 scan by applying an affine transformation followed by a local nonrigid registration. The first transformation was based on normalized cross correlation and the second on optimization of a two-part objective function consisting of similarity criteria and weighted regularization. The obtained deformation field was then applied to the target masks. The minimum distance between the surface of an implanted electrode and the surface of the deformed mask was calculated. The validation of the method consisted of comparing the electrode

  8. Validity and reliability of smartphone magnetometer-based goniometer evaluation of shoulder abduction--A pilot study.

    PubMed

    Johnson, Linda B; Sumner, Sean; Duong, Tina; Yan, Posu; Bajcsy, Ruzena; Abresch, R Ted; de Bie, Evan; Han, Jay J

    2015-12-01

    Goniometers are commonly used by physical therapists to measure range-of-motion (ROM) in the musculoskeletal system. These measurements are used to assist in diagnosis and to help monitor treatment efficacy. With newly emerging technologies, smartphone-based applications are being explored for measuring joint angles and movement. This pilot study investigates the intra- and inter-rater reliability as well as concurrent validity of a newly-developed smartphone magnetometer-based goniometer (MG) application for measuring passive shoulder abduction in both sitting and supine positions, and compares it against the traditional universal goniometer (UG). This is a comparative study with a repeated-measurement design. Three physical therapists utilized both the smartphone MG and a traditional UG to measure various angles of passive shoulder abduction in a healthy subject, whose shoulder was positioned in eight different positions with pre-determined degrees of abduction while seated or supine. Each therapist was blinded to the measured angles. Concordance correlation coefficients (CCCs), Bland-Altman plotting methods, and Analysis of Variance (ANOVA) were used for statistical analyses. Both the traditional UG and the smartphone MG were reliable in repeated measures of standardized joint angle positions (average CCC > 0.997), with similar variability in both measurement tools (standard deviation (SD) ± 4°). Agreement between the UG and MG measurements was greater than 0.99 in all positions. Our results show that the smartphone MG has equivalent reliability compared to the traditional UG when measuring passive shoulder abduction ROM. With concordant measures and comparable reliability to the UG, the newly developed MG application shows potential as a useful tool to assess joint angles.
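
    The agreement statistic quoted above is Lin's concordance correlation coefficient; one common sample version, with illustrative paired readings:

        import numpy as np

        def concordance_ccc(x, y):
            """Lin's concordance correlation coefficient for paired measures."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.cov(x, y, ddof=1)[0, 1]
            return 2.0 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

        ug = [30, 60, 90, 120, 150]    # universal goniometer readings (degrees)
        mg = [31, 58, 92, 119, 152]    # smartphone app readings (illustrative)
        print(f"CCC = {concordance_ccc(ug, mg):.3f}")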

  9. Model-based recognition of 3D articulated target using ladar range data.

    PubMed

    Lv, Dan; Sun, Jian-Feng; Li, Qi; Wang, Qi

    2015-06-10

    Ladar is suitable for 3D target recognition because ladar range images can provide rich 3D geometric surface information of targets. In this paper, we propose a part-based 3D model matching technique to recognize articulated ground military vehicles in ladar range images. The key to this approach is solving the decomposition and pose estimation of articulated parts of targets. The articulated components were decomposed into isolated parts based on 3D geometric properties of targets, such as surface point normals, data histogram distribution, and data distance relationships. The corresponding poses of these separate parts were estimated through the linear characteristics of barrels. According to these pose parameters, all parts of the target were roughly aligned to 3D point cloud models in a library and fine matching was finally performed to accomplish 3D articulated target recognition. The recognition performance was evaluated with 1728 ladar range images of eight different articulated military vehicles with various part types and orientations. Experimental results demonstrated that the proposed approach achieved a high recognition rate.

  10. Identification of human microRNA targets from isolated argonaute protein complexes.

    PubMed

    Beitzinger, Michaela; Peters, Lasse; Zhu, Jia Yun; Kremmer, Elisabeth; Meister, Gunter

    2007-06-01

    MicroRNAs (miRNAs) constitute a class of small non-coding RNAs that regulate gene expression on the level of translation and/or mRNA stability. Mammalian miRNAs associate with members of the Argonaute (Ago) protein family and bind to partially complementary sequences in the 3' untranslated region (UTR) of specific target mRNAs. Computer algorithms based on factors such as free binding energy or sequence conservation have been used to predict miRNA target mRNAs. Based on such predictions, up to one third of all mammalian mRNAs seem to be under miRNA regulation. However, due to the low degree of complementarity between the miRNA and its target, such computer programs are often imprecise and therefore not very reliable. Here we report the first biochemical approach for identifying miRNA targets from human cells. Using highly specific monoclonal antibodies against members of the Ago protein family, we co-immunoprecipitate Ago-bound mRNAs and identify them by cloning. Interestingly, most of the identified targets are also predicted by different computer programs. Moreover, we randomly analyzed six different target candidates and were able to experimentally validate five as miRNA targets. Our data clearly indicate that miRNA targets can be experimentally identified from Ago complexes and therefore provide a new tool to directly analyze miRNA function.

  11. Background suppression of infrared small target image based on inter-frame registration

    NASA Astrophysics Data System (ADS)

    Ye, Xiubo; Xue, Bindang

    2018-04-01

    We propose a multi-frame background suppression method for remote infrared small target detection. Inter-frame information is necessary when heavy background clutter makes it difficult to distinguish real targets from false alarms. A registration procedure based on point matching in image patches is used to compensate for the local deformation of the background. The target can then be separated by background subtraction. Experiments show our method serves as an effective preliminary step for target detection.
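
    The registration-then-subtraction pipeline can be illustrated in miniature. The paper compensates local background deformation by matching points in image patches; the sketch below (assumptions mine) estimates only a single global shift via FFT phase correlation before differencing:

        import numpy as np

        def phase_correlation_shift(ref, mov):
            """Estimate the integer (dy, dx) translation of `mov` relative to `ref`."""
            F = np.fft.fft2(ref) * np.conj(np.fft.fft2(mov))
            corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            # Map peak indices beyond N/2 to negative shifts
            if dy > ref.shape[0] // 2: dy -= ref.shape[0]
            if dx > ref.shape[1] // 2: dx -= ref.shape[1]
            return dy, dx

        def background_subtract(prev_frame, cur_frame):
            # Sign conventions vary; flip (dy, dx) if the residual grows instead
            dy, dx = phase_correlation_shift(cur_frame, prev_frame)
            aligned_prev = np.roll(prev_frame, (dy, dx), axis=(0, 1))
            return np.abs(cur_frame - aligned_prev)  # residual highlights movers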

  12. Test-retest reliability of the prefrontal response to affective pictures based on functional near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Yuxia; Mao, Mengchai; Zhang, Zong; Zhou, Hui; Zhao, Yang; Duan, Lian; Kreplin, Ute; Xiao, Xiang; Zhu, Chaozhe

    2017-01-01

    Functional near-infrared spectroscopy (fNIRS) is being increasingly applied to affective and social neuroscience research; however, the reliability of this method is still unclear. This study aimed to evaluate the test-retest reliability of the fNIRS-based prefrontal response to emotional stimuli. Twenty-six participants viewed unpleasant and neutral pictures, and were simultaneously scanned by fNIRS in two sessions three weeks apart. The reproducibility of the prefrontal activation map was evaluated at three spatial scales (mapwise, clusterwise, and channelwise) at both the group and individual levels. The influence of the time interval was also explored, and comparisons were made between longer (intersession) and shorter (intrasession) time intervals. The reliabilities of the activation map at the group level for the mapwise (up to 0.88, with the highest value appearing in the intersession assessment) and clusterwise scales (up to 0.91, with the highest appearing in the intrasession assessment) were acceptable, indicating that fNIRS may be a reliable tool for emotion studies, especially for group analyses and at larger spatial scales. However, it should be noted that the individual-level and channelwise fNIRS prefrontal responses were not sufficiently stable. Future studies should investigate which factors influence reliability, as well as the validity of fNIRS used in emotion studies.

  13. Robot-Assisted End-Effector-Based Stair Climbing for Cardiopulmonary Exercise Testing: Feasibility, Reliability, and Repeatability.

    PubMed

    Stoller, Oliver; Schindelholz, Matthias; Hunt, Kenneth J

    2016-01-01

    Neurological impairments can limit the implementation of conventional cardiopulmonary exercise testing (CPET) and cardiovascular training strategies. A promising approach to provoke cardiovascular stress while facilitating task-specific exercise in people with disabilities is feedback-controlled robot-assisted end-effector-based stair climbing (RASC). The aim of this study was to evaluate the feasibility, reliability, and repeatability of augmented RASC-based CPET in able-bodied subjects, with a view towards future research and applications in neurologically impaired populations. Twenty able-bodied subjects performed a familiarisation session and 2 consecutive incremental CPETs using augmented RASC. Outcome measures focussed on standard cardiopulmonary performance parameters and on accuracy of work rate tracking (RMSEP-root mean square error). Criteria for feasibility were cardiopulmonary responsiveness and technical implementation. Relative and absolute test-retest reliability were assessed by intraclass correlation coefficients (ICC), standard error of the measurement (SEM), and minimal detectable change (MDC). Mean differences, limits of agreement, and coefficients of variation (CoV) were estimated to assess repeatability. All criteria for feasibility were achieved. Mean V'O2peak was 106±9% of predicted V'O2max and mean HRpeak was 99±3% of predicted HRmax. 95% of the subjects achieved at least 1 criterion for V'O2max, and the detection of the sub-maximal ventilatory thresholds was successful (ventilatory anaerobic threshold 100%, respiratory compensation point 90% of the subjects). Excellent reliability was found for peak cardiopulmonary outcome measures (ICC ≥ 0.890, SEM ≤ 0.60%, MDC ≤ 1.67%). Repeatability for the primary outcomes was good (CoV ≤ 0.12). RASC-based CPET with feedback-guided exercise intensity demonstrated comparable or higher peak cardiopulmonary performance variables relative to predicted values, achieved the criteria for V'O2max

  14. Robot-Assisted End-Effector-Based Stair Climbing for Cardiopulmonary Exercise Testing: Feasibility, Reliability, and Repeatability

    PubMed Central

    Stoller, Oliver; Schindelholz, Matthias; Hunt, Kenneth J.

    2016-01-01

    Background Neurological impairments can limit the implementation of conventional cardiopulmonary exercise testing (CPET) and cardiovascular training strategies. A promising approach to provoke cardiovascular stress while facilitating task-specific exercise in people with disabilities is feedback-controlled robot-assisted end-effector-based stair climbing (RASC). The aim of this study was to evaluate the feasibility, reliability, and repeatability of augmented RASC-based CPET in able-bodied subjects, with a view towards future research and applications in neurologically impaired populations. Methods Twenty able-bodied subjects performed a familiarisation session and 2 consecutive incremental CPETs using augmented RASC. Outcome measures focussed on standard cardiopulmonary performance parameters and on accuracy of work rate tracking (RMSEP−root mean square error). Criteria for feasibility were cardiopulmonary responsiveness and technical implementation. Relative and absolute test-retest reliability were assessed by intraclass correlation coefficients (ICC), standard error of the measurement (SEM), and minimal detectable change (MDC). Mean differences, limits of agreement, and coefficients of variation (CoV) were estimated to assess repeatability. Results All criteria for feasibility were achieved. Mean V′O2peak was 106±9% of predicted V′O2max and mean HRpeak was 99±3% of predicted HRmax. 95% of the subjects achieved at least 1 criterion for V′O2max, and the detection of the sub-maximal ventilatory thresholds was successful (ventilatory anaerobic threshold 100%, respiratory compensation point 90% of the subjects). Excellent reliability was found for peak cardiopulmonary outcome measures (ICC ≥ 0.890, SEM ≤ 0.60%, MDC ≤ 1.67%). Repeatability for the primary outcomes was good (CoV ≤ 0.12). Conclusions RASC-based CPET with feedback-guided exercise intensity demonstrated comparable or higher peak cardiopulmonary performance variables relative to predicted values.
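
    The reliability statistics quoted in both versions of this record are linked by standard formulas: SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM. A small sketch with invented numbers (not values from the study):

        import math

        def sem(sd, icc):
            """Standard error of measurement from a pooled SD and reliability."""
            return sd * math.sqrt(1.0 - icc)

        def mdc95(sd, icc):
            """Minimal detectable change at 95% confidence."""
            return 1.96 * math.sqrt(2.0) * sem(sd, icc)

        print(sem(2.5, 0.89))    # ~0.83 (same units as the measurement)
        print(mdc95(2.5, 0.89))  # ~2.30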

  15. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.

  16. Fuzzy Neural Network-Based Interacting Multiple Model for Multi-Node Target Tracking Algorithm

    PubMed Central

    Sun, Baoliang; Jiang, Chunlan; Li, Ming

    2016-01-01

    An interacting multiple model for a multi-node target tracking algorithm was proposed based on a fuzzy neural network (FNN) to solve the multi-node target tracking problem of wireless sensor networks (WSNs). The measurement error variance was adaptively adjusted during the multiple-model interacting output stage using the difference between the theoretical and estimated values of the measurement error covariance matrix. The FNN fusion system was established during multi-node fusion to integrate the target state estimates from different nodes and consequently obtain the network's target state estimate. The feasibility of the algorithm was verified on a network of nine detection nodes. Experimental results indicated that the proposed algorithm could track the maneuvering target effectively under sensor failure and unknown system measurement errors. The proposed algorithm exhibited great practicability in the multi-node target tracking of WSNs. PMID:27809271

  17. Targeting helicase-dependent amplification products with an electrochemical genosensor for reliable and sensitive screening of genetically modified organisms.

    PubMed

    Moura-Melo, Suely; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Dos Santos Junior, J Ribeiro; da Silva Fonseca, Rosana A; Lobo-Castañón, Maria Jesús

    2015-08-18

    Cultivation of genetically modified organisms (GMOs) and their use in food and feed is constantly expanding; thus, the question of informing consumers about their presence in food has proven of significant interest. The development of sensitive, rapid, robust, and reliable methods for the detection of GMOs is crucial for proper food labeling. In response, we have experimentally characterized the helicase-dependent isothermal amplification (HDA) and sequence-specific detection of a transgene from the Cauliflower Mosaic Virus 35S Promoter (CaMV35S), inserted into most transgenic plants. HDA is one of the simplest approaches for DNA amplification, emulating the bacterial replication machinery, and resembling PCR but under isothermal conditions. However, it usually suffers from a lack of selectivity, which is due to the accumulation of spurious amplification products. To improve the selectivity of HDA, which makes the detection of amplification products more reliable, we have developed an electrochemical platform targeting the central sequence of HDA copies of the transgene. A binary monolayer architecture is built onto a thin gold film where, upon the formation of perfect nucleic acid duplexes with the amplification products, these are enzyme-labeled and electrochemically transduced. The resulting combined system increases genosensor detectability up to 10^6-fold, allowing Yes/No detection of GMOs with a limit of detection of ∼30 copies of the CaMV35S genomic DNA. A set of general utility rules in the design of genosensors for detection of HDA amplicons, which may assist in the development of point-of-care tests, is also included. The method provides a versatile tool for detecting nucleic acids with extremely low abundance not only for food safety control but also in the diagnostics and environmental control areas.

  18. Reliability and Validity of an Internet-based Questionnaire Measuring Lifetime Physical Activity

    PubMed Central

    De Vera, Mary A.; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek

    2010-01-01

    Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005–2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity. PMID:20876666
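
    The questionnaire comparisons above rest on Spearman rank correlations, which can be reproduced with SciPy; the paired scores below are invented for illustration:

        from scipy import stats

        # Hypothetical paired activity scores from two questionnaires
        l_paq  = [12.0, 30.5, 18.2, 44.1, 25.3, 9.8, 37.6]
        lt_paq = [10.5, 28.0, 21.0, 40.2, 27.9, 8.1, 35.0]

        rho, p = stats.spearmanr(l_paq, lt_paq)
        print(f"Spearman rho = {rho:.2f} (p = {p:.4f})")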

  19. Reliability and validity of an internet-based questionnaire measuring lifetime physical activity.

    PubMed

    De Vera, Mary A; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek

    2010-11-15

    Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005-2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity.

  20. Reliability of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (a multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs that considers the battery level as a key factor. Moreover, this model is based on the routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of power consumption on the reliability of WSNs. PMID:25157553
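
    The reliability-versus-power trade-off can be shown with a toy model (my simplification, not the paper's model): each path is a series system of links discounted by its weakest node's battery level, and redundant paths combine as a parallel system:

        def path_reliability(link_reliabilities, min_battery_level):
            """Series system of links, discounted by the path's weakest battery."""
            r = 1.0
            for link in link_reliabilities:
                r *= link
            return r * min_battery_level

        def multipath_reliability(paths):
            """Parallel combination: delivery fails only if every path fails."""
            fail = 1.0
            for links, battery in paths:
                fail *= 1.0 - path_reliability(links, battery)
            return 1.0 - fail

        # Two redundant paths raise delivery probability -- at an energy cost
        paths = [([0.95, 0.90, 0.92], 0.8), ([0.90, 0.93], 0.6)]
        print(f"delivery probability: {multipath_reliability(paths):.4f}")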

  1. The Reliability and Sources of Error of Using Rubrics-Based Assessment for Student Projects

    ERIC Educational Resources Information Center

    Menéndez-Varela, José-Luis; Gregori-Giralt, Eva

    2018-01-01

    Rubrics are widely used in higher education to assess performance in project-based learning environments. To date, the sources of error that may affect their reliability have not been studied in depth. Using generalisability theory as its starting-point, this article analyses the influence of the assessors and the criteria of the rubrics on the…

  2. Girsanov's transformation based variance reduced Monte Carlo simulation schemes for reliability estimation in nonlinear stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Kanjilal, Oindrila; Manohar, C. S.

    2017-07-01

    The study considers the problem of simulation-based time-variant reliability analysis of nonlinear, randomly excited dynamical systems. Attention is focused on importance sampling strategies based on the application of Girsanov's transformation method. Controls which minimize the distance function, as in the first-order reliability method (FORM), are shown to minimize a bound on the sampling variance of the estimator for the probability of failure. Two schemes based on the application of the calculus of variations for selecting control signals are proposed: the first obtains the control force as the solution of a two-point nonlinear boundary value problem, and the second explores the application of the Volterra series in characterizing the controls. The relative merits of these schemes, vis-à-vis the method based on ideas from the FORM, are discussed. Illustrative examples, involving archetypal single-degree-of-freedom (dof) nonlinear oscillators and a multi-degree-of-freedom nonlinear dynamical system, are presented. The credentials of the proposed procedures are established by comparing the solutions with pertinent results from direct Monte Carlo simulations.
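
    The variance-reduction idea can be shown in miniature. The sketch below is a plain mean-shift importance sampler for a Gaussian tail probability, not the authors' Girsanov control construction, but it exhibits the same mechanism: sample from a shifted density and reweight by the likelihood ratio.

        import numpy as np

        rng = np.random.default_rng(0)
        threshold, n = 4.0, 100_000            # estimate P(X > 4), X ~ N(0, 1)

        # Crude Monte Carlo: almost no samples reach the failure region
        x = rng.standard_normal(n)
        p_crude = np.mean(x > threshold)

        # Importance sampling: shift the sampling mean to the threshold
        y = rng.standard_normal(n) + threshold
        weights = np.exp(-threshold * y + 0.5 * threshold**2)  # N(0,1)/N(4,1)
        p_is = np.mean((y > threshold) * weights)

        print(p_crude, p_is)  # the IS estimate is close to the exact 3.17e-5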

  3. Improvement in Visual Target Tracking for a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Madison, Richard

    2006-01-01

    In an improvement of the visual-target-tracking software used aboard a mobile robot (rover) of the type used to explore the Martian surface, an affine-matching algorithm has been replaced by a combination of a normalized cross-correlation (NCC) algorithm and a template-image-magnification algorithm. Although neither NCC nor template-image magnification is new, the use of both of them to increase the degree of reliability with which features can be matched is new. In operation, a template image of a target is obtained from a previous rover position, and the magnification of the template image is then based on the estimated change in the target distance from the previous rover position to the current rover position. For this purpose, the target distance at the previous rover position is determined by stereoscopy, while the target distance at the current rover position is calculated from an estimate of the current pose of the rover. The template image is then magnified by an amount corresponding to the estimated target distance to obtain a best template image to match against the image acquired at the current rover position.
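
    A bare-bones sketch of NCC matching with template magnification (assumptions mine: nearest-neighbour rescaling and an exhaustive search loop; real flight software would be far more careful):

        import numpy as np

        def ncc(patch, template):
            """Normalized cross-correlation between equally sized arrays."""
            p = patch - patch.mean()
            t = template - template.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum()) + 1e-12
            return float((p * t).sum() / denom)

        def magnify(template, prev_dist, cur_dist):
            """Nearest-neighbour rescale by the ratio of target distances."""
            scale = prev_dist / cur_dist   # a closer target looks larger
            h, w = template.shape
            rows = np.clip((np.arange(int(h * scale)) / scale).astype(int), 0, h - 1)
            cols = np.clip((np.arange(int(w * scale)) / scale).astype(int), 0, w - 1)
            return template[np.ix_(rows, cols)]

        def track(image, template, prev_dist, cur_dist):
            t = magnify(template, prev_dist, cur_dist)
            th, tw = t.shape
            scores = np.array([[ncc(image[r:r + th, c:c + tw], t)
                                for c in range(image.shape[1] - tw + 1)]
                               for r in range(image.shape[0] - th + 1)])
            return np.unravel_index(np.argmax(scores), scores.shape)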

  4. A reliability analysis tool for SpaceWire network

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for onboard satellite networks and a basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power, and fault protection. High reliability is a vital issue for spacecraft; therefore, it is very important to analyze and improve the reliability performance of SpaceWire networks. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. According to the functional division of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task yields a system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the design phase of SpaceWire network systems.
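
    The finding that redundant architectures outperform basic ones follows from elementary series/parallel reliability algebra; a minimal sketch with invented link reliabilities:

        def series(reliabilities):
            """A task path fails if any element along it fails."""
            r = 1.0
            for x in reliabilities:
                r *= x
            return r

        def dual_redundant(r):
            """Two independent replicas: the pair fails only if both fail."""
            return 1.0 - (1.0 - r) ** 2

        basic = series([0.999, 0.995, 0.998])   # node -> router -> node
        print(basic, dual_redundant(basic))     # redundancy raises the figure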

  5. Reliability of Laterality Effects in a Dichotic Listening Task with Words and Syllables

    ERIC Educational Resources Information Center

    Russell, Nancy L.; Voyer, Daniel

    2004-01-01

    Large and reliable laterality effects have been found using a dichotic target detection task in a recent experiment using word stimuli pronounced with an emotional component. The present study tested the hypothesis that the magnitude and reliability of the laterality effects would increase with the removal of the emotional component and variations…

  6. Augmented reality-based electrode guidance system for reliable electroencephalography.

    PubMed

    Song, Chanho; Jeon, Sangseo; Lee, Seongpung; Ha, Ho-Gun; Kim, Jonghyun; Hong, Jaesung

    2018-05-24

    In longitudinal electroencephalography (EEG) studies, repeatable electrode positioning is essential for reliable EEG assessment. Conventional methods use anatomical landmarks as fiducial locations for electrode placement. Since the landmarks are manually identified, the EEG assessment is inevitably unreliable because of individual variations among subjects and examiners. To overcome this unreliability, an augmented reality (AR) visualization-based electrode guidance system was proposed. The proposed electrode guidance system is based on AR visualization to replace manual electrode positioning. After scanning and registration of the facial surface of a subject by an RGB-D camera, the AR of the initial electrode positions, serving as reference positions, is overlaid on the current electrode positions in real time. Thus, it can guide the positioning of subsequently placed electrodes with high repeatability. The experimental results with a phantom show that the repeatability of the electrode positioning was improved compared to that of the conventional 10-20 positioning system. The proposed AR guidance system improves electrode positioning performance with a cost-effective setup that uses only an RGB-D camera. This system can be used as an alternative to the international 10-20 system.

  7. Impact of Device Scaling on Deep Sub-micron Transistor Reliability: A Study of Reliability Trends using SRAM

    NASA Technical Reports Server (NTRS)

    White, Mark; Huang, Bing; Qin, Jin; Gur, Zvi; Talmor, Michael; Chen, Yuan; Heidecker, Jason; Nguyen, Duc; Bernstein, Joseph

    2005-01-01

    As microelectronics are scaled into the deep sub-micron regime, users of advanced-technology CMOS, particularly in high-reliability applications, should reassess how scaling effects impact long-term reliability. An experiment-based reliability study of industrial-grade SRAMs, spanning three different technology nodes, is proposed to substantiate current acceleration models for temperature and voltage life-stress relationships. This reliability study utilizes step-stress techniques to evaluate memory technologies (0.25 µm, 0.15 µm, and 0.13 µm) embedded in many of today's high-reliability space/aerospace applications. Two acceleration modeling approaches are presented to relate experimental FIT calculations to manufacturers' qualification data.
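
    Temperature acceleration in such life-stress studies is conventionally modelled with the Arrhenius relation; a sketch with an illustrative activation energy (0.7 eV is a common assumption, not a figure from the paper):

        import math

        K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

        def arrhenius_af(ea_ev, t_use_c, t_stress_c):
            """Acceleration factor between use and stress temperatures."""
            t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
            return math.exp((ea_ev / K_BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

        # Illustrative: Ea = 0.7 eV, 55 degC use vs. 125 degC stress
        print(f"AF = {arrhenius_af(0.7, 55.0, 125.0):.0f}")  # roughly 78x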

  8. Scene Configuration and Object Reliability Affect the Use of Allocentric Information for Memory-Guided Reaching

    PubMed Central

    Klinghammer, Mathias; Blohm, Gunnar; Fiehler, Katja

    2017-01-01

    Previous research has shown that egocentric and allocentric information is used for coding target locations for memory-guided reaching movements. In particular, task-relevance determines the use of objects as allocentric cues. Here, we investigated the influence of scene configuration and object reliability as a function of task-relevance on allocentric coding for memory-guided reaching. For that purpose, we presented participants with images of a naturalistic breakfast scene with five objects on a table and six objects in the background. Six of these objects served as potential reach-targets (= task-relevant objects). Participants explored the scene, and after a short delay a test scene appeared with one of the task-relevant objects missing, indicating the location of the reach target. After the test scene vanished, participants performed a memory-guided reaching movement toward the target location. Besides removing one object from the test scene, we also shifted the remaining task-relevant and/or task-irrelevant objects left- or rightwards, either coherently in the same direction or incoherently in opposite directions. By varying object coherence, we manipulated the reliability of task-relevant and task-irrelevant objects in the scene. In order to examine the influence of scene configuration (distributed vs. grouped arrangement of task-relevant objects) on allocentric coding, we compared the present data with our previously published data set (Klinghammer et al., 2015). We found that reaching errors systematically deviated in the direction of object shifts, but only when the objects were task-relevant and their reliability was high. However, this effect was substantially reduced when task-relevant objects were distributed across the scene, leading to a larger target-cue distance compared to a grouped configuration. No deviations of reach endpoints were observed in conditions with shifts of only task-irrelevant objects or with low object reliability, irrespective of task-relevance.

  9. Scene Configuration and Object Reliability Affect the Use of Allocentric Information for Memory-Guided Reaching.

    PubMed

    Klinghammer, Mathias; Blohm, Gunnar; Fiehler, Katja

    2017-01-01

    Previous research has shown that egocentric and allocentric information is used for coding target locations for memory-guided reaching movements. In particular, task-relevance determines the use of objects as allocentric cues. Here, we investigated the influence of scene configuration and object reliability as a function of task-relevance on allocentric coding for memory-guided reaching. For that purpose, we presented participants with images of a naturalistic breakfast scene with five objects on a table and six objects in the background. Six of these objects served as potential reach-targets (= task-relevant objects). Participants explored the scene, and after a short delay a test scene appeared with one of the task-relevant objects missing, indicating the location of the reach target. After the test scene vanished, participants performed a memory-guided reaching movement toward the target location. Besides removing one object from the test scene, we also shifted the remaining task-relevant and/or task-irrelevant objects left- or rightwards, either coherently in the same direction or incoherently in opposite directions. By varying object coherence, we manipulated the reliability of task-relevant and task-irrelevant objects in the scene. In order to examine the influence of scene configuration (distributed vs. grouped arrangement of task-relevant objects) on allocentric coding, we compared the present data with our previously published data set (Klinghammer et al., 2015). We found that reaching errors systematically deviated in the direction of object shifts, but only when the objects were task-relevant and their reliability was high. However, this effect was substantially reduced when task-relevant objects were distributed across the scene, leading to a larger target-cue distance compared to a grouped configuration. No deviations of reach endpoints were observed in conditions with shifts of only task-irrelevant objects or with low object reliability, irrespective of task-relevance.

  10. An Evaluation method for C2 Cyber-Physical Systems Reliability Based on Deep Learning

    DTIC Science & Technology

    2014-06-01

    Based on the reliability testing data of the system, the prior distribution of the reliability is obtained, and the posterior reliability estimate follows by Bayes' theorem.

  11. A web-based team-oriented medical error communication assessment tool: development, preliminary reliability, validity, and user ratings.

    PubMed

    Kim, Sara; Brock, Doug; Prouty, Carolyn D; Odegard, Peggy Soule; Shannon, Sarah E; Robins, Lynne; Boggs, Jim G; Clark, Fiona J; Gallagher, Thomas

    2011-01-01

    Multiple-choice exams are not well suited for assessing communication skills. Standardized patient assessments are costly and patient and peer assessments are often biased. Web-based assessment using video content offers the possibility of reliable, valid, and cost-efficient means for measuring complex communication skills, including interprofessional communication. We report development of the Web-based Team-Oriented Medical Error Communication Assessment Tool, which uses videotaped cases for assessing skills in error disclosure and team communication. Steps in development included (a) defining communication behaviors, (b) creating scenarios, (c) developing scripts, (d) filming video with professional actors, and (e) writing assessment questions targeting team communication during planning and error disclosure. Using valid data from 78 participants in the intervention group, coefficient alpha estimates of internal consistency were calculated based on the Likert-scale questions and ranged from α=.79 to α=.89 for each set of 7 Likert-type discussion/planning items and from α=.70 to α=.86 for each set of 8 Likert-type disclosure items. The preliminary test-retest Pearson correlation based on the scores of the intervention group was r=.59 for discussion/planning and r=.25 for error disclosure sections, respectively. Content validity was established through reliance on empirically driven published principles of effective disclosure as well as integration of expert views across all aspects of the development process. In addition, data from 122 medicine and surgical physicians and nurses showed high ratings for video quality (4.3 of 5.0), acting (4.3), and case content (4.5). Web assessment of communication skills appears promising. Physicians and nurses across specialties respond favorably to the tool.
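
    The internal-consistency figures reported above come from coefficient (Cronbach's) alpha, which has a simple closed form; a NumPy sketch with invented Likert responses:

        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array, rows = respondents, columns = scale items."""
            items = np.asarray(items, float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars / total_var)

        # Hypothetical: 5 respondents x 7 Likert-type items
        scores = np.array([[4, 5, 4, 4, 5, 4, 5],
                           [2, 3, 2, 3, 2, 3, 2],
                           [5, 5, 4, 5, 5, 5, 4],
                           [3, 3, 3, 2, 3, 3, 3],
                           [4, 4, 5, 4, 4, 5, 4]])
        print(f"alpha = {cronbach_alpha(scores):.2f}")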

  12. Salient target detection based on pseudo-Wigner-Ville distribution and Rényi entropy.

    PubMed

    Xu, Yuannan; Zhao, Yuan; Jin, Chenfei; Qu, Zengfeng; Liu, Liping; Sun, Xiudong

    2010-02-15

    We present what we believe to be a novel method based on the pseudo-Wigner-Ville distribution (PWVD) and Rényi entropy for salient target detection. Building on a study of the statistical properties of Rényi entropy computed via the PWVD, a residual-entropy-based saliency map of an input image can be obtained. From the saliency map, target detection is completed by simple and convenient threshold segmentation. Experimental results demonstrate that the proposed method can detect targets effectively in complex ground scenes.
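
    The order-alpha Rényi entropy underlying the saliency map is H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha); a small sketch (alpha = 3 is my choice for illustration, not necessarily the paper's):

        import numpy as np

        def renyi_entropy(p, alpha=3.0):
            """Rényi entropy (bits) of order alpha for a distribution p."""
            p = np.asarray(p, float)
            p = p[p > 0] / p.sum()
            if alpha == 1.0:                  # the limit is Shannon entropy
                return float(-(p * np.log2(p)).sum())
            return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

        # A flat patch has high entropy; a compact bright target, low entropy
        flat   = np.ones(64) / 64
        target = np.array([0.9] + [0.1 / 63] * 63)
        print(renyi_entropy(flat), renyi_entropy(target))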

  13. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    PubMed Central

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339

  14. Psychometric instrumentation: reliability and validity of instruments used for clinical practice, evidence-based practice projects and research studies.

    PubMed

    Mayo, Ann M

    2015-01-01

    It is important for CNSs and other APNs to consider the reliability and validity of instruments chosen for clinical practice, evidence-based practice projects, or research studies. Psychometric testing uses specific research methods to evaluate the amount of error associated with any particular instrument. Reliability estimates explain more about how well the instrument is designed, whereas validity estimates explain more about scores that are produced by the instrument. An instrument may be architecturally sound overall (reliable), but the same instrument may not be valid. For example, if a specific group does not understand certain well-constructed items, then the instrument does not produce valid scores when used with that group. Many instrument developers may conduct reliability testing only once, yet continue validity testing in different populations over many years. All CNSs should be advocating for the use of reliable instruments that produce valid results. Clinical nurse specialists may find themselves in situations where reliability and validity estimates for some instruments that are being utilized are unknown. In such cases, CNSs should engage key stakeholders to sponsor nursing researchers to pursue this most important work.

  15. An infrared small target detection method based on multiscale local homogeneity measure

    NASA Astrophysics Data System (ADS)

    Nie, Jinyan; Qu, Shaocheng; Wei, Yantao; Zhang, Liming; Deng, Lizhen

    2018-05-01

    Infrared (IR) small target detection plays an important role in the field of image detection owing to its intrinsic characteristics. This paper presents a multiscale local homogeneity measure (MLHM) for infrared small target detection, which can enhance the performance of IR small target detection systems. Firstly, the intra-patch homogeneity of the target itself and the inter-patch heterogeneity between the target and the local background regions are integrated to enhance the significance of small targets. Secondly, a multiscale measure based on local regions is proposed to obtain the most appropriate response. Finally, an adaptive threshold method is applied to small target segmentation. Experimental results on three different scenarios indicate that the MLHM has good performance under the interference of strong noise.
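
    As a rough stand-in for the intra/inter-patch idea (a simplification of mine, not the authors' MLHM), one can compare a centre patch mean against its eight neighbouring patch means and keep the positive excess; running this at several patch radii yields a multiscale response:

        import numpy as np

        def local_contrast_map(img, r=2):
            """Centre-mean minus max-neighbour-mean contrast at one scale."""
            h, w = img.shape
            out = np.zeros_like(img, dtype=float)
            offsets = [(-r, -r), (-r, 0), (-r, r),
                       (0, -r),           (0, r),
                       (r, -r),  (r, 0),  (r, r)]
            for y in range(r, h - 2 * r):
                for x in range(r, w - 2 * r):
                    center = img[y:y + r, x:x + r].mean()
                    neigh = max(img[y + dy:y + dy + r, x + dx:x + dx + r].mean()
                                for dy, dx in offsets)
                    out[y, x] = max(center - neigh, 0.0)
            return out  # threshold (e.g., mean + k*std) to segment targets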

  16. Reliable and redundant FPGA based read-out design in the ATLAS TileCal Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akerstedt, Henrik; Muschter, Steffen; Drake, Gary

    The Tile Calorimeter at ATLAS [1] is a hadron calorimeter based on steel plates and scintillating tiles read out by PMTs. The current read-out system uses standard ADCs and custom ASICs to digitize and temporarily store the data on the detector. However, only a subset of the data is actually read out to the counting room. The on-detector electronics will be replaced around 2023. To achieve the required reliability the upgraded system will be highly redundant. Here the ASICs will be replaced with Kintex-7 FPGAs from Xilinx. This, in addition to the use of multiple 10 Gbps optical read-out links, will allow a full read-out of all detector data. Due to the higher radiation levels expected when the beam luminosity is increased, opportunities for repairs will be less frequent. The circuitry and firmware must therefore be designed for sufficiently high reliability using redundancy and radiation tolerant components. Within a year, a hybrid demonstrator including the new readout system will be installed in one slice of the ATLAS Tile Calorimeter. This will allow the proposed upgrade to be thoroughly evaluated well before the planned 2023 deployment in all slices, especially with regard to long term reliability. Different firmware strategies alongside with their integration in the demonstrator are presented in the context of high reliability protection against hardware malfunction and radiation induced errors.

  17. Rhamnogalacturonan-I Based Microcapsules for Targeted Drug Release

    PubMed Central

    Kusic, Anja; De Gobba, Cristian; Larsen, Flemming H.; Sassene, Philip; Zhou, Qi; van de Weert, Marco; Mullertz, Anette; Jørgensen, Bodil; Ulvskov, Peter

    2016-01-01

    Drug targeting to the colon via the oral administration route for local treatment of e.g. inflammatory bowel disease and colonic cancer has several advantages, such as needle-free administration and low infection risk. A new delivery option is plant-polysaccharide-based platforms such as Rhamnogalacturonan-I (RG-I). In the gastro-intestinal tract, RG-I is degraded only by the action of the colonic microflora. For assessment of potential drug delivery properties, RG-I based microcapsules (~1 μm in diameter) were prepared by an interfacial poly-addition reaction. The cross-linked capsules were loaded with a fluorescent dye (model drug). The capsules showed negligible and very little in vitro release when subjected to media simulating gastric and intestinal fluids, respectively. However, upon exposure to a cocktail of commercial RG-I cleaving enzymes, ~9 times higher release was observed, demonstrating that the capsules can be opened by enzymatic degradation. The combined results suggest a potential platform for targeted drug delivery in the terminal gastro-intestinal tract. PMID:27992455

  18. WEAMR — A Weighted Energy Aware Multipath Reliable Routing Mechanism for Hotline-Based WSNs

    PubMed Central

    Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung

    2013-01-01

    Reliable source-to-sink communication is the most important factor for an efficient routing protocol, especially in the domains of military, healthcare, and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy-aware multipath routing protocol which utilizes hotline-assisted routing to meet such requirements for mission critical applications. The protocol reduces the average number of hops from source to destination and provides unmatched reliability compared to well-known reactive ad hoc protocols, i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on a weighted cost calculation and intelligently selects the best possible paths for data transmission. The path cost calculation considers the end-to-end number of hops, latency, and the minimum energy node value in the path. In case of path failure, path recalculation is done efficiently with minimum latency and control-packet overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and a higher packet delivery success ratio than AODV and AOMDV. The use of multiple paths also increases the overall lifetime of the WSN by using optimum-energy available paths between sender and receiver. PMID:23669714
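
    The weighted cost can be sketched as follows; the weights, the unit scaling, and the inverse-energy term are my assumptions, since the abstract names only the three factors:

        def path_cost(hops, latency_ms, min_node_energy, w=(0.4, 0.3, 0.3)):
            """Lower is better; a depleted path's cost grows via 1/energy."""
            w_hops, w_lat, w_energy = w
            # In a real protocol the inputs would be normalized to [0, 1]
            return (w_hops * hops + w_lat * latency_ms
                    + w_energy / max(min_node_energy, 1e-9))

        # Choose the best of several candidate paths (numbers invented)
        candidates = {"A": (4, 120.0, 0.80),
                      "B": (6,  90.0, 0.35),
                      "C": (5, 100.0, 0.70)}
        best = min(candidates, key=lambda k: path_cost(*candidates[k]))
        print("selected path:", best)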

  19. Target-responsive DNA hydrogel mediated "stop-flow" microfluidic paper-based analytic device for rapid, portable and visual detection of multiple targets.

    PubMed

    Wei, Xiaofeng; Tian, Tian; Jia, Shasha; Zhu, Zhi; Ma, Yanli; Sun, Jianjun; Lin, Zhenyu; Yang, Chaoyong James

    2015-04-21

    A versatile point-of-care assay platform was developed for simultaneous detection of multiple targets based on a microfluidic paper-based analytic device (μPAD) using a target-responsive hydrogel to mediate fluidic flow and signal readout. An aptamer-cross-linked hydrogel was used as a target-responsive flow regulator in the μPAD. In the absence of a target, the hydrogel is formed in the flow channel, stopping the flow in the μPAD and preventing the colored indicator from traveling to the final observation spot, thus yielding a "signal off" readout. In contrast, in the presence of a target, no hydrogel is formed because of the preferential interaction of target and aptamer. This allows free fluidic flow in the μPAD, carrying the indicator to the observation spot and producing a "signal on" readout. The device is inexpensive to fabricate, easy to use, and disposable after detection. Testing results can be obtained within 6 min by the naked eye via a simple loading operation without the need for any auxiliary equipment. Multiple targets, including cocaine, adenosine, and Pb(2+), can be detected simultaneously, even in complex biological matrices such as urine. The reported method offers simple, low cost, rapid, user-friendly, point-of-care testing, which will be useful in many applications.

  20. Mechanism-based Proteomic Screening Identifies Targets of Thioredoxin-like Proteins*

    PubMed Central

    Nakao, Lia S.; Everley, Robert A.; Marino, Stefano M.; Lo, Sze M.; de Souza, Luiz E.; Gygi, Steven P.; Gladyshev, Vadim N.

    2015-01-01

    Thioredoxin (Trx)-fold proteins are protagonists of numerous cellular pathways that are subject to thiol-based redox control. The best characterized regulator of thiols in proteins is Trx1 itself, which together with thioredoxin reductase 1 (TR1) and peroxiredoxins (Prxs) comprises a key redox regulatory system in mammalian cells. However, there are numerous other Trx-like proteins, whose functions and redox interactors are unknown. It is also unclear if the principles of Trx1-based redox control apply to these proteins. Here, we employed a proteomic strategy to four Trx-like proteins containing CXXC motifs, namely Trx1, Rdx12, Trx-like protein 1 (Txnl1) and nucleoredoxin 1 (Nrx1), whose cellular targets were trapped in vivo using mutant Trx-like proteins, under conditions of low endogenous expression of these proteins. Prxs were detected as key redox targets of Trx1, but this approach also supported the detection of TR1, which is the Trx1 reductant, as well as mitochondrial intermembrane proteins AIF and Mia40. In addition, glutathione peroxidase 4 was found to be a Rdx12 redox target. In contrast, no redox targets of Txnl1 and Nrx1 could be detected, suggesting that their CXXC motifs do not engage in mixed disulfides with cellular proteins. For some Trx-like proteins, the method allowed distinguishing redox and non-redox interactions. Parallel, comparative analyses of multiple thiol oxidoreductases revealed differences in the functions of their CXXC motifs, providing important insights into thiol-based redox control of cellular processes. PMID:25561728

  1. Chapter 15: Reliability of Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheng, Shuangwen; O'Connor, Ryan

    The global wind industry has witnessed exciting developments in recent years. The future will be even brighter with further reductions in capital and operation and maintenance costs, which can be accomplished with improved turbine reliability, especially when turbines are installed offshore. One opportunity for the industry to improve wind turbine reliability is through the exploration of reliability engineering life data analysis based on readily available data or maintenance records collected at typical wind plants. If adopted and conducted appropriately, these analyses can quickly save operation and maintenance costs in a potentially impactful manner. This chapter discusses wind turbine reliability by highlighting the methodology of reliability engineering life data analysis. It first briefly discusses fundamentals for wind turbine reliability and the current industry status. Then, the reliability engineering method for life analysis, including data collection, model development, and forecasting, is presented in detail and illustrated through two case studies. The chapter concludes with some remarks on potential opportunities to improve wind turbine reliability. An owner and operator's perspective is taken and mechanical components are used to exemplify the potential benefits of reliability engineering analysis to improve wind turbine reliability and availability.
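
    Life data analysis of the kind described typically begins with a Weibull fit to times-to-failure; a SciPy sketch with invented failure data (a shape parameter above 1 would indicate wear-out):

        import numpy as np
        from scipy import stats

        # Hypothetical times-to-failure for a turbine subsystem, in days
        ttf = np.array([310, 455, 520, 610, 700, 745, 820, 910, 1005, 1150])

        # Fit a 2-parameter Weibull (location fixed at zero)
        shape, loc, scale = stats.weibull_min.fit(ttf, floc=0)
        print(f"beta (shape) = {shape:.2f}, eta (scale) = {scale:.0f} days")

        # Reliability at one year under the fitted model
        print("R(365 d) =", stats.weibull_min.sf(365, shape, loc, scale))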

  2. Truncated feature representation for automatic target detection using transformed data-based decomposition

    NASA Astrophysics Data System (ADS)

    Riasati, Vahid R.

    2016-05-01

    In this work, the data covariance matrix is diagonalized to provide an orthogonal basis set from the eigenvectors of the data. The eigenvector decomposition of the data is transformed and filtered in the transform domain to truncate the data to robust features related to a specified set of targets. These truncated eigen-features are then combined and reconstructed for use in a composite filter, which is subsequently applied to the automatic detection of targets of the same class. The results associated with testing the current technique are evaluated using the peak-correlation and peak-correlation-energy metrics and are presented in this work. The inverse-transformed eigen-bases of the current technique may be thought of as an injected sparsity that minimizes the data needed to represent the skeletal structure information associated with the set of targets under consideration.
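
    A compact sketch of the decomposition-and-truncation step as I read it (hypothetical data; the transform-domain filtering of the paper is omitted):

        import numpy as np

        def truncated_eigen_features(data, k):
            """Project samples onto the top-k eigenvectors of their covariance.

            data: (n_samples, n_pixels) matrix of vectorized target images.
            """
            centered = data - data.mean(axis=0)
            cov = np.cov(centered, rowvar=False)
            vals, vecs = np.linalg.eigh(cov)            # ascending eigenvalues
            top = vecs[:, np.argsort(vals)[::-1][:k]]   # k dominant directions
            features = centered @ top                   # truncated representation
            recon = features @ top.T + data.mean(axis=0)
            return features, recon

        # Hypothetical: 40 training chips of 16x16 pixels, keep 8 components
        rng = np.random.default_rng(1)
        chips = rng.random((40, 256))
        feats, recon = truncated_eigen_features(chips, k=8)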

  3. Mitochondrion: A Promising Target for Nanoparticle-Based Vaccine Delivery Systems

    PubMed Central

    Wen, Ru; Umeano, Afoma C.; Francis, Lily; Sharma, Nivita; Tundup, Smanla; Dhar, Shanta

    2016-01-01

    Vaccination is one of the most popular technologies in disease prevention and eradication. It is promising to improve immunization efficiency by using vectors and/or adjuvant delivery systems. Nanoparticle (NP)-based delivery systems have attracted increasing interest due to enhancement of antigen uptake via prevention of vaccine degradation in the biological environment and the intrinsic immune-stimulatory properties of the materials. Mitochondria play paramount roles in cell life and death and are promising targets for vaccine delivery systems to effectively induce immune responses. In this review, we focus on NPs-based delivery systems with surfaces that can be manipulated by using mitochondria targeting moieties for intervention in health and disease. PMID:27258316

  4. The Accessibility, Usability, and Reliability of Chinese Web-Based Information on HIV/AIDS.

    PubMed

    Niu, Lu; Luo, Dan; Liu, Ying; Xiao, Shuiyuan

    2016-08-20

    The present study was designed to assess the quality of Chinese-language Internet-based information on HIV/AIDS. We entered the following search terms, in Chinese, into Baidu and Sogou: "HIV/AIDS", "symptoms", and "treatment", and evaluated the first 50 hits of each query using the Minervation validation instrument (LIDA tool) and the DISCERN instrument. Of the 900 hits identified, 85 websites were included in this study. The overall score of the LIDA tool was 63.7%; the mean scores for accessibility, usability, and reliability were 82.2%, 71.5%, and 27.3%, respectively. For the top 15 sites according to the LIDA score, the mean DISCERN score was 43.1 (95% confidence interval (CI) = 37.7-49.5). Noncommercial websites showed higher DISCERN scores than commercial websites, whereas commercial websites were more likely than noncommercial websites to appear in the first 20 links returned by each search engine. In general, HIV/AIDS-related Chinese-language websites have poor reliability, although their accessibility and usability are fair. In addition, the treatment information presented on Chinese-language websites is far from sufficient. There is an imperative need for professionals and specialized institutes to improve the comprehensiveness of web-based information related to HIV/AIDS.

  5. Reliable femoral frame construction based on MRI dedicated to muscles position follow-up.

    PubMed

    Dubois, G; Bonneau, D; Lafage, V; Rouch, P; Skalli, W

    2015-10-01

    In vivo follow-up of muscle shape variation represents a challenge when evaluating muscle development due to disease or treatment. Recent developments in muscle reconstruction techniques point to MRI as a clinical tool for the follow-up of the thigh muscles. The comparison of 3D muscle shapes from two different sequences is not easy because there is no common frame. This study proposes an innovative method for the construction of a reliable femoral frame based on the femoral head center and both condyle centers. To make the definition of the condylar spheres more robust, an original method was developed that combines the estimation of the diameters of both condyles from the lateral antero-posterior distance with the estimation of the sphere centers from an optimization process. The influence of the spacing between MR slices and of the origin position was studied. For all axes, the proposed method presented an angular error lower than 1° with a slice spacing of 10 mm, and the optimal position of the origin was identified at 56% of the distance between the femoral head center and the barycenter of both condyles. The high reliability of this method provides a robust frame for clinical follow-up based on MRI.

  6. Investigating univariate temporal patterns for intrinsic connectivity networks based on complexity and low-frequency oscillation: a test-retest reliability study.

    PubMed

    Wang, X; Jiao, Y; Tang, T; Wang, H; Lu, Z

    2013-12-19

    Intrinsic connectivity networks (ICNs) are composed of spatial components and time courses. The spatial components of ICNs were discovered with moderate-to-high reliability. As far as we know, few studies have focused on the reliability of the temporal patterns of ICNs based on their individual time courses. The goals of this study were twofold: to investigate the test-retest reliability of temporal patterns for ICNs, and to analyze these informative univariate metrics. Additionally, a correlation analysis was performed to enhance interpretability. Our study included three datasets: (a) short- and long-term scans, (b) multi-band echo-planar imaging (mEPI), and (c) eyes open or closed. Using dual regression, we obtained the time courses of ICNs for each subject. To produce temporal patterns for ICNs, we applied two categories of univariate metrics: network-wise complexity and network-wise low-frequency oscillation. Furthermore, we validated the test-retest reliability for each metric. The network-wise temporal patterns of most ICNs (especially the default mode network, DMN) exhibited moderate-to-high reliability and reproducibility under different scan conditions. Network-wise complexity for the DMN exhibited fair reliability (ICC<0.5) based on eyes-closed sessions. In particular, our results support mEPI as a useful method with high reliability and reproducibility. In addition, these temporal patterns carried physiological meaning, and certain temporal patterns were correlated with the node strength of the corresponding ICN. Overall, network-wise temporal patterns of ICNs were reliable and informative and could be complementary to spatial patterns of ICNs in further studies. Copyright © 2013 IBRO. Published by Elsevier Ltd. All rights reserved.

  7. Asymmetric programming: a highly reliable metadata allocation strategy for MLC NAND flash memory-based sensor systems.

    PubMed

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-10-10

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme.
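
    A toy allocator conveys the idea; the even/odd LSB-MSB pairing below is a simplifying assumption of mine, since real MLC page pairings are device-specific:

        def allocate_page(free_pages, is_metadata):
            """Prefer MSB pages (odd index in this toy layout) for metadata."""
            msb = [p for p in free_pages if p % 2 == 1]
            lsb = [p for p in free_pages if p % 2 == 0]
            if is_metadata and msb:
                return msb[0]      # metadata goes to the more reliable MSB page
            if not is_metadata and lsb:
                return lsb[0]      # bulk data tolerates the LSB error rate
            return free_pages[0]   # fall back when the preferred pool is empty

        free = list(range(16))
        print(allocate_page(free, is_metadata=True))   # -> 1
        print(allocate_page(free, is_metadata=False))  # -> 0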

  8. Asymmetric Programming: A Highly Reliable Metadata Allocation Strategy for MLC NAND Flash Memory-Based Sensor Systems

    PubMed Central

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-01-01

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme. PMID:25310473

  9. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples, such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently reliably quantified at high sensitivity in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest (proteotypic peptides) are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach, we were able to reliably quantify low-abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.
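
    The proteotypic-peptide filter of step two can be illustrated with a toy sketch: an in-silico tryptic digest over a two-protein background, keeping only peptides that map to a single protein (the sequences and length cutoff are hypothetical; TIQAM itself works against PeptideAtlas data):

        import re

        def tryptic_peptides(seq):
            """Toy in-silico tryptic digest: cleave after K/R except before P."""
            return [p for p in re.split(r"(?<=[KR])(?!P)", seq) if len(p) >= 6]

        proteome = {  # hypothetical background proteome
            "virA": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEK",
            "virB": "MKTAYIAKQRQISFVKAAADEFGHIKLMNPQRSTVWYACDEFGHIKLMNPQRK",
        }

        peptide_to_proteins = {}
        for prot, seq in proteome.items():
            for pep in tryptic_peptides(seq):
                peptide_to_proteins.setdefault(pep, set()).add(prot)

        # Proteotypic = maps to exactly one protein in the background proteome.
        for pep, ps in peptide_to_proteins.items():
            if len(ps) == 1:
                print(pep, "->", next(iter(ps)))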

  10. Research on moving target defense based on SDN

    NASA Astrophysics Data System (ADS)

    Chen, Mingyong; Wu, Weimin

    2017-08-01

    An address mutation strategy is proposed that makes host addresses unpredictable: the real address is replaced during packet forwarding and the path is mutated, thereby hiding the real address of the host and its route. On this basis, a moving target defense technique based on spatio-temporal mutation is proposed, which combines the centralized-control advantage of Software Defined Networking with sFlow traffic monitoring. The mutation period is adjusted in real time according to network traffic, and the controller changes the destination address while packets are transferred between switches, constructing a moving target that confuses hosts within the network and thereby protects both the host and the network.
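
    A minimal sketch of the mutation idea, assuming a hypothetical virtual-address pool and an sFlow-style traffic reading that shortens the mutation period under load (an illustration, not the paper's controller logic):

        import ipaddress
        import random

        POOL = [str(ipaddress.IPv4Address("10.0.0.0") + i) for i in range(1, 255)]

        def mutation_period(traffic_pps, base=30.0):
            """Shorter mutation period under heavier observed traffic."""
            return max(1.0, base / (1.0 + traffic_pps / 1000.0))

        def mutate(real_to_virtual, rng):
            """Re-map every real host address to a fresh, unpredictable virtual one."""
            virtuals = rng.sample(POOL, len(real_to_virtual))
            return dict(zip(real_to_virtual.keys(), virtuals))

        rng = random.Random(42)
        mapping = {"192.168.1.10": None, "192.168.1.11": None}
        mapping = mutate(mapping, rng)
        print(mapping, "next mutation in", mutation_period(5000), "s")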

  11. Reliability of sensor-based real-time workflow recognition in laparoscopic cholecystectomy.

    PubMed

    Kranzfelder, Michael; Schneider, Armin; Fiolka, Adam; Koller, Sebastian; Reiser, Silvano; Vogel, Thomas; Wilhelm, Dirk; Feussner, Hubertus

    2014-11-01

    Laparoscopic cholecystectomy is a very common minimally invasive surgical procedure that may be improved by autonomous or cooperative assistance support systems. Model-based surgery with a precise definition of distinct procedural tasks (PT) of the operation was implemented and tested to depict and analyze the process of this procedure. The reliability of real-time workflow recognition in laparoscopic cholecystectomy was evaluated by continuous sensor-based data acquisition. Ten PTs were defined, including begin/end of preparation of Calot's triangle, clipping/cutting of the cystic artery and duct, begin/end of gallbladder dissection, begin/end of hemostasis, gallbladder removal, and end of operation. Data acquisition comprised continuous instrument detection, room/table light status, intra-abdominal pressure, table tilt, irrigation/aspiration volume, and coagulation/cutting current application. Two independent observers recorded the start and endpoint of each step by analysis of the sensor data. The data were cross-checked against laparoscopic video recordings serving as the gold standard for PT identification. Bland-Altman analysis revealed for 95% of cases a difference of annotation results within the limits of agreement, ranging from -309 s (PT 7) to +368 s (PT 5). Laparoscopic video and sensor data matched to a greater or lesser extent within the different procedural tasks. In the majority of cases, the observer results exceeded those obtained from the laparoscopic video. Empirical knowledge was required to detect phase transitions. A set of sensors used to monitor laparoscopic cholecystectomy procedures was sufficient to enable expert observers to reliably identify each PT. In the future, computer systems may automate the task identification process provided a more robust data inflow is available.

  12. Is Learner Self-Assessment Reliable and Valid in a Web-Based Portfolio Environment for High School Students?

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Liang, Chaoyun; Chen, Yi-Hui

    2013-01-01

    This study explored the reliability and validity of Web-based portfolio self-assessment. Participants were 72 senior high school students enrolled in a computer application course. The students created learning portfolios, viewed peers' work, and performed self-assessment on the Web-based portfolio assessment system. The results indicated: 1)…

  13. Assuring long-term reliability of concentrator PV systems

    NASA Astrophysics Data System (ADS)

    McConnell, R.; Garboushian, V.; Brown, J.; Crawford, C.; Darban, K.; Dutra, D.; Geer, S.; Ghassemian, V.; Gordon, R.; Kinsey, G.; Stone, K.; Turner, G.

    2009-08-01

    Concentrator PV (CPV) systems have attracted significant interest because these systems incorporate the world's highest efficiency solar cells and they are targeting the lowest cost production of solar electricity for the world's utility markets. Because these systems are just entering solar markets, manufacturers and customers need to assure their reliability for many years of operation. There are three general approaches for assuring CPV reliability: 1) field testing and development over many years leading to improved product designs, 2) testing to internationally accepted qualification standards (especially for new products) and 3) extended reliability tests to identify critical weaknesses in a new component or design. Amonix has been a pioneer in all three of these approaches. Amonix has an internal library of field failure data spanning over 15 years that serves as the basis for its seven generations of CPV systems. An Amonix product served as the test CPV module for the development of the world's first qualification standard completed in March 2001. Amonix staff has served on international standards development committees, such as the International Electrotechnical Commission (IEC), in support of developing CPV standards needed in today's rapidly expanding solar markets. Recently Amonix employed extended reliability test procedures to assure reliability of multijunction solar cell operation in its seventh generation high concentration PV system. This paper will discuss how these three approaches have all contributed to assuring reliability of the Amonix systems.

  14. Understanding the Reliability of Solder Joints Used in Advanced Structural and Electronics Applications: Part 2 - Reliability Performance.

    DOE PAGES

    Vianco, Paul T.

    2017-03-01

    Whether structural or electronic, all solder joints must provide the necessary level of reliability for the application. The Part 1 report examined the effects of filler metal properties and the soldering process on joint reliability. Filler metal solderability and mechanical properties, as well as the extent of base material dissolution and interface reaction that occur during the soldering process, were shown to affect reliability performance. The discussion continues in this Part 2 report, which highlights the factors that directly affect solder joint reliability. One such factor is the growth of an intermetallic compound (IMC) reaction layer at the solder/base material interface by solid-state diffusion. In terms of the mechanical response of the solder joint, fatigue remains the foremost concern for long-term performance. Thermal mechanical fatigue (TMF), a form of low-cycle fatigue (LCF), occurs when temperature cycling is combined with mismatched values of the coefficient of thermal expansion (CTE) between the materials comprising the solder joint “system.” Vibration environments give rise to high-cycle fatigue (HCF) degradation. Although accelerated aging studies provide valuable empirical data, the many variants of filler metals, base materials, joint geometries, and service environments are forcing design engineers to embrace computational modeling to predict the long-term reliability of solder joints.

  15. Reliability prediction of large fuel cell stack based on structure stress analysis

    NASA Astrophysics Data System (ADS)

    Liu, L. F.; Liu, B.; Wu, C. W.

    2017-09-01

    The aim of this paper is to improve the reliability of a proton exchange membrane fuel cell (PEMFC) stack by designing the clamping force and the thickness difference between the membrane electrode assembly (MEA) and the gasket. Stack reliability is directly determined by component reliability, which is affected by material properties and contact stress. The component contact stress is a random variable because it is affected by many uncertain factors in the production and clamping processes. We investigated the influence of the parameter coefficients of variation on the probability distribution of contact stress using an equivalent stiffness model and the first-order second-moment method. The optimal contact stress, at which component reliability is highest, is obtained from the stress-strength interference model. To achieve this optimal contact stress between the contacting components, the component thickness and the stack clamping force are optimally designed. Finally, a detailed description is given of how to design the MEA and gasket dimensions to obtain the highest stack reliability. This work can provide valuable guidance in the design of stack structures for highly reliable fuel cell stacks.
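
    The stress-strength interference step has a standard closed form when stress and strength are independent normal variables: R = Phi((mu_strength - mu_stress) / sqrt(sigma_strength^2 + sigma_stress^2)). A small sketch with hypothetical gasket values (not the paper's data):

        from math import erf, sqrt

        def normal_cdf(x):
            return 0.5 * (1.0 + erf(x / sqrt(2.0)))

        def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
            """R = P(strength > stress) for independent normal stress and strength."""
            beta = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
            return normal_cdf(beta)

        # Hypothetical allowable vs. applied contact stress (MPa).
        print(f"R = {interference_reliability(2.0, 0.15, 1.5, 0.20):.4f}")  # ~0.9772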

  16. An Efficient Moving Target Detection Algorithm Based on Sparsity-Aware Spectrum Estimation

    PubMed Central

    Shen, Mingwei; Wang, Jie; Wu, Di; Zhu, Daiyin

    2014-01-01

    In this paper, an efficient direct data domain space-time adaptive processing (STAP) algorithm for moving target detection is proposed, based on the distinct spectral features of clutter and target signals in the angle-Doppler domain. To reduce computational complexity, the high-resolution angle-Doppler spectrum is obtained by finding the sparsest coefficients in the angle domain using the reduced-dimension data within each Doppler bin. We then present a knowledge-aided block-size detection algorithm that discriminates between moving targets and clutter based on the extracted spectral features. The feasibility and effectiveness of the proposed method are validated through both numerical simulations and raw data processing results. PMID:25222035

  17. Research on target tracking in coal mine based on optical flow method

    NASA Astrophysics Data System (ADS)

    Xue, Hongye; Xiao, Qingwei

    2015-03-01

    To recognize, track, and count the bolting machines in coal mine video images, a real-time target tracking method based on Lucas-Kanade sparse optical flow is proposed in this paper. The method judges whether the moving target deviates from its trajectory, and predicts and corrects the position of the moving target. It solves the problem of losing the target under the weak light, uneven illumination, and occlusion typical of mine footage. Recognition and tracking were implemented on the VC++ platform with the OpenCV library. The validity of the method is verified by the experimental results.
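
    A minimal pyramidal Lucas-Kanade tracking loop in OpenCV's Python API illustrates the general approach (the video file name, feature counts, and re-seeding rule are hypothetical, not the authors' VC++ implementation):

        import cv2

        cap = cv2.VideoCapture("mine_video.avi")   # hypothetical input file
        ok, prev = cap.read()
        if not ok:
            raise SystemExit("cannot read video")
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        # Seed corner features to track (here over the whole first frame).
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                      qualityLevel=0.01, minDistance=7)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None,
                                                      winSize=(21, 21), maxLevel=3)
            pts = nxt[status.ravel() == 1].reshape(-1, 1, 2)
            # Trajectory check: if too many features are lost (weak or uneven
            # light, occlusion), re-seed instead of drifting off the target.
            if len(pts) < 10:
                pts = cv2.goodFeaturesToTrack(gray, 50, 0.01, 7)
            prev_gray = gray
        cap.release()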

  18. Infrared dim moving target tracking via sparsity-based discriminative classifier and convolutional network

    NASA Astrophysics Data System (ADS)

    Qian, Kun; Zhou, Huixin; Wang, Bingjian; Song, Shangzhen; Zhao, Dong

    2017-11-01

    Infrared dim and small target tracking is a highly challenging task. The main challenge is to account for the appearance change of an object that is submerged in cluttered background. An efficient appearance model that exploits both a global template and local representations over infrared image sequences is constructed for dim moving target tracking. A Sparsity-based Discriminative Classifier (SDC) and a Convolutional Network-based Generative Model (CNGM) are combined with a prior model. In the SDC model, a sparse representation-based algorithm is adopted to calculate a confidence value that assigns more weight to target templates than to negative background templates. In the CNGM model, simple cell feature maps are obtained by convolving target templates with fixed filters extracted from the target region in the first frame. These maps measure similarities between each filter and local intensity patterns across the target template, thereby encoding its local structural information. All the maps then form a representation that preserves the inner geometric layout of a candidate template. Furthermore, the fixed target template set is processed via an efficient prior model, and the same operation is applied to candidate templates in the CNGM model. The online update scheme not only accounts for appearance variations but also alleviates the drift problem. Finally, the collaborative confidence values of particles are used to generate the particles' importance weights. Experiments on various infrared sequences have validated the tracking capability of the presented algorithm. Experimental results show that the algorithm runs in real time and provides higher accuracy than state-of-the-art algorithms.

  19. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system- and component-level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interactions of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, and electrical. In addition, these models are based on the available test data, which can be updated, and the analysis refined, as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has demonstrated itself to be a superior technique to conventional reliability approaches that utilize failure rates derived from similar equipment or simply expert judgment.

  20. Reliable pre-eclampsia pathways based on multiple independent microarray data sets.

    PubMed

    Kawasaki, Kaoru; Kondoh, Eiji; Chigusa, Yoshitsugu; Ujita, Mari; Murakami, Ryusuke; Mogami, Haruta; Brown, J B; Okuno, Yasushi; Konishi, Ikuo

    2015-02-01

    Pre-eclampsia is a multifactorial disorder characterized by heterogeneous clinical manifestations. Gene expression profiling of preeclamptic placentas has provided different and even opposite results, partly because the data are compromised by various experimental artefacts. Here we aimed to identify reliable pre-eclampsia-specific pathways using multiple independent microarray data sets. Gene expression data of control and preeclamptic placentas were obtained from the Gene Expression Omnibus. Single-sample gene-set enrichment analysis was performed to generate gene-set activation scores for 9707 pathways obtained from the Molecular Signatures Database. Candidate pathways were identified by t-test-based screening using data sets GSE10588, GSE14722 and GSE25906. Additionally, recursive feature elimination was applied to arrive at a further reduced set of pathways. To assess the validity of the pre-eclampsia pathways, a statistically validated protocol was executed using five data sets, including two independent validation data sets, GSE30186 and GSE44711. Quantitative real-time PCR was performed for genes in a panel of potential pre-eclampsia pathways using placentas of 20 women with normal singleton pregnancies or severe pre-eclampsia (n = 10 each). A panel of ten pathways was found to discriminate women with pre-eclampsia from controls with high accuracy. Among these were pathways not previously associated with pre-eclampsia, such as the GABA receptor pathway, as well as pathways that have already been linked to pre-eclampsia, such as the glutathione and CDKN1C pathways. mRNA expression of GABRA3 (GABA receptor pathway), GCLC and GCLM (glutathione metabolic pathway), and CDKN1C was significantly reduced in the preeclamptic placentas. In conclusion, ten accurate and reliable pre-eclampsia pathways were identified based on multiple independent microarray data sets. A pathway-based classification may be a worthwhile approach to elucidate the pathogenesis of pre-eclampsia.
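
    The two screening stages can be sketched generically on a hypothetical activation-score matrix (synthetic data; the authors' pipeline used ssGSEA scores from the GEO sets listed above):

        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.feature_selection import RFE
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(1)
        # Hypothetical activation scores: 60 placentas x 200 pathways.
        X = rng.normal(size=(60, 200))
        y = np.r_[np.zeros(30, int), np.ones(30, int)]  # 0 control, 1 pre-eclampsia
        X[y == 1, :5] += 1.0                            # 5 truly informative pathways

        # Stage 1: t-test screening of case vs. control activation scores.
        p = ttest_ind(X[y == 0], X[y == 1], axis=0).pvalue
        candidates = np.where(p < 0.01)[0]

        # Stage 2: recursive feature elimination down to a small panel.
        rfe = RFE(LinearSVC(C=0.1, dual=False),
                  n_features_to_select=min(10, len(candidates)))
        rfe.fit(X[:, candidates], y)
        print("selected pathways:", candidates[rfe.support_])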

  1. Achieving Reliable Communication in Dynamic Emergency Responses

    PubMed Central

    Chipara, Octav; Plymoth, Anders N.; Liu, Fang; Huang, Ricky; Evans, Brian; Johansson, Per; Rao, Ramesh; Griswold, William G.

    2011-01-01

    Emergency responses require the coordination of first responders to assess the condition of victims, stabilize their condition, and transport them to hospitals based on the severity of their injuries. WIISARD is a system designed to facilitate the collection of medical information and its reliable dissemination during emergency responses. A key challenge in WIISARD is to deliver data with high reliability as first responders move and operate in a dynamic radio environment fraught with frequent network disconnections. The initial WIISARD system employed a client-server architecture and an ad-hoc routing protocol was used to exchange data. The system had low reliability when deployed during emergency drills. In this paper, we identify the underlying causes of unreliability and propose a novel peer-to-peer architecture that in combination with a gossip-based communication protocol achieves high reliability. Empirical studies show that compared to the initial WIISARD system, the redesigned system improves reliability by as much as 37% while reducing the number of transmitted packets by 23%. PMID:22195075

  2. Development of a Digital-Based Instrument to Assess Perceived Motor Competence in Children: Face Validity, Test-Retest Reliability, and Internal Consistency

    PubMed Central

    Palmer, Kara K.

    2017-01-01

    Assessing children’s perceptions of their movement abilities (i.e., perceived competence) is traditionally done using picture scales such as the Pictorial Scale of Perceived Competence and Acceptance for Young Children or the Pictorial Scale of Perceived Movement Skill Competence. Pictures, however, fail to capture the temporal components of movement. To address this limitation, we created a digital-based instrument to assess perceived motor competence: the Digital Scale of Perceived Motor Competence. The purpose of this study was to determine the validity, reliability, and internal consistency of this instrument, which is based on the twelve fundamental motor skills from the Test of Gross Motor Development-2nd Edition and has a layout and item structure similar to the Pictorial Scale of Perceived Movement Skill Competence. Face validity of the instrument was examined in Phase I (n = 56; Mage = 8.6 ± 0.7 years, 26 girls). Test-retest reliability and internal consistency were assessed in Phase II (n = 54; Mage = 8.7 ± 0.5 years, 26 girls). Intra-class correlations (ICC) and Cronbach’s alpha were computed to determine test-retest reliability and internal consistency for all twelve skills and for the locomotor and object control subscales. The Digital Scale of Perceived Motor Competence demonstrated excellent test-retest reliability (ICC = 0.83, total; ICC = 0.77, locomotor; ICC = 0.79, object control) and acceptable/good internal consistency (α = 0.62, total; α = 0.57, locomotor; α = 0.49, object control). Findings provide evidence of the reliability of the three-level digital-based instrument of perceived motor competence for older children. PMID:29910408
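
    For reference, Cronbach's alpha is alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A short sketch on synthetic 12-item data of the same shape as Phase II (values hypothetical):

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_children, k_items) array of item scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(3)
        latent = rng.normal(size=(54, 1))                    # perceived competence
        scores = latent + rng.normal(0, 1.2, size=(54, 12))  # 12 skill items
        print(f"alpha = {cronbach_alpha(scores):.2f}")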

  3. The research and application of visual saliency and adaptive support vector machine in target tracking field.

    PubMed

    Chen, Yuantao; Xu, Weihong; Kuang, Fangjun; Gao, Shangbing

    2013-01-01

    Efficient target tracking algorithms have become a research focus in intelligent robotics. The main problem of target tracking for a mobile robot is environmental uncertainty: target states are difficult to estimate, and illumination changes, target shape changes, complex backgrounds, and occlusion all affect tracking robustness. To further improve tracking accuracy and reliability, we present a novel target tracking algorithm using visual saliency and an adaptive support vector machine (ASVM). The algorithm is based on the mixture saliency of image features, including color, brightness, and motion; these common characteristics are combined to express the target's saliency. Numerous experiments demonstrate the effectiveness and timeliness of the proposed target tracking algorithm on video sequences in which the target objects undergo large changes in pose, scale, and illumination.

  4. miRTarBase update 2018: a resource for experimentally validated microRNA-target interactions.

    PubMed

    Chou, Chih-Hung; Shrestha, Sirjana; Yang, Chi-Dung; Chang, Nai-Wen; Lin, Yu-Ling; Liao, Kuang-Wen; Huang, Wei-Chi; Sun, Ting-Hsuan; Tu, Siang-Jyun; Lee, Wei-Hsiang; Chiew, Men-Yee; Tai, Chun-San; Wei, Ting-Yen; Tsai, Tzi-Ren; Huang, Hsin-Tzu; Wang, Chung-Yu; Wu, Hsin-Yi; Ho, Shu-Yi; Chen, Pin-Rong; Chuang, Cheng-Hsun; Hsieh, Pei-Jung; Wu, Yi-Shin; Chen, Wen-Liang; Li, Meng-Ju; Wu, Yu-Chun; Huang, Xin-Yi; Ng, Fung Ling; Buddhakosai, Waradee; Huang, Pei-Chun; Lan, Kuan-Chun; Huang, Chia-Yen; Weng, Shun-Long; Cheng, Yeong-Nan; Liang, Chao; Hsu, Wen-Lian; Huang, Hsien-Da

    2018-01-04

    MicroRNAs (miRNAs) are small non-coding RNAs of ~22 nucleotides that are involved in negative regulation of mRNA at the post-transcriptional level. Previously, we developed miRTarBase, which provides information about experimentally validated miRNA-target interactions (MTIs). Here, we describe an updated database containing 422,517 curated MTIs between 4076 miRNAs and 23,054 target genes, collected from over 8500 articles. The number of MTIs curated with strong evidence has increased ~1.4-fold since the last update in 2016. In this updated version, target sites validated by reporter assay that are available in the literature can be downloaded. The target site sequences can yield new features for analysis via machine learning approaches, which can help to evaluate the performance of miRNA-target prediction tools. Furthermore, improved browsing options make it easier for users to locate specific MTIs. With these improvements, miRTarBase serves as one of the most comprehensively annotated databases of experimentally validated miRNA-target interactions in the field of miRNA-related research. miRTarBase is available at http://miRTarBase.mbc.nctu.edu.tw/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. A Phenotypic Based Target Screening Approach Delivers New Antitubercular CTP Synthetase Inhibitors.

    PubMed

    Esposito, Marta; Szadocka, Sára; Degiacomi, Giulia; Orena, Beatrice S; Mori, Giorgia; Piano, Valentina; Boldrin, Francesca; Zemanová, Júlia; Huszár, Stanislav; Barros, David; Ekins, Sean; Lelièvre, Joel; Manganelli, Riccardo; Mattevi, Andrea; Pasca, Maria Rosalia; Riccardi, Giovanna; Ballell, Lluis; Mikušová, Katarína; Chiarelli, Laurent R

    2017-06-09

    Despite its great potential, the target-based approach has been mostly unsuccessful in tuberculosis drug discovery, while whole cell phenotypic screening has delivered several active compounds. However, for many of these hits, the cellular target has not yet been identified, thus preventing further target-based optimization of the compounds. In this context, the newly validated drug target CTP synthetase PyrG was exploited to assess a target-based approach to already known, but untargeted, antimycobacterial compounds. To this purpose, the publicly available GlaxoSmithKline antimycobacterial compound set was assayed, uncovering a series of 4-(pyridin-2-yl)thiazole derivatives that efficiently inhibit Mycobacterium tuberculosis PyrG enzyme activity, one of them showing low activity against the human CTP synthetase. The three best compounds were ATP-binding-site competitive inhibitors, with Ki values ranging from 3 to 20 μM, but did not show any activity against a small panel of different prokaryotic and eukaryotic kinases, thus demonstrating specificity for the CTP synthetases. Metabolic labeling experiments demonstrated that the compounds directly interfere not only with CTP biosynthesis but also with other CTP-dependent biochemical pathways, such as lipid biosynthesis. Moreover, using a M. tuberculosis pyrG conditional knock-down strain, it was shown that the activity of two compounds depends on the intracellular concentration of the CTP synthetase. All these results strongly suggest a role of PyrG as a target of these compounds, thus strengthening the value of this kind of approach for the identification of new scaffolds for drug development.

  6. Fabrication of boron sputter targets

    DOEpatents

    Makowiecki, Daniel M.; McKernan, Mark A.

    1995-01-01

    A process for fabricating high density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high density boron monolith by hot isostatically compacting high purity (99.9%) boron powder, machining the boron monolith to the final dimensions, and brazing the finished boron piece to a matching boron carbide (B4C) piece by placing aluminum foil therebetween and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolith by vacuum deposition. Also, a titanium-based vacuum braze alloy can be used in place of the aluminum foil.

  7. Fabrication of boron sputter targets

    DOEpatents

    Makowiecki, D.M.; McKernan, M.A.

    1995-02-28

    A process is disclosed for fabricating high density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high density boron monolith by hot isostatically compacting high purity (99.9%) boron powder, machining the boron monolith to the final dimensions, and brazing the finished boron piece to a matching boron carbide (B4C) piece by placing aluminum foil therebetween and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolith by vacuum deposition. Also, a titanium-based vacuum braze alloy can be used in place of the aluminum foil. 7 figs.

  8. Automated Threshold Selection for Template-Based Sonar Target Detection

    DTIC Science & Technology

    2017-08-01

    …test based on the distribution of the matched filter correlations. From the matched filter output we evaluate target-sized areas and surrounding … synthetic aperture sonar data that were part of the evaluation. Figure 3 shows a nearly uniform seafloor. Figure 4 is more complex, with …

  9. Infrared small target detection based on directional zero-crossing measure

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangyue; Ding, Qinghai; Luo, Haibo; Hui, Bin; Chang, Zheng; Zhang, Junchao

    2017-12-01

    Infrared small target detection under complex backgrounds and low signal-to-clutter ratio (SCR) conditions is of great significance to the development of precision guidance and infrared surveillance. In order to detect targets precisely and extract them from intricate clutter effectively, a detection method based on a zero-crossing saliency (ZCS) map is proposed. The original image is first decomposed into different first-order directional derivative (FODD) maps using FODD filters. The ZCS map is then obtained by fusing all directional zero-crossing points. Finally, an adaptive threshold is adopted to segment targets from the ZCS map. Experimental results on a series of images show that our method is effective and robust under complex backgrounds. Moreover, compared with five other state-of-the-art methods, our method achieves better performance in terms of detection rate, SCR gain, and background suppression factor.
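
    A toy version of the pipeline, with illustrative central-difference kernels and a sign-change test standing in for the paper's FODD filters and fusion rule:

        import numpy as np
        from scipy.ndimage import convolve

        def fodd_zero_crossings(img):
            """Toy zero-crossing saliency: derivative filters in four directions;
            a pixel scores where the derivative flips sign across neighbours."""
            img = img.astype(float)
            kernels = [  # illustrative central-difference FODD kernels
                np.array([[-1, 0, 1]]) / 2.0,
                np.array([[-1], [0], [1]]) / 2.0,
                np.array([[-1, 0, 0], [0, 0, 0], [0, 0, 1]]) / 2.0,
                np.array([[0, 0, -1], [0, 0, 0], [1, 0, 0]]) / 2.0,
            ]
            score = np.zeros_like(img)
            for k in kernels:
                d = convolve(img, k, mode="nearest")
                flips = np.zeros_like(d, dtype=bool)
                flips[:, 1:] |= d[:, 1:] * d[:, :-1] < 0  # horizontal sign change
                flips[1:, :] |= d[1:, :] * d[:-1, :] < 0  # vertical sign change
                score += flips * np.abs(d)                # fuse across directions
            return score

        frame = np.zeros((64, 64)); frame[30:33, 40:43] = 5.0  # dim point target
        noisy = frame + np.random.default_rng(0).normal(0, 0.2, (64, 64))
        zcs = fodd_zero_crossings(noisy)
        thr = zcs.mean() + 4 * zcs.std()                       # adaptive threshold
        print("detections:", np.argwhere(zcs > thr)[:5])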

  10. Interval MULTIMOORA method with target values of attributes based on interval distance and preference degree: biomaterials selection

    NASA Astrophysics Data System (ADS)

    Hafezalkotob, Arian; Hafezalkotob, Ashkan

    2017-06-01

    A target-based MADM method covers beneficial and non-beneficial attributes as well as target values for some attributes. Such techniques are considered the comprehensive form of MADM approaches. Target-based MADM methods can also be used in traditional decision-making problems in which only beneficial and non-beneficial attributes exist. In many practical selection problems, some attributes have given target values, and the values of the decision matrix and of the target-based attributes can be provided as intervals. Some target-based decision-making methods have recently been developed; however, a research gap exists in the area of MADM techniques with target-based attributes under uncertainty of information. We extend the MULTIMOORA method for solving practical material selection problems in which material properties and their target values are given as interval numbers. We employ various concepts of interval computations to reduce the degeneration of uncertain data. In this regard, we use interval arithmetic and introduce an innovative formula for the interval distance of interval numbers to create an interval target-based normalization technique. Furthermore, we use a pairwise preference matrix based on the degree of preference of interval numbers to calculate the maximum, minimum, and ranking of these numbers. Two decision-making problems regarding biomaterials selection for hip and knee prostheses are discussed. Preference-degree-based ranking lists for the subordinate parts of the extended MULTIMOORA method are generated by calculating the relative degrees of preference for the arranged assessment values of the biomaterials. The resulting rankings are compared with the outcomes of other target-based models in the literature.

  11. Silicon Phthalocyanines Axially Disubstituted with Erlotinib toward Small-Molecular-Target-Based Photodynamic Therapy.

    PubMed

    Chen, Juan-Juan; Huang, Yi-Zhen; Song, Mei-Ru; Zhang, Zhi-Hong; Xue, Jin-Ping

    2017-09-21

    Small-molecular-target-based photodynamic therapy, a promising targeted anticancer strategy, was developed by conjugating zinc(II) phthalocyanine with a small-molecular-target-based anticancer drug. To prevent self-aggregation and avoid problems of phthalocyanine isomerization, two silicon phthalocyanines axially disubstituted with erlotinib have been synthesized and fully characterized. These conjugates are present in monomeric form in various solvents as well as in culture media. Cell-based experiments showed that these conjugates localize in lysosomes and mitochondria while maintaining high photodynamic activities (IC50 values as low as 8 nM under a light dose of 1.5 J cm^-2). With erlotinib as the targeting moiety, the two conjugates were found to exhibit high specificity for EGFR-overexpressing cancer cells. Different poly(ethylene glycol) (PEG) linker lengths were shown to affect the photophysical/photochemical properties and the in vitro phototoxicity. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Comparison of Reliability Measures under Factor Analysis and Item Response Theory

    ERIC Educational Resources Information Center

    Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng

    2012-01-01

    Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…

  13. Real-time reliability measure-driven multi-hypothesis tracking using 2D and 3D features

    NASA Astrophysics Data System (ADS)

    Zúñiga, Marcos D.; Brémond, François; Thonnat, Monique

    2011-12-01

    We propose a new multi-target tracking approach that reliably tracks multiple objects even with poor segmentation results from noisy environments. The approach takes advantage of a new dual object model combining 2D and 3D features through reliability measures. To obtain these 3D features, a new classifier associates with each moving region an object class label (e.g., person, vehicle), a parallelepiped model, and visual reliability measures of its attributes. These reliability measures properly weight the contribution of noisy, erroneous, or false data in order to better maintain the integrity of the object dynamics model. A new multi-target tracking algorithm then uses these object descriptions to generate tracking hypotheses about the objects moving in the scene. This tracking approach is able to manage many-to-many visual target correspondences. To achieve this, the algorithm takes advantage of the 3D models to merge dissociated visual evidence (moving regions) potentially corresponding to the same real object, according to previously obtained information. The tracking approach has been validated on publicly accessible video surveillance benchmarks. It runs in real time, and the results are competitive with other tracking algorithms, with minimal (or no) reconfiguration effort between different videos.

  14. Validity and reliability of criterion based clinical audit to assess obstetrical quality of care in West Africa.

    PubMed

    Pirkle, Catherine M; Dumont, Alexandre; Traore, Mamadou; Zunzunegui, Maria-Victoria

    2012-10-29

    In Mali and Senegal, over 1% of women die giving birth in hospital. At some hospitals, over a third of infants are stillborn. Many deaths are due to substandard medical practices. Criterion-based clinical audits (CBCA) are increasingly used to measure and improve obstetrical care in resource-limited settings, but their measurement properties have not been formally evaluated. In 2011, we published a systematic review of obstetrical CBCA highlighting insufficient considerations of validity and reliability. The objective of this study is to develop an obstetrical CBCA adapted to the West African context and assess its reliability and validity. This work was conducted as a sub-study within a cluster randomized trial known as QUARITE. Criteria were selected based on extensive literature review and expert opinion. Early 2010, two auditors applied the CBCA to identical samples at 8 sites in Mali and Senegal (n = 185) to evaluate inter-rater reliability. In 2010-11, we conducted CBCA at 32 hospitals to assess construct validity (n = 633 patients). We correlated hospital characteristics (resource availability, facility perinatal and maternal mortality) with mean hospital CBCA scores. We used generalized estimating equations to assess whether patient CBCA scores were associated with perinatal mortality. Results demonstrate substantial (ICC = 0.67, 95% CI 0.54; 0.76) to elevated inter-rater reliability (ICC = 0.84, 95% CI 0.77; 0.89) in Senegal and Mali, respectively. Resource availability positively correlated with mean hospital CBCA scores and maternal and perinatal mortality were inversely correlated with hospital CBCA scores. Poor CBCA scores, adjusted for hospital and patient characteristics, were significantly associated with perinatal mortality (OR 1.84, 95% CI 1.01-3.34). Our CBCA has substantial inter-rater reliability and there is compelling evidence of its validity as the tool performs according to theory. Current Controlled Trials ISRCTN46950658.

  15. Infrared moving small target detection based on saliency extraction and image sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaomin; Ren, Kan; Gao, Jin; Li, Chaowei; Gu, Guohua; Wan, Minjie

    2016-10-01

    Moving small target detection in infrared images is a crucial technique of infrared search and tracking systems. This paper presents a novel small target detection technique based on frequency-domain saliency extraction and image sparse representation. First, we exploit the features of the Fourier spectrum image and the magnitude spectrum of the Fourier transform to make a rough extraction of saliency regions, and use threshold segmentation to separate the regions that look salient from the background, which yields a binary image. Second, a new patch-image model and an over-complete dictionary are introduced into the detection system, converting infrared small target detection into an optimization problem of patch-image information reconstruction based on sparse representation. More specifically, the test image and the binary image are decomposed into image patches following certain rules. We select the potential target areas according to the binary patch-image, which contains the salient region information, and then exploit the over-complete infrared small target dictionary to reconstruct the test image blocks that may contain targets; the coefficients of a target image patch are sparse. Finally, for image sequences, Euclidean distance is used to reduce the false alarm ratio and increase the detection accuracy of moving small targets in infrared images, exploiting the correlation of target positions between frames.
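
    The patch-level sparse coding step can be sketched with orthogonal matching pursuit over a random over-complete dictionary (hypothetical dictionary and patch; the method above builds its dictionary from infrared small target samples):

        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        rng = np.random.default_rng(7)
        # Hypothetical over-complete dictionary of 8x8 patches (64-dim atoms).
        D = rng.normal(size=(64, 256))
        D /= np.linalg.norm(D, axis=0)

        def sparse_code(patch, n_nonzero=5):
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero,
                                            fit_intercept=False)
            omp.fit(D, patch.ravel())
            return omp.coef_

        target_patch = D[:, 3] * 2.0 + rng.normal(0, 0.05, 64)  # built from atom 3
        coef = sparse_code(target_patch.reshape(8, 8))
        resid = np.linalg.norm(target_patch - D @ coef)
        # Target-containing patches reconstruct from few atoms with small
        # residual; background clutter does not, which drives the decision.
        print("support:", np.flatnonzero(coef), "residual:", round(float(resid), 3))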

  16. Reliability and validity in a nutshell.

    PubMed

    Bannigan, Katrina; Watson, Roger

    2009-12-01

    To explore and explain the different concepts of reliability and validity as they relate to measurement instruments in social science and health care. The terms reliability and validity encompass several distinct concepts, which are often explained poorly and frequently confused with one another. To develop some clarity about reliability and validity, a conceptual framework was built based on the existing literature. The concepts of reliability, validity, and utility are explored and explained. Reliability contains the concepts of internal consistency and of stability and equivalence. Validity contains the concepts of content, face, criterion, concurrent, predictive, construct, convergent (and divergent), factorial, and discriminant validity. In addition, for clinical practice and research, it is essential to establish the utility of a measurement instrument. To use measurement instruments appropriately in clinical practice, the extent to which they are reliable, valid, and usable must be established.

  17. Electrical service reliability: the customer perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samsa, M.E.; Hub, K.A.; Krohm, G.C.

    1978-09-01

    Electric-utility-system reliability criteria have traditionally been established as a matter of utility policy or through long-term engineering practice, generally with no supportive customer cost/benefit analysis as justification. This report presents results of an initial study of the customer perspective toward electric-utility-system reliability, based on a critical review of over 20 previous and ongoing efforts to quantify the customer's value of reliable electric service. A possible structure of customer classifications is suggested as a reasonable level of disaggregation for further investigation of customer value, and these groups are characterized in terms of their electricity use patterns. The values that customers assign to reliability are discussed in terms of internal and external cost components. A list of options for effecting changes in customer service reliability is set forth, and some of the many policy issues that could alter customer-service reliability are identified.

  18. Research of maneuvering target prediction and tracking technology based on IMM algorithm

    NASA Astrophysics Data System (ADS)

    Cao, Zheng; Mao, Yao; Deng, Chao; Liu, Qiong; Chen, Jing

    2016-09-01

    Maneuvering target prediction and tracking technology is widely used in both military and civilian applications, and its study has long been a research hotspot and a difficult problem. In electro-optical acquisition-tracking-pointing (ATP) systems, the traditional maneuvering targets are ballistic targets, large aircraft, and other large objects. Such targets move fast along strongly regular trajectories, and Kalman filtering and polynomial fitting track them well. In recent years, small unmanned aerial vehicles have developed rapidly because they are small, nimble, and simple to operate. Although they appear as close-in, slow, and small targets in an ATP observation system, they have strong maneuverability. Moreover, they are manually operated, so their acceleration changes greatly and they move erratically. Consequently, prediction and tracking precision is low when traditional algorithms are used to track maneuvers such as speeding up, turning, and climbing. The interacting multiple model (IMM) algorithm uses multiple models, with interaction between them, to match the target's real movement trajectory, and it switches models based on a Markov chain to adapt to changes in the trajectory; its better adaptability to irregular movement makes it well suited to the prediction and tracking of small unmanned aerial vehicles. This paper sets up a model set comprising a constant velocity (CV) model, a constant acceleration (CA) model, a constant turning (CT) model, and a current statistical model. Simulation and analysis of real movement trajectory data of small unmanned aerial vehicles show that prediction and tracking based on the interacting multiple model algorithm achieve relatively low tracking error and improved tracking precision.
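
    The core IMM bookkeeping, Markov-chain mixing of the model probabilities followed by a Bayes update with each model's filter likelihood, can be sketched for a two-model (CV/CA) set; the transition matrix and likelihood sequence below are illustrative, not taken from the paper:

        import numpy as np

        P_trans = np.array([[0.95, 0.05],   # Markov chain: P(model_j | model_i)
                            [0.05, 0.95]])
        mu = np.array([0.5, 0.5])           # current model probabilities (CV, CA)

        def imm_step(mu, likelihoods):
            """One IMM cycle of mixing + model-probability update.
            likelihoods[j] = p(measurement | model j), from each model's filter."""
            c_pred = P_trans.T @ mu         # predicted model probabilities
            mu_new = likelihoods * c_pred   # Bayes update with filter likelihoods
            return mu_new / mu_new.sum()

        # A sudden turn makes the CA model explain the data better for a while:
        for lik in ([0.9, 0.1], [0.6, 0.4], [0.1, 0.9], [0.1, 0.9]):
            mu = imm_step(mu, np.array(lik))
            print(np.round(mu, 3))          # probability mass migrates to CA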

  19. Improved Tumor Targeting of Polymer-based Nanovesicles Using Polymer-Lipid Blends

    PubMed Central

    Cheng, Zhiliang; Elias, Drew R.; Kamat, Neha P.; Johnston, Eric D.; Poloukhtine, Andrei; Popik, Vladimir; Hammer, Daniel A.; Tsourkas, Andrew

    2011-01-01

    Block copolymer-based vesicles have recently garnered a great deal of interest as nanoplatforms for drug delivery and molecular imaging applications due to their unique structural properties. These nanovesicles have been shown to direct their cargo to disease sites either through enhanced permeability and retention or even more efficiently via active targeting. Here we show that the efficacy of nanovesicle targeting can be significantly improved when prepared from polymer-lipid blends compared with block copolymer alone. Polymer-lipid hybrid nanovesicles were produced from the aqueous co-assembly of the diblock copolymer, poly(ethylene oxide)-block-polybutadiene (PEO-PBD), and the phospholipid, hydrogenated soy phosphatidylcholine (HSPC). The PEG-based vesicles, 117 nm in diameter, were functionalized with either folic acid or anti-HER2/neu affibodies as targeting ligands to confer specificity for cancer cells. Our results revealed that nanovesicles prepared from polymer-lipid blends led to significant improvement in cell binding compared to nanovesicles prepared from block copolymer alone in both in vitro cell studies and murine tumor models. Therefore, it is envisioned that nanovesicles composed of polymer-lipid blends may constitute a preferred embodiment for targeted drug delivery and molecular imaging applications. PMID:21899335

  20. Targeted peptide measurements in biology and medicine: best practices for mass spectrometry-based assay development using a fit-for-purpose approach.

    PubMed

    Carr, Steven A; Abbatiello, Susan E; Ackermann, Bradley L; Borchers, Christoph; Domon, Bruno; Deutsch, Eric W; Grant, Russell P; Hoofnagle, Andrew N; Hüttenhain, Ruth; Koomen, John M; Liebler, Daniel C; Liu, Tao; MacLean, Brendan; Mani, D R; Mansfield, Elizabeth; Neubert, Hendrik; Paulovich, Amanda G; Reiter, Lukas; Vitek, Olga; Aebersold, Ruedi; Anderson, Leigh; Bethem, Robert; Blonder, Josip; Boja, Emily; Botelho, Julianne; Boyne, Michael; Bradshaw, Ralph A; Burlingame, Alma L; Chan, Daniel; Keshishian, Hasmik; Kuhn, Eric; Kinsinger, Christopher; Lee, Jerry S H; Lee, Sang-Won; Moritz, Robert; Oses-Prieto, Juan; Rifai, Nader; Ritchie, James; Rodriguez, Henry; Srinivas, Pothur R; Townsend, R Reid; Van Eyk, Jennifer; Whiteley, Gordon; Wiita, Arun; Weintraub, Susan

    2014-03-01

    Adoption of targeted mass spectrometry (MS) approaches such as multiple reaction monitoring (MRM) to study biological and biomedical questions is well underway in the proteomics community. Successful application depends on the ability to generate reliable assays that uniquely and confidently identify target peptides in a sample. Unfortunately, there is a wide range of criteria being applied to say that an assay has been successfully developed. There is no consensus on what criteria are acceptable and little understanding of the impact of variable criteria on the quality of the results generated. Publications describing targeted MS assays for peptides frequently do not contain sufficient information for readers to establish confidence that the tests work as intended or to be able to apply the tests described in their own labs. Guidance must be developed so that targeted MS assays with established performance can be widely distributed and applied by many labs worldwide. To begin to address the problems and their solutions, a workshop was held at the National Institutes of Health with representatives from the multiple communities developing and employing targeted MS assays. Participants discussed the analytical goals of their experiments and the experimental evidence needed to establish that the assays they develop work as intended and are achieving the required levels of performance. Using this "fit-for-purpose" approach, the group defined three tiers of assays distinguished by their performance and extent of analytical characterization. Computational and statistical tools useful for the analysis of targeted MS results were described. Participants also detailed the information that authors need to provide in their manuscripts to enable reviewers and readers to clearly understand what procedures were performed and to evaluate the reliability of the peptide or protein quantification measurements reported. This paper presents a summary of the meeting and

  1. Targeted Peptide Measurements in Biology and Medicine: Best Practices for Mass Spectrometry-based Assay Development Using a Fit-for-Purpose Approach*

    PubMed Central

    Carr, Steven A.; Abbatiello, Susan E.; Ackermann, Bradley L.; Borchers, Christoph; Domon, Bruno; Deutsch, Eric W.; Grant, Russell P.; Hoofnagle, Andrew N.; Hüttenhain, Ruth; Koomen, John M.; Liebler, Daniel C.; Liu, Tao; MacLean, Brendan; Mani, DR; Mansfield, Elizabeth; Neubert, Hendrik; Paulovich, Amanda G.; Reiter, Lukas; Vitek, Olga; Aebersold, Ruedi; Anderson, Leigh; Bethem, Robert; Blonder, Josip; Boja, Emily; Botelho, Julianne; Boyne, Michael; Bradshaw, Ralph A.; Burlingame, Alma L.; Chan, Daniel; Keshishian, Hasmik; Kuhn, Eric; Kinsinger, Christopher; Lee, Jerry S.H.; Lee, Sang-Won; Moritz, Robert; Oses-Prieto, Juan; Rifai, Nader; Ritchie, James; Rodriguez, Henry; Srinivas, Pothur R.; Townsend, R. Reid; Van Eyk, Jennifer; Whiteley, Gordon; Wiita, Arun; Weintraub, Susan

    2014-01-01

    Adoption of targeted mass spectrometry (MS) approaches such as multiple reaction monitoring (MRM) to study biological and biomedical questions is well underway in the proteomics community. Successful application depends on the ability to generate reliable assays that uniquely and confidently identify target peptides in a sample. Unfortunately, there is a wide range of criteria being applied to say that an assay has been successfully developed. There is no consensus on what criteria are acceptable and little understanding of the impact of variable criteria on the quality of the results generated. Publications describing targeted MS assays for peptides frequently do not contain sufficient information for readers to establish confidence that the tests work as intended or to be able to apply the tests described in their own labs. Guidance must be developed so that targeted MS assays with established performance can be widely distributed and applied by many labs worldwide. To begin to address the problems and their solutions, a workshop was held at the National Institutes of Health with representatives from the multiple communities developing and employing targeted MS assays. Participants discussed the analytical goals of their experiments and the experimental evidence needed to establish that the assays they develop work as intended and are achieving the required levels of performance. Using this “fit-for-purpose” approach, the group defined three tiers of assays distinguished by their performance and extent of analytical characterization. Computational and statistical tools useful for the analysis of targeted MS results were described. Participants also detailed the information that authors need to provide in their manuscripts to enable reviewers and readers to clearly understand what procedures were performed and to evaluate the reliability of the peptide or protein quantification measurements reported. This paper presents a summary of the meeting and

  2. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing a non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between a static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
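
    The ARMA step of the framework can be sketched with statsmodels (assumed to be available) on a hypothetical response-time history; the mapping from mean response time to an NMSPN firing rate shown here is a simplification of the paper's procedure:

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(11)
        # Hypothetical history of one service component's response times (ms).
        rt = 120 + np.cumsum(rng.normal(0, 1, 200)) * 0.1 + rng.normal(0, 3, 200)

        model = ARIMA(rt, order=(1, 0, 1))     # ARMA(1,1) as ARIMA(1,0,1)
        fitted = model.fit()
        forecast = fitted.forecast(steps=10)   # predicted future response times

        # Simplified: firing rate of the NMSPN transition = 1 / mean response time.
        firing_rate = 1.0 / forecast.mean()
        print(f"predicted firing rate: {firing_rate:.5f} per ms")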

  3. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing a non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between a static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429

  4. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.
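
    The reliability-analysis side can be illustrated by a first-order reliability computation: beta = min ||u|| subject to g(u) = 0 in standard normal space, solved here with a generic constrained optimizer and a toy fracture limit state (not the paper's explicit Kuhn-Tucker expressions):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):
            """Toy fracture margin: fails when a stress-intensity-like quantity
            reaches a critical value. Distributions are hypothetical."""
            load = 10.0 + 2.0 * u[0]              # load: mean 10, s.d. 2
            crack = 0.02 * np.exp(0.25 * u[1])    # lognormal crack length
            K = load * np.sqrt(np.pi * crack)     # stress-intensity-like quantity
            return 3.0 - K                        # failure when margin hits zero

        res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                       constraints={"type": "eq", "fun": g})
        beta = np.sqrt(res.fun)                   # reliability index
        print(f"beta = {beta:.2f}, Pf ~ {norm.cdf(-beta):.4f}")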

  5. Reliable Freestanding Position-Based Routing in Highway Scenarios

    PubMed Central

    Galaviz-Mosqueda, Gabriel A.; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering freespace propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159

  6. Reliable freestanding position-based routing in highway scenarios.

    PubMed

    Galaviz-Mosqueda, Gabriel A; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-10-24

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering free-space propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model.
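
    The abstract does not spell out FPBR's forwarding rule, so the sketch below shows only the generic greedy step that position-based protocols of this kind build on: pick the in-range neighbor geographically closest to the destination, and signal a local maximum when no neighbor makes progress (all names and coordinates are hypothetical):

    ```python
    # Generic greedy geographic forwarding step (not FPBR itself).
    from math import hypot

    def greedy_next_hop(me, neighbors, dest, radio_range):
        """me/dest: (x, y); neighbors: dict node_id -> (x, y)."""
        my_dist = hypot(dest[0] - me[0], dest[1] - me[1])
        best, best_dist = None, my_dist
        for node, pos in neighbors.items():
            if hypot(pos[0] - me[0], pos[1] - me[1]) > radio_range:
                continue  # neighbor is out of communication range
            d = hypot(dest[0] - pos[0], dest[1] - pos[1])
            if d < best_dist:  # strictly closer to the destination than we are
                best, best_dist = node, d
        return best  # None signals a local maximum (recovery mode needed)

    print(greedy_next_hop((0, 0), {"a": (120, 10), "b": (250, 0)}, (1000, 0), 300))
    ```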

  7. 76 FR 40722 - Granite Reliable Power, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-11

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-3941-000] Granite Reliable Power, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket Section 204 Authorization This is a supplemental notice in the above-referenced proceeding of Granite...

  8. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X Window-based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run-time simulation of the system-wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
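
    The modular composition described above reduces, in its simplest form, to combining per-module reliabilities in series and in parallel; a minimal sketch (the numbers are invented, and RML's actual semantics with FMES and message passing are far richer):

    ```python
    # Combine per-module reliabilities: series = all must work,
    # parallel = at least one redundant copy must work.
    from math import prod

    def series(reliabilities):
        return prod(reliabilities)

    def parallel(reliabilities):
        return 1.0 - prod(1.0 - r for r in reliabilities)

    sensor_block = parallel([0.95, 0.95])          # two redundant sensors
    system = series([sensor_block, 0.999, 0.99])   # sensor block + bus + processor
    print(f"{system:.4f}")
    ```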

  9. Increasing Small Satellite Reliability- A Public-Private Initiative

    NASA Technical Reports Server (NTRS)

    Johnson, Michael A.; Beauchamp, Patricia; Schone, Harald; Sheldon, Doug; Fuhrman, Linda; Sullivan, Erica; Fairbanks, Tom; Moe, Miquel; Leitner, Jesse

    2017-01-01

    At present, CubeSat components and buses are generally appropriate only for missions where significant risk of failure, or the inability to quantify risk or confidence, is acceptable. However, in the future we anticipate that CubeSats will be used for missions requiring reliability of 1-3 years for Earth-observing missions and even longer for Planetary, Heliophysics, and Astrophysics missions. Their growing potential utility is driving an interagency effort to improve and quantify CubeSat reliability and, more generally, small satellite mission risk. The Small Satellite Reliability Initiative (SSRI)—an ongoing activity with broad collaborative participation from civil, DoD, and commercial space systems providers and stakeholders—targets this challenge. The Initiative seeks to define implementable and broadly accepted approaches to achieve reliability and acceptable risk postures across several SmallSat mission risk classes—from “do no harm” missions to those whose failure would result in the loss or delay of key national objectives. These approaches will maintain, to the extent practical, the cost efficiencies associated with small satellite missions and consider constraints associated with supply chain elements, as appropriate. The SSRI addresses this challenge at two architectural levels—the mission or system level and the component or subsystem level. The mission- or system-level scope targets assessment approaches that are efficient and effective, with mitigation strategies that facilitate resiliency to mission or system anomalies, while the component- or subsystem-level scope addresses the challenge at lower architectural levels. The initiative does not limit strategies and approaches to proven and traditional methodologies, but is focused on fomenting thought on novel and innovative solutions. This paper discusses the genesis of and drivers for this initiative, and how the public-private collaboration is being executed

  10. Reliability Correction for Functional Connectivity: Theory and Implementation

    PubMed Central

    Mueller, Sophia; Wang, Danhong; Fox, Michael D.; Pan, Ruiqi; Lu, Jie; Li, Kuncheng; Sun, Wei; Buckner, Randy L.; Liu, Hesheng

    2016-01-01

    Network properties can be estimated using functional connectivity MRI (fcMRI). However, regional variation of the fMRI signal causes systematic biases in network estimates including correlation attenuation in regions of low measurement reliability. Here we computed the spatial distribution of fcMRI reliability using longitudinal fcMRI datasets and demonstrated how pre-estimated reliability maps can correct for correlation attenuation. As a test case of reliability-based attenuation correction we estimated properties of the default network, where reliability was significantly lower than average in the medial temporal lobe and higher in the posterior medial cortex, heterogeneity that impacts estimation of the network. Accounting for this bias using attenuation correction revealed that the medial temporal lobe’s contribution to the default network is typically underestimated. To render this approach useful to a greater number of datasets, we demonstrate that test-retest reliability maps derived from repeated runs within a single scanning session can be used as a surrogate for multi-session reliability mapping. Using data segments with different scan lengths between 1 and 30 min, we found that test-retest reliability of connectivity estimates increases with scan length while the spatial distribution of reliability is relatively stable even at short scan lengths. Finally, analyses of tertiary data revealed that reliability distribution is influenced by age, neuropsychiatric status and scanner type, suggesting that reliability correction may be especially important when studying between-group differences. Collectively, these results illustrate that reliability-based attenuation correction is an easily implemented strategy that mitigates certain features of fMRI signal nonuniformity. PMID:26493163
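
    In spirit, the per-connection correction is Spearman's classical disattenuation formula, dividing an observed correlation by the square root of the product of the two regional reliabilities; a sketch with made-up numbers:

    ```python
    # Correct an observed fcMRI correlation for attenuation caused by
    # regional measurement unreliability (Spearman's disattenuation).
    import numpy as np

    def disattenuate(r_observed, rel_x, rel_y):
        """Attenuation-corrected correlation given two reliability estimates."""
        return r_observed / np.sqrt(rel_x * rel_y)

    # e.g. a medial-temporal connection measured with low regional reliability:
    print(disattenuate(r_observed=0.25, rel_x=0.45, rel_y=0.80))  # ~0.42
    ```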

  11. Feasibility, reliability, and validity of a smartphone based application for the assessment of cognitive function in the elderly.

    PubMed

    Brouillette, Robert M; Foil, Heather; Fontenot, Stephanie; Correro, Anthony; Allen, Ray; Martin, Corby K; Bruce-Keller, Annadora J; Keller, Jeffrey N

    2013-01-01

    While considerable knowledge has been gained through the use of established cognitive and motor assessment tools, there is considerable interest and need for the development of a battery of reliable and validated assessment tools that provide real-time and remote analysis of cognitive and motor function in the elderly. Smartphones appear to be an obvious choice for the development of these "next-generation" assessment tools for geriatric research, although to date no studies have reported on the use of smartphone-based applications for the study of cognition in the elderly. The primary focus of the current study was to assess the feasibility, reliability, and validity of a smartphone-based application for the assessment of cognitive function in the elderly. A total of 57 non-demented elderly individuals were administered a newly developed smartphone application-based Color-Shape Test (CST) in order to determine its utility in measuring cognitive processing speed in the elderly. Validity of this novel cognitive task was assessed by correlating performance on the CST with scores on widely accepted assessments of cognitive function. Scores on the CST were significantly correlated with global cognition (Mini-Mental State Exam: r = 0.515, p<0.0001) and multiple measures of processing speed and attention (Digit Span: r = 0.427, p<0.0001; Trail Making Test: r = -0.651, p<0.00001; Digit Symbol Test: r = 0.508, p<0.0001). The CST was not correlated with naming and verbal fluency tasks (Boston Naming Test, Vegetable/Animal Naming) or memory tasks (Logical Memory Test). Test-retest reliability was observed to be significant (r = 0.726; p = 0.02). Together, these data are the first to demonstrate the feasibility, reliability, and validity of using a smartphone-based application for the purpose of assessing cognitive function in the elderly. The importance of these findings for the establishment of smartphone-based assessment batteries of cognitive

  12. Comparison of provider and plan-based targeting strategies for disease management.

    PubMed

    Annis, Ann M; Holtrop, Jodi Summers; Tao, Min; Chang, Hsiu-Ching; Luo, Zhehui

    2015-05-01

    We aimed to describe and contrast the targeting methods and engagement outcomes for health plan-delivered disease management with those of a provider-delivered care management program. Health plan epidemiologists partnered with university health services researchers to conduct a quasi-experimental, mixed-methods study of a 2-year pilot. We used semi-structured interviews to assess the characteristics of program-targeting strategies, and calculated target and engagement rates from clinical encounter data. Five physician organizations (POs) with 51 participating practices implemented care management. Health plan member lists were sent monthly to the practices to accept patients, and then the practices sent back data reports regarding targeting and engagement in care management. Among patients accepted by the POs, we compared those who were targeted and engaged by POs with those who met health plan targeting criteria. The health plan's targeting process combined claims algorithms and employer group preferences to identify candidates for disease management; on the other hand, several different factors influenced PO practices' targeting approaches, including clinical and personal knowledge of the patients, health assessment information, and availability of disease-relevant programs. Practices targeted a higher percentage of patients for care management than the health plan (38% vs 16%), where only 7% of these patients met the targeting criteria of both. Practices engaged a higher percentage of their targeted patients than the health plan (50% vs 13%). The health plan's claims-driven targeting approach and the clinically based strategies of practices both provide advantages; an optimal model may be to combine the strengths of each approach to maximize benefits in care management.

  13. The Accessibility, Usability, and Reliability of Chinese Web-Based Information on HIV/AIDS

    PubMed Central

    Niu, Lu; Luo, Dan; Liu, Ying; Xiao, Shuiyuan

    2016-01-01

    Objective: The present study was designed to assess the quality of Chinese-language Internet-based information on HIV/AIDS. Methods: We entered the following search terms, in Chinese, into Baidu and Sogou: “HIV/AIDS”, “symptoms”, and “treatment”, and evaluated the first 50 hits of each query using the Minervation validation instrument (LIDA tool) and the DISCERN instrument. Results: Of the 900 hits identified, 85 websites were included in this study. The overall score on the LIDA tool was 63.7%; the mean scores for accessibility, usability, and reliability were 82.2%, 71.5%, and 27.3%, respectively. For the top 15 sites according to the LIDA score, the mean DISCERN score was 43.1 (95% confidence interval (CI) = 37.7–49.5). Noncommercial websites showed higher DISCERN scores than commercial websites, whereas commercial websites were more likely to be found in the first 20 links obtained from each search engine than the noncommercial websites. Conclusions: In general, the HIV/AIDS-related Chinese-language websites have poor reliability, although their accessibility and usability are fair. In addition, the treatment information presented on Chinese-language websites is far from sufficient. There is an imperative need for professionals and specialized institutes to improve the comprehensiveness of web-based information related to HIV/AIDS. PMID:27556475

  14. Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).

    PubMed

    Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Kim, Young Hwan; Yang, Yung-Hun; Lee, Jun Kyu; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan

    2015-01-01

    Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, i-QTaG using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of 13C6/12C6-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, this method was applied to a direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R(2) > 0.99) with the amount of the bovine fetuin glycoproteins. The ratios of relative intensity between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.

  15. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost-effective manner. The concept of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which was then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.
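
    The replicated-simulation idea can be illustrated with the basic execution-time (Goel-Okumoto type) model, whose mean value function is m(t) = a(1 - e^(-bt)); the parameters below are invented, not those used in AIR-LAB:

    ```python
    # Draw several synthetic "debugging sessions" from the basic model to see
    # the replication-to-replication spread a fitted model must contend with.
    import numpy as np

    rng = np.random.default_rng(0)
    a, b = 100.0, 0.02   # expected total faults, per-fault hazard (illustrative)

    def simulate_failure_times(a, b, rng):
        n_faults = rng.poisson(a)                     # faults in this replication
        return np.sort(rng.exponential(1.0 / b, size=n_faults))

    replications = [simulate_failure_times(a, b, rng) for _ in range(5)]
    for i, times in enumerate(replications):
        print(f"rep {i}: {len(times)} failures, first at t = {times[0]:.1f}")
    ```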

  16. Chitosan-based multifunctional nanomedicines and theranostics for targeted therapy of cancer.

    PubMed

    Fathi, Marziyeh; Majidi, Sima; Zangabad, Parham Sahandi; Barar, Jaleh; Erfan-Niya, Hamid; Omidi, Yadollah

    2018-05-30

    Nanotechnology as an emerging field has had inevitable impacts on nano-biomedicine and the treatment of formidable diseases, inflammations, and malignancies. In this regard, substantial advances in the design of systems for the delivery of therapeutic agents have opened innovative pathways in biomedical applications. Chitosan (CS) is derived via deacetylation of chitin, the second most abundant polysaccharide. Owing to the unique properties of CS (e.g., biocompatibility, biodegradability, bioactivity, mucoadhesion, cationic nature, and functional groups), it is an excellent candidate for diverse biomedical and pharmaceutical applications such as drug/gene delivery, transplantation of encapsulated cells, tissue engineering, wound healing, and antimicrobial purposes. In this review, we document, discuss, and provide key insights into the design and application of miscellaneous nanoplatforms based on CS. The CS-based nanosystems (NSs) can be employed as advanced drug delivery systems (DDSs) in large part due to their remarkable physicochemical and biological characteristics. The abundant functional groups of CS allow facile functionalization to engineer multifunctional NSs, which can simultaneously incorporate therapeutic agents, molecular targeting, and diagnostic/imaging capabilities, in particular against malignancies. These multimodal NSs can be translated into clinical applications such as targeted diagnosis and therapy of cancer because they offer minimal systemic toxicity and maximal cytotoxicity against cancer cells and tumors. The recent developments in CS-based NSs functionalized with targeting and imaging agents prove CS to be a versatile polymer in targeted imaging and therapy. © 2018 Wiley Periodicals, Inc.

  17. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Every part of the computer forensic tool is linked to a discrete-time Markov chain; a probabilistic analysis by Markov chains can then be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
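
    A minimal Cheung-style version of the calculation the abstract describes: components form a discrete-time Markov chain, each visit to component i succeeds with reliability R_i, and tool reliability is the probability of reaching successful termination. The matrix and reliabilities below are illustrative, not taken from the case study:

    ```python
    # Architecture-based reliability from a discrete-time Markov chain.
    import numpy as np

    R = np.array([0.999, 0.995, 0.990])        # per-component reliabilities
    P = np.array([[0.0, 0.7, 0.3],             # control-flow transition probabilities
                  [0.0, 0.0, 1.0],
                  [0.0, 0.2, 0.0]])            # component 2 also exits to "done" w.p. 0.8
    exit_to_done = np.array([0.0, 0.0, 0.8])

    Q = R[:, None] * P                         # survive component i, then transition
    b = R * exit_to_done                       # survive and terminate successfully
    x = np.linalg.solve(np.eye(3) - Q, b)      # success probability per start component
    print(f"system reliability starting at component 0: {x[0]:.4f}")
    ```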

  18. An N-targeting real-time PCR strategy for the accurate detection of spring viremia of carp virus.

    PubMed

    Shao, Ling; Xiao, Yu; He, Zhengkan; Gao, Longying

    2016-03-01

    Spring viremia of carp virus (SVCV) is a highly pathogenic agent of several economically important Cyprinidae fish species. Currently, there are no effective vaccines or drugs for this virus, and prevention of the disease mostly relies on prompt diagnosis. Previously, nested RT-PCR and RT-qPCR detection methods based on the glycoprotein gene G have been developed. However, the high genetic diversity of the G gene seriously limits the reliability of those methods. Compared with the G gene, phylogenetic analyses indicate that the nucleoprotein gene N is more conserved. Furthermore, studies in other members of the family Rhabdoviridae reveal that their gene transcription levels follow the order N>P>M>G>L, indicating that an N gene-based RT-PCR should have higher sensitivity. Therefore, two pairs of primers and two corresponding probes targeting the conserved regions of the N gene were designed. RT-qPCR assays demonstrated that all primers and probes could detect phylogenetically distant isolates specifically and efficiently. Moreover, in artificially infected fish, the detected copy numbers of the N gene were much higher than those of the G gene in all tissues, and both the N and G gene copy numbers were highest in the kidney and spleen. Testing in 1100 farm-raised fish also showed that the N-targeting strategy was more reliable than the G-targeting methods. The method developed in this study provides a reliable tool for the rapid diagnosis of SVCV. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Parts and Components Reliability Assessment: A Cost Effective Approach

    NASA Technical Reports Server (NTRS)

    Lee, Lydia

    2009-01-01

    System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), to assess risks, perform design tradeoffs, and thereby ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standards-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by the United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages when the system design is still in development and hard failure data is not yet available, or when manufacturers are not contractually obliged by their customers to publish reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success in an efficient manner, at low cost, and on a tight schedule.
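
    For constant part failure rates of the kind tabulated in the military and commercial handbooks mentioned above, the series-system arithmetic is a one-liner; the rates below are invented for illustration:

    ```python
    # Handbook-style series-system reliability from constant failure rates
    # (failures per million hours, as tabulated in standards like MIL-HDBK-217).
    import math

    part_rates_fpmh = {"processor": 2.1, "power_supply": 5.4, "connector": 0.3}
    lam = sum(part_rates_fpmh.values()) * 1e-6     # total failures per hour
    mission_hours = 8760                           # one year of operation

    reliability = math.exp(-lam * mission_hours)
    mtbf_hours = 1.0 / lam
    print(f"R(1 yr) = {reliability:.4f}, MTBF = {mtbf_hours:,.0f} h")
    ```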

  20. Aircraft target detection algorithm based on high resolution spaceborne SAR imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Hao, Mengxi; Zhang, Cong; Su, Xiaojing

    2018-03-01

    In this paper, an image classification algorithm for airport areas is proposed, based on the statistical features of synthetic aperture radar (SAR) images and the spatial information of pixels. The algorithm combines a Gamma mixture model with a Markov random field (MRF): the Gamma mixture model produces an initial classification result, which is then optimized by the MRF technique using the spatial correlation between pixels. Additionally, morphology methods are employed to extract the airport region of interest (ROI), in which suspected aircraft target samples are clarified to reduce false alarms and increase detection performance. Finally, this paper presents the aircraft target detection results, which have been verified by simulation tests.
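
    A sketch of the initial-classification step only, with invented Gamma parameters (the paper estimates them from the SAR data, then refines labels with the MRF and morphology steps):

    ```python
    # Pixelwise maximum-posterior labeling under a two-component Gamma mixture.
    import numpy as np
    from scipy.stats import gamma

    pixels = np.array([0.4, 1.2, 3.5, 0.8, 5.1, 2.9])   # SAR amplitudes
    priors = np.array([0.7, 0.3])                        # background, man-made
    shapes, scales = np.array([2.0, 6.0]), np.array([0.5, 0.8])

    log_post = np.log(priors) + gamma.logpdf(pixels[:, None], shapes, scale=scales)
    labels = log_post.argmax(axis=1)                     # 0 = background, 1 = target-like
    print(labels)                                        # MRF smoothing would follow
    ```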

  1. The Yale-Brown Obsessive Compulsive Scale: A Reliability Generalization Meta-Analysis.

    PubMed

    López-Pina, José Antonio; Sánchez-Meca, Julio; López-López, José Antonio; Marín-Martínez, Fulgencio; Núñez-Núñez, Rosa Maria; Rosa-Alcázar, Ana I; Gómez-Conesa, Antonia; Ferrer-Requena, Josefa

    2015-10-01

    The Yale-Brown Obsessive Compulsive Scale (Y-BOCS) is the most frequently applied test to assess obsessive compulsive symptoms. We conducted a reliability generalization meta-analysis on the Y-BOCS to estimate the average reliability, examine the variability among the reliability estimates, search for moderators, and propose a predictive model that researchers and clinicians can use to estimate the expected reliability of the Y-BOCS. We included studies in which the Y-BOCS was applied to a sample of adults and a reliability estimate was reported. Out of the 11,490 references located, 144 studies met the selection criteria. For the total scale, the mean reliability was 0.866 for coefficients alpha, 0.848 for test-retest correlations, and 0.922 for intraclass correlations. The moderator analyses led to a predictive model in which the standard deviation of the total test and the target population (clinical vs. nonclinical) explained 38.6% of the total variability among coefficients alpha. Finally, clinical implications of the results are discussed. © The Author(s) 2014.

  2. Test-retest reliability and stability of N400 effects in a word-pair semantic priming paradigm.

    PubMed

    Kiang, Michael; Patriciu, Iulia; Roy, Carolyn; Christensen, Bruce K; Zipursky, Robert B

    2013-04-01

    Elicited by any meaningful stimulus, the N400 event-related potential (ERP) component is reduced when the stimulus is related to a preceding one. This N400 semantic priming effect has been used to probe abnormal semantic relationship processing in clinical disorders, and suggested as a possible biomarker for treatment studies. Validating N400 semantic priming effects as a clinical biomarker requires characterizing their test-retest reliability. We assessed test-retest reliability of N400 semantic priming in 16 healthy adults who viewed the same related and unrelated prime-target word pairs in two sessions one week apart. As expected, N400 amplitudes were smaller for related versus unrelated targets across sessions. N400 priming effects (amplitude differences between unrelated and related targets) were highly correlated across sessions (r=0.85, P<0.0001), but smaller in the second session due to larger N400s to related targets. N400 priming effects have high reliability over a one-week interval. They may decrease with repeat testing, possibly because of motivational changes. Use of N400 priming effects in treatment studies should account for possible magnitude decreases with repeat testing. Further research is needed to delineate N400 priming effects' test-retest reliability and stability in different age and clinical groups, and with different stimulus types. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed around redundancy, with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, are typically the most reliable. Meanwhile, components that tend to fail earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs and the mass of ORU subcomponents to reliability.

  4. The ventriloquist in periphery: impact of eccentricity-related reliability on audio-visual localization.

    PubMed

    Charbonneau, Geneviève; Véronneau, Marie; Boudrias-Fournier, Colin; Lepore, Franco; Collignon, Olivier

    2013-10-28

    The relative reliability of separate sensory estimates influences the way they are merged into a unified percept. We investigated how eccentricity-related changes in the reliability of auditory and visual stimuli influence their integration across the entire frontal space. First, we surprisingly found that despite a strong decrease in auditory and visual unisensory localization abilities in the periphery, the redundancy gain resulting from the congruent presentation of audio-visual targets was not affected by stimulus eccentricity. This result therefore contrasts with the common prediction that a reduction in sensory reliability necessarily induces an enhanced integrative gain. Second, we demonstrate that the visual capture of sounds observed with spatially incongruent audio-visual targets (the ventriloquist effect) steadily decreases with eccentricity, paralleling a lowering of the relative reliability of unimodal visual over unimodal auditory stimuli in the periphery. Moreover, at all eccentricities, the ventriloquist effect positively correlated with a weighted combination of the spatial resolution obtained in unisensory conditions. These findings support and extend the view that the localization of audio-visual stimuli relies on an optimal combination of auditory and visual information according to their respective spatial reliability. Altogether, these results show that the external spatial coordinates of multisensory events relative to an observer's body (e.g., the eyes' or head's position) influence how this information is merged, and therefore determine the perceptual outcome.
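
    The "optimal combination" the authors invoke is the standard maximum-likelihood cue-combination rule, in which each modality is weighted by its inverse variance; a sketch with invented variances showing why visual capture should weaken in the periphery:

    ```python
    # Inverse-variance (maximum-likelihood) fusion of an auditory and a
    # visual location estimate; w_v is the visual weight ("visual capture").
    def fuse(x_a, var_a, x_v, var_v):
        w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)
        x_hat = w_v * x_v + (1.0 - w_v) * x_a
        var_hat = 1.0 / (1.0 / var_v + 1.0 / var_a)   # fused variance shrinks
        return x_hat, var_hat, w_v

    # In the periphery visual variance grows, so visual capture weakens:
    print(fuse(x_a=10.0, var_a=4.0, x_v=6.0, var_v=1.0))  # fovea-like: w_v = 0.8
    print(fuse(x_a=10.0, var_a=4.0, x_v=6.0, var_v=9.0))  # periphery: w_v ≈ 0.31
    ```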

  5. Approach to developing reliable space reactor power systems

    NASA Technical Reports Server (NTRS)

    Mondt, Jack F.; Shinbrot, Charles H.

    1991-01-01

    During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top-down systems approach which includes a point design based on a detailed technical specification of a 100-kW power system. The SP-100 system requirements implicitly recognize the challenge of achieving a high system reliability for a ten-year lifetime, while at the same time using technologies that require very significant development efforts. A low-cost method for assessing reliability, based on an understanding of fundamental failure mechanisms and design margins for specific failure mechanisms, is being developed as part of the SP-100 Program.

  6. A Time-Variant Reliability Model for Copper Bending Pipe under Seawater-Active Corrosion Based on the Stochastic Degradation Process

    PubMed Central

    Li, Mengmeng; Feng, Qiang; Yang, Dezhen

    2018-01-01

    In the degradation process, the randomness and multiplicity of variables are difficult to describe by mathematical models. However, they are common in engineering and cannot be neglected, so it is necessary to study this issue in depth. In this paper, the copper bending pipe in seawater piping systems is taken as the analysis object, and the time-variant reliability is calculated by solving the interference of limit strength and maximum stress. We performed degradation and tensile experiments on the copper material and obtained the limit strength at each time point. In addition, degradation experiments on the copper bending pipe were performed and the thickness at each time point was obtained; the response of maximum stress was then calculated by simulation. Further, with the help of a Monte Carlo method that we propose, the time-variant reliability of the copper bending pipe was calculated based on the stochastic degradation process and interference theory. Compared with traditional methods and verified against maintenance records, the results show that the time-variant reliability model based on the stochastic degradation process proposed in this paper has better applicability in reliability analysis, and it enables more convenient and accurate prediction of the replacement cycle of copper bending pipe under seawater-active corrosion. PMID:29584695
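
    A minimal Monte Carlo sketch of the stress-strength interference calculation; the degradation law and all parameters are invented stand-ins for the paper's experimentally fitted model:

    ```python
    # Time-variant reliability as P(strength(t) > stress) by simulation,
    # with strength degrading stochastically over time.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    def reliability_at(t_years):
        strength0 = rng.normal(300.0, 15.0, n)                     # initial limit strength, MPa
        loss = rng.gamma(shape=2.0 * t_years, scale=3.0, size=n)   # stochastic degradation
        stress = rng.normal(180.0, 20.0, n)                        # maximum operating stress
        return np.mean(strength0 - loss > stress)

    for t in (1, 5, 10, 15):
        print(f"t = {t:2d} y: R ≈ {reliability_at(t):.4f}")
    ```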

  7. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    PubMed

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probing the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, mainly resulting from lack of efficacy and side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize hit rates and improve efficiency at the beginning of the drug discovery and development pipeline. This paper presents a valid methodology for fast target-focused combinatorial library design in both reaction-based and production-based ways, with library creation rates of approximately 70,000 molecules per second. Simple, quick, and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation program, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other target for the identification of good chemical starting points, in combination with either structure-based or ligand-based virtual screening.
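
    A toy version of the product-based enumeration idea: substituents are combinatorially attached to a scaffold template (the scaffold and R-groups here are hypothetical placeholders, not validated SMILES, and the real method couples enumeration with SHAFTS 3D-similarity filtering):

    ```python
    # Enumerate a focused library as the Cartesian product of R-group lists
    # over a two-point scaffold template.
    from itertools import product

    scaffold = "O=C(N{R1})c1ccc({R2})cc1"       # hypothetical amide scaffold
    r1_groups = ["CC", "CCO", "c2ccccc2"]       # placeholder R1 fragments
    r2_groups = ["F", "Cl", "OC"]               # placeholder R2 fragments

    library = [scaffold.format(R1=a, R2=b) for a, b in product(r1_groups, r2_groups)]
    print(len(library), "products, e.g.", library[0])
    ```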

  8. Nanobiotechnology-based delivery strategies: New frontiers in brain tumor targeted therapies.

    PubMed

    Mangraviti, Antonella; Gullotti, David; Tyler, Betty; Brem, Henry

    2016-10-28

    Despite recent technological advancements and promising preclinical experiments, brain tumor patients are still met with limited treatment options. Some of the barriers to clinical improvements include the systemic toxicity of cytotoxic compounds, the impedance of the blood-brain barrier (BBB), and the lack of therapeutic agents that can selectively target the intracranial tumor environment. To overcome such barriers, a number of chemotherapeutic agents and nucleic acid-based therapies are rapidly being synthesized and tested as new brain tumor-targeted delivery strategies. Novel carriers include liposomal and polymeric nanoparticles, wafers, microchips, microparticle-based nanoplatforms and cell-based vectors. Strong preclinical results suggest that these nanotechnologies are set to transform the therapeutic paradigm for brain tumor treatment. In addition to new tumoricidal agents, parallel work is also being conducted on the BBB front. Preclinical testing of chemical and physical modulation strategies is yielding improved intracranial concentrations. New diagnostic and therapeutic imaging techniques, such as high-intensity focused ultrasound and MRI-guided focused ultrasound, are being used to modulate the BBB in a more precise and non-invasive manner. This review details some of the tremendous advances that are being explored in current brain tumor targeted therapies, including local implant development, nanobiotechnology-based delivery strategies, and techniques of BBB manipulation. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish the relationship between safety factors and reliability. The results show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed in terms of the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. Establishing the interrelation between the two concepts opens an avenue to specify safety factors based on reliability. In cases where there are several modes of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that probabilistic methods can eliminate existing over-design or under-design. The report includes three parts: Part 1-Random Actual Stress and Deterministic Yield Stress; Part 2-Deterministic Actual Stress and Random Yield Stress; Part 3-Both Actual Stress and Yield Stress Are Random.
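
    A sketch of the Part 1 situation (random actual stress, deterministic yield stress): once a required reliability is specified, the corresponding central safety factor follows directly, with illustrative numbers:

    ```python
    # Safety factor implied by a target reliability for normal random stress
    # and a deterministic yield stress.
    from scipy.stats import norm

    mu_s, sigma_s = 200.0, 25.0        # actual stress: mean and std (MPa)
    target_reliability = 0.999

    # Yield must exceed the stress quantile at the target reliability:
    required_yield = mu_s + norm.ppf(target_reliability) * sigma_s
    safety_factor = required_yield / mu_s   # central safety factor vs. mean stress
    print(f"required yield = {required_yield:.1f} MPa, SF = {safety_factor:.3f}")
    ```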

  10. QCL-based nonlinear sensing of independent targets dynamics.

    PubMed

    Mezzapesa, F P; Columbo, L L; Dabbicco, M; Brambilla, M; Scamarcio, G

    2014-03-10

    We demonstrate a common-path interferometer to measure the independent displacement of multiple targets through nonlinear frequency mixing in a quantum-cascade laser (QCL). The sensing system exploits the unique stability of QCLs under strong optical feedback to access the intrinsic nonlinearity of the active medium. The experimental results using an external dual cavity are in excellent agreement with the numerical simulations based on the Lang-Kobayashi equations.

  11. Simultaneous quantification of protein phosphorylation sites using liquid chromatography-tandem mass spectrometry-based targeted proteomics: a linear algebra approach for isobaric phosphopeptides.

    PubMed

    Xu, Feifei; Yang, Ting; Sheng, Yuan; Zhong, Ting; Yang, Mi; Chen, Yun

    2014-12-05

    As one of the most studied post-translational modifications (PTM), protein phosphorylation plays an essential role in almost all cellular processes. Current methods are able to predict and determine thousands of phosphorylation sites, whereas stoichiometric quantification of these sites is still challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics is emerging as a promising technique for site-specific quantification of protein phosphorylation using proteolytic peptides as surrogates of proteins. However, several issues may limit its application, one of which relates to the phosphopeptides with different phosphorylation sites and the same mass (i.e., isobaric phosphopeptides). While employment of site-specific product ions allows for these isobaric phosphopeptides to be distinguished and quantified, site-specific product ions are often absent or weak in tandem mass spectra. In this study, linear algebra algorithms were employed as an add-on to targeted proteomics to retrieve information on individual phosphopeptides from their common spectra. To achieve this simultaneous quantification, a LC-MS/MS-based targeted proteomics assay was first developed and validated for each phosphopeptide. Given the slope and intercept of calibration curves of phosphopeptides in each transition, linear algebraic equations were developed. Using a series of mock mixtures prepared with varying concentrations of each phosphopeptide, the reliability of the approach to quantify isobaric phosphopeptides containing multiple phosphorylation sites (≥ 2) was discussed. Finally, we applied this approach to determine the phosphorylation stoichiometry of heat shock protein 27 (HSP27) at Ser78 and Ser82 in breast cancer cells and tissue samples.
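
    The de-mixing step reduces to solving a small linear system: with calibration slopes and intercepts known per transition, the observed signals of shared transitions yield the concentration of each isobaric form. The coefficients below are invented, not the paper's calibration values:

    ```python
    # Recover two isobaric phosphopeptide concentrations from two shared
    # transitions: observed_j = slope_j1 * c1 + slope_j2 * c2 + intercept_j.
    import numpy as np

    slopes = np.array([[1.8, 0.6],
                       [0.4, 2.1]])
    intercepts = np.array([0.05, 0.02])
    observed = np.array([1.25, 1.90])

    concentrations = np.linalg.solve(slopes, observed - intercepts)
    print(concentrations)   # c(form 1), c(form 2)
    ```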

  12. Investigating effects of communications modulation technique on targeting performance

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Eusebio, Gerald; Huling, Edward

    2006-05-01

    One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they would need reliable data. In order to facilitate reliable computational intelligence, we seek to explore the communication modulation tradeoffs affecting information distribution and accumulation. In this analysis, we explore the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. In the analysis, we simulate a Link 16 with a simple bandpass phase shift keying (PSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy tradeoffs are assessed as to the effects incurred in targeting performance.
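
    As a reference point for the SNR sweep described above, the textbook coherent BPSK bit error rate is Pb = Q(sqrt(2 Eb/N0)); the sketch below computes only that curve and does not reproduce the paper's Link 16, jitter, or targeting metrics:

    ```python
    # Theoretical coherent BPSK bit error rate over a range of Eb/N0 values.
    import numpy as np
    from scipy.stats import norm

    ebn0_db = np.arange(0, 11, 2)
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    ber = norm.sf(np.sqrt(2.0 * ebn0))     # Q-function via the normal survival fn.

    for db, p in zip(ebn0_db, ber):
        print(f"Eb/N0 = {db:2d} dB -> BER = {p:.2e}")
    ```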

  13. Reliability of Instruments Measuring At-Risk and Problem Gambling Among Young Individuals: A Systematic Review Covering Years 2009-2015.

    PubMed

    Edgren, Robert; Castrén, Sari; Mäkelä, Marjukka; Pörtfors, Pia; Alho, Hannu; Salonen, Anne H

    2016-06-01

    This review aims to clarify which instruments measuring at-risk and problem gambling (ARPG) among youth are reliable and valid in light of reported estimates of internal consistency, classification accuracy, and psychometric properties. A systematic search was conducted in PubMed, Medline, and PsycInfo covering the years 2009-2015. In total, 50 original research articles fulfilled the inclusion criteria: target age under 29 years, using an instrument designed for youth, and reporting a reliability estimate. Articles were evaluated with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Reliability estimates were reported for five ARPG instruments. Most studies (66%) evaluated the South Oaks Gambling Screen Revised for Adolescents. The Gambling Addictive Behavior Scale for Adolescents was the only novel instrument. In general, the evaluation of instrument reliability was superficial. Despite its rare use, the Canadian Adolescent Gambling Inventory (CAGI) had a strong theoretical and methodological base. The Gambling Addictive Behavior Scale for Adolescents and the CAGI were the only instruments originally developed for youth. All studies, except the CAGI study, were population based. ARPG instruments for youth have not been rigorously evaluated yet. Further research is needed especially concerning instruments designed for clinical use. Copyright © 2016 The Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  14. A reliability analysis of the revised competitiveness index.

    PubMed

    Harris, Paul B; Houston, John M

    2010-06-01

    This study examined the reliability of the Revised Competitiveness Index by investigating the test-retest reliability, interitem reliability, and factor structure of the measure based on a sample of 280 undergraduates (200 women, 80 men) ranging in age from 18 to 28 years (M = 20.1, SD = 2.1). The findings indicate that the Revised Competitiveness Index has high test-retest reliability, high inter-item reliability, and a stable factor structure. The results support the assertion that the Revised Competitiveness Index assesses competitiveness as a stable trait rather than a dynamic state.
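
    Inter-item reliability of the kind reported is conventionally coefficient (Cronbach's) alpha; a minimal computation from an N-by-k score matrix, shown on random data purely to exercise the formula (independent random items should give alpha near zero):

    ```python
    # Cronbach's alpha from a respondents-by-items score matrix.
    import numpy as np

    scores = np.random.default_rng(7).integers(1, 6, size=(280, 14))  # Likert 1-5
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    alpha = (k / (k - 1)) * (1.0 - item_vars / total_var)
    print(f"Cronbach's alpha = {alpha:.3f}")
    ```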

  15. Reliability and validity of a novel Kinect-based software program for measuring posture, balance and side-bending.

    PubMed

    Grooten, Wilhelmus Johannes Andreas; Sandberg, Lisa; Ressman, John; Diamantoglou, Nicolas; Johansson, Elin; Rasmussen-Barr, Eva

    2018-01-08

    Clinical examinations are subjective and often show low validity and reliability. Objective and highly reliable quantitative assessments are available in laboratory settings using 3D motion analysis, but these systems are too expensive to use for simple clinical examinations. Qinematic™ is an interactive movement analysis system based on the Kinect camera and is an easy-to-use clinical measurement system for assessing posture, balance and side-bending. The aim of the study was to test the test-retest reliability and construct validity of Qinematic™ in a healthy population, and to calculate the minimal clinical differences for the variables of interest. A further aim was to identify the discriminative validity of Qinematic™ in people with low-back pain (LBP). We performed a test-retest reliability study (n = 37) with around 1 week between the occasions, a construct validity study (n = 30) in which Qinematic™ was tested against a 3D motion capture system, and a discriminative validity study, in which a group of people with LBP (n = 20) was compared to healthy controls (n = 17). We tested a large range of psychometric properties of 18 variables in three sections: posture (head and pelvic position, weight distribution), balance (sway area and velocity in single- and double-leg stance), and side-bending. The majority of the variables in the posture and balance sections showed poor/fair reliability (ICC < 0.4) and poor/fair validity (Spearman < 0.4), with significant differences between occasions and between Qinematic™ and the 3D motion capture system. In the clinical study, Qinematic™ did not differ between people with LBP and healthy controls for these variables. For one variable, side-bending to the left, there was excellent reliability (ICC = 0.898), excellent validity (r = 0.943), and Qinematic™ could differentiate between LBP and healthy individuals (p = 0.012). This paper shows that a novel software program (Qinematic™) based

  16. Reliable Classification of Geologic Surfaces Using Texture Analysis

    NASA Astrophysics Data System (ADS)

    Foil, G.; Howarth, D.; Abbey, W. J.; Bekker, D. L.; Castano, R.; Thompson, D. R.; Wagstaff, K.

    2012-12-01

    Communication delays and bandwidth constraints are major obstacles for remote exploration spacecraft. Due to such restrictions, spacecraft could make use of onboard science data analysis to maximize scientific gain, through capabilities such as the generation of bandwidth-efficient representative maps of scenes, autonomous instrument targeting to exploit targets of opportunity between communications, and downlink prioritization to ensure fast delivery of tactically-important data. Of particular importance to remote exploration is the precision of such methods and their ability to reliably reproduce consistent results in novel environments. Spacecraft resources are highly oversubscribed, so any onboard data analysis must provide a high degree of confidence in its assessment. The TextureCam project is constructing a "smart camera" that can analyze surface images to autonomously identify scientifically interesting targets and direct narrow field-of-view instruments. The TextureCam instrument incorporates onboard scene interpretation and mapping to assist these autonomous science activities. Computer vision algorithms map scenes such as those encountered during rover traverses. The approach, based on a machine learning strategy, trains a statistical model to recognize different geologic surface types and then classifies every pixel in a new scene according to these categories. We describe three methods for increasing the precision of the TextureCam instrument. The first uses ancillary data to segment challenging scenes into smaller regions having homogeneous properties. These subproblems are individually easier to solve, preventing uncertainty in one region from contaminating those that can be confidently classified. The second involves a Bayesian approach that maximizes the likelihood of correct classifications by abstaining from ambiguous ones. We evaluate these two techniques on a set of images acquired during field expeditions in the Mojave Desert. Finally, the

  17. Reliability Generalization of the Alcohol Use Disorder Identification Test.

    ERIC Educational Resources Information Center

    Shields, Alan L.; Caruso, John C.

    2002-01-01

    Evaluated the reliability of scores from the Alcohol Use Disorders Identification Test (AUDIT; J. Saunders and others, 1993) in a reliability generalization study based on 17 empirical journal articles. Results show AUDIT scores to be generally reliable for basic assessment. (SLD)

  18. Infrared variation reduction by simultaneous background suppression and target contrast enhancement for deep convolutional neural network-based automatic target recognition

    NASA Astrophysics Data System (ADS)

    Kim, Sungho

    2017-06-01

    Automatic target recognition (ATR) is a traditionally challenging problem in military applications because of the wide range of infrared (IR) image variations and the limited number of training images. IR variations are caused by various three-dimensional target poses, noncooperative weather conditions (fog and rain), and difficult target acquisition environments. Recently, deep convolutional neural network-based approaches for RGB images (RGB-CNN) have shown breakthrough performance in computer vision problems such as object detection and classification. Directly applying RGB-CNN to the IR ATR problem fails to work because of the IR database problems (limited database size and IR image variations). An IR variation-reduced deep CNN (IVR-CNN) is presented to cope with these problems. The problem of limited IR database size is solved by a commercial thermal simulator (OKTAL-SE). The second problem of IR variations is mitigated by the proposed shifted ramp function-based intensity transformation, which can suppress the background and enhance the target contrast simultaneously. The experimental results on the synthesized IR images generated by the thermal simulator (OKTAL-SE) validated the feasibility of IVR-CNN for military ATR applications.
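
    The exact shifted ramp function is not given in the abstract; a generic version maps intensities below a shift point to zero (background suppression) and ramps linearly to full scale above it (target contrast enhancement):

    ```python
    # Generic shifted-ramp intensity transform on a float image in [0, 1].
    import numpy as np

    def shifted_ramp(img, shift, width):
        """Zero below `shift`, linear ramp over `width`, saturate at 1."""
        return np.clip((img - shift) / width, 0.0, 1.0)

    ir = np.array([[0.10, 0.35, 0.55],
                   [0.20, 0.60, 0.90]])
    print(shifted_ramp(ir, shift=0.30, width=0.40))  # cold background -> 0, hot target -> 1
    ```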

  19. Accurate and Reliable Prediction of the Binding Affinities of Macrocycles to Their Protein Targets.

    PubMed

    Yu, Haoyu S; Deng, Yuqing; Wu, Yujie; Sindhikara, Dan; Rask, Amy R; Kimura, Takayuki; Abel, Robert; Wang, Lingle

    2017-12-12

    Macrocycles have been emerging as a very important drug class in the past few decades, largely due to their expanded chemical diversity benefiting from advances in synthetic methods. Macrocyclization has been recognized as an effective way to restrict the conformational space of acyclic small molecule inhibitors with the hope of improving potency, selectivity, and metabolic stability. Because of their relatively larger size as compared to typical small molecule drugs and the complexity of their structures, efficient sampling of the accessible macrocycle conformational space and accurate prediction of their binding affinities to their target protein receptors pose a great challenge of central importance in computational macrocycle drug design. In this article, we present a novel method for relative binding free energy calculations between macrocycles with different ring sizes and between the macrocycles and their corresponding acyclic counterparts. We have applied the method to seven pharmaceutically interesting data sets taken from recent drug discovery projects, including 33 macrocyclic ligands covering a diverse chemical space. The predicted binding free energies are in good agreement with experimental data, with an overall root-mean-square error (RMSE) of 0.94 kcal/mol. This is, to our knowledge, the first time that the free energy of the macrocyclization of linear molecules has been directly calculated with rigorous physics-based free energy calculation methods, and we anticipate that the outstanding accuracy demonstrated here across a broad range of target classes may have significant implications for macrocycle drug discovery.

  20. Molybdenum target specifications for cyclotron production of 99mTc based on patient dose estimates.

    PubMed

    Hou, X; Tanguay, J; Buckley, K; Schaffer, P; Bénard, F; Ruth, T J; Celler, A

    2016-01-21

    In response to the recognized fragility of reactor-produced 99Mo supply, direct production of 99mTc via the 100Mo(p,2n)99mTc reaction using medical cyclotrons has been investigated. However, due to the existence of other Molybdenum (Mo) isotopes in the target, in parallel with 99mTc, other technetium (Tc) radioactive isotopes (impurities) will be produced. They will be incorporated into the labeled radiopharmaceuticals and result in increased patient dose. The isotopic composition of the target and beam energy are main factors that determine production of impurities, thus also dose increases. Therefore, they both must be considered when selecting targets for clinical 99mTc production. Although for any given Mo target, the patient dose can be predicted based on complicated calculations of production yields for each Tc radioisotope, it would be very difficult to reverse these calculations to specify target composition based on dosimetry considerations. In this article, a relationship between patient dosimetry and Mo target composition is studied. A simple and easy algorithm for dose estimation, based solely on the knowledge of target composition and beam energy, is described. Using this algorithm, the patient dose increase due to every Mo isotope that could be present in the target is estimated. Most importantly, a technique to determine Mo target composition thresholds that would meet any given dosimetry requirement is proposed.

  1. Molybdenum target specifications for cyclotron production of 99mTc based on patient dose estimates

    NASA Astrophysics Data System (ADS)

    Hou, X.; Tanguay, J.; Buckley, K.; Schaffer, P.; Bénard, F.; Ruth, T. J.; Celler, A.

    2016-01-01

    In response to the recognized fragility of reactor-produced 99Mo supply, direct production of 99mTc via 100Mo(p,2n)99mTc reaction using medical cyclotrons has been investigated. However, due to the existence of other Molybdenum (Mo) isotopes in the target, in parallel with 99mTc, other technetium (Tc) radioactive isotopes (impurities) will be produced. They will be incorporated into the labeled radiopharmaceuticals and result in increased patient dose. The isotopic composition of the target and beam energy are main factors that determine production of impurities, thus also dose increases. Therefore, they both must be considered when selecting targets for clinical 99mTc production. Although for any given Mo target, the patient dose can be predicted based on complicated calculations of production yields for each Tc radioisotope, it would be very difficult to reverse these calculations to specify target composition based on dosimetry considerations. In this article, a relationship between patient dosimetry and Mo target composition is studied. A simple and easy algorithm for dose estimation, based solely on the knowledge of target composition and beam energy, is described. Using this algorithm, the patient dose increase due to every Mo isotope that could be present in the target is estimated. Most importantly, a technique to determine Mo target composition thresholds that would meet any given dosimetry requirement is proposed.
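
    In outline, the proposed dose estimate is a composition-weighted sum over the Tc impurities produced from each Mo isotope; the per-isotope coefficients below are placeholders (in the paper they depend on beam energy and irradiation/decay times):

    ```python
    # Composition-weighted relative dose estimate for a cyclotron Mo target.
    mo_fractions = {"100Mo": 0.995, "98Mo": 0.003, "96Mo": 0.002}     # assayed composition
    rel_dose_per_fraction = {"100Mo": 1.0, "98Mo": 4.0, "96Mo": 12.0}  # hypothetical coefficients

    relative_dose = sum(f * rel_dose_per_fraction[iso] for iso, f in mo_fractions.items())
    print(f"dose relative to a pure 100Mo target: {relative_dose:.3f}")  # > 1 flags impurity burden
    ```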

  2. Risk-Based Treatment Targets for Onsite Non-Potable Water Reuse

    EPA Science Inventory

    This presentation presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Micr...

  3. Auto-segmentation of normal and target structures in head and neck CT images: a feature-driven model-based approach.

    PubMed

    Qazi, Arish A; Pekar, Vladimir; Kim, John; Xie, Jason; Breen, Stephen L; Jaffray, David A

    2011-11-01

    Intensity-modulated radiation therapy (IMRT) allows greater control over dose distribution, which leads to a decrease in radiation-related toxicity. IMRT, however, requires precise and accurate delineation of the organs at risk and target volumes. Manual delineation is tedious and suffers from both interobserver and intraobserver variability. State-of-the-art auto-segmentation methods are atlas-based, model-based, or hybrid; however, robust, fully automated segmentation is often difficult due to the insufficient discriminative information provided by standard medical imaging modalities for certain tissue types. In this paper, the authors present a fully automated hybrid approach which combines deformable registration with a model-based approach to accurately segment normal and target tissues from head and neck CT images. The segmentation process starts by using an average atlas to reliably identify salient landmarks in the patient image. The relationship between these landmarks and the reference dataset serves to guide a deformable registration algorithm, which allows for a close initialization of a set of organ-specific deformable models in the patient image, ensuring their robust adaptation to the boundaries of the structures. Finally, the models are automatically fine-adjusted by a boundary refinement approach which models the uncertainty in model adaptation using a probabilistic mask. This uncertainty is subsequently resolved by voxel classification based on local low-level organ-specific features. To quantitatively evaluate the method, the authors auto-segment several organs at risk and target tissues from 10 head and neck CT images and compare the segmentations to manual delineations outlined by an expert. The evaluation is carried out by estimating two common quantitative measures on 10 datasets: volume overlap fraction or the Dice similarity coefficient (DSC), and a geometrical metric, the median symmetric Hausdorff distance (HD), which
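
    The Dice similarity coefficient used in the evaluation is straightforward to compute from two binary masks. A minimal sketch with synthetic masks (not the study's data):

        import numpy as np

        def dice_coefficient(a, b):
            """Dice similarity coefficient between two binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        auto = np.zeros((64, 64), dtype=bool); auto[10:40, 10:40] = True
        manual = np.zeros((64, 64), dtype=bool); manual[12:42, 12:42] = True
        print(f"DSC = {dice_coefficient(auto, manual):.3f}")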

  4. Mechanism-Based Enhanced Delivery of Drug-Loaded Targeted Nanoparticles for Breast Cancer Therapy

    DTIC Science & Technology

    2014-02-01

    [Only report-form and front-matter fragments are recoverable for this record: Award Number W81XWH-11-1-0167; title "Mechanism-Based Enhanced Delivery of Drug-Loaded Targeted Nanoparticles for Breast Cancer Therapy"; and presentation listings including "Nanotechnologies in Living Systems" (Moscow Region, Russia, September 2011) and "Ionic nanogels for drug delivery in cancer" (NanoDDS'12, Atlantic City, NJ).]

  5. Mining protein interactomes to improve their reliability and support the advancement of network medicine.

    PubMed

    Alanis-Lobato, Gregorio

    2015-01-01

    High-throughput detection of protein interactions has had a major impact on our understanding of the intricate molecular machinery underlying the living cell, and has permitted the construction of very large protein interactomes. The protein networks that are currently available are incomplete, and a significant percentage of their interactions are false positives. Fortunately, the structural properties observed in good-quality social or technological networks are also present in biological systems. This has encouraged the development of tools to improve the reliability of protein networks and predict new interactions based solely on the topological characteristics of their components. Since diseases are rarely caused by the malfunction of a single protein, having a more complete and reliable interactome is crucial in order to identify groups of inter-related proteins involved in disease etiology. These system components can then be targeted with minimal collateral damage. In this article, a number of network mining tools are reviewed, together with resources from which reliable protein interactomes can be constructed. In addition, a few representative examples of how molecular and clinical data can be integrated to deepen our understanding of pathogenesis are discussed.
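
    Topology-based interaction prediction of the kind reviewed here often starts from simple neighborhood statistics. A minimal sketch using a common-neighbors score on a toy network with the third-party networkx package; the proteins and edges are hypothetical, not data from any cited interactome:

        import networkx as nx

        # Toy protein interaction network with hypothetical proteins A..F
        G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("B", "D"),
                      ("C", "D"), ("D", "E"), ("E", "F")])

        # Score each non-adjacent pair by its number of shared interaction partners
        candidates = [(u, v, len(list(nx.common_neighbors(G, u, v))))
                      for u, v in nx.non_edges(G)]
        for u, v, score in sorted(candidates, key=lambda t: -t[2])[:3]:
            print(f"predicted interaction {u}-{v}: {score} common neighbors")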

  6. Targets Mask U-Net for Wind Turbines Detection in Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Han, M.; Wang, H.; Wang, G.; Liu, Y.

    2018-04-01

    To detect wind turbines precisely and quickly in very high resolution remote sensing images (VHRRSI), we propose the target mask U-Net. This convolutional neural network (CNN), carefully designed to be a wide-field detector, models the pixel-wise assignment of classes to wind turbines and their context information. The shadow, which serves as the target's context information in this study, is regarded as part of a wind turbine instance. We trained the target mask U-Net on a training dataset composed of down-sampled image blocks and instance mask blocks. Post-processing steps are integrated to eliminate spurious detections and produce bounding boxes for wind turbine instances. The evaluation metrics demonstrate the reliability and effectiveness of our method, with an average F1-score of up to 0.97. A comparison of detection accuracy and running time with a weakly supervised CNN-based target detection method illustrates the superiority of our approach.
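
    The post-processing step that turns a predicted mask into instance bounding boxes can be sketched with connected-component labeling. A minimal example using scipy.ndimage, with a synthetic mask and a hypothetical minimum-area threshold rather than the authors' exact pipeline:

        import numpy as np
        from scipy import ndimage

        # Hypothetical binary prediction mask from the network
        mask = np.zeros((100, 100), dtype=bool)
        mask[20:35, 30:45] = True   # a plausible wind-turbine blob
        mask[70, 80] = True         # an isolated spurious pixel

        labeled, n = ndimage.label(mask)
        for sl in ndimage.find_objects(labeled):
            h, w = sl[0].stop - sl[0].start, sl[1].stop - sl[1].start
            if h * w >= 9:          # hypothetical minimum-area filter
                print("bounding box (row slice, col slice):", sl)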

  7. Expert system for UNIX system reliability and availability enhancement

    NASA Astrophysics Data System (ADS)

    Xu, Catherine Q.

    1993-02-01

    Highly reliable and available systems are critical to the airline industry. However, most off-the-shelf computer operating systems and hardware do not have built-in fault tolerance mechanisms; the UNIX workstation is one example. In this research effort, we have developed a rule-based Expert System (ES) to monitor, command, and control a UNIX workstation system with hot-standby redundancy. The ES on each workstation acts as an on-line system administrator to diagnose, report, correct, and prevent certain types of hardware and software failures. If a primary station is approaching failure, the ES coordinates the switch-over to a hot-standby secondary workstation. The goal is to discover and solve certain fatal problems early enough to prevent complete system failure from occurring and therefore to enhance system reliability and availability. Test results show that the ES can diagnose all targeted faulty scenarios and take the desired actions in a consistent manner regardless of the sequence of the faults. The ES can perform designated system administration tasks about ten times faster than an experienced human operator. Compared with a single workstation system, the downtime of our hot-standby redundancy system is predicted to be reduced by more than 50 percent by using the ES to command and control the system.

  8. Expert System for UNIX System Reliability and Availability Enhancement

    NASA Technical Reports Server (NTRS)

    Xu, Catherine Q.

    1993-01-01

    Highly reliable and available systems are critical to the airline industry. However, most off-the-shelf computer operating systems and hardware do not have built-in fault tolerance mechanisms; the UNIX workstation is one example. In this research effort, we have developed a rule-based Expert System (ES) to monitor, command, and control a UNIX workstation system with hot-standby redundancy. The ES on each workstation acts as an on-line system administrator to diagnose, report, correct, and prevent certain types of hardware and software failures. If a primary station is approaching failure, the ES coordinates the switch-over to a hot-standby secondary workstation. The goal is to discover and solve certain fatal problems early enough to prevent complete system failure from occurring and therefore to enhance system reliability and availability. Test results show that the ES can diagnose all targeted faulty scenarios and take the desired actions in a consistent manner regardless of the sequence of the faults. The ES can perform designated system administration tasks about ten times faster than an experienced human operator. Compared with a single workstation system, the downtime of our hot-standby redundancy system is predicted to be reduced by more than 50 percent by using the ES to command and control the system.

  9. Infrared small target detection based on multiscale center-surround contrast measure

    NASA Astrophysics Data System (ADS)

    Fu, Hao; Long, Yunli; Zhu, Ran; An, Wei

    2018-04-01

    Infrared (IR) small target detection plays a critical role in infrared search and track (IRST) systems. Although it has been studied for years, difficulties remain in cluttered environments. Motivated by the principle by which humans discriminate small targets from a natural scene, namely that there is a signature of discontinuity between an object and its neighboring regions, we develop an efficient method for infrared small target detection called the multiscale center-surround contrast measure (MCSCM). First, an entropy-based window selection technique is used to determine the maximum neighboring window size. Then, we construct a novel multiscale center-surround contrast measure to calculate the saliency map. Compared with the original image, the MCSCM map contains less background clutter and residual noise. Subsequently, a simple threshold is used to segment the target. Experimental results show that our method achieves better performance.
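
    The general center-surround idea can be sketched with box filters: a pixel is salient when its local (center) mean exceeds the mean of a larger surrounding window, evaluated at several surround scales. A minimal illustration under that assumption; this is not the authors' exact MCSCM formulation or their entropy-based window selection:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def center_surround_contrast(img, center=3, surrounds=(9, 15, 21)):
            """Center mean minus surround mean, maximized over surround scales."""
            img = img.astype(float)
            c = uniform_filter(img, size=center)
            return np.max([c - uniform_filter(img, size=s) for s in surrounds], axis=0)

        img = np.random.rand(128, 128) * 20   # synthetic cluttered background
        img[60:63, 60:63] += 40               # small bright target
        saliency = center_surround_contrast(img)
        detections = saliency > saliency.mean() + 4 * saliency.std()  # simple threshold
        print("detected pixels:", np.argwhere(detections)[:5])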

  10. Design and implementation of location-based wireless targeted advertising

    NASA Astrophysics Data System (ADS)

    Li, Benjamin; Xu, Deyin

    2001-10-01

    As advertisements are time- and location-sensitive, a challenge for wireless marketing is to have advertisements delivered when and where they are most convenient. In this paper we introduce a two-stage auction model for location-based wireless targeted advertising. This system extends the notion of location-based service by using location information to target advertising, specifically by enabling advertisers to state their preferences and bid for advertisement delivery; those preferences are then used in a subsequent automated auction of actual deliveries to wireless data users. The automated auction in the second stage is especially effective because it can use information about the individual user profile, including customer relationship management system contents as well as location from the wireless system's location management service, potentially including location history such as the current trajectory from recent movements and longer-term historical trip records for that user. Through the two-stage auction, real-time bidding by advertisers and the matching of ad content to mobile users help advertising reach its maximal value.

  11. A multiplex primer design algorithm for target amplification of continuous genomic regions.

    PubMed

    Ozturk, Ahmet Rasit; Can, Tolga

    2017-06-19

    Targeted Next Generation Sequencing (NGS) assays are cost-efficient and reliable alternatives to Sanger sequencing. For sequencing a very large set of genes, the target enrichment approach is suitable. However, for smaller genomic regions, the target amplification method is more efficient than both target enrichment and Sanger sequencing. The major difficulty of the target amplification method is the preparation of amplicons, in terms of required time, equipment, and labor. Multiplex PCR (MPCR) is a good solution to these problems. We propose a novel method to design MPCR primers for a continuous genomic region, following the best practices of clinically reliable PCR design processes. In an experimental setup with 48 different combinations of factors, we have shown that multiple parameters might affect finding the first feasible solution. Increasing the length of the initial primer candidate selection sequence gives better results, whereas waiting longer for the first feasible solution does not have a significant impact. We generated MPCR primer designs for the whole HBB gene, the MEFV coding regions, and human exons between 2000 and 2100 bp in length. Our benchmarking experiments show that the proposed MPCR approach is able to produce reliable NGS assay primers for a given sequence in a reasonable amount of time.
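
    One routine compatibility check in multiplex primer design is keeping melting temperatures (Tm) within a narrow band so all primers anneal under a single thermal profile. A minimal sketch using the simple Wallace rule; the primer sequences and tolerance below are hypothetical and not taken from the paper:

        def wallace_tm(primer: str) -> float:
            """Rough melting temperature via the Wallace rule: 2(A+T) + 4(G+C)."""
            p = primer.upper()
            return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

        # Hypothetical pool: flag the design if Tm values spread beyond a tolerance
        primers = {"F1": "ACATTTGCTTCTGACACAAC", "R1": "GTCTCCACATGCCCAGTTTC"}
        tms = {name: wallace_tm(seq) for name, seq in primers.items()}
        spread = max(tms.values()) - min(tms.values())
        print(tms, "| Tm spread:", spread, "| compatible:", spread <= 5)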

  12. Compositional Analysis of Asymmetric and Symmetric Dimethylated H3R2 Using Liquid Chromatography-Tandem Mass Spectrometry-Based Targeted Proteomics.

    PubMed

    Xu, Qingqing; Xu, Feifei; Liu, Liang; Chen, Yun

    2016-09-06

    Protein arginine methylation is one of the common post-translational modifications in cellular processes. To date, two isomeric forms of dimethylated arginine have been identified: asymmetric N(G),N(G)-dimethylarginine (aDMA) and symmetric N(G),N'(G)-dimethylarginine (sDMA). Evidence indicates that these isomers can coexist and have different or even opposite functions, the aDMA and sDMA forms of arginine 2 on histone H3 (i.e., H3R2me2a and H3R2me2s) being an example. Thus, specific detection and quantification of each isomeric form is important. Current methods are capable of predicting and detecting thousands of methylarginine sites in proteins, whereas differentiation and stoichiometric measurement of dimethylated protein isomers remain challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics has emerged as a promising technique for site-specific quantification of protein methylation using enzymatic peptides as surrogates of target proteins. However, a routine targeted proteomics strategy cannot easily distinguish the sDMA- and aDMA-containing surrogate peptides because they are isomeric; the estimated amount must therefore be considered the sum of both dimethylated isomers. In this study, compositional analysis based on a linear algebra algorithm, as an add-on to targeted proteomics, was employed to quantify H3R2me2a and H3R2me2s (i.e., the surrogate peptides AR(me2a)TK(me1/2)QT and AR(me2s)TK(me1/2)QT). To achieve this simultaneous quantification, a targeted proteomics assay was first developed and validated for each isomer. With the slopes and intercepts of their calibration curves for each multiple reaction monitoring (MRM) transition, linear algebraic equations were derived. Using a series of mock mixtures consisting of the isomers in varying concentrations, the reliability of the method was confirmed. Finally, the H3R2 dimethylation status was analyzed in normal MCF-10A cells
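
    The compositional analysis reduces to solving a small linear system: each MRM transition's signal is a linear combination of the two isomer concentrations, with slopes and intercepts taken from the calibration curves. A minimal sketch with hypothetical calibration numbers:

        import numpy as np

        # Measured signals S relate to isomer concentrations c via S = A @ c + b,
        # where A holds calibration slopes and b the intercepts (hypothetical values).
        A = np.array([[1.8, 0.6],    # transition 1 slopes for [aDMA, sDMA]
                      [0.4, 2.1]])   # transition 2 slopes
        b = np.array([0.05, 0.02])   # calibration intercepts
        S = np.array([1.35, 1.70])   # measured MRM signals

        conc = np.linalg.solve(A, S - b)
        print(f"H3R2me2a ~ {conc[0]:.3f}, H3R2me2s ~ {conc[1]:.3f} (arbitrary units)")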

  13. Predictive models of safety based on audit findings: Part 1: Model development and reliability.

    PubMed

    Hsiao, Yu-Lin; Drury, Colin; Wu, Changxu; Paquet, Victor

    2013-03-01

    This two-part study aimed at the quantitative validation of safety audit tools as predictors of safety performance, as we were unable to find prior studies that tested audit validity against safety outcomes. An aviation maintenance domain was chosen for this work, as both audits and safety outcomes are currently prescribed and regulated. In Part 1, we developed a Human Factors/Ergonomics classification framework, based on the HFACS model (Shappell and Wiegmann, 2001a,b), for the human errors detected by audits, because merely counting audit findings did not predict future safety. The framework was tested for measurement reliability using four participants, two of whom classified errors on 1238 audit reports. Kappa values leveled out after about 200 audits at between 0.5 and 0.8 for different tiers of error categories. This showed sufficient reliability to proceed with prediction validity testing in Part 2. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
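
    The kappa statistic used for the inter-rater reliability check is agreement corrected for chance. A minimal sketch of Cohen's kappa for two raters, with hypothetical category labels rather than the study's audit data:

        import numpy as np

        def cohens_kappa(r1, r2):
            """Cohen's kappa: observed agreement corrected for chance agreement."""
            r1, r2 = np.asarray(r1), np.asarray(r2)
            po = np.mean(r1 == r2)
            pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in np.union1d(r1, r2))
            return (po - pe) / (1 - pe)

        # Hypothetical error categories assigned by two analysts to ten findings
        a = ["skill", "rule", "skill", "knowledge", "rule", "skill", "rule", "skill", "rule", "skill"]
        b = ["skill", "rule", "rule", "knowledge", "rule", "skill", "skill", "skill", "rule", "skill"]
        print(f"kappa = {cohens_kappa(a, b):.2f}")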

  14. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  15. Comprehensive reliability allocation method for CNC lathes based on cubic transformed functions of failure mode and effects analysis

    NASA Astrophysics Data System (ADS)

    Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin

    2015-03-01

    Reliability allocation for computerized numerically controlled (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. To solve the reliability allocation problem for CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure mode and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponential transformed FMEA method are investigated. Subsequently, a cubic transformed function is established to overcome these limitations. Properties of the new transformed function are discussed by considering failure severity and failure occurrence, and designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as examples to verify the new allocation method. Seven criteria are considered to compare the results of the new method with traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.

  16. Reliable change indices and standardized regression-based change score norms for evaluating neuropsychological change in children with epilepsy.

    PubMed

    Busch, Robyn M; Lineweaver, Tara T; Ferguson, Lisa; Haut, Jennifer S

    2015-06-01

    Reliable change indices (RCIs) and standardized regression-based (SRB) change score norms permit evaluation of meaningful changes in test scores following treatment interventions, like epilepsy surgery, while accounting for test-retest reliability, practice effects, score fluctuations due to error, and relevant clinical and demographic factors. Although these methods are frequently used to assess cognitive change after epilepsy surgery in adults, they have not been widely applied to examine cognitive change in children with epilepsy. The goal of the current study was to develop RCIs and SRB change score norms for use in children with epilepsy. Sixty-three children with epilepsy (age range: 6-16; M=10.19, SD=2.58) underwent comprehensive neuropsychological evaluations at two time points an average of 12 months apart. Practice effect-adjusted RCIs and SRB change score norms were calculated for all cognitive measures in the battery. Practice effects were quite variable across the neuropsychological measures, with the greatest differences observed among older children, particularly on the Children's Memory Scale and Wisconsin Card Sorting Test. There was also notable variability in test-retest reliabilities across measures in the battery, with coefficients ranging from 0.14 to 0.92. Reliable change indices and SRB change score norms for use in assessing meaningful cognitive change in children following epilepsy surgery are provided for measures with reliability coefficients above 0.50. This is the first study to provide RCIs and SRB change score norms for a comprehensive neuropsychological battery based on a large sample of children with epilepsy. Tables to aid in evaluating cognitive changes in children who have undergone epilepsy surgery are provided for clinical use. An Excel sheet to perform all relevant calculations is also available to interested clinicians or researchers. Copyright © 2015 Elsevier Inc. All rights reserved.
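
    One common practice-adjusted RCI formulation divides the observed change, minus the sample's mean (practice-driven) change, by the standard error of the difference derived from test-retest reliability. A minimal sketch under that standard formulation; the numbers are hypothetical, not the study's norms:

        import math

        def practice_adjusted_rci(x1, x2, mean_change, sd1, retest_r):
            """Observed change minus mean practice effect, over SE of the difference."""
            sem = sd1 * math.sqrt(1 - retest_r)   # standard error of measurement
            se_diff = math.sqrt(2) * sem          # SE of the difference score
            return (x2 - x1 - mean_change) / se_diff

        # Hypothetical: baseline 95, follow-up 104, sample practice gain 3,
        # baseline SD 15, test-retest reliability .80
        print(f"RCI = {practice_adjusted_rci(95, 104, 3, 15, 0.80):.2f}")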

  17. Methods to approximate reliabilities in single-step genomic evaluation

    USDA-ARS?s Scientific Manuscript database

    Reliability of predictions from single-step genomic BLUP (ssGBLUP) can be calculated by inversion, but that is not feasible for large data sets. Two methods of approximating reliability were developed based on decomposition of a function of reliability into contributions from records, pedigrees, and...

  18. Multi-hop routing mechanism for reliable sensor computing.

    PubMed

    Chen, Jiann-Liang; Ma, Yi-Wei; Lai, Chia-Ping; Hu, Chia-Cheng; Huang, Yueh-Min

    2009-01-01

    Current research on routing in wireless sensor computing concentrates on increasing the service lifetime, enabling scalability for a large number of sensors, and supporting fault tolerance for battery exhaustion and broken nodes. A sensor node is naturally exposed to various sources of unreliable communication channels and to node failures. Sensor nodes have many failure modes, and each failure degrades the network performance. This work develops a novel mechanism, called the Reliable Routing Mechanism (RRM), based on a hybrid cluster-based routing protocol, to identify the most reliable routing path for sensor computing. Table-driven intra-cluster routing and on-demand inter-cluster routing are combined by changing the relationship between clusters for sensor computing. Applying a reliable routing mechanism in sensor computing can improve routing reliability, maintain low packet loss, minimize management overhead, and save energy. Simulation results indicate that the reliability of the proposed RRM mechanism is around 25% higher than that of the Dynamic Source Routing (DSR) and Ad hoc On-demand Distance Vector (AODV) routing mechanisms.
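
    Selecting a most-reliable path can be reduced to a shortest-path problem: a path's reliability is the product of its hop reliabilities, and maximizing a product is equivalent to minimizing the sum of the hops' negative log-reliabilities. A minimal sketch on a toy topology using networkx; the link reliabilities are hypothetical and this illustrates the path-selection math, not the RRM protocol itself:

        import math
        import networkx as nx

        # Toy sensor network: edge attribute 'rel' is the per-hop delivery reliability
        G = nx.Graph()
        G.add_edge("src", "a", rel=0.95); G.add_edge("a", "sink", rel=0.90)
        G.add_edge("src", "b", rel=0.99); G.add_edge("b", "c", rel=0.99)
        G.add_edge("c", "sink", rel=0.99)

        # Maximizing the product of reliabilities == minimizing sum of -log(rel)
        for u, v, d in G.edges(data=True):
            d["w"] = -math.log(d["rel"])
        path = nx.shortest_path(G, "src", "sink", weight="w")
        rel = math.exp(-nx.shortest_path_length(G, "src", "sink", weight="w"))
        print(path, f"end-to-end reliability ~ {rel:.3f}")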

  19. Performance evaluation of non-targeted peak-based cross-sample analysis for comprehensive two-dimensional gas chromatography-mass spectrometry data and application to processed hazelnut profiling.

    PubMed

    Kiefl, Johannes; Cordero, Chiara; Nicolotti, Luca; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo

    2012-06-22

    The continuous interest in non-targeted profiling has driven the development of tools for automated cross-sample analysis. Such tools were found to be either selective or not comprehensive, thus delivering a biased view of the qualitative/quantitative peak distribution across 2D sample chromatograms. Therefore, the performance of non-targeted approaches needs to be critically evaluated. This study focused on the development of a validation procedure for non-targeted, peak-based GC×GC-MS data profiling. The procedure introduced performance parameters such as specificity, precision, accuracy, and uncertainty for a profiling method known as Comprehensive Template Matching. The performance was assessed by applying a three-week validation protocol based on CITAC/EURACHEM guidelines. Optimized ¹D and ²D retention-time search windows, the MS match factor threshold, the detection threshold, and the template threshold were derived from two training sets by a semi-automated learning process. The effectiveness of the proposed settings in consistently matching 2D peak patterns was established by evaluating the rate of mismatched peaks and was expressed in terms of result accuracy. The study utilized 23 different 2D peak patterns providing the chemical fingerprints of raw and roasted hazelnuts (Corylus avellana L.) from different geographical origins, of diverse varieties, and with different roasting degrees. The validation results show that non-targeted peak-based profiling can be reliable, with error rates lower than 10% independent of the degree of analytical variance. The optimized Comprehensive Template Matching procedure was employed to study hazelnut roasting profiles, in particular to find marker compounds strongly dependent on the thermal treatment, to establish the correlation of potential marker compounds with geographical origin and variety/cultivar, and finally to reveal the characteristic release of aroma-active compounds. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Reliability Evaluation of Base-Metal-Electrode Multilayer Ceramic Capacitors for Potential Space Applications

    NASA Technical Reports Server (NTRS)

    Liu, David (Donhang); Sampson, Michael J.

    2011-01-01

    Base-metal-electrode (BME) ceramic capacitors are being investigated for possible use in high-reliability space-level applications. This paper focuses on how the construction and microstructure of BME capacitors affect their lifetime and reliability. Examination of the construction and microstructure of commercial off-the-shelf (COTS) BME capacitors reveals great variance in dielectric layer thickness, even among BME capacitors with the same rated voltage. Compared to PME (precious-metal-electrode) capacitors, BME capacitors exhibit a denser and more uniform microstructure, with an average grain size between 0.3 and 0.5 µm, which is much smaller than that of most PME capacitors. BME capacitors can be fabricated with more internal electrode layers and thinner dielectric layers than PME capacitors because they have a fine-grained microstructure and do not shrink much during ceramic sintering. This makes it possible for BME capacitors to achieve a very high capacitance volumetric efficiency. The reliability of BME and PME capacitors was investigated using highly accelerated life testing (HALT). Most BME capacitors were found to fail with an early avalanche breakdown, followed by a regular dielectric wearout failure during the HALT test. When most of the early failures, characterized by avalanche breakdown, were removed, BME capacitors exhibited a minimum mean time-to-failure (MTTF) of more than 10^5 years at room temperature and rated voltage. Dielectric thickness was found to be a critical parameter for the reliability of BME capacitors. The number of stacked grains in a dielectric layer appears to play a significant role in determining BME capacitor reliability. Although dielectric layer thickness varies for a given rated voltage in BME capacitors, the number of stacked grains is relatively consistent, typically around 12 for a number of BME capacitors with a rated voltage of 25 V. This may suggest that the number of grains per dielectric layer is more critical than the

  1. Reliability Generalization (RG) Analysis: The Test Is Not Reliable

    ERIC Educational Resources Information Center

    Warne, Russell

    2008-01-01

    Literature shows that most researchers are unaware of some of the characteristics of reliability. This paper clarifies some misconceptions by describing the procedures, benefits, and limitations of reliability generalization while using it to illustrate the nature of score reliability. Reliability generalization (RG) is a meta-analytic method…

  2. Integrating field methodology and web-based data collection to assess the reliability of the Alcohol Use Disorders Identification Test (AUDIT).

    PubMed

    Celio, Mark A; Vetter-O'Hagen, Courtney S; Lisman, Stephen A; Johansen, Gerard E; Spear, Linda P

    2011-12-01

    Field methodologies offer a unique opportunity to collect ecologically valid data on alcohol use and its associated problems within natural drinking environments. However, limitations in follow-up data collection methods have left unanswered questions regarding the psychometric properties of field-based measures. The aim of the current study is to evaluate the reliability of self-report data collected in a naturally occurring environment - as indexed by the Alcohol Use Disorders Identification Test (AUDIT) - compared to self-report data obtained through an innovative web-based follow-up procedure. Individuals recruited outside of bars (N=170; mean age=21; range 18-32) provided a BAC sample and completed a self-administered survey packet that included the AUDIT. BAC feedback was provided anonymously through a dedicated web page. Upon sign in, follow-up participants (n=89; 52%) were again asked to complete the AUDIT before receiving their BAC feedback. Reliability analyses demonstrated that AUDIT scores - both continuous and dichotomized at the standard cut-point - were stable across field- and web-based administrations. These results suggest that self-report data obtained from acutely intoxicated individuals in naturally occurring environments are reliable when compared to web-based data obtained after a brief follow-up interval. Furthermore, the results demonstrate the feasibility, utility, and potential of integrating field methods and web-based data collection procedures. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  3. An Enhanced Backbone-Assisted Reliable Framework for Wireless Sensor Networks

    PubMed Central

    Tufail, Ali; Khayam, Syed Ali; Raza, Muhammad Taqi; Ali, Amna; Kim, Ki-Hyung

    2010-01-01

    An extremely reliable source to sink communication is required for most of the contemporary WSN applications especially pertaining to military, healthcare and disaster-recovery. However, due to their intrinsic energy, bandwidth and computational constraints, Wireless Sensor Networks (WSNs) encounter several challenges in reliable source to sink communication. In this paper, we present a novel reliable topology that uses reliable hotlines between sensor gateways to boost the reliability of end-to-end transmissions. This reliable and efficient routing alternative reduces the number of average hops from source to the sink. We prove, with the help of analytical evaluation, that communication using hotlines is considerably more reliable than traditional WSN routing. We use reliability theory to analyze the cost and benefit of adding gateway nodes to a backbone-assisted WSN. However, in hotline assisted routing some scenarios where source and the sink are just a couple of hops away might bring more latency, therefore, we present a Signature Based Routing (SBR) scheme. SBR enables the gateways to make intelligent routing decisions, based upon the derived signature, hence providing lesser end-to-end delay between source to the sink communication. Finally, we evaluate our proposed hotline based topology with the help of a simulation tool and show that the proposed topology provides manifold increase in end-to-end reliability. PMID:22294890

  4. SU-F-T-36: Dosimetric Comparison of Point Based Vs. Target Based Prescription for Intracavitary Brachytherapy in Cancer of the Cervix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashenafi, M; McDonald, D; Peng, J

    Purpose: Improved patient imaging used for planning the treatment of cervical cancer with Tandem and Ovoid (T&O) Intracavitary high-dose-rate brachytherapy (HDR) now allows for 3D delineation of target volumes and organs-at-risk. However, historical data relies on the conventional point A-based planning technique. A comparative dosimetric study was performed by generating both target-based (TBP) and point-based (PBP) plans for ten clinical patients. Methods: Treatment plans created using Elekta Oncentra v. 4.3 for ten consecutive cervical cancer patients were analyzed. All patients were treated with HDR using the Utrecht T&O applicator. Both CT and MRI imaging modalities were utilized to delineate clinicalmore » target volume (CTV) and organs-at-risk (rectum, sigmoid, bladder, and small bowel). Point A (left and right), vaginal mucosa, and ICRU rectum and bladder points were defined on CT. Two plans were generated for each patient using two prescription methods (PBP and TBP). 7Gy was prescribed to each point A for each PBP plan and to the target D90% for each TBP plan. Target V90%, V100%, and V200% were evaluated. In addition, D0.1cc and D2cc were analyzed for each organ-at-risk. Differences were assessed for statistical significance (p<0.05) by use of Student’s t-test. Results: Target coverage was comparable for both planning methods, with each method providing adequate target coverage. TBP showed lower absolute dose to the target volume than PBP (D90% = 7.0Gy vs. 7.4Gy, p=0.028), (V200% = 10.9cc vs. 12.8cc, p=0.014), (ALeft = 6.4Gy vs. 7Gy, p=0.009), and (ARight = 6.4Gy vs. 7Gy, p=0.013). TBP also showed a statistically significant reduction in bladder, rectum, small bowel, and sigmoid doses compared to PBP. There was no statistically significant difference in vaginal mucosa or ICRU-defined rectum and bladder dose. Conclusion: Target based prescription resulted in substantially lower dose to delineated organs-at-risk compared to point based prescription

  5. Advances in Sprint Acceleration Profiling for Field-Based Team-Sport Athletes: Utility, Reliability, Validity and Limitations.

    PubMed

    Simperingham, Kim D; Cronin, John B; Ross, Angus

    2016-11-01

    Advanced testing technologies enable insight into the kinematic and kinetic determinants of sprint acceleration performance, which is particularly important for field-based team-sport athletes. Establishing the reliability and validity of the data, particularly from the acceleration phase, is important for determining the utility of the respective technologies. The aim of this systematic review was to explain the utility, reliability, validity and limitations of (1) radar and laser technology, and (2) non-motorised treadmill (NMT) and torque treadmill (TT) technology for providing kinematic and kinetic measures of sprint acceleration performance. A comprehensive search of the CINAHL Plus, MEDLINE (EBSCO), PubMed, SPORTDiscus, and Web of Science databases was conducted using search terms that included radar, laser, non-motorised treadmill, torque treadmill, sprint, acceleration, kinetic, kinematic, force, and power. Studies examining the kinematics or kinetics of short (≤10 s), maximal-effort sprint acceleration in adults or children, which included an assessment of reliability or validity of the advanced technologies of interest, were included in this systematic review. Absolute reliability, relative reliability and validity data were extracted from the selected articles and tabulated. The level of acceptance of reliability was a coefficient of variation (CV) ≤10 % and an intraclass correlation coefficient (ICC) or correlation coefficient (r) ≥0.70. A total of 34 studies met the inclusion criteria and were included in the qualitative analysis. Generally acceptable validity (r = 0.87-0.99; absolute bias 3-7 %), intraday reliability (CV ≤9.5 %; ICC/r ≥0.84) and interday reliability (ICC ≥0.72) were reported for data from radar and laser. However, low intraday reliability was reported for the theoretical maximum horizontal force (ICC 0.64) within adolescent athletes, and low validity was reported for velocity during the initial 5 m of a sprint

  6. Faster diffraction-based overlay measurements with smaller targets using 3D gratings

    NASA Astrophysics Data System (ADS)

    Li, Jie; Kritsun, Oleg; Liu, Yongdong; Dasari, Prasad; Volkman, Catherine; Hu, Jiangtao

    2012-03-01

    Diffraction-based overlay (DBO) technologies have been developed to address the overlay metrology challenges for 22nm technology node and beyond. Most DBO technologies require specially designed targets that consist of multiple measurement pads, which consume too much space and increase measurement time. The traditional empirical approach (eDBO) using normal incidence spectroscopic reflectometry (NISR) relies on linear response of the reflectance with respect to overlay displacement within a small range. It offers convenience of quick recipe setup since there is no need to establish a model. However it requires three or four pads per direction (x or y) which adds burden to throughput and target size. Recent advances in modeling capability and computation power enabled mDBO, which allows overlay measurement with reduced number of pads, thus reducing measurement time and DBO target space. In this paper we evaluate the performance of single pad mDBO measurements using two 3D targets that have different grating shapes: squares in boxes and L-shapes in boxes. Good overlay sensitivities are observed for both targets. The correlation to programmed shifts and image-based overlay (IBO) is excellent. Despite the difference in shapes, the mDBO results are comparable for square and L-shape targets. The impact of process variations on overlay measurements is studied using a focus and exposure matrix (FEM) wafer. Although the FEM wafer has larger process variations, the correlation of mDBO results with IBO measurements is as good as the normal process wafer. We demonstrate the feasibility of single pad DBO measurements with faster throughput and smaller target size, which is particularly important in high volume manufacturing environment.

  7. Infrared small target detection in heavy sky scene clutter based on sparse representation

    NASA Astrophysics Data System (ADS)

    Liu, Depeng; Li, Zhengzhou; Liu, Bing; Chen, Wenhao; Liu, Tianmei; Cao, Lei

    2017-09-01

    A novel infrared small target detection method based on sky clutter and target sparse representation is proposed in this paper to cope with the representing uncertainty of clutter and target. The sky scene background clutter is described by fractal random field, and it is perceived and eliminated via the sparse representation on fractal background over-complete dictionary (FBOD). The infrared small target signal is simulated by generalized Gaussian intensity model, and it is expressed by the generalized Gaussian target over-complete dictionary (GGTOD), which could describe small target more efficiently than traditional structured dictionaries. Infrared image is decomposed on the union of FBOD and GGTOD, and the sparse representation energy that target signal and background clutter decomposed on GGTOD differ so distinctly that it is adopted to distinguish target from clutter. Some experiments are induced and the experimental results show that the proposed approach could improve the small target detection performance especially under heavy clutter for background clutter could be efficiently perceived and suppressed by FBOD and the changing target could also be represented accurately by GGTOD.

  8. Meta-Analysis of Scale Reliability Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2013-01-01

    A latent variable modeling approach is outlined that can be used for meta-analysis of reliability coefficients of multicomponent measuring instruments. Important limitations of efforts to combine composite reliability findings across multiple studies are initially pointed out. A reliability synthesis procedure is discussed that is based on…

  9. A Bayesian-Based Novel Methodology to Generate Reliable Site Response Mapping Sensitive to Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Chakraborty, A.; Goto, H.

    2017-12-01

    The 2011 off the Pacific coast of Tohoku earthquake caused severe damage in many areas further inside the mainland because of site-amplification. Furukawa district in Miyagi Prefecture, Japan recorded significant spatial differences in ground motion even at sub-kilometer scales. The site responses in the damage zone far exceeded the levels in the hazard maps. A reason why the mismatch occurred is that mapping follow only the mean value at the measurement locations with no regard to the data uncertainties and thus are not always reliable. Our research objective is to develop a methodology to incorporate data uncertainties in mapping and propose a reliable map. The methodology is based on a hierarchical Bayesian modeling of normally-distributed site responses in space where the mean (μ), site-specific variance (σ2) and between-sites variance(s2) parameters are treated as unknowns with a prior distribution. The observation data is artificially created site responses with varying means and variances for 150 seismic events across 50 locations in one-dimensional space. Spatially auto-correlated random effects were added to the mean (μ) using a conditionally autoregressive (CAR) prior. The inferences on the unknown parameters are done using Markov Chain Monte Carlo methods from the posterior distribution. The goal is to find reliable estimates of μ sensitive to uncertainties. During initial trials, we observed that the tau (=1/s2) parameter of CAR prior controls the μ estimation. Using a constraint, s = 1/(k×σ), five spatial models with varying k-values were created. We define reliability to be measured by the model likelihood and propose the maximum likelihood model to be highly reliable. The model with maximum likelihood was selected using a 5-fold cross-validation technique. The results show that the maximum likelihood model (μ*) follows the site-specific mean at low uncertainties and converges to the model-mean at higher uncertainties (Fig.1). This result is

  10. A standard for test reliability in group research.

    PubMed

    Ellis, Jules L

    2013-03-01

    Many authors adhere to the rule that test reliabilities should be at least .70 or .80 in group research. This article introduces a new standard according to which reliabilities can be evaluated. This standard is based on the costs or time of the experiment and of administering the test. For example, if test administration costs are 7 % of the total experimental costs, the efficient value of the reliability is .93. If the actual reliability of a test is equal to this efficient reliability, the test size maximizes the statistical power of the experiment, given the costs. As a standard in experimental research, it is proposed that the reliability of the dependent variable be close to the efficient reliability. Adhering to this standard will enhance the statistical power and reduce the costs of experiments.
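
    The abstract's worked example (administration costs of 7% of total experimental costs giving an efficient reliability of .93) is consistent with the simple relation efficient reliability = 1 - cost fraction. The sketch below assumes that relation, which the abstract does not state explicitly:

        def efficient_reliability(cost_fraction: float) -> float:
            """Assumed relation, consistent with the abstract's worked example:
            7% administration cost -> efficient reliability of .93."""
            return 1.0 - cost_fraction

        print(efficient_reliability(0.07))   # 0.93, matching the example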

  11. Research on regional intrusion prevention and control system based on target tracking

    NASA Astrophysics Data System (ADS)

    Liu, Yanfei; Wang, Jieling; Jiang, Ke; He, Yanhui; Wu, Zhilin

    2017-08-01

    In view of the fact that China’s border is very long and existing border prevention and control measures are limited, we designed a regional intrusion prevention and control system based on target tracking. The system consists of a solar panel, radar, electro-optical equipment, an unmanned aerial vehicle, and an intelligent tracking platform. The solar panel provides independent power for the entire system. The radar detects targets in real time and realizes high-precision positioning of suspicious targets; then, through linkage with the electro-optical equipment, the system achieves full-time, automatic, precise tracking of targets. When a target appears within detection range, the drone is launched to continue the tracking. The system aims to realize full-time, full-coverage, whole-process integration and active real-time control of the border area.

  12. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  13. Estimating the impact on health of poor reliability of drinking water interventions in developing countries.

    PubMed

    Hunter, Paul R; Zmirou-Navier, Denis; Hartemann, Philippe

    2009-04-01

    Recent evidence suggests that many improved drinking water supplies suffer from poor reliability. This study investigates what impact poor reliability may have on achieving health improvement targets. A Quantitative Microbiological Risk Assessment was conducted of the impact of interruptions in water supplies that force people to revert to drinking raw water. Data from the literature were used to construct models for three waterborne pathogens common in Africa: Rotavirus, Cryptosporidium, and enterotoxigenic E. coli. The risk of infection by the target pathogens is substantially greater on days when people revert to raw water consumption. Over the course of a few days of raw water consumption, almost all of the annual health benefits attributed to the improved supply will be lost. Furthermore, the risk of illness on raw-water days falls disproportionately on very young children, who have the highest risk of death following infection. Agencies responsible for implementing improved drinking water provision will not make meaningful contributions to public health targets if those systems are subject to poor reliability. Funders of water quality interventions in developing countries should put more effort into auditing whether interventions are sustainable and whether the health benefits are being achieved.
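
    A QMRA of this kind typically combines a dose-response model with per-day exposure. A minimal sketch using the standard exponential dose-response form P(infection) = 1 - exp(-r * dose), with hypothetical parameter values rather than the study's fitted models:

        import numpy as np

        r = 0.005                             # hypothetical pathogen infectivity parameter
        dose_treated, dose_raw = 0.01, 50.0   # hypothetical organisms ingested per day

        p_treated = 1 - np.exp(-r * dose_treated)
        p_raw = 1 - np.exp(-r * dose_raw)

        def annual_risk(days_raw: int) -> float:
            """Annual infection risk given a number of raw-water days."""
            return 1 - (1 - p_raw) ** days_raw * (1 - p_treated) ** (365 - days_raw)

        for d in (0, 3, 7):
            print(f"{d:2d} raw-water days -> annual risk {annual_risk(d):.3f}")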

  14. A GIS-based assessment of the suitability of SCIAMACHY satellite sensor measurements for estimating reliable CO concentrations in a low-latitude climate.

    PubMed

    Fagbeja, Mofoluso A; Hill, Jennifer L; Chatterton, Tim J; Longhurst, James W S

    2015-02-01

    An assessment of the reliability of Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) satellite sensor measurements for interpolating tropospheric concentrations of carbon monoxide was conducted, considering the low-latitude climate of the Niger Delta region in Nigeria. Monthly SCIAMACHY carbon monoxide (CO) column measurements from January 2003 to December 2005 were interpolated using the ordinary kriging technique. The spatio-temporal variations observed in the reliability were related to proximity to the Atlantic Ocean, seasonal variations in the intensity of rainfall and relative humidity, the presence of dust particles from the Sahara desert, industrialization in Southwest Nigeria, and biomass burning during the dry season in Northern Nigeria. Spatial reliabilities of 74 and 42 % were observed for the inland and coastal areas, respectively. Temporally, average reliabilities of 61 and 55 % occurred during the dry and wet seasons, respectively. Reliability in the inland and coastal areas was 72 and 38 % during the wet season, and 75 and 46 % during the dry season, respectively. Based on these results, the WFM-DOAS SCIAMACHY CO data product used in this study is relevant to the assessment of CO concentrations in low-latitude developing countries that cannot afford monitoring infrastructure because of its high cost. Although the SCIAMACHY sensor is no longer available, it provided cost-effective, reliable, and accessible data that could support air quality assessment in developing countries.

  15. Ethical Implications of Validity-vs.-Reliability Trade-Offs in Educational Research

    ERIC Educational Resources Information Center

    Fendler, Lynn

    2016-01-01

    In educational research that calls itself empirical, the relationship between validity and reliability is that of trade-off: the stronger the bases for validity, the weaker the bases for reliability (and vice versa). Validity and reliability are widely regarded as basic criteria for evaluating research; however, there are ethical implications of…

  16. Ranking targets in structure-based virtual screening of three-dimensional protein libraries: methods and problems.

    PubMed

    Kellenberger, Esther; Foata, Nicolas; Rognan, Didier

    2008-05-01

    Structure-based virtual screening is a promising tool to identify putative targets for a specific ligand. Instead of docking multiple ligands into a single protein cavity, a single ligand is docked into a collection of binding sites. In inverse screening, hits are in fact targets, prioritized within the pool of best-ranked proteins. The target rate depends on specificity and promiscuity in protein-ligand interactions and, to a considerable extent, on the effectiveness of the scoring function, which is still the Achilles' heel of molecular docking. In the present retrospective study, virtual screening of the sc-PDB target library by GOLD docking was carried out for four compounds (biotin, 4-hydroxy-tamoxifen, 6-hydroxy-1,6-dihydropurine ribonucleoside, and methotrexate) with known sc-PDB targets, and several ranking protocols based on the GOLD fitness score and topological molecular interaction fingerprint (IFP) comparison were evaluated. For the four investigated ligands, the fusion of the GOLD fitness and two IFP scores allowed the recovery of most targets, including rare proteins that are not readily suitable for statistical analysis, while significantly filtering out most false-positive entries. The current survey suggests that selecting a small number of targets (<20) for experimental evaluation is achievable with a pure structure-based approach.
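
    The fusion step can be illustrated as combining two orderings: one by docking score and one by interaction-fingerprint similarity to a reference binding mode. A minimal rank-sum sketch with random, hypothetical data; this illustrates the general fusion idea, not the paper's exact protocol or scores:

        import numpy as np

        def tanimoto(a, b):
            """Tanimoto similarity between two binary interaction fingerprints."""
            a, b = a.astype(bool), b.astype(bool)
            return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

        # Hypothetical targets with GOLD-like fitness scores and IFP similarities
        rng = np.random.default_rng(0)
        fitness = {"T1": 62.0, "T2": 58.5, "T3": 71.2}
        ifp_ref = rng.integers(0, 2, 32)
        ifp_sim = {t: tanimoto(rng.integers(0, 2, 32), ifp_ref) for t in fitness}

        # Simple rank-sum fusion of the two orderings (lower combined rank = better)
        rank = lambda d: {t: i for i, t in enumerate(sorted(d, key=d.get, reverse=True))}
        fused = {t: rank(fitness)[t] + rank(ifp_sim)[t] for t in fitness}
        print(sorted(fused, key=fused.get))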

  17. Applications of Human Performance Reliability Evaluation Concepts and Demonstration Guidelines

    DTIC Science & Technology

    1977-03-15

    [Only fragments of the report are recoverable: a scenario excerpt in which the ship stops dead in the water, the AN/SQS-26 operator recommends a new heading (000°), and at T + 14 minutes the target ship begins a hard turn; and list-of-tables entries including "... Various Simulated Conditions", "Human Reliability for Each Simulated Operator (Baseline Run)", and "Human and Equipment Availability under ..."]

  18. An Accurate Non-Cooperative Method for Measuring Textureless Spherical Target Based on Calibrated Lasers.

    PubMed

    Wang, Fei; Dong, Hang; Chen, Yanan; Zheng, Nanning

    2016-12-09

    Strong demand for accurate non-cooperative target measurement has been arising recently for assembly and capture tasks. Spherical objects are among the most common targets in these applications. However, the performance of traditional vision-based reconstruction methods is limited in practical use when handling poorly textured targets. In this paper, we propose a novel multi-sensor fusion system for measuring and reconstructing textureless non-cooperative spherical targets. Our system consists of four simple lasers and a visual camera. This paper presents a complete framework for estimating the geometric parameters of textureless spherical targets: (1) an approach to calibrate the extrinsic parameters between a camera and simple lasers; and (2) a method to reconstruct the 3D positions of the laser spots on the target surface and refine the results via an optimization scheme. The experimental results show that our proposed calibration method obtains a fine calibration result, comparable to state-of-the-art LRF-based methods, and that our calibrated system can estimate the geometric parameters with high accuracy in real time.
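
    Once the 3D positions of the laser spots on the surface are known, a sphere's geometric parameters follow from a least-squares fit. A minimal sketch of a linear sphere fit on synthetic points; this illustrates the estimation math, not necessarily the authors' optimization scheme:

        import numpy as np

        def fit_sphere(points):
            """Linear least-squares sphere fit: each surface point satisfies
            x^2 + y^2 + z^2 = 2ax + 2by + 2cz + (r^2 - a^2 - b^2 - c^2)."""
            A = np.hstack([2 * points, np.ones((len(points), 1))])
            f = (points ** 2).sum(axis=1)
            sol, *_ = np.linalg.lstsq(A, f, rcond=None)
            center, d = sol[:3], sol[3]
            return center, np.sqrt(d + center @ center)

        # Four laser-spot positions on a hypothetical unit sphere at the origin
        pts = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [-1, 0, 0]], dtype=float)
        center, radius = fit_sphere(pts)
        print("center:", center.round(3), "radius:", round(radius, 3))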

  19. An Accurate Non-Cooperative Method for Measuring Textureless Spherical Target Based on Calibrated Lasers

    PubMed Central

    Wang, Fei; Dong, Hang; Chen, Yanan; Zheng, Nanning

    2016-01-01

    Strong demand for accurate non-cooperative target measurement has been arising recently for assembly and capture tasks. Spherical objects are among the most common targets in these applications. However, the performance of traditional vision-based reconstruction methods is limited in practical use when handling poorly textured targets. In this paper, we propose a novel multi-sensor fusion system for measuring and reconstructing textureless non-cooperative spherical targets. Our system consists of four simple lasers and a visual camera. This paper presents a complete framework for estimating the geometric parameters of textureless spherical targets: (1) an approach to calibrate the extrinsic parameters between a camera and simple lasers; and (2) a method to reconstruct the 3D positions of the laser spots on the target surface and refine the results via an optimization scheme. The experimental results show that our proposed calibration method obtains a fine calibration result, comparable to state-of-the-art LRF-based methods, and that our calibrated system can estimate the geometric parameters with high accuracy in real time. PMID:27941705

  20. Programmable and multiparameter DNA-based logic platform for cancer recognition and targeted therapy.

    PubMed

    You, Mingxu; Zhu, Guizhi; Chen, Tao; Donovan, Michael J; Tan, Weihong

    2015-01-21

    The specific inventory of molecules on diseased cell surfaces (e.g., cancer cells) provides clinicians with an opportunity for accurate diagnosis and intervention. With the discovery of panels of cancer markers, carrying out analyses of multiple cell-surface markers is conceivable. As a step toward accomplishing this, we have recently designed a DNA-based device that is capable of performing autonomous logic-based analysis of two or three cancer cell-surface markers. Combining the specific target-recognition properties of DNA aptamers with toehold-mediated strand displacement reactions, multiple-marker-based cancer analysis can be realized with modular AND, OR, and NOT Boolean logic gates. Specifically, we report here a general approach for assembling these modular logic gates to execute programmable and higher-order profiling of multiple coexisting cell-surface markers, including several found on cancer cells, with the capacity to report a diagnostic signal and/or deliver targeted photodynamic therapy. The success of this strategy demonstrates the potential of DNA nanotechnology in facilitating targeted disease diagnosis and effective therapy.
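
    The logical behavior of such a device can be illustrated in software: each marker is a Boolean input and a gate network decides whether to report a signal. A minimal sketch with a hypothetical rule and hypothetical marker profiles; the aptamer/strand-displacement chemistry is abstracted away entirely:

        # Hypothetical diagnostic rule: (m1 AND m2) OR (m3 AND NOT m4)
        def diagnose(markers: dict) -> bool:
            return (markers["m1"] and markers["m2"]) or (markers["m3"] and not markers["m4"])

        cells = {
            "cancer-like":  {"m1": True,  "m2": True,  "m3": False, "m4": False},
            "healthy-like": {"m1": False, "m2": True,  "m3": False, "m4": True},
        }
        for name, profile in cells.items():
            print(name, "->", "signal" if diagnose(profile) else "no signal")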