Sample records for analysis method called

  1. Beluga whale, Delphinapterus leucas, vocalizations from the Churchill River, Manitoba, Canada.

    PubMed

    Chmelnitsky, Elly G; Ferguson, Steven H

    2012-06-01

    Classification of animal vocalizations is often done by a human observer using aural and visual analysis, but more efficient, automated methods have also been utilized to reduce bias and increase reproducibility. Beluga whale, Delphinapterus leucas, calls were described from recordings collected in the summers of 2006-2008 in the Churchill River, Manitoba. Calls (n=706) were classified based on aural and visual analysis, and call characteristics were measured; calls were separated into 453 whistles (64.2%; 22 types), 183 pulsed/noisy calls (25.9%; 15 types), and 70 combined calls (9.9%; seven types). Measured parameters varied within each call type, but less variation existed in pulsed and noisy call types and some combined call types than in whistles. A more efficient and repeatable hierarchical clustering method was applied to 200 randomly chosen whistles using six call characteristics as variables; twelve groups were identified. Call characteristics varied less in cluster analysis groups than in whistle types described by visual and aural analysis, and results were similar to the whistle contours described. This study provides the first description of beluga calls in Hudson Bay, and the use of two methods yields more robust interpretations and an assessment of appropriate methods for future studies.
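
    A minimal sketch of the clustering step described above, assuming six measurements per whistle are already extracted into a feature matrix (the synthetic data, standardization, and Ward linkage are illustrative assumptions, not details from the paper):

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.stats import zscore

        # Hypothetical feature matrix: 200 whistles x 6 measurements
        # (e.g., start/end frequency, duration, number of inflections).
        rng = np.random.default_rng(0)
        features = rng.normal(size=(200, 6))

        # Standardize each measurement so no single variable dominates.
        z = zscore(features, axis=0)

        # Agglomerative hierarchical clustering; cut the tree into the
        # 12 whistle groups reported in the study.
        tree = linkage(z, method="ward")
        groups = fcluster(tree, t=12, criterion="maxclust")
        print(np.bincount(groups)[1:])  # whistles per group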

  2. Simultaneous Genotype Calling and Haplotype Phasing Improves Genotype Accuracy and Reduces False-Positive Associations for Genome-wide Association Studies

    PubMed Central

    Browning, Brian L.; Yu, Zhaoxia

    2009-01-01

    We present a novel method for simultaneous genotype calling and haplotype-phase inference. Our method employs the computationally efficient BEAGLE haplotype-frequency model, which can be applied to large-scale studies with millions of markers and thousands of samples. We compare genotype calls made with our method to genotype calls made with the BIRDSEED, CHIAMO, GenCall, and ILLUMINUS genotype-calling methods, using genotype data from the Illumina 550K and Affymetrix 500K arrays. We show that our method has higher genotype-call accuracy and yields fewer uncalled genotypes than competing methods. We perform single-marker analysis of data from the Wellcome Trust Case Control Consortium bipolar disorder and type 2 diabetes studies. For bipolar disorder, the genotype calls in the original study yield 25 markers with apparent false-positive association with bipolar disorder at a p < 10⁻⁷ significance level, whereas genotype calls made with our method yield no associated markers at this significance threshold. Conversely, for markers with replicated association with type 2 diabetes, there is good concordance between genotype calls used in the original study and calls made by our method. Results from single-marker and haplotypic analysis of our method's genotype calls for the bipolar disorder study indicate that our method is highly effective at eliminating genotyping artifacts that cause false-positive associations in genome-wide association studies. Our new genotype-calling methods are implemented in the BEAGLE and BEAGLECALL software packages. PMID:19931040

  3. Beluga whale (Delphinapterus leucas) vocalizations and call classification from the eastern Beaufort Sea population.

    PubMed

    Garland, Ellen C; Castellote, Manuel; Berchok, Catherine L

    2015-06-01

    Beluga whales, Delphinapterus leucas, have a graded call system; call types exist on a continuum, making classification challenging. A description of vocalizations from the eastern Beaufort Sea beluga population during its spring migration is presented here, using both a non-parametric classification tree analysis (CART) and a Random Forest analysis. Twelve frequency and duration measurements were made on 1019 calls recorded over 14 days off Icy Cape, Alaska, resulting in 34 identifiable call types with 83% agreement in classification for both CART and Random Forest analyses. This high level of agreement in classification, with an initial subjective classification of calls into 36 categories, demonstrates that the methods applied here provide a quantitative analysis of a graded call dataset. Further, as calls cannot be attributed to individuals using single-sensor passive acoustic monitoring, these methods provide a comprehensive analysis of data in which the influence of pseudo-replication of calls from individuals is unknown. This study is the first to describe the vocal repertoire of a beluga population using a robust and repeatable methodology. A baseline eastern Beaufort Sea beluga population repertoire is presented here, against which the call repertoire of other seasonally sympatric Alaskan beluga populations can be compared.

  4. Masking as an effective quality control method for next-generation sequencing data analysis.

    PubMed

    Yun, Sajung; Yun, Sijung

    2014-12-13

    Next-generation sequencing produces base calls with low quality scores that can affect the accuracy of identifying simple nucleotide variation calls, including single nucleotide polymorphisms and small insertions and deletions. Here we compare the effectiveness of two data preprocessing methods, masking and trimming, on the accuracy of simple nucleotide variation calls in whole-genome sequence data from Caenorhabditis elegans. Masking substitutes low quality base calls with 'N's (undetermined bases), whereas trimming removes low quality bases, resulting in shorter read lengths. We demonstrate that masking is more effective than trimming in reducing the false-positive rate in single nucleotide polymorphism (SNP) calling. However, neither preprocessing method affected the false-negative rate in SNP calling with statistical significance compared to analysis without preprocessing. False-positive and false-negative rates for small insertions and deletions did not differ between masking and trimming. Although trimming is currently more common in the field, we recommend masking as the more effective preprocessing method for next-generation sequencing data analysis, since it reduces the false-positive rate in SNP calling without sacrificing the false-negative rate. The Perl script for masking is available at http://code.google.com/p/subn/. The sequencing data used in the study were deposited in the Sequence Read Archive (SRX450968 and SRX451773).
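
    A minimal sketch of the masking idea, assuming Phred-scaled quality scores have already been decoded from a FASTQ record (the threshold of 20 is illustrative, not the paper's cutoff):

        def mask_low_quality(seq: str, quals: list[int], min_q: int = 20) -> str:
            """Replace base calls below min_q with 'N' instead of trimming,
            preserving read length and downstream alignment positions."""
            return "".join(b if q >= min_q else "N" for b, q in zip(seq, quals))

        # Example read with two low-quality bases.
        print(mask_low_quality("ACGTACGT", [30, 32, 5, 31, 28, 12, 33, 35]))
        # -> ACNTANGT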

  5. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose: This study is aimed at identifying the relationships between medical school students’ academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods: A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. Results: The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion: This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for education that could improve medical students’ empathy skills. PMID:28870019

  6. Quantifying Vocal Mimicry in the Greater Racket-Tailed Drongo: A Comparison of Automated Methods and Human Assessment

    PubMed Central

    Agnihotri, Samira; Sundeep, P. V. D. S.; Seelamantula, Chandra Sekhar; Balakrishnan, Rohini

    2014-01-01

    Objective identification and description of mimicked calls is a primary component of any study on avian vocal mimicry, but few studies have adopted a quantitative approach. We used spectral feature representations commonly used in human speech analysis in combination with various distance metrics to distinguish between mimicked and non-mimicked calls of the greater racket-tailed drongo, Dicrurus paradiseus, and cross-validated the results with human assessment of spectral similarity. We found that the automated method and human subjects performed similarly in terms of the overall number of correct matches of mimicked calls to putative model calls. However, the two methods also misclassified different subsets of calls, and we achieved a maximum accuracy of 95% only when we combined the results of both methods. This study is the first to use Mel-frequency Cepstral Coefficients and Relative Spectral Amplitude-filtered Linear Predictive Coding coefficients to quantify vocal mimicry. Our findings also suggest that in spite of several advances in automated methods of song analysis, corresponding cross-validation by humans remains essential. PMID:24603717
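
    A minimal sketch of the feature-plus-distance idea, using time-averaged MFCCs and a Euclidean distance as stand-ins for the paper's full feature set and metrics (file names and parameters are illustrative):

        import numpy as np
        import librosa

        def mfcc_signature(path: str, n_mfcc: int = 13) -> np.ndarray:
            """Time-averaged MFCC vector as a compact spectral signature."""
            y, sr = librosa.load(path, sr=None, mono=True)
            return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).mean(axis=1)

        # Smaller distance = better candidate match between a drongo's
        # mimicked call and a putative model species' call (paths assumed).
        # d = np.linalg.norm(mfcc_signature("mimic.wav")
        #                    - mfcc_signature("model.wav"))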

  7. Effect of introduction of electronic patient reporting on the duration of ambulance calls.

    PubMed

    Kuisma, Markku; Väyrynen, Taneli; Hiltunen, Tuomas; Porthan, Kari; Aaltonen, Janne

    2009-10-01

    We examined the effect of the change from paper records to electronic patient records (EPRs) on ambulance call duration. We retrieved call duration times 6 months before (group 1) and 6 months after (group 2) the introduction of the EPR. Subgroup analysis of group 2 was performed according to whether the calls were made during the first or last 3 months after EPR introduction. We analyzed 37,599 ambulance calls (17,950 in group 1 and 19,649 in group 2). The median call duration was 48 minutes in group 1 and 49 minutes in group 2 (P = .008). In group 2, call duration was longer during the first 3 months after EPR introduction. In multiple linear regression analysis, urgency category (P < .0001), unit level (P < .0001), and transportation decision (P < .0001) influenced call duration. The documentation method was not a significant factor. An electronic patient record system can be implemented in an urban ambulance service in such a way that the documentation method does not become a significant factor in determining call duration in the long run. A temporary performance drop during the first 3 months after introduction was noticed, reflecting an adaptation process to a new way of working.

  8. Static analysis of class invariants in Java programs

    NASA Astrophysics Data System (ADS)

    Bonilla-Quintero, Lidia Dionisia

    2011-12-01

    This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important both for compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present the original DC-invariant analysis from Adam Webber, discuss its shortcomings, and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all that one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; however, the method analysis can be highly tuned to the problem at hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis to extend it to an analysis that would yield class-wide invariants describing the shapes of linked data structures. We have a prototype implementation, a system we refer to as "the analyzer", that infers DC-invariant unary and binary relations and provides them to the user in a human-readable format. The analyzer uses those relations to identify unnecessary array bounds checks in Java programs and to perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred beyond Webber's original results. The technique that produces the best results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy programs.

  9. Scenario-Based Case Study Method and the Functionality of the Section Called "From Production to Consumption" from the Perspective of Primary School Students

    ERIC Educational Resources Information Center

    Taneri, Ahu

    2018-01-01

    In this research, the aim was to present students' evaluations of the scenario-based case study method and to show the functionality of the studied section called "From Production to Consumption". Qualitative research method and content analysis were used to reveal participants' experiences and meaningful relations regarding…

  10. The linguistic and interactional factors impacting recognition and dispatch in emergency calls for out-of-hospital cardiac arrest: a mixed-method linguistic analysis study protocol

    PubMed Central

    Riou, Marine; Ball, Stephen; Williams, Teresa A; Whiteside, Austin; O’Halloran, Kay L; Bray, Janet; Perkins, Gavin D; Cameron, Peter; Fatovich, Daniel M; Inoue, Madoka; Bailey, Paul; Brink, Deon; Smith, Karen; Della, Phillip; Finn, Judith

    2017-01-01

    Introduction: Emergency telephone calls placed by bystanders are crucial to the recognition of out-of-hospital cardiac arrest (OHCA), fast ambulance dispatch and initiation of early basic life support. Clear and efficient communication between caller and call-taker is essential to this time-critical emergency, yet few studies have investigated the impact that linguistic factors may have on the nature of the interaction and the resulting trajectory of the call. This research aims to provide a better understanding of communication factors impacting on the accuracy and timeliness of ambulance dispatch. Methods and analysis: A dataset of OHCA calls and their corresponding metadata will be analysed from an interdisciplinary perspective, combining linguistic analysis and health services research. The calls will be transcribed and coded for linguistic and interactional variables and then used to answer a series of research questions about the recognition of OHCA and the delivery of basic life-support instructions to bystanders. Linguistic analysis of calls will provide a deeper understanding of the interactional dynamics between caller and call-taker which may affect recognition and dispatch for OHCA. Findings from this research will translate into recommendations for modifications of the protocols for ambulance dispatch and provide directions for further research. Ethics and dissemination: The study has been approved by the Curtin University Human Research Ethics Committee (HR128/2013) and the St John Ambulance Western Australia Research Advisory Group. Findings will be published in peer-reviewed journals and communicated to key audiences, including ambulance dispatch professionals. PMID:28694349

  11. The mediating effect of calling on the relationship between medical school students' academic burnout and empathy.

    PubMed

    Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok

    2017-09-01

    This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for education that could improve medical students' empathy skills.

  12. Steepest descent with momentum for quadratic functions is a version of the conjugate gradient method.

    PubMed

    Bhaya, Amit; Kaszkurewicz, Eugenius

    2004-01-01

    It is pointed out that the so-called momentum method, much used in the neural network literature as an acceleration of the backpropagation method, is a stationary version of the conjugate gradient method. Connections with the continuous optimization method known as heavy ball with friction are also made. In both cases, adaptive (dynamic) choices of the so-called learning rate and momentum parameters are obtained using a control Liapunov function analysis of the system.
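
    To make the correspondence concrete (notation introduced here for illustration, not taken from the paper): for the quadratic f(x) = (1/2) x^T A x - b^T x, the momentum (heavy-ball) iteration is

        x_{k+1} = x_k - \alpha\,\nabla f(x_k) + \beta\,(x_k - x_{k-1}),
        \qquad \nabla f(x_k) = A x_k - b.

    The conjugate gradient method admits the same two-term recursion with step-dependent coefficients \alpha_k and \beta_k chosen adaptively by line search and conjugacy; holding them constant gives the stationary version the abstract refers to.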

  13. Traditional Mold Analysis Compared to a DNA-based Method of Mold Analysis with Applications in Asthmatics' Homes

    EPA Science Inventory

    Traditional environmental mold analysis is based on microscopic observations and counting of mold structures collected from the air on a sticky surface or culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...

  14. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  15. Generalized Structured Component Analysis

    ERIC Educational Resources Information Center

    Hwang, Heungsun; Takane, Yoshio

    2004-01-01

    We propose an alternative method to partial least squares for path analysis with components, called generalized structured component analysis. The proposed method replaces factors by exact linear combinations of observed variables. It employs a well-defined least squares criterion to estimate model parameters. As a result, the proposed method…

  16. A comparative analysis of the statistical properties of large mobile phone calling networks.

    PubMed

    Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N

    2014-05-30

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide us with a valuable proxy for the understanding of human communication patterns embedded in social networks. Mobile phone users call each other, forming a directed calling network. If only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested, and it results in two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of these four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties and also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide us with an intriguing picture of a representative large social network that might shed new light on the modelling of large social networks.
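
    A minimal sketch of the two basic constructions, assuming call records are (caller, callee) pairs; the statistically validated Bonferroni networks are omitted here:

        import networkx as nx

        # Hypothetical call detail records as (caller, callee) pairs.
        records = [("a", "b"), ("a", "b"), ("b", "a"), ("a", "c"), ("c", "d")]

        # Directed calling network: an edge for every ordered pair observed.
        calling = nx.DiGraph()
        calling.add_edges_from(records)

        # Undirected mutual calling network: keep only reciprocal pairs.
        mutual = nx.Graph()
        mutual.add_edges_from(
            (u, v) for u, v in calling.edges() if calling.has_edge(v, u)
        )
        print(list(mutual.edges()))  # [('a', 'b')]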

  17. Beluga whale, Delphinapterus leucas, vocalizations and their relation to behaviour in the Churchill River, Manitoba, Canada

    NASA Astrophysics Data System (ADS)

    Chmelnitsky, Elly Golda

    The investigation of a species' repertoire and the contexts in which different calls are used is central to understanding vocal communication among animals. Beluga whale, Delphinapterus leucas, calls were classified and described in association with behaviours, from recordings collected in the Churchill River, Manitoba, during the summers of 2006-2008. Calls were subjectively classified based on sound and visual analysis into whistles (64.2% of total calls; 22 call types), pulsed or noisy calls (25.9%; 15 call types), and combined calls (9.9%; seven types). A hierarchical cluster analysis, using six call measurements as variables, separated whistles into 12 groups, and results were compared to the subjective classification. Beluga calls associated with social interactions, travelling, feeding, and interactions with the boat were described. Call type percentages, relative proportions of different whistle contours (shapes), average frequency, and call duration varied with behaviour. Generally, higher percentages of whistles, more broadband pulsed and noisy calls, and shorter calls (<0.49 s) were produced during behaviours associated with higher levels of activity and/or apparent arousal. Information on call types, call characteristics, and behavioural context of calls can be used for automated detection and classification methods and in future studies on call meaning and function.

  18. Are EMS call volume predictions based on demand pattern analysis accurate?

    PubMed

    Brown, Lawrence H; Lerner, E Brooke; Larmon, Baxter; LeGassick, Todd; Taigman, Michael

    2007-01-01

    Most EMS systems determine the number of crews they will deploy in their communities, and when those crews will be scheduled, based on anticipated call volumes. Many systems use historical data to calculate their anticipated call volumes, a method of prediction known as demand pattern analysis. The objective of this study was to evaluate the accuracy of call volume predictions calculated using demand pattern analysis. Seven EMS systems provided 73 consecutive weeks of hourly call volume data. The first 20 weeks of data were used to calculate three common demand pattern analysis constructs for call volume prediction: average peak demand (AP), smoothed average peak demand (SAP), and 90th percentile rank (90%R). The 21st week served as a buffer. Actual call volumes in the last 52 weeks were then compared to the predicted call volumes using descriptive statistics. There were 61,152 hourly observations in the test period. All three constructs accurately predicted peaks and troughs in call volume but not exact call volume. Predictions were accurate (+/-1 call) 13% of the time using AP, 10% using SAP, and 19% using 90%R. Call volumes were overestimated 83% of the time using AP, 86% using SAP, and 74% using 90%R. When call volumes were overestimated, predictions exceeded actual call volume by a median (interquartile range) of 4 (2-6) calls for AP, 4 (2-6) for SAP, and 3 (2-5) for 90%R. Call volumes were underestimated 4% of the time using AP, 4% using SAP, and 7% using 90%R predictions. When call volumes were underestimated, call volumes exceeded predictions by a median (interquartile range; maximum underestimation) of 1 (1-2; 18) call for AP, 1 (1-2; 18) for SAP, and 2 (1-3; 20) for 90%R. Results did not vary between systems. Generally, demand pattern analysis estimated or overestimated call volume, making it a reasonable predictor for ambulance staffing patterns. However, it did underestimate call volume between 4% and 7% of the time. Communities need to determine whether these rates of over- and underestimation are acceptable given their resources and local priorities.
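
    One plausible reading of the three constructs on 20 weeks of hourly counts (the exact definitions and the smoothing window are assumptions; the paper's formulas may differ):

        import numpy as np

        def demand_constructs(weekly: np.ndarray, window: int = 3):
            """weekly: hourly call counts, shape (n_weeks, 168), one column
            per hour-of-week. Returns one prediction per hour-of-week for
            each construct."""
            ap = weekly.mean(axis=0)                    # average peak demand
            kernel = np.ones(window) / window
            sap = np.convolve(ap, kernel, mode="same")  # smoothed AP
            p90 = np.percentile(weekly, 90, axis=0)     # 90th percentile rank
            return ap, sap, p90

        rng = np.random.default_rng(1)
        ap, sap, p90 = demand_constructs(rng.poisson(4.0, size=(20, 168)))
        print(ap[:3], sap[:3], p90[:3])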

  19. Scalable Open Science Approach for Mutation Calling of Tumor Exomes Using Multiple Genomic Pipelines.

    PubMed

    Ellrott, Kyle; Bailey, Matthew H; Saksena, Gordon; Covington, Kyle R; Kandoth, Cyriac; Stewart, Chip; Hess, Julian; Ma, Singer; Chiotti, Kami E; McLellan, Michael; Sofia, Heidi J; Hutter, Carolyn; Getz, Gad; Wheeler, David; Ding, Li

    2018-03-28

    The Cancer Genome Atlas (TCGA) cancer genomics dataset includes over 10,000 tumor-normal exome pairs across 33 different cancer types, in total >400 TB of raw data files requiring analysis. Here we describe the Multi-Center Mutation Calling in Multiple Cancers project, our effort to generate a comprehensive encyclopedia of somatic mutation calls for the TCGA data to enable robust cross-tumor-type analyses. Our approach accounts for variance and batch effects introduced by the rapid advancement of DNA extraction, hybridization-capture, sequencing, and analysis methods over time. We present best practices for applying an ensemble of seven mutation-calling algorithms with scoring and artifact filtering. The dataset created by this analysis includes 3.5 million somatic variants and forms the basis for PanCan Atlas papers. The results have been made available to the research community along with the methods used to generate them. This project is the result of collaboration from a number of institutes and demonstrates how team science drives extremely large genomics projects.

  20. Characteristics of calls to the Israeli hotline during the Intifada.

    PubMed

    Gilat, Itzhak; Latzer, Yael

    2007-08-01

    The present study examined the help-seeking characteristics of callers to the ten Israeli hotline centers during the Intifada, the Palestinian uprising in the Israeli-administered territories. The research method combined quantitative and qualitative analyses of the volunteers' written reports. The quantitative analysis was conducted on a sample of 21,315 structured forms, and the qualitative content analysis was carried out on a sample of 498 verbal descriptions of calls. The quantitative analysis revealed a U-shaped curve illustrating the frequency of Intifada-related calls in relation to the time of the study. The qualitative analysis showed that the main complaints of the callers were focused on direct and masked manifestations of anxiety and feelings of helplessness. The implications of the findings are discussed in terms of understanding the unique psychological response to a new kind of stress, as seen from the perspective of calls to a hotline.

  21. Detailed temporal structure of communication networks in groups of songbirds.

    PubMed

    Stowell, Dan; Gill, Lisa; Clayton, David

    2016-06-01

    Animals in groups often exchange calls, in patterns whose temporal structure may be influenced by contextual factors such as physical location and the social network structure of the group. We introduce a model-based analysis for temporal patterns of animal call timing, originally developed for networks of firing neurons. This has advantages over cross-correlation analysis in that it can correctly handle common-cause confounds and provides a generative model of call patterns with explicit parameters for the influences between individuals. It also has advantages over standard Markovian analysis in that it incorporates detailed temporal interactions which affect timing as well as sequencing of calls. Further, a fitted model can be used to generate novel synthetic call sequences. We apply the method to calls recorded from groups of domesticated zebra finch (Taeniopygia guttata) individuals. We find that the communication network in these groups has stable structure that persists from one day to the next, and that 'kernels' reflecting the temporal range of influence have a characteristic structure for a calling individual's effect on itself, its partner and on others in the group. We further find characteristic patterns of influences by call type as well as by individual.

  22. A new method of spatial analysis of irregularly spaced HLB data and biological implications

    USDA-ARS?s Scientific Manuscript database

    Field data on intensity of plant diseases is very often irregularly spaced (i.e., there are varying amounts of distance between rows, ponds, voids, roads, houses, or other land areas). A new method of analysis, sometimes called second-generation wavelet analysis, can be used on this type of irregula...

  23. ParticleCall: A particle filter for base calling in next-generation sequencing systems

    PubMed Central

    2012-01-01

    Background: Next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, thus enabling routine sequencing tasks and taking us one step closer to personalized medicine. Accuracy and lengths of their reads, however, are yet to surpass those provided by the conventional Sanger sequencing method. This motivates the search for computationally efficient algorithms capable of reliable and accurate detection of the order of nucleotides in short DNA fragments from the acquired data. Results: In this paper, we consider Illumina’s sequencing-by-synthesis platform which relies on reversible terminator chemistry and describe the acquired signal by reformulating its mathematical model as a Hidden Markov Model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base calling scheme called ParticleCall. ParticleCall is tested on a data set obtained by sequencing phiX174 bacteriophage using Illumina’s Genome Analyzer II. The results show that the developed base calling scheme is significantly more computationally efficient than the best performing unsupervised method currently available, while achieving the same accuracy. Conclusions: The proposed ParticleCall provides more accurate calls than the Illumina’s base calling algorithm, Bustard. At the same time, ParticleCall is significantly more computationally efficient than other recent schemes with similar performance, rendering it more feasible for high-throughput sequencing data analysis. Improvement of base calling accuracy will have immediate beneficial effects on the performance of downstream applications such as SNP and genotype calling. ParticleCall is freely available at https://sourceforge.net/projects/particlecall. PMID:22776067

  24. Analysis methods for tocopherols and tocotrienols

    USDA-ARS?s Scientific Manuscript database

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  25. Using telephony data to facilitate discovery of clinical workflows.

    PubMed

    Rucker, Donald W

    2017-04-19

    Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units, using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed, with a total duration of 23,186 hours. Approximately half of all caller-callee number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
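
    A minimal sketch of the parsing step, assuming a CSV export of call detail records with hypothetical caller/callee columns (real unified-communications logs use different field names):

        import csv
        from collections import Counter

        def repetitive_edges(path: str, min_calls: int = 10) -> Counter:
            """Count caller->callee call volumes and keep the repetitive,
            high-volume patterns as candidate graph edges."""
            volume = Counter()
            with open(path, newline="") as f:
                for row in csv.DictReader(f):  # assumed columns: caller,callee
                    volume[(row["caller"], row["callee"])] += 1
            return Counter({p: n for p, n in volume.items() if n >= min_calls})

        # Each surviving (caller, callee) pair becomes one line of an edge
        # file; a node file would map numbers to cost centers for display.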

  26. The Five Star Method: A Relational Dream Work Methodology

    ERIC Educational Resources Information Center

    Sparrow, Gregory Scott; Thurston, Mark

    2010-01-01

    This article presents a systematic method of dream work called the Five Star Method. Based on cocreative dream theory, which views the dream as the product of the interaction between dreamer and dream, this creative intervention shifts the principal focus in dream analysis from the interpretation of static imagery to the analysis of the dreamer's…

  27. Calling depths of baleen whales from single sensor data: development of an autocorrelation method using multipath localization.

    PubMed

    Valtierra, Robert D; Glynn Holt, R; Cholewiak, Danielle; Van Parijs, Sofie M

    2013-09-01

    Multipath localization techniques have not previously been applied to baleen whale vocalizations due to difficulties in application to tonal vocalizations. Here it is shown that an autocorrelation method coupled with the direct reflected time difference of arrival localization technique can successfully resolve location information. A derivation was made to model the autocorrelation of a direct signal and its overlapping reflections to illustrate that an autocorrelation may be used to extract reflection information from longer duration signals containing a frequency sweep, such as some calls produced by baleen whales. An analysis was performed to characterize the difference in behavior of the autocorrelation when applied to call types with varying parameters (sweep rate, call duration). The method's feasibility was tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The method was then used to estimate the depth and range of a single North Atlantic right whale (Eubalaena glacialis) and humpback whale (Megaptera novaeangliae) from two separate experiments.
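
    A synthetic illustration of the idea, with every signal parameter assumed rather than taken from the paper: the autocorrelation of a received frequency sweep plus its delayed, attenuated reflection peaks near the direct-reflected time difference of arrival:

        import numpy as np

        fs = 4000.0                                      # sample rate (Hz)
        t = np.arange(0, 1.0, 1 / fs)
        sweep = np.sin(2 * np.pi * (100 + 50 * t) * t)   # 100-200 Hz sweep

        d = int(0.120 * fs)                              # assumed echo delay
        echo = 0.5 * np.concatenate([np.zeros(d), sweep[:-d]])
        received = sweep + echo

        # One-sided autocorrelation; after skipping the zero-lag main lobe,
        # the largest remaining peak sits near the direct-reflected delay.
        n = len(received)
        ac = np.correlate(received, received, mode="full")[n - 1:]
        skip = int(0.05 * fs)
        lag = np.argmax(ac[skip:]) + skip
        print(lag / fs)  # ~0.120 s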

  28. Admixture Aberration Analysis: Application to Mapping in Admixed Population Using Pooled DNA

    NASA Astrophysics Data System (ADS)

    Bercovici, Sivan; Geiger, Dan

    Admixture mapping is a gene mapping approach used for the identification of genomic regions harboring disease susceptibility genes in the case of recently admixed populations such as African Americans. We present a novel method for admixture mapping, called admixture aberration analysis (AAA), that uses a DNA pool of affected admixed individuals. We demonstrate through simulations that AAA is a powerful and economical mapping method under a range of scenarios, capturing complex human diseases such as hypertension and end stage kidney disease. The method has a low false-positive rate and is robust to deviation from model assumptions. Finally, we apply AAA on 600 prostate cancer-affected African Americans, replicating a known risk locus. Simulation results indicate that the method can yield over 96% reduction in genotyping. Our method is implemented as a Java program called AAAmap and is freely available.

  29. Examining the effectiveness of discriminant function analysis and cluster analysis in species identification of male field crickets based on their calling songs.

    PubMed

    Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini

    2013-01-01

    Traditional taxonomy based on morphology has often failed in accurate species identification owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls in several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding their appropriate usage in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in the identification and classification of field cricket species belonging to the subfamily Gryllinae based on their acoustic signals. Using a comparative approach, we evaluated, for both methods, the optimal number of species and the calling song characteristics that lead to the most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using discriminant function analysis is the need for a priori classification of songs. Accuracy of classification using cluster analysis, which does not require a priori knowledge, was maximal for 6-7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals. They also show that cluster analysis of acoustic signals in crickets works effectively for species classification and identification.
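
    A minimal sketch contrasting the two approaches on synthetic song features (the feature values, species count, and the use of scikit-learn's LDA and agglomerative clustering are illustrative assumptions):

        import numpy as np
        from sklearn.cluster import AgglomerativeClustering
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.metrics import adjusted_rand_score
        from sklearn.model_selection import cross_val_score

        # Synthetic calling-song features (e.g., syllable period, carrier
        # frequency) for 6 species with 30 songs each.
        rng = np.random.default_rng(2)
        X = np.vstack([rng.normal(loc=i, scale=0.4, size=(30, 4))
                       for i in range(6)])
        y = np.repeat(np.arange(6), 30)

        # Supervised DFA: needs a priori labels; scored by cross-validation.
        dfa_acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()

        # Unsupervised clustering: no labels; compare its grouping to truth.
        labels = AgglomerativeClustering(n_clusters=6).fit_predict(X)
        print(dfa_acc, adjusted_rand_score(y, labels))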

  30. Identifying Key Words in 9-1-1 Calls for Stroke: A Mixed Methods Approach.

    PubMed

    Richards, Christopher T; Wang, Baiyang; Markul, Eddie; Albarran, Frank; Rottman, Doreen; Aggarwal, Neelum T; Lindeman, Patricia; Stein-Spencer, Leslee; Weber, Joseph M; Pearlman, Kenneth S; Tataris, Katie L; Holl, Jane L; Klabjan, Diego; Prabhakaran, Shyam

    2017-01-01

    Identifying stroke during a 9-1-1 call is critical to timely prehospital care. However, emergency medical dispatchers (EMDs) recognize stroke in less than half of 9-1-1 calls, potentially due to the words used by callers to communicate stroke signs and symptoms. We hypothesized that callers do not typically use words and phrases considered to be classical descriptors of stroke, such as focal neurologic deficits, but that a mixed-methods approach can identify words and phrases commonly used by 9-1-1 callers to describe acute stroke victims. We performed a mixed-method, retrospective study of 9-1-1 call audio recordings for adult patients with confirmed stroke who were transported by ambulance in a large urban city. Content analysis, a qualitative methodology, and computational linguistics, a quantitative methodology, were used to identify key words and phrases used by 9-1-1 callers to describe acute stroke victims. Because a caller's level of emotional distress contributes to the communication during a 9-1-1 call, the Emotional Content and Cooperation Score was rated by a multidisciplinary team. A total of 110 9-1-1 calls, received between June and September 2013, were analyzed. EMDs recognized stroke in 48% of calls, and the emotional state of most callers (95%) was calm. In 77% of calls in which EMDs recognized stroke, callers specifically used the word "stroke"; however, the word "stroke" was used in only 38% of calls. Vague, non-specific words and phrases were used to describe stroke victims' symptoms in 55% of calls, and 45% of callers used distractor words and phrases suggestive of non-stroke emergencies. Focal neurologic symptoms were described in 39% of calls. Computational linguistics identified 9 key words that were more commonly used in calls where the EMD identified stroke. These words were concordant with terms identified through qualitative content analysis. Most 9-1-1 callers used vague, non-specific, or distractor words and phrases and infrequently provided classic stroke descriptions during 9-1-1 calls for stroke. Both qualitative and quantitative methodologies identified similar key words and phrases associated with accurate EMD stroke recognition. This study suggests that tools incorporating commonly used words and phrases could potentially improve EMD stroke recognition.

  31. Conduits to care: call lights and patients’ perceptions of communication

    PubMed Central

    Montie, Mary; Shuman, Clayton; Galinato, Jose; Patak, Lance; Anderson, Christine A; Titler, Marita G

    2017-01-01

    Background: Call light systems remain the primary means for hospitalized patients to initiate communication with their health care providers. Although there is a vast literature discussing patient communication with health care providers, few studies have explored patients’ perceptions concerning call light use and communication. The specific aim of this study was to solicit patients’ perceptions regarding their call light use and communication with nursing staff. Methods: Patients invited to this study met the following inclusion criteria: proficient in English, hospitalized for at least 24 hours, aged ≥21 years, and able to communicate verbally (e.g., not intubated). Thirty participants provided written informed consent, were enrolled in the study, and completed interviews. Results: Using qualitative descriptive methods, five major themes emerged from patients’ perceptions: establishing connectivity, participant safety concerns, no separation between health care and the call light device, issues with the current call light, and participants’ perceptions of “nurse work”. Multiple minor themes supported these major themes. Data analysis utilized the constant comparative methods of Glaser and Strauss. Discussion: Findings from this study extend our knowledge of patients’ understanding not only of why inconsistencies occur between the call light and their nurses, but also of why the call light is more than merely a device to initiate communication; rather, it is a direct conduit to their health care and its delivery. PMID:29075125

  32. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS

    PubMed Central

    Einstein, Daniel R.; Dyedov, Vladimir

    2010-01-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer-assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise-resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546

  33. Digital Stratigraphy: Contextual Analysis of File System Traces in Forensic Science.

    PubMed

    Casey, Eoghan

    2017-12-28

    This work introduces novel methods for conducting forensic analysis of file allocation traces, collectively called digital stratigraphy. These in-depth forensic analysis methods can provide insight into the origin, composition, distribution, and time frame of strata within storage media. Using case examples and empirical studies, this paper illuminates the successes, challenges, and limitations of digital stratigraphy. This study also shows how understanding file allocation methods can provide insight into concealment activities and how real-world computer usage can complicate digital stratigraphy. Furthermore, this work explains how forensic analysts have misinterpreted traces of normal file system behavior as indications of concealment activities. This work raises awareness of the value of taking the overall context into account when analyzing file system traces. This work calls for further research in this area and for forensic tools to provide necessary information for such contextual analysis, such as highlighting mass deletion, mass copying, and potential backdating.

  34. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model, together with a set of input-output relations, uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept, since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Because method development of structural identifiability techniques for mixed-effects models has received very little attention despite the wide use of such models, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible.

  35. Mining of Business-Oriented Conversations at a Call Center

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hironori; Nasukawa, Tetsuya; Watanabe, Hideo

    Recently, it has become feasible to transcribe textual records from telephone conversations at call centers by using automatic speech recognition. In this research, we extended a text mining system for call summary records and constructed a conversation mining system for business-oriented conversations at a call center. To acquire useful business insights from conversational data through a text mining system, it is critical to identify appropriate textual segments and expressions as the viewpoints to focus on. In the analysis of call summary data using a text mining system, experts defined the viewpoints for the analysis by looking at sample records and by preparing dictionaries based on frequent keywords in the sample dataset. However, with conversations it is difficult to identify such viewpoints manually and in advance, because the target data consist of complete transcripts that are often lengthy and redundant. In this research, we defined a model of business-oriented conversations and proposed a mining method to identify segments that have impacts on the outcomes of the conversations and to extract useful expressions from each of these identified segments. In the experiment, we processed real datasets from a car rental service center and constructed a mining system. With this system, we show the effectiveness of the method based on the defined conversation model.

  36. Directional ratio based on parabolic molecules and its application to the analysis of tubular structures

    NASA Astrophysics Data System (ADS)

    Labate, Demetrio; Negi, Pooran; Ozcan, Burcin; Papadakis, Manos

    2015-09-01

    As advances in imaging technologies make more and more data available for biomedical applications, there is an increasing need to develop efficient quantitative algorithms for the analysis and processing of imaging data. In this paper, we introduce an innovative multiscale approach called the Directional Ratio, which is especially effective at distinguishing isotropic from anisotropic structures. This task is especially useful in the analysis of images of neurons, the main units of the nervous system, which consist of a main cell body called the soma and many elongated processes called neurites. We analyze the theoretical properties of our method on idealized models of neurons and develop a numerical implementation of this approach for the analysis of fluorescent images of cultured neurons. We show that this algorithm is very effective for the detection of somas and the extraction of neurites in images of small circuits of neurons.

  37. 24 CFR 3500.17 - Escrow accounts.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...: Aggregate (or) composite analysis, hereafter called aggregate analysis, means an accounting method a... advances funds for a borrower, then the servicer must perform an escrow account analysis before seeking.... If a servicer advances funds in paying a disbursement, which is not the result of a borrower's...

  38. 24 CFR 3500.17 - Escrow accounts.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...: Aggregate (or) composite analysis, hereafter called aggregate analysis, means an accounting method a... advances funds for a borrower, then the servicer must perform an escrow account analysis before seeking.... If a servicer advances funds in paying a disbursement, which is not the result of a borrower's...

  39. 24 CFR 3500.17 - Escrow accounts.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...: Aggregate (or) composite analysis, hereafter called aggregate analysis, means an accounting method a... advances funds for a borrower, then the servicer must perform an escrow account analysis before seeking.... If a servicer advances funds in paying a disbursement, which is not the result of a borrower's...

  40. A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research

    ERIC Educational Resources Information Center

    Rohlfing, Ingo; Schneider, Carsten Q.

    2018-01-01

    The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elective affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…

  41. Application of integrated fluid-thermal-structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment, which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response, creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient; hence, interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in LIFTS, the Langley Integrated Fluid-Thermal-Structural analyzer. The evolution and status of LIFTS are reviewed and illustrated through applications.

  42. A non-parametric peak calling algorithm for DamID-Seq.

    PubMed

    Li, Renhua; Hempel, Leonie U; Jiang, Tingbo

    2015-01-01

    Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality checks and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) read resampling; 2) read scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.
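
    A toy sketch following the four listed steps on binned read counts (the binning, pseudocounts, and threshold rule are assumptions for illustration; this is not the published NPPC implementation):

        import numpy as np

        def call_peaks(treatment, control, n_boot=1000, min_fold=2.0):
            """Toy non-parametric peak calling over binned read counts:
            bootstrap the control to estimate background variability, scale
            to equal depth, and keep bins whose fold change clears the
            larger of min_fold and a bootstrap-derived cutoff."""
            rng = np.random.default_rng(0)
            scale = treatment.sum() / control.sum()         # depth scaling
            fold = (treatment + 1) / (scale * control + 1)  # signal-to-noise
            boot = rng.choice(control, size=(n_boot, control.size), replace=True)
            null = (scale * boot + 1) / (scale * control + 1)
            cutoff = max(min_fold, np.quantile(null, 0.999))
            return np.where(fold > cutoff)[0]               # peak bin indices

        t = np.array([5, 40, 6, 55, 4], dtype=float)
        c = np.array([6, 5, 7, 6, 5], dtype=float)
        print(call_peaks(t, c))  # -> [1 3]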

  43. Optimized Vertex Method and Hybrid Reliability

    NASA Technical Reports Server (NTRS)

    Smith, Steven A.; Krishnamurthy, T.; Mason, B. H.

    2002-01-01

    A method of calculating the fuzzy response of a system is presented. This method, called the Optimized Vertex Method (OVM), is based upon the vertex method but requires considerably fewer function evaluations. The method is demonstrated by calculating the response membership function of strain-energy release rate for a bonded joint with a crack. The possibility of failure of the bonded joint was determined over a range of loads. After completing the possibilistic analysis, the possibilistic (fuzzy) membership functions were transformed to probability density functions and the probability of failure of the bonded joint was calculated. This approach is called a possibility-based hybrid reliability assessment. The possibility and probability of failure are presented and compared to a Monte Carlo Simulation (MCS) of the bonded joint.
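
    A minimal sketch of the underlying vertex method that the OVM refines, assuming the response function has no extrema in the interior of the input box at each alpha-cut (the function and cuts below are toy values):

        import itertools

        def vertex_method(f, alpha_cuts):
            """Classic vertex method: at each alpha level, evaluate f at
            every corner of the box formed by the fuzzy variables' alpha-cut
            intervals and take min/max as the response interval.
            alpha_cuts: list of (alpha, [(lo, hi), ...]) entries."""
            out = []
            for alpha, intervals in alpha_cuts:
                vals = [f(*corner) for corner in itertools.product(*intervals)]
                out.append((alpha, min(vals), max(vals)))
            return out

        # Toy response of two triangular fuzzy inputs, cut at alpha = 0 and 1.
        g = lambda a, b: a * b + a ** 2
        cuts = [(0.0, [(1.0, 3.0), (0.5, 1.5)]),
                (1.0, [(2.0, 2.0), (1.0, 1.0)])]
        print(vertex_method(g, cuts))  # [(0.0, 1.5, 13.5), (1.0, 6.0, 6.0)]

    Per the abstract, the OVM's contribution is to reach the same response intervals with considerably fewer corner evaluations than this exhaustive scheme.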

  44. Biological relevance of CNV calling methods using familial relatedness including monozygotic twins.

    PubMed

    Castellani, Christina A; Melka, Melkaye G; Wishart, Andrea E; Locke, M Elizabeth O; Awamleh, Zain; O'Reilly, Richard L; Singh, Shiva M

    2014-04-21

    Studies involving the analysis of structural variation including Copy Number Variation (CNV) have recently exploded in the literature. Furthermore, CNVs have been associated with a number of complex diseases and neurodevelopmental disorders. Common methods for CNV detection use SNP, CNV, or CGH arrays, where the signal intensities of consecutive probes are used to define the number of copies associated with a given genomic region. These practices pose a number of challenges that interfere with the ability of available methods to accurately call CNVs. It has, therefore, become necessary to develop experimental protocols to test the reliability of CNV calling methods from microarray data so that researchers can properly discriminate biologically relevant data from noise. We have developed a workflow for the integration of data from multiple CNV calling algorithms using the same array results. It uses four CNV calling programs: PennCNV (PC), Affymetrix® Genotyping Console™ (AGC), Partek® Genomics Suite™ (PGS) and Golden Helix SVS™ (GH) to analyze CEL files from the Affymetrix® Human SNP 6.0 Array™. To assess the relative suitability of each program, we used individuals of known genetic relationships. We found significant differences in CNV calls obtained by different CNV calling programs. Although the programs showed variable patterns of CNVs in the same individuals, their distribution in individuals of different degrees of genetic relatedness has allowed us to offer two suggestions. The first involves the use of multiple algorithms for the detection of the largest possible number of CNVs, and the second suggests the use of PennCNV over all other methods when the use of only one software program is desirable.
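
    A common way to integrate calls from several programs, in the spirit of the workflow described (the authors' exact integration rules are not reproduced), is to keep only calls supported by sufficient overlap across programs. A minimal Python sketch with hypothetical intervals:

      def overlap(a, b):
          """Length of the intersection of two (start, end) intervals."""
          return max(0, min(a[1], b[1]) - max(a[0], b[0]))

      def consensus(calls_by_program, min_programs=2, min_frac=0.5):
          """Keep a CNV call if calls from at least `min_programs` programs
          (including its own) overlap it by `min_frac` of its length.
          `calls_by_program` maps program name -> list of (start, end)."""
          kept = []
          programs = list(calls_by_program)
          for prog in programs:
              for cnv in calls_by_program[prog]:
                  length = cnv[1] - cnv[0]
                  support = sum(
                      any(overlap(cnv, other) >= min_frac * length
                          for other in calls_by_program[p])
                      for p in programs)
                  if support >= min_programs:
                      kept.append((prog, cnv))
          return kept

      # Illustrative calls on one chromosome (coordinates are made up).
      calls = {"PennCNV": [(100, 500), (900, 950)],
               "AGC":     [(120, 480)],
               "PGS":     [(905, 940), (2000, 2100)]}
      print(consensus(calls))   # the (2000, 2100) singleton is dropped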

  5. Bandwidth and Detection of Packet Length Covert Channels

    DTIC Science & Technology

    2011-03-01

    Shared Resource Matrix (SRM): Develop a matrix of all resources on one side and on the other all the processes. Then, determine which process uses which...system calls. This method is similar to that of the SRM. Covert channels have also been created by modulating packet timing, data and headers of network...analysis, noninterference analysis, SRM method, and the covert flow tree method [4]. These methods can be used during the design phase of a system. Less

  6. A Researcher "Called" to "Taboo" Places?: A Burgeoning Research Method in African-Centered Education

    ERIC Educational Resources Information Center

    Shockley, Kmt G.

    2009-01-01

    This article presents a self-reflexive analysis of the complexities of conducting Afrocentric education research while living with a "double consciousness." Having been "called" to places that are considered "taboo," the author takes readers on a journey that begins in his busy mind and ends on the African continent in a "rabbit hole."…

  7. The effect of call libraries and acoustic filters on the identification of bat echolocation.

    PubMed

    Clement, Matthew J; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C

    2014-09-01

    Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys.
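
    Since the identification model named here is a quadratic discriminant function analysis, the train-on-one-library, test-on-another design can be sketched compactly with scikit-learn. All feature values and species structure below are synthetic stand-ins for filter-extracted pulse parameters:

      import numpy as np
      from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

      rng = np.random.default_rng(1)

      def fake_library(n_per_species=60, shift=0.0):
          """Stand-in for pulse parameters (e.g. min frequency, max
          frequency, duration); `shift` mimics between-library drift."""
          X, y = [], []
          for species in range(4):
              mu = np.array([30 + 8*species, 60 + 5*species, 4.0]) + shift
              X.append(rng.normal(mu, 2.0, size=(n_per_species, 3)))
              y += [species] * n_per_species
          return np.vstack(X), np.array(y)

      X_train, y_train = fake_library()             # training call library
      X_same, y_same = fake_library()               # held out, same library
      X_other, y_other = fake_library(shift=1.5)    # independent library

      qda = QuadraticDiscriminantAnalysis().fit(X_train, y_train)
      print("same-library accuracy:  ", qda.score(X_same, y_same))
      print("second-library accuracy:", qda.score(X_other, y_other))

    As in the study, accuracy assessed on an independent library is typically lower than accuracy assessed on data drawn from the training library.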

  8. The effect of call libraries and acoustic filters on the identification of bat echolocation

    PubMed Central

    Clement, Matthew J; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C

    2014-01-01

    Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys. PMID:25535563

  9. The effect of call libraries and acoustic filters on the identification of bat echolocation

    USGS Publications Warehouse

    Clement, Matthew; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C

    2014-01-01

    Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys.

  10. Conduits to care: call lights and patients' perceptions of communication.

    PubMed

    Montie, Mary; Shuman, Clayton; Galinato, Jose; Patak, Lance; Anderson, Christine A; Titler, Marita G

    2017-01-01

    Call light systems remain the primary means for hospitalized patients to initiate communication with their health care providers. Although there is a vast amount of literature discussing patient communication with health care providers, few studies have explored patients' perceptions concerning call light use and communication. The specific aim of this study was to solicit patients' perceptions regarding their call light use and communication with nursing staff. Patients invited to this study met the following inclusion criteria: proficient in English, hospitalized for at least 24 hours, aged ≥21 years, and able to communicate verbally (eg, not intubated). Thirty participants provided written informed consent, were enrolled in the study, and completed interviews. Using qualitative descriptive methods, five major themes emerged from patients' perceptions: establishing connectivity; participant safety concerns; no separation: health care and the call light device; issues with the current call light; and participants' perceptions of "nurse work". Multiple minor themes supported these major themes. Data analysis utilized the constant comparative methods of Glaser and Strauss. Findings from this study extend knowledge of patients' understanding not only of why inconsistencies occur between the call light and their nurses, but also of why the call light is more than merely a device to initiate communication: rather, it is a direct conduit to their health care and its delivery.

  11. Classification of large acoustic datasets using machine learning and crowdsourcing: application to whale calls.

    PubMed

    Shamir, Lior; Yerby, Carol; Simpson, Robert; von Benda-Beckmann, Alexander M; Tyack, Peter; Samarra, Filipa; Miller, Patrick; Wallin, John

    2014-02-01

    Vocal communication is a primary communication method of killer and pilot whales, and is used to transmit a broad range of messages and information over short and long distances. The large variation in call types of these species makes it challenging to categorize them. In this study, sounds recorded by audio sensors carried by ten killer whales and eight pilot whales close to the coasts of Norway, Iceland, and the Bahamas were analyzed using computer methods and citizen scientists as part of the Whale FM project. Results show that the computer analysis automatically separated the killer whales into Icelandic and Norwegian whales, and the pilot whales into Norwegian long-finned and Bahamas short-finned pilot whales, showing that at least some whales from these locations have different acoustic repertoires that can be sensed by the computer analysis. The citizen science analysis was also able to separate the whales by location based on their sounds, but the separation was somewhat less accurate than with the computer method.

  12. On convergence and convergence rates for Ivanov and Morozov regularization and application to some parameter identification problems in elliptic PDEs

    NASA Astrophysics Data System (ADS)

    Kaltenbacher, Barbara; Klassen, Andrej

    2018-05-01

    In this paper we provide a convergence analysis of some variational methods alternative to the classical Tikhonov regularization, namely Ivanov regularization (also called the method of quasi solutions) with some versions of the discrepancy principle for choosing the regularization parameter, and Morozov regularization (also called the method of the residuals). After motivating nonequivalence with Tikhonov regularization by means of an example, we prove well-definedness of the Ivanov and the Morozov method, convergence in the sense of regularization, as well as convergence rates under variational source conditions. Finally, we apply these results to some linear and nonlinear parameter identification problems in elliptic boundary value problems.
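
    For orientation, the three variational formulations named in the abstract are commonly written as follows (a standard presentation, with \mathcal{R} a regularization functional, y^{\delta} the noisy data, and \delta the noise level):

      \begin{align*}
      \text{Tikhonov:} &\quad \min_x \;\|F(x) - y^{\delta}\|^2 + \alpha\,\mathcal{R}(x),\\
      \text{Ivanov (quasi-solutions):} &\quad \min_x \;\|F(x) - y^{\delta}\| \quad \text{s.t.} \quad \mathcal{R}(x) \le \rho,\\
      \text{Morozov (residuals):} &\quad \min_x \;\mathcal{R}(x) \quad \text{s.t.} \quad \|F(x) - y^{\delta}\| \le \tau\delta.
      \end{align*}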

  13. Mathematical Practice in Textbooks Analysis: Praxeological Reference Models, the Case of Proportion

    ERIC Educational Resources Information Center

    Wijayanti, Dyana; Winsløw, Carl

    2017-01-01

    We present a new method in textbook analysis, based on so-called praxeological reference models focused on specific content at the task level. This method implies that the mathematical content of a textbook (or textbook part) is analyzed in terms of the tasks and techniques which are exposed to or demanded from readers; this can then be interpreted…

  14. Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structural element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis, and sensitivity analysis tools. SASDO is applied to a simple, isolated, 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method, with some reduction in the computational cost and without significant modifications to the analysis tools.

  15. Fault feature analysis of cracked gear based on LOD and analytical-FE method

    NASA Astrophysics Data System (ADS)

    Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng

    2018-01-01

    At present, there are two main approaches to gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of gear cracks is presented, which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method, called the assist-stress intensity factor (assist-SIF) gear contact model, is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on a dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of tooth cracking is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE approach is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method is effective and feasible for tooth crack stiffness calculation and gear tooth crack fault diagnosis.

  16. Examining the Effectiveness of Discriminant Function Analysis and Cluster Analysis in Species Identification of Male Field Crickets Based on Their Calling Songs

    PubMed Central

    Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini

    2013-01-01

    Traditional taxonomy based on morphology has often failed in accurate species identification owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls in several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding the appropriate usage of these methods in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in identification and classification based on acoustic signals of field cricket species belonging to the subfamily Gryllinae. Using a comparative approach we evaluated the optimal number of species and calling song characteristics for both the methods that lead to most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using discriminant function analysis is the need for a priori classification of songs. Accuracy of classification using cluster analysis, which does not require a priori knowledge, was maximum for 6–7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals. Our results also show that cluster analysis of acoustic signals in crickets works effectively for species classification and identification. PMID:24086666
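
    To illustrate the cluster-analysis side of the comparison (no a priori species labels, unlike DFA), the following Python sketch applies hierarchical clustering to synthetic calling-song features; the feature names and species structure are illustrative assumptions:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(2)

      # Stand-in song features per individual (e.g. syllable period,
      # carrier frequency, chirp duration) for six hypothetical species.
      X = np.vstack([rng.normal([5.0 + 3*k, 30 + 4*k, 0.2 + 0.05*k],
                                [0.3, 0.8, 0.02], size=(20, 3))
                     for k in range(6)])

      # Standardize features, then cluster with Ward linkage; species
      # identity is never supplied to the algorithm.
      Z = linkage((X - X.mean(0)) / X.std(0), method="ward")
      labels = fcluster(Z, t=6, criterion="maxclust")
      print("recovered clusters:", np.unique(labels).size)

    In a setting like this, the study's finding that cluster analysis degrades beyond about ten taxa corresponds to increasing overlap between the per-species feature distributions.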

  17. Time-frequency and advanced frequency estimation techniques for the investigation of bat echolocation calls.

    PubMed

    Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve

    2010-02-01

    In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. Particularly, enhanced resolution techniques are developed and/or used in this specific context for the first time. When compared to traditional time-frequency representation methods, the proposed techniques are more capable of showing previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study is focused on bat echolocation recordings, the results are more general and applicable to many other types of signal.

  18. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    PubMed

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with patient outcome as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters improves the sensitivity of the method by a factor greater than four in comparison to our first analyses.
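
    The published tool set's exact parameterization is not reproduced here, but the overall structure (sliding windows over paired ABP/ICP signals, a Fourier-based coherence gate, and a count of "selected" positively correlated windows) can be sketched in Python on synthetic data; all window lengths, bands, and thresholds below are assumptions of the kind the study optimizes:

      import numpy as np
      from scipy.signal import coherence

      rng = np.random.default_rng(3)
      fs = 1.0                       # Hz, one sample per second (assumed)
      t = np.arange(6 * 3600)        # six hours of monitoring data
      abp = 80 + 5*np.sin(2*np.pi*t/300) + rng.normal(0, 1, t.size)
      icp = 10 + 2*np.sin(2*np.pi*t/300) + rng.normal(0, 1, t.size)

      win, step = 900, 300           # window length and stride, samples
      coh_thr, corr_thr = 0.5, 0.4   # tunable thresholds
      selected_pos, windows = 0, 0
      for start in range(0, t.size - win, step):
          a = abp[start:start + win]
          b = icp[start:start + win]
          f, Cxy = coherence(a, b, fs=fs, nperseg=256)
          slow = (f > 0) & (f < 0.01)    # slow-wave band (assumed)
          r = np.corrcoef(a, b)[0, 1]
          windows += 1
          # A window counts as "selected" only when coherence supports a
          # genuine coupling; the sign of r separates positive from negative.
          if Cxy[slow].mean() > coh_thr and r > corr_thr:
              selected_pos += 1

      print("SCP-like index:", selected_pos / windows)

    Optimizing parameters such as `win`, `coh_thr`, and `corr_thr` against an outcome measure is the kind of search the abstract describes.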

  19. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology

    PubMed Central

    Faltermeier, Rupert; Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with patient outcome as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters improves the sensitivity of the method by a factor greater than four in comparison to our first analyses. PMID:26693250

  20. Repertoire and classification of non-song calls in Southeast Alaskan humpback whales (Megaptera novaeangliae).

    PubMed

    Fournet, Michelle E; Szabo, Andy; Mellinger, David K

    2015-01-01

    On low-latitude breeding grounds, humpback whales produce complex and highly stereotyped songs as well as a range of non-song sounds associated with breeding behaviors. While on their Southeast Alaskan foraging grounds, humpback whales produce a range of previously unclassified non-song vocalizations. This study investigates the vocal repertoire of Southeast Alaskan humpback whales from a sample of 299 non-song vocalizations collected over a 3-month period on foraging grounds in Frederick Sound, Southeast Alaska. Three classification systems were used, including aural spectrogram analysis, statistical cluster analysis, and discriminant function analysis, to describe and classify vocalizations. A hierarchical acoustic structure was identified; vocalizations were classified into 16 individual call types nested within four vocal classes. The combined classification method shows promise for identifying variability in call stereotypy between vocal groupings and is recommended for future classification of broad vocal repertoires.

  1. Feminist Policy Analysis: Expanding Traditional Social Work Methods

    ERIC Educational Resources Information Center

    Kanenberg, Heather

    2013-01-01

    In an effort to move the methodology of policy analysis beyond the traditional and artificial position of being objective and value-free, this article is a call to those working and teaching in social work to consider a feminist policy analysis lens. A review of standard policy analysis models is presented alongside feminist models. Such a…

  2. Soil structure characterized using computed tomographic images

    Treesearch

    Zhanqi Cheng; Stephen H. Anderson; Clark J. Gantzer; J. W. Van Sambeek

    2003-01-01

    Fractal analysis of soil structure is a relatively new method for quantifying the effects of management systems on soil properties and quality. The objective of this work was to explore several methods of studying images to describe and quantify structure of soils under forest management. This research uses computed tomography and a topological method called Multiple...

  3. Development of the mathematical model for design and verification of acoustic modal analysis methods

    NASA Astrophysics Data System (ADS)

    Siner, Alexander; Startseva, Maria

    2016-10-01

    To reduce turbofan noise it is necessary to develop methods, collectively called modal analysis, for analyzing the sound field generated by the blade machinery. Because modal analysis methods are complex, and testing them against full-scale measurements is expensive and tedious, mathematical models are needed that allow modal analysis algorithms to be tested quickly and cheaply. This work presents a model that allows single modes to be set in the duct and the generated sound field to be analyzed. Modal analysis of the sound generated by a ring array of point sound sources is performed, and experimental and numerical modal analysis results are compared.
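
    On a ring of equally spaced microphones, decomposing the field into azimuthal modes reduces to a spatial DFT, which is easy to check against a synthesized single mode. A minimal Python sketch (array size, mode order, and noise level are arbitrary choices, not the authors' setup):

      import numpy as np

      M = 32                        # microphones on a ring, equally spaced
      m_set = 5                     # azimuthal mode order to synthesize
      theta = 2*np.pi*np.arange(M)/M

      # A single spinning mode produces pressure ~ exp(i*m*theta) on the
      # ring; complex noise mimics measurement error.
      rng = np.random.default_rng(4)
      p = np.exp(1j*m_set*theta) + 0.1*(rng.normal(size=M)
                                        + 1j*rng.normal(size=M))

      # Modal analysis on the ring is a spatial DFT: the amplitude of
      # mode m is (1/M) * sum_k p_k * exp(-i*m*theta_k).
      modes = np.fft.fft(p) / M
      m_axis = np.fft.fftfreq(M, d=1/M).astype(int)  # mode orders
      dominant = m_axis[np.argmax(np.abs(modes))]
      print("dominant mode order:", dominant)        # recovers m_set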

  4. A stable numerical solution method in-plane loading of nonlinear viscoelastic laminated orthotropic materials

    NASA Technical Reports Server (NTRS)

    Gramoll, K. C.; Dillard, D. A.; Brinson, H. F.

    1989-01-01

    In response to the tremendous growth in the development of advanced materials, such as fiber-reinforced plastic (FRP) composite materials, a new numerical method is developed to analyze and predict the time-dependent properties of these materials. Basic concepts in viscoelasticity, laminated composites, and previous viscoelastic numerical methods are presented. A stable numerical method, called the nonlinear differential equation method (NDEM), is developed to calculate the in-plane stresses and strains over any time period for a general laminate constructed from nonlinear viscoelastic orthotropic plies. The method is implemented in an in-plane stress analysis computer program, called VCAP, to demonstrate its usefulness and to verify its accuracy. A number of actual experimental test results performed on Kevlar/epoxy composite laminates are compared to predictions calculated from the numerical method.

  5. Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean.

    PubMed

    Stafford, K M; Fox, C G; Clark, D S

    1998-12-01

    Analysis of acoustic signals recorded by the U.S. Navy's SOund SUrveillance System (SOSUS) was used to detect and locate blue whale (Balaenoptera musculus) calls offshore in the northeast Pacific. The long, low-frequency components of these calls are characteristic of calls recorded in the presence of blue whales elsewhere in the world. Mean values of frequency and time characteristics from field-recorded blue whale calls were used to develop a simple matched filter for detecting such calls in noisy time series. The matched filter was applied to signals from three different SOSUS arrays off the coast of the Pacific Northwest to detect and associate individual calls from the same animal on the different arrays. A U.S. Navy maritime patrol aircraft was directed to an area where blue whale calls had been detected on SOSUS using these methods, and the presence of a vocalizing blue whale was confirmed at the site with field recordings from sonobuoys.
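
    The matched-filter step lends itself to a compact sketch: build a template from the mean call frequency and duration, correlate it against the noisy series, and normalize. The sampling rate, call parameters, and SNR below are illustrative, not the measured values from the study:

      import numpy as np
      from scipy.signal import correlate

      fs = 250                      # Hz, ample for ~16 Hz call components
      dur, f0 = 10.0, 16.0          # template duration (s) and frequency (Hz)
      t = np.arange(0, dur, 1/fs)
      template = np.sin(2*np.pi*f0*t) * np.hanning(t.size)

      # Noisy time series with one call buried at t = 60 s.
      rng = np.random.default_rng(5)
      series = rng.normal(0, 1.0, fs*180)
      series[fs*60:fs*60 + t.size] += 0.5 * template

      # Matched filtering is correlation with the template; normalizing by
      # the local signal energy gives a detection statistic in [-1, 1].
      mf = correlate(series, template, mode="valid")
      energy = np.convolve(series**2, np.ones(t.size), mode="valid")
      mf /= np.sqrt(np.sum(template**2) * energy)
      print(f"strongest match at {np.argmax(mf)/fs:.1f} s")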

  6. Determination of the optimal number of components in independent components analysis.

    PubMed

    Kassouf, Amine; Jouan-Rimbaud Bouveresse, Delphine; Rutledge, Douglas N

    2018-03-01

    Independent components analysis (ICA) may be considered one of the most established blind source separation techniques for the treatment of complex data sets in analytical chemistry. As with other similar methods, determining the optimal number of latent variables, in this case independent components (ICs), is a crucial step before any modeling. Therefore, validation methods are required in order to decide on the optimal number of ICs to be used in the computation of the final model. In this paper, three new validation methods are formally presented. The first one, called Random_ICA, is a generalization of the ICA_by_blocks method. Its specificity resides in the random way of splitting the initial data matrix into two blocks, and then repeating this procedure several times, giving a broader perspective for the selection of the optimal number of ICs. The second method, called KMO_ICA_Residuals, is based on the computation of the Kaiser-Meyer-Olkin (KMO) index of the transposed residual matrices obtained after progressive extraction of ICs. The third method, called ICA_corr_y, helps to select the optimal number of ICs by computing the correlations between calculated proportions and known physico-chemical information about samples, generally concentrations, or between a source signal known to be present in the mixture and the signals extracted by ICA. These three methods were tested using varied simulated and experimental data sets and compared, when necessary, to ICA_by_blocks. Results were relevant and in line with expected ones, confirming the reliability of the three proposed methods.
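
    The third validation idea, correlating extracted signals with known information about the samples, can be sketched with scikit-learn's FastICA on a synthetic mixture; this follows only the spirit of ICA_corr_y, with all sources and mixing made up for illustration:

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(6)
      n = 500
      s1 = np.sin(np.linspace(0, 40, n))             # known source signal
      s2 = np.sign(np.sin(np.linspace(0, 23, n)))    # second source
      A = rng.uniform(0.5, 1.5, (2, 4))              # mixing matrix
      X = np.column_stack([s1, s2]) @ A + rng.normal(0, 0.2, (n, 4))

      for k in range(1, 5):
          ica = FastICA(n_components=k, random_state=0, max_iter=1000)
          S = ica.fit_transform(X)
          # Best absolute correlation of any extracted IC with the known
          # source; it should saturate once k reaches the true rank.
          best = max(abs(np.corrcoef(S[:, j], s1)[0, 1]) for j in range(k))
          print(f"{k} ICs: max |corr| with known source = {best:.2f}")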

  7. Cross-correlation detection and analysis for California's electricity market based on analogous multifractal analysis

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Liao, Gui-ping; Li, Jian-hui; Zou, Rui-biao; Shi, Wen

    2013-03-01

    A novel method, which we call the analogous multifractal cross-correlation analysis, is proposed in this paper to study the multifractal behavior of the power-law cross-correlation between price and load in the California electricity market. In addition, a statistic ρAMF-XA, which we call the analogous multifractal cross-correlation coefficient, is defined to test whether the cross-correlation between two given signals is genuine or not. Our analysis finds that both the price and load time series in the California electricity market exhibit multifractal behavior. However, as indicated by the ρAMF-XA statistical test, there is a large difference in the cross-correlation behavior between the years 1999 and 2000 in the California electricity market.

  8. Cross-correlation detection and analysis for California's electricity market based on analogous multifractal analysis.

    PubMed

    Wang, Fang; Liao, Gui-ping; Li, Jian-hui; Zou, Rui-biao; Shi, Wen

    2013-03-01

    A novel method, which we call the analogous multifractal cross-correlation analysis, is proposed in this paper to study the multifractal behavior of the power-law cross-correlation between price and load in the California electricity market. In addition, a statistic ρAMF-XA, which we call the analogous multifractal cross-correlation coefficient, is defined to test whether the cross-correlation between two given signals is genuine or not. Our analysis finds that both the price and load time series in the California electricity market exhibit multifractal behavior. However, as indicated by the ρAMF-XA statistical test, there is a large difference in the cross-correlation behavior between the years 1999 and 2000 in the California electricity market.

  9. A Preliminary Rubric Design to Evaluate Mixed Methods Research

    ERIC Educational Resources Information Center

    Burrows, Timothy J.

    2013-01-01

    With the increase in the frequency of use of mixed methods, both in research publications and in externally funded grants, there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…

  10. Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) are extended from single-discipline analysis (aerodynamics only) to multidisciplinary analysis, in this case static aero-structural analysis, and applied to a simple 3-D wing problem. The method aims to reduce the computational expense incurred in performing shape optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, Finite Element Method (FEM) structural analysis, and sensitivity analysis tools. Results for this small problem show that the method reaches the same local optimum as conventional optimization. However, unlike its application to the wing alone (single-discipline analysis), the method, as implemented here, may not show a significant reduction in the computational cost. Similar reductions were seen in the two-design-variable (DV) problem results, but not in the 8-DV results given here.

  11. Improved Method for Prediction of Attainable Wing Leading-Edge Thrust

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; McElroy, Marcus O.; Lessard, Wendy B.; McCullers, L. Arnold

    1996-01-01

    Prediction of the loss of wing leading-edge thrust and the accompanying increase in drag due to lift, when flow is not completely attached, presents a difficult but commonly encountered problem. A method (called the previous method) for the prediction of attainable leading-edge thrust and the resultant effect on airplane aerodynamic performance has been in use for more than a decade. Recently, the method has been revised to enhance its applicability to current airplane design and evaluation problems. The improved method (called the present method) provides for a greater range of airfoil shapes from very sharp to very blunt leading edges. It is also based on a wider range of Reynolds numbers than was available for the previous method. The present method, when employed in computer codes for aerodynamic analysis, generally results in improved correlation with experimental wing-body axial-force data and provides reasonable estimates of the measured drag.

  12. Analyzing Students' Learning in Classroom Discussions about Socioscientific Issues

    ERIC Educational Resources Information Center

    Rudsberg, Karin; Ohman, Johan; Ostman, Leif

    2013-01-01

    In this study, the purpose is to develop and illustrate a method that facilitates investigations of students' learning processes in classroom discussions about socioscientific issues. The method, called transactional argumentation analysis, combines a transactional perspective on meaning making based on John Dewey's pragmatic philosophy and an…

  13. Getting the most out of RNA-seq data analysis.

    PubMed

    Khang, Tsung Fei; Lau, Ching Yee

    2015-01-01

    Background. A common research goal in transcriptome projects is to find genes that are differentially expressed in different phenotype classes. Biologists might wish to validate such gene candidates experimentally, or use them for downstream systems biology analysis. Producing a coherent differential gene expression analysis from RNA-seq count data requires an understanding of how numerous sources of variation, such as the replicate size, the hypothesized biological effect size, and the specific method for making differential expression calls, interact. We believe an explicit demonstration of such interactions in real RNA-seq data sets is of practical interest to biologists. Results. Using two large public RNA-seq data sets, one representing a strong and the other a mild biological effect size, we simulated different replicate size scenarios and tested the performance of several commonly used methods for calling differentially expressed genes in each of them. We found that, when the biological effect size was mild, RNA-seq experiments should focus on experimental validation of differentially expressed gene candidates. Importantly, at least triplicates must be used, and the differentially expressed genes should be called using methods with high positive predictive value (PPV), such as NOISeq or GFOLD. In contrast, when the biological effect size was strong, differentially expressed genes mined from unreplicated experiments using NOISeq, ASC, and GFOLD had between 30 and 50% mean PPV, an increase of more than 30-fold compared to the cases of mild biological effect size. Among methods with good PPV performance, having triplicates or more substantially improved mean PPV to over 90% for GFOLD, 60% for DESeq2, 50% for NOISeq, and 30% for edgeR. At a replicate size of six, we found DESeq2 and edgeR to be reasonable methods for calling differentially expressed genes at the systems level, as their PPV and sensitivity trade-off was superior to the other methods'. Conclusion. When the biological effect size is weak, systems-level investigation is not possible using RNA-seq data, and no meaningful result can be obtained in unreplicated experiments. Nonetheless, NOISeq or GFOLD may yield limited numbers of gene candidates with good validation potential when triplicates or more are available. When the biological effect size is strong, NOISeq and GFOLD are effective tools for detecting differentially expressed genes in unreplicated RNA-seq experiments for qPCR validation. When triplicates or more are available, GFOLD is a sharp tool for identifying high-confidence differentially expressed genes for targeted qPCR validation; for downstream systems-level analysis, combined results from DESeq2 and edgeR are useful.
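
    The two performance measures driving these conclusions are simple to compute once a "truth" set is fixed. A small Python helper (the gene identifiers are made up):

      def ppv_sensitivity(called, truth):
          """Positive predictive value and sensitivity of a set of
          differential-expression calls; both arguments are sets of
          gene identifiers."""
          tp = len(called & truth)
          ppv = tp / len(called) if called else float("nan")
          sens = tp / len(truth) if truth else float("nan")
          return ppv, sens

      truth = {f"g{i}" for i in range(100)}                  # truly DE genes
      called = {f"g{i}" for i in range(80)} | {"x1", "x2"}   # one tool's calls
      print(ppv_sensitivity(called, truth))                  # (~0.98, 0.80)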

  14. Multifractal detrended cross-correlation analysis for two nonstationary signals.

    PubMed

    Zhou, Wei-Xing

    2008-06-01

    We propose a method called multifractal detrended cross-correlation analysis to investigate the multifractal behaviors in the power-law cross-correlations between two time series or higher-dimensional quantities recorded simultaneously, which can be applied to diverse complex systems such as turbulence, finance, ecology, physiology, geophysics, and so on. The method is validated with cross-correlated one- and two-dimensional binomial measures and multifractal random walks. As an example, we illustrate the method by analyzing two financial time series.
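
    In the notation usual for this family of methods, the central quantity is a q-th order fluctuation function built from detrended cross-covariances in boxes of size s (a sketch of the standard formulation, not a verbatim excerpt from the paper):

      \[
      F_q(s) = \left\{ \frac{1}{2N_s} \sum_{\nu=1}^{2N_s}
               \left[ F^2(s,\nu) \right]^{q/2} \right\}^{1/q}
             \sim s^{h_{xy}(q)},
      \qquad
      F^2(s,\nu) = \frac{1}{s} \sum_{i=1}^{s}
               \left[ X_\nu(i) - \widetilde{X}_\nu(i) \right]
               \left[ Y_\nu(i) - \widetilde{Y}_\nu(i) \right],
      \]

    where X and Y are the cumulative profiles of the two series, the tildes denote local polynomial fits, and a q-dependent exponent h_xy(q) signals multifractal cross-correlation (for q = 0 a logarithmic average replaces the power mean).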

  15. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  16. Target Detection and Classification Using Seismic and PIR Sensors

    DTIC Science & Technology

    2012-06-01

    time series analysis via wavelet-based partitioning,” Signal Process...regard, this paper presents a wavelet-based method for target detection and classification. The proposed method has been validated on data sets of...The work reported in this paper makes use of a wavelet-based feature extraction method, called Symbolic Dynamic Filtering (SDF) [12]–[14]. The

  17. Analysis of a virtual memory model for maintaining database views

    NASA Technical Reports Server (NTRS)

    Kinsley, Kathryn C.; Hughes, Charles E.

    1992-01-01

    This paper presents an analytical model for predicting the performance of a new support strategy for database views. This strategy, called the virtual method, is compared with traditional methods for supporting views. The analytical model's predictions of improved performance by the virtual method are then validated by comparing these results with those achieved in an experimental implementation.

  18. [Techniques for rapid production of monoclonal antibodies for use with antibody technology].

    PubMed

    Kamada, Haruhiko

    2012-01-01

    A monoclonal antibody (Mab), due to its specific binding to a target protein, is potentially one of the most useful tools for the functional analysis of proteins in recent proteomics-based research. However, the production of Mabs is a very time-consuming and laborious process (i.e., preparation of recombinant antigens, immunization of animals, preparation of hybridomas), making it the rate-limiting step in using Mabs in high-throughput proteomics research, which relies heavily on comprehensive and rapid methods. Therefore, there is a great demand for new methods to efficiently generate Mabs against groups of proteins identified by proteome analysis. Here, we describe a useful method, called the "antibody proteomic technique", for the rapid generation of Mabs to pharmaceutical targets identified by proteomic analyses of disease samples (e.g., tumor tissue). We also introduce another method, called the "vascular proteomic technique", for finding promising targets on the vasculature. Our results suggest that this approach to the rapid generation of Mabs against proteins may be very useful in proteomics-based research as well as in clinical applications.

  19. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375
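
    QuASAR's joint genotype/ASE inference, with its error and over-dispersion parameters, is beyond a short excerpt, but the baseline test it improves upon, the binomial test of a 1:1 allelic ratio at a heterozygous site, is easy to show (the read counts are invented):

      from scipy.stats import binomtest

      # Reference/alternate read counts at heterozygous sites.
      sites = [("chr1:1200", 18, 22), ("chr2:880", 35, 12),
               ("chr7:4310", 7, 9)]

      for name, ref, alt in sites:
          res = binomtest(ref, ref + alt, p=0.5)
          flag = "candidate ASE" if res.pvalue < 0.05 else "balanced"
          print(f"{name}: ref={ref} alt={alt} "
                f"p={res.pvalue:.3f} -> {flag}")

    The point of the paper is that this naive test ignores genotype uncertainty and base-call errors, which QuASAR models explicitly.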

  20. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    PubMed

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation modeling in the Redundancy Analysis framework based on so-called Extended Redundancy Analysis (ERA) has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA we propose a simulation study of small samples. Moreover, we propose an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.

  1. The linguistic and interactional factors impacting recognition and dispatch in emergency calls for out-of-hospital cardiac arrest: a mixed-method linguistic analysis study protocol.

    PubMed

    Riou, Marine; Ball, Stephen; Williams, Teresa A; Whiteside, Austin; O'Halloran, Kay L; Bray, Janet; Perkins, Gavin D; Cameron, Peter; Fatovich, Daniel M; Inoue, Madoka; Bailey, Paul; Brink, Deon; Smith, Karen; Della, Phillip; Finn, Judith

    2017-07-09

    Emergency telephone calls placed by bystanders are crucial to the recognition of out-of-hospital cardiac arrest (OHCA), fast ambulance dispatch and initiation of early basic life support. Clear and efficient communication between caller and call-taker is essential to this time-critical emergency, yet few studies have investigated the impact that linguistic factors may have on the nature of the interaction and the resulting trajectory of the call. This research aims to provide a better understanding of communication factors impacting the accuracy and timeliness of ambulance dispatch. A dataset of OHCA calls and their corresponding metadata will be analysed from an interdisciplinary perspective, combining linguistic analysis and health services research. The calls will be transcribed and coded for linguistic and interactional variables and then used to answer a series of research questions about the recognition of OHCA and the delivery of basic life-support instructions to bystanders. Linguistic analysis of calls will provide a deeper understanding of the interactional dynamics between caller and call-taker which may affect recognition and dispatch for OHCA. Findings from this research will translate into recommendations for modifications of the protocols for ambulance dispatch and provide directions for further research. The study has been approved by the Curtin University Human Research Ethics Committee (HR128/2013) and the St John Ambulance Western Australia Research Advisory Group. Findings will be published in peer-reviewed journals and communicated to key audiences, including ambulance dispatch professionals.

  2. Review of Federal Reference Method for Ozone: Nitric Oxide-Chemiluminescence:Supplemental Material for CASAC AMMS

    EPA Science Inventory

    Approach: Per a suggestion made by CASAC AMMS members during the April 3, 2014 conference call on the Review of the Federal Reference Method for Ozone: Nitric Oxide-Chemiluminescence, ORD has performed additional data analysis activities to explain and mitigate scatter observed in the co...

  3. NMR analysis of biodiesel

    USDA-ARS?s Scientific Manuscript database

    Biodiesel is usually analyzed by the various methods called for in standards such as ASTM D6751 and EN 14214. Nuclear magnetic resonance (NMR) is not one of these methods. However, NMR, with 1H-NMR commonly applied, can be useful in a variety of applications related to biodiesel. These include monit...

  4. Evaluation of Methods for Analysis of Lead in Air Particulates: An Intra-Laboratory and Inter-Laboratory Comparison

    EPA Science Inventory

    In 2008, the United States Environmental Protection Agency (USEPA) set a new National Ambient Air Quality Standard (NAAQS) for lead (Pb) in total suspended particulate matter (Pb-TSP) which called for significant decreases in the allowable limits. The Federal Reference Method (FR...

  5. Exploring the communication between telenurse and caller-a critical discourse analysis.

    PubMed

    Hakimnia, Roya; Holmström, Inger K; Carlsson, Marianne; Höglund, Anna T

    2014-01-01

    Telenursing is an expanding service in most Western societies. Sweden is a front-line country, with all of its 21 counties connected to Swedish Healthcare Direct (SHD) 1177. The intention of the service is twofold: to make health care more efficient, while also making it more accessible and safe for patients. Previous research has shown, however, that the service is not used equitably. Gender, age, socio-economic, and ethnicity differences have been reported as determining factors for the use of the service and the advice given. The aim of the study was to explore the communication between telenurses and callers in authentic calls to SHD 1177. A qualitative method, using critical discourse analysis (CDA), was chosen. The approach was deductive, that is, the analysis was made in view of a predetermined framework of theory. Twenty calls were strategically chosen and included in the study. The CDA resulted in five types of calls, namely a gatekeeping call, a gendered call, a call marked by impersonal traits, a call with voices of the life world, and finally a counter discourse call. The dominating patterns in the calls were of gatekeeping and biomedical character. Patterns of the societal gender order were found, in that representations of the reluctant male caller and the ideal female caller were identified, but also a call representing a counter discourse. The service seemed difficult to use for patients with low language proficiency. Telenursing could potentially challenge inequalities in health care. However, the discourse of telenursing is dialectically related to neoliberal ideology and the ideology of medicine. It is also situated in a gendered context of ideal femininity and hegemonic masculinity. Through better awareness of gender biases and the callers' different resources for making themselves heard, the communication between telenurse and caller might become more equal and thereby better suitable for all callers.

  6. Setting technical standards for visual assessment procedures

    Treesearch

    Kenneth H. Craik; Nickolaus R. Feimer

    1979-01-01

    Under the impetus of recent legislative and administrative mandates concerning analysis and management of the landscape, governmental agencies are being called upon to adopt or develop visual resource and impact assessment (VRIA) systems. A variety of techniques that combine methods of psychological assessment and landscape analysis to serve these purposes is being...

  7. Survival analysis, or what to do with upper limits in astronomical surveys

    NASA Technical Reports Server (NTRS)

    Isobe, Takashi; Feigelson, Eric D.

    1986-01-01

    A field of applied statistics called survival analysis has been developed over several decades to deal with censored data, which occur in astronomical surveys when objects are too faint to be detected. The ways in which these methods can assist in the statistical interpretation of astronomical data are reviewed.
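
    The workhorse of survival analysis is the Kaplan-Meier product-limit estimator. A minimal Python version for right-censored data follows; astronomical upper limits are left-censored, and a common trick (an assumption of this sketch, not a prescription from the article) is to negate the variable so they become right-censored:

      import numpy as np

      def kaplan_meier(values, detected):
          """Product-limit estimator for right-censored data; `detected`
          is False where only a limit is available."""
          order = np.argsort(values)
          v = np.asarray(values, float)[order]
          d = np.asarray(detected, bool)[order]
          n, s, curve = len(v), 1.0, []
          for i in range(n):
              if d[i]:                    # an actual detection
                  s *= 1.0 - 1.0/(n - i)  # n - i objects still at risk
              curve.append((v[i], s))
          return curve

      fluxes = [2.1, 3.4, 1.2, 5.0, 0.8, 4.2]
      detected = [True, True, False, True, False, True]  # False = limit
      for x, s in kaplan_meier(fluxes, detected):
          print(f"flux {x:.1f}: S = {s:.2f}")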

  8. Self-consistent analysis of high drift velocity measurements with the STARE system

    NASA Technical Reports Server (NTRS)

    Reinleitner, L. A.; Nielsen, E.

    1985-01-01

    The use of the STARE and SABRE coherent radar systems as valuable tools for geophysical research has been enhanced by a new technique called the Superimposed-Grid-Point method. This method permits an analysis of E-layer plasma irregularity phase velocity versus flow angle utilizing only STARE or SABRE data. As previous work with STARE has indicated, this analysis has clearly shown that the cosine law assumption breaks down for velocities near and exceeding the local ion acoustic velocities. Use of this method is improving understanding of naturally-occurring plasma irregularities in the E-layer.

  9. Transient loads analysis for space flight applications

    NASA Technical Reports Server (NTRS)

    Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.

    1992-01-01

    A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low-frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools: NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.

  10. A seismic analysis for masonry constructions: The different schematization methods of masonry walls

    NASA Astrophysics Data System (ADS)

    Olivito, Renato. S.; Codispoti, Rosamaria; Scuro, Carmelo

    2017-11-01

    Seismic analysis of masonry structures is usually carried out with structural calculation software based on the equivalent frame method or the macro-element method. In these approaches, a masonry wall is divided into vertical elements (piers) and horizontal elements (so-called spandrels), interconnected by rigid nodes. The aim of this work is to make a critical comparison between different schematization methods for masonry walls, underlining the structural importance of the spandrel elements. In order to implement the methods, two different structural calculation programs were used and an existing masonry building was examined.

  11. DSP Synthesis Algorithm for Generating Florida Scrub Jay Calls

    NASA Technical Reports Server (NTRS)

    Lane, John; Pittman, Tyler

    2017-01-01

    A prototype digital signal processing (DSP) algorithm has been developed to approximate Florida scrub jay calls. The Florida scrub jay (Aphelocoma coerulescens), believed to have existed for 2 million years and living only in Florida, has a complicated social system that is evident in the spectrograms of its calls. Audio data were acquired at the Helen and Allan Cruickshank Sanctuary, Rockledge, Florida during the 2016 mating season using three digital recorders sampling at 44.1 kHz. The synthesis algorithm is a first step toward developing a robust identification and call analysis algorithm. Since the Florida scrub jay is severely threatened by habitat loss, it is important to develop effective methods to monitor this threatened population using autonomous means.
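
    A call synthesizer of this kind typically integrates an instantaneous-frequency contour into a phase, adds harmonics and noise for timbre, and applies an amplitude envelope. The Python sketch below follows that generic recipe; the contour and envelope constants are invented, not measured scrub-jay parameters:

      import numpy as np

      fs = 44_100                   # matches the recorders' sample rate
      dur = 0.25                    # s, one call note
      t = np.arange(int(fs*dur)) / fs

      # Downward frequency sweep with a fast warble (illustrative values).
      f_inst = 4000 - 6000*t + 300*np.sin(2*np.pi*60*t)
      phase = 2*np.pi*np.cumsum(f_inst) / fs

      # Harmonics plus a noisy component give a harsh, corvid-like timbre.
      note = (np.sin(phase) + 0.5*np.sin(2*phase) + 0.25*np.sin(3*phase))
      note += 0.2*np.random.default_rng(7).normal(size=t.size)

      # Fast attack, exponential decay; normalize to [-1, 1].
      env = np.minimum(t/0.02, 1.0) * np.exp(-4*t)
      call = (note*env / np.max(np.abs(note*env))).astype(np.float32)
      # `call` can be written to a WAV file, e.g. with scipy.io.wavfile.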

  12. A Method for Populating the Knowledge Base of APTAS, a Domain-Oriented Application Composition System

    DTIC Science & Technology

    1993-12-01

    proposed a domain analysis approach called Feature-Oriented Domain Analysis (FODA). The approach identifies prominent features (similarities) and...characteristics of software systems in the domain. Unlike the other domain analysis approaches we have summarized, the researchers described FODA in...Domain Analysis (FODA) Feasibility Study. Technical Report, Software Engineering Institute, Carnegie Mellon University, November 1990. 19. Lee, Kenneth

  13. Automated surveillance of 911 call data for detection of possible water contamination incidents

    PubMed Central

    2011-01-01

    Background Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detecting illness caused by fast-acting chemicals has not been an emphasis. Methods An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5-year period were studied. Results During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls were received, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. Conclusions The 911 surveillance system provides timely notification of possible public health events, but it did have limitations. While the alarms contained incident codes and the location of the caller, additional information such as medical status was not available to assist in validating the cause of the alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event. PMID:21450105
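
    For intuition about the detection step, the sketch below scans call counts for the time window with the highest Poisson likelihood ratio of observed versus expected calls; the deployed system's scan statistic additionally searches over spatial windows, and the counts here are made up.

    ```python
    # A purely temporal scan-statistic sketch (the real system scans space-time).
    import numpy as np

    calls = np.array([3, 2, 4, 3, 2, 11, 9, 3, 2, 3])  # calls per hour (made up)
    total, baseline = calls.sum(), calls.mean()

    best = (0.0, (0, 0))
    for start in range(len(calls)):
        for end in range(start + 1, len(calls) + 1):
            c = calls[start:end].sum()              # observed in window
            e = baseline * (end - start)            # expected in window
            if c > e:
                llr = (c * np.log(c / e)
                       + (total - c) * np.log((total - c) / (total - e)))
                best = max(best, (llr, (start, end)))
    print(best)   # the highest-scoring window covers the spike at hours 5-6
    ```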

  14. Cetacean population density estimation from single fixed sensors using passive acoustics.

    PubMed

    Küsel, Elizabeth T; Mellinger, David K; Thomas, Len; Marques, Tiago A; Moretti, David; Ward, Jessica

    2011-06-01

    Passive acoustic methods are increasingly being used to estimate animal population density. Most density estimation methods are based on estimates of the probability of detecting calls as functions of distance. Typically these are obtained using receivers capable of localizing calls or from studies of tagged animals. However, both approaches are expensive to implement. The approach described here uses a Monte Carlo model to estimate the probability of detecting calls from single sensors. The passive sonar equation is used to predict signal-to-noise ratios (SNRs) of received clicks, which are then combined with a detector characterization that predicts probability of detection as a function of SNR. Input distributions for source level, beam pattern, and whale depth are obtained from the literature. Acoustic propagation modeling is used to estimate transmission loss. Other inputs for density estimation are call rate, obtained from the literature, and false positive rate, obtained from manual analysis of a data sample. The method is applied to estimate density of Blainville's beaked whales over a 6-day period around a single hydrophone located in the Tongue of the Ocean, Bahamas. Results are consistent with those from previous analyses, which use additional tag data. © 2011 Acoustical Society of America
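
    A minimal sketch of the core calculation, under assumed values: the passive sonar equation gives a received SNR, and a detector characterization (here an assumed logistic curve) maps SNR to detection probability, averaged over Monte Carlo draws of source level and range.

    ```python
    import numpy as np

    def detection_probability(sl_db, tl_db, nl_db, snr50=10.0, slope=0.5):
        """Detector characterization: P(detect) as a logistic function of SNR."""
        snr = sl_db - tl_db - nl_db          # simplified passive sonar equation
        return 1.0 / (1.0 + np.exp(-slope * (snr - snr50)))

    rng = np.random.default_rng(0)
    sl = rng.normal(200, 5, 100_000)         # source level, dB re 1 uPa (assumed)
    r = rng.uniform(100, 5000, 100_000)      # range in metres (assumed)
    tl = 20 * np.log10(r)                    # spherical spreading, a simplification
    print(detection_probability(sl, tl, nl_db=70).mean())
    ```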

  15. Women's questions about medicines in pregnancy - An analysis of calls to an Australian national medicines call centre.

    PubMed

    Pijpers, Eva L; Kreijkamp-Kaspers, Sanne; McGuire, Treasure M; Deckx, Laura; Brodribb, Wendy; van Driel, Mieke L

    2017-06-01

    For many medicines, safe use during pregnancy is not established and adherence is often poor due to safety concerns. Therefore, it is important to identify consumers' medicines information needs during pregnancy. A retrospective, mixed methods analysis was conducted on eight years of pregnancy-related calls to an Australian national medicines call centre. The call profiles of pregnancy-related and non-pregnancy-related questions were compared. Medicines involved in pregnancy calls were categorised by class (Anatomical Therapeutic Chemical (ATC) level 3) and by Therapeutic Goods Administration pregnancy category. Questions in these calls were also themed by pregnancy stage. We identified 4573 pregnancy-related and 118 547 non-pregnancy-related calls. The caller profile for pregnancy-related calls was female (93.7%), asking for herself (83.0%), and while 70.1% of questions involved one medicine, 9.6% involved three or more medicines. Pregnancy enquiries were prompted more often by conflicting information, inadequate information, or a desire for a second opinion. For the 1166 calls where the stage of pregnancy was available, most questions concerned safety. Medication classified as 'safe' during pregnancy accounted for 34% of these questions. After antidepressants, most calls were made about over-the-counter (OTC) medicines (paracetamol, dexchlorpheniramine, codeine). Safe treatment for everyday conditions was of increasing concern as the pregnancy progressed. Pregnant women are concerned about the safety of medication use in pregnancy and a significant proportion overestimate risk. Psychotropic medication and fertility are strong drivers to seek information during preconception. Everyday illnesses and self-medication with OTC medication are a common concern throughout pregnancy, even though many medicines are safe to use. © 2016 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  16. A Likelihood-Based Framework for Association Analysis of Allele-Specific Copy Numbers.

    PubMed

    Hu, Y J; Lin, D Y; Sun, W; Zeng, D

    2014-10-01

    Copy number variants (CNVs) and single nucleotide polymorphisms (SNPs) co-exist throughout the human genome and jointly contribute to phenotypic variations. Thus, it is desirable to consider both types of variants, as characterized by allele-specific copy numbers (ASCNs), in association studies of complex human diseases. Current SNP genotyping technologies capture the CNV and SNP information simultaneously via fluorescent intensity measurements. The common practice of calling ASCNs from the intensity measurements and then using the ASCN calls in downstream association analysis has important limitations. First, the association tests are prone to false-positive findings when differential measurement errors between cases and controls arise from differences in DNA quality or handling. Second, the uncertainties in the ASCN calls are ignored. We present a general framework for the integrated analysis of CNVs and SNPs, including the analysis of total copy numbers as a special case. Our approach combines the ASCN calling and the association analysis into a single step while allowing for differential measurement errors. We construct likelihood functions that properly account for case-control sampling and measurement errors. We establish the asymptotic properties of the maximum likelihood estimators and develop EM algorithms to implement the corresponding inference procedures. The advantages of the proposed methods over the existing ones are demonstrated through realistic simulation studies and an application to a genome-wide association study of schizophrenia. Extensions to next-generation sequencing data are discussed.

  17. Recent advances of liquid chromatography-(tandem) mass spectrometry in clinical and forensic toxicology - An update.

    PubMed

    Remane, Daniela; Wissenbach, Dirk K; Peters, Frank T

    2016-09-01

    Liquid chromatography (LC) coupled to mass spectrometry (MS) or tandem mass spectrometry (MS/MS) is a well-established and widely used technique in clinical and forensic toxicology as well as in doping control, especially for quantitative analysis. In recent years, many applications for so-called multi-target screening and/or quantification of drugs, poisons, and/or their metabolites in biological matrices have been developed. Such methods have proven particularly useful for analysis of so-called new psychoactive substances that have appeared on recreational drug markets throughout the world. Moreover, the evolution of high-resolution MS techniques and the development of data-independent detection modes have opened new possibilities for applications of LC-(MS/MS) in systematic toxicological screening analysis in the so-called general unknown setting. The present paper provides an overview of and discusses these recent developments, focusing on the literature published after 2010. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. An interactive graphics program to retrieve, display, compare, manipulate, curve fit, difference and cross plot wind tunnel data

    NASA Technical Reports Server (NTRS)

    Elliott, R. D.; Werner, N. M.; Baker, W. M.

    1975-01-01

    This paper describes the Aerodynamic Data Analysis and Integration System (ADAIS), a highly interactive computer graphics program capable of manipulating large quantities of data such that addressable elements of a data base can be called up for graphic display, compared, curve fit, stored, retrieved, differenced, etc. The general nature of the system is evidenced by the fact that limited usage has already occurred with data bases consisting of thermodynamic, basic loads, and flight dynamics data. Productivity using ADAIS of five times that achieved with conventional manual methods of wind tunnel data analysis is routinely attained. In wind tunnel data analysis, data from one or more runs of a particular test may be called up and displayed along with data from one or more runs of a different test. Curves may be faired through the data points by any of four methods, including cubic spline and least squares polynomial fit up to seventh order.
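
    The two fairing options named above can be sketched with standard tools on made-up data:

    ```python
    # Cubic-spline and seventh-order least-squares fairings of sample points.
    import numpy as np
    from scipy.interpolate import CubicSpline

    alpha = np.linspace(-4, 12, 9)                  # angle of attack (made up)
    cl = 0.1 * alpha + 0.02 * np.sin(alpha) + 0.05  # lift coefficient (made up)

    spline = CubicSpline(alpha, cl)                 # passes through every point
    poly = np.poly1d(np.polyfit(alpha, cl, deg=7))  # least-squares, order 7

    x = np.linspace(-4, 12, 50)
    print(np.max(np.abs(spline(x) - poly(x))))      # the fairings differ little
    ```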

  19. Numerical solution of stiff systems of ordinary differential equations with applications to electronic circuits

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. S.

    1971-01-01

    Systems of ordinary differential equations in which the magnitudes of the eigenvalues (or time constants) vary greatly are commonly called stiff. Such systems of equations arise in nuclear reactor kinetics, the flow of chemically reacting gas, dynamics, control theory, circuit analysis and other fields. The research reported develops an A-stable numerical integration technique for solving stiff systems of ordinary differential equations. The method, which is called the generalized trapezoidal rule, is a modification of the trapezoidal rule. However, the method is computationally more efficient than the trapezoidal rule when solutions containing almost-discontinuous segments are being calculated.
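
    The standard implicit trapezoidal rule that the generalized method modifies can be sketched as follows on a linear stiff system; the generalized rule itself is not reproduced here.

    ```python
    # A-stable implicit trapezoidal rule for y' = A y with stiff eigenvalues.
    import numpy as np

    A = np.array([[-1000.0, 0.0],
                  [1.0, -1.0]])       # widely separated time constants
    y = np.array([1.0, 1.0])
    h, I = 0.01, np.eye(2)            # step size much larger than 1/1000

    lhs = I - 0.5 * h * A             # (I - h/2 A) y_{n+1} = (I + h/2 A) y_n
    rhs = I + 0.5 * h * A
    for _ in range(100):
        y = np.linalg.solve(lhs, rhs @ y)
    print(y)                          # stable despite the stiff mode
    ```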

  20. Impact of the mass media on calls to the CDC National AIDS Hotline.

    PubMed

    Fan, D P

    1996-06-01

    This paper considers new computer methodologies for assessing the impact of different types of public health information. The example used public service announcements (PSAs) and mass media news to predict the volume of attempts to call the CDC National AIDS Hotline from December 1992 through to the end of 1993. The analysis relied solely on data from electronic databases. Newspaper stories and television news transcripts were obtained from the NEXIS electronic database and were scored by machine for AIDS coverage. The PSA database was generated by computer monitoring of advertising distributed by the Centers for Disease Control and Prevention (CDC) and by others. The volume of call attempts was collected automatically by the public branch exchange (PBX) of the Hotline telephone system. The call attempts, the PSAs and the news story data were related to each other using both a standard time series method and the statistical model of ideodynamics. The analysis indicated that the only significant explanatory variable for the call attempts was PSAs produced by the CDC. One possible explanation was that these commercials all included the Hotline telephone number while the other information sources did not.

  1. METHODS DEVELOPMENT FOR THE ANALYSIS OF CHIRAL PESTICIDES

    EPA Science Inventory

    Chiral compounds exist as a pair of nonsuperimposable mirror images called enantiomers. Enantiomers have identical physical-chemical properties, but their interactions with other chiral molecules, toxicity, biodegradation, and fate are often different. Many pharmaceutical com...

  2. An improved ChIP-seq peak detection system for simultaneously identifying post-translational modified transcription factors by combinatorial fusion, using SUMOylation as an example.

    PubMed

    Cheng, Chia-Yang; Chu, Chia-Han; Hsu, Hung-Wei; Hsu, Fang-Rong; Tang, Chung Yi; Wang, Wen-Ching; Kung, Hsing-Jien; Chang, Pei-Ching

    2014-01-01

    Post-translational modification (PTM) of transcription factors and chromatin remodelling proteins is recognized as a major mechanism by which transcriptional regulation occurs. Chromatin immunoprecipitation (ChIP) in combination with high-throughput sequencing (ChIP-seq) is applied as a gold standard when studying the genome-wide binding sites of transcription factors (TFs). This has greatly improved our understanding of protein-DNA interactions on a genome-wide scale. However, current ChIP-seq peak calling tools are not sufficiently sensitive and are unable to simultaneously identify post-translationally modified TFs based on ChIP-seq analysis; this is largely due to the widespread presence of multiple modified TFs. Using SUMO-1 modification as an example, we describe here an improved approach that allows the simultaneous identification of the particular genomic binding regions of all TFs with SUMO-1 modification. Traditional peak calling methods are inadequate for identifying multiple TF binding sites that involve long genomic regions, and we therefore designed a ChIP-seq processing pipeline for the detection of peaks via a combinatorial fusion method. We then annotate the peaks with known transcription factor binding sites (TFBS) using the Transfac Matrix Database (v7.0) to predict potential SUMOylated TFs. Next, the peak calling result was further analyzed based on promoter proximity, TFBS annotation, and a literature review, and was validated by ChIP-real-time quantitative PCR (qPCR) and ChIP-reChIP real-time qPCR. The results show clearly that SUMOylated TFs can be pinpointed using our pipeline. A methodology is presented that analyzes SUMO-1 ChIP-seq patterns and predicts related TFs. Our analysis uses three peak calling tools; the fusion of these different tools increases the precision of the peak calling results. The TFBS annotation method is able to predict potential SUMOylated TFs. Here, we offer a new approach that enhances ChIP-seq data analysis and allows the identification of multiple SUMOylated TF binding sites simultaneously, which can then be utilized for other functional PTM binding site prediction in the future.
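
    One plausible reading of the fusion step (an assumption for illustration, not the authors' exact procedure) is to merge the callers' peak lists and keep intervals supported by at least two of the three tools:

    ```python
    # Fuse peak intervals from several callers by minimum support depth.
    def coverage_fusion(peak_sets, min_support=2):
        events = []
        for peaks in peak_sets:
            for start, end in peaks:
                events.append((start, 1))
                events.append((end, -1))
        events.sort()
        fused, depth, cur_start = [], 0, None
        for pos, delta in events:
            prev, depth = depth, depth + delta
            if prev < min_support <= depth:
                cur_start = pos
            elif prev >= min_support > depth:
                fused.append((cur_start, pos))
        return fused

    caller_a = [(100, 500), (900, 1200)]
    caller_b = [(120, 480), (1000, 1300)]
    caller_c = [(2000, 2200)]
    print(coverage_fusion([caller_a, caller_b, caller_c]))
    # -> [(120, 480), (1000, 1200)]
    ```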

  3. Analysis of Tasks in Pre-Service Elementary Teacher Education Courses

    ERIC Educational Resources Information Center

    Sierpinska, Anna; Osana, Helena

    2012-01-01

    This paper presents some results of research aimed at contributing to the development of a professional knowledge base for teachers of elementary mathematics methods courses, called here "teacher educators." We propose that a useful unit of analysis for this knowledge could be the tasks in which teacher-educators engage pre-service…

  4. Calibration and performance of synchronous SIM/scan mode for simultaneous targeted and discovery (non-targeted) analysis of exhaled breath samples from firefighters

    EPA Science Inventory

    Traditionally, gas chromatography – mass spectrometry (GC-MS) analysis has used a targeted approach called selected ion monitoring (SIM) to quantify specific compounds that may have adverse health effects. Due to method limitations and the constraints of preparing duplicat...

  5. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
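
    A minimal Python sketch of the same idea (the bmem package itself is an R implementation and far more general): simulate the mediation model, bootstrap the indirect effect, and count how often the confidence interval excludes zero. Effect sizes, sample size, and replication counts are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def indirect_effect(x, m, y):
        a = np.polyfit(x, m, 1)[0]     # slope of M on X
        b = np.polyfit(m, y, 1)[0]     # slope of Y on M (simplified: X omitted)
        return a * b

    def detected(n=100, a=0.3, b=0.3, n_boot=300):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        boots = []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)
            boots.append(indirect_effect(x[idx], m[idx], y[idx]))
        lo, hi = np.percentile(boots, [2.5, 97.5])
        return not (lo <= 0.0 <= hi)   # CI excluding zero counts as detection

    print(np.mean([detected() for _ in range(100)]))   # estimated power
    ```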

  6. GKI chloride in water, analysis method. GKI boron in water, analysis method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morriss, L.L.

    1979-05-01

    Procedures for the chemical analysis of chlorides and boron in water are presented. Chlorides can be titrated with mercuric nitrate to form mercuric chloride. At pH 2.3 to 2.8, diphenylcarbazone indicates the end point of this titration by formation of a purple complex with mercury ions. When a sample of water containing boron is acidified and evaporated in the presence of curcumin, a red colored product called rosocyanine is formed. This is dissolved and can be measured photometrically or visually. (DMC)

  7. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
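
    The benefit of analytic derivatives can be seen in a toy gradient-based optimization, with the Rosenbrock function standing in for an engine-cycle model:

    ```python
    # Supplying an analytic gradient avoids finite-difference function calls.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

    def grad(x):   # analytic gradient of f
        return np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
            200 * (x[1] - x[0]**2),
        ])

    x0 = np.array([-1.2, 1.0])
    res_fd = minimize(f, x0, method="BFGS")            # finite-difference gradient
    res_an = minimize(f, x0, method="BFGS", jac=grad)  # analytic gradient
    print(res_fd.nfev, res_an.nfev)  # far fewer function evaluations with jac
    ```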

  8. QuASAR: quantitative allele-specific analysis of reads.

    PubMed

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu. Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
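
    The baseline ASE test that QuASAR builds on can be sketched in a few lines (using SciPy's binomtest, available in SciPy 1.7+); QuASAR itself additionally infers genotypes and models base-call error and over-dispersion.

    ```python
    # Binomial test of the null 1:1 allelic ratio at one heterozygous site.
    from scipy.stats import binomtest

    ref_reads, alt_reads = 62, 38              # illustrative read counts
    result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)
    print(f"allelic ratio = {ref_reads / (ref_reads + alt_reads):.2f}, "
          f"p = {result.pvalue:.4f}")
    ```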

  9. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.

  10. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
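
    The contrast the paper draws can be sketched with a minimal actor: an active object with its own thread of control that receives messages through a queue instead of being invoked by direct method calls.

    ```python
    # A queue-based actor sketch; message passing replaces method invocation.
    import threading, queue

    class Actor(threading.Thread):
        def __init__(self):
            super().__init__(daemon=True)
            self.inbox = queue.Queue()

        def run(self):
            while True:
                msg = self.inbox.get()
                if msg is None:               # poison pill stops the actor
                    break
                print(f"processing {msg!r} in {threading.current_thread().name}")

    a = Actor()
    a.start()
    a.inbox.put("advance-simulation-clock")   # a message, not a method call
    a.inbox.put(None)
    a.join()
    ```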

  11. Cluster Correspondence Analysis.

    PubMed

    van de Velden, M; D'Enza, A Iodice; Palumbo, F

    2017-03-01

    A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories in such a way that a single between variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.
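
    The tandem baseline discussed above can be sketched as two sequential steps; the proposed method instead optimizes the reduction and the clustering jointly.

    ```python
    # Tandem approach: one-hot code categories, reduce with PCA, then k-means.
    import numpy as np
    from sklearn.preprocessing import OneHotEncoder
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    data = rng.choice(["a", "b", "c"], size=(200, 5))   # made-up categorical data

    X = OneHotEncoder().fit_transform(data).toarray()
    scores = PCA(n_components=2).fit_transform(X)                 # step 1: reduce
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)  # step 2: cluster
    print(np.bincount(labels))
    ```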

  12. Environmental Gradient Analysis, Ordination, and Classification in Environmental Impact Assessments.

    DTIC Science & Technology

    1987-09-01

    agglomerative clustering algorithms for mainframe computers: (1) the unweighted pair-group method that uses arithmetic averages (UPGMA), (2) the...hierarchical agglomerative unweighted pair-group method using arithmetic averages (UPGMA), which is also called average linkage clustering. This method was...dendrograms produced by weighted clustering (93). Sneath and Sokal (94), Romesburg (84), and Seber (90) also strongly recommend the UPGMA. A dendrogram
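
    UPGMA (average-linkage) clustering as described in the fragment is available in SciPy; a minimal sketch on a made-up site-by-species abundance matrix:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    sites = np.array([[3, 0, 1],
                      [2, 1, 0],
                      [0, 4, 5],
                      [1, 3, 4]], dtype=float)

    d = pdist(sites, metric="euclidean")    # condensed distance matrix
    tree = linkage(d, method="average")     # "average" is UPGMA
    print(fcluster(tree, t=2, criterion="maxclust"))  # sites {0,1} vs {2,3}
    ```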

  13. Multi-Centrality Graph Spectral Decompositions and Their Application to Cyber Intrusion Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Pin-Yu; Choudhury, Sutanay; Hero, Alfred

    Many modern datasets can be represented as graphs and hence spectral decompositions such as graph principal component analysis (PCA) can be useful. Distinct from previous graph decomposition approaches based on subspace projection of a single topological feature, e.g., the centered graph adjacency matrix (graph Laplacian), we propose spectral decomposition approaches to graph PCA and graph dictionary learning that integrate multiple features, including graph walk statistics, centrality measures and graph distances to reference nodes. In this paper we propose a new PCA method for single graph analysis, called multi-centrality graph PCA (MC-GPCA), and a new dictionary learning method for ensembles of graphs, called multi-centrality graph dictionary learning (MC-GDL), both based on spectral decomposition of multi-centrality matrices. As an application to cyber intrusion detection, MC-GPCA can be an effective indicator of anomalous connectivity patterns and MC-GDL can provide a discriminative basis for attack classification.
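
    A loose sketch of the multi-centrality idea, assembling several centrality measures into a feature matrix and projecting onto its principal components; the paper's actual spectral construction differs in detail.

    ```python
    # Multi-centrality feature matrix + PCA on a standard example graph.
    import numpy as np
    import networkx as nx

    G = nx.karate_club_graph()
    features = np.column_stack([
        [d for _, d in G.degree()],
        list(nx.betweenness_centrality(G).values()),
        list(nx.closeness_centrality(G).values()),
    ])
    X = features - features.mean(axis=0)      # center the columns
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ vt[:2].T                     # top-2 principal component scores
    print(scores[:5])
    ```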

  14. OTG-snpcaller: An Optimized Pipeline Based on TMAP and GATK for SNP Calling from Ion Torrent Data

    PubMed Central

    Huang, Wenpan; Xi, Feng; Lin, Lin; Zhi, Qihuan; Zhang, Wenwei; Tang, Y. Tom; Geng, Chunyu; Lu, Zhiyuan; Xu, Xun

    2014-01-01

    Because the new Proton platform from Life Technologies produced markedly different data from those of the Illumina platform, the conventional Illumina data analysis pipeline could not be used directly. We developed an optimized SNP calling method using TMAP and GATK (OTG-snpcaller). This method combined our own optimized processes, Remove Duplicates According to AS Tag (RDAST) and Alignment Optimize Structure (AOS), together with TMAP and GATK, to call SNPs from Proton data. We sequenced four sets of exomes, captured by the Agilent SureSelect and NimbleGen SeqCap EZ kits, using Life Technologies' Ion Proton sequencer. We then applied OTG-snpcaller and compared our results with the results from the Torrent Variant Caller. The results indicated that OTG-snpcaller can reduce both false positive and false negative rates. Moreover, we compared our results with Illumina results generated by GATK best practices, and we found that the results of these two platforms were comparable. The good performance in variant calling using GATK best practices can be primarily attributed to the high quality of the Illumina sequences. PMID:24824529

  15. OTG-snpcaller: an optimized pipeline based on TMAP and GATK for SNP calling from ion torrent data.

    PubMed

    Zhu, Pengyuan; He, Lingyu; Li, Yaqiao; Huang, Wenpan; Xi, Feng; Lin, Lin; Zhi, Qihuan; Zhang, Wenwei; Tang, Y Tom; Geng, Chunyu; Lu, Zhiyuan; Xu, Xun

    2014-01-01

    Because the new Proton platform from Life Technologies produced markedly different data from those of the Illumina platform, the conventional Illumina data analysis pipeline could not be used directly. We developed an optimized SNP calling method using TMAP and GATK (OTG-snpcaller). This method combined our own optimized processes, Remove Duplicates According to AS Tag (RDAST) and Alignment Optimize Structure (AOS), together with TMAP and GATK, to call SNPs from Proton data. We sequenced four sets of exomes, captured by the Agilent SureSelect and NimbleGen SeqCap EZ kits, using Life Technologies' Ion Proton sequencer. We then applied OTG-snpcaller and compared our results with the results from the Torrent Variant Caller. The results indicated that OTG-snpcaller can reduce both false positive and false negative rates. Moreover, we compared our results with Illumina results generated by GATK best practices, and we found that the results of these two platforms were comparable. The good performance in variant calling using GATK best practices can be primarily attributed to the high quality of the Illumina sequences.

  16. TSSer: an automated method to identify transcription start sites in prokaryotic genomes from differential RNA sequencing data.

    PubMed

    Jorjani, Hadi; Zavolan, Mihaela

    2014-04-01

    Accurate identification of transcription start sites (TSSs) is an essential step in the analysis of transcription regulatory networks. In higher eukaryotes, the capped analysis of gene expression technology enabled comprehensive annotation of TSSs in genomes such as those of mice and humans. In bacteria, an equivalent approach, termed differential RNA sequencing (dRNA-seq), has recently been proposed, but the application of this approach to a large number of genomes is hindered by the paucity of computational analysis methods. With few exceptions, when the method has been used, annotation of TSSs has been largely done manually. In this work, we present a computational method called 'TSSer' that enables the automatic inference of TSSs from dRNA-seq data. The method rests on a probabilistic framework for identifying both genomic positions that are preferentially enriched in the dRNA-seq data as well as preferentially captured relative to neighboring genomic regions. Evaluating our approach for TSS calling on several publicly available datasets, we find that TSSer achieves high consistency with the curated lists of annotated TSSs, but identifies many additional TSSs. Therefore, TSSer can accelerate genome-wide identification of TSSs in bacterial genomes and can aid in further characterization of bacterial transcription regulatory networks. TSSer is freely available under GPL license at http://www.clipz.unibas.ch/TSSer/index.php
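
    A minimal sketch of the dRNA-seq signal such a method exploits: positions where 5'-end coverage in the TEX-treated library is enriched both over the untreated library and over neighbouring positions. The thresholds below are assumptions, not TSSer's probabilistic model.

    ```python
    import numpy as np

    tex = np.array([1, 2, 1, 40, 6, 2, 1, 1, 30, 3])    # TEX+ 5' read starts
    notex = np.array([1, 2, 2, 5, 4, 2, 1, 1, 4, 2])    # TEX- 5' read starts

    enrich = (tex + 1) / (notex + 1)                    # pseudo-count ratio
    local = tex / (np.convolve(tex, np.ones(3), "same") + 1)  # vs neighbours
    print(np.where((enrich > 3) & (local > 0.5))[0])    # candidate TSSs: [3 8]
    ```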

  17. Wort free amino nitrogen analysis adapted to a microplate format

    USDA-ARS?s Scientific Manuscript database

    The standard method for determining wort free amino nitrogen content calls for the use of test tubes and glass marbles, as well as boiling and 20°C water baths. In this paper we describe how the standard method can be updated and streamlined by replacing water baths, test tubes and marbles with a th...

  18. White Paper: A Defect Prioritization Method Based on the Risk Priority Number

    DTIC Science & Technology

    2013-11-01

    adapted... The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called Risk Priority Number (RPN) to quantify the... [Table 1 - Time Scaling Factors (fragment): "Up to an hour", 16-60 min, factor 1.5; "Brief Interrupt", 0-15 min, factor 1] In the FMEA formulation, RPN is a product of the three categories
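
    The standard RPN computation referenced in the fragment is the product of severity, occurrence, and detection ratings; the specific values below are made up.

    ```python
    # Risk Priority Number = severity x occurrence x detection (1-10 scales).
    severity, occurrence, detection = 7, 5, 3
    rpn = severity * occurrence * detection
    print(rpn)   # 105; defects are then prioritized in descending RPN order
    ```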

  19. Effects of international football matches on ambulance call profiles and volumes during the 2006 World Cup

    PubMed Central

    Deakin, Charles D; Thompson, Fizz; Gibson, Caroline; Green, Mark

    2007-01-01

    Background Prompt ambulance attendance is aimed at improving patient care. With finite resources struggling to meet performance targets, unforeseen demand precludes the ability to tailor resources to cope with increased call volumes, and can have a marked detrimental effect on performance and hence patient care. The effects of the 2006 World Cup football matches on call volumes and profiles were analysed to understand how public events can influence demands on the ambulance service. Methods All emergency calls to the Hampshire Ambulance Service NHS Trust (currently the Hampshire Division of South Central Ambulance Service, Winchester, UK) during the first weekend of the 2006 World Cup football matches were analysed by call volume and classification of call (call type). Results On the day of the first football match, call volume was over 50% higher than that on a typical Saturday, with distinct peaks before and after the inaugural match. Call profile analysis showed increases in alcohol‐related emergencies, including collapse, unconsciousness, assault and road traffic accidents. The increase in assaults was particularly marked at the end of each match and increased again into the late evening. Conclusion A detailed mapping of call volumes and profiles during the World Cup football shows a significant increase in overall emergency calls, mostly alcohol related. Mapping of limited resources to these patterns will allow improved responses to emergency calls. PMID:17513536

  20. Voices of Hispanic College Students: A Content Analysis of Qualitative Research within the "Hispanic Journal of Behavioral Sciences"

    ERIC Educational Resources Information Center

    Storlie, Cassandra A.; Moreno, Luis S.; Portman, Tarrell Awe Agahe

    2014-01-01

    As Hispanic students continue to be an underrepresented cultural group in higher education, researchers are called to uncover the challenging and complex experience of this diverse group of students. Using the constant comparative method, these researchers conducted a content analysis of the qualitative research on the experiences of Hispanic…

  1. Discrete Dynamical Modeling.

    ERIC Educational Resources Information Center

    Sandefur, James T.

    1991-01-01

    Discussed is the process of translating situations involving changing quantities into mathematical relationships. This process, called dynamical modeling, allows students to learn new mathematics while sharpening their algebraic skills. A description of dynamical systems, problem-solving methods, a graphical analysis, and available classroom…

  2. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.

  3. Systems configured to distribute a telephone call, communication systems, communication methods and methods of routing a telephone call to a service representative

    DOEpatents

    Harris, Scott H.; Johnson, Joel A.; Neiswanger, Jeffery R.; Twitchell, Kevin E.

    2004-03-09

    The present invention includes systems configured to distribute a telephone call, communication systems, communication methods and methods of routing a telephone call to a customer service representative. In one embodiment of the invention, a system configured to distribute a telephone call within a network includes a distributor adapted to connect with a telephone system, the distributor being configured to connect a telephone call using the telephone system and output the telephone call and associated data of the telephone call; and a plurality of customer service representative terminals connected with the distributor and a selected customer service representative terminal being configured to receive the telephone call and the associated data, the distributor and the selected customer service representative terminal being configured to synchronize application of the telephone call and associated data from the distributor to the selected customer service representative terminal.

  4. A short review of variants calling for single-cell-sequencing data with applications.

    PubMed

    Wei, Zhuohui; Shu, Chang; Zhang, Changsheng; Huang, Jingying; Cai, Hongmin

    2017-11-01

    The field of single-cell sequencing is rapidly expanding, and many techniques have been developed in the past decade. With this technology, biologists can study not only the heterogeneity between two adjacent cells in the same tissue or organ, but also the evolutionary relationships and degenerative processes within a single cell. Calling variants is the main purpose of analyzing single-cell sequencing (SCS) data. Currently, some popular methods used for bulk-cell-sequencing data analysis are tailored directly for dealing with SCS data. However, SCS requires an extra step of genome amplification to accumulate enough material to satisfy sequencing needs. The amplification introduces large biases and thus raises challenges for using the bulk-cell-sequencing methods. This paper aims to bridge that gap, in order to provide guidance for the development of specialized analysis methods as well as for the use of currently available tools on SCS data. We first introduce two popular genome amplification methods and compare their capabilities. We then introduce a few popular models for calling single-nucleotide polymorphisms and copy-number variations. Finally, breakthrough applications of SCS are summarized to demonstrate its potential for researching cell evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. A general soft label based linear discriminant analysis for semi-supervised dimensionality reduction.

    PubMed

    Zhao, Mingbo; Zhang, Zhao; Chow, Tommy W S; Li, Bing

    2014-07-01

    Dealing with high-dimensional data has always been a major problem in research of pattern recognition and machine learning, and Linear Discriminant Analysis (LDA) is one of the most popular methods for dimension reduction. However, it only uses labeled samples while neglecting unlabeled samples, which are abundant and can be easily obtained in the real world. In this paper, we propose a new dimension reduction method, called "SL-LDA", by using unlabeled samples to enhance the performance of LDA. The new method first propagates label information from the labeled set to the unlabeled set via a label propagation process, where the predicted labels of unlabeled samples, called "soft labels", can be obtained. It then incorporates the soft labels into the construction of scatter matrixes to find a transformed matrix for dimension reduction. In this way, the proposed method can preserve more discriminative information, which is preferable when solving the classification problem. We further propose an efficient approach for solving SL-LDA under a least squares framework, and a flexible method of SL-LDA (FSL-LDA) to better cope with datasets sampled from a nonlinear manifold. Extensive simulations are carried out on several datasets, and the results show the effectiveness of the proposed method. Copyright © 2014 Elsevier Ltd. All rights reserved.
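
    The semi-supervised idea can be sketched with off-the-shelf components, propagating labels and then fitting a discriminant projection; this stands in for, and is not, the authors' soft-label scatter-matrix construction.

    ```python
    # Label propagation followed by LDA on a toy dataset with 90% labels hidden.
    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.semi_supervised import LabelSpreading
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = make_blobs(n_samples=300, centers=3, random_state=0)
    y_partial = y.copy()
    rng = np.random.default_rng(0)
    y_partial[rng.random(len(y)) < 0.9] = -1       # -1 marks unlabeled samples

    soft = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y_partial)
    lda = LinearDiscriminantAnalysis(n_components=2).fit(X, soft.transduction_)
    print(lda.explained_variance_ratio_)
    ```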

  6. Mobile phones and social structures: an exploration of a closed user group in rural Ghana

    PubMed Central

    2013-01-01

    Background In the Millennium Villages Project site of Bonsaaso, Ghana, the Health Team is using a mobile phone closed user group to place calls amongst one another at no cost. Methods In order to determine the utilization and acceptability of the closed user group amongst users, social network analysis and qualitative methods were used. Key informants were identified and interviewed. The key informants also kept prospective call journals. Billing statements and de-identified call data from the closed user group were used to generate data for analyzing the social structure revealed by the network traffic. Results The majority of communication within the closed user group was personal and not for professional purposes. The members of the CUG felt that the group improved their efficiency at work. Conclusions The methods used present an interesting way to investigate the social structure surrounding communication via mobile phones. In addition, the benefits identified from the exploration of this closed user group make a case for supporting mobile phone closed user groups amongst professional groups. PMID:24007331

  7. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Voice Call Analysis

    DTIC Science & Technology

    2015-09-01

    [Contents fragment: Voice Packet Flow: SIP, Session Description Protocol (SDP), and RTP; Voice Data Analysis; Call Analysis; Call Metrics] ...analysis processing is designed for a general VoIP system architecture based on Session Initiation Protocol (SIP) for negotiating call sessions and...employs Skinny Client Control Protocol for network communication between the phone and the local CallManager (e.g., for each dialed digit), SIP

  8. Fluorescence-labeled methylation-sensitive amplified fragment length polymorphism (FL-MS-AFLP) analysis for quantitative determination of DNA methylation and demethylation status.

    PubMed

    Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko

    2008-04-01

    The PCR-based DNA fingerprinting method called the methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and fluorescence-detecting electrophoresis apparatus to the existing method of MS-AFLP analysis. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of the differences in methylation level of blood DNA of gastric cancer patients and evaluation of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.

  9. Relative Displacement Method for Track-Structure Interaction

    PubMed Central

    Ramos, Óscar Ramón; Pantaleón, Marcos J.

    2014-01-01

    The track-structure interaction effects are usually analysed with conventional FEM programs, where it is difficult to implement the complex track-structure connection behaviour, which is nonlinear, elastic-plastic and depends on the vertical load. The authors developed an alternative analysis method, which they call the relative displacement method. It is based on the calculation of deformation states in single DOF element models that satisfy the boundary conditions. For its solution, an iterative optimisation algorithm is used. This method can be implemented in any programming language or analysis software. A comparison with ABAQUS calculations shows a very good result correlation and compliance with the standard's specifications. PMID:24634610

  10. GenomeFingerprinter: the genome fingerprint and the universal genome fingerprint analysis for systematic comparative genomics.

    PubMed

    Ai, Yuncan; Ai, Hannan; Meng, Fanmei; Zhao, Lei

    2013-01-01

    Little attention has been paid to comparing a set of genome sequences across genetic components and biological categories with wide divergence over a large size range. We define this task as systematic comparative genomics and aim to develop a methodology for it. First, we create a method, GenomeFingerprinter, to unambiguously produce a set of three-dimensional coordinates from a sequence, followed by one three-dimensional plot and six two-dimensional trajectory projections, to illustrate the genome fingerprint of a given genome sequence. Second, we develop a set of concepts and tools and thereby establish a method called universal genome fingerprint analysis (UGFA). In particular, we define the total genetic component configuration (TGCC) (including chromosome, plasmid, and phage) for describing a strain as a systematic unit, the universal genome fingerprint map (UGFM) of TGCC for differentiating strains as a universal system, and systematic comparative genomics (SCG) for comparing a set of genomes across genetic components and biological categories. Third, we construct a method of quantitative analysis to compare two genomes using the outcome dataset of genome fingerprint analysis. Specifically, we define the geometric center and its geometric mean for a given genome fingerprint map, followed by the Euclidean distance, the differentiate rate, and the weighted differentiate rate to quantitatively describe the difference between two genomes under comparison. Moreover, we demonstrate the applications through case studies on various genome sequences, giving tremendous insights into critical issues in microbial genomics and taxonomy. We have created a method, GenomeFingerprinter, for rapidly computing, geometrically visualizing, and intuitively comparing a set of genomes at the genome fingerprint level, and have thereby established universal genome fingerprint analysis, together with a method for quantitative analysis of the outcome dataset. These have set up a methodology for systematic comparative genomics based on genome fingerprint analysis.
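
    The quantitative comparison step can be sketched as follows; the base-to-step mapping is an assumption made for illustration, since the paper's actual coordinate construction is not reproduced here.

    ```python
    # Geometric centers of two fingerprint walks and the distance between them.
    import numpy as np

    STEP = {"A": (1, 0, 0), "C": (0, 1, 0), "G": (0, 0, 1), "T": (-1, -1, -1)}

    def fingerprint(seq):
        return np.cumsum([STEP[b] for b in seq], axis=0)   # 3-D coordinates

    g1 = fingerprint("ACGTACGGTCAGT")
    g2 = fingerprint("ACGTTCGGACAGT")
    centers = g1.mean(axis=0), g2.mean(axis=0)
    print(np.linalg.norm(centers[0] - centers[1]))         # Euclidean distance
    ```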

  11. Multivariate pattern analysis of fMRI: the early beginnings.

    PubMed

    Haxby, James V

    2012-08-15

    In 2001, we published a paper on the representation of faces and objects in ventral temporal cortex that introduced a new method for fMRI analysis, which subsequently came to be called multivariate pattern analysis (MVPA). MVPA now refers to a diverse set of methods that analyze neural responses as patterns of activity that reflect the varying brain states that a cortical field or system can produce. This paper recounts the circumstances and events that led to the original study and later developments and innovations that have greatly expanded this approach to fMRI data analysis, leading to its widespread application. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. A Chaotic Ordered Hierarchies Consistency Analysis Performance Evaluation Model

    NASA Astrophysics Data System (ADS)

    Yeh, Wei-Chang

    2013-02-01

    The Hierarchies Consistency Analysis (HCA) was proposed by Guh, together with a case study on a resort, to reinforce a weakness of the Analytical Hierarchy Process (AHP). Although the results obtained helped the decision maker reach more reasonable and rational verdicts, the HCA itself is flawed. In this paper, our objective is to point out the problems of HCA and then propose a revised method, called chaotic ordered HCA (COH for short), which avoids these problems. Since the COH is based upon Guh's method, the decision maker establishes decisions in a way similar to that of the original method.

  13. Linear Discriminant Analysis on a Spreadsheet.

    ERIC Educational Resources Information Center

    Busbey, Arthur Bresnahan III

    1989-01-01

    Described is a software package, "Trapeze," within which a routine called LinDis can be used. Discussed are teaching methods, the linear discriminant model and equations, the LinDis worksheet, and an example. The set up for this routine is included. (CW)

  14. Microbial Resistant Test Method Development

    EPA Science Inventory

    Because humans spend most of their time in the indoor environment, environmental analysis of the quality of indoor air has become an important research topic. A major component of the aerosol in the indoor environment consists of biological particles, called bioaerosols, and fur...

  15. The fuel tax compliance unit : an evaluation and analysis of results.

    DOT National Transportation Integrated Search

    2004-01-01

    Kentucky utilized TEA-21 federal funds to create an innovative pilot program to identify the best practices and methods for auditing taxpayers of transportation related taxes. This program involved a four-year experimental program called the Fuel Tax...

  16. Numerical analysis on the cutting and finishing efficiency of MRAFF process

    NASA Astrophysics Data System (ADS)

    Lih, F. L.

    2016-03-01

    The aim of the present research is to conduct a numerical study of the characteristics of a two-phase magnetorheological fluid under different operating conditions, using the finite volume method with the SIMPLE pressure-velocity coupling algorithm and an add-on MHD code.

  17. Review of Research on Educational Leadership and Management in Asia: A Comparative Analysis of Research Topics and Methods, 1995-2012

    ERIC Educational Resources Information Center

    Hallinger, Philip; Chen, Junjun

    2015-01-01

    Over the past two decades scholars have called for a more concerted effort to develop an empirically grounded literature on educational leadership outside of mainstream "Western" contexts. This paper reports the results of a review of research topics and methods that comprise the literature on educational leadership and management in…

  18. Bias Characterization in Probabilistic Genotype Data and Improved Signal Detection with Multiple Imputation

    PubMed Central

    Palmer, Cameron; Pe’er, Itsik

    2016-01-01

    Missing data are an unavoidable component of modern statistical genetics. Different array or sequencing technologies cover different single nucleotide polymorphisms (SNPs), leading to a complicated mosaic pattern of missingness where both individual genotypes and entire SNPs are sporadically absent. Such missing data patterns cannot be ignored without introducing bias, yet cannot be inferred exclusively from nonmissing data. In genome-wide association studies, the accepted solution to missingness is to impute missing data using external reference haplotypes. The resulting probabilistic genotypes may be analyzed in the place of genotype calls. A general-purpose paradigm, called Multiple Imputation (MI), is known to model uncertainty in many contexts, yet it is not widely used in association studies. Here, we undertake a systematic evaluation of existing imputed data analysis methods and MI. We characterize biases related to uncertainty in association studies, and find that bias is introduced both at the imputation level, when imputation algorithms generate inconsistent genotype probabilities, and at the association level, when analysis methods inadequately model genotype uncertainty. We find that MI performs at least as well as existing methods or in some cases much better, and provides a straightforward paradigm for adapting existing genotype association methods to uncertain data. PMID:27310603
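
    The pooling step at the heart of MI is Rubin's rules; a minimal sketch with made-up per-imputation estimates:

    ```python
    # Combine an estimate and its variance across M imputed datasets.
    import numpy as np

    def rubins_rules(estimates, variances):
        estimates, variances = np.asarray(estimates), np.asarray(variances)
        m = len(estimates)
        qbar = estimates.mean()            # pooled point estimate
        ubar = variances.mean()            # within-imputation variance
        b = estimates.var(ddof=1)          # between-imputation variance
        return qbar, ubar + (1 + 1 / m) * b

    est = [0.42, 0.45, 0.40, 0.47, 0.43]       # e.g. log-odds ratios (made up)
    var = [0.010, 0.011, 0.010, 0.012, 0.011]
    print(rubins_rules(est, var))
    ```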

  19. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.

  20. A Semi-Parametric Bayesian Mixture Modeling Approach for the Analysis of Judge Mediated Data

    ERIC Educational Resources Information Center

    Muckle, Timothy Joseph

    2010-01-01

    Existing methods for the analysis of ordinal-level data arising from judge ratings, such as the Multi-Facet Rasch model (MFRM, or the so-called Facets model) have been widely used in assessment in order to render fair examinee ability estimates in situations where the judges vary in their behavior or severity. However, this model makes certain…

  1. Rapid Analysis and Manufacturing Propulsion Technology (RAMPT)

    NASA Technical Reports Server (NTRS)

    Fikes, John C.

    2018-01-01

    NASA's strategic plan calls for the development of enabling technologies, improved production methods, and advanced design and analysis tools related to the agency's objectives to expand human presence in the solar system. NASA seeks to advance exploration, science, innovation, benefits to humanity, and international collaboration, as well as facilitate and utilize U.S. commercial capabilities to deliver cargo and crew to space.

  2. The SOBANE risk management strategy and the Déparis method for the participatory screening of the risks.

    PubMed

    Malchaire, J B

    2004-08-01

    The first section of the document describes a risk-prevention strategy, called SOBANE, in four levels: screening, observation, analysis and expertise. The aim is to make risk prevention faster, more cost effective, and more effective in coordinating the contributions of the workers themselves, their management, the internal and external occupational health (OH) practitioners and the experts. These four levels are: screening, where the risk factors are detected by the workers and their management, and obvious solutions are implemented; observation, where the remaining problems are studied in more detail, one by one, and the reasons and the solutions are discussed in detail; analysis, where, when necessary, an OH practitioner is called upon to carry out appropriate measurements to develop specific solutions; expertise, where, in very sophisticated and rare cases, the assistance of an expert is called upon to solve a particular problem. The method for the participatory screening of the risks (in French: Dépistage Participatif des Risques), Déparis, is proposed for the first level screening of the SOBANE strategy. The work situation is systematically reviewed and all the aspects conditioning the easiness, the effectiveness and the satisfaction at work are discussed, in search of practical prevention measures. The points to be studied more in detail at level 2, observation, are identified. The method is carried out during a meeting of key workers and technical staff. The method proves to be simple, sparing in time and means and playing a significant role in the development of a dynamic plan of risk management and of a culture of dialogue in the company.

  3. An improved adaptive weighting function method for State Estimation in Power Systems with VSC-MTDC

    NASA Astrophysics Data System (ADS)

    Zhao, Kun; Yang, Xiaonan; Lang, Yansheng; Song, Xuri; Wang, Minkun; Luo, Yadi; Wu, Lingyun; Liu, Peng

    2017-04-01

    This paper presents an effective approach for state estimation in power systems that include multi-terminal voltage-source-converter-based high-voltage direct current (VSC-MTDC), called the improved adaptive weighting function method. The proposed approach is simplified in that the VSC-MTDC system is solved first, followed by the AC system, because the new state estimation method only changes the weights and keeps the matrix dimension unchanged. Accurate and fast convergence for the AC/DC system can be achieved by the adaptive weighting function method, which also provides technical support for simulation analysis and accurate regulation of AC/DC systems. Both theoretical analysis and numerical tests verify the practicability, validity, and convergence of the new method.
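
    A minimal sketch of the weighted-least-squares core that an adaptive weighting function method re-weights; the linear measurement model and the particular weight function below are illustrative assumptions.

    ```python
    # Iteratively re-weighted least squares for z = H x + e with one bad reading.
    import numpy as np

    rng = np.random.default_rng(0)
    H = rng.normal(size=(20, 4))             # measurement Jacobian (assumed)
    x_true = np.array([1.0, -0.5, 0.3, 2.0])
    z = H @ x_true + rng.normal(scale=0.05, size=20)
    z[3] += 1.0                              # one bad measurement

    w = np.ones(20)
    for _ in range(10):
        W = np.diag(w)
        x = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
        r = z - H @ x
        w = 1.0 / (1.0 + (r / 0.1) ** 2)     # adaptive weight function (assumed)
    print(np.round(x, 3))                    # close to x_true despite the outlier
    ```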

  4. Text Genres in Information Organization

    ERIC Educational Resources Information Center

    Nahotko, Marek

    2016-01-01

    Introduction: This research explored the text genres used by so-called information organizers in the processes of information organization in information systems. Method: The research employed text genre socio-functional analysis. Five genre groups in information organization were distinguished. Every genre group used in information…

  5. Story Retelling Used with Average and Learning Disabled Readers as a Measure of Reading Comprehension.

    ERIC Educational Resources Information Center

    Hansen, Cheryl L.

    1978-01-01

    A method for quantifying story retells, called proposition analysis, was used to study the reading comprehension performance of 34 learning disabled and normal fifth and sixth graders. Journal availability: see EC 112 927. (DLS)

  6. Effect of two 12-minute culturally targeted films on intent to call 911 for stroke

    PubMed Central

    Williams, Olajide; DeSorbo, Alexandra; Eimicke, Joseph; Abel-Bey, Amparo; Valdez, Lenfis; Noble, James; Gordillo, Madeleine; Ravenell, Joseph; Ramirez, Mildred; Teresi, Jeanne A.; Jean-Louis, Girardin; Ogedegbe, Gbenga

    2016-01-01

    Objective: We assessed the behavioral effect of two 12-minute culturally targeted stroke films on immediately calling 911 for suspected stroke among black and Hispanic participants using a quasi-experimental pretest-posttest design. Methods: We enrolled 102 adult churchgoers (60 black and 42 Hispanic) into a single viewing of one of the 2 stroke films—a Gospel musical (English) or Telenovela (Spanish). We measured intent to immediately call 911 using the validated 28-item Stroke Action Test in English and Spanish, along with related variables, before and immediately after the intervention. Data were analyzed using repeated-measures analysis of variance. Results: An increase in intent to call 911 was seen immediately following the single viewing. Higher self-efficacy for calling 911 was associated with intent to call 911 among Hispanic but not black participants. A composite measure of barriers to calling 911 was not associated with intent to call 911 in either group. A significant association was found between higher stroke symptom knowledge and intent to call 911 at baseline, but not immediately following the intervention. No sex associations were found; however, being older was associated with greater intent to call 911. The majority of participants would strongly recommend the films to others. One participant appropriately called 911 for a real-life stroke event. Conclusions: Narrative communication in the form of tailored short films may improve intent to call 911 for stroke among the black and Hispanic population. PMID:27164682

  7. Scratch and dig analysis for Metis mirrors surfaces defects evaluation

    NASA Astrophysics Data System (ADS)

    Špína, M.; Procháska, F.; Melich, R.

    2016-11-01

    The presented paper aims to theoretically analyze the possibilities, advantages and drawbacks of standard methods used for the assessment of optical surface defects (the so-called Scratch and Dig analysis). Based on the acquired knowledge, we design and apply a process of SaD analysis suitable for the evaluation of the optical surfaces of the mirrors of the Metis space coronagraph, which were recently manufactured at the Centre Toptec.

  8. Systematic comparison of variant calling pipelines using gold standard personal exome variants

    PubMed Central

    Hwang, Sohyun; Kim, Eiru; Lee, Insuk; Marcotte, Edward M.

    2015-01-01

    The success of clinical genomics using next generation sequencing (NGS) requires the accurate and consistent identification of personal genome variants. Assorted variant calling methods have been developed, which show low concordance between their calls. Hence, a systematic comparison of the variant callers could give important guidance to NGS-based clinical genomics. Recently, a set of high-confidence variant calls for one individual (NA12878) has been published by the Genome in a Bottle (GIAB) consortium, enabling performance benchmarking of different variant calling pipelines. Based on the gold standard reference variant calls from GIAB, we compared the performance of thirteen variant calling pipelines, testing combinations of three read aligners (BWA-MEM, Bowtie2, and Novoalign) and four variant callers (Genome Analysis Tool Kit HaplotypeCaller (GATK-HC), Samtools mpileup, Freebayes, and Ion Proton Variant Caller (TVC)), on twelve data sets for the NA12878 genome sequenced by different platforms including Illumina2000, Illumina2500, and Ion Proton, with various exome capture systems and exome coverage. We observed different biases toward specific types of SNP genotyping errors by the different variant callers. The results of our study provide useful guidelines for reliable variant identification from deep sequencing of personal genomes. PMID:26639839
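
    For readers who want to reproduce this kind of benchmarking on a small scale, the sketch below compares a call set against a gold standard by naive exact matching of (chrom, pos, ref, alt). This is an illustration only: the file names are hypothetical, it reads plain-text VCF, and real comparisons first normalize variant representation (e.g., with tools such as hap.py).

    ```python
    # Naive precision/recall of a variant call set versus a truth set.
    def load_calls(path):
        calls = set()
        with open(path) as fh:
            for line in fh:
                if line.startswith("#"):
                    continue                      # skip VCF header lines
                chrom, pos, _, ref, alt = line.split("\t")[:5]
                calls.add((chrom, pos, ref, alt))
        return calls

    truth = load_calls("giab_high_confidence.vcf")   # hypothetical path
    test = load_calls("pipeline_calls.vcf")          # hypothetical path

    tp = len(test & truth)
    fp = len(test - truth)
    fn = len(truth - test)
    print(f"precision={tp / (tp + fp):.3f} recall={tp / (tp + fn):.3f}")
    ```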

  9. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for cross sections evaluated from systematic studies, we used the neutron activation analysis technique (NAA) with the absolute method to determine the various constituent concentrations of certified samples of animal blood, milk and hay. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample; this is why the approach is called the absolute method. The results obtained by the absolute method showed that its values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
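
    The record invokes the fundamental activation equation without stating it. For reference, a standard textbook form (assumed here, not quoted from the paper) relates the accumulated photopeak counts C to the mass m of the element being quantified:

    ```latex
    \[
    C \;=\; \frac{m\,N_A\,\theta}{M}\;
            \sigma\,\varphi\,
            \left(1-e^{-\lambda t_i}\right)
            e^{-\lambda t_d}\,
            \frac{1-e^{-\lambda t_c}}{\lambda}\;
            \varepsilon\, I_\gamma
    \]
    % N_A: Avogadro's number; \theta: isotopic abundance; M: atomic mass;
    % \sigma: activation cross section; \varphi: neutron flux; \lambda: decay
    % constant; t_i, t_d, t_c: irradiation, decay and counting times;
    % \varepsilon: detector efficiency; I_\gamma: gamma emission probability.
    % The absolute method solves this directly for m; the relative method
    % instead cancels \sigma, \varphi and \varepsilon against a standard.
    ```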

  10. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    PubMed

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A usual task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods to solve this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Study of the results obtained in solving BP at different levels of complexity has allowed us to reveal strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method in solving BP in many cases. Efficacy of the LANNIA method is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters 21578) typically used for label categorization.
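
    For orientation, the bars problem can be generated in a few lines. The sketch below uses common illustrative settings (an 8-by-8 grid, each bar active with probability 1/8); the paper's exact generation parameters are not reproduced here.

    ```python
    # Generate bars-problem data: each image is a Boolean superposition of
    # horizontal and vertical bars, which are the latent Boolean factors a
    # BFA method is expected to recover.
    import numpy as np

    rng = np.random.default_rng(0)

    def bars_image(n=8, p=0.125):
        img = np.zeros((n, n), dtype=bool)
        for i in range(n):
            if rng.random() < p:      # horizontal bar i is "on"
                img[i, :] = True
            if rng.random() < p:      # vertical bar i is "on"
                img[:, i] = True
        return img

    data = np.array([bars_image().ravel() for _ in range(1000)])
    print(data.shape)  # (1000, 64) binary observations, 16 hidden factors
    ```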

  11. Analysis Methods and Models for Small Unit Operations

    DTIC Science & Technology

    2006-07-01

    ...is used in other studies to indicate in what way operational effectiveness can be qualified and quantified... the node 'Prediction' is called a child of the node 'Success' and the node 'Success' is called a parent of the node 'Prediction' (Figure C.2 shows a simple example)... event A is a child of event B and event B is a child of event C (C -- B -- A). The belief network or influence diagram has to be a directed network

  12. GRIDSS: sensitive and specific genomic rearrangement detection using positional de Bruijn graph assembly

    PubMed Central

    Do, Hongdo; Molania, Ramyar

    2017-01-01

    The identification of genomic rearrangements with high sensitivity and specificity using massively parallel sequencing remains a major challenge, particularly in precision medicine and cancer research. Here, we describe a new method for detecting rearrangements, GRIDSS (Genome Rearrangement IDentification Software Suite). GRIDSS is a multithreaded structural variant (SV) caller that performs efficient genome-wide break-end assembly prior to variant calling using a novel positional de Bruijn graph-based assembler. By combining assembly, split read, and read pair evidence using probabilistic scoring, GRIDSS achieves high sensitivity and specificity on simulated, cell line, and patient tumor data, recently winning SV subchallenge #5 of the ICGC-TCGA DREAM8.5 Somatic Mutation Calling Challenge. On human cell line data, GRIDSS halves the false discovery rate compared to other recent methods while matching or exceeding their sensitivity. GRIDSS identifies nontemplate sequence insertions, microhomologies, and large imperfect homologies, estimates a quality score for each breakpoint, stratifies calls into high or low confidence, and supports multisample analysis. PMID:29097403
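
    The positional de Bruijn graph idea can be illustrated with a toy: unlike a plain de Bruijn graph, each node is a (position, k-mer) pair, so reads only share nodes where their k-mers agree on genomic position. The sketch below is a simplification for intuition, not GRIDSS's actual assembler.

    ```python
    # Toy positional de Bruijn graph: edge weights count read support, and
    # assembly would walk the highest-weight path through the graph.
    from collections import defaultdict

    def add_read(graph, seq, start, k=4):
        for i in range(len(seq) - k):
            u = (start + i, seq[i:i + k])
            v = (start + i + 1, seq[i + 1:i + 1 + k])
            graph[u][v] += 1

    graph = defaultdict(lambda: defaultdict(int))
    add_read(graph, "ACGTACGGT", start=100)
    add_read(graph, "CGTACGGTA", start=101)  # overlapping read reinforces path
    for u, edges in sorted(graph.items()):
        for v, weight in edges.items():
            print(u, "->", v, "weight", weight)
    ```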

  13. Efficient calibration for imperfect computer models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuo, Rui; Wu, C. F. Jeff

    Many computer models contain unknown parameters which need to be estimated using physical observations. Furthermore, calibration methods based on Gaussian process models may lead to unreasonable estimates for imperfect computer models. In this work, we extend that line of study to calibration problems with stochastic physical data. We propose a novel method, called the L2 calibration, and show its semiparametric efficiency. The conventional method of ordinary least squares is also studied; theoretical analysis shows that it is consistent but not efficient. Numerical examples show that the proposed method outperforms the existing ones.

  14. Nondestructive equipment study

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Identification of existing nondestructive evaluation (NDE) methods that could be used in a low Earth orbit environment; evaluation of each method with respect to the set of criteria called out in the statement of work; selection of the most promising NDE methods for further evaluation; use of the selected NDE methods to test samples of pressure vessel materials in a vacuum; pressure testing of a complex monolithic pressure vessel with known flaws using acoustic emission in a vacuum; and recommendations for further studies based on analysis and testing are covered.

  15. Evaluation of microarray data normalization procedures using spike-in experiments

    PubMed Central

    Rydén, Patrik; Andersson, Henrik; Landfors, Mattias; Näslund, Linda; Hartmanová, Blanka; Noppa, Laila; Sjöstedt, Anders

    2006-01-01

    Background Recently, a large number of methods for the analysis of microarray data have been proposed, but there are few comparisons of their relative performance. By using so-called spike-in experiments, it is possible to characterize the analyzed data and thereby enable comparisons of different analysis methods. Results A spike-in experiment using eight in-house produced arrays was used to evaluate established and novel methods for filtration, background adjustment, scanning, channel adjustment, and censoring. The S-plus package EDMA, a stand-alone tool providing characterization of analyzed cDNA-microarray data obtained from spike-in experiments, was developed and used to evaluate 252 normalization methods. For all analyses, the sensitivities at low false positive rates were observed together with estimates of the overall bias and the standard deviation. In general, there was a trade-off between the ability of the analyses to identify differentially expressed genes (i.e., the analyses' sensitivities) and their ability to provide unbiased estimators of the desired ratios. Virtually all analyses underestimated the magnitude of the regulation; often less than 50% of the true regulation was observed. Moreover, the bias depended on the underlying mRNA concentration; low concentration resulted in high bias. Many of the analyses had relatively low sensitivities, but analyses that used either the constrained model (i.e., a procedure that combines data from several scans) or partial filtration (a novel method for treating data from so-called not-found spots) had, with few exceptions, high sensitivities. These methods gave considerably higher sensitivities than some commonly used analysis methods. Conclusion The use of spike-in experiments is a powerful approach for evaluating microarray preprocessing procedures. Analyzed data are characterized by properties of the observed log-ratios and the analysis' ability to detect differentially expressed genes. If bias is not a major problem, we recommend the use of either the CM-procedure or partial filtration. PMID:16774679

  16. Analysis and selection of optimal function implementations in massively parallel computer

    DOEpatents

    Archer, Charles Jens [Rochester, MN; Peters, Amanda [Rochester, MN; Ratterman, Joseph D [Rochester, MN

    2011-05-31

    An apparatus, program product and method optimize the operation of a parallel computer system by, in part, collecting performance data for a set of implementations of a function capable of being executed on the parallel computer system based upon the execution of the set of implementations under varying input parameters in a plurality of input dimensions. The collected performance data may be used to generate selection program code that is configured to call selected implementations of the function in response to a call to the function under varying input parameters. The collected performance data may be used to perform more detailed analysis to ascertain the comparative performance of the set of implementations of the function under the varying input parameters.
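
    The mechanism claimed here (benchmark several implementations across input dimensions, then generate selection code that dispatches to the winner) can be sketched compactly. The implementations, input sizes, and selection rule below are illustrative assumptions, not the patent's embodiment.

    ```python
    # Collect performance data for two implementations of the same function,
    # then dispatch each call to the implementation that won for the nearest
    # benchmarked input size.
    import time

    def impl_loop(xs):
        return sum(x * x for x in xs)

    def impl_map(xs):
        return sum(map(lambda x: x * x, xs))

    IMPLEMENTATIONS = {"loop": impl_loop, "map": impl_map}

    def collect_performance(sizes=(10, 1000, 100000)):
        table = {}
        for n in sizes:
            xs = list(range(n))
            timings = {}
            for name, fn in IMPLEMENTATIONS.items():
                t0 = time.perf_counter()
                fn(xs)
                timings[name] = time.perf_counter() - t0
            table[n] = min(timings, key=timings.get)   # fastest per size
        return table

    PERF = collect_performance()

    def selected(xs):
        n = min(PERF, key=lambda s: abs(s - len(xs)))
        return IMPLEMENTATIONS[PERF[n]](xs)

    print(selected(list(range(500))))
    ```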

  17. Coarse analysis of collective behaviors: Bifurcation analysis of the optimal velocity model for traffic jam formation

    NASA Astrophysics Data System (ADS)

    Miura, Yasunari; Sugiyama, Yuki

    2017-12-01

    We present a general method for analyzing macroscopic collective phenomena observed in many-body systems. For this purpose, we employ diffusion maps, one of the dimensionality-reduction techniques, and systematically define a few relevant coarse-grained variables for describing macroscopic phenomena. The time evolution of macroscopic behavior is described as a trajectory in the low-dimensional space constructed by these coarse variables. We apply this method to the analysis of the traffic model, called the optimal velocity model, and reveal a bifurcation structure, which features a transition to the emergence of a moving cluster as a traffic jam.
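
    The optimal velocity model itself is compact enough to state in full. The sketch below integrates the standard form of the model with commonly used parameter values (assumed here, not taken from this paper); with these settings the uniform flow is unstable and a jam emerges spontaneously.

    ```python
    # Optimal velocity (OV) model on a ring: car i relaxes toward an optimal
    # velocity V(h) of its headway h with sensitivity a.
    import numpy as np

    N, L, a, dt = 30, 60.0, 1.0, 0.05   # cars, ring length, sensitivity, step
    rng = np.random.default_rng(0)

    def V(h):
        return np.tanh(h - 2.0) + np.tanh(2.0)   # standard tanh OV function

    x = np.linspace(0, L, N, endpoint=False) + 0.1 * rng.standard_normal(N)
    v = V(L / N) * np.ones(N)

    for _ in range(20000):                   # explicit Euler integration
        h = (np.roll(x, -1) - x) % L         # headway to the car ahead
        v += a * (V(h) - v) * dt
        x = (x + v * dt) % L

    # A nonzero spread of headways signals a traffic jam (moving cluster).
    print("headway std:", ((np.roll(x, -1) - x) % L).std())
    ```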

  18. The GeoViz Toolkit: Using component-oriented coordination methods for geographic visualization and analysis

    PubMed Central

    Hardisty, Frank; Robinson, Anthony C.

    2010-01-01

    In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
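
    The coordination strategy named here combines introspection, the observer pattern, and reflective invocation. The GeoViz Toolkit is a Java program; the sketch below re-creates the gist in Python under an invented naming convention (subscribe_<event> paired with on_<event>), so it is a paraphrase of the idea rather than the toolkit's API.

    ```python
    # Auto-wire components: reflection pairs event sources with listeners.
    class Coordinator:
        def __init__(self):
            self.tools = []

        def register(self, tool):
            for other in self.tools:          # wire both directions
                self._wire(tool, other)
                self._wire(other, tool)
            self.tools.append(tool)

        def _wire(self, source, listener):
            for name in dir(source):
                if name.startswith("subscribe_"):
                    event = name[len("subscribe_"):]
                    handler = getattr(listener, "on_" + event, None)
                    if callable(handler):
                        getattr(source, name)(handler)  # reflective call

    class Map:
        def __init__(self):
            self._subs = []
        def subscribe_selection(self, cb):
            self._subs.append(cb)
        def select(self, ids):
            for cb in self._subs:
                cb(ids)

    class Scatterplot:
        def on_selection(self, ids):
            print("scatterplot highlights", ids)

    c = Coordinator()
    m = Map()
    c.register(m)
    c.register(Scatterplot())
    m.select([3, 7])   # -> scatterplot highlights [3, 7]
    ```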

  19. Turnover intentions in a call center: The role of emotional dissonance, job resources, and job satisfaction

    PubMed Central

    Zito, Margherita; Molino, Monica; Cortese, Claudio Giovanni; Ghislieri, Chiara; Colombo, Lara

    2018-01-01

    Background Turnover intentions refer to employees’ intent to leave the organization and, within call centers, can be influenced by factors such as relational variables or the perceived quality of working life, which can be affected by emotional dissonance. This specific job demand, to express emotions not felt, is peculiar to call centers and can influence job satisfaction and turnover intentions, a crucial problem in these working contexts. This study aims to examine, within the theoretical framework of the Job Demands-Resources Model, the role of emotional dissonance (a job demand) and two resources, job autonomy and supervisors’ support, in the perception of job satisfaction and turnover intentions in an Italian call center. Method The study involved 318 call center agents of an Italian telecommunication company. Descriptive statistics were first computed with SPSS 22; a path analysis was then performed with LISREL 8.72, testing both direct and indirect effects. Results Results suggest the role of resources in fostering job satisfaction and in decreasing turnover intentions. Emotional dissonance shows a negative relation with job satisfaction and a positive relation with turnover intentions. Moreover, job satisfaction is negatively related to turnover intentions and mediates the relationship between job resources and turnover intentions. Conclusion This study contributes to extending knowledge about the variables influencing turnover intentions, a crucial problem among call centers. Moreover, the study identifies theoretical considerations and practical implications to promote well-being among call center employees. To foster job satisfaction and reduce turnover intentions, it is important to make resources available, but also to offer specific training programs that make employees and supervisors aware of the consequences of emotional dissonance. PMID:29401507

  20. Semi-Supervised Marginal Fisher Analysis for Hyperspectral Image Classification

    NASA Astrophysics Data System (ADS)

    Huang, H.; Liu, J.; Pan, Y.

    2012-07-01

    The problem of learning with both labeled and unlabeled examples arises frequently in hyperspectral image (HSI) classification. Marginal Fisher analysis is a supervised method and cannot be directly applied to semi-supervised classification. In this paper, we propose a novel method, called semi-supervised marginal Fisher analysis (SSMFA), to process HSI of natural scenes; it uses a combination of semi-supervised learning and manifold learning. In SSMFA, a new difference-based optimization objective function with unlabeled samples has been designed. SSMFA preserves the manifold structure of labeled and unlabeled samples in addition to separating labeled samples of different classes from each other. The semi-supervised method has an analytic form of the globally optimal solution, which can be computed by eigendecomposition. Classification experiments on a challenging HSI task demonstrate that this method outperforms current state-of-the-art HSI-classification methods.

  1. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing POD demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for its probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
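
    The binomial reasoning behind the classic 29-flaw demonstration is easy to reproduce. The numbers below are illustrative assumptions showing the trade-off the abstract describes: passing 29 of 29 detections demonstrates 90% POD at 95% confidence, but systems with POD barely above 0.90 will usually fail the demonstration.

    ```python
    # Point-estimate view of a 29-of-29 POD demonstration: the probability
    # of passing is POD**29 when every flaw must be found.
    n = 29

    def prob_pass(pod, n=n):
        return pod ** n

    print(f"P(pass | POD=0.90) = {prob_pass(0.90):.3f}")   # ~0.047
    print(f"P(pass | POD=0.99) = {prob_pass(0.99):.3f}")   # ~0.747
    # Hence the optimization problem: choose demonstration parameters so the
    # probability of passing (PPD) is acceptable for good systems while the
    # probability of false calls (POF) stays low.
    ```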

  2. Resolving and quantifying overlapped chromatographic bands by transmutation

    PubMed

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  3. SSAW: A new sequence similarity analysis method based on the stationary discrete wavelet transform.

    PubMed

    Lin, Jie; Wei, Jing; Adjeroh, Donald; Jiang, Bing-Hua; Jiang, Yue

    2018-05-02

    Alignment-free sequence similarity analysis methods often lead to significant savings in computational time over alignment-based counterparts. A new alignment-free sequence similarity analysis method, called SSAW, is proposed. SSAW stands for Sequence Similarity Analysis using the Stationary Discrete Wavelet Transform (SDWT). It extracts k-mers from a sequence, then maps each k-mer to a complex number field. The series of complex numbers so formed is then transformed into feature vectors using the stationary discrete wavelet transform. After these steps, the original sequence is turned into a feature vector with numeric values, which can then be used for clustering and/or classification. Using two different types of applications, namely clustering and classification, we compared SSAW against state-of-the-art alignment-free sequence analysis methods. SSAW demonstrates competitive or superior performance in terms of standard indicators, such as accuracy, F-score, precision, and recall. The running time was significantly better in most cases. These qualities make SSAW a suitable method for sequence analysis, especially given the rapidly increasing volumes of sequence data required by most modern applications.

  4. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    PubMed

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to the complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.

  5. High-energy evolution to three loops

    NASA Astrophysics Data System (ADS)

    Caron-Huot, Simon; Herranen, Matti

    2018-02-01

    The Balitsky-Kovchegov equation describes the high-energy growth of gauge theory scattering amplitudes as well as nonlinear saturation effects which stop it. We obtain the three-loop corrections to the equation in planar N = 4 super Yang-Mills theory. Our method exploits a recently established equivalence with the physics of soft wide-angle radiation, so-called non-global logarithms, and thus yields at the same time the three-loop evolution equation for non-global logarithms. As a by-product of our analysis, we develop a Lorentz-covariant method to subtract infrared and collinear divergences in cross-section calculations in the planar limit. We compare our result in the linear regime with a recent prediction for the so-called Pomeron trajectory, and compare its collinear limit with predictions from the spectrum of twist-two operators.

  6. Defining Human Failure Events for Petroleum Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  7. Preliminary investigation of vocal variation in the Mexican Spotted Owl (Strix occidentalis lucida): would vocal analysis of the four-note location call be a useful field tool for individual identification?

    Treesearch

    Wendy A. Kuntz; Peter B. Stacey

    1997-01-01

    Individual identification, especially in rare species, can provide managers with critical information about demographic processes. Traditionally, banding has been the only effective method of marking individuals. However, banding's drawbacks have led some researchers to suggest vocal analysis as an alternative. We explore this prospect for Mexican Spotted Owls (...

  8. Multi-component separation and analysis of bat echolocation calls.

    PubMed

    DiCecco, John; Gaudette, Jason E; Simmons, James A

    2013-01-01

    The vast majority of animal vocalizations contain multiple frequency modulated (FM) components with varying amounts of non-linear modulation and harmonic instability. This is especially true of biosonar sounds where precise time-frequency templates are essential for neural information processing of echoes. Understanding the dynamic waveform design by bats and other echolocating animals may help to improve the efficacy of man-made sonar through biomimetic design. Bats are known to adapt their call structure based on the echolocation task, proximity to nearby objects, and density of acoustic clutter. To interpret the significance of these changes, a method was developed for component separation and analysis of biosonar waveforms. Techniques for imaging in the time-frequency plane are typically limited due to the uncertainty principle and interference cross terms. This problem is addressed by extending the use of the fractional Fourier transform to isolate each non-linear component for separate analysis. Once separated, empirical mode decomposition can be used to further examine each component. The Hilbert transform may then successfully extract detailed time-frequency information from each isolated component. This multi-component analysis method is applied to the sonar signals of four species of bats recorded in-flight by radiotelemetry along with a comparison of other common time-frequency representations.
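
    The final stage described, Hilbert-transform extraction of time-frequency detail from one isolated component, can be demonstrated on a synthetic FM sweep. The signal and sample rate below are assumptions; in the paper's workflow the input would be a component already separated by the fractional Fourier transform and empirical mode decomposition.

    ```python
    # Instantaneous frequency of an FM component via the analytic signal.
    import numpy as np
    from scipy.signal import hilbert

    fs = 250_000                                   # sample rate (Hz)
    t = np.arange(0, 0.005, 1 / fs)                # 5 ms call
    f0, f1 = 80_000, 40_000                        # downward FM sweep
    sweep = f0 + (f1 - f0) * t / t[-1]
    x = np.sin(2 * np.pi * np.cumsum(sweep) / fs)  # synthetic FM component

    analytic = hilbert(x)                          # analytic signal x + iH[x]
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency
    print(inst_freq[10], inst_freq[-10])           # ~80 kHz falling to ~40 kHz
    ```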

  9. Using comparative genome analysis to identify problems in annotated microbial genomes.

    PubMed

    Poptsova, Maria S; Gogarten, J Peter

    2010-07-01

    Genome annotation is a tedious task that is mostly done by automated methods; however, the accuracy of these approaches has been questioned since the beginning of the sequencing era. Genome annotation is a multilevel process, and errors can emerge at different stages: during sequencing, as a result of gene-calling procedures, and in the process of assigning gene functions. Missed or wrongly annotated genes differentially impact different types of analyses. Here we discuss and demonstrate how the methods of comparative genome analysis can refine annotations by locating missing orthologues. We also discuss possible reasons for errors and show that the second-generation annotation systems, which combine multiple gene-calling programs with similarity-based methods, perform much better than the first annotation tools. Since old errors may propagate to the newly sequenced genomes, we emphasize that the problem of continuously updating popular public databases is an urgent and unresolved one. Due to the progress in genome-sequencing technologies, automated annotation techniques will remain the main approach in the future. Researchers need to be aware of the existing errors in the annotation of even well-studied genomes, such as Escherichia coli, and consider additional quality control for their results.

  10. Seeking Information with an Information Visualization System: A Study of Cognitive Styles

    ERIC Educational Resources Information Center

    Yuan, Xiaojun; Zhang, Xiangman; Chen, Chaomei; Avery, Joshua M.

    2011-01-01

    Introduction: This study investigated the effect of cognitive styles on users' information-seeking task performance using a knowledge domain information visualization system called CiteSpace. Method: Sixteen graduate students participated in a user experiment. Each completed an extended cognitive style analysis wholistic-analytic test (the…

  11. Expansive Visibilization to Stimulate EFL Teacher Reflection

    ERIC Educational Resources Information Center

    Ito, Ryu

    2012-01-01

    Despite the growing popularity of action research, bridging the gap between data collection and reflective data analysis still lacks a well-developed methodology. As a supplement to the traditional action research procedure for language teaching, I adopted a method called expansive visibilization (EV), which has the potential to be a reflective…

  12. The Variance Normalization Method of Ridge Regression Analysis.

    ERIC Educational Resources Information Center

    Bulcock, J. W.; And Others

    The testing of contemporary sociological theory often calls for the application of structural-equation models to data which are inherently collinear. It is shown that simple ridge regression, which is commonly used for controlling the instability of ordinary least squares regression estimates in ill-conditioned data sets, is not a legitimate…

  13. Effectiveness of automated notification and customer service call centers for timely and accurate reporting of critical values: a laboratory medicine best practices systematic review and meta-analysis.

    PubMed

    Liebow, Edward B; Derzon, James H; Fontanesi, John; Favoretto, Alessandra M; Baetz, Rich Ann; Shaw, Colleen; Thompson, Pamela; Mass, Diana; Christenson, Robert; Epner, Paul; Snyder, Susan R

    2012-09-01

    To conduct a systematic review of the evidence available in support of automated notification methods and call centers, and to acknowledge other considerations in making evidence-based recommendations for best practices in improving the timeliness and accuracy of critical value reporting. This review followed the Laboratory Medicine Best Practices (LMBP) review methods (Christenson et al., 2011). A broad literature search and call for unpublished submissions returned 196 bibliographic records, which were screened for eligibility; 41 studies were retrieved. Of these, 4 contained credible evidence for the timeliness and accuracy of automatic notification systems and 5 provided credible evidence for call centers for communicating critical value information in in-patient care settings. Studies reporting improvement from implementing automated notification reported mean differences, which were standardized using the standardized difference in means (d=0.42; 95% CI=0.2-0.62), while studies reporting improvement from implementing call centers generally reported criterion-referenced findings, which were standardized using odds ratios (OR=22.1; 95% CI=17.1-28.6). The evidence, although suggestive, is not sufficient to make an LMBP recommendation for or against using automated notification systems as a best practice to improve the timeliness of critical value reporting in an in-patient care setting. Call centers, however, are effective in improving the timeliness of critical value reporting in an in-patient care setting, and meet LMBP criteria to be recommended as an "evidence-based best practice." Copyright © 2012 The Canadian Society of Clinical Chemists. All rights reserved.

  14. Toward a standard in structural genome annotation for prokaryotes

    DOE PAGES

    Tripp, H. James; Sutton, Granger; White, Owen; ...

    2015-07-25

    In an effort to identify the best practice for finding genes in prokaryotic genomes and propose it as a standard for automated annotation pipelines, we collected 1,004,576 peptides from various publicly available resources, and these were used as a basis to evaluate various gene-calling methods. The peptides came from 45 bacterial replicons with an average GC content from 31% to 74%, biased toward higher GC content genomes. Automated, manual, and semi-manual methods were used to tally errors in three widely used gene calling methods, as evidenced by peptides mapped outside the boundaries of called genes. We found that the consensus set of identical genes predicted by the three methods constitutes only about 70% of the genes predicted by each individual method (with start and stop required to coincide). Peptide data was useful for evaluating some of the differences between gene callers, but not reliable enough to make the results conclusive, due to limitations inherent in any proteogenomic study. A single, unambiguous, unanimous best practice did not emerge from this analysis, since the available proteomics data were not adequate to provide an objective measurement of differences in accuracy between these methods. However, as a result of this study, software, reference data, and procedures have been better matched among participants, representing a step toward a much-needed standard. In the absence of a sufficient amount of experimental data to achieve a universal standard, our recommendation is that any of these methods can be used by the community, as long as a single method is employed across all datasets to be compared.

  15. An Unconditionally Stable, Positivity-Preserving Splitting Scheme for Nonlinear Black-Scholes Equation with Transaction Costs

    PubMed Central

    Guo, Jianqiang; Wang, Wansheng

    2014-01-01

    This paper deals with the numerical analysis of nonlinear Black-Scholes equation with transaction costs. An unconditionally stable and monotone splitting method, ensuring positive numerical solution and avoiding unstable oscillations, is proposed. This numerical method is based on the LOD-Backward Euler method which allows us to solve the discrete equation explicitly. The numerical results for vanilla call option and for European butterfly spread are provided. It turns out that the proposed scheme is efficient and reliable. PMID:24895653
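
    The backward Euler core of such schemes can be illustrated on the linear Black-Scholes equation for a vanilla European call (the paper's LOD splitting additionally handles the nonlinear transaction-cost term, which this sketch deliberately omits). All parameter values are illustrative.

    ```python
    # Implicit (backward Euler) finite differences for the linear
    # Black-Scholes PDE; marches backward in time from the call payoff.
    import numpy as np

    r, sigma, K, T = 0.05, 0.2, 100.0, 1.0
    M, N = 200, 200                         # space and time steps
    S = np.linspace(0, 300, M + 1)
    dt, dS = T / N, S[1] - S[0]

    V = np.maximum(S - K, 0.0)              # terminal payoff
    A = np.zeros((M + 1, M + 1))
    for i in range(1, M):
        a = 0.5 * sigma**2 * S[i]**2 / dS**2
        b = r * S[i] / (2 * dS)
        A[i, i - 1] = -dt * (a - b)
        A[i, i] = 1 + dt * (2 * a + r)
        A[i, i + 1] = -dt * (a + b)
    A[0, 0] = A[M, M] = 1.0                 # boundary rows

    for n in range(N):
        rhs = V.copy()
        rhs[0] = 0.0                                   # V(0, t) = 0
        rhs[M] = S[M] - K * np.exp(-r * (n + 1) * dt)  # deep in the money
        V = np.linalg.solve(A, rhs)

    print("price at S=100:", np.interp(100.0, S, V))   # ~10.45 analytically
    ```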

  16. Reducing maintenance costs in agreement with CNC machine tools reliability

    NASA Astrophysics Data System (ADS)

    Ungureanu, A. L.; Stan, G.; Butunoi, P. A.

    2016-08-01

    Aligning maintenance strategy with reliability is a challenge due to the need to find an optimal balance between them. Because the various methods described in the relevant literature involve laborious calculations or the use of software that can be costly, this paper proposes a method that is easier to implement on CNC machine tools. The new method, called the Consequence of Failure Analysis (CFA), is based on technical and economic optimization, aimed at obtaining a required level of performance with minimum investment and maintenance costs.

  17. Computer tomography of flows external to test models

    NASA Technical Reports Server (NTRS)

    Prikryl, I.; Vest, C. M.

    1982-01-01

    Computer tomographic techniques for the reconstruction of three-dimensional aerodynamic density fields from interferograms recorded from several different viewing directions were studied. Emphasis is on the case in which an opaque object, such as a test model in a wind tunnel, obscures significant regions of the interferograms (projection data). A method called the Iterative Convolution Method (ICM), existing methods in which the field is represented by series expansions, and analysis of real experimental data in the form of aerodynamic interferograms are discussed.

  18. cyvcf2: fast, flexible variant analysis with Python.

    PubMed

    Pedersen, Brent S; Quinlan, Aaron R

    2017-06-15

    Variant call format (VCF) files document the genetic variation observed after DNA sequencing, alignment and variant calling of a sample cohort. Given the complexity of the VCF format as well as the diverse variant annotations and genotype metadata, there is a need for fast, flexible methods enabling intuitive analysis of the variant data within VCF and BCF files. We introduce cyvcf2, a Python library and software package for fast parsing and querying of VCF and BCF files, and illustrate its speed, simplicity and utility. bpederse@gmail.com or aaronquinlan@gmail.com. cyvcf2 is available from https://github.com/brentp/cyvcf2 under the MIT license and from common python package managers. Detailed documentation is available at http://brentp.github.io/cyvcf2/. © The Author 2017. Published by Oxford University Press.
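
    A minimal usage sketch of the library, following the attribute names in the cyvcf2 documentation (the file name is hypothetical):

    ```python
    from cyvcf2 import VCF

    vcf = VCF("cohort.vcf.gz")          # handles VCF and BCF, plain or gzipped
    for variant in vcf:
        depth = variant.INFO.get("DP")  # missing INFO keys return None
        # gt_types codes genotypes per sample:
        # 0=HOM_REF, 1=HET, 2=UNKNOWN, 3=HOM_ALT
        n_het = (variant.gt_types == 1).sum()
        print(variant.CHROM, variant.POS, variant.REF, variant.ALT,
              depth, n_het)
    ```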

  19. Understanding self-harm in victims of intimate partner violence: a qualitative analysis of calls made by victims to a crisis hotline in China.

    PubMed

    Wong, Susan P Y; Wang, Cuiling; Meng, Mei; Phillips, Michael R

    2011-04-01

    Text analysis of the transcripts of 26 calls made to a Chinese crisis hotline by victims of intimate partner violence (IPV) who reported thoughts or acts of self-harm was used to abstract information on victims' patterns of self-harm and the relationship of their self-harm to IPV. Specific violent episodes often triggered self-harm. Victims considered self-harm a method for airing painful emotions caused by abuse, or as a last resort to escape by dying when they saw no other options and were no longer able to endure the violence. We also elaborate on callers' discussions of barriers to accessing support, sociocultural pressures to preserve "face" and family, and restrictive gender roles that contribute to their self-harm behaviors.

  1. Operator function modeling: An approach to cognitive task analysis in supervisory control systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1987-01-01

    In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).

  2. Pesticide poisoning in Palestine: a retrospective analysis of calls received by Poison Control and Drug Information Center from 2006-2010.

    PubMed

    Sawalha, Ansam F; O'Malley, Gerald F; Sweileh, Waleed M

    2012-01-01

    The agricultural industry is the largest economic sector in Palestine and is characterized by extensive and unregulated use of pesticides. The objective of this study was to analyze phone calls received by the Poison Control and Drug Information Center (PCDIC) in Palestine regarding pesticide poisoning. All phone calls regarding pesticide poisoning received by the PCDIC from 2006 to 2010 were descriptively analyzed. The Statistical Package for the Social Sciences (SPSS version 16) was used for statistical analysis and to create figures. A total of 290 calls regarding pesticide poisoning were received during the study period. Most calls (83.8%) were made by physicians. The average age of reported cases was 19.6 ± 15 years. Pesticide poisoning occurred mostly in males (56.9%) and was most common (75 cases, 25.9%) in the age category of 20-29.9 years. The majority (51.7%) of the cases were deliberate self-harm, while the remainder were accidental exposures. The majority of phone calls (250, 86.2%) described oral exposure to pesticides. Approximately one third (32.9%) of the cases had symptoms consistent with organophosphate poisoning. Gastric lavage (31.7%) was the major decontamination method used, while charcoal was utilized in only 1.4% of the cases. Follow-up was performed in 45.5% of the cases; two patients died after hospital admission while the rest had positive outcomes. Pesticide poisoning is a major health problem in Palestine, and the PCDIC has a clear mission to help in recommending therapy and gathering information.

  3. Efficient calibration for imperfect computer models

    DOE PAGES

    Tuo, Rui; Wu, C. F. Jeff

    2015-12-01

    Many computer models contain unknown parameters which need to be estimated using physical observations. Furthermore, calibration methods based on Gaussian process models may lead to unreasonable estimates for imperfect computer models. In this work, we extend that line of study to calibration problems with stochastic physical data. We propose a novel method, called the L2 calibration, and show its semiparametric efficiency. The conventional method of ordinary least squares is also studied; theoretical analysis shows that it is consistent but not efficient. Numerical examples show that the proposed method outperforms the existing ones.
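
    For orientation, the L2 calibration criterion can be stated generically as follows; the notation is assumed here rather than quoted from the paper. With ζ the true physical response and y^s the computer model output over the input domain Ω:

    ```latex
    \[
    \hat{\theta}_{L_2}
      \;=\; \arg\min_{\theta \in \Theta}
            \bigl\lVert \zeta - y^{s}(\cdot,\theta) \bigr\rVert_{L_2(\Omega)}
      \;=\; \arg\min_{\theta \in \Theta}
            \left( \int_{\Omega}
              \bigl( \zeta(x) - y^{s}(x,\theta) \bigr)^{2}\,dx \right)^{1/2}
    \]
    % In practice \zeta is replaced by a nonparametric estimate built from
    % the physical observations; the semiparametric-efficiency claim in the
    % abstract concerns this plug-in estimator.
    ```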

  4. A comparison of acoustic montoring methods for common anurans of the northeastern United States

    USGS Publications Warehouse

    Brauer, Corinne; Donovan, Therese; Mickey, Ruth M.; Katz, Jonathan; Mitchell, Brian R.

    2016-01-01

    Many anuran monitoring programs now include autonomous recording units (ARUs). These devices collect audio data for extended periods of time with little maintenance and at sites where traditional call surveys might be difficult. Additionally, computer software programs have grown increasingly accurate at automatically identifying the calls of species. However, increased automation may cause increased error. We collected 435 min of audio data with 2 types of ARUs at 10 wetland sites in Vermont and New York, USA, from 1 May to 1 July 2010. For each minute, we determined presence or absence of 4 anuran species (Hyla versicolor, Pseudacris crucifer, Anaxyrus americanus, and Lithobates clamitans) using 1) traditional human identification versus 2) computer-mediated identification with the software package Song Scope® (Wildlife Acoustics, Concord, MA). Detections were compared with a data set consisting of verified calls in order to quantify false positive, false negative, true positive, and true negative rates. Multinomial logistic regression analysis revealed a strong (P < 0.001) 3-way interaction between the ARU recorder type, identification method, and focal species, as well as a trend in the main effect of rain (P = 0.059). Overall, human surveyors had the lowest total error rate (<2%) compared with 18–31% total errors with automated methods. Total error rates varied by species, ranging from 4% for A. americanus to 26% for L. clamitans. The presence of rain may reduce false negative rates. For survey minutes where anurans were known to be calling, the odds of a false negative were increased when fewer individuals of the same species were calling.

  5. Who Are We Not Calling On? A Study of Classroom Participation and the Implementation of the Name Card Method.

    ERIC Educational Resources Information Center

    Carter, Angela

    This study involved observing a second-grade classroom to investigate how the teacher called on students, noting whether the teacher gave enough attention to students who raised their hands frequently by calling on them and examining students' responses when called on. Researchers implemented a new method of calling on students using name cards,…

  6. Contact stresses in gear teeth: A new method of analysis

    NASA Technical Reports Server (NTRS)

    Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.

    1991-01-01

    A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and existing applications of the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. Presented here are the basic theory and the algorithms. Several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.

  7. VaDiR: an integrated approach to Variant Detection in RNA.

    PubMed

    Neums, Lisa; Suenaga, Seiji; Beyerlein, Peter; Anders, Sara; Koestler, Devin; Mariani, Andrea; Chien, Jeremy

    2018-02-01

    Advances in next-generation DNA sequencing technologies are now enabling detailed characterization of sequence variations in cancer genomes. With whole-genome sequencing, variations in coding and non-coding sequences can be discovered, but the associated cost currently limits its general use in research. Whole-exome sequencing is used to characterize sequence variations in coding regions, but the cost of capture reagents and biases in capture rate limit its full use in research. Additional limitations include uncertainty in assigning the functional significance of mutations when these mutations are observed in non-coding regions or in genes that are not expressed in cancer tissue. We investigated the feasibility of uncovering mutations from expressed genes using RNA sequencing datasets with a method called Variant Detection in RNA (VaDiR), which integrates three variant callers, namely SNPiR, RVBoost, and MuTect2. The combination of all three methods, which we called Tier 1 variants, produced the highest precision, with true positive mutations from RNA-seq that could be validated at the DNA level. We also found that the integration of Tier 1 variants with those called by MuTect2 and SNPiR produced the highest recall with acceptable precision. Finally, we observed a higher rate of mutation discovery in genes that are expressed at higher levels. Our method, VaDiR, provides a possibility of uncovering mutations from RNA sequencing datasets that could be useful in further functional analysis. In addition, our approach allows orthogonal validation of DNA-based mutation discovery by providing complementary sequence variation analysis from paired RNA/DNA sequencing datasets.
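
    The tiering logic described in the abstract amounts to set operations over the three call sets. A toy sketch (call-set loading elided; the tuples stand for chrom, pos, ref, alt):

    ```python
    # Tier 1 = consensus of all three callers (highest precision); adding
    # the MuTect2/SNPiR agreement recovers recall, as the abstract reports.
    snpir = {("1", 1000, "A", "G"), ("2", 500, "C", "T"), ("3", 42, "G", "A")}
    rvboost = {("1", 1000, "A", "G"), ("2", 500, "C", "T")}
    mutect2 = {("1", 1000, "A", "G"), ("3", 42, "G", "A")}

    tier1 = snpir & rvboost & mutect2
    extended = tier1 | (mutect2 & snpir)

    print("Tier 1:", tier1)
    print("Tier 1 + MuTect2&SNPiR:", extended)
    ```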

  8. Point model equations for neutron correlation counting: Extension of Böhnel's equations to any order

    DOE PAGES

    Favalli, Andrea; Croft, Stephen; Santi, Peter

    2015-06-15

    Various methods of autocorrelation neutron analysis may be used to extract information about a measurement item containing spontaneously fissioning material. The two predominant approaches are the time correlation analysis (coincidence gate) methods of multiplicity shift register logic and Feynman sampling. The common feature is that the correlated nature of the pulse train can be described by a vector of reduced factorial multiplet rates; we call these singlets, doublets, triplets, etc. Within the point reactor model, the multiplet rates may be related to the properties of the item, the parameters of the detector, and basic nuclear data constants by a series of coupled algebraic equations, the so-called point model equations. Solving, or inverting, the point model equations using experimental calibration model parameters is how assay of unknown items is performed. Currently only the first three multiplets are routinely used. In this work we develop the point model equations to higher order multiplets using the probability generating functions approach combined with the general derivative chain rule, the so-called Faà di Bruno formula. Explicit expressions up to 5th order are provided, as well as the general iterative formula to calculate any order. This study represents the first necessary step towards determining whether higher order multiplets can add value to nondestructive measurement practice for nuclear materials control and accountancy.
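
    The general derivative chain rule mentioned here is standard mathematics. For reference, Faà di Bruno's formula written with partial Bell polynomials B_{n,k}, which is what permits pushing the generating-function derivatives, and hence the multiplet rates, to arbitrary order:

    ```latex
    \[
    \frac{d^{n}}{dx^{n}} f\bigl(g(x)\bigr)
      \;=\; \sum_{k=1}^{n} f^{(k)}\bigl(g(x)\bigr)\,
            B_{n,k}\!\bigl(g'(x),\, g''(x),\, \ldots,\, g^{(n-k+1)}(x)\bigr)
    \]
    % with the Bell polynomials obeying the recurrence
    \[
    B_{n,k}(x_1,\ldots,x_{n-k+1})
      \;=\; \sum_{i=1}^{n-k+1} \binom{n-1}{i-1}\, x_{i}\,
            B_{n-i,\,k-1}(x_1,\ldots,x_{n-i-k+2}),
    \qquad B_{0,0} = 1 .
    \]
    ```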

  9. Connecting the person with dementia and family: a feasibility study of a telepresence robot

    PubMed Central

    2014-01-01

    Background Maintenance of communication is important for people with dementia living in long-term care. The purpose of this study was to assess the feasibility of using “Giraff”, a telepresence robot to enhance engagement between family and a person with dementia living in long-term care. Methods A mixed-methods approach involving semi-structured interviews, call records and video observational data was used. Five people with dementia and their family member participated in a discussion via the Giraff robot for a minimum of six times over a six-week period. A feasibility framework was used to assess feasibility and included video analysis of emotional response and engagement. Results Twenty-six calls with an average duration of 23 mins took place. Residents showed a general state of positive emotions across the calls with a high level of engagement and a minimal level of negative emotions. Participants enjoyed the experience and families reported that the Giraff robot offered the opportunity to reduce social isolation. A number of software and hardware challenges were encountered. Conclusions Participants perceived this novel approach to engage families and people with dementia as a feasible option. Participants were observed and also reported to enjoy the experience. The technical challenges identified have been improved in a newer version of the robot. Future research should include a feasibility trial of longer duration, with a larger sample and a cost analysis. PMID:24456417

  10. The Influence of Judgment Calls on Meta-Analytic Findings.

    PubMed

    Tarrahi, Farid; Eisend, Martin

    2016-01-01

    Previous research has suggested that judgment calls (i.e., methodological choices made in the process of conducting a meta-analysis) have a strong influence on meta-analytic findings and question their robustness. However, prior research applies case study comparison or reanalysis of a few meta-analyses with a focus on a few selected judgment calls. These studies neglect the fact that different judgment calls are related to each other and simultaneously influence the outcomes of a meta-analysis, and that meta-analytic findings can vary due to non-judgment call differences between meta-analyses (e.g., variations of effects over time). The current study analyzes the influence of 13 judgment calls in 176 meta-analyses in marketing research by applying a multivariate, multilevel meta-meta-analysis. The analysis considers simultaneous influences from different judgment calls on meta-analytic effect sizes and controls for alternative explanations based on non-judgment call differences between meta-analyses. The findings suggest that judgment calls have only a minor influence on meta-analytic findings, whereas non-judgment call differences between meta-analyses are more likely to explain differences in meta-analytic findings. The findings support the robustness of meta-analytic results and conclusions.

  11. Identification and validation of loss of function variants in clinical contexts.

    PubMed

    Lescai, Francesco; Marasco, Elena; Bacchelli, Chiara; Stanier, Philip; Mantovani, Vilma; Beales, Philip

    2014-01-01

    The choice of an appropriate variant calling pipeline for exome sequencing data is becoming increasingly more important in translational medicine projects and clinical contexts. Within GOSgene, which facilitates genetic analysis as part of a joint effort of the University College London and the Great Ormond Street Hospital, we aimed to optimize a variant calling pipeline suitable for our clinical context. We implemented the GATK/Queue framework and evaluated the performance of its two callers: the classical UnifiedGenotyper and the new variant discovery tool HaplotypeCaller. We performed an experimental validation of the loss-of-function (LoF) variants called by the two methods using Sequenom technology. UnifiedGenotyper showed a total validation rate of 97.6% for LoF single-nucleotide polymorphisms (SNPs) and 92.0% for insertions or deletions (INDELs), whereas HaplotypeCaller was 91.7% for SNPs and 55.9% for INDELs. We confirm that GATK/Queue is a reliable pipeline in translational medicine and clinical context. We conclude that in our working environment, UnifiedGenotyper is the caller of choice, being an accurate method, with a high validation rate of error-prone calls like LoF variants. We finally highlight the importance of experimental validation, especially for INDELs, as part of a standard pipeline in clinical environments.

  12. Reconstruction of interatomic vectors by principal component analysis of nuclear magnetic resonance data in multiple alignments

    NASA Astrophysics Data System (ADS)

    Hus, Jean-Christophe; Bruschweiler, Rafael

    2002-07-01

    A general method is presented for the reconstruction of interatomic vector orientations from nuclear magnetic resonance (NMR) spectroscopic data of tensor interactions of rank 2, such as dipolar coupling and chemical shielding anisotropy interactions, in solids and partially aligned liquid-state systems. The method, called PRIMA, is based on a principal component analysis of the covariance matrix of the NMR parameters collected for multiple alignments. The five nonzero eigenvalues and their eigenvectors efficiently allow the approximate reconstruction of the vector orientations of the underlying interactions. The method is demonstrated for an isotropic distribution of sample orientations as well as for finite sets of orientations and internuclear vectors encountered in protein systems.
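
    The rank-5 structure exploited by the method is easy to demonstrate numerically: couplings collected under many alignments span at most the five degrees of freedom of a rank-2 alignment tensor. The data below are synthetic stand-ins, not the paper's; the sketch only shows why five principal components suffice.

    ```python
    # Covariance of synthetic dipolar couplings across alignments has ~5
    # nonzero eigenvalues, matching the five PCA components the method uses.
    import numpy as np

    rng = np.random.default_rng(1)
    n_vec, n_align = 50, 12

    geom = rng.normal(size=(n_vec, 5))       # per-vector orientation terms
    tensors = rng.normal(size=(5, n_align))  # per-alignment tensor components
    D = geom @ tensors + 0.01 * rng.normal(size=(n_vec, n_align))

    C = np.cov(D)                            # covariance across alignments
    eigvals = np.linalg.eigvalsh(C)[::-1]
    print(eigvals[:7].round(3))              # 5 large values, then ~0
    ```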

  13. A simple algorithm for quantifying DNA methylation levels on multiple independent CpG sites in bisulfite genomic sequencing electropherograms.

    PubMed

    Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A

    2008-06-01

    DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
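
    The core of such a peak-height calculation can be sketched as follows; the exact normalization used by Mquant may differ, so treat this as the general principle only.

```python
def methylation_level(c_peak_height, t_peak_height):
    """After bisulfite conversion, methylated cytosines still read as C while
    unmethylated cytosines read as T, so the C fraction at a CpG site
    estimates its methylation level."""
    return c_peak_height / (c_peak_height + t_peak_height)

sites = [(820.0, 180.0), (410.0, 390.0)]   # hypothetical (C, T) peak heights per CpG
for i, (c, t) in enumerate(sites, 1):
    print(f"CpG {i}: {100 * methylation_level(c, t):.1f}% methylated")
```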

  14. SLAMMER: Seismic LAndslide Movement Modeled using Earthquake Records

    USGS Publications Warehouse

    Jibson, Randall W.; Rathje, Ellen M.; Jibson, Matthew W.; Lee, Yong W.

    2013-01-01

    This program is designed to facilitate conducting sliding-block analysis (also called permanent-deformation analysis) of slopes in order to estimate slope behavior during earthquakes. The program allows selection from among more than 2,100 strong-motion records from 28 earthquakes and allows users to add their own records to the collection. Any number of earthquake records can be selected using a search interface that selects records based on desired properties. Sliding-block analyses, using any combination of rigid-block (Newmark), decoupled, and fully coupled methods, are then conducted on the selected group of records, and results are compiled in both graphical and tabular form. Simplified methods for conducting each type of analysis are also included.
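
    The rigid-block (Newmark) variant is the simplest of the three analysis types and can be sketched as below: the block accumulates displacement whenever ground acceleration exceeds the critical (yield) acceleration. One-directional sliding and a toy acceleration record are assumed; SLAMMER's actual implementation is more complete.

```python
import numpy as np

def newmark_rigid_block(accel, dt, a_crit):
    """accel: ground acceleration series (m/s^2); a_crit: critical
    acceleration (m/s^2). Returns permanent displacement (m)."""
    v = 0.0   # velocity of the block relative to the ground
    d = 0.0   # accumulated permanent displacement
    for a in accel:
        if v > 0.0 or a > a_crit:
            v += (a - a_crit) * dt   # integrate relative acceleration
            v = max(v, 0.0)          # sliding stops when relative velocity reaches zero
            d += v * dt
    return d

t = np.arange(0.0, 10.0, 0.01)
accel = 3.0 * np.sin(2 * np.pi * t) * np.exp(-0.2 * t)   # toy strong-motion record
print(f"permanent displacement: {newmark_rigid_block(accel, 0.01, 1.0):.3f} m")
```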

  15. The Volatility of Data Space: Topology Oriented Sensitivity Analysis

    PubMed Central

    Du, Jing; Ligmann-Zielinska, Arika

    2015-01-01

    Despite the differences among specific methods, existing Sensitivity Analysis (SA) technologies are all value-based; that is, the uncertainties in the model input and output are quantified as changes of values. This paradigm provides only limited insight into the nature of models and the modeled systems. In addition to the value of data, potentially richer information about the model lies in the topological difference between the pre-model data space and the post-model data space. This paper introduces an innovative SA method called Topology Oriented Sensitivity Analysis, which defines sensitivity as the volatility of the data space. It extends SA to a deeper level that lies in the topology of the data. PMID:26368929

  16. Science on TeacherTube: A Mixed Methods Analysis of Teacher Produced Video

    NASA Astrophysics Data System (ADS)

    Chmiel, Margaret (Marjee)

    Increased bandwidth, inexpensive video cameras and easy-to-use video editing software have made social media sites featuring user generated video (UGV) an increasingly popular vehicle for online communication. As such, UGV have come to play a role in education, both formal and informal, but there has been little research on this topic in scholarly literature. In this mixed-methods study, a content and discourse analysis are used to describe the most successful UGV in the science channel of an education-focused site called TeacherTube. The analysis finds that state achievement tests, and their focus on vocabulary and recall-level knowledge, drive much of the content found on TeacherTube.

  17. Sound imaging of nocturnal animal calls in their natural habitat.

    PubMed

    Mizumoto, Takeshi; Aihara, Ikkyu; Otsuka, Takuma; Takeda, Ryu; Aihara, Kazuyuki; Okuno, Hiroshi G

    2011-09-01

    We present a novel method for imaging acoustic communication between nocturnal animals. Investigating the spatio-temporal calling behavior of nocturnal animals, e.g., frogs and crickets, has been difficult because of the need to distinguish many animals' calls in noisy environments without being able to see the animals. Our method visualizes the spatial and temporal dynamics using dozens of sound-to-light conversion devices (called "Fireflies") and an off-the-shelf video camera. The Firefly, which consists of a microphone and a light-emitting diode, emits light when it captures nearby sound. Deploying dozens of Fireflies in a target area, we record the calls of multiple individuals through the video camera. We conducted two experiments, one indoors and the other in the field, using Japanese tree frogs (Hyla japonica). The indoor experiment demonstrates that our method correctly visualizes the frogs' calling behavior, confirming the known behavior that two frogs call either synchronously or in anti-phase. The field experiment (in a rice paddy where Japanese tree frogs live) visualizes the same behavior, confirming anti-phase synchronization in the field. These results show that our method can visualize the calling behavior of nocturnal animals in their natural habitat.

  18. Flux control coefficients determined by inhibitor titration: the design and analysis of experiments to minimize errors.

    PubMed Central

    Small, J R

    1993-01-01

    This paper is a study into the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and, under all conditions studied, that the fitting method, even under conditions where the assumptions underlying the fitted function do not hold, outperformed the graph method. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434

  19. Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.

    PubMed

    Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie

    2010-07-01

    Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems to the users. The objective of this study was to evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as a Mental (Internal) or Physical (External) operator. This analysis was performed by two analysts independently, and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. They show that, on average, a user needs to go through 106 steps to complete a task. To perform all 14 tasks, a user would spend about 22 min (independent of system response time) on data entry, of which 11 min are spent on more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically identifies the following findings related to the performance of AHLTA: (1) a large average number of steps to complete common tasks, (2) a high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks.
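
    The KLM portion of such an analysis reduces to summing standard operator times over the step sequence of a task. The sketch below uses commonly cited KLM operator values, not figures from the AHLTA study itself, and the step sequence is hypothetical.

```python
KLM_SECONDS = {
    "K": 0.28,   # keystroke (average typist)
    "P": 1.10,   # point at a target with the mouse
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def execution_time(operator_sequence):
    """operator_sequence: e.g. 'MPK' = think, point, click."""
    return sum(KLM_SECONDS[op] for op in operator_sequence)

task = "MHPKKKKM"   # hypothetical sequence for filling one data-entry field
print(f"estimated execution time: {execution_time(task):.2f} s")
```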

  20. Statistical Analysis of Hit/Miss Data (Preprint)

    DTIC Science & Technology

    2012-07-01

    HDBK-1823A, 2009). Other agencies and industries have also made use of this guidance (Gandossi et al., 2010) and (Drury et al., 2006). It should...better accounting of false call rates such that the POD curve doesn’t converge to 0 for small flaw sizes. The difficulty with conventional methods...2002. Drury, Ghylin, and Holness, Error Analysis and Threat Magnitude for Carry-on Bag Inspection, Proceedings of the Human Factors and Ergonomic

  1. Feasibility of MOS Task Analysis and Redesign to Reduce Physical Demands in the U.S. Army

    DTIC Science & Technology

    1997-12-01

    developed to study perchery workers (Scott & Lamb, 1996). Another posture analysis technique is called postural targeting (Corlett et al., 1979). A...method which had been successfully applied to a variety of situations (Lee & Chiou, 1995; Scott & Lamb, 1996). Some modifications were made in the...Scott, G.B., & Lamb, N.R. (1996). Working practices in a perchery system, using the Ovako Working Posture Analyzing System (OWAS). Applied Ergonomics

  2. A Simple Method for Causal Analysis of Return on IT Investment

    PubMed Central

    Alemi, Farrokh; Zargoush, Manaf; Oakes, James L.; Edrees, Hanan

    2011-01-01

    This paper proposes a method for examining the causal relationship between investment in information technology (IT) and an organization's productivity. In this method, first, a strong relationship among (1) investment in IT, (2) use of IT, and (3) the organization's productivity is verified using correlations. Second, the assumption that IT investment preceded improved productivity is tested using partial correlation. Finally, the assumption of what may have happened in the absence of IT investment, the so-called counterfactual, is tested by forecasting productivity at different levels of investment. The paper applies the proposed method to investment in the Veterans Health Information Systems and Technology Architecture (VISTA) system. Results show that the causal analysis can be done even with limited data. Furthermore, because the procedure relies on the organization's overall productivity, it may be more objective than when the analyst picks and chooses which costs and benefits should be included in the analysis. PMID:23019515
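
    The second step, testing temporal precedence with a partial correlation, can be sketched as follows; the variable names and toy data are illustrative assumptions, not the VISTA dataset.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y with the linear effect of z removed."""
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

rng = np.random.default_rng(1)
investment = rng.normal(size=50)
use = investment + rng.normal(scale=0.5, size=50)      # IT use follows investment
productivity = use + rng.normal(scale=0.5, size=50)    # productivity follows use
print(f"r(investment, productivity | use) = "
      f"{partial_corr(investment, productivity, use):.2f}")
```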

  3. Multiaxial Cyclic Thermoplasticity Analysis with Besseling's Subvolume Method

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1983-01-01

    A modification was formulated to Besseling's Subvolume Method to allow it to use multilinear stress-strain curves which are temperature dependent to perform cyclic thermoplasticity analyses. This method automatically reproduces certain aspects of real material behavior important in the analysis of Aircraft Gas Turbine Engine (AGTE) components, including the Bauschinger effect, cross-hardening, and memory. This constitutive equation was implemented in a finite element computer program called CYANIDE. Subsequently, classical time-dependent plasticity (creep) was added to the program. Since its inception, this program has been assessed against laboratory and component testing and engine experience. The ability of the program to simulate AGTE material response characteristics was verified by this experience, and its utility in providing data for life analyses was demonstrated. In this area of life analysis, the multiaxial thermoplasticity capabilities of the method have proved a match for actual AGTE life experience.
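
    A textbook form of the subvolume (overlay) construction is sketched below: each subvolume is elastic-perfectly-plastic with the shared elastic modulus, and its weight is fixed by the drop in tangent modulus at the corresponding break point of the multilinear curve. This illustrates the idea only and is not the CYANIDE implementation.

```python
def subvolume_weights(E, tangent_moduli):
    """E: elastic modulus; tangent_moduli: slopes E_1 > E_2 > ... of the
    successive post-yield segments of the multilinear stress-strain curve.
    Subvolume k yields at break point k; its weight is the modulus drop / E."""
    moduli = [E] + list(tangent_moduli)
    return [(moduli[k] - moduli[k + 1]) / E for k in range(len(tangent_moduli))]

E = 200e3                                     # MPa, elastic modulus (toy value)
weights = subvolume_weights(E, [50e3, 10e3])
print(weights)                                # [0.75, 0.2]; remaining 0.05 stays elastic
```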

  4. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
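
    As a purely illustrative reading of the idea (the paper's actual spectrum technique may differ), a document's "quality spectrum" could be a normalized keyword-frequency vector over quality characteristics, computed the same way for requirements and design documents and then compared; all keyword lists below are hypothetical.

```python
import re

KEYWORDS = {   # hypothetical keyword lists per quality characteristic
    "security":    ["encrypt", "authenticate", "password", "access control"],
    "performance": ["latency", "throughput", "response time", "load"],
    "usability":   ["user-friendly", "learnability", "screen", "navigation"],
}

def spectrum(text):
    """Normalized frequency of quality-related keywords in a document."""
    text = text.lower()
    counts = {q: sum(len(re.findall(k, text)) for k in kws)
              for q, kws in KEYWORDS.items()}
    total = sum(counts.values()) or 1
    return {q: c / total for q, c in counts.items()}

req_doc = "The system shall encrypt data; response time shall stay under 2 s."
design_doc = "Module A handles load balancing and throughput tuning."
print(spectrum(req_doc))
print(spectrum(design_doc))
```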

  5. A comprehensive assessment of somatic mutation detection in cancer using whole-genome sequencing

    PubMed Central

    Alioto, Tyler S.; Buchhalter, Ivo; Derdak, Sophia; Hutter, Barbara; Eldridge, Matthew D.; Hovig, Eivind; Heisler, Lawrence E.; Beck, Timothy A.; Simpson, Jared T.; Tonon, Laurie; Sertier, Anne-Sophie; Patch, Ann-Marie; Jäger, Natalie; Ginsbach, Philip; Drews, Ruben; Paramasivam, Nagarajan; Kabbe, Rolf; Chotewutmontri, Sasithorn; Diessl, Nicolle; Previti, Christopher; Schmidt, Sabine; Brors, Benedikt; Feuerbach, Lars; Heinold, Michael; Gröbner, Susanne; Korshunov, Andrey; Tarpey, Patrick S.; Butler, Adam P.; Hinton, Jonathan; Jones, David; Menzies, Andrew; Raine, Keiran; Shepherd, Rebecca; Stebbings, Lucy; Teague, Jon W.; Ribeca, Paolo; Giner, Francesc Castro; Beltran, Sergi; Raineri, Emanuele; Dabad, Marc; Heath, Simon C.; Gut, Marta; Denroche, Robert E.; Harding, Nicholas J.; Yamaguchi, Takafumi N.; Fujimoto, Akihiro; Nakagawa, Hidewaki; Quesada, Víctor; Valdés-Mas, Rafael; Nakken, Sigve; Vodák, Daniel; Bower, Lawrence; Lynch, Andrew G.; Anderson, Charlotte L.; Waddell, Nicola; Pearson, John V.; Grimmond, Sean M.; Peto, Myron; Spellman, Paul; He, Minghui; Kandoth, Cyriac; Lee, Semin; Zhang, John; Létourneau, Louis; Ma, Singer; Seth, Sahil; Torrents, David; Xi, Liu; Wheeler, David A.; López-Otín, Carlos; Campo, Elías; Campbell, Peter J.; Boutros, Paul C.; Puente, Xose S.; Gerhard, Daniela S.; Pfister, Stefan M.; McPherson, John D.; Hudson, Thomas J.; Schlesner, Matthias; Lichter, Peter; Eils, Roland; Jones, David T. W.; Gut, Ivo G.

    2015-01-01

    As whole-genome sequencing for cancer genome analysis becomes a clinical tool, a full understanding of the variables affecting sequencing analysis output is required. Here, using tumour-normal sample pairs from two different types of cancer, chronic lymphocytic leukaemia and medulloblastoma, we conduct a benchmarking exercise within the context of the International Cancer Genome Consortium. We compare sequencing methods, analysis pipelines and validation methods. We show that using PCR-free methods and increasing sequencing depth to ∼100× shows benefits, as long as the tumour:control coverage ratio remains balanced. We observe widely varying mutation call rates and low concordance among analysis pipelines, reflecting the artefact-prone nature of the raw data and lack of standards for dealing with the artefacts. However, we show that, using the benchmark mutation set we have created, many issues are in fact easy to remedy and have an immediate positive impact on mutation detection accuracy. PMID:26647970

  6. Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data

    PubMed Central

    Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2015-01-01

    DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984
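
    A much-simplified version of the idea can be sketched as follows: the expected alternate-allele fraction of a read mixes the sample's genotype with the contaminant's population allele frequency. This is our reading of the general approach, not the authors' exact model.

```python
import numpy as np

def genotype_posteriors(n_reads, n_alt, alpha, alt_freq, error=0.01):
    """alpha: contamination fraction; alt_freq: population alt-allele frequency.
    Binomial coefficients are omitted since they cancel in the normalization."""
    post = {}
    for g in (0, 1, 2):                                  # sample's alt-allele count
        p_alt = (1 - alpha) * g / 2 + alpha * alt_freq   # contamination-aware mixture
        p_alt = p_alt * (1 - error) + (1 - p_alt) * error
        like = p_alt**n_alt * (1 - p_alt)**(n_reads - n_alt)
        prior = ((1 - alt_freq)**2, 2 * alt_freq * (1 - alt_freq), alt_freq**2)[g]
        post[g] = like * prior                           # Hardy-Weinberg prior
    z = sum(post.values())
    return {g: p / z for g, p in post.items()}

print(genotype_posteriors(n_reads=30, n_alt=4, alpha=0.10, alt_freq=0.3))
```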

  7. Evolution, Chaos, or Perpetual Motion? A Retrospective Trend Analysis of Secondary Science Curriculum Advocacy, 1955-94.

    ERIC Educational Resources Information Center

    Ponder, Gerald; Kelly, Janet

    1997-01-01

    Analyzed 1,595 articles pertaining to secondary science-education curriculum and instruction published in "The Science Teacher" and "Science Education" between 1955 and 1994. For over four decades, science education has been in continual crisis. Instruction methods have changed little. Calls for reforming secondary science education, improving…

  8. The Symbolic Role of Organizational Message Artifacts in a Communication System Assessment.

    ERIC Educational Resources Information Center

    Meyer, John C.

    This paper calls for the inclusion of narrative, thematic, and metaphor analysis as organizational assessment or communication audit methods and discusses some practical means of integrating these symbolic interpretational devices. The paper begins by defining the notion of symbol as the message content important to the organizational member. It…

  9. A Rhetorical Analysis of the Self in an Organization: The Production and Reception of Discourse in a Bank.

    ERIC Educational Resources Information Center

    Roberts, Joy S.

    1999-01-01

    Describes briefly the author's research (contributing to scholarship on successful language practices in organizations) examining the conflicts, and specifically the discursive methods of solving these conflicts, faced by individuals within an organization as they negotiate competing demands. Offers a new tool (called Bracketing, Ranking, and…

  10. Geostatistics for spatial genetic structures: study of wild populations of perennial ryegrass.

    PubMed

    Monestiez, P; Goulard, M; Charmet, G

    1994-04-01

    Methods based on geostatistics were applied to quantitative traits of agricultural interest measured on a collection of 547 wild populations of perennial ryegrass in France. The mathematical background of these methods, which resembles spatial autocorrelation analysis, is briefly described. When a single variable is studied, the spatial structure analysis is similar to spatial autocorrelation analysis, and a spatial prediction method, called "kriging", gives a filtered map of the spatial pattern over all the sampled area. When complex interactions of agronomic traits with different evaluation sites define a multivariate structure for the spatial analysis, geostatistical methods allow the spatial variations to be broken down into two main spatial structures with ranges of 120 km and 300 km, respectively. The predicted maps that corresponded to each range were interpreted as a result of the isolation-by-distance model and as a consequence of selection by environmental factors. Practical collecting methodology for breeders may be derived from such spatial structures.
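
    A minimal ordinary-kriging sketch with an exponential variogram is shown below; the variogram model and all numbers are toy assumptions, not the parameters fitted in the study.

```python
import numpy as np

def variogram(h, sill=1.0, rng_km=120.0):
    """Exponential variogram; rng_km echoes the ~120 km structure above."""
    return sill * (1.0 - np.exp(-h / rng_km))

def krige(xy, z, x0):
    """Ordinary kriging of z at location x0 from samples at xy."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0                        # unbiasedness (Lagrange multiplier) row/col
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z

xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 150.0], [80.0, 90.0]])  # km
z = np.array([1.2, 0.8, 1.5, 1.0])       # trait value per sampled population
print(krige(xy, z, np.array([50.0, 50.0])))
```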

  11. TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.

    PubMed

    Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han

    2017-03-01

    High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous Bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline, and are difficult to integrate appropriately with one another due to their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, the filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization that allow researchers to build customized analysis pipelines.

  12. Variogram Analysis of Response surfaces (VARS): A New Framework for Global Sensitivity Analysis of Earth and Environmental Systems Models

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Gupta, H. V.

    2015-12-01

    Earth and environmental systems models (EESMs) are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. Complexity and dimensionality are manifested by introducing many different factors in EESMs (i.e., model parameters, forcings, boundary conditions, etc.) to be identified. Sensitivity Analysis (SA) provides an essential means for characterizing the role and importance of such factors in producing the model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to 'variogram analysis', that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are limiting cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
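
    The variogram-of-response idea can be illustrated in one dimension: sample the model along a factor and estimate gamma(h) = 0.5 E[(y(x+h) - y(x))^2] over a range of perturbation scales h. The toy model below is an assumption for illustration, not the STAR-VARS code.

```python
import numpy as np

def model(x):
    return np.sin(6 * x) + 0.3 * x                  # toy one-factor response surface

def response_variogram(xs, ys, lags, tol=0.005):
    gamma = []
    for h in lags:
        sq = [(y1 - y2) ** 2
              for x1, y1 in zip(xs, ys)
              for x2, y2 in zip(xs, ys)
              if abs(abs(x1 - x2) - h) < tol]       # pairs separated by ~h
        gamma.append(0.5 * np.mean(sq) if sq else np.nan)
    return np.array(gamma)

xs = np.linspace(0.0, 1.0, 101)
ys = model(xs)
print(response_variogram(xs, ys, lags=[0.01, 0.05, 0.1, 0.3]))
```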

  13. Projection methods for line radiative transfer in spherical media.

    NASA Astrophysics Data System (ADS)

    Anusha, L. S.; Nagendra, K. N.

    An efficient numerical method called the Preconditioned Bi-Conjugate Gradient (Pre-BiCG) method is presented for the solution of the radiative transfer equation in spherical geometry. A variant of this method, called the Stabilized Preconditioned Bi-Conjugate Gradient (Pre-BiCG-STAB) method, is also presented. These methods are based on projections onto subspaces of the n-dimensional Euclidean space R^n called Krylov subspaces. The methods are shown to be faster in terms of convergence rate than contemporary iterative methods such as Jacobi, Gauss-Seidel, and Successive Over-Relaxation (SOR).
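
    For a generic picture of what a preconditioned BiCGSTAB solve looks like in practice, the SciPy sketch below solves a toy sparse system with an incomplete-LU preconditioner; it is not the radiative transfer operator of the paper.

```python
import numpy as np
from scipy.sparse import csc_matrix, diags
from scipy.sparse.linalg import LinearOperator, bicgstab, spilu

n = 1000
A = csc_matrix(diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n)))  # toy operator
b = np.ones(n)

ilu = spilu(A)                                      # incomplete-LU factorization
M = LinearOperator((n, n), matvec=ilu.solve)        # preconditioner M ~ A^-1

x, info = bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info={info}",
      "residual:", np.linalg.norm(A @ x - b))
```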

  14. My-Forensic-Loci-queries (MyFLq) framework for analysis of forensic STR data generated by massive parallel sequencing.

    PubMed

    Van Neste, Christophe; Vandewoestyne, Mado; Van Criekinge, Wim; Deforce, Dieter; Van Nieuwerburgh, Filip

    2014-03-01

    Forensic scientists are currently investigating how to transition from capillary electrophoresis (CE) to massive parallel sequencing (MPS) for analysis of forensic DNA profiles. MPS offers several advantages over CE, such as virtually unlimited multiplexing of loci, the ability to combine both short tandem repeat (STR) and single nucleotide polymorphism (SNP) loci, small amplicons without constraints of size separation, more discrimination power, deep mixture resolution, and sample multiplexing. We present our bioinformatic framework My-Forensic-Loci-queries (MyFLq) for analysis of MPS forensic data. For allele calling, the framework uses a MySQL reference allele database with automatically determined regions of interest (ROIs), found by a generic maximal flanking algorithm, which makes it possible to use any STR or SNP forensic locus. Python scripts were designed to automatically make allele calls starting from raw MPS data. We also present a method to assess the usefulness and overall performance of a forensic locus with respect to MPS, as well as methods to estimate whether an unknown allele, whose sequence is not present in the MySQL database, is in fact a new allele or a sequencing error. The MyFLq framework was applied to an Illumina MiSeq dataset of a forensic Illumina amplicon library, generated from multilocus STR polymerase chain reaction (PCR) on both single-contributor samples and multiple-person DNA mixtures. Although the multilocus PCR was not yet optimized for MPS in terms of amplicon length or locus selection, most loci gave excellent results, with a high signal-to-noise ratio, correct allele calls, and a low limit of detection for minor DNA contributors in mixed DNA samples. Technically, forensic MPS affords great promise for routine implementation in forensic genomics. The method is also applicable to adjacent disciplines such as molecular autopsy in legal medicine and mitochondrial DNA research.
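
    The flank-based region-of-interest idea can be sketched in a few lines: locate known flanking sequences in a read and report the candidate allele between them. The flank strings below are hypothetical, not actual MyFLq database entries.

```python
def extract_roi(read, left_flank, right_flank):
    """Return the sequence between the two flanks, or None if either is absent."""
    i = read.find(left_flank)
    if i < 0:
        return None
    j = read.find(right_flank, i + len(left_flank))
    if j < 0:
        return None
    return read[i + len(left_flank):j]

read = "ACGTTGCATAGATAGATAGATAGGCCTA"               # toy MPS read
print(extract_roi(read, "ACGTTGC", "GCCTA"))       # -> ATAGATAGATAGATAG
```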

  15. A novel collaborative representation and SCAD based classification method for fibrosis and inflammatory activity analysis of chronic hepatitis C

    NASA Astrophysics Data System (ADS)

    Cai, Jiaxin; Chen, Tingting; Li, Yan; Zhu, Nenghui; Qiu, Xuan

    2018-03-01

    To analyze the fibrosis stage and inflammatory activity grade of chronic hepatitis C, a novel classification method based on collaborative representation (CR) with a smoothly clipped absolute deviation (SCAD) penalty term, called the CR-SCAD classifier, is proposed for pattern recognition. An auto-grading system based on the CR-SCAD classifier is then introduced for the prediction of the fibrosis stage and inflammatory activity grade of chronic hepatitis C. The proposed method has been tested on 123 clinical cases of chronic hepatitis C based on serological indexes. Experimental results show that the proposed method outperforms state-of-the-art baselines for the classification of fibrosis stage and inflammatory activity grade of chronic hepatitis C.
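
    For orientation, a collaborative-representation classifier with a plain ridge (l2) penalty is sketched below; the paper replaces this penalty with the nonconvex SCAD term, which requires an iterative solver, and the data here are toys.

```python
import numpy as np

def cr_classify(X, labels, y, lam=0.1):
    """X: (d, n) training samples as columns; y: (d,) test sample.
    Codes y over all training samples, then assigns the class whose
    coefficients reconstruct y with the smallest residual."""
    n = X.shape[1]
    a = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
    residuals = {}
    for c in set(labels):
        a_c = np.where(np.array(labels) == c, a, 0.0)   # keep class-c coefficients
        residuals[c] = np.linalg.norm(y - X @ a_c)
    return min(residuals, key=residuals.get)

rng = np.random.default_rng(2)
X = np.column_stack([rng.normal(c, 0.3, size=5) for c in (0, 0, 1, 1)])
labels = ["F0-F1", "F0-F1", "F2-F4", "F2-F4"]           # hypothetical fibrosis groups
y = rng.normal(1.0, 0.3, size=5)
print(cr_classify(X, labels, y))
```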

  16. Evaluation of the status of anurans on a refuge in suburban Maryland

    USGS Publications Warehouse

    Brander, S.M.; Royle, J. Andrew; Eames, M.

    2007-01-01

    Because many anurans have well-defined breeding seasons and male anurans produce loud advertisement calls, surveys of these breeding choruses are believed to provide a dependable means of monitoring population trends. The Patuxent Research Refuge initiated such a calling survey in the spring of 1997, which uses volunteers to collect anuran (frog and toad) calling survey data. The primary goal of initiating the calling surveys at the Patuxent Refuge was to obtain baseline information on anuran populations, such as species occurrence, frequency of occurrence, and relative abundance over time. In this paper, we used the calling survey data to develop models for the "proportion of area occupied" by individual anuran species, a method in which analysis is focused on the proportion of sites that are occupied by a species, instead of the number of individuals present in the population. This type of analysis is ideal for use in large-scale monitoring programs focused on species that are difficult to count, such as anurans or birds. We considered models for proportion of area occupied that allow for imperfect detection (that is, a species may be present but go undetected during sampling) by incorporating parameters that describe detection probability and the response of detection probability to various environmental and sampling covariates. Our results indicate that anuran populations on the Patuxent Research Refuge have high rates of occupancy compared to areas nearby and that extinction and colonization rates are stable. The potential uses for "proportion of area occupied" analyses are far-reaching and will allow for more accurate quantification of data and better-informed management decisions for calling surveys on a larger scale.

  18. Automated surveillance of 911 call data for detection of possible water contamination incidents.

    PubMed

    Haas, Adam J; Gibbons, Darcy; Dangel, Chrissy; Allgeier, Steve

    2011-03-30

    Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, methods for detecting illness caused by fast-acting chemicals have not been an emphasis. An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5-year period were studied. During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after the calls were made, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers, and the average alarm contained nine calls. The 911 surveillance system provides timely notification of possible public health events, but it did have limitations. While the alarms contained incident codes and the location of the caller, additional information such as medical status was not available to help validate the cause of an alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event.
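
    A much-simplified space-time scan can be sketched as follows: for each grid cell and trailing time window, compare the observed call count with its expected baseline using a Poisson likelihood ratio. The grid, windows, and flat baseline are toy assumptions, not the deployed system's configuration.

```python
import numpy as np

def poisson_llr(obs, exp, n_total):
    """Kulldorff-style log-likelihood ratio for a high-count cluster."""
    if obs <= exp or exp <= 0:
        return 0.0
    rest_obs, rest_exp = n_total - obs, n_total - exp
    return obs * np.log(obs / exp) + rest_obs * np.log(rest_obs / rest_exp)

rng = np.random.default_rng(3)
counts = rng.poisson(2.0, size=(10, 10, 24))        # calls per (x, y, hour)
baseline = np.full(counts.shape, 2.0)
baseline *= counts.sum() / baseline.sum()           # indirect standardization
n_total = counts.sum()

best = max((poisson_llr(counts[x, y, t:].sum(), baseline[x, y, t:].sum(), n_total),
            (x, y, t))
           for x in range(10) for y in range(10) for t in range(20, 24))
print("most anomalous (LLR, (x, y, start_hour)):", best)
```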

  19. Influence of Installation Effects on Pile Bearing Capacity in Cohesive Soils - Large Deformation Analysis Via Finite Element Method

    NASA Astrophysics Data System (ADS)

    Konkol, Jakub; Bałachowski, Lech

    2017-03-01

    In this paper, the whole process of pile construction and performance during loading is modelled via large-deformation finite element methods such as the Coupled Eulerian Lagrangian (CEL) and Updated Lagrangian (UL) formulations. The numerical study consists of the installation process, a consolidation phase, and the subsequent pile static load test (SLT). The Poznań site is chosen as the reference location for the numerical analysis, where a series of pile SLTs was performed in highly overconsolidated clay (OCR ≈ 12). The results of the numerical analysis are compared with the corresponding field tests and with a so-called "wish-in-place" numerical model of the pile, in which no installation effects are taken into account. The advantages of using large-deformation numerical analysis are presented and its application to pile design is shown.

  20. [Effect of leader-member exchange on nurses' sense of calling in the workplace].

    PubMed

    Zhang, L G; Ma, H L; Wang, Z J; Zhou, Y Y; Jin, T T

    2017-12-20

    Objective: To investigate the effect of leader-member exchange on nurses' sense of calling in the workplace based on self-determination theory. Methods: A total of 381 nurses were randomly selected from five tertiary general hospitals in Zhejiang Province, China, from October to December 2016. They were surveyed using the Leader-Member Exchange Scale, Job Autonomy Scale, Core Self-Evaluation Scale, and Calling Scale. Mediation-testing procedures were applied and the data were subjected to hierarchical regression analysis. Results: Leader-member exchange was positively correlated with job autonomy, core self-evaluation, and sense of calling (r=0.471, P<0.001; r=0.373, P<0.001; r=0.475, P<0.001); leader-member exchange had a positive predictive effect on job autonomy and sense of calling (β=0.47, P<0.001; β=0.48, P<0.001); job autonomy had a partial mediating effect on the relationship between leader-member exchange and sense of calling (F=66.50, P<0.001); core self-evaluation negatively moderated the positive relationship between leader-member exchange and job autonomy (F=27.81, P<0.001). Conclusion: High-quality leader-member exchange enhances the sense of calling by improving staff's job autonomy, and core self-evaluation weakens the positive relationship between leader-member exchange and job autonomy.

  1. Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method

    NASA Astrophysics Data System (ADS)

    De Waal, Sybrand A.

    1996-07-01

    A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959-summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.

  2. The NASA/Industry Design Analysis Methods for Vibrations (DAMVIBS) Program - A government overview. [of rotorcraft technology development using finite element method

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.

    1992-01-01

    An overview is presented of government contributions to the program called Design Analysis Methods for Vibrations (DAMVIBS), which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling program for the CH-47D tandem-rotor helicopter. The DAMVIBS program emphasized four areas: airframe finite-element modeling, difficult-components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details in improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.

  3. A cross-sectional study of the association between mobile phone use and symptoms of ill health

    PubMed Central

    2016-01-01

    Objectives This study analyzed the associations of mobile phone call frequency and duration with non-specific symptoms. Methods This study was conducted with a population group of 532 non-patient adults established by the Korean Genome and Epidemiology Study. The pattern of mobile phone call use was investigated through face-to-face interviews. The structured instruments applied to quantitatively assess health effects were the Headache Impact Test-6 (HIT-6), Psychosocial Well-being Index-Short Form, Beck Depression Inventory, Korean-Instrumental Activities of Daily Living, Perceived Stress Scale (PSS), Pittsburgh Sleep Quality Index, and 12-item Short Form Health Survey, where a higher score represents a greater health effect. Results The average daily phone call frequency showed a significant correlation with the PSS score in female subjects. Increases in the average duration of one phone call were significantly correlated with increases in the severity of headaches in both sexes. The mean (standard deviation) HIT-6 score in the subgroup of subjects whose average duration of one phone call was five minutes or longer was 45.98 (8.15), as compared with 42.48 (7.20) in those whose average duration was <5 minutes. The severity of headaches was divided into three levels according to the HIT-6 score (little or no impact/moderate impact/substantial or severe impact), and a logistic regression analysis was performed to investigate the association between increased phone call duration and headache severity. When the average duration of one phone call was five minutes or longer, the odds ratio (OR) and 95% confidence interval (CI) for the moderate impact group were 2.22 and 1.18 to 4.19, respectively. The OR and 95% CI for the substantial or severe impact group were 4.44 and 2.11 to 8.90, respectively. Conclusions Mobile phone call duration was not significantly associated with stress, sleep, cognitive function, or depression, but was associated with the severity of headaches. PMID:27788568

  4. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
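
    The geometric ingredient, principal angles between subspaces, is available directly in SciPy; the sketch below only illustrates that ingredient with toy matrices and omits the enrichment statistics built on top of it.

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(4)
genes = 100
diff_expr = rng.normal(size=(genes, 2))        # columns span a differential-expression subspace
gene_set = np.zeros((genes, 5))
gene_set[:20] = rng.normal(size=(20, 5))       # hypothetical 20-gene set, 5 conditions

angles = subspace_angles(diff_expr, gene_set)  # principal angles in radians
print(np.rad2deg(angles))
```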

  6. Stress analysis of circular semimonocoque cylinders with cutouts

    NASA Technical Reports Server (NTRS)

    Mccomb, Harvey G., Jr.

    1955-01-01

    A method is presented for analyzing the stresses about cutouts in circular semimonocoque cylinders with flexible rings. The method involves the use of so-called perturbation stress distributions which are superposed on the stress distribution that would exist in the structure with no cutout in such a way as to give the effects of a cutout. The method can be used for any loading case for which the structure without the cutout can be analyzed and is sufficiently versatile to account for stringer and shear reinforcement about the cutout.

  7. A method of distributed avionics data processing based on SVM classifier

    NASA Astrophysics Data System (ADS)

    Guo, Hangyu; Wang, Jinyan; Kang, Minyang; Xu, Guojing

    2018-03-01

    To solve the problem of managing and analyzing the massive heterogeneous data of multi-platform avionics systems in a networked combat environment, this paper proposes a management solution called the avionics "resource cloud", based on big data technology, and designs an aided-decision classifier based on the SVM algorithm. We designed an experiment with an STK simulation; the results show that this method has high accuracy and broad application prospects.
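
    A generic scikit-learn version of such an SVM-based aided-decision classifier is sketched below with toy features; it is not the paper's avionics data or parameter choices.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0.0, 1.0, (50, 4)),   # class 0: one data regime
               rng.normal(2.0, 1.0, (50, 4))])  # class 1: another regime
y = np.array([0] * 50 + [1] * 50)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```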

  8. Single-feature polymorphism discovery in the barley transcriptome

    PubMed Central

    Rostoks, Nils; Borevitz, Justin O; Hedley, Peter E; Russell, Joanne; Mudie, Sharon; Morris, Jenny; Cardle, Linda; Marshall, David F; Waugh, Robbie

    2005-01-01

    A probe-level model for analysis of GeneChip gene-expression data is presented which identified more than 10,000 single-feature polymorphisms (SFP) between two barley genotypes. The method has good sensitivity, as 67% of known single-nucleotide polymorphisms (SNP) were called as SFPs. This method is applicable to all oligonucleotide microarray data, accounts for SNP effects in gene-expression data and represents an efficient and versatile approach for highly parallel marker identification in large genomes. PMID:15960806

  9. Marker-aided genetic divergence analysis in Brassica.

    PubMed

    Arunachalam, V; Verma, Shefali; Sujata, V; Prabhu, K V

    2005-08-01

    Genetic divergence was evaluated in 31 breeding lines from four Brassica species using Mahalanobis' D2. A new method of grouping using D2 values was used to group the 31 lines, based on diagnostic morphological traits (called morphoqts). Isozyme variation of the individual enzymes esterase and glutamate oxaloacetate was quantified by five parameters (called isoqts) developed earlier. Grouping by the same method was also done based on the isoqts, and the grouping by isozymes was compared with that by morphoqts. Overall, there was an agreement of 73%, suggesting that isoqts can be used in the choice of parents and also in first-stage selection of segregants in the laboratory. It was suggested that such an exercise would help to take care of season-bound and field-related problems of breeding. The new isozyme QTs, within-lane variance of relative mobility and relative absorption, accounted for about 50% of the total divergence. The utility of the new method and isoqts in cost-effective breeding was highlighted.
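
    Computing Mahalanobis' D2 between two lines from trait means and a pooled covariance matrix can be sketched as follows (toy trait values, not the study's data):

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(6)
line_a = rng.normal([10.0, 5.0, 2.0], 0.5, size=(8, 3))   # 8 plants x 3 traits
line_b = rng.normal([11.0, 4.5, 2.5], 0.5, size=(8, 3))

pooled_cov = (np.cov(line_a.T) + np.cov(line_b.T)) / 2    # pooled within-line covariance
VI = np.linalg.inv(pooled_cov)
d = mahalanobis(line_a.mean(axis=0), line_b.mean(axis=0), VI)
print("D^2 =", d ** 2)                                    # squared distance used for grouping
```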

  10. Q-nexus: a comprehensive and efficient analysis pipeline designed for ChIP-nexus.

    PubMed

    Hansen, Peter; Hecht, Jochen; Ibn-Salem, Jonas; Menkuec, Benjamin S; Roskosch, Sebastian; Truss, Matthias; Robinson, Peter N

    2016-11-04

    ChIP-nexus, an extension of the ChIP-exo protocol, can be used to map the borders of protein-bound DNA sequences at nucleotide resolution, requires less input DNA, and enables selective PCR duplicate removal using random barcodes. However, the use of random barcodes requires additional preprocessing of the mapping data, which complicates the computational analysis. To date, only a very limited number of software packages are available for the analysis of ChIP-exo data, and these have not yet been systematically tested and compared on ChIP-nexus data. Here, we present a comprehensive software package for ChIP-nexus data that exploits the random barcodes for selective removal of PCR duplicates and for quality control. Furthermore, we developed bespoke methods to estimate the width of the protected region resulting from protein-DNA binding and to infer binding positions from ChIP-nexus data. Finally, we applied our peak calling method, as well as the two other methods MACE and MACS2, to the available ChIP-nexus data. The Q-nexus software is efficient and easy to use. Novel statistics on duplication rates that take random barcodes into account are calculated. Our method for estimating the width of the protected region yields unbiased signatures that are highly reproducible for biological replicates and at the same time very specific for the respective factors analyzed. As judged by the irreproducible discovery rate (IDR), our peak calling algorithm shows substantially better reproducibility. An implementation of Q-nexus is available at http://charite.github.io/Q/.

  11. Systematic text condensation: a strategy for qualitative analysis.

    PubMed

    Malterud, Kirsti

    2012-12-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.

  13. Design of RNA splicing analysis null models for post hoc filtering of Drosophila head RNA-Seq data with the splicing analysis kit (Spanki).

    PubMed

    Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian

    2013-11-09

    The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.

  14. Structural dynamic analysis of the Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Scott, L. P.; Jamison, G. T.; Mccutcheon, W. A.; Price, J. M.

    1981-01-01

    This structural dynamic analysis supports development of the SSME by evaluating components subjected to critical dynamic loads, identifying significant parameters, and evaluating solution methods. Engine operating parameters at both rated and full power levels are considered. Detailed structural dynamic analyses of operationally critical and life-limited components support the assessment of engine design modifications and environmental changes. Engine system test results are used to verify analytic model simulations. The SSME main chamber injector assembly comprises 600 injector elements called LOX posts; the overall LOX post analysis procedure is outlined.

  15. Guided SAR image despeckling with probabilistic non local weights

    NASA Astrophysics Data System (ADS)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by a granular disturbance called speckle, which makes visual analysis and detail extraction difficult. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, achieve significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  16. PyMICE: A Python library for analysis of IntelliCage data.

    PubMed

    Dzik, Jakub M; Puścian, Alicja; Mijakowska, Zofia; Radwanska, Kasia; Łęski, Szymon

    2018-04-01

    IntelliCage is an automated system for recording the behavior of a group of mice housed together. It produces rich, detailed behavioral data calling for new methods and software for their analysis. Here we present PyMICE, a free and open-source library for analysis of IntelliCage data in the Python programming language. We describe the design and demonstrate the use of the library through a series of examples. PyMICE provides easy and intuitive access to IntelliCage data, and thus facilitates the use of numerous other Python scientific libraries to form a complete data analysis workflow.

  17. Long-time asymptotic analysis of the Korteweg-de Vries equation via the dbar steepest descent method: the soliton region

    NASA Astrophysics Data System (ADS)

    Giavedoni, Pietro

    2017-03-01

    We address the problem of long-time asymptotics for the solutions of the Korteweg-de Vries equation under low regularity assumptions. We consider decaying initial data admitting only a finite number of moments. For the so-called ‘soliton region’, an improved asymptotic estimate is provided, in comparison with the one in Grunert and Teschl (2009 Math. Phys. Anal. Geom. 12 287-324). Our analysis is based on the dbar steepest descent method proposed by Miller and McLaughlin. Dedicated to Dora, Paolo and Sanja, with deep gratitude for their love and support.
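
    For reference, the equation under study in one common normalization (sign conventions differ between references), together with the one-soliton solutions that populate the soliton region analyzed in the paper:

        \[
          u_t - 6\,u\,u_x + u_{xxx} = 0,
          \qquad
          u(x,t) = -2\kappa^{2}\operatorname{sech}^{2}\!\bigl(\kappa\,(x - 4\kappa^{2}t - x_{0})\bigr).
        \]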

  18. Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2016-01-01

    In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.

  19. A method to estimate weight and dimensions of small aircraft propulsion gas turbine engines: User's guide

    NASA Technical Reports Server (NTRS)

    Hale, P. L.

    1982-01-01

    The weight and major envelope dimensions of small aircraft propulsion gas turbine engines are estimated. The computerized method, called WATE-S (Weight Analysis of Turbine Engines-Small), is a derivative of the WATE-2 computer code. WATE-S determines the weight of each major component in the engine, including compressors, burners, turbines, heat exchangers, nozzles, propellers, and accessories. A preliminary design approach is used in which the stress levels, maximum pressures and temperatures, material properties, geometry, stage loading, hub/tip radius ratio, and mechanical overspeed are used to determine the component weights and dimensions. The accuracy of the method is generally better than ±10 percent, as verified by analysis of four small aircraft propulsion gas turbine engines.

  20. Evidence based practice in traditional & complementary medicine: An agenda for policy, practice, education and research.

    PubMed

    Leach, Matthew J; Canaway, Rachel; Hunter, Jennifer

    2018-05-01

    To develop a policy, practice, education and research agenda for evidence-based practice (EBP) in traditional and complementary medicine (T&CM). The study was a secondary analysis of qualitative data, using the method of roundtable discussion. The sample comprised seventeen experts in EBP and T&CM. The discussion was audio-recorded, and the transcript analysed using thematic analysis. Four central themes emerged from the data; understanding evidence and EBP, drivers of change, interpersonal interaction, and moving forward. Captured within these themes were fifteen sub-themes. These themes/sub-themes translated into three broad calls to action: (1) defining terminology, (2) defining the EBP approach, and (3) fostering social movement. These calls to action formed the framework of the agenda. This analysis presents a potential framework for an agenda to improve EBP implementation in T&CM. The fundamental elements of this action plan seek clarification, leadership and unification on the issue of EBP in T&CM. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Comparing sequencing assays and human-machine analyses in actionable genomics for glioblastoma

    PubMed Central

    Wrzeszczynski, Kazimierz O.; Frank, Mayu O.; Koyama, Takahiko; Rhrissorrakrai, Kahn; Robine, Nicolas; Utro, Filippo; Emde, Anne-Katrin; Chen, Bo-Juen; Arora, Kanika; Shah, Minita; Vacic, Vladimir; Norel, Raquel; Bilal, Erhan; Bergmann, Ewa A.; Moore Vogel, Julia L.; Bruce, Jeffrey N.; Lassman, Andrew B.; Canoll, Peter; Grommes, Christian; Harvey, Steve; Parida, Laxmi; Michelini, Vanessa V.; Zody, Michael C.; Jobanputra, Vaidehi; Royyuru, Ajay K.

    2017-01-01

    Objective: To analyze a glioblastoma tumor specimen with 3 different platforms and compare potentially actionable calls from each. Methods: Tumor DNA was analyzed by a commercial targeted panel. In addition, tumor-normal DNA was analyzed by whole-genome sequencing (WGS) and tumor RNA was analyzed by RNA sequencing (RNA-seq). The WGS and RNA-seq data were analyzed by a team of bioinformaticians and cancer oncologists, and separately by IBM Watson Genomic Analytics (WGA), an automated system for prioritizing somatic variants and identifying drugs. Results: More variants were identified by WGS/RNA analysis than by targeted panels. WGA completed a comparable analysis in a fraction of the time required by the human analysts. Conclusions: The development of an effective human-machine interface in the analysis of deep cancer genomic datasets may provide potentially clinically actionable calls for individual patients in a more timely and efficient manner than currently possible. ClinicalTrials.gov identifier: NCT02725684. PMID:28740869

  2. CSM research: Methods and application studies

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    1989-01-01

    Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.

  3. Specific expression of novel long non-coding RNAs in high-hyperdiploid childhood acute lymphoblastic leukemia

    PubMed Central

    Drouin, Simon; Caron, Maxime; St-Onge, Pascal; Gioia, Romain; Richer, Chantal; Oualkacha, Karim; Droit, Arnaud; Sinnett, Daniel

    2017-01-01

    Pre-B cell childhood acute lymphoblastic leukemia (pre-B cALL) is a heterogeneous disease involving many subtypes typically stratified using a combination of cytogenetic and molecular-based assays. These methods, although widely used, rely on the presence of known chromosomal translocations, which is a limiting factor. There is therefore a need for robust, sensitive, and specific molecular biomarkers unaffected by such limitations that would allow better risk stratification and consequently better clinical outcomes. In this study we performed a transcriptome analysis of 56 pre-B cALL patients to identify expression signatures in different subtypes. In both protein-coding and long non-coding RNAs (lncRNA), we identified subtype-specific gene signatures distinguishing pre-B cALL subtypes, particularly in t(12;21) and hyperdiploid cases. The genes up-regulated in pre-B cALL subtypes were enriched in bivalent chromatin marks in their promoters. LncRNAs are a new and under-studied class of transcripts. The subtype-specific nature of lncRNAs suggests they may be suitable clinical biomarkers to guide risk stratification and targeted therapies in pre-B cALL patients. PMID:28346506

  4. Selective object encryption for privacy protection

    NASA Astrophysics Data System (ADS)

    Zhou, Yicong; Panetta, Karen; Cherukuri, Ravindranath; Agaian, Sos

    2009-05-01

    This paper introduces a new recursive sequence called the truncated P-Fibonacci sequence, its corresponding binary code called the truncated Fibonacci p-code, and a new bit-plane decomposition method using the truncated Fibonacci p-code. In addition, a new lossless image encryption algorithm is presented that can encrypt a selected object using this new decomposition method for privacy protection. The user has the flexibility (1) to define the object to be protected as an object in an image or in a specific part of the image, a selected region of an image, or an entire image, (2) to utilize any new or existing method for edge detection or segmentation to extract the selected object from an image or a specific part/region of the image, and (3) to select any new or existing method for the shuffling process. The algorithm can be used in many different areas such as wireless networking, mobile phone services, and applications in homeland security and medical imaging. Simulation results and analysis verify that the algorithm shows good performance in object/image encryption and can withstand plaintext attacks.
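
    The abstract does not state the recurrence it builds on, but Fibonacci p-sequences are commonly defined by F_p(n) = F_p(n-1) + F_p(n-p-1), with the first p+1 terms equal to 1. The Python sketch below generates such a sequence under that assumption; the authors' truncation and bit-plane decomposition are specific to the paper and not reproduced here.

        # Fibonacci p-sequence under one common definition; the paper's
        # *truncated* variant is not reproduced here.
        def fibonacci_p(n_terms, p):
            seq = []
            for n in range(n_terms):
                if n <= p:
                    seq.append(1)  # the first p+1 terms are 1
                else:
                    seq.append(seq[n - 1] + seq[n - p - 1])
            return seq

        print(fibonacci_p(10, 1))  # p=1 recovers the classical Fibonacci numbers:
                                   # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]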

  5. Model-Based and Model-Free Pavlovian Reward Learning: Revaluation, Revision and Revelation

    PubMed Central

    Dayan, Peter; Berridge, Kent C.

    2014-01-01

    Evidence supports at least two methods for learning about reward and punishment and making predictions for guiding actions. One method, called model-free, progressively acquires cached estimates of the long-run values of circumstances and actions from retrospective experience. The other method, called model-based, uses representations of the environment, expectations and prospective calculations to make cognitive predictions of future value. Extensive attention has been paid to both methods in computational analyses of instrumental learning. By contrast, although a full computational analysis has been lacking, Pavlovian learning and prediction has typically been presumed to be solely model-free. Here, we revise that presumption and review compelling evidence from Pavlovian revaluation experiments showing that Pavlovian predictions can involve their own form of model-based evaluation. In model-based Pavlovian evaluation, prevailing states of the body and brain influence value computations, and thereby produce powerful incentive motivations that can sometimes be quite new. We consider the consequences of this revised Pavlovian view for the computational landscape of prediction, response and choice. We also revisit differences between Pavlovian and instrumental learning in the control of incentive motivation. PMID:24647659

  6. Model-based and model-free Pavlovian reward learning: revaluation, revision, and revelation.

    PubMed

    Dayan, Peter; Berridge, Kent C

    2014-06-01

    Evidence supports at least two methods for learning about reward and punishment and making predictions for guiding actions. One method, called model-free, progressively acquires cached estimates of the long-run values of circumstances and actions from retrospective experience. The other method, called model-based, uses representations of the environment, expectations, and prospective calculations to make cognitive predictions of future value. Extensive attention has been paid to both methods in computational analyses of instrumental learning. By contrast, although a full computational analysis has been lacking, Pavlovian learning and prediction has typically been presumed to be solely model-free. Here, we revise that presumption and review compelling evidence from Pavlovian revaluation experiments showing that Pavlovian predictions can involve their own form of model-based evaluation. In model-based Pavlovian evaluation, prevailing states of the body and brain influence value computations, and thereby produce powerful incentive motivations that can sometimes be quite new. We consider the consequences of this revised Pavlovian view for the computational landscape of prediction, response, and choice. We also revisit differences between Pavlovian and instrumental learning in the control of incentive motivation.
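
    As a concrete anchor for the model-free half of this distinction, the sketch below implements the textbook temporal-difference rule that incrementally caches long-run value estimates from experience. The parameters are illustrative, and the model-based computations discussed in the paper are deliberately absent.

        # TD(0): cached values updated from retrospective experience only.
        def td_update(V, state, reward, next_state, alpha=0.1, gamma=0.9):
            """One temporal-difference update of the cached value of `state`."""
            delta = reward + gamma * V.get(next_state, 0.0) - V.get(state, 0.0)
            V[state] = V.get(state, 0.0) + alpha * delta
            return V

        V = {}
        for _ in range(100):                  # repeated pairings: cue -> outcome -> reward
            td_update(V, "cue", 0.0, "outcome")
            td_update(V, "outcome", 1.0, "end")
        print(V["cue"])                       # approaches gamma * 1.0 = 0.9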

  7. Non-Gaussian Distributions Affect Identification of Expression Patterns, Functional Annotation, and Prospective Classification in Human Cancer Genomes

    PubMed Central

    Marko, Nicholas F.; Weil, Robert J.

    2012-01-01

    Introduction Gene expression data is often assumed to be normally-distributed, but this assumption has not been tested rigorously. We investigate the distribution of expression data in human cancer genomes and study the implications of deviations from the normal distribution for translational molecular oncology research. Methods We conducted a central moments analysis of five cancer genomes and performed empiric distribution fitting to examine the true distribution of expression data both on the complete-experiment and on the individual-gene levels. We used a variety of parametric and nonparametric methods to test the effects of deviations from normality on gene calling, functional annotation, and prospective molecular classification using a sixth cancer genome. Results Central moments analyses reveal statistically-significant deviations from normality in all of the analyzed cancer genomes. We observe as much as 37% variability in gene calling, 39% variability in functional annotation, and 30% variability in prospective, molecular tumor subclassification associated with this effect. Conclusions Cancer gene expression profiles are not normally-distributed, either on the complete-experiment or on the individual-gene level. Instead, they exhibit complex, heavy-tailed distributions characterized by statistically-significant skewness and kurtosis. The non-Gaussian distribution of this data affects identification of differentially-expressed genes, functional annotation, and prospective molecular classification. These effects may be reduced in some circumstances, although not completely eliminated, by using nonparametric analytics. This analysis highlights two unreliable assumptions of translational cancer gene expression analysis: that “small” departures from normality in the expression data distributions are analytically-insignificant and that “robust” gene-calling algorithms can fully compensate for these effects. PMID:23118863
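
    A minimal sketch of a central-moments normality check in the spirit of the analysis described, using synthetic heavy-tailed data in place of real expression values (the data and the use of scipy are assumptions for illustration):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        expression = rng.lognormal(mean=0.0, sigma=1.0, size=5000)  # heavy-tailed toy data

        print("skewness:", stats.skew(expression))             # 0 for a Gaussian
        print("excess kurtosis:", stats.kurtosis(expression))  # 0 for a Gaussian
        # Both deviate strongly from 0 here; stats.normaltest combines these
        # two moments into a formal test of normality.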

  8. The Role of Qualitative Approaches to Research in CALL Contexts: Closing in on the Learner's Experience

    ERIC Educational Resources Information Center

    Levy, Mike

    2015-01-01

    The article considers the role of qualitative research methods in CALL through describing a series of examples. These examples are used to highlight the importance and value of qualitative data in relation to a specific research objective in CALL. The use of qualitative methods in conjunction with other approaches as in mixed method research…

  9. Acoustical Applications of the HHT Method

    NASA Technical Reports Server (NTRS)

    Huang, Norden E.

    2003-01-01

    A document discusses applications of a method based on the Huang-Hilbert transform (HHT). The method was described, without the HHT name, in Analyzing Time Series Using EMD and Hilbert Spectra (GSC-13817), NASA Tech Briefs, Vol. 24, No. 10 (October 2000), page 63. To recapitulate: The method is especially suitable for analyzing time-series data that represent nonstationary and nonlinear physical phenomena. The method involves the empirical mode decomposition (EMD), in which a complicated signal is decomposed into a finite number of functions, called intrinsic mode functions (IMFs), that admit well-behaved Hilbert transforms. The HHT consists of the combination of EMD and Hilbert spectral analysis.
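
    Once EMD has produced an IMF, the Hilbert spectral step assigns it an instantaneous amplitude and frequency through the analytic signal. Below is a minimal sketch with a synthetic chirp standing in for an IMF; the EMD step itself (not shown) would come from a separate implementation.

        import numpy as np
        from scipy.signal import hilbert

        fs = 1000.0                                      # sampling rate, Hz
        t = np.arange(0, 1, 1 / fs)
        imf = np.cos(2 * np.pi * (50 * t + 10 * t**2))   # chirp standing in for an IMF

        analytic = hilbert(imf)                          # x(t) + i * H[x](t)
        amplitude = np.abs(analytic)                     # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) / (2 * np.pi) * fs    # instantaneous frequency, Hz
        # inst_freq sweeps from about 50 Hz to about 70 Hz across the record.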

  10. Cascading disaster models in postburn flash flood

    Treesearch

    Fred May

    2007-01-01

    A useful method of modeling threats from hazards and documenting their disaster causation sequences is called “cascading threat modeling.” This type of modeling enables emergency planners to address hazard and risk assessments systematically. This paper describes a cascading threat modeling and analysis process. Wildfire and an associated postburn flash flood disaster...

  11. Prioritizing preferable locations for increasing urban tree canopy in New York City

    Treesearch

    Dexter Locke; J. Morgan Grove; Jacqueline W.T. Lu; Austin Troy; Jarlath P.M. O' Neil-Dunne; Brian Beck

    2010-01-01

    This paper presents a set of Geographic Information System (GIS) methods for identifying and prioritizing tree planting sites in urban environments. It uses an analytical approach created by a University of Vermont service-learning class called "GIS Analysis of New York City's Ecology" that was designed to provide research support to the MillionTreesNYC...

  12. A Comparative Analysis of Student Learning with a Collaborative Computer Simulation of the Cardiopulmonary System

    ERIC Educational Resources Information Center

    Keyser, Diane

    2010-01-01

    To design a series of assessments that could be used to compare the learning gains of high school students studying the cardiopulmonary system using traditional methods to those who used a collaborative computer simulation, called "Mr. Vetro". Five teachers and 264 HS biology students participated in the study. The students were in…

  13. Finite Set Control Transcription for Optimal Control Applications

    DTIC Science & Technology

    2009-05-01

    ... a Nonlinear Programming (NLP) algorithm, such as SNOPT (hereafter called the optimizer). The Finite Set Control Transcription (FSCT) method is essentially a ... artificial neural networks, genetic algorithms, or combinations thereof for analysis. Indeed, an actual biological neural network is an example of

  14. Rapid differentiation of citrus Hop stunt viroid variants by use of real-time RT-PCR and high resolution melting analysis

    USDA-ARS?s Scientific Manuscript database

    The RNA genome of Hop stunt viroid (HSVd) contains five to six nucleotides in a variable (V) domain, called the cachexia expression motif, which is associated with pathogenic and non-pathogenic variants in citrus. Current methods to differentiate HSVd variants rely on lengthy greenhouse biological i...

  15. Partial Variance of Increments Method in Solar Wind Observations and Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Greco, A.; Matthaeus, W. H.; Perri, S.; Osman, K. T.; Servidio, S.; Wan, M.; Dmitruk, P.

    2018-02-01

    The method called "PVI" (Partial Variance of Increments) has been increasingly used in analysis of spacecraft and numerical simulation data since its inception in 2008. The purpose of the method is to study the kinematics and formation of coherent structures in space plasmas, a topic that has gained considerable attention, leading the development of identification methods, observations, and associated theoretical research based on numerical simulations. This review paper will summarize key features of the method and provide a synopsis of the main results obtained by various groups using the method. This will enable new users or those considering methods of this type to find details and background collected in one place.

  16. What do consumers want to know about antibiotics? Analysis of a medicines call centre database.

    PubMed

    Hawke, Kate L; McGuire, Treasure M; Ranmuthugala, Geetha; van Driel, Mieke L

    2016-02-01

    Australia is one of the highest users of antibiotics in the developed world. This study aimed to identify consumer antibiotic information needs to improve the targeting of medicines information. We conducted a retrospective, mixed-method study of consumers' antibiotic-related calls to Australia's National Prescribing Service (NPS) Medicines Line from September 2002 to June 2010. Demographic and question data were analysed, and the most common enquiry type in each age group was explored for key narrative themes. Relative antibiotic call frequencies were determined by comparing the number of calls with antibiotic utilization in Australian Statistics on Medicines (ASM) data. Between 2002 and 2010, consumers made 8696 antibiotic calls to Medicines Line. The most common reason was questions about the role of their medicine (22.4%). Patient age groups differed in enquiry pattern, with more questions about lactation in the 0- to 4-year age group (33.6%), administration (5-14 years: 32.4%), interactions (15-24 years: 33.4% and 25-54 years: 23.3%) and the role of the medicine (55 years and over: 26.6%). Key themes were identified for each age group. Relative to use in the community, the antibiotics most likely to attract consumer calls were ciprofloxacin (18.0 calls/100,000 ASM prescriptions) and metronidazole (12.9 calls/100,000 ASM prescriptions), with higher call rates than the most commonly prescribed antibiotic, amoxicillin (3.9 calls/100,000 ASM prescriptions). Consumers' knowledge gaps and concerns about antibiotics vary with age, and certain antibiotics generate greater concern relative to their usage. Clinicians should target medicines information to proactively address consumer concerns. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Risk factors for computer visual syndrome (CVS) among operators of two call centers in São Paulo, Brazil.

    PubMed

    Sa, Eduardo Costa; Ferreira Junior, Mario; Rocha, Lys Esther

    2012-01-01

    The aims of this study were to investigate work conditions, estimate the prevalence, and describe risk factors associated with Computer Vision Syndrome among the operators of two call centers in São Paulo (n = 476). The methods included a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews and questionnaires. The case definition was the presence of one or more specific ocular symptoms reported as always, often or sometimes. The multiple logistic regression model was created using the stepwise forward likelihood method, retaining variables with significance levels below 5% (p < 0.05). The operators were mainly female and young (15 to 24 years old). The call center operated 24 hours a day, and the operators worked 36 hours per week with break times of 21 to 35 minutes per day. The symptoms reported were eye fatigue (73.9%), "weight" in the eyes (68.2%), "burning" eyes (54.6%), tearing (43.9%) and weakening of vision (43.5%). The prevalence of Computer Vision Syndrome was 54.6%. The associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7) and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). The organizational and psychosocial factors at work should be included in prevention programs for visual syndrome among call center operators.

  18. Effects of weather conditions on emergency ambulance calls for acute coronary syndromes

    NASA Astrophysics Data System (ADS)

    Vencloviene, Jone; Babarskiene, Ruta; Dobozinskas, Paulius; Siurkaite, Viktorija

    2015-08-01

    The aim of this study was to evaluate the relationship between weather conditions and daily emergency ambulance calls for acute coronary syndromes (ACS). The study included data on 3631 patients who called the ambulance for chest pain and were admitted to the department of cardiology as patients with ACS. We investigated the effect of daily air temperature ( T), barometric pressure (BP), relative humidity, and wind speed (WS) to detect the risk areas for low and high daily volume (DV) of emergency calls. We used the classification and regression tree method as well as cluster analysis. The clusters were created by applying the k-means cluster algorithm using the standardized daily weather variables. The analysis was performed separately during cold (October-April) and warm (May-September) seasons. During the cold period, the greatest DV was observed on days of low T during the 3-day sequence, on cold and windy days, and on days of low BP and high WS during the 3-day sequence; low DV was associated with high BP and decreased WS on the previous day. During June-September, a lower DV was associated with low BP, windless days, and high BP and low WS during the 3-day sequence. During the warm period, the greatest DV was associated with increased BP and changing WS during the 3-day sequence. These results suggest that daily T, BP, and WS on the day of the ambulance call and on the two previous days may be prognostic variables for the risk of ACS.

  19. Analysis of the transient behavior of rubbing components

    NASA Technical Reports Server (NTRS)

    Quezdou, M. B.; Mullen, R. L.

    1986-01-01

    Finite element equations are developed for studying the deformations and temperatures resulting from frictional heating in sliding systems. The formulation is done for linear steady-state motion in two dimensions. The equations include the effect of velocity on the moving components, which gives rise to spurious oscillations in Galerkin finite element solutions. A method called the streamline upwind scheme is used to deal with this deficiency. The finite element program is then used to investigate frictional heating in a gas path seal.

  20. Exploratory analysis of real personal emergency response call conversations: considerations for personal emergency response spoken dialogue systems.

    PubMed

    Young, Victoria; Rochon, Elizabeth; Mihailidis, Alex

    2016-11-14

    The purpose of this study was to derive data from real, recorded, personal emergency response call conversations to help improve the artificial intelligence and decision making capability of a spoken dialogue system in a smart personal emergency response system. The main study objectives were to: develop a model of personal emergency response; determine categories for the model's features; identify and calculate measures from call conversations (verbal ability, conversational structure, timing); and examine conversational patterns and relationships between measures and model features applicable for improving the system's ability to automatically identify call model categories and predict a target response. This study was exploratory and used mixed methods. Personal emergency response calls were pre-classified according to call model categories identified qualitatively from response call transcripts. The relationships between six verbal ability measures, three conversational structure measures, two timing measures and three independent factors: caller type, risk level, and speaker type, were examined statistically. Emergency medical response services were the preferred response for the majority of medium and high risk calls for both caller types. Older adult callers mainly requested non-emergency medical service responders during medium risk situations. By measuring the number of spoken words-per-minute and turn-length-in-words for the first spoken utterance of a call, older adult and care provider callers could be identified with moderate accuracy. Average call taker response time was calculated using the number-of-speaker-turns and time-in-seconds measures. Care providers and older adults used different conversational strategies when responding to call takers. The words 'ambulance' and 'paramedic' may hold different latent connotations for different callers. The data derived from the real personal emergency response recordings may help a spoken dialogue system classify incoming calls by caller type with moderate probability shortly after the initial caller utterance. Knowing the caller type, the target response for the call may be predicted with some degree of probability and the output dialogue could be tailored to this caller type. The average call taker response time measured from real calls may be used to limit the conversation length in a spoken dialogue system before defaulting to a live call taker.

  1. Generation of a monodispersed aerosol

    NASA Technical Reports Server (NTRS)

    Schenck, H.; Mikasa, M.; Devicariis, R.

    1974-01-01

    The identity and laboratory test methods for the generation of a monodispersed aerosol are reported, subject to the following constraints and parameters: (1) size distribution; (2) specific gravity; (3) scattering properties; (4) costs; (5) production. The procedure called for the collection of information from the literature, commercially available products, and experts working in the field. The following topics were investigated: (1) aerosols; (2) air pollution -- analysis; (3) atomizers; (4) dispersion; (5) particles -- optics, size analysis; (6) smoke -- generators, density measurements; (7) sprays; (8) wind tunnels -- visualization.

  2. MEDIAN-BASED INCREMENTAL COST-EFFECTIVENESS RATIOS WITH CENSORED DATA

    PubMed Central

    Bang, Heejung; Zhao, Hongwei

    2016-01-01

    Cost-effectiveness is an essential part of treatment evaluation, in addition to effectiveness. In the cost-effectiveness analysis, a measure called the incremental cost-effectiveness ratio (ICER) is widely utilized, and the mean cost and the mean (quality-adjusted) life years have served as norms to summarize cost and effectiveness for a study population. Recently, the median-based ICER was proposed for complementary or sensitivity analysis purposes. In this paper, we extend this method when some data are censored. PMID:26010599
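
    For reference, the conventional mean-based ICER that the median-based variant modifies (the paper replaces the means with medians and accounts for censoring):

        \[
          \mathrm{ICER} = \frac{\bar{C}_{1} - \bar{C}_{0}}{\bar{E}_{1} - \bar{E}_{0}},
        \]

    where \bar{C} and \bar{E} are the mean cost and mean (quality-adjusted) effectiveness of the new treatment (subscript 1) and the comparator (subscript 0).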

  3. Methods for Multiplex Template Sampling in Digital PCR Assays

    PubMed Central

    Petriv, Oleh I.; Heyries, Kevin A.; VanInsberghe, Michael; Walker, David; Hansen, Carl L.

    2014-01-01

    The efficient use of digital PCR (dPCR) for precision copy number analysis requires high concentrations of target molecules that may be difficult or impossible to obtain from clinical samples. To solve this problem we present a strategy, called Multiplex Template Sampling (MTS), that effectively increases template concentrations by detecting multiple regions of fragmented target molecules. Three alternative assay approaches are presented for implementing MTS analysis of chromosome 21, providing a 10-fold concentration enhancement while preserving assay precision. PMID:24854517

  4. Methods for multiplex template sampling in digital PCR assays.

    PubMed

    Petriv, Oleh I; Heyries, Kevin A; VanInsberghe, Michael; Walker, David; Hansen, Carl L

    2014-01-01

    The efficient use of digital PCR (dPCR) for precision copy number analysis requires high concentrations of target molecules that may be difficult or impossible to obtain from clinical samples. To solve this problem we present a strategy, called Multiplex Template Sampling (MTS), that effectively increases template concentrations by detecting multiple regions of fragmented target molecules. Three alternative assay approaches are presented for implementing MTS analysis of chromosome 21, providing a 10-fold concentration enhancement while preserving assay precision.
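
    The quantification arithmetic underlying dPCR is the standard Poisson correction, independent of the MTS strategy itself; a minimal sketch with toy partition counts:

        # Standard dPCR Poisson correction: if a fraction p of partitions is
        # positive, the mean template copies per partition is -ln(1 - p).
        import math

        def copies_per_partition(n_positive, n_total):
            p = n_positive / n_total
            return -math.log(1.0 - p)

        lam = copies_per_partition(6000, 20000)      # toy numbers
        print(f"{lam:.4f} copies/partition, ~{lam * 20000:.0f} copies loaded")
        # Detecting k independent regions per fragmented target (the MTS idea)
        # raises the countable-event concentration roughly k-fold.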

  5. A Graduate Management Project to Improve the Supplemental Care System at Walter Reed Army Medical Center

    DTIC Science & Technology

    1990-07-20

    certain specified parameters. A method of establishing parameters for exceptions to the routine is called the ABC analysis of inventory control...Under ABC analysis, the A group is the 20% of line items which are most expensive per item, the B group is the next 20%, and the C group is the...part of a rational, orderly system which delivers the health care outputs required by the users, but allows the system managers to maintain control

  6. Energy landscape analysis of neuroimaging data

    NASA Astrophysics Data System (ADS)

    Ezaki, Takahiro; Watanabe, Takamitsu; Ohzeki, Masayuki; Masuda, Naoki

    2017-05-01

    Computational neuroscience models have been used for understanding neural dynamics in the brain and how they may be altered when physiological or other conditions change. We review and develop a data-driven approach to neuroimaging data called the energy landscape analysis. The methods are rooted in statistical physics theory, in particular the Ising model, also known as the (pairwise) maximum entropy model and Boltzmann machine. The methods have been applied to fitting electrophysiological data in neuroscience for a decade, but their use in neuroimaging data is still in its infancy. We first review the methods and discuss some algorithms and technical aspects. Then, we apply the methods to functional magnetic resonance imaging data recorded from healthy individuals to inspect the relationship between the accuracy of fitting, the size of the brain system to be analysed and the data length. This article is part of the themed issue 'Mathematical methods in medicine: neuroscience, cardiology and pathology'.
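
    For reference, the pairwise maximum entropy (Ising) model fitted in energy landscape analysis: for binarized activity sigma_i in {-1, +1} of N brain regions,

        \[
          P(\boldsymbol{\sigma}) = \frac{1}{Z}
          \exp\!\Bigl(\sum_{i} h_{i}\sigma_{i} + \sum_{i<j} J_{ij}\sigma_{i}\sigma_{j}\Bigr),
          \qquad
          E(\boldsymbol{\sigma}) = -\sum_{i} h_{i}\sigma_{i} - \sum_{i<j} J_{ij}\sigma_{i}\sigma_{j},
        \]

    where h_i and J_{ij} are fitted so that the model reproduces the empirical activation rates and pairwise correlations; the local minima of E define the wells of the energy landscape.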

  7. The Contact Dynamics method: A nonsmooth story

    NASA Astrophysics Data System (ADS)

    Dubois, Frédéric; Acary, Vincent; Jean, Michel

    2018-03-01

    When velocity jumps occur, the dynamics is said to be nonsmooth. For instance, in collections of contacting rigid bodies, jumps are caused by shocks and dry friction. Without compliance at the interface, contact laws are not only non-differentiable in the usual sense but also multi-valued. Modeling contacting bodies is of interest for understanding the behavior of numerous mechanical systems such as flexible multi-body systems, granular materials or masonry. Granular materials behave puzzlingly either like a solid or a fluid, and a description in the frame of classical continuum mechanics would be welcome, though it remains far from satisfactory nowadays. Jean-Jacques Moreau contributed greatly to convex analysis, functions of bounded variation, differential measure theory, and sweeping process theory: the definitive mathematical tools for dealing with nonsmooth dynamics. He converted all these underlying theoretical ideas into an original nonsmooth implicit numerical method called Contact Dynamics (CD), a robust and efficient method to simulate large collections of bodies with frictional contacts and impacts. The CD method offers a very interesting complementary alternative to the family of smoothed explicit numerical methods, often called the Distinct Element Method (DEM). In this paper, developments and improvements of the CD method are presented together with a critical comparative review of the advantages and drawbacks of both approaches.

  8. Illuminant color estimation based on pigmentation separation from human skin color

    NASA Astrophysics Data System (ADS)

    Tanaka, Satomi; Kakinuma, Akihiro; Kamijo, Naohiro; Takahashi, Hiroshi; Tsumura, Norimichi

    2015-03-01

    Humans have a visual mechanism called "color constancy" that maintains the perceived colors of the same object under various light sources. An effective color constancy algorithm using the human facial color in a digital color image was previously proposed; however, that method yields erroneous estimates owing to differences between individual facial colors. In this paper, we present a novel color constancy algorithm based on skin color analysis. Skin color analysis is a method of separating the skin color into melanin, hemoglobin and shading components. We use the stationary property of Japanese facial color, calculated from the melanin and hemoglobin components. As a result, the proposed method uses the subject's facial color in the image without depending on individual differences among Japanese facial colors.

  9. Joint multifractal analysis based on wavelet leaders

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.

  10. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method, based on a data-driven K-means clustering analysis algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV units as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by a method called technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
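
    A hedged sketch of the data-driven clustering step: standardized operating snapshots are grouped with k-means, and the cluster centroids serve as representative planning scenarios. The feature choice (hourly load and PV irradiance), the value of k, and the probability weighting are assumptions for illustration, not the paper's settings.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(42)
        # columns: feeder load (MW), PV irradiance (kW/m^2); one row per hour
        snapshots = np.column_stack([
            rng.normal(8.0, 2.0, 8760),
            rng.uniform(0.0, 1.0, 8760),
        ])

        X = StandardScaler().fit_transform(snapshots)
        km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
        scenarios = km.cluster_centers_               # representative scenarios (standardized)
        weights = np.bincount(km.labels_) / len(X)    # probability weight of each scenario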

  11. A numerical formulation and algorithm for limit and shakedown analysis of large-scale elastoplastic structures

    NASA Astrophysics Data System (ADS)

    Peng, Heng; Liu, Yinghua; Chen, Haofeng

    2018-05-01

    In this paper, a novel direct method called the stress compensation method (SCM) is proposed for limit and shakedown analysis of large-scale elastoplastic structures. Without needing to solve a specific mathematical programming problem, the SCM is a two-level iterative procedure based on a sequence of linear elastic finite element solutions in which the global stiffness matrix is decomposed only once. In the inner loop, the statically admissible residual stress field for shakedown analysis is constructed. In the outer loop, a series of decreasing load multipliers is updated to approach the shakedown limit multiplier using an efficient and robust iteration control technique based on the static shakedown theorem. Three numerical examples with up to about 140,000 finite element nodes confirm the applicability and efficiency of this method for two-dimensional and three-dimensional elastoplastic structures, with detailed discussions of the convergence and accuracy of the proposed algorithm.

  12. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
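
    The underlying mapping is the natural visibility criterion (Lacasa et al.): two points of the series see each other if every intermediate point lies below the straight line joining them. A minimal sketch follows; the paper's segment-wise temporal network and network-of-networks construction are built on top of graphs like this one.

        def visibility_edges(y):
            """Return the edges (a, b) of the natural visibility graph of series y."""
            n = len(y)
            edges = []
            for a in range(n - 1):
                for b in range(a + 1, n):
                    visible = all(
                        y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                        for c in range(a + 1, b)
                    )
                    if visible:
                        edges.append((a, b))
            return edges

        print(visibility_edges([3.0, 1.0, 2.0, 0.5, 4.0]))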

  13. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  14. [Emergencies and continuous care: overload of the current on-call system and search for new models].

    PubMed

    Enríquez-Navascués, Jose M

    2008-04-01

    Emergency surgical care is still provided by means of a 24-hour physical-presence "on-call" model (encompassing a normal day followed by "on call"), and is obligatory for all staff. This defective organisation of work has become unsustainable with the adoption of the European 48-hour Directive, and is gruelling owing to the excessive night work and the feeling of being locked in that it entails. Emergency general and digestive system surgery care cannot be provided by a single organisational model, but has to be adapted to local circumstances. It is important to separate scheduled activity from urgent activity; whereas increasingly more resources are dedicated to scheduled care, sufficient resources are also required for urgent activities, which cannot be considered simply an "on call" duty or a fleeting stop in scheduled activity. Core subjects in residency, creating different levels of provision and activities, analysing urgent activity per work period and identifying foreseeable activity in order to maintain a pro-active mentality, and eliminating the "overtime" concept should help to provide another care model and method of remuneration.

  15. Evaluation of copy number variation detection for a SNP array platform

    PubMed Central

    2014-01-01

    Background Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays by software packages implementing various algorithms. However, there is no clear understanding of the performance of these software packages; it is therefore difficult to select one or several of them for CNV detection on a SNP array platform. We selected four publicly available software packages designed for CNV calling from an Affymetrix SNP array: Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. A publicly available dataset generated by Array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered to be the "gold standard". The success rate, average stability rate, sensitivity, consistency and reproducibility of these four software packages were assessed against this "gold standard". Specifically, we also compared the efficiency of detecting CNVs simultaneously by two, three or all of the software packages with that of a single software package. Results Simply in terms of the quantity of detected CNVs, Birdsuite detected the most while GTC detected the least. We found that Birdsuite and dChip had obvious detection bias, and GTC seemed inferior because of the small number of CNVs it detected. We then investigated the detection consistency between each software package and the other three; the consistency of dChip was the lowest while that of GTC was the highest. Compared with the CNV detection results of CGH, GTC called the most matching CNVs in the matching group, with PennCNV-Affy ranked second; in the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency while Birdsuite showed the poorest. Conclusion We found that PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Obviously, each calling method has its own limitations and advantages for different data analyses. Therefore, optimized calling might be achieved by using multiple algorithms to evaluate the concordance and discordance of SNP array-based CNV calling. PMID:24555668

  16. Net analyte signal standard addition method (NASSAM) as a novel spectrofluorimetric and spectrophotometric technique for simultaneous determination, application to assay of melatonin and pyridoxine

    NASA Astrophysics Data System (ADS)

    Asadpour-Zeynali, Karim; Bastami, Mohammad

    2010-02-01

    In this work a new modification of the standard addition method, called the "net analyte signal standard addition method" (NASSAM), is presented for simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of the standard addition method with those of the net analyte signal concept. The method can be applied to the determination of an analyte in the presence of known interferents. Unlike the H-point standard addition method, the accuracy of the predictions does not depend on the shape of the analyte and interferent spectra. The method was successfully applied to the simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.
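
    The net analyte signal (NAS) at the core of the method is, in the usual formulation, the part of the measured spectrum orthogonal to the subspace spanned by the interferent spectra. With x the measured mixture spectrum and X_{-k} the matrix of interferent spectra (notation assumed here, not taken from the paper):

        \[
          \mathbf{x}^{*}_{k} =
          \bigl(\mathbf{I} - \mathbf{X}_{-k}\mathbf{X}_{-k}^{+}\bigr)\,\mathbf{x},
        \]

    where X_{-k}^{+} is the Moore-Penrose pseudoinverse. Standard additions of analyte k change only x*_k, which is what allows the NAS to be regressed against the added concentrations.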

  17. Altai pika (Ochotona alpina) alarm calls: individual acoustic variation and the phenomenon of call-synchronous ear folding behavior.

    PubMed

    Volodin, Ilya A; Matrosova, Vera A; Frey, Roland; Kozhevnikova, Julia D; Isaeva, Inna L; Volodina, Elena V

    2018-06-11

    Non-hibernating pikas collect winter food reserves and store them in hay piles. Individualization of alarm calls might allow discrimination between colony members and conspecifics trying to steal food items from a colony pile. We investigated vocal posture, vocal tract length, and individual acoustic variation of alarm calls, emitted by wild-living Altai pikas Ochotona alpina toward a researcher. Recording started when a pika started calling and lasted as long as possible. The alarm call series of 442 individual callers from different colonies consisted of discrete short (0.073-0.157 s), high-frequency (7.31-15.46 kHz), and frequency-modulated calls separated by irregular intervals. Analysis of 442 discrete calls, the second of each series, revealed that 44.34% calls lacked nonlinear phenomena, in 7.02% nonlinear phenomena covered less than half of call duration, and in 48.64% nonlinear phenomena covered more than half of call duration. Peak frequencies varied among individuals but always fitted one of three maxima corresponding to the vocal tract resonance frequencies (formants) calculated for an estimated 45-mm oral vocal tract. Discriminant analysis using variables of 8 calls per series of 36 different callers, each from a different colony, correctly assigned over 90% of the calls to individuals. Consequently, Altai pika alarm calls are individualistic and nonlinear phenomena might further increase this acoustic individualization. Additionally, video analysis revealed a call-synchronous, very fast (0.13-0.23 s) folding, depression, and subsequent re-expansion of the pinna confirming an earlier report of this behavior that apparently contributes to protecting the hearing apparatus from damage by the self-generated high-intensity alarm calls.

  18. Development of the Nonstationary Incremental Analysis Update Algorithm for Sequential Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Ham, Yoo-Geun; Song, Hyo-Jong; Jung, Jaehee; Lim, Gyu-Ho

    2017-04-01

    This study introduces an altered version of the incremental analysis updates (IAU), called the nonstationary IAU (NIAU) method, to improve the assimilation accuracy of the IAU while retaining the continuity of the analysis. Analogous to the IAU, the NIAU is designed to add analysis increments at every model time step to improve continuity in intermittent data assimilation. Unlike the IAU, however, the NIAU method applies time-evolved forcing, employing the forward operator to propagate the corrections applied to the model. In terms of the accuracy of the analysis field, the NIAU solution is superior to that of the IAU, whose analysis is performed at the start of the time window over which the IAU forcing is added; this is because, in linear systems, the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To obtain the filtering property in the NIAU, the forward operator that propagates the increment is reconstructed using only the dominant singular vectors. The advantages of the NIAU are illustrated using the simple 40-variable Lorenz model.
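
    For reference, the classical IAU scheme that the NIAU generalizes: given an analysis increment and N model steps in the assimilation window, the increment is fed in as a constant forcing spread over the window instead of being inserted all at once,

        \[
          \mathbf{x}_{i+1} = \mathcal{M}(\mathbf{x}_{i}) + \frac{\delta\mathbf{x}^{a}}{N},
          \qquad i = 0, \dots, N-1.
        \]

    The NIAU replaces the constant increment \delta\mathbf{x}^{a}/N with a forcing propagated to each step by the forward operator.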

  19. MAVTgsa: An R Package for Gene Set (Enrichment) Analysis

    DOE PAGES

    Chien, Chih-Yi; Chang, Ching-Wei; Tsai, Chen-An; ...

    2014-01-01

    Gene set analysis methods aim to determine whether an a priori defined set of genes shows statistically significant difference in expression on either categorical or continuous outcomes. Although many methods for gene set analysis have been proposed, a systematic analysis tool for identification of different types of gene set significance modules has not been developed previously. This work presents an R package, called MAVTgsa, which includes three different methods for integrated gene set enrichment analysis. (1) The one-sided OLS (ordinary least squares) test detects coordinated changes of genes in a gene set in one direction, either up- or downregulation. (2) The two-sided MANOVA (multivariate analysis of variance) detects changes in both directions for studying two or more experimental conditions. (3) A random forests-based procedure identifies gene sets that can accurately predict samples from different experimental conditions or that are associated with continuous phenotypes. MAVTgsa computes the P values and FDR (false discovery rate) q-values for all gene sets in the study. Furthermore, MAVTgsa provides several visualization outputs to support and interpret the enrichment results. This package is available online.

  20. Ciguatera Fish Poisoning and Climate Change: Analysis of National Poison Center Data in the United States, 2001–2011

    PubMed Central

    Strickland, Matthew J.; Hess, Jeremy J.

    2014-01-01

    Background: Warm sea surface temperatures (SSTs) are positively related to incidence of ciguatera fish poisoning (CFP). Increased severe storm frequency may create more habitat for ciguatoxic organisms. Although climate change could expand the endemic range of CFP, the relationship between CFP incidence and specific environmental conditions is unknown. Objectives: We estimated associations between monthly CFP incidence in the contiguous United States and SST and storm frequency in the Caribbean basin. Methods: We obtained information on 1,102 CFP-related calls to U.S. poison control centers during 2001–2011 from the National Poison Data System. We performed a time-series analysis using Poisson regression to relate monthly CFP call incidence to SST and tropical storms. We investigated associations across a range of plausible lag structures. Results: Results showed associations between monthly CFP calls and both warmer SSTs and increased tropical storm frequency. The SST variable with the strongest association linked current monthly CFP calls to the peak August SST of the previous year. The lag period with the strongest association for storms was 18 months. If climate change increases SST in the Caribbean 2.5–3.5°C over the coming century as projected, this model implies that CFP incidence in the United States is likely to increase 200–400%. Conclusions: Using CFP calls as a marker of CFP incidence, these results clarify associations between climate variability and CFP incidence and suggest that, all other things equal, climate change could increase the burden of CFP. These findings have implications for disease prediction, surveillance, and public health preparedness for climate change. Citation: Gingold DB, Strickland MJ, Hess JJ. 2014. Ciguatera fish poisoning and climate change: analysis of National Poison Center data in the United States, 2001–2011. Environ Health Perspect 122:580–586; http://dx.doi.org/10.1289/ehp.1307196 PMID:24618280

  1. Dynamic variable selection in SNP genotype autocalling from APEX microarray data.

    PubMed

    Podder, Mohua; Welch, William J; Zamar, Ruben H; Tebbutt, Scott J

    2006-11-30

    Single nucleotide polymorphisms (SNPs) are DNA sequence variations, occurring when a single nucleotide--adenine (A), thymine (T), cytosine (C) or guanine (G)--is altered. Arguably, SNPs account for more than 90% of human genetic variation. Our laboratory has developed a highly redundant SNP genotyping assay consisting of multiple probes with signals from multiple channels for a single SNP, based on arrayed primer extension (APEX). This mini-sequencing method is a powerful combination of a highly parallel microarray with distinctive Sanger-based dideoxy terminator sequencing chemistry. Using this microarray platform, our current genotype calling system (known as SNP Chart) is capable of calling single SNP genotypes by manual inspection of the APEX data, which is time-consuming and exposed to user subjectivity bias. Using a set of 32 Coriell DNA samples plus three negative PCR controls as a training data set, we have developed a fully-automated genotyping algorithm based on simple linear discriminant analysis (LDA) using dynamic variable selection. The algorithm combines separate analyses based on the multiple probe sets to give a final posterior probability for each candidate genotype. We have tested our algorithm on a completely independent data set of 270 DNA samples, with validated genotypes, from patients admitted to the intensive care unit (ICU) of St. Paul's Hospital (plus one negative PCR control sample). Our method achieves a concordance rate of 98.9% with a 99.6% call rate for a set of 96 SNPs. By adjusting the threshold value for the final posterior probability of the called genotype, the call rate reduces to 94.9% with a higher concordance rate of 99.6%. We also reversed the two independent data sets in their training and testing roles, achieving a concordance rate up to 99.8%. The strength of this APEX chemistry-based platform is its unique redundancy having multiple probes for a single SNP. Our model-based genotype calling algorithm captures the redundancy in the system considering all the underlying probe features of a particular SNP, automatically down-weighting any 'bad data' corresponding to image artifacts on the microarray slide or failure of a specific chemistry. In this regard, our method is able to automatically select the probes which work well and reduce the effect of other so-called bad performing probes in a sample-specific manner, for any number of SNPs.
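
    An illustrative sketch of LDA-based calling with a posterior-probability no-call threshold, mirroring the strategy described above; the toy two-channel intensities, class centers, and 0.95 threshold are assumptions, not the authors' trained model or data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        # Toy 2-channel probe intensities for genotypes AA, AB and BB.
        centers = {"AA": (5.0, 1.0), "AB": (3.0, 3.0), "BB": (1.0, 5.0)}
        X = np.vstack([rng.normal(c, 0.5, size=(50, 2)) for c in centers.values()])
        y = np.repeat(list(centers), 50)

        clf = LinearDiscriminantAnalysis().fit(X, y)

        def call_genotype(sample, threshold=0.95):
            """Return the genotype call, or 'NC' (no call) below the threshold."""
            probs = clf.predict_proba([sample])[0]
            best = int(np.argmax(probs))
            return clf.classes_[best] if probs[best] >= threshold else "NC"

        print(call_genotype([4.8, 1.2]))   # confident AA call
        print(call_genotype([2.0, 4.0]))   # near the AB/BB boundary -> likely 'NC'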

  2. Multifractal Cross Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Gao, Xing-Lu; Zhou, Wei-Xing; Stanley, H. Eugene

    Complex systems are composed of mutually interacting components and the output values of these components usually exhibit long-range cross-correlations. Using wavelet analysis, we propose a method of characterizing the joint multifractal nature of these long-range cross correlations, a method we call multifractal cross wavelet analysis (MFXWT). We assess the performance of the MFXWT method by performing extensive numerical experiments on the dual binomial measures with multifractal cross correlations and the bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. For binomial multifractal measures, we find the empirical joint multifractality of MFXWT to be in approximate agreement with the theoretical formula. For bFBMs, MFXWT may provide spurious multifractality because of the wide spanning range of the multifractal spectrum. We also apply the MFXWT method to stock market indices, and in pairs of index returns and volatilities we find an intriguing joint multifractal behavior. The tests on surrogate series also reveal that the cross correlation behavior, particularly the cross correlation with zero lag, is the main origin of cross multifractality.
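
    The toy sketch below conveys only the general flavor of wavelet-based joint scaling analysis, estimating how joint moments of two signals' continuous wavelet coefficients scale with wavelet scale; it is not the MFXWT estimator of the paper, and the signals, wavelet, and scale range are arbitrary choices.

      # Joint wavelet scaling toy: slope of log F_xy(q, s) against log s.
      import numpy as np
      import pywt

      rng = np.random.default_rng(2)
      n = 4096
      x = np.cumsum(rng.normal(size=n))  # placeholder signals; real inputs would be
      y = np.cumsum(rng.normal(size=n))  # e.g. index returns and volatilities

      scales = np.geomspace(4, 256, 16)
      wx, _ = pywt.cwt(x, scales, "morl")
      wy, _ = pywt.cwt(y, scales, "morl")

      def joint_exponent(q):
          # F_xy(q, s) = <|W_x(s, t) W_y(s, t)|^(q/2)>_t, then a log-log fit over s
          F = np.mean(np.abs(wx * wy) ** (q / 2.0), axis=1)
          slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
          return slope

      for q in (1, 2, 3, 4):
          print(q, joint_exponent(q))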

  3. Analysis of pressure-flow data in terms of computer-derived urethral resistance parameters.

    PubMed

    van Mastrigt, R; Kranse, M

    1995-01-01

    The simultaneous measurement of detrusor pressure and flow rate during voiding is at present the only way to measure or grade infravesical obstruction objectively. Numerous methods have been introduced to analyze the resulting data. These methods differ in aim (measurement of urethral resistance and/or diagnosis of obstruction), method (manual versus computerized data processing), theory or model used, and resolution (continuously variable parameters or a limited number of classes, the so-called monogram). In this paper, some aspects of these fundamental differences are discussed and illustrated. Subsequently, the properties and clinical performance of two computer-based methods for deriving continuous urethral resistance parameters are treated.

  4. The Effect of Perceiving a Calling on Pakistani Nurses' Organizational Commitment, Organizational Citizenship Behavior, and Job Stress.

    PubMed

    Afsar, Bilal; Shahjehan, Asad; Cheema, Sadia; Javed, Farheen

    2018-03-01

    People differ considerably in the way in which they express and experience their nursing careers. The positive effects associated with having a calling may differ substantially based on individuals' abilities to live out their callings. In a working world where many individuals have little to no choice in their type of employment and thus are unable to live out a calling even if they have one, the current study examined how perceiving a calling and living a calling interacted to predict organizational commitment, organizational citizenship behavior, and job stress with career commitment mediating the effect of the interactions on the three outcome variables. The purpose of the study is to investigate the mediating effect of career commitment between the relationships of calling and (a) nurses' attitudes (organizational commitment), (b) behaviors (organizational citizenship behavior), and (c) subjective experiences regarding work (job stress). Using a descriptive exploratory design, data were collected from 332 registered nurses working in Pakistani hospitals. Descriptive analysis and hierarchical regression analysis were used for data analysis. Living a calling moderated the effect of calling on career commitment, organizational citizenship behavior, and job stress, and career commitment fully mediated the effect of calling on organizational commitment, organizational citizenship behavior, and job stress. Increasing the understanding of calling, living a calling, and career commitment may increase nurses' organizational commitment and organizational citizenship behavior and decrease job stress. The study provided evidence to help nursing managers and health policy makers integrate knowledge and skills related to calling into career interventions and help nurses discover their calling.

  5. The Acquisition of Problem-Solving Skills in Mathematics: How Animations Can Aid Understanding of Structural Problem Features and Solution Procedures

    ERIC Educational Resources Information Center

    Scheiter, Katharina; Gerjets, Peter; Schuh, Julia

    2010-01-01

    In this paper the augmentation of worked examples with animations for teaching problem-solving skills in mathematics is advocated as an effective instructional method. First, in a cognitive task analysis different knowledge prerequisites are identified for solving mathematical word problems. Second, it is argued that so-called hybrid animations…

  6. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    …harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC).

  7. Analysis of Parent, Teacher, and Consultant Speech Exchanges and Educational Outcomes of Students with Autism during COMPASS Consultation

    ERIC Educational Resources Information Center

    Ruble, Lisa; Birdwhistell, Jessie; Toland, Michael D.; McGrew, John H.

    2011-01-01

    The significant increase in the numbers of students with autism combined with the need for better trained teachers (National Research Council, 2001) call for research on the effectiveness of alternative methods, such as consultation, that have the potential to improve service delivery. Data from 2 randomized controlled single-blind trials indicate…

  8. Multivariate Density Estimation and Remote Sensing

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1983-01-01

    Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. This approach is applied to thunderstorm data analysis.

  9. The Influence of Unemployment and Divorce Rate on Child Help-Seeking Behavior about Violence, Relationships, and Other Issues

    ERIC Educational Resources Information Center

    van Dolen, Willemijn M.; Weinberg, Charles B.; Ma, Leiming

    2013-01-01

    Objective: This study examined the influence of community unemployment and divorce rate on child help-seeking behavior about violence and relationships via a telephone and Internet helpline. Methods: Time series analysis was conducted on monthly call volumes to a child helpline ("De Kindertelefoon") in the Netherlands from 2003 to 2008…

  10. Do Service Users with Intellectual Disabilities Want to Be Involved in the Risk Management Process? A Thematic Analysis

    ERIC Educational Resources Information Center

    Kilcommons, Aoiffe M.; Withers, Paul; Moreno-Lopez, Agueda

    2012-01-01

    Background: Involving ID service users in risk decision making necessitates consideration of an individual's ability to assess the implications and associated risks and thus make an informed choice. This calls for research on service users' awareness and understanding of risk management (RM). Method: Thirteen people in a residential ID service who…

  11. Non-Gradient Blue Native Polyacrylamide Gel Electrophoresis.

    PubMed

    Luo, Xiaoting; Wu, Jinzi; Jin, Zhen; Yan, Liang-Jun

    2017-02-02

    Gradient blue native polyacrylamide gel electrophoresis (BN-PAGE) is a well established and widely used technique for activity analysis of high-molecular-weight proteins, protein complexes, and protein-protein interactions. Since its inception in the early 1990s, a variety of minor modifications have been made to this gradient gel analytical method. Here we provide a major modification of the method, which we call non-gradient BN-PAGE. The procedure, similar to that of non-gradient SDS-PAGE, is simple because there is no expensive gradient maker involved. The non-gradient BN-PAGE protocols presented herein provide guidelines on the analysis of mitochondrial protein complexes, in particular, dihydrolipoamide dehydrogenase (DLDH) and those in the electron transport chain. Protocols for the analysis of blood esterases or mitochondrial esterases are also presented. The non-gradient BN-PAGE method may be tailored for analysis of specific proteins according to their molecular weight regardless of whether the target proteins are hydrophobic or hydrophilic. © 2017 by John Wiley & Sons, Inc.

  12. Extraction of composite visual objects from audiovisual materials

    NASA Astrophysics Data System (ADS)

    Durand, Gwenael; Thienot, Cedric; Faudemay, Pascal

    1999-08-01

    An effective analysis of Visual Objects appearing in still images and video frames is required in order to offer fine grain access to multimedia and audiovisual contents. In previous papers, we showed how our method for segmenting still images into visual objects could improve content-based image retrieval and video analysis methods. Visual Objects are used in particular for extracting semantic knowledge about the contents. However, low-level segmentation methods for still images are not likely to extract a complex object as a whole but instead as a set of several sub-objects. For example, a person would be segmented into three visual objects: a face, hair, and a body. In this paper, we introduce the concept of Composite Visual Object. Such an object is hierarchically composed of sub-objects called Component Objects.

  13. Microarray Detection Call Methodology as a Means to Identify and Compare Transcripts Expressed within Syncytial Cells from Soybean (Glycine max) Roots Undergoing Resistant and Susceptible Reactions to the Soybean Cyst Nematode (Heterodera glycines)

    PubMed Central

    Klink, Vincent P.; Overall, Christopher C.; Alkharouf, Nadim W.; MacDonald, Margaret H.; Matthews, Benjamin F.

    2010-01-01

    Background. A comparative microarray investigation was done using detection call methodology (DCM) and differential expression analyses. The goal was to identify genes found in specific cell populations that were eliminated by differential expression analysis due to the nature of differential expression methods. Laser capture microdissection (LCM) was used to isolate nearly homogeneous populations of plant root cells. Results. The analyses identified the presence of 13,291 transcripts between the 4 different sample types. The transcripts filtered down into a total of 6,267 that were detected as being present in one or more sample types. A comparative analysis of DCM and differential expression methods showed a group of genes that were not differentially expressed, but were expressed at detectable amounts within specific cell types. Conclusion. The DCM has identified patterns of gene expression not shown by differential expression analyses. DCM has identified genes that are possibly cell-type specific and/or involved in important aspects of plant nematode interactions during the resistance response, revealing the uniqueness of a particular cell population at a particular point during its differentiation process. PMID:20508855

  14. GI-POP: a combinational annotation and genomic island prediction pipeline for ongoing microbial genome projects.

    PubMed

    Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi

    2013-04-10

    Sequencing of microbial genomes is important because microbes carry genes for antibiotic resistance and pathogenesis. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenic or antibiotic-resistance genes are carried on genomic islands. Therefore, a quick genomic island (GI) prediction method is useful for genomes that are still being sequenced. In this work, we built a Web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembling tool, a functional annotation pipeline, and a high-performance GI predicting module based on a support vector machine (SVM) method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our Web server, which returns functional annotation and high-probability GI predictions. GI-POP is a comprehensive annotation Web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information including possible GIs, coding/non-coding sequences and functional analysis from their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
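
    Purely as an illustration of SVM-based window scanning (GI-GPS itself uses a much richer genomic profile than this), the toy below labels fixed-size windows of a synthetic genome using GC content as the only feature; the sequences and the island's skewed composition are invented.

      # Toy genomic-island scan: SVM over per-window GC content.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)

      def gc_windows(seq, k=1000):
          starts = range(0, len(seq) - k + 1, k)
          return np.array([[(seq.count("G", i, i + k) + seq.count("C", i, i + k)) / k]
                           for i in starts])

      host = "".join(rng.choice(list("ACGT"), size=40000))            # ~50% GC backbone
      island = "".join(rng.choice(list("ACGT"), p=[0.35, 0.15, 0.15, 0.35], size=4000))
      genome = host[:20000] + island + host[20000:]                   # island inserted mid-genome

      X = np.vstack([gc_windows(host), gc_windows(island)])
      y = np.r_[np.zeros(len(gc_windows(host))), np.ones(len(gc_windows(island)))]
      clf = SVC(probability=True).fit(X, y)

      scores = clf.predict_proba(gc_windows(genome))[:, 1]
      print("island-like windows:", np.flatnonzero(scores > 0.5))     # expect ~windows 20-23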

  15. Effects of different analysis techniques and recording duty cycles on passive acoustic monitoring of killer whales.

    PubMed

    Riera, Amalis; Ford, John K; Ross Chapman, N

    2013-09-01

    Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplemental method to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time and long term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimations of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency.
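
    A rough sketch of producing long-term spectral averages with SciPy follows; the sample rate, FFT length, and 5-second averaging bin are illustrative assumptions, and in practice call detection and pod identification would be run on spectra like these.

      # Long-term spectral average (LTSA) sketch: average spectrogram columns in coarse bins.
      import numpy as np
      from scipy.signal import spectrogram

      fs = 16000                               # assumed sample rate (Hz)
      rng = np.random.default_rng(5)
      audio = rng.normal(size=fs * 60)         # stand-in for one minute of recording

      f, t, Sxx = spectrogram(audio, fs=fs, nperseg=1024, noverlap=0)

      bin_len = int(5 / (t[1] - t[0]))         # ~5 s of spectra per LTSA slice
      n_bins = Sxx.shape[1] // bin_len
      ltsa = Sxx[:, :n_bins * bin_len].reshape(len(f), n_bins, bin_len).mean(axis=2)
      print(ltsa.shape)                        # (frequency bins, coarse time bins)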

  16. Thermodynamic Analysis of Chemically Reacting Mixtures-Comparison of First and Second Order Models.

    PubMed

    Pekař, Miloslav

    2018-01-01

    Recently, a method based on non-equilibrium continuum thermodynamics which derives thermodynamically consistent reaction rate models together with thermodynamic constraints on their parameters was analyzed using a triangular reaction scheme. The scheme was kinetically of the first order. Here, the analysis is further developed for several first and second order schemes to gain a deeper insight into the thermodynamic consistency of rate equations and the relationships between chemical thermodynamics and kinetics. It is shown that the thermodynamic constraints on the so-called proper rate coefficients are usually simple sign restrictions consistent with the supposed reaction directions. Constraints on the so-called coupling rate coefficients are more complex and weaker. This means more freedom in kinetic coupling between reaction steps in a scheme, i.e., in the kinetic effects of other reactions on the rate of some reaction in a reacting system. When compared with traditional mass-action rate equations, the method allows a reduction in the number of traditional rate constants to be evaluated from data, i.e., a reduction in the dimensionality of the parameter estimation problem. This is due to identifying previously unknown relationships between mass-action rate constants (relationships which also include thermodynamic equilibrium constants).

  17. MPAI (mass probes aided ionization) method for total analysis of biomolecules by mass spectrometry.

    PubMed

    Honda, Aki; Hayashi, Shinichiro; Hifumi, Hiroki; Honma, Yuya; Tanji, Noriyuki; Iwasawa, Naoko; Suzuki, Yoshio; Suzuki, Koji

    2007-01-01

    We have designed and synthesized various mass probes, which enable us to effectively ionize various molecules for detection by mass spectrometry. We call the ionization method using mass probes the "MPAI (mass probes aided ionization)" method. We aim at the sensitive detection of various biological molecules, and also at the serial detection of biomolecules by a single mass spectrometer without changing the mechanical settings. Here, we review mass probes for small molecules with various functional groups and mass probes for proteins. Further, we introduce newly developed mass probes for proteins for highly sensitive detection.

  18. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (C q ) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the logscale as long as possible by working with log 10 (E) ∙ C q , which we call the efficiency-weighted C q value; subsequent statistical analyses are then applied in the logscale. We show how efficiency-weighted C q values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
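
    A small numeric sketch of the method's central quantity, the efficiency-weighted Cq value log10(E)*Cq kept in log scale through the comparison, is shown below; the Cq values, efficiencies, and group labels are invented for illustration.

      # Common-base-style comparison with efficiency-weighted Cq values (toy data).
      import numpy as np

      Cq_target = np.array([24.1, 23.8, 20.2, 20.5])
      E_target = np.array([1.95, 1.96, 1.94, 1.95])   # per-well amplification efficiencies
      Cq_ref = np.array([18.0, 18.2, 18.1, 18.0])     # reference gene
      E_ref = np.array([1.98, 1.97, 1.98, 1.98])
      treated = np.array([False, False, True, True])

      w_target = np.log10(E_target) * Cq_target       # efficiency-weighted Cq
      w_ref = np.log10(E_ref) * Cq_ref

      # log10 relative expression per sample; initial amount scales as -log10(E)*Cq
      log_rel = w_ref - w_target
      effect = log_rel[treated].mean() - log_rel[~treated].mean()
      print("log10 fold change:", effect, "fold change:", 10 ** effect)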

  19. A new stratification of mourning dove call-count routes

    USGS Publications Warehouse

    Blankenship, L.H.; Humphrey, A.B.; MacDonald, D.

    1971-01-01

    The mourning dove (Zenaidura macroura) call-count survey is a nationwide audio-census of breeding mourning doves. Recent analyses of the call-count routes have utilized a stratification based upon physiographic regions of the United States. An analysis of 5 years of call-count data, based upon stratification using potential natural vegetation, has demonstrated that this new stratification results in strata with greater homogeneity than the physiographic strata, provides lower error variance, and hence generates greater precision in the analysis without an increase in call-count routes. Error variance was reduced approximately 30 percent for the contiguous United States. This indicates that future analysis based upon the new stratification will result in an increased ability to detect significant year-to-year changes.

  20. Leveraging multiple gene networks to prioritize GWAS candidate genes via network representation learning.

    PubMed

    Wu, Mengmeng; Zeng, Wanwen; Liu, Wenqiang; Lv, Hairong; Chen, Ting; Jiang, Rui

    2018-06-03

    Genome-wide association studies (GWAS) have successfully discovered a number of disease-associated genetic variants in the past decade, providing an unprecedented opportunity for deciphering the genetic basis of human inherited diseases. However, it is still a challenging task to extract biological knowledge from GWAS data, due to such issues as missing heritability and weak interpretability. Indeed, the fact that the majority of discovered loci fall into noncoding regions without clear links to genes has prevented the characterization of their functions and calls for a sophisticated approach to bridge genetic and genomic studies. Towards this problem, network-based prioritization of candidate genes, which performs integrated analysis of gene networks with GWAS data, has emerged as a promising direction and attracted much attention. However, most existing methods overlook the sparse and noisy properties of gene networks and thus may lead to suboptimal performance. Motivated by this understanding, we proposed a novel method called REGENT for integrating multiple gene networks with GWAS data to prioritize candidate genes for complex diseases. We leveraged a technique called network representation learning to embed a gene network into a compact and robust feature space, and then designed a hierarchical statistical model to integrate features of multiple gene networks with GWAS data for the effective inference of genes associated with a disease of interest. We applied our method to six complex diseases and demonstrated the superior performance of REGENT over existing approaches in recovering known disease-associated genes. We further conducted a pathway analysis and showed the ability of REGENT to discover disease-associated pathways. We expect to see applications of our method to a broad spectrum of diseases for post-GWAS analysis. REGENT is freely available at https://github.com/wmmthu/REGENT. Copyright © 2018 Elsevier Inc. All rights reserved.
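
    A heavily simplified sketch of the general strategy (not REGENT itself) appears below: a spectral embedding stands in for network representation learning, and a logistic model combines the embedding with GWAS scores to rank genes; the graph, scores, and disease labels are all synthetic.

      # Toy network-plus-GWAS gene ranking with a spectral embedding.
      import numpy as np
      import networkx as nx
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(6)
      G = nx.erdos_renyi_graph(200, 0.05, seed=6)     # placeholder gene network

      L = nx.normalized_laplacian_matrix(G).toarray()
      eigvals, eigvecs = np.linalg.eigh(L)
      embedding = eigvecs[:, 1:9]                     # 8 low-frequency modes as features

      gwas_score = rng.random(200)                    # toy per-gene GWAS signal
      X = np.column_stack([embedding, gwas_score])
      y = rng.integers(0, 2, 200)                     # toy known disease genes

      clf = LogisticRegression(max_iter=1000).fit(X, y)
      ranking = np.argsort(-clf.predict_proba(X)[:, 1])
      print("top candidates:", ranking[:10])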

  1. Thermal conductivity analysis and applications of nanocellulose materials

    PubMed Central

    Uetani, Kojiro; Hatori, Kimihito

    2017-01-01

    Abstract In this review, we summarize the recent progress in thermal conductivity analysis of nanocellulose materials called cellulose nanopapers, and compare them with polymeric materials, including neat polymers, composites, and traditional paper. It is important to individually measure the in-plane and through-plane heat-conducting properties of two-dimensional planar materials, so steady-state and non-equilibrium methods, in particular the laser spot periodic heating radiation thermometry method, are reviewed. The structural dependency of cellulose nanopaper on thermal conduction is described in terms of the crystallite size effect, fibre orientation, and interfacial thermal resistance between fibres and small pores. The novel applications of cellulose as thermally conductive transparent materials and thermal-guiding materials are also discussed. PMID:29152020

  2. MOSAIC: Software for creating mosaics from collections of images

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Gezari, D. Y.

    1992-01-01

    We have developed a powerful, versatile image processing and analysis software package called MOSAIC, designed specifically for the manipulation of digital astronomical image data obtained with (but not limited to) two-dimensional array detectors. The software package is implemented using the Interactive Data Language (IDL), and incorporates new methods for processing, calibration, analysis, and visualization of astronomical image data, stressing effective methods for the creation of mosaic images from collections of individual exposures, while at the same time preserving the photometric integrity of the original data. Since IDL is available on many computers, the MOSAIC software runs on most UNIX and VAX workstations with the X-Windows or Sun View graphics interface.

  3. Buckling Analysis of Single and Multi Delamination In Composite Beam Using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Simanjorang, Hans Charles; Syamsudin, Hendri; Giri Suada, Muhammad

    2018-04-01

    Delamination is one type of imperfection in structures, usually found in composite structures. Delamination may arise from several factors, namely in-service events in which foreign objects hit the composite structure and create internal defects, and poor manufacturing that causes initial imperfections. Composite structures are susceptible to compressive loading. Compressive loading leads to an instability phenomenon in the composite structure called buckling. The existence of delamination inside the structure reduces buckling strength. This paper explains the effect of delamination location on buckling strength. The analysis uses a one-dimensional modelling approach with the two-dimensional finite element method.
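
    To make the buckling eigenproblem concrete, here is a toy one-dimensional finite element sketch for a pinned-pinned Euler column with no delamination: standard Euler-Bernoulli elastic and geometric stiffness matrices are assembled and the smallest generalized eigenvalue approximates the Euler load pi^2*EI/L^2; all numbers are illustrative and the paper's delamination modeling is not attempted.

      # Toy linear buckling analysis: K v = lambda * Kg v for a pinned-pinned column.
      import numpy as np
      from scipy.linalg import eigh

      E, I, L, n_el = 1.0, 1.0, 1.0, 20
      le = L / n_el
      ndof = 2 * (n_el + 1)                 # [deflection w, rotation theta] per node
      K = np.zeros((ndof, ndof))
      Kg = np.zeros((ndof, ndof))

      ke = (E * I / le**3) * np.array([[12, 6*le, -12, 6*le],
                                       [6*le, 4*le**2, -6*le, 2*le**2],
                                       [-12, -6*le, 12, -6*le],
                                       [6*le, 2*le**2, -6*le, 4*le**2]])
      kg = (1.0 / (30*le)) * np.array([[36, 3*le, -36, 3*le],      # unit axial force
                                       [3*le, 4*le**2, -3*le, -le**2],
                                       [-36, -3*le, 36, -3*le],
                                       [3*le, -le**2, -3*le, 4*le**2]])

      for e in range(n_el):
          idx = np.arange(2*e, 2*e + 4)
          K[np.ix_(idx, idx)] += ke
          Kg[np.ix_(idx, idx)] += kg

      free = np.setdiff1d(np.arange(ndof), [0, ndof - 2])   # pin w at both ends
      vals = eigh(K[np.ix_(free, free)], Kg[np.ix_(free, free)], eigvals_only=True)
      print("critical load:", vals[0])       # ~9.8696 = pi^2 for E=I=L=1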

  4. FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna

    2016-01-01

    Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/ Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  5. Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu

    2015-01-01

    Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/ Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.

  6. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    PubMed

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server-side for the storage of microarray datasets collected from various resources. The client-side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. MAMA implementation will integrate several analytical methods, including meta-analysis within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.

  7. Culture Representation in Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Gertman; Julie Marble; Steven Novack

    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state-of-the-art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede's (1991) cultural factors and Davis' (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country-specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.

  8. Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr

    2010-03-24

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.

  9. Conserved nucleation sites reinforce the significance of Phi value analysis in protein-folding studies.

    PubMed

    Gianni, Stefano; Jemth, Per

    2014-07-01

    The only experimental strategy to address the structure of folding transition states, the so-called Φ value analysis, relies on the synergy between site directed mutagenesis and the measurement of reaction kinetics. Despite its importance, the Φ value analysis has been often criticized and its power to pinpoint structural information has been questioned. In this hypothesis, we demonstrate that comparing the Φ values between proteins not only allows highlighting the robustness of folding pathways but also provides per se a strong validation of the method. © 2014 International Union of Biochemistry and Molecular Biology.

  10. [Gene expression analyses of kidney biopsies: the European renal cDNA bank--Kröner-Fresenius biopsy bank].

    PubMed

    Cohen, C D; Kretzler, M

    2009-03-01

    Histological analysis of kidney biopsies is an essential part of our current diagnostic workup of patients with renal disease. Besides the already established diagnostic tools, new methods allow extensive analysis of the sample tissue's gene expression. Using results from a European multicenter study on gene expression analysis of renal biopsies, in this review we demonstrate that this novel approach not only expands the scope of so-called basic research but also might supplement future biopsy diagnostics. The goals are improved diagnosis and more specific therapy choice and prognosis estimates.

  11. Ground shake test of the UH-60A helicopter airframe and comparison with NASTRAN finite element model predictions

    NASA Technical Reports Server (NTRS)

    Howland, G. R.; Durno, J. A.; Twomey, W. J.

    1990-01-01

    Sikorsky Aircraft, together with the other major helicopter airframe manufacturers, is engaged in a study to improve the use of finite element analysis to predict the dynamic behavior of helicopter airframes, under a rotorcraft structural dynamics program called DAMVIBS (Design Analysis Methods for VIBrationS), sponsored by NASA Langley. The test plan and test results are presented for a shake test of the UH-60A BLACK HAWK helicopter. A comparison is also presented of test results with results obtained from analysis using a NASTRAN finite element model.

  12. Exploratory Mediation Analysis via Regularization

    PubMed Central

    Serang, Sarfaraz; Jacobucci, Ross; Brimhall, Kim C.; Grimm, Kevin J.

    2017-01-01

    Exploratory mediation analysis refers to a class of methods used to identify a set of potential mediators of a process of interest. Despite its exploratory nature, conventional approaches are rooted in confirmatory traditions, and as such have limitations in exploratory contexts. We propose a two-stage approach called exploratory mediation analysis via regularization (XMed) to better address these concerns. We demonstrate that this approach is able to correctly identify mediators more often than conventional approaches and that its estimates are unbiased. Finally, this approach is illustrated through an empirical example examining the relationship between college acceptance and enrollment. PMID:29225454

  13. Best practices for evaluating single nucleotide variant calling methods for microbial genomics

    PubMed Central

    Olson, Nathan D.; Lund, Steven P.; Colman, Rebecca E.; Foster, Jeffrey T.; Sahl, Jason W.; Schupp, James M.; Keim, Paul; Morrow, Jayne B.; Salit, Marc L.; Zook, Justin M.

    2015-01-01

    Innovations in sequencing technologies have allowed biologists to make incredible advances in understanding biological systems. As experience grows, researchers increasingly recognize that analyzing the wealth of data provided by these new sequencing platforms requires careful attention to detail for robust results. Thus far, much of the scientific community's focus in bacterial genomics has been on evaluating genome assembly algorithms and rigorously validating assembly program performance. Missing, however, is a focus on critical evaluation of variant callers for these genomes. Variant calling is essential for comparative genomics as it yields insights into nucleotide-level organismal differences. Variant calling is a multistep process with a host of potential error sources that may lead to incorrect variant calls. Identifying and resolving these incorrect calls is critical for bacterial genomics to advance. The goal of this review is to provide guidance on validating algorithms and pipelines used in variant calling for bacterial genomics. First, we will provide an overview of the variant calling procedures and the potential sources of error associated with the methods. We will then identify appropriate datasets for use in evaluating algorithms and describe statistical methods for evaluating algorithm performance. As variant calling moves from basic research to the applied setting, standardized methods for performance evaluation and reporting are required; it is our hope that this review provides the groundwork for the development of these standards. PMID:26217378

  14. Time-dependent structural transformation analysis to high-level Petri net model with active state transition diagram.

    PubMed

    Li, Chen; Nagasaki, Masao; Saito, Ayumu; Miyano, Satoru

    2010-04-01

    With an accumulation of in silico data obtained by simulating large-scale biological networks, a new research interest is emerging: elucidating how living organisms function over time in cells. Investigating the dynamic features of current computational models promises a deeper understanding of complex cellular processes. This leads us to develop a method that utilizes structural properties of the model over all simulation time steps. Further, user-friendly overviews of dynamic behaviors can provide great help in understanding the variations of system mechanisms. We propose a novel method for constructing and analyzing a so-called active state transition diagram (ASTD) by using time-course simulation data of a high-level Petri net. Our method includes two new algorithms. The first algorithm extracts a series of subnets (called temporal subnets) reflecting biological components contributing to the dynamics, while retaining positive mathematical qualities. The second one creates an ASTD composed of unique temporal subnets. An ASTD provides users with concise information allowing them to grasp and trace how a key regulatory subnet and/or a network changes with time. The applicability of our method is demonstrated by the analysis of the underlying model for circadian rhythms in Drosophila. Building an ASTD is a useful means of converting a hybrid model dealing with discrete, continuous and more complicated events into finite time-dependent states. Based on an ASTD, various analytical approaches can be applied to obtain new insights into not only systematic mechanisms but also dynamics.
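
    A toy sketch of the diagram-building idea (not the authors' algorithms) follows: collapse a time course of component activity into states and connect consecutive distinct states; the random activity matrix is a stand-in for high-level Petri net simulation output.

      # Build a small state transition diagram from a binary activity time course.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(4)
      activity = rng.random((50, 6)) > 0.5          # 50 time steps x 6 components (toy)

      states = [tuple(np.flatnonzero(row)) for row in activity]
      G = nx.DiGraph()
      for a, b in zip(states, states[1:]):
          if a != b:
              G.add_edge(a, b)                      # transition between distinct states
      print(G.number_of_nodes(), "states,", G.number_of_edges(), "transitions")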

  15. Mutual information estimation reveals global associations between stimuli and biological processes

    PubMed Central

    Suzuki, Taiji; Sugiyama, Masashi; Kanamori, Takafumi; Sese, Jun

    2009-01-01

    Background Although microarray gene expression analysis has become popular, it remains difficult to interpret the biological changes caused by stimuli or variation of conditions. Clustering of genes and associating each group with biological functions are often-used methods. However, such methods only detect partial changes within cell processes. Herein, we propose a method for discovering global changes within a cell by associating observed conditions of gene expression with gene functions. Results To elucidate the association, we introduce a novel feature selection method called Least-Squares Mutual Information (LSMI), which computes mutual information without density estimation, and therefore can detect nonlinear associations within a cell. We demonstrate the effectiveness of LSMI through comparison with existing methods. The results of the application to yeast microarray datasets reveal that non-natural stimuli affect various biological processes, whereas others show no significant relation to specific cell processes. Furthermore, we discover that biological processes can be categorized into four types according to the responses of various stimuli: DNA/RNA metabolism, gene expression, protein metabolism, and protein localization. Conclusion We proposed a novel feature selection method called LSMI, and applied LSMI to mining the association between conditions of yeast and biological processes through microarray datasets. In fact, LSMI allows us to elucidate the global organization of cellular process control. PMID:19208155
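
    A compact numpy sketch of an LSMI-style estimator is shown below under simplifying assumptions (Gaussian kernels with fixed widths and a fixed ridge penalty): the density ratio p(x,y)/(p(x)p(y)) is fitted by regularized least squares and plugged into a squared-loss mutual information estimate.

      # LSMI-style squared-loss mutual information estimate (toy, 1-D variables).
      import numpy as np

      rng = np.random.default_rng(7)
      n = 300
      x = rng.normal(size=n)
      y = x + 0.5 * rng.normal(size=n)      # dependent pair; shuffle y to test the null

      def gauss_gram(a, centers, sigma=1.0):
          return np.exp(-(a[:, None] - centers[None, :])**2 / (2 * sigma**2))

      b = 50                                # number of kernel centers
      Kx = gauss_gram(x, x[:b])             # (n, b)
      Ky = gauss_gram(y, y[:b])

      h = np.mean(Kx * Ky, axis=0)                      # paired samples
      H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2              # all x-y combinations
      alpha = np.linalg.solve(H + 1e-3 * np.eye(b), h)  # ridge-regularized fit
      smi = 0.5 * h @ alpha - 0.5                       # squared-loss MI estimate
      print("SMI estimate:", smi)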

  16. Sparse dictionary learning for resting-state fMRI analysis

    NASA Astrophysics Data System (ADS)

    Lee, Kangjoo; Han, Paul Kyu; Ye, Jong Chul

    2011-09-01

    Recently, there has been increased interest in the use of neuroimaging techniques to investigate what happens in the brain at rest. Functional imaging studies have revealed that default-mode network activity is disrupted in Alzheimer's disease (AD). However, there is no consensus, as yet, on the choice of analysis method for the application of resting-state analysis for disease classification. This paper proposes a novel compressed sensing based resting-state fMRI analysis tool called Sparse-SPM. As the brain's functional systems have been shown to have features of complex networks according to graph-theoretical analysis, we apply a graph model to represent a sparse combination of information flows from a complex-network perspective. In particular, a new concept of a spatially adaptive design matrix has been proposed, implemented via sparse dictionary learning. The proposed approach shows better performance compared to other conventional methods, such as independent component analysis (ICA) and the seed-based approach, in classifying AD patients from normal controls using resting-state analysis.
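
    A generic sketch of sparse dictionary learning on a time-by-voxel matrix with scikit-learn follows; it is in the spirit of the approach rather than the authors' Sparse-SPM tool, and the random data and component count are placeholders.

      # Sparse dictionary learning on a toy (time x voxel) resting-state matrix.
      import numpy as np
      from sklearn.decomposition import DictionaryLearning

      rng = np.random.default_rng(8)
      data = rng.normal(size=(120, 500))    # 120 time points x 500 voxels (toy)

      dl = DictionaryLearning(n_components=10, alpha=1.0, max_iter=20, random_state=0)
      codes = dl.fit_transform(data)        # sparse temporal codes (120 x 10)
      atoms = dl.components_                # spatial dictionary atoms (10 x 500)
      print(codes.shape, atoms.shape)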

  17. A direct-inverse method for transonic and separated flows about airfoils

    NASA Technical Reports Server (NTRS)

    Carlson, K. D.

    1985-01-01

    A direct-inverse technique and computer program called TAMSEP that can be used for the analysis of the flow about airfoils at subsonic and low transonic freestream velocities is presented. The method is based upon a direct-inverse nonconservative full potential inviscid method, a Thwaites laminar boundary layer technique, and the Barnwell turbulent momentum integral scheme; and it is formulated using Cartesian coordinates. Since the method utilizes inverse boundary conditions in regions of separated flow, it is suitable for predicting the flowfield about airfoils having trailing edge separated flow under high lift conditions. Comparisons with experimental data indicate that the method should be a useful tool for applied aerodynamic analyses.

  18. A direct-inverse method for transonic and separated flows about airfoils

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1990-01-01

    A direct-inverse technique and computer program called TAMSEP that can be used for the analysis of the flow about airfoils at subsonic and low transonic freestream velocities is presented. The method is based upon a direct-inverse nonconservative full potential inviscid method, a Thwaites laminar boundary layer technique, and the Barnwell turbulent momentum integral scheme; and it is formulated using Cartesian coordinates. Since the method utilizes inverse boundary conditions in regions of separated flow, it is suitable for predicting the flow field about airfoils having trailing edge separated flow under high lift conditions. Comparisons with experimental data indicate that the method should be a useful tool for applied aerodynamic analyses.

  19. Use of a public telephone hotline to detect urban plague cases.

    PubMed

    Malberg, J A; Pape, W J; Lezotte, D; Hill, A E

    2012-11-01

    Current methods for vector-borne disease surveillance are limited by time and cost. To avoid human infections from emerging zoonotic diseases, it is important that the United States develop cost-effective surveillance systems for these diseases. This study examines the methodology used in the surveillance of a plague epizootic involving tree squirrels (Sciurus niger) in Denver, Colorado, during the summer of 2007. A call-in centre for the public to report dead squirrels was used to direct animal carcass sampling. Staff used these reports to collect squirrel carcasses for the analysis of Yersinia pestis infection. This sampling protocol was analysed at the census tract level using Poisson regression to determine the relationship between higher call volumes in a census tract and the risk of a carcass in that tract testing positive for plague. Over-sampling owing to call volume-directed collection was accounted for by including the number of animals collected as the denominator in the model. The risk of finding an additional plague-positive animal increased as the call volume per census tract increased. The risk in census tracts with >3 calls a month was significantly higher than in those with three or fewer calls in a month. For tracts with 4-5 calls, the relative risk (RR) of an additional plague-positive carcass was 10.08 (95% CI 5.46-18.61); for tracts with 6-8 calls, the RR = 5.20 (2.93-9.20); for tracts with 9-11 calls, the RR = 12.80 (5.85-28.03) and tracts with >11 calls had RR = 35.41 (18.60-67.40). Overall, the call-in centre directed sampling increased the probability of locating plague-infected carcasses in the known Denver epizootic. Further studies are needed to determine the effectiveness of this methodology at monitoring large-scale zoonotic disease occurrence in the absence of a recognized epizootic. © 2012 Blackwell Verlag GmbH.
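
    The kind of model described, Poisson regression of positive carcasses on call-volume category with the number collected as an exposure offset, can be sketched as follows; the simulated tracts, cut points, and effect sizes are invented, and the exponentiated coefficients play the role of the reported rate ratios.

      # Offset Poisson regression sketch: positives ~ call-volume category, offset log(collected).
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(9)
      n_tracts = 400
      calls = rng.poisson(4, n_tracts)
      collected = 1 + rng.poisson(2, n_tracts)
      cat = pd.cut(calls, [-1, 3, 5, 8, 11, np.inf],
                   labels=["<=3", "4-5", "6-8", "9-11", ">11"])
      rate = 0.05 * np.where(calls > 3, 5, 1)            # toy true effect
      positives = rng.poisson(rate * collected)

      X = sm.add_constant(pd.get_dummies(cat, drop_first=True).astype(float))
      fit = sm.GLM(positives, X, family=sm.families.Poisson(),
                   offset=np.log(collected)).fit()
      print(np.exp(fit.params))   # rate ratios relative to the <=3-call category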

  20. Tracking fin whales in the northeast Pacific Ocean with a seafloor seismic network.

    PubMed

    Wilcock, William S D

    2012-10-01

    Ocean bottom seismometer (OBS) networks represent a tool of opportunity to study fin and blue whales. A small OBS network on the Juan de Fuca Ridge in the northeast Pacific Ocean in ~2.3 km of water recorded an extensive data set of 20-Hz fin whale calls. An automated method has been developed to identify arrival times based on instantaneous frequency and amplitude and to locate calls using a grid search even in the presence of a few bad arrival times. When only one whale is calling near the network, tracks can generally be obtained up to distances of ~15 km from the network. When the calls from multiple whales overlap, user supervision is required to identify tracks. The absolute and relative amplitudes of arrivals and their three-component particle motions provide additional constraints on call location but are not useful for extending the distance to which calls can be located. The double-difference method inverts for changes in relative call locations using differences in residuals for pairs of nearby calls recorded on a common station. The method significantly reduces the unsystematic component of the location error, especially when inconsistencies in arrival time observations are minimized by cross-correlation.
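
    A bare-bones sketch of arrival-time grid-search localization follows: pick the grid node whose predicted travel times, after removing a common origin-time shift, best match the observed picks. The station geometry, sound speed, and misfit norm are assumptions, and none of the particle-motion or double-difference machinery is included.

      # 2-D grid-search localization from arrival times (toy, exact picks).
      import numpy as np

      c = 1.48                                                    # sound speed, km/s (assumed)
      stations = np.array([[0, 0], [10, 0], [0, 10], [10, 10]])  # km
      true_src, t0 = np.array([6.0, 3.0]), 2.0
      obs = t0 + np.linalg.norm(stations - true_src, axis=1) / c

      xs = np.linspace(-15, 25, 161)
      best, best_err = None, np.inf
      for x in xs:
          for y in xs:
              tt = np.linalg.norm(stations - [x, y], axis=1) / c
              resid = obs - tt
              err = np.abs(resid - resid.mean()).sum()   # demeaning removes origin time
              if err < best_err:
                  best, best_err = (x, y), err
      print("estimated location:", best)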

  1. 47 CFR 80.225 - Requirements for selective calling equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... selective calling (DSC) equipment and selective calling equipment installed in ship and coast stations, and...-STD, “RTCM Recommended Minimum Standards for Digital Selective Calling (DSC) Equipment Providing... Class ‘D’ Digital Selective Calling (DSC)—Methods of testing and required test results,” March 2003. ITU...

  2. Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.

    PubMed

    Haddaway, Neal R; Rytwinski, Trina

    2018-05-01

    Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on the appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. We believe it is vital to tackle the possible subjectivity in quantitative synthesis described herein to ensure that the extensive efforts expended in producing systematic reviews and other evidence synthesis products are not wasted because of a lack of rigour or reliability in the final synthesis step. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Closed-loop bird-computer interactions: a new method to study the role of bird calls.

    PubMed

    Lerch, Alexandre; Roy, Pierre; Pachet, François; Nagle, Laurent

    2011-03-01

    In the field of songbird research, many studies have shown the role of male songs in territorial defense and courtship. Calling, another important acoustic communication signal, has received much less attention, however, because calls are assumed to contain less information about the emitter than songs do. Birdcall repertoire is diverse, and the role of calls has been found to be significant in the area of social interaction, for example, in pair, family, and group cohesion. However, standard methods for studying calls do not allow precise and systematic study of their role in communication. We propose herein a new method to study bird vocal interaction. A closed-loop computer system interacts with canaries, Serinus canaria, by (1) automatically classifying two basic types of canary vocalization, single versus repeated calls, as they are produced by the subject, and (2) responding with a preprogrammed call type recorded from another bird. This computerized animal-machine interaction requires no human interference. We show first that the birds do engage in sustained interactions with the system, by studying the rate of single and repeated calls for various programmed protocols. We then show that female canaries differentially use single and repeated calls. First, they produce significantly more single than repeated calls, and second, the rate of single calls is associated with the context in which they interact, whereas repeated calls are context independent. This experiment is the first illustration of how closed-loop bird-computer interaction can be used productively to study social relationships. © Springer-Verlag 2010

  4. Structural Identifiability of Dynamic Systems Biology Models

    PubMed Central

    Villaverde, Alejandro F.

    2016-01-01

    A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas. PMID:27792726
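
    A miniature sympy sketch of the underlying idea follows: append the parameters to the state vector as constant states, stack Lie derivatives of the output, and check the rank of the resulting Jacobian. The one-state toy model, in which only the product p1*p2 is identifiable, is an invented example and not the STRIKE-GOLDD implementation.

      # Structural identifiability via Lie derivatives and a Jacobian rank test (toy).
      import sympy as sp

      x, p1, p2 = sp.symbols("x p1 p2")
      states = [x, p1, p2]                 # parameters as constant extra states
      f = [-p1 * p2 * x, 0, 0]             # dx/dt = -p1*p2*x; dp/dt = 0
      h = x                                # measured output

      def lie(expr):
          return sum(sp.diff(expr, s) * fi for s, fi in zip(states, f))

      lies = [h]
      for _ in range(len(states) - 1):
          lies.append(sp.simplify(lie(lies[-1])))

      J = sp.Matrix([[sp.diff(term, s) for s in states] for term in lies])
      print(J.rank())   # rank 2 < 3: only the product p1*p2 is identifiable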

  5. Effects of high intensity exercise on isoelectric profiles and SDS-PAGE mobility of erythropoietin.

    PubMed

    Voss, S; Lüdke, A; Romberg, S; Schänzer, E; Flenker, U; deMarees, M; Achtzehn, S; Mester, J; Schänzer, W

    2010-06-01

    Exercise-induced proteinuria is a common phenomenon in high-performance sports. Based on the appearance of so-called "effort urines" in routine doping analysis, the purpose of this study was to investigate the influence of exercise-induced proteinuria on IEF profiles and SDS-PAGE relative mobility values (rMVs) of endogenous human erythropoietin (EPO). Twenty healthy subjects performed cycle-ergometer exercise until exhaustion. VO2max, blood lactate, urinary proteins and urinary creatinine were analysed to evaluate exercise performance and proteinuria. IEF and SDS-PAGE analyses were performed to test for differences in the electrophoretic behaviour of endogenous EPO before and after exercise. All subjects showed increased protein/creatinine ratios after exercise (from 8.8 ± 5.2 to 26.1 ± 14.4). IEF analysis demonstrated an elevation of the relative amount of basic band areas (from 13.9 ± 11.3 to 36.4 ± 12.6). Using SDS-PAGE analysis we observed a decrease in rMVs after exercise and no shift in the direction of the recombinant human EPO (rhEPO) region (from 0.543 ± 0.013 to 0.535 ± 0.012). Following the identification criteria of the World Anti-Doping Agency (WADA), all samples were negative. The implementation of the SDS-PAGE method represents a good solution to distinguish between results influenced by so-called effort urines and results of rhEPO abuse. Thus this method can be used to confirm adverse analytical findings.

  6. Training Needs Analysis and Evaluation for New Technologies through the Use of Problem-Based Inquiry

    ERIC Educational Resources Information Center

    Casey, Matthew Scott; Doverspike, Dennis

    2005-01-01

    The analysis of calls to a help desk, in this case calls to a computer help desk, can serve as a rich source of information on the real world problems that individuals are having with the implementation of a new technology. Thus, we propose that an analysis of help desk calls, a form of problem-based inquiry, can serve as a fast and low cost means…

  7. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    PubMed Central

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-01-01

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. Either targets or virtual points corresponding to some reconstructable feature in the scene are used as feature points. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis. PMID:28029121

  8. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    PubMed

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. Either targets or virtual points corresponding to some reconstructable feature in the scene are used as feature points. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
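
    A minimal numpy sketch of the baseline comparison at the heart of the method follows: compute distances between the same feature-point pairs in two epochs and report the changes, with no registration step; the coordinates are invented stand-ins for extracted targets or brick centres.

      # Registration-free change detection via baseline length differences (toy points).
      import numpy as np
      from itertools import combinations

      epoch1 = np.array([[0.0, 0.0, 0.0], [2.0, 0.1, 0.0],
                         [2.1, 1.9, 0.1], [0.1, 2.0, 0.0]])
      epoch2 = epoch1.copy()
      epoch2[2] += [0.03, 0.0, -0.02]       # simulated structural displacement

      for i, j in combinations(range(len(epoch1)), 2):
          b1 = np.linalg.norm(epoch1[i] - epoch1[j])
          b2 = np.linalg.norm(epoch2[i] - epoch2[j])
          print(f"baseline {i}-{j}: change = {b2 - b1:+.4f}")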

  9. Harmonic component detection: Optimized Spectral Kurtosis for operational modal analysis

    NASA Astrophysics Data System (ADS)

    Dion, J.-L.; Tawfiq, I.; Chevallier, G.

    2012-01-01

    This work is a contribution in the field of Operational Modal Analysis (OMA) to identify the modal parameters of mechanical structures using only measured responses. The study deals with structural responses coupled with harmonic components that are amplitude and frequency modulated over a short range, a common combination for mechanical systems with engines and other rotating machines in operation. These harmonic components generate misleading data interpreted erroneously by the classical methods used in OMA. The present work attempts to differentiate maxima in spectra stemming from harmonic components and structural modes. The proposed detection method is based on the so-called Optimized Spectral Kurtosis and is compared with other definitions of Spectral Kurtosis described in the literature. After a parametric study of the method, a critical study is performed on numerical simulations and then on an experimental structure in operation in order to assess the method's performance.
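
    As a hedged illustration of the underlying statistic (plain Spectral Kurtosis, not the paper's Optimized variant), the sketch below flags frequency bins whose STFT magnitude is nearly constant over time: a deterministic harmonic drives SK toward -1, while a broadband random response gives SK near 0. The test signal and the -0.5 threshold are assumptions of the sketch.

    ```python
    # Sketch of plain Spectral Kurtosis: for a deterministic harmonic, |X(f,t)|
    # is nearly constant over time, so SK(f) -> -1; broadband noise gives ~0.
    import numpy as np
    from scipy.signal import stft

    fs = 1024.0
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(0)
    x = np.sin(2 * np.pi * 80 * t) + rng.standard_normal(t.size)  # tone + noise

    f, _, X = stft(x, fs=fs, nperseg=256)
    p2 = np.mean(np.abs(X) ** 2, axis=1)
    p4 = np.mean(np.abs(X) ** 4, axis=1)
    sk = p4 / p2**2 - 2.0                 # SK for circular complex spectra

    print("likely harmonic components near (Hz):", f[sk < -0.5])
    ```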

  10. Using Virtual Social Networks for Case Finding in Clinical Studies: An Experiment from Adolescence, Brain, Cognition, and Diabetes Study.

    PubMed

    Pourabbasi, Ata; Farzami, Jalal; Shirvani, Mahbubeh-Sadat Ebrahimnegad; Shams, Amir Hossein; Larijani, Bagher

    2017-01-01

    One of the main usages of social networks in clinical studies is facilitating the process of sampling and case finding for scientists. The main focus of this study is on comparing two different sampling methods, phone calls and a social network, for study purposes. One of the researchers called 214 families of children with diabetes over 90 days. After this period, phone calls stopped, and the team communicated with families through Telegram, a virtual social network, for 30 days. The number of children who participated in the study was evaluated. Although the Telegram period was 60 days shorter than the phone call period, researchers found that the proportion of participants recruited via Telegram (17.6%) did not differ significantly from that of the families who were phone called (12.9%). Using social networks can be suggested as a beneficial method for local researchers who look for easier sampling, ways of winning their samples' trust, follow-up with the procedure, and an easy-access database.

  11. Poor methodological detail precludes experimental repeatability and hampers synthesis in ecology.

    PubMed

    Haddaway, Neal R; Verhoeven, Jos T A

    2015-10-01

    Despite the scientific method's central tenets of reproducibility (the ability to obtain similar results when repeated) and repeatability (the ability to replicate an experiment based on methods described), published ecological research continues to fail to provide sufficient methodological detail to allow either repeatability or verification. Recent systematic reviews highlight the problem, with one example demonstrating that an average of 13% of studies per year (±8.0 [SD]) failed to report sample sizes. The problem affects the ability to verify the accuracy of any analysis, to repeat methods used, and to assimilate the study findings into powerful and useful meta-analyses. The problem is common in a variety of ecological topics examined to date, and despite previous calls for improved reporting and metadata archiving, which could indirectly alleviate the problem, there is no indication of an improvement in reporting standards over time. Here, we call on authors, editors, and peer reviewers to consider repeatability as a top priority when evaluating research manuscripts, bearing in mind that legacy and integration into the evidence base can drastically improve the impact of individual research reports.

  12. Features in simulation of crystal growth using the hyperbolic PFC equation and the dependence of the numerical solution on the parameters of the computational grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starodumov, Ilya; Kropotin, Nikolai

    2016-08-10

    We investigate the three-dimensional mathematical model of crystal growth called PFC (Phase Field Crystal) in a hyperbolic modification. This model is also called the modified PFC model (the original PFC model is formulated in parabolic form) and allows one to describe both slow and rapid crystallization processes on atomic length scales and on diffusive time scales. The modified PFC model is described by a partial differential equation of sixth order in space and second order in time. The solution of this equation is possible only by numerical methods. Previously, the authors created a software package for the solution of the Phase Field Crystal problem, based on the method of isogeometric analysis (IGA) and the PetIGA program library. During further investigation it was found that the quality of the solution can depend strongly on the discretization parameters of the numerical method. In this report, we show the features that should be taken into account when constructing the computational grid for the numerical simulation.
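
    For orientation, the sketch below advances a one-dimensional *parabolic* PFC field with a semi-implicit Fourier-spectral step; the hyperbolic modification studied in the report adds a second-order time derivative term on top of this. The grid size, time step and parameter r are illustrative assumptions, and they are exactly the kind of discretization choices on which, per the report, solution quality can depend.

    ```python
    # Semi-implicit spectral step for the 1D parabolic PFC equation
    #   dphi/dt = lap[ (r + (1 + lap)^2) phi + phi^3 ]
    # (the hyperbolic modification adds tau * d2phi/dt2).
    import numpy as np

    N, L, dt, r = 256, 32 * np.pi, 0.05, -0.25
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
    lin = -k**2 * (r + (1 - k**2) ** 2)      # linear operator in Fourier space

    rng = np.random.default_rng(1)
    phi = 0.1 * rng.standard_normal(N)       # noisy initial density field

    for _ in range(2000):
        nonlin = -k**2 * np.fft.fft(phi**3)  # explicit nonlinear term
        phi_hat = (np.fft.fft(phi) + dt * nonlin) / (1 - dt * lin)
        phi = np.real(np.fft.ifft(phi_hat))  # implicit treatment of lin term

    print("field range after relaxation:", phi.min(), phi.max())
    ```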

  13. Simulation-based sensitivity analysis for non-ignorably missing data.

    PubMed

    Yin, Peng; Shi, Jian Q

    2017-01-01

    Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) may depend on assumptions or parameters (input) about the missing data, i.e. the missing data mechanism. We refer to models subject to this input uncertainty as sensitivity models. To make conventional sensitivity analysis more useful in practice we need to define some simple and interpretable statistical quantities to assess the sensitivity models and make evidence-based analysis. We propose a novel approach in this paper to investigate the plausibility of each missing data mechanism model assumption, by comparing the simulated datasets from various MNAR models with the observed data non-parametrically, using the K-nearest-neighbour distances. Some asymptotic theory has also been provided. A key step of this method is to attach a plausibility evaluation to each sensitivity parameter, selecting plausible values and rejecting unlikely ones, instead of considering all proposed values of sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
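
    A toy sketch of the plausibility step follows, with a stand-in data generator (the actual method simulates completed datasets from each posited MNAR model): each candidate sensitivity parameter is scored by the mean K-nearest-neighbour distance from the observed points to the simulated ones, and small distances mark plausible values.

    ```python
    # Score candidate sensitivity parameters by kNN distance between
    # observed data and data simulated under each parameter value.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    observed = rng.normal(0.3, 1.0, size=500)   # stand-in observed outcomes

    def simulate(delta, n=500):
        # Stand-in generator: in practice, completed datasets are simulated
        # from the posited MNAR model indexed by the sensitivity parameter.
        return rng.normal(delta, 1.0, size=n)

    def knn_distance(sim, obs, k=5):
        tree = cKDTree(sim[:, None])
        dist, _ = tree.query(obs[:, None], k=k)
        return dist.mean()                      # small => plausible delta

    for delta in [0.0, 0.3, 0.6, 1.0]:
        score = knn_distance(simulate(delta), observed)
        print(f"delta={delta:.1f}  mean kNN distance={score:.3f}")
    ```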

  14. Malware analysis using visualized image matrices.

    PubMed

    Han, KyoungSoo; Kang, BooJoong; Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. In particular, our proposed methods are applicable to packed malware samples, by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples, and the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracy of 0.9896 and 0.9732, respectively.
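
    A minimal sketch of the general idea follows; the hash-based pixel layout, the 64x64 image size and the min/max similarity measure are assumptions of this sketch, not the paper's exact construction. Opcode trigrams are hashed to a pixel coordinate and colour increment, and two samples are compared by pixel-wise shared intensity.

    ```python
    # Turn an opcode sequence into an RGB "image matrix" and compare two
    # samples by the fraction of pixel intensity they share.
    import hashlib
    import numpy as np

    SIZE = 64

    def opcode_image(opcodes, size=SIZE):
        img = np.zeros((size, size, 3), dtype=np.uint32)
        for i in range(len(opcodes) - 2):
            h = hashlib.md5(" ".join(opcodes[i:i + 3]).encode()).digest()
            x, y = h[0] % size, h[1] % size          # trigram -> pixel
            img[x, y] += np.array([h[2], h[3], h[4]], dtype=np.uint32)
        return np.minimum(img, 255).astype(np.uint8)

    def similarity(a, b):
        """Fraction of intensity shared by the two images, in [0, 1]."""
        a, b = a.astype(float), b.astype(float)
        return np.minimum(a, b).sum() / max(np.maximum(a, b).sum(), 1.0)

    sample1 = ["push", "mov", "call", "add", "mov", "call", "ret"] * 40
    sample2 = ["push", "mov", "call", "sub", "mov", "call", "ret"] * 40
    print("similarity:", similarity(opcode_image(sample1), opcode_image(sample2)))
    ```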

  15. Strelka: accurate somatic small-variant calling from sequenced tumor-normal sample pairs.

    PubMed

    Saunders, Christopher T; Wong, Wendy S W; Swamy, Sajani; Becq, Jennifer; Murray, Lisa J; Cheetham, R Keira

    2012-07-15

    Whole genome and exome sequencing of matched tumor-normal sample pairs is becoming routine in cancer research. The consequent increased demand for somatic variant analysis of paired samples requires methods specialized to model this problem so as to sensitively call variants at any practical level of tumor impurity. We describe Strelka, a method for somatic SNV and small indel detection from sequencing data of matched tumor-normal samples. The method uses a novel Bayesian approach which represents continuous allele frequencies for both tumor and normal samples, while leveraging the expected genotype structure of the normal. This is achieved by representing the normal sample as a mixture of germline variation with noise, and representing the tumor sample as a mixture of the normal sample with somatic variation. A natural consequence of the model structure is that sensitivity can be maintained at high tumor impurity without requiring purity estimates. We demonstrate that the method has superior accuracy and sensitivity on impure samples compared with approaches based on either diploid genotype likelihoods or general allele-frequency tests. The Strelka workflow source code is available at ftp://strelka@ftp.illumina.com/. csaunders@illumina.com
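
    A toy illustration of the joint tumor-normal allele-frequency idea follows (this is not Strelka's actual model, priors or error handling): the normal sample is scored against diploid genotypes, the tumor against a continuous grid of allele frequencies, and the posterior that the tumor frequency departs from the normal genotype is reported. Counts, the error rate and the flat priors are assumptions of the sketch.

    ```python
    # Toy joint tumour-normal somatic scoring with binomial read-count
    # likelihoods; real callers add informative somatic priors and noise models.
    import numpy as np
    from scipy.stats import binom

    err = 0.01
    genotype_freqs = [err, 0.5, 1 - err]     # ref/het/hom normal allele freqs

    def somatic_posterior(alt_n, depth_n, alt_t, depth_t):
        grid = np.linspace(0.0, 1.0, 101)    # continuous tumour allele freq
        somatic = not_somatic = 0.0
        for f_n in genotype_freqs:           # uniform genotype prior (toy)
            p_norm = binom.pmf(alt_n, depth_n, f_n) / len(genotype_freqs)
            somatic += p_norm * binom.pmf(alt_t, depth_t, grid).mean()
            not_somatic += p_norm * binom.pmf(alt_t, depth_t, f_n)
        return somatic / (somatic + not_somatic)

    # 30% tumour allele fraction over a clean normal: strong somatic evidence
    print(somatic_posterior(alt_n=0, depth_n=40, alt_t=12, depth_t=40))
    ```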

  16. Experimental Researches on the Durability Indicators and the Physiological Comfort of Fabrics using the Principal Component Analysis (PCA) Method

    NASA Astrophysics Data System (ADS)

    Hristian, L.; Ostafe, M. M.; Manea, L. R.; Apostol, L. L.

    2017-06-01

    The work examined the grouping of combed wool fabrics intended for the manufacturing of outer garments in terms of the values of their durability and physiological comfort indices, using the mathematical model of Principal Component Analysis (PCA). PCA, as applied in this study, is a descriptive method for multivariate, multi-dimensional data that aims to reduce the number of variables (columns) of the data matrix, in a controlled way, to as few as two or three. Therefore, based on the information about each group/assortment of fabrics, the goal is to replace nine inter-correlated variables with only two or three new variables called components. The PCA target is to extract the smallest number of components which recover most of the total information contained in the initial data.
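
    A minimal sketch of the reduction described above, with random stand-ins for the nine measured fabric indices (the correlation structure and sample count are illustrative assumptions):

    ```python
    # Compress nine inter-correlated durability/comfort indices to two
    # principal components and report the variance they recover.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    base = rng.standard_normal((40, 3))
    X = np.hstack([base + 0.1 * rng.standard_normal((40, 3)) for _ in range(3)])
    # X: 40 fabric samples x 9 inter-correlated indices

    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)
    print("variance recovered by two components:",
          pca.explained_variance_ratio_.sum())
    ```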

  17. Generalized Structured Component Analysis with Uniqueness Terms for Accommodating Measurement Error

    PubMed Central

    Hwang, Heungsun; Takane, Yoshio; Jung, Kwanghee

    2017-01-01

    Generalized structured component analysis (GSCA) is a component-based approach to structural equation modeling (SEM), where latent variables are approximated by weighted composites of indicators. It has no formal mechanism to incorporate errors in indicators, which in turn renders components prone to the errors as well. We propose to extend GSCA to account for errors in indicators explicitly. This extension, called GSCAM, considers both common and unique parts of indicators, as postulated in common factor analysis, and estimates a weighted composite of indicators with their unique parts removed. Adding such unique parts or uniqueness terms serves to account for measurement errors in indicators in a manner similar to common factor analysis. Simulation studies are conducted to compare parameter recovery of GSCAM and existing methods. These methods are also applied to fit a substantively well-established model to real data. PMID:29270146

  18. Analysis of Big Data in Gait Biomechanics: Current Trends and Future Directions.

    PubMed

    Phinyomark, Angkoon; Petri, Giovanni; Ibáñez-Marcelo, Esther; Osis, Sean T; Ferber, Reed

    2018-01-01

    The increasing amount of data in biomechanics research has greatly increased the importance of developing advanced multivariate analysis and machine learning techniques, which are better able to handle "big data". Consequently, advances in data science methods will expand the knowledge for testing new hypotheses about biomechanical risk factors associated with walking and running gait-related musculoskeletal injury. This paper begins with a brief introduction to an automated three-dimensional (3D) biomechanical gait data collection system: 3D GAIT, followed by how the studies in the field of gait biomechanics fit the quantities in the 5 V's definition of big data: volume, velocity, variety, veracity, and value. Next, we provide a review of recent research and development in multivariate and machine learning methods-based gait analysis that can be applied to big data analytics. These modern biomechanical gait analysis methods include several main modules such as initial input features, dimensionality reduction (feature selection and extraction), and learning algorithms (classification and clustering). Finally, a promising big data exploration tool called "topological data analysis" and directions for future research are outlined and discussed.

  19. Analysis of ChIP-seq Data in R/Bioconductor.

    PubMed

    de Santiago, Ines; Carroll, Thomas

    2018-01-01

    The development of novel high-throughput sequencing methods for ChIP (chromatin immunoprecipitation) has provided a very powerful tool to study gene regulation in multiple conditions at unprecedented resolution and scale. Proactive quality control and appropriate data analysis techniques are of critical importance to extract the most meaningful results from the data. Over recent years, an array of R/Bioconductor tools has been developed allowing researchers to process and analyze ChIP-seq data. This chapter provides an overview of the methods available to analyze ChIP-seq data based primarily on software packages from the open-source Bioconductor project. Protocols described in this chapter cover basic steps including data alignment, peak calling, quality control and data visualization, as well as more complex methods such as the identification of differentially bound regions and functional analyses to annotate regulatory regions. The steps in the data analysis process were demonstrated on publicly available data sets and will serve as a demonstration of the computational procedures routinely used for the analysis of ChIP-seq data in R/Bioconductor, from which readers can construct their own analysis pipelines.

  20. Missing RRI interpolation for HRV analysis using locally-weighted partial least squares regression.

    PubMed

    Kamata, Keisuke; Fujiwara, Koichi; Yamakawa, Toshiki; Kano, Manabu

    2016-08-01

    The R-R interval (RRI) fluctuation in electrocardiogram (ECG) is called heart rate variability (HRV). Since HRV reflects autonomic nervous function, HRV-based health monitoring services, such as stress estimation, drowsy driving detection, and epileptic seizure prediction, have been proposed. In these HRV-based health monitoring services, precise R wave detection from ECG is required; however, R waves cannot always be detected due to ECG artifacts. Missing RRI data should be interpolated appropriately for HRV analysis. The present work proposes a missing RRI interpolation method by utilizing just-in-time (JIT) modeling. The proposed method adopts locally weighted partial least squares (LW-PLS) for RRI interpolation, which is a well-known JIT modeling method used in the field of process control. The usefulness of the proposed method was demonstrated through a case study of real RRI data collected from healthy persons. The proposed JIT-based interpolation method could improve the interpolation accuracy in comparison with a static interpolation method.
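
    The just-in-time idea can be sketched with locally weighted *ridge* regression standing in for LW-PLS (a deliberate simplification): for each query, nearby historical samples are weighted by similarity and a local model is fit on demand to predict the missing interval. The neighbour features, bandwidth and synthetic RRI series are assumptions of the sketch.

    ```python
    # Just-in-time interpolation: fit a locally weighted linear model around
    # each query instead of keeping one global static model.
    import numpy as np

    def jit_predict(X, y, x_query, bandwidth=0.05, lam=1e-3):
        w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * bandwidth**2))
        Xa = np.hstack([X, np.ones((len(X), 1))])      # add intercept column
        A = Xa.T @ (w[:, None] * Xa) + lam * np.eye(Xa.shape[1])
        beta = np.linalg.solve(A, Xa.T @ (w * y))
        return np.append(x_query, 1.0) @ beta

    rng = np.random.default_rng(0)
    rri = 0.8 + 0.05 * np.sin(np.arange(200) / 6) + 0.01 * rng.standard_normal(200)
    X = np.column_stack([rri[:-2], rri[2:]])   # (previous, next) neighbours
    y = rri[1:-1]                              # the beat in between

    print("interpolated RRI:", jit_predict(X, y, np.array([0.83, 0.84])))
    ```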

  1. Determination of the transmission coefficients for quantum structures using FDTD method.

    PubMed

    Peng, Yangyang; Wang, Xiaoying; Sui, Wenquan

    2011-12-01

    The purpose of this work is to develop a simple method to incorporate quantum effects in traditional finite-difference time-domain (FDTD) simulators, which would make it possible to co-simulate systems that include quantum structures and traditional components. In this paper, the tunneling transmission coefficient is calculated by solving the time-domain Schrödinger equation with a developed FDTD technique, called the FDTD-S method. To validate the feasibility of the method, a simple resonant tunneling diode (RTD) structure model has been simulated using the proposed method. The good agreement between the numerical and analytical results proves its accuracy. The effectiveness and accuracy of this approach make it a potential method for analysis and design of hybrid systems that include quantum structures and traditional components.
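
    The kind of update an FDTD scheme for the time-domain Schrödinger equation builds on can be sketched in one dimension with a real/imaginary leapfrog step; the barrier height, grid, packet parameters and units (hbar = m = 1) are illustrative assumptions, not the paper's RTD model.

    ```python
    # 1D time-domain Schrodinger equation by finite differences: tunneling
    # of a wave packet (energy ~1.1) through a barrier of height 2.
    import numpy as np

    nx, dx, dt, steps = 600, 0.1, 0.001, 10_000
    x = np.arange(nx) * dx
    V = np.where((30.0 < x) & (x < 31.0), 2.0, 0.0)   # potential barrier

    psi = np.exp(-((x - 20.0) ** 2) / 4.0) * np.exp(1j * 1.5 * x)
    pr, pi = psi.real.copy(), psi.imag.copy()

    def H(u):
        """Hamiltonian (hbar = m = 1) with a finite-difference Laplacian."""
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        return -0.5 * lap + V * u

    for _ in range(steps):            # leapfrog update of real/imag parts
        pr += dt * H(pi)
        pi -= dt * H(pr)

    prob = pr**2 + pi**2
    print("transmission coefficient ~", prob[x > 31.0].sum() / prob.sum())
    ```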

  2. Assets as a Socioeconomic Status Index: Categorical Principal Components Analysis vs. Latent Class Analysis.

    PubMed

    Sartipi, Majid; Nedjat, Saharnaz; Mansournia, Mohammad Ali; Baigi, Vali; Fotouhi, Akbar

    2016-11-01

    Some variables, like Socioeconomic Status (SES), cannot be directly measured; instead, so-called 'latent variables' are measured indirectly through tangible items. There are different methods for measuring latent variables, such as data reduction methods, e.g. Principal Components Analysis (PCA), and Latent Class Analysis (LCA). The purpose of our study was to measure an assets index - as a representative of SES - through two methods, Non-Linear PCA (NLPCA) and LCA, and to compare them for choosing the most appropriate model. This was a cross-sectional study in which 1995 respondents completed questionnaires about their assets in Tehran. The data were analyzed by SPSS 19 (CATPCA command) and SAS 9.2 (PROC LCA command) to estimate their socioeconomic status. The results were compared based on the Intra-class Correlation Coefficient (ICC). The 6 classes derived from LCA based on BIC were highly consistent with the 6 classes from CATPCA (Categorical PCA) (ICC = 0.87, 95%CI: 0.86-0.88). There is no gold standard to measure SES. Therefore, it is not possible to say definitively that a specific method is better than another one. LCA is a complicated method that presents detailed information about latent variables and requires one assumption (local independence), while NLPCA is a simple method, which requires more assumptions. Generally, NLPCA seems to be an acceptable method of analysis because of its simplicity and high agreement with LCA.

  3. Unsupervised Bayesian linear unmixing of gene expression microarrays.

    PubMed

    Bazot, Cécile; Dobigeon, Nicolas; Tourneret, Jean-Yves; Zaas, Aimee K; Ginsburg, Geoffrey S; Hero, Alfred O

    2013-03-19

    This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high dimensional assays like gene expression microarrays. The basis for uBLU is a Bayesian model for the data samples which are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. The particularity of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters. Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non-negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real time-evolving gene expression dataset from a recent viral challenge study in which individuals have been inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here. The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with clinical symptom scores collected during the study. Using a constrained model allows recovery of all the inflammatory genes in a single factor.
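
    The model structure (not the Gibbs sampler) can be conveyed with a short generative sketch: nonnegative factor signatures, and per-sample scores constrained to the probability simplex via a Dirichlet draw. All dimensions and distributions below are illustrative assumptions.

    ```python
    # Generative sketch of the uBLU model structure: data = simplex-constrained
    # scores times nonnegative factor signatures, plus noise.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_genes, n_factors = 50, 200, 3

    factors = rng.gamma(2.0, 1.0, size=(n_factors, n_genes))    # positive signatures
    scores = rng.dirichlet(np.ones(n_factors), size=n_samples)  # rows sum to 1
    data = scores @ factors + 0.1 * rng.standard_normal((n_samples, n_genes))

    print("each sample is a convex mixture of factors:", scores.sum(axis=1)[:5])
    ```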

  4. The Leadership Efficacy of Graduates of North Carolina School of Science and Mathematics: A Mixed-Methods Analysis

    NASA Astrophysics Data System (ADS)

    Mason, Letita Renee

    This study examines the leadership efficacy among graduates of NCSSM from the classes of 2000-07 as the unit of analysis. How do NCSSM graduates' perceptions of their leadership efficacy align with research on non-cognitive variables as indicators of academic performance, using the unit of analysis as a performance outcome? This study is based on the theoretical construct that non-cognitive psychological (also called motivational) factors are core components of leadership self-efficacy, indicative of NCSSM graduates (who had high academic performance and attained STEM degrees). It holds promise for increasing both student interest and diversity in the race to strengthen the STEM pipeline. In this study the Hannah and Avolio (2013) Mind Garden Leadership Efficacy Questionnaire (LEQ) is used. The LEQ is a battery of three instruments designed to assess individual perceptions of personal leadership efficacy across three constructs, via one survey tool. In this mixed-methods analysis, a quantitative phase was conducted to collect the data captured by the Mind Garden Leadership Efficacy Questionnaire. A post hoc qualitative analysis was conducted in the second phase of the data analysis, using the Trichotomous-Square Test methodology (with an associated qualitative researcher-designed Inventive Investigative Instrument). The results from the study validated the alternative hypothesis [H1], which proposed that there are no significant differences in the perception of Leadership Efficacy by the North Carolina School of Science and Mathematics alumni from the classes of 2000-07 in terms of their overall "Leadership Efficacy" with regard to: Execution or "Leadership Action Efficacy"; Capacity or "Leader Means Efficacy"; and Environment or "Leader Self-Regulation Efficacy". The results also led to the development of a new assessment tool called the Mason Leadership Efficacy Model.

  5. Hydrogen Ordering in Hexagonal Intermetallic AB5 Type Compounds

    NASA Astrophysics Data System (ADS)

    Sikora, W.; Kuna, A.

    2008-04-01

    Intermetallic compounds of the AB5 type (A = rare-earth atom, B = transition metal) are known to store reversibly large amounts of hydrogen and as such are discussed in this work. It was shown that the alloy cycling stability can be significantly improved by employing the so-called non-stoichiometric compounds AB5+x, which is why an analysis of the change of structure turned out to be interesting. A tendency for ordering of hydrogen atoms is one of the most intriguing problems for the unsaturated hydrides. The symmetry analysis method, in the frame of the theory of space groups and their representations, gives the opportunity to find all possible transformations of the parent structure. In this work the symmetry analysis method was applied to the AB5+x structure type (P6/mmm parent symmetry space group). All possible ordering types and accompanying atom displacements were investigated in positions 1a, 2c, 3g (fully occupied in stoichiometric compounds AB5), in positions 2e, 6l (where atom B could appear in non-stoichiometric compounds) and also 4h, 6m, 6k, 12n, 12o, which could be partly occupied by hydrogen as a result of hydride formation. An analysis was carried out of all possible structures of lower symmetry following from P6/mmm for wave vector k=(0, 0, 0). The way of obtaining the structure described by the P63mc space group, with a doubled cell along the z-axis and wave vector k=(0, 0, 0.5), as suggested in the work of Latroche et al., is also discussed by means of the symmetry analysis. The analysis was carried out with the computer program MODY. The program calculates the so-called basis vectors of irreducible representations of a given symmetry group, which can be used for calculation of possible ordering modes.

  6. The Effect of Drama-Based Pedagogy on PreK-16 Outcomes: A Meta-Analysis of Research from 1985 to 2012

    ERIC Educational Resources Information Center

    Lee, Bridget Kiger; Patall, Erika A.; Cawthon, Stephanie W.; Steingut, Rebecca R.

    2015-01-01

    The President's Committee on the Arts and Humanities report heartily supported arts integration. However, the President's Committee called for a better understanding of the dimensions of quality and best practices. One promising arts integration method is drama-based pedagogy (DBP). A comprehensive search of the literature revealed 47…

  7. Application of the Organic Synthetic Designs to Astrobiology

    NASA Astrophysics Data System (ADS)

    Kolb, V. M.

    2009-12-01

    In this paper we propose a synthesis for the heterocyclic compounds and the insoluble materials found on meteorites. Our synthetic scheme involves the reaction of sugars and amino acids, the so-called Maillard reaction. We have developed this scheme based on the combined analysis of the regular and retrosynthetic organic synthetic principles. The merits of these synthetic methods for prebiotic design are addressed.

  8. Motivation and Dual Enrollment: An Analysis of the Motivation of High School Students to Participate in Dual Enrollment in Association of Christian Schools International Schools

    ERIC Educational Resources Information Center

    Salerno, Mitchell Acri

    2011-01-01

    A phenomenological study utilizing the Consensual Qualitative Research method was conducted to understand the motivation of high school students dually enrolled in high school and college, commonly referred to as dual enrollment, in relation to the Self-Determination Theory and to connect this motivation to research on personal calling. This…

  9. Patrol force allocation for law enforcement: An introductory planning guide

    NASA Technical Reports Server (NTRS)

    Sohn, R. L.; Kennedy, R. D.

    1976-01-01

    Previous and current methods for analyzing police patrol forces are reviewed and discussed. The steps in developing an allocation analysis procedure are defined, including the prediction of the rate of calls for service, determination of the number of patrol units needed, designing sectors, and analyzing dispatch strategies. Existing computer programs used for this purpose are briefly described, and some results of their application are given.
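
    One classic step in such an allocation analysis is sizing the patrol force against the predicted call-for-service rate. The sketch below uses the standard Erlang-C queueing formula (a common choice for this step, not necessarily the guide's own procedure); the call rate, service time and 10% queueing target are illustrative assumptions.

    ```python
    # Find the fewest patrol units keeping the chance of a queued call low,
    # using the Erlang-C (M/M/c) waiting probability.
    from math import factorial

    def erlang_c(units, offered_load):
        """Probability an arriving call must wait (M/M/c queue)."""
        rho = offered_load / units
        if rho >= 1.0:
            return 1.0
        top = offered_load**units / factorial(units) / (1 - rho)
        bottom = sum(offered_load**k / factorial(k) for k in range(units)) + top
        return top / bottom

    calls_per_hour, hours_per_call = 12.0, 0.75
    load = calls_per_hour * hours_per_call        # offered load in Erlangs
    units = 1
    while erlang_c(units, load) > 0.10:           # <= 10% of calls queued
        units += 1
    print(f"patrol units needed: {units}")
    ```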

  10. Particle Analysis Pitfalls

    NASA Technical Reports Server (NTRS)

    Hughes, David; Dazzo, Tony

    2007-01-01

    This viewgraph presentation reviews the use of particle analysis to assist in preparing for the 4th Hubble Space Telescope (HST) Servicing mission. During this mission the Space Telescope Imaging Spectrograph (STIS) will be repaired. The particle analysis consisted of: finite element mesh creation; black-body viewfactors generated using I-DEAS TMG Thermal Analysis; grey-body viewfactors calculated using a Markov method; particle distribution modeled using an iterative (and time-consuming) Monte Carlo process in the in-house software called MASTRAM; differential analysis performed in Excel; and visualization provided by Tecplot and I-DEAS. Several tests were performed and are reviewed: Conformal Coat Particle Study, Card Extraction Study, Cover Fastener Removal Particle Generation Study, and E-Graf Vibration Particulate Study. The lessons learned during this analysis are also reviewed.

  11. On the deduction of chemical reaction pathways from measurements of time series of concentrations.

    PubMed

    Samoilov, Michael; Arkin, Adam; Ross, John

    2001-03-01

    We discuss the deduction of reaction pathways in complex chemical systems from measurements of time series of chemical concentrations of reacting species. First we review a technique called correlation metric construction (CMC) and show the construction of a reaction pathway from measurements on a part of glycolysis. Then we present two new improved methods for the analysis of time series of concentrations, entropy metric construction (EMC) and the entropy reduction method (ERM), and illustrate EMC with calculations on a model reaction system. (c) 2001 American Institute of Physics.
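
    The correlation-metric idea can be sketched as follows: build a distance between species from the peak absolute time-lagged correlation of their concentration series, so that small distances suggest pathway links. The three-species toy kinetics and the lag window are illustrative assumptions.

    ```python
    # Distance matrix from time-lagged correlations of concentration series.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    a = rng.standard_normal(n)                        # driver species
    b = np.roll(a, 5) + 0.3 * rng.standard_normal(n)  # responds to A, lag 5
    c = rng.standard_normal(n)                        # unrelated species
    series = {"A": a, "B": b, "C": c}

    def max_lagged_corr(x, y, max_lag=20):
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        best = abs(np.corrcoef(x, y)[0, 1])
        for lag in range(1, max_lag + 1):             # both lag directions
            best = max(best,
                       abs(np.corrcoef(x[lag:], y[:-lag])[0, 1]),
                       abs(np.corrcoef(y[lag:], x[:-lag])[0, 1]))
        return best

    names = list(series)
    for i, p in enumerate(names):
        for q in names[i + 1:]:
            d = 1.0 - max_lagged_corr(series[p], series[q])
            print(f"d({p},{q}) = {d:.3f}   (small suggests a pathway link)")
    ```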

  12. An Inquiry: Effectiveness of the Complex Empirical Mode Decomposition Method, the Hilbert-Huang Transform, and the Fast-Fourier Transform for Analysis of Dynamic Objects

    DTIC Science & Technology

    2012-03-01

    graphical user interface (GUI) called ALPINE© [18]. Then, it will be converted into a MAT-file that can be read into MATLAB®. At this point...breathing [3]. For comparison purposes, Balocchi et al. recorded the respiratory signal simultaneously with the tachogram (or EKG) signal. As previously...primary authors, worked to create his own code for implementing the method proposed by Rilling et al. Through reading the BEMD paper and proceeding to

  13. Spectral statistics of the uni-modular ensemble

    NASA Astrophysics Data System (ADS)

    Joyner, Christopher H.; Smilansky, Uzy; Weidenmüller, Hans A.

    2017-09-01

    We investigate the spectral statistics of Hermitian matrices in which the elements are chosen uniformly from U(1), called the uni-modular ensemble (UME), in the limit of large matrix size. Using three complementary methods, a supersymmetric integration method, a combinatorial graph-theoretical analysis and a Brownian motion approach, we are able to derive expressions for 1/N corrections to the mean spectral moments and also analyse the fluctuations about this mean. By addressing the same ensemble from three different points of view, we can critically compare their relative advantages and derive some new results.
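
    The ensemble is easy to sample numerically; the sketch below draws Hermitian matrices with uniform-phase off-diagonal entries (the diagonal is set to random signs, an assumption of this sketch) and inspects the mean scaled spectral moments discussed above.

    ```python
    # Sample the uni-modular ensemble and estimate mean spectral moments.
    import numpy as np

    rng = np.random.default_rng(0)
    N, trials = 200, 100
    moments = np.zeros(4)

    for _ in range(trials):
        theta = rng.uniform(0, 2 * np.pi, size=(N, N))
        H = np.triu(np.exp(1j * theta), 1)            # unimodular upper part
        H = H + H.conj().T + np.diag(rng.choice([-1.0, 1.0], size=N))
        eig = np.linalg.eigvalsh(H) / np.sqrt(N)      # semicircle-type scaling
        moments += np.array([np.mean(eig**k) for k in (1, 2, 3, 4)]) / trials

    print("mean scaled spectral moments m1..m4:", np.round(moments, 3))
    ```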

  14. Reversed-phase high-performance liquid chromatography of sulfur mustard in water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raghuveeran, C.D.; Malhotra, R.C.; Dangi, R.S.

    1993-01-01

    A reversed-phase high-performance liquid chromatography method for the detection and quantitation of sulfur mustard (HD) in water is described, with detection at 200 nm. Measurements based on the solubility of HD in water revealed that only extremely low quantities of HD (4 to 5 mg/L) are soluble. Experience shows that water is still the medium of choice for the analysis of HD in water and aqueous effluents, in spite of the minor handicap of HD's half-life in water of ca. 4 minutes, which calls for speedy analysis.

  15. Designing a more efficient, effective and safe Medical Emergency Team (MET) service using data analysis

    PubMed Central

    Bilgrami, Irma; Bain, Christopher; Webb, Geoffrey I.; Orosz, Judit; Pilcher, David

    2017-01-01

    Introduction: Hospitals have seen a rise in Medical Emergency Team (MET) reviews. We hypothesised that the commonest MET calls result in similar treatments. Our aim was to design a pre-emptive management algorithm that allowed direct institution of treatment to patients without having to wait for attendance of the MET team, and to model its potential impact on MET call incidence and patient outcomes. Methods: Data was extracted for all MET calls from the hospital database. Association rule data mining techniques were used to identify the most common combinations of MET call causes, outcomes and therapies. Results: There were 13,656 MET calls during the 34-month study period in 7936 patients. The most common MET call was for hypotension [31%, (2459/7936)]. These MET calls were strongly associated with the immediate administration of intra-venous fluid (70% [1714/2459] v 13% [739/5477] p<0.001), unless the patient was located on a respiratory ward (adjusted OR 0.41 [95%CI 0.25-0.67] p<0.001), had a cardiac cause for admission (adjusted OR 0.61 [95%CI 0.50-0.75] p<0.001) or was under the care of the heart failure team (adjusted OR 0.29 [95%CI 0.19-0.42] p<0.001). Modelling the effect of a pre-emptive management algorithm for immediate fluid administration without MET activation, on data from a test period of 24 months following the study period, suggested it would lead to a 68.7% (2541/3697) reduction in MET calls for hypotension and a 19.6% (2541/12938) reduction in total METs without adverse effects on patients. Conclusion: Routinely collected data and analytic techniques can be used to develop a pre-emptive management algorithm to administer intravenous fluid therapy to a specific group of hypotensive patients without the need to initiate a MET call. This could lead both to earlier treatment for the patient and to fewer total MET calls. PMID:29281665
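
    The association-rule step can be sketched with simple support/confidence counting over toy MET-call records (the records and the 0.5 confidence threshold are illustrative assumptions; production work would use a dedicated rule-mining library):

    ```python
    # Count how often a treatment co-occurs with a call cause and report
    # the confidence of the rule "cause => treatment".
    from collections import Counter

    calls = [
        {"hypotension", "iv_fluid"},
        {"hypotension", "iv_fluid", "oxygen"},
        {"hypoxia", "oxygen"},
        {"hypotension", "iv_fluid"},
        {"tachycardia", "ecg"},
    ]

    cause_counts, pair_counts = Counter(), Counter()
    for record in calls:
        for cause in record & {"hypotension", "hypoxia", "tachycardia"}:
            cause_counts[cause] += 1
            for treatment in record & {"iv_fluid", "oxygen", "ecg"}:
                pair_counts[(cause, treatment)] += 1

    for (cause, treatment), n in pair_counts.items():
        confidence = n / cause_counts[cause]
        if confidence >= 0.5:
            print(f"{cause} => {treatment}  (confidence {confidence:.2f})")
    ```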

  16. Hamiltonian Dynamics of Spider-Type Multirotor Rigid Bodies Systems

    NASA Astrophysics Data System (ADS)

    Doroshin, Anton V.

    2010-03-01

    This paper sets out to develop a spider-type multiple-rotor system which can be used for attitude control of spacecraft. The multirotor system contains a large number of rotor-equipped rays, so it was called a "Spider-type System"; it can also be called a "Rotary Hedgehog." These systems allow using spinups and captures of conjugate rotors to perform compound attitude motion of spacecraft. The paper describes a new method of spacecraft attitude reorientation and a new mathematical model of motion in Hamiltonian form. Hamiltonian dynamics of the system is investigated with the help of Andoyer-Deprit canonical variables. These variables allow obtaining exact solutions for hetero- and homoclinic orbits in the phase space of the system motion, which are very important for qualitative analysis.

  17. Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Xu; Tuo, Rui; Jeff Wu, C. F.

    Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs with a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. Here, from simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion which works for single-accuracy experiments.

  18. Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion

    DOE PAGES

    He, Xu; Tuo, Rui; Jeff Wu, C. F.

    2017-01-31

    Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs with a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. Here, from simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion which works for single-accuracy experiments.

  19. Efficient visualization of urban spaces

    NASA Astrophysics Data System (ADS)

    Stamps, A. E.

    2012-10-01

    This chapter presents a new method for calculating efficiency and applies that method to the issues of selecting simulation media and evaluating the contextual fit of new buildings in urban spaces. The new method is called "meta-analysis". A meta-analytic review of 967 environments indicated that static color simulations are the most efficient media for visualizing urban spaces. For contextual fit, four original experiments are reported on how strongly five factors influence visual appeal of a street: architectural style, trees, height of a new building relative to the heights of existing buildings, setting back a third story, and distance. A meta-analysis of these four experiments and previous findings, covering 461 environments, indicated that architectural style, trees, and height had effects strong enough to warrant implementation, but the effects of setting back third stories and distance were too small to warrant implementation.

  20. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of the response surface development and feasibility of the method are shown using a sample problem in slope stability which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
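
    A minimal sketch of the workflow, with a cheap stand-in for the long-running code and only two of the random parameters (the stand-in function, sample counts and distributions are all illustrative assumptions): fit a quadratic surface to a handful of "code" runs, then do the repetitive Monte Carlo on the surface.

    ```python
    # Fit a quadratic response surface to a few expensive runs, then reuse it.
    import numpy as np

    def expensive_code(c, phi):     # stand-in for the long-running solver
        return 1.2 - 0.5 * c + 0.01 * phi

    rng = np.random.default_rng(0)
    runs = rng.uniform([0.5, 20.0], [1.5, 40.0], size=(8, 2))  # few code runs
    y = np.array([expensive_code(c, phi) for c, phi in runs])

    def design(X):                  # quadratic response-surface basis
        c, phi = X[:, 0], X[:, 1]
        return np.column_stack([np.ones_like(c), c, phi, c * phi, c**2, phi**2])

    beta, *_ = np.linalg.lstsq(design(runs), y, rcond=None)

    # Repetitive statistical analysis now uses the cheap surface, not the code
    samples = rng.normal([1.0, 30.0], [0.1, 3.0], size=(100_000, 2))
    fs = design(samples) @ beta
    print("estimated P(factor of safety < 1):", np.mean(fs < 1.0))
    ```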

  1. CFD analysis of a twin scroll radial turbine

    NASA Astrophysics Data System (ADS)

    Fürst, Jiří; Žák, Zdenĕk

    2018-06-01

    The contribution deals with the application of a coupled implicit solver for compressible flows to the CFD analysis of a twin scroll radial turbine. The solver is based on the finite volume method, convective terms are approximated using the AUSM+up scheme, viscous terms use a central approximation and the time evolution is achieved with the lower-upper symmetric Gauss-Seidel (LU-SGS) method. The solver allows steady simulation with the so-called frozen rotor approach as well as a fully unsteady solution. Both approaches are first validated for the case of the ERCOFTAC pump [1]. Then the CFD analysis of the flow through a twin scroll radial turbine and the prediction of the efficiency and turbine power are performed and the results are compared to experimental data obtained in the framework of Josef Božek - Competence Centre for Automotive Industry.

  2. Developing tools for digital radar image data evaluation

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.; Raggam, J.

    1986-01-01

    The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.

  3. eMBI: Boosting Gene Expression-based Clustering for Cancer Subtypes.

    PubMed

    Chang, Zheng; Wang, Zhenjia; Ashby, Cody; Zhou, Chuan; Li, Guojun; Zhang, Shuzhong; Huang, Xiuzhen

    2014-01-01

    Identifying clinically relevant subtypes of a cancer using gene expression data is a challenging and important problem in medicine, and is a necessary premise to provide specific and efficient treatments for patients of different subtypes. Matrix factorization provides a solution by finding checker-board patterns in the matrices of gene expression data. In the context of gene expression profiles of cancer patients, these checkerboard patterns correspond to genes that are up- or down-regulated in patients with particular cancer subtypes. Recently, a new matrix factorization framework for biclustering called Maximum Block Improvement (MBI) was proposed; however, it still suffers from several problems when applied to cancer gene expression data analysis. In this study, we developed many effective strategies to improve MBI and designed a new program called enhanced MBI (eMBI), which is more effective and efficient at identifying cancer subtypes. Our tests on several gene expression profiling datasets of cancer patients consistently indicate that eMBI achieves significant improvements in comparison with MBI, in terms of cancer subtype prediction accuracy, robustness, and running time. In addition, the performance of eMBI is much better than another widely used matrix factorization method called nonnegative matrix factorization (NMF) and the method of hierarchical clustering, which is often the first choice of clinical analysts in practice.

  4. eMBI: Boosting Gene Expression-based Clustering for Cancer Subtypes

    PubMed Central

    Chang, Zheng; Wang, Zhenjia; Ashby, Cody; Zhou, Chuan; Li, Guojun; Zhang, Shuzhong; Huang, Xiuzhen

    2014-01-01

    Identifying clinically relevant subtypes of a cancer using gene expression data is a challenging and important problem in medicine, and is a necessary premise to provide specific and efficient treatments for patients of different subtypes. Matrix factorization provides a solution by finding checker-board patterns in the matrices of gene expression data. In the context of gene expression profiles of cancer patients, these checkerboard patterns correspond to genes that are up- or down-regulated in patients with particular cancer subtypes. Recently, a new matrix factorization framework for biclustering called Maximum Block Improvement (MBI) was proposed; however, it still suffers from several problems when applied to cancer gene expression data analysis. In this study, we developed many effective strategies to improve MBI and designed a new program called enhanced MBI (eMBI), which is more effective and efficient at identifying cancer subtypes. Our tests on several gene expression profiling datasets of cancer patients consistently indicate that eMBI achieves significant improvements in comparison with MBI, in terms of cancer subtype prediction accuracy, robustness, and running time. In addition, the performance of eMBI is much better than another widely used matrix factorization method called nonnegative matrix factorization (NMF) and the method of hierarchical clustering, which is often the first choice of clinical analysts in practice. PMID:25374455
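
    The checkerboard-discovery task described in the two records above can be illustrated with scikit-learn's SpectralBiclustering standing in for MBI/eMBI (a different algorithm, used here only to show the biclustering setup on synthetic expression data; dimensions and effect sizes are assumptions):

    ```python
    # Recover planted patient/gene blocks from a synthetic expression matrix.
    import numpy as np
    from sklearn.cluster import SpectralBiclustering

    rng = np.random.default_rng(0)
    data = rng.normal(5.0, 0.5, size=(60, 80))   # 60 patients x 80 genes
    data[:30, :40] += 2.0                        # genes up-regulated in subtype 1
    data[30:, 40:] += 2.0                        # genes up-regulated in subtype 2

    model = SpectralBiclustering(n_clusters=2, random_state=0)
    model.fit(data)
    print("patient subtype labels:", model.row_labels_)
    ```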

  5. Educator and participant perceptions and cost analysis of stage-tailored educational telephone calls.

    PubMed

    Esters, Onikia N; Boeckner, Linda S; Hubert, Melanie; Horacek, Tanya; Kritsch, Karen R; Oakland, Mary J; Lohse, Barbara; Greene, Geoffrey; Nitzke, Susan

    2008-01-01

    To identify strengths and weaknesses of nutrition education via telephone calls as part of a larger stage-of-change tailored intervention with mailed materials. Evaluative feedback was elicited from educators who placed the calls and respondents who received the calls. An internet and telephone survey of 10 states in the midwestern United States. 21 educators in 10 states reached via the internet and 50 young adults reached via telephone. VARIABLES MEASURED AND ANALYSIS: Rankings of intervention components, ratings of key aspects of educational calls, and cost data (as provided by a lead researcher in each state) were summarized via descriptive statistics. RESULTS, CONCLUSIONS, AND IMPLICATIONS: Educational calls used 6 to 17 minutes of preparation time, required 8 to 15 minutes of contact time, and had a mean estimated cost of $5.82 per call. Low-income young adults favored print materials over educational calls. However, the calls were reported to have positive effects on motivating participants to set goals. Educators who use educational telephone calls to reach young adults, a highly mobile target audience, may require a robust and flexible contact plan.

  6. Study designs appropriate for the workplace.

    PubMed

    Hogue, C J

    1986-01-01

    Carlo and Hearn have called for "refinement of old [epidemiologic] methods and an ongoing evaluation of where methods fit in the overall scheme as we address the multiple complexities of reproductive hazard assessment." This review is an attempt to bring together the current state-of-the-art methods for problem definition and hypothesis testing available to the occupational epidemiologist. For problem definition, meta-analysis can be utilized to narrow the field of potential causal hypotheses. Passive and active surveillance may further refine issues for analytic research. Within analytic epidemiology, several methods may be appropriate for the workplace setting. Those discussed here may be used to estimate the risk ratio in either a fixed or dynamic population.

  7. Acoustic fine structure may encode biologically relevant information for zebra finches.

    PubMed

    Prior, Nora H; Smith, Edward; Lawson, Shelby; Ball, Gregory F; Dooling, Robert J

    2018-04-18

    The ability to discriminate changes in the fine structure of complex sounds is well developed in birds. However, the precise limit of this discrimination ability and how it is used in the context of natural communication remains unclear. Here we describe natural variability in acoustic fine structure of male and female zebra finch calls. Results from psychoacoustic experiments demonstrate that zebra finches are able to discriminate extremely small differences in fine structure, which are on the order of the variation in acoustic fine structure that is present in their vocal signals. Results from signal analysis methods also suggest that acoustic fine structure may carry information that distinguishes between biologically relevant categories including sex, call type and individual identity. Combined, our results are consistent with the hypothesis that zebra finches can encode biologically relevant information within the fine structure of their calls. This study provides a foundation for our understanding of how acoustic fine structure may be involved in animal communication.

  8. Is the phone call the most effective method for recall in cervical cancer screening?--results from a randomised control trial.

    PubMed

    Abdul Rashid, Rima Marhayu; Mohamed, Majdah; Hamid, Zaleha Abdul; Dahlui, Maznah

    2013-01-01

    To compare the effectiveness of different methods of recall for repeat Pap smear among women who had normal smears in the previous screening. Prospective randomized controlled study. All community clinics in Klang under the Ministry of Health Malaysia. Women of Klang who attended cervical screening, had a normal Pap smear in the previous year and were due for a repeat smear were recruited and randomly assigned to four different methods of recall for a repeat smear. The recall methods used to remind the women to return for a repeat smear were a postal letter, a registered letter, a short message by phone (SMS) or a phone call. Outcomes were the number and percentage of women who responded to the recall within 8 weeks of receiving it, irrespective of whether a Pap test was conducted, and the number of women in each recall arm who came for a repeat Pap smear. The rates of recall messages reaching the women when using letter, registered letter, SMS and phone calls were 79%, 87%, 66% and 68%, respectively. However, the positive responses to recall by letter, registered letter, phone message and telephone call were 23.9%, 23.0%, 32.9% and 50.9%, respectively (p<0.05). Furthermore, more women who received recall by phone call had been screened (p<0.05) compared to those who received recall by postal letter (OR=2.38, CI=1.56-3.62). Both the usual way of sending letters and registered letters had higher chances of reaching patients compared to using the phone, either for sending messages or calling. The response to the recall method and uptake of repeat smear, however, were highest via phone call, indicating the importance of direct communication.

  9. Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls

    NASA Astrophysics Data System (ADS)

    Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.

    2015-10-01

    Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.

  10. MareyMap Online: A User-Friendly Web Application and Database Service for Estimating Recombination Rates Using Physical and Genetic Maps.

    PubMed

    Siberchicot, Aurélie; Bessy, Adrien; Guéguen, Laurent; Marais, Gabriel A B

    2017-10-01

    Given the importance of meiotic recombination in biology, there is a need to develop robust methods to estimate meiotic recombination rates. A popular approach, called the Marey map approach, relies on comparing genetic and physical maps of a chromosome to estimate local recombination rates. In the past, we have implemented this approach in an R package called MareyMap, which includes many functionalities useful to get reliable recombination rate estimates in a semi-automated way. MareyMap has been used repeatedly in studies looking at the effect of recombination on genome evolution. Here, we propose a simpler user-friendly web service version of MareyMap, called MareyMap Online, which allows a user to get recombination rates from her/his own data or from a publicly available database that we offer in a few clicks. When the analysis is done, the user is asked whether her/his curated data can be placed in the database and shared with other users, which we hope will make meta-analysis on recombination rates including many species easy in the future. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
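
    The Marey map approach itself (not the MareyMap package) is easy to sketch: fit a smooth curve of genetic position against physical position for markers along a chromosome, and take its derivative as the local recombination rate. The marker data below are illustrative assumptions.

    ```python
    # Marey map sketch: smooth genetic-vs-physical map, derivative = cM/Mb.
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(0)
    physical = np.sort(rng.uniform(0.0, 50.0, 40))            # marker Mb
    genetic = 1.2 * physical + 8.0 * np.sin(physical / 8.0)   # map cM

    spline = UnivariateSpline(physical, genetic, s=1.0)  # smooth Marey map
    rate = spline.derivative()                           # local rate, cM/Mb

    for mb in np.linspace(1.0, 49.0, 5):
        print(f"{mb:5.1f} Mb: {rate(mb):5.2f} cM/Mb")
    ```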

  11. Prediction and analysis of beta-turns in proteins by support vector machine.

    PubMed

    Pham, Tho Hoan; Satou, Kenji; Ho, Tu Bao

    2003-01-01

    Tight turns have long been recognized as one of the three important features of proteins after the alpha-helix and beta-sheet. Tight turns play an important role in globular proteins from both the structural and functional points of view. More than 90% of tight turns are beta-turns. Analysis and prediction of beta-turns in particular and tight turns in general are very useful for the design of new molecules such as drugs, pesticides, and antigens. In this paper, we introduce a support vector machine (SVM) approach to prediction and analysis of beta-turns. We have investigated two aspects of applying SVM to the prediction and analysis of beta-turns. First, we developed a new SVM method, called BTSVM, which predicts beta-turns of a protein from its sequence. The prediction results on the dataset of 426 non-homologous protein chains, using a sevenfold cross-validation technique, showed that our method is superior to the other previous methods. Second, we analyzed how amino acid positions support (or prevent) the formation of beta-turns based on the "multivariable" classification model of a linear SVM. This model is more general than the other ones of previous statistical methods. Our analysis results are more comprehensive and easier to use than previously published analysis results.
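
    The window-based setup can be sketched as follows (BTSVM itself uses richer features; the 9-residue window, random sequences and the toy turn-forming rule below are assumptions of this sketch): one-hot encode a sequence window and train a binary SVM to label the central residue.

    ```python
    # One-hot encoded sequence windows fed to an SVM beta-turn classifier.
    import numpy as np
    from sklearn.svm import SVC

    AMINO = "ACDEFGHIKLMNPQRSTVWY"
    rng = np.random.default_rng(0)

    def one_hot(window):
        vec = np.zeros(len(window) * 20)
        for i, aa in enumerate(window):
            vec[i * 20 + AMINO.index(aa)] = 1.0
        return vec

    windows = ["".join(rng.choice(list(AMINO), 9)) for _ in range(300)]
    X = np.array([one_hot(w) for w in windows])
    y = np.array([1 if w[4] in "GNPD" else 0 for w in windows])  # toy rule

    clf = SVC(kernel="rbf", C=1.0).fit(X[:200], y[:200])
    print("held-out accuracy:", clf.score(X[200:], y[200:]))
    ```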

  12. Cavity radiation model for solar central receivers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lipps, F.W.

    1981-01-01

    The Energy Laboratory of the University of Houston has developed a computer simulation program called CREAM (i.e., Cavity Radiation Exchange Analysis Model) for application to the solar central receiver system. The zone generating capability of CREAM has been used in several solar re-powering studies. CREAM contains a geometric configuration factor generator based on Nusselt's method. A formulation of Nusselt's method provides support for the FORTRAN subroutine NUSSELT. Numerical results from NUSSELT are compared to analytic values and values from Sparrow's method. Sparrow's method is based on a double contour integral and its reduction to a single integral which is approximated by Gaussian methods. Nusselt's method is adequate for the intended engineering applications, but Sparrow's method is found to be an order of magnitude more efficient in many situations.
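
    As a sketch of the configuration-factor computation, the Monte Carlo below samples cosine-weighted directions in the spirit of Nusselt's unit-sphere construction and checks them against the analytic factor from a differential element to a coaxial parallel disk, F = R^2 / (R^2 + h^2). The geometry values are illustrative assumptions.

    ```python
    # Monte Carlo configuration factor vs the analytic element-to-disk value.
    import numpy as np

    rng = np.random.default_rng(0)
    R, h, n = 1.0, 2.0, 200_000

    # Cosine-weighted hemisphere sampling from the element at the origin
    u1, u2 = rng.random(n), rng.random(n)
    sin_t = np.sqrt(u1)              # sin(theta)=sqrt(u1) gives cosine weighting
    cos_t = np.sqrt(1.0 - u1)
    rho = h * sin_t / cos_t          # radius where the ray pierces plane z = h
    hits = rho <= R                  # ray lands on the disk

    print("Monte Carlo F :", hits.mean())
    print("analytic    F :", R**2 / (R**2 + h**2))
    ```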

  13. Site specific probability of passive acoustic detection of humpback whale calls from single fixed hydrophones.

    PubMed

    Helble, Tyler A; D'Spain, Gerald L; Hildebrand, John A; Campbell, Gregory S; Campbell, Richard L; Heaney, Kevin D

    2013-09-01

    Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. A common mistake in the analysis of marine mammal acoustic data is formulating conclusions about these animals without first understanding how environmental properties such as bathymetry, sediment properties, water column sound speed, and ocean acoustic noise influence the detection and character of vocalizations in the acoustic data. The approach in this paper is to use Monte Carlo simulations with a full wave field acoustic propagation model to characterize the site specific probability of detection of six types of humpback whale calls at three passive acoustic monitoring locations off the California coast. Results show that the probability of detection can vary by factors greater than ten when comparing detections across locations, or comparing detections at the same location over time, due to environmental effects. Effects of uncertainties in the inputs to the propagation model are also quantified, and the model accuracy is assessed by comparing calling statistics amassed from 24,690 humpback units recorded in the month of October 2008. Under certain conditions, the probability of detection can be estimated with uncertainties sufficiently small to allow for accurate density estimates.
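
    The paper's propagation physics comes from a full wave-field model, which cannot be reproduced here; the toy Monte Carlo below only conveys the structure of the calculation, with a simple sonar-equation stand-in (spherical spreading, Gaussian source and noise levels; all values are illustrative assumptions).

    ```python
    # Toy site-specific probability of detection via the passive sonar equation.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    r = rng.uniform(100.0, 20_000.0, size=n)        # caller range, metres

    source_level = rng.normal(165.0, 5.0, size=n)   # dB re 1 uPa @ 1 m
    transmission_loss = 20 * np.log10(r)            # spherical spreading
    noise_level = rng.normal(95.0, 6.0, size=n)     # site ambient noise, dB
    detection_threshold = 10.0                      # required SNR, dB

    snr = source_level - transmission_loss - noise_level
    print("probability of detection:", np.mean(snr > detection_threshold))
    ```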

  14. A Digital Algorithm for Composite Laminate Analysis - FORTRAN. Revision

    DTIC Science & Technology

    1983-10-01

    [The indexed excerpt is OCR residue from the report's FORTRAN listing; the recoverable content describes utility subroutines for composite laminate analysis, including matrix addition of the form C = A + Con x B (Con a scalar), an MTDM routine that translates material properties to each ply, and calls to routines such as MODULS, INVRS, MTAD and NORM.]

  15. Calls to Florida Poison Control Centers about mercury: Trends over 2003-2013.

    PubMed

    Gribble, Matthew O; Deshpande, Aniruddha; Stephan, Wendy B; Hunter, Candis M; Weisman, Richard S

    2017-11-01

    The aim of this analysis was to contrast trends in exposure-report calls and informational queries (a measure of public interest) about mercury to the Florida Poison Control Centers over 2003-2013. Poison-control specialists coded calls to Florida Poison Control Centers by substance of concern, caller demographics, and whether the call pertained to an exposure event or was an informational query. For the present study, call records regarding mercury were de-identified and provided along with the daily total number of calls for statistical analysis. We fit Poisson models using generalized estimating equations to summarize changes across years in counts of daily calls to Florida Poison Control Centers, adjusting for month. In a second stage of analysis, we further adjusted for the total number of calls each day. We also conducted analyses stratified by age of the exposed. There was an overall decrease over 2003-2013 in the number of total calls about mercury [ratio per year: 0.89, 95% CI: (0.88, 0.90)] and calls about mercury exposure [ratio per year: 0.84, 95% CI: (0.83, 0.85)], but the number of informational queries about mercury increased over this time [ratio per year: 1.15, 95% CI: (1.12, 1.18)]. After adjusting for the number of calls of that type each day (i.e., call volume), the associations remained similar: a ratio of 0.88 (95% CI: 0.87, 0.89) per year for total calls, 0.85 (0.83, 0.86) for exposure-related calls, and 1.17 (1.14, 1.21) for informational queries. Although the number of exposure-related calls decreased, informational queries increased over 2003-2013. This might suggest an increased public interest in mercury health risks despite a decrease in reported exposures over this time period. Copyright © 2017 Elsevier Inc. All rights reserved.
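
    A hedged sketch of a Poisson GEE trend model of the kind described, using statsmodels; the data frame, column names, and choice of grouping variable are invented for illustration, not taken from the study:

        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # One row per day: daily mercury calls, with year and month (toy data).
        df = pd.DataFrame({
            "calls": [3, 1, 4, 2, 0, 2],
            "year":  [2003, 2003, 2004, 2004, 2005, 2005],
            "month": [1, 7, 1, 7, 1, 7],
        })

        # Poisson GEE for the yearly trend, adjusting for month;
        # exp(coefficient on year) is the "ratio per year" reported above.
        res = smf.gee("calls ~ year + C(month)", groups="year", data=df,
                      family=sm.families.Poisson()).fit()
        print(res.params)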

  16. Automatic movie skimming with general tempo analysis

    NASA Astrophysics Data System (ADS)

    Lee, Shih-Hung; Yeh, Chia-Hung; Kuo, C. C. J.

    2003-11-01

    Story units are extracted by general tempo analysis, including the tempos of audio and visual information, in this research. Although many schemes have been proposed to successfully segment video data into shots using basic low-level features, how to group shots into meaningful units called story units is still a challenging problem. By focusing on a certain type of video, such as sports or news, we can explore models with specific application domain knowledge. For movie content, many heuristic rules based on audiovisual clues have been proposed with limited success. We propose a method to extract story units using general tempo analysis. Experimental results are given to demonstrate the feasibility and efficiency of the proposed technique.

  17. Multiscale hidden Markov models for photon-limited imaging

    NASA Astrophysics Data System (ADS)

    Nowak, Robert D.

    1999-06-01

    Photon-limited image analysis is often hindered by low signal-to-noise ratios. A novel Bayesian multiscale modeling and analysis method is developed in this paper to assist in these challenging situations. In addition to providing a very natural and useful framework for modeling and processing images, Bayesian multiscale analysis is often much less computationally demanding compared to classical Markov random field models. This paper focuses on a probabilistic graph model called the multiscale hidden Markov model (MHMM), which captures the key inter-scale dependencies present in natural image intensities. The MHMM framework presented here is specifically designed for photon-limited imaging applications involving Poisson statistics, and applications to image intensity analysis are examined.

  18. An analysis of the influence of production conditions on the development of the microporous structure of the activated carbon fibres using the LBET method

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Mirosław

    2017-12-01

    The paper presents the results of research on the application of new analytical models of multilayer adsorption on heterogeneous surfaces with a unique fast multivariant identification procedure, together called the LBET method, as a tool for analysing the microporous structure of activated carbon fibres obtained from polyacrylonitrile by chemical activation using potassium and sodium hydroxides. The novel LBET method was employed in particular to evaluate the impact of the activator used and of the hydroxide-to-polyacrylonitrile ratio on the resulting microporous structure of the activated carbon fibres.

  19. On the feasibility of a transient dynamic design analysis

    NASA Astrophysics Data System (ADS)

    Cunniff, Patrick F.; Pohland, Robert D.

    1993-05-01

    The Dynamic Design Analysis Method has been used for the past 30 years as part of the Navy's efforts to shock-harden heavy shipboard equipment. This method, which has been validated several times, employs normal mode theory and design shock values. This report examines the degree of success that may be achieved by using simple equipment-vehicle models that produce time history responses equivalent to the responses that would be achieved using the spectral design values employed by the Dynamic Design Analysis Method. These transient models are constructed by attaching the equipment's modal oscillators to the vehicle, which is composed of rigid masses and elastic springs. Two methods have been developed for constructing these transient models. Each method generates the parameters of the vehicles so as to approximate the required damaging effects, such that the transient model is excited by an idealized impulse applied to the vehicle mass to which the equipment modal oscillators are attached. The first method, called the Direct Modeling Method, is limited to equipment with at most three degrees of freedom, and its vehicle consists of a single lumped mass and spring. The Optimization Modeling Method, which is based on the simplex method for optimization, has been used successfully with a variety of vehicle models and equipment sizes.

  20. A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol.

    PubMed

    Zeng, Ping; Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun

    2017-01-01

    In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on, all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications.
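
    A hedged sketch of the general idea of anchored multi-pattern matching with a hash table (patterns bucketed by length so each candidate prefix costs one probe); this illustrates the problem setting, not the MH algorithm itself:

        from collections import defaultdict

        def build(patterns):
            """Bucket the patterns by length in a hash table."""
            table = defaultdict(set)
            for p in patterns:
                table[len(p)].add(p)
            return table

        def match_from_start(url, table):
            """Return all patterns that match `url` from position 0."""
            return [url[:n] for n in table if url[:n] in table[n]]

        table = build(["/api/", "/api/v1/users", "/static/"])
        print(match_from_start("/api/v1/users?id=7", table))
        # -> ['/api/', '/api/v1/users']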

  1. Heat Exchange in “Human body - Thermal protection - Environment” System

    NASA Astrophysics Data System (ADS)

    Khromova, I. V.

    2017-11-01

    This article is devoted to the simulation and calculation of thermal processes in the system called “Human body - Thermal protection - Environment” under low-temperature conditions, considering internal heat sources and convective heat transfer between the calculated elements, which is of general importance for heat transfer theory. The article introduces a method for calculating complex heat transfer and local thermophysical parameters in this system that accounts for passive and active thermal protection and for the thermophysical and geometric properties of the calculated elements over a wide range of environmental parameters (water, air). It also examines how the thermal resistance of modern materials used in the development of special protective clothing affects heat transfer in the system. Analysis of the obtained results allows computational data to supplement experiments and supports optimization of individual life-support system elements intended to protect the human body from exposure to external factors.

  2. On Bayesian methods of exploring qualitative interactions for targeted treatment.

    PubMed

    Chen, Wei; Ghosh, Debashis; Raghunathan, Trivellore E; Norkin, Maxim; Sargent, Daniel J; Bepler, Gerold

    2012-12-10

    Providing personalized treatments designed to maximize benefits and minimize harms is of tremendous current medical interest. One problem in this area is the evaluation of the interaction between the treatment and other predictor variables. Treatment effects in subgroups having the same direction but different magnitudes are called quantitative interactions, whereas those having opposite directions in subgroups are called qualitative interactions (QIs). Identifying QIs is challenging because they are rare and usually unknown among many potential biomarkers. Meanwhile, subgroup analysis reduces the power of hypothesis testing, and multiple subgroup analyses inflate the type I error rate. We propose a new Bayesian approach to search for QIs in a multiple regression setting with adaptive decision rules. We consider various regression models for the outcome. We illustrate this method in two examples of phase III clinical trials. The algorithm is straightforward and easy to implement using existing software packages. We provide a sample code in Appendix A. Copyright © 2012 John Wiley & Sons, Ltd.

  3. A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol

    PubMed Central

    Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun

    2017-01-01

    In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on—all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications. PMID:28399157

  4. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistical method called the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz) was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, each with its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear land (VB) was determined. A regression model was developed from this relationship, and its results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.

  5. Electron microscopy methods in studies of cultural heritage sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasiliev, A. L., E-mail: a.vasiliev56@gmail.com; Kovalchuk, M. V.; Yatsishina, E. B.

    The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient “nanotechnologies”; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.

  6. A comparison of methods for teaching receptive labeling to children with autism spectrum disorders: a systematic replication.

    PubMed

    Grow, Laura L; Kodak, Tiffany; Carr, James E

    2014-01-01

    Previous research has demonstrated that the conditional-only method (starting with a multiple-stimulus array) is more efficient than the simple-conditional method (progressive incorporation of more stimuli into the array) for teaching receptive labeling to children with autism spectrum disorders (Grow, Carr, Kodak, Jostad, & Kisamore). The current study systematically replicated the earlier study by comparing the 2 approaches using progressive prompting with 2 boys with autism. The results showed that the conditional-only method was a more efficient and reliable teaching procedure than the simple-conditional method. The results further call into question the practice of teaching simple discriminations to facilitate acquisition of conditional discriminations. © Society for the Experimental Analysis of Behavior.

  7. Electron microscopy methods in studies of cultural heritage sites

    NASA Astrophysics Data System (ADS)

    Vasiliev, A. L.; Kovalchuk, M. V.; Yatsishina, E. B.

    2016-11-01

    The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient "nanotechnologies"; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.

  8. Distortion analysis of subband adaptive filtering methods for FMRI active noise control systems.

    PubMed

    Milani, Ali A; Panahi, Issa M; Briggs, Richard

    2007-01-01

    Delayless subband filtering structure, as a high-performance frequency-domain filtering technique, is used for canceling broadband fMRI noise (8 kHz bandwidth). In this method, adaptive filtering is done in subbands, and the coefficients of the main canceling filter are computed by stacking the subband weights together. There are two types of stacking methods, called FFT and FFT-2. In this paper, we analyze the distortion introduced by these two stacking methods. The effect of the stacking distortion on the performance of different adaptive filters in the FXLMS algorithm with a non-minimum-phase secondary path is explored. The investigation is done for different adaptive algorithms (nLMS, APA, and RLS), different weight-stacking methods, and different numbers of subbands.

  9. The new PARIO™ device for determining continuous particle-size distributions of soils and sediments.

    NASA Astrophysics Data System (ADS)

    Miller, Alina; Pertassek, Thomas; Steins, Andreas; Durner, Wolfgang; Göttlein, Axel; Petrik, Wolfgang; von Unold, Georg

    2017-04-01

    The particle-size distribution (PSD) is a key property of soils. The reference method for determining the PSD is based on gravitational sedimentation of particles in an initially homogeneous suspension. Traditional methods measure manually (i) the uplift of a floating body in the suspension at different times (Hydrometer method) or (ii) the mass of solids in extracted suspension aliquots at predefined sampling depths and times (Pipette method). Both methods lead to a disturbance of the sedimentation process and provide only discrete data of the PSD. Durner et al. (2017) recently developed a new automated method to determine particle-size distributions of soils and sediments from gravitational sedimentation (Durner, W., S.C. Iden, and G. von Unold: The integral suspension pressure method (ISP) for precise particle-size analysis by gravitational sedimentation, Water Resources Research, doi:10.1002/2016WR019830, 2017). The so-called integral suspension pressure (ISP) method estimates continuous PSDs from sedimentation experiments by recording the temporal evolution of the suspension pressure at a certain measurement depth in a sedimentation cylinder. It requires no manual interaction after start, and thus no specialized training of the lab personnel, and it avoids any disturbance of the sedimentation process. The required technology to perform these experiments was developed by the UMS company, Munich, and is now available as an instrument called PARIO, traded by the METER Group. In this poster, the basic functioning of PARIO is shown and key components and parameters of the technology are explained.
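
    For context, a hedged sketch of the Stokes-settling relation that underlies all of these sedimentation methods, ISP included; the fluid and particle constants are typical values, not PARIO parameters:

        def stokes_diameter(depth_m, t_s, rho_s=2650.0, rho_f=998.0,
                            mu=1.002e-3, g=9.81):
            """Largest particle diameter (m) that has not yet settled past
            `depth_m` at time `t_s`, from Stokes' terminal velocity
            v = (rho_s - rho_f) * g * d**2 / (18 * mu)."""
            v = depth_m / t_s                  # required settling velocity
            return (18.0 * mu * v / ((rho_s - rho_f) * g)) ** 0.5

        # After one hour, particles coarser than roughly this diameter have
        # settled out of the top 10 cm of the suspension:
        print(stokes_diameter(0.10, 3600.0) * 1e6, "micrometres")   # ~5.6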

  10. Coupled loads analysis for Space Shuttle payloads

    NASA Technical Reports Server (NTRS)

    Eldridge, J.

    1992-01-01

    Described here is a method for determining the transient response of, and the resultant loads in, a system exposed to predicted external forces. In this case, the system consists of four racks mounted on the inside of a space station resource node module (SSRNMO) which is mounted in the payload bay of the space shuttle. The predicted external forces are forcing functions which envelope worst case forces applied to the shuttle during liftoff and landing. This analysis, called a coupled loads analysis, is used to couple the payload and shuttle models together, determine the transient response of the system, and then recover payload loads, payload accelerations, and payload to shuttle interface forces.

  11. A mechanism for efficient debugging of parallel programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, B.P.; Choi, J.D.

    1988-01-01

    This paper addresses the design and implementation of an integrated debugging system for parallel programs running on shared memory multi-processors (SMMP). The authors describe the use of flowback analysis to provide information on causal relationships between events in a program's execution without re-executing the program for debugging. The authors introduce a mechanism called incremental tracing that, by using semantic analyses of the debugged program, makes the flowback analysis practical with only a small amount of trace generated during execution. They extend flowback analysis to apply to parallel programs and describe a method to detect race conditions in the interactions of the co-operating processes.

  12. Communication between office-based primary care providers and nurses working within patients' homes: an analysis of process data from CAPABLE.

    PubMed

    Smith, Patrick D; Boyd, Cynthia; Bellantoni, Julia; Roth, Jill; Becker, Kathleen L; Savage, Jessica; Nkimbeng, Manka; Szanton, Sarah L

    2016-02-01

    To examine themes of communication between office-based primary care providers and nurses working in private residences; to assess which methods of communication elicit fruitful responses to nurses' concerns. Lack of effective communication between home health care nurses and primary care providers contributes to clinical errors, inefficient care delivery and decreased patient safety. Few studies have described best practices related to frequency, methods and reasons for communication between community-based nurses and primary care providers. Secondary analysis of process data from 'Community Aging in Place: Advancing Better Living for Elders (CAPABLE)'. Independent reviewers analysed nurse documentation of communication (phone calls, letters and client coaching) initiated for 70 patients and analysed 45 letters to primary care providers to identify common concerns and recommendations raised by CAPABLE nurses. Primary care providers responded to 86% of phone calls, 56% of letters and 50% of client coaching efforts. Primary care providers addressed 86% of concerns communicated by phone, 34% of concerns communicated by letter and 41% of client-raised concerns. Nurses' letters addressed five key concerns: medication safety, pain, change in activities of daily living, fall safety and mental health. In letters, CAPABLE nurses recommended 58 interventions: medication change; referral to a specialist; patient education; and further diagnostic evaluation. Effective communication between home-based nurses and primary care providers enhances care coordination and improves outcomes for home-dwelling elders. Various methods of contact show promise for addressing specific communication needs. Nurses practicing within patients' homes can improve care coordination by using phone calls to address minor matters and written letters for detailed communication. Future research should explore implementation of Situation, Background, Assessment and Recommendation in home care to promote safe and efficient communication. Nurses should empower patients to address concerns directly with providers through use of devices including health passports. © 2016 The Authors. Journal of Clinical Nursing published by John Wiley & Sons Ltd.

  13. Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.

    PubMed

    Chen, Rong; Nixon, Erika; Herskovits, Edward

    2016-04-01

    Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.

  14. PCI fuel failure analysis: a report on a cooperative program undertaken by Pacific Northwest Laboratory and Chalk River Nuclear Laboratories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohr, C.L.; Pankaskie, P.J.; Heasler, P.G.

    Reactor fuel failure data sets in the form of initial power (P_i), final power (P_f), transient increase in power (ΔP), and burnup (Bu) were obtained for pressurized heavy water reactors (PHWRs), boiling water reactors (BWRs), and pressurized water reactors (PWRs). These data sets were evaluated and used as the basis for developing two predictive fuel failure models: a graphical concept called the PCI-OGRAM, and a nonlinear-regression-based model called PROFIT. The PCI-OGRAM is an extension of the FUELOGRAM developed by AECL. It is based on a critical threshold concept for stress-dependent stress corrosion cracking. The PROFIT model, developed at Pacific Northwest Laboratory, is the result of applying standard statistical regression methods to the available PCI fuel failure data and an analysis of the environmental and strain-rate-dependent stress-strain properties of the Zircaloy cladding.

  15. Explicit versus implicit motivations: Clarifying how experiences affect turkey hunter satisfaction using revised importance-performance, importance grid, and penalty-reward-contrast analyses

    USGS Publications Warehouse

    Schroeder, Susan A.; Cornicelli, Louis; Fulton, David C.; Merchant, Steven S.

    2018-01-01

    Although research has advanced methods for clarifying factors that relate to customer satisfaction, they have not been embraced by leisure researchers. Using results from a survey of wild turkey hunters, we applied traditional and revised importance-performance (IPA/RIPA), importance-grid analysis (IGA), and penalty-reward-contrast analysis (PRCA) to examine how activity-specific factors influenced satisfaction. Results suggested differences between the explicit and implicit importance of factors related to turkey hunting. Opportunities to kill turkeys were explicitly rated as less important than seeing, hearing, or calling in turkeys, but opportunities for harvest had relatively higher levels of implicit importance. PRCA identified “calling turkeys in” and “hearing gobbling” as minimum requirements that cause dissatisfaction if not fulfilled, but do not provide satisfaction, whereas “seeing turkeys” and an “opportunity to kill a turkey” related to both satisfaction and dissatisfaction. RIPA, IGA, and PRCA could provide valuable insights about factors that may improve satisfaction for leisure participants.

  16. Multiplex cDNA quantification method that facilitates the standardization of gene expression data

    PubMed Central

    Gotoh, Osamu; Murakami, Yasufumi; Suyama, Akira

    2011-01-01

    Microarray-based gene expression measurement is one of the major methods for transcriptome analysis. However, current microarray data are substantially affected by microarray platforms and RNA references because the microarray method provides merely the relative amounts of gene expression levels. Therefore, valid comparisons of microarray data require standardized platforms, internal and/or external controls, and complicated normalizations. These requirements impose limitations on the extensive comparison of gene expression data. Here, we report an effective approach to removing these unfavorable limitations by measuring the absolute amounts of gene expression levels on common DNA microarrays. We have developed a multiplex cDNA quantification method called GEP-DEAN (gene expression profiling by DCN-encoding-based analysis). The method was validated using chemically synthesized DNA strands of known quantities and cDNA samples prepared from mouse liver, demonstrating that the absolute amounts of cDNA strands were successfully measured with a sensitivity of 18 zmol in a highly multiplexed manner in 7 h. PMID:21415008

  17. [Methods of the multivariate statistical analysis of so-called polyetiological diseases using the example of coronary heart disease].

    PubMed

    Lifshits, A M

    1979-01-01

    General characteristics of multivariate statistical analysis (MSA) are given. Methodical premises and criteria for the selection of an MSA method adequate for pathoanatomic investigations of the epidemiology of multicausal diseases are presented. The experience of using MSA with computers and standard computing programs in studies of coronary artery atherosclerosis on the material of 2060 autopsies is described. The combined use of four MSA methods (sequential, correlational, regressional, and discriminant) made it possible to quantify the contribution of each of the eight examined risk factors to the development of atherosclerosis. The most important factors were found to be age, arterial hypertension, and heredity. Occupational hypodynamia and increased fatness were more important in men, whereas diabetes mellitus was more important in women. Registration of this combination of risk factors by MSA methods provides a more reliable prognosis of the likelihood of coronary heart disease with a fatal outcome than a prognosis based on the degree of coronary atherosclerosis alone.

  18. What makes a contraceptive acceptable?

    PubMed

    Berer, M

    1995-01-01

    The women's health movement is developing an increasing number of negative campaigns against various contraceptive methods based on three assumptions: 1) user-controlled methods are better for women than provider-controlled methods, 2) long-acting methods are undesirable because of their susceptibility to abuse, and 3) systemic methods carry unacceptable health risks to women. While these objections have sparked helpful debate, criticizing an overreliance on such methods is one thing and calling for bans on the provision of injectables and implants and on the development of vaccine contraceptives is another. Examination of the terms "provider-controlled," "user-controlled," and "long-acting" reveals that their definitions are not as clear-cut as opponents would have us believe. Some women's health advocates find the methods that are long-acting and provider-controlled to be the most problematic. They also criticize the near 100% contraceptive effectiveness of the long-acting methods despite the fact that the goal of contraception is to prevent pregnancy. It is wrong to condemn these methods because of their link to population control policies of the 1960s, and it is important to understand that long-acting, effective methods are often beneficial to women who require contraception for 20-22 years of their lives. Arguments against systemic methods (including RU-486 for early abortion and contraceptive vaccines) revolve around issues of safety. Feminists have gone so far as to create an intolerable situation by publishing books that criticize these methods based on erroneous conclusions and faulty scientific analysis. While women's health advocates have always rightly called for bans on abuse of various methods, they have not extended this ban to the methods themselves. In settings where other methods are not available, bans can lead to harm or maternal deaths. Another perspective can be used to consider methods in terms of their relationship with the user (repeated application). While feminists have called for more barrier and natural methods, most people in the world today refuse to use condoms even though they are the best protection from infection. Instead, science should pursue promising new methods as well as continue to improve existing methods and to fill important gaps. Feminists should be advocates for women and their diverse needs rather than advocates against specific contraceptive methods.

  19. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.

  20. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science

    PubMed Central

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883

  1. Graph-based analysis of kinetics on multidimensional potential-energy surfaces.

    PubMed

    Okushima, T; Niiyama, T; Ikeda, K S; Shimizu, Y

    2009-09-01

    The aim of this paper is twofold: one is to give a detailed description of an alternative graph-based analysis method, which we call saddle connectivity graph, for analyzing the global topography and the dynamical properties of many-dimensional potential-energy landscapes and the other is to give examples of applications of this method in the analysis of the kinetics of realistic systems. A Dijkstra-type shortest path algorithm is proposed to extract dynamically dominant transition pathways by kinetically defining transition costs. The applicability of this approach is first confirmed by an illustrative example of a low-dimensional random potential. We then show that a coarse-graining procedure tailored for saddle connectivity graphs can be used to obtain the kinetic properties of 13- and 38-atom Lennard-Jones clusters. The coarse-graining method not only reduces the complexity of the graphs, but also, with iterative use, reveals a self-similar hierarchical structure in these clusters. We also propose that the self-similarity is common to many-atom Lennard-Jones clusters.
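
    A hedged sketch of the pathway-extraction step: Dijkstra's algorithm on a minima/saddle network whose edge costs are kinetically defined. The Arrhenius-like cost used here is an illustrative choice, not the cost definition of the paper:

        import heapq
        import math

        def dominant_path(edges, start, goal, kT=1.0):
            """edges: {node: [(neighbor, barrier_height), ...]}."""
            cost = lambda dE: dE / kT   # -log of an Arrhenius rate, up to a constant
            dist, prev = {start: 0.0}, {}
            pq = [(0.0, start)]
            while pq:
                d, u = heapq.heappop(pq)
                if u == goal:
                    break
                if d > dist.get(u, math.inf):
                    continue                     # stale queue entry
                for v, dE in edges.get(u, []):
                    nd = d + cost(dE)
                    if nd < dist.get(v, math.inf):
                        dist[v], prev[v] = nd, u
                        heapq.heappush(pq, (nd, v))
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return path[::-1]

        edges = {"A": [("B", 1.0), ("C", 3.0)], "B": [("C", 0.5)], "C": []}
        print(dominant_path(edges, "A", "C"))    # ['A', 'B', 'C']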

  2. Multiscale multifractal time irreversibility analysis of stock markets

    NASA Astrophysics Data System (ADS)

    Jiang, Chenguang; Shang, Pengjian; Shi, Wenbin

    2016-11-01

    Time irreversibility is one of the most important properties of nonstationary time series. Complex time series often demonstrate even multiscale time irreversibility, such that not only the original but also coarse-grained time series are asymmetric over a wide range of scales. We study the multiscale time irreversibility of time series. In this paper, we develop a method called multiscale multifractal time irreversibility analysis (MMRA), which allows us to extend the description of time irreversibility to include the dependence on the segment size and statistical moments. We test the effectiveness of MMRA in detecting multifractality and time irreversibility of time series generated from delayed Henon map and binomial multifractal model. Then we employ our method to the time irreversibility analysis of stock markets in different regions. We find that the emerging market has higher multifractality degree and time irreversibility compared with developed markets. In this sense, the MMRA method may provide new angles in assessing the evolution stage of stock markets.

  3. The cross-correlation analysis of multi property of stock markets based on MM-DFA

    NASA Astrophysics Data System (ADS)

    Yang, Yujun; Li, Jianping; Yang, Yimei

    2017-09-01

    In this paper, we propose a new method called DH-MXA based on distribution histograms of Hurst surfaces and multiscale multifractal detrended fluctuation analysis. The method allows us to investigate the cross-correlation characteristics among multiple properties of different stock time series. It may provide a new way of measuring the nonlinearity of several signals. It can also provide a more stable and faithful description of the cross-correlation of multiple properties of stocks. DH-MXA helps us to present much richer information than multifractal detrended cross-correlation analysis and allows us to assess many universal and subtle cross-correlation characteristics of stock markets. We demonstrate DH-MXA by selecting four artificial data sets and five properties of four stock time series from different countries. The results show that our proposed method can be adapted to investigate the cross-correlation of stock markets. In general, the American stock markets are more mature and less volatile than the Chinese stock markets.

  4. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
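
    A hedged reference sketch of the natural visibility graph construction that underlies such methods (O(n^2), written for clarity rather than speed): samples i and j are linked whenever every intermediate sample lies strictly below the straight line connecting them.

        def visibility_edges(x):
            """Natural visibility graph edges of the series x."""
            n, edges = len(x), []
            for i in range(n):
                for j in range(i + 1, n):
                    # Every sample between i and j must sit below the i-j line.
                    if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                           for k in range(i + 1, j)):
                        edges.append((i, j))
            return edges

        print(visibility_edges([3.0, 1.0, 2.0, 4.0]))
        # -> all six pairs are mutually visible for this short series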

  5. Multiscale multifractal detrended cross-correlation analysis of financial time series

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Wang, Jing; Lin, Aijing

    2014-06-01

    In this paper, we introduce a method called multiscale multifractal detrended cross-correlation analysis (MM-DCCA). The method allows us to extend the description of the cross-correlation properties between two time series. MM-DCCA may provide new ways of measuring the nonlinearity of two signals, and it helps to present much richer information than multifractal detrended cross-correlation analysis (MF-DCCA) by sweeping the entire range of scales at which the multifractal structures of a complex system are discussed. Moreover, to illustrate the advantages of this approach we make use of MM-DCCA to analyze the cross-correlation properties between financial time series. We show that this new method can be adapted to investigate the stock markets under study. It can provide a more faithful and more interpretable description of the dynamic mechanism between financial time series than traditional MF-DCCA. We also propose to reduce the scale ranges to analyze short time series, and some inherent properties which remain hidden when a wide range is used may exhibit perfectly in this way.
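
    A hedged sketch of the detrended cross-correlation fluctuation function F(s) that MM-DCCA sweeps over scales; boundary handling and the moment parameter q are simplified relative to the paper:

        import numpy as np

        def dcca_fluctuation(x, y, s):
            """Detrended cross-correlation fluctuation at scale s."""
            X, Y = np.cumsum(x - np.mean(x)), np.cumsum(y - np.mean(y))
            n = len(X) // s
            f2 = []
            for v in range(n):
                idx = np.arange(v * s, (v + 1) * s)
                px = np.polyval(np.polyfit(idx, X[idx], 1), idx)  # local linear trend
                py = np.polyval(np.polyfit(idx, Y[idx], 1), idx)
                f2.append(np.mean((X[idx] - px) * (Y[idx] - py)))
            return np.sqrt(np.abs(np.mean(f2)))

        rng = np.random.default_rng(1)
        z = rng.standard_normal(1024)
        # Autocase x = y reduces to ordinary DFA of the series z.
        print([round(dcca_fluctuation(z, z, s), 3) for s in (16, 32, 64)])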

  6. Problems of Mathematical Finance by Stochastic Control Methods

    NASA Astrophysics Data System (ADS)

    Stettner, Łukasz

    The purpose of this paper is to present main ideas of mathematics of finance using the stochastic control methods. There is an interplay between stochastic control and mathematics of finance. On the one hand stochastic control is a powerful tool to study financial problems. On the other hand financial applications have stimulated development in several research subareas of stochastic control in the last two decades. We start with pricing of financial derivatives and modeling of asset prices, studying the conditions for the absence of arbitrage. Then we consider pricing of defaultable contingent claims. Investments in bonds lead us to the term structure modeling problems. Special attention is devoted to historical static portfolio analysis called Markowitz theory. We also briefly sketch dynamic portfolio problems using viscosity solutions to Hamilton-Jacobi-Bellman equation, martingale-convex analysis method or stochastic maximum principle together with backward stochastic differential equation. Finally, long time portfolio analysis for both risk neutral and risk sensitive functionals is introduced.
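
    As a hedged illustration of the static Markowitz analysis mentioned above, the minimum-variance weights for a target mean return follow from sample moments by solving a small linear system (the expected returns and covariance below are toy values):

        import numpy as np

        mu = np.array([0.08, 0.05, 0.03])              # expected returns (assumed)
        Sigma = np.array([[0.040, 0.006, 0.002],
                          [0.006, 0.025, 0.004],
                          [0.002, 0.004, 0.010]])      # return covariance (assumed)

        # Minimise w'Sigma w  subject to  w'mu = r  and  w'1 = 1.
        ones = np.ones(3)
        Si = np.linalg.inv(Sigma)
        A, B, C = ones @ Si @ ones, ones @ Si @ mu, mu @ Si @ mu
        r = 0.06                                       # target mean return
        lam, gam = np.linalg.solve([[A, B], [B, C]], [1.0, r])
        w = Si @ (lam * ones + gam * mu)               # optimal weights
        print(w, w @ mu, w @ Sigma @ w)                # weights, mean, variance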

  7. Conjoint analysis: using a market-based research model for healthcare decision making.

    PubMed

    Mele, Nancy L

    2008-01-01

    Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. To describe the conjoint analysis methodology and explore value-added applications in nursing research. Conjoint analysis methodology is described, using examples from the healthcare and business literature, and personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.

  8. Simplified and refined structural modeling for economical flutter analysis and design

    NASA Technical Reports Server (NTRS)

    Ricketts, R. H.; Sobieszczanski, J.

    1977-01-01

    A coordinated use of two finite-element models of different levels of refinement is presented to reduce the computer cost of the repetitive flutter analysis commonly encountered in structural resizing to meet flutter requirements. One model, termed a refined model (RM), represents a high degree of detail needed for strength-sizing and flutter analysis of an airframe. The other model, called a simplified model (SM), has a relatively much smaller number of elements and degrees-of-freedom. A systematic method of deriving an SM from a given RM is described. The method consists of judgmental and numerical operations to make the stiffness and mass of the SM elements equivalent to the corresponding substructures of RM. The structural data are automatically transferred between the two models. The bulk of analysis is performed on the SM with periodical verifications carried out by analysis of the RM. In a numerical example of a supersonic cruise aircraft with an arrow wing, this approach permitted substantial savings in computer costs and acceleration of the job turn-around.

  9. Collaborative Research with Chinese, Indian, Filipino and North European Research Organizations on Infectious Disease Epidemics.

    PubMed

    Sumi, Ayako; Kobayashi, Nobumichi

    2017-01-01

    In this report, we present a short review of applications of time series analysis, which consists of spectral analysis based on the maximum entropy method in the frequency domain and the least squares method in the time domain, to the incidence data of infectious diseases. This report consists of three parts. First, we present our results obtained by collaborative research on infectious disease epidemics with Chinese, Indian, Filipino and North European research organizations. Second, we present the results obtained with the Japanese infectious disease surveillance data and the time series numerically generated from a mathematical model, called the susceptible/exposed/infectious/recovered (SEIR) model. Third, we present an application of the time series analysis to pathologic tissues to examine the usefulness of time series analysis for investigating the spatial pattern of pathologic tissue. It is anticipated that time series analysis will become a useful tool for investigating not only infectious disease surveillance data but also immunological and genetic tests.
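
    A hedged sketch of the time-domain step described above: fitting a seasonal cycle of known period to incidence counts by linear least squares. The weekly series below is synthetic; in practice the period would come from the maximum entropy spectrum:

        import numpy as np

        t = np.arange(0, 520) / 52.0          # weekly observations over 10 years
        rng = np.random.default_rng(2)
        y = 100 + 30 * np.sin(2 * np.pi * t) + rng.normal(0, 5, t.size)

        period = 1.0                          # dominant period in years (assumed known)
        X = np.column_stack([np.ones_like(t),
                             np.sin(2 * np.pi * t / period),
                             np.cos(2 * np.pi * t / period)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        amplitude = np.hypot(beta[1], beta[2])
        print(round(beta[0], 1), round(amplitude, 1))   # ~100 and ~30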

  10. Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Brune, Ryan Carl

    Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved with respect to output pressure events in both magnitude and distribution. In order to address this need, a novel pressure measurement method, called the Profile Indentation Pressure Evaluation (PIPE) method, has been developed that systematically analyzes indentation patterns created by impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows for colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided introduction of a design method for electromagnetic path actuator systems, where key geometrical variables are considered using a newly developed analysis method called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by considering the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow for pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, the effect of geometry is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Overall, these studies comprehensively explore the effects of geometrical parameters on the magnitude and distribution of impulse manufacturing generated pressure, establishing key guidelines and models for continued development and implementation in commercial applications.

  11. Time-dependent structural transformation analysis to high-level Petri net model with active state transition diagram

    PubMed Central

    2010-01-01

    Background: With an accumulation of in silico data obtained by simulating large-scale biological networks, a new research interest is emerging in elucidating how a living organism functions over time in its cells. Investigating the dynamic features of current computational models promises a deeper understanding of complex cellular processes. This leads us to develop a method that utilizes structural properties of the model over all simulation time steps. Further, user-friendly overviews of dynamic behaviors can provide great help in understanding the variations of system mechanisms. Results: We propose a novel method for constructing and analyzing a so-called active state transition diagram (ASTD) by using time-course simulation data of a high-level Petri net. Our method includes two new algorithms. The first algorithm extracts a series of subnets (called temporal subnets) reflecting biological components contributing to the dynamics, while retaining positive mathematical qualities. The second one creates an ASTD composed of unique temporal subnets. ASTD provides users with concise information allowing them to grasp and trace how a key regulatory subnet and/or a network changes with time. The applicability of our method is demonstrated by the analysis of the underlying model for circadian rhythms in Drosophila. Conclusions: Building ASTD is a useful means to convert a hybrid model dealing with discrete, continuous and more complicated events to finite time-dependent states. Based on ASTD, various analytical approaches can be applied to obtain new insights into not only systematic mechanisms but also dynamics. PMID:20356411

  12. Automating the Transformational Development of Software. Volume 1.

    DTIC Science & Technology

    1983-03-01

    The DRACO system [Neighbors 80] uses meta-rules to derive information about which new transformations will be applicable after a particular transformation has ... transformation over another. The new model, as incorporated in a system called Glitter, explicitly represents transformation goals, methods, and selection ... done anew for each new problem (compare this with Neighbors' Draco system [Neighbors 80], which attempts to reuse domain analysis).

  13. Seasheds, a Sealift Enhancement Feature: An Analysis of Methods Employed for Lifting DoD’s Outsize Cargo

    DTIC Science & Technology

    1993-09-23

    Excerpt garbled in the source scan; the recoverable fragments concern commercial U.S.-flag liner vessels that do not get called into government service (Ackley, 1992), a section titled "1. The Transportation Problems Associated with Seasheds" on commercial U.S.-flag liner trade, and a table fragment noting that holds 1, 2, 6, & 7 are for 20' containers only, with 12 seasheds and 6 CCSAs listed for the ship class.

  14. Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation

    PubMed Central

    Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.

    2013-01-01

    Pitch is well known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as a musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by applying it to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions embedded in realistic time series that drift and include pitch breaks, but also revealed that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
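
    The cents parameterization itself is compact enough to state in code; a hedged sketch, with hypothetical peak word-pitches and an illustrative choice of reference frequency:

        import numpy as np

        def cents(f_hz, f_ref_hz):
            """Musical interval in cents between f and the reference
            (1200 cents per octave)."""
            return 1200.0 * np.log2(np.asarray(f_hz) / f_ref_hz)

        peaks = [110.0, 123.5, 98.0, 146.8]   # peak word-pitches in Hz (hypothetical)
        print(np.round(cents(peaks, 110.0)))  # -> 0, ~200, ~-200, ~500 cents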

  15. Detecting and Estimating Contamination of Human DNA Samples in Sequencing and Array-Based Genotype Data

    PubMed Central

    Jun, Goo; Flickinger, Matthew; Hetrick, Kurt N.; Romm, Jane M.; Doheny, Kimberly F.; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2012-01-01

    DNA sample contamination is a serious problem in DNA sequencing studies and may result in systematic genotype misclassification and false positive associations. Although methods exist to detect and filter out cross-species contamination, few methods to detect within-species sample contamination are available. In this paper, we describe methods to identify within-species DNA sample contamination based on (1) a combination of sequencing reads and array-based genotype data, (2) sequence reads alone, and (3) array-based genotype data alone. Analysis of sequencing reads allows contamination detection after sequence data is generated but prior to variant calling; analysis of array-based genotype data allows contamination detection prior to generation of costly sequence data. Through a combination of analysis of in silico and experimentally contaminated samples, we show that our methods can reliably detect and estimate levels of contamination as low as 1%. We evaluate the impact of DNA contamination on genotype accuracy and propose effective strategies to screen for and prevent DNA contamination in sequencing studies. PMID:23103226

  16. Report on the analysis of common beverages spiked with gamma-hydroxybutyric acid (GHB) and gamma-butyrolactone (GBL) using NMR and the PURGE solvent-suppression technique.

    PubMed

    Lesar, Casey T; Decatur, John; Lukasiewicz, Elaan; Champeil, Elise

    2011-10-10

    In forensic evidence, the identification and quantitation of gamma-hydroxybutyric acid (GHB) in "spiked" beverages are challenging. In this report, we present the analysis of common alcoholic beverages found in clubs and bars spiked with gamma-hydroxybutyric acid (GHB) and gamma-butyrolactone (GBL). Our analysis of the spiked beverages consisted of using ¹H NMR with a water-suppression method called Presaturation Utilizing Relaxation Gradients and Echoes (PURGE). The following beverages were analyzed: water, 10% ethanol in water, vodka-cranberry juice, rum and coke, gin and tonic, whisky and diet coke, white wine, red wine, and beer. The PURGE method allowed for the direct identification and quantitation of both compounds in all beverages except red and white wine, where small interferences prevented accurate quantitation. The NMR method presented in this paper utilizes PURGE water suppression. Thanks to the use of a capillary internal standard, the method is fast, non-destructive, and sensitive, and it requires no sample preparation which could disrupt the equilibrium between GHB and GBL. Published by Elsevier Ireland Ltd.

  17. Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers

    PubMed Central

    García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta

    2016-01-01

    The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653

  18. Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers.

    PubMed

    García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta

    2016-06-29

    The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine.
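
    The classification task is easy to picture: learn the stability regions of the (rock mass rating, span) plane from case histories. The sketch below uses scikit-learn's SVM on invented data; the decision rule generating the labels is hypothetical, standing in for the Canadian case-history database.

```python
import numpy as np
from sklearn.svm import SVC

# Sketch of the span-graph classification idea on invented data: larger
# spans in weaker rock are unstable (0), a band around the boundary is
# potentially unstable (1), the rest stable (2).
rng = np.random.default_rng(1)
rmr = rng.uniform(20, 90, 300)           # rock mass rating
span = rng.uniform(1, 30, 300)           # excavation span (m)
margin = span - 0.3 * rmr                # hypothetical stability margin
y = np.where(margin > 3, 0, np.where(margin > -3, 1, 2))

clf = SVC(kernel="rbf", gamma="scale").fit(np.c_[rmr, span], y)
print(clf.predict([[60.0, 12.0]]))       # predicted stability class
```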

  19. Diverse expected gradient active learning for relative attributes.

    PubMed

    You, Xinge; Wang, Ruxin; Tao, Dacheng

    2014-07-01

The use of relative attributes for semantic understanding of images and videos is a promising way to improve communication between humans and machines. However, it is extremely labor- and time-consuming to define multiple attributes for each instance in a large amount of data. One option is to incorporate active learning, so that the informative samples can be actively discovered and then labeled. However, most existing active-learning methods select samples one at a time (serial mode), and may therefore lose efficiency when learning multiple attributes. In this paper, we propose a batch-mode active-learning method, called diverse expected gradient active learning. This method integrates an informativeness analysis and a diversity analysis to form a diverse batch of queries. Specifically, the informativeness analysis employs the expected pairwise gradient length as a measure of informativeness, while the diversity analysis forces a constraint on the proposed diverse gradient angle. Since simultaneous optimization of these two parts is intractable, we utilize a two-step procedure to obtain the diverse batch of queries. A heuristic method is also introduced to suppress imbalanced multiclass distributions. Empirical evaluations of three different databases demonstrate the effectiveness and efficiency of the proposed approach.

  20. Diverse Expected Gradient Active Learning for Relative Attributes.

    PubMed

    You, Xinge; Wang, Ruxin; Tao, Dacheng

    2014-06-02

The use of relative attributes for semantic understanding of images and videos is a promising way to improve communication between humans and machines. However, it is extremely labor- and time-consuming to define multiple attributes for each instance in a large amount of data. One option is to incorporate active learning, so that the informative samples can be actively discovered and then labeled. However, most existing active-learning methods select samples one at a time (serial mode), and may therefore lose efficiency when learning multiple attributes. In this paper, we propose a batch-mode active-learning method, called Diverse Expected Gradient Active Learning (DEGAL). This method integrates an informativeness analysis and a diversity analysis to form a diverse batch of queries. Specifically, the informativeness analysis employs the expected pairwise gradient length as a measure of informativeness, while the diversity analysis forces a constraint on the proposed diverse gradient angle. Since simultaneous optimization of these two parts is intractable, we utilize a two-step procedure to obtain the diverse batch of queries. A heuristic method is also introduced to suppress imbalanced multi-class distributions. Empirical evaluations of three different databases demonstrate the effectiveness and efficiency of the proposed approach.
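
    The two-step procedure the abstract describes (rank by expected gradient length, then enforce angular diversity) admits a compact greedy sketch. This is a simplified illustration, not the authors' implementation; the gradient vectors below are random stand-ins.

```python
import numpy as np

# Two-step batch selection in the spirit of DEGAL (simplified sketch):
# rank candidates by gradient length (informativeness), then greedily keep
# only those whose gradient direction differs enough from already-selected
# candidates (diversity via a minimum pairwise angle).
def select_batch(grads, batch_size, min_angle_deg=30.0):
    """grads: (n_samples, dim) expected-gradient vectors per candidate."""
    norms = np.linalg.norm(grads, axis=1)
    order = np.argsort(-norms)               # most informative first
    cos_max = np.cos(np.radians(min_angle_deg))
    chosen = []
    for i in order:
        g = grads[i] / (norms[i] + 1e-12)
        if all(abs(g @ (grads[j] / norms[j])) < cos_max for j in chosen):
            chosen.append(i)
        if len(chosen) == batch_size:
            break
    return chosen

grads = np.random.default_rng(2).normal(size=(200, 16))
print(select_batch(grads, batch_size=5))
```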

  1. Discriminant analysis in wildlife research: Theory and applications

    USGS Publications Warehouse

    Williams, B.K.; Capen, D.E.

    1981-01-01

    Discriminant analysis, a method of analyzing grouped multivariate data, is often used in ecological investigations. It has both a predictive and an explanatory function, the former aiming at classification of individuals of unknown group membership. The goal of the latter function is to exhibit group separation by means of linear transforms, and the corresponding method is called canonical analysis. This discussion focuses on the application of canonical analysis in ecology. In order to clarify its meaning, a parametric approach is taken instead of the usual data-based formulation. For certain assumptions the data-based canonical variates are shown to result from maximum likelihood estimation, thus insuring consistency and asymptotic efficiency. The distorting effects of covariance heterogeneity are examined, as are certain difficulties which arise in interpreting the canonical functions. A 'distortion metric' is defined, by means of which distortions resulting from the canonical transformation can be assessed. Several sampling problems which arise in ecological applications are considered. It is concluded that the method may prove valuable for data exploration, but is of limited value as an inferential procedure.
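
    For the explanatory (canonical) use of discriminant analysis described above, scikit-learn's LDA extracts the group-separating linear transforms directly. The toy data stand in for, say, morphometric measurements of animals from three habitats.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Canonical (discriminant) analysis of grouped multivariate data: project
# onto linear transforms that maximize between-group separation.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(m, 1.0, size=(50, 4)) for m in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], 50)                  # three groups, 50 cases each

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
scores = lda.transform(X)                     # canonical variates
print(lda.explained_variance_ratio_)          # separation per canonical axis
```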

  2. The admixture maximum likelihood test to test for association between rare variants and disease phenotypes.

    PubMed

    Tyrer, Jonathan P; Guo, Qi; Easton, Douglas F; Pharoah, Paul D P

    2013-06-06

The development of genotyping arrays containing hundreds of thousands of rare variants across the genome and advances in high-throughput sequencing technologies have made feasible empirical genetic association studies to search for rare disease susceptibility alleles. As single variant testing is underpowered to detect associations, the development of statistical methods to combine analysis across variants - so-called "burden tests" - is an area of active research interest. We previously developed a method, the admixture maximum likelihood test, to test multiple, common variants for association with a trait of interest. We have extended this method, called the rare admixture maximum likelihood test (RAML), for the analysis of rare variants. In this paper we compare the performance of RAML with six other burden tests designed to test for association of rare variants. We used simulation testing over a range of scenarios to test the power of RAML compared to the other rare variant association testing methods. These scenarios modelled differences in effect variability, the average direction of effect and the proportion of associated variants. We evaluated the power for all the different scenarios. RAML tended to have the greatest power for most scenarios where the proportion of associated variants was small, whereas SKAT-O performed a little better for the scenarios with a higher proportion of associated variants. The RAML method makes no assumptions about the proportion of variants that are associated with the phenotype of interest or the magnitude and direction of their effect. The method is flexible and can be applied to both dichotomous and quantitative traits and allows for the inclusion of covariates in the underlying regression model. The RAML method performed well compared to the other methods over a wide range of scenarios. Generally power was moderate in most of the scenarios, underscoring the need for large sample sizes in any form of association testing.
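
    For readers unfamiliar with the "burden test" family the abstract compares against, the simplest member collapses rare genotypes into a per-subject allele count and regresses phenotype on it. The sketch below is that generic collapsing test on simulated data, not RAML itself.

```python
import numpy as np
import statsmodels.api as sm

# Generic collapsing "burden" test (not RAML): regress case/control status
# on each subject's count of rare minor alleles across a gene region.
rng = np.random.default_rng(4)
n, m = 2000, 25                                  # subjects, rare variants
geno = rng.binomial(2, 0.005, size=(n, m))       # rare genotypes (0/1/2)
burden = geno.sum(axis=1)                        # per-subject allele count
logit_p = -1.0 + 0.5 * burden                    # simulated risk model
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(burden.astype(float))
fit = sm.Logit(y, X).fit(disp=0)
print(fit.params[1], fit.pvalues[1])             # burden effect and p-value
```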

  3. DendroBLAST: approximate phylogenetic trees in the absence of multiple sequence alignments.

    PubMed

    Kelly, Steven; Maini, Philip K

    2013-01-01

    The rapidly growing availability of genome information has created considerable demand for both fast and accurate phylogenetic inference algorithms. We present a novel method called DendroBLAST for reconstructing phylogenetic dendrograms/trees from protein sequences using BLAST. This method differs from other methods by incorporating a simple model of sequence evolution to test the effect of introducing sequence changes on the reliability of the bipartitions in the inferred tree. Using realistic simulated sequence data we demonstrate that this method produces phylogenetic trees that are more accurate than other commonly-used distance based methods though not as accurate as maximum likelihood methods from good quality multiple sequence alignments. In addition to tests on simulated data, we use DendroBLAST to generate input trees for a supertree reconstruction of the phylogeny of the Archaea. This independent analysis produces an approximate phylogeny of the Archaea that has both high precision and recall when compared to previously published analysis of the same dataset using conventional methods. Taken together these results demonstrate that approximate phylogenetic trees can be produced in the absence of multiple sequence alignments, and we propose that these trees will provide a platform for improving and informing downstream bioinformatic analysis. A web implementation of the DendroBLAST method is freely available for use at http://www.dendroblast.com/.
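
    The core move is turning pairwise BLAST scores into distances and clustering without any alignment. The sketch below uses an invented bit-score matrix and scipy average linkage as a simple stand-in for DendroBLAST's own tree construction.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

# Alignment-free dendrogram sketch: normalize pairwise BLAST bit-scores to
# similarities, convert to distances, and cluster. The score matrix is
# invented; real input would come from all-vs-all BLAST of the proteins.
scores = np.array([[200, 150,  90,  80],
                   [150, 210,  95,  85],
                   [ 90,  95, 190, 160],
                   [ 80,  85, 160, 205]], dtype=float)
sim = scores / np.sqrt(np.outer(np.diag(scores), np.diag(scores)))
dist = 1.0 - sim
np.fill_diagonal(dist, 0.0)
tree = linkage(squareform(dist, checks=False), method="average")
print(tree)   # rows: merged clusters, third column = merge distance
```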

  4. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    PubMed

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down-to-earth quantitative analysis works well for the CluPA-aligned spectra. The whole workflow is embedded into a modular and statistically sound framework that is implemented as an R package called "speaq" ("spectrum alignment and quantitation"), which is freely available from http://code.google.com/p/speaq/.
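
    The FFT cross-correlation step the abstract mentions, used to estimate the misalignment between reference and target segments, is a few lines of numpy. This is a minimal sketch of that one step; the real algorithm applies it within a hierarchical peak-cluster tree.

```python
import numpy as np

# Estimate the shift between a reference and a target spectrum segment via
# FFT cross-correlation (one ingredient of the CluPA-style alignment).
def fft_shift_estimate(reference, target):
    n = len(reference)
    xcorr = np.fft.ifft(np.fft.fft(reference) * np.conj(np.fft.fft(target)))
    lag = int(np.argmax(np.abs(xcorr)))
    return lag if lag <= n // 2 else lag - n    # signed shift in points

ref = np.zeros(512); ref[250] = 1.0             # a peak at index 250
tgt = np.roll(ref, 7)                           # same peak shifted by +7
print(fft_shift_estimate(ref, tgt))             # -7: np.roll(tgt, -7) realigns
```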

  5. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Britton, L.J.; Greeson, P.E.

    1989-01-01

The series of chapters on techniques describes methods used by the U.S. Geological Survey for planning and conducting water-resources investigations. The material is arranged under major subject headings called books and is further subdivided into sections and chapters. Book 5 is on laboratory analysis. Section A is on water. The unit of publication, the chapter, is limited to a narrow field of subject matter. "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" is the fourth chapter to be published under Section A of Book 5. The chapter number includes the letter of the section. This chapter was prepared by several aquatic biologists and microbiologists of the U.S. Geological Survey to provide accurate and precise methods for the collection and analysis of aquatic biological and microbiological samples. Use of brand, firm, and trade names in this chapter is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey. This chapter supersedes "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" edited by P.E. Greeson, T.A. Ehlke, G.A. Irwin, B.W. Lium, and K.V. Slack (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4, 1977) and also supersedes "A Supplement to 'Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples'" by P.E. Greeson (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4), Open-File Report 79-1279, 1979.

  6. Advertisement call and genetic structure conservatism: good news for an endangered Neotropical frog

    PubMed Central

    Costa, William P.; Martins, Lucas B.; Nunes-de-Almeida, Carlos H. L.; Toledo, Luís Felipe

    2016-01-01

    Background: Many amphibian species are negatively affected by habitat change due to anthropogenic activities. Populations distributed over modified landscapes may be subject to local extinction or may be relegated to the remaining—likely isolated and possibly degraded—patches of available habitat. Isolation without gene flow could lead to variability in phenotypic traits owing to differences in local selective pressures such as environmental structure, microclimate, or site-specific species assemblages. Methods: Here, we tested the microevolution hypothesis by evaluating the acoustic parameters of 349 advertisement calls from 15 males from six populations of the endangered amphibian species Proceratophrys moratoi. In addition, we analyzed the genetic distances among populations and the genetic diversity with a haplotype network analysis. We performed cluster analysis on acoustic data based on the Bray-Curtis index of similarity, using the UPGMA method. We correlated acoustic dissimilarities (calculated by Euclidean distance) with geographical and genetic distances among populations. Results: Spectral traits of the advertisement call of P. moratoi presented lower coefficients of variation than did temporal traits, both within and among males. Cluster analyses placed individuals without congruence in population or geographical distance, but recovered the species topology in relation to sister species. The genetic distance among populations was low; it did not exceed 0.4% for the most distant populations, and was not correlated with acoustic distance. Discussion: Both acoustic features and genetic sequences are highly conserved, suggesting that populations could be connected by recent migrations, and that they are subject to stabilizing selective forces. Although further studies are required, these findings add to a growing body of literature suggesting that this species would be a good candidate for a reintroduction program without negative effects on communication or genetic impact. PMID:27190717
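
    The clustering recipe named in the Methods (Bray-Curtis similarity with UPGMA) maps directly onto scipy. The call-by-parameter matrix below is simulated; real rows would be the measured acoustic parameters of the 15 males.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Bray-Curtis + UPGMA clustering of acoustic parameters (a sketch on
# simulated data: rows = calls, columns = acoustic parameters).
rng = np.random.default_rng(5)
calls = np.abs(rng.normal(loc=[3.0, 1.2, 0.4, 250.0], scale=0.3, size=(15, 4)))

d = pdist(calls, metric="braycurtis")    # pairwise Bray-Curtis dissimilarity
tree = linkage(d, method="average")      # UPGMA = average linkage
print(fcluster(tree, t=2, criterion="maxclust"))  # two-group cut
```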

  7. A cross-sectional study of the association between mobile phone use and symptoms of ill health.

    PubMed

    Cho, Yong Min; Lim, Hee Jin; Jang, Hoon; Kim, Kyunghee; Choi, Jae Wook; Shin, Chol; Lee, Seung Ku; Kwon, Jong Hwa; Kim, Nam

    2016-01-01

This study analyzed the associations of mobile phone call frequency and duration with non-specific symptoms. This study was conducted with a population group including 532 non-patient adults established by the Korean Genome and Epidemiology Study. The pattern of mobile phone use was investigated through face-to-face interviews. The structured instruments applied to quantitatively assess health effects were the Headache Impact Test-6 (HIT-6), Psychosocial Well-being Index-Short Form, Beck Depression Inventory, Korean-Instrumental Activities of Daily Living, Perceived Stress Scale (PSS), Pittsburgh Sleep Quality Index, and 12-item Short Form Health Survey, where a higher score represents a greater health effect. The average daily phone call frequency showed a significant correlation with the PSS score in female subjects. Increases in the average duration of one phone call were significantly correlated with increases in the severity of headaches in both sexes. The mean (standard deviation) HIT-6 score in the subgroup of subjects whose average duration of one phone call was five minutes or longer was 45.98 (8.15), as compared with 42.48 (7.20) in those whose average duration of one phone call was <5 minutes. The severity of headaches was divided into three levels according to the HIT-6 score (little or no impact/moderate impact/substantial or severe impact), and a logistic regression analysis was performed to investigate the association between an increased phone call duration and headache severity. When the average duration of one phone call was five minutes or longer, the odds ratio (OR) and 95% confidence interval (CI) for the moderate impact group were 2.22 and 1.18 to 4.19, respectively. The OR and 95% CI for the substantial or severe impact group were 4.44 and 2.11 to 8.90, respectively. Mobile phone call duration was not significantly associated with stress, sleep, cognitive function, or depression, but was associated with the severity of headaches.
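
    The odds-ratio-with-CI reporting style above comes straight out of a logistic model. A minimal statsmodels sketch on simulated data (the exposure split and effect size are hypothetical, not the study's):

```python
import numpy as np
import statsmodels.api as sm

# Odds ratio with 95% CI from a logistic model: exposure = 1 if the average
# call lasts >= 5 minutes; outcome = headache impact group (binary here).
rng = np.random.default_rng(15)
long_calls = rng.binomial(1, 0.4, 500)
p = 1 / (1 + np.exp(-(-1.0 + 0.8 * long_calls)))   # simulated risk
headache = rng.binomial(1, p)

fit = sm.Logit(headache, sm.add_constant(long_calls.astype(float))).fit(disp=0)
or_ = np.exp(fit.params[1])
lo, hi = np.exp(fit.conf_int()[1])
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```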

  8. Context Analysis of Customer Requests using a Hybrid Adaptive Neuro Fuzzy Inference System and Hidden Markov Models in the Natural Language Call Routing Problem

    NASA Astrophysics Data System (ADS)

    Rustamov, Samir; Mustafayev, Elshan; Clements, Mark A.

    2018-04-01

The context analysis of customer requests in a natural language call routing problem is investigated in this paper. One of the most significant problems in natural language call routing is comprehension of the client's request. With the aim of finding a solution to this issue, hybrid HMM and ANFIS models are examined. Combining different types of models (ANFIS and HMM) can prevent the system from misidentifying user intention in a dialogue system. Based on these models, the hybrid system may be employed in various language and call routing domains, because no lexical or syntactic analysis is used in the classification process.

  9. [Preliminarily application of content analysis to qualitative nursing data].

    PubMed

    Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang

    2012-10-01

    Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.

  10. Spatio-Chromatic Adaptation via Higher-Order Canonical Correlation Analysis of Natural Images

    PubMed Central

    Gutmann, Michael U.; Laparra, Valero; Hyvärinen, Aapo; Malo, Jesús

    2014-01-01

    Independent component and canonical correlation analysis are two general-purpose statistical methods with wide applicability. In neuroscience, independent component analysis of chromatic natural images explains the spatio-chromatic structure of primary cortical receptive fields in terms of properties of the visual environment. Canonical correlation analysis explains similarly chromatic adaptation to different illuminations. But, as we show in this paper, neither of the two methods generalizes well to explain both spatio-chromatic processing and adaptation at the same time. We propose a statistical method which combines the desirable properties of independent component and canonical correlation analysis: It finds independent components in each data set which, across the two data sets, are related to each other via linear or higher-order correlations. The new method is as widely applicable as canonical correlation analysis, and also to more than two data sets. We call it higher-order canonical correlation analysis. When applied to chromatic natural images, we found that it provides a single (unified) statistical framework which accounts for both spatio-chromatic processing and adaptation. Filters with spatio-chromatic tuning properties as in the primary visual cortex emerged and corresponding-colors psychophysics was reproduced reasonably well. We used the new method to make a theory-driven testable prediction on how the neural response to colored patterns should change when the illumination changes. We predict shifts in the responses which are comparable to the shifts reported for chromatic contrast habituation. PMID:24533049

  11. Spatio-chromatic adaptation via higher-order canonical correlation analysis of natural images.

    PubMed

    Gutmann, Michael U; Laparra, Valero; Hyvärinen, Aapo; Malo, Jesús

    2014-01-01

    Independent component and canonical correlation analysis are two general-purpose statistical methods with wide applicability. In neuroscience, independent component analysis of chromatic natural images explains the spatio-chromatic structure of primary cortical receptive fields in terms of properties of the visual environment. Canonical correlation analysis explains similarly chromatic adaptation to different illuminations. But, as we show in this paper, neither of the two methods generalizes well to explain both spatio-chromatic processing and adaptation at the same time. We propose a statistical method which combines the desirable properties of independent component and canonical correlation analysis: It finds independent components in each data set which, across the two data sets, are related to each other via linear or higher-order correlations. The new method is as widely applicable as canonical correlation analysis, and also to more than two data sets. We call it higher-order canonical correlation analysis. When applied to chromatic natural images, we found that it provides a single (unified) statistical framework which accounts for both spatio-chromatic processing and adaptation. Filters with spatio-chromatic tuning properties as in the primary visual cortex emerged and corresponding-colors psychophysics was reproduced reasonably well. We used the new method to make a theory-driven testable prediction on how the neural response to colored patterns should change when the illumination changes. We predict shifts in the responses which are comparable to the shifts reported for chromatic contrast habituation.
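
    The baseline the paper generalizes is standard linear CCA between two views of the same scenes (e.g., image patches under two illuminations). A scikit-learn sketch on synthetic two-view data with shared latent sources:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Standard (linear) canonical correlation analysis, the starting point the
# paper extends with independence and higher-order correlations.
rng = np.random.default_rng(6)
latent = rng.normal(size=(500, 3))                       # shared sources
X = latent @ rng.normal(size=(3, 8)) + 0.1 * rng.normal(size=(500, 8))
Y = latent @ rng.normal(size=(3, 6)) + 0.1 * rng.normal(size=(500, 6))

cca = CCA(n_components=3).fit(X, Y)
U, V = cca.transform(X, Y)
print([np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(3)])  # ~1.0 each
```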

  12. More efficient parameter estimates for factor analysis of ordinal variables by ridge generalized least squares.

    PubMed

    Yuan, Ke-Hai; Jiang, Ge; Cheng, Ying

    2017-11-01

    Data in psychology are often collected using Likert-type scales, and it has been shown that factor analysis of Likert-type data is better performed on the polychoric correlation matrix than on the product-moment covariance matrix, especially when the distributions of the observed variables are skewed. In theory, factor analysis of the polychoric correlation matrix is best conducted using generalized least squares with an asymptotically correct weight matrix (AGLS). However, simulation studies showed that both least squares (LS) and diagonally weighted least squares (DWLS) perform better than AGLS, and thus LS or DWLS is routinely used in practice. In either LS or DWLS, the associations among the polychoric correlation coefficients are completely ignored. To mend such a gap between statistical theory and empirical work, this paper proposes new methods, called ridge GLS, for factor analysis of ordinal data. Monte Carlo results show that, for a wide range of sample sizes, ridge GLS methods yield uniformly more accurate parameter estimates than existing methods (LS, DWLS, AGLS). A real-data example indicates that estimates by ridge GLS are 9-20% more efficient than those by existing methods. Rescaled and adjusted test statistics as well as sandwich-type standard errors following the ridge GLS methods also perform reasonably well. © 2017 The British Psychological Society.

  13. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    PubMed Central

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies and outperforms “classic” meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances from the published ones by introducing the contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulation shows that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those using the benchmark and has better prediction performance. PMID:24395534

  14. Using Poison Center Exposure Calls to Predict Methadone Poisoning Deaths

    PubMed Central

    Dasgupta, Nabarun; Davis, Jonathan; Jonsson Funk, Michele; Dart, Richard

    2012-01-01

Purpose There are more drug overdose deaths in the United States than motor vehicle fatalities. Yet the US vital statistics reporting system is of limited value because the data are delayed by four years. Poison centers report data within an hour of the event, but previous studies suggested a small proportion of poisoning deaths are reported to poison centers (PC). In an era of improved electronic surveillance capabilities, exposure calls to PCs may be an alternate indicator of trends in overdose mortality. Methods We used PC call counts for methadone that were reported to the Researched Abuse, Diversion and Addiction-Related Surveillance (RADARS®) System in 2006 and 2007. US death certificate data were used to identify deaths due to methadone. Linear regression was used to quantify the relationship of deaths and poison center calls. Results Compared to decedents, poison center callers tended to be younger, more often female, at home and less likely to require medical attention. A strong association was found between PC calls and methadone mortality (b = 0.88, se = 0.42, t = 9.5, df = 1, p<0.0001, R2 = 0.77). These findings were robust to large changes in a sensitivity analysis assessing the impact of underreporting of methadone overdose deaths. Conclusions Our results suggest that calls to poison centers for methadone are correlated with poisoning mortality as identified on death certificates. Calls received by poison centers may be used for timely surveillance of mortality due to methadone. In the midst of the prescription opioid overdose epidemic, electronic surveillance tools that report in real-time are powerful public health tools. PMID:22829925
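
    The model behind the reported slope and R² is ordinary linear regression of deaths on call counts. A minimal statsmodels sketch on simulated state-level counts (the reported coefficients come from the RADARS/death-certificate data, not from this):

```python
import numpy as np
import statsmodels.api as sm

# Linear regression of methadone death counts on poison-center call counts.
rng = np.random.default_rng(7)
calls = rng.poisson(40, size=50).astype(float)      # PC calls per state
deaths = 0.9 * calls + rng.normal(0, 5, size=50)    # hypothetical relation

fit = sm.OLS(deaths, sm.add_constant(calls)).fit()
print(fit.params[1], fit.rsquared)                  # slope and R^2
```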

  15. Beta systems error analysis

    NASA Technical Reports Server (NTRS)

    1984-01-01

The atmospheric backscatter coefficient, beta, measured with an airborne CO Laser Doppler Velocimeter (LDV) system operating in a continuous wave, focused mode is discussed. The Single Particle Mode (SPM) algorithm was developed from concept through analysis of an extensive amount of data obtained with the system on board a NASA aircraft. The SPM algorithm is intended to be employed in situations where one particle at a time appears in the sensitive volume of the LDV. In addition to giving the backscatter coefficient, the SPM algorithm also produces as intermediate results the aerosol density and the aerosol backscatter cross section distribution. A second method, which measures only the atmospheric backscatter coefficient, is called the Volume Mode (VM) and was simultaneously employed. The results of these two methods differed by slightly less than an order of magnitude. The measurement uncertainties or other errors in the results of the two methods are examined.

  16. Restoration of out-of-focus images based on circle of confusion estimate

    NASA Astrophysics Data System (ADS)

    Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto

    2002-11-01

In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by a typical CCD/CMOS sensor. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm yields sharp images, reducing ringing and crisping artifacts over a wider frequency region. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques found in the literature.
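
    Step 1 operates only on the green sites of the mosaic. The sketch below shows that extraction plus a crude gradient-based sharpness score; the RGGB layout is an assumption, and the real method's block-wise edge analysis and inverse filter are more involved.

```python
import numpy as np

# Pull the green pixels out of an RGGB Bayer mosaic and score local
# sharpness from gradient magnitude (a toy stand-in for the block-wise
# edge analysis described above).
def green_channel(bayer):
    """bayer: (H, W) raw mosaic, RGGB layout assumed."""
    g1 = bayer[0::2, 1::2]            # G sites in R rows
    g2 = bayer[1::2, 0::2]            # G sites in B rows
    return (g1.astype(float) + g2.astype(float)) / 2.0

def sharpness(block):
    gy, gx = np.gradient(block)
    return np.mean(np.hypot(gx, gy))  # low values suggest more defocus blur

raw = np.random.default_rng(8).integers(0, 1024, size=(64, 64))
print(sharpness(green_channel(raw)))
```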

  17. Uncertain call likelihood negatively affects sleep and next-day cognitive performance while on-call in a laboratory environment.

    PubMed

    Sprajcer, Madeline; Jay, Sarah M; Vincent, Grace E; Vakulin, Andrew; Lack, Leon; Ferguson, Sally A

    2018-05-11

On-call working arrangements are employed in a number of industries to manage unpredictable events, and often involve tasks that are safety- or time-critical. This study investigated the effects of call likelihood during an overnight on-call shift on self-reported pre-bed anxiety, sleep and next-day cognitive performance. A four-night laboratory-based protocol was employed, with an adaptation, a control and two counterbalanced on-call nights. On one on-call night, participants were instructed that they would definitely be called during the night, while on the other on-call night they were told they may be called. The State-Trait Anxiety Inventory form x-1 was used to investigate pre-bed anxiety, and sleep was assessed using polysomnography and power spectral analysis of the sleep electroencephalogram. Cognitive performance was assessed four times daily using a 10-min psychomotor vigilance task. Participants felt more anxious before bed when they were definitely going to be called, compared with the control and maybe conditions. Conversely, participants experienced significantly less non-rapid eye movement and stage two sleep and poorer cognitive performance when told they may be called. Further, participants had significantly more rapid eye movement sleep in the maybe condition, which may be an adaptive response to the stress associated with this on-call condition. It appears that self-reported anxiety may not be linked with sleep outcomes while on-call. However, this research indicates that it is important to take call likelihood into consideration when constructing rosters and risk-management systems for on-call workers.

  18. Visual Aggregate Analysis of Eligibility Features of Clinical Trials

    PubMed Central

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-01-01

    Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940

  19. Dynamic Analysis of Large In-Space Deployable Membrane Antennas

    NASA Technical Reports Server (NTRS)

    Fang, Houfei; Yang, Bingen; Ding, Hongli; Hah, John; Quijano, Ubaldo; Huang, John

    2006-01-01

    This paper presents a vibration analysis of an eight-meter diameter membrane reflectarray antenna, which is composed of a thin membrane and a deployable frame. This analysis process has two main steps. In the first step, a two-variable-parameter (2-VP) membrane model is developed to determine the in-plane stress distribution of the membrane due to pre-tensioning, which eventually yields the differential stiffness of the membrane. In the second step, the obtained differential stiffness is incorporated in a dynamic equation governing the transverse vibration of the membrane-frame assembly. This dynamic equation is then solved by a semi-analytical method, called the Distributed Transfer Function Method (DTFM), which produces the natural frequencies and mode shapes of the antenna. The combination of the 2-VP model and the DTFM provides an accurate prediction of the in-plane stress distribution and modes of vibration for the antenna.

  20. Functional feature embedded space mapping of fMRI data.

    PubMed

    Hu, Jin; Tian, Jie; Yang, Lei

    2006-01-01

    We have proposed a new method for fMRI data analysis which is called Functional Feature Embedded Space Mapping (FFESM). Our work mainly focuses on the experimental design with periodic stimuli which can be described by a number of Fourier coefficients in the frequency domain. A nonlinear dimension reduction technique Isomap is applied to the high dimensional features obtained from frequency domain of the fMRI data for the first time. Finally, the presence of activated time series is identified by the clustering method in which the information theoretic criterion of minimum description length (MDL) is used to estimate the number of clusters. The feasibility of our algorithm is demonstrated by real human experiments. Although we focus on analyzing periodic fMRI data, the approach can be extended to analyze non-periodic fMRI data (event-related fMRI) by replacing the Fourier analysis with a wavelet analysis.
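
    The pipeline the abstract sketches (Fourier-domain features, Isomap, then clustering) can be mocked up directly; here MDL-based selection of the cluster count is replaced by a fixed k for brevity, and the voxel time series are synthetic.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.cluster import KMeans

# FFESM-like pipeline sketch: FFT magnitudes per voxel time series,
# nonlinear dimension reduction with Isomap, then clustering.
rng = np.random.default_rng(9)
n_voxels, n_scans, period = 300, 120, 20
t = np.arange(n_scans)
active = rng.random(n_voxels) < 0.2
ts = rng.normal(size=(n_voxels, n_scans))
ts[active] += 1.5 * np.sin(2 * np.pi * t / period)   # periodic response

feats = np.abs(np.fft.rfft(ts, axis=1))[:, 1:15]     # low-frequency magnitudes
embed = Isomap(n_neighbors=10, n_components=3).fit_transform(feats)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embed)
print(np.mean(labels[active] == labels[active][0]))  # activated voxels cluster
```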

  1. Hamiltonian Dynamics of Spider-Type Multirotor Rigid Bodies Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doroshin, Anton V.

    2010-03-01

This paper sets out to develop a spider-type multiple-rotor system which can be used for attitude control of spacecraft. The multirotor system contains a large number of rotor-equipped rays, so it was called a 'Spider-type System'; it can also be called a 'Rotary Hedgehog'. These systems allow using spinups and captures of conjugate rotors to perform compound attitude motion of spacecraft. The paper describes a new method of spacecraft attitude reorientation and a new mathematical model of motion in Hamilton form. Hamiltonian dynamics of the system is investigated with the help of Andoyer-Deprit canonical variables. These variables allow obtaining exact solutions for hetero- and homoclinic orbits in the phase space of the system motion, which are very important for qualitative analysis.

  2. The planar multijunction cell - A new solar cell for earth and space

    NASA Technical Reports Server (NTRS)

    Evans, J. C., Jr.; Chai, A.-T.; Goradia, C.

    1980-01-01

    A new family of high-voltage solar cells, called the planar multijunction (PMJ) cell is being developed. The new cells combine the attractive features of planar cells with conventional or interdigitated back contacts and the vertical multijunction (VMJ) solar cell. The PMJ solar cell is internally divided into many voltage-generating regions, called unit cells, which are internally connected in series. The key to obtaining reasonable performance from this device was the separation of top surface field regions over each active unit cell area. Using existing solar cell fabricating methods, output voltages in excess of 20 volts per linear centimeter are possible. Analysis of the new device is complex, and numerous geometries are being studied which should provide substantial benefits in both normal sunlight usage as well as with concentrators.

  3. ALIF: a new promising technique for the decomposition and analysis of nonlinear and nonstationary signals

    NASA Astrophysics Data System (ADS)

    Cicone, Antonio; Zhou, Haomin; Piersanti, Mirko; Materassi, Massimo; Spogli, Luca

    2017-04-01

Nonlinear and nonstationary signals are ubiquitous in real life. Their decomposition and analysis is of crucial importance in many research fields. Traditional techniques, like the Fourier and wavelet transforms, have proved to be limited in this context. In the last two decades, new kinds of nonlinear methods have been developed that are able to unravel hidden features of these kinds of signals. In this talk we will review the state of the art and present a new method, called Adaptive Local Iterative Filtering (ALIF). This method, developed originally to study mono-dimensional signals, unlike any other technique proposed so far, can be easily generalized to study two- or higher-dimensional signals. Furthermore, unlike most similar methods, it does not require any a priori assumption on the signal itself, so the method can be applied as-is to any kind of signal. Applications of the ALIF algorithm to real-life signal analysis will be presented, for instance the behavior of the water level near the coastline in the presence of a tsunami, the length-of-day signal, the temperature and pressure measured at ground level on a global grid, and the radio power scintillation from GNSS signals.
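
    The flavor of iterative filtering can be conveyed with a bare-bones sketch: repeatedly subtract a local moving average until the remainder is a zero-mean fluctuation, and take that as the first extracted component. This toy uses a fixed window, unlike ALIF's adaptive local windows.

```python
import numpy as np

# Toy iterative filtering (not the authors' adaptive-window algorithm):
# sift out the fastest oscillation by repeated moving-average subtraction.
def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def first_component(x, window=21, n_iter=10):
    comp = x.astype(float).copy()
    for _ in range(n_iter):
        comp = comp - moving_average(comp, window)   # sifting step
    return comp

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 3 * t)
fast = first_component(signal)        # ~ the 50 Hz oscillation
print(np.corrcoef(fast[100:-100],
                  np.sin(2 * np.pi * 50 * t)[100:-100])[0, 1])  # ~1.0
```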

  4. Malware Analysis Using Visualized Image Matrices

    PubMed Central

    Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. Particularly, our proposed methods are available for packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples and the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracy of 0.9896 and 0.9732, respectively. PMID:25133202
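
    The two building blocks above (sequence-to-image conversion and image similarity) are easy to mock up. The sketch below maps a byte sequence to an RGB matrix and compares two samples with cosine similarity; real inputs would be opcode sequences from disassembly or execution traces, and the paper's similarity measure is more refined.

```python
import numpy as np

# Convert an opcode/byte sequence to an RGB image matrix and compare two
# samples (simplified sketch of the visualization idea above).
def to_image(byte_seq, width=64):
    buf = np.frombuffer(byte_seq, dtype=np.uint8)
    n = (len(buf) // (width * 3)) * width * 3
    return buf[:n].reshape(-1, width, 3)          # (rows, width, RGB)

def similarity(img_a, img_b):
    h = min(img_a.shape[0], img_b.shape[0])
    a = img_a[:h].astype(float).ravel()
    b = img_b[:h].astype(float).ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(10)
fam = rng.integers(0, 256, 64 * 3 * 40, dtype=np.uint8).tobytes()
variant = bytearray(fam)
variant[:500] = rng.integers(0, 256, 500, dtype=np.uint8).tobytes()
print(similarity(to_image(fam), to_image(bytes(variant))))  # high for variants
```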

  5. Chemometrics Methods for Specificity, Authenticity and Traceability Analysis of Olive Oils: Principles, Classifications and Applications.

    PubMed

    Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil

    2016-11-17

Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blends' preparation leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. These quality control and management aims require the use of several multivariate statistical tools: specificity highlighting requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. This chapter presents a review of different chemometrics methods applied to the control of OO variability based on measured metabolic and physical-chemical characteristics. The different chemometrics methods are illustrated by case studies on monovarietal and blended OOs originating from different countries. Chemometrics tools offer multiple ways for quantitative evaluation and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors.

  6. A New Method for Analyzing Near-Field Faraday Probe Data in Hall Thrusters

    NASA Technical Reports Server (NTRS)

    Huang, Wensheng; Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Kamhawi, Hani

    2013-01-01

This paper presents a new method for analyzing near-field Faraday probe data obtained from Hall thrusters. Traditional methods spawned from far-field Faraday probe analysis rely on assumptions that are not applicable to near-field Faraday probe data. In particular, arbitrary choices for the point of origin and limits of integration have made interpretation of the results difficult. The new method, called iterative pathfinding, uses the evolution of the near-field plume with distance to provide feedback for determining the location of the point of origin. Although still susceptible to the choice of integration limits, this method presents a systematic approach to determining the origin point for calculating the divergence angle. The iterative pathfinding method is applied to near-field Faraday probe data taken in a previous study from the NASA-300M and NASA-457Mv2 Hall thrusters. Since these two thrusters use centrally mounted cathodes, the current density associated with the cathode plume is removed before applying iterative pathfinding. A procedure is presented for removing the cathode plume. The results of the analysis are compared to far-field probe analysis results. This paper ends with checks on the validity of the new method and discussions on the implications of the results.

  7. A New Method for Analyzing Near-Field Faraday Probe Data in Hall Thrusters

    NASA Technical Reports Server (NTRS)

    Huang, Wensheng; Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Kamhawi, Hani

    2013-01-01

    This paper presents a new method for analyzing near-field Faraday probe data obtained from Hall thrusters. Traditional methods spawned from far-field Faraday probe analysis rely on assumptions that are not applicable to near-field Faraday probe data. In particular, arbitrary choices for the point of origin and limits of integration have made interpretation of the results difficult. The new method, called iterative pathfinding, uses the evolution of the near-field plume with distance to provide feedback for determining the location of the point of origin. Although still susceptible to the choice of integration limits, this method presents a systematic approach to determining the origin point for calculating the divergence angle. The iterative pathfinding method is applied to near-field Faraday probe data taken in a previous study from the NASA-300M and NASA-457Mv2 Hall thrusters. Since these two thrusters use centrally mounted cathodes, the current density associated with the cathode plume is removed before applying iterative pathfinding. A procedure is presented for removing the cathode plume. The results of the analysis are compared to far-field probe analysis results. This paper ends with checks on the validity of the new method and discussions on the implications of the results.
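
    The flavor of the pathfinding idea can be illustrated with a simplified, non-iterative sketch: at each axial station find the radius enclosing 95% of the beam current, fit a line through those edge points, and read a divergence angle from the slope and an origin from the backward extrapolation. The Gaussian profiles and the 95% threshold are assumptions for illustration.

```python
import numpy as np

# Edge-tracking sketch of divergence-angle and origin estimation from
# near-field current-density profiles (synthetic Gaussians that widen
# with axial distance; not the paper's iterative algorithm).
def r_enclosing(r, j, frac=0.95):
    cum = np.cumsum(j * r)                # ~ integral of J(r) r dr
    return r[np.searchsorted(cum, frac * cum[-1])]

z_stations = np.linspace(0.1, 0.5, 9)     # axial positions (m)
r = np.linspace(0, 0.3, 600)              # radial coordinate (m)
edges = [r_enclosing(r, np.exp(-r**2 / (2 * (0.02 + 0.15 * z) ** 2)))
         for z in z_stations]

slope, intercept = np.polyfit(z_stations, edges, 1)
print(np.degrees(np.arctan(slope)))       # divergence half-angle (deg)
print(-intercept / slope)                 # z where the edge ray crosses r=0
```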

  8. A modification of \\mathsf {WKB} method for fractional differential operators of Schrödinger's type

    NASA Astrophysics Data System (ADS)

    Sayevand, K.; Pichaghchi, K.

    2017-09-01

In this paper, we are concerned with the description of singularly perturbed differential equations within the scope of fractional calculus. One of the main methods used to solve these problems is the so-called WKB method. We should mention that this was not achievable via the existing fractional derivative definitions, because they do not obey the chain rule. In order to accommodate the WKB method to the scope of the fractional derivative, we propose a relatively new derivative called the local fractional derivative. By use of the properties of the local fractional derivative, we extend the WKB method to the scope of fractional differential equations. By means of this extension, the WKB analysis based on Borel resummation, for fractional differential operators of WKB type, is investigated. The convergence and the Mittag-Leffler stability of the proposed approach are proven. The obtained results are in excellent agreement with the existing ones in the open literature, and it is shown that the present approach is very effective and accurate. Furthermore, we are mainly interested in constructing the solution of the fractional Schrödinger equation in the Mittag-Leffler form and showing how it leads naturally to this semi-classical approximation, namely modified WKB.
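
    For reference, the classical WKB structure that the paper carries over to the fractional setting is the exponential ansatz and its first two orders (standard textbook form, not the paper's notation):

```latex
% Classical WKB ansatz for \epsilon^2 y'' = Q(x)\, y:
\[
  y(x) \sim \exp\!\Big(\frac{1}{\epsilon}\sum_{n=0}^{\infty} \epsilon^{n} S_n(x)\Big),
  \qquad
  (S_0')^2 = Q(x), \quad 2 S_0' S_1' + S_0'' = 0,
\]
% which gives the familiar leading-order behavior
\[
  y(x) \approx \frac{C}{Q(x)^{1/4}}
  \exp\!\Big(\pm\frac{1}{\epsilon}\int^{x}\!\sqrt{Q(t)}\,dt\Big).
\]
```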

  9. [The grounded theory as a methodological alternative for nursing research].

    PubMed

    dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam

    2002-01-01

This study presents a method of interpretative and systematic research applicable to the development of studies in nursing, called "the grounded theory", whose theoretical support is symbolic interactionism. The purpose of the paper is to describe the grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method and the process of analysis of the data. We conclude that the systematization of data and its interpretation, based on social actors' experience, provide a strong basis for generating theories through this research tool.

  10. Improving FMEA risk assessment through reprioritization of failures

    NASA Astrophysics Data System (ADS)

    Ungureanu, A. L.; Stan, G.

    2016-08-01

Most of the current methods used to assess failures and identify industrial equipment defects are based on the determination of the Risk Priority Number (RPN). Although the conventional RPN calculation is easy to understand and use, the methodology presents some limitations, such as the large number of duplicate values and the difficulty of assessing the RPN indices. In order to eliminate the aforementioned shortcomings, this paper puts forward an easy and efficient computing method, called Failure Developing Mode and Criticality Analysis (FDMCA), which takes into account failures and the evolution of defects over time, from their first appearance to breakdown.
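
    The conventional RPN calculation that FDMCA refines is just the product of three 1-10 ratings. A minimal sketch with hypothetical ratings, which also shows why duplicate RPN values arise so easily:

```python
# Conventional FMEA risk priority number: severity x occurrence x detection,
# each rated 1-10. The failure modes and ratings below are hypothetical.
failure_modes = {
    "bearing wear":   (7, 5, 4),
    "seal leakage":   (6, 4, 6),
    "shaft fracture": (9, 2, 3),
}
rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
for name in sorted(rpn, key=rpn.get, reverse=True):
    print(f"{name}: RPN = {rpn[name]}")   # note: ties/duplicates are common
```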

  11. Expediting Combinatorial Data Set Analysis by Combining Human and Algorithmic Analysis.

    PubMed

    Stein, Helge Sören; Jiao, Sally; Ludwig, Alfred

    2017-01-09

A challenge in combinatorial materials science remains the efficient analysis of X-ray diffraction (XRD) data and its correlation to functional properties. Rapid identification of phase-regions and proper assignment of corresponding crystal structures is necessary to keep pace with the improved methods for synthesizing and characterizing materials libraries. Therefore, a new modular software package called htAx (high-throughput analysis of X-ray and functional properties data) is presented that couples human intelligence tasks used for "ground-truth" phase-region identification with subsequent unbiased verification by an algorithm to efficiently analyze which phases are present in a materials library. Identified phases and phase-regions may then be correlated to functional properties in an expedited manner. To prove the functionality of htAx, two previously published XRD benchmark data sets of the materials systems Al-Cr-Fe-O and Ni-Ti-Cu are analyzed by htAx. The analysis of ∼1000 XRD patterns takes less than 1 day with htAx. The proposed method reliably identifies phase-region boundaries and robustly identifies multiphase structures. The method also addresses the problem of identifying regions with previously unpublished crystal structures using a special daisy ternary plot.

  12. Fast Quantitative Analysis Of Museum Objects Using Laser-Induced Breakdown Spectroscopy And Multiple Regression Algorithms

    NASA Astrophysics Data System (ADS)

    Lorenzetti, G.; Foresta, A.; Palleschi, V.; Legnaioli, S.

    2009-09-01

The recent development of mobile instrumentation, specifically devoted to in situ analysis and study of museum objects, allows the acquisition of many LIBS spectra in a very short time. However, such a large amount of data calls for new analytical approaches that can guarantee a prompt analysis of the results obtained. In this communication, we present and discuss the advantages of statistical analytical methods, such as Partial Least Squares Multiple Regression algorithms, vs. the classical calibration curve approach. PLS algorithms make it possible to obtain in real time the information on the composition of the objects under study; this feature of the method, compared to the traditional off-line analysis of the data, is extremely useful for the optimization of the measurement times and the number of points associated with the analysis. In fact, the real-time availability of the compositional information makes it possible to concentrate attention on the most 'interesting' parts of the object, without over-sampling the zones which would not provide useful information for the scholars or the conservators. Some examples of applications of this method will be presented, including the studies recently performed by the researchers of the Applied Laser Spectroscopy Laboratory on museum bronze objects.
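
    The PLS calibration idea maps directly onto scikit-learn. The sketch below builds synthetic spectra (a few emission-like peaks whose heights scale with composition, plus noise) and fits a PLS model; wavelengths, peak positions and compositions are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# PLS calibration of composition from LIBS-like spectra (synthetic data).
rng = np.random.default_rng(11)
wavelengths = np.linspace(200, 900, 1024)
centers = np.array([[324.7], [521.8], [656.3]])          # hypothetical lines
peaks = np.exp(-0.5 * ((wavelengths[None, :] - centers) / 0.8) ** 2)
conc = rng.uniform(0, 1, size=(60, 3))                   # element fractions
spectra = conc @ peaks + 0.01 * rng.normal(size=(60, 1024))

pls = PLSRegression(n_components=3).fit(spectra[:50], conc[:50])
print(np.abs(pls.predict(spectra[50:]) - conc[50:]).mean())  # held-out error
```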

  13. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows information on a high-dimensional dynamical system to be extracted through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
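
    Each layer of the multiplex is a graph built from one component series; a common choice in this line of work is the natural visibility graph, sketched below by brute force. Whether this exact mapping matches the paper's is an assumption; the structure (time points as nodes, visibility as edges) is the standard construction.

```python
import numpy as np

# Natural visibility graph of one time series (one candidate layer of the
# multiplex): nodes are time points, and i~j when the straight line between
# (i, x_i) and (j, x_j) passes above every sample in between. O(n^2) brute
# force for clarity.
def visibility_edges(x):
    n, edges = len(x), []
    for i in range(n):
        for j in range(i + 1, n):
            t = np.arange(i + 1, j)
            line = x[j] + (x[i] - x[j]) * (j - t) / (j - i)
            if np.all(x[t] < line):
                edges.append((i, j))
    return edges

x = np.random.default_rng(12).normal(size=60)
edges = visibility_edges(x)
degree = np.bincount(np.array(edges).ravel(), minlength=len(x))
print(len(edges), degree.max())   # per-layer descriptors feed the analysis
```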

  14. A Spiking Neural Network Methodology and System for Learning and Comparative Analysis of EEG Data From Healthy Versus Addiction Treated Versus Addiction Not Treated Subjects.

    PubMed

    Doborjeh, Maryam Gholami; Wang, Grace Y; Kasabov, Nikola K; Kydd, Robert; Russell, Bruce

    2016-09-01

This paper introduces a method utilizing spiking neural networks (SNN) for learning, classification, and comparative analysis of brain data. As a case study, the method was applied to electroencephalography (EEG) data collected during a GO/NOGO cognitive task performed by untreated opiate addicts, those undergoing methadone maintenance treatment (MMT) for opiate dependence and a healthy control group. The method is based on an SNN architecture called NeuCube, trained on spatiotemporal EEG data. NeuCube was used to classify EEG data across subject groups and across GO versus NOGO trials, but also facilitated a deeper comparative analysis of the dynamic brain processes. This analysis results in a better understanding of human brain functioning across subject groups when performing a cognitive task. In terms of the EEG data classification, a NeuCube model obtained better results (the maximum obtained accuracy: 90.91%) when compared with traditional statistical and artificial intelligence methods (the maximum obtained accuracy: 50.55%). More importantly, new information about the effects of MMT on cognitive brain functions is revealed through the analysis of the SNN model connectivity and its dynamics. This paper presented a new method for EEG data modeling and revealed new knowledge on brain functions associated with mental activity which is different from the brain activity observed in a resting state of the same subjects.

  15. Box-Counting Method of 2D Neuronal Image: Method Modification and Quantitative Analysis Demonstrated on Images from the Monkey and Human Brain.

    PubMed

    Rajković, Nemanja; Krstonošić, Bojana; Milošević, Nebojša

    2017-01-01

    This study calls attention to the difference between the traditional box-counting (BC) method and its modification. The appropriate scaling factor, the influence of image size and resolution, and image rotation, as well as different image presentations, are shown on a sample of asymmetrical neurons from the monkey dentate nucleus. The standard BC method and its modification were evaluated on a sample of 2D neuronal images from the human neostriatum. In addition, three box dimensions (which estimate the space-filling property, the shape, complexity, and the irregularity of the dendritic tree) were used to evaluate differences in the morphology of type III aspiny neurons between two parts of the neostriatum.
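
    For readers unfamiliar with the technique, a minimal sketch of the standard box-counting estimate on a 2D binary image follows; the paper's specific modification (scaling factor, rotation handling) is not reproduced here, and the image is a random placeholder.

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the box-counting (fractal) dimension of a 2-D binary image:
    count occupied boxes N(s) at each box size s and fit
    log N(s) = -D log s + c."""
    counts = []
    for s in sizes:
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())   # boxes containing foreground
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Toy binarized image: random foreground pixels stand in for a traced dendritic tree.
img = np.random.default_rng(2).random((256, 256)) > 0.97
print(box_counting_dimension(img))
```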

  16. Transcriptomic SNP discovery for custom genotyping arrays: impacts of sequence data, SNP calling method and genotyping technology on the probability of validation success.

    PubMed

    Humble, Emily; Thorne, Michael A S; Forcada, Jaume; Hoffman, Joseph I

    2016-08-26

    Single nucleotide polymorphism (SNP) discovery is an important goal of many studies. However, the number of 'putative' SNPs discovered from a sequence resource may not provide a reliable indication of the number that will successfully validate with a given genotyping technology. For this it may be necessary to account for factors such as the method used for SNP discovery and the type of sequence data from which it originates, suitability of the SNP flanking sequences for probe design, and genomic context. To explore the relative importance of these and other factors, we used Illumina sequencing to augment an existing Roche 454 transcriptome assembly for the Antarctic fur seal (Arctocephalus gazella). We then mapped the raw Illumina reads to the new hybrid transcriptome using BWA and BOWTIE2 before calling SNPs with GATK. The resulting markers were pooled with two existing sets of SNPs called from the original 454 assembly using NEWBLER and SWAP454. Finally, we explored the extent to which SNPs discovered using these four methods overlapped and predicted the corresponding validation outcomes for both Illumina Infinium iSelect HD and Affymetrix Axiom arrays. Collating markers across all discovery methods resulted in a global list of 34,718 SNPs. However, concordance between the methods was surprisingly poor, with only 51.0% of SNPs being discovered by more than one method and 13.5% being called from both the 454 and Illumina datasets. Using a predictive modeling approach, we could also show that SNPs called from the Illumina data were on average more likely to successfully validate, as were SNPs called by more than one method. Above and beyond this pattern, predicted validation outcomes were also consistently better for Affymetrix Axiom arrays. Our results suggest that focusing on SNPs called by more than one method could potentially improve validation outcomes. They also highlight possible differences between alternative genotyping technologies that could be explored in future studies of non-model organisms.
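
    Concordance figures like those quoted above come from straightforward set comparisons of the per-method SNP lists. A toy sketch with hypothetical SNP identifiers:

```python
# Hypothetical SNP sets keyed by discovery method; the IDs are placeholders.
calls = {
    "GATK_BWA":     {"snp1", "snp2", "snp3", "snp5"},
    "GATK_BOWTIE2": {"snp1", "snp2", "snp4"},
    "NEWBLER":      {"snp2", "snp3", "snp6"},
    "SWAP454":      {"snp2", "snp5"},
}

all_snps = set().union(*calls.values())
# For each SNP, count how many discovery methods called it.
support = {s: sum(s in c for c in calls.values()) for s in all_snps}
multi = sum(1 for n in support.values() if n > 1)
print(f"{len(all_snps)} SNPs total; "
      f"{100 * multi / len(all_snps):.1f}% discovered by more than one method")
```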

  17. Reinforcement learning for resource allocation in LEO satellite networks.

    PubMed

    Usaha, Wipawee; Barria, Javier A

    2007-06-01

    In this paper, we develop and assess online decision-making algorithms for call admission and routing in low Earth orbit (LEO) satellite networks. It has been shown in a recent paper that, in a LEO satellite system, a semi-Markov decision process formulation of the call admission and routing problem can achieve better performance, in terms of an average revenue function, than existing routing methods. However, the conventional dynamic programming (DP) numerical solution becomes prohibitive as the problem size increases. In this paper, two solution methods based on reinforcement learning (RL) are proposed in order to circumvent the computational burden of DP. The first is an actor-critic method with temporal-difference (TD) learning. The second is a critic-only method, called optimistic TD learning. The algorithms reduce requirements in storage, computational complexity and computational time, and improve an overall long-term average revenue function that penalizes blocked calls. Numerical studies are carried out, and the results obtained show that the RL framework can achieve up to 56% higher average revenue than existing routing methods used in LEO satellite networks, with reasonable storage and computational requirements.
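
    The paper's full admission and routing algorithms are not reproduced here, but both rest on temporal-difference value updates. A generic tabular TD(0) critic update, with placeholder states and rewards:

```python
import numpy as np

# Generic tabular TD(0) critic update of the kind used inside actor-critic and
# optimistic-TD schemes: states index a value table, rewards are call revenues,
# and blocked calls contribute a penalty. All numbers here are placeholders.
n_states, alpha, gamma = 100, 0.05, 0.99
V = np.zeros(n_states)

def td0_update(s, reward, s_next):
    """One temporal-difference step: move V[s] toward the bootstrapped target."""
    td_error = reward + gamma * V[s_next] - V[s]
    V[s] += alpha * td_error
    return td_error

td0_update(s=3, reward=1.0, s_next=7)    # an accepted call earns revenue
td0_update(s=7, reward=-5.0, s_next=2)   # a blocked call is penalized
```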

  18. Sports Training Support Method by Self-Coaching with Humanoid Robot

    NASA Astrophysics Data System (ADS)

    Toyama, S.; Ikeda, F.; Yasaka, T.

    2016-09-01

    This paper proposes a new training support method called self-coaching with humanoid robots. In the proposed method, two small, inexpensive humanoid robots are used because of their availability. One robot, called the target robot, reproduces the motion of a target player, and the other, called the reference robot, reproduces the motion of an expert player. The target player can recognize the target technique from the reference robot and his/her inadequate skill from the target robot. By modifying the motion of the target robot as self-coaching, the target player can gain a deeper understanding of the technique. Experimental results show the potential of the new training method, as well as some issues with the self-coaching interface program to be addressed in future work.

  19. Intra- and interspecific responses to Rafinesque’s big-eared bat (Corynorhinus rafinesquii) social calls

    Treesearch

    S. Loeb; E. Britzke

    2010-01-01

    Bats respond to the calls of conspecifics as well as to calls of other species; however, few studies have attempted to quantify these responses or understand the functions of these calls. We tested the response of Rafinesque’s big-eared bats (Corynorhinus rafinesquii) to social calls as a possible method to increase capture success and to understand the function of...

  20. A graph-Laplacian-based feature extraction algorithm for neural spike sorting.

    PubMed

    Ghanbari, Yasser; Spence, Larry; Papamichalis, Panos

    2009-01-01

    Analysis of extracellular neural spike recordings is highly dependent upon the accuracy of neural waveform classification, commonly referred to as spike sorting. Feature extraction is an important stage of this process because it can limit the quality of clustering which is performed in the feature space. This paper proposes a new feature extraction method (which we call Graph Laplacian Features, GLF) based on minimizing the graph Laplacian and maximizing the weighted variance. The algorithm is compared with Principal Components Analysis (PCA, the most commonly-used feature extraction method) using simulated neural data. The results show that the proposed algorithm produces more compact and well-separated clusters compared to PCA. As an added benefit, tentative cluster centers are output which can be used to initialize a subsequent clustering stage.
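
    The authors' exact GLF objective is not reproduced here; the following is a sketch of the closely related Laplacian-eigenmaps embedding, which likewise derives low-dimensional features from a graph Laplacian built over the waveforms. The spike data are random placeholders.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import pdist, squareform

def laplacian_features(waveforms, n_features=2, sigma=1.0):
    """Laplacian-eigenmaps-style embedding of spike waveforms:
    build a Gaussian affinity graph, then take the eigenvectors of
    L v = lambda D v with the smallest nonzero eigenvalues."""
    W = np.exp(-squareform(pdist(waveforms)) ** 2 / (2 * sigma ** 2))
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = eigh(L, D)              # generalized symmetric eigenproblem
    return vecs[:, 1:1 + n_features]     # skip the trivial constant eigenvector

spikes = np.random.default_rng(3).random((50, 32))   # 50 waveforms, 32 samples each
print(laplacian_features(spikes).shape)              # features to feed a clustering stage
```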

  1. Disk space and load time requirements for eye movement biometric databases

    NASA Astrophysics Data System (ADS)

    Kasprowski, Pawel; Harezlak, Katarzyna

    2016-06-01

    Biometric identification is a very popular area of interest nowadays. Problems with the so-called physiological methods, like fingerprint or iris recognition, have resulted in increased attention being paid to methods measuring behavioral patterns. Eye movement based biometric (EMB) identification is one of the interesting behavioral methods, and due to the intensive development of eye tracking devices it has become possible to define new methods for eye movement signal processing. Such methods should be supported by an efficient storage system used to collect eye movement data and provide it for further analysis. The aim of the research was to evaluate various setups enabling such a storage choice. Various aspects were taken into consideration, such as disk space usage and the time required for loading and saving the whole data set or chosen parts of it.

  2. A comparative meta-analysis of QTL between intraspecific Gossypium hirsutum interspecific populations and Gossypium hirsutum x Gossypium barbadense populations

    USDA-ARS?s Scientific Manuscript database

    Recent Meta-analysis of quantitative trait loci (QTL) in tetraploid cotton (Gossypium spp.) has identified regions of the genome with high concentrations of various trait QTL called clusters, and specific trait QTL called hotspots. The Meta-analysis included all population types of Gossypium mixing ...

  3. A Comparison of Didactic and Inquiry Teaching Methods in a Rural Community College Earth Science Course

    NASA Astrophysics Data System (ADS)

    Beam, Margery Elizabeth

    The combination of increasing enrollment and the importance of providing transfer students a solid foundation in science calls for science faculty to evaluate teaching methods in rural community colleges. The purpose of this study was to examine and compare the effectiveness of two teaching methods, inquiry teaching methods and didactic teaching methods, applied in a rural community college earth science course. Two groups of students were taught the same content via inquiry and didactic teaching methods. Analysis of quantitative data included a non-parametric ranking statistical testing method in which the difference between the rankings and the median of the post-test scores was analyzed for significance. Results indicated there was not a significant statistical difference between the teaching methods for the group of students participating in the research. The practical and educational significance of this study provides valuable perspectives on teaching methods and student learning styles in rural community colleges.

  4. Comparative analysis of methods and optical-electronic equipment to control the form parameters of spherical mirrors

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexander N.; Baryshnikov, Nikolay; Denisov, Dmitrii; Karasik, Valerii; Sakharov, Alexey; Romanov, Pavel; Sheldakova, Julia; Kudryashov, Alexis

    2018-02-01

    In this paper we consider two approaches widely used in the testing of spherical optical surfaces: the Fizeau interferometer and the Shack-Hartmann wavefront sensor. A Fizeau interferometer can be transformed into a device using a Shack-Hartmann wavefront sensor, an alternative technique for checking spherical optical components. We call this device the Hartmannometer, and compare its features to those of the Fizeau interferometer.

  5. Large Scale Data Analysis and Knowledge Extraction in Communication Data

    DTIC Science & Technology

    2017-03-31

    For this purpose, we developed a novel method, the "Correlation Density Rank", which finds the probability density distribution of related frequent events on all... which is called "Correlation Density Rank", is developed to derive the community tree from the network. As in the real world, where a network is... "Community Structure in Dynamic Social Networks using the Correlation Density Rank," 2014 ASE BigData/SocialCom/Cybersecurity Conference, Stanford

  6. A Shotline Method for Modeling Projectile Geometry

    DTIC Science & Technology

    1986-06-01

    Keywords: GIFT, Target Description, Vulnerability Analysis, COMGEOM, Shotlining, Warhead Lethality, MISFIR. ...target interaction is centered upon the program MISFIR, written in CDC Fortran 5. MISFIR is built on the formalisms of the GIFT (Geometric... a ray-tracing subroutine added to GIFT (viz. SHOTCYL); MISFIR itself, together with its subprograms; and an application program, called FUZES, which...

  7. Hypertranscription in development, stem cells, and regeneration

    PubMed Central

    Percharde, Michelle; Bulut-Karslioglu, Aydan; Ramalho-Santos, Miguel

    2016-01-01

    Cells can globally up-regulate their transcriptome during specific transitions, a phenomenon called hypertranscription. Evidence for hypertranscription dates back over 70 years, but it has gone largely ignored in the genomics era until recently. We discuss data supporting the notion that hypertranscription is a unifying theme in embryonic development, stem cell biology, regeneration and cell competition. We review the history, methods for analysis, underlying mechanisms and biological significance of hypertranscription. PMID:27989554

  8. Numerical algorithms for finite element computations on concurrent processors

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1986-01-01

    The work of several graduate students related to the NASA grant is briefly summarized. One student has worked on a detailed analysis of the so-called ijk forms of Gaussian elimination and Cholesky factorization on concurrent processors. Another student has worked on the vectorization of the incomplete Cholesky conjugate gradient method on the CYBER 205. Two more students implemented various versions of Gaussian elimination and Cholesky factorization on the FLEX/32.
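
    For concreteness, the ijk forms differ only in the ordering of the three nested loops of the factorization. A sketch of the kij (outer-product) form, without pivoting, is shown below; the rank-1 trailing-submatrix update is the part that distributes naturally across concurrent processors.

```python
import numpy as np

def lu_kij(A):
    """In-place kij (outer-product) form of Gaussian elimination, no pivoting:
    after the call, the strict lower triangle holds the multipliers (L) and
    the upper triangle holds U."""
    A = A.copy()
    n = A.shape[0]
    for k in range(n - 1):
        A[k + 1:, k] /= A[k, k]                                    # column of L
        A[k + 1:, k + 1:] -= np.outer(A[k + 1:, k], A[k, k + 1:])  # rank-1 update
    return A

M = np.array([[4., 3., 2.], [6., 3., 1.], [8., 4., 7.]])
print(lu_kij(M))
```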

  9. Measuring the effectiveness of patient-chosen reminder methods in a private orthodontic practice.

    PubMed

    Wegrzyniak, Lauren M; Hedderly, Deborah; Chaudry, Kishore; Bollu, Prashanti

    2018-05-01

    To evaluate the effectiveness of patient-chosen appointment reminder methods (phone call, e-mail, or SMS text) in reducing no-show rates. This was a retrospective case study that determined the correlation between patient-chosen appointment reminder methods and no-show rates in a private orthodontic practice. This study was conducted in a single office location of a multioffice private orthodontic practice using data gathered in 2015. The subjects were patients who self-selected the appointment reminder method (phone call, e-mail, or SMS text). Patient appointment data were collected over a 6-month period. Patient attendance was analyzed with descriptive statistics to determine any significant differences among patient-chosen reminder methods. There was a total of 1193 appointments with an average no-show rate of 2.43% across the three reminder methods. No statistically significant differences (P = .569) were observed in the no-show rates between the three methods: phone call (3.49%), e-mail (2.68%), and SMS text (1.90%). The electronic appointment reminder methods (SMS text and e-mail) had lower no-show rates compared with the phone call method, with SMS text having the lowest no-show rate of 1.90%. However, since no significant differences were observed between the three patient-chosen reminder methods, providers may want to allow patients to choose their reminder method to decrease no-shows.

  10. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    NASA Astrophysics Data System (ADS)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies like the blade-tip radial running clearance (BTRRC) of gas turbines, a distributed collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distributed collaborative response surface method and the support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC and improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.

  11. Adaptive phase k-means algorithm for waveform classification

    NASA Astrophysics Data System (ADS)

    Song, Chengyun; Liu, Zhining; Wang, Yaojun; Xu, Feng; Li, Xingming; Hu, Guangmin

    2018-01-01

    Waveform classification is a powerful technique for seismic facies analysis that describes the heterogeneity and compartments within a reservoir. Horizon interpretation is a critical step in waveform classification. However, the horizon often produces inconsistent waveform phase, and thus results in an unsatisfactory classification. To alleviate this problem, an adaptive phase waveform classification method called the adaptive phase k-means is introduced in this paper. Our method improves the traditional k-means algorithm by using an adaptive phase distance as the waveform similarity measure. The proposed distance is a measure with variable phase as it moves from sample to sample along the traces. Model traces are also updated with the best phase interference in the iterative process. Therefore, our method is robust to phase variations caused by the interpretation horizon. We tested the effectiveness of our algorithm by applying it to synthetic and real data. The satisfactory results reveal that the proposed method tolerates certain waveform phase variations and is a good tool for seismic facies analysis.
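
    The authors' per-sample variable-phase distance is not reproduced here; the sketch below illustrates the underlying idea with a simpler constant-phase-invariant distance, rotating the trace's phase through its analytic signal and keeping the best match.

```python
import numpy as np
from scipy.signal import hilbert

def phase_invariant_distance(trace, model, n_angles=36):
    """Distance between a seismic trace and a model (centroid) trace, minimized
    over a constant phase rotation applied via the analytic signal."""
    analytic = hilbert(trace)
    best = np.inf
    for phi in np.linspace(0, 2 * np.pi, n_angles, endpoint=False):
        rotated = np.real(analytic * np.exp(1j * phi))   # phase-rotated copy
        best = min(best, np.linalg.norm(rotated - model))
    return best

t = np.linspace(0, 1, 200)
wavelet = np.sin(2 * np.pi * 5 * t) * np.exp(-((t - 0.5) ** 2) / 0.02)
shifted = np.real(hilbert(wavelet) * np.exp(1j * 0.8))   # same waveform, rotated phase
print(phase_invariant_distance(shifted, wavelet))        # small despite the rotation
```

    Inside a k-means loop, this distance would replace the Euclidean distance when assigning traces to the nearest model trace.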

  12. Spectral Regression Discriminant Analysis for Hyperspectral Image Classification

    NASA Astrophysics Data System (ADS)

    Pan, Y.; Wu, J.; Huang, H.; Liu, J.

    2012-08-01

    Dimensionality reduction algorithms, which aim to select a small set of efficient and discriminant features, have attracted great attention for hyperspectral image classification. Manifold learning methods, such as Locally Linear Embedding, Isomap, and Laplacian Eigenmap, are popular for dimensionality reduction. However, a disadvantage of many manifold learning methods is that their computations usually involve eigen-decomposition of dense matrices, which is expensive in both time and memory. In this paper, we introduce a new dimensionality reduction method, called Spectral Regression Discriminant Analysis (SRDA). SRDA casts the problem of learning an embedding function into a regression framework, which avoids eigen-decomposition of dense matrices. Also, with the regression-based framework, different kinds of regularizers can be naturally incorporated into our algorithm, which makes it more flexible. It can make efficient use of data points to discover the intrinsic discriminant structure in the data. Experimental results on the Washington DC Mall and AVIRIS Indian Pines hyperspectral data sets demonstrate the effectiveness of the proposed method.
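
    A minimal sketch of the spectral-regression idea under one common reading: regress centered class-indicator responses on the data with an l2 penalty, so that discriminant directions come from a linear solver rather than a dense eigenproblem. The data and labels are placeholders, and details of the published algorithm (e.g., orthogonalization of the responses) are omitted.

```python
import numpy as np
from sklearn.linear_model import Ridge

def srda_like_projection(X, y, alpha=1.0):
    """Spectral-regression-style discriminant embedding: fit class-indicator
    responses with ridge regression, avoiding dense eigen-decomposition."""
    classes = np.unique(y)
    Y = np.stack([(y == c).astype(float) for c in classes], axis=1)
    Y -= Y.mean(axis=0)                     # centered indicator responses
    reg = Ridge(alpha=alpha, fit_intercept=False).fit(X, Y)
    return X @ reg.coef_.T                  # embedded coordinates

X = np.random.default_rng(5).random((60, 20))   # 60 pixels, 20 bands (placeholder)
y = np.repeat([0, 1, 2], 20)
print(srda_like_projection(X, y).shape)         # (60, 3)
```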

  13. Implementing Kernel Methods Incrementally by Incremental Nonlinear Projection Trick.

    PubMed

    Kwak, Nojun

    2016-05-20

    Recently, the nonlinear projection trick (NPT) was introduced, enabling direct computation of the coordinates of samples in a reproducing kernel Hilbert space. With NPT, any machine learning algorithm can be extended to a kernel version without relying on the so-called kernel trick. However, NPT is inherently difficult to implement incrementally, because an ever-growing kernel matrix must be handled as additional training samples are introduced. In this paper, an incremental version of the NPT (INPT) is proposed, based on the observation that the centerization step in NPT is unnecessary. Because the proposed INPT does not change the coordinates of the old data, the coordinates obtained by INPT can directly be used in any incremental method to implement a kernel version of that method. The effectiveness of the INPT is shown by applying it to implement incremental versions of kernel methods such as kernel singular value decomposition, kernel principal component analysis, and kernel discriminant analysis, which are applied to problems of kernel matrix reconstruction, letter classification, and face image retrieval, respectively.

  14. Automatic abdominal lymph node detection method based on local intensity structure analysis from 3D x-ray CT images

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Kitasaka, Takayuki; Mizuno, Shinji; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Ito, Masaaki; Nawano, Shigeru; Mori, Kensaku

    2013-03-01

    This paper presents an automated method of abdominal lymph node detection to aid the preoperative diagnosis of abdominal cancer surgery. In abdominal cancer surgery, surgeons must resect not only tumors and metastases but also lymph nodes that might have a metastasis. This procedure is called lymphadenectomy or lymph node dissection. Insufficient lymphadenectomy carries a high risk for relapse. However, excessive resection decreases a patient's quality of life. Therefore, it is important to identify the location and the structure of lymph nodes to make a suitable surgical plan. The proposed method consists of candidate lymph node detection and false positive reduction. Candidate lymph nodes are detected using a multi-scale blob-like enhancement filter based on local intensity structure analysis. To reduce false positives, the proposed method uses a classifier based on support vector machine with the texture and shape information. The experimental results reveal that it detects 70.5% of the lymph nodes with 13.0 false positives per case.

  15. Nuclear Forensics Analysis with Missing and Uncertain Data

    DOE PAGES

    Langan, Roisin T.; Archibald, Richard K.; Lamberti, Vincent

    2015-10-05

    We have applied a new imputation-based method for analyzing incomplete data, called Monte Carlo Bayesian Database Generation (MCBDG), to the Spent Fuel Isotopic Composition (SFCOMPO) database. About 60% of the entries are absent for SFCOMPO. The method estimates missing values of a property from a probability distribution created from the existing data for the property, and then generates multiple instances of the completed database for training a machine learning algorithm. Uncertainty in the data is represented by an empirical or an assumed error distribution. The method makes few assumptions about the underlying data, and compares favorably against results obtained by replacing missing information with constant values.
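
    A toy sketch of the imputation idea as described in this record (not the released MCBDG code): every missing entry is drawn from the empirical distribution of the observed values for that property, and several completed instances are generated.

```python
import numpy as np

def generate_completed_instances(data, n_instances=5, seed=0):
    """Monte Carlo imputation in the spirit described above: each NaN in a
    column is replaced by a draw from that column's observed values,
    independently for every generated instance."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_instances):
        filled = data.copy()
        for j in range(data.shape[1]):
            col = data[:, j]
            missing = np.isnan(col)
            if missing.any():
                filled[missing, j] = rng.choice(col[~missing], size=missing.sum())
        out.append(filled)
    return out

table = np.array([[1.0, np.nan], [2.0, 5.0], [np.nan, 7.0], [4.0, 6.0]])
print(generate_completed_instances(table, n_instances=2))
```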

  16. A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment

    NASA Astrophysics Data System (ADS)

    Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong

    Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has been proved to be a useful tool for pattern recognition. LS-SVM has excellent generalization performance and low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) and the linear programming form of the least squares support vector machine. With this method, sparseness and robustness are obtained while solving high-dimensional and large-scale problems. A U.S. commercial credit card database is used to test the efficiency of our method, and the results prove to be satisfactory.
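
    A sketch of the two-layer idea using scikit-learn, with a KPCA feature-extraction layer followed by a linear least-squares classifier; RidgeClassifier stands in for the paper's linear-programming LS-SVM formulation, and the credit data are random placeholders.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import RidgeClassifier
from sklearn.pipeline import make_pipeline

# Placeholder credit data: 200 applicants, 15 features, binary good/bad label.
rng = np.random.default_rng(6)
X, y = rng.random((200, 15)), rng.integers(0, 2, 200)

model = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf"),   # layer 1: nonlinear feature extraction
    RidgeClassifier(alpha=1.0),                 # layer 2: linear least-squares classifier
)
model.fit(X, y)
print(model.score(X, y))
```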

  17. On the importance of mathematical methods for analysis of MALDI-imaging mass spectrometry data.

    PubMed

    Trede, Dennis; Kobarg, Jan Hendrik; Oetjen, Janina; Thiele, Herbert; Maass, Peter; Alexandrov, Theodore

    2012-03-21

    In the last decade, matrix-assisted laser desorption/ionization (MALDI) imaging mass spectrometry (IMS), also called MALDI-imaging, has proven its potential in proteomics and was successfully applied to various types of biomedical problems, in particular to histopathological label-free analysis of tissue sections. In histopathology, MALDI-imaging is used as a general analytic tool revealing the functional proteomic structure of tissue sections, and as a discovery tool for detecting new biomarkers discriminating a region annotated by an experienced histologist, in particular for cancer studies. A typical MALDI-imaging data set contains 10⁸ to 10⁹ intensity values occupying more than 1 GB. The analysis and interpretation of such a huge amount of data is a mathematically, statistically and computationally challenging problem. In this paper we overview some computational methods for the analysis of MALDI-imaging data sets. We discuss the importance of data preprocessing, which typically includes normalization, baseline removal and peak picking, and highlight the importance of image denoising when visualizing IMS data.
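
    A minimal sketch of the preprocessing steps named above (normalization, baseline removal, peak picking) for a single synthetic spectrum; real MALDI-imaging pipelines are considerably more elaborate, and the window and prominence settings here are arbitrary.

```python
import numpy as np
from scipy.ndimage import minimum_filter1d
from scipy.signal import find_peaks

def preprocess_spectrum(intensities, baseline_window=101):
    """One spectrum's preprocessing: TIC normalization, a rolling-minimum
    baseline estimate, then peak picking on the corrected signal."""
    s = intensities / intensities.sum()             # total-ion-current normalization
    baseline = minimum_filter1d(s, size=baseline_window)
    corrected = s - baseline
    peaks, _ = find_peaks(corrected, prominence=corrected.max() * 0.05)
    return corrected, peaks

mz_axis = np.linspace(1000, 20000, 5000)
spectrum = np.abs(np.random.default_rng(7).normal(1, 0.1, 5000))
spectrum[[500, 1500, 3000]] += 5                    # three synthetic peaks
corrected, peaks = preprocess_spectrum(spectrum)
print(mz_axis[peaks])
```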

  18. On the Importance of Mathematical Methods for Analysis of MALDI-Imaging Mass Spectrometry Data.

    PubMed

    Trede, Dennis; Kobarg, Jan Hendrik; Oetjen, Janina; Thiele, Herbert; Maass, Peter; Alexandrov, Theodore

    2012-03-01

    In the last decade, matrix-assisted laser desorption/ionization (MALDI) imaging mass spectrometry (IMS), also called MALDI-imaging, has proven its potential in proteomics and was successfully applied to various types of biomedical problems, in particular to histopathological label-free analysis of tissue sections. In histopathology, MALDI-imaging is used as a general analytic tool revealing the functional proteomic structure of tissue sections, and as a discovery tool for detecting new biomarkers discriminating a region annotated by an experienced histologist, in particular for cancer studies. A typical MALDI-imaging data set contains 10⁸ to 10⁹ intensity values occupying more than 1 GB. The analysis and interpretation of such a huge amount of data is a mathematically, statistically and computationally challenging problem. In this paper we overview some computational methods for the analysis of MALDI-imaging data sets. We discuss the importance of data preprocessing, which typically includes normalization, baseline removal and peak picking, and highlight the importance of image denoising when visualizing IMS data.

  19. Nanomaterials as Assisted Matrix of Laser Desorption/Ionization Time-of-Flight Mass Spectrometry for the Analysis of Small Molecules.

    PubMed

    Lu, Minghua; Yang, Xueqing; Yang, Yixin; Qin, Peige; Wu, Xiuru; Cai, Zongwei

    2017-04-21

    Matrix-assisted laser desorption/ionization (MALDI), a soft ionization method, coupled with time-of-flight mass spectrometry (TOF MS), has become an indispensable tool for analyzing macromolecules such as peptides, proteins, nucleic acids and polymers. However, the application of MALDI to the analysis of small molecules (<700 Da) has remained a great challenge because of interference from the conventional matrix in the low-mass region. To overcome this drawback, much attention has been paid to exploring interference-free methods in the past decade. The technique of applying nanomaterials as the matrix for laser desorption/ionization (LDI), also called nanomaterial-assisted laser desorption/ionization (nanomaterial-assisted LDI), has attracted considerable attention for the analysis of low-molecular-weight compounds by TOF MS. This review mainly summarizes the applications of different types of nanomaterials, including carbon-based and metal-based nanomaterials and metal-organic frameworks, as assisting matrices for LDI in the analysis of small biological molecules, environmental pollutants and other low-molecular-weight compounds.

  20. Nanomaterials as Assisted Matrix of Laser Desorption/Ionization Time-of-Flight Mass Spectrometry for the Analysis of Small Molecules

    PubMed Central

    Lu, Minghua; Yang, Xueqing; Yang, Yixin; Qin, Peige; Wu, Xiuru; Cai, Zongwei

    2017-01-01

    Matrix-assisted laser desorption/ionization (MALDI), a soft ionization method, coupled with time-of-flight mass spectrometry (TOF MS), has become an indispensable tool for analyzing macromolecules such as peptides, proteins, nucleic acids and polymers. However, the application of MALDI to the analysis of small molecules (<700 Da) has remained a great challenge because of interference from the conventional matrix in the low-mass region. To overcome this drawback, much attention has been paid to exploring interference-free methods in the past decade. The technique of applying nanomaterials as the matrix for laser desorption/ionization (LDI), also called nanomaterial-assisted laser desorption/ionization (nanomaterial-assisted LDI), has attracted considerable attention for the analysis of low-molecular-weight compounds by TOF MS. This review mainly summarizes the applications of different types of nanomaterials, including carbon-based and metal-based nanomaterials and metal-organic frameworks, as assisting matrices for LDI in the analysis of small biological molecules, environmental pollutants and other low-molecular-weight compounds. PMID:28430138

  1. Determining temporal scales of the soil moisture variations by Empirical Mode Decompositions and wavelet methods and its use for validation of SMOS data

    NASA Astrophysics Data System (ADS)

    Usowicz, Jerzy, B.; Marczewski, Wojciech; Usowicz, Boguslaw; Lipiec, Jerzy; Lukowski, Mateusz I.

    2010-05-01

    This paper presents the results of a time series analysis of the soil moisture observed at two test sites, Podlasie and Polesie, during the Cal/Val AO 3275 campaigns in Poland in the interval 2006-2009. The test sites were selected on the basis of their contrasting hydrological conditions: the Podlasie region (Trzebieszow) is essentially drier than the wetland region of Polesie (Urszulin). It is worth noting that soil moisture variations can be represented as a non-stationary random process, and therefore appropriate analysis methods are required. The so-called Empirical Mode Decomposition (EMD) method was chosen, since it is one of the best methods for the analysis of non-stationary and nonlinear time series. To confirm the results obtained by the EMD we also used wavelet methods. Firstly, we used EMD (the analysis step) to decompose the original time series into so-called Intrinsic Mode Functions (IMFs), and then, by grouping and adding similar IMFs (the synthesis step), obtained a few signal components with corresponding temporal scales. Such an adaptive procedure makes it possible to decompose the original time series into diurnal, seasonal and trend components. Revealing all the temporal scales operating in the original time series is our main objective, and this approach may prove useful in other studies. Secondly, we analyzed the soil moisture time series from both sites using cross-wavelet and wavelet coherency methods. These methods allow us to study the degree of spatial coherence, which may vary in various intervals of time. We hope the obtained results provide some hints and guidelines for the validation of ESA SMOS data. References: B. Usowicz, J.B. Usowicz, Spatial and temporal variation of selected physical and chemical properties of soil, Institute of Agrophysics, Polish Academy of Sciences, Lublin 2004, ISBN 83-87385-96-4; Rao, A.R., Hsu, E.-C., Hilbert-Huang Transform Analysis of Hydrological and Environmental Time Series, Springer, 2008, ISBN 978-1-4020-6453-1. Acknowledgements: This work was funded in part by the PECS - Programme for European Cooperating States, No. 98084 "SWEX/R - Soil Water and Energy Exchange/Research".
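
    A sketch of the analyze/synthesize steps using the third-party PyEMD package (an assumption; the record does not state the implementation used), on a synthetic stand-in for an hourly soil-moisture record. The index at which IMFs are split into 'fast' and 'slow' groups is purely illustrative; in the study the grouping is done by inspecting the IMFs' time scales.

```python
import numpy as np
from PyEMD import EMD   # third-party package "EMD-signal"

# Synthetic stand-in for a soil moisture record: diurnal cycle + seasonal
# cycle + slow drying trend + noise, sampled hourly for one year.
t = np.arange(24 * 365)
signal = (0.03 * np.sin(2 * np.pi * t / 24)            # diurnal component
          + 0.10 * np.sin(2 * np.pi * t / (24 * 365))  # seasonal component
          - 0.00001 * t                                # slow trend
          + 0.01 * np.random.default_rng(8).normal(size=t.size))

imfs = EMD().emd(signal)          # analyze step: intrinsic mode functions
fast = imfs[:3].sum(axis=0)       # synthesize step: group similar IMFs
slow = imfs[3:].sum(axis=0)       # (split by index purely for illustration)
print(imfs.shape, fast.shape, slow.shape)
```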

  2. The Probability of Hitting a Polygonal Target

    DTIC Science & Technology

    1981-04-01

    required for the use of this method for computing the probability of hitting a polygonal target. These functions are: 1. PHIT (called by the user's main program), 2. FIJ (called by PHIT), 3. FUN (called by FIJ). The user must include all three of these in his main program, but needs only to call PHIT.

  3. Geographic-time distribution of ambulance calls in Singapore: utility of geographic information system in ambulance deployment (CARE 3).

    PubMed

    Ong, Marcus E H; Ng, Faith S P; Overton, Jerry; Yap, Susan; Andresen, Derek; Yong, David K L; Lim, Swee Han; Anantharaman, V

    2009-03-01

    Pre-hospital ambulance calls are not random events, but occur in patterns and trends that are related to the movement patterns of people, as well as the geographical epidemiology of the population. This study describes the geographic-time epidemiology of ambulance calls in a large urban city and conducts a time demand analysis. This will facilitate a Systems Status Plan for the deployment of ambulances based on the most cost-effective deployment strategy. An observational prospective study looking at the geographic-time epidemiology of all ambulance calls in Singapore. Locations of ambulance calls were spot-mapped using Geographic Information Systems (GIS) technology. Ambulance response times were mapped and a demand analysis conducted by postal district. Between 1 January 2006 and 31 May 2006, 31,896 patients were enrolled into the study. The mean age of patients was 51.6 years (S.D. 23.0), with 60.0% male. Race distribution was 62.5% Chinese, 19.4% Malay, 12.9% Indian and 5.2% others. Trauma accounted for 31.2% of calls and medical cases for 68.8%; 9.7% of cases were priority 1 (most severe) and 70.1% priority 2 (moderate severity). The mean time from call receipt to arrival at scene was 8.0 min (S.D. 4.8). Call volumes in the day were almost twice those at night, with the most calls on Mondays. We found a definite geographical distribution pattern, with heavier call volumes in the suburban town centres in the eastern and southern parts of the country. We characterised the top 35 districts with the highest call volumes by time period, which will form the basis for ambulance deployment plans. We found a definite geographical distribution pattern of ambulance calls. This study demonstrates the utility of GIS with despatch demand analysis and has implications for maximising the effectiveness of ambulance deployment.

  4. SigEMD: A powerful method for differential gene expression analysis in single-cell RNA sequencing data.

    PubMed

    Wang, Tianyu; Nabavi, Sheida

    2018-04-24

    Differential gene expression analysis is one of the significant efforts in single cell RNA sequencing (scRNAseq) analysis to discover the specific changes in expression levels of individual cell types. Since scRNAseq exhibits multimodality, large amounts of zero counts, and sparsity, it is different from the traditional bulk RNA sequencing (RNAseq) data. The new challenges of scRNAseq data promote the development of new methods for identifying differentially expressed (DE) genes. In this study, we proposed a new method, SigEMD, that combines a data imputation approach, a logistic regression model and a nonparametric method based on the Earth Mover's Distance, to precisely and efficiently identify DE genes in scRNAseq data. The regression model and data imputation are used to reduce the impact of large amounts of zero counts, and the nonparametric method is used to improve the sensitivity of detecting DE genes from multimodal scRNAseq data. By additionally employing gene interaction network information to adjust the final states of DE genes, we further reduce the false positives of calling DE genes. We used simulated datasets and real datasets to evaluate the detection accuracy of the proposed method and to compare its performance with those of other differential expression analysis methods. Results indicate that the proposed method has an overall powerful performance in terms of precision in detection, sensitivity, and specificity. Copyright © 2018 Elsevier Inc. All rights reserved.
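
    The nonparametric core of the method is the Earth Mover's Distance between a gene's expression distributions in two groups of cells; for one gene this is exactly SciPy's 1-D Wasserstein distance. A toy sketch with zero-inflated placeholder data (the imputation, regression, and network steps are omitted):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(9)
# Placeholder log-expression of one gene in two cell populations,
# with the zero inflation typical of scRNAseq.
group_a = np.where(rng.random(300) < 0.6, 0.0, rng.normal(2.0, 1.0, 300))
group_b = np.where(rng.random(300) < 0.3, 0.0, rng.normal(3.0, 1.0, 300))

emd = wasserstein_distance(group_a, group_b)
print(f"Earth Mover's Distance between conditions: {emd:.3f}")
```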

  5. Down-weighting overlapping genes improves gene set analysis

    PubMed Central

    2012-01-01

    Background The identification of gene sets that are significantly impacted in a given condition based on microarray data is a crucial step in current life science research. Most gene set analysis methods treat genes equally, regardless how specific they are to a given gene set. Results In this work we propose a new gene set analysis method that computes a gene set score as the mean of absolute values of weighted moderated gene t-scores. The gene weights are designed to emphasize the genes appearing in few gene sets, versus genes that appear in many gene sets. We demonstrate the usefulness of the method when analyzing gene sets that correspond to the KEGG pathways, and hence we called our method Pathway Analysis with Down-weighting of Overlapping Genes (PADOG). Unlike most gene set analysis methods which are validated through the analysis of 2-3 data sets followed by a human interpretation of the results, the validation employed here uses 24 different data sets and a completely objective assessment scheme that makes minimal assumptions and eliminates the need for possibly biased human assessments of the analysis results. Conclusions PADOG significantly improves gene set ranking and boosts sensitivity of analysis using information already available in the gene expression profiles and the collection of gene sets to be analyzed. The advantages of PADOG over other existing approaches are shown to be stable to changes in the database of gene sets to be analyzed. PADOG was implemented as an R package available at: http://bioinformaticsprb.med.wayne.edu/PADOG/ or http://www.bioconductor.org. PMID:22713124
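
    The down-weighting idea can be illustrated with a simple inverse-frequency weight; note that the published PADOG weighting formula differs in detail, and the gene sets and t-scores below are hypothetical.

```python
import numpy as np

# Hypothetical gene sets and per-gene moderated t-scores.
gene_sets = {
    "pathway_A": ["g1", "g2", "g3"],
    "pathway_B": ["g2", "g3", "g4"],
    "pathway_C": ["g3", "g5"],
}
t_scores = {"g1": 2.1, "g2": 0.4, "g3": 1.8, "g4": 3.0, "g5": 0.9}

# Genes appearing in many sets get smaller weights (simple 1/frequency here).
freq = {g: sum(g in s for s in gene_sets.values()) for g in t_scores}
weight = {g: 1.0 / f for g, f in freq.items()}

# Gene-set score: mean of weighted absolute t-scores of its member genes.
scores = {name: np.mean([abs(t_scores[g]) * weight[g] for g in genes])
          for name, genes in gene_sets.items()}
print(scores)
```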

  6. Visual aggregate analysis of eligibility features of clinical trials.

    PubMed

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-04-01

    To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition, one eligibility feature at a time. Using the previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query building, distribution analysis, and visualization, respectively. The method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) the distribution of study enrollment over consecutive value points or value intervals of each quantitative feature, and (3) the distribution of studies over the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using the Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving the quantitative features BMI and HbA1c for the conditions "hypertension" and "Type 2 diabetes", respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Hybrid PV/diesel solar power system design using multi-level factor analysis optimization

    NASA Astrophysics Data System (ADS)

    Drake, Joshua P.

    Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state-of-the-art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in-depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as applied to solar power system design. The solar power design algorithms, software workflow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations, was generated. It was determined that there are several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.

  8. Probability of detecting band-tailed pigeons during call-broadcast versus auditory surveys

    USGS Publications Warehouse

    Kirkpatrick, C.; Conway, C.J.; Hughes, K.M.; Devos, J.C.

    2007-01-01

    Estimates of population trend for the interior subspecies of band-tailed pigeon (Patagioenas fasciata fasciata) are not available because no standardized survey method exists for monitoring the interior subspecies. We evaluated 2 potential band-tailed pigeon survey methods (auditory and call-broadcast surveys) from 2002 to 2004 in 5 mountain ranges in southern Arizona, USA, and in mixed-conifer forest throughout the state. Both auditory and call-broadcast surveys produced low numbers of cooing pigeons detected per survey route (x̄ ≤ 0.67) and had relatively high temporal variance in the average number of cooing pigeons detected during replicate surveys (CV ≥ 161%). However, compared to auditory surveys, use of call-broadcast increased 1) the percentage of replicate surveys on which ≥1 cooing pigeon was detected by an average of 16%, and 2) the number of cooing pigeons detected per survey route by an average of 29%, with this difference being greatest during the first 45 minutes of the morning survey period. Moreover, the probability of detecting a cooing pigeon was 27% greater during call-broadcast (0.80) versus auditory (0.63) surveys. We found that cooing pigeons were most common in mixed-conifer forest in southern Arizona, and the density of male pigeons in mixed-conifer forest throughout the state averaged 0.004 (SE = 0.001) pigeons/ha. Our results are the first to show that call-broadcast increases the probability of detecting band-tailed pigeons (or any species of Columbidae) during surveys. Call-broadcast surveys may provide a useful method for monitoring populations of the interior subspecies of band-tailed pigeon in areas where other survey methods are inappropriate.

  9. Source localization of narrow band signals in multipath environments, with application to marine mammals

    NASA Astrophysics Data System (ADS)

    Valtierra, Robert Daniel

    Passive acoustic localization has benefited from many major developments and has become an increasingly important focus in marine mammal research. Several challenges still remain. This work seeks to address several of them, such as tracking the calling depths of baleen whales. In this work, data from an array of widely spaced Marine Acoustic Recording Units (MARUs) were used to achieve three-dimensional localization by combining the methods of Time Difference of Arrival (TDOA) and Direct-Reflected Time Difference of Arrival (DRTD) along with a newly developed autocorrelation technique. TDOA was applied to the data for two-dimensional (latitude and longitude) localization, and depth was resolved using DRTD. Previously, DRTD had been limited to pulsed broadband signals, such as sperm whale or dolphin echolocation, where individual direct and reflected signals are separated in time. Due to the length of typical baleen whale vocalizations, individual multipath signal arrivals can overlap, making time differences of arrival difficult to resolve. This problem can be solved using an autocorrelation, which can extract reflection information from overlapping signals. To establish this technique, a derivation was made to model the autocorrelation of a direct signal and its overlapping reflection. The model was exploited to derive performance limits allowing for prediction of the minimum resolvable direct-reflected time difference for a known signal type. The dependence on signal parameters (sweep rate, call duration) was also investigated. The model was then verified using both recorded and simulated data from two analysis cases for North Atlantic right whales (NARWs, Eubalaena glacialis) and humpback whales (Megaptera novaeangliae). The newly developed autocorrelation technique was then combined with DRTD and tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The combined DRTD-autocorrelation method enabled calling depth and range estimation of a vocalizing NARW and humpback whale in two separate cases. The DRTD-autocorrelation method was then combined with TDOA to create a three-dimensional track of a NARW in the Stellwagen Bank National Marine Sanctuary. Results from these experiments illustrate the potential of the combined methods to successfully resolve baleen calling depths in three dimensions.
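
    The key signal-processing step can be illustrated in a few lines: the autocorrelation of a direct arrival overlapped with its surface reflection shows a secondary peak at the direct-reflected delay. A toy sketch with a synthetic chirp as a stand-in for a whale call (all parameters are placeholders, not those of the dissertation):

```python
import numpy as np
from scipy.signal import chirp

fs = 2000.0
t = np.arange(0, 1.0, 1 / fs)
call = chirp(t, f0=100, f1=200, t1=1.0)       # stand-in for a baleen whale upcall

delay_s = 0.050                               # true direct-reflected delay (placeholder)
lag = int(delay_s * fs)
received = call.copy()
received[lag:] += -0.6 * call[:-lag]          # overlapping, inverted surface echo

# Autocorrelation; keep non-negative lags and suppress the zero-lag mainlobe.
ac = np.correlate(received, received, mode="full")[received.size - 1:]
ac[: int(0.02 * fs)] = 0
est = np.argmax(np.abs(ac)) / fs
print(f"estimated delay: {est * 1000:.1f} ms")   # should be close to 50 ms
```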

  10. Effective normalization for copy number variation detection from whole genome sequencing.

    PubMed

    Janevski, Angel; Varadan, Vinay; Kamalakaran, Sitharthan; Banerjee, Nilanjana; Dimitrova, Nevenka

    2012-01-01

    Whole genome sequencing enables a high-resolution view of the human genome and provides unique insights into genome structure at an unprecedented scale. There are a number of tools to infer copy number variation (CNV) in the genome. These tools, while validated, also include a number of parameters that are configurable to the genome data being analyzed. These algorithms allow for normalization to account for individual and population-specific effects on individual genome CNV estimates, but the impact of these choices on the estimated CNVs is not well characterized. We evaluate in detail the effect of normalization methodologies in two CNV algorithms, FREEC and CNV-seq, using whole genome sequencing data from 8 individuals spanning four populations. We apply FREEC and CNV-seq to a sequencing data set consisting of 8 genomes. We use multiple configurations corresponding to different read-count normalization methodologies in FREEC, and statistically characterize the concordance of the CNV calls between FREEC configurations and the analogous output from CNV-seq. The normalization methodologies evaluated in FREEC are: GC content, mappability and control genome. We further stratify the concordance analysis within genic, non-genic, and a collection of validated variant regions. The GC content normalization methodology generates the highest number of altered copy number regions. Both mappability and control genome normalization reduce the total number and length of copy number regions. Mappability normalization yields Jaccard indices in the 0.07-0.3 range, whereas using a control genome yields Jaccard index values around 0.4 when compared with normalization based on GC content. The most critical impact of using mappability as a normalization factor is a substantial reduction of deletion CNV calls. The output of another method based on control genome normalization, CNV-seq, resulted in comparable CNV call profiles and substantial agreement in variable gene and CNV region calls. The choice of read-count normalization methodology has a substantial effect on CNV calls, and the use of genomic mappability or an appropriately chosen control genome can optimize the output of CNV analysis.
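
    Concordance statements like the Jaccard indices quoted above reduce to interval overlap between call sets. A toy bin-level sketch with placeholder coordinates:

```python
# CNV calls as (chrom, start, end) intervals from two configurations (placeholders).
calls_gc   = [("chr1", 100_000, 250_000), ("chr2", 500_000, 600_000)]
calls_mapp = [("chr1", 150_000, 300_000)]

def to_bins(calls, bin_size=10_000):
    """Represent a call set as the set of fixed-size genomic bins it covers."""
    bins = set()
    for chrom, start, end in calls:
        bins.update((chrom, b) for b in range(start // bin_size, end // bin_size))
    return bins

a, b = to_bins(calls_gc), to_bins(calls_mapp)
jaccard = len(a & b) / len(a | b)
print(f"Jaccard index of the two CNV call sets: {jaccard:.2f}")
```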

  11. SDF technology in location and navigation procedures: a survey of applications

    NASA Astrophysics Data System (ADS)

    Kelner, Jan M.; Ziółkowski, Cezary

    2017-04-01

    The basis for the development of the Doppler location method, also called the signal Doppler frequency (SDF) method or technology, is the analytical solution of the wave equation for a mobile source. This paper presents an overview of simulations, numerical analyses and empirical studies of the possibilities and the range of applications of the SDF method. In the paper, various applications from numerous publications are collected and described. They mainly focus on the use of the SDF method in emitter positioning, electronic warfare, crisis management, search and rescue, and navigation. The developed method is characterized by an innovative property, unique among other location methods, in that it allows the simultaneous location of many radio emitters. Moreover, this is the first method based on the Doppler effect that allows positioning of transmitters using a single mobile platform. In the paper, the results of the use of the SDF method by other teams are also presented.

  12. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  13. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  14. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  15. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  16. Increasing Medicaid child health screenings: the effectiveness of mailed pamphlets, phone calls, and home visits.

    PubMed Central

    Selby-Harrington, M; Sorenson, J R; Quade, D; Stearns, S C; Tesh, A S; Donat, P L

    1995-01-01

    OBJECTIVES. A randomized controlled trial was conducted to test the effectiveness and cost effectiveness of three outreach interventions to promote well-child screening for children on Medicaid. METHODS. In rural North Carolina, a random sample of 2053 families with children due or overdue for screening was stratified according to the presence of a home phone. Families were randomly assigned to receive a mailed pamphlet and letter, a phone call, or a home visit outreach intervention, or the usual (control) method of informing at Medicaid intake. RESULTS. All interventions produced more screenings than the control method, but increases were significant only for families with phones. Among families with phones, a home visit was the most effective intervention but a phone call was the most cost-effective. However, absolute rates of effectiveness were low, and incremental costs per effect were high. CONCLUSIONS. Pamphlets, phone calls, and home visits by nurses were minimally effective for increasing well-child screenings. Alternate outreach methods are needed, especially for families without phones. PMID:7573627

  17. History and development of the Schmidt-Hunter meta-analysis methods.

    PubMed

    Schmidt, Frank L

    2015-09-01

    In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM felt a need for a better way to demonstrate test validity, especially in light of court cases challenging selection methods. In response, we created our method of meta-analysis (initially called validity generalization). Results showed that most of the variability of validity estimates from study to study was because of sampling error and other research artifacts such as variations in range restriction and measurement error. Corrections for these artifacts in our research and in replications by others showed that the predictive validity of most tests was high and generalizable. This conclusion challenged long-standing beliefs and so provoked resistance, which over time was overcome. The 1982 book that we published extending these methods to research areas beyond personnel selection was positively received and was followed by expanded books in 1990, 2004, and 2014. Today, these methods are being applied in a wide variety of areas. Copyright © 2015 John Wiley & Sons, Ltd.

  18. Chemometrics Methods for Specificity, Authenticity and Traceability Analysis of Olive Oils: Principles, Classifications and Applications

    PubMed Central

    Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil

    2016-01-01

    Background. Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification, resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blend preparations, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. Methods. These quality control and management aims require the use of several multivariate statistical tools: highlighting specificity requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. Results. This chapter presents a review of different chemometrics methods applied for the control of OO variability on the basis of measured metabolic and physicochemical characteristics. The different chemometrics methods are illustrated by study cases on monovarietal and blended OOs originating from different countries. Conclusion. Chemometrics tools offer multiple ways for quantitative evaluation and qualitative control of the complex chemical variability of OOs in relation to several intrinsic and extrinsic factors. PMID:28231172

  19. Virtual Surveyor based Object Extraction from Airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Habib, Md. Ahsan

    Topographic feature detection of land cover from LiDAR data is important in various fields: city planning, disaster response and prevention, soil conservation, infrastructure and forestry. In recent years, feature classification compliant with the Object-Based Image Analysis (OBIA) methodology has been gaining traction in remote sensing and geographic information science (GIS). In OBIA, the LiDAR image is first divided into meaningful segments called object candidates. This yields, in addition to spectral values, a plethora of new information such as aggregated spectral pixel values, morphology, texture, context and topology. Traditional nonparametric segmentation methods rely on segmentations at different scales to produce a hierarchy of semantically significant objects. Properly tuned scale parameters are, therefore, imperative in these methods for successful subsequent classification. Recently, some progress has been made in the development of methods for tuning the parameters for automatic segmentation. However, researchers have found that it is very difficult to automatically refine the tuning with respect to each object class present in the scene. Moreover, due to the relative complexity of real-world objects, the intra-class heterogeneity is very high, which leads to over-segmentation. Therefore, such methods fail to deliver correctly many of the new segment features. In this dissertation, a new hierarchical 3D object segmentation algorithm called Automatic Virtual Surveyor based Object Extraction (AVSOE) is presented. AVSOE segments objects based on their distinct geometric concavity/convexity. This is achieved by strategically mapping the sloping surface that connects each object to its background. Further analysis produces a hierarchical decomposition of objects into sub-objects at a single scale level. Extensive qualitative and quantitative results are presented to demonstrate the efficacy of this hierarchical segmentation approach.

  20. Usage of Video Analysis of Traffic Conflicts for the Evaluation of Inappropriately Designed Building Elements on Intersections

    NASA Astrophysics Data System (ADS)

    Krivda, Vladislav; Petru, Jan

    2018-04-01

    Human society is constantly evolving in all areas. Transport is no exception: the development of road transport in particular is considerable and visible at all levels. However, it is necessary to look for tools that can harmonize economic and social progress with the full preservation of the environment for future generations. The so-called sustainable development of road transport is also closely related to the issue of creating safe transport infrastructure. For example, roads must comply with technical rules on the one hand, and on the other they have to be clearly understandable for all road users (drivers, pedestrians, etc.). Unfortunately, even when the designer of a transport construction meets the technical requirements, the result can be a dangerous place where accidents or traffic conflicts frequently take place. Statistics of accidents exist, but there are no corresponding statistics of traffic conflicts, even though simply monitoring traffic conflicts can lead to early detection of problematic places in the road infrastructure and thus increase safety in road traffic. There are many methods used for monitoring traffic conflicts; the method used in the Czech Republic is described in the submitted paper. The original method, which monitored mainly and only the behaviour of drivers and their mistakes, was innovated so that it could also be used for the evaluation of inappropriately designed building elements of road constructions. Therefore, the paper deals with the description of the so-called Innovated Video Analysis of Traffic Conflicts (IVATC) and with the usage of this method for one particular example of an intersection where larger vehicles (buses, trucks, etc.) had problems passing through. It points out that even a relatively small change can lead to an increase in road safety.
