Mapping Base Modifications in DNA by Transverse-Current Sequencing
NASA Astrophysics Data System (ADS)
Alvarez, Jose R.; Skachkov, Dmitry; Massey, Steven E.; Kalitsov, Alan; Velev, Julian P.
2018-02-01
Sequencing DNA modifications and lesions, such as methylation of cytosine and oxidation of guanine, is even more important and challenging than sequencing the genome itself. The traditional methods for detecting DNA modifications are either insensitive to these modifications or require additional processing steps to identify a particular type of modification. Transverse-current sequencing in nanopores can potentially identify the canonical bases and base modifications in the same run. In this work, we demonstrate that the most common DNA epigenetic modifications and lesions can be detected with any predefined accuracy based on their tunneling current signature. Our results are based on simulations of the nanopore tunneling current through DNA molecules, calculated using nonequilibrium electron-transport methodology within an effective multiorbital model derived from first-principles calculations, followed by a base-calling algorithm accounting for neighbor current-current correlations. This methodology can be integrated with existing experimental techniques to improve base-calling fidelity.
Klink, Vincent P.; Overall, Christopher C.; Alkharouf, Nadim W.; MacDonald, Margaret H.; Matthews, Benjamin F.
2010-01-01
Background. A comparative microarray investigation was performed using detection call methodology (DCM) and differential expression analyses. The goal was to identify genes found in specific cell populations that were eliminated by differential expression analysis due to the nature of differential expression methods. Laser capture microdissection (LCM) was used to isolate nearly homogeneous populations of plant root cells. Results. The analyses identified the presence of 13,291 transcripts across the 4 different sample types. These were filtered down to a total of 6,267 transcripts detected as present in one or more sample types. A comparative analysis of DCM and differential expression methods revealed a group of genes that were not differentially expressed but were expressed at detectable levels within specific cell types. Conclusion. The DCM identified patterns of gene expression not shown by differential expression analyses. DCM identified genes that are possibly cell-type specific and/or involved in important aspects of plant-nematode interactions during the resistance response, revealing the uniqueness of a particular cell population at a particular point during its differentiation process. PMID:20508855
Le Bras, Ronan J; Kuzma, Heidi; Sucic, Victor; Bokelmann, Götz
2016-05-01
A notable sequence of calls, spanning several days in January 2003, was recorded in the central part of the Indian Ocean on a hydrophone triplet acquiring acoustic data at a 250 Hz sampling rate. This paper presents signal processing methods applied to the waveform data to detect and group the recorded signals and to extract amplitude and bearing estimates. An approximate location for the source of the sequence of calls is inferred from the extracted features. As the source approaches the hydrophone triplet, the source level (SL) of the calls is estimated at 187 ± 6 dB re 1 μPa at 1 m in the 15-60 Hz frequency range. The calls are attributed to a subgroup of blue whales, Balaenoptera musculus, with a characteristic acoustic signature. A Bayesian location method using probabilistic models for bearing and amplitude is demonstrated on the call sequence. The method is applied to the case of detection at a single triad of hydrophones and results in a probability distribution map for the origin of the calls. It can be extended to detections at multiple triads, and because of the Bayesian formulation, additional modeling complexity can be built in as needed.
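The SL estimate implies a correction for propagation loss between the whale and the hydrophones. A minimal sketch, assuming simple spherical spreading (TL = 20 log10 r) rather than the paper's full propagation model, and with an illustrative received level and range that are not the study's data:

```python
import math

def source_level(received_level_db, range_m):
    """Estimate source level (dB re 1 uPa at 1 m) from a received level,
    assuming spherical spreading: TL = 20 * log10(r)."""
    return received_level_db + 20 * math.log10(range_m)

# Illustrative: a call received at 120 dB from ~2.24 km back-propagates
# to roughly 187 dB re 1 uPa at 1 m under spherical spreading.
sl = source_level(120.0, 2239.0)
```

In practice the transmission-loss model (spherical, cylindrical, or hybrid) dominates the uncertainty, which is consistent with the ±6 dB range quoted above.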
NASA Astrophysics Data System (ADS)
Gromov, M. B.
2017-11-01
The proposed methodology, developed in cooperation among the LIGO, VIRGO, Borexino, LVD, and IceCube collaborations, is based on a joint analysis of data from neutrino and gravitational-wave detectors, which record the corresponding radiations, almost undistorted by the interstellar medium and propagating at similar speeds. This approach makes it possible to increase the reliability of observations, detect so-called silent supernovae, and explore the properties and generation mechanisms of gravitational waves.
Effective normalization for copy number variation detection from whole genome sequencing.
Janevski, Angel; Varadan, Vinay; Kamalakaran, Sitharthan; Banerjee, Nilanjana; Dimitrova, Nevenka
2012-01-01
Whole genome sequencing enables a high resolution view of the human genome and provides unique insights into genome structure at an unprecedented scale. A number of tools exist to infer copy number variation in the genome. These tools, while validated, also include a number of parameters that are configurable to the genome data being analyzed. These algorithms allow for normalization to account for individual and population-specific effects on individual genome CNV estimates, but the impact of these choices on the estimated CNVs is not well characterized. We evaluate in detail the effect of normalization methodologies in two CNV algorithms, FREEC and CNV-seq, using whole genome sequencing data from 8 individuals spanning four populations. We apply FREEC and CNV-seq to this 8-genome sequencing data set, use multiple configurations corresponding to different read-count normalization methodologies in FREEC, and statistically characterize the concordance of the CNV calls between FREEC configurations and the analogous output from CNV-seq. The normalization methodologies evaluated in FREEC are: GC content, mappability, and control genome. We further stratify the concordance analysis within genic, non-genic, and a collection of validated variant regions. The GC content normalization methodology generates the highest number of altered copy number regions. Both mappability and control genome normalization reduce the total number and length of copy number regions. Mappability normalization yields Jaccard indices in the 0.07-0.3 range, whereas control genome normalization yields Jaccard index values around 0.4 when compared with GC content normalization. The most critical impact of using mappability as a normalization factor is a substantial reduction of deletion CNV calls. The output of another method based on control genome normalization, CNV-seq, resulted in comparable CNV call profiles and substantial agreement in variable gene and CNV region calls.
Choice of read-count normalization methodology has a substantial effect on CNV calls and the use of genomic mappability or an appropriately chosen control genome can optimize the output of CNV analysis.
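Concordance between two sets of CNV calls, as reported above, can be measured as a Jaccard index over the base pairs the calls cover. A minimal sketch, not the paper's implementation, treating each call set as half-open (start, end) intervals on a single chromosome:

```python
def total_length(intervals):
    """Total covered length of a set of (start, end) intervals, merging overlaps."""
    covered, last_end = 0, None
    for start, end in sorted(intervals):
        if last_end is None or start > last_end:
            covered += end - start
            last_end = end
        elif end > last_end:          # partial overlap: count only the new part
            covered += end - last_end
            last_end = end
    return covered

def intersect_length(a, b):
    """Length of the intersection of two sorted, non-overlapping interval sets."""
    total, i, j = 0, 0, 0
    a, b = sorted(a), sorted(b)
    while i < len(a) and j < len(b):
        lo, hi = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
        if lo < hi:
            total += hi - lo
        # advance whichever interval ends first
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return total

def jaccard(a, b):
    """Jaccard index of base-pair coverage between two CNV call sets."""
    inter = intersect_length(a, b)
    union = total_length(a) + total_length(b) - inter
    return inter / union if union else 0.0
```

For example, `jaccard([(0, 10)], [(5, 15)])` gives 1/3: five shared bases out of fifteen covered overall.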
Allele-specific copy-number discovery from whole-genome and whole-exome sequencing
Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J.; Szatkiewicz, Jin P.
2015-01-01
Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol is freely available at https://sourceforge.net/projects/asgenseng/. PMID:25883151
Roberson, A.M.; Andersen, D.E.; Kennedy, P.L.
2005-01-01
Broadcast surveys using conspecific calls are currently the most effective method for detecting northern goshawks (Accipiter gentilis) during the breeding season. These surveys typically use alarm calls during the nestling phase and juvenile food-begging calls during the fledgling-dependency phase. Because goshawks are most vocal during the courtship phase, we hypothesized that this phase would be an effective time to detect goshawks. Our objective was to improve current survey methodology by evaluating the probability of detecting goshawks at active nests in northern Minnesota in 3 breeding phases and at 4 broadcast distances and to determine the effective area surveyed per broadcast station. Unlike previous studies, we broadcast calls at only 1 distance per trial. This approach better quantifies (1) the relationship between distance and probability of detection, and (2) the effective area surveyed (EAS) per broadcast station. We conducted 99 broadcast trials at 14 active breeding areas. When pooled over all distances, detection rates were highest during the courtship (70%) and fledgling-dependency phases (68%). Detection rates were lowest during the nestling phase (28%), when there appeared to be higher variation in likelihood of detecting individuals. EAS per broadcast station was 39.8 ha during courtship and 24.8 ha during fledgling-dependency. Consequently, in northern Minnesota, broadcast stations may be spaced 712 m and 562 m apart when conducting systematic surveys during courtship and fledgling-dependency, respectively. We could not calculate EAS for the nestling phase because probability of detection was not a simple function of distance from nest. Calculation of EAS could be applied to other areas where the probability of detection is a known function of distance.
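When the probability of detection p(r) is a known function of distance, the effective area surveyed follows from integrating over annuli around the station, EAS = ∫ 2πr p(r) dr. A hedged numerical sketch; the logistic curve below is purely illustrative, not the function fitted in the study:

```python
import math

def effective_area_surveyed(p, r_max, dr=1.0):
    """Midpoint-rule integration of EAS = integral of 2*pi*r*p(r) dr
    from 0 to r_max metres, where p(r) is detection probability at distance r."""
    area, r = 0.0, dr / 2
    while r < r_max:
        area += 2 * math.pi * r * p(r) * dr
        r += dr
    return area  # in square metres; divide by 10_000 for hectares

# Illustrative detection curve (hypothetical parameters): probability
# near 1 close to the nest, declining logistically around ~300 m.
p = lambda r: 1 / (1 + math.exp((r - 300) / 60))
eas_ha = effective_area_surveyed(p, 1000) / 10_000
```

As a sanity check, a perfect detector out to radius R recovers the disc area πR²; with a declining p(r) the EAS is correspondingly smaller, which is how station spacing figures like those above can be derived.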
Smith, Adam D.; Paton, Peter W. C.; McWilliams, Scott R.
2014-01-01
Atmospheric conditions fundamentally influence the timing, intensity, energetics, and geography of avian migration. While radar is typically used to infer the influence of weather on the magnitude and spatiotemporal patterns of nocturnal bird migration, monitoring the flight calls produced by many bird species during nocturnal migration represents an alternative methodology and provides information regarding the species composition of nocturnal migration. We used nocturnal flight call (NFC) recordings of at least 22 migratory songbirds (14 warbler and 8 sparrow species) during fall migration from eight sites along the mainland and island coasts of Rhode Island to evaluate five hypotheses regarding NFC detections. Patterns of warbler and sparrow NFC detections largely supported our expectations in that (1) NFC detections associated positively and strongly with wind conditions that influence the intensity of coastal bird migration and negatively with regional precipitation; (2) NFCs increased during conditions with reduced visibility (e.g., high cloud cover); (3) NFCs decreased with higher wind speeds, presumably due mostly to increased ambient noise; and (4) coastal mainland sites recorded five to nine times more NFCs, on average, than coastal nearshore or offshore island sites. However, we found little evidence that (5) nightly or intra-night patterns of NFCs reflected the well-documented latitudinal patterns of migrant abundance on an offshore island. Despite some potential complications in inferring migration intensity and species composition from NFC data, the acoustic monitoring of NFCs provides a viable and complementary methodology for exploring the spatiotemporal patterns of songbird migration as well as evaluating the atmospheric conditions that shape these patterns. PMID:24643060
[Evaluation of work-related stress in call-center workers: application of a methodology].
Ansaloni, Gianluca; Cichella, Patrizia; Morelli, Carla; Alberghini, Villiam; Finardi, Elisabetta; Guglielmin, Antonia Maria; Nini, Donatella; Sacenti, Elisabetta; Stagni, Cristina
2014-01-01
Several studies have highlighted a correlation between call-centre working conditions and psychosocial and ergonomic hazards. The aim of this study is to provide an operating methodology for the risk assessment of work-related stress. The study involved 554 call-centre workers employed in three insurance organizations, and a mixed working group (worker, company, and public health representatives) was set up to manage the study. We first tested an objective self-developed checklist and then administered a modified version of the OSI (Occupational Stress Indicator) questionnaire. The two data collection methods yielded complementary information. The findings highlight a low level of perceived stress and health complaints compared with other studies previously carried out, mainly in 'outsourcing' call centres: workers do not show stress symptoms even without adopting coping strategies. Moreover, the study indicates an acceptable level of work satisfaction, although career opportunities are limited. These results are probably due to the low job seniority associated with high job security--the large majority of respondents, 87%, were permanent workers--and to working time consisting mainly of daily shifts five days a week. Our methodology seems able to detect the level of work-related stress with a good degree of coherence. Furthermore, the presence of a mixed working group fostered a good level of involvement among the workers: 464 out of 554 operators completed and returned the questionnaire, a response rate of about 84%.
Schuchmann, Maike; Siemers, Björn M.
2010-01-01
Background. Data on bat echolocation call intensities have only recently begun to accumulate. Yet intensity is an ecologically crucial parameter, as it determines the extent of the bats' perceptual space and, specifically, prey detection distance. Interspecifically, we thus asked whether sympatric, congeneric bat species differ in call intensities and whether such differences play a role in niche differentiation. Specifically, we investigated whether R. mehelyi, which calls at a frequency clearly above what is predicted by allometry, compensates for the frequency-dependent loss in detection distance by using elevated call intensity. Maximum echolocation call intensities might depend on body size or condition and thus be used as an honest signal of quality for intraspecific communication. We investigated for the first time whether a size-intensity relation is present in echolocating bats. Methodology/Principal Findings. We measured maximum call intensities and frequencies for all five European horseshoe bat species. Maximum intensity differed among species, largely due to R. euryale. Furthermore, we found no compensation for frequency-dependent loss in detection distance in R. mehelyi. Intraspecifically, there is a negative correlation between forearm length and intensity in R. euryale and a trend for a negative correlation between body condition index and intensity in R. ferrumequinum. In R. hipposideros, females had 8 dB higher intensities than males. There were no correlations between intensity and body size, and no sex differences, for the other species. Conclusions/Significance. Based on call intensity and frequency measurements, we estimated echolocation ranges for our study community. These suggest that intensity differences result in different prey detection distances and thus likely play some role in resource access. It is interesting and at first glance counter-intuitive that, where a correlation was found, smaller bats called louder than large individuals.
Such negative relationship between size or condition and vocal amplitude may indicate an as yet unknown physiological or sexual selection pressure. PMID:20862252
Detecting Role Errors in the Gene Hierarchy of the NCI Thesaurus
Min, Hua; Cohen, Barry; Halper, Michael; Oren, Marc; Perl, Yehoshua
2008-01-01
Gene terminologies are playing an increasingly important role in the ever-growing field of genomic research. While errors in large, complex terminologies are inevitable, gene terminologies are even more susceptible to them due to the rapid growth of genomic knowledge and the nature of its discovery. It is therefore very important to establish quality-assurance protocols for such genomic-knowledge repositories. Different kinds of terminologies oftentimes require auditing methodologies adapted to their particular structures. In light of this, an auditing methodology tailored to the characteristics of the NCI Thesaurus’s (NCIT’s) Gene hierarchy is presented. The Gene hierarchy is of particular interest to the NCIT’s designers due to the primary role of genomics in current cancer research. This multiphase methodology focuses on detecting role-errors, such as missing roles or roles with incorrect or incomplete target structures, occurring within that hierarchy. The methodology is based on two kinds of abstraction networks, called taxonomies, that highlight the role distribution among concepts within the IS-A (subsumption) hierarchy. These abstract views tend to highlight portions of the hierarchy having a higher concentration of errors. The errors found during an application of the methodology are reported. Hypotheses pertaining to the efficacy of our methodology are investigated. PMID:19221606
Auditing as part of the terminology design life cycle.
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology's concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert's manual review on portions of the concepts with a high likelihood of errors.
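The first step of this partition, dividing concepts into areas keyed by their exact set of roles, can be sketched as follows. The concept and role names below are invented for illustration and are not drawn from the NCIT:

```python
from collections import defaultdict

def partition_into_areas(concepts):
    """Divide terminology concepts into areas: each area groups all
    concepts that share exactly the same set of roles (relationships)."""
    areas = defaultdict(set)
    for concept, roles in concepts.items():
        areas[frozenset(roles)].add(concept)
    return dict(areas)

# Toy terminology fragment (hypothetical concepts and roles)
concepts = {
    "Cell_Growth":  {"has_location"},
    "Apoptosis":    {"has_location"},
    "Gene_Binding": {"has_location", "has_participant"},
}
areas = partition_into_areas(concepts)
# Two areas result: one for {has_location}, one for the two-role set.
```

Each multi-rooted area would then be subdivided into singly-rooted p-areas by following IS-A links within the area, which requires the hierarchy itself and is omitted here.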
Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.
Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J
2016-02-01
It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns, called forbidden patterns, that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular sampling: missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
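The BP symbolization and forbidden-pattern count can be sketched in a few lines. The example below uses the fully chaotic logistic map, a standard deterministic test signal, rather than the paper's continuous model system; for this map the strictly decreasing order-3 pattern (2, 1, 0) is known to be forbidden:

```python
from itertools import permutations

def ordinal_pattern(window):
    """Bandt-Pompe symbol: the index permutation that sorts the window ascending."""
    return tuple(sorted(range(len(window)), key=window.__getitem__))

def forbidden_patterns(series, order=3):
    """All ordinal patterns of the given order that never occur in the series."""
    seen = {ordinal_pattern(series[i:i + order])
            for i in range(len(series) - order + 1)}
    return set(permutations(range(order))) - seen

# Deterministic test data: logistic map x -> 4x(1-x)
x, xs = 0.4, []
for _ in range(2000):
    x = 4 * x * (1 - x)
    xs.append(x)
forbidden = forbidden_patterns(xs, order=3)  # contains (2, 1, 0)
```

A sufficiently long random series would exhibit all six order-3 patterns, so a nonempty forbidden set is evidence of determinism; irregular sampling perturbs the windows and can destroy this signature, which is the effect the paper quantifies.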
Use of a public telephone hotline to detect urban plague cases.
Malberg, J A; Pape, W J; Lezotte, D; Hill, A E
2012-11-01
Current methods for vector-borne disease surveillance are limited by time and cost. To avoid human infections from emerging zoonotic diseases, it is important that the United States develop cost-effective surveillance systems for these diseases. This study examines the methodology used in the surveillance of a plague epizootic involving tree squirrels (Sciurus niger) in Denver, Colorado, during the summer of 2007. A call-in centre for the public to report dead squirrels was used to direct animal carcass sampling. Staff used these reports to collect squirrel carcasses for analysis of Yersinia pestis infection. This sampling protocol was analysed at the census tract level using Poisson regression to determine the relationship between higher call volumes in a census tract and the risk of a carcass in that tract testing positive for plague. Over-sampling owing to call volume-directed collection was accounted for by including the number of animals collected as the denominator in the model. The risk of finding an additional plague-positive animal increased as the call volume per census tract increased. The risk in census tracts with >3 calls a month was significantly higher than in those with three or fewer calls a month. For tracts with 4-5 calls, the relative risk (RR) of an additional plague-positive carcass was 10.08 (95% CI 5.46-18.61); for tracts with 6-8 calls, the RR = 5.20 (2.93-9.20); for tracts with 9-11 calls, the RR = 12.80 (5.85-28.03); and tracts with >11 calls had RR = 35.41 (18.60-67.40). Overall, the call-in centre-directed sampling increased the probability of locating plague-infected carcasses in the known Denver epizootic. Further studies are needed to determine the effectiveness of this methodology for monitoring large-scale zoonotic disease occurrence in the absence of a recognized epizootic. © 2012 Blackwell Verlag GmbH.
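The RR figures come from a Poisson regression with the number of animals collected as the denominator. As a simpler, hedged sketch of the underlying quantity, a stratified relative risk with a Wald confidence interval can be computed directly from positive counts and collection totals; the numbers below are invented, not the study's data:

```python
import math

def relative_risk(pos_high, n_high, pos_ref, n_ref, z=1.96):
    """Relative risk of a plague-positive carcass in a high-call-volume
    stratum versus a reference stratum, with a Wald 95% CI on the log scale."""
    rr = (pos_high / n_high) / (pos_ref / n_ref)
    se = math.sqrt(1 / pos_high - 1 / n_high + 1 / pos_ref - 1 / n_ref)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 10 positives of 100 collected in a high-call stratum
# versus 5 positives of 100 in the reference stratum -> RR = 2.0
rr, lo, hi = relative_risk(10, 100, 5, 100)
```

Unlike this two-by-two sketch, the regression model in the study adjusts all strata jointly and handles the over-sampling offset directly.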
Seismic Characterization of EGS Reservoirs
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.
2014-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Vieira, Manuel; Fonseca, Paulo J; Amorim, M Clara P; Teixeira, Carlos J C
2015-12-01
The study of acoustic communication in animals often requires not only the recognition of species-specific acoustic signals but also the identification of individual subjects, all in a complex acoustic background. Moreover, when very long recordings are to be analyzed, automatic recognition and identification processes are invaluable tools to extract the relevant biological information. A pattern recognition methodology based on hidden Markov models is presented, inspired by successful results obtained in the most widely known and complex acoustical communication signal: human speech. This methodology was applied here for the first time to the detection and recognition of fish acoustic signals, specifically in a stream of round-the-clock recordings of Lusitanian toadfish (Halobatrachus didactylus) in their natural estuarine habitat. The results show that this methodology is able not only to detect the mating sounds (boatwhistles) but also to identify individual male toadfish, reaching an identification rate of ca. 95%. Moreover, this method also proved to be a powerful tool for assessing signal durations in large data sets. However, the system failed to recognize other sound types.
Pansharpening Techniques to Detect Mass Monument Damaging in Iraq
NASA Astrophysics Data System (ADS)
Baiocchi, V.; Bianchi, A.; Maddaluno, C.; Vidale, M.
2017-05-01
The recent mass destruction of monuments in Iraq cannot be monitored with terrestrial survey methodologies, for obvious reasons of safety. For the same reasons, the use of classical aerial photogrammetry is not advisable, so the use of multispectral Very High Resolution (VHR) satellite imagery is the obvious choice. Nowadays VHR satellite image resolutions are very close to those of airborne photogrammetric images, and the images are usually acquired in multispectral mode. The combination of the various bands of the images is called pan-sharpening, and it can be carried out using different algorithms and strategies. The correct pan-sharpening methodology for a specific image must be chosen considering the specific multispectral characteristics of the satellite used and the particular application. In this paper a first definition of guidelines for the use of VHR multispectral imagery to detect monument destruction in unsafe areas is reported. The proposed methodology, agreed with UNESCO and soon to be used in Libya for the coastal area, has produced a first report delivered to the Iraqi authorities. Some of the most evident examples are reported to show the capability of identifying damage using VHR images.
Gorresen, Paulo Marcos; Cryan, Paul; Montoya-Aiona, Kristina; Bonaccorso, Frank
2017-01-01
Bats vocalize during flight as part of the sensory modality called echolocation, but very little is known about whether flying bats consistently call. Occasional vocal silence during flight when bats approach prey or conspecifics has been documented for relatively few species and situations. Bats flying alone in clutter-free airspace are not known to forgo vocalization, yet prior observations suggested possible silent behavior in certain, unexpected situations. Determining when, why, and where silent behavior occurs in bats will help evaluate major assumptions of a primary monitoring method for bats used in ecological research, management, and conservation. In this study, we recorded flight activity of Hawaiian hoary bats (Lasiurus cinereus semotus) under seminatural conditions using both thermal video cameras and acoustic detectors. Simultaneous video and audio recordings from 20 nights of observation at 10 sites were analyzed for correspondence between detection methods, with a focus on video observations in three distance categories for which accompanying vocalizations were detected. Comparison of video and audio detections revealed that a high proportion of Hawaiian hoary bats “seen” on video were not simultaneously “heard.” On average, only about one in three visual detections within a night had an accompanying call detection, but this varied greatly among nights. Bats flying on curved flight paths and individuals nearer the cameras were more likely to be detected by both methods. Feeding and social calls were detected, but no clear pattern emerged from the small number of observations involving closely interacting bats. These results may indicate that flying Hawaiian hoary bats often forgo echolocation, or do not always vocalize in a way that is detectable with common sampling and monitoring methods. 
Possible reasons for the low correspondence between visual and acoustic detections range from methodological to biological and include a number of biases associated with the propagation and detection of sound, cryptic foraging strategies, or conspecific presence. Silent flight behavior may be more prevalent in echolocating bats than previously appreciated, has profound implications for ecological research, and deserves further characterization and study.
A feasibility study of damage detection in beams using high-speed camera (Conference Presentation)
NASA Astrophysics Data System (ADS)
Wan, Chao; Yuan, Fuh-Gwo
2017-04-01
In this paper, a method for damage detection in beam structures using a high-speed camera is presented. Traditional damage-detection methods typically rely on contact sensors (e.g., piezoelectric sensors or accelerometers) or non-contact sensors (e.g., laser vibrometers), which can make inspecting an entire structure costly and time consuming. With the popularity of digital cameras and the development of computer vision technology, video cameras offer a viable measurement capability, including high spatial resolution, remote sensing, and low cost. In this study, a damage detection method based on a high-speed camera was proposed. The setup comprises a high-speed camera and a line laser, which together capture the out-of-plane displacement of a cantilever beam. The cantilever beam, containing an artificial crack, was excited, and the vibration was recorded by the camera. A methodology called motion magnification, which amplifies subtle motions in a video, is used for modal identification of the beam. A finite element model was used to validate the proposed method. Applications of this methodology and challenges for future work are discussed.
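The motion magnification step can be illustrated on a single pixel's time series. The sketch below is a heavily simplified, hypothetical version (mean removal instead of a true temporal band-pass filter, synthetic data); actual implementations such as Eulerian video magnification operate on spatially filtered video frames.

```python
# Toy sketch of motion magnification on one pixel's time series:
# subtle temporal variation is isolated (here, simply mean-removed)
# and amplified. All numbers are synthetic.
import math

def magnify(signal, alpha):
    """Amplify deviations from the temporal mean by factor alpha."""
    mean = sum(signal) / len(signal)
    return [mean + alpha * (x - mean) for x in signal]

# A subtle 0.01-amplitude vibration around brightness 0.5.
frames = [0.5 + 0.01 * math.sin(2 * math.pi * t / 16) for t in range(64)]
big = magnify(frames, alpha=20.0)
print(round(max(big) - min(big), 3))  # → 0.4 (peak-to-peak grew from 0.02)
```

Magnifying the sub-pixel vibration makes the beam's mode shapes visible, which is what enables modal identification from ordinary video.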
CONFU: Configuration Fuzzing Testing Framework for Software Vulnerability Detection
Dai, Huning; Murphy, Christian; Kaiser, Gail
2010-01-01
Many software security vulnerabilities only reveal themselves under certain conditions, i.e., particular configurations and inputs together with a certain runtime environment. One approach to detecting these vulnerabilities is fuzz testing. However, typical fuzz testing makes no guarantees regarding the syntactic and semantic validity of the input, or of how much of the input space will be explored. To address these problems, we present a new testing methodology called Configuration Fuzzing. Configuration Fuzzing is a technique whereby the configuration of the running application is mutated at certain execution points, in order to check for vulnerabilities that only arise in certain conditions. As the application runs in the deployment environment, this testing technique continuously fuzzes the configuration and checks “security invariants” that, if violated, indicate a vulnerability. We discuss the approach and introduce a prototype framework called ConFu (CONfiguration FUzzing testing framework) for implementation. We also present the results of case studies that demonstrate the approach’s feasibility and evaluate its performance. PMID:21037923
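A minimal sketch of the configuration-fuzzing loop might look as follows. The configuration options, the mutation rule, and the security invariant are all invented for illustration; ConFu itself instruments real application configurations at chosen execution points.

```python
# Illustrative configuration-fuzzing loop: mutate one configuration
# option per execution point, then check a "security invariant".
# Option names and the invariant are hypothetical.
import random

CONFIG = {"allow_anonymous": False, "debug": False, "max_upload_mb": 10}

def fuzz_config(config, rng):
    """Return a copy of the config with exactly one mutated option."""
    mutated = dict(config)
    key = rng.choice(sorted(mutated))
    value = mutated[key]
    if isinstance(value, bool):
        mutated[key] = not value                     # flip boolean flags
    else:
        mutated[key] = value + rng.randint(1, 100)   # perturb numeric options
    return mutated

def violates_invariant(config):
    # Invented invariant: anonymous access and debug mode must never
    # be enabled at the same time.
    return config["allow_anonymous"] and config["debug"]

rng = random.Random(42)
state = dict(CONFIG)
violations = 0
for _ in range(1000):            # each iteration = one execution point
    state = fuzz_config(state, rng)
    violations += violates_invariant(state)
print(f"invariant violated at {violations} of 1000 execution points")
```

Because the configuration keeps evolving as the application "runs", combinations of options that a single-shot test would never exercise are eventually reached and checked.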
Applicability of a Conservative Margin Approach for Assessing NDE Flaw Detectability
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2007-01-01
Nondestructive Evaluation (NDE) procedures are required to detect flaws in structures with a high percentage detectability and high confidence. Conventional Probability of Detection (POD) methods are statistical in nature and require detection data from a relatively large number of flaw specimens. In many circumstances, due to the high cost and long lead time, it is impractical to build the large set of flaw specimens that is required by the conventional POD methodology. Therefore, in such situations it is desirable to have a flaw detectability estimation approach that allows for a reduced number of flaw specimens but provides a high degree of confidence in establishing the flaw detectability size. This paper presents an alternative approach called the conservative margin approach (CMA). To investigate the applicability of the CMA approach, flaw detectability sizes determined by the CMA and POD approaches have been compared on actual datasets. The results of these comparisons are presented and the applicability of the CMA approach is discussed.
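For background on why conventional POD demonstrations need large specimen sets, the classic binomial point-estimate rule can be computed directly: if a flaw is detected in all n of n trials, the one-sided lower bound on POD at confidence c is (1-c)^(1/n). This is standard statistics, not the paper's CMA method itself.

```python
# Background sketch: the conventional "90/95" point-estimate rule for
# POD demonstrations (standard binomial reasoning, not the CMA method).

def pod_lower_bound(n_trials, confidence=0.95):
    """One-sided lower confidence bound on POD when a flaw was
    detected in all n_trials of n_trials attempts."""
    alpha = 1.0 - confidence
    return alpha ** (1.0 / n_trials)

# The classic "29 of 29" demonstration: 29 consecutive detections
# establish 90% POD with 95% confidence -- hence the large specimen
# sets required by conventional POD studies.
print(round(pod_lower_bound(29), 3))  # → 0.902
```

The bound grows only slowly with n, which is exactly the cost-and-lead-time problem that motivates reduced-specimen alternatives such as the conservative margin approach.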
Jang, Yikweon; Hahm, Eun Hye; Lee, Hyun-Jung; Park, Soyeon; Won, Yong-Jin; Choe, Jae C.
2011-01-01
Background In a species with a large distribution relative to its dispersal capacity, geographic variation in traits may be explained by gene flow, selection, or the combined effects of both. Studies of genetic diversity using neutral molecular markers show that patterns of isolation by distance (IBD) or barrier effects may be evident for geographic variation at the molecular level in amphibian species. However, selective factors such as habitat, predators, or interspecific interactions may be critical for geographic variation in sexual traits. We studied geographic variation in advertisement calls in the tree frog Hyla japonica to understand patterns of variation in these traits across Korea and provide clues about the underlying forces for variation. Methodology We recorded calls of H. japonica in three breeding seasons from 17 localities, including localities on remote Jeju Island. Call characters analyzed were note repetition rate (NRR), note duration (ND), and dominant frequency (DF), along with snout-to-vent length. Results The findings of a barrier effect on DF and a longitudinal variation in NRR suggest that the open sea between the mainland and Jeju Island and the mountain ranges dominated by the north-south Taebaek Mountains were related to geographic variation in call characters. Furthermore, there was a pattern of IBD in mitochondrial DNA sequences. However, no comparable pattern of IBD was found between geographic distance and call characters. We also failed to detect any effects of habitat or interspecific interaction on call characters. Conclusions Geographic variation in call characters as well as mitochondrial DNA sequences was largely stratified by geographic factors such as distance and barriers in Korean populations of H. japonica.
Although we did not detect effects of habitat or interspecific interaction, some other selective factors such as sexual selection might still be operating on call characters in conjunction with restricted gene flow. PMID:21858061
Consensus-based methodology for detecting communities in multilayered networks
NASA Astrophysics Data System (ADS)
Karimi-Majd, Amir-Mohsen; Fathian, Mohammad; Makrehchi, Masoud
2018-03-01
Finding groups of network users who are densely connected with each other has emerged as an interesting problem in the area of social network analysis. These groups, or so-called communities, are hidden behind the behavior of users. Most studies assume that such behavior can be understood by focusing on user interfaces, their behavioral attributes, or a combination of these network layers (i.e., interfaces with their attributes). They also assume that all network layers refer to the same behavior. However, in real-life networks, users' behavior in one layer may differ from their behavior in another. To cope with these issues, this article proposes a consensus-based community detection approach (CBC). CBC finds communities among nodes at each layer in parallel, and the per-layer results are then aggregated using a consensus clustering method. This means that different behaviors can be detected and used in the analysis. Another significant advantage is that the methodology can handle missing values. Three experiments on real-life and computer-generated datasets were conducted to evaluate the performance of CBC. The results indicate the superiority and stability of CBC in comparison to other approaches.
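The aggregation step can be sketched as consensus clustering over per-layer partitions: nodes that share a community in enough layers are grouped together. The per-layer partitions below are hypothetical, and a real CBC-style system would first run a community-detection algorithm on each layer.

```python
# Minimal sketch of consensus clustering over per-layer community
# assignments (the aggregation step of a CBC-style approach).
from itertools import combinations

def consensus_communities(partitions, threshold=0.5):
    """Group nodes that share a community in >= threshold of the layers."""
    nodes = sorted(partitions[0])
    n_layers = len(partitions)
    # Build an agreement graph: edge if two nodes co-occur often enough.
    neighbors = {v: set() for v in nodes}
    for u, v in combinations(nodes, 2):
        together = sum(p[u] == p[v] for p in partitions)
        if together / n_layers >= threshold:
            neighbors[u].add(v)
            neighbors[v].add(u)
    # Connected components of the agreement graph = consensus communities.
    seen, communities = set(), []
    for start in nodes:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(neighbors[v] - comp)
        seen |= comp
        communities.append(sorted(comp))
    return communities

# Three hypothetical layers that mostly agree on {a, b} and {c, d}.
layers = [{"a": 0, "b": 0, "c": 1, "d": 1},
          {"a": 0, "b": 0, "c": 1, "d": 1},
          {"a": 0, "b": 1, "c": 1, "d": 1}]
print(consensus_communities(layers))  # → [['a', 'b'], ['c', 'd']]
```

Because each pair is scored by the fraction of layers that agree, a node missing from one layer's analysis simply contributes nothing for that layer, which is one simple way such a scheme can tolerate missing values.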
Poly-Pattern Compressive Segmentation of ASTER Data for GIS
NASA Technical Reports Server (NTRS)
Myers, Wayne; Warner, Eric; Tutwiler, Richard
2007-01-01
Pattern-based segmentation of multi-band image data, such as ASTER, produces one-byte and two-byte approximate compressions. This is a dual segmentation consisting of nested coarser and finer level pattern mappings called poly-patterns. The coarser A-level version is structured for direct incorporation into geographic information systems in the manner of a raster map. GIS renderings of this A-level approximation are called pattern pictures, which have the appearance of color-enhanced images. The two-byte version, consisting of thousands of B-level segments, provides a capability for approximate restoration of the multi-band data in selected areas or entire scenes. Poly-patterns are especially useful for purposes of change detection and landscape analysis at multiple scales. The primary author has implemented the segmentation methodology in a public domain software suite.
A Bioinformatics Approach for Detecting Repetitive Nested Motifs using Pattern Matching.
Romero, José R; Carballido, Jessica A; Garbus, Ingrid; Echenique, Viviana C; Ponzoni, Ignacio
2016-01-01
The identification of nested motifs in genomic sequences is a complex computational problem. The detection of these patterns is important to allow the discovery of transposable element (TE) insertions, incomplete reverse transcripts, deletions, and/or mutations. In this study, a de novo strategy for detecting patterns that represent nested motifs was designed based on exhaustive searches for pairs of motifs and combinatorial pattern analysis. These patterns can be grouped into three categories: motifs within other motifs, motifs flanked by other motifs, and motifs of large size. The methodology used in this study, applied to genomic sequences from the plant species Aegilops tauschii and Oryza sativa, revealed that it is possible to identify putative nested TEs by detecting these three types of patterns. The results were validated through BLAST alignments, which revealed the efficacy and usefulness of the new method, which is called Mamushka.
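The first pattern category, a motif inserted inside another motif, can be sketched with plain string matching: an inner occurrence is "nested" if deleting it restores an occurrence of the outer motif spanning the insertion point. The motifs and sequence below are invented; Mamushka itself works at genome scale with exhaustive pair searches.

```python
# Sketch of the "motif within a motif" pattern: an inner motif (e.g.,
# a younger TE) inserted inside an older motif, so that deleting the
# inner occurrence restores the outer one. Sequences are invented.

def find_nested(sequence, outer, inner):
    """Return (outer_start, insertion_point) pairs where removing an
    occurrence of `inner` restores an occurrence of `outer` that
    straddles the insertion point."""
    hits = []
    start = 0
    while (i := sequence.find(inner, start)) != -1:
        spliced = sequence[:i] + sequence[i + len(inner):]
        # The outer motif must begin at most len(outer)-1 bases before
        # the insertion point (so it straddles it) and strictly before it.
        lo = max(0, i - len(outer) + 1)
        j = spliced.find(outer, lo)
        if j != -1 and j < i:
            hits.append((j, i))
        start = i + 1
    return hits

seq = "GGACGTCCCCACGTTT"  # "ACGTACGT" interrupted at position 6 by "CCCC"
print(find_nested(seq, "ACGTACGT", "CCCC"))  # → [(2, 6)]
```

A real implementation must also tolerate mismatches and degraded motif copies, which is where the combinatorial pattern analysis comes in.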
STREAMFINDER - I. A new algorithm for detecting stellar streams
NASA Astrophysics Data System (ADS)
Malhan, Khyati; Ibata, Rodrigo A.
2018-07-01
We have designed a powerful new algorithm to detect stellar streams in an automated and systematic way. The algorithm, which we call the STREAMFINDER, is well suited for finding dynamically cold and thin stream structures that may lie along any simple or complex orbits in Galactic stellar surveys containing any combination of positional and kinematic information. In the present contribution, we introduce the algorithm, lay out the ideas behind it, explain the methodology adopted to detect streams, and detail its workings by running it on a suite of simulations of mock Galactic survey data of similar quality to that expected from the European Space Agency/Gaia mission. We show that our algorithm is able to detect even ultra-faint stream features lying well below previous detection limits. Tests show that our algorithm will be able to detect distant halo stream structures >10° long containing as few as ˜15 members (ΣG ˜ 33.6 mag arcsec-2) in the Gaia data set.
NASA Technical Reports Server (NTRS)
Temple, Enoch C.
1994-01-01
The space industry has developed many composite materials that have high durability in proportion to their weights. Many of these materials have a likelihood for flaws that is higher than in traditional metals. There are also coverings (such as paint) that develop flaws that may adversely affect the performance of the system in which they are used. Therefore there is a need to monitor the soundness of composite structures. To meet this monitoring need, many nondestructive evaluation (NDE) systems have been developed. An NDE system is designed to detect material flaws and make flaw measurements without destroying the inspected item. Also, the detection operation is expected to be performed in a rapid manner in a field or production environment. Some of the most recent video-based NDE methodologies are shearography, holography, thermography, and video image correlation.
Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares
2011-09-15
Life cycle thinking (LCT) is one of the philosophies that have recently appeared in the context of sustainable development. Some already existing tools and methods, as well as some recently emerged ones, which seek to understand, interpret and design the life of a product, can be included within the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the definition of industrial metabolism. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy: BAT (best available techniques) analysis. Applied to an industrial process, this methodology seeks to identify so-called improvable flows by MEFA, so that appropriate candidate BATs can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows were identified and seven candidate BATs were proposed to reduce these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and to select the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, a lot of papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Technical and operational users' opinions of a handheld device to detect directed energy.
Boyd, Andrew D; Naiman, Melissa; Stevenson, Greer W; Preston, Richard; Valenta, Annette L
2013-05-01
Lasers, a form of directed energy (DE), are a threat to pilots and Air Force personnel. In light of this threat, a handheld medical device called the "Tricorder" is under development to improve situational awareness of DE. Current operational procedures do not include methods for recording or handling new information regarding DE. The purpose of this study was to understand Air Force personnel's opinions and beliefs about desired features and operational use, in order to enhance user acceptance of the Tricorder. Q-methodology was used to study opinions and beliefs related to DE. Two groups were approached: medical personnel in the Illinois Air National Guard and four active duty members of an Air Force Rescue Squadron. Both groups completed the same Q-sort of both operational and equipment concerns. Six opinion sets regarding operational concerns described 61% of the total variation in perceptions among participants. The factors were concern over health effects, implications to individuals, combat/tactical concerns, force health protection, and theater/tactical concerns. Five opinion sets described 68% of the variation in the equipment functions perceived as most important. Participants indicated that ideally the device should measure exposure, enhance laser detection/response, support night vision and ease of use, detect threats, and enhance combat medicine. This survey revealed the complexity of the equipment and the operational implications of detecting DE. Q-methodology is a unique strategy for both evaluating technology and exploring users' concerns.
ERIC Educational Resources Information Center
Raz, Aviad E.
2007-01-01
Purpose: The purpose of this paper is to describe and analyse the formation of CoPs (communities of practice) in three call centres of cellular communication operating companies in Israel. Design/methodology/approach: This study is based on a qualitative methodology including observations, interviews and textual analysis. Findings: In all three…
ERIC Educational Resources Information Center
Madill, Michael T. R.
2014-01-01
Didactical approaches related to teaching English as a Foreign Language (EFL) have developed into a complex array of instructional methodologies, each having potential benefits attributed to elementary reading development. One such effective practice is Computer Assisted Language Learning (CALL), which uses various forms of technology such as…
Intelligent monitoring and control of semiconductor manufacturing equipment
NASA Technical Reports Server (NTRS)
Murdock, Janet L.; Hayes-Roth, Barbara
1991-01-01
The use of AI methods to monitor and control semiconductor fabrication in a state-of-the-art manufacturing environment called the Rapid Thermal Multiprocessor is described. Semiconductor fabrication involves many complex processing steps with limited opportunities to measure process and product properties. By applying additional process and product knowledge to that limited data, AI methods augment classical control methods by detecting abnormalities and trends, predicting failures, diagnosing, planning corrective action sequences, explaining diagnoses or predictions, and reacting to anomalous conditions that classical control systems typically would not correct. Research methodology and issues are discussed, and two diagnosis scenarios are examined.
Transient Region Coverage in the Propulsion IVHM Technology Experiment
NASA Technical Reports Server (NTRS)
Balaban, Edward; Sweet, Adam; Bajwa, Anupa; Maul, William; Fulton, Chris; Chicatelli, Amy
2004-01-01
Over the last several years, researchers at NASA Glenn and Ames Research Centers have developed a real-time fault detection and isolation system for propulsion subsystems of future space vehicles. The Propulsion IVHM Technology Experiment (PITEX), as it is called, follows the model-based diagnostic methodology and employs Livingstone, developed at NASA Ames, as its reasoning engine. The system has been tested on flight-like hardware through a series of nominal and fault scenarios. These scenarios have been developed using a highly detailed simulation of the X-34 flight demonstrator main propulsion system and include realistic failures involving valves, regulators, microswitches, and sensors. This paper focuses on one of the recent research and development efforts under PITEX: providing more complete transient region coverage. It describes the development of the transient monitors, the corresponding modeling methodology, and the interface software responsible for coordinating the flow of information between the quantitative monitors and the qualitative, discrete representation of Livingstone.
Systemic Operational Design: Improving Operational Planning for the Netherlands Armed Forces
2006-05-25
This methodology is called Soft Systems Methodology. His methodology is a structured way of thinking in which not only a perceived problematic... Many similarities exist between Systemic Operational Design and Soft Systems Methodology; their epistemology is related. Furthermore, they both have... Systems Thinking: Managing Chaos and Complexity. Boston: Butterworth Heinemann, 1999. Checkland, Peter, and Jim Scholes. Soft Systems Methodology in...
Metabolomics analysis: Finding out metabolic building blocks
2017-01-01
In this paper we propose a new methodology for the analysis of metabolic networks. We use the notion of strongly connected components of a graph, called in this context metabolic building blocks. Every strongly connected component is contracted to a single node in such a way that the resulting graph is a directed acyclic graph, called a metabolic DAG, with a considerably reduced number of nodes. The property of being a directed acyclic graph brings out a background graph topology that reveals the connectivity of the metabolic network, as well as bridges, isolated nodes and cut nodes. Altogether, this becomes key information for the discovery of functional metabolic relations. Our methodology has been applied to the glycolysis and purine metabolic pathways for all organisms in the KEGG database, although it is general enough to work on any database. As expected, using the metabolic DAG formalism, a considerable reduction in the size of the metabolic networks has been obtained, especially in the case of the purine pathway due to its relatively large size. As a proof of concept, from the information captured by a metabolic DAG and its corresponding metabolic building blocks, we obtain the core of the glycolysis pathway and the core of the purine metabolism pathway and detect some essential metabolic building blocks that reveal the key reactions in both pathways. Finally, the application of our methodology to the glycolysis and purine metabolism pathways reproduces the tree of life for the whole set of organisms represented in the KEGG database, which supports the utility of this research. PMID:28493998
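The core construction, contracting each strongly connected component into a single node of a DAG, can be sketched in pure Python. The toy reaction graph is invented; the paper applies the construction to KEGG pathway graphs.

```python
# Sketch of the paper's core construction: contract each strongly
# connected component (a "metabolic building block") to one node,
# yielding a directed acyclic "metabolic DAG".

def scc(graph):
    """Strongly connected components of {node: [successors]} via
    iterative Kosaraju (first pass: finish order; second: reverse DFS)."""
    order, seen = [], set()
    for root in graph:
        if root in seen:
            continue
        seen.add(root)
        stack = [(root, iter(graph[root]))]
        while stack:
            node, succs = stack[-1]
            for nxt in succs:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append((nxt, iter(graph[nxt])))
                    break
            else:
                order.append(node)
                stack.pop()
    rev = {v: [] for v in graph}
    for u, outs in graph.items():
        for v in outs:
            rev[v].append(u)
    comp, components = {}, []
    for root in reversed(order):
        if root in comp:
            continue
        members, todo = [], [root]
        comp[root] = len(components)
        while todo:
            v = todo.pop()
            members.append(v)
            for u in rev[v]:
                if u not in comp:
                    comp[u] = len(components)
                    todo.append(u)
        components.append(sorted(members))
    return comp, components

def condensation(graph):
    """Contract every SCC ("building block") to one node of a DAG."""
    comp, components = scc(graph)
    dag = {i: set() for i in range(len(components))}
    for u, outs in graph.items():
        for v in outs:
            if comp[u] != comp[v]:
                dag[comp[u]].add(comp[v])
    return components, dag

# Toy reaction graph: a 3-cycle and a 2-cycle feeding a sink.
reactions = {"A": ["B"], "B": ["C"], "C": ["A", "D"],
             "D": ["E"], "E": ["D", "F"], "F": []}
blocks, dag = condensation(reactions)
print(blocks, dag)  # → [['A', 'B', 'C'], ['D', 'E'], ['F']] {0: {1}, 1: {2}, 2: set()}
```

The condensation is acyclic by construction, which is what exposes the bridges, cut nodes and overall connectivity the paper analyzes.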
U.S. EPA'S ACUTE REFERENCE EXPOSURE METHODOLOGY FOR ACUTE INHALATION EXPOSURES
The US EPA National Center for Environmental Assessment has developed a methodology to derive acute inhalation toxicity benchmarks, called acute reference exposures (AREs), for noncancer effects. The methodology provides guidance for the derivation of chemical-specific benchmark...
NASA Astrophysics Data System (ADS)
Kirkham, R.; Olsen, K.; Hayes, J. C.; Emer, D. F.
2013-12-01
Underground nuclear tests may be first detected by seismic or air samplers operated by the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization). After initial detection of a suspicious event, member nations may call for an On-Site Inspection (OSI) that, in part, will sample for localized releases of radioactive noble gases and particles. Although much of the commercially available equipment and methods used for surface and subsurface environmental sampling of gases can be used in an OSI scenario, on-site sampling conditions, required sampling volumes, and establishment of background concentrations of noble gases require the development of specialized methodologies. To facilitate development of sampling equipment and methodologies that address OSI sampling volume and detection objectives, and to collect information required for model development, a field test site was created at a former underground nuclear explosion site located in welded volcanic tuff. A mixture of SF6, Xe-127 and Ar-37 was metered into 4400 m3 of air as it was injected into the top region of the UNE cavity. These tracers were expected to move towards the surface primarily in response to barometric pumping or through delayed cavity pressurization (accelerated transport to minimize source decay time). Sampling approaches compared during the field exercise included sampling at the soil surface, inside surface fractures, and at soil vapor extraction points at depths down to 2 m. The effectiveness of the various sampling approaches and the results of the tracer gas measurements will be presented.
A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.
Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut
2017-08-01
Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Still today their quantification and taxonomic classification bear several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods, the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single cell survey called aliquot PCR (aPCR). All these methods have been tested either using aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in every single case. With the live-counting technique a detection of living cells up to morphospecies level is possible. Fixation of cells and staining methods are advantageous due to the possible long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools might complete the deficiency of LAM in terms of the missing detection of non-cultivable flagellates. In summary, we propose a combination of several investigation techniques reducing the gap between the different methodological problems. Copyright © 2017 Elsevier GmbH. All rights reserved.
Riera, Amalis; Ford, John K; Ross Chapman, N
2013-09-01
Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplement to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time analysis and long-term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two duty cycles, 1/3 and 2/3, were tested. Both the use of long-term spectral averages and the lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimates of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower-resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency.
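The duty-cycle effect is easy to see in a toy simulation: calls at random times are detected only if they fall within the recorder's "on" window. The cycle length and call times below are hypothetical.

```python
# Toy simulation of the duty-cycle trade-off: calls occurring at
# random times are recorded only during the "on" portion of each
# recording cycle. All times and counts are synthetic.
import random

def detected_fraction(call_times, cycle_s, on_fraction):
    """Fraction of calls landing inside the recorder's on-window."""
    on_s = cycle_s * on_fraction
    hits = sum(1 for t in call_times if (t % cycle_s) < on_s)
    return hits / len(call_times)

rng = random.Random(0)
calls = [rng.uniform(0, 3600) for _ in range(10_000)]   # one hour of calls
f13 = detected_fraction(calls, cycle_s=60, on_fraction=1 / 3)
f23 = detected_fraction(calls, cycle_s=60, on_fraction=2 / 3)
print(f13 < f23)  # the 1/3 duty cycle records a smaller share of calls
```

For calls spread uniformly in time, the detected fraction simply tracks the duty cycle, which is why a reduced duty cycle directly underestimates presence.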
Moving object detection using dynamic motion modelling from UAV aerial images.
Saif, A F M Saifuddin; Prabuwono, Anton Satria; Mahayuddin, Zainal Rasyid
2014-01-01
Motion-analysis-based moving object detection from UAV aerial images is still an unsolved issue because proper motion estimation has not been considered. Existing moving object detection approaches for UAV aerial images do not use motion-based pixel intensity measurement to detect moving objects robustly. Moreover, current research on moving object detection from UAV aerial images mostly depends on either frame difference or segmentation approaches separately. This research has two main purposes: first, to develop a new motion model called DMM (dynamic motion model), and second, to apply the proposed segmentation approach SUED (segmentation using edge-based dilation) using frame difference embedded together with the DMM model. The proposed DMM model provides effective search windows based on the highest pixel intensity, so that SUED segments only the specific area containing the moving object rather than searching the whole frame. At each stage of the proposed scheme, the experimental fusion of DMM and SUED extracts moving objects faithfully. Experimental results reveal that the proposed DMM and SUED successfully demonstrate the validity of the proposed methodology.
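The frame-difference step combined with an intensity-guided search window can be sketched on tiny synthetic frames. This loosely illustrates the idea of restricting segmentation to the region of strongest change; it is not the paper's DMM/SUED implementation, and the thresholds and frame values are invented.

```python
# Illustrative frame differencing plus a search window around the
# changed pixels, mirroring the idea of segmenting only the region
# of motion instead of the whole frame. Data are synthetic.

def frame_difference(prev, curr, thresh=30):
    """Per-pixel absolute difference, binarized at `thresh`."""
    return [[1 if abs(c - p) > thresh else 0 for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

def search_window(mask):
    """Bounding box (rmin, cmin, rmax, cmax) of the moving pixels."""
    cells = [(r, c) for r, row in enumerate(mask)
             for c, v in enumerate(row) if v]
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    return min(rows), min(cols), max(rows), max(cols)

prev = [[10] * 6 for _ in range(4)]          # static 4x6 grayscale frame
curr = [row[:] for row in prev]
curr[1][2] = curr[1][3] = curr[2][3] = 200   # a small bright object appears
mask = frame_difference(prev, curr)
print(search_window(mask))  # → (1, 2, 2, 3)
```

Downstream segmentation then only needs to run inside the window, which is the efficiency argument behind intensity-guided search windows.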
78 FR 4369 - Rates for Interstate Inmate Calling Services
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-22
.... Marginal Location Methodology. In 2008, ICS providers submitted the ICS Provider Proposal for ICS rates. The ICS Provider Proposal uses the ``marginal location'' methodology, previously adopted by the... ``marginal location'' methodology provides a ``basis for rates that represent `fair compensation' as set...
Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls
NASA Astrophysics Data System (ADS)
Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.
2015-10-01
Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.
Risk assessment methodology applied to counter IED research & development portfolio prioritization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shevitz, Daniel W; O' Brien, David A; Zerkle, David K
2009-01-01
In an effort to protect the United States from the ever increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T) has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential-reasoning model using an Excel spreadsheet containing weights for the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models combined with LANL software called INFTree for evidential reasoning.
Ramifications of increased training in quantitative methodology.
Zimiles, Herbert
2009-01-01
Comments on the article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America" by Aiken, West, and Millsap. The current author asks three questions that are provoked by the comprehensive identification of gaps and deficiencies in the training of quantitative methodology that led Aiken, West, and Millsap to call for expanded graduate instruction resources and programs. This comment calls for greater attention to how advances and expansion in the training of quantitative analysis are influencing who chooses to study psychology and how and what will be studied.
Günthard, H F; Wong, J K; Ignacio, C C; Havlir, D V; Richman, D D
1998-07-01
The performance of the high-density oligonucleotide array methodology (GeneChip) in detecting drug resistance mutations in HIV-1 pol was compared with that of automated dideoxynucleotide sequencing (ABI) of clinical samples, viral stocks, and plasmid-derived NL4-3 clones. Sequences from 29 clinical samples (plasma RNA, n = 17; lymph node RNA, n = 5; lymph node DNA, n = 7) from 12 patients, from 6 viral stock RNA samples, and from 13 NL4-3 clones were generated by both methods. Editing was done independently by a different investigator for each method before comparing the sequences. In addition, NL4-3 wild type (WT) and mutants were mixed in varying concentrations and sequenced by both methods. Overall, a concordance of 99.1% was found for a total of 30,865 bases compared. The comparison of clinical samples (plasma RNA and lymph node RNA and DNA) showed a slightly lower match of base calls, 98.8% for 19,831 nucleotides compared (protease region, 99.5%, n = 8272; RT region, 98.3%, n = 11,316), than for viral stocks and NL4-3 clones (protease region, 99.8%; RT region, 99.5%). Artificial mixing experiments showed a bias toward calling wild-type bases by GeneChip. Discordant base calls are most likely due to differential detection of mixtures. The concordance between GeneChip and ABI was high and appeared dependent on the nature of the templates (directly amplified versus cloned) and the complexity of mixes.
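The concordance figures above reduce to a per-position comparison of aligned base-call strings. A minimal sketch of that arithmetic (the sequences are toy examples, not data from the study):

```python
# Per-base concordance between two aligned base-call strings, as used to
# compare GeneChip and ABI calls over the same template.

def concordance(seq_a, seq_b):
    """Fraction of aligned positions where the two base calls agree."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return matches / len(seq_a)

chip = "ACGTTGCA"   # hypothetical GeneChip calls
abi  = "ACGTAGCA"   # hypothetical ABI calls; one discordant position
pct = concordance(chip, abi)
```

Applied over the 30,865 bases compared in the study, this is the statistic reported as 99.1% overall.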
Prognostics Methodology for Complex Systems
NASA Technical Reports Server (NTRS)
Gulati, Sandeep; Mackey, Ryan
2003-01-01
An automatic method to schedule maintenance and repair of complex systems is produced based on a computational structure called the Informed Maintenance Grid (IMG). This method provides solutions to the two fundamental problems in autonomic logistics: (1) unambiguous detection of deterioration or impending loss of function and (2) determination of the time remaining to perform maintenance or other corrective action based upon information from the system. The IMG provides a health determination over the medium-to-long-term operation of the system, from one or more days to years of operation. The IMG is especially applicable to spacecraft, to both piloted and autonomous aircraft, and to industrial control processes.
A Fault Tolerant System for an Integrated Avionics Sensor Configuration
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Lancraft, R. E.
1984-01-01
An aircraft sensor fault tolerant system methodology for the Transport Systems Research Vehicle in a Microwave Landing System (MLS) environment is described. The fault tolerant system provides reliable estimates in the presence of possible failures both in ground-based navigation aids, and in on-board flight control and inertial sensors. Sensor failures are identified by utilizing the analytic relationships between the various sensors arising from the aircraft point mass equations of motion. The estimation and failure detection performance of the software implementation (called FINDS) of the developed system was analyzed on a nonlinear digital simulation of the research aircraft. Simulation results showing the detection performance of FINDS, using a dual redundant sensor complement, are presented for bias, hardover, null, ramp, increased-noise and scale-factor failures. In general, the results show that FINDS can distinguish between normal operating sensor errors and failures while providing an excellent detection speed for bias failures in the MLS, indicated airspeed, attitude and radar altimeter sensors.
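The analytic-redundancy idea behind this kind of detector can be sketched in a few lines: estimate the same quantity two ways from different sensors and flag a failure when the residual exceeds a threshold. The sensor names, data, and threshold below are illustrative assumptions, not the FINDS algorithm itself (which uses the full point-mass equations of motion and statistical tests):

```python
# Toy analytic-redundancy residual test: compare two independent
# estimates of the same quantity and flag samples whose disagreement
# exceeds a threshold, as happens when one sensor develops a bias.

def detect_bias_failure(sensor_a, sensor_b, threshold):
    """Return indices where the residual |a - b| exceeds the threshold."""
    return [i for i, (a, b) in enumerate(zip(sensor_a, sensor_b))
            if abs(a - b) > threshold]

airspeed_inertial = [120.0, 121.0, 120.5, 120.8]  # derived from inertial data
airspeed_indicated = [120.2, 120.7, 135.5, 135.9]  # biased from sample 2 on
failures = detect_bias_failure(airspeed_inertial, airspeed_indicated,
                               threshold=5.0)
```

The threshold separates "normal operating sensor errors" (small residuals) from failures (persistent large residuals), which is the distinction the abstract credits FINDS with making.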
Detecting Damage in Composite Material Using Nonlinear Elastic Wave Spectroscopy Methods
NASA Astrophysics Data System (ADS)
Meo, Michele; Polimeno, Umberto; Zumpano, Giuseppe
2008-05-01
Modern aerospace structures make increasing use of fibre reinforced plastic composites, due to their high specific mechanical properties. However, due to their brittleness, low velocity impact can cause delaminations beneath the surface, while the surface may appear to be undamaged upon visual inspection. Such damage is called barely visible impact damage (BVID). This internal damage leads to significant reduction in local strength and ultimately could lead to catastrophic failures. It is therefore important to detect and monitor damage in highly loaded composite components to receive an early warning for well-timed maintenance of the aircraft. Non-linear ultrasonic spectroscopy methods are promising damage detection and material characterization tools. In this paper, two different non-linear elastic wave spectroscopy (NEWS) methods are presented: single mode nonlinear resonance ultrasound (NRUS) and the nonlinear wave modulation technique (NWMS). The NEWS methods were applied to detect delamination damage due to low velocity impact (<12 J) on various composite plates. The results showed that the proposed methodology appears to be highly sensitive to the presence of damage, with very promising future NDT and structural health monitoring applications.
Identification of pathogen genomic variants through an integrated pipeline
2014-01-01
Background Whole-genome sequencing represents a powerful experimental tool for pathogen research. We present methods for the analysis of small eukaryotic genomes, including a streamlined system (called Platypus) for finding single nucleotide and copy number variants as well as recombination events. Results We have validated our pipeline using four sets of Plasmodium falciparum drug-resistance data containing 26 clones from 3D7 and Dd2 background strains, identifying an average of 11 single nucleotide variants per clone. We also identify 8 copy number variants that contribute to resistance, and report for the first time that all analyzed amplification events are in tandem. Conclusions The Platypus pipeline provides malaria researchers with a powerful tool to analyze short-read sequencing data. It provides an accurate way to detect SNVs using known software packages, and a novel methodology for detection of CNVs, though it does not currently support detection of small indels. We have validated that the pipeline detects known SNVs in a variety of samples while filtering out spurious data. We bundle the methods into a freely available package. PMID:24589256
Infectious Etiologies of Childhood Leukemia: Plausibility and Challenges to Proof
O’Connor, Siobhán M.; Boneva, Roumiana S.
2007-01-01
Infections as well as environmental exposures are proposed determinants of childhood acute lymphoblastic leukemia (ALL), particularly common precursor B-cell ALL (cALL). Lines of investigation test hypotheses that cALL is a rarer result of common infection, that it results from uncommon infection, or that it ensues from abnormal immune development; perhaps it requires a preceding prenatal or early childhood insult. Ideally, studies should document that particular infections precede leukemia and induce malignant transformation. However, limited detection studies have not directly linked specific human or nonhuman infectious agents with ALL or cALL. Primarily based on surrogate markers of infectious exposure, indirect evidence from ecologic and epidemiologic studies varies widely, but some suggest that infancy or early childhood infectious exposures might protect against childhood ALL or cALL. Several others suggest that maternal infection during pregnancy might increase risk or that certain breast-feeding practices decrease risk. To date, evidence cannot confirm or refute whether at least one infection induces or is a major co-factor for developing ALL or cALL, or perhaps actually protects against disease. Differences in methodology and populations studied may explain some inconsistencies. Other challenges to proof include the likely time lag between infection and diagnosis, the ubiquity of many infections, the influence of age at infection, and the limitations in laboratory assays; small numbers of cases, inaccurate background leukemia rates, and difficulty tracking mobile populations further affect cluster investigations. Nevertheless, existing evidence partially supports plausibility and warrants further investigation into potential infectious determinants of ALL and cALL, particularly in the context of multifactorial or complex systems. PMID:17366835
Point counts from clustered populations: Lessons from an experiment with Hawaiian crows
Hayward, G.D.; Kepler, C.B.; Scott, J.M.
1991-01-01
We designed an experiment to identify factors contributing most to error in counts of Hawaiian Crow or Alala (Corvus hawaiiensis) groups that are detected aurally. Seven observers failed to detect calling Alala on 197 of 361 3-min point counts on four transects extending from cages with captive Alala. A detection curve describing the relation between frequency of flock detection and distance typified the distribution expected in transect or point counts. Failure to detect calling Alala was affected most by distance, observer, and Alala calling frequency. The number of individual Alala calling was not important in detection rate. Estimates of the number of Alala calling (flock size) were biased and imprecise: average difference between number of Alala calling and number heard was 3.24 (± 0.277). Distance, observer, number of Alala calling, and Alala calling frequency all contributed to errors in estimates of group size (P < 0.0001). Multiple regression suggested that number of Alala calling contributed most to errors. These results suggest that well-designed point counts may be used to estimate the number of Alala flocks but cast doubt on attempts to estimate flock size when individuals are counted aurally.
A Call for a New National Norming Methodology.
ERIC Educational Resources Information Center
Ligon, Glynn; Mangino, Evangelina
Issues related to achieving adequate national norms are reviewed, and a new methodology is proposed that would work to provide a true measure of national achievement levels on an annual basis and would enable reporting results in current-year norms. Statistical methodology and technology could combine to create a national norming process that…
EVALUATING THE SUSTAINABILITY OF GREEN CHEMISTRIES
The U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of reaction chemistries. This methodology, called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Proc...
Toward Theory-Based Research in Political Communication.
ERIC Educational Resources Information Center
Simon, Adam F.; Iyengar, Shanto
1996-01-01
Praises the theoretical and methodological potential of the field of political communication. Calls for greater interaction and cross fertilization among the fields of political science, sociology, economics, and psychology. Briefly discusses relevant research methodologies. (MJP)
Ly-Sunnaram, Beatrice; Henry, Catherine; Gandemer, Virginie; Mee, Franseza Le; Burtin, Florence; Blayau, Martine; Cayuela, Jean-Michel; Oster, Magalie; Clech, Philippe; Rambeau, Marc; Marie, Celine; Pampin, Cecilia; Edan, Christine; Gall, Edouard Le; Goasguen, Jean E
2005-09-01
We describe here a late extramedullary ovarian relapse in an 18-year-old female who was diagnosed with hypotetraploid-cell acute lymphoblastic leukaemia (cALL) at the age of 6. At both occurrences of the disease, cells were analyzed by morphology, immunophenotyping, cytogenetics and molecular methods. TEL/AML1 was detected by RT-PCR and FISH analysis in both events. We demonstrated, using detection of IGH/TCR rearrangements and TEL/AML1 breakpoint sequencing, that the cells were clonally related. Moreover, interphase FISH using TEL and AML1 probes showed the loss of the second TEL allele at the time of relapse. This observation confirms that TEL/AML1 alone is not sufficient to trigger ALL and that TEL deletion is a secondary event in leukemogenesis. To our knowledge, this is the first complete description of an extramedullary ALL relapse combining all of these methodologies.
Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno
2006-03-31
In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), carried out through the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents, leading to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called a "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.
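A "risk matrix" of the kind mentioned can be sketched as a lookup that crosses a frequency class with a consequence class. The class labels and band boundaries below are purely illustrative assumptions, not the ARAMIS calibration:

```python
# Toy risk matrix: cross a frequency class with a consequence class and
# return a coarse band; "reference-scenario candidate" marks the region
# from which reference accident scenarios would be chosen.

FREQ_CLASSES = ["F1", "F2", "F3", "F4"]   # rare ... frequent (assumed)
CONS_CLASSES = ["C1", "C2", "C3", "C4"]   # minor ... catastrophic (assumed)

def risk_band(freq_class, cons_class):
    """Map frequency/consequence class indices to a risk band."""
    score = FREQ_CLASSES.index(freq_class) + CONS_CLASSES.index(cons_class)
    if score <= 2:
        return "negligible"
    if score <= 4:
        return "reference-scenario candidate"
    return "unacceptable"

band = risk_band("F2", "C3")
```

A real matrix would be calibrated per cell rather than by a simple index sum, but the selection principle, crossing frequency against consequence, is the same.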
Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean.
Stafford, K M; Fox, C G; Clark, D S
1998-12-01
Analysis of acoustic signals recorded from the U.S. Navy's SOund SUrveillance System (SOSUS) was used to detect and locate blue whale (Balaenoptera musculus) calls offshore in the northeast Pacific. The long, low-frequency components of these calls are characteristic of calls recorded in the presence of blue whales elsewhere in the world. Mean values for frequency and time characteristics from field-recorded blue whale calls were used to develop a simple matched filter for detecting such calls in noisy time series. The matched filter was applied to signals from three different SOSUS arrays off the coast of the Pacific Northwest to detect and associate individual calls from the same animal on the different arrays. A U.S. Navy maritime patrol aircraft was directed to an area where blue whale calls had been detected on SOSUS using these methods, and the presence of vocalizing blue whales was confirmed at the site with field recordings from sonobuoys.
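The matched-filter idea reduces to sliding a call template along a noisy series and scoring the correlation at each lag. A toy stand-in for the detector described above (the template and signal values are invented; a real detector would work on spectrogram or band-passed data and use a normalized statistic with a detection threshold):

```python
# Minimal matched filter: cross-correlate a template against a signal
# and report the lag with the highest correlation score.

def matched_filter(signal, template):
    """Return (best_lag, best_score) over all full-overlap lags."""
    n, m = len(signal), len(template)
    best_lag, best_score = 0, float("-inf")
    for lag in range(n - m + 1):
        score = sum(signal[lag + i] * template[i] for i in range(m))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag, best_score

template = [0.0, 1.0, 2.0, 1.0, 0.0]            # idealized call shape
signal = [0.1] * 10 + template + [0.1] * 10     # "call" buried at lag 10
lag, _ = matched_filter(signal, template)
```

Because the template is built from mean call characteristics, the same filter can be run on several arrays and the per-array detection times used to associate calls from one animal, as the abstract describes.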
NASA Astrophysics Data System (ADS)
Holt, Marla M.; Insley, Stephen J.; Southall, Brandon L.; Schusterman, Ronald J.
2005-09-01
While attempting to gain access to receptive females, male northern elephant seals form dominance hierarchies through multiple dyadic interactions involving visual and acoustic signals. These signals are both highly stereotyped and directional. Previous behavioral observations suggested that males attend to the directional cues of these signals. We used in situ vocal playbacks to test whether males attend to directional cues of the acoustic components of a competitor's calls (i.e., variation in call spectra and source levels). Here, we will focus on playback methodology. Playback calls were multiple exemplars of a marked dominant male from an isolated area, recorded with a directional microphone and DAT recorder and edited into a natural sequence that controlled call amplitude. Control calls were recordings of ambient rookery sounds with the male calls removed. Subjects were 20 marked males (10 adults and 10 subadults) all located at Año Nuevo, CA. Playback presentations, calibrated for sound-pressure level, were broadcast at a distance of 7 m from each subject. Most responses were classified into the following categories: visual orientation, postural change, calling, movement toward or away from the loudspeaker, and re-directed aggression. We also investigated developmental, hierarchical, and ambient noise variables that were thought to influence male behavior.
Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience
ERIC Educational Resources Information Center
Zanotti, Francesco
2012-01-01
Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…
Improving Self Service the Six Sigma Way at Newcastle University Library
ERIC Educational Resources Information Center
Kumi, Susan; Morrow, John
2006-01-01
Purpose: To report on the collaborative project between Newcastle University Library and 3M which aimed to increase self-issue levels using six sigma methodology. Design/methodology/approach: The six-month long project is outlined and gives an insight into the process improvement methodology called six sigma. An explanation of why we ran the…
ERIC Educational Resources Information Center
Ziegler, Nicole; Meurers, Detmar; Rebuschat, Patrick; Ruiz, Simón; Moreno-Vega, José L.; Chinkina, Maria; Li, Wenjing; Grey, Sarah
2017-01-01
Despite the promise of research conducted at the intersection of computer-assisted language learning (CALL), natural language processing, and second language acquisition, few studies have explored the potential benefits of using intelligent CALL systems to deepen our understanding of the process and products of second language (L2) learning. The…
EVALUATING METRICS FOR GREEN CHEMISTRIES: INFORMATION AND CALCULATION NEEDS
Research within the U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of green chemistries. This methodology called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Ob...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-12
... Following an invitation from the UNFCCC to "undertake further methodological work on wetlands, focusing on..." ... developing additional national-level inventory methodological guidance on wetlands, including default...
USDA-ARS?s Scientific Manuscript database
The tomato genome sequence was undertaken at a time when state-of-the-art sequencing methodologies were undergoing a transition to so-called next-generation methodologies. The result was an international consortium undertaking a strategy merging both old and new approaches. Because biologists were...
Researcher / Researched: Repositioning Research Paradigms
ERIC Educational Resources Information Center
Meerwald, Agnes May Lin
2013-01-01
"Researcher / Researched" calls for a complementary research methodology by proposing autoethnography as both a method and text that crosses the boundaries of conventional and alternative methodologies in higher education. Autoethnography rearticulates the researcher / researched positions by blurring the boundary between them. This…
Identifying Key Words in 9-1-1 Calls for Stroke: A Mixed Methods Approach.
Richards, Christopher T; Wang, Baiyang; Markul, Eddie; Albarran, Frank; Rottman, Doreen; Aggarwal, Neelum T; Lindeman, Patricia; Stein-Spencer, Leslee; Weber, Joseph M; Pearlman, Kenneth S; Tataris, Katie L; Holl, Jane L; Klabjan, Diego; Prabhakaran, Shyam
2017-01-01
Identifying stroke during a 9-1-1 call is critical to timely prehospital care. However, emergency medical dispatchers (EMDs) recognize stroke in less than half of 9-1-1 calls, potentially due to the words used by callers to communicate stroke signs and symptoms. We hypothesized that callers do not typically use words and phrases considered to be classical descriptors of stroke, such as focal neurologic deficits, but that a mixed-methods approach can identify words and phrases commonly used by 9-1-1 callers to describe acute stroke victims. We performed a mixed-methods, retrospective study of 9-1-1 call audio recordings for adult patients with confirmed stroke who were transported by ambulance in a large city. Content analysis, a qualitative methodology, and computational linguistics, a quantitative methodology, were used to identify key words and phrases used by 9-1-1 callers to describe acute stroke victims. Because a caller's level of emotional distress contributes to the communication during a 9-1-1 call, the Emotional Content and Cooperation Score was assigned by a multidisciplinary team. A total of 110 9-1-1 calls, received between June and September 2013, were analyzed. EMDs recognized stroke in 48% of calls, and the emotional state of most callers (95%) was calm. In 77% of calls in which EMDs recognized stroke, callers specifically used the word "stroke"; however, the word "stroke" was used in only 38% of calls. Vague, non-specific words and phrases were used to describe stroke victims' symptoms in 55% of calls, and 45% of callers used distractor words and phrases suggestive of non-stroke emergencies. Focal neurologic symptoms were described in 39% of calls. Computational linguistics identified 9 key words that were more commonly used in calls where the EMD identified stroke. These words were concordant with terms identified through qualitative content analysis.
Most 9-1-1 callers used vague, non-specific, or distractor words and phrases and infrequently provided classic stroke descriptions during 9-1-1 calls for stroke. Both qualitative and quantitative methodologies identified similar key words and phrases associated with accurate EMD stroke recognition. This study suggests that tools incorporating commonly used words and phrases could potentially improve EMD stroke recognition.
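The computational-linguistics step, comparing how often words occur in calls where the dispatcher recognized stroke versus calls where they did not, can be sketched as a frequency contrast. The word lists, ranking statistic, and `min_count` cutoff below are illustrative assumptions, not the study's actual model:

```python
# Rank words by how much more frequent they are in recognized-stroke
# calls than in missed-stroke calls (a crude frequency-difference score).

from collections import Counter

def distinguishing_words(recognized, missed, min_count=2):
    """Return words (appearing >= min_count times in recognized calls)
    sorted by relative-frequency difference, most distinguishing first."""
    rec, mis = Counter(), Counter()
    for call in recognized:
        rec.update(call.lower().split())
    for call in missed:
        mis.update(call.lower().split())
    n_rec = sum(rec.values()) or 1
    n_mis = sum(mis.values()) or 1
    candidates = {w for w, c in rec.items() if c >= min_count}
    return sorted(candidates,
                  key=lambda w: rec[w] / n_rec - mis[w] / n_mis,
                  reverse=True)

recognized = ["she had a stroke", "stroke his face drooped", "a stroke"]
missed = ["he fell down", "he is confused and fell"]
top = distinguishing_words(recognized, missed)
```

Real analyses would use a statistic such as log-odds with smoothing and control for call length, but the contrast between the two call groups is the core of the method.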
ERIC Educational Resources Information Center
Ross, Linda
2003-01-01
Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…
SIMPLIFYING EVALUATIONS OF GREEN CHEMISTRIES: HOW MUCH INFORMATION DO WE NEED?
Research within the U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of green chemistries. This methodology called GREENSCOPE (Gauging Reaction Effectiveness for the Environmental Sustainability of Chemistries with a multi-Ob...
Cragg, Jenna L.; Burger, Alan E.; Piatt, John F.
2015-01-01
Cryptic nest sites and secretive breeding behavior make population estimates and monitoring of Marbled Murrelets Brachyramphus marmoratus difficult and expensive. Standard audio-visual and radar protocols have been refined but require intensive field time by trained personnel. We examined the detection range of automated sound recorders (Song Meters; Wildlife Acoustics Inc.) and the reliability of automated recognition models (“recognizers”) for identifying and quantifying Marbled Murrelet vocalizations during the 2011 and 2012 breeding seasons at Kodiak Island, Alaska. The detection range of murrelet calls by Song Meters was estimated to be 60 m. Recognizers detected 20 632 murrelet calls (keer and keheer) from a sample of 268 h of recordings, yielding 5 870 call series, which compared favorably with human scanning of spectrograms (on average detecting 95% of the number of call series identified by a human observer, but not necessarily the same call series). The false-negative rate (percentage of murrelet call series that the recognizers failed to detect) was 32%, mainly involving weak calls and short call series. False-positives (other sounds included by recognizers as murrelet calls) were primarily due to complex songs of other bird species, wind and rain. False-positives were lower in forest nesting habitat (48%) and highest in shrubby vegetation where calls of other birds were common (97%–99%). Acoustic recorders tracked spatial and seasonal trends in vocal activity, with higher call detections in high-quality forested habitat and during late July/early August. Automated acoustic monitoring of Marbled Murrelet calls could provide cost-effective, valuable information for assessing habitat use and temporal and spatial trends in nesting activity; reliability is dependent on careful placement of sensors to minimize false-positives and on prudent application of digital recognizers with visual checking of spectrograms.
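The validation arithmetic behind the reported false-negative and false-positive rates can be sketched by comparing the recognizer's detections against human-labelled call series. The counts below are toy values chosen to mirror the 32% false-negative figure, not the study's data:

```python
# Error rates for an automated call recognizer, given human-labelled
# ground truth: false negatives are series the recognizer missed,
# false positives are non-target sounds it accepted.

def error_rates(human_series, detected_series):
    """Both arguments are sets of call-series identifiers."""
    false_neg = human_series - detected_series
    false_pos = detected_series - human_series
    fnr = len(false_neg) / len(human_series)
    fpr = len(false_pos) / len(detected_series)
    return fnr, fpr

human = {f"s{i}" for i in range(100)}                  # 100 labelled series
detected = ({f"s{i}" for i in range(32, 100)}          # 68 found, 32 missed
            | {f"noise{i}" for i in range(10)})        # 10 spurious hits
fnr, fpr = error_rates(human, detected)
```

Note the study's headline comparison (recognizers found ~95% of the *number* of series a human found) can mask such errors, since missed series and spurious hits partly cancel in a raw count.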
Harvesting model uncertainty for the simulation of interannual variability
NASA Astrophysics Data System (ADS)
Misra, Vasubandhu
2009-08-01
An innovative modeling strategy is introduced to account for uncertainty in the convective parameterization (CP) scheme of a coupled ocean-atmosphere model. The methodology involves calling the CP scheme several times at every given time step of the model integration to pick the most probable convective state. Each call of the CP scheme is unique in that one of its critical parameter values (which is unobserved but required by the scheme) is chosen randomly over a given range. This methodology is tested with the relaxed Arakawa-Schubert CP scheme in the Center for Ocean-Land-Atmosphere Studies (COLA) coupled general circulation model (CGCM). Relative to the control COLA CGCM, this methodology shows improvement in the El Niño-Southern Oscillation simulation and the Indian summer monsoon precipitation variability.
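The ensemble-of-calls idea can be sketched as follows. The "scheme", its discretized response, and the selection rule are toy stand-ins: the actual work perturbs a critical parameter of the relaxed Arakawa-Schubert scheme inside a coupled GCM, not a one-line function:

```python
# Toy version of the strategy: call a parameterization several times per
# time step, each time drawing its unobserved critical parameter at
# random, then keep the most frequently recurring (most probable) state.

import random

def convective_state(critical_param):
    # Hypothetical discretized convective response to the parameter.
    return round(critical_param * 4) / 4

def most_probable_state(n_calls=50, lo=0.2, hi=0.8, seed=1):
    rng = random.Random(seed)
    states = [convective_state(rng.uniform(lo, hi)) for _ in range(n_calls)]
    return max(set(states), key=states.count)

state = most_probable_state()
```

Drawing the parameter anew at every call is what "harvests" the model's structural uncertainty instead of freezing it at one tuned value.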
Behavioral networks as a model for intelligent agents
NASA Technical Reports Server (NTRS)
Sliwa, Nancy E.
1990-01-01
On-going work at NASA Langley Research Center in the development and demonstration of a paradigm called behavioral networks as an architecture for intelligent agents is described. This work focuses on the need to identify a methodology for smoothly integrating the characteristics of low-level robotic behavior, including actuation and sensing, with intelligent activities such as planning, scheduling, and learning. This work assumes that all these needs can be met within a single methodology, and attempts to formalize this methodology in a connectionist architecture called behavioral networks. Behavioral networks are networks of task processes arranged in a task decomposition hierarchy. These processes are connected by both command/feedback data flow, and by the forward and reverse propagation of weights which measure the dynamic utility of actions and beliefs.
ERIC Educational Resources Information Center
Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen
2017-01-01
Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…
Spatiotemporal Detection of Unusual Human Population Behavior Using Mobile Phone Data
Dobra, Adrian; Williams, Nathalie E.; Eagle, Nathan
2015-01-01
With the aim to contribute to humanitarian response to disasters and violent events, scientists have proposed the development of analytical tools that could identify emergency events in real-time, using mobile phone data. The assumption is that dramatic and discrete changes in behavior, measured with mobile phone data, will indicate extreme events. In this study, we propose an efficient system for spatiotemporal detection of behavioral anomalies from mobile phone data and compare sites with behavioral anomalies to an extensive database of emergency and non-emergency events in Rwanda. Our methodology successfully captures anomalous behavioral patterns associated with a broad range of events, from religious and official holidays to earthquakes, floods, violence against civilians and protests. Our results suggest that human behavioral responses to extreme events are complex and multi-dimensional, including extreme increases and decreases in both calling and movement behaviors. We also find significant temporal and spatial variance in responses to extreme events. Our behavioral anomaly detection system and extensive discussion of results are a significant contribution to the long-term project of creating an effective real-time event detection system with mobile phone data and we discuss the implications of our findings for future research to this end. PMID:25806954
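One simple form of the behavioral-anomaly detection described above is to flag time windows whose call volume deviates strongly from a site's historical statistics. The threshold, data, and z-score rule below are illustrative assumptions, not the paper's actual system:

```python
# Flag hours whose call count lies more than k standard deviations from
# the site's mean: a crude single-site version of spatiotemporal
# anomaly detection on mobile phone activity.

from statistics import mean, stdev

def anomalous_hours(counts, k=3.0):
    """Return indices of hours whose count deviates by > k sigma."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts)
            if sigma > 0 and abs(c - mu) > k * sigma]

# 23 ordinary hours, then a spike (e.g. an event drives call volume up).
calls_per_hour = [100, 98, 103, 97, 101, 99, 102, 100, 98, 101,
                  99, 100, 102, 97, 103, 98, 100, 101, 99, 102,
                  100, 98, 101, 400]
spikes = anomalous_hours(calls_per_hour)
```

Because the paper finds responses that include *decreases* as well as increases in calling and movement, the absolute deviation (rather than a one-sided test) is the relevant form.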
Episodic Upwelling of Zooplankton within a Bowhead Whale Feeding Area Near Barrow, AK
2011-09-30
…the Beaufort year-round. Bowhead whales vocalize using both calls and songs. There was distinct seasonal variability in the detection of the … different species’ calls/songs. Calls/songs from whale species were detected in fall and declined as ice concentration in the mooring vicinity increased … (Figs. 4 & 5). In the spring, however, whale calls/songs were detected beginning in April, when the region was still covered with ice, and continued…
Lorey, Martina; Adler, Belinda; Yan, Hong; Soliymani, Rabah; Ekström, Simon; Yli-Kauhaluoma, Jari; Laurell, Thomas; Baumann, Marc
2015-05-19
A new read-out method for antibody arrays using laser desorption/ionization-mass spectrometry (LDI-MS) is presented. Small, photocleavable reporter molecules with a defined mass, called "mass-tags", are used for detection of immunocaptured proteins from human plasma. Using prostate specific antigen (PSA), a biomarker for prostate cancer, as a model antigen, a high-sensitivity generic detection methodology based on immunocapture with a primary antibody and a biotin-labeled secondary antibody coupled to mass-tagged avidin is demonstrated. As each secondary antibody can bind several avidin molecules, each having a large number of mass-tags, signal amplification can be achieved. The developed PSA sandwich mass-tag analysis method provided a limit of detection below 200 pg/mL (6 pM) for a 10 μL plasma sample, well below the clinically relevant cutoff value of 3-4 ng/mL. This brings the limit of detection (LOD) for detection of intact antigens with matrix-assisted laser desorption/ionization-mass spectrometry (MALDI-MS) down to levels comparable to capture by anti-peptide antibodies selected reaction monitoring (SISCAPA SRM) and enzyme linked immunosorbent assay (ELISA), as 6 pM corresponds to a maximal amount of 60 amol PSA captured on-spot. We propose the potential use of LDI (laser desorption/ionization) with mass-tag read-out implemented in a sandwich assay format for low abundant and/or early disease biomarker detection.
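The quoted figures (200 pg/mL ≈ 6 pM, and ~60 amol in a 10 μL sample) follow from a straightforward unit conversion, sketched below. The PSA molar mass of ~33 kDa is an assumption for illustration; the abstract does not state the value used:

```python
# Convert a mass-concentration detection limit to molar concentration
# and to the absolute amount of antigen in a 10 uL sample.

PSA_MOLAR_MASS = 33_000.0            # g/mol, assumed for this sketch

lod_g_per_l = 200e-12 * 1000         # 200 pg/mL  ->  2e-7 g/L
lod_mol_per_l = lod_g_per_l / PSA_MOLAR_MASS
lod_pM = lod_mol_per_l * 1e12        # ~6 pM

sample_volume_l = 10e-6              # 10 uL plasma sample
amount_amol = lod_mol_per_l * sample_volume_l * 1e18   # ~60 amol
```

Both derived numbers land on the abstract's figures, which is why a 6 pM concentration LOD is equivalent to detecting tens of attomoles of captured antigen on-spot.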
Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M
2018-08-01
Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have been marked by ambiguity, partly because the assessment of detection-dog effectiveness remains entrenched in a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs rest on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies serve as both an alternative and an assistor to the detection dog industry; however, the interrelationship between the two detection paradigms requires clarification. These factors, considered in terms of their relative contributions, illustrate a need to address research gaps, formalise the detection dog industry and research process, and take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of these factors in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.
Cheng, Chia-Yang; Chu, Chia-Han; Hsu, Hung-Wei; Hsu, Fang-Rong; Tang, Chung Yi; Wang, Wen-Ching; Kung, Hsing-Jien; Chang, Pei-Ching
2014-01-01
Post-translational modification (PTM) of transcription factors and chromatin remodelling proteins is recognized as a major mechanism by which transcriptional regulation occurs. Chromatin immunoprecipitation in combination with high-throughput sequencing (ChIP-seq) is applied as a gold standard when studying the genome-wide binding sites of transcription factors (TFs). This has greatly improved our understanding of protein-DNA interactions on a genome-wide scale. However, current ChIP-seq peak calling tools are not sufficiently sensitive and are unable to simultaneously identify post-translationally modified TFs from ChIP-seq analysis, largely due to the widespread presence of multiple modified TFs. Using SUMO-1 modification as an example, we describe here an improved approach that allows the simultaneous identification of the particular genomic binding regions of all TFs with SUMO-1 modification. Traditional peak calling methods are inadequate for identifying multiple TF binding sites that involve long genomic regions, and we therefore designed a ChIP-seq processing pipeline that detects peaks via a combinatorial fusion method. We then annotate the peaks with known transcription factor binding sites (TFBS) using the Transfac Matrix Database (v7.0) to predict potential SUMOylated TFs. Next, the peak calling results were further analyzed based on promoter proximity, TFBS annotation and a literature review, and were validated by ChIP real-time quantitative PCR (qPCR) and ChIP-reChIP real-time qPCR. The results show clearly that SUMOylated TFs can be pinpointed using our pipeline. A methodology is presented that analyzes SUMO-1 ChIP-seq patterns and predicts the related TFs. Our analysis uses three peak calling tools; the fusion of these different tools increases the precision of the peak calling results, and the TFBS annotation method is able to predict potential SUMOylated TFs.
Here, we offer a new approach that enhances ChIP-seq data analysis and allows the identification of multiple SUMOylated TF binding sites simultaneously, which can then be utilized for the prediction of other functional PTM binding sites in the future.
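The abstract does not spell out its "combinatorial fusion" of three peak callers, but one plausible reading is support counting over the callers' intervals: keep regions where enough callers agree. A minimal sweep-line sketch of that idea, with made-up coordinates:

```python
def fuse_peaks(peak_sets, min_support=2):
    """Fuse peak intervals from several callers on one chromosome:
    count per-position caller support with a sweep line over interval
    endpoints, and keep regions supported by >= min_support callers."""
    events = []
    for peaks in peak_sets:
        for start, end in peaks:
            events.append((start, 1))   # a caller's peak opens here
            events.append((end, -1))    # ...and closes here
    events.sort()
    fused, depth, region_start = [], 0, None
    for pos, delta in events:
        prev = depth
        depth += delta
        if prev < min_support <= depth:      # support rises to threshold
            region_start = pos
        elif prev >= min_support > depth:    # support drops below it
            fused.append((region_start, pos))
    return fused

# Three hypothetical callers' peak lists (start, end); only regions where
# at least two callers overlap survive the fusion.
caller_a = [(100, 200), (500, 650)]
caller_b = [(120, 210)]
caller_c = [(580, 700)]
print(fuse_peaks([caller_a, caller_b, caller_c]))  # [(120, 200), (580, 650)]
```

Requiring two of three callers trades a little sensitivity for precision, which is the stated motivation for fusing tools.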
Using Puppets to Teach Schoolchildren to Detect Stroke and Call 911
ERIC Educational Resources Information Center
Sharkey, Sonya; Denke, Linda; Herbert, Morley A.
2016-01-01
To overcome barriers to improved outcomes, we undertook an intervention to teach schoolchildren how to detect a stroke and call emergency medical services (EMS). We obtained permission from parents and guardians to use an 8-min puppet show to instruct the fourth, fifth, and sixth graders about stroke detection, symptomatology, and calling EMS. A…
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…
User-Centered Computer Aided Language Learning
ERIC Educational Resources Information Center
Zaphiris, Panayiotis, Ed.; Zacharia, Giorgos, Ed.
2006-01-01
In the field of computer aided language learning (CALL), there is a need for emphasizing the importance of the user. "User-Centered Computer Aided Language Learning" presents methodologies, strategies, and design approaches for building interfaces for a user-centered CALL environment, creating a deeper understanding of the opportunities and…
Instructional Innovation - MOS: A Model Involving Student Participation.
ERIC Educational Resources Information Center
Malloy, Elizabeth; O'Donnell, Terrence P.
1987-01-01
Asserts that new models of instructional methodology are needed to meet the demands of a changing world community. Describes a small-group teaching method called MOS, which calls for students to read, analyze, and shape meaning gained from material while instructors encourage and provide insight. (BSR)
Methodological challenges collecting parent phone-call healthcare utilization data.
Moreau, Paula; Crawford, Sybil; Sullivan-Bolyai, Susan
2016-02-01
Recommendations by the National Institute of Nursing Research and other groups have strongly encouraged nurses to pay greater attention to cost-effectiveness analysis when conducting research. Given the increasing prominence of translational science and comparative effectiveness research, cost-effectiveness analysis has become a basic tool in determining intervention value in research. Tracking phone-call communication (number of calls and context) with cross-checks between parents and healthcare providers is an example of this type of healthcare utilization data collection. This article identifies some methodological challenges that have emerged in the process of collecting this type of data in a randomized controlled trial: Parent education Through Simulation-Diabetes (PETS-D). We also describe ways in which those challenges have been addressed with comparison data results, and make recommendations for future research. Copyright © 2015 Elsevier Inc. All rights reserved.
Carduff, Emma; Murray, Scott A; Kendall, Marilyn
2015-04-11
Qualitative longitudinal research is an evolving methodology, particularly within health care research. It facilitates a nuanced understanding of how phenomena change over time and is ripe for innovative approaches. However, methodological reflections tailored to health care research are scarce. This article provides a synthesised and practical account of the advantages and challenges of maintaining regular telephone contact between interviews with participants in a qualitative longitudinal study. Participants with metastatic colorectal cancer were interviewed at 3 time points over the course of a year. Half the group also received monthly telephone calls to explore the added value and the feasibility of capturing change as close to when it was occurring as possible. The data gathered from the telephone calls added context to the participants' overall narrative and informed subsequent interviews. The telephone calls meant we were able to capture change close to when it happened, and there was a more evolved, and involved, relationship between the researcher and the participants who were called on a monthly basis. However, ethical challenges were amplified, boundaries of the participant/researcher relationship were questioned, and the analytical burden increased. The telephone calls allowed a more nuanced understanding of the illness experience to emerge, when compared with the interview-only group. The findings suggest that intensive telephone contact may be justified when retention is an issue, when the phenomena being studied are unpredictable, and when participants feel disempowered or lack control. These are potential issues for research involving participants with long-term illness.
1984-02-01
PERFORM FLOW, CAPITAL COST, CALL CALC1 ENGINEERING AND OPERATING CALL CALC2 CALCULATIONS AND MAINTENANCE REPORTS PERFORM FINANCIAL CALL ECONM FINANCIAL... 9.2.5 Financial Analysis Routines. ECONM serves as
Thode, Aaron M; Kim, Katherine H; Blackwell, Susanna B; Greene, Charles R; Nations, Christopher S; McDonald, Trent L; Macrander, A Michael
2012-05-01
An automated procedure has been developed for detecting and localizing frequency-modulated bowhead whale sounds in the presence of seismic airgun surveys. The procedure was applied to four years of data, collected from over 30 directional autonomous recording packages deployed over a 280 km span of continental shelf in the Alaskan Beaufort Sea. The procedure has six sequential stages that begin by extracting 25-element feature vectors from spectrograms of potential call candidates. Two cascaded neural networks then classify some feature vectors as bowhead calls, and the procedure then matches calls between recorders to triangulate locations. To train the networks, manual analysts flagged 219 471 bowhead call examples from 2008 and 2009. Manual analyses were also used to identify 1.17 million transient signals that were not whale calls. The network output thresholds were adjusted to reject 20% of whale calls in the training data. Validation runs using 2007 and 2010 data found that the procedure missed 30%-40% of manually detected calls. Furthermore, 20%-40% of the sounds flagged as calls are not present in the manual analyses; however, these extra detections incorporate legitimate whale calls overlooked by human analysts. Both manual and automated methods produce similar spatial and temporal call distributions.
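The localization stage described above matches calls between directional recorders and triangulates positions. A minimal sketch of that geometric step for two recorders, using a least-squares-free 2x2 solve of the two bearing lines (recorder positions, bearings, and coordinates are invented for illustration; the published procedure uses many recorders and error modelling):

```python
import numpy as np

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing lines. Bearings are measured clockwise from
    north, so the unit direction is (sin b, cos b) in (east, north)
    coordinates. Solves p1 + t1*d1 = p2 + t2*d2 for (t1, t2); parallel
    bearings would make the system singular."""
    d1 = np.array([np.sin(np.radians(bearing1_deg)), np.cos(np.radians(bearing1_deg))])
    d2 = np.array([np.sin(np.radians(bearing2_deg)), np.cos(np.radians(bearing2_deg))])
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# Recorders at (0, 0) and (10, 0) km; bearings 45 deg and 315 deg
# cross at (5, 5) km.
loc = triangulate((0, 0), 45.0, (10, 0), 315.0)
print(np.round(loc, 3))  # [5. 5.]
```

With more than two recorders, each extra bearing adds two equations and the same idea becomes an overdetermined least-squares fit.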
Complicating Methodological Transparency
ERIC Educational Resources Information Center
Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.
2016-01-01
A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…
Actualizacion linguistica, AL-1 (Current Linguistics, AL-1).
ERIC Educational Resources Information Center
Penaloza, Miguel
This document, the first in a series called "Actualizacion Linguistica," seeks to establish the basis for testing a new methodology for teaching Spanish in Colombia, beginning at the preschool and primary levels. The methodology initially uses a system of "logic blocks" of differing size, color, shape, and weight to devise games…
Against Simplicity, against Ethics: Analytics of Disruption as Quasi-Methodology
ERIC Educational Resources Information Center
Childers, Sara M.
2012-01-01
Simplified understandings of qualitative inquiry as mere method overlook the complexity and nuance of qualitative practice. As is the call of this special issue, the author intervenes in the simplification of qualitative inquiry through a discussion of methodology to illustrate how theory and inquiry are inextricably linked and ethically…
ERIC Educational Resources Information Center
Casado, Banghwa Lee; Negi, Nalini Junko; Hong, Michin
2012-01-01
Despite the growing number of language minorities, foreign-born individuals with limited English proficiency, this population has been largely left out of social work research, often due to methodological challenges involved in conducting research with this population. Whereas the professional standard calls for cultural competence, a discussion…
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…
Positive Deviance: Learning from Positive Anomalies
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Gale, Dick
2017-01-01
Purpose: This paper is one of seven in this volume, each elaborating different approaches to quality improvement in education. The purpose of this paper is to delineate a methodology called positive deviance. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an…
Visualization of delamination in composite materials utilizing advanced X-ray imaging techniques
NASA Astrophysics Data System (ADS)
Vavrik, D.; Jakubek, J.; Jandejsek, I.; Krejci, F.; Kumpova, I.; Zemlicka, J.
2015-04-01
This work is focused on the development of instrumental radiographic methods for the detection of delaminations in layered carbon fibre reinforced plastic composites used in the aerospace industry. The main limitation of current visualisation techniques is the very limited ability to image so-called closed delaminations, in which the delaminated layers are in contact with practically no physical gap. In this contribution we report the development of innovative methods for closed delamination detection using an X-ray phase-contrast technique, for which the distance between delamination surfaces is not relevant. The approach is based on the energetic sensitivity of phase-enhanced radiography. With this methodology, we can distinguish both closed and open delaminations. Furthermore, we have demonstrated the possibility of visualising open delaminations, which are characterised by a physical gap between delaminated layers; this delamination type was successfully identified and visualised using a high-resolution table-top computed tomography technique based on proper beam-hardening correction.
Eyetracking Methodology in SCMC: A Tool for Empowering Learning and Teaching
ERIC Educational Resources Information Center
Stickler, Ursula; Shi, Lijing
2017-01-01
Computer-assisted language learning, or CALL, is an interdisciplinary area of research, positioned between science and social science, computing and education, linguistics and applied linguistics. This paper argues that by appropriating methods originating in some areas of CALL-related research, for example human-computer interaction (HCI) or…
NASA Astrophysics Data System (ADS)
Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB
2017-11-01
Increased demand for Internet of Things (IoT) applications has inadvertently forced the move towards higher-complexity integrated circuits supporting SoC designs. This rapid increase in complexity demands correspondingly sophisticated validation strategies, and has led researchers to devise various methodologies to overcome the problem, bringing about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the SoC verification process in order to reduce the time consumed and achieve a fast time to market for the system. In this paper we therefore focus on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. On top of that, the Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as a means to a faster time to market. Thus, OVM is proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.
Structural Damage Detection Using Changes in Natural Frequencies: Theory and Applications
NASA Astrophysics Data System (ADS)
He, K.; Zhu, W. D.
2011-07-01
A vibration-based method that uses changes in natural frequencies of a structure to detect damage has advantages over conventional nondestructive tests in detecting various types of damage, including loosening of bolted joints, using minimum measurement data. Two major challenges associated with applications of the vibration-based damage detection method to engineering structures are addressed: accurate modeling of structures and the development of a robust inverse algorithm to detect damage, which are defined as the forward and inverse problems, respectively. To resolve the forward problem, new physics-based finite element modeling techniques are developed for fillets in thin-walled beams and for bolted joints, so that complex structures can be accurately modeled with a reasonable model size. To resolve the inverse problem, a logistical function transformation is introduced to convert the constrained optimization problem to an unconstrained one, and a robust iterative algorithm using a trust-region method, called the Levenberg-Marquardt method, is developed to accurately detect the locations and extent of damage. The new methodology can ensure global convergence of the iterative algorithm in solving under-determined system equations and deal with damage detection problems with relatively large modeling error and measurement noise. The vibration-based damage detection method is applied to various structures including lightning masts, a space frame structure and one of its components, and a pipeline. The exact locations and extent of damage can be detected in the numerical simulation where there is no modeling error and measurement noise. The locations and extent of damage can be successfully detected in experimental damage detection.
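The inverse-problem idea above, mapping constrained damage extents through a logistic transform so that an unconstrained Levenberg-Marquardt iteration can be used, can be sketched on a toy problem. The three-frequency, two-element linear "model" below is an invented stand-in for the paper's finite element models; only the transform-and-solve structure reflects the described method.

```python
import numpy as np
from scipy.optimize import least_squares

# Logistic transform: damage extents d in (0, 1) map to unconstrained
# variables u, so plain Levenberg-Marquardt stays feasible automatically.
def to_unconstrained(d):
    return np.log(d / (1.0 - d))

def to_damage(u):
    return 1.0 / (1.0 + np.exp(-u))

# Toy forward model: natural frequencies drop linearly with element
# damage via a sensitivity matrix (a stand-in for an FE model).
f0 = np.array([10.0, 25.0, 40.0])                      # healthy frequencies, Hz
sens = np.array([[2.0, 0.5], [1.0, 3.0], [0.5, 1.5]])  # Hz drop per unit damage

def frequencies(d):
    return f0 - sens @ d

d_true = np.array([0.3, 0.1])          # element 1 damaged 30%, element 2 10%
measured = frequencies(d_true)          # "measured" frequency changes

def residual(u):
    return frequencies(to_damage(u)) - measured

sol = least_squares(residual, to_unconstrained(np.array([0.5, 0.5])), method="lm")
print(np.round(to_damage(sol.x), 3))  # recovers [0.3 0.1]
```

Because the damage variables can never leave (0, 1) under the transform, the iteration cannot wander into physically meaningless negative or super-unity damage, which is the point of the logistic conversion.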
Automatic food intake detection based on swallowing sounds.
Makeyev, Oleksandr; Lopez-Meyer, Paulo; Schuckers, Stephanie; Besio, Walter; Sazonov, Edward
2012-11-01
This paper presents a novel fully automatic food intake detection methodology, an important step toward objective monitoring of ingestive behavior. The aim of such monitoring is to improve our understanding of eating behaviors associated with obesity and eating disorders. The proposed methodology consists of two stages. First, acoustic detection of swallowing instances is performed based on mel-scale Fourier spectrum features and classification using support vector machines. Principal component analysis and a smoothing algorithm are used to improve swallowing detection accuracy. Second, the frequency of swallowing is used as a predictor for detection of food intake episodes. The proposed methodology was tested on data collected from 12 subjects with various degrees of adiposity. Average accuracies of >80% and >75% were obtained for intra-subject and inter-subject models, respectively, with a temporal resolution of 30 s. Results obtained on 44.1 hours of data with a total of 7305 swallows show that detection accuracies are comparable for obese and lean subjects. They also suggest the feasibility of food intake detection based on swallowing sounds and the potential of the proposed methodology for automatic monitoring of ingestive behavior. Based on a wearable, non-invasive acoustic sensor, the proposed methodology may potentially be used in free-living conditions.
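The two-stage structure (SVM swallow classifier, then swallow frequency per window as the intake predictor) can be sketched end to end. The feature vectors below are synthetic Gaussian stand-ins for the paper's mel-scale spectrum features, and the per-window threshold is an invented illustration, not the published decision rule.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stage 1: train an SVM to label sound epochs as swallow (1) vs other (0)
# from (here synthetic) 13-dimensional spectral feature vectors.
n, dim = 200, 13
swallow_feats = rng.normal(1.0, 0.4, (n, dim))
other_feats = rng.normal(-1.0, 0.4, (n, dim))
X = np.vstack([swallow_feats, other_feats])
y = np.array([1] * n + [0] * n)
clf = SVC(kernel="rbf").fit(X, y)

# Stage 2: within each time window, count detected swallows; enough
# swallows per window flags a food intake episode.
def intake_episode(window_features, min_swallows=3):
    swallows = int(clf.predict(window_features).sum())
    return swallows >= min_swallows

eating_window = rng.normal(1.0, 0.4, (6, dim))   # mostly swallow-like epochs
idle_window = rng.normal(-1.0, 0.4, (6, dim))    # mostly other sounds
print(intake_episode(eating_window), intake_episode(idle_window))  # True False
```

Real deployments would replace the synthetic features with mel-spectrum features of microphone epochs and tune the window length to the paper's 30 s resolution.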
Nadeau, Christopher P.; Conway, Courtney J.; Piest, Linden; Burger, William P.
2013-01-01
Broadcasting calls of marsh birds during point-count surveys increases their detection probability and decreases variation in the number of birds detected across replicate surveys. However, multi-species monitoring using call-broadcast may reduce these benefits if birds are reluctant to call once they hear broadcasted calls of other species. We compared a protocol that uses call-broadcast for only one species (Yuma clapper rail [Rallus longirostris yumanensis]) to a protocol that uses call-broadcast for multiple species. We detected more of each of the following species using the multi-species protocol: 25 % more pied-billed grebes, 160 % more American bitterns, 52 % more least bitterns, 388 % more California black rails, 12 % more Yuma clapper rails, 156 % more Virginia rails, 214 % more soras, and 19 % more common gallinules. Moreover, the coefficient of variation was smaller when using the multi-species protocol: 10 % smaller for pied-billed grebes, 38 % smaller for American bitterns, 19 % smaller for least bitterns, 55 % smaller for California black rails, 5 % smaller for Yuma clapper rails, 38 % smaller for Virginia rails, 44 % smaller for soras, and 8 % smaller for common gallinules. Our results suggest that multi-species monitoring approaches may be more effective and more efficient than single-species approaches even when using call-broadcast.
A Cooperative IDS Approach Against MPTCP Attacks
2017-06-01
physical testbeds in order to present a methodology that allows distributed IDSs (DIDS) to cooperate in a manner that permits effective detection of...reconstruct MPTCP subflows and detect malicious content...hypotheses on a more realistic testbed environment. • Developing a methodology to incorporate multiple IDSs, real and virtual, to be able to detect cross
Using a Principle-Based Method to Support a Disability Aesthetic
ERIC Educational Resources Information Center
Anderson, Bailey
2015-01-01
This article calls choreographers and educators alike to continue building an awareness of methodologies that support a disability aesthetic. A disability aesthetic supports the embodiment of dancers with disabilities by allowing for their bodies to set guidelines of beauty and value. Principle-based work is a methodology that supports a…
ERIC Educational Resources Information Center
Lundh, Anna
2010-01-01
Introduction: The concept of information needs is significant within the field of Information Needs Seeking and Use. "How" information needs can be studied empirically is however something that has been called into question. The main aim of this paper is to explore the methodological consequences of discursively oriented theories when…
ERIC Educational Resources Information Center
Huesca, Robert
The participatory method of image production holds enormous potential for communication and journalism scholars operating out of a critical/cultural framework. The methodological potentials of mechanical reproduction were evident in the 1930s, when Walter Benjamin contributed three enduring concepts: questioning the art/document dichotomy; placing…
Preliminary Validation of Composite Material Constitutive Characterization
John G. Michopoulos; Athanasios lliopoulos; John C. Hermanson; Adrian C. Orifici; Rodney S. Thomson
2012-01-01
This paper describes the preliminary results of an effort to validate a methodology developed for composite material constitutive characterization. This methodology involves using massive amounts of data produced from multiaxially tested coupons via a 6-DoF robotic system called NRL66.3, developed at the Naval Research Laboratory. The testing is followed by...
A Call for Methodological Plurality: Reconsidering Research Approaches in Adult Education
ERIC Educational Resources Information Center
Daley, Barbara J.; Martin, Larry G.; Roessger, Kevin M.
2018-01-01
Within this "Adult Education Quarterly" ("AEQ") forum, the authors call for a dialogue and examination of research methods in the field of adult and continuing education. Using the article by Boeren as a starting point, the authors analyze both qualitative and quantitative research trends and advocate for more methodological…
ERIC Educational Resources Information Center
Ross, Karen; Dennis, Barbara; Zhao, Pengfei; Li, Peiwei
2017-01-01
We are in an era that calls for increasing "training" in educational research methodologies. When the National Research Council (2004) calls for training in educational research that is "rigorous" and "relevant," the focus strongly emphasizes WHAT should be taught instead of WHO is being engaged in the learning.…
Contextuality and Cultural Texts: A Case Study of Workplace Learning in Call Centres
ERIC Educational Resources Information Center
Crouch, Margaret
2006-01-01
Purpose: The paper seeks to show the contextualisation of call centres as a work-specific ethnographically and culturally based community, which, in turn, influences pedagogical practices through the encoding and decoding of cultural texts in relation to two logics: cost-efficiency and customer-orientation. Design/methodology/approach: The paper…
The Flipped Classroom: Implementing Technology to Aid in College Mathematics Student's Success
ERIC Educational Resources Information Center
Buch, George R.; Warren, Carryn B.
2017-01-01
In August 2016 there was a call (Braun, Bremser, Duval, Lockwood & White, 2017) for post-secondary instructors to use active learning in their classrooms. One such example of active learning is what is called the "flipped" classroom. This paper presents the need for, and the methodology of, the flipped classroom, results of…
ERIC Educational Resources Information Center
Keisanen, Tiina; Kuure, Leena
2015-01-01
Language teachers of the future, our current students, live in an increasingly technology-rich world. However, language students do not necessarily see their own digital practices as having relevance for guiding language learning. Research in the fields of CALL and language education more generally indicates that teaching practices change slowly…
Pairing call-response surveys and distance sampling for a mammalian carnivore
Hansen, Sara J. K.; Frair, Jacqueline L.; Underwood, Harold B.; Gibbs, James P.
2015-01-01
Density estimates accounting for differential animal detectability are difficult to acquire for wide-ranging and elusive species such as mammalian carnivores. Pairing distance sampling with call-response surveys may provide an efficient means of tracking changes in populations of coyotes (Canis latrans), a species of particular interest in the eastern United States. Blind field trials in rural New York State indicated 119-m linear error for triangulated coyote calls, and a 1.8-km distance threshold for call detectability, which was sufficient to estimate a detection function with precision using distance sampling. We conducted statewide road-based surveys with sampling locations spaced ≥6 km apart from June to August 2010. Each detected call (be it a single or group) counted as a single object, representing 1 territorial pair, because of uncertainty in the number of vocalizing animals. From 524 survey points and 75 detections, we estimated the probability of detecting a calling coyote to be 0.17 ± 0.02 SE, yielding a detection-corrected index of 0.75 pairs/10 km2 (95% CI: 0.52–1.1, 18.5% CV) for a minimum of 8,133 pairs across rural New York State. Importantly, we consider this an index rather than true estimate of abundance given the unknown probability of coyote availability for detection during our surveys. Even so, pairing distance sampling with call-response surveys provided a novel, efficient, and noninvasive means of monitoring populations of wide-ranging and elusive, albeit reliably vocal, mammalian carnivores. Our approach offers an effective new means of tracking species like coyotes, one that is readily extendable to other species and geographic extents, provided key assumptions of distance sampling are met.
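The reported index can be roughly reproduced from the numbers in the abstract (524 points, 75 detections, detection probability 0.17, 1.8 km detectability threshold). The back-of-envelope below treats each point as covering a circle of the threshold radius and simply corrects counts by the detection probability; the paper's distance-sampling estimator fits a detection function and so differs in detail, which is why the result only approximates the published 0.75 pairs/10 km².

```python
import math

n_points = 524        # survey points
n_detections = 75     # detected calling pairs
p_hat = 0.17          # estimated probability of detecting a calling coyote
w_km = 1.8            # call detectability threshold (truncation distance)

# Area nominally covered: a circle of radius w around each point.
covered_km2 = n_points * math.pi * w_km ** 2

# Detection-corrected index, scaled to pairs per 10 km^2.
density_per_10km2 = 10 * n_detections / (p_hat * covered_km2)
print(round(density_per_10km2, 2))  # ~0.83, same order as the reported 0.75
```

As the abstract stresses, this is an index rather than a true abundance estimate, since the probability that a pair is available (i.e., calls at all) during the survey is unknown.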
Evaluation of copy number variation detection for a SNP array platform
2014-01-01
Background Copy Number Variations (CNVs) are usually inferred from Single Nucleotide Polymorphism (SNP) arrays by software packages based on given algorithms. However, there is no clear understanding of the performance of these packages, so it is difficult to select one or several of them for CNV detection on the SNP array platform. We selected four publicly available software packages designed for CNV calling from an Affymetrix SNP array: Birdsuite, dChip, Genotyping Console (GTC) and PennCNV. The publicly available dataset generated by Array-based Comparative Genomic Hybridization (CGH), with a resolution of 24 million probes per sample, was considered the “gold standard”. Against this gold standard, the success rate, average stability rate, sensitivity, consistency and reproducibility of the four software packages were assessed. Specifically, we also compared the efficiency of detecting CNVs simultaneously by two, three or all of the software packages with that of a single software package. Results Simply in terms of the quantity of detected CNVs, Birdsuite detected the most while GTC detected the least. We found that Birdsuite and dChip had obvious detection bias, and GTC seemed inferior because of the small number of CNVs it detected. We then investigated the consistency between the detections of each software package and those of the remaining three; the consistency of dChip was the lowest while that of GTC was the highest. Compared with the CGH-based CNV detection results, in the matching group GTC called the most matching CNVs, with PennCNV-Affy ranking second; in the non-overlapping group, GTC called the fewest CNVs. With regard to the reproducibility of CNV calling, larger CNVs were usually replicated better; PennCNV-Affy showed the best consistency while Birdsuite showed the poorest.
Conclusion We found that PennCNV outperformed the other three packages in the sensitivity and specificity of CNV calling. Each calling method has its own limitations and advantages for different data analyses. Therefore, optimized calling might be achieved by using multiple algorithms to evaluate the concordance and discordance of SNP array-based CNV calls. PMID:24555668
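Comparing a caller's CNVs against a gold-standard set requires an interval-matching rule. The sketch below uses 50% reciprocal overlap, a common (though arbitrary) convention that the abstract does not specify; coordinates are invented for illustration.

```python
def overlap_fraction(a, b):
    """Fraction of interval a = (start, end) covered by interval b."""
    start = max(a[0], b[0])
    end = min(a[1], b[1])
    return max(0, end - start) / (a[1] - a[0])

def matches_gold(call, gold_calls, min_reciprocal=0.5):
    """A called CNV matches the gold standard if some gold CNV shares at
    least min_reciprocal overlap in BOTH directions, so a tiny call
    inside a huge gold CNV (or vice versa) does not count as a match."""
    return any(overlap_fraction(call, g) >= min_reciprocal
               and overlap_fraction(g, call) >= min_reciprocal
               for g in gold_calls)

gold = [(1000, 5000), (20000, 26000)]                   # CGH "gold standard"
calls = [(1200, 4800), (21000, 22000), (40000, 41000)]  # one caller's output

# Fraction of calls that match the gold standard (a precision proxy).
precision_proxy = sum(matches_gold(c, gold) for c in calls) / len(calls)
print(round(precision_proxy, 3))  # 0.333: only the first call matches
```

Swapping the roles (fraction of gold CNVs recovered) yields a sensitivity proxy, and running both over each package's calls gives exactly the kind of matching/non-overlapping tallies the study reports.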
Probability of detecting band-tailed pigeons during call-broadcast versus auditory surveys
Kirkpatrick, C.; Conway, C.J.; Hughes, K.M.; Devos, J.C.
2007-01-01
Estimates of population trend for the interior subspecies of band-tailed pigeon (Patagioenas fasciata fasciata) are not available because no standardized survey method exists for monitoring the interior subspecies. We evaluated 2 potential band-tailed pigeon survey methods (auditory and call-broadcast surveys) from 2002 to 2004 in 5 mountain ranges in southern Arizona, USA, and in mixed-conifer forest throughout the state. Both auditory and call-broadcast surveys produced low numbers of cooing pigeons detected per survey route (x̄ ≤ 0.67) and had relatively high temporal variance in average number of cooing pigeons detected during replicate surveys (CV ≥ 161%). However, compared to auditory surveys, use of call-broadcast increased 1) the percentage of replicate surveys on which ≥1 cooing pigeon was detected by an average of 16%, and 2) the number of cooing pigeons detected per survey route by an average of 29%, with this difference being greatest during the first 45 minutes of the morning survey period. Moreover, probability of detecting a cooing pigeon was 27% greater during call-broadcast (0.80) versus auditory (0.63) surveys. We found that cooing pigeons were most common in mixed-conifer forest in southern Arizona and density of male pigeons in mixed-conifer forest throughout the state averaged 0.004 (SE = 0.001) pigeons/ha. Our results are the first to show that call-broadcast increases the probability of detecting band-tailed pigeons (or any species of Columbidae) during surveys. Call-broadcast surveys may provide a useful method for monitoring populations of the interior subspecies of band-tailed pigeon in areas where other survey methods are inappropriate.
NASA Astrophysics Data System (ADS)
Schmidt, S.; Heyns, P. S.; de Villiers, J. P.
2018-02-01
In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on that condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models is statistically combined to generate a discrepancy signal, which is post-processed to infer the condition of the gearbox. The discrepancy signal is further processed and combined with statistical methods for automatic fault detection and localisation and for fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
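The discrepancy-signal idea can be illustrated in miniature: score incoming vibration features by their negative log-likelihood under a model fitted only to healthy data, and alarm when the score exceeds a percentile derived from the healthy data. The sketch below substitutes a single Gaussian for the paper's hidden Markov models; all features, the mean shift, and the threshold are synthetic assumptions.

```python
import numpy as np

def fit_healthy_model(features):
    """Fit a Gaussian baseline to features from the healthy gearbox."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])
    return mu, cov

def discrepancy_signal(features, mu, cov):
    """Negative log-likelihood of each feature vector under the healthy model."""
    inv = np.linalg.inv(cov)
    diff = features - mu
    maha = np.einsum("ij,jk,ik->i", diff, inv, diff)   # Mahalanobis distances
    logdet = np.linalg.slogdet(cov)[1]
    d = features.shape[1]
    return 0.5 * (maha + logdet + d * np.log(2 * np.pi))

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(500, 3))   # features from the healthy gearbox
faulty = rng.normal(3.0, 1.0, size=(100, 3))    # shifted mean mimics a gear fault

mu, cov = fit_healthy_model(healthy)
d_healthy = discrepancy_signal(healthy, mu, cov)
d_faulty = discrepancy_signal(faulty, mu, cov)

threshold = np.percentile(d_healthy, 99)        # alarm threshold from healthy data only
alarm_rate = float(np.mean(d_faulty > threshold))
```

A full implementation would replace the Gaussian with per-operating-condition HMMs and post-process the discrepancy signal over time, as the paper describes.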
Damage detection of an in-service condensation pipeline joint
NASA Astrophysics Data System (ADS)
Briand, Julie; Rezaei, Davood; Taheri, Farid
2010-04-01
The early detection of damage in structural or mechanical systems is of vital importance. With early detection, the damage may be repaired before the integrity of the system is jeopardized, averting monetary losses, loss of life or limb, and environmental impacts. Among the various types of structural health monitoring techniques, vibration-based methods are of significant interest because the damage location does not need to be known beforehand, making them more versatile. The non-destructive damage detection method used for the experiments herein is a novel vibration-based method built on an index called the EMD Energy Damage Index, developed with the aim of providing improved qualitative results compared to currently available methods. As part of an effort to establish the integrity and limitations of this novel damage detection method, field testing was completed on a mechanical pipe joint on a condensation line located in the physical plant of Dalhousie University. Piezoceramic sensors, placed at various locations around the joint, were used to monitor the free vibration of the pipe imposed through the use of an impulse hammer. Multiple damage progression scenarios were completed, each having a healthy state and multiple damage cases. Subsequently, the recorded signals from the healthy and damaged joint were processed through the in-house EMD Energy Damage Index in an effort to detect the inflicted damage. The proposed methodology successfully detected the inflicted damages. In this paper, the effects of impact location, sensor location, frequency bandwidth, intrinsic mode functions, and boundary conditions are discussed.
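The EMD Energy Damage Index itself is an in-house index not specified in the abstract; the sketch below illustrates only the general idea of an energy-based damage index, using FFT band energies as a stand-in for intrinsic-mode-function energies. The signals, bands, and index formula are illustrative assumptions.

```python
import numpy as np

def band_energies(signal, fs, bands):
    """Spectral energy in given frequency bands (a stand-in for IMF energies)."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

def energy_damage_index(healthy, damaged, fs, bands):
    """Relative change in band energies between healthy and damaged states."""
    e_h = band_energies(healthy, fs, bands)
    e_d = band_energies(damaged, fs, bands)
    return float(np.sqrt(np.sum((e_d - e_h) ** 2) / np.sum(e_h ** 2)))

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
healthy = np.sin(2 * np.pi * 50 * t)                   # dominant mode of the joint
damaged = healthy + 0.5 * np.sin(2 * np.pi * 120 * t)  # damage excites a new band
bands = [(0.0, 80.0), (80.0, 200.0)]

idx_same = energy_damage_index(healthy, healthy, fs, bands)   # identical states -> 0
idx_dmg = energy_damage_index(healthy, damaged, fs, bands)    # damage raises the index
```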
Occurrence of veterinary pharmaceuticals in the aquatic environment in Flanders
NASA Astrophysics Data System (ADS)
Servaes, K.; Vanermen, G.; Seuntjens, P.
2009-04-01
There is a growing interest in the occurrence of pharmaceuticals in the aquatic environment. Pharmaceuticals are classified as so-called 'emerging pollutants'. 'Emerging pollutants' are not necessarily new chemical compounds; often these compounds have been present in the environment for a long time, but their occurrence and especially their impact on the environment have only recently become clear. Consequently, data on their occurrence are rather scarce. In this study, we focus on the occurrence of veterinary pharmaceuticals in surface water in Flanders. We have only considered active substances administered to cattle, pigs and poultry. Based on the literature and information concerning use in Belgium, a selection of 25 veterinary pharmaceuticals was made. This selection consists of the most important antibiotics and antiparasitic substances applied in veterinary medicine in Belgium. We develop an analytical methodology based on UPLC-MS/MS for the detection of these veterinary pharmaceuticals in surface water. To this end, the mass characteristics as well as the optimum LC conditions will be determined. To obtain limits of detection as low as possible, the samples are concentrated prior to analysis using solid phase extraction (SPE). Different SPE cartridges will be tested during method development. Initially, this SPE sample pre-treatment is performed off-line; in a subsequent step, online SPE is optimized for this purpose. The analytical procedure will be subject to an in-house validation study, thereby determining recovery, repeatability (% RSD), limits of detection and limits of quantification. Finally, the developed methodology will be applied for monitoring the occurrence of veterinary pharmaceuticals in surface water and groundwater in Flanders. These water samples will be taken in areas characterized by intensive cattle breeding. Moreover, the samples will be collected during springtime.
In this season, farmers apply manure, stored during winter, onto the fields.
Figueiredo, L; Erny, G L; Santos, L; Alves, A
2016-01-01
Personal-care products (PCPs) involve a variety of chemicals whose persistence, along with their constant release into the environment, has raised concerns about their potential impact on wildlife and human health. Regarded as emerging contaminants, PCPs have demonstrated estrogenic activity, leading to the need for new methodologies to detect and remove these compounds from the environment. Molecular imprinting starts with a complex between a template molecule and a functional monomer, which is then polymerized in the presence of a cross-linker. After template removal, the polymer contains specific cavities. Owing to their good selectivity towards the template, molecularly imprinted polymers (MIPs) have been investigated as efficient materials for the analysis and extraction of so-called emerging contaminants. Rather than lowering the limits of detection, the key theoretical advantage of MIPs over existing methodologies is the potential to target specific chemicals. This unique feature, sometimes termed specificity (as a synonym for very high selectivity), allows the use of cheap, simple and/or rapid quantitative techniques such as fast separation with ultra-violet (UV) detection, sensors or even spectrometric techniques. When a high degree of selectivity is achieved, samples extracted with MIPs can be directly analyzed without the need for a separation step. However, while some papers clearly demonstrate the specificity of their MIP toward the targeted PCP, such proof is often lacking, especially with real matrices, making it difficult to assess the success of the different approaches. This review paper focuses on the latest developments of MIPs for the analysis of personal care products in the environment, with particular emphasis on the design, preparation and practical applications of MIPs. Copyright © 2015 Elsevier B.V. All rights reserved.
Advancements in nano-enabled therapeutics for neuroHIV management.
Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan
This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neuro-human immunodeficiency virus (HIV), strategies for on-demand site-specific release of antiretroviral therapy, novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs, and novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, the investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance for eradicating and monitoring neuro-acquired immunodeficiency syndrome. Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and smart HIV-monitoring analytical systems for disease management.
Artificial intelligence for multi-mission planetary operations
NASA Technical Reports Server (NTRS)
Atkinson, David J.; Lawson, Denise L.; James, Mark L.
1990-01-01
A brief introduction is given to an automated system called the Spacecraft Health Automated Reasoning Prototype (SHARP). SHARP is designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real-time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the prototype in a real-time operations setting during the Voyager encounter with Neptune in August 1989. The preliminary results of the SHARP project and plans for future application of the technology are discussed.
Urban Detection, Delimitation and Morphology: Comparative Analysis of Selective "MEGACITIES"
NASA Astrophysics Data System (ADS)
Alhaddad, B.; Arellano, B. E.; Roca, J.
2012-08-01
Over the last 50 years, the world has faced an impressive growth of urban population. The walled city, closed to the outside, an "island" of economic activities and population density within the rural land, has given way to the spread of urban life and urban networks across almost all of the territory. There was, as Margalef (1999) put it, "a topological inversion of the landscape". The "urban" has gone from being an island in the ocean of rural land vastness to representing the totality of the space in which natural and rural "systems" are inserted. New phenomena, such as the fall of the Fordist model of production, the spread of urbanization known as urban sprawl, and the change of scale of the metropolis, covering increasingly large regions called "megalopolises" (Gottmann, 1961), have characterized the century. However, there are no rigorous databases capable of measuring and evaluating the phenomenon of megacities and, in general, the process of urbanization in the contemporary world. The aim of this paper is to detect, identify and analyze the morphology of megacities through remote sensing instruments as well as various indicators of landscape. To understand the structure of these heterogeneous landscapes called megacities, land consumption and spatial complexity need to be quantified accurately. Remote sensing might be helpful in evaluating how the different land covers shape urban megaregions. Morphological landscape analysis allows establishing the analogies and differences between patterns of cities and studying the symmetry, growth direction, linearity, complexity and compactness of the urban form. The main objective of this paper is to develop a new methodology to detect the urbanized land of some megacities around the world (Tokyo, Mexico City, Chicago, New York, London, Moscow, Sao Paulo and Shanghai) using Landsat 7 images.
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods are inadequate for use within the framework of a construct methodology. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that, when applied, will probably resolve some of these problems are described in detail.
Critical Inquiry for the Social Good: Methodological Work as a Means for Truth-Telling in Education
ERIC Educational Resources Information Center
Kuntz, Aaron M.; Pickup, Austin
2016-01-01
This article questions the ubiquity of the term "critical" in methodological scholarship, calling for a renewed association of the term with projects concerned with social justice, truth-telling, and overt articulations of the social good. Drawing on Michel Foucault's work with parrhesia (or truth-telling) and Aristotle's articulation of…
ERIC Educational Resources Information Center
Torres Valdés, Rosa María; Santa Soriano, Alba; Lorenzo Álvarez, Carolina
2018-01-01
This paper presents the findings of a training programme based on an Action-Research methodology that has been applied in two subjects of Event Organization, Protocol, and Institutional Relations undergraduate and Master's degrees. Through a teaching methodology called "learning by doing," students are encouraged to understand,…
ERIC Educational Resources Information Center
Stassart, Pierre Marie; Mathieu, Valerie; Melard, Francois
2011-01-01
This paper proposes a new way for sociology, through both methodology and theory, to understand the reality of social groups and their "minority practices." It is based on an experiment that concerns a very specific category of agriculturalists called "pluriactive" stock farmers. These stock farmers, who engage in raising livestock part-time…
Learning as Leaving Home: Fear, Empathy, and Hospitality in the Theology and Religion Classroom
ERIC Educational Resources Information Center
Fleming, Daniel; Lovat, Terence
2015-01-01
The article is a response to this journal's call for papers on metaphors for teaching, and also draws from a previous publication in which Kent Eilers developed a methodology for teaching global theologies. In this methodology, the ultimate goal was the development of "hermeneutical dispositions of empathy, hospitality, and receptivity toward…
Learning to Support Learning Together: An Experience with the Soft Systems Methodology
ERIC Educational Resources Information Center
Sanchez, Adolfo; Mejia, Andres
2008-01-01
An action research approach called soft systems methodology (SSM) was used to foster organisational learning in a school regarding the role of the learning support department within the school and its relation with the normal teaching-learning activities. From an initial situation of lack of coordination as well as mutual misunderstanding and…
ERIC Educational Resources Information Center
Bonometti, Patrizia
2012-01-01
Purpose: The aim of this contribution is to describe a new complexity-science-based approach for improving safety, quality and efficiency and the way it was implemented by TenarisDalmine. Design/methodology/approach: This methodology is called "a safety-building community". It consists of a safety-behaviour social self-construction…
ERIC Educational Resources Information Center
Quigley, Cassie F.; Che, S. Megan; Achieng, Stella; Liaram, Sarah
2017-01-01
Environmental education research (EER) rarely includes women's perspectives. This means that in environmental education research, an entire knowledge source is largely ignored. This study employed a methodology called Participatory Rural Appraisal, a methodology new to the field of EER, with Kenyan teachers from the Maasai Mara region to understand…
Re-Aligning Research into Teacher Education for CALL and Bringing It into the Mainstream
ERIC Educational Resources Information Center
Motteram, Gary
2014-01-01
This paper explores three research projects conducted by the writer and others with a view to demonstrating the importance of effective theory and methodology in the analysis of teaching situations where Computer Assisted Language Learning (CALL), teacher practice and teacher education meet. It argues that there is a tendency in the field of…
Overview of Current Activities in Combustion Instability
2015-10-02
…and avoid liquid rocket engine combustion stability problems. Approach: 1) develop a SOA combustion stability software package called Stable… In phase II, CSTD will invest in Multifidelity Tools and Methodologies and develop a SOA combustion stability software package called Stable Combustion…
ERIC Educational Resources Information Center
Collins, Donald R.
2011-01-01
This book outlines a methodology for viewing multiple generations of African Americans, specifically those who were called or called themselves Negro, Colored, Black, or African American (NCBAA). Within this framework, African Americans of varying ages describe their lives and educational experiences, allowing researchers to address a variety of…
MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems
2012-08-01
Next, we derive five requirements (called Loc Separation, Implicit Parameter Separation, Error Signaling Separation, Conf Separation, and Next Call…) …hypervisors and hardware) and a real cloud (with shared hypervisors and hardware) that satisfies these requirements. Finally, we study Loc Separation…
NASA Astrophysics Data System (ADS)
Helble, Tyler Adam
Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. Automated methods are needed to aid in the analyses of the recorded data. When a mammal vocalizes in the marine environment, the received signal is a filtered version of the original waveform emitted by the marine mammal. The waveform is reduced in amplitude and distorted due to propagation effects that are influenced by the bathymetry and environment. It is important to account for these effects to determine a site-specific probability of detection for marine mammal calls in a given study area. A knowledge of that probability function over a range of environmental and ocean noise conditions allows vocalization statistics from recordings of single, fixed, omnidirectional sensors to be compared across sensors, and at the same sensor over time, with less bias and uncertainty in the results than direct comparison of the raw statistics. This dissertation focuses both on the development of new tools needed to automatically detect humpback whale vocalizations from single, fixed, omnidirectional sensors and on the determination of the site-specific probability of detection for monitoring sites off the coast of California. Using these tools, detected humpback calls are "calibrated" for environmental properties using the site-specific probability of detection values, and presented as call densities (calls per square kilometer per time). A two-year monitoring effort using these calibrated call densities reveals important biological and ecological information on migrating humpback whales off the coast of California. Call density trends are compared between the monitoring sites and at the same monitoring site over time. Call densities also are compared to several natural and human-influenced variables including season, time of day, lunar illumination, and ocean noise.
The results reveal substantial differences in call densities between the two sites that were not noticeable using uncorrected (raw) call counts. Additionally, a Lombard effect was observed for humpback whale vocalizations in response to increasing ocean noise. The techniques developed in this thesis thus allow marine mammal abundances to be measured accurately from passive acoustic sensors.
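The "calibration" step described above amounts to normalizing raw call counts by the site-specific probability of detection, the monitored area, and the recording time. A minimal sketch, with hypothetical numbers:

```python
def call_density(n_detected, p_detect, area_km2, duration_days):
    """Calls per km^2 per day, corrected by the site-specific P(detection)."""
    return n_detected / (p_detect * area_km2 * duration_days)

# Hypothetical sites: identical raw counts, different detection probabilities.
raw_counts = 1200
site_a = call_density(raw_counts, p_detect=0.8, area_km2=400.0, duration_days=30.0)
site_b = call_density(raw_counts, p_detect=0.4, area_km2=400.0, duration_days=30.0)
# Raw counts suggest equal activity; corrected densities show site B is twice as active.
```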
Calibration methodology for proportional counters applied to yield measurements of a neutron burst.
Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo
2014-01-01
This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. The methodology is to be applied when single neutron events cannot be resolved in time by standard nuclear electronics, or when a continuous current cannot be measured at the output of the counter. It is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. The implementation of the methodology for the measurement of fast neutron yields generated in plasma focus experiments, using a moderated proportional counter, is discussed herein. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
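The core of the statistical model can be sketched as follows: calibrate the mean single-event charge in pulse mode, then divide the accumulated burst charge by that mean, with an uncertainty that combines counting statistics and the single-event charge spread. The gamma-distributed charges and the uncertainty formula below are illustrative assumptions, not the paper's full model.

```python
import numpy as np

def calibrate_single_event_charge(pulse_charges):
    """Pulse-mode calibration: mean and relative spread of single-event charge."""
    q_mean = float(np.mean(pulse_charges))
    cv = float(np.std(pulse_charges) / q_mean)
    return q_mean, cv

def events_from_burst_charge(q_burst, q_mean, cv):
    """Estimate the number of detected events and its relative uncertainty."""
    n_hat = q_burst / q_mean
    rel_unc = np.sqrt((1.0 + cv ** 2) / n_hat)   # counting + charge-spread terms
    return n_hat, rel_unc

rng = np.random.default_rng(1)
pulses = rng.gamma(shape=4.0, scale=0.5, size=2000)            # calibration pulses
q_mean, cv = calibrate_single_event_charge(pulses)

true_n = 5000                                                  # events in the burst
q_burst = rng.gamma(shape=4.0, scale=0.5, size=true_n).sum()   # piled-up total charge
n_hat, rel_unc = events_from_burst_charge(q_burst, q_mean, cv)
```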
Paradigms, pragmatism and possibilities: mixed-methods research in speech and language therapy.
Glogowska, Margaret
2011-01-01
After decades of the so-called 'paradigm wars' in social science research methodology and the controversy about the relative place and value of quantitative and qualitative research methodologies, 'paradigm peace' appears now to have been declared. This has come about as many researchers have begun to take a 'pragmatic' approach to the selection of research methodology, choosing the methodology best suited to answering the research question rather than conforming to a methodological orthodoxy. With the differences in the philosophical underpinnings of the two traditions set to one side, an increasing awareness, and valuing, of the 'mixed-methods' approach to research is now present in the fields of social, educational and health research. This paper explores what is meant by mixed-methods research and the ways in which quantitative and qualitative methodologies and methods can be combined and integrated, particularly in the broad field of health services research and the narrower one of speech and language therapy. It discusses the ways in which methodological approaches have already been combined and integrated in health services research and speech and language therapy, highlighting the suitability of mixed-methods research for answering the typically multifaceted questions arising from the provision of complex interventions. The challenges of combining and integrating quantitative and qualitative methods and the barriers to the adoption of mixed-methods approaches are also considered. The questions about healthcare as it is provided in the 21st century call for a range of methodological approaches. This is particularly the case for human communication and its disorders, where mixed-methods research offers a wealth of possibilities. In turn, speech and language therapy research should be able to contribute substantively to the future development of mixed-methods research. © 2010 Royal College of Speech & Language Therapists.
Hydrological Retrospective of floods and droughts: Case study in the Amazon
NASA Astrophysics Data System (ADS)
Wongchuig Correa, Sly; Cauduro Dias de Paiva, Rodrigo; Carlo Espinoza Villar, Jhan; Collischonn, Walter
2017-04-01
Recent studies have reported an increase in the intensity and frequency of hydrological extreme events in many regions of the Amazon basin over recent decades; events such as seasonal floods and droughts have had significant impacts on human and natural systems. Methodologies such as climatic reanalysis are being developed in order to create a coherent register of climatic systems. Following this notion, this research aims to produce a methodology called Hydrological Retrospective (HR), which essentially runs large rainfall datasets through hydrological models in order to develop a record of past hydrology, enabling the analysis of past floods and droughts. We developed our methodology for the Amazon basin, using eight large precipitation datasets (more than 30 years) as input to a large-scale hydrological and hydrodynamic model (MGB-IPH). The HR products were then validated against several in situ discharge gauges dispersed throughout the Amazon basin, with a focus on maximum and minimum events. For the best-performing HR products according to performance metrics, we assessed the skill of the HR in detecting floods and droughts relative to in situ observations. Furthermore, a statistical trend analysis of the time series was performed for the intensity of seasonal floods and droughts in the whole Amazon basin. Results indicate that the best HR products represented well most past extreme events registered by in situ observations and were also consistent with many events cited in the literature. We therefore consider it viable to use large precipitation datasets, mainly climatic reanalyses based on the land surface component and merged products, to represent past regional hydrology and seasonal hydrological extreme events.
On the other hand, an increasing trend in intensity was observed for maximum annual discharges (related to floods) in north-western regions and for minimum annual discharges (related to droughts) in central-south regions of the Amazon basin; these features were previously detected by other researchers. For the whole basin, we estimated an upward trend in maximum annual discharges of the Amazon River. To better estimate future hydrological behavior and its impacts on society, HR could be used as a methodology to understand the occurrence of past extreme events in many places, considering the global coverage of rainfall datasets.
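A trend analysis of annual discharge series like the one described above is often done with the nonparametric Mann-Kendall test; the abstract does not name the test used, so the sketch below (with synthetic discharge data) is only one plausible realization.

```python
import numpy as np

def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive for an increasing trend."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sum(np.sign(x[i + 1:] - x[i]))
    return s

def mann_kendall_z(series):
    """Normal approximation of the test statistic (no tie correction)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = mann_kendall_s(x)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

# Synthetic annual maximum discharges (m^3/s) with an imposed upward trend.
rng = np.random.default_rng(2)
years = np.arange(1980, 2015)
q_max = 50000 + 300 * (years - 1980) + rng.normal(0, 2000, size=len(years))
z = mann_kendall_z(q_max)   # |z| > 1.96 rejects "no trend" at the 5% level
```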
Three tenets for secure cyber-physical system design and assessment
NASA Astrophysics Data System (ADS)
Hughes, Jeff; Cybenko, George
2014-06-01
This paper presents a threat-driven quantitative mathematical framework for secure cyber-physical system design and assessment. Called The Three Tenets, this originally empirical approach has been used by the US Air Force Research Laboratory (AFRL) for secure system research and development. The Tenets were first documented in 2005 as a teachable methodology. They are motivated by a system threat model consisting of three elements, all of which must exist for a successful attack to occur: system susceptibility, threat accessibility, and threat capability. The Three Tenets arise naturally by countering each threat element individually. Specifically, the tenets are: Tenet 1: Focus on What's Critical - systems should include only essential functions (to reduce susceptibility); Tenet 2: Move Key Assets Out-of-Band - make mission-essential elements and security controls difficult for attackers to reach logically and physically (to reduce accessibility); Tenet 3: Detect, React, Adapt - confound the attacker by implementing sensing system elements with dynamic response technologies (to counteract the attackers' capabilities). As a design methodology, the Tenets mitigate reverse engineering and subsequent attacks on complex systems. Quantified by a Bayesian analysis and further justified by analytic properties of attack graph models, the Tenets suggest concrete cyber security metrics for system assessment.
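The threat model lends itself to a simple probabilistic reading: under an independence assumption, an attack succeeds only when all three elements are present, so countering any single element scales down the product. The probabilities below are hypothetical, not from the paper's Bayesian analysis.

```python
def attack_success_prob(p_susceptible, p_accessible, p_capable):
    """Under an independence assumption, all three threat elements must hold."""
    return p_susceptible * p_accessible * p_capable

# Hypothetical element probabilities for one attack scenario.
baseline = attack_success_prob(0.9, 0.8, 0.7)
# Tenet 2 (move key assets out-of-band) halves accessibility in this scenario,
# which halves the overall success probability.
after_tenet_2 = attack_success_prob(0.9, 0.4, 0.7)
```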
'Seed + expand': a general methodology for detecting publication oeuvres of individual researchers.
Reijnhoudt, Linda; Costas, Rodrigo; Noyons, Ed; Börner, Katy; Scharnhorst, Andrea
2014-01-01
The study of science at the individual scholar level requires the disambiguation of author names. The creation of authors' publication oeuvres involves matching a list of unique author names to the names used in publication databases. Despite recent progress in the development of unique author identifiers, e.g., ORCID, VIVO, or DAI, author disambiguation remains a key problem when it comes to large-scale bibliometric analysis using data from multiple databases. This study introduces and tests a new methodology called 'seed + expand' for semi-automatic bibliographic data collection for a given set of individual authors. Specifically, we identify the oeuvres of a set of Dutch full professors during the period 1980-2011. In particular, we combine author records from the Dutch National Research Information System (NARCIS) with publication records from the Web of Science. Starting with an initial list of 8,378 names, we identify 'seed publications' for each author using five different approaches. Subsequently, we 'expand' the set of publications using three different approaches. The different approaches are compared and the resulting oeuvres are evaluated on precision and recall using a 'gold standard' dataset of authors for which verified publications in the period 2001-2010 are available.
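A toy version of the seed + expand idea: seed publications are matched strictly (here, exact name plus a verified affiliation), and the set is then expanded to publications sharing at least two author names with the seed set. The matching rules and records below are invented for illustration; the paper's five seeding and three expansion approaches are more elaborate.

```python
def seed(publications, author_name, affiliation):
    """Strict seeding: exact author name plus a verified affiliation."""
    return {p["id"] for p in publications
            if author_name in p["authors"] and affiliation in p["affils"]}

def expand(publications, seed_ids, author_name):
    """Add publications sharing >= 2 author names with the seed set's authors."""
    by_id = {p["id"]: p for p in publications}
    coauthors = set()
    for pid in seed_ids:
        coauthors |= set(by_id[pid]["authors"])
    out = set(seed_ids)
    for p in publications:
        if author_name in p["authors"] and len(set(p["authors"]) & coauthors) >= 2:
            out.add(p["id"])
    return out

pubs = [
    {"id": 1, "authors": ["J. Smith", "A. Lee"], "affils": ["Leiden"]},
    {"id": 2, "authors": ["J. Smith", "A. Lee", "B. Kim"], "affils": ["Unknown"]},
    {"id": 3, "authors": ["J. Smith", "C. Park"], "affils": ["Unknown"]},  # likely a different J. Smith
]
seed_ids = seed(pubs, "J. Smith", "Leiden")        # only the verified record
oeuvre = expand(pubs, seed_ids, "J. Smith")        # pulls in the shared-co-author paper
```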
Rizzi, Aurora; Raddadi, Noura; Sorlini, Claudia; Nordgård, Lise; Nielsen, Kaare Magne; Daffonchio, Daniele
2012-01-01
The fate of dietary DNA in the gastrointestinal tract (GIT) of animals has gained renewed interest after the commercial introduction of genetically modified organisms (GMO). Among the concerns regarding GM food are the possible consequences of horizontal gene transfer (HGT) of recombinant dietary DNA to bacteria or animal cells. The exposure of the GIT to dietary DNA is related to the extent of food processing, food composition, and the level of intake. Animal feeding studies have demonstrated that a minor amount of fragmented dietary DNA may resist the digestive process. Mammals have been shown to take up dietary DNA from the GIT, but stable integration and expression of internalized DNA have not been demonstrated. Despite the ability of several bacterial species to acquire external DNA by natural transformation, in vivo transfer of dietary DNA to bacteria in the intestine has not been detected in the few experimental studies conducted so far. However, major methodological limitations and knowledge gaps in the mechanistic aspects of HGT call for methodological improvements and further studies to understand the fate of various types of dietary DNA in the GIT.
Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems
2016-06-28
harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non...Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address...these problems, we have proposed a new filtering methodology called belief condensation (BC) DISTRIBUTION A: Distribution approved for public release
ERIC Educational Resources Information Center
Bierema, Andrea M.-K.; Schwartz, Renee S.; Gill, Sharon A.
2017-01-01
Recent calls for reform in education recommend science curricula to be based on central ideas instead of a larger number of topics and for alignment between current scientific research and curricula. Because alignment is rarely studied, especially for central ideas, we developed a methodology to discover the extent of alignment between primary…
ERIC Educational Resources Information Center
Lavoie, Constance; Benson, Carol
2011-01-01
This paper illustrates how a methodological tool called "drawing-voice" can be used to demonstrate qualitatively what statistical and policy data are not able to reveal regarding the educational realities of Hmong minority communities in northern Vietnam, particularly with regard to the role of local language and culture in school. This…
NASA Astrophysics Data System (ADS)
Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.
2012-03-01
Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as smooth contours centered at peaks of stored thermal energy, termed Regions of Interest (ROIs). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks that deals with the difficulties due to non-uniform heating. Non-uniform heating affects low spatial frequencies and hinders the detection of relevant points in the image. The proposed methodology, based on multi-resolution analysis, is robust to low ROI contrast and non-uniform heating, and includes local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; we use a Gaussian window because the thermal behavior is well modeled by Gaussian smooth contours. The Gaussian scale is used to analyze details in the image via multi-resolution analysis, mitigating low contrast and non-uniform heating and avoiding the need to select the Gaussian window size. Finally, local edge detection provides a good estimate of the boundaries of the ROI. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that performs comparably to or better than other dedicated algorithms proposed in the state of the art.
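The pipeline described (local correlation with a Gaussian window, coarse-scale suppression of non-uniform heating, peak picking) can be sketched on a synthetic thermogram. The window sizes, scales, and test image below are assumptions for illustration, not the paper's parameters.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Normalized 2-D Gaussian window."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def local_correlation(image, sigma, size=15):
    """Correlate the image with a Gaussian window (smooth-contour ROI model)."""
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + size, j:j + size] * k)
    return out

# Synthetic thermogram: two warm ROIs over a non-uniform heating gradient.
h, w = 60, 60
yy, xx = np.mgrid[0:h, 0:w]
background = 0.002 * xx                       # slowly varying non-uniform heating
roi = (np.exp(-((yy - 20) ** 2 + (xx - 15) ** 2) / 20.0)
       + np.exp(-((yy - 40) ** 2 + (xx - 45) ** 2) / 20.0))
image = background + roi

# Multi-resolution idea in miniature: subtract a coarse-scale version to
# suppress low-frequency heating, then correlate at the ROI scale.
coarse = local_correlation(image, sigma=12.0, size=31)
detail = image - coarse
response = local_correlation(detail, sigma=2.0)
peak = np.unravel_index(np.argmax(response), response.shape)   # strongest ROI center
```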
ERIC Educational Resources Information Center
Barros, Jessica M.
2012-01-01
"Koladeras" are women who use call and response in impromptu songs that may contain proverbs, stories about the community, their life experiences, and who and what they see in their world from their own perspective. Via qualitative methods of (auto)ethnography, personal and life story narratives, and interviews, I look at how…
Probabilistic assessment methodology for continuous-type petroleum accumulations
Crovelli, R.A.
2003-01-01
The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. ?? 2003 Elsevier B.V. All rights reserved.
Stafford, Kathleen M; Mellinger, David K; Moore, Sue E; Fox, Christopher G
2007-12-01
Five species of large whales, including the blue (Balaenoptera musculus), fin (B. physalus), sei (B. borealis), humpback (Megaptera novaeangliae), and North Pacific right (Eubalaena japonica), were the target of commercial harvests in the Gulf of Alaska (GoA) during the 19th through mid-20th centuries. Since then, there have been a few summertime visual surveys for these species, but no overview of year-round use of these waters by endangered whales, primarily because standard visual surveys are difficult and costly. From October 1999 to May 2002, moored hydrophones were deployed at six locations in the GoA to record whale calls. Reception of calls from fin, humpback, and blue whales, and from an unknown source called Watkins' whale, showed seasonal and geographic variation. Calls were detected more often during the winter than during the summer, suggesting that animals inhabit the GoA year-round. To estimate the distance at which species-diagnostic calls could be heard, parabolic-equation propagation-loss models were run for frequencies characteristic of each call type. Maximum detection ranges in the subarctic North Pacific ranged from 45 to 250 km among the three species (fin, humpback, blue), although modeled detection ranges varied greatly with input parameters and choice of ambient noise level.
Selected Aspects of the eCall Emergency Notification System
NASA Astrophysics Data System (ADS)
Kaminski, Tomasz; Nowacki, Gabriel; Mitraszewska, Izabella; Niezgoda, Michał; Kruszewski, Mikołaj; Kaminska, Ewa; Filipek, Przemysław
2012-02-01
The article describes problems associated with road collision detection for the purpose of automatic emergency calls. At the moment a collision is detected, the eCall device installed in the vehicle automatically contacts the Emergency Notification Centre and sends a set of essential information about the vehicle and the place of the accident. Information about airbag deployment will not be used to activate the alarm, because connecting the eCall device might interfere with the vehicle's safety systems. It is therefore necessary to develop a method for detecting road collisions, similar to the one used in airbag systems and based on the signals available from acceleration sensors.
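A collision-detection rule of the kind described, based on a sustained acceleration threshold, might be sketched as follows. The threshold value and the debouncing sample count are illustrative assumptions, not values from the article:

```python
def detect_collision(accel_g, threshold_g=8.0, min_samples=3):
    """Flag a collision when |acceleration| exceeds threshold_g for at
    least min_samples consecutive samples, debouncing against brief
    shocks such as potholes. Values are in units of g (assumed)."""
    run = 0
    for a in accel_g:
        if abs(a) >= threshold_g:
            run += 1
            if run >= min_samples:
                return True
        else:
            run = 0
    return False

# A sustained deceleration spike triggers the alarm; isolated spikes do not.
crash = detect_collision([0.1, 0.2, 9.0, 10.5, 12.0, 0.3])
```

A production system would of course fuse several sensors and axes, as airbag controllers do; this sketch only illustrates the thresholding idea.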
Helble, Tyler A; D'Spain, Gerald L; Campbell, Greg S; Hildebrand, John A
2013-11-01
This paper demonstrates the importance of accounting for environmental effects on passive underwater acoustic monitoring results. The situation considered is the reduction in shipping off the California coast between 2008-2010 due to the recession and environmental legislation. The resulting variations in ocean noise change the probability of detecting marine mammal vocalizations. An acoustic model was used to calculate the time-varying probability of detecting humpback whale vocalizations under best-guess environmental conditions and varying noise. The uncorrected call counts suggest a diel pattern and an increase in calling over a two-year period; the corrected call counts show minimal evidence of these features.
Ramos, Enrique; Levinson, Benjamin T; Chasnoff, Sara; Hughes, Andrew; Young, Andrew L; Thornton, Katherine; Li, Allie; Vallania, Francesco L M; Province, Michael; Druley, Todd E
2012-12-06
Rare genetic variation in the human population is a major source of pathophysiological variability and has been implicated in a host of complex phenotypes and diseases. Finding disease-related genes harboring disparate functional rare variants requires sequencing many individuals across many genomic regions and comparing against unaffected cohorts. However, despite persistent declines in sequencing costs, population-based rare variant detection across large genomic target regions remains cost prohibitive for most investigators. In addition, DNA samples are often precious, and hybridization methods typically require large amounts of input DNA. Pooled-sample DNA sequencing is a cost- and time-efficient strategy for surveying populations of individuals for rare variants. We set out to 1) create a scalable, multiplexing method for custom capture, with or without individual DNA indexing, that was amenable to low amounts of input DNA, and 2) expand the functionality of the SPLINTER algorithm for calling substitutions, insertions, and deletions across either candidate genes or the entire exome by integrating the variant-calling algorithm with the dynamic programming aligner Novoalign. We report a methodology for pooled hybridization capture with pre-enrichment, indexed multiplexing of up to 48 individuals or non-indexed pooled sequencing of up to 92 individuals, with as little as 70 ng of DNA per person. Modified solid-phase reversible immobilization bead purification strategies enable no sample transfers from sonication in 96-well plates through adapter ligation, resulting in 50% less library preparation reagent consumption. Custom Y-shaped adapters containing novel 7 base pair index sequences with a Hamming distance of ≥2 were directly ligated onto fragmented source DNA, eliminating the need for PCR to incorporate indexes; this was followed by a custom blocking strategy using a single oligonucleotide regardless of index sequence.
These results were obtained by aligning raw reads against the entire genome using Novoalign, followed by variant calling of non-indexed pools using SPLINTER, or SAMtools for indexed samples. With these pipelines, we find sensitivity and specificity of 99.4% and 99.7% for pooled exome sequencing. Sensitivity, and to a lesser degree specificity, proved to be a function of coverage. For rare variants (≤2% minor allele frequency), we achieved sensitivity and specificity of ≥94.9% and ≥99.99% for custom capture of 2.5 Mb in multiplexed libraries of 22-48 individuals with only ≥5-fold coverage per chromosome, but these parameters improved to ≥98.7% and 100% with 20-fold coverage per chromosome. This highly scalable methodology enables accurate rare variant detection, with or without individual DNA sample indexing, while reducing the amount of required source DNA and total costs through lower hybridization reagent consumption, multi-sample sonication in a standard PCR plate, multiplexed pre-enrichment pooling with a single hybridization, and the lower sequencing coverage required to obtain high sensitivity.
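The index-design constraint above (7-bp indexes with pairwise Hamming distance ≥2, so that any single sequencing error cannot turn one index into another) can be illustrated with a short sketch. The greedy construction below is illustrative only, not the authors' actual design procedure:

```python
from itertools import product

def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def pick_indexes(n, length=7, min_dist=2):
    """Greedily select up to n DNA index sequences of the given length
    whose pairwise Hamming distance is >= min_dist."""
    chosen = []
    for cand in product("ACGT", repeat=length):
        cand = "".join(cand)
        if all(hamming(cand, c) >= min_dist for c in chosen):
            chosen.append(cand)
            if len(chosen) == n:
                break
    return chosen

# 48 indexes, matching the largest indexed multiplex reported.
idx = pick_indexes(48)
```

Any code with minimum distance 2 detects (though cannot correct) single-base errors; a real design would additionally screen for GC balance and homopolymers.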
NASA Astrophysics Data System (ADS)
Ampatzidis, Dimitrios; König, Rolf; Glaser, Susanne; Heinkelmann, Robert; Schuh, Harald; Flechtner, Frank; Nilsson, Tobias
2016-04-01
The aim of our study is to assess the classical Helmert similarity transformation using Velocity Decomposition Analysis (VEDA). VEDA is a new methodology developed by GFZ for assessing the temporal variation of reference frames. It is based on separating the velocities into two parts: the first is related to the choice of reference system (the so-called datum effect), and the second refers to the real deformation of the terrestrial points. The advantage of VEDA is its ability to detect relative biases and reference-system effects between two different frames or between two different realizations of the same frame. We apply VEDA to assess several modern tectonic plate models against recent global terrestrial reference frames.
Silano, Marco; Silano, Vittorio
2017-07-03
A priority of the European Union is the control of risks possibly associated with chemical contaminants in food and undesirable substances in feed. Following an initial chapter describing the main contaminants detected in food and undesirable substances in feed in the EU, their main sources, and the factors that affect their occurrence, the present review focuses on the "continuous call for data" procedure, a very effective system in place at EFSA that makes possible the exposure assessment of specific contaminants and undesirable substances. Risk assessment of contaminants in food and undesirable substances in feed is currently carried out in the European Union by the CONTAM Panel of EFSA, according to well-defined methodologies and in collaboration with competent international organizations and with Member States.
Development of the first-mention bias*
HARTSHORNE, JOSHUA K.; NAPPA, REBECCA; SNEDEKER, JESSE
2015-01-01
In many contexts, pronouns are interpreted as referring to the character mentioned first in the previous sentence, an effect called the ‘first-mention bias’. While adults can rapidly use the first-mention bias to guide pronoun interpretation, it is unclear when this bias emerges during development. Curiously, experiments with children between two and three years old show successful use of order of mention, while experiments with older children (four to five years old) do not. While this could suggest U-shaped development, it could also reflect differences in the methodologies employed. We show that children can indeed use first-mention information, but do so too slowly to have been detected in previous work reporting null results. Comparison across the present and previously published studies suggests that the speed at which children deploy first-mention information increases greatly during the preschool years. PMID:24735525
Avian predators are less abundant during periodical cicada emergences, but why?
Koenig, Walter D; Ries, Leslie; Olsen, V Beth K; Liebhold, Andrew M
2011-03-01
Despite a substantial resource pulse, numerous avian insectivores known to depredate periodical cicadas (Magicicada spp.) are detected less commonly during emergence years than in either the previous or following years. We used data on periodical cicada calls collected by volunteers conducting North American Breeding Bird Surveys within the range of cicada Brood X to test three hypotheses for this observation: lower detection rates could be caused by bird calls being obscured by cicada calls ("detectability" hypothesis), by birds avoiding areas with cicadas ("repel" hypothesis), or because bird abundances are generally lower during emergence years for some reason unrelated to the current emergence event ("true decline" hypothesis). We tested these hypotheses by comparing bird detections at stations coincident with calling cicadas vs. those without calling cicadas in the year prior to and during cicada emergences. At four distinct levels (stop, route, range, and season), parallel declines of birds in groups exposed and not exposed to cicada calls supported the true decline hypothesis. We discuss several potential mechanisms for this pattern, including the possibility that it is a consequence of the ecological and evolutionary interactions between predators of this extraordinary group of insects.
Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki
2014-01-01
Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
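One common way to realize "distances between cases in multi-dimensional space" for outlier detection is the Mahalanobis distance, which accounts for the correlation between financial indices. The sketch below is an illustrative reading, with invented data, not the authors' exact formulation:

```python
import numpy as np

def mahalanobis_outliers(X, threshold=3.0):
    """Flag rows of X (cases x financial indices) whose Mahalanobis
    distance from the sample mean exceeds the threshold."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    inv = np.linalg.inv(cov)
    d = np.sqrt(np.einsum("ij,jk,ik->i", X - mu, inv, X - mu))
    return d > threshold

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # 100 hypothetical cases, 3 indices
X[0] = [10.0, 10.0, 10.0]       # one artificial outlier case
flags = mahalanobis_outliers(X)
```

A robust variant would estimate the mean and covariance without the suspected outliers (as case-model approaches do), since extreme cases inflate both.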
Using State Estimation Residuals to Detect Abnormal SCADA Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Jian; Chen, Yousu; Huang, Zhenyu
2010-04-30
Detection of abnormal supervisory control and data acquisition (SCADA) data is critically important for safe and secure operation of modern power systems. In this paper, a methodology for abnormal SCADA data detection based on state estimation residuals is presented. After a brief overview of outlier detection methods and bad SCADA data detection for state estimation, the framework of the proposed methodology is described. Instead of using the original SCADA measurements as the bad-data source, the residuals calculated from the results of the state estimator are used as the input to the outlier detection algorithm. The BACON algorithm is applied to the outlier detection task. The IEEE 118-bus system is used as a test base to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
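The 3-σ baseline against which BACON is compared is straightforward to sketch. The residual values below are simulated for illustration; the BACON algorithm itself, which iteratively grows a clean basic subset, is more involved:

```python
import numpy as np

def three_sigma_outliers(residuals):
    """Classic 3-sigma rule: flag residuals farther than three standard
    deviations from the mean of the residual vector."""
    r = np.asarray(residuals, dtype=float)
    mu, sigma = r.mean(), r.std()
    return np.abs(r - mu) > 3.0 * sigma

# Simulated state-estimation residuals: 200 normal measurements plus
# one grossly bad data point.
rng = np.random.default_rng(1)
res = np.concatenate([rng.normal(0.0, 0.01, 200), [0.5]])
flags = three_sigma_outliers(res)
```

A known weakness of the 3-σ rule, which motivates methods like BACON, is masking: several large outliers inflate the estimated σ and can hide each other.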
NASA Astrophysics Data System (ADS)
Nakatsuji, Hiroshi
Chemistry is a science of the complex subjects that occupy this universe and the biological world and that are composed of atoms and molecules. Its essence is diversity. Surprisingly, however, the whole of this science is governed by simple quantum principles such as the Schrödinger and Dirac equations. Therefore, if we can find a generally useful method of solving these equations accurately, at reasonable speed, and under the fermionic and/or bosonic constraints, we can replace the somewhat empirical methodologies of this science with purely quantum-theoretical and computational logic. This is the purpose of our series of studies, called "exact theory" in our laboratory; some of our papers are cited below. The key idea is the free complement (FC) theory (originally called ICI theory), introduced to solve the Schrödinger and Dirac equations analytically. Extending this methodology to larger systems requires order-N methodologies, but the antisymmetry constraints on electronic wave functions become a major obstacle. Recently, we have shown that the antisymmetry rule, or 'dogma', can be greatly relaxed when the subjects are large molecular systems. In this talk, I present our recent progress in the FC methodology. The purpose is to construct "predictive quantum chemistry" that is useful for chemical and physical research and development in institutes and industry.
NASA Astrophysics Data System (ADS)
Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan
2018-03-01
False alarm rate and detection rate remain two contradictory metrics for infrared small target detection in an infrared search and track (IRST) system, despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than declaring false items as true targets. Hence, considering background clutter and detector noise as the sources of false alarms in an IRST system, this paper presents a false-alarm-aware methodology that reduces the false alarm rate while leaving the detection rate undegraded. To this end, the advantages and disadvantages of each detection algorithm are investigated and the sources of their false alarms are determined. Two target detection algorithms with independent false alarm sources are chosen such that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, multi-scale average absolute gray difference (AAGD) and Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm. After presenting a conceptual model, the desired algorithm is implemented through the most straightforward mechanism. It effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed at just four different scales, the algorithm is well suited to real-time implementation. Simulation results on real and simulated images, in terms of signal-to-clutter ratio and background suppression factor, demonstrate the effectiveness and performance of the proposed methodology. Since the desired algorithm was developed around independent false alarm sources, the proposed methodology is extendable to any pair of detection algorithms with different false alarm sources.
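One plausible reading of combining two detectors with independent false-alarm sources is an AND-fusion of their binary detection maps: if false alarms rarely coincide across the two algorithms, requiring agreement suppresses them while true targets, seen by both, survive. This is a sketch of that idea, not necessarily the paper's exact mechanism:

```python
import numpy as np

def fuse_detections(map_a, map_b):
    """Keep only pixels declared as targets by BOTH detectors.
    With independent false-alarm sources, coincident false alarms are
    rare, so AND-fusion lowers the false alarm rate."""
    return np.logical_and(map_a, map_b)

# Hypothetical binary detection maps from two algorithms: both see the
# target at (0, 1); each also has one false alarm of its own.
a = np.array([[0, 1, 0], [1, 0, 0]], dtype=bool)
b = np.array([[0, 1, 0], [0, 0, 1]], dtype=bool)
fused = fuse_detections(a, b)
```

The cost of AND-fusion is that a target missed by either detector is lost, which is why the two algorithms must be chosen so their weaknesses do not overlap.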
Halwani, Muhammad A; Turnbull, Alison E; Harris, Meredith; Witter, Frank; Perl, Trish M
2016-04-01
To assess how enhanced postdischarge telephone follow-up calls improve case finding for surgical site infection (SSI) surveillance after cesarean section, we conducted a prospective cohort study of all patients who delivered by cesarean section between April 22 and August 22, 2010. In addition to our routine surveillance using clinical databases and electronic patient records, we made follow-up calls to the patients at 7, 14, and 30 days postoperation. A standard questionnaire with questions about symptoms of SSI, health-seeking behaviors, and treatment received was administered. Descriptive statistics and univariate analysis were performed to assess the effect of the enhanced surveillance. One hundred ninety-three patients underwent cesarean section during the study period. Standard surveillance identified 14 infections, with telephone follow-up identifying an additional 5. Using the calls as a gold standard, the sensitivity of the standard methodology for capturing SSI was 73.3%. The calls lasted 1 to 5 minutes and were well received by the patients. The results suggest that follow-up telephone calls to patients after cesarean section identify 26.3% of the total SSIs. Enhanced surveillance can provide better-informed data to improve performance and avoid underestimation of rates. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
On Constructing, Grouping and Using Topical Ontology for Semantic Matching
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert
An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). This paper presents a pattern-driven modeling methodology for constructing and grouping topics in an ontology (the PAD-ON methodology), which is used to match similarities between competences in the human resource management (HRM) domain. The methodology is supported by a tool, likewise called PAD-ON. The paper demonstrates our recent results from the EC Prolix project; the approach is applied to the training processes at British Telecom as the test bed.
Wilkinson, Samuel L.; John, Shibu; Walsh, Roddy; Novotny, Tomas; Valaskova, Iveta; Gupta, Manu; Game, Laurence; Barton, Paul J R.; Cook, Stuart A.; Ware, James S.
2013-01-01
Background Molecular genetic testing is recommended for diagnosis of inherited cardiac disease, to guide prognosis and treatment, but access is often limited by cost and availability. Recently introduced high-throughput bench-top DNA sequencing platforms have the potential to overcome these limitations. Methodology/Principal Findings We evaluated two next-generation sequencing (NGS) platforms for molecular diagnostics. The protein-coding regions of six genes associated with inherited arrhythmia syndromes were amplified from 15 human samples using parallelised multiplex PCR (Access Array, Fluidigm), and sequenced on the MiSeq (Illumina) and Ion Torrent PGM (Life Technologies). Overall, 97.9% of the target was sequenced adequately for variant calling on the MiSeq, and 96.8% on the Ion Torrent PGM. Regions missed tended to be of high GC-content, and most were problematic for both platforms. Variant calling was assessed using 107 variants detected using Sanger sequencing: within adequately sequenced regions, variant calling on both platforms was highly accurate (Sensitivity: MiSeq 100%, PGM 99.1%. Positive predictive value: MiSeq 95.9%, PGM 95.5%). At the time of the study the Ion Torrent PGM had a lower capital cost and individual runs were cheaper and faster. The MiSeq had a higher capacity (requiring fewer runs), with reduced hands-on time and simpler laboratory workflows. Both provide significant cost and time savings over conventional methods, even allowing for adjunct Sanger sequencing to validate findings and sequence exons missed by NGS. Conclusions/Significance MiSeq and Ion Torrent PGM both provide accurate variant detection as part of a PCR-based molecular diagnostic workflow, and provide alternative platforms for molecular diagnosis of inherited cardiac conditions. Though there were performance differences at this throughput, platforms differed primarily in terms of cost, scalability, protocol stability and ease of use. 
Compared with current molecular genetic diagnostic tests for inherited cardiac arrhythmias, these NGS approaches are faster, less expensive, and yet more comprehensive. PMID:23861798
Van Pamel, Anton; Brett, Colin R; Lowe, Michael J S
2014-12-01
Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest and is expected to become increasingly important for next-generation electricity power plants. Conventional ultrasonic A-, B-, and C-scans have been found to suffer from strong background noise caused by grain scattering, which can severely limit the detection of defects. However, in recent years, array probes and full matrix capture (FMC) imaging algorithms have unlocked exciting possibilities for improvements. To improve and compare these algorithms, we must rely on robust methodologies to quantify their performance. This article proposes such a methodology to evaluate the detection performance of imaging algorithms. For illustration, the methodology is applied to some example data using three FMC imaging algorithms: the total focusing method (TFM), phase-coherent imaging (PCI), and decomposition of the time-reversal operator with multiple scattering filter (DORT MSF). However, it is important to note that this is solely to illustrate the methodology; this article does not attempt the broader investigation of different cases that would be needed to compare the performance of these algorithms in general. The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). A test sample of coarse-grained nickel superalloy, manufactured to represent materials used for future power plant components and containing some simple artificial defects, is used to illustrate the method on the candidate algorithms. The data are captured in pulse-echo mode using 64-element array probes at center frequencies of 1 and 5 MHz. In this particular case, it turns out that all three algorithms perform very similarly when comparing their flaw detection capabilities.
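The POD/PFA bookkeeping at a given amplitude threshold can be sketched as follows. The scores and labels are invented for illustration; the paper's statistical treatment of detection is more complete:

```python
import numpy as np

def pod_pfa(scores, is_flaw, threshold):
    """Probability of detection = flagged flaw sites / total flaw sites;
    probability of false alarm = flagged clean sites / total clean sites."""
    scores = np.asarray(scores, dtype=float)
    is_flaw = np.asarray(is_flaw, dtype=bool)
    flagged = scores >= threshold
    pod = flagged[is_flaw].mean()
    pfa = flagged[~is_flaw].mean()
    return pod, pfa

# Hypothetical image amplitudes at three flaw sites and three clean sites.
scores = [0.9, 0.8, 0.3, 0.7, 0.2, 0.1]
labels = [1, 1, 1, 0, 0, 0]
pod, pfa = pod_pfa(scores, labels, threshold=0.5)
```

Sweeping the threshold traces out a receiver-operating-style curve, which is how imaging algorithms with different noise floors can be compared fairly.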
Helble, Tyler A; D'Spain, Gerald L; Hildebrand, John A; Campbell, Gregory S; Campbell, Richard L; Heaney, Kevin D
2013-09-01
Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. A common mistake in the analysis of marine mammal acoustic data is formulating conclusions about these animals without first understanding how environmental properties such as bathymetry, sediment properties, water column sound speed, and ocean acoustic noise influence the detection and character of vocalizations in the acoustic data. The approach in this paper is to use Monte Carlo simulations with a full wave field acoustic propagation model to characterize the site specific probability of detection of six types of humpback whale calls at three passive acoustic monitoring locations off the California coast. Results show that the probability of detection can vary by factors greater than ten when comparing detections across locations, or comparing detections at the same location over time, due to environmental effects. Effects of uncertainties in the inputs to the propagation model are also quantified, and the model accuracy is assessed by comparing calling statistics amassed from 24,690 humpback units recorded in the month of October 2008. Under certain conditions, the probability of detection can be estimated with uncertainties sufficiently small to allow for accurate density estimates.
ERIC Educational Resources Information Center
Mankowska, Anna
2016-01-01
Little, if any, examination of using play-based tools to examine children's opinions in research exists in the current literature. Therefore, this paper is meant to address that gap within the literature and showcase the study about the use of a specific play-based methodological tool in qualitative research. This specific tool called social board…
ERIC Educational Resources Information Center
Howard, Lyz
2016-01-01
As an experienced face-to-face teacher, working in a small Crown Dependency with no Higher Education Institute (HEI) to call its own, the subsequent geographical and professional isolation in the context of Networked Learning (NL), as a sub-set of eLearning, calls for innovative ways in which to develop self-reliant methods of professional…
Learning Strategies in Play during Basic Training for Medal of Honor and Call of Duty Video Games
ERIC Educational Resources Information Center
Ziaeehezarjeribi, Yadi
2010-01-01
This study, based on experiential play methodology was used to explore student engagement while playing "Medal of Honor (2002)" and "Call of Duty (2003)". It identifies some of the key issues related to the use of video games and simulations during the training phase of game play. Research into the effects of gaming in education has been extremely…
Mostashari, Farzad; Fine, Annie; Das, Debjani; Adams, John; Layton, Marcelle
2003-06-01
In 1998, the New York City Department of Health and the Mayor's Office of Emergency Management began monitoring the volume of ambulance dispatch calls as a surveillance tool for biologic terrorism. We adapted statistical techniques designed to measure excess influenza mortality and applied them to outbreak detection using ambulance dispatch data. Since 1999, we have been performing serial daily regressions to determine the alarm threshold for the current day. In this article, we evaluate this approach by simulating a series of 2,200 daily regressions. In the influenza detection implementation of this model, there were 71 (3.2%) alarms at the 99% level. Of these alarms, 64 (90%) occurred shortly before or during a period of peak influenza in each of six influenza seasons. In the bioterrorism detection implementation of this methodology, after accounting for current influenza activity, there were 24 (1.1%) alarms at the 99% level. Two occurred during a large snowstorm, 1 is unexplained, and 21 occurred shortly before or during a period of peak influenza activity in each of six influenza seasons. Our findings suggest that this surveillance system is sensitive to communitywide respiratory outbreaks with relatively few false alarms. More work needs to be done to evaluate the sensitivity of this approach for detecting nonrespiratory illness and more localized outbreaks.
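A minimal sketch of a serial-regression alarm threshold, assuming a simple linear trend with normal residuals on a trailing window of daily call volumes. The data and the one-sided 99% cutoff are illustrative; the surveillance system described also adjusts for influenza activity and other covariates:

```python
import numpy as np

def daily_alarm(history, today, z=2.326):
    """Fit a linear trend to the trailing call-volume history and alarm
    if today's count exceeds the predicted value plus z residual
    standard deviations (z = 2.326 is the one-sided 99% normal quantile)."""
    h = np.asarray(history, dtype=float)
    t = np.arange(len(h))
    slope, intercept = np.polyfit(t, h, 1)
    pred = slope * len(h) + intercept          # forecast for today
    resid_sd = np.std(h - (slope * t + intercept))
    return today > pred + z * resid_sd

baseline = [100, 103, 98, 101, 99, 102, 100, 97, 104, 100]
alarm_today = daily_alarm(baseline, 160)   # large excess over baseline
quiet_today = daily_alarm(baseline, 101)   # within normal variation
```

Refitting the regression each day, as the department does, lets the threshold track slow seasonal drift while still reacting to sudden excesses.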
One approach to design of speech emotion database
NASA Astrophysics Data System (ADS)
Uhrin, Dominik; Chmelikova, Zdenka; Tovarek, Jaromir; Partila, Pavol; Voznak, Miroslav
2016-05-01
This article describes a system for evaluating the credibility of recordings with emotional content. The sound recordings form a Czech-language database for training and testing speech emotion recognition systems, which are designed to detect human emotions from the voice. Information about a person's emotional state is useful to security forces and emergency call services. Personnel in action (soldiers, police officers, firefighters) are often exposed to stress, and information about their emotional state, carried in the voice, can help the dispatcher adapt control commands during an intervention. Call agents of an emergency call service must recognize the mental state of the caller to adjust the mood of the conversation; in this case, evaluation of the psychological state is the key factor for a successful intervention. A high-quality database of sound recordings is essential for creating such systems. Quality databases exist, such as the Berlin Database of Emotional Speech or HUMAINE, but they were created by actors in audio studios, meaning the recordings contain simulated rather than real emotions. Our research aims at creating a database of Czech emotional recordings of real human speech. Collecting sound samples for the database is only one of the tasks; another, no less important, is to evaluate the significance of the recordings with respect to emotional states. The design of a methodology for evaluating the credibility of emotional recordings is described in this article, and the results describe the advantages and applicability of the developed method.
BATSE gamma-ray burst line search. 2: Bayesian consistency methodology
NASA Technical Reports Server (NTRS)
Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.
1994-01-01
We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.
Weir, L.A.; Royle, J. Andrew; Nanjappa, P.; Jung, R.E.
2005-01-01
One of the most fundamental problems in monitoring animal populations is that of imperfect detection. Although imperfect detection can be modeled, studies examining patterns in occurrence often ignore detection and thus fail to properly partition variation in detection from that of occurrence. In this study, we used anuran calling survey data collected on North American Amphibian Monitoring Program routes in eastern Maryland to investigate factors that influence detection probability and site occupancy for 10 anuran species. In 2002, 17 calling survey routes in eastern Maryland were surveyed to collect environmental and species data nine or more times. To analyze these data, we developed models incorporating detection probability and site occupancy. The results suggest that, for more than half of the 10 species, detection probabilities vary most with season (i.e., day-of-year), air temperature, time, and moon illumination, whereas site occupancy may vary by the amount of palustrine forested wetland habitat. Our results suggest anuran calling surveys should document air temperature, time of night, moon illumination, observer skill, and habitat change over time, as these factors can be important to model-adjusted estimates of site occupancy. Our study represents the first formal modeling effort aimed at developing an analytic assessment framework for NAAMP calling survey data.
Effects of Airgun Sounds on Bowhead Whale Calling Rates: Evidence for Two Behavioral Thresholds
Blackwell, Susanna B.; Nations, Christopher S.; McDonald, Trent L.; Thode, Aaron M.; Mathias, Delphine; Kim, Katherine H.; Greene, Charles R.; Macrander, A. Michael
2015-01-01
In proximity to seismic operations, bowhead whales (Balaena mysticetus) decrease their calling rates. Here, we investigate the transition from normal calling behavior to decreased calling and identify two threshold levels of received sound from airgun pulses at which calling behavior changes. Data were collected in August–October 2007–2010, during the westward autumn migration in the Alaskan Beaufort Sea. Up to 40 directional acoustic recorders (DASARs) were deployed at five sites offshore of the Alaskan North Slope. Using triangulation, whale calls localized within 2 km of each DASAR were identified and tallied every 10 minutes each season, so that the detected call rate could be interpreted as the actual call production rate. Moreover, airgun pulses were identified on each DASAR, analyzed, and a cumulative sound exposure level was computed for each 10-min period each season (CSEL10-min). A Poisson regression model was used to examine the relationship between the received CSEL10-min from airguns and the number of detected bowhead calls. Calling rates increased as soon as airgun pulses were detectable, compared to calling rates in the absence of airgun pulses. After the initial increase, calling rates leveled off at a received CSEL10-min of ~94 dB re 1 μPa²·s (the lower threshold). In contrast, once CSEL10-min exceeded ~127 dB re 1 μPa²·s (the upper threshold), whale calling rates began decreasing, and when CSEL10-min values were above ~160 dB re 1 μPa²·s, the whales were virtually silent. PMID:26039218
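The core of the analysis above, a Poisson regression of 10-min call counts on received CSEL, can be sketched as follows. This is a minimal illustration on simulated counts; the coefficients, exposure range, and single linear term are invented, and the study's actual model was more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated example: detected call counts per 10-min period versus received
# cumulative sound exposure level (CSEL, in dB). All numbers here are
# invented for illustration, not taken from the study.
csel = rng.uniform(90, 160, 500)
true_rate = np.exp(3.0 - 0.02 * (csel - 90))   # call rate declines with exposure
calls = rng.poisson(true_rate)

# Poisson regression with a log link, fitted by Newton-Raphson
X = np.column_stack([np.ones_like(csel), csel - 90])
beta = np.array([np.log(calls.mean()), 0.0])   # sensible starting point
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (calls - mu)            # score vector
    hess = X.T @ (X * mu[:, None])       # Fisher information
    beta += np.linalg.solve(hess, grad)

print(beta)  # should recover approximately [3.0, -0.02]
```

Fitting piecewise versions of such a model (flat, then declining) is one standard way to locate behavioral thresholds like the ~127 dB change point reported above.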
Mothé, Geórgia; Castro, Maria; Sthel, Marcelo; Lima, Guilherme; Brasil, Laisa; Campos, Layse; Rocha, Aline; Vargas, Helion
2010-01-01
Atmospheric pollution is one of the worst threats to modern society. The consequences of different forms of atmospheric pollution range from the local to the global scale, with deep impacts on climate, the environment and human health. Several gaseous pollutants, even when present in trace concentrations, play a fundamental role in important processes that occur in the atmosphere. Phenomena such as global warming, photochemical smog formation, acid rain and the depletion of the stratospheric ozone layer are strongly related to the increased concentration of certain gaseous species in the atmosphere. The transport sector is a significant source of atmospheric pollution, particularly when diesel oil is used as fuel. Therefore, new methodologies based on selective and sensitive gas detection schemes must be developed to detect and monitor pollutant gases from this source. In this work, CO2 laser photoacoustic spectroscopy was used to evaluate ethylene emissions, and electrochemical analyzers were used to evaluate emissions of CO, NOx and SO2 from the exhaust of diesel-powered vehicles (rural diesel with 5% biodiesel, referred to here simply as diesel) at different engine rotation speeds. Concentrations in the range of 6 to 45 ppmV for ethylene, 109 to 1,231 ppmV for carbon monoxide, 75 to 868 ppmV for nitrogen oxides and 3 to 354 ppmV for sulfur dioxide were obtained. The results indicate that the detection techniques used were sufficiently selective and sensitive to detect the gaseous species mentioned above in the ppmV range. PMID:22163437
Data Randomization and Cluster-Based Partitioning for Botnet Intrusion Detection.
Al-Jarrah, Omar Y; Alhussein, Omar; Yoo, Paul D; Muhaidat, Sami; Taha, Kamal; Kim, Kwangjo
2016-08-01
Botnets, which consist of remotely controlled compromised machines called bots, provide a distributed platform for several threats against cyber-world entities and enterprises. An intrusion detection system (IDS) provides an efficient countermeasure against botnets: it continually monitors and analyzes network traffic for potential vulnerabilities and the possible existence of active attacks. A payload-inspection-based IDS (PI-IDS) identifies active intrusion attempts by inspecting the payload of Transmission Control Protocol and User Datagram Protocol packets and comparing it with signatures of previously seen attacks. However, the PI-IDS's ability to detect intrusions may be incapacitated by packet encryption. A traffic-based IDS (T-IDS) alleviates this shortcoming: it does not inspect packet payloads but instead analyzes packet headers to identify intrusions. As network traffic grows rapidly, not only the detection rate but also the efficiency and scalability of an IDS become significant. In this paper, we propose a state-of-the-art T-IDS built on a novel randomized data partitioned learning model (RDPLM), relying on a compact network feature set and feature selection techniques, simplified subspacing, and a multiple randomized meta-learning technique. The proposed model achieved 99.984% accuracy and a 21.38 s training time on a well-known benchmark botnet dataset. Experimental results demonstrate that the proposed methodology outperforms other well-known machine-learning models used in the same detection task, namely sequential minimal optimization, deep neural network, C4.5, reduced error pruning tree, and RandomTree.
ERIC Educational Resources Information Center
Lu, Hui-Chuan; Chu, Yu-Hsin; Chang, Cheng-Yu
2013-01-01
Compared with English learners, Spanish learners have fewer resources for automatic error detection and revision. Following the current trend of integrative Computer-Assisted Language Learning (CALL), we combined a corpus-based approach with CALL to create the System of Error Detection and Revision Suggestion (SEDRS) for learning Spanish. Through…
Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort
Gambin, Tomasz; Akdemir, Zeynep C.; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M.B.; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M.; Eldomery, Mohammad K.; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W.; Boerwinkle, Eric; Beaudet, Arthur L.; Gibbs, Richard A.
2017-01-01
We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare, intragenic, homozygous and hemizygous (HMZ) deletions that may represent complete loss-of-function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor–Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% for single-exonic and 96% for multi-exonic calls). Of the 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR, and 64 were confirmed. These include 18 single-exon deletions, of which 8 were detected exclusively by HMZDelFinder and by none of the seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17–50% of the pathogenic CNVs in disease cohorts in which 7.1–11% of the solved molecular diagnoses were attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. PMID:27980096
Detection of baleen whales on an ocean-bottom seismometer array in the Lau Basin
NASA Astrophysics Data System (ADS)
Brodie, D.; Dunn, R.
2011-12-01
Long-term deployment of ocean-bottom seismometer arrays provides a unique opportunity for identifying and tracking whales in a manner not usually possible in biological studies. Large baleen whales emit low-frequency (>5 Hz) sounds called 'calls' or 'songs' that can be detected on either the hydrophone or the vertical channel of the instrument at distances in excess of 50 km. The calls are distinct to individual species, and even to geographical groups within species, and are thought to serve a variety of purposes. Distinct repeating calls can be automatically identified using matched-filter processing, and whales can be located in a manner similar to that of earthquakes. Many baleen whale species are endangered, and little is known about their geographic distribution, population dynamics, and basic behaviors. The Lau back-arc basin, a tectonically active, elongated basin bounded by volcanic shallows, lies in the southwestern Pacific Ocean between Fiji and Tonga. Although whales are known to exist around Fiji and Tonga, little is understood about their population dynamics and migration patterns throughout the basin. Twenty-nine broadband ocean-bottom seismometers deployed in the basin recorded data for approximately ten months during 2009-2010. To date, four species of whales have been identified in the data: blue (one call type), humpback (two call types, including long-lasting 'songs'), Bryde's (one call type), and fin whales (three call types). Three as-yet-unknown call types have also been identified. After the calls were identified, idealized spectrograms of the known calls were matched against the entire data set using an auto-detection algorithm. The auto-detection output provides the number of calls and the times of year when each call type was recorded. Based on the results, whales migrate seasonally through the basin, with some overlap between species.
Initial results also indicate that different species of whales are more common in some parts of the basin than others, suggesting preferences in water depth and distance to land. In future work, whales will be tracked through the basin using call localization information to illustrate migration patterns of the various species.
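The matched-filter processing described above can be sketched as a cross-correlation of the record against a call template, with detections declared where the output exceeds a noise-based threshold. Everything below is synthetic: the sampling rate, template sweep, noise level, and threshold rule are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100.0                                  # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)

# Template: a ~25->11 Hz downswept tone, loosely mimicking a baleen
# whale pulse (purely synthetic, not a real species' call)
template = np.sin(2 * np.pi * (25 - 7 * t) * t)

# Synthetic record: Gaussian noise with two copies of the call buried in it
record = rng.normal(0, 1.0, 6000)
for start in (1000, 4000):
    record[start:start + len(template)] += 2.0 * template

# Matched filter = cross-correlation with the unit-norm template
mf = np.correlate(record, template / np.linalg.norm(template), mode="valid")
threshold = 5 * np.median(np.abs(mf)) / 0.6745   # ~5 sigma via the MAD
hits = np.flatnonzero(mf > threshold)
print(hits)   # indices clustered near the two call onsets
```

In practice whale calls are matched against spectrograms rather than raw waveforms, but the thresholded-correlation idea is the same.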
Use of a Parabolic Microphone to Detect Hidden Subjects in Search and Rescue.
Bowditch, Nathaniel L; Searing, Stanley K; Thomas, Jeffrey A; Thompson, Peggy K; Tubis, Jacqueline N; Bowditch, Sylvia P
2018-03-01
This study compares a parabolic microphone to unaided hearing in detecting and comprehending hidden callers at ranges of 322 to 2510 m. Eight subjects were placed 322 to 2510 m away from a central listening point. The subjects were concealed, and their calling volume was calibrated. In random order, subjects were asked to call the name of a state for 5 minutes. Listeners with parabolic microphones and others with unaided hearing recorded the direction of the call (detection) and the name of the state (comprehension). The parabolic microphone was superior to unaided hearing in both detecting subjects and comprehending their calls, with an effect size (Cohen's d) of 1.58 for detection and 1.55 for comprehension. For each of the 8 hidden subjects, there were 24 detection attempts with the parabolic microphone and 54 to 60 attempts by unaided listeners. At the longer distances (1529–2510 m), the parabolic microphone was better at detection (83% vs 51%; P<0.00001 by χ²) and comprehension (57% vs 12%; P<0.00001). At the shorter distances (322–1190 m), the parabolic microphone offered advantages in detection (100% vs 83%; P=0.000023) and comprehension (86% vs 51%; P<0.00001), although not as pronounced as at the longer distances. Use of a 66-cm (26-inch) parabolic microphone significantly improved detection and comprehension of hidden calling subjects at distances between 322 and 2510 m when compared with unaided hearing. This study supports the use of a parabolic microphone in search and rescue to locate responsive subjects in favorable weather and terrain. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
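The two statistics reported above, Cohen's d and the χ² test on detection proportions, are computed in the usual way. A sketch with hypothetical counts chosen only to match the reported long-range rates (the abstract does not give the exact per-condition totals, and the Cohen's d inputs are likewise invented):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d using the pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df) for a 2x2 table of
    successes/failures in two groups; returns (chi2, p)."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))   # survival function of chi2 with 1 df
    return chi2, p

# Hypothetical long-range counts: 80/96 parabolic detections (~83%)
# vs 115/225 unaided detections (~51%)
stat, p = chi2_2x2(80, 16, 115, 110)
print(f"chi2={stat:.1f}, p={p:.1e}")   # a gap this large gives p far below 0.00001
```

The p < 0.00001 values in the abstract are consistent with such counts: even this rough reconstruction yields a χ² near 30 on 1 degree of freedom.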
Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection
2017-03-20
The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side-channel data and applying machine learning to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers.
2004-01-01
...login identity to the one under which the system call is executed, the parameters of the system call execution (file names including full path) ... [table fragment] COAST-EIMDT: distributed on target hosts; EMERALD: distributed on target hosts and security servers; signature recognition; anomaly detection ... uses a centralized architecture and employs an anomaly detection technique for intrusion detection. The EMERALD project [80] proposes a
Estimating Lion Abundance using N-mixture Models for Social Species
Belant, Jerrold L.; Bled, Florent; Wilton, Clay M.; Fyumagwa, Robert; Mwampeta, Stanslaus B.; Beyer, Dean E.
2016-01-01
Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcasted calls and social behavior, we developed a hierarchical observation process within the N-mixture model conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170–551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but the directions of the corresponding effects were undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species. PMID:27786283
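A minimal version of the N-mixture likelihood at the heart of this kind of analysis can be written out directly: site abundance is latent and Poisson, repeated counts are binomial thinnings of it, and the latent abundance is summed out. This is a Royle-style sketch with constant λ and p; the hierarchical call-in response structure the authors added is omitted, and the counts are invented:

```python
import math

def nmix_loglik(lam, p, counts, K=50):
    """Log-likelihood of a basic N-mixture model: site abundance
    N_i ~ Poisson(lam), repeated counts n_it ~ Binomial(N_i, p),
    with the latent N_i summed out up to truncation K."""
    ll = 0.0
    for site in counts:                         # site = its repeated counts
        s = 0.0
        for N in range(max(site), K + 1):
            prior = math.exp(-lam) * lam**N / math.factorial(N)
            like = 1.0
            for n in site:
                like *= math.comb(N, n) * p**n * (1 - p) ** (N - n)
            s += prior * like
        ll += math.log(s)
    return ll

counts = [(3, 2, 4), (0, 1, 0), (2, 2, 1)]      # three sites, three visits
# Coarse grid search for the maximum-likelihood estimates
best = max((nmix_loglik(l / 2, q / 20, counts), l / 2, q / 20)
           for l in range(1, 30) for q in range(1, 20))
print(f"lambda_hat={best[1]}, p_hat={best[2]}")
```

Variation in counts across repeat visits to the same site is what lets the model separate abundance (λ) from detection (p); without repeat visits the product λp is all that is identifiable.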
Installation Restoration Program Records Search for Kingsley Field, Oregon.
1982-06-01
The Hazard Assessment Rating Methodology (HARM) is now used for all Air Force IRP studies. To maintain consistency, AFESC had their on-call contractors review ... [contents fragment] C. Installation History; D. Industrial Facilities; E. POL Storage Tanks; F. Abandoned Tanks; G. Oil/Water Separators; H. Site Hazard Rating Methodology; I. Site ... and implementing regulations. The purpose of DOD policy is to control the migration of hazardous material contaminants from DOD installations.
ERIC Educational Resources Information Center
Ibáñez Moreno, Ana; Vermeulen, Anna
2015-01-01
In this paper the methodological steps taken in the conception of a new mobile application (app) are introduced. This app, called VISP (Videos for Speaking), is easily accessible and manageable, and is aimed at helping students of English as a Foreign Language (EFL) to improve their idiomaticity in their oral production. In order to do so, the app…
Poliva, Oren
2017-01-01
In the brain of primates, the auditory cortex connects with the frontal lobe via the temporal pole (auditory ventral stream; AVS) and via the inferior parietal lobe (auditory dorsal stream; ADS). The AVS is responsible for sound recognition, and the ADS for sound localization, voice detection and integration of calls with faces. I propose that the primary role of the ADS in non-human primates is the detection of and response to contact calls. These calls are exchanged between tribe members (e.g., mother-offspring) and are used for monitoring location. Detection of contact calls occurs by the ADS identifying a voice, localizing it, and verifying that the corresponding face is out of sight. Once a contact call is detected, the primate produces a contact call in return via descending connections from the frontal lobe to a network of limbic and brainstem regions. Because the ADS of present-day humans also performs speech production, I further propose an evolutionary course for the transition from contact call exchange to an early form of speech. In accordance with this model, structural changes to the ADS endowed early members of the genus Homo with partial vocal control. This development was beneficial as it enabled offspring to modify their contact calls with intonations for signaling high or low levels of distress to their mother. Eventually, individuals were capable of participating in yes-no question-answer conversations. In these conversations the offspring emitted a low-level distress call to inquire about the safety of objects (e.g., food), and his/her mother responded with a high- or low-level distress call to signal approval or disapproval of the interaction. Gradually, the ADS and its connections with brainstem motor regions became more robust and vocal control became more volitional. Speech emerged once vocal control was sufficient for inventing novel calls. PMID:28928931
A Design Methodology for Optoelectronic VLSI
2007-01-01
...current gets converted to a CMOS voltage level through a transimpedance amplifier circuit called a receiver. The output of the receiver is then ... change the current flowing from the diode to a voltage that the logic inputs can use. That circuit is called a receiver. It is a transimpedance amplifier ... incorporate random access memory circuits, SRAM or dynamic RAM (DRAM). These circuits use weak internal analog signals that are amplified by sense
Scaling of echolocation call parameters in bats.
Jones, G
1999-12-01
I investigated the scaling of echolocation call parameters (frequency, duration and repetition rate) in bats in a functional context. Low-duty-cycle bats operate with search-phase duty cycles of usually less than 20%. They process echoes in the time domain and are therefore intolerant of pulse-echo overlap. High-duty-cycle (>30%) species use Doppler shift compensation, and they separate pulse and echo in the frequency domain. Call frequency scales negatively with body mass in at least five bat families. Pulse duration scales positively with mass in low-duty-cycle quasi-constant-frequency (QCF) species because the large aerial-hawking species that emit these signals fly fast in open habitats; they therefore detect distant targets and experience pulse-echo overlap later than do smaller bats. Pulse duration also scales positively with mass in the Hipposideridae, which show at least partial Doppler shift compensation. Pulse repetition rate corresponds closely with wingbeat frequency in QCF bat species that fly relatively slowly. Larger, fast-flying species often skip pulses when detecting distant targets. There is probably a trade-off between call intensity and repetition rate, because 'whispering' bats (and hipposiderids) produce several calls per predicted wingbeat and because batches of calls are emitted per wingbeat during terminal buzzes. Severe atmospheric attenuation at high frequencies limits the range of high-frequency calls. Low-duty-cycle bats that call at high frequencies must therefore use short pulses to avoid pulse-echo overlap. Rhinolophids escape this constraint through Doppler shift compensation and, importantly, can exploit advantages associated with the emission of both high-frequency and long-duration calls. Low frequencies are unsuited to the detection of small prey, and low repetition rates may limit prey detection rates. Echolocation parameters may therefore constrain maximum body size in aerial-hawking bats.
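The pulse-echo overlap constraint above can be made concrete with simple arithmetic: a pulse of duration d is overlap-free only for targets beyond c·d/2, since the sound must travel out and back while the bat is still calling. A small sketch (the 343 m/s speed of sound and the example durations are assumptions, not values from the paper):

```python
# Pulse-echo overlap: for a pulse of duration d, the echo from a target at
# distance r arrives after 2r/c seconds; it overlaps the outgoing pulse
# unless r > c*d/2.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C

def overlap_free_range(pulse_duration_s):
    """Minimum target distance (m) at which the echo
    arrives only after the pulse has ended."""
    return SPEED_OF_SOUND * pulse_duration_s / 2

for d_ms in (2, 5, 10, 20):
    r = overlap_free_range(d_ms / 1000)
    print(f"{d_ms:>3} ms pulse -> overlap-free beyond {r:.2f} m")
```

This is why the long pulses that favor detection of distant targets are affordable for large, fast-flying open-space bats but not for small bats hunting at close range.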
Environmental Influences On Diel Calling Behavior In Baleen Whales
2015-09-30
...and calm seas were infrequent and short (Figure 1b), making traditional shipboard marine mammal observations difficult. The real-time detection ... first use of real-time detection and reporting of marine mammal calls from autonomous underwater vehicles to adaptively plan research activities. ... conferences: 6th International Workshop on Detection, Classification, Localization, and Density Estimation (DCLDE) of Marine Mammals using
Brodie, Dana C; Dunn, Robert A
2015-01-01
Ten months of broadband seismic data, recorded on six ocean-bottom seismographs located in the Lau Basin, were examined to identify baleen whale species. As the first systematic survey of baleen whales in this part of the southwest Pacific Ocean, this study reveals the variety of species present and their temporal occurrence in and near the basin. Baleen whales produce species-specific low frequency calls that can be identified by distinct patterns in data spectrograms. By matching spectrograms with published accounts, fin, Bryde's, Antarctic blue, and New Zealand blue whale calls were identified. Probable whale sounds that could not be matched to published spectrograms, as well as non-biologic sounds that are likely of volcanogenic origin, were also recorded. Detections of fin whale calls (mid-June to mid-October) and blue whale calls (June through September) suggest that these species migrate through the region seasonally. Detections of Bryde's whale calls (primarily February to June, but also other times of the year) suggest this species resides around the basin nearly year round. The discovery of previously unpublished call types emphasizes the limited knowledge of the full call repertoires of baleen whales and the utility of using seismic survey data to enhance understanding in understudied regions.
Finite element model updating and damage detection for bridges using vibration measurement.
DOT National Transportation Integrated Search
2013-12-01
In this report, the results of a study on developing a damage detection methodology based on Statistical Pattern Recognition are presented. This methodology uses a new damage-sensitive feature, developed in this study, that relies entirely on modal ...
An aerial-hawking bat uses stealth echolocation to counter moth hearing.
Goerlitz, Holger R; ter Hofstede, Hannah M; Zeale, Matt R K; Jones, Gareth; Holderied, Marc W
2010-09-14
Ears evolved in many nocturnal insects, including some moths, to detect bat echolocation calls and evade capture [1, 2]. Although there is evidence that some bats emit echolocation calls that are inconspicuous to eared moths, it is difficult to determine whether this was an adaptation to moth hearing or originally evolved for a different purpose [2, 3]. Aerial-hawking bats generally emit high-amplitude echolocation calls to maximize detection range [4, 5]. Here we present the first example of an echolocation counterstrategy to overcome prey hearing at the cost of reduced detection distance. We combined comparative bat flight-path tracking and moth neurophysiology with fecal DNA analysis to show that the barbastelle, Barbastella barbastellus, emits calls that are 10 to 100 times lower in amplitude than those of other aerial-hawking bats, remains undetected by moths until close, and captures mainly eared moths. Model calculations demonstrate that only bats emitting such low-amplitude calls hear moth echoes before their calls are conspicuous to moths. This stealth echolocation allows the barbastelle to exploit food resources that are difficult to catch for other aerial-hawking bats emitting calls of greater amplitude. Copyright © 2010 Elsevier Ltd. All rights reserved.
Using State Estimation Residuals to Detect Abnormal SCADA Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Jian; Chen, Yousu; Huang, Zhenyu
2010-06-14
Detection of manipulated supervisory control and data acquisition (SCADA) data is critically important for the safe and secure operation of modern power systems. In this paper, a methodology for detecting manipulated SCADA data based on state estimation residuals is presented, and a framework for the proposed methodology is described. Instead of using the original SCADA measurements as the bad-data sources, the residuals calculated from the results of the state estimator are used as the input for the outlier detection process. The BACON algorithm is applied to detect outliers in the state estimation residuals. The IEEE 118-bus system is used as a test case to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
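The abstract names the BACON algorithm for flagging outliers among state-estimation residuals. Below is a heavily simplified univariate sketch in that spirit: start from a "basic subset" of points near the median, then repeatedly re-admit every point close to the subset's mean. The real BACON of Billor, Hadi and Velleman works on multivariate data with Mahalanobis distances and chi-square-based correction factors; this is only an illustration of the idea, not the paper's implementation, and the residuals are simulated:

```python
import random
import statistics as st

def bacon_outliers(x, c=3.0, iters=25):
    """Simplified univariate BACON-style outlier detection.
    Points never admitted to the final basic subset are flagged."""
    med = st.median(x)
    # Basic subset: the half of the data closest to the median
    basic = set(sorted(range(len(x)), key=lambda i: abs(x[i] - med))[:len(x) // 2])
    for _ in range(iters):
        mu = st.mean(x[i] for i in basic)
        sd = st.stdev(x[i] for i in basic)
        new = {i for i in range(len(x)) if abs(x[i] - mu) <= c * sd}
        if new == basic:        # subset has stabilized
            break
        basic = new
    return sorted(i for i in range(len(x)) if i not in basic)

random.seed(0)
residuals = [random.gauss(0, 1) for _ in range(200)]
residuals[17] += 12.0           # inject two gross measurement errors
residuals[90] -= 15.0
print(bacon_outliers(residuals))
```

Growing the clean subset outward, rather than trimming from the full sample, is what makes this family of methods robust to masking, where several large outliers inflate the sample standard deviation and hide each other from a 3-σ rule.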
Echolocation calls of Poey's flower bat (Phyllonycteris poeyi) unlike those of other phyllostomids.
Mora, Emanuel C; Macías, Silvio
2007-05-01
Unlike any other foraging phyllostomid bat studied to date, Poey's flower bats (Phyllonycteris poeyi-Phyllostomidae) emit relatively long (up to 7.2 ms), intense, single-harmonic echolocation calls. These calls are readily detectable at distances of at least 15 m. Furthermore, the echolocation calls contain only the first harmonic, which is usually filtered out in the vocal tract of phyllostomids. The foraging echolocation calls of P. poeyi are more like search-phase echolocation calls of sympatric aerial-feeding bats (Molossidae, Vespertilionidae, Mormoopidae). Intense, long, narrowband, single-harmonic echolocation calls focus acoustic energy maximizing range and favoring detection, which may be particularly important for cruising bats, like P. poeyi, when flying in the open. Flying in enclosed spaces, P. poeyi emit short, low-intensity, frequency-modulated, multiharmonic echolocation calls typical of other phyllostomids. This is the first report of a phyllostomid species emitting long, intense, single-harmonic echolocation calls with most energy in the first harmonic.
Effect of temporal and spectral noise features on gap detection behavior by calling green treefrogs.
Höbel, Gerlinde
2014-10-01
Communication plays a central role in the behavioral ecology of many animals, yet the background noise generated by large breeding aggregations may impair effective communication. A common behavioral strategy to ameliorate noise interference is gap detection, where signalers display primarily during lulls in the background noise. When attempting gap detection, signalers have to deal with the fact that the spacing and duration of silent gaps is often unpredictable, and that noise varies in its spectral composition and may thus vary in the degree in which it impacts communication. I conducted playback experiments to examine how male treefrogs deal with the problem that refraining from calling while waiting for a gap to appear limits a male's ability to attract females, yet producing calls during noise also interferes with effective sexual communication. I found that the temporal structure of noise (i.e., duration of noise and silent gap segments) had a stronger effect on male calling behavior than the spectral composition. Males placed calls predominantly during silent gaps and avoided call production during short, but not long, noise segments. This suggests that male treefrogs use a calling strategy that maximizes the production of calls without interference, yet allows for calling to persist if lulls in the background noise are infrequent. Copyright © 2014 Elsevier B.V. All rights reserved.
Luo, Jinhong; Koselj, Klemen; Zsebők, Sándor; Siemers, Björn M.; Goerlitz, Holger R.
2014-01-01
Climate change impacts the biogeography and phenology of plants and animals, yet the underlying mechanisms are little known. Here, we present a functional link between rising temperature and the prey detection ability of echolocating bats. The maximum distance for echo-based prey detection is physically determined by sound attenuation. Attenuation is more pronounced for high-frequency sound, such as echolocation calls, and is a nonlinear function of both call frequency and ambient temperature. Hence, the prey detection ability, and thus possibly the foraging efficiency, of echolocating bats is susceptible to rising temperatures through climate change. Using present-day climate data and projected temperature rises, we modelled this effect for the entire range of bat call frequencies and climate zones around the globe. We show that depending on call frequency, the prey detection volume of bats will either decrease or increase: species calling above a crossover frequency will lose and species emitting lower frequencies will gain prey detection volume, with crossover frequency and magnitude depending on the local climatic conditions. Within local species assemblages, this may cause a change in community composition. Global warming can thus directly affect the prey detection ability of individual bats and indirectly their interspecific interactions with competitors and prey. PMID:24335559
Corroborating evidence-based medicine.
Mebius, Alexander
2014-12-01
Proponents of evidence-based medicine (EBM) have argued convincingly for applying this scientific method to medicine. However, the current methodological framework of the EBM movement has recently been called into question, especially in epidemiology and the philosophy of science. The debate has focused on whether the methodology of randomized controlled trials provides the best evidence available. This paper attempts to shift the focus of the debate by arguing that clinical reasoning involves a patchwork of evidential approaches and that the emphasis on evidence hierarchies of methodology fails to lend credence to the common practice of corroboration in medicine. I argue that the strength of evidence lies in the evidence itself, and not the methodology used to obtain that evidence. Ultimately, when it comes to evaluating the effectiveness of medical interventions, it is the evidence obtained from a methodology, rather than the methodology itself, that should establish the strength of the evidence. © 2014 John Wiley & Sons, Ltd.
Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.
Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen
2015-11-01
Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
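A minimal sketch of a scan whose windows span fixed numbers of cases rather than fixed time intervals, assuming a homogeneous Poisson baseline; EUROCAT's production implementation (conception-date time units, lookup tables replacing Monte Carlo, overlap reporting, multiplicity control) is considerably more involved.

```python
import math

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu)."""
    return 1.0 - sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k))

def scan_clusters(case_times, rate_per_unit, max_cases=10, alpha=0.01):
    """Variable window width scan: each window spans k consecutive cases,
    and we test whether k cases in that time span are surprising under a
    homogeneous Poisson baseline.  A real implementation must correct for
    scanning many overlapping windows; this sketch just flags raw
    p-values below alpha."""
    times = sorted(case_times)
    flagged = []
    for k in range(2, min(max_cases, len(times)) + 1):
        for i in range(len(times) - k + 1):
            span = times[i + k - 1] - times[i]
            mu = rate_per_unit * max(span, 1e-9)
            p = poisson_sf(k, mu)
            if p < alpha:
                flagged.append((times[i], times[i + k - 1], k, p))
    return flagged

# 5 cases within one month against a background of ~1 case/month is
# flagged; the same 5 cases spread over a year are not.
burst = scan_clusters([10.0, 10.2, 10.4, 10.6, 10.8], rate_per_unit=1.0)
spread = scan_clusters([1.0, 4.0, 7.0, 10.0, 12.0], rate_per_unit=1.0)
```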
Towards a Pattern-Driven Topical Ontology Modeling Methodology in Elderly Care Homes
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert; Pudkey, Kevin
This paper presents a pattern-driven ontology modeling methodology, which is used to create topical ontologies in the human resource management (HRM) domain. An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). We use the Organization for Economic Co-operation and Development (OECD) and the National Vocational Qualification (NVQ) as the resources to create the topical ontologies in this paper. The methodology is implemented in a tool called the PAD-ON suite. The approach is illustrated with a use case from elderly care homes in the UK.
Structural health management of aerospace hotspots under fatigue loading
NASA Astrophysics Data System (ADS)
Soni, Sunilkumar
Sustainability and life-cycle assessments of aerospace systems, such as aircraft structures and propulsion systems, represent growing challenges in engineering. Hence, there has been an increasing demand in using structural health monitoring (SHM) techniques for continuous monitoring of these systems in an effort to improve safety and reduce maintenance costs. The current research is part of an ongoing multidisciplinary effort to develop a robust SHM framework resulting in improved models for damage-state awareness and life prediction, and enhancing capability of future aircraft systems. Lug joints, a typical structural hotspot, were chosen as the test article for the current study. The thesis focuses on integrated SHM techniques for damage detection and characterization in lug joints. Piezoelectric wafer sensors (PZTs) are used to generate guided Lamb waves as they can be easily used for onboard applications. Sensor placement in certain regions of a structural component is not feasible due to the inaccessibility of the area to be monitored. Therefore, a virtual sensing concept is introduced to acquire sensor data from finite element (FE) models. A full three dimensional FE analysis of lug joints with piezoelectric transducers, accounting for piezoelectrical-mechanical coupling, was performed in Abaqus and the sensor signals were simulated. These modeled sensors are called virtual sensors. A combination of real data from PZTs and virtual sensing data from FE analysis is used to monitor and detect fatigue damage in aluminum lug joints. Experiments were conducted on lug joints under fatigue loads and sensor signals collected were used to validate the simulated sensor response. An optimal sensor placement methodology for lug joints is developed based on a detection theory framework to maximize the detection rate and minimize the false alarm rate. The placement technique is such that the sensor features can be directly correlated to damage. 
The technique accounts for a number of factors, such as actuation frequency and strength, minimum damage size, damage detection scheme, material damping, signal to noise ratio and sensing radius. Advanced information processing methodologies are discussed for damage diagnosis. A new, instantaneous approach for damage detection, localization and quantification is proposed for applications to practical problems associated with changes in reference states under different environmental and operational conditions. Such an approach improves feature extraction for state awareness, resulting in robust life prediction capabilities.
Detection of Cyanotoxins During Potable Water Treatment
USDA-ARS?s Scientific Manuscript database
In 2007, the U.S. EPA listed three cyanobacterial toxins on the CCL3 contaminant priority list for potable drinking waters. This paper describes all methodologies used for detection of these toxins, and assesses each on a cost/benefit basis. Methodologies for microcystin, cylindrospermopsin, and a...
Environmental Influences on Diel Calling Behavior in Baleen Whales
2013-09-30
...to allow known calls (e.g., right whale upcall and gunshot, fin whale 20-Hz pulses, humpback whale downsweeps, sei whale low-frequency downsweeps...fin, humpback, sei, and North Atlantic right whales. Real-time detections were evaluated after recovery of the gliders by (1) comparing the acoustic...from both an aircraft and ship. The overall false detection rate for individual calls was 14%, and for right, humpback, and fin whales, false
A state-of-the-art review on segmentation algorithms in intravascular ultrasound (IVUS) images.
Katouzian, Amin; Angelini, Elsa D; Carlier, Stéphane G; Suri, Jasjit S; Navab, Nassir; Laine, Andrew F
2012-09-01
Over the past two decades, intravascular ultrasound (IVUS) image segmentation has remained a challenge for researchers while the use of this imaging modality is rapidly growing in catheterization procedures and in research studies. IVUS provides cross-sectional grayscale images of the arterial wall and the extent of atherosclerotic plaques with high spatial resolution in real time. In this paper, we review recently developed image processing methods for the detection of media-adventitia and luminal borders in IVUS images acquired with different transducers operating at frequencies ranging from 20 to 45 MHz. We discuss methodological challenges, lack of diversity in reported datasets, and weaknesses of quantification metrics that make IVUS segmentation still an open problem despite all efforts. In conclusion, we call for a common reference database, validation metrics, and ground-truth definition with which new and existing algorithms could be benchmarked.
Methodologies for the Detection of BSE Risk Material in Meat and Meat Products
NASA Astrophysics Data System (ADS)
Lücker, Ernst
Soon after the emergence of bovine spongiform encephalopathy (BSE), a fatal disease of the central nervous system (CNS) in cattle, so-called specified bovine offal was legally defined and banned (SBO-ban) in order to reduce the presumed potential BSE exposure risk for British consumers (UK, 1989). Later on, the legal definition of risk material was frequently modified according to new scientific results on BSE tissue infectivity (Table 19.1). A European-wide ban on specified risk materials (SRM) was established in 2001 (EC, 2001). In effect, the SRM-ban is still the most important direct measure in reducing potential human BSE exposure risk (EC, 2005). Taking into account the overall and constant reduction of the frequency of BSE cases as well as the very high costs of preventive measures, the European Commission has envisioned a future lifting of the SRM-ban (EC, 2005).
Artificial immune system via Euclidean Distance Minimization for anomaly detection in bearings
NASA Astrophysics Data System (ADS)
Montechiesi, L.; Cocconcelli, M.; Rubini, R.
2016-08-01
In recent years new diagnostic methodologies have emerged, with particular interest in machinery operating in non-stationary conditions. In fact, continuous speed changes and variable loads make spectrum analysis non-trivial: a variable speed means a variable characteristic fault frequency related to the damage, which is no longer recognizable in the spectrum. To overcome this problem the scientific community has proposed different approaches, listed in two main categories: model-based approaches and expert systems. In this context the paper aims to present a simple expert system derived from the mechanisms of the immune system, called Euclidean Distance Minimization, and its application to a real case of bearing fault recognition. The proposed method is a simplification of the original process, adapted from the class of Artificial Immune Systems, which proved to be useful and promising in different application fields. Comparative results are provided, with a complete explanation of the algorithm and its functioning aspects.
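An artificial-immune-system-style detector of this kind can be sketched as follows. This is an assumed simplification of the Euclidean Distance Minimization idea, not the paper's algorithm: "self" is a set of feature vectors from healthy-bearing runs, and an observation is anomalous when its minimum distance to the self set exceeds a threshold learned from the self data; the feature values and margin are invented.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class EDMDetector:
    """Minimal nearest-distance-to-self anomaly detector (illustrative)."""

    def __init__(self, healthy_features, margin=1.5):
        self.self_set = list(healthy_features)
        # threshold: largest nearest-neighbour distance within the self
        # set, inflated by a safety margin
        nn = [min(euclidean(a, b) for b in self.self_set if b is not a)
              for a in self.self_set]
        self.threshold = margin * max(nn)

    def is_anomaly(self, x):
        return min(euclidean(x, s) for s in self.self_set) > self.threshold

# hypothetical 2-D features (e.g. RMS level and kurtosis-like statistic)
healthy = [(1.0, 0.1), (1.1, 0.12), (0.95, 0.09), (1.05, 0.11)]
det = EDMDetector(healthy)
```

The appeal for non-stationary conditions is that no characteristic fault frequency is needed: only distances in feature space matter.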
Detecting the Presence of a Personality Disorder Using Interpersonal and Self-Dysfunction.
Beeney, Joseph E; Lazarus, Sophie A; Hallquist, Michael N; Stepp, Stephanie D; Wright, Aidan G C; Scott, Lori N; Giertych, Rachel A; Pilkonis, Paul A
2018-03-05
Calls have increased to place interpersonal and self-disturbance as defining features of personality disorders (PDs). Findings from a methodologically diverse set of studies suggest that a common factor undergirds all PDs. The nature of this core of PDs, however, is not clear. In the current study, interviews were completed for DSM-IV PD diagnosis and interpersonal dysfunction independently with 272 individuals (PD = 191, no-PD = 91). Specifically, we evaluated interpersonal dysfunction across social domains. In addition, we empirically assessed the structure of self-dysfunction in PDs. We found dysfunction in work and romantic domains, and unstable identity uniquely predicted variance in the presence of a PD. Using receiver operating characteristic analysis, we found that the interpersonal dysfunction and self-dysfunction scales each predicted PDs with high accuracy. In combination, the scales resulted in excellent sensitivity (.90) and specificity (.88). The results support interpersonal and self-dysfunction as general factors of PD.
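The reported sensitivity (.90) and specificity (.88) follow the standard confusion-matrix definitions, which can be sketched as follows; the labels below are made up for illustration, not the study's data.

```python
def sensitivity_specificity(truth, predicted):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
    truth/predicted are parallel sequences of 0/1 labels
    (1 = PD present / PD predicted)."""
    tp = sum(1 for t, p in zip(truth, predicted) if t and p)
    fn = sum(1 for t, p in zip(truth, predicted) if t and not p)
    tn = sum(1 for t, p in zip(truth, predicted) if not t and not p)
    fp = sum(1 for t, p in zip(truth, predicted) if not t and p)
    return tp / (tp + fn), tn / (tn + fp)

truth     = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
predicted = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
sens, spec = sensitivity_specificity(truth, predicted)
```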
A novel network module for medical devices.
Chen, Ping-Yu
2008-01-01
In order to allow medical devices to upload vital signs to a server on a network without manual configuration by end-users, a new network module is proposed. The proposed network module, called the Medical Hub (MH), functions as a bridge to fetch the data from all connected medical devices and then upload these data to the server. When powered on, the MH can immediately establish its network configuration automatically. Network Address Translation (NAT) traversal is also supported by the MH with the UPnP Internet Gateway Device (IGD) methodology. Besides the network configuration, other configurations in the MH are automatically established by using the remote management protocol TR-069. In addition, a mechanism for updating software automatically according to the variety of connected medical devices is proposed. With this mechanism, newly connected medical devices can be detected and supported by the MH without manual operation.
Obtaining gravitational waves from inspiral binary systems using LIGO data
NASA Astrophysics Data System (ADS)
Antelis, Javier M.; Moreno, Claudia
2017-01-01
The discovery of the astrophysical events GW150914 and GW151226 has experimentally confirmed the existence of gravitational waves (GW) and has demonstrated the existence of binary stellar-mass black hole systems. This finding marks the beginning of a new era that will reveal unexpected features of our universe. This work presents a basic insight into the fundamental theory of GW emitted by inspiral binary systems and describes the scientific and technological efforts developed to measure these waves using the interferometer-based detector called LIGO. Subsequently, the work presents a comprehensive data analysis methodology based on the matched filter algorithm, which aims to recover GW signals emitted by inspiral binary systems of astrophysical sources. This algorithm was evaluated with freely available LIGO data containing injected GW waveforms. Results of the experiments performed to assess detection accuracy showed the recovery of 85% of the injected GW signals.
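The core of a matched-filter search, sliding a known template over noisy data and locating the correlation peak, can be sketched in the time domain. This is a stripped-down illustration on synthetic data: real LIGO pipelines whiten by the detector noise spectrum, work in the frequency domain, and scan a bank of templates.

```python
import math

def matched_filter_peak(data, template):
    """Slide a known waveform template over the data and return the
    peak normalized correlation and the offset where it occurs."""
    norm_t = math.sqrt(sum(x * x for x in template))
    best, best_i = 0.0, 0
    for i in range(len(data) - len(template) + 1):
        seg = data[i:i + len(template)]
        corr = sum(a * b for a, b in zip(seg, template)) / norm_t
        if corr > best:
            best, best_i = corr, i
    return best, best_i

# toy chirp-like template injected into low-level deterministic "noise"
template = [math.sin(0.3 * n * n) for n in range(32)]
data = [0.05 * math.sin(1.7 * n) for n in range(256)]
for n in range(32):            # inject the signal at offset 100
    data[100 + n] += template[n]
peak, offset = matched_filter_peak(data, template)
```

The filter recovers the injection offset because correlation with the matching template dominates correlation with noise alone.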
Applications of neuroscience in criminal law: legal and methodological issues.
Meixner, John B
2015-01-01
The use of neuroscience in criminal law applications is an increasingly discussed topic among legal and psychological scholars. Over the past 5 years, several prominent federal criminal cases have referenced neuroscience studies and made admissibility determinations regarding neuroscience evidence. Despite this growth, the field is exceptionally young, and no one knows for sure how significant of a contribution neuroscience will make to criminal law. This article focuses on three major subfields: (1) neuroscience-based credibility assessment, which seeks to detect lies or knowledge associated with a crime; (2) application of neuroscience to aid in assessments of brain capacity for culpability, especially among adolescents; and (3) neuroscience-based prediction of future recidivism. The article briefly reviews these fields as applied to criminal law and makes recommendations for future research, calling for the increased use of individual-level data and increased realism in laboratory studies.
Homozygous and hemizygous CNV detection from exome sequencing data in a Mendelian disease cohort.
Gambin, Tomasz; Akdemir, Zeynep C; Yuan, Bo; Gu, Shen; Chiang, Theodore; Carvalho, Claudia M B; Shaw, Chad; Jhangiani, Shalini; Boone, Philip M; Eldomery, Mohammad K; Karaca, Ender; Bayram, Yavuz; Stray-Pedersen, Asbjørg; Muzny, Donna; Charng, Wu-Lin; Bahrambeigi, Vahid; Belmont, John W; Boerwinkle, Eric; Beaudet, Arthur L; Gibbs, Richard A; Lupski, James R
2017-02-28
We developed an algorithm, HMZDelFinder, that uses whole exome sequencing (WES) data to identify rare and intragenic homozygous and hemizygous (HMZ) deletions that may represent complete loss-of-function of the indicated gene. HMZDelFinder was applied to 4866 samples in the Baylor-Hopkins Center for Mendelian Genomics (BHCMG) cohort and detected 773 HMZ deletion calls (567 homozygous and 206 hemizygous) with an estimated sensitivity of 86.5% (82% for single-exonic and 88% for multi-exonic calls) and precision of 78% (53% for single-exonic and 96% for multi-exonic calls). Out of 773 HMZDelFinder-detected deletion calls, 82 were subjected to array comparative genomic hybridization (aCGH) and/or breakpoint PCR and 64 were confirmed. These include 18 single-exon deletions, out of which 8 were exclusively detected by HMZDelFinder and not by any of seven other CNV detection tools examined. Further investigation of the 64 validated deletion calls revealed at least 15 pathogenic HMZ deletions. Of those, 7 accounted for 17-50% of pathogenic CNVs in different disease cohorts where 7.1-11% of the molecular diagnosis solved rate was attributed to CNVs. In summary, we present an algorithm to detect rare, intragenic, single-exon deletion CNVs using WES data; this tool can be useful for disease gene discovery efforts and clinical WES analyses. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
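The read-depth intuition behind calling rare HMZ deletions from exome coverage can be sketched as follows; the thresholds, the rarity filter, and the toy coverage matrix are illustrative stand-ins, not HMZDelFinder's actual parameters or normalization.

```python
def find_hmz_deletions(coverage, low_cov=0.2, max_carrier_frac=0.01):
    """Toy read-depth HMZ deletion caller: an exon is a candidate
    deletion in a sample when its normalized coverage is near zero there
    while almost all other samples cover it normally (rarity filter,
    since common low-coverage exons are likely artifacts or polymorphic).

    coverage: dict sample -> list of per-exon normalized coverages.
    Returns a list of (sample, exon_index) candidate calls."""
    samples = list(coverage)
    n_exons = len(next(iter(coverage.values())))
    calls = []
    for e in range(n_exons):
        carriers = [s for s in samples if coverage[s][e] < low_cov]
        if carriers and len(carriers) / len(samples) <= max_carrier_frac:
            calls.extend((s, e) for s in carriers)
    return calls

# 200 samples, 5 exons: one private dropout (called) and one dropout
# shared by half the cohort (filtered out as non-rare)
cov = {f"S{i}": [1.0, 1.0, 1.0, 1.0, 1.0] for i in range(200)}
cov["S0"][3] = 0.0
for i in range(100):
    cov[f"S{i}"][1] = 0.0
calls = find_hmz_deletions(cov)
```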
VarDict: a novel and versatile variant caller for next-generation sequencing in cancer research
Lai, Zhongwu; Markovets, Aleksandra; Ahdesmaki, Miika; Chapman, Brad; Hofmann, Oliver; McEwen, Robert; Johnson, Justin; Dougherty, Brian; Barrett, J. Carl; Dry, Jonathan R.
2016-01-01
Accurate variant calling in next generation sequencing (NGS) is critical to understand cancer genomes better. Here we present VarDict, a novel and versatile variant caller for both DNA- and RNA-sequencing data. VarDict simultaneously calls SNV, MNV, InDels, complex and structural variants, expanding the detected genetic driver landscape of tumors. It performs local realignments on the fly for more accurate allele frequency estimation. VarDict performance scales linearly to sequencing depth, enabling ultra-deep sequencing used to explore tumor evolution or detect tumor DNA circulating in blood. In addition, VarDict performs amplicon aware variant calling for polymerase chain reaction (PCR)-based targeted sequencing often used in diagnostic settings, and is able to detect PCR artifacts. Finally, VarDict also detects differences in somatic and loss of heterozygosity variants between paired samples. VarDict reprocessing of The Cancer Genome Atlas (TCGA) Lung Adenocarcinoma dataset called known driver mutations in KRAS, EGFR, BRAF, PIK3CA and MET in 16% more patients than previously published variant calls. We believe VarDict will greatly facilitate application of NGS in clinical cancer research. PMID:27060149
Identifying hidden voice and video streams
NASA Astrophysics Data System (ADS)
Fan, Jieyan; Wu, Dapeng; Nucci, Antonio; Keralapura, Ram; Gao, Lixin
2009-04-01
Given the rising popularity of voice and video services over the Internet, accurately identifying voice and video traffic that traverse their networks has become a critical task for Internet service providers (ISPs). As the number of proprietary applications that deliver voice and video services to end users increases over time, the search for the one methodology that can accurately detect such services while being application independent still remains open. This problem becomes even more complicated when voice and video service providers like Skype, Microsoft, and Google bundle their voice and video services with other services like file transfer and chat. For example, a bundled Skype session can contain both a voice stream and a file transfer stream in the same layer-3/layer-4 flow. In this context, traditional techniques to identify voice and video streams do not work. In this paper, we propose a novel self-learning classifier, called VVS-I, that detects the presence of voice and video streams in flows with minimum manual intervention. Our classifier works in two phases: training phase and detection phase. In the training phase, VVS-I first extracts the relevant features, and subsequently constructs a fingerprint of a flow using the power spectral density (PSD) analysis. In the detection phase, it compares the fingerprint of a flow to the existing fingerprints learned during the training phase, and subsequently classifies the flow. Our classifier is not only capable of detecting voice and video streams that are hidden in different flows, but is also capable of detecting different applications (like Skype, MSN, etc.) that generate these voice/video streams. We show that our classifier can achieve close to 100% detection rate while keeping the false positive rate to less than 1%.
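The two-phase idea (PSD fingerprint in training, nearest-fingerprint comparison in detection) can be sketched with a plain periodogram. The features, synthetic traffic series, and distance metric below are simplified assumptions for illustration, not VVS-I's actual design.

```python
import cmath
import math

def psd(series):
    """Periodogram fingerprint: normalized squared DFT magnitudes of the
    mean-removed series (e.g. packet sizes or inter-arrival counts)."""
    n = len(series)
    mean = sum(series) / n
    xs = [x - mean for x in series]
    spec = []
    for k in range(n // 2):
        s = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                for t, x in enumerate(xs))
        spec.append(abs(s) ** 2)
    total = sum(spec) or 1.0
    return [p / total for p in spec]

def classify(flow, fingerprints):
    """Assign the flow to the nearest stored PSD fingerprint."""
    f = psd(flow)
    return min(fingerprints, key=lambda name: math.dist(f, fingerprints[name]))

# Training: voice traffic is strongly periodic (fixed packetization
# rate), bulk transfer is bursty.  These series are synthetic stand-ins.
voice_train = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
bulk_train = [1.0 if t % 17 < 2 else 0.1 * ((t * 7) % 5) for t in range(64)]
prints = {"voice": psd(voice_train), "bulk": psd(bulk_train)}

# Detection: a phase-shifted voice-like flow still matches "voice",
# because the PSD is insensitive to phase.
label = classify([math.sin(2 * math.pi * 5 * t / 64 + 0.3) for t in range(64)],
                 prints)
```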
NASA Astrophysics Data System (ADS)
Levesque, M.
Artificial satellites, and particularly space junk, drift continuously from their known orbits. In the surveillance-of-space context, they must be observed frequently to ensure that the corresponding orbital parameter database entries are up-to-date. Autonomous ground-based optical systems are periodically tasked to observe these objects, calculate the difference between their predicted and real positions, and update object orbital parameters. The real satellite positions are provided by the detection of the satellite streaks in the astronomical images specifically acquired for this purpose. This paper presents the image processing techniques used to detect and extract the satellite positions. The methodology includes several processing steps: image background estimation and removal, star detection and removal, an iterative matched filter for streak detection, and finally false alarm rejection algorithms. This detection methodology is able to detect very faint objects. Simulated data were used to evaluate the methodology's performance and determine the sensitivity limits within which the algorithm can perform detection without false alarms, which is essential to avoid corruption of the orbital parameter database.
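The processing chain (background estimation and removal, matched filtering, thresholding) can be sketched on a toy frame. The single-orientation line kernel and the thresholds below are simplifications of the iterative matched filter described, chosen only to show the structure of such a pipeline.

```python
def detect_streak(image, kernel_len=5, k_sigma=4.0):
    """Sketch of a streak-detection chain: (1) estimate and remove a flat
    background via the global median, (2) correlate each row with a short
    horizontal line kernel (a crude matched filter for streaks),
    (3) threshold at k_sigma robust deviations.  Returns (row, col) hits."""
    flat = sorted(v for row in image for v in row)
    background = flat[len(flat) // 2]                     # median background
    resid = [[v - background for v in row] for row in image]
    # robust scale from the median absolute deviation (fallback 1.0 when
    # the toy frame is noise-free and the MAD is zero)
    absdev = sorted(abs(v) for row in resid for v in row)
    sigma = 1.4826 * absdev[len(absdev) // 2] or 1.0
    hits = []
    for r, row in enumerate(resid):
        for c in range(len(row) - kernel_len + 1):
            # matched-filter response, normalized so noise has unit scale
            response = sum(row[c:c + kernel_len]) / kernel_len ** 0.5
            if response > k_sigma * sigma:
                hits.append((r, c))
    return hits

# 20x20 frame: uniform background of 10, faint horizontal streak on row 7
img = [[10.0] * 20 for _ in range(20)]
for c in range(4, 14):
    img[7][c] += 2.0
hits = detect_streak(img)
```

Summing along the streak direction is what buys sensitivity to faint objects: a per-pixel excess too weak to threshold individually becomes significant once integrated along the kernel.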
Brand Discrimination: An Implicit Measure of the Strength of Mental Brand Representations
Friedman, Mike; Leclercq, Thomas
2015-01-01
While mental associations between a brand and its marketing elements are an important part of brand equity, previous research has yet to provide a sound methodology to measure the strength of these links. The following studies present the development and validation of an implicit measure to assess the strength of mental representations of brand elements in the mind of the consumer. The measure described in this paper, which we call the Brand Discrimination task, requires participants to identify whether images of brand elements (e.g. color, logo, packaging) belong to a target brand or not. Signal detection theory (SDT) is used to calculate a Brand Discrimination index which gives a measure of overall recognition accuracy for a brand’s elements in the context of its competitors. A series of five studies shows that the Brand Discrimination task can discriminate between strong and weak brands, increases when mental representations of brands are experimentally strengthened, is relatively stable across time, and can predict brand choice, independently and while controlling for other explicit and implicit brand evaluation measures. Together, these studies provide unique evidence for the importance of mental brand representations in marketing and consumer behavior, along with a research methodology to measure this important consumer-based brand attribute. PMID:25803845
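Signal detection theory summarizes recognition accuracy with indices such as d'; whether the paper's Brand Discrimination index is computed exactly this way is not stated here, so the following is a generic SDT sketch with made-up hit and false-alarm rates.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """SDT sensitivity index: d' = z(hit rate) - z(false-alarm rate).
    In a brand task, a 'hit' is correctly accepting a target-brand
    element and a 'false alarm' is wrongly accepting a competitor's
    element; higher d' indicates a sharper mental representation."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

strong_brand = d_prime(0.90, 0.10)   # accurate on both targets and foils
weak_brand = d_prime(0.60, 0.40)
```

Because d' combines hits and false alarms, it separates genuine discrimination from a mere bias toward answering "yes".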
[Methodology for Identification of Inverse Drug Distribution, Spain].
López Pérez, M Arantzazu; Muñoz Arias, Mariano; Vázquez Mourelle, Raquel
2016-04-04
The phenomenon of reverse drug trafficking in the legal supply chain is an unlawful practice that poses serious risks to public health. The aim was to proactively identify pharmacies that carry out these illegal activities. An analysis was performed by cross-matching billing data to the SAS, covering 52 million packs of medicines for the 496 pharmacies in the province over a period of 29 months, against the drug packaging data supplied by the distribution entities of the province. A purpose-defined indicator, called 'percentage overbought', allows us to detect those pharmacies at high risk of being involved in this illicit trade. Testing in two pharmacies, one rural and one urban, revealed a diversion of 5,130 medicine packages and an illicit profit of €9,591.78 for the first, and 9,982 packages and €26,885.11 for the second; both had gone unnoticed in previous inspections. The methodology implemented makes it possible to define a profile of high-risk infringing pharmacies, identify new ones that had not been sanctioned, quantify the drugs diverted to illegal trade, and identify new drugs subject to diversion; it also helps to calculate the illicit profit obtained accurately and effectively.
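The abstract does not give the formula for 'percentage overbought', so the following is a hypothetical reconstruction of such an indicator, the share of purchased packs not accounted for by billed dispensations, with invented pack counts.

```python
def percentage_overbought(purchased_packs, dispensed_packs):
    """Hypothetical 'percentage overbought' indicator: the fraction of
    packs a pharmacy purchased from distributors that its billed
    dispensations do not account for, as a percentage.  The paper's
    exact definition may differ."""
    if purchased_packs == 0:
        return 0.0
    return 100.0 * max(purchased_packs - dispensed_packs, 0) / purchased_packs

# a pharmacy buying 12,000 packs but billing only 6,870 stands out,
# while a small purchase/dispensation mismatch does not
flagged = percentage_overbought(12000, 6870)
normal = percentage_overbought(10000, 9900)
```

Ranking all pharmacies in a province by such an indicator is what turns two otherwise unremarkable inspection targets into high-priority candidates.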
Temperature and heat wave trends in northwest Mexico
NASA Astrophysics Data System (ADS)
Martínez-Austria, Polioptro F.; Bandala, Erick R.; Patiño-Gómez, Carlos
2016-02-01
Increase in temperature extremes is one of the main expected impacts of climate change, as well as one of the first signs of its occurrence. Nevertheless, results emerging from General Circulation Models, while sufficient for large scales, are not enough for forecasting local trends and, hence, the IPCC has called for local studies based on on-site data. Indeed, it is expected that climate extremes will be detected much earlier than changes in climate averages. Heat waves are among the most important yet least studied climate extremes, and even their very definition remains controversial. This paper discusses the observed changes in temperature trends and heat waves in Northwestern Mexico, one of the most vulnerable regions of the country. The climate records in two locations of the region are analyzed using three different methodologies, including one of the cities with the most extreme climate in Mexico, Mexicali City in the state of Baja California, and the Yaqui River basin in the state of Sonora. Results showed clear trends of increasing temperature and heat wave occurrence in both of the study zones using the three methodologies proposed. As a result, some policy-making suggestions are included in order to increase the adaptability of the studied regions to climate change, particularly related to heat wave occurrence.
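One minimal trend test that can be applied to such station records is an ordinary least-squares slope on annual temperatures. The paper's three methodologies are not specified here, and the data below are synthetic, a warming trend of 0.03 °C/year plus a small deterministic wiggle standing in for interannual variability.

```python
def linear_trend(years, temps):
    """Ordinary least-squares slope of temperature on year
    (degrees per year)."""
    n = len(years)
    my, mt = sum(years) / n, sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1980, 2020))
temps = [30.0 + 0.03 * (y - 1980) + 0.2 * ((y * 13) % 7 - 3) / 3
         for y in years]
slope = linear_trend(years, temps)
```

On real records one would also test the slope's significance (e.g. with a Mann-Kendall test) rather than report the point estimate alone.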
The US EPA National Center for Environmental Assessment has developed a methodology to derive acute inhalation toxicity benchmarks, called acute reference exposures (AREs), for noncancer effects. The methodology provides guidance for the derivation of chemical-specific benchmark...
An Adaptive Database Intrusion Detection System
ERIC Educational Resources Information Center
Barrios, Rita M.
2011-01-01
Intrusion detection is difficult to accomplish with current methodologies when considering the database and the authorized entity. It is a common understanding that current methodologies focus on the network architecture rather than the database, which is not an adequate solution when considering the insider threat. Recent…
A Rebuttal of NTL Institute's Learning Pyramid
ERIC Educational Resources Information Center
Letrud, Kare
2012-01-01
This article discusses the learning pyramid corroborated by the National Training Laboratories Institute. It presents and complements historical and methodological critiques of the learning pyramid, and argues that the NTL Institute ought to retract its model.
2000-02-01
[HIDS] Program: Power Drive Train Crack Detection Diagnostics and Prognostics, Life Usage Monitoring, and Damage Tolerance; Techniques, Methodologies, and Experiences. Andrew Hess; Harrison Chin; William Hardman. ...continuing program ... to evaluate helicopter diagnostic, prognostic, and ... deployed engine monitoring systems in fixed-wing aircraft, notably on the A
PIA and REWIND: Two New Methodologies for Cross Section Adjustment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmiotti, G.; Salvatores, M.
2017-02-01
This paper presents two new cross section adjustment methodologies intended for coping with the problem of compensations. The first one, PIA (Progressive Incremental Adjustment), gives priority to the utilization of experiments of elemental type (those sensitive to a specific cross section), following a definite hierarchy on which type of experiment to use. Once the adjustment is performed, both the new adjusted data and the new covariance matrix are kept. The second methodology is called REWIND (Ranking Experiments by Weighting for Improved Nuclear Data). This new proposed approach tries to establish a methodology for ranking experiments by looking at the potential gain they can produce in an adjustment. Practical applications for different adjustments illustrate the results of the two methodologies against the current one and show the potential improvement for reducing uncertainties in target reactors.
Yan, Weixin; Zhang, Aiguo; Powell, Michael J
2016-07-21
Gastrointestinal stromal tumors (GISTs) have been recognized as a biologically distinctive type of tumor, different from smooth muscle and neural tumors of the gastrointestinal tract. The identification of genetic aberrations in proto-oncogenes that drive the growth of GISTs is critical for improving the efficacy of cancer therapy by matching targeted drugs to specific mutations. Research into the oncogenic mechanisms of GISTs has found that these tumors frequently contain activating gene mutations in either platelet-derived growth factor receptor A (PDGFRA) or the receptor tyrosine kinase (mast/stem cell growth factor receptor) encoded by the KIT gene. Mutant cancer subpopulations have the potential to disrupt durable patient responses to molecularly targeted therapy for GISTs, yet the prevalence and size of subpopulations remain largely unexplored. Detection of the cancer subpopulations that harbor low-frequency mutant alleles of target proto-oncogenes through the use of molecular genetic methods, such as polymerase chain reaction (PCR) target amplification technology, is hampered by the high abundance of wild-type alleles, which limits the sensitivity of detection of these minor mutant alleles. This is especially true in the case of mutant tumor DNA-derived "driver" and "drug-resistant" alleles present in the circulating cell-free tumor DNA (cfDNA) in the peripheral blood circulation of GIST patients. So-called "liquid biopsy" allows for the dynamic monitoring of the patients' tumor status during treatment using minimally invasive sampling. New methodologies, such as a technology that employs a xenonucleic acid (XNA) clamping probe to block the PCR amplification of wild-type templates, have allowed improved molecular detection of these low-frequency alleles both in tissue biopsy samples and in cfDNA. These new methodologies could be widely applied for minimally invasive molecular testing in the therapeutic management of GISTs.
Data mining of atmospheric parameters associated with coastal earthquakes
NASA Astrophysics Data System (ADS)
Cervone, Guido
Earthquakes are natural hazards that pose a serious threat to society and the environment. A single earthquake can claim thousands of lives, cause damages for billions of dollars, destroy natural landmarks and render large territories uninhabitable. Studying earthquakes, and the processes that govern their occurrence, is of fundamental importance to protect lives, properties and the environment. Recent studies have shown that anomalous changes in land, ocean and atmospheric parameters occur prior to earthquakes. The present dissertation introduces an innovative methodology, and its implementation, to identify anomalous changes in atmospheric parameters associated with large coastal earthquakes. Possible geophysical mechanisms are discussed in view of the close interaction between the lithosphere, the hydrosphere and the atmosphere. The proposed methodology is a multi-strategy data mining approach which combines wavelet transformations, evolutionary algorithms, and statistical analysis of atmospheric data to analyze possible precursory signals. One-dimensional wavelet transformations and statistical tests are employed to identify significant singularities in the data, which may correspond to anomalous peaks due to the earthquake preparatory processes. Evolutionary algorithms and other localized search strategies are used to analyze the spatial and temporal continuity of the anomalies detected over a large area (about 2,000 km²), to discriminate signals that are most likely associated with earthquakes from those due to other, mostly atmospheric, phenomena. Only statistically significant singularities occurring within a very short time of each other, and which track a rigorous geometrical path related to the geological properties of the epicentral area, are considered to be associated with a seismic event. A program called CQuake was developed to implement and validate the proposed methodology.
CQuake is a fully automated, real-time, semi-operational system, developed to study precursory signals associated with earthquakes. CQuake can be used for the retrospective analysis of past earthquakes, and for providing early-warning information about impending events. Using CQuake, more than 300 earthquakes have been analyzed. In the case of coastal earthquakes with magnitude larger than 5.0, prominent anomalies are found up to two weeks prior to the main event. In the case of earthquakes occurring away from the coast, no strong anomaly is detected. The identified anomalies provide a potentially reliable means of mitigating earthquake risks in the future, and can be used to develop a fully operational forecasting system.
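The singularity-detection step described above can be sketched with a first-level Haar wavelet transform plus a z-score test. This is a minimal illustration of the idea, not CQuake's actual implementation; the signal and threshold below are synthetic assumptions:

```python
import numpy as np

# Sketch of the one-dimensional wavelet + statistical-test idea: Haar
# detail coefficients highlight abrupt local changes, and a z-score test
# flags coefficients that stand out as statistically significant
# singularities. (Illustrative only; CQuake is far more elaborate.)

def haar_details(x):
    """First-level Haar wavelet detail coefficients of an even-length series."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def singularities(x, z_thresh=4.0):
    """Indices (in detail-coefficient space) of significant singularities."""
    d = haar_details(x)
    z = (d - d.mean()) / d.std()
    return np.where(np.abs(z) > z_thresh)[0]

# Smooth synthetic 'atmospheric parameter' with one sharp jump at t = 100.
rng = np.random.default_rng(0)
t = np.arange(200)
signal = np.sin(2 * np.pi * t / 50) + 0.05 * rng.standard_normal(200)
signal[100] += 3.0                     # injected anomalous spike

idx = singularities(signal)            # detail index 50 covers samples 100-101
```

The smooth seasonal oscillation produces only small detail coefficients, so the injected jump is the lone coefficient exceeding the significance threshold.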
NASA Astrophysics Data System (ADS)
Christensen-Dalsgaard, Jakob
Most of the roughly 3,500 species of anuran amphibians (frogs and toads) that exist today are highly vocal animals. In most frogs, males will spend considerable energy on calling and incur sizeable predation risks, and the females' detection and localization of the calls of conspecific males is often a prerequisite for successful mating. Therefore, acoustic communication is evidently evolutionarily important in the anurans, and their auditory system is probably shaped by the selective pressures associated with production, detection and localization of the communication calls.
Leroy, Emmanuelle C; Samaran, Flore; Bonnel, Julien; Royer, Jean-Yves
2016-01-01
Passive acoustic monitoring is an efficient way to provide insights on the ecology of large whales. This approach allows for long-term and species-specific monitoring over large areas. In this study, we examined six years (2010 to 2015) of continuous acoustic recordings at up to seven different locations in the Central and Southern Indian Basin to assess the peak periods of presence, seasonality and migration movements of Antarctic blue whales (Balaenoptera musculus intermedia). An automated method is used to detect the Antarctic blue whale stereotyped call, known as Z-call. Detection results are analyzed in terms of distribution, seasonal presence and diel pattern of emission at each site. Z-calls are detected year-round at each site, except for one located in the equatorial Indian Ocean, and display highly seasonal distribution. This seasonality is stable across years for every site, but varies between sites. Z-calls are mainly detected during autumn and spring at the subantarctic locations, suggesting that these sites are on the Antarctic blue whale migration routes, and mostly during winter at the subtropical sites. In addition to these seasonal trends, there is a significant diel pattern in Z-call emission, with more Z-calls in daytime than in nighttime. This diel pattern may be related to the blue whale feeding ecology.
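Automated detectors for a stereotyped call such as the Z-call are often built as template matchers. The sketch below slides a known template over a recording and thresholds the normalized cross-correlation; the study's actual detector may differ in detail, and all signals here are synthetic:

```python
import numpy as np

# Toy template-matching detector for a stereotyped call: slide the call
# template over the recording and flag normalized-correlation peaks.
# (Synthetic data; a real Z-call detector works on much lower frequencies
# and typically in the spectrogram domain.)

def normalized_xcorr(signal, template):
    """Normalized cross-correlation of a template against a 1-D signal."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    out = np.empty(len(signal) - n + 1)
    for i in range(len(out)):
        w = signal[i:i + n]
        out[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    return out

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * 27 * np.linspace(0, 1, 200))   # toy tonal call
recording = 0.3 * rng.standard_normal(2000)
recording[700:900] += template           # one embedded "call" at sample 700

scores = normalized_xcorr(recording, template)
detections = np.where(scores > 0.5)[0]
```

The correlation score peaks sharply where the embedded call begins, while pure-noise stretches stay far below the detection threshold.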
Bui, Thuy-Vy D.; Takekawa, John Y.; Overton, Cory T.; Schultz, Emily R.; Hull, Joshua M.; Casazza, Michael L.
2015-01-01
The California Ridgway's rail Rallus obsoletus obsoletus (hereafter California rail) is a secretive marsh bird endemic to tidal marshes in the San Francisco Bay (hereafter bay) of California. The California rail has undergone significant range contraction and population declines due to a variety of factors, including predation and the degradation and loss of habitat. Call-count surveys, which include call playbacks, based on the standardized North American marsh bird monitoring protocol have been conducted throughout the bay since 2005 to monitor population size and distribution of the California rail. However, call-count surveys are difficult to evaluate for efficacy or accuracy. To measure the accuracy of call-count surveys and investigate whether radio-marked California rails moved in response to call-count surveys, we compared locations of radio-marked California rails collected at frequent intervals (15 min) to California rail detections recorded during call-count surveys conducted over the same time periods. Overall, 60% of radio-marked California rails within 200 m of observers were not detected during call-count surveys. Movements of radio-marked California rails showed no directional bias (P = 0.92) irrespective of whether or not playbacks of five marsh bird species (including the California rail) were broadcast from listening stations. Our findings suggest that playbacks of rail vocalizations do not consistently influence California rail movements during surveys. However, call-count surveys may underestimate California rail presence; therefore, caution should be used when relating raw numbers of call-count detections to population abundance.
Schmidtke, Daniel; Schulz, Jochen; Hartung, Jörg; Esser, Karl-Heinz
2013-01-01
In the 1970s, Tavolga conducted a series of experiments in which he found behavioral evidence that the vocalizations of the catfish species Ariopsis felis may play a role in a coarse form of echolocation. Based on his findings, he postulated a similar function for the calls of closely related catfish species. Here, we describe the physical characteristics of the predominant call-type of Ariopsis seemanni. In two behavioral experiments, we further explore whether A. seemanni uses these calls for acoustic obstacle detection by testing the hypothesis that the call-emission rate of individual fish should increase when subjects are confronted with novel objects, as is known from other vertebrate species that use pulse-type signals to actively probe the environment. Audio-video monitoring of the fish under different obstacle conditions did not reveal a systematic increase in the number of emitted calls in the presence of novel objects or in dependence on the proximity between individual fish and different objects. These negative findings, in combination with our current understanding of directional hearing in fishes (which is a prerequisite for acoustic obstacle detection), make it highly unlikely that A. seemanni uses its calls for acoustic obstacle detection. We argue that the calls are more likely to play a role in intra- or interspecific communication (e.g. in school formation or predator deterrence) and present results from a preliminary Y-maze experiment that are indicative of positive phonotaxis of A. seemanni towards the calls of conspecifics. PMID:23741408
Evaluation of listener-based anuran surveys with automated audio recording devices
Shearin, A. F.; Calhoun, A.J.K.; Loftin, C.S.
2012-01-01
Volunteer-based audio surveys are used to document long-term trends in anuran community composition and abundance. Current sampling protocols, however, are not region- or species-specific and may not detect relatively rare or audibly cryptic species. We used automated audio recording devices to record calling anurans during 2006–2009 at wetlands in Maine, USA. We identified species calling, chorus intensity, time of day, and environmental variables when each species was calling and developed logistic and generalized mixed models to determine the time interval and environmental variables that optimize detection of each species during peak calling periods. We detected eight of nine anurans documented in Maine. Individual recordings selected from the sampling period (0.5 h past sunset to 0100 h) described in the North American Amphibian Monitoring Program (NAAMP) detected fewer species than were detected in recordings from 30 min past sunset until sunrise. Time of maximum detection of presence and full chorusing for three species (green frogs, mink frogs, pickerel frogs) occurred after the NAAMP sampling end time (0100 h). The NAAMP protocol’s sampling period may result in omissions and misclassifications of chorus sizes for certain species. These potential errors should be considered when interpreting trends generated from standardized anuran audio surveys.
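A logistic detection model of the kind described, relating the probability of detecting a calling species to a covariate such as hours past sunset, can be sketched as follows. This is a bare-bones illustration on synthetic data; the study's generalized mixed models additionally include random effects and multiple environmental covariates:

```python
import numpy as np

# Minimal logistic detection model: probability of detecting a calling
# species as a function of one covariate (here, hours past sunset).
# Synthetic data and plain gradient descent; a real analysis would fit a
# GLMM with site/night random effects.

def fit_logistic(x, y, lr=0.1, steps=5000):
    """Fit P(detect) = sigmoid(b0 + b1*x) by gradient descent."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
        g0, g1 = np.mean(p - y), np.mean((p - y) * x)
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

rng = np.random.default_rng(2)
hours = rng.uniform(0, 8, 500)                          # hours past sunset
true_p = 1.0 / (1.0 + np.exp(-(-2.0 + 0.6 * hours)))    # assumed true curve
detected = (rng.uniform(size=500) < true_p).astype(float)

b0, b1 = fit_logistic(hours, detected)
```

A positive fitted slope corresponds to the paper's finding for species like green frogs, whose detection peaks after the NAAMP cutoff: later recordings yield higher detection probability.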
Oil Spill Detection: Past and Future Trends
NASA Astrophysics Data System (ADS)
Topouzelis, Konstantinos; Singha, Suman
2016-08-01
In the last 15 years, the detection of oil spills by satellite means has moved from experimental to operational. What has really changed is satellite image availability: from the late 1990s, the age of "no data", we have moved forward 15 years to the age of the "Sentinels", with an abundance of data. Whether from large accidents related to offshore oil exploration and production activity or from illegal discharges from tankers, oil on the sea surface can now be regularly monitored over European waters. National and transnational organizations (e.g. the European Maritime Safety Agency's 'CleanSeaNet' service) routinely use SAR imagery to detect oil, owing to its all-weather, day-and-night imaging capability. Over all these years, however, the scientific detection methodology has remained relatively constant: from manual analysis to fully automatic detection methodologies, no significant contribution has been published in recent years, and certainly none has dramatically changed the rules of detection. On the contrary, although the overall accuracy of the methodology is questioned, the four main classification steps (dark-area detection, feature extraction, statistical database creation, and classification) are continuously improving. In recent years, researchers have turned to polarimetric SAR data for oil spill detection and characterization, although the utilization of Pol-SAR data for this purpose remains questionable due to the lack of verified datasets and the low spatial coverage of Pol-SAR data. The present paper points out the drawbacks of oil spill detection in recent years and focuses on the bottlenecks of oil spill detection methodologies. Solutions based on data availability, management and analysis are also proposed. Moreover, an ideal detection system is discussed with regard to satellite imagery and in situ observations using different scales and sensors.
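The first of the four classification steps, dark-area detection, can be sketched as adaptive thresholding of SAR backscatter (oil damps capillary waves, so slicks appear as low-backscatter patches). A toy illustration on a synthetic scene, not an operational detector:

```python
import numpy as np

# Sketch of dark-area detection in SAR imagery: mask pixels darker than a
# scene-adaptive threshold (mean minus k standard deviations). Operational
# detectors add speckle filtering, segmentation, and feature extraction.
# All values below are synthetic.

def dark_area_mask(backscatter_db, k=1.5):
    """Boolean mask of pixels darker than mean - k*std of the scene."""
    thresh = backscatter_db.mean() - k * backscatter_db.std()
    return backscatter_db < thresh

rng = np.random.default_rng(3)
scene = rng.normal(-8.0, 1.0, size=(100, 100))   # sea clutter, dB
scene[40:60, 30:70] -= 6.0                       # simulated slick (darker)

mask = dark_area_mask(scene)
fraction_dark = mask.mean()
```

The masked fraction recovers the simulated slick (8% of the scene) almost exactly, with only a handful of clutter pixels misclassified; subsequent steps would then extract shape and contrast features from each dark segment.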
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... during conference calls and via email discussions. Member duties include prioritizing topics, designing... their expertise in methodological issues such as meta-analysis, analytic modeling or clinical...
Characterization of deformable materials in the THOR dummy
DOT National Transportation Integrated Search
2000-01-01
Methodologies used to characterize the mechanical behavior of various materials used in the construction of the crash test dummy called THOR (Test device for Human Occupant Restraint) are described. These materials include polyurethane, neoprene, and...
Goodman, Corey W.; Major, Heather J.; Walls, William D.; Sheffield, Val C.; Casavant, Thomas L.; Darbro, Benjamin W.
2016-01-01
Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high-throughput, low-cost analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis, and receiver operating characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as the log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs: the measurement and use of genome-wide true- and false-negative data for the calculation of performance metrics and the comparison of CNV profiles between different microarray experiments. PMID:25595567
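The threshold-calibration idea behind CNV-ROC can be sketched as a per-probe ROC sweep that picks the log2-ratio cutoff maximizing Youden's J (true-positive rate minus false-positive rate). The data below are synthetic; the actual tool operates on genome-wide probe-level calls with a higher-resolution array serving as truth:

```python
import numpy as np

# ROC-based calibration of a log2-ratio threshold: treat higher-resolution
# calls as truth, sweep candidate thresholds, and keep the one maximizing
# Youden's J = TPR - FPR. (Synthetic probe data for illustration only.)

def roc_best_threshold(log2_ratios, truth):
    """Return (threshold, tpr, fpr) maximizing Youden's J."""
    best = (None, 0.0, 1.0, -1.0)
    for thr in np.unique(log2_ratios):
        calls = log2_ratios >= thr
        tpr = np.mean(calls[truth])        # sensitivity
        fpr = np.mean(calls[~truth])       # 1 - specificity
        j = tpr - fpr
        if j > best[3]:
            best = (thr, tpr, fpr, j)
    return best[:3]

rng = np.random.default_rng(4)
truth = np.concatenate([np.ones(300, bool), np.zeros(700, bool)])
# Duplicated probes cluster near log2 ratio ~0.58; normal probes near 0.
log2 = np.where(truth, rng.normal(0.58, 0.15, 1000), rng.normal(0.0, 0.15, 1000))

thr, tpr, fpr = roc_best_threshold(log2, truth)
```

The calibrated threshold lands between the two probe populations, trading a small sensitivity loss for a large specificity gain relative to a naive fixed cutoff.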
The Influence of Judgment Calls on Meta-Analytic Findings.
Tarrahi, Farid; Eisend, Martin
2016-01-01
Previous research has suggested that judgment calls (i.e., methodological choices made in the process of conducting a meta-analysis) have a strong influence on meta-analytic findings, questioning their robustness. However, prior research applies case study comparison or reanalysis of a few meta-analyses with a focus on a few selected judgment calls. These studies neglect the fact that different judgment calls are related to each other and simultaneously influence the outcomes of a meta-analysis, and that meta-analytic findings can vary due to non-judgment-call differences between meta-analyses (e.g., variations of effects over time). The current study analyzes the influence of 13 judgment calls in 176 meta-analyses in marketing research by applying a multivariate, multilevel meta-meta-analysis. The analysis considers simultaneous influences from different judgment calls on meta-analytic effect sizes and controls for alternative explanations based on non-judgment-call differences between meta-analyses. The findings suggest that judgment calls have only a minor influence on meta-analytic findings, whereas non-judgment-call differences between meta-analyses are more likely to explain differences in meta-analytic findings. The findings support the robustness of meta-analytic results and conclusions.
Garland, Ellen C; Castellote, Manuel; Berchok, Catherine L
2015-06-01
Beluga whales, Delphinapterus leucas, have a graded call system; call types exist on a continuum, making classification challenging. A description of vocalizations from the eastern Beaufort Sea beluga population during its spring migration is presented here, using both a non-parametric classification tree analysis (CART) and a Random Forest analysis. Twelve frequency and duration measurements were made on 1019 calls recorded over 14 days off Icy Cape, Alaska, resulting in 34 identifiable call types with 83% agreement in classification for both CART and Random Forest analyses. This high level of agreement in classification, given an initial subjective classification of calls into 36 categories, demonstrates that the methods applied here provide a quantitative analysis of a graded call dataset. Further, as calls cannot be attributed to individuals using single-sensor passive acoustic monitoring, these methods provide a comprehensive analysis of data where the influence of pseudo-replication of calls from individuals is unknown. This study is the first to describe the vocal repertoire of a beluga population using a robust and repeatable methodology. A baseline eastern Beaufort Sea beluga population repertoire is presented here, against which the call repertoires of other seasonally sympatric Alaskan beluga populations can be compared.
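The core operation of a CART analysis like the one described is a search for the feature threshold that minimizes class impurity. A single-split sketch with Gini impurity on invented "peak frequency" and "duration" measurements (hypothetical values, not the study's twelve variables):

```python
import numpy as np

# The elementary CART step: scan candidate thresholds on each acoustic
# measurement and keep the split with the lowest weighted Gini impurity.
# A Random Forest repeats this on bootstrapped data with random feature
# subsets. (Toy two-type example with made-up measurements.)

def gini(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Return (feature index, threshold) minimizing weighted Gini impurity."""
    best_feat, best_thr, best_imp = None, None, np.inf
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
            if len(left) == 0 or len(right) == 0:
                continue
            imp = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if imp < best_imp:
                best_feat, best_thr, best_imp = f, thr, imp
    return best_feat, best_thr

rng = np.random.default_rng(5)
# Call type A: low peak frequency; call type B: high peak frequency (Hz).
freq = np.concatenate([rng.normal(800, 50, 40), rng.normal(2000, 100, 40)])
dur = rng.normal(1.0, 0.3, 80)                # uninformative feature (s)
X = np.column_stack([freq, dur])
y = np.array([0] * 40 + [1] * 40)

feat, thr = best_split(X, y)
```

The split finder correctly selects the informative frequency measurement and places the threshold between the two call-type clusters, ignoring the uninformative duration feature.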
McLeod, M.A.; Andersen, D.E.
1998-01-01
Forest-nesting raptors are often difficult to detect and monitor because they can be secretive and their nests can be difficult to locate. Some species, however, respond to broadcasts of taped calls, and these responses may be useful both in monitoring population trends and in locating nests. We conducted broadcast surveys on roads and at active red-shouldered hawk (Buteo lineatus) nests in north-central Minnesota to determine the effects of call type (conspecific or great horned owl [Bubo virginianus]), time of day, and phase of the breeding cycle on red-shouldered hawk response behavior, and to evaluate the usefulness of broadcasts as a population monitoring tool using area-occupied and probability-of-detection techniques. During the breeding seasons of 1994 and 1995, we surveyed four 10-station road transects 59 times and conducted 76 surveys at 24 active nests. Results of these surveys indicated that conspecific calls broadcast prior to hatch and early in the day were the most effective method of detecting red-shouldered hawks. Probability of detection via conspecific calls averaged 0.25, and area occupied was 100%. Computer simulations using these field data indicated broadcast surveys have the potential to be used as a population monitoring tool.
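The reported average per-survey detection probability of 0.25 implies a simple closed form for cumulative detection across repeated visits to an occupied site, P(at least one detection in n surveys) = 1 - (1 - p)^n. A short simulation in the spirit of the paper's computer simulations confirms it:

```python
import numpy as np

# Cumulative detection probability over repeated broadcast surveys, using
# the paper's average per-survey detection probability p = 0.25 and
# assuming independent surveys (an idealization). A Monte Carlo run of
# simulated survey histories should match the closed form.

def p_at_least_one(p, n):
    """Probability of detecting an occupied site at least once in n surveys."""
    return 1.0 - (1.0 - p) ** n

rng = np.random.default_rng(6)
p, n, trials = 0.25, 8, 100_000
sim = (rng.uniform(size=(trials, n)) < p).any(axis=1).mean()
analytic = p_at_least_one(p, n)      # 1 - 0.75**8
```

Eight surveys push cumulative detection to about 90%, which illustrates why a low single-visit probability can still support monitoring when visits are repeated.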
THE HUNT FOR EXOMOONS WITH KEPLER (HEK). I. DESCRIPTION OF A NEW OBSERVATIONAL PROJECT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kipping, D. M.; Bakos, G. A.; Buchhave, L.
2012-05-10
Two decades ago, empirical evidence concerning the existence and frequency of planets around stars other than our own was absent. Since that time, the detection of extrasolar planets, from Jupiter-sized to, most recently, Earth-sized worlds, has blossomed, and we are finally able to shed light on the plurality of Earth-like, habitable planets in the cosmos. Extrasolar moons may also frequently be habitable worlds, but their detection, or even their systematic pursuit, remains lacking in the current literature. Here, we present a description of the first systematic search for extrasolar moons as part of a new observational project called 'The Hunt for Exomoons with Kepler' (HEK). The HEK project distills the entire list of known transiting planet candidates found by Kepler (2326 at the time of writing) down to the most promising candidates for hosting a moon. Selected targets are fitted using a multimodal nested sampling algorithm coupled with a planet-with-moon light-curve modeling routine. By comparing the Bayesian evidence of a planet-only model to that of a planet-with-moon model, the detection process is handled in a Bayesian framework. In the case of null detections, upper limits derived from posteriors marginalized over the entire prior volume will be provided to inform the frequency of large moons around viable planetary hosts, η☾. After discussing our methodologies for target selection, modeling, fitting, and vetting, we provide two example analyses.
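The Bayesian model comparison at the heart of HEK weighs the evidence of a planet-only model against a planet-with-moon model. Nested sampling is beyond a short sketch, so the toy below approximates the evidence with BIC on a synthetic box-shaped transit containing no moon signal; the parameter counts and noise level are invented:

```python
import numpy as np

# Toy model comparison in the HEK spirit: a planet-only model versus a
# planet-plus-moon model scored on the same light curve. Evidence is
# approximated by BIC here (HEK itself uses nested sampling, which yields
# the evidence directly). Lower BIC = preferred model.

def bic(residuals, sigma, k, n):
    """Bayesian information criterion for a Gaussian likelihood."""
    loglike = -0.5 * np.sum(residuals ** 2) / sigma ** 2
    return k * np.log(n) - 2.0 * loglike

rng = np.random.default_rng(7)
n, sigma = 500, 0.001
flux = np.ones(n)
flux[200:300] -= 0.01                  # box-shaped planet transit, depth 1%
obs = flux + sigma * rng.standard_normal(n)

planet_model = flux                    # 'fitted' planet-only model
moon_model = flux.copy()
moon_model[180:200] -= 0.0005          # extra (spurious) moon dip

bic_planet = bic(obs - planet_model, sigma, k=5, n=n)   # k values invented
bic_moon = bic(obs - moon_model, sigma, k=12, n=n)
```

With no moon in the data, the extra parameters of the moon model buy no fit improvement, so the complexity penalty makes the planet-only model win decisively, mirroring how a null detection is declared.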
Rousseau, Matthieu; Belleannee, Clemence; Duchez, Anne-Claire; Cloutier, Nathalie; Levesque, Tania; Jacques, Frederic; Perron, Jean; Nigrovic, Peter A; Dieude, Melanie; Hebert, Marie-Josee; Gelb, Michael H; Boilard, Eric
2015-01-01
Microparticles, also called microvesicles, are submicron extracellular vesicles produced by plasma membrane budding and shedding recognized as key actors in numerous physio(patho)logical processes. Since they can be released by virtually any cell lineages and are retrieved in biological fluids, microparticles appear as potent biomarkers. However, the small dimensions of microparticles and soluble factors present in body fluids can considerably impede their quantification. Here, flow cytometry with improved methodology for microparticle resolution was used to detect microparticles of human and mouse species generated from platelets, red blood cells, endothelial cells, apoptotic thymocytes and cells from the male reproductive tract. A family of soluble proteins, the secreted phospholipases A2 (sPLA2), comprises enzymes concomitantly expressed with microparticles in biological fluids and that catalyze the hydrolysis of membrane phospholipids. As sPLA2 can hydrolyze phosphatidylserine, a phospholipid frequently used to assess microparticles, and might even clear microparticles, we further considered the impact of relevant sPLA2 enzymes, sPLA2 group IIA, V and X, on microparticle quantification. We observed that if enriched in fluids, certain sPLA2 enzymes impair the quantification of microparticles depending on the species studied, the source of microparticles and the means of detection employed (surface phosphatidylserine or protein antigen detection). This study provides analytical considerations for appropriate interpretation of microparticle cytofluorometric measurements in biological samples containing sPLA2 enzymes.
Davarani, Saied Saeed Hosseiny; Najarian, Amin Morteza; Nojavan, Saeed; Tabatabaei, Mohammad-Ali
2012-05-06
Recent advances in electromembrane extraction (EME) methodology call for effective and accessible detection methods. Using imipramine and clomipramine as model therapeutics, this proof-of-principle work combines EME with gas chromatography analysis employing a flame ionization detector (FID). The drugs were extracted from acidic aqueous sample solutions through a supported liquid membrane (SLM) consisting of 2-nitrophenyl octyl ether (NPOE) impregnated on the walls of the hollow fiber. EME parameters, such as SLM composition, type of ion carrier, pH and composition of the donor and acceptor solutions, agitation speed, extraction voltage, and extraction time, were studied in detail. Under optimized conditions, the therapeutics were effectively extracted from different matrices with recoveries ranging from 90 to 95%. The samples were preconcentrated 270-280 times prior to GC analysis. Reliable linearity was also achieved for calibration curves, with a regression coefficient of at least 0.995. Detection limits and intra-day precision (n=3) were less than 0.7 ng mL(-1) and 8.5%, respectively. Finally, the method was applied to the determination and quantification of the drugs in human plasma and urine samples, and satisfactory results were achieved. Copyright © 2012 Elsevier B.V. All rights reserved.
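Figures of merit such as linearity and detection limits are computed from the calibration curve. A sketch using the common ICH-style convention LOD ≈ 3.3·s/slope, on invented response data (the paper's own limits were determined from its experimental measurements, not these numbers):

```python
import numpy as np

# Calibration-curve figures of merit as typically computed for GC-FID
# data: fit peak area vs concentration, report the correlation
# coefficient, and estimate the detection limit as 3.3 * (residual SD)
# divided by the slope (ICH-style convention). Toy numbers throughout.

conc = np.array([5.0, 10, 25, 50, 100, 250])               # ng/mL (hypothetical)
area = np.array([10.3, 20.9, 51.0, 103.0, 208.0, 512.0])   # detector response

slope, intercept = np.polyfit(conc, area, 1)   # least-squares line
pred = slope * conc + intercept
r = np.corrcoef(conc, area)[0, 1]              # linearity check
resid_sd = np.std(area - pred, ddof=2)         # SD about the regression
lod = 3.3 * resid_sd / slope                   # detection limit, ng/mL
```

The same residual SD scaled by 10 instead of 3.3 gives the corresponding quantification limit, which is how an LOD/LOQ pair is usually reported alongside the regression coefficient.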
Seismic Characterization of the Newberry and Cooper Basin EGS Sites
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.
2015-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation is real or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
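For a two-dimensional epicenter, a 95% probability ellipse of the kind MicroBayesLoc reports follows directly from the location covariance: the semi-axes are sqrt(chi2_95 * eigenvalue) along the covariance eigenvectors, with chi2_95 ≈ 5.991 for 2 degrees of freedom. A sketch with a hypothetical covariance matrix (not from the Newberry catalog):

```python
import numpy as np

# From a 2-D epicenter covariance matrix to a 95% confidence ellipse (the
# planar analogue of MicroBayesLoc's probability ellipsoids): semi-axes
# are sqrt(chi2_95 * eigenvalue) along the eigenvectors, where 5.991 is
# the 95th percentile of chi-square with 2 degrees of freedom.

CHI2_95_2DOF = 5.991

def error_ellipse(cov):
    """Semi-axis lengths and orientation (deg) of the 95% error ellipse."""
    evals, evecs = np.linalg.eigh(cov)            # eigenvalues ascending
    semi_axes = np.sqrt(CHI2_95_2DOF * evals)
    # Orientation of the major axis (largest eigenvalue's eigenvector).
    angle = np.degrees(np.arctan2(evecs[1, -1], evecs[0, -1]))
    return semi_axes, angle

# Hypothetical posterior covariance (km^2): elongated NE-SW uncertainty.
cov = np.array([[0.50, 0.30],
                [0.30, 0.50]])
semi_axes, angle = error_ellipse(cov)
```

Two relocated swarms are separable in exactly the sense used above: if their epicenters lie farther apart than their respective ellipses extend, each falls outside the other's 95% region.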
Perceiving polarization with the naked eye: characterization of human polarization sensitivity
Temple, Shelby E.; McGregor, Juliette E.; Miles, Camilla; Graham, Laura; Miller, Josie; Buck, Jordan; Scott-Samuel, Nicholas E.; Roberts, Nicholas W.
2015-01-01
Like many animals, humans are sensitive to the polarization of light. We can detect the angle of polarization using an entoptic phenomenon called Haidinger's brushes, which is mediated by dichroic carotenoids in the macula lutea. While previous studies have characterized the spectral sensitivity of Haidinger's brushes, other aspects remain unexplored. We developed a novel methodology for presenting gratings in polarization-only contrast at varying degrees of polarization in order to measure the lower limits of human polarized light detection. Participants were, on average, able to perform the task down to a threshold of 56%, with some able to go as low as 23%. This makes humans the most sensitive vertebrate tested to date. Additionally, we quantified a nonlinear relationship between presented and perceived polarization angle when an observer is presented with a rotatable polarized light field. This result confirms a previous theoretical prediction of how uniaxial corneal birefringence impacts the perception of Haidinger's brushes. The rotational dynamics of Haidinger's brushes were then used to calculate corneal retardance. We suggest that psychophysical experiments, based upon the perception of polarized light, are amenable to the production of affordable technologies for self-assessment and longitudinal monitoring of visual dysfunctions such as age-related macular degeneration. PMID:26136441
Cerqueira, Maristela B R; Guilherme, Juliana R; Caldas, Sergiane S; Martins, Manoel L; Zanella, Renato; Primel, Ednei G
2014-07-01
A modified version of the QuEChERS method has been evaluated for the determination of 21 pharmaceuticals and 6 personal care products (PPCPs) in drinking-water sludge samples by employing ultra-high-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS). The performance of the method was evaluated through linearity, recovery, precision (intra-day), method detection and quantification limits (MDL and MQL), and matrix effect. The calibration curves prepared in acetonitrile and in the matrix extract showed correlation coefficients ranging from 0.98 to 0.99. MQL values were on the order of ng g(-1) for most compounds. Recoveries between 50% and 93% were reached, with RSDs lower than 10% for most compounds. The matrix effect was almost absent, with values lower than 16% for 93% of the compounds. By coupling a quick and simple extraction called QuEChERS with UPLC-MS/MS analysis, a method that is both selective and sensitive was obtained. This methodology was successfully applied to real samples, and caffeine and benzophenone-3 were detected at ng g(-1) levels. Copyright © 2014 Elsevier Ltd. All rights reserved.
Monte Carlo simulation of β-γ coincidence system using plastic scintillators in 4π geometry
NASA Astrophysics Data System (ADS)
Dias, M. S.; Piuvezam-Filho, H.; Baccarelli, A. M.; Takeda, M. N.; Koskinas, M. F.
2007-09-01
A modified version of a Monte Carlo code called Esquema, developed at the Nuclear Metrology Laboratory at IPEN, São Paulo, Brazil, has been applied to simulate a 4πβ(PS)-γ coincidence system designed for primary radionuclide standardisation. This system consists of a plastic scintillator in 4π geometry, for alpha or electron detection, coupled to a NaI(Tl) counter for gamma-ray detection. The response curves for monoenergetic electrons and photons had been calculated previously with the PENELOPE code and were applied as input data to Esquema. The latter code simulates all the disintegration processes, from the precursor nucleus to the ground state of the daughter radionuclide. As a result, the curve of the observed disintegration rate as a function of the beta efficiency parameter can be simulated. A least-squares fit between the experimental activity values and the Monte Carlo calculation provided the actual radioactive source activity, without the need for conventional extrapolation procedures. Application of this methodology to 60Co and 133Ba radioactive sources is presented, and the results showed good agreement with a conventional proportional counter 4πβ(PC)-γ coincidence system.
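The final fitting step can be sketched as an ordinary least-squares scale fit: given the Monte Carlo-simulated shape of the observed rate versus the beta efficiency parameter, the source activity is the scale factor that best matches the measured rates. All numbers below are hypothetical; this is an illustrative reduction, not the Esquema code itself:

```python
def fit_activity(simulated_shape, measured_rates):
    """Least-squares scale factor N0 such that measured ≈ N0 * shape.

    Closed-form solution of min_N0 sum((measured - N0 * shape)^2).
    """
    num = sum(s * m for s, m in zip(simulated_shape, measured_rates))
    den = sum(s * s for s in simulated_shape)
    return num / den

# Hypothetical data: Monte Carlo shape versus the beta efficiency
# parameter, and measured coincidence rates (Bq) at the same points.
shape = [0.70, 0.80, 0.90, 0.95]
rates = [703.0, 801.0, 902.0, 948.0]
print(round(fit_activity(shape, rates), 1))  # activity near 1001 Bq
```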
A-Track: A new approach for detection of moving objects in FITS images
NASA Astrophysics Data System (ADS)
Atay, T.; Kaplan, M.; Kilic, Y.; Karapinar, N.
2016-10-01
We have developed a fast, open-source, cross-platform pipeline, called A-Track, for detecting moving objects (asteroids and comets) in sequential telescope images in FITS format. The pipeline is coded in Python 3. The moving objects are detected using a modified line detection algorithm, called MILD. We tested the pipeline on astronomical data acquired with an SI-1100 CCD on a 1-meter telescope. We found that A-Track performs very well in terms of detection efficiency, stability, and processing time. The code is hosted on GitHub under the GNU GPL v3 license.
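A-Track's MILD algorithm is not reproduced here, but the core idea of line-based moving-object detection in equally spaced exposures can be reduced to a midpoint test: a source moving linearly at constant rate appears, in the second of three frames, at the midpoint of its first- and third-frame positions. A toy sketch with hypothetical pixel coordinates:

```python
def is_moving_object(p1, p2, p3, tol=1.0):
    """For three equally spaced exposures, a linearly moving object
    satisfies p2 ≈ midpoint(p1, p3) within a pixel tolerance."""
    mx, my = (p1[0] + p3[0]) / 2, (p1[1] + p3[1]) / 2
    return abs(mx - p2[0]) <= tol and abs(my - p2[1]) <= tol

# A candidate drifting 5 px right and 2 px down per exposure:
print(is_moving_object((10, 10), (15, 12), (20, 14)))  # True
# An unrelated detection (e.g. a cosmic-ray hit) breaks the line:
print(is_moving_object((10, 10), (11, 30), (20, 14)))  # False
```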
Evaluation of a National Call Center and a Local Alerts System for Detection of New Cases of Ebola Virus Disease — Guinea, 2014
2016-03-11
...principally through the use of a telephone alert system. Community members and health facilities report deaths and suspected Ebola cases to local alert ... sensitivity of the national call center with the local alerts system, the CDC country team performed probabilistic record linkage of the combined
A generalized baleen whale call detection and classification system.
Baumgartner, Mark F; Mussoline, Sarah E
2011-05-01
Passive acoustic monitoring allows the assessment of marine mammal occurrence and distribution at greater temporal and spatial scales than is now possible with traditional visual surveys. However, the large volume of acoustic data and the lengthy and laborious task of manually analyzing these data have hindered broad application of this technique. To overcome these limitations, a generalized automated detection and classification system (DCS) was developed to efficiently and accurately identify low-frequency baleen whale calls. The DCS (1) accounts for persistent narrowband and transient broadband noise, (2) characterizes temporal variation of dominant call frequencies via pitch-tracking, and (3) classifies calls based on attributes of the resulting pitch tracks using quadratic discriminant function analysis (QDFA). Automated detections of sei whale (Balaenoptera borealis) downsweep calls and North Atlantic right whale (Eubalaena glacialis) upcalls were evaluated using recordings collected in the southwestern Gulf of Maine during the spring seasons of 2006 and 2007. The accuracy of the DCS was similar to that of a human analyst: variability in differences between the DCS and an analyst was similar to that between independent analysts, and temporal variability in call rates was similar among the DCS and several analysts.
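The pitch-tracking step (2) can be caricatured as taking the dominant frequency bin in each time frame of a spectrogram; the actual DCS adds noise compensation and track continuity, which this hypothetical sketch omits:

```python
def pitch_track(spectrogram, freqs):
    """Dominant frequency per time frame: argmax over frequency bins."""
    track = []
    for frame in spectrogram:          # frame: power per frequency bin
        i = max(range(len(frame)), key=frame.__getitem__)
        track.append(freqs[i])
    return track

freqs = [50, 100, 150, 200]            # Hz, hypothetical bins
spec = [[1, 9, 2, 1],                  # a downsweep: 100 Hz -> 50 Hz
        [2, 8, 3, 1],
        [9, 2, 1, 1]]
print(pitch_track(spec, freqs))  # [100, 100, 50]
```

Attributes of such tracks (start/end frequency, slope, duration) are the kind of inputs a classifier like QDFA can then discriminate on.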
NASA Technical Reports Server (NTRS)
Leininger, G.; Jutila, S.; King, J.; Muraco, W.; Hansell, J.; Lindeen, J.; Franckowiak, E.; Flaschner, A.
1975-01-01
A methodology is described for the evaluation of societal impacts associated with the implementation of a new technology. Theoretical foundations for the methodology, called the total assessment profile, are established from both the economic and social science perspectives. The procedure provides for accountability of nonquantifiable factors and measures through the use of a comparative value matrix by assessing the impacts of the technology on the value system of the society.
Szklo, André Salem; da Silva Freire Coutinho, Evandro; Reichenheim, Michael Eduardo
2012-01-01
According to the World Health Organization, smoking is an important cause of death worldwide. To encourage smoking cessation, persuasive messages can be used to raise smokers' risk perception. This article discusses challenges and solutions in designing a study to evaluate the effect of two different communication strategies ("gains from quitting" vs. "losses from continuing smoking") in encouraging calls to a quitline. The authors conducted an intervention study in two subway stations for 4 weeks, considering only one strategy per station. Large posters containing non-age-specific images and texts, based on the theme "shortness of breath," were displayed on central dividing columns on the boarding platforms. Call rates from the selected stations, and the respective rate ratios, overall and per study week, were calculated. Passengers who were smokers and were exposed to the positive-content message called on average 1.7 times more often than did those exposed to the negative-content message (p = .01). Moreover, call rate ratios did not decline over the 4 weeks of the study (p = .40). The effectiveness findings suggest that antismoking campaigns could use positive-content messages in order to recruit a larger smoker population. The proposed methodology can also be used to evaluate the effectiveness of messages for "capturing" individuals with other health problems (e.g., alcohol abuse), thereby increasing its potential impact.
Schuchmann, Maike; Siemers, Björn M
2010-09-17
Only recently have data on bat echolocation call intensities begun to accumulate. Yet intensity is an ecologically crucial parameter, as it determines the extent of the bats' perceptual space and, specifically, prey detection distance. Interspecifically, we thus asked whether sympatric, congeneric bat species differ in call intensities and whether such differences play a role in niche differentiation. Specifically, we investigated whether R. mehelyi, which calls at a frequency clearly above what is predicted by allometry, compensates for the frequency-dependent loss in detection distance by using elevated call intensity. Maximum echolocation call intensities might depend on body size or condition and thus be used as an honest signal of quality for intraspecific communication. We investigated, for the first time, whether a size-intensity relation is present in echolocating bats. We measured maximum call intensities and frequencies for all five European horseshoe bat species. Maximum intensity differed among species, largely due to R. euryale. Furthermore, we found no compensation for the frequency-dependent loss in detection distance in R. mehelyi. Intraspecifically, there is a negative correlation between forearm length and intensity in R. euryale, and a trend toward a negative correlation between body condition index and intensity in R. ferrumequinum. In R. hipposideros, females had 8 dB higher intensities than males. There were no correlations between intensity and body size, and no sex differences, for the other species. Based on call intensity and frequency measurements, we estimated echolocation ranges for our study community. These suggest that intensity differences result in different prey detection distances and thus likely play some role in resource access. It is interesting and at first glance counter-intuitive that, where a correlation was found, smaller bats called louder than larger individuals. Such a negative relationship between size or condition and vocal amplitude may indicate an as yet unknown physiological or sexual selection pressure.
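Echolocation range estimates of the kind mentioned above typically combine spherical spreading loss with frequency-dependent atmospheric absorption. A rough sketch under those textbook assumptions (all dB values below are hypothetical, not the paper's measurements):

```python
import math

def detection_distance(source_level_db, threshold_db, alpha_db_per_m):
    """Distance (m) at which a call drops to the detection threshold,
    assuming spherical spreading (20*log10 r) plus atmospheric
    absorption (alpha dB/m). Solved by bisection on the one-way path."""
    def received(r):
        return source_level_db - 20 * math.log10(r) - alpha_db_per_m * r
    lo, hi = 0.1, 1000.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if received(mid) > threshold_db:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical numbers: a 3 dB louder call yields a measurably larger
# detection radius at a strongly absorbed (high) call frequency.
d1 = detection_distance(127, 20, 2.0)
d2 = detection_distance(130, 20, 2.0)
print(round(d1, 1), round(d2, 1), d2 > d1)
```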
NASA Astrophysics Data System (ADS)
Madariaga, J. M.; Torre-Fdez, I.; Ruiz-Galende, P.; Aramendia, J.; Gomez-Nubla, L.; Fdez-Ortiz de Vallejuelo, S.; Maguregui, M.; Castro, K.; Arana, G.
2018-04-01
Advanced methodologies based on Raman spectroscopy are proposed to detect prebiotic and biotic molecules in returned samples from Mars: (a) optical microscopy with confocal micro-Raman, (b) the SCA instrument, and (c) Raman imaging. Examples are given for NWA 6148.
NASA Astrophysics Data System (ADS)
Serra, Roger; Lopez, Lautaro
2018-05-01
Different approaches to the detection of damage based on dynamic measurement of structures have appeared in recent decades. They were based, amongst others, on changes in natural frequencies, modal curvatures, strain energy, or flexibility. Wavelet analysis has also been used to detect the abnormalities in mode shapes induced by damage. However, the majority of previous work used signals uncorrupted by noise. Moreover, the damage influence on each mode shape was studied separately. This paper proposes a new methodology based on a combined modal wavelet transform strategy that copes with noisy signals while, at the same time, extracting the relevant information from each mode shape. The proposed methodology is then compared with the most frequently used and widely studied methods from the bibliography. To evaluate the performance of each method, its capacity to detect and localize damage is analyzed in different cases. The comparison is carried out by simulating the oscillations of a cantilever steel beam with and without a defect as a numerical case. The proposed methodology proved to outperform classical methods on noisy signals.
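The wavelet step can be illustrated with a single-scale continuous wavelet transform of a mode shape: a small slope discontinuity that is hard to see in the raw deflection curve shows up as a localized peak in the coefficients. This sketch uses a Ricker wavelet and a synthetic cantilever-like mode shape; it is an illustration of the principle, not the paper's combined modal strategy:

```python
import math

def ricker(points, a):
    """Ricker ("Mexican hat") wavelet sampled at `points` values."""
    out = []
    A = 2 / (math.sqrt(3 * a) * math.pi ** 0.25)
    for i in range(points):
        t = i - (points - 1) / 2
        out.append(A * (1 - (t / a) ** 2) * math.exp(-t * t / (2 * a * a)))
    return out

def cwt_row(signal, wavelet):
    """One scale of a CWT: 'same'-length correlation of signal and wavelet."""
    n, m = len(signal), len(wavelet)
    half = m // 2
    row = []
    for i in range(n):
        s = 0.0
        for j in range(m):
            k = i + j - half
            if 0 <= k < n:
                s += signal[k] * wavelet[j]
        row.append(s)
    return row

# Synthetic mode shape: a smooth curve plus a small slope
# discontinuity (the "defect") starting at index 60.
shape = [((i / 99.0) ** 2) + (0.05 * (i - 60) if i > 60 else 0.0)
         for i in range(100)]
coeffs = [abs(c) for c in cwt_row(shape, ricker(17, 2.0))]
peak = max(range(10, 90), key=lambda i: coeffs[i])  # skip edge effects
print(peak)  # index near the defect at 60
```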
Ricci, Shannon W.; Bohnenstiehl, DelWayne R.; Eggleston, David B.; Kellogg, M. Lisa; Lyon, R. Patrick
2017-01-01
During May 2015, passive acoustic recorders were deployed at eight subtidal oyster reefs within Harris Creek Oyster Sanctuary in Chesapeake Bay, Maryland USA. These sites were selected to represent both restored and unrestored habitats having a range of oyster densities. Throughout the survey, the soundscape within Harris Creek was dominated by the boatwhistle calls of the oyster toadfish, Opsanus tau. A novel, multi-kernel spectral correlation approach was developed to automatically detect these boatwhistle calls using their two lowest harmonic bands. The results provided quantitative information on how call rate and call frequency varied in space and time. Toadfish boatwhistle fundamental frequency ranged from 140 Hz to 260 Hz and was well correlated (r = 0.94) with changes in water temperature, with the fundamental frequency increasing by ~11 Hz for every 1°C increase in temperature. The boatwhistle call rate increased from just a few calls per minute at the start of monitoring on May 7th to ~100 calls/min on May 10th and remained elevated throughout the survey. As male toadfish are known to generate boatwhistles to attract mates, this rapid increase in call rate was interpreted to mark the onset of spring spawning behavior. Call rate was not modulated by water temperature, but showed a consistent diurnal pattern, with a sharp decrease in rate just before sunrise and a peak just after sunset. There was a significant difference in call rate between restored and unrestored reefs, with restored sites having nearly twice the call rate as unrestored sites. This work highlights the benefits of using automated detection techniques that provide quantitative information on species-specific call characteristics and patterns. This type of non-invasive acoustic monitoring provides long-term, semi-continuous information on animal behavior and abundance, and operates effectively in settings that are otherwise difficult to sample. PMID:28792543
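The multi-kernel spectral correlation detector itself is not reproduced here, but its premise, that a boatwhistle concentrates energy in its two lowest harmonic bands, can be sketched with a toy two-band score (all spectra below are hypothetical):

```python
def boatwhistle_score(spectrum, freqs, f0, bw=10.0):
    """Toy two-harmonic detector: product of normalized in-band
    energies at the fundamental (f0) and the second harmonic (2*f0)."""
    def band_energy(fc):
        return sum(p for f, p in zip(freqs, spectrum) if abs(f - fc) <= bw)
    total = sum(spectrum) or 1.0
    return (band_energy(f0) / total) * (band_energy(2 * f0) / total)

freqs = list(range(0, 501, 10))              # Hz bins, hypothetical
call = [5.0 if abs(f - 200) <= 10 or abs(f - 400) <= 10 else 0.1
        for f in freqs]                      # energy at f0 and 2*f0
noise = [0.5 for _ in freqs]                 # broadband noise
print(boatwhistle_score(call, freqs, 200) >
      boatwhistle_score(noise, freqs, 200))  # True
```

Since the fundamental tracks temperature (roughly 140-260 Hz over the survey), a practical detector would sweep f0 over that range rather than fix it.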
NASA Astrophysics Data System (ADS)
Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham
2018-01-01
Structural health monitoring consists of using sensors integrated within structures, together with algorithms, to perform load monitoring, damage detection, damage location, damage size and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously subjected to variations in its pitch angle. The results demonstrated the capability of the methodology to cluster data according to 13 different load conditions (pitch angles), perform the OBS, and detect six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% at a 99% confidence level.
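The Q (squared prediction error) index used alongside the PCA model can be sketched for a one-component baseline: a healthy sample moves along the baseline loading vector and leaves no residual, while damage breaks the strain pattern. The sensor values and loadings below are hypothetical, and the paper's actual model is hierarchical and nonlinear:

```python
def q_index(x, mean, component):
    """Squared prediction error (Q / SPE): residual norm after
    projecting a strain sample onto a 1-D PCA baseline model."""
    centered = [xi - mi for xi, mi in zip(x, mean)]
    score = sum(c * w for c, w in zip(centered, component))
    residual = [c - score * w for c, w in zip(centered, component)]
    return sum(r * r for r in residual)

# Hypothetical baseline: strains at 4 FBG sensors vary along one direction.
mean = [100.0, 200.0, 150.0, 120.0]
w = [0.5, 0.5, 0.5, 0.5]                 # unit-norm loading vector
healthy = [102.0, 202.0, 152.0, 122.0]   # moves along w: Q ~ 0
damaged = [102.0, 198.0, 155.0, 118.0]   # breaks the strain pattern
print(q_index(healthy, mean, w), q_index(damaged, mean, w))
```

In practice Q is compared against a statistical control limit (e.g. at 99% confidence) rather than against zero.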
Craig, Hugh; Berretta, Regina; Moscato, Pablo
2016-01-01
In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in graphs by using the Jensen-Shannon distance, a dissimilarity measure originating in information theory. Moreover, we use graph-theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset containing the frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high-quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
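The Jensen-Shannon distance underlying the proximity graphs is straightforward to compute for discrete word-frequency profiles (the three-bin distributions below are hypothetical stand-ins for a play's word-frequency vector):

```python
import math

def js_distance(p, q):
    """Jensen-Shannon distance between two discrete distributions:
    the square root of the JS divergence (base-2 logs, bounded in [0, 1])."""
    def kl(a, b):
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return math.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

# Hypothetical word-frequency profiles of two plays:
p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
print(round(js_distance(p, q), 3))
print(js_distance(p, p))  # 0.0 for identical profiles
```

Unlike the raw KL divergence, this quantity is symmetric and satisfies the triangle inequality, which is what makes it usable as an edge weight in a proximity graph.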
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-28
... without primary health care clinical experience may be selected based on their expertise in methodological... calls and via email discussions. Member duties include prioritizing topics, designing research plans...
Langhout, Regina Day
2016-12-01
Agitation, as deployed by the Industrial Areas Foundation (IAF), occurs when imaginations and curiosities are piqued, and self-interest is made visible. In this framework, agitation is a step in creating change. In this paper, I outline two agitations within US-based community psychology. I then describe a third agitation that is underway; I add my voice and call for a methodology of diffraction as a contribution to critical reflexivity practices within US-based community psychology. Consistent with the IAF framework, I do not provide solutions. I write this paper as a provocation to help us think imaginatively and creatively about our actions and future, so that we can consider the paradigm shifts needed to move into critical ways of understanding connection, responsibility, accountability, and creating change, themes of interest both at the time of Swampscott and today. © Society for Community Research and Action 2016.
León, María Cosio; Nieto-Hipólito, Juan Ivan; Garibaldi-Beltrán, Julián; Amaya-Parra, Guillermo; Luque-Morales, Priscy; Magaña-Espinoza, Pedro; Aguilar-Velazco, José
2016-06-01
Wellness is a term often used to describe optimal health as a "dynamic balance of physical, emotional, social, spiritual, and intellectual health," while healthcare refers to the care offered to patients to improve their health. We use both concepts, as well as the Business Model Canvas (BMC) methodology, to design a digital ecosystem model for healthcare and wellness called DE4HW; the model considers the economic, technological, and legal asymmetries that are present in e-services across geographical regions. The BMC methodology was embedded into the global project strategy called IBOT (Initiate, Build, Operate, and Transfer), a methodology for establishing a functional, integrated national telemedicine network and virtual education network, whose phased rationale we adopted. The results in this work illustrate the design of the DE4HW model within the first phase of IBOT, enriched with the BMC, which enables us to define actors, their interactions, rules, and protocols in order to build DE4HW, while the IBOT strategy manages the project goal up to the transfer phase, where an integral healthcare and wellness service platform is turned over to stakeholders.
[Priority research agendas: a strategic resource for health in Latin America].
Becerra-Posada, Francisco; de Snyder, Nelly Salgado; Cuervo, Luis Gabriel; Montorzi, Gabriela
2014-12-01
Understand and analyze the procedures used to create national integrated research agendas from 2007 to 2011 in Argentina, Guatemala, Mexico, Panama, and Paraguay. Descriptive, cross-sectional study using an online survey of agenda preparation processes; specifically, development, integration, implementation, and use and dissemination of the agenda. The 45 respondents reported following specific methodologies for agenda construction and had a good opinion of organizational aspects with regard to the prior information provided and the balance among disciplines and stakeholders. Some 60% considered the coordinators impartial, although 25% mentioned biases favoring some subject; 42% received technical support from consultants, reading matter, and methodological guidelines; 40% engaged in subject-matter priority-setting; and 55% confirmed dissemination and communication of the agenda. However, only 22% reported inclusion of agenda topics in national calls for research proposals. In the countries studied, development of the health research agenda was characterized by prior planning and appropriate organization to achieve consensus-based outcomes. Nevertheless, the agendas were not used in national calls for research proposals, reflecting a lack of coordination in national health research systems and a lack of connection between funders and researchers. It is recommended that stakeholders strengthen integration and advocacy efforts to modify the processes and structures of agenda-based calls for research proposals.
James, Richard; Khim, Keovathanak; Boudarene, Lydia; Yoong, Joanne; Phalla, Chea; Saint, Saly; Koeut, Pichenda; Mao, Tan Eang; Coker, Richard; Khan, Mishal Sameer
2017-08-22
Globally, almost 40% of tuberculosis (TB) patients remain undiagnosed, and those that are diagnosed often experience prolonged delays before initiating correct treatment, leading to ongoing transmission. While there is a push for active case finding (ACF) to improve early detection and treatment of TB, there is extremely limited evidence about the relative cost-effectiveness of different ACF implementation models. Cambodia presents a unique opportunity for addressing this gap in evidence, as ACF has been implemented there using different models, but no comparisons have been conducted. The objective of our study is to contribute to knowledge and methodology on comparing the cost-effectiveness of alternative ACF implementation models from the health service perspective, using programmatic data, in order to inform national policy and practice. We retrospectively compared three distinct ACF implementation models, namely door-to-door symptom screening in urban slums, checking contacts of TB patients, and door-to-door symptom screening focusing on rural populations aged over 55, in terms of the number of new bacteriologically-positive pulmonary TB cases diagnosed and the cost of implementation, assuming activities are conducted by the national TB program of Cambodia. We calculated the cost per additional case detected using the alternative ACF models. Our analysis, which is the first of its kind for TB, revealed that the ACF model based on door-to-door screening in poor urban areas of Phnom Penh was the most cost-effective (249 USD per case detected, 737 cases diagnosed), followed by the model based on testing contacts of TB patients (308 USD per case detected, 807 cases diagnosed), and symptomatic screening of older rural populations (316 USD per case detected, 397 cases diagnosed). Our study provides new evidence on the relative effectiveness and economics of three implementation models for enhanced TB case finding, in line with calls for data from 'routine conditions' to be included in disease control program strategic planning. Such cost-effectiveness comparisons are essential to inform the resource allocation decisions of national policy makers in resource-constrained settings. We applied a novel, pragmatic methodological approach, designed to provide results that are directly relevant to policy makers, costing the interventions from Cambodia's national TB program's perspective and using case finding data from implementation activities rather than experimental settings.
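The reported ratios are simple cost-effectiveness quotients. In the sketch below, the total costs are back-calculated from the quoted per-case figures, so they are approximations rather than the study's raw cost data:

```python
def cost_per_case(total_cost_usd, cases_detected):
    """Cost-effectiveness ratio: implementation cost per additional
    bacteriologically-positive case detected."""
    return total_cost_usd / cases_detected

# Totals back-calculated from the reported per-case ratios (approximate).
models = {
    "urban door-to-door": (249 * 737, 737),
    "contact screening":  (308 * 807, 807),
    "rural over-55":      (316 * 397, 397),
}
for name, (cost, cases) in models.items():
    print(name, round(cost_per_case(cost, cases)))
```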
Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps
NASA Astrophysics Data System (ADS)
Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.
2012-05-01
Typical ultrasonic methodology for nondestructive scanning evaluation uses systematic scanning paths. In many cases, this approach is time-inefficient and consumes excessive energy and computational power. Here, a methodology for the scanning of defects using an ultrasonic echo-pulse scanning technique combined with chaotic trajectory generation is proposed. It is implemented on a Cartesian-coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve the detection probability, the proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than traditional ones at localizing and characterizing hidden flaws.
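The abstract does not specify the chaotic function or the mirror mapping used; a common stand-in is a logistic map per axis with a tent-like fold, which keeps iterates inside the unit-square search area while spreading them irregularly:

```python
def chaotic_scan(n_points, x0=0.3, y0=0.7, r=3.99):
    """Generate a scan trajectory on the unit square from two logistic
    maps; for r near 4 the iterates are chaotic and spread over the
    whole search area."""
    pts, x, y = [], x0, y0
    for _ in range(n_points):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        pts.append((x, y))
    return pts

def mirror(u):
    """Tent-like mirror mapping, folding iterates for more even coverage."""
    return 2 * u if u < 0.5 else 2 * (1 - u)

traj = [(mirror(x), mirror(y)) for x, y in chaotic_scan(500)]
assert all(0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 for x, y in traj)
print(len(traj))  # 500
```

Each trajectory point would then be scaled to the physical scan area and visited by the robotic positioner before firing an echo pulse.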
NASA Astrophysics Data System (ADS)
El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali
2015-09-01
The paper develops a set-membership detection methodology, which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise with a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
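The interval-prediction idea reduces to a set-membership test: a fault is flagged only when a measurement leaves the model-predicted interval inflated by the data-driven noise bound, which is what buys robustness against false alarms. A minimal sketch with hypothetical deflection values:

```python
def fault_flag(measured, predicted_lo, predicted_hi, noise_bound):
    """Set-membership test: flag a fault when the measured surface
    position falls outside the interval prediction inflated by the
    data-driven noise bound."""
    return not (predicted_lo - noise_bound <= measured
                <= predicted_hi + noise_bound)

# Hypothetical control-surface deflections (degrees):
print(fault_flag(5.2, 4.8, 5.5, 0.2))   # False: consistent with model
print(fault_flag(7.4, 4.8, 5.5, 0.2))   # True: abnormal position
```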
Quantum Dots Applied to Methodology on Detection of Pesticide and Veterinary Drug Residues.
Zhou, Jia-Wei; Zou, Xue-Mei; Song, Shang-Hong; Chen, Guan-Hua
2018-02-14
The pesticide and veterinary drug residues brought by large-scale agricultural production have become one of the major issues in the fields of food safety and environmental ecological security. It is necessary to develop rapid, sensitive, qualitative, and quantitative methodology for the detection of pesticide and veterinary drug residues. As one of the achievements of nanoscience, quantum dots (QDs) have been widely used in the detection of pesticide and veterinary drug residues. In these methodological studies, the QD signal modalities used include fluorescence, chemiluminescence, electrochemiluminescence, photoelectrochemistry, etc. QDs can also be assembled into sensors with different materials, such as QD-enzyme, QD-antibody, QD-aptamer, and QD-molecularly imprinted polymer sensors. Many achievements in the detection of pesticide and veterinary drug residues have been obtained from the different combinations of these signals and sensors. They are summarized in this paper to provide a reference for QD applications in the detection of pesticide and veterinary drug residues.
Bayesian truthing as experimental verification of C4ISR sensors
NASA Astrophysics Data System (ADS)
Jannson, Tomasz; Forrester, Thomas; Romanov, Volodymyr; Wang, Wenjian; Nielsen, Thomas; Kostrzewski, Andrew
2015-05-01
In this paper, a general methodology for the experimental verification/validation of the performance of C4ISR and other sensors is presented, based on Bayesian inference in general and binary sensors in particular. This methodology, called Bayesian Truthing, defines performance metrics for binary sensors in physics, optics, electronics, medicine, law enforcement, C4ISR, QC, ATR (Automatic Target Recognition), terrorism-related events, and many other areas. For Bayesian Truthing, the sensing medium itself is not what is truly important; it is how the decision process is affected.
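For a binary sensor, the Bayesian machinery behind such performance metrics is just Bayes' rule; the sensitivity, specificity, and prevalence values below are hypothetical:

```python
def posterior_positive(sensitivity, specificity, prevalence):
    """Bayes' rule for a binary sensor: probability the target is
    present given a positive reading (positive predictive value)."""
    tp = sensitivity * prevalence            # true-positive mass
    fp = (1 - specificity) * (1 - prevalence)  # false-positive mass
    return tp / (tp + fp)

# Even a good sensor (95% sensitivity and specificity) yields a
# modest PPV when the target is rare (1% prevalence):
print(round(posterior_positive(0.95, 0.95, 0.01), 3))  # 0.161
```

This base-rate effect is exactly why the decision process, rather than the sensing medium, dominates the metrics.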
Anthropogenic ``Global Warming'' Alarmism: Illuminating some Scientific and Methodological Flaws
NASA Astrophysics Data System (ADS)
Gould, Larry
2009-10-01
There continues to be an increasing number of scientists and public figures around the world who are challenging the dominant political- and media-driven claims that have been bolstered by so-called ``consensus'' scientific views -- that dangerous ``global warming/climate change'' is caused primarily by human-produced carbon dioxide. This general talk will show that the weight of scientific evidence strongly contradicts the alarmist claims. It will also explain some of the methodological flaws that continue to threaten the scientific method.
NASA Astrophysics Data System (ADS)
Byers, J. M.; Doctor, K.
2017-12-01
A common application of satellite- and airborne-acquired hyperspectral imagery in the visible and NIR spectrum is the assessment of vegetation. Various absorption features of plants, related to both water and chlorophyll content, can be used to measure the vigor of the vegetation and its access to underlying water sources. The typical strategy is to form hand-crafted features from the hyperspectral data cube by selecting two wavelengths to form difference or ratio images in the pixel space. The new image attempts to provide greater contrast for some feature of the vegetation. The Normalized Difference Vegetation Index (NDVI) is a widely used example, formed from the ratio of differences and sums at two different wavelengths. There are dozens of these indices that are ostensibly formed using insights about the underlying physics of the spectral absorption, with claims to efficacy in representing various properties of vegetation. In the language of machine learning, these vegetation indices are features that can serve as a useful data representation within an algorithm. In this work we use a powerful approach from machine learning, probabilistic graphical models (PGM), to balance the competing needs of using existing hydrological classifications of terrain while finding statistically reliable features within hyperspectral data for identifying the generative process of the data. The algorithm in its simplest form is called a Naïve Bayes (NB) classifier and can be constructed by a data-driven estimation procedure for the conditional probability distributions that form the PGM. The Naïve Bayes model assumes that all vegetation indices (VI) are independent of one another given the hydrological class label. We seek to test its validity in a pilot study of detecting subsurface water flow pathways from VI. A more sophisticated PGM, called tree-augmented NB, which accounts for the probabilistic dependence between VI features, will also be explored. This methodology provides a general approach for classifying hydrological structures from hyperspectral data.
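The NDVI feature and the Naïve Bayes conditional-independence assumption can be sketched together; the two-class model and all probabilities below are hypothetical, not the paper's trained PGM:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def naive_bayes_posterior(features, priors, likelihoods):
    """Naive Bayes: P(class | features), assuming the vegetation-index
    features are conditionally independent given the class label."""
    scores = {}
    for c, prior in priors.items():
        p = prior
        for name, value in features.items():
            p *= likelihoods[c][name][value]
        scores[c] = p
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

# Hypothetical two-class model: 'wet' (subsurface flow) vs 'dry' terrain,
# with a single discretized NDVI feature.
priors = {"wet": 0.5, "dry": 0.5}
likelihoods = {
    "wet": {"ndvi_high": {True: 0.8, False: 0.2}},
    "dry": {"ndvi_high": {True: 0.3, False: 0.7}},
}
x = {"ndvi_high": ndvi(0.6, 0.1) > 0.5}   # one pixel's feature
post = naive_bayes_posterior(x, priors, likelihoods)
print(round(post["wet"], 3))
```

A tree-augmented NB would replace the per-feature product with factors conditioned on both the class and one parent feature, capturing VI-to-VI dependence.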
Jensen, Jamie L.; Dario-Becker, Juville; Hughes, Lee E.; Amburn, D. Sue Katz; Shaw, Joyce A.
2012-01-01
Recent recommendations for educational research encourage empirically tested, theory-based, completely transparent, and broadly applicable studies. In light of these recommendations, we call for a research standard and community of practice in the evaluation of technology use in the undergraduate life science classroom. We outline appropriate research methodology, review and critique the past research on technology usage and, lastly, suggest a new and improved focus for research on emerging technologies. PMID:23653777
METHODS FOR EVALUATING THE SUSTAINABILITY OF GREEN PROCESSES
A methodology, called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator), has been developed in the U.S. EPA's Office of Research and Development to directly compare the sustainability of proces...
Response-Guided Community Detection: Application to Climate Index Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bello, Gonzalo; Angus, Michael; Pedemane, Navya
Discovering climate indices (time series that summarize spatiotemporal climate patterns) is a key task in the climate science domain. In this work, we approach this task as a problem of response-guided community detection; that is, identifying communities in a graph associated with a response variable of interest. To this end, we propose a general strategy for response-guided community detection that explicitly incorporates information of the response variable during the community detection process, and introduce a graph representation of spatiotemporal data that leverages information from multiple variables. We apply our proposed methodology to the discovery of climate indices associated with seasonal rainfall variability. Our results suggest that our methodology is able to capture the underlying patterns known to be associated with the response variable of interest and to improve its predictability compared to existing methodologies for data-driven climate index discovery and official forecasts.
Leaf Movements of Indoor Plants Monitored by Terrestrial LiDAR
Herrero-Huerta, Mónica; Lindenbergh, Roderik; Gard, Wolfgang
2018-01-01
Plant leaf movement is induced by a combination of external and internal stimuli. Detailed geometric characterization of such movement is expected to improve understanding of these mechanisms. Terrestrial LiDAR (TLiDAR) is a metric, high-quality, non-invasive and innovative sensor system for analyzing plant movement. As an active sensing technique it is independent of light conditions and able to obtain accurate point clouds of high spatial and temporal resolution. In this study, a TLiDAR-based approach for parameterizing leaf movement is introduced. For this purpose, two Calathea roseopicta plants were scanned in an indoor environment during two full days, one day in natural light conditions and the other in darkness. The methodology estimates leaf movement by segmenting individual leaves using an octree-based 3D grid and monitoring the changes in their orientation by Principal Component Analysis. Additionally, canopy variations of the plant as a whole were characterized by a convex-hull approach. As a result, 9 leaves in plant 1 and 11 leaves in plant 2 were automatically detected, with global accuracies of 93.57 and 87.34%, respectively, compared to a manual detection. For plant 1 in natural light conditions, the average displacement of the leaves between 7.00 a.m. and 12.30 p.m. was 3.67 cm, estimated using so-called deviation maps; the maximum displacement was 7.92 cm. In addition, the orientation changes of each leaf within a day were analyzed; the maximum variation in the vertical angle was 69.6° from 12.30 to 6.00 p.m. In darkness, the displacements were smaller and showed a different orientation pattern. The canopy volume of plant 1 changed more in the morning (4.42 dm3) than in the afternoon (2.57 dm3). The results of plant 2 largely confirmed those of the first plant and were added to check the robustness of the methodology.
The results show how to quantify leaf orientation variation and leaf movements over the course of a day at millimetre accuracy under different light conditions, confirming the feasibility of the proposed methodology for robustly analysing leaf movements. PMID:29527217
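The orientation-monitoring step can be illustrated compactly: fit a plane to a segmented leaf's points by PCA and take the least-variance principal component as the leaf normal, then track the angle between normals across scan epochs. This is a generic sketch assuming NumPy, not the authors' code.

```python
import math
import numpy as np

def leaf_normal(points):
    """Unit normal of a leaf: PCA plane fit, taking the principal
    component with the smallest variance as the surface normal."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # eigh returns eigenvalues of a symmetric matrix in ascending order,
    # so column 0 is the least-variance direction.
    _, vecs = np.linalg.eigh(centered.T @ centered)
    return vecs[:, 0]

def orientation_change_deg(points_t0, points_t1):
    """Angle in degrees between leaf normals at two scan epochs."""
    n0, n1 = leaf_normal(points_t0), leaf_normal(points_t1)
    c = abs(float(n0 @ n1))  # the sign of a fitted normal is arbitrary
    return math.degrees(math.acos(min(1.0, c)))
```

For a leaf tilted by 30° between scans, the function recovers a 30° orientation change from the raw point clouds alone.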
Evolutionary neural networks for anomaly detection based on the behavior of a program.
Han, Sang-Jun; Cho, Sung-Bae
2006-06-01
Learning the behavior of a given program with machine-learning techniques based on system-call audit data is effective for detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. To apply them to real-world problems successfully, it is important to determine the structures and weights of the neural networks. However, finding appropriate structures takes a very long time because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of ENNs is that they take less time to obtain superior neural networks than conventional approaches, because they discover the structures and weights of the neural networks simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.
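The abstract does not give the ENN details; as a hedged sketch of the underlying idea (network weights found by evolution rather than gradient descent), the toy (1, λ) evolution strategy below evolves the weights of a fixed tiny network. Evolving the structure as well, as the paper's ENNs do, is omitted for brevity, and all names are illustrative.

```python
import math
import random

def mlp(weights, x):
    """Tiny fixed 2-2-1 network; weights is a flat list of 9 floats."""
    w = weights
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

def loss(weights, data):
    return sum((mlp(weights, x) - y) ** 2 for x, y in data)

def evolve(data, generations=300, pop=20, sigma=0.4, seed=1):
    """(1, pop) evolution strategy: mutate the best genome with Gaussian
    noise and keep an offspring only if it strictly improves the loss --
    weights are found without any gradient information."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(9)]
    for _ in range(generations):
        offspring = [[w + rng.gauss(0, sigma) for w in best]
                     for _ in range(pop)]
        cand = min(offspring, key=lambda g: loss(g, data))
        if loss(cand, data) < loss(best, data):
            best = cand
    return best
```

In the paper's setting the "data" would be encoded system-call sequences rather than toy vectors, and the genome would also encode connectivity.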
Bat detective-Deep learning tools for bat acoustic signal detection.
Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E
2018-03-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.
Multi-decadal Hydrological Retrospective: Case study of Amazon floods and droughts
NASA Astrophysics Data System (ADS)
Wongchuig Correa, Sly; Paiva, Rodrigo Cauduro Dias de; Espinoza, Jhan Carlo; Collischonn, Walter
2017-06-01
Recently developed methodologies such as climate reanalysis make it possible to create a historical record of climate systems. This paper proposes a methodology called Hydrological Retrospective (HR), which essentially uses large rainfall datasets as input to hydrological models to develop a record of past hydrology, making it possible to analyze past floods and droughts. We developed the methodology for the Amazon basin, where studies have shown an increase in the intensity and frequency of hydrological extreme events in recent decades. We used eight large precipitation datasets (more than 30 years) as input to a large-scale hydrological and hydrodynamic model (MGB-IPH). HR products were then validated against several in situ discharge gauges controlling the main Amazon sub-basins, focusing on maximum and minimum events. For the most accurate HR, based on performance metrics, we assessed its skill in detecting floods and droughts, comparing the results with in situ observations. A statistical trend analysis of the time series was performed for the intensity of seasonal floods and droughts over the entire Amazon basin. Results indicate that HR represented most past extreme events well compared with in situ observed data, consistent with many events reported in the literature. Some minor regional events were not reported in the literature, because of their flow duration, but were captured by HR. To represent past regional hydrology and seasonal hydrological extreme events, we believe it is feasible to use large precipitation datasets such as i) climate reanalyses, which are mainly based on a land surface component, and ii) datasets based on merged products. A significant upward trend in intensity was seen in maximum annual discharge (related to floods) in western and northwestern regions and in minimum annual discharge (related to droughts) in south and central-south regions of the Amazon basin.
Because of the global coverage of rainfall datasets, this methodology can be transferred to other regions for better estimation of future hydrological behavior and its impact on society.
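The abstract does not name the trend test applied to the annual maximum and minimum discharge series. The Mann-Kendall test is a common nonparametric choice for hydrological time series, sketched below (with the standard no-ties variance approximation) as an assumption rather than as the authors' exact method.

```python
import math

def mann_kendall_z(series):
    """Mann-Kendall trend statistic for a time series.
    Positive Z suggests an upward trend, negative a downward one;
    |Z| > 1.96 is significant at the 5% level (two-sided,
    no-ties variance approximation)."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0
```

Applied to annual maxima (floods) and minima (droughts) per sub-basin, a significant positive Z would correspond to the upward intensity trends the paper reports.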
Du, Xiaojiao; Jiang, Ding; Hao, Nan; Qian, Jing; Dai, Liming; Zhou, Lei; Hu, Jianping; Wang, Kun
2016-10-04
The development of novel detection methodologies with simplicity and ultrasensitivity is essential for constructing electrochemiluminescence (ECL) aptasensing architectures. Herein, a facile, specific, and sensitive methodology was developed for the first time for quantitative detection of microcystin-LR (MC-LR), based on a steric hindrance amplifying effect between the aptamer and target analytes assisted by three-dimensional boron and nitrogen codoped graphene hydrogels (BN-GHs). The recognition reaction was monitored by quartz crystal microbalance (QCM) to validate the possible steric hindrance effect. First, the BN-GHs were synthesized via a self-assembly hydrothermal method and then applied as the Ru(bpy)₃²⁺ immobilization platform for further loading of the biomolecule aptamers, owing to their nanoporous structure and large specific surface area. Interestingly, we discovered for the first time that, without the aid of a conventional double-stranded DNA configuration, such three-dimensional nanomaterials can directly amplify the steric hindrance effect between the aptamer and target analytes to a detectable level, and this facile methodology could be used for an exquisite assay. With MC-LR as a model, this novel ECL biosensor showed high sensitivity and a wide linear range. This strategy supplies a simple and versatile platform for specific and sensitive determination of a wide range of aptamer-related targets, implying that three-dimensional nanomaterials could play a crucial role in engineering and developing novel detection methodologies for ECL aptasensing.
2012-09-30
generalized power-law detection algorithm for humpback whale vocalizations. J. Acoust. Soc. Am. 131(4), 2682-2699.
Roch, M. A., H. Klinck, S...Heaney (2012b). Site specific probability of passive acoustic detection of humpback whale calls from single fixed hydrophones. J. Acoust. Soc. Am...
...monitoring: Correcting humpback call detections for site-specific and time-dependent environmental characteristics. JASA Express Letters, submitted October, 2012, 5 pgs plus 3 figs.
DOT National Transportation Integrated Search
2003-04-01
Introduction Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...
NASA Astrophysics Data System (ADS)
Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay
2018-01-01
Successful applications of Diffusion Maps (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus yields more insight into the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to accommodate additional dimension-reduction methods. In this work, the incorporation of Laplacian Eigenmaps and Principal Component Analysis (PCA) is explored; the latter algorithm, named PCA-Dev, is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated. Improved results are reported by benchmarking against other machine learning algorithms.
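The abstract does not specify the internals of DM-EVD or PCA-Dev. As a hedged illustration of the general idea behind a PCA-based deviation indicator, the sketch below (assuming NumPy; all function names are ours) fits PCA on healthy baseline data and scores new samples by their reconstruction error outside the retained subspace, the classic Q/SPE health statistic.

```python
import numpy as np

def fit_pca(baseline, k):
    """Fit a k-component PCA on healthy baseline data (rows = samples)."""
    X = np.asarray(baseline, dtype=float)
    mean = X.mean(axis=0)
    # Right singular vectors of the centered data are the principal axes.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:k]  # (mean vector, k x d matrix of axes)

def deviation(model, x):
    """Squared reconstruction error of sample x outside the PCA
    subspace (the Q / squared-prediction-error statistic): large values
    flag abnormal machine behavior."""
    mean, axes = model
    r = np.asarray(x, dtype=float) - mean
    recon = axes.T @ (axes @ r)
    return float(np.sum((r - recon) ** 2))
```

In a monitoring loop, one would threshold `deviation` (e.g. at a high percentile of the baseline scores) to raise an abnormality alarm.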
The Hospice Environmental Survey (HES): Pilot Test of a New Measurement Instrument.
ERIC Educational Resources Information Center
Taylor, Jean H.; Perrill, Norman K.
1988-01-01
Describes development of the Hospice Environmental Survey (HES) to measure user's perception of the homelike atmosphere provided by a hospital inpatient unit called Hospice House. Presents the HES instrument, methodology, and pilot study data. (Author/NB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobi, Rober
2007-03-28
This Topical Report (#6 of 9) consists of the figures 3.6-13 to (and including) 3.6-18 (and appropriate figure captions) that accompany the Final Technical Progress Report entitled: “Innovative Methodology for Detection of Fracture-Controlled Sweet Spots in the Northern Appalachian Basin” for DOE/NETL Award DE-AC26-00NT40698.
Sela, Itamar; Ashkenazy, Haim; Katoh, Kazutaka; Pupko, Tal
2015-07-01
Inference of multiple sequence alignments (MSAs) is a critical part of phylogenetic and comparative genomics studies. However, from the same set of sequences different MSAs are often inferred, depending on the methodologies used and the assumed parameters. Much effort has recently been devoted to improving the ability to identify unreliable alignment regions. Detecting such unreliable regions was previously shown to be important for downstream analyses relying on MSAs, such as the detection of positive selection. Here we developed GUIDANCE2, a new integrative methodology that accounts for: (i) uncertainty in the process of indel formation, (ii) uncertainty in the assumed guide tree and (iii) co-optimal solutions in the pairwise alignments, used as building blocks in progressive alignment algorithms. We compared GUIDANCE2 with seven methodologies to detect unreliable MSA regions using extensive simulations and empirical benchmarks. We show that GUIDANCE2 outperforms all previously developed methodologies. Furthermore, GUIDANCE2 also provides a set of alternative MSAs which can be useful for downstream analyses. The novel algorithm is implemented as a web-server, available at: http://guidance.tau.ac.il. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
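GUIDANCE2 combines several sources of alignment uncertainty; as a minimal illustration of the underlying operation of comparing alternative MSAs, the sketch below scores one alignment against another by the fraction of shared residue pairs. This is a crude agreement measure in the spirit of such comparisons, not the GUIDANCE2 algorithm itself.

```python
def aligned_pairs(msa):
    """Set of residue pairs placed in the same column of an MSA.
    msa: list of equal-length strings with '-' gaps; a residue is
    identified by (sequence index, position in the ungapped sequence)."""
    counters = [0] * len(msa)
    pairs = set()
    for col in range(len(msa[0])):
        present = []
        for s, seq in enumerate(msa):
            if seq[col] != '-':
                present.append((s, counters[s]))
                counters[s] += 1
        for i in range(len(present)):
            for j in range(i + 1, len(present)):
                pairs.add((present[i], present[j]))
    return pairs

def agreement(msa_a, msa_b):
    """Fraction of msa_a's residue pairs reproduced by msa_b: a crude
    per-alignment reliability score for alternative MSAs of the same
    sequences."""
    pa, pb = aligned_pairs(msa_a), aligned_pairs(msa_b)
    return len(pa & pb) / len(pa) if pa else 1.0
```

Averaging such agreement scores per column over many perturbed alternative alignments is one way to localize unreliable MSA regions.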
Shabangu, Fannie W.; Yemane, Dawit; Stafford, Kathleen M.; Ensor, Paul; Findlay, Ken P.
2017-01-01
Antarctic blue whales Balaenoptera musculus intermedia were harvested to perilously low numbers by commercial whaling during the past century, and their large-scale response to environmental variability is poorly understood. This study uses acoustic data collected from 586 sonobuoys deployed in the austral summers of 1997 through 2009, south of 38°S, coupled with visual observations of blue whales during the IWC SOWER line-transect surveys. The characteristic Z-calls and D-calls of Antarctic blue whales were detected using an automated detection template and visual verification. Using a random forest model, we characterized the environmental preference patterns, spatial occurrence and acoustic behaviour of Antarctic blue whales. Distance to the southern boundary of the Antarctic Circumpolar Current (SBACC), latitude and distance from the nearest Antarctic shores were the main geographic predictors of blue whale call occurrence. Satellite-derived sea surface height, sea surface temperature, and productivity (chlorophyll-a) were the most important environmental predictors of blue whale call occurrence. Call rates of D-calls were strongly predicted by the location of the SBACC, latitude and the visually detected number of whales in an area, while call rates of Z-calls were predicted by the SBACC, latitude and longitude. Satellite-derived sea surface height, wind stress, wind direction, water depth, sea surface temperature, chlorophyll-a and wind speed were important environmental predictors of blue whale call rates in the Southern Ocean. Blue whale call occurrence and call rates varied significantly in response to inter-annual and long-term variability of those environmental predictors. Our results identify the response of Antarctic blue whales to inter-annual variability in environmental conditions and highlight potential suitable habitats for this population.
Such emerging knowledge about the acoustic behaviour, environmental and habitat preferences of Antarctic blue whales is important in improving the management and conservation of this highly depleted species. PMID:28222124
Auditing Complex Concepts in Overlapping Subsets of SNOMED
Wang, Yue; Wei, Duo; Xu, Junchuan; Elhanan, Gai; Perl, Yehoshua; Halper, Michael; Chen, Yan; Spackman, Kent A.; Hripcsak, George
2008-01-01
Limited resources and the sheer volume of concepts make auditing a large terminology, such as SNOMED CT, a daunting task. It is essential to devise techniques that can aid an auditor by automatically identifying concepts that deserve attention. A methodology for this purpose based on a previously introduced abstraction network (called the p-area taxonomy) for a SNOMED CT hierarchy is presented. The methodology algorithmically gathers concepts appearing in certain overlapping subsets, defined exclusively with respect to the p-area taxonomy, for review. The results of applying the methodology to SNOMED’s Specimen hierarchy are presented. These results are compared against a control sample composed of concepts residing in subsets without the overlaps. With the use of the double bootstrap, the concept group produced by our methodology is shown to yield a statistically significant higher proportion of error discoveries. PMID:18998838
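The double bootstrap used in the paper additionally calibrates a first-level bootstrap interval; the sketch below shows only that first level, comparing error proportions between two audited concept groups. All names and data are illustrative, and the labeling convention (1 = error found in a concept) is our assumption.

```python
import random

def bootstrap_diff_ci(group_a, group_b, n_boot=2000, alpha=0.05, seed=0):
    """Bootstrap confidence interval for the difference in error
    proportions between two audited concept groups (1 = error found).
    If the whole interval lies above 0, group_a has a significantly
    higher error-discovery rate. Single bootstrap only; a double
    bootstrap would re-resample to calibrate the interval."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(group_a) for _ in group_a]
        rb = [rng.choice(group_b) for _ in group_b]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    lo = diffs[int((alpha / 2) * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

In the paper's setting, `group_a` would be concepts in overlapping p-areas and `group_b` the control sample of concepts in non-overlapping subsets.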
Papadimitropoulos, Adam; Rovithakis, George A; Parisini, Thomas
2007-07-01
In this paper, the problem of fault detection in mechanical systems performing linear motion under the action of friction phenomena is addressed. The friction effects are modeled through the dynamic LuGre model. The proposed architecture is built upon an online neural network (NN) approximator, which requires only the system's position and velocity. The friction internal state is not assumed to be available for measurement. The neural fault detection methodology is analyzed with respect to its robustness and sensitivity properties. Rigorous fault detectability conditions and upper bounds for the detection time are also derived. Extensive simulation results showing the effectiveness of the proposed methodology are provided, including a real case study on an industrial actuator.
Effects-Based Operations in the Cyber Domain
2017-05-03
as the joint targeting methodology. The description that Batschelet gave of the traditional targeting methodology included a process of "Decide, Detect...technology, requires new planning and methodology to fight back. This paper evaluates current Department of Defense doctrine to look at ways to conduct...developing its cyber tactics, techniques, and procedures, which includes various targeting methodologies, such as the use of effects-based
Nery, Susana V.; Doi, Suhail A.; Gray, Darren J.; Soares Magalhães, Ricardo J.; McCarthy, James S.; Traub, Rebecca J.; Andrews, Ross M.; Clements, Archie C. A.
2016-01-01
Background: Soil-transmitted helminths (STH) have acute and chronic manifestations, and can result in lifetime morbidity. Disease burden is difficult to quantify, yet quantitative evidence is required to justify large-scale deworming programmes. A recent Cochrane systematic review, which influences Global Burden of Disease (GBD) estimates for STH, has again called into question the evidence for deworming benefit on morbidity due to STH. In this narrative review, we investigate in detail what the shortfalls in evidence are. Methodology/Principal Findings: We systematically reviewed recent literature that used direct measures to investigate morbidity from STH and we critically appraised systematic reviews, particularly the most recent Cochrane systematic review investigating deworming impact on morbidity. We included six systematic reviews and meta-analyses, 36 literature reviews, 44 experimental or observational studies, and five case series. We highlight where evidence is insufficient and where research needs to be directed to strengthen morbidity evidence, ideally to prove benefits of deworming. Conclusions/Significance: Overall, the Cochrane systematic review and recent studies indicate major shortfalls in evidence for direct morbidity. However, it is questionable whether the systematic review methodology should be applied to STH due to heterogeneity of the prevalence of different species in each setting. Urgent investment in studies powered to detect direct morbidity effects due to STH is required. PMID:27196100
NASA Technical Reports Server (NTRS)
Ladbury, Ray
2018-01-01
In 1972, when engineers at Hughes Aircraft Corporation discovered that errors in their satellite avionics were being caused by cosmic rays (so-called single-event effects, or SEE), Moore's Law was only 7 years old. Now, more than 45 years on, the scaling that drove Moore's Law for its first 35 years has reached its limits. However, electronics technology continues to evolve exponentially and SEE remain a formidable issue for use of electronics in space. SEE occur when a single ionizing particle passes through a sensitive volume in an active semiconductor device and generates sufficient charge to cause anomalous behavior or failure in the device. Because SEE can occur at any time during the mission, the emphasis of SEE risk management methodologies is ensuring that all SEE modes in a device under test are detected by the test. Because a particle's probability of causing an SEE generally increases as the particle becomes more ionizing, heavy-ion beams have been and remain the preferred tools for elucidating SEE vulnerabilities. In this talk we briefly discuss space radiation environments and SEE mechanisms, describe SEE test methodologies and discuss current and future challenges for use of heavy-ion beams for SEE testing in an era when the continued validity of Moore's law depends on innovation rather than CMOS scaling.
Short-term Inundation Forecasting for Tsunamis in the Caribbean Sea Region
NASA Astrophysics Data System (ADS)
Mercado-Irizarry, A.; Schmidt, W.
2007-05-01
After the 2004 Indian Ocean tsunami, the US Congress gave a mandate to the National Oceanic and Atmospheric Administration (NOAA) to assess the tsunami threat for all USA interests and to adapt for them the Short-term Inundation Forecasting for Tsunamis (SIFT) methodology first developed for the USA Pacific seaboard states. This methodology would be used with the DART buoys deployed in the Atlantic Ocean and Caribbean Sea. The first step involved the evaluation and characterization of the major tsunamigenic regions in both areas, work done by the US Geological Survey (USGS). This was followed by modeling of the generation and propagation of tsunamis due to unit-slip tsunamigenic earthquakes located at different positions along the tsunamigenic zones identified by the USGS. These pre-computed results are stored and used as sources (in an inverse modeling approach using the DART buoys) for so-called Standby Inundation Models (SIMs) being developed for selected coastal cities in Puerto Rico, the US Virgin Islands, and others along the Atlantic seaboard of the USA. It is the purpose of this presentation to describe the work being carried out in the Caribbean Sea region, where two SIMs for Puerto Rico have already been prepared, allowing for near real-time assessment (less than 10 minutes after detection by the DART buoys) of the expected tsunami impact for two major coastal cities.
Detection of white matter lesion regions in MRI using SLIC0 and convolutional neural network.
Diniz, Pedro Henrique Bandeira; Valente, Thales Levi Azevedo; Diniz, João Otávio Bandeira; Silva, Aristófanes Corrêa; Gattass, Marcelo; Ventura, Nina; Muniz, Bernardo Carvalho; Gasparetto, Emerson Leandro
2018-04-19
White matter lesions are non-static brain lesions with a prevalence of up to 98% in the elderly population. Because they may be associated with several brain diseases, it is important that they are detected as soon as possible. Magnetic Resonance Imaging (MRI) provides three-dimensional data with the possibility to detect and emphasize contrast differences in soft tissues, providing rich information about human soft tissue anatomy. However, the amount of data in these images is far too much for manual analysis/interpretation, representing a difficult and time-consuming task for specialists. This work presents a computational methodology capable of detecting white matter lesion regions of the brain in MRI of the FLAIR modality. The techniques highlighted in this methodology are SLIC0 clustering for candidate segmentation and convolutional neural networks for candidate classification. The methodology consists of four steps: (1) image acquisition, (2) image preprocessing, (3) candidate segmentation and (4) candidate classification. The methodology was applied to 91 magnetic resonance images provided by DASA, and achieved an accuracy of 98.73%, specificity of 98.77% and sensitivity of 78.79%, with a false positive rate of 0.005 and without any false positive reduction technique, in the detection of white matter lesion regions. This demonstrates the feasibility of analyzing brain MRI using SLIC0 and convolutional neural network techniques to detect white matter lesion regions. Copyright © 2018. Published by Elsevier B.V.
New approaches to the measurement of chlorophyll, related pigments and productivity in the sea
NASA Technical Reports Server (NTRS)
Booth, C. R.; Keifer, D. A.
1989-01-01
In the 1984 SBIR Call for Proposals, NASA solicited new methods to measure primary production and chlorophyll in the ocean. Biospherical Instruments Inc. responded to this call with a proposal, first, to study a variety of approaches to this problem. A second phase of research was then funded to pursue instrumentation to measure the sunlight-stimulated, naturally occurring fluorescence of chlorophyll in marine phytoplankton. The monitoring of global productivity, global fisheries resources, application of above-surface-to-underwater optical communications systems, submarine detection applications, and the correlation and calibration of remote sensing systems are but some of the reasons for developing inexpensive sensors to measure chlorophyll and productivity. Normally, productivity measurements are manpower- and cost-intensive and, with the exception of a very few expensive multiship research experiments, provide no contemporaneous data. We feel that the patented, simple sensors that we have designed will provide a cost-effective method for large-scale, synoptic, optical measurements in the ocean. This document is the final project report for a NASA-sponsored SBIR Phase 2 effort to develop new methods for the measurement of primary production in the ocean. This project has been successfully completed, a U.S. patent was issued covering the methodology and sensors, and the first production run of instrumentation developed under this contract has sold out and been delivered.
Bat detective—Deep learning tools for bat acoustic signal detection
Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.
2018-01-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076
Blue-Whale Calls Detected at the Pioneer Seamount Underwater Observatory
NASA Astrophysics Data System (ADS)
Hoffman, M. D.; Vuosalo, C. O.; Bland, R. W.; Garfield, N.
2002-12-01
In September of 2001 a cabled vertical linear array (VLA) of hydrophones was deployed on Pioneer Seamount, 90 km off the California coast near Half Moon Bay, by NOAA-PMEL and the University of Washington APL. The array of 4 hydrophones is at a depth of 950 m, and the four signals are digitized at the shore end of the cable at 1000 Hz. The data are archived by PMEL and are available to the public over the internet. Spectrograms of all of the data are accessible on the SFSU web site. A large number of blue-whale calls are evident in the spectrograms. We have employed spectrogram correlation [Mellinger 2000] and a matched-filter detection scheme [Stafford 1998] to automatically identify these whale calls in three months of data. Results on the frequency of calls and their variability will be presented. Mellinger, David K., and Christopher W. Clark [2000], "Recognizing transient low-frequency whale sounds by spectrogram correlation," J. Acoust. Soc. Am. 107 (3518). Stafford, Kathleen M., Christopher G. Fox, and David S. Clark [1998], "Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean," J. Acoust. Soc. Am. 104 (3616).
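The matched-filter scheme referenced above can be sketched as a normalized cross-correlation of a call template against the recorded signal. This is a minimal one-dimensional illustration, not the Mellinger or Stafford implementations; the function names and threshold are hypothetical:

```python
import math

def normalized_xcorr(signal, template):
    """Slide the template over the signal, returning a normalized
    correlation score in [-1, 1] at each lag."""
    n = len(template)
    t_mean = sum(template) / n
    t = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t))
    scores = []
    for lag in range(len(signal) - n + 1):
        w = signal[lag:lag + n]
        w_mean = sum(w) / n
        w = [x - w_mean for x in w]
        w_norm = math.sqrt(sum(x * x for x in w))
        dot = sum(a * b for a, b in zip(w, t))
        # guard against flat (zero-variance) windows or templates
        scores.append(dot / (w_norm * t_norm) if w_norm * t_norm else 0.0)
    return scores

def detect_calls(signal, template, threshold=0.8):
    """Return the lags at which the correlation exceeds the threshold."""
    return [lag for lag, s in enumerate(normalized_xcorr(signal, template))
            if s >= threshold]
```

In practice the same sliding-correlation idea is applied to spectrogram patches rather than raw samples, which is what makes the detector robust to broadband noise.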
Garnier, A; Poncet, F; Billette De Villemeur, A; Exbrayat, C; Bon, M F; Chevalier, A; Salicru, B; Tournegros, J M
2009-06-01
The screening program guidelines specify that the call-back rate of women for additional imaging (positive mammogram) should not exceed 7% at initial screening and 5% at subsequent screening. Results in the Isère region (12%) prompted a review of the correlation between the call-back rate and quality indicators (detection rate, sensitivity, specificity, positive predictive value) for the radiologists providing interpretations during that time period. Three groups of radiologists were identified: the group with a call-back rate of 10% achieved the best results (sensitivity: 92%, detection rate: 0.53%, specificity: 90%). The group with the lowest call-back rate (7.7%) showed insufficient sensitivity (58%). The last group, with a call-back rate of 18.3%, showed no improvement in sensitivity (82%) or detection rate (0.53%), but showed reduced specificity (82%). The protocol update in 2001 does not resolve this problematic situation, and national results continue to demonstrate a high percentage of positive screening mammograms. A significant increase in the number of positive screening examinations compared to recommended guidelines is not advantageous and leads to an overall decrease in the quality of the screening.
Goodman, Corey W; Major, Heather J; Walls, William D; Sheffield, Val C; Casavant, Thomas L; Darbro, Benjamin W
2015-04-01
Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high throughput, low cost, analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. Copyright © 2015 Elsevier Inc. All rights reserved.
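The ROC-based calibration of a calling metric described above can be sketched as follows. This is a minimal illustration assuming per-probe truth labels derived from the higher-resolution array and per-probe scores (e.g., |log2 ratio|); all names are hypothetical, not CNV-ROC's actual API:

```python
def roc_points(truth, scores):
    """Sweep each observed score as a candidate cutoff and return
    (FPR, TPR, threshold) triples tracing the ROC curve."""
    pos = sum(truth)
    neg = len(truth) - pos
    pts = []
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for t, s in zip(truth, scores) if t and s >= thr)
        fp = sum(1 for t, s in zip(truth, scores) if not t and s >= thr)
        pts.append((fp / neg, tp / pos, thr))
    return pts

def calibrate_threshold(truth, scores):
    """Pick the cutoff maximizing Youden's J = TPR - FPR, a common
    way to turn an ROC sweep into a single calling threshold."""
    return max(roc_points(truth, scores), key=lambda p: p[1] - p[0])[2]
```

Because the comparison is per-probe, the same sweep works for any two platforms without choosing an arbitrary overlap fraction between called segments.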
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
NASA Technical Reports Server (NTRS)
Janches, D.; Hocking, W.; Pifko, S.; Hormaechea, J. L.; Fritts, D. C.; Brunini, C.; Michell, R.; Samara, M.
2013-01-01
A radar meteor echo is the radar scattering signature from the free electrons in a plasma trail generated by entry of extraterrestrial particles into the atmosphere. Three categories of scattering mechanisms exist: specular trails, non-specular trails, and head-echoes. Generally, there are two types of radars utilized to detect meteors. Traditional VHF meteor radars (often called all-sky radars) primarily detect the specular reflection of meteor trails traveling perpendicular to the line of sight of the scattering trail, while High Power and Large Aperture (HPLA) radars efficiently detect meteor head-echoes and, in some cases, non-specular trails. The fact that head-echo measurements can be performed only with HPLA radars limits these studies in several ways. HPLA radars are very sensitive instruments constraining the studies to the lower masses, and these observations cannot be performed continuously because they take place at national observatories with limited allocated observing time. These drawbacks can be addressed by developing head echo observing techniques with modified all-sky meteor radars. In addition, the fact that the simultaneous detection of all different scattering mechanisms can be made with the same instrument, rather than requiring assorted different classes of radars, can help clarify observed differences between the different methodologies. In this study, we demonstrate that such concurrent observations are now possible, enabled by the enhanced design of the Southern Argentina Agile Meteor Radar (SAAMER) deployed at the Estacion Astronomica Rio Grande (EARG) in Tierra del Fuego, Argentina. The results presented here are derived from observations performed over a period of 12 days in August 2011, and include meteoroid dynamical parameter distributions, radiants and estimated masses. Overall, SAAMER's head echo detections appear to be produced by larger particles than those which have been studied thus far using this technique.
76 FR 55804 - Dicamba; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-09
... Considerations A. Analytical Enforcement Methodology Adequate enforcement methodologies, Methods I and II--gas chromatography with electron capture detection (GC/ECD), are available to enforce the tolerance expression. The...
Bridge damage detection using spatiotemporal patterns extracted from dense sensor network
NASA Astrophysics Data System (ADS)
Liu, Chao; Gong, Yongqiang; Laflamme, Simon; Phares, Brent; Sarkar, Soumik
2017-01-01
The alarmingly degrading state of transportation infrastructures combined with their key societal and economic importance calls for automatic condition assessment methods to facilitate smart management of maintenance and repairs. With the advent of ubiquitous sensing and communication capabilities, scalable data-driven approaches are of great interest, as they can utilize large volumes of streaming data without requiring detailed physical models that can be inaccurate and computationally expensive to run. Properly designed, a data-driven methodology could enable fast and automatic evaluation of infrastructures, discovery of causal dependencies among various sub-system dynamic responses, and decision making with uncertainties and lack of labeled data. In this work, a spatiotemporal pattern network (STPN) strategy built on symbolic dynamic filtering (SDF) is proposed to explore spatiotemporal behaviors in a bridge network. Data from strain gauges installed on two bridges are generated using finite element simulation for three types of sensor networks from a density perspective (dense, nominal, sparse). Causal relationships among spatially distributed strain data streams are extracted and analyzed for vehicle identification and detection, and for localization of structural degradation in bridges. Multiple case studies show significant capabilities of the proposed approach in: (i) capturing spatiotemporal features to discover causality between bridges (geographically close), (ii) robustness to noise in data for feature extraction, (iii) detecting and localizing damage via comparison of bridge responses to similar vehicle loads, and (iv) implementing a real-time health monitoring and decision making workflow for bridge networks. Also, the results demonstrate increased sensitivity in detecting damages and higher reliability in quantifying the damage level with increase in sensor network density.
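Symbolic dynamic filtering, the first stage of the STPN strategy above, discretizes each strain stream into a symbol sequence and estimates state-transition statistics. The following is a minimal depth-1 sketch; the equal-frequency partitioning and parameter choices are illustrative, not the authors' exact scheme:

```python
def symbolize(series, bins=3):
    """Partition the data range into equal-frequency cells and map
    each sample to a symbol in 0..bins-1 (SDF-style coarse-graining)."""
    ranked = sorted(series)
    n = len(series)
    cuts = [ranked[(i * n) // bins] for i in range(1, bins)]
    def sym(x):
        return sum(1 for c in cuts if x >= c)
    return [sym(x) for x in series]

def transition_counts(symbols, bins=3):
    """Depth-1 symbol transition matrix; row-normalizing it yields the
    state-transition probabilities used to build the pattern network."""
    counts = [[0] * bins for _ in range(bins)]
    for a, b in zip(symbols, symbols[1:]):
        counts[a][b] += 1
    return counts
```

Cross-transition matrices between two sensors' symbol streams, computed the same way, are what the STPN uses to quantify causal dependence between locations.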
Teaching for Cultural Literacy: A Curriculum Study.
ERIC Educational Resources Information Center
Flinders, David J.
1996-01-01
Explores the concept of curriculum enactment, which calls attention to classroom uses of content and contextual constructions of meaning. The study's methodological framework draws on descriptive, interpretive, evaluative, and thematic dimensions of educational criticism. Two high school English and social studies classes exemplifying curriculum…
Conceptualizing Effectiveness in Disability Research
ERIC Educational Resources Information Center
de Bruin, Catriona L.
2017-01-01
Policies promoting evidence-based practice in education typically endorse evaluations of the effectiveness of teaching strategies through specific experimental research designs and methods. A number of researchers have critiqued this approach to evaluation as narrow and called for greater methodological sophistication. This paper discusses the…
After Behaviourism, Navigationism?
ERIC Educational Resources Information Center
Moran, Sean
2008-01-01
Two previous articles in this journal advocate the greater use of a behaviourist methodology called "Precision Teaching" (PT). From a position located within virtue ethics, this article argues that the technical feat of raising narrowly defined performance in mathematics and other subjects is not sufficient justification for the…
PROFIL: A Method for the Development of Multimedia.
ERIC Educational Resources Information Center
Koper, Rob
1995-01-01
Describes a dedicated method for the design of multimedia courseware, called PROFIL, which integrates instructional design with software engineering techniques and incorporates media selection in the design methodology. The phases of development are outlined: preliminary investigation, definition, script, technical realization, implementation, and…
DEVELOPMENT AND EVALUATION OF PM 2.5 SOURCE APPORTIONMENT METHODOLOGIES
The receptor model called Positive Matrix Factorization (PMF) has been extensively used to apportion sources of ambient fine particulate matter (PM2.5), but the accuracy of source apportionment results currently remains unknown. In addition, air quality forecast model...
GIS applied to location of fires detection towers in domain area of tropical forest.
Eugenio, Fernando Coelho; Rosa Dos Santos, Alexandre; Fiedler, Nilton Cesar; Ribeiro, Guido Assunção; da Silva, Aderbal Gomes; Juvanhol, Ronie Silva; Schettino, Vitor Roberto; Marcatti, Gustavo Eduardo; Domingues, Getúlio Fonseca; Alves Dos Santos, Gleissy Mary Amaral Dino; Pezzopane, José Eduardo Macedo; Pedra, Beatriz Duguy; Banhos, Aureo; Martins, Lima Deleon
2016-08-15
In most countries, the loss of biodiversity caused by fires is worrying. In this sense, fire detection towers are crucial for rapid identification of fire outbreaks and can also be used in environmental inspection, biodiversity monitoring, telecommunications, telemetry and others. Currently the methodologies for allocating fire detection towers over large areas are numerous, complex and non-standardized by government supervisory agencies. Therefore, this study proposes and evaluates different methodologies to determine the best locations for fire detection towers considering the topography, risk areas, conservation units and heat spots. Geographic Information Systems (GIS) techniques and unaligned stratified systematic sampling were used to implement and evaluate 9 methods for allocating fire detection towers. Among the methods evaluated, the C3 method was chosen, represented by 140 fire detection towers, with coverage of: a) 67% of the study area, b) 73.97% of the areas with high risk, c) 70.41% of the areas with very high risk, d) 70.42% of the conservation units and e) 84.95% of the heat spots in 2014. The proposed methodology can be adapted to areas of other countries. Copyright © 2016 Elsevier B.V. All rights reserved.
Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.
Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania
2016-04-01
The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
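The gene-length and sequencing-depth effects identified above are typically handled by normalizing raw read counts before comparing resistomes. A minimal RPKM-style sketch follows; the function name is hypothetical and the paper's exact normalization is not reproduced here:

```python
def normalized_abundance(mapped_reads, gene_length_bp, total_reads):
    """Reads mapped to an AR gene per kilobase of gene length per
    million total metagenomic reads, correcting for the gene-length
    and sequencing-depth biases noted in the abstract."""
    return mapped_reads / (gene_length_bp / 1e3) / (total_reads / 1e6)
```

Without such normalization, a long AR gene in a deeply sequenced sample would appear spuriously more abundant than a short gene in a shallow one.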
Illegal discharges in Spanish waters. Analysis of the profile of the Alleged Offending Vessel.
Martín Alonso, J M; Ortega Piris, Andrés; Pérez Labajos, Carlos
2015-08-15
There is at present a growing concern, on an international level, over environmental offences caused by oil discharges into the sea from vessels. The objective of the Spanish Maritime Administration is to prevent the illegal discharge of polluting substances in Spanish maritime waters by vessels in transit. To combat such discharges, since 2007 Spain has reinforced its means of response with the use of aircraft that provide maritime surveillance services, identifying the Alleged Offending Vessels and acting as a deterrent. The objective of the present study is both to introduce the concept and to analyze certain aspects of the so-called "Alleged Offending Vessel" (AOV) detected within Spanish Search and Rescue (SAR) jurisdiction waters in the period 2008-2012, in order to build a profile of such a vessel. For this purpose, an analysis methodology is formalized based on the Gini index and Lorenz curves, associated with certain aspects of vessels: type, flag and sailing area. Copyright © 2015 Elsevier Ltd. All rights reserved.
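The Gini index underlying the Lorenz-curve analysis above can be sketched for, say, counts of AOV detections per vessel category. This is an illustrative computation, not the authors' dataset or exact estimator:

```python
def gini(values):
    """Gini index via the Lorenz curve: 1 - 2 * (area under the curve),
    with the area computed by the trapezoidal rule over the cumulative
    shares of the sorted values. 0 = perfect equality, -> 1 = maximal
    concentration in a single category."""
    xs = sorted(values)
    total = sum(xs)
    n = len(xs)
    cum = 0.0
    area = 0.0
    prev = 0.0
    for x in xs:
        cum += x / total
        area += (prev + cum) / 2 / n  # trapezoid between Lorenz points
        prev = cum
    return 1.0 - 2.0 * area
```

A high Gini over, e.g., vessel flags would indicate that discharges concentrate in a few flags, which is exactly the kind of profile the study seeks.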
Ma, Jieshi; Xu, Canhua; Dai, Meng; You, Fusheng; Shi, Xuetao; Dong, Xiuzhen; Fu, Feng
2014-01-01
Stroke has a high mortality and disability rate and should be rapidly diagnosed to improve prognosis. Diagnosing stroke is not a problem for hospitals with CT, MRI, and other imaging devices but is difficult for community hospitals without these devices. Based on the mechanism that the electrical impedance of the two hemispheres of a normal human head is basically symmetrical and a stroke can alter this symmetry, a fast electrical impedance imaging method called symmetrical electrical impedance tomography (SEIT) is proposed. In this technique, electrical impedance tomography (EIT) data measured from the undamaged craniocerebral hemisphere (CCH) is regarded as reference data for the remaining EIT data measured from the other CCH for difference imaging to identify the differences in resistivity distribution between the two CCHs. The results of SEIT imaging based on simulation data from the 2D human head finite element model and that from the physical phantom of human head verified this method in detection of unilateral stroke.
Diagnosis of skin cancer using image processing
NASA Astrophysics Data System (ADS)
Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué; Coronel-Beltrán, Ángel
2014-10-01
In this paper a methodology for classifying skin cancer in images of dermatological spots based on spectral analysis using the K-law Fourier nonlinear technique is presented. The image is segmented and binarized to build the function that contains the area of interest. The image is divided into its respective RGB channels to obtain the spectral properties of each channel. The green channel contains the most information and therefore this channel is always chosen. This information is multiplied point by point by a binary mask, and to this result a Fourier transform written in nonlinear form is applied. Where the real part of this spectrum is positive, the spectral density takes unit values; otherwise it is zero. Finally, the ratio of the sum of the unit values of the spectral density to the sum of the values of the binary mask is calculated. This ratio is called the spectral index. When the calculated value lies in the spectral index range, three types of cancer can be detected; values found outside this range correspond to benign lesions.
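The spectral index described above can be sketched in one dimension: mask the signal, take a Fourier transform, binarize on the sign of the real part, and take the ratio against the mask. This is an illustrative DFT sketch, not the paper's 2-D K-law implementation:

```python
import cmath

def spectral_index(signal, mask):
    """Apply a binary mask, take the DFT, set the spectral density to 1
    where the real part is positive (0 otherwise), and return the ratio
    of unit density values to the number of nonzero mask points."""
    masked = [s * m for s, m in zip(signal, mask)]
    n = len(masked)
    spectrum = [sum(masked[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                    for k in range(n))
                for j in range(n)]
    density = [1 if c.real > 0 else 0 for c in spectrum]
    return sum(density) / sum(mask)
```

In the paper's setting the index is then compared against calibrated ranges to separate the three cancer types from benign lesions.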
Meta-STEPP: subpopulation treatment effect pattern plot for individual patient data meta-analysis.
Wang, Xin Victoria; Cole, Bernard; Bonetti, Marco; Gelber, Richard D
2016-09-20
We have developed a method, called Meta-STEPP (subpopulation treatment effect pattern plot for meta-analysis), to explore treatment effect heterogeneity across covariate values in the meta-analysis setting for time-to-event data when the covariate of interest is continuous. Meta-STEPP forms overlapping subpopulations from individual patient data containing similar numbers of events with increasing covariate values, estimates subpopulation treatment effects using standard fixed-effects meta-analysis methodology, displays the estimated subpopulation treatment effect as a function of the covariate values, and provides a statistical test to detect possibly complex treatment-covariate interactions. Simulation studies show that this test maintains an adequate type-I error rate as well as power when reasonable window sizes are chosen. When applied to eight breast cancer trials, Meta-STEPP suggests that chemotherapy is less effective for tumors with high estrogen receptor expression compared with those with low expression. Copyright © 2016 John Wiley & Sons, Ltd.
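The overlapping-subpopulation construction at the heart of Meta-STEPP can be sketched as follows. The window parameters and names are illustrative; the published method's exact windowing rules are not reproduced here:

```python
def stepp_windows(covariate, event, window_events=4, step_events=2):
    """Form overlapping subpopulations (as lists of subject indices)
    along increasing covariate values, each holding roughly
    `window_events` events and advancing by `step_events` events."""
    order = sorted(range(len(covariate)), key=lambda i: covariate[i])
    event_idx = [i for i in order if event[i]]  # events in covariate order
    windows = []
    start = 0
    while start < len(event_idx):
        chunk = event_idx[start:start + window_events]
        lo, hi = covariate[chunk[0]], covariate[chunk[-1]]
        # the subpopulation is every subject whose covariate is in range
        windows.append([i for i in order if lo <= covariate[i] <= hi])
        if start + window_events >= len(event_idx):
            break
        start += step_events
    return windows
```

Each window would then be analyzed with standard fixed-effects meta-analysis, and the per-window effect estimates plotted against the covariate.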
Compact tracking of surgical instruments through structured markers.
Alberto Borghese, N; Frosio, I
2013-07-01
Virtual and augmented reality surgery calls for reliable and efficient tracking of the surgical instruments in the virtual or real operating theatre. The most widespread approach uses three or more non-aligned markers, attached to each instrument and surveyed by a set of cameras. However, the structure required to carry the markers modifies the instrument's mass distribution and can interfere with the surgeon's movements. To overcome these problems, we propose here a new methodology, based on structured markers, to compute the six degrees of freedom of a surgical instrument. Two markers are attached on the instrument axis and one of them has a stripe painted over its surface. We also introduce a procedure to compute with high accuracy the markers' centers on the camera images, even when partially occluded by the instrument's axis or by other structures. Experimental results demonstrate the reliability and accuracy of the proposed approach. The introduction of structured passive markers can open new possibilities for accurate tracking, combining marker detection with real-time image processing.
Multivariate longitudinal data analysis with censored and intermittent missing responses.
Lin, Tsung-I; Lachos, Victor H; Wang, Wan-Lun
2018-05-08
The multivariate linear mixed model (MLMM) has emerged as an important analytical tool for longitudinal data with multiple outcomes. However, the analysis of multivariate longitudinal data could be complicated by the presence of censored measurements because of a detection limit of the assay in combination with unavoidable missing values arising when subjects miss some of their scheduled visits intermittently. This paper presents a generalization of the MLMM approach, called the MLMM-CM, for a joint analysis of the multivariate longitudinal data with censored and intermittent missing responses. A computationally feasible expectation maximization-based procedure is developed to carry out maximum likelihood estimation within the MLMM-CM framework. Moreover, the asymptotic standard errors of fixed effects are explicitly obtained via the information-based method. We illustrate our methodology by using simulated data and a case study from an AIDS clinical trial. Experimental results reveal that the proposed method is able to provide more satisfactory performance as compared with the traditional MLMM approach. Copyright © 2018 John Wiley & Sons, Ltd.
Rapid Detection of Ebola Virus with a Reagent-Free, Point-of-Care Biosensor
Baca, Justin T.; Severns, Virginia; Lovato, Debbie; Branch, Darren W.; Larson, Richard S.
2015-01-01
Surface acoustic wave (SAW) sensors can rapidly detect Ebola antigens at the point-of-care without the need for added reagents, sample processing, or specialized personnel. This preliminary study demonstrates SAW biosensor detection of the Ebola virus in a concentration-dependent manner. The detection limit with this methodology is below the average level of viremia detected on the first day of symptoms by PCR. We observe a log-linear sensor response for highly fragmented Ebola viral particles, with a detection limit corresponding to 1.9 × 10^4 PFU/mL prior to virus inactivation. We predict greatly improved sensitivity for intact, infectious Ebola virus. This point-of-care methodology has the potential to detect Ebola viremia prior to symptom onset, greatly enabling infection control and rapid treatment. This biosensor platform is powered by disposable AA batteries and can be rapidly adapted to detect other emerging diseases in austere conditions. PMID:25875186
NASA Astrophysics Data System (ADS)
Berchok, Catherine L.
During four field seasons from 1998--2001, 115 hours of acoustic recordings were made in the presence of the well-studied St. Lawrence population of blue whales. The primary field site for this study was the estuary region of the St. Lawrence River (Quebec, Canada) with most recordings made between mid-August and late October. Effort was concentrated in the daylight hours, although occasionally extending past nightfall. An inexpensive and portable recording system was built that was easy to deploy and provided quality recordings in a variety of sea conditions. It consisted of a calibrated omni-directional hydrophone with a flat (+/-3dB) response from 5Hz to 800Hz; and a surface isolation buoy to minimize the vertical movement of the sensor. During the recording sessions detailed field notes were taken on all blue whales within sight, with individual identities confirmed through photo-identification work between sessions. Notes were also taken on all other species sighted during the recording sessions. Characterization of the more than one-thousand blue whale calls detected during this study revealed that the St. Lawrence repertoire is much more extensive than previously reported. Three infrasonic (<20Hz) and four audible range (30--200Hz) call types were detected in this study, with much time/frequency variation seen within each type. The infrasonic calls were long (5--30s) in duration and arranged into regularly patterned series. These calls were similar in call characteristics and spacing to those detected in the North Atlantic, but had much shorter and more variable patterned series. The audible call types were much shorter (1--4s), and occurred singly or in irregularly spaced clusters, although a special patterning was seen that contained both regular and irregular spaced components. 
Comparison of the daily, seasonal, and spatial distributions of calling behavior with those of several biological parameters revealed interesting differences between the three call types examined. The trends seen suggest a migratory, reproductive, or foraging context for the infrasonic calls. A closer-range social context is suggested for the audible downsweeps, which have been detected in foraging situations as well as in courtship displays. The audible mixed-pattern call type appears to have a primarily reproductive context.
Villeval, M; Carayol, M; Lamy, S; Lepage, B; Lang, T
2016-12-01
In the field of health, evidence-based medicine and associated methods such as randomised controlled trials (RCTs) have become widely used. The RCT has become the gold standard for evaluating causal links between interventions and health outcomes. Originating in pharmacology, the method has progressively been extended to medical devices, non-pharmacological individual interventions, and collective public health interventions. Its use in these domains has revealed several limitations, and its status as an undisputed gold standard has been called into question. Some of these limitations (e.g. confounding biases and external validity) are common to these four domains, while others are more specific. This paper describes these limitations, as well as several research avenues. Some are methodological reflections aimed at adapting the RCT to the complexity of the tested interventions and at overcoming some of its limitations. Others are alternative methods. The objective is not to remove the RCT from the range of evaluation methodologies, but to resituate it within this range. The aim is to encourage choosing among methods according to the features and the level of the intervention to be evaluated, thereby calling for methodological pluralism. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Yeo, Zhen Xuan; Wong, Joshua Chee Leong; Rozen, Steven G; Lee, Ann Siew Gek
2014-06-24
The Ion Torrent PGM is a popular benchtop sequencer that shows promise in replacing conventional Sanger sequencing as the gold standard for mutation detection. Despite the PGM's reported high accuracy in calling single nucleotide variations, it tends to generate many false positive calls when detecting insertions and deletions (indels), which may hinder its utility for clinical genetic testing. Recently, the proprietary analytical workflow for the Ion Torrent sequencer, Torrent Suite (TS), underwent a series of upgrades. We evaluated three major upgrades of TS by calling indels in the BRCA1 and BRCA2 genes. Our analysis revealed that false negative indels could be generated by TS under both default calling parameters and parameters adjusted for maximum sensitivity. However, indel calling on the same data with the open source variant callers GATK and SAMtools showed that false negatives could be minimised with appropriate bioinformatics analysis. Furthermore, we identified two variant calling measures, Quality-by-Depth (QD) and VARiation of the Width of gaps and inserts (VARW), which substantially reduced false positive indels, including non-homopolymer-associated errors, without compromising sensitivity. In our best case scenario, which involved the TMAP aligner and SAMtools, we achieved 100% sensitivity, 99.99% specificity and a 29% False Discovery Rate (FDR) in indel calling across all 23 samples, a good performance for mutation screening using the PGM. New versions of TS, BWA and GATK have shown improvements in indel calling sensitivity and specificity over their older counterparts. However, the variant caller of TS exhibits a lower sensitivity than GATK and SAMtools. 
Our findings demonstrate that although indel calling from PGM sequences may appear to be noisy at first glance, proper computational indel calling analysis is able to maximize both the sensitivity and specificity at the single base level, paving the way for the usage of this technology for future clinical genetic testing.
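The Quality-by-Depth measure discussed above can be sketched in a few lines. This is only an illustrative filter over parsed VCF-style records, not the authors' pipeline; the field names, threshold value, and example calls are assumptions.

```python
# Sketch of Quality-by-Depth (QD) filtering for variant calls.
# QD = QUAL / read depth; low-QD indels are typical false positives.
# Threshold and record layout are illustrative, not the paper's settings.

def qd_filter(variants, qd_threshold=2.0):
    """Keep variants whose QD (= QUAL / DP) meets the threshold.

    `variants` is a list of dicts with 'QUAL' and 'DP' keys,
    as might be parsed from a VCF file.
    """
    kept = []
    for v in variants:
        if v["DP"] > 0 and v["QUAL"] / v["DP"] >= qd_threshold:
            kept.append(v)
    return kept

calls = [
    {"id": "indel1", "QUAL": 150.0, "DP": 30},  # QD = 5.0, likely real
    {"id": "indel2", "QUAL": 20.0,  "DP": 40},  # QD = 0.5, likely noise
]
print([v["id"] for v in qd_filter(calls)])  # ['indel1']
```

A real workflow would read these annotations from the VCF emitted by the caller rather than from hand-built dicts.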
Abstraction of complex concepts with a refined partial-area taxonomy of SNOMED
Wang, Yue; Halper, Michael; Wei, Duo; Perl, Yehoshua; Geller, James
2012-01-01
An algorithmically-derived abstraction network, called the partial-area taxonomy, for a SNOMED hierarchy has led to the identification of concepts considered complex. The designation “complex” is arrived at automatically on the basis of structural analyses of overlap among the constituent concept groups of the partial-area taxonomy. Such complex concepts, called overlapping concepts, constitute a tangled portion of a hierarchy and can be obstacles to users trying to gain an understanding of the hierarchy’s content. A new methodology for partitioning the entire collection of overlapping concepts into singly-rooted groups, that are more manageable to work with and comprehend, is presented. Different kinds of overlapping concepts with varying degrees of complexity are identified. This leads to an abstract model of the overlapping concepts called the disjoint partial-area taxonomy, which serves as a vehicle for enhanced, high-level display. The methodology is demonstrated with an application to SNOMED’s Specimen hierarchy. Overall, the resulting disjoint partial-area taxonomy offers a refined view of the hierarchy’s structural organization and conceptual content that can aid users, such as maintenance personnel, working with SNOMED. The utility of the disjoint partial-area taxonomy as the basis for a SNOMED auditing regimen is presented in a companion paper. PMID:21878396
Object permanence in marine mammals using the violation of expectation procedure.
Singer, Rebecca; Henderson, Elizabeth
2015-03-01
Object permanence refers to the ability to process information about objects even when they are not visible. One stage of object permanence, called visible displacement, involves being able to find an object that has been fully hidden from view. Visible displacement has been demonstrated in many animal species, yet very little is known about object permanence in marine mammals. In addition, the methodology for testing visible displacement has sometimes been called into question because alternative explanations could account for subjects' success. The current study investigated visible displacement in Atlantic bottlenose dolphins and California sea lions using a methodology called violation of expectation, in which the animal's fish bucket was placed on a table surrounded on three sides by curtains. A solid screen placed in front of the bucket was then rotated in an arc from front to back. The screen was rotated either 120° (possible event) or 180° (surprising event), appearing as if the bucket disappeared. Both dolphins and sea lions looked significantly longer during the unexpected 180° trials than during the expected-event trials. Results suggest that both dolphins and sea lions pass visible displacement tests without the use of perceptual cues. This article is part of a Special Issue entitled: Tribute to Tom Zentall. Copyright © 2014 Elsevier B.V. All rights reserved.
Video analysis for insight and coding: Examples from tutorials in introductory physics
NASA Astrophysics Data System (ADS)
Scherr, Rachel E.
2009-12-01
The increasing ease of video recording offers new opportunities to create richly detailed records of classroom activities. These recordings, in turn, call for research methodologies that balance generalizability with interpretive validity. This paper shares methodology for two practices of video analysis: (1) gaining insight into specific brief classroom episodes and (2) developing and applying a systematic observational protocol for a relatively large corpus of video data. These two aspects of analytic practice are illustrated in the context of a particular research interest but are intended to serve as general suggestions.
McCarthy, Alun
2011-09-01
Pharmacogenomic Innovative Solutions Ltd (PGXIS) was established in 2007 by a group of pharmacogenomic (PGx) experts to make their expertise available to biotechnology and pharmaceutical companies. PGXIS has subsequently established a network of experts to broaden its access to relevant PGx knowledge and technologies. In addition, it has developed a novel multivariate analysis method called Taxonomy3 which is both a data integration tool and a targeting tool. Together with siRNA methodology from CytoPathfinder Inc., PGXIS now has an extensive range of diverse PGx methodologies focused on enhancing drug development.
Aguilar-López, Ricardo; Mata-Machuca, Juan L
2016-01-01
This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme.
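A minimal numerical sketch of master-slave synchronization, using a plain high-gain proportional observer on the Lorenz oscillator rather than the paper's finite-time observer; the gains, parameters, and initial conditions are all illustrative assumptions.

```python
# Master-slave synchronization sketch on the Lorenz oscillator.
# The slave receives a correction proportional to the synchronization
# error (a simple high-gain observer), so the error decays toward zero.

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def synchronize(steps=20000, dt=0.001, k=100.0):
    master = [1.0, 1.0, 1.0]
    slave = [5.0, -5.0, 10.0]          # different initial condition
    for _ in range(steps):
        fm, fs = lorenz(master), lorenz(slave)
        new_master = [m + dt * d for m, d in zip(master, fm)]
        # observer correction k*(master - slave) drives the error to zero
        slave = [s + dt * (d + k * (m - s))
                 for s, d, m in zip(slave, fs, master)]
        master = new_master
    # return the final synchronization error norm
    return sum((m - s) ** 2 for m, s in zip(master, slave)) ** 0.5

print(f"final synchronization error: {synchronize():.2e}")
```

With the gain k large enough to dominate the oscillator's local Jacobian, the discrete error contracts at every step, which is why the final error is vanishingly small.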
Aguilar-López, Ricardo
2016-01-01
This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme. PMID:27738651
Methodology of Numerical Optimization for Orbital Parameters of Binary Systems
NASA Astrophysics Data System (ADS)
Araya, I.; Curé, M.
2010-02-01
Using a numerical maximization (or minimization) method in an optimization process yields a great number of candidate solutions. A global maximum or minimum of the problem can therefore be found, but only if a suitable methodology is used. To obtain the global optimum values, we use the genetic algorithm called PIKAIA (P. Charbonneau) and four other algorithms implemented in Mathematica. We demonstrate that orbital parameters of binary systems published in some papers, derived from radial velocity measurements, are local minima instead of global ones.
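The local-versus-global distinction can be illustrated with a toy multimodal objective: a single gradient descent from a poor starting point returns a local minimum, while restarting from many random points (the role a genetic algorithm's population plays, e.g. in PIKAIA) recovers the global one. The objective below is invented for illustration, not an actual radial-velocity fit.

```python
import random

def f(x):
    # toy multimodal objective: local minimum near x ~ 1.13,
    # global minimum near x ~ -1.30
    return x ** 4 - 3 * x ** 2 + x

def local_descent(x, lr=0.01, steps=2000):
    # plain gradient descent; converges to whichever basin x starts in
    for _ in range(steps):
        x -= lr * (4 * x ** 3 - 6 * x + 1)
    return x

# a single descent from a poor start lands in the local minimum
single = local_descent(2.0)

# restarting from many random points recovers the global minimum
random.seed(0)
multi = min((local_descent(random.uniform(-3, 3)) for _ in range(20)), key=f)

print(f"single start: f = {f(single):.3f}, multi-start: f = {f(multi):.3f}")
```

The same logic is why published orbital solutions from a single local fit can sit in the wrong basin of the chi-square surface.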
Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design
NASA Astrophysics Data System (ADS)
Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy
We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.
Forget, Patrice; Berlière, Martine; van Maanen, Aline; Duhoux, Francois P; Machiels, Jean-Pascal; Coulie, Pierre G; Bouche, Gauthier; De Kock, Marc
2013-10-01
Ketorolac, an NSAID routinely used during surgery and proposed to have anticancer effects, is a promising means of improving postoperative oncological outcome. This effect may be particularly prominent in patients with elevated preoperative inflammatory scores, such as the neutrophil:lymphocyte ratio. In this paper, we describe the rationale, the preliminary analyses in our patients, the feasibility and the methodology of a prospective randomized trial called the "Ketorolac in Breast Cancer trial" (KBCt) (NCT01806259). Copyright © 2013 Elsevier Ltd. All rights reserved.
Blais, Jules M.; Rosen, Michael R.; Smol, John P.
2015-01-01
Newly produced contaminants, as well as some so-called legacy contaminants, continue to be released into the environment at an accelerated rate. Given the general lack of integrated, direct monitoring programs, the use of natural archival records of contaminants will almost certainly continue to increase. We conclude this volume with a short chapter highlighting some of our final thoughts, with a focus on a call to action to develop and apply methodologies to assess the fidelity of the archival record.
The development of a super-fine-grained nuclear emulsion
NASA Astrophysics Data System (ADS)
Asada, Takashi; Naka, Tatsuhiro; Kuwabara, Ken-ichi; Yoshimoto, Masahiro
2017-06-01
A nuclear emulsion with micronized crystals is required for detecting the submicron tracks of ionizing particles, which are among the targets of dark-matter detection and other techniques. We found that a new production method, called the PVA-gelatin mixing method (PGMM), could effectively control crystal size from 20 nm to 50 nm. We call the two types of emulsion produced with the new method the nano imaging tracker and the ultra-nano imaging tracker. Their composition and spatial resolution were measured, and the results indicate that these emulsions can detect extremely short tracks.
Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R
2013-11-01
Citizen science is having an increasing influence on environmental monitoring as its advantages become recognised. However, methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Bundick, W. Thomas
1990-01-01
A methodology for designing a failure detection and identification (FDI) system to detect and isolate control element failures in aircraft control systems is reviewed. An FDI system design for a modified B-737 aircraft resulting from this methodology is also reviewed, and the results of evaluating this system via simulation are presented. The FDI system performed well in a no-turbulence environment, but it experienced an unacceptable number of false alarms in atmospheric turbulence. An adaptive FDI system, which adjusts thresholds and other system parameters based on the estimated turbulence level, was developed and evaluated. The adaptive system performed well over all turbulence levels simulated, reliably detecting all but the smallest magnitude partially-missing-surface failures.
Global-Context Based Salient Region Detection in Nature Images
NASA Astrophysics Data System (ADS)
Bao, Hong; Xu, De; Tang, Yingjun
Visual saliency detection provides an alternative methodology for image description in many applications such as adaptive content delivery and image retrieval. One of the main aims of visual attention in computer vision is to detect and segment the salient regions in an image. In this paper, we employ matrix decomposition to detect salient objects in natural images. To efficiently eliminate high-contrast noise regions in the background, we integrate global context information into saliency detection. The most salient region can then be easily selected as the one that is globally most isolated. The proposed approach intrinsically provides an alternative methodology for modelling attention with low implementation complexity. Experiments show that our approach achieves much better performance than existing state-of-the-art methods.
Nadeau, C.P.; Conway, C.J.; Smith, B.S.; Lewis, T.E.
2008-01-01
We conducted 262 call-broadcast point-count surveys (1-6 replicate surveys on each of 62 points) using standardized North American Marsh Bird Monitoring Protocols between 31 May and 7 July 2006 on St. Vincent National Wildlife Refuge, an island off the northwest coast of Florida. We conducted double-blind multiple-observer surveys, paired morning and evening surveys, and paired morning and night surveys to examine the influence of call-broadcast and time of day on detection probability. Observer detection probability for all species pooled was 75% and was similar between passive (69%) and call-broadcast (65%) periods. Detection probability was higher on morning than on evening (t = 3.0, P = 0.030) or night (t = 3.4, P = 0.042) surveys when we pooled all species. Detection probability was higher (though not significantly so for all species) on morning than on evening or night surveys for all five focal species detected on surveys: Least Bittern (Ixobrychus exilis), Clapper Rail (Rallus longirostris), Purple Gallinule (Porphyrula martinica), Common Moorhen (Gallinula chloropus), and American Coot (Fulica americana). We detected more Least Bitterns (t = 2.4, P = 0.064) and Common Moorhens (t = 2.8, P = 0.026) on morning than evening surveys, and more Clapper Rails (t = 5.1, P = 0.014) on morning than night surveys.
Automated surveillance of 911 call data for detection of possible water contamination incidents.
Haas, Adam J; Gibbons, Darcy; Dangel, Chrissy; Allgeier, Steve
2011-03-30
Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detection of illness from fast-acting chemicals has not been an emphasis. An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5 year period were studied. During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. The 911 surveillance system provides timely notification of possible public health events, but did have limitations. While the alarms contained incident codes and location of the caller, additional information such as medical status was not available to assist in validating the cause of the alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event.
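The space-time scan idea can be sketched with a Kulldorff-style Poisson log-likelihood ratio computed per zone and day. This is a heavily simplified illustration: the zones, counts, baselines, and alarm threshold are invented, and a deployed system scans over space-time cylinders rather than single cells.

```python
import math

# Simplified scan-statistic sketch: for each candidate zone-and-day cell,
# compare observed 911 call counts to a baseline expectation using a
# Poisson log-likelihood ratio, and raise an alarm above a threshold.

def poisson_llr(observed, expected):
    # likelihood ratio is only evidence of excess when observed > expected
    if observed <= expected:
        return 0.0
    return observed * math.log(observed / expected) - (observed - expected)

def scan(counts, baseline, threshold=5.0):
    """counts[zone] -> list of daily call counts; baseline[zone] -> expected/day."""
    alarms = []
    for zone, series in counts.items():
        for day, observed in enumerate(series):
            llr = poisson_llr(observed, baseline[zone])
            if llr > threshold:
                alarms.append((zone, day, round(llr, 2)))
    return alarms

counts = {"downtown": [3, 4, 18, 5], "suburb": [1, 2, 1, 2]}
baseline = {"downtown": 4.0, "suburb": 1.5}
print(scan(counts, baseline))  # flags only the day-2 spike in 'downtown'
```

In practice the threshold would be calibrated by Monte Carlo simulation rather than fixed by hand.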
Automated surveillance of 911 call data for detection of possible water contamination incidents
2011-01-01
Background Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detection of illness from fast-acting chemicals has not been an emphasis. Methods An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5 year period were studied. Results During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. Conclusions The 911 surveillance system provides timely notification of possible public health events, but did have limitations. While the alarms contained incident codes and location of the caller, additional information such as medical status was not available to assist in validating the cause of the alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event. PMID:21450105
Dakos, Vasilis; Carpenter, Stephen R.; Brock, William A.; Ellison, Aaron M.; Guttal, Vishwesha; Ives, Anthony R.; Kéfi, Sonia; Livina, Valerie; Seekell, David A.; van Nes, Egbert H.; Scheffer, Marten
2012-01-01
Many dynamical systems, including lakes, organisms, ocean circulation patterns, or financial markets, are now thought to have tipping points where critical transitions to a contrasting state can happen. Because critical transitions can occur unexpectedly and are difficult to manage, there is a need for methods that can be used to identify when a critical transition is approaching. Recent theory shows that we can identify the proximity of a system to a critical transition using a variety of so-called ‘early warning signals’, and successful empirical examples suggest a potential for practical applicability. However, while the range of proposed methods for predicting critical transitions is rapidly expanding, opinions on their practical use differ widely, and there is no comparative study that tests the limitations of the different methods to identify approaching critical transitions using time-series data. Here, we summarize a range of currently available early warning methods and apply them to two simulated time series that are typical of systems undergoing a critical transition. In addition to a methodological guide, our work offers a practical toolbox that may be used in a wide range of fields to help detect early warning signals of critical transitions in time series data. PMID:22815897
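Two of the most common early warning indicators referenced above, rising variance and rising lag-1 autocorrelation, can be computed over a sliding window. The sketch below uses a synthetic AR(1) series whose memory slowly increases, mimicking critical slowing down; the window size and series are illustrative stand-ins for real time-series data.

```python
import random

# Rolling early-warning indicators: variance and lag-1 autocorrelation,
# both of which theory predicts to rise as a critical transition nears.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def lag1_autocorr(xs):
    m = sum(xs) / len(xs)
    num = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(len(xs) - 1))
    den = sum((x - m) ** 2 for x in xs)
    return num / den if den else 0.0

def rolling_indicators(series, window=50):
    return [(variance(series[i:i + window]),
             lag1_autocorr(series[i:i + window]))
            for i in range(len(series) - window + 1)]

# synthetic AR(1) series x_t = a_t * x_{t-1} + noise, with a_t -> 0.8,
# mimicking 'critical slowing down' as the system approaches a transition
random.seed(1)
x, series = 0.0, []
for t in range(400):
    a = 0.2 + 0.6 * t / 399
    x = a * x + random.gauss(0, 1)
    series.append(x)

inds = rolling_indicators(series)
print(f"early: var={inds[0][0]:.2f}, ac1={inds[0][1]:.2f}")
print(f"late:  var={inds[-1][0]:.2f}, ac1={inds[-1][1]:.2f}")
```

Both indicators trend upward in the later windows, which is the signature such toolboxes look for before a transition.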
Validation of a new technique to detect Cryptosporidium spp. oocysts in bovine feces.
Inácio, Sandra Valéria; Gomes, Jancarlo Ferreira; Oliveira, Bruno César Miranda; Falcão, Alexandre Xavier; Suzuki, Celso Tetsuo Nagase; Dos Santos, Bianca Martins; de Aquino, Monally Conceição Costa; de Paula Ribeiro, Rafaela Silva; de Assunção, Danilla Mendes; Casemiro, Pamella Almeida Freire; Meireles, Marcelo Vasconcelos; Bresciani, Katia Denise Saraiva
2016-11-01
Due to its important zoonotic potential, cryptosporidiosis arouses strong interest in the scientific community, not least because it was initially considered a rare and opportunistic disease. The parasitological diagnosis of its causative agent, the protozoan Cryptosporidium spp., requires specific concentration and permanent-staining techniques, which are laborious and costly and difficult to use in routine laboratory testing. In view of the above, we conducted the feasibility assessment, development, evaluation and intralaboratory validation of a new parasitological technique for optical-microscopy analysis of Cryptosporidium spp. oocysts, called TF-Test Coccidia, using fecal samples from calves from the city of Araçatuba, São Paulo. To confirm the aforementioned parasite and prove the diagnostic efficiency of the new technique, we used two methodologies established in the scientific literature: parasite concentration by centrifugal sedimentation with negative malachite-green staining (CSN-Malachite) and Nested-PCR. We observed good effectiveness of the TF-Test Coccidia technique, which was statistically equivalent to CSN-Malachite. Thus, we verified the effectiveness of the TF-Test Coccidia parasitological technique for the detection of Cryptosporidium spp. oocysts and observed good concentration and morphology of the parasite, with a low amount of debris in the fecal smear. Copyright © 2016 Elsevier B.V. All rights reserved.
Lima, Gustavo F; Freitas, Victor C G; Araújo, Renan P; Maitelli, André L; Salazar, Andrés O
2017-09-15
The pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG moves at low speeds during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on the pipeline's outer walls to detect the PIG's movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG's passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop more research in the oil industry using this Testing Laboratory.
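The two speed estimates being compared can be sketched as follows. The transducer spacing and pulse timestamps are invented for illustration, chosen so the numbers reproduce the 0.43 m/s versus 0.45 m/s comparison reported above.

```python
# Sketch of the two speed estimates: one from the time a pressure pulse
# takes to pass between two wall-mounted transducers, one reported by the
# PIG's onboard odometer. Distances and timestamps are illustrative.

def transducer_speed(distance_m, t_first_s, t_second_s):
    """Average speed between two transducers a known distance apart."""
    return distance_m / (t_second_s - t_first_s)

def percent_error(measured, reference):
    return abs(measured - reference) / reference * 100.0

# hypothetical 8.6 m transducer spacing, pulses detected 20 s apart
v_pipeline = transducer_speed(distance_m=8.6, t_first_s=5.0, t_second_s=25.0)
v_odometer = 0.45  # m/s, as reported by the PIG's odometer electronics

print(f"pipeline estimate: {v_pipeline:.2f} m/s, "
      f"error vs odometer: {percent_error(v_pipeline, v_odometer):.2f}%")
```

With these assumed numbers the pipeline estimate is 0.43 m/s and the discrepancy is 4.44%, matching the magnitude of error the abstract calls experimentally acceptable.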
NASA Astrophysics Data System (ADS)
Kannan, Rohit; Tangirala, Arun K.
2014-06-01
Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and the sciences, such as plant topology reconstruction, fault detection and diagnosis, and neuroscience. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, has emerged over the past two decades. The PDC-based technique is simple and effective, but being a linear directionality measure it has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space, where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing the hypothesis of absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.
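The correntropy measure underlying the KPDC can be estimated directly from samples. The sketch below implements only this generalized correlation measure with a Gaussian kernel, not the full kernel-PDC pipeline (VAR fitting, permutation testing); the signals and kernel width are illustrative.

```python
import math

# Sample estimator of correntropy between two equal-length signals:
#   V(x, y) = (1/N) * sum_i exp(-(x_i - y_i)^2 / (2 * sigma^2))
# Identical signals attain the kernel's maximum value of 1.0.

def correntropy(x, y, sigma=1.0):
    assert len(x) == len(y)
    return sum(math.exp(-(a - b) ** 2 / (2 * sigma ** 2))
               for a, b in zip(x, y)) / len(x)

t = [i * 0.1 for i in range(100)]
s1 = [math.sin(v) for v in t]
s2 = [math.sin(v + 2.0) for v in t]  # phase-shifted copy of s1

print(f"V(s1, s1) = {correntropy(s1, s1):.3f}")  # kernel maximum
print(f"V(s1, s2) = {correntropy(s1, s2):.3f}")  # strictly smaller
```

The kernel width sigma controls how sharply sample-wise discrepancies are penalized; in a KPDC-style analysis it would be tuned to the data scale.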
Freitas, Victor C. G.; Araújo, Renan P.; Maitelli, André L.; Salazar, Andrés O.
2017-01-01
The pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG moves at low speeds during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on the pipeline’s outer walls to detect the PIG’s movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG’s passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop more research in the oil industry using this Testing Laboratory. PMID:28914757
A PC based time domain reflectometer for space station cable fault isolation
NASA Technical Reports Server (NTRS)
Pham, Michael; McClean, Marty; Hossain, Sabbir; Vo, Peter; Kouns, Ken
1994-01-01
Significant problems are faced by astronauts on orbit in the Space Station when trying to locate electrical faults in multi-segment avionics and communication cables. These problems necessitate the development of an automated portable device that will detect and locate cable faults using the pulse-echo technique known as Time Domain Reflectometry. A breadboard time domain reflectometer (TDR) circuit board was designed and developed at the NASA-JSC. The TDR board works in conjunction with a GRiD lap-top computer to automate the fault detection and isolation process. A software program was written to automatically display the nature and location of any possible faults. The breadboard system can isolate open circuit and short circuit faults within two feet in a typical space station cable configuration. Follow-on efforts planned for 1994 will produce a compact, portable prototype Space Station TDR capable of automated switching in multi-conductor cables for high fidelity evaluation. This device has many possible commercial applications, including commercial and military aircraft avionics, cable TV, telephone, communication, information and computer network systems. This paper describes the principle of time domain reflectometry and the methodology for on-orbit avionics utility distribution system repair, utilizing the newly developed device called the Space Station Time Domain Reflectometer (SSTDR).
Ma, Zhanshan Sam
2018-05-01
Relatively little progress has been made in methodology for differentiating between healthy and diseased microbiomes beyond comparing microbial community diversities with traditional species richness or the Shannon index. Network analysis has increasingly been called upon for the task, but most currently available microbiome datasets only allow the construction of simple species correlation networks (SCNs). The main results from SCN analysis are a series of network properties such as network degree and modularity, but the metrics for these network properties often produce inconsistent evidence. We propose a simple new network property, the P/N ratio, defined as the ratio of the number of positive links to the number of negative links in the microbial SCN. We postulate that the P/N ratio should reflect the balance between facilitative and inhibitive interactions among microbial species, possibly one of the most important changes occurring in a diseased microbiome. We tested our hypothesis with five datasets representing five major human microbiome sites and discovered that the P/N ratio exhibits contrasting differences between healthy and diseased microbiomes. It may be harnessed as an in silico biomarker for detecting disease-associated changes in the human microbiome, and may play an important role in personalized diagnosis of human microbiome-associated diseases.
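The proposed P/N ratio can be sketched from an abundance table: correlate species pairwise across samples, keep links whose correlation magnitude passes a threshold, and take the ratio of positive to negative link counts. The taxa, abundances, and threshold below are invented for illustration.

```python
# P/N ratio sketch over a species correlation network (SCN) built from
# Pearson correlations of per-sample abundances. Threshold and data are
# illustrative, not the paper's datasets or settings.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def pn_ratio(abundances, threshold=0.6):
    """Count positive and negative links above |threshold|; return P, N, P/N."""
    species = list(abundances)
    pos = neg = 0
    for i in range(len(species)):
        for j in range(i + 1, len(species)):
            r = pearson(abundances[species[i]], abundances[species[j]])
            if r >= threshold:
                pos += 1
            elif r <= -threshold:
                neg += 1
    return pos, neg, (pos / neg if neg else float("inf"))

# rows: per-sample abundances of each taxon (hypothetical data)
table = {
    "taxonA": [5, 7, 9, 11],
    "taxonB": [4, 6, 8, 10],   # tracks A  -> positive link
    "taxonC": [9, 7, 5, 3],    # opposes A -> negative link
}
print(pn_ratio(table))  # (1, 2, 0.5)
```

On real data one would also control for spurious correlations (e.g. compositionality effects) before counting links.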
Computational Identification of Novel Genes: Current and Future Perspectives.
Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic
2016-01-01
While it has long been thought that all genomic novelties derive from existing material, recent genome projects have uncovered many genes lacking homology to known genes. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas others have been shown to follow a duplication-and-divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Beyond the theoretical breakthrough, evidence has accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, detected by comparative genomics or database searches. Computational approaches are further prime methods, based either on existing models or on biological evidence from experiments. Identification of novel genes nevertheless remains a challenging task. With constantly changing software and technologies, no gold standard, and no available benchmark, evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies.
[Methodology of psychiatric case histories].
Scherbaum, N; Mirzaian, E
1999-05-01
This paper deals with the methodology of psychiatric case histories. Three types of case histories are differentiated. The didactic case history teaches the typical aspects of a psychiatric disorder or treatment by using an individual patient as an example. In the heuristic case history, the individual case gives rise to challenges to established concepts or to the generation of new hypotheses. Such hypotheses, drawn from inductive reasoning, must then be tested on representative samples. The focus of the hermeneutic case history is the significance of pathological behaviour and experience in the context of the biography of an individual patient. So-called psychopathographies of important historical figures can also be differentiated according to these types. Based on these methodological considerations, quality standards for the named types of case histories are stated.
Taxonomy-Based Approaches to Quality Assurance of Ontologies
Perl, Yehoshua; Ochs, Christopher
2017-01-01
Ontologies are important components of health information management systems. As such, the quality of their content is of paramount importance. It has been proven to be practical to develop quality assurance (QA) methodologies based on automated identification of sets of concepts expected to have higher likelihood of errors. Four kinds of such sets (called QA-sets) organized around the themes of complex and uncommonly modeled concepts are introduced. A survey of different methodologies based on these QA-sets and the results of applying them to various ontologies are presented. Overall, following these approaches leads to higher QA yields and better utilization of QA personnel. The formulation of additional QA-set methodologies will further enhance the suite of available ontology QA tools. PMID:29158885
Conjoint analysis: using a market-based research model for healthcare decision making.
Mele, Nancy L
2008-01-01
Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. To describe the conjoint analysis methodology and explore value-added applications in nursing research. Conjoint analysis methodology is described, using examples from the healthcare and business literature, and personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
The Case Method in Teaching Critical Thinking.
ERIC Educational Resources Information Center
Gantt, Vernon W.
When one instructor teaches a course called "Communication and Critical Thinking," he uses Josina Makau's book "Reasoning and Communication: Thinking Critically about Arguments" (1990), which maintains that critical thinking requires training. Case methodology can be used for training, not exclusively but as an alternative to…
Enacting Post-Reflexive Teacher Education
ERIC Educational Resources Information Center
Vagle, Mark D.; Monette, Rachel; Thiel, Jaye Johnson; Wester-Neal, Katie
2017-01-01
The purpose of this article is to re-conceptualize Schön's call for a phenomenology of practice--moving away from reflection and towards "post-reflexion"--by explicitly drawing on philosophical and methodological tenets of phenomenology, specifically some of Vagle's theorizing of a "post-intentional phenomenology." Finally, we…
A Qualitative Experiment: Research on Mediated Meaning Construction Using a Hybrid Approach
ERIC Educational Resources Information Center
Robinson, Sue; Mendelson, Andrew L.
2012-01-01
This article presents a hybrid methodological technique that fuses elements of experimental design with qualitative strategies to explore mediated communication. Called the "qualitative experiment," this strategy uses focus groups and in-depth interviews "within" randomized stimulus conditions typically associated with…
Improved Spectroscopy of Molecular Ions in the Mid-Infrared with Up-Conversion Detection
NASA Astrophysics Data System (ADS)
Markus, Charles R.; Perry, Adam J.; Hodges, James N.; McCall, Benjamin J.
2016-06-01
Heterodyne detection, velocity modulation, and cavity enhancement are useful tools for observing rovibrational transitions of important molecular ions. We have utilized these methods to investigate a number of molecular ions, such as H_3^+, CH_5^+, HeH^+, and OH^+. In the past, parasitic etalons and the lack of fast and sensitive detectors in the mid-infrared have limited the number of transitions we could measure with MHz-level precision. Recently, we have significantly reduced the amplitude of unwanted interference fringes with a Brewster-plate spoiler. We have also developed a detection scheme which up-converts the mid-infrared light with difference frequency generation, allowing the use of a faster and more sensitive avalanche photodetector. The higher detection bandwidth allows for optimized heterodyne detection at higher modulation frequencies. The overall gain in signal-to-noise from both improvements will enable extensive high-precision line lists of molecular ions and searches for previously unobserved transitions. K.N. Crabtree, J.N. Hodges, B.M. Siller, A.J. Perry, J.E. Kelly, P.A. Jenkins II, and B.J. McCall, Chem. Phys. Lett. 551 (2012) 1-6. A.J. Perry, J.N. Hodges, C.R. Markus, G.S. Kocheril, and B.J. McCall, J. Mol. Spec. 317 (2015) 71-73. J.N. Hodges, A.J. Perry, P.A. Jenkins II, B.M. Siller, and B.J. McCall, J. Chem. Phys. 139 (2013) 164291. A.J. Perry, J.N. Hodges, C.R. Markus, G.S. Kocheril, and B.J. McCall, J. Chem. Phys. 141 (2014) 101101. C.R. Markus, J.N. Hodges, A.J. Perry, G.S. Kocheril, H.S.P. Muller, and B.J. McCall, Astrophys. J. 817 (2016) 138.
LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics
NASA Astrophysics Data System (ADS)
Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel
2017-10-01
Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target pattern, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
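The core LSH idea for Hamming similarity can be sketched with bit-sampling: hash each pattern by a fixed random subset of its positions, so patterns at small Hamming distance are likely to share a bucket and only bucket-mates need exact comparison. A minimal illustration (this is not the LSHSIM implementation; the function and parameter names are ours):

```python
import random

def lsh_hamming_buckets(patterns, n_bits, seed=0):
    """Bit-sampling LSH for Hamming distance: patterns that agree on a
    fixed random sample of n_bits positions land in the same bucket.
    Near-identical patterns are therefore likely to collide, so the
    exhaustive search for similar patterns shrinks to a bucket scan."""
    rng = random.Random(seed)
    idx = rng.sample(range(len(patterns[0])), n_bits)  # sampled positions
    buckets = {}
    for p in patterns:
        key = tuple(p[i] for i in idx)
        buckets.setdefault(key, []).append(p)
    return buckets
```

In an MPS setting the "patterns" would be flattened training-image templates, and the bucket lookup replaces an exhaustive scan of the TI.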
Some Spatial Politics of Queer-Feminist Research: Personal Reflections From the Field.
Misgav, Chen
2016-01-01
This article addresses methodological issues emerging from research conducted with Trans in the Center, an LGBT activist group in Tel Aviv, Israel. It addresses some complex issues related to the politics and ethics of applying queer and feminist methodology to qualitative research in a trans, queer, and feminist community space. The focus is on two issues: the researcher's positionality vis-à-vis the participants and selecting the appropriate methodology in relation to the characteristics of the group under study. Such issues demonstrate how queer and feminist principles are articulated and interwoven in geographical-spatial research in two different dimensions: in the research practice and methodology and in the practices and the spaces created by the activity of the researched group itself. I conclude with insights arising from the attempt to apply feminist and queer paradigms in both theory and research, and I call for their integration into geographical research.
The struggle for methodological orthodoxy in nursing research: the case of mental health.
White, Edward
2003-06-01
This paper is not intended as an exhaustive review of contemporary mental health nursing research. Rather, the intention is to explore some of the competing arguments for different methodological approaches in social research, using mental health nursing as a case example. The paper questions the extent to which the artificially dichotomized debate over quantitative versus qualitative research impacts upon the working lives of practitioners, managers and policy makers. In particular, the paper traces the development of the survey method, in this its centennial year, and its subsequent decline in favour of what will be referred to as the new methodological orthodoxy in nursing research. It is interwoven with occasional accounts of personal experience, drawn from an international perspective. The paper calls for a rapprochement between the different wings of methodological opinion, in deference to a publicly unified position for nursing research in which the achievement of quality becomes the over-arching concern.
Brown, Matt A; Bishnoi, Ram J; Dholakia, Sara; Velligan, Dawn I
2016-01-20
Recent failures to detect efficacy in clinical trials investigating pharmacological treatments for schizophrenia raise concerns regarding the potential contribution of methodological shortcomings to this research. This review examines two key methodological issues currently suspected of hampering schizophrenia drug development: (1) limitations on the translational utility of preclinical development models, and (2) methodological challenges posed by increased placebo effects. Recommendations for strategies to address these methodological issues are also provided.
Using Puppets to Teach Schoolchildren to Detect Stroke and Call 911.
Sharkey, Sonya; Denke, Linda; Herbert, Morley A
2016-08-01
To overcome barriers to improved outcomes, we undertook an intervention to teach schoolchildren how to detect a stroke and call emergency medical services (EMS). With permission from parents and guardians, we used an 8-min puppet show to instruct fourth, fifth, and sixth graders about stroke detection, symptomatology, and calling EMS. A pretest and three posttests (one immediately following the presentation, one at 3 months, and a third at 6 months) were administered. Responses from 282 students were evaluable. Significant improvements (p < .001) in knowledge were found through all posttests in identifying which parts of the body stroke affects, and through the first two posttests in recognizing the symptoms stroke victims experience. Students demonstrated at pretest a high awareness of EMS and 911 (97.5%) and showed slight, but not significant, improvement over time. © The Author(s) 2016.
ERIC Educational Resources Information Center
Conley-Ware, Lakita D.
2010-01-01
This research addresses a real world cyberspace problem, where currently no cross industry standard methodology exists. The goal is to develop a model for identification and detection of vulnerabilities and threats of cyber-crime or cyber-terrorism where cyber-technology is the vehicle to commit the criminal or terrorist act (CVCT). This goal was…
Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier
2012-01-01
Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects on human and natural activities. Maintaining a spatial database updated with the occurred changes allows better monitoring of the Earth's resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery and aerial photographs, have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are computed; then, different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated in a change detection multisource fusion process, which generates a single CD result from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties, and the obtained results are evaluated by means of a quality control analysis as well as complementary graphical representations. The suggested methodology has also proven efficient for identifying the change detection index with the highest contribution. PMID:22737023
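As an illustration of the index-plus-thresholding step described above, a simple absolute-difference change index can be split into change/no_change classes with Otsu's histogram threshold. This is a generic sketch, not the paper's specific indices or thresholding algorithms:

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Otsu's method: pick the histogram cut that maximizes the
    between-class variance of the two resulting classes."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist / hist.sum()
    omega = np.cumsum(p)                  # class-0 probability up to bin k
    mu = np.cumsum(p * np.arange(nbins))  # cumulative bin-index mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    k = np.nanargmax(sigma_b)
    return edges[k + 1]

def change_mask(img_t1, img_t2):
    """Absolute-difference change index, thresholded into change/no_change."""
    diff = np.abs(img_t2.astype(float) - img_t1.astype(float))
    return diff > otsu_threshold(diff.ravel())
```

In the multisource setting, several such binary maps (one per index) would then be fused into a single CD result.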
Order restricted inference for oscillatory systems for detecting rhythmic signals
Larriba, Yolanda; Rueda, Cristina; Fernández, Miguel A.; Peddada, Shyamal D.
2016-01-01
Motivation: Many biological processes, such as the cell cycle, the circadian clock and menstrual cycles, are governed by oscillatory systems consisting of numerous components that exhibit rhythmic patterns over time. It is not always easy to identify such rhythmic components. For example, it is a challenging problem to identify circadian genes in a given tissue using time-course gene expression data, with great potential for misclassifying non-rhythmic genes as rhythmic and vice versa. This has been a problem of considerable interest in recent years. In this article we develop a constrained-inference-based methodology called Order Restricted Inference for Oscillatory Systems (ORIOS) to detect rhythmic signals. Instead of using mathematical functions (e.g. sinusoids) to describe the shape of rhythmic signals, ORIOS uses mathematical inequalities. Consequently, it is robust and not limited by the biologist's choice of mathematical model. We studied the performance of ORIOS using simulated as well as real data obtained from mouse liver, pituitary gland and the NIH3T3 and U2OS cell lines. Our results suggest that, for a broad collection of patterns of gene expression, ORIOS has substantially higher power to detect true rhythmic genes than some popular methods, while also declaring substantially fewer non-rhythmic genes as rhythmic. Availability and Implementation: User-friendly code implemented in the R language can be downloaded from http://www.niehs.nih.gov/research/atniehs/labs/bb/staff/peddada/index.cfm. Contact: peddada@niehs.nih.gov PMID:27596593
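The inequality-based idea behind ORIOS can be sketched by scoring how well one cycle of expression fits an up-then-down ordering, with each monotone piece fitted by isotonic regression (pool-adjacent-violators). This Python sketch is ours, for illustration only; the actual ORIOS implementation is the R code linked above:

```python
import numpy as np

def unimodal_sse(y):
    """Best sum of squared errors of an up-then-down (unimodal) fit to y,
    trying every peak position. A score near zero means the cycle is
    consistent with the rhythmic (rise-then-fall) inequality pattern."""
    def pava(v):  # isotonic (non-decreasing) least-squares fit
        v = list(map(float, v))
        w = [1.0] * len(v)
        i = 0
        while i < len(v) - 1:
            if v[i] > v[i + 1]:  # violation: pool adjacent blocks
                v[i] = (w[i] * v[i] + w[i + 1] * v[i + 1]) / (w[i] + w[i + 1])
                w[i] += w[i + 1]
                del v[i + 1], w[i + 1]
                i = max(i - 1, 0)
            else:
                i += 1
        out = []
        for val, wt in zip(v, w):
            out += [val] * int(wt)
        return np.array(out)

    best = np.inf
    for k in range(1, len(y)):  # candidate peak positions
        up = pava(y[:k])
        down = pava(y[k:][::-1])[::-1]
        fit = np.concatenate([up, down])
        best = min(best, float(((np.asarray(y, float) - fit) ** 2).sum()))
    return best
```

Because only orderings are constrained, the score is indifferent to whether the true shape is sinusoidal, sawtooth or anything else that rises and falls once per cycle.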
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
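The conjoint use of an analytical failure model with parameter uncertainty can be sketched as a Monte Carlo estimate of failure probability: sample the uncertain parameters, evaluate the failure criterion, and count exceedances. The stress/strength distributions below are illustrative assumptions, not values from the PFA methodology:

```python
import math
import random

def mc_failure_probability(n_trials=100_000, seed=1):
    """Monte Carlo sketch of probabilistic failure assessment: draw
    uncertain model parameters from assumed distributions, apply the
    analytical failure criterion (stress >= strength), and estimate the
    failure probability as the fraction of failing trials."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        stress = rng.lognormvariate(math.log(400), 0.10)    # MPa, assumed
        strength = rng.lognormvariate(math.log(600), 0.08)  # MPa, assumed
        if stress >= strength:
            failures += 1
    return failures / n_trials
```

In the PFA framework such a prior failure distribution would then be updated with any available test or flight experience rather than used directly.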
NASA Astrophysics Data System (ADS)
Alegre, D. M.; Koroishi, E. H.; Melo, G. P.
2015-07-01
This paper presents a methodology for fault detection and localization using state observers. State observers can reconstruct states that are not measured, or values at points of difficult access in the system, so faults at these points can be detected without direct measurement and tracked through the reconstructed states. The methodology is applied to a system representing a simplified model of a vehicle, in which the chassis is represented by a flat plate divided into Kirchhoff plate finite elements, together with the car suspension (springs and dampers). A test rig was built, and the developed methodology was used to detect and locate faults on this system. The idea of the analyses is to take a system with a specific fault and use the state observers to locate it, checking for a quantitative variation of the system parameter that caused the fault. The computational simulations were performed in MATLAB.
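The observer-based detection step can be sketched with a discrete-time Luenberger observer: the output residual decays to zero when the measurements come from the nominal model, and becomes persistent when a parameter fault changes the true system. A minimal sketch (the matrices and gain are user-supplied; this is not the paper's vehicle model):

```python
import numpy as np

def luenberger_residuals(A, B, C, L, u, y, x0_hat):
    """Run a discrete-time Luenberger observer
        x_hat[k+1] = A x_hat[k] + B u[k] + L (y[k] - C x_hat[k])
    and return the output residuals r[k] = y[k] - C x_hat[k].
    A persistent residual flags, and helps localize, a fault."""
    x_hat = np.asarray(x0_hat, float)
    res = []
    for uk, yk in zip(u, y):
        r = yk - C @ x_hat
        res.append(r)
        x_hat = A @ x_hat + B @ uk + L @ r
    return np.array(res)
```

Feeding the observer measurements from a faulty plant while it runs the nominal (A, B, C) makes the residual persist, which is the detection signal; which residual components grow points toward the faulty parameter.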
Using accelerometers to determine the calling behavior of tagged baleen whales.
Goldbogen, J A; Stimpert, A K; DeRuiter, S L; Calambokidis, J; Friedlaender, A S; Schorr, G S; Moretti, D J; Tyack, P L; Southall, B L
2014-07-15
Low-frequency acoustic signals generated by baleen whales can propagate over vast distances, making the assignment of calls to specific individuals problematic. Here, we report the novel use of acoustic recording tags equipped with high-resolution accelerometers to detect vibrations from the surface of two tagged fin whales that directly match the timing of recorded acoustic signals. A tag deployed on a buoy in the vicinity of calling fin whales and a recording from a tag that had just fallen off a whale were able to detect calls acoustically but did not record corresponding accelerometer signals of the kind measured on calling individuals. Across the hundreds of calls measured on two tagged fin whales, the accelerometer response was generally anisotropic across all three axes, appeared to depend on tag placement and increased with the level of received sound. These data demonstrate that high-sample-rate accelerometry can provide important insights into the acoustic behavior of baleen whales that communicate at low frequencies. This method helps identify vocalizing whales, which in turn enables the quantification of call rates, a fundamental component of models used to estimate baleen whale abundance and distribution from passive acoustic monitoring. © 2014. Published by The Company of Biologists Ltd.
Surveillance Systems for Waterborne Protozoa: Past, Present and Future
OVERVIEW
I. Brief introduction to waterborne Cryptosporidium: historical perspective on detecting Cryptosporidium; current detection methodologies
II. US EPA's waterborne protozoan research program: detecting, typing, and tracking sources of Cryptosporidium contami...
Correlated evolution between hearing sensitivity and social calls in bats
Bohn, Kirsten M; Moss, Cynthia F; Wilkinson, Gerald S
2006-01-01
Echolocating bats are auditory specialists, with exquisite hearing that spans several octaves. In the ultrasonic range, bat audiograms typically show highest sensitivity in the spectral region of their species-specific echolocation calls. Well-developed hearing in the audible range has been commonly attributed to a need to detect sounds produced by prey. However, bat pups often emit isolation calls with low-frequency components that facilitate mother–young reunions. In this study, we examine whether low-frequency hearing in bats exhibits correlated evolution with (i) body size; (ii) high-frequency hearing sensitivity or (iii) pup isolation call frequency. Using published audiograms, we found that low-frequency hearing sensitivity is not dependent on body size but is related to high-frequency hearing. After controlling for high-frequency hearing, we found that low-frequency hearing exhibits correlated evolution with isolation call frequency. We infer that detection and discrimination of isolation calls have favoured enhanced low-frequency hearing because accurate parental investment is critical: bats have low reproductive rates, non-volant altricial young and must often identify their pups within large crèches. PMID:17148288
Georgoulas, George; Georgopoulos, Voula C; Stylios, Chrysostomos D
2006-01-01
This paper proposes a novel integrated methodology to extract features from and classify speech sounds, with the intent of detecting the possible existence of a speech articulation disorder in a speaker. Articulation, in effect, is the specific and characteristic way an individual produces speech sounds. A methodology to process the speech signal, extract features and finally classify the signal to detect articulation problems in a speaker is presented. The use of support vector machines (SVMs) for the classification of speech sounds and the detection of articulation disorders is introduced. The proposed method is implemented on a data set where different sets of features and different SVM schemes are tested, leading to satisfactory performance.
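As an illustration of the feature-extraction stage, two classic frame-level descriptors (log-energy and zero-crossing rate) can be computed as below. These simple descriptors are stand-ins of our own choosing; the paper's actual feature sets and its SVM classification stage are not reproduced here:

```python
import numpy as np

def frame_features(signal, frame_len=256, hop=128):
    """Slide a window over the signal and compute, per frame, two simple
    descriptors one might feed a classifier: log-energy and
    zero-crossing rate (fraction of sign changes between samples)."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        f = signal[start:start + frame_len]
        log_e = np.log(np.sum(f ** 2) + 1e-12)          # frame log-energy
        zcr = np.mean(np.abs(np.diff(np.sign(f)))) / 2  # sign changes / sample
        feats.append((log_e, zcr))
    return np.array(feats)
```

The resulting per-frame feature vectors would then be fed to a classifier such as an SVM to flag atypical articulation.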
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
...] Agency Information Collection Activities; Proposed Collection; Comment Request; Prescription Drug Product... the distribution of patient labeling, called Medications Guides, for certain products that pose a... validity of the methodology and assumption used; (3) ways to enhance the quality, utility, and clarity of...
The Planning, Implementation, and Movement of an Academic Library Collection.
ERIC Educational Resources Information Center
Kurkul, Donna Lee
1983-01-01
Discusses methodology, logistics, and a time/cost study of the planning, implementation, and relocation of the 682,810-volume Smith College Library collection into its newly constructed and renovated facility. Call number sequence location, collection movement phasing and formulas for sequence distribution, and personnel requirements are noted. Elementary…
Methodological Problems Encountered in the Review of Research in Science Teaching
ERIC Educational Resources Information Center
Lawlor, E. P.; Lawlor, F. X.
1972-01-01
Describes the difficulties encountered in selecting material to be included in the reviews of science education research in the "Curtis Series" published by the Columbia Teachers' College Press. Presents evidence outlining the weaknesses of using a "jury" to determine so-called superior research. (AL)
Design-Based Implementation Research
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Potvin, Ashley Seidel
2017-01-01
Purpose: This paper is second of seven in this volume elaborating different approaches to quality improvement in education. It delineates a methodology called design-based implementation research (DBIR). The approach used in this paper is aimed at iteratively improving the quality of classroom teaching and learning practices in defined problem…
Training in the Community-Collaborative Context: A Case Study
ERIC Educational Resources Information Center
Yamada, Racquel-María
2014-01-01
Emerging community-based methodologies call for collaboration with speech community members. Although motivated, community members may lack the tools or training to contribute actively. In response, many linguists deliver training workshops in documentation or preservation, while others train community members to record data. Although workshops…
Factors Influencing Self-Directed Career Management: An Integrative Investigation
ERIC Educational Resources Information Center
Park, Yongho
2009-01-01
Purpose: This paper aims to investigate the relationship between the protean career and other variables, including organizational learning climate, individual calling work orientation, and demographic variables. Design/methodology/approach: The research data were obtained from a sample consisting of 292 employees of two South Korean manufacturing…
"Extreme Programming" in a Bioinformatics Class
ERIC Educational Resources Information Center
Kelley, Scott; Alger, Christianna; Deutschman, Douglas
2009-01-01
The importance of Bioinformatics tools and methodology in modern biological research underscores the need for robust and effective courses at the college level. This paper describes such a course designed on the principles of cooperative learning based on a computer software industry production model called "Extreme Programming" (EP)…
Qualitative Research in Counseling Psychology: Conceptual Foundations
ERIC Educational Resources Information Center
Morrow, Susan L.
2007-01-01
Beginning with calls for methodological diversity in counseling psychology, this article addresses the history and current state of qualitative research in counseling psychology. It identifies the historical and disciplinary origins as well as basic assumptions and underpinnings of qualitative research in general, as well as within counseling…
Tamborini, Marco
2015-12-01
This paper examines Karl Alfred von Zittel's practice in order to uncover the roots of so-called idiographic paleontology. The great American paleontologist Stephen Jay Gould (1941–2002) defined the discipline of idiographic paleontology as the illustration and description of the morphological features of extinct species. However, this approach does not investigate macroevolutionary patterns and processes. By contrast, the paleobiological revolution of the 1970s implemented an epistemic methodology that illustrates macroevolutionary patterns and laws by combining idiographic data with a nomothetic form of explanation. This article elucidates the features of idiographic data as well as the knowledge acquired through this approach. First, Heinrich G. Bronn's (1800–1862) statistical method is analyzed. Zittel's practice arose as a reaction against the approximate conclusions reached by Bronn's quantitative approach. Second, the details of Zittel's methodology are described in order to bring out its peculiarities. The microscope played a pivotal role in creating and forming Zittel's morphological data. This analysis sheds new light on the reasons behind so-called idiographic paleontology, thus revising Gould's historical reconstruction, as well as on the notion of paleontological data. However, even though Zittel aimed at reaching precise and stable conclusions, his data cannot be used for elucidating evolutionary mechanisms: they are scientific in a purely descriptive sense, but completely useless for biological investigations. Finally, this paper examines how Zittel's methodology affects the contemporary paleobiological enterprise and thereby reflects upon the notion of natural history.
An effective method on pornographic images realtime recognition
NASA Astrophysics Data System (ADS)
Wang, Baosong; Lv, Xueqiang; Wang, Tao; Wang, Chengrui
2013-03-01
In this paper, skin detection, texture filtering and face detection are used to extract features from an image library, and a decision tree is trained on these features to produce rules that classify unknown images. In experiments based on more than twenty thousand images, the precision rate reached 76.21% when testing on 13,025 pornographic images, with an elapsed time of less than 0.2 s, indicating good generalizability. As part of these steps, we propose a new skin detection model, called the irregular polygon region skin detection model, based on the YCbCr color space; this model lowers the false detection rate of skin detection. A new method, called sequence region labeling on binary connected areas, calculates features of connected areas faster and with less memory than recursive methods.
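For reference, a baseline rectangular YCbCr skin rule looks like the following; the paper's irregular-polygon model refines the decision region in the Cb-Cr plane beyond this rectangle. The Cb/Cr ranges are the commonly cited Chai-Ngan values, an assumption on our part, not the paper's thresholds:

```python
import numpy as np

def skin_mask_ycbcr(rgb):
    """Baseline rectangular skin rule in YCbCr: convert RGB to the
    chrominance channels Cb, Cr and accept pixels inside a fixed
    rectangle (Cb in [77, 127], Cr in [133, 173])."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)
```

Replacing the rectangle with a tighter polygon in the Cb-Cr plane, as the paper proposes, excludes skin-colored but non-skin chrominance values and so reduces false detections.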
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, here the domain anomaly detection methodology is applied to the problem of anomaly detection for a video annotation system.
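Two facets from the taxonomy, outlier versus distribution drift, can be distinguished even in a toy setting: a single point far from the reference model versus a batch whose mean has shifted. The sketch below is a minimal illustration under a Gaussian reference model, not the paper's full Bayesian apparatus; thresholds are illustrative.

```python
import statistics

def classify_anomalies(history, new_points, z_out=3.0, drift_z=2.0):
    """Toy illustration of two facets from the taxonomy:
    - 'outlier': a single point far from the reference model
    - 'drift'  : the mean of the new batch shifts significantly
    A real domain-anomaly system would use full Bayesian model
    comparison; the z-score thresholds here are illustrative."""
    mu = statistics.fmean(history)
    sd = statistics.stdev(history)
    outliers = [x for x in new_points if abs(x - mu) / sd > z_out]
    batch_mu = statistics.fmean(new_points)
    n = len(new_points)
    drift = abs(batch_mu - mu) / (sd / n ** 0.5) > drift_z
    return outliers, drift

history = [0.1, -0.2, 0.05, 0.3, -0.1, 0.0, 0.15, -0.25, 0.2, -0.05]
# Two gross outliers that cancel out: no drift is signalled.
print(classify_anomalies(history, [0.0, 5.0, -5.0]))
# A modest but consistent shift: drift, with no single outlier.
print(classify_anomalies(history, [0.4, 0.45, 0.35, 0.4, 0.38]))
```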
PCB congener analysis with Hall electrolytic conductivity detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edstrom, R.D.
1989-01-01
This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple retention marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260, and 1262, along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies; homolog distributions acquired by the HECD methodology correlated well with both. The developed analytical methodology was used in the analysis of bluefish (Pomatomus saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River, and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increased total PCB concentrations were found in fish samples collected from the lower James River and lower Chesapeake Bay during the fall of 1985.
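The multiple retention marker idea amounts to interpolating an index value between the two markers that bracket an observed retention time. A minimal sketch follows; the marker times and index values are illustrative, not the study's actual marker set.

```python
import bisect

def retention_index(rt, markers):
    """Linear interpolation of a relative retention index between
    the two markers bracketing the observed retention time `rt`.
    `markers` is a sorted list of (retention_time, index) pairs;
    the values used below are illustrative."""
    times = [t for t, _ in markers]
    i = bisect.bisect_right(times, rt)
    if i == 0 or i == len(markers):
        raise ValueError("retention time outside marker range")
    (t0, i0), (t1, i1) = markers[i - 1], markers[i]
    return i0 + (i1 - i0) * (rt - t0) / (t1 - t0)

markers = [(5.0, 100), (12.0, 200), (20.0, 300), (31.0, 400)]
print(retention_index(16.0, markers))  # → 250.0
```

Anchoring to several markers rather than a single reference compound is what buys the precision the abstract mentions: each segment's interpolation error is bounded by its local bracketing pair.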
Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe
2018-01-17
Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV-based remote sensing methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology in the development of a predictive model for phylloxera detection, exploring the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne and ground-based datasets. Finally, we present relevant results on the correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analyzing, and integrating multispectral, hyperspectral, ground, and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. Such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols, faster processing techniques, and ways to integrate multiple sources of data in diverse remote sensing applications.
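One standard product derivable from the multispectral imagery described here is a vegetation-vigor index such as NDVI, under which stressed vines typically score lower than healthy canopy. The abstract does not name a specific index, so this is a hedged sketch with illustrative reflectance values.

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index for one pixel.
    Stressed vines (e.g. under phylloxera infestation) typically
    show depressed NDVI relative to healthy canopy; the band
    reflectances used below are illustrative, not study data."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

healthy = ndvi(red=0.05, nir=0.60)   # ≈ 0.85
stressed = ndvi(red=0.12, nir=0.35)  # ≈ 0.49
print(round(healthy, 2), round(stressed, 2))
```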
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems, to assess flight readiness and identify risk control measures, is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
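The core PFA idea, propagating parameter uncertainty through an engineering model to obtain a failure probability, can be sketched with a stress-strength Monte Carlo estimate. This is a minimal illustration, not NASA's PFA software; the distributions and parameters are assumed for the example.

```python
import random

def failure_probability(n=100_000, seed=42):
    """Minimal stress-strength sketch of the PFA idea: sample the
    uncertain parameters of an engineering model many times and
    count the fraction of trials in which load meets or exceeds
    strength.  Distributions and parameters are illustrative."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        strength = rng.lognormvariate(mu=1.0, sigma=0.1)  # material capability
        load = rng.gauss(mu=2.0, sigma=0.25)              # applied stress
        if load >= strength:
            failures += 1
    return failures / n

print(failure_probability())  # a small probability, roughly a few percent
```

In the actual methodology this sampled distribution would then be updated with test and flight experience; the sketch stops at the prior failure-probability estimate.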
Waring, Mike; Bielfeldt, Stephan; Mätzold, Katja; Wilhelm, Klaus-Peter
2013-02-01
Chronic wounds require frequent dressing changes. Adhesive dressings used for this indication can be damaging to the stratum corneum, particularly in the elderly, where the skin tends to be thinner. Understanding the level of damage caused by dressing removal can aid dressing selection. This study used a novel methodology that applied a stain to the skin and measured the intensity of that stain after repeated application and removal of a series of different adhesive types. In addition, a traditional method of measuring skin barrier damage (transepidermal water loss) was undertaken and compared with the staining methodology. The staining methodology and measurement of transepidermal water loss differentiated the adhesive dressings, showing that silicone adhesives caused the least trauma to the skin. The staining methodology was shown to be as effective as transepidermal water loss in detecting damage to the stratum corneum and was shown to detect disruption of the barrier earlier than the traditional technique. © 2012 John Wiley & Sons A/S.
An Automated Directed Spectral Search Methodology for Small Target Detection
NASA Astrophysics Data System (ADS)
Grossman, Stanley I.
Much of the current effort in remote sensing tackles macro-level problems such as determining the extent of wheat in a field, the general health of vegetation, or the extent of mineral deposits in an area. However, for many of the remaining remote sensing challenges being studied currently, such as border protection, drug smuggling, treaty verification, and the war on terror, most targets are very small in nature: a vehicle or even a person. While in typical macro-level problems the objective vegetation is in the scene, for small target detection problems it is usually not known whether the desired small target even exists in the scene, never mind finding it in abundance. The ability to find specific small targets, such as vehicles, typifies this problem. Complicating the analyst's task, the growing number of available sensors is generating mountains of imagery, outstripping analysts' ability to visually peruse them. This work presents the important factors influencing spectral exploitation using multispectral data and suggests a different approach to small target detection. The methodology of directed search is presented, including the use of scene-modeled spectral libraries, various search algorithms, and traditional statistical and ROC curve analysis. The work suggests a new metric for calibrating analysis, labeled the analytic sweet spot, as well as an estimation method for identifying the sweet-spot threshold for an image. It also suggests a new visualization aid, called nearest neighbor inflation (NNI), for highlighting the target in its entirety. Together, these additions to the target detection arena allow the construction of a fully automated target detection scheme. This dissertation next details experiments to support the hypothesis that the optimum detection threshold is the analytic sweet spot and that the estimation method adequately predicts it.
Experimental results and analysis are presented for the proposed directed search techniques of spectral-image-based small target detection. They offer evidence of the functionality of the NNI visualization and also show that the increased spectral dimensionality of the 8-band WorldView-2 datasets provides noteworthy improvement in results over traditional 4-band multispectral datasets. The final experiment presents results from a prototype fully automated target detection scheme in support of the overarching premise. This work establishes the analytic sweet spot as the optimum threshold, defined as the point where the error-rate curves (false detections vs. missed detections) cross; at this point the errors are minimized while the detection rate is maximized. It then demonstrates that taking the first moment of the histogram of calculated target detection values, from a detection search with the test threshold set arbitrarily high, estimates the analytic sweet spot for that image. It also demonstrates that directed search techniques, when utilized with appropriate scene-specific modeled signatures and atmospheric compensations, perform at least as well as in-scene search techniques 88% of the time and grossly under-perform only 11% of the time, whereas in-scene search performs as well or better only 50% of the time. It further demonstrates the clear advantage increased multispectral dimensionality brings to detection searches, improving performance in 50% of the cases while performing at least as well 72% of the time. Lastly, it presents evidence that a fully automated prototype performs as anticipated, laying the groundwork for further research into fully automated processes for small target detection.
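The sweet-spot definition, the threshold where the false-detection and missed-detection curves cross, and its first-moment estimator are simple enough to sketch directly. The detection scores below are synthetic, assumed only for illustration.

```python
def sweet_spot(target_scores, background_scores, thresholds):
    """Find the threshold where the two error curves cross:
    false detections (background at or above threshold) versus
    missed detections (targets below threshold)."""
    best_t, best_gap = None, float("inf")
    for t in thresholds:
        false_det = sum(s >= t for s in background_scores)
        missed = sum(s < t for s in target_scores)
        gap = abs(false_det - missed)
        if gap < best_gap:
            best_t, best_gap = t, gap
    return best_t

def first_moment_estimate(all_scores):
    """The dissertation's estimator: the first moment (mean) of the
    detection-score histogram from a search run with the test
    threshold set permissively high."""
    return sum(all_scores) / len(all_scores)

targets = [0.7, 0.75, 0.8, 0.85, 0.9]       # synthetic target scores
background = [0.1, 0.2, 0.3, 0.4, 0.5, 0.55]  # synthetic clutter scores
grid = [i / 100 for i in range(101)]
print(sweet_spot(targets, background, grid))          # → 0.56
print(first_moment_estimate(targets + background))    # ≈ 0.55
```

On this toy data the first-moment estimate (0.55) lands essentially on the crossing threshold, which is the behavior the dissertation's experiments test at scale.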
Spectral Target Detection using Schroedinger Eigenmaps
NASA Astrophysics Data System (ADS)
Dorado-Munoz, Leidy P.
Applications of optical remote sensing include environmental monitoring, military monitoring, meteorology, mapping, surveillance, etc. Many of these tasks involve the detection of specific objects or materials, usually few or small, which are surrounded by other materials that clutter the scene and hide the relevant information. This target detection process has lately been boosted by the use of hyperspectral imagery (HSI), since its high spectral dimension provides the detailed spectral information that is desirable in data exploitation. Typical spectral target detectors rely on statistical or geometric models to characterize the spectral variability of the data. However, in many cases these parametric models do not fit HSI data well, which degrades detection performance. On the other hand, non-linear transformation methods, mainly based on manifold learning algorithms, have shown potential in HSI transformation, dimensionality reduction, and classification. In target detection, non-linear transformation algorithms are used as preprocessing techniques that transform the data to a more suitable lower-dimensional space, where the statistical or geometric detectors are then applied. One of these non-linear manifold methods is the Schroedinger Eigenmaps (SE) algorithm, which was introduced as a technique for semi-supervised classification. The core tool of the SE algorithm is the Schroedinger operator, which includes a potential term that encodes prior information about the materials present in a scene and enables the embedding to be steered in convenient directions in order to cluster similar pixels together. A novel target detection methodology based on the SE algorithm is proposed for the first time in this thesis. The proposed methodology includes not just the transformation of the data to a lower-dimensional space but also the definition of a detector that capitalizes on the theory behind SE.
The fact that target pixels and similar pixels are clustered in a predictable region of the low-dimensional representation is used to define a decision rule that distinguishes target pixels from the remaining pixels in a given image. In addition, a knowledge propagation scheme is used to combine spectral and spatial information as a means to propagate the "potential constraints" to nearby points. The propagation scheme is introduced to reinforce weak connections and improve the separability between most of the target pixels and the background. Experiments using different HSI data sets are carried out in order to test the proposed methodology. The assessment is performed from a quantitative and qualitative point of view, and by comparing the SE-based methodology against two other detection methodologies that use linear/non-linear algorithms as transformations and the well-known Adaptive Coherence/Cosine Estimator (ACE) detector. Overall results show that the SE-based detector outperforms the other two detection methodologies, which indicates the usefulness of the SE transformation in spectral target detection problems.
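The core object, a graph Laplacian over pixel similarities plus a diagonal potential encoding where targets are believed to be, can be built in a few lines. This is a toy sketch of the operator only (random "pixels", Gaussian affinities, illustrative potential and weight), not the thesis's full detection pipeline.

```python
import numpy as np

# Toy sketch of the Schroedinger operator E = L + alpha * V used by
# Schroedinger Eigenmaps: a graph Laplacian over pixel similarities
# plus a diagonal "potential" that is nonzero on pixels known (or
# believed) to be target.  Data, weights, and alpha are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))            # 6 "pixels", 4 spectral bands
W = np.exp(-np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))
np.fill_diagonal(W, 0.0)               # no self-loops
L = np.diag(W.sum(axis=1)) - W         # combinatorial graph Laplacian
V = np.diag([1.0, 0, 0, 0, 0, 0])      # potential on pixel 0 (assumed target)
alpha = 10.0
E = L + alpha * V                      # Schroedinger operator
eigvals, eigvecs = np.linalg.eigh(E)   # embedding = leading eigenvectors
print(eigvals[0] >= -1e-9)             # E is positive semidefinite → True
```

The potential term raises the "energy" of embeddings that spread the flagged pixel away from its neighbors, which is what steers similar pixels toward a common, predictable region of the embedding.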
Lirio, R B; Dondériz, I C; Pérez Abalo, M C
1992-08-01
The methodology of Receiver Operating Characteristic curves based on the signal detection model is extended to evaluate the accuracy of two-stage diagnostic strategies. A computer program is developed for the maximum likelihood estimation of parameters that characterize the sensitivity and specificity of two-stage classifiers according to this extended methodology. Its use is briefly illustrated with data collected in a two-stage screening for auditory defects.
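The accuracy of a serial two-stage strategy combines in a simple closed form when stage 2 is applied only to stage-1 positives. The sketch below shows that combination rule with illustrative stage accuracies; it is not the paper's maximum-likelihood estimation machinery.

```python
def two_stage(se1, sp1, se2, sp2):
    """Overall accuracy of a serial two-stage screen in which only
    stage-1 positives proceed to stage 2: a case is classified
    positive only if both stages call it positive.  The stage
    sensitivities/specificities below are illustrative."""
    sensitivity = se1 * se2                 # must pass both stages
    specificity = sp1 + (1 - sp1) * sp2     # either stage can clear a normal
    return sensitivity, specificity

se, sp = two_stage(se1=0.95, sp1=0.80, se2=0.90, sp2=0.95)
print(round(se, 3), round(sp, 3))  # → 0.855 0.99
```

The trade-off is visible immediately: chaining stages trades sensitivity (0.95 → 0.855) for specificity (0.80 → 0.99), which is why the operating points of both stages must be fitted jointly, as the extended ROC methodology does.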
Usefulness of MLPA in the detection of SHOX deletions.
Funari, Mariana F A; Jorge, Alexander A L; Souza, Silvia C A L; Billerbeck, Ana E C; Arnhold, Ivo J P; Mendonca, Berenice B; Nishi, Mirian Y
2010-01-01
SHOX haploinsufficiency causes a wide spectrum of short stature phenotypes, such as Leri-Weill dyschondrosteosis (LWD) and disproportionate short stature (DSS). SHOX deletions are responsible for approximately two thirds of isolated haploinsufficiency; therefore, it is important to determine the most appropriate methodology for detection of gene deletion. In this study, three methodologies for the detection of SHOX deletions were compared: fluorescence in situ hybridization (FISH), microsatellite analysis, and multiplex ligation-dependent probe amplification (MLPA). Forty-four patients (8 LWD and 36 DSS) were analyzed. The cosmid LLNOYCO3'M'34F5 was used as a probe for the FISH analysis, and microsatellite analyses were performed using three intragenic microsatellite markers. MLPA was performed using commercial kits. Twelve patients (8 LWD and 4 DSS) had deletions in the SHOX area detected by MLPA, and 2 patients generated results discordant with the other methodologies. In the first case, the deletion was not detected by FISH. In the second case, both FISH and microsatellite analyses were unable to identify the intragenic deletion. In conclusion, MLPA was more sensitive, less expensive, and less laborious; therefore, it should be used as the initial molecular method for the detection of SHOX gene deletion. Copyright © 2010 Elsevier Masson SAS. All rights reserved.
Bjelkmar, Pär; Hansen, Anette; Schönning, Caroline; Bergström, Jakob; Löfdahl, Margareta; Lebbad, Marianne; Wallensten, Anders; Allestam, Görel; Stenmark, Stephan; Lindh, Johan
2017-04-18
In the winter and spring of 2011 a large outbreak of cryptosporidiosis occurred in Skellefteå municipality, Sweden. This study summarizes the outbreak investigation in terms of outbreak size, duration, clinical characteristics, possible source(s), and the potential for earlier detection using calls to a health advice line. The investigation included two epidemiological questionnaires and microbial analysis of samples from patients, water, and other environmental sources. In addition, a retrospective study based on phone calls to a health advice line was performed by comparing patterns of phone calls between different water distribution areas. Our analyses showed that approximately 18,500 individuals were affected by a waterborne outbreak of cryptosporidiosis in Skellefteå in 2011. This makes it the second largest outbreak of cryptosporidiosis in Europe to date. Cryptosporidium hominis oocysts of subtype IbA10G2 were found in patient and sewage samples, but not in raw water or in drinking water, and the initial contamination source could not be determined. The outbreak went unnoticed by authorities for several months. The analysis of the calls to the health advice line provided strong indications, early in the outbreak, that it was linked to a particular water treatment plant. We conclude that an earlier detection of the outbreak, by linking calls to a health advice line to water distribution areas, could have limited the outbreak substantially.
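The retrospective detection idea, linking call volumes to water distribution areas and flagging an area whose rate exceeds its baseline, can be sketched in a few lines. All names and figures below are illustrative placeholders, not the Skellefteå data or the study's statistical model.

```python
def flag_areas(calls_this_week, population, baseline_rate, factor=3.0):
    """Flag water-distribution areas whose gastrointestinal call
    rate (calls per 1,000 residents) exceeds `factor` times the
    area's historical baseline.  Area names, populations, and
    rates are illustrative, not the study's data."""
    flagged = []
    for area, n_calls in calls_this_week.items():
        rate = 1000.0 * n_calls / population[area]
        if rate > factor * baseline_rate[area]:
            flagged.append(area)
    return flagged

calls = {"plant_A": 120, "plant_B": 14}          # calls this week
pop = {"plant_A": 30000, "plant_B": 15000}       # residents served
base = {"plant_A": 0.9, "plant_B": 0.9}          # baseline calls per 1,000
print(flag_areas(calls, pop, base))  # → ['plant_A']
```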
A Shellcode Detection Method Based on Full Native API Sequence and Support Vector Machine
NASA Astrophysics Data System (ADS)
Cheng, Yixuan; Fan, Wenqing; Huang, Wei; An, Jing
2017-09-01
Dynamically monitoring the behavior of a program is widely used to discriminate between benign programs and malware, usually by judging dynamic characteristics such as the API call sequence or API call frequency. The key innovation of this paper is to consider the full Native API sequence and use a support vector machine to detect shellcode. We also use a Markov chain to extract and digitize Native API sequence features. Our experimental results show that the method proposed in this paper achieves high accuracy and a low false-positive rate.
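The Markov-chain digitization step amounts to turning a call sequence into a flattened first-order transition-probability matrix, which can then be fed to a classifier such as an SVM. The sketch below shows that feature extraction only; the API names are placeholders, not the paper's Native API vocabulary.

```python
from collections import defaultdict

def markov_features(api_sequence, vocabulary):
    """Digitize an API-call sequence as a flattened first-order
    Markov transition-probability matrix.  The paper then feeds
    such vectors to an SVM; API names here are placeholders."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(api_sequence, api_sequence[1:]):
        counts[a][b] += 1
    feats = []
    for a in vocabulary:
        total = sum(counts[a].values())
        for b in vocabulary:
            feats.append(counts[a][b] / total if total else 0.0)
    return feats

vocab = ["VirtualAlloc", "WriteProcessMemory", "CreateThread"]
seq = ["VirtualAlloc", "WriteProcessMemory", "VirtualAlloc",
       "WriteProcessMemory", "CreateThread"]
print(markov_features(seq, vocab))
# → [0.0, 1.0, 0.0, 0.5, 0.0, 0.5, 0.0, 0.0, 0.0]
```

Each row of the matrix sums to 1 (or is all zeros for a call with no observed successor), so sequences of very different lengths map to fixed-size, comparable feature vectors.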
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts: the first explains the technical approach and methodology, the second contains an example application of the methodology, and the third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; and there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.
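The kind of relationship the analysis quantifies, detection probability as a function of goal quantity, measurement capability, and false-alarm probability, follows the standard one-sided material-balance test of safeguards statistics. A minimal sketch, with illustrative parameter values (not the report's facility data):

```python
from statistics import NormalDist

def detection_probability(goal_quantity, sigma_mb, false_alarm=0.05):
    """Probability of detecting a diversion of `goal_quantity`
    with a material-balance test of standard deviation `sigma_mb`,
    at the given false-alarm probability (standard one-sided
    normal test; parameter values below are illustrative)."""
    z_alpha = NormalDist().inv_cdf(1 - false_alarm)
    return 1 - NormalDist().cdf(z_alpha - goal_quantity / sigma_mb)

# Better inspector measurement capability (smaller sigma) raises P(D):
print(round(detection_probability(8.0, sigma_mb=4.0), 3))
print(round(detection_probability(8.0, sigma_mb=2.0), 3))
```

The interaction effects the report highlights fall out of the same formula: the gain from halving the measurement uncertainty depends strongly on how large the goal quantity already is relative to that uncertainty.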
Sequencing CYP2D6 for the detection of poor-metabolizers in post-mortem blood samples with tramadol.
Fonseca, Suzana; Amorim, António; Costa, Heloísa Afonso; Franco, João; Porto, Maria João; Santos, Jorge Costa; Dias, Mário
2016-08-01
Tramadol concentrations and analgesic effect depend on CYP2D6 enzymatic activity. It is well known that some genetic polymorphisms are responsible for variability in the expression of this enzyme and in the individual drug response. The detection of allelic variants described as non-functional can be useful to explain some circumstances of death in the study of post-mortem cases involving tramadol. A Sanger sequencing methodology was developed for the detection of genetic variants that cause absent or reduced CYP2D6 activity, such as the *3, *4, *6, *8, *10 and *12 alleles. This methodology, as well as the GC/MS method for the detection and quantification of tramadol and its main metabolites in blood samples, was fully validated in accordance with international guidelines. Both methodologies were successfully applied to 100 post-mortem blood samples, and the relation between toxicological and genetic results was evaluated. Tramadol metabolism, expressed as the metabolite concentration ratio (N-desmethyltramadol/O-desmethyltramadol), was shown to correlate with the poor-metabolizer phenotype based on genetic characterization. The importance of identifying enzyme inhibitors in toxicological analysis was also demonstrated. To our knowledge, this is the first study in which a CYP2D6 sequencing methodology has been validated and applied to post-mortem samples in Portugal. The developed methodology allows the collection of data from post-mortem cases, which is of primordial importance to enhance the application of these genetic tools in forensic toxicology and pathology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
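On the toxicology side, the metabolic-ratio criterion reduces to a simple threshold test: since CYP2D6 produces O-desmethyltramadol, a high N-desmethyl/O-desmethyl ratio points toward a poor metabolizer. The cutoff below is a hypothetical placeholder, not the study's validated value.

```python
def metabolic_phenotype(nd_tramadol, od_tramadol, cutoff=4.0):
    """Classify apparent CYP2D6 capacity from the GC/MS metabolite
    ratio N-desmethyltramadol / O-desmethyltramadol.  CYP2D6 forms
    O-desmethyltramadol, so a high ratio suggests poor metabolism.
    The cutoff is illustrative, not the study's validated value."""
    ratio = nd_tramadol / od_tramadol
    return "poor-metabolizer suspected" if ratio > cutoff else "extensive metabolizer"

print(metabolic_phenotype(nd_tramadol=0.90, od_tramadol=0.10))  # ratio 9.0
```

In the study's framing, such a phenotypic flag is then checked against the sequencing result (and against possible CYP2D6 inhibitors), since inhibition can mimic a genetic poor-metabolizer pattern.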
Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics
ERIC Educational Resources Information Center
Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.
2016-01-01
Applicability of the issue under research is preconditioned by the need of practical pedagogics to expand methodological and methodical tools of contemporary didactics. The purpose of the article is to detect the methodological core of reflection as a form of thinking and to provide insight thereunto on the basis of systematic attributes of the…
NASA Astrophysics Data System (ADS)
Barbarella, M.; De Giglio, M.; Galeandro, A.; Mancini, F.
2012-04-01
The modification of some atmospheric physical properties prior to a high-magnitude earthquake has recently been debated within the Lithosphere-Atmosphere-Ionosphere (LAI) Coupling model. Among this variety of phenomena, the ionization of air at the upper level of the atmosphere, called the ionosphere, is investigated in this work. Such ionization events could be caused by possible leaking of gases from the Earth's crust, and their presence was detected around the time of high-magnitude earthquakes by several authors. However, the spatial scale and temporal domain over which such disturbances become evident are still controversial. Even though ionospheric activity can be investigated by different methodologies (satellite or terrestrial measurements), we selected the production of ionospheric maps from the analysis of GNSS (Global Navigation Satellite System) data as a possible way to detect anomalies prior to a seismic event over a wide area around the epicentre. It is well known that, in the GNSS sciences, ionospheric activity can be probed by analyzing the refraction experienced by the dual-frequency signals along the satellite-to-receiver path. The analysis of refraction affecting data acquired by GNSS permanent trackers can produce daily to hourly maps representing the spatial distribution of the ionospheric Total Electron Content (TEC) as an index of the degree of ionization in the upper atmosphere. The presence of large ionospheric anomalies could therefore be interpreted in the LAI Coupling model as a precursor signal of a strong earthquake, especially when other precursors (thermal anomalies and/or gas fluxes) are detected.
In this work, a six-month series of ionospheric maps produced from GNSS data collected by a network of 49 GPS permanent stations, distributed within an area around the city of L'Aquila (Abruzzi, Italy) where an earthquake (M = 6.3) occurred on April 6, 2009, was investigated. The proposed methodology performs a time series analysis of the TEC maps and, eventually, defines the spatial and temporal domains of ionospheric disturbances. This goal was achieved by a time series analysis of the spatial dataset that compares the local pattern of ionospheric activity with its historical mean value and detects areas where the TEC exhibits anomalous values. This data processing shows some 1- to 2-day-long anomalies about 20 days before the seismic event (confirming results provided in recent studies by means of ionospheric soundings).
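The comparison of local TEC against its historical mean can be sketched as a trailing z-score test on a daily series. This is a minimal one-dimensional illustration with synthetic values; the study operates on full spatial maps from the 49-station network, and the window and threshold below are assumptions.

```python
import statistics

def tec_anomalies(series, window=27, z_thresh=2.5):
    """Flag days whose TEC value departs from the trailing
    historical mean by more than `z_thresh` standard deviations.
    The 27-day window and the threshold are illustrative; the
    study analyzed six months of spatial TEC maps."""
    flags = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sd = statistics.fmean(hist), statistics.stdev(hist)
        if sd > 0 and abs(series[i] - mu) / sd > z_thresh:
            flags.append(i)
    return flags

tec = [20.0 + 0.5 * (i % 3) for i in range(40)]  # quiet synthetic background
tec[35] = 35.0                                   # injected disturbance
print(tec_anomalies(tec))  # → [35]
```

Because the spike also inflates the trailing standard deviation for subsequent days, a short disturbance is flagged once rather than echoed, which matches the 1- to 2-day anomaly durations reported above.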
Profiler-2000: Attacking the Insider Threat
2005-09-01
detection approach and its incorporation into a number of current automated intrusion-detection strategies (e.g., AT&T's ComputerWatch, SRI's Emerald) ... administrative privileges, to be activated upon his or her next login. The system calls required to implement this method are chmod and exit. These two calls ... kinds of information that can be derived from these (and other) logs are: time of login, physical location of login, duration of user session
Acoustic and Visual Monitoring for Marine Mammals at the Southern California Off-Shore Range (SCORE)
2005-02-28
Stafford, K.M., Fox, C.G., and Clark, D.S. 1998. Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean. Journal of the Acoustical Society of America. ... Gulf of Alaska. Marine Mammal Science 19: 682-693.
QESA: Quarantine Extraterrestrial Sample Analysis Methodology
NASA Astrophysics Data System (ADS)
Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.
2018-04-01
Our nondestructive, nm-resolution, hyperspectral analysis methodology, combining X-ray, Raman, and IR probes under BSL4 quarantine, renders our patented mini-sample holder ideal for detecting extraterrestrial life. Our Stardust and Archean results validate it.
Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification
DOT National Transportation Integrated Search
2011-04-29
For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for the quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities and quantifying the performance of multiple AUT...
ERIC Educational Resources Information Center
Harac, Lani
2004-01-01
In this article, the author features the Universal Design for Learning, a computer-assisted methodology that has enabled special-needs kids in the Boston area to stay in regular classrooms. Developed by a nonprofit group called the Center for Applied Special Technology, the UDL approach--in which students use whatever print or technological tools…
Artful Interventions for Workplace Bullying: Exploring Forum Theatre
ERIC Educational Resources Information Center
Edwards, Margot; Blackwood, Kate Marie
2017-01-01
Purpose: This paper aims to explore the phenomenon of workplace bullying in response to recent calls for the development of different approaches and provide an exploration of artful approaches to intervention. Design/methodology/approach: The paper offers a unique conceptualisation of workplace bullying and applies a phenomenological lens to the…
Martini Qualitative Research: Shaken, Not Stirred
ERIC Educational Resources Information Center
Nieuwenhuis, F. J.
2015-01-01
Although the number of qualitative research studies has boomed in recent years, close observation reveals that often the research designs and methodological considerations and approaches have developed a type of configuration that does not adhere to purist definitions of the labels attached. Very often so-called interpretivist studies are not…
GuidosToolbox: universal digital image object analysis
Peter Vogt; Kurt Riitters
2017-01-01
The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...
Situated Research Design and Methodological Choices in Formative Program Evaluation
ERIC Educational Resources Information Center
Supovitz, Jonathan
2013-01-01
Design-based implementation research offers the opportunity to rethink the relationships between intervention, research, and situation to better attune research and evaluation to the program development process. Using a heuristic called the intervention development curve, I describe the rough trajectory that programs typically follow as they…
Conversation Analysis in Computer-Assisted Language Learning
ERIC Educational Resources Information Center
González-Lloret, Marta
2015-01-01
The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
... approach that incorporates ``mass balance'' constraints to determine emissions from AFOs. Unfortunately... ventilation rate of the monitored confinement structure. Nitrogen content of process inputs and outputs (e.g., feed, water, bedding, eggs, milk). Nitrogen content of manure excreted. Description of any control...
Using Learning Labs for Culturally Responsive Positive Behavioral Interventions and Supports
ERIC Educational Resources Information Center
Bal, Aydin; Schrader, Elizabeth M.; Afacan, Kemal; Mawene, Dian
2016-01-01
Culturally responsive positive behavioral interventions and supports (CRPBIS) is a statewide research project designed to renovate behavioral support systems to become more inclusive, adaptive, and supportive for all. The CRPBIS methodology, called "learning lab," provides a research-based process to bring together local stakeholders and…
Communicating Qualitative Research Study Designs to Research Ethics Review Boards
ERIC Educational Resources Information Center
Ells, Carolyn
2011-01-01
Researchers using qualitative methodologies appear to be particularly prone to having their study designs called into question by research ethics or funding agency review committees. In this paper, the author considers the issue of communicating qualitative research study designs in the context of institutional research ethics review and offers…
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Hydrologic information. 780.21 Section 780.21... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet... information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov...
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 3 2012-07-01 2012-07-01 false Hydrologic information. 780.21 Section 780.21... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet... information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov...
Multiple Effects of Human Resource Development Interventions
ERIC Educational Resources Information Center
Rowold, Jens
2008-01-01
Purpose: This study aims to explore the simultaneous impact of employees' participation in non-technical training, technical training, and coaching on subsequent job performance, job involvement, and job satisfaction. Design/methodology/approach: The present study was based on a sample of German call center employees and on a longitudinal,…
Critical Hip Hop Pedagogy as a Form of Liberatory Praxis
ERIC Educational Resources Information Center
Akom, A. A.
2009-01-01
This article uses Paulo Freire's problem-posing method, youth participatory action research, and case study methodology to introduce an alternative instructional strategy called Critical Hip Hop Pedagogy (CHHP). This approach attempts to address the deep-rooted ideologies underlying social inequities by creating a space in teacher education courses for…
ERIC Educational Resources Information Center
Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda
2016-01-01
Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
Emission Database for Global Atmospheric Research (EDGAR).
ERIC Educational Resources Information Center
Olivier, J. G. J.; And Others
1994-01-01
Presents the objective and methodology chosen for the construction of a global emissions source database called EDGAR and the structural design of the database system. The database estimates on a regional and grid basis, 1990 annual emissions of greenhouse gases, and of ozone depleting compounds from all known sources. (LZ)
An Ethical Frame for Research with Immigrant Families
ERIC Educational Resources Information Center
Nguyen, Jacqueline; Hernández, María G.; Saetermoe, Carrie L.; Suárez-Orozco, Carola
2013-01-01
In this introduction, the editors give an overview of the ways the volume addresses the growing individual and institutional calls for increased clarity and rigor in methodological, ethical, and practical research policies and guidelines for conducting research with immigrant individuals, families, and communities. In addition to summarizing the…
"Picturing" Lay Ministry: Photovoice and Participatory Group Spiritual Gifts Assessment
ERIC Educational Resources Information Center
Trefz, Steven G.
2013-01-01
The "Picturing Lay Ministry" project uses the visual methodology of photovoice as a way of generating participatory laity discernment around the topics of calling, rural ministry, and spiritual gifts. The project involves working with curriculum action research embedded within one-day ministry discernment events for laity. Measurement…
Inclusive Assessment: Toward a Socially-Just Methodology for Measuring Institution-Wide Engagement
ERIC Educational Resources Information Center
Getto, Guiseppe; McCunney, Dennis
2015-01-01
Institutions are increasingly being called upon to collect large amounts of data to demonstrate community impact. At institutions with strong and wide-reaching public engagement/service missions, this expectation is even greater--both for quality improvement and for demonstrating regional transformation. Despite these expectations, the…
Expansive Visibilization to Stimulate EFL Teacher Reflection
ERIC Educational Resources Information Center
Ito, Ryu
2012-01-01
Despite the growing popularity of action research, bridging the gap between data collection and reflective data analysis still lacks a well-developed methodology. As a supplement to the traditional action research procedure for language teaching, I adopted a method called expansive visibilization (EV), which has the potential to be a reflective…
Development Cooperation as Methodology for Teaching Social Responsibility to Engineers
ERIC Educational Resources Information Center
Lappalainen, Pia
2011-01-01
The role of engineering in promoting global well-being has become accentuated, turning the engineering curriculum into a means of distributing well-being equally. The gradually strengthening calls for humanitarian engineering have resulted in the incorporation of social responsibility themes in the university curriculum. Cooperation, communication,…
Shared Governance in Times of Change: A Practical Guide for Universities and Colleges
ERIC Educational Resources Information Center
Bahls, Steven S.
2014-01-01
Today's challenging higher education environment demands a new way of making decisions. Changing business models and methodologies for delivering academic programs present new opportunities (as well as risks) and call for innovative responses. This publication aims to "reboot" dialogues among boards, presidents, and faculties. It creates…
Designing and Deploying 3D Collaborative Games in Education
ERIC Educational Resources Information Center
Mavridis, Apostolos; Tsiatsos, Thrasyvoulos; Terzidou, Theodouli
2016-01-01
This paper focuses on methodologies of serious games deployment and evaluation. Particularly, this study will present a specific category of serious games that are based on Collaborative Virtual Environments and they aim to support Collaborative Learning. We call these serious games Collaborative Virtual Educational Games (CVEG). The paper aims to…
76 FR 29773 - Call for Participation in Pillbox Patient-Safety Initiative
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-23
... digital images and descriptive information for solid oral dosage form medications. This project seeks to... Participation, NLM seeks to evaluate the photography methodology and procedures it has developed for creating... available via a publicly accessible resource ( http://pillbox.nlm.nih.gov ) digital images and descriptive...
Investigation of an Error Theory for Conjoint Measurement Methodology.
1983-05-01
Nygren, 1982; Srinivasan and Shocker, 1973a, 1973b; Ullrich and Cummins, 1973; Takane, Young, and de Leeuw, 1977; Young, 1972 ... procedures as a diagnostic tool. Specifically, they used the computed STRESS value and a measure of fit they called PRECAP that could be obtained
Using a Virtual Population to Authentically Teach Epidemiology and Biostatistics
ERIC Educational Resources Information Center
Dunn, Peter K.; Donnison, Sharn; Cole, Rachel; Bulmer, Michael
2017-01-01
Epidemiology is the study of the distribution of disease in human populations. This means that authentically teaching primary data collection in epidemiology is difficult as students cannot easily access suitable human populations. Using an action research methodology, this paper studied the use of a virtual human population (called "The…
Unlearning Established Organizational Routines--Part II
ERIC Educational Resources Information Center
Fiol, C. Marlena; O'Connor, Edward J.
2017-01-01
Purpose: The purpose of Part II of this two-part paper is to uncover important differences in the nature of the three unlearning subprocesses, which call for different leadership interventions to motivate people to move through them. Design/methodology/approach: The paper draws on research in behavioral medicine and psychology to demonstrate that…
Borges & Bikes Riders: Toward an Understanding of Autoethnography
ERIC Educational Resources Information Center
Wamsted, John O.
2012-01-01
In this article the author--a full-time high school mathematics teacher and concurrent doctoral candidate in Department of Middle-Secondary Education and Instructional Technology at Georgia State University--will make a case for the use of an autoethnographic methodological tool he is calling "narrative mining." He will begin by briefly…
Exploring Our Ecological Selves within Learning Organizations
ERIC Educational Resources Information Center
Rogers, Katrina S.
2012-01-01
Purpose: The paper's aim is to explore the connection between individual worldviews, called ecological selves, and organizational change, which allows people to create the conditions to confront the global environmental challenges they face as a species. Design/methodology/approach: The essay is a conceptual one, with reference to a small…
EcSL: Teaching Economics as a Second Language.
ERIC Educational Resources Information Center
Crowe, Richard
Hazard Community College, in Kentucky, has implemented a new instructional methodology for economics courses called Economics as a Second Language (EcSL). This teaching approach, based on the theory of Rendigs Fels that the best model for learning economics is the foreign language classroom, utilizes strategies similar to those employed in…
New Directions, New Questions? Social Theory, Education and Embodiment
ERIC Educational Resources Information Center
Evans, John; Davies, Brian
2011-01-01
This paper introduces the contents of the special issue whose authors, in our view, together demonstrate the need for transdisciplinary study of body pedagogies focussed on embodiment, emplacement, enactment and subjectivity. We celebrate theoretical and methodological diversity in the social sciences while calling for "border crossings" between…
IR-Raman Correlation of Shocked Minerals in Csátalja Meteorite — Clues for Shock Stages
NASA Astrophysics Data System (ADS)
Gyollai, I.; Kereszturi, A.; Fintor, K.; Kereszty, Zs.; Szabo, M.; Walter, H.
2017-11-01
The analyzed meteorite, called Csátalja, is an H chondrite (H4, S2, W2) and, based on the differences between certain of its parts, is probably a breccia. The aim of this methodological testing is to characterize its shock deformation and heterogeneity.
A Call for Conducting Multivariate Mixed Analyses
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.
2016-01-01
Several authors have written methodological works that provide an introductory- and/or intermediate-level guide to conducting mixed analyses. Although these works have been useful for beginning and emergent mixed researchers, with very few exceptions, works are lacking that describe and illustrate advanced-level mixed analysis approaches. Thus,…
Equivalent Viscous Damping Methodologies Applied on VEGA Launch Vehicle Numerical Model
NASA Astrophysics Data System (ADS)
Bartoccini, D.; Di Trapani, C.; Fransen, S.
2014-06-01
Part of the mission analysis of a spacecraft is the so-called launcher-satellite coupled loads analysis, which aims at computing the dynamic environment of the satellite and of the launch vehicle for the most severe load cases in flight. Evidently the damping of the coupled system shall be defined with care so as not to overestimate or underestimate the loads derived for the spacecraft. In this paper the application of several EqVD (equivalent viscous damping) methodologies to Craig-Bampton (CB) systems is investigated. Based on the structural damping defined for the various materials in the parent FE-models of the CB-components, EqVD matrices can be computed according to different methodologies. The effect of these methodologies on the numerical reconstruction of the VEGA launch vehicle dynamic environment will be presented.
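One common equivalent-viscous-damping recipe, shown here as a hedged sketch rather than any of the specific methodologies compared in the paper, converts a structural (hysteretic) loss factor eta into a modal viscous damping ratio via zeta ≈ eta/2 at resonance, then builds the diagonal modal damping matrix diag(2·zeta_i·omega_i) for the reduced system. The loss factor and modal frequencies below are illustrative assumptions:

```python
import numpy as np

# Assumed structural (hysteretic) loss factor, uniform over the model.
structural_loss_factor = 0.04  # eta

# Assumed modal frequencies of the reduced Craig-Bampton system [rad/s].
omega = np.array([10.0, 25.0, 60.0]) * 2 * np.pi

# Equivalent viscous damping ratio at resonance: zeta = eta / 2.
zeta = np.full_like(omega, structural_loss_factor / 2.0)

# Diagonal modal equivalent-viscous-damping matrix: diag(2 * zeta_i * omega_i).
C_modal = np.diag(2.0 * zeta * omega)

print(np.round(zeta, 3))           # damping ratios (2% critical)
print(np.round(np.diag(C_modal), 2))
```

In a full coupled loads analysis this modal damping matrix would be assembled per CB component from each material's loss factor, which is where the methodologies compared in the paper diverge.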
Social customer relationship management: taking advantage of Web 2.0 and Big Data technologies.
Orenga-Roglá, Sergio; Chalmeta, Ricardo
2016-01-01
The emergence of Web 2.0 and Big Data technologies has allowed a new customer relationship strategy based on interactivity and collaboration called Social Customer Relationship Management (Social CRM) to be created. This enhances customer engagement and satisfaction. The implementation of Social CRM is a complex task that involves different organisational, human and technological aspects. However, there is a lack of methodologies to assist companies in these processes. This paper shows a novel methodology that helps companies to implement Social CRM, taking into account different aspects such as social customer strategy, the Social CRM performance measurement system, the Social CRM business processes, or the Social CRM computer system. The methodology was applied to one company in order to validate and refine it.
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
Methodology for the passive detection and discrimination of chemical and biological aerosols
NASA Astrophysics Data System (ADS)
Marinelli, William J.; Shokhirev, Kirill N.; Konno, Daisei; Rossi, David C.; Richardson, Martin
2013-05-01
The standoff detection and discrimination of aerosolized biological and chemical agents has traditionally been addressed through LIDAR approaches, but sensor systems using these methods have yet to be deployed. We discuss the development and testing of an approach to detect these aerosols using the deployed base of passive infrared hyperspectral sensors used for chemical vapor detection. The detection of aerosols requires the inclusion of downwelling sky and upwelling ground radiation in the description of the radiative transfer process. The wavelength- and size-dependent ratio of absorption to scattering provides much of the discrimination capability. The approach to the detection of aerosols utilizes much of the same phenomenology employed in vapor detection; however, the sensor system must acquire information on non-line-of-sight sources of radiation contributing to the scattering process. We describe the general methodology developed to detect chemical or biological aerosols, including justifications for the simplifying assumptions that enable the development of a real-time sensor system. Mie scattering calculations, aerosol size distribution dependence, and the angular dependence of the scattering on the aerosol signature will be discussed. This methodology will then be applied to two test cases: the ground level release of a biological aerosol (BG) and a nonbiological confuser (kaolin clay), as well as the debris field resulting from the intercept of a cruise missile carrying a thickened VX warhead. A field measurement, conducted at the Utah Test and Training Range, will be used to illustrate the issues associated with the use of the method.
Self-Care Behaviors of African Americans Living with Heart Failure.
Woda, Aimee; Haglund, Kristin; Belknap, Ruth Ann; Sebern, Margaret
2015-01-01
African Americans have a higher risk of developing heart failure (HF) than persons from other ethnic groups. Once diagnosed, they have lower rates of HF self-care and poorer health outcomes. Promoting engagement in HF self-care is amenable to change and represents an important way to improve the health of African Americans with HF. This study used a community-based participatory action research methodology called photovoice to explore the practice of HF self-care among low-income, urban, community dwelling African Americans. Using the photovoice methodology, themes emerged regarding self-care management and self-care maintenance.
Constrained Stochastic Extended Redundancy Analysis.
DeSarbo, Wayne S; Hwang, Heungsun; Stadler Blank, Ashley; Kappe, Eelco
2015-06-01
We devise a new statistical methodology called constrained stochastic extended redundancy analysis (CSERA) to examine the comparative impact of various conceptual factors, or drivers, as well as the specific predictor variables that contribute to each driver on designated dependent variable(s). The technical details of the proposed methodology, the maximum likelihood estimation algorithm, and model selection heuristics are discussed. A sports marketing consumer psychology application is provided in a Major League Baseball (MLB) context where the effects of six conceptual drivers of game attendance and their defining predictor variables are estimated. Results compare favorably to those obtained using traditional extended redundancy analysis (ERA).
[The grounded theory as a methodological alternative for nursing research].
dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam
2002-01-01
This study presents a method of interpretative and systematic research applicable to the development of studies in nursing, called "the grounded theory", whose theoretical support is symbolic interactionism. The purpose of the paper is to describe grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method, and the process of data analysis. We conclude that the systematization of data and its interpretation, based on social actors' experience, constitute strong subsidies for generating theories through this research tool.
How can history of science matter to scientists?
Maienschein, Jane; Laubichler, Manfred; Loettgers, Andrea
2008-06-01
History of science has developed into a methodologically diverse discipline, adding greatly to our understanding of the interplay between science, society, and culture. Along the way, one original impetus for the then newly emerging discipline--what George Sarton called the perspective "from the point of view of the scientist"--dropped out of fashion. This essay shows, by means of several examples, that reclaiming this interaction between science and history of science yields interesting perspectives and new insights for both science and history of science. The authors consequently suggest that historians of science also adopt this perspective as part of their methodological repertoire.
Evaluation of Incident Detection Methodologies
DOT National Transportation Integrated Search
1999-10-01
Original Report Date: October 1998. The detection of freeway incidents is an essential element of an area's traffic management system. Incidents need to be detected and handled as promptly as possible to minimize delay to the public. Various algorith...
Operator function modeling: An approach to cognitive task analysis in supervisory control systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1987-01-01
In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).
Regional Shelter Analysis Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, Michael B.; Dennison, Deborah; Kane, Jave
2015-08-01
The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country-specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
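The core combination step described above, weighting building protection by where people actually are, can be sketched as a population-weighted dose estimate. All protection factors and population fractions below are illustrative assumptions, not LLNL data:

```python
# Protection factor = outside dose / inside dose (higher is better shielding).
protection_factor = {"wood_frame_house": 2.0, "brick_house": 5.0,
                     "office_basement": 50.0, "outdoors": 1.0}

# Assumed fraction of the population in each location (one posture/time of day).
population_fraction = {"wood_frame_house": 0.55, "brick_house": 0.30,
                       "office_basement": 0.05, "outdoors": 0.10}

outdoor_dose = 100.0  # reference unsheltered external gamma dose (arbitrary units)

# Region-average dose: sum over locations of fraction * (dose / protection factor).
avg_dose = sum(population_fraction[loc] * outdoor_dose / protection_factor[loc]
               for loc in protection_factor)

print(round(avg_dose, 2))  # → 43.6
```

Repeating this sum per posture and time of day, and per region, is what turns building-level protection estimates into the regional casualty assessments the report describes.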
Aquino, Arturo; Gegundez-Arias, Manuel Emilio; Marin, Diego
2010-11-01
Optic disc (OD) detection is an important step in developing systems for automated diagnosis of various serious ophthalmic pathologies. This paper presents a new template-based methodology for segmenting the OD from digital retinal images. This methodology uses morphological and edge detection techniques followed by the Circular Hough Transform to obtain a circular OD boundary approximation. It requires a pixel located within the OD as initial information. For this purpose, a location methodology based on a voting-type algorithm is also proposed. The algorithms were evaluated on the 1200 images of the publicly available MESSIDOR database. The location procedure succeeded in 99% of cases, taking an average computational time of 1.67 s with a standard deviation of 0.14 s. On the other hand, the segmentation algorithm rendered an average overlap between automated segmentations and true OD regions of 86%. The average computational time was 5.69 s with a standard deviation of 0.54 s. Moreover, a discussion on advantages and disadvantages of the models most commonly used for OD segmentation is also presented in this paper.
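The boundary approximation rests on the Circular Hough Transform: each edge pixel votes for every centre/radius pair it could lie on, and the accumulator peak gives the circle. A toy accumulator on synthetic edge points, not the paper's implementation (which runs after morphological preprocessing of real retinal images), can be sketched as:

```python
import numpy as np

# Synthetic edge map: points on a circle of radius 20 centred at (x=60, y=50).
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
edge_x = np.round(60 + 20 * np.cos(theta)).astype(int)
edge_y = np.round(50 + 20 * np.sin(theta)).astype(int)

def hough_circle(edge_x, edge_y, shape, radii):
    """Accumulate votes in (cy, cx, r) space: every edge point lies at
    distance r from any candidate centre on the circle around it."""
    acc = np.zeros((shape[0], shape[1], len(radii)), dtype=int)
    for x, y in zip(edge_x, edge_y):
        for k, r in enumerate(radii):
            for t in np.linspace(0, 2 * np.pi, 90, endpoint=False):
                cy = int(round(y - r * np.sin(t)))
                cx = int(round(x - r * np.cos(t)))
                if 0 <= cy < shape[0] and 0 <= cx < shape[1]:
                    acc[cy, cx, k] += 1
    return acc

radii = [18, 19, 20, 21, 22]
acc = hough_circle(edge_x, edge_y, (100, 120), radii)
cy, cx, k = np.unravel_index(acc.argmax(), acc.shape)
# The accumulator peak should land at, or next to, the true centre and radius.
print(cy, cx, radii[k])
```

Production pipelines use an optimized transform (e.g., OpenCV's `cv2.HoughCircles`) rather than an explicit triple loop; the voting principle is the same.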
NASA Astrophysics Data System (ADS)
Sakellariou, J. S.; Fassois, S. D.
2006-11-01
A stochastic output error (OE) vibration-based methodology for damage detection and assessment (localization and quantification) in structures under earthquake excitation is introduced. The methodology is intended for assessing the state of a structure following potential damage occurrence by exploiting vibration signal measurements produced by low-level earthquake excitations. It is based upon (a) stochastic OE model identification, (b) statistical hypothesis testing procedures for damage detection, and (c) a geometric method (GM) for damage assessment. The methodology's advantages include the effective use of the non-stationary and limited duration earthquake excitation, the handling of stochastic uncertainties, the tackling of the damage localization and quantification subproblems, the use of "small" size, simple and partial (in both the spatial and frequency bandwidth senses) identified OE-type models, and the use of a minimal number of measured vibration signals. Its feasibility and effectiveness are assessed via Monte Carlo experiments employing a simple simulation model of a 6 storey building. It is demonstrated that damage levels of 5% and 20% reduction in a storey's stiffness characteristics may be properly detected and assessed using noise-corrupted vibration signals.
Bontempi, Iván A; Bizai, María L; Ortiz, Sylvia; Manattini, Silvia; Fabbro, Diana; Solari, Aldo; Diez, Cristina
2016-09-01
Different DNA markers to genotype Trypanosoma cruzi are now available. However, due to the low quantity of parasites present in biological samples, DNA markers with high copy number, like kinetoplast minicircles, are needed. The aim of this study was to extend a DNA assay called minicircle lineage specific-PCR (MLS-PCR), previously developed to genotype the T. cruzi DTUs TcV and TcVI, in order to genotype DTUs TcI and TcII and to improve TcVI detection. We screened kinetoplast minicircle hypervariable sequences from cloned PCR products from reference strains belonging to the mentioned DTUs using specific kDNA probes. With the four highly specific sequences selected, we designed primers to be used in the MLS-PCR to directly genotype T. cruzi from biological samples. High specificity and sensitivity were obtained when we evaluated the new approach for TcI, TcII, TcV and TcVI genotyping in twenty-two T. cruzi reference strains. Afterward, we compared it with hybridization tests using specific kDNA probes in 32 blood samples from chronic chagasic patients from North Eastern Argentina. With both tests we were able to genotype 94% of the samples, and the concordance between them was very good (kappa=0.855). The most frequent T. cruzi DTUs detected were TcV and TcVI, followed by TcII and, much less frequently, TcI. A single T. cruzi DTU was detected in 18 samples, while more than one was detected in the remaining samples, TcV plus TcVI being the most frequent association. A high percentage of mixed detections was obtained with both assays, and its impact is discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
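The concordance figure quoted above (kappa=0.855) is a Cohen's kappa, i.e., agreement between the two assays corrected for chance. A minimal sketch of the computation, on made-up genotype calls rather than the study's data:

```python
from collections import Counter

# Hypothetical DTU calls from two assays on the same eight samples.
test_a = ["TcV", "TcVI", "TcII", "TcV", "TcI", "TcVI", "TcV", "TcII"]
test_b = ["TcV", "TcVI", "TcII", "TcVI", "TcI", "TcVI", "TcV", "TcII"]

def cohens_kappa(a, b):
    """kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is the agreement expected by chance from the marginals."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

print(round(cohens_kappa(test_a, test_b), 3))  # → 0.83
```

Values around 0.8-0.9, like the study's 0.855, are conventionally read as "very good" agreement.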
Xenosurveillance: A Novel Mosquito-Based Approach for Examining the Human-Pathogen Landscape
Grubaugh, Nathan D.; Sharma, Supriya; Krajacich, Benjamin J.; Fakoli III, Lawrence S.; Bolay, Fatorma K.; Diclaro II, Joe W.; Johnson, W. Evan; Ebel, Gregory D.; Foy, Brian D.; Brackney, Doug E.
2015-01-01
Background Globally, regions at the highest risk for emerging infectious diseases are often the ones with the fewest resources. As a result, implementing sustainable infectious disease surveillance systems in these regions is challenging. The cost of these programs and difficulties associated with collecting, storing and transporting relevant samples have hindered them in the regions where they are most needed. Therefore, we tested the sensitivity and feasibility of a novel surveillance technique called xenosurveillance. This approach utilizes the host feeding preferences and behaviors of Anopheles gambiae, which are highly anthropophilic and rest indoors after feeding, to sample viruses in human beings. We hypothesized that mosquito bloodmeals could be used to detect vertebrate viral pathogens within realistic field collection timeframes and clinically relevant concentrations. Methodology/Principal Findings To validate this approach, we examined variables influencing virus detection such as the duration between mosquito blood feeding and mosquito processing, the pathogen nucleic acid stability in the mosquito gut and the pathogen load present in the host’s blood at the time of bloodmeal ingestion using our laboratory model. Our findings revealed that viral nucleic acids, at clinically relevant concentrations, could be detected from engorged mosquitoes for up to 24 hours post feeding by qRT-PCR. Subsequently, we tested this approach in the field by examining blood from engorged mosquitoes from two field sites in Liberia. Using next-generation sequencing and PCR we were able to detect the genetic signatures of multiple viral pathogens including Epstein-Barr virus and canine distemper virus. Conclusions/Significance Together, these data demonstrate the feasibility of xenosurveillance and in doing so validated a simple and non-invasive surveillance tool that could be used to complement current biosurveillance efforts. PMID:25775236
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
Ontological realism: A methodology for coordinated evolution of scientific ontologies.
Smith, Barry; Ceusters, Werner
2010-11-15
Since 2002 we have been testing and refining a methodology for ontology development that is now being used by multiple groups of researchers in different life science domains. Gary Merrill, in a recent paper in this journal, describes some of the reasons why this methodology has been found attractive by researchers in the biological and biomedical sciences. At the same time he assails the methodology on philosophical grounds, focusing specifically on our recommendation that ontologies developed for scientific purposes should be constructed in such a way that their terms are seen as referring to what we call universals or types in reality. As we show, Merrill's critique is of little relevance to the success of our realist project, since it not only reveals no actual errors in our work but also criticizes views on universals that we do not in fact hold. However, it nonetheless provides us with a valuable opportunity to clarify the realist methodology, and to show how some of its principles are being applied, especially within the framework of the OBO (Open Biomedical Ontologies) Foundry initiative.
Azzini, Elena; Maiani, Giuseppe; Turrini, Aida; Intorre, Federica; Lo Feudo, Gabriella; Capone, Roberto; Bottalico, Francesco; El Bilali, Hamid; Polito, Angela
2018-08-01
The aim of this paper is to provide a methodological approach for evaluating the nutritional sustainability of typical agro-food products representing Mediterranean eating habits and included in the Mediterranean food pyramid. For each food group, suitable and easily measurable indicators were identified. Two macro-indicators were used to assess the nutritional sustainability of each product. The first macro-indicator, called 'business distinctiveness', takes into account the application of different regulations and standards regarding quality, safety and traceability, as well as the origin of raw materials. The second macro-indicator, called 'nutritional quality', assesses product nutritional quality based on the contents of key compounds, including micronutrients and bioactive phytochemicals. For each indicator, a 0-10 scoring system was set up, with scores from 0 (unsustainable) to 10 (very sustainable) and 5 as the sustainability benchmark, i.e. the value above which a product can be considered sustainable. A simple formula was developed to produce a sustainability index. The proposed sustainability index could be a useful tool to describe both the qualitative and quantitative value of micronutrients and bioactive phytochemicals present in foodstuffs. This methodological approach can also be applied beyond the Mediterranean, to food products in other world regions. © 2018 Society of Chemical Industry.
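The abstract does not give the paper's formula, so the following is only a minimal sketch of how such an index might combine the 0-10 indicator scores, assuming the index is the mean of the two macro-indicator means and using 5 as the benchmark; the function name and aggregation rule are illustrative assumptions, not the authors' published formula.

```python
def sustainability_index(business_scores, nutrition_scores):
    """Combine 0-10 indicator scores into a single sustainability index.

    Illustrative assumption: the index is the mean of the two
    macro-indicator means ('business distinctiveness' and
    'nutritional quality'); the paper's exact formula is not given
    in the abstract.
    """
    if not business_scores or not nutrition_scores:
        raise ValueError("each macro-indicator needs at least one score")
    business = sum(business_scores) / len(business_scores)
    nutrition = sum(nutrition_scores) / len(nutrition_scores)
    return (business + nutrition) / 2

BENCHMARK = 5.0  # products scoring above this count as sustainable

index = sustainability_index([6, 7, 8], [4, 5, 6])
print(round(index, 2), index >= BENCHMARK)  # → 6.0 True
```

A product scoring 7 on business distinctiveness and 5 on nutritional quality lands at 6.0, just above the benchmark.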
Micro-resonator-based electric field sensors with long durations of sensitivity
NASA Astrophysics Data System (ADS)
Ali, Amir R.
2017-05-01
In this paper, we present a new fabrication method for the whispering gallery mode (WGM) microsphere-based electric field sensor that allows for longer periods of sensitivity. Recently, a WGM-based photonic electric field sensor was proposed using a coupled dielectric microsphere-beam. The external electric field imposes an electrostriction force on the dielectric beam, deflecting it. The beam, in turn, compresses the sphere, causing a shift in its WGM. As part of the fabrication process, the PDMS micro-beams and spheres are cured at high temperature (100 °C) and subsequently poled by exposure to a strong external electric field (~8 MV/m) for two hours. The poling process allows for the deposition of surface charges, thereby increasing the electrostriction effect. This methodology is called curing-then-poling (CTP). Although the sensors do become sufficiently sensitive to electric fields, they start de-poling within about 10 minutes after poling, hence losing sensitivity. To mitigate this problem and lock in the polarization for a longer period, we use an alternate methodology whereby the beam is poled and cured simultaneously (curing-while-poling, or CWP). The new fabrication method allows for the retention of polarization (and hence sensitivity to electric fields) much longer (~1500 minutes). An analysis is carried out along with preliminary experiments. Results show that electric fields as small as 100 V/m can be detected with a 300 μm diameter sphere sensor a day after poling.
Mobile phones improve case detection and management of malaria in rural Bangladesh
2013-01-01
Background The recent introduction of mobile phones into the rural Bandarban district of Bangladesh provided a resource to improve case detection and treatment of patients with malaria. Methods During studies to define the epidemiology of malaria in villages in south-eastern Bangladesh, an area with hypoendemic malaria, the project recorded 986 mobile phone calls from families because of illness suspected to be malaria between June 2010 and June 2012. Results Based on phone calls, field workers visited the homes with ill persons, and collected blood samples for malaria on 1,046 people. 265 (25%) of the patients tested were positive for malaria. Of the 509 symptomatic malaria cases diagnosed during this study period, 265 (52%) were detected because of an initial mobile phone call. Conclusion Mobile phone technology was found to be an efficient and effective method for rapidly detecting and treating patients with malaria in this remote area. This technology, when combined with local knowledge and field support, may be applicable to other hard-to-reach areas to improve malaria control. PMID:23374585
Patterning ecological risk of pesticide contamination at the river basin scale.
Faggiano, Leslie; de Zwart, Dick; García-Berthou, Emili; Lek, Sovan; Gevrey, Muriel
2010-05-01
Ecological risk assessment was conducted to determine the risk posed by pesticide mixtures in the Adour-Garonne river basin (south-western France). The objectives of this study were to assess the general state of this basin with regard to pesticide contamination using a risk assessment procedure and to detect patterns in toxic mixture assemblages through a self-organizing map (SOM) methodology in order to identify the locations at risk. Exposure assessment, risk assessment with species sensitivity distributions, and mixture toxicity rules were used to compute six relative risk predictors for different toxic modes of action: the multi-substance potentially affected fraction of species depending on the toxic mode of action of the compounds found in the mixture (msPAF CA(TMoA) values). These predictors, computed for the 131 sampling sites assessed in this study, were then patterned through the SOM learning process. Four clusters of sampling sites exhibiting similar toxic assemblages were identified. In the first cluster, which comprised 83% of the sampling sites, the risk posed by pesticide mixtures to aquatic species was weak (mean msPAF value for those sites <0.0036%), while in another cluster the risk was significant (mean msPAF <1.09%). GIS mapping highlighted an interesting spatial pattern in the distribution of sampling sites for each cluster, with a significant and highly localized risk in the French department of Lot-et-Garonne. The combined use of the SOM methodology, mixture toxicity modelling and a clear geo-referenced representation of results not only revealed the general state of the Adour-Garonne basin with regard to pesticide contamination but also made it possible to analyze the spatial pattern of toxic mixture assemblages, to prioritize the locations at risk, and to detect the groups of compounds causing the greatest risk at the basin scale. Copyright 2010 Elsevier B.V. All rights reserved.
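The msPAF predictors follow standard mixture-toxicity practice: within one toxic mode of action, concentration addition sums hazard units against a shared species sensitivity distribution (SSD), and the summed hazard is evaluated through the SSD once. A minimal sketch, assuming a log-normal SSD with a shared slope; the medians, slope, and concentrations below are illustrative, not values from the study.

```python
from math import erf, log10, sqrt

def paf_single(conc, ssd_median, ssd_slope):
    """Potentially affected fraction (PAF) for one compound, assuming a
    log-normal species sensitivity distribution: the fraction of species
    whose sensitivity threshold lies below the exposure concentration."""
    z = (log10(conc) - log10(ssd_median)) / ssd_slope
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF of z

def mspaf_ca(concs, ssd_medians, ssd_slope):
    """msPAF by concentration addition within one toxic mode of action:
    sum hazard units (concentration / SSD median) across compounds, then
    evaluate the shared SSD at the summed hazard. Assumes all compounds
    in the list share the same mode of action and SSD slope."""
    hazard_units = sum(c / m for c, m in zip(concs, ssd_medians))
    z = log10(hazard_units) / ssd_slope
    return 0.5 * (1 + erf(z / sqrt(2)))

# One compound at its SSD median affects exactly half the species.
print(round(mspaf_ca([1.0], [1.0], 0.7), 3))  # → 0.5
```

Doubling the total hazard units pushes the affected fraction above 0.5, which is the monotonic behaviour concentration addition is meant to capture.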
NASA Astrophysics Data System (ADS)
Pulinets, S. A.; Andrzej, K.; Hernandez-Pajares, M.; Cherniak, I.; Zakharenkova, I.; Rothkaehl, H.; Davidenko, D.
2017-12-01
The INSPIRE project is dedicated to the study of physical processes in the ionosphere, and their effects, that could serve as earthquake precursors, together with a detailed description of the methodology for defining ionospheric pre-seismic anomalies. It was initiated by ESA and carried out by an international consortium. The physical mechanisms generating ionospheric pre-seismic anomalies, from the ground up to ionospheric altitudes, were formulated within the framework of the Lithosphere-Atmosphere-Ionosphere-Magnetosphere Coupling (LAIMC) model (Pulinets et al., 2015). A general algorithm for identifying ionospheric precursors was formalized, which also takes into account external space weather factors capable of generating false alarms. The importance of a special stable pattern called the "precursor mask," based on the self-similarity of pre-seismic ionospheric variations, was highlighted. The role of expert decisions in interpreting pre-seismic anomalies when generating a seismic warning is important as well. The performance of the LAIMC seismo-ionospheric effect detection algorithm was demonstrated using the L'Aquila 2009 earthquake as a case study. The results of the INSPIRE project demonstrated that ionospheric anomalies registered before strong earthquakes can be used as reliable precursors. A detailed classification of pre-seismic anomalies in different regions of the ionosphere was presented, and the signatures of pre-seismic anomalies as detected by ground- and satellite-based instruments were described, clarifying the methodology for identifying precursors from multi-instrument ionospheric measurements. A configuration for a dedicated multi-observation experiment and satellite payload was proposed for future implementation of the INSPIRE project results.
In this regard, the instrument set can be divided into two groups: space equipment and ground-based support, which could be used for real-time monitoring. Along with the scientific and technical tasks, a set of political, logistic and administrative problems (including certification of the approaches by the seismological community and legal procedures by governmental authorities) must be resolved before real earthquake forecasting can be put into practice.
Land Surface Albedo From EPS/AVHRR : Method For Retrieval and Validation
NASA Astrophysics Data System (ADS)
Jacob, G.
2015-12-01
The scope of the Land Surface Analysis Satellite Applications Facility (LSA-SAF) is to increase the benefit from EUMETSAT satellite (MSG and EPS) data by providing added-value products for the meteorological and environmental science communities, with main applications in the fields of climate modelling, environmental management, natural hazards management, and climate change detection. The MSG/SEVIRI daily albedo product, called MDAL, has been disseminated operationally by the LSA-SAF processing centre based in Portugal since 2009. It covers Europe and Africa and provides albedo in the visible, near-infrared and shortwave bands at a resolution of 3 km at the equator. Recently, an albedo product at 1 km, called ETAL, has been built from EPS/AVHRR observations, primarily to complement the MDAL product outside the MSG disk while ensuring global coverage. The methodology is common to MSG and EPS data and relies on inversion of the BRDF (Bidirectional Reflectance Distribution Function) model of Roujean et al. For a given target, the ETAL product exploits the variability of viewing angles, whereas MDAL relies on variations of solar illumination. Comparison of the ETAL albedo product against MODIS and MSG/SEVIRI products over the year 2015 is instructive in many ways and shows in general a good agreement between them; the dispersion may be explained by several factors that will be discussed. The additional information provided by EPS appears to be particularly beneficial for high latitudes during winter and for snow albedo.
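The Roujean model is linear in its kernels (R = k0 + k1·f1 + k2·f2), so the inversion reduces to ordinary least squares over the observed geometries. A minimal sketch with synthetic data; the kernel values f1, f2 are assumed precomputed for each observation (the kernel expressions themselves are omitted), and numpy's `lstsq` stands in for the operational processing.

```python
import numpy as np

def invert_brdf(reflectances, f1, f2):
    """Least-squares inversion of the linear kernel BRDF model
    R = k0 + k1*f1 + k2*f2 (Roujean et al. form). f1, f2 are the
    geometric and volumetric kernel values for each observation;
    returns the coefficients [k0, k1, k2]."""
    A = np.column_stack([np.ones_like(f1), f1, f2])
    k, *_ = np.linalg.lstsq(A, reflectances, rcond=None)
    return k

# Synthetic check: recover known coefficients from noiseless samples
# taken at 20 assumed viewing/illumination geometries.
rng = np.random.default_rng(0)
f1 = rng.uniform(-1.0, 0.0, 20)
f2 = rng.uniform(0.0, 1.0, 20)
true_k = np.array([0.2, 0.05, 0.1])
R = true_k[0] + true_k[1] * f1 + true_k[2] * f2
print(np.round(invert_brdf(R, f1, f2), 3))
```

With noiseless synthetic reflectances the fit recovers the true coefficients; operational retrievals add noise weighting and a priori constraints on top of this core step.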
78 FR 77024 - Telemarketing Sales Rule; Notice of Termination of Caller ID Rulemaking
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-20
..., data mining and anomaly detection, and call-blocking technology). \\19\\ AT&T Servs., Inc., No. 00040, at... technically feasible, by looking at the signaling data . . . to distinguish between a CPN [calling party...
... A telltale abnormality — called a type 1 Brugada ECG pattern — is detected by an electrocardiogram (ECG) test. Brugada syndrome is much more common in ... syndrome is an abnormal pattern on an electrocardiogram (ECG) called a type 1 Brugada ECG pattern. You ...
French, Deborah; Smith, Andrew; Powers, Martin P; Wu, Alan H B
2011-08-17
Binding of a ligand to the epidermal growth factor receptor (EGFR) stimulates various intracellular signaling pathways, resulting in cell cycle progression, proliferation, angiogenesis and inhibition of apoptosis. KRAS is involved in signaling pathways including RAF/MAPK and PI3K, and mutations in this gene result in constitutive activation of these pathways, independent of EGFR activation. Seven mutations in codons 12 and 13 of KRAS comprise around 95% of the observed human mutations and render monoclonal antibodies against EGFR (e.g. cetuximab and panitumumab) ineffective in the treatment of colorectal cancer. KRAS mutation testing by two different methodologies, Sanger sequencing and the AutoGenomics INFINITI® assay, was compared on DNA extracted from colorectal cancers. Of the 29 colorectal tumor samples tested, 28 were concordant between the two methodologies for the KRAS mutations detectable by both assays, with the INFINITI® assay detecting a mutation in one sample that was indeterminate by Sanger sequencing and by a third methodology, single nucleotide primer extension. This study indicates the utility of the AutoGenomics INFINITI® methodology in clinical laboratory settings where technical expertise or access to equipment for DNA sequencing does not exist. Copyright © 2011 Elsevier B.V. All rights reserved.
Red-shouldered hawk occupancy surveys in central Minnesota, USA
Henneman, C.; McLeod, M.A.; Andersen, D.E.
2007-01-01
Forest-dwelling raptors are often difficult to detect because many species occur at low density or are secretive. Broadcasting conspecific vocalizations can increase the probability of detecting forest-dwelling raptors and has been shown to be an effective method for locating raptors and assessing their relative abundance. Recent advances in statistical techniques based on presence-absence data use probabilistic arguments to derive probability of detection when it is <1 and to provide a model and likelihood-based method for estimating proportion of sites occupied. We used these maximum-likelihood models with data from red-shouldered hawk (Buteo lineatus) call-broadcast surveys conducted in central Minnesota, USA, in 1994-1995 and 2004-2005. Our objectives were to obtain estimates of occupancy and detection probability 1) over multiple sampling seasons (yr), 2) incorporating within-season time-specific detection probabilities, 3) with call type and breeding stage included as covariates in models of probability of detection, and 4) with different sampling strategies. We visited individual survey locations 2-9 times per year, and estimates of both probability of detection (range = 0.28-0.54) and site occupancy (range = 0.81-0.97) varied among years. Detection probability was affected by inclusion of a within-season time-specific covariate, call type, and breeding stage. In 2004 and 2005 we used survey results to assess the effect that number of sample locations, double sampling, and discontinued sampling had on parameter estimates. We found that estimates of probability of detection and proportion of sites occupied were similar across different sampling strategies, and we suggest ways to reduce sampling effort in a monitoring program.
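The maximum-likelihood occupancy models referenced above combine a site-occupancy probability (psi) with a per-visit detection probability (p) over repeat surveys. A toy stdlib-only sketch of the single-season likelihood with a constant p, fitted by crude grid search; the detection histories are hypothetical, and real analyses add covariates such as call type and breeding stage.

```python
from math import log

def neg_log_lik(psi, p, histories):
    """Negative log-likelihood of the single-season occupancy model:
    a site with d detections in K visits contributes
    psi * p^d * (1-p)^(K-d) if detected at least once, and
    psi * (1-p)^K + (1-psi) if never detected."""
    ll = 0.0
    for h in histories:
        K, d = len(h), sum(h)
        if d > 0:
            ll += log(psi) + d * log(p) + (K - d) * log(1 - p)
        else:
            ll += log(psi * (1 - p) ** K + (1 - psi))
    return -ll

def fit_occupancy(histories, grid=99):
    """Crude grid-search MLE over (psi, p); a toy stand-in for the
    likelihood maximization done by occupancy-estimation software."""
    best = None
    for i in range(1, grid):
        psi = i / grid
        for j in range(1, grid):
            p = j / grid
            nll = neg_log_lik(psi, p, histories)
            if best is None or nll < best[0]:
                best = (nll, psi, p)
    return best[1], best[2]

# Hypothetical detection histories: 1 = hawk detected on that visit.
histories = [(1, 0, 1), (0, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 0)]
psi_hat, p_hat = fit_occupancy(histories)
print(round(psi_hat, 2), round(p_hat, 2))
```

Note that the occupancy estimate exceeds the naive fraction of sites with detections (3/5), because the model attributes some all-zero histories to occupied-but-undetected sites.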
Vanegas, Fernando; Weiss, John; Gonzalez, Felipe
2018-01-01
Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a (UAV) remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101
NASA Astrophysics Data System (ADS)
Ali-Alvarez, S.; Ferdinand, P.; Magne, S.; Nogueira, R. P.
2013-04-01
Corrosion of reinforcing bar (rebar) in concrete structures is a major issue in civil engineering, and its detection and the monitoring of its evolution remain a challenge for applied research. In this work, we present a new methodology for corrosion detection in reinforced concrete structures that combines fiber Bragg grating (FBG) sensors with the electrochemical and physical properties of rebar in a simplified assembly. Tests in electrolytic solutions and concrete were performed for pitting and general corrosion. The proposed Structural Health Monitoring (SHM) methodology constitutes a direct corrosion measurement potentially useful for implementing or improving Condition-Based Maintenance (CBM) programs for civil engineering concrete structures.
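FBG sensing rests on the Bragg condition (λB = 2·n_eff·Λ) and the strain-shift relation Δλ/λ = (1 − p_e)·ε. A minimal sketch of both relations; the 0.22 photoelastic coefficient and the example numbers are typical silica-fiber values, not figures from this study.

```python
def bragg_wavelength(n_eff, grating_period_nm):
    """Bragg condition: reflected wavelength = 2 * n_eff * grating period."""
    return 2 * n_eff * grating_period_nm

def strain_from_shift(d_lambda_nm, lambda_b_nm, photoelastic=0.22):
    """Convert a measured Bragg wavelength shift to axial strain using
    d_lambda / lambda = (1 - p_e) * strain. p_e ~ 0.22 is a typical
    photoelastic coefficient for silica fiber (illustrative value)."""
    return d_lambda_nm / (lambda_b_nm * (1 - photoelastic))

lam = bragg_wavelength(1.45, 530.0)  # 2 * 1.45 * 530 = 1537.0 nm
print(round(lam, 1))                 # → 1537.0
print(strain_from_shift(1.2, lam))   # strain causing a 1.2 nm shift
```

In a corrosion assembly like the one described, expansion of corrosion products strains the fiber, and the wavelength shift read from the interrogator is converted back to strain this way.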
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become a guaranteed presence in many environments. This has brought growth in illicit events, and computer and network security has therefore become essential in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The approach addresses anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a time frame suited to the application.
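As an illustration of statistical anomaly detection on aggregated flow records, here is a minimal z-score sketch over per-bin NetFlow record counts. The paper's actual model and thresholds are not given in the abstract; the 2.5-sigma threshold, counts, and single global baseline are illustrative (a real deployment would keep per-service, per-time-of-day baselines).

```python
from statistics import mean, stdev

def flag_anomalies(flow_counts, threshold=2.5):
    """Return indices of time bins whose NetFlow record count deviates
    from the series mean by more than `threshold` sample standard
    deviations. A minimal stand-in for statistical event detection;
    note a large spike inflates the standard deviation, which is why a
    moderate threshold is used here."""
    mu, sigma = mean(flow_counts), stdev(flow_counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(flow_counts)
            if abs(c - mu) / sigma > threshold]

# Flow counts per 5-minute bin; bin 7 is a simulated scan/flood burst.
counts = [120, 115, 130, 118, 122, 125, 119, 940, 121, 117]
print(flag_anomalies(counts))  # → [7]
```

The flagged bin index would then be mapped back to the exporting router and time window for analyst follow-up.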
Ferrante di Ruffano, Lavinia; Dinnes, Jacqueline; Sitch, Alice J; Hyde, Chris; Deeks, Jonathan J
2017-02-24
There is a growing recognition for the need to expand our evidence base for the clinical effectiveness of diagnostic tests. Many international bodies are calling for diagnostic randomized controlled trials to provide the most rigorous evidence of impact to patient health. Although these so-called test-treatment RCTs are very challenging to undertake due to their methodological complexity, they have not been subjected to a systematic appraisal of their methodological quality. The extent to which these trials may be producing biased results therefore remains unknown. We set out to address this issue by conducting a methodological review of published test-treatment trials to determine how often they implement adequate methods to limit bias and safeguard the validity of results. We ascertained all test-treatment RCTs published 2004-2007, indexed in CENTRAL, including RCTs which randomized patients to diagnostic tests and measured patient outcomes after treatment. Tests used for screening, monitoring or prognosis were excluded. We assessed adequacy of sequence generation, allocation concealment and intention-to-treat, appropriateness of primary analyses, blinding and reporting of power calculations, and extracted study characteristics including the primary outcome. One hundred three trials compared 105 control with 119 experimental interventions, and reported 150 primary outcomes. Randomization and allocation concealment were adequate in 57 and 37% of trials. Blinding was uncommon (patients 5%, clinicians 4%, outcome assessors 21%), as was an adequate intention-to-treat analysis (29%). Overall 101 of 103 trials (98%) were at risk of bias, as judged using standard Cochrane criteria. Test-treatment trials are particularly susceptible to attrition and inadequate primary analyses, lack of blinding and under-powering. 
These weaknesses pose much greater methodological and practical challenges to conducting reliable RCT evaluations of test-treatment strategies than standard treatment interventions. We suggest a cautious approach that first examines whether a test-treatment intervention can accommodate the methodological safeguards necessary to minimize bias, and highlight that test-treatment RCTs require different methods to ensure reliability than standard treatment trials. Please see the companion paper to this article: http://bmcmedresmethodol.biomedcentral.com/articles/10.1186/s12874-016-0286-0 .
Does Vessel Noise Affect Oyster Toadfish Calling Rates?
Luczkovich, Joseph J; Krahforst, Cecilia S; Hoppe, Harry; Sprague, Mark W
2016-01-01
The question addressed in this study is whether oyster toadfish respond to vessel disturbance by calling less while low-frequency vessel noise is present in a sound recording and afterward. Long-term data recorders were deployed in the Neuse (high vessel-noise site) and Pamlico (low vessel-noise site) Rivers. There were many fewer toadfish detections at the high vessel-noise site than at the low-noise site. Calling rates were lower in the high boat-traffic area, suggesting that toadfish cannot call over loud vessel noise, reducing the overall calling rate, and may have to call more often when vessels are not present.
Rapid detection of bacteria in foods and biological fluids
NASA Technical Reports Server (NTRS)
Fealey, R. D.; Renner, W.
1973-01-01
Simple and inexpensive apparatus, called a "redox monitoring cell," rapidly detects the presence of bacteria. Bacteria are detected by measuring the drop in oxygen content of a test solution. Apparatus consists of a vial with two specially designed electrodes connected to a sensitive voltmeter.
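The detection principle, declaring growth once the measured cell potential has dropped past a threshold as respiring bacteria consume oxygen, can be sketched as follows; the NTRS note gives no numeric criteria, so the threshold and readings here are illustrative assumptions.

```python
def bacteria_detected(readings_mV, drop_threshold_mV=50):
    """Declare bacterial growth once the redox-cell potential has fallen
    more than `drop_threshold_mV` below its initial value, reflecting
    oxygen consumption in the test solution. Returns the index of the
    first reading past the threshold, or None. Threshold and readings
    are illustrative; the original note reports no specific values."""
    baseline = readings_mV[0]
    for t, v in enumerate(readings_mV):
        if baseline - v > drop_threshold_mV:
            return t
    return None

# Simulated voltmeter readings at successive sampling times (mV).
print(bacteria_detected([800, 795, 780, 760, 730, 690]))  # → 4
```

A sterile solution would show no sustained drop and the function would return None.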
Marine mammal acoustic detections in the northeastern Chukchi Sea, September 2007-July 2011
NASA Astrophysics Data System (ADS)
Hannay, David E.; Delarue, Julien; Mouy, Xavier; Martin, Bruce S.; Leary, Del; Oswald, Julie N.; Vallarta, Jonathan
2013-09-01
Several cetacean and pinniped species use the northeastern Chukchi Sea as seasonal or year-round habitat. This area has experienced pronounced reduction in the extent of summer sea ice over the last decade, as well as increased anthropogenic activity, particularly in the form of oil and gas exploration. The effects of these changes on marine mammal species are presently unknown. Autonomous passive acoustic recorders were deployed over a wide area of the northeastern Chukchi Sea off the coast of Alaska from Cape Lisburne to Barrow, at distances from 8 km to 200 km from shore: up to 44 each summer and up to 8 each winter. Acoustic data were acquired at 16 kHz continuously during summer and on a duty cycle of 40 or 48 min within each 4-h period during winter. Recordings were analyzed manually and using automated detection and classification systems to identify calls. Bowhead (Balaena mysticetus) and beluga (Delphinapterus leucas) whale calls were detected primarily from April through June and from September to December during their migrations between the Bering and Beaufort seas. Summer detections were rare and usually concentrated off Wainwright and Barrow, Alaska. Gray (Eschrichtius robustus) whale calls were detected between July and October, their occurrence decreasing with increasing distance from shore. Fin (Balaenoptera physalus), killer (Orcinus orca), minke (Balaenoptera acutorostrata), and humpback (Megaptera novaeangliae) whales were detected sporadically in summer and early fall. Walrus (Odobenus rosmarus) was the most commonly detected species between June and October, primarily occupying the southern edge of Hanna Shoal and haul-outs near coastal recording stations off Wainwright and Point Lay. Ringed (Pusa hispida) and bearded (Erignathus barbatus) seals occur year-round in the Chukchi Sea. Ringed seal acoustic detections occurred throughout the year but detection numbers were low, likely due to low vocalization rates. 
Bearded seal acoustic detections peaked in April and May during their breeding season, with much lower detection numbers in July and August, likely as a result of reduced calling rates after breeding season. Ribbon seals (Histriophoca fasciata) were only detected in the fall as they migrated south through the study area toward the Bering Sea. These results suggest a regular presence of marine mammals in the Chukchi Sea year-round, with species-dependent seasonal and spatial density variations.
Cetacean population density estimation from single fixed sensors using passive acoustics.
Küsel, Elizabeth T; Mellinger, David K; Thomas, Len; Marques, Tiago A; Moretti, David; Ward, Jessica
2011-06-01
Passive acoustic methods are increasingly being used to estimate animal population density. Most density estimation methods are based on estimates of the probability of detecting calls as functions of distance. Typically these are obtained using receivers capable of localizing calls or from studies of tagged animals. However, both approaches are expensive to implement. The approach described here uses a Monte Carlo model to estimate the probability of detecting calls from single sensors. The passive sonar equation is used to predict signal-to-noise ratios (SNRs) of received clicks, which are then combined with a detector characterization that predicts probability of detection as a function of SNR. Input distributions for source level, beam pattern, and whale depth are obtained from the literature. Acoustic propagation modeling is used to estimate transmission loss. Other inputs for density estimation are call rate, obtained from the literature, and false positive rate, obtained from manual analysis of a data sample. The method is applied to estimate the density of Blainville's beaked whales over a 6-day period around a single hydrophone located in the Tongue of the Ocean, Bahamas. Results are consistent with those from previous analyses, which used additional tag data. © 2011 Acoustical Society of America.
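The pipeline, passive sonar equation to SNR, detector characterization from SNR to detection probability, and Monte Carlo averaging over input distributions, can be sketched as follows. The source-level distribution, noise level, spherical-spreading loss, and logistic detector curve are illustrative assumptions, not the study's calibrated inputs (which also include beam pattern, whale depth, and modeled propagation).

```python
import random
from math import exp, log10

def snr_db(source_level, range_m, noise_level):
    """Passive sonar equation, simplified to SNR = SL - TL - NL with
    spherical-spreading transmission loss TL = 20*log10(range)."""
    return source_level - 20 * log10(range_m) - noise_level

def p_detect(snr, midpoint=20.0, slope=1.0):
    """Detector characterization: probability of detecting a click as a
    logistic function of SNR (midpoint and slope are illustrative, not
    the study's measured detector curve)."""
    return 1.0 / (1.0 + exp(-slope * (snr - midpoint)))

def mc_detection_prob(range_m, n=10_000, seed=1):
    """Monte Carlo estimate of single-sensor detection probability at a
    given range, drawing click source levels from an assumed normal
    distribution (200 +/- 5 dB) against an assumed 100 dB noise level."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        snr = snr_db(rng.gauss(200.0, 5.0), range_m, 100.0)
        if rng.random() < p_detect(snr):
            hits += 1
    return hits / n

print(mc_detection_prob(1_000), mc_detection_prob(20_000))
```

Integrating this detection-probability curve over range, together with call rate and false-positive rate, yields the density estimator the abstract describes.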
Accuracy of CNV Detection from GWAS Data.
Zhang, Dandan; Qian, Yudong; Akula, Nirmala; Alliey-Rodriguez, Ney; Tang, Jinsong; Gershon, Elliot S; Liu, Chunyu
2011-01-13
Several computer programs are available for detecting copy number variants (CNVs) using genome-wide SNP arrays. We evaluated the performance of four CNV detection software suites (Birdsuite, Partek, HelixTree, and PennCNV-Affy) in the identification of both rare and common CNVs. Each program's performance was assessed in two ways. The first was its recovery rate, i.e., its ability to call 893 CNVs previously identified in eight HapMap samples by paired-end sequencing of whole-genome fosmid clones, and 51,440 CNVs identified by array comparative genome hybridization (aCGH) followed by validation procedures, in 90 HapMap CEU samples. The second was program performance calling rare and common CNVs in the Bipolar Genome Study (BiGS) data set (1001 bipolar cases and 1033 controls, all of European ancestry) as measured by the Affymetrix SNP 6.0 array. Accuracy in calling rare CNVs was assessed by positive predictive value, based on the proportion of rare CNVs validated by quantitative real-time PCR (qPCR), while accuracy in calling common CNVs was assessed by false positive/false negative rates based on qPCR validation results from a subset of common CNVs. Birdsuite recovered the highest percentages of known HapMap CNVs containing >20 markers in the two reference CNV datasets. The recovery rate increased with decreased CNV frequency. In the tested rare CNV data, Birdsuite and Partek had higher positive predictive values than the other software suites. In a test of three common CNVs in the BiGS dataset, Birdsuite's call was 98.8% consistent with qPCR quantification in one CNV region, but the other two regions showed an unacceptably low degree of accuracy. We found relatively poor consistency between the two "gold standards," the sequence data of Kidd et al. and the aCGH data of Conrad et al. Algorithms for calling CNVs, especially common ones, need substantial improvement, and a "gold standard" for detection of CNVs remains to be established.
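The accuracy metrics used above are simple ratios over qPCR-validated calls; a minimal sketch with illustrative counts (not BiGS results):

```python
def positive_predictive_value(validated_true, called_positive):
    """PPV for rare CNV calls: the fraction of a program's calls that
    were confirmed by qPCR validation."""
    return validated_true / called_positive

def fp_fn_rates(tp, fp, fn):
    """Error rates for common CNV calls against qPCR as reference:
    false positives as a share of all positive calls, and false
    negatives as a share of all qPCR-confirmed CNVs."""
    fp_rate = fp / (tp + fp)
    fn_rate = fn / (tp + fn)
    return fp_rate, fn_rate

# Illustrative counts: 45 of 50 rare calls validated; for common CNVs,
# 90 true positives, 10 false positives, 5 missed CNVs.
print(positive_predictive_value(45, 50))                       # → 0.9
print(tuple(round(x, 3) for x in fp_fn_rates(90, 10, 5)))      # → (0.1, 0.053)
```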
Hällfors, Eerik; Saku, Sami A; Mäkinen, Tatu J; Madanat, Rami
2018-03-01
Different measures for reducing costs after total joint arthroplasty (TJA) have gained attention lately. At our institution, a free-of-charge consultation phone service was initiated that targeted patients with TJA. This service aimed at reducing unnecessary emergency department (ED) visits and, thus, potentially improving the cost-effectiveness of TJAs. To our knowledge, a similar consultation service had not been described previously. We aimed at examining the rates and reasons for early postdischarge phone calls and evaluating the efficacy of this consultation service. During a 2-month period, we gathered information on every call received by the consultation phone service from patients with TJAs within 90 days of the index TJA procedure. Patients were followed for 2 weeks after making a call to detect major complications and self-initiated ED visits. Data were collected from electronic medical charts regarding age, gender, type of surgery, date of discharge, and length of hospital stay. We analyzed 288 phone calls. Calls were mostly related to medication (41%), wound complications (17%), and mobilization issues (15%). Most calls were resolved in the phone consultation. Few patients (13%) required further evaluation in the ED. The consultation service failed to detect the need for an ED visit in 2 cases (0.7%) that required further care. The consultation phone service clearly benefitted patients with TJAs. The service reduced the number of unnecessary ED visits and functioned well in detecting patients who required further care. Most postoperative concerns were related to prescribed medications, wound complications, and mobilization issues. Copyright © 2017 Elsevier Inc. All rights reserved.
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that a product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the rapid product development life-cycles of the competitive connected health industry. Objective The aim of this study was to apply a structured HCD methodology to the development of a smartphone app to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study provides prospective designers with a detailed description of the application of an HCD methodology. Methods A 3-phase methodology was applied. In the first phase, a descriptive "use case" was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app and how various actors would interact with it and in what context. A working app prototype and a user manual were then developed based on this feedback and subjected to a rigorous usability inspection. Further changes were made both to the interface and to the support documentation. The now advanced prototype was exposed to user testing by end users, where further design recommendations were made. Results Combined expert and end-user analysis of a comprehensive use case originally identified 21 problems with the system interface; we observed only 3 of these problems in user testing, implying that 18 problems were eliminated between phases 1 and 3.
Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users showed that the system imposes low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). Conclusions From our observation of older adults’ interactions with smartphone interfaces, some recurring themes emerged. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. PMID:28559227
Role of More Active Identification of Brain-Dead Cases in Increasing Organ Donation.
Sadegh Beigee, Farahnaz; Mohsenzadeh, Mojtaba; Shahryari, Shagin; Mojtabaee, Meysam
2017-02-01
Organ donor shortage is a worldwide problem, resulting in 10% to 30% mortality rates for patients on wait lists for organ transplant. In Iran, it is mandatory for intensive care unit patients with a Glasgow Coma Scale below 5/15 to be reported to an organ procurement unit. However, this process has not been functioning effectively. Here, we present the effects of changing our strategy for detecting brain-dead cases on the organ donor pool. In March 2015, we changed our strategy for active detection of brain-dead cases. Since then, our newly established protocol has included the following changes: (1) instead of calling high-volume intensive care units 3 times per week, we switched to calling every day in the morning; (2) instead of calling low-volume intensive care units 1 time per week, we switched to calling 3 times per week; (3) we included intensive care units (cardiac and general), neurosurgery, and emergency departments, as well as nursing supervisor offices, in our call and visit lists; and (4) we increased visits to wards by our trained staff as inspectors. From March 2015 to March 2016, the number of reported suspected brain-dead cases increased from 224 to 460 per year, with proven brain death increasing from 180 to 306 cases. The actual number of donors also increased, from 116 to 165 donations (53% increase) over 1 year. More proactive strategies have had significant effects on brain-dead detection, resulting in significantly increased donor pools and organ donations. In countries where cooperation of hospital staff is low, more proactive engagement in detecting brain-dead cases is a good solution to prevent loss of potential organ donors, with the final result of decreasing wait list mortality.
Kaonga, Nadi Nina; Labrique, Alain; Mechael, Patricia; Akosah, Eric; Ohemeng-Dapaah, Seth; Sakyi Baah, Joseph; Kodie, Richmond; Kanter, Andrew S; Levine, Orin
2013-01-01
Background The network structure of an organization influences how well or poorly an organization communicates and manages its resources. In the Millennium Villages Project site in Bonsaaso, Ghana, a mobile phone closed user group has been introduced for use by the Bonsaaso Millennium Villages Project Health Team and other key individuals. No assessment on the benefits or barriers of the use of the closed user group had been carried out. Objective The purpose of this research was to make the case for the use of social network analysis methods to be applied in health systems research—specifically related to mobile health. Methods This study used mobile phone voice records of, conducted interviews with, and reviewed call journals kept by a mobile phone closed user group consisting of the Bonsaaso Millennium Villages Project Health Team. Social network analysis methodology complemented by a qualitative component was used. Monthly voice data of the closed user group from Airtel Bharti Ghana were analyzed using UCINET and visual depictions of the network were created using NetDraw. Interviews and call journals kept by informants were analyzed using NVivo. Results The methodology was successful in helping identify effective organizational structure. Members of the Health Management Team were the more central players in the network, rather than the Community Health Nurses (who might have been expected to be central). Conclusions Social network analysis methodology can be used to determine the most productive structure for an organization or team, identify gaps in communication, identify key actors with greatest influence, and more. In conclusion, this methodology can be a useful analytical tool, especially in the context of mobile health, health services, and operational and managerial research. PMID:23552721
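The kind of analysis described above can be illustrated with a toy degree-centrality calculation. This is only a minimal stand-in for the study's actual workflow (UCINET on Airtel Bharti Ghana voice logs); the call records and actor names below are invented for illustration.

```python
from collections import Counter

# Hypothetical call records as (caller, receiver) pairs between distinct actors.
calls = [
    ("manager", "nurse_a"),
    ("manager", "nurse_b"),
    ("manager", "nurse_c"),
    ("nurse_a", "nurse_b"),
]

# Degree centrality: count the ties each actor participates in.
ties = Counter()
for caller, receiver in calls:
    ties[caller] += 1
    ties[receiver] += 1

# The actor with the most ties is the most central player in the network.
central_actor = max(ties, key=ties.get)
```

Run on the toy data, `central_actor` is `"manager"`, mirroring the study's finding that management-team members, not the community health nurses, sat at the center of the network.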
Modeling seasonal detection patterns for burrowing owl surveys
Quresh S. Latif; Kathleen D. Fleming; Cameron Barrows; John T. Rotenberry
2012-01-01
To guide monitoring of burrowing owls (Athene cunicularia) in the Coachella Valley, California, USA, we analyzed survey-method-specific seasonal variation in detectability. Point-based call-broadcast surveys yielded high early season detectability that then declined through time, whereas detectability on driving surveys increased through the season. Point surveys...
An enhanced methodology for spacecraft correlation activity using virtual testing tools
NASA Astrophysics Data System (ADS)
Remedia, Marcello; Aglietti, Guglielmo S.; Appolloni, Matteo; Cozzani, Alessandro; Kiley, Andrew
2017-11-01
Test planning and post-test correlation activity have grown in importance over the last few decades, and many methodologies have been developed to either quantify or improve the correlation between computational and experimental results. In this article the methodologies established so far are enhanced with the implementation of a recently developed procedure called Virtual Testing. In the context of fixed-base sinusoidal tests (commonly used in the space sector for correlation), several factors in the test campaign affect the behaviour of the satellite and are not normally taken into account when performing analyses: different boundary conditions created by the shaker's own dynamics, a non-perfect control system, signal delays, etc. All these factors are the core of the Virtual Testing implementation, which is thoroughly explained in this article and applied to the specific case of the BepiColombo spacecraft tested on the ESA QUAD shaker. Correlation activity is performed at the various stages of the process, showing important improvements after applying the final complete methodology.
Horizon Mission Methodology - A tool for the study of technology innovation and new paradigms
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission (HM) methodology was developed to provide a means of identifying and evaluating highly innovative, breakthrough technology concepts (BTCs) and for assessing their potential impact on advanced space missions. The methodology is based on identifying new capabilities needed by hypothetical 'horizon' space missions having performance requirements that cannot be met even by extrapolating known space technologies. Normal human evaluation of new ideas such as BTCs appears to be governed (and limited) by 'inner models of reality' defined as paradigms. Thus, new ideas are evaluated by old models. This paper describes the use of the HM Methodology to define possible future paradigms that would provide alternatives to evaluation by current paradigms. The approach is to represent a future paradigm by a set of new BTC-based capabilities - called a paradigm abstract. The paper describes methods of constructing and using the abstracts for evaluating BTCs for space applications and for exploring the concept of paradigms and paradigm shifts as a representation of technology innovation.
Enhanced High Performance Power Compensation Methodology by IPFC Using PIGBT-IDVR
Arumugom, Subramanian; Rajaram, Marimuthu
2015-01-01
Currently, power systems are controlled without high-speed control and are slow to respond when compared with static electronic devices. Among the various power interruptions in power supply systems, voltage dips play a central role in causing disruption. The dynamic voltage restorer (DVR) is a voltage-control-based process that compensates for line transients in the distributed system. To overcome these issues and to achieve higher speed, a new methodology called the Parallel IGBT-Based Interline Dynamic Voltage Restorer (PIGBT-IDVR) has been proposed, which focuses on the dynamic processing of energy reloads in common dc-linked energy storage with less adaptive transition. The interline power flow controller (IPFC) scheme is employed to manage power transmission between the lines, and the restorer method controls the reactive power in the individual lines. The proposed methodology avoids failure of the distributed system and provides better performance than existing methodologies. PMID:26613101
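The series-injection principle behind any DVR can be sketched in a few lines. This toy model works on per-sample voltage magnitudes only; it is not the paper's PIGBT-IDVR control scheme, and the 230 V nominal value is an assumption for illustration.

```python
def dvr_injection(v_supply, v_nominal=230.0):
    """Series-injection principle of a dynamic voltage restorer:
    the DVR adds whatever voltage the sagged supply is missing,
    so the load keeps seeing the nominal value."""
    return [v_nominal - v for v in v_supply]

# A 3-sample voltage dip from 230 V down to 180 V and back.
sag = [230.0, 180.0, 230.0]
injected = dvr_injection(sag)                       # [0.0, 50.0, 0.0]
restored = [s + i for s, i in zip(sag, injected)]   # load sees 230 V throughout
```

The practical difficulty, which the abstract's energy-storage discussion addresses, is that the injected voltage must come from somewhere: the dc-linked storage must be reloaded fast enough to ride through successive dips.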
The Cost of Inadequate Sleep among On-Call Workers in Australia: A Workplace Perspective
Ferguson, Sally A.; Jay, Sarah M.
2018-01-01
On-call or stand-by work is becoming an increasingly prevalent form of work scheduling. However, on-call arrangements are typically utilised when workloads are low, for example at night, which can result in inadequate sleep. It is a matter of concern that on-call work is associated with an increased risk of workplace injury. This study sought to determine the economic cost of injury due to inadequate sleep in Australian on-call workers. The prevalence of inadequate sleep among on-call workers was determined using an online survey, and economic costs were estimated using a previously validated costing methodology. Two-thirds of the sample (66%) reported obtaining inadequate sleep on weekdays (work days) and over 80% reported inadequate sleep while on-call. The resulting cost of injury is estimated at $2.25 billion per year ($1.71–2.73 billion). This equates to $1222 per person per incident involving a short-term absence from work; $2.53 million per incident classified as full incapacity; and $1.78 million for each fatality. To the best of our knowledge, this is the first study to quantify the economic cost of workplace injury due to inadequate sleep in on-call workers. Well-rested employees are critical to safe and productive workplace operations. Therefore, it is in the interest of both employers and governments to prioritise and invest far more into the management of inadequate sleep in industries which utilise on-call work arrangements. PMID:29495371
NASA Astrophysics Data System (ADS)
Al-Mousa, Amjed A.
Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the device characteristics where these films are used has generated substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterize surface nanostructures of thin films by focusing on isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as Atomic Force Microscopy (AFM) data which we are presenting here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both vertical and horizontal directions. It then proceeds by classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy technique system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. Also, we present a case study to demonstrate the effectiveness of our methodology to identify quantitatively particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. 
A comparison with other techniques, such as thresholding, watershed segmentation, and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique using synthetic data. These results are discussed and compared, along with the challenges of the two techniques.
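The overlay technique described above can be illustrated with a simplified sketch: surface points are flagged independently along rows and along columns, and only points flagged in both directions are classified as structure. The k-standard-deviation threshold here is a hypothetical stand-in for the paper's more elaborate per-direction classification.

```python
def overlay_classify(height, k=1.0):
    """Toy overlay-style classification of a 2D height map (list of rows).
    A point counts as a nanostructure only if it stands out both in its
    row scan and in its column scan."""
    n, m = len(height), len(height[0])

    def flags(lines):
        # Flag values that rise k standard deviations above their line's mean.
        out = []
        for line in lines:
            mean = sum(line) / len(line)
            std = (sum((v - mean) ** 2 for v in line) / len(line)) ** 0.5
            out.append([v > mean + k * std for v in line])
        return out

    row_flag = flags(height)
    cols = [[height[i][j] for i in range(n)] for j in range(m)]
    col_flag = flags(cols)  # indexed [col][row]

    # Overlay: keep a point only when both directional scans agree.
    return [[row_flag[i][j] and col_flag[j][i] for j in range(m)]
            for i in range(n)]

# A flat 5x5 surface with one raised point in the centre.
surface = [[0.0] * 5 for _ in range(5)]
surface[2][2] = 10.0
mask = overlay_classify(surface)   # only the raised point survives the overlay
```

After classification, adjacent flagged points would be clustered into discrete structures, which is the step where shape and size statistics are extracted.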
ERIC Educational Resources Information Center
Guo, Shibao; Maitra, Srabani
2017-01-01
Under the new mobilities paradigm, migration is conceptualized as circulatory and transnational, moving us beyond the framework of methodological nationalism. Transnational mobility has called into question dominant notions of migrant acculturation or assimilation. Migrants no longer feel obligated to remain tied to or locatable in a…
On the Use of Social Clocks for the Monitoring of Multidimensional Social Development
ERIC Educational Resources Information Center
Mueller, Georg P.
2011-01-01
This article describes a new methodology for monitoring multidimensional social development using social clocks: comparisons with so-called reference trajectories make it possible to establish the development stage of a country along a number of independent time axes, thus affording new opportunities for analyzing leads, lags, and asynchronies…
A Call for a More Measured Approach to Reporting and Interpreting PISA Results
ERIC Educational Resources Information Center
Rutkowski, Leslie; Rutkowski, David
2016-01-01
In the current article, we consider the influential position of the Programme for International Student Assessment (PISA) and discuss several methodological areas that demonstrate the need for caution when using and interpreting PISA results. We motivate our argument by briefly describing the program's increased influence in educational policy…
Underwater Photo-Elicitation: A New Experiential Marine Education Technique
ERIC Educational Resources Information Center
Andrews, Steve; Stocker, Laura; Oechel, Walter
2018-01-01
Underwater photo-elicitation is a novel experiential marine education technique that combines direct experience in the marine environment with the use of digital underwater cameras. A program called Show Us Your Ocean! (SUYO!) was created, utilising a mixed methodology (qualitative and quantitative methods) to test the efficacy of this technique.…
Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI
ERIC Educational Resources Information Center
Forer, Barry; Zumbo, Bruno D.
2011-01-01
The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…
The Impact of Mode of Instructional Delivery on Second Language Teacher Self-Efficacy
ERIC Educational Resources Information Center
Kissau, Scott; Algozzine, Bob
2015-01-01
Research has called into question the suitability of fully-online instruction for certain teacher preparation courses. Methodology coursework, in particular, has been singled out in research as ill-suited to online instruction. Recent research, for example, involving second language (L2) teacher candidates has demonstrated that aspiring teachers…
Elementary Teacher Leaders: Theory and Methodology of Development
ERIC Educational Resources Information Center
Medina, Andrew J.
2014-01-01
Education reform and improvement in K-12 public education continues to be a national priority. Research within the past half century has brought to the forefront the brutal truth that American public education needs improvement (Tyack & Cuban, 1995). The call for improvement includes the need for teachers to emerge from silence and isolation…
The PLATO System: A Study in the Diffusion of an Innovation.
ERIC Educational Resources Information Center
Driscoll, Francis D.; Wolf, W. C., Jr.
This study was designed to ascertain the relationships between the steps of a tool designed to link knowledge production and the needs of knowledge users (the Wolf-Welsh Linkage Methodology or WWLM) with milestones in the evolution of an innovative computer-assisted instructional system called PLATO (Programming Logic for Advanced Teaching…
Moments, Mixed Methods, and Paradigm Dialogs
ERIC Educational Resources Information Center
Denzin, Norman K.
2010-01-01
I reread the 50-year-old history of the qualitative inquiry that calls for triangulation and mixed methods. I briefly visit the disputes within the mixed methods community asking how did we get to where we are today, the period of mixed-multiple-methods advocacy, and Teddlie and Tashakkori's third methodological moment. (Contains 10 notes.)
Photovoice in the Diversity Classroom: Engagement, Voice, and the "Eye/I" of the Camera
ERIC Educational Resources Information Center
Chio, Vanessa C. M.; Fandt, Patricia M.
2007-01-01
A response to calls for more self-reflective and inclusive pedagogy, this article considers pedagogical and teaching possibilities offered by Photovoice--a community and participatory action research methodology developed by Wang and Burris. Extrapolating Photovoice to the context of the diversity classroom, the authors discuss how the methodology…
ERIC Educational Resources Information Center
Plante, Jarrad D.; Cox, Thomas D.
2016-01-01
Service-learning has a longstanding history in higher education and includes three main tenets: academic learning, meaningful community service, and civic learning. The Carnegie Foundation for the Advancement of Teaching created an elective classification system called the Carnegie Community Engagement Classification for higher education…
Childcare Regulations: Regulatory Enforcement in Ireland. What Happens When the Inspector Calls?
ERIC Educational Resources Information Center
Moloney, Mary
2016-01-01
Childcare regulations ensure children's rights to Early Childhood Care and Education settings that protect them from harm and promote their healthy development. To ensure that settings comply, power is vested with regulatory bodies that are tasked with enforcing regulations. Using a qualitative methodology, 43 interviews were undertaken with Early…
The Use of Factorial Forecasting to Predict Public Response
ERIC Educational Resources Information Center
Weiss, David J.
2012-01-01
Policies that call for members of the public to change their behavior fail if people don't change; predictions of whether the requisite changes will take place are needed prior to implementation. I propose to solve the prediction problem with Factorial Forecasting, a version of functional measurement methodology that employs group designs. Aspects…
"You've Got the Power": Documentary Film as a Tool of Environmental Adult Education
ERIC Educational Resources Information Center
Clover, Darlene E.
2011-01-01
Educators call for more creative means to combat the moribund narratives of contemporary environmentalism. Using visual methodology and environmental adult education theory, this article discusses how a documentary film titled "You've Got the Power" works to pose questions about complex environmental issues and develop critical thinking…
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
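The pairwise-comparison step of AHP can be sketched as follows. The comparison values below are hypothetical, not the inspection criteria weights from the paper; the geometric-mean method used here is a standard approximation to AHP's principal-eigenvector priorities.

```python
import math

# Illustrative pairwise-comparison matrix for three school-inspection
# criteria on Saaty's 1-9 scale: A[i][j] says how much more important
# criterion i is than criterion j (values are invented for illustration).
A = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]

def ahp_weights(matrix):
    """Approximate the AHP priority vector with the geometric-mean method:
    take the geometric mean of each row, then normalize to sum to 1."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

weights = ahp_weights(A)   # criterion 1 receives the largest weight
```

In a full AHP application the consistency ratio of the matrix would also be checked before the weights are accepted.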
The Past as More than Prologue: A Call for Historical Research
ERIC Educational Resources Information Center
Horsford, Sonya Douglass; D'Amico, Diana
2015-01-01
Purpose: The purpose of this paper is to argue that historical research methods offer an innovative and powerful way to examine, frame, explain, and disrupt the study of contemporary issues in educational leadership. More specifically, the authors examine how historical methodology might recast some of the questions educational leadership…
Indigenous-Centered Pedagogies: Strategies for Teaching Native American Literature and Culture
ERIC Educational Resources Information Center
Portillo, Annette
2013-01-01
As a reflection on pedagogy, this essay seeks to provide strategic tools for teaching Native American literature and culture to non-native students. My teaching philosophy is informed by the indigenous-centered, decolonial methodologies as defined by Devon Mihesuah who calls for "indigenizing" the academy by challenging the status quo…
Taking a Stance through Visual Texts: Novice Teachers as Educational Agents
ERIC Educational Resources Information Center
Orland-Barak, Lily; Maskit, Ditza
2014-01-01
Drawing on qualitative methodologies that integrate verbal and non-verbal texts, this study investigated novice teachers' attributions of their experiences of internship, as conveyed through a visual text. Novices were invited to design a visual text that represented their experience during internship, as part of a national call entitled…
A Core Journal Decision Model Based on Weighted Page Rank
ERIC Educational Resources Information Center
Wang, Hei-Chia; Chou, Ya-lin; Guo, Jiunn-Liang
2011-01-01
Purpose: The paper's aim is to propose a core journal decision method, called the local impact factor (LIF), which can evaluate the requirements of the local user community by combining both the access rate and the weighted impact factor, and by tracking citation information on the local users' articles. Design/methodology/approach: Many…
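One plausible reading of such a combination can be sketched as a normalized blend of the two signals. The blending formula, the alpha parameter, and the journal data below are assumptions made for illustration; the paper's actual LIF definition may differ.

```python
def local_impact_factor(journals, alpha=0.5):
    """Hypothetical local impact factor: blend each journal's local access
    rate with its weighted impact factor, after normalizing both signals
    to [0, 1] across the collection."""
    max_access = max(access for access, _ in journals.values())
    max_impact = max(impact for _, impact in journals.values())
    return {
        name: alpha * (access / max_access) + (1 - alpha) * (impact / max_impact)
        for name, (access, impact) in journals.items()
    }

# Toy collection: (local access count, weighted impact factor).
scores = local_impact_factor({
    "journal_a": (900, 2.1),   # heavily read locally, modest impact
    "journal_b": (100, 8.4),   # high impact, little local use
    "journal_c": (450, 4.0),
})
```

With equal weighting, the heavily used local journal outranks the high-impact but rarely accessed one, which is exactly the kind of local-community signal a core-journal decision method is meant to capture.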
ERIC Educational Resources Information Center
Cleary, Timothy J.; Dong, Ting; Artino, Anthony R., Jr.
2015-01-01
This study examined within-group shifts in the motivation beliefs and regulatory processes of second-year medical students as they engaged in a diagnostic reasoning activity. Using a contextualized assessment methodology called self-regulated learning microanalysis, the authors found that the 71 medical student participants showed statistically…
Engaging Families in the Galleries Using Design Thinking
ERIC Educational Resources Information Center
Larson, Lucy
2017-01-01
The Palo Alto Art Center sought a solution to the challenge that loyal family audiences, visiting weekly for art studio classes, rarely visit the contemporary art exhibition galleries. This article relates the experience of using the human-centered design process, often called Design Thinking, as the methodology to create a solution for family…