High-Resolution Wind Measurements for Offshore Wind Energy Development
NASA Technical Reports Server (NTRS)
Nghiem, Son V.; Neumann, Gregory
2011-01-01
A mathematical transform, called the Rosette Transform, and a new method, called the Dense Sampling Method, have been developed. The Rosette Transform applies to both the mean part and the fluctuating part of a targeted radar signature, using the Dense Sampling Method to construct the data on a high-resolution grid at 1-km posting for wind measurements over water surfaces such as oceans or lakes.
ParticleCall: A particle filter for base calling in next-generation sequencing systems
2012-01-01
Background Next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, thus enabling routine sequencing tasks and taking us one step closer to personalized medicine. Accuracy and lengths of their reads, however, are yet to surpass those provided by the conventional Sanger sequencing method. This motivates the search for computationally efficient algorithms capable of reliable and accurate detection of the order of nucleotides in short DNA fragments from the acquired data. Results In this paper, we consider Illumina's sequencing-by-synthesis platform, which relies on reversible terminator chemistry, and describe the acquired signal by reformulating its mathematical model as a Hidden Markov Model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base calling scheme called ParticleCall. ParticleCall is tested on a data set obtained by sequencing phiX174 bacteriophage using Illumina's Genome Analyzer II. The results show that the developed base calling scheme is significantly more computationally efficient than the best performing unsupervised method currently available, while achieving the same accuracy. Conclusions The proposed ParticleCall provides more accurate calls than Illumina's base calling algorithm, Bustard. At the same time, ParticleCall is significantly more computationally efficient than other recent schemes with similar performance, rendering it more feasible for high-throughput sequencing data analysis. Improvement of base calling accuracy will have immediate beneficial effects on the performance of downstream applications such as SNP and genotype calling. ParticleCall is freely available at https://sourceforge.net/projects/particlecall. PMID:22776067
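The abstract's pairing of a Hidden Markov Model with sequential Monte Carlo methods can be illustrated with a generic bootstrap particle filter. The sketch below is not the ParticleCall algorithm itself: the four-state chain, uniform transitions, Gaussian per-cycle intensities, and all parameter values are illustrative assumptions standing in for the paper's reversible-terminator signal model.

```python
# A minimal bootstrap particle filter for a discrete-state HMM with Gaussian
# emissions; toy stand-in for the sequential Monte Carlo base caller.
import numpy as np

rng = np.random.default_rng(0)

BASES = 4                      # hidden states: A, C, G, T
T = 50                         # cycles (read length)
N = 500                        # particles

trans = np.full((BASES, BASES), 1.0 / BASES)    # uniform base transitions (toy)
means = np.array([0.0, 1.0, 2.0, 3.0])          # per-base signal means (toy)
sigma = 0.4                                     # emission noise

# Simulate a ground-truth sequence and noisy per-cycle intensities.
truth = rng.integers(0, BASES, size=T)
obs = means[truth] + rng.normal(0.0, sigma, size=T)

particles = rng.integers(0, BASES, size=N)
calls = np.empty(T, dtype=int)
for t in range(T):
    # Propagate each particle through the transition kernel.
    particles = np.array([rng.choice(BASES, p=trans[p]) for p in particles])
    # Weight by the Gaussian emission likelihood of the observed intensity.
    w = np.exp(-0.5 * ((obs[t] - means[particles]) / sigma) ** 2)
    w /= w.sum()
    # Call the base carrying the largest posterior weight mass.
    calls[t] = np.bincount(particles, weights=w, minlength=BASES).argmax()
    # Multinomial resampling to avoid weight degeneracy.
    particles = particles[rng.choice(N, size=N, p=w)]

print("accuracy:", (calls == truth).mean())
```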
Developing Multimedia Courseware for the Internet's Java versus Shockwave.
ERIC Educational Resources Information Center
Majchrzak, Tina L.
1996-01-01
Describes and compares two methods for developing multimedia courseware for use on the Internet: an authoring tool called Shockwave, and an object-oriented language called Java. Topics include vector graphics, browsers, interaction with network protocols, data security, multithreading, and computer languages versus development environments. (LRW)
Improvements to Passive Acoustic Tracking Methods for Marine Mammal Monitoring
2014-09-30
species of interest in these datasets are sperm whales, beaked whales, minke whales, and humpback whales. Most methods developed will be...datasets, automated detectors for fin and sei whales were developed, implemented and quantified. For the “stereotypical” calls produced by these animals...Objective 4: The matched filter detectors implemented for fin and sei whale calls are sufficient for the purposes of this project, with
Flexible Delivery as a "Whole-Organisation": What Does This Mean in Practice?
ERIC Educational Resources Information Center
Henry, John; Wakefield, Lyn
A research project called Support Services for Flexible Delivery was commissioned by the Australian organization TAFE (technical and further education) Frontiers. Since 1995, the project has been conducted using a research approach called the Generalizations from Case Studies (GCS) method. The GCS method was developed, tested, and…
NASA Technical Reports Server (NTRS)
Gramoll, K. C.; Dillard, D. A.; Brinson, H. F.
1989-01-01
In response to the tremendous growth in the development of advanced materials, such as fiber-reinforced plastic (FRP) composite materials, a new numerical method is developed to analyze and predict the time-dependent properties of these materials. Basic concepts in viscoelasticity, laminated composites, and previous viscoelastic numerical methods are presented. A stable numerical method, called the nonlinear differential equation method (NDEM), is developed to calculate the in-plane stresses and strains over any time period for a general laminate constructed from nonlinear viscoelastic orthotropic plies. The method is implemented in an in-plane stress analysis computer program, called VCAP, to demonstrate its usefulness and to verify its accuracy. Results from a number of experimental tests performed on Kevlar/epoxy composite laminates are compared to predictions calculated from the numerical method.
Biological relevance of CNV calling methods using familial relatedness including monozygotic twins.
Castellani, Christina A; Melka, Melkaye G; Wishart, Andrea E; Locke, M Elizabeth O; Awamleh, Zain; O'Reilly, Richard L; Singh, Shiva M
2014-04-21
Studies involving the analysis of structural variation including Copy Number Variation (CNV) have recently exploded in the literature. Furthermore, CNVs have been associated with a number of complex diseases and neurodevelopmental disorders. Common methods for CNV detection use SNP, CNV, or CGH arrays, where the signal intensities of consecutive probes are used to define the number of copies associated with a given genomic region. These practices pose a number of challenges that interfere with the ability of available methods to accurately call CNVs. It has, therefore, become necessary to develop experimental protocols to test the reliability of CNV calling methods from microarray data so that researchers can properly discriminate biologically relevant data from noise. We have developed a workflow for the integration of data from multiple CNV calling algorithms using the same array results. It uses four CNV calling programs: PennCNV (PC), Affymetrix® Genotyping Console™ (AGC), Partek® Genomics Suite™ (PGS) and Golden Helix SVS™ (GH) to analyze CEL files from the Affymetrix® Human SNP 6.0 Array™. To assess the relative suitability of each program, we used individuals of known genetic relationships. We found significant differences in CNV calls obtained by different CNV calling programs. Although the programs showed variable patterns of CNVs in the same individuals, their distribution in individuals of different degrees of genetic relatedness has allowed us to offer two suggestions. The first involves the use of multiple algorithms for the detection of the largest possible number of CNVs, and the second suggests the use of PennCNV over all other methods when the use of only one software program is desirable.
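A minimal sketch of how calls from several CNV programs might be integrated, assuming a simple support-count rule with 50% reciprocal overlap; the study's actual workflow, thresholds, and use of familial relatedness are not reproduced here.

```python
# Hedged sketch: count, for each candidate CNV, how many callers report an
# overlapping call of the same state. The >=50% reciprocal-overlap rule and
# the anchoring on the first caller's calls are simplifying assumptions.
def reciprocal_overlap(a, b, min_frac=0.5):
    """True if intervals a=(start, end) and b overlap by >= min_frac of each."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    inter = max(0, hi - lo)
    return inter >= min_frac * (a[1] - a[0]) and inter >= min_frac * (b[1] - b[0])

def consensus_calls(callsets, min_callers=2):
    """callsets: dict caller -> list of (chrom, start, end, state) tuples."""
    consensus = []
    callers = list(callsets)
    for cnv in callsets[callers[0]]:
        support = 1 + sum(
            any(o[0] == cnv[0] and o[3] == cnv[3]
                and reciprocal_overlap(cnv[1:3], o[1:3])
                for o in callsets[other])
            for other in callers[1:])
        if support >= min_callers:
            consensus.append((cnv, support))
    return consensus

calls = {
    "PennCNV": [("chr1", 1000, 5000, "del")],
    "AGC":     [("chr1", 1200, 5200, "del")],
    "PGS":     [("chr2", 7000, 9000, "dup")],
}
print(consensus_calls(calls))   # chr1 deletion supported by two callers
```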
Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls
NASA Astrophysics Data System (ADS)
Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.
2015-10-01
Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.
Tracking fin whales in the northeast Pacific Ocean with a seafloor seismic network.
Wilcock, William S D
2012-10-01
Ocean bottom seismometer (OBS) networks represent a tool of opportunity to study fin and blue whales. A small OBS network on the Juan de Fuca Ridge in the northeast Pacific Ocean in ~2.3 km of water recorded an extensive data set of 20-Hz fin whale calls. An automated method has been developed to identify arrival times based on instantaneous frequency and amplitude and to locate calls using a grid search even in the presence of a few bad arrival times. When only one whale is calling near the network, tracks can generally be obtained up to distances of ~15 km from the network. When the calls from multiple whales overlap, user supervision is required to identify tracks. The absolute and relative amplitudes of arrivals and their three-component particle motions provide additional constraints on call location but are not useful for extending the distance to which calls can be located. The double-difference method inverts for changes in relative call locations using differences in residuals for pairs of nearby calls recorded on a common station. The method significantly reduces the unsystematic component of the location error, especially when inconsistencies in arrival time observations are minimized by cross-correlation.
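The arrival-time grid search described above can be illustrated in a few lines: scan candidate source positions, predict travel times from a nominal sound speed, and score each node with a robust misfit so that a few bad picks do not dominate. The geometry, sound speed, and median-based misfit are illustrative assumptions, not the paper's exact implementation.

```python
# Grid-search locator sketch: choose the node whose predicted arrival times
# best fit the observations despite one deliberately corrupted pick.
import numpy as np

C = 1.48          # nominal sound speed in water, km/s (assumed)
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

def predict_times(src):
    return np.linalg.norm(stations - src, axis=1) / C

# Synthetic call at (6, 3) km with one bad arrival time.
true_src = np.array([6.0, 3.0])
obs = predict_times(true_src)
obs[2] += 1.5                     # a "bad" pick

xs = ys = np.linspace(-2, 12, 141)
best, best_misfit = None, np.inf
for x in xs:
    for y in ys:
        pred = predict_times(np.array([x, y]))
        # Demeaning removes the unknown origin time; median resists outliers.
        r = (obs - pred) - np.median(obs - pred)
        misfit = np.median(np.abs(r))
        if misfit < best_misfit:
            best, best_misfit = (x, y), misfit

print("estimated source:", best)   # close to (6, 3) despite the bad pick
```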
Probabilistic assessment methodology for continuous-type petroleum accumulations
Crovelli, R.A.
2003-01-01
The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. © 2003 Elsevier B.V. All rights reserved.
DSP Synthesis Algorithm for Generating Florida Scrub Jay Calls
NASA Technical Reports Server (NTRS)
Lane, John; Pittman, Tyler
2017-01-01
A prototype digital signal processing (DSP) algorithm has been developed to approximate Florida scrub jay calls. The Florida scrub jay (Aphelocoma coerulescens), believed to have been in existence for 2 million years, living only in Florida, has a complicated social system that is evident by examining the spectrograms of its calls. Audio data were acquired at the Helen and Allan Cruickshank Sanctuary, Rockledge, Florida during the 2016 mating season using three digital recorders sampling at 44.1 kHz. The synthesis algorithm is a first step toward developing a robust identification and call analysis algorithm. Since the Florida scrub jay is severely threatened by loss of habitat, it is important to develop effective methods to monitor their threatened population using autonomous means.
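As a flavor of what such a synthesis algorithm might do, the sketch below generates a frequency-swept, amplitude-shaped note series at the study's 44.1 kHz sampling rate. The frequencies, envelope, and timing are invented placeholders, not measured scrub-jay call parameters.

```python
# Toy DSP synthesis: a linear chirp with a smooth envelope, repeated with
# gaps to mimic a simple call series, written out as a WAV file.
import numpy as np
from scipy.io import wavfile

fs = 44100                               # sampling rate used in the study
t = np.arange(int(0.25 * fs)) / fs       # one 250 ms note

f0, f1 = 1500.0, 4000.0                  # toy start/end frequencies, Hz
phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * t[-1]))  # linear chirp
env = np.sin(np.pi * t / t[-1]) ** 2     # smooth attack/decay envelope
note = env * np.sin(phase)

gap = np.zeros(int(0.1 * fs))
call = np.concatenate([note, gap, note, gap, note])
wavfile.write("toy_call.wav", fs, (0.8 * call * 32767).astype(np.int16))
```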
Best practices for evaluating single nucleotide variant calling methods for microbial genomics
Olson, Nathan D.; Lund, Steven P.; Colman, Rebecca E.; Foster, Jeffrey T.; Sahl, Jason W.; Schupp, James M.; Keim, Paul; Morrow, Jayne B.; Salit, Marc L.; Zook, Justin M.
2015-01-01
Innovations in sequencing technologies have allowed biologists to make incredible advances in understanding biological systems. As experience grows, researchers increasingly recognize that analyzing the wealth of data provided by these new sequencing platforms requires careful attention to detail for robust results. Thus far, much of the scientific community's focus in bacterial genomics has been on evaluating genome assembly algorithms and rigorously validating assembly program performance. Missing, however, is a focus on critical evaluation of variant callers for these genomes. Variant calling is essential for comparative genomics as it yields insights into nucleotide-level organismal differences. Variant calling is a multistep process with a host of potential error sources that may lead to incorrect variant calls. Identifying and resolving these incorrect calls is critical for bacterial genomics to advance. The goal of this review is to provide guidance on validating algorithms and pipelines used in variant calling for bacterial genomics. First, we will provide an overview of the variant calling procedures and the potential sources of error associated with the methods. We will then identify appropriate datasets for use in evaluating algorithms and describe statistical methods for evaluating algorithm performance. As variant calling moves from basic research to the applied setting, standardized methods for performance evaluation and reporting are required; it is our hope that this review provides the groundwork for the development of these standards. PMID:26217378
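One standard way to carry out the kind of performance evaluation the review calls for is to compare a caller's output against a truth set by position and allele and report precision, recall, and F1. This is a generic sketch, not the review's prescribed protocol.

```python
# Classify each call as a true/false positive against a truth set keyed by
# (chrom, pos, ref, alt) and compute the usual summary metrics.
def evaluate(truth, calls):
    """truth, calls: sets of (chrom, pos, ref, alt) tuples."""
    tp = len(truth & calls)
    fp = len(calls - truth)
    fn = len(truth - calls)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"TP": tp, "FP": fp, "FN": fn,
            "precision": precision, "recall": recall, "F1": f1}

truth = {("chr1", 100, "A", "G"), ("chr1", 250, "C", "T"), ("chr2", 50, "G", "A")}
calls = {("chr1", 100, "A", "G"), ("chr1", 300, "T", "C"), ("chr2", 50, "G", "A")}
print(evaluate(truth, calls))
```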
A short review of variants calling for single-cell-sequencing data with applications.
Wei, Zhuohui; Shu, Chang; Zhang, Changsheng; Huang, Jingying; Cai, Hongmin
2017-11-01
The field of single-cell sequencing is rapidly expanding, and many techniques have been developed in the past decade. With this technology, biologists can study not only the heterogeneity between two adjacent cells in the same tissue or organ, but also the evolutionary relationships and degenerative processes in a single cell. Calling variants is the main purpose of analyzing single-cell sequencing (SCS) data. Currently, some popular methods developed for bulk-cell-sequencing data analysis are applied directly to SCS data. However, SCS requires an extra step of genome amplification to accumulate enough material to satisfy sequencing needs. The amplification introduces large biases and thus raises challenges for using the bulk-cell-sequencing methods. This paper aims to bridge that gap by providing guidance both for the development of specialized analysis methods and for the use of currently available tools on SCS data. We first introduce two popular genome amplification methods and compare their capabilities. We then introduce a few popular models for calling single-nucleotide polymorphisms and copy-number variations. Finally, breakthrough applications of SCS are summarized to demonstrate its potential in researching cell evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.
Valtierra, Robert D; Glynn Holt, R; Cholewiak, Danielle; Van Parijs, Sofie M
2013-09-01
Multipath localization techniques have not previously been applied to baleen whale vocalizations due to difficulties in application to tonal vocalizations. Here it is shown that an autocorrelation method coupled with the direct-reflected time-difference-of-arrival localization technique can successfully resolve location information. A derivation was made to model the autocorrelation of a direct signal and its overlapping reflections to illustrate that an autocorrelation may be used to extract reflection information from longer duration signals containing a frequency sweep, such as some calls produced by baleen whales. An analysis was performed to characterize the difference in behavior of the autocorrelation when applied to call types with varying parameters (sweep rate, call duration). The method's feasibility was tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The method was then used to estimate the depth and range of a single North Atlantic right whale (Eubalaena glacialis) and humpback whale (Megaptera novaeangliae) from two separate experiments.
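The core idea, that the autocorrelation of a received frequency sweep exposes the direct-reflected delay, can be demonstrated numerically. The chirp band, duration, echo strength, and delay below are illustrative assumptions, not values from the experiments.

```python
# Autocorrelation of direct chirp + delayed, inverted surface echo: the
# secondary peak sits at the direct-reflected delay the TDOA step needs.
import numpy as np

fs = 2000.0                             # sample rate, Hz
t = np.arange(int(2.0 * fs)) / fs       # 2 s call
f0, f1 = 50.0, 150.0                    # swept band, Hz (toy)
call = np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * t[-1])))

delay_s = 0.35                          # true direct-reflected delay
d = int(delay_s * fs)
received = call.copy()
received[d:] += -0.6 * call[:-d]        # weaker, phase-inverted echo

ac = np.correlate(received, received, mode="full")[received.size - 1:]
ac /= ac[0]
skip = int(0.1 * fs)                    # skip the compressed main lobe
lag = np.argmax(np.abs(ac[skip:])) + skip
print("recovered delay: %.3f s" % (lag / fs))   # ~0.350 s
```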
ERIC Educational Resources Information Center
Ugwu, Romanus Iroabuchi
2012-01-01
The purpose of this mixed-methods study was to describe the perceptions of elementary teachers from an urban school district in Southern California regarding their inquiry-based science instructional practices, assessment methods and professional development. The district's inquiry professional development called the California Mathematics and…
Flight-Test Evaluation of Flutter-Prediction Methods
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Marty
2003-01-01
The flight-test community routinely spends considerable time and money to determine a range of flight conditions, called a flight envelope, within which an aircraft is safe to fly. The cost of determining a flight envelope could be greatly reduced if there were a method of safely and accurately predicting the speed associated with the onset of an instability called flutter. Several methods have been developed with the goal of predicting flutter speeds to improve the efficiency of flight testing. These methods include (1) data-based methods, in which one relies entirely on information obtained from the flight tests, and (2) model-based approaches, in which one relies on a combination of flight data and theoretical models. The data-driven methods include one based on extrapolation of damping trends, one that involves an envelope function, one that involves the Zimmerman-Weissenburger flutter margin, and one that involves a discrete-time auto-regressive model. An example of a model-based approach is that of the flutterometer. These methods have all been shown to be theoretically valid and have been demonstrated on simple test cases; however, until now, they have not been thoroughly evaluated in flight tests. An experimental apparatus called the Aerostructures Test Wing (ATW) was developed to test these prediction methods.
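The first data-driven method named above, extrapolation of damping trends, is simple enough to sketch: fit measured modal damping versus airspeed and extrapolate to the zero-damping crossing. The quadratic fit and all numbers below are illustrative assumptions, not flight-test data.

```python
# Damping-trend extrapolation: fit damping ratio vs airspeed and find where
# the fitted curve crosses zero (predicted flutter onset).
import numpy as np

speeds = np.array([150.0, 175.0, 200.0, 225.0, 250.0])   # knots, test points
damping = np.array([0.062, 0.055, 0.043, 0.028, 0.011])  # modal damping ratio

coeffs = np.polyfit(speeds, damping, deg=2)   # damping(V) ~ a V^2 + b V + c
roots = np.roots(coeffs)
flutter = min(r.real for r in roots
              if abs(r.imag) < 1e-9 and r.real > speeds[-1])
print("predicted flutter onset: %.0f knots" % flutter)    # ~260-265 knots
```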
PROFIL: A Method for the Development of Multimedia.
ERIC Educational Resources Information Center
Koper, Rob
1995-01-01
Describes a dedicated method for the design of multimedia courseware, called PROFIL, which integrates instructional design with software engineering techniques and incorporates media selection in the design methodology. The phases of development are outlined: preliminary investigation, definition, script, technical realization, implementation, and…
Essentials of Suggestopedia: A Primer for Practitioners.
ERIC Educational Resources Information Center
Caskey, Owen L.; Flake, Muriel H.
Suggestology is the scientific study of the psychology of suggestion, and Suggestopedia is the application of relaxation and suggestion techniques to learning. The approach to learning processes (called Suggestopedic) developed by Dr. Georgi Lozanov (called the Lozanov Method) utilizes mental and physical relaxation, deep breathing,…
Biofouling development on plasma treated samples versus layers coated samples
NASA Astrophysics Data System (ADS)
Hnatiuc, B.; Exnar, P.; Sabau, A.; Spatenka, P.; Dumitrache, C. L.; Hnatiuc, M.; Ghita, S.
2016-12-01
Biofouling is the most important cause of naval corrosion. In order to reduce biofouling development on naval materials such as steel or resin, different new methods have been tested. These methods could help comply with the new IMO environmental regulations, and they could replace a few conventional operations performed before the painting of small ships. The replacement of these operations means a reduction in maintenance costs. Their action must influence especially the first two stages of biofouling development, called microfouling, which take about 24 hours. This work presents comparative results of biofouling development on two classic naval materials, steel and resin, for three treated samples immersed in sea water. Non-thermal plasma, produced by GlidArc technology, is applied to the first sample, called GD. The plasma treatment was set to 10 minutes. The last two samples, called AE9 and AE10, are covered by hydrophobic layers prepared from a special organic-inorganic sol synthesized by the sol-gel method. Theoretically, because of the hydrophobic properties, biofouling formation should be delayed for AE9 and AE10. The biofouling development on each treated sample was compared with a non-treated control sample. The microbiological analyses were carried out over 24 hours by epifluorescence microscopy, available for a single layer.
Reinforcement learning for resource allocation in LEO satellite networks.
Usaha, Wipawee; Barria, Javier A
2007-06-01
In this paper, we develop and assess online decision-making algorithms for call admission and routing for low Earth orbit (LEO) satellite networks. It has been shown in a recent paper that, in a LEO satellite system, a semi-Markov decision process formulation of the call admission and routing problem can achieve better performance in terms of an average revenue function than existing routing methods. However, the conventional dynamic programming (DP) numerical solution becomes prohibitive as the problem size increases. In this paper, two solution methods based on reinforcement learning (RL) are proposed in order to circumvent the computational burden of DP. The first method is based on an actor-critic method with temporal-difference (TD) learning. The second method is based on a critic-only method, called optimistic TD learning. The algorithms enhance performance in terms of requirements in storage, computational complexity and computational time, and in terms of an overall long-term average revenue function that penalizes blocked calls. Numerical studies are carried out, and the results obtained show that the RL framework can achieve up to 56% higher average revenue over existing routing methods used in LEO satellite networks with reasonable storage and computational requirements.
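As a rough illustration of the critic-only TD idea, the toy below learns a tabular value function for a single link and admits a call only when the estimated future value supports it. The one-link model, rates, and rewards are invented for illustration and are far simpler than the LEO admission-and-routing problem in the paper.

```python
# Tabular TD(0) critic for a toy call-admission problem on one link.
import random

CAPACITY = 10
ARRIVE, DEPART = 0.6, 0.4       # per-slot event probabilities (assumed)
REWARD = 1.0                    # revenue per admitted call (assumed)
alpha, gamma = 0.05, 0.98

V = [0.0] * (CAPACITY + 1)      # value per number of calls in progress
state = 0
random.seed(1)
for _ in range(200_000):
    if random.random() < ARRIVE:
        if state < CAPACITY:
            # Admit iff immediate revenue plus next-state value beats refusing.
            admit = REWARD + gamma * V[state + 1] >= gamma * V[state]
        else:
            admit = False       # full link: the call is blocked
        next_state = state + 1 if admit else state
        r = REWARD if admit else 0.0
    else:
        next_state = max(0, state - (1 if random.random() < DEPART else 0))
        r = 0.0
    # TD(0) update of the critic.
    V[state] += alpha * (r + gamma * V[next_state] - V[state])
    state = next_state

print(["%.1f" % v for v in V])
```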
ERIC Educational Resources Information Center
Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse
2015-01-01
The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…
Renting Rooms in Three Canadian Cities: Accepting and Rejecting the AIDS Patient.
ERIC Educational Resources Information Center
Page, Stewart
Following methods previously developed, this study investigated the social stigma associated with Acquired Immunodeficiency Syndrome (AIDS) by placing 90 telephone calls to landlords advertising rooms for rent in each of three Canadian cities: Windsor, Toronto, and Halifax. Compared to control conditions, calls ostensibly from AIDS patients were…
Inclusive Education National Research Advocacy Agenda: A Call to Action
ERIC Educational Resources Information Center
Morningstar, Mary E.; Allcock, Heather C.; White, Julia M.; Taub, Deborah; Kurth, Jennifer A.; Gonsier-Gerdin, Jean; Ryndak, Diane L.; Sauer, Janet; Jorgensen, Cheryl M.
2016-01-01
The TASH Inclusive Education National Committee responded to Horner and Dunlap's call to ensure that future research integrates inclusive values with strong science by developing an inclusive education national research advocacy agenda. Qualitative methods were implemented to answer three questions: (a) "What is the state of inclusive…
The development of a super-fine-grained nuclear emulsion
NASA Astrophysics Data System (ADS)
Asada, Takashi; Naka, Tatsuhiro; Kuwabara, Ken-ichi; Yoshimoto, Masahiro
2017-06-01
A nuclear emulsion with micronized crystals is required for the tracking detection of submicron ionizing-particle tracks, which are among the targets of dark-matter detection and other techniques. We found that a new production method, called the PVA-gelatin mixing method (PGMM), could effectively control crystal size from 20 nm to 50 nm. We called the two types of emulsion produced with the new method the nano imaging tracker and the ultra-nano imaging tracker. Their composition and spatial resolution were measured, and the results indicate that these emulsions detect extremely short tracks.
Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations
NASA Technical Reports Server (NTRS)
Sorensen, Danny C.
1996-01-01
Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
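The Implicitly Restarted Arnoldi Method described here is the algorithm implemented in the ARPACK library, which SciPy wraps as scipy.sparse.linalg.eigs, so a large sparse nonsymmetric eigenproblem can be solved in a few lines. The convection-diffusion operator below is an illustrative example.

```python
# A few eigenvalues of a large sparse nonsymmetric operator via ARPACK's
# implicitly restarted Arnoldi iteration (scipy.sparse.linalg.eigs).
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

n = 10_000
# 1-D convection-diffusion finite differences: nonsymmetric tridiagonal.
main = 2.0 * np.ones(n)
lower = -1.2 * np.ones(n - 1)   # diffusion + convection bias
upper = -0.8 * np.ones(n - 1)
A = sp.diags([lower, main, upper], offsets=[-1, 0, 1], format="csr")

# Six eigenvalues of largest magnitude, without ever forming a dense matrix.
vals, vecs = eigs(A, k=6, which="LM")
print(np.sort_complex(vals))
```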
Bezombes, Lucie; Gaucherand, Stéphanie; Kerbiriou, Christian; Reinert, Marie-Eve; Spiegelberger, Thomas
2017-08-01
In many countries, biodiversity compensation is required to counterbalance negative impacts of development projects on biodiversity by carrying out ecological measures, called offsets when the goal is to reach "no net loss" of biodiversity. One main issue is to ensure that offset gains are equivalent to impact-related losses. Ecological equivalence is assessed with ecological equivalence assessment methods taking into account a range of key considerations that we summarize as ecological, spatial, temporal, and uncertainty. When equivalence assessment methods take all of these considerations into account, we call them "comprehensive". Equivalence assessment methods should also aim to be science-based and operational, which is challenging. Many equivalence assessment methods have been developed worldwide but none is fully satisfying. In the present study, we examine 13 equivalence assessment methods in order to identify (i) their general structure and (ii) the synergies and trade-offs between equivalence assessment method characteristics related to operationality, scientific basis and comprehensiveness (called "challenges" in this paper). We evaluate each equivalence assessment method on the basis of 12 criteria describing the level of achievement of each challenge. We observe that all equivalence assessment methods share a general structure, with possible improvements in the choice of target biodiversity, the indicators used, the integration of landscape context and the multipliers reflecting time lags and uncertainties. We show that no equivalence assessment method combines all challenges perfectly. There are trade-offs between and within the challenges: operationality tends to be favored while scientific bases are integrated heterogeneously in equivalence assessment method development. One way of improving the combination of challenges would be the use of offset-dedicated databases providing scientific feedback on previous offset measures.
CSM Testbed Development and Large-Scale Structural Applications
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.
1989-01-01
A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Allele-specific copy-number discovery from whole-genome and whole-exome sequencing
Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J.; Szatkiewicz, Jin P.
2015-01-01
Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol is freely available at https://sourceforge.net/projects/asgenseng/. PMID:25883151
NASA Astrophysics Data System (ADS)
Nakamura, Yusuke; Hoshizawa, Taku
2016-09-01
Two methods for increasing the data capacity of a holographic data storage system (HDSS) were developed. The first method is called “run-length-limited (RLL) high-density recording”. An RLL modulation has the same effect as enlarging the pixel pitch; namely, it optically reduces the hologram size. Accordingly, the method doubles the raw-data recording density. The second method is called “RLL turbo signal processing”. The RLL turbo code consists of RLL(1,∞) trellis modulation and an optimized convolutional code. The remarkable point of the developed turbo code is that it employs the RLL modulator and demodulator as parts of the error-correction process. The turbo code improves the capability of error correction more than a conventional LDPC code, even though interpixel interference is generated. These two methods will increase the data density 1.78-fold. Moreover, by simulation and experiment, a data density of 2.4 Tbit/in² is confirmed.
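The d=1 run-length constraint behind RLL(1,∞) can be shown compactly: no two ON pixels may be adjacent, which is what lets the code mimic an enlarged pixel pitch. The naive fixed-rate mapping below is an illustrative stand-in, not the paper's trellis modulation.

```python
# The (1, infinity) run-length constraint: at least one 0 between any two 1s.
def satisfies_rll_1_inf(bits):
    """Check the d=1 constraint: no two adjacent 1s."""
    return all(not (a == 1 and b == 1) for a, b in zip(bits, bits[1:]))

def encode_rll_1_inf(data_bits):
    """Trivial rate-1/2 mapping: 0 -> 00, 1 -> 10; output always satisfies d=1."""
    out = []
    for b in data_bits:
        out += [b, 0]
    return out

data = [1, 0, 1, 1, 0, 1]
coded = encode_rll_1_inf(data)
assert satisfies_rll_1_inf(coded)
print(coded)   # [1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0]
```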
Browning, Brian L.; Yu, Zhaoxia
2009-01-01
We present a novel method for simultaneous genotype calling and haplotype-phase inference. Our method employs the computationally efficient BEAGLE haplotype-frequency model, which can be applied to large-scale studies with millions of markers and thousands of samples. We compare genotype calls made with our method to genotype calls made with the BIRDSEED, CHIAMO, GenCall, and ILLUMINUS genotype-calling methods, using genotype data from the Illumina 550K and Affymetrix 500K arrays. We show that our method has higher genotype-call accuracy and yields fewer uncalled genotypes than competing methods. We perform single-marker analysis of data from the Wellcome Trust Case Control Consortium bipolar disorder and type 2 diabetes studies. For bipolar disorder, the genotype calls in the original study yield 25 markers with apparent false-positive association with bipolar disorder at a p < 10⁻⁷ significance level, whereas genotype calls made with our method yield no associated markers at this significance threshold. Conversely, for markers with replicated association with type 2 diabetes, there is good concordance between genotype calls used in the original study and calls made by our method. Results from single-marker and haplotypic analysis of our method's genotype calls for the bipolar disorder study indicate that our method is highly effective at eliminating genotyping artifacts that cause false-positive associations in genome-wide association studies. Our new genotype-calling methods are implemented in the BEAGLE and BEAGLECALL software packages. PMID:19931040
Feasibility of digital imaging to characterize earth materials : part 2.
DOT National Transportation Integrated Search
2012-06-06
This study demonstrated the feasibility of digital imaging to characterize earth materials. Two rapid, relatively low cost image-based methods were developed for determining the grain size distribution of soils and aggregates. The first method, calle...
Feasibility of digital imaging to characterize earth materials : part 6.
DOT National Transportation Integrated Search
2012-06-06
This study demonstrated the feasibility of digital imaging to characterize earth materials. Two rapid, relatively low cost image-based methods were developed for determining the grain size distribution of soils and aggregates. The first method, calle...
Feasibility of digital imaging to characterize earth materials : part 3.
DOT National Transportation Integrated Search
2012-06-06
This study demonstrated the feasibility of digital imaging to characterize earth materials. Two rapid, relatively low cost image-based methods were developed for determining the grain size distribution of soils and aggregates. The first method, calle...
Indicators and Metrics for Evaluating the Sustainability of Chemical Processes
A metric-based method, called GREENSCOPE, has been developed for evaluating process sustainability. Using lab-scale information and engineering assumptions, the method evaluates full-scale representations of processes in environmental, efficiency, energy and economic areas. The m...
Feasibility of digital imaging to characterize earth materials : part 1.
DOT National Transportation Integrated Search
2012-06-06
This study demonstrated the feasibility of digital imaging to characterize earth materials. Two rapid, relatively low cost image-based methods were developed for determining the grain size distribution of soils and aggregates. The first method, calle...
Feasibility of digital imaging to characterize earth materials : part 4.
DOT National Transportation Integrated Search
2012-06-06
This study demonstrated the feasibility of digital imaging to characterize earth materials. Two rapid, relatively low cost image-based methods were developed for determining the grain size distribution of soils and aggregates. The first method, calle...
Feasibility of digital imaging to characterize earth materials : part 5.
DOT National Transportation Integrated Search
2012-05-06
This study demonstrated the feasibility of digital imaging to characterize earth materials. Two rapid, relatively low cost image-based methods were developed for determining the grain size distribution of soils and aggregates. The first method, calle...
Allele-specific copy-number discovery from whole-genome and whole-exome sequencing.
Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J; Szatkiewicz, Jin P
2015-08-18
Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol is freely available at https://sourceforge.net/projects/asgenseng/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve
2010-02-01
In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. In particular, enhanced resolution techniques are developed and/or used in this specific context for the first time. When compared to traditional time-frequency representation methods, the proposed techniques are more capable of revealing previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study focuses on bat echolocation recordings, the results are more general and applicable to many other types of signal.
METHODS FOR EVALUATING THE SUSTAINABILITY OF GREEN PROCESSES
A methodology, called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator), has been developed in the U.S. EPA's Office of Research and Development to directly compare the sustainability of proces...
NASA Astrophysics Data System (ADS)
Helble, Tyler Adam
Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. Automated methods are needed to aid in the analyses of the recorded data. When a mammal vocalizes in the marine environment, the received signal is a filtered version of the original waveform emitted by the marine mammal. The waveform is reduced in amplitude and distorted due to propagation effects that are influenced by the bathymetry and environment. It is important to account for these effects to determine a site-specific probability of detection for marine mammal calls in a given study area. A knowledge of that probability function over a range of environmental and ocean noise conditions allows vocalization statistics from recordings of single, fixed, omnidirectional sensors to be compared across sensors and at the same sensor over time with less bias and uncertainty in the results than direct comparison of the raw statistics. This dissertation focuses on both the development of new tools needed to automatically detect humpback whale vocalizations from single-fixed omnidirectional sensors as well as the determination of the site-specific probability of detection for monitoring sites off the coast of California. Using these tools, detected humpback calls are "calibrated" for environmental properties using the site-specific probability of detection values, and presented as call densities (calls per square kilometer per time). A two-year monitoring effort using these calibrated call densities reveals important biological and ecological information on migrating humpback whales off the coast of California. Call density trends are compared between the monitoring sites and at the same monitoring site over time. Call densities also are compared to several natural and human-influenced variables including season, time of day, lunar illumination, and ocean noise. The results reveal substantial differences in call densities between the two sites which were not noticeable using uncorrected (raw) call counts. Additionally, a Lombard effect was observed for humpback whale vocalizations in response to increasing ocean noise. The results presented in this thesis develop techniques to accurately measure marine mammal abundances from passive acoustic sensors.
Long-range acoustic detection and localization of blue whale calls in the northeast Pacific Ocean.
Stafford, K M; Fox, C G; Clark, D S
1998-12-01
Analysis of acoustic signals recorded from the U.S. Navy's SOund SUrveillance System (SOSUS) was used to detect and locate blue whale (Balaenoptera musculus) calls offshore in the northeast Pacific. The long, low-frequency components of these calls are characteristic of calls recorded in the presence of blue whales elsewhere in the world. Mean values for frequency and time characteristics from field-recorded blue whale calls were used to develop a simple matched filter for detecting such calls in noisy time series. The matched filter was applied to signals from three different SOSUS arrays off the coast of the Pacific Northwest to detect and associate individual calls from the same animal on the different arrays. A U.S. Navy maritime patrol aircraft was directed to an area where blue whale calls had been detected on SOSUS using these methods, and the presence of a vocalizing blue whale was confirmed at the site with field recordings from sonobuoys.
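A matched filter of the kind described, a template built from mean call frequency and duration correlated against a noisy record, takes only a few lines. The 17 Hz tonal template, call amplitude, and noise level below are illustrative stand-ins for the paper's measured values.

```python
# Matched-filter detection of a low-frequency tonal call buried in noise.
import numpy as np

fs = 100.0                                   # sample rate, Hz
tmpl_t = np.arange(int(15 * fs)) / fs        # ~15 s call template
template = np.sin(2 * np.pi * 17.0 * tmpl_t) # low-frequency tonal component

rng = np.random.default_rng(2)
record = rng.normal(0, 1.0, int(600 * fs))   # 10 min of noise
onset = int(240 * fs)
record[onset:onset + template.size] += 0.3 * template   # buried call

mf = np.correlate(record, template, mode="valid")
mf /= np.sqrt(np.sum(template**2) * np.mean(record**2))  # rough normalization
peak = np.argmax(np.abs(mf))
print("detected onset: %.1f s (true 240.0 s)" % (peak / fs))
```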
On fixed-area plot sampling for downed coarse woody debris
Jeffrey H. Gove; Paul C. Van Deusen
2011-01-01
The use of fixed-area plots for sampling down coarse woody debris is reviewed. A set of clearly defined protocols for two previously described methods is established and a new method, which we call the 'sausage' method, is developed. All methods (protocols) are shown to be unbiased for volume estimation, but not necessarily for estimation of population...
ERIC Educational Resources Information Center
Howard, Lyz
2016-01-01
As an experienced face-to-face teacher, working in a small Crown Dependency with no Higher Education Institute (HEI) to call its own, the subsequent geographical and professional isolation in the context of Networked Learning (NL), as a sub-set of eLearning, calls for innovative ways in which to develop self-reliant methods of professional…
Developing Preschoolers' Social Skills: The Effectiveness of Two Educational Methods
ERIC Educational Resources Information Center
Smogorzewska, Joanna; Szumski, Grzegorz
2018-01-01
This study tested whether and how methods called 'Play Time/Social Time' and 'I Can Problem Solve' contribute to the improvement of social skills and the development of theory of mind (ToM) in children. The participants in the experiment were nearly 200 (N = 196) preschool children with low social functioning, with and without disabilities. The…
Women in History--Maria Montessori
ERIC Educational Resources Information Center
Zierdt, Ginger L.
2007-01-01
This article profiles Maria Montessori, an international ambassador for children who became known for her theories and methods of pedagogy, called the "Montessori Method." Montessori developed an educational theory which combined ideas of the scholar Froebel, the anthropologist Giuseppe Sergi, and the French physicians Jean Itard and Édouard Séguin,…
Mishara, Brian L; Chagnon, François; Daigle, Marc; Balan, Bogdan; Raymond, Sylvaine; Marcoux, Isabelle; Bardon, Cécile; Campbell, Julie K; Berman, Alan
2007-06-01
A total of 2,611 calls to 14 helplines were monitored to observe helper behaviors and caller characteristics and changes during the calls. The relationship between intervention characteristics and call outcomes are reported for 1,431 crisis calls. Empathy and respect, as well as factor-analytically derived scales of supportive approach and good contact and collaborative problem solving were significantly related to positive outcomes, but not active listening. We recommend recruitment of helpers with these characteristics, development of standardized training in those methods that are empirically shown to be effective, and the need for research relating short-term outcomes to long-term effects.
Selected Aspects of the eCall Emergency Notification System
NASA Astrophysics Data System (ADS)
Kaminski, Tomasz; Nowacki, Gabriel; Mitraszewska, Izabella; Niezgoda, Michał; Kruszewski, Mikołaj; Kaminska, Ewa; Filipek, Przemysław
2012-02-01
The article describes problems associated with road collision detection for the purpose of the automatic emergency call. At the moment a collision is detected, the eCall device installed in the vehicle will automatically make contact with the Emergency Notification Centre and send a set of essential information on the vehicle and the place of the accident. The information about the deployment of the airbags will not be used to activate the alarm, because connection of the eCall device might interfere with the vehicle's safety systems. It is therefore necessary to develop a method enabling detection of a road collision, similar to the one used in airbag systems and based on the signals available from the acceleration sensors.
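A minimal sketch of the kind of acceleration-based trigger the article argues for, assuming an invented severity threshold and window; the real eCall triggering logic and calibration values are not specified here.

```python
# Declare a collision when the smoothed acceleration magnitude exceeds a
# severity threshold over a short window, similar in spirit to airbag logic.
# The 8 g threshold and 30 ms window are illustrative assumptions.
import numpy as np

fs = 1000                      # accelerometer sample rate, Hz (assumed)
G = 9.81

def collision_detected(acc_xyz, thresh_g=8.0, window_ms=30):
    """acc_xyz: (N, 3) acceleration samples in m/s^2."""
    mag = np.linalg.norm(acc_xyz, axis=1) / G          # magnitude in g
    w = int(window_ms * fs / 1000)
    kernel = np.ones(w) / w
    smoothed = np.convolve(mag, kernel, mode="valid")  # moving average
    return bool((smoothed > thresh_g).any())

rng = np.random.default_rng(3)
# Normal driving: ~0.3 g fluctuations -> no trigger.
normal = rng.normal(0, 0.3 * G, (2000, 3))
# Crash pulse: 60 ms at ~20 g on the longitudinal axis -> trigger.
crash = normal.copy()
crash[1000:1060, 0] += 20.0 * G
print(collision_detected(normal), collision_detected(crash))  # False True
```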
ERIC Educational Resources Information Center
Morris, Michael Lane; Storberg-Walker, Julia; McMillan, Heather S.
2009-01-01
This article presents a new model, generated through applied theory-building research methods, that helps human resource development (HRD) practitioners evaluate the return on investment (ROI) of organization development (OD) interventions. This model, called organization development human-capital accounting system (ODHCAS), identifies…
Quick, Joshua; Quinlan, Aaron R; Loman, Nicholas J
2014-01-01
The MinION™ is a new, portable single-molecule sequencer developed by Oxford Nanopore Technologies. It measures four inches in length and is powered from the USB 3.0 port of a laptop computer. The MinION™ measures the change in current resulting from DNA strands interacting with a charged protein nanopore. These measurements can then be used to deduce the underlying nucleotide sequence. We present a read dataset from whole-genome shotgun sequencing of the model organism Escherichia coli K-12 substr. MG1655 generated on a MinION™ device during the early-access MinION™ Access Program (MAP). Sequencing runs of the MinION™ are presented, one generated using R7 chemistry (released in July 2014) and one using R7.3 (released in September 2014). Base-called sequence data are provided to demonstrate the nature of data produced by the MinION™ platform and to encourage the development of customised methods for alignment, consensus and variant calling, de novo assembly and scaffolding. FAST5 files containing event data within the HDF5 container format are provided to assist with the development of improved base-calling methods.
A Machine Learning Method for Power Prediction on the Mobile Devices.
Chen, Da-Ren; Chen, You-Shyang; Chen, Lin-Chih; Hsu, Ming-Yang; Chiang, Kai-Feng
2015-10-01
Energy profiling and estimation have been popular areas of research in multicore mobile architectures. While short sequences of system calls have been recognized by machine learning as pattern descriptions for anomaly detection, the power consumption of running processes with respect to system-call patterns is not well studied. In this paper, we propose a fuzzy neural network (FNN) for training and analyzing process execution behaviour with respect to series of system calls, their parameters and their power consumption. On the basis of the patterns of a series of system calls, we develop a power estimation daemon (PED) to analyze and predict the energy consumption of the running process. In the initial stage, PED categorizes sequences of system calls into functional groups and predicts their energy consumption by FNN. In the operational stage, PED is applied to identify the predefined sequences of system calls invoked by running processes and estimates their energy consumption.
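The representation this approach depends on, short system-call sequences, can be sketched as n-gram counts that a trained model maps to energy. The trace, the 2-gram choice, and the per-pattern costs below are illustrative assumptions, not the paper's FNN.

```python
# Summarize a process trace as counts of consecutive system-call n-grams,
# then score it with a toy per-pattern energy table standing in for a
# trained model.
from collections import Counter

def ngram_features(trace, n=2):
    """Counts of consecutive n-grams in a system-call trace."""
    return Counter(tuple(trace[i:i + n]) for i in range(len(trace) - n + 1))

# Toy per-pattern energy costs (millijoules) a trained model might encode.
ENERGY_MJ = {("open", "read"): 0.9, ("read", "read"): 0.4,
             ("read", "close"): 0.6, ("write", "write"): 1.1}

def estimate_energy(trace, default_mj=0.2):
    feats = ngram_features(trace)
    return sum(ENERGY_MJ.get(g, default_mj) * c for g, c in feats.items())

trace = ["open", "read", "read", "read", "close"]
print(ngram_features(trace))
print("estimated energy: %.1f mJ" % estimate_energy(trace))
```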
Task Based Language Teaching: Development of CALL
ERIC Educational Resources Information Center
Anwar, Khoirul; Arifani, Yudhi
2016-01-01
The dominant complexities of English teaching in Indonesia are about limited development of teaching methods and materials which still cannot optimally reflect students' needs (in particular of how to acquire knowledge and select the most effective learning models). This research is to develop materials with complete task-based activities by using…
USDA-ARS?s Scientific Manuscript database
Market demands for cotton varieties with improved fiber properties also call for the development of fast, reliable analytical methods for monitoring fiber development and measuring their properties. Currently, cotton breeders rely on instrumentation that can require significant amounts of sample, w...
ERIC Educational Resources Information Center
Cobb, Janice Lynn
2017-01-01
Accounting professionals have consistently called for educators to develop curriculum designed to encourage students to develop intellectual skills. The purpose of this action research study was to develop and implement an instructional method that requires intermediate financial accounting (IFA) students to consistently practice higher order…
Using KIE To Help Students Develop Shared Criteria for House Designs.
ERIC Educational Resources Information Center
Cuthbert, Alex; Hoadley, Christopher M.
How can students develop shared criteria for problems that have no "right" answer? Ill-structured problems of this sort are called design problems. Like portfolio projects, these problems are difficult to evaluate for both teachers and students. This investigation contrasts two methods for developing shared criteria for project…
Development of EPA OTM 10 for Landfill Applications
In 2006, the U.S. Environmental Protection Agency posted a new test method on its website called OTM 10 which describes direct measurement of pollutant mass emission flux from area sources using ground-based optical remote sensing. The method has validated application to relative...
Analysis methods for tocopherols and tocotrienols
USDA-ARS?s Scientific Manuscript database
Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...
Linking Neuroscience and Psychoanalysis.
ERIC Educational Resources Information Center
Habicht, Manuela H.
This review discusses the relationship between neuroscience and psychoanalysis and introduces a new scientific method called neuro-psychoanalysis, a combination of the two phenomena. A significant difference between the two is that psychoanalysis has not evolved scientifically since it has not developed objective methods for testing ideas that it…
NASA Astrophysics Data System (ADS)
Unger, André J. A.
2010-02-01
This work is the first installment in a two-part series, and focuses on the development of a numerical PDE approach to price components of a Bermudan-style callable catastrophe (CAT) bond. The bond is based on two underlying stochastic variables; the PCS index which posts quarterly estimates of industry-wide hurricane losses as well as a single-factor CIR interest rate model for the three-month LIBOR. The aggregate PCS index is analogous to losses claimed under traditional reinsurance in that it is used to specify a reinsurance layer. The proposed CAT bond model contains a Bermudan-style call feature designed to allow the reinsurer to minimize their interest rate risk exposure on making substantial fixed coupon payments using capital from the reinsurance premium. Numerical PDE methods are the fundamental strategy for pricing early-exercise constraints, such as the Bermudan-style call feature, into contingent claim models. Therefore, the objective and unique contribution of this first installment in the two-part series is to develop a formulation and discretization strategy for the proposed CAT bond model utilizing a numerical PDE approach. Object-oriented code design is fundamental to the numerical methods used to aggregate the PCS index, and implement the call feature. Therefore, object-oriented design issues that relate specifically to the development of a numerical PDE approach for the component of the proposed CAT bond model that depends on the PCS index and LIBOR are described here. Formulation, numerical methods and code design issues that relate to aggregating the PCS index and introducing the call option are the subject of the companion paper.
ERIC Educational Resources Information Center
Osterman, Dean
This chapter explains how the Guided Design method of teaching can be used to solve problems, and how this method was used in the development of a new method of teaching. Called the Feedback Lecture, this method is illustrated through an example, and research data on its effectiveness is presented. The Guided Decision-Making Process is also…
LEARNING TO READ SCIENTIFIC RUSSIAN BY THE THREE QUESTION EXPERIMENTAL (3QX) METHOD.
ERIC Educational Resources Information Center
ALFORD, M.H.T.
A NEW METHOD FOR LEARNING TO READ TECHNICAL LITERATURE IN A FOREIGN LANGUAGE IS BEING DEVELOPED AND TESTED AT THE LANGUAGE CENTRE OF THE UNIVERSITY OF ESSEX, COLCHESTER, ENGLAND. THE METHOD IS CALLED "THREE QUESTION EXPERIMENTAL METHOD (3QX)," AND IT HAS BEEN USED IN THREE COURSES FOR TEACHING SCIENTIFIC RUSSIAN TO PHYSICISTS. THE THREE…
Combining Project Management Methods: A Case Study of Distributed Work Practices
NASA Astrophysics Data System (ADS)
Backlund, Per; Lundell, Björn
The increasing complexity of information systems development (ISD) projects calls for improved project management practices. This, together with an endeavour to improve the success rate of ISD projects (Lyytinen and Robey 1999; Cooke-Davies 2002; White and Fortune 2002), has served as a driver for various efforts in process improvement such as the introduction of new development methods (Fitzgerald 1997; Iivari and Maansaari 1998).
Efficient multifeature index structures for music data retrieval
NASA Astrophysics Data System (ADS)
Lee, Wegin; Chen, Arbee L. P.
1999-12-01
In this paper, we propose four index structures for music data retrieval. Based on suffix trees, we develop two index structures called the combined suffix tree and the independent suffix trees. These methods still show shortcomings for some search functions. Hence we develop another index, called the Twin Suffix Trees, to overcome these problems. However, the Twin Suffix Trees lack scalability when the amount of music data becomes large. Therefore we propose a fourth index, called the Grid-Twin Suffix Trees, to provide scalability and flexibility for a large amount of music data. For each index, we can use different search functions, such as exact search and approximate search, on different music features, such as melody, rhythm, or both. We compare the performance of the different search functions applied to each index structure through a series of experiments.
Harris, Scott H.; Johnson, Joel A.; Neiswanger, Jeffery R.; Twitchell, Kevin E.
2004-03-09
The present invention includes systems configured to distribute a telephone call, communication systems, communication methods and methods of routing a telephone call to a customer service representative. In one embodiment of the invention, a system configured to distribute a telephone call within a network includes a distributor adapted to connect with a telephone system, the distributor being configured to connect a telephone call using the telephone system and output the telephone call and associated data of the telephone call; and a plurality of customer service representative terminals connected with the distributor, a selected customer service representative terminal being configured to receive the telephone call and the associated data, and the distributor and the selected customer service representative terminal being configured to synchronize application of the telephone call and associated data from the distributor to the selected customer service representative terminal.
Three novel approaches to structural identifiability analysis in mixed-effects models.
Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D
2016-05-06
Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Peterson, Ivars
1991-01-01
A method that enables people to obtain the benefits of statistics and probability theory without the shortcomings of conventional methods is described; it is free of mathematical formulas and is easy to understand and use. A resampling technique called the "bootstrap" is discussed in terms of application and development. (KR)
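As a hedged illustration of the technique the article describes, this sketch computes a percentile-bootstrap confidence interval for a sample mean; the data and parameters are invented for the example:

```python
import random

def bootstrap_ci(data, stat=lambda xs: sum(xs) / len(xs),
                 n_boot=10_000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic of the data."""
    rng = random.Random(seed)
    n = len(data)
    # Resample the data with replacement, recomputing the statistic each time.
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

heights = [1.62, 1.70, 1.68, 1.75, 1.59, 1.81, 1.66, 1.73]
print(bootstrap_ci(heights))   # an interval around the sample mean
```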
Design and Effects of Scenario Educational Software.
ERIC Educational Resources Information Center
Keegan, Mark
1993-01-01
Describes the development of educational computer software called scenario software that was designed to incorporate advances in cognitive, affective, and physiological research. Instructional methods are outlined; the need to change from didactic methods to discovery learning is explained; and scenario software design features are discussed. (24…
Analyzing Students' Learning in Classroom Discussions about Socioscientific Issues
ERIC Educational Resources Information Center
Rudsberg, Karin; Ohman, Johan; Ostman, Leif
2013-01-01
In this study, the purpose is to develop and illustrate a method that facilitates investigations of students' learning processes in classroom discussions about socioscientific issues. The method, called transactional argumentation analysis, combines a transactional perspective on meaning making based on John Dewey's pragmatic philosophy and an…
Wright, Mark H.; Tung, Chih-Wei; Zhao, Keyan; Reynolds, Andy; McCouch, Susan R.; Bustamante, Carlos D.
2010-01-01
Motivation: The development of new high-throughput genotyping products requires a significant investment in testing and training samples to evaluate and optimize the product before it can be used reliably on new samples. One reason for this is that current methods for automated calling of genotypes are based on clustering approaches which require a large number of samples to be analyzed simultaneously, or an extensive training dataset to seed clusters. In systems where inbred samples are of primary interest, current clustering approaches perform poorly due to the inability to clearly identify a heterozygote cluster. Results: As part of the development of two custom single nucleotide polymorphism genotyping products for Oryza sativa (domestic rice), we have developed a new genotype calling algorithm called ‘ALCHEMY’ based on statistical modeling of the raw intensity data rather than modelless clustering. A novel feature of the model is the ability to estimate and incorporate inbreeding information on a per-sample basis, allowing accurate genotyping of both inbred and heterozygous samples even when analyzed simultaneously. Since clustering is not used explicitly, ALCHEMY performs well on small sample sizes with accuracy exceeding 99% with as few as 18 samples. Availability: ALCHEMY is available for both commercial and academic use free of charge and distributed under the GNU General Public License at http://alchemy.sourceforge.net/ Contact: mhw6@cornell.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20926420
ERIC Educational Resources Information Center
Akerson, Valarie L.; Carter, Ingrid S.; Park Rogers, Meredith A.; Pongsanon, Khemmawadee
2018-01-01
In this mixed methods study, the researchers developed a video-based measure called a "Prediction Assessment" to determine preservice elementary teachers' abilities to predict students' scientific reasoning. The instrument is based on teachers' need to develop pedagogical content knowledge for teaching science. Developing a knowledge…
Operator function modeling: An approach to cognitive task analysis in supervisory control systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1987-01-01
In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).
Low Base-Substitution Mutation Rate in the Germline Genome of the Ciliate Tetrahymena thermophila
2016-09-15
generations of mutation accumulation (MA). We applied an existing mutation-calling pipeline and developed a new probabilistic mutation detection approach...noise introduced by mismapped reads. We used both our new method and an existing mutation-calling pipeline (Sung, Tucker, et al. 2012) to analyse the...and larger MA experiments will be required to confidently estimate the mutational spectrum of a species with such a low mutation rate. Materials and
Modeling Electromagnetic Scattering From Complex Inhomogeneous Objects
NASA Technical Reports Server (NTRS)
Deshpande, Manohar; Reddy, C. J.
2011-01-01
This software innovation is designed to develop a mathematical formulation to estimate the electromagnetic scattering characteristics of complex, inhomogeneous objects using the finite-element-method (FEM) and method-of-moments (MoM) concepts, as well as to develop a FORTRAN code called FEMOM3DS (Finite Element Method and Method of Moments for 3-Dimensional Scattering), which will implement the steps that are described in the mathematical formulation. Very complex objects can be easily modeled, and the operator of the code is not required to know the details of electromagnetic theory to study electromagnetic scattering.
Advances in the use of observed spatial patterns of catchment hydrological response
NASA Astrophysics Data System (ADS)
Grayson, Rodger B.; Blöschl, Günter; Western, Andrew W.; McMahon, Thomas A.
Over the past two decades there have been repeated calls for the collection of new data for use in developing hydrological science. The last few years have begun to bear fruit from the seeds sown by these calls, through increases in the availability and utility of remote sensing data, as well as the execution of campaigns in research catchments aimed at providing new data for advancing hydrological understanding and predictive capability. In this paper we discuss some philosophical considerations related to model complexity, data availability and predictive performance, highlighting the potential of observed patterns in moving the science and practice of catchment hydrology forward. We then review advances that have arisen from recent work on spatial patterns, including in the characterisation of spatial structure and heterogeneity, and the use of patterns for developing, calibrating and testing distributed hydrological models. We illustrate progress via examples using observed patterns of snow cover, runoff occurrence and soil moisture. Methods for the comparison of patterns are presented, illustrating how they can be used to assess hydrologically important characteristics of model performance. These methods include point-to-point comparisons, spatial relationships between errors and landscape parameters, transects, and optimal local alignment. It is argued that the progress made to date augurs well for future developments, but there is scope for improvements in several areas. These include better quantitative methods for pattern comparisons, better use of pattern information in data assimilation and modelling, and a call for improved archiving of data from field studies to assist in comparative studies for generalising results and developing fundamental understanding.
Calling behavior of blue and fin whales off California
NASA Astrophysics Data System (ADS)
Oleson, Erin Marie
Passive acoustic monitoring is an effective means for evaluating cetacean presence in remote regions and over long time periods, and may become an important component of cetacean abundance surveys. To use passive acoustic recordings for abundance estimation, an understanding of the behavioral ecology of cetacean calling is crucial. In this dissertation, I develop a better understanding of how blue (Balaenoptera musculus) and fin (B. physalus) whales use sound, with the goal of evaluating passive acoustic techniques for studying their populations. Both blue and fin whales produce several different call types, though the behavioral and environmental contexts of these calls have not been widely investigated. To better understand how calling is used by these whales off California I have employed both new technologies and traditional techniques, including acoustic recording tags, continuous long-term autonomous acoustic recordings, and simultaneous shipboard acoustic and visual surveys. The outcome of these investigations has led to several conclusions. The production of blue whale calls varies with sex, behavior, season, location, and time of day. Each blue whale call type has a distinct behavioral context, including a male-only bias in the production of song, a call type thought to function in reproduction, and the production of some calls by both sexes. Long-term acoustic records, when interpreted using all call types, provide a more accurate measure of the local seasonal presence of whales and of how they use the region annually, seasonally, and daily. The relative occurrence of different call types may indicate prime foraging habitat and the presence of different segments of the population. The proportion of animals heard calling changes seasonally and geographically relative to the number seen, indicating that the calibration of acoustic and visual surveys is complex and requires further study on the motivations behind call production and the behavior of calling whales. These findings will play a role in the future development of acoustic census methods and habitat studies for these species, and will provide baseline information for the determination of anthropogenic impacts on these populations.
NASA Technical Reports Server (NTRS)
Rosenbaum, J. S.
1971-01-01
Systems of ordinary differential equations in which the magnitudes of the eigenvalues (or time constants) vary greatly are commonly called stiff. Such systems of equations arise in nuclear reactor kinetics, the flow of chemically reacting gases, dynamics, control theory, circuit analysis, and other fields. The research reported here develops an A-stable numerical integration technique for solving stiff systems of ordinary differential equations. The method, called the generalized trapezoidal rule, is a modification of the trapezoidal rule; it is computationally more efficient than the trapezoidal rule when almost-discontinuous segments of the solution are being calculated.
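The generalized trapezoidal rule is a modification of the classical scheme; as background, here is a minimal sketch of the classical implicit trapezoidal rule with a Newton inner solve on a stiff scalar test problem (the problem and step size are invented for illustration, not taken from the report):

```python
import numpy as np

def trapezoidal_step(f, jac, t, y, h, newton_iters=8, tol=1e-12):
    """One step of the implicit trapezoidal rule, solved by Newton's method.

    Solves  y_new = y + (h/2) * (f(t, y) + f(t+h, y_new))  for y_new.
    """
    fy = f(t, y)
    y_new = y + h * fy                       # explicit Euler predictor
    I = np.eye(len(y))
    for _ in range(newton_iters):
        g = y_new - y - 0.5 * h * (fy + f(t + h, y_new))
        J = I - 0.5 * h * jac(t + h, y_new)
        dy = np.linalg.solve(J, -g)
        y_new = y_new + dy
        if np.linalg.norm(dy) < tol:
            break
    return y_new

# Stiff test problem: y' = -1000*(y - cos(t)); the large step stays stable
# because the trapezoidal rule is A-stable.
f = lambda t, y: np.array([-1000.0 * (y[0] - np.cos(t))])
jac = lambda t, y: np.array([[-1000.0]])
t, y, h = 0.0, np.array([0.0]), 0.05
for _ in range(20):
    y = trapezoidal_step(f, jac, t, y, h)
    t += h
print(t, y)   # y tracks cos(t) closely despite the large step
```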
ERIC Educational Resources Information Center
Wilcox, Amie K.; Shoulders, Catherine W.; Myers, Brian E.
2014-01-01
Calls for increased interdisciplinary education have led to the development of numerous teaching methods designed to help teachers provide meaningful experiences for their students. However, methods of guiding teachers in the successful adoption of innovative teaching methods are not firmly set. This qualitative study sought to better understand…
Kim, Junho; Maeng, Ju Heon; Lim, Jae Seok; Son, Hyeonju; Lee, Junehawk; Lee, Jeong Ho; Kim, Sangwoo
2016-10-15
Advances in sequencing technologies have remarkably lowered the detection limit of somatic variants to low frequencies. However, calling mutations in this range is still confounded by many factors, including environmental contamination. Vector contamination is a continuously occurring issue and is especially problematic since vector inserts are hardly distinguishable from the sample sequences. Such inserts, which may harbor polymorphisms and engineered functional mutations, can result in calling false variants at the corresponding sites. Numerous vector-screening methods have been developed, but none could handle contamination from inserts because they focus on vector backbone sequences alone. We developed a novel method, Vecuum, that identifies vector-originated reads and the resultant false variants. Since vector inserts are generally constructed from intron-less cDNAs, Vecuum identifies vector-originated reads by inspecting the clipping patterns at exon junctions. False variant calls are further detected based on the biased distribution of mutant alleles to vector-originated reads. Tests on simulated and spike-in experimental data validated that Vecuum could detect 93% of vector contaminants and could remove up to 87% of variant-like false calls with 100% precision. Application to public sequence datasets demonstrated the utility of Vecuum in detecting false variants resulting from various types of external contamination. A Java-based implementation of the method is available at http://vecuum.sourceforge.net/ Contact: swkim@yuhs.ac Supplementary information: Supplementary data are available at Bioinformatics online.
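The second filter rests on a simple idea: at a contamination artifact, mutant alleles pile up on vector-like (clipped) reads. Below is a hedged sketch of such a bias test with invented counts, using a plain Fisher's exact test as a stand-in for whatever statistic Vecuum actually applies:

```python
from scipy.stats import fisher_exact

# Hypothetical counts at one candidate variant site: reads carrying the
# mutant vs. reference allele, split by whether the read shows the
# exon-junction clipping pattern typical of vector (cDNA) inserts.
#                  mutant  reference
clipped_reads   = [   42,        1 ]   # vector-like reads
unclipped_reads = [    2,       96 ]   # ordinary genomic reads

odds, p = fisher_exact([clipped_reads, unclipped_reads])
if p < 0.01:
    print(f"mutant alleles are biased toward vector-like reads (p={p:.2e}); "
          "flag this call as a likely contamination artifact")
```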
Multiple scattering in particulate planetary surfaces
NASA Astrophysics Data System (ADS)
Muinonen, Karri; Peltoniemi, Jouni; Markkanen, Johannes; Penttilä, Antti; Videen, Gorden
2015-08-01
There are two ubiquitous phenomena observed at small solar phase angles (the Sun-Object-Observer angle) from, for example, asteroids and transneptunian objects. First, a nonlinear increase of brightness is observed toward the zero phase angle in the magnitude scale that is commonly called the opposition effect. Second, the scattered light is observed to be partially linearly polarized parallel to the Sun-Object-Observer plane, which is commonly called the negative polarization surge. The observations can be interpreted using a radiative-transfer coherent-backscattering Monte Carlo method (RT-CB, Muinonen 2004) that makes use of a so-called phenomenological fundamental single scatterer (Muinonen and Videen 2012). For the validity of RT-CB, see Muinonen et al. (2012). The method can allow us to put constraints on the size, shape, and refractive index of the fundamental scatterers. In the present work, we extend the RT-CB method for the specific case of a macroscopic medium of electric dipole scatterers. For the computation of the interactions, the far-field approximation inherent in the RT-CB method is replaced by an exact treatment, allowing us to account for, e.g., the so-called near-field effects. The present method constitutes the first milestone in the development of a multiple-scattering method where the so-called ladder and maximally crossed cyclical diagrams of the multiple electromagnetic interactions are rigorously computed. We expect to utilize the new methods in the spectroscopic, photometric, and polarimetric studies of asteroids, as well as in the interpretation of radar echoes from small Solar System bodies. Acknowledgments: The research is funded by the ERC Advanced Grant No 320773 entitled Scattering and Absorption of Electromagnetic Waves in Particulate Media (SAEMPL). References: K. Muinonen, Waves in Random Media 14, 365 (2004); K. Muinonen and G. Videen, JQSRT 113, 2385 (2012); K. Muinonen, M. I. Mishchenko, J. M. Dlugach, E. Zubko, A. Penttilä, and G. Videen, ApJ 760, 118 (2012).
A call for benchmarking transposable element annotation methods.
Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu
2015-01-01
DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks-that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.
Beluga whale, Delphinapterus leucas, vocalizations from the Churchill River, Manitoba, Canada.
Chmelnitsky, Elly G; Ferguson, Steven H
2012-06-01
Classification of animal vocalizations is often done by a human observer using aural and visual analysis, but more efficient, automated methods have also been utilized to reduce bias and increase reproducibility. Beluga whale, Delphinapterus leucas, calls were described from recordings collected in the summers of 2006-2008 in the Churchill River, Manitoba. Calls (n=706) were classified based on aural and visual analysis, and call characteristics were measured; calls were separated into 453 whistles (64.2%; 22 types), 183 pulsed/noisy calls (25.9%; 15 types), and 70 combined calls (9.9%; seven types). Measured parameters varied within each call type, but less variation existed in pulsed and noisy call types and some combined call types than in whistles. A more efficient and repeatable hierarchical clustering method was applied to 200 randomly chosen whistles using six call characteristics as variables; twelve groups were identified. Call characteristics varied less in the cluster-analysis groups than in whistle types described by visual and aural analysis, and the results were similar to the whistle contours described. This study provides the first description of beluga calls in Hudson Bay, and the use of two methods allows more robust interpretation and an assessment of appropriate methods for future studies.
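A hedged sketch of the clustering step as described (200 whistles, six characteristics, twelve groups); the feature values, the standardization, and the Ward linkage are assumptions for illustration, not details reported in the abstract:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Hypothetical matrix: one row per whistle, six measured call characteristics
# (e.g. start/end frequency, min/max frequency, duration, inflection count).
rng = np.random.default_rng(1)
features = rng.normal(size=(200, 6))

Z = linkage(zscore(features, axis=0), method='ward')   # standardize, then cluster
groups = fcluster(Z, t=12, criterion='maxclust')       # cut the tree into 12 groups
print(np.bincount(groups)[1:])                         # whistles per group
```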
Introduction to Stand-up Meetings in Agile Methods
NASA Astrophysics Data System (ADS)
Hasnain, Eisha; Hall, Tracy
2009-05-01
In recent years, agile methods have become more popular in the software industry. Agile methods are a new approach compared to plan-driven approaches. One of the most important shifts in adopting an agile approach is the central focus given to people in the process, exemplified by the independence afforded to developers in the development work they do. This work investigates practitioners' opinions about daily stand-up meetings in agile methods and the role of the developer in them. For our investigation we joined a Yahoo group called "Extreme Programming". Our investigation suggests that although trust is an important factor in agile methods, stand-ups are not the place to build it.
Development of Speaking Skills through Activity Based Learning at the Elementary Level
ERIC Educational Resources Information Center
Ul-Haq, Zahoor; Khurram, Bushra Ahmed; Bangash, Arshad Khan
2017-01-01
Purpose: This paper discusses an effective instructional method called "activity based learning" that can be used to develop the speaking skills of students at the elementary school level. The present study was conducted to determine the effect of activity based learning on the development of the speaking skills of low and high achievers…
Heat simulation via Scilab programming
NASA Astrophysics Data System (ADS)
Hasan, Mohammad Khatim; Sulaiman, Jumat; Karim, Samsul Arifin Abdul
2014-07-01
This paper discusses the use of an open-source software package called Scilab to develop a heat simulator. The heat equation was used to simulate heat behavior in an object, and the simulator was developed using the finite difference method. Numerical experiments show that Scilab can produce a good simulation of heat behavior, with impressive visual output, from only simple computer code.
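The paper's simulator is written in Scilab; for illustration only, the same explicit finite-difference scheme (FTCS) can be sketched in Python, with grid and material parameters invented for the example:

```python
import numpy as np

# Explicit finite-difference scheme (FTCS) for the 1-D heat equation
#   u_t = alpha * u_xx  on [0, L] with fixed-temperature ends.
L, alpha, nx, nt = 1.0, 1e-2, 51, 2000
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha          # satisfies the stability bound dt <= dx^2/(2*alpha)
r = alpha * dt / dx**2

u = np.zeros(nx)
u[nx // 2] = 100.0                # hot spot in the middle of the bar
for _ in range(nt):
    u[1:-1] = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0            # boundary conditions
print(u.max(), u.argmax())        # the heat spreads and the peak decays
```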
An Adaptive Instability Suppression Controls Method for Aircraft Gas Turbine Engine Combustors
NASA Technical Reports Server (NTRS)
Kopasakis, George; DeLaat, John C.; Chang, Clarence T.
2008-01-01
An adaptive controls method for instability suppression in gas turbine engine combustors has been developed and successfully tested with a realistic aircraft engine combustor rig. This testing was part of a program that demonstrated, for the first time, successful active combustor instability control in an aircraft gas turbine engine-like environment. The controls method is called Adaptive Sliding Phasor Averaged Control. Testing of the control method has been conducted in an experimental rig with different configurations designed to simulate combustors with instabilities of about 530 and 315 Hz. Results demonstrate the effectiveness of this method in suppressing combustor instabilities. In addition, a dramatic improvement in suppression of the instability was achieved by focusing control on the second harmonic of the instability. This is believed to be due to a phenomenon discovered and reported earlier, so-called Intra-Harmonic Coupling. These results may have implications for future research in combustor instability control.
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
Phaser crystallographic software.
McCoy, Airlie J; Grosse-Kunstleve, Ralf W; Adams, Paul D; Winn, Martyn D; Storoni, Laurent C; Read, Randy J
2007-08-01
Phaser is a program for phasing macromolecular crystal structures by both molecular replacement and experimental phasing methods. The novel phasing algorithms implemented in Phaser have been developed using maximum likelihood and multivariate statistics. For molecular replacement, the new algorithms have proved to be significantly better than traditional methods in discriminating correct solutions from noise, and for single-wavelength anomalous dispersion experimental phasing, the new algorithms, which account for correlations between F(+) and F(-), give better phases (lower mean phase error with respect to the phases given by the refined structure) than those that use mean F and anomalous differences DeltaF. One of the design concepts of Phaser was that it be capable of a high degree of automation. To this end, Phaser (written in C++) can be called directly from Python, although it can also be called using traditional CCP4 keyword-style input. Phaser is a platform for future development of improved phasing methods and their release, including source code, to the crystallographic community.
OTG-snpcaller: an optimized pipeline based on TMAP and GATK for SNP calling from ion torrent data.
Zhu, Pengyuan; He, Lingyu; Li, Yaqiao; Huang, Wenpan; Xi, Feng; Lin, Lin; Zhi, Qihuan; Zhang, Wenwei; Tang, Y Tom; Geng, Chunyu; Lu, Zhiyuan; Xu, Xun
2014-01-01
Because the new Proton platform from Life Technologies produced markedly different data from those of the Illumina platform, the conventional Illumina data analysis pipeline could not be used directly. We developed an optimized SNP calling method using TMAP and GATK (OTG-snpcaller). This method combined our own optimized processes, Remove Duplicates According to AS Tag (RDAST) and Alignment Optimize Structure (AOS), together with TMAP and GATK, to call SNPs from Proton data. We sequenced four sets of exomes captured by Agilent SureSelect and NimbleGen SeqCap EZ Kit, using Life Technology's Ion Proton sequencer. Then we applied OTG-snpcaller and compared our results with the results from Torrent Variants Caller. The results indicated that OTG-snpcaller can reduce both false positive and false negative rates. Moreover, we compared our results with Illumina results generated by GATK best practices, and we found that the results of these two platforms were comparable. The good performance in variant calling using GATK best practices can be primarily attributed to the high quality of the Illumina sequences.
Current methods designed to control and reduce the amount of sulfur dioxide emitted into the atmosphere from coal-fired power plants and factories rely upon the reaction between SO2 and alkaline earth compounds and are called flue gas desulfurization (FGD) processes. Of these met...
The Development of MST Test Information for the Prediction of Test Performances
ERIC Educational Resources Information Center
Park, Ryoungsun; Kim, Jiseon; Chung, Hyewon; Dodd, Barbara G.
2017-01-01
The current study proposes novel methods to predict multistage testing (MST) performance without conducting simulations. This method, called MST test information, is based on analytic derivation of standard errors of ability estimates across theta levels. We compared standard errors derived analytically to the simulation results to demonstrate the…
Thode, Aaron M; Kim, Katherine H; Blackwell, Susanna B; Greene, Charles R; Nations, Christopher S; McDonald, Trent L; Macrander, A Michael
2012-05-01
An automated procedure has been developed for detecting and localizing frequency-modulated bowhead whale sounds in the presence of seismic airgun surveys. The procedure was applied to four years of data, collected from over 30 directional autonomous recording packages deployed over a 280 km span of continental shelf in the Alaskan Beaufort Sea. The procedure has six sequential stages that begin by extracting 25-element feature vectors from spectrograms of potential call candidates. Two cascaded neural networks then classify some feature vectors as bowhead calls, and the procedure then matches calls between recorders to triangulate locations. To train the networks, manual analysts flagged 219 471 bowhead call examples from 2008 and 2009. Manual analyses were also used to identify 1.17 million transient signals that were not whale calls. The network output thresholds were adjusted to reject 20% of whale calls in the training data. Validation runs using 2007 and 2010 data found that the procedure missed 30%-40% of manually detected calls. Furthermore, 20%-40% of the sounds flagged as calls are not present in the manual analyses; however, these extra detections incorporate legitimate whale calls overlooked by human analysts. Both manual and automated methods produce similar spatial and temporal call distributions.
Remane, Daniela; Wissenbach, Dirk K; Peters, Frank T
2016-09-01
Liquid chromatography (LC) coupled to mass spectrometry (MS) or tandem mass spectrometry (MS/MS) is a well-established and widely used technique in clinical and forensic toxicology as well as doping control, especially for quantitative analysis. In recent years, many applications for so-called multi-target screening and/or quantification of drugs, poisons, and/or their metabolites in biological matrices have been developed. Such methods have proven particularly useful for the analysis of so-called new psychoactive substances that have appeared on recreational drug markets throughout the world. Moreover, the evolvement of high-resolution MS techniques and the development of data-independent detection modes have opened new possibilities for applications of LC-(MS/MS) in systematic toxicological screening analysis in the so-called general unknown setting. The present paper provides an overview and discusses these recent developments, focusing on the literature published after 2010.
METHODS DEVELOPMENT FOR THE ANALYSIS OF CHIRAL PESTICIDES
Chiral compounds exist as a pair of nonsuperimposable mirror images called enantiomers. Enantiomers have identical physical-chemical properties, but their interactions with other chiral molecules, toxicity, biodegradation, and fate are often different. Many pharmaceutical com...
The Socialization of Virtual Teams: Implications for ISD
NASA Astrophysics Data System (ADS)
Mullally, Brenda; Stapleton, Larry
Studies show that Information Systems Development (ISD) projects do not fulfil stakeholder expectations of completion time, quality and budget. One study (2005) shows that development is more about social interaction and mutual understanding than following a prescribed method. Systems development is a social process where interactions help to make sense of the reality within which the system is developed (Hirschheim et al., 1991). Research concentrates on methodology when in fact method may not be the primary problem. Authors have called for further research to investigate the true nature of the current systems development environment in real organisational situations (Fitzgerald, 2000).
Developing brain networks of attention.
Posner, Michael I; Rothbart, Mary K; Voelker, Pascale
2016-12-01
Attention is a primary cognitive function critical for perception, language, and memory. We provide an update on brain networks related to attention, their development, training, and pathologies. An executive attention network, also called the cingulo-opercular network, allows voluntary control of behavior in accordance with goals. Individual differences among children in self-regulation have been measured by a higher order factor called effortful control, which is related to the executive network and to the size of the anterior cingulate cortex. Brain networks of attention arise in infancy and are related to individual differences, including pathology during childhood. Methods of training attention may improve performance and ameliorate pathology.
Detailed temporal structure of communication networks in groups of songbirds.
Stowell, Dan; Gill, Lisa; Clayton, David
2016-06-01
Animals in groups often exchange calls, in patterns whose temporal structure may be influenced by contextual factors such as physical location and the social network structure of the group. We introduce a model-based analysis for temporal patterns of animal call timing, originally developed for networks of firing neurons. This has advantages over cross-correlation analysis in that it can correctly handle common-cause confounds and provides a generative model of call patterns with explicit parameters for the influences between individuals. It also has advantages over standard Markovian analysis in that it incorporates detailed temporal interactions which affect timing as well as sequencing of calls. Further, a fitted model can be used to generate novel synthetic call sequences. We apply the method to calls recorded from groups of domesticated zebra finch (Taeniopygia guttata) individuals. We find that the communication network in these groups has stable structure that persists from one day to the next, and that 'kernels' reflecting the temporal range of influence have a characteristic structure for a calling individual's effect on itself, its partner and on others in the group. We further find characteristic patterns of influences by call type as well as by individual. © 2016 The Authors.
NASA Astrophysics Data System (ADS)
Labate, Demetrio; Negi, Pooran; Ozcan, Burcin; Papadakis, Manos
2015-09-01
As advances in imaging technologies make more and more data available for biomedical applications, there is an increasing need to develop efficient quantitative algorithms for the analysis and processing of imaging data. In this paper, we introduce an innovative multiscale approach called the Directional Ratio, which is especially effective in distinguishing isotropic from anisotropic structures. This task is especially useful in the analysis of images of neurons, the main units of the nervous system, which consist of a main cell body called the soma and many elongated processes called neurites. We analyze the theoretical properties of our method on idealized models of neurons and develop a numerical implementation of this approach for the analysis of fluorescent images of cultured neurons. We show that this algorithm is very effective for the detection of somas and the extraction of neurites in images of small circuits of neurons.
Dynamic variable selection in SNP genotype autocalling from APEX microarray data.
Podder, Mohua; Welch, William J; Zamar, Ruben H; Tebbutt, Scott J
2006-11-30
Single nucleotide polymorphisms (SNPs) are DNA sequence variations, occurring when a single nucleotide--adenine (A), thymine (T), cytosine (C) or guanine (G)--is altered. Arguably, SNPs account for more than 90% of human genetic variation. Our laboratory has developed a highly redundant SNP genotyping assay consisting of multiple probes with signals from multiple channels for a single SNP, based on arrayed primer extension (APEX). This mini-sequencing method is a powerful combination of a highly parallel microarray with distinctive Sanger-based dideoxy terminator sequencing chemistry. Using this microarray platform, our current genotype calling system (known as SNP Chart) is capable of calling single SNP genotypes by manual inspection of the APEX data, which is time-consuming and exposed to user subjectivity bias. Using a set of 32 Coriell DNA samples plus three negative PCR controls as a training data set, we have developed a fully-automated genotyping algorithm based on simple linear discriminant analysis (LDA) using dynamic variable selection. The algorithm combines separate analyses based on the multiple probe sets to give a final posterior probability for each candidate genotype. We have tested our algorithm on a completely independent data set of 270 DNA samples, with validated genotypes, from patients admitted to the intensive care unit (ICU) of St. Paul's Hospital (plus one negative PCR control sample). Our method achieves a concordance rate of 98.9% with a 99.6% call rate for a set of 96 SNPs. By adjusting the threshold value for the final posterior probability of the called genotype, the call rate reduces to 94.9% with a higher concordance rate of 99.6%. We also reversed the two independent data sets in their training and testing roles, achieving a concordance rate up to 99.8%. The strength of this APEX chemistry-based platform is its unique redundancy having multiple probes for a single SNP. Our model-based genotype calling algorithm captures the redundancy in the system considering all the underlying probe features of a particular SNP, automatically down-weighting any 'bad data' corresponding to image artifacts on the microarray slide or failure of a specific chemistry. In this regard, our method is able to automatically select the probes which work well and reduce the effect of other so-called bad performing probes in a sample-specific manner, for any number of SNPs.
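As a hedged sketch of the general idea (a discriminant model trained on validated genotypes, yielding posterior probabilities that can be thresholded to trade call rate against concordance), here is a toy example; the intensity features, samples, and 0.99 threshold are invented, and the paper's dynamic variable selection over redundant probes is not reproduced:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical training data for one SNP: rows are samples, columns are
# probe-intensity features; labels are validated genotypes.
X_train = np.array([[2.1, 0.1], [1.9, 0.2], [1.0, 1.1],
                    [0.9, 0.9], [0.1, 2.0], [0.2, 1.8]])
y_train = np.array(['AA', 'AA', 'AB', 'AB', 'BB', 'BB'])

lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)

X_new = np.array([[2.0, 0.15], [0.6, 0.7]])
posteriors = lda.predict_proba(X_new)          # posterior probability per genotype
calls = lda.predict(X_new)
for call, post in zip(calls, posteriors):
    p = post.max()
    # Thresholding the posterior trades call rate against concordance.
    print(call if p >= 0.99 else 'no-call', round(p, 4))
```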
ERIC Educational Resources Information Center
Spencer, R. W.
1974-01-01
The British Gas Corporation has formulated and refined the incident process of training into their own method, which they call developing case study. Sales trainees learn indoor and outdoor sales techniques for selling central heating through self-taught case studies. (DS)
Mapping Farming Practices in Belgian Intensive Cropping Systems from Sentinel-1 SAR Time Series
NASA Astrophysics Data System (ADS)
Chome, G.; Baret, P. V.; Defourny, P.
2016-08-01
The environmental impact of the so-called conventional farming system calls for new farming practices that reduce negative externalities. Emerging farming practices such as no-till and new inter-cropping management are promising avenues. The development of methods to characterize crop management across an entire region and to understand its spatial dimension offers opportunities to accompany the transition towards a more sustainable agriculture. This research takes advantage of the unmatched polarimetric and temporal resolutions of Sentinel-1 C-band SAR to develop a method to identify farming practices at the parcel level. To this end, the detection of changes in backscattering due to surface roughness modification (tillage, inter-crop cover destruction, ...) is used to detect the farming management. The final results are compared to a reference dataset collected through an intensive field campaign. Finally, the performance is discussed from the perspective of monitoring the practices of cropping systems through remote sensing.
The National Assessment Approach to Objectives and Exercise Development.
ERIC Educational Resources Information Center
Ward, Barbara
The National Assessment of Educational Progress (NAEP) item development procedures, possible improvements or alternatives to these procedures, and methods used to control potential sources of errors of interpretation are described. Current procedures call for the assessment of 9-, 13- and 17-year-olds in subject areas typically taught in schools.…
ERIC Educational Resources Information Center
Mantyla, Terhi
2013-01-01
In teaching physics, the history of physics offers fruitful starting points for designing instruction. I introduce here an approach that uses historical cognitive processes to enhance the conceptual development of pre-service physics teachers' knowledge. It applies a method called cognitive-historical approach, introduced to the cognitive sciences…
Let's Be PALS: An Evidence-Based Approach to Professional Development
ERIC Educational Resources Information Center
Dunst, Carl J.; Trivette, Carol M.
2009-01-01
An evidence-based approach to professional development is described on the basis of the findings from a series of research syntheses and meta-analyses of adult learning methods and strategies. The approach, called PALS (Participatory Adult Learning Strategy), places major emphasis on both active learner involvement in all aspects of training…
Development and evaluation of the photoload sampling technique
Robert E. Keane; Laura J. Dickinson
2007-01-01
Wildland fire managers need better estimates of fuel loading so they can accurately predict potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents the development and evaluation of a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common...
Recent advances in Lanczos-based iterative methods for nonsymmetric linear systems
NASA Technical Reports Server (NTRS)
Freund, Roland W.; Golub, Gene H.; Nachtigal, Noel M.
1992-01-01
In recent years, there has been a true revival of the nonsymmetric Lanczos method. On the one hand, the possible breakdowns in the classical algorithm are now better understood, and so-called look-ahead variants of the Lanczos process have been developed, which remedy this problem. On the other hand, various new Lanczos-based iterative schemes for solving nonsymmetric linear systems have been proposed. This paper gives a survey of some of these recent developments.
Piecewise SALT sampling for estimating suspended sediment yields
Robert B. Thomas
1989-01-01
A probability sampling method called SALT (Selection At List Time) has been developed for collecting and summarizing data on delivery of suspended sediment in rivers. It is based on sampling and estimating yield using a suspended-sediment rating curve for high discharges and simple random sampling for low flows. The method gives unbiased estimates of total yield and...
ERIC Educational Resources Information Center
Kilburn, Daniel; Nind, Melanie; Wiles, Rose
2014-01-01
In light of calls to improve the capacity for social science research within UK higher education, this article explores the possibilities for an emerging pedagogy for research methods. A lack of pedagogical culture in this field has been identified by previous studies. In response, we examine pedagogical literature surrounding approaches for…
Magnetic Interactions and the Method of Images: A Wealth of Educational Suggestions
ERIC Educational Resources Information Center
Bonanno, A.; Camarca, M.; Sapia, P.
2011-01-01
Under some conditions, the method of images (well known in electrostatics) may be implemented in magnetostatic problems too, giving an excellent example of the usefulness of formal analogies in the description of physical systems. In this paper, we develop a quantitative model for the magnetic interactions underlying the so-called Geomag[TM]…
D'Nealian Manuscript--An Aid to Reading Development.
ERIC Educational Resources Information Center
Thurber, Donald N.
A new method of continuous stroke manuscript print called D'Nealian Manuscript is challenging the traditional circle-stick method of teaching children how to write. The circle-stick method uses component or splinter parts to form whole letters. Children are forced to form all writing with vertical lines and to learn a manuscript print that goes nowhere.…
NASA Astrophysics Data System (ADS)
Valtierra, Robert Daniel
Passive acoustic localization has benefited from many major developments and has become an increasingly important focus point in marine mammal research. Several challenges still remain. This work seeks to address several of these challenges, such as tracking the calling depths of baleen whales. In this work, data from an array of widely spaced Marine Acoustic Recording Units (MARUs) was used to achieve three-dimensional localization by combining the methods Time Difference of Arrival (TDOA) and Direct-Reflected Time Difference of Arrival (DRTD) along with a newly developed autocorrelation technique. TDOA was applied to data for two-dimensional (latitude and longitude) localization and depth was resolved using DRTD. Previously, DRTD had been limited to pulsed broadband signals, such as sperm whale or dolphin echolocation, where individual direct and reflected signals are separated in time. Due to the length of typical baleen whale vocalizations, individual multipath signal arrivals can overlap, making time differences of arrival difficult to resolve. This problem can be solved using an autocorrelation, which can extract reflection information from overlapping signals. To establish this technique, a derivation was made to model the autocorrelation of a direct signal and its overlapping reflection. The model was exploited to derive performance limits allowing for prediction of the minimum resolvable direct-reflected time difference for a known signal type. The dependence on signal parameters (sweep rate, call duration) was also investigated. The model was then verified using both recorded and simulated data from two analysis cases for North Atlantic right whales (NARWs, Eubalaena glacialis) and humpback whales (Megaptera novaeangliae). The newly developed autocorrelation technique was then combined with DRTD and tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The combined DRTD-autocorrelation methods enabled calling depth and range estimations of a vocalizing NARW and humpback whale in two separate cases. The DRTD-autocorrelation method was then combined with TDOA to create a three-dimensional track of a NARW in the Stellwagen Bank National Marine Sanctuary. Results from these experiments illustrated the potential of the combined methods to successfully resolve baleen calling depths in three dimensions.
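The core trick, extracting a direct-reflected time difference from overlapping arrivals via autocorrelation, can be sketched as follows; the sweep, sample rate, reflection delay, and gain are all invented for the example:

```python
import numpy as np

fs = 2000.0                                    # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
call = np.sin(2 * np.pi * (100 + 50 * t) * t)  # hypothetical FM sweep, ~100-200 Hz

delay, reflect_gain = 0.080, 0.5               # 80 ms surface reflection, attenuated
lag = int(delay * fs)
received = call.copy()
received[lag:] += reflect_gain * call[:-lag]   # overlapping direct + reflected paths

# The autocorrelation of the received signal peaks at the direct-reflected
# lag even though the two arrivals overlap in time.
ac = np.correlate(received, received, mode='full')[len(received) - 1:]
ac[:int(0.01 * fs)] = 0                        # suppress the zero-lag main lobe
print(np.argmax(ac) / fs)                      # ~0.080 s
```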
A non-parametric peak calling algorithm for DamID-Seq.
Li, Renhua; Hempel, Leonie U; Jiang, Tingbo
2015-01-01
Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality checks and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) read resampling; 2) read scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells in terms of peak number, location, and peak width.
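A heavily simplified, hypothetical sketch of the resample-scale-threshold idea on binned counts follows; the binning, fold cutoff, and bootstrap scheme are inventions for illustration (the actual NPPC resamples short reads, not bin counts):

```python
import numpy as np

rng = np.random.default_rng(0)

def call_peaks(treat, control, n_boot=200, fold=4.0, alpha=0.01):
    """Toy non-parametric peak caller over binned read counts.

    treat, control: read counts per genomic bin (Dam-fusion vs Dam-only).
    Bootstrap the control to model background behaviour, scale the libraries
    to equal depth, and keep bins whose fold change exceeds `fold` in at
    least (1 - alpha) of the bootstrap replicates.
    """
    scale = treat.sum() / control.sum()            # depth normalization
    exceed = np.zeros(len(treat))
    for _ in range(n_boot):
        boot = rng.choice(control, size=len(control), replace=True)
        background = scale * np.maximum(boot, 1)   # avoid division by zero
        exceed += (treat / background) >= fold
    return np.where(exceed / n_boot >= 1 - alpha)[0]

treat = rng.poisson(5, 1000); treat[100:105] = 80  # spiked-in "peak"
control = rng.poisson(5, 1000)
print(call_peaks(treat, control))                  # ~ bins 100-104
```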
Investigations of Sayre's Equation.
NASA Astrophysics Data System (ADS)
Shiono, Masaaki
Available from UMI in association with The British Library. Since the discovery of X-ray diffraction, various methods of using it to solve crystal structures have been developed. The major methods used can be divided into two categories: (1) Patterson function based methods; (2) direct phase-determination methods. In the early days of structure determination from X-ray diffraction, Patterson methods played the leading role. Direct phase-determining methods ('direct methods' for short) were introduced by D. Harker and J. S. Kasper in the form of inequality relationships in 1948. A significant development of direct methods was produced by Sayre (1952). The equation he introduced, generally called Sayre's equation, gives exact relationships between structure factors for equal atoms. Later Cochran (1955) derived the so-called triple phase relationship, the main means by which it has become possible to find the structure factor phases automatically by computer. Although the background theory of direct methods is very mathematical, the user of direct-methods computer programs needs no detailed knowledge of these automatic processes in order to solve structures. Recently introduced direct methods are based on Sayre's equation, so it is important to investigate its properties thoroughly. One such new method involves the Sayre equation tangent formula (SETF), which attempts to minimise the least-squares residual for Sayre's equations (Debaerdemaeker, Tate and Woolfson, 1985). In chapters I-III the principles and developments of direct methods are described, and in chapters IV-VI the properties of Sayre's equation and its modification are discussed. Finally, in chapter VII, the possible use of an equation similar in type to Sayre's equation, derived from the characteristics of the Patterson function, is investigated.
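For reference (a standard statement of the result, not quoted from the thesis), Sayre's equation for an equal-atom structure reads

```latex
F_{\mathbf{h}} \;=\; \frac{\theta_{\mathbf{h}}}{V} \sum_{\mathbf{k}} F_{\mathbf{k}}\, F_{\mathbf{h}-\mathbf{k}}
```

where $V$ is the unit-cell volume and $\theta_{\mathbf{h}} = f_{\mathbf{h}}/g_{\mathbf{h}}$ rescales the form factor $f$ of the real atom to the form factor $g$ of the "squared" atom. Taking the phases of the dominant terms in the sum leads to Cochran's triple phase relationship $\varphi_{\mathbf{h}} \approx \varphi_{\mathbf{k}} + \varphi_{\mathbf{h}-\mathbf{k}}$.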
Toward fidelity between specification and implementation
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing
1994-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
Verification and validation of a reliable multicast protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
NASA Technical Reports Server (NTRS)
Greenberg, Albert G.; Lubachevsky, Boris D.; Nicol, David M.; Wright, Paul E.
1994-01-01
Fast, efficient parallel algorithms are presented for discrete event simulations of dynamic channel assignment schemes for wireless cellular communication networks. The driving events are call arrivals and departures, in continuous time, to cells geographically distributed across the service area. A dynamic channel assignment scheme decides which call arrivals to accept, and which channels to allocate to the accepted calls, attempting to minimize call blocking while ensuring co-channel interference is tolerably low. Specifically, the scheme ensures that the same channel is used concurrently at different cells only if the pairwise distances between those cells are sufficiently large. Much of the complexity of the system comes from ensuring this separation. The network is modeled as a system of interacting continuous time automata, each corresponding to a cell. To simulate the model, conservative methods are used; i.e., methods in which no errors occur in the course of the simulation and so no rollback or relaxation is needed. Implemented on a 16K processor MasPar MP-1, an elegant and simple technique provides speedups of about 15 times over an optimized serial simulation running on a high speed workstation. A drawback of this technique, typical of conservative methods, is that processor utilization is rather low. To overcome this, new methods were developed that exploit slackness in event dependencies over short intervals of time, thereby raising the utilization to above 50 percent and the speedup over the optimized serial code to about 120 times.
Microcoppice: a new strategy for red oak clonal propagation
D.E. Harper; B.H. McCown
1991-01-01
The great demand for red oak (Quercus rubra L.) has forced plant propagators to consider viable methods of mass clonal propagation for the species. A process called 'microcoppicing' is presently being developed to help meet such needs.
Drawert, Brian; Lawson, Michael J; Petzold, Linda; Khammash, Mustafa
2010-02-21
We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm.
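The reaction part of the hybrid is the stochastic simulation algorithm; here is a minimal sketch of that component (Gillespie's direct method) on an invented birth-death example. The diffusive FSP treatment of transport is not shown:

```python
import numpy as np

rng = np.random.default_rng(0)

def ssa(x0, stoich, rates, t_end):
    """Gillespie's stochastic simulation algorithm for well-mixed reactions.

    x0: initial molecule counts; stoich: state change per reaction;
    rates: function mapping state -> propensity of each reaction.
    """
    t, x, traj = 0.0, np.array(x0, dtype=float), []
    while t < t_end:
        a = rates(x)
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1 / a0)          # time to the next reaction
        j = rng.choice(len(a), p=a / a0)      # which reaction fires
        x = x + stoich[j]
        traj.append((t, x.copy()))
    return traj

# Birth-death example: 0 -> X at rate k1; X -> 0 at rate k2 * X.
stoich = np.array([[1], [-1]])
rates = lambda x: np.array([10.0, 0.5 * x[0]])
print(ssa([0], stoich, rates, t_end=20.0)[-1])   # count fluctuates near 10/0.5 = 20
```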
ERIC Educational Resources Information Center
Carter, Angela
This study involved observing a second-grade classroom to investigate how the teacher called on students, noting whether the teacher gave enough attention to students who raised their hands frequently by calling on them and examining students' responses when called on. Researchers implemented a new method of calling on students using name cards,…
Newton-like methods for Navier-Stokes solution
NASA Astrophysics Data System (ADS)
Qin, N.; Xu, X.; Richards, B. E.
1992-12-01
The paper reports on Newton-like methods, called the SFDN-alpha-GMRES and SQN-alpha-GMRES methods, that have been devised and proven to be powerful schemes for large nonlinear problems typical of viscous compressible Navier-Stokes solutions. They can be applied using a partially converged solution from a conventional explicit or approximate implicit method. Developments have included the efficient parallelization of the schemes on a distributed-memory parallel computer. The methods are illustrated using a RISC workstation and a transputer parallel system, respectively, to solve a hypersonic vortical flow.
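The paper's schemes are specific Newton-GMRES variants; as a generic, hedged illustration of the Newton-Krylov idea (Newton outer iterations with a GMRES inner solve and no explicit Jacobian), SciPy's `newton_krylov` can stand in on an invented 1-D nonlinear model problem:

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    # 1-D nonlinear "viscous" model problem: -u'' + u*u' = 1, u(0) = u(1) = 0,
    # discretized with central differences on a uniform grid.
    n = len(u)
    h = 1.0 / (n + 1)
    up = np.concatenate(([0.0], u, [0.0]))          # pad with boundary values
    uxx = (up[2:] - 2 * up[1:-1] + up[:-2]) / h**2
    ux = (up[2:] - up[:-2]) / (2 * h)
    return -uxx + u * ux - 1.0

u0 = np.zeros(100)                                  # crude initial guess
sol = newton_krylov(residual, u0, method='gmres')   # Newton outer, GMRES inner
print(sol.max())                                    # ~0.125 for this problem
```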
MAFsnp: A Multi-Sample Accurate and Flexible SNP Caller Using Next-Generation Sequencing Data
Hu, Jiyuan; Li, Tengfei; Xiu, Zidi; Zhang, Hong
2015-01-01
Most existing statistical methods developed for calling single nucleotide polymorphisms (SNPs) using next-generation sequencing (NGS) data are based on Bayesian frameworks, and there does not exist any SNP caller that produces p-values for calling SNPs in a frequentist framework. To fill in this gap, we develop a new method, MAFsnp, a Multiple-sample based Accurate and Flexible algorithm for calling SNPs with NGS data. MAFsnp is based on an estimated likelihood ratio test (eLRT) statistic. In practical situations, the involved parameter is very close to the boundary of the parametric space, so standard large-sample theory is not suitable for evaluating the finite-sample distribution of the eLRT statistic. Observing that the distribution of the test statistic is a mixture of zero and a continuous part, we propose to model the test statistic with a novel two-parameter mixture distribution. Once the parameters in the mixture distribution are estimated, p-values can be easily calculated for detecting SNPs, and the multiple-testing corrected p-values can be used to control the false discovery rate (FDR) at any pre-specified level. With simulated data, MAFsnp is shown to have much better control of FDR than the existing SNP callers. Through application to two real datasets, MAFsnp is also shown to outperform the existing SNP callers in terms of calling accuracy. An R package “MAFsnp” implementing the new SNP caller is freely available at http://homepage.fudan.edu.cn/zhangh/softwares/. PMID:26309201
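One way to write down such a zero-inflated null model (a hedged reconstruction from the abstract, not the paper's exact parameterization):

```latex
T \;\sim\; \pi\,\delta_{0} + (1-\pi)\, F_{c}, \qquad
p(t) \;=\; \Pr(T \ge t) \;=\; (1-\pi)\,\bigl(1 - F_{c}(t)\bigr) \quad (t > 0)
```

where $\delta_0$ is a point mass at zero and $F_c$ is a continuous distribution (e.g. a gamma or scaled chi-square); once the mixture parameters are estimated, each observed statistic maps to a p-value that can then be multiplicity-corrected to control FDR.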
Perspectives of Nurses and Patients on Call Light Technology.
Galinato, Jose; Montie, Mary; Patak, Lance; Titler, Marita
2015-08-01
Call lights are prevalent in inpatient healthcare facilities across the nation. While call light use directly influences the delivery of nursing care, there remain significant gaps both in research and technology that can affect the quality of care and patient satisfaction. This study examines nurse and patient perceptions of the use of a new call communication solution, Eloquence, in the acute care inpatient setting. Eighteen patients were recruited for the study and participated in individual semistructured interviews during their hospital stay. Eighteen nurses were recruited and participated in focus groups for this study. Qualitative descriptive methods were used to analyze the data. Results revealed themes of usability, improved communication, and suggestions for improvement to the alpha prototype design. After a demonstration of the use and capability of Eloquence, nurse and patient participants found Eloquence to be a welcome advancement in nurse call technology that has the potential to improve workflow and patient outcomes. In addition, the participants also proposed ideas on how to further develop the technology to improve its use.
Bandwidth and Detection of Packet Length Covert Channels
2011-03-01
Shared Resource Matrix (SRM): Develop a matrix of all resources on one side and all the processes on the other. Then, determine which process uses which... system calls. This method is similar to that of the SRM. Covert channels have also been created by modulating packet timing, data and headers of network... analysis, noninterference analysis, SRM method, and the covert flow tree method [4]. These methods can be used during the design phase of a system. Less
Development of a Novel Tissue Specific Aromatase Activity Regulation Therapeutic Method
2009-09-01
Estrogen is essential for normal growth and development of the female... the ovaries and other tissues of the body using an enzyme called aromatase. Once women have reached menopause, the ovaries no longer produce estrogen... Estrogen is essential for normal growth and development of the female reproductive system, including breast tissue, and lifetime
Systematic comparison of variant calling pipelines using gold standard personal exome variants
Hwang, Sohyun; Kim, Eiru; Lee, Insuk; Marcotte, Edward M.
2015-01-01
The success of clinical genomics using next generation sequencing (NGS) requires the accurate and consistent identification of personal genome variants. Assorted variant calling methods have been developed, which show low concordance between their calls. Hence, a systematic comparison of the variant callers could give important guidance to NGS-based clinical genomics. Recently, a set of high-confidence variant calls for one individual (NA12878) has been published by the Genome in a Bottle (GIAB) consortium, enabling performance benchmarking of different variant calling pipelines. Based on the gold standard reference variant calls from GIAB, we compared the performance of thirteen variant calling pipelines, testing combinations of three read aligners—BWA-MEM, Bowtie2, and Novoalign—and four variant callers—Genome Analysis Tool Kit HaplotypeCaller (GATK-HC), Samtools mpileup, Freebayes and Ion Proton Variant Caller (TVC), for twelve data sets for the NA12878 genome sequenced by different platforms including Illumina2000, Illumina2500, and Ion Proton, with various exome capture systems and exome coverage. We observed different biases toward specific types of SNP genotyping errors by the different variant callers. The results of our study provide useful guidelines for reliable variant identification from deep sequencing of personal genomes. PMID:26639839
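At its core, benchmarking a pipeline against a gold-standard call set reduces to set comparison over normalized variant records. A minimal, hedged sketch (illustrative only; real comparisons also require normalizing variant representation and restricting to high-confidence regions, as GIAB-based evaluations do):

    def benchmark(calls, gold):
        """calls, gold: sets of normalized (chrom, pos, ref, alt) tuples."""
        tp = len(calls & gold)                       # shared with gold standard
        precision = tp / len(calls) if calls else 0.0
        recall = tp / len(gold) if gold else 0.0
        f1 = 2 * precision * recall / (precision + recall) if tp else 0.0
        return precision, recall, f1

    calls = {("chr1", 101, "A", "G"), ("chr1", 500, "C", "T")}
    gold = {("chr1", 101, "A", "G"), ("chr2", 42, "G", "A")}
    print(benchmark(calls, gold))                    # (0.5, 0.5, 0.5)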
Development of an education campaign to reduce delays in pre-hospital response to stroke.
Caminiti, Caterina; Schulz, Peter; Marcomini, Barbara; Iezzi, Elisa; Riva, Silvia; Scoditti, Umberto; Zini, Andrea; Malferrari, Giovanni; Zedde, Maria Luisa; Guidetti, Donata; Montanari, Enrico; Baratti, Mario; Denti, Licia
2017-06-24
Systematic reviews call for well-designed trials with clearly described intervention components to support the effectiveness of educational campaigns to reduce patient delay in stroke presentation. We herein describe the systematic development process of a campaign aimed to increase stroke awareness and preparedness. Campaign development followed Intervention Mapping (IM), a theory- and evidence-based tool, and was articulated in two phases: needs assessment and intervention development. In phase 1, two cross-sectional surveys were performed, one aiming to measure stroke awareness in the target population and the other to analyze the behavioral determinants of prehospital delay. In phase 2, a matrix of proximal program objectives was developed, theory-based intervention methods and practical strategies were selected and program components and materials produced. In phase 1, the survey on 202 citizens highlighted underestimation of symptom severity, as respondents would choose to call the emergency service (EMS) in only 44% of stroke situations. In the survey on 393 consecutive patients, 55% presented over 2 hours after symptom onset; major determinants were deciding to call the general practitioner first and the reaction of the first person the patient called. In phase 2, adult individuals were identified as the target of the intervention, both as potential "patients" and witnesses of stroke. The low educational level found in the patient survey called for a narrative approach in cartoon form. The family setting was chosen for the message because 42% of patients who presented within 2 hours had been advised by a family member to call EMS. To act on people's tendency to view stroke as an untreatable disease, it was decided to avoid fear-arousal appeals and use a positive message providing instructions and hope. Focus groups were used to test educational products and identify the most suitable sites for message dissemination. The IM approach made it possible to develop a stroke campaign integrating theories, scientific evidence and information collected from the target population, and to provide clear explanations for the reasons behind key decisions during the intervention development process. NCT01881152. Retrospectively registered June 7, 2013.
Thermodynamics of quantum information scrambling
NASA Astrophysics Data System (ADS)
Campisi, Michele; Goold, John
2017-06-01
Scrambling of quantum information can conveniently be quantified by so-called out-of-time-order correlators (OTOCs), i.e., correlators of the type $\langle [W_\tau, V]^\dagger [W_\tau, V] \rangle$, whose measurements present a formidable experimental challenge. Here we report on a method for the measurement of OTOCs based on the so-called two-point measurement scheme developed in the field of nonequilibrium quantum thermodynamics. The scheme is of broader applicability than methods employed in current experiments and provides a clear-cut interpretation of quantum information scrambling in terms of nonequilibrium fluctuations of thermodynamic quantities, such as work and heat. Furthermore, we provide a numerical example on a spin chain which highlights the utility of our thermodynamic approach when understanding the differences between integrable and ergodic behaviors. We also discuss how the method can be used to extend the reach of current experiments.
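For orientation, an OTOC of this type can be evaluated directly by exact diagonalization on a small system. The following sketch computes the infinite-temperature OTOC for a mixed-field Ising chain; the model, couplings, and operator choices are illustrative assumptions, not the systems studied in the paper.

    import numpy as np
    from scipy.linalg import expm

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    def site_op(op, i, n):
        # Embed a single-site operator at site i of an n-site chain
        out = np.array([[1.0 + 0j]])
        for j in range(n):
            out = np.kron(out, op if j == i else np.eye(2, dtype=complex))
        return out

    n, J, g, h = 6, 1.0, 1.05, 0.5       # mixed-field Ising chain (nonintegrable)
    H = sum(-J * site_op(sz, i, n) @ site_op(sz, i + 1, n) for i in range(n - 1))
    H = H + sum(-g * site_op(sx, i, n) - h * site_op(sz, i, n) for i in range(n))

    W, V = site_op(sz, 0, n), site_op(sz, n - 1, n)
    for t in np.linspace(0.0, 4.0, 5):
        U = expm(-1j * H * t)
        Wt = U.conj().T @ W @ U          # Heisenberg-picture W(t)
        C = Wt @ V - V @ Wt              # the commutator [W(t), V]
        otoc = np.trace(C.conj().T @ C).real / 2**n   # infinite-temperature trace
        print(f"t = {t:.1f}   C(t) = {otoc:.4f}")

Setting h = 0 makes the chain integrable, which is one way to explore the integrable-versus-ergodic contrast the abstract mentions.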
Extension of Nikiforov-Uvarov method for the solution of Heun equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karayer, H., E-mail: hale.karayer@gmail.com; Demirhan, D.; Büyükkılıç, F.
2015-06-15
We report an alternative method to solve second-order differential equations which have at most four singular points. This method is developed by changing the degrees of the polynomials in the basic equation of the Nikiforov-Uvarov (NU) method. This is called the extended NU method in this paper. The eigenvalue solutions of the Heun equation and the confluent Heun equation are obtained via the extended NU method. Some quantum mechanical problems such as the Coulomb problem on a 3-sphere, two Coulombically repelling electrons on a sphere, and the hyperbolic double-well potential are investigated by this method.
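For reference, the general Heun equation in its standard form, a second-order ODE with regular singular points at $z = 0, 1, a, \infty$, reads:

\[
\frac{d^{2}y}{dz^{2}}
+ \left( \frac{\gamma}{z} + \frac{\delta}{z-1} + \frac{\epsilon}{z-a} \right) \frac{dy}{dz}
+ \frac{\alpha\beta\, z - q}{z(z-1)(z-a)}\, y = 0,
\qquad \epsilon = \alpha + \beta - \gamma - \delta + 1 .
\]

The extended NU approach described above recasts equations of this four-singular-point class so that eigenvalues can be read off in the same way the standard NU method treats hypergeometric-type equations.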
Second Language Acquisition: Implications of Web 2.0 and Beyond
ERIC Educational Resources Information Center
Chang, Ching-Wen; Pearman, Cathy; Farha, Nicholas
2012-01-01
Language laboratories, developed in the 1970s under the influence of the Audiolingual Method, were superseded several decades later by computer-assisted language learning (CALL) work stations (Gündüz, 2005). The World Wide Web was developed shortly thereafter. From this introduction and the well-documented and staggering growth of the Internet and…
Who Does What to Whom: Introduction of Referents in Children's Storytelling from Pictures
ERIC Educational Resources Information Center
Schneider, Phyllis; Hayward, Denyse
2010-01-01
Purpose: This article describes the development of a measure, called First Mentions (FM), that can be used to evaluate the referring expressions that children use to introduce characters and objects when telling a story. Method: Participants were 377 children ages 4 to 9 years (300 with typical development, 77 with language impairment) who told…
ERIC Educational Resources Information Center
Ministerio de Educacion, Lima (Peru).
These two documents describe developments and methods of information dissemination which have contributed to strengthening educational reform in Peru since 1972. Educational developments discussed include: (1) decentralization of school administration with increased community participation; (2) a policy calling for the implementation of…
ERIC Educational Resources Information Center
Hitchcock, John H.; Sarkar, Sreeroopa; Nastasi, Bonnie; Burkholder, Gary; Varjas, Kristen; Jayasena, Asoka
2006-01-01
Despite on-going calls for developing cultural competency among mental health practitioners, few assessment instruments consider cultural variation in psychological constructs. To meet the challenge of developing measures for minority and international students, it is necessary to account for the influence culture may have on the latent constructs…
Sound imaging of nocturnal animal calls in their natural habitat.
Mizumoto, Takeshi; Aihara, Ikkyu; Otsuka, Takuma; Takeda, Ryu; Aihara, Kazuyuki; Okuno, Hiroshi G
2011-09-01
We present a novel method for imaging acoustic communication between nocturnal animals. Investigating the spatio-temporal calling behavior of nocturnal animals, e.g., frogs and crickets, has been difficult because of the need to distinguish many animals' calls in noisy environments without being able to see them. Our method visualizes the spatial and temporal dynamics using dozens of sound-to-light conversion devices (called "Firefly") and an off-the-shelf video camera. The Firefly, which consists of a microphone and a light emitting diode, emits light when it captures nearby sound. Deploying dozens of Fireflies in a target area, we record the calls of multiple individuals through the video camera. We conducted two experiments, one indoors and the other in the field, using Japanese tree frogs (Hyla japonica). The indoor experiment demonstrated that our method correctly visualizes Japanese tree frogs' calling behavior. It confirmed the known behavior: two frogs call synchronously or in anti-phase synchronization. The field experiment (in a rice paddy where Japanese tree frogs live) also visualized the same calling behavior, confirming anti-phase synchronization in the field. Experimental results confirm that our method can visualize the calling behavior of nocturnal animals in their natural habitat.
Malchaire, J B
2004-08-01
The first section of the document describes a risk-prevention strategy, called SOBANE, in four levels: screening, observation, analysis and expertise. The aim is to make risk prevention faster, more cost-effective, and better at coordinating the contributions of the workers themselves, their management, the internal and external occupational health (OH) practitioners and the experts. These four levels are: screening, where the risk factors are detected by the workers and their management, and obvious solutions are implemented; observation, where the remaining problems are studied in more detail, one by one, and the reasons and the solutions are discussed in detail; analysis, where, when necessary, an OH practitioner is called upon to carry out appropriate measurements to develop specific solutions; expertise, where, in very sophisticated and rare cases, the assistance of an expert is called upon to solve a particular problem. The method for the participatory screening of the risks (in French: Dépistage Participatif des Risques), Déparis, is proposed for the first level, screening, of the SOBANE strategy. The work situation is systematically reviewed and all the aspects conditioning the easiness, the effectiveness and the satisfaction at work are discussed, in search of practical prevention measures. The points to be studied in more detail at level 2, observation, are identified. The method is carried out during a meeting of key workers and technical staff. It proves to be simple, sparing in time and means, and plays a significant role in the development of a dynamic risk-management plan and a culture of dialogue in the company.
Automated tetraploid genotype calling by hierarchical clustering
USDA-ARS?s Scientific Manuscript database
SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
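Hierarchical clustering for dosage calling groups a marker's normalized intensity ratios into up to five clusters (dosages 0-4 in an autotetraploid). This is a hedged, generic SciPy sketch on synthetic ratios; the actual method's normalization and cluster-ordering rules are not described in the snippet above.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def call_dosage(theta, k=5):
        """Cluster one marker's allele-ratio values into <= k ordered dosage groups."""
        Z = linkage(theta.reshape(-1, 1), method="average")
        labels = fcluster(Z, t=k, criterion="maxclust")
        ids = np.unique(labels)
        order = np.argsort([theta[labels == c].mean() for c in ids])
        remap = {ids[o]: dosage for dosage, o in enumerate(order)}
        return np.array([remap[l] for l in labels])

    rng = np.random.default_rng(1)
    theta = np.concatenate([rng.normal(m, 0.02, 40) for m in (0.0, 0.25, 0.5, 0.75, 1.0)])
    print(np.bincount(call_dosage(theta)))   # roughly 40 samples per dosage class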
Microbial Resistant Test Method Development
Because humans spend most of their time in the indoor environment, environmental analysis of the quality of indoor air has become an important research topic. A major component of the aerosol in the indoor environment consists of biological particles, called bioaerosols, and fur...
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...
ERIC Educational Resources Information Center
Redfield, Gretchen
As a first step towards resource sharing among libraries in the Cleveland Area Metropolitan Library System (CAMLS), a unique method, called the Site Appraisal for Area Resources Inventory (SAFARI), was developed to examine the library collections. This approach was different than others in that collections were compared by experts in a specific…
ERIC Educational Resources Information Center
Hallinger, Philip; Chen, Junjun
2015-01-01
Over the past two decades scholars have called for a more concerted effort to develop an empirically grounded literature on educational leadership outside of mainstream "Western" contexts. This paper reports the results of a review of research topics and methods that comprise the literature on educational leadership and management in…
A call to improve methods for estimating tree biomass for regional and national assessments
Aaron R. Weiskittel; David W. MacFarlane; Philip J. Radtke; David L.R. Affleck; Hailemariam Temesgen; Christopher W. Woodall; James A. Westfall; John W. Coulston
2015-01-01
Tree biomass is typically estimated using statistical models. This review highlights five limitations of most tree biomass models, which include the following: (1) biomass data are costly to collect and alternative sampling methods are used; (2) belowground data and models are generally lacking; (3) models are often developed from small and geographically limited data...
Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data
Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min
2015-01-01
DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984
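The key modeling move above is to treat each read as a mixture: with probability 1 − α it reflects the sample's true genotype, and with probability α a contaminant drawn from population allele frequencies. A simplified, hedged sketch for one biallelic site (the published model is richer, e.g. using per-read base qualities):

    import numpy as np

    def genotype_posteriors(bases, eps, alpha, af, priors=(0.25, 0.5, 0.25)):
        """Posterior over genotypes g in {0,1,2} alt alleles at one biallelic site.
        bases: array of 0 (ref) / 1 (alt) read observations
        eps: base error rate; alpha: contamination fraction; af: population alt freq
        """
        def p_alt(g):  # chance a read shows the alt allele
            own = (g / 2) * (1 - eps) + (1 - g / 2) * eps    # from the sample
            cont = af * (1 - eps) + (1 - af) * eps           # from the contaminant
            return (1 - alpha) * own + alpha * cont
        logL = np.array([
            np.sum(np.where(bases == 1, np.log(p_alt(g)), np.log(1 - p_alt(g))))
            for g in (0, 1, 2)
        ])
        post = np.exp(logL - logL.max()) * np.array(priors)
        return post / post.sum()

    print(genotype_posteriors(np.array([1, 1, 0, 1, 0, 1]),
                              eps=0.01, alpha=0.05, af=0.3))

Setting alpha = 0 recovers a standard contamination-ignorant genotype likelihood, which is the comparison the abstract describes.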
Transition operators in electromagnetic-wave diffraction theory. II - Applications to optics
NASA Technical Reports Server (NTRS)
Hahne, G. E.
1993-01-01
The theory developed by Hahne (1992) for the diffraction of time-harmonic electromagnetic waves from fixed obstacles is briefly summarized and extended. Applications of the theory are considered which comprise, first, a spherical harmonic expansion of the so-called radiation impedance operator in the theory, for a spherical surface, and second, a reconsideration of the familiar short-wavelength approximation from the new standpoint, including a derivation of the so-called physical optics method on the basis of a quasi-planar approximation to the radiation impedance operator, augmented by the method of stationary phase. The latter includes a rederivation of the geometrical optics approximation for the complete Green's function for the electromagnetic field in the presence of a smooth, convex-surfaced, perfectly electrically conducting obstacle.
Development and Application of the p-version of the Finite Element Method.
1985-11-21
this property hierarchic families of finite elements. The h-version of the finite element method has been the subject of intensive study since the... early 1950's and perhaps even earlier. Study of the p-version of the finite element method, on the other hand, began at Washington University in St... Louis in the early 1970's and led to a more recent study of the h-p version. Research in the p-version (formerly called The Constraint Method) has
QQ-SNV: single nucleotide variant detection at low frequency by comparing the quality quantiles.
Van der Borght, Koen; Thys, Kim; Wetzels, Yves; Clement, Lieven; Verbist, Bie; Reumers, Joke; van Vlijmen, Herman; Aerssens, Jeroen
2015-11-10
Next generation sequencing enables studying heterogeneous populations of viral infections. When the sequencing is done at high coverage depth ("deep sequencing"), low frequency variants can be detected. Here we present QQ-SNV (http://sourceforge.net/projects/qqsnv), a logistic regression classifier model developed for the Illumina sequencing platforms that uses the quantiles of the quality scores, to distinguish true single nucleotide variants from sequencing errors based on the estimated SNV probability. To train the model, we created a dataset of an in silico mixture of five HIV-1 plasmids. Testing of our method in comparison to the existing methods LoFreq, ShoRAH, and V-Phaser 2 was performed on two HIV and four HCV plasmid mixture datasets and one influenza H1N1 clinical dataset. For default application of QQ-SNV, variants were called using a SNV probability cutoff of 0.5 (QQ-SNV(D)). To improve the sensitivity we used a SNV probability cutoff of 0.0001 (QQ-SNV(HS)). To also increase specificity, SNVs called were overruled when their frequency was below the 80th percentile calculated on the distribution of error frequencies (QQ-SNV(HS-P80)). When comparing QQ-SNV versus the other methods on the plasmid mixture test sets, QQ-SNV(D) performed similarly to the existing approaches. QQ-SNV(HS) was more sensitive on all test sets but with more false positives. QQ-SNV(HS-P80) was found to be the most accurate method over all test sets by balancing sensitivity and specificity. When applied to a paired-end HCV sequencing study, with a lowest spiked-in true frequency of 0.5%, QQ-SNV(HS-P80) revealed a sensitivity of 100% (vs. 40-60% for the existing methods) and a specificity of 100% (vs. 98.0-99.7% for the existing methods). In addition, QQ-SNV required the least overall computation time to process the test sets. Finally, when testing on a clinical sample, four putative true variants with frequency below 0.5% were consistently detected by QQ-SNV(HS-P80) from different generations of Illumina sequencers. We developed and successfully evaluated a novel method, called QQ-SNV, for highly efficient single nucleotide variant calling on Illumina deep sequencing virology data.
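The classifier itself is a logistic regression over quantiles of the per-position quality scores. This hedged sketch shows the general shape on synthetic data; the feature quantiles and training data are invented for illustration and do not reproduce QQ-SNV's published model.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def quantile_features(quals, qs=(0.1, 0.25, 0.5, 0.75, 0.9)):
        """Summarize the quality scores supporting one candidate SNV by quantiles."""
        return np.quantile(quals, qs)

    rng = np.random.default_rng(0)
    # Synthetic training set: errors tend to carry lower quality scores than SNVs
    X = np.vstack([quantile_features(rng.normal(25 + 8 * y, 4, 200))
                   for y in [0, 1] * 150])
    y = np.array([0, 1] * 150)

    clf = LogisticRegression().fit(X, y)
    p = clf.predict_proba(X[:4])[:, 1]     # estimated SNV probabilities
    print((p >= 0.5).astype(int))          # default cutoff, cf. QQ-SNV(D)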
NASA Technical Reports Server (NTRS)
Kao, G. C.
1973-01-01
Method has been developed for predicting interaction between components and corresponding support structures subjected to acoustic excitations. Force environments determined in spectral form are called force spectra. Force-spectra equation is determined based on one-dimensional structural impedance model.
A Computer-Aided Abstracting Tool Kit.
ERIC Educational Resources Information Center
Craven, Timothy C.
1993-01-01
Reports on the development of a prototype computerized abstractor's assistant called TEXNET, a text network management system. Features of the system discussed include semantic dependency links; displays of text structure; basic text editing; extracting; weighting methods; and listings of frequent words. (Contains 25 references.) (LRW)
EVALUATION OF PUBLIC DATABASES AS SOURCES OF DATA FOR LIFE CYCLE ASSESSMENTS
Methods to determine the environmental effects of production systems must encourage a comprehensive evaluation of all "upstream" and "downstream" effects and their interrelationships. This cradle-to-grave approach, called Life Cycle Assessment (LCA), has led to the development...
ExpoCast: Exposure Science for Prioritization and Toxicity Testing (S)
The US EPA is completing the Phase I pilot for a chemical prioritization research program, called ToxCast. Here EPA is developing methods for using computational chemistry, high-throughput screening, and toxicogenomic technologies to predict potential toxicity and prioritize limi...
Swendeman, Dallas; Jana, Smarajit; Ray, Protim; Mindry, Deborah; Das, Madhushree; Bhakta, Bhumi
2015-01-01
This two-phase pilot study aimed to design, pilot, and refine an automated Interactive Voice Response (IVR) intervention to support antiretroviral adherence for people living with HIV (PLH), in Kolkata, India. Mixed-methods formative research included a community advisory board (CAB) for IVR message development, one-month pre-post pilot, post-pilot focus groups, and further message development. Two IVR calls are made daily, timed to patients’ dosing schedules, with brief messages (<1-minute) on strategies for self-management of three domains: medical (adherence, symptoms, co-infections), mental health (social support, stress, positive cognitions), and nutrition and hygiene (per PLH preferences). Three ART appointment reminders are also sent each month. One-month pilot results (n=46, 80% women, 60% sex workers) found significant increases in self-reported ART adherence, both within past three days (p=0.05) and time since missed last dose (p=0.015). Depression was common. Messaging content and assessment domains were expanded for testing in a randomized trial that is currently underway. PMID:25638037
What makes a contraceptive acceptable?
Berer, M
1995-01-01
The women's health movement is developing an increasing number of negative campaigns against various contraceptive methods based on three assumptions: 1) user-controlled methods are better for women than provider-controlled methods, 2) long-acting methods are undesirable because of their susceptibility to abuse, and 3) systemic methods carry unacceptable health risks to women. While these objections have sparked helpful debate, criticizing an overreliance on such methods is one thing; calling for bans on the provision of injectables and implants and on the development of vaccine contraceptives is another. Examination of the terms "provider-controlled," "user-controlled," and "long-acting" reveals that their definitions are not as clear-cut as opponents would have us believe. Some women's health advocates find the methods that are long-acting and provider-controlled to be the most problematic. They also criticize the near 100% contraceptive effectiveness of the long-acting methods despite the fact that the goal of contraception is to prevent pregnancy. It is wrong to condemn these methods because of their link to population control policies of the 1960s, and it is important to understand that long-acting, effective methods are often beneficial to women who require contraception for 20-22 years of their lives. Arguments against systemic methods (including RU-486 for early abortion and contraceptive vaccines) revolve around issues of safety. Feminists have gone so far as to create an intolerable situation by publishing books that criticize these methods based on erroneous conclusions and faulty scientific analysis. While women's health advocates have always rightly called for bans on abuse of various methods, they have not extended this ban to the methods themselves. In settings where other methods are not available, bans can lead to harm or maternal deaths. Another perspective can be used to consider methods in terms of their relationship with the user (repeated application). While feminists have called for more barrier and natural methods, most people in the world today refuse to use condoms even though they are the best protection from infection. Instead, science should pursue promising new methods as well as continue to improve existing methods and to fill important gaps. Feminists should be advocates for women and their diverse needs rather than advocates against specific contraceptive methods.
Summary of Research 1997, Department of Mechanical Engineering.
1999-01-01
...literature. If this could be done, a U.S. version of ORACLE (to be called DELPHI) could be developed and used. The result has been the development of a
Projection methods for line radiative transfer in spherical media.
NASA Astrophysics Data System (ADS)
Anusha, L. S.; Nagendra, K. N.
An efficient numerical method called the Preconditioned Bi-Conjugate Gradient (Pre-BiCG) method is presented for the solution of the radiative transfer equation in spherical geometry. A variant of this method, called the Stabilized Preconditioned Bi-Conjugate Gradient (Pre-BiCG-STAB) method, is also presented. These methods are based on projections onto subspaces of the n-dimensional Euclidean space $\mathbb{R}^n$ called Krylov subspaces. The methods are shown to be faster in terms of convergence rate than contemporary iterative methods such as Jacobi, Gauss-Seidel and Successive Over-Relaxation (SOR).
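As a generic illustration of the preconditioned, stabilized BiCG family (not the authors' radiative-transfer formulation), the following SciPy sketch solves a toy sparse nonsymmetric system with an incomplete-LU preconditioner; the matrix is an invented stand-in for a discretized transfer operator.

    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import bicgstab, spilu, LinearOperator

    # Toy nonsymmetric tridiagonal system standing in for a discretized operator
    n = 2000
    A = diags([-1.2, 2.5, -0.8], [-1, 0, 1], shape=(n, n), format="csc")
    b = np.ones(n)

    ilu = spilu(A)                                   # incomplete-LU preconditioner
    M = LinearOperator((n, n), matvec=ilu.solve)

    x, info = bicgstab(A, b, M=M)
    print(info, np.linalg.norm(A @ x - b))           # info == 0 means converged

The preconditioner plays the same accelerating role as in the Pre-BiCG methods above: it clusters the spectrum so the Krylov iteration converges in far fewer steps than Jacobi- or SOR-type sweeps.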
Determination of the transmission coefficients for quantum structures using FDTD method.
Peng, Yangyang; Wang, Xiaoying; Sui, Wenquan
2011-12-01
The purpose of this work is to develop a simple method to incorporate quantum effects into traditional finite-difference time-domain (FDTD) simulators, which could make it possible to co-simulate systems that include quantum structures and traditional components. In this paper, the tunneling transmission coefficient is calculated by solving the time-domain Schrödinger equation with a developed FDTD technique, called the FDTD-S method. To validate the feasibility of the method, a simple resonant tunneling diode (RTD) structure model has been simulated using the proposed method. The good agreement between the numerical and analytical results proves its accuracy. The effectiveness and accuracy of this approach make it a potential method for the analysis and design of hybrid systems that include quantum structures and traditional components.
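The core of any such scheme is explicit time-stepping of the Schrödinger equation on a grid. This hedged sketch leapfrogs the real and imaginary parts of the wavefunction through a single barrier (with hbar = m = 1); it illustrates the general FDTD-Schrödinger idea only, not the paper's specific FDTD-S formulation or its coupling to electromagnetic FDTD, and all grid and barrier parameters are invented.

    import numpy as np

    # 1D time-domain Schrodinger solver, hbar = m = 1; explicit leapfrog updates
    nx, dx = 400, 0.1
    dt = 0.2 * dx**2                          # explicit-scheme stability restriction
    x = np.arange(nx) * dx
    V = np.where((x > 20.0) & (x < 21.0), 1.0, 0.0)   # single tunneling barrier

    psi = np.exp(-0.5 * (x - 10.0)**2) * np.exp(1j * 1.5 * x)  # incident packet
    R, I = psi.real.copy(), psi.imag.copy()

    def lap(f):
        out = np.zeros_like(f)
        out[1:-1] = (f[2:] - 2.0 * f[1:-1] + f[:-2]) / dx**2
        return out                            # hard-wall boundaries at the edges

    for _ in range(5000):
        R += dt * (-0.5 * lap(I) + V * I)     # dR/dt =  Im(H psi)
        I += dt * ( 0.5 * lap(R) - V * R)     # dI/dt = -Re(H psi)

    prob = R**2 + I**2
    print("transmitted fraction ~", prob[x > 21.0].sum() / prob.sum())

Comparing the transmitted probability across packet energies gives a numerical transmission coefficient, which is the quantity the paper validates against analytical RTD results.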
NASA Astrophysics Data System (ADS)
Miller, Alina; Pertassek, Thomas; Steins, Andreas; Durner, Wolfgang; Göttlein, Axel; Petrik, Wolfgang; von Unold, Georg
2017-04-01
The particle-size distribution (PSD) is a key property of soils. The reference method for determining the PSD is based on gravitational sedimentation of particles in an initially homogeneous suspension. Traditional methods measure manually (i) the uplift of a floating body in the suspension at different times (Hydrometer method) or (ii) the mass of solids in extracted suspension aliquots at predefined sampling depths and times (Pipette method). Both methods lead to a disturbance of the sedimentation process and provide only discrete data of the PSD. Durner et al. (2017) recently developed a new automated method to determine particle-size distributions of soils and sediments from gravitational sedimentation (Durner, W., S.C. Iden, and G. von Unold: The integral suspension pressure method (ISP) for precise particle-size analysis by gravitational sedimentation, Water Resources Research, doi:10.1002/2016WR019830, 2017). The so-called integral suspension pressure (ISP) method estimates continuous PSDs from sedimentation experiments by recording the temporal evolution of the suspension pressure at a certain measurement depth in a sedimentation cylinder. It requires no manual interaction after start, and thus no specialized training of the lab personnel, and avoids any disturbance of the sedimentation process. The required technology to perform these experiments was developed by the UMS company, Munich, and is now available as an instrument called PARIO, distributed by the METER Group. In this poster, the basic functioning of PARIO is shown and key components and parameters of the technology are explained.
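The physics underneath all sedimentation-based PSD methods is Stokes' law, which maps a settling time at a given depth to a particle diameter. A small, hedged helper in SI units (quartz-like particle density and water-like viscosity are assumptions for illustration; the ISP method itself inverts the full pressure record rather than single times):

    def stokes_diameter(t, h=0.1, rho_s=2650.0, rho_f=1000.0, eta=1.0e-3, g=9.81):
        # Largest particle diameter (m) still above depth h (m) after time t (s),
        # from Stokes' law v = (rho_s - rho_f) * g * d**2 / (18 * eta)
        v = h / t                             # settling velocity needed to pass h
        return (18.0 * eta * v / ((rho_s - rho_f) * g)) ** 0.5

    for t in (60.0, 3600.0, 86400.0):         # 1 min, 1 h, 1 day
        print(f"t = {t:8.0f} s  ->  d = {stokes_diameter(t) * 1e6:7.2f} um")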
ERIC Educational Resources Information Center
O'Neill, D. Kevin; Guloy, Sheryl; Sensoy, Özlem
2014-01-01
To prepare students for participation in a pluralistic, democratic society, history curriculum should help them develop mature ideas about why multiple accounts of the same events exist. But how can we know if we are successful? In this article, we describe work on the design, validation, and piloting of a paper-and-pencil instrument called the…
Play in the Sandpit: A University and a Child-Care Center Collaborate in Facilitated-Action Research
ERIC Educational Resources Information Center
Jarrett, Olga; French-Lee, Stacey; Bulunuz, Nermin; Bulunuz, Mizrap
2010-01-01
Sand play commonly occupies children at preschools, child-development centers, and school and park playgrounds. The authors review the research on sand play and present a small study on outdoor sand play conducted at a university-based, child-development center using a method they call "facilitated-action research." This study had four…
ERIC Educational Resources Information Center
Boakes, Norma J.
2009-01-01
Within the study of geometry in the middle school curriculum is the natural development of students' spatial visualization, the ability to visualize two- and three-dimensional objects. The national mathematics standards call specifically for the development of such skills through hands-on experiences. A commonly accepted method is through the…
ERIC Educational Resources Information Center
Upitis, Rena; Brook, Julia
2017-01-01
Even though there are demonstrated benefits of using online tools to support student musicians, there is a persistent challenge of providing sufficient and effective professional development for independent music teachers to use such tools successfully. This paper describes several methods for helping teachers use an online tool called iSCORE,…
Bruce Shindler; Kristin Aldred Cheek; George H. Stankey
1999-01-01
As the Forest Service and the Bureau of Land Management turn toward ecosystem and adaptive models of forest stewardship, they are being called on to develop meaningful and lasting relations with citizens. These new management styles require not only improved strategies for public involvement but also methods to examine the interactions between citizens and agencies in...
Supercoherent states and physical systems
NASA Technical Reports Server (NTRS)
Fatyga, B. W.; Kostelecky, V. Alan; Nieto, Michael Martin; Truax, D. Rodney
1992-01-01
A method is developed for obtaining coherent states of a system admitting a supersymmetry. These states are called supercoherent states. The presented approach is based on an extension to supergroups of the usual group-theoretic approach. The example of the supersymmetric harmonic oscillator is discussed, thereby illustrating some of the attractive features of the method. Supercoherent states of an electron moving in a constant magnetic field are also described.
[Andragogy: reality or utopy].
Wautier, J L; Vileyn, F
2004-07-01
The education of adults differs from that of children, and the methods used should take into account that adults have specific goals and diverse knowledge. As the teaching of children is called pedagogy, the teaching of adults is now known as andragogy. Andragogy has led to the development of several approaches to improve continuing education. Several tools and methodologies have been created for adult education.
A method for locating Barred Owl (Strix varia) nests in the southern boreal forest of Saskatchewan
Shanna D. Frith; Kurt M. Mazur; Paul C. James
1997-01-01
Barred Owl (Strix varia) nests are often very difficult to locate. We developed a method for locating Barred Owl nests within the boreal forest of central Saskatchewan, Canada. During the nesting period, we located pairs of Barred Owls through call-playback surveys. We returned to the survey location at sunset and listened for vocalizations from the...
Elyasigomari, V; Lee, D A; Screen, H R C; Shaheed, M H
2017-03-01
For each cancer type, only a few genes are informative. Due to the so-called 'curse of dimensionality' problem, the gene selection task remains a challenge. To overcome this problem, we propose a two-stage gene selection method called MRMR-COA-HS. In the first stage, the minimum redundancy and maximum relevance (MRMR) feature selection is used to select a subset of relevant genes. The selected genes are then fed into a wrapper setup that combines a new algorithm, COA-HS, using the support vector machine as a classifier. The method was applied to four microarray datasets, and the performance was assessed by the leave-one-out cross-validation method. Comparative performance assessment of the proposed method with other evolutionary algorithms suggested that the proposed algorithm significantly outperforms other methods in selecting fewer genes while maintaining the highest classification accuracy. The functions of the selected genes were further investigated, and it was confirmed that the selected genes are biologically relevant to each cancer type.
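The MRMR first stage admits a compact greedy implementation: repeatedly pick the gene with the highest relevance to the class label minus its average redundancy with genes already chosen. A hedged scikit-learn sketch (binned mutual information as the redundancy estimate is an illustrative choice, and the COA-HS wrapper stage is not shown):

    import numpy as np
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.metrics import mutual_info_score

    def bin_(v, bins=8):
        """Discretize one continuous feature for mutual-information estimates."""
        return np.digitize(v, np.histogram_bin_edges(v, bins))

    def mrmr(X, y, k):
        """Greedy MRMR: max relevance to y, min mean redundancy with chosen set."""
        relevance = mutual_info_classif(X, y, random_state=0)
        selected = [int(np.argmax(relevance))]
        while len(selected) < k:
            scores = {}
            for j in set(range(X.shape[1])) - set(selected):
                red = np.mean([mutual_info_score(bin_(X[:, j]), bin_(X[:, s]))
                               for s in selected])
                scores[j] = relevance[j] - red
            selected.append(max(scores, key=scores.get))
        return selected

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 120)
    X = rng.normal(size=(120, 30))
    X[:, 3] += 2.0 * y; X[:, 7] += 2.0 * y     # two informative "genes"
    print(mrmr(X, y, k=3))                     # expect 3 and 7 among the picks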
Detection of Cutting Tool Wear using Statistical Analysis and Regression Model
NASA Astrophysics Data System (ADS)
Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin
2010-10-01
This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used for developing a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a CNC turning machine, a Colchester Master Tornado T4, in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear land (VB) was determined. A regression model was developed from this relationship, and the results of the regression model show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
Li, John; Maclehose, Rich; Smith, Kirk; Kaehler, Dawn; Hedberg, Craig
2011-01-01
Foodborne illness surveillance based on consumer complaints detects outbreaks by finding common exposures among callers, but this process is often difficult. Laboratory testing of ill callers could also help identify potential outbreaks. However, collection of stool samples from all callers is not feasible. Methods to help screen calls for etiology are needed to increase the efficiency of complaint surveillance systems and increase the likelihood of detecting foodborne outbreaks caused by Salmonella. Data from the Minnesota Department of Health foodborne illness surveillance database (2000 to 2008) were analyzed. Complaints with identified etiologies were examined to create a predictive model for Salmonella. Bootstrap methods were used to internally validate the model. Seventy-one percent of complaints in the foodborne illness database with known etiologies were due to norovirus. The predictive model had a good discriminatory ability to identify Salmonella calls. Three cutoffs for the predictive model were tested: one that maximized sensitivity, one that maximized specificity, and one that maximized predictive ability, providing sensitivities and specificities of 32 and 96%, 100 and 54%, and 89 and 72%, respectively. Development of a predictive model for Salmonella could help screen calls for etiology. The cutoff that provided the best predictive ability for Salmonella corresponded to a caller reporting diarrhea and fever with no vomiting, and five or fewer people ill. Screening calls for etiology would help identify complaints for further follow-up and result in identifying Salmonella cases that would otherwise go unconfirmed; in turn, this could lead to the identification of more outbreaks.
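The best-performing cutoff reported above corresponds to a simple triage rule that can be applied while a complaint call is being taken. A hedged illustration (the fitted model's coefficients are not given in the abstract, so only the stated decision rule is encoded):

    def flag_possible_salmonella(diarrhea: bool, fever: bool,
                                 vomiting: bool, n_ill: int) -> bool:
        """Screening rule matching the reported best-predictive cutoff:
        diarrhea and fever, no vomiting, and five or fewer people ill."""
        return diarrhea and fever and not vomiting and n_ill <= 5

    print(flag_possible_salmonella(True, True, False, 3))   # True -> follow up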
ERIC Educational Resources Information Center
Levy, Mike
2015-01-01
The article considers the role of qualitative research methods in CALL through describing a series of examples. These examples are used to highlight the importance and value of qualitative data in relation to a specific research objective in CALL. The use of qualitative methods in conjunction with other approaches as in mixed method research…
A new method for enhancer prediction based on deep belief network.
Bu, Hongda; Gan, Yanglan; Wang, Yang; Zhou, Shuigeng; Guan, Jihong
2017-10-16
Studies have shown that enhancers are significant regulatory elements that play crucial roles in gene expression regulation. Since enhancers are unrelated to the orientation of, and distance to, their target genes, accurately predicting distal enhancers remains a challenge for researchers. In past years, with the development of high-throughput ChIP-seq technologies, several computational techniques have emerged to predict enhancers using epigenetic or genomic features. Nevertheless, the inconsistency of computational models across different cell lines and the unsatisfactory prediction performance call for further research in this area. Here, we propose a new Deep Belief Network (DBN) based computational method for enhancer prediction, which is called EnhancerDBN. This method combines diverse features, composed of DNA sequence compositional features, DNA methylation and histone modifications. Our computational results indicate that 1) EnhancerDBN outperforms 13 existing methods in prediction, and 2) GC content and DNA methylation can serve as relevant features for enhancer prediction. Deep learning is effective in boosting the performance of enhancer prediction.
Electron microscopy methods in studies of cultural heritage sites
NASA Astrophysics Data System (ADS)
Vasiliev, A. L.; Kovalchuk, M. V.; Yatsishina, E. B.
2016-11-01
The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient "nanotechnologies"; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.
High-energy evolution to three loops
NASA Astrophysics Data System (ADS)
Caron-Huot, Simon; Herranen, Matti
2018-02-01
The Balitsky-Kovchegov equation describes the high-energy growth of gauge theory scattering amplitudes as well as nonlinear saturation effects which stop it. We obtain the three-loop corrections to the equation in planar N = 4 super Yang-Mills theory. Our method exploits a recently established equivalence with the physics of soft wide-angle radiation, so-called non-global logarithms, and thus yields at the same time the three-loop evolution equation for non-global logarithms. As a by-product of our analysis, we develop a Lorentz-covariant method to subtract infrared and collinear divergences in cross-section calculations in the planar limit. We compare our result in the linear regime with a recent prediction for the so-called Pomeron trajectory, and compare its collinear limit with predictions from the spectrum of twist-two operators.
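For orientation only, the leading-order Balitsky-Kovchegov equation to which these corrections apply is standard and can be written, in the planar/large-$N_c$ limit with $\bar{\alpha}_s = \alpha_s N_c / \pi$, as:

\[
\frac{\partial N_{xy}}{\partial Y}
= \frac{\bar{\alpha}_{s}}{2\pi} \int d^{2}z\,
\frac{(x-y)^{2}}{(x-z)^{2}(z-y)^{2}}
\bigl[ N_{xz} + N_{zy} - N_{xy} - N_{xz} N_{zy} \bigr],
\]

where $N_{xy}$ is the dipole scattering amplitude and $Y$ the rapidity; the nonlinear $N_{xz} N_{zy}$ term supplies the saturation mentioned in the abstract, and the paper's three-loop result extends this kernel.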
ERIC Educational Resources Information Center
Dik, Bryan J.; Eldridge, Brandy M.; Steger, Michael F.; Duffy, Ryan D.
2012-01-01
Research on work as a calling is limited by measurement concerns. In response, the authors introduce the multidimensional Calling and Vocation Questionnaire (CVQ) and the Brief Calling scale (BCS), instruments assessing presence of, and search for, a calling. Study 1 describes CVQ development using exploratory and confirmatory factor analysis…
DOT National Transportation Integrated Search
2015-08-01
This research developed a smartphone application called ORcycle to collect cyclists' routes, users, and comfort levels. ORcycle combines GPS revealed route data collection with new questionnaires that try to elicit cyclists' attitudes as well...
THE ONSITE ON-LINE CALCULATORS AND TRAINING FOR SUBSURFACE CONTAMINANT TRANSPORT SITE ASSESSMENT
EPA has developed a suite of on-line calculators called "OnSite" for assessing transport of environmental contaminants in the subsurface. The purpose of these calculators is to provide methods and data for common calculations used in assessing impacts from subsurface contaminatio...
Science and General Education: Reactions From A Generalist
ERIC Educational Resources Information Center
McQuigg, R. Bruce
1972-01-01
Author says that science courses that develop contempt for and hostility toward science are doing more harm than good to many students, and calls on curriculum makers to see that respect for the scientific method is engendered by translating it from the lab to life itself. (Editor)
Physical activity problem-solving inventory for adolescents: Development and initial validation
USDA-ARS?s Scientific Manuscript database
Youth encounter physical activity barriers, often called problems. The purpose of problem-solving is to generate solutions to overcome the barriers. Enhancing problem-solving ability may enable youth to be more physically active. Therefore, a method for reliably assessing physical activity problem-s...
Applied Epistemology and Understanding in Information Studies
ERIC Educational Resources Information Center
Gorichanaz, Tim
2017-01-01
Introduction: Applied epistemology allows information studies to benefit from developments in philosophy. In information studies, epistemic concepts are rarely considered in detail. This paper offers a review of several epistemic concepts, focusing on understanding, as a call for further work in applied epistemology in information studies. Method:…
Improving Students' Diagram Comprehension with Classroom Instruction
ERIC Educational Resources Information Center
Cromley, Jennifer G.; Perez, Tony C.; Fitzhugh, Shannon L.; Newcombe, Nora S.; Wills, Theodore W.; Tanaka, Jacqueline C.
2013-01-01
The authors tested whether students can be taught to better understand conventional representations in diagrams, photographs, and other visual representations in science textbooks. The authors developed a teacher-delivered, workbook-and-discussion-based classroom instructional method called Conventions of Diagrams (COD). The authors trained 1…
Sato, Kuniya; Ooba, Masahiro; Takagi, Tomohiko; Furukawa, Zengo; Komiya, Seiichi; Yaegashi, Rihito
2013-12-01
Agile software development elicits requirements through direct discussion between customers and development staff in each iteration, and the customers evaluate the appropriateness of the requirements. If the customers divide a complicated requirement into individual requirements, the engineer who is in charge of software development can understand it easily. This is called division of requirements. However, the customers do not know how much, or how, to divide the requirements. This paper proposes a method to divide a complicated requirement into individual requirements. It also describes the development of a requirement specification editor that can describe individual requirements. With it, the engineer who is in charge of software development can understand requirements easily.
ERIC Educational Resources Information Center
Yirmiya, Nurit; Gamliel, Ifat; Pilowsky, Tammy; Feldman, Ruth; Baron-Cohen, Simon; Sigman, Marian
2006-01-01
Aims: To compare siblings of children with autism (SIBS-A) and siblings of children with typical development (SIBS-TD) at 4 and 14 months of age. Methods: At 4 months, mother-infant interactional synchrony during free play, infant gaze and affect during the still-face paradigm, and infant responsiveness to a name-calling paradigm were examined (n…
Static Methods in the Design of Nonlinear Automatic Control Systems,
1984-06-27
Chapter VI. Ways of Decreasing the Number of Statistical Nodes During the Research of Nonlinear Systems... at present occupies the central place. This region of research was called the statistical dynamics of nonlinear automatic control systems... receives further development in the numerous research of Soviet and foreign scientists. A special role in the development of the statistical dynamics of
Use of agents to implement an integrated computing environment
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application to both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that they be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology independent computing infrastructure will aid the designer in systematically generating knowledge used to facilitate decision-making.
2016-07-08
Systems Using Automata Theory and Barrier Certificates. We developed a sound but incomplete method for the computational verification of specifications... method merges ideas from automata-based model checking with those from control theory including so-called barrier certificates and optimization-based... "Automata theory meets barrier certificates: Temporal logic verification of nonlinear systems," IEEE Transactions on Automatic Control, 2015. [J2] R
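For context, a barrier certificate in its simplest (autonomous, non-stochastic) form is a function $B$ separating the initial set $X_0$ from the unsafe set $X_u$ of a system $\dot{x} = f(x)$. One common sufficient condition, quoted for orientation rather than from the report above, is:

\[
B(x) \le 0 \;\; \forall x \in X_{0}, \qquad
B(x) > 0 \;\; \forall x \in X_{u}, \qquad
\nabla B(x) \cdot f(x) \le 0 \;\; \forall x : B(x) = 0 .
\]

If such a $B$ exists, no trajectory starting in $X_0$ can reach $X_u$, since entering the unsafe set would require $B$ to increase through zero; searching for a polynomial $B$ is what makes the optimization-based formulation mentioned above possible.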
ERIC Educational Resources Information Center
Thurgood, Larry L.
2010-01-01
A mixed methods study examined how a newly developed campus-wide framework for learning and teaching, called the Learning Model, was accepted and embraced by faculty members at Brigham Young University-Idaho from September 2007 to January 2009. Data from two administrations of the Approaches to Teaching Inventory showed that (a) faculty members…
ERIC Educational Resources Information Center
Manzo, Rosa D.; Whent, Linda; Liets, Lauren; de la Torre, Adela; Gomez-Camacho, Rosa
2016-01-01
This study examined how science teachers' knowledge of research methods, neuroscience and drug addiction changed through their participation in a 5-day summer science institute. The data for this study evolved from a four-year NIH funded science education project called Addiction Research and Investigation for Science Educators (ARISE). Findings…
Resource understanding: a challenge to aerial methods
Udall, Stewart L.
1965-01-01
Aerial survey methods are speeding acquisition of survey data needed to provide and manage the nation's resources. These methods have been applied to topographic mapping for a number of years and the record clearly shows their advantages in terms of cost and speed in contrast to the ground methods that have been historically employed. Limited use is now being made of aerial methods to assist cadastral surveys, in location, acquisition and development of National Parks, in mapping the geology of the nation, in locating and developing water resources, and in surveys of the oceans. It is the purpose of this paper to call attention to these uses and to encourage the scientific community to further refine aerial methods so that their use may be increased and the veracity of data improved.
Two modeling strategies for empirical Bayes estimation
Efron, Bradley
2014-01-01
Empirical Bayes methods use the data from parallel experiments, for instance observations $X_k \sim \mathcal{N}(\Theta_k, 1)$ for $k = 1, 2, \ldots, N$, to estimate the conditional distributions $\Theta_k \mid X_k$. There are two main estimation strategies: modeling on the $\theta$ space, called “g-modeling” here, and modeling on the $x$ space, called “f-modeling.” The two approaches are described and compared. A series of computational formulas are developed to assess their frequentist accuracy. Several examples, both contrived and genuine, show the strengths and limitations of the two strategies. PMID:25324592
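A convenient anchor for the f-modeling side is Tweedie's formula, which expresses the posterior mean directly through the marginal density $f$ of the observations; stated here for the unit-variance normal model above:

\[
\mathbb{E}[\Theta_k \mid X_k = x] = x + \frac{d}{dx} \log f(x),
\]

so an estimate of the marginal $f$, rather than of the prior $g$, suffices for posterior-mean estimation; g-modeling instead fits the prior and deconvolves.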
Properties of fiber reinforced plastics under static and dynamic loadings
NASA Astrophysics Data System (ADS)
Kudinov, Vladimir V.; Korneeva, Natalia V.
2016-05-01
A method for investigating the impact toughness of anisotropic polymer composite materials (reinforced plastics) has been developed, using a composite-material model sample in a microplastic configuration and a pendulum-type impact testing machine under static and dynamic loadings. The method is called "Break by Impact" (Impact Break, IB). Estimation of the impact resistance of CFRP by this method showed that when the loading velocity increases ~10⁴ times, the largest changes occur in the impact toughness and deformation ability of the material.
Polyadenylation site prediction using PolyA-iEP method.
Kavakiotis, Ioannis; Tzanis, George; Vlahavas, Ioannis
2014-01-01
This chapter presents a method called PolyA-iEP that has been developed for the prediction of polyadenylation sites. More precisely, PolyA-iEP is a method that recognizes mRNA 3' ends which contain polyadenylation sites. It is a modular system consisting of two main components: the first exploits the advantages of emerging patterns, and the second is a distance-based scoring method. The outputs of the two components are finally combined by a classifier. The final results reach very high sensitivity and specificity.
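The two-components-plus-classifier design can be pictured as simple stacking; the sketch below uses stub scoring functions and a logistic-regression combiner, all invented for illustration rather than taken from PolyA-iEP.

```python
# Two component scores (stubs standing in for the emerging-pattern score
# and the distance-based score) combined by a final classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

def emerging_pattern_score(seqs):   # stand-in for component 1
    return np.array([s.count("AATAAA") for s in seqs], dtype=float)

def distance_score(seqs):           # stand-in for component 2
    return np.array([len(s) % 7 for s in seqs], dtype=float)

seqs = ["AATAAACCG", "GGGCCCTTA", "AATAAATTAATAAA", "CGCGCGCG"]
y = np.array([1, 0, 1, 0])          # 1 = contains a polyadenylation site (toy)

X = np.column_stack([emerging_pattern_score(seqs), distance_score(seqs)])
clf = LogisticRegression().fit(X, y)   # the combining classifier
print(clf.predict(X))
```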
Application of integrated fluid-thermal-structural analysis methods
NASA Technical Reports Server (NTRS)
Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken
1988-01-01
Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response, creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in a single analyzer called LIFTS (Langley Integrated Fluid-Thermal-Structural analyzer). The evolution and status of LIFTS are reviewed and illustrated through applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saad, Yousef
2014-03-19
The master project under which this work is funded had as its main objective to develop computational methods for modeling electronic excited-state and optical properties of various nanostructures. The specific goals of the computer science group were primarily to develop effective numerical algorithms in Density Functional Theory (DFT) and Time Dependent Density Functional Theory (TDDFT). There were essentially four distinct stated objectives. The first objective was to study and develop effective numerical algorithms for solving large eigenvalue problems such as those that arise in Density Functional Theory (DFT) methods. The second objective was to explore so-called linear scaling methods, or methods that avoid diagonalization. The third was to develop effective approaches for Time-Dependent DFT (TDDFT). Our fourth and final objective was to examine effective solution strategies for other problems in electronic excitations, such as the GW/Bethe-Salpeter method, and quantum transport problems.
Theory of the development of alternans in the heart during controlled diastolic interval pacing
NASA Astrophysics Data System (ADS)
Otani, Niels F.
2017-09-01
The beat-to-beat alternation in action potential durations (APDs) in the heart, called APD alternans, has been linked to the development of serious cardiac rhythm disorders, including ventricular tachycardia and fibrillation. The length of the period between action potentials, called the diastolic interval (DI), is a key dynamical variable in the standard theory of alternans development. Thus, methods that control the DI may be useful in preventing dangerous cardiac rhythms. In this study, we examine the dynamics of alternans during controlled-DI pacing using a series of single-cell and one-dimensional (1D) fiber models of alternans dynamics. We find that a model that combines a so-called memory model with a calcium cycling model can reasonably explain two key experimental results: the possibility of alternans during constant-DI pacing and the phase lag of APDs behind DIs during sinusoidal-DI pacing. We also find that these results can be replicated by incorporating the memory model into an amplitude equation description of a 1D fiber. The 1D fiber result is potentially concerning because it seems to suggest that constant-DI control of alternans can be effective over only a limited region in space.
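The standard theory referenced here rests on the restitution map APD_{n+1} = f(DI_n) with DI_n = BCL − APD_n under constant-cycle-length pacing; alternans appears when the slope |f′| exceeds 1 at the fixed point, whereas under constant-DI pacing the memoryless map gives a constant APD — which is why memory or calcium dynamics must be added to explain the experiments. A sketch with an assumed exponential restitution curve:

```python
# Iterating the restitution map APD_{n+1} = f(DI_n), DI_n = BCL - APD_n.
# The curve parameters are assumed and chosen so that |f'| > 1 at the
# fixed point, producing a sustained long-short (alternans) pattern.
import numpy as np

def f(di):                       # assumed exponential restitution curve
    return 300.0 - 243.3 * np.exp(-di / 50.0)

bcl = 310.0                      # basic cycle length (ms)
apd = 250.0
history = []
for _ in range(40):
    di = bcl - apd               # diastolic interval
    apd = f(di)                  # next action potential duration
    history.append(apd)

print(history[-4:])              # alternating long-short APDs = alternans
```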
Setting technical standards for visual assessment procedures
Kenneth H. Craik; Nickolaus R. Feimer
1979-01-01
Under the impetus of recent legislative and administrative mandates concerning analysis and management of the landscape, governmental agencies are being called upon to adopt or develop visual resource and impact assessment (VRIA) systems. A variety of techniques that combine methods of psychological assessment and landscape analysis to serve these purposes is being...
EVALUATION OF THE HIGH VOLUME COLLECTION SYSTEM (HVCS) FOR QUANTIFYING FUGITIVE ORGANIC VAPOR LEAKS
The report discusses a recently developed measurements technique that offers the potential for providing an easy-to-use and cost effective means to directly measure organic vapor leaks. The method, called High Volume Collection System (HVCS), uses a high volume sampling device an...
Creative Thinking in Schools: Finding the "Just Right" Challenge for Students
ERIC Educational Resources Information Center
Fletcher, Tina Sue
2011-01-01
Spurred on by explosive technological developments and unprecedented access to information, leaders in the fields of business, industry, and education are all calling for creative, innovative workers. In an atmosphere of high-stakes testing and global competitiveness, educators around the world are examining their teaching methods to determine…
A Phenomenological Study of Undergraduate Instructors Using the Inverted or Flipped Classroom Model
ERIC Educational Resources Information Center
Brown, Anna F.
2012-01-01
The changing educational needs of undergraduate students have not been addressed with a corresponding development of instructional methods in higher education classrooms. This study used a phenomenological approach to investigate a classroom-based instructional model called the "inverted" or "flipped" classroom. The flipped…
Expansive Visibilization to Stimulate EFL Teacher Reflection
ERIC Educational Resources Information Center
Ito, Ryu
2012-01-01
Despite the growing popularity of action research, bridging the gap between data collection and reflective data analysis still lacks a well-developed methodology. As a supplement to the traditional action research procedure for language teaching, I adopted a method called expansive visibilization (EV), which has the potential to be a reflective…
In 2002 the National Research Council (NRC) issued a report which identified a number of issues regarding biosolids land application practices and pointed out the need for improved and validated analytical techniques for regulated indicator organisms and pathogens. They also call...
Defining and Measuring Literacy: Facing the Reality
ERIC Educational Resources Information Center
Ahmed, Manzoor
2011-01-01
Increasing recognition of a broadened concept of literacy challenges policy-makers and practitioners to re-define literacy operationally, develop and apply appropriate methods of assessing literacy and consider and act upon the consequent policy implications. This task is given a new urgency by the call of the Belem Framework for Action to…
A Dynamic Alternative to the Scientific Method
ERIC Educational Resources Information Center
Musante, Susan
2009-01-01
Scotchmoor and a team of natural scientists, social scientists, philosophers, and educators developed a Web site called Understanding Science ("www.understandingscience.org") to explain to teachers, students, and the general public "how science "really" works." The site, launched in January 2009 and funded by the National Science Foundation,…
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.
The development of the deterministic nonlinear PDEs in particle physics to stochastic case
NASA Astrophysics Data System (ADS)
Abdelrahman, Mahmoud A. E.; Sohaly, M. A.
2018-06-01
In the present work, an accurate method called the Riccati-Bernoulli sub-ODE technique is used for solving the deterministic and stochastic cases of the Phi-4 equation and the nonlinear Foam Drainage equation. The control of the randomness input is also studied for the stability of the stochastic process solution.
Techniques for calling sapsuckers and finding their nesting territories
Francis M. Rushmore
1973-01-01
In 9 years of study in Maine, the author developed a survey method for estimating populations of yellow-bellied sapsuckers (Sphyrapicus varius varius L.) and delineating their territories. The birds were attracted for study by tapping hardwood dowels to imitate noises the birds make in drumming and feeding.
Survival analysis, or what to do with upper limits in astronomical surveys
NASA Technical Reports Server (NTRS)
Isobe, Takashi; Feigelson, Eric D.
1986-01-01
A field of applied statistics called survival analysis has been developed over several decades to deal with censored data, which occur in astronomical surveys when objects are too faint to be detected. How these methods can assist in the statistical interpretation of astronomical data is reviewed.
Preparing to Be Allies: Narratives of Non-Indigenous Researchers Working in Indigenous Contexts
ERIC Educational Resources Information Center
Brophey, Alison; Raptis, Helen
2016-01-01
Insensitive research approaches have resulted in damaged relationships between non-Indigenous researchers and Indigenous communities, prompting scholars and funding agencies to call for more culturally compatible research methods. This paper addresses the qualities, skills and knowledge developed by six non-Indigenous researchers as they…
Stuttering and Language Ability in Children: Questioning the Connection
ERIC Educational Resources Information Center
Nippold, Marilyn A.
2012-01-01
Purpose: This article explains why it is reasonable to question the view that stuttering and language ability in children are linked--the so-called "stuttering-language connection." Method: Studies that focused on syntactic, morphologic, and lexical development in children who stutter (CWS) are examined for evidence to support the following…
Automated surveillance of 911 call data for detection of possible water contamination incidents
2011-01-01
Background Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detection of illness from fast-acting chemicals has not been an emphasis. Methods An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5-year period were studied. Results During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. Conclusions The 911 surveillance system provides timely notification of possible public health events, but did have limitations. While the alarms contained incident codes and location of the caller, additional information such as medical status was not available to assist in validating the cause of the alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event. PMID:21450105
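As a rough illustration of the detection principle, the sketch below evaluates a purely temporal, Poisson-based scan over recent call-count windows; the real system scans jointly over space and time, and all numbers here are invented.

```python
# Simplified temporal analogue of the scan statistic: for each candidate
# window after a baseline period, compare observed call count c to the
# Poisson expectation e via the log-likelihood ratio c*log(c/e) - (c - e),
# evaluated only when c > e (an excess of calls).
import math

hourly_calls = [3, 2, 4, 3, 2, 3, 4, 2, 3, 11, 13, 12]   # toy counts
baseline = sum(hourly_calls[:8]) / 8.0                    # expected calls/hour

best = (0.0, None)
for width in (1, 2, 3):                                   # window lengths
    for start in range(len(hourly_calls) - width, 7, -1): # post-baseline windows
        c = sum(hourly_calls[start:start + width])
        e = baseline * width
        if c > e:
            llr = c * math.log(c / e) + (e - c)
            best = max(best, (llr, (start, width)))

print("strongest cluster (LLR, (start, width)):", best)
```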
2010-01-01
Background Telephone hotlines designed to address common concerns in the early postpartum could be a useful resource for parents. Our aim was to test the feasibility of using a telephone as an intervention in a randomized controlled trial. We also aimed to test the use of algorithms to address parental concerns through a telephone hotline. Methods Healthy first-time mothers were recruited from postpartum wards of hospitals throughout Lebanon. Participants were given the number of a 24-hour telephone hotline that they could access for the first four months after delivery. Calls were answered by a midwife using algorithms developed by the study team whenever possible. Callers with medical complaints were referred to their physicians. Call patterns and content were recorded and analyzed. Results Eighty-four of the 353 women enrolled (24%) used the hotline. Sixty percent of the women who used the service called more than once, and all callers reported they were satisfied with the service. The midwife received an average of three calls per day and most calls occurred during the first four weeks postpartum. Our algorithms were used to answer questions in 62.8% of calls and 18.6% of calls required referral to a physician. Of the questions related to mothers, 66% were about breastfeeding. Sixty percent of questions related to the infant were about routine care and 23% were about excessive crying. Conclusions Utilization of a telephone hotline service for postpartum support is highest in the first four weeks postpartum. Most questions are related to breastfeeding, routine newborn care, and management of a fussy infant. It is feasible to test a telephone hotline as an intervention in a randomized controlled trial. Algorithms can be developed to provide standardized answers to the most common questions. PMID:20946690
Theory and computation of optimal low- and medium-thrust transfers
NASA Technical Reports Server (NTRS)
Chuang, C.-H.
1994-01-01
This report describes the current state of development of methods for calculating optimal orbital transfers with large numbers of burns. Reported on first is the homotopy-motivated, so-called direction correction method. So far this method has been partially tested with one solver; the final step has yet to be implemented. Second is the patched transfer method, which is rooted in some simplifying approximations made to the original optimal control problem. The transfer is broken up into single-burn segments, each single-burn segment solved as a predictor step and the whole problem then solved with a corrector step.
Brasil, Christiane Regina Soares; Delbem, Alexandre Claudio Botazzo; da Silva, Fernando Luís Barroso
2013-07-30
This article focuses on the development of an approach for ab initio protein structure prediction (PSP) without using any earlier knowledge from similar protein structures, as fragment-based statistics or inference of secondary structures. Such an approach is called purely ab initio prediction. The article shows that well-designed multiobjective evolutionary algorithms can predict relevant protein structures in a purely ab initio way. One challenge for purely ab initio PSP is the prediction of structures with β-sheets. To work with such proteins, this research has also developed procedures to efficiently estimate hydrogen bond and solvation contribution energies. Considering van der Waals, electrostatic, hydrogen bond, and solvation contribution energies, the PSP is a problem with four energetic terms to be minimized. Each interaction energy term can be considered an objective of an optimization method. Combinatorial problems with four objectives have been considered too complex for the available multiobjective optimization (MOO) methods. The proposed approach, called "Multiobjective evolutionary algorithms with many tables" (MEAMT), can efficiently deal with four objectives through the combination thereof, performing a more adequate sampling of the objective space. Therefore, this method can better map the promising regions in this space, predicting structures in a purely ab initio way. In other words, MEAMT is an efficient optimization method for MOO, which explores simultaneously the search space as well as the objective space. MEAMT can predict structures with one or two domains with RMSDs comparable to values obtained by recently developed ab initio methods (GAPFCG, I-PAES, and Quark) that use different levels of earlier knowledge. Copyright © 2013 Wiley Periodicals, Inc.
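The four minimized energy terms make each conformation a four-dimensional objective vector; the core MOO comparison is Pareto dominance, sketched below with invented energy values (this is the building block, not the MEAMT algorithm itself).

```python
# Pareto-dominance check for four minimized energy terms
# (van der Waals, electrostatic, hydrogen bond, solvation).
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Keep conformations not dominated by any other."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

energies = [(-120.0, -55.0, -30.0, -12.0),
            (-118.0, -60.0, -28.0, -15.0),
            (-121.0, -54.0, -31.0, -11.0),
            (-100.0, -40.0, -20.0,  -5.0)]   # last one is clearly worse
print(pareto_front(energies))                # first three survive
```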
Scientific Teaching: Defining a Taxonomy of Observable Practices
Couch, Brian A.; Brown, Tanya L.; Schelpat, Tyler J.; Graham, Mark J.; Knight, Jennifer K.
2015-01-01
Over the past several decades, numerous reports have been published advocating for changes to undergraduate science education. These national calls inspired the formation of the National Academies Summer Institutes on Undergraduate Education in Biology (SI), a group of regional workshops to help faculty members learn and implement interactive teaching methods. The SI curriculum promotes a pedagogical framework called Scientific Teaching (ST), which aims to bring the vitality of modern research into the classroom by engaging students in the scientific discovery process and using student data to inform the ongoing development of teaching methods. With the spread of ST, the need emerges to systematically define its components in order to establish a common description for education researchers and practitioners. We describe the development of a taxonomy detailing ST’s core elements and provide data from classroom observations and faculty surveys in support of its applicability within undergraduate science courses. The final taxonomy consists of 15 pedagogical goals and 37 supporting practices, specifying observable behaviors, artifacts, and features associated with ST. This taxonomy will support future educational efforts by providing a framework for researchers studying the processes and outcomes of ST-based course transformations as well as a concise guide for faculty members developing classes. PMID:25713097
NASA Technical Reports Server (NTRS)
Pappa, Richard S. (Technical Monitor); Black, Jonathan T.
2003-01-01
This report discusses the development and application of metrology methods called photogrammetry and videogrammetry that make accurate measurements from photographs. These methods have been adapted for the static and dynamic characterization of gossamer structures, as four specific solar sail applications demonstrate. The applications prove that high-resolution, full-field, non-contact static measurements of solar sails using dot projection photogrammetry are possible as well as full-field, non-contact, dynamic characterization using dot projection videogrammetry. The accuracy of the measurement of the resonant frequencies and operating deflection shapes that were extracted surpassed expectations. While other non-contact measurement methods exist, they are not full-field and require significantly more time to take data.
An ontology-based nurse call management system (oNCS) with probabilistic priority assessment
2011-01-01
Background The current, place-oriented nurse call systems are very static. A patient can only make calls with a button which is fixed to a wall of a room. Moreover, the system does not take into account various factors specific to a situation. In the future, there will be an evolution to a mobile button for each patient so that they can walk around freely and still make calls. The system would become person-oriented and the available context information should be taken into account to assign the correct nurse to a call. The aim of this research is (1) the design of a software platform that supports the transition to mobile and wireless nurse call buttons in hospitals and residential care and (2) the design of a sophisticated nurse call algorithm. This algorithm dynamically adapts to the situation at hand by taking the profile information of staff members and patients into account. Additionally, the priority of a call probabilistically depends on the risk factors, assigned to a patient. Methods The ontology-based Nurse Call System (oNCS) was developed as an extension of a Context-Aware Service Platform. An ontology is used to manage the profile information. Rules implement the novel nurse call algorithm that takes all this information into account. Probabilistic reasoning algorithms are designed to determine the priority of a call based on the risk factors of the patient. Results The oNCS system is evaluated through a prototype implementation and simulations, based on a detailed dataset obtained from Ghent University Hospital. The arrival times of nurses at the location of a call, the workload distribution of calls amongst nurses and the assignment of priorities to calls are compared for the oNCS system and the current, place-oriented nurse call system. Additionally, the performance of the system is discussed. Conclusions The execution time of the nurse call algorithm is on average 50.333 ms. Moreover, the oNCS system significantly improves the assignment of nurses to calls. Calls generally have a nurse present faster and the workload-distribution amongst the nurses improves. PMID:21294860
The cardiac muscle duplex as a method to study myocardial heterogeneity
Solovyova, O.; Katsnelson, L.B.; Konovalov, P.V.; Kursanov, A.G.; Vikulova, N.A.; Kohl, P.; Markhasin, V.S.
2014-01-01
This paper reviews the development and application of paired muscle preparations, called duplex, for the investigation of mechanisms and consequences of intra-myocardial electro-mechanical heterogeneity. We illustrate the utility of the underlying combined experimental and computational approach for conceptual development and integration of basic science insight with clinically relevant settings, using previously published and new data. Directions for further study are identified. PMID:25106702
Modeling crime events by d-separation method
NASA Astrophysics Data System (ADS)
Aarthee, R.; Ezhilmaran, D.
2017-11-01
Problematic legal cases have recently called for a scientifically founded method of dealing with the qualitative and quantitative roles of evidence in a case [1]. To deal with the quantitative role, we proposed a d-separation method for modeling crime events. D-separation is a graphical criterion for identifying independence in a directed acyclic graph. By developing a d-separation method, we aim to lay the foundations for the development of a software support tool that can deal with evidential reasoning in legal cases. Such a tool is meant to be used by a judge or juror, in alliance with various experts who can provide information about the details. This will hopefully improve the communication between judges or jurors and experts. The proposed method can be used to uncover more valid independencies than any other graphical criterion.
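D-separation of X and Z given S can be tested with the classic moralization criterion: restrict the DAG to the ancestors of X ∪ Z ∪ S, marry co-parents, drop edge directions, remove S, and check connectivity. A compact sketch using networkx, with a toy evidence DAG invented for illustration:

```python
# X and Z are d-separated given S iff they are disconnected in the
# moralized ancestral graph with the conditioning set S removed.
import networkx as nx

def d_separated(dag, xs, zs, ss):
    nodes = set(xs) | set(zs) | set(ss)
    anc = set(nodes)
    for n in nodes:                       # ancestral closure
        anc |= nx.ancestors(dag, n)
    g = dag.subgraph(anc)
    moral = nx.Graph(g.to_undirected())
    for v in g:                           # moralize: marry co-parents
        parents = list(g.predecessors(v))
        for i in range(len(parents)):
            for j in range(i + 1, len(parents)):
                moral.add_edge(parents[i], parents[j])
    moral.remove_nodes_from(ss)           # condition on S
    return not any(nx.has_path(moral, x, z) for x in xs for z in zs)

dag = nx.DiGraph([("motive", "crime"), ("opportunity", "crime"),
                  ("crime", "dna_match")])
print(d_separated(dag, {"motive"}, {"opportunity"}, set()))          # True
print(d_separated(dag, {"motive"}, {"opportunity"}, {"dna_match"}))  # False
```

Note the collider behavior in the toy DAG: motive and opportunity are independent until the evidence "dna_match" is observed, which is exactly the kind of dependency structure evidential reasoning must track.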
Statistical Process Control Techniques for the Telecommunications Systems Manager
1992-03-01
products that are out of tolerance and bad designs. The third type of defect, mistakes, are remedied by Poka-Yoke methods that are introduced later... based on total production costs plus quality costs. Once production is underway, interventions are determined by their impact on the QLF. F. POKA-YOKE ... Mistakes require process improvements called Poka-Yoke or mistake proofing. Shigeo Shingo developed Poka-Yoke methods to incorporate 100% inspection at
47 CFR 80.225 - Requirements for selective calling equipment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... selective calling (DSC) equipment and selective calling equipment installed in ship and coast stations, and...-STD, “RTCM Recommended Minimum Standards for Digital Selective Calling (DSC) Equipment Providing... Class ‘D’ Digital Selective Calling (DSC)—Methods of testing and required test results,” March 2003. ITU...
PIRIA: a general tool for indexing, search, and retrieval of multimedia content
NASA Astrophysics Data System (ADS)
Joint, Magali; Moellic, Pierre-Alain; Hede, P.; Adam, P.
2004-05-01
The Internet is a continuously expanding source of multimedia content and information. There are many products in development to search, retrieve, and understand multimedia content. But most of the current image search/retrieval engines rely on an image database manually pre-indexed with keywords. Computers are still powerless to understand the semantic meaning of still or animated image content. Piria (Program for the Indexing and Research of Images by Affinity), the search engine we have developed, brings this possibility closer to reality. Piria is a novel search engine that uses the query by example method. A user query is submitted to the system, which then returns a list of images ranked by similarity, obtained by a metric distance that operates on every indexed image signature. These indexed images are compared according to several different classifiers, not only Keywords, but also Form, Color and Texture, taking into account geometric transformations and invariances like rotation, symmetry, mirroring, etc. Form - Edges extracted by an efficient segmentation algorithm. Color - Histogram, semantic color segmentation and spatial color relationship. Texture - Texture wavelets and local edge patterns. If required, Piria is also able to fuse results from multiple classifiers with a new classification of index categories: Single Indexer Single Call (SISC), Single Indexer Multiple Call (SIMC), Multiple Indexers Single Call (MISC) or Multiple Indexers Multiple Call (MIMC). Commercial and industrial applications will be explored and discussed as well as current and future development.
New approaches to the measurement of chlorophyll, related pigments and productivity in the sea
NASA Technical Reports Server (NTRS)
Booth, C. R.; Keifer, D. A.
1989-01-01
In the 1984 SBIR Call for Proposals, NASA solicited new methods to measure primary production and chlorophyll in the ocean. Biospherical Instruments Inc. responded to this call with a proposal first to study a variety of approaches to this problem. A second phase of research was then funded to pursue instrumentation to measure the sunlight stimulated naturally occurring fluorescence of chlorophyll in marine phytoplankton. The monitoring of global productivity, global fisheries resources, application of above surface-to-underwater optical communications systems, submarine detection applications, correlation, and calibration of remote sensing systems are but some of the reasons for developing inexpensive sensors to measure chlorophyll and productivity. Normally, productivity measurements are manpower and cost intensive and, with the exception of a very few expensive multiship research experiments, provide no contemporaneous data. We feel that the patented, simple sensors that we have designed will provide a cost effective method for large scale, synoptic, optical measurements in the ocean. This document is the final project report for a NASA sponsored SBIR Phase 2 effort to develop new methods for the measurements of primary production in the ocean. This project has been successfully completed, a U.S. patent was issued covering the methodology and sensors, and the first production run of instrumentation developed under this contract has sold out and been delivered.
Solder Joint Health Monitoring Testbed
NASA Technical Reports Server (NTRS)
Delaney, Michael M.; Flynn, James; Browder, Mark
2009-01-01
A method of monitoring the health of selected solder joints, called SJ-BIST, has been developed by Ridgetop Group Inc. under a Small Business Innovative Research (SBIR) contract. The primary goal of this research program is to test and validate this method in a flight environment using realistically seeded faults in selected solder joints. An additional objective is to gather environmental data for future development of physics-based and data-driven prognostics algorithms. A test board is being designed using a Xilinx FPGA. These boards will be tested both in flight and on the ground using a shaker table and an altitude chamber.
Development of Light-Activated CRISPR Using Guide RNAs with Photocleavable Protectors.
Jain, Piyush K; Ramanan, Vyas; Schepers, Arnout G; Dalvie, Nisha S; Panda, Apekshya; Fleming, Heather E; Bhatia, Sangeeta N
2016-09-26
The ability to remotely trigger CRISPR/Cas9 activity would enable new strategies to study cellular events with greater precision and complexity. In this work, we have developed a method to photocage the activity of the guide RNA called "CRISPR-plus" (CRISPR-precise light-mediated unveiling of sgRNAs). The photoactivation capability of our CRISPR-plus method is compatible with the simultaneous targeting of multiple DNA sequences and supports numerous modifications that can enable guide RNA labeling for use in imaging and mechanistic investigations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Scalia, C.; Leone, F.; Gangi, M.; Giarrusso, M.; Stift, M. J.
2017-12-01
One method for the determination of integrated longitudinal stellar fields from low-resolution spectra is the so-called slope method, which is based on the regression of the Stokes V signal against the first derivative of Stokes I. Here we investigate the possibility of extending this technique to measure the magnetic fields of cool stars from high-resolution spectra. For this purpose we developed a multi-line modification to the slope method, called the multi-line slope method. We tested this technique by analysing synthetic spectra computed with the COSSAM code and real observations obtained with the high-resolution spectropolarimeters Narval, HARPSpol and the Catania Astrophysical Observatory Spectropolarimeter (CAOS). We show that the multi-line slope method is a fast alternative to the least squares deconvolution technique for the measurement of the effective magnetic fields of cool stars. Using a Fourier transform on the effective magnetic field variations of the star ε Eri, we find that the long-term periodicity of the field corresponds to the 2.95-yr period of the stellar dynamo, revealed by the variation of the activity index.
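Under the weak-field approximation, V(λ) ≈ −4.67×10⁻¹³ g_eff λ₀² B_l dI/dλ (wavelengths in Å, B_l in gauss), so B_l follows from the regression slope of Stokes V against dI/dλ; the multi-line variant simply pools samples from many lines. A single-line sketch on synthetic data (line parameters assumed):

```python
# Slope-method regression: recover B_l from the slope of V vs dI/dlambda
# under the weak-field approximation. All line parameters are assumed.
import numpy as np

g_eff, lam0, b_true = 1.2, 6000.0, 500.0          # assumed line parameters
k = -4.67e-13 * g_eff * lam0**2                   # weak-field constant (G^-1)

lam = np.linspace(-0.5, 0.5, 201) + lam0          # wavelength grid (Angstrom)
stokes_i = 1.0 - 0.6 * np.exp(-((lam - lam0) / 0.08) ** 2)   # synthetic line
didl = np.gradient(stokes_i, lam)
stokes_v = k * b_true * didl \
    + 1e-6 * np.random.default_rng(1).normal(size=lam.size)  # add noise

slope, _ = np.polyfit(didl, stokes_v, 1)          # regress V on dI/dlambda
print("recovered B_l [G]:", slope / k)            # ~500
```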
Delatorre, Carolina; Rodríguez, Ana; Rodríguez, Lucía; Majada, Juan P; Ordás, Ricardo J; Feito, Isabel
2017-01-01
Plant growth regulators (PGRs) are very different chemical compounds that play essential roles in plant development and the regulation of physiological processes. They exert their functions by a mechanism called cross-talk (involving either synergistic or antagonistic actions); thus, it is of great interest to study as many PGRs as possible to obtain accurate information about plant status. Much effort has been applied to developing methods capable of analyzing large numbers of these compounds, but they frequently exclude some chemical families or important PGRs within each family. In addition, most of the methods are specially designed for matrices that are easy to work with. Therefore, we wanted to develop a method that meets the requirements lacking in the literature while also being fast and reliable. Here we present a simple, fast and robust method for the extraction and quantification of 20 different PGRs using UHPLC-MS/MS, optimized in complex matrices. Copyright © 2016 Elsevier B.V. All rights reserved.
SDF technology in location and navigation procedures: a survey of applications
NASA Astrophysics Data System (ADS)
Kelner, Jan M.; Ziółkowski, Cezary
2017-04-01
The basis for the development of the Doppler location method, also called the signal Doppler frequency (SDF) method or technology, is the analytical solution of the wave equation for a mobile source. This paper presents an overview of simulations, numerical analyses and empirical studies of the possibilities and range of SDF method applications. In the paper, the various applications from numerous publications are collected and described. They mainly focus on the use of the SDF method in emitter positioning, electronic warfare, crisis management, search and rescue, and navigation. The developed method is characterized by an innovative property that is unique among location methods: it allows the simultaneous location of many radio emitters. Moreover, this is the first method based on the Doppler effect that allows positioning of transmitters using a single mobile platform. In the paper, results obtained with the SDF method by other teams are also presented.
The application of contraction theory to an iterative formulation of electromagnetic scattering
NASA Technical Reports Server (NTRS)
Brand, J. C.; Kauffman, J. F.
1985-01-01
Contraction theory is applied to an iterative formulation of electromagnetic scattering from periodic structures, and a computational method for ensuring convergence is developed. A short history of the spectral (or k-space) formulation is presented, with an emphasis on application to periodic surfaces. To ensure a convergent solution of the iterative equation, a process called the contraction corrector method is developed. Convergence properties of previously presented iterative solutions to one-dimensional problems are examined utilizing contraction theory, and the general conditions for achieving a convergent solution are explored. The contraction corrector method is then applied to several scattering problems including an infinite grating of thin wires, with the solution data compared to previous works.
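The underlying guarantee is Banach's fixed-point theorem: if the iteration map is a contraction (Lipschitz constant k < 1), repeated application converges to a unique fixed point. A generic illustration with a stand-in map, not the scattering operator:

```python
# Fixed-point iteration x <- F(x) converges when F is a contraction.
import math

def F(x):
    return math.cos(x)          # contraction near [0, 1]: |F'| = |sin x| < 1

x_prev, x = 0.0, 1.0
while abs(x - x_prev) > 1e-12:  # successive differences shrink geometrically
    x_prev, x = x, F(x)

print("fixed point:", x)        # the Dottie number, ~0.739085
```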
Development of the mathematical model for design and verification of acoustic modal analysis methods
NASA Astrophysics Data System (ADS)
Siner, Alexander; Startseva, Maria
2016-10-01
To reduce turbofan noise it is necessary to develop methods, called modal analysis, for analyzing the sound field generated by the blade machinery. Because modal analysis methods are complex, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. In this work, a model that allows setting single modes in the channel and analyzing the generated sound field is presented. Modal analysis of the sound generated by a ring array of point sound sources is performed. A comparison of experimental and numerical modal analysis results is also presented.
Closed-loop bird-computer interactions: a new method to study the role of bird calls.
Lerch, Alexandre; Roy, Pierre; Pachet, François; Nagle, Laurent
2011-03-01
In the field of songbird research, many studies have shown the role of male songs in territorial defense and courtship. Calling, another important acoustic communication signal, has received much less attention, however, because calls are assumed to contain less information about the emitter than songs do. Birdcall repertoire is diverse, and the role of calls has been found to be significant in the area of social interaction, for example, in pair, family, and group cohesion. However, standard methods for studying calls do not allow precise and systematic study of their role in communication. We propose herein a new method to study bird vocal interaction. A closed-loop computer system interacts with canaries, Serinus canaria, by (1) automatically classifying two basic types of canary vocalization, single versus repeated calls, as they are produced by the subject, and (2) responding with a preprogrammed call type recorded from another bird. This computerized animal-machine interaction requires no human interference. We show first that the birds do engage in sustained interactions with the system, by studying the rate of single and repeated calls for various programmed protocols. We then show that female canaries differentially use single and repeated calls. First, they produce significantly more single than repeated calls, and second, the rate of single calls is associated with the context in which they interact, whereas repeated calls are context independent. This experiment is the first illustration of how closed-loop bird-computer interaction can be used productively to study social relationships. © Springer-Verlag 2010
Pourabbasi, Ata; Farzami, Jalal; Shirvani, Mahbubeh-Sadat Ebrahimnegad; Shams, Amir Hossein; Larijani, Bagher
2017-01-01
One of the main uses of social networks in clinical studies is facilitating the process of sampling and case finding for scientists. The main focus of this study is on comparing two different sampling methods, phone calls and a social network, for study purposes. One of the researchers called 214 families of children with diabetes over 90 days. After this period, phone calls stopped, and the team started communicating with families through Telegram, a virtual social network, for 30 days. The number of children who participated in the study was evaluated. Although the Telegram method was 60 days shorter than the phone call method, the researchers found that the proportion of participants recruited through Telegram (17.6%) did not differ significantly from the proportion recruited by phone (12.9%). Using social networks can be suggested as a beneficial method for local researchers seeking easier sampling, their participants' trust, easier follow-up, and an easily accessed database.
Agnihotri, Samira; Sundeep, P. V. D. S.; Seelamantula, Chandra Sekhar; Balakrishnan, Rohini
2014-01-01
Objective identification and description of mimicked calls is a primary component of any study on avian vocal mimicry but few studies have adopted a quantitative approach. We used spectral feature representations commonly used in human speech analysis in combination with various distance metrics to distinguish between mimicked and non-mimicked calls of the greater racket-tailed drongo, Dicrurus paradiseus and cross-validated the results with human assessment of spectral similarity. We found that the automated method and human subjects performed similarly in terms of the overall number of correct matches of mimicked calls to putative model calls. However, the two methods also misclassified different subsets of calls and we achieved a maximum accuracy of ninety five per cent only when we combined the results of both the methods. This study is the first to use Mel-frequency Cepstral Coefficients and Relative Spectral Amplitude - filtered Linear Predictive Coding coefficients to quantify vocal mimicry. Our findings also suggest that in spite of several advances in automated methods of song analysis, corresponding cross-validation by humans remains essential. PMID:24603717
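A stripped-down version of the feature-plus-distance pipeline, assuming librosa is available: each call is summarized by time-averaged MFCCs and compared by cosine distance (a real mimicry analysis would also handle time alignment; the file names are hypothetical).

```python
# MFCC signatures per call, compared with a distance metric: smaller
# distance = more spectrally similar; classify a mimicked call as a
# match if the putative model call is the nearest candidate.
import librosa
import numpy as np
from scipy.spatial.distance import cosine

def call_signature(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)               # one vector per call

mimic = call_signature("drongo_mimic.wav")       # hypothetical recordings
model = call_signature("model_species.wav")
other = call_signature("unrelated_call.wav")

print(cosine(mimic, model), cosine(mimic, other))
```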
What do consumers want to know about antibiotics? Analysis of a medicines call centre database.
Hawke, Kate L; McGuire, Treasure M; Ranmuthugala, Geetha; van Driel, Mieke L
2016-02-01
Australia is one of the highest users of antibiotics in the developed world. This study aimed to identify consumer antibiotic information needs to improve targeting of medicines information. We conducted a retrospective, mixed-method study of consumers' antibiotic-related calls to Australia's National Prescribing Service (NPS) Medicines Line from September 2002 to June 2010. Demographic and question data were analysed, and the most common enquiry type in each age group was explored for key narrative themes. Relative antibiotic call frequencies were determined by comparing number of calls to antibiotic utilization in Australian Statistics on Medicines (ASM) data. Between 2002 and 2010, consumers made 8696 antibiotic calls to Medicines Line. The most common reason was questions about the role of their medicine (22.4%). Patient age groups differed in enquiry pattern, with more questions about lactation in the 0- to 4-year age group (33.6%), administration (5-14 years: 32.4%), interactions (15-24 years: 33.4% and 25-54 years: 23.3%) and role of the medicine (55 years and over: 26.6%). Key themes were identified for each age group. Relative to use in the community, antibiotics most likely to attract consumer calls were ciprofloxacin (18.0 calls/100,000 ASM prescriptions) and metronidazole (12.9 calls/100,000 ASM prescriptions), with higher call rates than the most commonly prescribed antibiotic amoxicillin (3.9 calls/100,000 ASM prescriptions). Consumers' knowledge gaps and concerns about antibiotics vary with age, and certain antibiotics generate greater concern relative to their usage. Clinicians should target medicines information to proactively address consumer concerns. © The Author 2015. Published by Oxford University Press. All rights reserved.
Spillmann, Brigitte; van Noordwijk, Maria A; Willems, Erik P; Mitra Setia, Tatang; Wipfli, Urs; van Schaik, Carel P
2015-07-01
The long call is an important vocal communication signal in the widely dispersed, semi-solitary orangutan. Long calls affect individuals' ranging behavior, mediate social relationships, and regulate encounters between dispersed individuals in dense rainforest. The aim of this study was to test the utility of an Acoustic Location System (ALS) for recording and triangulating the loud calls of free-living primates. We developed and validated a data extraction protocol for an ALS used to record wild orangutan males' long calls at the Tuanan field site (Central Kalimantan). We installed an ALS in a grid of 300 ha, containing 20 SM2+ recorders placed in a regular lattice at 500 m intervals, to monitor the distribution of calling males in the area. The validated system had the following main features: (i) a user-trained software algorithm (Song Scope) that reliably recognized orangutan long calls from sound files at distances up to 700 m from the nearest recorder, resulting in a total area of approximately 900 ha that could be monitored continuously; (ii) acoustic location of calling males up to 200 m outside the microphone grid, which meant that within an area of approximately 450 ha, call locations could be calculated through triangulation. The mean accuracy was 58 m, an error that is modest relative to orangutan mobility and average inter-individual distances. We conclude that an ALS is a highly effective method for detecting long-distance calls of wild primates and triangulating their position. In combination with conventional individual focal follow data, an ALS can greatly improve our knowledge of orangutans' social organization, and is readily adaptable for studying other highly vocal animals. © 2015 Wiley Periodicals, Inc.
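Triangulation from arrival times can be posed as nonlinear least squares over the source position and emission time; the toy multilateration below uses an invented recorder geometry and sound speed, not the Tuanan configuration.

```python
# Locate a calling animal from times of arrival at several recorders by
# fitting (x, y, t0) so that t0 + distance/c matches the observed TOAs.
import numpy as np
from scipy.optimize import least_squares

C = 340.0                                   # speed of sound, m/s (assumed)
recorders = np.array([[0, 0], [500, 0], [0, 500], [500, 500]], float)
true_src, t0 = np.array([180.0, 320.0]), 2.0

toa = t0 + np.linalg.norm(recorders - true_src, axis=1) / C
toa += np.random.default_rng(3).normal(0, 1e-3, toa.size)   # timing noise

def residual(p):
    xy, t = p[:2], p[2]
    return t + np.linalg.norm(recorders - xy, axis=1) / C - toa

fit = least_squares(residual, x0=[250.0, 250.0, 0.0])
print("estimated position:", fit.x[:2])     # ~ (180, 320)
```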
Marker-aided genetic divergence analysis in Brassica.
Arunachalam, V; Verma, Shefali; Sujata, V; Prabhu, K V
2005-08-01
Genetic divergence was evaluated in 31 breeding lines from four Brassica species using Mahalanobis' D2. A new method of grouping using D2 values was used to group the 31 lines, based on diagnostic morphological traits (called morphoqts). Isozyme variation of the individual enzymes esterase and glutamate oxaloacetate was quantified by five parameters (called isoqts) developed earlier. Grouping by the same method was also done based on the isoqts, and the grouping by isozymes was compared with that by morphoqts. Overall, there was an agreement of 73%, suggesting that isoqts can be used in the choice of parents and also in first-stage selection of segregants in the laboratory. It was suggested that such an exercise would help to take care of season-bound and field-related problems of breeding. The new isozyme QTs, within-lane variance of relative mobility and relative absorption, accounted for about 50% of the total divergence. The utility of the new method and of isoqts in cost-effective breeding was highlighted.
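Mahalanobis' D² between two trait-mean vectors is (x − y)ᵀ Σ⁻¹ (x − y); the pairwise matrix below, computed on invented stand-ins for the morphoqts, is the statistic that feeds the grouping (small D² pairs fall into the same group).

```python
# Pairwise Mahalanobis D^2 between breeding-line trait means.
import numpy as np

lines = np.array([[5.1, 3.5, 1.4],     # invented mean trait values per line
                  [4.9, 3.0, 1.3],
                  [6.3, 3.3, 4.7],
                  [6.5, 3.0, 5.5]])

# pseudo-inverse guards against a near-singular covariance from few lines
vi = np.linalg.pinv(np.cov(lines, rowvar=False))
n = len(lines)
d2 = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        diff = lines[i] - lines[j]
        d2[i, j] = diff @ vi @ diff    # Mahalanobis D^2

print(np.round(d2, 2))
```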
ERIC Educational Resources Information Center
Clayton, Karen Elizabeth
2012-01-01
In recent years, there have been increasing calls to develop a more contextually based, sociocultural perspective on achievement motivation. With this in mind, this mixed-method study examined Jamaican (West Indies) undergraduate students' perceptions of motivation. This study was conducted in two phases. First, a qualitative investigation…
Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI
ERIC Educational Resources Information Center
Forer, Barry; Zumbo, Bruno D.
2011-01-01
The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…
Strengthening Competence in Working with Culturally and Linguistically Diverse Students
ERIC Educational Resources Information Center
Lineman, Janeann M.; Miller, Gloria E.
2012-01-01
It has been projected that by 2020 one out of three children will be from a culturally and linguistically diverse (CLD) background. Rapid changes in school demographics and student needs are already presenting new challenges to school-based service-delivery methods and there have been calls for increased professional development efforts to better…
A new kind of end-glued joint for the hardwood industry
Philip A. Araman
1973-01-01
A method has been developed for end- and edge-gluing short pieces of high-value hardwood lumber into long panels, using a curved end joint we call SEM (Serpentine End Matching). Panels containing SEM end joints are aesthetically pleasing and are suited for exposed applications such as in finished furniture.
The Development of Prosocial Behavior in Adolescents: A Mixed Methods Study from NOLS
ERIC Educational Resources Information Center
Furman, Nate; Sibthorp, Jim
2014-01-01
Learning transfer and prosocial behavior (PSB) are critical components of many outdoor education programs for adolescents. This study examined the effects of a theoretically grounded treatment curriculum designed to foster the transfer of learning of general and contextual PSB (also called expedition behavior) among adolescents enrolled on 14-day…
Rediscovering Froebel: A Call to Re-Examine His Life and Gifts
ERIC Educational Resources Information Center
Manning, John P.
2005-01-01
This article examines the life of Friedrich Froebel, the founder of the kindergarten movement and his first 10 "gifts to children." The author suggests that Froebel's philosophy of German Romanticism caused the waning use of his methods. He continues to state that Froebel's development of instructional material and structured play-based curricula…
ERIC Educational Resources Information Center
Ford, Julie Dyke; Bracken, Jennifer L.; Wilson, Gregory D.
2009-01-01
This article addresses previous arguments that call for increased emphasis on research in technical communication programs. Focusing on the value of scholarly-based research at the undergraduate level, we present New Mexico Tech's thesis model as an example of helping students develop familiarity with research skills and methods. This two-semester…
Analysis of Tasks in Pre-Service Elementary Teacher Education Courses
ERIC Educational Resources Information Center
Sierpinska, Anna; Osana, Helena
2012-01-01
This paper presents some results of research aimed at contributing to the development of a professional knowledge base for teachers of elementary mathematics methods courses, called here "teacher educators." We propose that a useful unit of analysis for this knowledge could be the tasks in which teacher-educators engage pre-service…
Accuracy of a Screening Tool for Early Identification of Language Impairment
ERIC Educational Resources Information Center
Uilenburg, Noëlle; Wiefferink, Karin; Verkerk, Paul; van Denderen, Margot; van Schie, Carla; Oudesluys-Murphy, Ann-Marie
2018-01-01
Purpose: A screening tool called the "VTO Language Screening Instrument" (VTO-LSI) was developed to enable more uniform and earlier detection of language impairment. This report, consisting of 2 retrospective studies, focuses on the effects of using the VTO-LSI compared to regular detection procedures. Method: Study 1 retrospectively…
ERIC Educational Resources Information Center
Armstrong, Patrick Ian; Rounds, James
2010-01-01
Career assessment methods often include measures of individual differences constructs, such as interests, personality, abilities, and values. Although many researchers have recently called for the development of integrated models, career counseling professionals have long faced the challenge of integrating this information into their practice. The…
ERIC Educational Resources Information Center
Tingerthal, John Steven
2013-01-01
Using case study methodology and autoethnographic methods, this study examines a process of curricular development known as "Decoding the Disciplines" (Decoding) by documenting the experience of its application in a construction engineering mechanics course. Motivated by the call to integrate what is known about teaching and learning…
USDA-ARS's Scientific Manuscript database
As new research is conducted and new methods for solving problems are developed, the USDA-ARS has a program that allocates substantial funding to ensure these improved strategies and techniques are adopted by those who can benefit from them. These programs are called Area-wide demonstrations. A partn...
Mobile phone call data as a regional socio-economic proxy indicator.
Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K; Ylä-Jääski, Antti
2015-01-01
The advent of publishing anonymized call detail records opens the door for temporal and spatial human dynamics studies. Such studies, besides being useful for creating universal models for mobility patterns, could also be used for creating new socio-economic proxy indicators that do not rely only on local or state institutions. In this paper, from the frequency of calls at different times of the day in different small regional units (sub-prefectures) in Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then compare how those patterns correlate to the data from other sources, such as news for particular events in the given period, census data, economic activity, poverty index, power plants and energy grid data. Our results show high correlation in many of the cases, revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and the results may be particularly relevant to policy-makers engaged in poverty reduction initiatives as they can provide an affordable tool in the context of resource-constrained developing economies, such as Côte d'Ivoire's.
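A minimal version of the home/work assignment: take the modal call location during night hours as home and during working hours as work. The hours, records, and sub-prefecture labels below are invented.

```python
# Infer home and work locations from (hour_of_day, sub_prefecture) records.
from collections import Counter

calls = [(23, "SP-12"), (2, "SP-12"), (22, "SP-12"),
         (10, "SP-48"), (11, "SP-48"), (15, "SP-48"), (14, "SP-07")]

def modal_location(records, hours):
    counts = Counter(sp for h, sp in records if h in hours)
    return counts.most_common(1)[0][0] if counts else None

home = modal_location(calls, hours=set(range(20, 24)) | set(range(0, 7)))
work = modal_location(calls, hours=set(range(8, 18)))
print(home, work)   # SP-12, SP-48
```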
A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.
ERIC Educational Resources Information Center
Suen, Che-yin; Pok, Yang-ming
Four years ago, the authors started to develop self-paced mathematics learning software called NPMaths, using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…
ITALICS: an algorithm for normalization and DNA copy number calling for Affymetrix SNP arrays.
Rigaill, Guillem; Hupé, Philippe; Almeida, Anna; La Rosa, Philippe; Meyniel, Jean-Philippe; Decraene, Charles; Barillot, Emmanuel
2008-03-15
Affymetrix SNP arrays can be used to measure the DNA copy number of 11,000-500,000 SNPs along the genome. Their high density facilitates the precise localization of genomic alterations and makes them a powerful tool for studies of cancers and copy number polymorphism. Like other microarray technologies, they are influenced by non-relevant sources of variation, requiring correction. Moreover, the amplitude of variation induced by non-relevant effects is similar to or greater than that of the biologically relevant effect (i.e. true copy number), making it difficult to estimate non-relevant effects accurately without including the biologically relevant effect. We addressed this problem by developing ITALICS, a normalization method that estimates both biological and non-relevant effects in an alternate, iterative manner, accurately eliminating irrelevant effects. We compared our normalization method with other existing and available methods, and found that ITALICS outperformed these methods on several in-house datasets and one public dataset. These results were validated biologically by quantitative PCR. The R package ITALICS (ITerative and Alternative normaLIzation and Copy number calling for affymetrix Snp arrays) has been submitted to Bioconductor.
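The alternate/iterative idea can be sketched generically: repeatedly estimate the biological signal (here by crude smoothing), then regress the residual on a non-relevant covariate (here GC content, one plausible such effect) and remove it. This is schematic, not the ITALICS package.

```python
# Alternate between (1) a copy-number estimate and (2) removal of a
# non-relevant covariate effect fitted on the residual.
import numpy as np

rng = np.random.default_rng(7)
n = 300
gc = rng.uniform(0.3, 0.7, n)                       # probe GC content
true_cn = np.where(np.arange(n) < 150, 2.0, 3.0)    # a copy-number step
signal = true_cn + 0.8 * (gc - 0.5) + rng.normal(0, 0.05, n)

def smooth(x, w=25):                                # crude copy-number estimate
    return np.convolve(x, np.ones(w) / w, mode="same")

for _ in range(5):                                  # alternate the two fits
    cn_hat = smooth(signal)                         # (1) biological effect
    slope = np.polyfit(gc, signal - cn_hat, 1)[0]   # (2) non-relevant effect
    signal = signal - slope * (gc - gc.mean())      # remove it, level preserved

print(np.round([signal[:150].mean(), signal[150:].mean()], 2))   # ~2.0, ~3.0
```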
Goetzel, Ron Z.; Roemer, Enid Chung; Holingue, Calliope; Fallin, M. Daniele; McCleary, Katherine; Eaton, William; Agnew, Jacqueline; Azocar, Francisca; Ballard, David; Bartlett, John; Braga, Michael; Conway, Heidi; Crighton, K. Andrew; Frank, Richard; Jinnett, Kim; Keller-Greene, Debra; Rauch, Sara Martin; Safeer, Richard; Saporito, Dick; Schill, Anita; Shern, David; Strecher, Victor; Wald, Peter; Wang, Philip; Mattingly, C. Richard
2018-01-01
Objective To declare a call to action to improve mental health in the workplace. Methods We convened a public health summit and assembled an Advisory Council consisting of experts in the field of occupational health and safety, workplace wellness, and public policy to offer recommendations for action steps to improve health and well-being of workers. Results The Advisory Council narrowed the list of ideas to four priority projects. Conclusions The recommendations for action include developing a Mental Health in the Workplace 1) “How to” Guide, 2) Scorecard, 3) Recognition Program, and 4) Executive Training. PMID:29280775
Young, Victoria; Rochon, Elizabeth; Mihailidis, Alex
2016-11-14
The purpose of this study was to derive data from real, recorded, personal emergency response call conversations to help improve the artificial intelligence and decision-making capability of a spoken dialogue system in a smart personal emergency response system. The main study objectives were to develop a model of personal emergency response; determine categories for the model's features; identify and calculate measures from call conversations (verbal ability, conversational structure, timing); and examine conversational patterns and relationships between measures and model features applicable for improving the system's ability to automatically identify call model categories and predict a target response. This study was exploratory and used mixed methods. Personal emergency response calls were pre-classified according to call model categories identified qualitatively from response call transcripts. The relationships between six verbal ability measures, three conversational structure measures, two timing measures, and three independent factors (caller type, risk level, and speaker type) were examined statistically. Emergency medical response services were the preferred response for the majority of medium and high risk calls for both caller types. Older adult callers mainly requested non-emergency medical service responders during medium risk situations. By measuring the number of spoken words-per-minute and turn-length-in-words for the first spoken utterance of a call, older adult and care provider callers could be identified with moderate accuracy. Average call taker response time was calculated using the number-of-speaker-turns and time-in-seconds measures. Care providers and older adults used different conversational strategies when responding to call takers. The words 'ambulance' and 'paramedic' may hold different latent connotations for different callers. The data derived from the real personal emergency response recordings may help a spoken dialogue system classify incoming calls by caller type with moderate probability shortly after the initial caller utterance. Knowing the caller type, the target response for the call may be predicted with some degree of probability and the output dialogue could be tailored to this caller type. The average call taker response time measured from real calls may be used to limit the conversation length in a spoken dialogue system before defaulting to a live call taker.
Improved Method for Linear B-Cell Epitope Prediction Using Antigen’s Primary Sequence
Raghava, Gajendra P. S.
2013-01-01
One of the major challenges in designing a peptide-based vaccine is the identification of antigenic regions in an antigen that can stimulate a B-cell response, also called B-cell epitopes. In the past, several methods have been developed for the prediction of conformational and linear (or continuous) B-cell epitopes. However, the existing methods for predicting linear B-cell epitopes are far from perfect. In this study, an attempt has been made to develop an improved method for predicting linear B-cell epitopes. We retrieved experimentally validated B-cell epitopes as well as non-B-cell epitopes from the Immune Epitope Database and derived two types of datasets, called the Lbtope_Variable and Lbtope_Fixed_length datasets. The Lbtope_Variable dataset contains 14876 B-cell epitopes and 23321 non-epitopes of variable length, whereas the Lbtope_Fixed_length dataset contains 12063 B-cell epitopes and 20589 non-epitopes of fixed length. We also evaluated the performance of models on the above datasets after removing highly identical peptides. In addition, we derived a third dataset, Lbtope_Confirm, with 1042 epitopes and 1795 non-epitopes, where each epitope or non-epitope has been experimentally validated in at least two studies. A number of models have been developed to discriminate epitopes and non-epitopes using different machine-learning techniques like Support Vector Machine and K-Nearest Neighbor. We achieved accuracies from ∼54% to 86% using diverse features like binary profiles, dipeptide composition, and AAP (amino acid pair) profiles. In this study, for the first time, experimentally validated non-B-cell epitopes have been used for developing a method for predicting linear B-cell epitopes; in previous studies, random peptides were used as non-B-cell epitopes. In order to provide a service to the scientific community, a web server, LBtope, has been developed for predicting and designing B-cell epitopes (http://crdd.osdd.net/raghava/lbtope/). PMID:23667458
Structural Embeddings: Mechanization with Method
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Rushby, John
1999-01-01
The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.
2008-03-01
to predict its exact position. To locate Ceres, Carl Friedrich Gauss, a mere 24 years old at the time, developed a method called least-squares...dividend to produce the quotient. This method converges to the reciprocal quadratically [11]. For the special case of 1/(H × P(:,:,k) × H′ + R) (Eq. 3.9), the...high-speed computation of reciprocals within the overall system. The Newton-Raphson method is also expanded for use in calculating square roots in
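The snippet above invokes the classic Newton-Raphson reciprocal iteration. A minimal sketch, assuming the textbook recurrence x_{n+1} = x_n(2 − d·x_n); the function name, seed, and iteration count below are illustrative:

```python
def reciprocal(d, x0=0.1, iters=6):
    """Approximate 1/d by Newton-Raphson, division-free inside the loop.

    Converges quadratically when the seed satisfies 0 < x0 < 2/d;
    real hardware typically seeds from a small lookup table.
    """
    x = x0
    for _ in range(iters):
        x = x * (2.0 - d * x)  # the error roughly squares on every step
    return x

print(reciprocal(7.0))  # ~0.142857, i.e. 1/7
```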
Large Scale Data Analysis and Knowledge Extraction in Communication Data
2017-03-31
this purpose, we developed a novel method, the "Correlation Density Rank", which finds the probability density distribution of related frequent events on all...which is called the "Correlation Density Rank", is developed to derive the community tree from the network. As in the real world, where a network is..."Community Structure in Dynamic Social Networks using the Correlation Density Rank," 2014 ASE BigData/SocialCom/Cybersecurity Conference, Stanford
Hypertranscription in development, stem cells, and regeneration
Percharde, Michelle; Bulut-Karslioglu, Aydan; Ramalho-Santos, Miguel
2016-01-01
SUMMARY Cells can globally up-regulate their transcriptome during specific transitions, a phenomenon called hypertranscription. Evidence for hypertranscription dates back over 70 years, but it has gone largely ignored in the genomics era until recently. We discuss data supporting the notion that hypertranscription is a unifying theme in embryonic development, stem cell biology, regeneration and cell competition. We review the history, methods for analysis, underlying mechanisms and biological significance of hypertranscription. PMID:27989554
Multi-Source Fusion for Explosive Hazard Detection in Forward Looking Sensors
2016-12-01
include: (1) investigating (a) thermal, (b) synthetic aperture acoustics (SAA) and (c) voxel space radar for buried and side threat attacks. (2...detection. (3) With respect to SAA, we developed new approaches in the time and frequency domains for analyzing signatures of concealed targets (called...Fraz). We also developed a method to extract a multi-spectral signature from SAA, and deep learning was used on limited training and class imbalance
Dynamism to promote reproductive medicine and its development, Rihachi Iizuka.
Sueoka, K
2001-12-01
Rihachi Iizuka has provided strong leadership for the remarkable development of reproductive medicine, which has undergone a complete transformation over the previous half century. The Keio University Hospital introduced artificial insemination as the first assisted reproductive technology in Japan. Subsequently, Iizuka and his colleagues reported the first live birth of a female infant in August 1949 after heterologous insemination (AID). Iizuka and his colleagues were also among the first to successfully inseminate a woman with sperm that had been frozen. He developed a new cryopreservation medium for human semen called "KS Cryo-medium". He also developed semen preparation methods for washing and concentrating sperm by centrifugation with Percoll (a colloidal silica derivative) solution for oligozoospermic patients. These methods are broadly used in the clinical field. Furthermore, he developed a Percoll-based method for preseparating X- and Y-bearing sperm, the so-called "gender selection" procedure for preventing X-linked genetic disorders. The most striking assisted reproductive technology was in vitro fertilization, first carried out in Britain. Prior to its clinical application in Japan, Iizuka established the Japan Society of Fertilization and Implantation in 1982 as the main organ for the exchange of official scientific information. With the rapid development and spread of in vitro fertilization and related technologies, Iizuka and his departmental colleagues achieved Japan's first birth following embryo freezing and thawing, performed at the Tokyo Dental College Ichikawa General Hospital. The number of children born following in vitro fertilization treatment has already risen to approximately 1% of births in Japan. Rihachi Iizuka still undertakes responsibility for reproductive medicine as he has done so far.
Group foliation of finite difference equations
NASA Astrophysics Data System (ADS)
Thompson, Robert; Valiquette, Francis
2018-06-01
Using the theory of equivariant moving frames, a group foliation method for invariant finite difference equations is developed. This method is analogous to the group foliation of differential equations and uses the symmetry group of the equation to decompose the solution process into two steps, called resolving and reconstruction. Our constructions are performed algorithmically and symbolically by making use of discrete recurrence relations among joint invariants. Applications to invariant finite difference equations that approximate differential equations are given.
NASA Astrophysics Data System (ADS)
Shao, Haidong; Jiang, Hongkai; Zhang, Haizhou; Duan, Wenjing; Liang, Tianchen; Wu, Shuaipeng
2018-02-01
The vibration signals collected from rolling bearing are usually complex and non-stationary with heavy background noise. Therefore, it is a great challenge to efficiently learn the representative fault features of the collected vibration signals. In this paper, a novel method called improved convolutional deep belief network (CDBN) with compressed sensing (CS) is developed for feature learning and fault diagnosis of rolling bearing. Firstly, CS is adopted for reducing the vibration data amount to improve analysis efficiency. Secondly, a new CDBN model is constructed with Gaussian visible units to enhance the feature learning ability for the compressed data. Finally, exponential moving average (EMA) technique is employed to improve the generalization performance of the constructed deep model. The developed method is applied to analyze the experimental rolling bearing vibration signals. The results confirm that the developed method is more effective than the traditional methods.
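A minimal sketch of two of the three steps named above, under loud assumptions: the compressed sensing stage is shown as a plain Gaussian random projection, the CDBN itself is not reproduced, and the EMA is applied to stand-in parameters rather than a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1 (compressed sensing, sketched as a random projection): y = Phi @ x
# reduces a 2048-sample vibration segment to 512 measurements.
n, m = 2048, 512
x = rng.standard_normal(n)                      # stand-in vibration segment
Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian measurement matrix
y = Phi @ x                                     # compressed data fed to the model

# Step 3 (exponential moving average of model parameters, a common
# generalization aid for deep models):
def ema(shadow, params, decay=0.99):
    return decay * shadow + (1.0 - decay) * params

w = rng.standard_normal(100)                 # stand-in network weights
shadow = w.copy()
for _ in range(1000):
    w = w + 0.01 * rng.standard_normal(100)  # pretend training updates
    shadow = ema(shadow, w)                  # smoothed weights for evaluation
```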
Uncertainty Propagation Methods for High-Dimensional Complex Systems
NASA Astrophysics Data System (ADS)
Mukherjee, Arpan
Researchers are developing ever smaller aircraft called Micro Aerial Vehicles (MAVs). The Space Robotics Group has joined the field by developing a dragonfly-inspired MAV. This thesis presents two contributions to this project. The first is the development of a dynamical model of the internal MAV components to be used for tuning design parameters and as a future plant model. This model is derived using the Lagrangian method and differs from others because it accounts for the internal dynamics of the system. The second contribution of this thesis is an estimation algorithm that can be used to determine prototype performance and verify the dynamical model from the first part. Based on the Gauss-Newton Batch Estimator, this algorithm uses a single camera and known points of interest on the wing to estimate the wing kinematic angles. Unlike other single-camera methods, this method is probabilistically based rather than being geometric.
A fast and objective multidimensional kernel density estimation method: fastKDE
O'Brien, Travis A.; Kashinath, Karthik; Cavanaugh, Nicholas R.; ...
2016-03-07
Numerous facets of scientific research implicitly or explicitly call for the estimation of probability densities. Histograms and kernel density estimates (KDEs) are two commonly used techniques for estimating such information, with the KDE generally providing a higher fidelity representation of the probability density function (PDF). Both methods require specification of either a bin width or a kernel bandwidth. While techniques exist for choosing the kernel bandwidth optimally and objectively, they are computationally intensive, since they require repeated calculation of the KDE. A solution for objectively and optimally choosing both the kernel shape and width has recently been developed by Bernacchiamore » and Pigolotti (2011). While this solution theoretically applies to multidimensional KDEs, it has not been clear how to practically do so. A method for practically extending the Bernacchia-Pigolotti KDE to multidimensions is introduced. This multidimensional extension is combined with a recently-developed computational improvement to their method that makes it computationally efficient: a 2D KDE on 10 5 samples only takes 1 s on a modern workstation. This fast and objective KDE method, called the fastKDE method, retains the excellent statistical convergence properties that have been demonstrated for univariate samples. The fastKDE method exhibits statistical accuracy that is comparable to state-of-the-science KDE methods publicly available in R, and it produces kernel density estimates several orders of magnitude faster. The fastKDE method does an excellent job of encoding covariance information for bivariate samples. This property allows for direct calculation of conditional PDFs with fastKDE. It is demonstrated how this capability might be leveraged for detecting non-trivial relationships between quantities in physical systems, such as transitional behavior.« less
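For orientation, a brief usage sketch assuming the `fastkde` Python package that implements this method; the exact import path and return convention may vary between package versions.

```python
import numpy as np
from fastkde import fastKDE  # assumed package layout

rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = 0.5 * x + rng.standard_normal(100_000)  # correlated pair

# Self-tuning: no bin width or bandwidth argument is needed.
pdf, axes = fastKDE.pdf(x, y)  # bivariate density on the grid given by `axes`
```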
Automated UMLS-Based Comparison of Medical Forms
Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard
2013-01-01
Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is the comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names and, above all, unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code and value domain. Two items are called identical if item name, concept code and value domain are the same. Two items are called matching if only concept code and value domain are the same. Two items are called similar if their concept codes are the same but the value domains are different. Based on these definitions, an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as the package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms allow a view of clustered similar forms. The approach is scalable for a large set of real medical forms. PMID:23861827
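A minimal sketch of the item-level comparison rules defined above (this is not the compareODM R package itself; the item tuples and the UMLS code are illustrative):

```python
def compare_items(a, b):
    """Each item is a tuple (item_name, umls_concept_code, value_domain)."""
    name_a, code_a, dom_a = a
    name_b, code_b, dom_b = b
    if code_a != code_b:
        return "different"   # concepts disagree: nothing in common
    if dom_a != dom_b:
        return "similar"     # same concept code, different value domains
    if name_a != name_b:
        return "matching"    # concept code and value domain agree, names differ
    return "identical"       # name, concept code and value domain all agree

# Hypothetical items annotated with the same (illustrative) UMLS code:
print(compare_items(("Heart rate", "C0018810", "integer"),
                    ("Pulse",      "C0018810", "integer")))  # -> "matching"
```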
A Roadmap for Using Agile Development in a Traditional Environment
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Starbird, Thomas; Grenander, Sven
2006-01-01
One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Agency (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.
Abdul Rashid, Rima Marhayu; Mohamed, Majdah; Hamid, Zaleha Abdul; Dahlui, Maznah
2013-01-01
To compare the effectiveness of different methods of recall for repeat Pap smear among women who had normal smears in the previous screening. Prospective randomized controlled study. All community clinics in Klang under the Ministry of Health Malaysia. Women of Klang who had attended cervical screening, had a normal Pap smear in the previous year, and were due for a repeat smear were recruited and randomly assigned to four different methods of recall for repeat smear. The recall methods used to remind the women of a repeat smear were postal letter, registered letter, short message by phone (SMS), or phone call. The outcomes were the number and percentage of women who responded to the recall within 8 weeks of receiving it, irrespective of whether a Pap test was conducted, and the number of women in each recall arm who came for a repeat Pap smear. The rates of recall messages reaching the women when using letter, registered letter, SMS and phone calls were 79%, 87%, 66% and 68%, respectively. However, the positive responses to recall by letter, registered letter, phone message and telephone call were 23.9%, 23.0%, 32.9% and 50.9%, respectively (p<0.05). Furthermore, more women who received recall by phone call were screened (p<0.05) compared with those who received recall by postal letter (OR=2.38, CI=1.56-3.62). Both ordinary and registered letters had higher chances of reaching patients than the phone, whether for messages or calls. The response to the recall and uptake of repeat smear, however, were highest via phone call, indicating the importance of direct communication.
Implementation of unsteady sampling procedures for the parallel direct simulation Monte Carlo method
NASA Astrophysics Data System (ADS)
Cave, H. M.; Tseng, K.-C.; Wu, J.-S.; Jermy, M. C.; Huang, J.-C.; Krumdieck, S. P.
2008-06-01
An unsteady sampling routine for a general parallel direct simulation Monte Carlo method called PDSC is introduced, allowing the simulation of time-dependent flow problems in the near continuum range. A post-processing procedure called DSMC rapid ensemble averaging method (DREAM) is developed to improve the statistical scatter in the results while minimising both memory and simulation time. This method builds an ensemble average of repeated runs over a small number of sampling intervals prior to the sampling point of interest by restarting the flow using either a Maxwellian distribution based on macroscopic properties for near equilibrium flows (DREAM-I) or output instantaneous particle data obtained by the original unsteady sampling of PDSC for strongly non-equilibrium flows (DREAM-II). The method is validated by simulating shock tube flow and the development of simple Couette flow. Unsteady PDSC is found to accurately predict the flow field in both cases with significantly reduced run-times over single processor code, and DREAM greatly reduces the statistical scatter in the results while maintaining accurate particle velocity distributions. Simulations are then conducted of two applications involving the interaction of shocks over wedges. The results of these simulations are compared to experimental data and simulations from the literature where these are available. In general, it was found that 10 ensembled runs of DREAM processing could reduce the statistical uncertainty in the raw PDSC data by 2.5-3.3 times, based on the limited number of cases in the present study.
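The quoted 2.5-3.3x scatter reduction for 10 ensembled runs is consistent with the usual 1/sqrt(N) behaviour of averaging independent samples; a toy check with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)
truth = 1.0
runs = truth + 0.1 * rng.standard_normal((10, 5000))  # 10 runs, 5000 cells each

single = runs[0].std()               # scatter of one raw run
ensembled = runs.mean(axis=0).std()  # scatter after ensemble averaging
print(single / ensembled)            # ~sqrt(10) = 3.16
```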
Basic, specific, mechanistic? Conceptualizing musical emotions in the brain.
Omigie, Diana
2016-06-01
The number of studies investigating music processing in the human brain continues to increase, with a large proportion of them focussing on the correlates of so-called musical emotions. The current Review highlights the recent development whereby such studies are no longer concerned only with basic emotions such as happiness and sadness but also with so-called music-specific or "aesthetic" ones such as nostalgia and wonder. It also highlights how mechanisms such as expectancy and empathy, which are seen as inducing musical emotions, are enjoying ever-increasing investigation and substantiation with physiological and neuroimaging methods. It is proposed that a combination of these approaches, namely, investigation of the precise mechanisms through which so-called music-specific or aesthetic emotions may arise, will provide the most important advances for our understanding of the unique nature of musical experience. © 2015 Wiley Periodicals, Inc.
Partial Variance of Increments Method in Solar Wind Observations and Plasma Simulations
NASA Astrophysics Data System (ADS)
Greco, A.; Matthaeus, W. H.; Perri, S.; Osman, K. T.; Servidio, S.; Wan, M.; Dmitruk, P.
2018-02-01
The method called "PVI" (Partial Variance of Increments) has been increasingly used in the analysis of spacecraft and numerical simulation data since its inception in 2008. The purpose of the method is to study the kinematics and formation of coherent structures in space plasmas, a topic that has gained considerable attention, leading to the development of identification methods, observations, and associated theoretical research based on numerical simulations. This review paper will summarize key features of the method and provide a synopsis of the main results obtained by various groups using the method. This will enable new users or those considering methods of this type to find details and background collected in one place.
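A minimal sketch assuming the standard definition of the statistic, PVI(t, τ) = |Δb(t, τ)| / sqrt(⟨|Δb(t, τ)|²⟩), where Δb(t, τ) = b(t+τ) − b(t) is the vector field increment and ⟨·⟩ a time average; the threshold of 3 below is a conventional choice, not one prescribed by this review.

```python
import numpy as np

def pvi(b, lag):
    """b: (n_samples, 3) magnetic field series; lag in samples."""
    db = b[lag:] - b[:-lag]                # vector increments over the lag
    mag = np.linalg.norm(db, axis=1)       # |Δb(t, τ)|
    return mag / np.sqrt(np.mean(mag**2))  # normalized increment series

# Synthetic stand-in for spacecraft data: a 3-component random walk.
b = np.cumsum(np.random.default_rng(3).standard_normal((10_000, 3)), axis=0)
candidates = np.where(pvi(b, lag=10) > 3.0)[0]  # possible coherent structures
```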
General methodology for simultaneous representation and discrimination of multiple object classes
NASA Astrophysics Data System (ADS)
Talukder, Ashit; Casasent, David P.
1998-03-01
We address a new general method for linear and nonlinear feature extraction for simultaneous representation and classification. We call this approach the maximum representation and discrimination feature (MRDF) method. We develop a novel nonlinear eigenfeature extraction technique to represent data with closed-form solutions and use it to derive a nonlinear MRDF algorithm. Results of the MRDF method on synthetic databases are shown and compared with results from standard Fukunaga-Koontz transform and Fisher discriminant function methods. The method is also applied to an automated product inspection problem and for classification and pose estimation of two similar objects under 3D aspect angle variations.
Masking as an effective quality control method for next-generation sequencing data analysis.
Yun, Sajung; Yun, Sijung
2014-12-13
Next generation sequencing produces base calls with low quality scores that can affect the accuracy of identifying simple nucleotide variation calls, including single nucleotide polymorphisms and small insertions and deletions. Here we compare the effectiveness of two data preprocessing methods, masking and trimming, and the accuracy of simple nucleotide variation calls on whole-genome sequence data from Caenorhabditis elegans. Masking substitutes low quality base calls with 'N's (undetermined bases), whereas trimming removes low quality bases, which results in shorter read lengths. We demonstrate that masking is more effective than trimming in reducing the false-positive rate in single nucleotide polymorphism (SNP) calling. However, neither preprocessing method affected the false-negative rate in SNP calling with statistical significance compared to the data analysis without preprocessing. False-positive and false-negative rates for small insertions and deletions did not differ between masking and trimming. We recommend masking over trimming as a more effective preprocessing method for next generation sequencing data analysis, since masking reduces the false-positive rate in SNP calling without sacrificing the false-negative rate, although trimming is currently more commonly used in the field. The perl script for masking is available at http://code.google.com/p/subn/. The sequencing data used in the study were deposited in the Sequence Read Archive (SRX450968 and SRX451773).
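A minimal sketch of the masking step, assuming Phred+33 quality encoding as in standard FASTQ files; the threshold and helper name are illustrative, not taken from the authors' perl script.

```python
def mask_read(seq, qual, threshold=20, offset=33):
    """Replace bases whose Phred quality falls below `threshold` with 'N'."""
    return "".join(
        base if (ord(q) - offset) >= threshold else "N"
        for base, q in zip(seq, qual)
    )

# 'I' encodes Q40 (kept); '!' encodes Q0 (masked).
print(mask_read("ACGTACGT", "IIII!!II"))  # -> ACGTNNGT
```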
The Contact Dynamics method: A nonsmooth story
NASA Astrophysics Data System (ADS)
Dubois, Frédéric; Acary, Vincent; Jean, Michel
2018-03-01
When velocity jumps occur, the dynamics is said to be nonsmooth. For instance, in collections of contacting rigid bodies, jumps are caused by shocks and dry friction. Without compliance at the interface, contact laws are not only non-differentiable in the usual sense but also multi-valued. Modeling contacting bodies is of interest in order to understand the behavior of numerous mechanical systems such as flexible multi-body systems, granular materials or masonry. These granular materials behave puzzlingly either like a solid or a fluid, and a description in the frame of classical continuous mechanics would be welcome, though it remains far from satisfactory nowadays. Jean-Jacques Moreau greatly contributed to convex analysis, functions of bounded variation, differential measure theory, and sweeping process theory, definitive mathematical tools for dealing with nonsmooth dynamics. He converted these underlying theoretical ideas into an original nonsmooth implicit numerical method called Contact Dynamics (CD): a robust and efficient method to simulate large collections of bodies with frictional contacts and impacts. The CD method offers a very interesting complementary alternative to the family of smoothed explicit numerical methods, often called the Distinct Elements Method (DEM). In this paper, developments and improvements of the CD method are presented together with a critical comparative review of the advantages and drawbacks of both approaches.
NASA Astrophysics Data System (ADS)
Silva, Ricardo Petri; Naozuka, Gustavo Taiji; Mastelini, Saulo Martiello; Felinto, Alan Salvany
2018-01-01
The incidence of luminous reflections (LR) in captured images can interfere with the color of the affected regions. These regions tend to oversaturate, becoming whitish and, consequently, losing the original color information of the scene. Decision processes that employ images acquired from digital cameras can be impaired by LR incidence. Such applications include real-time video surgeries and facial and ocular recognition. This work proposes an algorithm called contrast enhancement of potential LR regions, a preprocessing step that increases the contrast of potential LR regions in order to improve the performance of automatic LR detectors. In addition, three automatic detectors were compared with and without our preprocessing method. The first is a technique already established in the literature, the Chang-Tseng threshold. We propose two further automatic detectors, called adapted histogram peak and global threshold. We employed four performance metrics to evaluate the detectors, namely accuracy, precision, exactitude, and root mean square error. The exactitude metric was developed in this work; to compute it, a manually defined reference model was created. The global threshold detector combined with our preprocessing method presented the best results, with an average exactitude rate of 82.47%.
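As a rough illustration of the global-threshold idea (the threshold value and the grayscale assumption are ours for illustration; the paper's own detector is tuned against its reference model):

```python
import numpy as np

def detect_lr(gray, threshold=240):
    """gray: 2D uint8 image; returns a boolean mask of near-saturated pixels."""
    return gray >= threshold

img = np.random.default_rng(4).integers(0, 256, (480, 640), dtype=np.uint8)
mask = detect_lr(img)
print(f"{mask.mean():.3%} of pixels flagged as potential reflections")
```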
Selecting a restoration technique to minimize OCR error.
Cannon, M; Fugate, M; Hush, D R; Scovel, C
2003-01-01
This paper introduces a learning problem related to the task of converting printed documents to ASCII text files. The goal of the learning procedure is to produce a function that maps documents to restoration techniques in such a way that on average the restored documents have minimum optical character recognition error. We derive a general form for the optimal function and use it to motivate the development of a nonparametric method based on nearest neighbors. We also develop a direct method of solution based on empirical error minimization for which we prove a finite sample bound on estimation error that is independent of distribution. We show that this empirical error minimization problem is an extension of the empirical optimization problem for traditional M-class classification with general loss function and prove computational hardness for this problem. We then derive a simple iterative algorithm called generalized multiclass ratchet (GMR) and prove that it produces an optimal function asymptotically (with probability 1). To obtain the GMR algorithm we introduce a new data map that extends Kesler's construction for the multiclass problem and then apply an algorithm called Ratchet to this mapped data, where Ratchet is a modification of the Pocket algorithm. Finally, we apply these methods to a collection of documents and report on the experimental results.
Tyrer, Jonathan P; Guo, Qi; Easton, Douglas F; Pharoah, Paul D P
2013-06-06
The development of genotyping arrays containing hundreds of thousands of rare variants across the genome and advances in high-throughput sequencing technologies have made feasible empirical genetic association studies to search for rare disease susceptibility alleles. As single variant testing is underpowered to detect associations, the development of statistical methods to combine analysis across variants - so-called "burden tests" - is an area of active research interest. We previously developed a method, the admixture maximum likelihood test, to test multiple, common variants for association with a trait of interest. We have extended this method, called the rare admixture maximum likelihood test (RAML), for the analysis of rare variants. In this paper we compare the performance of RAML with six other burden tests designed to test for association of rare variants. We used simulation testing over a range of scenarios to test the power of RAML compared to the other rare variant association testing methods. These scenarios modelled differences in effect variability, the average direction of effect and the proportion of associated variants. We evaluated the power for all the different scenarios. RAML tended to have the greatest power for most scenarios where the proportion of associated variants was small, whereas SKAT-O performed a little better for the scenarios with a higher proportion of associated variants. The RAML method makes no assumptions about the proportion of variants that are associated with the phenotype of interest or the magnitude and direction of their effect. The method is flexible and can be applied to both dichotomous and quantitative traits and allows for the inclusion of covariates in the underlying regression model. The RAML method performed well compared to the other methods over a wide range of scenarios. Generally power was moderate in most of the scenarios, underlying the need for large sample sizes in any form of association testing.
Development and evaluation of an automatic method for the study of platelet osmotic response.
Gigout, T; Blondel, W; Didelon, J; Latger, V; Dumas, D; Schooneman, F; Stoltz, J F
1999-01-01
Study of the osmotic resistance of platelets to hypotonic media has often been suggested as a global test to assess the viability of these cells in transfusion or to study modifications during haematological pathologies. A number of authors have analysed the behaviour of platelets in hypotonic media by a variety of methods (cell count, determination of substances released, morphology, etc.), but most studies are currently based on the so-called "Hypotonic Shock Response" (HSR) test. In this study, the authors describe a new automated and reproducible apparatus, called a fragilimeter, which uses slow dialysis to assess platelet osmotic resistance. The variations in light transmission through a platelet suspension according to ionic strength are linked to the change in cellular volume and lysis, and characterise the osmotic behaviour of the cells. The results revealed the good reproducibility and sensitivity of the technique. The apparatus also allows the realisation of the "HSR" test.
Glass Effect in Inbreeding-Avoidance Systems: Minimum Viable Population for Outbreeders
NASA Astrophysics Data System (ADS)
Tainaka, Kei-ichi; Itoh, Yoshiaki
1996-10-01
Many animals, birds and plants have evolved mechanisms to avoid inbreeding between close relatives. Such mating systems may have developed several methods for restricting mate choice. If fragmentation of habitats becomes serious, these methods may lead to a lack of acceptable mates. We call this the "glass effect", which is a generalization of the so-called Allee effect. We present two inbreeding-avoidance (outbreeding) models. Both models show that outbreeders have a high risk in fragmented environments. We thus obtain the minimum viable population (MVP). It is found that the value of MVP ranges from several hundred to several thousand individuals. While this value is much larger than those obtained by previous demographic theories, it is consistent with recent empirical estimations. Moreover, we find that the glass effect is caused by dynamically induced clusters of relatives. This suggests that genetic variation will be decreased by outbreeding in a highly fragmented environment.
Acoustic signal detection of manatee calls
NASA Astrophysics Data System (ADS)
Niezrecki, Christopher; Phillips, Richard; Meyer, Michael; Beusse, Diedrich O.
2003-04-01
The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of a growing number of collisions with boats. A system that can signal to boaters that manatees are present in the immediate vicinity could potentially reduce these boat collisions. In order to identify the presence of manatees, acoustic methods are employed. Within this paper, three different detection algorithms are used to detect the calls of the West Indian manatee. The detection systems are tested in the laboratory using simulated manatee vocalizations from an audio compact disc. The detection method that provides the best overall performance is able to correctly identify approximately 96% of the manatee vocalizations. However, the system also results in a false positive rate of approximately 16%. The results of this work may ultimately lead to the development of a manatee warning system that can warn boaters of the presence of manatees.
An overview of NSPCG: A nonsymmetric preconditioned conjugate gradient package
NASA Astrophysics Data System (ADS)
Oppe, Thomas C.; Joubert, Wayne D.; Kincaid, David R.
1989-05-01
The most recent research-oriented software package developed as part of the ITPACK Project is called "NSPCG" since it contains many nonsymmetric preconditioned conjugate gradient procedures. It is designed to solve large sparse systems of linear algebraic equations by a variety of different iterative methods. One of the main purposes for the development of the package is to provide a common modular structure for research on iterative methods for nonsymmetric matrices. Another purpose for the development of the package is to investigate the suitability of several iterative methods for vector computers. Since the vectorizability of an iterative method depends greatly on the matrix structure, NSPCG allows great flexibility in the operator representation. The coefficient matrix can be passed in one of several different matrix data storage schemes. These sparse data formats allow matrices with a wide range of structures from highly structured ones such as those with all nonzeros along a relatively small number of diagonals to completely unstructured sparse matrices. Alternatively, the package allows the user to call the accelerators directly with user-supplied routines for performing certain matrix operations. In this case, one can use the data format from an application program and not be required to copy the matrix into one of the package formats. This is particularly advantageous when memory space is limited. Some of the basic preconditioners that are available are point methods such as Jacobi, Incomplete LU Decomposition and Symmetric Successive Overrelaxation as well as block and multicolor preconditioners. The user can select from a large collection of accelerators such as Conjugate Gradient (CG), Chebyshev (SI, for semi-iterative), Generalized Minimal Residual (GMRES), Biconjugate Gradient Squared (BCGS) and many others. The package is modular so that almost any accelerator can be used with almost any preconditioner.
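NSPCG itself is Fortran, but the idea of handing an accelerator a user-supplied matrix-vector routine instead of copying the matrix into a package storage format can be sketched with SciPy's analogous interface; this is an illustration of the concept, not NSPCG's API.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 1000

def matvec(v):
    """Apply tridiag(-1, 2, -1) without ever storing the matrix,
    standing in for an application's own data format."""
    out = 2.0 * v
    out[:-1] -= v[1:]
    out[1:] -= v[:-1]
    return out

A = LinearOperator((n, n), matvec=matvec)
b = np.ones(n)
x, info = cg(A, b)  # conjugate gradient accelerator; info == 0 on success
```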
Randomized Trial of Nicotine Lozenges and Phone Counseling for Smokeless Tobacco Cessation
Danaher, Brian G.; Ebbert, Jon O.; van Meter, Nora; Lichtenstein, Edward; Widdop, Chris; Crowley, Ryann; Akers, Laura; Seeley, John R.
2015-01-01
Introduction: Relatively few treatment programs have been developed specifically for smokeless tobacco (ST) users who want to quit. Their results suggest that self-help materials, telephone counseling, and nicotine lozenges are efficacious. This study provides the first direct examination of the separate and combined effects of telephone counseling and lozenges. Methods: We recruited ST users online (N = 1067) and randomly assigned them to 1 of 3 conditions: (a) a lozenge group (n = 356), who were mailed 4-mg nicotine lozenges; (b) a coach calls group (n = 354), who were offered 3 coaching phone calls; or (c) a lozenge + coach calls group (N = 357), who received both lozenges and coaching calls. Additionally, all participants were mailed self-help materials. Self-reported tobacco abstinence was assessed at 3 and 6 months after randomization. Results: Complete-case and intention-to-treat (ITT) analyses for all tobacco abstinence were performed at 3 months, 6 months, and both 3 and 6 months (repeated point prevalence). ITT analyses revealed a highly similar result: the lozenge + coach calls condition was significantly more successful in encouraging tobacco abstinence than either the lozenge group or the coach calls group, which did not differ. Conclusions: Combining nicotine lozenges and phone counseling significantly increased tobacco abstinence rates compared with either intervention alone, whereas coach calls and lozenges were equivalent. The study confirms the high tobacco abstinence rates for self-help ST cessation interventions and offers guidance to providing tobacco treatment to ST users. PMID:25168034
Weakly stationary noise filtering of satellite-acquired imagery
NASA Technical Reports Server (NTRS)
Palgen, J. J. O.; Tamches, I.; Deutsch, E. S.
1971-01-01
A type of weakly stationary noise called herringbone noise was observed in satellite imagery. The characteristics of this noise are described; a model for its simulation was developed. The model is used to degrade pictorial data for comparison with similar noise degraded Nimbus data. Two filtering methods are defined and evaluated. A user's application demonstration is discussed.
ERIC Educational Resources Information Center
Drott, M. Carl; And Others
This study describes materials used by secondary school students in preparing independent study papers and other types of assignments calling for library use, including the use of home collections and school, public, college, and special libraries. Bibliometric methods were used to provide measurement of the nature and currency of books,…
ERIC Educational Resources Information Center
Moore, J.
2010-01-01
The philosophy of science is the branch of philosophy that critically examines the foundations, assumptions, methods, products, and implications of the activity called science. The present sketch reviews the historical development of the philosophy of science, representative individuals in the field, and topics of long-standing interest. The…
The Development and Application of Novel Methods for the Solution of EMP Shielding Problems.
1981-02-01
chirality into chemistry, and originated the branch of chemistry we now call stereochemistry... on a molar scale in, for example, snails and flowers... Chemistry Nobel Lecture (Dec 12, 1915), also in Les Prix Nobel (Imprimerie Royale, P. A. Norstedt)... denotes that the strand in position 1 crosses in front
Games as an Artistic Medium: Investigating Complexity Thinking in Game-Based Art Pedagogy
ERIC Educational Resources Information Center
Patton, Ryan M.
2013-01-01
This action research study examines the making of video games, using an integrated development environment software program called GameMaker, as art education curriculum for students between the ages of 8-13. Through a method I designed, students created video games using the concepts of move, avoid, release, and contact (MARC) to explore their…
Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne
2018-01-01
Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries. PMID:29351349
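A rough sketch of the AAPOR-style outcome rates named above (simplified: cases of unknown eligibility are folded into the denominator, and the counts are hypothetical rather than the survey's actual disposition table):

```python
def aapor_rates(I, P, R, NC, O):
    """I: completes, P: partials, R: refusals, NC: non-contacts, O: other."""
    denom = I + P + R + NC + O
    response    = I / denom                 # cf. AAPOR RR1
    cooperation = I / (I + P + R + O)       # cf. AAPOR COOP1
    refusal     = R / denom                 # cf. AAPOR REF1
    contact     = (I + P + R + O) / denom   # cf. AAPOR CON1
    return response, cooperation, refusal, contact

print(aapor_rates(I=9469, P=3547, R=2500, NC=14000, O=500))
```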
Garland, Ellen C; Castellote, Manuel; Berchok, Catherine L
2015-06-01
Beluga whales, Delphinapterus leucas, have a graded call system; call types exist on a continuum making classification challenging. A description of vocalizations from the eastern Beaufort Sea beluga population during its spring migration are presented here, using both a non-parametric classification tree analysis (CART), and a Random Forest analysis. Twelve frequency and duration measurements were made on 1019 calls recorded over 14 days off Icy Cape, Alaska, resulting in 34 identifiable call types with 83% agreement in classification for both CART and Random Forest analyses. This high level of agreement in classification, with an initial subjective classification of calls into 36 categories, demonstrates that the methods applied here provide a quantitative analysis of a graded call dataset. Further, as calls cannot be attributed to individuals using single sensor passive acoustic monitoring efforts, these methods provide a comprehensive analysis of data where the influence of pseudo-replication of calls from individuals is unknown. This study is the first to describe the vocal repertoire of a beluga population using a robust and repeatable methodology. A baseline eastern Beaufort Sea beluga population repertoire is presented here, against which the call repertoire of other seasonally sympatric Alaskan beluga populations can be compared.
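A sketch of the classification step with made-up data; the feature values and labels are random stand-ins for the 12 acoustic measurements and the 36 subjective call categories, and the cross-validation setup is an assumption rather than the authors' protocol.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.standard_normal((1019, 12))  # 12 frequency/duration measurements per call
y = rng.integers(0, 36, size=1019)   # 36 initial subjective call types

clf = RandomForestClassifier(n_estimators=500, random_state=0)
agreement = cross_val_score(clf, X, y, cv=5).mean()  # cf. the 83% reported
```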
Multi-Optimisation Consensus Clustering
NASA Astrophysics Data System (ADS)
Li, Jian; Swift, Stephen; Liu, Xiaohui
Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
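The co-association (consensus) matrix underlying methods of this family can be sketched as follows; MOCC's agreement-separation optimisation itself is not reproduced here, and the base clusterer choice is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
X = rng.standard_normal((200, 4))

n_runs = 20
co = np.zeros((len(X), len(X)))
for seed in range(n_runs):
    labels = KMeans(n_clusters=3, n_init=5, random_state=seed).fit_predict(X)
    co += labels[:, None] == labels[None, :]
co /= n_runs  # co[i, j] = fraction of runs clustering points i and j together
```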
A decade of improvements in Mimiviridae and Marseilleviridae isolation from amoeba.
Pagnier, Isabelle; Reteno, Dorine-Gaelle Ikanga; Saadi, Hanene; Boughalmi, Mondher; Gaia, Morgan; Slimani, Meriem; Ngounga, Tatsiana; Bekliz, Meriem; Colson, Philippe; Raoult, Didier; La Scola, Bernard
2013-01-01
Since the isolation of the first giant virus, the Mimivirus, by T.J. Rowbotham in a cooling tower in Bradford, UK, and after its characterisation by our group in 2003, we have continued to develop novel strategies to isolate additional strains. By first focusing on cooling towers using our original time-consuming procedure, we were able to isolate a new lineage of giant virus called Marseillevirus and a new Mimivirus strain called Mamavirus. In the following years, we have accumulated the world's largest unique collection of giant viruses by improving the use of antibiotic combinations to avoid bacterial contamination of amoeba, developing strategies of preliminary screening of samples by molecular methods, and using a high-throughput isolation method developed by our group. Based on the inoculation of nearly 7,000 samples, our collection currently contains 43 strains of Mimiviridae (14 in lineage A, 6 in lineage B, and 23 in lineage C) and 17 strains of Marseilleviridae isolated from various environments, including 3 of human origin. This study details the procedures used to build this collection and paves the way for the high-throughput isolation of new isolates to improve the record of giant virus distribution in the environment and the determination of their pangenome.
Algorithms and software for solving finite element equations on serial and parallel architectures
NASA Technical Reports Server (NTRS)
George, Alan
1989-01-01
Over the past 15 years numerous new techniques have been developed for solving systems of equations and eigenvalue problems arising in finite element computations. A package called SPARSPAK has been developed by the author and his co-workers which exploits these new methods. The broad objective of this research project is to incorporate some of this software in the Computational Structural Mechanics (CSM) testbed, and to extend the techniques for use on multiprocessor architectures.
Analysis of the transient behavior of rubbing components
NASA Technical Reports Server (NTRS)
Quezdou, M. B.; Mullen, R. L.
1986-01-01
Finite element equations are developed for studying the deformations and temperatures resulting from frictional heating in a sliding system. The formulation is done for linear steady-state motion in two dimensions. The equations include the effect of the velocity of the moving components, which produces spurious oscillations in solutions obtained by Galerkin finite element methods. A method called the streamline upwind scheme is used to address this deficiency. The finite element program is then used to investigate frictional heating in a gas path seal.
Finding Major Patterns of Aging Process by Data Synchronization
NASA Astrophysics Data System (ADS)
Miyano, Takaya; Tsutsui, Takako
We developed a method for extracting feature patterns from multivariate data using a network of coupled phase oscillators subject to an analogue of the Kuramoto model for collective synchronization. Our method may be called data synchronization. We applied data synchronization to the care-needs-certification data of the Japanese public long-term care insurance program, provided by Otsu City, a historic city near Kyoto, to find the major patterns of the aging process for elderly people needing nursing care.
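A minimal sketch of the Kuramoto dynamics the method builds on, dθ_i/dt = ω_i + (K/N) Σ_j sin(θ_j − θ_i); in data synchronization the natural frequencies would be set from the multivariate data rather than drawn at random, as done here.

```python
import numpy as np

rng = np.random.default_rng(7)
N, K, dt = 100, 1.5, 0.01
omega = rng.standard_normal(N)        # natural frequencies (data-driven in practice)
theta = rng.uniform(0, 2 * np.pi, N)  # initial phases

for _ in range(5000):
    coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta += dt * (omega + (K / N) * coupling)  # Euler step of the Kuramoto ODE

r = abs(np.exp(1j * theta).mean())    # order parameter: near 1 when synchronized
print(r)
```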
Use of a public telephone hotline to detect urban plague cases.
Malberg, J A; Pape, W J; Lezotte, D; Hill, A E
2012-11-01
Current methods for vector-borne disease surveillance are limited by time and cost. To avoid human infections from emerging zoonotic diseases, it is important that the United States develop cost-effective surveillance systems for these diseases. This study examines the methodology used in the surveillance of a plague epizootic involving tree squirrels (Sciurus niger) in Denver, Colorado, during the summer of 2007. A call-in centre for the public to report dead squirrels was used to direct animal carcass sampling. Staff used these reports to collect squirrel carcasses for analysis of Yersinia pestis infection. This sampling protocol was analysed at the census tract level using Poisson regression to determine the relationship between higher call volumes in a census tract and the risk of a carcass in that tract testing positive for plague. Over-sampling owing to call volume-directed collection was accounted for by including the number of animals collected as the denominator in the model. The risk of finding an additional plague-positive animal increased as the call volume per census tract increased. The risk in census tracts with more than three calls a month was significantly higher than in tracts with three or fewer calls a month. For tracts with 4-5 calls, the relative risk (RR) of an additional plague-positive carcass was 10.08 (95% CI 5.46-18.61); for tracts with 6-8 calls, the RR = 5.20 (2.93-9.20); for tracts with 9-11 calls, the RR = 12.80 (5.85-28.03); and tracts with >11 calls had RR = 35.41 (18.60-67.40). Overall, the call-in centre directed sampling increased the probability of locating plague-infected carcasses in the known Denver epizootic. Further studies are needed to determine the effectiveness of this methodology for monitoring large-scale zoonotic disease occurrence in the absence of a recognized epizootic. © 2012 Blackwell Verlag GmbH.
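A sketch of the regression described above, with simulated tract-level counts: the number of animals collected enters as an offset, so call volume is tested per animal sampled (variable names and values are illustrative, not the study's data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n_tracts = 150
calls = rng.poisson(5, n_tracts)          # monthly call volume per tract
collected = 1 + rng.poisson(3, n_tracts)  # carcasses collected (denominator)
positives = rng.binomial(collected, np.clip(0.02 * calls, 0.0, 0.9))

X = sm.add_constant(calls.astype(float))
fit = sm.GLM(positives, X, family=sm.families.Poisson(),
             offset=np.log(collected)).fit()
print(fit.summary())  # exp(coef) ~ rate ratio per additional call
```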
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohr, C.L.; Pankaskie, P.J.; Heasler, P.G.
Reactor fuel failure data sets in the form of initial power (P_i), final power (P_f), transient increase in power (ΔP), and burnup (Bu) were obtained for pressurized heavy water reactors (PHWRs), boiling water reactors (BWRs), and pressurized water reactors (PWRs). These data sets were evaluated and used as the basis for developing two predictive fuel failure models: a graphical concept called the PCI-OGRAM, and a nonlinear regression based model called PROFIT. The PCI-OGRAM is an extension of the FUELOGRAM developed by AECL. It is based on a critical threshold concept for stress dependent stress corrosion cracking. The PROFIT model, developed at Pacific Northwest Laboratory, is the result of applying standard statistical regression methods to the available PCI fuel failure data and an analysis of the environmental and strain rate dependent stress-strain properties of the Zircaloy cladding.
The Effect of the Laboratory Specimen on Fatigue Crack Growth Rate
NASA Technical Reports Server (NTRS)
Forth, S. C.; Johnston, W. M.; Seshadri, B. R.
2006-01-01
Over the past thirty years, laboratory experiments have been devised to develop fatigue crack growth rate data that are representative of the material response. The crack growth rate data generated in the laboratory are then used to predict the safe operating envelope of a structure. The ability to interrelate laboratory data and structural response is called similitude. In essence, a correlating parameter, called the stress intensity factor, was developed that combines the applied stresses, crack size and geometric configuration. The stress intensity factor is then directly related to the rate at which cracks propagate in a material, yielding the material property of fatigue crack growth response. Standardized specimen configurations and experimental procedures have been developed for laboratory testing to generate crack growth rate data that support similitude of the stress intensity factor solution. In this paper, the authors present laboratory fatigue crack growth rate test data and finite element analyses which show that similitude between standard specimen configurations tested using the constant stress ratio test method is unobtainable.
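For reference, the standard textbook relations behind the similitude argument (assumed forms, not equations quoted from this paper): the stress intensity factor combines stress, crack size and a geometry correction, and the crack growth rate is taken to depend only on its range, as in the Paris law.

```latex
K = \sigma \sqrt{\pi a}\, F\!\left(\frac{a}{W}\right), \qquad
\frac{\mathrm{d}a}{\mathrm{d}N} = C\,(\Delta K)^{m}
```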
Applying fiber optical methods for toxicological testing in vitro
NASA Astrophysics Data System (ADS)
Maerz, Holger K.; Buchholz, Rainer; Emmrich, Frank; Fink, Frank; Geddes, Clive L.; Pfeifer, Lutz; Raabe, Ferdinand; Scheper, Thomas-Helmut; Ulrich, Elizabeth; Marx, Uwe
1999-04-01
New medical developments, e.g. immune therapy, patient-oriented chemotherapy or even gene therapy, call into question the continued need for animal testing. Instead, the call for humane, reproducible in vitro models grows increasingly loud. Pharmaceutical use of in vitro testing has a long proven history. In cancer research and therapy, the effect of chemostatic agents is tested in vitro in the so-called oncobiogram, but the assays do not always correlate with in vivo drug resistance and sensitivity. We developed an in vitro drug test system, feasible for therapeutic drug monitoring, by combining tissue cultivation in hollow fiber bioreactors with fiber optic sensors that monitor the pharmaceutical effect. Using two fiber optic sensors, an optical oxygen sensor and a metabolism-detecting Laserfluoroscope, we were able to successfully monitor the biological status of the tissue culture and the drug or toxic effects during in vitro pharmaceutical testing. Furthermore, we developed and patented a system for monitoring the effect of minor toxic compounds which can induce Sick Building Syndrome.
Software design and implementation of ship heave motion monitoring system based on MBD method
NASA Astrophysics Data System (ADS)
Yu, Yan; Li, Yuhan; Zhang, Chunwei; Kang, Won-Hee; Ou, Jinping
2015-03-01
Marine transportation plays a significant role in the modern transport sector due to its low cost and large capacity, and it is attracting enormous attention worldwide; the related areas of product development have become a research hot spot. DSP signal processors feature small size, low cost, high precision, and fast processing speed, and have been widely used in all kinds of monitoring systems. However, the traditional DSP code development process is time-consuming, inefficient, costly, and difficult. MathWorks proposed Model-Based Design (MBD) to overcome these defects: target-board modules in the Simulink library are called to compile and generate the corresponding code for the target processor, and the DSP integrated development environment CCS is then called automatically to validate the algorithm on the target processor. This paper uses MBD to design the algorithm for the ship heave motion monitoring system and shows that the generated algorithm runs successfully on the processor, proving the effectiveness of the MBD approach.
Conduits to care: call lights and patients’ perceptions of communication
Montie, Mary; Shuman, Clayton; Galinato, Jose; Patak, Lance; Anderson, Christine A; Titler, Marita G
2017-01-01
Background Call light systems remain the primary means for hospitalized patients to initiate communication with their health care providers. Although there is a vast amount of literature discussing patient communication with health care providers, few studies have explored patients’ perceptions concerning call light use and communication. The specific aim of this study was to solicit patients’ perceptions regarding their call light use and communication with nursing staff. Methods Patients invited to this study met the following inclusion criteria: proficient in English, hospitalized for at least 24 hours, aged ≥21 years, and able to communicate verbally (eg, not intubated). Thirty participants provided written informed consent, were enrolled in the study, and completed interviews. Results Using qualitative descriptive methods, five major themes emerged from patients’ perceptions (namely: establishing connectivity, participant safety concerns, no separation: health care and the call light device, issues with the current call light, and participants’ perceptions of “nurse work”). Multiple minor themes supported these major themes. Data analysis utilized the constant comparative methods of Glaser and Strauss. Discussion Findings from this study extend our knowledge of patients’ understanding not only of why inconsistencies occur between the call light and their nurses, but also of why the call light is more than merely a device to initiate communication; rather, it is a direct conduit to their health care and its delivery. PMID:29075125
Bates, Maxwell; Berliner, Aaron J; Lachoff, Joe; Jaschke, Paul R; Groban, Eli S
2017-01-20
Wet Lab Accelerator (WLA) is a cloud-based tool that allows a scientist to conduct biology via robotic control without the need for any programming knowledge. A drag-and-drop interface provides a convenient and user-friendly method of generating biological protocols. Graphically developed protocols are turned into the programmatic instruction lists required to conduct experiments at the cloud laboratory Transcriptic. Prior to the development of WLA, biologists were required to write in a programming language called "Autoprotocol" in order to work with Transcriptic. WLA relies on a new abstraction layer we call "Omniprotocol" to convert the graphical experimental description into the lower-level Autoprotocol language, which then directs robots at Transcriptic. While WLA has only been tested at Transcriptic, the conversion of graphically laid-out experimental steps into Autoprotocol is generic, allowing extension of WLA to other cloud laboratories in the future. WLA hopes to democratize biology by bringing automation to general biologists.
Improving Upon String Methods for Transition State Discovery.
Chaffey-Millar, Hugh; Nikodem, Astrid; Matveev, Alexei V; Krüger, Sven; Rösch, Notker
2012-02-14
Transition state discovery via application of string methods has been researched on two fronts. The first front involves development of a new string method, named the Searching String method, while the second one aims at estimating transition states from a discretized reaction path. The Searching String method has been benchmarked against a number of previously existing string methods and the Nudged Elastic Band method. The developed methods have led to a reduction in the number of gradient calls required to optimize a transition state, as compared to existing methods. The Searching String method reported here places new beads on a reaction pathway at the midpoint between existing beads, such that the resolution of the path discretization in the region containing the transition state grows exponentially with the number of beads. This approach leads to favorable convergence behavior and generates more accurate estimates of transition states from which convergence to the final transition states occurs more readily. Several techniques for generating improved estimates of transition states from a converged string or nudged elastic band have been developed and benchmarked on 13 chemical test cases. Optimization approaches for string methods, and pitfalls therein, are discussed.
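To make the bead-placement idea concrete, the following minimal Python sketch inserts new beads at segment midpoints next to the current highest-energy bead, so the discretization densifies around the barrier. It illustrates the midpoint rule described in the abstract, not the authors' implementation; the toy energy surface and the tie-breaking rule are our assumptions.

```python
import numpy as np

def insert_bead(beads, V):
    """Insert a new bead at the midpoint of the segment adjacent to the
    current highest-energy bead, on its higher-energy side (assumed
    tie-breaking), mimicking the midpoint placement rule above."""
    e = np.array([V(b) for b in beads])
    k = int(np.argmax(e))
    if k == 0:
        j = 1
    elif k == len(beads) - 1:
        j = k
    else:
        j = k if e[k - 1] > e[k + 1] else k + 1
    new = 0.5 * (beads[j - 1] + beads[j])   # midpoint in configuration space
    return np.insert(beads, j, new, axis=0)

# toy 1-D double well V(x) = (x^2 - 1)^2 with the barrier at x = 0
V = lambda b: (b[0]**2 - 1.0)**2
beads = np.linspace(-1.0, 1.0, 4)[:, None]   # initial 4-bead string
for _ in range(4):                           # resolution grows near the barrier
    beads = insert_bead(beads, V)
print(beads.ravel())
```

Each pass adds a bead next to the current maximum, so the path resolution near the transition state grows with the number of beads, in the spirit of the exponential refinement described above.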
Greene, Charles R; McLennan, Miles Wm; Norman, Robert G; McDonald, Trent L; Jakubczak, Ray S; Richardson, W John
2004-08-01
Bowhead whales, Balaena mysticetus, migrate west during fall approximately 10-75 km off the north coast of Alaska, passing the petroleum developments around Prudhoe Bay. Oil production operations on an artificial island 5 km offshore create sounds heard by some whales. As part of an effort to assess whether migrating whales deflect farther offshore at times with high industrial noise, an acoustical approach was selected for localizing calling whales. The technique incorporated DIFAR (directional frequency and recording) sonobuoy techniques. An array of 11 DASARs (directional autonomous seafloor acoustic recorders) was built and installed with unit-to-unit separation of 5 km. When two or more DASARs detected the same call, the whale location was determined from the bearing intersections. This article describes the acoustic methods used to determine the locations of the calling bowhead whales and shows the types and precision of the data acquired. Calibration transmissions at GPS-measured times and locations provided measures of the individual DASAR clock drift and directional orientation. The standard error of the bearing measurements at distances of 3-4 km was approximately 1.35 degrees after corrections for gain imbalance in the two directional sensors. During 23 days in 2002, 10,587 bowhead calls were detected and 8383 were localized.
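The core localization step, intersecting bearings from two recorders, reduces to a 2x2 linear solve. The sketch below is a simplified two-sensor version of that fix, assuming flat geometry and bearings in degrees clockwise from true north, and ignoring the clock-drift and orientation calibrations described above.

```python
import numpy as np

def locate_from_bearings(p1, b1, p2, b2):
    """Intersect two bearing lines measured at sensor positions p1, p2
    (easting, northing in metres); bearings are degrees clockwise from
    true north. Parallel bearings would make the system singular."""
    d1 = np.array([np.sin(np.radians(b1)), np.cos(np.radians(b1))])
    d2 = np.array([np.sin(np.radians(b2)), np.cos(np.radians(b2))])
    # solve p1 + t1*d1 = p2 + t2*d2 for the ranges t1, t2
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# two sensors 5 km apart, as in the array described above (hypothetical bearings)
whale = locate_from_bearings((0.0, 0.0), 45.0, (5000.0, 0.0), 315.0)
print(whale)  # -> [2500. 2500.]
```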
Mobile Phone Call Data as a Regional Socio-Economic Proxy Indicator
Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K.; Ylä-Jääski, Antti
2015-01-01
The advent of publishing anonymized call detail records opens the door for temporal and spatial human dynamics studies. Such studies, besides being useful for creating universal models of mobility patterns, could also be used for creating new socio-economic proxy indicators that do not rely only on local or state institutions. In this paper, from the frequency of calls at different times of the day in different small regional units (sub-prefectures) in Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then compare how those patterns correlate with data from other sources, such as news of particular events in the given period, census data, economic activity, the poverty index, and power plant and energy grid data. Our results show high correlation in many of the cases, revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and the results may be particularly relevant to policy-makers engaged in poverty reduction initiatives, as they can provide an affordable tool in the context of resource-constrained developing economies, such as Côte d'Ivoire's. PMID:25897957
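A minimal sketch of the home/work inference step: assign each user the sub-prefecture where they call most often at night (home) and during business hours (work). The hour windows and data layout are our assumptions; the paper's exact frequency rules are not given in the abstract.

```python
from collections import Counter

def infer_home_work(events):
    """Infer home and work sub-prefectures from call events, a simplified
    stand-in for the frequency-based assignment described above.
    `events` is a list of (hour_of_day, subprefecture_id) pairs."""
    night = Counter(s for h, s in events if h >= 20 or h < 7)   # assumed home hours
    work = Counter(s for h, s in events if 9 <= h < 17)         # assumed work hours
    home_sp = night.most_common(1)[0][0] if night else None
    work_sp = work.most_common(1)[0][0] if work else None
    return home_sp, work_sp

# hypothetical call records for one user
calls = [(22, "SP-041"), (23, "SP-041"), (6, "SP-041"),
         (10, "SP-007"), (14, "SP-007"), (15, "SP-007")]
print(infer_home_work(calls))  # -> ('SP-041', 'SP-007')
```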
Relative Displacement Method for Track-Structure Interaction
Ramos, Óscar Ramón; Pantaleón, Marcos J.
2014-01-01
The track-structure interaction effects are usually analysed with conventional FEM programs, where it is difficult to implement the complex track-structure connection behaviour, which is nonlinear and elastic-plastic and depends on the vertical load. The authors developed an alternative analysis method, which they call the relative displacement method. It is based on the calculation of deformation states in single-DOF element models that satisfy the boundary conditions. For its solution, an iterative optimisation algorithm is used. This method can be implemented in any programming language or analysis software. A comparison with ABAQUS calculations shows a very good result correlation and compliance with the standard's specifications. PMID:24634610
An extension of command shaping methods for controlling residual vibration using frequency sampling
NASA Technical Reports Server (NTRS)
Singer, Neil C.; Seering, Warren P.
1992-01-01
The authors present an extension to the impulse shaping technique for commanding machines to move with reduced residual vibration. The extension, called frequency sampling, is a method for generating constraints that are used to obtain shaping sequences which minimize residual vibration in systems, such as robots, whose resonant frequencies change during motion. The authors present a review of impulse shaping methods, a development of the proposed extension, and a comparison of results of tests conducted on a simple model of the space shuttle robot arm. Frequency sampling provides a method for minimizing the impulse sequence duration required to give the desired insensitivity.
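For context, the classic two-impulse zero-vibration (ZV) shaper designs for a single known mode; the frequency-sampling extension described above instead imposes constraints at many sampled frequencies so the shaper tolerates resonances that change during motion. A minimal sketch of the single-frequency baseline (not the frequency-sampling method itself):

```python
import numpy as np

def zv_shaper(wn, zeta):
    """Classic two-impulse zero-vibration (ZV) shaper for a mode with
    natural frequency wn (rad/s) and damping ratio zeta. Convolving a
    reference command with these impulses cancels the mode's residual
    vibration at this single design point."""
    K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))
    wd = wn * np.sqrt(1.0 - zeta**2)          # damped natural frequency
    times = np.array([0.0, np.pi / wd])       # impulse times
    amps = np.array([1.0, K]) / (1.0 + K)     # impulse amplitudes (sum to 1)
    return times, amps

t, a = zv_shaper(wn=2.0 * np.pi * 0.5, zeta=0.05)  # hypothetical 0.5 Hz mode
print(t, a)
```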
Theoretical model for frequency locking a diode laser with a Faraday cell
NASA Technical Reports Server (NTRS)
Wanninger, P.; Shay, T. M.
1992-01-01
A new method was developed for frequency locking diode lasers, called the Faraday anomalous dispersion optical transmitter (FADOT) laser locking, which is much simpler than other known locking schemes. The FADOT laser locking method uses commercial laser diodes with no antireflection coatings, an atomic Faraday cell with a single polarizer, and an output coupler to form a compound cavity. The FADOT method is vibration insensitive and exhibits minimal thermal expansion effects. The system has a frequency pull-in range of 443.2 GHz (9 Å). The method has potential applications in optical communication, remote sensing, and pumping laser excited optical filters.
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.
1998-01-01
Robust control system analysis and design is based on an uncertainty description, called a linear fractional transformation (LFT), which separates the uncertain (or varying) part of the system from the nominal system. These models are also useful in the design of gain-scheduled control systems based on Linear Parameter Varying (LPV) methods. Low-order LFT models are difficult to form for problems involving nonlinear parameter variations. This paper presents a numerical computational method for constructing an LFT model for a given LPV model. The method is developed for multivariate polynomial problems, and uses simple matrix computations to obtain an exact low-order LFT representation of the given LPV system without the use of model reduction. Although the method is developed for multivariate polynomial problems, multivariate rational problems can also be solved by reformulating the rational problem into a polynomial form.
Noncontact evaluation for interface states by photocarrier counting
NASA Astrophysics Data System (ADS)
Furuta, Masaaki; Shimizu, Kojiro; Maeta, Takahiro; Miyashita, Moriya; Izunome, Koji; Kubota, Hiroshi
2018-03-01
We have developed a noncontact measurement method that enables in-line measurement and does not require any test element group (TEG) formation. In this method, the number of photocarriers excited from the interface states is counted, which is called “photocarrier counting”, and the energy distribution of the interface state density (D_it) is then evaluated by spectral light excitation. In our previous experiment, the method used was a preliminary contact measurement at the oxide on top of the Si wafer. We have now developed a D_it measurement method as a noncontact measurement with a gap between the probes and the wafer. The shallow trench isolation (STI) sidewall has more localized interface states than the region under the gate electrode. We demonstrate the noncontact measurement of trapped carriers from interface states using wafers of three different crystal plane orientations. The demonstration will pave the way for evaluating STI sidewall interface states in future studies.
Efficient Fluid Dynamic Design Optimization Using Cartesian Grids
NASA Technical Reports Server (NTRS)
Dadone, A.; Grossman, B.; Sellers, Bill (Technical Monitor)
2004-01-01
This report is subdivided into three parts. The first reviews a new approach to the computation of inviscid flows using Cartesian grid methods. The crux of the method is the curvature-corrected symmetry technique (CCST) developed by the present authors for body-fitted grids. The method introduces ghost cells near the boundaries whose values are developed from an assumed flow-field model in the vicinity of the wall consisting of a vortex flow, which satisfies the normal momentum equation and the non-penetration condition. The CCST boundary condition was shown to be substantially more accurate than traditional boundary condition approaches. This improved boundary condition is adapted to a Cartesian mesh formulation, which we call the Ghost Body-Cell Method (GBCM). In this approach, all cell centers exterior to the body are computed with fluxes at the four surrounding cell edges. There is no need for the special treatment of cut cells that complicates other Cartesian mesh methods.
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
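The quantitative step rests on standard fault-tree algebra: independent step-failure estimates combine through OR gates, while redundant workarounds combine through AND gates, which is consistent with the observation that frequent step failures rarely produce overall system failure. A minimal sketch with hypothetical rates and gate structure (not the study's actual tree):

```python
import numpy as np

def or_gate(probs):
    """Branch fails if any step fails (independence assumed)."""
    return 1.0 - float(np.prod(1.0 - np.asarray(probs)))

def and_gate(probs):
    """Branch fails only if every redundant path fails."""
    return float(np.prod(probs))

steps = [0.10, 0.05, 0.02]   # hypothetical mean estimated failure rates per step
workarounds = [0.5, 0.4]     # hypothetical chance each workaround also fails
branch = or_gate(steps)
system = and_gate([branch, *workarounds])
print(f"step-level failure {branch:.3f}, overall failure {system:.3f}")
# frequent step failure (~16%) but rare overall failure (~3%)
```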
New computational tools for H/D determination in macromolecular structures from neutron data.
Siliqi, Dritan; Caliandro, Rocco; Carrozzini, Benedetta; Cascarano, Giovanni Luca; Mazzone, Annamaria
2010-11-01
Two new computational methods dedicated to neutron crystallography, called n-FreeLunch and DNDM-NDM, have been developed and successfully tested. The aim in developing these methods is to determine hydrogen and deuterium positions in macromolecular structures by using information from neutron density maps. Of particular interest is resolving cases in which the geometrically predicted hydrogen or deuterium positions are ambiguous. The methods are an evolution of approaches that are already applied in X-ray crystallography: extrapolation beyond the observed resolution (known as the FreeLunch procedure) and a difference electron-density modification (DEDM) technique combined with the electron-density modification (EDM) tool (known as DEDM-EDM). It is shown that the two methods are complementary to each other and are effective in finding the positions of H and D atoms in neutron density maps.
Multi data mode method as an alternative way for SPM studies of high relief surfaces
NASA Astrophysics Data System (ADS)
Abdullayeva, S. H.; Molchanov, S. P.; Mamedov, N. T.; Alekperov, S. D.
2006-09-01
In this paper we report the results of our studies of the high relief surfaces of an Al oxide-based ceramic catalyst by the SPM contact mode and, for comparison, by the so-called Multi Data Mode (MDM) method. We failed to obtain any reasonable image of the highly developed surfaces of the above material with the first method, but were successful when applying the second. The topographic and complementary images obtained by MDM probing with high resolution are discussed to show the full range of applications possible using MDM.
Deep classification hashing for person re-identification
NASA Astrophysics Data System (ADS)
Wang, Jiabao; Li, Yang; Zhang, Xiancai; Miao, Zhuang; Tao, Gang
2018-04-01
With the development of surveillance in public spaces, person re-identification becomes more and more important. Large-scale databases call for efficient computation and storage, and hashing is one of the most important techniques. In this paper, we propose a new deep classification hashing network, introducing a new binary appropriation layer into traditional ImageNet pre-trained CNN models. It outputs binary-appropriate features, which can easily be quantized into binary hash codes for Hamming similarity comparison. Experiments show that our deep hashing method can outperform the state-of-the-art methods on the public CUHK03 and Market1501 datasets.
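The retrieval step the abstract relies on, quantizing near-binary features into hash codes and ranking by Hamming distance, can be sketched as follows; the zero-thresholding rule is a generic stand-in, not necessarily the paper's exact binary appropriation layer.

```python
import numpy as np

def to_codes(features):
    """Quantize real-valued network outputs into binary hash codes by
    thresholding at zero, the usual final step after a layer trained to
    emit near-binary activations."""
    return (features > 0).astype(np.uint8)

def hamming(a, b):
    return int(np.count_nonzero(a != b))

# hypothetical 8-bit features for one query and a two-image gallery
query = to_codes(np.array([0.9, -0.8, 0.7, -0.6, 0.5, -0.9, 0.8, -0.7]))
gallery = to_codes(np.array([[0.8, -0.9, 0.6, -0.5, 0.4, -0.8, 0.9, -0.6],
                             [-0.7, 0.8, -0.9, 0.6, -0.5, 0.7, -0.8, 0.9]]))
dists = [hamming(query, g) for g in gallery]
print(dists)  # -> [0, 8]; rank gallery images by ascending distance
```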
MPAI (mass probes aided ionization) method for total analysis of biomolecules by mass spectrometry.
Honda, Aki; Hayashi, Shinichiro; Hifumi, Hiroki; Honma, Yuya; Tanji, Noriyuki; Iwasawa, Naoko; Suzuki, Yoshio; Suzuki, Koji
2007-01-01
We have designed and synthesized various mass probes, which enable us to effectively ionize various molecules to be detected with mass spectrometry. We call the ionization method using mass probes the "MPAI (mass probes aided ionization)" method. We aim at the sensitive detection of various biological molecules, as well as the serial detection of biomolecules by a single mass spectrometer without changing the mechanical settings. Here, we review mass probes for small molecules with various functional groups and mass probes for proteins. Further, we introduce newly developed mass probes for proteins for highly sensitive detection.
Effect of introduction of electronic patient reporting on the duration of ambulance calls.
Kuisma, Markku; Väyrynen, Taneli; Hiltunen, Tuomas; Porthan, Kari; Aaltonen, Janne
2009-10-01
We examined the effect of the change from paper records to electronic patient records (EPRs) on ambulance call duration. We retrieved call duration times 6 months before (group 1) and 6 months after (group 2) the introduction of the EPR. Subgroup analysis of group 2 was performed according to whether the calls were made during the first or last 3 months after EPR introduction. We analyzed 37,599 ambulance calls (17,950 in group 1 and 19,649 in group 2). The median call duration was 48 minutes in group 1 and 49 minutes in group 2 (P = .008). In group 2, call duration was longer during the first 3 months after EPR introduction. In multiple linear regression analysis, urgency category (P < .0001), unit level (P < .0001), and transportation decision (P < .0001) influenced the call duration. The documentation method was not a significant factor. An electronic patient record system can be implemented in an urban ambulance service in such a way that the documentation method does not become a significant factor in determining call duration in the long run. A temporary performance drop during the first 3 months after introduction was noticed, reflecting the adaptation process to a new way of working.
ERIC Educational Resources Information Center
Rogerson-Revell, Pamela
2005-01-01
This paper describes some of the pedagogical and technical issues involved in adopting a hybrid approach to CALL materials development. It illustrates some of these issues with reference to a vocational CALL project, LANCAM, which took such a hybrid approach. It describes some of the benefits and considerations involved in hybrid development and…
Humble, Emily; Thorne, Michael A S; Forcada, Jaume; Hoffman, Joseph I
2016-08-26
Single nucleotide polymorphism (SNP) discovery is an important goal of many studies. However, the number of 'putative' SNPs discovered from a sequence resource may not provide a reliable indication of the number that will successfully validate with a given genotyping technology. For this it may be necessary to account for factors such as the method used for SNP discovery and the type of sequence data from which it originates, suitability of the SNP flanking sequences for probe design, and genomic context. To explore the relative importance of these and other factors, we used Illumina sequencing to augment an existing Roche 454 transcriptome assembly for the Antarctic fur seal (Arctocephalus gazella). We then mapped the raw Illumina reads to the new hybrid transcriptome using BWA and BOWTIE2 before calling SNPs with GATK. The resulting markers were pooled with two existing sets of SNPs called from the original 454 assembly using NEWBLER and SWAP454. Finally, we explored the extent to which SNPs discovered using these four methods overlapped and predicted the corresponding validation outcomes for both Illumina Infinium iSelect HD and Affymetrix Axiom arrays. Collating markers across all discovery methods resulted in a global list of 34,718 SNPs. However, concordance between the methods was surprisingly poor, with only 51.0 % of SNPs being discovered by more than one method and 13.5 % being called from both the 454 and Illumina datasets. Using a predictive modeling approach, we could also show that SNPs called from the Illumina data were on average more likely to successfully validate, as were SNPs called by more than one method. Above and beyond this pattern, predicted validation outcomes were also consistently better for Affymetrix Axiom arrays. Our results suggest that focusing on SNPs called by more than one method could potentially improve validation outcomes. They also highlight possible differences between alternative genotyping technologies that could be explored in future studies of non-model organisms.
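The concordance bookkeeping, pooling calls from the four pipelines and counting how many SNPs are called by more than one, is simple set arithmetic; a minimal sketch with hypothetical call sets:

```python
# Pool SNPs from several discovery pipelines and count how many were
# found by more than one. The call sets below are hypothetical.
calls = {
    "GATK_BWA":     {"snp1", "snp2", "snp3", "snp5"},
    "GATK_BOWTIE2": {"snp1", "snp2", "snp6"},
    "NEWBLER":      {"snp2", "snp4"},
    "SWAP454":      {"snp2", "snp4", "snp7"},
}
all_snps = set().union(*calls.values())
n_methods = {s: sum(s in c for c in calls.values()) for s in all_snps}
multi = [s for s, n in n_methods.items() if n > 1]
print(len(all_snps), "total;", len(multi), "called by >1 method",
      f"({100 * len(multi) / len(all_snps):.1f}%)")
```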
Bridge Condition Assessment Using D Numbers
Hu, Yong
2014-01-01
Bridge condition assessment is a complex problem influenced by many factors. The uncertain environment further increases its complexity. Due to the uncertainty in the process of assessment, one of the key problems is the representation of assessment results. Though many methods exist that can deal with uncertain information, they all have deficiencies to a greater or lesser extent. In this paper, a new representation of uncertain information, called D numbers, is presented. It extends the Dempster-Shafer theory. By using D numbers, a new method is developed for bridge condition assessment. Compared with existing methods, the proposed method is simpler and more effective. An illustrative case is given to show the effectiveness of the new method. PMID:24696639
Sports Training Support Method by Self-Coaching with Humanoid Robot
NASA Astrophysics Data System (ADS)
Toyama, S.; Ikeda, F.; Yasaka, T.
2016-09-01
This paper proposes a new training support method called self-coaching with humanoid robots. In the proposed method, two small, inexpensive humanoid robots are used because of their availability. One robot, called the target robot, reproduces the motion of a target player, and another robot, called the reference robot, reproduces the motion of an expert player. The target player can recognize the target technique from the reference robot and his/her inadequate skill from the target robot. By modifying the motion of the target robot as self-coaching, the target player can gain advanced cognition. Experimental results show the potential of the new training method, as well as issues with the self-coaching interface program to be addressed in future work.
S. Loeb; E. Britzke
2010-01-01
Bats respond to the calls of conspecifics as well as to calls of other species; however, few studies have attempted to quantify these responses or understand the functions of these calls. We tested the response of Rafinesque's big-eared bats (Corynorhinus rafinesquii) to social calls as a possible method to increase capture success and to understand the function of...
Fast summation of divergent series and resurgent transseries from Meijer-G approximants
NASA Astrophysics Data System (ADS)
Mera, Héctor; Pedersen, Thomas G.; Nikolić, Branislav K.
2018-05-01
We develop a resummation approach based on Meijer-G functions and apply it to approximate the Borel sum of divergent series and the Borel-Écalle sum of resurgent transseries in quantum mechanics and quantum field theory (QFT). The proposed method is shown to vastly outperform the conventional Borel-Padé and Borel-Padé-Écalle summation methods. The resulting Meijer-G approximants are easily parametrized by means of a hypergeometric ansatz and can be thought of as a generalization to arbitrary order of the Borel-hypergeometric method [Mera et al., Phys. Rev. Lett. 115, 143001 (2015), 10.1103/PhysRevLett.115.143001]. Here we demonstrate the accuracy of this technique in various examples from quantum mechanics and QFT, traditionally employed as benchmark models for resummation, such as zero-dimensional ϕ4 theory; the quartic anharmonic oscillator; the calculation of critical exponents for the N -vector model; ϕ4 with degenerate minima; self-interacting QFT in zero dimensions; and the summation of one- and two-instanton contributions in the quantum-mechanical double-well problem.
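For contrast with the Meijer-G approach, the conventional Borel-Padé baseline mentioned above can be sketched in a few lines: divide out factorials to form the Borel transform, Padé-approximate it, and transform back with a Laplace integral. The series below, whose Borel transform is (1+4t)^(-1/2), is a toy example of ours, not one of the paper's benchmarks.

```python
import numpy as np
from scipy.interpolate import pade
from scipy.integrate import quad
from scipy.special import factorial

# Divergent series F(x) ~ sum_n (-1)^n (2n)!/n! x^n (zero radius of
# convergence). Its Borel transform B(t) = sum_n a_n t^n / n! equals
# (1 + 4t)^(-1/2), so the Borel sum is int_0^inf exp(-t) B(x t) dt.
N = 11
n = np.arange(N)
a = (-1.0)**n * factorial(2 * n) / factorial(n)
b = a / factorial(n)                      # Borel-transform coefficients
p, q = pade(b, 5)                         # [5/5] Pade approximant of B

def borel_pade_sum(x):
    # numerical Laplace transform of the Pade-approximated Borel function
    return quad(lambda t: np.exp(-t) * p(x * t) / q(x * t), 0, np.inf)[0]

x = 0.1
exact = quad(lambda t: np.exp(-t) / np.sqrt(1.0 + 4.0 * x * t), 0, np.inf)[0]
print(borel_pade_sum(x), exact)           # close agreement despite divergence
```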
"Master-Slave" Biological Network Alignment
NASA Astrophysics Data System (ADS)
Ferraro, Nicola; Palopoli, Luigi; Panni, Simona; Rombo, Simona E.
Performing global alignment between protein-protein interaction (PPI) networks of different organisms is important for inferring knowledge about conservation across species. Known methods that perform this task operate symmetrically, that is, they do not assign distinct roles to the input PPI networks. In most cases, however, the input networks are indeed distinguishable on the basis of how well the corresponding organism is biologically characterized. For well-characterized organisms, the associated PPI network supposedly encodes in a sound manner all the information about their proteins and associated interactions, which is far from being the case for poorly characterized ones. Here we develop a new idea: a method for the global alignment of PPI networks that exploits differences in the characterization of the organisms at hand. The PPI network of the better-characterized organism (called the Master) is used as a fingerprint to guide the alignment process to the second input network (called the Slave), so that the generated results preferably retain the structural characteristics of the Master network (while using the Slave). We tested our method, showing that the results it returns are biologically relevant.
Developing Student Social Skills Using Restorative Practices: A New Framework Called H.E.A.R.T
ERIC Educational Resources Information Center
Kehoe, Michelle; Bourke-Taylor, Helen; Broderick, David
2018-01-01
Students attending schools today not only learn about formal academic subjects, they also learn social and emotional skills. Whole-school restorative practices (RP) is an approach which can be used to address student misbehaviour when it occurs, and as a holistic method to increase social and emotional learning in students. The aim of this study…
The Evolution of the Snellen E to the Blackbird. (Blackbird Preschool Vision Screening Program).
ERIC Educational Resources Information Center
Sato-Viacrucis, Kiyo
Comparison of a variety of vision screening methods used with preschool children led to modification of the standard Snellen E test called the Blackbird Vision Screening System. An instructional story using an "E-bird" was developed to teach children the various possible positions of the E. The visual confusion caused by the chart was…
Automated MeSH indexing of the World-Wide Web.
Fowler, J.; Kouramajian, V.; Maram, S.; Devadhar, V.
1995-01-01
To facilitate networked discovery and information retrieval in the biomedical domain, we have designed a system for automatic assignment of Medical Subject Headings to documents retrieved from the World-Wide Web. Our prototype implementations show significant promise. We describe our methods and discuss the further development of a completely automated indexing tool called the "Web-MeSH Medibot." PMID:8563421
A Variant of LSD-SLAM Capable of Processing High-Speed Low-Framerate Monocular Datasets
NASA Astrophysics Data System (ADS)
Schmid, S.; Fritsch, D.
2017-11-01
We develop a new variant of LSD-SLAM, called C-LSD-SLAM, which is capable of performing monocular tracking and mapping in high-speed low-framerate situations such as those of the KITTI datasets. The methods used here are robust against the influence of erroneously triangulated points near the epipolar direction, which otherwise cause tracking divergence.
Determining the impact of felling method and season of year on coppice regeneration
Daniel de Souza; Tom Gallagher; Dana Mitchell; Matthew Smidt; Tim McDonald; Jeff Wright
2014-01-01
There is an increasing interest in the establishment of plantations in the Southeast region with the objective of producing biomass for energy and fuel. Establishment of these plantations will require the development of a feasible way to harvest them. These types of plantations are called Short Rotation Woody Crops (SRWC). Popular SRWC species are Eucalypt (...
Rapid Analysis and Manufacturing Propulsion Technology (RAMPT)
NASA Technical Reports Server (NTRS)
Fikes, John C.
2018-01-01
NASA's strategic plan calls for the development of enabling technologies, improved production methods, and advanced design and analysis tools related to the agency's objectives to expand human presence in the solar system. NASA seeks to advance exploration, science, innovation, benefits to humanity, and international collaboration, as well as facilitate and utilize U.S. commercial capabilities to deliver cargo and crew to space.
Butterfly valve in a virtual environment
NASA Astrophysics Data System (ADS)
Talekar, Aniruddha; Patil, Saurabh; Thakre, Prashant; Rajkumar, E.
2017-11-01
Assembly of components is one of the processes involved in product design and development. The present paper deals with the assembly of simple butterfly valve components in a virtual environment. The assembly has been carried out using virtual reality software by trial-and-error methods. The parts are modelled using parametric software (SolidWorks), meshed accordingly, and then called into the virtual environment for assembly.
1997 Technology Applications Report,
1997-01-01
handle high-power loads at microwave radio frequencies, microwave vacuum tubes remain the chosen technology to amplify high power. Aria Microwave...structure called the active RF cavity amplifier (ARFCA). With this design, the amplifier handles high-power loads at radio and microwave frequencies...developed this technology using BMDO-funded modeling methods designed to simulate the dynamics of large space-based structures. Because it increases
Multivariate Density Estimation and Remote Sensing
NASA Technical Reports Server (NTRS)
Scott, D. W.
1983-01-01
Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. This approach is called a thunderstorm data analysis.
ERIC Educational Resources Information Center
Yang, Eunice
2016-01-01
This paper discusses the use of a free mobile engineering application (app) called Autodesk® ForceEffect™ to provide students assistance with spatial visualization of forces and more practice in solving/visualizing statics problems compared to the traditional pencil-and-paper method. ForceEffect analyzes static rigid-body systems using free-body…
The Scientific Theory Profile: A Philosophy of Science Model for Science Teachers.
ERIC Educational Resources Information Center
Loving, Cathleen
The model developed for use with science teachers--called the Scientific Theory Profile--consists of placing three well-known philosophers of science on a grid, with the x-axis being their methods for judging theories (rational vs. natural) and the y-axis being their views on scientific theories representing the Truth versus mere models of what…
Free-Swinging Failure Tolerance for Robotic Manipulators
NASA Technical Reports Server (NTRS)
English, James
1997-01-01
Under this GSRP fellowship, software-based failure-tolerance techniques were developed for robotic manipulators. The focus was on failures characterized by the loss of actuator torque at a joint, called free-swinging failures. The research results spanned many aspects of the free-swinging failure-tolerance problem, from preparing for an expected failure to discovery of postfailure capabilities to establishing efficient methods to realize those capabilities. Developed algorithms were verified using computer-based dynamic simulations, and these were further verified using hardware experiments at Johnson Space Center.
CELL-SELEX: Novel Perspectives of Aptamer-Based Therapeutics
Guo, Ke-Tai; Paul, Angela; Schichor, Christian; Ziemer, Gerhard; Wendel, Hans P.
2008-01-01
Aptamers, single-stranded DNA or RNA molecules generated by a method called SELEX (systematic evolution of ligands by exponential enrichment), have been widely used in various biomedical applications. The newly developed Cell-SELEX (cell-based SELEX), targeting whole living cells, has raised great expectations for cancer biology, cancer therapy, and regenerative medicine. Combining nanobiotechnology with aptamers, this technology opens the way to more sophisticated applications in molecular diagnosis. This paper gives a review of recent developments in SELEX technologies and new applications of aptamers. PMID:19325777
Roothaan approach in the thermodynamic limit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutierrez, G.; Plastino, A.
1982-02-01
A systematic method for the solution of the Hartree-Fock equations in the thermodynamic limit is presented. The approach is seen to be a natural extension of the one usually employed in the finite-fermion case, i.e., that developed by Roothaan. The new techniques developed here are applied, as an example, to neutron matter, employing the so-called V_1 Bethe homework potential. The results obtained are, by far, superior to those that the ordinary plane-wave Hartree-Fock theory yields.
Formal verification of mathematical software
NASA Technical Reports Server (NTRS)
Sutherland, D.
1984-01-01
Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
Characterization of Cloud Water-Content Distribution
NASA Technical Reports Server (NTRS)
Lee, Seungwon
2010-01-01
The development of realistic cloud parameterizations for climate models requires accurate characterizations of subgrid distributions of thermodynamic variables. To this end, a software tool was developed to characterize cloud water-content distributions in climate-model sub-grid scales. This software characterizes distributions of cloud water content with respect to cloud phase, cloud type, precipitation occurrence, and geo-location using CloudSat radar measurements. It uses a statistical method called maximum likelihood estimation to estimate the probability density function of the cloud water content.
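A minimal sketch of the MLE step, assuming a gamma family and a synthetic sample; the tool itself fits sub-grid water-content distributions per cloud phase, type, and precipitation class from CloudSat retrievals.

```python
import numpy as np
from scipy import stats

# Hypothetical in-cloud water-content sample (g/m^3); the gamma family
# is our assumed parametric form for illustration.
rng = np.random.default_rng(0)
samples = rng.gamma(shape=2.0, scale=0.15, size=5000)

# scipy's fit() maximizes the likelihood numerically; fixing loc=0 keeps
# the support on positive water contents.
shape, loc, scale = stats.gamma.fit(samples, floc=0)
print(f"MLE gamma fit: shape={shape:.3f}, scale={scale:.3f}")

# The fitted PDF can then be evaluated on a grid to characterize the
# sub-grid distribution.
x = np.linspace(0.0, 2.0, 5)
print(stats.gamma.pdf(x, shape, loc=loc, scale=scale))
```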
ERIC Educational Resources Information Center
Dashtestani, Reza
2014-01-01
Even though there are a plethora of CALL materials available to EFL teachers nowadays, very limited attention has been directed toward the issue that most EFL teachers are merely the consumers of CALL materials. The main challenge is to equip EFL teachers with the required CALL materials development skills to enable them to be contributors to CALL…
Identifying Populace Susceptible to Flooding Using ArcGIS and Remote Sensing Datasets
NASA Astrophysics Data System (ADS)
Fernandez, Sim Joseph; Milano, Alan
2016-07-01
Remote sensing technologies are growing vastly, as are their various applications. The Department of Science and Technology (DOST), Republic of the Philippines, has undertaken projects exploiting LiDAR datasets from remote sensing technologies. The Phil-LiDAR 1 project of DOST is a flood hazard mapping project. Among the project's objectives is the identification of building features that can be associated with the flood-exposed population. The extraction of building features from the LiDAR dataset is arduous, as it requires manual identification of building features on an elevation map. The mapping of building footprints is made meticulous in order to balance the accuracy of building floor area against that of building height, both of which are crucial in flood decision making. A building identification method was developed to generate a LiDAR derivative that serves as a guide in mapping building footprints. The method utilizes several tools of a Geographic Information System (GIS) software package called ArcGIS, which operate on physical attributes of buildings such as roof curvature, slope, and blueprint area, in order to obtain the LiDAR derivative from the LiDAR dataset. The method also uses an intermediary process called the building removal process, wherein buildings and other features lying below the defined minimum building height - 2 meters in the case of the Phil-LiDAR 1 project - are removed. The building identification method was developed in the hope of hastening the identification of building features, especially when orthophotographs and/or satellite imagery are not available.
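The building removal process reduces to a threshold on a normalized surface model; a minimal sketch, assuming a small hypothetical nDSM raster (the real workflow runs ArcGIS tools over full LiDAR tiles):

```python
import numpy as np

def building_removal(ndsm, min_height=2.0):
    """Mask out cells of a normalized DSM (height above ground, metres)
    lying below the minimum building height (2 m in Phil-LiDAR 1),
    leaving a derivative layer that guides footprint digitization."""
    return np.where(ndsm >= min_height, ndsm, np.nan)

# hypothetical 1 m-resolution nDSM patch
ndsm = np.array([[0.1, 0.2, 0.1, 0.0],
                 [0.1, 4.8, 5.0, 0.2],
                 [0.0, 4.9, 5.1, 0.1],
                 [0.1, 0.2, 0.1, 0.0]])
print(building_removal(ndsm))   # only the ~5 m roof cluster survives
```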
Automatic face naming by learning discriminative affinity matrices from weakly labeled images.
Xiao, Shijie; Xu, Dong; Wu, Jianxin
2015-10-01
Given a collection of images, where each image contains several faces and is associated with a few names in the corresponding caption, the goal of face naming is to infer the correct name for each face. In this paper, we propose two new methods to effectively solve this problem by learning two discriminative affinity matrices from these weakly labeled images. We first propose a new method called regularized low-rank representation by effectively utilizing weakly supervised information to learn a low-rank reconstruction coefficient matrix while exploring multiple subspace structures of the data. Specifically, by introducing a specially designed regularizer to the low-rank representation method, we penalize the corresponding reconstruction coefficients related to the situations where a face is reconstructed by using face images from other subjects or by using itself. With the inferred reconstruction coefficient matrix, a discriminative affinity matrix can be obtained. Moreover, we also develop a new distance metric learning method called ambiguously supervised structural metric learning by using weakly supervised information to seek a discriminative distance metric. Hence, another discriminative affinity matrix can be obtained using the similarity matrix (i.e., the kernel matrix) based on the Mahalanobis distances of the data. Observing that these two affinity matrices contain complementary information, we further combine them to obtain a fused affinity matrix, based on which we develop a new iterative scheme to infer the name of each face. Comprehensive experiments demonstrate the effectiveness of our approach.
Bilgrami, Irma; Bain, Christopher; Webb, Geoffrey I.; Orosz, Judit; Pilcher, David
2017-01-01
Introduction Hospitals have seen a rise in Medical Emergency Team (MET) reviews. We hypothesised that the commonest MET calls result in similar treatments. Our aim was to design a pre-emptive management algorithm that allowed direct institution of treatment to patients without having to wait for attendance of the MET team and to model its potential impact on MET call incidence and patient outcomes. Methods Data were extracted for all MET calls from the hospital database. Association rule data mining techniques were used to identify the most common combinations of MET call causes, outcomes and therapies. Results There were 13,656 MET calls during the 34-month study period in 7936 patients. The most common MET call was for hypotension [31%, (2459/7936)]. These MET calls were strongly associated with the immediate administration of intravenous fluid (70% [1714/2459] v 13% [739/5477] p<0.001), unless the patient was located on a respiratory ward (adjusted OR 0.41 [95%CI 0.25–0.67] p<0.001), had a cardiac cause for admission (adjusted OR 0.61 [95%CI 0.50–0.75] p<0.001) or was under the care of the heart failure team (adjusted OR 0.29 [95%CI 0.19–0.42] p<0.001). Modelling the effect of a pre-emptive management algorithm for immediate fluid administration without MET activation on data from a test period of 24 months following the study period suggested that it would lead to a 68.7% (2541/3697) reduction in MET calls for hypotension and a 19.6% (2541/12938) reduction in total METs without adverse effects on patients. Conclusion Routinely collected data and analytic techniques can be used to develop a pre-emptive management algorithm to administer intravenous fluid therapy to a specific group of hypotensive patients without the need to initiate a MET call. This could lead both to earlier treatment for the patient and to fewer total MET calls. PMID:29281665
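The association-rule step can be sketched with plain support/confidence counting. The records, thresholds, and single-antecedent restriction below are illustrative simplifications, not the study's actual mining pipeline:

```python
from itertools import combinations

def one_level_rules(records, min_support=0.05, min_confidence=0.6):
    """Mine single-antecedent rules X -> Y from transactions, where each
    record is a set of items, e.g. {'cause=hypotension', 'rx=iv_fluid'}."""
    n = len(records)
    items = {i for r in records for i in r}
    rules = []
    for a, b in combinations(sorted(items), 2):
        both = sum(1 for r in records if a in r and b in r) / n
        for x, y in ((a, b), (b, a)):
            supp_x = sum(1 for r in records if x in r) / n
            conf = both / supp_x if supp_x else 0.0
            if both >= min_support and conf >= min_confidence:
                rules.append((x, y, both, conf))
    return rules

# hypothetical MET-call records
calls = [{'cause=hypotension', 'rx=iv_fluid'}] * 7 + \
        [{'cause=hypotension', 'rx=none'}] * 3 + \
        [{'cause=arrhythmia', 'rx=none'}] * 10
for x, y, s, c in one_level_rules(calls):
    print(f"{x} -> {y}  support={s:.2f} confidence={c:.2f}")
```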
Measuring the effectiveness of patient-chosen reminder methods in a private orthodontic practice.
Wegrzyniak, Lauren M; Hedderly, Deborah; Chaudry, Kishore; Bollu, Prashanti
2018-05-01
To evaluate the effectiveness of patient-chosen appointment reminder methods (phone call, e-mail, or SMS text) in reducing no-show rates. This was a retrospective case study that determined the correlation between patient-chosen appointment reminder methods and no-show rates in a private orthodontic practice. This study was conducted in a single office location of a multioffice private orthodontic practice using data gathered in 2015. The subjects were patients who self-selected the appointment reminder method (phone call, e-mail, or SMS text). Patient appointment data were collected over a 6-month period. Patient attendance was analyzed with descriptive statistics to determine any significant differences among patient-chosen reminder methods. There was a total of 1193 appointments with an average no-show rate of 2.43% across the three reminder methods. No statistically significant differences (P = .569) were observed in the no-show rates among the three methods: phone call (3.49%), e-mail (2.68%), and SMS text (1.90%). The electronic appointment reminder methods (SMS text and e-mail) had lower no-show rates compared with the phone call method, with SMS text having the lowest no-show rate of 1.90%. However, since no significant differences were observed among the three patient-chosen reminder methods, providers may want to allow patients to choose their reminder method to decrease no-shows.
Characteristics of fin whale vocalizations recorded on instruments in the northeast Pacific Ocean
NASA Astrophysics Data System (ADS)
Weirathmueller, Maria Michelle Josephine
This thesis focuses on fin whale vocalizations recorded on ocean bottom seismometers (OBSs) in the Northeast Pacific Ocean, using data collected between 2003 and 2013. OBSs are a valuable and largely untapped resource for the passive acoustic monitoring of large baleen whales. This dissertation is divided into three parts, each of which uses the recordings of fin whale vocalizations to better understand their calling behaviors and distributions. The first study describes the development of a technique to extract source levels of fin whale vocalizations from OBS recordings. Source levels were estimated using data collected on a network of eight OBSs in the Northeast Pacific Ocean. The acoustic pressure levels measured at the instruments were adjusted for the propagation path between the calling whales and the instruments using the call location and estimating losses along the acoustic travel path. A total of 1241 calls were used to estimate an average source level of 189 ± 5.8 dB re 1 μPa at 1 m. This variability is largely attributed to uncertainties in the horizontal and vertical position of the fin whale at the time of each call, and the effect of these uncertainties on subsequent calculations. The second study describes a semi-automated method for obtaining horizontal ranges to vocalizing fin whales using the timing and relative amplitude of multipath arrivals. A matched filter is used to detect fin whale calls and pick the relative times and amplitudes of multipath arrivals. Ray-based propagation models are used to predict multipath times and amplitudes as a function of range. Because the direct and first multiple arrivals are not always observed, three hypotheses for the paths of the observed arrivals are considered; the solution is the hypothesis and range that optimizes the fit to the data. Ray-theoretical amplitudes are not accurate, and solutions are improved by determining amplitudes from the observations using a bootstrap method. Data from ocean bottom seismometers at two locations are used to assess the method: one on the Juan de Fuca Ridge, a bathymetrically complex mid-ocean ridge environment, and the other at a flat sedimented location in the Cascadia Basin. At both sites, the method is reliable up to 4 km range, which is sufficient to enable estimates of call density. The third study explores spatial and temporal trends in fin whale calling patterns. The frequency and inter-pulse interval of fin whale 20 Hz vocalizations were observed over 10 years, from 2003 to 2013, on bottom-mounted hydrophones and OBSs in the northeast Pacific Ocean. The instrument locations extended from 40°N and 130°W to 125°W, with water depths ranging from 1500 to 4000 m. The inter-pulse interval (IPI) of fin whale song sequences was observed to increase at a rate of 0.59 seconds/year over the decade of observation. During the same time period, peak frequency decreased at a rate of 0.16 Hz/year. Two primary call patterns were observed. During the earlier years, the more commonly observed pattern had a single frequency and a single IPI. In later years, a doublet pattern emerged, with two dominant frequencies and two IPIs. Many call sequences in the intervening years appeared to represent a transitional state between the two patterns. The overall trend was consistent across the entire geographical span, although some regional differences exist.
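The first study's source-level correction has the general form SL = RL + TL. A minimal sketch with a textbook spherical-spreading loss (the thesis used modeled propagation paths and localized ranges; the numbers here are hypothetical):

```python
import numpy as np

def source_level(received_db, range_m, alpha_db_per_km=0.0):
    """Back-propagate a received level to a source level using simple
    spherical spreading plus optional absorption:
    SL = RL + 20*log10(r) + alpha*r. A textbook stand-in for the
    modeled propagation-path losses described above."""
    return received_db + 20.0 * np.log10(range_m) + alpha_db_per_km * range_m / 1e3

# hypothetical 20 Hz call received at 118 dB re 1 uPa from 3.5 km away
print(f"{source_level(118.0, 3500.0):.1f} dB re 1 uPa @ 1 m")  # ~188.9
```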
Improved Hybrid Modeling of Spent Fuel Storage Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bibber, Karl van
This work developed a new computational method for improving the ability to calculate the neutron flux in deep-penetration radiation shielding problems that contain areas with strong streaming. The “gold standard” method for radiation transport is Monte Carlo (MC), as it samples the physics exactly and requires few approximations. Historically, however, MC was not useful for shielding problems because of the computational challenge of following particles through dense shields. Instead, deterministic methods, which are superior in terms of computational effort for these problem types but are not as accurate, were used. Hybrid methods, which use deterministic solutions to improve MC calculations through a process called variance reduction, can make it tractable from a computational time and resource use perspective to use MC for deep-penetration shielding. Perhaps the most widespread and accessible of these methods are the Consistent Adjoint Driven Importance Sampling (CADIS) and Forward-Weighted CADIS (FW-CADIS) methods. For problems containing strong anisotropies, such as power plants with pipes through walls, spent fuel cask arrays, active interrogation, and locations with small air gaps or plates embedded in water or concrete, hybrid methods are still insufficiently accurate. In this work, a new method for generating variance reduction parameters for strongly anisotropic, deep-penetration radiation shielding studies was developed. This method generates an alternate form of the adjoint scalar flux quantity, Φ_Ω, which is used by both CADIS and FW-CADIS to generate variance reduction parameters for local and global response functions, respectively. The new method, called CADIS-Ω, was implemented in the Denovo/ADVANTG software. Results indicate that the flux generated by CADIS-Ω incorporates localized angular anisotropies in the flux more effectively than standard methods. CADIS-Ω outperformed CADIS in several test problems. This initial work indicates that CADIS-Ω may be highly useful for shielding problems with strong angular anisotropies. This benefits the public by increasing accuracy at lower computational effort for many problems of energy, security, and economic importance.
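The CADIS recipe itself is compact: bias the source by the adjoint (importance) flux and set weight-window targets inversely proportional to it, so the product of particle weight and importance stays roughly constant. A one-dimensional sketch with hypothetical cell values (not the Denovo/ADVANTG implementation):

```python
import numpy as np

def cadis_parameters(source, adjoint_flux):
    """Compute a biased source PDF and weight-window centers from an
    adjoint ("importance") flux on a 1-D mesh. Particles are then born
    and transported at weight w = R / adjoint, forcing splitting as
    they drift toward the tally region."""
    R = float(np.sum(source * adjoint_flux))     # estimated detector response
    biased_source = source * adjoint_flux / R    # sampling PDF (sums to 1)
    ww_centers = R / adjoint_flux                # target weights per cell
    return biased_source, ww_centers

source = np.array([1.0, 0.0, 0.0, 0.0])          # source in cell 0
adjoint = np.array([1e-6, 1e-4, 1e-2, 1.0])      # importance rises toward tally
q, w = cadis_parameters(source, adjoint)
print(q, w)   # weights shrink by ~100x per cell approaching the tally
```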
Evaluation of 3-D graphics software: A case study
NASA Technical Reports Server (NTRS)
Lores, M. E.; Chasen, S. H.; Garner, J. M.
1984-01-01
An efficient 3-D geometry graphics software package which is suitable for advanced design studies was developed. The advanced design system is called GRADE--Graphics for Advanced Design. Efficiency and ease of use are gained by sacrificing flexibility in surface representation. The immediate options were either to continue development of GRADE or to acquire a commercially available system which would replace or complement GRADE. Test cases which would reveal the ability of each system to satisfy the requirements were developed. A scoring method which adequately captured the relative capabilities of the three systems was presented. While more complex multi-attribute decision methods could be used, the selected method provides all the needed information without being so complex that it is difficult to understand. If the value factors are modestly perturbed, system Z is a clear winner based on its overall capabilities. System Z is superior in two vital areas: surfacing and ease of interface with application programs.
The Probability of Hitting a Polygonal Target
1981-04-01
required for the use of this method for computing the probability of hitting a polygonal target. These functions are: 1. PHIT (called by the user's main program), 2. FIJ (called by PHIT), and 3. FUN (called by FIJ). The user must include all three of these in his main program, but needs only to call PHIT.
Moradi, Saleh; Nima, Ali A.; Rapp Ricciardi, Max; Archer, Trevor; Garcia, Danilo
2014-01-01
Background: Performance monitoring might have an adverse influence on call center agents' well-being. We investigate how performance, over a 6-month period, is related to agents' perceptions of their learning climate, character strengths, well-being (subjective and psychological), and physical activity. Method: Agents (N = 135) self-reported perception of the learning climate (Learning Climate Questionnaire), character strengths (Values In Action Inventory Short Version), well-being (Positive Affect, Negative Affect Schedule, Satisfaction With Life Scale, Psychological Well-Being Scales Short Version), and how often/intensively they engaged in physical activity. Performance, “time on the phone,” was monitored for 6 consecutive months by the same system handling the calls. Results: Performance was positively related to having opportunities to develop, the character strengths clusters of Wisdom and Knowledge (e.g., curiosity for learning, perspective) and Temperance (e.g., having self-control, being prudent, humble, and modest), and exercise frequency. Performance was negatively related to the sense of autonomy and responsibility, contentedness, the character strengths clusters of Humanity and Love (e.g., helping others, cooperation) and Justice (e.g., affiliation, fairness, leadership), positive affect, life satisfaction, and exercise intensity. Conclusion: Call centers may need to create opportunities to develop in order to increase agents' performance, and to focus on individual differences in the recruitment and selection of agents to prevent future shortcomings or worker dissatisfaction. Nevertheless, performance measurement in call centers may need to include other aspects that are more attuned to different character strengths. After all, allowing individuals to put their strengths to work should empower the individual and, in the end, the organization itself. Finally, physical activity enhancement programs might offer considerable positive work outcomes. PMID:25002853
An historical survey of computational methods in optimal control.
NASA Technical Reports Server (NTRS)
Polak, E.
1973-01-01
Review of some of the salient theoretical developments in the specific area of optimal control algorithms. The first algorithms for optimal control were aimed at unconstrained problems and were derived by using first- and second-variation methods of the calculus of variations. These methods have subsequently been recognized as gradient, Newton-Raphson, or Gauss-Newton methods in function space. A much more recent addition to the arsenal of unconstrained optimal control algorithms are several variations of conjugate-gradient methods. At first, constrained optimal control problems could only be solved by exterior penalty function methods. Later algorithms specifically designed for constrained problems have appeared. Among these are methods for solving the unconstrained linear quadratic regulator problem, as well as certain constrained minimum-time and minimum-energy problems. Differential-dynamic programming was developed from dynamic programming considerations. The conditional-gradient method, the gradient-projection method, and a couple of feasible directions methods were obtained as extensions or adaptations of related algorithms for finite-dimensional problems. Finally, the so-called epsilon-methods combine the Ritz method with penalty function techniques.
Ai, Yuncan; Ai, Hannan; Meng, Fanmei; Zhao, Lei
2013-01-01
Little attention has been paid to comparing sets of genome sequences across genetic components and biological categories with wide divergence over a large size range. We define this as systematic comparative genomics and aim to develop its methodology. First, we create a method, GenomeFingerprinter, to unambiguously produce a set of three-dimensional coordinates from a sequence, followed by one three-dimensional plot and six two-dimensional trajectory projections, to illustrate the genome fingerprint of a given genome sequence. Second, we develop a set of concepts and tools, and thereby establish a method called the universal genome fingerprint analysis (UGFA). In particular, we define the total genetic component configuration (TGCC) (including chromosome, plasmid, and phage) for describing a strain as a systematic unit, the universal genome fingerprint map (UGFM) of TGCC for differentiating strains as a universal system, and the systematic comparative genomics (SCG) for comparing a set of genomes across genetic components and biological categories. Third, we construct a method of quantitative analysis to compare two genomes by using the outcome dataset of genome fingerprint analysis. Specifically, we define the geometric center and its geometric mean for a given genome fingerprint map, followed by the Euclidean distance, the differentiation rate, and the weighted differentiation rate to quantitatively describe the difference between two genomes under comparison. Moreover, we demonstrate the applications through case studies on various genome sequences, giving tremendous insights into the critical issues in microbial genomics and taxonomy. We have created a method, GenomeFingerprinter, for rapidly computing, geometrically visualizing, and intuitively comparing a set of genomes at the genome fingerprint level, and hence established a method called the universal genome fingerprint analysis, as well as developed a method of quantitative analysis of the outcome dataset. These have set up the methodology of systematic comparative genomics based on genome fingerprint analysis.
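A minimal sketch of the fingerprint idea: map each base to a fixed 3-D step, accumulate the walk, and compare genomes through the Euclidean distance of their geometric centers. The base-to-vector mapping is our assumption; the abstract does not specify GenomeFingerprinter's actual coordinate rule.

```python
import numpy as np

# Assumed base-to-vertex mapping (tetrahedron vertices), an illustrative
# stand-in for the method's unspecified coordinate rule.
STEP = {'A': np.array([1.0, 1.0, 1.0]),
        'C': np.array([1.0, -1.0, -1.0]),
        'G': np.array([-1.0, 1.0, -1.0]),
        'T': np.array([-1.0, -1.0, 1.0])}

def fingerprint(seq):
    """Cumulative 3-D walk over a sequence: one (x, y, z) point per base."""
    steps = np.array([STEP[b] for b in seq if b in STEP])
    return np.cumsum(steps, axis=0)

def euclidean_center_distance(seq1, seq2):
    """Distance between the geometric centers of two fingerprints, one of
    the pairwise measures described above."""
    c1 = fingerprint(seq1).mean(axis=0)
    c2 = fingerprint(seq2).mean(axis=0)
    return float(np.linalg.norm(c1 - c2))

print(euclidean_center_distance("ACGTACGTGG", "ACGTACGTAA"))
```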
Probability of detecting band-tailed pigeons during call-broadcast versus auditory surveys
Kirkpatrick, C.; Conway, C.J.; Hughes, K.M.; Devos, J.C.
2007-01-01
Estimates of population trend for the interior subspecies of band-tailed pigeon (Patagioenas fasciata fasciata) are not available because no standardized survey method exists for monitoring the interior subspecies. We evaluated 2 potential band-tailed pigeon survey methods (auditory and call-broadcast surveys) from 2002 to 2004 in 5 mountain ranges in southern Arizona, USA, and in mixed-conifer forest throughout the state. Both auditory and call-broadcast surveys produced low numbers of cooing pigeons detected per survey route (x̄ ≤ 0.67) and had relatively high temporal variance in average number of cooing pigeons detected during replicate surveys (CV ≥ 161%). However, compared to auditory surveys, use of call-broadcast increased 1) the percentage of replicate surveys on which ≥1 cooing pigeon was detected by an average of 16%, and 2) the number of cooing pigeons detected per survey route by an average of 29%, with this difference being greatest during the first 45 minutes of the morning survey period. Moreover, probability of detecting a cooing pigeon was 27% greater during call-broadcast (0.80) versus auditory (0.63) surveys. We found that cooing pigeons were most common in mixed-conifer forest in southern Arizona and density of male pigeons in mixed-conifer forest throughout the state averaged 0.004 (SE = 0.001) pigeons/ha. Our results are the first to show that call-broadcast increases the probability of detecting band-tailed pigeons (or any species of Columbidae) during surveys. Call-broadcast surveys may provide a useful method for monitoring populations of the interior subspecies of band-tailed pigeon in areas where other survey methods are inappropriate.
Gultekin, Yasemin B; Hage, Steffen R
2018-04-01
Human vocal development is dependent on learning by imitation through social feedback between infants and caregivers. Recent studies have revealed that vocal development is also influenced by parental feedback in marmoset monkeys, suggesting vocal learning mechanisms in nonhuman primates. Marmoset infants that experience more contingent vocal feedback than their littermates develop vocalizations more rapidly, and infant marmosets with limited parental interaction exhibit immature vocal behavior beyond infancy. However, it is yet unclear whether direct parental interaction is an obligate requirement for proper vocal development because all monkeys in the aforementioned studies were able to produce the adult call repertoire after infancy. Using quantitative measures to compare distinct call parameters and vocal sequence structure, we show that social interaction has a direct impact not only on the maturation of the vocal behavior but also on acoustic call structures during vocal development. Monkeys with limited parental interaction during development show systematic differences in call entropy, a measure for maturity, compared with their normally raised siblings. In addition, different call types were occasionally uttered in motif-like sequences similar to those exhibited by vocal learners, such as birds and humans, in early vocal development. These results indicate that a lack of parental interaction leads to long-term disturbances in the acoustic structure of marmoset vocalizations, suggesting an imperative role for social interaction in proper primate vocal development. PMID:29651461
A new Method for Determining the Interplanetary Current-Sheet Local Orientation
NASA Astrophysics Data System (ADS)
Blanco, J. J.; Rodríguez-pacheco, J.; Sequeiros, J.
2003-03-01
In this work we have developed a new method for determining the interplanetary current sheet local parameters. The method, called `HYTARO' (from Hyperbolic Tangent Rotation), is based on a modified Harris magnetic field. This method has been applied to a pool of 57 events, all of them recorded during solar minimum conditions. The model performance has been tested by comparing both its outputs and its noise response with those of the `classic MVM' (from Minimum Variance Method). The results suggest that, although in many cases the two behave in a similar way, there are specific crossing conditions that produce an erroneous MVM response. Moreover, our method shows a lower sensitivity to noise than MVM.
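A minimal sketch of the underlying fitting idea, using the classic (unmodified) Harris profile B(z) = B0 tanh((z - z0)/L); HYTARO's modified field and rotation analysis are given in the paper, and all numbers below are invented.

import numpy as np
from scipy.optimize import curve_fit

# Classic Harris current-sheet profile for one magnetic field component.
def harris(z, B0, z0, L):
    return B0 * np.tanh((z - z0) / L)

z = np.linspace(-5.0, 5.0, 200)
b_obs = harris(z, 8.0, 0.3, 1.2) + 0.4 * np.random.randn(z.size)  # noisy crossing

(B0, z0, L), _ = curve_fit(harris, z, b_obs, p0=(5.0, 0.0, 1.0))
print(f"fitted B0={B0:.2f}, center z0={z0:.2f}, half-thickness L={L:.2f}")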
Validation of an Adaptive Combustion Instability Control Method for Gas-Turbine Engines
NASA Technical Reports Server (NTRS)
Kopasakis, George; DeLaat, John C.; Chang, Clarence T.
2004-01-01
This paper describes ongoing testing of an adaptive control method to suppress high frequency thermo-acoustic instabilities like those found in lean-burning, low emission combustors that are being developed for future aircraft gas turbine engines. The method, called Adaptive Sliding Phasor Averaged Control, was previously tested in an experimental rig designed to simulate a combustor with an instability of about 530 Hz. Results published earlier, and briefly presented here, demonstrated that this method was effective in suppressing the instability. Because this test rig did not exhibit a well-pronounced instability, a question remained regarding the effectiveness of the control methodology when applied to a more coherent instability. To answer this question, a modified combustor rig was assembled at the NASA Glenn Research Center in Cleveland, Ohio. The modified rig exhibited a more coherent, higher amplitude instability, but at a lower frequency of about 315 Hz. Test results show that this control method successfully reduced the instability pressure in the lower frequency test rig. In addition, owing to a phenomenon discovered and reported earlier, the so-called intra-harmonic coupling, a dramatic suppression of the instability was achieved by focusing control on the second harmonic of the instability. These results and their implications are discussed, as well as a hypothesis describing the mechanism of intra-harmonic coupling.
GStream: Improving SNP and CNV Coverage on Genome-Wide Association Studies
Alonso, Arnald; Marsal, Sara; Tortosa, Raül; Canela-Xandri, Oriol; Julià, Antonio
2013-01-01
We present GStream, a method that combines genome-wide SNP and CNV genotyping in the Illumina microarray platform with unprecedented accuracy. This new method outperforms previous well-established SNP genotyping software. More importantly, the CNV calling algorithm of GStream dramatically improves the results obtained by previous state-of-the-art methods and yields an accuracy that is close to that obtained by purely CNV-oriented technologies like Comparative Genomic Hybridization (CGH). We demonstrate the superior performance of GStream using microarray data generated from HapMap samples. Using the reference CNV calls generated by the 1000 Genomes Project (1KGP) and well-known studies on whole genome CNV characterization based either on CGH or genotyping microarray technologies, we show that GStream can increase the number of reliably detected variants up to 25% compared to previously developed methods. Furthermore, the increased genome coverage provided by GStream allows the discovery of CNVs in close linkage disequilibrium with SNPs, previously associated with disease risk in published Genome-Wide Association Studies (GWAS). These results could provide important insights into the biological mechanism underlying the detected disease risk association. With GStream, large-scale GWAS will not only benefit from the combined genotyping of SNPs and CNVs at an unprecedented accuracy, but will also take advantage of the computational efficiency of the method. PMID:23844243
A Hybrid Stochastic Approach for Self-Location of Wireless Sensors in Indoor Environments
Lloret, Jaime; Tomas, Jesus; Garcia, Miguel; Canovas, Alejandro
2009-01-01
Indoor location systems, especially those using wireless sensor networks, are used in many application areas. While the need for these systems is widely proven, there is a clear lack of accuracy. Many of the implemented applications have high errors in their location estimation because of the issues arising in the indoor environment. Two different approaches have been proposed for WLAN location systems: on the one hand, the so-called deductive methods take into account the physical properties of signal propagation. These systems require a propagation model, an environment map, and the positions of the radio stations. On the other hand, the so-called inductive methods require a previous training phase where the system learns the received signal strength (RSS) in each location. This phase can be very time consuming. This paper proposes a new stochastic approach which is based on a combination of deductive and inductive methods whereby wireless sensors can determine their positions using WLAN technology inside a floor of a building. Our goal is to reduce the training phase in an indoor environment, but without a loss of precision. Finally, we compare the measurements taken using our proposed method in a real environment with the measurements taken by other developed systems. Comparisons between the proposed system and other hybrid methods are also provided. PMID:22412334
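A minimal sketch of the deductive half of such a hybrid scheme, assuming a log-distance path-loss model; the access-point positions, model parameters, and measurements are all invented, and the inductive half (RSS training) is omitted.

import numpy as np
from scipy.optimize import least_squares

APS = np.array([[0.0, 0.0], [20.0, 0.0], [10.0, 15.0]])  # assumed AP positions (m)
P0, N_EXP = -40.0, 2.5        # RSS at 1 m (dBm) and path-loss exponent (assumed)

def rss_model(pos):
    # Predicted RSS at each AP for a sensor at `pos` (log-distance model).
    d = np.linalg.norm(APS - pos, axis=1)
    return P0 - 10.0 * N_EXP * np.log10(np.maximum(d, 0.1))

def locate(rss_measured, guess=(5.0, 5.0)):
    # Position whose predicted RSS best matches the measurements.
    return least_squares(lambda p: rss_model(p) - rss_measured, guess).x

true_pos = np.array([12.0, 6.0])
measured = rss_model(true_pos) + np.random.randn(3)       # noisy readings
print("estimated position:", locate(measured))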
Optimization methods and silicon solar cell numerical models
NASA Technical Reports Server (NTRS)
Girardini, K.
1986-01-01
The goal of this project is the development of an optimization algorithm for use with a solar cell model. It is possible to simultaneously vary design variables such as impurity concentrations, front junction depth, back junction depth, and cell thickness to maximize the predicted cell efficiency. An optimization algorithm has been developed and interfaced with the Solar Cell Analysis Program in 1 Dimension (SCAPID). SCAPID uses finite difference methods to solve the differential equations which, along with several relations from the physics of semiconductors, describe mathematically the operation of a solar cell. A major obstacle is that the numerical methods used in SCAPID require a significant amount of computer time, and during an optimization the model is called iteratively until the design variables converge to the values associated with the maximum efficiency. This problem has been alleviated by designing an optimization code specifically for use with numerically intensive simulations, to reduce the number of times the efficiency has to be calculated to achieve convergence to the optimal solution. Adapting SCAPID so that it could be called iteratively by the optimization code provided another means of reducing the cpu time required to complete an optimization. Instead of calculating the entire I-V curve, as is usually done in SCAPID, only the efficiency is calculated (maximum power voltage and current) and the solution from previous calculations is used to initiate the next solution.
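The two cost-saving measures described above can be sketched generically; SCAPID itself is not reproduced, and the objective below is an invented stand-in surrogate.

from scipy.optimize import minimize

# Warm-start pattern: keep the last converged solver state so the next
# (expensive) simulation starts from it instead of from scratch.
_warm_start = {"solution": None}

def cell_efficiency(design):
    # A real implementation would seed the finite-difference solver with
    # _warm_start["solution"], compute only the max-power-point efficiency
    # (not the full I-V curve), and store the converged solution back.
    x, y = design                          # e.g. junction depth, doping level
    _warm_start["solution"] = design
    return (x - 1.0) ** 2 + (y - 2.0) ** 2  # negated-efficiency surrogate

res = minimize(cell_efficiency, x0=[0.5, 0.5], method="Nelder-Mead")
print("optimal design variables:", res.x)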
Surface infrastructure functions, requirements and subsystems for a manned Mars mission
NASA Technical Reports Server (NTRS)
Fairchild, Kyle
1986-01-01
Planning and development for a permanently manned scientific outpost on Mars requires an in-depth understanding and analysis of the functions the outpost is expected to perform. The optimum configuration that accomplishes these functions then arises during the trade studies process. In a project this complex, it becomes necessary to use a formal methodology to document the design and planning process. The method chosen for this study is called top-down functional decomposition. This method is used to determine the functions that are needed to accomplish the overall mission, then determine what requirements and systems are needed to do each of the functions. This method facilitates automation of the trades and options process. In the example, this was done with an off-the-shelf software package called TK!Solver. The basic functions that a permanently manned outpost on Mars must accomplish are: (1) Establish the Life Critical Systems; (2) Support Planetary Sciences and Exploration; and (3) Develop and Maintain Long-term Support Functions, including those systems needed towards self-sufficiency. The top-down functional decomposition methodology, combined with standard spreadsheet software, offers a powerful tool to quickly assess various design trades and analyze options. As the specific subsystems and the relational rule algorithms are further refined, it will be possible to very accurately determine the implications of continually evolving mission requirements.
Hydrograph matching method for measuring model performance
NASA Astrophysics Data System (ADS)
Ewen, John
2011-09-01
Despite all the progress made over the years on developing automatic methods for analysing hydrographs and measuring the performance of rainfall-runoff models, automatic methods cannot yet match the power and flexibility of the human eye and brain. Very simple approaches are therefore being developed that mimic the way hydrologists inspect and interpret hydrographs, including the way that patterns are recognised, links are made by eye, and hydrological responses and errors are studied and remembered. In this paper, a dynamic programming algorithm originally designed for use in data mining is customised for use with hydrographs. It generates sets of "rays" that are analogous to the visual links made by the hydrologist's eye when linking features or times in one hydrograph to the corresponding features or times in another hydrograph. One outcome from this work is a new family of performance measures called "visual" performance measures. These can measure differences in amplitude and timing, including the timing errors between simulated and observed hydrographs in model calibration. To demonstrate this, two visual performance measures, one based on the Nash-Sutcliffe Efficiency and the other on the mean absolute error, are used in a total of 34 split-sample calibration-validation tests for two rainfall-runoff models applied to the Hodder catchment, northwest England. The customised algorithm, called the Hydrograph Matching Algorithm, is very simple to apply; it is given in a few lines of pseudocode.
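A dynamic-programming matcher in this spirit can be written compactly; the sketch below uses a generic DTW-style recursion rather than Ewen's exact pseudocode, and the example hydrographs are invented.

import numpy as np

def match_hydrographs(obs, sim):
    # Dynamic-programming alignment; each (i, j) "ray" links an observed
    # time step to a simulated one, so timing errors show up as |i - j|.
    n, m = len(obs), len(sim)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(obs[i - 1] - sim[j - 1])        # amplitude difference
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    rays, i, j = [], n, m                              # backtrack the links
    while i > 0 and j > 0:
        rays.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        i, j = (i - 1, j - 1) if step == 0 else (i - 1, j) if step == 1 else (i, j - 1)
    return D[n, m], rays[::-1]

obs = np.array([0, 1, 5, 9, 4, 2, 1, 0], dtype=float)
sim = np.array([0, 0, 2, 6, 9, 5, 2, 0], dtype=float)  # peak lags by one step
score, rays = match_hydrographs(obs, sim)
print(score, rays[:4])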
Verification of a Quality Management Theory: Using a Delphi Study
Mosadeghrad, Ali Mohammad
2013-01-01
Background: A quality management model called the Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared, including a road map and performance measurement. Results: The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. Conclusion: A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence. PMID:24596883
Heget, Jeffrey R; Bagian, James P; Lee, Caryl Z; Gosbee, John W
2002-12-01
In 1998 the Veterans Health Administration (VHA) created the National Center for Patient Safety (NCPS) to lead the effort to reduce adverse events and close calls systemwide. NCPS's aim is to foster a culture of safety in the Department of Veterans Affairs (VA) by developing and providing patient safety programs and delivering standardized tools, methods, and initiatives to the 163 VA facilities. To create a system-oriented approach to patient safety, NCPS looked for models in fields such as aviation, nuclear power, human factors, and safety engineering. Core concepts included a non-punitive approach to patient safety activities that emphasizes systems-based learning, the active seeking out of close calls, which are viewed as opportunities for learning and investigation, and the use of interdisciplinary teams to investigate close calls and adverse events through a root cause analysis (RCA) process. Participation by VA facilities and networks was voluntary. NCPS has always aimed to develop a program that would be applicable both within the VA and beyond. NCPS's full patient safety program was tested and implemented throughout the VA system from November 1999 to August 2000. Program components included an RCA system for use by caregivers at the front line, a system for the aggregate review of RCA results, information systems software, alerts and advisories, and cognitive aids. Following program implementation, NCPS saw a 900-fold increase in reporting of close calls of high-priority events, reflecting the level of commitment to the program by VHA leaders and staff.
eMBI: Boosting Gene Expression-based Clustering for Cancer Subtypes
Chang, Zheng; Wang, Zhenjia; Ashby, Cody; Zhou, Chuan; Li, Guojun; Zhang, Shuzhong; Huang, Xiuzhen
2014-01-01
Identifying clinically relevant subtypes of a cancer using gene expression data is a challenging and important problem in medicine, and is a necessary premise to provide specific and efficient treatments for patients of different subtypes. Matrix factorization provides a solution by finding checker-board patterns in the matrices of gene expression data. In the context of gene expression profiles of cancer patients, these checkerboard patterns correspond to genes that are up- or down-regulated in patients with particular cancer subtypes. Recently, a new matrix factorization framework for biclustering called Maximum Block Improvement (MBI) was proposed; however, it still suffers from several problems when applied to cancer gene expression data analysis. In this study, we developed many effective strategies to improve MBI and designed a new program called enhanced MBI (eMBI), which is more effective and efficient at identifying cancer subtypes. Our tests on several gene expression profiling datasets of cancer patients consistently indicate that eMBI achieves significant improvements in comparison with MBI, in terms of cancer subtype prediction accuracy, robustness, and running time. In addition, the performance of eMBI is much better than another widely used matrix factorization method called nonnegative matrix factorization (NMF) and the method of hierarchical clustering, which is often the first choice of clinical analysts in practice. PMID:25374455
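For context, the widely used baseline named above (NMF) can be run on a synthetic expression matrix in a few lines; this is not eMBI or MBI, just the comparison method, with invented data.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
genes, patients, k = 500, 60, 3
expr = rng.random((genes, patients))
expr[:100, :20] += 2.0        # planted "checkerboard" block: subtype 1 signature
expr[100:200, 20:40] += 2.0   # subtype 2 signature

model = NMF(n_components=k, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(expr)          # gene loadings (genes x k)
H = model.components_                  # patient loadings (k x patients)
subtype = H.argmax(axis=0)             # assign each patient to a subtype
print(subtype[:40])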
Bui, Quang M.; Huggins, Richard M.; Hwang, Wen-Han; White, Victoria; Erbas, Bircan
2010-01-01
Background Anti-smoking advertisements are an effective population-based smoking reduction strategy. The Quitline telephone service provides a first point of contact for adults considering quitting. Because of data complexity, the relationship between anti-smoking advertising placement, intensity, and time trends in total call volume is poorly understood. In this study we use a recently developed semi-varying coefficient model to elucidate this relationship. Methods Semi-varying coefficient models comprise parametric and nonparametric components. The model is fitted to the daily number of calls to Quitline in Victoria, Australia to estimate a nonparametric long-term trend and parametric terms for day-of-the-week effects and to clarify the relationship with target audience rating points (TARPs) for the Quit and nicotine replacement advertising campaigns. Results The number of calls to Quitline increased with the TARP value of both the Quit and other smoking cessation advertisement; the TARP values associated with the Quit program were almost twice as effective. The varying coefficient term was statistically significant for peak periods with little or no advertising. Conclusions Semi-varying coefficient models are useful for modeling public health data when there is little or no information on other factors related to the at-risk population. These models are well suited to modeling call volume to Quitline, because the varying coefficient allowed the underlying time trend to depend on fixed covariates that also vary with time, thereby explaining more of the variation in the call model. PMID:20827036
Algorithm Building and Learning Programming Languages Using a New Educational Paradigm
NASA Astrophysics Data System (ADS)
Jain, Anshul K.; Singhal, Manik; Gupta, Manu Sheel
2011-08-01
This research paper presents a new concept of using a single tool to associate syntax of various programming languages, algorithms and basic coding techniques. A simple framework has been programmed in Python that helps students learn skills to develop algorithms, and implement them in various programming languages. The tool provides an innovative and a unified graphical user interface for development of multimedia objects, educational games and applications. It also aids collaborative learning amongst students and teachers through an integrated mechanism based on Remote Procedure Calls. The paper also elucidates an innovative method for code generation to enable students to learn the basics of programming languages using drag-n-drop methods for image objects.
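The abstract does not specify the RPC mechanism used; as a purely illustrative sketch, a shared-canvas hook could be exposed with Python's standard-library XML-RPC (the function name and behaviour below are invented).

from xmlrpc.server import SimpleXMLRPCServer

shared_canvas = []

def add_block(user, block_code):
    """Record a code block contributed by a student and acknowledge it."""
    shared_canvas.append((user, block_code))
    return f"{user}: {len(shared_canvas)} blocks on the shared canvas"

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(add_block)
server.serve_forever()   # clients connect via xmlrpc.client.ServerProxy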
MIANN models in medicinal, physical and organic chemistry.
González-Díaz, Humberto; Arrasate, Sonia; Sotomayor, Nuria; Lete, Esther; Munteanu, Cristian R; Pazos, Alejandro; Besada-Porto, Lina; Ruso, Juan M
2013-01-01
Reducing costs in terms of time, animal sacrifice, and material resources with computational methods has become a promising goal in Medicinal, Biological, Physical and Organic Chemistry. There are many computational techniques that can be used in this sense. In any case, almost all these methods focus on a few fundamental aspects, including: type (1) methods to quantify the molecular structure, type (2) methods to link the structure with the biological activity, and others. In particular, MARCH-INSIDE (MI), acronym for Markov Chain Invariants for Networks Simulation and Design, is a well-known method for QSAR analysis useful in step (1). In addition, the bio-inspired Artificial-Intelligence (AI) algorithms called Artificial Neural Networks (ANNs) are among the most powerful type (2) methods. We can combine MI with ANNs in order to seek QSAR models, a strategy which is called herein MIANN (MI & ANN models). One of the first applications of the MIANN strategy was in the development of new QSAR models for drug discovery. The MIANN strategy has been expanded to the QSAR study of proteins, protein-drug interactions, and protein-protein interaction networks. In this paper, we review for the first time many interesting aspects of the MIANN strategy, including its theoretical basis, implementation in web servers, and examples of applications in Medicinal and Biological chemistry. We also report new applications of the MIANN strategy in Medicinal chemistry and the first examples in Physical and Organic Chemistry as well. In so doing, we developed new MIANN models for several self-assembly physicochemical properties of surfactants and large reaction networks in organic synthesis. In some of the new examples we also present experimental results which had not been published previously.
Code of Federal Regulations, 2014 CFR
2014-10-01
... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...
Code of Federal Regulations, 2011 CFR
2011-10-01
... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...
Code of Federal Regulations, 2012 CFR
2012-10-01
... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...
Code of Federal Regulations, 2013 CFR
2013-10-01
... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...
Selby-Harrington, M; Sorenson, J R; Quade, D; Stearns, S C; Tesh, A S; Donat, P L
1995-01-01
OBJECTIVES. A randomized controlled trial was conducted to test the effectiveness and cost effectiveness of three outreach interventions to promote well-child screening for children on Medicaid. METHODS. In rural North Carolina, a random sample of 2053 families with children due or overdue for screening was stratified according to the presence of a home phone. Families were randomly assigned to receive a mailed pamphlet and letter, a phone call, or a home visit outreach intervention, or the usual (control) method of informing at Medicaid intake. RESULTS. All interventions produced more screenings than the control method, but increases were significant only for families with phones. Among families with phones, a home visit was the most effective intervention but a phone call was the most cost-effective. However, absolute rates of effectiveness were low, and incremental costs per effect were high. CONCLUSIONS. Pamphlets, phone calls, and home visits by nurses were minimally effective for increasing well-child screenings. Alternate outreach methods are needed, especially for families without phones. PMID:7573627
Magrath, Robert D; Platzen, Dirk; Kondo, Junko
2006-09-22
Young birds and mammals are extremely vulnerable to predators and so should benefit from responding to parental alarm calls warning of danger. However, young often respond differently from adults. This difference may reflect: (i) an imperfect stage in the gradual development of adult behaviour or (ii) an adaptation to different vulnerability. Altricial birds provide an excellent model to test for adaptive changes with age in response to alarm calls, because fledglings are vulnerable to a different range of predators than nestlings. For example, a flying hawk is irrelevant to a nestling in an enclosed nest, but is dangerous to that individual once it has left the nest, so we predict that young develop a response to aerial alarm calls to coincide with fledging. Supporting our prediction, recently fledged white-browed scrubwrens, Sericornis frontalis, fell silent immediately after playback of their parents' aerial alarm call, whereas nestlings continued calling despite hearing the playback. Young scrubwrens are therefore exquisitely adapted to the changing risks faced during development.
Who watches the watchers?: preventing fault in a fault tolerance library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanavige, C. D.
The Scalable Checkpoint/Restart library (SCR) was developed and is used by researchers at Lawrence Livermore National Laboratory to provide a fast and efficient method of saving and recovering large applications during runtime on high-performance computing (HPC) systems. Though SCR protects other programs, up until June 2017, nothing was actively protecting SCR. The goal of this project was to automate the building and testing of this library on the varying HPC architectures on which it is used. Our methods centered around the use of a continuous integration tool called Bamboo that allowed for automation agents to be installed on the HPC systems themselves. These agents provided a way for us to establish a new and unique way to automate and customize the allocation of resources and running of tests with CMake's unit testing framework, CTest, as well as integration testing scripts through an HPC package manager called Spack. These methods provided a parallel environment in which to test the more complex features of SCR. As a result, SCR is now automatically built and tested on several HPC architectures any time changes are made by developers to the library's source code. The results of these tests are then communicated back to the developers for immediate feedback, allowing them to fix functionality of SCR that may have broken. Hours of developers' time are now being saved from the tedious process of manually testing and debugging, which saves money and allows the SCR project team to focus their efforts towards development. Thus, HPC system users can use SCR in conjunction with their own applications to efficiently and effectively checkpoint and restart as needed with the assurance that SCR itself is functioning properly.
Ferrarini, Alberto; Forcato, Claudio; Buson, Genny; Tononi, Paola; Del Monaco, Valentina; Terracciano, Mario; Bolognesi, Chiara; Fontana, Francesca; Medoro, Gianni; Neves, Rui; Möhlendick, Birte; Rihawi, Karim; Ardizzoni, Andrea; Sumanasuriya, Semini; Flohr, Penny; Lambros, Maryou; de Bono, Johann; Stoecklein, Nikolas H; Manaresi, Nicolò
2018-01-01
Chromosomal instability and associated chromosomal aberrations are hallmarks of cancer and play a critical role in disease progression and development of resistance to drugs. Single-cell genome analysis has gained interest in recent years as a source of biomarkers for targeted-therapy selection and drug resistance, and several methods have been developed to amplify the genomic DNA and to produce libraries suitable for Whole Genome Sequencing (WGS). However, most protocols require several enzymatic and cleanup steps, thus increasing the complexity and length of protocols, while robustness and speed are key factors for clinical applications. To tackle this issue, we developed a single-tube, single-step, streamlined protocol, exploiting the ligation mediated PCR (LM-PCR) Whole Genome Amplification (WGA) method, for low-pass genome sequencing with the Ion Torrent™ platform and copy number alteration (CNA) calling from single cells. The method was evaluated on single cells isolated from 6 aberrant cell lines of the NCI-H series. In addition, to demonstrate the feasibility of the workflow on clinical samples, we analyzed single circulating tumor cells (CTCs) and white blood cells (WBCs) isolated from the blood of patients affected by prostate cancer or lung adenocarcinoma. The results obtained show that the developed workflow generates data accurately representing whole genome absolute copy number profiles of single cells and allows alteration calling at resolutions down to 100 Kbp with as few as 200,000 reads. The presented data demonstrate the feasibility of the Ampli1™ WGA-based low-pass workflow for detection of CNAs in single tumor cells, which would be of particular interest for genome-driven targeted therapy selection and for monitoring of disease progression.
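Downstream CNA calling from low-pass data typically reduces to binned read counts and log2 ratios; the sketch below is a generic illustration with simulated counts, not the Ampli1 workflow.

import numpy as np

def call_copy_number(sample_counts, reference_counts, ploidy=2):
    # Depth-normalize both profiles, take the per-bin log2 ratio against the
    # diploid reference, and round to an integer copy number.
    sample = sample_counts / sample_counts.sum()
    ref = reference_counts / reference_counts.sum()
    log2_ratio = np.log2(np.maximum(sample, 1e-9) / np.maximum(ref, 1e-9))
    return np.rint(ploidy * 2.0 ** log2_ratio).astype(int)

rng = np.random.default_rng(1)
ref = rng.poisson(200, size=1000).astype(float)     # ~200 reads per 100 kbp bin
tumor = rng.poisson(200, size=1000).astype(float)
tumor[300:350] *= 1.5                               # simulated 3-copy gain
print(np.bincount(call_copy_number(tumor, ref)))    # mostly 2s, some 3s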
ERIC Educational Resources Information Center
Taneri, Ahu
2018-01-01
In this research, the aim was to present students' evaluations of the scenario-based case study method and to show the functionality of the studied unit called "from production to consumption". Qualitative research methods and content analysis were used to reveal participants' experiences and meaningful relations regarding…
Methods for examining data quality in healthcare integrated data repositories.
Huser, Vojtech; Kahn, Michael G; Brown, Jeffrey S; Gouripeddi, Ramkiran
2018-01-01
This paper summarizes the content of a workshop focused on data quality. The first speaker (VH) described the data quality infrastructure and data quality evaluation methods currently in place within the Observational Health Data Sciences and Informatics (OHDSI) consortium. The speaker described in detail a data quality tool called Achilles Heel and the latest development for extending this tool. Interim results of an ongoing Data Quality study within the OHDSI consortium were also presented. The second speaker (MK) described lessons learned and new data quality checks developed by the PEDSnet pediatric research network. The last two speakers (JB, RG) described tools developed by the Sentinel Initiative and the University of Utah's service-oriented framework. Throughout and in its closing discussion, the workshop considered how data quality assessment can be advanced by combining the best features of each network.
Dynamic game balancing implementation using adaptive algorithm in mobile-based Safari Indonesia game
NASA Astrophysics Data System (ADS)
Yuniarti, Anny; Nata Wardanie, Novita; Kuswardayan, Imam
2018-03-01
In developing a game there is one method that should be applied to maintain the interest of players, namely dynamic game balancing. Dynamic game balancing is a process that matches the game's behaviour, attributes, and environment to a player's playing style. This study applies dynamic game balancing using an adaptive algorithm in a scrolling-shooter game called Safari Indonesia, developed in Unity. Games of this type feature a fighter-aircraft character trying to defend itself from insistent enemy attacks. This classic genre was chosen for implementing adaptive algorithms because it has quite complex attributes to develop using dynamic game balancing. Tests conducted by distributing questionnaires to a number of players indicate that this method managed to reduce frustration and increase the pleasure factor in playing.
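One common form such an adaptive rule can take is shown below; the paper's exact update is not reproduced, the target, step, and attribute names are invented, and the rule is engine-agnostic even though the game itself runs in Unity.

# Nudge difficulty after each wave so the player's recent survival rate
# tracks a target challenge level; clamp to keep the game sane.
TARGET_SURVIVAL = 0.6      # fraction of waves the player should clear (assumed)
STEP = 0.1                 # adjustment rate per wave (assumed)

def adjust_difficulty(difficulty, survived_recent):
    """survived_recent: fraction of the last few waves the player cleared."""
    error = survived_recent - TARGET_SURVIVAL
    difficulty += STEP * error             # too easy -> harder, too hard -> easier
    return min(max(difficulty, 0.1), 2.0)  # clamp to sane bounds

difficulty = 1.0
for survival in [1.0, 1.0, 0.4, 0.2, 0.6]:      # simulated play history
    difficulty = adjust_difficulty(difficulty, survival)
    enemy_speed = 80 * difficulty               # attributes scaled in-engine
    print(f"difficulty={difficulty:.2f}, enemy_speed={enemy_speed:.0f}")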
Acoustic fine structure may encode biologically relevant information for zebra finches.
Prior, Nora H; Smith, Edward; Lawson, Shelby; Ball, Gregory F; Dooling, Robert J
2018-04-18
The ability to discriminate changes in the fine structure of complex sounds is well developed in birds. However, the precise limit of this discrimination ability and how it is used in the context of natural communication remains unclear. Here we describe natural variability in acoustic fine structure of male and female zebra finch calls. Results from psychoacoustic experiments demonstrate that zebra finches are able to discriminate extremely small differences in fine structure, which are on the order of the variation in acoustic fine structure that is present in their vocal signals. Results from signal analysis methods also suggest that acoustic fine structure may carry information that distinguishes between biologically relevant categories including sex, call type and individual identity. Combined, our results are consistent with the hypothesis that zebra finches can encode biologically relevant information within the fine structure of their calls. This study provides a foundation for our understanding of how acoustic fine structure may be involved in animal communication.
Offshore killer whale tracking using multiple hydrophone arrays.
Gassmann, Martin; Henderson, E Elizabeth; Wiggins, Sean M; Roch, Marie A; Hildebrand, John A
2013-11-01
To study delphinid near surface movements and behavior, two L-shaped hydrophone arrays and one vertical hydrophone line array were deployed at shallow depths (<125 m) from the floating instrument platform R/P FLIP, moored northwest of San Clemente Island in the Southern California Bight. A three-dimensional propagation-model based passive acoustic tracking method was developed and used to track a group of five offshore killer whales (Orcinus orca) using their emitted clicks. In addition, killer whale pulsed calls and high-frequency modulated (HFM) signals were localized using other standard techniques. Based on these tracks sound source levels for the killer whales were estimated. The peak to peak source levels for echolocation clicks vary between 170-205 dB re 1 μPa @ 1 m, for HFM calls between 185-193 dB re 1 μPa @ 1 m, and for pulsed calls between 146-158 dB re 1 μPa @ 1 m.
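The core measurement behind any such array tracker is time-difference-of-arrival (TDOA) estimation between hydrophone pairs; this generic cross-correlation sketch uses an invented synthetic click, not the paper's propagation-model-based method.

import numpy as np
from scipy.signal import correlate

def tdoa(sig_a, sig_b, fs):
    # Lag of the cross-correlation peak, converted to seconds; a set of
    # pairwise TDOAs feeds a downstream localization solver.
    xc = correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(np.abs(xc)) - (len(sig_b) - 1)
    return lag / fs

fs = 96_000                              # sample rate (Hz), assumed
t = np.arange(0, 0.01, 1 / fs)
click = np.exp(-((t - 0.005) ** 2) / 1e-7) * np.sin(2 * np.pi * 40_000 * t)
delayed = np.roll(click, 24)             # arrives 0.25 ms later on phone 2
print(f"estimated TDOA: {tdoa(delayed, click, fs) * 1e3:.3f} ms")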
Toolboxes for a standardised and systematic study of glycans
2014-01-01
Background Recent progress in method development for characterising the branched structures of complex carbohydrates has now enabled higher throughput technology. Automation of structure analysis then calls for software development, since adding meaning to large data collections in reasonable time requires corresponding bioinformatics methods and tools. Current glycobioinformatics resources do cover information on the structure and function of glycans, their interaction with proteins or their enzymatic synthesis. However, this information is partial, scattered and often difficult for non-glycobiologists to find. Methods Following our diagnosis of the causes of the slow development of glycobioinformatics, we review the "objective" difficulties encountered in defining adequate formats for representing complex entities and developing efficient analysis software. Results Various solutions already implemented and strategies defined to bridge glycobiology with different fields and integrate the heterogeneous glyco-related information are presented. Conclusions Despite the initial stage of our integrative efforts, this paper highlights the rapid expansion of glycomics, the validity of existing resources and the bright future of glycobioinformatics. PMID:24564482
A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation
NASA Astrophysics Data System (ADS)
Yoshida, Toshio
In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns at an early software development stage. However, it is very difficult to verify task cooperation patterns at an early development stage, when task program code is not yet complete. Therefore, we propose a verification method using task skeleton program code and a real-time kernel that has a function for recording all events during software execution, such as system calls issued by task program code, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.
A Study of Multimedia Application-Based Vocabulary Acquisition
ERIC Educational Resources Information Center
Shao, Jing
2012-01-01
The development of computer-assisted language learning (CALL) has created the opportunity for exploring the effects of multimedia applications on foreign language vocabulary acquisition in recent years. This study provides an overview of computer-assisted language learning (CALL) and details one of its developments: multimedia. With the…
Hamiltonian Dynamics of Spider-Type Multirotor Rigid Bodies Systems
NASA Astrophysics Data System (ADS)
Doroshin, Anton V.
2010-03-01
This paper sets out to develop a spider-type multiple-rotor system which can be used for attitude control of spacecraft. The multirotor system contains a large number of rotor-equipped rays, so it was called a ``Spider-type System''; it can also be called a ``Rotary Hedgehog.'' These systems allow using spinups and captures of conjugate rotors to perform compound attitude motion of spacecraft. The paper describes a new method of spacecraft attitude reorientation and a new mathematical model of motion in Hamiltonian form. Hamiltonian dynamics of the system is investigated with the help of Andoyer-Deprit canonical variables. These variables allow obtaining an exact solution for hetero- and homoclinic orbits in the phase space of the system motion, which are very important for qualitative analysis.
2017-01-01
Purpose This study is aimed at identifying the relationships between medical school students’ academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students’ empathy skills. PMID:28870019
Bhaya, Amit; Kaszkurewicz, Eugenius
2004-01-01
It is pointed out that the so-called momentum method, much used in the neural network literature as an acceleration of the backpropagation method, is a stationary version of the conjugate gradient method. Connections with the continuous optimization method known as heavy ball with friction are also made. In both cases, adaptive (dynamic) choices of the so-called learning rate and momentum parameters are obtained using a control Liapunov function analysis of the system.
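The connection is easiest to see in the heavy-ball form of momentum; here is a minimal numeric sketch on an invented quadratic, with fixed parameters rather than the paper's adaptive Liapunov-based choices.

import numpy as np

def heavy_ball(grad, x0, alpha=0.02, beta=0.9, iters=200):
    # x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1});
    # beta = 0 gives plain gradient descent, and per-step adaptive
    # (alpha, beta) recover conjugate-gradient behaviour on quadratics.
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

A = np.array([[3.0, 0.5], [0.5, 1.0]])   # convex quadratic 0.5 x^T A x - b^T x
b = np.array([1.0, -2.0])
x_star = heavy_ball(lambda x: A @ x - b, np.zeros(2))
print(x_star, np.linalg.solve(A, b))     # should agree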
Parallel/Vector Integration Methods for Dynamical Astronomy
NASA Astrophysics Data System (ADS)
Fukushima, T.
Progress in parallel/vector computers has driven us to develop suitable numerical integrators that utilize their computational power to the full extent while remaining independent of the size of the system to be integrated. Unfortunately, parallel versions of Runge-Kutta type integrators are known to be relatively inefficient. Recently we developed a parallel version of the extrapolation method (Ito and Fukushima 1997), which allows variable timesteps and still gives an acceleration factor of 3-4 for general problems, while the vector-mode usage of the Picard-Chebyshev method (Fukushima 1997a, 1997b) can lead to an acceleration factor on the order of 1000 for smooth problems such as planetary/satellite orbit integration. The success of the multiple-correction PECE mode of the time-symmetric implicit Hermitian integrator (Kokubo 1998) seems to highlight Milankar's so-called "pipelined predictor corrector method", which is expected to yield an acceleration factor of 3-4. We will review these directions and discuss future prospects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fukuda, Ryoichi, E-mail: fukuda@ims.ac.jp; Ehara, Masahiro; Elements Strategy Initiative for Catalysts and Batteries
2015-12-31
The effects from solvent environment are specific to the electronic states; therefore, a computational scheme for solvent effects consistent with the electronic states is necessary to discuss electronic excitation of molecules in solution. The PCM (polarizable continuum model) SAC (symmetry-adapted cluster) and SAC-CI (configuration interaction) methods are developed for such purposes. The PCM SAC-CI adopts the state-specific (SS) solvation scheme where solvent effects are self-consistently considered for every ground and excited state. For efficient computations of many excited states, we develop a perturbative approximation for the PCM SAC-CI method, which is called the corrected linear response (cLR) scheme. Our test calculations show that the cLR PCM SAC-CI is a very good approximation of the SS PCM SAC-CI method for polar and nonpolar solvents.
LDRD final report : leveraging multi-way linkages on heterogeneous data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunlavy, Daniel M.; Kolda, Tamara Gibson
2010-09-01
This report is a summary of the accomplishments of the 'Leveraging Multi-way Linkages on Heterogeneous Data' project, which ran from FY08 through FY10. The goal was to investigate scalable and robust methods for multi-way data analysis. We developed a new optimization-based method called CPOPT for fitting a particular type of tensor factorization to data; CPOPT was compared against existing methods and found to be more accurate than any faster method and faster than any equally accurate method. We extended this method to computing tensor factorizations for problems with incomplete data; our results show that you can recover scientifically meaningful factorizations with large amounts of missing data (50% or more). The project has involved 5 members of the technical staff, 2 postdocs, and 1 summer intern. It has resulted in a total of 13 publications, 2 software releases, and over 30 presentations. Several follow-on projects have already begun, with more potential projects in development.
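For reference, the CP model that CPOPT fits can also be factored by a compact alternating-least-squares loop; this baseline sketch (with invented data) is not CPOPT's all-at-once gradient method.

import numpy as np

def cp_als(X, rank, iters=200, seed=0):
    # Alternating least squares for a dense 3-way CP factorization:
    # update each factor matrix in turn from the normal equations.
    rng = np.random.default_rng(seed)
    A, B, C = (rng.random((d, rank)) for d in X.shape)
    for _ in range(iters):
        A = np.einsum("ijk,jr,kr->ir", X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum("ijk,ir,kr->jr", X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum("ijk,ir,jr->kr", X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Recover a planted rank-2 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.random((5, 2)), rng.random((6, 2)), rng.random((7, 2))
X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
A, B, C = cp_als(X, rank=2)
print(np.linalg.norm(X - np.einsum("ir,jr,kr->ijk", A, B, C)))  # ~0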
A Mechanized Decision Support System for Academic Scheduling.
1986-03-01
an operational system called software. The first step in the development phase is design. Designers distribute software control by factoring the data... Subject terms: scheduling, decision support system, software design. ...scheduling system. It will also examine software-design techniques to identify the most appropriate methodology for this problem. Chapter 3 will...
ERIC Educational Resources Information Center
Sheehan, Kathleen M.; Kostin, Irene; Futagi, Yoko; Flor, Michael
2010-01-01
The Common Core Standards call for students to be exposed to a much greater level of text complexity than has been the norm in schools for the past 40 years. Textbook publishers, teachers, and assessment developers are being asked to refocus materials and methods to ensure that students are challenged to read texts at steadily increasing…
Internet VSMOKE: A user-oriented system for smoke management
James T. Paul; Alan Dozier; Daniel Chan
2007-01-01
The Georgia Forestry Commission has developed an Internet-based, user-friendly version of a USDA Forest Service smoke dispersion model called "VSMOKE." The program provides an easy-to-use method to quickly evaluate what areas will likely be impacted by smoke from a wild or prescribed fire. This is particularly important in assessing air quality, public safety and...
This speaker abstract is part of a session proposal for the 2018 Society of Toxicology annual meeting. I am proposing to speak about the use of new approach methods and data, such as AOPs, in mixtures risk assessment. I have developed an innovative approach called AOP footprint...
ERIC Educational Resources Information Center
Towaf, Siti Malikhah
2016-01-01
Learning can be observed along three dimensions: effectiveness, efficiency, and attractiveness. A careful study was carried out by analyzing the elements of the learning system: input, process, and output. Lesson study is an activity designed and implemented as an effort to improve learning in a variety of dimensions. "Lesson…
An Assessment of Reliability of Dialogue Annotation Instructions
1977-01-01
This report is part of an ongoing research effort on man-machine communication, which is engaged in transforming knowledge of how human communication works... certain kinds of recurring features in transcripts of human communication. These methods involve having a trained person, called an Observer, annotate... right kind of data for developing human communication theory. It is a confirmation of the appropriateness and potential effectiveness of using this...
ERIC Educational Resources Information Center
Litt, J.; Fishel, G.
2017-01-01
The Office of School Design and Charter Partnerships (OSDCP) at the New York City Department of Education (DOE) developed and executed a plan for district-charter collaboration, which they called the District-Charter Partnerships (DCP) initiative. This document describes the results of a mixed-method study of DCP conducted during the 2016-17…
Application of the Organic Synthetic Designs to Astrobiology
NASA Astrophysics Data System (ADS)
Kolb, V. M.
2009-12-01
In this paper we propose a synthesis of the heterocyclic compounds and the insoluble materials found on meteorites. Our synthetic scheme involves the reaction of sugars and amino acids, the so-called Maillard reaction. We have developed this scheme based on a combined analysis of regular and retrosynthetic organic synthetic principles. The merits of these synthetic methods for prebiotic design are addressed.
ERIC Educational Resources Information Center
Emery, Mary; Higgins, Lorie; Chazdon, Scott; Hansen, Debra
2015-01-01
A mind mapping approach to evaluation called Ripple Effects Mapping (REM) has been developed and used by a number of Extension faculty across the country recently. This article describes three approaches to REM, as well as key differences and similarities. The authors, each from different land-grant institutions, believe REM is an effective way to…
Patrol force allocation for law enforcement: An introductory planning guide
NASA Technical Reports Server (NTRS)
Sohn, R. L.; Kennedy, R. D.
1976-01-01
Previous and current methods for analyzing police patrol forces are reviewed and discussed. The steps in developing an allocation analysis procedure are defined, including predicting the rate of calls for service, determining the number of patrol units needed, designing sectors, and analyzing dispatch strategies. Existing computer programs used for this purpose are briefly described, and some results of their application are given.
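One standard way to carry out the "number of patrol units needed" step is an M/M/c queueing (Erlang C) calculation; the guide's own formulas are not reproduced here, and the planning inputs below are invented.

from math import factorial

def erlang_c_wait_prob(arrival_rate, service_rate, units):
    """Probability an incoming call must wait for a free patrol unit (M/M/c)."""
    a = arrival_rate / service_rate                 # offered load (erlangs)
    if units <= a:
        return 1.0                                  # unstable: calls always queue
    p0_inv = sum(a**n / factorial(n) for n in range(units))
    p0_inv += a**units / (factorial(units) * (1 - a / units))
    return (a**units / (factorial(units) * (1 - a / units))) / p0_inv

calls_per_hour, avg_service_hours = 12.0, 0.5       # assumed planning inputs
for c in range(7, 11):                              # candidate fleet sizes
    print(c, round(erlang_c_wait_prob(calls_per_hour, 1 / avg_service_hours, c), 3))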
NASA Technical Reports Server (NTRS)
Elliott, R. D.; Werner, N. M.; Baker, W. M.
1975-01-01
The Aerodynamic Data Analysis and Integration System (ADAIS) is described: a highly interactive computer graphics program capable of manipulating large quantities of data such that addressable elements of a data base can be called up for graphic display, compared, curve fit, stored, retrieved, differenced, etc. The general nature of the system is evidenced by the fact that limited usage has already occurred with data bases consisting of thermodynamic, basic loads, and flight dynamics data. Productivity five times that of conventional manual methods of wind tunnel data analysis is routinely achieved using ADAIS. In wind tunnel data analysis, data from one or more runs of a particular test may be called up and displayed along with data from one or more runs of a different test. Curves may be faired through the data points by any of four methods, including cubic spline and least squares polynomial fit up to seventh order.
Heat Exchange in “Human body - Thermal protection - Environment” System
NASA Astrophysics Data System (ADS)
Khromova, I. V.
2017-11-01
This article is devoted to the simulation and calculation of thermal processes in the system called "Human body - Thermal protection - Environment" under low temperature conditions. It considers internal heat sources and convective heat transfer between calculated elements, which is important for heat transfer theory overall. The article introduces a complex heat transfer calculation method and a local thermophysical parameter calculation method for the system "Human body - Thermal protection - Environment", considering passive and active thermal protection, and the thermophysical and geometric properties of the calculated elements, over a wide range of environmental parameters (water, air). It also includes research on the influence that the thermal resistance of modern materials used in the development of special protective clothing has on heat transfer in the system. Analysis of the obtained results allows computational studies to complement experiments and enables optimization of individual life-support system elements intended to protect the human body from exposure to external factors.
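At its simplest, steady-state heat loss through protective layers reduces to a one-dimensional thermal-resistance network; the sketch below uses invented layer properties and is far simpler than the article's model.

def heat_flux(t_skin, t_env, layers, h_conv):
    """layers: list of (thickness_m, conductivity_W_mK); returns W/m^2."""
    # Series resistances: conduction through each layer plus the outer
    # convective film (1 / h_conv).
    r_total = sum(d / k for d, k in layers) + 1.0 / h_conv
    return (t_skin - t_env) / r_total

layers = [(0.003, 0.04), (0.010, 0.035)]   # assumed underwear + insulation
q = heat_flux(t_skin=33.0, t_env=-30.0, layers=layers, h_conv=20.0)
print(f"heat loss: {q:.0f} W/m^2")         # compare against metabolic output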
MuLoG, or How to Apply Gaussian Denoisers to Multi-Channel SAR Speckle Reduction?
Deledalle, Charles-Alban; Denis, Loic; Tabti, Sonia; Tupin, Florence
2017-09-01
Speckle reduction is a longstanding topic in synthetic aperture radar (SAR) imaging. Since most current and planned SAR imaging satellites operate in polarimetric, interferometric, or tomographic modes, SAR images are multi-channel and speckle reduction techniques must jointly process all channels to recover polarimetric and interferometric information. The distinctive nature of the SAR signal (complex-valued, corrupted by multiplicative fluctuations) calls for the development of specialized methods for speckle reduction. Image denoising is a very active topic in image processing, with a wide variety of approaches and many denoising algorithms available, almost always designed for additive Gaussian noise suppression. This paper proposes a general scheme, called MuLoG (MUlti-channel LOgarithm with Gaussian denoising), to include such Gaussian denoisers within a multi-channel SAR speckle reduction technique. A new family of speckle reduction algorithms can thus be obtained, benefiting from the ongoing progress in Gaussian denoising; comparing the results of several denoisers also helps identify and dismiss method-specific artifacts.
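A crude single-channel illustration of the idea MuLoG builds on: a log transform makes multiplicative speckle approximately additive, a Gaussian denoiser is applied, and the result is mapped back. The sketch below uses a plain Gaussian filter as the stand-in denoiser and omits MuLoG's variance stabilization, debiasing, and ADMM iterations entirely.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def homomorphic_despeckle(intensity, sigma=2.0):
    """Log-transform the intensity, apply a Gaussian denoiser, map back.
    The Gaussian filter here is a stand-in for any Gaussian denoiser."""
    log_im = np.log(np.maximum(intensity, 1e-10))
    denoised = gaussian_filter(log_im, sigma)
    return np.exp(denoised)

# Simulated 1-look speckle: exponential multiplicative noise on a flat scene
rng = np.random.default_rng(0)
clean = np.ones((128, 128)) * 5.0
speckled = clean * rng.exponential(1.0, clean.shape)
restored = homomorphic_despeckle(speckled)
```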
Much ado about mice: Standard-setting in model organism research.
Hardesty, Rebecca A
2018-04-11
Recently there has been a practice turn in the philosophy of science that has called for analyses to be grounded in the actual doings of everyday science. This paper is in furtherance of this call and it does so by employing participant-observation ethnographic methods as a tool for discovering epistemological features of scientific practice in a neuroscience lab. The case I present focuses on a group of neurobiologists researching the genetic underpinnings of cognition in Down syndrome (DS) and how they have developed a new mouse model which they argue should be regarded as the "gold standard" for all DS mouse research. Through use of ethnographic methods, interviews, and analyses of publications, I uncover how the lab constructed their new mouse model. Additionally, I describe how model organisms can serve as abstract standards for scientific work that impact the epistemic value of scientific claims, regulate practice, and constrain future work. Copyright © 2018 Elsevier Ltd. All rights reserved.
Time Dependent Tomography of the Solar Corona in Three Spatial Dimensions
NASA Astrophysics Data System (ADS)
Butala, M. D.; Frazin, R. A.; Kamalabadi, F.
2006-12-01
The combination of the soon to be launched STEREO mission with SOHO will provide scientists with three simultaneous space-borne views of the Sun. The increase in available measurements will reduce the data acquisition time necessary to obtain 3D coronal electron density (N_e) estimates from coronagraph images using a technique called solar rotational tomography (SRT). However, the data acquisition period will still be long enough for the corona to dynamically evolve, requiring time dependent solar tomography. The Kalman filter (KF) would seem to be an ideal computational method for time dependent SRT. Unfortunately, the KF scales poorly with problem size and is, as a result, inapplicable. A Monte Carlo approximation to the KF called the localized ensemble Kalman filter was developed for massive applications and has the promise of making the time dependent estimation of the 3D coronal N_e possible. We present simulations showing that this method will make time dependent tomography in three spatial dimensions computationally feasible.
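A minimal sketch of the analysis step of a stochastic ensemble Kalman filter, the family of methods referred to above; the localization that makes the scheme tractable at coronal scale is omitted, and all shapes and operators are generic assumptions rather than the paper's setup.

```python
import numpy as np

def enkf_update(ensemble, H, y, obs_cov, rng):
    """Stochastic ensemble Kalman filter analysis step.
    ensemble: (n_state, n_members) forecast ensemble
    H: (n_obs, n_state) linear observation operator
    y: (n_obs,) observations; obs_cov: (n_obs, n_obs) error covariance"""
    n_state, n_mem = ensemble.shape
    x_mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - x_mean                              # ensemble anomalies
    HA = H @ A
    P_hh = HA @ HA.T / (n_mem - 1) + obs_cov           # innovation covariance
    P_xh = A @ HA.T / (n_mem - 1)                      # state-obs cross covariance
    K = P_xh @ np.linalg.solve(P_hh, np.eye(len(y)))   # Kalman gain
    # Perturbed observations keep the analysis spread statistically consistent
    y_pert = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), obs_cov, n_mem).T
    return ensemble + K @ (y_pert - H @ ensemble)
```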
Mutual recognition of TNT using antibodies polymeric shell having CdS.
Say, Ridvan; Büyüktiryaki, Sibel; Hür, Deniz; Yilmaz, Filiz; Ersöz, Arzu
2012-02-15
Click chemistry is the latest strategy called upon in the development of state-of-the-art bioconjugation. In this study, we propose a covalent and photosensitive crosslinking conjugation of the antibody on nanostructures. For this purpose, quantum dots (QDs) have been conjugated through ruthenium-chelate-based amino acid monomer linkages without affecting the conformation and function of the proteins. These amino acid monomer linkages, called ANADOLUCA (AmiNoAcid Decorated and Light Underpining Conjugation Approach), yield reusable, oriented, cross-linked anti-2,4,6-trinitrotoluene (TNT) antibody-QD conjugates for TNT detection. In this work, a new and simple method has been developed to design and prepare highly sensitive nanoconjugates for TNT determination. We have demonstrated the use of luminescent QDs conjugated to antibody for the specific detection of the explosive TNT in aqueous environments. The binding affinity of each nanoconjugate for TNT detection has also been investigated using Langmuir adsorption methods. Copyright © 2012 Elsevier B.V. All rights reserved.
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
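The bmem package is in R; the sketch below re-creates the core logic in Python for the simplest case, a single-mediator model with normal errors, estimating power as the fraction of simulated datasets in which a percentile bootstrap interval for the indirect effect a*b excludes zero. All parameters and replication counts are illustrative.

```python
import numpy as np

def mediation_power(n, a, b, c_prime, n_reps=200, n_boot=500, seed=1):
    """Monte Carlo power for the indirect effect a*b in a simple mediation
    model X -> M -> Y, using a percentile bootstrap test per dataset."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_reps):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + c_prime * x + rng.normal(size=n)
        est = np.empty(n_boot)
        for i in range(n_boot):
            idx = rng.integers(0, n, n)
            xb, mb, yb = x[idx], m[idx], y[idx]
            a_hat = np.polyfit(xb, mb, 1)[0]          # a path: slope of M ~ X
            X = np.column_stack([mb, xb, np.ones(n)]) # b path: M slope in Y ~ M + X
            b_hat = np.linalg.lstsq(X, yb, rcond=None)[0][0]
            est[i] = a_hat * b_hat
        lo, hi = np.percentile(est, [2.5, 97.5])
        hits += (lo > 0) or (hi < 0)                  # CI excludes zero
    return hits / n_reps

print(mediation_power(n=100, a=0.3, b=0.3, c_prime=0.1))
```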
NASA Astrophysics Data System (ADS)
Xu, Lianyun; Hou, Zhende; Qin, Yuwen
2002-05-01
Because some composite materials, thin-film materials, and biomaterials are very thin, and some of them are flexible, the classical methods of measuring their Young's moduli by mounting extensometers on specimens are not applicable. A bi-image method based on image correlation for measuring Young's moduli is developed in this paper. The measuring precision achieved is one order of magnitude better than that of general digital image correlation (the single-image method). In this way, the Young's modulus of a SS301 stainless steel thin tape with thickness 0.067 mm is measured, and the moduli of polyester fiber films, a kind of flexible sheet with thickness 0.25 mm, are also measured.
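A minimal sketch of the image-correlation kernel underlying such measurements: integer-pixel tracking of one subset by exhaustive zero-mean normalized cross-correlation. The paper's bi-image refinement and the subpixel interpolation that real DIC codes add are not reproduced here.

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size subsets."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def track_subset(ref, cur, top, left, size=21, search=5):
    """Integer-pixel displacement of one subset by exhaustive ZNCC search.
    Assumes the subset plus its search window stays inside both images;
    strain then follows from displacement gradients between subsets."""
    tmpl = ref[top:top + size, left:left + size]
    best, best_uv = -2.0, (0, 0)
    for du in range(-search, search + 1):
        for dv in range(-search, search + 1):
            cand = cur[top + du:top + du + size, left + dv:left + dv + size]
            score = zncc(tmpl, cand)
            if score > best:
                best, best_uv = score, (du, dv)
    return best_uv
```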
NASA Technical Reports Server (NTRS)
Wendel, Thomas R.; Boland, Joseph R.; Hahne, David E.
1991-01-01
Flight-control laws are developed for a wind-tunnel aircraft model flying at a high angle of attack by using a synthesis technique called direct eigenstructure assignment. The method employs flight guidelines and control-power constraints to develop the control laws, and gain schedules and nonlinear feedback compensation provide a framework for handling the nonlinear nature of the dynamics at high angles of attack. Linear and nonlinear evaluations show that the control laws are effective, a conclusion further confirmed by free-flight testing of a scale model.
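A sketch of direct eigenstructure assignment in its standard textbook form, which may differ in detail from the synthesis used in the paper; it assumes real, distinct desired eigenvalues, one per state, and an invertible achieved eigenvector matrix.

```python
import numpy as np
from scipy.linalg import null_space

def eigenstructure_gain(A, B, eigvals, desired_vecs):
    """State-feedback gain K (u = -Kx) assigning each desired eigenvalue and
    the closest achievable eigenvector. For each lam, the pairs (v, w) with
    (lam*I - A)v + Bw = 0 span the achievable directions; projecting the
    desired eigenvector onto that span and stacking gives K from K V = W."""
    n, m = B.shape
    V, W = [], []
    for lam, vd in zip(eigvals, desired_vecs):
        N = null_space(np.hstack([lam * np.eye(n) - A, B]))
        Nv, Nw = N[:n, :], N[n:, :]
        c, *_ = np.linalg.lstsq(Nv, vd, rcond=None)  # least-squares projection
        V.append(Nv @ c)
        W.append(Nw @ c)
    V = np.column_stack(V)
    W = np.column_stack(W)
    return np.linalg.solve(V.T, W.T).T               # K = W V^{-1}
```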
Cavity radiation model for solar central receivers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipps, F.W.
1981-01-01
The Energy Laboratory of the University of Houston has developed a computer simulation program called CREAM (Cavity Radiation Exchange Analysis Model) for application to the solar central receiver system. The zone generating capability of CREAM has been used in several solar re-powering studies. CREAM contains a geometric configuration factor generator based on Nusselt's method. A formulation of Nusselt's method provides support for the FORTRAN subroutine NUSSELT. Numerical results from NUSSELT are compared to analytic values and values from Sparrow's method. Sparrow's method is based on a double contour integral and its reduction to a single integral, which is approximated by Gaussian methods. Nusselt's method is adequate for the intended engineering applications, but Sparrow's method is found to be an order of magnitude more efficient in many situations.
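A Monte Carlo stand-in for the configuration-factor computation discussed above: rays are emitted with a cosine distribution from one of two coaxial parallel unit squares and counted on the other. Nusselt's unit-sphere construction is the deterministic analogue of this test; the geometry here is illustrative, not a receiver cavity.

```python
import numpy as np

def view_factor_mc(n_rays=200_000, side=1.0, gap=1.0, seed=0):
    """Configuration factor between two coaxial parallel squares: emit
    cosine-distributed rays from random points on the lower plate and
    count the fraction that land on the upper plate."""
    rng = np.random.default_rng(seed)
    x0 = rng.uniform(0, side, n_rays)
    y0 = rng.uniform(0, side, n_rays)
    phi = rng.uniform(0, 2 * np.pi, n_rays)
    sin_t = np.sqrt(rng.uniform(0, 1, n_rays))   # cosine-weighted hemisphere
    cos_t = np.sqrt(1 - sin_t**2)
    t = gap / cos_t                              # distance to the upper plane
    xh = x0 + t * sin_t * np.cos(phi)
    yh = y0 + t * sin_t * np.sin(phi)
    hit = (xh >= 0) & (xh <= side) & (yh >= 0) & (yh <= side)
    return hit.mean()

print(view_factor_mc())   # ~0.20 for unit squares at unit spacing
```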
NASA Astrophysics Data System (ADS)
Tayebi, A.; Shekari, Y.; Heydari, M. H.
2017-07-01
Several physical phenomena, such as the transport of pollutants, energy, and particles, can be described by the well-known convection-diffusion equation, a combination of the diffusion and advection equations. In this paper, this equation is generalized with the concept of variable-order fractional derivatives; the generalized equation is called the variable-order time fractional advection-diffusion equation (V-OTFA-DE). An accurate and robust meshless method based on the moving least squares (MLS) approximation and a finite difference scheme is proposed for its numerical solution on two-dimensional (2-D) arbitrary domains. In the time domain, the finite difference technique with a θ-weighted scheme is employed, and in the space domain the MLS approximation, to obtain appropriate semi-discrete solutions. Since the newly developed method is a meshless approach, it does not require any background mesh structure, and the numerical solutions are constructed entirely on a set of scattered nodes. The proposed method is validated on three examples, including two benchmark problems and an applied problem of pollutant distribution in the atmosphere. In all cases the obtained results show that the method is accurate and robust. Moreover, a remarkable property of the method, the so-called positivity of the scheme, is observed in solving concentration transport phenomena.
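To make the time discretization concrete, the sketch below applies a θ-weighted scheme to the constant-order 1-D advection-diffusion equation on a uniform grid; the paper's method replaces both simplifications, using an MLS meshless spatial approximation and a variable-order fractional time derivative.

```python
import numpy as np

def advect_diffuse_theta(c0, v, D, dx, dt, n_steps, theta=0.5):
    """Theta-weighted time stepping for c_t + v c_x = D c_xx on a uniform
    1-D grid with central differences and homogeneous Dirichlet ends.
    theta=0 is explicit, theta=1 implicit, theta=0.5 Crank-Nicolson."""
    n = len(c0)
    L = np.zeros((n, n))                      # discrete -v d/dx + D d2/dx2
    for i in range(1, n - 1):
        L[i, i - 1] = D / dx**2 + v / (2 * dx)
        L[i, i] = -2 * D / dx**2
        L[i, i + 1] = D / dx**2 - v / (2 * dx)
    I = np.eye(n)
    A = I - dt * theta * L                    # implicit (left-hand) part
    Bm = I + dt * (1 - theta) * L             # explicit (right-hand) part
    c = c0.copy()
    for _ in range(n_steps):
        c = np.linalg.solve(A, Bm @ c)
        c[0] = c[-1] = 0.0                    # Dirichlet boundaries
    return c
```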
On the Inference of Functional Circadian Networks Using Granger Causality
Pourzanjani, Arya; Herzog, Erik D.; Petzold, Linda R.
2015-01-01
Being able to infer one-way direct connections in an oscillatory network such as the suprachiasmatic nucleus (SCN) of the mammalian brain using time series data is difficult but crucial to understanding network dynamics. Although techniques have been developed for inferring networks from time series data, there have been no attempts to adapt these techniques to infer directional connections in oscillatory time series while accurately distinguishing between direct and indirect connections. In this paper an adaptation of Granger Causality is proposed that allows for inference of circadian networks, and oscillatory networks in general, called Adaptive Frequency Granger Causality (AFGC). Additionally, an extension of this method is proposed to infer networks with large numbers of cells, called LASSO AFGC. The method was validated using simulated data from several different networks. For the smaller networks the method was able to identify all one-way direct connections without identifying connections that were not present. For larger networks of up to twenty cells the method shows excellent performance in identifying true and false connections; this is quantified by an area under the curve (AUC) of 96.88%. We note that this method, like other Granger Causality-based methods, is based on the detection of high-frequency signals propagating between cell traces. Thus it requires a relatively high sampling rate and a network that can propagate high-frequency signals. PMID:26413748
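A minimal sketch of plain pairwise Granger causality on simulated traces, using the standard F-test as implemented in statsmodels; the adaptive-frequency and LASSO extensions developed in the paper are not reproduced here.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Two simulated cell traces: x drives y with a one-step lag
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + 0.5 * rng.normal()

# Test whether the second column (x) Granger-causes the first (y)
res = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
p_val = res[1][0]["ssr_ftest"][1]     # F-test p-value at lag 1
print(f"x -> y Granger p = {p_val:.3g}")
```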
Radar target classification studies: Software development and documentation
NASA Astrophysics Data System (ADS)
Kamis, A.; Garber, F.; Walton, E.
1985-09-01
Three computer programs were developed to process and analyze calibrated radar returns. The first program, called DATABASE, was developed to create and manage a random-access data base. The second program, called FTRAN DB, was developed to process horizontal and vertical polarization radar returns into different formats (i.e., time domain, circular polarizations, and polarization parameters). The third program, called RSSE, was developed to simulate a variety of radar systems and to evaluate their ability to identify radar returns. Complete computer listings are included in the appendix volumes.
ERIC Educational Resources Information Center
Jones, Linda C.
2010-01-01
From 1981 to today, the encouragement Jim Pusack and his colleague Sue Otto gave faculty to develop and/or implement CALL into the curriculum has been vital to our L2 teaching evolution. This article describes how their efforts evolved over the last two and a half decades and the ties that bind their efforts with today's CALL development.
Savas, Linda Ann; Grady, Katherine; Cotterill, Sarah; Summers, Lucinda; Boaden, Ruth; Gibson, J Martin
2015-02-01
To design, deliver and evaluate IGT Care Call, a telephone service providing a 6 month lifestyle education programme for people with impaired glucose tolerance (IGT). An observational study of IGT Care Call, a programme providing motivational support and education using electronic scripts. The service was delivered to 55 participants, all of whom completed the course (an information pack and at least five telephone calls over 6 months). Clinical measurements were undertaken in General Practice at baseline, on completion of the programme and one year later. Among the 40 participants for whom we have complete data available, one year after discharge, participants showed improvements in fasting plasma glucose (0.29 mmol/l, 95% CI 0.07 to 0.51), weight (2.81 kg, 95% CI 1.20 to 4.42) and BMI (1.06 kg/m(2), 95% CI 0.49 to 1.63). All differences were statistically significant (p < 0.01). Whilst an uncontrolled observational study with a small sample size, this pilot suggests IGT Care Call may be effective in promoting positive and sustained lifestyle changes to prevent type 2 diabetes, which warrants further investigation. A telephone method of service delivery was acceptable, convenient and may have improved self confidence in how to reduce risk of type 2 diabetes. Copyright © 2014 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
The WOMBAT Attack Attribution Method: Some Results
NASA Astrophysics Data System (ADS)
Dacier, Marc; Pham, Van-Hau; Thonnard, Olivier
In this paper, we present a new attack attribution method that has been developed within the WOMBAT project. We illustrate the method with some real-world results obtained when applying it to almost two years of attack traces collected by low interaction honeypots. This analytical method aims at identifying large scale attack phenomena composed of IP sources that are linked to the same root cause. All malicious sources involved in a same phenomenon constitute what we call a Misbehaving Cloud (MC). The paper offers an overview of the various steps the method goes through to identify these clouds, providing pointers to external references for more detailed information. Four instances of misbehaving clouds are then described in some more depth to demonstrate the meaningfulness of the concept.
PHIRE (Public Health Innovation and Research in Europe): methods, structures and evaluation.
Barnhoorn, Floris; McCarthy, Mark; Devillé, Walter; Alexanderson, Kristina; Voss, Margaretha; Conceição, Claudia
2013-11-01
Public Health Innovation and Research in Europe (PHIRE), building on previous European collaborative projects, was developed to assess national uptake and impacts of European public health innovations, to describe national public health research programmes, strategies and structures, and to develop participation of researchers through the organizational structures of the European Public Health Association (EUPHA). This article describes the methods used. PHIRE was led by EUPHA with seven partner organisations over 30 months. It was conceived to engage the organisation of EUPHA--working through its thematic Sections and through its national public health associations--and to assess innovation and research across 30 European countries. Public health research was defined broadly as health research at population and organisational level. There were seven Work Packages (three covering coordination and four technical) led by partners and coordinated through management meetings. Seven EUPHA Sections identified eight innovations within the projects funded by the Public Health Programme of the European Commission Directorate for Health and Consumers. Country informants, identified through EUPHA thematic Sections, reported on national uptake of the innovations in eight public health projects supported by the European Union Public Health Programme. Four PHIRE partners, each taking a regional sector of Europe, worked with the public health associations and other informants to describe public health research programmes, calls and systems. A classification was created for the national public health research programmes and calls in 2010. The internal and external evaluations were supportive. PHIRE described public health innovations and research across Europe through national experts. More work is needed to conceptualize and define public health 'innovations' and to develop theories and methods for the assessment of their uptake and impacts at country and cross-country levels. More attention to methods for describing and assessing national public health research programmes, strategies and structures would contribute to the development of the European Research Area.
The use of Lanczos's method to solve the large generalized symmetric definite eigenvalue problem
NASA Technical Reports Server (NTRS)
Jones, Mark T.; Patrick, Merrell L.
1989-01-01
The generalized eigenvalue problem, Kx = λMx, is of significant practical importance, especially in structural engineering, where it arises as the vibration and buckling problem. A new algorithm, LANZ, based on Lanczos's method is developed. LANZ uses a technique called dynamic shifting to improve the efficiency and reliability of the Lanczos algorithm. A new algorithm for solving the tridiagonal matrices that arise when using Lanczos's method is described. A modification of Parlett and Scott's selective orthogonalization algorithm is proposed. Results from an implementation of LANZ on a Convex C-220 show it to be superior to a subspace iteration code.
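SciPy's eigsh exposes a shift-invert Lanczos solver for exactly this generalized symmetric-definite problem; the sketch below applies it to a toy stiffness/mass pair with a single fixed shift, whereas LANZ's dynamic shifting moves the shift as eigenvalues converge.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Toy K (stiffness) and M (mass): 1-D chain, both symmetric positive definite
n = 200
K = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
M = sp.identity(n, format="csc") * (1.0 / n)

# Shift-invert Lanczos: eigenvalues of K x = lambda M x nearest sigma.
# SciPy uses one fixed shift per call, so this illustrates a single step
# of what a dynamically shifted code would repeat with updated sigma.
vals, vecs = eigsh(K, k=6, M=M, sigma=0.0, which="LM")
print(np.sort(vals))
```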
E-commerce Review System to Detect False Reviews.
Kolhar, Manjur
2017-08-15
E-commerce sites have been doing profitable business since their inception on high-speed, secured networks, and they continue to influence consumers through various methods. One of the most effective is the e-commerce review rating system, in which consumers provide ratings for the products they have used. However, almost all e-commerce review rating systems are unable to provide cumulative review ratings. Furthermore, review ratings are influenced by positive and negative malicious feedback ratings, collectively called false reviews. In this paper, we propose an e-commerce review system framework that uses the cumulative sum (CUSUM) method to detect and remove malicious review ratings.
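One plausible reading of such a detector, sketched below: a two-sided standardized CUSUM over the rating stream that raises a flag when cumulative drift above or below the running mean exceeds a threshold. The thresholds and the exact statistic are assumptions, not the paper's specification.

```python
import numpy as np

def cusum_flags(ratings, target=None, k=0.5, h=4.0):
    """Two-sided CUSUM over a stream of review ratings: flag indices where
    cumulative standardized drift above/below the target mean exceeds h."""
    r = np.asarray(ratings, dtype=float)
    mu = r.mean() if target is None else target
    sd = r.std() or 1.0
    z = (r - mu) / sd
    s_hi = s_lo = 0.0
    flags = []
    for i, zi in enumerate(z):
        s_hi = max(0.0, s_hi + zi - k)     # drift upward (fake praise)
        s_lo = max(0.0, s_lo - zi - k)     # drift downward (fake attacks)
        if s_hi > h or s_lo > h:
            flags.append(i)
            s_hi = s_lo = 0.0              # restart after an alarm
    return flags

# A burst of 1-star attacks amid genuine ~4-star reviews
print(cusum_flags([4, 5, 4, 4, 5, 3, 4] + [1, 1, 1, 1, 1, 1] + [4, 5, 4]))
```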
NASA Astrophysics Data System (ADS)
Kwiatkowski, Mirosław
2017-12-01
The paper presents the results of research on the application of new analytical models of multilayer adsorption on heterogeneous surfaces, combined with a unique fast multivariant identification procedure, together called the LBET method, as a tool for analysing the microporous structure of activated carbon fibres obtained from polyacrylonitrile by chemical activation with potassium and sodium hydroxides. The LBET method was employed in particular to evaluate the impact of the activator used and of the hydroxide-to-polyacrylonitrile ratio on the resulting microporous structure of the activated carbon fibres.
Geometrical comparison of two protein structures using Wigner-D functions.
Saberi Fathi, S M; White, Diana T; Tuszynski, Jack A
2014-10-01
In this article, we develop a quantitative comparison method for two arbitrary protein structures. This method uses a root-mean-square deviation characterization and employs a series expansion of the protein's shape function in terms of the Wigner-D functions to define a new criterion, which is called a "similarity value." We further demonstrate that the expansion coefficients for the shape function obtained with the help of the Wigner-D functions correspond to structure factors. Our method addresses the common problem of comparing two proteins with different numbers of atoms. We illustrate it with a worked example. © 2014 Wiley Periodicals, Inc.
Interaction sorting method for molecular dynamics on multi-core SIMD CPU architecture.
Matvienko, Sergey; Alemasov, Nikolay; Fomin, Eduard
2015-02-01
Molecular dynamics (MD) is widely used in computational biology for studying binding mechanisms of molecules, molecular transport, conformational transitions, protein folding, etc. The method is computationally expensive; thus, the demand for the development of novel, much more efficient algorithms is still high. Therefore, the new algorithm designed in 2007 and called interaction sorting (IS) clearly attracted interest, as it outperformed the most efficient MD algorithms. In this work, a new IS modification is proposed which allows the algorithm to utilize SIMD processor instructions. This paper shows that the improvement provides an additional gain in performance, 9% to 45% in comparison to the original IS method.
The assessment of biases in the acoustic discrimination of individuals
Šálek, Martin
2017-01-01
Animal vocalizations contain information about individual identity that could potentially be used for the monitoring of individuals. However, the performance of individual discrimination is subjected to many biases depending on factors such as the amount of identity information, or methods used. These factors need to be taken into account when comparing results of different studies or selecting the most cost-effective solution for a particular species. In this study, we evaluate several biases associated with the discrimination of individuals. On a large sample of little owl male individuals, we assess how discrimination performance changes with methods of call description, an increasing number of individuals, and number of calls per male. Also, we test whether the discrimination performance within the whole population can be reliably estimated from a subsample of individuals in a pre-screening study. Assessment of discrimination performance at the level of the individual and at the level of call led to different conclusions. Hence, studies interested in individual discrimination should optimize methods at the level of individuals. The description of calls by their frequency modulation leads to the best discrimination performance. In agreement with our expectations, discrimination performance decreased with population size. Increasing the number of calls per individual linearly increased the discrimination of individuals (but not the discrimination of calls), likely because it allows distinction between individuals with very similar calls. The available pre-screening index does not allow precise estimation of the population size that could be reliably monitored. Overall, projects applying acoustic monitoring at the individual level in population need to consider limitations regarding the population size that can be reliably monitored and fine-tune their methods according to their needs and limitations. PMID:28486488
Development of echolocation calls and neural selectivity for echolocation calls in the pallid bat.
Razak, Khaleel A; Fuzessery, Zoltan M
2015-10-01
Studies of birdsongs and neural selectivity for songs have provided important insights into principles of concurrent behavioral and auditory system development. Relatively little is known about mammalian auditory system development in terms of vocalizations or other behaviorally relevant sounds. This review suggests echolocating bats are suitable mammalian model systems to understand development of auditory behaviors. The simplicity of echolocation calls with known behavioral relevance and strong neural selectivity provides a platform to address how natural experience shapes cortical receptive field (RF) mechanisms. We summarize recent studies in the pallid bat that followed development of echolocation calls and cortical processing of such calls. We also discuss similar studies in the mustached bat for comparison. These studies suggest: (1) there are different developmental sensitive periods for different acoustic features of the same vocalization. The underlying basis is the capacity for some components of the RF to be modified independent of others. Some RF computations and maps involved in call processing are present even before the cochlea is mature and well before use of echolocation in flight. Others develop over a much longer time course. (2) Normal experience is required not just for refinement, but also for maintenance, of response properties that develop in an experience independent manner. (3) Experience utilizes millisecond range changes in timing of inhibitory and excitatory RF components as substrates to shape vocalization selectivity. We suggest that bat species and call diversity provide a unique opportunity to address developmental constraints in the evolution of neural mechanisms of vocalization processing. © 2014 Wiley Periodicals, Inc.
Lotfy, Hayam Mahmoud; Salem, Hesham; Abdelkawy, Mohammad; Samir, Ahmed
2015-04-05
Five spectrophotometric methods were successfully developed and validated for the determination of betamethasone valerate and fusidic acid in their binary mixture. These methods are the isoabsorptive point method combined with the first derivative (ISO Point-D1); the recently developed and well-established ratio difference (RD) and constant center coupled with spectrum subtraction (CC) methods; and derivative ratio (1DD) and mean centering of ratio spectra (MCR). A new enrichment technique, called the spectrum addition technique, was used instead of the traditional spiking technique. The proposed spectrophotometric procedures do not require any separation steps. Accuracy, precision and linearity ranges of the proposed methods were determined, and specificity was assessed by analyzing synthetic mixtures of both drugs. The methods were applied to the pharmaceutical formulation, and the results obtained were statistically compared to those of the official methods. The statistical comparison showed no significant difference between the proposed methods and the official ones regarding both accuracy and precision. Copyright © 2015 Elsevier B.V. All rights reserved.
Le Bras, Ronan J; Kuzma, Heidi; Sucic, Victor; Bokelmann, Götz
2016-05-01
A notable sequence of calls was encountered, spanning several days in January 2003, in the central part of the Indian Ocean on a hydrophone triplet recording acoustic data at a 250 Hz sampling rate. This paper presents signal processing methods applied to the waveform data to detect and group the recorded signals and to extract amplitude and bearing estimates for them. An approximate location for the source of the sequence of calls is inferred from the extracted features. As the source approaches the hydrophone triplet, the source level (SL) of the calls is estimated at 187 ± 6 dB re: 1 μPa-1 m in the 15-60 Hz frequency range. The calls are attributed to a subgroup of blue whales, Balaenoptera musculus, with a characteristic acoustic signature. A Bayesian location method using probabilistic models for bearing and amplitude is demonstrated on the call sequence. The method is applied to the case of detection at a single triad of hydrophones and results in a probability distribution map for the origin of the calls. It can be extended to detections at multiple triads, and because of the Bayesian formulation, additional modeling complexity can be built in as needed.
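A toy version of the single-triad Bayesian map: a flat prior over a local grid multiplied by independent von Mises likelihoods for the measured bearings. The amplitude/transmission-loss term that the paper combines with bearing is omitted, and the concentration parameter and bearings below are illustrative.

```python
import numpy as np

def bearing_posterior(grid_x, grid_y, sensor_xy, bearings_deg, kappa=50.0):
    """Posterior map over a local x-y grid (km) for a call source heard at
    one hydrophone triad: flat prior times independent von Mises
    likelihoods on the measured bearings (clockwise from north)."""
    X, Y = np.meshgrid(grid_x, grid_y)
    pred = np.arctan2(X - sensor_xy[0], Y - sensor_xy[1])  # predicted bearing
    log_post = np.zeros_like(X)
    for b in np.radians(bearings_deg):
        log_post += kappa * np.cos(pred - b)               # von Mises log-likelihood
    log_post -= log_post.max()                             # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

grid = np.linspace(-100, 100, 201)                         # km around the triad
post = bearing_posterior(grid, grid, (0.0, 0.0), [41.5, 42.8, 40.9])
```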
L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne
2018-01-01
Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries.
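The outcome rates above follow AAPOR-style definitions; a simplified version of that arithmetic is sketched below with hypothetical disposition counts (e is the assumed eligibility rate among numbers of unknown status), not the survey's actual dispositions.

```python
def aapor_rates(complete, partial, refusal, noncontact, other, unknown, e=1.0):
    """Simplified AAPOR-style outcome rates from call dispositions."""
    eligible = complete + partial + refusal + noncontact + other
    denom = eligible + e * unknown
    response = (complete + partial) / denom
    cooperation = (complete + partial) / (complete + partial + refusal + other)
    refusal_rate = refusal / denom
    contact = (complete + partial + refusal + other) / denom
    return response, cooperation, refusal_rate, contact

# Hypothetical dispositions of the same order of magnitude as the survey
print(aapor_rates(complete=9469, partial=3547, refusal=2900,
                  noncontact=24000, other=500, unknown=1500))
```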
Social Communication and Vocal Recognition in Free-Ranging Rhesus Monkeys
NASA Astrophysics Data System (ADS)
Rendall, Christopher Andrew
Kinship and individual identity are key determinants of primate sociality, and the capacity for vocal recognition of individuals and kin is hypothesized to be an important adaptation facilitating intra-group social communication. Research was conducted on adult female rhesus monkeys on Cayo Santiago, Puerto Rico to test this hypothesis for three acoustically distinct calls characterized by varying selective pressures on communicating identity: coos (contact calls), grunts (close range social calls), and noisy screams (agonistic recruitment calls). Vocalization playback experiments confirmed a capacity for both individual and kin recognition of coos, but not screams (grunts were not tested). Acoustic analyses, using traditional spectrographic methods as well as linear predictive coding techniques, indicated that coos (but not grunts or screams) were highly distinctive, and that the effects of vocal tract filtering--formants--contributed more to statistical discriminations of both individuals and kin groups than did temporal or laryngeal source features. Formants were identified from very short (23 ms) segments of coos and were stable within calls, indicating that formant cues to individual and kin identity were available throughout a call. This aspect of formant cues is predicted to be an especially important design feature for signaling identity efficiently in complex acoustic environments. Results of playback experiments involving manipulated coo stimuli provided preliminary perceptual support for the statistical inference that formant cues take precedence in facilitating vocal recognition. The similarity of formants among female kin suggested a mechanism for the development of matrilineal vocal signatures from the genetic and environmental determinants of vocal tract morphology shared among relatives. The fact that screams--calls strongly expected to communicate identity--were neither individually distinctive nor recognized suggested the possibility that their acoustic structure and role in signaling identity might be constrained by functional or morphological design requirements associated with their role in signaling submission.
A Data‐Rich Recruitment Core to Support Translational Clinical Research
Corregano, Lauren M.; Rainer, Tyler‐Lauren; Melendez, Caroline; Coller, Barry S.
2014-01-01
Abstract Background Underenrollment of clinical studies wastes resources and delays assessment of research discoveries. We describe the organization and impact of a centralized recruitment core delivering comprehensive recruitment support to investigators. Methods The Rockefeller University Center for Clinical and Translational Science supports a centralized recruitment core, call center, Research Volunteer Repository, data infrastructure, and staff who provide expert recruitment services to investigators. During protocol development, consultations aim to optimize enrollment feasibility, develop recruitment strategy, budget, and advertising. Services during study conduct include advertising placement, repository queries, call management, prescreening, referral, and visit scheduling. Utilization and recruitment outcomes are tracked using dedicated software. Results For protocols receiving recruitment services during 2009–2013: median time from initiation of recruitment to the first enrolled participant was 10 days; of 4,047 first‐time callers to the call center, 92% (n = 3,722) enrolled in the Research Volunteer Repository, with 99% retention; 23% of Repository enrollees subsequently enrolled in ≥1 research studies, with 89% retention. Of volunteers referred by repository queries, 49% (280/537) enrolled into the study, with 92% retained. Conclusions Provision of robust recruitment infrastructure including expertise, a volunteer repository, data capture and real‐time analysis accelerates protocol accrual. Application of recruitment science improves the quality of clinical investigation. PMID:25381717
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G.
1992-01-01
An overview is presented of government contributions to the program called Design Analysis Methods for Vibrations (DAMV) which attempted to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling program for the CH-47D tandem-rotor helicopter. The DAMV program emphasized four areas including: airframe finite-element modeling, difficult components studies, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details to improving correlation, and the findings provide the basis for an improved finite-element-based dynamics design-analysis capability.
Using Approximate Bayesian Computation to infer sex ratios from acoustic data.
Lehnen, Lisa; Schorcht, Wigbert; Karst, Inken; Biedermann, Martin; Kerth, Gerald; Puechmaille, Sebastien J
2018-01-01
Population sex ratios are of high ecological relevance, but are challenging to determine in species lacking conspicuous external cues indicating their sex. Acoustic sexing is an option if vocalizations differ between sexes, but is precluded by overlapping distributions of the values of male and female vocalizations in many species. A method allowing the inference of sex ratios despite such an overlap will therefore greatly increase the information extractable from acoustic data. To meet this demand, we developed a novel approach using Approximate Bayesian Computation (ABC) to infer the sex ratio of populations from acoustic data. Additionally, parameters characterizing the male and female distribution of acoustic values (mean and standard deviation) are inferred. This information is then used to probabilistically assign a sex to a single acoustic signal. We furthermore develop a simpler means of sex ratio estimation based on the exclusion of calls from the overlap zone. Applying our methods to simulated data demonstrates that sex ratio and acoustic parameter characteristics of males and females are reliably inferred by the ABC approach. Applying both the ABC and the exclusion method to empirical datasets (echolocation calls recorded in colonies of lesser horseshoe bats, Rhinolophus hipposideros) provides similar sex ratios as molecular sexing. Our methods aim to facilitate evidence-based conservation, and to benefit scientists investigating ecological or conservation questions related to sex- or group specific behaviour across a wide range of organisms emitting acoustic signals. The developed methodology is non-invasive, low-cost and time-efficient, thus allowing the study of many sites and individuals. We provide an R-script for the easy application of the method and discuss potential future extensions and fields of applications. The script can be easily adapted to account for numerous biological systems by adjusting the type and number of groups to be distinguished (e.g. age, social rank, cryptic species) and the acoustic parameters investigated.
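A minimal rejection-ABC sketch of the approach, assuming male and female calls differ in a single acoustic parameter modelled as two normal distributions; the priors, summary statistics, and tolerance below are illustrative rather than those of the published analysis.

```python
import numpy as np

def abc_sex_ratio(calls, n_sims=100_000, tol=0.02, seed=0):
    """Rejection ABC for the male fraction in a recorded call sample: draw a
    candidate sex ratio and male/female acoustic distributions, simulate a
    call sample, and keep the draw when its summary statistics (mean, sd,
    skewness) are within tol (relative) of the observed ones."""
    rng = np.random.default_rng(seed)
    obs = np.asarray(calls, float)

    def summaries(x):
        s = x.std()
        return np.array([x.mean(), s, ((x - x.mean())**3).mean() / s**3])

    s_obs = summaries(obs)
    kept = []
    for _ in range(n_sims):
        p = rng.uniform(0, 1)                              # male fraction (prior)
        mu_m, mu_f = rng.normal(20, 2), rng.normal(23, 2)  # e.g. peak freq, kHz
        sd = rng.uniform(0.5, 2.0)
        is_male = rng.random(obs.size) < p
        sim = np.where(is_male,
                       rng.normal(mu_m, sd, obs.size),
                       rng.normal(mu_f, sd, obs.size))
        if np.abs((summaries(sim) - s_obs) / s_obs).max() < tol:
            kept.append(p)
    return np.array(kept)                                  # approximate posterior
```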
The ECCO Family of State Estimates: An Overview
NASA Astrophysics Data System (ADS)
Wunsch, C.
2008-12-01
The idea of ECCO (Estimating the Circulation and Climate of the Ocean) originated in the middle 1980s, when it became apparent that a global oceanographic observing system for the general circulation would become a reality, as it did through the World Ocean Circulation Experiment. Observational design involved extremely diverse technologies and oceanic flow regimes. To be physically interpretable, these diverse data and physical processes would need to be combined into a useful, coherent whole. Such a synthesis can only be done with a skillful GCM having useful resolution. ECCO originated as an experiment to demonstrate the technical feasibility of such a synthesis and to determine whether any of several possible methods was preferable. In contrast to a number of other superficially similar efforts, mainly derived from weather forecasting methods, the ECCO goal was to estimate the long-term circulation mean and its variability on climate (decadal and longer) time scales in a form exactly satisfying known equations of motion. ECCO was made feasible by the simultaneous construction of a new GCM (MIT) and the development of an automatic differentiation (AD) software tool (now called TAF), which rendered practical the method of Lagrange multipliers (called the adjoint method in oceanography). Parallel development of simplified sequential methods (smoothers) provided an alternative, also practical, methodology. One can now use the existing (publicly available) machinery to discuss the ocean circulation and its variability. The huge variety of issues connected with the global circulation has meant that an entire family of estimates has grown up, each having different emphases: some primarily global, some primarily regional (the tropics, the Southern Ocean), and some focussed on physics (the role of eddies or sea ice). The methodology leads, usefully, to intense scrutiny of data and model errors and spatio-temporal coverage. As with any estimation problem, no uniquely 'correct' solution is now or ever going to be possible--only evolving best estimates. Further development of these and similar methodologies appears to be a necessary, inevitable, and growing component of oceanography and climate science.
NASA Technical Reports Server (NTRS)
Deavours, Daniel D.; Qureshi, M. Akber; Sanders, William H.
1997-01-01
Modeling tools and technologies are important for aerospace development. At the University of Illinois, we have worked on advancing the state of the art in modeling with Markov reward models in two important areas: reducing the memory necessary to numerically solve systems represented as stochastic activity networks and other stochastic Petri net extensions while still obtaining solutions in a reasonable amount of time, and finding numerically stable and memory-efficient methods to solve for the reward accumulated during a finite mission time. A long-standing problem when modeling with high-level formalisms such as stochastic activity networks is the so-called state space explosion, where the number of states increases exponentially with the size of the high-level model. Thus, the corresponding Markov model becomes prohibitively large and solution is constrained by the size of primary memory. To reduce the memory necessary to numerically solve complex systems, we propose new methods that can tolerate such large state spaces and that do not require any special structure in the model (as many other techniques do). First, we develop methods that generate rows and columns of the state transition-rate matrix on-the-fly, eliminating the need to explicitly store the matrix at all. Next, we introduce a new iterative solution method, called modified adaptive Gauss-Seidel, that exhibits locality in its use of data from the state transition-rate matrix, permitting us to cache portions of the matrix and hence reduce the solution time. Finally, we develop a new memory- and computationally efficient technique for Gauss-Seidel-based solvers that avoids the need for generating rows of A in order to solve Ax = b. This is a significant performance improvement for on-the-fly methods as well as for other recent solution techniques based on Kronecker operators. Taken together, these new results show that one can solve very large models without any special structure.
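A toy version of the on-the-fly idea: Gauss-Seidel for Ax = b in which rows of A are generated on request by a callback and never stored. The cited work adds row caching, sweep reordering (modified adaptive Gauss-Seidel), and techniques that avoid row generation altogether; none of that is reproduced here.

```python
import numpy as np

def gauss_seidel_on_the_fly(row_fn, b, n, sweeps=200, tol=1e-10):
    """Gauss-Seidel for Ax = b where A is never stored: row_fn(i) returns
    the nonzero entries of row i as (indices, values), generated on demand
    from a higher-level model description."""
    x = np.zeros(n)
    for _ in range(sweeps):
        delta = 0.0
        for i in range(n):
            idx, vals = row_fn(i)
            diag = vals[idx == i][0]
            off = sum(v * x[j] for j, v in zip(idx, vals) if j != i)
            new = (b[i] - off) / diag
            delta = max(delta, abs(new - x[i]))
            x[i] = new
        if delta < tol:
            break
    return x

# Tiny demo: rows of a 1-D Laplacian generated on request, never stored
def row_fn(i, n=50):
    idx = [j for j in (i - 1, i, i + 1) if 0 <= j < n]
    return np.array(idx), np.array([2.0 if j == i else -1.0 for j in idx])

x = gauss_seidel_on_the_fly(row_fn, b=np.ones(50), n=50)
```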
NASA Astrophysics Data System (ADS)
Ma, Hélène; Gronchi, Giovanni F.
2014-07-01
We present a new method of preliminary orbit determination for space debris using radar observations, which we call Infang. We can perform a linkage of two sets of four observations collected at close times. The context is characterized by the accuracy of the range ρ, whereas the right ascension α and the declination δ are much more inaccurate due to observational errors. This method can correct α, δ, assuming exact knowledge of the range ρ. Considering no perturbations from the J2 effect, but including errors in the observations, we compare the new method, the classical method of Gibbs, and the more recent Keplerian integrals method. The development of Infang is ongoing and it will be further improved and tested.
Shum, Bennett O V; Henner, Ilya; Belluoccio, Daniele; Hinchcliffe, Marcus J
2017-07-01
The sensitivity and specificity of next-generation sequencing laboratory developed tests (LDTs) are typically determined by an analyte-specific approach. Analyte-specific validations use disease-specific controls to assess an LDT's ability to detect known pathogenic variants. Alternatively, a methods-based approach can be used for LDT technical validations. Methods-focused validations do not use disease-specific controls but use benchmark reference DNA that contains known variants (benign, variants of unknown significance, and pathogenic) to assess variant calling accuracy of a next-generation sequencing workflow. Recently, four whole-genome reference materials (RMs) from the National Institute of Standards and Technology (NIST) were released to standardize methods-based validations of next-generation sequencing panels across laboratories. We provide a practical method for using NIST RMs to validate multigene panels. We analyzed the utility of RMs in validating a novel newborn screening test that targets 70 genes, called NEO1. Despite the NIST RM variant truth set originating from multiple sequencing platforms, replicates, and library types, we discovered a 5.2% false-negative variant detection rate in the RM truth set genes that were assessed in our validation. We developed a strategy using complementary non-RM controls to demonstrate 99.6% sensitivity of the NEO1 test in detecting variants. Our findings have implications for laboratories or proficiency testing organizations using whole-genome NIST RMs for testing. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
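At its simplest, a methods-based comparison against a truth set is set arithmetic over variant keys, as sketched below; production comparisons also normalize variant representation (e.g., indel left-alignment), which this toy version ignores, and the example variants are hypothetical.

```python
def variant_call_metrics(truth, called):
    """Score a test call set against a truth set, with variants keyed as
    (chrom, pos, ref, alt) tuples."""
    truth, called = set(truth), set(called)
    tp = len(truth & called)           # true positives
    fn = len(truth - called)           # missed truth variants
    fp = len(called - truth)           # calls absent from the truth set
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    fnr = fn / (tp + fn) if tp + fn else float("nan")
    precision = tp / (tp + fp) if tp + fp else float("nan")
    return sensitivity, fnr, precision

truth = {("1", 1001, "A", "G"), ("2", 5002, "T", "C"), ("7", 880, "G", "A")}
called = {("1", 1001, "A", "G"), ("7", 880, "G", "A"), ("9", 42, "C", "T")}
print(variant_call_metrics(truth, called))   # (0.667, 0.333, 0.667)
```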
Incentives for new antibiotics: the Options Market for Antibiotics (OMA) model
2013-01-01
Background Antimicrobial resistance is a growing threat resulting from the convergence of biological, economic and political pressures. Investment in research and development of new antimicrobials has suffered secondary to these pressures, leading to an emerging crisis in antibiotic resistance. Methods Current policies to stimulate antibiotic development have proven inadequate to overcome market failures. Therefore innovative ideas utilizing market forces are necessary to stimulate new investment efforts. Employing the benefits of both the previously described Advanced Market Commitment and a refined Call Options for Vaccines model, we describe herein a novel incentive mechanism, the Options Market for Antibiotics. Results This model applies the benefits of a financial call option to the investment in and purchase of new antibiotics. The goal of this new model is to provide an effective mechanism for early investment and risk sharing while maintaining a credible purchase commitment and incentives for companies to ultimately bring new antibiotics to market. Conclusions We believe that the Options Market for Antibiotics (OMA) may help to overcome some of the traditional market failures associated with the development of new antibiotics. Additional work must be done to develop a more robust mathematical model to pave the way for practical implementation. PMID:24199835
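For readers unfamiliar with the financial instrument being borrowed, the sketch below gives the textbook call-option payoff and Black-Scholes price; it illustrates the analogy only (funders buy options on antibiotics in development at a discount and exercise only for drugs that reach market) and is not the OMA pricing model, which the authors note remains to be developed. All parameter values are illustrative.

```python
import math

def bs_call_price(S, K, T, r, sigma):
    """Black-Scholes price of a European call: what a buyer pays now for the
    right (not the obligation) to buy at strike K at time T."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

payoff = lambda S_T, K: max(S_T - K, 0.0)       # option value at exercise
print(bs_call_price(S=100.0, K=110.0, T=5.0, r=0.03, sigma=0.4))
```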
2015-12-01
conducted a second round of focus groups in early 2013, designed as group self-administered pre-tests followed by a group debriefing. The first... procedures for team communication and coordination (month 1): Completed. A listserve was developed for the group early on. Bi-weekly conference calls were... group has developed an automated 2D method. Figure 7 shows the automated 2D (area) results on the same dataset presented in Task 4 (figures 1 and 2).
Free-Swinging Failure Tolerance for Robotic Manipulators. Degree awarded by Purdue Univ.
NASA Technical Reports Server (NTRS)
English, James
1997-01-01
Under this GSRP fellowship, software-based failure-tolerance techniques were developed for robotic manipulators. The focus was on failures characterized by the loss of actuator torque at a joint, called free-swinging failures. The research results spanned many aspects of the free-swinging failure-tolerance problem, from preparing for an expected failure to discovery of postfailure capabilities to establishing efficient methods to realize those capabilities. Developed algorithms were verified using computer-based dynamic simulations, and these were further verified using hardware experiments at Johnson Space Center.
McCarthy, Alun
2011-09-01
Pharmacogenomic Innovative Solutions Ltd (PGXIS) was established in 2007 by a group of pharmacogenomic (PGx) experts to make their expertise available to biotechnology and pharmaceutical companies. PGXIS has subsequently established a network of experts to broaden its access to relevant PGx knowledge and technologies. In addition, it has developed a novel multivariate analysis method called Taxonomy3 which is both a data integration tool and a targeting tool. Together with siRNA methodology from CytoPathfinder Inc., PGXIS now has an extensive range of diverse PGx methodologies focused on enhancing drug development.
[Occupational myofibrosis - main aspects of clinics, diagnosis and treatment].
Popov, A V; Ulanovskaya, E V
2013-01-01
Occupational chronic myofibrosis is a disease resulting from physical overstrain and functional overload of the upper extremities and shoulder girdle, and is among the most prevalent occupational diseases related to the so-called "working hand". Myofibrosis occurs among persons employed in virtually all industries, construction, and agriculture, and may develop as an isolated disease or in combination with other occupational diseases of the musculoskeletal and peripheral nervous systems. Problems of diagnosis, especially at the early stage of the disease, and the development of new methods of treatment remain topical today.
1986-04-01
realism here, and I need to have it from the Board. Federal basic research and development is simply not a winner. That doesn't mean we cannot find ways... CERC. The wave height across the surf zone was measured by a cinematic remote sensing technique developed in Japan (Hotta and Mizuguchi, 1980). The cinematic wave measurement method is under study for adaptation at CERC, where it is called the "photo
Adaptive strategy for joint measurements
NASA Astrophysics Data System (ADS)
Uola, Roope; Luoma, Kimmo; Moroder, Tobias; Heinosaari, Teiko
2016-08-01
We develop a technique to find simultaneous measurements for noisy quantum observables in finite-dimensional Hilbert spaces. We use the method to derive lower bounds for the noise needed to make incompatible measurements jointly measurable. Using our strategy together with recent developments in the field of one-sided quantum information processing, we show that the attained lower bounds are tight for various symmetric sets of quantum measurements. We use this characterisation to prove the existence of so-called 4-Specker sets, i.e., sets of four incompatible observables with compatible subsets, in the qubit case.
Dienes, Keith
2018-01-10
We are currently in the throes of a potentially huge paradigm shift in physics. Motivated by recent developments in string theory and the discovery of the so-called "string landscape", physicists are beginning to question the uniqueness of fundamental theories of physics and the methods by which such theories might be understood and investigated. In this colloquium, I will give a non-technical introduction to the nature of this paradigm shift and how it developed. I will also discuss some of the questions to which it has led, and the nature of the controversies it has spawned.
Towards Model-Driven End-User Development in CALL
ERIC Educational Resources Information Center
Farmer, Rod; Gruba, Paul
2006-01-01
The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…
Passive Acoustic Methods for Tracking Marine Mammals Using Widely-Spaced Bottom-Mounted Hydrophones
2011-10-26
standard time-of-arrival (TOA) tracking methods fail. Clicks and long duration calls (whistles or baleen whale calls) were both considered. Methods... Evaluation Center (AUTEC) and the Pacific Missile Range Facility (PMRF). Beaked whales, minke whales, humpback whales, and sperm whales were the main species... of interest. Subject terms: passive acoustic monitoring, localization, tracking, minke whale, beaked whale, sperm whale, humpback whale.
12 CFR 334.25 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2011 CFR
2011-01-01
... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...
12 CFR 334.25 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2012 CFR
2012-01-01
... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...
12 CFR 334.25 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2013 CFR
2013-01-01
... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...
12 CFR 334.25 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2014 CFR
2014-01-01
... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...
12 CFR 334.25 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2010 CFR
2010-01-01
... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...
Conduits to care: call lights and patients' perceptions of communication.
Montie, Mary; Shuman, Clayton; Galinato, Jose; Patak, Lance; Anderson, Christine A; Titler, Marita G
2017-01-01
Call light systems remain the primary means for hospitalized patients to initiate communication with their health care providers. Although there is a vast amount of literature discussing patient communication with health care providers, few studies have explored patients' perceptions concerning call light use and communication. The specific aim of this study was to solicit patients' perceptions regarding their call light use and communication with nursing staff. Patients invited to this study met the following inclusion criteria: proficient in English, hospitalized for at least 24 hours, aged ≥21 years, and able to communicate verbally (eg, not intubated). Thirty participants provided written informed consent, were enrolled in the study, and completed interviews. Using qualitative descriptive methods, five major themes emerged from patients' perceptions: establishing connectivity, participant safety concerns, no separation between health care and the call light device, issues with the current call light, and participants' perceptions of "nurse work". Multiple minor themes supported these major themes. Data analysis utilized the constant comparative methods of Glaser and Strauss. Findings from this study extend knowledge of patients' understanding not only of why inconsistencies occur between the call light and their nurses, but also of why the call light is more than merely a device to initiate communication; rather, it is a direct conduit to their health care and its delivery.
Method and system for conserving power in a telecommunications network during emergency situations
Conrad, Stephen H [Algodones, NM; O'Reilly, Gerard P [Manalapan, NJ
2011-10-11
Disclosed is a method and apparatus for conserving power in a telecommunications network during emergency situations. A permissible number list of emergency and/or priority numbers is stored in the telecommunications network. In the event of an emergency or power failure, input digits of a call to the telecommunications network are compared to the permissible number list. The call is processed in the telecommunications network and routed to its destination if the input digits match an entry in the permissible number list. The call is dropped without any further processing if the input digits do not match an entry in the permissible number list. Thus, power can be conserved in emergency situations by only allowing emergency and/or priority calls.
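A minimal sketch of the screening logic described in this patent (Python; the permissible numbers and digit format are illustrative assumptions, not the patented list):

    # Hypothetical permissible-number list; a real deployment would provision
    # emergency and priority numbers from the network operator.
    PERMISSIBLE_NUMBERS = {"911", "5551000", "5559999"}

    def route_call(input_digits: str) -> bool:
        """Process and route the call only if the dialed digits match the
        permissible list; otherwise drop it immediately to conserve power."""
        return input_digits in PERMISSIBLE_NUMBERS

    # During an emergency, only listed calls are completed:
    assert route_call("911") is True
    assert route_call("5551234") is False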
ERIC Educational Resources Information Center
Hamel, Marie-Josee; Caws, Catherine
2010-01-01
This article discusses CALL development from both educational and ergonomic perspectives. It focuses on the learner-task-tool interaction, in particular on the aspects contributing to its overall quality, herein called "usability." Two pilot studies are described that were carried out with intermediate to advanced learners of French in two…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patton, A.D.; Ayoub, A.K.; Singh, C.
1982-07-01
Existing methods for generating capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations, including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models that do not recognize operating considerations.
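For orientation, a minimal sketch of the traditional snapshot state-sampling calculation that such models refine (Python; the unit capacities, forced outage rates, and load are illustrative assumptions, not parameters of GENESIS, OPCON, or OPPLAN, and no operating considerations are modeled):

    import random

    UNITS = [(200, 0.05), (150, 0.08)]  # (capacity in MW, forced outage rate)
    LOAD = 300                          # MW demand for this snapshot

    def lolp(trials=100_000, seed=1):
        """Estimate loss-of-load probability by sampling unit up/down states."""
        rng = random.Random(seed)
        shortfalls = 0
        for _ in range(trials):
            # Each unit is available with probability (1 - forced outage rate).
            capacity = sum(c for c, q in UNITS if rng.random() > q)
            shortfalls += capacity < LOAD
        return shortfalls / trials

    print(f"LOLP ~ {lolp():.4f}")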
Lane, India F; Strand, Elizabeth
2008-01-01
Missing in the recent calls for accountability and assurance of veterinary students' clinical competence are similar calls for competence in clinical teaching. Most clinician educators have no formal training in teaching theory or method. At the University of Tennessee College of Veterinary Medicine (UTCVM), we have initiated multiple strategies to enhance the quality of teaching in our curriculum and in clinical settings. An interview study of veterinary faculty was completed to investigate the strengths and weaknesses of clinical education; findings were used in part to prepare a professional development program in clinical teaching. Centered on principles of effective feedback, the program prepares participants to organize clinical rotation structure and orientation, maximize teaching moments, improve teaching and participation during formal rounds, and provide clearer summative feedback to students at the end of a rotation. The program benefits from being situated within a larger college-wide focus on teaching improvement. We expect the program's audience and scope to continue to expand.
3D automatic anatomy recognition based on iterative graph-cut-ASM
NASA Astrophysics Data System (ADS)
Chen, Xinjian; Udupa, Jayaram K.; Bagci, Ulas; Alavi, Abass; Torigian, Drew A.
2010-02-01
We call the computerized assistive process of recognizing, delineating, and quantifying organs and tissue regions in medical imaging, occurring automatically during clinical image interpretation, automatic anatomy recognition (AAR). The AAR system we are developing includes five main parts: model building, object recognition, object delineation, pathology detection, and organ system quantification. In this paper, we focus on the delineation part. For the modeling part, we employ the active shape model (ASM) strategy. For recognition and delineation, we integrate several hybrid strategies combining purely image-based methods with ASM. In this paper, an iterative Graph-Cut ASM (IGCASM) method is proposed for object delineation. An algorithm called GC-ASM, which attempted to synergistically combine ASM and GC, was presented at this symposium last year for object delineation in 2D images. Here, we extend this method to 3D medical image delineation. The IGCASM method effectively combines the rich statistical shape information embodied in ASM with the globally optimal delineation capability of the GC method. We propose a new GC cost function, which effectively integrates the specific image information with the ASM shape model information. The proposed methods are tested on a clinical abdominal CT data set. The preliminary results show that: (a) it is feasible to explicitly bring prior 3D statistical shape information into the GC framework; (b) the 3D IGCASM delineation method improves on ASM and GC and can provide practical operational time on clinical images.
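The abstract does not reproduce the new cost function, but the generic structure of a graph-cut energy augmented with a shape prior conveys the idea (our schematic notation, with lambda and mu as illustrative weights):

\[
E(L) \;=\; \sum_{p \in P} D_p(L_p) \;+\; \lambda \sum_{(p,q) \in N} V_{pq}(L_p, L_q) \;+\; \mu \sum_{p \in P} S_p(L_p),
\]

where D_p is the image data term for assigning label L_p to voxel p, V_pq penalizes label discontinuities between neighboring voxels, and S_p is a shape term derived from the current ASM fit, iterated until the segmentation stabilizes.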
Developing CALL to Meet the Needs of Language Teaching and Learning
ERIC Educational Resources Information Center
Jiang, Zhaofeng
2008-01-01
This paper illustrates the advantages and disadvantages of CALL. It points out that CALL is influenced by traditional language teaching and learning approaches to some extent. It concludes that what is important in our university system is that CALL design and implementation should match the users' needs, since CALL is not always better than…
Measurement equivalence and differential item functioning in family psychology.
Bingenheimer, Jeffrey B; Raudenbush, Stephen W; Leventhal, Tama; Brooks-Gunn, Jeanne
2005-09-01
Several hypotheses in family psychology involve comparisons of sociocultural groups. Yet the potential for cross-cultural inequivalence in widely used psychological measurement instruments threatens the validity of inferences about group differences. Methods for dealing with these issues have been developed via the framework of item response theory. These methods deal with an important type of measurement inequivalence, called differential item functioning (DIF). The authors introduce DIF analytic methods, linking them to a well-established framework for conceptualizing cross-cultural measurement equivalence in psychology (C.H. Hui and H.C. Triandis, 1985). They illustrate the use of DIF methods using data from the Project on Human Development in Chicago Neighborhoods (PHDCN). Focusing on the Caregiver Warmth and Environmental Organization scales from the PHDCN's adaptation of the Home Observation for Measurement of the Environment Inventory, the authors obtain results that exemplify the range of outcomes that may result when these methods are applied to psychological measurement instruments. (c) 2005 APA, all rights reserved
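As a schematic of the IRT framing (our notation, not the authors' exact specification), a two-parameter logistic model with group-specific item parameters makes DIF concrete:

\[
P\left(Y_{ij} = 1 \mid \theta_i,\, g_i = g\right) \;=\; \frac{1}{1 + \exp\!\left[-a_{jg}\left(\theta_i - b_{jg}\right)\right]},
\]

where theta_i is respondent i's latent trait and g_i the sociocultural group. Item j exhibits DIF when its discrimination a_jg or difficulty b_jg differs across groups for respondents matched on theta.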
Scoring systems for the Clock Drawing Test: A historical review
Spenciere, Bárbara; Alves, Heloisa; Charchat-Fichman, Helenice
2017-01-01
The Clock Drawing Test (CDT) is a simple neuropsychological screening instrument that is well accepted by patients and has solid psychometric properties. Several different CDT scoring methods have been developed, but no consensus has been reached regarding which scoring method is the most accurate. This article reviews the literature on these scoring systems and the changes they have undergone over the years. Historically, different types of scoring systems emerged. Initially, the focus was on screening for dementia, and the methods were both quantitative and semi-quantitative. Later, the need for an early diagnosis called for a scoring system that can detect subtle errors, especially those related to executive function. Therefore, qualitative analyses began to be used for both differential and early diagnoses of dementia. A widely used qualitative method was proposed by Rouleau et al. (1992). Tracing the historical path of these scoring methods is important for developing additional scoring systems and furthering dementia prevention research. PMID:29213488
The pseudo-Boolean optimization approach to form the N-version software structure
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Kovalev, D. I.; Zelenkov, P. V.; Voroshilova, A. A.
2015-10-01
The problem of developing an optimal structure for an N-version software system is a very complex optimization problem, which makes deterministic optimization methods inappropriate for solving it. In this view, exploiting heuristic strategies looks more rational. In the field of pseudo-Boolean optimization theory, the so-called method of varied probabilities (MVP) has been developed to solve problems with a large dimensionality. Some additional modifications of MVP have been made to solve the problem of N-version systems design. Those algorithms take into account the discovered specific features of the objective function. Practical experiments have shown the advantage of using these algorithm modifications because they reduce the search space.
Face pose tracking using the four-point algorithm
NASA Astrophysics Data System (ADS)
Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen
2017-06-01
In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.
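A minimal sketch of such a pipeline (Python), assuming dlib's 68-point landmark model and an OpenCV P3P-family solver stand in for the paper's four-point algorithm; the model file path, the choice of four landmarks, and the 3D head-model coordinates are illustrative:

    import cv2
    import dlib
    import numpy as np

    detector = dlib.get_frontal_face_detector()
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    # Nominal 3D positions (mm) of four facial points in a generic head model.
    MODEL_POINTS = np.array([
        [0.0, 0.0, 0.0],       # nose tip
        [0.0, -63.6, -12.5],   # chin
        [-43.3, 32.7, -26.0],  # left eye outer corner
        [43.3, 32.7, -26.0],   # right eye outer corner
    ], dtype=np.float64)
    LANDMARK_IDS = [30, 8, 36, 45]  # indices in the 68-point scheme

    def estimate_pose(frame_bgr, camera_matrix):
        """Detect a face, pick four landmarks, and solve for pose via PnP."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        faces = detector(gray)
        if not faces:
            return None
        shape = predictor(gray, faces[0])
        image_points = np.array(
            [[shape.part(i).x, shape.part(i).y] for i in LANDMARK_IDS],
            dtype=np.float64)
        ok, rvec, tvec = cv2.solvePnP(
            MODEL_POINTS, image_points, camera_matrix, None,
            flags=cv2.SOLVEPNP_AP3P)  # P3P-family solver suited to 4 points
        return (rvec, tvec) if ok else None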
A fast bottom-up algorithm for computing the cut sets of noncoherent fault trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corynen, G.C.
1987-11-01
An efficient procedure for finding the cut sets of large fault trees has been developed. Designed to address coherent or noncoherent systems, dependent events, and shared or common-cause events, the method - called SHORTCUT - is based on a fast algorithm for transforming a noncoherent tree into a quasi-coherent tree (COHERE), and on a new algorithm for reducing cut sets (SUBSET). To assure sufficient clarity and precision, the procedure is discussed in the language of simple sets, which is also developed in this report. Although the new method has not yet been fully implemented on the computer, we report theoretical worst-case estimates of its computational complexity. 12 refs., 10 figs.
Huth, John K.; Silvis, Alexander; Moosman, Paul R.; Ford, W. Mark; Sweeten, Sara E.
2015-01-01
Many aspects of the foraging and roosting habitat of Myotis leibii (Eastern Small-Footed Bat), an emergent rock roosting-obligate, are poorly described. Previous comparisons of the effectiveness of acoustic sampling and mist-net captures have not included Eastern Small-Footed Bat. Habitat requirements of this species differ from congeners in the region, and it is unclear whether survey protocols developed for other species are applicable. Using data from three overlapping studies at two sampling sites in western Virginia's central Appalachian Mountains, detection probabilities were examined for three survey methods (acoustic surveys with automated identification of calls, visual searches of rock crevices, and mist-netting) for use in the development of "best practices" for future surveys and monitoring. Observer effects were investigated using an expanded version of the visual search data. Results suggested that acoustic surveys with automated call identification are not effective for documenting presence of Eastern Small-Footed Bats on talus slopes (basal detection rate of 0%), even when the species is known to be present. The broadband, high frequency echolocation calls emitted by Eastern Small-Footed Bat may be prone to attenuation, and this factor, along with signal reflection, lower echolocation rates, or possible misidentification as other bat species over talus slopes, may have contributed to poor acoustic survey success. Visual searches and mist-netting of emergent rock had basal detection probabilities of 91% and 75%, respectively. Success of visual searches varied among observers, but detection probability improved with practice. Additionally, visual searches were considerably more economical than mist-netting.
Identifying Key Words in 9-1-1 Calls for Stroke: A Mixed Methods Approach.
Richards, Christopher T; Wang, Baiyang; Markul, Eddie; Albarran, Frank; Rottman, Doreen; Aggarwal, Neelum T; Lindeman, Patricia; Stein-Spencer, Leslee; Weber, Joseph M; Pearlman, Kenneth S; Tataris, Katie L; Holl, Jane L; Klabjan, Diego; Prabhakaran, Shyam
2017-01-01
Identifying stroke during a 9-1-1 call is critical to timely prehospital care. However, emergency medical dispatchers (EMDs) recognize stroke in less than half of 9-1-1 calls, potentially due to the words used by callers to communicate stroke signs and symptoms. We hypothesized that callers do not typically use words and phrases considered to be classical descriptors of stroke, such as focal neurologic deficits, but that a mixed-methods approach can identify words and phrases commonly used by 9-1-1 callers to describe acute stroke victims. We performed a mixed-methods, retrospective study of 9-1-1 call audio recordings for adult patients with confirmed stroke who were transported by ambulance in a large urban city. Content analysis, a qualitative methodology, and computational linguistics, a quantitative methodology, were used to identify key words and phrases used by 9-1-1 callers to describe acute stroke victims. Because a caller's level of emotional distress contributes to the communication during a 9-1-1 call, the Emotional Content and Cooperation Score was rated by a multidisciplinary team. A total of 110 9-1-1 calls, received between June and September 2013, were analyzed. EMDs recognized stroke in 48% of calls, and the emotional state of most callers (95%) was calm. In 77% of calls in which EMDs recognized stroke, callers specifically used the word "stroke"; however, the word "stroke" was used in only 38% of calls overall. Vague, non-specific words and phrases were used to describe stroke victims' symptoms in 55% of calls, and 45% of callers used distractor words and phrases suggestive of non-stroke emergencies. Focal neurologic symptoms were described in 39% of calls. Computational linguistics identified 9 key words that were more commonly used in calls where the EMD identified stroke; these words were concordant with terms identified through qualitative content analysis. Most 9-1-1 callers used vague, non-specific, or distractor words and phrases and infrequently provided classic stroke descriptions during 9-1-1 calls for stroke. Both qualitative and quantitative methodologies identified similar key words and phrases associated with accurate EMD stroke recognition. This study suggests that tools incorporating commonly used words and phrases could potentially improve EMD stroke recognition.
Hudson, Robyn; Chacha, Jimena; Bánszegi, Oxána; Szenczi, Péter; Rödel, Heiko G
2017-04-01
Study of the development of individuality is often hampered by rapidly changing behavioral repertoires and the need for minimally intrusive tests. We individually tested 33 kittens from eight litters of the domestic cat in an arena for 3 min once a week for the first 3 postnatal weeks, recording the number of separation calls and the duration of locomotor activity. Kittens showed consistent and stable individual differences on both measures across and within trials. Stable individual differences in the emission of separation calls across trials emerged already within the first 10 s of testing, and in locomotor activity within the first 30 s. Furthermore, individual kittens' emission of separation calls, but not their locomotor activity, was highly stable within trials. We conclude that separation calls provide an efficient, minimally intrusive and reliable measure of individual differences in behavior during development in the cat, and possibly in other species emitting such calls. © 2017 Wiley Periodicals, Inc.
Defining Human Failure Events for Petroleum Risk Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Knut Øien
2014-06-01
In this paper, barriers and human failure events (HFEs) for human reliability analysis (HRA) are identified and described. The barriers, called target systems, are identified from risk-significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.
ERIC Educational Resources Information Center
Wolf, R. Cameron; Bicego, George; Marconi, Katherine; Bessinger, Ruth; van Praag, Eric; Noriega-Minichiello, Shanti; Pappas, Gregory; Fronczak, Nancy; Peersman, Greet; Fiorentino, Renee K.; Rugg, Deborah; Novak, John
2004-01-01
The sharp rise in the HIV/AIDS burden worldwide has elicited calls for increased efforts to combat the spread and impact of HIV/AIDS. Efforts must continue with the aim to decrease new infections. At the same time, care and treatment services for those already infected can lead to longer, productive lives, thereby minimizing negative effects on…
Automating the Transformational Development of Software. Volume 1.
1983-03-01
DRACO system [Neighbors 80] uses meta-rules to derive information about which new transformations will be applicable after a particular transformation has... transformation over another. The new model, as incorporated in a system called Glitter, explicitly represents transformation goals, methods, and selection... done anew for each new problem (compare this with Neighbors' Draco system [Neighbors 80], which attempts to reuse domain analysis). o Is the user
JPRS Report, Science & Technology, USSR: Science & Technology Policy.
1987-07-15
called for by the specifications. We pointed this out, but the suppliers were in no hurry to correct the situation," shop foreman S. Korenev ... S&T Progress (S. Leznov; TEKHNIKA I NAUKA, No 9, Sep 86) 10 Work of ESSR S&T Societies on Intensifying Social Production (Vladimir... Moscow Automated Control System (S. Ye. Serdyuk; TEKHNIKA I NAUKA, No 9, Sep 86) 41 INDUSTRIAL, COMMERCIAL APPLICATION Development of Methods
Metalworking Techniques Unlock a Unique Alloy
NASA Technical Reports Server (NTRS)
2015-01-01
Approached by West Hartford, Connecticut-based Abbott Ball Company, Glenn Research Center agreed to test an intriguing alloy called Nitinol 60 that had been largely unused for a half century. Using powder metallurgy, the partners developed a method for manufacturing and working with the material, which Abbott Ball has now commercialized. Nitinol 60 provides a unique combination of qualities that make it an excellent material for ball bearings, among other applications.
ERIC Educational Resources Information Center
Ghebtsawi, Tesheme
This report illustrates the method, called the "SfB System," recommended in an earlier publication by this organization, for systematization of building projects. The aim of the report is to show how to arrange information on building elements and building construction for clear definition of the parts to be built and for easy…
iSS-PseDNC: identifying splicing sites using pseudo dinucleotide composition.
Chen, Wei; Feng, Peng-Mian; Lin, Hao; Chou, Kuo-Chen
2014-01-01
In eukaryotic genes, exons are generally interrupted by introns. Accurately removing introns and joining exons together are essential processes in eukaryotic gene expression. With the avalanche of genome sequences generated in the postgenomic age, it is highly desirable to develop automated methods for rapid and effective detection of splice sites, which play important roles in gene structure annotation and even in RNA splicing. Although a series of computational methods have been proposed for splice site identification, most of them neglect the intrinsic local structural properties. In the present study, a predictor called "iSS-PseDNC" was developed for identifying splice sites. In the new predictor, the sequences were formulated with a novel feature vector called "pseudo dinucleotide composition" (PseDNC), into which six DNA local structural properties were incorporated. It was observed in rigorous cross-validation tests on two benchmark datasets that the overall success rates achieved by iSS-PseDNC in identifying splice donor sites and splice acceptor sites were 85.45% and 87.73%, respectively. It is anticipated that iSS-PseDNC may become a useful tool for identifying splice sites and that the six DNA local structural properties described in this paper may provide novel insights for in-depth investigations into the mechanism of RNA splicing.
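A toy illustration of the PseDNC feature construction (Python): 16 dinucleotide frequencies plus lambda sequence-order correlation factors. The property values below are placeholders for the six standardized DNA local structural properties the paper incorporates:

    from itertools import product

    DINUCLEOTIDES = ["".join(p) for p in product("ACGT", repeat=2)]
    # Placeholder standardized property value per dinucleotide (illustrative).
    PROP = {dn: (i % 4 - 1.5) / 1.5 for i, dn in enumerate(DINUCLEOTIDES)}

    def psednc(seq, lam=3, w=0.05):
        """16 dinucleotide frequencies + lam tiered correlation factors."""
        dns = [seq[i:i + 2] for i in range(len(seq) - 1)]
        freq = [dns.count(dn) / len(dns) for dn in DINUCLEOTIDES]
        theta = []
        for j in range(1, lam + 1):
            pairs = list(zip(dns[:-j], dns[j:]))
            theta.append(sum((PROP[a] - PROP[b]) ** 2 for a, b in pairs) / len(pairs))
        denom = 1 + w * sum(theta)
        return [f / denom for f in freq] + [w * t / denom for t in theta]

    print(len(psednc("ACGTACGGTTAACGT")))  # 16 + 3 = 19 features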
SMART on FHIR: a standards-based, interoperable apps platform for electronic health records
Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B
2016-01-01
Objective In early 2010, Harvard Medical School and Boston Children’s Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). Methods We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application-programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Health Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. Results We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Health Information Management Systems Society conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. Conclusion In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. PMID:26911829
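For flavor, the FHIR side of such a platform is an ordinary REST interaction; a minimal sketch (Python; the server base URL and patient id are illustrative, and a real SMART app would add an OAuth2 authorization step):

    import requests

    BASE = "https://example.org/fhir"  # hypothetical FHIR endpoint

    def get_patient(patient_id: str) -> dict:
        """Fetch a Patient resource via the standard FHIR read interaction."""
        resp = requests.get(
            f"{BASE}/Patient/{patient_id}",
            headers={"Accept": "application/fhir+json"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    # patient = get_patient("123")
    # print(patient["resourceType"])  # -> "Patient"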
The 6-31B(d) basis set and the BMC-QCISD and BMC-CCSD multicoefficient correlation methods.
Lynch, Benjamin J; Zhao, Yan; Truhlar, Donald G
2005-03-03
Three new multicoefficient correlation methods (MCCMs) called BMC-QCISD, BMC-CCSD, and BMC-CCSD-C are optimized against 274 data that include atomization energies, electron affinities, ionization potentials, and reaction barrier heights. A new basis set called 6-31B(d) is developed and used as part of the new methods. BMC-QCISD has mean unsigned errors in calculating atomization energies per bond and barrier heights of 0.49 and 0.80 kcal/mol, respectively. BMC-CCSD has mean unsigned errors of 0.42 and 0.71 kcal/mol for the same two quantities. BMC-CCSD-C is an equally effective variant of BMC-CCSD that employs Cartesian rather than spherical harmonic basis sets. The mean unsigned error of BMC-CCSD or BMC-CCSD-C for atomization energies, barrier heights, ionization potentials, and electron affinities is 22% lower than G3SX(MP2) at an order of magnitude less cost for gradients for molecules with 9-13 atoms, and it scales better (N^6 vs. N^7, where N is the number of atoms) when the size of the molecule is increased.
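Schematically, an MCCM approximates a high-level energy as a fitted linear combination of affordable components (our sketch; the exact components and basis-set increments of the BMC methods are not reproduced here):

\[
E_{\mathrm{MCCM}} \;=\; c_1\, E[\mathrm{HF}] \;+\; c_2\, \Delta E[\mathrm{MP2} \mid \mathrm{HF}] \;+\; c_3\, \Delta E[\mathrm{CCSD} \mid \mathrm{MP2}] \;+\; \cdots,
\]

where Delta E[X|Y] = E[X] - E[Y] and the coefficients c_i are optimized against the 274-datum training set.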
Structures and Materials Working Group report
NASA Technical Reports Server (NTRS)
Torczyner, Robert; Hanks, Brantley R.
1986-01-01
The appropriateness of the selection of four issues (advanced materials development, analysis/design methods, tests of large flexible structures, and structural concepts) was evaluated. A cross-check of the issues and their relationship to the technology drivers is presented. Although all of the issues address numerous drivers, the advanced materials development issue impacts six of the seven drivers and is considered the most crucial. Advanced materials technology development and advanced design/analysis methods development were determined to be enabling technologies; the testing issues and the development of structural concepts are of great importance, although not enabling technologies. In addition, and of more general interest and criticality, the need for a Government/Industry commitment, which does not now exist, was established. This commitment would call for establishing the infrastructure needed to develop the highlighted capabilities, through the availability of resources and testbed facilities, including a national testbed in space to be in place within ten years.
Mahamdallie, Shazia; Ruark, Elise; Yost, Shawn; Ramsay, Emma; Uddin, Imran; Wylie, Harriett; Elliott, Anna; Strydom, Ann; Renwick, Anthony; Seal, Sheila; Rahman, Nazneen
2017-01-01
Detection of deletions and duplications of whole exons (exon CNVs) is a key requirement of genetic testing. Accurate detection of this variant type has proved very challenging in targeted next-generation sequencing (NGS) data, particularly if only a single exon is involved. Many different NGS exon CNV calling methods have been developed over the last five years. Such methods are usually evaluated using simulated and/or in-house data due to a lack of publicly available datasets with orthogonally generated results. This hinders tool comparisons, transparency and reproducibility. To provide a community resource for assessment of exon CNV calling methods in targeted NGS data, we here present the ICR96 exon CNV validation series. The dataset includes high-quality sequencing data from a targeted NGS assay (the TruSight Cancer Panel) together with Multiplex Ligation-dependent Probe Amplification (MLPA) results for 96 independent samples. 66 samples contain at least one validated exon CNV and 30 samples have validated negative results for exon CNVs in 26 genes. The dataset includes 46 exon CNVs in BRCA1, BRCA2, TP53, MLH1, MSH2, MSH6, PMS2, EPCAM or PTEN, giving excellent representation of the cancer predisposition genes most frequently tested in clinical practice. Moreover, the validated exon CNVs include 25 single-exon CNVs, the most difficult type of exon CNV to detect. The FASTQ files for the ICR96 exon CNV validation series can be accessed through the European Genome-phenome Archive (EGA) under the accession number EGAS00001002428.
Bi-color near infrared thermoreflectometry: a method for true temperature field measurement.
Sentenac, Thierry; Gilblas, Rémi; Hernandez, Daniel; Le Maoult, Yannick
2012-12-01
In the context of radiative temperature field measurement, this paper presents an innovative method, called bicolor near infrared thermoreflectometry, for the measurement of true temperature fields without prior knowledge of the emissivity field of an opaque material. This is achieved by simultaneous measurement, in the near infrared spectral band, of the radiance temperature fields and of the emissivity fields, the latter measured indirectly by reflectometry. The theoretical framework of the method is introduced and the principle of the measurements at two wavelengths is detailed. The crucial features of the indirect measurement of emissivity are the measurement of bidirectional reflectivities in a single direction and the introduction of an unknown variable called the "diffusion factor." Radiance temperatures and bidirectional reflectivities are then merged into a bichromatic system based on Kirchhoff's laws. The central assumption of the system, the invariance of the diffusion factor between two near wavelengths, and the choice of the wavelengths are then discussed in relation to a database of several material properties. A thermoreflectometer prototype was developed, dimensioned, and evaluated. Experiments were carried out to assess its trueness in challenging cases. First, experiments were performed on a metallic sample with a high emissivity value, for which the bidirectional reflectivity had to be measured from low signals. Further results on erbium oxide demonstrate the power of the method for materials with high emissivity variations in the near infrared spectral band.
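One schematic reading of the bichromatic system, in the Wien approximation and with our own symbols (epsilon the emissivity, rho'' the measured bidirectional reflectivity, eta the diffusion factor assumed invariant between the two near wavelengths):

\[
\varepsilon(\lambda_i) \;=\; 1 - \eta\, \pi\, \rho''(\lambda_i), \qquad \frac{1}{T_r(\lambda_i)} \;=\; \frac{1}{T} - \frac{\lambda_i}{C_2} \ln \varepsilon(\lambda_i), \qquad i = 1, 2,
\]

so that two measured radiance temperatures T_r(lambda_i) and two reflectivities give two equations in the two unknowns (T, eta), from which the true temperature T follows.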
NASA Astrophysics Data System (ADS)
Butlitsky, M. A.; Zelener, B. B.; Zelener, B. V.
2015-11-01
Earlier, a two-component pseudopotential plasma model, which we call the "shelf Coulomb" model, was developed. A Monte Carlo study of the canonical NVT ensemble with periodic boundary conditions was undertaken to calculate equations of state, pair distribution functions, internal energies and other thermodynamic properties of the model. In the present work, an attempt is made to apply the so-called hybrid Gibbs statistical ensemble Monte Carlo technique to this model. Initial simulation results show qualitatively similar behavior in the critical point region for both methods. The Gibbs ensemble technique lets us estimate the position of the melting curve and the triple point of the model (in reduced temperature and specific volume coordinates): T* ≈ 0.0476, v* ≈ 6 × 10^-4.
Hamiltonian Dynamics of Spider-Type Multirotor Rigid Bodies Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doroshin, Anton V.
2010-03-01
This paper sets out to develop a spider-type multiple-rotor system which can be used for attitude control of spacecraft. The multirotor system contains a large number of rotor-equipped rays, so it was called a "Spider-type System"; it can also be called a "Rotary Hedgehog". These systems allow using spinups and captures of conjugate rotors to perform compound attitude motion of spacecraft. The paper describes a new method of spacecraft attitude reorientation and a new mathematical model of motion in Hamilton form. Hamiltonian dynamics of the system is investigated with the help of Andoyer-Deprit canonical variables. These variables allow obtaining exact solutions for hetero- and homoclinic orbits in the phase space of the system motion, which are very important for qualitative analysis.
The planar multijunction cell - A new solar cell for earth and space
NASA Technical Reports Server (NTRS)
Evans, J. C., Jr.; Chai, A.-T.; Goradia, C.
1980-01-01
A new family of high-voltage solar cells, called the planar multijunction (PMJ) cell, is being developed. The new cells combine the attractive features of planar cells having conventional or interdigitated back contacts with those of the vertical multijunction (VMJ) solar cell. The PMJ solar cell is internally divided into many voltage-generating regions, called unit cells, which are internally connected in series. The key to obtaining reasonable performance from this device was the separation of the top surface field regions over each active unit cell area. Using existing solar cell fabrication methods, output voltages in excess of 20 volts per linear centimeter are possible. Analysis of the new device is complex, and numerous geometries are being studied which should provide substantial benefits in normal sunlight usage as well as with concentrators.
Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies
NASA Astrophysics Data System (ADS)
Brune, Ryan Carl
Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved of the output pressure events in both magnitude and distribution. To address this need, a novel pressure measurement method has been developed, called the Profile Indentation Pressure Evaluation (PIPE) method, which systematically analyzes indentation patterns created by impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided the introduction of a design method for electromagnetic path actuator systems, in which key geometrical variables are considered using a newly developed analysis method called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by treating the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, the effect of geometry is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high-velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall, testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Together, these studies comprehensively explore the effects of geometrical parameters on the magnitude and distribution of impulse manufacturing pressure, establishing key guidelines and models for continued development and implementation in commercial applications.
Point model equations for neutron correlation counting: Extension of Böhnel's equations to any order
Favalli, Andrea; Croft, Stephen; Santi, Peter
2015-06-15
Various methods of autocorrelation neutron analysis may be used to extract information about a measurement item containing spontaneously fissioning material. The two predominant approaches are the time-correlation-analysis (coincidence-gate) methods of multiplicity shift register logic and Feynman sampling. Their common feature is that the correlated nature of the pulse train can be described by a vector of reduced factorial multiplet rates. We call these singlets, doublets, triplets, etc. Within the point reactor model, the multiplet rates may be related to the properties of the item, the parameters of the detector, and basic nuclear data constants by a series of coupled algebraic equations, the so-called point model equations. Solving, or inverting, the point model equations using experimental calibration model parameters is how assays of unknown items are performed. Currently only the first three multiplets are routinely used. In this work we develop the point model equations to higher-order multiplets using the probability generating functions approach combined with the general derivative chain rule, the so-called Faà di Bruno formula. Explicit expressions up to 5th order are provided, as well as the general iterative formula to calculate any order. This study represents the first necessary step towards determining whether higher-order multiplets can add value to nondestructive measurement practice for nuclear materials control and accountancy.
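Schematically, the reduced factorial multiplet rates are the factorial moments of the detected-neutron multiplicity, obtained from its probability generating function G(z):

\[
m_k \;=\; \frac{1}{k!} \left. \frac{d^k G(z)}{dz^k} \right|_{z=1}, \qquad k = 1, 2, 3, \ldots
\]

(singlets, doublets, triplets for k = 1, 2, 3). Since the physics composes generating functions, the higher derivatives expand via the Faà di Bruno formula,

\[
\frac{d^n}{dz^n} f\bigl(g(z)\bigr) \;=\; \sum \frac{n!}{\prod_{j=1}^{n} m_j!\,(j!)^{m_j}}\; f^{(m_1 + \cdots + m_n)}\bigl(g(z)\bigr) \prod_{j=1}^{n} \bigl(g^{(j)}(z)\bigr)^{m_j},
\]

with the sum running over all nonnegative integer tuples (m_1, ..., m_n) satisfying m_1 + 2 m_2 + ... + n m_n = n.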
Virtual shelves in a digital library: a framework for access to networked information sources.
Patrick, T B; Springer, G K; Mitchell, J A; Sievert, M E
1995-01-01
Develop a framework for collections-based access to networked information sources that addresses the problem of location-dependent access to information sources. This framework uses the metaphor of a virtual shelf. A virtual shelf is a general-purpose server that is dedicated to a particular information subject class; the identifier of such a server identifies its subject class. Location-independent call numbers, based on standard vocabulary codes, are assigned to information sources. The call numbers are first mapped to the location-independent identifiers of virtual shelves. When access to an information resource is required, a location directory provides a second mapping of these location-independent server identifiers to actual network locations. The framework has been implemented in two different systems, one based on the Open Software Foundation/Distributed Computing Environment and the other on the World Wide Web. This framework applies traditional methods of library classification and cataloging in new ways. It is compatible with two traditional styles of selecting information: searching and browsing. Traditional methods may be combined with new paradigms of information searching that can take advantage of the special properties of digital information. Cooperation between the library and information science community and the informatics community can provide a means for continuing application of the knowledge and techniques of library science to the new problems of networked information sources.
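A minimal sketch of the two-level resolution (Python; the call number, shelf identifier, and URL are illustrative):

    # First mapping: location-independent call number -> virtual shelf id.
    CALL_NUMBER_TO_SHELF = {
        "WB-141": "shelf:clinical-diagnosis",
    }
    # Second mapping, held by the location directory: shelf id -> network location.
    SHELF_TO_LOCATION = {
        "shelf:clinical-diagnosis": "https://shelves.example.edu/clinical-diagnosis",
    }

    def resolve(call_number: str) -> str:
        """Resolve a call number to a current network location in two hops."""
        shelf_id = CALL_NUMBER_TO_SHELF[call_number]  # stable over time
        return SHELF_TO_LOCATION[shelf_id]            # updated as servers move

    print(resolve("WB-141"))

Only the directory needs updating when a server moves; the call numbers assigned to information sources never change.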
Unterberger, Michael J; Holzapfel, Gerhard A
2014-11-01
The protein actin is a part of the cytoskeleton and is therefore responsible for the mechanical properties of cells. Starting with the single molecule up to the final structure, actin creates a hierarchical structure of several levels exhibiting remarkable behavior. The hierarchy spans several length scales and, given limitations in computational power, calls for different mechanical modeling approaches at the different scales. On the molecular level, we may consider each atom in molecular dynamics simulations. Actin forms filaments by combining the molecules into a double helix. In a model, we may replace molecular subdomains using coarse-graining methods, allowing the investigation of systems comprising many more atoms. These models on the nanoscale inform continuum mechanical models of large filaments, which are based on worm-like chain models for polymers. Assemblies of actin filaments are connected by cross-linker proteins. Models with discrete filaments, so-called Mikado models, allow us to investigate the dependence of network properties on the parameters of the constituents. Microstructurally motivated continuum models of the networks provide insights into larger systems containing cross-linked actin networks. Modeling of such systems helps to gain insight into processes at these small scales; at the same time, such models call for verification and hence trigger the improvement of established experiments and the development of new methods.
Calling, Vocational Development, and Well Being: A Longitudinal Study of Medical Students
ERIC Educational Resources Information Center
Duffy, Ryan D.; Manuel, R. Stephen; Borges, Nicole J.; Bott, Elizabeth M.
2011-01-01
The present study investigated the relation of calling to the vocational development and well-being of a sample of medical students. Students were surveyed at two time points: prior to beginning the first year of medical school and prior to beginning the third year of medical school. At each time point, calling moderately correlated with positive…
A comparison in Colorado of three methods to monitor breeding amphibians
Corn, P.S.; Muths, E.; Iko, W.M.
2000-01-01
We surveyed amphibians at 4 montane and 2 plains lentic sites in northern Colorado using 3 techniques: standardized call surveys, automated recording devices (frog-loggers), and intensive surveys including capture-recapture techniques. Amphibians were observed at 5 sites. Species richness varied from 0 to 4 species at each site. Richness scores, the sums of species richness among sites, were similar among methods: 8 for call surveys, 10 for frog-loggers, and 11 for intensive surveys (9 if the non-vocal salamander Ambystoma tigrinum is excluded). The frog-logger at 1 site recorded Spea bombifrons which was not active during the times when call and intensive surveys were conducted. Relative abundance scores from call surveys failed to reflect a relatively large population of Bufo woodhousii at 1 site and only weakly differentiated among different-sized populations of Pseudacris maculata at 3 other sites. For extensive applications, call surveys have the lowest costs and fewest requirements for highly trained personnel. However, for a variety of reasons, call surveys cannot be used with equal effectiveness in all parts of North America.
Minimizing Higgs potentials via numerical polynomial homotopy continuation
NASA Astrophysics Data System (ADS)
Maniatis, M.; Mehta, D.
2012-08-01
The study of models with extended Higgs sectors requires minimizing the corresponding Higgs potentials, which is in general very difficult. Here, we apply a recently developed method, called numerical polynomial homotopy continuation (NPHC), which is guaranteed to find all the stationary points of Higgs potentials with polynomial-like non-linearity. The detection of all stationary points reveals the structure of the potential, with maxima, metastable minima, and saddle points besides the global minimum. We apply the NPHC method to the most general Higgs potential having two complex Higgs-boson doublets and up to five real Higgs-boson singlets. Moreover, the method is applicable to even more involved potentials. Hence the NPHC method allows one to go far beyond the limits of the Gröbner basis approach.
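The core of NPHC is a homotopy between a start system Q(x) with known roots and the target stationarity system P(x) = 0 (the gradient of the potential); in the standard "gamma trick" form,

\[
H(x, t) \;=\; \gamma\,(1 - t)\, Q(x) \;+\; t\, P(x), \qquad t \in [0, 1],
\]

with gamma a generic complex constant. Tracking each solution path from the known roots at t = 0 to t = 1 yields, with probability one, all isolated stationary points of the potential.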
Scare Tactics: Evaluating Problem Decompositions Using Failure Scenarios
NASA Technical Reports Server (NTRS)
Helm, B. Robert; Fickas, Stephen
1992-01-01
Our interest is in the design of multi-agent problem-solving systems, which we refer to as composite systems. We have proposed an approach to composite system design by decomposition of problem statements. An automated assistant called Critter provides a library of reusable design transformations which allow a human analyst to search the space of decompositions for a problem. In this paper we describe a method for evaluating and critiquing problem decompositions generated by this search process. The method uses knowledge stored in the form of failure decompositions attached to design transformations. We suggest the benefits of our critiquing method by showing how it could re-derive steps of a published development example. We then identify several open issues for the method.
Bloedel, Kimberly; Skhal, Kathryn
2006-01-01
Hardin Library for the Health Sciences offers an education service called Hardin House Calls. In collaboration with the University of Iowa Libraries' public relations coordinator, the education team developed a marketing campaign for Hardin House Calls. Marketing strategies included designing a new logo, meeting with external relations representatives and faculty, distributing a user survey, and producing and distributing posters and advertisements. These marketing strategies greatly increased the visibility and use of Hardin House Calls. The campaign also led to a series of faculty development sessions, education collaborations with smaller health sciences departments, and collection development opportunities. Promoting an instructional service through a public relations framework was found to be a highly successful strategy.
Touchpoints: Your Child's Emotional and Behavioral Development.
ERIC Educational Resources Information Center
Brazelton, T. Berry
This book looks at children's early development through what are called "touchpoints": times just before a surge of rapid motor, cognitive, or emotional development when, for a short time, children regress in several areas and become difficult to understand. Part 1, called "Touchpoints of Development," is organized around the…
17 CFR 248.125 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2014 CFR
2014-04-01
... electronically mailed or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing..., such as by calling a single toll-free telephone number. (2) Opt out methods that are not reasonable and...
17 CFR 248.125 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2013 CFR
2013-04-01
... electronically mailed or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing..., such as by calling a single toll-free telephone number. (2) Opt out methods that are not reasonable and...
17 CFR 248.125 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2012 CFR
2012-04-01
... electronically mailed or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing..., such as by calling a single toll-free telephone number. (2) Opt out methods that are not reasonable and...
17 CFR 248.125 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2010 CFR
2010-04-01
... electronically mailed or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing..., such as by calling a single toll-free telephone number. (2) Opt out methods that are not reasonable and...
17 CFR 248.125 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2011 CFR
2011-04-01
... electronically mailed or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing..., such as by calling a single toll-free telephone number. (2) Opt out methods that are not reasonable and...
12 CFR 222.25 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2010 CFR
2010-01-01
... electronically mailed or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...
12 CFR 222.25 - Reasonable and simple methods of opting out.
Code of Federal Regulations, 2014 CFR
2014-01-01
... electronically mailed or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...