Sample records for "simple analysis based"

  1. Analysis of pre-service physics teacher skills designing simple physics experiments based technology

    NASA Astrophysics Data System (ADS)

    Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.

    2018-03-01

    Pre-service physics teachers' skill in designing simple experiment sets is very important for deepening students' conceptual understanding and for practicing scientific skills in the laboratory. This study describes the skills of physics students in designing simple technology-based experiments. The experimental design stages include simple tool design and sensor modification. The research method used is a descriptive method, with a sample of 25 students and 5 variations of simple physics experimental design. Based on the results of interviews and observations, the analysis shows that pre-service physics teachers' skill in designing simple technology-based physics experiments is good. Based on the observation results, pre-service physics teachers' skill in designing simple experiments is good, while modification and application of sensors are still weak. This suggests that pre-service physics teachers still need considerable practice in designing physics experiments using sensor modifications. Based on the interview results, students have fairly high motivation to take part actively in laboratory activities and strong curiosity to become skilled at making simple practicum tools for physics experiments.

  2. A Simple Exact Error Rate Analysis for DS-CDMA with Arbitrary Pulse Shape in Flat Nakagami Fading

    NASA Astrophysics Data System (ADS)

    Rahman, Mohammad Azizur; Sasaki, Shigenobu; Kikuchi, Hisakazu; Harada, Hiroshi; Kato, Shuzo

    A simple exact error rate analysis is presented for random binary direct-sequence code division multiple access (DS-CDMA) with a general pulse shape and a flat Nakagami fading channel. First, a simple model is developed for the multiple access interference (MAI). Based on this, a simple exact expression for the characteristic function (CF) of the MAI is derived in a straightforward manner. Finally, an exact error rate expression is obtained following the CF method of error rate analysis. The exact error rate so obtained is much easier to evaluate than the only reliable approximate error rate expression currently available, which is based on the Improved Gaussian Approximation (IGA).

  3. A Simple Plant Growth Analysis.

    ERIC Educational Resources Information Center

    Oxlade, E.

    1985-01-01

    Describes the analysis of dandelion peduncle growth based on peduncle length, epidermal cell dimensions, and fresh/dry mass. Methods are simple and require no special apparatus or materials. Suggests that limited practical work in this area may contribute to students' lack of knowledge on plant growth. (Author/DH)

  4. Phasor Analysis of Binary Diffraction Gratings with Different Fill Factors

    ERIC Educational Resources Information Center

    Martinez, Antonio; Sanchez-Lopez, Ma del Mar; Moreno, Ignacio

    2007-01-01

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving…

  5. Fourier Spectroscopy: A Simple Analysis Technique

    ERIC Educational Resources Information Center

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
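
    The point-by-point integration described in this record can be sketched numerically (a hypothetical illustration with made-up sample values, not the article's worked example): each spectral value is approximated as a Riemann sum of the interferogram samples against a cosine at the chosen frequency.

```python
import math

def cosine_transform(xs, ys, freqs):
    """Point-by-point numerical cosine transform of a sampled
    interferogram: F(f) ~ sum_i y_i * cos(2*pi*f*x_i) * dx."""
    dx = xs[1] - xs[0]
    return [sum(y * math.cos(2 * math.pi * f * x) for x, y in zip(xs, ys)) * dx
            for f in freqs]

# Toy interferogram: a single cosine at 5 "wavenumbers" sampled on [0, 1)
xs = [i * 0.001 for i in range(1000)]
ys = [math.cos(2 * math.pi * 5 * x) for x in xs]
spectrum = cosine_transform(xs, ys, freqs=[2, 5, 8])
# The component at f = 5 dominates; the off-frequency sums are near zero.
```

    The same loop can be carried out by hand, one sample point at a time, which is exactly the manual exercise the abstract describes.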

  6. Interactions between Type of Instruction and Type of Language Feature: A Meta-Analysis

    ERIC Educational Resources Information Center

    Spada, Nina; Tomita, Yasuyo

    2010-01-01

    A meta-analysis was conducted to investigate the effects of explicit and implicit instruction on the acquisition of simple and complex grammatical features in English. The target features in the 41 studies contributing to the meta-analysis were categorized as simple or complex based on the number of criteria applied to arrive at the correct target…

  7. Low cost charged-coupled device (CCD) based detectors for Shiga toxins activity analysis

    USDA-ARS's Scientific Manuscript database

    To improve food safety there is a need to develop simple, low-cost sensitive devices for detection of foodborne pathogens and their toxins. We describe a simple and relatively low-cost webcam-based detector which can be used for various optical detection modalities, including fluorescence, chemilumi...

  8. Two-way ANOVA Problems with Simple Numbers.

    ERIC Educational Resources Information Center

    Read, K. L. Q.; Shihab, L. H.

    1998-01-01

    Describes how to construct simple numerical examples in two-way ANOVAs, specifically randomized blocks, balanced two-way layouts, and Latin squares. Indicates that working through simple numerical problems is helpful to students meeting a technique for the first time and should be followed by computer-based analysis of larger, real datasets when…

  9. Summer Research Program - 1997 Summer Faculty Research Program Volume 6 Arnold Engineering Development Center United States Air Force Academy Air Logistics Centers

    DTIC Science & Technology

    1997-12-01

    Fracture Analysis of the F-5, 15%-Spar Bolt, DR Devendra Kumar, SAALC/LD, 6-16, CUNY-City College, New York, NY; A Simple, Multiversion Concurrency Control... Program, University of Dayton, Dayton, OH. [3] AFGROW, Air Force Crack Propagation Analysis Program, Version 3.82 (1997), 15-8; A Simple, Multiversion... Air Force Office of Scientific Research, Bolling Air Force Base, DC, and San Antonio Air Logistics Center, August 1997, 16-1; A Simple, Multiversion Concurrency Control

  10. Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics

    ERIC Educational Resources Information Center

    Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano

    2017-01-01

    We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allow one to measure, in a simple yet rigorous way, the speed of pulses…

  11. On-Line Analysis of Southern FIA Data

    Treesearch

    Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch

    2006-01-01

    The Southern On-Line Estimator (SOLE) is a web-based FIA database analysis tool designed with an emphasis on modularity. The Java-based user interface is simple and intuitive to use and the R-based analysis engine is fast and stable. Each component of the program (data retrieval, statistical analysis and output) can be individually modified to accommodate major...

  12. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and reaction efficiencies (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed under traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
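
    The efficiency-weighted Cq calculation can be sketched in a few lines (a hypothetical illustration with made-up Cq and efficiency values, not data from the paper): weight each Cq by log10(E), average within groups in the log scale, and only exponentiate at the end.

```python
import math
from statistics import mean

def efficiency_weighted_cq(cq_values, efficiencies):
    """Efficiency-weighted Cq: log10(E) * Cq for each well."""
    return [math.log10(e) * cq for cq, e in zip(cq_values, efficiencies)]

# Made-up triplicate wells for a target and a reference gene (E ~ 1.9-1.95)
target_ctrl = efficiency_weighted_cq([24.1, 24.5, 23.9], [1.95, 1.95, 1.95])
target_trt  = efficiency_weighted_cq([22.0, 22.3, 21.8], [1.95, 1.95, 1.95])
ref_ctrl    = efficiency_weighted_cq([18.2, 18.0, 18.4], [1.90, 1.90, 1.90])
ref_trt     = efficiency_weighted_cq([18.1, 18.3, 18.2], [1.90, 1.90, 1.90])

# Log10 relative expression of the target (treated vs. control),
# normalized to the reference gene; exponentiate last.
delta = (mean(target_ctrl) - mean(target_trt)) - (mean(ref_ctrl) - mean(ref_trt))
fold_change = 10 ** delta
```

    Because every step up to the final exponentiation stays in the log scale, ordinary t-tests or ANOVA can be applied directly to the weighted values.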

  13. Post-game analysis: An initial experiment for heuristic-based resource management in concurrent systems

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.

    1987-01-01

    In concurrent systems, a major responsibility of the resource management system is to decide how the application program is to be mapped onto the multi-processor. Instead of using abstract program and machine models, a generate-and-test framework known as 'post-game analysis' that is based on data gathered during program execution is proposed. Each iteration consists of (1) (a simulation of) an execution of the program; (2) analysis of the data gathered; and (3) the proposal of a new mapping that would have a smaller execution time. These heuristics are applied to predict execution time changes in response to small perturbations applied to the current mapping. An initial experiment was carried out using simple strategies on 'pipeline-like' applications. The results obtained from four simple strategies demonstrated that for this kind of application, even simple strategies can produce acceptable speed-up with a small number of iterations.

  14. Fabrication of a Paper-Based Microfluidic Device to Readily Determine Nitrite Ion Concentration by Simple Colorimetric Assay

    ERIC Educational Resources Information Center

    Wang, Bo; Lin, Zhiqiang; Wang, Min

    2015-01-01

    Paper-based microfluidic devices (µPAD) are a burgeoning platform of microfluidic analysis technology. The method described herein is for use in undergraduate and high school chemistry laboratories. A simple and convenient µPAD was fabricated by easy patterning of filter paper using a permanent marker pen. The usefulness of the device was…

  15. [Analysis on the accuracy of simple selection method of Fengshi (GB 31)].

    PubMed

    Li, Zhixing; Zhang, Haihua; Li, Suhe

    2015-12-01

    To explore the accuracy of the simple selection method of Fengshi (GB 31). Through the study of ancient and modern data, the analysis and integration of acupuncture books, the comparison of the locations of Fengshi (GB 31) given by doctors of all dynasties, and the integration of modern anatomy, the modern simple selection method of Fengshi (GB 31) is confirmed to be the same as the traditional way. It is believed that the simple selection method is in accord with the human-oriented thought of TCM. Treatment by acupoints should be based on the emerging nature and the individual differences of patients. It is also proposed that Fengshi (GB 31) should be located by integrating the simple method with body surface anatomical landmarks.

  16. Exploratory reconstructability analysis of accident TBI data

    NASA Astrophysics Data System (ADS)

    Zwick, Martin; Carney, Nancy; Nettleton, Rosemary

    2018-02-01

    This paper describes the use of reconstructability analysis to perform a secondary study of traumatic brain injury data from automobile accidents. Neutral searches were done and their results displayed with a hypergraph. Directed searches, using both variable-based and state-based models, were applied to predict performance on two cognitive tests and one neurological test. Very simple state-based models gave large uncertainty reductions for all three DVs and sizeable improvements in percent correct for the two cognitive test DVs which were equally sampled. Conditional probability distributions for these models are easily visualized with simple decision trees. Confounding variables and counter-intuitive findings are also reported.

  17. A Model and Simple Iterative Algorithm for Redundancy Analysis.

    ERIC Educational Resources Information Center

    Fornell, Claes; And Others

    1988-01-01

    This paper shows that redundancy maximization with J. K. Johansson's extension can be accomplished via a simple iterative algorithm based on H. Wold's Partial Least Squares. The model and the iterative algorithm for the least squares approach to redundancy maximization are presented. (TJH)

  18. Anticancer Agents Based on a New Class of Transition- State Analog Inhibitors for Serine and Cysteine Proteases

    DTIC Science & Technology

    1999-08-01

    electrostatic repulsion between the heteroatom and the ketone. Swain and Lupton [31] have constructed a modified Hammett equation (eq 2) in which they... determined by nonlinear fit to the Michaelis-Menten equation for competitive inhibition using simple weighting. Competitive inhibition was confirmed by Lineweaver-Burk analysis using simple

  19. Reachability Analysis for Base Placement in Mobile Manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1994-01-01

    This paper addresses the problem of base placement for mobile robots, and proposes a simple off-line solution to determine the appropriate base locations from which the robot can reach a target point.

  20. SIMPL: A Simplified Model-Based Program for the Analysis and Visualization of Groundwater Rebound in Abandoned Mines to Prevent Contamination of Water and Soils by Acid Mine Drainage

    PubMed Central

    Kim, Sung-Min

    2018-01-01

    Cessation of dewatering following underground mine closure typically results in groundwater rebound, because mine voids and surrounding strata undergo flooding up to the levels of the decant points, such as shafts and drifts. SIMPL (Simplified groundwater program In Mine workings using the Pipe equation and Lumped parameter model), a simplified lumped parameter model-based program for predicting groundwater levels in abandoned mines, is presented herein. The program comprises a simulation engine module, 3D visualization module, and graphical user interface, which aids data processing, analysis, and visualization of results. The 3D viewer facilitates effective visualization of the predicted groundwater level rebound phenomenon together with a topographic map, mine drift, goaf, and geological properties from borehole data. SIMPL is applied to data from the Dongwon coal mine and Dalsung copper mine in Korea, with strong similarities in simulated and observed results. By considering mine workings and interpond connections, SIMPL can thus be used to effectively analyze and visualize groundwater rebound. In addition, the predictions by SIMPL can be utilized to prevent the surrounding environment (water and soil) from being polluted by acid mine drainage. PMID:29747480

  1. Simple methods of exploiting the underlying structure of rule-based systems

    NASA Technical Reports Server (NTRS)

    Hendler, James

    1986-01-01

    Much recent work in the field of expert systems research has aimed at exploiting the underlying structure of the rule base for purposes of analysis. Techniques such as Petri nets and DAGs have been proposed as representational structures that allow complete analysis. Much has been made of proving isomorphisms between the rule bases and these mechanisms, and of examining the theoretical power of this analysis. In this paper we describe some early work on a new system with much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and a finite-state automaton (FSA) description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.

  2. Volatility and correlation-based systemic risk measures in the US market

    NASA Astrophysics Data System (ADS)

    Civitarese, Jamil

    2016-10-01

    This paper deals with the problem of how to use simple systemic risk measures to assess portfolio risk characteristics. Using three simple examples taken from previous literature, one based on raw and partial correlations, another based on the eigenvalue decomposition of the covariance matrix and the last one based on an eigenvalue entropy, a Granger-causation analysis revealed some of them are not always a good measure of risk in the S&P 500 and in the VIX. The measures selected do not Granger-cause the VIX index in all windows selected; therefore, in the sense of risk as volatility, the indicators are not always suitable. Nevertheless, their results towards returns are similar to previous works that accept them. A deeper analysis has shown that any symmetric measure based on eigenvalue decomposition of correlation matrices, however, is not useful as a measure of "correlation" risk. The empirical counterpart analysis of this proposition stated that negative correlations are usually small and, therefore, do not heavily distort the behavior of the indicator.

  3. Analysis and Identification of Acid-Base Indicator Dyes by Thin-Layer Chromatography

    ERIC Educational Resources Information Center

    Clark, Daniel D.

    2007-01-01

    Thin-layer chromatography (TLC) is a very simple and effective technique used by chemists for various purposes, including monitoring the progress of a reaction. TLC can also easily be used for the analysis and identification of various acid-base indicator dyes.

  4. A simple and selective colorimetric mercury (II) sensing system based on chitosan stabilized gold nanoparticles and 2,6-pyridinedicarboxylic acid.

    PubMed

    Tian, Kun; Siegel, Gene; Tiwari, Ashutosh

    2017-02-01

    The development of simple and cost-effective methods for the detection and treatment of Hg2+ in the environment is an important area of research due to the serious health risk that Hg2+ poses to humans. Colorimetric sensing based on the induced aggregation of nanoparticles is of great interest since it offers a low-cost, simple, and relatively rapid procedure, making it well suited for on-site analysis. Herein we report the development of a simple colorimetric sensor for the selective detection and estimation of mercury ions in water, based on chitosan-stabilized gold nanoparticles (AuNPs) and 2,6-pyridinedicarboxylic acid (PDA). In the presence of Hg2+, PDA induces the aggregation of AuNPs, causing the solution to change color from red to blue depending on the concentration of Hg2+. The formation of aggregated AuNPs in the presence of Hg2+ was confirmed using transmission electron microscopy (TEM) and UV-Vis spectroscopy. The method exhibits linearity in the range of 300 nM to 5 μM, shows excellent selectivity towards Hg2+ among seventeen different metal ions, and was successfully applied to the detection of Hg2+ in spiked river water samples. The developed technique is simple and superior to existing techniques in that it allows detection of Hg2+ with the naked eye and simple, rapid colorimetric analysis, eliminating the need for sophisticated instruments and sample preparation methods. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Simulating and mapping spatial complexity using multi-scale techniques

    USGS Publications Warehouse

    De Cola, L.

    1994-01-01

    A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author

  6. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments with an inherent invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R2) for the training and test sets were greater than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.

  7. Item Selection, Evaluation, and Simple Structure in Personality Data

    PubMed Central

    Pettersson, Erik; Turkheimer, Eric

    2010-01-01

    We report an investigation of the genesis and interpretation of simple structure in personality data using two very different self-reported data sets. The first consists of a set of relatively unselected lexical descriptors, whereas the second is based on responses to a carefully constructed instrument. In both data sets, we explore the degree of simple structure by comparing factor solutions to solutions from simulated data constructed to have either strong or weak simple structure. The analysis demonstrates that there is little evidence of simple structure in the unselected items, and a moderate degree among the selected items. In both instruments, however, much of the simple structure that could be observed originated in a strong dimension of positive vs. negative evaluation. PMID:20694168

  8. Grass Grows, the Cow Eats: A Simple Grazing Systems Model with Emergent Properties

    ERIC Educational Resources Information Center

    Ungar, Eugene David; Seligman, Noam G.; Noy-Meir, Imanuel

    2004-01-01

    We describe a simple, yet intellectually challenging model of grazing systems that introduces basic concepts in ecology and systems analysis. The practical is suitable for high-school and university curricula with a quantitative orientation, and requires only basic skills in mathematics and spreadsheet use. The model is based on Noy-Meir's (1975)…

  9. Cultivar identification, pedigree verification, and diversity analysis among Peach (Prunus persica L. Batsch) Cultivars based on Simple Sequence Repeat markers

    USDA-ARS's Scientific Manuscript database

    The genetic relationships and pedigree inferences among peach (Prunus persica (L.) Batsch) accessions and breeding lines used in genetic improvement were evaluated using 15 simple sequence repeat (SSR) markers. A total of 80 alleles were detected among the 37 peach accessions with an average of 5.53...

  10. Advantages of Thesaurus Representation Using the Simple Knowledge Organization System (SKOS) Compared with Proposed Alternatives

    ERIC Educational Resources Information Center

    Pastor-Sanchez, Juan-Antonio; Martinez Mendez, Francisco Javier; Rodriguez-Munoz, Jose Vicente

    2009-01-01

    Introduction: This paper presents an analysis of the Simple Knowledge Organization System (SKOS) compared with other alternatives for thesaurus representation in the Semantic Web. Method: Based on functional and structural changes of thesauri, provides an overview of the current context in which lexical paradigm is abandoned in favour of the…

  11. When Practice Doesn't Lead to Retrieval: An Analysis of Children's Errors with Simple Addition

    ERIC Educational Resources Information Center

    de Villiers, Celéste; Hopkins, Sarah

    2013-01-01

    Counting strategies initially used by young children to perform simple addition are often replaced by more efficient counting strategies, decomposition strategies and rule-based strategies until most answers are encoded in memory and can be directly retrieved. Practice is thought to be the key to developing fluent retrieval of addition facts. This…

  12. A simple webcam-based approach for the measurement of rodent locomotion and other behavioural parameters.

    PubMed

    Tort, Adriano B L; Neto, Waldemar P; Amaral, Olavo B; Kazlauckas, Vanessa; Souza, Diogo O; Lara, Diogo R

    2006-10-15

    We hereby describe a simple and inexpensive approach to evaluate the position and locomotion of rodents in an arena. The system is based on webcam registering of animal behaviour with subsequent analysis on customized software. Based on black/white differentiation, it provides rapid evaluation of animal position over a period of time, and can be used in a myriad of behavioural tasks in which locomotion, velocity or place preference are variables of interest. A brief review of the results obtained so far with this system and a discussion of other possible applications in behavioural neuroscience are also included. Such a system can be easily implemented in most laboratories and can significantly reduce the time and costs involved in behavioural analysis, especially in developing countries.
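
    The black/white differentiation idea can be sketched as follows (a hypothetical illustration, not the authors' software): threshold each grayscale frame, take the centroid of the dark pixels as the animal's position, and accumulate the distance between successive centroids as locomotion.

```python
import math

def centroid_of_dark_pixels(frame, threshold=60):
    """Return the (row, col) centroid of pixels darker than threshold.

    frame: 2D list of grayscale values (0-255); a dark animal on a
    light arena floor shows up as the below-threshold region."""
    rows, cols, n = 0.0, 0.0, 0
    for r, line in enumerate(frame):
        for c, v in enumerate(line):
            if v < threshold:
                rows += r
                cols += c
                n += 1
    if n == 0:
        return None
    return rows / n, cols / n

def path_length(centroids):
    """Total distance travelled along a sequence of centroids."""
    return sum(math.dist(a, b) for a, b in zip(centroids, centroids[1:]))

# Two toy 4x4 "frames": the dark blob moves one pixel to the right
f1 = [[255] * 4, [255, 10, 255, 255], [255] * 4, [255] * 4]
f2 = [[255] * 4, [255, 255, 10, 255], [255] * 4, [255] * 4]
track = [centroid_of_dark_pixels(f) for f in (f1, f2)]
# path_length(track) gives the distance moved between the two frames.
```

    Velocity and place preference follow directly from the same centroid track (distance per frame interval, and time spent in a region of the arena, respectively).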

  13. Molecular Analysis of Date Palm Genetic Diversity Using Random Amplified Polymorphic DNA (RAPD) and Inter-Simple Sequence Repeats (ISSRs).

    PubMed

    El Sharabasy, Sherif F; Soliman, Khaled A

    2017-01-01

    The date palm is an ancient domesticated plant with great diversity that has been cultivated in the Middle East and North Africa for at least 5000 years. Date palm cultivars are classified based on fruit moisture content as dry, semidry, and soft dates. A number of biochemical and molecular techniques are available for characterization of date palm variation. This chapter focuses on the DNA-based marker techniques random amplified polymorphic DNA (RAPD) and inter-simple sequence repeats (ISSR), in addition to biochemical markers based on isozyme analysis. These techniques, coupled with appropriate statistical tools, have proved useful for determining phylogenetic relationships among date palm cultivars and provide information resources for date palm gene banks.

  14. Gold nanoparticle-based optical microfluidic sensors for analysis of environmental pollutants.

    PubMed

    Lafleur, Josiane P; Senkbeil, Silja; Jensen, Thomas G; Kutter, Jörg P

    2012-11-21

    Conventional methods of environmental analysis can be significantly improved by the development of portable microscale technologies for direct in-field sensing at remote locations. This report demonstrates the vast potential of gold nanoparticle-based microfluidic sensors for the rapid, in-field, detection of two important classes of environmental contaminants - heavy metals and pesticides. Using gold nanoparticle-based microfluidic sensors linked to a simple digital camera as the detector, detection limits as low as 0.6 μg L(-1) and 16 μg L(-1) could be obtained for the heavy metal mercury and the dithiocarbamate pesticide ziram, respectively. These results demonstrate that the attractive optical properties of gold nanoparticle probes combine synergistically with the inherent qualities of microfluidic platforms to offer simple, portable and sensitive sensors for environmental contaminants.

  15. Accuracy analysis of automodel solutions for Lévy flight-based transport: from resonance radiative transfer to a simple general model

    NASA Astrophysics Data System (ADS)

    Kukushkin, A. B.; Sdvizhenskii, P. A.

    2017-12-01

    The results of an accuracy analysis of automodel solutions for Lévy flight-based transport on a uniform background are presented. These approximate solutions have been obtained for the Green's function of the following equations: the non-stationary Biberman-Holstein equation for three-dimensional (3D) radiative transfer in plasmas and gases, for various (Doppler, Lorentz, Voigt and Holtsmark) spectral line shapes, and the 1D transport equation with a simple long-tailed step-length probability distribution function with various power-law exponents. The results suggest the possibility of substantial extension of the developed method of automodel solutions to other fields far beyond physics.

  16. Web-Based Analysis for Student-Generated Complex Genetic Profiles

    ERIC Educational Resources Information Center

    Kass, David H.; LaRoe, Robert

    2007-01-01

    A simple, rapid method for generating complex genetic profiles using Alu-based markers was recently developed for students primarily at the undergraduate level to learn more about forensics and paternity analysis. On the basis of the Cold Spring Harbor Allele Server, which provides an excellent tool for analyzing a single Alu variant, we present a…

  17. Genetic diversity analysis of sugarcane germplasm based on fluorescence-labeled simple sequence repeat markers and a capillary electrophoresis-based genotyping platform

    USDA-ARS's Scientific Manuscript database

    Genetic diversity analysis, which refers to the elaboration of total extent of genetic characteristics in the genetic makeup of a certain species, constitutes a classical strategy for the study of diversity, population genetic structure, and breeding practices. In this study, fluorescence-labeled se...

  18. Validation of the replica trick for simple models

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.

  19. Risk-Based Sampling: I Don't Want to Weight in Vain.

    PubMed

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
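
    The estimation-error problem this record describes can be illustrated with a toy simulation (an assumed setup, not the paper's model): for two identically distributed assets the true minimum-variance weight is 0.5, but plug-in weights computed from short estimation windows scatter widely around it, which is why simple heuristics like equal allocation can win out of sample.

```python
import random
from statistics import mean, variance

random.seed(1)

def cov(x, y):
    """Sample covariance of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def min_var_weight(x, y):
    """Weight on the first asset in the two-asset minimum-variance portfolio."""
    vx, vy, c = variance(x), variance(y), cov(x, y)
    return (vy - c) / (vx + vy - 2 * c)

def sample(n):
    # i.i.d. returns; both assets identical, so the true optimal weight is 0.5
    return [random.gauss(0, 1) for _ in range(n)]

# Plug-in "optimal" weights from short vs. long estimation windows
short = [min_var_weight(sample(10), sample(10)) for _ in range(5)]
long_ = [min_var_weight(sample(5000), sample(5000)) for _ in range(5)]
# Short-window estimates scatter around 0.5; long windows stay close to it.
```

    The long stationary window needed to pin the weights down is exactly the "implausibly long stationary period" the abstract warns about.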

  20. A practical approach to language complexity: a Wikipedia case study.

    PubMed

    Yasseri, Taha; Kornai, András; Kertész, János

    2012-01-01

    In this paper we present a statistical analysis of English texts from Wikipedia. We try to address the issue of language complexity empirically by comparing the simple English Wikipedia (Simple) to comparable samples of the main English Wikipedia (Main). Simple is supposed to use a more simplified language with a limited vocabulary, and editors are explicitly requested to follow this guideline, yet in practice the vocabulary richness of both samples is at the same level. Detailed analysis of longer units (n-grams of words and part-of-speech tags) shows that the language of Simple is less complex than that of Main primarily due to the use of shorter sentences, as opposed to drastically simplified syntax or vocabulary. Comparing the two language varieties by the Gunning readability index supports this conclusion. We also report on the topical dependence of language complexity, that is, that the language is more advanced in conceptual articles compared to person-based (biographical) and object-based articles. Finally, we investigate the relation between conflict and language complexity by analyzing the content of the talk pages associated with controversial and peacefully developing articles, concluding that controversy has the effect of reducing language complexity.
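
    The Gunning readability index mentioned above combines average sentence length with the share of polysyllabic words: fog = 0.4 * (words per sentence + 100 * complex words / words). A minimal sketch, using a crude vowel-group syllable heuristic (an assumption; production readability tools use dictionaries), with invented sample sentences:

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if count_syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100 * len(complex_words) / len(words))

simple_text = "The cat sat. The dog ran. We saw them play."
dense_text = ("Multidimensional covariance analysis necessitates careful "
              "consideration of experimentally fluctuating parameters.")
print(gunning_fog(simple_text), gunning_fog(dense_text))
```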

  1. Determination of water pH using absorption-based optical sensors: evaluation of different calculation methods

    NASA Astrophysics Data System (ADS)

    Wang, Hongliang; Liu, Baohua; Ding, Zhongjun; Wang, Xiangxin

    2017-02-01

    Absorption-based optical sensors have been developed for the determination of water pH. In this paper, based on the preparation of a transparent sol-gel thin film with a phenol red (PR) indicator, several calculation methods, including simple linear regression analysis, quadratic regression analysis and dual-wavelength absorbance ratio analysis, were used to calculate water pH. Results of MSSRR show that dual-wavelength absorbance ratio analysis can improve the calculation accuracy of water pH in long-term measurement.
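
    A minimal sketch of the dual-wavelength absorbance ratio approach, assuming hypothetical absorbance readings at two phenol red bands: taking the ratio of the two absorbances suppresses common-mode drift (film thickness, source intensity), after which a calibration curve maps the ratio to pH. All values below are invented.

```python
# Hypothetical calibration data: absorbances of a phenol red film at two
# wavelengths for buffers of known pH.
buffers_pH = [6.0, 6.5, 7.0, 7.5, 8.0]
A_560 = [0.10, 0.18, 0.30, 0.45, 0.58]   # basic-form band grows with pH
A_430 = [0.60, 0.52, 0.40, 0.28, 0.20]   # acidic-form band shrinks

ratios = [a / b for a, b in zip(A_560, A_430)]

def linear_fit(x, y):
    """Ordinary least squares for y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

slope, intercept = linear_fit(ratios, buffers_pH)
unknown_ratio = 0.75                      # ratio measured for an unknown
pred = slope * unknown_ratio + intercept
print("estimated pH:", round(pred, 2))
```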

  2. Theory and simulations of covariance mapping in multiple dimensions for data analysis in high-event-rate experiments

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.

    2014-05-01

    Multidimensional covariance analysis and its validity for correlation of processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters which fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
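
    The shot-to-shot correction described here can be illustrated with the standard two-variable partial covariance formula, pcov(X, Y; I) = cov(X, Y) - cov(X, I) cov(I, Y) / cov(I, I). The simulated "pulse intensity" fluctuations below are illustrative: X and Y are physically independent but both scale with I, producing a false correlation that the partial covariance removes.

```python
import random

random.seed(0)

def cov(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n

# Simulated shots: independent signals X and Y, both scaled by a
# fluctuating pulse intensity I -> spurious X-Y correlation.
shots = 20000
I = [random.uniform(0.5, 1.5) for _ in range(shots)]
X = [i * random.gauss(1.0, 0.2) for i in I]
Y = [i * random.gauss(2.0, 0.2) for i in I]

plain = cov(X, Y)
partial = cov(X, Y) - cov(X, I) * cov(I, Y) / cov(I, I)
print(plain, partial)   # the partial covariance is much closer to zero
```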

  3. Simple Electrolyzer Model Development for High-Temperature Electrolysis System Analysis Using Solid Oxide Electrolysis Cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JaeHwa Koh; DuckJoo Yoon; Chang H. Oh

    2010-07-01

    An electrolyzer model for the analysis of a hydrogen-production system using a solid oxide electrolysis cell (SOEC) has been developed, and the effects of the principal parameters have been estimated by sensitivity studies based on the developed model. The main parameters considered are current density, area-specific resistance, temperature, pressure, and molar fractions and flow rates at the inlet and outlet. Finally, a simple model for a high-temperature hydrogen-production system using the solid oxide electrolysis cell integrated with very high temperature reactors is evaluated.

  4. A Simple Method Based on the Application of a CCD Camera as a Sensor to Detect Low Concentrations of Barium Sulfate in Suspension

    PubMed Central

    de Sena, Rodrigo Caciano; Soares, Matheus; Pereira, Maria Luiza Oliveira; da Silva, Rogério Cruz Domingues; do Rosário, Francisca Ferreira; da Silva, Joao Francisco Cajaiba

    2011-01-01

    The development of a simple, rapid and low cost method based on video image analysis and aimed at the detection of low concentrations of precipitated barium sulfate is described. The proposed system is basically composed of a webcam with a CCD sensor and a conventional dichroic lamp. For this purpose, software for processing and analyzing the digital images based on the RGB (Red, Green and Blue) color system was developed. The proposed method showed very good repeatability and linearity and also presented higher sensitivity than the standard turbidimetric method. The developed method is presented as a simple alternative for future applications in the study of precipitation of inorganic salts and also for detecting the crystallization of organic compounds. PMID:22346607
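
    The core image-processing step, averaging the RGB channels over a region of interest, reduces each frame to three numbers that can then be calibrated against precipitate concentration. A toy version on a synthetic 4x4 "frame" (a real implementation would read frames from the webcam; all pixel values below are invented):

```python
def channel_means(frame):
    """Mean R, G, B over all pixels of a frame given as rows of (R, G, B)."""
    pixels = [p for row in frame for p in row]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

# Synthetic frames: a blank cuvette and a suspension scattering more light.
blank = [[(40, 40, 40)] * 4 for _ in range(4)]
sample = [[(90, 88, 86)] * 4 for _ in range(4)]

r0, g0, b0 = channel_means(blank)
r1, g1, b1 = channel_means(sample)
print("intensity increase (R, G, B):", r1 - r0, g1 - g0, b1 - b0)
```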

  5. Sublaminate analysis of interlaminar fracture in composites

    NASA Technical Reports Server (NTRS)

    Armanios, E. A.; Rehfield, L. W.

    1986-01-01

    A simple analysis method based upon a transverse shear deformation theory and a sublaminate approach is utilized to analyze a mixed-mode edge delamination specimen. The analysis provides closed form expressions for the interlaminar shear stresses ahead of the crack, the total energy release rate, and the energy release rate components. The parameters controlling the behavior are identified. The effect of specimen stacking sequence and delamination interface on the strain energy release rate components is investigated. Results are compared with a finite element simulation for reference. The simple nature of the method makes it suitable for preliminary design analyses which require a large number of configurations to be evaluated quickly and economically.

  6. Validation of PC-based Sound Card with Biopac for Digitalization of ECG Recording in Short-term HRV Analysis.

    PubMed

    Maheshkumar, K; Dilara, K; Maruthy, K N; Sundareswaren, L

    2016-07-01

    Heart rate variability (HRV) analysis is a simple and noninvasive technique capable of assessing autonomic nervous system modulation on heart rate (HR) in healthy as well as disease conditions. The aim of the present study was to compare (validate) the HRV using a temporal series of electrocardiograms (ECG) obtained by a simple analog amplifier with a PC-based sound card (Audacity) and a Biopac MP36 module. Based on the inclusion criteria, 120 healthy participants, including 72 males and 48 females, participated in the present study. Following standard protocol, 5-min ECG was recorded after 10 min of supine rest by the portable simple analog amplifier with PC-based sound card as well as by the Biopac module with surface electrodes in the Lead II position simultaneously. All the ECG data were visually screened and found to be free of ectopic beats and noise. RR intervals from both ECG recordings were analyzed separately in Kubios software. Short-term HRV indexes in both time and frequency domain were used. The unpaired Student's t-test and Pearson correlation coefficient test were used for the analysis using the R statistical software. No statistically significant differences were observed when comparing the values analyzed by means of the two devices for HRV. Correlation analysis revealed a near-perfect positive correlation (r = 0.99, P < 0.001) between the values in time and frequency domain obtained by the devices. On the basis of the results of the present study, we suggest that the calculation of HRV values in the time and frequency domains by RR series obtained from the PC-based sound card is probably as reliable as those obtained by the gold standard Biopac MP36.
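
    Two of the standard time-domain HRV indexes computed from an RR-interval series (the study also used frequency-domain measures, which need spectral estimation and are omitted here; the RR values below are invented):

```python
import math

def sdnn(rr_ms):
    """SDNN: standard deviation of RR intervals (ms)."""
    m = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((r - m) ** 2 for r in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

rr = [812, 790, 805, 830, 818, 795, 801, 822]  # illustrative RR series (ms)
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```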

  7. Nondeducibility-Based Analysis of Cyber-Physical Systems

    NASA Astrophysics Data System (ADS)

    Gamage, Thoshitha; McMillin, Bruce

    Controlling information flow in a cyber-physical system (CPS) is challenging because cyber domain decisions and actions manifest themselves as visible changes in the physical domain. This paper presents a nondeducibility-based observability analysis for CPSs. In many CPSs, the capacity of a low-level (LL) observer to deduce high-level (HL) actions ranges from limited to none. However, a collaborative set of observers strategically located in a network may be able to deduce all the HL actions. This paper models a distributed power electronics control device network using a simple DC circuit in order to understand the effect of multiple observers in a CPS. The analysis reveals that the number of observers required to deduce all the HL actions in a system increases linearly with the number of configurable units. A simple definition of nondeducibility based on the uniqueness of low-level projections is also presented. This definition is used to show that a system with two security domain levels could be considered “nondeducibility secure” if no unique LL projections exist.

  8. A Simple Plasma Retinol Isotope Ratio Method for Estimating β-Carotene Relative Bioefficacy in Humans: Validation with the Use of Model-Based Compartmental Analysis.

    PubMed

    Ford, Jennifer Lynn; Green, Joanne Balmer; Lietz, Georg; Oxley, Anthony; Green, Michael H

    2017-09-01

    Background: Provitamin A carotenoids are an important source of dietary vitamin A for many populations. Thus, accurate and simple methods for estimating carotenoid bioefficacy are needed to evaluate the vitamin A value of test solutions and plant sources. β-Carotene bioefficacy is often estimated from the ratio of the areas under plasma isotope response curves after subjects ingest labeled β-carotene and a labeled retinyl acetate reference dose [isotope reference method (IRM)], but to our knowledge, the method has not yet been evaluated for accuracy. Objectives: Our objectives were to develop and test a physiologically based compartmental model that includes both absorptive and postabsorptive β-carotene bioconversion and to use the model to evaluate the accuracy of the IRM and a simple plasma retinol isotope ratio [(RIR), labeled β-carotene-derived retinol/labeled reference-dose-derived retinol in one plasma sample] for estimating relative bioefficacy. Methods: We used model-based compartmental analysis (Simulation, Analysis and Modeling software) to develop and apply a model that provided known values for β-carotene bioefficacy. Theoretical data for 10 subjects were generated by the model and used to determine bioefficacy by RIR and IRM; predictions were compared with known values. We also applied RIR and IRM to previously published data. Results: Plasma RIR accurately predicted β-carotene relative bioefficacy at 14 d or later. IRM also accurately predicted bioefficacy by 14 d, except that, when there was substantial postabsorptive bioconversion, IRM underestimated bioefficacy. Based on our model, 1-d predictions of relative bioefficacy include absorptive plus a portion of early postabsorptive conversion. Conclusion: The plasma RIR is a simple tracer method that accurately predicts β-carotene relative bioefficacy based on analysis of one blood sample obtained at ≥14 d after co-ingestion of labeled β-carotene and retinyl acetate. 
The method also provides information about the contributions of absorptive and postabsorptive conversion to total bioefficacy if an additional sample is taken at 1 d. © 2017 American Society for Nutrition.

  9. Simple 2.5 GHz time-bin quantum key distribution

    NASA Astrophysics Data System (ADS)

    Boaron, Alberto; Korzh, Boris; Houlmann, Raphael; Boso, Gianluca; Rusca, Davide; Gray, Stuart; Li, Ming-Jun; Nolan, Daniel; Martin, Anthony; Zbinden, Hugo

    2018-04-01

    We present a 2.5 GHz quantum key distribution setup with the emphasis on a simple experimental realization. It features a three-state time-bin protocol based on a pulsed diode laser and a single intensity modulator. Implementing an efficient one-decoy scheme and finite-key analysis, we achieve record-breaking secret key rates of 1.5 kbps over 200 km of standard optical fibers.

  10. A simple method of fabricating mask-free microfluidic devices for biological analysis

    PubMed Central

    Yi, Xin; Kodzius, Rimantas; Gong, Xiuqing; Xiao, Kang; Wen, Weijia

    2010-01-01

    We report a simple, low-cost, rapid, and mask-free method to fabricate two-dimensional (2D) and three-dimensional (3D) microfluidic chips for biological analysis research. In this fabrication process, a laser system is used to cut through paper to form intricate patterns and differently configured channels for specific purposes. Bonded with cyanoacrylate-based resin, the prepared paper sheet is sandwiched between glass slides (hydrophilic) or polymer-based plates (hydrophobic) to obtain a multilayer structure. In order to examine the chip’s biocompatibility and applicability, protein concentration was measured while DNA capillary electrophoresis was carried out, and both showed positive results. With the utilization of direct laser cutting and one-step gas-sacrificing techniques, the whole fabrication process for complicated 2D and 3D microfluidic devices is shortened to several minutes, which makes it a good alternative to poly(dimethylsiloxane) microfluidic chips used in biological analysis research. PMID:20890452

  11. Simulations of multi-contrast x-ray imaging using near-field speckles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zdora, Marie-Christine; Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom and Department of Physics & Astronomy, University College London, London, WC1E 6BT; Thibault, Pierre

    2016-01-28

    X-ray dark-field and phase-contrast imaging using near-field speckles is a novel technique that overcomes limitations inherent in conventional absorption x-ray imaging, i.e. poor contrast for features with similar density. Speckle-based imaging yields a wealth of information with a simple setup tolerant to polychromatic and divergent beams, and simple data acquisition and analysis procedures. Here, we present a simulation software used to model the image formation with the speckle-based technique, and we compare simulated results on a phantom sample with experimental synchrotron data. Thorough simulation of a speckle-based imaging experiment will help for better understanding and optimising the technique itself.

  12. Analysis of Levene's Test under Design Imbalance.

    ERIC Educational Resources Information Center

    Keyes, Tim K.; Levy, Martin S.

    1997-01-01

    H. Levene (1960) proposed a heuristic test for heteroscedasticity in the case of a balanced two-way layout, based on analysis of variance of absolute residuals. Conditions under which design imbalance affects the test's characteristics are identified, and a simple correction involving leverage is proposed. (SLD)
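
    Levene's construction is just a one-way ANOVA performed on the absolute deviations of each observation from its group mean. A compact sketch for the balanced case (toy data; real implementations such as scipy.stats.levene also offer median-centered variants):

```python
def levene_F(groups):
    """Levene's statistic: one-way ANOVA F on absolute deviations
    from each group's mean."""
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    k = len(z)
    N = sum(len(g) for g in z)
    grand = sum(sum(g) for g in z) / N
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in z)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in z)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

equal_var = [[1, 2, 3, 4], [2, 3, 4, 5]]       # same spread, shifted mean
unequal_var = [[1, 2, 3, 4], [-5, 0, 5, 10]]   # second group more spread
print(levene_F(equal_var), levene_F(unequal_var))
```

    Note that shifting a group's mean leaves the statistic untouched (first pair gives F = 0); only a difference in spread inflates it, which is exactly the heteroscedasticity being tested.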

  13. Using high speed smartphone cameras and video analysis techniques to teach mechanical wave physics

    NASA Astrophysics Data System (ADS)

    Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano

    2017-07-01

    We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses along a spring and the period of transverse standing waves generated in the same spring. These experiments can be helpful in addressing several relevant concepts about the physics of mechanical waves and in overcoming some of the typical student misconceptions in this same field.
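
    The pulse-speed measurement reduces to reading the pulse position off two frames and dividing by the elapsed time implied by the frame rate. The capture rate and positions below are hypothetical:

```python
# Estimating pulse speed along a spring from slow-motion video frames.
fps = 240                      # slow-motion capture rate, frames/s (assumed)
frame_a, x_a = 12, 0.15        # frame index, pulse position (m)
frame_b, x_b = 60, 1.11

elapsed = (frame_b - frame_a) / fps    # seconds between the two frames
speed = (x_b - x_a) / elapsed
print(round(speed, 2), "m/s")
```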

  14. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (puc 19 plasmid from E. coli). DN...

  15. Deviation from the mean in teaching uncertainties

    NASA Astrophysics Data System (ADS)

    Budini, N.; Giorgi, S.; Sarmiento, L. M.; Cámara, C.; Carreri, R.; Gómez Carrillo, S. C.

    2017-07-01

    In this work we present two simple and interactive web-based activities for introducing students to the concepts of uncertainties in measurements. These activities are based on the real-time construction of histograms from students' measurements and their subsequent analysis through an active and dynamic approach.
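
    A minimal sketch of such an activity: binning repeated measurements of one quantity into a text histogram and reporting the mean and standard deviation (the readings below are invented):

```python
import statistics
from collections import Counter

# Repeated measurements of one quantity (invented values, in seconds)
data = [2.31, 2.28, 2.35, 2.30, 2.29, 2.33, 2.31, 2.27, 2.32, 2.30]

width = 0.02                      # bin width for a simple text histogram
lo0 = min(data)
bins = Counter(int((x - lo0 + 1e-9) // width) for x in data)
for b in sorted(bins):
    lo = lo0 + b * width
    print(f"{lo:.2f}-{lo + width:.2f} | {'#' * bins[b]}")

print("mean =", round(statistics.mean(data), 3),
      "std =", round(statistics.stdev(data), 3))
```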

  16. A minimally sufficient model for rib proximal-distal patterning based on genetic analysis and agent-based simulations

    PubMed Central

    Mah, In Kyoung

    2017-01-01

    For decades, the mechanism of skeletal patterning along a proximal-distal axis has been an area of intense inquiry. Here, we examine the development of the ribs, simple structures that in most terrestrial vertebrates consist of two skeletal elements—a proximal bone and a distal cartilage portion. While the ribs have been shown to arise from the somites, little is known about how the two segments are specified. During our examination of genetically modified mice, we discovered a series of progressively worsening phenotypes that could not be easily explained. Here, we combine genetic analysis of rib development with agent-based simulations to conclude that proximal-distal patterning and outgrowth could occur based on simple rules. In our model, specification occurs during somite stages due to varying Hedgehog protein levels, while later expansion refines the pattern. This framework is broadly applicable for understanding the mechanisms of skeletal patterning along a proximal-distal axis. PMID:29068314

  17. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
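
    The DerSimonian-Laird random-effects pooling mentioned above can be sketched directly: estimate the between-study variance tau-squared from Cochran's Q, then pool with inverse-variance weights that include it. The effect sizes and variances below are invented for illustration (the Knapp-Hartung adjustment of the confidence interval is omitted):

```python
effects   = [0.30, 0.45, 0.10, 0.60]   # per-study effect sizes (invented)
variances = [0.02, 0.03, 0.01, 0.04]   # per-study sampling variances

w = [1 / v for v in variances]                       # fixed-effect weights
fe = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
Q = sum(wi * (e - fe) ** 2 for wi, e in zip(w, effects))
df = len(effects) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / c)                        # between-study variance

w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
print(round(tau2, 4), round(pooled, 3))
```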

  18. The vulnerability of electric equipment to carbon fibers of mixed lengths: An analysis

    NASA Technical Reports Server (NTRS)

    Elber, W.

    1980-01-01

    The susceptibility of a stereo amplifier to damage from a spectrum of lengths of graphite fibers was calculated. A simple analysis was developed by which such calculations can be based on test results with fibers of uniform lengths. A statistical analysis was applied for the conversion of data for various logical failure criteria.

  19. Impact of an engineering design-based curriculum compared to an inquiry-based curriculum on fifth graders' content learning of simple machines

    NASA Astrophysics Data System (ADS)

    Marulcu, Ismail; Barnett, Michael

    2016-01-01

    Background: Elementary Science Education is struggling with multiple challenges. National and State test results confirm the need for deeper understanding in elementary science education. Moreover, national policy statements and researchers call for increased exposure to engineering and technology in elementary science education. The basic motivation of this study is to suggest a solution to both improving elementary science education and increasing exposure to engineering and technology in it. Purpose/Hypothesis: This mixed-method study examined the impact of an engineering design-based curriculum compared to an inquiry-based curriculum on fifth graders' content learning of simple machines. We hypothesize that the LEGO-engineering design unit is as successful as the inquiry-based unit in terms of students' science content learning of simple machines. Design/Method: We used a mixed-methods approach to investigate our research questions; we compared the control and the experimental groups' scores from the tests and interviews by using Analysis of Covariance (ANCOVA) and compared each group's pre- and post-scores by using paired t-tests. Results: Our findings from the paired t-tests show that both the experimental and comparison groups significantly improved their scores from the pre-test to post-test on the multiple-choice, open-ended, and interview items. Moreover, ANCOVA results show that students in the experimental group, who learned simple machines with the design-based unit, performed significantly better on the interview questions. Conclusions: Our analyses revealed that the design-based Design a people mover: Simple machines unit was, if not better, as successful as the inquiry-based FOSS Levers and pulleys unit in terms of students' science content learning.

  20. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

    The use of fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during experimental immune response in vivo. Optical fluorescence imaging provides a very useful and simple tool for this purpose. The motivation comes from the necessity of a robust and simple quantification and data presentation of inflammation based on vascular permeability. The change of fluorescence intensity as a function of time is a widely accepted method to assess vascular permeability during inflammation related to the immune response. In the present study we propose to bring a new dimension by applying a more sophisticated approach to the analysis of the vascular reaction, using quantitative methods derived from astronomical observations, in particular space-time Fourier filtering followed by a polynomial orthogonal mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with the blood flow circulation under normal conditions. The approach allows one to determine the regions of permeability and to monitor both the fast kinetics related to the contrast material distribution in the circulatory system and the slow kinetics associated with extravasation of the contrast material. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.

  1. Webcam camera as a detector for a simple lab-on-chip time based approach.

    PubMed

    Wongwilai, Wasin; Lapanantnoppakhun, Somchai; Grudpan, Supara; Grudpan, Kate

    2010-05-15

    A modification of a webcam camera for use as a small and low-cost detector was demonstrated with a simple lab-on-chip reactor. Real-time continuous monitoring of the reaction zone could be done. Acid-base neutralization with phenolphthalein indicator was used as a model reaction. The fading of the pink color of the indicator as the acidic solution diffused into the basic solution zone was recorded as the change of red, green and blue colors (%RGB). The change was related to acid concentration. A low-cost portable semi-automated analysis system was achieved.
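
    The %RGB tracking step amounts to reducing each frame to mean channel values and following a color metric over time. A toy series for the fading pink (all frame values below are invented; a real system would extract them from webcam frames):

```python
# Mean (R, G, B) of the reaction zone per frame as the pink color fades.
frames = [
    (231, 120, 150),
    (228, 135, 158),
    (226, 160, 172),
    (225, 190, 195),
    (224, 218, 219),  # pink gone: the three channels converge
]

def pink_strength(r, g, b):
    """Crude measure of pink: excess of red over the other two channels."""
    return r - (g + b) / 2

series = [pink_strength(*f) for f in frames]
print(series)  # decreases toward zero as the indicator fades
```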

  2. Time-domain representation of frequency-dependent foundation impedance functions

    USGS Publications Warehouse

    Safak, E.

    2006-01-01

    Foundation impedance functions provide a simple means to account for soil-structure interaction (SSI) when studying seismic response of structures. Impedance functions represent the dynamic stiffness of the soil media surrounding the foundation. The fact that impedance functions are frequency dependent makes it difficult to incorporate SSI in standard time-history analysis software. This paper introduces a simple method to convert frequency-dependent impedance functions into time-domain filters. The method is based on the least-squares approximation of impedance functions by ratios of two complex polynomials. Such ratios are equivalent, in the time-domain, to discrete-time recursive filters, which are simple finite-difference equations giving the relationship between foundation forces and displacements. These filters can easily be incorporated into standard time-history analysis programs. Three examples are presented to show the applications of the method.
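
    The paper's filters come from least-squares fitting of impedance data; independent of that fitting step, applying such a filter is a plain difference equation relating input and output samples. The coefficients below are illustrative (a simple one-pole low-pass), not fitted to any real foundation impedance:

```python
def recursive_filter(x, b, a):
    """Apply a0*y[n] = sum_k b[k]*x[n-k] - sum_{k>=1} a[k]*y[n-k]."""
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y.append(acc / a[0])
    return y

b, a = [0.2, 0.2], [1.0, -0.6]     # illustrative one-pole low-pass
step = [1.0] * 20                  # unit-step input
y = recursive_filter(step, b, a)
print(round(y[-1], 3))             # steady-state gain (b0+b1)/(a0+a1) = 1.0
```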

  3. Determining Phylogenetic Relationships Among Date Palm Cultivars Using Random Amplified Polymorphic DNA (RAPD) and Inter-Simple Sequence Repeat (ISSR) Markers.

    PubMed

    Haider, Nadia

    2017-01-01

    Investigation of genetic variation and phylogenetic relationships among date palm (Phoenix dactylifera L.) cultivars is useful for their conservation and genetic improvement. Various molecular markers such as restriction fragment length polymorphisms (RFLPs), simple sequence repeat (SSR), representational difference analysis (RDA), and amplified fragment length polymorphism (AFLP) have been developed to molecularly characterize date palm cultivars. PCR-based markers random amplified polymorphic DNA (RAPD) and inter-simple sequence repeat (ISSR) are powerful tools to determine the relatedness of date palm cultivars that are difficult to distinguish morphologically. In this chapter, the principles, materials, and methods of RAPD and ISSR techniques are presented. Analysis of data generated from these two techniques and the use of these data to reveal phylogenetic relationships among date palm cultivars are also discussed.

  4. Implementing Computer Based Laboratories

    NASA Astrophysics Data System (ADS)

    Peterson, David

    2001-11-01

    Physics students at Francis Marion University will complete several required laboratory exercises utilizing computer-based Vernier probes. The simple pendulum, the acceleration due to gravity, simple harmonic motion, radioactive half-lives, and radiation inverse-square law experiments will be incorporated into calculus-based and algebra-based physics courses. Assessment of student learning and faculty satisfaction will be carried out by surveys and test results. Cost-effectiveness and time-effectiveness assessments will be presented. Majors in Computational Physics, Health Physics, Engineering, Chemistry, Mathematics and Biology take these courses, and assessments will be categorized by major. To enhance the computer skills of students enrolled in the courses, MAPLE will be used for further analysis of the data acquired during the experiments. Assessment of these enhancement exercises will also be presented.

  5. Simple Rules, Not So Simple: The Use of International Ovarian Tumor Analysis (IOTA) Terminology and Simple Rules in Inexperienced Hands in a Prospective Multicenter Cohort Study.

    PubMed

    Meys, Evelyne; Rutten, Iris; Kruitwagen, Roy; Slangen, Brigitte; Lambrechts, Sandrina; Mertens, Helen; Nolting, Ernst; Boskamp, Dieuwke; Van Gorp, Toon

    2017-12-01

    To analyze how well untrained examiners - without experience in the use of International Ovarian Tumor Analysis (IOTA) terminology or simple ultrasound-based rules (simple rules) - are able to apply IOTA terminology and simple rules and to assess the level of agreement between non-experts and an expert. This prospective multicenter cohort study enrolled women with ovarian masses. Ultrasound was performed by non-expert examiners and an expert. Ultrasound features were recorded using IOTA nomenclature, and used for classifying the mass by simple rules. Interobserver agreement was evaluated with Fleiss' kappa and percentage agreement between observers. Fifty consecutive women were included. We observed 46 discrepancies in the description of ovarian masses when non-experts utilized IOTA terminology. Tumor type was misclassified often (n = 22), resulting in poor interobserver agreement between the non-experts and the expert (kappa = 0.39, 95 %-CI 0.244 - 0.529, percentage of agreement = 52.0 %). Misinterpretation of simple rules by non-experts was observed 57 times, resulting in an erroneous diagnosis in 15 patients (30 %). The agreement for classifying the mass as benign, malignant or inconclusive by simple rules was only moderate between the non-experts and the expert (kappa = 0.50, 95 %-CI 0.300 - 0.704, percentage of agreement = 70.0 %). The level of agreement for all 10 simple rules features varied greatly (kappa index range: -0.08 - 0.74, percentage of agreement 66 - 94 %). Although simple rules are useful to distinguish benign from malignant adnexal masses, they are not that simple for untrained examiners. Training with both IOTA terminology and simple rules is necessary before simple rules can be introduced into guidelines and daily clinical practice. © Georg Thieme Verlag KG Stuttgart · New York.
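
    For a single non-expert rated against the expert, the chance-corrected agreement reported here can be illustrated with Cohen's kappa for two raters (the study itself used Fleiss' kappa; the ratings below are invented):

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters over the same cases."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n      # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(c1) | set(c2)
    pe = sum(c1[c] / n * c2[c] / n for c in cats)     # chance agreement
    return (po - pe) / (1 - pe)

expert    = ["benign", "malignant", "benign", "inconclusive", "benign",
             "malignant", "benign", "benign", "malignant", "benign"]
nonexpert = ["benign", "benign", "benign", "inconclusive", "malignant",
             "malignant", "benign", "benign", "benign", "benign"]
print(round(cohen_kappa(expert, nonexpert), 2))
```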

  6. Stoogiometry: A Cognitive Approach to Teaching Stoichiometry.

    ERIC Educational Resources Information Center

    Krieger, Carla R.

    1997-01-01

    Describes the use of Moe's Mall, a locational device designed to be used by learners, as a simple algorithm for solving mole-based exercises efficiently and accurately using dimensional analysis. (DDR)

  7. Comparison of estimators for rolling samples using Forest Inventory and Analysis data

    Treesearch

    Devin S. Johnson; Michael S. Williams; Raymond L. Czaplewski

    2003-01-01

    The performance of three classes of weighted average estimators is studied for an annual inventory design similar to the Forest Inventory and Analysis program of the United States. The first class is based on an ARIMA(0,1,1) time series model. The equal weight, simple moving average is a member of this class. The second class is based on an ARIMA(0,2,2) time series...
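
    The equal-weight moving average and its exponentially weighted relative can be sketched side by side (under an ARIMA(0,1,1) model, the optimal one-step weighting corresponds to exponential smoothing). The annual estimates and smoothing constant below are invented:

```python
def simple_moving_average(series, window):
    """Equal-weight average of the last `window` annual estimates."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

def exponential_smoothing(series, alpha):
    """Exponentially weighted average; recent panels get weight alpha."""
    out = [series[0]]
    for x in series[1:]:
        out.append(alpha * x + (1 - alpha) * out[-1])
    return out

volumes = [100, 104, 103, 108, 112]   # illustrative annual estimates
print(simple_moving_average(volumes, 3))
print([round(v, 1) for v in exponential_smoothing(volumes, 0.4)])
```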

  8. The Adversarial Route Analysis Tool: A Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  9. Small Oscillations via Conservation of Energy

    ERIC Educational Resources Information Center

    Troy, Tia; Reiner, Megan; Haugen, Andrew J.; Moore, Nathan T.

    2017-01-01

    The work describes an analogy-based small oscillations analysis of a standard static equilibrium lab problem. In addition to force analysis, a potential energy function for the system is developed, and by drawing out mathematical similarities to the simple harmonic oscillator, we are able to describe (and experimentally verify) the period of small…
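
    The core of such an analysis is that the period of small oscillations follows from the curvature of the potential energy function at equilibrium, T = 2*pi*sqrt(m / U''(x0)). A minimal numerical sketch, checked against the simple harmonic oscillator rather than the lab system in the paper:

```python
import math

def period_small_oscillations(U, x0, m, h=1e-4):
    """Period of small oscillations about a potential minimum x0:
    T = 2*pi*sqrt(m / U''(x0)), with U'' taken by central difference."""
    k_eff = (U(x0 + h) - 2.0 * U(x0) + U(x0 - h)) / h**2
    return 2.0 * math.pi * math.sqrt(m / k_eff)

# Sanity check against the simple harmonic oscillator U(x) = 0.5*k*x^2,
# whose exact period is 2*pi*sqrt(m/k)
k, m = 4.0, 1.0
T = period_small_oscillations(lambda x: 0.5 * k * x**2, x0=0.0, m=m)
```

    The same function applied to any other potential with a minimum (for example, the energy function developed for the static equilibrium setup) gives the small-oscillation period directly.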

  10. Factor Analysis for Clustered Observations.

    ERIC Educational Resources Information Center

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)

  11. Simple BiCMOS CCCTA design and resistorless analog function realization.

    PubMed

    Tangsrirat, Worapong

    2014-01-01

    The simple realization of the current-controlled conveyor transconductance amplifier (CCCTA) in BiCMOS technology is introduced. The proposed BiCMOS CCCTA realization is based on the use of a differential pair and a basic current mirror, which results in a simple structure. Its characteristics, that is, the parasitic resistance (Rx) and current transfer (io/iz), are electronically tunable by external bias currents. The realized circuit is suitable for fabrication using standard 0.35 μm BiCMOS technology. Some simple and compact resistorless applications employing the proposed CCCTA as the active element are also suggested, showing that their circuit characteristics are electronically controllable. PSPICE simulations demonstrating the circuit behaviors and confirming the theoretical analysis are presented.

  12. Inquiry-Based Pre-Engineering Activities for K-4 Students

    ERIC Educational Resources Information Center

    Perrin, Michele

    2004-01-01

    This paper uses inquiry-based learning to introduce primary students to the concepts and terminology found in four introductory engineering courses: Differential Equations, Circuit Analysis, Thermodynamics, and Dynamics. Simple electronic sensors coupled with everyday objects, such as a troll doll, demonstrate and reinforce the physical principles…

  13. Genetic diversity studies and identification of SSR markers associated with Fusarium wilt (Fusarium udum) resistance in cultivated pigeonpea (Cajanus cajan).

    PubMed

    Singh, A K; Rai, V P; Chand, R; Singh, R P; Singh, M N

    2013-01-01

    Genetic diversity analysis and identification of simple sequence repeat (SSR) markers associated with Fusarium wilt resistance were performed in a set of 36 elite cultivated pigeonpea genotypes differing in levels of resistance to Fusarium wilt. Twenty-four polymorphic SSR markers were screened across these genotypes and amplified a total of 59 alleles, with a high average polymorphic information content value of 0.52. Cluster analysis, performed by UPGMA and PCA, grouped the 36 pigeonpea genotypes into two main clusters according to their Fusarium wilt reaction. Based on Kruskal-Wallis ANOVA and simple regression analysis, six SSR markers were found to be significantly associated with Fusarium wilt resistance. The phenotypic variation explained by these markers ranged from 23.7 to 56.4%. The present study demonstrates the feasibility of using prescreened SSR markers in genetic diversity analysis and their potential association with disease resistance.

  14. A simple strategy for varying the restart parameter in GMRES(m)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, A H; Jessup, E R; Kolev, T V

    2007-10-02

    When solving a system of linear equations with the restarted GMRES method, a fixed restart parameter is typically chosen. We present numerical experiments that demonstrate the beneficial effects of changing the value of the restart parameter in each restart cycle on the total time to solution. We propose a simple strategy for varying the restart parameter and provide some heuristic explanations for its effectiveness based on analysis of the symmetric case.
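
    A varying-restart cycle is easy to prototype. The following sketch implements textbook restarted GMRES with a simple cyclic schedule for m; the schedule and the test matrix are hypothetical choices for illustration, not the strategy proposed in the paper.

```python
import numpy as np

def gmres_cycle(A, b, x0, m):
    """One GMRES(m) cycle: Arnoldi process plus a small least-squares solve."""
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    n = b.size
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = r0 / beta
    k = m
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):            # modified Gram-Schmidt orthogonalization
            H[i, j] = Q[:, i] @ v
            v = v - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-14:           # happy breakdown: solution lies in span
            k = j + 1
            break
        Q[:, j + 1] = v / H[j + 1, j]
    e1 = np.zeros(k + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:k + 1, :k], e1, rcond=None)
    return x0 + Q[:, :k] @ y

def gmres_varying_restart(A, b, schedule, tol=1e-10, max_cycles=200):
    """Restarted GMRES whose restart parameter follows a cyclic schedule."""
    x = np.zeros_like(b)
    for cycle in range(max_cycles):
        m = schedule[cycle % len(schedule)]
        x = gmres_cycle(A, b, x, m)
        if np.linalg.norm(b - A @ x) < tol * np.linalg.norm(b):
            break
    return x

rng = np.random.default_rng(0)
n = 60
A = np.eye(n) + 0.05 * rng.standard_normal((n, n))  # well-conditioned test matrix
b = rng.standard_normal(n)
x = gmres_varying_restart(A, b, schedule=[4, 8, 12])
```

    Where SciPy is available, `scipy.sparse.linalg.gmres` provides a production implementation with a fixed `restart` parameter.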

  15. IOTA simple rules in differentiating between benign and malignant ovarian tumors.

    PubMed

    Tantipalakorn, Charuwan; Wanapirak, Chanane; Khunamornpong, Surapan; Sukpan, Kornkanok; Tongsong, Theera

    2014-01-01

    To evaluate the diagnostic performance of IOTA simple rules in differentiating between benign and malignant ovarian tumors. A study of diagnostic performance was conducted on women scheduled for elective surgery due to ovarian masses between March 2007 and March 2012. All patients underwent ultrasound examination for IOTA simple rules within 24 hours of surgery. All examinations were performed by the authors, who were blinded to the patients' clinical information, to differentiate between benign and malignant adnexal masses using IOTA simple rules. The gold standard diagnosis was based on pathological or operative findings. A total of 398 adnexal masses, in 376 women, were available for analysis. Of these, the IOTA simple rules could be applied in 319 (80.1%), comprising 212 (66.5%) benign tumors and 107 (33.6%) malignant tumors. The simple rules yielded inconclusive results in 79 (19.9%) masses. In the 319 masses to which the IOTA simple rules could be applied, sensitivity was 82.9% and specificity 95.3%. The IOTA simple rules have high diagnostic performance in differentiating between benign and malignant adnexal masses. Nevertheless, inconclusive results are relatively common.
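
    Sensitivity and specificity of the kind reported above are computed only over the masses with conclusive results. A minimal sketch on hypothetical labels (not the study's data):

```python
def diagnostic_performance(truth, prediction, positive="malignant"):
    """Sensitivity and specificity from paired gold-standard and test labels."""
    pairs = list(zip(truth, prediction))
    tp = sum(t == positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    tn = sum(t != positive and p != positive for t, p in pairs)
    fp = sum(t != positive and p == positive for t, p in pairs)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical outcomes for ten conclusive masses (not the study's data)
truth = ["malignant"] * 5 + ["benign"] * 5
pred = ["malignant", "malignant", "malignant", "benign", "malignant",
        "benign", "benign", "benign", "malignant", "benign"]
sensitivity, specificity = diagnostic_performance(truth, pred)
```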

  16. An Evidence-Based Videotaped Running Biomechanics Analysis.

    PubMed

    Souza, Richard B

    2016-02-01

    Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices and offers the best opportunity to impact the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. A simple model for indentation creep

    NASA Astrophysics Data System (ADS)

    Ginder, Ryan S.; Nix, William D.; Pharr, George M.

    2018-03-01

    A simple model for indentation creep is developed that allows one to directly convert creep parameters measured in indentation tests to those observed in uniaxial tests through simple closed-form relationships. The model is based on the expansion of a spherical cavity in a power law creeping material modified to account for indentation loading in a manner similar to that developed by Johnson for elastic-plastic indentation (Johnson, 1970). Although only approximate in nature, the simple mathematical form of the new model makes it useful for general estimation purposes or in the development of other deformation models in which a simple closed-form expression for the indentation creep rate is desirable. Comparison to a more rigorous analysis which uses finite element simulation for numerical evaluation shows that the new model predicts uniaxial creep rates within a factor of 2.5, and usually much better than this, for materials creeping with stress exponents in the range 1 ≤ n ≤ 7. The predictive capabilities of the model are evaluated by comparing it to the more rigorous analysis and several sets of experimental data in which both the indentation and uniaxial creep behavior have been measured independently.

  18. [Fabrications of a poly (methyl methacrylate) (PMMA) microfluidic chip-based DNA analysis device].

    PubMed

    Du, Xiao-Guang

    2009-12-01

    A DNA analysis device based on poly(methyl methacrylate) (PMMA) microfluidic chips was developed. A PMMA chip with cross microchannels was fabricated by simple hot embossing. Microchannels were modified with a static adsorptive coating method using 2% hydroxyethyl cellulose. A high-voltage power unit, variable in the range 0-1800 V, was used for on-chip DNA sample injection and gel electrophoretic separation. High-speed, high-resolution DNA analysis was obtained with the home-built PMMA chip in a sieving matrix containing 2% hydroxyethyl cellulose with a blue intercalating dye, TO-PRO-3 (TP3), using diode-laser-induced fluorescence detection based on optical fibers with a 670 nm long-pass filter. The DNA analysis device was applied to the separation of a phiX-174/HaeIII DNA digest sample with 11 fragments ranging from 72 to 1353 bp. A separation efficiency of 1.14 x 10(6) plates/m was obtained for the 603 bp fragment, while the resolution (R) of the 271/281 bp fragments was 1.2. The device was characterized by simple design, low cost of fabrication and operation, reusable PMMA chips, and good reproducibility. A portable microfluidic device for DNA analysis can thus be developed for clinical diagnosis and disease screening.

  19. X-ray peak profile analysis of zinc oxide nanoparticles formed by simple precipitation method

    NASA Astrophysics Data System (ADS)

    Pelicano, Christian Mark; Rapadas, Nick Joaquin; Magdaluyo, Eduardo

    2017-12-01

    Zinc oxide (ZnO) nanoparticles were successfully synthesized by a simple precipitation method using zinc acetate and tetramethylammonium hydroxide. The synthesized ZnO nanoparticles were characterized by X-ray diffraction analysis (XRD) and transmission electron microscopy (TEM). The XRD result revealed a hexagonal wurtzite structure for the ZnO nanoparticles. The TEM image showed spherical nanoparticles with an average crystallite size of 6.70 nm. For X-ray peak profile analysis, the Williamson-Hall (W-H) and Size-Strain Plot (SSP) methods were applied to examine the effects of crystallite size and lattice strain on the peak broadening of the ZnO nanoparticles. Based on the calculations, the crystallite sizes and lattice strains estimated by the two methods are in good agreement with each other.
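
    The Williamson-Hall method fits beta*cos(theta) = K*lambda/D + 4*eps*sin(theta) as a straight line in 4*sin(theta): the intercept gives the crystallite size D and the slope gives the strain eps. A minimal sketch on synthetic peak data (illustrative values, not those of the paper):

```python
import math

def williamson_hall(two_theta_deg, beta_rad, wavelength=1.5406e-10, K=0.9):
    """Linear Williamson-Hall fit: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta).
    The intercept yields crystallite size D (m); the slope yields strain eps."""
    xs = [4.0 * math.sin(math.radians(tt / 2.0)) for tt in two_theta_deg]
    ys = [b * math.cos(math.radians(tt / 2.0))
          for tt, b in zip(two_theta_deg, beta_rad)]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return K * wavelength / intercept, slope

# Synthetic peak widths generated from D = 10 nm, eps = 2e-3 (illustrative only)
D_true, eps_true, lam = 10e-9, 2e-3, 1.5406e-10
two_theta = [31.8, 34.4, 36.3, 47.5, 56.6]  # typical wurtzite ZnO reflections
beta = [0.9 * lam / (D_true * math.cos(math.radians(t / 2.0)))
        + 4.0 * eps_true * math.tan(math.radians(t / 2.0)) for t in two_theta]
D_est, eps_est = williamson_hall(two_theta, beta)
```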

  20. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Simultaneous pre-concentration and separation on simple paper-based analytical device for protein analysis.

    PubMed

    Niu, Ji-Cheng; Zhou, Ting; Niu, Li-Li; Xie, Zhen-Sheng; Fang, Fang; Yang, Fu-Quan; Wu, Zhi-Yong

    2018-02-01

    In this work, fast isoelectric focusing (IEF) was successfully implemented on an open paper fluidic channel for simultaneous concentration and separation of proteins from a complex matrix. With this simple device, IEF can be finished in 10 min with a resolution of 0.03 pH units and a concentration factor of 10, as estimated with colored model proteins by smartphone-based colorimetric detection. Fast detection of albumin from human serum and glycated hemoglobin (HbA1c) from blood cells was demonstrated. In addition, off-line identification of the model proteins from the IEF fractions with matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was also shown. This paper-based IEF device is potentially useful either for point-of-care testing (POCT) or biomarker analysis as a cost-effective sample pretreatment method.

  2. Laser-Based Measurement of Refractive Index Changes: Kinetics of 2,3-Epoxy-1-propanol Hydrolysis.

    ERIC Educational Resources Information Center

    Spencer, Bert; Zare, Richard N.

    1988-01-01

    Describes an experiment in which a simple laser-based apparatus is used for measuring the change in refractive index during the acid-catalyzed hydrolysis of glycidol into glycerine. Gives a schematic of the experimental setup and discusses the kinetic analysis. (MVL)

  3. An analysis of ratings: A guide to RMRATE

    Treesearch

    Thomas C. Brown; Terry C. Daniel; Herbert W. Schroeder; Glen E. Brink

    1990-01-01

    This report describes RMRATE, a computer program for analyzing rating judgments. RMRATE scales ratings using several scaling procedures, and compares the resulting scale values. The scaling procedures include the median and simple mean, standardized values, scale values based on Thurstone's Law of Categorical Judgment, and regression-based values. RMRATE also...

  4. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials.

    PubMed

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, Waveform Similarity Analysis outperformed Trained Eye Analysis in predicting signal amplitude, although it produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies.
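
    The core of the template-comparison idea can be sketched with a normalized cross-correlation: slide the template along the record, take the peak score as the event magnitude and its position as the latency. A minimal illustration on synthetic data; the threshold and signal parameters are hypothetical, not those of the published algorithm.

```python
import numpy as np

def detect_by_template(signal, template, threshold):
    """Slide a normalized template along the record; the peak cross-correlation
    score gives the event magnitude and its position gives the latency."""
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    scores = np.correlate(signal, t, mode="valid")
    latency = int(np.argmax(scores))
    magnitude = float(scores[latency])
    return (latency, magnitude) if magnitude >= threshold else None

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0.0, 2.0 * np.pi, 40))  # idealized biphasic event
signal = 0.2 * rng.standard_normal(500)               # noise floor
signal[130:170] += 1.5 * template                     # embedded "evoked response"
hit = detect_by_template(signal, template, threshold=1.0)
```

    Because the score is an inner product with a unit-norm template, clustered small events stand out even when the raw signal-to-noise ratio is poor.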

  5. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials

    PubMed Central

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, Waveform Similarity Analysis outperformed Trained Eye Analysis in predicting signal amplitude, although it produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies. PMID:26325291

  6. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
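
    Of the three techniques compared, principal component analysis is the most standard. A minimal sketch of PCA-based dimension reduction via the SVD, on a synthetic feature matrix rather than the object-code features of the paper:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(2)
# 100 samples x 20 features whose variance lives mostly in 2 latent directions
latent = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 20))
X = latent + 0.05 * rng.standard_normal((100, 20))
Z = pca_reduce(X, 2)
var_kept = np.var(Z, axis=0).sum() / np.var(X - X.mean(axis=0), axis=0).sum()
```

    The reduced matrix Z would then be fed to the classifier, with classification accuracy tracked as k grows.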

  7. A new simple form of quark mixing matrix

    NASA Astrophysics Data System (ADS)

    Qin, Nan; Ma, Bo-Qiang

    2011-01-01

    Although different parametrizations of quark mixing matrix are mathematically equivalent, the consequences of experimental analysis may be distinct. Based on the triminimal expansion of Kobayashi-Maskawa matrix around the unit matrix, we propose a new simple parametrization. Compared with the Wolfenstein parametrization, we find that the new form is not only consistent with the original one in the hierarchical structure, but also more convenient for numerical analysis and measurement of the CP-violating phase. By discussing the relation between our new form and the unitarity boomerang, we point out that along with the unitarity boomerang, this new parametrization is useful in hunting for new physics.

  8. Fifth Graders' Learning About Simple Machines Through Engineering Design-Based Instruction Using LEGO™ Materials

    NASA Astrophysics Data System (ADS)

    Marulcu, Ismail; Barnett, Mike

    2013-10-01

    This study is part of a 5-year National Science Foundation-funded project, Transforming Elementary Science Learning Through LEGO™ Engineering Design. In this study, we report on the successes and challenges of implementing an engineering design-based and LEGO™-oriented unit in an urban classroom setting and we focus on the impact of the unit on students' content understanding of simple machines. The LEGO™ engineering-based simple machines module, which was developed for fifth graders by our research team, was implemented in an urban school in a large city in the Northeastern region of the USA. Thirty-three fifth grade students participated in the study, and they showed significant growth in content understanding. We measured students' content knowledge by using identical paper tests and semistructured interviews before and after instruction. Our paired t test analysis results showed that students significantly improved their test and interview scores (t = -3.62, p < 0.001 for multiple-choice items and t = -9.06, p < 0.000 for the open-ended items in the test and t = -12.11, p < 0.000 for the items in interviews). We also identified several alternative conceptions that are held by students on simple machines.

  9. High-accuracy self-mixing interferometer based on multiple reflections using a simple external reflecting mirror

    NASA Astrophysics Data System (ADS)

    Wang, Xiu-lin; Wei, Zheng; Wang, Rui; Huang, Wen-cai

    2018-05-01

    A self-mixing interferometer (SMI) with a resolution twenty times higher than that of a conventional interferometer is developed using multiple reflections. The multiple-pass optical configuration is constructed simply by employing an external reflecting mirror. The advantage of this configuration is that it is simple and makes it easy to re-inject the light back into the laser cavity. Theoretical analysis shows that the measurement resolution is scalable by adjusting the number of reflections. Experiments show that the proposed method achieves an optical resolution of approximately λ/40. The influence of the displacement sensitivity gain (G) is further analyzed and discussed in practical experiments.

  10. Fault identification of rotor-bearing system based on ensemble empirical mode decomposition and self-zero space projection analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Fan; Zhu, Zhencai; Li, Wei; Zhou, Gongbo; Chen, Guoan

    2014-07-01

    Accurately identifying faults in rotor-bearing systems by analyzing vibration signals, which are nonlinear and nonstationary, is challenging. To address this issue, a new approach based on ensemble empirical mode decomposition (EEMD) and self-zero space projection analysis is proposed in this paper. This method seeks to identify faults appearing in a rotor-bearing system using simple algebraic calculations and projection analyses. First, EEMD is applied to decompose the collected vibration signals into a set of intrinsic mode functions (IMFs), from which features are extracted. Second, these extracted features under various mechanical health conditions are used to design a self-zero space matrix according to space projection analysis. Finally, so-called projection indicators are calculated to identify the rotor-bearing system's faults with simple decision logic. Experiments are implemented to test the reliability and effectiveness of the proposed approach. The results show that this approach can accurately identify faults in rotor-bearing systems.

  11. Analysis and calculation by integral methods of laminar compressible boundary-layer with heat transfer and with and without pressure gradient

    NASA Technical Reports Server (NTRS)

    Morduchow, Morris

    1955-01-01

    A survey of integral methods in laminar-boundary-layer analysis is first given. A simple and sufficiently accurate method for practical purposes of calculating the properties (including stability) of the laminar compressible boundary layer in an axial pressure gradient with heat transfer at the wall is presented. For flow over a flat plate, the method is applicable for an arbitrarily prescribed distribution of temperature along the surface and for any given constant Prandtl number close to unity. For flow in a pressure gradient, the method is based on a Prandtl number of unity and a uniform wall temperature. A simple and accurate method of determining the separation point in a compressible flow with an adverse pressure gradient over a surface at a given uniform wall temperature is developed. The analysis is based on an extension of the Karman-Pohlhausen method to the momentum and the thermal energy equations in conjunction with fourth- and especially higher degree velocity and stagnation-enthalpy profiles.

  12. Quantification of sensory and food quality: the R-index analysis.

    PubMed

    Lee, Hye-Seong; van Hout, Danielle

    2009-08-01

    The accurate quantification of sensory difference/similarity between foods, as well as consumer acceptance/preference and concepts, is greatly needed to optimize and maintain food quality. The R-Index is one class of measures of the degree of difference/similarity, and was originally developed for sensory difference tests for food quality control, product development, and so on. The index is based on signal detection theory and is free of the response bias that can invalidate difference testing protocols, including categorization and same-different and A-Not A tests. It is also a nonparametric analysis, making no assumptions about sensory distributions, and is simple to compute and understand. The R-Index is also flexible in its application. Methods based on R-Index analysis have been used as detection and sensory difference tests, as simple alternatives to hedonic scaling, and for the measurement of consumer concepts. This review indicates the various computational strategies for the R-Index and its practical applications to consumer and sensory measurements in food science.
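
    Computationally, the R-Index for ratings data is the probability that a randomly chosen signal sample is rated above a randomly chosen noise sample, with ties counted as half (equivalently, the area under the ROC curve). A minimal sketch with hypothetical ratings:

```python
def r_index(signal_ratings, noise_ratings):
    """R-Index: probability that a random signal sample is rated above a
    random noise sample, counting ties as half (equivalent to ROC area)."""
    wins = ties = 0
    for s in signal_ratings:
        for n in noise_ratings:
            if s > n:
                wins += 1
            elif s == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(signal_ratings) * len(noise_ratings))

# Hypothetical 1-6 category ratings ("sure noise" ... "sure signal")
signal_ratings = [6, 5, 5, 4, 3, 6]
noise_ratings = [2, 3, 1, 4, 2, 3]
r = r_index(signal_ratings, noise_ratings)
```

    An R-Index of 0.5 means the two sets of ratings are indistinguishable; values near 1 indicate a large sensory difference.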

  13. Quantitative analysis of fungicide azoxystrobin in agricultural samples with rapid, simple and reliable monoclonal immunoassay.

    PubMed

    Watanabe, Eiki; Miyake, Shiro

    2013-01-15

    This work presents the analytical performance of a kit-based direct competitive enzyme-linked immunosorbent assay (dc-ELISA) for azoxystrobin detection in agricultural products. The dc-ELISA was sufficiently sensitive for analysis of residue levels close to the maximum residue limits and did not show cross-reactivity to other strobilurin analogues. Absorbance decreased as the methanol concentration in the sample solution increased from 2% to 40%, and the standard curve was most linear when the sample solution contained 10% methanol. Agricultural samples were therefore extracted with methanol, and the extracts were diluted with water to 10% methanol. No significant matrix interference was observed. Satisfactory recoveries were found for all spiked samples, and the results agreed well with liquid chromatography analysis. These results clearly indicate that the kit-based dc-ELISA is suitable for rapid, simple, quantitative and reliable determination of the fungicide. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Advantages of the net benefit regression framework for economic evaluations of interventions in the workplace: a case study of the cost-effectiveness of a collaborative mental health care program for people receiving short-term disability benefits for psychiatric disorders.

    PubMed

    Hoch, Jeffrey S; Dewa, Carolyn S

    2014-04-01

    Economic evaluations commonly accompany trials of new treatments or interventions; however, regression methods and their corresponding advantages for the analysis of cost-effectiveness data are not well known. To illustrate regression-based economic evaluation, we present a case study investigating the cost-effectiveness of a collaborative mental health care program for people receiving short-term disability benefits for psychiatric disorders. We implement net benefit regression to illustrate its strengths and limitations. Net benefit regression offers a simple option for cost-effectiveness analyses of person-level data. By placing economic evaluation in a regression framework, regression-based techniques can facilitate the analysis and provide simple solutions to commonly encountered challenges. Economic evaluations of person-level data (eg, from a clinical trial) should use net benefit regression to facilitate analysis and enhance results.
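
    The mechanics of net benefit regression are simple: form each person's net benefit nb = wtp*effect - cost at a chosen willingness-to-pay, and regress it on a treatment indicator; the slope estimates the incremental net benefit. A minimal sketch on simulated data, with all parameter values hypothetical:

```python
import numpy as np

def net_benefit_regression(effect, cost, treatment, wtp):
    """Regress person-level net benefit nb = wtp*effect - cost on a treatment
    indicator; the slope estimates the incremental net benefit."""
    nb = wtp * np.asarray(effect) - np.asarray(cost)
    X = np.column_stack([np.ones(len(nb)), treatment])
    coef, *_ = np.linalg.lstsq(X, nb, rcond=None)
    return coef[1]

# Simulated trial: treatment adds 0.1 QALYs for $300 extra (values hypothetical)
rng = np.random.default_rng(3)
n = 2000
tx = rng.integers(0, 2, n)
effect = 0.6 + 0.1 * tx + 0.05 * rng.standard_normal(n)
cost = 1000.0 + 300.0 * tx + 100.0 * rng.standard_normal(n)
inb = net_benefit_regression(effect, cost, tx, wtp=50000.0)  # true INB = 4700
```

    A positive slope at a given willingness-to-pay indicates the intervention is cost-effective at that threshold; covariates can be added to the design matrix in the same way.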

  15. A simple antireflection overcoat for opaque coatings in the submillimeter region

    NASA Technical Reports Server (NTRS)

    Smith, S. M.

    1986-01-01

    An antireflection overcoat for opaque baffle coatings in the far infrared (FIR)/submillimeter region was made from a simple Teflon spray-on lubricant. The Teflon overcoat reduced the specular reflectance of four different opaque coatings by nearly a factor of two. Analysis, based on the interference term of a reflecting-layer model, indicates that in the submillimeter region the reduced reflectance depends primarily on the refractive index of the overcoat and very little on its thickness.

  16. Synthetic bioactive novel ether based Schiff bases and their copper(II) complexes

    NASA Astrophysics Data System (ADS)

    Shabbir, Muhammad; Akhter, Zareen; Ismail, Hammad; Mirza, Bushra

    2017-10-01

    Novel ether-based Schiff bases (HL1-HL4) were synthesized from 5-chloro-2-hydroxybenzaldehyde and primary amines (1-amino-4-phenoxybenzene, 4-(4-aminophenyloxy)biphenyl, 1-(4-aminophenoxy)naphthalene and 2-(4-aminophenoxy)naphthalene). From these Schiff bases, copper(II) complexes (Cu(L1)2-Cu(L4)2) were synthesized and characterized by elemental analysis and spectroscopic (FTIR, NMR) techniques. The synthesized Schiff bases and copper(II) complexes were further assessed in various biological studies. In the brine shrimp assay, the copper(II) complexes revealed 4-fold higher activity (LD50 3.8 μg/ml) compared with the simple ligands (LD50 12.4 μg/ml). Similar findings were observed in the potato disc antitumor assay, with higher activities for the copper(II) complexes (IC50 range 20.4-24.1 μg/ml) than for the ligands (IC50 range 40.5-48.3 μg/ml). A DPPH assay was performed to determine the antioxidant potential of the compounds. Significant antioxidant activity was shown by the copper(II) complexes, whereas the simple ligands showed none. In the DNA protection assay, significant protective behavior was exhibited by the simple ligand molecules, while the copper(II) complexes showed neutral behavior (neither protective nor damaging).

  17. A rapid and ultrasensitive SERRS assay for histidine and tyrosine based on azo coupling.

    PubMed

    Sui, Huimin; Wang, Yue; Yu, Zhi; Cong, Qian; Han, Xiao Xia; Zhao, Bing

    2016-10-01

    A simple and highly sensitive surface-enhanced resonance Raman scattering (SERRS)-based approach coupled with an azo coupling reaction has been put forward for quantitative analysis of histidine and tyrosine. The SERRS-based assay is simple and rapid: the azo reaction products are mixed with silver nanoparticles (AgNPs) and measured within 2 min. The limits of detection (LODs) of the method are as low as 4.33×10(-11) and 8.80×10(-11) M for histidine and tyrosine, respectively. Moreover, the SERRS fingerprint information specific to the corresponding amino acids guarantees selective detection of the target histidine and tyrosine. The results from serum indicate the potential applicability of the proposed approach to biological samples. Compared with previously reported methods, the main advantages of this methodology are simplicity, rapidity without time-consuming separation or pretreatment steps, high sensitivity, selectivity, and the potential for determination of other molecules containing imidazole or phenol groups. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Predicting the risk of malignancy in adnexal masses based on the Simple Rules from the International Ovarian Tumor Analysis group.

    PubMed

    Timmerman, Dirk; Van Calster, Ben; Testa, Antonia; Savelli, Luca; Fischerova, Daniela; Froyman, Wouter; Wynants, Laure; Van Holsbeke, Caroline; Epstein, Elisabeth; Franchi, Dorella; Kaijser, Jeroen; Czekierdowski, Artur; Guerriero, Stefano; Fruscio, Robert; Leone, Francesco P G; Rossi, Alberto; Landolfo, Chiara; Vergote, Ignace; Bourne, Tom; Valentin, Lil

    2016-04-01

    Accurate methods to preoperatively characterize adnexal tumors are pivotal for optimal patient management. A recent meta-analysis concluded that the International Ovarian Tumor Analysis algorithms such as the Simple Rules are the best approaches to preoperatively classify adnexal masses as benign or malignant. We sought to develop and validate a model to predict the risk of malignancy in adnexal masses using the ultrasound features in the Simple Rules. This was an international cross-sectional cohort study involving 22 oncology centers, referral centers for ultrasonography, and general hospitals. We included consecutive patients with an adnexal tumor who underwent a standardized transvaginal ultrasound examination and were selected for surgery. Data on 5020 patients were recorded in 3 phases from 2002 through 2012. The 5 Simple Rules features indicative of a benign tumor (B-features) and the 5 features indicative of malignancy (M-features) are based on the presence of ascites, tumor morphology, and degree of vascularity at ultrasonography. The gold standard was the histopathologic diagnosis of the adnexal mass (pathologist blinded to ultrasound findings). Logistic regression analysis was used to estimate the risk of malignancy based on the 10 ultrasound features and type of center. The diagnostic performance was evaluated by area under the receiver operating characteristic curve, sensitivity, specificity, positive likelihood ratio (LR+), negative likelihood ratio (LR-), positive predictive value (PPV), negative predictive value (NPV), and calibration curves. Data on 4848 patients were analyzed. The malignancy rate was 43% (1402/3263) in oncology centers and 17% (263/1585) in other centers. The area under the receiver operating characteristic curve on validation data was very similar in oncology centers (0.917; 95% confidence interval, 0.901-0.931) and other centers (0.916; 95% confidence interval, 0.873-0.945). Risk estimates showed good calibration.
In all, 23% of patients in the validation data set had a very low estimated risk (<1%) and 48% had a high estimated risk (≥30%). For the 1% risk cutoff, sensitivity was 99.7%, specificity 33.7%, LR+ 1.5, LR- 0.010, PPV 44.8%, and NPV 98.9%. For the 30% risk cutoff, sensitivity was 89.0%, specificity 84.7%, LR+ 5.8, LR- 0.13, PPV 75.4%, and NPV 93.9%. Quantification of the risk of malignancy based on the Simple Rules has good diagnostic performance both in oncology centers and other centers. A simple classification based on these risk estimates may form the basis of a clinical management system. Patients with a high risk may benefit from surgery by a gynecological oncologist, while patients with a lower risk may be managed locally. Copyright © 2016 Elsevier Inc. All rights reserved.
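    Cutoff-based performance figures like those above follow directly from a 2×2 confusion matrix. A small illustration (the counts are made up, not the study's data) of how sensitivity, specificity, LR+, LR-, PPV, and NPV are derived:

```python
# Diagnostic performance at a risk cutoff, from a 2x2 confusion matrix.
# Counts below are illustrative only, not the IOTA study data.
def diagnostic_metrics(tp, fn, fp, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

m = diagnostic_metrics(tp=890, fn=110, fp=153, tn=847)  # a 30%-style cutoff
print({k: round(v, 3) for k, v in m.items()})
```

    Note that PPV and NPV depend on the malignancy prevalence in the sample, which is why the paper reports different values for oncology and non-oncology centers.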

  19. Economics of internal and external energy storage in solar power plant operation

    NASA Technical Reports Server (NTRS)

    Manvi, R.; Fujita, T.

    1977-01-01

    A simple approach is formulated to investigate the effect of energy storage on the bus-bar electrical energy cost of solar thermal power plants. Economic analysis based on this approach does not require detailed definition of a specific storage system. A wide spectrum of storage system candidates, ranging from hot water to superconducting magnets, can be studied based on total investment and a rough knowledge of input and output energy efficiencies. Preliminary analysis indicates that internal (thermal) energy storage schemes offer better opportunities for energy cost reduction than external (nonthermal) energy storage schemes for solar applications. Based on data and assumptions used in JPL evaluation studies, differential energy costs due to storage are presented for a 100 MWe solar power plant by varying the energy capacity. The simple approach presented in this paper provides useful insight regarding the operation of energy storage in solar power plant applications, while also indicating a range of design parameters where storage can be cost effective.
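    The bus-bar cost comparison described can be sketched as a toy levelized-cost calculation. All numbers below (capital costs, capacity factors, efficiencies) are hypothetical stand-ins, not the JPL study's data:

```python
# Differential bus-bar energy cost from adding storage: a toy model.
# All numbers are hypothetical; the JPL study's data are not reproduced here.
def busbar_cost(capital_usd, annual_kwh, fixed_charge_rate=0.15, om_usd=0.0):
    """Levelized cost in $/kWh: annualized capital plus O&M over energy sold."""
    return (capital_usd * fixed_charge_rate + om_usd) / annual_kwh

# Plant without storage: 100 MWe, 0.25 capacity factor.
base_kwh = 100e3 * 8760 * 0.25
cost_no_storage = busbar_cost(capital_usd=150e6, annual_kwh=base_kwh)

# Adding storage raises capital and introduces round-trip losses, but
# raises the capacity factor (more hours of delivery).
eta_storage = 0.7                       # round-trip (in*out) efficiency
stored_kwh = 100e3 * 8760 * 0.10 * eta_storage
cost_with_storage = busbar_cost(capital_usd=190e6,
                                annual_kwh=base_kwh + stored_kwh)

print(f"{cost_no_storage:.4f} vs {cost_with_storage:.4f} $/kWh")
```

    The sign of the differential cost depends entirely on whether the extra delivered energy outweighs the extra annualized capital, which is the trade-off the paper's parametric study maps out.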

  20. THz Discrimination of Materials: Development of an Apparatus Based on Room Temperature Detection and Metasurfaces Selective Filters

    NASA Astrophysics Data System (ADS)

    Carelli, P.; Chiarello, F.; Torrioli, G.; Castellano, M. G.

    2017-03-01

    We present an apparatus for terahertz discrimination of materials designed to be fast, simple, compact, and economical in order to be suitable for preliminary on-field analysis. The system's working principles, bio-inspired by human color vision, are based on the use of an incoherent source, a room temperature detector, a series of microfabricated metamaterial selective filters, very compact optics based on metallic ellipsoidal mirrors in air, and a treatment of the mirrors' surfaces that selects the frequency band of interest. We experimentally demonstrate the operation of the apparatus in discriminating simple substances such as salt, staple foods, and grease. We present the system and the obtained results and discuss issues and possible developments.

  1. Perspective on Certainty-Based Marking: An Interview with Tony Gardner-Medwin

    ERIC Educational Resources Information Center

    Cornwell, Reid; Gardner-Medwin, Tony

    2008-01-01

    In this edition of Perspectives, Reid Cornwell discusses certainty-based marking (CBM) with Tony Gardner-Medwin, professor emeritus of physiology at University College London (UCL), which adopted a simple, theoretically sound version of CBM in its medical education program. CBM has been shown to encourage thinking, reflection, improved analysis,…

  2. A Comparative Analysis of the Snort and Suricata Intrusion-Detection Systems

    DTIC Science & Technology

    2011-09-01

    Category: Test Rules. Test #6: Simple LFI Attack. Snort True Positive: Snort generated an alert based on the '/etc/passwd' string passed...through an HTTP command. Suricata True Positive: Suricata generated an alert based on the '/etc/passwd' string passed through an HTTP command

  3. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
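    The estimator and adjustment named above are standard and can be sketched compactly. Below is a minimal random-effects pooling with the DerSimonian-Laird τ² and the Knapp-Hartung variance rescaling; the effect sizes and variances are made up for illustration:

```python
# Random-effects meta-analysis: DerSimonian-Laird tau^2 with the
# Knapp-Hartung variance adjustment (the combination used by Meta-Essentials).
# Effect sizes and within-study variances below are made up.
import math

y = [0.10, 0.55, -0.20, 0.80, 0.30]      # study effect sizes
v = [0.04, 0.06, 0.03, 0.08, 0.05]       # within-study variances
k = len(y)

w = [1 / vi for vi in v]                 # fixed-effect weights
mu_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)       # DerSimonian-Laird estimator

w_re = [1 / (vi + tau2) for vi in v]     # random-effects weights
mu_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)

# Knapp-Hartung: rescale the variance of the pooled effect and use a
# t-distribution with k-1 degrees of freedom for the interval.
q_hk = sum(wi * (yi - mu_re) ** 2 for wi, yi in zip(w_re, y)) / (k - 1)
se_hk = math.sqrt(q_hk / sum(w_re))
t_crit = 2.776                           # t(0.975, df=4)
ci = (mu_re - t_crit * se_hk, mu_re + t_crit * se_hk)
print(f"pooled = {mu_re:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

    The Knapp-Hartung interval is typically wider than the naive normal interval when heterogeneity is present, which is why the tool adopts it as the default.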

  4. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
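    The visibility-graph family of mappings underlying this kind of multiplex construction is simple to state for a single (univariate) layer. A minimal natural-visibility implementation, O(n²) and illustrative only (one such graph per variable would give the layers of a multiplex network):

```python
# Natural visibility graph of a univariate time series: nodes are time
# points; i and j are linked if the straight line between (i, y_i) and
# (j, y_j) passes above every sample in between.
def visibility_graph(series):
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

ts = [3.0, 1.0, 2.0, 0.5, 4.0]
print(sorted(visibility_graph(ts)))
```

    Adjacent samples are always mutually visible, so the graph is connected; structural descriptors (degree distributions, inter-layer overlap) are then computed on these graphs.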

  5. The Productivity Dilemma in Workplace Health Promotion.

    PubMed

    Cherniack, Martin

    2015-01-01

    Worksite-based programs to improve workforce health and well-being (Workplace Health Promotion, WHP) have been advanced as conduits for improved worker productivity and decreased health care costs. A countervailing contention in health economics holds that the return on investment (ROI) does not merit preventive health investment. Methods/procedures: pertinent studies were reviewed and their results reconsidered. A simple economic model is presented based on conventional and alternative assumptions used in cost-benefit analysis (CBA), such as discounting and negative value. The issues are presented in the format of three conceptual dilemmas. In some occupations, such as nursing, the utility of patient survival and staff health is undervalued. WHP may miss important components of work-related health risk. Altering assumptions on discounting and eliminating the drag of negative value radically change the CBA value. Simple monetization of a work life and calculation of return on workforce health investment as a simple alternate opportunity involve highly selective interpretations of productivity and utility.
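    The sensitivity of a CBA verdict to the discounting assumption can be seen in a few lines. A toy net-present-value comparison with hypothetical cash flows (not drawn from any study in the review):

```python
# How the discount rate changes a cost-benefit verdict: a toy example.
# A WHP-style program costs 100 up front and yields 10/year for 15 years.
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

flows = [-100.0] + [10.0] * 15
print(f"NPV at 1%: {npv(0.01, flows):7.2f}")   # positive: program pays off
print(f"NPV at 7%: {npv(0.07, flows):7.2f}")   # negative: same program rejected
```

    The same cash flows flip from worthwhile to rejected purely on the choice of rate, which is the discounting dilemma the paper raises.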

  6. Analysis of changes in tornadogenesis conditions over Northern Eurasia based on a simple index of atmospheric convective instability

    NASA Astrophysics Data System (ADS)

    Chernokulsky, A. V.; Kurgansky, M. V.; Mokhov, I. I.

    2017-12-01

    A simple index of convective instability (3D-index) is used for the analysis of weather and climate processes that favor the occurrence of severe convective events, including tornadoes. The index is based on information on surface air temperature and humidity. The prognostic ability of the index to reproduce severe convective events (thunderstorms, showers, tornadoes) is analyzed. It is shown that most tornadoes in Northern Eurasia are characterized by high values of the 3D-index; furthermore, the 3D-index is significantly correlated with convective available potential energy. Reanalysis data (for recent decades) and global climate model simulations (for the 21st century) show an increase in the frequency of occurrence of meteorological conditions favorable for tornado formation in the regions of Northern Eurasia. The most significant increase is found on the Black Sea coast and in the south of the Far East.

  7. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.

    PubMed

    Li, Harbin; McNulty, Steven G

    2007-10-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.
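    A variance-contribution analysis of this kind can be sketched with Monte Carlo sampling. The mass-balance form and parameter ranges below are illustrative stand-ins, not the paper's calibrated model:

```python
# Monte Carlo uncertainty analysis for a toy critical-acid-load balance:
# CAL = BCw - ANC_crit, with BCw driven by a base rate, soil depth and a
# temperature factor. Parameter ranges are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
base_rate = rng.uniform(0.5, 1.5, n)    # keq/ha/yr per unit depth
depth = rng.uniform(0.3, 0.7, n)        # m
temp_factor = rng.uniform(0.9, 1.1, n)  # dimensionless
anc_crit = rng.uniform(0.2, 0.4, n)     # keq/ha/yr

bcw = base_rate * depth * temp_factor
cal = bcw - anc_crit

# Rank parameters by squared correlation with the output (a simple
# first-order proxy for their contribution to CAL variance).
for name, p in [("base_rate", base_rate), ("depth", depth),
                ("temp_factor", temp_factor), ("anc_crit", anc_crit)]:
    r2 = np.corrcoef(p, cal)[0, 1] ** 2
    print(f"{name:12s} r^2 = {r2:.2f}")
```

    Even with these made-up ranges, the weathering base rate dominates, followed by soil depth, echoing the ranking the paper reports from its full analysis.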

  8. A study of stiffness, residual strength and fatigue life relationships for composite laminates

    NASA Technical Reports Server (NTRS)

    Ryder, J. T.; Crossman, F. W.

    1983-01-01

    The relationship between stiffness, strength, fatigue life, residual strength, and damage of unnotched graphite/epoxy laminates subjected to tension loading is explored qualitatively and quantitatively. Clarification of the mechanics of tension loading is intended to explain previous contradictory observations and hypotheses, to develop a simple procedure for anticipating strength, fatigue life, and stiffness changes, and to provide grounds for the study of more complex cases of compression, notches, and spectrum fatigue loading. Mathematical models were developed based upon analysis of the damage states, using laminate analysis, free-body modeling, or a strain energy release rate approach. Enough understanding of the tension-loaded case is developed to allow a proposed, simple procedure for calculating strain to failure, stiffness, strength, data scatter, and the shape of the stress-life curve for unnotched laminates subjected to tension load.

  9. ViSimpl: Multi-View Visual Analysis of Brain Simulation Data

    PubMed Central

    Galindo, Sergio E.; Toharia, Pablo; Robles, Oscar D.; Pastor, Luis

    2016-01-01

    After decades of independent morphological and functional brain research, a key point in neuroscience nowadays is to understand the combined relationships between the structure of the brain and its components and their dynamics on multiple scales, ranging from circuits of neurons at micro or mesoscale to brain regions at macroscale. With such a goal in mind, there is a vast amount of research focusing on modeling and simulating activity within neuronal structures, and these simulations generate large and complex datasets which have to be analyzed in order to gain the desired insight. In such context, this paper presents ViSimpl, which integrates a set of visualization and interaction tools that provide a semantic view of brain data with the aim of improving its analysis procedures. ViSimpl provides 3D particle-based rendering that allows visualizing simulation data with their associated spatial and temporal information, enhancing the knowledge extraction process. It also provides abstract representations of the time-varying magnitudes supporting different data aggregation and disaggregation operations and giving also focus and context clues. In addition, ViSimpl tools provide synchronized playback control of the simulation being analyzed. Finally, ViSimpl allows performing selection and filtering operations relying on an application called NeuroScheme. All these views are loosely coupled and can be used independently, but they can also work together as linked views, both in centralized and distributed computing environments, enhancing the data exploration and analysis procedures. PMID:27774062

  10. ViSimpl: Multi-View Visual Analysis of Brain Simulation Data.

    PubMed

    Galindo, Sergio E; Toharia, Pablo; Robles, Oscar D; Pastor, Luis

    2016-01-01

    After decades of independent morphological and functional brain research, a key point in neuroscience nowadays is to understand the combined relationships between the structure of the brain and its components and their dynamics on multiple scales, ranging from circuits of neurons at micro or mesoscale to brain regions at macroscale. With such a goal in mind, there is a vast amount of research focusing on modeling and simulating activity within neuronal structures, and these simulations generate large and complex datasets which have to be analyzed in order to gain the desired insight. In such context, this paper presents ViSimpl, which integrates a set of visualization and interaction tools that provide a semantic view of brain data with the aim of improving its analysis procedures. ViSimpl provides 3D particle-based rendering that allows visualizing simulation data with their associated spatial and temporal information, enhancing the knowledge extraction process. It also provides abstract representations of the time-varying magnitudes supporting different data aggregation and disaggregation operations and giving also focus and context clues. In addition, ViSimpl tools provide synchronized playback control of the simulation being analyzed. Finally, ViSimpl allows performing selection and filtering operations relying on an application called NeuroScheme. All these views are loosely coupled and can be used independently, but they can also work together as linked views, both in centralized and distributed computing environments, enhancing the data exploration and analysis procedures.

  11. Analysis of a novel sensor interrogation technique based on fiber cavity ring-down (CRD) loop and OTDR

    NASA Astrophysics Data System (ADS)

    Yüksel, Kivilcim; Yilmaz, Anil

    2018-07-01

    We present the analysis of a remote sensor based on fiber Cavity Ring-Down (CRD) loop interrogated by an Optical Time Domain Reflectometer (OTDR) taking into account both practical limitations and the related signal processing. A commercial OTDR is used for both pulse generation and sensor output detection. This allows obtaining a compact and simple design for intensity-based sensor applications. This novel sensor interrogation approach is experimentally demonstrated by placing a variable attenuator inside the fiber loop that mimics a sensor head.

  12. Whole-Range Assessment: A Simple Method for Analysing Allelopathic Dose-Response Data

    PubMed Central

    An, Min; Pratley, J. E.; Haig, T.; Liu, D.L.

    2005-01-01

    Based on the typical biological responses of an organism to allelochemicals (hormesis), concepts of whole-range assessment and inhibition index were developed for improved analysis of allelopathic data. Examples of their application are presented using data drawn from the literature. The method is concise and comprehensive, and makes data grouping and multiple comparisons simple, logical, and possible. It improves data interpretation, enhances research outcomes, and is a statistically efficient summary of the plant response profiles. PMID:19330165

  13. Improving the Selection, Classification, and Utilization of Army Enlisted Personnel. Project A: Research Plan

    DTIC Science & Technology

    1983-05-01

    occur. 4) It is also true that during a given time period, at a given base, not all of the people in the sample will actually be available for testing...taken sample sizes into consideration, we currently estimate that with few exceptions, we will have adequate samples to perform the analysis of simple ...Balanced Half Sample Replications (BHSA). His analyses of simple cases have shown that this method is substantially more efficient than the

  14. A simple highly efficient non invasive EMG-based HMI.

    PubMed

    Vitiello, N; Olcese, U; Oddo, C M; Carpaneto, J; Micera, S; Carrozza, M C; Dario, P

    2006-01-01

    Muscle activity recorded non-invasively is sufficient to control a mobile robot if it is used in combination with an algorithm for its asynchronous analysis. In this paper, we show that several subjects can successfully control the movements of a robot in a structured environment made up of six rooms by contracting two different muscles, using a simple algorithm. After a small training period, subjects were able to control the robot with performances comparable to those achieved by controlling the robot manually.
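    A two-muscle threshold scheme of the kind described reduces to envelope extraction plus comparison. A minimal sketch with a synthetic signal and illustrative thresholds (the paper's actual algorithm and command mapping are not reproduced here):

```python
# Minimal two-channel EMG-style controller: rectify, smooth into an
# envelope, threshold, and map (ch1, ch2) activity to a robot command.
# The signal below is synthetic; thresholds and commands are illustrative.
def envelope(signal, window=5):
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window + 1):i + 1]
        out.append(sum(abs(x) for x in chunk) / len(chunk))
    return out

def command(env1, env2, threshold=0.5):
    a1, a2 = env1 > threshold, env2 > threshold
    if a1 and a2:
        return "stop"
    if a1:
        return "turn_left"
    if a2:
        return "turn_right"
    return "forward"

ch1 = [0.1, -0.2, 0.9, -1.0, 0.8]       # burst: muscle 1 contracting
ch2 = [0.05, -0.1, 0.1, -0.05, 0.1]     # quiet
print(command(envelope(ch1)[-1], envelope(ch2)[-1]))
```

    The rectify-then-average envelope is the simplest asynchronous detector; in practice thresholds are calibrated per subject during the training period the paper mentions.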

  15. A Fan-tastic Quantitative Exploration of Ohm's Law

    NASA Astrophysics Data System (ADS)

    Mitchell, Brandon; Ekey, Robert; McCullough, Roy; Reitz, William

    2018-02-01

    Teaching simple circuits and Ohm's law to students in the introductory classroom has been extensively investigated through the common practice of using incandescent light bulbs to help students develop a conceptual foundation before moving on to quantitative analysis. However, the bulb filaments' resistance has a large temperature dependence, which makes them less suitable as a tool for quantitative analysis. Some instructors show that light bulbs do not obey Ohm's law either outright or through inquiry-based laboratory experiments. Others avoid the subject altogether by using bulbs strictly for qualitative purposes and then later switching to resistors for a numerical analysis, or by changing the operating conditions of the bulb so that it is "barely" glowing. It seems incongruous to develop a conceptual basis for the behavior of simple circuits using bulbs only to later reveal that they do not follow Ohm's law. Recently, small computer fans were proposed as a suitable replacement of bulbs for qualitative analysis of simple circuits where the current is related to the rotational speed of the fans. In this contribution, we demonstrate that fans can also be used for quantitative measurements and provide suggestions for successful classroom implementation.
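    The quantitative step described above amounts to fitting V = IR. A short least-squares sketch with made-up fan-style measurements (not data from the paper):

```python
# Ohm's law as a linear fit: V = I*R. An ohmic element's V-I data fall
# on a line through the origin with slope R. Data here are made up.
import numpy as np

current = np.array([0.02, 0.04, 0.06, 0.08, 0.10])   # A
voltage = np.array([1.01, 1.98, 3.05, 3.96, 5.02])   # V

# Least-squares slope with intercept forced to zero: R = sum(V*I)/sum(I^2)
R = float(current @ voltage / (current @ current))
residual = float(np.abs(voltage - R * current).max())
print(f"R = {R:.1f} ohm, worst-case residual = {residual:.2f} V")
```

    For an incandescent bulb, the same fit would show large, systematic residuals because the filament resistance rises with temperature; small residuals are what certify an element as ohmic.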

  16. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part I: Template-Based Generic Programming

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.

    2012-01-01

    An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
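    The paper's mechanism is C++ templates with operator overloading, so that one generic residual computes values, derivatives, and other quantities. The same idea can be illustrated in Python with a dual-number type; this is a hedged analog in spirit, not a reproduction of the Trilinos machinery:

```python
# Operator overloading turns an unmodified calculation into one that also
# propagates derivatives: forward-mode AD with dual numbers. This mirrors
# in spirit (not in code) the C++ template approach in the paper.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def residual(x):
    # Generic code written against + and * works on floats or Duals alike.
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)            # seed dx/dx = 1
r = residual(x)
print(r.val, r.dot)           # value 17.0, derivative 6x+2 = 14.0
```

    Because `residual` never mentions `Dual`, the same source computes plain values, sensitivities, or (with richer overloaded types) the other embedded quantities the paper discusses.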

  17. An Analysis of Government Training Contract Practices in Terms of Cost and Contingency Management Implications.

    ERIC Educational Resources Information Center

    O'Neal, A. Frederick; Ross, Clarence J.

    This paper briefly outlines and explains critical characteristics of the most important and frequently used government contract classes. These classes are explored in terms of contract (and government) behaviors expected to ensue based on simple analysis of where payoffs and rewards are, monetarily and otherwise, and in terms of how these…

  18. Vocoders and Speech Perception: Uses of Computer-Based Speech Analysis-Synthesis in Stimulus Generation.

    ERIC Educational Resources Information Center

    Tierney, Joseph; Mack, Molly

    1987-01-01

    Stimuli used in research on the perception of the speech signal have often been obtained from simple filtering and distortion of the speech waveform, sometimes accompanied by noise. However, for more complex stimulus generation, the parameters of speech can be manipulated, after analysis and before synthesis, using various types of algorithms to…

  19. Analysis of aircraft longitudinal handling qualities

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1981-01-01

    The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.

  20. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and 4 tools-the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and Malnutrition Universal Screening Tool (MUST)-received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  1. Interaction dynamics of multiple mobile robots with simple navigation strategies

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1989-01-01

    The global dynamic behavior of multiple interacting autonomous mobile robots with simple navigation strategies is studied. Here, the effective spatial domain of each robot is taken to be a closed ball about its mass center. It is assumed that each robot has a specified cone of visibility such that interaction with other robots takes place only when they enter its visibility cone. Based on a particle model for the robots, various simple homing and collision-avoidance navigation strategies are derived. Then, an analysis of the dynamical behavior of the interacting robots in unbounded spatial domains is made. The article concludes with the results of computer simulations studies of two or more interacting robots.
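    A particle-model homing strategy with collision avoidance of the kind studied can be sketched in a few lines. The gains and radii below are illustrative choices, not the paper's derived strategies:

```python
# Particle-model navigation: steer toward a goal, plus a repulsive term
# when another robot enters the effective ball around this one.
# Gains and radii are illustrative, not taken from the paper.
import math

def step(pos, goal, others, dt=0.1, k_home=1.0, k_avoid=2.0, radius=1.0):
    vx = k_home * (goal[0] - pos[0])
    vy = k_home * (goal[1] - pos[1])
    for ox, oy in others:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < radius:                     # inside the visibility ball
            vx += k_avoid * dx / d * (radius - d)
            vy += k_avoid * dy / d * (radius - d)
    return (pos[0] + dt * vx, pos[1] + dt * vy)

p = (0.0, 0.0)
for _ in range(50):
    p = step(p, goal=(5.0, 0.0), others=[(2.5, 0.1)])
print(f"({p[0]:.2f}, {p[1]:.2f})")
```

    With several such particles each running this rule, the interesting global questions are exactly those the article analyzes: whether the coupled dynamics converge, oscillate, or deadlock.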

  2. Remote Access Multi-Mission Processing and Analysis Ground Environment (RAMPAGE)

    NASA Technical Reports Server (NTRS)

    Lee, Y.; Specht, T.

    2000-01-01

    At the Jet Propulsion Laboratory (JPL), the RAMPAGE development now makes it possible to provide a wide variety of users with easy and simple access to mission engineering data using web-based standards.

  3. Brief Lags in Interrupted Sequential Performance: Evaluating a Model and Model Evaluation Method

    DTIC Science & Technology

    2015-01-05

    rehearsal mechanism in the model. To evaluate the model we developed a simple new goodness-of-fit test based on analysis of variance that offers an...repeated step). Sequential constraints are common in medicine, equipment maintenance, computer programming and technical support, data analysis, legal analysis, accounting, and many other home and workplace environments. Sequential constraints also play a role in such basic cognitive processes

  4. Turn on ESPT: novel salicylaldehyde based sensor for biological important fluoride sensing.

    PubMed

    Liu, Kai; Zhao, Xiaojun; Liu, Qingxiang; Huo, Jianzhong; Fu, Huifang; Wang, Ying

    2014-09-05

    A novel and simple salicylaldehyde-based anion fluorescent sensor, 1, has been designed that can selectively sense fluoride via 'turn-on' excited-state intermolecular proton transfer (ESPT). The binding constant and the stoichiometry were obtained by non-linear least-squares analysis of the titration curves. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. A simple laminate theory using the orthotropic viscoplasticity theory based on overstress. I - In-plane stress-strain relationships for metal matrix composites

    NASA Technical Reports Server (NTRS)

    Krempl, Erhard; Hong, Bor Zen

    1989-01-01

    A macromechanics analysis is presented for the in-plane, anisotropic time-dependent behavior of metal matrix laminates. The small deformation, orthotropic viscoplasticity theory based on overstress represents lamina behavior in a modified simple laminate theory. Material functions and constants can be identified in principle from experiments with laminae. Orthotropic invariants can be repositories for tension-compression asymmetry and for linear elasticity in one direction while the other directions behave in a viscoplastic manner. Computer programs are generated and tested for either unidirectional or symmetric laminates under in-plane loading. Correlations with the experimental results on metal matrix composites are presented.

  6. SAMP, the Simple Application Messaging Protocol: Letting applications talk to each other

    NASA Astrophysics Data System (ADS)

    Taylor, M. B.; Boch, T.; Taylor, J.

    2015-06-01

    SAMP, the Simple Application Messaging Protocol, is a hub-based communication standard for the exchange of data and control between participating client applications. It has been developed within the context of the Virtual Observatory with the aim of enabling specialised data analysis tools to cooperate as a loosely integrated suite, and is now in use by many and varied desktop and web-based applications dealing with astronomical data. This paper reviews the requirements and design principles that led to SAMP's specification, provides a high-level description of the protocol, and discusses some of its common and possible future usage patterns, with particular attention to those factors that have aided its success in practice.

  7. LCR circuit: new simple methods for measuring the equivalent series resistance of a capacitor and inductance of a coil

    NASA Astrophysics Data System (ADS)

    Ivković, Saša S.; Marković, Marija Z.; Ivković, Dragica Ž.; Cvetanović, Nikola

    2017-09-01

    Equivalent series resistance (ESR) is a measure of the total energy loss in a capacitor. In this paper a simple method for measuring the ESR of ceramic capacitors, based on the analysis of the oscillations of an LCR circuit, is proposed. It is shown that at frequencies under 3300 Hz, the ESR is directly proportional to the period of oscillation. Based on the determined dependence of the ESR on the period, a method for measuring coil inductance is devised and tested. All measurements were performed using the standard equipment found in student laboratories, which makes both methods very suitable for implementation at high school and university levels.
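    For an underdamped LCR loop, the total series resistance follows from the logarithmic decrement of successive oscillation peaks. A short sketch with made-up peak readings and component values (the paper's calibration is not reproduced):

```python
# Total series resistance of an underdamped LCR loop from the logarithmic
# decrement: delta = ln(A_n / A_{n+1}) = R*T / (2L), so R = 2*L*delta / T.
# Subtracting the known coil/wiring resistance leaves the capacitor's ESR.
# Peak amplitudes and component values below are made up for illustration.
import math

L_coil = 10e-3            # H
T = 1.0e-3                # s, measured oscillation period
peaks = [2.00, 1.70, 1.45, 1.23]   # successive same-sign peak voltages

deltas = [math.log(a / b) for a, b in zip(peaks, peaks[1:])]
delta = sum(deltas) / len(deltas)
R_total = 2 * L_coil * delta / T
R_wiring = 2.0            # ohm, known series resistance excluding the capacitor
esr = R_total - R_wiring
print(f"R_total = {R_total:.2f} ohm, ESR = {esr:.2f} ohm")
```

    Averaging the decrement over several peak pairs, as here, suppresses reading noise on any single amplitude measurement.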

  8. Theoretical Analysis of Local Search and Simple Evolutionary Algorithms for the Generalized Travelling Salesperson Problem.

    PubMed

    Pourhassan, Mojgan; Neumann, Frank

    2018-06-22

    The generalized travelling salesperson problem is an important NP-hard combinatorial optimization problem for which meta-heuristics, such as local search and evolutionary algorithms, have been used very successfully. Two hierarchical approaches with different neighbourhood structures, namely a Cluster-Based approach and a Node-Based approach, have been proposed by Hu and Raidl (2008) for solving this problem. In this paper, local search algorithms and simple evolutionary algorithms based on these approaches are investigated from a theoretical perspective. For local search algorithms, we point out the complementary abilities of the two approaches by presenting instances where they mutually outperform each other. Afterwards, we introduce an instance which is hard for both approaches when initialized on a particular point of the search space, but where a variable neighbourhood search combining them finds the optimal solution in polynomial time. Then we turn our attention to analysing the behaviour of simple evolutionary algorithms that use these approaches. We show that the Node-Based approach solves the hard instance of the Cluster-Based approach presented in Corus et al. (2016) in polynomial time. Furthermore, we prove an exponential lower bound on the optimization time of the Node-Based approach for a class of Euclidean instances.
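    The Node-Based representation analysed above can be sketched as a (1+1) EA that keeps a fixed cluster visiting order and mutates which node represents each cluster. The toy instance and parameters below are illustrative, not the paper's hard instances:

```python
# A toy Node-Based (1+1) EA for the generalized TSP: the cluster visiting
# order is fixed; the search mutates which node is chosen in each cluster.
# Instance and parameters are illustrative, not from the paper.
import math, random

clusters = [[(0, 0), (0, 2)], [(3, 0), (3, 3)], [(6, 1), (6, -2)]]

def tour_length(choice):
    pts = [clusters[i][choice[i]] for i in range(len(clusters))]
    return sum(math.dist(pts[i], pts[(i + 1) % len(pts)])
               for i in range(len(pts)))

random.seed(1)
parent = [random.randrange(len(c)) for c in clusters]
for _ in range(200):
    child = list(parent)
    i = random.randrange(len(clusters))          # mutate one cluster's node
    child[i] = random.randrange(len(clusters[i]))
    if tour_length(child) <= tour_length(parent):
        parent = child                            # elitist acceptance
print(parent, round(tour_length(parent), 2))
```

    The Cluster-Based approach is the complementary scheme: fix the node choices and search over cluster permutations; the paper's analysis shows instances where each scheme defeats the other.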

  9. Modal cost analysis for simple continua

    NASA Technical Reports Server (NTRS)

    Hu, A.; Skelton, R. E.; Yang, T. Y.

    1988-01-01

    The most popular finite element codes are based upon appealing theories of convergence of modal frequencies. For example, the popularity of cubic elements for beam-like structures is due to the rapid convergence of modal frequencies and stiffness properties. However, for those problems in which the primary consideration is the accuracy of response of the structure at specified locations, it is more important to obtain accuracy in the modal costs than in the modal frequencies. The modal cost represents the contribution of a mode in the norm of the response vector. This paper provides a complete modal cost analysis for simple continua such as beam-like structures. Upper bounds are developed for mode truncation errors in the model reduction process and modal cost analysis dictates which modes to retain in order to reduce the model for control design purposes.

  10. Use of Simple Sequence Repeat (SSR) markers for DNA fingerprinting and diversity analysis of sugarcane (Saccharum spp.) cultivars resistant and susceptible to red rot

    USDA-ARS?s Scientific Manuscript database

    In recent years, SSR markers have been used widely for genetic analysis. The objective of the present research was to use SSR markers to develop DNA-based genetic identification and to analyze the genetic relationships of sugarcane cultivars grown in Pakistan that are either resistant or susceptible to red rot. Twent...

  11. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  12. Development of a Landforms Model for Puerto Rico and its Application for Land Cover Change Analysis

    Treesearch

    Sebastian Martinuzzi; William A. Gould; Olga M. Ramos Gonzalez; Brook E. Edwards

    2007-01-01

    Comprehensive analysis of land morphology is essential to supporting a wide range of environmental studies. We developed a landforms model that identifies eleven landform units for Puerto Rico based on parameters of land position and slope. The model is capable of extracting operational information in a simple way and is adaptable to different environments and objectives...

  13. Analytical and multibody modeling for the power analysis of standing jumps.

    PubMed

    Palmieri, G; Callegari, M; Fioretti, S

    2015-01-01

    Two methods for the power analysis of standing jumps are proposed and compared in this article. The first method is based on a simple analytical formulation which requires as input the coordinates of the center of gravity at three specified instants of the jump. The second method is based on a multibody model that simulates the jumps by processing data obtained from a three-dimensional (3D) motion capture system and dynamometric measurements obtained from force platforms. The multibody model is developed with OpenSim, an open-source software package which provides tools for the kinematic and dynamic analysis of 3D human body models. The study focuses on two of the typical tests used to evaluate the muscular activity of the lower limbs: the counter movement jump and the standing long jump. The comparison between the results obtained by the two methods confirms that the proposed analytical formulation is correct and represents a simple tool suitable for a preliminary analysis of the total mechanical work and mean power exerted in standing jumps.
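    A rough energy-balance sketch shows how centre-of-gravity coordinates at a few instants can yield mean power for a counter movement jump; this is a generic illustration with invented numbers, not the authors' exact equations:

```python
import math

def cmj_mean_power(m, h_crouch, h_takeoff, h_apex, t_push):
    """Illustrative mean-power estimate for a counter movement jump.

    Work done on the centre of gravity during push-off = potential energy
    gained up to takeoff plus kinetic energy at takeoff, where the takeoff
    speed follows from the ballistic rise h_apex - h_takeoff.
    """
    g = 9.81
    v_takeoff = math.sqrt(2 * g * (h_apex - h_takeoff))  # ballistic flight
    work = m * g * (h_takeoff - h_crouch) + 0.5 * m * v_takeoff**2
    return work / t_push

# 70 kg jumper: CoG at 0.60 m crouched, 1.05 m at takeoff, 1.45 m at apex,
# push-off lasting 0.30 s (all numbers invented for illustration)
print(round(cmj_mean_power(70, 0.60, 1.05, 1.45, 0.30), 1))
```

    The three CoG heights are exactly the kind of input a preliminary analytical method needs, which is why it can run without force platforms.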

  14. Effects of different preservation methods on inter simple sequence repeat (ISSR) and random amplified polymorphic DNA (RAPD) molecular markers in botanic samples.

    PubMed

    Wang, Xiaolong; Li, Lin; Zhao, Jiaxin; Li, Fangliang; Guo, Wei; Chen, Xia

    2017-04-01

    To evaluate the effects of different preservation methods (stored in a -20°C ice chest, preserved in liquid nitrogen and dried in silica gel) on inter simple sequence repeat (ISSR) and random amplified polymorphic DNA (RAPD) analyses of various botanical specimens (including broad-leaved, needle-leaved and succulent plants) over different durations (three weeks and three years), we used a statistical analysis based on the number of bands, a genetic index and cluster analysis. The results demonstrate that all of these preservation methods can provide sufficient amounts of genomic DNA for ISSR and RAPD analyses; however, the effect of the different preservation methods on these analyses varies significantly, while the preservation time has little effect. Our results provide a reference for researchers to select the most suitable preservation method, depending on their study subject, for the analysis of molecular markers based on genomic DNA. Copyright © 2017 Académie des sciences. Published by Elsevier Masson SAS. All rights reserved.
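    Band-based comparisons of this kind typically start from a presence/absence similarity such as the Nei-Li/Dice coefficient, which then feeds the cluster analysis. The band patterns below are invented purely to show the arithmetic:

```python
def dice_similarity(a, b):
    """Nei-Li / Dice similarity from 0/1 band-presence vectors, the usual
    starting point for RAPD/ISSR cluster analysis."""
    shared = sum(x and y for x, y in zip(a, b))
    return 2 * shared / (sum(a) + sum(b))

fresh  = [1, 1, 0, 1, 1, 0, 1]   # invented band patterns for one primer
dried  = [1, 1, 0, 1, 0, 0, 1]
frozen = [1, 1, 1, 1, 1, 0, 1]
print(round(dice_similarity(fresh, dried), 3))
print(round(dice_similarity(fresh, frozen), 3))
```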

  15. Predictive Analytics In Healthcare: Medications as a Predictor of Medical Complexity.

    PubMed

    Higdon, Roger; Stewart, Elizabeth; Roach, Jared C; Dombrowski, Caroline; Stanberry, Larissa; Clifton, Holly; Kolker, Natali; van Belle, Gerald; Del Beccaro, Mark A; Kolker, Eugene

    2013-12-01

    Children with special healthcare needs (CSHCN) require health and related services that exceed those required by most hospitalized children. A small but growing and important subset of the CSHCN group includes medically complex children (MCCs). MCCs typically have comorbidities and disproportionately consume healthcare resources. To enable strategic planning for the needs of MCCs, simple screens to identify potential MCCs rapidly in a hospital setting are needed. We assessed whether the number of medications used and the class of those medications correlated with MCC status. Retrospective analysis of medication data from the inpatients at Seattle Children's Hospital found that the numbers of inpatient and outpatient medications significantly correlated with MCC status. Numerous variables based on counts of medications, use of individual medications, and use of combinations of medications were considered, resulting in a simple model based on three different counts of medications: outpatient and inpatient drug classes and individual inpatient drug names. The combined model was used to rank the patient population for medical complexity. As a result, simple, objective admission screens for predicting the complexity of patients based on the number and type of medications were implemented.
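    A screen of this shape can be as simple as a weighted combination of the three medication counts followed by a ranking. The weights, field names and patient records below are invented, since the abstract does not publish the model's coefficients:

```python
# Hypothetical weights; the paper's actual model coefficients are not given.
WEIGHTS = {"outpatient_classes": 0.5, "inpatient_classes": 0.3, "inpatient_names": 0.2}

def complexity_score(patient):
    """Screening score from three medication counts (higher = more complex)."""
    return sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)

patients = [
    {"id": "A", "outpatient_classes": 2, "inpatient_classes": 1, "inpatient_names": 1},
    {"id": "B", "outpatient_classes": 9, "inpatient_classes": 6, "inpatient_names": 8},
]
ranked = sorted(patients, key=complexity_score, reverse=True)
print([p["id"] for p in ranked])  # -> ['B', 'A']
```

    Ranking the whole inpatient population by such a score is what makes the screen usable at admission time, before any detailed chart review.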

  16. Bayesian analysis of volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Ho, Chih-Hsiang

    1990-10-01

    The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
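    The key identity, that a Poisson process whose rate is Gamma(a, b)-distributed yields negative-binomially distributed counts with r = a and p = b/(b+1), is easy to verify by simulation (the parameter values here are arbitrary, not those fitted to Mauna Loa or Etna):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 3.0, 0.5                      # gamma shape and rate for eruptive rate
lam = rng.gamma(shape=a, scale=1.0 / b, size=200_000)
counts = rng.poisson(lam)            # eruptions per unit time-period

# Gamma(a, b)-mixed Poisson is NBD with r = a, p = b/(b+1):
r, p = a, b / (b + 1.0)
print(round(counts.mean(), 2), r * (1 - p) / p)     # means agree: a/b
print(round(counts.var(), 1), r * (1 - p) / p**2)   # variances agree
```

    The over-dispersion is visible directly: the variance (18 here) exceeds the mean (6), which a constant-rate Poisson can never reproduce.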

  17. Harmonics analysis of the ITER poloidal field converter based on a piecewise method

    NASA Astrophysics Data System (ADS)

    Xudong, WANG; Liuwei, XU; Peng, FU; Ji, LI; Yanan, WU

    2017-12-01

    Poloidal field (PF) converters provide controlled DC voltage and current to the PF coils. The many harmonics generated by the PF converter flow into the power grid and seriously affect power systems and electric equipment. Due to the complexity of the system, the traditional integral operation in Fourier analysis is complicated and inaccurate. This paper presents a piecewise method to calculate the harmonics of the ITER PF converter. The relationship between the grid input current and the DC output current of the ITER PF converter is deduced, and the grid current is decomposed into a sum of simple functions. By calculating the harmonics of these simple functions piecewise, the harmonics of the PF converter under different operation modes are obtained. To examine the validity of the method, a simulation model is established in Matlab/Simulink and a corresponding experiment is carried out on the ITER PF integration test platform. The calculated results are found to be consistent with both simulation and experiment, proving the piecewise method correct and valid for calculating the system harmonics.
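    For intuition about why the grid current decomposes into simple functions, the characteristic harmonic pattern of an idealized six-pulse bridge (a piecewise-constant line current, not the full ITER converter model) can be reproduced numerically; the h = 6k ± 1 harmonics appear with amplitude roughly 1/h of the fundamental:

```python
import numpy as np

# Idealized six-pulse converter line current: +Id for 120 deg, 0 for 60 deg,
# -Id for 120 deg, 0 for 60 deg (a piecewise-constant waveform).
N = 3600
theta = np.arange(N) * 2 * np.pi / N
i = np.where((theta >= np.pi / 6) & (theta < 5 * np.pi / 6), 1.0, 0.0) \
  - np.where((theta >= 7 * np.pi / 6) & (theta < 11 * np.pi / 6), 1.0, 0.0)

spec = np.abs(np.fft.rfft(i)) / (N / 2)      # harmonic amplitudes
for h in (1, 3, 5, 7, 11, 13):
    print(h, round(spec[h] / spec[1], 3))    # characteristic h = 6k +/- 1
```

    Triplen harmonics vanish by symmetry, while the 5th, 7th, 11th and 13th come out near 0.2, 0.143, 0.091 and 0.077 of the fundamental, matching the classical 1/h rule for a six-pulse converter.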

  18. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    NASA Astrophysics Data System (ADS)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

    Standalone photovoltaic (SAPV) systems are seen as a promising way of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple and reliable and that exhibits good performance over the system's lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at the different array sizes (areas), performance curves are obtained for optimal design of the SAPV system with a high degree of reliability in terms of autonomy at a specified value of the loss of load probability (LOLP). Based on the array-to-load ratio (ALR) and the levelized energy cost (LEC) obtained through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.
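    The LCC-to-LEC step can be sketched with the standard capital recovery factor; the figures below are invented and the formula is the generic annualization used in life cycle cost analysis, not necessarily the authors' exact formulation:

```python
def levelized_energy_cost(capital, annual_om, years, rate, annual_kwh):
    """Generic LEC from life cycle cost (LCC), annualized with the
    capital recovery factor CRF = r(1+r)^n / ((1+r)^n - 1)."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    lcc = capital + annual_om / crf        # capital + present value of O&M
    return lcc * crf / annual_kwh          # cost per kWh delivered

# hypothetical 5 kWh/day SAPV system: $5000 capital, $100/yr O&M,
# 20-year life at a 6% discount rate
print(round(levelized_energy_cost(5000, 100, 20, 0.06, 5 * 365), 3))
```

    Comparing this figure across candidate array sizes (and hence ALR values) is what lets the design trade reliability against cost.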

  19. Simple yet Hidden Counterexamples in Undergraduate Real Analysis

    ERIC Educational Resources Information Center

    Shipman, Barbara A.; Shipman, Patrick D.

    2013-01-01

    We study situations in introductory analysis in which students affirmed false statements as true, despite simple counterexamples that they easily recognized afterwards. The study draws attention to how simple counterexamples can become hidden in plain sight, even in an active learning atmosphere where students proposed simple (as well as more…

  20. Simple Process-Based Simulators for Generating Spatial Patterns of Habitat Loss and Fragmentation: A Review and Introduction to the G-RaFFe Model

    PubMed Central

    Pe'er, Guy; Zurita, Gustavo A.; Schober, Lucia; Bellocq, Maria I.; Strer, Maximilian; Müller, Michael; Pütz, Sandro

    2013-01-01

    Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model “G-RaFFe” generates roads and fields to reproduce the processes in which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables fields to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land-uses, the latter reflecting land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining a better understanding of the relation between spatial processes and patterns. 
We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature. PMID:23724108

  1. Simple process-based simulators for generating spatial patterns of habitat loss and fragmentation: a review and introduction to the G-RaFFe model.

    PubMed

    Pe'er, Guy; Zurita, Gustavo A; Schober, Lucia; Bellocq, Maria I; Strer, Maximilian; Müller, Michael; Pütz, Sandro

    2013-01-01

    Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model "G-RaFFe" generates roads and fields to reproduce the processes in which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables fields to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land-uses, the latter reflecting land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining a better understanding of the relation between spatial processes and patterns. 
We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature.

  2. PC-BASED SUPERCOMPUTING FOR UNCERTAINTY AND SENSITIVITY ANALYSIS OF MODELS

    EPA Science Inventory

    Evaluating uncertainty and sensitivity of multimedia environmental models that integrate assessments of air, soil, sediments, groundwater, and surface water is a difficult task. It can be an enormous undertaking even for simple, single-medium models (i.e. groundwater only) descr...

  3. Pilot/vehicle model analysis of visually guided flight

    NASA Technical Reports Server (NTRS)

    Zacharias, Greg L.

    1991-01-01

    Information is given in graphical and outline form on a pilot/vehicle model description, control of altitude with simple terrain clues, simulated flight with visual scene delays, model-based in-cockpit display design, and some thoughts on the role of pilot/vehicle modeling.

  4. Application of simple negative feedback model for avalanche photodetectors investigation

    NASA Astrophysics Data System (ADS)

    Kushpil, V. V.

    2009-10-01

    A simple negative feedback model based on Miller's formula is used to investigate the properties of Avalanche Photodetectors (APDs). The proposed method can be applied to study classical APDs as well as new types of devices operating in the Internal Negative Feedback (INF) regime. The method shows good sensitivity to technological APD parameters, making it possible to use it as a tool to analyse various APD parameters. It also allows a better understanding of APD operation conditions. Simulations and experimental data analysis for different types of APDs are presented.

  5. Time-lapse and slow-motion tracking of temperature changes: response time of a thermometer

    NASA Astrophysics Data System (ADS)

    Moggio, L.; Onorato, P.; Gratton, L. M.; Oss, S.

    2017-03-01

    We propose the use of smartphone-based time-lapse and slow-motion video techniques, together with tracking analysis, as valuable tools for investigating thermal processes such as the response time of a thermometer. The two simple experimental activities presented here, suitable also for high school and undergraduate students, allow one to measure in a simple yet rigorous way the response time of an alcohol thermometer and to show its critical dependence on the properties of the surrounding environment, giving insight into instrument characteristics, heat transfer and thermal equilibrium concepts.

  6. Design of ground test suspension systems for verification of flexible space structures

    NASA Technical Reports Server (NTRS)

    Cooley, V. M.; Juang, J. N.; Ghaemmaghami, P.

    1988-01-01

    A simple model demonstrates the frequency-increasing effects of a simple cable suspension on flexible test article/suspension systems. Two passive suspension designs, namely a negative spring mechanism and a rolling cart mechanism, are presented to alleviate the undesirable frequency-increasing effects. Analysis methods are provided for systems in which the augmentations are applied to both discrete and continuous representations of test articles. The damping analyses are based on friction equivalent viscous damping. Numerical examples are given for comparing the two augmentations with respect to minimizing frequency and damping increases.

  7. Requirements analysis, domain knowledge, and design

    NASA Technical Reports Server (NTRS)

    Potts, Colin

    1988-01-01

    Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.

  8. A Conceptual Analysis of State Support for Higher Education: Appropriations versus Need-Based Financial Aid

    ERIC Educational Resources Information Center

    Toutkoushian, Robert K.; Shafiq, M. Najeeb

    2010-01-01

    In this paper, we use economic concepts to examine the choice that states make between giving appropriations to public colleges or need-based financial aid to students. We begin by reviewing the economic justification for state support for higher education. Next, we introduce a simple economic model for comparing and contrasting appropriations and…

  9. Cost Effective Paper-Based Colorimetric Microfluidic Devices and Mobile Phone Camera Readers for the Classroom

    ERIC Educational Resources Information Center

    Koesdjojo, Myra T.; Pengpumkiat, Sumate; Wu, Yuanyuan; Boonloed, Anukul; Huynh, Daniel; Remcho, Thomas P.; Remcho, Vincent T.

    2015-01-01

    We have developed a simple and direct method to fabricate paper-based microfluidic devices that can be used for a wide range of colorimetric assay applications. With these devices, assays can be performed within minutes to allow for quantitative colorimetric analysis by use of a widely accessible iPhone camera and an RGB color reader application…

  10. Quantification of the methylation status of the PWS/AS imprinted region: comparison of two approaches based on bisulfite sequencing and methylation-sensitive MLPA.

    PubMed

    Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes

    2007-06-01

    Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as a model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS), whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS cases is caused by mosaic imprinting defects, which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation and copy number for up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate, reliable and less time-consuming.
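    The quantitative comparison in such probe-based assays reduces to a dosage quotient: a probe's peak signal normalized to reference probes within the sample, divided by the same ratio in a control. The sketch below shows only that arithmetic, with invented peak heights and probe names; it is not the authors' analysis software:

```python
def dosage_quotient(sample_peaks, control_peaks, probe, refs):
    """Dosage quotient: probe signal normalized to the mean of reference
    probes, compared between sample and control (generic MLPA-style
    arithmetic, with hypothetical probe names)."""
    def norm(peaks):
        return peaks[probe] / (sum(peaks[r] for r in refs) / len(refs))
    return norm(sample_peaks) / norm(control_peaks)

control  = {"SNRPN_meth": 100.0, "ref1": 200.0, "ref2": 180.0}  # invented
abnormal = {"SNRPN_meth": 205.0, "ref1": 210.0, "ref2": 190.0}  # invented
print(round(dosage_quotient(abnormal, control, "SNRPN_meth", ["ref1", "ref2"]), 2))
```

    Because everything is expressed as ratios, run-to-run variation in overall signal intensity largely cancels, which is what makes the approach quantitative.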

  11. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
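    The flavour of such conditional rule systems can be conveyed by a toy response-property monitor over an event trace. This is a deliberately minimal sketch with invented event names; real RuleR specifications are far richer, supporting data parameterization and compiled temporal logics:

```python
def check_response(trace, rules):
    """Check response rules: each occurrence of a trigger event must be
    followed, later in the trace, by its obligation event."""
    pending = []
    for event in trace:
        # discharge any obligation satisfied by this event
        pending = [(t, ob) for (t, ob) in pending if ob != event]
        # fire rules whose trigger matches this event
        pending += [(t, ob) for (t, ob) in rules if t == event]
    return [f"'{t}' never followed by '{ob}'" for (t, ob) in pending]

log = ["boot", "dispatch_cmd", "telemetry", "cmd_complete",
       "dispatch_cmd", "shutdown"]
print(check_response(log, [("dispatch_cmd", "cmd_complete")]))
# -> ["'dispatch_cmd' never followed by 'cmd_complete'"]
```

    Run off-line over a log file, a monitor of this shape gives exactly the post-mortem test-automation workflow the abstract describes.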

  12. Development of a Simple Dipstick Assay for Operational Monitoring of DDT.

    PubMed

    Ismail, Hanafy M; Kumar, Vijay; Singh, Rudra P; Williams, Christopher; Shivam, Pushkar; Ghosh, Ayan; Deb, Rinki; Foster, Geraldine M; Hemingway, Janet; Coleman, Michael; Coleman, Marlize; Das, Pradeep; Paine, Mark J I

    2016-01-01

    Indoor residual spraying (IRS) of DDT is used to control visceral leishmaniasis (VL) in India. However, the quality of spraying is severely compromised by a lack of affordable field assays to monitor target doses of insecticide. Our aim was to develop a simple DDT insecticide quantification kit (IQK) for monitoring DDT levels in an operational setting. DDT quantification was based on the stoichiometric release of chloride from DDT by alkaline hydrolysis and detection of the released ion using Quantab chloride detection strips. The assay was specific for insecticidal p,p′-DDT (LoQ = 0.082 g/m²). Bostik discs were effective in post-spray wall sampling, extracting 25-70% of active ingredient depending on the surface. Residual DDT was sampled from walls in Bihar state in India using Bostik adhesive discs, and DDT concentrations (g p,p′-DDT/m²) were determined using IQK and HPLC (n = 1964 field samples). Analysis of 161 Bostik samples (pooled sample pairs) by IQK and HPLC produced excellent correlation (R² = 0.96; Bland-Altman bias = -0.0038). IQK analysis of the remaining field samples matched the HPLC data in identifying households that had been under-sprayed, in range or over-sprayed. A simple dipstick assay has been developed for monitoring DDT spraying that gives results comparable to HPLC. By making laboratory-based analysis of DDT dosing accessible to field operatives, routine monitoring of DDT levels can be promoted in low- and middle-income countries to maximise the effectiveness of IRS.

  13. iSeq: Web-Based RNA-seq Data Analysis and Visualization.

    PubMed

    Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng

    2018-01-01

    Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amount of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server, for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn .

  14. Metrics for comparing neuronal tree shapes based on persistent homology.

    PubMed

    Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A; Mitra, Partha; Wang, Yusu

    2017-01-01

    As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically, we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence-signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. 
At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework.
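    The core construction, pairing births and deaths of sublevel-set components of a descriptor function on a tree, fits in a few lines of union-find. This is a generic 0-dimensional persistence sketch on an invented toy tree, not the authors' pipeline:

```python
def tree_persistence(values, edges):
    """0-dim sublevel-set persistence of a function on tree nodes.
    Nodes enter in order of function value; when two components meet,
    the elder rule kills the one born at the higher value."""
    n = len(values)
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)

    parent, birth, pairs = {}, {}, []

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for v in sorted(range(n), key=lambda u: values[u]):
        parent[v], birth[v] = v, values[v]
        for u in adj[v]:
            if u in parent:                      # neighbour already active
                ru, rv = find(u), find(v)
                if ru != rv:
                    young, old = (ru, rv) if birth[ru] > birth[rv] else (rv, ru)
                    pairs.append((birth[young], values[v]))  # young dies here
                    parent[young] = old
    roots = {find(v) for v in parent}
    pairs += [(birth[r], float("inf")) for r in roots]
    return sorted(p for p in pairs if p[0] < p[1])   # drop zero persistence

# toy neuron tree; values could be e.g. radial distance from the soma
vals = [0.0, 2.0, 1.0, 3.0, 0.5]
edges = [(0, 1), (1, 2), (1, 3), (0, 4)]
print(tree_persistence(vals, edges))  # -> [(0.0, inf), (1.0, 2.0)]
```

    The resulting (birth, death) pairs are the "persistence-signature" for that descriptor function; stacking the signatures of several descriptors yields the feature vector used for comparison.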

  15. Metrics for comparing neuronal tree shapes based on persistent homology

    PubMed Central

    Li, Yanjie; Wang, Dingkang; Ascoli, Giorgio A.; Mitra, Partha

    2017-01-01

    As more and more neuroanatomical data are made available through efforts such as NeuroMorpho.Org and FlyCircuit.org, the need to develop computational tools to facilitate automatic knowledge discovery from such large datasets becomes more urgent. One fundamental question is how best to compare neuron structures, for instance to organize and classify large collections of neurons. We aim to develop a flexible yet powerful framework to support comparison and classification of large collections of neuron structures efficiently. Specifically, we propose to use a topological persistence-based feature vectorization framework. Existing methods to vectorize a neuron (i.e., convert a neuron to a feature vector so as to support efficient comparison and/or searching) typically rely on statistics or summaries of morphometric information, such as the average or maximum local torque angle or partition asymmetry. These simple summaries have limited power in encoding global tree structures. Based on the concept of topological persistence recently developed in the field of computational topology, we vectorize each neuron structure into a simple yet informative summary. In particular, each type of information of interest can be represented as a descriptor function defined on the neuron tree, which is then mapped to a simple persistence-signature. Our framework can encode both local and global tree structure, as well as other information of interest (electrophysiological or dynamical measures), by considering multiple descriptor functions on the neuron. The resulting persistence-based signature is potentially more informative than simple statistical summaries (such as average/mean/max) of morphometric quantities. Indeed, we show that using a certain descriptor function will give a persistence-based signature containing strictly more information than the classical Sholl analysis. 
At the same time, our framework retains the efficiency associated with treating neurons as points in a simple Euclidean feature space, which would be important for constructing efficient searching or indexing structures over them. We present preliminary experimental results to demonstrate the effectiveness of our persistence-based neuronal feature vectorization framework. PMID:28809960
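
    The core vectorization step can be illustrated with a toy version of such a persistence computation. The sketch below (our illustration, not the authors' code) pairs "births" and "deaths" of branches of a rooted tree under a descriptor function, using the elder rule: at each branch point, the branch carrying the larger descriptor value survives. The function name and the example tree are hypothetical.

```python
def tree_persistence(children, f, root):
    """Persistence pairs of a descriptor f on a rooted tree (elder rule):
    at each branch point, the child branch carrying the larger descriptor
    value survives; every other branch 'dies' there."""
    pairs = []

    def walk(node):
        kids = children.get(node, [])
        if not kids:                      # a leaf gives birth to a feature
            return f[node]
        alive = sorted((walk(c) for c in kids), reverse=True)
        for born in alive[1:]:            # all but the strongest branch die here
            pairs.append((born, f[node]))
        return alive[0]

    pairs.append((walk(root), f[root]))   # the last surviving branch dies at the root
    return pairs

# hypothetical tree, with f chosen as path distance from the root
children = {"r": ["a", "b"], "b": ["c", "d"]}
f = {"r": 0, "a": 3, "b": 1, "c": 2, "d": 4}
pairs = tree_persistence(children, f, "r")
```

    The multiset of (birth, death) pairs is the persistence signature; it can then be binned into a fixed-length vector for comparison in a Euclidean feature space.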

  16. Simple Numerical Analysis of Longboard Speedometer Data

    ERIC Educational Resources Information Center

    Hare, Jonathan

    2013-01-01

    Simple numerical data analysis is described, using a standard spreadsheet program, to determine distance, velocity (speed) and acceleration from voltage data generated by a skateboard/longboard speedometer (Hare 2012 "Phys. Educ." 47 409-17). This simple analysis is an introduction to data processing including scaling data as well as…
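
    The spreadsheet steps described (scaling voltage to speed, then integrating for distance and differentiating for acceleration) can be mirrored in a few lines of code. This is our illustrative sketch, not the article's worksheet; the calibration constant and the voltage trace are invented.

```python
import numpy as np

K = 2.5                      # assumed calibration: metres per second per volt
t = np.linspace(0, 10, 101)  # time stamps [s]
volts = 0.4 * t              # fabricated record: a steady voltage ramp
v = K * volts                # speed [m/s], from the voltage scaling

a = np.gradient(v, t)        # acceleration by finite differences
# distance by trapezoidal integration of speed over time
d = np.concatenate(([0.0], np.cumsum(0.5 * (v[1:] + v[:-1]) * np.diff(t))))
```

    For this linear ramp the recovered acceleration is a constant 1 m/s² and the distance after 10 s is 50 m, which is a quick sanity check on the spreadsheet formulas.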

  17. The Productivity Dilemma in Workplace Health Promotion

    PubMed Central

    Cherniack, Martin

    2015-01-01

    Background. Worksite-based programs to improve workforce health and well-being, known as workplace health promotion (WHP), have been advanced as conduits to improved worker productivity and decreased health care costs. There has been a countervailing health economics contention that return on investment (ROI) does not merit preventive health investment. Methods/Procedures. Pertinent studies were reviewed and their results reconsidered. A simple economic model is presented based on conventional and alternate assumptions used in cost-benefit analysis (CBA), such as discounting and negative value. The issues are presented in the format of 3 conceptual dilemmas. Principal Findings. In some occupations, such as nursing, the utility of patient survival and staff health is undervalued. WHP may miss important components of work-related health risk. Altering assumptions on discounting and eliminating the drag of negative value radically change the CBA value. Significance. Simple monetization of a work life and calculation of return on workforce health investment as a simple alternate opportunity involve highly selective interpretations of productivity and utility. PMID:26380374
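
    The leverage that discounting assumptions exert on such a CBA can be shown with a minimal net-present-value sketch (ours, not the paper's model); the cash-flow stream is invented for illustration.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows at a constant discount rate;
    a positive rate shrinks the weight of benefits that arrive later."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# hypothetical WHP investment: cost now, equal health benefits in later years
flows = [-100, 40, 40, 40]
roi_undiscounted = npv(flows, 0.0)   # 20.0: the programme looks worthwhile
roi_discounted = npv(flows, 0.10)    # negative: the same programme now "fails"
```

    The identical benefit stream flips from positive to negative ROI purely by changing the discount rate, which is the kind of assumption sensitivity the paper's dilemmas turn on.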

  18. Masking as an effective quality control method for next-generation sequencing data analysis.

    PubMed

    Yun, Sajung; Yun, Sijung

    2014-12-13

    Next generation sequencing produces base calls with low quality scores that can affect the accuracy of identifying simple nucleotide variation calls, including single nucleotide polymorphisms and small insertions and deletions. Here we compare the effectiveness of two data preprocessing methods, masking and trimming, and the accuracy of simple nucleotide variation calls on whole-genome sequence data from Caenorhabditis elegans. Masking substitutes low quality base calls with 'N's (undetermined bases), whereas trimming removes low quality bases, resulting in shorter read lengths. We demonstrate that masking is more effective than trimming in reducing the false-positive rate in single nucleotide polymorphism (SNP) calling. However, neither preprocessing method affected the false-negative rate in SNP calling with statistical significance compared to the data analysis without preprocessing. False-positive and false-negative rates for small insertions and deletions did not differ between masking and trimming. We recommend masking over trimming as the more effective preprocessing method for next generation sequencing data analysis, since masking reduces the false-positive rate in SNP calling without sacrificing the false-negative rate, even though trimming is currently more common in the field. The perl script for masking is available at http://code.google.com/p/subn/. The sequencing data used in the study were deposited in the Sequence Read Archive (SRX450968 and SRX451773).
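
    The two preprocessing strategies being compared are easy to state in code. A minimal sketch (not the authors' perl script; real trimmers are more sophisticated) might look like:

```python
def mask_low_quality(seq, quals, threshold=20):
    """Replace bases whose Phred quality is below `threshold` with 'N'.
    Unlike trimming, the read length is preserved."""
    return "".join("N" if q < threshold else b for b, q in zip(seq, quals))

def trim_low_quality(seq, quals, threshold=20):
    """Hard-trim from the 3' end while the quality stays below `threshold`
    (a deliberately crude stand-in for real trimming tools)."""
    end = len(seq)
    while end > 0 and quals[end - 1] < threshold:
        end -= 1
    return seq[:end]
```

    Masking keeps the read's coordinates intact while flagging unreliable positions, which is one intuition for why it can lower false positives without raising false negatives.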

  19. Highly selective and sensitive determination of Cu2+ in drink and water samples based on a 1,8-diaminonaphthalene derived fluorescent sensor

    NASA Astrophysics Data System (ADS)

    Sun, Tao; Li, Yang; Niu, Qingfen; Li, Tianduo; Liu, Yan

    2018-04-01

    A new simple and efficient fluorescent sensor L, based on a 1,8-diaminonaphthalene Schiff base, has been developed for the highly sensitive and selective determination of Cu2+ in drink and water samples. The sensor responds to Cu2+ selectively over the other tested metal ions, with an obvious color change from blue to colorless that is easily detected by the naked eye. The detection limit is as low as 13.2 nM, and the response is very fast, within 30 s. The 1:1 binding mechanism was well confirmed by fluorescence measurements, IR analysis and DFT calculations. Importantly, sensor L was employed for the quick detection of Cu2+ in drink and environmental water samples with satisfactory results, providing a simple, rapid, reliable and feasible Cu2+-sensing method.

  20. Towards a Certified Lightweight Array Bound Checker for Java Bytecode

    NASA Technical Reports Server (NTRS)

    Pichardie, David

    2009-01-01

    Dynamic array bound checks are crucial for the security of a Java Virtual Machine. These dynamic checks are, however, expensive, and several static analysis techniques have been proposed to eliminate explicit bounds checks. Such analyses require advanced numerical and symbolic manipulations that 1) penalize bytecode loading or dynamic compilation, and 2) complicate the trusted computing base. Following the Foundational Proof Carrying Code methodology, our goal is to provide a lightweight bytecode verifier for eliminating array bound checks that is both efficient and trustable. In this work, we define a generic relational program analysis for an imperative, stack-oriented bytecode language with procedures, arrays and global variables, and instantiate it with a relational abstract domain of polyhedra. The analysis performs automatic inference of loop invariants and method pre-/post-conditions, and allows efficient checking of analysis results by a simple checker. Invariants, which can be large, can be specialized for proving a safety policy using an automatic pruning technique that reduces their size. The result of the analysis can be checked efficiently by annotating the program with parts of the invariant together with certificates of polyhedral inclusions. The resulting checker is sufficiently simple to be entirely certified within the Coq proof assistant for a simple fragment of the Java bytecode language. During the talk, we will also report on our ongoing effort to scale this approach to the full sequential JVM.

  1. A New Mixing Diagnostic and Gulf Oil Spill Movement

    DTIC Science & Technology

    2010-10-01

    could be used with new estimates of the suppression parameter to yield appreciably larger estimates of the hydrogen content in the shallow lunar ...paradigm for mixing in fluid flows with simple time dependence. Its skeletal structure is based on analysis of invariant attracting and repelling...continues to the present day. Model analysis and forecasts are compared to independent (nonassimilated) infrared frontal positions and drifter trajectories

  2. ATAC Autocuer Modeling Analysis.

    DTIC Science & Technology

    1981-01-01

    the analysis of the simple rectangular segmentation (1) is based on detection and estimation theory (2). This approach uses the concept of maximum ...continuous waveforms. In order to develop the principles of maximum likelihood, it is convenient to develop the principles for the "classical...the concept of maximum likelihood is significant in that it provides the optimum performance of the detection/estimation problem. With a knowledge of

  3. Facile synthesis of magnetic carbon nitride nanosheets and its application in magnetic solid phase extraction for polycyclic aromatic hydrocarbons in edible oil samples.

    PubMed

    Zheng, Hao-Bo; Ding, Jun; Zheng, Shu-Jian; Zhu, Gang-Tian; Yuan, Bi-Feng; Feng, Yu-Qi

    2016-01-01

    In this study, we proposed a method to fabricate magnetic carbon nitride (CN) nanosheets by simple physical blending. Low-cost CN nanosheets prepared from urea possessed a highly π-conjugated structure; therefore the obtained composites were employed as a magnetic solid-phase extraction (MSPE) sorbent for the extraction of polycyclic aromatic hydrocarbons (PAHs) in edible oil samples. Moreover, sample pre-treatment could be carried out within 10 min. Thus, a simple and cheap method for the analysis of PAHs in edible oil samples was established by coupling magnetic CN nanosheet-based MSPE with gas chromatography-mass spectrometry (GC/MS) analysis. Limits of quantitation (LOQs) for eight PAHs ranged from 0.4 to 0.9 ng/g. The intra- and inter-day relative standard deviations (RSDs) were less than 15.0%. The recoveries of PAHs for spiked soybean oil samples ranged from 91.0% to 124.1%, with RSDs of less than 10.2%. Taken together, the proposed method offers a simple and cost-effective option for the convenient analysis of PAHs in oil samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Bacterial Expression of a Single-Chain Variable Fragment (scFv) Antibody against Ganoderic Acid A: A Cost-Effective Approach for Quantitative Analysis Using the scFv-Based Enzyme-Linked Immunosorbent Assay.

    PubMed

    Yusakul, Gorawit; Nuntawong, Poomraphie; Sakamoto, Seiichi; Ratnatilaka Na Bhuket, Pahweenvaj; Kohno, Toshitaka; Kikkawa, Nao; Rojsitthisak, Pornchai; Shimizu, Kuniyoshi; Tanaka, Hiroyuki; Morimoto, Satoshi

    2017-01-01

    Due to the highly specific binding between an antibody and its target, immunoassays offer superior analytical performance for phytochemical analysis compared with conventional chromatographic techniques. Here, we describe a simple method for producing a functional single-chain variable fragment (scFv) antibody against ganoderic acid A (GAA), a pharmacologically active metabolite from Ganoderma lingzhi. The Escherichia coli BL21(DE3) strain produced a large amount of anti-GAA scFv; however, in vitro refolding steps, which only partially recovered the reactivity of the scFv, were required. Interestingly, the functional scFv was expressed in a soluble and active form in the cytoplasm of an engineered E. coli SHuffle® strain. Purified anti-GAA scFv, with a yield of 2.56 mg from 1 L of culture medium, was obtained using simple and inexpensive expression and purification procedures. The anti-GAA scFv-based indirect competitive enzyme-linked immunosorbent assay (icELISA) exhibited high sensitivity (linearity: 0.078-1.25 µg/mL) with precision (CV: ≤6.20%) and reliability (recovery: 100.1-101.8%) for GAA determination. In summary, the approach described here is an inexpensive, simple, and efficient expression system that extends the application of anti-GAA scFv-based immunoassays. In addition, when in vitro refolding steps can be skipped, the cost and complexity of scFv antibody production can be minimized.

  5. Load alleviation maneuvers for a launch vehicle

    NASA Technical Reports Server (NTRS)

    Seywald, Hans; Bless, Robert

    1993-01-01

    This paper addresses the design of a forward-looking autopilot that is capable of employing a priori knowledge of wind gusts ahead of the flight path to reduce the bending loads experienced by a launch vehicle. The analysis presented here is only preliminary, employing a very simple vehicle dynamical model and restricting itself to wind gusts in the form of isolated spikes. The main result of the present study is that LQR-based feedback laws are inappropriate for handling spike-type wind perturbations with large amplitude and narrow base. The best performance is achieved with an interior-point penalty optimal control formulation, which can be well approximated by a simple feedback control law. Reduction of the maximum bending loads by nearly 50 percent is demonstrated.

  6. Prediction of the Main Engine Power of a New Container Ship at the Preliminary Design Stage

    NASA Astrophysics Data System (ADS)

    Cepowski, Tomasz

    2017-06-01

    The paper presents mathematical relationships that allow us to forecast the estimated main engine power of new container ships, based on data concerning vessels built in 2005-2015. The presented approximations allow us to estimate the engine power based on the length between perpendiculars and the number of containers the ship will carry. The approximations were developed using simple linear regression and multivariate linear regression analysis. The presented relations have practical application for estimation of container ship engine power needed in preliminary parametric design of the ship. It follows from the above that the use of multiple linear regression to predict the main engine power of a container ship brings more accurate solutions than simple linear regression.

  7. Energy Savings Analysis for Energy Monitoring and Control Systems

    DTIC Science & Technology

    1995-01-01

    for evaluating design and construction quality, and for studying the effectiveness of air-tightening AC retrofits. No simple relationship...These models of residential infiltration are based on statistical fits of...Energy Resource Center (1983) include information on air tightening in

  8. CONVERTING ISOTOPE RATIOS TO DIET COMPOSITION - THE USE OF MIXING MODELS

    EPA Science Inventory

    Investigations of wildlife foraging ecology with stable isotope analysis are increasing. Converting isotope values to proportions of different foods in a consumer's diet requires the use of mixing models. Simple mixing models based on mass balance equations have been used for d...

  9. Effects of Exposure Measurement Error in the Analysis of Health Effects from Traffic-Related Air Pollution

    EPA Science Inventory

    In large epidemiological studies, many researchers use surrogates of air pollution exposure such as geographic information system (GIS)-based characterizations of traffic or simple housing characteristics. It is important to validate these surrogates against measured pollutant co...

  10. IDEA: An Interdisciplinary Unit Comparing "Don Quixote" to "Hamlet."

    ERIC Educational Resources Information Center

    Harris, Mary J. G.

    2001-01-01

    Describes an idea for teaching language through content-based instruction in which a high school Spanish class studying an abridged version of Cervantes' "Don Quixote" and an English class reading Shakespeare's "Hamlet" did a simple comparative analysis of the two texts. (Author/VWL)

  11. Mixed Beam Murine Harderian Gland Tumorigenesis: Predicted Dose-Effect Relationships if neither Synergism nor Antagonism Occurs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siranart, Nopphon; Blakely, Eleanor A.; Cheng, Alden

    Complex mixed radiation fields exist in interplanetary space, and not much is known about their latent effects on space travelers. In silico synergy analysis default predictions are useful when planning relevant mixed-ion-beam experiments and interpreting their results. These predictions are based on individual dose-effect relationships (IDER) for each component of the mixed-ion beam, assuming no synergy or antagonism. For example, a default hypothesis of simple effect additivity has often been used throughout the study of biology. However, for more than a century pharmacologists interested in mixtures of therapeutic drugs have analyzed conceptual, mathematical and practical questions similar to those that arise when analyzing mixed radiation fields, and have shown that simple effect additivity often gives unreasonable predictions when the IDER are curvilinear. Various alternatives to simple effect additivity proposed in radiobiology, pharmacometrics, toxicology and other fields are also known to have important limitations. In this work, we analyze upcoming murine Harderian gland (HG) tumor prevalence mixed-beam experiments, using customized open-source software and published IDER from past single-ion experiments. The upcoming experiments will use acute irradiation and the mixed beam will include components of high atomic number and energy (HZE). We introduce a new alternative to simple effect additivity, "incremental effect additivity", which is more suitable for the HG analysis and perhaps for other end points. We use incremental effect additivity to calculate default predictions for mixture dose-effect relationships, including 95% confidence intervals. We have drawn three main conclusions from this work. 1. It is important to supplement mixed-beam experiments with single-ion experiments, with matching end point(s), shielding and dose timing. 2. For HG tumorigenesis due to a mixed beam, simple effect additivity and incremental effect additivity sometimes give default predictions that are numerically close. However, if nontargeted effects are important and the mixed beam includes a number of different HZE components, simple effect additivity becomes unusable and another method is needed such as incremental effect additivity. 3. Eventually, synergy analysis default predictions of the effects of mixed radiation fields will be replaced by more mechanistic, biophysically-based predictions. However, optimizing synergy analyses is an important first step. If mixed-beam experiments indicate little synergy or antagonism, plans by NASA for further experiments and possible missions beyond low earth orbit will be substantially simplified.

  12. ANALYSIS OF METHODS FOR DETECTING THE PROXIMITY EFFECT IN QUASAR SPECTRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Aglio, Aldo; Gnedin, Nickolay Y., E-mail: adaglio@aip.d

    Using numerical simulations of structure formation, we investigate several methods for determining the strength of the proximity effect in the H I Ly{alpha} forest. We analyze three high-resolution ({approx}10 kpc) redshift snapshots (z-bar=4, 3, and 2.25) of a Hydro-Particle-Mesh simulation to obtain realistic absorption spectra of the H I Ly{alpha} forest. We model the proximity effect along the simulated sight lines with a simple analytical prescription based on the assumed quasar luminosity and the intensity of the cosmic UV background (UVB). We begin our analysis investigating the intrinsic biases thought to arise in the widely adopted standard technique of combining multiple lines of sight when searching for the proximity effect. We confirm the existence of these biases, albeit smaller than previously predicted with simple Monte Carlo simulations. We then concentrate on the analysis of the proximity effect along individual lines of sight. After determining its strength with a fiducial value of the UVB intensity, we construct the proximity effect strength distribution (PESD). We confirm that the PESD inferred from the simple averaging technique accurately recovers the input strength of the proximity effect at all redshifts. Moreover, the PESD closely follows the behaviors found in observed samples of quasar spectra. However, the PESD obtained from our new simulated sight lines presents some differences to that of simple Monte Carlo simulations. At all redshifts, we find a smaller dispersion of the strength parameters, the source of the correspondingly smaller biases found when combining multiple lines of sight. After developing three new theoretical methods for recovering the strength of the proximity effect on individual lines of sight, we compare their accuracy to the PESD from the simple averaging technique. All our new approaches are based on the maximization of the likelihood function, albeit invoking some modifications. The new techniques presented here, in spite of their complexity, fail to recover the input proximity effect in an unbiased way, presumably due to some (unknown) higher order correlations in the spectrum. Thus, employing complex three-dimensional simulations, we provide strong evidence in favor of the PESD obtained from the simple averaging technique, as a method of estimating the UVB intensity, free of any intrinsic biases.

  13. A simple apparatus for quick qualitative analysis of CR39 nuclear track detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gautier, D. C.; Kline, J. L.; Flippo, K. A.

    2008-10-15

    Quantifying the ion pits in Columbia Resin 39 (CR39) nuclear track detector from Thomson parabolas is a time-consuming and tedious process using conventional microscope-based techniques. A simple inventive apparatus for fast screening and qualitative analysis of CR39 detectors has been developed, enabling efficient selection of data for a more detailed analysis. The system consists simply of a green He-Ne laser and a high-resolution digital single-lens reflex camera. The laser illuminates the edge of the CR39 at grazing incidence and couples into the plastic, acting as a light pipe. Subsequently, the laser illuminates all ion tracks on the surface. A high-resolution digital camera is used to photograph the scattered light from the ion tracks, enabling one to quickly determine the charge states and energies measured by the Thomson parabola.

  14. A Simple Engineering Analysis of Solar Particle Event High Energy Tails and Their Impact on Vehicle Design

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.; Walker, Steven A.; Clowdsley, Martha S.

    2016-01-01

    The mathematical models for Solar Particle Event (SPE) high energy tails are constructed with several different algorithms. Since limited measured data exist above energies around 400 MeV, this paper arbitrarily defines the high energy tail as any proton with an energy above 400 MeV. In order to better understand the importance of accurately modeling the high energy tail for SPE spectra, the contribution to astronaut whole body effective dose equivalent of the high energy portions of three different SPE models has been evaluated. To ensure completeness of this analysis, simple and complex geometries were used. This analysis showed that the high energy tail of certain SPEs can be relevant to astronaut exposure and hence safety. Therefore, models of high energy tails for SPEs should be well analyzed and based on data if possible.

  15. Aviation and programmatic analyses; Volume 1, Task 1: Aviation data base development and application. [for NASA OAST programs

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A method was developed for using the NASA aviation data base and computer programs in conjunction with the GE management analysis and projection service to perform simple and complex economic analysis for planning, forecasting, and evaluating OAST programs. Capabilities of the system are discussed along with procedures for making basic data tabulations, updates and entries. The system is applied in an agricultural aviation study in order to assess its value for actual utility in the OAST working environment.

  16. AR(p) -based detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Alvarez-Ramirez, J.; Rodriguez, E.

    2018-07-01

    Autoregressive models are commonly used for modeling time series from nature, economics and finance. This work explored simple autoregressive AR(p) models for removing long-term trends in detrended fluctuation analysis (DFA). Crude oil prices and the bitcoin exchange rate were considered, the former corresponding to a mature market and the latter to an emergent one. Results showed that AR(p)-based DFA performs similarly to traditional DFA. However, the AR(p)-based variant also provides information on the stability of long-term trends, which is valuable for understanding and quantifying the dynamics of complex time series from financial systems.
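
    For reference, classical DFA with polynomial detrending can be sketched as below; the variant studied in the paper replaces the per-window polynomial fit with an AR(p)-modeled trend. This is our illustrative sketch, not the authors' code, and the test series is synthetic white noise rather than the oil or bitcoin data.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Classical DFA: integrate the series, detrend each window of size n
    with a degree-`order` polynomial, and return the fluctuation F(n)."""
    y = np.cumsum(x - np.mean(x))
    F = []
    for n in scales:
        m = len(y) // n
        segments = y[:m * n].reshape(m, n)
        t = np.arange(n)
        ms = []
        for s in segments:
            coeffs = np.polyfit(t, s, order)           # local trend
            ms.append(np.mean((s - np.polyval(coeffs, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(4000)                          # uncorrelated noise
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
# scaling exponent: slope of log F(n) vs log n (~0.5 for white noise)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

    Swapping the `np.polyfit` trend for a fitted AR(p) prediction inside each window is the essential change the paper explores.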

  17. Selecting supplier combination based on fuzzy multicriteria analysis

    NASA Astrophysics Data System (ADS)

    Han, Zhi-Qiu; Luo, Xin-Xing; Chen, Xiao-Hong; Yang, Wu-E.

    2015-07-01

    Existing multicriteria analysis (MCA) methods are probably ineffective in selecting a supplier combination. Thus, an MCA-based fuzzy 0-1 programming method is introduced. The programming is built on the simple MCA matrix ordinarily used to select a single supplier; solving it yields the most feasible combination of suppliers. Importantly, this result can differ from selecting suppliers one by one in the single-selection order that existing MCA methods use to rank individual suppliers. An example highlights this difference and illustrates the proposed method.
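
    The difference between ranking single suppliers and selecting a combination can be shown with a brute-force stand-in for the 0-1 programming. The scores, weights, and max-based aggregation below are hypothetical illustrations, not the paper's fuzzy formulation.

```python
from itertools import combinations

# hypothetical scores of 4 suppliers on 3 criteria (0-1 scale) and weights
scores = {"S1": [0.8, 0.6, 0.9], "S2": [0.7, 0.9, 0.5],
          "S3": [0.6, 0.8, 0.8], "S4": [0.9, 0.5, 0.7]}
weights = [0.5, 0.3, 0.2]
need = 2  # size of the supplier combination to select

def combo_value(combo):
    # aggregate each criterion as the best score within the combination,
    # then take the weighted sum (one simple aggregation choice)
    best = [max(scores[s][j] for s in combo) for j in range(len(weights))]
    return sum(w * b for w, b in zip(weights, best))

# 0-1 selection by exhaustive search over all feasible combinations
best_combo = max(combinations(scores, need), key=combo_value)

# contrast: picking the top-2 suppliers from a single-supplier ranking
single_rank = sorted(scores, key=lambda s: combo_value((s,)), reverse=True)
top2 = tuple(sorted(single_rank[:2]))
```

    With these numbers the individually best suppliers are S1 and S4, yet the best combination is S2 and S4, because S2's strength on the second criterion complements S4; this is exactly the divergence from single-selection ranking that the paper emphasizes.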

  18. Enhancement of orientation gradients during simple shear deformation by application of simple compression

    NASA Astrophysics Data System (ADS)

    Jahedi, Mohammad; Ardeljan, Milan; Beyerlein, Irene J.; Paydar, Mohammad Hossein; Knezevic, Marko

    2015-06-01

    We use a multi-scale, polycrystal plasticity micromechanics model to study the development of orientation gradients within crystals deforming by slip. At the largest scale, the model is a full-field crystal plasticity finite element model with explicit 3D grain structures created by DREAM.3D, and at the finest scale, at each integration point, slip is governed by a dislocation density based hardening law. For deformed polycrystals, the model predicts intra-granular misorientation distributions that follow well the scaling law seen experimentally by Hughes et al., Acta Mater. 45(1), 105-112 (1997), independent of strain level and deformation mode. We reveal that the application of a simple compression step prior to simple shearing significantly enhances the development of intra-granular misorientations compared to simple shearing alone for the same amount of total strain. We rationalize that the changes in crystallographic orientation and shape evolution when going from simple compression to simple shearing increase the local heterogeneity in slip, leading to the boost in intra-granular misorientation development. In addition, the analysis finds that simple compression introduces additional crystal orientations that are prone to developing intra-granular misorientations, which also help to increase intra-granular misorientations. Many metal working techniques for refining grain sizes involve a preliminary or concurrent application of compression with severe simple shearing. Our finding reveals that a pre-compression deformation step can, in fact, serve as another processing variable for improving the rate of grain refinement during the simple shearing of polycrystalline metals.

  19. Correction of Atmospheric Haze in RESOURCESAT-1 LISS-4 MX Data for Urban Analysis: AN Improved Dark Object Subtraction Approach

    NASA Astrophysics Data System (ADS)

    Mustak, S.

    2013-09-01

    The correction of atmospheric effects is essential because visible bands of shorter wavelength are strongly affected by atmospheric scattering, especially Rayleigh scattering. The objectives of the paper are to find the haze values present in all spectral bands and to correct them for urban analysis. In this paper, the Improved Dark Object Subtraction method of P. Chavez (1988) is applied to correct atmospheric haze in the Resourcesat-1 LISS-4 multispectral satellite image. Dark Object Subtraction is a very simple image-based method of atmospheric haze correction which assumes that there are at least a few pixels within an image that should be black (0% reflectance); such black reflectance, termed a dark object, comes from clear water bodies and shadows, whose DN values are zero or close to zero in the image. The Simple Dark Object Subtraction method is a first-order atmospheric correction, whereas the Improved Dark Object Subtraction method corrects the haze in terms of atmospheric scattering and path radiance based on a power law for the relative scattering effect of the atmosphere. The haze values extracted using the Simple Dark Object Subtraction method for the Green band (Band 2), Red band (Band 3) and NIR band (Band 4) are 40, 34 and 18, but the haze values extracted using the Improved Dark Object Subtraction method are 40, 18.02 and 11.80 for the aforesaid bands. It is concluded that the haze values extracted by the Improved Dark Object Subtraction method provide more realistic results than the Simple Dark Object Subtraction method.
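
    The improved method's scaling step can be sketched as follows. The power-law exponent n is an assumption standing in for Chavez's relative-scattering classes, and the wavelengths and image are invented; the paper's fitted haze values need not follow this exact law.

```python
import numpy as np

def improved_dos(image, lam, haze_ref, lam_ref, n):
    """Sketch of improved dark-object subtraction: the haze DN measured in a
    reference band is scaled to wavelength `lam` with a relative-scattering
    power law (lam ** -n), then subtracted from the band (clipped at 0)."""
    haze = haze_ref * (lam / lam_ref) ** (-n)
    return np.clip(image - haze, 0, None), haze

# hypothetical green-band haze of 40 DN scaled to red and NIR wavelengths
img = np.array([[50.0, 42.0], [60.0, 39.0]])
corrected_red, haze_red = improved_dos(img, 0.65, 40.0, 0.56, 4.0)
corrected_nir, haze_nir = improved_dos(img, 0.80, 40.0, 0.56, 4.0)
```

    The predicted haze falls off with wavelength, matching the qualitative pattern in the paper's band-wise results (largest haze in green, smallest in NIR).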

  20. Evaluation of Strain-Life Fatigue Curve Estimation Methods and Their Application to a Direct-Quenched High-Strength Steel

    NASA Astrophysics Data System (ADS)

    Dabiri, M.; Ghafouri, M.; Rohani Raftar, H. R.; Björk, T.

    2018-03-01

    Methods to estimate the strain-life curve, divided into three categories (simple approximations, artificial neural network-based approaches and continuum damage mechanics models), were examined, and their accuracy was assessed in the strain-life evaluation of a direct-quenched high-strength steel. All the prediction methods claim to be able to perform low-cycle fatigue analysis using available or easily obtainable material properties, thus eliminating the need for costly and time-consuming fatigue tests. The simple approximations were able to estimate the strain-life curve with satisfactory accuracy using only monotonic properties. The tested neural network-based model, although yielding acceptable results for the material in question, was found to be overly sensitive to the data sets used for training and showed inconsistency in its estimation of the fatigue life and fatigue properties. The studied continuum damage-based model was able to produce a curve detecting the early stages of crack initiation, but requires more experimental data for calibration than the simple approximations. As a result of the different theories underlying the analyzed methods, the approaches have different strengths and weaknesses. However, the group of parametric equations categorized as simple approximations was found to be the easiest for practical use, with their applicability having already been verified for a broad range of materials.
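
    As an example of the "simple approximation" category, the Uniform Material Law for steels estimates the Coffin-Manson-Basquin strain-life curve from monotonic properties alone. The constants below are the commonly quoted ones and should be treated as assumptions; this is our sketch, not the paper's calibration for the direct-quenched steel.

```python
def uml_strain_amplitude(ultimate_mpa, youngs_mpa, reversals_2n):
    """Strain amplitude e_a = (sf'/E)(2N)^b + ef'(2N)^c, with the four
    fatigue parameters estimated from monotonic properties per the
    Uniform Material Law for steels (constants as commonly quoted)."""
    sf = 1.50 * ultimate_mpa   # fatigue strength coefficient sf'
    b = -0.087                 # fatigue strength exponent
    ef = 0.59                  # fatigue ductility coefficient ef' (psi = 1)
    c = -0.58                  # fatigue ductility exponent
    return (sf / youngs_mpa) * reversals_2n ** b + ef * reversals_2n ** c

# hypothetical steel: Su = 700 MPa, E = 206 GPa
e_lcf = uml_strain_amplitude(700.0, 206000.0, 2e3)   # low-cycle regime
e_hcf = uml_strain_amplitude(700.0, 206000.0, 2e5)   # high-cycle regime
```

    The appeal of this class of methods, as the paper notes, is that only the ultimate tensile strength and Young's modulus are needed to produce a usable curve.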

  1. Method for the determination of natural ester-type gum bases used as food additives via direct analysis of their constituent wax esters using high-temperature GC/MS.

    PubMed

    Tada, Atsuko; Ishizuki, Kyoko; Yamazaki, Takeshi; Sugimoto, Naoki; Akiyama, Hiroshi

    2014-07-01

    Natural ester-type gum bases, which are used worldwide as food additives, mainly consist of wax esters composed of long-chain fatty acids and long-chain fatty alcohols. There are many varieties of ester-type gum bases, and thus a useful method for their discrimination is needed in order to establish official specifications and manage their quality control. Herein is reported a rapid and simple method for the analysis of different ester-type gum bases used as food additives by high-temperature gas chromatography/mass spectrometry (GC/MS). With this method, the constituent wax esters in ester-type gum bases can be detected without hydrolysis and derivatization. The method was applied to the determination of 10 types of gum bases, including beeswax, carnauba wax, lanolin, and jojoba wax, and it was demonstrated that the gum bases derived from identical origins have specific and characteristic total ion chromatogram (TIC) patterns and ester compositions. Food additive gum bases were thus distinguished from one another based on their TIC patterns and then more clearly discriminated using simultaneous monitoring of the fragment ions corresponding to the fatty acid moieties of the individual molecular species of the wax esters. This direct high-temperature GC/MS method was shown to be very useful for the rapid and simple discrimination of varieties of ester-type gum bases used as food additives.

  2. Method for the determination of natural ester-type gum bases used as food additives via direct analysis of their constituent wax esters using high-temperature GC/MS

    PubMed Central

    Tada, Atsuko; Ishizuki, Kyoko; Yamazaki, Takeshi; Sugimoto, Naoki; Akiyama, Hiroshi

    2014-01-01

    Natural ester-type gum bases, which are used worldwide as food additives, mainly consist of wax esters composed of long-chain fatty acids and long-chain fatty alcohols. There are many varieties of ester-type gum bases, and thus a useful method for their discrimination is needed in order to establish official specifications and manage their quality control. Herein is reported a rapid and simple method for the analysis of different ester-type gum bases used as food additives by high-temperature gas chromatography/mass spectrometry (GC/MS). With this method, the constituent wax esters in ester-type gum bases can be detected without hydrolysis and derivatization. The method was applied to the determination of 10 types of gum bases, including beeswax, carnauba wax, lanolin, and jojoba wax, and it was demonstrated that the gum bases derived from identical origins have specific and characteristic total ion chromatogram (TIC) patterns and ester compositions. Food additive gum bases were thus distinguished from one another based on their TIC patterns and then more clearly discriminated using simultaneous monitoring of the fragment ions corresponding to the fatty acid moieties of the individual molecular species of the wax esters. This direct high-temperature GC/MS method was shown to be very useful for the rapid and simple discrimination of varieties of ester-type gum bases used as food additives. PMID:25473499

  3. A step-by-step guide to non-linear regression analysis of experimental data using a Microsoft Excel spreadsheet.

    PubMed

    Brown, A M

    2001-06-01

    The objective of the present study was to introduce a simple, easily understood method for carrying out non-linear regression analysis based on user input functions. While it is relatively straightforward to fit data with simple functions such as linear or logarithmic functions, fitting data with more complicated non-linear functions is more difficult. Commercial specialist programmes are available that will carry out this analysis, but these programmes are expensive and are not intuitive to learn. An alternative method described here is to use the SOLVER function of the ubiquitous spreadsheet programme Microsoft Excel, which employs an iterative least squares fitting routine to produce the optimal goodness of fit between data and function. The intent of this paper is to lead the reader through an easily understood step-by-step guide to implementing this method, which can be applied to any function in the form y=f(x), and is well suited to fast, reliable analysis of data in all fields of biology.
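
    The iterate-until-the-sum-of-squares-stops-improving idea behind SOLVER can be sketched in a few lines of Python. SOLVER's actual algorithm (generalized reduced gradient) is more sophisticated; the naive coordinate search and Michaelis-Menten example below are purely illustrative, not from the paper:

```python
def sse(params, xs, ys, f):
    """Sum of squared errors between data and model predictions."""
    return sum((y - f(x, params)) ** 2 for x, y in zip(xs, ys))

def fit(f, xs, ys, p0, step=1.0, iters=200):
    """Naive iterative least squares: nudge each parameter in the
    direction that lowers the SSE, halving the step size whenever
    no move helps."""
    p = list(p0)
    best = sse(p, xs, ys, f)
    for _ in range(iters):
        improved = False
        for i in range(len(p)):
            for delta in (step, -step):
                cand = p[:]
                cand[i] += delta
                s = sse(cand, xs, ys, f)
                if s < best:
                    p, best, improved = cand, s, True
        if not improved:
            step /= 2.0
    return p, best

# illustrative non-linear function: y = Vmax*x / (Km + x)
mm = lambda x, p: p[0] * x / (p[1] + x)
xs = [0.5, 1, 2, 4, 8, 16]
ys = [mm(x, [10.0, 3.0]) for x in xs]      # noise-free synthetic data
params, err = fit(mm, xs, ys, p0=[5.0, 1.0])
```

    Starting from a deliberately poor guess, the routine recovers the parameters that generated the data, which is exactly the behaviour one checks when validating a SOLVER setup.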

  4. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Treesearch

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

    High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...

  5. Impact Crater Experiments for Introductory Physics and Astronomy Laboratories

    ERIC Educational Resources Information Center

    Claycomb, J. R.

    2009-01-01

    Activity-based collisional analysis is developed for introductory physics and astronomy laboratory experiments. Crushable floral foam is used to investigate the physics of projectiles undergoing completely inelastic collisions with a low-density solid forming impact craters. Simple drop experiments enable determination of the average acceleration,…

  6. Diffusion of Super-Gaussian Profiles

    ERIC Educational Resources Information Center

    Rosenberg, C.-J.; Anderson, D.; Desaix, M.; Johannisson, P.; Lisak, M.

    2007-01-01

    The present analysis describes an analytically simple and systematic approximation procedure for modelling the free diffusive spreading of initially super-Gaussian profiles. The approach is based on a self-similar ansatz for the evolution of the diffusion profile, and the parameter functions involved in the modelling are determined by suitable…

  7. Assessing map accuracy in a remotely sensed, ecoregion-scale cover map

    USGS Publications Warehouse

    Edwards, T.C.; Moisen, Gretchen G.; Cutler, D.R.

    1998-01-01

    Landscape- and ecoregion-based conservation efforts increasingly use a spatial component to organize data for analysis and interpretation. A challenge particular to remotely sensed cover maps generated from these efforts is how best to assess the accuracy of the cover maps, especially when they can extend over thousands of square kilometers. Here we develop and describe a methodological approach for assessing the accuracy of large-area cover maps, using as a test case the 21.9 million ha cover map developed for Utah Gap Analysis. As part of our design process, we first reviewed the effect of intracluster correlation and a simple cost function on the relative efficiency of cluster sample designs to simple random designs. Our design ultimately combined clustered and subsampled field data stratified by ecological modeling unit and accessibility (hereafter a mixed design). We next outline estimation formulas for simple map accuracy measures under our mixed design and report results for eight major cover types and the three ecoregions mapped as part of the Utah Gap Analysis. Overall accuracy of the map was 83.2% (SE=1.4). Within ecoregions, accuracy ranged from 78.9% to 85.0%. Accuracy by cover type varied, ranging from a low of 50.4% for barren to a high of 90.6% for man modified. In addition, we examined gains in efficiency of our mixed design compared with a simple random sample approach. In regard to precision, our mixed design was more precise than a simple random design, given fixed sample costs. We close with a discussion of the logistical constraints facing attempts to assess the accuracy of large-area, remotely sensed cover maps.
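
    The simple map accuracy measures referred to above start from an error (confusion) matrix of mapped versus reference class labels. The counts below are hypothetical, for illustration only; the paper's stratified mixed design additionally weights these estimates:

```python
# hypothetical error matrix: rows = mapped class, cols = reference class
matrix = [
    [45, 3, 2],
    [4, 50, 6],
    [1, 5, 34],
]
n = sum(sum(row) for row in matrix)
correct = sum(matrix[i][i] for i in range(len(matrix)))
overall_accuracy = correct / n                      # fraction on the diagonal
# per-class "user's accuracy": correct cells / total mapped to that class
users_acc = [matrix[i][i] / sum(matrix[i]) for i in range(len(matrix))]
```

    Overall accuracy is simply the diagonal fraction of the matrix; per-class rates expose which cover types (e.g., barren versus man modified in the study) drive the overall number.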

  8. Smartphone-based colorimetric analysis for detection of saliva alcohol concentration.

    PubMed

    Jung, Youngkee; Kim, Jinhee; Awofeso, Olumide; Kim, Huisung; Regnier, Fred; Bae, Euiwon

    2015-11-01

    A simple device and associated analytical methods are reported that provide objective and accurate determination of saliva alcohol concentrations using smartphone-based colorimetric imaging. The device utilizes any smartphone with a miniature attachment that positions the sample and provides constant illumination for sample imaging. Analyses of histograms based on channel imaging of the red-green-blue (RGB) and hue-saturation-value (HSV) color spaces provide unambiguous determination of blood alcohol concentration from color changes on sample pads. The smartphone-based colorimetric sample analysis was developed and tested with blind samples that were matched against the training sets. This technology can be adapted to any smartphone and used to conduct color change assays.
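
    The RGB-to-HSV channel analysis mentioned above is a standard color-space conversion; Python's stdlib `colorsys` performs it directly. The pixel value below is a hypothetical mean color of a reacted sample pad, not data from the paper:

```python
import colorsys

# hypothetical mean RGB of a reacted sample pad (0-255 per channel)
r, g, b = 180, 120, 60
# colorsys works on floats in [0, 1] and returns h, s, v in [0, 1]
h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
hue_degrees = h * 360.0   # ~30 degrees: an orange hue
```

    Working in HSV separates the color change (hue, saturation) from illumination intensity (value), which is one reason HSV histograms give a more robust readout than raw RGB under varying lighting.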

  9. Analyzing Array Manipulating Programs by Program Transformation

    NASA Technical Reports Server (NTRS)

    Cornish, J. Robert M.; Gange, Graeme; Navas, Jorge A.; Schachte, Peter; Sondergaard, Harald; Stuckey, Peter J.

    2014-01-01

    We explore a transformational approach to the problem of verifying simple array-manipulating programs. Traditionally, verification of such programs requires intricate analysis machinery to reason with universally quantified statements about symbolic array segments, such as "every data item stored in the segment A[i] to A[j] is equal to the corresponding item stored in the segment B[i] to B[j]." We define a simple abstract machine which allows for set-valued variables and we show how to translate programs with array operations to array-free code for this machine. For the purpose of program analysis, the translated program remains faithful to the semantics of array manipulation. Based on our implementation in LLVM, we evaluate the approach with respect to its ability to extract useful invariants and the cost in terms of code size.

  10. Simple green approach to reinforce natural rubber with bacterial cellulose nanofibers.

    PubMed

    Trovatti, Eliane; Carvalho, Antonio J F; Ribeiro, Sidney J L; Gandini, Alessandro

    2013-08-12

    Natural rubber (NR) is a renewable polymer with a wide range of applications, which is constantly tailored, further increasing its utilizations. The tensile strength is one of its most important properties susceptible of being enhanced by the simple incorporation of nanofibers. The preparation and characterization of natural-rubber based nanocomposites reinforced with bacterial cellulose (BC) and bacterial cellulose coated with polystyrene (BCPS), yielded high performance materials. The nanocomposites were prepared by a simple and green process, and characterized by tensile tests, dynamical mechanical analysis (DMA), scanning electron microscopy (SEM), and swelling experiments. The effect of the nanofiber content on morphology, static, and dynamic mechanical properties was also investigated. The results showed an increase in the mechanical properties, such as Young's modulus and tensile strength, even with modest nanofiber loadings.

  11. Construction of a genetic linkage map and analysis of quantitative trait loci associated with the agronomically important traits of Pleurotus eryngii

    Treesearch

    Chak Han Im; Young-Hoon Park; Kenneth E. Hammel; Bokyung Park; Soon Wook Kwon; Hojin Ryu; Jae-San Ryu

    2016-01-01

    Breeding new strains with improved traits is a long-standing goal of mushroom breeders that can be expedited by marker-assisted selection (MAS). We constructed a genetic linkage map of Pleurotus eryngii based on segregation analysis of markers in postmeiotic monokaryons from KNR2312. In total, 256 loci comprising 226 simple sequence-repeat (SSR) markers, 2 mating-type...

  12. A sensitivity analysis method for the body segment inertial parameters based on ground reaction and joint moment regressor matrices.

    PubMed

    Futamure, Sumire; Bonnet, Vincent; Dumas, Raphael; Venture, Gentiane

    2017-11-07

    This paper presents a method allowing a simple and efficient sensitivity analysis of the dynamic parameters of complex whole-body human models. The proposed method is based on the ground reaction and joint moment regressor matrices, developed initially in robotics system identification theory, which appear in the equations of motion of the human body. The regressor matrices are linear with respect to the segment inertial parameters, allowing the use of simple sensitivity analysis methods. The sensitivity analysis method was applied to gait dynamics and kinematics data of nine subjects using a 15-segment 3D model of the locomotor apparatus. According to the proposed sensitivity indices, 76 of the 150 segment inertial parameters of the mechanical model were considered not influential for gait. The main findings were that the segment masses were influential and that, with the exception of the trunk, the moments of inertia were not influential for the computation of the ground reaction forces and moments and the joint moments. The same method also shows numerically that at least 90% of the lower-limb joint moments during the stance phase can be estimated from force-plate and kinematics data alone, without knowing any of the segment inertial parameters. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Slow feature analysis: unsupervised learning of invariances.

    PubMed

    Wiskott, Laurenz; Sejnowski, Terrence J

    2002-04-01

    Invariant features of temporally varying signals are useful for analysis and classification. Slow feature analysis (SFA) is a new method for learning invariant or slowly varying features from a vectorial input signal. It is based on a nonlinear expansion of the input signal and application of principal component analysis to this expanded signal and its time derivative. It is guaranteed to find the optimal solution within a family of functions directly and can learn to extract a large number of decorrelated features, which are ordered by their degree of invariance. SFA can be applied hierarchically to process high-dimensional input signals and extract complex features. SFA is applied first to complex cell tuning properties based on simple cell output, including disparity and motion. Then more complicated input-output functions are learned by repeated application of SFA. Finally, a hierarchical network of SFA modules is presented as a simple model of the visual system. The same unstructured network can learn translation, size, rotation, contrast, or, to a lesser degree, illumination invariance for one-dimensional objects, depending on only the training stimulus. Surprisingly, only a few training objects suffice to achieve good generalization to new objects. The generated representation is suitable for object recognition. Performance degrades if the network is trained to learn multiple invariances simultaneously.
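
    The SFA recipe summarized above (expand the signal, whiten it, then find the direction whose time derivative varies least) can be sketched for the linear case as follows. The two-channel sine mixture is a toy example of mine, not data from the paper:

```python
import numpy as np

def sfa(x, n_components=1):
    """Linear SFA sketch: whiten the input, then take the direction
    whose discrete time derivative has minimal variance."""
    x = x - x.mean(axis=0)
    # whiten via eigendecomposition of the covariance matrix
    d, E = np.linalg.eigh(np.cov(x, rowvar=False))
    W = E / np.sqrt(d)               # scale eigenvector columns
    z = x @ W                        # whitened signal, identity covariance
    dz = np.diff(z, axis=0)          # discrete time derivative
    dd, F = np.linalg.eigh(np.cov(dz, rowvar=False))  # ascending eigenvalues
    return z @ F[:, :n_components]   # slowest features first

# toy input: a slow and a fast sine mixed into two channels
t = np.linspace(0, 2 * np.pi, 1000)
slow, fast = np.sin(t), np.sin(20 * t)
X = np.column_stack([slow + 0.5 * fast, slow - 0.5 * fast])
y = sfa(X)[:, 0]
```

    Because the fast component dominates the derivative's variance after whitening, the eigenvector with the smallest derivative eigenvalue recovers the slow component (up to sign and scale), which is the defining property of SFA.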

  14. Anthropometry-corrected exposure modeling as a method to improve trunk posture assessment with a single inclinometer.

    PubMed

    Van Driel, Robin; Trask, Catherine; Johnson, Peter W; Callaghan, Jack P; Koehoorn, Mieke; Teschke, Kay

    2013-01-01

    Measuring trunk posture in the workplace commonly involves subjective observation or self-report methods or the use of costly and time-consuming motion analysis systems (current gold standard). This work compared trunk inclination measurements using a simple data-logging inclinometer with trunk flexion measurements using a motion analysis system, and evaluated adding measures of subject anthropometry to exposure prediction models to improve the agreement between the two methods. Simulated lifting tasks (n=36) were performed by eight participants, and trunk postures were simultaneously measured with each method. There were significant differences between the two methods, with the inclinometer initially explaining 47% of the variance in the motion analysis measurements. However, adding one key anthropometric parameter (lower arm length) to the inclinometer-based trunk flexion prediction model reduced the differences between the two systems and accounted for 79% of the motion analysis method's variance. Although caution must be applied when generalizing lower-arm length as a correction factor, the overall strategy of anthropometric modeling is a novel contribution. In this lifting-based study, by accounting for subject anthropometry, a single, simple data-logging inclinometer shows promise for trunk posture measurement and may have utility in larger-scale field studies where similar types of tasks are performed.
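
    The anthropometry-corrected modeling described above amounts to adding a subject-specific predictor to a regression of the gold-standard flexion on the inclinometer reading. The data below are synthetic, constructed only to show how an added anthropometric term raises the explained variance; the coefficients are not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
incl = rng.uniform(0, 60, n)      # inclinometer angle, degrees
arm = rng.uniform(20, 35, n)      # hypothetical lower-arm length, cm
# synthetic "motion analysis" flexion depending on both predictors
flexion = 0.8 * incl + 0.9 * arm + rng.normal(0, 2, n)

def r2(X, y):
    """Coefficient of determination for a least-squares fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_simple = r2(incl.reshape(-1, 1), flexion)          # inclinometer alone
r2_anthro = r2(np.column_stack([incl, arm]), flexion)  # + anthropometry
```

    The inclinometer-only model leaves the between-subject anthropometric variance unexplained; adding the arm-length term absorbs it, mirroring the paper's jump from 47% to 79% of explained variance.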

  15. On an image reconstruction method for ECT

    NASA Astrophysics Data System (ADS)

    Sasamoto, Akira; Suzuki, Takayuki; Nishimura, Yoshihiro

    2007-04-01

    An image produced by eddy current testing (ECT) is a blurred version of the original flaw shape. In order to reconstruct a fine flaw image, a new image reconstruction method has been proposed. This method is based on the assumption that a very simple relationship between the measured data and the source can be described by a convolution of a response function with the flaw shape. This assumption leads to a simple inverse analysis method using deconvolution. In this method, the point spread function (PSF) and the line spread function (LSF) play a key role in the deconvolution processing. This study proposes a simple data processing procedure to determine the PSF and LSF from ECT data of a machined hole and a line flaw. In order to verify its validity, ECT data for a SUS316 plate (200x200x10 mm) with an artificial machined hole and a notch flaw were acquired by differential coil type sensors (produced by ZETEC Inc). Those data were analyzed by the proposed method. The proposed method restored a sharp image of discrete multiple holes from data in which the responses of the holes interfered, and the estimated width of the line flaw was much improved compared with the original experimental data. Although simple and easy to implement, the proposed inverse analysis strategy has been shown to be valid for holes and line flaws, reconstructing much finer images than the originals.
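
    The convolution model above implies that, once the PSF is known, the flaw can be recovered by spectral division. The paper's exact deconvolution scheme is not detailed here, so the sketch below uses a standard Wiener-style damped division on a synthetic 1-D example of my own (two point flaws blurred by a Gaussian spread function):

```python
import numpy as np

def wiener_deconvolve(measured, psf, eps=1e-3):
    """Fourier-domain deconvolution with a small damping term so that
    frequencies where the PSF carries no signal do not blow up."""
    H = np.fft.fft(psf)
    Y = np.fft.fft(measured)
    X = Y * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft(X))

n = 128
flaw = np.zeros(n)
flaw[40] = flaw[48] = 1.0                      # two sharp point flaws
# Gaussian spread function, centred (circularly) on index 0
idx = np.minimum(np.arange(n), n - np.arange(n))
psf = np.exp(-0.5 * (idx / 2.0) ** 2)
psf /= psf.sum()
# simulate the blurred measurement as a circular convolution
blurred = np.real(np.fft.ifft(np.fft.fft(flaw) * np.fft.fft(psf)))
restored = wiener_deconvolve(blurred, psf)
```

    The two flaw responses overlap in the blurred trace but separate again after deconvolution, which is the effect the paper reports for its interfering multiple-hole data.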

  16. High-Throughput Density Measurement Using Magnetic Levitation.

    PubMed

    Ge, Shencheng; Wang, Yunzhe; Deshler, Nicolas J; Preston, Daniel J; Whitesides, George M

    2018-06-20

    This work describes the development of an integrated analytical system that enables high-throughput density measurements of diamagnetic particles (including cells) using magnetic levitation (MagLev), 96-well plates, and a flatbed scanner. MagLev is a simple and useful technique with which to carry out density-based analysis and separation of a broad range of diamagnetic materials with different physical forms (e.g., liquids, solids, gels, pastes, gums, etc.); one major limitation, however, has been the lack of capacity for high-throughput density measurements. This work addresses this limitation by (i) re-engineering the shape of the magnetic fields so that the MagLev system is compatible with 96-well plates, and (ii) integrating a flatbed scanner (and simple optical components) to carry out imaging of the samples that levitate in the system. The resulting system is compatible with both biological samples (human erythrocytes) and nonbiological samples (simple liquids and solids, such as 3-chlorotoluene, cholesterol crystals, glass beads, copper powder, and polymer beads). The high-throughput capacity of this integrated MagLev system will enable new applications in chemistry (e.g., analysis and separation of materials) and biochemistry (e.g., cellular responses under environmental stresses) in a simple and label-free format on the basis of a universal property of all matter, i.e., density.

  17. Application of Artificial Boundary Conditions in Sensitivity-Based Updating of Finite Element Models

    DTIC Science & Technology

    2007-06-01

    is known as the impedance matrix [Z(Ω)]. [Z(Ω)] = [H(Ω)]⁻¹ (12), where [Z(Ω)] = [K − Ω²M + jΩC] (13). A. REDUCED ORDER...D.L. A correlation coefficient for modal vector analysis. Proceedings of the 1st International Modal Analysis Conference, 1982, 110-116. Anton, H., Rorres, C. (2005). Elementary Linear Algebra. New York: John Wiley and Sons. Avitable, Peter (2001, January). Experimental Modal Analysis, A Simple

  18. Simple morphological control over functional diversity of SERS materials

    NASA Astrophysics Data System (ADS)

    Semenova, A. A.; Goodilin, E. A.

    2018-03-01

    Nowadays, surface-enhanced Raman spectroscopy (SERS) becomes a promising universal low-cost and real-time tool in biomedical applications, medical screening or forensic analysis allowing for detection of different molecules below nanomolar concentrations. Silver nanoparticles and nanostructures have proven to be a common choice for SERS measurements due to a tunable plasmon resonance, high stability and facile fabrication methods. However, a proper design of silver-based nanomaterials for highly sensitive SERS applications still remains a challenge. In this work, effective and simple preparation methods of various silver nanostructures are proposed and systematically developed using aqueous diamminesilver (I) hydroxide as a precursor.

  19. A Simple and Robust Method for Partially Matched Samples Using the P-Values Pooling Approach

    PubMed Central

    Kuan, Pei Fen; Huang, Bo

    2013-01-01

    This paper focuses on statistical analyses in scenarios where some samples from the matched pairs design are missing, resulting in partially matched samples. Motivated by the idea of meta-analysis, we recast the partially matched samples as coming from two experimental designs, and propose a simple yet robust approach based on the weighted Z-test to integrate the p-values computed from these two designs. We show that the proposed approach achieves better operating characteristics in simulations and a case study, compared to existing methods for partially matched samples. PMID:23417968
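
    The weighted Z-test pooling described above converts each design's p-value to a Z-score, combines the scores with weights, and converts back. Square-root-of-sample-size weights are one common choice; the paper's exact weighting may differ, and the p-values and sample sizes below are illustrative only:

```python
import math
from statistics import NormalDist

def weighted_z_pool(p1, p2, n1, n2):
    """Pool two one-sided p-values via the weighted Z-test, with
    weights proportional to the square roots of the sample sizes."""
    nd = NormalDist()
    z1, z2 = nd.inv_cdf(1 - p1), nd.inv_cdf(1 - p2)
    w1, w2 = math.sqrt(n1), math.sqrt(n2)
    z = (w1 * z1 + w2 * z2) / math.sqrt(w1 ** 2 + w2 ** 2)
    return 1 - nd.cdf(z)

# p-value from the paired analysis (n1 pairs) and from the unpaired
# analysis (n2 unmatched samples) of a partially matched data set
pooled = weighted_z_pool(0.04, 0.10, n1=30, n2=12)
```

    Two moderately small p-values reinforce each other, so the pooled p-value is smaller than either input, which is the intended gain from recasting the partially matched samples as two designs.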

  20. A simple algorithm to estimate the effective regional atmospheric parameters for thermal-inertia mapping

    USGS Publications Warehouse

    Watson, K.; Hummer-Miller, S.

    1981-01-01

    A method based solely on remote sensing data has been developed to estimate those meteorological effects which are required for thermal-inertia mapping. It assumes that the atmospheric fluxes are spatially invariant and that the solar, sky, and sensible heat fluxes can be approximated by a simple mathematical form. Coefficients are determined by a least-squares fit of observational data to our thermal model. A comparison between field measurements and the model-derived flux shows the type of agreement which can be achieved. An analysis of the limitations of the method is also provided. © 1981.
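
    Fitting the coefficients of a "simple mathematical form" to observations by least squares can be sketched as follows. The abstract does not give the form, so a truncated diurnal Fourier series is assumed here as a plausible stand-in, with synthetic flux samples:

```python
import numpy as np

# assumed flux form: F(t) = a + b*cos(w*t) + c*sin(w*t), diurnal period
w = 2 * np.pi / 24.0                                  # rad per hour
t = np.array([1.0, 5.0, 9.0, 13.0, 17.0, 21.0])       # observation times, h
F = 120 + 80 * np.cos(w * t) + 30 * np.sin(w * t)     # synthetic samples

# the model is linear in (a, b, c), so ordinary least squares suffices
A = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
coef, *_ = np.linalg.lstsq(A, F, rcond=None)
```

    Because the unknowns enter linearly, the fit reduces to one matrix solve; with more observations than coefficients the same code returns the least-squares compromise rather than an exact fit.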

  1. Beyond Molecular Codes: Simple Rules to Wire Complex Brains

    PubMed Central

    Hassan, Bassem A.; Hiesinger, P. Robin

    2015-01-01

    Summary Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480

  2. High-fidelity, low-cost, automated method to assess laparoscopic skills objectively.

    PubMed

    Gray, Richard J; Kahol, Kanav; Islam, Gazi; Smith, Marshall; Chapital, Alyssa; Ferrara, John

    2012-01-01

    We sought to define the extent to which a motion analysis-based assessment system constructed with simple equipment could measure technical skill objectively and quantitatively. An "off-the-shelf" digital video system was used to capture the hand and instrument movement of surgical trainees (beginner level = PGY-1, intermediate level = PGY-3, and advanced level = PGY-5/fellows) while they performed a peg transfer exercise. The video data were passed through a custom computer vision algorithm that analyzed incoming pixels to measure movement smoothness objectively. The beginner-level group had the poorest performance, whereas those in the advanced group generated the highest scores. Intermediate-level trainees scored significantly (p < 0.04) better than beginner trainees. Advanced-level trainees scored significantly better than intermediate-level trainees and beginner-level trainees (p < 0.04 and p < 0.03, respectively). A computer vision-based analysis of surgical movements provides an objective basis for technical expertise-level analysis with construct validity. The technology to capture the data is simple, low cost, and readily available, and it obviates the need for expert human assessment in this setting. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  3. Analysis of genetic diversity and population structure of oil palm (Elaeis guineensis) from China and Malaysia based on species-specific simple sequence repeat markers.

    PubMed

    Zhou, L X; Xiao, Y; Xia, W; Yang, Y D

    2015-12-08

    Genetic diversity and patterns of population structure of the 94 oil palm lines were investigated using species-specific simple sequence repeat (SSR) markers. We designed primers for 63 SSR loci based on their flanking sequences and conducted amplification in 94 oil palm DNA samples. The amplification results showed that a relatively high level of genetic diversity was observed among oil palm individuals according to a set of 21 polymorphic microsatellite loci. The observed heterozygosity (Ho) ranged from 0.3683 to 0.4035, with an average of 0.3859. The Ho value was a reliable determinant of the discriminatory power of the SSR primer combinations. The principal component analysis and unweighted pair-group method with arithmetic averaging cluster analysis showed the 94 oil palm lines were grouped into one cluster. These results demonstrated that the oil palm in Hainan Province of China and the germplasm introduced from Malaysia may be from the same source. The SSR protocol was effective and reliable for assessing the genetic diversity of oil palm. Knowledge of the genetic diversity and population structure will be crucial for establishing appropriate management stocks for this species.
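
    Observed heterozygosity at an SSR locus is simply the fraction of individuals carrying two different alleles. The allele calls below are hypothetical fragment sizes for illustration, not data from the study:

```python
def observed_heterozygosity(genotypes):
    """Ho at one locus = fraction of individuals whose two alleles differ."""
    het = sum(1 for a, b in genotypes if a != b)
    return het / len(genotypes)

# hypothetical allele calls (fragment sizes) for ten palms at one SSR locus
locus = [(152, 158), (152, 152), (158, 160), (152, 158), (160, 160),
         (152, 160), (158, 158), (152, 158), (152, 152), (158, 160)]
ho = observed_heterozygosity(locus)
```

    Averaging this quantity over the 21 polymorphic loci gives the kind of per-marker and mean Ho values (0.3683 to 0.4035, mean 0.3859) reported in the abstract.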

  4. Simple structured hybrid WOLEDs based on incomplete energy transfer mechanism: from blue exciplex to orange dopant.

    PubMed

    Zhang, Tianyou; Zhao, Bo; Chu, Bei; Li, Wenlian; Su, Zisheng; Yan, Xingwu; Liu, Chengyuan; Wu, Hairuo; Gao, Yuan; Jin, Fangming; Hou, Fuhua

    2015-05-15

    Exciplex is well known as a charge transfer state formed between electron-donating and electron-accepting molecules. However, exciplex-based organic light emitting diodes (OLEDs) have often exhibited low efficiencies relative to pure phosphorescent OLEDs and could hardly be used to construct white OLEDs (WOLEDs). In this work, a new mechanism is developed to realize an efficient WOLED with an extremely simple structure by redistributing the energy of the triplet exciplex to both the singlet exciplex and the orange dopant. The microscopic process of energy transfer could be directly examined by detailed photoluminescence decay measurements and time resolved photoluminescence analysis. This strategy overcomes the low reverse intersystem crossing efficiency of the blue exciplex and the complicated device structure of traditional WOLEDs, enabling us to achieve efficient hybrid WOLEDs. Based on this mechanism, we have successfully constructed both exciplex-fluorescence and exciplex-phosphorescence hybrid WOLEDs with remarkable efficiencies.

  5. Simple structured hybrid WOLEDs based on incomplete energy transfer mechanism: from blue exciplex to orange dopant

    NASA Astrophysics Data System (ADS)

    Zhang, Tianyou; Zhao, Bo; Chu, Bei; Li, Wenlian; Su, Zisheng; Yan, Xingwu; Liu, Chengyuan; Wu, Hairuo; Gao, Yuan; Jin, Fangming; Hou, Fuhua

    2015-05-01

    Exciplex is well known as a charge transfer state formed between electron-donating and electron-accepting molecules. However, exciplex-based organic light emitting diodes (OLEDs) have often exhibited low efficiencies relative to pure phosphorescent OLEDs and could hardly be used to construct white OLEDs (WOLEDs). In this work, a new mechanism is developed to realize an efficient WOLED with an extremely simple structure by redistributing the energy of the triplet exciplex to both the singlet exciplex and the orange dopant. The microscopic process of energy transfer could be directly examined by detailed photoluminescence decay measurements and time resolved photoluminescence analysis. This strategy overcomes the low reverse intersystem crossing efficiency of the blue exciplex and the complicated device structure of traditional WOLEDs, enabling us to achieve efficient hybrid WOLEDs. Based on this mechanism, we have successfully constructed both exciplex-fluorescence and exciplex-phosphorescence hybrid WOLEDs with remarkable efficiencies.

  6. Simple structured hybrid WOLEDs based on incomplete energy transfer mechanism: from blue exciplex to orange dopant

    PubMed Central

    Zhang, Tianyou; Zhao, Bo; Chu, Bei; Li, Wenlian; Su, Zisheng; Yan, Xingwu; Liu, Chengyuan; Wu, Hairuo; Gao, Yuan; Jin, Fangming; Hou, Fuhua

    2015-01-01

    Exciplex is well known as a charge transfer state formed between electron-donating and electron-accepting molecules. However, exciplex-based organic light emitting diodes (OLEDs) have often exhibited low efficiencies relative to pure phosphorescent OLEDs and could hardly be used to construct white OLEDs (WOLEDs). In this work, a new mechanism is developed to realize an efficient WOLED with an extremely simple structure by redistributing the energy of the triplet exciplex to both the singlet exciplex and the orange dopant. The microscopic process of energy transfer could be directly examined by detailed photoluminescence decay measurements and time resolved photoluminescence analysis. This strategy overcomes the low reverse intersystem crossing efficiency of the blue exciplex and the complicated device structure of traditional WOLEDs, enabling us to achieve efficient hybrid WOLEDs. Based on this mechanism, we have successfully constructed both exciplex-fluorescence and exciplex-phosphorescence hybrid WOLEDs with remarkable efficiencies. PMID:25975371

  7. A remote sensing based vegetation classification logic for global land cover analysis

    USGS Publications Warehouse

    Running, Steven W.; Loveland, Thomas R.; Pierce, Lars L.; Nemani, R.R.; Hunt, E. Raymond

    1995-01-01

    This article proposes a simple new logic for classifying global vegetation. The critical features of this classification are that 1) it is based on simple, observable, unambiguous characteristics of vegetation structure that are important to ecosystem biogeochemistry and can be measured in the field for validation, 2) the structural characteristics are remotely sensible so that repeatable and efficient global reclassifications of existing vegetation will be possible, and 3) the defined vegetation classes directly translate into the biophysical parameters of interest by global climate and biogeochemical models. A first test of this logic for the continental United States is presented based on an existing 1 km AVHRR normalized difference vegetation index database. Procedures for solving critical remote sensing problems needed to implement the classification are discussed. Also, some inferences from this classification to advanced vegetation biophysical variables such as specific leaf area and photosynthetic capacity useful to global biogeochemical modeling are suggested.

  8. Characterization of 14 microsatellite markers for genetic analysis and cultivar identification of walnut

    USDA-ARS?s Scientific Manuscript database

    One hundred and forty-seven primer pairs originally designed to amplify microsatellites, also known as simple sequence repeats (SSR), in black walnut (Juglans nigra L.) were screened for utility in persian walnut (J. regia L.). Based on scorability and number of informative polymorphisms, the best 1...

  9. A Cluster Analytic Study of Clinical Orientations among Chemical Dependency Counselors.

    ERIC Educational Resources Information Center

    Thombs, Dennis L.; Osborn, Cynthia J.

    2001-01-01

    Three distinct clinical orientations were identified in a sample of chemical dependency counselors (N=406). Based on cluster analysis, the largest group, identified and labeled as "uniform counselors," endorsed a simple, moral-disease model with little interest in psychosocial interventions. (Contains 50 references and 4 tables.) (GCP)

  10. The Analysis of Spontaneous Processes Using Equilibrium Thermodynamics

    ERIC Educational Resources Information Center

    Honig, J. M.; Ben-Amotz, Dor

    2006-01-01

    The derivations based on the use of deficit functions provide a simple means of demonstrating the extremum conditions that are applicable to various thermodynamic functions. The method shows that the maximum quantity of work is available from a system only when the processes are carried out reversibly since irreversible (spontaneous)…

  11. Reference Models for Structural Technology Assessment and Weight Estimation

    NASA Technical Reports Server (NTRS)

    Cerro, Jeff; Martinovic, Zoran; Eldred, Lloyd

    2005-01-01

    Previously, the Exploration Concepts Branch of NASA Langley Research Center developed techniques for automating the preliminary-design-level structural analysis of launch vehicle airframes for the purpose of enhancing historical regression-based mass estimating relationships. This past work was useful and greatly reduced design time; however, its application area was very narrow in terms of the variety of structural and vehicle general arrangement alternatives it could handle. Implementation of the analysis approach presented herein also incorporates some newly developed computer programs. Loft is a program developed to create analysis meshes and simultaneously define structural element design regions. A simple component-defining ASCII file is read by Loft to begin the design process. HSLoad is a Visual Basic implementation of the HyperSizer Application Programming Interface, which automates the structural element design process. Details of these two programs and their use are explained in this paper. A feature which falls naturally out of the above analysis paradigm is the concept of "reference models". The flexibility of the FEA-based JAVA processing procedures and associated process control classes, coupled with the general utility of Loft and HSLoad, makes it possible to create generic program template files for the analysis of components ranging from something as simple as a stiffened flat panel, to curved panels, fuselage and cryogenic tank components, flight control surfaces, and wings, through full air and space vehicle general arrangements.

  12. Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours.

    PubMed

    Garg, Sugandha; Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran

    2017-08-01

    IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the most common cancers in women and, in the majority of cases, is diagnosed at a late stage. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining morphological features of ovarian masses through a standardized examination technique. To evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours and to establish their use as a tool in the early diagnosis of ovarian malignancy. A hospital-based case-control prospective study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant. Findings were correlated with histopathological findings. Collected data were statistically analysed using the chi-square test and the kappa statistic. Of the initial 55 patients, the 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). In cases where the IOTA simple rules were applicable, the sensitivity for the detection of malignancy was 91.66% and the specificity was 84.84%; accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80%, respectively. A high level of agreement was found between USG and histopathological diagnosis, with a kappa value of 0.323. The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to train and use.
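
    As a small illustration of the statistics reported above, the sketch below reconstructs sensitivity, specificity, accuracy, and Cohen's kappa from a 2x2 confusion matrix. The counts are hypothetical, chosen only so that the reported 91.66%/84.84%/86.66% figures fall out; they are not the study's raw data:

```python
# Hypothetical 2x2 table (rows: ultrasound call, columns: histopathology).
# Counts reconstructed to match the reported percentages, not study data.
tp, fn, fp, tn = 11, 1, 5, 28

sensitivity = tp / (tp + fn)                 # true positive rate
specificity = tn / (tn + fp)                 # true negative rate
accuracy = (tp + tn) / (tp + fn + fp + tn)

# Cohen's kappa: agreement between the two ratings beyond chance.
n = tp + fn + fp + tn
po = (tp + tn) / n                           # observed agreement
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)

print(f"sensitivity={sensitivity:.2%} specificity={specificity:.2%} "
      f"accuracy={accuracy:.2%} kappa={kappa:.3f}")
```

    The same four lines generalize to any binary diagnostic-test validation against a histopathological gold standard.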

  13. Side-polished fiber based gain-flattening filter for erbium doped fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Varshney, R. K.; Singh, A.; Pande, K.; Pal, B. P.

    2007-03-01

    A simple, accurate, novel normal-mode analysis has been developed to account for the effect of non-uniform polishing depth in the study of the transmission characteristics of optical waveguide devices based on loading a side-polished fiber half-coupler with a multimode planar waveguide. We apply this analysis to design and fabricate a gain-flattening filter suitable for fiber amplifiers. The wavelength-dependent filtering action of the overall device demonstrated flattening of an EDFA gain spectrum within ±0.7 dB over a bandwidth of 30 nm in the C-band. Results obtained by the present analysis agree very well with our experimental results. The present analysis should be very useful in the accurate design and analysis of any SPF-MMOW device/component, including side-polished fiber based sensors.

  14. Development of a Simple Dipstick Assay for Operational Monitoring of DDT

    PubMed Central

    Ismail, Hanafy M.; Kumar, Vijay; Singh, Rudra P.; Williams, Christopher; Shivam, Pushkar; Ghosh, Ayan; Deb, Rinki; Foster, Geraldine M.; Hemingway, Janet; Coleman, Michael; Coleman, Marlize; Das, Pradeep; Paine, Mark J. I.

    2016-01-01

    Background Indoor residual spraying (IRS) of DDT is used to control visceral leishmaniasis (VL) in India. However, the quality of spraying is severely compromised by a lack of affordable field assays to monitor target doses of insecticide. Our aim was to develop a simple DDT insecticide quantification kit (IQK) for monitoring DDT levels in an operational setting. Methodology/principal findings DDT quantification was based on the stoichiometric release of chloride from DDT by alkaline hydrolysis and detection of the released ion using Quantab chloride detection strips. The assay was specific for insecticidal p,p′-DDT (LoQ = 0.082 g/m2). Bostik discs were effective in post-spray wall sampling, extracting 25–70% of active ingredient depending on surface. Residual DDT was sampled from walls in Bihar state in India using Bostik adhesive discs, and DDT concentrations (g p,p′-DDT/m2) were determined using the IQK and HPLC (n = 1964 field samples). Analysis of 161 Bostik samples (pooled sample pairs) by IQK and HPLC produced excellent correlation (R2 = 0.96; Bland-Altman bias = −0.0038). IQK analysis of the remaining field samples matched HPLC data in identifying households that had been under-sprayed, in range, or over-sprayed. Interpretation A simple dipstick assay has been developed for monitoring DDT spraying that gives results comparable to HPLC. By making laboratory-based analysis of DDT dosing accessible to field operatives, routine monitoring of DDT levels can be promoted in low- and middle-income countries to maximise the effectiveness of IRS. PMID:26760773
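
    The chloride-to-DDT conversion underlying the assay can be sketched as below. It assumes one chloride ion released per DDT molecule by alkaline dehydrochlorination; the function name and example numbers are illustrative, and the surface-extraction-efficiency correction (the paper reports 25–70% recovery) is deliberately omitted:

```python
# Molar masses in g/mol.
M_DDT = 354.49   # p,p'-DDT, C14H9Cl5
M_CL = 35.45     # chloride ion

def ddt_from_chloride(mg_chloride, swab_area_m2):
    """Convert chloride read off a detection strip (mg) to g DDT per m^2,
    assuming 1 mol Cl- is released per mol DDT by alkaline dehydrochlorination.
    No correction for disc extraction efficiency is applied here."""
    mg_ddt = mg_chloride * M_DDT / M_CL   # stoichiometric scale-up
    return (mg_ddt / 1000.0) / swab_area_m2

# Example: 2 mg chloride recovered from a 0.01 m^2 sampled wall patch.
print(ddt_from_chloride(2.0, 0.01))   # g DDT per m^2
```

    Dividing the result by the measured extraction efficiency of the sampling disc would give the field estimate of the sprayed dose.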

  15. Understanding and comparisons of different sampling approaches for the Fourier Amplitude Sensitivity Test (FAST)

    PubMed Central

    Xu, Chonggang; Gertner, George

    2013-01-01

    Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, FAST analysis has mainly been confined to estimating the partial variances contributed by the main effects of model parameters and has not accounted for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to the variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements. PMID:24143037
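
    A minimal sketch of the classic search-curve FAST estimator described above. The integer frequencies and the toy additive model are illustrative assumptions; a production implementation would select carefully incommensurate frequencies and guard against aliasing and harmonic interference:

```python
import numpy as np

def fast_main_effects(model, freqs, n_samples=1025, harmonics=4):
    """Search-curve FAST: estimate first-order sensitivity indices.
    freqs: one integer frequency per parameter (should not share harmonics)."""
    s = 2 * np.pi * np.arange(n_samples) / n_samples      # one full period
    # Search-curve sampling: each parameter sweeps [0, 1] at its own frequency.
    x = 0.5 + np.arcsin(np.sin(np.outer(freqs, s))) / np.pi
    y = model(x)                            # model output, shape (n_samples,)
    total_var = np.var(y)
    coeffs = np.fft.rfft(y) / n_samples     # Fourier coefficients
    power = 2 * np.abs(coeffs[1:]) ** 2     # power[k-1] = variance at frequency k
    # Partial variance of parameter i: power at w_i and its first few harmonics.
    return [sum(power[w * h - 1] for h in range(1, harmonics + 1)) / total_var
            for w in freqs]

# Toy additive model y = x1 + 2*x2: expected indices are ~0.2 and ~0.8,
# since Var(2*x2) is four times Var(x1) for identically distributed inputs.
si = fast_main_effects(lambda x: x[0] + 2 * x[1], freqs=[11, 35])
print([round(v, 2) for v in si])
```

    Because the toy model has no interactions, the two first-order indices sum to roughly one; a shortfall in that sum is the usual FAST diagnostic for interaction effects.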

  16. Understanding and comparisons of different sampling approaches for the Fourier Amplitude Sensitivity Test (FAST).

    PubMed

    Xu, Chonggang; Gertner, George

    2011-01-01

    Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, FAST analysis has mainly been confined to estimating the partial variances contributed by the main effects of model parameters and has not accounted for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to the variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements.

  17. A simple model of hysteresis behavior using spreadsheet analysis

    NASA Astrophysics Data System (ADS)

    Ehrmann, A.; Blachowicz, T.

    2015-01-01

    Hysteresis loops occur in many scientific and technical problems, most notably as the field-dependent magnetization of ferromagnetic materials, but also as stress-strain curves measured in tensile tests including thermal effects, in liquid-solid phase transitions, in cell biology, and in economics. While several mathematical models exist that aim to calculate hysteresis energies and other parameters, here we offer a simple model of a general hysteretic system, showing different hysteresis loops depending on the defined parameters. The calculation, based on basic spreadsheet analysis plus a simple macro code, can be used by students to understand how these systems work and how the parameters influence the response of the system to an external field. Importantly, in the step-by-step mode, each change of the system state relative to the previous step becomes visible. The simple program can be developed further through various changes and additions, enabling the building of a tool capable of answering real physical questions in the broad field of magnetism as well as in other scientific areas in which similar hysteresis loops occur.
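
    A minimal branch-model sketch in the same spirit: a tanh magnetization curve shifted by a coercive field, a common textbook approximation rather than the spreadsheet model itself, with all parameter values illustrative:

```python
import numpy as np

def hysteresis_loop(h_max=3.0, hc=1.0, ms=1.0, width=0.5, n=200):
    """Two-branch hysteresis: magnetization follows ms*tanh((H -+ hc)/width)
    on the ascending/descending sweep. hc acts as the coercive field."""
    h_up = np.linspace(-h_max, h_max, n)       # ascending field sweep
    h_dn = h_up[::-1]                          # descending field sweep
    m_up = ms * np.tanh((h_up - hc) / width)   # lower (ascending) branch
    m_dn = ms * np.tanh((h_dn + hc) / width)   # upper (descending) branch
    return h_up, m_up, h_dn, m_dn

h_up, m_up, h_dn, m_dn = hysteresis_loop()
# Remanence: magnetization at H ~ 0 on the descending branch (~ tanh(hc/width)).
remanence = float(m_dn[np.argmin(np.abs(h_dn))])
print(remanence)
```

    Plotting (h_up, m_up) and (h_dn, m_dn) on one axis produces the closed loop; varying hc and width shows how coercivity and loop squareness respond, which is exactly the kind of parameter exploration the spreadsheet exercise targets.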

  18. Load alleviation maneuvers for a launch vehicle

    NASA Technical Reports Server (NTRS)

    Seywald, Hans; Bless, Robert R.

    1993-01-01

    This paper addresses the design of a forward-looking autopilot capable of employing a priori knowledge of wind gusts ahead of the flight path to reduce the bending loads experienced by a launch vehicle. The analysis presented here is preliminary, employing a very simple vehicle dynamical model and restricted to wind gusts in the form of isolated spikes. The main result of the study is that linear quadratic regulator (LQR) based feedback laws are inappropriate for handling spike-type wind perturbations with large amplitude and narrow base. The best performance is achieved with an interior-point penalty optimal control formulation, which can be well approximated by a simple feedback control law. Reduction of the maximum bending loads by nearly 50% is demonstrated.

  19. A proposed method for world weightlifting championships team selection.

    PubMed

    Chiu, Loren Z F

    2009-08-01

    The caliber of competitors at the World Weightlifting Championships (WWC) has increased greatly over the past 20 years. As the WWC are the primary qualifiers for Olympic slots (1996 to present), it is imperative for a nation to select team members who will finish with a high placing and score team points. Previous selection methods were based on a simple percentage system. Analysis of the results from the 2006 and 2007 WWC indicates a curvilinear trend in each weight class, suggesting a simple percentage system will not maximize the number of team points earned. To maximize team points, weightlifters should be selected based on their potential to finish in the top 25. A 5-tier ranking system is proposed that should ensure the athletes with the greatest potential to score team points are selected.

  20. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. 
It then describes a methodology for paired model usage through which predictive bias of a simplified model can be detected and corrected, and postcalibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.

  1. Healthy habits: efficacy of simple advice on weight control based on a habit-formation model.

    PubMed

    Lally, P; Chipperfield, A; Wardle, J

    2008-04-01

    To evaluate the efficacy of a simple weight loss intervention, based on principles of habit formation. An exploratory trial in which overweight and obese adults were randomized either to a habit-based intervention condition (with two subgroups given weekly vs monthly weighing; n=33, n=36) or to a waiting-list control condition (n=35) over 8 weeks. Intervention participants were followed up for 8 months. A total of 104 adults (35 men, 69 women) with an average BMI of 30.9 kg m(-2). Intervention participants were given a leaflet containing advice on habit formation and simple recommendations for eating and activity behaviours promoting negative energy balance, together with a self-monitoring checklist. Weight change over 8 weeks in the intervention condition compared with the control condition and weight loss maintenance over 32 weeks in the intervention condition. At 8 weeks, people in the intervention condition had lost significantly more weight (mean=2.0 kg) than those in the control condition (0.4 kg), with no difference between weekly and monthly weighing subgroups. At 32 weeks, those who remained in the study had lost an average of 3.8 kg, with 54% losing 5% or more of their body weight. An intention-to-treat analysis (based on last-observation-carried-forward) reduced this to 2.6 kg, with 26% achieving a 5% weight loss. This easily disseminable, low-cost, simple intervention produced clinically significant weight loss. In limited resource settings it has potential as a tool for obesity management.

  2. A simple, mass balance model of carbon flow in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Garland, Jay L.

    1989-01-01

    Internal cycling of chemical elements is a fundamental aspect of a Controlled Ecological Life Support System (CELSS). Mathematical models are useful tools for evaluating fluxes and reservoirs of elements associated with potential CELSS configurations. A simple mass balance model of carbon flow in a CELSS was developed based on data from the CELSS Breadboard project at Kennedy Space Center. All carbon reservoirs and fluxes were calculated for steady-state conditions and modelled using linear, donor-controlled transfer coefficients. The linear expression of photosynthetic flux was replaced with Michaelis-Menten kinetics after dynamic analysis of the model found that the latter produced more adequate model output. Sensitivity analysis of the model indicated that accurate determination of the maximum rate of gross primary production is critical to the development of an accurate model of carbon flow. Atmospheric carbon dioxide was particularly sensitive to changes in photosynthetic rate. The small reservoir of CO2 relative to the large CO2 fluxes increases the potential for volatility in CO2 concentration. Feedback control mechanisms regulating CO2 concentration will probably be necessary in a CELSS to reduce this system instability.
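
    The model structure described above can be sketched as a two-reservoir mass balance with Michaelis-Menten photosynthetic uptake and linear donor-controlled respiration. All pool sizes and rate constants below are illustrative placeholders, not Breadboard-project values:

```python
def simulate(days=100, dt=0.01):
    """Forward-Euler integration of a two-pool carbon balance:
    atmosphere CO2 <-> plant biomass. Total carbon is conserved exactly
    because both fluxes are applied symmetrically each step."""
    co2, biomass = 50.0, 500.0         # g C in atmosphere / plants (hypothetical)
    vmax, km = 20.0, 40.0              # Michaelis-Menten photosynthesis params
    k_resp = 0.03                      # linear donor-controlled respiration rate
    history = []                       # CO2 trajectory, e.g. for plotting
    for _ in range(int(days / dt)):
        gpp = vmax * co2 / (km + co2)  # saturating photosynthetic CO2 uptake
        resp = k_resp * biomass        # respiration returns C to the atmosphere
        co2 += (resp - gpp) * dt
        biomass += (gpp - resp) * dt
        history.append(co2)
    return co2, biomass, history

co2, biomass, _ = simulate()
print(round(co2, 1), round(biomass, 1))
```

    The small CO2 pool relative to the fluxes is visible here too: perturbing vmax shifts the CO2 reservoir proportionally much more than the biomass reservoir, which is the instability the abstract's sensitivity analysis highlights.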

  3. A founder large deletion mutation in Xeroderma pigmentosum-Variant form in Tunisia: implication for molecular diagnosis and therapy.

    PubMed

    Ben Rekaya, Mariem; Laroussi, Nadia; Messaoud, Olfa; Jones, Mariem; Jerbi, Manel; Naouali, Chokri; Bouyacoub, Yosra; Chargui, Mariem; Kefi, Rym; Fazaa, Becima; Boubaker, Mohamed Samir; Boussen, Hamouda; Mokni, Mourad; Abdelhak, Sonia; Zghal, Mohamed; Khaled, Aida; Yacoub-Youssef, Houda

    2014-01-01

    The Xeroderma pigmentosum Variant (XP-V) form is characterized by a late onset of skin symptoms. Our aim was the clinical and genetic investigation of XP-V Tunisian patients in order to develop a simple tool for early diagnosis. We investigated 16 suspected XP patients belonging to ten consanguineous families. Analysis of the POLH gene was performed by linkage analysis, long-range PCR, and sequencing. Genetic analysis showed linkage to the POLH gene with a founder haplotype in all affected patients. Long-range PCR from exon 9 to exon 11 showed a 3926 bp deletion compared to control individuals. Sequence analysis demonstrates that this deletion occurred between two Alu-Sq2 repetitive sequences in the same orientation in introns 9 and 10, respectively. We suggest that this mutation, POLH NG_009252.1: g.36847_40771del3925, was caused by an unequal crossover event between two homologous chromosomes at meiosis. These results allowed us to develop a simple PCR-based test to screen suspected XP-V patients. In Tunisia, the prevalence of the XP-V group seems to be underestimated, and clinical diagnosis usually comes late. Cascade screening for this founder mutation by PCR in regions with a high frequency of XP provides a rapid and cost-effective tool for early diagnosis of XP-V in Tunisia and North Africa.

  4. A Founder Large Deletion Mutation in Xeroderma Pigmentosum-Variant Form in Tunisia: Implication for Molecular Diagnosis and Therapy

    PubMed Central

    Ben Rekaya, Mariem; Laroussi, Nadia; Messaoud, Olfa; Jones, Mariem; Jerbi, Manel; Bouyacoub, Yosra; Chargui, Mariem; Kefi, Rym; Fazaa, Becima; Boubaker, Mohamed Samir; Boussen, Hamouda; Mokni, Mourad; Abdelhak, Sonia; Zghal, Mohamed; Khaled, Aida; Yacoub-Youssef, Houda

    2014-01-01

    The Xeroderma pigmentosum Variant (XP-V) form is characterized by a late onset of skin symptoms. Our aim was the clinical and genetic investigation of XP-V Tunisian patients in order to develop a simple tool for early diagnosis. We investigated 16 suspected XP patients belonging to ten consanguineous families. Analysis of the POLH gene was performed by linkage analysis, long-range PCR, and sequencing. Genetic analysis showed linkage to the POLH gene with a founder haplotype in all affected patients. Long-range PCR from exon 9 to exon 11 showed a 3926 bp deletion compared to control individuals. Sequence analysis demonstrates that this deletion occurred between two Alu-Sq2 repetitive sequences in the same orientation in introns 9 and 10, respectively. We suggest that this mutation, POLH NG_009252.1: g.36847_40771del3925, was caused by an unequal crossover event between two homologous chromosomes at meiosis. These results allowed us to develop a simple PCR-based test to screen suspected XP-V patients. In Tunisia, the prevalence of the XP-V group seems to be underestimated, and clinical diagnosis usually comes late. Cascade screening for this founder mutation by PCR in regions with a high frequency of XP provides a rapid and cost-effective tool for early diagnosis of XP-V in Tunisia and North Africa. PMID:24877075

  5. Comparison between rpoB and 16S rRNA Gene Sequencing for Molecular Identification of 168 Clinical Isolates of Corynebacterium

    PubMed Central

    Khamis, Atieh; Raoult, Didier; La Scola, Bernard

    2005-01-01

    A higher proportion (91%) of 168 corynebacterial isolates was positively identified by partial rpoB gene sequencing than by 16S rRNA gene sequencing. The method is thus a simple, molecular-analysis-based method for identification of corynebacteria, but it should be used in conjunction with other tests for definitive identification. PMID:15815024

  6. An efficient current-based logic cell model for crosstalk delay analysis

    NASA Astrophysics Data System (ADS)

    Nazarian, Shahin; Das, Debasish

    2013-04-01

    Logic cell modelling is an important component in the analysis and design of CMOS integrated circuits, mostly due to nonlinear behaviour of CMOS cells with respect to the voltage signal at their input and output pins. A current-based model for CMOS logic cells is presented, which can be used for effective crosstalk noise and delta delay analysis in CMOS VLSI circuits. Existing current source models are expensive and need a new set of Spice-based characterisation, which is not compatible with typical EDA tools. In this article we present Imodel, a simple nonlinear logic cell model that can be derived from the typical cell libraries such as NLDM, with accuracy much higher than NLDM-based cell delay models. In fact, our experiments show an average error of 3% compared to Spice. This level of accuracy comes with a maximum runtime penalty of 19% compared to NLDM-based cell delay models on medium-sized industrial designs.

  7. Highly Sensitive Ratiometric Fluorescent Sensor for Trinitrotoluene Based on the Inner Filter Effect between Gold Nanoparticles and Fluorescent Nanoparticles.

    PubMed

    Lu, Hongzhi; Quan, Shuai; Xu, Shoufang

    2017-11-08

    In this work, we developed a simple and sensitive ratiometric fluorescent assay for sensing trinitrotoluene (TNT) based on the inner filter effect (IFE) between gold nanoparticles (AuNPs) and ratiometric fluorescent nanoparticles (RFNs), designed by hybridizing green-emissive carbon dots (CDs) and red-emissive quantum dots (QDs) into a silica sphere as a fluorophore pair. In the IFE-based fluorescent assay, AuNPs in their dispersed state act as a powerful absorber that quenches the CDs, while aggregated AuNPs quench the QDs, owing to the complementary overlap between the absorption spectrum of the AuNPs and the emission spectrum of the RFNs. Because TNT induces the aggregation of AuNPs, the addition of TNT quenches the fluorescence of the QDs while recovering the fluorescence of the CDs, making ratiometric fluorescent detection of TNT feasible. The present IFE-based ratiometric fluorescent sensor can detect TNT ranging from 0.1 to 270 nM, with a detection limit of 0.029 nM. In addition, the developed method was successfully applied to investigate TNT in water and soil samples, with satisfactory recoveries ranging from 95 to 103% and precision below 4.5%. The simple sensing approach proposed here could improve the sensitivity of colorimetric analysis by changing ultraviolet analysis to ratiometric fluorescent analysis and promote the development of a dual-mode detection system.

  8. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

    A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp).
Current categorization of regional units with a single lithology from the CGI SimpleLithology (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) vocabulary poorly captures the lithologic character of such units in a meaningful way. A lithogenetic unit category scheme accessible as a GeoSciML-portrayal-based OGC Styled Layer Description resource is key to enabling OneGeology (http://oneGeology.org) geologic map services to achieve a high degree of visual harmonization.

  9. Information categorization approach to literary authorship disputes

    NASA Astrophysics Data System (ADS)

    Yang, Albert C.-C.; Peng, C.-K.; Yien, H.-W.; Goldberger, Ary L.

    2003-11-01

    Scientific analysis of the linguistic styles of different authors has generated considerable interest. We present a generic approach to measuring the similarity of two symbolic sequences that requires minimal background knowledge about a given human language. Our analysis is based on word rank order-frequency statistics and phylogenetic tree construction. We demonstrate the applicability of this method to historic authorship questions related to the classic Chinese novel “The Dream of the Red Chamber,” to the plays of William Shakespeare, and to the Federalist papers. This method may also provide a simple approach to other large databases based on their information content.
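
    A crude sketch of comparing texts by word rank order-frequency statistics. This is an illustrative distance only, not the paper's exact metric or its phylogenetic-tree construction:

```python
from collections import Counter

def rank_distance(text_a, text_b, top=20):
    """Illustrative stylistic distance: average rank difference, over text_a's
    top-ranked words, between each word's position in the two texts'
    frequency-ordered word lists. Lower means more similar usage."""
    def ranks(text):
        freq = Counter(text.lower().split())
        return {w: r for r, (w, _) in enumerate(freq.most_common())}
    ra, rb = ranks(text_a), ranks(text_b)
    shared = [w for w, r in sorted(ra.items(), key=lambda kv: kv[1])[:top]
              if w in rb]
    if not shared:                       # no common vocabulary at all
        return float("inf")
    return sum(abs(ra[w] - rb[w]) for w in shared) / len(shared)

a = "the cat sat on the mat the cat"
b = "the dog sat on the log the dog"
# Texts sharing function-word usage score closer than unrelated text.
print(rank_distance(a, b) < rank_distance(a, "entirely different words here"))
```

    On real corpora one would restrict attention to high-frequency function words, whose rank-order statistics are the stable stylistic signal the method exploits.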

  10. Applying Multi-Criteria Decision Analysis (MCDA) Simple Scoring as an Evidence-based HTA Methodology for Evaluating Off-Patent Pharmaceuticals (OPPs) in Emerging Markets.

    PubMed

    Brixner, Diana; Maniadakis, Nikos; Kaló, Zoltán; Hu, Shanlian; Shen, Jie; Wijaya, Kalman

    2017-09-01

    Off-patent pharmaceuticals (OPPs) represent more than 60% of the pharmaceutical market in many emerging countries, where they are frequently evaluated primarily on cost rather than with health technology assessment. OPPs are assumed to be identical to the originators. Branded and unbranded generic versions can, however, vary from the originator in active pharmaceutical ingredients, dosage, consistency, formulation, excipients, manufacturing processes, and distribution, for example. These variables can alter the efficacy and safety of the product, negatively impacting both the anticipated cost savings and the population's health. In addition, many health care systems lack the resources or expertise to evaluate such products, and current assessment methods can be complex and difficult to adapt to a health system's needs. Multicriteria decision analysis (MCDA) simple scoring is an evidence-based health technology assessment methodology for evaluating OPPs, especially in emerging countries in which resources are limited but decision makers still must balance affordability with factors such as drug safety, level of interchangeability, manufacturing site and active pharmaceutical ingredient quality, supply track record, and real-life outcomes. MCDA simple scoring can be applied to pharmaceutical pricing, reimbursement, formulary listing, and drug procurement. In November 2015, a workshop was held at the International Society for Pharmacoeconomics and Outcomes Research Annual Meeting in Milan to refine and prioritize criteria that can be used in MCDA simple scoring for OPPs, resulting in an example MCDA process and 22 prioritized criteria that health care systems in emerging countries can easily adapt to their own decision-making processes. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
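
    A minimal weighted-sum sketch of MCDA simple scoring. The criteria names, weights, and 1-5 scores below are hypothetical illustrations; a real committee would substitute its own prioritized criteria, such as the 22 produced by the workshop:

```python
# Hypothetical criteria and normalized weights (must sum to 1).
criteria = {
    "equivalence_evidence":  0.30,
    "manufacturing_quality": 0.25,
    "supply_track_record":   0.20,
    "real_world_outcomes":   0.15,
    "cost":                  0.10,
}

def mcda_score(scores):
    """Simple scoring: weighted sum of per-criterion scores (here 1-5)."""
    assert abs(sum(criteria.values()) - 1.0) < 1e-9  # weights normalized
    return sum(criteria[c] * s for c, s in scores.items())

# Two hypothetical off-patent products scored by a committee.
product_a = {"equivalence_evidence": 5, "manufacturing_quality": 4,
             "supply_track_record": 3, "real_world_outcomes": 4, "cost": 2}
product_b = {"equivalence_evidence": 3, "manufacturing_quality": 3,
             "supply_track_record": 5, "real_world_outcomes": 3, "cost": 5}

ranked = sorted([("A", mcda_score(product_a)), ("B", mcda_score(product_b))],
                key=lambda kv: -kv[1])
print(ranked)
```

    The point of the method is precisely this transparency: changing a weight shows decision makers how the procurement ranking trades cost against quality and supply criteria.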

  11. AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users

    NASA Astrophysics Data System (ADS)

    Maiersperger, T.

    2017-12-01

    The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes at the archive based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.

  12. Adaptation of a Simple Microfluidic Platform for High-Dimensional Quantitative Morphological Analysis of Human Mesenchymal Stromal Cells on Polystyrene-Based Substrates.

    PubMed

    Lam, Johnny; Marklein, Ross A; Jimenez-Torres, Jose A; Beebe, David J; Bauer, Steven R; Sung, Kyung E

    2017-12-01

    Multipotent stromal cells (MSCs, often called mesenchymal stem cells) have garnered significant attention within the field of regenerative medicine because of their purported ability to differentiate down musculoskeletal lineages. Given the inherent heterogeneity of MSC populations, recent studies have suggested that cell morphology may be indicative of MSC differentiation potential. Toward improving current methods and developing simple yet effective approaches for the morphological evaluation of MSCs, we combined passive pumping microfluidic technology with high-dimensional morphological characterization to produce robust tools for standardized high-throughput analysis. Using ultraviolet (UV) light as a modality for reproducible polystyrene substrate modification, we show that MSCs seeded on microfluidic straight channel devices incorporating UV-exposed substrates exhibited morphological changes that responded accordingly to the degree of substrate modification. Substrate modification also effected greater morphological changes in MSCs seeded at a lower rather than higher density within microfluidic channels. Despite largely comparable trends in morphology, MSCs seeded in microscale as opposed to traditional macroscale platforms displayed much higher sensitivity to changes in substrate properties. In summary, we adapted and qualified microfluidic cell culture platforms comprising simple straight channel arrays as a viable and robust tool for high-throughput quantitative morphological analysis to study cell-material interactions.
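    The abstract's central idea, turning cell images into quantitative morphological features, can be illustrated with a toy extractor for a single segmented cell. Everything below (function name, choice of features) is an illustrative sketch, not the study's actual high-dimensional pipeline, which uses far richer feature sets.

    ```python
    import numpy as np

    def shape_features(mask):
        """Toy morphological features for one segmented cell in a boolean
        mask: area, 4-connected boundary pixel count as a perimeter proxy,
        circularity 4*pi*A/P^2, and bounding-box aspect ratio."""
        mask = np.asarray(mask, dtype=bool)
        area = int(mask.sum())
        padded = np.pad(mask, 1)
        # "Core" pixels have all four 4-neighbours inside the object;
        # boundary pixels are object pixels that are not core.
        core = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
        perimeter = int((mask & ~core).sum())
        rows, cols = np.nonzero(mask)
        h = rows.max() - rows.min() + 1
        w = cols.max() - cols.min() + 1
        return {"area": area,
                "perimeter": perimeter,
                "circularity": 4 * np.pi * area / perimeter ** 2,
                "aspect_ratio": max(h, w) / min(h, w)}
    ```

    Stacking such per-cell vectors across a population is what makes density- and substrate-dependent morphology shifts, like those reported above, statistically comparable.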

  13. Monte Carlo based statistical power analysis for mediation models: methods and software.

    PubMed

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
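    The proposed approach, a bootstrap test nested inside a Monte Carlo loop, is easy to sketch. The study's software is the R package bmem; the following is a hypothetical, minimal numpy re-implementation for the simple mediation case only, with illustrative effect sizes and replication counts.

    ```python
    import numpy as np

    def mediation_power(a=0.3, b=0.3, n=100, n_sim=200, n_boot=200,
                        alpha=0.05, seed=0):
        """Estimate power to detect the indirect effect a*b in the simple
        mediation model X -> M -> Y: simulate n_sim datasets, and in each
        one check whether the bootstrap percentile CI of a_hat*b_hat
        excludes zero."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sim):
            x = rng.normal(size=n)
            m = a * x + rng.normal(size=n)
            y = b * m + rng.normal(size=n)
            ab = np.empty(n_boot)
            for i in range(n_boot):
                idx = rng.integers(0, n, n)
                xb, mb, yb = x[idx], m[idx], y[idx]
                a_hat = np.cov(xb, mb)[0, 1] / np.var(xb, ddof=1)
                # b path: regress Y on M, controlling for X.
                design = np.column_stack([np.ones(n), mb, xb])
                b_hat = np.linalg.lstsq(design, yb, rcond=None)[0][1]
                ab[i] = a_hat * b_hat
            lo, hi = np.percentile(ab, [100 * alpha / 2, 100 - 100 * alpha / 2])
            hits += int(lo > 0 or hi < 0)
        return hits / n_sim
    ```

    The nonnormality feature the abstract highlights would correspond to swapping the `rng.normal` error draws for skewed or heavy-tailed distributions; the multiple-mediator, multiple-group, and longitudinal cases require the full package.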

  14. Superresolution confocal technology for displacement measurements based on total internal reflection.

    PubMed

    Kuang, Cuifang; Ali, M Yakut; Hao, Xiang; Wang, Tingting; Liu, Xu

    2010-10-01

    In order to achieve higher axial resolution in displacement measurement, a novel method is proposed based on a total internal reflection filter and the confocal microscope principle. A theoretical analysis of the basic measurement principles is presented. The analysis reveals that the proposed confocal detection scheme is effective in greatly enhancing the nonlinearity of the reflectance curve and, hence, the measurement resolution. In addition, a simple prototype system has been developed based on the theoretical analysis, and a series of experiments has been performed under laboratory conditions to verify the system's feasibility, accuracy, and stability. The experimental results demonstrate that the axial resolution in displacement measurements is better than 1 nm over a range of 200 nm, which is threefold better than can be achieved using a plane reflector.

  15. On the (In)Validity of Tests of Simple Mediation: Threats and Solutions

    PubMed Central

    Pek, Jolynn; Hoyle, Rick H.

    2015-01-01

    Mediation analysis is a popular framework for identifying underlying mechanisms in social psychology. In the context of simple mediation, we review and discuss the implications of three facets of mediation analysis: (a) conceptualization of the relations between the variables, (b) statistical approaches, and (c) relevant elements of design. We also highlight the issue of equivalent models that are inherent in simple mediation. The extent to which results are meaningful stems directly from choices regarding these three facets of mediation analysis. We conclude by discussing how mediation analysis can be better applied to examine causal processes, highlight the limits of simple mediation, and make recommendations for better practice. PMID:26985234

  16. A Paper-Based Device for Performing Loop-Mediated Isothermal Amplification with Real-Time Simultaneous Detection of Multiple DNA Targets.

    PubMed

    Seok, Youngung; Joung, Hyou-Arm; Byun, Ju-Young; Jeon, Hyo-Sung; Shin, Su Jeong; Kim, Sanghyo; Shin, Young-Beom; Han, Hyung Soo; Kim, Min-Gon

    2017-01-01

    Paper-based diagnostic devices have many advantages as one of the platforms for point-of-care (POC) testing because of their simplicity, portability, and cost-effectiveness. However, despite the high sensitivity and specificity of nucleic acid testing (NAT), the development of NAT on paper platforms has lagged behind other formats because nucleic acid amplification reactions require specific conditions (e.g., pH, buffer components, and temperature) and can be inhibited by the technical peculiarities of paper-based devices. Here, we propose a paper-based device for performing loop-mediated isothermal amplification (LAMP) with real-time simultaneous detection of multiple DNA targets. We determined the optimal chemical components to enable dry conditions for the LAMP reaction without lyophilization or other techniques. We also devised a simple paper device structure by sequentially stacking functional layers, and employed a newly discovered property of hydroxynaphthol blue fluorescence to analyze real-time LAMP signals in the paper device. This platform allowed analysis of three different meningitis DNA samples in a single device with single-step operation. This LAMP-based multiple diagnostic device has potential for real-time analysis with quantitative detection of 10²-10⁵ copies of genomic DNA. Furthermore, we propose the transformation of DNA amplification devices into a simple and affordable paper system with great potential for realizing a paper-based NAT system for POC testing.

  18. Sulfanilic acid-modified chitosan mini-spheres and their application for lysozyme purification from egg white.

    PubMed

    Hirsch, Daniela B; Baieli, María F; Urtasun, Nicolás; Lázaro-Martínez, Juan M; Glisoni, Romina J; Miranda, María V; Cascone, Osvaldo; Wolman, Federico J

    2018-03-01

    A cation exchange matrix with zwitterionic and multimodal properties was synthesized by a simple reaction sequence coupling sulfanilic acid to a chitosan-based support. The novel chromatographic matrix was physico-chemically characterized by ss-NMR and ζ potential, and its chromatographic performance was evaluated for lysozyme purification from diluted egg white. The maximum adsorption capacity, calculated according to the Langmuir adsorption isotherm, was 50.07 ± 1.47 mg g⁻¹, while the dissociation constant was 0.074 ± 0.012 mg mL⁻¹. The process for lysozyme purification from egg white was optimized, achieving 81.9% yield and a purity of 86.5% according to RP-HPLC analysis. This work shows possible novel applications of chitosan-based materials. The simple synthesis reactions, combined with the simple mode of use of the chitosan matrix, represent a novel method to purify proteins from raw starting materials. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:387-396, 2018. © 2017 American Institute of Chemical Engineers.

  19. Current distribution in a three-dimensional IC analyzed by a perturbation method. Part 1: A simple steady state theory

    NASA Technical Reports Server (NTRS)

    Edmonds, Larry D.

    1987-01-01

    The steady state current distribution in a three dimensional integrated circuit is presented. A device physics approach, based on a perturbation method rather than an equivalent lumped circuit approach, is used. The perturbation method allows the various currents to be expressed in terms of elementary solutions which are solutions to very simple boundary value problems. A Simple Steady State Theory is the subtitle because the most obvious limitation of the present version of the analysis is that all depletion region boundary surfaces are treated as equipotential surfaces. This may be an adequate approximation in some applications but it is an obvious weakness in the theory when applied to latched states. Examples that illustrate the use of these analytical methods are not given because they will be presented in detail in the future.

  20. Mining Distance Based Outliers in Near Linear Time with Randomization and a Simple Pruning Rule

    NASA Technical Reports Server (NTRS)

    Bay, Stephen D.; Schwabacher, Mark

    2003-01-01

    Defining outliers by their distance to neighboring examples is a popular approach to finding unusual examples in a data set. Recently, much work has been conducted with the goal of finding fast algorithms for this task. We show that a simple nested loop algorithm that in the worst case is quadratic can give near linear time performance when the data is in random order and a simple pruning rule is used. We test our algorithm on real high-dimensional data sets with millions of examples and show that the near linear scaling holds over several orders of magnitude. Our average case analysis suggests that much of the efficiency is because the time to process non-outliers, which are the majority of examples, does not depend on the size of the data set.
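    The nested-loop scheme with randomization and pruning described above can be sketched compactly. The parameter choices and data structures below are illustrative, not the authors' optimized implementation; the point is the pruning rule: once a point's running k-NN distance drops below the weakest score in the current top-n outlier set, it can never qualify and its scan stops early.

    ```python
    import numpy as np

    def top_outliers(data, k=5, n_out=3, seed=0):
        """Return (score, index) pairs for the n_out points with the
        largest distance to their k-th nearest neighbor, scanning the
        data in random order and pruning non-outliers early."""
        rng = np.random.default_rng(seed)
        data = np.asarray(data, dtype=float)
        data = data[rng.permutation(len(data))]   # random order enables pruning
        cutoff = 0.0
        top = []                                  # current top-n (score, index)
        for i, p in enumerate(data):
            knn = np.full(k, np.inf)              # k smallest distances so far
            pruned = False
            for j, q in enumerate(data):
                if i == j:
                    continue
                d = np.linalg.norm(p - q)
                if d < knn.max():
                    knn[knn.argmax()] = d
                    if len(top) == n_out and knn.max() < cutoff:
                        pruned = True             # cannot beat the weakest outlier
                        break
            if not pruned:
                top.append((knn.max(), i))
                top.sort(reverse=True)
                top = top[:n_out]
                if len(top) == n_out:
                    cutoff = top[-1][0]
        return top
    ```

    Because most points sit in dense regions, their k-NN distance falls below the cutoff after only a few comparisons in random order, which is exactly why the average-case cost per non-outlier stays roughly constant as the data set grows.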

  1. Recognition of Simple 3D Geometrical Objects under Partial Occlusion

    NASA Astrophysics Data System (ADS)

    Barchunova, Alexandra; Sommer, Gerald

    In this paper we present a novel procedure for contour-based recognition of partially occluded three-dimensional objects. In our approach we use images of real and rendered objects whose contours have been deformed by a restricted change of the viewpoint. The preparatory part consists of contour extraction, preprocessing, local structure analysis and feature extraction. The main part deals with an extended construction and functionality of the classifier ensemble Adaptive Occlusion Classifier (AOC). It relies on a hierarchical fragmenting algorithm to perform a local structure analysis which is essential when dealing with occlusions. In the experimental part of this paper we present classification results for five classes of simple geometrical figures: prism, cylinder, half-cylinder, cube, and bridge. We compare classification results for three classical feature extractors: Fourier descriptors, pseudo-Zernike moments, and Zernike moments.
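    Of the three feature extractors compared, Fourier descriptors are the simplest to sketch. The construction below is the standard one (not the paper's specific variant): boundary points become complex numbers, and normalizing the FFT magnitudes yields a descriptor vector invariant to translation, scale, rotation, and starting point.

    ```python
    import numpy as np

    def fourier_descriptors(contour, n_desc=8):
        """Fourier-descriptor features for a closed contour given as an
        (N, 2) array of boundary points. Dropping the DC coefficient
        removes translation, dividing by the first harmonic's magnitude
        removes scale, and keeping only magnitudes discards rotation and
        starting-point effects."""
        z = contour[:, 0] + 1j * contour[:, 1]
        coeffs = np.fft.fft(z)
        mags = np.abs(coeffs[1:n_desc + 1])
        return mags / mags[0]

    # Invariance check: a scaled, shifted, rotated ellipse yields the
    # same descriptor vector as the original.
    t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    ellipse = np.column_stack([2 * np.cos(t), np.sin(t)])
    rot = np.array([[np.cos(0.7), -np.sin(0.7)],
                    [np.sin(0.7),  np.cos(0.7)]])
    same = np.allclose(fourier_descriptors(ellipse),
                       fourier_descriptors(3 * ellipse @ rot.T + 5))
    ```

    Under occlusion these global descriptors degrade, which is one motivation for the paper's local structure analysis over contour fragments.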

  2. Sleep Quality Estimation based on Chaos Analysis for Heart Rate Variability

    NASA Astrophysics Data System (ADS)

    Fukuda, Toshio; Wakuda, Yuki; Hasegawa, Yasuhisa; Arai, Fumihito; Kawaguchi, Mitsuo; Noda, Akiko

    In this paper, we propose an algorithm to estimate sleep quality from heart rate variability using chaos analysis. Polysomnography (PSG) is the conventional, reliable system for diagnosing sleep disorders and evaluating their severity and therapeutic effect, estimating sleep quality from multiple channels. However, the recording process requires a lot of time and a controlled measurement environment, and analyzing PSG data is laborious because the huge volume of sensed data must be evaluated manually. Meanwhile, attention has recently turned to mistakes and accidents caused by loss of regular sleep and homeostasis. A simple home system for checking one's own sleep is therefore needed, along with an estimation algorithm for such a system. We therefore propose an algorithm that estimates sleep quality based only on heart rate variability, which can be measured in an uncontrolled environment by a simple sensor such as a pressure sensor or an infrared sensor, by experimentally finding the relationship between chaos indices and sleep quality. A system incorporating the estimation algorithm can show users the patterns and quality of their daily sleep, so that they can arrange their schedules in advance, pay closer attention to their sleep results, and consult a doctor.

  3. ReSTART: A Novel Framework for Resource-Based Triage in Mass-Casualty Events.

    PubMed

    Mills, Alex F; Argon, Nilay T; Ziya, Serhan; Hiestand, Brian; Winslow, James

    2014-01-01

    Current guidelines for mass-casualty triage do not explicitly use information about resource availability. Even though this limitation has been widely recognized, how it should be addressed remains largely unexplored. The authors present a novel framework developed using operations research methods to account for resource limitations when determining priorities for transportation of critically injured patients. To illustrate how this framework can be used, they also develop two specific example methods, named ReSTART and Simple-ReSTART, both of which extend the widely adopted triage protocol Simple Triage and Rapid Treatment (START) by using a simple calculation to determine priorities based on the relative scarcity of transportation resources. The framework is supported by three techniques from operations research: mathematical analysis, optimization, and discrete-event simulation. The authors' algorithms were developed using mathematical analysis and optimization and then extensively tested using 9,000 discrete-event simulations on three distributions of patient severity (representing low, random, and high acuity). For each incident, the expected number of survivors was calculated under START, ReSTART, and Simple-ReSTART. A web-based decision support tool was constructed to help providers make prioritization decisions in the aftermath of mass-casualty incidents based on ReSTART. In simulations, ReSTART resulted in significantly lower mortality than START regardless of which severity distribution was used (paired t test, p<.01). Mean decrease in critical mortality, the percentage of immediate and delayed patients who die, was 8.5% for the low-acuity distribution (range −2.2% to 21.1%), 9.3% for the random distribution (range −0.2% to 21.2%), and 9.1% for the high-acuity distribution (range −0.7% to 21.1%).
Although the critical mortality improvement due to ReSTART was different for each of the three severity distributions, the variation was less than 1 percentage point, indicating that the ReSTART policy is relatively robust to different severity distributions. Taking resource limitations into account in mass-casualty triage has the potential to increase the expected number of survivors. Further validation is required before field implementation; however, the framework proposed here can serve as the foundation for future work in this area.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Townsend, D.W.; Linnhoff, B.

    In Part I, criteria for heat engine and heat pump placement in chemical process networks were derived, based on the "temperature interval" (T.I.) analysis of the heat exchanger network problem. Using these criteria, this paper gives a method for identifying the best outline design for any combined system of chemical process, heat engines, and heat pumps. The method eliminates inferior alternatives early, and positively leads on to the most appropriate solution. A graphical procedure based on the T.I. analysis forms the heart of the approach, and the calculations involved are simple enough to be carried out on, say, a programmable calculator. Application to a case study is demonstrated. Optimization methods based on this procedure are currently under research.

  5. Linear discriminant analysis based on L1-norm maximization.

    PubMed

    Zhong, Fujin; Zhang, Jiashu

    2013-08-01

    Linear discriminant analysis (LDA) is a well-known dimensionality reduction technique, which is widely used for many purposes. However, conventional LDA is sensitive to outliers because its objective function is based on the distance criterion using the L2-norm. This paper proposes a simple but effective robust LDA version based on L1-norm maximization, which learns a set of locally optimal projection vectors by maximizing the ratio of the L1-norm-based between-class dispersion to the L1-norm-based within-class dispersion. The proposed method is theoretically proved to be feasible and robust to outliers while overcoming the singularity problem of the within-class scatter matrix in conventional LDA. Experiments on artificial datasets, standard classification datasets and three popular image databases demonstrate the efficacy of the proposed method.
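    The L1 criterion described, the ratio of L1-norm between-class to within-class dispersion, can be sketched with a generic subgradient ascent. This is an illustration of the idea under stated assumptions, not the authors' exact iteration; in particular the step rule and best-iterate tracking are arbitrary choices here.

    ```python
    import numpy as np

    def l1_lda_direction(X, y, n_iter=300, step=0.05, seed=0):
        """Find a unit vector w maximizing sum_i |w.b_i| / sum_j |w.e_j|,
        where the b_i are class means minus the grand mean (between-class)
        and the e_j are samples minus their class mean (within-class)."""
        rng = np.random.default_rng(seed)
        mu = X.mean(axis=0)
        B = np.vstack([X[y == c].mean(axis=0) - mu for c in np.unique(y)])
        E = np.vstack([X[y == c] - X[y == c].mean(axis=0)
                       for c in np.unique(y)])
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        best_w, best_val = w.copy(), 0.0
        for _ in range(n_iter):
            num, den = np.abs(B @ w).sum(), np.abs(E @ w).sum()
            if num / den > best_val:                 # keep the best iterate
                best_val, best_w = num / den, w.copy()
            # Subgradient of the ratio via the quotient rule and sign().
            g = (B.T @ np.sign(B @ w)) / den \
                - num * (E.T @ np.sign(E @ w)) / den ** 2
            g /= np.linalg.norm(g) + 1e-12
            w = w + step * g
            w /= np.linalg.norm(w)                   # stay on the unit sphere
        return best_w
    ```

    Unlike the L2 version, no within-class scatter matrix is inverted, which is how the L1 formulation sidesteps the singularity problem mentioned above.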

  6. Key findings from the International Ovarian Tumor Analysis (IOTA) study: an approach to the optimal ultrasound based characterisation of adnexal pathology

    PubMed Central

    Bourne, Tom; De Rijdt, Sylvie; Van Holsbeke, Caroline; Sayasneh, Ahmad; Valentin, Lil; Van Calster, Ben; Timmerman, Dirk

    2015-01-01

    Abstract The principal aim of the IOTA project has been to develop approaches to the evaluation of adnexal pathology using ultrasound that can be transferred to all examiners. Creating models that use simple, easily reproducible ultrasound characteristics is one approach. PMID:28191150

  7. A Simple Algorithm for Obtaining Nearly Optimal Quadrature Rules for NURBS-based Isogeometric Analysis

    DTIC Science & Technology

    2012-01-05


  8. Resistivity in Play-Doh: Time and Color Variations

    ERIC Educational Resources Information Center

    Fuse, Christopher; August, Brandon; Cannaday, Ashley; Barker, Casey

    2013-01-01

    The study of electricity and magnetism is fundamental to all first-year physics courses. Developing simple electricity laboratory experiences that are open ended and inquiry based can be difficult. We wished to create a lab experiment where the students have some control over the experimental design, data analysis is required, and students…

  9. Calculation of the bending stresses in helicopter rotor blades

    NASA Technical Reports Server (NTRS)

    De Guillenchmidt, P

    1951-01-01

    A comparatively rapid method is presented for determining theoretically the bending stresses of helicopter rotor blades in forward flight. The method is based on the analysis of the properties of a vibrating beam, and its uniqueness lies in the simple solution of the differential equation which governs the motion of the bent blades.

  10. Aeroelastic analysis of a troposkien-type wind turbine blade

    NASA Technical Reports Server (NTRS)

    Nitzsche, F.

    1981-01-01

    The linear aeroelastic equations for one curved blade of a vertical axis wind turbine in state vector form are presented. The method is based on a simple integrating matrix scheme together with the transfer matrix idea. The method is proposed as a convenient way of solving the associated eigenvalue problem for general support conditions.

  11. "Met" Made Simple: Building Research-Based Teacher Evaluations. Issue Analysis Report

    ERIC Educational Resources Information Center

    New Teacher Project, 2012

    2012-01-01

    Groundbreaking new findings from the Bill and Melinda Gates Foundation's Measures of Effective Teaching (MET) project hold the potential to answer crucial questions about how to assess teachers' performance. For the past two years, MET researchers have conducted a research project of unprecedented scope, involving 3,000 teachers in six school…

  12. Keeping It Simple: The Grammatical Properties of Shared Book Reading

    ERIC Educational Resources Information Center

    Noble, Claire H.; Cameron-Faulkner, Thea; Lieven, Elena

    2018-01-01

    The positive effects of shared book reading on vocabulary and reading development are well attested (e.g., Bus, van Ijzendoorn, & Pellegrini, 1995). However, the role of shared book reading in GRAMMATICAL DEVELOPMENT remains unclear. In this study, we conducted a construction-based analysis of caregivers' child-directed speech during shared…

  13. Space based lidar shot pattern targeting strategies for small targets such as streams

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    2001-01-01

    An analysis of the effectiveness of four different types of lidar shot distribution is conducted to determine which is best for concentrating shots in a given location. A simple preemptive targeting strategy is found to work as adequately as a more involved dynamic strategy for most target sizes considered.

  14. Sorption of Triangular Silver Nanoplates on Polyurethane Foam

    NASA Astrophysics Data System (ADS)

    Furletov, A. A.; Apyari, V. V.; Garshev, A. V.; Volkov, P. A.; Tolmacheva, V. V.; Dmitrienko, S. G.

    2018-02-01

    The sorption of triangular silver nanoplates on polyurethane foam is investigated as a procedure for creating a nanocomposite sensing material for subsequent use in optical means of chemical analysis. Triangular silver nanoplates are synthesized and characterized, and a simple sorption technique for the formation of a composite material based on these nanoplates is proposed.

  15. Introducing Valuation Effects-Based External Balance Analysis into the Undergraduate Macroeconomics Curricula: A Simple Framework with Applications

    ERIC Educational Resources Information Center

    Brust, Peter; Jayakumar, Vivekanand

    2012-01-01

    Global imbalances and the sustainability of large U.S. current account deficits have dominated international macroeconomics of late. Pedagogically, a clear disconnect exists between graduate-level open-economy macroeconomics that emphasizes intertemporal current account models and net foreign asset adjustment featuring valuation effects, and,…

  16. Determination of lysine content based on an in situ pretreatment and headspace gas chromatographic measurement technique.

    PubMed

    Wan, Xiao-Fang; Liu, Bao-Lian; Yu, Teng; Yan, Ning; Chai, Xin-Sheng; Li, You-Ming; Chen, Guang-Xue

    2018-05-01

    This work reports on a simple method for the determination of lysine content by an in situ sample pretreatment and headspace gas chromatographic (HS-GC) measurement technique, based on carbon dioxide (CO₂) formation from the pretreatment reaction (between lysine and ninhydrin solution) in a closed vial. It was observed that complete lysine conversion to CO₂ could be achieved within 60 min at 60 °C in a phosphate buffer medium (pH = 4.0), with a minimum ninhydrin/lysine molar ratio of 16. The results showed that the method had good precision (RSD < 5.23%) and accuracy (within 6.80%), compared with the results of a reference method (the ninhydrin spectroscopic method). Owing to the in situ sample pretreatment and headspace measurement, the present method is very simple and particularly suitable for batch sample analysis in lysine-related research and applications. Graphical abstract: The flow path of the reaction and HS-GC measurement for the lysine analysis.

  17. A simple, rapid and novel method based on salting-out assisted liquid-liquid extraction for ochratoxin A determination in beer samples prior to ultra-high performance liquid chromatography coupled to tandem mass spectrometry.

    PubMed

    Mariño-Repizo, Leonardo; Goicoechea, Hector; Raba, Julio; Cerutti, Soledad

    2018-06-07

    A novel, simple, easy and cheap sample treatment strategy based on salting-out assisted liquid-liquid extraction (SALLE) for ochratoxin A (OTA) ultra-trace analysis in beer samples using ultra-high performance liquid chromatography-tandem mass spectrometry determination was developed. The factors involved in the efficiency of pretreatment were studied employing factorial design in the screening phase, and the optimal conditions of the significant variables on the analytical response were evaluated using a central composite face-centred design (CCF). Consequently, the amount of salt ((NH₄)₂SO₄), together with the volumes of sample, hydrophilic (acetone) and nonpolar (toluene) solvents, and the times of vortexing and centrifugation, were optimized. Under optimized conditions, the limits of detection (LOD) and quantification (LOQ) were 0.02 µg l⁻¹ and 0.08 µg l⁻¹, respectively. OTA extraction recovery by SALLE was approximately 90% (0.2 µg l⁻¹). Furthermore, the methodology was in agreement with EU Directive requirements and was successfully applied for analysis of beer samples.

  18. The Amount and Preferred Orientation of Simple-shear in a Deformation Tensor: Implications for Detecting Shear Zones and Faults with GPS

    NASA Astrophysics Data System (ADS)

    Johnson, A. M.; Griffiths, J. H.

    2007-05-01

    At the 2005 Fall Meeting of the American Geophysical Union, Griffiths and Johnson [2005] introduced a method of extracting from the deformation-gradient (and velocity-gradient) tensor the amount and preferred orientation of simple-shear associated with 2-D shear zones and faults. Noting that this treatment was 2-D is important because the shear zones and faults in Griffiths and Johnson [2005] were assumed non-dilatant and infinitely long, ignoring the scissors-like action along strike associated with shear zones and faults of finite length. Because shear zones and faults can dilate (and contract) normal to their walls and can have a scissors-like action associated with twisting about an axis normal to their walls, the more general method of detecting simple-shear is introduced here and called MODES, the "method of detecting simple-shear." MODES can thus extract from the deformation-gradient (and velocity-gradient) tensor the amount and preferred orientation of simple-shear associated with 3-D shear zones and faults near or far from the Earth's surface, providing improvements and extensions to existing analytical methods used in active tectonics studies, especially strain analysis and dislocation theory. The derivation of MODES is based on one definition and two assumptions: by definition, simple-shear deformation becomes localized in some way; by assumption, the twirl within the deformation-gradient (or the spin within the velocity-gradient) is due to a combination of simple-shear and twist, and coupled with the simple-shear and twist is a dilatation of the walls of shear zones and faults. The preferred orientation is thus the orientation of the plane containing the simple-shear and satisfying the mechanical and kinematical boundary conditions. Results from a MODES analysis are illustrated by means of a three-dimensional diagram, the cricket-ball, which is reminiscent of the seismologist's "beach ball."
In this poster, we present the underlying theory of MODES and illustrate how it works by analyzing the three-dimensional displacements measured with the Global Positioning System across the 1999 Chi-Chi earthquake ground rupture in Taiwan. In contrast to the deformation zone in the upper several meters of ground below the surface detected by Yu et al. [2001], MODES determines the orientation and direction of shift of a shear zone representing the earthquake fault within the upper several hundred or thousand meters of ground below the surface. Thus, one value of the MODES analysis in this case is to provide boundary conditions for dislocation solutions for the subsurface shape of the main rupture during the earthquake.

  19. Comparative study on gene set and pathway topology-based enrichment methods.

    PubMed

    Bayerlová, Michaela; Jung, Klaus; Kramer, Frank; Klemm, Florian; Bleckmann, Annalen; Beißbarth, Tim

    2015-10-22

    Enrichment analysis is a popular approach to identify pathways or sets of genes which are significantly enriched in the context of differentially expressed genes. The traditional gene set enrichment approach considers a pathway as a simple gene list, disregarding any knowledge of gene or protein interactions. In contrast, the newer group of so-called pathway topology-based methods integrates the topological structure of a pathway into the analysis. We comparatively investigated gene set and pathway topology-based enrichment approaches, considering three gene set and four topological methods. These methods were compared in two extensive simulation studies and on a benchmark of 36 real datasets, providing the same pathway input data to all methods. In the benchmark data analysis, both types of methods showed a comparable ability to detect enriched pathways. The first simulation study was conducted with KEGG pathways, which showed considerable gene overlaps between each other. In this study with original KEGG pathways, none of the topology-based methods outperformed the gene set approach. Therefore, a second simulation study was performed on non-overlapping pathways created from unique gene IDs. Here, methods accounting for pathway topology reached higher accuracy than the gene set methods; however, their sensitivity was lower. We conducted one of the first comprehensive comparative studies evaluating gene set against pathway topology-based enrichment methods. The topological methods showed better performance in the simulation scenarios with non-overlapping pathways; however, they were not conclusively better in the other scenarios. This suggests that the simple gene set approach might be sufficient to detect an enriched pathway under realistic circumstances. Nevertheless, more extensive studies and further benchmark data are needed to systematically evaluate these methods and to assess what gain and cost pathway topology information introduces into enrichment analysis. 
Both types of methods for enrichment analysis require further improvements in order to deal with the problem of pathway overlaps.
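    The traditional gene-set approach described above typically reduces to an over-representation test such as the hypergeometric (Fisher) test. A minimal stdlib sketch, illustrative only and not one of the specific methods compared in the study:

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """Upper-tail hypergeometric P(X >= k): the probability of seeing at
    least k pathway genes among n differentially expressed genes, given
    K pathway genes in a universe of N genes."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Toy example: 5-gene pathway, 20-gene universe, 5 DE genes, all 5 in the pathway.
p = hypergeom_enrichment_p(20, 5, 5, 5)
```

    A small p-value flags the pathway as enriched; topology-based methods replace this gene-list counting with statistics that weight genes by their position in the pathway graph.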

  20. Kidney function estimating equations in patients with chronic kidney disease.

    PubMed

    Hojs, R; Bevc, S; Ekart, R; Gorenjak, M; Puklavec, L

    2011-04-01

    The current guidelines emphasise the need to assess kidney function using predictive equations rather than serum creatinine alone. The present study compares serum cystatin C-based equations and serum creatinine-based equations in patients with chronic kidney disease (CKD). Seven hundred and sixty-four adult patients with CKD were enrolled. In each patient serum creatinine and serum cystatin C were determined. Their glomerular filtration rate (GFR) was estimated using three serum creatinine-based equations [Cockcroft-Gault (C&G), modification of diet in renal disease (MDRD) and the Chronic Kidney Disease Epidemiology Collaboration equation (CKD-EPI)] and two serum cystatin C-based equations [our own cystatin C formula (GFR = 90.63 × (cystatin C)^-1.192) and the simple cystatin C formula (GFR = 100/cystatin C)]. The GFR was measured using 51Cr-EDTA clearance. A statistically significant correlation was found between 51Cr-EDTA clearance and serum creatinine, serum cystatin C and all observed formulas. Receiver operating characteristic curve analysis (cut-off for GFR 60 ml/min/1.73 m^2) showed that serum cystatin C and both cystatin C formulas had a higher diagnostic accuracy than the C&G formula. Bland-Altman analysis for the same cut-off value showed that all formulas except the simple cystatin C formula underestimated measured GFR. The accuracy within 30% of estimated 51Cr-EDTA clearance values differed according to the stage of CKD. Analysis of the ability to correctly predict a patient's GFR below or above 60 ml/min/1.73 m^2 showed a statistically significantly higher ability for both cystatin C formulas compared with the MDRD formula. Our results indicate that serum cystatin C-based equations are reliable markers of GFR, comparable with creatinine-based formulas. © 2011 Blackwell Publishing Ltd.
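    The two cystatin C equations quoted in the abstract are simple enough to compute directly (assuming serum cystatin C is expressed in mg/L, as is conventional; the creatinine-based equations need additional demographic inputs and are omitted here):

```python
def egfr_cystatin_own(cystatin_c):
    """Authors' own formula from the abstract: GFR = 90.63 x (cystatin C)^-1.192,
    in ml/min/1.73 m^2."""
    return 90.63 * cystatin_c ** -1.192

def egfr_cystatin_simple(cystatin_c):
    """Simple cystatin C formula from the abstract: GFR = 100 / cystatin C."""
    return 100.0 / cystatin_c
```

    For example, a serum cystatin C of 1.0 mg/L gives roughly 90 ml/min/1.73 m^2 under the authors' formula and exactly 100 under the simple formula.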

  1. SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.

    PubMed

    Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B

    2016-02-04

    Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. 
This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.
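    SPARTA delegates each of the stages above to established tools; purely as an illustration of the differential-expression step (not SPARTA's actual implementation), the core idea is library-size normalization followed by a comparison statistic:

```python
from math import log2

def cpm(counts):
    """Counts-per-million: normalize raw gene counts by library size."""
    total = sum(counts.values())
    return {g: c * 1e6 / total for g, c in counts.items()}

def log2_fold_changes(control, treated, pseudo=0.5):
    """log2 fold change on CPM-normalized counts, with a pseudocount to
    keep zero-count genes finite."""
    a, b = cpm(control), cpm(treated)
    return {g: log2((b[g] + pseudo) / (a[g] + pseudo)) for g in control}

# Hypothetical two-sample comparison with toy counts:
control = {"geneA": 100, "geneB": 900}
treated = {"geneA": 400, "geneB": 600}
fc = log2_fold_changes(control, treated)
```

    Real pipelines additionally estimate dispersion across replicates and test significance; this sketch shows only the normalization-and-fold-change idea.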

  2. On-line capacity-building program on "analysis of data" for medical educators in the South Asia region: a qualitative exploration of our experience.

    PubMed

    Dongre, A R; Chacko, T V; Banu, S; Bhandary, S; Sahasrabudhe, R A; Philip, S; Deshmukh, P R

    2010-11-01

    In medical education, using the World Wide Web is a new approach to building the capacity of faculty. However, there is little information available on medical education researchers' needs and their collective learning outcomes in such on-line environments. Hence, the present study attempted: 1) to identify the capacity-building needs of fellows in a faculty development program on the topic of data analysis; and 2) to describe, analyze and understand the collective learning outcomes of the fellows during this need-based on-line session. The present research is based on quantitative (on-line survey for needs assessment) and qualitative (contents of e-mails exchanged in listserv discussion) data generated during the October 2009 Mentoring and Learning (M-L) Web discussion on the topic of data analysis. The data sources were shared e-mail responses during the process of planning and executing the M-L Web discussion. Content analysis was undertaken and the categories of discussion were presented as a simple non-hierarchical typology representing the collective learning of the project fellows. We identified the types of learning needs on the topic 'Analysis of Data' to be addressed for faculty development in the field of education research. This need-based M-L Web discussion could then facilitate collective learning on such topics as basic concepts in statistics, tests of significance, Likert scale analysis, bivariate correlation, simple regression analysis and content analysis of qualitative data. Steps such as identifying the learning needs for an on-line M-L Web discussion, addressing the immediate needs of learners and creating a flexible reflective learning environment on the M-L Web facilitated the collective learning of the fellows on the topic of data analysis. Our outcomes can be useful in the design of on-line pedagogical strategies for supporting research in medical education.

  3. GetData: A filesystem-based, column-oriented database format for time-ordered binary data

    NASA Astrophysics Data System (ADS)

    Wiebe, Donald V.; Netterfield, Calvin B.; Kisner, Theodore S.

    2015-12-01

    The GetData Project is the reference implementation of the Dirfile Standards, a filesystem-based, column-oriented database format for time-ordered binary data. Dirfiles provide a fast, simple format for storing and reading data, suitable for both quicklook and analysis pipelines. GetData provides a C API and bindings exist for various other languages. GetData is distributed under the terms of the GNU Lesser General Public License.
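    The layout behind a dirfile can be illustrated without the GetData library itself: each field lives in its own raw binary file inside a directory, described by a plain-text format file, so reading a slice of one field is just a seek. A minimal stdlib-only sketch of the idea (illustrative; the real Dirfile Standards and the GetData C API are considerably richer):

```python
import array
import os
import tempfile

d = tempfile.mkdtemp()

# One column file per field, plus a format file describing it.
samples = array.array("d", [0.1, 0.2, 0.3, 0.4])
with open(os.path.join(d, "detector1"), "wb") as f:
    samples.tofile(f)
with open(os.path.join(d, "format"), "w") as f:
    f.write("detector1 RAW FLOAT64 1\n")  # field, storage class, type, samples/frame

# Reading back a slice only requires seeking into the column file.
readback = array.array("d")
with open(os.path.join(d, "detector1"), "rb") as f:
    f.seek(2 * 8)  # skip the first two float64 samples
    readback.fromfile(f, 2)
```

    Because each field is an independent append-only file, quicklook tools can tail a growing dirfile while an acquisition pipeline is still writing it.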

  4. Automatic network coupling analysis for dynamical systems based on detailed kinetic models.

    PubMed

    Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich

    2005-10-01

    We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
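    The paper identifies dominant modes via singular value decomposition of sensitivity matrices; as a dependency-free stand-in, the same quantity (the numerical rank, i.e. the number of locally active modes) can be estimated by Gaussian elimination with pivoting. A toy sketch with an assumed rank-2 "sensitivity matrix":

```python
def numerical_rank(mat, tol=1e-9):
    """Number of numerically independent rows, i.e. the count of modes a
    truncated SVD would retain above the tolerance."""
    m = [row[:] for row in mat]
    rows, cols = len(m), len(m[0])
    rank = 0
    for col in range(cols):
        if rank == rows:
            break
        # Partial pivoting: pick the largest remaining entry in this column.
        pivot = max(range(rank, rows), key=lambda r: abs(m[r][col]))
        if abs(m[pivot][col]) < tol:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(rows):
            if r != rank:
                f = m[r][col] / m[rank][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# Toy 4-species x 3-parameter sensitivity matrix of rank 2:
# row 3 duplicates row 0, and row 1 equals 2*row0 + row2.
S = [[1.0, 0.5, 0.2],
     [2.3, 2.0, 0.4],
     [0.3, 1.0, 0.0],
     [1.0, 0.5, 0.2]]
```

    Here the minimal model dimension along this stretch of the trajectory would be two, even though four species are present.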

  5. Control volume based hydrocephalus research; analysis of human data

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms: qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
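    For reference, the integral conservation statements underlying control volume analysis take the standard form, for a control volume $CV$ with bounding surface $CS$, fluid density $\rho$, velocity $\mathbf{u}$ and outward unit normal $\mathbf{n}$:

```latex
\frac{d}{dt}\int_{CV}\rho\,dV \;+\; \oint_{CS}\rho\,(\mathbf{u}\cdot\mathbf{n})\,dA \;=\; 0
\qquad
\frac{d}{dt}\int_{CV}\rho\,\mathbf{u}\,dV \;+\; \oint_{CS}\rho\,\mathbf{u}\,(\mathbf{u}\cdot\mathbf{n})\,dA \;=\; \sum\mathbf{F}
```

    The surface flux terms are what MR velocity measurements supply directly; the force term $\sum\mathbf{F}$ is where pressure information enters.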

  6. Genetic Variation and Population Differentiation in a Medical Herb Houttuynia cordata in China Revealed by Inter-Simple Sequence Repeats (ISSRs)

    PubMed Central

    Wei, Lin; Wu, Xian-Jin

    2012-01-01

    Houttuynia cordata is an important traditional Chinese herb with unresolved genetics and taxonomy, which lead to potential problems in the conservation and utilization of the resource. Inter-simple sequence repeat (ISSR) markers were used to assess the level and distribution of genetic diversity in 226 individuals from 15 populations of H. cordata in China. ISSR analysis revealed low genetic variation within populations but high genetic differentiation among populations. This genetic structure probably mainly reflects the historical association among populations. Genetic cluster analysis showed that the basal clade is composed of populations from Southwest China, and the other populations have continuous and eastward distributions. The structure of genetic diversity in H. cordata suggests that this species might have survived in Southwest China during the glacial age and subsequently experienced an eastward postglacial expansion. Based on the results of the genetic analysis, it was proposed that as many targeted populations as possible be included for conservation. PMID:22942696

  8. Real-time detection of hazardous materials in air

    NASA Astrophysics Data System (ADS)

    Schechter, Israel; Schroeder, Hartmut; Kompa, Karl L.

    1994-03-01

    A new detection system has been developed for real-time analysis of organic compounds in ambient air. It is based on multiphoton ionization by an unfocused laser beam in a single parallel-plate device, so the ionization volume can be relatively large. The amount of laser-created ions is determined quantitatively from the induced total voltage drop between the biased plates (Q = ΔV·C). Mass information is obtained from computer analysis of the time-dependent signal. When a KrF laser (5 eV photons) is used, most organic compounds can be ionized in a two-photon process, but none of the standard components of atmospheric air are ionized by this process. Therefore, this instrument may be developed as a 'sniffer' for organic materials. The method has been applied to benzene analysis in air. The detection limit is about 10 ppb. With a simple preconcentration technique the detection limit can be decreased to the sub-ppb range. Simple binary mixtures are also resolved.

  9. Second Law of Thermodynamics Applied to Metabolic Networks

    NASA Technical Reports Server (NTRS)

    Nigam, R.; Liang, S.

    2003-01-01

    We present a simple algorithm based on linear programming that combines Kirchhoff's flux and potential laws and applies them to metabolic networks to predict thermodynamically feasible reaction fluxes. These laws represent the mass conservation and energy feasibility principles widely used in electrical circuit analysis. Formulating Kirchhoff's potential law around a reaction loop in terms of the null space of the stoichiometric matrix leads to a simple representation of the law of entropy that can be readily incorporated into traditional flux balance analysis without resorting to non-linear optimization. Our technique is new in that it can easily check fluxes obtained by flux balance analysis for thermodynamic feasibility and, if they are infeasible, modify them so that they satisfy the law of entropy. We illustrate our method by applying it to the network for the central metabolism of Escherichia coli. Due to its simplicity, this algorithm will be useful in studying large-scale complex metabolic networks in the cells of different organisms.
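    The loop (entropy) condition can be illustrated with a toy check, not the authors' LP formulation: because chemical potential drops sum to zero around any stoichiometric null-space loop, a flux pattern in which every participating reaction carries flux in the loop's direction would dissipate nothing and is thermodynamically infeasible.

```python
def loop_infeasible(loop, fluxes, tol=1e-9):
    """True if every reaction participating in a null-space loop carries
    flux consistently around the cycle -- a closed circulation with zero
    net potential drop, which the entropy condition forbids."""
    signed = [c * v for c, v in zip(loop, fluxes)
              if abs(c) > tol and abs(v) > tol]
    return bool(signed) and (all(s > 0 for s in signed)
                             or all(s < 0 for s in signed))

# Loop R1 + R2 - R3 = 0: a circulating flux pattern is flagged,
# while a net-throughput pattern is not.
infeasible = loop_infeasible([1, 1, -1], [2.0, 2.0, -1.0])
feasible_pattern = loop_infeasible([1, 1, -1], [2.0, 2.0, 3.0])
```

    The authors' method goes further, repairing infeasible flux distributions within the linear program rather than merely detecting them.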

  10. Simple radiative transfer model for relationships between canopy biomass and reflectance

    NASA Technical Reports Server (NTRS)

    Park, J. K.; Deering, D. W.

    1982-01-01

    A modified Kubelka-Munk model has been utilized to derive useful equations for the analysis of apparent canopy reflectance. Based on the solution to the model simple working equations were formulated by employing reflectance characteristic parameters. The relationships derived show the asymptotic nature of reflectance data that is typically observed in remote sensing studies of plant biomass. They also establish the range of expected apparent canopy reflectance values for specific plant canopy types. The usefulness of the simplified equations was demonstrated by the exceptionally close fit of the theoretical curves to two separately acquired data sets for alfalfa and shortgrass prairie canopies.
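    For context, the classical Kubelka-Munk remission function relates the diffuse reflectance $R_\infty$ of an optically thick layer to the absorption and scattering coefficients $K$ and $S$; the paper's modified model builds on this standard two-flux result:

```latex
\frac{K}{S} \;=\; \frac{(1 - R_\infty)^2}{2\,R_\infty}
```

    The asymptotic approach of canopy reflectance toward $R_\infty$ with increasing biomass is what produces the saturation behavior noted in the abstract.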

  11. A simple hydrodynamic model of tornado-like vortices

    NASA Astrophysics Data System (ADS)

    Kurgansky, M. V.

    2015-05-01

    Based on similarity arguments, a simple fluid dynamic model of tornado-like vortices is offered that, with account for "vortex breakdown" at a certain height above the ground, relates the maximal azimuthal velocity in the vortex, reachable near the ground surface, to the convective available potential energy (CAPE) stored in the environmental atmosphere under pre-tornado conditions. The relative proportion of the helicity (kinetic energy) destruction (dissipation) in the "vortex breakdown" zone and, accordingly, within the surface boundary layer beneath the vortex is evaluated. These considerations form the basis of the dynamic-statistical analysis of the relationship between the tornado intensity and the CAPE budget in the surrounding atmosphere.
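    A common parcel-theory scaling, shown here for illustration rather than as the paper's exact relation, bounds the achievable wind speed by the CAPE of the environment:

```python
from math import sqrt

def thermodynamic_speed_limit(cape_j_per_kg):
    """Parcel-theory bound v = sqrt(2 * CAPE), in m/s. Tornado models of
    this type typically relate the peak azimuthal wind near the surface
    to some fraction of this value."""
    return sqrt(2.0 * cape_j_per_kg)
```

    For example, a pre-storm CAPE of 1250 J/kg corresponds to a 50 m/s speed limit; the model's accounting for vortex breakdown and boundary-layer dissipation determines how much of that budget the near-surface azimuthal flow can realize.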

  12. Statistical Properties of Online Auctions

    NASA Astrophysics Data System (ADS)

    Namazi, Alireza; Schadschneider, Andreas

    We characterize the statistical properties of a large number of online auctions run on eBay. Both stationary and dynamic properties, like distributions of prices, number of bids etc., as well as relations between these quantities are studied. The analysis of the data reveals surprisingly simple distributions and relations, typically of power-law form. Based on these findings we introduce a simple method to identify suspicious auctions that could be influenced by a form of fraud known as shill bidding. Furthermore the influence of bidding strategies is discussed. The results indicate that the observed behavior is related to a mixture of agents using a variety of strategies.

  13. An analysis of life expectancy of airplane wings in normal cruising flight

    NASA Technical Reports Server (NTRS)

    Putnam, Abbott A

    1945-01-01

    In order to provide a basis for judging the relative importance of wing failure by fatigue and by single intense gusts, an analysis of wing life for normal cruising flight was made based on data on the frequency of atmospheric gusts. The independent variables considered in the analysis included stress-concentration factor, stress-load relation, wing loading, design and cruising speeds, design gust velocity, and airplane size. Several methods for estimating fatigue life from gust frequencies are discussed. The procedure selected for the analysis is believed to be simple and reasonably accurate, though slightly conservative.

  14. [Analysis of characteristics shown in self introduction letter and professor's recommendation letter].

    PubMed

    Kim, Sang Hyun

    2013-09-01

    The purpose of this study was to investigate applicants' behavioral characteristics based on the evaluation of the cognitive, affective and social domains shown in the self-introduction letter and the professor's recommendation letter. Self-introduction letters and professors' recommendation letters of 109 applicants to medical school were collected. Frequency analysis and simple correlation were performed on the self-introduction letters and professors' recommendation letters. Frequency analysis showed that affective characteristics were mentioned most often in the self-introduction letter, while cognitive characteristics were described most frequently in the professor's recommendation letter. There was a strong correlation between the cognitive domain of the self-introduction letter and the cognitive domain of the professor's recommendation letter, and between the affective domain of the self-introduction letter and the cognitive domain of the professor's recommendation letter. It is very important to make full use of the self-introduction letter and the professor's recommendation letter in selecting medical students. Based on the frequency analysis and simple correlation, more specific guidelines need to be suggested in order to secure fairness and objectivity in the evaluation of these documents.

  15. dc analysis and design of zero-voltage-switched multi-resonant converters

    NASA Astrophysics Data System (ADS)

    Tabisz, Wojciech A.; Lee, Fred C.

    Recently introduced multiresonant converters (MRCs) provide zero-voltage switching (ZVS) of both active and passive switches and offer a substantial reduction of transistor voltage stress and an increase of load range, compared to their quasi-resonant converter counterparts. Using the resonant switch concept, a simple, generalized analysis of ZVS MRCs is presented. The conversion ratio and voltage stress characteristics are derived for basic ZVS MRCs, including buck, boost, and buck/boost converters. Based on the analysis, a design procedure that optimizes the selection of resonant elements for maximum conversion efficiency is proposed.

  16. Control and prediction of the course of brewery fermentations by gravimetric analysis.

    PubMed

    Kosín, P; Savel, J; Broz, A; Sigler, K

    2008-01-01

    A simple, fast and cheap test suitable for predicting the course of brewery fermentations based on mass analysis is described and its efficiency is evaluated. Compared to commonly used yeast vitality tests, this analysis takes into account wort composition and other factors that influence fermentation performance. It can be used to predict the shape of the fermentation curve in brewery fermentations and in research and development projects concerning yeast vitality, fermentation conditions and wort composition. It can also be a useful tool for homebrewers to control their fermentations.

  17. Prediction of gas-liquid two-phase flow regime in microgravity

    NASA Technical Reports Server (NTRS)

    Lee, Jinho; Platt, Jonathan A.

    1993-01-01

    An attempt is made to predict gas-liquid two-phase flow regime in a pipe in a microgravity environment through scaling analysis based on dominant physical mechanisms. Simple inlet geometry is adopted in the analysis to see the effect of inlet configuration on flow regime transitions. Comparison of the prediction with the existing experimental data shows good agreement, though more work is required to better define some physical parameters. The analysis clarifies much of the physics involved in this problem and can be applied to other configurations.

  18. Container-Based Clinical Solutions for Portable and Reproducible Image Analysis.

    PubMed

    Matelsky, Jordan; Kiar, Gregory; Johnson, Erik; Rivera, Corban; Toma, Michael; Gray-Roncal, William

    2018-05-08

    Medical imaging analysis depends on the reproducibility of complex computation. Linux containers enable the abstraction, installation, and configuration of environments so that software can be both distributed in self-contained images and used repeatably by tool consumers. While several initiatives in neuroimaging have adopted approaches for creating and sharing more reliable scientific methods and findings, Linux containers are not yet mainstream in clinical settings. We explore related technologies and their efficacy in this setting, highlight important shortcomings, demonstrate a simple use-case, and endorse the use of Linux containers for medical image analysis.
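    As a hypothetical illustration (the file names, base image and script are assumptions, not from the paper), a minimal Dockerfile pinning a Python analysis tool's environment might look like:

```dockerfile
# Package an image-analysis script with its pinned dependencies so the
# identical environment runs on any machine with a container runtime.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY analyze.py .
ENTRYPOINT ["python", "analyze.py"]
```

    Building once (`docker build -t analysis .`) and distributing the resulting image is what gives the reproducibility the paper argues for: consumers run the tool without reinstalling or reconfiguring its software stack.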

  19. Nonlinear analysis of NPP safety against the aircraft attack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk; Králik, Juraj, E-mail: kralik@fa.stuba.sk

    The paper presents a nonlinear probabilistic analysis of the reinforced concrete buildings of a nuclear power plant under aircraft attack. The dynamic load is defined in time on the basis of airplane impact simulations considering the real stiffness, mass, direction and velocity of the flight. The dynamic response is calculated in ANSYS using the transient nonlinear analysis solution method. The damage to the concrete wall is evaluated in accordance with the NDRC standard, considering spalling, scabbing and perforation effects. The simple and detailed calculations of the wall damage are compared.

  20. Scaling of drizzle virga depth with cloud thickness for marine stratocumulus clouds

    DOE PAGES

    Yang, Fan; Luke, Edward P.; Kollias, Pavlos; ...

    2018-04-20

    Drizzle plays a crucial role in the cloud lifetime and radiation properties of marine stratocumulus clouds. Understanding where drizzle exists in the sub-cloud layer, which depends on drizzle virga depth, can help us better understand where below-cloud scavenging, evaporative cooling and moistening occur. In this study, we examine the statistical properties of drizzle frequency and virga depth of marine stratocumulus based on unique ground-based remote sensing data. Results show that marine stratocumulus clouds are drizzling nearly all the time. In addition, we derive a simple scaling analysis between drizzle virga thickness and cloud thickness. Our analytical expression agrees with the observational data reasonably well, which suggests that our formula provides a simple parameterization for drizzle virga of stratocumulus clouds suitable for use in models.

  1. Design of arbitrarily homogeneous permanent magnet systems for NMR and MRI: theory and experimental developments of a simple portable magnet.

    PubMed

    Hugon, Cedric; D'Amico, Francesca; Aubert, Guy; Sakellariou, Dimitris

    2010-07-01

    Starting from general results of magnetostatics, we give fundamental considerations on the design and characterization of permanent magnets for NMR based on harmonic analysis and symmetry. We then propose a simple geometry that takes advantage of some of these considerations and discuss the practical aspects of assembling a real magnet based on this geometry, involving the characterization of its elements, the optimization of the layout and the correction of residual inhomogeneities due to material and geometry imperfections. With this low-cost, lightweight magnet (100 euros and 1.8 kg including the aluminum frame) we report a field of 120 mT (5.1 MHz proton) with a 10 ppm natural homogeneity over a sphere 1.5 mm in diameter. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  2. A novel electrochemical sensor based on zirconia/ordered macroporous polyaniline for ultrasensitive detection of pesticides.

    PubMed

    Wang, Yonglan; Jin, Jun; Yuan, Caixia; Zhang, Fan; Ma, Linlin; Qin, Dongdong; Shan, Duoliang; Lu, Xiaoquan

    2015-01-21

    A simple and mild strategy was proposed to develop a novel electrochemical sensor based on zirconia/ordered macroporous polyaniline (ZrO2/OMP), which was further used for the detection of methyl parathion (MP), one of the organophosphate pesticides (OPPs). Due to the strong affinity of phosphate groups for ZrO2 and the advantages of OMP, such as high catalytic activity and good conductivity, the developed sensor showed a limit of detection as low as 2.28 × 10^-10 mol L^-1 (S/N = 3) by square-wave voltammetry, as well as good selectivity and acceptable reproducibility and stability. Most importantly, this novel sensor was successfully applied to detect MP in real samples of apple and cabbage. It is expected that this method has potential applications in electrochemical sensing platforms for simple, sensitive, selective and fast analysis.

  3. Virtual Solar Observatory Distributed Query Construction

    NASA Technical Reports Server (NTRS)

    Gurman, J. B.; Dimitoglou, G.; Bogart, R.; Davey, A.; Hill, F.; Martens, P.

    2003-01-01

    Through a prototype implementation (Tian et al., this meeting) the VSO has already demonstrated the capability of unifying geographically distributed data sources following the Web Services paradigm and utilizing mechanisms such as the Simple Object Access Protocol (SOAP). So far, four participating sites (Stanford, Montana State University, National Solar Observatory and the Solar Data Analysis Center) permit Web-accessible, time-based searches that allow browse access to a number of diverse data sets. Our latest work includes the extension of the simple, time-based queries to include numerous other searchable observation parameters. For VSO users, this extended functionality enables more refined searches. For the VSO, it is a proof of concept that more complex, distributed queries can be effectively constructed and that results from heterogeneous, remote sources can be synthesized and presented to users as a single, virtual data product.

  4. Exploring Magnetic Fields with a Compass

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon; Beichner, Robert

    2011-01-01

    A compass is an excellent classroom tool for the exploration of magnetic fields. Any student can tell you that a compass is used to determine which direction is north, but when paired with some basic trigonometry, the compass can be used to actually measure the strength of the magnetic field due to a nearby magnet or current-carrying wire. In this paper, we present a series of simple activities adapted from the Matter & Interactions textbook for doing just this. Interestingly, these simple measurements are comparable to predictions made by the Bohr model of the atom. Although antiquated, Bohr's atom can lead the way to a deeper analysis of the atomic properties of magnets. Although originally developed for an introductory calculus-based course, these activities can easily be adapted for use in an algebra-based class or even at the high school level.
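    The trigonometry involved is simple: if the unknown field at the compass is perpendicular to the horizontal component of Earth's field, the needle settles along the vector sum, so B = B_earth,h · tan θ. A small sketch assuming a typical mid-latitude horizontal component of about 20 µT (the exact value varies by location):

```python
from math import radians, tan

B_EARTH_H = 2.0e-5  # assumed horizontal component of Earth's field, tesla

def field_from_deflection(deflection_deg):
    """Field of the magnet or wire at the compass location, in tesla,
    inferred from the needle's deflection angle away from magnetic north."""
    return B_EARTH_H * tan(radians(deflection_deg))
```

    A 45-degree deflection means the unknown field equals the horizontal component of Earth's field; larger deflections indicate proportionally stronger fields via the tangent.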

  6. Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels.

    PubMed

    Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R

    2018-01-01

    Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for classifying squamous epithelium cervical intraepithelial neoplasia (CIN) into normal, CIN1, CIN2, and CIN3 grades. In this study, a deep learning (DL)-based nuclei segmentation approach is investigated that gathers localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods.
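    Simple linear iterative clustering (SLIC) groups pixels by a combined intensity-and-position distance. A dependency-free toy version of the assignment step is sketched below (illustrative only; real SLIC works in CIELAB space and iteratively updates the cluster centers):

```python
def slic_like_assign(image, seeds, m=1.0):
    """Assign each pixel of a 2D grayscale grid to the nearest seed under
    a SLIC-style distance: squared intensity difference plus m times the
    squared spatial distance. seeds is a list of (row, col) tuples."""
    labels = [[0] * len(image[0]) for _ in image]
    for r, row in enumerate(image):
        for c, val in enumerate(row):
            def dist(seed):
                sr, sc = seed
                d_color = (val - image[sr][sc]) ** 2
                d_space = (r - sr) ** 2 + (c - sc) ** 2
                return d_color + m * d_space
            labels[r][c] = min(range(len(seeds)), key=lambda i: dist(seeds[i]))
    return labels

# Toy image with a dark left half and a bright right half:
image = [[0, 0, 9, 9],
         [0, 0, 9, 9]]
labels = slic_like_assign(image, seeds=[(0, 0), (0, 3)])
```

    Each resulting superpixel is a compact, intensity-coherent patch; in the paper, such patches are the localized inputs fed to the convolutional network.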

  7. Incremental development and prototyping in current laboratory software development projects: Preliminary analysis

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann

    1988-01-01

    Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.

  8. Tear glucose detection combining microfluidic thread based device, amperometric biosensor and microflow injection analysis.

    PubMed

    Agustini, Deonir; Bergamini, Márcio F; Marcolino-Junior, Luiz Humberto

    2017-12-15

    Tear glucose analysis is an important alternative for the indirect, simple and less invasive monitoring of blood glucose levels. However, the high cost and complex manufacturing of tear glucose analyzers, combined with the need to exchange the sensor after each analysis in disposable tests, have prevented widespread use of tears in glucose monitoring. Here, we present the integration of a biosensor, made by the electropolymerization of poly(toluidine blue O) (PTB) and glucose oxidase (GOx), with an easily assembled electroanalytical microfluidic device based on cotton threads, low-cost materials and measurements by microflow injection analysis (µFIA) through passive pumping, for performing tear glucose analyses in a simple, rapid and inexpensive way. High stability between analyses (RSD = 2.54%) and among different systems (RSD = 3.13%) was obtained for the determination of glucose, in addition to a wide linear range between 0.075 and 7.5 mmol L⁻¹ and a limit of detection of 22.2 µmol L⁻¹. The proposed method was efficiently employed in the determination of tear glucose in non-diabetic volunteers, obtaining a close correlation with their blood glucose levels, simplifying and reducing the cost of the analyses and making tear glucose monitoring more accessible to the population. Copyright © 2017 Elsevier B.V. All rights reserved.
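    An amperometric biosensor like this one is read out through a linear calibration of current against concentration. The sketch below is hypothetical: the standard concentrations span the linear range reported in the record (0.075-7.5 mmol/L), but the current readings are made-up values, not data from the study.

    ```python
    # Hypothetical calibration of an amperometric glucose biosensor:
    # least-squares line of current vs. concentration, then inversion
    # to read an unknown sample.
    def linear_fit(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx

    conc = [0.075, 0.5, 1.0, 2.5, 5.0, 7.5]        # mmol/L standards
    current = [0.9, 6.1, 12.2, 30.4, 61.0, 91.5]   # uA, illustrative readings
    slope, intercept = linear_fit(conc, current)

    def to_concentration(i_sample):
        """Invert the calibration line: concentration from measured current."""
        return (i_sample - intercept) / slope
    ```

    With these illustrative numbers, a 30.4 uA reading maps back to roughly 2.5 mmol/L.
    
    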

  9. PCANet: A Simple Deep Learning Baseline for Image Classification?

    PubMed

    Chan, Tsung-Han; Jia, Kui; Gao, Shenghua; Lu, Jiwen; Zeng, Zinan; Ma, Yi

    2015-12-01

    In this paper, we propose a very simple deep learning network for image classification that is based on very basic data processing components: 1) cascaded principal component analysis (PCA); 2) binary hashing; and 3) blockwise histograms. In the proposed architecture, PCA is employed to learn multistage filter banks. This is followed by simple binary hashing and block histograms for indexing and pooling. This architecture is thus called the PCA network (PCANet) and can be designed and learned extremely easily and efficiently. For comparison and to provide a better understanding, we also introduce and study two simple variations of PCANet: 1) RandNet and 2) LDANet. They share the same topology as PCANet, but their cascaded filters are either randomly selected or learned from linear discriminant analysis. We have extensively tested these basic networks on many benchmark visual data sets for different tasks, including Labeled Faces in the Wild (LFW) for face verification; the MultiPIE, Extended Yale B, AR, and Facial Recognition Technology (FERET) data sets for face recognition; and MNIST for hand-written digit recognition. Surprisingly, for all tasks, such a seemingly naive PCANet model is on par with state-of-the-art features, whether prefixed, highly hand-crafted, or carefully learned [by deep neural networks (DNNs)]. Even more surprisingly, the model sets new records for many classification tasks on the Extended Yale B, AR, and FERET data sets and on MNIST variations. Additional experiments on other public data sets also demonstrate the potential of PCANet to serve as a simple but highly competitive baseline for texture classification and object recognition.
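    The core of one PCANet stage can be written down directly: collect all k x k patches, remove each patch's mean, and take the top principal components of the patch ensemble as convolution filters. This is a minimal numpy sketch of that single step, not the full multistage network with hashing and histograms; images and sizes are illustrative.

    ```python
    import numpy as np

    def pca_filters(images, k=7, n_filters=4):
        """One PCANet-style stage: top principal components of all k x k
        patches, reshaped into a convolution filter bank."""
        patches = []
        for img in images:
            h, w = img.shape
            for i in range(h - k + 1):
                for j in range(w - k + 1):
                    p = img[i:i + k, j:j + k].ravel()
                    patches.append(p - p.mean())   # patch-mean removal, as in PCANet
        X = np.array(patches)
        # Eigenvectors of the patch covariance matrix form the filter bank.
        cov = X.T @ X / len(X)
        vals, vecs = np.linalg.eigh(cov)           # ascending eigenvalues
        order = np.argsort(vals)[::-1][:n_filters] # keep the largest ones
        return vecs[:, order].T.reshape(n_filters, k, k)

    rng = np.random.default_rng(1)
    imgs = rng.random((5, 16, 16))     # stand-in training images
    filters = pca_filters(imgs)
    ```

    In the full architecture these filters are convolved with the input, a second PCA stage repeats the process, and the responses are binarized and pooled into blockwise histograms.
    
    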

  10. Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours

    PubMed Central

    Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran

    2017-01-01

    Introduction IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the common cancers in women and is diagnosed at a late stage in the majority of cases. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining morphological features of ovarian masses through a standardized examination technique. Aim To evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours and to establish their use as a tool in the early diagnosis of ovarian malignancy. Materials and Methods A hospital-based case control prospective study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant. Findings were correlated with histopathological findings. The collected data were statistically analysed using the chi-square test and the kappa statistic. Results Of the initial 55 patients, the 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). In cases where the rules were applicable, the sensitivity for the detection of malignancy was 91.66% and the specificity was 84.84%; accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80% respectively. A high level of agreement was found between USG and histopathological diagnosis, with a kappa value of 0.323. Conclusion The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to train and use. PMID:28969237
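    The reported diagnostic metrics follow directly from a 2 x 2 confusion matrix. The counts below are hypothetical, chosen only because they are consistent with the abstract's figures for the 45 applicable cases (they reproduce 91.66% / 84.84% / 86.66% up to rounding); the study itself does not list them.

    ```python
    # Hypothetical confusion-matrix counts consistent with the reported
    # figures over the 45 applicable cases: 11 true positives, 1 false
    # negative, 28 true negatives, 5 false positives.
    tp, fn, tn, fp = 11, 1, 28, 5

    sensitivity = tp / (tp + fn)                  # malignant correctly flagged
    specificity = tn / (tn + fp)                  # benign correctly cleared
    accuracy = (tp + tn) / (tp + fn + tn + fp)    # overall agreement
    ```

    Computing the three quantities from explicit counts like this makes it easy to check that reported sensitivity, specificity and accuracy are mutually consistent.
    
    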

  11. A Finite Element Analysis for Predicting the Residual Compressive Strength of Impact-Damaged Sandwich Panels

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.; Jackson, Wade C.

    2008-01-01

    A simple analysis method has been developed for predicting the residual compressive strength of impact-damaged sandwich panels. The method is tailored for honeycomb core-based sandwich specimens that exhibit an indentation growth failure mode under axial compressive loading, which is driven largely by the crushing behavior of the core material. The analysis method is in the form of a finite element model, where the impact-damaged facesheet is represented using shell elements and the core material is represented using spring elements, aligned in the thickness direction of the core. The nonlinear crush response of the core material used in the analysis is based on data from flatwise compression tests. A comparison with a previous analysis method and some experimental data shows good agreement with results from this new approach.
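    The spring-element idea in this model reduces, per spring, to looking up a force from a tabulated crush curve. The sketch below is a minimal illustration: the displacement-force table stands in for (hypothetical) flatwise compression test data with the typical peak/plateau/densification shape, not values from the study, and each through-thickness core spring interpolates it.

    ```python
    import numpy as np

    # Illustrative flatwise-compression crush curve for the honeycomb core:
    # initial rise to a peak, a crush plateau, then densification.
    crush_disp = np.array([0.0, 0.1, 0.5, 2.0, 4.0])        # mm
    crush_force = np.array([0.0, 50.0, 60.0, 65.0, 200.0])  # N

    def core_spring_force(u):
        """Force carried by one core spring at crush displacement u (mm),
        by linear interpolation of the tabulated test data."""
        return np.interp(u, crush_disp, crush_force)
    ```

    In the finite element model, one such nonlinear spring sits under each facesheet shell node, so indentation growth emerges from the plateau in the crush curve.
    
    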

  12. A Finite Element Analysis for Predicting the Residual Compression Strength of Impact-Damaged Sandwich Panels

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.; Jackson, Wade C.

    2008-01-01

    A simple analysis method has been developed for predicting the residual compression strength of impact-damaged sandwich panels. The method is tailored for honeycomb core-based sandwich specimens that exhibit an indentation growth failure mode under axial compression loading, which is driven largely by the crushing behavior of the core material. The analysis method is in the form of a finite element model, where the impact-damaged facesheet is represented using shell elements and the core material is represented using spring elements, aligned in the thickness direction of the core. The nonlinear crush response of the core material used in the analysis is based on data from flatwise compression tests. A comparison with a previous analysis method and some experimental data shows good agreement with results from this new approach.

  13. Model-based gene set analysis for Bioconductor.

    PubMed

    Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien

    2011-07-01

    Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA) that substantially reduces the number of redundant categories returned by gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic License 2.0. Contact: peter.robinson@charite.de; julien.gagneur@embl.de.
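    The single-category baseline that MGSA improves on — Fisher's exact test for enrichment — is easy to state explicitly: under the null, the overlap between a study set and a category follows a hypergeometric distribution. This stdlib-only sketch computes the one-sided tail probability; the gene counts at the bottom are illustrative, not from any real annotation.

    ```python
    from math import comb

    def enrichment_pvalue(n_genes, n_in_category, n_study, n_overlap):
        """One-sided Fisher's exact (hypergeometric tail) test:
        probability of seeing >= n_overlap study genes in the category
        by chance."""
        total = comb(n_genes, n_study)
        upper = min(n_in_category, n_study)
        p = 0.0
        for k in range(n_overlap, upper + 1):
            p += (comb(n_in_category, k)
                  * comb(n_genes - n_in_category, n_study - k)) / total
        return p

    # Illustrative numbers: 20 of 50 study genes fall in a 200-gene
    # category drawn from a 10000-gene genome (expected overlap: 1).
    p = enrichment_pvalue(10000, 200, 50, 20)
    ```

    Running this test independently per category is exactly what produces the cascades of redundant significant categories that MGSA's joint probabilistic model avoids.
    
    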

  14. Getting started with Open-Hardware: Development and Control of Microfluidic Devices

    PubMed Central

    da Costa, Eric Tavares; Mora, Maria F.; Willis, Peter A.; do Lago, Claudimir L.; Jiao, Hong; Garcia, Carlos D.

    2014-01-01

    Understanding basic concepts of electronics and computer programming allows researchers to get the most out of the equipment found in their laboratories. Although a number of platforms have been specifically designed for the general public and are supported by a vast array of on-line tutorials, this subject is not normally included in university chemistry curricula. Aiming to provide the basic concepts of hardware and software, this article is focused on the design and use of a simple module to control a series of PDMS-based valves. The module is based on a low-cost microprocessor (Teensy) and open-source software (Arduino). The microvalves were fabricated using thin sheets of PDMS and patterned using CO2 laser engraving, providing a simple and efficient way to fabricate devices without the traditional photolithographic process or facilities. Synchronization of valve control enabled the development of two simple devices to perform injection (1.6 ± 0.4 μL/stroke) and mixing of different solutions. Furthermore, a practical demonstration of the utility of this system for microscale chemical sample handling and analysis was achieved performing an on-chip acid-base titration, followed by conductivity detection with an open-source low-cost detection system. Overall, the system provided a very reproducible (98%) platform to perform fluid delivery at the microfluidic scale. PMID:24823494

  15. A paper-based cantilever array sensor: Monitoring volatile organic compounds with naked eye.

    PubMed

    Fraiwan, Arwa; Lee, Hankeun; Choi, Seokheun

    2016-09-01

    Volatile organic compound (VOC) detection is critical for controlling industrial and commercial emissions, environmental monitoring, and public health. Simple, portable, rapid and low-cost VOC sensing platforms offer the benefits of on-site and real-time monitoring anytime and anywhere. The best and most practically useful approaches to monitoring would include equipment-free and power-free detection by the naked eye. In this work, we created a novel, paper-based cantilever sensor array that allows simple and rapid naked-eye VOC detection without the need for power, electronics or readout interface/equipment. This simple VOC detection method was achieved using (i) low-cost paper materials as a substrate and (ii) swellable thin polymers adhered to the paper. Upon exposure to VOCs, the polymer adhered to each paper-based cantilever swelled, inducing a mechanical deflection; the deflection angles across the array generated a distinctive composite pattern for a specific VOC. The angle is read directly by the naked eye on a 3-D protractor printed on a paper facing the cantilevers. The generated angle patterns are subjected to a statistical algorithm (linear discriminant analysis (LDA)) to classify each VOC sample and selectively detect a VOC. We classified four VOC samples with 100% accuracy using LDA. Copyright © 2016 Elsevier B.V. All rights reserved.
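    The LDA classification step can be sketched with a two-class Fisher discriminant. The "deflection-angle patterns" below are synthetic vectors drawn around two made-up class means, standing in for two hypothetical VOCs; the study's actual four-class data and discriminant axes are not reproduced here.

    ```python
    import numpy as np

    def lda_fit(X, y):
        """Two-class Fisher discriminant: w = Sw^-1 (m1 - m0)."""
        m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
        # Within-class scatter matrix, pooled over both classes.
        Sw = sum(np.outer(r, r) for c, m in ((0, m0), (1, m1))
                 for r in X[y == c] - m)
        w = np.linalg.solve(Sw, m1 - m0)
        b = -w @ (m0 + m1) / 2        # threshold midway between class means
        return w, b

    def lda_predict(X, w, b):
        return (X @ w + b > 0).astype(int)

    # Synthetic 3-cantilever deflection-angle patterns for two fictional VOCs.
    rng = np.random.default_rng(2)
    X0 = rng.normal([10.0, 20.0, 15.0], 1.0, (20, 3))
    X1 = rng.normal([14.0, 18.0, 22.0], 1.0, (20, 3))
    X = np.vstack([X0, X1])
    y = np.array([0] * 20 + [1] * 20)
    w, b = lda_fit(X, y)
    acc = (lda_predict(X, w, b) == y).mean()
    ```

    Because each VOC produces a characteristic angle pattern across the array, well-separated class means make the discriminant highly accurate, consistent with the 100% accuracy reported for four VOCs.
    
    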

  16. Architecture with GIDEON, A Program for Design in Structural DNA Nanotechnology

    PubMed Central

    Birac, Jeffrey J.; Sherman, William B.; Kopatsch, Jens; Constantinou, Pamela E.; Seeman, Nadrian C.

    2012-01-01

    We present geometry-based design strategies for DNA nanostructures. The strategies have been implemented with GIDEON – a Graphical Integrated Development Environment for OligoNucleotides. GIDEON has a highly flexible graphical user interface that facilitates the development of simple yet precise models and the evaluation of strains therein. Models are built from a simple representation of undistorted B-DNA double-helical domains. Simple point-and-click manipulations of the model allow the minimization of strain in the phosphate-backbone linkages between these domains and the identification of any steric clashes that might result. Detailed analysis of 3D triangles yields clear predictions of the strains associated with triangles of different sizes. We have carried out experiments confirming that 3D triangles form well only when their geometrical strain is less than 4% deviation from the estimated relaxed structure. Thus geometry-based techniques alone, without energetic considerations, can be used to explain general trends in DNA structure formation. We have used GIDEON to build detailed models of double crossover and triple crossover molecules, evaluating the non-planarity associated with base tilt and junction misalignments. Computer modeling using a graphical user interface overcomes the limited precision of physical models for larger systems, and the limited interaction rate associated with earlier, command-line driven software. PMID:16630733

  17. Development of a rapid, simple assay of plasma total carotenoids

    PubMed Central

    2012-01-01

    Background Plasma total carotenoids can be used as an indicator of risk of chronic disease. Laboratory analysis of individual carotenoids by high performance liquid chromatography (HPLC) is time consuming, expensive, and not amenable to use beyond a research laboratory. The aim of this research is to establish a rapid, simple, and inexpensive spectrophotometric assay of plasma total carotenoids that has a very strong correlation with HPLC carotenoid profile analysis. Results Plasma total carotenoids from 29 volunteers ranged in concentration from 1.2 to 7.4 μM, as analyzed by HPLC. A linear correlation was found between the absorbance at 448 nm of an alcohol / heptane extract of the plasma and plasma total carotenoids analyzed by HPLC, with a Pearson correlation coefficient of 0.989. The average coefficient of variation for the spectrophotometric assay was 6.5% for the plasma samples. The limit of detection was about 0.3 μM and was linear up to about 34 μM without dilution. Correlations between the integrals of the absorption spectra in the range of carotenoid absorption and total plasma carotenoid concentration gave similar results to the absorbance correlation. Spectrophotometric assay results also agreed with the calculated expected absorbance based on published extinction coefficients for the individual carotenoids, with a Pearson correlation coefficient of 0.988. Conclusion The spectrophotometric assay of total carotenoids strongly correlated with HPLC analysis of carotenoids of the same plasma samples and expected absorbance values based on extinction coefficients. This rapid, simple, inexpensive assay, when coupled with the carotenoid health index, may be useful for nutrition intervention studies, population cohort studies, and public health interventions. PMID:23006902
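    The spectrophotometric assay rests on the Beer-Lambert law: concentration is absorbance divided by the product of extinction coefficient and path length, scaled by the extract dilution. The sketch below is illustrative; the extinction coefficient is an assumed, typical carotenoid value at ~450 nm, not one reported in this record, and the sample absorbance is made up.

    ```python
    # Beer-Lambert estimate of total carotenoid concentration from the
    # absorbance of the alcohol/heptane extract at 448 nm.
    def carotenoid_conc_uM(absorbance, dilution,
                           eps_per_M_cm=140000.0,   # assumed typical value
                           path_cm=1.0):
        """c = A / (eps * l), corrected for dilution, in micromolar."""
        return absorbance / (eps_per_M_cm * path_cm) * dilution * 1e6

    # Illustrative reading: A = 0.35 on a 2x-diluted extract.
    c = carotenoid_conc_uM(0.35, dilution=2.0)
    ```

    With these assumed numbers the estimate lands at 5.0 uM, inside the 1.2-7.4 uM range the study observed across volunteers.
    
    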

  18. High-resolution melting analysis for bird sexing: a successful approach to molecular sex identification using different biological samples.

    PubMed

    Morinha, Francisco; Travassos, Paulo; Seixas, Fernanda; Santos, Nuno; Sargo, Roberto; Sousa, Luís; Magalhães, Paula; Cabral, João A; Bastos, Estela

    2013-05-01

    High-resolution melting (HRM) analysis is a very attractive and flexible advanced post-PCR method with high sensitivity/specificity for simple, fast and cost-effective genotyping based on the detection of specific melting profiles of PCR products. Next generation real-time PCR systems, along with improved saturating DNA-binding dyes, enable the direct acquisition of HRM data after quantitative PCR. Melting behaviour is particularly influenced by the length, nucleotide sequence and GC content of the amplicons. This method is expanding rapidly in several research areas such as human genetics, reproductive biology, microbiology and ecology/conservation of wild populations. Here we have developed a successful HRM protocol for avian sex identification based on the amplification of sex-specific CHD1 fragments. The melting curve patterns allowed efficient sexual differentiation of 111 samples analysed (plucked feathers, muscle tissues, blood and oral cavity epithelial cells) of 14 bird species. In addition, we sequenced the amplified regions of the CHD1 gene and demonstrated the usefulness of this strategy for the genotype discrimination of various amplicons (CHD1Z and CHD1W), which have small size differences, ranging from 2 bp to 44 bp. The established methodology clearly revealed the advantages (e.g. closed-tube system, high sensitivity and rapidity) of a simple HRM assay for accurate sex differentiation of the species under study. The requirements, strengths and limitations of the method are addressed to provide a simple guide for its application in the field of molecular sexing of birds. The high sensitivity and resolution relative to previous real-time PCR methods makes HRM analysis an excellent approach for improving advanced molecular methods for bird sexing. © 2013 Blackwell Publishing Ltd.
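    The abstract notes that melting behaviour depends on amplicon length, sequence and GC content; the simplest quantitative handle on that is GC fraction plus a rough melting-temperature rule. This stdlib sketch uses the Wallace rule (valid only for very short oligos, far shorter than real CHD1 amplicons) purely to illustrate the dependence; the sequence is made up, not a CHD1 fragment.

    ```python
    def gc_content(seq):
        """Fraction of G and C bases in the sequence."""
        return (seq.count("G") + seq.count("C")) / len(seq)

    def wallace_tm(seq):
        """Wallace rule: Tm = 2(A+T) + 4(G+C) degrees C.
        A rough estimate, applicable only to primers under ~14 nt."""
        at = seq.count("A") + seq.count("T")
        gc = seq.count("G") + seq.count("C")
        return 2 * at + 4 * gc

    seq = "ATGCGCTAAGCT"    # illustrative 12-mer, not a CHD1 sequence
    ```

    GC pairs contribute twice the increment of AT pairs, which is why CHD1Z and CHD1W amplicons of similar length but different composition still yield distinguishable melting profiles.
    
    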

  19. Reliability assessment of different plate theories for elastic wave propagation analysis in functionally graded plates.

    PubMed

    Mehrkash, Milad; Azhari, Mojtaba; Mirdamadi, Hamid Reza

    2014-01-01

    The importance of the elastic wave propagation problem in plates arises from the application of ultrasonic elastic waves in the non-destructive evaluation of plate-like structures. However, precise study and analysis of acoustic guided waves, especially in non-homogeneous waveguides such as functionally graded plates, is so complicated that exact elastodynamic methods are rarely employed in practical applications. Thus, simple approximate plate theories have attracted much interest for the calculation of wave fields in FGM plates. In the current research, therefore, the classical plate theory (CPT), first-order shear deformation theory (FSDT) and third-order shear deformation theory (TSDT) are used to obtain the transient responses of flexural waves in FGM plates subjected to transverse impulsive loadings. Moreover, by comparing the results with those based on a well-recognized hybrid numerical method (HNM), we examine the accuracy of the plate theories for several plates of various thicknesses under excitations of different frequencies. The material properties of the plate are assumed to vary across the plate thickness according to a simple power-law distribution in terms of the volume fractions of the constituents. In all analyses, the spatial Fourier transform together with modal analysis is applied to compute displacement responses of the plates. A comparison of the results demonstrates the reliability ranges of the approximate plate theories for elastic wave propagation analysis in FGM plates. Furthermore, based on various examples, it is shown that whenever the plate theories are used within the appropriate ranges of plate thickness and frequency content, the solution process in the wave number-time domain based on the modal analysis approach is not only sufficient but also efficient for finding transient waveforms in FGM plates. Copyright © 2013 Elsevier B.V. All rights reserved.
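    The simplest of the three theories, CPT, already shows why frequency limits matter. For a homogeneous Kirchhoff plate the flexural dispersion relation is omega = k^2 * sqrt(D / (rho h)), so the phase velocity grows like sqrt(omega) without bound; real plates (and FSDT/TSDT) cap this through shear deformation, which is why CPT degrades at high frequencies and small wavelength-to-thickness ratios. The homogeneous aluminium-like properties below are illustrative stand-ins, not an FGM profile.

    ```python
    from math import sqrt, pi

    # Classical (Kirchhoff) plate theory flexural dispersion:
    #   omega = k^2 * sqrt(D / (rho * h))  =>  c_p = sqrt(omega) * (D/(rho*h))^(1/4)
    E, nu, rho, h = 70e9, 0.33, 2700.0, 0.002   # aluminium-like, 2 mm thick
    D = E * h**3 / (12 * (1 - nu**2))           # flexural rigidity

    def phase_velocity(freq_hz):
        """CPT phase velocity of flexural waves at a given frequency."""
        omega = 2 * pi * freq_hz
        return sqrt(omega) * (D / (rho * h)) ** 0.25
    ```

    Quadrupling the frequency doubles the CPT phase velocity exactly — an unbounded growth that flags where the approximate theory stops being reliable.
    
    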

  20. Study on the Absorbed Fingerprint-Efficacy of Yuanhu Zhitong Tablet Based on Chemical Analysis, Vasorelaxation Evaluation and Data Mining

    PubMed Central

    Chen, Yanjun; Zhang, Yingchun; Tang, Shihuan; Wang, Shanshan; Shen, Dan; Wang, Xuguang; Lei, Yun; Li, Defeng; Zhang, Yi; Jin, Lan; Yang, Hongjun; Huang, Luqi

    2013-01-01

    Yuanhu Zhitong Tablet (YZT) is an example of a typical and relatively simple clinical herbal formula that is widely used in clinics. It is generally believed that YZT exerts its therapeutic effect in vivo through the synergism of multiple constituents. Thus, it is necessary to establish the relationship between the absorbed fingerprints and bioactivity so as to ensure quality, safety and efficacy. In this study, a new combinative method, an intestinal absorption test coupled with an in vitro vasorelaxation bioactivity experiment, provided a simple, sensitive, and feasible technique for studying the absorbed fingerprint-efficacy relationship of YZT based on chemical analysis, vasorelaxation evaluation and data mining. As part of this method, an everted intestinal sac experiment was performed to determine the intestinal absorption of YZT solutions. YZT was dissolved in solution (n = 12), and the portion of the solution absorbed into the intestinal sacs was analyzed using rapid-resolution liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (RRLC-Q-TOF/MS). Semi-quantitative analysis indicated the presence of 34 compounds. The effect of the intestinally absorbed solution on vasorelaxation of endothelium-intact rat aortic rings was then evaluated in vitro. The results showed that samples grouped together by hierarchical cluster analysis (HCA) of their chemical profiles had similar bioactivity, while samples in different groups differed markedly. Moreover, a relationship between the absorbed fingerprints and bioactivity was established to identify important components by grey relational analysis, which could predict bioactivity values from chemical profiles and provide evidence for the quantification of multiple constituents. PMID:24339904

  1. Triacylglycerol "hand-shape profile" of Argan oil. Rapid and simple UHPLC-PDA-ESI-TOF/MS and HPTLC methods to detect counterfeit Argan oil and Argan-oil-based products.

    PubMed

    Pagliuca, Giordana; Bozzi, Carlotta; Gallo, Francesca Romana; Multari, Giuseppina; Palazzino, Giovanna; Porrà, Rita; Panusa, Alessia

    2018-02-20

    The marketing of new argan-based products has greatly increased in the last few years, and consequently so has the number of control analyses aimed at detecting counterfeit products claiming argan oil as a major ingredient. Argan oil is produced in Morocco and is quite expensive. Two simple methods for the rapid screening of pure argan oil and argan-oil-based products, focused on analysis of the triacylglycerol profile, have been developed. A three-minute run by UHPLC-PDA allows the identification of pure argan oil, while the same run with the MS detector also allows the analysis of products containing the oil down to 0.03%. By HPTLC, on the other hand, the simultaneous analysis of twenty samples containing argan oil down to 0.5% can be carried out in a forty-five-minute run. The triglyceride profiles of the most common vegetable fats, such as almond, coconut, linseed, wheat germ, sunflower, peanut, olive, soybean, rapeseed and hemp oils, as well as shea butter, used either in cosmetics or commonly added to counterfeit argan oil, have also been investigated. Over sixty products with different formulations and uses have been successfully analyzed, and argan oil in the 2.4-0.06% concentration range has been quantified. The methods are suitable either for rapid screening or for quantifying argan oil in different formulations. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. DeMix Workflow for Efficient Identification of Cofragmented Peptides in High Resolution Data-dependent Tandem Mass Spectrometry*

    PubMed Central

    Zhang, Bo; Pirmoradian, Mohammad; Chernobrovkin, Alexey; Zubarev, Roman A.

    2014-01-01

    Based on the conventional data-dependent acquisition strategy of shotgun proteomics, we present a new workflow, DeMix, which significantly increases the efficiency of peptide identification for in-depth shotgun analysis of complex proteomes. Capitalizing on the high resolution and mass accuracy of Orbitrap-based tandem mass spectrometry, we developed a simple deconvolution method of "cloning" chimeric tandem spectra for cofragmented peptides. In addition to a database search, a simple rescoring scheme utilizes mass accuracy and converts the unwanted cofragmentation events into a surprising advantage of multiplexing. With the combination of cloning and rescoring, we obtained on average nine peptide-spectrum matches per second on a Q-Exactive workbench, whereas the actual MS/MS acquisition rate was close to seven spectra per second. This efficiency boost to 1.24 identified peptides per MS/MS spectrum enabled the analysis of over 5000 human proteins in single-dimensional LC-MS/MS shotgun experiments with only a two-hour gradient. These findings suggest a change in the dominant "one MS/MS spectrum - one peptide" paradigm for data acquisition and analysis in shotgun data-dependent proteomics. DeMix also demonstrated higher robustness than conventional approaches in terms of lower variation among the results of consecutive LC-MS/MS runs. PMID:25100859

  3. Optimal interpolation analysis of leaf area index using MODIS data

    USGS Publications Warehouse

    Gu, Yingxin; Belair, Stephane; Mahfouf, Jean-Francois; Deblonde, Godelieve

    2006-01-01

    A simple data analysis technique for vegetation leaf area index (LAI) using Moderate Resolution Imaging Spectroradiometer (MODIS) data is presented. The objective is to generate LAI data that is appropriate for numerical weather prediction. A series of techniques and procedures which includes data quality control, time-series data smoothing, and simple data analysis is applied. The LAI analysis is an optimal combination of the MODIS observations and derived climatology, depending on their associated errors σo and σc. The “best estimate” LAI is derived from a simple three-point smoothing technique combined with a selection of maximum LAI (after data quality control) values to ensure a higher quality. The LAI climatology is a time smoothed mean value of the “best estimate” LAI during the years of 2002–2004. The observation error is obtained by comparing the MODIS observed LAI with the “best estimate” of the LAI, and the climatological error is obtained by comparing the “best estimate” of LAI with the climatological LAI value. The LAI analysis is the result of a weighting between these two errors. Demonstration of the method described in this paper is presented for the 15-km grid of Meteorological Service of Canada (MSC)'s regional version of the numerical weather prediction model. The final LAI analyses have a relatively smooth temporal evolution, which makes them more appropriate for environmental prediction than the original MODIS LAI observation data. They are also more realistic than the LAI data currently used operationally at the MSC which is based on land-cover databases.
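    For a single grid point, the optimal combination described above reduces to inverse-error-variance weighting of the observation and the climatology. The sketch below illustrates that weighting plus a plain three-point running mean standing in for the smoothing step; the exact smoothing used in the study (combined with maximum-LAI selection) is not specified here, and all numbers are illustrative, not MODIS values.

    ```python
    def oi_analysis(lai_obs, lai_clim, sigma_o, sigma_c):
        """Optimal scalar combination of observation and climatology:
        weights are the inverse error variances."""
        wo, wc = 1.0 / sigma_o**2, 1.0 / sigma_c**2
        return (wo * lai_obs + wc * lai_clim) / (wo + wc)

    def smooth3(series):
        """Three-point running mean, endpoints kept (an assumed stand-in
        for the study's smoothing step)."""
        out = list(series)
        for i in range(1, len(series) - 1):
            out[i] = (series[i - 1] + series[i] + series[i + 1]) / 3.0
        return out

    # Illustrative values: a noisy observation vs. a trusted climatology.
    lai = oi_analysis(lai_obs=3.2, lai_clim=2.6, sigma_o=0.5, sigma_c=0.25)
    ```

    With sigma_c half of sigma_o, the climatology receives four times the weight, pulling the analysis toward 2.6 — exactly the behaviour that gives the final analyses their smooth temporal evolution.
    
    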

  4. A simple method for the extraction and identification of light density microplastics from soil.

    PubMed

    Zhang, Shaoliang; Yang, Xiaomei; Gertsen, Hennie; Peters, Piet; Salánki, Tamás; Geissen, Violette

    2018-03-01

    This article introduces a simple and cost-saving method developed to extract, distinguish and quantify light density microplastics of polyethylene (PE) and polypropylene (PP) in soil. A flotation method using distilled water was used to extract the light density microplastics from soil samples. Microplastics and impurities were identified using a heating method (3-5 s at 130°C). The number and size of particles were determined using a camera (Leica DFC 425) connected to a microscope (Leica Wild M3C, Type S, simple light, 6.4×). Quantification of the microplastics was conducted using a developed model. Results showed that the flotation method was effective in extracting microplastics from soils, with recovery rates of approximately 90%. After being exposed to heat, the microplastics in the soil samples melted and were transformed into circular transparent particles, while other impurities, such as organic matter and silicates, were not changed by the heat. Regression analysis of microplastic weight against particle volume (calculated based on ImageJ software analysis) after heating showed the best fit (y = 1.14x + 0.46, R² = 99%, p < 0.001). Recovery rates based on the empirical model method were >80%. Results from field samples collected from north-western China show that our method of repetitive flotation and heating can be used to extract, distinguish and quantify light density polyethylene microplastics in soils. Microplastics mass can be evaluated using the empirical model. Copyright © 2017 Elsevier B.V. All rights reserved.
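    The reported regression, y = 1.14x + 0.46, maps the image-derived particle volume to microplastic mass, and the recovery rate then compares estimated to spiked mass. The coefficients below are the ones quoted in the record, but the units of x and y are not stated there, and the sample volume and spiked mass are made-up values for illustration.

    ```python
    def mass_from_volume(volume):
        """Empirical model from the study: mass = 1.14 * volume + 0.46
        (units as in the original regression, not specified in the record)."""
        return 1.14 * volume + 0.46

    def recovery_rate(recovered, spiked):
        """Percentage of the spiked microplastic mass recovered."""
        return 100.0 * recovered / spiked

    est = mass_from_volume(2.0)        # illustrative particle-volume reading
    rate = recovery_rate(est, 3.2)     # illustrative spiked amount
    ```

    With these made-up inputs the recovery lands in the >80% band the study reports for the model-based method.
    
    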

  5. Influence of Thermal Contact Resistance of Aluminum Foams in Forced Convection: Experimental Analysis

    PubMed Central

    Venettacci, Simone

    2017-01-01

    In this paper, the heat transfer performance of aluminum metal foams placed on a horizontal plane surface was evaluated under forced convection conditions. Three different types of contact between the sample and the heated base plate were investigated: simple contact, brazed contact and grease paste contact. First, an ad hoc experimental set-up was built to perform the study. Second, the value of the thermal contact resistance was estimated. The results show that both the use of a conductive paste and the brazed contact, realized by means of a copper electro-deposition, allow a great reduction of the global thermal resistance, increasing the global heat transfer coefficient by almost 80% compared to the simple contact case. Finally, it was shown that, while the contribution of the thermal contact resistance is negligible for the brazed and grease paste contacts, it is significantly high for simple contact. PMID:28783052
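    The arithmetic behind the "almost 80%" figure is a series-resistance argument: the contact resistance adds to the foam's own thermal resistance, and the global heat transfer coefficient is the reciprocal of the total resistance per unit area. All resistance and area values below are illustrative, chosen only so the improvement lands near the reported figure; they are not measurements from the study.

    ```python
    # Series thermal resistances: global h = 1 / (A * (R_contact + R_foam)).
    def global_htc(r_contact, r_foam, area):
        """Global heat transfer coefficient in W/(m^2 K); resistances in K/W."""
        return 1.0 / (area * (r_contact + r_foam))

    area = 0.01                               # m^2 sample footprint (illustrative)
    h_simple = global_htc(0.85, 1.0, area)    # simple contact: large R_contact
    h_brazed = global_htc(0.03, 1.0, area)    # brazed: R_contact nearly removed

    gain = (h_brazed - h_simple) / h_simple   # fractional improvement
    ```

    Nearly eliminating the contact term while the foam resistance stays fixed yields roughly an 80% gain, matching the qualitative conclusion that the contact resistance dominates only in the simple contact case.
    
    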

  6. CE microchips: an opened gate to food analysis.

    PubMed

    Escarpa, Alberto; González, María Cristina; Crevillén, Agustín González; Blasco, Antonio Javier

    2007-03-01

    CE microchips are the first generation of micrototal analysis systems (µTAS) emerging in the miniaturization of food analysis. CE microchips for food analysis are fabricated in both glass and polymer materials, such as PDMS and poly(methyl methacrylate) (PMMA), and use simple layouts with single and double T crosses. Nowadays, the preferred detection route is electrochemical, in both amperometric and conductivity modes, using end-channel and contactless configurations, respectively. Food applications using CE microchips are only now emerging, since food samples present complex matrices; selectivity is a very important challenge because the total integration of analytical steps into the microchip format is very difficult. As a consequence, the first contributions that have recently appeared in the relevant literature are based primarily on fast separations of analytes of high significance in foods. These protocols are combined with different strategies to achieve selectivity, using suitable non-extensive sample preparation and/or strategically chosen detection routes. Polyphenolic compounds, amino acids, preservatives, and organic and inorganic ions have been studied using CE microchips. Thus, new and exciting expectations arise in the domain of food analysis, although several drawbacks can still easily be found within the miniaturization map.

  7. Simple glucose reduction route for one-step synthesis of copper nanofluids

    NASA Astrophysics Data System (ADS)

    Shenoy, U. Sandhya; Shetty, A. Nityananda

    2014-01-01

    A one-step method has been employed in the synthesis of copper nanofluids. Copper nitrate is reduced by glucose in the presence of sodium lauryl sulfate. The synthesized particles are characterized by X-ray diffraction for the phase structure; electron diffraction X-ray analysis for chemical composition; transmission electron microscopy and field emission scanning electron microscopy for the morphology; and Fourier-transform infrared spectroscopy and ultraviolet-visible spectroscopy for the analysis of the ingredients of the solution. Thermal conductivity, sedimentation and rheological measurements have also been carried out. It is found that the reaction parameters have a considerable effect on the size of the particles formed and the rate of the reaction. The techniques confirm that the synthesized particles are copper. The reported method showed a promising increase in the thermal conductivity of the base fluid and is found to be a reliable, simple and cost-effective method for preparing heat transfer fluids with higher stability.

  8. Decoding spike timing: the differential reverse correlation method

    PubMed Central

    Tkačik, Gašper; Magnasco, Marcelo O.

    2009-01-01

    It is widely acknowledged that the detailed timing of action potentials is used to encode information, for example in auditory pathways; however, the computational tools required to analyze encoding through timing are still in their infancy. We present a simple example of encoding, based on a recent model of time-frequency analysis, in which units fire action potentials when a certain condition is met, but the timing of the action potential also depends on other features of the stimulus. We show that, as a result, spike-triggered averages are smoothed so much that they do not represent the true features of the encoding. Inspired by this example, we present a simple method, differential reverse correlation, that can separate an analysis of what causes a neuron to spike from an analysis of what controls its timing. We analyze the leaky integrate-and-fire neuron with this method and show that the method accurately reconstructs the model's kernel. PMID:18597928

  9. Continuous particle separation using pressure-driven flow-induced miniaturizing free-flow electrophoresis (PDF-induced μ-FFE).

    PubMed

    Jeon, Hyungkook; Kim, Youngkyu; Lim, Geunbae

    2016-01-28

    In this paper, we introduce pressure-driven flow-induced miniaturizing free-flow electrophoresis (PDF-induced μ-FFE), a novel continuous separation method. In our separation system, the external flow and electric field are applied to particles, such that particle movement is affected by pressure-driven flow, electroosmosis, and electrophoresis. We then analyzed the hydrodynamic drag force and electrophoretic force applied to the particles in opposite directions. Based on this analysis, micro- and nano-sized particles were separated according to their electrophoretic mobilities with high separation efficiency. Because the separation can be achieved in a simple T-shaped microchannel, without the use of internal electrodes, it offers the advantages of low-cost, simple device fabrication and bubble-free operation, compared with conventional μ-FFE methods. Therefore, we expect the proposed separation method to have a wide range of filtering/separation applications in biochemical analysis.
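    The force balance analyzed above, hydrodynamic drag against the electrophoretic driving force, can be sketched with textbook expressions (Stokes drag on a sphere and mobility-based electrophoretic velocity). A minimal sketch; all numerical values are illustrative assumptions, not data from the paper:

```python
import math

def stokes_drag(eta, r, v):
    # Stokes drag on a sphere moving at speed v: F = 6*pi*eta*r*v
    return 6 * math.pi * eta * r * v

def electrophoretic_velocity(mu_ep, E):
    # steady-state electrophoretic velocity: v = mu_ep * E
    return mu_ep * E

# illustrative parameters (assumed, not from the paper)
eta = 1.0e-3   # viscosity of water, Pa*s
r = 0.5e-6     # particle radius, m
E = 1.0e4      # electric field, V/m

# particles with different mobilities migrate at different speeds,
# which is what spreads them across the separation channel
for mu_ep in (1.0e-8, 4.0e-8):   # mobilities, m^2/(V*s)
    v = electrophoretic_velocity(mu_ep, E)
    f = stokes_drag(eta, r, v)   # drag balancing the EP force at steady state
    print(f"mu_ep={mu_ep:.0e}: v={v:.1e} m/s, balancing drag={f:.2e} N")
```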

  10. Simple quantification of surface carboxylic acids on chemically oxidized multi-walled carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Gong, Hyejin; Kim, Seong-Taek; Lee, Jong Doo; Yim, Sanggyu

    2013-02-01

    The surfaces of multi-walled carbon nanotubes (MWCNTs) were chemically oxidized using nitric acid and sulfuric-nitric acid mixtures. Thermogravimetric analysis, transmission electron microscopy and infrared spectroscopy revealed that the use of acid mixtures led to a higher degree of oxidation. More quantitative identification of surface carboxylic acids was carried out using X-ray photoelectron spectroscopy (XPS) and acid-base titration. However, these techniques are costly, and their long analysis times make it difficult to respond promptly to the extent of the reaction. We propose a much simpler method, using pH measurements and a pre-determined pKa value, to estimate the concentration of carboxylic acids on the oxidized MWCNT surfaces. The results from this technique were consistent with those obtained from XPS and titration, and it is expected that this simple quantification method can provide a cheap and fast way to monitor and control the oxidation reaction of MWCNTs.
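    As an illustration of the kind of estimate such a pH-based method permits (a sketch of the underlying equilibrium arithmetic, not the authors' exact procedure), the total concentration of surface carboxylic acid can be back-calculated from a measured pH and a pre-determined pKa, assuming dissociation of the acid is the only source of H+:

```python
def total_acid_concentration(pH, pKa):
    """Total weak-acid concentration [HA] + [A-] inferred from pH,
    assuming [A-] ~ [H+] (acid dissociation is the only H+ source)."""
    h = 10.0 ** (-pH)        # [H+] from the measured pH
    ka = 10.0 ** (-pKa)      # acid dissociation constant
    # Ka = [H+][A-]/[HA] with [A-] ~ [H+]  =>  [HA] = [H+]^2 / Ka
    return h + h * h / ka

# hypothetical reading: suspension at pH 4.0, assumed pKa 4.5
print(f"{total_acid_concentration(4.0, 4.5):.2e} mol/L")
```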

  11. Genetic diversity studies in pea (Pisum sativum L.) using simple sequence repeat markers.

    PubMed

    Kumari, P; Basal, N; Singh, A K; Rai, V P; Srivastava, C P; Singh, P K

    2013-03-13

    The genetic diversity among 28 pea (Pisum sativum L.) genotypes was analyzed using 32 simple sequence repeat markers. A total of 44 polymorphic bands, with an average of 2.1 bands per primer, were obtained. The polymorphism information content ranged from 0.309 to 0.657, with an average of 0.493. The variation in genetic diversity among these cultivars ranged from 0.11 to 0.73. Cluster analysis based on Jaccard's similarity coefficient using the unweighted pair-group method with arithmetic mean (UPGMA) revealed 2 distinct clusters, I and II, comprising 6 and 22 genotypes, respectively. Cluster II was further differentiated into 2 subclusters, IIA and IIB, with 12 and 10 genotypes, respectively. Principal component (PC) analysis revealed results similar to those of UPGMA. The first, second, and third PCs contributed 21.6, 16.1, and 14.0% of the variation, respectively; the cumulative variation of the first 3 PCs was 51.7%.
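    The Jaccard similarity coefficient underlying the cluster analysis is easy to compute from binary band-presence profiles. A minimal sketch with two hypothetical genotypes:

```python
def jaccard(a, b):
    """Jaccard similarity for binary marker profiles:
    shared bands / bands present in at least one genotype."""
    shared = sum(1 for x, y in zip(a, b) if x and y)
    either = sum(1 for x, y in zip(a, b) if x or y)
    return shared / either if either else 0.0

# 1 = band present, 0 = band absent, one entry per SSR band
genotype_1 = [1, 1, 0, 1, 0, 1]
genotype_2 = [1, 0, 0, 1, 1, 1]
print(jaccard(genotype_1, genotype_2))  # 3 shared / 5 present -> 0.6
```

A UPGMA dendrogram is then built from the matrix of pairwise distances 1 - jaccard(a, b).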

  12. Continuous particle separation using pressure-driven flow-induced miniaturizing free-flow electrophoresis (PDF-induced μ-FFE)

    PubMed Central

    Jeon, Hyungkook; Kim, Youngkyu; Lim, Geunbae

    2016-01-01

    In this paper, we introduce pressure-driven flow-induced miniaturizing free-flow electrophoresis (PDF-induced μ-FFE), a novel continuous separation method. In our separation system, the external flow and electric field are applied to particles, such that particle movement is affected by pressure-driven flow, electroosmosis, and electrophoresis. We then analyzed the hydrodynamic drag force and electrophoretic force applied to the particles in opposite directions. Based on this analysis, micro- and nano-sized particles were separated according to their electrophoretic mobilities with high separation efficiency. Because the separation can be achieved in a simple T-shaped microchannel, without the use of internal electrodes, it offers the advantages of low-cost, simple device fabrication and bubble-free operation, compared with conventional μ-FFE methods. Therefore, we expect the proposed separation method to have a wide range of filtering/separation applications in biochemical analysis. PMID:26819221

  13. Design of a Class of Antennas Utilizing MEMS, EBG and Septum Polarizers including Near-field Coupling Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Ilkyu

    Recent developments in mobile communications have led to an increased appearance of short-range communications and high data-rate signal transmission. These new technologies create the need for accurate near-field coupling analysis and novel antenna designs. An ability to effectively estimate the coupling within the near-field region is required to realize short-range communications. Currently, two common techniques applicable to the near-field coupling problem are 1) the integral form of the coupling formula and 2) the generalized Friis formula. These formulas are investigated with an emphasis on straightforward calculation and accuracy for various distances between the two antennas. The coupling formulas are computed for a variety of antennas, and several antenna configurations are evaluated through full-wave simulation and indoor measurement in order to validate these techniques. In addition, this research aims to design multi-functional and high-performance antennas based on MEMS (microelectromechanical systems) switches, EBG (electromagnetic bandgap) structures, and septum polarizers. A MEMS switch is incorporated into a slot-loaded patch antenna to attain frequency reconfigurability. The resonant frequency of the patch antenna can be shifted using the MEMS switch, which is actuated by the integrated bias networks. Furthermore, a high-gain base-station antenna utilizing beam-tilting is designed to maximize gain for tilted-beam applications. To realize this base-station antenna, an array of four dipole-EBG elements is constructed to implement a fixed down-tilt main beam with application in base-station arrays. The improvement of the operating range with the EBG-dipole array is evaluated using a simple link-budget analysis. The septum polarizer has been widely used in circularly polarized antenna systems due to its simple and compact design and high quality of circularity. In this research, the sigmoid function is used to smooth the edge in the septum design, which makes it suitable for HPM systems. The PSO (particle swarm optimization) technique is applied to the septum design to achieve a high-performance antenna. The electric field intensity above the septum is evaluated through simulation, and its properties are compared to simple half-plane scattering phenomena.

  14. PROPOSAL FOR A SIMPLE AND EFFICIENT MONTHLY QUALITY MANAGEMENT PROGRAM ASSESSING THE CONSISTENCY OF ROBOTIC IMAGE-GUIDED SMALL ANIMAL RADIATION SYSTEMS

    PubMed Central

    Brodin, N. Patrik; Guha, Chandan; Tomé, Wolfgang A.

    2015-01-01

    Modern pre-clinical radiation therapy (RT) research requires high precision and accurate dosimetry to facilitate the translation of research findings into clinical practice. Several systems are available that provide precise delivery and on-board imaging capabilities, highlighting the need for a quality management program (QMP) to ensure consistent and accurate radiation dose delivery. An ongoing, simple, and efficient QMP for image-guided robotic small animal irradiators used in pre-clinical RT research is described. Protocols were developed and implemented to assess the dose output constancy (based on the AAPM TG-61 protocol), cone-beam computed tomography (CBCT) image quality and object representation accuracy (using a custom-designed imaging phantom), CBCT-guided target localization accuracy and consistency of the CBCT-based dose calculation. To facilitate an efficient read-out and limit the user dependence of the QMP data analysis, a semi-automatic image analysis and data representation program was developed using the technical computing software MATLAB. The results of the first six months' experience using the suggested QMP for a Small Animal Radiation Research Platform (SARRP) are presented, with data collected on a bi-monthly basis. The dosimetric output constancy was established to be within ±1%, the consistency of the image resolution was within ±0.2 mm, the accuracy of CBCT-guided target localization was within ±0.5 mm, and dose calculation consistency was within ±2 s (±3%) per treatment beam. Based on these results, this simple quality assurance program allows for the detection of inconsistencies in dosimetric or imaging parameters that are beyond the acceptable variability for a reliable and accurate pre-clinical RT system, on a monthly or bi-monthly basis. PMID:26425981
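    The pass/fail core of such a QMP reduces to checking each measured deviation against its acceptance limit. A sketch of that logic in Python (the metric names and program structure are illustrative, not the authors' MATLAB implementation):

```python
# acceptance limits from the protocol described above
TOLERANCES = {
    "dose_output_pct": 1.0,    # +/-1 % dose output constancy
    "resolution_mm": 0.2,      # +/-0.2 mm image resolution
    "localization_mm": 0.5,    # +/-0.5 mm CBCT-guided targeting
    "dose_calc_pct": 3.0,      # +/-3 % dose calculation per beam
}

def qmp_check(measured_deviations):
    """Return the metrics whose absolute deviation from baseline
    exceeds the acceptance limit."""
    return [name for name, dev in measured_deviations.items()
            if abs(dev) > TOLERANCES[name]]

# hypothetical monthly readings (absolute deviations from baseline)
monthly = {"dose_output_pct": 0.6, "resolution_mm": 0.25,
           "localization_mm": 0.3, "dose_calc_pct": 1.1}
print(qmp_check(monthly))  # ['resolution_mm'] would trigger review
```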

  15. Proposal for a Simple and Efficient Monthly Quality Management Program Assessing the Consistency of Robotic Image-Guided Small Animal Radiation Systems.

    PubMed

    Brodin, N Patrik; Guha, Chandan; Tomé, Wolfgang A

    2015-11-01

    Modern pre-clinical radiation therapy (RT) research requires high precision and accurate dosimetry to facilitate the translation of research findings into clinical practice. Several systems are available that provide precise delivery and on-board imaging capabilities, highlighting the need for a quality management program (QMP) to ensure consistent and accurate radiation dose delivery. An ongoing, simple, and efficient QMP for image-guided robotic small animal irradiators used in pre-clinical RT research is described. Protocols were developed and implemented to assess the dose output constancy (based on the AAPM TG-61 protocol), cone-beam computed tomography (CBCT) image quality and object representation accuracy (using a custom-designed imaging phantom), CBCT-guided target localization accuracy and consistency of the CBCT-based dose calculation. To facilitate an efficient read-out and limit the user dependence of the QMP data analysis, a semi-automatic image analysis and data representation program was developed using the technical computing software MATLAB. The results of the first 6-mo experience using the suggested QMP for a Small Animal Radiation Research Platform (SARRP) are presented, with data collected on a bi-monthly basis. The dosimetric output constancy was established to be within ±1 %, the consistency of the image resolution was within ±0.2 mm, the accuracy of CBCT-guided target localization was within ±0.5 mm, and dose calculation consistency was within ±2 s (±3%) per treatment beam. Based on these results, this simple quality assurance program allows for the detection of inconsistencies in dosimetric or imaging parameters that are beyond the acceptable variability for a reliable and accurate pre-clinical RT system, on a monthly or bi-monthly basis.

  16. Finding text in color images

    NASA Astrophysics Data System (ADS)

    Zhou, Jiangying; Lopresti, Daniel P.; Tasdizen, Tolga

    1998-04-01

    In this paper, we consider the problem of locating and extracting text from WWW images. A previous algorithm based on color clustering and connected-components analysis works well as long as the color of each character is relatively uniform and the typography is fairly simple. It breaks down quickly, however, when these assumptions are violated. In this paper, we describe more robust techniques for dealing with this challenging problem. We present an improved color clustering algorithm that measures similarity based on both RGB value and spatial proximity. Layout analysis is also incorporated to handle more complex typography. These changes significantly enhance the performance of our text detection procedure.
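    The improved clustering step can be sketched as a pixel-similarity metric combining color distance with spatial proximity; the weighting below is an assumption for illustration, not the paper's tuned value:

```python
import math

def pixel_distance(p1, p2, w_spatial=0.5):
    """Combined dissimilarity of two pixels given as (r, g, b, x, y):
    Euclidean RGB distance plus weighted Euclidean spatial distance."""
    d_rgb = math.dist(p1[:3], p2[:3])   # color difference
    d_xy = math.dist(p1[3:], p2[3:])    # spatial separation
    return d_rgb + w_spatial * d_xy

# identically colored pixels far apart are less similar than nearby ones
print(pixel_distance((200, 30, 30, 0, 0), (200, 30, 30, 6, 8)))  # 0 + 0.5*10 = 5.0
```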

  17. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources such as mass, power, and crew time), and to support development of tools for autonomy, needed for exploration (assess and maintain resilience of individuals, teams, and the integrated system). Output of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  18. Fabricating Simple Wax Screen-Printing Paper-Based Analytical Devices to Demonstrate the Concept of Limiting Reagent in Acid-Base Reactions

    ERIC Educational Resources Information Center

    Namwong, Pithakpong; Jarujamrus, Purim; Amatatongchai, Maliwan; Chairam, Sanoe

    2018-01-01

    In this article, a low-cost, simple, and rapid fabrication of paper-based analytical devices (PADs) using a wax screen-printing method is reported here. The acid-base reaction is implemented in the simple PADs to demonstrate to students the chemistry concept of a limiting reagent. When a fixed concentration of base reacts with a gradually…

  19. Base Pressure at Supersonic Speeds on Two-dimensional Airfoils and on Bodies of Revolution with and Without Fins Having Turbulent Boundary Layers

    NASA Technical Reports Server (NTRS)

    LOVE EUGENE S

    1957-01-01

    An analysis has been made of available experimental data to show the effects of the variables that are most predominant in determining base pressure at supersonic speeds. The analysis covers base pressures for two-dimensional airfoils and for bodies of revolution with and without stabilizing fins and is restricted to turbulent boundary layers. The present status of available experimental information is summarized, as are the existing methods for predicting base pressure. A simple semiempirical method is presented for estimating base pressure. For two-dimensional bases, this method stems from an analogy established between the base-pressure phenomena and the peak pressure rise associated with the separation of the boundary layer. An analysis made for axially symmetric flow indicates that the base pressure for bodies of revolution is subject to the same analogy. Based upon the methods presented, estimations are made of such effects as Mach number, angle of attack, boattailing, fineness ratio, and fins. These estimations give fair predictions of experimental results. (author)

  20. [Surgical treatment of chronic pancreatitis based on classification of M. Buchler and coworkers].

    PubMed

    Krivoruchko, I A; Boĭko, V V; Goncharova, N N; Andreeshchev, S A

    2011-08-01

    The results of surgical treatment of 452 patients suffering from chronic pancreatitis (CHP) were analyzed. The CHP classification elaborated by M. Buchler and coworkers (2009), based on clinical signs, morphological peculiarities and analysis of pancreatic function, contains scientifically substantiated recommendations for the choice of diagnostic methods and complex treatment of the disease. The proposed classification is simple in application and constitutes an instrument for studying and comparing the severity of the CHP course, the patients' prognosis and treatment.

  1. Vectorial atomic magnetometer based on coherent transients of laser absorption in Rb vapor

    NASA Astrophysics Data System (ADS)

    Lenci, L.; Auyuanet, A.; Barreiro, S.; Valente, P.; Lezama, A.; Failache, H.

    2014-04-01

    We have designed and tested an atomic vectorial magnetometer based on the analysis of the coherent oscillatory transients in the transmission of resonant laser light through a Rb vapor cell. We show that the oscillation amplitudes at the Larmor frequency and its first harmonic are related through a simple formula to the angles determining the orientation of the magnetic field vector. The magnetometer was successfully applied to the measurement of the ambient magnetic field.

  2. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, landowners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the landowners on each river bank develop their design strategies using risk-based economic optimization. For each landowner, the annual expected total cost includes the expected annual damage cost and the annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
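    The non-cooperative outcome described above can be illustrated with a toy best-response iteration: each bank repeatedly picks the levee height that minimizes its own expected annual cost (annualized construction plus expected damage). The cost function and all numbers below are purely illustrative assumptions, not the study's model:

```python
def expected_cost(h_self, h_other):
    """Toy annual cost for one bank: construction grows linearly in
    height; flood probability falls with own height but rises when the
    opposite bank is higher (water is pushed across the river)."""
    p_flood = 1.0 / (1.0 + h_self) * (1.0 + 0.3 * max(0.0, h_other - h_self))
    return 1.0 * h_self + p_flood * 100.0   # construction + expected damage

def best_response(h_other, heights):
    # pick the height minimizing own cost, given the other bank's levee
    return min(heights, key=lambda h: expected_cost(h, h_other))

heights = [i * 0.5 for i in range(41)]   # candidate heights 0..20 m
h1 = h2 = 0.0
for _ in range(50):                      # iterate to a Nash equilibrium
    h1 = best_response(h2, heights)
    h2 = best_response(h1, heights)
print(h1, h2)  # both banks settle at 9.0 with these toy numbers
```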

  3. Simple prediction method of lumbar lordosis for planning of lumbar corrective surgery: radiological analysis in a Korean population.

    PubMed

    Lee, Chong Suh; Chung, Sung Soo; Park, Se Jun; Kim, Dong Min; Shin, Seong Kee

    2014-01-01

    This study aimed at deriving a lordosis predictive equation using the pelvic incidence and establishing a simple method of predicting lumbar lordosis for planning lumbar corrective surgery in Asians. Eighty-six asymptomatic volunteers were enrolled in the study. The maximal lumbar lordosis (MLL), lower lumbar lordosis (LLL), pelvic incidence (PI), and sacral slope (SS) were measured. The correlations between the parameters were analyzed using Pearson correlation analysis. Predictive equations of lumbar lordosis were derived through simple regression analysis of the parameters, along with simple predictive values of lumbar lordosis based on PI. The PI strongly correlated with the SS (r = 0.78), and strong correlations were found between the SS and LLL (r = 0.89) and between the SS and MLL (r = 0.83). Based on these correlations, the following predictive equations of lumbar lordosis were obtained: SS = 0.80 + 0.74 PI (r = 0.78, R² = 0.61); LLL = 5.20 + 0.87 SS (r = 0.89, R² = 0.80); MLL = 17.41 + 0.96 SS (r = 0.83, R² = 0.68). When PI was between 30° and 35°, 40° and 50°, and 55° and 60°, the equations predicted that MLL would be PI + 10°, PI + 5°, and PI, and that LLL would be PI - 5°, PI - 10°, and PI - 15°, respectively. This simple calculation method can provide a more appropriate and simpler prediction of lumbar lordosis for Asian populations. The prediction of lumbar lordosis should be used as a reference for surgeons planning to restore lumbar lordosis in corrective surgery.
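    The reported regressions can be chained to predict both lordosis measures from pelvic incidence alone (the chaining is a straightforward reading of the equations above; the example PI is hypothetical):

```python
def predict_ss(pi):
    # sacral slope from pelvic incidence: SS = 0.80 + 0.74 PI
    return 0.80 + 0.74 * pi

def predict_lll(ss):
    # lower lumbar lordosis from sacral slope: LLL = 5.20 + 0.87 SS
    return 5.20 + 0.87 * ss

def predict_mll(ss):
    # maximal lumbar lordosis from sacral slope: MLL = 17.41 + 0.96 SS
    return 17.41 + 0.96 * ss

pi = 50.0              # example pelvic incidence, degrees
ss = predict_ss(pi)    # 37.8 degrees
print(f"MLL ~ {predict_mll(ss):.1f} deg, LLL ~ {predict_lll(ss):.1f} deg")
```

For PI = 50° this gives MLL ≈ 53.7° and LLL ≈ 38.1°, consistent with the simplified rules MLL ≈ PI + 5° and LLL ≈ PI - 10° quoted for the 40° to 50° range.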

  4. Analysis of Calibration Errors for Both Short and Long Stroke White Light Experiments

    NASA Technical Reports Server (NTRS)

    Pan, Xaiopei

    2006-01-01

    This work will analyze focusing and tilt variations introduced by thermal changes in calibration processes. In particular the accuracy limits are presented for common short- and long-stroke experiments. A new, simple, practical calibration scheme is proposed and analyzed based on the SIM PlanetQuest's Micro-Arcsecond Metrology (MAM) testbed experiments.

  5. Tallying Differences between Demographic Subgroups from Multiple Institutions: The Practical Utility of Nonparametric Analysis

    ERIC Educational Resources Information Center

    Yorke, Mantz

    2017-01-01

    When analysing course-level data by subgroups based upon some demographic characteristics, the numbers in analytical cells are often too small to allow inferences to be drawn that might help in the enhancement of practices. However, relatively simple analyses can provide useful pointers. This article draws upon a study involving a partnership with…

  6. Landscape scale mapping of forest inventory data by nearest neighbor classification

    Treesearch

    Andrew Lister

    2009-01-01

    One of the goals of the Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is large-area mapping. FIA scientists have tried many methods in the past, including geostatistical methods, linear modeling, nonlinear modeling, and simple choropleth and dot maps. Mapping methods that require individual model-based maps to be...

  7. Time-Lapse and Slow-Motion Tracking of Temperature Changes: Response Time of a Thermometer

    ERIC Educational Resources Information Center

    Moggio, L.; Onorato, P.; Gratton, L. M.; Oss, S.

    2017-01-01

    We propose the use of a smartphone based time-lapse and slow-motion video techniques together with tracking analysis as valuable tools for investigating thermal processes such as the response time of a thermometer. The two simple experimental activities presented here, suitable also for high school and undergraduate students, allow one to measure…

  8. The Use of Tetrads in the Analysis of Arts-Based Media

    ERIC Educational Resources Information Center

    Gouzouasis, Peter; LaMonde, Anne-Marie

    2005-01-01

    In this article, we chose the musical form of a sonata to examine tetrads, a simple four-fold structure that Marshall McLuhan coined and employed to describe various technologies. Tetrads, as cognitive models, are used to refine, focus, or discover entities in cultures and technologies, which are hidden from view in the psyche. Tetradic logic…

  9. Development and Characterization of Novel SSR Markers in Carrot (Daucus Carota L.) and Their Application for Mapping and Diversity Analysis in Apiaceae

    USDA-ARS?s Scientific Manuscript database

    Genomic resources in carrot and other Apiaceae are relatively underdeveloped. The availability of a large set of pcr-based codominant markers, such as simple sequence repeats (SSR), would allow integration of the different carrot genetic maps constructed to date (mainly using anonymous dominant mark...

  10. A Modification of the Oersted Experiment

    ERIC Educational Resources Information Center

    Stoyanov, Dimitar G.

    2009-01-01

    The paper describes a simple set-up of the Oersted experiment. A planar coil of wires has been used to deflect the magnetic needle vigorously (by more than 80 angular degrees) when a current of up to 1 A flows along it. Based on theoretical analysis, the torque on the magnetic needle is analytically expressed taking into account the inhomogeneity of…

  11. Mastering Overdetection and Underdetection in Learner-Answer Processing: Simple Techniques for Analysis and Diagnosis

    ERIC Educational Resources Information Center

    Blanchard, Alexia; Kraif, Olivier; Ponton, Claude

    2009-01-01

    This paper presents a "didactic triangulation" strategy to cope with the problem of reliability of NLP applications for computer-assisted language learning (CALL) systems. It is based on the implementation of basic but well mastered NLP techniques and puts the emphasis on an adapted gearing between computable linguistic clues and didactic features…

  12. Aerothermal modeling program, phase 1

    NASA Technical Reports Server (NTRS)

    Srinivasan, R.; Reynolds, R.; Ball, I.; Berry, R.; Johnson, K.; Mongia, H.

    1983-01-01

    Aerothermal submodels used in analytical combustor models are analyzed. The models described include turbulence and scalar transport, gaseous fuel combustion, spray evaporation/combustion, soot formation and oxidation, and radiation. The computational scheme is discussed in relation to boundary conditions and convergence criteria. Also presented are the database of benchmark-quality test cases and an analysis of simple flows.

  13. Measuring the Impacts of a Volunteer-Based Community Development Program in Developing Volunteers' Leadership Skills

    ERIC Educational Resources Information Center

    Meier, Amy; Singletary, Loretta; Hill, George

    2012-01-01

    This article summarizes the results of an evaluation of the impacts of a community development program to develop leadership skills in its adult volunteers. The evaluation featured 20 questions about leadership skills learned as a result of volunteer experiences. Data analysis strategies beyond a simple means ranking resulted in evidence…

  14. Helping Students Assess the Relative Importance of Different Intermolecular Interactions

    ERIC Educational Resources Information Center

    Jasien, Paul G.

    2008-01-01

    A semi-quantitative model has been developed to estimate the relative effects of dispersion, dipole-dipole interactions, and H-bonding on the normal boiling points ("T[subscript b]") for a subset of simple organic systems. The model is based upon a statistical analysis using multiple linear regression on a series of straight-chain organic…

  15. A simple next-best alternative to seasonal predictions in Europe

    NASA Astrophysics Data System (ADS)

    Buontempo, Carlo; De Felice, Matteo

    2016-04-01

    In order to build a climate-proof society, we need to learn how to make the best use of the climate information we have. Having spent time and resources developing complex numerical models has often blinded us to the value some of this information really has in the eyes of a decision maker. An effective way to assess this is to compare the quality of the forecast (and its cost) with the quality of a forecast from a prediction system based on simpler assumptions (and thus cheaper to run). Such a practice is common in marketing analysis, where it is often referred to as the next-best alternative. As a way to facilitate such an analysis, climate service providers should always provide a set of skill scores alongside the predictions. These are usually based on climatological means, anomaly persistence or, more recently, multiple linear regression. We here present an equally simple benchmark based on a Markov chain process locally trained at a monthly or seasonal time scale. We demonstrate that, in spite of its simplicity, the model easily outperforms not only the standard benchmarks but also most of the seasonal prediction systems, at least in Europe. We suggest that a benchmark of this kind could represent a useful next-best alternative for a number of users.
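    A benchmark of the kind described, a first-order Markov chain trained on a categorical series (for example, monthly temperature terciles), can be sketched in a few lines; the toy data are an assumption for illustration:

```python
from collections import Counter, defaultdict

def fit_transitions(states):
    """Count first-order transitions in a categorical series
    (e.g. monthly terciles: B=below, N=normal, A=above normal)."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(states, states[1:]):
        counts[prev][nxt] += 1
    return counts

def forecast(counts, current):
    """Predict the most frequent successor of the current state,
    falling back to persistence for unseen states."""
    if current not in counts:
        return current
    return counts[current].most_common(1)[0][0]

# toy tercile series (hypothetical, for illustration only)
series = list("BNBNABNAB")
model = fit_transitions(series)
print(forecast(model, "B"))  # 'N' follows 'B' most often here
```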

  16. Development of a multiplex PCR-based rapid typing method for enterohemorrhagic Escherichia coli O157 strains.

    PubMed

    Ooka, Tadasuke; Terajima, Jun; Kusumoto, Masahiro; Iguchi, Atsushi; Kurokawa, Ken; Ogura, Yoshitoshi; Asadulghani, Md; Nakayama, Keisuke; Murase, Kazunori; Ohnishi, Makoto; Iyoda, Sunao; Watanabe, Haruo; Hayashi, Tetsuya

    2009-09-01

    Enterohemorrhagic Escherichia coli O157 (EHEC O157) is a food-borne pathogen that has raised worldwide public health concern. The development of simple and rapid strain-typing methods is crucial for the rapid detection and surveillance of EHEC O157 outbreaks. In the present study, we developed a multiplex PCR-based strain-typing method for EHEC O157, which is based on the variability in genomic location of IS629 among EHEC O157 strains. This method is very simple, in that the procedures are completed within 2 h, the analysis can be performed without the need for special equipment or techniques (requiring only conventional PCR and agarose gel electrophoresis systems), the results can easily be transformed into digital data, and the genes for the major virulence markers of EHEC O157 (the stx(1), stx(2), and eae genes) can be detected simultaneously. Using this method, 201 EHEC O157 strains showing different XbaI digestion patterns in pulsed-field gel electrophoresis (PFGE) analysis were classified into 127 types, and outbreak-related strains showed identical or highly similar banding patterns. Although this method is less discriminatory than PFGE, it may be useful as a primary screening tool for EHEC O157 outbreaks.

  17. Network Sampling and Classification: An Investigation of Network Model Representations

    PubMed Central

    Airoldi, Edoardo M.; Bai, Xue; Carley, Kathleen M.

    2011-01-01

    Methods for generating a random sample of networks with desired properties are important tools for the analysis of social, biological, and information networks. Algorithm-based approaches to sampling networks have received a great deal of attention in the recent literature. Most of these algorithms are based on simple intuitions that associate the full features of connectivity patterns with specific values of only one or two network metrics. Substantive conclusions are crucially dependent on this association holding true. However, the extent to which this simple intuition holds true is not yet known. In this paper, we examine the association between the connectivity patterns that a network sampling algorithm aims to generate and the connectivity patterns of the generated networks, measured by an existing set of popular network metrics. We find that different network sampling algorithms can yield networks with similar connectivity patterns. We also find that alternative algorithms aimed at the same connectivity pattern can yield networks with different connectivity patterns. We argue that conclusions based on simulated network studies must focus on the full features of the connectivity patterns of a network instead of on a limited set of network metrics for a specific network type. This fact has important implications for network data analysis: for instance, implications related to the way significance is currently assessed. PMID:21666773

  18. Analysis of tablet compaction. I. Characterization of mechanical behavior of powder and powder/tooling friction.

    PubMed

    Cunningham, J C; Sinka, I C; Zavaliangos, A

    2004-08-01

    In this first of two articles on the modeling of tablet compaction, the experimental inputs related to the constitutive model of the powder and the powder/tooling friction are determined. The continuum-based analysis of tableting makes use of an elasto-plastic model, which incorporates the elements of yield, plastic flow potential, and hardening, to describe the mechanical behavior of microcrystalline cellulose over the range of densities experienced during tableting. Specifically, a modified Drucker-Prager/cap plasticity model, which includes material parameters such as cohesion, internal friction, and hydrostatic yield pressure that evolve with the internal state variable relative density, was applied. Linear elasticity is assumed with the elastic parameters, Young's modulus, and Poisson's ratio dependent on the relative density. The calibration techniques were developed based on a series of simple mechanical tests including diametrical compression, simple compression, and die compaction using an instrumented die. The friction behavior is measured using an instrumented die and the experimental data are analyzed using the method of differential slices. The constitutive model and frictional properties are essential experimental inputs to the finite element-based model described in the companion article. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 93:2022-2039, 2004

  19. A statistical method for measuring activation of gene regulatory networks.

    PubMed

    Esteves, Gustavo H; Reis, Luiz F L

    2018-06-13

    Gene expression data analysis is of great importance for modern molecular biology, given our ability to measure the expression profiles of thousands of genes, enabling studies rooted in systems biology. In this work, we propose a simple statistical model for measuring the activation of gene regulatory networks, rather than the traditional gene co-expression networks. We present the mathematical construction of a statistical procedure for testing hypotheses regarding gene regulatory network activation. The real probability distribution for the test statistic is evaluated by a permutation-based study. To illustrate the functionality of the proposed methodology, we also present a simple example based on a small hypothetical network and activation measurements of two KEGG networks, both based on gene expression data collected from gastric and esophageal samples. The two KEGG networks were also analyzed using a public database, available through NCBI-GEO, presented as Supplementary Material. This method was implemented in an R package that is available at the BioConductor project website under the name maigesPack.
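    A permutation-based evaluation of a test statistic, as described in the abstract, can be sketched in a few lines. This is a generic one-sided permutation test on the mean expression of a gene set — a stand-in for the paper's actual statistic, with hypothetical expression values:

```python
import random

def permutation_pvalue(network_vals, all_vals, n_perm=2000, seed=0):
    """One-sided permutation p-value for the mean expression of a gene set:
    how often does a random gene set of the same size score at least as high?"""
    rng = random.Random(seed)
    k = len(network_vals)
    observed = sum(network_vals) / k
    hits = 0
    for _ in range(n_perm):
        sample = rng.sample(all_vals, k)
        if sum(sample) / k >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction avoids p = 0

# hypothetical log-expression values; the "network" genes run high
background = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.05, -0.3, 0.15, 0.0]
network = [0.9, 1.1, 0.8]
p = permutation_pvalue(network, background + network)
print(p)
```

    With the network genes well above background, the permutation null rarely matches the observed mean, so the p-value comes out small.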

  20. Regression-based model of skin diffuse reflectance for skin color analysis

    NASA Astrophysics Data System (ADS)

    Tsumura, Norimichi; Kawazoe, Daisuke; Nakaguchi, Toshiya; Ojima, Nobutoshi; Miyake, Yoichi

    2008-11-01

    A simple regression-based model of skin diffuse reflectance is developed based on reflectance samples calculated by Monte Carlo simulation of light transport in a two-layered skin model. This reflectance model includes the values of spectral reflectance in the visible spectra for Japanese women. The modified Lambert-Beer law holds in the proposed model with a modified mean free path length in non-linear density space. The averaged RMS and maximum errors of the proposed model were 1.1 and 3.1%, respectively, over the visible range.
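    The modified Lambert-Beer idea — attenuation governed by layer-specific absorption and a modified mean path length — can be sketched as below. All coefficients and path lengths here are hypothetical placeholders, not the fitted values of the paper:

```python
import math

def diffuse_reflectance(mu_a_epi, mu_a_derm, path_epi, path_derm):
    """Modified Lambert-Beer sketch for a two-layer skin model: each layer
    attenuates with its own absorption coefficient and mean path length
    (all values hypothetical)."""
    return math.exp(-(mu_a_epi * path_epi + mu_a_derm * path_derm))

r = diffuse_reflectance(mu_a_epi=0.5, mu_a_derm=0.1, path_epi=0.2, path_derm=1.5)
print(round(r, 3))  # 0.779
```

    In the paper's formulation the mean path lengths themselves are modified non-linearly with chromophore density; here they are simply fixed inputs.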

  1. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, George

    1993-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  2. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, Stanislav

    1992-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  3. Simple non-laboratory- and laboratory-based risk assessment algorithms and nomogram for detecting undiagnosed diabetes mellitus.

    PubMed

    Wong, Carlos K H; Siu, Shing-Chung; Wan, Eric Y F; Jiao, Fang-Fang; Yu, Esther Y T; Fung, Colman S C; Wong, Ka-Wai; Leung, Angela Y M; Lam, Cindy L K

    2016-05-01

    The aim of the present study was to develop a simple nomogram that can be used to predict the risk of diabetes mellitus (DM) in asymptomatic non-diabetic subjects based on non-laboratory- and laboratory-based risk algorithms. Anthropometric data, plasma fasting glucose, full lipid profile, exercise habits, and family history of DM were collected from Chinese non-diabetic subjects aged 18-70 years. Logistic regression analysis was performed on a random sample of 2518 subjects to construct non-laboratory- and laboratory-based risk assessment algorithms for detection of undiagnosed DM; both algorithms were validated on data of the remaining sample (n = 839). The Hosmer-Lemeshow test and area under the receiver operating characteristic (ROC) curve (AUC) were used to assess the calibration and discrimination of the DM risk algorithms. Of 3357 subjects recruited, 271 (8.1%) had undiagnosed DM defined by fasting glucose ≥7.0 mmol/L or 2-h post-load plasma glucose ≥11.1 mmol/L after an oral glucose tolerance test. The non-laboratory-based risk algorithm, with scores ranging from 0 to 33, included age, body mass index, family history of DM, regular exercise, and uncontrolled blood pressure; the laboratory-based risk algorithm, with scores ranging from 0 to 37, added triglyceride level to the risk factors. Both algorithms demonstrated acceptable calibration (Hosmer-Lemeshow test: P = 0.229 and P = 0.483) and discrimination (AUC 0.709 and 0.711) for detection of undiagnosed DM. A simple-to-use nomogram for detecting undiagnosed DM has been developed using validated non-laboratory-based and laboratory-based risk algorithms. © 2015 Ruijin Hospital, Shanghai Jiaotong University School of Medicine and Wiley Publishing Asia Pty Ltd.
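    A point-based risk algorithm of this kind sums category points and maps the total to a probability through a logistic link. The point weights and logistic coefficients below are illustrative inventions, not the published algorithm:

```python
import math

# hypothetical point weights (illustrative only, not the published algorithm)
def risk_score(age, bmi, family_history, regular_exercise, uncontrolled_bp):
    score = 0
    score += 0 if age < 40 else (4 if age < 55 else 8)
    score += 0 if bmi < 23 else (3 if bmi < 27.5 else 6)
    score += 5 if family_history else 0
    score += 0 if regular_exercise else 3
    score += 6 if uncontrolled_bp else 0
    return score  # 0..28 on this illustrative scale

def risk_probability(score, intercept=-4.0, slope=0.15):
    """Map points to a probability with a logistic link (coefficients assumed)."""
    return 1 / (1 + math.exp(-(intercept + slope * score)))

s = risk_score(age=60, bmi=28, family_history=True,
               regular_exercise=False, uncontrolled_bp=True)
print(s, round(risk_probability(s), 3))  # 28 0.55
```

    A printed nomogram is essentially a graphical rendering of this same points-to-probability mapping.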

  4. Sonographic Diagnosis of Tubal Cancer with IOTA Simple Rules Plus Pattern Recognition

    PubMed Central

    Tongsong, Theera; Wanapirak, Chanane; Tantipalakorn, Charuwan; Tinnangwattana, Dangcheewan

    2017-01-01

    Objective: To evaluate diagnostic performance of IOTA simple rules plus pattern recognition in predicting tubal cancer. Methods: Secondary analysis was performed on the prospective database of our IOTA project. The patients recruited in the project were those who were scheduled for pelvic surgery due to adnexal masses. The patients underwent ultrasound examinations within 24 hours before surgery. On ultrasound examination, the masses were evaluated using the well-established IOTA simple rules plus pattern recognition (sausage-shaped appearance, incomplete septum, visible ipsilateral ovaries) to predict tubal cancer. The gold standard diagnosis was based on histological findings or operative findings. Results: A total of 482 patients, including 15 cases of tubal cancer, were evaluated by ultrasound preoperatively. The IOTA simple rules plus pattern recognition gave a sensitivity of 86.7% (13 in 15) and specificity of 97.4%. Sausage-shaped appearance was identified in nearly all cases (14 in 15). Incomplete septa and normal ovaries could be identified in 33.3% and 40%, respectively. Conclusion: IOTA simple rules plus pattern recognition is relatively effective in predicting tubal cancer. Thus, we propose a simple scheme for the diagnosis of tubal cancer as follows. First of all, the adnexal masses are evaluated with IOTA simple rules. If the B-rules could be applied, tubal cancer is reliably excluded. If the M-rules could be applied or the result is inconclusive, careful delineation of the mass with pattern recognition should be performed. PMID:29172273
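    The reported sensitivity and specificity follow directly from the confusion-matrix counts. With 13 of 15 cancers flagged and specificity of 97.4% over the 467 non-cancer masses, the implied true-negative count is about 455 (the exact split of negatives is our back-calculation, not stated in the abstract):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

# counts from the abstract: 13/15 cancers detected; ~455/467 negatives cleared
sens, spec = sens_spec(tp=13, fn=2, tn=455, fp=12)
print(round(100 * sens, 1), round(100 * spec, 1))  # 86.7 97.4
```

    Reconstructing counts this way is a useful sanity check when only percentages are reported.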

  5. Sonographic Diagnosis of Tubal Cancer with IOTA Simple Rules Plus Pattern Recognition

    PubMed

    Tongsong, Theera; Wanapirak, Chanane; Tantipalakorn, Charuwan; Tinnangwattana, Dangcheewan

    2017-11-26

    Objective: To evaluate diagnostic performance of IOTA simple rules plus pattern recognition in predicting tubal cancer. Methods: Secondary analysis was performed on the prospective database of our IOTA project. The patients recruited in the project were those who were scheduled for pelvic surgery due to adnexal masses. The patients underwent ultrasound examinations within 24 hours before surgery. On ultrasound examination, the masses were evaluated using the well-established IOTA simple rules plus pattern recognition (sausage-shaped appearance, incomplete septum, visible ipsilateral ovaries) to predict tubal cancer. The gold standard diagnosis was based on histological findings or operative findings. Results: A total of 482 patients, including 15 cases of tubal cancer, were evaluated by ultrasound preoperatively. The IOTA simple rules plus pattern recognition gave a sensitivity of 86.7% (13 in 15) and specificity of 97.4%. Sausage-shaped appearance was identified in nearly all cases (14 in 15). Incomplete septa and normal ovaries could be identified in 33.3% and 40%, respectively. Conclusion: IOTA simple rules plus pattern recognition is relatively effective in predicting tubal cancer. Thus, we propose a simple scheme for the diagnosis of tubal cancer as follows. First of all, the adnexal masses are evaluated with IOTA simple rules. If the B-rules could be applied, tubal cancer is reliably excluded. If the M-rules could be applied or the result is inconclusive, careful delineation of the mass with pattern recognition should be performed.

  6. Analysis of NASA JP-4 fire tests data and development of a simple fire model

    NASA Technical Reports Server (NTRS)

    Raj, P.

    1980-01-01

    The temperature, velocity and species concentration data obtained during the NASA fire tests (3m, 7.5m and 15m diameter JP-4 fires) were analyzed. Utilizing the data analysis, a simple theoretical model was formulated to predict the temperature and velocity profiles in JP-4 fires. The theoretical model, which does not take into account the detailed chemistry of combustion, is capable of predicting the extent of necking of the fire near its base.

  7. Ontology-aided feature correlation for multi-modal urban sensing

    NASA Astrophysics Data System (ADS)

    Misra, Archan; Lantra, Zaman; Jayarajah, Kasthuri

    2016-05-01

    The paper explores the use of correlation across features extracted from different sensing channels to help in urban situational understanding. We use real-world datasets to show how such correlation can improve the accuracy of detection of city-wide events by combining metadata analysis with image analysis of Instagram content. We demonstrate this through a case study on the Singapore Haze. We show that simple ontological relationships and reasoning can significantly help in automating such correlation-based understanding of transient urban events.
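    Correlating features across sensing channels, as described here, often reduces to computing correlation between two daily time series — for instance, haze-related post counts from metadata against an image-derived haze score. The series below are hypothetical stand-ins for such features:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# hypothetical daily features: haze-tagged post counts vs. image greyness scores
posts = [12, 15, 14, 60, 85, 80, 20]
grey = [0.2, 0.25, 0.2, 0.7, 0.9, 0.85, 0.3]
print(round(pearson(posts, grey), 2))
```

    An ontology's role in such a pipeline is to decide *which* feature pairs are semantically related enough to be worth correlating at all.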

  8. Rare Cell Separation and Analysis by Magnetic Sorting

    PubMed Central

    Zborowski, Maciej; Chalmers, Jeffrey J.

    2011-01-01

    Summary The separation and/or isolation of rare cells using magnetic forces is commonly used and growing in use, ranging from simple sample preparation for further studies to an FDA-approved clinical diagnostic test. This growth is the result of both the demand to obtain homogeneous rare cells for molecular analysis and the dramatic increases in the power of permanent magnets, which even allow the separation of some unlabeled cells based on intrinsic magnetic moments, such as malaria parasite-infected red blood cells. PMID:21812408
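    The physics of label-free magnetic separation comes down to balancing the magnetic body force against Stokes drag. The sketch below uses textbook formulas with hypothetical numbers (cell radius, susceptibility difference, and field gradient are assumptions, not values from the review):

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def magnetic_velocity(radius_m, d_chi, grad_b2, viscosity=1e-3):
    """Terminal velocity of a spherical cell in a field gradient:
    F = V * d_chi * grad(B^2) / (2*mu0), balanced by Stokes drag 6*pi*eta*r*v."""
    volume = (4 / 3) * math.pi * radius_m ** 3
    force = volume * d_chi * grad_b2 / (2 * MU0)
    return force / (6 * math.pi * viscosity * radius_m)

# hypothetical values: RBC-sized cell, small intrinsic susceptibility difference
v = magnetic_velocity(radius_m=4e-6, d_chi=3e-6, grad_b2=100.0)
print(round(v * 1e6, 2))  # 0.42  (micrometers per second)
```

    Sub-micrometer-per-second drift speeds of this order are why intrinsic-moment separations (e.g. infected red cells) require strong gradients and long residence times.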

  9. CLMNANAL: A C++ program for application of the Coleman stability analysis to rotorcraft

    NASA Technical Reports Server (NTRS)

    Lance, Michael B.

    1996-01-01

    This program is an adaptation of the theory of Robert P. Coleman and Arnold M. Feingold as presented in NACA Report 1351, 1958. This theory provided a method for the analysis of multiple-bladed rotor systems to determine the system susceptibility to ground resonance. Their treatment also provided a simple means for determining the required product of rotor and chassis damping factors to suppress the resonance. This C++ program is based on a FORTRAN 77 version of a similar code.

  10. Increased depth-diameter ratios in the Medusae Fossae Formation deposits of Mars

    NASA Technical Reports Server (NTRS)

    Barlow, N. G.

    1993-01-01

    Depth to diameter ratios for fresh impact craters on Mars are commonly cited as approximately 0.2 for simple craters and 0.1 for complex craters. Recent computation of depth-diameter ratios in the Amazonis-Memnonia region of Mars indicates that craters within the Medusae Fossae Formation deposits found in this region display greater depth-diameter ratios than expected for both simple and complex craters. Photoclinometric and shadow length techniques have been used to obtain depths of craters within the Amazonis-Memnonia region. Thirty-seven craters in the 2 to 29 km diameter range displaying fresh impact morphologies were identified in the area of study. This region includes the Amazonian aged upper and middle members of the Medusae Fossae Formation and Noachian aged cratered and hilly units. The Medusae Fossae Formation is characterized by extensive, flat to gently undulating deposits of controversial origin. These deposits appear to vary from friable to indurated. Early analysis of crater degradation in the Medusae Fossae region suggested that simple craters excavated to greater depths than expected based on the general depth-diameter relationships derived for Mars. However, too few craters were available in the initial analysis to estimate the actual depth-diameter ratios within this region. Although the analysis is continuing, we are now beginning to see a convergence towards specific values for the depth-diameter ratio depending on geologic unit.
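    The shadow-length technique mentioned above estimates depth from the length of the shadow cast on the crater floor and the solar elevation angle. The numbers below are a hypothetical example, not measurements from the study:

```python
import math

def depth_from_shadow(shadow_len_km, sun_elevation_deg):
    """Crater depth from interior shadow length (flat-floor approximation):
    d = L * tan(solar elevation)."""
    return shadow_len_km * math.tan(math.radians(sun_elevation_deg))

# hypothetical fresh simple crater: 1.1 km shadow, 20 deg sun, 2.0 km diameter
d = depth_from_shadow(1.1, 20.0)
print(round(d / 2.0, 2))  # depth/diameter ratio near the canonical 0.2
```

    Photoclinometry complements this by recovering slope profiles from brightness variations where shadows are absent.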

  11. Airplane Stress Analysis

    NASA Technical Reports Server (NTRS)

    Zahm, A F; Crook, L H

    1918-01-01

    Report presents stress analysis of individual components of an airplane. Normal and abnormal loads, sudden loads, simple stresses, indirect simple stresses, resultant unit stress, repetitive and equivalent stress, maximum steady load and stress are considered.

  12. Sample injection and electrophoretic separation on a simple laminated paper based analytical device.

    PubMed

    Xu, Chunxiu; Zhong, Minghua; Cai, Longfei; Zheng, Qingyu; Zhang, Xiaojun

    2016-02-01

    We described a strategy to perform multistep operations on a simple laminated paper-based separation device by using electrokinetic flow to manipulate the fluids. A laminated crossed-channel paper-based separation device was fabricated by cutting a filter paper sheet followed by lamination. Multiple function units including sample loading, sample injection, and electrophoretic separation were integrated on a single paper-based analytical device for the first time, by applying potential at different reservoirs for sample, sample waste, buffer, and buffer waste. As a proof-of-concept demonstration, a mixed sample solution containing carmine and sunset yellow was loaded in the sampling channel, and then injected into the separation channel followed by electrophoretic separation, by adjusting the potentials applied at the four terminals of the sampling and separation channels. The effects of buffer pH, buffer concentration, channel width, and separation time on resolution of electrophoretic separation were studied. This strategy may be used to perform multistep operations such as reagent dilution, sample injection, mixing, reaction, and separation on a single microfluidic paper-based analytical device, which is very attractive for building micro total analysis systems on microfluidic paper-based analytical devices. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
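    Separation of two dyes in such a channel is governed by their electrophoretic mobilities: transit time over a channel of length L at voltage V is t = L²/(μV). The mobilities and geometry below are hypothetical illustrations, not measured values for carmine and sunset yellow:

```python
def migration_time(channel_len_cm, mobility_cm2_Vs, voltage_V):
    """Time for an analyte band to traverse the separation channel:
    v = mu * E = mu * V / L, so t = L^2 / (mu * V)."""
    return channel_len_cm ** 2 / (mobility_cm2_Vs * voltage_V)

# hypothetical dyes on a 4 cm paper channel at 300 V
t_carmine = migration_time(4.0, 2.0e-4, 300.0)
t_sunset = migration_time(4.0, 3.5e-4, 300.0)
print(round(t_carmine - t_sunset, 1))  # 114.3 (seconds of arrival-time gap)
```

    The arrival-time gap, together with band broadening, is what sets the resolution the authors optimized via pH, buffer concentration, and channel width.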

  13. Full quantum mechanical analysis of atomic three-grating Mach–Zehnder interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanz, A.S., E-mail: asanz@iff.csic.es; Davidović, M.; Božić, M.

    2015-02-15

    Atomic three-grating Mach–Zehnder interferometry constitutes an important tool to probe fundamental aspects of the quantum theory. There is, however, a remarkable gap in the literature between the oversimplified models and robust numerical simulations considered to describe the corresponding experiments. Consequently, the former usually lead to paradoxical scenarios, such as the wave–particle dual behavior of atoms, while the latter make it difficult to analyze the data in simple terms. Here these issues are tackled by means of a simple grating working model consisting of evenly-spaced Gaussian slits. As is shown, this model suffices to explore and explain such experiments both analytically and numerically, giving a good account of the full atomic journey inside the interferometer, and hence helping to demystify the physics involved. More specifically, it provides a clear and unambiguous picture of the wavefront splitting that takes place inside the interferometer, illustrating how the momentum along each emerging diffraction order is well defined even though the wave function itself still displays a rather complex shape. To this end, the local transverse momentum is also introduced in this context as a reliable analytical tool. The splitting, apart from being a key issue to understand atomic Mach–Zehnder interferometry, also demonstrates at a fundamental level how wave and particle aspects are always present in the experiment, without incurring any contradiction or interpretive paradox. On the other hand, at a practical level, the generality and versatility of the model and methodology presented make them suitable to attack analogous problems in a simple manner after convenient tuning. - Highlights: • A simple model is proposed to analyze experiments based on atomic Mach–Zehnder interferometry. • The model can be easily handled both analytically and computationally. • A theoretical analysis based on the combination of the position and momentum representations is considered. • Wave and particle aspects are shown to coexist within the same experiment, thus removing the old wave-corpuscle dichotomy. • A good agreement between numerical simulations and experimental data is found without appealing to best-fit procedures.

  14. Wastewater-Based Epidemiology of Stimulant Drugs: Functional Data Analysis Compared to Traditional Statistical Methods.

    PubMed

    Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo

    2015-01-01

    Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). The first three FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, even where the two showed concordant results. Geographical location was the main predictor of the general level of the drug load. FDA of WBE data extracts more detailed information about drug load patterns during the week that is not identified by more traditional statistical methods. Results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariate information.
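    The core of FPCA on weekly curves can be sketched with plain SVD on mean-centered city-by-day data: the leading component typically captures overall level, just as FPC1 does in the study. The daily loads below are hypothetical, not the 42-city dataset:

```python
import numpy as np

# hypothetical daily drug loads (mg/day/1000 people) for 5 cities, Mon..Sun
loads = np.array([
    [310, 300, 305, 320, 340, 420, 400],
    [150, 145, 155, 150, 160, 210, 200],
    [500, 510, 505, 495, 520, 610, 590],
    [ 90,  95,  92,  88, 100, 130, 125],
    [260, 255, 265, 270, 280, 330, 320],
], dtype=float)

centered = loads - loads.mean(axis=0)   # remove the mean weekly curve
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)     # variance share per component
scores = centered @ vt[0]               # city scores on the first component
print(round(float(explained[0]), 3))    # FPC1 dominates when level differences do
```

    In the paper the city scores on each component then serve as regression outcomes against city and country covariates.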

  15. Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels

    PubMed Central

    Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V.; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R.

    2018-01-01

    Background: Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for squamous epithelium cervical intraepithelial neoplasia (CIN) classification into normal, CIN1, CIN2, and CIN3 grades. Methods: In this study, a deep learning (DL)-based nuclei segmentation approach is investigated based on gathering localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. Results: The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. Conclusions: The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods. PMID:29619277
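    Simple linear iterative clustering (SLIC) is essentially k-means on joint color-plus-position features. The toy version below captures that idea in plain NumPy on a synthetic two-region image; it omits SLIC's localized search windows and is only a sketch of the principle:

```python
import numpy as np

def slic_like(img, k=4, m=10.0, iters=10, seed=0):
    """Toy SLIC-style superpixels: k-means on (gray, m*x, m*y) features,
    where m weights spatial proximity against intensity similarity."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.stack([img.ravel(), m * xs.ravel() / w, m * ys.ravel() / h], axis=1)
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(iters):
        d = ((feats[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if np.any(labels == j):      # guard against empty clusters
                centers[j] = feats[labels == j].mean(0)
    return labels.reshape(h, w)

img = np.zeros((8, 8)); img[:, 4:] = 1.0   # synthetic two-region "tissue" image
labels = slic_like(img, k=2)
print(labels.shape, len(np.unique(labels)))
```

    In the paper's pipeline, patches gathered around such superpixels (rather than raw pixels) feed the convolutional network, localizing the learned features.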

  16. Experimental evaluation of dynamic data allocation strategies in a distributed database with changing workloads

    NASA Technical Reports Server (NTRS)

    Brunstrom, Anna; Leutenegger, Scott T.; Simha, Rahul

    1995-01-01

    Traditionally, allocation of data in distributed database management systems has been determined by off-line analysis and optimization. This technique works well for static database access patterns, but is often inadequate for frequently changing workloads. In this paper we address how to dynamically reallocate data for partitionable distributed databases with changing access patterns. Rather than complicated and expensive optimization algorithms, a simple heuristic is presented and shown, via an implementation study, to improve system throughput by 30 percent in a local-area-network-based system. Based on artificial wide-area-network delays, we show that dynamic reallocation can improve system throughput by a factor of two and a half for wide-area networks. We also show that individual site load must be taken into consideration when reallocating data, and provide a simple policy that incorporates load in the reallocation decision.
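    A load-aware reallocation heuristic of the kind described can be sketched as: place each partition at the site issuing most of its accesses, unless that site is overloaded. This policy and its threshold are illustrative assumptions, not the paper's exact rule:

```python
def reallocate(partition_access, site_load, load_cap=0.8):
    """Load-aware placement sketch.
    partition_access: {partition: {site: access_count}}
    site_load: {site: utilization in [0, 1]}"""
    placement = {}
    for part, accesses in partition_access.items():
        best = max(accesses, key=accesses.get)      # site with most accesses
        if site_load[best] <= load_cap:
            placement[part] = best
        else:                                       # overloaded: fall back
            placement[part] = min(site_load, key=site_load.get)
    return placement

access = {"orders": {"A": 90, "B": 10}, "users": {"A": 20, "B": 80}}
load = {"A": 0.95, "B": 0.40}
print(reallocate(access, load))  # {'orders': 'B', 'users': 'B'}
```

    Without the load check, both hot partitions would pile onto site A — exactly the effect the paper's load-incorporating policy avoids.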

  17. Interpretation with a Donnan-based concept of the influence of simple salt concentration on the apparent binding of divalent ions to the polyelectrolytes polystyrenesulfonate and dextran sulfate

    USGS Publications Warehouse

    Marinsky, J.A.; Baldwin, Robert F.; Reddy, M.M.

    1985-01-01

    It has been shown that the apparent enhancement of divalent metal ion binding to polyions such as polystyrenesulfonate (PSS) and dextran sulfate (DS) by decreasing the ionic strength of these mixed counterion systems (M2+, M+, X-, polyion) can be anticipated with the Donnan-based model developed by one of us (J.A.M.). Ion-exchange distribution methods have been employed to measure the removal by the polyion of trace divalent metal ion from simple salt (NaClO4)-polyion (NaPSS) mixtures. These data and polyion interaction data published earlier by Mattai and Kwak for the mixed counterion systems MgCl2-LiCl-DS and MgCl2-CsCl-DS have been shown to be amenable to rather precise analysis by this model. ?? 1985 American Chemical Society.

  18. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major uncertainty in data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects that are dominated by single scattering, where photons from the laser beam scatter only once off particles in the atmosphere before reaching the receiver, and a simple linear relationship between physical properties and the lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy, and phytoplankton. While multiple-scattering returns are clear signals, the lack of a fast-enough lidar multiple-scattering computation tool forces us to treat them as unwanted "noise" and use simple multiple-scattering correction schemes to remove them. Such treatments waste the multiple-scattering signal and may cause orders-of-magnitude errors in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple-scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is handled with Monte Carlo simulations. Monte Carlo simulations take minutes to hours, are too slow for interactive satellite data analysis, and can only be used to support system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: perform a Laplace transform on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem. The majority of the radiative transfer computation goes to matrix inversion, FFT, and inverse Laplace transforms. 2. Hardware solution: perform the well-defined matrix inversion, FFT, and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.
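    The reduction of multiple scattering to a matrix inversion can be illustrated with a toy layered model: the total source in each layer is the direct beam plus light scattered in from every other layer, so J = q + P·J, and one solve of (I − P)·J = q sums all scattering orders at once. The kernel and albedo below are simplified assumptions, not the authors' transformed equations:

```python
import numpy as np

# Toy multiple-scattering solve: J = q + P J  =>  (I - P) J = q.
# One matrix inversion replaces the infinite sum over scattering orders.
n = 50
dz = 1.0 / n
tau = np.linspace(dz / 2, 1 - dz / 2, n)       # layer optical depths
omega = 0.9                                     # single-scattering albedo (assumed)
q = np.exp(-tau) * dz                           # direct-beam deposition per layer
# simplified exponential coupling kernel between layers (illustrative)
P = omega * 0.5 * np.exp(-np.abs(tau[:, None] - tau[None, :])) * dz
J = np.linalg.solve(np.eye(n) - P, q)
print(J.sum() > q.sum())  # multiple scattering adds signal beyond single scattering
```

    In the presentation's scheme, the Laplace/Fourier transforms produce matrix problems of exactly this well-defined, hardware-friendly shape.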

  19. A Thermal Management Systems Model for the NASA GTX RBCC Concept

    NASA Technical Reports Server (NTRS)

    Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)

    2002-01-01

    The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.
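    Engineering-level thermal management models of the VITMAC type reduce hardware to a network of lumped thermal nodes exchanging heat through conductances. The minimal sketch below integrates such a network explicitly; all capacitances, conductances, and heat loads are hypothetical:

```python
def step(T, C, G, Q, dt):
    """One explicit-Euler step of a lumped thermal network:
    C_i dT_i/dt = sum_j G_ij (T_j - T_i) + Q_i."""
    n = len(T)
    Tn = T[:]
    for i in range(n):
        flow = sum(G[i][j] * (T[j] - T[i]) for j in range(n))
        Tn[i] = T[i] + dt * (flow + Q[i]) / C[i]
    return Tn

T = [300.0, 300.0]              # K: wall node, coolant node
C = [500.0, 2000.0]             # J/K thermal capacitances
G = [[0.0, 40.0], [40.0, 0.0]]  # W/K conductance between nodes
Q = [5000.0, 0.0]               # W: heat into the wall
for _ in range(1000):
    T = step(T, C, G, Q, dt=0.1)
print(round(T[0] - T[1], 1))  # quasi-steady delta: Q*C2/(G*(C1+C2)) = 100.0 K
```

    A systems tool like VITMAC layers many such nodes, fluid loops, and property models (CEA/RJPA in this case) behind the GUI, but the underlying balance equations have this form.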

  20. Prediction of aircraft handling qualities using analytical models of the human pilot

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1982-01-01

    The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
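    A much simpler stand-in for this kind of pilot/vehicle analysis is the classical crossover model, in which the open loop behaves like an integrator with an effective pilot delay; a small phase margin then hints at PIO susceptibility. This is not the OCM-based criterion of the paper, and the numbers are hypothetical:

```python
import math

def phase_margin_deg(wc, tau):
    """Phase margin of the crossover model Y(jw) = wc * exp(-j*w*tau) / (j*w):
    at the crossover frequency wc, PM = 180 - 90 - wc*tau (in degrees)."""
    return 180.0 - 90.0 - math.degrees(wc * tau)

# hypothetical crossover frequency 4 rad/s, effective pilot delay 0.3 s
pm = phase_margin_deg(4.0, 0.3)
print(round(pm, 1))  # 21.2 — a low margin suggests oscillation-prone closed loop
```

    Tightening the task (higher crossover frequency) or added pilot delay both erode this margin, which is the intuitive mechanism behind PIO.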

  1. Prediction of aircraft handling qualities using analytical models of the human pilot

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1982-01-01

    The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.

  2. Comments on the variational modified-hypernetted-chain theory for simple fluids

    NASA Astrophysics Data System (ADS)

    Rosenfeld, Yaakov

    1986-02-01

    The variational modified-hypernetted-chain (VMHNC) theory, based on the approximation of universality of the bridge functions, is reformulated. The new formulation includes recent calculations by Lado and by Lado, Foiles, and Ashcroft, as two stages in a systematic approach which is analyzed. A variational iterative procedure for solving the exact (diagrammatic) equations for the fluid structure which is formally identical to the VMHNC is described, featuring the theory of simple classical fluids as a one-iteration theory. An accurate method for calculating the pair structure for a given potential, and for inverting structure factor data in order to obtain the potential and the thermodynamic functions, follows from our analysis.

  3. An analytical approach for predicting pilot induced oscillations

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1981-01-01

    The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.

  4. Electrochemistry and analytical determination of lysergic acid diethylamide (LSD) via adsorptive stripping voltammetry.

    PubMed

    Merli, Daniele; Zamboni, Daniele; Protti, Stefano; Pesavento, Maria; Profumo, Antonella

    2014-12-01

    Lysergic acid diethylamide (LSD) is difficult to detect and quantify in biological samples because of its low active dose. Although several analytical tests are available, routine analysis of this drug is rarely performed. In this article, we report a simple and accurate method for the determination of LSD, based on adsorptive stripping voltammetry in DMF/tetrabutylammonium perchlorate, with a linear range of 1-90 ng L(-1) for deposition times of 50 s. An LOD of 1.4 ng L(-1) and an LOQ of 4.3 ng L(-1) were found. The method can also be applied to biological samples after a simple extraction with 1-chlorobutane. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. USGS Coal Desorption Equipment and a Spreadsheet for Analysis of Lost and Total Gas from Canister Desorption Measurements

    USGS Publications Warehouse

    Barker, Charles E.; Dallegge, Todd A.; Clark, Arthur C.

    2002-01-01

    We have updated a simple polyvinyl chloride plastic canister design by adding internal headspace temperature measurement, and redesigned it so it is made with mostly off-the-shelf components for ease of construction. Using self-closing quick connects, this basic canister is mated to a zero-head manometer to make a simple coalbed methane desorption system that is easily transported in small aircraft to remote localities. This equipment is used to gather timed measurements of pressure, volume and temperature data that are corrected to standard pressure and temperature (STP) and graphically analyzed using an Excel™-based spreadsheet. Used together, these elements form an effective, practical canister desorption method.
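    The STP correction applied to the canister readings follows directly from the ideal gas law. A minimal sketch (the pressure, volume, and temperature values below are hypothetical, not USGS data):

```python
def volume_at_stp(v_measured_cc, p_measured_kpa, t_measured_k,
                  p_stp_kpa=101.325, t_stp_k=273.15):
    """Correct a measured gas volume to standard pressure and temperature
    via the ideal gas law: V_stp = V * (P / P_stp) * (T_stp / T)."""
    return v_measured_cc * (p_measured_kpa / p_stp_kpa) * (t_stp_k / t_measured_k)

# Hypothetical canister reading: 50 cc of desorbed gas at 95 kPa and 290 K.
v_stp = volume_at_stp(50.0, 95.0, 290.0)
```

    Note that STP conventions vary (e.g., 273.15 K versus the 288.7 K common in petroleum work); the reference values are keyword parameters so they can be matched to whichever convention a given spreadsheet uses.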

  6. A Simple Device for Lens-to-Sample Distance Adjustment in Laser-Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Cortez, Juliana; Farias Filho, Benedito B; Fontes, Laiane M; Pasquini, Celio; Raimundo, Ivo M; Pimentel, Maria Fernanda; de Souza Lins Borba, Flávia

    2017-04-01

    A simple device based on two commercial laser pointers is described to assist in the analysis of samples that present uneven surfaces and/or irregular shapes using laser-induced breakdown spectroscopy (LIBS). The device allows for easy positioning of the sample surface at a reproducible distance from the focusing lens that conveys the laser pulse to generate the micro-plasma in a LIBS system, with reproducibility better than ±0.2 mm. In this way, fluctuations in the fluence (J cm(-2)) are minimized and the LIBS analytical signals can be obtained with better precision even when samples with irregular surfaces are probed.

  7. GSA-PCA: gene set generation by principal component analysis of the Laplacian matrix of a metabolic network

    PubMed Central

    2012-01-01

    Background Gene Set Analysis (GSA) has proven to be a useful approach to microarray analysis. However, most of the method development for GSA has focused on the statistical tests to be used rather than on the generation of sets that will be tested. Existing methods of set generation are often overly simplistic. The creation of sets from individual pathways (in isolation) is a poor reflection of the complexity of the underlying metabolic network. We have developed a novel approach to set generation via the use of Principal Component Analysis of the Laplacian matrix of a metabolic network. We have analysed a relatively simple data set to show the difference in results between our method and the current state-of-the-art pathway-based sets. Results The sets generated with this method are semi-exhaustive and capture much of the topological complexity of the metabolic network. The semi-exhaustive nature of this method has also allowed us to design a hypergeometric enrichment test to determine which genes are likely responsible for set significance. We show that our method finds significant aspects of biology that would be missed (i.e. false negatives) and addresses the false positive rates found with the use of simple pathway-based sets. Conclusions The set generation step for GSA is often neglected but is a crucial part of the analysis as it defines the full context for the analysis. As such, set generation methods should be robust and yield as complete a representation of the extant biological knowledge as possible. The method reported here achieves this goal and is demonstrably superior to previous set analysis methods. PMID:22876834
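    The hypergeometric enrichment test used to flag the genes likely responsible for set significance can be sketched directly from the hypergeometric distribution (the gene counts below are hypothetical):

```python
from math import comb

def hypergeom_enrichment_p(k, n, K, N):
    """Upper-tail hypergeometric p-value: probability of seeing k or more
    'significant' genes in a set of size n drawn from N genes total, of
    which K are significant (math.comb returns 0 when n - i > N - K)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Hypothetical counts: 8 genes of a 20-gene set are significant, out of
# 1000 genes in total with 50 significant overall.
p = hypergeom_enrichment_p(8, 20, 50, 1000)
```

    With an expected count of only 20 * 50 / 1000 = 1 significant gene per set, observing 8 yields a very small p-value, i.e., strong evidence of enrichment.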

  8. A new design approach to MMI-based (de)multiplexers

    NASA Astrophysics Data System (ADS)

    Yueyu, Xiao; Sailing, He

    2004-09-01

    A novel design method for wavelength (de)multiplexers is presented. The output spectral response of a (de)multiplexer is designed from the viewpoint of FIR filters. Avoiding laborious mathematical analysis, the (de)multiplexer is analyzed and designed with this explicit and simple method. A four-channel (de)multiplexer based on multimode interference (MMI) is designed as an example. The result obtained agrees with that of the commonly used method and is verified by a finite-difference beam propagation method (FDBPM) simulation.

  9. Parametric Study of Biconic Re-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Steele, Bryan; Banks, Daniel W.; Whitmore, Stephen A.

    2007-01-01

    An optimization based on hypersonic aerodynamic performance and volumetric efficiency was accomplished for a range of biconic configurations. Both axisymmetric and quasi-axisymmetric geometries (bent and flattened) were analyzed. The aerodynamic optimization was based on hypersonic simple incidence-angle analysis tools. The range of configurations included those suitable for a lunar return trajectory with a lifting aerocapture at Earth and an overall volume that could support a nominal crew. The results yielded five configurations that had acceptable aerodynamic performance and met overall geometry and size limitations.

  10. Microfluidic volumetric flow determination using optical coherence tomography speckle: An autocorrelation approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Pretto, Lucas R., E-mail: lucas.de.pretto@usp.br; Nogueira, Gesse E. C.; Freitas, Anderson Z.

    2016-04-28

    Functional modalities of Optical Coherence Tomography (OCT) based on speckle analysis are emerging in the literature. We propose a simple approach to the autocorrelation of OCT signal to enable volumetric flow rate differentiation, based on decorrelation time. Our results show that this technique could distinguish flows separated by 3 μl/min, limited by the acquisition speed of the system. We further perform a B-scan of gradient flow inside a microchannel, enabling the visualization of the drag effect on the walls.
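    A minimal sketch of the decorrelation-time idea: compute the normalized autocorrelation of the speckle intensity and take the first lag at which it falls below 1/e. The signals below are synthetic stand-ins for OCT data, and the exact estimator used in the paper may differ:

```python
import numpy as np

def decorrelation_time(intensity, dt, threshold=1.0 / np.e):
    """First lag at which the normalized autocorrelation of the
    mean-subtracted intensity drops below the threshold."""
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]  # lags 0..N-1
    acf = acf / acf[0]
    below = np.nonzero(acf < threshold)[0]
    return below[0] * dt if below.size else None

# Synthetic signals sampled at 1 kHz: a slowly varying tone decorrelates
# slowly; white noise (the fast-flow limit) decorrelates almost immediately.
t = np.arange(0.0, 1.0, 1e-3)
slow = np.sin(2.0 * np.pi * 1.0 * t)
noise = np.random.default_rng(0).standard_normal(t.size)
```

    Higher flow rates randomize the speckle faster, so shorter decorrelation times map to larger volumetric flows, which is the basis of the differentiation reported above.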

  11. Software Development Cost Estimation Executive Summary

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Menzies, Tim

    2006-01-01

    Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.

  12. Use of FTA® classic cards for epigenetic analysis of sperm DNA.

    PubMed

    Serra, Olga; Frazzi, Raffaele; Perotti, Alessio; Barusi, Lorenzo; Buschini, Annamaria

    2018-02-01

    FTA® technologies provide the most reliable method for DNA extraction. Although FTA technologies have been widely used for genetic analysis, there is no literature on their use for epigenetic analysis yet. We present for the first time, a simple method for quantitative methylation assessment based on sperm cells stored on Whatman FTA classic cards. Specifically, elution of seminal DNA from FTA classic cards was successfully tested with an elution buffer and an incubation step in a thermocycler. The eluted DNA was bisulfite converted, amplified by PCR, and a region of interest was pyrosequenced.

  13. Building a common pipeline for rule-based document classification.

    PubMed

    Patterson, Olga V; Ginter, Thomas; DuVall, Scott L

    2013-01-01

    Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS-based pipeline for classification. Our proposed methodology, coupled with the general-purpose solution, provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.

  14. Improving Public Perception of Behavior Analysis.

    PubMed

    Freedman, David H

    2016-05-01

    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.

  15. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional peak differential analysis with false discovery rate (FDR) control over the concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports large-scale online MS data uploading and analysis with a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
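    The abstract does not name the FDR procedure; a standard choice for many concurrent peak-wise tests is the Benjamini-Hochberg step-up procedure, sketched here with hypothetical p-values:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of p-values declared significant by the Benjamini-Hochberg
    step-up procedure, controlling the false discovery rate at level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):  # rank = 1..m in sorted order
        if pvals[i] <= rank * alpha / m:
            k_max = rank  # keep the largest rank passing its threshold
    return sorted(order[:k_max])

# Hypothetical peak-wise p-values; the three smallest survive at alpha = 0.05.
hits = benjamini_hochberg([0.001, 0.013, 0.021, 0.19, 0.74])  # → [0, 1, 2]
```

    The step-up rule accepts every p-value up to the largest rank k with p(k) <= k * alpha / m, which is less conservative than a Bonferroni cut-off while still bounding the expected fraction of false discoveries.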

  16. An enrichment method based on synergistic and reversible covalent interactions for large-scale analysis of glycoproteins.

    PubMed

    Xiao, Haopeng; Chen, Weixuan; Smeekens, Johanna M; Wu, Ronghu

    2018-04-27

    Protein glycosylation is ubiquitous in biological systems and essential for cell survival. However, the heterogeneity of glycans and the low abundance of many glycoproteins complicate their global analysis. Chemical methods based on reversible covalent interactions between boronic acid and glycans have great potential to enrich glycopeptides, but the binding affinity is typically not strong enough to capture low-abundance species. Here, we develop a strategy using dendrimer-conjugated benzoboroxole to enhance the glycopeptide enrichment. We test the performance of several boronic acid derivatives, showing that benzoboroxole markedly increases glycopeptide coverage from human cell lysates. The enrichment is further improved by conjugating benzoboroxole to a dendrimer, which enables synergistic benzoboroxole-glycan interactions. This robust and simple method is highly effective for sensitive glycoproteomics analysis, especially capturing low-abundance glycopeptides. Importantly, the enriched glycopeptides remain intact, making the current method compatible with mass-spectrometry-based approaches to identify glycosylation sites and glycan structures.

  17. Taxonomic discrimination of higher plants by pyrolysis mass spectrometry.

    PubMed

    Kim, S W; Ban, S H; Chung, H J; Choi, D W; Choi, P S; Yoo, O J; Liu, J R

    2004-02-01

    Pyrolysis mass spectrometry (PyMS) is a rapid, simple, high-resolution analytical method based on thermal degradation of complex material in a vacuum and has been widely applied to the discrimination of closely related microbial strains. Leaf samples of six species and one variety of higher plants (Rosa multiflora, R. multiflora var. platyphylla, Sedum kamtschaticum, S. takesimense, S. sarmentosum, Hepatica insularis, and H. asiatica) were subjected to PyMS for spectral fingerprinting. Principal component analysis of PyMS data was not able to discriminate these plants in discrete clusters. However, canonical variate analysis of PyMS data separated these plants from one another. A hierarchical dendrogram based on canonical variate analysis was in agreement with the known taxonomy of the plants at the variety level. These results indicate that PyMS is able to discriminate higher plants based on taxonomic classification at the family, genus, species, and variety level.

  18. Nailfold capillaroscopy for day-to-day clinical use: construction of a simple scoring modality as a clinical prognostic index for digital trophic lesions.

    PubMed

    Smith, Vanessa; De Keyser, Filip; Pizzorni, Carmen; Van Praet, Jens T; Decuman, Saskia; Sulli, Alberto; Deschepper, Ellen; Cutolo, Maurizio

    2011-01-01

    Construction of a simple nailfold videocapillaroscopic (NVC) scoring modality as a prognostic index for digital trophic lesions for day-to-day clinical use. An association with a single simple (semi)-quantitatively scored NVC parameter, mean score of capillary loss, was explored in 71 consecutive patients with systemic sclerosis (SSc), and the reliability of reducing the number of investigated fields (F32-F16-F8-F4) was assessed. The cut-off value of the prognostic index (mean score of capillary loss calculated over a reduced number of fields) for present/future digital trophic lesions was selected by receiver operating characteristic (ROC) analysis. Reduction in the number of fields for mean score of capillary loss was reliable from F32 to F8 (intraclass correlation coefficient of F16/F32: 0.97; F8/F32: 0.90). Based on ROC analysis, a prognostic index (mean score of capillary loss as calculated over F8) with a cut-off value of 1.67 is proposed. This value has a sensitivity of 72.22/70.00, specificity of 70.59/69.77, a positive likelihood ratio of 2.46/2.32 and a negative likelihood ratio of 0.39/0.43 for present/future digital trophic lesions. A simple prognostic index for digital trophic lesions for daily use in SSc clinics is proposed, limited to the mean score of capillary loss as calculated over eight fields (8 fingers, 1 field per finger).
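    The reported likelihood ratios follow directly from the sensitivity and specificity (LR+ = sens / (1 - spec), LR- = (1 - sens) / spec); a quick check against the figures quoted for present digital trophic lesions:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios for a diagnostic cut-off:
    LR+ = sens / (1 - spec), LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Reported figures for present lesions: sensitivity 72.22%, specificity 70.59%.
lr_pos, lr_neg = likelihood_ratios(0.7222, 0.7059)  # → about 2.46 and 0.39
```

    These reproduce the 2.46 and 0.39 quoted in the abstract, confirming the cut-off's reported discriminative performance.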

  19. An Isotopic Dilution Experiment Using Liquid Scintillation: A Simple Two-System, Two-Phase Analysis.

    ERIC Educational Resources Information Center

    Moehs, Peter J.; Levine, Samuel

    1982-01-01

    A simple isotopic dilution analysis whose principles apply to methods of more complex radioanalyses is described. Suitable for clinical and instrumental analysis chemistry students, the experimental manipulations are kept to a minimum, involving only aqueous extraction before counting. Background information, procedures, and results are discussed.…

  20. A simple calculation method for determination of equivalent square field.

    PubMed

    Shafiei, Seyed Ali; Hasanzadeh, Hadi; Shafiei, Seyed Ahmad

    2012-04-01

    Determination of the equivalent square fields for rectangular and shielded fields is of great importance in radiotherapy centers and treatment planning software. This is accomplished using standard tables and empirical formulas. The goal of this paper is to present a formula, based on an analysis of scatter reduction due to the inverse square law, for obtaining the equivalent field. Tables published by different agencies such as the ICRU (International Commission on Radiation Units and Measurements) are based on experimental data, but there also exist mathematical formulas that yield the equivalent square field of an irregular rectangular field and are used extensively in computational techniques for dose determination. These processes lead to some complicated and time-consuming formulas, which motivated the current study. In this work, considering the portion of scattered radiation in the absorbed dose at a point of measurement, a numerical formula was obtained, from which a simple formula was developed to calculate the equivalent square field. Using polar coordinates and the inverse square law leads to a simple formula for calculating the equivalent field. The presented method is an analytical approach with which one can estimate the equivalent square field of a rectangular field, and it may be used for a shielded field or an off-axis point. In addition, the equivalent field of a rectangular field can be calculated to a good approximation from the reduction of scattered radiation with the inverse square law. This method may be useful in computing the Percentage Depth Dose and Tissue-Phantom Ratio, which are extensively used in treatment planning.
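    For comparison with the paper's scatter-based formula (which the abstract does not reproduce), the widely used area-to-perimeter rule of thumb gives the equivalent square side of an a × b field as 4A/P = 2ab/(a + b):

```python
def equivalent_square_side(a, b):
    """Area-to-perimeter (Sterling) rule: a rectangular field a x b is
    approximately equivalent to a square of side 4A/P = 2ab/(a + b).
    This is the common rule of thumb, not the formula derived in the paper."""
    return 2.0 * a * b / (a + b)

side = equivalent_square_side(10.0, 20.0)  # 10 cm x 20 cm field → ~13.3 cm
```

    A square field satisfies the rule exactly (2a·a/(a + a) = a), and for elongated fields the rule reflects the reduced scatter contribution of the long dimension.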

  1. On the derivation of linear irreversible thermodynamics for classical fluids

    PubMed Central

    Theodosopulu, M.; Grecos, A.; Prigogine, I.

    1978-01-01

    We consider the microscopic derivation of the linearized hydrodynamic equations for an arbitrary simple fluid. Our discussion is based on the concept of hydrodynamical modes, and use is made of the ideas and methods of the theory of subdynamics. We also show that this analysis leads to the Gibbs relation for the entropy of the system. PMID:16592516

  2. A Framework for Designing Cluster Randomized Trials with Binary Outcomes

    ERIC Educational Resources Information Center

    Spybrook, Jessaca; Martinez, Andres

    2011-01-01

    The purpose of this paper is to provide a framework for approaching a power analysis for a CRT (cluster randomized trial) with a binary outcome. The authors suggest a framework in the context of a simple CRT and then extend it to a blocked design, or a multi-site cluster randomized trial (MSCRT). The framework is based on proportions, an…

  3. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  4. Graph-Theoretic Properties of Networks Based on Word Association Norms: Implications for Models of Lexical Semantic Memory

    ERIC Educational Resources Information Center

    Gruenenfelder, Thomas M.; Recchia, Gabriel; Rubin, Tim; Jones, Michael N.

    2016-01-01

    We compared the ability of three different contextual models of lexical semantic memory (BEAGLE, Latent Semantic Analysis, and the Topic model) and of a simple associative model (POC) to predict the properties of semantic networks derived from word association norms. None of the semantic models were able to accurately predict all of the network…

  5. OPAD data analysis

    NASA Astrophysics Data System (ADS)

    Buntine, Wray L.; Kraft, Richard; Whitaker, Kevin; Cooper, Anita E.; Powers, W. T.; Wallace, Tim L.

    1993-06-01

    Data obtained in the framework of an Optical Plume Anomaly Detection (OPAD) program intended to create a rocket engine health monitor based on spectrometric detections of anomalous atomic and molecular species in the exhaust plume are analyzed. The major results include techniques for handling data noise, methods for registration of spectra to wavelength, and a simple automatic process for estimating the metallic component of a spectrum.

  6. Moisture transfer through the membrane of a cross-flow energy recovery ventilator: Measurement and simple data-driven modeling

    Treesearch

    CR Boardman; Samuel V. Glass

    2015-01-01

    The moisture transfer effectiveness (or latent effectiveness) of a cross-flow, membrane based energy recovery ventilator is measured and modeled. Analysis of in situ measurements for a full year shows that energy recovery ventilator latent effectiveness increases with increasing average relative humidity and surprisingly increases with decreasing average temperature. A...

  7. Dynamic texture recognition using local binary patterns with an application to facial expressions.

    PubMed

    Zhao, Guoying; Pietikäinen, Matti

    2007-06-01

    Dynamic texture (DT) is an extension of texture to the temporal domain. Description and recognition of DTs have attracted growing attention. In this paper, a novel approach for recognizing DTs is proposed and its simplifications and extensions to facial image analysis are also considered. First, the textures are modeled with volume local binary patterns (VLBP), which are an extension of the LBP operator widely used in ordinary texture analysis, combining motion and appearance. To make the approach computationally simple and easy to extend, only the co-occurrences of the local binary patterns on three orthogonal planes (LBP-TOP) are then considered. A block-based method is also proposed to deal with specific dynamic events such as facial expressions in which local information and its spatial locations should also be taken into account. In experiments with two DT databases, DynTex and Massachusetts Institute of Technology (MIT), both the VLBP and LBP-TOP clearly outperformed the earlier approaches. The proposed block-based method was evaluated with the Cohn-Kanade facial expression database with excellent results. The advantages of our approach include local processing, robustness to monotonic gray-scale changes, and simple computation.
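    The LBP operator underlying VLBP and LBP-TOP thresholds a pixel's neighbors at the center value and packs the results into a bit code; a minimal sketch for one 3x3 patch (bit-ordering conventions vary between implementations):

```python
def lbp_code(patch):
    """Basic 8-neighbor local binary pattern for a 3x3 patch: each neighbor
    contributes a 1-bit if it is >= the center value, read clockwise from
    the top-left neighbor into an 8-bit code."""
    c = patch[1][1]
    neighbors = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                 patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum(1 << i for i, v in enumerate(neighbors) if v >= c)

code = lbp_code([[6, 5, 2],
                 [7, 6, 1],
                 [9, 8, 7]])  # → 241 (binary 11110001)
```

    Because only the sign of each neighbor-center difference is kept, the code is invariant to monotonic gray-scale changes, which is the robustness property cited in the abstract; VLBP and LBP-TOP extend the same thresholding to spatiotemporal neighborhoods.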

  8. Development of a simple fluorescence-based microplate method for the high-throughput analysis of proline in wine samples.

    PubMed

    Robert-Peillard, Fabien; Boudenne, Jean-Luc; Coulomb, Bruno

    2014-05-01

    This paper presents a simple, accurate and multi-sample method for the determination of proline in wines using a 96-well microplate technique. Proline is the most abundant amino acid in wine and is an important parameter related to wine characteristics and the maturation processes of grape. In the current study, an improved application of the general method based on sodium hypochlorite oxidation and o-phthaldialdehyde (OPA)-thiol spectrofluorometric detection is described. The main interfering compounds for specific proline detection in wines are strongly reduced by selective reaction with OPA in a preliminary step under well-defined pH conditions. Application of the protocol after a 500-fold dilution of wine samples provides a working range between 0.02 and 2.90 g L(-1), with a limit of detection of 7.50 mg L(-1). Comparison and validation on real wine samples by ion-exchange chromatography prove that this procedure yields accurate results. The simplicity of the protocol, with no need for centrifugation or filtration, organic solvents or high temperature, enables its full implementation in plastic microplates and efficient application for routine analysis of proline in wines. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Analysis of biofluids in aqueous environment based on mid-infrared spectroscopy.

    PubMed

    Fabian, Heinz; Lasch, Peter; Naumann, Dieter

    2005-01-01

    In this study we describe a semiautomatic Fourier transform infrared spectroscopic methodology for the analysis of liquid serum samples, which combines simple sample introduction with high sample throughput. The applicability of this new infrared technology to the analysis of liquid serum samples from a cohort of cattle naturally infected with bovine spongiform encephalopathy and from controls was explored in comparison to the conventional approach based on transmission infrared spectroscopy of dried serum films. Artificial neural network analysis of the infrared data was performed to differentiate between bovine spongiform encephalopathy-negative controls and animals in the late stage of the disease. After training of artificial neural network classifiers, infrared spectra of sera from an independent external validation data set were analyzed. In this way, sensitivities between 90 and 96% and specificities between 84 and 92% were achieved, respectively, depending upon the strategy of data collection and data analysis. Based on these results, the advantages and limitations of the liquid sample technique and the dried film approach for routine analysis of biofluids are discussed. 2005 Society of Photo-Optical Instrumentation Engineers.

  10. M13-Tailed Simple Sequence Repeat (SSR) Markers in Studies of Genetic Diversity and Population Structure of Common Oat Germplasm.

    PubMed

    Onyśk, Agnieszka; Boczkowska, Maja

    2017-01-01

    Simple Sequence Repeat (SSR) markers are among the most frequently used molecular markers in studies of crop diversity and population structure. This is due to their uniform distribution in the genome, their high polymorphism, reproducibility, and codominant character. Additional advantages are the possibility of automatic analysis and simple interpretation of the results. The M13-tagged PCR reaction significantly reduces the costs of analysis on automatic genetic analyzers. Here, we also disclose a short protocol for SSR data analysis.

  11. Memory-Based Simple Heuristics as Attribute Substitution: Competitive Tests of Binary Choice Inference Models.

    PubMed

    Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro

    2017-05-01

    Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results were consistent with the predictions of the attribute substitution framework. Issues on usage of simple heuristics and psychological processes are discussed. Copyright © 2016 Cognitive Science Society, Inc.

  12. Joining the yellow hub: Uses of the Simple Application Messaging Protocol in Space Physics analysis tools

    NASA Astrophysics Data System (ADS)

    Génot, V.; André, N.; Cecconi, B.; Bouchemit, M.; Budnik, E.; Bourrel, N.; Gangloff, M.; Dufourg, N.; Hess, S.; Modolo, R.; Renard, B.; Lormant, N.; Beigbeder, L.; Popescu, D.; Toniutti, J.-P.

    2014-11-01

    Interest in data communication between analysis tools in planetary sciences and space physics is illustrated in this paper via several examples of the uses of SAMP. The Simple Application Messaging Protocol was developed within the framework of the IVOA from an earlier protocol called PLASTIC. SAMP enables easy communication and interoperability between astronomy software, stand-alone and web-based; it is now increasingly adopted by the planetary sciences and space physics community. Its attractiveness is based, on one hand, on the use of common file formats for exchange and, on the other hand, on established messaging models. Examples of uses at the CDPP and elsewhere are presented. The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of data products from space missions and ground observatories. Besides these activities, the CDPP has developed services like AMDA (Automated Multi Dataset Analysis, http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search, and cataloging. Besides AMDA, the 3DView (http://3dview.cdpp.eu/) tool provides immersive visualizations and is being further developed to include simulation and observational data. These tools and their interactions with each other, notably via SAMP, are presented via science cases of interest to the planetary sciences and space physics communities.

  13. Robust LOD scores for variance component-based linkage analysis.

    PubMed

    Blangero, J; Williams, J T; Almasy, L

    2000-01-01

    The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the LOD score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust LOD score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.
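
    The correction idea in this abstract can be sketched numerically. The sample excess kurtosis is standard; the scaling constant `c0` and the multiplicative form of `corrected_lod` are hypothetical stand-ins, since the paper derives its correction analytically from kurtosis and total heritability:

    ```python
    def excess_kurtosis(xs):
        """Sample excess kurtosis: 0 for a normal trait, >0 for leptokurtic ones."""
        n = len(xs)
        m = sum(xs) / n
        m2 = sum((x - m) ** 2 for x in xs) / n
        m4 = sum((x - m) ** 4 for x in xs) / n
        return m4 / (m2 ** 2) - 3.0

    def corrected_lod(lod, kurt, c0=0.3):
        """Deflate the naive LOD score for leptokurtic traits.
        c0 and the multiplicative form are made up for illustration; the
        paper's constant follows from kurtosis and heritability."""
        return lod / (1.0 + c0 * max(kurt, 0.0))
    ```

    A platykurtic trait (negative excess kurtosis) leaves the LOD score untouched here, matching the abstract's point that only leptokurtosis inflates Type I error.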

  14. Retrosynthetic Analysis-Guided Breaking Tile Symmetry for the Assembly of Complex DNA Nanostructures.

    PubMed

    Wang, Pengfei; Wu, Siyu; Tian, Cheng; Yu, Guimei; Jiang, Wen; Wang, Guansong; Mao, Chengde

    2016-10-11

    Current tile-based DNA self-assembly produces simple repetitive or highly symmetric structures. In the case of 2D lattices, the unit cell often contains only one basic tile because the tiles often are symmetric (in terms of either the backbone or the sequence). In this work, we have applied retrosynthetic analysis to determine the minimal asymmetric units for complex DNA nanostructures. Such analysis guides us to break the intrinsic structural symmetries of the tiles to achieve high structural complexities. This strategy has led to the construction of several DNA nanostructures that are not accessible from conventional symmetric tile designs. Along with previous studies, herein we have established a set of four fundamental rules regarding tile-based assembly. Such rules could serve as guidelines for the design of DNA nanostructures.

  15. A new strategy for statistical analysis-based fingerprint establishment: Application to quality assessment of Semen sojae praeparatum.

    PubMed

    Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting

    2018-08-30

    Semen sojae praeparatum, with homology of medicine and food, is a famous traditional Chinese medicine. A simple and effective quality fingerprint analysis, coupled with chemometric methods, was developed for quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers that most influence the quality of Semen sojae praeparatum. Twenty-one chemicals were selected and characterized by liquid chromatography coupled with high-resolution ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, which provided accurate and informative data for quality evaluation. This study proposes a new strategy for statistical analysis-based fingerprint establishment, which should be a valuable reference for further study.

  16. Analysis of Salmonella enterica Serovar Typhimurium Variable-Number Tandem-Repeat Data for Public Health Investigation Based on Measured Mutation Rates and Whole-Genome Sequence Comparisons

    PubMed Central

    Dimovski, Karolina; Cao, Hanwei; Wijburg, Odilia L. C.; Strugnell, Richard A.; Mantena, Radha K.; Whipp, Margaret; Hogg, Geoff

    2014-01-01

    Variable-number tandem repeats (VNTRs) mutate rapidly and can be useful markers for genotyping. While multilocus VNTR analysis (MLVA) is increasingly used in the detection and investigation of food-borne outbreaks caused by Salmonella enterica serovar Typhimurium (S. Typhimurium) and other bacterial pathogens, MLVA data analysis usually relies on simple clustering approaches that may lead to incorrect interpretations. Here, we estimated the rates of copy number change at each of the five loci commonly used for S. Typhimurium MLVA, during in vitro and in vivo passage. We found that loci STTR5, STTR6, and STTR10 changed during passage but STTR3 and STTR9 did not. Relative rates of change were consistent across in vitro and in vivo growth and could be accurately estimated from diversity measures of natural variation observed during large outbreaks. Using a set of 203 isolates from a series of linked outbreaks and whole-genome sequencing of 12 representative isolates, we assessed the accuracy and utility of several alternative methods for analyzing and interpreting S. Typhimurium MLVA data. We show that eBURST analysis was accurate and informative. For construction of MLVA-based trees, a novel distance metric, based on the geometric model of VNTR evolution coupled with locus-specific weights, performed better than the commonly used simple or categorical distance metrics. The data suggest that, for the purpose of identifying potential transmission clusters for further investigation, isolates whose profiles differ at one of the rapidly changing STTR5, STTR6, and STTR10 loci should be collapsed into the same cluster. PMID:24957617
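
    The contrast between the simple categorical metric and a locus-weighted metric can be sketched as below; the locus weights are invented for illustration (giving the slowly changing STTR3 and STTR9 more weight), not the paper's values fitted from the geometric model of VNTR evolution:

    ```python
    # Hypothetical per-locus weights: the slowly mutating STTR3 and STTR9
    # count more than the rapidly changing STTR5, STTR6, and STTR10.
    LOCI = ["STTR3", "STTR5", "STTR6", "STTR9", "STTR10"]
    WEIGHTS = {"STTR3": 5.0, "STTR5": 1.0, "STTR6": 1.0,
               "STTR9": 5.0, "STTR10": 1.0}

    def categorical_distance(a, b):
        """Commonly used metric: number of loci at which the profiles differ."""
        return sum(1 for locus in LOCI if a[locus] != b[locus])

    def weighted_distance(a, b):
        """Locus-weighted variant: a difference at a slow locus costs more."""
        return sum(WEIGHTS[locus] for locus in LOCI if a[locus] != b[locus])
    ```

    Under the weighted metric, two profiles differing only at a fast locus land close together, consistent with the suggestion to collapse such isolates into one transmission cluster.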

  17. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    NASA Astrophysics Data System (ADS)

    Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  18. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis.

    PubMed

    Gallego, Sandra F; Højlund, Kurt; Ejsing, Christer S

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  19. Saliva as a diagnostic fluid. Literature review

    PubMed Central

    Mancheño-Franch, Aisha; Marzal-Gamarra, Cristina; Carlos-Fabuel, Laura

    2012-01-01

    There is growing interest in diagnosis based on the analysis of saliva. This is a simple, non-invasive method of obtaining oral samples that is safe for both the health worker and the patient, and it allows for simple and cost-efficient storage. The majority of studies use whole saliva samples: complex fluids containing both local and systemic sources, whose composition corresponds to that of the blood. Whole saliva contains a considerable amount of desquamated epithelial cells, microorganisms, and remnants of food and drink; it is essential to cleanse and refine the saliva samples to remove any external elements. Immediate processing of the sample is recommended in order to avoid decomposition; where this is not possible, the sample may be stored at −80 °C. Salivary analysis, much the same as blood analysis, aims to identify diverse medications or indications of certain diseases while providing a relatively simple tool for both early diagnosis and the monitoring of various irregularities. The practicalities of salivary analysis have been studied in fields such as viral and bacterial infections, autoimmune diseases (like Sjögren's syndrome and coeliac disease), endocrinopathies (such as Cushing's syndrome), oncology (early diagnosis of breast, lung, and stomach carcinoma and oral squamous cell carcinoma), stress assessment, medication detection, and forensic science, among others. It is hoped that salivary analysis, with the help of current technological advances, will be valued much more highly in the near future. There remain contradictory results with respect to analytic markers, which is why further studies on wider-ranging samples are fundamental to prove its viability. Key words: Saliva, biomarkers, early diagnosis. PMID:24558562

  20. A simple methodological validation of the gas/particle fractionation of polycyclic aromatic hydrocarbons in ambient air

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Hyun; Kim, Ki-Hyun

    2015-07-01

    The analysis of polycyclic aromatic hydrocarbons (PAHs) in ambient air requires tedious experimental steps in both sampling and pretreatment (e.g., extraction or clean-up). To replace pre-existing conventional methods, a simple, rapid, and novel technique was developed to measure the gas-particle fractionation of PAHs in ambient air based on sorbent tube-thermal desorption-gas chromatography-mass spectrometry (ST-TD-GC-MS). The separate collection and analysis of ambient PAHs were achieved independently by two serially connected STs. Basic quality assurance confirmed good linearity, precision, and sensitivity high enough to eliminate the need for complicated pretreatment procedures, with a detection limit of 13.1 ± 7.04 pg for 16 PAHs. The analysis of real ambient PAH samples showed a clear fractionation between the gas phase (two- and three-ring PAHs) and the particulate phase (five- and six-ring PAHs). In contrast, for the intermediate four-ring PAHs (fluoranthene, pyrene, benz[a]anthracene, and chrysene), a highly systematic, gradual fractionation was established. This suggests a promising role for ST-TD-GC-MS as a measurement system for acquiring a reliable database of airborne PAHs.

  1. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PSAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc., on design parameters, then the response sensitivities calculated may be of significantly higher accuracy.

  2. Design a New Strategy Based on Nanoparticle-Enhanced Chemiluminescence Sensor Array for Biothiols Discrimination

    NASA Astrophysics Data System (ADS)

    Shahrajabian, Maryam; Hormozi-Nezhad, M. Reza

    2016-08-01

    Array-based sensing is an interesting approach that offers an alternative to expensive analytical methods. In this work, we introduce a novel, simple, and sensitive nanoparticle-based chemiluminescence (CL) sensor array for the discrimination of biothiols (e.g., cysteine, glutathione, and glutathione disulfide). The proposed CL sensor array is based on the CL efficiencies of four types of enhanced nanoparticle-based CL systems. The intensity of CL was altered to varying degrees upon interaction with biothiols, producing unique CL response patterns. These distinct CL response patterns were collected as "fingerprints" and were then identified through chemometric methods, including linear discriminant analysis (LDA) and hierarchical cluster analysis (HCA). The developed array was able to successfully differentiate between cysteine, glutathione, and glutathione disulfide over a wide concentration range. Moreover, it was applied to distinguish among the above analytes in human plasma.

  3. Spatial analysis of cities using Renyi entropy and fractal parameters

    NASA Astrophysics Data System (ADS)

    Chen, Yanguang; Feng, Jian

    2017-12-01

    The spatial distributions of cities fall into two groups: one is the simple distribution with a characteristic scale (e.g. exponential distribution), and the other is the complex distribution without a characteristic scale (e.g. power-law distribution). The latter belongs to the scale-free distributions, which can be modeled with fractal geometry. However, fractal dimension is not suitable for the former distribution. In contrast, spatial entropy can be used to measure any type of urban distribution. This paper is devoted to generalizing multifractal parameters by means of the dual relation between Euclidean and fractal geometries. The main method is mathematical derivation and empirical analysis, and the theoretical foundation is the discovery that the normalized fractal dimension is equal to the normalized entropy. Based on this finding, a set of useful spatial indexes termed dummy multifractal parameters are defined for geographical analysis. These indexes can be employed to describe both simple and complex distributions. The dummy multifractal indexes are applied to the population density distribution of Hangzhou city, China. The calculation results reveal the features of the spatio-temporal evolution of Hangzhou's urban morphology. This study indicates that fractal dimension and spatial entropy can be combined to produce a new methodology for the spatial analysis of city development.
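
    The entropy measures discussed here are standard and easy to compute. Below is a minimal sketch of the Renyi entropy of order q and the normalized entropy (the ratio the paper equates with the normalized fractal dimension); the example distribution is made up:

    ```python
    import math

    def renyi_entropy(p, q):
        """Renyi entropy of order q for a discrete distribution p (sums to 1);
        q -> 1 recovers the Shannon entropy."""
        if abs(q - 1.0) < 1e-12:
            return -sum(pi * math.log(pi) for pi in p if pi > 0)
        return math.log(sum(pi ** q for pi in p if pi > 0)) / (1.0 - q)

    def normalized_entropy(p, q=1.0):
        """Entropy divided by its maximum value log(N); per the paper, this
        ratio equals the normalized fractal dimension."""
        return renyi_entropy(p, q) / math.log(len(p))
    ```

    A uniform distribution over N cells gives a normalized entropy of exactly 1, while any concentration of population pulls the value below 1, which is what makes the index usable for both scale-bound and scale-free distributions.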

  4. Getting started with open-hardware: development and control of microfluidic devices.

    PubMed

    da Costa, Eric Tavares; Mora, Maria F; Willis, Peter A; do Lago, Claudimir L; Jiao, Hong; Garcia, Carlos D

    2014-08-01

    Understanding basic concepts of electronics and computer programming allows researchers to get the most out of the equipment found in their laboratories. Although a number of platforms have been specifically designed for the general public and are supported by a vast array of on-line tutorials, this subject is not normally included in university chemistry curricula. Aiming to provide the basic concepts of hardware and software, this article is focused on the design and use of a simple module to control a series of PDMS-based valves. The module is based on a low-cost microcontroller board (Teensy) and open-source software (Arduino). The microvalves were fabricated using thin sheets of PDMS and patterned using CO2 laser engraving, providing a simple and efficient way to fabricate devices without the traditional photolithographic process or facilities. Synchronization of valve control enabled the development of two simple devices to perform injection (1.6 ± 0.4 μL/stroke) and mixing of different solutions. Furthermore, a practical demonstration of the utility of this system for microscale chemical sample handling and analysis was achieved by performing an on-chip acid-base titration, followed by conductivity detection with an open-source, low-cost detection system. Overall, the system provided a very reproducible (98%) platform to perform fluid delivery at the microfluidic scale.

  5. Aspergillus tubingensis and Aspergillus niger as the dominant black Aspergillus, use of simple PCR-RFLP for preliminary differentiation.

    PubMed

    Mirhendi, H; Zarei, F; Motamedi, M; Nouripour-Sisakht, S

    2016-03-01

    This work aimed to identify the species distribution of common clinical and environmental isolates of black Aspergilli based on simple restriction fragment length polymorphism (RFLP) analysis of the β-tubulin gene. A total of 149 clinical and environmental strains of black Aspergilli were collected and subjected to preliminary morphological examination. Total genomic DNAs were extracted, and PCR was performed to amplify part of the β-tubulin gene. First, 52 randomly selected samples were species-delineated by sequence analysis. In order to distinguish the most common species, PCR amplicons of 117 black Aspergillus strains were identified by simple PCR-RFLP analysis using the enzyme TasI. Among the 52 sequenced isolates, 28 were Aspergillus tubingensis, 21 Aspergillus niger, and the three remaining isolates were Aspergillus uvarum, Aspergillus awamori, and Aspergillus acidus. All 100 environmental and 17 BAL samples subjected to TasI-RFLP analysis of the β-tubulin gene fell into two groups, consisting of about 59% (n=69) A. tubingensis and 41% (n=48) A. niger. Therefore, the method successfully and rapidly distinguished A. tubingensis and A. niger as the most common species among the clinical and environmental isolates. Although slower, the Ehrlich test was also able to differentiate A. tubingensis from A. niger according to the yellow color reaction specific to A. niger. A. tubingensis and A. niger are the most common black Aspergillus species in both clinical and environmental isolates in Iran. PCR-RFLP using TasI digestion of β-tubulin DNA enables rapid screening for these common species.
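
    The screening step can be mimicked by an in-silico digest. The sketch below assumes the commonly cited ^AATT recognition site for TasI; verify against the enzyme supplier's documentation before relying on it:

    ```python
    def tasi_fragments(seq):
        """In-silico TasI digest: cut at the start of each AATT occurrence
        (assumed ^AATT recognition; confirm with the enzyme documentation).
        Returns fragment lengths, as would be sized on a gel."""
        seq = seq.upper()
        cuts = []
        i = seq.find("AATT")
        while i != -1:
            if i > 0:                  # a cut at position 0 yields no fragment
                cuts.append(i)
            i = seq.find("AATT", i + 1)
        bounds = [0] + cuts + [len(seq)]
        return [bounds[j + 1] - bounds[j] for j in range(len(bounds) - 1)]
    ```

    Species with different site counts or positions in the amplified β-tubulin fragment then produce distinguishable banding patterns, which is the basis of the RFLP screen.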

  6. A population-based study on the association between rheumatoid arthritis and voice problems.

    PubMed

    Hah, J Hun; An, Soo-Youn; Sim, Songyong; Kim, So Young; Oh, Dong Jun; Park, Bumjung; Kim, Sung-Gyun; Choi, Hyo Geun

    2016-07-01

    The objective of this study was to investigate whether rheumatoid arthritis increases the frequency of organic laryngeal lesions and the subjective voice complaint rate in those with no organic laryngeal lesion. We performed a cross-sectional study using the data from 19,368 participants (418 rheumatoid arthritis patients and 18,950 controls) of the 2008-2011 Korea National Health and Nutrition Examination Survey. The associations between rheumatoid arthritis and organic laryngeal lesions/subjective voice complaints were analyzed using simple and multiple logistic regression analysis with complex sampling, adjusting for confounding factors, including age, sex, smoking status, stress level, and body mass index, which could provoke voice problems. Vocal nodules, vocal polyps, and vocal palsy were not associated with rheumatoid arthritis in a multiple regression analysis, and only laryngitis showed a positive association (adjusted odds ratio, 1.59; 95% confidence interval, 1.01-2.52; P = 0.047). Rheumatoid arthritis was associated with subjective voice discomfort in a simple regression analysis, but not in a multiple regression analysis. Participants with rheumatoid arthritis were older, more often female, and had higher stress levels than those without rheumatoid arthritis. These factors were associated with subjective voice complaints in both simple and multiple regression analyses. Rheumatoid arthritis was not associated with organic laryngeal diseases except laryngitis. Rheumatoid arthritis did not increase the odds ratio for subjective voice complaints. Voice problems in participants with rheumatoid arthritis originated from the characteristics of the rheumatoid arthritis group (higher mean age, more women, and higher stress levels) rather than from rheumatoid arthritis itself.
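
    As a reminder of the unadjusted statistic underlying such analyses, an odds ratio with a Wald confidence interval can be computed directly from a 2x2 table. The adjusted estimates in the study come from multiple logistic regression with complex sampling, which this sketch does not reproduce, and the example counts are invented:

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
        a = exposed cases, b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi
    ```

    An association like the laryngitis finding is judged significant at the 5% level when the 95% interval excludes 1.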

  7. Sensorless position estimation and control of permanent-magnet synchronous motors using a saturation model

    NASA Astrophysics Data System (ADS)

    Kassem Jebai, Al; Malrait, François; Martin, Philippe; Rouchon, Pierre

    2016-03-01

    Sensorless control of permanent-magnet synchronous motors at low velocity remains a challenging task. A now well-established method consists of injecting a high-frequency signal and using the rotor saliency, both geometric and magnetic-saturation induced. This paper proposes a clear and original analysis based on second-order averaging of how to recover the position information from signal injection; this analysis blends well with a general model of magnetic saturation. It also proposes a simple parametric model of the saturated motor, based on an energy function which simply encompasses saturation and cross-saturation effects. Experimental results on a surface-mounted motor and an interior magnet motor illustrate the relevance of the approach.

  8. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
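
    The feature-extraction pipeline (envelope, then amplitude spectrum) can be sketched as follows. A moving-RMS envelope stands in for the Hilbert-transform envelope used in the paper, and the direct DFT is only practical for short windows:

    ```python
    import cmath
    import math

    def envelope(signal, win=8):
        """Crude amplitude envelope via a moving RMS (the paper uses
        Hilbert-transform envelopes; this is a simple stand-in)."""
        out = []
        for i in range(len(signal)):
            chunk = signal[max(0, i - win):i + 1]
            out.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
        return out

    def amplitude_spectrum(x):
        """Magnitude of the DFT (direct O(n^2) form, fine for short windows);
        these spectral components serve as the classification features."""
        n = len(x)
        return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) / n
                for k in range(n // 2)]
    ```

    The resulting spectral vector would then be fed to whatever statistical classifier separates the four object classes; that stage is outside this sketch.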

  9. Advances in Optical Fiber-Based Faraday Rotation Diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, A D; McHale, G B; Goerz, D A

    2009-07-27

    In the past two years, we have used optical fiber-based Faraday Rotation Diagnostics (FRDs) to measure pulsed currents on several dozen capacitively driven and explosively driven pulsed power experiments. We have made simplifications to the necessary hardware for quadrature-encoded polarization analysis, including development of an all-fiber analysis scheme. We have developed a numerical model that is useful for predicting and quantifying deviations from the ideal diagnostic response. We have developed a method of analyzing quadrature-encoded FRD data that is simple to perform and offers numerous advantages over several existing methods. When comparison has been possible, we have seen good agreement between our FRDs and other current sensors.
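
    The quadrature analysis can be sketched as an angle-recovery step followed by an Ampere's-law inversion. The channel model (signals proportional to sin 2θ and cos 2θ), the Verdet constant, and the turn count below are assumed values for illustration, not the authors' calibration:

    ```python
    import math

    MU0 = 4e-7 * math.pi  # vacuum permeability [H/m]

    def rotation_angle(q_sin, q_cos):
        """Recover the Faraday rotation angle from quadrature channels assumed
        proportional to sin(2*theta) and cos(2*theta); atan2 resolves the pair
        without the sign ambiguity of a single channel."""
        return 0.5 * math.atan2(q_sin, q_cos)

    def current_from_angle(theta, verdet=0.53, turns=10):
        """Ampere's-law inversion theta = verdet * mu0 * turns * I for a fiber
        looped around the conductor (verdet in rad/(T*m) and the turn count
        are assumed example values)."""
        return theta / (verdet * MU0 * turns)
    ```

    The quadrature encoding is what lets the angle be tracked continuously through multiple half-wave rotations on large-current shots.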

  10. A simple method for processing data with least square method

    NASA Astrophysics Data System (ADS)

    Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning

    2017-08-01

    The least squares method is widely used in data processing and error estimation. It has become an essential technique for parameter estimation, data processing, regression analysis, and experimental data fitting, and a criterion tool for statistical inference. In measurement data analysis, complex distributions are usually handled on the least squares principle, i.e., matrices are used to solve for the final estimate and to improve its accuracy. In this paper, a new method for the least squares solution is presented which is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of this method is described with a concrete example.
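
    For the simplest case, a straight-line fit, the least squares estimate can be written down algebraically from the normal equations without any matrix machinery, which is in the spirit of the method described:

    ```python
    def fit_line(xs, ys):
        """Ordinary least squares for y = a + b*x, solved algebraically from
        the closed-form normal equations (no matrix inversion needed)."""
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        a = (sy - b * sx) / n
        return a, b
    ```

    On noise-free data lying on a line, the closed form recovers the intercept and slope exactly.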

  11. Is math anxiety in the secondary classroom limiting physics mastery? A study of math anxiety and physics performance

    NASA Astrophysics Data System (ADS)

    Mercer, Gary J.

    This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.
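
    The "simple correlation" step corresponds to a Pearson correlation coefficient between the anxiety scores and exam scores, which can be sketched as:

    ```python
    import math

    def pearson_r(xs, ys):
        """Pearson product-moment correlation coefficient between two
        equal-length samples; ranges from -1 to 1."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / math.sqrt(vx * vy)
    ```

    A negative r here would mean higher math anxiety co-occurring with lower physics scores; the study's multivariate regression then controls for gender and prior math background.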

  12. Preliminary analysis of a membrane-based atmosphere-control subsystem

    NASA Technical Reports Server (NTRS)

    Mccray, Scott B.; Newbold, David D.; Ray, Rod; Ogle, Kathryn

    1993-01-01

    Controlled ecological life support systems will require subsystems for maintaining the concentrations of atmospheric gases within acceptable ranges in human habitat chambers and plant growth chambers. The goal of this work was to develop a membrane-based atmosphere-control (MBAC) subsystem that allows the controlled exchange of atmospheric components (e.g., oxygen, carbon dioxide, and water vapor) between these chambers. The MBAC subsystem promises to offer a simple, non-energy-intensive method to separate, store, and exchange atmospheric components, producing optimal concentrations of components in each chamber. In this paper, the results of a preliminary analysis of the MBAC subsystem for the control of oxygen and nitrogen are presented. Additionally, the MBAC subsystem and its operation are described.

  13. Rapid determination of Swiss cheese composition by Fourier transform infrared/attenuated total reflectance spectroscopy.

    PubMed

    Rodriguez-Saona, L E; Koca, N; Harper, W J; Alvarez, V B

    2006-05-01

    There is a need for rapid and simple techniques that can be used to predict the quality of cheese. The aim of this research was to develop a simple and rapid screening tool for monitoring Swiss cheese composition by using Fourier transform infrared spectroscopy. Twenty Swiss cheese samples from different manufacturers and degrees of maturity were evaluated. Direct measurements of Swiss cheese slices (approximately 0.5 g) were made using a MIRacle 3-reflection diamond attenuated total reflectance (ATR) accessory. Reference methods for moisture (vacuum oven), protein content (Kjeldahl), and fat (Babcock) were used. Calibration models were developed based on a cross-validated (leave-one-out approach) partial least squares regression. The information-rich infrared spectral range for Swiss cheese samples was from 3,000 to 2,800 cm(-1) and 1,800 to 900 cm(-1). The performance statistics for the cross-validated models gave estimates for the standard error of cross-validation of 0.45, 0.25, and 0.21% for moisture, protein, and fat, respectively, and correlation coefficients r > 0.96. Furthermore, the ATR infrared protocol allowed for the classification of cheeses according to manufacturer and aging based on unique spectral information, especially of carbonyl groups, probably due to their distinctive lipid composition. Attenuated total reflectance infrared spectroscopy allowed for the rapid (approximately 3-min analysis time) and accurate analysis of the composition of Swiss cheese. This technique could contribute to the development of simple and rapid protocols for monitoring complex biochemical changes and predicting the final quality of the cheese.

  14. Simple Genetic Distance-Optimized Field Deployments for Clonal Seed Orchards Based on Microsatellite Markers: As a Case of Chinese Pine Seed Orchard.

    PubMed

    Yuan, Huwei; Niu, Shihui; El-Kassaby, Yousry A; Li, Yue; Li, Wei

    2016-01-01

    Chinese pine seed orchards are in a period of transition from the first generation to advanced generations. How to effectively select populations for second-generation seed orchards and significantly increase genetic gain through rational deployment have become major issues. In this study, we examined open- and control-pollinated progeny of the first-generation Chinese pine seed orchards in Zhengning (Gansu Province, China) and Xixian (Shanxi Province, China) to address issues related to phenotypic selection for high volume growth, genetic diversity analysis and genetic distance-based phylogenetic analysis of the selections by simple sequence repeats (SSRs), and phylogenetic relationship-based field deployment for advanced-generation orchards. In total, 40, 28, 20, and 13 superior individuals were selected from the large-scale no-pedigree open-pollinated progeny of Zhengning (ZN-NP), open-pollinated families of Zhengning (ZN-OP), open-pollinated families of Xixian (XX-OP), and control-pollinated families of Xixian, with mean volume dominance ratios of 0.83, 0.15, 0.25, and 0.20, respectively. Phylogenetic relationship analysis of the ZN-NP and XX-OP populations showed that the 40 superior individuals in the ZN-NP selected population belonged to 23 families and could be further divided into five phylogenetic groups, and that families in the same group were closely related. Similarly, 20 families in the XX-OP population were related to varying degrees. Based on these results, we found that second-generation Chinese pine seed orchards in Zhengning and Xixian should adopt a grouped, unbalanced, complete, fixed block design and an unbalanced, incomplete, fixed block design, respectively. This study will provide practical references for applying molecular markers to establishing advanced-generation seed orchards.

  15. Synthesis and characterization of a novel Schiff base of 1,2-diaminopropane with substituted salicylaldehyde and its transition metal complexes: Single crystal structures and biological activities

    NASA Astrophysics Data System (ADS)

    Tadavi, Samina K.; Yadav, Abhijit A.; Bendre, Ratnamala S.

    2018-01-01

    A novel Schiff base H2L, derived from the simple condensation of 2-hydroxy-6-isopropyl-3-methylbenzaldehyde and 1,2-diaminopropane in a 2:1 molar ratio, and its [MnL], [CoL] and [NiL]2 complexes have been prepared and characterized by spectroscopic techniques, elemental analysis, SEM-EDX analysis, and cyclic voltammetry. Additionally, single-crystal X-ray diffraction has been applied to the Schiff base ligand H2L and its nickel complex. The nickel complex exhibited a dimeric structure with the formula [NiL]2 and a distorted square-planar geometry around each nickel center. Furthermore, all the synthesized compounds were screened for their antimicrobial, antioxidant, and DNA cleavage activities.

  16. Volumetric quantification of bone-implant contact using micro-computed tomography analysis based on region-based segmentation.

    PubMed

    Kang, Sung-Won; Lee, Woo-Jin; Choi, Soon-Chul; Lee, Sam-Sun; Heo, Min-Suk; Huh, Kyung-Hoe; Kim, Tae-Il; Yi, Won-Jin

    2015-03-01

    We have developed a new method of segmenting the areas of absorbable implants and bone using region-based segmentation of micro-computed tomography (micro-CT) images, which allowed us to quantify volumetric bone-implant contact (VBIC) and volumetric absorption (VA). The simple threshold technique generally used in micro-CT analysis cannot be used to segment the areas of absorbable implants and bone. Instead, a region-based segmentation method, a region-labeling method, and subsequent morphological operations were successively applied to the micro-CT images. The three-dimensional VBIC and VA of the absorbable implant were then calculated over the entire volume of the implant. Two-dimensional (2D) bone-implant contact (BIC) and bone area (BA) were also measured based on the conventional histomorphometric method. VA and VBIC increased significantly as the healing period increased (p<0.05). VBIC values were significantly correlated with VA values (p<0.05) and with 2D BIC values (p<0.05). It is thus possible to quantify VBIC and VA for absorbable implants from micro-CT images using a region-based segmentation method.
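    As a rough illustration of the volumetric contact measure described above (this is a minimal numpy-only sketch, not the authors' segmentation pipeline), bone-implant contact can be expressed as the fraction of a one-voxel shell around the implant mask that is occupied by the bone mask:

```python
import numpy as np

def dilate6(mask):
    # 6-connected binary dilation by axis shifts; np.roll wraps around,
    # so masks should stay off the volume border for correct results.
    out = mask.copy()
    for axis in range(mask.ndim):
        out |= np.roll(mask, 1, axis=axis) | np.roll(mask, -1, axis=axis)
    return out

def volumetric_bic(implant, bone):
    """Fraction of the one-voxel shell around the implant occupied by bone."""
    shell = dilate6(implant) & ~implant
    return float((shell & bone).sum() / shell.sum()) if shell.any() else 0.0
```

    In the real method, the implant and bone masks would come from region-based segmentation of the micro-CT volume rather than from thresholding.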

  17. Theory of freezing in simple systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerjan, C.; Bagchi, B.

    The transition parameters for the freezing of two one-component liquids into crystalline solids are evaluated by two theoretical approaches. The first system considered is liquid sodium, which crystallizes into a body-centered-cubic (bcc) lattice; the second system is the freezing of adhesive hard spheres into a face-centered-cubic (fcc) lattice. Two related theoretical techniques are used in this evaluation: one is based upon a recently developed bifurcation analysis; the other is based upon the theory of freezing developed by Ramakrishnan and Yussouff. For liquid sodium, where experimental information is available, the predictions of the two theories agree well with experiment and with each other. The adhesive-hard-sphere system, which displays a triple point and can be used to fit some liquids accurately, shows a temperature dependence of the freezing parameters similar to that of Lennard-Jones systems. At very low temperature, the fractional density change on freezing shows a dramatic increase as a function of temperature, indicating the importance of all the contributions due to the triplet direct correlation function. Also, we consider the freezing of a one-component liquid into a simple-cubic (sc) lattice by bifurcation analysis and show that this transition is highly unfavorable, independent of the choice of interatomic potential. The bifurcation diagrams for the three lattices considered are compared and found to be strikingly different. Finally, a new stability analysis of the bifurcation diagrams is presented.

  18. Surface acoustic wave (SAW) vibration sensors.

    PubMed

    Filipiak, Jerzy; Solarz, Lech; Steczko, Grzegorz

    2011-01-01

    In the paper a feasibility study on the use of surface acoustic wave (SAW) vibration sensors for electronic warning systems is presented. The system is assembled from concatenated SAW vibration sensors based on a SAW delay line manufactured on the surface of a piezoelectric plate. Vibrations of the plate are transformed into electric signals that allow identification of the sensor and localization of a threat. The theoretical study of sensor vibrations leads us to a simple isotropic model with one degree of freedom. This model allowed an explicit description of the sensor plate movement and identification of the vibrating sensor. An analysis of the frequency response of the ST-cut quartz sensor plate and of the damping speed of its impulse response has been conducted. This analysis was the basis for determining the ranges of parameters for vibrating plates to be useful in electronic warning systems. Generally, operation of electronic warning systems with SAW vibration sensors is based on the analysis of signal phase changes at the working frequency of the delay line after being transmitted via two circuits of concatenated four-terminal networks. The frequencies of phase changes are equal to the resonance frequencies of the vibrating sensor plates. The amplitude of these phase changes is proportional to the amplitude of vibrations of a sensor plate. Both pieces of information may be sent and recorded jointly by a simple electrical unit.

  19. Ratio of mean platelet volume to platelet count is a potential surrogate marker predicting liver cirrhosis.

    PubMed

    Iida, Hiroya; Kaibori, Masaki; Matsui, Kosuke; Ishizaki, Morihiko; Kon, Masanori

    2018-01-27

    To provide a simple surrogate marker predictive of liver cirrhosis (LC). Specimens from 302 patients who underwent resection for hepatocellular carcinoma between January 2006 and December 2012 were retrospectively analyzed. Based on pathologic findings, patients were divided into two groups according to whether or not they had LC. Parameters associated with hepatic functional reserve were compared between these two groups using the Mann-Whitney U-test for univariate analysis. Factors differing significantly in univariate analyses were entered into multivariate logistic regression analysis. There were significant differences between the LC group (n = 100) and non-LC group (n = 202) in prothrombin activity, concentrations of alanine aminotransferase, aspartate aminotransferase, total bilirubin, albumin, cholinesterase, type IV collagen and hyaluronic acid, indocyanine green retention rate at 15 min, maximal removal rate of technetium-99m diethylene triamine penta-acetic acid-galactosyl human serum albumin, and the ratio of mean platelet volume to platelet count (MPV/PLT). Multivariate analysis showed that prothrombin activity, concentrations of alanine aminotransferase, aspartate aminotransferase, total bilirubin and hyaluronic acid, and the MPV/PLT ratio were factors independently predictive of LC. The area under the curve value for MPV/PLT was 0.78, with a 0.8 cutoff value having a sensitivity of 65% and a specificity of 78%. The MPV/PLT ratio, which can be determined simply from the complete blood count, may be a simple surrogate marker predicting LC.
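    The cutoff-based performance figures above (sensitivity, specificity, AUC) follow directly from the marker values and disease labels. A minimal sketch of those computations, using made-up values rather than the study data, with AUC taken as the standard Mann-Whitney rank probability:

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity/specificity for a marker where high values indicate disease."""
    tp = sum(1 for v, y in zip(values, labels) if y and v >= cutoff)
    fn = sum(1 for v, y in zip(values, labels) if y and v < cutoff)
    tn = sum(1 for v, y in zip(values, labels) if not y and v < cutoff)
    fp = sum(1 for v, y in zip(values, labels) if not y and v >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def auc(values, labels):
    """AUC as the probability a diseased case outranks a healthy one (ties count 1/2)."""
    pos = [v for v, y in zip(values, labels) if y]
    neg = [v for v, y in zip(values, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```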

  20. A simple visual ethanol biosensor based on alcohol oxidase immobilized onto polyaniline film for halal verification of fermented beverage samples.

    PubMed

    Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa

    2014-01-27

    A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide; the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilisation of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films were scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response over an ethanol concentration range of 0.01%-0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with a reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provided accurate results for ethanol determination in fermented drinks, in good agreement with the results of the standard method (gas chromatography). Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, which can be useful to the Muslim community for halal verification.
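    The linear response and correlation coefficient reported above come from an ordinary least-squares calibration of colour intensity against concentration. A sketch of that fit (illustrative data, not the paper's measurements):

```python
import math

def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b, plus the Pearson correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx           # slope (sensitivity of the calibration)
    b = my - a * mx         # intercept
    r = sxy / math.sqrt(sxx * syy)
    return a, b, r
```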

  1. HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.

    PubMed

    Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua

    2014-03-01

    Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), has been used worldwide; however, for a long time there has been no internationally accepted standard for its quality control. The aim of this work was to establish an effective combined method and pattern recognition technique for the quality control of ginger. A simple, accurate and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed for establishing the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility and stability, and the relative standard deviations were all less than 1.57%. On the basis of this method, the fingerprints of 10 batches of ginger samples were obtained, which showed 16 common peaks. Using similarity evaluation software, the similarities between each sample fingerprint and the simulative mean chromatogram were in the range of 0.998-1.000. The chemometric techniques of similarity analysis, hierarchical clustering analysis and principal component analysis were then applied to classify the ginger samples. Consistent results showed that the ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method is simple, sensitive and reliable for fingerprint analysis and, moreover, for pattern recognition and quality control of ginger.
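    Fingerprint similarity scores of the kind reported above (0.998-1.000 against a mean chromatogram) are commonly computed as a cosine (congruence) coefficient between peak-area vectors; the abstract does not specify the software's exact metric, so treat this as an assumed sketch:

```python
import math

def congruence(f1, f2):
    """Cosine (congruence) coefficient between two peak-area fingerprints."""
    dot = sum(a * b for a, b in zip(f1, f2))
    norm = math.sqrt(sum(a * a for a in f1)) * math.sqrt(sum(b * b for b in f2))
    return dot / norm
```

    Identical fingerprints score 1.0; fingerprints sharing no common peaks score 0.0.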

  2. Fibre optic system for biochemical and microbiological sensing

    NASA Astrophysics Data System (ADS)

    Penwill, L. A.; Slater, J. H.; Hayes, N. W.; Tremlett, C. J.

    2007-07-01

    This poster will discuss state-of-the-art fibre optic sensors based on evanescent wave technology, emphasising chemophotonic sensors for biochemical reactions and microbe detection. Devices based on antibody specificity and unique DNA sequences will be described. The development of simple sensor devices with disposable single-use sensor probes will be illustrated, with a view to providing cost-effective field-based or point-of-care analysis of major threats such as hospital-acquired infections or bioterrorism events. This presentation will discuss the nature and detection thresholds required, the optical detection techniques investigated, results of sensor trials, and the potential for wider commercial application.

  3. Coupled electromagnetic-thermodynamic simulations of microwave heating problems using the FDTD algorithm.

    PubMed

    Kopyt, Paweł; Celuch, Małgorzata

    2007-01-01

    A practical implementation of a hybrid simulation system capable of modeling coupled electromagnetic-thermodynamic problems typical in microwave heating is described. The paper presents two approaches to modeling such problems. Both are based on an FDTD-based commercial electromagnetic solver coupled to an external thermodynamic analysis tool required for calculations of heat diffusion. The first approach utilizes a simple FDTD-based thermal solver, while in the second it is replaced by a universal commercial CFD solver. The accuracy of the two modeling systems is verified against original experimental data as well as measurement results available in the literature.

  4. Simple Levelized Cost of Energy (LCOE) Calculator Documentation | Energy

    Science.gov Websites

    Analysis | NREL. Documentation for the Simple Levelized Cost of Energy (LCOE) Calculator (with the Transparent Cost Database). This is a simple calculator: adjust the sliders to suitable values for each of the cost and performance parameters.
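    The simple LCOE that calculators of this kind implement is usually the textbook form: annualized capital cost plus fixed O&M, spread over annual generation, plus per-kWh costs. A sketch under that assumption (formula and parameter names are the standard simple-LCOE convention, not taken verbatim from the NREL page):

```python
def crf(rate, years):
    """Capital recovery factor: annualizes an upfront cost at a fixed discount rate."""
    q = (1 + rate) ** years
    return rate * q / (q - 1)

def simple_lcoe(capital_cost, fixed_om, variable_om, fuel_cost,
                capacity_factor, rate, years):
    """$/kWh. capital_cost and fixed_om are per kW of capacity;
    variable_om and fuel_cost are per kWh generated."""
    annual_kwh_per_kw = 8760 * capacity_factor
    return ((capital_cost * crf(rate, years) + fixed_om) / annual_kwh_per_kw
            + variable_om + fuel_cost)
```

    For example, $2000/kW capital, $50/kW-yr fixed O&M, a 50% capacity factor, and a 10% discount rate over 20 years give an LCOE of roughly $0.065/kWh.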

  5. Determination of Diethyl Phthalate and Polyhexamethylene Guanidine in Surrogate Alcohol from Russia

    PubMed Central

    Monakhova, Yulia B.; Kuballa, Thomas; Leitz, Jenny; Lachenmeier, Dirk W.

    2011-01-01

    Analytical methods based on spectroscopic techniques were developed and validated for the determination of diethyl phthalate (DEP) and polyhexamethylene guanidine (PHMG), which may occur in unrecorded alcohol. Analysis for PHMG was based on UV-VIS spectrophotometry after derivatization with Eosin Y and 1H NMR spectroscopy of the DMSO extract. Analysis of DEP was performed with direct UV-VIS and 1H NMR methods. Multivariate curve resolution and spectra computation methods were used to confirm the presence of PHMG and DEP in the investigated beverages. Of 22 analysed alcohol samples, two contained DEP or PHMG. 1H NMR analysis also revealed the presence of signals of hawthorn extract in three medicinal alcohols used as surrogate alcohol. The simple and cheap UV-VIS methods can be used for rapid screening of surrogate alcohol samples for impurities, while 1H NMR is recommended for specific confirmatory analysis if required. PMID:21647285

  6. Determination of diethyl phthalate and polyhexamethylene guanidine in surrogate alcohol from Russia.

    PubMed

    Monakhova, Yulia B; Kuballa, Thomas; Leitz, Jenny; Lachenmeier, Dirk W

    2011-01-01

    Analytical methods based on spectroscopic techniques were developed and validated for the determination of diethyl phthalate (DEP) and polyhexamethylene guanidine (PHMG), which may occur in unrecorded alcohol. Analysis for PHMG was based on UV-VIS spectrophotometry after derivatization with Eosin Y and (1)H NMR spectroscopy of the DMSO extract. Analysis of DEP was performed with direct UV-VIS and (1)H NMR methods. Multivariate curve resolution and spectra computation methods were used to confirm the presence of PHMG and DEP in the investigated beverages. Of 22 analysed alcohol samples, two contained DEP or PHMG. (1)H NMR analysis also revealed the presence of signals of hawthorn extract in three medicinal alcohols used as surrogate alcohol. The simple and cheap UV-VIS methods can be used for rapid screening of surrogate alcohol samples for impurities, while (1)H NMR is recommended for specific confirmatory analysis if required.

  7. Visual interface for space and terrestrial analysis

    NASA Technical Reports Server (NTRS)

    Dombrowski, Edmund G.; Williams, Jason R.; George, Arthur A.; Heckathorn, Harry M.; Snyder, William A.

    1995-01-01

    The management of large geophysical and celestial databases is now, more than ever, the most critical path to timely data analysis. With today's large-volume data sets from multiple satellite missions, analysts face the task of defining useful databases from which data and metadata (information about data) can be extracted readily in a meaningful way. Visualization, following an object-oriented design, is a fundamental method of organizing and handling data. Humans, by nature, easily accept pictorial representations of data. Therefore graphically oriented user interfaces are appealing, as long as they remain simple to produce and use. The Visual Interface for Space and Terrestrial Analysis (VISTA) system, currently under development at the Naval Research Laboratory's Backgrounds Data Center (BDC), has been designed with these goals in mind. Its graphical user interface (GUI) allows the user to perform queries, visualization, and analysis of atmospheric and celestial backgrounds data.

  8. A simple algorithm for the identification of clinical COPD phenotypes.

    PubMed

    Burgel, Pierre-Régis; Paillasseur, Jean-Louis; Janssens, Wim; Piquet, Jacques; Ter Riet, Gerben; Garcia-Aymerich, Judith; Cosio, Borja; Bakke, Per; Puhan, Milo A; Langhammer, Arnulf; Alfageme, Inmaculada; Almagro, Pere; Ancochea, Julio; Celli, Bartolome R; Casanova, Ciro; de-Torres, Juan P; Decramer, Marc; Echazarreta, Andrés; Esteban, Cristobal; Gomez Punter, Rosa Mar; Han, MeiLan K; Johannessen, Ane; Kaiser, Bernhard; Lamprecht, Bernd; Lange, Peter; Leivseth, Linda; Marin, Jose M; Martin, Francis; Martinez-Camblor, Pablo; Miravitlles, Marc; Oga, Toru; Sofia Ramírez, Ana; Sin, Don D; Sobradillo, Patricia; Soler-Cataluña, Juan J; Turner, Alice M; Verdu Rivera, Francisco Javier; Soriano, Joan B; Roche, Nicolas

    2017-11-01

    This study aimed to identify simple rules for allocating chronic obstructive pulmonary disease (COPD) patients to clinical phenotypes identified by cluster analyses. Data from 2409 COPD patients of French/Belgian COPD cohorts were analysed using cluster analysis, resulting in the identification of subgroups, for which clinical relevance was determined by comparing 3-year all-cause mortality. Classification and regression trees (CARTs) were used to develop an algorithm for allocating patients to these subgroups. This algorithm was tested in 3651 patients from the COPD Cohorts Collaborative International Assessment (3CIA) initiative. Cluster analysis identified five subgroups of COPD patients with different clinical characteristics (especially regarding severity of respiratory disease and the presence of cardiovascular comorbidities and diabetes). The CART-based algorithm indicated that the variables relevant for patient grouping differed markedly between patients with isolated respiratory disease (FEV1, dyspnoea grade) and those with multi-morbidity (dyspnoea grade, age, FEV1 and body mass index). Application of this algorithm to the 3CIA cohorts confirmed that it identified subgroups of patients with different clinical characteristics, mortality rates (median, from 4% to 27%) and age at death (median, from 68 to 76 years). A simple algorithm, integrating respiratory characteristics and comorbidities, allowed the identification of clinically relevant COPD phenotypes. Copyright ©ERS 2017.

  9. Calculation of density of states for modeling photoemission using method of moments

    NASA Astrophysics Data System (ADS)

    Finkenstadt, Daniel; Lambrakos, Samuel G.; Jensen, Kevin L.; Shabaev, Andrew; Moody, Nathan A.

    2017-09-01

    Modeling photoemission using the Moments Approach (akin to Spicer's "Three Step Model") is often presumed to follow simple models for the prediction of two critical properties of photocathodes: the yield or "Quantum Efficiency" (QE), and the intrinsic spreading of the beam or "emittance" ε_n,rms. The simple models, however, tend to obscure properties of electrons in materials, the understanding of which is necessary for a proper prediction of a semiconductor's or metal's QE and ε_n,rms. The density-of-states structure is characterized by localized resonance features as well as a universal trend at high energy. Presented in this study is a prototype analysis concerning the density of states (DOS) factor D(E) for copper in bulk, to replace the simple three-dimensional form D(E) = (m/π²ħ³)√(2mE) currently used in the Moments Approach. This analysis demonstrates that excited-state spectra of atoms, molecules and solids based on density-functional theory can be adapted as useful information for practical applications, as well as providing theoretical interpretation of density-of-states structure, e.g., qualitatively good descriptions of optical transitions in matter, in addition to DFT's utility in providing the optical constants and material parameters also required in the Moments Approach.
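    The simple three-dimensional DOS referred to above is the standard free-electron (parabolic-band) form, per unit volume, which can be written equivalently as:

```latex
% Free-electron density of states per unit volume for a parabolic band
% E = p^2/2m -- the simple form the prototype DFT-based analysis replaces:
D(E) \;=\; \frac{1}{2\pi^{2}}\left(\frac{2m}{\hbar^{2}}\right)^{3/2}\sqrt{E}
     \;=\; \frac{m}{\pi^{2}\hbar^{3}}\,\sqrt{2mE}.
```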

  10. Fast and Simple Discriminative Analysis of Anthocyanins-Containing Berries Using LC/MS Spectral Data.

    PubMed

    Yang, Heejung; Kim, Hyun Woo; Kwon, Yong Soo; Kim, Ho Kyong; Sung, Sang Hyun

    2017-09-01

    Anthocyanins are potent antioxidant agents that protect against many degenerative diseases; however, they are unstable because they are vulnerable to external stimuli including temperature, pH and light. This vulnerability hinders the quality control of anthocyanin-containing berries using classical high-performance liquid chromatography (HPLC) analytical methodologies based on UV or MS chromatograms. To develop an alternative approach for the quality assessment and discrimination of anthocyanin-containing berries, we used MS spectral data acquired in a short analytical time rather than UV or MS chromatograms. Mixtures of anthocyanins were separated from other components in a short gradient time (5 min) due to their higher polarity, and the representative MS spectrum was acquired from the MS chromatogram corresponding to the mixture of anthocyanins. The chemometric data from the representative MS spectra contained reliable information for the identification and relative quantification of anthocyanins in berries with good precision and accuracy. This fast and simple methodology, which consists of a simple sample preparation method and short gradient analysis, could be applied to reliably discriminate the species and geographical origins of different anthocyanin-containing berries. These features make the technique useful for the food industry. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Optimisation of a simple and reliable method based on headspace solid-phase microextraction for the determination of volatile phenols in beer.

    PubMed

    Pizarro, C; Pérez-del-Notario, N; González-Sáiz, J M

    2010-09-24

    A simple, accurate and sensitive method based on headspace solid-phase microextraction (HS-SPME) coupled to gas chromatography-tandem mass spectrometry (GC-MS/MS) was developed for the analysis of 4-ethylguaiacol, 4-ethylphenol, 4-vinylguaiacol and 4-vinylphenol in beer. The effect of the presence of CO2 in the sample on the extraction of analytes was examined. The influence on extraction efficiency of different fibre coatings, of salt addition and stirring was also evaluated. Divinylbenzene/carboxen/polydimethylsiloxane was selected as extraction fibre and was used to evaluate the influence of exposure time, extraction temperature and sample volume/total volume ratio (Vs/Vt) by means of a central composite design (CCD). The optimal conditions identified were 80 °C for extraction temperature, 55 min for extraction time and 6 mL of beer (Vs/Vt 0.30). Under optimal conditions, the proposed method showed satisfactory linearity (correlation coefficients between 0.993 and 0.999), precision (between 6.3% and 9.7%) and detection limits (lower than those previously reported for volatile phenols in beers). The method was applied successfully to the analysis of beer samples. To our knowledge, this is the first time that a HS-SPME based method has been developed to determine simultaneously these four volatile phenols in beers. Copyright 2010 Elsevier B.V. All rights reserved.

  12. Detection of the Assembly and Disassembly of PCV2b Virus-Like Particles Using Fluorescence Spectroscopy Analysis.

    PubMed

    Fang, Mingli; Diao, Wenzhen; Dong, Boqi; Wei, Hongfei; Liu, Jialin; Hua, Li; Zhang, Miaomin; Guo, Sheng; Xiao, Yue; Yu, Yongli; Wang, Liying; Wan, Min

    2015-01-01

    Monitoring the assembly and disassembly of virus-like particles (VLPs) is important in developing effective VLP-based vaccines. We tried to establish a simple and rapid method to evaluate the status of VLP assembly using fluorescence spectroscopic analysis (FSA) while developing a VLP-based vaccine against porcine circovirus type 2b (PCV2b). We synthesized the gene coding for PCV2b capsid protein (CP). The CP was expressed in Escherichia coli in a soluble form, dialyzed into three different buffers, and assembled into VLPs. The immunogenicity of the VLPs was evaluated by an enzyme-linked immunosorbent assay using the sera of mice immunized with inactivated PCV2b. The VLP assembly was detected using transmission electron microscopy and FSA. The assembled VLPs showed a distinct FSA curve with a peak at 320 nm. We found that the assembly status was related to the immunogenicity, fluorescence intensity, and morphology of the VLP. The FSA assay was able to monitor the various denatured statuses of PCV2b VLPs treated with β-mercaptoethanol or β-mercaptoethanol plus urea. We have demonstrated that FSA can be used to detect the assembly of PCV2b VLPs produced in E. coli. This provides a simple solution for monitoring VLP assembly during the production of VLP-based vaccines. © 2016 S. Karger AG, Basel.

  13. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.

  14. Digital PCR on a SlipChip.

    PubMed

    Shen, Feng; Du, Wenbin; Kreutz, Jason E; Fok, Alice; Ismagilov, Rustem F

    2010-10-21

    This paper describes a SlipChip to perform digital PCR in a very simple and inexpensive format. The fluidic path for introducing the sample combined with the PCR mixture was formed using elongated wells in the two plates of the SlipChip designed to overlap during sample loading. This fluidic path was broken up by simple slipping of the two plates, which removed the overlap among wells and brought each well in contact with a reservoir preloaded with oil to generate 1280 reaction compartments (2.6 nL each) simultaneously. After thermal cycling, end-point fluorescence intensity was used to detect the presence of nucleic acid. Digital PCR on the SlipChip was tested quantitatively by using Staphylococcus aureus genomic DNA. As the concentration of the template DNA in the reaction mixture was diluted, the fraction of positive wells decreased as expected from the statistical analysis. No cross-contamination was observed during the experiments. At the extremes of the dynamic range of digital PCR, the standard confidence interval determined using a normal approximation of the binomial distribution is not satisfactory. Therefore, statistical analysis based on the score method was used to establish these confidence intervals. The SlipChip provides a simple strategy to count nucleic acids by using PCR. It may find use in research areas such as single-cell analysis, prenatal diagnostics, and point-of-care diagnostics. The SlipChip could become valuable for diagnostics, including applications in resource-limited areas, after integration with isothermal nucleic acid amplification technologies and visual readout.
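    The score-method interval mentioned above is the Wilson interval for a binomial proportion, which behaves well at the extremes of the dynamic range where the normal approximation fails. A sketch, together with the standard Poisson estimate of template copies per well from the positive-well fraction (the Poisson step is standard digital-PCR practice, not spelled out in the abstract):

```python
import math

def wilson_interval(positives, total, z=1.96):
    """Score-method (Wilson) confidence interval for a binomial proportion."""
    p = positives / total
    denom = 1 + z * z / total
    center = (p + z * z / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total)) / denom
    return center - half, center + half

def copies_per_well(positives, total):
    """Poisson estimate of mean template copies per well from the positive fraction."""
    return -math.log(1 - positives / total)
```

    For example, with 640 positive wells out of 1280, the estimated mean occupancy is -ln(0.5) ≈ 0.69 copies per well.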

  15. The Digital Shoreline Analysis System (DSAS) Version 4.0 - An ArcGIS extension for calculating shoreline change

    USGS Publications Warehouse

    Thieler, E. Robert; Himmelstoss, Emily A.; Zichichi, Jessica L.; Ergul, Ayhan

    2009-01-01

    The Digital Shoreline Analysis System (DSAS) version 4.0 is a software extension to ESRI ArcGIS v.9.2 and above that enables a user to calculate shoreline rate-of-change statistics from multiple historic shoreline positions. A user-friendly interface of simple buttons and menus guides the user through the major steps of shoreline change analysis. Components of the extension and user guide include (1) instruction on the proper way to define a reference baseline for measurements, (2) automated and manual generation of measurement transects and metadata based on user-specified parameters, and (3) output of calculated rates of shoreline change and other statistical information. DSAS computes shoreline rates of change using four different methods: (1) endpoint rate, (2) simple linear regression, (3) weighted linear regression, and (4) least median of squares. The standard error, correlation coefficient, and confidence interval are also computed for the simple and weighted linear-regression methods. The results of all rate calculations are output to a table that can be linked to the transect file by a common attribute field. DSAS is intended to facilitate the shoreline change-calculation process and to provide rate-of-change information and the statistical data necessary to establish the reliability of the calculated results. The software is also suitable for any generic application that calculates positional change over time, such as assessing rates of change of glacier limits in sequential aerial photos, river edge boundaries, land-cover changes, and so on.
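    Two of the four DSAS rate-of-change methods named above (endpoint rate and simple linear regression) reduce to elementary computations on dated shoreline positions. A minimal sketch (positions in metres along a transect, times in decimal years; not DSAS's actual code):

```python
def endpoint_rate(times, positions):
    """Endpoint rate: net shoreline movement divided by elapsed time (m/yr)."""
    return (positions[-1] - positions[0]) / (times[-1] - times[0])

def regression_rate(times, positions):
    """Simple linear regression rate: OLS slope of position against time (m/yr)."""
    n = len(times)
    mt = sum(times) / n
    mp = sum(positions) / n
    num = sum((t - mt) * (p - mp) for t, p in zip(times, positions))
    den = sum((t - mt) ** 2 for t in times)
    return num / den
```

    The regression rate uses all historic positions rather than just the oldest and newest, which is why DSAS also reports the standard error and confidence interval for the regression methods.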

  16. Rapid and simple immunochemical screening combined with hand-shaking extraction for thiamethoxam residue in agricultural products.

    PubMed

    Watanabe, Eiki; Kobara, Yuso; Miyake, Shiro

    2013-06-01

    With the aim of expanding the applicability of a kit-based enzyme-linked immunosorbent assay (ELISA) for the neonicotinoid insecticide thiamethoxam, the ELISA was newly applied to three kinds of agricultural samples (green pepper, eggplant and spinach). To offer the ELISA as a screening analysis for thiamethoxam residues, a rapid and simple method of extraction by hand-shaking was used, and speed-up and simplification of the sample treatment before the ELISA analysis were examined. Finally, the validity of the ELISA combined with the proposed extraction method was verified against a reference high-performance liquid chromatography (HPLC) method using real-world agricultural samples. There were no marked matrix effects derived from green pepper and eggplant extracts. On the other hand, although the effect due to a pigment in spinach extract on the assay performance was significant, it was effectively avoided by increasing the dilution level of the spinach extract. For thiamethoxam-spiked samples, acceptable recoveries of 97.9-109.1% and coefficients of variation of 0.3-11.5% were obtained. Inspection of the validity of the ELISA by comparison with the reference HPLC method showed that the two analytical results were very similar, and a high correlation was found between them (r>0.997). The evaluated ELISA combined with hand-shaking extraction provided a rapid and simple screening analysis that was quantitative and reliable for the detection of thiamethoxam in complex agricultural products. © 2012 Society of Chemical Industry.
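    The validation figures quoted above (recoveries of 97.9-109.1%, coefficients of variation of 0.3-11.5%) are simple ratios computed from spiked-sample measurements; a sketch with illustrative numbers, not the study data:

```python
import math

def recovery_percent(measured, spiked):
    """Spike recovery: measured concentration as a percentage of the spiked amount."""
    return 100.0 * measured / spiked

def cv_percent(values):
    """Coefficient of variation: sample standard deviation over the mean, in percent."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100.0 * sd / mean
```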

  17. Ad hoc versus standardized admixtures for continuous infusion drugs in neonatal intensive care: cognitive task analysis of safety at the bedside.

    PubMed

    Brannon, Timothy S

    2006-01-01

    Continuous infusion intravenous (IV) drugs in neonatal intensive care are usually prepared based on patient weight so that the dose is readable as a simple multiple of the infusion pump rate. New safety guidelines propose that hospitals switch to using standardized admixtures of these drugs to prevent calculation errors during ad hoc preparation. Extended hierarchical task analysis suggests that switching to standardized admixtures may lead to more errors in programming the pump at the bedside.

  18. Ad Hoc versus Standardized Admixtures for Continuous Infusion Drugs in Neonatal Intensive Care: Cognitive Task Analysis of Safety at the Bedside

    PubMed Central

    Brannon, Timothy S.

    2006-01-01

    Continuous infusion intravenous (IV) drugs in neonatal intensive care are usually prepared based on patient weight so that the dose is readable as a simple multiple of the infusion pump rate. New safety guidelines propose that hospitals switch to using standardized admixtures of these drugs to prevent calculation errors during ad hoc preparation. Extended hierarchical task analysis suggests that switching to standardized admixtures may lead to more errors in programming the pump at the bedside. PMID:17238482

  19. Recursive sequences in first-year calculus

    NASA Astrophysics Data System (ADS)

    Krainer, Thomas

    2016-02-01

    This article provides ready-to-use supplementary material on recursive sequences for a second-semester calculus class. It equips first-year calculus students with a basic methodical procedure based on which they can conduct a rigorous convergence or divergence analysis of many simple recursive sequences on their own without the need to invoke inductive arguments as is typically required in calculus textbooks. The sequences that are accessible to this kind of analysis are predominantly (eventually) monotonic, but also certain recursive sequences that alternate around their limit point as they converge can be considered.
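    The kind of analysis described can be illustrated with a standard textbook example (my choice, not taken from the article): for a_1 = 1 and a_{n+1} = sqrt(2 + a_n), the candidate limit L solves L = sqrt(2 + L), i.e. L^2 - L - 2 = 0, so L = 2. Iteration exhibits the monotone, bounded behavior that a rigorous convergence argument would establish.

```python
# Hedged numerical check of the standard example a_1 = 1,
# a_{n+1} = sqrt(2 + a_n), whose fixed point is L = 2.
import math

a = 1.0
terms = [a]
for _ in range(20):
    a = math.sqrt(2.0 + a)
    terms.append(a)

monotone = all(x < y for x, y in zip(terms, terms[1:]))  # strictly increasing
bounded = all(x <= 2.0 for x in terms)                   # bounded above by 2
print(monotone, bounded, round(terms[-1], 6))
```

    The numerical evidence (increasing, bounded, approaching 2) is of course not a proof; it is the pattern the article's methodical procedure turns into one.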

  20. Free vibration of arches flexible in shear.

    NASA Technical Reports Server (NTRS)

    Austin, W. J.; Veletsos, A. S.

    1973-01-01

    An analysis reported by Veletsos et al. (1972) concerning the free vibrational characteristics of circular arches vibrating in their own planes is considered. The analysis was based on a theory which neglects the effects of rotatory inertia and shearing deformation. A supplementary investigation is conducted to assess the effects of the previously neglected factors and to identify the conditions under which these effects are of practical significance or may be neglected. A simple approximate procedure is developed for estimating the natural frequencies of arches, giving due consideration to the effects of the previously neglected factors.

  1. An Attempt to Measure the Traffic Impact of Airline Alliances

    NASA Technical Reports Server (NTRS)

    Iatrou, Kostas; Skourias, Nikolaos

    2005-01-01

    This paper analyzes the effects of airline alliances on the allied partners' output by comparing the traffic change observed between the pre- and post-alliance periods. First, a simple methodology based on passenger traffic modelling is developed; an empirical analysis is then conducted using time series from four global strategic alliances (Wings, Star Alliance, oneworld and SkyTeam) and 124 alliance routes. The analysis concludes that, all other things being equal, strategic alliances lead to an average improvement of 9.4% in passenger volume.

  2. Nonlinear fracture mechanics-based analysis of thin wall cylinders

    NASA Technical Reports Server (NTRS)

    Brust, Frederick W.; Leis, Brian N.; Forte, Thomas P.

    1994-01-01

    This paper presents a simple analysis technique to predict the crack initiation, growth, and rupture of large radius-to-thickness (R/t) ratio (thin wall) cylinders. The method is formulated to handle both stable tearing and fatigue mechanisms in applications to surface and through-wall axial cracks, including interacting surface cracks. The method can also account for time-dependent effects. Validation of the model is provided by comparisons of predictions to more than forty full-scale experiments of thin wall cylinders pressurized to failure.

  3. Analysis and control of the METC fluid bed gasifier. Quarterly progress report, January--March 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-01

    This document summarizes work performed during the period 10/1/94 to 3/31/95. In this work, three components will form the basis for the design of a control scheme for the Fluidized Bed Gasifier (FBG) at METC: (1) a control systems analysis based on simple linear models derived from process data, (2) a review of the literature on fluid bed gasifier operation and control, and (3) an understanding of present FBG operation and real-world considerations. Below we summarize work accomplished to date in each of these areas.

  4. Operational experience in underwater photogrammetry

    NASA Astrophysics Data System (ADS)

    Leatherdale, John D.; John Turner, D.

    Underwater photogrammetry has become established as a cost-effective technique for inspection and maintenance of platforms and pipelines for the offshore oil industry. A commercial service based in Scotland operates in the North Sea, USA, Brazil, West Africa and Australia. 70 mm cameras and flash units are built for the purpose and analytical plotters and computer graphics systems are used for photogrammetric measurement and analysis of damage, corrosion, weld failures and redesign of underwater structures. Users are seeking simple, low-cost systems for photogrammetric analysis which their engineers can use themselves.

  5. Hydrodynamic lift for single cell manipulation in a femtosecond laser fabricated optofluidic chip

    NASA Astrophysics Data System (ADS)

    Bragheri, Francesca; Osellame, Roberto

    2017-08-01

    Single cell sorting based either on fluorescence or on mechanical properties has been exploited in recent years in microfluidic devices. Hydrodynamic focusing increases the efficiency of these devices by improving the match between the region of optical analysis and that of cell flow. Here we present a very simple solution, fabricated by femtosecond laser micromachining, that exploits flow laminarity in microfluidic channels to easily lift the sample flow position to the channel portion illuminated by the optical waveguides used for single cell trapping and analysis.

  6. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
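    For readers unfamiliar with the Wald test itself, the sketch below shows the generic sequential procedure on Bernoulli observations. This is a minimal illustration of Wald's thresholds, not the paper's conjunction-specific likelihood ratio; the hypotheses p0 and p1 and the error rates are invented.

```python
# Hedged sketch of a generic Wald SPRT: accumulate the log-likelihood ratio
# of H1 (success prob p1) vs H0 (p0) and stop at Wald's two thresholds.
import math

def sprt(observations, p0=0.1, p1=0.5, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        if x:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(observations)

decision, n = sprt([1, 1, 1, 1, 1, 1])
print(decision, n)  # -> accept H1 3 (three successes suffice here)
```

    The appeal of the sequential form is visible above: a decision is reached after three observations rather than a fixed-sample-size batch.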

  7. Constellation Coverage Analysis

    NASA Technical Reports Server (NTRS)

    Lo, Martin W. (Compiler)

    1997-01-01

    The design of satellite constellations requires an understanding of the dynamic global coverage provided by the constellations. Even for a small constellation with a simple circular orbit propagator, the combinatorial nature of the analysis frequently renders the problem intractable. Particularly for the initial design phase where the orbital parameters are still fluid and undetermined, the coverage information is crucial to evaluate the performance of the constellation design. We have developed a fast and simple algorithm for determining the global constellation coverage dynamically using image processing techniques. This approach provides a fast, powerful and simple method for the analysis of global constellation coverage.
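    A minimal sketch of the rasterized ("image") coverage idea follows. The sub-satellite points and footprint radius are invented, and a real implementation would weight grid cells by area and drive the footprints from an orbit propagator; here each footprint is simply painted onto a coarse lat/lon grid and covered cells are counted.

```python
# Hedged sketch: grid-based static coverage count for a set of hypothetical
# sub-satellite points, each covering a circular footprint of given angular
# radius. Cells are plain lat/lon pixels (not area-weighted).
import math

def covered_fraction(footprints, radius_deg, step=5):
    lats = range(-90 + step // 2, 90, step)
    lons = range(-180 + step // 2, 180, step)
    covered = total = 0
    for lat in lats:
        for lon in lons:
            total += 1
            for flat, flon in footprints:
                # central angle between grid cell and sub-satellite point
                cosang = (math.sin(math.radians(lat)) * math.sin(math.radians(flat))
                          + math.cos(math.radians(lat)) * math.cos(math.radians(flat))
                          * math.cos(math.radians(lon - flon)))
                ang = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
                if ang <= radius_deg:
                    covered += 1
                    break
    return covered / total

# four equatorial satellites, 60-degree footprints: equator covered, poles not
frac = covered_fraction([(0.0, 0.0), (0.0, 90.0), (0.0, -90.0), (0.0, 180.0)], 60.0)
print(round(frac, 3))
```

    The image-processing framing in the abstract amounts to doing this painting step with raster operations instead of per-cell loops, which is what makes the dynamic, full-constellation case fast.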

  8. Quality Control System using Simple Implementation of Seven Tools for Batik Textile Manufacturing

    NASA Astrophysics Data System (ADS)

    Ragil Suryoputro, Muhammad; Sugarindra, Muchamad; Erfaisalsyah, Hendy

    2017-06-01

    In order to produce better products and reduce defects, every company must implement a capable and reliable quality control system. One approach is the simple implementation of the seven quality control tools: flowchart, check sheet, histogram, scatter diagram combined with control charts, Pareto diagram and fishbone (cause-and-effect) diagram. The case studied in this research was the defect level of xyz grey fabric on shuttle loom 2 at a batik manufacturing company. The check sheet identified the defect types in the woven xyz grey fabric as broken warp, double warp, empty warp, loose warp, ugly edges, thick warp, and rust. The control chart analysis indicates that the process is out of control: the chart still shows many outlying data points. The scatter diagram shows a positive correlation between the defect percentage and the production volume. Based on the Pareto diagram, repair priority goes to the dominant defect type, warp (44%); double warp is also the highest on the histogram, with a value of 23635.11 m. In addition, the fishbone diagram analysis traces double warp and the other defect types to causes in materials, methods, machines, measurements, man and environment. The company can thus take preventive and corrective action to minimize defects and improve product quality.
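    The Pareto step of the seven tools reduces to a short computation: rank defect types by count and accumulate their shares until the "vital few" responsible for roughly 80% of defects are identified. The counts below are illustrative, not the company's data.

```python
# Hedged sketch of a Pareto analysis: rank defect types and flag the
# categories that together account for ~80% of all defects.
defects = {"warp break": 440, "double warp": 230, "empty warp": 120,
           "ugly edges": 110, "thick warp": 60, "rust": 40}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

cum = 0.0
vital_few = []
for name, count in ranked:
    cum += count / total
    vital_few.append(name)
    if cum >= 0.8:        # stop once the cumulative share reaches 80%
        break

print(vital_few)
```

    Plotting `ranked` as bars with the cumulative share as a line gives the familiar Pareto chart; the list printed above is the repair-priority queue.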

  9. Similarity networks as a knowledge representation for space applications

    NASA Technical Reports Server (NTRS)

    Bailey, David; Thompson, Donna; Feinstein, Jerald

    1987-01-01

    Similarity networks are a powerful form of knowledge representation that are useful for many artificial intelligence applications. Similarity networks are used in applications ranging from information analysis and case based reasoning to machine learning and linking symbolic to neural processing. Strengths of similarity networks include simple construction, intuitive object storage, and flexible retrieval techniques that facilitate inferencing. Therefore, similarity networks provide great potential for space applications.

  10. Project Air Force, Annual Report 2003

    DTIC Science & Technology

    2003-01-01

    to Simulate Personnel Retention The CAPM system is based on a simple assumption about employee retention: A rational individual faced with the...analysis to certain parts of the force. CAPM keeps a complete record of the assumptions, policies, and data used for each scenario. Thus decisionmakers...premises and assumptions. Instead, the Commission concluded that space is a separate operating arena equivalent to the air, land, and maritime

  11. Finite Element Based Structural Damage Detection Using Artificial Boundary Conditions

    DTIC Science & Technology

    2007-09-01

    C. (2005). Elementary Linear Algebra. New York: John Wiley and Sons. Avitable, Peter (2001, January) Experimental Modal Analysis, A Simple Non...variables under consideration. 3 Frequency sensitivities are the basis for a linear approximation to compute the change in the natural frequencies of a...THEORY The general problem statement for a nonlinear constrained optimization problem is: To minimize f(x) (Objective Function) Subject to

  12. Association analysis of the monoamine oxidase A gene in bipolar affective disorder by using family-based internal controls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noethen, M.M.; Eggermann, K.; Propping, P.

    1995-10-01

    It is well accepted that association studies are a major tool in investigating the contribution of single genes to the development of diseases that do not follow a simple Mendelian inheritance pattern (so-called complex traits). Such major psychiatric diseases as bipolar affective disorder and schizophrenia clearly fall into this category of diseases. 7 refs., 1 tab.

  13. A simplified life-cycle cost comparison of various engines for small helicopter use

    NASA Technical Reports Server (NTRS)

    Civinskas, K. C.; Fishbach, L. M.

    1974-01-01

    A ten-year, life-cycle cost comparison is made of the following engines for small helicopter use: (1) simple turboshaft; (2) regenerative turboshaft; (3) compression-ignition reciprocator; (4) spark-ignited rotary; and (5) spark-ignited reciprocator. Based on a simplified analysis and somewhat approximate data, the simple turboshaft engine apparently has the lowest costs for mission times up to just under 2 hours. At 2 hours and above, the regenerative turboshaft appears promising. The reciprocating and rotary engines are less attractive, requiring from 10 percent to 80 percent more aircraft to have the same total payload capability as a given number of turbine powered craft. A nomogram was developed for estimating total costs of engines not covered in this study.

  14. Experimental determination of gap flow-conditioned forces at turbine stages and their effect on the running stability of simple rotors. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wohlrab, R.

    1983-01-01

    Instabilities in turbine operation can be caused by forces which are produced in connection with motions involving the oil film in the bearings. An experimental investigation of the characteristics of such forces in three typical steam turbine stages is conducted, taking into account the effect of various parameters. Supplementary kinetic tests are carried out to obtain an estimate of the flow forces which are proportional to the velocity. The measurements are based on the theoretical study of the damping characteristics of a vibrational model. A computational analysis of the effect of the measured fluid forces on the stability characteristics of a simple rotor model is also conducted.

  15. A simple and efficient method for predicting protein-protein interaction sites.

    PubMed

    Higa, R H; Tozzi, C L

    2008-09-23

    Computational methods for predicting protein-protein interaction sites based on structural data are characterized by an accuracy between 70 and 80%. Some experimental studies indicate that only a fraction of the residues, forming clusters in the center of the interaction site, are energetically important for binding. In addition, the analysis of amino acid composition has shown that residues located in the center of the interaction site can be better discriminated from the residues in other parts of the protein surface. In the present study, we implement a simple method to predict interaction site residues exploiting this fact and show that it achieves a very competitive performance compared to other methods using the same dataset and criteria for performance evaluation (success rate of 82.1%).

  16. Cellufine sulfate column chromatography as a simple, rapid, and effective method to purify dengue virus.

    PubMed

    Kanlaya, Rattiyaporn; Thongboonkerd, Visith

    2016-08-01

    The conventional method to purify/concentrate dengue virus (DENV) is time-consuming and gives a low virus recovery yield. Herein, we applied cellufine sulfate column chromatography to purify/concentrate DENV based on the mimicry between heparan sulfate and the DENV envelope protein. Comparative analysis demonstrated that this new method offered higher purity (as determined by less contamination with bovine serum albumin) and a higher recovery yield (as determined by greater infectivity). Moreover, the overall duration of cellufine sulfate column chromatography to purify/concentrate DENV was approximately 1/20 that of the conventional method. Therefore, cellufine sulfate column chromatography serves as a simple, rapid, and effective alternative method for DENV purification/concentration. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Estimation of surface temperature in remote pollution measurement experiments

    NASA Technical Reports Server (NTRS)

    Gupta, S. K.; Tiwari, S. N.

    1978-01-01

    A simple algorithm has been developed for estimating the actual surface temperature by applying corrections to the effective brightness temperature measured by radiometers mounted on remote sensing platforms. Corrections to effective brightness temperature are computed using an accurate radiative transfer model for the 'basic atmosphere' and several modifications of this caused by deviations of the various atmospheric and surface parameters from their base model values. Model calculations are employed to establish simple analytical relations between the deviations of these parameters and the additional temperature corrections required to compensate for them. Effects of simultaneous variation of two parameters are also examined. Use of these analytical relations instead of detailed radiative transfer calculations for routine data analysis results in a severalfold reduction in computation costs.

  18. Ground-state energies of simple metals

    NASA Technical Reports Server (NTRS)

    Hammerberg, J.; Ashcroft, N. W.

    1974-01-01

    A structural expansion for the static ground-state energy of a simple metal is derived. Two methods are presented, one an approach based on single-particle band structure which treats the electron gas as a nonlinear dielectric, the other a more general many-particle analysis using finite-temperature perturbation theory. The two methods are compared, and it is shown in detail how band-structure effects, Fermi-surface distortions, and chemical-potential shifts affect the total energy. These are of special interest in corrections to the total energy beyond third order in the electron-ion interaction and hence to systems where differences in energies for various crystal structures are exceptionally small. Preliminary calculations using these methods for the zero-temperature thermodynamic functions of atomic hydrogen are reported.

  19. Design of a global soil moisture initialization procedure for the simple biosphere model

    NASA Technical Reports Server (NTRS)

    Liston, G. E.; Sud, Y. C.; Walker, G. K.

    1993-01-01

    Global soil moisture and land-surface evapotranspiration fields are computed using an analysis scheme based on the Simple Biosphere (SiB) soil-vegetation-atmosphere interaction model. The scheme is driven with observed precipitation and potential evapotranspiration, where the potential evapotranspiration is computed following the surface air temperature-potential evapotranspiration regression of Thornthwaite (1948). The observed surface air temperature is corrected to reflect potential (zero soil moisture stress) conditions by letting the ratio of actual transpiration to potential transpiration be a function of the normalized difference vegetation index (NDVI). Soil moisture, evapotranspiration, and runoff data are generated on a daily basis for a 10-year period, January 1979 through December 1988, using observed precipitation gridded at a 4 deg by 5 deg resolution.
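    The Thornthwaite (1948) relation the scheme leans on estimates monthly potential evapotranspiration from mean air temperature alone. A sketch of the standard unadjusted form is below; the monthly temperatures are an invented mid-latitude example, and the day-length/month-length adjustment factors the full method applies are omitted.

```python
# Hedged sketch of Thornthwaite's temperature-only PET formula:
# PET = 16 * (10*T/I)^a mm/month (unadjusted), with the annual heat
# index I built from the twelve monthly mean temperatures.
monthly_t = [1, 2, 5, 10, 15, 19, 22, 21, 17, 11, 6, 2]  # deg C, illustrative

# annual heat index I (months at or below 0 C contribute nothing)
I = sum((t / 5.0) ** 1.514 for t in monthly_t if t > 0)
a = (6.75e-7 * I**3) - (7.71e-5 * I**2) + (1.792e-2 * I) + 0.49239

def pet_mm(t_c):
    """Unadjusted monthly PET in mm for mean air temperature t_c (deg C)."""
    if t_c <= 0:
        return 0.0
    return 16.0 * (10.0 * t_c / I) ** a

print(round(pet_mm(22), 1))  # warmest month of the example
```

    Driving such a curve with observed temperature, then scaling the actual-to-potential transpiration ratio by NDVI, is the essence of the correction the abstract describes.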

  20. Structural expansions for the ground state energy of a simple metal

    NASA Technical Reports Server (NTRS)

    Hammerberg, J.; Ashcroft, N. W.

    1973-01-01

    A structural expansion for the static ground state energy of a simple metal is derived. An approach based on single particle band structure which treats the electron gas as a non-linear dielectric is presented, along with a more general many particle analysis using finite temperature perturbation theory. The two methods are compared, and it is shown in detail how band-structure effects, Fermi surface distortions, and chemical potential shifts affect the total energy. These are of special interest in corrections to the total energy beyond third order in the electron ion interaction, and hence to systems where differences in energies for various crystal structures are exceptionally small. Preliminary calculations using these methods for the zero temperature thermodynamic functions of atomic hydrogen are reported.

  1. A fast and simple spectrofluorometric method for the determination of alendronate sodium in pharmaceuticals

    PubMed Central

    Ezzati Nazhad Dolatabadi, Jafar; Hamishehkar, Hamed; de la Guardia, Miguel; Valizadeh, Hadi

    2014-01-01

    Introduction: Alendronate sodium enhances bone formation, increases osteoblast proliferation and maturation, and inhibits osteoblast apoptosis. Therefore, a rapid and simple spectrofluorometric method has been developed and validated for its quantitative determination. Methods: The procedure is based on the reaction of the primary amino group of alendronate with o-phthalaldehyde (OPA) in sodium hydroxide solution. Results: The calibration graph was linear over the concentration range of 0.0-2.4 μM, and the limit of detection and limit of quantification of the method were 8.89 and 29 nM, respectively. The enthalpy and entropy of the reaction between alendronate sodium and OPA showed that the reaction is endothermic and entropy favored (ΔH = 154.08 kJ/mol; ΔS = 567.36 J/mol K), which indicates that the OPA interaction with alendronate increases at elevated temperature. Conclusion: This simple method can be used as a practical technique for the analysis of alendronate in various samples. PMID:24790897
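    The calibration treatment behind figures like these is standard. The sketch below fits a least-squares line to fluorescence versus concentration and derives detection limits via the common ICH-style convention LOD = 3.3·s/slope and LOQ = 10·s/slope, where s is the residual standard deviation; the data points are invented, not the paper's, and the paper may have used a different LOD convention.

```python
# Hedged calibration sketch: linear fit plus LOD/LOQ from residual scatter.
conc = [0.0, 0.4, 0.8, 1.2, 1.6, 2.0, 2.4]                # micromolar
signal = [2.0, 101.0, 202.5, 299.0, 403.0, 498.5, 601.0]  # arbitrary units

n = len(conc)
mx = sum(conc) / n
my = sum(signal) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

# residual standard deviation about the fitted line (n - 2 dof)
resid = [y - (intercept + slope * x) for x, y in zip(conc, signal)]
s = (sum(r * r for r in resid) / (n - 2)) ** 0.5

lod = 3.3 * s / slope   # limit of detection, same units as conc
loq = 10.0 * s / slope  # limit of quantification
print(round(slope, 1), round(lod, 4), round(loq, 4))
```

    With these synthetic points the LOD lands near 0.02 μM, illustrating why a steep, low-scatter calibration line translates directly into nanomolar-level detection limits.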

  2. A practical model for pressure probe system response estimation (with review of existing models)

    NASA Astrophysics Data System (ADS)

    Hall, B. F.; Povey, T.

    2018-04-01

    The accurate estimation of the unsteady response (bandwidth) of pneumatic pressure probe systems (probe, line and transducer volume) is a common practical problem encountered in the design of aerodynamic experiments. Understanding the bandwidth of the probe system is necessary to capture unsteady flow features accurately. Where traversing probes are used, the desired traverse speed and spatial gradients in the flow dictate the minimum probe system bandwidth required to resolve the flow. Existing approaches for bandwidth estimation are either complex or inaccurate in implementation, so probes are often designed based on experience. Where probe system bandwidth is characterized, it is often done experimentally, requiring careful experimental set-up and analysis. There is a need for a relatively simple but accurate model for estimation of probe system bandwidth. A new model is presented for the accurate estimation of pressure probe bandwidth for simple probes commonly used in wind tunnel environments; experimental validation is provided. An additional, simple graphical method for air is included for convenience.
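    The paper's model is not reproduced here, but a rough, generic first estimate of a probe system's resonant frequency, an upper bound on usable bandwidth, treats the tubing as the neck and the transducer cavity as the volume of a Helmholtz resonator. All dimensions below are invented.

```python
# Hedged back-of-envelope (not the paper's model): Helmholtz resonance
# f = (c / 2*pi) * sqrt(A / (V * L_eff)) for a line-plus-cavity system.
import math

c = 343.0          # speed of sound in air, m/s
d = 1.0e-3         # line bore diameter, m (assumed)
L = 0.30           # line length, m (assumed)
V = 50e-9          # transducer cavity volume, m^3 (50 mm^3, assumed)

A = math.pi * (d / 2) ** 2   # bore cross-sectional area
L_eff = L + 0.85 * d         # simple end-corrected neck length

f_helmholtz = (c / (2 * math.pi)) * math.sqrt(A / (V * L_eff))
print(round(f_helmholtz, 1))  # Hz
```

    Estimates of this kind show the expected scaling (longer or narrower lines and larger cavities lower the bandwidth) but ignore viscous damping, which is precisely what more careful models of the kind the paper proposes capture.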

  3. Simple-1: Development stage of the data transmission system for a solid propellant mid-power rocket model

    NASA Astrophysics Data System (ADS)

    Yarce, Andrés; Sebastián Rodríguez, Juan; Galvez, Julián; Gómez, Alejandro; García, Manuel J.

    2017-06-01

    This paper presents the development stage of a communication module for a solid propellant mid-power rocket model. The communication module was named Simple-1, and this work covers its design, construction and testing. A rocket model Estes Ventris Series Pro II® was modified to introduce, on top of the payload, several sensors in a CanSat form factor. The Printed Circuit Board (PCB) was designed and fabricated from Commercial Off The Shelf (COTS) components and assembled in a cylindrical rack structure similar to this small-format satellite concept. The sensor data were processed using an Arduino Mini and transmitted by a radio module to a Software Defined Radio (SDR) HackRF-based platform at the ground station. The Simple-1 was tested using a drone in successive releases, reaching altitudes from 200 to 300 meters. Different kinds of data, in terms of altitude, position, atmospheric pressure and vehicle temperature, were successfully measured, making possible the progress to a next stage of launching and analysis.

  4. A simple algorithm for distance estimation without radar and stereo vision based on the bionic principle of bee eyes

    NASA Astrophysics Data System (ADS)

    Khamukhin, A. A.

    2017-02-01

    Simple navigation algorithms are needed for small autonomous unmanned aerial vehicles (UAVs). Such algorithms can be implemented in a small microprocessor with low power consumption, which helps to reduce the weight of the UAV's computing equipment and to increase the flight range. The proposed algorithm uses only the number of opaque channels (ommatidia in bees) through which a target can be seen as an observer moves from location 1 to location 2 toward the target. The distance estimate is given relative to the distance between locations 1 and 2. A simple scheme of an appositional compound eye is proposed to develop the calculation formula. The distance estimation error analysis shows that the error decreases with an increase in the total number of opaque channels, up to a certain limit. An acceptable error of about 2% is achieved with an angle of view from 3 to 10° when the total number of opaque channels is 21600.
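    One way to read the counting idea (my interpretation, not necessarily the paper's exact formula): if each ommatidial channel subtends the same small angle, the number of channels a target spans is inversely proportional to its distance, so two counts taken a baseline apart give the remaining range by simple parallax.

```python
# Hedged sketch: small-angle parallax from channel counts. Moving a step b
# straight toward the target, n1 channels spanned at location 1 and n2 > n1
# at location 2 imply remaining distance d2 = b * n1 / (n2 - n1).
def distance_over_baseline(n1, n2):
    """Remaining distance from location 2, in units of the step b."""
    if n2 <= n1:
        raise ValueError("target must span more channels after approaching")
    return n1 / (n2 - n1)

# Example: 20 channels, then 25 after one step -> 4 steps of distance remain.
print(distance_over_baseline(20, 25))  # -> 4.0
```

    The error analysis in the abstract then follows naturally: with a finite total number of channels, n1 and n2 are quantized, and finer quantization (more ommatidia) shrinks the relative error, up to a limit.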

  5. Discrimination of the rare medicinal plant Dendrobium officinale based on naringenin, bibenzyl, and polysaccharides.

    PubMed

    Chen, Xiaomei; Wang, Fangfei; Wang, Yunqiang; Li, Xuelan; Wang, Airong; Wang, Chunlan; Guo, Shunxing

    2012-12-01

    The aim of this study was to establish a method for discriminating Dendrobium officinale from four of its close relatives, Dendrobium chrysanthum, Dendrobium crystallinum, Dendrobium aphyllum and Dendrobium devonianum, based on chemical composition analysis. We analyzed 62 samples of 24 Dendrobium species. High performance liquid chromatography analysis confirmed that the four low molecular weight compounds 4',5,7-trihydroxyflavanone (naringenin), 3,4-dihydroxy-4',5-dimethoxybibenzyl (DDB-2), 3',4-dihydroxy-3,5'-dimethoxybibenzyl (gigantol), and 4,4'-dihydroxy-3,3',5-trimethoxybibenzyl (moscatilin) were common in the genus. The phenol-sulfuric acid method was used to quantify polysaccharides, and the monosaccharide composition of the polysaccharides was determined by gas chromatography. Stepwise discriminant analysis was used to differentiate among the five closely related species based on the chemical composition analysis. This proved to be a simple and accurate approach for discriminating among these species. The results also showed that the polysaccharide content, the amounts of the four low molecular weight compounds, and the mannose to glucose ratio were important factors for species discrimination. Therefore, we propose that a chemical analysis based on quantification of naringenin, bibenzyls, and polysaccharides is effective for identifying D. officinale.

  6. Finite element modeling and analysis of tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.

    1983-01-01

    Predicting the response of tires under various loading conditions using finite element technology is addressed. Some of the recent advances in finite element technology which have high potential for application to tire modeling problems are reviewed, and the analysis and modeling needs for tires are identified. Topics include: reduction methods for large-scale nonlinear analysis, with particular emphasis on the treatment of combined loads and displacement-dependent and nonconservative loadings; the development of simple and efficient mixed finite element models for shell analysis, the identification of equivalent mixed and purely displacement models, and the determination of the advantages of using mixed models; and effective computational models for large-rotation nonlinear problems, based on a total Lagrangian description of the deformation.

  7. Post hoc support vector machine learning for impedimetric biosensors based on weak protein-ligand interactions.

    PubMed

    Rong, Y; Padron, A V; Hagerty, K J; Nelson, N; Chi, S; Keyhani, N O; Katz, J; Datta, S P A; Gomes, C; McLamore, E S

    2018-04-30

    Impedimetric biosensors for measuring small molecules based on weak/transient interactions between bioreceptors and target analytes are a challenge for detection electronics, particularly in field studies or in the analysis of complex matrices. Protein-ligand binding sensors have enormous potential for biosensing, but achieving accuracy in complex solutions is a major challenge. There is a need for simple post hoc analytical tools that are not computationally expensive, yet provide near real-time feedback on data derived from impedance spectra. Here, we show the use of a simple, open source support vector machine learning algorithm for analyzing impedimetric data in lieu of equivalent circuit analysis. We demonstrate two different protein-based biosensors to show that the tool can be used for various applications. We conclude with a mobile phone-based demonstration focused on the measurement of acetone, an important biomarker related to the onset of diabetic ketoacidosis. In all conditions tested, the open source classifier was capable of performing as well as or better than the equivalent circuit analysis for characterizing weak/transient interactions between a model ligand (acetone) and a small chemosensory protein derived from the tsetse fly. In addition, the tool has a low computational requirement, facilitating use in mobile acquisition systems such as mobile phones. The protocol is deployed through Jupyter notebook (an open source computing environment available for mobile phone, tablet or computer use) and the code was written in Python. For each of the applications, we provide step-by-step instructions in English, Spanish, Mandarin and Portuguese to facilitate widespread use. All codes were based on scikit-learn, an open source machine learning library in the Python language, and were processed in Jupyter notebook, an open-source web application for Python. The tool can easily be integrated with mobile biosensor equipment for rapid detection, facilitating use by a broad range of impedimetric biosensor users. This post hoc analysis tool can serve as a launchpad for the convergence of nanobiosensors in planetary health monitoring applications based on mobile phone hardware.
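    In the spirit of the pipeline described (which uses scikit-learn), the sketch below trains an SVM classifier on features summarizing impedance spectra. The two-feature "acetone present/absent" data here are synthetic stand-ins; real work would derive features from measured spectra, and the authors' actual feature set is not reproduced.

```python
# Hedged sketch: scikit-learn SVC on synthetic impedance-derived features.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# two synthetic classes: small vs. large impedance-shift feature vectors
absent = rng.normal(loc=[1.0, 0.5], scale=0.1, size=(40, 2))
present = rng.normal(loc=[1.6, 1.2], scale=0.1, size=(40, 2))
X = np.vstack([absent, present])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
accuracy = clf.score(X, y)
print(accuracy)
```

    The point of the post hoc approach is visible even in this toy: once spectra are reduced to a few features, classification replaces the fitting of an equivalent circuit, and the trained model is cheap enough to run on a phone.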

  8. Behavior related pauses in simple spike activity of mouse Purkinje cells are linked to spike rate modulation

    PubMed Central

    Cao, Ying; Maran, Selva K.; Dhamala, Mukesh; Jaeger, Dieter; Heck, Detlef H.

    2012-01-01

    Purkinje cells (PCs) in the mammalian cerebellum express high frequency spontaneous activity with average spike rates between 30 and 200 Hz. Cerebellar nuclear (CN) neurons receive converging input from many PCs, resulting in a continuous barrage of inhibitory inputs. It has been hypothesized that pauses in PC activity trigger increases in CN spiking activity. A prediction derived from this hypothesis is that pauses in PC simple spike activity represent relevant behavioral or sensory events. Here we asked whether pauses in the simple spike activity of PCs related to either fluid licking or respiration play a special role in representing information about behavior. Both behaviors are widely represented in cerebellar PC simple spike activity. We recorded PC activity in the vermis and lobus simplex of head-fixed mice while monitoring licking and respiratory behavior. Using cross correlation and Granger causality analysis we examined whether short ISIs had a different temporal relation to behavior than long ISIs or pauses. Behavior-related simple spike pauses occurred during low-rate simple spike activity in both licking- and breathing-related PCs. Granger causality analysis revealed causal relationships between simple spike pauses and behavior. However, the same results were obtained from an analysis of surrogate spike trains with gamma ISI distributions constructed to match the rate modulations of behavior-related Purkinje cells. Our results therefore suggest that the occurrence of pauses in simple spike activity does not represent additional information about behavioral or sensory events beyond the simple spike rate modulations. PMID:22723707
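    The surrogate construction the authors describe (gamma-distributed inter-spike intervals matched to the cell's firing rate) can be sketched briefly. The shape parameter below is an illustrative choice, not taken from the paper; shape controls ISI regularity while the scale is set so the mean ISI matches the target rate.

```python
# Hedged sketch: rate-matched surrogate spike train with gamma ISIs.
import random

random.seed(1)  # reproducible draw

def gamma_surrogate(rate_hz, duration_s, shape=4.0):
    """Spike times (s) whose ISIs are gamma with mean 1/rate_hz."""
    scale = 1.0 / (rate_hz * shape)   # mean ISI = shape * scale = 1/rate
    t, spikes = 0.0, []
    while True:
        t += random.gammavariate(shape, scale)
        if t > duration_s:
            return spikes
        spikes.append(t)

spikes = gamma_surrogate(rate_hz=60.0, duration_s=10.0)
mean_rate = len(spikes) / 10.0
print(round(mean_rate, 1))  # close to the 60 Hz target
```

    Running the same pause-detection and Granger analyses on such surrogates is what let the authors test whether pauses carry information beyond the rate modulation itself.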

  9. Simple Colorimetric Sensor for Trinitrotoluene Testing

    NASA Astrophysics Data System (ADS)

    Samanman, S.; Masoh, N.; Salah, Y.; Srisawat, S.; Wattanayon, R.; Wangsirikul, P.; Phumivanichakit, K.

    2017-02-01

    A simple colorimetric sensor for trinitrotoluene (TNT) determination, using a commercial scanner for image capture, was designed. The sensor is based on the chemical reaction between TNT and a sodium hydroxide reagent, which produces a color change within 96-well plates that is then recorded with the scanner. The intensity of the color change increased with increasing TNT concentration, and the concentration of TNT could easily be quantified by digital image analysis using the free ImageJ software. Under optimum conditions, the sensor provided a linear dynamic range between 0.20 and 1.00 mg mL-1 (r = 0.9921) with a limit of detection of 0.10 ± 0.01 mg mL-1. The relative standard deviation of the sensitivity over eight experiments was 3.8%. When applied to the analysis of TNT in two soil extract samples, the concentrations found ranged from non-detectable to 0.26 ± 0.04 mg mL-1. The recovery values obtained (93-95%) were acceptable for the soil samples tested.
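
    The quantification step described above (color intensity to concentration via a linear calibration) can be sketched as follows. All intensity values and the fitting routine are hypothetical illustrations, not the authors' data:

```python
import numpy as np

# Linear calibration for the digital-image readout: fit mean well
# intensity against known TNT standards, then invert the fit to
# quantify an unknown sample. Intensities are hypothetical.
conc = np.array([0.20, 0.40, 0.60, 0.80, 1.00])         # mg/mL standards
intensity = np.array([30.0, 58.0, 91.0, 118.0, 150.0])  # arbitrary units

slope, intercept = np.polyfit(conc, intensity, 1)  # least-squares line
r = np.corrcoef(conc, intensity)[0, 1]             # calibration linearity

def quantify(sample_intensity):
    """Invert the calibration line to estimate concentration (mg/mL)."""
    return (sample_intensity - intercept) / slope
```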

  10. ESCA studies of the surface chemistry of lunar fines. [Electron Spectroscopic Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Housley, R. M.; Grant, R. W.

    1976-01-01

    The paper presents an ESCA analysis based on the use of a synthetic lunar-glass standard that allows determination of the surface composition of lunar samples with an accuracy that appears to be better than 10% of the amount present for all major elements except Ti. It is found that, on the average, grain surfaces in the lunar fines samples 10084 and 15301 are strongly enriched in Si, moderately enriched in Fe, moderately depleted in Al and Ca, and strongly depleted in Mg. This pattern could not be produced by the deposition of any expected meteoritic vapor. Neither could it be produced by simple inverse-mass-dependent element loss during sputtering. It is suggested that at least part of the pattern may be a simple consequence of agglutinate glass formation in the fines since there is some evidence that Si can become enriched on the surface of silicate melts. These results do not support the strong enrichments in Fe on grain surfaces reported from Auger studies.

  11. A Simple and Computationally Efficient Sampling Approach to Covariate Adjustment for Multifactor Dimensionality Reduction Analysis of Epistasis

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

    Epistasis or gene-gene interaction is a fundamental component of the genetic architecture of complex traits such as disease susceptibility. Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free method to detect epistasis when there are no significant marginal genetic effects. However, in many studies of complex disease, other covariates like age of onset and smoking status could have a strong main effect and may potentially interfere with MDR's ability to achieve its goal. In this paper, we present a simple and computationally efficient sampling method to adjust for covariate effects in MDR. We use simulation to show that after adjustment, MDR has sufficient power to detect true gene-gene interactions. We also compare our method with the state-of-the-art technique in covariate adjustment. The results suggest that our proposed method performs similarly, but is more computationally efficient. We then apply this new method to an analysis of a population-based bladder cancer study in New Hampshire. PMID:20924193

  12. Stability-Derivative Determination from Flight Data

    NASA Technical Reports Server (NTRS)

    Holowicz, Chester H.; Holleman, Euclid C.

    1958-01-01

    A comprehensive discussion of the various factors affecting the determination of stability and control derivatives from flight data is presented based on the experience of the NASA High-Speed Flight Station. Factors relating to test techniques, determination of mass characteristics, instrumentation, and methods of analysis are discussed. For most longitudinal-stability-derivative analyses simple equations utilizing period and damping have been found to be as satisfactory as more comprehensive methods. The graphical time-vector method has been the basis of lateral-derivative analysis, although simple approximate methods can be useful if applied with caution. Control effectiveness has generally been obtained by relating the peak acceleration to the rapid control input, and consideration must be given to aerodynamic contributions if reasonable accuracy is to be realized. Because of the many factors involved in the determination of stability derivatives, it is believed that the primary stability and control derivatives are probably accurate to within 10 to 25 percent, depending upon the specific derivative. Static-stability derivatives at low angle of attack show the greatest accuracy.

  13. A Method for Automated Detection of Usability Problems from Client User Interface Events

    PubMed Central

    Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.

    2005-01-01

    Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121

  14. Research Techniques Made Simple: Analysis of Collective Cell Migration Using the Wound Healing Assay.

    PubMed

    Grada, Ayman; Otero-Vinas, Marta; Prieto-Castrillo, Francisco; Obagi, Zaidal; Falanga, Vincent

    2017-02-01

    Collective cell migration is a hallmark of wound repair, cancer invasion and metastasis, immune responses, angiogenesis, and embryonic morphogenesis. Wound healing is a complex cellular and biochemical process necessary to restore structurally damaged tissue. It involves dynamic interactions and crosstalk between various cell types, interaction with extracellular matrix molecules, and regulated production of soluble mediators and cytokines. In cutaneous wound healing, skin cells migrate from the wound edges into the wound to restore skin integrity. Analysis of cell migration in vitro is a useful assay to quantify alterations in cell migratory capacity in response to experimental manipulations. Although several methods exist to study cell migration (such as Boyden chamber assay, barrier assays, and microfluidics-based assays), in this short report we will explain the wound healing assay, also known as the "in vitro scratch assay" as a simple, versatile, and cost-effective method to study collective cell migration and wound healing. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  15. The probability heuristics model of syllogistic reasoning.

    PubMed

    Chater, N; Oaksford, M

    1999-03-01

    A probability heuristic model (PHM) for syllogistic reasoning is proposed. An informational ordering over quantified statements suggests simple probability based heuristics for syllogistic reasoning. The most important is the "min-heuristic": choose the type of the least informative premise as the type of the conclusion. The rationality of this heuristic is confirmed by an analysis of the probabilistic validity of syllogistic reasoning which treats logical inference as a limiting case of probabilistic inference. A meta-analysis of past experiments reveals close fits with PHM. PHM also compares favorably with alternative accounts, including mental logics, mental models, and deduction as verbal reasoning. Crucially, PHM extends naturally to generalized quantifiers, such as Most and Few, which have not been characterized logically and are, consequently, beyond the scope of current mental logic and mental model theories. Two experiments confirm the novel predictions of PHM when generalized quantifiers are used in syllogistic arguments. PHM suggests that syllogistic reasoning performance may be determined by simple but rational informational strategies justified by probability theory rather than by logic. Copyright 1999 Academic Press.
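
    The min-heuristic is simple enough to state in a few lines of code. This sketch assumes the four classical syllogistic quantifiers and the informativeness ordering All > Some > None > Some-not from the probabilistic analysis; the `min_conclusion` helper is an illustration of ours, not the authors' implementation:

```python
# Minimal sketch of PHM's "min-heuristic": the conclusion takes the
# quantifier type of the least informative premise. The informativeness
# ordering below covers the four classical syllogistic quantifiers.
INFORMATIVENESS = {"All": 4, "Some": 3, "None": 2, "Some-not": 1}

def min_conclusion(premise1, premise2):
    """Quantifier type predicted for the conclusion."""
    return min((premise1, premise2), key=lambda q: INFORMATIVENESS[q])

print(min_conclusion("All", "Some"))   # -> Some
print(min_conclusion("Some", "None"))  # -> None
```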

  16. A New Metre for Cheap, Quick, Reliable and Simple Thermal Transmittance (U-Value) Measurements in Buildings.

    PubMed

    Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio

    2017-09-03

    This paper deals with thermal transmittance measurement in buildings, specifically in building energy retrofitting. Today, if many thermal transmittance measurements are needed in a short time, the current devices, based on measuring the heat flow through the wall, cannot carry them out, unless a large number of devices is used at once along with intensive and tedious post-processing and analysis work. In this paper, starting from well-known physical laws, the authors develop a methodology based on three temperature measurements, which is implemented in a novel thermal transmittance metre. The paper shows its development step by step. The resulting device is modular, scalable, and fully wireless; it is capable of taking as many measurements at once as the user needs. The developed system is compared, working on the same test, with the currently used heat-flow-based approach. The results show that the developed metre allows thermal transmittance measurements in buildings to be carried out in a cheap, quick, reliable and simple way.
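
    A three-temperature U-value estimate can be sketched as follows, assuming the standard interior surface heat-transfer coefficient from ISO 6946. Whether the authors' metre uses exactly this formula and coefficient is an assumption here, and the temperatures are invented:

```python
# Three-temperature U-value estimate: with indoor air temperature t_in,
# interior wall-surface temperature t_surface and outdoor air
# temperature t_out, U = h_in * (t_in - t_surface) / (t_in - t_out).
# H_IN = 7.69 W/(m^2 K) is the standard interior surface coefficient
# from ISO 6946 (an assumption; the paper may use different values).
H_IN = 7.69  # W/(m^2 K)

def u_value(t_in, t_surface, t_out):
    return H_IN * (t_in - t_surface) / (t_in - t_out)

u = u_value(t_in=20.0, t_surface=18.5, t_out=2.0)  # W/(m^2 K)
```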

  17. Fuzzy logic-based flight control system design

    NASA Astrophysics Data System (ADS)

    Nho, Kyungmoon

    The application of fuzzy logic to aircraft motion control is studied in this dissertation. The self-tuning fuzzy techniques are developed by changing input scaling factors to obtain a robust fuzzy controller over a wide range of operating conditions and nonlinearities for a nonlinear aircraft model. It is demonstrated that the properly adjusted input scaling factors can meet the required performance and robustness in a fuzzy controller. For a simple demonstration of the easy design and control capability of a fuzzy controller, a proportional-derivative (PD) fuzzy control system is compared to the conventional controller for a simple dynamical system. This thesis also describes the design principles and stability analysis of fuzzy control systems by considering the key features of a fuzzy control system including the fuzzification, rule-base and defuzzification. The wing-rock motion of slender delta wings, a linear aircraft model and the six degree of freedom nonlinear aircraft dynamics are considered to illustrate several self-tuning methods employing change in input scaling factors. Finally, this dissertation is concluded with numerical simulation of glide-slope capture in windshear demonstrating the robustness of the fuzzy logic based flight control system.

  18. Single-molecular diodes based on opioid derivatives.

    PubMed

    Siqueira, M R S; Corrêa, S M; Gester, R M; Del Nero, J; Neto, A M J C

    2015-12-01

    We propose an efficient single-molecule rectifier based on an opioid derivative. Electron transport properties are investigated within the non-equilibrium Green's function formalism combined with density functional theory. The analysis of the current-voltage characteristics indicates clear diode-like behavior. While heroin presents a rectification coefficient R>1, indicating a preferential electronic current from the electron-donating to the electron-withdrawing side, 3- and 6-acetylmorphine and morphine exhibit the contrary behavior, R<1. Our calculations indicate that the simple inclusion of acetyl groups modulates a range of devices, varying from simple rectifying to resonant-tunneling diodes. In particular, the heroin diodes show microampere electron currents with a maximum rectification of R=9.1 at a very low bias voltage of ∼0.6 V and R=14.3 at ∼1.8 V, with resistance varying between 0.4 and 1.5 MΩ. By contrast, most current single-molecule diodes rectify in the nanoampere range, are not stable above 1.0 V, and present electrical resistances around 10 MΩ. Molecular devices based on opioid derivatives are therefore promising in molecular electronics.

  19. Lax-Friedrichs sweeping scheme for static Hamilton-Jacobi equations

    NASA Astrophysics Data System (ADS)

    Kao, Chiu Yen; Osher, Stanley; Qian, Jianliang

    2004-05-01

    We propose a simple, fast sweeping method based on the Lax-Friedrichs monotone numerical Hamiltonian to approximate viscosity solutions of arbitrary static Hamilton-Jacobi equations in any number of spatial dimensions. By using the Lax-Friedrichs numerical Hamiltonian, we can easily obtain the solution at a specific grid point in terms of its neighbors, so that a Gauss-Seidel type nonlinear iterative method can be utilized. Furthermore, by incorporating a group-wise causality principle into the Gauss-Seidel iteration by following a finite group of characteristics, we have an easy-to-implement, sweeping-type, and fast convergent numerical method. However, unlike other methods based on the Godunov numerical Hamiltonian, some computational boundary conditions are needed in the implementation. We give a simple recipe which enforces a version of discrete min-max principle. Some convergence analysis is done for the one-dimensional eikonal equation. Extensive 2-D and 3-D numerical examples illustrate the efficiency and accuracy of the new approach. To our knowledge, this is the first fast numerical method based on discretizing the Hamilton-Jacobi equation directly without assuming convexity and/or homogeneity of the Hamiltonian.
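
    A minimal one-dimensional version of the scheme, for the eikonal equation |u'(x)| = f(x) with f = 1 on [0, 1] and zero boundary data (exact viscosity solution min(x, 1-x)), might look like this; the artificial viscosity sigma bounds |H'|, and the alternating-direction Gauss-Seidel sweeps follow the description above. This is a sketch of the method, not the paper's code:

```python
import numpy as np

# Lax-Friedrichs sweeping for the 1-D eikonal equation |u'(x)| = f(x)
# on [0, 1] with u(0) = u(1) = 0 and f = 1. sigma bounds |H'(p)| (= 1
# for the eikonal); sweeps alternate direction and take the min with
# the old value to keep the iteration monotone.
n = 101
dx = 1.0 / (n - 1)
f = np.ones(n)
sigma = 1.0
u = np.full(n, 1e6)          # large initial guess in the interior
u[0] = u[-1] = 0.0           # boundary data

for sweep in range(50):
    order = range(1, n - 1) if sweep % 2 == 0 else range(n - 2, 0, -1)
    for i in order:
        p = (u[i + 1] - u[i - 1]) / (2.0 * dx)        # central gradient
        cand = (f[i] - abs(p)) * dx / sigma + 0.5 * (u[i + 1] + u[i - 1])
        u[i] = min(u[i], cand)

x = np.linspace(0.0, 1.0, n)
err = np.max(np.abs(u - np.minimum(x, 1.0 - x)))      # distance to exact
```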

  20. Diversity of chloroplast genome among local clones of cocoa (Theobroma cacao, L.) from Central Sulawesi

    NASA Astrophysics Data System (ADS)

    Suwastika, I. Nengah; Pakawaru, Nurul Aisyah; Rifka, Rahmansyah, Muslimin, Ishizaki, Yoko; Cruz, André Freire; Basri, Zainuddin; Shiina, Takashi

    2017-02-01

    Chloroplast genomes typically range in size from 120 to 170 kilobase pairs (kb) and are relatively conserved among plant species. Recent evaluations of several species have shown that certain unique regions display high variability, which can be utilized in phylogenetic analysis. Many fragments of coding regions, introns, and intergenic spacers, such as atpB-rbcL, ndhF, rbcL, rpl16, trnH-psbA, trnL-F, trnS-G, etc., have been used for phylogenetic reconstructions at various taxonomic levels. On that basis, we analyzed the diversity of the chloroplast genome within local cacao (Theobroma cacao L.) clones from Central Sulawesi. Our recent data showed that there are more than 20 clones from local farming in Central Sulawesi, which can be distinguished by phenotypic and nuclear-genome-based characterization using RAPD (Random Amplified Polymorphic DNA) and SSR (Simple Sequence Repeat) markers. In developing DNA markers for this local cacao, here we also included an analysis based on variation in the chloroplast genome. At least several regions, such as rpl32-trnL, can be considered as chloroplast markers for our local clones of cocoa. Furthermore, phylogenetic analyses between clones of cocoa could be developed.

  1. Simple Spreadsheet Models For Interpretation Of Fractured Media Tracer Tests

    EPA Science Inventory

    An analysis of a gas-phase partitioning tracer test conducted through fractured media is discussed within this paper. The analysis employed matching eight simple mathematical models to the experimental data to determine transport parameters. All of the models tested; two porous...

  2. Validation of the Simple Shoulder Test in a Portuguese-Brazilian population. Is the latent variable structure and validation of the Simple Shoulder Test Stable across cultures?

    PubMed

    Neto, Jose Osni Bruggemann; Gesser, Rafael Lehmkuhl; Steglich, Valdir; Bonilauri Ferreira, Ana Paula; Gandhi, Mihir; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo

    2013-01-01

    The validation of widely used scales facilitates the comparison across international patient samples. The objective of this study was to translate, culturally adapt and validate the Simple Shoulder Test into Brazilian Portuguese. We also tested the stability of the factor analysis across different cultures. The Simple Shoulder Test was translated from English into Brazilian Portuguese, translated back into English, and evaluated for accuracy by an expert committee. It was then administered to 100 patients with shoulder conditions. Psychometric properties were analyzed including factor analysis, internal reliability, test-retest reliability at seven days, and construct validity in relation to the Short Form 36 health survey (SF-36). Factor analysis demonstrated a three-factor solution. Cronbach's alpha was 0.82. The test-retest reliability index as measured by the intra-class correlation coefficient (ICC) was 0.84. Associations were observed in the hypothesized direction with all subscales of the SF-36 questionnaire. The Simple Shoulder Test translation and cultural adaptation to Brazilian Portuguese demonstrated adequate factor structure, internal reliability, and validity, ultimately allowing for its use in the comparison with international patient samples.

  3. Validation of the Simple Shoulder Test in a Portuguese-Brazilian Population. Is the Latent Variable Structure and Validation of the Simple Shoulder Test Stable across Cultures?

    PubMed Central

    Neto, Jose Osni Bruggemann; Gesser, Rafael Lehmkuhl; Steglich, Valdir; Bonilauri Ferreira, Ana Paula; Gandhi, Mihir; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo

    2013-01-01

    Background The validation of widely used scales facilitates the comparison across international patient samples. The objective of this study was to translate, culturally adapt and validate the Simple Shoulder Test into Brazilian Portuguese. Also we test the stability of factor analysis across different cultures. Objective The objective of this study was to translate, culturally adapt and validate the Simple Shoulder Test into Brazilian Portuguese. Also we test the stability of factor analysis across different cultures. Methods The Simple Shoulder Test was translated from English into Brazilian Portuguese, translated back into English, and evaluated for accuracy by an expert committee. It was then administered to 100 patients with shoulder conditions. Psychometric properties were analyzed including factor analysis, internal reliability, test-retest reliability at seven days, and construct validity in relation to the Short Form 36 health survey (SF-36). Results Factor analysis demonstrated a three factor solution. Cronbach’s alpha was 0.82. Test-retest reliability index as measured by intra-class correlation coefficient (ICC) was 0.84. Associations were observed in the hypothesized direction with all subscales of SF-36 questionnaire. Conclusion The Simple Shoulder Test translation and cultural adaptation to Brazilian-Portuguese demonstrated adequate factor structure, internal reliability, and validity, ultimately allowing for its use in the comparison with international patient samples. PMID:23675436

  4. A rule-based software test data generator

    NASA Technical Reports Server (NTRS)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests showing that even the primitive rule-based test data generation prototype is significantly better than random data generation are performed. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  5. USING MICROSOFT OFFICE EXCEL® 2007 TO CONDUCT GENERALIZED MATCHING ANALYSES

    PubMed Central

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law. PMID:20514196

  6. Using Microsoft Office Excel 2007 to conduct generalized matching analyses.

    PubMed

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law.
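
    The analysis this task analysis walks through in Excel amounts to a log-log linear regression, log(B1/B2) = a·log(R1/R2) + log(b), which can equally be sketched in a few lines. The response and reinforcer counts below are invented for illustration:

```python
import numpy as np

# Generalized matching: log(B1/B2) = a*log(R1/R2) + log(b), fitted by
# ordinary least squares; a is sensitivity, b is bias. Counts invented.
B1 = np.array([80.0, 60.0, 50.0, 30.0, 15.0])  # responses, alternative 1
B2 = np.array([20.0, 40.0, 50.0, 70.0, 85.0])  # responses, alternative 2
R1 = np.array([40.0, 30.0, 25.0, 15.0, 8.0])   # reinforcers, alternative 1
R2 = np.array([10.0, 20.0, 25.0, 35.0, 42.0])  # reinforcers, alternative 2

x = np.log10(R1 / R2)
y = np.log10(B1 / B2)
a, log_b = np.polyfit(x, y, 1)   # slope a = sensitivity
bias = 10.0 ** log_b             # intercept back-transformed to bias b
```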

  7. Lung sound analysis for wheeze episode detection.

    PubMed

    Jain, Abhishek; Vepa, Jithendra

    2008-01-01

    Listening to and interpreting lung sounds with a stethoscope has long been an important component of screening and diagnosing lung diseases. However, this practice has always been vulnerable to poor audibility, inter-observer variation (between different physicians) and poor reproducibility. Computerized analysis of lung sounds for objective diagnosis of lung diseases is therefore seen as a probable aid. In this paper we aim at automatic analysis of lung sounds for wheeze episode detection and quantification. The proposed algorithm integrates and analyses a set of parameters based on the ATS (American Thoracic Society) definition of wheezes. It is very robust and computationally simple, and yielded a sensitivity of 84% and a specificity of 86%.

  8. A joint analysis of the Drake equation and the Fermi paradox

    NASA Astrophysics Data System (ADS)

    Prantzos, Nikos

    2013-07-01

    I propose a unified framework for a joint analysis of the Drake equation and the Fermi paradox, which enables a simultaneous, quantitative study of both of them. The analysis is based on a simplified form of the Drake equation and on a fairly simple scheme for the colonization of the Milky Way. It appears that for sufficiently long-lived civilizations, colonization of the Galaxy is the only reasonable option to gain knowledge about other life forms. This argument allows one to define a region in the parameter space of the Drake equation, where the Fermi paradox definitely holds (`Strong Fermi paradox').
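
    For reference, the Drake equation itself is a simple product of factors, N = R*·fp·ne·fl·fi·fc·L. The sketch below only illustrates that arithmetic; all parameter values are placeholders, not estimates from the paper:

```python
# The Drake equation as a product of factors: N, the number of
# communicating civilizations, equals R* (star formation rate) times
# the fractions fp, ne, fl, fi, fc and the civilization lifetime L.
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Placeholder values for illustration only.
N = drake(R_star=1.0, f_p=0.5, n_e=1.0, f_l=0.5, f_i=0.1, f_c=0.1, L=1000.0)
```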

  9. Analysis of carbendazim, benomyl, thiophanate methyl and 2,4-dichlorophenoxyacetic acid in fruits and vegetables after supercritical fluid extraction.

    PubMed

    Anastassiades, M; Schwack, W

    1998-10-30

    Simple methods for the analysis of carbendazim, benomyl and thiophanate methyl in fruits and vegetables and of 2,4-D in citrus fruits are presented. Sample preparation involves supercritical fluid extraction with carbon dioxide, and further analysis is performed without any additional clean-up by GC-MS after derivatisation or directly by HPLC-diode array detection. The SFE methods presented are clearly faster and more cost effective than traditional solvent-based approaches. The recoveries, detection limits and repeatabilities achieved meet the needs of tolerance-level monitoring of these compounds in fruits and vegetables.

  10. Accurate analysis and visualization of cardiac (11)C-PIB uptake in amyloidosis with semiautomatic software.

    PubMed

    Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark

    2016-08-01

    (11)C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic analysis of PET data is now available but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of (11)C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic (11)C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of (11)C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was compared with RI based on manual analysis and showed comparable values (0.056 vs 0.054 min(-1) for amyloidosis patients and 0.024 vs 0.025 min(-1) in healthy controls; P = .78) and the correlation was excellent (r = 0.98). Inter-reader reproducibility also was excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polarmaps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac (11)C-PIB RI in amyloidosis patients is feasible. Parametric polarmaps and histograms make visual interpretation fast and simple.

  11. VISA--Vector Integration Site Analysis server: a web-based server to rapidly identify retroviral integration sites from next-generation sequencing.

    PubMed

    Hocum, Jonah D; Battrell, Logan R; Maynard, Ryan; Adair, Jennifer E; Beard, Brian C; Rawlings, David J; Kiem, Hans-Peter; Miller, Daniel G; Trobridge, Grant D

    2015-07-07

    Analyzing the integration profile of retroviral vectors is a vital step in determining their potential genotoxic effects and developing safer vectors for therapeutic use. Identifying retroviral vector integration sites is also important for retroviral mutagenesis screens. We developed VISA, a vector integration site analysis server, to analyze next-generation sequencing data for retroviral vector integration sites. Sequence reads that contain a provirus are mapped to the human genome, sequence reads that cannot be localized to a unique location in the genome are filtered out, and then unique retroviral vector integration sites are determined based on the alignment scores of the remaining sequence reads. VISA offers a simple web interface to upload sequence files and results are returned in a concise tabular format to allow rapid analysis of retroviral vector integration sites.
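
    The unique-site filtering step described above (discard reads that cannot be localized to a unique location, then collapse the rest to integration sites based on alignment scores) can be sketched as follows. The data layout and tie-breaking logic are illustrative, not VISA's actual implementation:

```python
from collections import Counter

# Keep a read only if its best-scoring genome alignment is strictly
# better than the runner-up, then count surviving reads per locus.
def unique_sites(alignments_per_read):
    sites = Counter()
    for alns in alignments_per_read:
        alns = sorted(alns, key=lambda a: a[1], reverse=True)
        if len(alns) == 1 or alns[0][1] > alns[1][1]:
            sites[alns[0][0]] += 1      # count read at its best locus
    return sites

# Each read: list of ((chromosome, position), alignment score) pairs.
reads = [
    [(("chr1", 1000), 60)],                      # uniquely mapped
    [(("chr2", 500), 60), (("chr3", 700), 60)],  # ambiguous: filtered out
    [(("chr1", 1000), 55), (("chr5", 20), 30)],  # best hit clearly unique
]
```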

  12. Comparison of two PCR-based methods and automated DNA sequencing for prop-1 genotyping in Ames dwarf mice.

    PubMed

    Gerstner, Arpad; DeFord, James H; Papaconstantinou, John

    2003-07-25

    Ames dwarfism is caused by a homozygous single nucleotide mutation in the pituitary specific prop-1 gene, resulting in combined pituitary hormone deficiency, reduced growth and extended lifespan. Thus, these mice serve as an important model system for endocrinological, aging and longevity studies. Because the phenotype of wild type and heterozygous mice is indistinguishable, it is imperative for successful breeding to accurately genotype these animals. Here we report a novel, yet simple, approach for prop-1 genotyping using PCR-based allele-specific amplification (PCR-ASA). We also compare this method to other potential genotyping techniques, i.e. PCR-based restriction fragment length polymorphism analysis (PCR-RFLP) and fluorescence automated DNA sequencing. We demonstrate that the single-step PCR-ASA has several advantages over the classical PCR-RFLP because the procedure is simple, less expensive and rapid. To further increase the specificity and sensitivity of the PCR-ASA, we introduced a single-base mismatch at the 3' penultimate position of the mutant primer. Our results also reveal that the fluorescence automated DNA sequencing has limitations for detecting a single nucleotide polymorphism in the prop-1 gene, particularly in heterozygotes.

  13. Simple Shared Motifs (SSM) in conserved region of promoters: a new approach to identify co-regulation patterns.

    PubMed

    Gruel, Jérémy; LeBorgne, Michel; LeMeur, Nolwenn; Théret, Nathalie

    2011-09-12

    Regulation of gene expression plays a pivotal role in cellular functions. However, understanding the dynamics of transcription remains a challenging task. A host of computational approaches have been developed to identify regulatory motifs, mainly based on the recognition of DNA sequences for transcription factor binding sites. Recent integration of additional data from genomic analyses or phylogenetic footprinting has significantly improved these methods. Here, we propose a different approach based on the compilation of Simple Shared Motifs (SSM), groups of sequences defined by their length and similarity and present in conserved sequences of gene promoters. We developed an original algorithm to search and count SSM in pairs of genes. An exceptional number of SSM is considered as a common regulatory pattern. The SSM approach is applied to a sample set of genes and validated using functional gene-set enrichment analyses. We demonstrate that the SSM approach selects genes that are over-represented in specific biological categories (Ontology and Pathways) and are enriched in co-expressed genes. Finally we show that genes co-expressed in the same tissue or involved in the same biological pathway have increased SSM values. Using unbiased clustering of genes, Simple Shared Motifs analysis constitutes an original contribution to provide a clearer definition of expression networks.
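
    In its most stripped-down form (exact matches only, no similarity threshold or conserved-region restriction, both of which the published algorithm handles), counting shared motifs between two promoters reduces to intersecting k-mer sets. A minimal sketch:

```python
# Exact-match reduction of the shared-motif count: enumerate all
# substrings of length k in each promoter and intersect the sets.
def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def shared_motif_count(seq_a, seq_b, k):
    return len(kmers(seq_a, k) & kmers(seq_b, k))

print(shared_motif_count("ACGTACGTTT", "TTACGTAAAA", 4))  # -> 3
```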

  14. Simple Shared Motifs (SSM) in conserved region of promoters: a new approach to identify co-regulation patterns

    PubMed Central

    2011-01-01

    Background Regulation of gene expression plays a pivotal role in cellular functions. However, understanding the dynamics of transcription remains a challenging task. A host of computational approaches have been developed to identify regulatory motifs, mainly based on the recognition of DNA sequences for transcription factor binding sites. Recent integration of additional data from genomic analyses or phylogenetic footprinting has significantly improved these methods. Results Here, we propose a different approach based on the compilation of Simple Shared Motifs (SSM), groups of sequences defined by their length and similarity and present in conserved sequences of gene promoters. We developed an original algorithm to search and count SSM in pairs of genes. An exceptional number of SSM is considered as a common regulatory pattern. The SSM approach is applied to a sample set of genes and validated using functional gene-set enrichment analyses. We demonstrate that the SSM approach selects genes that are over-represented in specific biological categories (Ontology and Pathways) and are enriched in co-expressed genes. Finally we show that genes co-expressed in the same tissue or involved in the same biological pathway have increased SSM values. Conclusions Using unbiased clustering of genes, Simple Shared Motifs analysis constitutes an original contribution to provide a clearer definition of expression networks. PMID:21910886

  15. Monostatic Radar Cross Section Estimation of Missile Shaped Object Using Physical Optics Method

    NASA Astrophysics Data System (ADS)

    Sasi Bhushana Rao, G.; Nambari, Swathi; Kota, Srikanth; Ranga Rao, K. S.

    2017-08-01

    Stealth technology manages many signatures of a target, of which most radar systems use the radar cross section (RCS) for discriminating and classifying targets with regard to stealth. In wartime, a target's RCS must be very small to render it invisible to enemy radar. In this study, the radar cross section of perfectly conducting objects such as a cylinder, a truncated cone (frustum), and a circular flat plate is estimated with respect to parameters such as size, frequency, and aspect angle. Because exactly predicting the RCS is difficult, approximate methods become the alternative. The majority of approximate methods are valid in the optical region, and each has its own strengths and weaknesses. Therefore, the analysis given in this study is based purely on far-field monostatic RCS measurements in the optical region. Computation is done using the Physical Optics (PO) method for determining the RCS of simple models. In this study, the RCS of not only the simple models but also missile-shaped and rocket-shaped models, obtained by cascading these simple objects, has been computed with backscatter using MATLAB simulation. Rectangular plots of RCS in dBsm versus aspect angle are obtained for the simple and missile-shaped objects. The treatment of RCS in this study is based on narrowband operation.
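
    For one of the simple shapes above, the circular flat plate at normal incidence, the high-frequency PO result has the well-known closed form σ = 4πA²/λ², where A is the plate area. A minimal sketch (not the paper's MATLAB code; the plate radius and radar frequency are illustrative):

```python
import math

def flat_plate_rcs_normal(radius_m, freq_hz):
    """Monostatic RCS (m^2) of a perfectly conducting circular flat plate
    at normal incidence, high-frequency PO limit: sigma = 4*pi*A^2 / lambda^2."""
    c = 3.0e8                      # speed of light, m/s
    lam = c / freq_hz              # wavelength, m
    area = math.pi * radius_m ** 2
    return 4.0 * math.pi * area ** 2 / lam ** 2

def to_dbsm(sigma_m2):
    """Express an RCS in decibels relative to one square meter."""
    return 10.0 * math.log10(sigma_m2)

# Hypothetical example: 0.5 m radius plate at X-band (10 GHz)
print(to_dbsm(flat_plate_rcs_normal(0.5, 10e9)))
```

    Off normal incidence the plate's PO pattern falls off through a Bessel-function factor, which is where the aspect-angle plots described in the abstract come from.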

  16. A simple real-time polymerase chain reaction (PCR)-based assay for authentication of the Chinese Panax ginseng cultivar Damaya from a local ginseng population.

    PubMed

    Wang, H; Wang, J; Li, G

    2016-06-27

    Panax ginseng is one of the most important medicinal plants in the Orient. Owing to its increasing demand in the world market, cultivated ginseng has become the main source of medicinal material. Among the Chinese ginseng cultivars, Damaya commands higher prices and is grown in significant proportions within the local ginseng population. Due to the lack of rapid and accurate authentication methods, Damaya is distributed among different cultivars in the local ginseng population in China. Here, we identified a unique, Damaya-specific single nucleotide polymorphism (SNP) site in the second intron of mitochondrial cytochrome c oxidase subunit 2 (cox2). Based on this SNP, a Damaya cultivar-specific primer was designed and an allele-specific polymerase chain reaction (PCR) was optimized for the effective molecular authentication of Damaya. We devised a method combining a simple DNA isolation procedure with real-time allele-specific PCR using SYBR Green I fluorescent dye, and demonstrated its efficacy in clearly discriminating the Damaya cultivar from other Chinese ginseng cultivars in allelic discrimination analysis. Hence, this study provides a simple and rapid assay for differentiating Damaya from the local Chinese ginseng population and for its conservation.

  17. A novel photoelectrochemical biosensor for protein kinase activity assay based on phosphorylated graphite-like carbon nitride.

    PubMed

    Li, Xue; Zhou, Yunlei; Xu, Yan; Xu, Huijie; Wang, Minghui; Yin, Huanshun; Ai, Shiyun

    2016-08-31

    Protein kinases are general and significant regulators in the cell signaling pathway, and simple and quick kinase detection is still greatly desired. Herein, we develop a simple and sensitive photoelectrochemical strategy for the detection of protein kinase activity based on the bond between phosphorylated peptide and phosphorylated graphite-like carbon nitride (P-g-C3N4) conjugates triggered by Zr(4+) ion coordination. Under optimal conditions, the increased photocurrent is proportional to the protein kinase A (PKA) concentration over the range 0.05 to 50 U/mL, with a detection limit of 0.077 U/mL. Moreover, this photoelectrochemical assay can also be applied to quantitative analysis of kinase inhibition. The results indicated that the IC50 value (the inhibitor concentration producing 50% inhibition) for ellagic acid was 9.1 μM. The developed method is further applied to detect PKA activity in real samples, including serum from healthy persons and gastric cancer patients and breast tissue from healthy persons and breast cancer patients. Therefore, the established protocol provides a new and simple tool for assaying kinase activity and its inhibitors with low cost and high sensitivity.
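
    An IC50 like the 9.1 μM figure quoted above is typically read off a dose-response curve. A minimal sketch of one simple estimation approach, log-linear interpolation between the two measured points bracketing 50% inhibition (the dose-response data below are hypothetical, not the paper's measurements):

```python
import math

def ic50_interpolate(concs_uM, inhibition_frac):
    """Estimate IC50 (in the same units as concs_uM) by log-linear
    interpolation between the two measured points that bracket 50%
    inhibition. A toy analysis; real assays usually fit a Hill curve."""
    points = list(zip(concs_uM, inhibition_frac))
    for (c1, f1), (c2, f2) in zip(points, points[1:]):
        if f1 <= 0.5 <= f2:
            t = (0.5 - f1) / (f2 - f1)   # fractional position of 50% point
            return 10 ** (math.log10(c1) + t * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% inhibition not bracketed by the data")

# Hypothetical inhibitor dose-response data
concs = [1, 3, 10, 30]            # inhibitor concentration, uM
inhib = [0.15, 0.30, 0.52, 0.80]  # fractional inhibition observed
print(round(ic50_interpolate(concs, inhib), 2))  # roughly 9 uM
```

    A four-parameter Hill-equation fit over all points is the more robust choice in practice; the interpolation above only uses the two bracketing measurements.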

  18. Free-space optical channel simulator for weak-turbulence conditions.

    PubMed

    Bykhovsky, Dima

    2015-11-01

    Free-space optical (FSO) communication may be severely influenced by the inevitable turbulence effect, which results in channel gain fluctuations and fading. The objective of this paper is to provide a simple and effective simulator of the weak-turbulence FSO channel that emulates the influence of the temporal covariance effect. Specifically, the proposed model is based on lognormally distributed samples with a corresponding correlation time. The simulator is based on the solution of a first-order stochastic differential equation (SDE). The results of the provided SDE analysis reveal its efficacy for turbulent channel modeling.
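
    One common way to realize such a first-order-SDE channel simulator (a sketch consistent with, but not taken from, the paper) is to discretize an Ornstein-Uhlenbeck process exactly and exponentiate it: the log-amplitude X then has an exponentially decaying autocorrelation with correlation time tau_c, and the gain h = exp(X) is lognormal. All parameter values below are illustrative.

```python
import math
import random

def ou_lognormal_gain(n, dt, tau_c, sigma_x, mu_x=0.0, seed=1):
    """Generate n correlated lognormal channel-gain samples via exact
    discretization of an Ornstein-Uhlenbeck SDE for the log-amplitude X:
        dX = -(X - mu_x)/tau_c dt + sqrt(2 sigma_x^2 / tau_c) dW
    Stationary X ~ Normal(mu_x, sigma_x^2) with correlation time tau_c;
    the channel gain is h = exp(X)."""
    rng = random.Random(seed)
    rho = math.exp(-dt / tau_c)              # one-step autocorrelation
    x = mu_x + sigma_x * rng.gauss(0, 1)     # start in the stationary state
    gains = []
    for _ in range(n):
        gains.append(math.exp(x))
        # Exact AR(1) update equivalent to the OU transition density
        x = mu_x + rho * (x - mu_x) + sigma_x * math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
    return gains

# Hypothetical weak-turbulence setting: 1 ms correlation time, 10 kHz sampling
gains = ou_lognormal_gain(n=100000, dt=1e-4, tau_c=1e-3, sigma_x=0.1)
```

    Because the AR(1) update uses the exact OU transition density, the statistics do not degrade for coarse dt, unlike a naive Euler discretization of the SDE.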

  19. Frequency Estimator Performance for a Software-Based Beacon Receiver

    NASA Technical Reports Server (NTRS)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for use in a Q/V-band beacon receiver, six frequency estimators were analyzed to characterize their effectiveness for beacon receiver design.
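
    One of the simplest estimators of this kind (shown purely as an illustration; the paper compares six, and this need not be among them) refines the spectrum peak search with parabolic interpolation of the log-magnitude around the peak bin, yielding sub-bin frequency resolution:

```python
import cmath
import math

def refined_freq(samples, fs):
    """Estimate the dominant frequency of a real signal: DFT magnitude
    peak search, then parabolic interpolation of the log-magnitude over
    the peak bin and its two neighbors for sub-bin resolution."""
    n = len(samples)
    # Direct DFT magnitude spectrum over positive frequencies (O(n^2), fine for small n)
    spec = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]
    k = max(range(1, n // 2 - 1), key=lambda i: spec[i])   # coarse peak bin
    a, b, c = (math.log(spec[k - 1]), math.log(spec[k]), math.log(spec[k + 1]))
    delta = 0.5 * (a - c) / (a - 2 * b + c)   # parabola vertex offset, in bins
    return (k + delta) * fs / n

# Hypothetical beacon tone: 123.4 Hz sampled at 1 kHz, 256 samples
fs = 1000.0
f0 = 123.4
x = [math.cos(2 * math.pi * f0 * t / fs) for t in range(256)]
print(refined_freq(x, fs))
```

    With a 256-point record the bin spacing is about 3.9 Hz, so a plain peak pick can be off by nearly 2 Hz; the interpolated estimate lands well inside a bin. Windowing the data before the transform further reduces the interpolation bias.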

  20. One Step Quantum Key Distribution Based on EPR Entanglement.

    PubMed

    Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao

    2016-06-30

    A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is then proposed, which overcomes a vulnerability of the first protocol and improves its practicality. Moreover, a security analysis is given, showing that a simple eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the "Ping-pong" protocol, which involves two steps, the proposed protocol does not need to store the qubit and involves only one step.
